# troubleshooting
So I got this working... sort of... at times. In the Flink web UI I can see jobs flipping between RUNNING and RESTARTING, and at times I can see data hitting the back end and then suddenly not. I think my Flink environment needs more resources.

I'm also stuck with the S3 user and password not being consumed from environment variables, so at the moment I copy them into the flink-conf file by hand and then restart the container. Please advise.

As a starting point: I'm running in Docker Compose, writing to MinIO. With the catalog below, I have to create the S3 variables inside /opt/flink/conf/flink-conf.yaml after the container is created, modify the file, and then restart the containers.
```sql
CREATE CATALOG c_paimon1 WITH (
     'type'                      = 'paimon'
    ,'warehouse'                 = 's3://paimon/'
    ,'catalog-type'              = 'hive'
    ,'hive-conf-dir'             = './conf'
    ,'table-default.file.format' = 'parquet'
);
```
I also have to set environment variables in my compose file inside the jobmanager and taskmanager blocks.
```yaml
environment:
      - AWS_ACCESS_KEY_ID=${AWS_ACCESS_KEY_ID}
      - AWS_SECRET_ACCESS_KEY=${AWS_SECRET_ACCESS_KEY}
```
I was hoping passing the S3 settings as catalog options would make the flink-conf edits unnecessary, but no luck...
```sql
CREATE CATALOG c_paimon WITH (
     'type'                      = 'paimon'
    ,'warehouse'                 = 's3://paimon/'
    ,'catalog-type'              = 'hive'
    ,'hive-conf-dir'             = './conf'
    ,'table-default.file.format' = 'parquet'
    ,'fs.s3a.endpoint'           = 'http://minio:9000'
    ,'fs.s3a.access-key'         = 'admin'
    ,'fs.s3a.secret-key'         = 'password'
);
```
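One thing I noticed while writing this up: the Paimon docs list the S3 options with an `s3.` prefix (`s3.endpoint`, `s3.access-key`, `s3.secret-key`) rather than `fs.s3a.*`, so it's possible the keys above are simply never picked up by the catalog. A sketch of what I mean (the endpoint and credentials are just my local MinIO placeholders):

```sql
CREATE CATALOG c_paimon WITH (
     'type'                      = 'paimon'
    ,'warehouse'                 = 's3://paimon/'
    ,'catalog-type'              = 'hive'
    ,'hive-conf-dir'             = './conf'
    ,'table-default.file.format' = 'parquet'
    -- assumption: Paimon reads s3.* keys, not fs.s3a.*
    ,'s3.endpoint'               = 'http://minio:9000'
    ,'s3.access-key'             = 'admin'
    ,'s3.secret-key'             = 'password'
);
```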
What would be a better setup/solution? I also tried mapping a complete flink-conf.yaml into /opt/flink/conf/, but it seems to be overwritten by the container at startup. Please advise.
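One workaround I've been looking at: the official Flink Docker image's entrypoint regenerates flink-conf.yaml at startup (which would explain the mounted file being overwritten), but it also appends whatever is in a `FLINK_PROPERTIES` environment variable to the generated config. So instead of editing the file by hand after startup, something like this in each of the jobmanager and taskmanager service blocks might do it (key names per Flink's S3 filesystem docs; the endpoint and path-style setting are assumptions for my MinIO setup):

```yaml
# sketch -- same environment block under both jobmanager and taskmanager
environment:
  - |
    FLINK_PROPERTIES=
    s3.endpoint: http://minio:9000
    s3.path.style.access: true
    s3.access-key: ${AWS_ACCESS_KEY_ID}
    s3.secret-key: ${AWS_SECRET_ACCESS_KEY}
```

That way the credentials would still come from the host environment via Compose variable substitution, and no manual flink-conf.yaml edits or restarts should be needed.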