# troubleshooting
Hi all, I am very new to Flink and I am trying to set up the Flink operator with the S3 Presto file system on a self-managed k8s cluster. I have modified the Dockerfile to copy the Presto jar file into the plugins directory (see the attached screenshot from inside the operator pod; a rough sketch of that Dockerfile change is included after the config below), and I've also added the following to the ConfigMap:
```yaml
taskmanager.numberOfTaskSlots: 1
parallelism.default: 1
state.backend.type: rocksdb
state.checkpoints.dir: s3p://test-flink-rocksdb/flink/
s3.access-key: DDD
s3.secret-key: DDD
execution.checkpointing.interval: 10000
```
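Roughly, the Dockerfile change I mentioned looks like the sketch below. The base image tag and exact paths are placeholders here, not necessarily what I'm actually running; the intent is just to get the Presto filesystem jar into its own subdirectory under `plugins/`:

```dockerfile
# Sketch only: base image tag and jar location are placeholders.
FROM flink:1.17
# Flink only discovers a plugin if its jar sits in its own subfolder under plugins/,
# so copy the presto filesystem jar (shipped in the distribution's opt/ directory)
# into plugins/flink-s3-fs-presto/.
RUN mkdir -p ./plugins/flink-s3-fs-presto && \
    cp ./opt/flink-s3-fs-presto-*.jar ./plugins/flink-s3-fs-presto/
```

Inside the running pod I would then expect to see something like `plugins/flink-s3-fs-presto/flink-s3-fs-presto-<version>.jar`, which is what the attached screenshot shows.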
Then I am trying to run this sample job: https://github.com/apache/flink-kubernetes-operator/blob/main/examples/basic-session-deployment-only.yaml and I am getting this error in the job's logs:
```
Caused by: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Could not find a file system implementation for scheme 's3p'. The scheme is directly supported by Flink through the following plugin(s): flink-s3-fs-presto.
```
It would be great if someone could help with this.