# troubleshooting
r
Hi folks, I'm hoping someone has seen this and has a solve. I'm building a Flink Docker image, using S3 for checkpoints. The job runs fine, except checkpointing fails with the error: `java.lang.NoSuchMethodError: 'boolean org.apache.flink.core.fs.EntropyInjector.isEntropyInjecting(org.apache.flink.core.fs.FileSystem, org.apache.flink.core.fs.Path)'` Anyone seen this before?
m
How have you added the necessary S3 dependencies in your Docker image?
r
```
ENABLE_BUILT_IN_PLUGINS="flink-s3-fs-hadoop-${FLINK_VER}.jar"
```
And downloaded and unpacked Hadoop.
```
HADOOP_CLASSPATH="/opt/hadoop/hadoop-${HADOOP_VER}/etc/hadoop:/opt/hadoop/hadoop-${HADOOP_VER}/share/hadoop/common/lib/*:/opt/hadoop/hadoop-${HADOOP_VER}/share/hadoop/common/*:/opt/hadoop/hadoop-${HADOOP_VER}/share/hadoop/hdfs:/opt/hadoop/hadoop-${HADOOP_VER}/share/hadoop/hdfs/lib/*:/opt/hadoop/hadoop-${HADOOP_VER}/share/hadoop/hdfs/*:/opt/hadoop/hadoop-${HADOOP_VER}/share/hadoop/mapreduce/*:/opt/hadoop/hadoop-${HADOOP_VER}/share/hadoop/yarn:/opt/hadoop/hadoop-${HADOOP_VER}/share/hadoop/yarn/lib/*:/opt/hadoop/hadoop-${HADOOP_VER}/share/hadoop/yarn/*:/opt/hadoop/hadoop-${HADOOP_VER}:/opt/hadoop/hadoop-${HADOOP_VER}/share/hadoop/tools/lib/hadoop-aws-${HADOOP_VER}.jar:/opt/hadoop/hadoop-${HADOOP_VER}/share/hadoop/tools/lib/aws-java-sdk-bundle-${HADOOP_AWS_VER}.jar"
```
Actually, using S3 itself works fine; I can read and write.
m
For S3 you shouldn't download and unpack Hadoop at all; the S3 filesystem plugin is self-contained. And for checkpointing specifically, use the Presto S3 plugin (flink-s3-fs-presto) rather than the Hadoop one.
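For reference, a minimal Dockerfile sketch of that setup, using the official Flink image's `ENABLE_BUILT_IN_PLUGINS` mechanism (the entrypoint copies the named jar from `/opt/flink/opt` into the plugins directory at startup). The base image tag and jar version here are illustrative placeholders:

```dockerfile
# Sketch: self-contained Presto S3 plugin for checkpointing,
# no Hadoop download or HADOOP_CLASSPATH needed.
# "1.17.2" is a placeholder; match the jar version to your Flink version.
FROM flink:1.17.2

# The official image's entrypoint reads this variable and activates the
# bundled plugin jar(s) under /opt/flink/plugins at container start.
ENV ENABLE_BUILT_IN_PLUGINS="flink-s3-fs-presto-1.17.2.jar"
```

With the plugin active, pointing `state.checkpoints.dir` at an `s3://` URI in flink-conf.yaml should be all that's required.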
r
OK - I'll strip down the image and see if that sorts it out. Thanks!
Turns out it only seems to exhibit this behaviour with the Kinesis plugin. This did help me get past the immediate issue though, thanks very much!
m
Great!