# troubleshoot
m
Hello, so I am modifying the spark-lineage plugin to try some improvements, but I keep getting this error and I don't really know where it comes from:
Exception in thread "map-output-dispatcher-3" java.lang.UnsatisfiedLinkError: com.github.luben.zstd.Zstd.setCompressionLevel(JI)I
I am building the jar with the following command:
./gradlew metadata-integration:java:spark-lineage:buildDependents
and the source code I started from is the v0.9.1 tag. My modifications don't touch the base functionality yet; they are just extra log prints. The application I submitted worked with v0.9.2 of the spark-lineage plugin, so I am guessing this error comes from some step I unconsciously skipped in the build process. (I skipped the tests because I was stuck on another error there, and since it was just checking the docker deployment I didn't give it much importance. Could that be it??)
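One check I can think of, since the error points at zstd-jni: list the built jar's contents to see whether zstd classes were bundled into it, and compare with the maven jar. The paths below are my guesses from the default gradle layout, so adjust to your checkout:
jar tf metadata-integration/java/spark-lineage/build/libs/datahub-spark-lineage-*.jar | grep -i zstd   # my compiled jar
jar tf datahub-spark-lineage-0.9.2.jar | grep -i zstd   # the maven-downloaded jar, path is a placeholder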
d
hmm, which version of docker do you have? Is it possible it is not running? Based on the test error you mention, it seems like TestContainers is unable to find the docker environment.
The UnsatisfiedLinkError seems like a spark-setup-related issue. I haven't seen this one before, but this is what I found on the Internet -> https://github.com/luben/zstd-jni/issues/139
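To rule out the docker side quickly, this should succeed for the user running the build:
docker info   # prints server details only if the daemon is reachable by this user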
m
I double checked in case it wasn't, but docker was indeed running. I will read the github issue in case it applies, thank you Tamas.
b
Odd, can you run docker containers manually via the cli? Perhaps the user doesn’t have permission to use docker? Does the hello world test for that user work?
docker run hello-world
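If that fails with a permission error, a common fix (assuming a Linux host; <user> is a placeholder) is to add the account to the docker group:
sudo usermod -aG docker <user>
newgrp docker   # or log out and back in so the group change takes effect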
m
Oh my bad, when I set up the VM I thought I had given the user root privileges, but it seems I was mistaken. Changing the user, I was able to build it correctly, thank you!!
I have checked the solution from the github issue that @dazzling-judge-80093 posted, but I haven't been able to solve it. It is quite strange: at first (using the jar downloaded from the maven repository) it executed the applications perfectly. Then I switched to my compiled jar and the error started to appear. But what surprises me the most is that after the error appeared I switched back to the jar from maven (restarting spark between changes, obviously) and the error kept printing and the applications weren't executed correctly.
d
What happens if you run your spark job without our jar?
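i.e. drop the lineage flags from spark-submit; roughly like this (the package and listener names are the usual ones from our docs, the app details are placeholders):
spark-submit --packages io.acryl:datahub-spark-lineage:0.9.1 --conf spark.extraListeners=datahub.spark.DatahubSparkListener --class com.example.MyApp myapp.jar   # with the plugin
spark-submit --class com.example.MyApp myapp.jar   # without it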
m
It prints the same error, although at first it didn't. Could it be possible that the installation of my compiled jar updated or modified something in the package mentioned (zstd-jni)?
d
it shouldn’t, you can try to remove our jar
our jar doesn’t/can’t modify anything in your spark setup. If you don’t specify the jar then it should run as before.
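If the error still shows up even without our jar, my guess (not verified) would be a stale cached copy: if you pass the plugin via --packages, spark caches the download under ~/.ivy2, so it can survive a restart. Worth checking:
find ~/.ivy2 -name '*datahub-spark-lineage*' 2>/dev/null   # cached copies of the plugin
ls $SPARK_HOME/jars | grep -i zstd   # the zstd-jni version your spark ships with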
m
And hdfs?? I imagine it doesn't modify that either, but it could (as the jar is moved to HDFS after the download process, if I am not mistaken).
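For instance, on YARN I would expect the staged jars somewhere like this (the path is a guess based on the default staging dir; <me> is a placeholder for the submitting user):
hdfs dfs -ls /user/<me>/.sparkStaging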