# troubleshooting
j
Hello, I’m getting the following error when setting up the Flink SQL Client on my machine with a third-party connector. I’m using Gradle to build the project, and I have included the connector dependency in my build.gradle file:
```
dependencies {
    ...
    implementation 'com.ververica:flink-sql-connector-postgres-cdc:2.2.1'
}
```
This is the error I’m getting:
```
[ERROR] Could not execute SQL statement. Reason:
org.apache.flink.table.api.ValidationException: Could not find any factory for identifier 'postgres-cdc' that implements 'org.apache.flink.table.factories.DynamicTableFactory' in the classpath.
```
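For context, the table registration that triggers this looks roughly like the following; the hostname, credentials, and table/column names here are placeholders:
```
-- Roughly the CREATE TABLE I'm running; connection details are placeholders
CREATE TABLE shipments (
    shipment_id INT,
    order_id INT,
    origin STRING,
    PRIMARY KEY (shipment_id) NOT ENFORCED
) WITH (
    'connector' = 'postgres-cdc',
    'hostname' = 'localhost',
    'port' = '5432',
    'username' = 'postgres',
    'password' = 'postgres',
    'database-name' = 'postgres',
    'schema-name' = 'public',
    'table-name' = 'shipments'
);
```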
I will add more details in-thread for the steps I have taken. Any help would be much appreciated.
Here is what I have done:
1. Build a fat JAR using `gradle clean shadowJar`.
2. Start a cluster: `./bin/start-cluster.sh`
3. Start a SQL Client and pass it the JAR file from step 1: `./bin/sql-client.sh -j build/libs/<project-name>-<version>-all.jar`
4. Register a PostgreSQL table in Flink SQL using the ‘postgres-cdc’ connector.
5. Run a `select *` against the table created in step 4, which produces the error shown in the post.
I think this paragraph explains the exact issue I’m having, but it’s written for a pom.xml/Maven setup and doesn’t say what to do for Gradle. I also came across a similar issue on GitHub for another third-party connector, which suggests adding `mergeServiceFiles()` to the build.gradle file. I tried that and it didn’t work either; the sketch below shows roughly what I changed.
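Roughly, the relevant parts of my build.gradle after that change look like this (the shadow plugin version here is illustrative, not necessarily what I have pinned):
```
// Sketch of the relevant build.gradle pieces; plugin version is illustrative
plugins {
    id 'java'
    id 'com.github.johnrengelman.shadow' version '7.1.2'
}

dependencies {
    implementation 'com.ververica:flink-sql-connector-postgres-cdc:2.2.1'
}

shadowJar {
    // Merge META-INF/services entries from all dependencies so that
    // ServiceLoader-based discovery (which Flink uses to find table
    // factories) still sees the connector's factory registrations
    mergeServiceFiles()
}
```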
m
I'm actually not sure that you can add a connector during step 3, because you've already started the cluster, and the connector code should be loaded when the cluster starts, not afterwards
But I'm not 100% sure
j
Hey Martijn, do you know how I can load the connector code when starting the cluster in that case?
So I have got this working by downloading the JAR file for that single connector and moving it to `$FLINK_HOME/lib/` before starting the cluster, but it’s not particularly clear how I can do this with a fat JAR.
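Concretely, the single-connector workaround was roughly this (the download URL and version are illustrative):
```
# Drop the single connector jar into Flink's lib/ directory
# (URL/version illustrative), then restart so the cluster picks it up
wget https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-postgres-cdc/2.2.1/flink-sql-connector-postgres-cdc-2.2.1.jar \
  -P $FLINK_HOME/lib/
$FLINK_HOME/bin/stop-cluster.sh && $FLINK_HOME/bin/start-cluster.sh
```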
m
> but it’s not particularly clear how I can do this with a fat JAR.
I'm not sure what the intent of that fat JAR is. What's in there? Are there UDFs? Because a normal use case for the SQL Client is that you submit SQL statements, which get executed; there's no JAR involved in those cases
> So I have got this working by downloading the JAR file for that single connector and moving it to `$FLINK_HOME/lib/`
That's indeed how you normally add connectors to the cluster. https://nightlies.apache.org/flink/flink-docs-stable/docs/dev/configuration/overview/ talks specifically about Table API programs, not SQL applications
https://nightlies.apache.org/flink/flink-docs-stable/docs/dev/configuration/connector/ also covers the possible options. I do think we could clarify how you'd add connectors for SQL-only applications, because there's no JAR in those cases
j
I see, thanks for clarifying. Yeah, this threw me off in the docs:
> An overview of available connectors and formats is available for both DataStream and Table API/SQL.
> …
> The uber/fat JARs are supported mostly for being used in conjunction with the SQL client, but you can also use them in any DataStream/Table application.
(Link)
m
Yeah I agree that's confusing