# troubleshooting
y
Hello, I am trying to deploy a Flink app on a Kubernetes cluster (application mode) using the Flink Kubernetes operator. My app executes a SQL INSERT statement and later an UPDATE statement via the Flink SQL API in batch mode, using back-to-back tableEnv.executeSql calls. Deploying the app fails with an error like the following:
"Caused by: org.apache.flink.util.FlinkRuntimeException: Cannot have more than one execute() or executeAsync() call in a single environment"
How can I avoid this issue and deploy the application without having to split the insert and update into separate apps?
d
In batch mode, Flink expects all operations to run as part of a single job execution. Each executeSql call on a DML statement submits its own job, and in application mode only one execute() / executeAsync() call is allowed per environment, so back-to-back calls produce exactly the error you are seeing.
You have a few choices: 1) Chain the operations into a single execution.
```java
// Combine the insert and update logic into a single statement
String sql = "INSERT INTO target_table SELECT ... FROM source_table WHERE ...";
tableEnv.executeSql(sql);
```
2) Use temporal joins to express the update as part of the query, or 3) switch to streaming mode.
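One concrete way to do option 1 is Flink's StatementSet API, which collects several INSERT statements and submits them together as a single job, so only one execute() call ever happens. A minimal sketch, assuming hypothetical tables `target_table`, `audit_table`, and `source_table` (and note the statements in a set are compiled into one job graph, so an update that depends on the insert's committed result may still need to be folded into one query as in option 1):

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.StatementSet;
import org.apache.flink.table.api.TableEnvironment;

public class SingleJobPipeline {
    public static void main(String[] args) {
        // Batch-mode table environment, as in the original setup
        TableEnvironment tableEnv = TableEnvironment.create(
                EnvironmentSettings.newInstance().inBatchMode().build());

        // Collect all DML statements and submit them as ONE job,
        // avoiding the "more than one execute()" restriction
        StatementSet statements = tableEnv.createStatementSet();
        statements.addInsertSql(
                "INSERT INTO target_table SELECT ... FROM source_table"); // hypothetical
        statements.addInsertSql(
                "INSERT INTO audit_table SELECT ... FROM source_table");  // hypothetical
        statements.execute(); // single job submission
    }
}
```

This is a sketch, not a drop-in fix; the table definitions and the elided SELECT clauses would come from your actual pipeline.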
y
Thanks Draco for the reply. I'm afraid option 1 does not work; it produces a similar error. And options 2 and 3 are really different implementations that change the scope.
d
Well, if you need more control, there is always the DataStream API.