# troubleshooting
m
Hi Team πŸ‘‹ We're running PyFlink on K8s using the FlinkDeployment. I have a job that declares a Kafka source and sink in the Table API; the source is converted to a DataStream to run a custom process function (it interacts with state and emits an appropriate message afterwards), and the result is then inserted into the above-mentioned sink. Locally I can submit the job, and it actually submits 2 jobs/graphs: the DataStream API piece and the Table API piece separately. The process function's logs also show up in the taskmanager output. On Kubernetes I get
`Cannot have more than one execute() or executeAsync() call in a single environment.`
as described in a closed issue here. Whenever I remove the env.execute(), only the Table API part is submitted: messages are output into the Kafka stream, but there is no sign of the Python process function executing in the logs 🀯 Could you please advise on how to execute all parts of the code? πŸ™
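For reference, the pattern documented for mixing the Table API and DataStream API in one pipeline is to convert the table to a stream, process it, convert back, and let a single `execute_insert()` drive the whole graph, so no separate `env.execute()` is needed. A minimal sketch under that assumption follows; the table names (`source_table`, `sink_table`) and the process function are placeholders, not taken from the thread:

```python
# Hedged sketch: one connected job graph so only a single execute call exists.
# Table/class names below are illustrative assumptions.
from pyflink.datastream import StreamExecutionEnvironment
from pyflink.datastream.functions import ProcessFunction
from pyflink.table import StreamTableEnvironment


class MyProcess(ProcessFunction):
    """Placeholder for the stateful process function from the thread."""

    def process_element(self, value, ctx):
        yield value


env = StreamExecutionEnvironment.get_execution_environment()
t_env = StreamTableEnvironment.create(env)

# Kafka source declared via Table API DDL (DDL elided here).
source = t_env.from_path("source_table")

# Convert to a DataStream for the custom processing step.
ds = t_env.to_data_stream(source)
processed = ds.process(MyProcess())

# Back to the Table API and into the Kafka sink. execute_insert() is the
# ONLY execute-style call; adding env.execute() on top of it triggers
# "Cannot have more than one execute() or executeAsync() call ...".
result_table = t_env.from_data_stream(processed)
result_table.execute_insert("sink_table").wait()
```

Because the DataStream transformation is spliced into the same topology, the insert statement should pull the process function along with it instead of leaving it as a second, unsubmitted graph.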
j
Seems like you use the K8s operator and it fails in application mode? The issue indicates it works with `flink run` but, by design, not via the UI or REST API. The Flink job may rely on the REST API. You could create a session cluster and submit your app via `flink run`. It should be possible using a Kubernetes Job.
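Submitting to a session cluster from a Kubernetes Job could look roughly like this; the cluster id and script name are assumptions for illustration:

```shell
# Hypothetical: submit the PyFlink job to an existing Kubernetes session
# cluster via the flink CLI. "my-session-cluster" and "my_job.py" are
# placeholder names.
./bin/flink run \
  --target kubernetes-session \
  -Dkubernetes.cluster-id=my-session-cluster \
  --python my_job.py
```

In session mode the CLI drives submission itself, which is the `flink run` path the linked issue says works, as opposed to the operator's application-mode REST submission.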
d
This error typically arises when you try to submit multiple jobs within the same execution environment in Flink.