# troubleshooting
j
Hello everyone, I have a short question regarding the Flink JDBC Connector, which we use together with an Oracle database. Sometimes, when an error happens while writing data into the database and Flink retries the write and also fails during the retries, the sessions do not seem to be closed by Flink. We took a look into the code of the JDBC Connector (taken from Github.com/flink-connector-jdbc) and I have a short question: in the `close` method, the sessions are not closed if an exception happens during the call to `flush()`. Is this correct? Or should the `connectionProvider.closeConnection()` call be put into a `finally` block?
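To make the question concrete, here is a minimal sketch of the shape we mean. The class, field, and method names (`connectionProvider`, `flush`, `batchCount`) only mirror the ones mentioned in this thread; this is a simplified illustration, not the actual connector source:

```java
// Hypothetical, simplified sketch of the close() path discussed above.
public class JdbcWriterSketch implements AutoCloseable {

    /** Minimal stand-in for the connection provider used in this sketch. */
    interface ConnectionProvider {
        void closeConnection();
    }

    private final ConnectionProvider connectionProvider;
    private int batchCount;
    private boolean closed;

    public JdbcWriterSketch(ConnectionProvider connectionProvider) {
        this.connectionProvider = connectionProvider;
    }

    /** Writes the buffered batch; may throw if the database rejects it. */
    private void flush() {
        // batch execution would go here
        batchCount = 0;
    }

    @Override
    public synchronized void close() {
        if (closed) {
            return;
        }
        closed = true;
        try {
            if (batchCount > 0) {
                // If this throws and the exception propagates directly,
                // any cleanup placed after the try block is skipped.
                flush();
            }
        } finally {
            // Putting the cleanup into finally guarantees the Oracle session
            // is released even when flush() fails.
            connectionProvider.closeConnection();
        }
    }
}
```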
m
@Joao Boto Do you know this one perhaps?
j
I would say no, but I have to check in more detail. If something has to be added, I think the change should be to replace the throw of the exception with `flushException = e;`. This way `checkFlushException` will throw the exception.
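Roughly, that idea would look like the sketch below: remember the flush failure instead of rethrowing immediately, close the connection, and let `checkFlushException` surface the error afterwards. Again, the names here are hypothetical stand-ins, not the connector source:

```java
// Sketch of the alternative: record the flush failure rather than throwing
// inside close(), so the connection cleanup is always reached.
public class FlushExceptionSketch implements AutoCloseable {

    interface ConnectionProvider {
        void closeConnection();
    }

    private final ConnectionProvider connectionProvider;
    private volatile Exception flushException;

    public FlushExceptionSketch(ConnectionProvider connectionProvider) {
        this.connectionProvider = connectionProvider;
    }

    private void flush() throws Exception {
        // batch execution would go here
    }

    private void checkFlushException() {
        if (flushException != null) {
            throw new RuntimeException("Writing records to JDBC failed.", flushException);
        }
    }

    @Override
    public synchronized void close() {
        try {
            flush();
        } catch (Exception e) {
            flushException = e;               // remember the failure instead of throwing
        }
        connectionProvider.closeConnection();  // always reached now
        checkFlushException();                 // re-surface the flush failure afterwards
    }
}
```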
@Jan Kevin Dick have you checked the data you are trying to insert into Oracle to see if it is ok?
j
@Joao Boto as far as we know, no data is successfully written to the database. But the session count on our Oracle database increases relatively fast until the complete process memory is full and no new session can be opened. We also saw a lot of (>1000) open inactive sessions. On the Flink side our job crashes and is then automatically restarted. We assume the job retries with the same data and crashes again, until the inactive sessions use up all available memory.
j
@Jan Kevin Dick the problem of sessions is addressed in this PR: https://github.com/apache/flink-connector-jdbc/pull/5/files
j
@Joao Boto thanks, and sorry for the late reply.