Hi all, I am building a streaming app with PyFlink that needs to do a lookup into a shared state store to obtain a value produced by a different Flink job. As I understand it, Flink's native state stores are tied to an operator (e.g. flat_map…). The options I am considering are either running my own RocksDB instance to serve as the shared state, or building a table on top of a changelog topic in each Flink job that needs the lookup (rough sketch of that second option below). I am wondering if there is a better/"Flink native" way of implementing such a lookup to a shared state store? My stack is PyFlink + Confluent Kafka.
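Here is a rough sketch of what I mean by the changelog-topic option, assuming the other job publishes key/value updates to a compacted topic (I'm calling it "shared-state-changelog" here) and my main stream arrives on an "events" topic; all topic names, fields and the join key are placeholders:

```python
from pyflink.table import EnvironmentSettings, TableEnvironment

t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# Materialize the changelog topic as a continuously updating table;
# upsert-kafka keeps only the latest value per primary key.
t_env.execute_sql("""
    CREATE TABLE shared_state (
        lookup_key STRING,
        shared_value STRING,
        PRIMARY KEY (lookup_key) NOT ENFORCED
    ) WITH (
        'connector' = 'upsert-kafka',
        'topic' = 'shared-state-changelog',
        'properties.bootstrap.servers' = 'localhost:9092',
        'key.format' = 'raw',
        'value.format' = 'json'
    )
""")

# The main event stream that needs the lookup.
t_env.execute_sql("""
    CREATE TABLE events (
        event_id STRING,
        lookup_key STRING,
        payload STRING
    ) WITH (
        'connector' = 'kafka',
        'topic' = 'events',
        'properties.bootstrap.servers' = 'localhost:9092',
        'properties.group.id' = 'lookup-job',
        'scan.startup.mode' = 'latest-offset',
        'format' = 'json'
    )
""")

# Regular join: Flink keeps the latest shared_state row per key in the
# job's own state, so each event is enriched with the current value.
enriched = t_env.sql_query("""
    SELECT e.event_id, e.payload, s.shared_value
    FROM events AS e
    LEFT JOIN shared_state AS s
    ON e.lookup_key = s.lookup_key
""")
```

This works, but it means every job doing the lookup has to rebuild the table from the changelog topic in its own state, and the join output becomes an updating stream when the shared value changes, which is why I'm asking whether there is a more idiomatic approach.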
Any help will be much appreciated.