Hello,
We have a very large Kafka topic (roughly 10-25 million rows/second) and we would like to use real-time ingestion to populate several tables. For each table we have a custom decoder that knows how to extract the relevant data from each message, or to skip the message entirely.
I'm curious how the ingestion works: will each table consume the stream independently (one Kafka consumer per table), or can the topic be ingested once with each decoder then applied to the ingested rows?
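For context, here is roughly how we assumed a decoder would be wired into a table's stream config. This is just a sketch: the topic, broker, and decoder class names are placeholders for our setup, not real values.

```json
{
  "tableName": "ourTable_REALTIME",
  "tableType": "REALTIME",
  "tableIndexConfig": {
    "streamConfigs": {
      "streamType": "kafka",
      "stream.kafka.topic.name": "our-large-topic",
      "stream.kafka.broker.list": "kafka-broker:9092",
      "stream.kafka.consumer.type": "lowlevel",
      "stream.kafka.consumer.factory.class.name": "org.apache.pinot.plugin.stream.kafka20.KafkaConsumerFactory",
      "stream.kafka.decoder.class.name": "com.example.OurCustomTableDecoder"
    }
  }
}
```

If each table configured this way spins up its own consumers against the same topic, that would mean reading the full 10-25M rows/second once per table, which is what we're hoping to avoid.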
Thanks a lot :)