#general

Mark.Tang

01/07/2021, 6:24 AM
Hi team, a streaming app often does the following: 1. Read local files into Kafka using Flume 2. Do ETL transformation on the Kafka topic using Flink 3. Push data from Flink into LinkedIn's Pinot. So I am not doing a direct mapping from Kafka to a Pinot table as described in https://docs.pinot.apache.org/basics/data-import/pinot-stream-ingestion — any suggestion or example would help, thanks!

Kishore G

01/07/2021, 7:50 AM
You can write the output of Flink to another Kafka topic for real-time ingest, or use the Pinot segment generation API to do periodic batch uploads
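The first option just means pointing a Pinot real-time table at the topic Flink writes to. A minimal sketch of the relevant `streamConfigs` section of the table config, following the stream-ingestion docs linked above — the topic name `flink-output` and the broker address are hypothetical placeholders:

```json
{
  "streamConfigs": {
    "streamType": "kafka",
    "stream.kafka.topic.name": "flink-output",
    "stream.kafka.broker.list": "localhost:9092",
    "stream.kafka.consumer.type": "lowlevel",
    "stream.kafka.consumer.factory.class.name": "org.apache.pinot.plugin.stream.kafka20.KafkaConsumerFactory",
    "stream.kafka.decoder.class.name": "org.apache.pinot.plugin.stream.kafka.KafkaJSONMessageDecoder"
  }
}
```

With this, Pinot consumes the ETL'd records directly from the Flink-output topic, so no separate write API is needed.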

Mark.Tang

01/07/2021, 8:09 AM
Thanks for the reply @Kishore G. Ideally, I'd like to write the output of Flink into Pinot directly, in-memory.

Kishore G

01/07/2021, 8:11 AM
Pinot does not have a write API as of now...
writes happen via Kafka
it's something we plan to add in 2021
❤️ 1

Mark.Tang

01/07/2021, 8:14 AM
Sure, I'm ready to write the output of Flink to another Kafka topic for real-time ingest. Looking forward to the write API 👍