#write-for-the-community
Dunith

01/28/2022, 1:35 PM
Hello, I submitted an issue for a new article idea: https://github.com/airbytehq/write-for-the-community/issues/145

Submission Details
Having timely access to insights is crucial for data-driven decision making. Typically, it takes significant engineering effort to build and maintain a data pipeline that moves operational data into a real-time analytics platform like Apache Pinot. This tutorial explores how Airbyte makes it easy and accessible for a typical developer to build a streaming data pipeline that moves operational data from a MySQL database to Apache Pinot.

Steps
Airbyte related:
1. Configure a source connector for MySQL
2. Configure a destination connector for Kafka
3. Make a connection between MySQL and Kafka
Pinot related:
1. Configure a table definition to ingest from the Kafka topic that receives data from Airbyte
2. Query the table using the integrated UI
3. Invoke Pinot APIs with cURL to mimic an application making a request for analytics
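The last step above could be sketched roughly as follows, assuming a Pinot broker running locally on its default port (8099) and a hypothetical `orders` table fed by the Airbyte/Kafka pipeline; the query and hostnames are illustrative, not from the submission:

```shell
# Build the SQL query payload for Pinot's broker REST API (POST /query/sql).
# Table name "orders" and the query itself are assumptions for illustration.
PAYLOAD='{"sql": "SELECT status, COUNT(*) FROM orders GROUP BY status LIMIT 10"}'
echo "$PAYLOAD"

# With a broker running locally, the request an application would make
# looks like this (commented out, since it needs a live Pinot cluster):
# curl -s -X POST -H "Content-Type: application/json" \
#   -d "$PAYLOAD" http://localhost:8099/query/sql
```

The response would be a JSON document whose `resultTable` field contains the column names and rows, which an application can render directly in a dashboard.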
1:36 PM
Let me know the next steps
Ari Bajo (Airbyte)

01/28/2022, 2:05 PM
@Dunith Sounds awesome! 🙂 Can you book some time to discuss the topic and process next week? You can do it here: https://calendly.com/ari-airbyte/30min
Dunith

01/28/2022, 2:47 PM
Absolutely! I've already blocked your calendar. Looking forward to the chat next week.