# write-for-the-community
Dunith:
Hello, I submitted an issue for a new article idea: https://github.com/airbytehq/write-for-the-community/issues/145

**Submission details**

Having timely access to insights is crucial for data-driven decision making. Typically, it takes a significant amount of engineering effort to build and maintain a data pipeline that moves operational data into a real-time analytics platform like Apache Pinot. This tutorial explores how Airbyte makes it easy and accessible for a typical developer to build a streaming data pipeline that moves operational data from a MySQL database into Apache Pinot.

**Steps** (rough sketches of what a few of these steps might look like are at the end of this message)

Airbyte related:
1. Configure a source connector for MySQL
2. Configure a destination connector for Kafka
3. Create a connection between MySQL and Kafka

Pinot related:
1. Configure a table definition to ingest from the Kafka topic that receives data from Airbyte
2. Query the table using Pinot's integrated query UI
3. Invoke the Pinot APIs with curl to mimic an application making requests for analytics
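Here are the sketches I mentioned. For the Airbyte steps, most of the configuration happens in the UI, but to give a feel for it, this is roughly what creating the MySQL source could look like through Airbyte's configuration API. The host, IDs, credentials, and connectionConfiguration fields are placeholders/assumptions on my part and would be checked against the current MySQL source connector docs while writing the article:

```bash
# Rough sketch only: create a MySQL source via the Airbyte configuration API.
# Assumes a local Airbyte deployment reachable at localhost:8000; the workspace ID,
# source definition ID, and connection settings below are placeholders.
curl -X POST http://localhost:8000/api/v1/sources/create \
  -H "Content-Type: application/json" \
  -d '{
    "workspaceId": "<your-workspace-id>",
    "sourceDefinitionId": "<mysql-source-definition-id>",
    "name": "orders-mysql",
    "connectionConfiguration": {
      "host": "mysql.internal",
      "port": 3306,
      "database": "orders",
      "username": "airbyte",
      "password": "<password>"
    }
  }'
```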
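For the Pinot table definition, the idea is a REALTIME table whose stream config points at the Kafka topic Airbyte writes to. This is a trimmed sketch posted to the controller's /tables endpoint; the table, topic, and column names and the broker address are placeholders, and the stream properties would be verified against the Pinot Kafka plugin docs for the version used in the article:

```bash
# Rough sketch only: register a REALTIME table that ingests from the Kafka topic.
# Assumes a Pinot controller at localhost:9000 and a matching "orders" schema
# already uploaded; names and addresses are placeholders.
curl -X POST http://localhost:9000/tables \
  -H "Content-Type: application/json" \
  -d '{
    "tableName": "orders",
    "tableType": "REALTIME",
    "segmentsConfig": {
      "timeColumnName": "updated_at",
      "schemaName": "orders",
      "replicasPerPartition": "1"
    },
    "tableIndexConfig": {
      "loadMode": "MMAP",
      "streamConfigs": {
        "streamType": "kafka",
        "stream.kafka.topic.name": "orders",
        "stream.kafka.broker.list": "kafka:9092",
        "stream.kafka.consumer.type": "lowlevel",
        "stream.kafka.consumer.factory.class.name": "org.apache.pinot.plugin.stream.kafka20.KafkaConsumerFactory",
        "stream.kafka.decoder.class.name": "org.apache.pinot.plugin.stream.kafka.KafkaJSONMessageDecoder"
      }
    },
    "tenants": {},
    "metadata": {}
  }'
```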
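And for the last step, the kind of call an application would make for analytics: a SQL query posted to the Pinot broker's query endpoint (default broker port 8099 assumed; the query, table, and column names are just examples):

```bash
# Rough sketch only: query Pinot over HTTP the way an application would.
# Assumes the broker is reachable at localhost:8099; table and columns are placeholders.
curl -X POST http://localhost:8099/query/sql \
  -H "Content-Type: application/json" \
  -d '{"sql": "SELECT status, COUNT(*) AS order_count FROM orders GROUP BY status LIMIT 10"}'
```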
Let me know the next steps
Ari:
@Dunith Sounds awesome! 🙂 Can you book some time to discuss the topic and process next week? You can do it here: https://calendly.com/ari-airbyte/30min
Dunith:
Absolutely! I've already blocked your calendar. Looking forward to the chat next week.