# getting-started
e
Additionally, when I add a new topic and add sample data to it using kafkacat, like this:
kcat -P -b localhost:9092 -t topic1 -K :
mykey1:mymessage1
mykey2:mymessage2
I don't see it show up in DataHub. Do I need to run an ingestion job after this for the topic to be picked up by DataHub?
b
You need to run an ingestion job for DataHub to pick up the topic. Documentation here: https://github.com/linkedin/datahub/blob/master/metadata-ingestion/source_docs/kafka.md
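For reference, a minimal ingestion recipe for that setup might look something like this (a sketch based on the linked docs, assuming a local broker on port 9092 and a DataHub REST endpoint on port 8080; the filename and server address are illustrative):

```yaml
# kafka_recipe.yml -- minimal Kafka -> DataHub ingestion recipe
source:
  type: "kafka"
  config:
    connection:
      bootstrap: "localhost:9092"
sink:
  type: "datahub-rest"
  config:
    server: "http://localhost:8080"
```

You would then run it with the DataHub CLI, e.g. `datahub ingest -c kafka_recipe.yml`, and the new topics should appear after the job completes.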
e
Thanks! Will dig through that. So when this happens in actual production at an enterprise level, with tons of developers producing new topics, does the data team need to run periodic ingestion jobs via Airflow to pick up the new topic metadata, or is there a trigger mechanism that can automate ingestion at topic-creation time? Thanks for the insight.
l
Yes, you would need to schedule ingestion using Airflow or a similar mechanism.
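A scheduled Airflow setup can be as simple as a DAG that shells out to the DataHub CLI on an interval. A minimal sketch (assuming Airflow 2.x and a recipe file at an illustrative path; the DAG id, schedule, and path are all assumptions, not from the docs above):

```python
# Periodically re-ingest Kafka topic metadata into DataHub.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="kafka_metadata_ingestion",   # illustrative name
    schedule_interval="@hourly",         # pick an interval that suits topic churn
    start_date=datetime(2022, 1, 1),
    catchup=False,
) as dag:
    # Runs the same CLI command you would use manually; newly created
    # topics are picked up on the next scheduled run.
    ingest = BashOperator(
        task_id="ingest_kafka",
        bash_command="datahub ingest -c /opt/recipes/kafka_recipe.yml",
    )
```

Since ingestion is pull-based here, new topics show up with up to one schedule interval of delay rather than at creation time.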