# ingestion
s
Hi all, I am not using Docker (docker commands) to ingest data. I have produced the data to the "MetadataChangeEvent" topic, which is persisted to MySQL through GMS. Can anyone suggest how to use the "MetadataAuditEvent"?
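For context, producing to the "MetadataChangeEvent" topic without Docker roughly means building a DatasetSnapshot payload and sending it with a Kafka producer. Below is a minimal Python sketch of just the payload shape; the nested `com.linkedin.pegasus2avro...` names are assumptions based on DataHub's Avro schemas, so verify them against your schema registry before producing real events:

```python
import json

def build_dataset_mce(dataset_urn, description=""):
    """Build a MetadataChangeEvent-style payload for a dataset.

    NOTE: the field and record names below only approximate DataHub's
    MetadataChangeEvent Avro schema; check the schema registry for the
    authoritative definition before producing real events.
    """
    return {
        "proposedSnapshot": {
            "com.linkedin.pegasus2avro.metadata.snapshot.DatasetSnapshot": {
                "urn": dataset_urn,
                "aspects": [
                    {
                        "com.linkedin.pegasus2avro.dataset.DatasetProperties": {
                            "description": description,
                        }
                    }
                ],
            }
        }
    }

if __name__ == "__main__":
    mce = build_dataset_mce(
        "urn:li:dataset:(urn:li:dataPlatform:mysql,mydb.mytable,PROD)")
    # In a real setup you would serialize this against the Avro schema and
    # produce it to the MetadataChangeEvent topic (e.g. with a Kafka Avro
    # producer); that part is omitted here.
    print(json.dumps(mce, indent=2))
```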
b
s
yes sir
b
Got it. So what do you mean by "using MAE"? Are you trying to update the search index?
Or just to verify that MAE was emitted correctly?
s
Yes. Actually, my data is persisted to MySQL but is not searchable in the DataHub UI.
I want to make my data searchable.
b
Got it. I assume you're using the quickstart docker images?
s
Yes, but I am not using docker commands to ingest metadata.
b
Sure. Could you run
docker logs datahub-mae-consumer
s
I've installed Kafka and MySQL locally,
so the docker command won't work.
b
Sure, but you need to have the datahub-mae-consumer docker image up and running to process the MAEs.
s
Okay, I will check and get back to you.
Sure, thanks
b
You can start only the mae-consumer-job using this docker file: https://github.com/linkedin/datahub/tree/master/docker/mae-consumer
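A sketch of what starting just that container against locally installed services might look like. The environment variable names and image name here are assumptions drawn from DataHub's quickstart docker-compose files and may differ in your version, so treat this as a starting point to verify, not a definitive command:

```shell
# Run only the MAE consumer against locally installed Kafka and
# Elasticsearch. Image name and env var names are assumptions based on
# the quickstart compose files -- verify them for your DataHub version.
docker run --name datahub-mae-consumer \
  --network host \
  -e KAFKA_BOOTSTRAP_SERVER=localhost:9092 \
  -e KAFKA_SCHEMAREGISTRY_URL=http://localhost:8081 \
  -e ELASTICSEARCH_HOST=localhost \
  -e ELASTICSEARCH_PORT=9200 \
  linkedin/datahub-mae-consumer
```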
s
Is "mae-consumer-job" a topic in Kafka?
A topic with the name "MetadataAuditEvent"?
b
It's a Kafka stream job, just like the "mce-consumer-job" that consumes MetadataChangeEvent and injects it into the MySQL DB.
Please take a look at the architecture diagram I posted above to get a better understanding of how things are hooked together.
s
Yes, and I have already created a topic "MetadataAuditEvent".
b
You still need the process running in order to consume the messages from the topic
m
Maybe also worth mentioning: you need to have Elasticsearch running as well.
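A quick way to sanity-check that point is to hit Elasticsearch's root endpoint. This is a minimal sketch assuming the default Elasticsearch host and port (localhost:9200); adjust for your setup:

```python
import json
import urllib.error
import urllib.request

def elasticsearch_is_up(base_url="http://localhost:9200", timeout=2.0):
    """Return True if an Elasticsearch node answers at base_url.

    Purely a local sanity check; localhost:9200 is the Elasticsearch
    default and may differ in your setup.
    """
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            info = json.load(resp)
            # A healthy node reports its version in the root response.
            return "version" in info
    except (urllib.error.URLError, OSError, ValueError):
        return False

if __name__ == "__main__":
    if elasticsearch_is_up():
        print("Elasticsearch is reachable; the MAE consumer can index into it.")
    else:
        print("Elasticsearch is not reachable on localhost:9200.")
```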