# all-things-deployment
Antonius:
Hi guys! How can I ingest Airflow into the DataHub system?
Also, is Neo4j necessary or optional? (I'd like to know more about the Neo4j dependency.)
incalculable-ocean-74010:
Hello Antonius, to ingest Airflow please take a look at DataHub's ingestion documentation: https://datahubproject.io/docs/lineage/airflow. As for Neo4j, the technology itself is optional; you can opt for Elasticsearch instead. DataHub uses one or the other for search & indexing. Take a look at DataHub's architecture docs: https://datahubproject.io/docs/components#metadata-store
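For reference, the Airflow integration linked above is typically wired up roughly like this. This is a hedged sketch, not verbatim from the docs: the plugin package name, connection type, and the `datahub-gms` hostname are assumptions you should check against the documentation for your Airflow and DataHub versions.

```shell
# Sketch of enabling DataHub's Airflow integration (assumed package and
# connection names; the GMS hostname is a placeholder for your deployment).

# 1. Install the DataHub plugin into the Airflow environment:
pip install acryl-datahub-airflow-plugin

# 2. Register an Airflow connection pointing at your DataHub GMS endpoint,
#    which the plugin uses to emit lineage metadata:
airflow connections add 'datahub_rest_default' \
    --conn-type 'datahub_rest' \
    --conn-host 'http://datahub-gms:8080'   # placeholder GMS address
```

After restarting Airflow, DAG and task lineage should start flowing to DataHub; the linked docs describe the per-version configuration knobs.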
Antonius:
@incalculable-ocean-74010 Thank you! Can I ask one more question? I want to ingest a PostgreSQL dataset. I created an ingestion source via the web UI, as in the picture below, and set the schedule to '00 03 * * *'. I first ran it manually to check that the ingestion works, but it doesn't sync normally. What could be the reason, and how can I solve it? Syncing manually through the CLI on my EKS server did work.
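For anyone comparing the UI-scheduled run with a manual CLI run, a minimal PostgreSQL recipe looks roughly like the sketch below. All hostnames, credentials, and the GMS address are placeholders, not values from this thread; the point is that the same recipe the UI stores can be executed by hand with `datahub ingest`, which is useful for isolating whether the source config or the scheduler is the problem.

```shell
# Hypothetical minimal Postgres recipe (placeholder host, database,
# credentials, and GMS endpoint; adjust to your environment).
cat > postgres_recipe.yml <<'EOF'
source:
  type: postgres
  config:
    host_port: "my-postgres:5432"      # placeholder
    database: mydb                     # placeholder
    username: datahub_reader           # placeholder
    password: "${POSTGRES_PASSWORD}"   # read from the environment
sink:
  type: datahub-rest
  config:
    server: "http://datahub-gms:8080"  # placeholder GMS endpoint
EOF

# Run the recipe manually, as was done on the EKS server:
datahub ingest -c postgres_recipe.yml
```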
Antonius:
Cross-post ref: https://datahubspace.slack.com/archives/CUMUWQU66/p1644476292351779 Please keep to threads rather than cross-posting; it helps the team keep track of discussions.
incalculable-ocean-74010:
What do you mean by "does not sync normally"? Can you share the recipe config (click the edit button) and the execution logs for the failing run?
Antonius:
@incalculable-ocean-74010 Sorry for the cross-post. Please see this post: https://datahubspace.slack.com/archives/CUMUWQU66/p1644476292351779
incalculable-ocean-74010:
We will have office hours in 20 minutes; you can come and ask about your issue there, Antonius. Check out #office-hours!