# ingestion
Hey everyone, I am fiddling around with ingesting lineage from Airflow. However, the demo script does not really help with my problem: https://github.com/linkedin/datahub/blob/master/metadata-ingestion/src/datahub_provider/example_dags/lineage_backend_demo.py In the demo, the datasets "Table A", "Table B" and "Table C" are generated in the Airflow script. Is there a way to ingest lineage for already existing datasets? I.e., in the example, my first job has already ingested "Table A", "Table B" and "Table C". Is there any way to pass the dataset URN in the Airflow job?
Never mind, I've figured it out myself. The URN is constructed via the Dataset class; if a dataset with that URN already exists, the existing entity is used, otherwise it is created. 🥳
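For anyone else who hits this, here is a minimal sketch of what it can look like. I'm assuming the Airflow 2.x operator-level `inlets`/`outlets` arguments (on Airflow 1.10.x the demo uses the `inlets={"datasets": [...]}` form instead), and the platform name, table names, and DAG id below are just placeholders for whatever your first job already ingested:

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

# Dataset entity from the DataHub Airflow provider. It builds the dataset URN
# from platform + name, e.g.
# urn:li:dataset:(urn:li:dataPlatform:snowflake,mydb.schema.tableA,PROD)
from datahub_provider.entities import Dataset

default_args = {
    "owner": "airflow",
    "start_date": datetime(2021, 1, 1),
    "retries": 1,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    "existing_dataset_lineage",  # placeholder DAG id
    default_args=default_args,
    schedule_interval=None,
    catchup=False,
) as dag:
    # These Dataset references resolve to the same URNs as the datasets that
    # were already ingested, so the lineage attaches to the existing entities
    # instead of creating new ones.
    transform = BashOperator(
        task_id="transform",
        bash_command="echo 'transform'",
        inlets=[
            Dataset("snowflake", "mydb.schema.tableA"),
            Dataset("snowflake", "mydb.schema.tableB"),
        ],
        outlets=[Dataset("snowflake", "mydb.schema.tableC")],
    )
```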
Thanks for following up with the resolution, @witty-dream-29576! Super helpful context for other community members who might hit the same issue. 🙌