# ingestion
h
General question: how do we connect an existing Airflow environment to the DataHub Docker deployment? Also, in the documentation at https://datahubproject.io/docs/metadata-ingestion/#lineage-with-airflow, where should the airflow.cfg file be located? (New to Airflow as well.)
e
Is Airflow running in the same cluster?
Also, will you be using the lineage backend, or the operators?
h
@early-lamp-41924 - Yes, we will be using the Airflow lineage backend, though Airflow might not be running on the same cluster
e
In that case, you need to expose GMS through an ingress
and then go through steps 2 and 3 of the lineage backend setup in the docs you linked
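For concreteness, here is a rough sketch of what steps 2 and 3 look like once GMS is exposed. The hostname below is a placeholder, and the exact datahub_kwargs options may vary by plugin version:

```bash
# Step 2 (sketch): register a DataHub REST connection in Airflow that points
# at the GMS endpoint exposed through the ingress.
# "datahub_rest_default" is the default connection id the lineage backend uses.
airflow connections add \
    --conn-type 'datahub_rest' \
    'datahub_rest_default' \
    --conn-host 'http://datahub-gms.example.com:8080'  # placeholder host
```

```ini
# Step 3 (sketch): enable the lineage backend in airflow.cfg.
# This also answers the earlier question: airflow.cfg lives in
# $AIRFLOW_HOME (~/airflow by default).
[lineage]
backend = datahub_provider.lineage.datahub.DatahubLineageBackend
datahub_kwargs = {
    "capture_ownership_info": true,
    "capture_tags_info": true,
    "graceful_exceptions": true }
```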
h
@early-lamp-41924 - Can you share the steps for exposing GMS through an ingress?
e
Should be very similar to setting up ingress for the frontend! https://datahubproject.io/docs/deploy/aws#expose-endpoints-using-a-load-balancer
If you have already set up the ALB controller, you can skip to the helm values.yaml change at the end of that section
You will have to add the ingress section under datahub-gms instead of datahub-frontend
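Something like the following in values.yaml; this is a sketch modeled on the frontend example from that doc, with a placeholder hostname and certificate ARN, and annotations that assume the AWS ALB ingress controller:

```yaml
# Sketch only: mirrors the datahub-frontend ingress example from the AWS
# deploy guide, moved under datahub-gms. Adjust for your environment.
datahub-gms:
  ingress:
    enabled: true
    annotations:
      kubernetes.io/ingress.class: alb
      alb.ingress.kubernetes.io/scheme: internet-facing
      alb.ingress.kubernetes.io/target-type: instance
      alb.ingress.kubernetes.io/certificate-arn: <<certificate-arn>>  # placeholder
      alb.ingress.kubernetes.io/listen-ports: '[{"HTTP": 80}, {"HTTPS": 443}]'
    hosts:
      - host: datahub-gms.example.com  # placeholder host
        paths:
          - /*
```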
h
@early-lamp-41924 - I followed the steps and was able to connect DataHub and Airflow running on different servers. I had overlooked the `pip install acryl-datahub[airflow]` step, which is why the lineage was not visible.
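For anyone else who hits this, the overlooked step is just installing the plugin on the machine(s) running Airflow:

```bash
# Install the DataHub Airflow plugin where the scheduler/workers run,
# then restart Airflow so the lineage backend config is picked up.
pip install 'acryl-datahub[airflow]'
```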
e
Great news!!!