# advice-metadata-modeling
p
I have a simple Airflow DAG with a task like this:
```python
from airflow.decorators import task

# create_datasets() and save_data_to_the_employee_table() are local helpers.
@task(
    task_id="save_users_to_postgres",
    outlets=create_datasets(
        names=["directory.employee"], database_type="postgres"
    ),
)
def save_users_to_postgres():  # naming this `task` would shadow the decorator
    save_data_to_the_employee_table()
```
The `directory.employee` table has already been catalogued by DataHub via a Postgres => DataHub direct ingestion and has its own URN in the system. How do I ensure that the Airflow DAG gets mapped to that Postgres dataset? Do I need to specify the full URN in my Python code? Is there a way I could simplify that?
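For context, a minimal sketch of how this mapping is typically expressed with the DataHub Airflow plugin (acryl-datahub-airflow-plugin), assuming `create_datasets` wraps its `Dataset` entity: the plugin derives the dataset URN from the platform/name/env triple, so no full URN is needed as long as those values match what the Postgres ingestion produced (Postgres ingestion usually names datasets `database.schema.table`).

```python
from airflow.decorators import task
# Assumes acryl-datahub-airflow-plugin is installed.
from datahub_airflow_plugin.entities import Dataset, Urn

@task(
    task_id="save_users_to_postgres",
    outlets=[
        # The plugin builds the URN from platform + name (+ env, default PROD),
        # so this resolves to the dataset the ingestion already catalogued.
        Dataset(platform="postgres", name="directory.employee"),
        # Equivalent, if you prefer to pin the exact catalogued entity:
        # Urn("urn:li:dataset:(urn:li:dataPlatform:postgres,directory.employee,PROD)"),
    ],
)
def save_users_to_postgres():
    ...
```

If the platform/name/env triple matches the ingested dataset exactly, the plugin emits lineage against the existing URN and nothing else should be needed in the DAG code.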
a
Hi @proud-table-38689, is this issue still affecting you?
p
It's not an issue as such, just a Python code management question. I haven't revisited this yet since the holiday break.
Thanks for checking in!
a
Thank you!