proud-table-38689
12/19/2022, 10:18 PM

from airflow.decorators import task

@task(
    task_id="save_users_to_postgres",
    # create_datasets is a user-defined helper that builds the outlet entities
    outlets=create_datasets(
        names=["directory.employee"], database_type="postgres"
    ),
)
def save_users_to_postgres():  # renamed so it does not shadow the task decorator
    save_data_to_the_employee_table()
The directory.employee table has already been catalogued by DataHub via a direct Postgres => DataHub ingestion and has its own URN in the system. How do I ensure that the Airflow DAG gets mapped to that Postgres dataset? Do I need to specify the full URN in my Python code? Is there a way I could simplify that?
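For context on what the outlets expect: the DataHub Airflow integration exposes a Dataset entity that derives the dataset URN from a platform and a qualified table name, so the full URN does not normally have to be written out by hand (an Urn entity also exists for the explicit-URN case). A minimal sketch, assuming the datahub_provider.entities module from the acryl-datahub package and a placeholder database name mydb, of what a helper like create_datasets above could wrap:

from airflow.decorators import task
from datahub_provider.entities import Dataset

# "mydb" is a placeholder: the qualified name here must match the name the
# Postgres => DataHub ingestion produced, so that the derived URN
#   urn:li:dataset:(urn:li:dataPlatform:postgres,mydb.directory.employee,PROD)
# points at the already-catalogued dataset instead of creating a new one.
employee_table = Dataset(platform="postgres", name="mydb.directory.employee")

@task(task_id="save_users_to_postgres", outlets=[employee_table])
def save_users_to_postgres():
    pass  # stand-in for the actual write to directory.employee

The mapping only lines up with the existing dataset if the qualified name, env, and platform_instance match what the Postgres ingestion emitted.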