Hi everyone. I want to ingest metadata from Airflow into DataHub. In my Airflow DAG (.py) file, I set inlets and outlets on my task and configured the connection to DataHub as described in https://datahubproject.io/docs/lineage/airflow. The inlets and outlets point at HDFS files and look like outlets={"datasets": [Dataset("hdfs", "/general/project1/folder1/file1.parquet")]}. The problem is that file1's schema doesn't show up in the DataHub UI, and the dataset appears under its whole path rather than just the file name. Can anyone tell me what the cause is?
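To make this concrete, here is a minimal sketch of what my DAG roughly looks like. It assumes the acryl-datahub-airflow-plugin package and its Dataset entity (the import path may differ by plugin version), a recent Airflow 2.x, and uses the list-style inlets/outlets arguments from the linked guide rather than the dict form quoted above; the DAG id, task id, bash command, and inlet path are placeholders, while the outlet path is the real one from my question.

```python
# Minimal sketch of the DAG in question (placeholders noted inline).
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from datahub_airflow_plugin.entities import Dataset  # older releases: datahub_provider.entities

with DAG(
    dag_id="hdfs_lineage_example",  # placeholder DAG id
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    # Lineage is declared on the task via inlets/outlets. The name passed to
    # Dataset() is the full HDFS path, which is what DataHub displays as the
    # dataset name in the UI.
    process_file = BashOperator(
        task_id="process_file",          # placeholder task id
        bash_command="echo processing",  # placeholder command
        inlets=[Dataset("hdfs", "/general/project1/folder0/input.parquet")],  # hypothetical inlet path
        outlets=[Dataset("hdfs", "/general/project1/folder1/file1.parquet")],
    )
```

With a setup like this, the hdfs dataset shows up in DataHub under the full path and without any schema fields.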