# getting-started
f
Hello everybody! We are having the following issue while trying to capture a Spark job's lineage: when we read the table tmp.agus_test_4857, it gets captured in DataHub as an HDFS file instead of a Hive table. Since we've already ingested the Hive table with its metadata, the read now shows up as a separate object. Are we doing something incorrectly? Thanks in advance!
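For reference, here's a minimal sketch of how the job is set up with the DataHub Spark lineage listener (the GMS URL, agent version, and the output table are placeholders, not our exact values):
```python
from pyspark.sql import SparkSession

# Minimal sketch: the DataHub Spark agent is attached via spark.extraListeners,
# per the datahub-spark-lineage docs. <version> and <gms-host> are placeholders.
spark = (
    SparkSession.builder
    .appName("lineage-test")
    .config("spark.jars.packages", "io.acryl:datahub-spark-lineage:<version>")
    .config("spark.extraListeners", "datahub.spark.DatahubSparkListener")
    .config("spark.datahub.rest.server", "http://<gms-host>:8080")
    .enableHiveSupport()
    .getOrCreate()
)

# Reading the Hive table: this read is what shows up in lineage as an HDFS path
# (the table's warehouse location) rather than the Hive table entity.
df = spark.table("tmp.agus_test_4857")

# Hypothetical sink, just so the job produces lineage edges.
df.write.mode("overwrite").saveAsTable("tmp.agus_test_output")
```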
m
Hey @fast-potato-13714 we have an open PR that addresses this issue. https://github.com/datahub-project/datahub/pull/5687
cc @dazzling-judge-80093
f
Thanks @mammoth-bear-12532. Will it appear as a Hive table, or just with the s3:// or gcs:// prefix?
I ask because the PR doesn't mention anything about Hive.
m
@fast-potato-13714 you're right, this particular PR only addresses other file-like stores.