# advice-metadata-modeling
e
Ok, if I understood correctly, in case of a renamed table we can download the metadata and then upload the descriptions to the newly named table. But can we have DataHub understand renaming and do that automatically? Like, if a command such as
ALTER TABLE table_name RENAME TO table_new_name
were used, it should take the descriptions and other documentation of table_name and assign them to table_new_name?
a
If you do that and ingest on the new table, you should automatically get everything in. Manual edits will have to be ported over.
Which means that anything you added via the DataHub UI will need to have its parent reassigned to the new dataset URN. Which is what we talked about yesterday with Python querying and changing every key aspect.
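To make that concrete: a dataset URN embeds the table name, so a rename produces a brand-new URN and UI edits stay attached to the old one. A minimal sketch of the URN shape (the `postgres` platform, `mydb.public` schema, and `PROD` environment here are assumed placeholders, not from the thread):

```python
# Sketch: a DataHub dataset URN embeds the table name, so renaming the
# table yields a different URN. Platform/name/env below are placeholders.
def make_dataset_urn(platform: str, name: str, env: str = "PROD") -> str:
    """Build a dataset URN in DataHub's standard format."""
    return f"urn:li:dataset:(urn:li:dataPlatform:{platform},{name},{env})"

old_urn = make_dataset_urn("postgres", "mydb.public.table_name")
new_urn = make_dataset_urn("postgres", "mydb.public.table_new_name")

# After ALTER TABLE ... RENAME TO ..., ingestion creates new_urn from
# scratch; descriptions attached to old_urn do not follow automatically.
print(old_urn)
print(new_urn)
```

Since the two URNs differ, every manually added aspect parented on `old_urn` has to be re-emitted against `new_urn`.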
e
Could you please elaborate on python querying and changing key aspects? Like list the steps for me
a
@elegant-salesmen-99143 you could do this programmatically using the Python emitter / REST emitter (https://datahubproject.io/docs/metadata-ingestion/as-a-library/), but you'd need to know the URNs for everything.
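A rough sketch of what the read-and-re-emit step could look like with the emitter from the linked docs. The server URL and URNs are placeholders, and the class names (`DataHubGraph`, `DatahubRestEmitter`, `MetadataChangeProposalWrapper`, `DatasetPropertiesClass`) should be checked against your installed acryl-datahub version; treat this as an outline, not a tested recipe:

```python
def copy_description(server: str, old_urn: str, new_urn: str) -> None:
    """Sketch: copy the dataset-properties aspect (which carries the
    description) from old_urn to new_urn via the REST emitter.

    Assumes the acryl-datahub package; verify these imports and
    signatures against the as-a-library docs for your version.
    """
    from datahub.emitter.mcp import MetadataChangeProposalWrapper
    from datahub.emitter.rest_emitter import DatahubRestEmitter
    from datahub.ingestion.graph.client import DataHubGraph, DatahubClientConfig
    from datahub.metadata.schema_classes import DatasetPropertiesClass

    # 1. Read the aspect from the old URN.
    graph = DataHubGraph(DatahubClientConfig(server=server))
    props = graph.get_aspect(entity_urn=old_urn, aspect_type=DatasetPropertiesClass)

    # 2. Re-emit it against the new URN.
    if props is not None:
        DatahubRestEmitter(gms_server=server).emit(
            MetadataChangeProposalWrapper(entityUrn=new_urn, aspect=props)
        )
```

Other manually edited aspects (tags, glossary terms, ownership) would need the same read/re-emit treatment, one aspect type at a time.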
g
@elegant-salesmen-99143, I believe the key question is whether DataHub can know that somebody renamed the table using the ALTER TABLE command. I don't think DataHub processes database logs, so no "automated" remapping/renaming seems feasible from my point of view. You can do it by performing updates as suggested by @astonishing-answer-96712 above, but it wouldn't be automatic; as Paul already mentioned, you need to know which original dataset (URN) was renamed to which new name and perform the updates accordingly.