glamorous-spring-97970
05/16/2023, 1:02 PM
ERROR {datahub.entrypoints:195} - Command failed: Failed to find a registered source for type redshift: 'str' object is not callable
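This failure usually means the redshift source plugin itself could not load, rather than anything in the recipe. Assuming a standard CLI setup, `datahub check plugins` lists which sources are actually registered, and `pip install 'acryl-datahub[redshift]'` installs the redshift extra into the same environment.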
glamorous-spring-97970
05/16/2023, 1:03 PM
hallowed-kilobyte-916
05/16/2023, 1:44 PM
from datahub.ingestion.run.pipeline import Pipeline
Now I want to ingest the data dictionaries of the various metadata ingested. I see the option to do this via the DataHub interface, but I can't find any documentation for doing this programmatically. Has anyone done this in the past? Any suggestions?
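One way to do programmatically what the UI documentation editor does, sketched with the DataHub Python emitter. The server URL, platform, and dataset name are placeholders, and `EditableDatasetPropertiesClass` is chosen on the assumption that "data dictionary" here means dataset descriptions:
```
from datahub.emitter.mce_builder import make_dataset_urn
from datahub.emitter.mcp import MetadataChangeProposalWrapper
from datahub.emitter.rest_emitter import DatahubRestEmitter
from datahub.metadata.schema_classes import EditableDatasetPropertiesClass

# Placeholder GMS endpoint and dataset coordinates; adjust to your deployment.
emitter = DatahubRestEmitter(gms_server="http://localhost:8080")
urn = make_dataset_urn(platform="redshift", name="dev.public.orders", env="PROD")

# Attach a description to the dataset, the same aspect the UI edits.
emitter.emit(
    MetadataChangeProposalWrapper(
        entityUrn=urn,
        aspect=EditableDatasetPropertiesClass(
            description="Orders fact table; one row per order."
        ),
    )
)
```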
quiet-television-68466
05/16/2023, 4:13 PM
With datahub.capture_ownership_info = false, the owners of Airflow pipelines are removed on each DAG run.
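Assuming the standard datahub-airflow-plugin setup, that flag lives in the [datahub] section of airflow.cfg (or the equivalent AIRFLOW__DATAHUB__CAPTURE_OWNERSHIP_INFO environment variable), so it is worth double-checking which value the scheduler and workers actually see before each DAG run.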
bitter-evening-61050
05/17/2023, 11:05 AM
AttributeError: 'function' object has no attribute 'run'
Can anyone please help me with this?
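The failing code is not shown, so this is only a guess, but one common way to get exactly this error with the Pipeline API mentioned above is assigning a function instead of its result; `build_pipeline` below is a hypothetical helper, for illustration only:
```
from datahub.ingestion.run.pipeline import Pipeline

def build_pipeline():  # hypothetical helper, for illustration only
    return Pipeline.create(
        {
            "source": {"type": "redshift", "config": {}},  # placeholder recipe
            "sink": {"type": "datahub-rest", "config": {"server": "http://localhost:8080"}},
        }
    )

# Bug: the () is missing, so `pipeline` is the function itself, not a Pipeline.
pipeline = build_pipeline
# pipeline.run()  raises AttributeError: 'function' object has no attribute 'run'

# Fix: call the function, then run the Pipeline it returns.
pipeline = build_pipeline()
pipeline.run()
pipeline.raise_from_status()
```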
wonderful-quill-11255
05/17/2023, 1:06 PM
tall-caravan-42586
05/17/2023, 5:11 PM
bland-orange-13353
05/17/2023, 7:48 PM
little-refrigerator-78584
05/17/2023, 8:10 PM
Capture data lineage
The AWS Glue job ran successfully, but in the DataHub UI the lineage was not created properly (as described in the blog).
My source and target are not visible in the lineage.
Does anyone have any idea about this issue?
DataHub version: v0.10.2
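Lineage from a Spark-based Glue job only shows up if the DataHub Spark agent is attached to the job itself. A minimal sketch of that wiring in PySpark, where the jar version and GMS address are placeholders to adapt:
```
from pyspark.sql import SparkSession

# Attach DataHub's Spark listener so the job's reads and writes are
# reported as lineage. Jar version and server address are placeholders.
spark = (
    SparkSession.builder.appName("glue-lineage-check")
    .config("spark.jars.packages", "io.acryl:datahub-spark-lineage:0.10.2")
    .config("spark.extraListeners", "datahub.spark.DatahubSparkListener")
    .config("spark.datahub.rest.server", "http://<gms-host>:8080")
    .getOrCreate()
)
```
If the listener never fires, no lineage edges are emitted at all, which would explain the missing source and target.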
colossal-waitress-83487
05/18/2023, 3:00 AM
billions-baker-82097
05/18/2023, 10:30 AM
limited-forest-73733
05/18/2023, 11:01 AM
agreeable-cricket-61480
05/18/2023, 4:04 PM
agreeable-cricket-61480
05/18/2023, 4:05 PM
numerous-address-22061
05/18/2023, 5:04 PM
pipeline_name: snowflake-lineage-ingestion
source:
  type: snowflake
  config:
    # Setting this to true is recommended so that all historical lineage is ingested
    ignore_start_time_lineage: false
    # Coordinates
    account_id: ${SNOWFLAKE_ACCOUNT_ID}
    warehouse: ${SNOWFLAKE_WAREHOUSE}
    # Credentials
    username: ${SNOWFLAKE_USERNAME}
    password: ${SNOWFLAKE_PASSWORD}
    role: ${SNOWFLAKE_ROLE}
    # This ingestion is just for lineage
    include_view_lineage: true
    include_table_lineage: true
    include_usage_stats: true
    include_column_lineage: true
    stateful_ingestion:
      enabled: true
    profiling:
      # Profiling is disabled for this lineage-only run
      enabled: false
sink:
  type: "datahub-rest"
  config:
    server: ${DATAHUB_GMS_ENDPOINT}
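Assuming the ${...} variables are exported in the environment, a recipe like this is normally run with `datahub ingest -c <recipe>.yml`, or passed as a dict to `Pipeline.create` as discussed earlier in the thread.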
miniature-hair-20451
05/18/2023, 5:51 PM
bland-orange-13353
05/18/2023, 7:47 PM
orange-gpu-90973
05/19/2023, 5:42 AM
billions-baker-82097
05/19/2023, 10:46 AM
limited-forest-73733
05/19/2023, 12:44 PM
hallowed-lock-74921
05/19/2023, 12:47 PM
limited-forest-73733
05/19/2023, 1:31 PM
most-byte-90620
05/19/2023, 11:56 PM
```
[2023-05-19 23:52:03,881] DEBUG {datahub.ingestion.run.pipeline:199} - Source type:tableau,<class 'datahub.ingestion.source.tableau.TableauSource'> configured
[2023-05-19 23:52:03,881] INFO {datahub.ingestion.run.pipeline:200} - Source configured successfully.
[2023-05-19 23:52:03,882] INFO {datahub.cli.ingest_cli:129} - Starting metadata ingestion
[2023-05-19 23:52:03,885] DEBUG {datahub.ingestion.source.tableau:364} - Query workbooksConnection to get 10 objects with offset 0
[2023-05-19 23:52:03,885] INFO {tableau.endpoint.metadata:61} - Querying Metadata API
[2023-05-19 23:52:04,041] INFO {tableau.endpoint.auth:66} - Signed out
[2023-05-19 23:52:04,042] INFO {datahub.ingestion.reporting.file_reporter:54} - Wrote SUCCESS report successfully to <_io.TextIOWrapper name='/tmp/datahub/ingest/9906dc8e-3870-45b0-8c86-d08b5360adf7/ingestion_report.json' mode='w' encoding='UTF-8'>
[2023-05-19 23:52:04,042] INFO {datahub.cli.ingest_cli:150} - Finished metadata ingestion
[2023-05-19 23:52:04,042] DEBUG {datahub.telemetry.telemetry:243} - Sending Telemetry

Cli report:
{'cli_version': '0.9.1',
 'cli_entry_location': '/tmp/datahub/ingest/venv-tableau-0.9.1/lib/python3.10/site-packages/datahub/__init__.py',
 'py_version': '3.10.7 (main, Oct 5 2022, 14:33:54) [GCC 10.2.1 20210110]',
 'py_exec_path': '/tmp/datahub/ingest/venv-tableau-0.9.1/bin/python3',
 'os_details': 'Linux-5.10.147-133.644.amzn2.x86_64-x86_64-with-glibc2.31',
 'mem_info': '64.43 MB'}
Source (tableau) report:
{'events_produced': '0',
 'events_produced_per_sec': '0',
 'event_ids': [],
 'warnings': {},
 'failures': {},
 'soft_deleted_stale_entities': [],
 'start_time': '2023-05-19 23:52:03.500064 (now).',
 'running_time': '0.65 seconds'}
Sink (datahub-rest) report:
{'total_records_written': '0',
 'records_written_per_second': '0',
 'warnings': [],
 'failures': [],
 'start_time': '2023-05-19 23:52:03.380872 (now).',
 'current_time': '2023-05-19 23:52:04.149734 (now).',
 'total_duration_in_seconds': '0.77',
 'gms_version': 'v0.9.1',
 'pending_requests': '0'}

 Pipeline finished successfully; produced 0 events in 0.65 seconds.
```
prehistoric-farmer-31305
05/20/2023, 3:23 AM
datahub delete --entity_type dataset --env prod --hard.
I also recreated the pods, but that did not help either.
hundreds-airline-29192
05/20/2023, 9:02 AM
freezing-account-90733
05/22/2023, 4:11 AM
agreeable-cricket-61480
05/22/2023, 7:24 AM
creamy-caravan-15387
05/22/2023, 9:36 AM
hundreds-airline-29192
05/22/2023, 9:57 AM
hundreds-airline-29192
05/22/2023, 10:00 AM