eager-florist-67924
03/21/2022, 10:26 AM
private static void createTracingDomain() throws IOException, ExecutionException, InterruptedException {
    MetadataChangeProposalWrapper mcpw = MetadataChangeProposalWrapper.builder()
        .entityType("domain")
        .entityUrn("urn:li:domain:tracing")
        .upsert()
        .aspect(new DomainProperties()
            .setName("Tracing domain")
            .setDescription("Domain for tracing"))
        .build();
    emit(mcpw);
}
but it gets a 500 error:
Caused by: java.lang.IllegalArgumentException: Failed to find entity with name domain in EntityRegistry
Checking the snapshot directory of the given release in the repo, I am not able to see any domain entity: https://github.com/datahub-project/datahub/tree/v0.8.23/metadata-models/src/main/pegasus/com/linkedin/metadata/snapshot
So how can I create a domain entity? Could you please provide an example? Thanks!
hundreds-pillow-5032
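The IllegalArgumentException suggests the v0.8.23 server's entity registry simply predates the domain entity, so the client code above is likely fine and a newer server release is what's needed. For illustration, here is a stdlib-only Python sketch of the kind of generic ingest-proposal payload an emitter sends for this aspect; the field names and payload shape are assumptions based on DataHub's MetadataChangeProposal model, not something confirmed in this thread:

```python
import json

# Hypothetical sketch of a generic aspect-ingestion payload for the domain
# above. Field names and structure are assumptions modeled on DataHub's
# MetadataChangeProposal, not verified against any specific release.
def build_domain_proposal(domain_id: str, name: str, description: str) -> dict:
    # The aspect itself travels as a JSON string inside a generic envelope.
    aspect_value = json.dumps({"name": name, "description": description})
    return {
        "proposal": {
            "entityType": "domain",
            "entityUrn": f"urn:li:domain:{domain_id}",
            "changeType": "UPSERT",
            "aspectName": "domainProperties",
            "aspect": {
                "contentType": "application/json",
                "value": aspect_value,
            },
        }
    }

payload = build_domain_proposal("tracing", "Tracing domain", "Domain for tracing")
```

However the proposal is built, it can only succeed once the server's entity registry actually knows the domain entity.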
03/21/2022, 2:50 PM
handsome-football-66174
03/21/2022, 3:33 PM
File "/root/.venvs/airflow/lib/python3.8/site-packages/datahub/ingestion/source/sql/sql_common.py", line 62, in get_platform_from_sqlalchemy_uri
if sqlalchemy_uri.startswith("bigquery"):
AttributeError: 'NoneType' object has no attribute 'startswith'
[2022-03-17 21:03:07,554] {taskinstance.py:1525} INFO - Marking task as FAILED. dag_id=metadata_ingestion_dag, task_id=ingest_metadata, execution_date=20220317T205757, start_date=20220317T210304, end_date=20220317T210307
[2022-03-17 21:03:07,623] {local_task_job.py:146} INFO - Task exited with return code 1
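The AttributeError means the connection handed to the lineage code has no SQLAlchemy URI at all (it is None), so the `startswith` call fails before platform detection even starts. A None-safe sketch of the lookup; the real function lives in `datahub.ingestion.source.sql.sql_common`, and the `"external"` fallback used here is an assumption for illustration:

```python
from typing import Optional

# Illustrative None-safe variant of the platform lookup that fails in the
# traceback above. Not DataHub's actual code; the fallback value is assumed.
def get_platform_from_sqlalchemy_uri(sqlalchemy_uri: Optional[str]) -> str:
    if not sqlalchemy_uri:
        # A None URI usually means the Airflow connection has no host/URI configured.
        return "external"
    if sqlalchemy_uri.startswith("bigquery"):
        return "bigquery"
    if sqlalchemy_uri.startswith("mysql"):
        return "mysql"
    return "external"
```

With a guard like this the ingestion task degrades to a generic platform instead of crashing; the underlying fix is still to give the Airflow connection a proper URI.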
mysterious-australia-30101
03/21/2022, 3:52 PM
worried-branch-76677
03/22/2022, 1:55 PM
dataHubPolicy?
Things like make_dataset_urn?
I am trying to create an MCP to ingest new policies via Python.
astonishing-byte-5433
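`make_dataset_urn` is a helper from `datahub.emitter.mce_builder`. As a reference for building MCPs by hand, here is a stdlib-only sketch of the URN string it produces, following DataHub's dataset URN convention:

```python
# Minimal re-implementation of the URN format produced by
# datahub.emitter.mce_builder.make_dataset_urn, for illustration only.
# Convention: urn:li:dataset:(urn:li:dataPlatform:<platform>,<name>,<env>)
def make_dataset_urn(platform: str, name: str, env: str = "PROD") -> str:
    return f"urn:li:dataset:(urn:li:dataPlatform:{platform},{name},{env})"

print(make_dataset_urn("snowflake", "db.schema.table"))
# urn:li:dataset:(urn:li:dataPlatform:snowflake,db.schema.table,PROD)
```

In real code, prefer the library helper so URN escaping and validation stay consistent with the server.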
03/22/2022, 5:26 PM
Cannot locate a 64-bit Oracle Client library: "libclntsh.so: cannot open shared object file: No such file or directory"
Thanks for your help
rich-policeman-92383
03/23/2022, 10:44 AM
brave-secretary-27487
03/23/2022, 12:11 PM
calm-sunset-28996
03/23/2022, 1:18 PM
mysterious-lamp-91034
03/23/2022, 7:02 PM
adorable-flower-19656
03/24/2022, 1:18 AM
mysterious-nail-70388
03/24/2022, 3:05 AM
stocky-noon-61140
03/24/2022, 9:50 AM
important-machine-62199
03/24/2022, 12:00 PM
mysterious-portugal-30527
03/24/2022, 5:32 PM
File "/tmp/datahub/ingest/venv-61422838-a72c-4c58-991f-380cbc29cafc/lib/python3.9/site-packages/datahub/ingestion/api/registry.py", line 132, in get
 115 def get(self, key: str) -> Type[T]:
 (...)
 128 raise ConfigurationError(
 129     f"{key} is disabled; try running: pip install '{__package_name__}[{key}]'"
 130 ) from tp
 131 elif isinstance(tp, Exception):
--> 132 raise ConfigurationError(
 133     f"{key} is disabled due to an error in initialization"
Thoughts??
modern-artist-55754
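The traceback shows DataHub's plugin registry storing an import-time failure and only surfacing it when the plugin is actually requested. A simplified sketch of that lazy-registry pattern (illustrative names only, not DataHub's exact classes):

```python
from typing import Dict, Union

class ConfigurationError(Exception):
    pass

# Simplified sketch of the registry pattern in the traceback above: plugins
# that failed to import are stored as exceptions, and the error is raised
# only on lookup. Illustrative only; not DataHub's actual registry code.
class PluginRegistry:
    def __init__(self) -> None:
        self._mapping: Dict[str, Union[type, Exception]] = {}

    def register(self, key: str, plugin: type) -> None:
        self._mapping[key] = plugin

    def register_disabled(self, key: str, err: Exception) -> None:
        # Called when importing the plugin's dependencies failed.
        self._mapping[key] = err

    def get(self, key: str) -> type:
        tp = self._mapping.get(key)
        if tp is None:
            raise ConfigurationError(f"{key} is not registered")
        if isinstance(tp, Exception):
            # This is the branch that produces
            # "<key> is disabled due to an error in initialization" above.
            raise ConfigurationError(
                f"{key} is disabled due to an error in initialization"
            ) from tp
        return tp
```

In practice the fix is usually to install the missing extra (the message at line 129 suggests `pip install 'acryl-datahub[<key>]'`) and to inspect the chained exception for the real import error.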
03/25/2022, 5:03 AM
kind-teacher-18789
03/25/2022, 9:20 AM
few-grass-66826
03/25/2022, 10:26 AM
colossal-alligator-29986
03/25/2022, 3:53 PM
[2022-03-25 15:09:59,922] INFO {datahub.cli.ingest_cli:91} - Starting metadata ingestion
[2022-03-25 15:09:59,922] INFO {datahub.ingestion.source.sql.bigquery:276} - Populating lineage info via GCP audit logs
[2022-03-25 15:09:59,928] INFO {datahub.ingestion.source.sql.bigquery:369} - Start loading log entries from BigQuery start_time=2022-03-23T23:45:00Z and end_time=2022-03-26T00:15:00Z
[2022-03-25 15:19:32,800] INFO {datahub.ingestion.source.sql.bigquery:380} - Finished loading 12047 log entries from BigQuery so far
[2022-03-25 15:19:32,800] INFO {datahub.ingestion.source.sql.bigquery:462} - Parsing BigQuery log entries: number of log entries successfully parsed=12047
[2022-03-25 15:19:32,800] INFO {datahub.ingestion.source.sql.bigquery:513} - Creating lineage map: total number of entries=12047, number skipped=1.
[2022-03-25 15:19:32,800] INFO {datahub.ingestion.source.sql.bigquery:270} - Built lineage map containing 12015 entries.
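Note that the query window in the log (start_time=2022-03-23T23:45:00Z, end_time=2022-03-26T00:15:00Z) extends 15 minutes beyond whole-day boundaries. A sketch of how such a padded audit-log window can be derived from a day range plus a safety buffer; the 15-minute buffer here is an assumption for illustration, not a documented default:

```python
from datetime import datetime, timedelta, timezone

# Sketch: pad a day-aligned lineage window with a buffer on each side so that
# audit-log entries near the boundaries are not missed. Buffer size assumed.
def audit_log_window(start_day: datetime, end_day: datetime,
                     buffer: timedelta = timedelta(minutes=15)):
    return start_day - buffer, end_day + buffer

start, end = audit_log_window(
    datetime(2022, 3, 24, tzinfo=timezone.utc),
    datetime(2022, 3, 26, tzinfo=timezone.utc),
)
print(start.isoformat(), end.isoformat())
# 2022-03-23T23:45:00+00:00 2022-03-26T00:15:00+00:00
```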
polite-application-51650
03/28/2022, 7:05 AM
spark = SparkSession.builder()
    .appName("test-application")
    .config("spark.master", "spark://spark-master:7077")
    .config("spark.jars.packages", "io.acryl:datahub-spark-lineage:0.8.23")
    .config("spark.extraListeners", "datahub.spark.DatahubSparkListener")
    .config("spark.datahub.rest.server", "http://localhost:8080")
    .enableHiveSupport()
    .getOrCreate();
This is what I set up in my Spark config file.
hallowed-analyst-96384
03/28/2022, 10:07 AMstocky-midnight-78204
03/28/2022, 10:27 AMshy-fireman-88724
03/28/2022, 3:09 PMbitter-toddler-42943
03/29/2022, 2:15 AM
ERROR: Could not find a version that satisfies the requirement acryl-datahub[datahub-rest,mssql]==0.8.26.6 (from versions: none)
ERROR: No matching distribution found for acryl-datahub[datahub-rest,mssql]==0.8.26.6
cold-hydrogen-10513
03/29/2022, 11:52 AM
I'm on 0.8.31 and created a recipe:
source:
  type: snowflake
  config:
    host_port: nonprodcompanyname.us-east-1.snowflakecomputing.com
    warehouse: COMPANY_NAME_NON_PROD_VWH
    username: '${snowflake-user}'
    password: '${snowflake-pass}'
sink:
  type: datahub-rest
  config:
    server: 'http://datahub-gms.datahub.svc.cluster.local:8080/api/gms'
and I added it to the ingestion UI. When I execute it I get the following:
ConfigurationError: Unable to connect to http://datahub-gms.datahub.svc.cluster.local:8080/api/gms/config with status_code: 404. Please check your configuration and make sure you are talking to the DataHub GMS (usually <datahub-gms-host>:8080) or Frontend GMS API (usually <frontend>:9002/api/gms).
Could you please tell me what I can check here?
mysterious-australia-30101
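The error message itself points at the likely fix: the /api/gms path prefix belongs to the frontend proxy on port 9002, while the GMS service on port 8080 is addressed without it. A sketch of the adjusted sink, reusing the hostname from the recipe above:

```yaml
sink:
  type: datahub-rest
  config:
    # Talk to GMS directly on 8080, without the /api/gms suffix
    # (that suffix is only used via the frontend on :9002).
    server: 'http://datahub-gms.datahub.svc.cluster.local:8080'
```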
03/29/2022, 12:50 PM
fresh-memory-10355
03/29/2022, 4:15 PM
purple-ghost-64569
03/29/2022, 9:03 PM
better-orange-49102
03/30/2022, 7:25 AM
astonishing-plumber-56128
03/30/2022, 7:54 AM