numerous-application-54063
05/19/2022, 4:41 PM
gifted-bird-57147
05/21/2022, 1:33 PM
graph.get_aspect_v2(entity_urn=ds_urn, aspect_type=SchemaMetadataClass, aspect='schemaMetadata')
throws:
ValueError: com.linkedin.pegasus2avro.schema.Schemaless contains extra fields: {'com.linkedin.schema.MySqlDDL'}
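For context on the failing call above, here is a hedged sketch of how a `ds_urn` like the one passed to `get_aspect_v2` is typically built. The DataHub SDK ships `datahub.emitter.mce_builder.make_dataset_urn` with this shape; it is inlined below so the snippet runs standalone, and the platform/table names are placeholders, not taken from this thread.

```python
# Hedged sketch: inlined equivalent of datahub.emitter.mce_builder.make_dataset_urn,
# showing the urn format expected by get_aspect_v2's entity_urn argument.
def make_dataset_urn(platform: str, name: str, env: str = "PROD") -> str:
    return f"urn:li:dataset:(urn:li:dataPlatform:{platform},{name},{env})"

# Placeholder platform/table values:
ds_urn = make_dataset_urn("mysql", "mydb.mytable")
print(ds_urn)  # urn:li:dataset:(urn:li:dataPlatform:mysql,mydb.mytable,PROD)
```

The ValueError itself comes from the client failing to deserialize the schema union variant the server returns, so it is a version-compatibility symptom rather than a bad urn.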
astonishing-dusk-99990
05/23/2022, 4:21 AM
chilly-elephant-51826
05/23/2022, 5:51 AM
Source (superset) report:
{'workunits_produced': 0,
 'workunit_ids': [],
 'warnings': {},
 'failures': {},
 'cli_version': '0.8.34.1',
 'cli_entry_location': '/tmp/datahub/ingest/venv-79a6e9d4-1370-4f11-a3cf-3b1ad0466ef9/lib/python3.9/site-packages/datahub/__init__.py',
 'py_version': '3.9.9 (main, Dec 21 2021, 10:03:34) \n[GCC 10.2.1 20210110]',
 'py_exec_path': '/tmp/datahub/ingest/venv-79a6e9d4-1370-4f11-a3cf-3b1ad0466ef9/bin/python3',
 'os_details': 'Linux-5.10.109-104.*****'}
Sink (datahub-rest) report:
{'records_written': 0,
 'warnings': [],
 'failures': [],
 'downstream_start_time': None,
 'downstream_end_time': None,
 'downstream_total_latency_in_seconds': None,
 'gms_version': 'v0.8.33'}

Pipeline finished successfully
2022-05-23 05:30:56.587754 [exec_id=79a6e9d4-1370-4f11-a3cf-3b1ad0466ef9] INFO: Successfully executed 'datahub ingest'
Execution finished successfully!
quick-pizza-8906
05/23/2022, 10:02 AM
bumpy-activity-74405
05/23/2022, 11:01 AM
bumpy-activity-74405
05/23/2022, 11:31 AM
acceptable-judge-21659
05/23/2022, 1:03 PM
flaky-market-12551
05/24/2022, 1:56 AM
datahub docker quickstart
I am getting this error 😞
ERROR: for mysql Cannot start service mysql: b'Mounts denied: sxfs/#namespaces for more info.
.
y7f7690s324xf7k2_1ypvm0000gn/mysql/init.sql
is not shared from OS X and is not known to Docker.
You can configure shared paths from Docker -> Preferences... -> File Sharing.
See <https://docs.docker.com/docker-for-mac/o>'
ERROR: Encountered errors while bringing up the project.
Btw, also attached the logs as well.
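A hedged workaround sketch for the "Mounts denied" error above: Docker Desktop on macOS only bind-mounts paths listed under Preferences -> Resources -> File Sharing (/Users is shared by default), and the denied init.sql path sits under the per-user temp tree, which is not shared. Either add that tree in File Sharing, or work from a shared location; the directory name below is hypothetical.

```shell
# Hedged workaround sketch, assuming the quickstart files can live under $HOME,
# which Docker Desktop shares by default. The directory name is hypothetical.
mkdir -p "$HOME/datahub-quickstart"
cd "$HOME/datahub-quickstart"
# datahub docker quickstart   # re-run here, or add the denied temp path in
#                             # Docker -> Preferences -> Resources -> File Sharing
```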
Binary
tmpw99ops7j.log
high-family-71209
05/24/2022, 6:48 AM
echoing-farmer-38304
05/24/2022, 6:54 AM
INFO {datahub.cli.ingest_cli:130} - Finished metadata pipeline
Source (powerbireportserver.report_server.PowerBiReportServerDashboardSource) report:
{'workunits_produced': 560,
'workunit_ids': ['powerbi-urn:li:corpuser:a-a.user-corpUserInfo',
'powerbi-urn:li:corpuser:a-a.user-status',
'powerbi-urn:li:corpuser:a-a.user-corpUserKey',
'powerbi-urn:li:dashboard:(powerbi,reports.38437a3f-9818-43e4-ad0f-be0b4aa2868d)-browsePaths',
'powerbi-urn:li:dashboard:(powerbi,reports.38437a3f-9818-43e4-ad0f-be0b4aa2868d)-dashboardInfo',
'powerbi-urn:li:dashboard:(powerbi,reports.38437a3f-9818-43e4-ad0f-be0b4aa2868d)-status',
'powerbi-urn:li:dashboard:(powerbi,reports.38437a3f-9818-43e4-ad0f-be0b4aa2868d)-dashboardKey',
'powerbi-urn:li:dashboard:(powerbi,reports.38437a3f-9818-43e4-ad0f-be0b4aa2868d)-ownership',
...
'warnings': {},
'failures': {},
'cli_version': '0.8.34.2',
'scanned_report': 70,
'filtered_reports': []}
Sink (datahub-rest) report:
{'records_written': 0,
'warnings': [],
'failures': [],
'downstream_start_time': None,
'downstream_end_time': None,
'downstream_total_latency_in_seconds': None,
'gms_version': 'v0.8.34'}
Pipeline finished successfully
But I am getting this error:
ERROR {datahub.ingestion.run.pipeline:229} - Failed to extract some records due to: source produced an invalid metadata work unit: MetadataChangeProposalWrapper(
entityType="dashboard",
changeType="UPSERT",
entityUrn="urn:li:dashboard:(powerbi,reports.8371ebd6-0385-4a70-a286-6ac20ee69f74)",
entityKeyAspect=None,
auditHeader=None,
aspectName="dashboardInfo",
aspect=DashboardInfoClass(
{
"customProperties": {
"chartCount": 0,
"workspaceName": "PowerBI Report Server",
"workspaceId": "8371ebd6-0385-4a70-a286-6ac20ee69f74",
},
"externalUrl": None,
"title": "Staff_Tea",
"description": "",
"charts": [],
"lastModified": ChangeAuditStampsClass(
{
"created": AuditStampClass(
{
"time": 0,
"actor": "urn:li:corpuser:unknown",
"impersonator": None,
}
),
"lastModified": AuditStampClass(
{
"time": 0,
"actor": "urn:li:corpuser:unknown",
"impersonator": None,
}
),
"deleted": None,
}
),
"dashboardUrl": "<myurl>",
"access": None,
"lastRefreshed": None,
}
),
systemMetadata=SystemMetadataClass(
{
"lastObserved": 1653374365648,
"runId": "powerbireportserver.report_server.PowerBiReportServerDashboardSource-2022_05_24-09_36_35",
"registryName": None,
"registryVersion": None,
"properties": None,
}
),
)
able-rain-74449
05/24/2022, 9:00 AM
calm-waitress-61333
05/24/2022, 5:52 PM
calm-waitress-61333
05/24/2022, 5:53 PM
calm-waitress-61333
05/24/2022, 5:53 PM
volumeMounts:
- mountPath: /tmp/jks/
name: connect-devn2-trust-jks
volumes:
- configMap:
defaultMode: 420
items:
- key: connect-devn2-trust.jks
path: connect-devn2-trust.jks
name: connect-devn2-trust-jks
name: connect-devn2-trust-jks
- name: SPRING_KAFKA_PROPERTIES_SSL_TRUSTSTORE_LOCATION
value: /tmp/jks/connect-devn2-trust.jks
hostAliases:
- hostnames:
- connect-devn2
ip: 10.18.0.16
calm-waitress-61333
05/24/2022, 5:53 PM
calm-waitress-61333
05/24/2022, 5:53 PM
calm-waitress-61333
05/24/2022, 5:53 PM
ConnectionError: HTTPSConnectionPool(host='connect-devn2', port=8083): Max retries exceeded with url: / (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7fe81fa8e490>: Failed to establish a new connection: [Errno 8] nodename nor servname provided, or not known'))
calm-waitress-61333
05/24/2022, 5:54 PM
calm-waitress-61333
05/24/2022, 5:54 PM
calm-waitress-61333
05/24/2022, 5:54 PM
calm-waitress-61333
05/24/2022, 5:55 PM
$ k get po
NAME READY STATUS RESTARTS AGE
datahub-acryl-datahub-actions-f94557c78-pw6s9 1/1 Running 0 11m
datahub-datahub-frontend-6548f8bb45-9sbtk 1/1 Running 0 6m31s
datahub-datahub-gms-5c5c5d4f5b-rkhf4 1/1 Running 0 17m
datahub-datahub-upgrade-job--1-p7z6p 0/1 Completed 0 3h31m
datahub-elasticsearch-setup-job--1-r57m2 0/1 Completed 0 3h34m
datahub-kafka-setup-job--1-wtqn7 0/1 Completed 0 3h34m
datahub-mysql-setup-job--1-ht4lx 0/1 Completed 0 3h31m
elasticsearch-master-0 1/1 Running 0 19d
elasticsearch-master-1 1/1 Running 0 19d
elasticsearch-master-2 1/1 Running 0 19d
prerequisites-cp-schema-registry-cf79bfccf-5pk27 2/2 Running 6 (19d ago) 19d
prerequisites-kafka-0 1/1 Running 4 (19d ago) 19d
prerequisites-mysql-0 1/1 Running 0 19d
prerequisites-neo4j-community-0 1/1 Running 0 19d
prerequisites-zookeeper-0 1/1 Running 0 19d
calm-waitress-61333
05/24/2022, 5:59 PM
fresh-garage-83780
05/24/2022, 7:03 PM
security.protocol is SASL_SSL, but the docs show it as just SASL.
springKafkaConfigurationOverrides:
security.protocol: SASL_SSL
Still investigating that, as SASL_SSL causes issues for acryl-datahub-actions. Will report back
KafkaException: KafkaError{code=_INVALID_ARG,val=-186,str="Failed to create consumer: No provider for SASL mechanism GSSAPI: recompile librdkafka with libsasl2 or openssl support. Current build options: PLAIN SASL_SCRAM OAUTHBEARER"}
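The KafkaError above says the bundled librdkafka was compiled without GSSAPI support, so the consumer must be configured with one of the mechanisms the build does provide (PLAIN, SCRAM, OAUTHBEARER). A hedged sketch of such a config; every value below is a placeholder, not taken from this deployment:

```python
# Hedged sketch: pick a SASL mechanism the error message lists as compiled in
# (PLAIN / SASL_SCRAM / OAUTHBEARER) instead of the default GSSAPI.
# All broker addresses and credentials are placeholders.
consumer_conf = {
    "bootstrap.servers": "broker:9092",   # placeholder broker
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "PLAIN",            # a mechanism this build does provide
    "sasl.username": "user",              # placeholder
    "sasl.password": "secret",            # placeholder
    "group.id": "datahub-actions",
}
# confluent_kafka.Consumer(consumer_conf) would then construct without the
# "No provider for SASL mechanism GSSAPI" failure, given valid credentials.
```

In the Helm chart, the equivalent would go under springKafkaConfigurationOverrides as `sasl.mechanism`, mirroring the `security.protocol` override shown above.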
numerous-account-62719
05/25/2022, 4:41 AM
shy-ability-24875
05/25/2022, 7:50 AM
adamant-furniture-37835
05/25/2022, 1:58 PM
kind-dawn-17532
05/25/2022, 7:33 PM
kind-dawn-17532
05/25/2022, 7:33 PM
kind-dawn-17532
05/25/2022, 7:34 PM