better-spoon-77762
12/23/2021, 8:52 AM
able-yacht-17327
12/26/2021, 7:31 AM
future-petabyte-5942
12/27/2021, 6:30 AM
curved-magazine-23582
12/27/2021, 7:57 PM
marketplace.jdbc connection type? 🤔
  File "/usr/local/lib/python3.7/site-packages/datahub/entrypoints.py", line 93, in main
    sys.exit(datahub(standalone_mode=False, **kwargs))
  File "/usr/local/lib/python3.7/site-packages/click/core.py", line 1128, in __call__
    return self.main(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/click/core.py", line 1053, in main
    rv = self.invoke(ctx)
  File "/usr/local/lib/python3.7/site-packages/click/core.py", line 1659, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/usr/local/lib/python3.7/site-packages/click/core.py", line 1659, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/usr/local/lib/python3.7/site-packages/click/core.py", line 1395, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/local/lib/python3.7/site-packages/click/core.py", line 754, in invoke
    return __callback(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/datahub/cli/ingest_cli.py", line 73, in run
    pipeline.run()
  File "/usr/local/lib/python3.7/site-packages/datahub/ingestion/run/pipeline.py", line 149, in run
    self.source.get_workunits(), 10 if self.preview_mode else None
  File "/usr/local/lib/python3.7/site-packages/datahub/ingestion/source/aws/glue.py", line 521, in get_workunits
    dag, flow_urn, s3_formats
  File "/usr/local/lib/python3.7/site-packages/datahub/ingestion/source/aws/glue.py", line 309, in process_dataflow_graph
    node, flow_urn, new_dataset_ids, new_dataset_mces, s3_formats
  File "/usr/local/lib/python3.7/site-packages/datahub/ingestion/source/aws/glue.py", line 265, in process_dataflow_node
    raise ValueError(f"Unrecognized Glue data object type: {node_args}")
ValueError: Unrecognized Glue data object type: {'connection_type': 'marketplace.jdbc', 'connection_options': {'dbTable': 'contact', 'connectionName': 'prod-salesforce', 'filterPredicate': 'SystemModstamp'}, 'transformation_ctx': 'DataSource0'}
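For context, the failure comes from the Glue source's DAG-node handling: it only maps connection types it recognizes into dataset URNs, and a Marketplace connector node falls through to the error branch. A minimal sketch of that dispatch (simplified, not the actual DataHub glue.py; the set of known types is illustrative):

# Simplified sketch of why a node with connection_type "marketplace.jdbc"
# aborts ingestion: only known connection types get mapped to dataset URNs.
KNOWN_CONNECTION_TYPES = {"s3", "mysql", "postgresql", "redshift"}  # illustrative set

def process_dataflow_node(node_args: dict) -> str:
    connection_type = node_args.get("connection_type", "")
    if connection_type not in KNOWN_CONNECTION_TYPES:
        raise ValueError(f"Unrecognized Glue data object type: {node_args}")
    # The real code would build a platform-specific dataset URN here.
    return f"urn:li:dataset:(urn:li:dataPlatform:{connection_type},example,PROD)"

node = {
    "connection_type": "marketplace.jdbc",
    "connection_options": {"dbTable": "contact", "connectionName": "prod-salesforce"},
}
try:
    process_dataflow_node(node)
except ValueError as e:
    print(e)  # reproduces the error in the traceback above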
busy-dusk-4970
12/27/2021, 10:50 PM
<http://localhost:9002/authenticate?redirect_uri=%2F>
Attached are screenshots of the error and my keycloak settings.
My datahub-frontend-react env vars:
- DATAHUB_GMS_HOST=datahub-gms
- DATAHUB_GMS_PORT=8090
- DATAHUB_SECRET=YouKnowNothing
- DATAHUB_APP_VERSION=1.0
- DATAHUB_PLAY_MEM_BUFFER_SIZE=10MB
- JAVA_OPTS=-Xms512m -Xmx512m -Dhttp.port=9002 -Dconfig.file=datahub-frontend/conf/application.conf
-Djava.security.auth.login.config=datahub-frontend/conf/jaas.conf
-Dlogback.configurationFile=datahub-frontend/conf/logback.xml
-Dlogback.debug=false -Dpidfile.path=/dev/null
- KAFKA_BOOTSTRAP_SERVER=broker:29092
- DATAHUB_TRACKING_TOPIC=DataHubUsageEvent_v1
- ELASTIC_CLIENT_HOST=opensearch
- ELASTIC_CLIENT_PORT=9200
- KEYCLOAK_CLIENT_SECRET=e96ef414-de3e-460c-bd91-0866debfd8eb
- AUTH_OIDC_ENABLED=true
- AUTH_OIDC_CLIENT_ID=data-fabric-confidential
- AUTH_OIDC_CLIENT_SECRET=e96ef414-de3e-460c-bd91-0866debfd8eb
- AUTH_OIDC_DISCOVERY_URI=<http://localhost:8080/auth/realms/data-fabric-realm/.well-known/openid-configuration>
- AUTH_OIDC_BASE_URL=<http://localhost:9002>
- AUTH_OIDC_SCOPE="openid profile email groups"
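One hedged observation on the config above: AUTH_OIDC_DISCOVERY_URI points at localhost:8080, and inside the datahub-frontend container localhost is the container itself, not Keycloak. A small diagnostic sketch to check that the discovery document is reachable from where the frontend actually runs (e.g. run it inside the container via docker exec):

# Diagnostic sketch: fetch the OIDC discovery document the frontend would use.
# If this fails inside the datahub-frontend container but works on the host,
# AUTH_OIDC_DISCOVERY_URI should use the Keycloak container/service name
# instead of localhost.
import json
import urllib.request

DISCOVERY_URI = (
    "http://localhost:8080/auth/realms/data-fabric-realm/"
    ".well-known/openid-configuration"
)

with urllib.request.urlopen(DISCOVERY_URI, timeout=5) as resp:
    config = json.load(resp)
print(config["authorization_endpoint"])
print(config["token_endpoint"])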
better-spoon-77762
12/29/2021, 2:04 AM
ambitious-guitar-89068
12/29/2021, 11:32 AM
magnificent-park-81142
12/30/2021, 6:56 PM
astonishing-lamp-98820
12/30/2021, 8:37 PM
magnificent-park-81142
01/01/2022, 5:23 PM
rich-policeman-92383
01/03/2022, 1:47 PM
datahub_datahub-mae-consumer.1.ia85tvruu4y4@N4PBIL-DARD0887 | 13:29:30.498 [kafka-kerberos-refresh-thread-my_user@INDIA.TTT] WARN o.a.k.c.s.kerberos.KerberosLogin - [Principal=my_user@INDIA.TTT]: TGT renewal thread has been interrupted and will exit.
datahub_datahub-mae-consumer.1.ia85tvruu4y4@N4PBIL-DARD0887 | 13:29:30.499 [main] WARN o.s.b.w.s.c.AnnotationConfigServletWebServerApplicationContext - Exception encountered during context initialization - cancelling refresh attempt: org.springframework.context.ApplicationContextException: Failed to start bean 'org.springframework.kafka.config.internalKafkaListenerEndpointRegistry'; nested exception is org.apache.kafka.common.errors.TopicAuthorizationException: Not authorized to access topics: [DataHubUsageEvent_v1]
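The TopicAuthorizationException is the actionable part (the TGT warning is just the renewal thread exiting afterwards): the consumer's principal needs READ on DataHubUsageEvent_v1 and on its consumer group. A probe sketch to reproduce the check outside DataHub; the bootstrap address, security protocol, and keytab path are assumptions to adapt:

# Probe sketch: try to consume from the topic with the same Kerberos
# principal the MAE consumer uses. An authorization error here confirms
# missing topic/group ACLs rather than a DataHub problem.
from confluent_kafka import Consumer, KafkaException

consumer = Consumer({
    "bootstrap.servers": "broker:9092",            # assumed
    "group.id": "acl-probe",                       # needs group READ too
    "security.protocol": "SASL_PLAINTEXT",         # assumed; match the cluster
    "sasl.mechanism": "GSSAPI",
    "sasl.kerberos.principal": "my_user@INDIA.TTT",
    "sasl.kerberos.keytab": "/path/to/my_user.keytab",  # assumed path
})
consumer.subscribe(["DataHubUsageEvent_v1"])
msg = consumer.poll(10.0)
if msg is not None and msg.error():
    raise KafkaException(msg.error())
consumer.close()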
worried-terabyte-81786
01/03/2022, 7:12 PM
worried-terabyte-81786
01/03/2022, 7:15 PM
ValidationError: 2 validation errors for BusinessGlossaryConfig
url
none is not an allowed value (type=type_error.none.not_allowed)
nodes -> 0 -> terms -> 1 -> custom_properties
extra fields not permitted (type=value_error.extra)
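Both failures are plain pydantic v1 errors: the glossary file's top-level url is null or empty, and a term carries a custom_properties key that the term model (at least in that datahub version) doesn't declare, so extra fields are rejected. A minimal reproduction with simplified stand-in models, not the real DataHub classes:

# Stand-in models that reproduce the same two pydantic v1 errors.
from typing import List
from pydantic import BaseModel, Extra, ValidationError

class GlossaryTerm(BaseModel):
    name: str

    class Config:
        extra = Extra.forbid  # unknown keys like custom_properties are rejected

class GlossaryNode(BaseModel):
    terms: List[GlossaryTerm]

class BusinessGlossaryConfig(BaseModel):
    url: str  # a YAML `url:` with no value parses as null and fails here
    nodes: List[GlossaryNode]

try:
    BusinessGlossaryConfig(
        url=None,
        nodes=[{"terms": [{"name": "ok"}, {"name": "bad", "custom_properties": {}}]}],
    )
except ValidationError as e:
    print(e)  # "none is not an allowed value" + "extra fields not permitted"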
hallowed-breakfast-61855
01/04/2022, 3:20 AM
ambitious-guitar-89068
01/04/2022, 5:27 AM
KafkaException: KafkaError{code=MSG_SIZE_TOO_LARGE,val=10,str="Unable to produce message: Broker: Message size too large"}
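That error is the client-side librdkafka limit (message.max.bytes, roughly 1 MB by default) and/or the broker's limit; both ends have to allow the larger record. A sketch of raising the producer-side setting with confluent-kafka; the topic name and size are illustrative:

# Sketch: raise the producer's message size ceiling. The broker
# (message.max.bytes) and topic (max.message.bytes) limits must be
# raised as well, or the broker rejects the record instead.
from confluent_kafka import Producer

producer = Producer({
    "bootstrap.servers": "broker:29092",       # illustrative
    "message.max.bytes": 5 * 1024 * 1024,      # 5 MB instead of the ~1 MB default
})
producer.produce("MetadataChangeEvent_v4", value=b"...large payload...")
producer.flush()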
gentle-florist-49869
01/04/2022, 2:11 PM
millions-notebook-72121
01/04/2022, 3:26 PM
When I run kubectl describe on the pod I get:
Readiness probe failed: mysqladmin: [Warning] Using a password on the command line interface can be insecure.
mysqladmin: connect to server at 'localhost' failed
error: 'Access denied for user 'root'@'localhost' (using password: YES)'
Has anyone seen this before? I'm checking the Kubernetes cluster to see if anything has changed there, but it seems the same as before. Attaching some screenshots for reference. If I do a port-forward for the DB port and try accessing the DB with a client, I get the same error.
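A frequent cause of this pattern (an assumption here, not a diagnosis) is a persisted MySQL volume keeping an old root password while the chart's secret changed. A sketch for testing the secret's password directly against the port-forwarded DB; host, port, and the password value are placeholders:

# Sketch: verify the password Kubernetes injects actually works against
# the (port-forwarded) MySQL instance. Requires: pip install pymysql
import pymysql

conn = pymysql.connect(
    host="127.0.0.1",                      # port-forwarded locally, placeholder
    port=3306,
    user="root",
    password="value-from-k8s-secret",      # placeholder: kubectl get secret ...
)
with conn.cursor() as cur:
    cur.execute("SELECT 1")
    print(cur.fetchone())
conn.close()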
gentle-nest-904
01/05/2022, 9:47 AM
gentle-nest-904
01/05/2022, 9:55 AM
busy-dusk-4970
01/05/2022, 6:17 PM
docker/dev.sh
for docker_datahub-frontend-react_1 Cannot start service datahub-frontend-react: OCI runtime create failed: container_linux.go:380: starting container process caused: exec: "datahub-frontend/bin/playBinary": stat datahub-frontend/bin/playBinary: no such file or directory: unknown
Any ideas how I can fix this? Tnx 🙏
lemon-hydrogen-83671
01/05/2022, 6:45 PM
nutritious-bird-77396
01/05/2022, 8:43 PM
• ssl.truststore.location
• security.protocol
• sasl.mechanism
• sasl.jaas.config
• sasl.client.callback.handler.class
Setting these properties (e.g. KAFKA_PROPERTIES_SSL_TRUSTSTORE_LOCATION) did not work.
.....
Any help on this would be great....
some-crayon-90964
01/06/2022, 6:00 PM
The build failed when we ran ./gradlew build. However, when we ran this specific test (./gradlew :metadata-integration:java:spark-lineage:compileJava), it was always successful. Any idea how to fix that? Thanks.
many-pilot-7340
01/06/2022, 8:10 PM
nutritious-bird-77396
01/06/2022, 9:20 PM
SPRING_KAFKA_PROPERTIES_SASL_MECHANISM=AWS_MSK_IAM
Raised the issue in the project - https://github.com/aws/aws-msk-iam-auth/issues/50
If anyone else has seen this issue or has any suggestions, do let me know...
magnificent-park-81142
01/07/2022, 3:58 PM
lemon-hydrogen-83671
01/07/2022, 6:47 PM
/api/*path returns a
HTTP/2 400 Bad Request: HTTP message contains more than the configured limit of 64 headers
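That 400 is most likely the Akka HTTP request parser (which the Play-based frontend runs on) rejecting requests with more than its default limit of 64 headers, so something in front of the service is probably piling up cookies or forwarded headers. A quick probe sketch to confirm the threshold, assuming the frontend answers on localhost:9002:

# Probe sketch: send an increasing number of dummy headers and watch for
# the point where the server starts answering 400.
import urllib.error
import urllib.request

def status_with_n_headers(n: int) -> int:
    req = urllib.request.Request("http://localhost:9002/")
    for i in range(n):
        req.add_header(f"X-Probe-{i}", "1")
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

for n in (10, 60, 70):
    print(n, status_with_n_headers(n))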
billions-tent-29367
01/11/2022, 5:41 PM
curl --location --request POST 'gms:8080/entities?action=search' \
--header 'Content-Type: text/plain' \
--data-raw '{
"input": "**",
"entity": "Application",
"start": 0,
"count": 500
}'
So what does ** actually mean? (Application is a custom entity)
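For experimenting with what the query matches, here is the same call via Python requests; the "value" / "numEntities" fields are assumed from the Rest.li search response shape, and someAppName is a placeholder. Comparing hit counts for **, *, and a literal term should show whether ** behaves as a match-all wildcard:

# Sketch: replay the search with different "input" values and compare counts.
import requests

def search_count(query: str) -> int:
    resp = requests.post(
        "http://gms:8080/entities?action=search",
        json={"input": query, "entity": "Application", "start": 0, "count": 500},
    )
    resp.raise_for_status()
    return resp.json()["value"]["numEntities"]  # assumed response shape

for q in ("**", "*", "someAppName"):  # someAppName is a placeholder term
    print(q, search_count(q))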
billions-tent-29367
01/11/2022, 5:49 PM
Our custom models had to live in the com.linkedin namespace, otherwise the build did not pick them up properly. Every person I've demonstrated our models to has asked something like "Why do the models say com.linkedin? Can you remove that?"
So.. is it possible yet?
quick-pizza-8906
01/11/2022, 7:33 PM
❯ datahub delete --hard --urn 'urn-here'
This will permanently delete data from DataHub. Do you want to continue? [y/N]: y
[2022-01-10 15:20:27,608] INFO {datahub.cli.delete_cli:126} - DataHub configured with <http://localhost:9999>
Nothing deleted for urn-here
Took 1.898 seconds to hard delete 0 rows for 1 entities
where urn-here is the actual URN. After this operation I still see the dataset in the UI. Running the reindexing job does not help. Note the "0 rows for 1 entities" info. What can I do?
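"0 rows for 1 entities" usually means the CLI resolved an entity for the URN but found nothing in the SQL store under exactly that key, so the first thing to rule out is a URN mismatch (env, platform, or casing differing from what search has indexed). One way to check what GMS actually stores, using the Rest.li entities GET with a hypothetical URN:

# Sketch: fetch the entity straight from GMS to verify the exact URN string
# before hard-deleting again. A 404 means the URN doesn't match what's stored.
import urllib.parse
import requests

URN = "urn:li:dataset:(urn:li:dataPlatform:hive,example.table,PROD)"  # hypothetical

resp = requests.get(
    "http://localhost:9999/entities/" + urllib.parse.quote(URN, safe="")
)
print(resp.status_code)
print(resp.json() if resp.ok else resp.text)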