millions-notebook-72121
01/04/2022, 3:29 PM
gentle-nest-904
01/05/2022, 9:45 AM
gentle-nest-904
01/05/2022, 9:46 AM
gentle-nest-904
01/05/2022, 9:46 AM
gentle-nest-904
01/05/2022, 9:55 AM
nutritious-bird-77396
01/12/2022, 10:08 PM
gms as well..... Looking to debug this issue further....
nutritious-bird-77396
01/19/2022, 2:55 PM
DATAHUB_ANALYTICS_ENABLED has been set to false in the frontend config, since the frontend integration with MSK IAM Auth is not yet in place. In the browser Developer Tools I get a DataFetchingException. Does any of this explain the "An unknown error occurred. (code 500)" error?
curved-thailand-48451
01/20/2022, 7:03 PMError checking feature flag no context set, have you authenticated to a cluster
Error checking feature flag no context set, have you authenticated to a cluster
buildkit not supported by daemon
Error: command 'docker build -t datahub-test_048890/airflow:latest failed: failed to execute cmd: exit status 1
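A common workaround for the "buildkit not supported by daemon" part — a sketch, not from this thread — is that the client is requesting BuildKit while the Docker daemon is too old to support it, so you can fall back to the classic builder:

```
# Fall back to the classic (non-BuildKit) builder for old daemons.
export DOCKER_BUILDKIT=0
# then retry the failing build, e.g.:
#   docker build -t datahub-test_048890/airflow:latest .
```

Upgrading the Docker daemon to a BuildKit-capable version is the longer-term fix.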
curved-thailand-48451
01/20/2022, 7:03 PM
curved-thailand-48451
01/20/2022, 7:04 PM
red-napkin-59945
01/20/2022, 10:11 PM
red-napkin-59945
01/21/2022, 2:35 AM
./gradlew build finished successfully!
strong-iron-17184
01/25/2022, 6:40 PM
Hello, I am trying to start DataHub in Docker. When I run "datahub docker quickstart" in the terminal, it reports that DataHub is already running on port 9002, but when I open it in the browser it cannot be accessed.
quaint-whale-60966
01/26/2022, 6:19 AM
delightful-orange-22738
01/28/2022, 12:28 PM
```
from airflow.operators.bash import BashOperator  # Airflow 2.x import path
from datahub_provider.entities import Dataset

bash_operator = BashOperator(
    dag=dag,
    **create_dag_params(dag_conf=DAG_CONF, task_id='task'),
    # working
    inlets={
        "datasets": [
            Dataset("snowflake", "mydb.schema.tableA"),
            Dataset("snowflake", "mydb.schema.tableB"),
        ],
    },
    outlets={"datasets": [Dataset("snowflake", "mydb.schema.tableC")]},
    # My sample, not working
    # inlets={
    #     "datasets": [
    #         Dataset("hive", "db.read_1", "prod"),
    #         Dataset("hive", "db.read_2", "prod")
    #     ],
    # },
    # outlets={
    #     "datasets": [
    #         Dataset("hive", "db.read_3", "prod")
    #     ],
    # }
)
```
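For context, these Dataset inlets/outlets boil down to DataHub dataset URNs. A hand-rolled sketch of the URN format (not the datahub library itself) shows what the snippet's entries resolve to; note the default environment is the uppercase "PROD":

```python
def dataset_urn(platform: str, name: str, env: str = "PROD") -> str:
    """Build a DataHub dataset URN string for a platform/table pair."""
    return f"urn:li:dataset:(urn:li:dataPlatform:{platform},{name},{env})"

print(dataset_urn("snowflake", "mydb.schema.tableA"))
# urn:li:dataset:(urn:li:dataPlatform:snowflake,mydb.schema.tableA,PROD)
```

The non-working sample passes the lowercase "prod" as the third argument, so comparing the emitted URNs against what already exists in DataHub is one way to check whether the environment string is the mismatch.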
loud-musician-49912
01/28/2022, 2:27 PM
billions-receptionist-60247
02/01/2022, 5:00 AM
red-napkin-59945
02/01/2022, 6:20 PM
billions-receptionist-60247
02/02/2022, 9:53 AM
```
org.apache.kafka.common.errors.TimeoutException: Expiring 3 record(s) for MetadataAuditEvent_v4-1:120000 ms has passed since batch creation
```
A solution found on the internet, and shared by @orange-night-91387, is to increase request.timeout.ms on the producer.
I'm deploying DataHub using the Helm chart on Kubernetes. Can someone suggest how to change this through the Helm chart, or is there another reason why this occurs?
strong-iron-17184
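One possible way to pass such a producer override through the Helm chart — a sketch that assumes the chart's Spring Kafka configuration-override mechanism reaches the producer; the key names should be verified against your chart version:

```
# Hypothetical values.yaml fragment; verify against your datahub-helm version.
global:
  springKafkaConfigurationOverrides:
    request.timeout.ms: "120000"
```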
02/03/2022, 2:37 PM
Hi, I'm trying to get Airflow up on port 58080, but it doesn't show me anything. Some help?
strong-iron-17184
02/03/2022, 2:37 PM
It appears to me that the containers are unhealthy.
few-air-56117
02/03/2022, 3:33 PM
strong-iron-17184
02/04/2022, 3:38 PM
strong-iron-17184
02/04/2022, 4:45 PM
Is the documentation outdated?
strong-iron-17184
02/08/2022, 2:06 PM
wooden-football-7175
02/08/2022, 3:31 PM
future-dusk-77156
02/09/2022, 9:18 AM
```
The retention policy of the schema topic _schemas is incorrect. Expected cleanup.policy to be 'compact' but it is delete
```
We're using the Helm chart, and the pod prerequisites-cp-schema-registry is in a crash loop due to this error. Is there a way to modify the cleanup policy in the chart?
strong-iron-17184
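Not mentioned in the thread, but one way to repair the topic outside the chart is to alter its config directly with Kafka's own CLI; the broker address below is a placeholder:

```
# Set the _schemas topic back to log compaction so the schema
# registry stops crash-looping. Replace the broker placeholder.
kafka-configs.sh --bootstrap-server <broker:9092> \
  --alter --entity-type topics --entity-name _schemas \
  --add-config cleanup.policy=compact
```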
02/10/2022, 6:46 PM
nutritious-bird-77396
02/11/2022, 5:01 PM
acoustic-raincoat-46544
02/12/2022, 1:43 AM