little-megabyte-1074
billowy-eye-48149
08/12/2020, 8:25 PM
calm-minister-22324
08/31/2020, 2:58 PM
fast-exabyte-18411
08/31/2020, 4:06 PM
silly-apple-97303
09/04/2020, 9:53 PM
- name: SPRING_KAFKA_PROPERTIES_BASIC_AUTH_CREDENTIALS_SOURCE
  value: USER_INFO
- name: SPRING_KAFKA_PROPERTIES_BASIC_AUTH_USER_INFO
  valueFrom:
    secretKeyRef:
      name: "kafka-schema-registry-credentials"
      key: "user-info"
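(Editorial aside, not part of the original thread.) The environment variables above rely on Spring Boot's relaxed binding: an env var is lowercased and its underscores become dots, so SPRING_KAFKA_PROPERTIES_BASIC_AUTH_USER_INFO becomes spring.kafka.properties.basic.auth.user.info, and Spring Kafka forwards everything under spring.kafka.properties.* to the Kafka client as-is. A minimal sketch of that mapping (illustrative only; Spring's real binder handles more cases):

```python
def relaxed_bind(env_var: str) -> str:
    """Approximate Spring Boot's env-var -> property-name relaxed binding."""
    return env_var.lower().replace("_", ".")

prop = relaxed_bind("SPRING_KAFKA_PROPERTIES_BASIC_AUTH_USER_INFO")
# Spring Kafka strips the 'spring.kafka.properties.' prefix and passes the
# remainder straight through to the Kafka client configuration.
kafka_key = prop[len("spring.kafka.properties."):]
print(prop)       # spring.kafka.properties.basic.auth.user.info
print(kafka_key)  # basic.auth.user.info
```

Note the mapping is lossy in general (it cannot tell a dot from an underscore in the original key), but it works for dotted keys like basic.auth.user.info.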
And the logs from both MAE/MCE look like:
16:21:26.721 [main] INFO i.c.k.s.KafkaAvroDeserializerConfig - KafkaAvroDeserializerConfig values:
schema.registry.url = [redacted]
basic.auth.user.info = [hidden]
auto.register.schemas = true
max.schemas.per.subject = 1000
basic.auth.credentials.source = USER_INFO
schema.registry.basic.auth.user.info = [hidden]
specific.avro.reader = false
value.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy
key.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy
16:21:26.857 [main] INFO o.a.kafka.common.utils.AppInfoParser - Kafka version: 2.2.1-cp1
However, doing the same for the GMS is not working. Specifically, I get these WARN log messages on startup, and the configs are not attached to the serializer:
16:20:23.481 [main] INFO i.c.k.s.KafkaAvroSerializerConfig - KafkaAvroSerializerConfig values:
schema.registry.url = [redacted]
max.schemas.per.subject = 1000
16:20:24.213 [main] WARN o.a.k.c.producer.ProducerConfig - The configuration 'basic.auth.user.info' was supplied but isn't a known config.
16:20:24.215 [main] WARN o.a.k.c.producer.ProducerConfig - The configuration 'basic.auth.credentials.source' was supplied but isn't a known config.
16:20:24.217 [main] INFO o.a.kafka.common.utils.AppInfoParser - Kafka version: 2.3.0
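(Editorial aside, not part of the original thread.) One plausible mechanism behind those WARNs, sketched here rather than verified against the Kafka client source: the producer config class validates the supplied map against its own set of known keys and warns about anything else, while serializer-specific keys such as basic.auth.user.info only take effect if the configured serializer reads them later. A toy illustration of that split (all names hypothetical):

```python
# Hypothetical, simplified illustration of how a Kafka-style client can warn
# about keys it does not recognize while a serializer consumes them anyway.
KNOWN_PRODUCER_KEYS = {"bootstrap.servers", "key.serializer", "value.serializer"}
SERIALIZER_KEYS = {"schema.registry.url", "basic.auth.credentials.source",
                   "basic.auth.user.info"}

def split_config(props: dict) -> tuple[dict, dict, list]:
    """Partition a flat property map the way client + serializer would see it."""
    producer = {k: v for k, v in props.items() if k in KNOWN_PRODUCER_KEYS}
    serializer = {k: v for k, v in props.items() if k in SERIALIZER_KEYS}
    # Keys the producer config does not know about get a WARN, even if a
    # serializer will happily pick them up afterwards.
    unknown = [k for k in props if k not in KNOWN_PRODUCER_KEYS]
    return producer, serializer, unknown

props = {"bootstrap.servers": "broker:9092",
         "basic.auth.credentials.source": "USER_INFO",
         "basic.auth.user.info": "user:pass"}
producer_cfg, serializer_cfg, warned = split_config(props)
```

Whether the serializer actually picks those keys up depends on the serializer class in use, which is consistent with the observation that the Confluent-built client attaches them and the plain 2.3.0 client does not.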
When digging into this I noticed the MAE/MCE are using Kafka 2.2.1-cp1 (the Confluent Platform build) while the GMS is using 2.3.0 (the non-Confluent build). I'm thinking regular non-Confluent-Platform clients might not support the same set of schema registry configurations.
swift-account-97627
09/07/2020, 12:28 PM
swift-account-97627
09/07/2020, 12:40 PM
SchemaMetadata.SchemaFields
and something like DataProfile.FieldProfiles
).
If this is the correct model, what would be a good way to associate each particular FieldProfile with a particular SchemaField? Or is there a different model that would be better?
More generally, it seems like there's a tension between two models for field-level aspects:
1. Dataset has many Aspects, some of which have metadata for many Fields
2. Dataset has many Fields, some of which have multiple Aspects
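(Editorial aside, not part of the original thread.) To make the two models concrete, here is a rough Python sketch; the field and aspect names are hypothetical, chosen only to mirror the SchemaField/FieldProfile example above. In model (1) each aspect carries per-field entries keyed by field path, so associating a FieldProfile with a SchemaField is a join on that shared key; in model (2) each field owns its aspects directly:

```python
# Model (1): the dataset owns aspects; some aspects contain per-field data.
dataset_model_1 = {
    "aspects": {
        "SchemaMetadata": {"fields": [{"fieldPath": "user_id", "type": "long"}]},
        "DataProfile": {"fieldProfiles": [{"fieldPath": "user_id", "nullCount": 0}]},
    }
}

# Model (2): the dataset owns fields; each field owns its own aspects.
dataset_model_2 = {
    "fields": {
        "user_id": {
            "schema": {"type": "long"},
            "profile": {"nullCount": 0},
        }
    }
}

def join_profiles(dataset: dict) -> dict:
    """Associate each FieldProfile with its SchemaField via fieldPath (model 1)."""
    schema = {f["fieldPath"]: f
              for f in dataset["aspects"]["SchemaMetadata"]["fields"]}
    profiles = dataset["aspects"]["DataProfile"]["fieldProfiles"]
    return {p["fieldPath"]: (schema.get(p["fieldPath"]), p) for p in profiles}
```

Under model (1) the join has to be recomputed by every consumer; under model (2) it is materialized in the storage layout, which is roughly the trade-off the question is pointing at.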
I don't have an opinion on which of these models is more "correct", but the current implementation only really seems to support one aspect per field, and pushes any extensions to favour model (1) above. Is this a conscious design decision, or has this question just not come up yet?
high-hospital-85984
09/14/2020, 6:15 PM
quickstart.sh
and ingestion/ingestion.sh
I see items in the Upstream/Downstream tables in the “Relationship” tab for all dummy datasets. However, the lineage data is empty, as is the graph visualisation. Am I missing something?
able-garden-99963
09/15/2020, 11:00 AM
high-hospital-85984
09/15/2020, 7:05 PM
datahub/metadata-models/src/main/pegasus/com/linkedin/metadata/snapshot/
we define e.g. ChartSnapshot and MLModelSnapshot. However, in Snapshot.pdl
we only list MLModelSnapshot and not ChartSnapshot in the union. Why is that?
Similarly, in datahub/metadata-models/src/main/pegasus/com/linkedin/metadata/entity/
we define a ChartEntity, but not an MLModelEntity, and the ChartEntity is not listed in the union in Entity.pdl. Why is that?
high-hospital-85984
09/16/2020, 1:47 PM
aloof-fall-4769
09/16/2020, 11:53 PM
swift-account-97627
09/21/2020, 10:29 AM
high-hospital-85984
09/21/2020, 11:39 AM
datahub-frontend
image, but hitting: Launcher Chrome not found. Not installed?
Couldn’t find any mention of this in the docs. Any ideas?
some-crayon-90964
09/22/2020, 5:47 PM
/var/run/mysqlId
but it does seem to help. Does anyone have an idea how I can fix this? Thanks in advance.
high-hospital-85984
09/25/2020, 4:37 PM
some-crayon-90964
09/25/2020, 8:36 PM
docker run --name mysql --hostname mysql \
  -e "MYSQL_DATABASE=datahub" -e "MYSQL_USER=datahub" \
  -e "MYSQL_PASSWORD=datahub" -e "MYSQL_ROOT_PASSWORD=datahub" \
  -v mysql:/docker-entrypoint-initdb.d -v mysqldata:/var/lib/mysql \
  -v mysql:/var/run/mysqld:rw --publish 3306:3306 \
  --tmpfs /tmp:rw --tmpfs /run:rw --tmpfs /var/run:rw \
  --network geotab_docker_bridge --read-only --security-opt=no-new-privileges \
  -d gcr.io/data-infrastructure-test-env/myql:5.7 \
  --character-set-server=utf8mb4 --collation-server=utf8mb4_unicode_ci
Not sure what is going wrong; what are the credentials / setup GMS is looking for?
Thanks in advance!
acceptable-architect-70237
09/25/2020, 8:56 PM
high-hospital-85984
09/30/2020, 7:57 AM
flat-answer-18123
09/30/2020, 11:49 AM
strong-pharmacist-65336
09/30/2020, 7:19 PM
some-crayon-90964
09/30/2020, 7:51 PM
chilly-barista-6524
10/06/2020, 9:30 AM
/gradlew build
failing with the following error:
> Task :datahub-web:emberWorkspaceTest FAILED
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':datahub-web:emberWorkspaceTest'.
> Process 'command '/home/shubham.gupta2/datahub/datahub-web/build/yarn/yarn-v1.13.0/bin/yarn'' finished with non-zero exit value 1
Can someone help with this? This is hosted on an EC2 instance.
strong-pharmacist-65336
10/06/2020, 11:54 AM
./gradlew :metadata-events:mxe-schemas:build
hallowed-dinner-34937
10/07/2020, 2:00 PM
hallowed-dinner-34937
10/07/2020, 4:25 PM
nutritious-bird-77396
10/07/2020, 10:52 PM
DatasetSnapshot
metadata…
I am having some issues when deserializing the fields -> type -> type
within the SchemaMetadata
Aspect…
https://github.com/linkedin/datahub/blob/master/metadata-models/src/main/pegasus/com/linkedin/schema/SchemaFieldDataType.pdl
I guess I should set the type of the data as the value in my Jackson deserialization in order to set the corresponding type, but I am having challenges with that…
If LinkedIn or anyone in the community has handled such a case with Jackson deserialization, kindly help out…
Details of the input/error in the thread.
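(Editorial aside, not part of the original thread.) Pegasus encodes a union value as a JSON object with a single key: the fully-qualified name of the member type, e.g. {"com.linkedin.schema.StringType": {}} for the type field of SchemaFieldDataType. A custom Jackson deserializer can dispatch on that key; the sketch below shows the same dispatch in Python with the standard library, since the idea carries over directly. The member names come from the linked .pdl; the mapping table is illustrative, not exhaustive:

```python
import json

# Map fully-qualified Pegasus union member names to handlers. In Jackson this
# would be a Map<String, Class<?>> consulted inside JsonDeserializer.deserialize().
UNION_MEMBERS = {
    "com.linkedin.schema.StringType": "string",
    "com.linkedin.schema.NumberType": "number",
    "com.linkedin.schema.BooleanType": "boolean",
}

def decode_field_type(union_json: str) -> str:
    """Decode a Pegasus-encoded union by dispatching on its single key."""
    obj = json.loads(union_json)
    if len(obj) != 1:
        raise ValueError("a Pegasus union value must be a single-key object")
    (member, _payload), = obj.items()
    try:
        return UNION_MEMBERS[member]
    except KeyError:
        raise ValueError(f"unhandled union member: {member}") from None

print(decode_field_type('{"com.linkedin.schema.StringType": {}}'))  # string
```

In Jackson terms, the equivalent is a deserializer that reads the lone field name of the object, looks up the target class for that name, and delegates the payload to it.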
high-hospital-85984
10/08/2020, 9:58 AM
high-hospital-85984
10/09/2020, 8:52 AM
hallowed-dinner-34937
10/09/2020, 2:06 PM