rich-activity-70509
11/26/2022, 7:18 AM
ImportError: dlopen(/private/var/folders/r9/kkk0b47j4fl9f0nlyxsw7dyr0000gn/T/pip-build-env-ls6nbxyb/overlay/lib/python3.10/site-packages/grpc_tools/_protoc_compiler.cpython-310-darwin.so, 0x0002): tried: '/private/var/folders/r9/kkk0b47j4fl9f0nlyxsw7dyr0000gn/T/pip-build-env-ls6nbxyb/overlay/lib/python3.10/site-packages/grpc_tools/_protoc_compiler.cpython-310-darwin.so' (mach-o file, but is an incompatible architecture (have (x86_64), need (arm64e)))
This error occurs while building DataHub's command-line tool or while building DataHub's documentation:
./gradlew :metadata-ingestion:installDev
./gradlew :docs-website:yarnLintFix :docs-website:build -x :metadata-ingestion:runPreFlightScript
This task is failing:
> Task :metadata-ingestion:installDevTest FAILED
with Error:
...
Successfully built acryl-datahub
Failed to build feast
ERROR: Could not build wheels for feast, which is required to install pyproject.toml-based projects
Using Python version 3.10.8.
few-sunset-43876
11/28/2022, 8:34 AM
08:01:55.065 [qtp522764626-446] INFO c.l.m.r.entity.EntityResource:157 - GET urn:li:corpuser:tri.tran5
08:01:55.069 [pool-11-thread-1] INFO c.l.m.filter.RestliLoggingFilter:55 - GET /entities/urn%3Ali%3Acorpuser%3Atri.tran5 - get - 200 - 4ms
08:01:55.074 [qtp522764626-391] INFO c.l.m.r.entity.AspectResource:143 - INGEST PROPOSAL proposal: {aspectName=corpUserStatus, entityUrn=urn:li:corpuser:tri.tran5, entityType=corpuser, changeType=UPSERT, aspect={contentType=application/json, value=ByteString(length=100,bytes=7b227374...37327d7d)}}
08:01:55.091 [pool-11-thread-1] INFO c.l.m.filter.RestliLoggingFilter:55 - POST /aspects?action=ingestProposal - ingestProposal - 200 - 17ms
08:01:55.752 [ThreadPoolTaskExecutor-1] INFO c.l.m.k.t.DataHubUsageEventTransformer:74 - Invalid event type: HomePageViewEvent
08:01:55.752 [ThreadPoolTaskExecutor-1] WARN c.l.m.k.DataHubUsageEventsProcessor:56 - Failed to apply usage events transform to record: {"type":"HomePageViewEvent","actorUrn":"urn:li:corpuser:tri.tran5","timestamp":1669622515141,"date":"Mon Nov 28 2022 15:01:55 GMT+0700 (Indochina Time)","userAgent":"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/107.0.0.0 Safari/537.36","browserId":"1b20cc1b-5afe-4f60-b6f4-2876120b0463"}
08:01:55.776 [I/O dispatcher 1] INFO c.l.m.s.e.update.BulkListener:47 - Successfully fed bulk request. Number of events: 3 Took time ms: -1
08:01:55.781 [pool-11-thread-1] INFO c.l.m.filter.RestliLoggingFilter:55 - GET /entitiesV2?ids=List(urn%3Ali%3Acorpuser%3Atri.tran5) - batchGet - 200 - 3ms
08:01:55.783 [ThreadPoolTaskExecutor-1] INFO c.l.m.k.t.DataHubUsageEventTransformer:74 - Invalid event type: HomePageViewEvent
08:01:55.783 [ThreadPoolTaskExecutor-1] WARN c.l.m.k.DataHubUsageEventsProcessor:56 - Failed to apply usage events transform to record: {"type":"HomePageViewEvent","actorUrn":"urn:li:corpuser:tri.tran5","timestamp":1669622515213,"date":"Mon Nov 28 2022 15:01:55 GMT+0700 (Indochina Time)","userAgent":"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/107.0.0.0 Safari/537.36","browserId":"1b20cc1b-5afe-4f60-b6f4-2876120b0463"}
08:01:56.453 [pool-11-thread-1] INFO c.l.m.filter.RestliLoggingFilter:55 - GET /entitiesV2?ids=List(urn%3Ali%3Acorpuser%3Atri.tran5) - batchGet - 200 - 3ms
08:01:56.458 [pool-11-thread-1] INFO c.l.m.filter.RestliLoggingFilter:55 - GET /entitiesV2?ids=List(urn%3Ali%3Acorpuser%3Atri.tran5) - batchGet - 200 - 2ms
08:01:56.464 [pool-11-thread-1] INFO c.l.m.filter.RestliLoggingFilter:55 - GET /entitiesV2?ids=List(urn%3Ali%3Acorpuser%3Atri.tran5) - batchGet - 200 - 2ms
08:01:56.497 [I/O dispatcher 1] INFO c.l.m.s.e.update.BulkListener:47 - Successfully fed bulk request. Number of events: 1 Took time ms: -1
08:01:56.516 [pool-11-thread-1] INFO c.l.m.filter.RestliLoggingFilter:55 - GET /entitiesV2?ids=List(urn%3Ali%3Acorpuser%3Atri.tran5) - batchGet - 200 - 3ms
08:01:56.534 [pool-11-thread-1] INFO c.l.m.filter.RestliLoggingFilter:55 - GET /entitiesV2?ids=List(urn%3Ali%3Acorpuser%3Atri.tran5) - batchGet - 200 - 2ms
08:01:56.841 [pool-11-thread-1] INFO c.l.m.filter.RestliLoggingFilter:55 - GET /entitiesV2?ids=List(urn%3Ali%3Acorpuser%3Atri.tran5) - batchGet - 200 - 2ms
rich-pager-68736
11/28/2022, 10:30 AM
- name: KAFKA_BOOTSTRAP_SERVER
  value: "XXXXXXXXXXXXXXXX:9098,YYYYYYYYYYYYYYYYY:9098"
- name: KAFKA_PROPERTIES_SECURITY_PROTOCOL
  value: "SASL_SSL"
- name: KAFKA_PROPERTIES_SASL_MECHANISM
  value: "AWS_MSK_IAM"
- name: KAFKA_PROPERTIES_SASL_JAAS_CONFIG
  value: "software.amazon.msk.auth.iam.IAMLoginModule required;"
- name: KAFKA_PROPERTIES_SASL_LOGIN_CALLBACK_HANDLER_CLASS
  value: "software.amazon.msk.auth.iam.IAMClientCallbackHandler"
but it fails to authenticate:
08:53:32 [application-akka.actor.default-dispatcher-7] INFO o.a.k.c.producer.ProducerConfig - ProducerConfig values:
acks = 1
batch.size = 16384
bootstrap.servers = [XXXXXXXXXXXXXXXXXXXXX:9098, YYYYYYYYYYYYYYYYYYYYYYYY:9098]
buffer.memory = 33554432
client.dns.lookup = default
client.id = datahub-frontend
compression.type = none
connections.max.idle.ms = 540000
delivery.timeout.ms = 120000
enable.idempotence = false
interceptor.classes = []
key.serializer = class org.apache.kafka.common.serialization.StringSerializer
linger.ms = 0
max.block.ms = 60000
max.in.flight.requests.per.connection = 5
max.request.size = 1048576
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
receive.buffer.bytes = 32768
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retries = 2147483647
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = [hidden]
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = class software.amazon.msk.auth.iam.IAMClientCallbackHandler
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = AWS_MSK_IAM
security.protocol = SASL_SSL
send.buffer.bytes = 131072
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
ssl.endpoint.identification.algorithm = https
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLS
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
transaction.timeout.ms = 60000
transactional.id = null
value.serializer = class org.apache.kafka.common.serialization.StringSerializer
08:53:33 [application-akka.actor.default-dispatcher-7] INFO o.a.k.c.s.a.AbstractLogin - Successfully logged in.
08:53:33 [application-akka.actor.default-dispatcher-7] INFO o.a.kafka.common.utils.AppInfoParser - Kafka version: 2.3.0
08:53:33 [application-akka.actor.default-dispatcher-7] INFO o.a.kafka.common.utils.AppInfoParser - Kafka commitId: fc1aaa116b661c8a
08:53:33 [application-akka.actor.default-dispatcher-7] INFO o.a.kafka.common.utils.AppInfoParser - Kafka startTimeMs: 1669625613221
08:53:33 [kafka-producer-network-thread | datahub-frontend] INFO o.a.kafka.common.network.Selector - [Producer clientId=datahub-frontend] Failed authentication with XXXXXXXXXXXXXXXXX (An error: (java.security.PrivilegedActionException: javax.security.sasl.SaslException: Exception while evaluating challenge [Caused by javax.security.auth.callback.UnsupportedCallbackException: Unrecognized SASL ClientCallback]) occurred when evaluating SASL token received from the Kafka Broker. Kafka Client will go to AUTHENTICATION_FAILED state.)
08:53:33 [kafka-producer-network-thread | datahub-frontend] ERROR o.apache.kafka.clients.NetworkClient - [Producer clientId=datahub-frontend] Connection to node -2 (XXXXXXXXXXXXXXXXX:9098) failed authentication due to: An error: (java.security.PrivilegedActionException: javax.security.sasl.SaslException: Exception while evaluating challenge [Caused by javax.security.auth.callback.UnsupportedCallbackException: Unrecognized SASL ClientCallback]) occurred when evaluating SASL token received from the Kafka Broker. Kafka Client will go to AUTHENTICATION_FAILED state.
08:53:33 [kafka-producer-network-thread | datahub-frontend] INFO o.a.kafka.common.network.Selector - [Producer clientId=datahub-frontend] Failed authentication with YYYYYYYYYYYYYYYYYY (An error: (java.security.PrivilegedActionException: javax.security.sasl.SaslException: Exception while evaluating challenge [Caused by javax.security.auth.callback.UnsupportedCallbackException: Unrecognized SASL ClientCallback]) occurred when evaluating SASL token received from the Kafka Broker. Kafka Client will go to AUTHENTICATION_FAILED state.)
08:53:33 [kafka-producer-network-thread | datahub-frontend] ERROR o.apache.kafka.clients.NetworkClient - [Producer clientId=datahub-frontend] Connection to node -1 (YYYYYYYYYYYYYYYYYY:9098) failed authentication due to: An error: (java.security.PrivilegedActionException: javax.security.sasl.SaslException: Exception while evaluating challenge [Caused by javax.security.auth.callback.UnsupportedCallbackException: Unrecognized SASL ClientCallback]) occurred when evaluating SASL token received from the Kafka Broker. Kafka Client will go to AUTHENTICATION_FAILED state.
...
Any idea what I can do here? Thanks!
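For what it's worth, the producer dump above shows sasl.login.callback.handler.class set to the IAM handler but sasl.client.callback.handler.class = null, and the exception complains about an unrecognized SASL ClientCallback. A hedged sketch of the extra env var that would populate the client-side handler, following the same KAFKA_PROPERTIES_* naming convention as the entries already in the values file (the exact variable name is my assumption, not something confirmed above):
```
# Sketch only: intended to map to sasl.client.callback.handler.class,
# which the producer log prints as null; verify the env-var mapping first.
- name: KAFKA_PROPERTIES_SASL_CLIENT_CALLBACK_HANDLER_CLASS
  value: "software.amazon.msk.auth.iam.IAMClientCallbackHandler"
```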
breezy-portugal-43538
11/28/2022, 12:04 PM
lively-dusk-19162
11/28/2022, 4:35 PM
ambitious-cartoon-15344
11/29/2022, 3:04 AM
flaky-soccer-57765
11/29/2022, 9:37 AM
breezy-portugal-43538
11/29/2022, 10:27 AM
strong-belgium-32572
11/29/2022, 12:54 PM
refined-dream-17668
11/29/2022, 1:10 PM
Failed to add owners: Failed to update resource with urn urn:li:tag: [our tag name]. Entity does not exist.
What is the reason? Are we able to add both a description and an owner to a tag directly from dbt files?
I see that the same issue exists on the public demo.
ancient-wire-3767
11/29/2022, 1:16 PM
``` 'sqlalchemy.exc.DatabaseError: (cx_Oracle.DatabaseError) ORA-00936: missing expression\n'
'[SQL: SELECT FROM DUAL \n'
'WHERE ROWNUM <= :param_1]\n'
"[parameters: {'param_1': 1}]\n"
'(Background on this error at: http://sqlalche.me/e/13/4xp6)\n'
'[2022-11-29 120449,030] INFO {datahub.ingestion.source.ge_data_profiler:909} - Profiling '
'scheme_name.table_name\n'
'[2022-11-29 120449,037] ERROR {datahub.ingestion.source.ge_data_profiler:939} - Encountered exception while profiling '
```
With profiling disabled, it runs fine without errors. Here is our ingestion recipe:
source:
  type: oracle
  config:
    env: TEST
    password: xxxx
    host_port: 'ip:port'
    service_name: test
    username: xxxx
    schema_pattern:
      allow:
        - scheme_name
    include_views: true
    include_tables: true
    profiling:
      enabled: true
pipeline_name: 'xxx'
Profiling probably issues the query SELECT FROM DUAL, which can be seen in the log.
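If the broken statement is one of the column-level queries the profiler builds, a hedged workaround sketch for the profiling block of the recipe above (profile_table_level_only is a standard option of that block; whether it actually avoids this particular SELECT FROM DUAL probe is an assumption on my part):
```
profiling:
  enabled: true
  # Assumption: table-level-only profiling may skip the column query that is
  # currently being rendered without a select list (SELECT FROM DUAL).
  profile_table_level_only: true
```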
bright-motherboard-35257
11/29/2022, 1:52 PM
bright-motherboard-35257
11/29/2022, 1:53 PM
careful-computer-14484
11/29/2022, 7:11 PM
careful-computer-14484
11/29/2022, 9:26 PM
'failures': {'Stateful Ingestion': ['Fail safe mode triggered, entity difference percent:100.0 > fail_safe_threshold:{self.stateful_ingestion_config.fail_safe_threshold}']},
and I have this in my yaml:
stateful_ingestion:
  enabled: false
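A minimal sketch of where that block normally lives in a recipe, in case the paste above flattened the nesting (the source type is a placeholder, and fail_safe_threshold is simply the knob named in the error message, shown here as a comment):
```
source:
  type: <your-source-type>   # placeholder
  config:
    # stateful_ingestion normally sits under source.config; if it ends up at the
    # wrong level, the enabled: false setting may not take effect.
    stateful_ingestion:
      enabled: false
      # Only consulted when enabled; raise it if a large entity diff is expected.
      # fail_safe_threshold: <percent>
```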
shy-dog-84302
11/30/2022, 10:37 AM
silly-chef-95502
11/30/2022, 12:08 PM
proud-memory-42381
11/30/2022, 2:23 PM
lively-dusk-19162
11/30/2022, 3:00 PM
limited-forest-73733
11/30/2022, 6:14 PM
gentle-tailor-78929
11/30/2022, 9:11 PM
The datahub admin user is missing the following permissions:
managePolicies: false
manageUserCredentials: false
Any ideas what might be causing that? Thanks!
full-gold-60357
12/01/2022, 5:49 AM
best-wire-59738
12/01/2022, 7:11 AM
07:55:12 [application-akka.actor.default-dispatcher-50] ERROR auth.sso.oidc.OidcCallbackLogic - Failed to extract groups: Expected to find a list of strings for attribute with name groups, found class net.minidev.json.JSONArray
late-ability-59580
12/01/2022, 8:54 AM
datahub delete --env PROD --entity_type dataset --platform snowflake
I get a super long traceback; somewhere within it, it says:
HTTP ERROR 401 Unauthorized to perform this action
I get the same error when trying to delete a specific entity using its URN.
Any ideas?
P.S.: I don't get that error when running datahub ingest ... commands.
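A 401 on delete usually points at a GMS with metadata service authentication enabled and no (or an invalid) token on the CLI side, while the ingest recipes may be carrying their own token in the sink config. A rough sketch of the ~/.datahubenv file that datahub init writes, assuming that setup (the host and token values are placeholders):
```
gms:
  server: "http://your-gms-host:8080"   # placeholder
  token: "<personal access token generated in the DataHub UI>"
```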
powerful-cat-68806
12/01/2022, 10:33 AM
datahub-datahub-upgrade-job-xxxxx:
Error: secret "mysql-secrets" not found
I presume this error occurs because I need a DB for the app (PostgreSQL / MySQL). I want to use a DB provided by my cloud vendor (AWS), not one from the app deployment.
Obviously, I'm customizing the values.yaml, both for the prerequisites and for datahub.
Please advise, thanks 🙏
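Using an AWS-managed database is fine, but the chart still expects the DB password to come from a Kubernetes secret, and the error says that secret does not exist yet. A hedged sketch of the missing secret (the secret name comes straight from the error; the key name mysql-root-password is the chart's documented default, so double-check it against your values.yaml, where the external host would also go under global.sql.datasource):
```
apiVersion: v1
kind: Secret
metadata:
  name: mysql-secrets          # name taken from the upgrade-job error
type: Opaque
stringData:
  # Key assumed from the chart's documented default; verify against values.yaml.
  mysql-root-password: "<password of the external AWS database>"
```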
faint-translator-23365
12/01/2022, 11:25 AM
adorable-magazine-49274
12/01/2022, 2:39 PM
datahub-elasticsearch-setup-job.
Problem with request: Get http://elasticsearch-master:9200: dial tcp 10.102.226.215:9200: connect: connection refused. Sleeping 1s
Could you please help with this situation?
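The setup job simply cannot reach elasticsearch-master:9200, which is the default host the chart points at (normally the Elasticsearch from the prerequisites chart). A rough sketch of the values.yaml section that controls this, assuming a recent chart layout for the keys; point it at your actual Elasticsearch, or check that the prerequisites ES pod is up and its service name matches:
```
global:
  elasticsearch:
    # Defaults usually target the prerequisites chart's service; adjust if ES runs elsewhere.
    host: "elasticsearch-master"
    port: "9200"
```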
modern-answer-65441
12/01/2022, 9:33 PM
[+] Building 205.1s (25/32)
=> [internal] load build definition from Dockerfile 0.0s
=> => transferring dockerfile: 37B 0.0s
=> [internal] load .dockerignore 0.0s
=> => transferring context: 35B 0.0s
=> [internal] load metadata for docker.io/library/node:16.13.0-alpine3.14 1.0s
=> [internal] load metadata for docker.io/library/alpine:3.14 1.0s
=> [auth] library/node:pull token for registry-1.docker.io 0.0s
=> [auth] library/alpine:pull token for registry-1.docker.io 0.0s
=> [prod-build 1/16] FROM docker.io/library/node:16.13.0-alpine3.14@sha256:60ef0bed1dc2ec835cfe3c4226d074fdfaba571fd619c280474cc04e93f0ec5b 0.0s
=> [internal] load build context 0.2s
=> => transferring context: 471.64kB 0.2s
=> [base 1/3] FROM docker.io/library/alpine:3.14@sha256:4c869a63e1b7c0722fed1e402a6466610327c3b83bdddb94bd94fb71da7f638a 0.0s
=> CACHED [base 2/3] RUN addgroup -S datahub && adduser -S datahub -G datahub 0.0s
=> CACHED [base 3/3] RUN apk --no-cache --update-cache --available upgrade && apk --no-cache add curl && apk --no-cache add openjdk11-jre --repository=http://dl-cdn.alpinelinux.org/alpine/edge/com 0.0s
=> CACHED [prod-build 2/16] RUN apk --no-cache --update-cache --available upgrade && apk --no-cache add perl openjdk8 openjdk11 0.0s
=> CACHED [prod-build 3/16] COPY ./datahub-frontend ./datahub-src/datahub-frontend 0.0s
=> CACHED [prod-build 4/16] COPY ./entity-registry ./datahub-src/entity-registry 0.0s
=> CACHED [prod-build 5/16] COPY ./buildSrc ./datahub-src/buildSrc 0.0s
=> [prod-build 6/16] COPY ./datahub-web-react ./datahub-src/datahub-web-react 0.1s
=> [prod-build 7/16] COPY ./li-utils ./datahub-src/li-utils 0.0s
=> [prod-build 8/16] COPY ./metadata-models ./datahub-src/metadata-models 0.0s
=> [prod-build 9/16] COPY ./metadata-models-validator ./datahub-src/metadata-models-validator 0.0s
=> [prod-build 10/16] COPY ./metadata-utils ./datahub-src/metadata-utils 0.0s
=> [prod-build 11/16] COPY ./metadata-service ./datahub-src/metadata-service 0.0s
=> [prod-build 12/16] COPY ./metadata-io ./datahub-src/metadata-io 0.0s
=> [prod-build 13/16] COPY ./datahub-graphql-core ./datahub-src/datahub-graphql-core 0.1s
=> [prod-build 14/16] COPY ./gradle ./datahub-src/gradle 0.0s
=> [prod-build 15/16] COPY repositories.gradle gradle.properties gradlew settings.gradle build.gradle ./datahub-src/ 0.0s
=> [prod-build 16/16] RUN cd datahub-src && ./gradlew :datahub-web-react:build -x test -x yarnTest -x yarnLint && ./gradlew :datahub-frontend:dist -PuseSystemNode=true -x test -x yarnTest -x yar 203.3s
=> => # at org.gradle.configuration.internal.DefaultUserCodeApplicationContext.apply(DefaultUserCodeApplicationContext.java:43)
=> => # at org.gradle.api.internal.plugins.DefaultPluginManager.doApply(DefaultPluginManager.java:156)
=> => # ... 190 more
=> => # fullVersion=0.0.0-unknown-SNAPSHOT
=> => # cliMajorVersion=0.0.0-unknown-SNAPSHOT
=> => # version=0.0.0-unknown-SNAPSHOT
Can someone help me here?
fierce-electrician-85924
12/02/2022, 7:34 AM
quick-student-61408
12/02/2022, 10:08 AM