# troubleshoot
  • r

    rich-activity-70509

    11/26/2022, 7:18 AM
    Hello, team. Has anyone faced this error before? It seems like a problem with the Apple M1 setup for grpc; I couldn't find a working solution online.
    Copy code
    ImportError: dlopen(/private/var/folders/r9/kkk0b47j4fl9f0nlyxsw7dyr0000gn/T/pip-build-env-ls6nbxyb/overlay/lib/python3.10/site-packages/grpc_tools/_protoc_compiler.cpython-310-darwin.so, 0x0002): tried: '/private/var/folders/r9/kkk0b47j4fl9f0nlyxsw7dyr0000gn/T/pip-build-env-ls6nbxyb/overlay/lib/python3.10/site-packages/grpc_tools/_protoc_compiler.cpython-310-darwin.so' (mach-o file, but is an incompatible architecture (have (x86_64), need (arm64e)))
    This error occurs while building DataHub's command-line tool or while building DataHub's documentation.
    Copy code
    ./gradlew :metadata-ingestion:installDev
    ./gradlew :docs-website:yarnLintFix :docs-website:build -x :metadata-ingestion:runPreFlightScript
    This task is failing:
    Copy code
    > Task :metadata-ingestion:installDevTest FAILED
    with Error:
    Copy code
    ...
    Successfully built acryl-datahub
    Failed to build feast
    ERROR: Could not build wheels for feast, which is required to install pyproject.toml-based projects
    Using Python version 3.10.8
    b
    • 2
    • 2
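    [Editor's note] The mach-o message above says an x86_64 build of grpc_tools was loaded into a process that needs arm64. A quick, standard-library-only way to check whether the Python interpreter itself is running natively on arm64 or as x86_64 (e.g. under Rosetta) before rebuilding; this is only a diagnostic sketch, not a fix:
    ```
    import platform
    import sysconfig

    # On Apple Silicon this prints "arm64"; "x86_64" means the interpreter
    # (or the shell that launched it) is running under Rosetta.
    print(platform.machine())

    # The platform tag pip uses when selecting wheels, e.g. "macosx-12-arm64".
    print(sysconfig.get_platform())

    print(platform.python_version())
    ```
    If the interpreter reports x86_64, reinstalling a native arm64 Python (and recreating the virtualenv the Gradle task uses) is usually the first thing to try.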
  • f

    few-sunset-43876

    11/28/2022, 8:34 AM
    Hi team! I got these WARN logs in datahub-gms. Does this mean data could not be ingested? How can I fix it? (I deploy DataHub using Docker Compose.) Thanks in advance!
    Copy code
    08:01:55.065 [qtp522764626-446] INFO  c.l.m.r.entity.EntityResource:157 - GET urn:li:corpuser:tri.tran5
    08:01:55.069 [pool-11-thread-1] INFO  c.l.m.filter.RestliLoggingFilter:55 - GET /entities/urn%3Ali%3Acorpuser%3Atri.tran5 - get - 200 - 4ms
    08:01:55.074 [qtp522764626-391] INFO  c.l.m.r.entity.AspectResource:143 - INGEST PROPOSAL proposal: {aspectName=corpUserStatus, entityUrn=urn:li:corpuser:tri.tran5, entityType=corpuser, changeType=UPSERT, aspect={contentType=application/json, value=ByteString(length=100,bytes=7b227374...37327d7d)}}
    08:01:55.091 [pool-11-thread-1] INFO  c.l.m.filter.RestliLoggingFilter:55 - POST /aspects?action=ingestProposal - ingestProposal - 200 - 17ms
    08:01:55.752 [ThreadPoolTaskExecutor-1] INFO  c.l.m.k.t.DataHubUsageEventTransformer:74 - Invalid event type: HomePageViewEvent
    08:01:55.752 [ThreadPoolTaskExecutor-1] WARN  c.l.m.k.DataHubUsageEventsProcessor:56 - Failed to apply usage events transform to record: {"type":"HomePageViewEvent","actorUrn":"urn:li:corpuser:tri.tran5","timestamp":1669622515141,"date":"Mon Nov 28 2022 15:01:55 GMT+0700 (Indochina Time)","userAgent":"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/107.0.0.0 Safari/537.36","browserId":"1b20cc1b-5afe-4f60-b6f4-2876120b0463"}
    08:01:55.776 [I/O dispatcher 1] INFO  c.l.m.s.e.update.BulkListener:47 - Successfully fed bulk request. Number of events: 3 Took time ms: -1
    08:01:55.781 [pool-11-thread-1] INFO  c.l.m.filter.RestliLoggingFilter:55 - GET /entitiesV2?ids=List(urn%3Ali%3Acorpuser%3Atri.tran5) - batchGet - 200 - 3ms
    08:01:55.783 [ThreadPoolTaskExecutor-1] INFO  c.l.m.k.t.DataHubUsageEventTransformer:74 - Invalid event type: HomePageViewEvent
    08:01:55.783 [ThreadPoolTaskExecutor-1] WARN  c.l.m.k.DataHubUsageEventsProcessor:56 - Failed to apply usage events transform to record: {"type":"HomePageViewEvent","actorUrn":"urn:li:corpuser:tri.tran5","timestamp":1669622515213,"date":"Mon Nov 28 2022 15:01:55 GMT+0700 (Indochina Time)","userAgent":"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/107.0.0.0 Safari/537.36","browserId":"1b20cc1b-5afe-4f60-b6f4-2876120b0463"}
    08:01:56.453 [pool-11-thread-1] INFO  c.l.m.filter.RestliLoggingFilter:55 - GET /entitiesV2?ids=List(urn%3Ali%3Acorpuser%3Atri.tran5) - batchGet - 200 - 3ms
    08:01:56.458 [pool-11-thread-1] INFO  c.l.m.filter.RestliLoggingFilter:55 - GET /entitiesV2?ids=List(urn%3Ali%3Acorpuser%3Atri.tran5) - batchGet - 200 - 2ms
    08:01:56.464 [pool-11-thread-1] INFO  c.l.m.filter.RestliLoggingFilter:55 - GET /entitiesV2?ids=List(urn%3Ali%3Acorpuser%3Atri.tran5) - batchGet - 200 - 2ms
    08:01:56.497 [I/O dispatcher 1] INFO  c.l.m.s.e.update.BulkListener:47 - Successfully fed bulk request. Number of events: 1 Took time ms: -1
    08:01:56.516 [pool-11-thread-1] INFO  c.l.m.filter.RestliLoggingFilter:55 - GET /entitiesV2?ids=List(urn%3Ali%3Acorpuser%3Atri.tran5) - batchGet - 200 - 3ms
    08:01:56.534 [pool-11-thread-1] INFO  c.l.m.filter.RestliLoggingFilter:55 - GET /entitiesV2?ids=List(urn%3Ali%3Acorpuser%3Atri.tran5) - batchGet - 200 - 2ms
    08:01:56.841 [pool-11-thread-1] INFO  c.l.m.filter.RestliLoggingFilter:55 - GET /entitiesV2?ids=List(urn%3Ali%3Acorpuser%3Atri.tran5) - batchGet - 200 - 2ms
    b
    f
    • 3
    • 9
  • r

    rich-pager-68736

    11/28/2022, 10:30 AM
    Hi there! I am trying to set up Kafka using MSK and IAM authentication. It's working fine for most services:
    • actions is using SCRAM-SHA-512, but that's okay for now. Any plans to support MSK IAM here?
    • gms, mae, mce and the schema registry are working fine with MSK IAM
    • What I cannot solve is telling datahub-frontend to use MSK IAM. I have configured it like this:
    Copy code
    - name: KAFKA_BOOTSTRAP_SERVER
      value: "XXXXXXXXXXXXXXXX:9098,YYYYYYYYYYYYYYYYY:9098"
    - name: KAFKA_PROPERTIES_SECURITY_PROTOCOL
      value: "SASL_SSL"
    - name: KAFKA_PROPERTIES_SASL_MECHANISM
      value: "AWS_MSK_IAM"
    - name: KAFKA_PROPERTIES_SASL_JAAS_CONFIG
      value: "software.amazon.msk.auth.iam.IAMLoginModule required;"
    - name: KAFKA_PROPERTIES_SASL_LOGIN_CALLBACK_HANDLER_CLASS
      value: "software.amazon.msk.auth.iam.IAMClientCallbackHandler"
    but it fails to authenticate:
    Copy code
    08:53:32 [application-akka.actor.default-dispatcher-7] INFO  o.a.k.c.producer.ProducerConfig - ProducerConfig values: 
        acks = 1
        batch.size = 16384
        bootstrap.servers = [XXXXXXXXXXXXXXXXXXXXX:9098, YYYYYYYYYYYYYYYYYYYYYYYY:9098]
        buffer.memory = 33554432
        client.dns.lookup = default
        client.id = datahub-frontend
        compression.type = none
        connections.max.idle.ms = 540000
        delivery.timeout.ms = 120000
        enable.idempotence = false
        interceptor.classes = []
        key.serializer = class org.apache.kafka.common.serialization.StringSerializer
        linger.ms = 0
        max.block.ms = 60000
        max.in.flight.requests.per.connection = 5
        max.request.size = 1048576
        metadata.max.age.ms = 300000
        metric.reporters = []
        metrics.num.samples = 2
        metrics.recording.level = INFO
        metrics.sample.window.ms = 30000
        partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
        receive.buffer.bytes = 32768
        reconnect.backoff.max.ms = 1000
        reconnect.backoff.ms = 50
        request.timeout.ms = 30000
        retries = 2147483647
        retry.backoff.ms = 100
        sasl.client.callback.handler.class = null
        sasl.jaas.config = [hidden]
        sasl.kerberos.kinit.cmd = /usr/bin/kinit
        sasl.kerberos.min.time.before.relogin = 60000
        sasl.kerberos.service.name = null
        sasl.kerberos.ticket.renew.jitter = 0.05
        sasl.kerberos.ticket.renew.window.factor = 0.8
        sasl.login.callback.handler.class = class software.amazon.msk.auth.iam.IAMClientCallbackHandler
        sasl.login.class = null
        sasl.login.refresh.buffer.seconds = 300
        sasl.login.refresh.min.period.seconds = 60
        sasl.login.refresh.window.factor = 0.8
        sasl.login.refresh.window.jitter = 0.05
        sasl.mechanism = AWS_MSK_IAM
        security.protocol = SASL_SSL
        send.buffer.bytes = 131072
        ssl.cipher.suites = null
        ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
        ssl.endpoint.identification.algorithm = https
        ssl.key.password = null
        ssl.keymanager.algorithm = SunX509
        ssl.keystore.location = null
        ssl.keystore.password = null
        ssl.keystore.type = JKS
        ssl.protocol = TLS
        ssl.provider = null
        ssl.secure.random.implementation = null
        ssl.trustmanager.algorithm = PKIX
        ssl.truststore.location = null
        ssl.truststore.password = null
        ssl.truststore.type = JKS
        transaction.timeout.ms = 60000
        transactional.id = null
        value.serializer = class org.apache.kafka.common.serialization.StringSerializer
    
    08:53:33 [application-akka.actor.default-dispatcher-7] INFO  o.a.k.c.s.a.AbstractLogin - Successfully logged in.
    08:53:33 [application-akka.actor.default-dispatcher-7] INFO  o.a.kafka.common.utils.AppInfoParser - Kafka version: 2.3.0
    08:53:33 [application-akka.actor.default-dispatcher-7] INFO  o.a.kafka.common.utils.AppInfoParser - Kafka commitId: fc1aaa116b661c8a
    08:53:33 [application-akka.actor.default-dispatcher-7] INFO  o.a.kafka.common.utils.AppInfoParser - Kafka startTimeMs: 1669625613221
    08:53:33 [kafka-producer-network-thread | datahub-frontend] INFO  o.a.kafka.common.network.Selector - [Producer clientId=datahub-frontend] Failed authentication with XXXXXXXXXXXXXXXXX (An error: (java.security.PrivilegedActionException: javax.security.sasl.SaslException: Exception while evaluating challenge [Caused by javax.security.auth.callback.UnsupportedCallbackException: Unrecognized SASL ClientCallback]) occurred when evaluating SASL token received from the Kafka Broker. Kafka Client will go to AUTHENTICATION_FAILED state.)
    08:53:33 [kafka-producer-network-thread | datahub-frontend] ERROR o.apache.kafka.clients.NetworkClient - [Producer clientId=datahub-frontend] Connection to node -2 (XXXXXXXXXXXXXXXXX:9098) failed authentication due to: An error: (java.security.PrivilegedActionException: javax.security.sasl.SaslException: Exception while evaluating challenge [Caused by javax.security.auth.callback.UnsupportedCallbackException: Unrecognized SASL ClientCallback]) occurred when evaluating SASL token received from the Kafka Broker. Kafka Client will go to AUTHENTICATION_FAILED state.
    08:53:33 [kafka-producer-network-thread | datahub-frontend] INFO  o.a.kafka.common.network.Selector - [Producer clientId=datahub-frontend] Failed authentication with YYYYYYYYYYYYYYYYYY (An error: (java.security.PrivilegedActionException: javax.security.sasl.SaslException: Exception while evaluating challenge [Caused by javax.security.auth.callback.UnsupportedCallbackException: Unrecognized SASL ClientCallback]) occurred when evaluating SASL token received from the Kafka Broker. Kafka Client will go to AUTHENTICATION_FAILED state.)
    08:53:33 [kafka-producer-network-thread | datahub-frontend] ERROR o.apache.kafka.clients.NetworkClient - [Producer clientId=datahub-frontend] Connection to node -1 (YYYYYYYYYYYYYYYYYY:9098) failed authentication due to: An error: (java.security.PrivilegedActionException: javax.security.sasl.SaslException: Exception while evaluating challenge [Caused by javax.security.auth.callback.UnsupportedCallbackException: Unrecognized SASL ClientCallback]) occurred when evaluating SASL token received from the Kafka Broker. Kafka Client will go to AUTHENTICATION_FAILED state.
    ...
    Any idea what I can do here? Thanks!
    b
    b
    • 3
    • 7
  • b

    breezy-portugal-43538

    11/28/2022, 12:04 PM
    Hi, I wanted to enable profiling when I ingest an s3 file into DataHub, but I get missing Spark dependency errors because of our proxies. Is there an easy way to pass Spark session parameters like this? spark.driver.extraJavaOptions -Dhttp.proxyHost=<my_proxy>
    b
    d
    • 3
    • 4
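    [Editor's note] For the Spark proxy question above: on a self-managed SparkSession, JVM proxy flags are normally passed through spark.driver.extraJavaOptions and spark.executor.extraJavaOptions before the session starts. Whether the s3 profiling source exposes a hook for these options is exactly what is being asked; the sketch below only shows the plain PySpark way of setting them, with a hypothetical proxy host and port.
    ```
    from pyspark.sql import SparkSession

    # Hypothetical proxy settings; replace host and port with your own.
    proxy_opts = (
        "-Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=8080 "
        "-Dhttps.proxyHost=proxy.example.com -Dhttps.proxyPort=8080"
    )

    spark = (
        SparkSession.builder.appName("profiling-with-proxy")
        # Note: in client mode the driver JVM is already running, so the driver
        # option is usually supplied via spark-submit --conf or
        # spark-defaults.conf rather than set programmatically here.
        .config("spark.driver.extraJavaOptions", proxy_opts)
        .config("spark.executor.extraJavaOptions", proxy_opts)
        .getOrCreate()
    )
    ```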
  • l

    lively-dusk-19162

    11/28/2022, 4:35 PM
    Hello team, I have a couple of SQL queries. Is there any way, or any library, to derive column-level lineage from SQL queries?
    b
    g
    m
    • 4
    • 11
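    [Editor's note] One commonly used open-source option for the question above is the sqllineage package (pip install sqllineage); column-level lineage is exposed via get_column_lineage() in its more recent releases. The SQL below is a made-up example, and this is a sketch of that library's API rather than a DataHub-specific answer.
    ```
    from sqllineage.runner import LineageRunner

    sql = """
    INSERT INTO analytics.daily_orders
    SELECT o.order_id, o.amount, c.region
    FROM raw.orders o
    JOIN raw.customers c ON o.customer_id = c.id
    """

    runner = LineageRunner(sql)
    print(runner.source_tables())             # table-level upstreams
    print(runner.target_tables())             # table-level downstreams
    for edge in runner.get_column_lineage():  # column-level edges (newer versions)
        print(edge)
    ```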
  • a

    ambitious-cartoon-15344

    11/29/2022, 3:04 AM
    hi all, asking how to get an Airflow DAG to not send metadata to DataHub
    a
    • 2
    • 3
  • f

    flaky-soccer-57765

    11/29/2022, 9:37 AM
    Morning all, tags created through the ingestion recipe are not editable (adding a description) in the UI: "Error: URN not available". Is this a known bug? Can you suggest how to resolve this, please?
    b
    • 2
    • 2
  • b

    breezy-portugal-43538

    11/29/2022, 10:27 AM
    Hello everyone, is there some way to pass mlFeatureTable urns to MLModelPropertiesClass in order to see the feature tables used in an MLModel? If yes, how can I achieve this? Thanks for the help in advance!
    a
    g
    • 3
    • 7
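    [Editor's note] A heavily hedged sketch for the question above. As far as this editor knows, MLModelProperties carries an mlFeatures list of mlFeature urns rather than mlFeatureTable urns, and feature tables are normally linked to features, not directly to models; whether a direct model-to-feature-table field exists is not confirmed here. Urns, names, and the GMS address below are placeholders.
    ```
    from datahub.emitter.mcp import MetadataChangeProposalWrapper
    from datahub.emitter.rest_emitter import DatahubRestEmitter
    from datahub.metadata.schema_classes import ChangeTypeClass, MLModelPropertiesClass

    model_urn = "urn:li:mlModel:(urn:li:dataPlatform:mlflow,my_model,PROD)"

    # mlFeatures expects mlFeature urns; the feature table is reachable from them.
    properties = MLModelPropertiesClass(
        description="Example model",
        mlFeatures=[
            "urn:li:mlFeature:(my_feature_namespace,feature_a)",
            "urn:li:mlFeature:(my_feature_namespace,feature_b)",
        ],
    )

    mcp = MetadataChangeProposalWrapper(
        entityType="mlModel",
        entityUrn=model_urn,
        changeType=ChangeTypeClass.UPSERT,
        aspectName="mlModelProperties",
        aspect=properties,
    )
    DatahubRestEmitter(gms_server="http://localhost:8080").emit_mcp(mcp)
    ```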
  • s

    strong-belgium-32572

    11/29/2022, 12:54 PM
    Hello, could someone guide me on this issue in DataHub? I don't see the options to add/edit users/groups, nor even policies. I have OIDC authentication enabled, and even when logged in as the root user I just see the following in Settings: no options to add or update users and groups.
    b
    b
    e
    • 4
    • 21
  • r

    refined-dream-17668

    11/29/2022, 1:10 PM
    Hi, during dbt ingestion a tag is automatically created based on the dbt files. When trying to add text to the "About" section of that tag, or add an Owner to it, this error appears:
    Failed to add owners: Failed to update resource with urn urn:li:tag: [our tag name]. Entity does not exist.
    What is the reason? Are we able to add both a description and an owner to a tag directly from dbt files? I see the same issue exists on the public demo.
    plus1 2
    a
    l
    • 3
    • 3
  • a

    ancient-wire-3767

    11/29/2022, 1:16 PM
    Hi, during an import with profiling from Oracle we face a run failure. Would anyone help us solve this? Details below.
    ```
    'sqlalchemy.exc.DatabaseError: (cx_Oracle.DatabaseError) ORA-00936: missing expression\n'
    '[SQL: SELECT FROM DUAL \n'
    'WHERE ROWNUM <= :param_1]\n'
    "[parameters: {'param_1': 1}]\n"
    '(Background on this error at: http://sqlalche.me/e/13/4xp6)\n'
    '[2022-11-29 120449,030] INFO {datahub.ingestion.source.ge_data_profiler:909} - Profiling '
    'scheme_name.table_name\n'
    '[2022-11-29 120449,037] ERROR {datahub.ingestion.source.ge_data_profiler:939} - Encountered exception while profiling '
    ```
    Without enabled profiling it runs fine without error. Here is our ingestion recipe:
    Copy code
    source:
        type: oracle
        config:
            env: TEST
            password: xxxx
            host_port: 'ip:port'
            service_name: test
            username: xxxx
            schema_pattern:
                allow:
                    - scheme_name
            include_views: true
            include_tables: true
            profiling:
                enabled: true
    pipeline_name: 'xxx'
    The query SELECT FROM DUAL, which can be seen in the log, is probably the cause.
    d
    a
    • 3
    • 3
  • b

    bright-motherboard-35257

    11/29/2022, 1:52 PM
    I have several days' worth of scheduled ingestions stuck at "pending"; trying to cancel them via the UI does not seem to make any difference. Is there any way to force-cancel these?
    b
    m
    • 3
    • 8
  • b

    bright-motherboard-35257

    11/29/2022, 1:53 PM
    image.png
  • c

    careful-computer-14484

    11/29/2022, 7:11 PM
    Hey folks. I am working with the snowflake connector, and was able to get the view metadata for a set of views, but was not able to get profile output. Does Datahub profile views?
    b
    d
    • 3
    • 3
  • c

    careful-computer-14484

    11/29/2022, 9:26 PM
    Anyone know what this error means? I am using the Snowflake operator
    Copy code
    'failures': {'Stateful Ingestion': ['Fail safe mode triggered, entity difference percent:100.0 > fail_safe_threshold:{self.stateful_ingestion_config.fail_safe_threshold}']},
    and I have this in my yaml:
    Copy code
    stateful_ingestion:
        enabled: false
    b
    l
    +2
    • 5
    • 11
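    [Editor's note] For reference on the question above, a hedged sketch of where stateful_ingestion (and its fail_safe_threshold, the setting named in the error) normally nests inside a recipe when run programmatically. The Snowflake connection values and server address are placeholders, and this is not presented as the fix for the fail-safe error itself.
    ```
    from datahub.ingestion.run.pipeline import Pipeline

    # Placeholder connection details; stateful ingestion also needs a
    # pipeline_name and a reachable DataHub sink so state can persist between runs.
    pipeline = Pipeline.create(
        {
            "pipeline_name": "snowflake_example",
            "source": {
                "type": "snowflake",
                "config": {
                    "account_id": "my_account",
                    "username": "my_user",
                    "password": "my_password",
                    "stateful_ingestion": {
                        "enabled": True,
                        # Abort stale-entity removal if more than this percentage
                        # of previously seen entities would disappear in one run.
                        "fail_safe_threshold": 20.0,
                    },
                },
            },
            "sink": {
                "type": "datahub-rest",
                "config": {"server": "http://localhost:8080"},
            },
        }
    )
    pipeline.run()
    pipeline.raise_from_status()
    ```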
  • s

    shy-dog-84302

    11/30/2022, 10:37 AM
    Has anyone experienced a similar issue, and can offer possible hints for solving it?
  • s

    silly-chef-95502

    11/30/2022, 12:08 PM
    hi, it's a super starter question, but I'm trying to connect DataHub to a Redshift that is port-forwarded to my local machine via an SSH tunnel. I've tried running the crawler and it fails to connect. For the host I've tried putting the following in the UI:
    • localhost
    • the host IP
    • the container IP
    • host.docker.internal
    In DataHub's actions container I can connect using host.docker.internal, but not when I issue a crawl job.
    d
    • 2
    • 4
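    [Editor's note] A generic, standard-library-only connectivity check related to the question above; it only tells you whether a given host/port is reachable from wherever the ingestion actually runs. The candidate host names and the Redshift port below are placeholders.
    ```
    import socket

    def can_connect(host: str, port: int, timeout: float = 3.0) -> bool:
        """Return True if a TCP connection to host:port succeeds."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError as exc:
            print(f"{host}:{port} -> {exc}")
            return False

    # Run this from the same place the ingestion runs (e.g. inside the
    # datahub-actions container) to see which address is actually reachable.
    for candidate in ("localhost", "host.docker.internal"):
        print(candidate, can_connect(candidate, 5439))
    ```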
  • p

    proud-memory-42381

    11/30/2022, 2:23 PM
    Hi - I'll just ask here too: is there a way of viewing api calls to the source when trying to ingest? The debug option only seems to show me communication with datahub itself... Thanks in advance! Posted to the tableau channel initially: https://datahubspace.slack.com/archives/C02GBGG90CU/p1669815878875039
    a
    • 2
    • 4
  • l

    lively-dusk-19162

    11/30/2022, 3:00 PM
    Hello team, when we ingest fine-grained lineage to DataHub using the Python SDK, what should the dataset be in the following line?
    upstream = Upstream(dataset=..., type=DatasetLineageType.TRANSFORMED)
    Is it all upstream tables found while deriving lineage, or one particular table? What is its purpose? Could anyone please help me with this?
    b
    • 2
    • 6
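    [Editor's note] For the question above, a hedged sketch following the pattern in DataHub's fine-grained lineage examples: dataset on Upstream is the urn of a single upstream table, and you create one Upstream object per upstream table, while the column-to-column pairs go into FineGrainedLineage. Platform, table, and column names plus the GMS address are placeholders.
    ```
    import datahub.emitter.mce_builder as builder
    from datahub.emitter.mcp import MetadataChangeProposalWrapper
    from datahub.emitter.rest_emitter import DatahubRestEmitter
    from datahub.metadata.com.linkedin.pegasus2avro.dataset import (
        DatasetLineageType,
        FineGrainedLineage,
        FineGrainedLineageDownstreamType,
        FineGrainedLineageUpstreamType,
        Upstream,
        UpstreamLineage,
    )
    from datahub.metadata.schema_classes import ChangeTypeClass

    upstream_urn = builder.make_dataset_urn("snowflake", "db.schema.orders_raw")
    downstream_urn = builder.make_dataset_urn("snowflake", "db.schema.orders_clean")

    # One Upstream per upstream table; `dataset` is that single table's urn.
    upstream = Upstream(dataset=upstream_urn, type=DatasetLineageType.TRANSFORMED)

    # Column-level edges reference schemaField urns on those datasets.
    fine_grained = FineGrainedLineage(
        upstreamType=FineGrainedLineageUpstreamType.FIELD_SET,
        upstreams=[builder.make_schema_field_urn(upstream_urn, "order_id")],
        downstreamType=FineGrainedLineageDownstreamType.FIELD,
        downstreams=[builder.make_schema_field_urn(downstream_urn, "order_id")],
    )

    lineage = UpstreamLineage(upstreams=[upstream], fineGrainedLineages=[fine_grained])

    mcp = MetadataChangeProposalWrapper(
        entityType="dataset",
        entityUrn=downstream_urn,  # the lineage aspect is attached to the downstream
        changeType=ChangeTypeClass.UPSERT,
        aspectName="upstreamLineage",
        aspect=lineage,
    )
    DatahubRestEmitter(gms_server="http://localhost:8080").emit_mcp(mcp)
    ```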
  • l

    limited-forest-73733

    11/30/2022, 6:14 PM
    Hey team, I am using all the acryldata images of version 0.9.2, but all the images have many high and critical vulnerabilities. Can anyone suggest a fix for them? I tried to update the jar packages, but then the build failed. Any suggestion or remedy to get this fixed?
    b
    • 2
    • 3
  • g

    gentle-tailor-78929

    11/30/2022, 9:11 PM
    Hi all, just finished deploying DataHub v0.9.2 and I'm noticing that the datahub admin user is missing the following permissions:
    Copy code
    managePolicies: false
    manageUserCredentials: false
    Any ideas what might be causing that? Thanks!
    b
    • 2
    • 3
  • f

    full-gold-60357

    12/01/2022, 5:49 AM
    Hello all! Our DataHub is deployed on K8s, and we tried to integrate Great Expectations. On the Great Expectations side the validations succeed, but in DataHub the Validation tab is not appearing. Can you please help us and provide any suggestions?
    a
    • 2
    • 2
  • b

    best-wire-59738

    12/01/2022, 7:11 AM
    Hello team, I was trying to ingest users and groups using OIDC SSO. Users are getting authenticated and are able to log in to DataHub, but DataHub is unable to fetch the group details (the groups scope is enabled). I am getting a type-mismatch error in the frontend logs. Can you please help me solve this issue? We are using version 0.9.2.
    Copy code
    07:55:12 [application-akka.actor.default-dispatcher-50] ERROR auth.sso.oidc.OidcCallbackLogic - Failed to extract groups: Expected to find a list of strings for attribute with name groups, found class net.minidev.json.JSONArray
  • l

    late-ability-59580

    12/01/2022, 8:54 AM
    Hello everyone! I encounter a problem when trying to delete entities using the CLI. I run
    datahub delete --env PROD --entity_type dataset --platform snowflake
    and I get a super long traceback; somewhere within it says:
    HTTP ERROR 401 Unauthorized to perform this action
    I get the same error when trying to delete a specific entity using its urn. Any ideas? P.S.: I don't get that error when running
    datahub ingest ...
    commands.
    a
    • 2
    • 1
  • p

    powerful-cat-68806

    12/01/2022, 10:33 AM
    Hi team, I'm trying to deploy a slim namespace for DataHub, just with Elasticsearch & Kafka. When deploying the DataHub charts, I'm facing the following error from
    datahub-datahub-upgrade-job-xxxxx:
    Copy code
    Error: secret "mysql-secrets" not found
    I presume this error occurs because I need a DB for the app (PostgreSQL / MySQL). I want to use a DB provided by my cloud vendor (AWS), not one from the app deployment. Obviously, I'm customizing the
    values.yaml
    both for the prerequisites and for DataHub. Pls. advise, 10x 🙏
    a
    l
    +2
    • 5
    • 57
  • f

    faint-translator-23365

    12/01/2022, 11:25 AM
    Hi, DataHub ships a site package "htrace-core4-4.1.0-incubating.jar" which has a lot of vulnerabilities, and htrace-core4 has had no updated versions since September 2016. How are we supposed to resolve this vulnerability if there is no updated version of that jar?
    b
    • 2
    • 1
  • a

    adorable-magazine-49274

    12/01/2022, 2:39 PM
    Hi team, I'm currently deploying DataHub on Kubernetes via quickstart. Currently, the following error occurs in the pod called datahub-elasticsearch-setup-job.
    Copy code
    Problem with request: Get http://elasticsearch-master:9200: dial tcp 10.102.226.215:9200: connect: connection refused. Sleeping 1s
    Could you please help with this situation?
    b
    • 2
    • 6
  • m

    modern-answer-65441

    12/01/2022, 9:33 PM
    Hello team, I made changes to the DataHub frontend and am trying to build the image. However, the build takes forever to run, with the below information:
    Copy code
    [+] Building 205.1s (25/32)
     => [internal] load build definition from Dockerfile                                                                                                                                                           0.0s
     => => transferring dockerfile: 37B                                                                                                                                                                            0.0s
     => [internal] load .dockerignore                                                                                                                                                                              0.0s
     => => transferring context: 35B                                                                                                                                                                               0.0s
     => [internal] load metadata for docker.io/library/node:16.13.0-alpine3.14                                            1.0s
     => [internal] load metadata for docker.io/library/alpine:3.14                                                        1.0s
     => [auth] library/node:pull token for registry-1.docker.io                                                           0.0s
     => [auth] library/alpine:pull token for registry-1.docker.io                                                         0.0s
     => [prod-build  1/16] FROM docker.io/library/node:16.13.0-alpine3.14@sha256:60ef0bed1dc2ec835cfe3c4226d074fdfaba571fd619c280474cc04e93f0ec5b    0.0s
     => [internal] load build context                                                                                                                                                                              0.2s
     => => transferring context: 471.64kB                                                                                                                                                                          0.2s
     => [base 1/3] FROM docker.io/library/alpine:3.14@sha256:4c869a63e1b7c0722fed1e402a6466610327c3b83bdddb94bd94fb71da7f638a                        0.0s
     => CACHED [base 2/3] RUN addgroup -S datahub && adduser -S datahub -G datahub                                                                                                                                 0.0s
     => CACHED [base 3/3] RUN apk --no-cache --update-cache --available upgrade     && apk --no-cache add curl     && apk --no-cache add openjdk11-jre --repository=http://dl-cdn.alpinelinux.org/alpine/edge/com  0.0s
     => CACHED [prod-build  2/16] RUN apk --no-cache --update-cache --available upgrade     && apk --no-cache add perl openjdk8 openjdk11                                                                          0.0s
     => CACHED [prod-build  3/16] COPY ./datahub-frontend ./datahub-src/datahub-frontend                                                                                                                           0.0s
     => CACHED [prod-build  4/16] COPY ./entity-registry ./datahub-src/entity-registry                                                                                                                             0.0s
     => CACHED [prod-build  5/16] COPY ./buildSrc ./datahub-src/buildSrc                                                                                                                                           0.0s
     => [prod-build  6/16] COPY ./datahub-web-react ./datahub-src/datahub-web-react                                                                                                                                0.1s
     => [prod-build  7/16] COPY ./li-utils ./datahub-src/li-utils                                                                                                                                                  0.0s
     => [prod-build  8/16] COPY ./metadata-models ./datahub-src/metadata-models                                                                                                                                    0.0s
     => [prod-build  9/16] COPY ./metadata-models-validator ./datahub-src/metadata-models-validator                                                                                                                0.0s
     => [prod-build 10/16] COPY ./metadata-utils ./datahub-src/metadata-utils                                                                                                                                      0.0s
     => [prod-build 11/16] COPY ./metadata-service ./datahub-src/metadata-service                                                                                                                                  0.0s
     => [prod-build 12/16] COPY ./metadata-io ./datahub-src/metadata-io                                                                                                                                            0.0s
     => [prod-build 13/16] COPY ./datahub-graphql-core ./datahub-src/datahub-graphql-core                                                                                                                          0.1s
     => [prod-build 14/16] COPY ./gradle ./datahub-src/gradle                                                                                                                                                      0.0s
     => [prod-build 15/16] COPY repositories.gradle gradle.properties gradlew settings.gradle build.gradle ./datahub-src/                                                                                          0.0s
     => [prod-build 16/16] RUN cd datahub-src     && ./gradlew :datahub-web-react:build -x test -x yarnTest -x yarnLint     && ./gradlew :datahub-frontend:dist -PuseSystemNode=true -x test -x yarnTest -x yar  203.3s
     => => # at org.gradle.configuration.internal.DefaultUserCodeApplicationContext.apply(DefaultUserCodeApplicationContext.java:43)
     => => # at org.gradle.api.internal.plugins.DefaultPluginManager.doApply(DefaultPluginManager.java:156)
     => => # ... 190 more
     => => # fullVersion=0.0.0-unknown-SNAPSHOT
     => => # cliMajorVersion=0.0.0-unknown-SNAPSHOT
     => => # version=0.0.0-unknown-SNAPSHOT
    Can someone help me here ?
    a
    • 2
    • 14
  • f

    fierce-electrician-85924

    12/02/2022, 7:34 AM
    Hi, is there a way to update the urn of an entity in DataHub?
    d
    • 2
    • 3
  • q

    quick-student-61408

    12/02/2022, 10:08 AM
    Hi, should we run "datahub docker quickstart" after each VM restart? I have the impression that my VM's storage fills up with each quickstart.