# all-things-deployment

    bumpy-activity-74405

    05/02/2023, 9:54 AM
    Question about Docker Hub images: should I use images under linkedin or under acryldata?

    proud-dusk-671

    05/02/2023, 11:25 AM
    Hi team, we are ready to deploy DataHub in production, woohoo! Can you tell us the list of all the important logs in DataHub that we should keep an eye on and persist? I know container logs are one.

    chilly-potato-57465

    05/02/2023, 12:38 PM
    Hello! I have ingested several Kafka topics which had no schemata entered in the schema registry. However, schemata for those exist so I thought I could use one of the DataHub APIs to insert those schemata for the respective topic. I started looking into the GraphQL and OpenAPI but there seems to be no such capability implemented. There is some updateDescription mutation but it requires a path to an already existing field in the schema and in my case the fields are not ingested at all since the schemata were not populated in the Kafka registry. So I am wondering if there is any way to achieve this as I haven't found one yet. Thank you!

    wonderful-jordan-36532

    05/03/2023, 9:58 AM
    How to enable REST_API_AUTHORIZATION in helm charts during deployment? Maybe @gentle-hamburger-31302 can help?
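    A hedged sketch of one way this is commonly done: REST_API_AUTHORIZATION_ENABLED is a documented GMS environment variable, but the exact values keys below (extraEnvs placement and the global metadata_service_authentication block) are assumptions to verify against your datahub-helm chart version.
    ```yaml
    # values.yaml sketch; key placement assumed from datahub-helm conventions
    datahub-gms:
      extraEnvs:
        - name: REST_API_AUTHORIZATION_ENABLED
          value: "true"
    global:
      datahub:
        metadata_service_authentication:
          enabled: true   # assumed chart key; turns on token auth for the metadata service
    ```
    After changing the values, a `helm upgrade` is needed so the GMS pods pick up the new environment.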

    lemon-scooter-69730

    05/03/2023, 10:29 AM
    Does DataHub support SAML authentication with SSO?

    square-football-37770

    05/03/2023, 9:44 PM
    Hi! After playing with DataHub on my dev machine via docker-compose, I'm trying to set it up on GKE so my teammates can also play with it and I can convince them to adopt it as our Data Catalog. However, our Kafka is on Aiven, which does not expose/require a ZooKeeper. It seems this doesn't suit datahub-kafka-setup-job too well, since I can see its yaml has this conf:
    spec:
      containers:
      - env:
        - name: KAFKA_ZOOKEEPER_CONNECT
          value: prerequisites-zookeeper:2181
    which obviously doesn't exist. It seems this is causing the job to time out. Any ideas what can be done about it? Thanks.

    blue-microphone-24514

    05/04/2023, 2:17 PM
    Hi! Setting up DataHub on EKS with Azure-based SSO. Getting an error about the frontend not being authorized with GMS: client id & secret doesn't match / missing 'Bearer' prefix... can anyone shed some light?
    datahub-gms-6698965898-bjv4x datahub-gms 2023-05-04 14:10:38,024 [qtp447981768-278] WARN  c.d.a.a.AuthenticatorChain:80 - Authentication chain failed to resolve a valid authentication. Errors: [(com.datahub.authentication.authenticator.DataHubSystemAuthenticator,Failed to authenticate inbound request: Provided credentials do not match known system client id & client secret. Check your configuration values...), (com.datahub.authentication.authenticator.DataHubTokenAuthenticator,Failed to authenticate inbound request: Authorization header missing 'Bearer' prefix.)]
    datahub-gms-6698965898-bjv4x datahub-gms 2023-05-04 14:10:40,028 [qtp447981768-269] WARN  c.d.a.a.AuthenticatorChain:80 - Authentication chain failed to resolve a valid authentication. Errors: [(com.datahub.authentication.authenticator.DataHubSystemAuthenticator,Failed to authenticate inbound request: Provided credentials do not match known system client id & client secret. Check your configuration values...), (com.datahub.authentication.authenticator.DataHubTokenAuthenticator,Failed to authenticate inbound request: Authorization header missing 'Bearer' prefix.)]
    datahub-frontend-7b758459b7-vs8sj datahub-frontend 2023-05-04 14:10:44,033 [application-akka.actor.default-dispatcher-12] ERROR auth.sso.oidc.OidcCallbackLogic - Failed to perform post authentication steps. Redirecting to error page.
    datahub-frontend-7b758459b7-vs8sj datahub-frontend java.lang.RuntimeException: Failed to provision user with urn urn:li:corpuser:me@company.com.
    datahub-frontend-7b758459b7-vs8sj datahub-frontend 	at auth.sso.oidc.OidcCallbackLogic.tryProvisionUser(OidcCallbackLogic.java:340)
    datahub-frontend-7b758459b7-vs8sj datahub-frontend 	at auth.sso.oidc.OidcCallbackLogic.handleOidcCallback(OidcCallbackLogic.java:129)
    datahub-frontend-7b758459b7-vs8sj datahub-frontend 	at auth.sso.oidc.OidcCallbackLogic.perform(OidcCallbackLogic.java:107)
    datahub-frontend-7b758459b7-vs8sj datahub-frontend 	at controllers.SsoCallbackController$SsoCallbackLogic.perform(SsoCallbackController.java:89)
    datahub-frontend-7b758459b7-vs8sj datahub-frontend 	at controllers.SsoCallbackController$SsoCallbackLogic.perform(SsoCallbackController.java:75)
    datahub-frontend-7b758459b7-vs8sj datahub-frontend 	at org.pac4j.play.CallbackController.lambda$callback$0(CallbackController.java:54)
    datahub-gms-6698965898-bjv4x datahub-gms 2023-05-04 14:10:44,031 [qtp447981768-277] WARN  c.d.a.a.AuthenticatorChain:80 - Authentication chain failed to resolve a valid authentication. Errors: [(com.datahub.authentication.authenticator.DataHubSystemAuthenticator,Failed to authenticate inbound request: Provided credentials do not match known system client id & client secret. Check your configuration values...), (com.datahub.authentication.authenticator.DataHubTokenAuthenticator,Failed to authenticate inbound request: Authorization header missing 'Bearer' prefix.)]
    datahub-frontend-7b758459b7-vs8sj datahub-frontend 	at 
    ...
    auth.sso.oidc.OidcCallbackLogic.tryProvisionUser(OidcCallbackLogic.java:321)
    datahub-frontend-7b758459b7-vs8sj datahub-frontend 	... 14 common frames omitted
    datahub-frontend-7b758459b7-vs8sj datahub-frontend Caused by: com.linkedin.r2.RemoteInvocationException: Received error 401 from server for URI <http://datahub-gms:8080/entities/urn:li:corpuser:me@company.com>
    datahub-frontend-7b758459b7-vs8sj datahub-frontend 	at com.linkedin.restli.internal.client.ExceptionUtil.exceptionForThrowable(ExceptionUtil.java:98)
    datahub-frontend-7b758459b7-vs8sj datahub-frontend 	at com.linkedin.restli.client.RestLiCallbackAdapter.convertError(RestLiCallbackAdapter.java:66)
    datahub-frontend-7b758459b7-vs8sj datahub-frontend 	at com.linkedin.common.callback.CallbackAdapter.onError(CallbackAdapter.java:86)
    datahub-frontend-7b758459b7-vs8sj datahub-frontend 	at com.linkedin.r2.message.timing.TimingCallback.onError(TimingCallback.java:81)
    datahub-frontend-7b758459b7-vs8sj datahub-frontend 	at com.linkedin.r2.transport.common.bridge.client.TransportCallbackAdapter.onResponse(TransportCallbackAdapter.java:47)
    ...
    java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    datahub-frontend-7b758459b7-vs8sj datahub-frontend 	at java.base/java.lang.Thread.run(Thread.java:829)
    datahub-frontend-7b758459b7-vs8sj datahub-frontend Caused by: com.linkedin.r2.message.rest.RestException: Received error 401 from server for URI <http://datahub-gms:8080/entities/urn:li:corpuser:me@company.com>
    datahub-frontend-7b758459b7-vs8sj datahub-frontend 	at com.linkedin.r2.transport.http.common.HttpBridge$1.onResponse(HttpBridge.java:76)
    datahub-frontend-7b758459b7-vs8sj datahub-frontend 	... 4 common frames omitted
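    The 401 from GMS in the log above usually means the frontend and GMS disagree on the system client credentials. A hedged values sketch for keeping both in sync (the key names are assumptions based on the acryldata/datahub-helm chart and should be verified against your chart version):
    ```yaml
    # values.yaml sketch; assumed datahub-helm keys, verify for your chart version
    global:
      datahub:
        metadata_service_authentication:
          enabled: true
          systemClientId: "__datahub_system"
          systemClientSecret:
            secretRef: datahub-auth-secrets    # k8s Secret holding the shared secret
            secretKey: system_client_secret
    ```
    If the Secret was created manually, make sure the same value is referenced by both datahub-frontend and datahub-gms; a mismatch produces exactly the "Provided credentials do not match known system client id & client secret" warning seen above.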

    quick-television-59428

    05/04/2023, 9:21 PM
    Hi all, I'm new to DataHub. To confirm, is it possible to deploy DataHub in an OpenShift cluster? I have deployed DataHub in AWS with a relatively smooth experience so far.

    billions-baker-82097

    05/08/2023, 7:00 AM
    Hi, when we deploy DataHub using helm, we need to install two things: 1. prerequisites, 2. datahub. The prerequisites install within 5 minutes, but datahub does not, so our pipeline is failing in production. Can you suggest a way to deploy DataHub within 5 minutes?

    brief-nail-41206

    05/08/2023, 2:17 PM
    2023/05/08 09:30:02 Waiting for: tcp://prerequisites-mysql.datahub-temp:3306
    2023/05/08 09:30:02 Waiting for: tcp://*****-kafka-bootstrap:9092
    2023/05/08 09:30:02 Waiting for: http://*****@datahub-elasticsearch-*****:9200
    2023/05/08 09:30:02 Waiting for: http:
    2023/05/08 09:30:02 Problem with request: Get http:: http: no Host in request URL. Sleeping 1s
    2023/05/08 09:30:02 Connected to tcp://*****-kafka-bootstrap:9092
    2023/05/08 09:30:02 Received 200 from http://*****@datahub-elasticsearch-*****:9200
    2023/05/08 09:30:02 Connected to tcp://prerequisites-mysql.datahub-temp:3306
    2023/05/08 09:30:03 Problem with request: Get http:: http: no Host in request URL. Sleeping 1s
    Based on previous posts, despite using graph_service_impl: elasticsearch, the GMS is somehow still waiting for the neo4j service to start. Any way to disable this?
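    For reference, a minimal sketch of pinning the graph service to Elasticsearch, assuming the `global.graph_service_impl` key of datahub-helm (the empty `Waiting for: http:` line above suggests a neo4j URL is still being rendered into the wait list, so it is worth checking the rendered manifests for any leftover neo4j host after upgrading):
    ```yaml
    # values.yaml sketch, assuming datahub-helm keys
    global:
      graph_service_impl: elasticsearch   # avoid neo4j so setup containers do not wait on it
    ```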

    quick-megabyte-61846

    05/08/2023, 5:10 PM
    Hey, DataHub deployed on k8s with helm chart version 0.2.161. Recently we updated the application with Azure AD SSO and created a permission model based on group UUIDs from Azure, which are pulled from Azure AD when logging into DataHub. Here a problem arose: not every group is being synced from Azure AD to DataHub (only groups with a specific prefix are being pulled). I've tried to search through the docs to check if there is any variable to specify a regex for groups, but there is nothing, or I didn't catch it.
    https://github.com/datahub-project/datahub/blob/master/datahub-frontend/conf/application.conf#L156
    https://datahubproject.io/docs/authentication/guides/sso/configure-oidc-react/#user--group-provisioning-jit-provisioning
    https://datahubproject.io/docs/authentication/guides/sso/configure-oidc-react-azure/
    Our config:
    datahub-frontend:
      extraEnvs:
        - name: AUTH_OIDC_JIT_PROVISIONING_ENABLED
          value: "true"
        - name: AUTH_OIDC_EXTRACT_GROUPS_ENABLED
          value: "true"
        - name: AUTH_OIDC_GROUPS_CLAIM
          value: "groups"
        - name: AUTH_JAAS_ENABLED
          value: "true"
    
      oidcAuthentication:
        enabled: true
        provider: azure
        clientId: change_me
        azureTenantId: change_me
        clientSecretRef:
          secretRef: "change_me"
          secretKey: "change_me"
    I know that we can accomplish this somehow using https://datahubproject.io/docs/generated/ingestion/sources/azure-ad, but I wanted to ask if there is any chance to pull all groups into DataHub with the Azure AD provider rather than using an additional recipe for this. My idea was to look for a regex for groups and permissions in the OIDC attributes/application to access a wider list of groups. Or maybe there is a limitation that only a few groups are pulled at login and we cannot overcome this?

    creamy-machine-95935

    05/08/2023, 6:55 PM
    Can someone please share the Terraform files to deploy DataHub with Kubernetes and Helm? Please 😄

    lively-rose-99563

    05/09/2023, 9:38 AM
    Hi! I'm trying to deploy DataHub v0.10.2. I set up the Google auth login system, but there is still a default account (datahub:datahub). I think that's because I'm using the datahub-frontend image, and that image contains default account information. In this situation, how can I change the default account information? I deploy DataHub on my EKS cluster. Thanks for helping me.
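    Grounded in the AUTH_JAAS_ENABLED setting shown in another config on this page, one hedged approach is to disable JAAS (username/password) login on the frontend so only the SSO path remains; the env var is a real frontend setting, but its helm placement below is an assumption to verify against your chart:
    ```yaml
    # values.yaml sketch; AUTH_JAAS_ENABLED is a documented frontend setting,
    # its placement under datahub-frontend.extraEnvs is an assumption
    datahub-frontend:
      extraEnvs:
        - name: AUTH_JAAS_ENABLED
          value: "false"
    ```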

    square-football-37770

    05/09/2023, 2:28 PM
    Hi! So create-indices.sh in the elasticsearch-setup job creates a number of indexes starting with underscore _:
    #   1. ILM policy
    create_if_not_exists "_ilm/policy/${PREFIX}datahub_usage_event_policy" policy.json
    #   2. index template
    create_if_not_exists "_index_template/${PREFIX}datahub_usage_event_index_template" index_template.json
    #   3. although indexing request creates the data stream, it's not queryable before creation, causing GMS to throw exceptions
    create_if_not_exists "_data_stream/${PREFIX}datahub_usage_event" "datahub_usage_event"
    Turns out names are forbidden to start with _ on my ES instance (Aiven). If I change the script to use another name… what other components would I need to change? Or would I just be better off starting my own ES on GKE?

    lively-addition-55180

    05/09/2023, 2:28 PM
    Hi I have noticed a few here have deployed on AKS. I have a few questions surrounding the deployment. Can I ask what are you using AGIC/Appgateway or some other ingress controller? Also do you need to expose the GMS port via ingress? I am trying to deploy using AKS using AGIC/appgateway so any help or guides would be appreciated.

    creamy-ram-28134

    05/09/2023, 4:34 PM
    Hi All, my DataHub upgrade job keeps failing and these are the logs:
    [root@adkube06 ~]# kubectl logs -f datahub-datahub-system-update-job-w6qr7 -n gopikab
    ERROR StatusLogger Log4j2 could not find a logging implementation. Please add log4j-core to the classpath. Using SimpleLogger to log to the console...
    
      .   ____          _            __ _ _
     /\\ / ___'_ __ _ _(_)_ __  __ _ \ \ \ \
    ( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \
     \\/  ___)| |_)| | | | | || (_| |  ) ) ) )
      '  |____| .__|_| |_|_| |_\__, | / / / /
     =========|_|==============|___/=/_/_/_/
     :: Spring Boot ::        (v2.1.4.RELEASE)
    
    SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
    SLF4J: Defaulting to no-operation (NOP) logger implementation
    SLF4J: See <http://www.slf4j.org/codes.html#StaticLoggerBinder> for further details.
    SLF4J: Failed to load class "org.slf4j.impl.StaticMDCBinder".
    SLF4J: Defaulting to no-operation MDCAdapter implementation.
    SLF4J: See <http://www.slf4j.org/codes.html#no_static_mdc_binder> for further details.
    May 09, 2023 4:30:03 PM org.neo4j.driver.internal.logging.JULogger info
    INFO: Direct driver instance 1495445111 created for server address localhost:7687
    ERROR SpringApplication Application run failed
     java.lang.IllegalStateException: Failed to execute CommandLineRunner
    	at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:816)
    	at org.springframework.boot.SpringApplication.callRunners(SpringApplication.java:797)
    	at org.springframework.boot.SpringApplication.run(SpringApplication.java:324)
    	at org.springframework.boot.builder.SpringApplicationBuilder.run(SpringApplicationBuilder.java:139)
    	at com.linkedin.datahub.upgrade.UpgradeCliApplication.main(UpgradeCliApplication.java:13)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:48)
    	at org.springframework.boot.loader.Launcher.launch(Launcher.java:87)
    	at org.springframework.boot.loader.Launcher.launch(Launcher.java:50)
    	at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:51)
    Caused by: java.lang.IllegalArgumentException: No upgrade with id SystemUpdate could be found. Aborting...
    	at com.linkedin.datahub.upgrade.impl.DefaultUpgradeManager.execute(DefaultUpgradeManager.java:32)
    	at com.linkedin.datahub.upgrade.UpgradeCli.run(UpgradeCli.java:44)
    	at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:813)
    	... 12 more
    May 09, 2023 4:30:10 PM org.neo4j.driver.internal.logging.JULogger info
    INFO: Closing driver instance 1495445111
    Does anyone know how to fix this?

    rich-restaurant-61261

    05/10/2023, 9:34 AM
    Hi Team, I am trying to deploy DataHub on Kubernetes using helm install, but I receive an ImagePullBackOff error on elasticsearch-setup-job, prometheus-jmx-exporter, and cp-schema-registry-server. I do see the images exist on Docker Hub; can anyone help me take a look? Thanks.
    kubectl --kubeconfig ~/.kube/di_config describe pod datahub-elasticsearch-setup-job-ntmk5
    Name:     datahub-elasticsearch-setup-job-ntmk5
    Namespace:  feature-aoc-27123-crawler
    Priority:   0
    Node:     didevwkrvm9/xx.2.xx.37
    Start Time:  Wed, 10 May 2023 15:59:13 +0800
    Labels:    controller-uid=94aec4cc-060c-46c7-bf92-xxxxx
           job-name=datahub-elasticsearch-setup-job
    Annotations: cni.projectcalico.org/containerID: 48d897d76f8487556c564f51e97236fe423d46f5a5651ff7d4873032bd39370a
           cni.projectcalico.org/podIP: 10.42.11.63/32
           cni.projectcalico.org/podIPs: 10.42.11.63/32
    Status:    Pending
    IP:      10.42.11.xx
    IPs:
     IP:      10.42.11.xx
    Controlled By: Job/datahub-elasticsearch-setup-job
    Containers:
     elasticsearch-setup-job:
      Container ID:  
      Image:     linkedin/datahub-elasticsearch-setup:v0.10.2
      Image ID:    
      Port:      <none>
      Host Port:   <none>
      State:     Waiting
       Reason:    ImagePullBackOff
      Ready:     False
      Restart Count: 0
      Limits:
       cpu:   500m
       memory: 512Mi
      Requests:
       cpu:   300m
       memory: 256Mi
      Environment:
       ELASTICSEARCH_HOST:     elasticsearch-master
       ELASTICSEARCH_PORT:     9200
       SKIP_ELASTICSEARCH_CHECK:  false
       ELASTICSEARCH_INSECURE:   false
       ELASTICSEARCH_USE_SSL:   false
       DATAHUB_ANALYTICS_ENABLED: true
      Mounts:
       /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-xm27c (ro)
    Conditions:
     Type       Status
     Initialized    True 
     Ready       False 
     ContainersReady  False 
     PodScheduled   True 
    Volumes:
     kube-api-access-xm27c:
      Type:          Projected (a volume that contains injected data from multiple sources)
      TokenExpirationSeconds: 3607
      ConfigMapName:      kube-root-ca.crt
      ConfigMapOptional:    <nil>
      DownwardAPI:       true
    QoS Class:          Burstable
    Node-Selectors:       <none>
    Tolerations:         node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
                   node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
    Events:
     Type  Reason  Age          From   Message
     ----  ------  ----          ----   -------
     Normal BackOff 4m29s (x264 over 64m) kubelet Back-off pulling image "linkedin/datahub-elasticsearch-setup:v0.10.2"
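    This ties back to the linkedin-vs-acryldata question at the top of this page: newer image tags are published under the acryldata Docker Hub organization, so pulling linkedin/datahub-elasticsearch-setup:v0.10.2 can fail. A hedged override sketch (the values key name below is assumed from datahub-helm conventions; check your chart's values.yaml):
    ```yaml
    # values.yaml sketch; key name assumed from datahub-helm
    elasticsearchSetupJob:
      image:
        repository: acryldata/datahub-elasticsearch-setup
        tag: v0.10.2
    ```
    The same repository swap would apply to any other setup job still pointing at the linkedin organization.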

    fresh-toothbrush-9306

    05/10/2023, 3:32 PM
    Hi Team, we are deploying DataHub on AWS EKS. We use RDS MySQL and created the auth and encryption secrets manually. But whenever we uninstall the DataHub helm release completely and remove the PVCs, it loses all the metadata. All the existing users are able to log in, but none of them are shown in the UI. PATs are also lost. What am I missing here?
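    A hedged sketch of pinning the encryption key to a pre-created k8s Secret so it survives a helm uninstall (key names assumed from the datahub-helm chart; verify against your version). If a fresh install generates a new key, values encrypted with the old one, such as tokens, may become unreadable, which could explain the lost PATs:
    ```yaml
    # values.yaml sketch; assumed datahub-helm keys referencing an existing Secret
    global:
      datahub:
        encryptionKey:
          secretRef: datahub-encryption-secrets   # pre-created Secret, not managed by helm
          secretKey: encryption_key_secret
    ```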

    miniature-room-15319

    05/11/2023, 8:35 AM
    Hi, is there any method to execute DataHub Actions besides running the recipe from the CLI? I want to configure a DataHub Action with a filter that runs automatically when the service is on.

    adamant-rain-51672

    05/11/2023, 9:21 AM
    Hi, in this guide it's written:
    The command will provision an EKS cluster powered by 3 EC2 m3.large nodes and provision a VPC based networking layer.
    https://datahubproject.io/docs/deploy/aws Is it possible to run datahub on smaller instances?
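    The guide's one-line eksctl command can be replaced with a config file to pick smaller nodes; a sketch, where the instance type, count, and region are illustrative assumptions rather than a sizing recommendation:
    ```yaml
    # cluster.yaml sketch for `eksctl create cluster -f cluster.yaml`
    apiVersion: eksctl.io/v1alpha5
    kind: ClusterConfig
    metadata:
      name: datahub
      region: us-east-1
    nodeGroups:
      - name: datahub-nodes
        instanceType: t3.medium   # smaller than the guide's m3.large; assumption
        desiredCapacity: 3
    ```
    Whether DataHub runs comfortably on smaller nodes depends on the Elasticsearch and Kafka resource requests in the prerequisites chart.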

    careful-lunch-53644

    05/12/2023, 3:34 AM
    Hi team, does the GMS service still not support HTTP Basic Authentication for the schema registry? gms-error.log:
    031032 [pool-9-thread-1] ERROR c.l.metadata.boot.BootstrapManager - Caught exception while executing bootstrap step IngestRolesStep. Continuing...
    org.apache.kafka.common.errors.SerializationException: Error registering Avro schema: {"type":"record","name":"MetadataChangeLog","namespace":"com.linkedin.pegasus2avro.mxe","doc":"
    Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Unauthorized; error code: 401
    	at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:292)
    	at io.confluent.kafka.schemaregistry.client.rest.RestService.httpRequest(RestService.java:351)
    	at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:494)
    	at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:485)
    	at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:458)
    	at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.registerAndGetId(CachedSchemaRegistryClient.java:206)
    	at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient.java:268)
    	at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient.java:244)
    	at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:74)
    	at io.confluent.kafka.serializers.KafkaAvroSerializer.serialize(KafkaAvroSerializer.java:59)
    	at org.apache.kafka.common.serialization.Serializer.serialize(Serializer.java:62)
    	at org.apache.kafka.clients.producer.KafkaProducer.doSend(KafkaProducer.java:902)
    	at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:862)
    	at com.linkedin.metadata.dao.producer.KafkaEventProducer.produceMetadataChangeLog(KafkaEventProducer.java:114)
    	at com.linkedin.metadata.entity.EntityService.produceMetadataChangeLog(EntityService.java:1284)
    	at com.linkedin.metadata.entity.EntityService.produceMetadataChangeLog(EntityService.java:1309)
    	at com.linkedin.metadata.boot.steps.IngestRolesStep.ingestRole(IngestRolesStep.java:111)
    	at com.linkedin.metadata.boot.steps.IngestRolesStep.execute(IngestRolesStep.java:79)
    	at com.linkedin.metadata.boot.BootstrapManager.lambda$start$0(BootstrapManager.java:44)
    	at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1736)
    	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    	at java.base/java.lang.Thread.run(Thread.java:829)

    steep-soccer-91284

    05/12/2023, 6:21 AM
    Hi everyone, I want to integrate Airflow and DataHub, so I tried to install the DataHub plugin in my Airflow EKS cluster. However, the plugin fails to install with an error like the one below. How can I solve this problem? Best regards, Daeyoung Kim

    great-monkey-52307

    05/12/2023, 4:09 PM
    Hi All, I have installed DataHub using the helm repository. I'm able to change the DataHub logo in the app by adding the env variable REACT_APP_FAVICON_URL in the gms chart. My requirement is to change the term 'DataHub' showing up in a couple of places in the settings tab within the app. Is it possible by updating the Helm repo? How can I do that? Can anyone please suggest? Screenshot attached showing the DataHub term I'm looking to replace. https://github.com/acryldata/datahub-helm

    future-controller-3884

    05/13/2023, 11:55 AM
    Hi folks ✋ I just started with DataHub. I deployed it on K8s via helm. I have a question: what is the minimum metadata that needs to be backed up to restore my DataHub if it runs into a problem? (mysql, elasticsearch, neo4j…)

    shy-dog-84302

    05/13/2023, 7:43 PM
    Hi! When can we expect the next release of the DataHub helm charts? The current release 0.2.164 is 3 weeks old. I'm interested in DataHub GMS version 0.10.2.2.

    careful-lunch-53644

    05/15/2023, 7:55 AM
    Hi team, how do we build the project and package the tar (docker env)? Is there any relevant information?

    steep-vr-39297

    05/15/2023, 8:23 AM
    Hi team, I have a question. The value of enablePrometheus was set to true in the helm values for the cluster, and DataHub was deployed. Do I only need to install grafana separately on the cluster (helm repo add grafana https://grafana.github.io/helm-charts)? The DataHub documentation contains only the docker-compose description. Is there any information related to Helm?

    proud-dusk-671

    05/15/2023, 11:45 AM
    Hi team, can you suggest the various alerts that should be configured after deploying DataHub on K8s? I can imagine infra-health alerts being one, but I would love to know more. I would also like to know how to configure these alerts.

    rough-summer-14442

    05/15/2023, 11:59 AM
    Hi team, I am trying to deploy DataHub via Docker on my local system. It was up and running a few weeks ago, but now when I try to start it again I run into an issue where it says datahub-gms is running but not healthy. I have tried docker nuke and reinstalled the images, but to no avail. I've attached an image of the error and the log file for the datahub-gms container. Any help would be greatly appreciated. Thanks.
    datahub-gms.txt

    bland-orange-13353

    05/15/2023, 4:28 PM
    This message was deleted.