# troubleshoot
  • acceptable-musician-1893 (04/07/2022, 9:22 PM)
    Hi all, could someone explain how DataHub manages Elasticsearch schema updates/migrations?
  • plain-napkin-77279 (04/08/2022, 2:40 AM)
    Hi team, I tried to install DataHub with "docker-compose up"... all was good, but once I try to log in to DataHub I get this error: "Failed to log in! SyntaxError: Unexpected token < in JSON at position 0", and when I check the logs of datahub-gms I see: 'Problem with request: Get "http://neo4j:7474": dial tcp: lookup neo4j on 127.0.0.1:53: read udp 127.0.0.1:56622->127.0.0.1:53: i/o timeout. Sleeping 1s'. PS: I am using this way to add a user to DataHub, and I pulled the latest version of the project.
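
    The "Unexpected token < in JSON" on login usually means the frontend received an HTML error page instead of JSON, which fits GMS being stuck on that neo4j lookup. A quick probe of GMS itself, assuming quickstart defaults (GMS on localhost:8080 and the /health endpoint the quickstart images expose), can confirm the login failure is downstream of GMS rather than a credentials problem:

    ```python
    import requests

    # Probe GMS directly; a non-200 response or a connection error means the
    # frontend's login failure is caused by GMS being unhealthy, not by the
    # user.props credentials. Assumes the quickstart default port mapping.
    try:
        resp = requests.get("http://localhost:8080/health", timeout=5)
        print(resp.status_code, resp.text)
    except requests.ConnectionError as e:
        print("GMS unreachable:", e)
    ```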
  • bumpy-activity-74405 (04/08/2022, 7:45 AM)
    After upgrading `0.8.23` -> `0.8.32.1` I've lost most of my Looker charts (±58k down to ±1k). I don't have a deep understanding of what happens when GMS is starting, but I suspect this happened because the container was killed in the middle of bootstrapping, due to a misconfiguration of `initialDelaySeconds` on my part. I see around ±58k chart records in MySQL (`aspect = chartKey`) and in the `chartindex_v2` ES index. I've tried re-ingesting, but only 100-200 more charts show up after ingesting the ±58k elements. I suspect that the couple hundred more that appear in the UI are the ones that had any changes. How does one go about fixing something like this?
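
    One way to see how far apart the two stores are is to ask the Elasticsearch-backed search for its chart total and compare it against the ±58k `chartKey` rows in MySQL. A minimal sketch against the GraphQL API, assuming the frontend at localhost:9002 and a personal access token (both placeholders):

    ```python
    import requests

    # Ask search (served from Elasticsearch) how many charts it can see, to
    # compare with the ~58k chartKey rows visible in MySQL.
    query = """
    {
      search(input: {type: CHART, query: "*", start: 0, count: 1}) {
        total
      }
    }
    """
    resp = requests.post(
        "http://localhost:9002/api/graphql",
        json={"query": query},
        headers={"Authorization": "Bearer <token>"},  # placeholder token
    )
    print(resp.json())
    ```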
  • breezy-portugal-43538 (04/08/2022, 8:56 AM)
    Hi, I discovered a really strange issue lately: after ingesting some demo data into DataHub, the server stops responding after some period of time, returning error 500. I created an issue on GitHub for it: https://github.com/datahub-project/datahub/issues/4619 Is this a known problem?
  • cuddly-lunch-28022 (04/08/2022, 8:56 AM)
    Hello! I'm getting the error "DataMap should have no more than one entry for a union". Can you tell me how to fix it, please?
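
    For context on that validator message: Rest.li encodes a union as a JSON map with exactly one entry, keyed by the fully-qualified member type, so an aspect object that carries two keys fails in exactly this way. A minimal illustration with shortened aspect payloads:

    ```python
    # Wrong: two union members crammed into a single aspects[] entry. The
    # validator reads this map as one union value and rejects the second key.
    bad_aspect = {
        "com.linkedin.common.Status": {"removed": False},
        "com.linkedin.common.GlobalTags": {"tags": []},
    }

    # Right: one single-key object per aspects[] entry.
    good_aspects = [
        {"com.linkedin.common.Status": {"removed": False}},
        {"com.linkedin.common.GlobalTags": {"tags": []}},
    ]
    ```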
  • prehistoric-dawn-23569 (04/08/2022, 9:00 AM)
    Does anyone know how to disable certificate verification from the frontend to an SSL/TLS enabled GMS server? I have this error message:
    ```
    Caused by: javax.net.ssl.SSLPeerUnverifiedException: Certificate for <datahub-gms-main-tls-service.datahub.svc.cluster.local> doesn't match any of the subject alternative names: [staging.svc.eqiad.wmnet, staging.svc.codfw.wmnet]
    ```
  • cuddly-lunch-28022 (04/08/2022, 9:38 AM)
    Hello, one more error: Parameters of method 'ingest' failed validation with error 'ERROR :: /entity/value/com.linkedin.metadata.snapshot.DatasetSnapshot/aspects/0 :: "DatasetUpstreamLineage" is not a member type of union'
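
    The lineage member of the `DatasetSnapshot` union is `com.linkedin.dataset.UpstreamLineage`; "DatasetUpstreamLineage" is a different model, which is what the validator is rejecting. If the Python SDK is an option, its lineage helper builds the correct aspect; a sketch with made-up dataset names:

    ```python
    from datahub.emitter.mce_builder import make_dataset_urn, make_lineage_mce
    from datahub.emitter.rest_emitter import DatahubRestEmitter

    # make_lineage_mce produces an MCE whose aspect is
    # com.linkedin.dataset.UpstreamLineage, the actual union member.
    lineage_mce = make_lineage_mce(
        upstream_urns=[make_dataset_urn("hive", "db.upstream_table")],
        downstream_urn=make_dataset_urn("hive", "db.downstream_table"),
    )

    DatahubRestEmitter("http://localhost:8080").emit_mce(lineage_mce)
    ```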
  • busy-shampoo-12116 (04/08/2022, 6:29 PM)
    Hi, I'm following this (https://datahubproject.io/docs/datahub-web-react/#adding-an-entity) to create a new entity. I've completed step 1 and am trying to do step 2. How do I update `types.generated.ts` and get the repo to create `my-entity.generated.ts` in the graphql folder? Thank you in advance!
  • adamant-magazine-62649 (04/08/2022, 11:57 PM)
    Hi all, getting the following error when trying to run the datahub docker quickstart command. Anyone got any ideas why the pulls from linkedin/datahub-gms and datahub-frontend-react aren't working?
    ```
    PS C:\Windows\system32> datahub docker quickstart
    No Datahub Neo4j volume found, starting with elasticsearch as graph service.
    To use neo4j as a graph backend, run `datahub docker quickstart --quickstart-compose-file ./docker/quickstart/docker-compose.quickstart.yml` from the root of the datahub repo
    Fetching docker-compose file https://raw.githubusercontent.com/datahub-project/datahub/master/docker/quickstart/docker-compose-without-neo4j.quickstart.yml from GitHub
    WARNING: The HOME variable is not set. Defaulting to a blank string.
    Pulling elasticsearch          ... done
    Pulling elasticsearch-setup    ... done
    Pulling mysql                  ... done
    Pulling datahub-gms            ... pulling from linkedin/datahub-gms
    Pulling datahub-frontend-react ... pulling from linkedin/datahub-fro...
    Pulling datahub-actions        ... done
    Pulling mysql-setup            ... done
    Pulling zookeeper              ... done
    Pulling broker                 ... done
    Pulling schema-registry        ... done
    Pulling kafka-setup            ... done
    ERROR: for datahub-frontend-react  no matching manifest for linux/amd64 in the manifest list entries
    ERROR: for datahub-gms  no matching manifest for linux/amd64 in the manifest list entries
    ERROR: no matching manifest for linux/amd64 in the manifest list entries
    ```
  • plain-napkin-77279 (04/09/2022, 12:18 AM)
    Hi, I am trying to add new users to DataHub using the `user.props` file and the docker-compose command. It worked the first time for the first user (I now have 'datahub' and one other user), but when I tried to add more users I got the message "Failed to log in! Invalid Credentials", as if I hadn't added another user.
  • many-guitar-67205 (04/11/2022, 8:26 AM)
    Hello, I'm trying to delete some metadata, but the `datahub delete` output is ambiguous, and the data is not gone:
    ```
    ❯ datahub delete --entity_type dataset --platform kafka --hard
    This will permanently delete data from DataHub. Do you want to continue? [y/N]: y
    [2022-04-11 10:17:22,059] INFO     {datahub.cli.delete_cli:200} - datahub configured with http://localhost:8080
    [2022-04-11 10:17:22,182] INFO     {datahub.cli.delete_cli:212} - Filter matched 22 entities. Sample: ['urn:li:dataset:(urn:li:dataPlatform:kafka, 
    ... (22 urns) 
    ]
    This will delete 22 entities. Are you sure? [y/N]: y
    100% (22 of 22) |##########| Elapsed Time: 0:00:01 Time: 0:00:01
    Took 6.673 seconds to hard delete 0 rows for 22 entities
    ```
    The GMS debug log shows 22 successful delete actions, but the output of the command says `0 rows`, and the data is not deleted. What can I do to (a) troubleshoot this further, and (b) actually delete the data?
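
    While the hard-delete path is being debugged, a soft delete at least hides entities from the UI and search: it writes a `status` aspect with `removed: true`. A sketch using the Python emitter, with an illustrative URN:

    ```python
    from datahub.emitter.mcp import MetadataChangeProposalWrapper
    from datahub.emitter.rest_emitter import DatahubRestEmitter
    from datahub.metadata.schema_classes import ChangeTypeClass, StatusClass

    # Soft delete: the rows stay in MySQL, but the entity disappears from the
    # UI and search results. The URN below is illustrative.
    mcp = MetadataChangeProposalWrapper(
        entityType="dataset",
        changeType=ChangeTypeClass.UPSERT,
        entityUrn="urn:li:dataset:(urn:li:dataPlatform:kafka,my_topic,PROD)",
        aspectName="status",
        aspect=StatusClass(removed=True),
    )
    DatahubRestEmitter("http://localhost:8080").emit_mcp(mcp)
    ```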
  • creamy-van-28626 (04/11/2022, 10:50 AM)
    Hello team, I have deployed DataHub on Kubernetes but I am facing an issue while connecting it to Snowflake. I am executing the recipe.yml file but it fails every time. When I checked the logs in the acryl-datahub-actions pod, several errors come up: 1. could not find a version that satisfies wheel; 2. could not find a version that satisfies acryl-datahub[datahub-rest,snowflake04]==0.8.31; 3. failed to execute 'datahub ingest'. Please refer to the image below:
  • creamy-van-28626 (04/11/2022, 10:51 AM)
    image_50345729.JPG
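
    The failing requirement `acryl-datahub[datahub-rest,snowflake04]==0.8.31` looks like a mangled extras list: the plugin extra is `snowflake`, so the pin should resolve as `acryl-datahub[datahub-rest,snowflake]==0.8.31`. One way to validate the recipe before handing it to datahub-actions is to run it in-process with the SDK; a sketch with placeholder credentials, config keys per the 0.8.x snowflake source docs (double-check them against your version):

    ```python
    from datahub.ingestion.run.pipeline import Pipeline

    # Run the recipe locally (pip install 'acryl-datahub[datahub-rest,snowflake]==0.8.31')
    # so pip and config errors surface immediately. All values are placeholders.
    pipeline = Pipeline.create(
        {
            "source": {
                "type": "snowflake",
                "config": {
                    "host_port": "my_account",  # Snowflake account identifier
                    "username": "my_user",
                    "password": "my_password",
                    "warehouse": "my_warehouse",
                },
            },
            "sink": {
                "type": "datahub-rest",
                "config": {"server": "http://localhost:8080"},
            },
        }
    )
    pipeline.run()
    pipeline.raise_from_status()
    ```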
  • able-rain-74449 (04/11/2022, 2:20 PM)
    Has anyone come across this issue while connecting / syncing a MySQL source? (error in thread)
  • salmon-manchester-60485 (04/11/2022, 2:28 PM)
    Hello, we integrated DataHub in our Spark job (scheduled with Airflow), which reads data from our S3 bucket and writes data to a SQL database. At the end of the Spark job, the job blocks after receiving the MetadataWriteResponse. The Spark job correctly loaded the data into the SQL table and the metadata is OK in DataHub, but the job does not end and fails after a timeout.
    ```
    [2022-04-05, 16:33:21] {spark_submit.py:488} INFO - 22/04/05 14:33:21 INFO McpEmitter: MetadataWriteResponse(success=true, responseContent={"value":"urn:li:dataJob:(urn:li:dataFlow:(spark,ANACOUNTERPARTY,local[*]),QueryExecId_6)"}, underlyingResponse=HTTP/1.1 200 OK [Content-Length: 91, Content-Type: application/json, Date: Tue, 05 Apr 2022 14:33:21 GMT, Server: nginx/1.21.6, X-Restli-Protocol-Version: 2.0.0] [Content-Length: 91, Chunked: false])
    [2022-04-05, 16:38:05] {timeout.py:36} ERROR - Process timed out, PID: 662
    [2022-04-05, 16:38:05] {spark_submit.py:623} INFO - Sending kill signal to spark-submit
    ```
    Any ideas how to solve this issue? We opened a ticket here: https://github.com/datahub-project/datahub/issues/4583
  • early-midnight-66457 (04/12/2022, 7:42 AM)
    Hello team, this is regarding the Java client publishing MCP topics to Kafka. We are migrating from MCE topics to MCP topics; we are on DataHub version 0.8.26. I am getting a deserialization error on the DataHub side when publishing the MCP topic from Java. After debugging the issue: the deserialization error occurs at the GenericAspect.value property; it throws a deserialization error and returns a ByteString hash value. The ByteString type is used on the DataHub side when ingesting the MCP in the entity client (REST service). I tried to send a string, but it is a complex string, so I wanted to send it in byte form. Methods I have tried: aspectValue.getBytes(StandardCharsets.UTF_8); Base64.getDecoder().decode(aspectValue). Any suggestion would be really helpful.
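
    For reference, the REST ingest path expects `GenericAspect.value` to hold the raw UTF-8 bytes of the aspect JSON, with `contentType` set to `application/json`; running a plain JSON string through a Base64 decode (one of the attempts above) corrupts it. The Python SDK's equivalent construction makes the expected encoding explicit; a sketch:

    ```python
    import json

    from datahub.metadata.schema_classes import GenericAspectClass

    # value = UTF-8 bytes of the aspect JSON; contentType tells the server how
    # to deserialize them. No Base64 round-trip is involved.
    aspect_body = {"removed": False}  # illustrative aspect payload (status)

    generic_aspect = GenericAspectClass(
        value=json.dumps(aspect_body).encode("utf-8"),
        contentType="application/json",
    )
    ```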
  • hundreds-ability-78888 (04/12/2022, 4:43 PM)
    I'm not entirely sure how, but I managed to create an assertion through the emitter that didn't have a valid URN. When I hit the API to query it, I get failures because of it. I can't delete it through the CLI either, because when I try to query for it I get no response. Is there a way to override the API's valid-URN assertion so I can figure out what the entity I created was? Or should I just go hit the database directly?
  • quick-pizza-8906 (04/13/2022, 7:31 AM)
    Hello, I am upgrading a DataHub 0.8.27 deployment to 0.8.32 and the datahub-upgrade job fails with this exception:
    ```
    The bean 'kafkaProducerFactory', defined in class path resource [org/springframework/boot/autoconfigure/kafka/KafkaAutoConfiguration.class], could not be registered. A bean with that name has already been defined in URL [jar:file:/datahub/datahub-upgrade/bin/datahub-upgrade.jar!/BOOT-INF/lib/factories.jar!/com/linkedin/gms/factory/kafka/KafkaProducerFactory.class] and overriding is disabled.
    ```
    Any ideas where this might be coming from?
  • colossal-easter-99672 (04/13/2022, 10:06 AM)
    Hello, team. After some ingestion/deletion procedures I can't find dbt datasets via search or browse paths, only by direct URL with the URN. For other platforms everything works fine. An Elasticsearch re-index didn't help. What can I try to fix this?
  • busy-waiter-6669 (04/13/2022, 10:08 AM)
    Hi all, I am trying to ingest MLFeatureTables with the API, but when I do I always get an internal error 500. Do you have an idea what the problem is? This is the JSON that I try to ingest. The ingestion shows no errors; just the UI gives me an error.
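
    When ingestion reports success but the UI throws a 500, pulling the stored entity straight out of GMS and inspecting its aspects can narrow down which one the UI chokes on. A sketch, assuming the quickstart GMS address and an illustrative feature-table URN:

    ```python
    import urllib.parse

    import requests

    # Fetch the raw entity from GMS to see exactly which aspects were stored.
    urn = "urn:li:mlFeatureTable:(urn:li:dataPlatform:feast,my_feature_table)"
    resp = requests.get(
        "http://localhost:8080/entities/" + urllib.parse.quote(urn, safe=""),
    )
    print(resp.json())
    ```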
  • strong-kite-83354 (04/13/2022, 10:41 AM)
    Hello - I'm venturing into GraphQL. I want to replicate a query which is straightforward in the UI (querying a customProperty called "hash" for a specific value): customProperties:hash="a3ec70319fe0bc5fee9ad666576a18cc-2" but I'm struggling! I have got as far as the attached query, but I can't see how to search specifying a customProperty as a target. Is there a way of running the UI query programmatically? Or any hints for doing this in GraphQL?

    https://datahubspace.slack.com/files/U02NY1Q2PFG/F03B8HD4ETF/image.png

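
    Since the UI accepts that query-string syntax, the same string can be passed verbatim as the `query` of a GraphQL `search` call; a sketch, assuming the frontend's `/api/graphql` endpoint and a placeholder token:

    ```python
    import requests

    # Reuse the UI's "customProperties:hash=..." syntax as the search query.
    query = """
    query($q: String!) {
      search(input: {type: DATASET, query: $q, start: 0, count: 10}) {
        total
        searchResults { entity { urn } }
      }
    }
    """
    variables = {"q": 'customProperties:hash="a3ec70319fe0bc5fee9ad666576a18cc-2"'}
    resp = requests.post(
        "http://localhost:9002/api/graphql",
        json={"query": query, "variables": variables},
        headers={"Authorization": "Bearer <token>"},  # placeholder token
    )
    print(resp.json())
    ```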
  • breezy-portugal-43538 (04/13/2022, 1:57 PM)
    Hello, sorry for asking a probably very basic question, but I'm a little stuck with this. I want to add an endpoint_url for AWS in the DataHub code so I can point it at my own value. I saw that the beta branch has started to support the S3 data lake source, so changing this endpoint_url is essential for me, as I am using my own S3 buckets. I tried to build DataHub using the ./gradlew build command, but for some reason I ran into the following problem:
    ```
    symbol:   class Generated
      location: package javax.annotation.processing
    /sharedvolume/datahub/datahub-graphql-core/src/mainGeneratedGraphQL/java/com/linkedin/datahub/graphql/generated/Filter.java:7: error: cannot find symbol
    @javax.annotation.processing.Generated(
                                ^
      symbol:   class Generated
      location: package javax.annotation.processing
    100 errors
    
    > Task :datahub-graphql-core:compileJava FAILED
    
    FAILURE: Build failed with an exception.
    
    * What went wrong:
    Execution failed for task ':datahub-graphql-core:compileJava'.
    > Compilation failed; see the compiler error output for details.
    
    * Try:
    Run with --info or --debug option to get more log output. Run with --scan to get full insights.
    
    * Exception is:
    org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':datahub-graphql-core:compileJava'.
            at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.lambda$executeIfValid$3(ExecuteActionsTaskExecuter.java:186)
            at org.gradle.internal.Try$Failure.ifSuccessfulOrElse(Try.java:268)
            at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeIfValid(ExecuteActionsTaskExecuter.java:184)
            at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:173)
            at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:109)
    
        at org.gradle.api.internal.tasks.execution.FinalizePropertiesTaskExecuter.execute(FinalizePropertiesTaskExecuter.java:46)
            at org.gradle.api.internal.tasks.execution.ResolveTaskExecutionModeExecuter.execute(ResolveTaskExecutionModeExecuter.java:62)
            at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:57)
            at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:56)
            at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:36)
            at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.executeTask(EventFiringTaskExecuter.java:77)
            at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:55)
            at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:52)
            at org.gradle.internal.operations.DefaultBuildOperationRunner$CallableBuildOperationWorker.execute(DefaultBuildOperationRunner.java:200)
            at org.gradle.internal.operations.DefaultBuildOperationRunner$CallableBuildOperationWorker.execute(DefaultBuildOperationRunner.java:195)
            at org.gradle.internal.operations.DefaultBuildOperationRunner$3.execute(DefaultBuildOperationRunner.java:75)
            at org.gradle.internal.operations.DefaultBuildOperationRunner$3.execute(DefaultBuildOperationRunner.java:68)
            at org.gradle.internal.operations.DefaultBuildOperationRunner.execute(DefaultBuildOperationRunner.java:153)
            at org.gradle.internal.operations.DefaultBuildOperationRunner.execute(DefaultBuildOperationRunner.java:68)
            at org.gradle.internal.operations.DefaultBuildOperationRunner.call(DefaultBuildOperationRunner.java:62)
            at org.gradle.internal.operations.DefaultBuildOperationExecutor.lambda$call$2(DefaultBuildOperationExecutor.java:76)
            at org.gradle.internal.operations.UnmanagedBuildOperationWrapper.callWithUnmanagedSupport(UnmanagedBuildOperationWrapper.java:54)
            at org.gradle.internal.operations.DefaultBuildOperationExecutor.call(DefaultBuildOperationExecutor.java:76)
            at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter.execute(EventFiringTaskExecuter.java:52)
            at org.gradle.execution.plan.LocalTaskNodeExecutor.execute(LocalTaskNodeExecutor.java:41)
            at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:411)
            at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:398)
            at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:391)
            at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:377)
            at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.lambda$run$0(DefaultPlanExecutor.java:127)
            at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.execute(DefaultPlanExecutor.java:191)
            at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.executeNextNode(DefaultPlanExecutor.java:182)
            at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.run(DefaultPlanExecutor.java:124)
            at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
            at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
            at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    Caused by: org.gradle.api.internal.tasks.compile.CompilationFailedException: Compilation failed; see the compiler error output for details.
            at org.gradle.api.internal.tasks.compile.JdkJavaCompiler.execute(JdkJavaCompiler.java:57)
            at org.gradle.api.internal.tasks.compile.JdkJavaCompiler.execute(JdkJavaCompiler.java:40)
            at org.gradle.api.internal.tasks.compile.daemon.AbstractDaemonCompiler$CompilerWorkAction.execute(AbstractDaemonCompiler.java:135)
            at org.gradle.workers.internal.DefaultWorkerServer.execute(DefaultWorkerServer.java:63)
            at org.gradle.workers.internal.AbstractClassLoaderWorker$1.create(AbstractClassLoaderWorker.java:49)
            at org.gradle.workers.internal.AbstractClassLoaderWorker$1.create(AbstractClassLoaderWorker.java:43)
            at org.gradle.internal.classloader.ClassLoaderUtils.executeInClassloader(ClassLoaderUtils.java:97)
            at org.gradle.workers.internal.AbstractClassLoaderWorker.executeInClassLoader(AbstractClassLoaderWorker.java:43)
            at org.gradle.workers.internal.FlatClassLoaderWorker.run(FlatClassLoaderWorker.java:32)
            at org.gradle.workers.internal.FlatClassLoaderWorker.run(FlatClassLoaderWorker.java:22)
            at org.gradle.workers.internal.WorkerDaemonServer.run(WorkerDaemonServer.java:85)
            at org.gradle.workers.internal.WorkerDaemonServer.run(WorkerDaemonServer.java:55)
            at org.gradle.process.internal.worker.request.WorkerAction$1.call(WorkerAction.java:138)
            at org.gradle.process.internal.worker.child.WorkerLogEventListener.withWorkerLoggingProtocol(WorkerLogEventListener.java:41)
            at org.gradle.process.internal.worker.request.WorkerAction.run(WorkerAction.java:135)
            at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
            at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
            at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
            at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
            at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
            at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
            at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
            at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    ```
    I'm not sure where this comes from or how to handle it; I think I did everything correctly according to the instructions here: https://datahubproject.io/docs/developers/ Also, since I am not familiar with the datahub repository, how big is the scope of the change needed to make endpoint_url for AWS configurable via the yml file? Could you provide some info on what to check, where to look, and what to change in which src file? Thank you deeply for all the help you guys provide :)
  • red-napkin-59945 (04/13/2022, 4:43 PM)
    Hey team, is there any doc about the difference between `Container` and `Domain`?
  • curved-crayon-1929 (04/13/2022, 5:05 PM)
    Hi all, I am trying to ingest AWS S3 by following this document https://datahubproject.io/docs/metadata-ingestion/source_docs/s3 but it isn't working: UnboundLocalError: local variable 'node_urn' referenced before assignment. Below is the recipe that I am using:
    ```
    source:
        type: glue
        config:
            aws_region: us-east-2
            aws_access_key_id: AKIA226GV
            aws_secret_access_key: j4EzEH12YEQLw0p4+K
            aws_session_token: null
            database_pattern:
                allow:
                    - "billing"
            table_pattern:
                allow:
                    - "billingtable"
    sink:
        type: datahub-rest
        config:
            server: 'http://localhost:8080'
    ```
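
    One thing stands out: the linked document describes the S3 data-lake source, but this recipe uses `type: glue`, which is a different source, and the `node_urn` UnboundLocalError is coming from the glue code path. To capture the full traceback, the same recipe can be run in-process; a sketch reusing the values above (credentials omitted, since boto3 can also pick them up from the environment):

    ```python
    from datahub.ingestion.run.pipeline import Pipeline

    # Same recipe as the YAML above, run in-process so the full glue-source
    # traceback behind the UnboundLocalError is easy to capture.
    pipeline = Pipeline.create(
        {
            "source": {
                "type": "glue",
                "config": {
                    "aws_region": "us-east-2",
                    "database_pattern": {"allow": ["billing"]},
                    "table_pattern": {"allow": ["billingtable"]},
                },
            },
            "sink": {
                "type": "datahub-rest",
                "config": {"server": "http://localhost:8080"},
            },
        }
    )
    pipeline.run()
    pipeline.raise_from_status()
    ```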
  • nutritious-machine-80578 (04/13/2022, 6:21 PM)
    Hello - I'm trying to modify a dataset table name in the DataHub UI; we renamed the table in Snowflake, and I was wondering if this is possible to do in the UI or CLI. I couldn't find how to rename in the CLI docs I was reading here: https://datahubproject.io/docs/cli
  • gentle-father-80172 (04/13/2022, 6:27 PM)
    Hi, is `Python 3.9.2` required for Looker ingestion? Thanks.
    ```
    TypeError: You should use `typing_extensions.TypedDict` instead of `typing.TypedDict` with Python < 3.9.2. Without it, there is no way to differentiate required and optional fields when subclassed.
    [2022-04-13 18:18:03,597] INFO     {datahub.entrypoints:161} - DataHub CLI version: 0.8.32.6 at /home/ubuntu/.local/lib/python3.8/site-packages/datahub/__init__.py
    [2022-04-13 18:18:03,597] INFO     {datahub.entrypoints:164} - Python version: 3.8.10 (default, Mar 15 2022, 12:22:08) 
    [GCC 9.4.0] at /usr/bin/python3 on Linux-5.4.0-1045-aws-x86_64-with-glibc2.29
    [2022-04-13 18:18:03,597] INFO     {datahub.entrypoints:167} - GMS config {'models': {}, 'versions': {'linkedin/datahub': {'version': 'v0.8.32', 'commit': '7080798825c4ac696c074d335a7eab7d510346c8'}}, 'managedIngestion': {'defaultCliVersion': '0.8.32.1', 'enabled': True}, 'statefulIngestionCapable': True, 'supportsImpactAnalysis': False, 'telemetry': {'enabledCli': True, 'enabledIngestion': False}, 'retention': 'true', 'noCode': 'true'}
    ```
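
    The traceback is raised under Python 3.8 (per the log) by a dependency that relies on `typing.TypedDict` semantics only available from 3.9.2 on, so running the Looker ingestion under Python >= 3.9.2 is the likely fix. A small guard placed before the pipeline starts makes the requirement explicit instead of the deep traceback; a sketch:

    ```python
    import sys

    # Fail fast with a clear message rather than the TypedDict TypeError above.
    if sys.version_info < (3, 9, 2):
        raise RuntimeError(
            f"Looker ingestion needs Python >= 3.9.2, found {sys.version.split()[0]}"
        )
    ```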
  • nutritious-bird-77396 (04/13/2022, 9:51 PM)
    FYI: the Airflow lineage bundle fails with version v0.8.32.6 of acryldata/datahub... more details in 🧵. It works in `v0.8.32.5`, so it's not blocking...
  • able-rain-74449 (04/14/2022, 8:48 AM)
    I am still having this issue. I destroyed the entire EKS cluster and redeployed following the guide here: https://datahubproject.io/docs/deploy/aws
  • able-rain-74449 (04/14/2022, 8:50 AM)
    I see there's a bug reported for this issue: https://github.com/datahub-project/datahub/issues/4088
  • nutritious-jackal-99119 (04/14/2022, 9:01 AM)
    What's the best way to customize and deploy DataHub with our own parameters on AWS EKS? Any suggestions? Is it compulsory to check out the source and build Docker images?