# troubleshoot

    flat-painter-78331

    04/27/2023, 5:53 AM
    Hi team, I'm trying to integrate Airflow with DataHub. I'm running both DataHub and Airflow on my Kubernetes cluster and I've followed the exact steps in https://datahubproject.io/docs/lineage/airflow#using-datahubs-airflow-lineage-plugin, but none of the DAGs I've deployed show up in DataHub, and the task logs of the DAGs do not show any datahub log lines. I've been struggling with this for days and cannot figure out what I'm missing. Could you please help me resolve this? It would be much appreciated!
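For anyone hitting the same wall: a minimal sketch of the plugin wiring, assuming the `acryl-datahub-airflow-plugin` approach from the linked docs. The connection name and GMS address below are placeholders, and the section shape should be verified against your plugin version:

```ini
; airflow.cfg -- sketch, assuming the lineage plugin is installed in the
; Airflow workers' environment (pip install acryl-datahub-airflow-plugin)
[datahub]
enabled = true
conn_id = datahub_rest_default   ; must match an Airflow connection pointing at GMS
```

The connection itself would be created with something like `airflow connections add --conn-type 'datahub-rest' 'datahub_rest_default' --conn-host 'http://<gms-host>:8080'` (hypothetical host). If the plugin is not installed inside the worker containers' Python environment, task logs show no datahub lines at all, which matches the symptom above.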

    early-kitchen-6639

    04/27/2023, 6:31 AM
    Hello Team, I've deployed DataHub version 0.10.2 on EKS. All the services are up and running. When I log in to DataHub as the root user, I don't see any users in the UI. I don't see any errors in the datahub-frontend logs. Has anyone faced this before? What could be the possible reasons?

    adamant-car-44878

    04/27/2023, 7:57 AM
    Hi team, column-level tag and description ingestion is not working for schemas derived from Glue, but it is working for schemas derived from MSSQL. Why is this happening?

    jolly-baker-19848

    04/27/2023, 10:15 AM
    Hello team, I am using Ubuntu 20.04 and getting this error at the end of the installation of the datahub docker quickstart command. How can I fix this?
    Unable to run quickstart - the following issues were detected:
    - quickstart.sh or dev.sh is not running
    Python version - 3.8.16
    πŸ” 1
    πŸ“– 1
    l
    a
    • 3
    • 2
  • b

    bland-orange-13353

    04/27/2023, 11:54 AM
    This message was deleted.
    βœ… 1
    l
    • 2
    • 1
  • b

    bland-orange-13353

    04/27/2023, 12:39 PM
    This message was deleted.
    βœ… 1
    l
    a
    b
    • 4
    • 3
  • b

    brave-room-48783

    04/27/2023, 12:43 PM
    Hi, I cannot see any lineage after a successful ingestion.
    Ingestion source: Snowflake
    DataHub CLI version: 0.10.2.1
    Python version: 3.9.6 (default, Mar 10 2023, 20:16:38) [Clang 14.0.3 (clang-1403.0.22.14.1)]
    Deployment method: Docker on local machine
    YAML recipe:
    source:
      type: snowflake
      config:
        account_id: *masked*
        include_table_lineage: true
        include_view_lineage: true
        include_tables: true
        include_views: true
        profiling:
          enabled: false
          profile_table_level_only: true
        stateful_ingestion:
          enabled: true
        warehouse: *masked*
        username: *masked*
        password: *masked*
        role: *masked*
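One hedged thing to check (an assumption, not a confirmed diagnosis): DataHub only draws lineage edges between datasets it has ingested, and the Snowflake source reads lineage from a recent history window by default. The recipe could be extended along these lines (option name as documented for the Snowflake source; verify it exists in CLI 0.10.2.x):

```yaml
source:
  type: snowflake
  config:
    # ... same credentials and flags as the recipe above ...
    include_table_lineage: true
    include_view_lineage: true
    # assumption: widen the lineage window to all available history
    ignore_start_time_lineage: true
```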

    agreeable-ability-94084

    04/27/2023, 3:13 PM
    Hi I'm having trouble ingesting the sample data, python3 -m datahub docker quickstart works fine and brings the UI up but when I run python3 -m datahub docker ingest-sample-data I get the following error:

    agreeable-ability-94084

    04/27/2023, 3:14 PM
    Hi I'm having trouble ingesting the sample data, python3 -m datahub docker quickstart works fine and brings the UI up but when I run python3 -m datahub docker ingest-sample-data I get the following error: I have HTTPS_PROXY and HTTP_PROXY set to proxy IP and port, even tried NO_PROXY=http://localhost but no difference
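A sketch of the proxy environment setup, assuming a corporate proxy (the host and port below are placeholders). `NO_PROXY` entries are bare host names; a scheme like `http://localhost` will not match, which may be why it made no difference:

```shell
# Placeholders -- substitute your real proxy host/port.
export HTTP_PROXY="http://proxy.example.com:3128"
export HTTPS_PROXY="http://proxy.example.com:3128"
# Bare hosts only, no "http://" prefix:
export NO_PROXY="localhost,127.0.0.1"
# then retry:
# python3 -m datahub docker ingest-sample-data
```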

    agreeable-ability-94084

    04/27/2023, 3:15 PM
    If I use sudo, I get a long wait and then the following error:

    stocky-plumber-3084

    04/27/2023, 4:15 PM
    Hi, any idea why I'm getting these errors and a failed status for the MSSQL ingestion job when enabling stateful ingestion?
    ~~ Execution Summary - RUN_INGEST ~~
    Execution finished with errors.
    {'exec_id': 'f54bc15d-808f-4957-9e62-731d12cfa652',
     'infos': ['2023-04-27 16:07:31.851823 INFO: Starting execution for task with name=RUN_INGEST',
               "2023-04-27 16:08:03.582169 INFO: Failed to execute 'datahub ingest'",
               '2023-04-27 16:08:03.583391 INFO: Caught exception EXECUTING task_id=f54bc15d-808f-4957-9e62-731d12cfa652, name=RUN_INGEST,
                stacktrace=Traceback (most recent call last):
                  File "/usr/local/lib/python3.10/site-packages/acryl/executor/execution/default_executor.py", line 122, in execute_task
                    task_event_loop.run_until_complete(task_future)
                  File "/usr/local/lib/python3.10/asyncio/base_events.py", line 649, in run_until_complete
                    return future.result()
                  File "/usr/local/lib/python3.10/site-packages/acryl/executor/execution/sub_process_ingestion_task.py", line 231, in execute
                    raise TaskError("Failed to execute \'datahub ingest\'")
                acryl.executor.execution.task.TaskError: Failed to execute \'datahub ingest\''],
     'errors': []}
    ~~ Ingestion Report ~~
    {
      "cli": {
        "cli_version": "0.10.0.7",
        "cli_entry_location": "/usr/local/lib/python3.10/site-packages/datahub/__init__.py",
        "py_version": "3.10.10 (main, Mar 14 2023, 02:37:11) [GCC 10.2.1 20210110]",
        "py_exec_path": "/usr/local/bin/python",
        "os_details": "Linux-5.15.0-60-generic-x86_64-with-glibc2.31",
        "peak_memory_usage": "332.39 MB",
        "mem_info": "332.39 MB"
      },
      'aspects': {'container': {'containerProperties': 2, 'status': 2, 'dataPlatformInstance': 2, 'subTypes': 2, 'container': 1},
                  'dataset': {'container': 23, 'status': 23, 'datasetProperties': 23, 'schemaMetadata': 23, 'subTypes': 23, 'datasetProfile': 23}},
      'warnings': {}, 'failures': {}, 'soft_deleted_stale_entities': [],
      'tables_scanned': 23, 'views_scanned': 0, 'entities_profiled': 23,
      'filtered': ['db_accessadmin.*', 'db_backupoperator.*', 'db_datareader.*', 'db_datawriter.*', 'db_ddladmin.*', 'db_denydatareader.*', 'db_denydatawriter.*', 'db_owner.*', 'db_securityadmin.*', 'INFORMATION_SCHEMA.*', '... sampled of 12 total elements'],
      'query_combiner': {'total_queries': 1145, 'uncombined_queries_issued': 494, 'combined_queries_issued': 71, 'queries_combined': 337, 'query_exceptions': 314},
      'start_time': '2023-04-27 16:07:33.888433 (27.11 seconds ago)', 'running_time': '27.11 seconds'}
    [2023-04-27 16:08:01,000] INFO {datahub.cli.ingest_cli:137} - Sink (datahub-rest) report:
    {'total_records_written': 101, 'records_written_per_second': 3, 'warnings': [], 'failures': [],
     'start_time': '2023-04-27 16:07:33.529165 (27.47 seconds ago)', 'current_time': '2023-04-27 16:08:01.000169 (now)',
     'total_duration_in_seconds': 27.47, 'gms_version': 'v0.10.2', 'pending_requests': 0}
    [2023-04-27 16:08:01,008] ERROR {datahub.entrypoints:188} - Command failed:
    Traceback (most recent call last):
      File "/usr/local/lib/python3.10/site-packages/datahub/entrypoints.py", line 175, in main
        sys.exit(datahub(standalone_mode=False, **kwargs))
      File "/usr/local/lib/python3.10/site-packages/click/core.py", line 1130, in __call__
        return self.main(*args, **kwargs)
      File "/usr/local/lib/python3.10/site-packages/click/core.py", line 1055, in main
        rv = self.invoke(ctx)
      File "/usr/local/lib/python3.10/site-packages/click/core.py", line 1657, in invoke
        return _process_result(sub_ctx.command.invoke(sub_ctx))
      File "/usr/local/lib/python3.10/site-packages/click/core.py", line 1657, in invoke
        return _process_result(sub_ctx.command.invoke(sub_ctx))
      File "/usr/local/lib/python3.10/site-packages/click/core.py", line 1404, in invoke
        return ctx.invoke(self.callback, **ctx.params)
      File "/usr/local/lib/python3.10/site-packages/click/core.py", line 760, in invoke
        return __callback(*args, **kwargs)
      File "/usr/local/lib/python3.10/site-packages/click/decorators.py", line 26, in new_func
        return f(get_current_context(), *args, **kwargs)
      File "/usr/local/lib/python3.10/site-packages/datahub/telemetry/telemetry.py", line 379, in wrapper
        raise e
      File "/usr/local/lib/python3.10/site-packages/datahub/telemetry/telemetry.py", line 334, in wrapper
        res = func(*args, **kwargs)
      File "/usr/local/lib/python3.10/site-packages/datahub/utilities/memory_leak_detector.py", line 95, in wrapper
        return func(ctx, *args, **kwargs)
      File "/usr/local/lib/python3.10/site-packages/datahub/cli/ingest_cli.py", line 198, in run
        loop.run_until_complete(run_func_check_upgrade(pipeline))
      File "/usr/local/lib/python3.10/asyncio/base_events.py", line 649, in run_until_complete
        return future.result()
      File "/usr/local/lib/python3.10/site-packages/datahub/cli/ingest_cli.py", line 158, in run_func_check_upgrade
        ret = await the_one_future
      File "/usr/local/lib/python3.10/site-packages/datahub/cli/ingest_cli.py", line 149, in run_pipeline_async
        return await loop.run_in_executor(
      File "/usr/local/lib/python3.10/concurrent/futures/thread.py", line 58, in run
        result = self.fn(*self.args, **self.kwargs)
      File "/usr/local/lib/python3.10/site-packages/datahub/cli/ingest_cli.py", line 140, in run_pipeline_to_completion
        raise e
      File "/usr/local/lib/python3.10/site-packages/datahub/cli/ingest_cli.py", line 132, in run_pipeline_to_completion
        pipeline.run()
      File "/usr/local/lib/python3.10/site-packages/datahub/ingestion/run/pipeline.py", line 339, in run
        for wu in itertools.islice(
      File "/usr/local/lib/python3.10/site-packages/datahub/utilities/source_helpers.py", line 98, in auto_stale_entity_removal
        yield from stale_entity_removal_handler.gen_removed_entity_workunits()
      File "/usr/local/lib/python3.10/site-packages/datahub/ingestion/source/state/stale_entity_removal_handler.py", line 267, in gen_removed_entity_workunits
        last_checkpoint: Optional[Checkpoint] = self.source.get_last_checkpoint(
      File "/usr/local/lib/python3.10/site-packages/datahub/ingestion/source/state/stateful_ingestion_base.py", line 337, in get_last_checkpoint
        self.last_checkpoints[job_id] = self._get_last_checkpoint(
      File "/usr/local/lib/python3.10/site-packages/datahub/ingestion/source/state/stateful_ingestion_base.py", line 301, in _get_last_checkpoint
        last_checkpoint_aspect = self.ingestion_checkpointing_state_provider.get_latest_checkpoint(  # type: ignore
      File "/usr/local/lib/python3.10/site-packages/datahub/ingestion/source/state_provider/datahub_ingestion_checkpointing_provider.py", line 85, in get_latest_checkpoint
        ] = self.graph.get_latest_timeseries_value(
      File "/usr/local/lib/python3.10/site-packages/datahub/ingestion/graph/client.py", line 292, in get_latest_timeseries_value
        assert len(values) == 1
    AssertionError

    best-market-29539

    04/27/2023, 5:48 PM
    Hi, can I get any support on how to add links into documentation using transformers? https://datahubproject.io/docs/metadata-ingestion/docs/transformer/dataset_transformer#simple-add-dataset-datasetproperties. I found this page, but I don't know how to provide the appropriate values.
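A sketch of what such a recipe section might look like, assuming the `simple_add_dataset_properties` transformer from the linked page. The property key and URL are placeholders, and note one caveat: custom properties render as key/value pairs on the dataset's Properties tab; clickable documentation links live in the `institutionalMemory` aspect instead.

```yaml
transformers:
  - type: "simple_add_dataset_properties"
    config:
      # assumption: PATCH keeps properties already present on the dataset
      semantics: PATCH
      properties:
        documentation_link: "https://wiki.example.com/my-dataset"  # placeholder
```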

    adorable-daybreak-45557

    04/27/2023, 7:08 PM
    Hi! I managed to run datahub docker quickstart once successfully. Then I decided to nuke everything to reset my tests. Now gms will not start, with this error in the logs: com.fasterxml.jackson.core.JsonParseException: Unexpected character (']' (code 93)): expected a value. I deleted all volumes related to DataHub in Docker. Any idea where this error could come from?
    πŸ” 1
    βœ… 1
    πŸ“– 1
    l
    o
    • 3
    • 9
  • r

    red-painter-89141

    04/27/2023, 10:14 PM
    I'm having trouble getting quickstart to run again.
    Caught exception while executing bootstrap step IngestDataPlatformsStep. Exiting...

    jolly-baker-19848

    04/28/2023, 7:41 AM
    I am facing an issue running the datahub docker quickstart command. I'm getting this error -
    Unable to run quickstart - the following issues were detected:
    - quickstart.sh or dev.sh is not running
    Also tried datahub docker nuke, but it didn't work. I have Ubuntu 20.04 | DataHub CLI version: 0.10.2.2 | Python version: 3.8.10. Can anyone help me out with this? Attaching the log file in the thread.
    πŸ” 1
    πŸ“– 1
    βœ… 1
    l
    a
    • 3
    • 3
  • a

    adamant-car-44878

    04/28/2023, 8:18 AM
    I'm facing an issue adding column-level tags to a table that was ingested through the Glue catalog. When I do the same thing with a table ingested through MSSQL, it works perfectly fine. I have done it through a GraphQL mutation:
    mutation addTags {
      addTags(
        input: {
          tagUrns: "urn:li:tag:pi",
          resourceUrn: "urn:li:dataset:(urn:li:dataPlatform:glue,cdc_test.test1,PROD)",
          subResourceType: DATASET_FIELD,
          subResource: "changetype"
        }
      )
    }
    and the error that I'm getting is:
    {
      "errors": [
        {
          "message": "Failed to update resource with urn urn:li:dataset:(urn:li:dataPlatform:glue,cdc_test.test1,PROD). Entity does not exist.",
          "locations": [
            {
              "line": 2,
              "column": 5
            }
          ],
          "path": [
            "addTags"
          ],
          "extensions": {
            "code": 400,
            "type": "BAD_REQUEST",
            "classification": "DataFetchingException"
          }
        }
      ],
      "data": {
        "addTags": null
      },
      "extensions": {}
    }
    I am able to add tags through the UI.
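One thing worth checking, since the error says the entity does not exist: the URN in the mutation must match the ingested one exactly, and URNs are case-sensitive. A tiny illustrative helper (not the DataHub SDK's own builder, which lives in `datahub.emitter.mce_builder`) for constructing the string to compare against what the UI shows:

```python
def make_dataset_urn(platform: str, name: str, env: str = "PROD") -> str:
    """Build a dataset URN string in DataHub's standard format.

    Illustrative only -- URNs are matched exactly, so the casing and
    qualification of `name` (e.g. database.table for Glue) matter.
    """
    return f"urn:li:dataset:(urn:li:dataPlatform:{platform},{name},{env})"

urn = make_dataset_urn("glue", "cdc_test.test1")
```

Copy the URN straight out of the dataset page's address bar and diff it character-by-character against the one in the mutation.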

    lively-raincoat-33818

    04/28/2023, 10:47 AM
    Hello everyone. During dbt ingestion, tags are automatically created based on the dbt files, but when I try to add text to the "About" section of a tag, an error appears: "Failed to update description: Failed to update urn. Entity does not exist." Does anyone know the reason? Are we able to add both the description and the tag directly from dbt files? Thanks in advance.

    bland-gold-64386

    04/28/2023, 1:11 PM
    Hi @here, sorry to say it, but I have had a bad experience with DataHub: nobody is responding here and nobody is actively looking at the errors. I have faced lots of challenges, for example:
    β€’ when I create lineage from the UI, it sometimes shows properly but disappears after 5 minutes
    β€’ I am able to tag MSSQL column data, but not columns in the Glue catalog
    β€’ Spark lineage is not working
    Please guide me: did I choose the wrong product?

    victorious-monkey-86128

    04/28/2023, 3:25 PM
    Hi, this should probably be under #contribute, but on the Modifying Tags on Datasets page of the DataHub documentation, the code blocks for "Add Tags" and "Create Tags" are identical for the Python SDK. The Add Tags code block simply creates new tags without actually adding them to the fct_users_created dataset. Thanks!
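For reference, the semantic difference between the two operations can be sketched with plain dicts (an illustration of the aspect shapes involved, not the SDK's actual classes): "Create Tags" makes a new tag entity, while "Add Tags" appends a TagAssociation to the dataset's globalTags aspect.

```python
def create_tag(tag_name: str) -> dict:
    # "Create Tags": a new tag entity, attached to nothing
    return {"urn": f"urn:li:tag:{tag_name}", "name": tag_name}

def add_tag(global_tags: dict, tag_urn: str) -> dict:
    # "Add Tags": append a TagAssociation to a dataset's globalTags
    # aspect, skipping duplicates (shape mirrors the GlobalTags aspect)
    existing = {assoc["tag"] for assoc in global_tags.get("tags", [])}
    if tag_urn not in existing:
        global_tags.setdefault("tags", []).append({"tag": tag_urn})
    return global_tags

aspect = add_tag({"tags": []}, "urn:li:tag:Deprecated")
```

In the real SDK the second step means emitting the updated globalTags aspect against the dataset URN, which the doc's current "Add Tags" block never does.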

    lemon-account-12661

    05/01/2023, 4:42 PM
    Hello everyone, I'm currently evaluating DataHub for adoption (particularly the delta-lake source), but I'm running into issues with pyspark versioning. I'm currently running pyspark v3.3.2, which appears to be incompatible with DataHub because the version is pinned to 3.0.3. This PR would solve our issue: #6852 "Bump pyspark dependency to >=3.1.3", but it was later reverted, and I couldn't dig up logs or a thread giving a reason: https://github.com/datahub-project/datahub/pull/6954 Is there any way we can get this reopened or the version constraint loosened? Thanks!

    miniature-room-15319

    05/02/2023, 2:59 AM
    Hi, I want to ask about Slack notifications. My team configured Slack notifications based on the docs (https://datahubproject.io/docs/actions/actions/slack). Currently, Slack sends a notification when we change a table's tag, ownership, etc. via the UI. However, when a new dataset comes into DataHub via ingestion, or new DAG metadata arrives from Airflow, Slack does not notify us. I checked the docs on entity-creation events, and DataHub Actions already covers them (https://datahubproject.io/docs/actions/events/entity-change-event#entity-create-event). Does anyone know how to configure Slack notifications for new entity creation? Thank you.
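A sketch of what a DataHub Actions config for this could look like, assuming the event/filter shape from the linked entity-change-event doc. The action type here is a placeholder: the stock slack action may subscribe to a fixed set of events, in which case a custom action posting to a webhook is needed.

```yaml
# entity_create_notifier.yaml -- illustrative only
name: "entity_create_notifier"
source:
  type: "kafka"
  config:
    connection:
      bootstrap: ${KAFKA_BOOTSTRAP_SERVER:-localhost:9092}
filter:
  event_type: "EntityChangeEvent_v1"
  event:
    category: "LIFECYCLE"
    operation: "CREATE"
action:
  type: "my_custom_slack_action"   # hypothetical custom action
```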

    brave-room-48783

    05/02/2023, 5:15 AM
    Hi, I cannot see any lineage after a successful ingestion.
    Ingestion source: Snowflake
    DataHub CLI version: 0.10.2.1
    Python version: 3.9.6 (default, Mar 10 2023, 20:16:38) [Clang 14.0.3 (clang-1403.0.22.14.1)]
    Deployment method: Docker on local machine
    YAML recipe:
    source:
      type: snowflake
      config:
        account_id: *masked*
        include_table_lineage: true
        include_view_lineage: true
        include_tables: true
        include_views: true
        profiling:
          enabled: false
          profile_table_level_only: true
        stateful_ingestion:
          enabled: true
        warehouse: *masked*
        username: *masked*
        password: *masked*
        role: *masked*

    adamant-glass-95296

    05/02/2023, 9:29 AM
    Hi! I'm trying to add LDAP configuration with an AD LDAP, without success. Here is my jaas.conf:
    WHZ-Authentication {
      com.sun.security.auth.module.LdapLoginModule REQUIRED
    	userProvider="<ldap://host:389/OU=COMPANY%20NAME,DC=COMPANY,DC=fr>"
    	authIdentity="{USERNAME}"
    	userFilter="(&(sAMAccountName={USERNAME})(objectClass=person))"
    	java.naming.security.principal="CN=ldapread_aws,OU=Compte_technique,OU=COMPANY%20NAME,DC=COMPANY,DC=fr"
    	java.naming.security.credentials="ldap_password"
    	java.naming.security.authentication="simple"
    	debug="true"
    	useSSL="false";
    };
    I'm getting this error:
    datahub-frontend-react    | 2023-05-02 09:13:51,778 [application-akka.actor.default-dispatcher-10] INFO  org.eclipse.jetty.util.log - Logging initialized @199876ms to org.eclipse.jetty.util.log.Slf4jLog
    datahub-frontend-react    |             [LdapLoginModule] authentication-first mode; SSL disabled
    datahub-frontend-react    |             [LdapLoginModule] user provider: <ldap://host:389/OU=FEU%20VERT,DC=feuvert,DC=fr>
    datahub-frontend-react    |             [LdapLoginModule] attempting to authenticate user: weileal1
    datahub-frontend-react    |             [LdapLoginModule] authentication failed
    datahub-frontend-react    |             [LdapLoginModule] aborted authentication
    datahub-frontend-react    | 2023-05-02 09:13:52,075 [application-akka.actor.default-dispatcher-10] ERROR p.api.http.DefaultHttpErrorHandler - 
    datahub-frontend-react    | 
    datahub-frontend-react    | ! @81e3lp6d5 - Internal server error, for (POST) [/logIn] ->
    datahub-frontend-react    |  
    datahub-frontend-react    | play.api.UnexpectedException: Unexpected exception[RuntimeException: Failed to verify credentials for user]
    datahub-frontend-react    |     at play.api.http.HttpErrorHandlerExceptions$.throwableToUsefulException(HttpErrorHandler.scala:358)
    datahub-frontend-react    |     at play.api.http.DefaultHttpErrorHandler.onServerError(HttpErrorHandler.scala:264)
    datahub-frontend-react    |     at play.core.server.AkkaHttpServer$$anonfun$2.applyOrElse(AkkaHttpServer.scala:436)
    datahub-frontend-react    |     at play.core.server.AkkaHttpServer$$anonfun$2.applyOrElse(AkkaHttpServer.scala:428)
    datahub-frontend-react    |     at scala.concurrent.Future.$anonfun$recoverWith$1(Future.scala:417)
    datahub-frontend-react    |     at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41)
    datahub-frontend-react    |     at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    datahub-frontend-react    |     at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:63)
    datahub-frontend-react    |     at akka.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:100)
    datahub-frontend-react    |     at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    datahub-frontend-react    |     at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85)
    datahub-frontend-react    |     at akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:100)
    datahub-frontend-react    |     at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:49)
    datahub-frontend-react    |     at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:48)
    datahub-frontend-react    |     at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
    datahub-frontend-react    |     at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
    datahub-frontend-react    |     at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
    datahub-frontend-react    |     at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
    datahub-frontend-react    |     at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
    datahub-frontend-react    | Caused by: java.lang.RuntimeException: Failed to verify credentials for user
    datahub-frontend-react    |     at client.AuthServiceClient.verifyNativeUserCredentials(AuthServiceClient.java:265)
    datahub-frontend-react    |     at controllers.AuthenticationController.tryLogin(AuthenticationController.java:327)
    datahub-frontend-react    |     at controllers.AuthenticationController.logIn(AuthenticationController.java:170)
    datahub-frontend-react    |     at router.Routes$$anonfun$routes$1.$anonfun$applyOrElse$17(Routes.scala:581)
    datahub-frontend-react    |     at play.core.routing.HandlerInvokerFactory$$anon$8.resultCall(HandlerInvoker.scala:150)
    datahub-frontend-react    |     at play.core.routing.HandlerInvokerFactory$$anon$8.resultCall(HandlerInvoker.scala:149)
    datahub-frontend-react    |     at play.core.routing.HandlerInvokerFactory$JavaActionInvokerFactory$$anon$3$$anon$4$$anon$5.invocation(HandlerInvoker.scala:115)
    datahub-frontend-react    |     at play.core.j.JavaAction$$anon$1.call(JavaAction.scala:119)
    datahub-frontend-react    |     at play.http.DefaultActionCreator$1.call(DefaultActionCreator.java:33)
    datahub-frontend-react    |     at play.core.j.JavaAction.$anonfun$apply$8(JavaAction.scala:175)
    datahub-frontend-react    |     at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:659)
    datahub-frontend-react    |     at scala.util.Success.$anonfun$map$1(Try.scala:255)
    datahub-frontend-react    |     at scala.util.Success.map(Try.scala:213)
    datahub-frontend-react    |     at scala.concurrent.Future.$anonfun$map$1(Future.scala:292)
    datahub-frontend-react    |     at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
    datahub-frontend-react    |     at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
    datahub-frontend-react    |     at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    datahub-frontend-react    |     at play.core.j.HttpExecutionContext.$anonfun$execute$1(HttpExecutionContext.scala:64)
    datahub-frontend-react    |     at play.api.libs.streams.Execution$trampoline$.execute(Execution.scala:70)
    datahub-frontend-react    |     at play.core.j.HttpExecutionContext.execute(HttpExecutionContext.scala:59)
    datahub-frontend-react    |     at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
    datahub-frontend-react    |     at scala.concurrent.impl.Promise$KeptPromise$Kept.onComplete(Promise.scala:372)
    datahub-frontend-react    |     at scala.concurrent.impl.Promise$KeptPromise$Kept.onComplete$(Promise.scala:371)
    datahub-frontend-react    |     at scala.concurrent.impl.Promise$KeptPromise$Successful.onComplete(Promise.scala:379)
    datahub-frontend-react    |     at scala.concurrent.impl.Promise.transform(Promise.scala:33)
    datahub-frontend-react    |     at scala.concurrent.impl.Promise.transform$(Promise.scala:31)
    datahub-frontend-react    |     at scala.concurrent.impl.Promise$KeptPromise$Successful.transform(Promise.scala:379)
    datahub-frontend-react    |     at scala.concurrent.Future.map(Future.scala:292)
    datahub-frontend-react    |     at scala.concurrent.Future.map$(Future.scala:292)
    datahub-frontend-react    |     at scala.concurrent.impl.Promise$KeptPromise$Successful.map(Promise.scala:379)
    datahub-frontend-react    |     at scala.concurrent.Future$.apply(Future.scala:659)
    datahub-frontend-react    |     at play.core.j.JavaAction.apply(JavaAction.scala:176)
    datahub-frontend-react    |     at play.api.mvc.Action.$anonfun$apply$4(Action.scala:82)
    datahub-frontend-react    |     at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307)
    datahub-frontend-react    |     ... 14 common frames omitted
    datahub-frontend-react    | Caused by: org.apache.http.conn.HttpHostConnectException: Connect to datahub-gms:8080 [datahub-gms/172.18.0.10] failed: Connection refused (Connection refused)
    datahub-frontend-react    |     at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:156)
    datahub-frontend-react    |     at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:376)
    datahub-frontend-react    |     at org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:393)
    datahub-frontend-react    |     at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:236)
    datahub-frontend-react    |     at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:186)
    datahub-frontend-react    |     at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
    datahub-frontend-react    |     at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
    datahub-frontend-react    |     at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
    datahub-frontend-react    |     at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:83)
    datahub-frontend-react    |     at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:108)
    datahub-frontend-react    |     at client.AuthServiceClient.verifyNativeUserCredentials(AuthServiceClient.java:253)
    datahub-frontend-react    |     ... 47 common frames omitted
    datahub-frontend-react    | Caused by: java.net.ConnectException: Connection refused (Connection refused)
    datahub-frontend-react    |     at java.base/java.net.PlainSocketImpl.socketConnect(Native Method)
    datahub-frontend-react    |     at java.base/java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:412)
    datahub-frontend-react    |     at java.base/java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:255)
    datahub-frontend-react    |     at java.base/java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:237)
    datahub-frontend-react    |     at java.base/java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
    datahub-frontend-react    |     at java.base/java.net.Socket.connect(Socket.java:609)
    datahub-frontend-react    |     at org.apache.http.conn.socket.PlainConnectionSocketFactory.connectSocket(PlainConnectionSocketFactory.java:75)
    datahub-frontend-react    |     at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:142)
    datahub-frontend-react    |     ... 57 common frames omitted
    When I run this command in my container, it's successful:
    ldapsearch -H <ldap://host> -D "CN=<FISTNAME> <LASTNAME>,OU=Normal User,OU=Informatique,OU=Siège,OU=Office,OU=Zones,OU=COMPANY NAME,DC=COMPANY,DC=fr" -w "xxxxxx" -b "DC=feuvert,DC=fr" "(sAMAccountName=Username)"
    Does anyone have a clue, please?
    πŸ” 1
    πŸ“– 1
    l
    a
    b
    • 4
    • 5
  • b

    brainy-oxygen-20792

    05/02/2023, 9:35 AM
    Good morning πŸŒ„ I'm having some difficulty with impact analysis. With DataHub quickstart on Docker I've ingested Snowflake, DBT and Looker. I can see table and column lineage in the lineage visualisation and list. But when I choose to export the lineage in a .csv file, (using Lineage > 3 dots > Download) I only get the headers, and no rows of data. There is lineage shown on the "lineage" tab at the time I do the export, this happens for all platforms, and the issue persists after a nuke-and-reingest. All platforms are ingested via CLI. DataHub CLI version 0.10.2.2 Python 3.10.11
    πŸ” 1
    πŸ“– 1
    πŸ› 1
    l
    a
    +2
    • 5
    • 7
  • w

    wonderful-spring-3326

    05/02/2023, 1:27 PM
    Getting this error in the logs on the front-end pod every time someone tries to log in with SSO, though nothing has changed on our end as far as I know. I don't see any other logs that look like they're triggered by login attempts. Anyone got a clue what "invalid_client" suddenly appearing could mean?
    2023-05-02 12:59:28,433 [application-akka.actor.default-dispatcher-10] ERROR controllers.SsoCallbackController - Caught exception while attempting to handle SSO callback! It's likely that SSO integration is mis-configured.
    java.util.concurrent.CompletionException: org.pac4j.core.exception.TechnicalException: Bad token response, error=invalid_client
            at java.base/java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:314)
            at java.base/java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:319)
            at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1702)
            at play.core.j.HttpExecutionContext.$anonfun$execute$1(HttpExecutionContext.scala:64)
            at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:49)
            at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:48)
            at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
            at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
            at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
            at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
            at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
    Caused by: org.pac4j.core.exception.TechnicalException: Bad token response, error=invalid_client
            at auth.sso.oidc.custom.CustomOidcAuthenticator.validate(CustomOidcAuthenticator.java:162)
            at auth.sso.oidc.custom.CustomOidcAuthenticator.validate(CustomOidcAuthenticator.java:41)
            at org.pac4j.core.client.BaseClient.lambda$retrieveCredentials$0(BaseClient.java:70)
            at java.base/java.util.Optional.ifPresent(Optional.java:183)
            at org.pac4j.core.client.BaseClient.retrieveCredentials(BaseClient.java:67)
            at org.pac4j.core.client.IndirectClient.getCredentials(IndirectClient.java:143)
            at org.pac4j.core.engine.DefaultCallbackLogic.perform(DefaultCallbackLogic.java:85)
            at auth.sso.oidc.OidcCallbackLogic.perform(OidcCallbackLogic.java:100)
            at controllers.SsoCallbackController$SsoCallbackLogic.perform(SsoCallbackController.java:91)
            at controllers.SsoCallbackController$SsoCallbackLogic.perform(SsoCallbackController.java:77)
            at org.pac4j.play.CallbackController.lambda$callback$0(CallbackController.java:54)
            at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1700)
            ... 8 common frames omitted
    πŸ” 1
    βœ… 2
    πŸ“– 1
    l
    a
    • 3
    • 5
  • n

    nice-rocket-26538

    05/02/2023, 3:00 PM
    Hello, I'm trying to set up an enrichment process as a lineage from a dataset to itself (a self-referencing lineage). I want to use column lineage to visualise that one field is calculated from two others. I created the metadata through the Python SDK using the MetadataChangeProposalWrapper, but when I try to visualise the self-referencing lineage, no lineage is shown in DataHub. However, if I add another lineage from that dataset to a different one, then I can visualise not only that link but also the self-referencing lineage on the UI. How would you recommend I deal with self-referencing lineages? It would seem to me that this isn't really supported in DataHub? e.g.:
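The observed behaviour can be modelled with a toy graph: if a renderer only places nodes that have at least one edge to a *different* dataset, a dataset whose only edge is a self-loop never appears on the canvas. This is a conceptual sketch of that hypothesis, not DataHub's actual rendering code:

```python
def renderable_edges(edges):
    """Toy model: a node reaches the lineage canvas only via an edge to a
    different node; self-loops become visible once the node is placed."""
    nodes = set()
    for src, dst in edges:
        if src != dst:
            nodes.update((src, dst))
    return {(s, d) for s, d in edges if s in nodes and d in nodes}

only_self = renderable_edges([("ds_a", "ds_a")])
with_other = renderable_edges([("ds_a", "ds_a"), ("ds_a", "ds_b")])
```

Under this model, `only_self` draws nothing while `with_other` draws both edges, matching the behaviour described above.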

    lively-dusk-19162

    05/02/2023, 5:40 PM
    Hi team, I am trying to run the datahub docker quickstart command and got the following error inside the elasticsearch-setup docker container and also the gms container: Problem with request: Get "http://elasticsearch:9200": EOF. Sleeping 1s. I have done the following: checked network connectivity in both the elasticsearch and elasticsearch-setup containers; I am able to ping elasticsearch from elasticsearch-setup and vice versa. Can anyone please help me with this error?
    πŸ” 1
    πŸ“– 1
    l
    a
    • 3
    • 9
  • a

    agreeable-address-71270

    05/02/2023, 7:13 PM
    Hello all! I am trying to deploy Datahub on ECS, and want to know the minimum required container needed to run the service. For example are the following containers from the docker-compose needed?
    mysql-setup
    elasticsearch-setup
    kafka-setup
    datahub-actions
    datahub-upgrade
    Thanks!
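For what it's worth, a hedged reading of the quickstart compose file (verify against your DataHub version): the *-setup containers are one-shot init jobs that create schemas, indices, and topics and then exit, so they must run at least once but don't need to stay up; datahub-actions is only needed for UI-driven ingestion and the actions framework; datahub-upgrade runs as a job on version upgrades.

```yaml
# Illustrative grouping only -- not an official compose file
long_running:        # must stay up
  - datahub-gms
  - datahub-frontend-react
  - mysql              # or another supported metadata store
  - elasticsearch
  - "kafka broker + zookeeper"
run_once_jobs:       # run to completion, then exit
  - mysql-setup
  - elasticsearch-setup
  - kafka-setup
  - datahub-upgrade    # on upgrades
optional:
  - datahub-actions    # UI ingestion / actions framework
```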

    stocky-plumber-3084

    05/03/2023, 2:20 AM
    Hi, when exporting/downloading the search results to a CSV, the terms field shows only the UID but not the actual term name. Is this expected? Is there a way to show the name instead of its UID?