# troubleshoot

    fierce-monkey-46092

    04/18/2023, 8:08 AM
Hello everyone, I've updated all my DataHub containers to v0.10.0, and after that I ran datahub_upgrade.sh successfully. The issue is in the gms log: Connection refused: localhost/127.0.0.1:8080. The datahub-actions log shows: 2023/04/18 08:08:24 Problem with request: Get "http:///health": http: no Host in request URL. Sleeping 1s (my .yml file is attached in the comments)
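The empty host in "http:///health" usually means the actions container is building its GMS URL from environment variables that are blank. A minimal sketch of the relevant part of a docker-compose service, assuming GMS is reachable inside the compose network as datahub-gms on port 8080 (adjust to whatever your .yml actually calls it):
datahub-actions:
  environment:
    # If these are unset, the health-check URL ends up as "http:///health" with no host.
    - DATAHUB_GMS_HOST=datahub-gms
    - DATAHUB_GMS_PORT=8080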

    clever-magician-79463

    04/18/2023, 12:14 PM
Hi, I am trying to ingest metadata from Redshift and want to understand how the data is ingested: does it take a change delta, or does it scan all the tables each time the ingestion job is run? If there is any config I can use to control that, can anyone please point it out? I ask because when I try to ingest data, DataHub consumes all my cluster bandwidth and essentially chokes Redshift. This causes all other read queries to pile up, and ultimately the system crashes. If it reads all the data each time ingestion runs, it will be difficult to adopt DataHub for our data governance purposes. Please help with this query as it is our only blocker for now. If anyone wants to connect, we can discuss over direct messages as well. Thanks in advance.
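As far as I know, each run of the Redshift source re-reads the catalog/system tables rather than consuming a change delta, so the practical lever is limiting how much each run touches. A hedged sketch of a recipe that keeps the load down (option names follow the standard Redshift source config; verify against the connector docs for your version):
source:
  type: redshift
  config:
    host_port: "my-cluster.example.redshift.amazonaws.com:5439"  # placeholder
    database: mydb                                               # placeholder
    # Profiling issues the heavy per-table queries; keep it off (or tightly scoped).
    profiling:
      enabled: false
    # Only walk the schemas you actually need.
    schema_pattern:
      allow:
        - "public"
    # Lets DataHub detect removed assets between runs without extra scanning.
    stateful_ingestion:
      enabled: true
sink:
  type: datahub-rest
  config:
    server: "http://localhost:8080"  # placeholder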

    millions-summer-3123

    04/18/2023, 1:12 PM
    Hi everyone, I tried running
    datahub docker quickstart
following the guidelines here, but I get the following error:
    Copy code
    ERROR    {datahub.entrypoints:192} - Command failed: Command '['docker', 'compose', '-f', '/Users/david/.datahub/quickstart/docker-compose.yml', '-p', 'datahub', 'logs']' returned non-zero exit status 15.
Any idea what may be happening?
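Exit status 15 is coming from the docker compose ... logs command the CLI shells out to, so inspecting the stack directly usually tells you more. A small diagnostic sketch using the compose file path from the error message:
# Which containers are up, restarting, or exited?
docker compose -f ~/.datahub/quickstart/docker-compose.yml -p datahub ps

# Tail the service that looks unhealthy (datahub-gms is a common culprit)
docker compose -f ~/.datahub/quickstart/docker-compose.yml -p datahub logs --tail 100 datahub-gms

# If the stack is wedged, wipe it and start over (this deletes quickstart data)
datahub docker nuke
datahub docker quickstart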

    steep-alligator-93593

    04/18/2023, 3:22 PM
    Hi, has anyone seen this error before?
    Copy code
    ERROR    {datahub_actions.plugin.source.kafka.kafka_event_source:157} - Kafka consume error: KafkaError{code=_NOT_IMPLEMENTED,val=-170,str="Decompression (codec 0x4) of message at 110 of 186 bytes failed: Local: Not implemented"}
    Seeing this mainly in my
    datahub-acryl-datahub-actions
    pod
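Codec 0x4 is zstd, so the librdkafka build inside the actions image is refusing to decompress zstd-compressed messages. One possible workaround, sketched with a placeholder broker address and topic, is to move the affected topic (or its producers) to a codec the consumer does support, such as gzip:
# Check the topic's current compression setting
kafka-configs.sh --bootstrap-server broker:9092 --entity-type topics \
  --entity-name PlatformEvent_v1 --describe

# Store/serve messages as gzip instead (apply to whichever topic the actions consumer reads)
kafka-configs.sh --bootstrap-server broker:9092 --entity-type topics \
  --entity-name PlatformEvent_v1 --alter --add-config compression.type=gzip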
    πŸ” 1
    βœ… 1
    πŸ“– 1
    l
    a
    +2
    • 5
    • 10
  • h

    handsome-football-66174

    04/18/2023, 4:56 PM
Hi Team, we are upgrading DataHub to 0.10.1 and have SSO configured. We are getting the following error; how do we resolve it?
    Copy code
    2023-04-18 16:53:57,724 [application-akka.actor.default-dispatcher-12] ERROR controllers.SsoCallbackController - Caught exception while attempting to handle SSO callback! It's likely that SSO integration is mis-configured.
    java.util.concurrent.CompletionException: org.pac4j.core.exception.TechnicalException: Unsigned ID tokens are not allowed: they must be explicitly enabled on client side and the response_type used must return no ID Token from the authorization endpoint
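The exception comes from pac4j's stricter ID-token validation. One commonly suggested mitigation is to tell the frontend which JWS algorithm to expect so the token is no longer treated as unsigned; the variable below is an assumption to verify against the OIDC docs for your exact image version:
datahub-frontend-react:
  environment:
    # Expect RS256-signed ID tokens from the IdP
    - AUTH_OIDC_PREFERRED_JWS_ALGORITHM=RS256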

    lively-businessperson-84910

    04/18/2023, 8:09 PM
Hi, I'm seeing this error when trying to ingest from BigQuery. Does anyone know anything about it?
    Copy code
    ~~~~ Execution Summary - RUN_INGEST ~~~~
    Execution finished with errors.
    {'exec_id': 'b7fe3905-6cb1-4c7b-9fb7-06f2e516826a',
     'infos': ['2023-04-18 19:28:43.814912 INFO: Starting execution for task with name=RUN_INGEST',
               "2023-04-18 19:28:49.881382 INFO: Failed to execute 'datahub ingest'",
               '2023-04-18 19:28:49.883130 INFO: Caught exception EXECUTING task_id=b7fe3905-6cb1-4c7b-9fb7-06f2e516826a, name=RUN_INGEST, '
               'stacktrace=Traceback (most recent call last):\n'
               '  File "/usr/local/lib/python3.10/site-packages/acryl/executor/execution/default_executor.py", line 122, in execute_task\n'
               '    task_event_loop.run_until_complete(task_future)\n'
               '  File "/usr/local/lib/python3.10/asyncio/base_events.py", line 649, in run_until_complete\n'
               '    return future.result()\n'
               '  File "/usr/local/lib/python3.10/site-packages/acryl/executor/execution/sub_process_ingestion_task.py", line 231, in execute\n'
               '    raise TaskError("Failed to execute \'datahub ingest\'")\n'
               "acryl.executor.execution.task.TaskError: Failed to execute 'datahub ingest'\n"],
     'errors': []}
    
    ~~~~ Ingestion Logs ~~~~
    Obtaining venv creation lock...
    Acquired venv creation lock
    venv setup time = 0
    This version of datahub supports report-to functionality
    datahub  ingest run -c /tmp/datahub/ingest/b7fe3905-6cb1-4c7b-9fb7-06f2e516826a/recipe.yml --report-to /tmp/datahub/ingest/b7fe3905-6cb1-4c7b-9fb7-06f2e516826a/ingestion_report.json
    [2023-04-18 19:28:45,463] INFO     {datahub.cli.ingest_cli:173} - DataHub CLI version: 0.10.2
    [2023-04-18 19:28:45,532] INFO     {datahub.ingestion.run.pipeline:204} - Sink configured successfully. DataHubRestEmitter: configured to talk to 
    <http://datahub-gms:8080>
    [2023-04-18 19:28:47,467] WARNING  {py.warnings:109} - /tmp/datahub/ingest/venv-bigquery-0.10.2/lib/python3.10/site-packages/ratelimiter.py:127: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
      __aexit__ = asyncio.coroutine(__exit__)
    
    [2023-04-18 19:28:47,791] ERROR    {datahub.entrypoints:195} - Command failed: Failed to find a registered source for type bigquery: 'str' object is not callable
    Traceback (most recent call last):
      File "/tmp/datahub/ingest/venv-bigquery-0.10.2/lib/python3.10/site-packages/datahub/ingestion/run/pipeline.py", line 119, in _add_init_error_context
        yield
      File "/tmp/datahub/ingest/venv-bigquery-0.10.2/lib/python3.10/site-packages/datahub/ingestion/run/pipeline.py", line 214, in __init__
        source_class = source_registry.get(source_type)
      File "/tmp/datahub/ingest/venv-bigquery-0.10.2/lib/python3.10/site-packages/datahub/ingestion/api/registry.py", line 173, in get
        tp = self._ensure_not_lazy(key)
      File "/tmp/datahub/ingest/venv-bigquery-0.10.2/lib/python3.10/site-packages/datahub/ingestion/api/registry.py", line 117, in _ensure_not_lazy
        plugin_class = import_path(path)
      File "/tmp/datahub/ingest/venv-bigquery-0.10.2/lib/python3.10/site-packages/datahub/ingestion/api/registry.py", line 48, in import_path
        item = importlib.import_module(module_name)
      File "/usr/local/lib/python3.10/importlib/__init__.py", line 126, in import_module
        return _bootstrap._gcd_import(name[level:], package, level)
      File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
      File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
      File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
      File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
      File "<frozen importlib._bootstrap_external>", line 883, in exec_module
      File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
      File "/tmp/datahub/ingest/venv-bigquery-0.10.2/lib/python3.10/site-packages/datahub/ingestion/source/bigquery_v2/bigquery.py", line 57, in <module>
        from datahub.ingestion.source.bigquery_v2.lineage import (
      File "/tmp/datahub/ingest/venv-bigquery-0.10.2/lib/python3.10/site-packages/datahub/ingestion/source/bigquery_v2/lineage.py", line 39, in <module>
        from datahub.utilities.bigquery_sql_parser import BigQuerySQLParser
      File "/tmp/datahub/ingest/venv-bigquery-0.10.2/lib/python3.10/site-packages/datahub/utilities/bigquery_sql_parser.py", line 6, in <module>
        from datahub.utilities.sql_parser import SqlLineageSQLParser, SQLParser
      File "/tmp/datahub/ingest/venv-bigquery-0.10.2/lib/python3.10/site-packages/datahub/utilities/sql_parser.py", line 9, in <module>
        from datahub.utilities.sql_lineage_parser_impl import SqlLineageSQLParserImpl
      File "/tmp/datahub/ingest/venv-bigquery-0.10.2/lib/python3.10/site-packages/datahub/utilities/sql_lineage_parser_impl.py", line 8, in <module>
        from sqllineage.core.holders import Column, SQLLineageHolder
      File "/tmp/datahub/ingest/venv-bigquery-0.10.2/lib/python3.10/site-packages/sqllineage/__init__.py", line 41, in <module>
        _monkey_patch()
      File "/tmp/datahub/ingest/venv-bigquery-0.10.2/lib/python3.10/site-packages/sqllineage/__init__.py", line 35, in _monkey_patch
        _patch_updating_lateral_view_lexeme()
      File "/tmp/datahub/ingest/venv-bigquery-0.10.2/lib/python3.10/site-packages/sqllineage/__init__.py", line 24, in _patch_updating_lateral_view_lexeme
        if regex("LATERAL VIEW EXPLODE(col)"):
    TypeError: 'str' object is not callable
    
    The above exception was the direct cause of the following exception:
    
    Traceback (most recent call last):
      File "/tmp/datahub/ingest/venv-bigquery-0.10.2/lib/python3.10/site-packages/datahub/entrypoints.py", line 182, in main
        sys.exit(datahub(standalone_mode=False, **kwargs))
      File "/tmp/datahub/ingest/venv-bigquery-0.10.2/lib/python3.10/site-packages/click/core.py", line 1130, in __call__
        return self.main(*args, **kwargs)
      File "/tmp/datahub/ingest/venv-bigquery-0.10.2/lib/python3.10/site-packages/click/core.py", line 1055, in main
        rv = self.invoke(ctx)
      File "/tmp/datahub/ingest/venv-bigquery-0.10.2/lib/python3.10/site-packages/click/core.py", line 1657, in invoke
        return _process_result(sub_ctx.command.invoke(sub_ctx))
      File "/tmp/datahub/ingest/venv-bigquery-0.10.2/lib/python3.10/site-packages/click/core.py", line 1657, in invoke
        return _process_result(sub_ctx.command.invoke(sub_ctx))
      File "/tmp/datahub/ingest/venv-bigquery-0.10.2/lib/python3.10/site-packages/click/core.py", line 1404, in invoke
        return ctx.invoke(self.callback, **ctx.params)
      File "/tmp/datahub/ingest/venv-bigquery-0.10.2/lib/python3.10/site-packages/click/core.py", line 760, in invoke
        return __callback(*args, **kwargs)
      File "/tmp/datahub/ingest/venv-bigquery-0.10.2/lib/python3.10/site-packages/click/decorators.py", line 26, in new_func
        return f(get_current_context(), *args, **kwargs)
      File "/tmp/datahub/ingest/venv-bigquery-0.10.2/lib/python3.10/site-packages/datahub/telemetry/telemetry.py", line 379, in wrapper
        raise e
      File "/tmp/datahub/ingest/venv-bigquery-0.10.2/lib/python3.10/site-packages/datahub/telemetry/telemetry.py", line 334, in wrapper
        res = func(*args, **kwargs)
      File "/tmp/datahub/ingest/venv-bigquery-0.10.2/lib/python3.10/site-packages/datahub/utilities/memory_leak_detector.py", line 95, in wrapper
        return func(ctx, *args, **kwargs)
      File "/tmp/datahub/ingest/venv-bigquery-0.10.2/lib/python3.10/site-packages/datahub/cli/ingest_cli.py", line 187, in run
        pipeline = Pipeline.create(
      File "/tmp/datahub/ingest/venv-bigquery-0.10.2/lib/python3.10/site-packages/datahub/ingestion/run/pipeline.py", line 328, in create
        return cls(
      File "/tmp/datahub/ingest/venv-bigquery-0.10.2/lib/python3.10/site-packages/datahub/ingestion/run/pipeline.py", line 211, in __init__
        with _add_init_error_context(
      File "/usr/local/lib/python3.10/contextlib.py", line 153, in __exit__
        self.gen.throw(typ, value, traceback)
      File "/tmp/datahub/ingest/venv-bigquery-0.10.2/lib/python3.10/site-packages/datahub/ingestion/run/pipeline.py", line 121, in _add_init_error_context
        raise PipelineInitError(f"Failed to {step}: {e}") from e
    datahub.ingestion.run.pipeline.PipelineInitError: Failed to find a registered source for type bigquery: 'str' object is not callable
DataHub CLI version: 0.10.2 Python version: 3.10.10 (main, Feb 8 2023, 05:32:04) [Clang 14.0.0 (clang-1400.0.29.202)]
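The TypeError is raised inside sqllineage's monkey patch rather than in DataHub itself, which usually points at an incompatible sqlparse release inside the per-run venv. A hedged workaround, assuming a newer sqlparse is the culprit, is to pin it in the same environment the ingestion runs in (or upgrade acryl-datahub to a build that already pins it):
# Inside the venv the executor created for this run (path taken from the log above)
/tmp/datahub/ingest/venv-bigquery-0.10.2/bin/pip install "sqlparse==0.4.3"

# Or, for a local CLI install
pip install "acryl-datahub[bigquery]" "sqlparse==0.4.3"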

    quaint-barista-82836

    04/18/2023, 8:30 PM
Hi Team, upgraded to v0.10.2 on GKE and using pingfed for the integration with the below configuration:
- name: AUTH_OIDC_CLIENT_ID
  value: client_id
- name: AUTH_OIDC_CLIENT_SECRET
  value: secret
- name: AUTH_OIDC_DISCOVERY_URI
  value: https://dns/.well-known/openid-configuration
- name: AUTH_OIDC_BASE_URL
  value: https://dns
- name: AUTH_OIDC_SCOPE
  value: openid profile email
- name: AUTH_OIDC_USER_NAME_CLAIM
  value: email
- name: METADATA_SERVICE_AUTH_ENABLED
  value: "true"
- name: AUTH_OIDC_JIT_PROVISIONING_ENABLED
  value: "true"
- name: AUTH_OIDC_PRE_PROVISIONING_REQUIRED
  value: "false"
- name: AUTH_JAAS_ENABLED
  value: "false"
Getting SSO error: "java.util.concurrent.CompletionException: org.pac4j.core.exception.TechnicalException: Unsigned ID tokens are not allowed: they must be explicitly enabled on client side and the response_type used must return no ID Token from the authorization endpoint at" cc: @bland-artist-80619
    πŸ” 1
    βœ… 1
    πŸ“– 1
    l
    h
    • 3
    • 8
  • b

    brief-petabyte-56179

    04/18/2023, 10:06 PM
Hi everyone! I have a couple of questions. First, is there a way to display entities based on a view in the API? I tried the following with no luck:
    Copy code
    {
      "query": "query { entityRegistry { findList(input: { type: DataHubView, urn: \"urn:li:dataHubView:<my_view_urn>\" }) { entities { urn, name { value } } } } } }"
    }
    Second, is there a way to assign views based on group membership? For example, I want everyone in the "test-users" group to have their default view be the "test-user-view".
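Views are applied as a filter on search rather than read off the view entity, so something along these lines may work for the first question (assuming your GMS version exposes the viewUrn field on SearchAcrossEntitiesInput); for the second, as far as I know default views can only be set per-user or instance-wide, not per-group:
{
  "query": "query { searchAcrossEntities(input: { query: \"*\", start: 0, count: 10, viewUrn: \"urn:li:dataHubView:<my_view_urn>\" }) { total searchResults { entity { urn type } } } }"
}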
    πŸ” 1
    βœ… 1
    πŸ“– 1
    l
    • 2
    • 8
  • r

    red-painter-89141

    04/18/2023, 10:43 PM
    I'm struggling to get datahub docker quickstart to work on ubuntu. It seems to be stuck in some sort of loop, then eventually errors out:
    Copy code
    Finished pulling docker images!
    
    Starting up DataHub...
    [+] Running 13/13
     βœ” Network datahub_network              Created                                                                    0.1s 
     βœ” Container datahub-upgrade            Started                                                                    1.9s 
     βœ” Container elasticsearch              Started                                                                    2.0s 
     βœ” Container mysql                      Started                                                                    2.1s 
     βœ” Container zookeeper                  Started                                                                    2.0s 
     βœ” Container datahub-gms                Started                                                                    3.6s 
     βœ” Container mysql-setup                Started                                                                    3.3s 
     βœ” Container elasticsearch-setup        Started                                                                    3.0s 
     βœ” Container broker                     Started                                                                    3.0s 
     βœ” Container datahub-frontend-react     Started                                                                    5.7s 
     βœ” Container datahub-datahub-actions-1  Started                                                                    5.5s 
     βœ” Container schema-registry            Started                                                                    5.6s 
     βœ” Container kafka-setup                Started                                                                    7.0s 
    ...........
    [+] Running 12/12
     βœ” Container mysql                      Running                                                                    0.0s 
     βœ” Container datahub-gms                Running                                                                    0.0s 
     βœ” Container datahub-frontend-react     Running                                                                    0.0s 
     βœ” Container elasticsearch              Running                                                                    0.0s 
     βœ” Container elasticsearch-setup        Running                                                                    0.0s 
     βœ” Container zookeeper                  Running                                                                    0.0s 
     βœ” Container broker                     Running                                                                    0.0s 
     βœ” Container schema-registry            Running                                                                    0.0s 
     βœ” Container kafka-setup                Started                                                                    1.5s 
     βœ” Container datahub-upgrade            Running                                                                    0.0s 
     βœ” Container mysql-setup                Started                                                                    1.5s 
     βœ” Container datahub-datahub-actions-1  Running                                                                    0.0s 
    .............
    [+] Running 12/12
     βœ” Container mysql                      Running                                                                    0.0s 
     βœ” Container datahub-gms                Running                                                                    0.0s 
     βœ” Container zookeeper                  Running                                                                    0.0s 
     βœ” Container datahub-datahub-actions-1  Running                                                                    0.0s 
     βœ” Container elasticsearch              Running                                                                    0.0s 
     βœ” Container mysql-setup                Started                                                                    4.1s 
     βœ” Container broker                     Running                                                                    0.0s 
     βœ” Container datahub-frontend-react     Running                                                                    0.0s 
     βœ” Container elasticsearch-setup        Running                                                                    0.0s 
     βœ” Container datahub-upgrade            Running                                                                    0.0s

    witty-motorcycle-52108

    04/18/2023, 10:47 PM
    hi all, we just ran into a pretty unfortunate issue where the CLI is entirely broken when working with systems that have system auth enabled. filed a bug for it here, but wanted to also reach out and provide a direct heads up: https://github.com/datahub-project/datahub/issues/7852

    millions-hydrogen-95879

    04/19/2023, 6:48 AM
    I am still not able to install datahub on Mac M1.
    Copy code
    Unable to run quickstart - the following issues were detected:
    - elasticsearch-setup exited with an error
    - datahub-gms is still starting
- elasticsearch is running but not yet healthy
    - datahub-upgrade is still running
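The first failure to chase is usually the one the CLI names: elasticsearch-setup exiting non-zero, which in turn tends to keep datahub-gms and datahub-upgrade waiting. A small sketch of the usual checks, using the container names from the quickstart output:
# Why did the setup job fail, and is elasticsearch itself healthy?
docker logs elasticsearch-setup
docker logs elasticsearch --tail 50
docker logs datahub-gms --tail 50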

    rapid-accountant-13127

    04/19/2023, 8:29 AM
    Hi team, I am not able to extract redshift lineage
    Copy code
    'warnings': {'extract-QUERY_SCAN': ["Error was (psycopg2.errors.FeatureNotSupported) Specified types or functions (one per INFO message) not supported on Redshift tables.\n\n[SQL: \n            select\n                distinct cluster,\n                target_schema,\n                target_table,\n                username,\n                source_schema,\n                source_table\n            from\n                    (\n                select\n                    distinct tbl as target_table_id,\n                    sti.schema as target_schema,\n                    sti.table as target_table,\n                    sti.database as cluster,\n                    query,\n                    starttime\n                from\n                    stl_insert\n                join SVV_TABLE_INFO sti on\n                    sti.table_id = tbl\n                where starttime >= '2023-04-18 00:00:00'\n                and starttime < '2023-04-19 08:00:04'\n                and cluster = 'egateewh'\n                    ) as target_tables\n            join ( (\n                select\n                    pu.usename::varchar(40) as username,\n                    ss.tbl as source_table_id,\n                    sti.schema as source_schema,\n                    sti.table as source_table,\n                    scan_type,\n                    sq.query as query\n                from\n                    (\n                    select\n                        distinct userid,\n                        query,\n                        tbl,\n                        type as scan_type\n                    from\n                        stl_scan\n                ) ss\n                join SVV_TABLE_INFO sti on\n                    sti.table_id = ss.tbl\n                left join pg_user pu on\n                    pu.usesysid = ss.userid\n                left join stl_query sq on\n                    ss.query = sq.query\n                where\n                    pu.usename <> 'rdsdb')\n            ) as source_tables\n                    using (query)\n            where\n                scan_type in (1, 2, 3)\n            order by cluster, target_schema, target_table, starttime asc\n        ]\n(Background on this error at: <https://sqlalche.me/e/14/tw8g>)"]},
#troubleshoot #ingestion I use the
    datahub docker quickstart
    command to deploy DataHub version v0.10.2, DataHub CLI version: 0.10.2.1, Python version: 3.8.15
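The failing query is the scan-based lineage path (stl_insert/stl_scan joined with SVV_TABLE_INFO). If it keeps erroring, the source config can usually be switched to SQL-based lineage, or lineage can be disabled while debugging; a hedged sketch, assuming table_lineage_mode is available in your connector version:
source:
  type: redshift
  config:
    host_port: "my-cluster.example.redshift.amazonaws.com:5439"  # placeholder
    database: egateewh
    # Parse lineage from query text instead of the failing stl_scan-based join
    table_lineage_mode: sql_based
    # Or skip lineage entirely while debugging:
    # include_table_lineage: false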

    white-insurance-25610

    04/19/2023, 8:55 AM
Hi team, I was using the openapi/entities/v1/latest GET API. When I gave the urn of a pipeline, the output came back successfully, but when I gave the urn of a task it returned an error with multiple causes. Can anyone help me with this? urn of a pipeline: urn:li:dataFlow:(airflow,new_dag,prod) urn of a task: urn:li:dataJob:(urn:li:dataFlow:(airflow,new_dag,prod),first_task)
    response_1681891043946.json
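One thing worth ruling out is URL encoding: the dataJob urn nests parentheses and commas, which must be escaped in a query string. A hedged curl sketch (the urns parameter name is an assumption; check the OpenAPI UI on your GMS for the exact signature):
# -G + --data-urlencode makes curl escape the parentheses/commas inside the urn
curl -G "http://localhost:8080/openapi/entities/v1/latest" \
  --header "Authorization: Bearer <token>" \
  --data-urlencode "urns=urn:li:dataJob:(urn:li:dataFlow:(airflow,new_dag,prod),first_task)"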

    strong-twilight-1984

    04/19/2023, 10:19 AM
Hi there. I ran the datahub-upgrade container successfully to upgrade from v0.9.6.1 to v0.10.1, but when starting the gms, mce, and mae containers, I see this repeated in the logs (the upgrade container was on v0.10.1 and the other containers on v0.9.6.1, in AWS EKS):
    Copy code
    2023-04-19 10:03:48,711 [ThreadPoolTaskExecutor-1] WARN  o.apache.kafka.clients.NetworkClient - [Consumer clientId=consumer-generic-duhe-consumer-job-client-2, groupId=generic-duhe-consumer-job-client] Error while fetching metadata with correlation id 156 : {DataHubUpgradeHistory_v1=UNKNOWN_TOPIC_OR_PARTITION}
Not sure what I am missing.
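UNKNOWN_TOPIC_OR_PARTITION means the consumers are waiting on the DataHubUpgradeHistory_v1 topic, which newer releases expect the kafka-setup (or upgrade) job to have created. If re-running kafka-setup is not convenient, one workaround is to create the topic by hand; a sketch with a placeholder broker address and minimal sizing:
kafka-topics.sh --bootstrap-server broker:9092 --create \
  --topic DataHubUpgradeHistory_v1 \
  --partitions 1 --replication-factor 1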
    πŸ” 1
    πŸ“– 1
    l
    a
    • 3
    • 3
  • f

    flaky-dinner-67771

    04/19/2023, 11:15 AM
Hi Team, I'm trying to ingest metadata from an Oracle DB, but I'm getting numerous exceptions at runtime:
    Copy code
    [2023-04-19 14:00:07,217] ERROR    {datahub.utilities.sqlalchemy_query_combiner:257} - Failed to execute query normally, using fallback:
    SELECT hire_date
    FROM hr.employees
    WHERE (hire_date NOT IN (NULL) OR (1 = 1)) AND hire_date IS NOT NULL
    AND ROWNUM <= 20
    Traceback (most recent call last):
      File "/home/vladimir/.local/lib/python3.10/site-packages/datahub/utilities/sqlalchemy_query_combiner.py", line 119, in get_query_columns
        return list(query.inner_columns)
    AttributeError: 'str' object has no attribute 'inner_columns'
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "/home/vladimir/.local/lib/python3.10/site-packages/datahub/utilities/sqlalchemy_query_combiner.py", line 253, in _sa_execute_fake
        handled, result = self._handle_execute(conn, query, args, kwargs)
      File "/home/vladimir/.local/lib/python3.10/site-packages/datahub/utilities/sqlalchemy_query_combiner.py", line 218, in _handle_execute
        if not self.is_single_row_query_method(query):
      File "/home/vladimir/.local/lib/python3.10/site-packages/datahub/ingestion/source/ge_data_profiler.py", line 227, in _is_single_row_query_method
        query_columns = get_query_columns(query)
      File "/home/vladimir/.local/lib/python3.10/site-packages/datahub/utilities/sqlalchemy_query_combiner.py", line 121, in get_query_columns
        return list(query.columns)
    AttributeError: 'str' object has no attribute 'columns'
They all refer to the same place: the get_query_columns function in the file
~\.local\lib\python3.10\site-packages\datahub\utilities\sqlalchemy_query_combiner.py
As I understand it, the query variable is passed into this function with the str type, and because that type has no inner_columns or columns attributes, Python raises the error. How can this be fixed?
OS: Linux Ubuntu 22.04
Database: Oracle Database 21c Express Edition Release 21.0.0.0.0
DataHub version (quickstart): 0.10.1.1
Python: 3.10.6
My recipe:
    Copy code
    pipeline_name: "Oracle_HR"
    source:
        type: oracle
        config:
            host_port: "localhost:1521"
            username: HR
            password: HR
            service_name: XEPDB1
            schema_pattern:
                allow:
                    - HR
            profiling:
                enabled: true
            stateful_ingestion:
                enabled: true
            profile_pattern:
                allow:
                - "HR.*.*"
    sink:
        type: "datahub-rest"
        config:
            server: "<http://localhost:8080>"
    I launch the recipe through the CLI with the following command:
    datahub ingest -c Oracle_HR.dhub.yaml
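Both tracebacks pass through the profiler's query combiner, so one stopgap is to switch that combiner off and let each profiling query run on its own. The option name below follows the GE profiling config and is worth verifying for your version; it slots into the existing recipe like this:
            profiling:
                enabled: true
                # Bypass the SQLAlchemy query-combining optimisation that is choking on
                # the plain-string queries shown in the traceback above
                query_combiner_enabled: false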

    limited-sundown-85797

    04/19/2023, 11:44 AM
Hello everyone! I'm trying to follow the instructions under the first_link (local development, and the next one using Docker), but every time I try to launch ./gradlew it returns an error like:
./gradlew
Downloading https://services.gradle.org/distributions/gradle-6.9.2-bin.zip
Exception in thread "main" java.io.IOException: Downloading from https://services.gradle.org/distributions/gradle-6.9.2-bin.zip failed: timeout
	at org.gradle.wrapper.Download.downloadInternal(Download.java:110)
	at org.gradle.wrapper.Download.download(Download.java:67)
	at org.gradle.wrapper.Install$1.call(Install.java:68)
	at org.gradle.wrapper.Install$1.call(Install.java:48)
	at org.gradle.wrapper.ExclusiveFileAccessManager.access(ExclusiveFileAccessManager.java:69)
	at org.gradle.wrapper.Install.createDist(Install.java:48)
	at org.gradle.wrapper.WrapperExecutor.execute(WrapperExecutor.java:107)
	at org.gradle.wrapper.GradleWrapperMain.main(GradleWrapperMain.java:63)
Caused by: java.net.SocketTimeoutException: connect timed out
	at java.base/java.net.PlainSocketImpl.socketConnect(Native Method)
	at java.base/java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:412)
	at java.base/java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:255)
	at java.base/java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:237)
	at java.base/java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
	at java.base/java.net.Socket.connect(Socket.java:609)
	at java.base/sun.security.ssl.SSLSocketImpl.connect(SSLSocketImpl.java:305)
	at java.base/sun.net.NetworkClient.doConnect(NetworkClient.java:177)
	at java.base/sun.net.www.http.HttpClient.openServer(HttpClient.java:507)
	at java.base/sun.net.www.http.HttpClient.openServer(HttpClient.java:602)
	at java.base/sun.net.www.protocol.https.HttpsClient.<init>(HttpsClient.java:266)
	at java.base/sun.net.www.protocol.https.HttpsClient.New(HttpsClient.java:373)
	at java.base/sun.net.www.protocol.https.AbstractDelegateHttpsURLConnection.getNewHttpClient(AbstractDelegateHttpsURLConnection.java:207)
	at java.base/sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1187)
	at java.base/sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:1081)
	at java.base/sun.net.www.protocol.https.AbstractDelegateHttpsURLConnection.connect(AbstractDelegateHttpsURLConnection.java:193)
	at java.base/sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1592)
	at java.base/sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1520)
	at java.base/sun.net.www.protocol.https.HttpsURLConnectionImpl.getInputStream(HttpsURLConnectionImpl.java:250)
	at org.gradle.wrapper.Download.downloadInternal(Download.java:87)
	... 7 more
We have proxies in the environment configuration, but I can't understand what problem we are facing. I should probably explain why I want to do this: we used the quickstart guide and launched our DataHub successfully, but now I want to make changes such as creating aspects, and I can't find those objects in any of the containers.
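The Gradle wrapper does not pick up http_proxy/https_proxy from the shell, so behind a corporate proxy the download times out unless the proxy is configured for Gradle itself. A sketch of a gradle.properties (in the repo root or in ~/.gradle/), with placeholder host and port:
# gradle.properties - placeholder proxy host/port, adjust to your environment
systemProp.http.proxyHost=proxy.mycompany.example
systemProp.http.proxyPort=3128
systemProp.https.proxyHost=proxy.mycompany.example
systemProp.https.proxyPort=3128
# If the proxy requires credentials:
# systemProp.https.proxyUser=<user>
# systemProp.https.proxyPassword=<password>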

    better-fireman-33387

    04/19/2023, 12:23 PM
Hi guys, I need your help. I deleted all the Elasticsearch indices and then restored them using the restore-indices job (deployed on k8s):
    kubectl create job --from=cronjob/datahub-datahub-restore-indices-job-template datahub-restore-indices-adhoc
but the UI is empty and I don't see any of my data and ingestions. Can anyone assist, please?
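Before re-running anything, it's worth confirming the ad-hoc job actually completed and that the DataHub indices were recreated; a small diagnostic sketch with the job name from the command above and a placeholder namespace and Elasticsearch service:
# Did the restore job finish, and what did it log?
kubectl get jobs -n datahub
kubectl logs -n datahub job/datahub-restore-indices-adhoc --tail=100

# From a pod that can reach Elasticsearch, check whether the indices exist again
curl -s http://elasticsearch-master:9200/_cat/indices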

    cuddly-dress-27487

    04/19/2023, 1:43 PM
Hello everyone! I'm testing DataHub on my local machine and trying to import Looker metadata using the looker module, but I'm encountering this error. Does anyone know what might be happening here?
    Copy code
    [2023-04-19 12:57:09,481] ERROR    {datahub.entrypoints:195} - Command failed: 'str' object is not callable
    traceback
    Copy code
    Traceback (most recent call last):
      File "/tmp/datahub/ingest/venv-looker-0.10.2/lib/python3.10/site-packages/datahub/entrypoints.py", line 182, in main
        sys.exit(datahub(standalone_mode=False, **kwargs))
      File "/tmp/datahub/ingest/venv-looker-0.10.2/lib/python3.10/site-packages/click/core.py", line 1130, in __call__
        return self.main(*args, **kwargs)
      File "/tmp/datahub/ingest/venv-looker-0.10.2/lib/python3.10/site-packages/click/core.py", line 1055, in main
        rv = self.invoke(ctx)
      File "/tmp/datahub/ingest/venv-looker-0.10.2/lib/python3.10/site-packages/click/core.py", line 1657, in invoke
        return _process_result(sub_ctx.command.invoke(sub_ctx))
      File "/tmp/datahub/ingest/venv-looker-0.10.2/lib/python3.10/site-packages/click/core.py", line 1657, in invoke
        return _process_result(sub_ctx.command.invoke(sub_ctx))
      File "/tmp/datahub/ingest/venv-looker-0.10.2/lib/python3.10/site-packages/click/core.py", line 1404, in invoke
        return ctx.invoke(self.callback, **ctx.params)
      File "/tmp/datahub/ingest/venv-looker-0.10.2/lib/python3.10/site-packages/click/core.py", line 760, in invoke
        return __callback(*args, **kwargs)
      File "/tmp/datahub/ingest/venv-looker-0.10.2/lib/python3.10/site-packages/click/decorators.py", line 26, in new_func
        return f(get_current_context(), *args, **kwargs)
      File "/tmp/datahub/ingest/venv-looker-0.10.2/lib/python3.10/site-packages/datahub/telemetry/telemetry.py", line 379, in wrapper
        raise e
      File "/tmp/datahub/ingest/venv-looker-0.10.2/lib/python3.10/site-packages/datahub/telemetry/telemetry.py", line 334, in wrapper
        res = func(*args, **kwargs)
      File "/tmp/datahub/ingest/venv-looker-0.10.2/lib/python3.10/site-packages/datahub/utilities/memory_leak_detector.py", line 95, in wrapper
        return func(ctx, *args, **kwargs)
      File "/tmp/datahub/ingest/venv-looker-0.10.2/lib/python3.10/site-packages/datahub/cli/ingest_cli.py", line 198, in run
        loop.run_until_complete(run_func_check_upgrade(pipeline))
      File "/usr/local/lib/python3.10/asyncio/base_events.py", line 649, in run_until_complete
        return future.result()
      File "/tmp/datahub/ingest/venv-looker-0.10.2/lib/python3.10/site-packages/datahub/cli/ingest_cli.py", line 158, in run_func_check_upgrade
        ret = await the_one_future
      File "/tmp/datahub/ingest/venv-looker-0.10.2/lib/python3.10/site-packages/datahub/cli/ingest_cli.py", line 149, in run_pipeline_async
        return await loop.run_in_executor(
      File "/usr/local/lib/python3.10/concurrent/futures/thread.py", line 58, in run
        result = self.fn(*self.args, **self.kwargs)
      File "/tmp/datahub/ingest/venv-looker-0.10.2/lib/python3.10/site-packages/datahub/cli/ingest_cli.py", line 140, in run_pipeline_to_completion
        raise e
      File "/tmp/datahub/ingest/venv-looker-0.10.2/lib/python3.10/site-packages/datahub/cli/ingest_cli.py", line 132, in run_pipeline_to_completion
        pipeline.run()
      File "/tmp/datahub/ingest/venv-looker-0.10.2/lib/python3.10/site-packages/datahub/ingestion/run/pipeline.py", line 359, in run
        for wu in itertools.islice(
      File "/tmp/datahub/ingest/venv-looker-0.10.2/lib/python3.10/site-packages/datahub/utilities/source_helpers.py", line 91, in auto_stale_entity_removal
        for wu in stream:
      File "/tmp/datahub/ingest/venv-looker-0.10.2/lib/python3.10/site-packages/datahub/utilities/source_helpers.py", line 42, in auto_status_aspect
        for wu in stream:
      File "/tmp/datahub/ingest/venv-looker-0.10.2/lib/python3.10/site-packages/datahub/ingestion/source/looker/looker_source.py", line 1265, in get_workunits_internal
        ) = job.result()
      File "/usr/local/lib/python3.10/concurrent/futures/_base.py", line 451, in result
        return self.__get_result()
      File "/usr/local/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
        raise self._exception
      File "/usr/local/lib/python3.10/concurrent/futures/thread.py", line 58, in run
        result = self.fn(*self.args, **self.kwargs)
      File "/tmp/datahub/ingest/venv-looker-0.10.2/lib/python3.10/site-packages/datahub/ingestion/source/looker/looker_source.py", line 1145, in process_dashboard
        metric_dim_workunits = self.process_metrics_dimensions_and_fields_for_dashboard(
      File "/tmp/datahub/ingest/venv-looker-0.10.2/lib/python3.10/site-packages/datahub/ingestion/source/looker/looker_source.py", line 982, in process_metrics_dimensions_and_fields_for_dashboard
        chart_mcps = [
      File "/tmp/datahub/ingest/venv-looker-0.10.2/lib/python3.10/site-packages/datahub/ingestion/source/looker/looker_source.py", line 983, in <listcomp>
        self._make_metrics_dimensions_chart_mcp(element, dashboard)
      File "/tmp/datahub/ingest/venv-looker-0.10.2/lib/python3.10/site-packages/datahub/ingestion/source/looker/looker_source.py", line 1087, in _make_metrics_dimensions_chart_mcp
        fields=self._input_fields_from_dashboard_element(dashboard_element)
      File "/tmp/datahub/ingest/venv-looker-0.10.2/lib/python3.10/site-packages/datahub/ingestion/source/looker/looker_source.py", line 1021, in _input_fields_from_dashboard_element
        explore = self.explore_registry.get_explore(
      File "/tmp/datahub/ingest/venv-looker-0.10.2/lib/python3.10/site-packages/datahub/ingestion/source/looker/looker_common.py", line 925, in get_explore
        looker_explore = LookerExplore.from_api(
      File "/tmp/datahub/ingest/venv-looker-0.10.2/lib/python3.10/site-packages/datahub/ingestion/source/looker/looker_common.py", line 628, in from_api
        from datahub.ingestion.source.looker.lookml_source import _BASE_PROJECT_NAME
      File "/tmp/datahub/ingest/venv-looker-0.10.2/lib/python3.10/site-packages/datahub/ingestion/source/looker/lookml_source.py", line 85, in <module>
        from datahub.utilities.sql_parser import SQLParser
      File "/tmp/datahub/ingest/venv-looker-0.10.2/lib/python3.10/site-packages/datahub/utilities/sql_parser.py", line 9, in <module>
        from datahub.utilities.sql_lineage_parser_impl import SqlLineageSQLParserImpl
      File "/tmp/datahub/ingest/venv-looker-0.10.2/lib/python3.10/site-packages/datahub/utilities/sql_lineage_parser_impl.py", line 8, in <module>
        from sqllineage.core.holders import Column, SQLLineageHolder
      File "/tmp/datahub/ingest/venv-looker-0.10.2/lib/python3.10/site-packages/sqllineage/__init__.py", line 41, in <module>
        _monkey_patch()
      File "/tmp/datahub/ingest/venv-looker-0.10.2/lib/python3.10/site-packages/sqllineage/__init__.py", line 35, in _monkey_patch
        _patch_updating_lateral_view_lexeme()
      File "/tmp/datahub/ingest/venv-looker-0.10.2/lib/python3.10/site-packages/sqllineage/__init__.py", line 24, in _patch_updating_lateral_view_lexeme
        if regex("LATERAL VIEW EXPLODE(col)"):
    TypeError: 'str' object is not callable

    rough-florist-86890

    04/19/2023, 2:04 PM
Hi guys, I have only been using DataHub for about an hour, but I keep running into path problems. I wanted to use the demo data, however I get an error (on Windows):
datahub.configuration.common.ConfigurationError: Cannot read remote file C:\Users\MYNAME\AppData\Local\Temp\tmp2hvw6vrv.json, error: No connection adapters were found for 'C:\Users\MYNAME\AppData\Local\Temp\tmp2hvw6vrv.json'
My .yaml looks like this:
source:
  type: demo-data
  config: {}
sink:
  type: "datahub-rest"
  config:
    server: "http://localhost:8080"
This is the terminal input:
python3 -m datahub ingest -c .\recipe_json.dhub.yaml
How can I solve this problem? Thanks in advance!
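The error suggests the temporary JSON path is being treated as a remote URL, which seems to happen with the demo-data source on Windows. If the goal is simply to load the bundled sample metadata into a quickstart instance, the CLI has a dedicated command that skips the recipe entirely (hedged: available in recent CLI versions):
datahub docker ingest-sample-data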

    nice-rose-9274

    04/19/2023, 5:30 PM
hey all - not sure if this is the correct channel, but I'm having trouble figuring out how to specify negative permissions and hope someone can help. For example, if I have tagged a table "PII", is there a way to restrict users from being able to view the "Stats" tab on that table? Thanks in advance!
    πŸ” 1
    πŸ“– 1
    l
    a
    • 3
    • 2
  • r

    rich-crowd-33361

    04/19/2023, 11:54 PM
    I am having issues while installing

    rich-crowd-33361

    04/19/2023, 11:54 PM
    can someone help?

    faint-boots-14622

    04/20/2023, 1:11 AM
Hey all, I have just started adding DataHub to our data stack, mainly for cataloging BigQuery data, and I ran into some issues during data ingestion. All the tables are being stored, but no metadata is being imported for the views and materialized views, and I'm also getting errors when ingesting datasets that contain views or materialized views. It would be great if you could help me out. The log is below:
    Copy code
    ~~~~ Execution Summary - RUN_INGEST ~~~~
    Execution finished with errors.
    {'exec_id': 'ed590018-cf94-455f-8815-3a9835f40d80',
     'infos': ['2023-04-20 00:27:45.091260 INFO: Starting execution for task with name=RUN_INGEST',
               "2023-04-20 00:28:51.873290 INFO: Failed to execute 'datahub ingest'",
               '2023-04-20 00:28:51.875247 INFO: Caught exception EXECUTING task_id=ed590018-cf94-455f-8815-3a9835f40d80, name=RUN_INGEST, '
               'stacktrace=Traceback (most recent call last):\n'
               '  File "/usr/local/lib/python3.10/site-packages/acryl/executor/execution/default_executor.py", line 122, in execute_task\n'
               '    task_event_loop.run_until_complete(task_future)\n'
               '  File "/usr/local/lib/python3.10/asyncio/base_events.py", line 649, in run_until_complete\n'
               '    return future.result()\n'
               '  File "/usr/local/lib/python3.10/site-packages/acryl/executor/execution/sub_process_ingestion_task.py", line 231, in execute\n'
               '    raise TaskError("Failed to execute \'datahub ingest\'")\n'
               "acryl.executor.execution.task.TaskError: Failed to execute 'datahub ingest'\n"],
     'errors': []}
    
    ~~~~ Ingestion Report ~~~~
    {
      "cli": {
        "cli_version": "0.10.1",
        "cli_entry_location": "/tmp/datahub/ingest/venv-bigquery-0.10.1/lib/python3.10/site-packages/datahub/__init__.py",
        "py_version": "3.10.10 (main, Mar 14 2023, 02:37:11) [GCC 10.2.1 20210110]",
        "py_exec_path": "/tmp/datahub/ingest/venv-bigquery-0.10.1/bin/python3",
        "os_details": "Linux-5.15.0-1030-gcp-x86_64-with-glibc2.31",
        "peak_memory_usage": "308.13 MB",
        "mem_info": "308.13 MB"
      },
      "source": {
        "type": "bigquery",
        "report": {
          "events_produced": 68,
          "events_produced_per_sec": 1,
          "entities": {
            "container": [
              "urn:li:container:770462b97050e805bf0bbf39a8ba0334",
       
            ],
            "dataset": [
              "urn:li:dataset:(urn:li:dataPlatform:bigquery,)",
              "urn:li:dataset:(urn:li:dataPlatform:bigquery)",
     
              "... sampled of 20 total elements"
            ]
          },
          "aspects": {
            "container": {
              "containerProperties": 4,
              "status": 9,
              "dataPlatformInstance": 4,
              "subTypes": 4,
              "domains": 4,
              "container": 2
            },
            "dataset": {
              "status": 20,
              "schemaMetadata": 3,
              "datasetProperties": 2,
              "container": 4,
              "subTypes": 2,
              "domains": 4,
              "datasetUsageStatistics": 4,
              "datasetProfile": 1,
              "upstreamLineage": 1
            }
          },
          "warnings": {
            "profile skipped as partitioned table is empty or partition id was invalid": [
              table
            ]
          },
          "failures": {
            "metadata-extraction": [
              "tableid" - Unable to get tables for dataset ATE in project jade-2022-ee, skipping. Does your service account has bigquery.tables.list, bigquery.routines.get, bigquery.routines.list permission, bigquery.tables.getData permission? The error was: 'int' object has no attribute 'timestamp' - Traceback (most recent call last):\n  File \"/tmp/datahub/ingest/venv-bigquery-0.10.1/lib/python3.10/site-packages/datahub/ingestion/source/bigquery_v2/bigquery.py\", line 626, in _process_project\n    yield from self._process_schema(\n  File \"/tmp/datahub/ingest/venv-bigquery-0.10.1/lib/python3.10/site-packages/datahub/ingestion/source/bigquery_v2/bigquery.py\", line 782, in _process_schema\n    yield from self._process_view(\n  File \"/tmp/datahub/ingest/venv-bigquery-0.10.1/lib/python3.10/site-packages/datahub/ingestion/source/bigquery_v2/bigquery.py\", line 882, in _process_view\n    yield from self.gen_view_dataset_workunits(\n  File \"/tmp/datahub/ingest/venv-bigquery-0.10.1/lib/python3.10/site-packages/datahub/ingestion/source/bigquery_v2/bigquery.py\", line 951, in gen_view_dataset_workunits\n    yield from self.gen_dataset_workunits(\n  File \"/tmp/datahub/ingest/venv-bigquery-0.10.1/lib/python3.10/site-packages/datahub/ingestion/source/bigquery_v2/bigquery.py\", line 1007, in gen_dataset_workunits\n    lastModified=TimeStamp(time=int(table.last_altered.timestamp() * 1000))\nAttributeError: 'int' object has no attribute 'timestamp'\n"
            ]
          },
          "soft_deleted_stale_entities": [
            "urn:li:dataset:(urn:li",
    
          ],
          "tables_scanned": 2,
          "views_scanned": 1,
          "entities_profiled": 1,
          "filtered": [
            "... sampled of 33 total elements"
          ],
          "query_combiner": {
            "total_queries": 23,
            "uncombined_queries_issued": 10,
            "combined_queries_issued": 3,
            "queries_combined": 13,
            "query_exceptions": 0
          },
          "profiling_skipped_not_updated": {},
          "profiling_skipped_size_limit": {},
          "profiling_skipped_row_limit": {},
          "num_tables_not_eligible_profiling": {},
          "num_total_lineage_entries": {
          },
          "num_skipped_lineage_entries_missing_data": {},
          "num_skipped_lineage_entries_not_allowed": {
          },
          "num_lineage_entries_sql_parser_failure": {},
          "num_lineage_entries_sql_parser_success": {},
          "num_skipped_lineage_entries_other": {},
          "num_total_log_entries": {
          },
          "num_parsed_log_entries": {
          },
          "num_total_audit_entries": {},
          "num_parsed_audit_entries": {},
          "lineage_failed_extraction": [],
          "lineage_metadata_entries": {
          },
          "lineage_mem_size": {
    
          },
          "include_table_lineage": true,
          "use_date_sharded_audit_log_tables": false,
          "log_page_size": 1000,
          "use_exported_bigquery_audit_metadata": false,
          "log_entry_start_time": "2023-04-18T23:45:00Z",
          "log_entry_end_time": "2023-04-20T00:42:54Z",
          "upstream_lineage": {},
          "partition_info": {},
          "profile_table_selection_criteria": {},
          "selected_profile_tables": {},
          "invalid_partition_ids": {},
          "num_usage_workunits_emitted": 4,
          "total_query_log_entries": 507,
          "num_read_events": 34,
          "num_query_events": 220,
          "num_filtered_read_events": 253,
          "num_filtered_query_events": 0,
          "num_operational_stats_workunits_emitted": 0,
          "read_reasons_stat": {
            "JOB": 32,
            "TABLEDATA_LIST_REQUEST": 2
          },
          "operation_types_stat": {
            "SELECT": 24
          },
          "current_project_status": {
            "ultimate-bit-334500": {
              "Lineage Extraction": "2023-04-20 00:28:45.591964 (3.96 seconds ago)"
            }
          },
          "start_time": "2023-04-20 00:27:54.786969 (54.76 seconds ago)",
          "running_time": "54.76 seconds"
        }
      },
      "sink": {
        "type": "datahub-rest",
        "report": {
          "total_records_written": 54,
          "records_written_per_second": 0,
          "warnings": [],
          "failures": [],
          "start_time": "2023-04-20 00:27:47.753399 (1 minute and 1.8 seconds ago)",
          "current_time": "2023-04-20 00:28:49.549549 (now)",
          "total_duration_in_seconds": 61.8,
          "gms_version": "v0.10.1",
          "pending_requests": 0
        }
      }
    }

    rich-crowd-33361

    04/20/2023, 1:14 AM
Hi everyone, I am trying to install and am stuck. Please help me.

    helpful-raincoat-19244

    04/20/2023, 7:58 AM
Hi everyone, since we updated our DataHub version to the latest version on master yesterday, we have encountered an error when building the project. A file containing gradle-plugins seems to be missing: https://repo.maven.apache.org/maven2/com/linkedin/pegasus/gradle-plugins/29.22.16/gradle-plugins-29.22.16.pom could not be found.
    πŸ” 1
    πŸ“– 1
    l
    a
    • 3
    • 3
  • q

    quiet-rain-16785

    04/20/2023, 8:50 AM
Hi guys, I am using a custom_action but I am not able to parse the parameters JSON from the event. Can anyone provide me with that code?
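A rough sketch of a custom action that digs the parameters out of each event. The import paths follow the Actions framework's developer guide, but as_json() and the exact key layout of the event payload (especially the "parameters" field) are assumptions to verify by printing a real event first:
import json
from typing import Optional

from datahub_actions.action.action import Action
from datahub_actions.event.event_envelope import EventEnvelope
from datahub_actions.pipeline.pipeline_context import PipelineContext


class ParamLoggingAction(Action):
    """Print the parameters attached to each incoming event."""

    @classmethod
    def create(cls, config_dict: dict, ctx: PipelineContext) -> "Action":
        return cls(ctx)

    def __init__(self, ctx: PipelineContext) -> None:
        self.ctx = ctx

    def act(self, event: EventEnvelope) -> None:
        # Serialize the whole envelope, then pull out the (assumed) parameters key.
        payload = json.loads(event.as_json())
        params: Optional[dict] = payload.get("event", {}).get("parameters")
        print(f"event_type={payload.get('event_type')} parameters={params}")

    def close(self) -> None:
        pass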

    faint-hair-91313

    04/20/2023, 9:19 AM
Hello all, I am trying to update to v0.10.2 but I am getting this error in GMS after deploying with Docker containers.
    2023-04-20 09:14:41,111 [pool-14-thread-1] ERROR c.d.m.ingestion.IngestionScheduler:244 - Failed to retrieve ingestion sources! Skipping updating schedule cache until next refresh. start: 0, count: 30
    com.linkedin.r2.RemoteInvocationException: com.linkedin.r2.RemoteInvocationException: Failed to get response from server for URI <http://localhost:8080/entities>
    at com.linkedin.restli.internal.client.ExceptionUtil.wrapThrowable(ExceptionUtil.java:135)
    at com.linkedin.restli.internal.client.ResponseFutureImpl.getResponseImpl(ResponseFutureImpl.java:130)
    at com.linkedin.restli.internal.client.ResponseFutureImpl.getResponse(ResponseFutureImpl.java:94)
    at com.linkedin.common.client.BaseClient.sendClientRequest(BaseClient.java:51)
    at com.linkedin.entity.client.RestliEntityClient.list(RestliEntityClient.java:374)
    at com.datahub.metadata.ingestion.IngestionScheduler$BatchRefreshSchedulesRunnable.run(IngestionScheduler.java:220)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305)
    at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:829)
    Caused by: com.linkedin.r2.RemoteInvocationException: Failed to get response from server for URI <http://localhost:8080/entities>
    at com.linkedin.r2.transport.http.common.HttpBridge$1.onResponse(HttpBridge.java:67)
    at com.linkedin.r2.transport.http.client.rest.ExecutionCallback.lambda$onResponse$0(ExecutionCallback.java:64)
    ... 3 common frames omitted
    Caused by: com.linkedin.r2.RetriableRequestException: io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: localhost/127.0.0.1:8080
    at com.linkedin.r2.transport.http.client.common.ChannelPoolLifecycle.onError(ChannelPoolLifecycle.java:142)
    at com.linkedin.r2.transport.http.client.common.ChannelPoolLifecycle.lambda$create$0(ChannelPoolLifecycle.java:97)
    at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:590)
    at io.netty.util.concurrent.DefaultPromise.notifyListeners0(DefaultPromise.java:583)
    at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:559)
    at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:492)
    at io.netty.util.concurrent.DefaultPromise.setValue0(DefaultPromise.java:636)
    at io.netty.util.concurrent.DefaultPromise.setFailure0(DefaultPromise.java:629)
    at io.netty.util.concurrent.DefaultPromise.tryFailure(DefaultPromise.java:118)
    at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.fulfillConnectPromise(AbstractNioChannel.java:321)
    at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:337)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:776)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562)
    at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
    at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
    ... 1 common frames omitted
    Caused by: io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: localhost/127.0.0.1:8080
    Caused by: java.net.ConnectException: Connection refused
    at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:777)
    at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:337)
    at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:334)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:776)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562)
    at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
    at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
    at java.base/java.lang.Thread.run(Thread.java:829)