# troubleshoot
  • c

    clever-garden-23538

    09/14/2022, 4:10 PM
    is it possible to define policies through a configuration file?
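(For reference: DataHub's GMS bootstraps its default access policies from a `policies.json` file shipped in the image, so file-based policy definition does exist at the bootstrap level. Below is a sketch of what one policy entry can look like; the URN and exact field names are illustrative and vary by version, so compare against the `policies.json` in your GMS image before relying on it.)

```json
{
  "urn": "urn:li:dataHubPolicy:view-analytics-all-users",
  "info": {
    "type": "PLATFORM",
    "state": "ACTIVE",
    "displayName": "All Users - View Analytics",
    "privileges": ["VIEW_ANALYTICS"],
    "actors": { "allUsers": true, "allGroups": false, "resourceOwners": false }
  }
}
```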
  • b

    better-spoon-77762

    09/14/2022, 4:30 PM
Hello! Is there a flag we can set to start seeing
log.debug
output in the pod/container console logs?
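(Not a full answer to the pod-level question, but on the CLI side there is a global debug flag that turns on verbose logging for a single run; the recipe filename below is a placeholder. Log verbosity of the GMS/frontend pods themselves is configured separately in their logging config.)

```shell
# The datahub CLI accepts a global --debug flag (before the subcommand)
# that enables debug-level logging for that invocation.
datahub --debug ingest -c recipe.yml
```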
  • q

    quiet-school-18370

    09/14/2022, 7:23 PM
    'acryl.executor.execution.task.TaskError: Failed to resolve secret with name DBT_PROJECT_ROOT. Aborting recipe execution.\n']}
Can anyone please help me resolve this error? I am integrating dbt with DataHub.
  • b

    best-sunset-26241

    09/14/2022, 9:53 PM
Hi everyone, I am having a problem deploying the quickstart on my M1 Mac using Docker:
no matching manifest for linux/arm64/v8 in the manifest list entries
I saw that there are 3 issues about this topic marked as closed (here, here, and here), but none of the solutions presented there solved my problem. I already tried datahub docker quickstart --quickstart-compose-file [path in my machine]/docker-compose-without-neo4j.quickstart.yml, but it didn't work. Could someone help me, please? 🙌🏼
  • p

    proud-table-38689

    09/14/2022, 10:14 PM
    why would I see
Received 503 from http://service-endpoint-for-gms:8080/health
    from my datahub actions container, but when I exec in and run
curl -I http://service-endpoint-for-gms:8080/health
    I see
    HTTP/1.1 200 OK
    Date: Wed, 14 Sep 2022 22:11:46 GMT
    Content-Length: 0
Server: Jetty(9.4.46.v20220331)
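(A debugging sketch for the 503-vs-200 mismatch above: curl -I sends a HEAD request while most client libraries send GET, and intermittent 503s often come from a load balancer or a GMS replica that is still starting rather than from the endpoint itself. The service URL is the placeholder from the question.)

```shell
# Compare GET vs HEAD status codes against the same endpoint.
GMS=http://service-endpoint-for-gms:8080   # placeholder from the question
curl -sS  -o /dev/null -w 'GET  /health -> %{http_code}\n' "$GMS/health"
curl -sSI -o /dev/null -w 'HEAD /health -> %{http_code}\n' "$GMS/health"
# Repeat a few times: a mix of 200s and 503s suggests one unhealthy replica
# behind the service rather than a consistently broken endpoint.
```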
  • n

    numerous-account-62719

    09/15/2022, 7:25 AM
Hi team, I am trying to execute the ingestion pipeline through the UI and am getting the following error. Please help me out:
    ~~~~ Execution Summary ~~~~
    
    RUN_INGEST - {'errors': [],
     'exec_id': 'ffb67bca-4da2-40f2-b846-34c17e167ce9',
     'infos': ['2022-09-15 07:35:04.529251 [exec_id=ffb67bca-4da2-40f2-b846-34c17e167ce9] INFO: Starting execution for task with name=RUN_INGEST',
               '2022-09-15 07:35:08.306135 [exec_id=ffb67bca-4da2-40f2-b846-34c17e167ce9] INFO: stdout=Requirement already satisfied: pip in '
               '/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages (21.2.4)\n'
               'ERROR: Exception:\n'
               'Traceback (most recent call last):\n'
               '  File "/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages/pip/_internal/cli/base_command.py", '
               'line 173, in _main\n'
               '    status = self.run(options, args)\n'
               '    state = resolution.resolve(requirements, max_rounds=max_rounds)\n'
               '  File "/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages/pip/_vendor/resolvelib/resolvers.py", '
               'line 341, in resolve\n'
               '    resp = self.send(prep, **send_kwargs)\n'
               '  File "/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages/pip/_vendor/requests/sessions.py", '
               'line 655, in send\n'
               '    r = adapter.send(request, **kwargs)\n'
               '  File "/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages/pip/_vendor/cachecontrol/adapter.py", '
    
               'ValueError: check_hostname requires server_hostname\n'
               'ERROR: Exception:\n'
               'Traceback (most recent call last):\n'
               '  File "/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages/pip/_internal/cli/base_command.py", '
               'line 173, in _main\n'
               '    status = self.run(options, args)\n'
               '  File "/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages/pip/_internal/cli/req_command.py", '
               'line 203, in wrapper\n'
               '    return func(self, options, args)\n'
               '  File "/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages/pip/_internal/commands/install.py", '
               'line 315, in run\n'
               '    requirement_set = resolver.resolve(\n'
               '  File '
               '"/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages/pip/_internal/resolution/resolvelib/resolver.py", '
               'line 94, in resolve\n'
               '    result = self._result = resolver.resolve(\n'
               '  File "/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages/pip/_vendor/resolvelib/resolvers.py", '
               'line 472, in resolve\n'
               '    state = resolution.resolve(requirements, max_rounds=max_rounds)\n'
               '  File "/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages/pip/_vendor/resolvelib/resolvers.py", '
               'line 341, in resolve\n'
               '    self._add_to_criteria(self.state.criteria, r, parent=None)\n'
               '  File "/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages/pip/_vendor/resolvelib/resolvers.py", '
               'line 172, in _add_to_criteria\n'
               '    if not criterion.candidates:\n'
               '  File "/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages/pip/_vendor/resolvelib/structs.py", '
               'line 151, in __bool__\n'
               '    return bool(self._sequence)\n'
               '  File '
               '"/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages/pip/_internal/resolution/resolvelib/found_candidates.py", '
               'line 140, in __bool__\n'
               '    return any(self)\n'
               '  File '
               '"/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages/pip/_internal/resolution/resolvelib/found_candidates.py", '
               'line 128, in <genexpr>\n'
               '    return (c for c in iterator if id(c) not in self._incompatible_ids)\n'
               '  File '
               '"/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages/pip/_internal/resolution/resolvelib/found_candidates.py", '
               'line 29, in _iter_built\n'
               '    for version, func in infos:\n'
               '  File '
               '"/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages/pip/_internal/resolution/resolvelib/factory.py", '
               'line 272, in iter_index_candidate_infos\n'
               '    result = self._finder.find_best_candidate(\n'
               '  File '
               '"/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages/pip/_internal/index/package_finder.py", line '
               '851, in find_best_candidate\n'
               '    candidates = self.find_all_candidates(project_name)\n'
               '  File '
               '"/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages/pip/_internal/index/package_finder.py", line '
               '798, in find_all_candidates\n'
               '    page_candidates = list(page_candidates_it)\n'
               '  File "/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages/pip/_internal/index/sources.py", line '
               '134, in page_candidates\n'
               '    yield from self._candidates_from_page(self._link)\n'
               '  File '
               '"/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages/pip/_internal/index/package_finder.py", line '
               '758, in process_project_url\n'
               '    html_page = self._link_collector.fetch_page(project_url)\n'
               '  File "/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages/pip/_internal/index/collector.py", '
               'line 490, in fetch_page\n'
               '    return _get_html_page(location, session=self.session)\n'
               '  File "/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages/pip/_internal/index/collector.py", '
               'line 400, in _get_html_page\n'
               '    resp = _get_html_response(url, session=session)\n'
               '  File "/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages/pip/_internal/index/collector.py", '
               'line 115, in _get_html_response\n'
               '    resp = session.get(\n'
               '  File "/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages/pip/_vendor/requests/sessions.py", '
               'line 555, in get\n'
               "    return self.request('GET', url, **kwargs)\n"
               '  File "/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages/pip/_internal/network/session.py", '
               'line 454, in request\n'
               '    return super().request(method, url, *args, **kwargs)\n'
               '  File "/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages/pip/_vendor/requests/sessions.py", '
               'line 542, in request\n'
               '    resp = self.send(prep, **send_kwargs)\n'
               '  File "/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages/pip/_vendor/requests/sessions.py", '
               'line 655, in send\n'
               '    r = adapter.send(request, **kwargs)\n'
               '  File "/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages/pip/_vendor/cachecontrol/adapter.py", '
               'line 53, in send\n'
               '    resp = super(CacheControlAdapter, self).send(request, **kw)\n'
               '  File "/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages/pip/_vendor/requests/adapters.py", '
               'line 439, in send\n'
               '    resp = conn.urlopen(\n'
               '  File '
               '"/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages/pip/_vendor/urllib3/connectionpool.py", line '
               '696, in urlopen\n'
               '    self._prepare_proxy(conn)\n'
               '  File '
               '"/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages/pip/_vendor/urllib3/connectionpool.py", line '
               '964, in _prepare_proxy\n'
               '    conn.connect()\n'
               '  File "/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages/pip/_vendor/urllib3/connection.py", '
               'line 359, in connect\n'
               '    conn = self._connect_tls_proxy(hostname, conn)\n'
               '  File "/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages/pip/_vendor/urllib3/connection.py", '
               'line 500, in _connect_tls_proxy\n'
               '    return ssl_wrap_socket(\n'
               '  File "/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages/pip/_vendor/urllib3/util/ssl_.py", '
               'line 453, in ssl_wrap_socket\n'
               '    ssl_sock = _ssl_wrap_socket_impl(sock, context, tls_in_tls)\n'
               '  File "/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/lib/python3.9/site-packages/pip/_vendor/urllib3/util/ssl_.py", '
               'line 495, in _ssl_wrap_socket_impl\n'
               '    return ssl_context.wrap_socket(sock)\n'
               '  File "/usr/local/lib/python3.9/ssl.py", line 500, in wrap_socket\n'
               '    return self.sslsocket_class._create(\n'
               '  File "/usr/local/lib/python3.9/ssl.py", line 997, in _create\n'
               '    raise ValueError("check_hostname requires server_hostname")\n'
               'ValueError: check_hostname requires server_hostname\n'
               '/tmp/datahub/ingest/venv-ffb67bca-4da2-40f2-b846-34c17e167ce9/bin/python3: No module named datahub\n',
               "2022-09-15 07:35:08.306472 [exec_id=ffb67bca-4da2-40f2-b846-34c17e167ce9] INFO: Failed to execute 'datahub ingest'",
               '2022-09-15 07:35:08.307260 [exec_id=ffb67bca-4da2-40f2-b846-34c17e167ce9] INFO: Caught exception EXECUTING '
               'task_id=ffb67bca-4da2-40f2-b846-34c17e167ce9, name=RUN_INGEST, stacktrace=Traceback (most recent call last):\n'
               '  File "/usr/local/lib/python3.9/site-packages/acryl/executor/execution/default_executor.py", line 121, in execute_task\n'
               '    self.event_loop.run_until_complete(task_future)\n'
               '  File "/usr/local/lib/python3.9/site-packages/nest_asyncio.py", line 89, in run_until_complete\n'
               '    return f.result()\n'
               '  File "/usr/local/lib/python3.9/asyncio/futures.py", line 201, in result\n'
               '    raise self._exception\n'
               '  File "/usr/local/lib/python3.9/asyncio/tasks.py", line 256, in __step\n'
               '    result = coro.send(None)\n'
               '  File "/usr/local/lib/python3.9/site-packages/acryl/executor/execution/sub_process_ingestion_task.py", line 115, in execute\n'
               '    raise TaskError("Failed to execute \'datahub ingest\'")\n'
               "acryl.executor.execution.task.TaskError: Failed to execute 'datahub ingest'\n"]}
    Execution finished with errors.
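(An aside on the root cause: the "ValueError: check_hostname requires server_hostname" at the bottom of the pip traceback is raised by Python's ssl module itself, before any network I/O, and pip 21.2's bundled urllib3 hits it when tunnelling through a proxy configured with an https:// URL. A minimal offline reproduction:)

```shell
# Reproduce the exact ValueError from the log: a default SSLContext has
# check_hostname enabled, so wrapping a socket without a server_hostname
# fails immediately, with no network involved.
python3 - <<'EOF'
import socket, ssl
ctx = ssl.create_default_context()
try:
    ctx.wrap_socket(socket.socket(), server_hostname=None)
except ValueError as e:
    print(e)  # check_hostname requires server_hostname
EOF
```

If the executor pod does sit behind a proxy, pointing HTTPS_PROXY at an http:// proxy URL (or upgrading pip inside the venv) is the usual workaround; the proxy URL itself is deployment-specific.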
  • w

    witty-wall-84488

    09/15/2022, 10:18 AM
Hi everyone! I am trying to delete metadata from DataHub using the CLI delete command and got the following error:
Do you want to delete these references? [y/N]: y
Failed to execute operation
java.lang.UnsupportedOperationException: Only upsert operation is supported
The same happens with all flags: --soft, --hard, --force. Could someone help me, please?
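(For context, this is the usual shape of the command, with a made-up URN for illustration: --soft marks the entity as removed while --hard also purges the underlying rows, and --force skips the confirmation prompt. The "Only upsert operation is supported" exception is thrown server-side by GMS, so it is unlikely to be fixed by changing flags alone.)

```shell
# Example soft delete of a single dataset (URN is made up for illustration)
datahub delete --urn "urn:li:dataset:(urn:li:dataPlatform:hive,fct_users_created,PROD)" --soft
```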
  • l

    lively-jackal-83760

    09/15/2022, 10:31 AM
Hi guys, today I've started to see the error
[ThreadPoolTaskExecutor-1] ERROR c.l.m.kafka.hydrator.EntityHydrator:49 - Error while calling GMS to hydrate entity for urn urn:li:corpuser:datahub
In the UI I don't see any data. What does this error mean?
  • g

    great-account-95406

    09/15/2022, 11:41 AM
Hi everyone! I've updated DataHub to v0.8.44 and am now facing an issue where ingestions can't resolve the secrets. Could anyone help me?
  • s

    salmon-angle-92685

    09/15/2022, 12:21 PM
    Hello everyone, I am using
    snowflake
    and
    snowflake-usage
to ingest all Snowflake data into DataHub. However, I get an error about an extra SnowflakeConfig field. The problem is that I am not using this field in my YAML, yet it still says I can't use this field.
    pydantic.error_wrappers.ValidationError: 1 validation error for SnowflakeUsageConfig
    stateful_ingestion -> ignore_old_state
    extra fields not permitted (type=value_error.extra)
    Message: "Failed to construct checkpoint's config from checkpoint aspect."
    Arguments: (ValidationError(model='SnowflakeUsageConfig', errors=[{'loc': ('stateful_ingestion', 'ignore_old_state'), 'msg': 'extra fields not permitted', 'type': 'value_error.extra'}]),)
What am I doing wrong? Thanks!
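(A hedged note on the error above: "Failed to construct checkpoint's config from checkpoint aspect" means a previously saved stateful-ingestion checkpoint was written with a field, ignore_old_state, that the current SnowflakeUsageConfig no longer accepts, so this is typically a version-mismatch between the old checkpoint and the new CLI rather than anything in your YAML. A sketch of the relevant recipe section, connection fields elided:)

```yaml
source:
  type: snowflake-usage
  config:
    # ...connection settings elided...
    stateful_ingestion:
      enabled: true
      # The rejected ignore_old_state key comes from an old saved checkpoint,
      # not from this file; clearing or re-creating the ingestion's state
      # (or aligning CLI versions) is the usual way out.
```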
  • b

    bumpy-whale-50799

    09/15/2022, 3:27 PM
I am getting this error while installing datahub on my Windows laptop. Any ideas?
(env2) c:\Users\USER\Desktop\datahub>python -m datahub version
Traceback (most recent call last):
  File "C:\Program Files\python39\lib\runpy.py", line 197, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "C:\Program Files\python39\lib\runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "C:\Users\USER\Desktop\datahub\env2\lib\site-packages\datahub\__main__.py", line 1, in <module>
    from datahub.entrypoints import main
  File "C:\Users\USER\Desktop\datahub\env2\lib\site-packages\datahub\entrypoints.py", line 14, in <module>
    from datahub.cli.docker_cli import docker
  File "C:\Users\USER\Desktop\datahub\env2\lib\site-packages\datahub\cli\docker_cli.py", line 60, in <module>
    subprocess.run(["uname", "-m"], stdout=subprocess.PIPE)
  File "C:\Program Files\python39\lib\subprocess.py", line 505, in run
    with Popen(*popenargs, **kwargs) as process:
  File "C:\Program Files\python39\lib\subprocess.py", line 951, in __init__
    self._execute_child(args, executable, preexec_fn, close_fds,
  File "C:\Program Files\python39\lib\subprocess.py", line 1420, in _execute_child
    hp, ht, pid, tid = _winapi.CreateProcess(executable, args,
FileNotFoundError: [WinError 2] The system cannot find the file specified
  • a

    ancient-policeman-73437

    09/15/2022, 3:56 PM
Dear support, we are trying to link DataHub with Azure AD SSO following https://datahubproject.io/docs/authentication/guides/sso/configure-oidc-react-azure/ and on login we get this error message. What would be a Reply URL?
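(In Azure AD terms, the Reply URL is the redirect URI the app sends users back to after login; the DataHub OIDC guide registers the frontend's /callback/oidc path for this. A sketch, assuming your instance is served at datahub.mycompany.com:)

```
https://datahub.mycompany.com/callback/oidc
```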
  • e

    eager-oil-39220

    09/15/2022, 9:02 PM
Hi team, our current DataHub version is v0.8.33, and I found some differences in the UI between our version and the version on the DataHub demo. On the demo website, I notice that the pipelines are put in folders (first screenshot). On our version, they are not (second screenshot). Could this difference be caused by the version gap? If so, could you tell me in which version this UI feature was introduced? Thanks!
  • c

    clever-garden-23538

    09/15/2022, 10:49 PM
I've granted the "View Analytics" permission to All Users, and now my non-admin user is able to open the Analytics page. It mostly works, except a 403 toast pops up and I get
Domains failed to load
. We don't have any domains at the moment; maybe that's related?
  • f

    fast-ice-59096

    09/16/2022, 10:54 AM
Hi everyone, I am new to the DataHub world, and I am trying to run it on my PC as a proof of concept for the company I work for. I am facing this issue: running datahub docker quickstart fails with "Unable to run quickstart: - Docker doesn't seem to be running. Did you start it?"
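(A quick sanity-check sketch: the quickstart detects Docker through the local Docker socket, so even with Docker Desktop running, a non-default socket path or socket permissions can make the check fail. The paths below are common defaults, not universal, and the Colima path is just one example.)

```shell
docker info                   # should succeed from the same shell you run datahub in
ls -l /var/run/docker.sock    # common default socket path on Linux/macOS
# If your engine listens elsewhere (e.g. rootless Docker or Colima), point
# the CLI at it explicitly before rerunning the quickstart:
export DOCKER_HOST=unix://$HOME/.colima/docker.sock   # example only, adjust to your setup
```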
  • f

    fast-ice-59096

    09/16/2022, 10:54 AM
    The docker instance is running
  • f

    fast-ice-59096

    09/16/2022, 10:54 AM
    I have checked
  • f

    fast-ice-59096

    09/16/2022, 10:55 AM
Sorry for this elementary question, but I am really new to the DataHub world.
  • m

    microscopic-mechanic-13766

    09/16/2022, 11:20 AM
Good morning, I have been facing a problem whose source I can't pin down. DataHub works just fine, but sometimes the frontend container crashes (a new container then starts without a problem and the frontend becomes available again, but there is a period of a few minutes of downtime) and prints the following error:
    [Thread-3] INFO  play.core.server.AkkaHttpServer - Stopping server...
    [application-akka.actor.default-dispatcher-227] [akka.actor.ActorSystemImpl(application)] Response stream for [GET /assets/static/js/2.f8eb32eb.chunk.js] failed with 'Processor actor [Actor[<akka://application/system/StreamSupervisor-0/flow-2327-0-detacher#-1437369062]]> terminated abruptly'. Aborting connection. (akka.stream.AbruptTerminationException: Processor actor [Actor[<akka://application/system/StreamSupervisor-0/flow-2327-0-detacher#-1437369062]]> terminated abruptly)
I am currently using the latest release of that image (
linkedin/datahub-frontend-react:v0.8.44
). Does anyone know why this happens, or has anyone had this error before? Thanks in advance!
  • b

    brave-businessperson-3969

    09/16/2022, 1:18 PM
Another question: as a user I can't see glossary entries without being given edit permission on them. Is this a bug, or am I missing some settings?
  • g

    green-lion-58215

    09/16/2022, 6:01 PM
When I ingest dbt metadata, I see "dbt:" prepended to the start of each tag. Is there a way to disable this behaviour?
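(If your CLI version supports it, the dbt source exposes a tag_prefix option whose default produces the "dbt:" prefix being observed; setting it to an empty string should drop it. A recipe sketch with other fields elided — verify the option exists in your version's dbt source documentation before using it.)

```yaml
source:
  type: dbt
  config:
    # ...manifest/catalog paths elided...
    tag_prefix: ""   # default prepends "dbt:"; empty string removes the prefix
```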
  • a

    able-evening-90828

    09/17/2022, 12:20 AM
    Has anyone run into the following issue where the GMS kept crashing during the bootstrap step of
    IngestRootUserStep
    ?
    io.ebean.AcquireLockException: Error when batch flush on sql: update metadata_aspect_v2 set metadata=?, createdOn=?, createdBy=?, createdFor=?, systemmetadata=? where urn=? and aspect=? and version=?
    	at io.ebean.config.dbplatform.SqlCodeTranslator.translate(SqlCodeTranslator.java:44)
    	at io.ebean.config.dbplatform.DatabasePlatform.translate(DatabasePlatform.java:219)
    	at io.ebeaninternal.server.transaction.TransactionManager.translate(TransactionManager.java:246)
    	at io.ebeaninternal.server.transaction.JdbcTransaction.translate(JdbcTransaction.java:698)
    	at io.ebeaninternal.server.transaction.JdbcTransaction.batchFlush(JdbcTransaction.java:680)
    	at io.ebeaninternal.server.transaction.JdbcTransaction.internalBatchFlush(JdbcTransaction.java:796)
    	at io.ebeaninternal.server.transaction.JdbcTransaction.flushCommitAndNotify(JdbcTransaction.java:1005)
    	at io.ebeaninternal.server.transaction.JdbcTransaction.commit(JdbcTransaction.java:1057)
    	at io.ebeaninternal.api.ScopeTrans.commitTransaction(ScopeTrans.java:136)
    	at io.ebeaninternal.api.ScopedTransaction.commit(ScopedTransaction.java:110)
    	at com.linkedin.metadata.entity.ebean.EbeanAspectDao.runInTransactionWithRetry(EbeanAspectDao.java:462)
    	at com.linkedin.metadata.entity.EntityService.ingestAspectToLocalDB(EntityService.java:536)
    	at com.linkedin.metadata.entity.EntityService.wrappedIngestAspectToLocalDB(EntityService.java:495)
    	at com.linkedin.metadata.entity.EntityService.ingestAspect(EntityService.java:632)
    	at com.linkedin.metadata.boot.steps.IngestRootUserStep.execute(IngestRootUserStep.java:67)
    	at com.linkedin.metadata.boot.BootstrapManager.start(BootstrapManager.java:35)
Caused by:
    Caused by: com.mysql.cj.jdbc.exceptions.MySQLTransactionRollbackException: Lock wait timeout exceeded; try restarting transaction
    	at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:123)
    	at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:97)
    	at com.mysql.cj.jdbc.exceptions.SQLExceptionsMapping.translateException(SQLExceptionsMapping.java:122)
    	at com.mysql.cj.jdbc.ClientPreparedStatement.executeInternal(ClientPreparedStatement.java:953)
    	at com.mysql.cj.jdbc.ClientPreparedStatement.executeUpdateInternal(ClientPreparedStatement.java:1092)
    	at com.mysql.cj.jdbc.ClientPreparedStatement.executeBatchSerially(ClientPreparedStatement.java:832)
  • c

    clever-garden-23538

    09/17/2022, 12:58 AM
How do you go about searching for tags? I'm getting this:
  • m

    microscopic-mechanic-13766

    09/19/2022, 7:44 AM
Good morning, so I am getting the following error (which actually doesn't stop DataHub from working correctly):
    com.linkedin.restli.server.RestLiServiceException: null
ERROR c.l.m.filter.RestliLoggingFilter:38 - Rest.li error:
I am currently using
v0.8.44
of DataHub GMS. The error usually appears at container startup, after the
Successfully fetched x ingestion sources
message, or sometimes while using DataHub.
  • c

    colossal-fish-54995

    09/19/2022, 9:10 AM
Hello everyone. When I use 'datahub docker quickstart' to deploy, 'Pulling elasticsearch-setup' has been running for a long time without finishing. Is this normal or not?
  • m

    microscopic-mechanic-13766

    09/19/2022, 10:31 AM
Hello, so I am trying to re-ingest and re-profile from Hive, but I keep getting this error, and I don't understand why it appears:
"AttributeError: 'bytes' object has no attribute 'value'\n
I suppose the problem might be related to parsing during profiling (if the profiling option is disabled, this error does not appear), but I don't know why I am getting it (the last successful ingestion was on Friday, and no one touched the deployment over the weekend). I will share the full log in the next message of the thread. I would appreciate any help narrowing down the source of the error!
  • b

    brief-ability-41819

    09/19/2022, 11:28 AM
Hello, we have DataHub v0.8.44 deployed via Helm to an EKS cluster. The very first ingestion we'd like to have is Redshift, so the AWS part was done beforehand: connectivity, roles, policies, etc. After adding the secret for the Redshift read-only user and filling in the connection wizard, we get the error:
    ~~~~ Execution Summary ~~~~
    
    RUN_INGEST - {'errors': [],
     'exec_id': 'b40bc26a-28d8-46cd-a6ee-14e0397cbd40',
     'infos': ['2022-09-19 09:07:33.274446 [exec_id=b40bc26a-28d8-46cd-a6ee-14e0397cbd40] INFO: Starting execution for task with name=RUN_INGEST',
               '2022-09-19 09:07:33.276286 [exec_id=b40bc26a-28d8-46cd-a6ee-14e0397cbd40] INFO: Caught exception EXECUTING '
               'task_id=b40bc26a-28d8-46cd-a6ee-14e0397cbd40, name=RUN_INGEST, stacktrace=Traceback (most recent call last):\n'
               '  File "/usr/local/lib/python3.9/site-packages/acryl/executor/execution/default_executor.py", line 121, in execute_task\n'
               '    self.event_loop.run_until_complete(task_future)\n'
               '  File "/usr/local/lib/python3.9/site-packages/nest_asyncio.py", line 89, in run_until_complete\n'
               '    return f.result()\n'
               '  File "/usr/local/lib/python3.9/asyncio/futures.py", line 201, in result\n'
               '    raise self._exception\n'
               '  File "/usr/local/lib/python3.9/asyncio/tasks.py", line 256, in __step\n'
               '    result = coro.send(None)\n'
               '  File "/usr/local/lib/python3.9/site-packages/acryl/executor/execution/sub_process_ingestion_task.py", line 71, in execute\n'
               '    validated_args = SubProcessIngestionTaskArgs.parse_obj(args)\n'
               '  File "pydantic/main.py", line 521, in pydantic.main.BaseModel.parse_obj\n'
               '  File "pydantic/main.py", line 341, in pydantic.main.BaseModel.__init__\n'
               'pydantic.error_wrappers.ValidationError: 1 validation error for SubProcessIngestionTaskArgs\n'
               'debug_mode\n'
               '  extra fields not permitted (type=value_error.extra)\n']}
    Execution finished with errors.
    Any ideas what’s wrong with the validation? There are basically only three fields (endpoint, user, secret).
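(A hedged pointer on the validation error above: the rejected debug_mode field is sent by newer DataHub UI versions, and an older acryl-executor in the actions container does not recognize it, so this usually indicates a version mismatch between the frontend/GMS and the actions image rather than a problem with the three recipe fields. A sketch for checking the actions image version; the label selector and values layout depend on your chart, so treat them as assumptions.)

```shell
# List the image used by the actions pod (label selector is an assumption
# based on common datahub-helm conventions; adjust for your release).
kubectl get pods -l app.kubernetes.io/name=datahub-actions \
  -o jsonpath='{.items[*].spec.containers[*].image}{"\n"}'
# If it lags behind the frontend/GMS version, bump the actions image tag in
# your Helm values and upgrade the release.
```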
  • l

    little-dinner-99970

    09/16/2022, 8:43 PM
Hello everyone. I just tried to install DataHub via the Docker quickstart, but I had a problem installing elasticsearch-setup, as shown in the image below.
  • c

    cool-architect-34612

    09/20/2022, 5:33 AM
Hi, I got this error on the Analytics page:
An unknown error occurred. (code 500)
so I checked my DataHub Docker containers:
$ datahub docker check
The following issues were detected:
- elasticsearch-setup container is not present
- mysql container is not present
Can you tell me what I should do?