# ingestion
  • a

    aloof-energy-17918

    12/22/2022, 10:26 AM
    Hi team, I have emitted a Looker chart similar to the one below, but I could not get the Fields tab to show up. I already tried adding a customProperties upstream_fields entry; while that shows up in the Properties tab, the Fields tab is still nowhere to be found.
    h
    • 2
    • 10
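    If you are emitting the chart yourself, the Fields tab is driven by the chart's inputFields aspect rather than customProperties. A minimal sketch, assuming your SDK version exposes InputFieldsClass / InputFieldClass (the URNs and field names below are illustrative):
    Copy code
    from datahub.emitter.mcp import MetadataChangeProposalWrapper
    from datahub.emitter.rest_emitter import DatahubRestEmitter
    from datahub.metadata.schema_classes import (
        InputFieldClass,
        InputFieldsClass,
        SchemaFieldClass,
        SchemaFieldDataTypeClass,
        StringTypeClass,
    )

    chart_urn = "urn:li:chart:(looker,dashboard_elements.123)"  # illustrative
    upstream = "urn:li:dataset:(urn:li:dataPlatform:looker,model.explore,PROD)"  # illustrative

    # One entry per field the chart reads; schemaFieldUrn ties it to the upstream dataset.
    input_fields = InputFieldsClass(
        fields=[
            InputFieldClass(
                schemaFieldUrn=f"urn:li:schemaField:({upstream},order_id)",
                schemaField=SchemaFieldClass(
                    fieldPath="order_id",
                    type=SchemaFieldDataTypeClass(type=StringTypeClass()),
                    nativeDataType="string",
                ),
            )
        ]
    )

    emitter = DatahubRestEmitter(gms_server="http://localhost:8080")
    emitter.emit(MetadataChangeProposalWrapper(entityUrn=chart_urn, aspect=input_fields))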
  • w

    worried-chef-87127

    12/22/2022, 4:34 PM
    Hello - getting errors with my Looker integration. Was hoping someone can provide some insights:
    Copy code
    '[2022-12-22 06:13:20,458] ERROR    {datahub.entrypoints:192} - \n'
               'Traceback (most recent call last):\n'
               '  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/datahub/entrypoints.py", line 149, in main\n'
               '    sys.exit(datahub(standalone_mode=False, **kwargs))\n'
               '  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/click/core.py", line 1130, in __call__\n'
               '    return self.main(*args, **kwargs)\n'
               '  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/click/core.py", line 1055, in main\n'
               '    rv = self.invoke(ctx)\n'
               '  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/click/core.py", line 1657, in invoke\n'
               '    return _process_result(sub_ctx.command.invoke(sub_ctx))\n'
               '  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/click/core.py", line 1657, in invoke\n'
               '    return _process_result(sub_ctx.command.invoke(sub_ctx))\n'
               '  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/click/core.py", line 1404, in invoke\n'
               '    return ctx.invoke(self.callback, **ctx.params)\n'
               '  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/click/core.py", line 760, in invoke\n'
               '    return __callback(*args, **kwargs)\n'
               '  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/click/decorators.py", line 26, in new_func\n'
               '    return f(get_current_context(), *args, **kwargs)\n'
               '  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/datahub/telemetry/telemetry.py", line 347, in wrapper\n'
               '    raise e\n'
               '  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/datahub/telemetry/telemetry.py", line 299, in wrapper\n'
               '    res = func(*args, **kwargs)\n'
               '  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/datahub/utilities/memory_leak_detector.py", line 95, in '
               'wrapper\n'
               '    return func(ctx, *args, **kwargs)\n'
               '  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/datahub/cli/ingest_cli.py", line 212, in run\n'
               '    loop.run_until_complete(run_func_check_upgrade(pipeline))\n'
               '  File "/usr/local/lib/python3.10/asyncio/base_events.py", line 646, in run_until_complete\n'
               '    return future.result()\n'
               '  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/datahub/cli/ingest_cli.py", line 166, in '
               'run_func_check_upgrade\n'
               '    ret = await the_one_future\n'
               '  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/datahub/cli/ingest_cli.py", line 157, in run_pipeline_async\n'
               '    return await loop.run_in_executor(\n'
               '  File "/usr/local/lib/python3.10/concurrent/futures/thread.py", line 58, in run\n'
               '    result = self.fn(*self.args, **self.kwargs)\n'
               '  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/datahub/cli/ingest_cli.py", line 148, in '
               'run_pipeline_to_completion\n'
               '    raise e\n'
               '  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/datahub/cli/ingest_cli.py", line 134, in '
               'run_pipeline_to_completion\n'
               '    pipeline.run()\n'
               '  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/datahub/ingestion/run/pipeline.py", line 348, in run\n'
               '    for wu in itertools.islice(\n'
               '  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/datahub/ingestion/source/looker/looker_source.py", line 1164, '
               'in get_workunits\n'
               '    ) = job.result()\n'
               '  File "/usr/local/lib/python3.10/concurrent/futures/_base.py", line 451, in result\n'
               '    return self.__get_result()\n'
               '  File "/usr/local/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result\n'
               '    raise self._exception\n'
               '  File "/usr/local/lib/python3.10/concurrent/futures/thread.py", line 58, in run\n'
               '    result = self.fn(*self.args, **self.kwargs)\n'
               '  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/datahub/ingestion/source/looker/looker_source.py", line 1037, '
               'in process_dashboard\n'
               '    looker_dashboard = self._get_looker_dashboard(dashboard_object, self.looker_api)\n'
               '  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/datahub/ingestion/source/looker/looker_source.py", line 831, '
               'in _get_looker_dashboard\n'
               '    looker_dashboard_element = self._get_looker_dashboard_element(element)\n'
               '  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/datahub/ingestion/source/looker/looker_source.py", line 514, '
               'in _get_looker_dashboard_element\n'
               '    input_fields = self._get_input_fields_from_query(\n'
               '  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/datahub/ingestion/source/looker/looker_source.py", line 344, '
               'in _get_input_fields_from_query\n'
               '    label=field["label"],\n'
               "KeyError: 'label'\n"
               '[2022-12-22 06:13:20,464] ERROR    {datahub.entrypoints:195} - Command failed: \n'
               "\t'label'.\n"
               '\tRun with --debug to get full stacktrace.\n'
               "\te.g. 'datahub --debug ingest run -c /tmp/datahub/ingest/29de83cc-03b7-432a-9657-753d67d724a5/recipe.yml --report-to "
               "/tmp/datahub/ingest/29de83cc-03b7-432a-9657-753d67d724a5/ingestion_report.json'\n",
               "2022-12-22 06:13:47.047248 [exec_id=29de83cc-03b7-432a-9657-753d67d724a5] INFO: Failed to execute 'datahub ingest'",
               '2022-12-22 06:13:47.047574 [exec_id=29de83cc-03b7-432a-9657-753d67d724a5] INFO: Caught exception EXECUTING '
               'task_id=29de83cc-03b7-432a-9657-753d67d724a5, name=RUN_INGEST, stacktrace=Traceback (most recent call last):\n'
               '  File "/usr/local/lib/python3.10/site-packages/acryl/executor/execution/default_executor.py", line 123, in execute_task\n'
               '    task_event_loop.run_until_complete(task_future)\n'
               '  File "/usr/local/lib/python3.10/asyncio/base_events.py", line 646, in run_until_complete\n'
               '    return future.result()\n'
               '  File "/usr/local/lib/python3.10/site-packages/acryl/executor/execution/sub_process_ingestion_task.py", line 168, in execute\n'
               '    raise TaskError("Failed to execute \'datahub ingest\'")\n'
               "acryl.executor.execution.task.TaskError: Failed to execute 'datahub ingest'\n"],
     'structured_report': '{"source": {"type": "looker", "report": {"events_produced": "27590", "events_produced_per_sec": "34", "event_ids": '
                          '["looker-urn:li:chart:(looker,dashboard_elements.18243)", "looker-urn:li:chart:(looker,dashboard_elements.18914)", '
                          '"looker-urn:li:chart:(looker,dashboard_elements.19888)", "looker-urn:li:chart:(looker,dashboard_elements.23351)", '
                          '"looker-urn:li:chart:(looker,dashboard_elements.26246)", "looker-inputFields-urn:li:chart:(looker,dashboard_elements.27578)", '
                          '"looker-inputFields-urn:li:chart:(looker,dashboard_elements.30159)", "looker-urn:li:chart:(looker,dashboard_elements.30581)", '
                          '"looker-urn:li:chart:(looker,dashboard_elements.31238)", "looker-inputFields-urn:li:chart:(looker,dashboard_elements.33689)", '
                          '"... sampled of 27590 total elements"], "warnings": {}, "failures": {}, "total_dashboards": "1542", "dashboards_scanned": '
                          '"1541", "looks_scanned": "13412", "filtered_dashboards": ["3191", "x_base::athena_demo", "x_billing_pmo::athena_demo", '
                          '"x_billing_salesops::athena_demo", "x_clientservices::athena_demo", "x_digital::athena_demo", "x_executive::athena_demo", '
                          '"x_finance_pmo::athena_demo", "x_finance_salesops::athena_demo", "x_phi_prone::athena_demo", "... sampled of 25 total '
                          'elements"], "filtered_looks": [], "dashboards_scanned_for_usage": "0", "charts_scanned_for_usage": "0", '
                          '"charts_with_activity": [], "dashboards_with_activity": [], "stage_latency": [{"name": "list_dashboards", "latency": "1.56 '
                          'seconds"}, {"name": "dashboard_chart_metadata", "start_time": "2022-12-22 06:00:04.937367 (13 minutes and 14.96 seconds '
                          'ago)."}], "total_explores": "0", "explores_scanned": "0", "query_latency": {}, "user_resolution_latency": {}, "start_time": '
                          '"2022-12-22 06:00:02.830271 (13 minutes and 17.07 seconds ago).", "running_time": "13 minutes and 17.07 seconds", '
                          '"dashboard_process_percentage_completion": "99.94", "explore_registry_stats": {"cache_info": "CacheInfo(hits=184581, '
                          'misses=371, maxsize=128, currsize=128)"}, "looker_api_stats": {"client_stats": {"dashboard_calls": 1542, "user_calls": 236, '
                          '"explore_calls": 371, "query_calls": 0, "folder_calls": 441, "all_connections_calls": 0, "connection_calls": 0, '
                          '"lookml_model_calls": 0, "all_dashboards_calls": 1, "search_dashboards_calls": 0}, "folder_cache": "CacheInfo(hits=1101, '
                          'misses=441, maxsize=1000, currsize=415)", "user_cache": "CacheInfo(hits=1968, misses=236, maxsize=2000, currsize=221)"}}}, '
                          '"sink": {"type": "datahub-rest", "report": {"total_records_written": "26589", "records_written_per_second": "33", "warnings": '
                          '[], "failures": [], "start_time": "2022-12-22 06:00:02.099666 (13 minutes and 17.8 seconds ago).", "current_time": '
                          '"2022-12-22 06:13:19.901843 (now).", "total_duration_in_seconds": "797.8", "gms_version": "v0.9.2", "pending_requests": '
                          '"1001"}}}'}
    Execution finished with errors.
    b
    g
    • 3
    • 2
  • k

    kind-dusk-91074

    12/23/2022, 7:58 AM
    Hello Team, I am new to DataHub and I am trying to ingest metadata from MySQL, but it is stuck at Pending. Can anyone help?
    h
    b
    • 3
    • 23
  • w

    wonderful-vegetable-45135

    12/23/2022, 1:36 PM
    Hi guys, does anybody know how the Power BI ingestion works? I have it set up and working: I can scan workspaces and see the different reports in there with their respective charts. However, I cannot see the datasets related to those reports, even though the documentation says I should. I have a feeling that what the DataHub documentation calls a dataset may not be the same thing as a dataset in Power BI. I see there is a dataset_type_mapping configuration which only supports Postgres and Oracle, which doesn't make much sense for a Power BI dataset, since you define that in Power BI itself; it would make sense if it refers to the data sources connected to Power BI datasets. In any case, maybe I'm not fully understanding how the integration works. Could somebody shed some light on this?
    g
    • 2
    • 8
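    For reference, a minimal powerbi recipe sketch (credentials are placeholders). In the versions current at the time, dataset_type_mapping maps Power BI *data source* types to DataHub platforms, which is why only backends such as PostgreSql and Oracle appear there:
    Copy code
    source:
      type: powerbi
      config:
        tenant_id: "00000000-0000-0000-0000-000000000000"
        client_id: "${POWERBI_CLIENT_ID}"
        client_secret: "${POWERBI_CLIENT_SECRET}"
        # ...plus your workspace selection settings...
        # Maps Power BI data source types to DataHub platforms so that
        # report -> dataset -> source-table lineage can be emitted.
        dataset_type_mapping:
          PostgreSql: postgres
          Oracle: oracle
    sink:
      type: datahub-rest
      config:
        server: "http://localhost:8080"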
  • a

    acceptable-account-83031

    12/23/2022, 7:56 PM
    Hello Team, I am using PATCH, and when I try to add a user using
    curl -X POST -H POST -H "X-Restli-Protocol-Version: 2.0.0" -H
    as shown in the PATCH YouTube video, I am getting the following error.
    Copy code
    "message":"Cannot parse request entity"
    Can anyone help me solve this?
    g
    o
    • 3
    • 5
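    "Cannot parse request entity" from Rest.li usually means the request body or headers are malformed (missing Content-Type, unescaped quotes, or truncated JSON) rather than a permissions issue. For comparison, an ingestProposal call generally looks like the sketch below; the aspect name and the PATCH value format are illustrative and worth double-checking against the docs for your version:
    Copy code
    curl -X POST 'http://localhost:8080/aspects?action=ingestProposal' \
      -H 'X-RestLi-Protocol-Version: 2.0.0' \
      -H 'Content-Type: application/json' \
      --data '{
        "proposal": {
          "entityType": "dataset",
          "entityUrn": "urn:li:dataset:(urn:li:dataPlatform:hive,fct_users_created,PROD)",
          "changeType": "PATCH",
          "aspectName": "ownership",
          "aspect": {
            "contentType": "application/json-patch+json",
            "value": "[{\"op\": \"add\", \"path\": \"/owners/urn:li:corpuser:jdoe\", \"value\": {\"owner\": \"urn:li:corpuser:jdoe\", \"type\": \"TECHNICAL_OWNER\"}}]"
          }
        }
      }'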
  • b

    bland-lighter-26751

    12/23/2022, 11:35 PM
    Hello, I started seeing a new error with connectivity to BigQuery. To ensure it wasn't something with my setup, I did the following:
    1. Grabbed the quickstart docker-compose.yaml from git
    2. Composed up and logged in as datahub
    3. Added 2 secrets, one for the key and one for the id
    4. Inputted the remaining BQ connection values
    5. Clicked test connection
    Copy code
    ('Failed to load service account credentials from /tmp/tmpm969wnso', ValueError('Could not deserialize key data. The data may be in an incorrect format, it may be encrypted with an unsupported algorithm, or it may be an unsupported key type (e.g. EC curves with explicit parameters).', [_OpenSSLErrorWithText(code=503841036, lib=60, reason=524556, reason_text=b'error:1E08010C:DECODER routines::unsupported')]))
    plus1 2
    h
    n
    +3
    • 6
    • 25
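    That OpenSSL "Could not deserialize key data" error almost always means the private_key value reached the executor in a mangled form (wrapped lines, lost \n escapes, or an extra layer of quoting) rather than a DataHub bug. A sketch of the credential block, assuming the recipe-level credential config of the bigquery source (all values are placeholders):
    Copy code
    source:
      type: bigquery
      config:
        project_id: my-gcp-project
        credential:
          project_id: my-gcp-project
          private_key_id: "${BQ_PRIVATE_KEY_ID}"
          # Keep the literal \n escapes exactly as they appear in the
          # service-account JSON, wrapped in a single pair of quotes.
          private_key: "-----BEGIN PRIVATE KEY-----\nMIIEv...\n-----END PRIVATE KEY-----\n"
          client_email: "datahub@my-gcp-project.iam.gserviceaccount.com"
          client_id: "123456789"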
  • b

    bumpy-eye-36525

    12/24/2022, 7:04 PM
    Hello team, I am trying to ingest column tags from dbt meta with the automated mappings, but I hit an error. Please let me know if you have any suggestions on this, thank you.
    Copy code
    File "/usr/local/lib/python3.10/site-packages/datahub/ingestion/source/dbt.py", line 1101, in create
        1099  @classmethod
        1100  def create(cls, config_dict, ctx):
    --> 1101      config = DBTConfig.parse_obj(config_dict)
        1102      return cls(config, ctx, "dbt")
    
    File "pydantic/main.py", line 521, in pydantic.main.BaseModel.parse_obj
    
    File "pydantic/main.py", line 341, in pydantic.main.BaseModel.__init__
    
    ValidationError: 1 validation error for DBTConfig
    column_meta_mapping
      extra fields not permitted (type=value_error.extra)
    b
    • 2
    • 16
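    "extra fields not permitted" on column_meta_mapping usually means the installed connector version predates that option or the key is nested in the wrong place; it belongs directly under the dbt source config. A hedged sketch of the intended layout (paths and the mapping rule are placeholders):
    Copy code
    source:
      type: dbt
      config:
        manifest_path: ./target/manifest.json
        catalog_path: ./target/catalog.json
        target_platform: snowflake
        enable_meta_mapping: true
        # Applied to column-level dbt meta; requires a CLI version that knows this option.
        column_meta_mapping:
          pii:
            match: "true"
            operation: add_tag
            config:
              tag: pii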
  • t

    thankful-fireman-70616

    12/25/2022, 3:08 PM
    Hello, I'm trying to ingest from a local delta lake (my set-up is running in a docker container) and I'm getting the error message below - datahub.ingestion.run.pipeline.PipelineInitError: Failed to set up framework context: Failed to connect to datahub
    s
    h
    • 3
    • 25
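    "Failed to connect to datahub" at pipeline init usually just means the sink URL is not reachable from wherever the CLI is running. A minimal sketch, assuming GMS is exposed on port 8080 and the recipe runs inside another container (hostnames and paths are placeholders):
    Copy code
    source:
      type: delta-lake
      config:
        base_path: /data/my-delta-table   # path as seen from inside the container
    sink:
      type: datahub-rest
      config:
        # From inside a container, "localhost" points at that container itself;
        # use the compose service name or host.docker.internal instead.
        server: "http://datahub-gms:8080"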
  • m

    magnificent-notebook-88304

    12/26/2022, 7:37 AM
    Hello,
  • m

    magnificent-notebook-88304

    12/26/2022, 7:42 AM
    Hello team, can someone please point me to the SQL queries executed on Oracle sources for data profiling (the Stats tab in the UI)?
    h
    f
    b
    • 4
    • 9
  • e

    enough-mouse-67490

    12/26/2022, 9:29 AM
    Hi team! I did an ingestion of lookml, looker, snowflake and dbt but can't see the lineage between snowflake and looker. Is there a way to automatically identify basic lineage between snowflake and looker? Maybe some flag in the lookml/looker/snowflake ingestion? Thanks in advance🙏
    h
    • 2
    • 12
  • p

    plain-cricket-83456

    12/27/2022, 1:34 AM
    Why can I only obtain a maximum of 20 pieces of data for profiling? Where can this upper limit be set?
    f
    h
    b
    • 4
    • 13
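    If the cap of 20 refers to the sample values shown on the Stats tab, it likely comes from the profiler defaults. A hedged sketch, assuming the GE profiling options of the SQL-based sources (option names may vary by version):
    Copy code
    source:
      type: mysql   # any SQL-based source with profiling support
      config:
        # ...connection settings...
        profiling:
          enabled: true
          # The sample-value cap defaults to 20 in the versions current at the time.
          field_sample_values_limit: 50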
  • s

    silly-ability-65278

    12/27/2022, 8:55 AM
    Hello, how do I access a host file (the dbt JSON file) if I use DataHub in Docker? I tried to run the ingestion and it says the file was not found. Thank you
    h
    • 2
    • 5
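    When an ingestion is launched from the UI, the recipe runs inside the datahub-actions container, so host paths are not visible to it. Two common workarounds, sketched below (container name assumed from the quickstart compose file):
    Copy code
    # Option 1: copy the dbt artifacts into the container that runs UI ingestion
    docker cp ./target/manifest.json datahub-actions:/tmp/manifest.json
    docker cp ./target/catalog.json  datahub-actions:/tmp/catalog.json
    # ...then reference /tmp/manifest.json etc. in the recipe

    # Option 2: skip the UI and run the recipe from the host with the CLI,
    # where local paths resolve normally
    datahub ingest -c dbt_recipe.yml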
  • l

    late-ability-59580

    12/27/2022, 10:03 AM
    Hi all, I ingest both dbt and snowflake for a specific table. When searching for the table in the general search, I find only a snowflake entity. I found a dbt entity of that table (without snowflake) as an upstream of a different table when looking at its lineage. Why don't the two datasets appear as one? Some other info:
    • dbt urn is: urn:li:dataset:(urn:li:dataPlatform:dbt,bi.DB.SCHEMA.TABLE,PROD)
    • snowflake urn is: urn:li:dataset:(urn:li:dataPlatform:snowflake,bi.DB.SCHEMA.TABLE,PROD)
    • cli version is 0.9.3
    • I specify target_platform and platform_instance in the yaml files
    h
    • 2
    • 2
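    For the dbt and snowflake entities to be merged into one (as siblings), the dataset part of the two URNs has to line up exactly, including any platform instance prefix, and the dbt source has to be recent enough to emit sibling aspects. A hedged sketch of the dbt side (paths and the instance name are placeholders):
    Copy code
    source:
      type: dbt
      config:
        manifest_path: ./target/manifest.json
        catalog_path: ./target/catalog.json
        target_platform: snowflake
        # Must match the platform_instance used in the snowflake recipe
        # (or be omitted in both) for the URNs to coincide.
        target_platform_instance: bi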
  • t

    thankful-fireman-70616

    12/27/2022, 1:50 PM
    Hi, according to the delta-lake doc we can ingest data from S3; however, I'm curious to know whether it is possible to ingest from Azure / ADLS as well - https://datahubproject.io/docs/generated/ingestion/sources/delta-lake/#delta-table-on-s3
    h
    • 2
    • 4
  • p

    plain-cricket-83456

    12/28/2022, 2:06 AM
    Good morning everyone, I have a problem. When using YAML to ingest custom glossary terms, some sub-terms whose titles contain "/" show the content of their parent term instead. The picture below should show a sibling relationship, but when clicked on, it shows the content of the parent "personal family".
    h
    • 2
    • 2
  • c

    clever-dawn-33472

    12/28/2022, 4:52 AM
    Hello everyone. I'm new to DataHub and I'm still studying it.
  • c

    clever-dawn-33472

    12/28/2022, 4:56 AM
    Hello everyone. 🙌 I'm a beginner with DataHub and I can't understand the sink part of ingestion. The docs are short on explanation. Is there any additional information about it, or can anyone explain it simply? Thanks in advance.
    b
    • 2
    • 8
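    In short, a recipe is source -> (optional transformers) -> sink: the source extracts metadata from your system, and the sink is simply where that metadata gets written, most commonly the DataHub server itself over REST. A minimal sketch:
    Copy code
    source:
      type: mysql                # where metadata is read from
      config:
        host_port: localhost:3306
        username: datahub
        password: datahub

    sink:
      type: datahub-rest         # where metadata is written to (DataHub GMS)
      config:
        server: http://localhost:8080

    # Other sinks exist too, e.g. datahub-kafka, or "file" to dump the
    # events to disk so you can inspect them before sending anything.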
  • e

    enough-mouse-67490

    12/27/2022, 3:56 PM
    Hi team! I’m trying to ingest lookml using this recipe:
    Copy code
    source:
        type: lookml
        config:
            parse_table_names_from_sql: True
            github_info:
                deploy_key: ${LOOKER_BI_DEPLOY_KEY_PROD}
                repo: ____/looker-snowflake
            api:
                base_url: ${LOOKER_BI_BASE_URL}
                client_secret: ${LOOKER_BI_CLIENT_SECRET_PROD}
                client_id: ${LOOKER_BI_CLIENT_ID_PROD}
            connection_to_platform_map:
                snowflake:
                    platform: snowflake
                    default_db: PRD_BI
    and getting this:
    Copy code
    Pipeline finished with at least 21 warnings; produced 0 events in 27.11 seconds.
    error msg:
    Failed to load connection prod_unsec_27062022. Check your API key permissions.
    Thanks in advance🙏
    h
    • 2
    • 1
  • s

    steep-family-13549

    12/28/2022, 11:21 AM
    @hundreds-photographer-13496 Hey team, can we create a glossary term with the Java emitter?
    h
    • 2
    • 5
  • m

    microscopic-mechanic-13766

    12/29/2022, 11:13 AM
    Good morning team, I have been trying to create an HDFS ingestion source. The thing is that I want it to work with a kerberized HDFS, in which case you would have to pass the Kerberos info in the recipe like this: options: { connect_args: { auth: KERBEROS, kerberos_service_name: hdfs } }. The problem is that I am getting the following error:
    Copy code
    ERROR    {datahub.entrypoints:183} - Failed to configure source (datahub.ingestion.source.hdfs.source.HDFSSource): 1 validation error for HDFSSourceConfig
    options
      extra fields not permitted (type=value_error.extra)
    What should I add to the configuration in order to support the options key in the recipe? This is my current configuration:
    Copy code
        path_specs: List[PathSpec] = Field(
            description="List of PathSpec. See [below](#path-spec) the details about PathSpec"
        )
        username: Optional[str] = Field(default=None, description="username")
        password: Optional[pydantic.SecretStr] = Field(
            default=None, exclude=True, description="password"
        )
        host_port: str = Field(description="host URL") 
        connect_args: Optional[Dict] = Field(
            default=None,
            description="Connect args to pass",
            exclude=True
        )
        max_rows: int = Field(
            default=100,
            description="Maximum number of rows to use when inferring schemas for TSV and CSV files.",
        )
        _rename_path_spec_to_plural = pydantic_renamed_field(
            "path_spec", "path_specs", lambda path_spec: [path_spec]
        )
    h
    • 2
    • 1
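    Since HDFSSourceConfig is your own pydantic model, "extra fields not permitted" just means it has no options field, so the recipe key is rejected before your source ever sees it. A sketch of the addition (field name and defaults are assumptions; you still need to forward it to the connection code):
    Copy code
    from typing import Any, Dict, Optional

    import pydantic
    from pydantic import Field


    class HDFSSourceConfig(pydantic.BaseModel):  # or whatever ConfigModel you extend
        # ...existing fields (path_specs, username, password, host_port, ...)...

        # New: accept an "options" block from the recipe and pass it through to
        # the connection layer, e.g.
        # {"connect_args": {"auth": "KERBEROS", "kerberos_service_name": "hdfs"}}
        options: Optional[Dict[str, Any]] = Field(
            default=None,
            description="Extra connection options, forwarded as connect_args.",
        )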
  • k

    kind-dusk-91074

    12/29/2022, 12:44 PM
    Hello, I hope you’re all doing well. Is there a way to connect datahub to a database via ssh tunnel?
    ✅ 2
    👀 2
    m
    a
    a
    • 4
    • 8
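    As far as I know the recipes have no built-in SSH-tunnel option, so the usual workaround is to open a local tunnel and point the recipe's host_port at it. A sketch (hosts and ports are placeholders):
    Copy code
    # Forward local port 3306 to the database through a bastion host
    ssh -N -L 3306:internal-db-host:3306 user@bastion.example.com

    # ...and in the recipe:
    #   host_port: localhost:3306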
  • a

    acceptable-account-83031

    12/29/2022, 3:04 PM
    Hi, I am using the new version of DataHub, 0.9.5, and I can see that I can edit the upstream and downstream lineage manually, but every time the scheduled metadata ingestion runs, it goes back to the original lineage. I know I asked this question last time; just wondering if there's a fix for this in the new version of DataHub?
    ✅ 1
    t
    a
    • 3
    • 6
  • l

    late-jackal-56547

    12/29/2022, 4:29 PM
    Hello, Does anyone know if there is a way to download table schemas via either the datahub UI or the graphql api? Thanks!
    b
    • 2
    • 1
  • l

    lively-dusk-19162

    12/29/2022, 8:00 PM
    Hello everyone, could any one please help me on how to integrate airflow with datahub?
    d
    g
    • 3
    • 4
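    The usual route for Airflow 2.x at the time is the DataHub Airflow plugin: install it, register a DataHub connection, and lineage from your DAGs gets pushed automatically. A hedged sketch (connection id and GMS address are placeholders; check the docs for your Airflow/DataHub versions):
    Copy code
    pip install acryl-datahub-airflow-plugin

    # Tell Airflow where your DataHub GMS lives
    airflow connections add 'datahub_rest_default' \
        --conn-type 'datahub_rest' \
        --conn-host 'http://datahub-gms:8080'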
  • s

    steep-family-13549

    12/30/2022, 11:25 AM
    Can this Python library, "datahub.emitter.mce_builder", also be used with the Java emitter?
  • a

    adamant-sugar-28445

    12/31/2022, 1:50 PM
    How do I delete Spark tasks from lineage? I was following the datahub tutorial and had a URL for a Spark task in the lineage: http://localhost:9002/tasks/urn:li:dataJob:(urn:li:dataFlow:(spark,Spark%20shell,local[3]),QueryExecId_1)/Documentation?is_lineage_mode=true. I used this command to remove the task:
    datahub delete --urn "urn:li:dataJob:(urn:li:dataFlow:(spark,Spark%20shell,local[3]),QueryExecId_1)"
    but it didn't work. Could anyone tell me how to do this correctly?
    a
    • 2
    • 6
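    By default datahub delete performs a soft delete, which only hides the entity; if the task keeps showing up it may be worth trying a hard delete, and removing the parent dataFlow as well. A sketch reusing the URN from above:
    Copy code
    # Hard-delete the job (removes it from the index and the backing store)
    datahub delete --urn "urn:li:dataJob:(urn:li:dataFlow:(spark,Spark%20shell,local[3]),QueryExecId_1)" --hard

    # The parent flow can be removed the same way if it lingers
    datahub delete --urn "urn:li:dataFlow:(spark,Spark%20shell,local[3])" --hard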
  • g

    gorgeous-memory-27579

    01/01/2023, 5:42 PM
    Happy New Year, everyone. I'm ingesting from superset (datahub v0.9.3) and get charts, but not dashboards ingested. The recipe is:
    Copy code
    source:
      type: superset
      config:
        # Coordinates
        connect_uri: <http://superset.example.com:8088> #url changed
        username: username
        password: password
    The logs show:
    Copy code
    Cli report:
    {'cli_version': '0.9.3',
     'cli_entry_location': '/Users/seandavis/Library/Caches/pypoetry/virtualenvs/datahub-local-wHVdVnmL-py3.9/lib/python3.9/site-packages/datahub/__init__.py',
     'py_version': '3.9.6 (default, Sep 10 2021, 16:04:06) \n[Clang 12.0.5 (clang-1205.0.22.11)]',
     'py_exec_path': '/Users/seandavis/Library/Caches/pypoetry/virtualenvs/datahub-local-wHVdVnmL-py3.9/bin/python',
     'os_details': 'macOS-13.0-x86_64-i386-64bit',
     'mem_info': '92.97 MB'}
    Source (superset) report:
    {'events_produced': '11',
     'events_produced_per_sec': '3',
     'event_ids': ['urn:li:chart:(superset,18)',
                   'urn:li:chart:(superset,6)',
                   'urn:li:chart:(superset,17)',
                   'urn:li:chart:(superset,9)',
                   'urn:li:chart:(superset,8)',
                   'urn:li:chart:(superset,7)',
                   'urn:li:chart:(superset,4)',
                   'urn:li:chart:(superset,5)',
                   'urn:li:chart:(superset,3)',
                   'urn:li:chart:(superset,2)'],
     'warnings': {},
     'failures': {},
     'start_time': '2023-01-01 10:30:25.434105 (3.05 seconds ago).',
     'running_time': '3.05 seconds'}
    Sink (datahub-rest) report:
    {'total_records_written': '11',
     'records_written_per_second': '3',
     'warnings': [],
     'failures': [],
     'start_time': '2023-01-01 10:30:24.933428 (3.55 seconds ago).',
     'current_time': '2023-01-01 10:30:28.484836 (now).',
     'total_duration_in_seconds': '3.55',
     'gms_version': 'v0.9.3',
     'pending_requests': '0'}
    I checked the superset dashboard endpoint and I see a nice list of dashboards there. Any thoughts on what I'm missing to get the dashboards? A quick search of google and Slack didn't turn anything up.
    b
    d
    • 3
    • 5
  • e

    enough-mouse-67490

    01/02/2023, 8:55 AM
    Hi team! I can see the lineage from the Snowflake dataset to Looker, but without dbt in between, which means I can't see the full flow end to end. I guess it may be because it is represented as a dataset and not a table. Is there a way to automatically identify basic lineage between dbt and Looker, or to identify them as Snowflake tables rather than datasets? Attached are pictures of the same flow, separated:
    h
    g
    • 3
    • 11
  • b

    brainy-piano-85560

    01/02/2023, 12:26 PM
    Hey guys, I have DataHub v0.9.3.2 on EC2, using the quickstart deployment. After doing some ingestions, it looks like scheduled & manual ingestions won't run (new or old); they're stuck on 'pending' (see picture). I checked 'datahub docker check' and everything looks alright. I had a problem a week ago with one of the ingestions (got an error while trying to do profiling), and someone here suggested it might be an OOM problem on the Docker host. Do you think it could be related? Thanks for helping.
    h
    d
    b
    • 4
    • 10