aloof-energy-17918
12/22/2022, 10:26 AM
worried-chef-87127
12/22/2022, 4:34 PM
[2022-12-22 06:13:20,458] ERROR {datahub.entrypoints:192} -
Traceback (most recent call last):
  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/datahub/entrypoints.py", line 149, in main
    sys.exit(datahub(standalone_mode=False, **kwargs))
  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/click/core.py", line 1130, in __call__
    return self.main(*args, **kwargs)
  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/click/core.py", line 1055, in main
    rv = self.invoke(ctx)
  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/click/core.py", line 1657, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/click/core.py", line 1657, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/click/core.py", line 1404, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/click/core.py", line 760, in invoke
    return __callback(*args, **kwargs)
  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/click/decorators.py", line 26, in new_func
    return f(get_current_context(), *args, **kwargs)
  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/datahub/telemetry/telemetry.py", line 347, in wrapper
    raise e
  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/datahub/telemetry/telemetry.py", line 299, in wrapper
    res = func(*args, **kwargs)
  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/datahub/utilities/memory_leak_detector.py", line 95, in wrapper
    return func(ctx, *args, **kwargs)
  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/datahub/cli/ingest_cli.py", line 212, in run
    loop.run_until_complete(run_func_check_upgrade(pipeline))
  File "/usr/local/lib/python3.10/asyncio/base_events.py", line 646, in run_until_complete
    return future.result()
  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/datahub/cli/ingest_cli.py", line 166, in run_func_check_upgrade
    ret = await the_one_future
  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/datahub/cli/ingest_cli.py", line 157, in run_pipeline_async
    return await loop.run_in_executor(
  File "/usr/local/lib/python3.10/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/datahub/cli/ingest_cli.py", line 148, in run_pipeline_to_completion
    raise e
  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/datahub/cli/ingest_cli.py", line 134, in run_pipeline_to_completion
    pipeline.run()
  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/datahub/ingestion/run/pipeline.py", line 348, in run
    for wu in itertools.islice(
  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/datahub/ingestion/source/looker/looker_source.py", line 1164, in get_workunits
    ) = job.result()
  File "/usr/local/lib/python3.10/concurrent/futures/_base.py", line 451, in result
    return self.__get_result()
  File "/usr/local/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
    raise self._exception
  File "/usr/local/lib/python3.10/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/datahub/ingestion/source/looker/looker_source.py", line 1037, in process_dashboard
    looker_dashboard = self._get_looker_dashboard(dashboard_object, self.looker_api)
  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/datahub/ingestion/source/looker/looker_source.py", line 831, in _get_looker_dashboard
    looker_dashboard_element = self._get_looker_dashboard_element(element)
  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/datahub/ingestion/source/looker/looker_source.py", line 514, in _get_looker_dashboard_element
    input_fields = self._get_input_fields_from_query(
  File "/tmp/datahub/ingest/venv-looker-0.9.1/lib/python3.10/site-packages/datahub/ingestion/source/looker/looker_source.py", line 344, in _get_input_fields_from_query
    label=field["label"],
KeyError: 'label'
[2022-12-22 06:13:20,464] ERROR {datahub.entrypoints:195} - Command failed:
    'label'.
    Run with --debug to get full stacktrace.
    e.g. 'datahub --debug ingest run -c /tmp/datahub/ingest/29de83cc-03b7-432a-9657-753d67d724a5/recipe.yml --report-to /tmp/datahub/ingest/29de83cc-03b7-432a-9657-753d67d724a5/ingestion_report.json'
2022-12-22 06:13:47.047248 [exec_id=29de83cc-03b7-432a-9657-753d67d724a5] INFO: Failed to execute 'datahub ingest'
2022-12-22 06:13:47.047574 [exec_id=29de83cc-03b7-432a-9657-753d67d724a5] INFO: Caught exception EXECUTING task_id=29de83cc-03b7-432a-9657-753d67d724a5, name=RUN_INGEST, stacktrace=Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/acryl/executor/execution/default_executor.py", line 123, in execute_task
    task_event_loop.run_until_complete(task_future)
  File "/usr/local/lib/python3.10/asyncio/base_events.py", line 646, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.10/site-packages/acryl/executor/execution/sub_process_ingestion_task.py", line 168, in execute
    raise TaskError("Failed to execute 'datahub ingest'")
acryl.executor.execution.task.TaskError: Failed to execute 'datahub ingest'
structured_report:
{"source": {"type": "looker", "report": {"events_produced": "27590", "events_produced_per_sec": "34",
"event_ids": ["looker-urn:li:chart:(looker,dashboard_elements.18243)", "looker-urn:li:chart:(looker,dashboard_elements.18914)",
"looker-urn:li:chart:(looker,dashboard_elements.19888)", "looker-urn:li:chart:(looker,dashboard_elements.23351)",
"looker-urn:li:chart:(looker,dashboard_elements.26246)", "looker-inputFields-urn:li:chart:(looker,dashboard_elements.27578)",
"looker-inputFields-urn:li:chart:(looker,dashboard_elements.30159)", "looker-urn:li:chart:(looker,dashboard_elements.30581)",
"looker-urn:li:chart:(looker,dashboard_elements.31238)", "looker-inputFields-urn:li:chart:(looker,dashboard_elements.33689)",
"... sampled of 27590 total elements"], "warnings": {}, "failures": {}, "total_dashboards": "1542", "dashboards_scanned": "1541",
"looks_scanned": "13412",
"filtered_dashboards": ["3191", "x_base::athena_demo", "x_billing_pmo::athena_demo", "x_billing_salesops::athena_demo",
"x_clientservices::athena_demo", "x_digital::athena_demo", "x_executive::athena_demo", "x_finance_pmo::athena_demo",
"x_finance_salesops::athena_demo", "x_phi_prone::athena_demo", "... sampled of 25 total elements"],
"filtered_looks": [], "dashboards_scanned_for_usage": "0", "charts_scanned_for_usage": "0",
"charts_with_activity": [], "dashboards_with_activity": [],
"stage_latency": [{"name": "list_dashboards", "latency": "1.56 seconds"},
{"name": "dashboard_chart_metadata", "start_time": "2022-12-22 06:00:04.937367 (13 minutes and 14.96 seconds ago)."}],
"total_explores": "0", "explores_scanned": "0", "query_latency": {}, "user_resolution_latency": {},
"start_time": "2022-12-22 06:00:02.830271 (13 minutes and 17.07 seconds ago).", "running_time": "13 minutes and 17.07 seconds",
"dashboard_process_percentage_completion": "99.94",
"explore_registry_stats": {"cache_info": "CacheInfo(hits=184581, misses=371, maxsize=128, currsize=128)"},
"looker_api_stats": {"client_stats": {"dashboard_calls": 1542, "user_calls": 236, "explore_calls": 371, "query_calls": 0,
"folder_calls": 441, "all_connections_calls": 0, "connection_calls": 0, "lookml_model_calls": 0, "all_dashboards_calls": 1,
"search_dashboards_calls": 0}, "folder_cache": "CacheInfo(hits=1101, misses=441, maxsize=1000, currsize=415)",
"user_cache": "CacheInfo(hits=1968, misses=236, maxsize=2000, currsize=221)"}}},
"sink": {"type": "datahub-rest", "report": {"total_records_written": "26589", "records_written_per_second": "33",
"warnings": [], "failures": [], "start_time": "2022-12-22 06:00:02.099666 (13 minutes and 17.8 seconds ago).",
"current_time": "2022-12-22 06:13:19.901843 (now).", "total_duration_in_seconds": "797.8", "gms_version": "v0.9.2",
"pending_requests": "1001"}}}
Execution finished with errors.
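The KeyError in the traceback comes from `label=field["label"]` in `_get_input_fields_from_query`: the Looker API can return query fields that have no `label` key. A minimal defensive sketch of the pattern that avoids the crash (the helper name and fallback choice are hypothetical, not DataHub's actual fix):

```python
# Hypothetical sketch: tolerate Looker query fields that omit "label".
# field["label"] raises KeyError when the key is missing; dict.get() with
# a fallback to the raw field name keeps ingestion going.

def input_field_label(field: dict) -> str:
    # Prefer the explicit label, fall back to the field name (or "").
    return field.get("label") or field.get("name", "")

fields = [
    {"name": "orders.count", "label": "Order Count"},
    {"name": "orders.status"},  # no "label" -> would have raised KeyError
]
print([input_field_label(f) for f in fields])  # ['Order Count', 'orders.status']
```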
kind-dusk-91074
12/23/2022, 7:58 AM
wonderful-vegetable-45135
12/23/2022, 1:36 PM
acceptable-account-83031
12/23/2022, 7:56 PM
bland-lighter-26751
12/23/2022, 11:35 PM
('Failed to load service account credentials from /tmp/tmpm969wnso', ValueError('Could not deserialize key data. The data may be in an incorrect format, it may be encrypted with an unsupported algorithm, or it may be an unsupported key type (e.g. EC curves with explicit parameters).', [_OpenSSLErrorWithText(code=503841036, lib=60, reason=524556, reason_text=b'error:1E08010C:DECODER routines::unsupported')]))
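The OpenSSL "DECODER routines::unsupported" error usually means the file handed to the credential loader is not valid PEM/DER at all (truncated, re-encoded, or the wrong file pasted in). A quick stdlib-only sanity check, assuming you can read the temp file locally (the function name is made up for illustration), is to verify the PEM envelope and that its body is valid base64:

```python
# Hypothetical stdlib-only check: does this file even look like a PEM
# private key? It does not validate the key cryptographically; it only
# catches the common causes (wrong file, truncation, bad copy/paste) of
# "Could not deserialize key data".
import base64
import re

def looks_like_pem_private_key(data: bytes) -> bool:
    m = re.search(
        rb"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----\s*(.+?)\s*-----END (?:RSA |EC )?PRIVATE KEY-----",
        data,
        re.S,
    )
    if not m:
        return False  # no PEM envelope at all
    body = b"".join(m.group(1).split())  # strip line breaks inside the body
    try:
        base64.b64decode(body, validate=True)
        return True
    except ValueError:  # binascii.Error subclasses ValueError
        return False
```

Usage: `looks_like_pem_private_key(open("/tmp/tmpm969wnso", "rb").read())`; a `False` here means the problem is the file's contents, not the ingestion code.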
bumpy-eye-36525
12/24/2022, 7:04 PM
File "/usr/local/lib/python3.10/site-packages/datahub/ingestion/source/dbt.py", line 1101, in create
1099 @classmethod
1100 def create(cls, config_dict, ctx):
--> 1101 config = DBTConfig.parse_obj(config_dict)
1102 return cls(config, ctx, "dbt")
File "pydantic/main.py", line 521, in pydantic.main.BaseModel.parse_obj
File "pydantic/main.py", line 341, in pydantic.main.BaseModel.__init__
ValidationError: 1 validation error for DBTConfig
column_meta_mapping
extra fields not permitted (type=value_error.extra)
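The `extra fields not permitted` error means the installed CLI's `DBTConfig` model does not declare `column_meta_mapping` — typically a version mismatch between the recipe and the installed acryl-datahub. How the error arises can be reproduced with a toy pydantic model; `DBTConfig` here is a stand-in, not DataHub's real class:

```python
# Toy reproduction: pydantic models that forbid extras reject any recipe
# key the installed model version does not declare. (The wording shown in
# the log is pydantic v1's; v2 says "Extra inputs are not permitted".)
import pydantic

class DBTConfig(pydantic.BaseModel):  # stand-in, not DataHub's model
    class Config:
        extra = "forbid"

    manifest_path: str

try:
    DBTConfig.parse_obj(
        {"manifest_path": "/tmp/manifest.json", "column_meta_mapping": {}}
    )
except pydantic.ValidationError as e:
    print(e)
```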
thankful-fireman-70616
12/25/2022, 3:08 PM
magnificent-notebook-88304
12/26/2022, 7:37 AM
magnificent-notebook-88304
12/26/2022, 7:42 AM
enough-mouse-67490
12/26/2022, 9:29 AM
plain-cricket-83456
12/27/2022, 1:34 AM
silly-ability-65278
12/27/2022, 8:55 AM
late-ability-59580
12/27/2022, 10:03 AM
0.9.3
• I specify target_platform and platform_instance in the yaml files
thankful-fireman-70616
12/27/2022, 1:50 PM
plain-cricket-83456
12/28/2022, 2:06 AM
clever-dawn-33472
12/28/2022, 4:52 AM
clever-dawn-33472
12/28/2022, 4:56 AM
enough-mouse-67490
12/27/2022, 3:56 PM
source:
  type: lookml
  config:
    parse_table_names_from_sql: True
    github_info:
      deploy_key: ${LOOKER_BI_DEPLOY_KEY_PROD}
      repo: ____/looker-snowflake
    api:
      base_url: ${LOOKER_BI_BASE_URL}
      client_secret: ${LOOKER_BI_CLIENT_SECRET_PROD}
      client_id: ${LOOKER_BI_CLIENT_ID_PROD}
    connection_to_platform_map:
      snowflake:
        platform: snowflake
        default_db: PRD_BI
and getting this:
Pipeline finished with at least 21 warnings; produced 0 events in 27.11 seconds.
error msg: Failed to load connection prod_unsec_27062022. Check your API key permissions.
Thanks in advance 🙏
steep-family-13549
12/28/2022, 11:21 AM
microscopic-mechanic-13766
12/29/2022, 11:13 AM
ERROR {datahub.entrypoints:183} - Failed to configure source (datahub.ingestion.source.hdfs.source.HDFSSource): 1 validation error for HDFSSourceConfig
options
extra fields not permitted (type=value_error.extra)
What should I add to the configuration so that the options field is accepted in the recipe?
This is my current configuration:
path_specs: List[PathSpec] = Field(
description="List of PathSpec. See [below](#path-spec) the details about PathSpec"
)
username: Optional[str] = Field(default=None, description="username")
password: Optional[pydantic.SecretStr] = Field(
default=None, exclude=True, description="password"
)
host_port: str = Field(description="host URL")
connect_args: Optional[Dict] = Field(
default=None,
description="Connect args to pass",
exclude=True
)
max_rows: int = Field(
default=100,
description="Maximum number of rows to use when inferring schemas for TSV and CSV files.",
)
_rename_path_spec_to_plural = pydantic_renamed_field(
"path_spec", "path_specs", lambda path_spec: [path_spec]
)
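Since the config model rejects undeclared keys, the way to make `options:` legal in the recipe is to declare it on the model itself. A sketch under the assumption that the model is a plain pydantic class (the real `HDFSSourceConfig` base classes aren't shown above, so this is an illustration, not a patch):

```python
# Hypothetical sketch: declare an "options" field so recipes may set it.
# HDFSSourceConfig here is a simplified stand-in for the pasted config.
from typing import Any, Dict, Optional

import pydantic

class HDFSSourceConfig(pydantic.BaseModel):
    class Config:
        extra = "forbid"  # this is what makes undeclared keys fail

    host_port: str = pydantic.Field(description="host URL")
    # Declaring the field is what lifts "extra fields not permitted".
    options: Optional[Dict[str, Any]] = pydantic.Field(
        default=None,
        description="Free-form options passed through to the file reader.",
    )

cfg = HDFSSourceConfig.parse_obj(
    {"host_port": "hdfs://namenode:8020", "options": {"header": "true"}}
)
print(cfg.options)  # {'header': 'true'}
```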
kind-dusk-91074
12/29/2022, 12:44 PM
acceptable-account-83031
12/29/2022, 3:04 PM
late-jackal-56547
12/29/2022, 4:29 PM
lively-dusk-19162
12/29/2022, 8:00 PM
steep-family-13549
12/30/2022, 11:25 AM
adamant-sugar-28445
12/31/2022, 1:50 PM
datahub delete --urn "urn:li:dataJob:(urn:li:dataFlow:(spark,Spark%20shell,local[3]),QueryExecId_1)"
but it didn't work. Could anyone tell me how to do this correctly?
gorgeous-memory-27579
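One thing worth checking before retrying the delete: the URN mixes a percent-encoded space (`Spark%20shell`) with literal brackets, and the delete will generally only match a URN that is byte-for-byte what the server stored. A quick stdlib comparison of the two candidate forms:

```python
# The delete only matches the exact stored URN string; comparing the
# %-encoded and decoded forms shows the two candidates to try.
from urllib.parse import unquote

urn = "urn:li:dataJob:(urn:li:dataFlow:(spark,Spark%20shell,local[3]),QueryExecId_1)"
print(unquote(urn))
# urn:li:dataJob:(urn:li:dataFlow:(spark,Spark shell,local[3]),QueryExecId_1)
```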
01/01/2023, 5:42 PM
source:
  type: superset
  config:
    # Coordinates
    connect_uri: http://superset.example.com:8088  # url changed
    username: username
    password: password
The logs show:
Cli report:
{'cli_version': '0.9.3',
'cli_entry_location': '/Users/seandavis/Library/Caches/pypoetry/virtualenvs/datahub-local-wHVdVnmL-py3.9/lib/python3.9/site-packages/datahub/__init__.py',
'py_version': '3.9.6 (default, Sep 10 2021, 16:04:06) \n[Clang 12.0.5 (clang-1205.0.22.11)]',
'py_exec_path': '/Users/seandavis/Library/Caches/pypoetry/virtualenvs/datahub-local-wHVdVnmL-py3.9/bin/python',
'os_details': 'macOS-13.0-x86_64-i386-64bit',
'mem_info': '92.97 MB'}
Source (superset) report:
{'events_produced': '11',
'events_produced_per_sec': '3',
'event_ids': ['urn:li:chart:(superset,18)',
'urn:li:chart:(superset,6)',
'urn:li:chart:(superset,17)',
'urn:li:chart:(superset,9)',
'urn:li:chart:(superset,8)',
'urn:li:chart:(superset,7)',
'urn:li:chart:(superset,4)',
'urn:li:chart:(superset,5)',
'urn:li:chart:(superset,3)',
'urn:li:chart:(superset,2)'],
'warnings': {},
'failures': {},
'start_time': '2023-01-01 10:30:25.434105 (3.05 seconds ago).',
'running_time': '3.05 seconds'}
Sink (datahub-rest) report:
{'total_records_written': '11',
'records_written_per_second': '3',
'warnings': [],
'failures': [],
'start_time': '2023-01-01 10:30:24.933428 (3.55 seconds ago).',
'current_time': '2023-01-01 10:30:28.484836 (now).',
'total_duration_in_seconds': '3.55',
'gms_version': 'v0.9.3',
'pending_requests': '0'}
I checked the superset dashboard endpoint and I see a nice list of dashboards there. Any thoughts on what I'm missing to get the dashboards? A quick search of Google and Slack didn't turn anything up.
enough-mouse-67490
01/02/2023, 8:55 AM
brainy-piano-85560
01/02/2023, 12:26 PM