# ingestion
i
Hi team, I'm getting the error below while creating a BigQuery integration in our Kubernetes deployment, while a similar integration with the same configuration succeeds on my localhost (with the quickstart).
```
~~~~ Execution Summary ~~~~

RUN_INGEST - {'errors': [],
 'exec_id': '7f529d57-21f5-4d39-a8e8-2b92580692ab',
 'infos': ['2022-09-12 10:22:14.801662 [exec_id=7f529d57-21f5-4d39-a8e8-2b92580692ab] INFO: Starting execution for task with name=RUN_INGEST',
           '2022-09-12 10:22:14.855554 [exec_id=7f529d57-21f5-4d39-a8e8-2b92580692ab] INFO: Caught exception EXECUTING '
           'task_id=7f529d57-21f5-4d39-a8e8-2b92580692ab, name=RUN_INGEST, stacktrace=Traceback (most recent call last):\n'
           '  File "/usr/local/lib/python3.9/site-packages/acryl/executor/execution/default_executor.py", line 121, in execute_task\n'
           '    self.event_loop.run_until_complete(task_future)\n'
           '  File "/usr/local/lib/python3.9/site-packages/nest_asyncio.py", line 89, in run_until_complete\n'
           '    return f.result()\n'
           '  File "/usr/local/lib/python3.9/asyncio/futures.py", line 201, in result\n'
           '    raise self._exception\n'
           '  File "/usr/local/lib/python3.9/asyncio/tasks.py", line 256, in __step\n'
           '    result = coro.send(None)\n'
           '  File "/usr/local/lib/python3.9/site-packages/acryl/executor/execution/sub_process_ingestion_task.py", line 71, in execute\n'
           '    validated_args = SubProcessIngestionTaskArgs.parse_obj(args)\n'
           '  File "pydantic/main.py", line 521, in pydantic.main.BaseModel.parse_obj\n'
           '  File "pydantic/main.py", line 341, in pydantic.main.BaseModel.__init__\n'
           'pydantic.error_wrappers.ValidationError: 1 validation error for SubProcessIngestionTaskArgs\n'
           'debug_mode\n'
           '  extra fields not permitted (type=value_error.extra)\n']}
Execution finished with errors.
```
h
Hi @important-answer-79732, could you please share the recipe? Could it be due to a version difference between your localhost and Kubernetes deployments?
e
Here is the recipe of the BigQuery integration.
```yaml
source:
    type: bigquery
    config:
        include_table_lineage: true
        table_pattern:
            allow:
                - '.*\.bq_dataset\.bq_table'
        upstream_lineage_in_report: true
        credential:
            private_key_id: '${PRIVATE_KEY_ID}'
            project_id: <GCP_Project_ID>
            client_email: <service-account-name>@<GCP_Project_ID>.iam.gserviceaccount.com
            private_key: '${PRIVATE_KEY}'
            client_id: '<service-account-client-id>'
        profiling:
            enabled: true
        project_id: <GCP_Project_ID>
        include_view_lineage: true
        view_pattern:
            deny:
                - '.*'
        stateful_ingestion:
            enabled: true
        schema_pattern:
            allow:
                - '.*bq_dataset.*'
```
I've anonymised the secrets, but the structure and configs used are the same.
h
I think you can remove `include_view_lineage` from the recipe; otherwise it looks good. Which version of DataHub have you deployed on Kubernetes? I believe this error occurs when using managed ingestion?
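One way to check whether the recipe itself is at fault is to run it outside managed ingestion, through the DataHub Python API. A minimal sketch with placeholder values — the GMS address is an assumption, and credentials are expected to come from application-default auth (e.g. GOOGLE_APPLICATION_CREDENTIALS):
```python
# Sketch: run the recipe programmatically, bypassing the UI executor.
# The sink server URL and BigQuery values are placeholders, not real config.
from datahub.ingestion.run.pipeline import Pipeline

pipeline = Pipeline.create(
    {
        "source": {
            "type": "bigquery",
            "config": {
                "project_id": "<GCP_Project_ID>",
                "include_table_lineage": True,
                "table_pattern": {"allow": [r".*\.bq_dataset\.bq_table"]},
            },
        },
        "sink": {
            "type": "datahub-rest",
            "config": {"server": "http://datahub-gms:8080"},  # assumed GMS address
        },
    }
)
pipeline.run()
pipeline.pretty_print_summary()
```
If that run succeeds against the same GMS, the recipe is fine and the problem sits in the executor/actions container rather than the source config.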
e
Tried removing `include_view_lineage`, but I'm still getting the same error. Before this, I had also tried uninstalling and reinstalling the containers with Helm, but no change.
> I believe this error occurs when using managed ingestion?
Yes, and I'm getting the same error with the MySQL integration as well.
The Helm chart's values.yaml shows `tag: "v0.8.44"`, which is what I used for the deployment.
@hundreds-photographer-13496, please let me know how I can resolve the pydantic ValidationError I'm currently facing.
s
We are getting the same error with Snowflake and dbt ingestions. We started having this problem after upgrading to version 0.8.44.
g
Do you know which version of datahub-actions you're running? It's versioned separately from the other images. I believe you need datahub-actions 0.0.6.
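That would be consistent with the traceback: the server passes a `debug_mode` argument that the older executor's pydantic model doesn't declare, and the model forbids extra fields. A minimal sketch of that failure mode, assuming a pydantic v1 model with extra fields forbidden (the model below is illustrative, not the actual acryl-executor code):
```python
# Illustrative sketch of the version-mismatch failure; this is NOT the real
# SubProcessIngestionTaskArgs from acryl-executor.
from pydantic import BaseModel, ValidationError


class SubProcessIngestionTaskArgs(BaseModel):
    recipe: str  # hypothetical field, for illustration only

    class Config:
        extra = "forbid"  # unknown keys fail validation


try:
    # A newer server sends debug_mode; the older model doesn't know the field.
    SubProcessIngestionTaskArgs.parse_obj({"recipe": "...", "debug_mode": "false"})
except ValidationError as err:
    print(err)
    # 1 validation error for SubProcessIngestionTaskArgs
    # debug_mode
    #   extra fields not permitted (type=value_error.extra)
```
Upgrading the actions image brings its argument model back in sync with what the server sends.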
e
Thanks @gray-shoe-75895, updating datahub-actions from 0.0.4 to 0.0.6 resolved the issue.
s
Thanks, I'm going to try this then 👌