Dana Vatavu
10/18/2022, 6:55 AM
Alexander Pospiech
10/18/2022, 9:03 AM
Svatopluk Chalupa
10/18/2022, 9:18 AM
Alexander Pospiech
10/18/2022, 10:01 AM
2022-10-18 04:26:03.376 UTC [25558] [IP] airbyte@parkdepot LOG: SSL error: too many key updates
2022-10-18 04:26:03.376 UTC [25558] [IP] airbyte@parkdepot LOG: could not receive data from client: Connection reset by peer
2022-10-18 04:26:03.376 UTC [25558] [IP] airbyte@parkdepot LOG: unexpected EOF on client connection with an open transaction
Mamadi
10/18/2022, 10:26 AM
2022-10-18 09:52:25 INFO i.a.w.n.NormalizationAirbyteStreamFactory(filterOutAndHandleNonAirbyteMessageLines):104 -
2022-10-18 09:52:25 normalization > Configuration:
2022-10-18 09:52:25 normalization > profiles.yml file [OK found and valid]
2022-10-18 09:52:25 normalization > dbt_project.yml file [OK found and valid]
2022-10-18 09:52:25 INFO i.a.w.n.NormalizationAirbyteStreamFactory(filterOutAndHandleNonAirbyteMessageLines):104 -
2022-10-18 09:52:25 normalization > Required dependencies:
2022-10-18 09:52:25 normalization > - git [OK found]
2022-10-18 09:52:25 INFO i.a.w.n.NormalizationAirbyteStreamFactory(filterOutAndHandleNonAirbyteMessageLines):104 -
2022-10-18 09:52:25 normalization > Connection:
2022-10-18 09:52:25 normalization > driver: native
2022-10-18 09:52:25 normalization > host: host.docker.internal
2022-10-18 09:52:25 normalization > port: 8123
2022-10-18 09:52:25 normalization > user: cuser
2022-10-18 09:52:25 normalization > schema: kronos
2022-10-18 09:52:25 normalization > secure: False
2022-10-18 09:52:25 normalization > verify: True
2022-10-18 09:52:25 normalization > connect_timeout: 10
2022-10-18 09:52:25 normalization > send_receive_timeout: 300
2022-10-18 09:52:25 normalization > sync_request_timeout: 5
2022-10-18 09:52:25 normalization > compress_block_size: 1048576
2022-10-18 09:52:25 normalization > compression:
2022-10-18 09:52:25 normalization > custom_settings: None
2022-10-18 09:52:25 normalization > Connection test: [ERROR]
2022-10-18 09:52:25 INFO i.a.w.n.NormalizationAirbyteStreamFactory(filterOutAndHandleNonAirbyteMessageLines):104 -
2022-10-18 09:52:25 normalization > 1 check failed:
2022-10-18 09:52:25 normalization > dbt was unable to connect to the specified database.
2022-10-18 09:52:25 normalization > The database returned the following error:
2022-10-18 09:52:25 INFO i.a.w.n.NormalizationAirbyteStreamFactory(filterOutAndHandleNonAirbyteMessageLines):104 -
2022-10-18 09:52:25 normalization > >Database Error
2022-10-18 09:52:25 normalization > Code: 102. Unexpected packet from server host.docker.internal:8123 (expected Hello or Exception, got Unknown packet)
2022-10-18 09:52:25 INFO i.a.w.n.NormalizationAirbyteStreamFactory(filterOutAndHandleNonAirbyteMessageLines):104 -
2022-10-18 09:52:25 normalization > Check your database credentials and try again. For more information, visit:
2022-10-18 09:52:25 normalization > <https://docs.getdbt.com/docs/configure-your-profile>
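The connection block above shows driver: native pointed at port 8123, which is normally ClickHouse's HTTP port; the native TCP protocol usually listens on 9000, and pointing a native client at the HTTP port typically produces exactly this "expected Hello or Exception, got Unknown packet" error. Below is a minimal connectivity sketch to test both ports outside Airbyte, assuming the clickhouse-driver Python package; host, user and database are copied from the log, and the password is a placeholder.

# Connectivity sketch, assuming `pip install clickhouse-driver`.
# Host, user and database are taken from the log above; the password is a placeholder.
from clickhouse_driver import Client

HOST = "host.docker.internal"

for port in (9000, 8123):  # 9000 = native TCP interface, 8123 = HTTP interface
    client = Client(host=HOST, port=port, user="cuser", password="<password>", database="kronos")
    try:
        print(port, "->", client.execute("SELECT version()"))
    except Exception as exc:  # the 8123 attempt is expected to fail like the log above
        print(port, "-> failed:", exc)

If 9000 works and 8123 reproduces the Code 102 error, switching the destination's port (or the driver type) should fix the normalization connection check.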
Amrendra nath Upadhyay
10/18/2022, 10:36 AM
Frank Kody
10/18/2022, 11:13 AM
Caused by: java.lang.RuntimeException: java.sql.SQLException: [Amazon](500310) Invalid operation: Problem reading manifest file - S3CurlException: Connection timed out after 50001 milliseconds, CurlError 28, multiCurlError 0, CanRetry 1, UserError 0
Details: -----------------------------------------------
error: Problem reading manifest file - S3CurlException: Connection timed out after 50001 milliseconds, CurlError 28, multiCurlError 0, CanRetry 1, UserError 0
code: 9001
context: <s3://insurely-airbyte-prod/data_sync/prod/hubspot_campaigns/2022_10_18_07_682ec106-c3f7-498b-9f10-ac0fbd3f233d/ff8681dd-d32a-4184-a263-b66f22843c66.manifest>
query: 2051168
location: s3_utility.cpp:334
process: padbmaster [pid=18629]
-----------------------------------------------;
at io.airbyte.commons.lang.Exceptions.castCheckedToRuntime(Exceptions.java:58)
at io.airbyte.commons.lang.Exceptions.toRuntime(Exceptions.java:41)
at io.airbyte.integrations.destination.redshift.operations.RedshiftS3StagingSqlOperations.executeCopy(RedshiftS3StagingSqlOperations.java:137)
at io.airbyte.integrations.destination.redshift.operations.RedshiftS3StagingSqlOperations.lambda$copyIntoTmpTableFromStage$1(RedshiftS3StagingSqlOperations.java:107)
at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
at java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
at java.base/java.util.stream.Streams$StreamBuilderImpl.forEachRemaining(Streams.java:411)
at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
at java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:596)
at io.airbyte.integrations.destination.redshift.operations.RedshiftS3StagingSqlOperations.lambda$copyIntoTmpTableFromStage$2(RedshiftS3StagingSqlOperations.java:107)
at io.airbyte.commons.lang.Exceptions.castCheckedToRuntime(Exceptions.java:54)
at io.airbyte.commons.lang.Exceptions.toRuntime(Exceptions.java:41)
at io.airbyte.integrations.destination.redshift.operations.RedshiftS3StagingSqlOperations.copyIntoTmpTableFromStage(RedshiftS3StagingSqlOperations.java:105)
at io.airbyte.integrations.destination.staging.StagingConsumerFactory.lambda$onCloseFunction$3(StagingConsumerFactory.java:195)
... 6 more
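CurlError 28 here is Redshift itself timing out while fetching the staging manifest from S3, which usually points at networking or permissions between the cluster and the bucket rather than at Airbyte. A hedged diagnostic sketch to confirm the manifest object exists and is readable, assuming boto3 and credentials matching the staging setup, with the bucket and key copied from the error context above:

# Diagnostic sketch, assuming boto3 and credentials equivalent to the ones the COPY uses.
# This only proves the object is reachable from wherever you run it, not from inside the
# Redshift cluster's VPC.
import boto3

s3 = boto3.client("s3")
head = s3.head_object(
    Bucket="insurely-airbyte-prod",
    Key="data_sync/prod/hubspot_campaigns/2022_10_18_07_682ec106-c3f7-498b-9f10-ac0fbd3f233d/ff8681dd-d32a-4184-a263-b66f22843c66.manifest",
)
print(head["ContentLength"], head["LastModified"])

If this succeeds but COPY still times out, check that the cluster can actually reach S3; an S3 gateway endpoint or NAT is often needed when enhanced VPC routing is enabled.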
Luis Pereira
10/18/2022, 11:17 AM
Failed to load <https://zohourlhere>: ConfigurationError('Reader json is not supported\nTraceback (most recent call last):\n File "/airbyte/integration_code/source_file/client.py", line 314, in load_dataframes\n reader = readers[self._reader_format]\nKeyError: \'json\'\n')
Traceback (most recent call last):
  File "/airbyte/integration_code/source_file/client.py", line 314, in load_dataframes
    reader = readers[self._reader_format]
KeyError: 'json'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/airbyte/integration_code/source_file/source.py", line 95, in check
    next(client.load_dataframes(f))
  File "/airbyte/integration_code/source_file/client.py", line 318, in load_dataframes
    raise ConfigurationError(error_msg) from err
source_file.client.ConfigurationError: Reader json is not supported

Traceback (most recent call last):
  File "/airbyte/integration_code/source_file/client.py", line 314, in load_dataframes
    reader = readers[self._reader_format]
KeyError: 'json'
Can someone help please?
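The KeyError: 'json' means the reader format configured for the File source is not among the readers that connector version supports. Since the File source's readers are largely pandas-based, one hedged way to check that the data itself parses, independent of Airbyte, is a sketch like the following; the URL is the placeholder from the message above, and pandas is assumed to be installed:

# Sketch only: verifies the file parses as JSON with pandas outside Airbyte.
# "https://zohourlhere" is the placeholder URL from the message above.
import pandas as pd

url = "https://zohourlhere"
df = pd.read_json(url)  # try lines=True instead if the file is newline-delimited JSON (jsonl)
print(df.head())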
Vikas Goswami
10/18/2022, 11:35 AM
Tony Lewis
10/18/2022, 12:05 PM
ns
10/18/2022, 12:21 PM
2022-10-18 11:34:34 ERROR i.a.w.g.DefaultCheckConnectionWorker(run):100 - Unexpected error while checking connection:
io.airbyte.workers.exception.WorkerException: null
at io.airbyte.workers.process.KubeProcessFactory.create(KubeProcessFactory.java:138) ~[io.airbyte-airbyte-workers-0.39.41-alpha.jar:?]
at io.airbyte.workers.process.AirbyteIntegrationLauncher.check(AirbyteIntegrationLauncher.java:84) ~[io.airbyte-airbyte-workers-0.39.41-alpha.jar:?]
at io.airbyte.workers.general.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:63) ~[io.airbyte-airbyte-workers-0.39.41-alpha.jar:?]
at io.airbyte.workers.general.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:37) ~[io.airbyte-airbyte-workers-0.39.41-alpha.jar:?]
at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:155) ~[io.airbyte-airbyte-workers-0.39.41-alpha.jar:?]
at java.lang.Thread.run(Thread.java:1589) [?:?]
Caused by: java.lang.InterruptedException
at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:1681) ~[?:?]
at java.util.concurrent.LinkedBlockingDeque.pollFirst(LinkedBlockingDeque.java:515) ~[?:?]
at java.util.concurrent.LinkedBlockingDeque.poll(LinkedBlockingDeque.java:677) ~[?:?]
at io.airbyte.workers.process.KubePortManagerSingleton.take(KubePortManagerSingleton.java:67) ~[io.airbyte-airbyte-workers-0.39.41-alpha.jar:?]
at io.airbyte.workers.process.KubeProcessFactory.create(KubeProcessFactory.java:102) ~[io.airbyte-airbyte-workers-0.39.41-alpha.jar:?]
... 5 more
Robert Put
10/18/2022, 1:21 PM
Abba
10/18/2022, 1:44 PM
stephen oriyomi
10/18/2022, 1:51 PM
Lucas Almada
10/18/2022, 2:00 PM
Albert Marrero
10/18/2022, 2:59 PM
Alex Quartey-Papafio
10/18/2022, 4:41 PM
Robert Put
10/18/2022, 5:54 PM
2022-10-18 14:30:45 normalization > Database Error in model RESTAURANT_SCD (models/generated/airbyte_incremental/scd/READ_MIRROR_V3/RESTAURANT_SCD.sql)
2022-10-18 14:30:45 normalization > 100035 (22007): Timestamp '+192153-11-18T12:06:13.000000' is not recognized
This is in the Snowflake destination,
which I understand won't support the timestamp...
but I'm not sure where this timestamp is coming from; on the source DB I searched for it but can't find it:
SELECT *
FROM restaurant
WHERE updated_at = '192153-11-18 12:06:13.000000';
Is there an easy way to search for the Airbyte row id in the raw table in Snowflake to see the entire row with the issue?
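One way to locate the offending record is to query the raw table that normalization reads from; for Airbyte versions of this era the raw tables follow the _AIRBYTE_RAW_<stream> convention and keep the whole record in a VARIANT column _AIRBYTE_DATA next to _AIRBYTE_AB_ID. A hedged sketch using snowflake-connector-python; the table and schema names are guesses inferred from the model path above, and the connection parameters are placeholders:

# Sketch using snowflake-connector-python. The raw table name is a guess based on the default
# _AIRBYTE_RAW_<stream> convention, and the schema/credentials are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    warehouse="<warehouse>", database="<database>", schema="READ_MIRROR_V3",
)
cur = conn.cursor()
cur.execute(
    """
    SELECT _AIRBYTE_AB_ID, _AIRBYTE_EMITTED_AT, _AIRBYTE_DATA
    FROM _AIRBYTE_RAW_RESTAURANT
    WHERE _AIRBYTE_DATA:updated_at::string LIKE '%192153%'
    """
)
for row in cur.fetchall():
    print(row)

If the row shows up there, the out-of-range timestamp is most likely arriving from the source rather than being produced by normalization.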
le Minh Nguyen
10/18/2022, 10:17 PM
gcloud --project=$PROJECT_ID beta compute ssh $INSTANCE_NAME -- -L 8000:localhost:8000 -N -f
I encounter the error: bind [127.0.0.1]:8000: Address already in use
I have no idea what is using that port. I have tried changing to ports 8080, 80 and 4444, but none work. What should I do here? Thank you
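"Address already in use" means something on the workstation already holds local port 8000, often a stale ssh -f tunnel left over from a previous run of the same command. From a shell, lsof -i :8000 or ss -ltnp will show it; the sketch below does the same in Python, assuming the psutil package is installed:

# Sketch assuming `pip install psutil`; run with enough privileges to see other users' processes.
# Prints whatever is currently listening on the local port the tunnel tried to bind.
import psutil

PORT = 8000  # the port the SSH -L forward tried to bind

for conn in psutil.net_connections(kind="inet"):
    if conn.laddr and conn.laddr.port == PORT and conn.status == psutil.CONN_LISTEN:
        name = psutil.Process(conn.pid).name() if conn.pid else "unknown"
        print(f"port {PORT} is held by pid={conn.pid} ({name})")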
Andrew Exlet
10/18/2022, 11:06 PM
Lucas Souza Lira Silva
10/18/2022, 11:18 PM
Emilja Dankevičiūtė
10/19/2022, 6:26 AM
airbyte webapp, where the deployment ignores serviceAccount. We have global.serviceAccountName as well as
serviceAccount:
  create: false
  name: ..
and everything is OK for every other pod except webapp. If I view the resource description I still see serviceAccountName: default, while for the others I can see the values we've provided. We're using the service account to load secrets from Google Secret Manager (by attaching a volume) and would prefer not to have them hardcoded anywhere, as it makes the secret lifecycle much simpler for us. Is there anything we can do?
Georg Heiler
10/19/2022, 9:02 AM
Donk
10/19/2022, 11:00 AM
Anandkumar Dharmaraj
10/19/2022, 11:05 AM
Sebastian Brickel
10/19/2022, 11:40 AM
python main.py check --config secrets/config.json
returns
{"type": "LOG", "log": {"level": "INFO", "message": "Check succeeded"}}
{"type": "CONNECTION_STATUS", "connectionStatus": {"status": "SUCCEEDED"}}
so the connection works. However, at step 4, when running
python main.py read --debug --config secrets/config.json --catalog integration_tests/configured_catalog.json
I get the following error message
{"type": "DEBUG", "message": "Debug logs enabled", "data": {}}
{"type": "LOG", "log": {"level": "FATAL", "message": "Expecting value: line 12 column 3 (char 240)\nTraceback (most recent call last):\n File \"/Users/sebastianbrickel/Documents/airbyte/airbyte-integrations/connectors/source-waiteraid/main.py\", line 13, in <module>\n launch(source, sys.argv[1:])\n File \"/Users/sebastianbrickel/Documents/airbyte/airbyte-integrations/connectors/source-waiteraid/.venv/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py\", line 123, in launch\n for message in source_entrypoint.run(parsed_args):\n File \"/Users/sebastianbrickel/Documents/airbyte/airbyte-integrations/connectors/source-waiteraid/.venv/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py\", line 111, in run\n config_catalog = self.source.read_catalog(parsed_args.catalog)\n File \"/Users/sebastianbrickel/Documents/airbyte/airbyte-integrations/connectors/source-waiteraid/.venv/lib/python3.9/site-packages/airbyte_cdk/sources/source.py\", line 90, in read_catalog\n return ConfiguredAirbyteCatalog.parse_obj(self.read_config(catalog_path))\n File \"/Users/sebastianbrickel/Documents/airbyte/airbyte-integrations/connectors/source-waiteraid/.venv/lib/python3.9/site-packages/airbyte_cdk/connector.py\", line 53, in read_config\n return json.loads(contents)\n File \"/Users/sebastianbrickel/opt/miniconda3/lib/python3.9/json/_init_.py\", line 346, in loads\n return _default_decoder.decode(s)\n File \"/Users/sebastianbrickel/opt/miniconda3/lib/python3.9/json/decoder.py\", line 337, in decode\n obj, end = self.raw_decode(s, idx=_w(s, 0).end())\n File \"/Users/sebastianbrickel/opt/miniconda3/lib/python3.9/json/decoder.py\", line 355, in raw_decode\n raise JSONDecodeError(\"Expecting value\", s, err.value) from None\njson.decoder.JSONDecodeError: Expecting value: line 12 column 3 (char 240)"}}
{"type": "TRACE", "trace": {"type": "ERROR", "emitted_at": 1666179423642.493, "error": {"message": "Something went wrong in the connector. See the logs for more details.", "internal_message": "Expecting value: line 12 column 3 (char 240)", "stack_trace": "Traceback (most recent call last):\n File \"/Users/sebastianbrickel/Documents/airbyte/airbyte-integrations/connectors/source-waiteraid/main.py\", line 13, in <module>\n launch(source, sys.argv[1:])\n File \"/Users/sebastianbrickel/Documents/airbyte/airbyte-integrations/connectors/source-waiteraid/.venv/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py\", line 123, in launch\n for message in source_entrypoint.run(parsed_args):\n File \"/Users/sebastianbrickel/Documents/airbyte/airbyte-integrations/connectors/source-waiteraid/.venv/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py\", line 111, in run\n config_catalog = self.source.read_catalog(parsed_args.catalog)\n File \"/Users/sebastianbrickel/Documents/airbyte/airbyte-integrations/connectors/source-waiteraid/.venv/lib/python3.9/site-packages/airbyte_cdk/sources/source.py\", line 90, in read_catalog\n return ConfiguredAirbyteCatalog.parse_obj(self.read_config(catalog_path))\n File \"/Users/sebastianbrickel/Documents/airbyte/airbyte-integrations/connectors/source-waiteraid/.venv/lib/python3.9/site-packages/airbyte_cdk/connector.py\", line 53, in read_config\n return json.loads(contents)\n File \"/Users/sebastianbrickel/opt/miniconda3/lib/python3.9/json/_init_.py\", line 346, in loads\n return _default_decoder.decode(s)\n File \"/Users/sebastianbrickel/opt/miniconda3/lib/python3.9/json/decoder.py\", line 337, in decode\n obj, end = self.raw_decode(s, idx=_w(s, 0).end())\n File \"/Users/sebastianbrickel/opt/miniconda3/lib/python3.9/json/decoder.py\", line 355, in raw_decode\n raise JSONDecodeError(\"Expecting value\", s, err.value) from None\njson.decoder.JSONDecodeError: Expecting value: line 12 column 3 (char 240)\n", "failure_type": "system_error"}}}
I am completely stuck right now. Any advice/hint is welcome.
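The traceback fails inside read_catalog -> json.loads, so the problem is invalid JSON in integration_tests/configured_catalog.json around line 12, column 3 (a trailing comma or a stray comment is a common cause). A stdlib-only sketch to pinpoint the spot, assuming the catalog path used in the read command above:

# Stdlib-only sketch to locate the bad spot in the catalog file; the path matches the
# --catalog argument used above.
import json

path = "integration_tests/configured_catalog.json"
text = open(path, encoding="utf-8").read()
try:
    json.loads(text)
    print("catalog parses fine")
except json.JSONDecodeError as err:
    # Show the offending line plus a caret under the column the parser complained about.
    bad_line = text.splitlines()[err.lineno - 1]
    print(f"{path}:{err.lineno}:{err.colno}: {err.msg}")
    print(bad_line)
    print(" " * (err.colno - 1) + "^")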
Aviel Even-Or
10/19/2022, 1:04 PM
Patrik Deke
10/19/2022, 1:07 PM
Andrzej Brzusnian
10/19/2022, 1:20 PM
Jonathan Anspaugh
10/19/2022, 1:30 PM