Madison Mae
07/06/2023, 5:31 PM
Richard Anthony Hein (Auxon)
07/06/2023, 5:40 PM
Eduardo Aviles
07/06/2023, 6:55 PM
Suppressed: org.elasticsearch.client.ResponseException: method [POST], host [https://vpc-bi-sp-opensearch-bwb5kptvvi6tzbfkyqxhetnquy.us-east-1.es.amazonaws.com], URI [/_search/scroll], status line [HTTP/1.1 404 Not Found]
{"error":{"root_cause":[{"type":"search_context_missing_exception","reason":"No search context found for id [65883]"},{"type":"search_context_missing_exception","reason":"No search context found for id [65882]"},{"type":"search_context_missing_exception","reason":"No search context found for id [62223]"},{"type":"search_context_missing_exception","reason":"No search context found for id [62222]"}],"type":"search_phase_execution_exception","reason":"all shards failed","phase":"query","grouped":true,"failed_shards":[{"shard":-1,"index":null,"reason":{"type":"search_context_missing_exception","reason":"No search context found for id [65883]"}},{"shard":-1,"index":null,"reason":{"type":"search_context_missing_exception","reason":"No search context found for id [65882]"}},{"shard":-1,"index":null,"reason":{"type":"search_context_missing_exception","reason":"No search context found for id [62223]"}},{"shard":-1,"index":null,"reason":{"type":"search_context_missing_exception","reason":"No search context found for id [62222]"}}],"caused_by":{"type":"search_context_missing_exception","reason":"No search context found for id [62222]"}},"status":404}
at org.elasticsearch.client.RestClient.convertResponse(RestClient.java:326)
at org.elasticsearch.client.RestClient.performRequest(RestClient.java:296)
at org.elasticsearch.client.RestClient.performRequest(RestClient.java:270)
at org.elasticsearch.client.RestHighLevelClient.internalPerformRequest(RestHighLevelClient.java:1621)
... 25 more
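The 404 above means the scroll contexts had already expired (or been cleared) by the time the next /_search/scroll call arrived. A hedged sketch of the usual remedy: pass a keep-alive on every page request, not just the first one, so the context's TTL is renewed each round trip. The `search` and `scroll_next` callables here are hypothetical stand-ins for whatever client issues the requests:

```python
def scroll_all(search, scroll_next, keep_alive="5m"):
    """Drain a scroll cursor, renewing the context on every page.

    search(scroll=...)                     -> first page of results
    scroll_next(scroll_id=..., scroll=...) -> each subsequent page

    Sending `scroll` on every request resets the context's TTL, which
    avoids the search_context_missing_exception seen above when one
    page takes longer to process than the initial keep-alive window.
    """
    page = search(scroll=keep_alive)
    while page["hits"]["hits"]:
        yield from page["hits"]["hits"]
        page = scroll_next(scroll_id=page["_scroll_id"], scroll=keep_alive)
```

If the pages are processed slowly regardless, a longer keep-alive (or the point-in-time API on newer versions) may be needed.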
Has someone tried this before?
Fariha Baloch
07/06/2023, 7:15 PM
./airbyte-local.sh \
--src 'farosai/airbyte-faros-feeds-source' \
--src.feed_cfg.feed_name 'github-feed' \
--src.feed_cfg.feed_path 'vcs/github-feed' \
--src.feed_cfg.auth_cfg '{"auth":"token", "personal_access_token":"<my_token>"}' \
--src.feed_cfg.repos_query_mode '{"query_mode":"GitHubOrg"}' \
--src.feed_cfg.org_repo_list '["org/repo"]' \
--src.feed_cfg.cutoff_days '90' \
--src.feed_args '["--github-api-url", "<github api link>", "--debug"]' \
--dst.edition_configs.hasura_url 'http://host.docker.internal:8080/' \
--dst.edition_configs.hasura_admin_secret 'admin' \
--dst.edition_configs.edition 'community' \
--dst 'farosai/airbyte-faros-destination' \
--debug
Patrick Kompier
07/06/2023, 7:40 PM
Gabriel Martelloti
07/06/2023, 8:57 PM
Does TEMPORAL_HISTORY_RETENTION_IN_DAYS work in k8s when installing Airbyte through Helm? I tried setting it for airbyte-cron, airbyte-worker, and airbyte-server, but it seems the deletion job is never triggered.
Gabriel Levine
07/06/2023, 8:59 PM
io.airbyte.workers.internal.exception.StreamStatusException: Invalid stream status transition to COMPLETE.
I’m running Airbyte OSS 0.50.6 on GKE.
Jason Sorensen
07/06/2023, 9:16 PM
"stacktrace" : "java.time.format.DateTimeParseException: Text '2023-07-06T17:50:24.296000Z' could not be parsed, unparsed text found at index 26
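For reference, "unparsed text found at index 26" means the configured pattern consumed the six fractional-second digits but had nothing left to match the trailing 'Z'. A hedged sketch of normalizing such a cursor value before parsing (plain Python, independent of whatever the connector actually does internally):

```python
from datetime import datetime

def parse_cursor(ts: str) -> datetime:
    # Rewrite a trailing 'Z' as an explicit UTC offset so ISO-8601
    # parsers that reject 'Z' (e.g. fromisoformat on Python < 3.11)
    # still accept the value.
    if ts.endswith("Z"):
        ts = ts[:-1] + "+00:00"
    return datetime.fromisoformat(ts)
```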
It looks like the 'Z' is causing this error. I did not set up this Airbyte instance and only use the web UI. Is there a way to fix this cursor value so that the connection can run?
Chidambara Ganapathy
07/07/2023, 6:11 AM
Chidambara Ganapathy
07/07/2023, 6:15 AM
Haki Dere
07/07/2023, 10:49 AM
Lily Tian
07/07/2023, 2:26 PM
2023-07-07 14:17:24 INFO i.a.w.p.KubePodProcess(close):797 - (pod: airbyte / normalization-normalize-468-0-ycujg) - Closed all resources for pod
2023-07-07 14:17:24 INFO i.a.w.n.DefaultNormalizationRunner(close):196 - Terminating normalization process...
2023-07-07 14:17:24 ERROR i.a.w.g.DefaultNormalizationWorker(run):86 - Normalization failed for job
ramiro barraco
07/07/2023, 3:07 PM
Configuration check failed
AirbyteTracedException('Access Denied')
In the worker pods I get this in the logs:
Log4j2Appender says: Traceback (most recent call last):
File "/airbyte/integration_code/source_s3/stream.py", line 60, in filepath_iterator
response = client.list_objects_v2(**kwargs)
File "/usr/local/lib/python3.9/site-packages/botocore/client.py", line 530, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/usr/local/lib/python3.9/site-packages/botocore/client.py", line 964, in _make_api_call
raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the ListObjectsV2 operation: Access Denied
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/airbyte/integration_code/source_s3/source_files_abstract/source.py", line 63, in check_connection
for file_info in stream.filepath_iterator():
File "/airbyte/integration_code/source_s3/stream.py", line 64, in filepath_iterator
raise AirbyteTracedException(message, message, failure_type=FailureType.config_error)
airbyte_cdk.utils.traced_exception.AirbyteTracedException: Access Denied
2023-07-07 15:00:55 ERROR i.a.w.i.VersionedAirbyteStreamFactory(internalLog):308 - Traceback (most recent call last):
File "/airbyte/integration_code/source_s3/stream.py", line 60, in filepath_iterator
response = client.list_objects_v2(**kwargs)
File "/usr/local/lib/python3.9/site-packages/botocore/client.py", line 530, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/usr/local/lib/python3.9/site-packages/botocore/client.py", line 964, in _make_api_call
raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the ListObjectsV2 operation: Access Denied
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/airbyte/integration_code/source_s3/source_files_abstract/source.py", line 63, in check_connection
for file_info in stream.filepath_iterator():
File "/airbyte/integration_code/source_s3/stream.py", line 64, in filepath_iterator
raise AirbyteTracedException(message, message, failure_type=FailureType.config_error)
airbyte_cdk.utils.traced_exception.AirbyteTracedException: Access Denied
Log4j2Appender says: Check failed
2023-07-07 15:00:55 ERROR i.a.w.i.VersionedAirbyteStreamFactory(internalLog):308 - Check failed
2023-07-07 15:00:55 INFO i.a.w.p.ExitCodeWatcher(persistExitCode):117 - Received exit code 0 for pod source-s3-check-bf89656d-54b0-4e0d-9393-790344798210-0-vqedb
Log4j2Appender says: (pod: airbyte / source-s3-check-bf89656d-54b0-4e0d-9393-790344798210-0-vqedb) - Closed all resources for pod
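An AccessDenied on ListObjectsV2 specifically requires s3:ListBucket on the bucket ARN itself (not the /* object ARN), and the connector's check pod must actually be running under the IRSA-annotated service account. A hedged sketch of the minimal bucket policy, with a hypothetical bucket name:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::my-bucket"
    },
    {
      "Effect": "Allow",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::my-bucket/*"
    }
  ]
}
```

If the policy already looks like this, it is worth verifying which identity the connector pod (as opposed to the test pod) actually assumes, since connector pods may be launched with a different service account than the worker.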
The cluster is currently running in EKS, and by running a pod with the aws-cli installed in the namespace I could check that the service account is being used and that I can access the bucket. Am I missing something here?
Cesar Santos
07/07/2023, 3:26 PM
Andrés O. Arredondo
07/07/2023, 3:53 PM
java.lang.RuntimeException: Cannot upgrade from version 0.35.36-alpha to version 0.50.6 directly. First you must upgrade to version 0.37.0-alpha. After that upgrade is complete, you may upgrade to version 0.50.6.
I did a docker system prune -a
and tried the instructions on https://docs.airbyte.com/operator-guides/upgrading-airbyte/#overview for 0.32.0-alpha, but with 0.37.0-alpha in the checkout, and it keeps showing the same error. I have installed Airbyte with Docker before, but if that's the problem then I don't know where I should delete something. The instructions on the page just say to run
./run-ab-platform.sh
before cloning, so I am not sure what to do to fix this. A thread that appeared in "search" pointed to the 0.32.0-alpha error and seems to be fixed with the instructions in the link, but it doesn't work for me on this version.
James Salmon
07/07/2023, 3:54 PM
Eliott Bigiaoui
07/07/2023, 5:14 PM
2023-07-07 17:07:32 destination > INFO i.a.i.d.b.BufferManager(printQueueInfo):94 QUEUE INFO
Global Mem Manager -- max: 1.2 GB, allocated: 66.45 MB (66.44576072692871 MB), % used: 0.05407369854181614
Queue name: Table1, num records: 0, num bytes: 0 bytes
Queue name: Table2, num records: 0, num bytes: 0 bytes
Queue name: Table3, num records: 0, num bytes: 0 bytes
Queue name: Table4, num records: 0, num bytes: 0 bytes
Queue name: Table5, num records: 0, num bytes: 0 bytes
Queue name: Table6, num records: 0, num bytes: 0 bytes
Queue name: Table7, num records: 0, num bytes: 0 bytes
2023-07-07 17:07:32 destination > INFO i.a.i.d.FlushWorkers(retrieveWork):101 Retrieve Work -- Finding queues to flush
2023-07-07 17:07:32 destination > INFO i.a.i.d.FlushWorkers(printWorkerInfo):134 WORKER INFO
Pool queue size: 0, Active threads: 0
2023-07-07 17:07:33 destination > INFO i.a.i.d.FlushWorkers(retrieveWork):101 Retrieve Work -- Finding queues to flush
2023-07-07 17:07:34 destination > INFO i.a.i.d.FlushWorkers(retrieveWork):101 Retrieve Work -- Finding queues to flush
2023-07-07 17:07:35 destination > INFO i.a.i.d.FlushWorkers(retrieveWork):101 Retrieve Work -- Finding queues to flush
2023-07-07 17:07:36 destination > INFO i.a.i.d.FlushWorkers(retrieveWork):101 Retrieve Work -- Finding queues to flush
2023-07-07 17:07:37 destination > INFO i.a.i.d.FlushWorkers(retrieveWork):101 Retrieve Work -- Finding queues to flush
2023-07-07 17:07:38 destination > INFO i.a.i.d.FlushWorkers(retrieveWork):101 Retrieve Work -- Finding queues to flush
2023-07-07 17:07:39 destination > INFO i.a.i.d.FlushWorkers(retrieveWork):101 Retrieve Work -- Finding queues to flush
2023-07-07 17:07:40 destination > INFO i.a.i.d.FlushWorkers(retrieveWork):101 Retrieve Work -- Finding queues to flush
2023-07-07 17:07:41 destination > INFO i.a.i.d.FlushWorkers(retrieveWork):101 Retrieve Work -- Finding queues to flush
2023-07-07 17:07:42 destination > INFO i.a.i.d.b.BufferManager(printQueueInfo):94 QUEUE INFO
Global Mem Manager -- max: 1.2 GB, allocated: 66.45 MB (66.44576072692871 MB), % used: 0.05407369854181614
Queue name: Table1, num records: 0, num bytes: 0 bytes
Queue name: Table2, num records: 0, num bytes: 0 bytes
Queue name: Table3, num records: 0, num bytes: 0 bytes
Queue name: Table4, num records: 0, num bytes: 0 bytes
Queue name: Table5, num records: 0, num bytes: 0 bytes
Queue name: Table6, num records: 0, num bytes: 0 bytes
Queue name: Table7, num records: 0, num bytes: 0 bytes
2023-07-07 17:07:42 destination > INFO i.a.i.d.FlushWorkers(retrieveWork):101 Retrieve Work -- Finding queues to flush
2023-07-07 17:07:42 destination > INFO i.a.i.d.FlushWorkers(printWorkerInfo):134 WORKER INFO
Pool queue size: 0, Active threads: 0
2023-07-07 17:07:43 destination > INFO i.a.i.d.FlushWorkers(retrieveWork):101 Retrieve Work -- Finding queues to flush
2023-07-07 17:07:44 destination > INFO i.a.i.d.FlushWorkers(retrieveWork):101 Retrieve Work -- Finding queues to flush
Other details:
• the tables are created in Snowflake, but only some of the raw tables are populated (not the normalized ones)
• I have tried to reset the streams, same problem after resetting
• The sync worked yesterday when I had only one table
Any ideas? Thanks in advance!
Semyon Komissarov
07/07/2023, 6:51 PM
Octavia Squidington III
07/07/2023, 7:45 PM
Zack Parker
07/07/2023, 9:33 PM
Zack Parker
07/07/2023, 10:15 PM
Slackbot
07/08/2023, 7:16 AM
Rodrigo Mont'Alegre
07/08/2023, 7:34 AM
0.40.23. I would appreciate any insight. Here's part of the log:
2023-07-08 07:04:31 WARN c.n.s.JsonMetaSchema(newValidator):278 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2023-07-08 07:04:31 INFO i.a.v.j.JsonSchemaValidator(test):130 - JSON schema validation failed.
errors: $.auth_type: must be a constant value OAuth2.0, $.auth_type: does not have a value in the enumeration [OAuth2.0], $.access_token: is missing but it is required, $.refresh_token: is missing but it is required
2023-07-08 07:04:31 INFO i.a.v.j.JsonSchemaValidator(test):130 - JSON schema validation failed.
errors: $.auth_type: must be a constant value Key Pair Authentication, $.auth_type: does not have a value in the enumeration [Key Pair Authentication], $.private_key: is missing but it is required
2023-07-08 07:04:31 WARN c.n.s.JsonMetaSchema(newValidator):278 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2023-07-08 07:04:31 INFO i.a.v.j.JsonSchemaValidator(test):130 - JSON schema validation failed.
errors: $.password: object found, string expected
2023-07-08 07:04:31 INFO i.a.w.t.TemporalAttemptExecution(get):136 - Docker volume job log path: /tmp/workspace/63142/0/logs.log
2023-07-08 07:04:31 INFO i.a.w.t.TemporalAttemptExecution(get):141 - Executing worker wrapper. Airbyte version: 0.40.23
2023-07-08 07:04:31 INFO i.a.a.c.AirbyteApiClient(retryWithJitter):179 - Attempt 0 to save workflow id for cancellation
2023-07-08 07:04:32 INFO i.a.c.i.LineGobbler(voidCall):114 -
2023-07-08 07:04:32 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- START CHECK -----
2023-07-08 07:04:32 INFO i.a.c.i.LineGobbler(voidCall):114 -
2023-07-08 07:04:32 INFO i.a.c.i.LineGobbler(voidCall):114 - Checking if airbyte/source-mysql:2.0.11 exists...
2023-07-08 07:04:43 INFO i.a.c.i.LineGobbler(voidCall):114 - airbyte/source-mysql:2.0.11 was found locally.
2023-07-08 07:04:44 INFO i.a.w.p.DockerProcessFactory(create):119 - Creating docker container = source-mysql-check-63142-0-izimd with resources io.airbyte.config.ResourceRequirements@525765e7[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=]
2023-07-08 07:04:44 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/63142/0 --log-driver none --name source-mysql-check-63142-0-izimd --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=true -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e WORKER_CONNECTOR_IMAGE=airbyte/source-mysql:2.0.11 -e WORKER_JOB_ATTEMPT=0 -e AUTO_DETECT_SCHEMA=false -e AIRBYTE_VERSION=0.40.23 -e WORKER_JOB_ID=63142 airbyte/source-mysql:2.0.11 check --config source_config.json
2023-07-08 07:04:44 INFO i.a.w.i.VersionedAirbyteStreamFactory(create):97 - Reading messages from protocol version 0.2.0
2023-07-08 07:05:30 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - INFO i.a.i.s.m.MySqlSource(main):407 starting source: class io.airbyte.integrations.source.mysql.MySqlSource
2023-07-08 07:05:30 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - INFO i.a.i.b.IntegrationCliParser(parseOptions):126 integration args: {check=null, config=source_config.json}
2023-07-08 07:05:30 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - INFO i.a.i.b.IntegrationRunner(runInternal):108 Running integration: io.airbyte.integrations.base.ssh.SshWrappedSource
2023-07-08 07:05:30 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - INFO i.a.i.b.IntegrationRunner(runInternal):109 Command: CHECK
2023-07-08 07:05:30 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - INFO i.a.i.b.IntegrationRunner(runInternal):110 Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'}
2023-07-08 07:05:31 WARN i.a.w.i.DefaultAirbyteStreamFactory(internalLog):117 - WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2023-07-08 07:05:31 WARN i.a.w.i.DefaultAirbyteStreamFactory(internalLog):117 - WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2023-07-08 07:05:32 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - INFO i.a.i.b.s.SshTunnel(getInstance):204 Starting connection with method: NO_TUNNEL
2023-07-08 07:05:33 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - INFO c.z.h.HikariDataSource(<init>):80 HikariPool-1 - Starting...
2023-07-08 07:05:33 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - INFO c.z.h.HikariDataSource(<init>):82 HikariPool-1 - Start completed.
2023-07-08 07:06:33 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - INFO c.z.h.HikariDataSource(close):350 HikariPool-1 - Shutdown initiated...
2023-07-08 07:06:36 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - INFO c.z.h.HikariDataSource(close):352 HikariPool-1 - Shutdown completed.
2023-07-08 07:06:36 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - INFO i.a.i.b.IntegrationRunner(runInternal):186 Completed integration: io.airbyte.integrations.base.ssh.SshWrappedSource
2023-07-08 07:06:36 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - INFO i.a.i.s.m.MySqlSource(main):409 completed source: class io.airbyte.integrations.source.mysql.MySqlSource
2023-07-08 07:06:36 INFO i.a.c.i.LineGobbler(voidCall):114 -
2023-07-08 07:06:36 INFO i.a.w.t.TemporalAttemptExecution(get):163 - Stopping cancellation check scheduling...
2023-07-08 07:06:36 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- END CHECK -----
2023-07-08 07:06:36 INFO i.a.c.i.LineGobbler(voidCall):114 -
2023-07-08 07:06:36 WARN i.t.i.w.ActivityWorker$TaskHandlerImpl(logExceptionDuringResultReporting):365 - Failure during reporting of activity result to the server. ActivityId = cc93e74a-db81-37bc-a6b6-3895b739fa1f, ActivityType = RunWithJobOutput, WorkflowId=connection_manager_c68a63fd-ec58-45a3-b8ee-8d787ddf52e7, WorkflowType=ConnectionManagerWorkflow, RunId=61607809-4637-4161-b0f7-ef83a37e2d9d
io.grpc.StatusRuntimeException: NOT_FOUND: invalid activityID or activity already timed out or invoking workflow is completed
at io.grpc.stub.ClientCalls.toStatusRuntimeException(ClientCalls.java:271) ~[grpc-stub-1.50.2.jar:1.50.2]
at io.grpc.stub.ClientCalls.getUnchecked(ClientCalls.java:252) ~[grpc-stub-1.50.2.jar:1.50.2]
at io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:165) ~[grpc-stub-1.50.2.jar:1.50.2]
at io.temporal.api.workflowservice.v1.WorkflowServiceGrpc$WorkflowServiceBlockingStub.respondActivityTaskCompleted(WorkflowServiceGrpc.java:3840) ~[temporal-serviceclient-1.17.0.jar:?]
at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.lambda$sendReply$0(ActivityWorker.java:303) ~[temporal-sdk-1.17.0.jar:?]
at io.temporal.internal.retryer.GrpcRetryer.lambda$retry$0(GrpcRetryer.java:52) ~[temporal-serviceclient-1.17.0.jar:?]
at io.temporal.internal.retryer.GrpcSyncRetryer.retry(GrpcSyncRetryer.java:67) ~[temporal-serviceclient-1.17.0.jar:?]
at io.temporal.internal.retryer.GrpcRetryer.retryWithResult(GrpcRetryer.java:60) ~[temporal-serviceclient-1.17.0.jar:?]
at io.temporal.internal.retryer.GrpcRetryer.retry(GrpcRetryer.java:50) ~[temporal-serviceclient-1.17.0.jar:?]
at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.sendReply(ActivityWorker.java:298) ~[temporal-sdk-1.17.0.jar:?]
at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handleActivity(ActivityWorker.java:252) ~[temporal-sdk-1.17.0.jar:?]
at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handle(ActivityWorker.java:206) ~[temporal-sdk-1.17.0.jar:?]
at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handle(ActivityWorker.java:179) ~[temporal-sdk-1.17.0.jar:?]
at io.temporal.internal.worker.PollTaskExecutor.lambda$process$0(PollTaskExecutor.java:93) ~[temporal-sdk-1.17.0.jar:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
at java.lang.Thread.run(Thread.java:1589) ~[?:?]
2023-07-08 07:07:02 INFO i.a.w.t.TemporalAttemptExecution(get):136 - Docker volume job log path: /tmp/workspace/63142/0/logs.log
2023-07-08 07:07:02 INFO i.a.w.t.TemporalAttemptExecution(get):141 - Executing worker wrapper. Airbyte version: 0.40.23
2023-07-08 07:07:02 INFO i.a.a.c.AirbyteApiClient(retryWithJitter):179 - Attempt 0 to save workflow id for cancellation
2023-07-08 07:07:02 INFO i.a.c.i.LineGobbler(voidCall):114 -
Raghav Mittal
07/09/2023, 7:55 AM
Akilesh V
07/09/2023, 1:55 PM
Failed to detect if there is a schema change
Nazif Ishrak
07/10/2023, 1:05 AM
Ekansh Verma
07/10/2023, 9:28 AM
Gerrit van Zyl
07/10/2023, 11:27 AM
2023-07-10 11:12:14 destination > INFO i.a.i.b.FailureTrackingAirbyteMessageConsumer(close):80 Airbyte message consumer: succeeded.
2023-07-10 11:12:14 destination > INFO i.a.i.d.b.BufferedStreamConsumer(close):255 executing on success close procedure.
2023-07-10 11:12:14 destination > INFO i.a.i.d.r.InMemoryRecordBufferingStrategy(flushAllBuffers):85 Flushing agreement: 24684 records (26 MB)
2023-07-10 11:12:14 destination > INFO i.a.i.d.s.DefaultStreamLoader(send):109 Stream loading, label : airbyte__airbyte_tmp_hav_agreement_4afc7597-60c1-41dc-957f-6731d5322f6a1688987534832, database : test, table : _airbyte_tmp_hav_agreement, request : PUT http://abd.ser.net:8040/api/test/_airbyte_tmp_hav_agreement/_stream_load HTTP/1.1
Carolina Buckler
07/10/2023, 1:56 PM
Last attempt:
0 Bytes|no records|no records|Job id: 16197|27m 48s
Failure Origin: airbyte_platform, Message: Something went wrong within the airbyte platform
and within the logs
Failure reason: scheduledEventId=58, startedEventId=59, activityType='RunWithJobOutput', activityId='94c73d73-e37d-316f-9a7a-ef6ac57efd11', identity='', retryState=RETRY_STATE_MAXIMUM_ATTEMPTS_REACHED
Steven Wang
07/10/2023, 4:42 PM
ValueError: time data '2023-07-08T13:00:28+00:00' does not match format '%Y-%m-%dT%H:%M:%S.%f%z'
We are using Airbyte Cloud. Anyone else run into this?
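The format string in that error requires fractional seconds ('.%f'), but '2023-07-08T13:00:28+00:00' has none, so the strict strptime call fails. A hedged sketch of a tolerant parse that tries both variants (illustrative only, not the connector's actual code):

```python
from datetime import datetime

# Candidate formats, tried in order: with and without fractional seconds.
FORMATS = (
    "%Y-%m-%dT%H:%M:%S.%f%z",
    "%Y-%m-%dT%H:%M:%S%z",
)

def parse_ts(value: str) -> datetime:
    for fmt in FORMATS:
        try:
            return datetime.strptime(value, fmt)
        except ValueError:
            continue
    raise ValueError(f"unrecognized timestamp: {value!r}")
```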