# ask-ai
  • t

    Thomas

    08/26/2024, 11:25 AM
@kapa.ai after migrating a Docker Compose setup to abctl and setting the email via `abctl local credentials --email <email>`, I am getting an error
    You don't have the right permissions to take this action. (HTTP 403)
    when trying to log in
  • v

    Vikas Tejani

    08/26/2024, 11:46 AM
@kapa.ai I set up a HubSpot to ClickHouse data sync on self-hosted Airbyte. The issue is that the job was successful, but I don't see any data in ClickHouse.
  • s

    Sourav Sikka

    08/26/2024, 11:48 AM
@kapa.ai I am getting the below error. Could you please help me?
2024-08-26 15:34:23 airbyte-db | 2024-08-26 10:04:23.022 UTC [746] FATAL: role "airflow" does not exist
2024-08-26 15:34:28 airflow-webserver-1 | 127.0.0.1 - - [26/Aug/2024:10:04:28 +0000] "GET /health HTTP/1.1" 200 318 "-" "curl/7.88.1"
2024-08-26 15:34:33 airbyte-db | 2024-08-26 10:04:33.078 UTC [760] FATAL: role "airflow" does not exist
2024-08-26 15:34:43 airbyte-db | 2024-08-26 10:04:43.154 UTC [771] FATAL: role "airflow" does not exist
2024-08-26 15:34:53 airbyte-db | 2024-08-26 10:04:53.206 UTC [780] FATAL: role "airflow" does not exist
2024-08-26 15:34:58 airflow-webserver-1 | 127.0.0.1 - - [26/Aug/2024:10:04:58 +0000] "GET /health HTTP/1.1" 200 318 "-" "curl/7.88.1"
2024-08-26 15:35:03 airbyte-db | 2024-08-26 10:05:03.250 UTC [793] FATAL: role "airflow" does not exist
2024-08-26 15:35:07 airflow-triggerer-1 | [2024-08-26T10:05:07.694+0000] {triggerer_job_runner.py:481} INFO - 0 triggers currently running
2024-08-26 15:35:13 airbyte-db | 2024-08-26 10:05:13.320 UTC [806] FATAL: role "airflow" does not exist
2024-08-26 15:35:16 airflow-scheduler-1 | [2024-08-26T10:05:16.338+0000] {scheduler_job_runner.py:1608} INFO - Adopting or resetting orphaned tasks for active dag runs
  • v

    Vikas Tejani

    08/26/2024, 12:17 PM
@kapa.ai I set up a MySQL destination but got this issue: 2024-08-26 12:16:13 platform > Check connection job received output: io.airbyte.config.StandardCheckConnectionOutput@5ec7383[status=failed,message=State code: 08S01; Message: Communications link failure. When I check in the CLI it works fine.
  • m

    mauro

    08/26/2024, 12:28 PM
    I have a local airbyte installed with
    abctl
    . is it possible to stop it (and restart it later) without uninstalling?
  • v

    Vikas Tejani

    08/26/2024, 12:41 PM
@kapa.ai I was trying a HubSpot to MySQL data sync and got this issue in Airbyte: Caused by: java.sql.SQLSyntaxErrorException: Row size too large. The maximum row size for the used table type, not counting BLOBs, is 65535. This includes storage overhead, check the manual. You have to change some columns to TEXT or BLOBs
  • s

    Stefano Messina

    08/26/2024, 1:15 PM
@kapa.ai I have deployed Airbyte to Kubernetes. When multiple syncs are running I start getting 504s in the UI; it seems load balancing is not really happening, even though we have 3 airbyte-server pods.
  • s

    Stefano Messina

    08/26/2024, 1:19 PM
@kapa.ai I'm pulling data from different databases with the Incremental | Append sync mode. It works correctly for some databases, but for others the whole table is loaded; it seems the cursor is not working correctly, and I'm not sure how to debug this.
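For the cursor question above, the incremental logic can be reproduced outside Airbyte to sanity-check it. A minimal sketch of how an incremental cursor is supposed to filter records (plain Python, not Airbyte's actual implementation; the `updated_at`-style field name is illustrative):

```python
def incremental_read(records, cursor_field, state):
    """Emit only records strictly newer than the saved cursor, then advance it."""
    cursor = state.get("cursor")
    max_seen = cursor
    for record in records:
        value = record[cursor_field]
        if cursor is None or value > cursor:
            yield record
            if max_seen is None or value > max_seen:
                max_seen = value
    # Persisted as the connection state after the sync completes.
    state["cursor"] = max_seen

# If the cursor column is not strictly increasing (NULLs, resets, mixed
# timezones), every sync behaves like the "table loaded as a whole" case.
```

Comparing the connection's saved state against the source column's actual maximum value is one way to see whether the cursor is advancing between syncs.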
  • l

    Lucas Rodrigues Dhein

    08/26/2024, 1:52 PM
@kapa.ai tell me more about the Airbyte and Airflow integration. Show me, step by step, the configuration of the dev environment and the prerequisites to start using Airflow.
  • g

    GUANGYU QU

    08/26/2024, 2:03 PM
    @kapa.ai do you know what is airbyte temporal?
  • r

    Rabea Yousef

    08/26/2024, 2:11 PM
@kapa.ai Incremental | Append + Deduped is working correctly with some source connectors like Intercom and Jira, but keeps applying a full load with others like HubSpot. Redshift is the target in both cases. Do you have an idea?
  • g

    GUANGYU QU

    08/26/2024, 2:13 PM
    @kapa.ai is it possible to use private registry for all connectors?
  • a

    Andrew Box

    08/26/2024, 2:55 PM
Using the YAML low-code builder, how do I build a connector that uses an API key and a token secret and requires both to be in every request as query parameters? Such as this example fake API call: https://api.alchemer.com/v4/survey?api_token=E4F796932C2743FEBF150B421BE15EB9&api_token_secret=A9fGMkJ5pJF1k
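One way to frame the question above: the built request just needs both credentials in the query string. A framework-free sketch of that request construction (plain Python; mapping this onto the connector builder would mean an API-key authenticator injected into request parameters plus a second interpolated request parameter, but treat that mapping as an assumption and check the builder docs):

```python
import urllib.parse

def build_url(base: str, api_token: str, api_token_secret: str) -> str:
    # Both credentials travel as query parameters on every request,
    # mirroring the fake Alchemer-style call in the question.
    params = {"api_token": api_token, "api_token_secret": api_token_secret}
    return f"{base}?{urllib.parse.urlencode(params)}"
```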
  • a

    Alex Green

    08/26/2024, 2:56 PM
@kapa.ai I am receiving the 'Internal Server Error: Get Spec job failed.' error when adding a custom Docker image for a new connector in the Airbyte UI. Airbyte is running on a Linux EC2 instance and I'm building the image on an M2 Mac. I have built the image using --platform linux/amd64 to ensure the architecture is aligned, but am still receiving the error. Any ideas? I am able to add the image to my local instance of Airbyte.
  • a

    Ana DuCristea

    08/26/2024, 3:10 PM
Is there a way to save the Google Drive URL of each document when using the Google Drive source? The only fields I see are _ab_source_file_last_modified, _ab_source_file_parse_error, _ab_source_file_url, content, and document_key. _ab_source_file_url is just the title of the doc, not the URL.
  • n

    Naoko Nishimura

    08/26/2024, 4:32 PM
@kapa.ai I'm unable to access the Applications page to get an API key. I'm using Airbyte Cloud.
  • t

    Thomas

    08/26/2024, 6:02 PM
    @kapa.ai I am receiving the following error on the BigQuery destination connector:
    Caused by: io.temporal.failure.ApplicationFailure: message='Server error : 500 Internal Server Error {"message":"Internal Server Error","logref":null,"path":null,"_links":{"self":{"href":"/api/v1/workload/create","templated":false,"profile":null,"deprecation":null,"title":null,"hreflang":null,"type":null,"name":null}},"_embedded":{"errors":[{"message":"Internal Server Error: Error executing PERSIST: ERROR: could not read block 8 in file \"base/16385/17447\": read only 0 of 8192 bytes","logref":null,"path":null,"_links":{},"_embedded":{}}]}}', type='io.airbyte.workload.api.client.generated.infrastructure.ServerException', nonRetryable=false
  • r

    Ritvik Nagpal

    08/26/2024, 6:16 PM
I am not getting action and action_values while fetching data via custom insights in the Airbyte Meta connector.
  • k

    KRISHIV GUBBA

    08/26/2024, 6:33 PM
@kapa.ai how do I build a custom paginator using the low-code CDK?
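Whatever strategy a custom paginator implements, it reduces to the loop below: extract a next-page token from each response and stop when there is none. A framework-free sketch (plain Python; `fetch_page` is a stand-in for the CDK's requester, not a real CDK API — in the low-code CDK the equivalent is a pagination strategy component, with a custom class referenced when none of the built-in ones fit):

```python
def paginate(fetch_page):
    """Yield records from successive pages until no next-page token is returned."""
    token = None
    while True:
        records, token = fetch_page(token)
        yield from records
        if token is None:  # a paginator signals "done" by returning no token
            break

# Example: three fake pages keyed by token.
pages = {None: ([1, 2], "p2"), "p2": ([3], "p3"), "p3": ([4], None)}
all_records = list(paginate(lambda t: pages[t]))
```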
  • b

    Blake Miller

    08/26/2024, 6:33 PM
    I’m setting up a Bing Ads connector on Airbyte OSS, and not sure what to set for the redirect URI on my Azure app. When requesting user consent it says it requires a redirect URI, but there’s nothing in the Airbyte docs on what to set this to.
  • b

    Brian Bolt

    08/26/2024, 6:36 PM
    @kapa.ai I upgraded airbyte to 0.63.17. Some of my connections work but a few are giving this error:
    Internal Server Error: com.fasterxml.jackson.databind.exc.ValueInstantiationException: Cannot construct instance of `io.airbyte.config.FailureReason$FailureOrigin`, problem: normalization\n at [Source: REDACTED (`StreamReadFeature.INCLUDE_SOURCE_IN_LOCATION` disabled); line: 1, column: 2767] (through reference chain: io.airbyte.config.AttemptFailureSummary[\"failures\"]->java.util.ArrayList[0]->io.airbyte.config.FailureReason[\"failureOrigin\"])",
          "exceptionClassName": "java.lang.RuntimeException",
          "exceptionStack": [
            "java.lang.RuntimeException: com.fasterxml.jackson.databind.exc.ValueInstantiationException: Cannot construct instance of `io.airbyte.config.FailureReason$FailureOrigin`, problem: normalization",
            " at [Source: REDACTED (`StreamReadFeature.INCLUDE_SOURCE_IN_LOCATION` disabled); line: 1, column: 2767] (through reference chain: io.airbyte.config.AttemptFailureSummary[\"failures\"]->java.util.ArrayList[0]->io.airbyte.config.FailureReason[\"failureOrigin\"])",
            "\tat io.airbyte.commons.json.Jsons.deserialize(Jsons.java:101)",
            "\tat io.airbyte.persistence.job.DefaultJobPersistence.getAttemptFromRecordLight(DefaultJobPersistence.java:466)",
            "\tat io.airbyte.persistence.job.DefaultJobPersistence.getJobsFromResultLight(DefaultJobPersistence.java:524)",
            "\tat io.airbyte.persistence.job.DefaultJobPersistence.listJobsLight(DefaultJobPersistence.java:1137)",
            "\tat io.airbyte.commons.server.handlers.ConnectionsHandler.getConnectionStatuses(ConnectionsHandler.java:1018)",
            "\tat io.airbyte.server.apis.ConnectionApiController.lambda$getConnectionStatuses$15(ConnectionApiController.java:237)",
            "\tat io.airbyte.server.apis.ApiHelper.execute(ApiHelper.java:28)",
  • n

    Naoko Nishimura

    08/26/2024, 9:03 PM
@kapa.ai Can the GitHub connector update GitHub issue metadata if it is updated on GitHub? Are updates also synced?
  • g

    Glen Aultman-Bettridge

    08/26/2024, 9:26 PM
    @kapa.ai For the snowflake destination, how do I set up the airbyte_internal schema and what grants does my role need to have to manage that schema?
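A hedged sketch for the Snowflake question above: Airbyte creates `airbyte_internal` itself on first sync when the configured role is allowed to create schemas in the target database, so the setup mostly reduces to grants. The statements below are illustrative only (AIRBYTE_DATABASE and AIRBYTE_ROLE are placeholder identifiers; verify the exact grants against the Snowflake destination docs), generated as strings so they can be reviewed before running:

```python
def snowflake_grant_statements(database: str, role: str) -> list[str]:
    # Placeholder identifiers; adapt to your account. Assumption: the role
    # needs usage on the database plus the ability to create (and then own)
    # the airbyte_internal schema.
    return [
        f"GRANT USAGE ON DATABASE {database} TO ROLE {role};",
        f"GRANT CREATE SCHEMA ON DATABASE {database} TO ROLE {role};",
        f"GRANT ALL ON SCHEMA {database}.airbyte_internal TO ROLE {role};",
    ]
```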
  • b

    Blake Miller

    08/26/2024, 9:53 PM
    I am getting an error when connecting my Facebook Marketing source:
    2024-08-26 21:51:20 WARN i.a.m.l.MetricClientFactory(getMetricClient):43 - MetricClient has not been initialized. Must call MetricClientFactory.CreateMetricClient before using MetricClient. Using a dummy client for now. Ignore this if Airbyte is configured to not publish any metrics.
    2024-08-26 21:51:20 ERROR i.a.w.i.VersionedAirbyteStreamFactory(internalLog):304 - [Errno 13] Permission denied: '/config/connectionConfiguration.json'
    Traceback (most recent call last):
      File "/airbyte/integration_code/main.py", line 9, in <module>
        run()
      File "/airbyte/integration_code/source_facebook_marketing/run.py", line 18, in run
        MigrateSecretsPathInConnector.migrate(sys.argv[1:], source)
      File "/airbyte/integration_code/source_facebook_marketing/config_migrations.py", line 160, in migrate
        cls._modify_and_save(config_path, source, config),
      File "/airbyte/integration_code/source_facebook_marketing/config_migrations.py", line 186, in _modify_and_save
        source.write_config(migrated_config, config_path)
      File "/usr/local/lib/python3.10/site-packages/airbyte_cdk/connector.py", line 60, in write_config
        with open(config_path, "w") as fh:
    PermissionError: [Errno 13] Permission denied: '/config/connectionConfiguration.json'
    2024-08-26 21:51:20 INFO i.a.c.ConnectorMessageProcessor(updateConfigFromControlMessage):231 - Checking for optional control message...
    2024-08-26 21:51:20 INFO i.a.c.ConnectorMessageProcessor(updateConfigFromControlMessage):253 - Optional control message not found. Skipping...
    2024-08-26 21:51:20 INFO i.a.c.ConnectorWatcher(run):134 - Writing output of e7778cfc-e97c-4458-9ecb-b4f2bba8946c_8e83c8fa-797c-4046-bd06-48517207460a_0_check to the doc store
    2024-08-26 21:51:21 INFO i.a.c.ConnectorWatcher(run):136 - Marking workload as successful
    2024-08-26 21:51:21 INFO i.a.c.ConnectorWatcher(exitProperly):189 - Deliberately exiting process with code 0.
    2024-08-26 21:51:21 INFO i.a.c.i.LineGobbler(voidCall):166 - 
    2024-08-26 21:51:21 INFO i.a.c.i.LineGobbler(voidCall):166 - ----- END CHECK -----
    2024-08-26 21:51:21 INFO i.a.c.i.LineGobbler(voidCall):166 -
  • b

    Brian Bolt

    08/27/2024, 12:19 AM
    I’m trying to run the v2 postgres connector and running into an issue where this message is repeated indefinitely:
    2024-08-27 00:11:08 destination > INFO pool-6-thread-1 i.a.c.i.d.a.FlushWorkers(printWorkerInfo):127 [ASYNC WORKER INFO] Pool queue size: 0, Active threads: 0
    2024-08-27 00:12:08 destination > INFO pool-3-thread-1 i.a.c.i.d.a.b.BufferManager(printQueueInfo):94 [ASYNC QUEUE INFO] Global: max: 296.96 MB, allocated: 10 MB (10.0 MB), %% used: 0.03367428551701215 | State Manager memory usage: Allocated: 10 MB, Used: 0 bytes, percentage Used 0.0
    2024-08-27 00:12:08 destination > INFO pool-6-thread-1 i.a.c.i.d.a.FlushWorkers(printWorkerInfo):127 [ASYNC WORKER INFO] Pool queue size: 0, Active threads: 0
    2024-08-27 00:13:08 destination > INFO pool-3-thread-1 i.a.c.i.d.a.b.BufferManager(printQueueInfo):94 [ASYNC QUEUE INFO] Global: max: 296.96 MB, allocated: 10 MB (10.0 MB), %% used: 0.03367428551701215 | State Manager memory usage: Allocated: 10 MB, Used: 0 bytes, percentage Used 0.0
    2024-08-27 00:13:08 destination > INFO pool-6-thread-1 i.a.c.i.d.a.FlushWorkers(printWorkerInfo):127 [ASYNC WORKER INFO] Pool queue size: 0, Active threads: 0
  • s

    Sam

    08/27/2024, 2:21 AM
why is my sync stuck in "Queued… Starting"?
  • j

    Julie Choong

    08/27/2024, 3:01 AM
How do I add SSL mode "preferred" to the values.yaml file?
  • s

    Stockton Fisher

    08/27/2024, 3:23 AM
    @kapa.ai Can I create a new source in my workspace using the API when the source is from the community marketplace?
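For the API question above, a sketch of the request body only, as a pure function so it can be inspected before sending. Field names such as `definitionId` (the way marketplace/custom connectors are addressed) are assumptions about the Airbyte public API and should be verified against the API reference:

```python
import json

def create_source_payload(name, workspace_id, definition_id, configuration):
    # Assumption: marketplace connectors are selected via their definition id,
    # and `configuration` carries the connector-specific settings.
    return json.dumps({
        "name": name,
        "workspaceId": workspace_id,
        "definitionId": definition_id,
        "configuration": configuration,
    })
```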
  • s

    Sar Joshi

    08/27/2024, 4:22 AM
@kapa.ai one of my connections has been stuck for the last few days. It was caused by a DB failover, and Airbyte could not auto-retry the sync, so it's stuck now. What are my options to force-cancel the sync? Cancelling via the UI is not working. I'm using Airbyte v0.50.35.
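One escape hatch for a stuck sync like the one above is the jobs API rather than the UI. The endpoint sketched here (`POST /api/v1/jobs/cancel` with body `{"id": <jobId>}`) is the internal config API shape of that era; treat it as an assumption and confirm against your deployment. The sketch only builds the request, without sending it:

```python
import json

def build_cancel_request(base_url: str, job_id: int):
    """Return (url, headers, body) for cancelling an Airbyte job by id."""
    # Endpoint and body shape assumed from the config API; verify first.
    url = f"{base_url.rstrip('/')}/api/v1/jobs/cancel"
    headers = {"Content-Type": "application/json"}
    body = json.dumps({"id": job_id})
    return url, headers, body

# Send with any HTTP client, e.g.:
#   urllib.request.urlopen(urllib.request.Request(url, body.encode(), headers))
```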
  • b

    Brian Kasen

    08/27/2024, 4:49 AM
@kapa.ai - help me interpret this error message, happening for a specific S3 bucket path source with a Snowflake target connector. Other paths from this source work fine:
    "internalMessage" : "Destination process is still alive, cannot retrieve exit value.",
"externalMessage" : "Something went wrong during replication", "metadata" : { "attemptNumber" : 5, "jobId" : 12596 }, "stacktrace" : "java.lang.IllegalStateException: Destination process is still alive, cannot retrieve exit value.\n\tat com.google.common.base.Preconditions.checkState(Preconditions.java:512)\n\tat io.airbyte.workers.internal.DefaultAirbyteDestination.getExitValue(DefaultAirbyteDestination.java:217)\n\tat io.airbyte.workers.general.BufferedReplicationWorker.readFromDestination(BufferedReplicationWorker.java:471)\n\tat io.airbyte.workers.general.BufferedReplicationWorker.lambda$runAsync$2(BufferedReplicationWorker.java:227)\n\tat java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)\n\tat java.base/java.lang.Thread.run(Thread.java:1583)\n", "timestamp" : 1724407512600 },
{ "failureOrigin" : "destination", "internalMessage" : "Destination process message delivery failed", "externalMessage" : "Something went wrong within the destination connector", "metadata" : { "attemptNumber" : 5, "jobId" : 12596, "connector_command" : "write" }, "stacktrace" : "io.airbyte.workers.internal.exception.DestinationException: Destination process message delivery failed\n\tat io.airbyte.workers.general.BufferedReplicationWorker.writeToDestination(BufferedReplicationWorker.java:444)\n\tat io.airbyte.workers.general.BufferedReplicationWorker.lambda$runAsyncWithTimeout$5(BufferedReplicationWorker.java:255)\n\tat java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)\n\tat java.base/java.lang.Thread.run(Thread.java:1583)\nCaused by: java.io.IOException: Broken pipe\n\tat java.base/java.io.FileOutputStream.writeBytes(Native Method)\n\tat java.base/java.io.FileOutputStream.write(FileOutputStream.java:367)\n\tat java.base/java.io.BufferedOutputStream.implWrite(BufferedOutputStream.java:217)\n\tat java.base/java.io.BufferedOutputStream.write(BufferedOutputStream.java:206)\n\tat java.base/sun.nio.cs.StreamEncoder.writeBytes(StreamEncoder.java:309)\n\tat java.base/sun.nio.cs.StreamEncoder.implFlushBuffer(StreamEncoder.java:405)\n\tat java.base/sun.nio.cs.StreamEncoder.implFlush(StreamEncoder.java:410)\n\tat java.base/sun.nio.cs.StreamEncoder.lockedFlush(StreamEncoder.java:214)\n\tat java.base/sun.nio.cs.StreamEncoder.flush(StreamEncoder.java:201)\n\tat java.base/java.io.OutputStreamWriter.flush(OutputStreamWriter.java:262)\n\tat java.base/java.io.BufferedWriter.implFlush(BufferedWriter.java:372)\n\tat java.base/java.io.BufferedWriter.flush(BufferedWriter.java:359)\n\tat io.airbyte.workers.internal.DefaultAirbyteMessageBufferedWriter.flush(DefaultAirbyteMessageBufferedWriter.java:31)\n\tat io.airbyte.workers.internal.DefaultAirbyteDestination.notifyEndOfInputWithNoTimeoutMonitor(DefaultAirbyteDestination.java:166)\n\tat io.airbyte.workers.internal.DefaultAirbyteDestination.notifyEndOfInput(DefaultAirbyteDestination.java:156)\n\tat io.airbyte.workers.general.BufferedReplicationWorker.writeToDestination(BufferedReplicationWorker.java:438)\n\t... 5 more\n", "timestamp" : 1724407512600 },
{ "failureOrigin" : "source", "internalMessage" : "Source process read attempt failed", "externalMessage" : "Something went wrong within the source connector", "metadata" : { "attemptNumber" : 5, "jobId" : 12596, "connector_command" : "read" }, "stacktrace" : "io.airbyte.workers.internal.exception.SourceException: Source process read attempt failed\n\tat io.airbyte.workers.general.BufferedReplicationWorker.readFromSource(BufferedReplicationWorker.java:373)\n\tat io.airbyte.workers.general.BufferedReplicationWorker.lambda$runAsyncWithHeartbeatCheck$3(BufferedReplicationWorker.java:234)\n\tat java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)\n\tat java.base/java.lang.Thread.run(Thread.java:1583)\nCaused by: java.lang.IllegalStateException: Source process is still alive, cannot retrieve exit value.\n\tat com.google.common.base.Preconditions.checkState(Preconditions.java:512)\n\tat io.airbyte.workers.internal.DefaultAirbyteSource.getExitValue(DefaultAirbyteSource.java:140)\n\tat io.airbyte.workers.general.BufferedReplicationWorker.readFromSource(BufferedReplicationWorker.java:359)\n\t... 5 more\n", "timestamp" : 1724407512611 },
{ "failureOrigin" : "replication", "internalMessage" : "io.airbyte.workers.exception.WorkerException: Destination process exit with code 3. This warning is normal if the job was cancelled.", "externalMessage" : "Something went wrong during replication", "metadata" : { "attemptNumber" : 5, "jobId" : 12596 }, "stacktrace" : "java.lang.RuntimeException: io.airbyte.workers.exception.WorkerException: Destination process exit with code 3. This warning is normal if the job was cancelled.\n\tat io.airbyte.workers.general.BufferedReplicationWorker$CloseableWithTimeout.lambda$close$0(BufferedReplicationWorker.java:517)\n\tat io.airbyte.workers.general.BufferedReplicationWorker.lambda$runAsyncWithTimeout$5(BufferedReplicationWorker.java:255)\n\tat java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)\n\tat java.base/java.lang.Thread.run(Thread.java:1583)\nCaused by: io.airbyte.workers.exception.WorkerException: Destination process exit with code 3. This warning is normal if the job was cancelled.\n\tat io.airbyte.workers.internal.DefaultAirbyteDestination.close(DefaultAirbyteDestination.java:187)\n\tat io.airbyte.workers.general.BufferedReplicationWorker$CloseableWithTimeout.lambda$close$0(BufferedReplicationWorker.java:515)\n\t... 5 more\n", "timestamp" : 1724407512611 } ]