# ask-ai

    Kamal Tuteja

    08/27/2024, 8:39 PM
Why does custom GAQL not support incremental updates?

    Aldo Orozco

    08/27/2024, 9:44 PM
@kapa.ai I'm facing an error syncing data from Snowflake to BigQuery. I'm using the OSS Helm chart to deploy Airbyte. When I manually create the Snowflake source (from the UI), it passes the setup check, but when I try a full sync to BigQuery, the job hangs and the logs of the source-snowflake-read pod only show this:
    Waiting on CHILD_PID 7
    PARENT_PID: 1
    the orchestrator pod shows a ton of these
    2024-08-27 16:10:33 destination > INFO pool-3-thread-1 i.a.c.i.d.a.b.BufferManager(printQueueInfo):94 [ASYNC QUEUE INFO] Global: max: 768 MB, allocated: 10 MB (10.0 MB), %% used: 0.013020833333333334 | State Manager memory usage: Allocated: 10 MB, Used: 0 bytes, percentage Used 0.0
    2024-08-27 16:10:33 destination > INFO pool-6-thread-1 i.a.c.i.d.a.FlushWorkers(printWorkerInfo):127 [ASYNC WORKER INFO] Pool queue size: 0, Active threads: 0
    and the destination-bigquery-write shows something identical to the source
    Waiting on CHILD_PID 7
    PARENT_PID: 1
    Any ideas?

    Alkis

    08/27/2024, 10:03 PM
Hi everyone, I’m running into an issue with Airbyte where the data syncs from BigQuery to SQL Server, but the data seems to be stuck in the airbyte_internal schema. It doesn’t seem to be unpacking into the expected tables in the actual schema. I’m using the default setup without any custom transformations or dbt, so I’m not sure what might be causing this. Has anyone else experienced this or know how to ensure the data is correctly unpacked into the final destination schema? Thanks for any help you can provide!
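One thing worth checking is whether raw records ever arrive in airbyte_internal and whether they are then typed into the final tables. A hedged SQL Server sketch, assuming Destinations V2 raw-table naming (the schema and table names below are illustrative):

    -- Raw records land in airbyte_internal.<final_schema>_raw__stream_<stream_name>.
    -- If _airbyte_loaded_at stays NULL, typing/deduping into the final table never ran.
    SELECT TOP 10
        _airbyte_raw_id,
        _airbyte_extracted_at,
        _airbyte_loaded_at,
        _airbyte_data          -- the record payload as JSON text
    FROM airbyte_internal.dbo_raw__stream_my_table;  -- illustrative name

If the MSSQL destination version in use predates Typing and Deduping support, only the raw tables are written, and upgrading the destination connector may be the fix.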

    Jorge Estrada

    08/27/2024, 11:13 PM
    "I installed Airbyte on my Ubuntu server in a privately managed VPS. However, in the cloud version, I see a section for integrations and a transformation section in the sources. What do I need to do on my VPS to have integrations and transformations?" Let me know if you need any further adjustments!

    Jorge Estrada

    08/28/2024, 12:29 AM
    @kapa.ai I’m having issues with synchronization in Airbyte using the full refresh mode. Currently, I’m running this synchronization on a VPS with seemingly sufficient resources (CPU, memory, and disk performance). However, the process seems to be stuck, as the Airbyte logs repeatedly show the message "Waiting for all streams to flush" with a constant number of remaining records, suggesting that it is not making progress. I’ve monitored the VPS and database performance, and both appear to be operating within normal parameters, with no obvious bottlenecks in CPU, memory, or IOPS. Additionally, I noticed that several Node.js processes are running with low CPU usage but consuming a significant amount of memory. I’m unsure if these processes are contributing to the issue or if there’s another factor that I might have overlooked. I’m using a transactional PostgreSQL database as the source, and the destination database to which I applied the "Refresh your data" (the option that replaces all data in the destination by reloading all data from the source) is another PostgreSQL database. Despite this, the system still shows no progress, and I'm observing the following connection states: (Here, the connection state details are included) I would like to understand what might be causing this issue and how I can resolve it.

    Thành Đặng Minh

    08/28/2024, 1:25 AM
@kapa.ai what is the full progress of a connection's sync in Airbyte?

    Gideon

    08/28/2024, 3:37 AM
@kapa.ai What do these logs mean? They come from an Airbyte job:
    Replication output for workload 2e0139eb-a4e4-48c1-a428-83ffe3e2cc8a_6895_0_sync : io.airbyte.config.ReplicationOutput@581121ef[replicationAttemptSummary=io.airbyte.config.ReplicationAttemptSummary@2fd12051[status=failed,recordsSynced=<null>,bytesSynced=<null>,startTime=1724815989300,endTime=1724816029093,totalStats=io.airbyte.config.SyncStats@34a74f0d[bytesCommitted=<null>,bytesEmitted=0,destinationStateMessagesEmitted=0,destinationWriteEndTime=0,destinationWriteStartTime=1724815989409,estimatedBytes=<null>,estimatedRecords=<null>,meanSecondsBeforeSourceStateMessageEmitted=0,maxSecondsBeforeSourceStateMessageEmitted=0,maxSecondsBetweenStateMessageEmittedandCommitted=<null>,meanSecondsBetweenStateMessageEmittedandCommitted=0,recordsEmitted=0,recordsCommitted=<null>,replicationEndTime=1724816029015,replicationStartTime=1724815989300,sourceReadEndTime=0,sourceReadStartTime=1724815989415,sourceStateMessagesEmitted=0,discoverSchemaEndTime=<null>,discoverSchemaStartTime=<null>,additionalProperties={}],streamStats=[],performanceMetrics=io.airbyte.config.PerformanceMetrics@692274e0[additionalProperties={processFromSource={elapsedTimeInNanos=0, executionCount=0, avgExecTimeInNanos=NaN}, readFromSource={elapsedTimeInNanos=0, executionCount=0, avgExecTimeInNanos=NaN}, processFromDest={elapsedTimeInNanos=0, executionCount=0, avgExecTimeInNanos=NaN}, writeToDest={elapsedTimeInNanos=0, executionCount=0, avgExecTimeInNanos=NaN}, readFromDest={elapsedTimeInNanos=0, executionCount=0, avgExecTimeInNanos=NaN}}],additionalProperties={}],state=<null>,outputCatalog=ConfiguredAirbyteCatalog(streams=[ConfiguredAirbyteStream(stream=AirbyteStream(name=amazon, jsonSchema={"type":"object","$schema":"<http://json-schema.org/draft-07/schema#>","properties":{"asin":{"type":"string"},"_price_":{"type":"string"},"category":{"type":"string"},"retailer":{"type":"string"},"subcategory":{"type":"string"},"time_period":{"type":"string"},"_unit_volume_":{"type":"string"},"product_title":{"type":"string"},"_retail_sales_":{"type":"string"},"normalized_brand_name":{"type":"string"},"brand_listed_on_amazon_com":{"type":"string"}}}, supportedSyncModes=[full_refresh], sourceDefinedCursor=false, defaultCursorField=[], sourceDefinedPrimaryKey=[], namespace=stackline_data, isResumable=null), syncMode=full_refresh, destinationSyncMode=overwrite, cursorField=[], primaryKey=[], generationId=94, minimumGenerationId=94, syncId=6895)]),failures=[io.airbyte.config.FailureReason@af1f8c9[failureOrigin=replication,failureType=<null>,internalMessage=Unable to start the destination,externalMessage=Something went wrong during replication,metadata=io.airbyte.config.Metadata@1bfc77d6[additionalProperties={attemptNumber=0, jobId=6895}],stacktrace=io.airbyte.workers.exception.WorkerException: Unable to start the destination
            at io.airbyte.workers.general.ReplicationWorkerHelper.startDestination(ReplicationWorkerHelper.kt:247)
            at io.airbyte.workers.general.BufferedReplicationWorker.lambda$run$0(BufferedReplicationWorker.java:170)
            at io.airbyte.workers.general.BufferedReplicationWorker.lambda$runAsync$2(BufferedReplicationWorker.java:235)
            at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)
            at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
            at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
            at java.base/java.lang.Thread.run(Thread.java:1583)
    Caused by: io.airbyte.workers.exception.WorkerException: Failed to create pod for write step
            at io.airbyte.workers.process.KubeProcessFactory.create(KubeProcessFactory.java:195)
            at io.airbyte.workers.process.AirbyteIntegrationLauncher.write(AirbyteIntegrationLauncher.java:254)
            at io.airbyte.workers.internal.DefaultAirbyteDestination.start(DefaultAirbyteDestination.java:115)
            at io.airbyte.workers.general.ReplicationWorkerHelper.startDestination(ReplicationWorkerHelper.kt:245)
            ... 6 more
    Caused by: java.lang.RuntimeException: java.io.IOException: kubectl cp failed with exit code 1
            at io.airbyte.workers.process.KubePodProcess.copyFilesToKubeConfigVolume(KubePodProcess.java:364)
            at io.airbyte.workers.process.KubePodProcess.<init>(KubePodProcess.java:662)
            at io.airbyte.workers.process.KubeProcessFactory.create(KubeProcessFactory.java:191)
            ... 9 more
    Caused by: java.io.IOException: kubectl cp failed with exit code 1
            at io.airbyte.workers.process.KubePodProcess.copyFilesToKubeConfigVolume(KubePodProcess.java:358)
            ... 11 more

    Abhra Gupta

    08/28/2024, 5:48 AM
    "metadata" : {
        "attemptNumber" : 0,
        "jobId" : 13279,
        "connector_command" : "read"
      },
      "stacktrace" : "io.airbyte.workers.internal.exception.SourceException: Source process read attempt failed\n\tat io.airbyte.workers.general.BufferedReplicationWorker.readFromSource(BufferedReplicationWorker.java:373)\n\tat io.airbyte.workers.general.BufferedReplicationWorker.lambda$runAsyncWithHeartbeatCheck$3(BufferedReplicationWorker.java:234)\n\tat java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)\n\tat java.base/java.lang.Thread.run(Thread.java:1583)\nCaused by: java.lang.IllegalStateException: Source process is still alive, cannot retrieve exit value.\n\tat com.google.common.base.Preconditions.checkState(Preconditions.java:512)\n\tat io.airbyte.workers.internal.DefaultAirbyteSource.getExitValue(DefaultAirbyteSource.java:140)\n\tat io.airbyte.workers.general.BufferedReplicationWorker.readFromSource(BufferedReplicationWorker.java:359)\n\t... 5 more\n",
      "timestamp" : 1724823738705
    }, {
      "failureOrigin" : "replication",
      "internalMessage" : "io.airbyte.workers.exception.WorkerException: Destination process exit with code 3. This warning is normal if the job was cancelled.",
      "externalMessage" : "Something went wrong during replication",
      "metadata" : {
        "attemptNumber" : 0,
        "jobId" : 13279
      },
      "stacktrace" : "java.lang.RuntimeException: io.airbyte.workers.exception.WorkerException: Destination process exit with code 3. This warning is normal if the job was cancelled.\n\tat io.airbyte.workers.general.BufferedReplicationWorker$CloseableWithTimeout.lambda$close$0(BufferedReplicationWorker.java:517)\n\tat io.airbyte.workers.general.BufferedReplicationWorker.lambda$runAsyncWithTimeout$5(BufferedReplicationWorker.java:255)\n\tat java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)\n\tat java.base/java.lang.Thread.run(Thread.java:1583)\nCaused by: io.airbyte.workers.exception.WorkerException: Destination process exit with code 3. This warning is normal if the job was cancelled.\n\tat io.airbyte.workers.internal.DefaultAirbyteDestination.close(DefaultAirbyteDestination.java:187)\n\tat io.airbyte.workers.general.BufferedReplicationWorker$CloseableWithTimeout.lambda$close$0(BufferedReplicationWorker.java:515)\n\t... 5 more\n",
      "timestamp" : 1724823738706
    } ]

    DR

    08/28/2024, 6:14 AM
@kapa.ai The Airbyte pipeline failed with the following issue. During the first sync, the GCS-to-Snowflake connector successfully processed the UTF-16LE file, but it failed during the latest sync:
    2024-08-27 18:01:17 INFO i.a.c.ConnectorWatcher(run):134 - Writing output of 424892c4-daac-4491-b35d-c6688ba547ba_16953484_0_check to the doc store
    2024-08-27 18:01:18 INFO i.a.c.ConnectorWatcher(run):136 - Marking workload as successful
    2024-08-27 18:01:18 INFO i.a.c.ConnectorWatcher(exitProperly):189 - Deliberately exiting process with code 0.
    2024-08-27 18:01:18 INFO i.a.c.i.LineGobbler(voidCall):166 - 
    2024-08-27 18:01:18 INFO i.a.c.i.LineGobbler(voidCall):166 - ----- END CHECK -----
    2024-08-27 18:01:18 INFO i.a.c.i.LineGobbler(voidCall):166 - 
    2024-08-27 18:01:20 INFO i.a.w.l.p.s.m.Stage(apply):39 - APPLY Stage: CLAIM — (workloadId = 50696eb4-ac93-4997-b582-029bc704b76c_1724781600000_discover) — (dataplaneId = prod-dataplane-gcp-us-west3-0)
    2024-08-27 18:01:32 INFO i.a.c.i.LineGobbler(voidCall):166 - 
    2024-08-27 18:00:46 INFO i.a.w.l.c.WorkloadApiClient(claim):75 - Claimed: true for 2a8c41ae-8c23-4be0-a73f-2ab10ca1a820_16953484_0_check via API for gcp-us-expanded-queue-1
    2024-08-27 18:00:46 INFO i.a.w.l.p.s.m.Stage(apply):39 - APPLY Stage: CHECK_STATUS — (workloadId = 2a8c41ae-8c23-4be0-a73f-2ab10ca1a820_16953484_0_check) — (dataplaneId = gcp-us-expanded-queue-1)
    2024-08-27 18:00:46 INFO i.a.w.l.p.s.CheckStatusStage(applyStage):59 - No pod found running for workload 2a8c41ae-8c23-4be0-a73f-2ab10ca1a820_16953484_0_check
    2024-08-27 18:00:46 INFO i.a.w.l.p.s.m.Stage(apply):39 - APPLY Stage: BUILD — (workloadId = 2a8c41ae-8c23-4be0-a73f-2ab10ca1a820_16953484_0_check) — (dataplaneId = gcp-us-expanded-queue-1)
    2024-08-27 18:00:46 INFO i.a.w.l.p.s.m.Stage(apply):39 - APPLY Stage: MUTEX — (workloadId = 2a8c41ae-8c23-4be0-a73f-2ab10ca1a820_16953484_0_check) — (dataplaneId = gcp-us-expanded-queue-1)
    2024-08-27 18:00:46 INFO i.a.w.l.p.s.EnforceMutexStage(applyStage):50 - No mutex key specified for workload: 2a8c41ae-8c23-4be0-a73f-2ab10ca1a820_16953484_0_check. Continuing...
    2024-08-27 18:00:46 INFO i.a.w.l.p.s.m.Stage(apply):39 - APPLY Stage: LAUNCH — (workloadId = 2a8c41ae-8c23-4be0-a73f-2ab10ca1a820_16953484_0_check) — (dataplaneId = gcp-us-expanded-queue-1)
    2024-08-27 18:00:51 INFO i.a.w.l.c.WorkloadApiClient(updateStatusToLaunched):60 - Attempting to update workload: 2a8c41ae-8c23-4be0-a73f-2ab10ca1a820_16953484_0_check to LAUNCHED.
    2024-08-27 18:00:51 INFO i.a.w.l.p.h.SuccessHandler(accept):60 - Pipeline completed for workload: 2a8c41ae-8c23-4be0-a73f-2ab10ca1a820_16953484_0_check.
    2024-08-27 18:01:03 INFO i.a.w.l.p.s.m.Stage(apply):39 - APPLY Stage: CLAIM — (workloadId = 424892c4-daac-4491-b35d-c6688ba547ba_16953484_0_check) — (dataplaneId = gcp-us-expanded-queue-1)
    2024-08-27 18:01:03 INFO i.a.w.l.c.WorkloadApiClient(claim):75 - Claimed: true for 424892c4-daac-4491-b35d-c6688ba547ba_16953484_0_check via API for gcp-us-expanded-queue-1
    2024-08-27 18:01:03 INFO i.a.w.l.p.s.m.Stage(apply):39 - APPLY Stage: CHECK_STATUS — (workloadId = 424892c4-daac-4491-b35d-c6688ba547ba_16953484_0_check) — (dataplaneId = gcp-us-expanded-queue-1)
    2024-08-27 18:01:03 INFO i.a.w.l.p.s.CheckStatusStage(applyStage):59 - No pod found running for workload 424892c4-daac-4491-b35d-c6688ba547ba_16953484_0_check
        raise UnicodeError("UTF-16 stream does not start with BOM")
    UnicodeError: UTF-16 stream does not start with BOM
    The above exception was the direct cause of the following exception:
    Traceback (most recent call last):
      File "/usr/local/lib/python3.10/site-packages/airbyte_cdk/sources/file_based/stream/default_file_based_stream.py", line 281, in _infer_schema
        base_schema = merge_schemas(base_schema, task.result())
      File "/usr/local/lib/python3.10/site-packages/airbyte_cdk/sources/file_based/stream/default_file_based_stream.py", line 291, in _infer_file_schema
        raise SchemaInferenceError(
    airbyte_cdk.sources.file_based.exceptions.SchemaInferenceError: Error inferring schema from files. Are the files valid? Contact Support if you need assistance.
file=https://storage.googleapis.com/gp-install-stats/2024/installs_com.ornateapps.fitnessworkoutchallengefree_202408_overview.csv format=filetype='csv' delimiter=',' quote_char='"' escape_char=None encoding='UTF-16' double_quote=True null_values=set() strings_can_be_null=True skip_rows_before_header=0 skip_rows_after_header=0 header_definition=CsvHeaderFromCsv(header_definition_type='From CSV') true_values={'on', 'y', '1', 'yes', 't', 'true'} false_values={'n', 'no', '0', 'off', 'false', 'f'} inference_type=<InferenceType.NONE: 'None'> ignore_errors_on_fields_mismatch=False stream=install_report_latest
    Traceback (most recent call last):
      File "/usr/local/lib/python3.10/site-packages/airbyte_cdk/sources/file_based/stream/default_file_based_stream.py", line 289, in _infer_file_schema
        return await self.get_parser().infer_schema(self.config, file, self.stream_reader, self.logger)
      File "/usr/local/lib/python3.10/site-packages/airbyte_cdk/sources/file_based/file_types/csv_parser.py", line 168, in infer_schema
        for row in data_generator:
      File "/usr/local/lib/python3.10/site-packages/airbyte_cdk/sources/file_based/file_types/csv_parser.py", line 55, in read_data
        headers = self._get_headers(fp, config_format, dialect_name)
      File "/usr/local/lib/python3.10/site-packages/airbyte_cdk/sources/file_based/file_types/csv_parser.py", line 108, in _get_headers
        headers = list(next(reader))
      File "/usr/local/lib/python3.10/codecs.py", line 322, in decode
        (result, consumed) = self._buffer_decode(data, self.errors, final)
      File "/usr/local/lib/python3.10/encodings/utf_16.py", line 67, in _buffer_decode
        raise UnicodeError("UTF-16 stream does not start with BOM")
    UnicodeError: UTF-16 stream does not start with BOM
    The above exception was the direct cause of the following exception:

    Jordi Beunk

    08/28/2024, 6:42 AM
@kapa.ai Since I upgraded Airbyte (0.63.11) and my Snowflake destinations (3.11.4), several connections are failing after several attempts. It seems to occur more frequently with connections that contain more tables (around 20 tables, which is not even close to the limit of 200). The logs show the following: Source process read attempt failed
2024-08-28 06:12:41 platform > readFromSource: exception caught
io.airbyte.workers.exception.WorkerException: A stream status (public.contact_inboxes) has been detected for a stream not present in the catalog
        at io.airbyte.workers.helper.StreamStatusCompletionTracker.track(StreamStatusCompletionTracker.kt:36) ~[io.airbyte-airbyte-commons-worker-0.63.11.jar:?]
        at io.airbyte.workers.general.BufferedReplicationWorker.readFromSource(BufferedReplicationWorker.java:361) ~[io.airbyte-airbyte-commons-worker-0.63.11.jar:?]
        at io.airbyte.workers.general.BufferedReplicationWorker.lambda$runAsyncWithHeartbeatCheck$3(BufferedReplicationWorker.java:242) ~[io.airbyte-airbyte-commons-worker-0.63.11.jar:?]
        at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
        at java.base/java.lang.Thread.run(Thread.java:1583) [?:?]
2024-08-28 06:12:41 platform > readFromSource: done. (source.isFinished:false, fromSource.isClosed:false)
2024-08-28 06:12:41 platform > processMessage: done. (fromSource.isDone:true, forDest.isClosed:false)
2024-08-28 06:12:41 platform > writeToDestination: exception caught
java.lang.IllegalStateException: Source process is still alive, cannot retrieve exit value.
        at com.google.common.base.Preconditions.checkState(Preconditions.java:515) ~[guava-33.2.0-jre.jar:?]
        at io.airbyte.workers.internal.DefaultAirbyteSource.getExitValue(DefaultAirbyteSource.java:136) ~[io.airbyte-airbyte-commons-worker-0.63.11.jar:?]
        at io.airbyte.workers.general.BufferedReplicationWorker.writeToDestination(BufferedReplicationWorker.java:454) ~[io.airbyte-airbyte-commons-worker-0.63.11.jar:?]
        at io.airbyte.workers.general.BufferedReplicationWorker.lambda$runAsyncWithTimeout$5(BufferedReplicationWorker.java:263) ~[io.airbyte-airbyte-commons-worker-0.63.11.jar:?]
        at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
        at java.base/java.lang.Thread.run(Thread.java:1583) [?:?]
2024-08-28 06:12:41 platform > writeToDestination: done. (forDest.isDone:true, isDestRunning:true)
2024-08-28 06:12:42 platform > readFromDestination: exception caught
java.lang.IllegalStateException: Destination process is still alive, cannot retrieve exit value.
        at com.google.common.base.Preconditions.checkState(Preconditions.java:515) ~[guava-33.2.0-jre.jar:?]
        at io.airbyte.workers.internal.DefaultAirbyteDestination.getExitValue(DefaultAirbyteDestination.java:210) ~[io.airbyte-airbyte-commons-worker-0.63.11.jar:?]
        at io.airbyte.workers.general.BufferedReplicationWorker.readFromDestination(BufferedReplicationWorker.java:492) ~[io.airbyte-airbyte-commons-worker-0.63.11.jar:?]
        at io.airbyte.workers.general.BufferedReplicationWorker.lambda$runAsync$2(BufferedReplicationWorker.java:235) ~[io.airbyte-airbyte-commons-worker-0.63.11.jar:?]
        at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
        at java.base/java.lang.Thread.run(Thread.java:1583) [?:?]
2024-08-28 06:12:42 platform > readFromDestination: done. (writeToDestFailed:true, dest.isFinished:false)
2024-08-28 06:13:42 platform > airbyte-destination gobbler IOException: Stream closed. Typically happens when cancelling a job.
2024-08-28 06:14:42 platform > airbyte-source gobbler IOException: Stream closed. Typically happens when cancelling a job.
2024-08-28 06:14:42 platform > failures: [ {
  "failureOrigin" : "source",
  "internalMessage" : "Source process read attempt failed",
  "externalMessage" : "Something went wrong within the source connector",
  "metadata" : {
    "attemptNumber" : 4,
    "jobId" : 57034,
    "connector_command" : "read"
  },
  "stacktrace" : "io.airbyte.workers.internal.exception.SourceException: Source process read attempt failed\n\tat io.airbyte.workers.general.BufferedReplicationWorker.readFromSource(BufferedReplicationWorker.java:389)\n\tat io.airbyte.workers.general.BufferedReplicationWorker.lambda$runAsyncWithHeartbeatCheck$3(BufferedReplicationWorker.java:242)\n\tat java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)\n\tat java.base/java.lang.Thread.run(Thread.java:1583)\nCaused by: io.airbyte.workers.exception.WorkerException: A stream status (public.contact_inboxes) has been detected for a stream not present in the catalog\n\tat io.airbyte.workers.helper.StreamStatusCompletionTracker.track(StreamStatusCompletionTracker.kt:36)\n\tat io.airbyte.workers.general.BufferedReplicationWorker.readFromSource(BufferedReplicationWorker.java:361)\n\t... 5 more\n",
  "timestamp" : 1724825561606
}, {
  "failureOrigin" : "destination",
  "internalMessage" : "Destination process message delivery failed",
  "externalMessage" : "Something went wrong within the destination connector",
  "metadata" : {
    "attemptNumber" : 4,
    "jobId" : 57034,
    "connector_command" : "write"
  },
  "stacktrace" : "io.airbyte.workers.internal.exception.DestinationException: Destination process message delivery failed\n\tat io.airbyte.workers.general.BufferedReplicationWorker.writeToDestination(BufferedReplicationWorker.java:465)\n\tat io.airbyte.workers.general.BufferedReplicationWorker.lambda$runAsyncWithTimeout$5(BufferedReplicationWorker.java:263)\n\tat java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)\n\tat java.base/java.lang.Thread.run(Thread.java:1583)\nCaused by: java.lang.IllegalStateException: Source process is still alive, cannot retrieve exit value.\n\tat com.google.common.base.Preconditions.checkState(Preconditions.java:515)\n\tat io.airbyte.workers.internal.DefaultAirbyteSource.getExitValue(DefaultAirbyteSource.java:136)\n\tat io.airbyte.workers.general.BufferedReplicationWorker.writeToDestination(BufferedReplicationWorker.java:454)\n\t... 5 more\n",
  "timestamp" : 1724825561609
}, {
  "failureOrigin" : "replication",

    pham viet

    08/28/2024, 6:44 AM
@kapa.ai I ingest data from Postgres via CDC. When the connection runs and no data is found, it waits about 20 minutes to finish. Can I adjust the wait time?
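For reference, the Postgres source exposes an advanced CDC option, "Initial Waiting Time in Seconds" (initial_waiting_seconds), whose default has been 1200 seconds (20 minutes) in recent versions, which matches the delay described. A hedged sketch of the relevant fragment of the source configuration, assuming that field name:

    {
      "replication_method": {
        "method": "CDC",
        "initial_waiting_seconds": 300
      }
    }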

    Anthony LEBLEU

    08/28/2024, 7:23 AM
@kapa.ai It's possible to create a custom connector with Python. Will this connector be available in the Airbyte UI?
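Generally yes: a Python CDK connector is packaged as a Docker image and can then be registered in the UI. A hedged sketch (the image name and tag are hypothetical):

    # Build and publish the connector image (name/tag are hypothetical)
    docker build -t myregistry/source-custom:0.1.0 .
    docker push myregistry/source-custom:0.1.0
    # Then in the Airbyte UI: Settings > Sources > "+ New connector",
    # and point it at myregistry/source-custom:0.1.0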

    Syed Hamza Raza Kazmi

    08/28/2024, 7:27 AM
I have a Postgres-to-Postgres replication connection set up using CDC. What's happening is that sometimes a smaller number of records takes more time, and sometimes a larger number of records takes less time. What could be the reason?

    Julie Choong

    08/28/2024, 7:40 AM
@kapa.ai How do I add SSL mode "prefer" in values.yaml for Kubernetes?
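The exact keys vary by Helm chart version, so treat this values.yaml fragment as a hedged sketch (the host and the jdbcUrl key are assumptions; appending sslmode to the JDBC URL is one common way to request "prefer"):

    global:
      database:
        host: my-postgres.example.com   # hypothetical host
        port: 5432
        database: airbyte
        # sslmode=prefer asks the driver to try TLS and fall back to plaintext
        jdbcUrl: "jdbc:postgresql://my-postgres.example.com:5432/airbyte?sslmode=prefer"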

    Eka Pramudita

    08/28/2024, 7:52 AM
    hi @kapa.ai Failure in source: Saved offset no longer present on the server. Please reset the connection, and then increase binlog retention and/or increase sync frequency
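That error usually means the binlog rotated past the saved offset before the next sync ran. Assuming a MySQL source, a hedged sketch of raising retention (the values are illustrative):

    -- Self-managed MySQL 8: keep binlogs for 7 days
    SET GLOBAL binlog_expire_logs_seconds = 604800;

    -- Amazon RDS / Aurora MySQL use a stored procedure instead:
    CALL mysql.rds_set_configuration('binlog retention hours', 168);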

    guifesquet

    08/28/2024, 10:02 AM
How do I update the abctl version?
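A minimal sketch, assuming abctl was installed via the official script (re-running it fetches the latest release):

    curl -LsfS https://get.airbyte.com | bash -
    abctl version   # confirm the upgrade took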

    crazysrot

    08/28/2024, 10:10 AM
@kapa.ai Since 00:53 UTC, the MongoDB connector no longer works. Has something changed?

    crazysrot

    08/28/2024, 10:36 AM
@kapa.ai I want to access my invoice history, so I submitted my email address on the "invoice history" page, but the email was never delivered.

    Pascal Weichert

    08/28/2024, 10:56 AM
Can I also export images via Airbyte into a database?

    Tobias Willi

    08/28/2024, 11:19 AM
@kapa.ai Can I only use airbyte-ci when running it from within the airbyte repo?

    Jehkindra Klu-Jackson

    08/28/2024, 11:47 AM
@kapa.ai How do I use Docker Compose to upgrade to the latest Airbyte version on a GCP VM?
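A hedged sketch of the legacy Docker Compose upgrade path, assuming the Airbyte repo was cloned onto the VM (the path is illustrative):

    cd ~/airbyte            # wherever the repo lives on the VM
    docker compose down     # stop the deployment; named volumes are kept
    git pull                # pick up the latest release files
    ./run-ab-platform.sh    # relaunch, pulling the new images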

    guifesquet

    08/28/2024, 11:49 AM
What's the Airbyte Terraform provider server_url for an OSS instance deployed locally with abctl?
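For an abctl deployment the public API is typically served on localhost port 8000, so a hedged provider block looks like this (the variable names are illustrative; credentials come from abctl local credentials):

    provider "airbyte" {
      client_id     = var.client_id      # from: abctl local credentials
      client_secret = var.client_secret
      server_url    = "http://localhost:8000/api/public/v1"
    }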

    guifesquet

    08/28/2024, 12:04 PM
What's going on with the Terraform provider? Any idea?

    │ Error: failure to invoke API
    │
    │   with airbyte_source_mssql.mssql_avensys_srv,
    │   on main.tf line 5, in resource "airbyte_source_mssql" "mssql_avensys_srv":
    │    5: resource "airbyte_source_mssql" "mssql_avensys_srv" {
    │
    │ failed to get token: unexpected status code: 500: {"message":"Internal Server Error: Cannot invoke
    │ \"io.micronaut.web.router.RouteMatch.isAnnotationPresent(java.lang.Class)\" because \"routeMatch\" is
    │ null","exceptionClassName":"java.lang.NullPointerException","exceptionStack":["java.lang.NullPointerException: Cannot invoke
    │ \"io.micronaut.web.router.RouteMatch.isAnnotationPresent(java.lang.Class)\" because \"routeMatch\" is null",

    Simon Veerman

    08/28/2024, 12:37 PM
Hi Kapa, I would like to launch Airbyte OSS on Google Cloud Compute Engine. I would prefer to use abctl. Can you explain the steps?
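A hedged sketch of the happy path on a fresh Debian/Ubuntu Compute Engine VM (firewall rules and machine sizing are assumed to be handled separately):

    # 1. Install Docker
    curl -fsSL https://get.docker.com | sudo sh
    sudo usermod -aG docker "$USER" && newgrp docker
    # 2. Install abctl
    curl -LsfS https://get.airbyte.com | bash -
    # 3. Deploy Airbyte on the VM
    abctl local install
    # 4. Retrieve the initial UI credentials
    abctl local credentials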

    DM

    08/28/2024, 1:17 PM
Help, I'm getting this issue: io.airbyte.workers.exception.WorkerException: Could not find image: airbyte/destination-mssql:1.0.0
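A quick hedged first check is whether the node can pull that tag at all; if the tag was never published to Docker Hub, the pull fails the same way:

    docker pull airbyte/destination-mssql:1.0.0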

    guifesquet

    08/28/2024, 1:23 PM
Running on macOS:

    ERROR Cluster 'airbyte-abctl' could not be created
    ERROR unable to create kind cluster: command "docker run --name airbyte-abctl-control-plane --hostname airbyte-abctl-control-plane --label io.x-k8s.kind.role=control-plane --privileged --security-opt seccomp=unconfined --security-opt apparmor=unconfined --tmpfs /tmp --tmpfs /run --volume /var --volume /lib/modules:/lib/modules:ro -e KIND_EXPERIMENTAL_CONTAINERD_SNAPSHOTTER --detach --tty --label io.x-k8s.kind.cluster=airbyte-abctl --net kind --restart=on-failure:1 --init=false --cgroupns=private --volume=/Users/guifesquet/.airbyte/abctl/data:/var/local-path-provisioner --publish=0.0.0.0:8000:80/TCP --publish=127.0.0.1:51834:6443/TCP -e KUBECONFIG=/etc/kubernetes/admin.conf kindest/node:v1.29.4@sha256:3abb816a5b1061fb15c6e9e60856ec40d56b7b52bcea5f5f1350bc6e2320b6f8" failed with error: exit status 125

    Gideon

    08/28/2024, 2:04 PM
    @kapa.ai What do these errors indicate:
    2024-08-28 13:56:58 replication-orchestrator > Creating stderr socket server...
    2024-08-28 13:56:58 replication-orchestrator > Creating stdout socket server...
    2024-08-28 13:56:58 replication-orchestrator > Creating stderr socket server...
    2024-08-28 13:56:58 replication-orchestrator > Creating pod source-google-sheets-read-6897-0-qbigt...
    2024-08-28 13:56:58 replication-orchestrator > Creating pod destination-bigquery-write-6897-0-csdjf...
    2024-08-28 13:57:00 replication-orchestrator > Waiting for init container to be ready before copying files...
    2024-08-28 13:57:00 replication-orchestrator > Waiting for init container to be ready before copying files...
    2024-08-28 13:57:37 replication-orchestrator > Init container ready..
    2024-08-28 13:57:37 replication-orchestrator > Copying files...
    2024-08-28 13:57:37 replication-orchestrator > Uploading file: source_config.json
    2024-08-28 13:57:37 replication-orchestrator > kubectl cp /tmp/a15ca4df-c2d5-4a9e-a716-e81bda74d21b/source_config.json airbyte/source-google-sheets-read-6897-0-qbigt:/config/source_config.json -c init --retries=3
    2024-08-28 13:57:37 replication-orchestrator > Waiting for kubectl cp to complete
    2024-08-28 13:57:37 replication-orchestrator > Init container ready..
    2024-08-28 13:57:37 replication-orchestrator > Copying files...
    2024-08-28 13:57:37 replication-orchestrator > Uploading file: destination_config.json
    2024-08-28 13:57:37 replication-orchestrator > kubectl cp /tmp/f8cc99a4-dc17-4cc9-b588-4271185d0c29/destination_config.json airbyte/destination-bigquery-write-6897-0-csdjf:/config/destination_config.json -c init --retries=3
    2024-08-28 13:57:37 replication-orchestrator > Waiting for kubectl cp to complete
    2024-08-28 13:57:38 replication-orchestrator > (pod: airbyte / source-google-sheets-read-6897-0-qbigt) - Destroying Kube process.
    java.net.SocketException: Socket closed
    	at java.base/sun.nio.ch.NioSocketImpl.endAccept(NioSocketImpl.java:682)
    	at java.base/sun.nio.ch.NioSocketImpl.accept(NioSocketImpl.java:755)
    	at java.base/java.net.ServerSocket.implAccept(ServerSocket.java:698)
    	at java.base/java.net.ServerSocket.platformImplAccept(ServerSocket.java:663)
    	at java.base/java.net.ServerSocket.implAccept(ServerSocket.java:639)
    	at java.base/java.net.ServerSocket.implAccept(ServerSocket.java:585)
    	at java.base/java.net.ServerSocket.accept(ServerSocket.java:543)
    	at io.airbyte.workers.process.KubePodProcess.lambda$setupStdOutAndStdErrListeners$10(KubePodProcess.java:706)
    	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:572)
    	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    	at java.base/java.lang.Thread.run(Thread.java:1583)
    java.net.SocketException: Socket closed
    	at java.base/sun.nio.ch.NioSocketImpl.endAccept(NioSocketImpl.java:682)
    	at java.base/sun.nio.ch.NioSocketImpl.accept(NioSocketImpl.java:755)
    	at java.base/java.net.ServerSocket.implAccept(ServerSocket.java:698)
    	at java.base/java.net.ServerSocket.platformImplAccept(ServerSocket.java:663)
    	at java.base/java.net.ServerSocket.implAccept(ServerSocket.java:639)
    	at java.base/java.net.ServerSocket.implAccept(ServerSocket.java:585)
    	at java.base/java.net.ServerSocket.accept(ServerSocket.java:543)
    	at io.airbyte.workers.process.KubePodProcess.lambda$setupStdOutAndStdErrListeners$11(KubePodProcess.java:724)
    	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:572)
    	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    	at java.base/java.lang.Thread.run(Thread.java:1583)
    2024-08-28 13:57:38 replication-orchestrator > (pod: airbyte / source-google-sheets-read-6897-0-qbigt) - Closed all resources for pod
    2024-08-28 13:57:38 replication-orchestrator > (pod: airbyte / source-google-sheets-read-6897-0-qbigt) - Destroyed Kube process.

    guifesquet

    08/28/2024, 2:05 PM
│ Error: failure to invoke API
│
│   with airbyte_source_mssql.mssql_avensys_srv,
│   on main.tf line 5, in resource "airbyte_source_mssql" "mssql_avensys_srv":
│    5: resource "airbyte_source_mssql" "mssql_avensys_srv" {
│
│ failed to get token: failed to send token request: Post "http://stl-rec-datel:8000/api/public/v1/api/v1/applications/token": dial tcp 10.60.0.150:8000: connect: connection refused

    Aditya Gupta

    08/28/2024, 2:07 PM
@kapa.ai Where can I find my openapi.json config on my server?

    guifesquet

    08/28/2024, 2:07 PM
This is the API's answer to the Terraform provider: failed to get token: failed to send token request: Post "http://stl-rec-datel:8000/api/public/v1/api/v1/applications/token": dial tcp 10.60.0.150:8000: connect: connection refused