# ask-ai
  • Aswin

    12/01/2025, 10:13 AM
    @kapa.ai I am running the free version of Airbyte on my own machine, migrating a few TB of data from one Postgres DB to another. Once it's done, will all the indexes and bloat be gone? How is the data brought into the new DB? If it's all INSERT statements, the tables would be fresh, right? I want to understand how the initial data transfer and CDC work in Airbyte.
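    On the bloat question, one way to verify things yourself: Airbyte reads records from the source and writes them into tables the destination connector creates, so source-side bloat and dead tuples should not carry over, but indexes you created manually on the source are not recreated either. A minimal sketch (psycopg2, with placeholder connection strings and schema) that lists index definitions missing on the new database:

```python
# Minimal sketch: compare index definitions between the old and new Postgres
# databases after the Airbyte sync. DSNs and schema name are placeholders.
import psycopg2

SOURCE_DSN = "host=old-db dbname=mydb user=me password=secret"  # placeholder
DEST_DSN = "host=new-db dbname=mydb user=me password=secret"    # placeholder
SCHEMA = "public"                                               # placeholder

def list_indexes(dsn: str) -> set[str]:
    """Return the set of index definitions in the given schema."""
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(
            "SELECT indexdef FROM pg_indexes WHERE schemaname = %s",
            (SCHEMA,),
        )
        return {row[0] for row in cur.fetchall()}

missing = list_indexes(SOURCE_DSN) - list_indexes(DEST_DSN)
for indexdef in sorted(missing):
    print("not present on destination:", indexdef)
```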
  • Piyush Shakya

    12/01/2025, 3:34 PM
    @kapa.ai where is the cron schedule data stored in the Airbyte database tables?
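    Schedules live in Airbyte's internal config database rather than in system cron. A minimal sketch assuming direct access to that Postgres database, and assuming the `connection` table carries `schedule_type` / `schedule_data` columns (column names vary by version, so verify against your schema):

```python
# Minimal sketch: list cron-scheduled connections from Airbyte's internal DB.
# The DSN is a placeholder; table/column names are assumptions to verify.
import psycopg2

AIRBYTE_DB_DSN = "host=airbyte-db dbname=db-airbyte user=airbyte password=..."  # placeholder

with psycopg2.connect(AIRBYTE_DB_DSN) as conn, conn.cursor() as cur:
    cur.execute(
        "SELECT id, name, schedule_type, schedule_data "
        "FROM connection WHERE schedule_type = 'cron'"
    )
    for row in cur.fetchall():
        print(row)
```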
  • Lucas Segers

    12/01/2025, 4:24 PM
    Hi, do the file sources (such as S3 or SFTP) that support the "excel" file format allow configuring which worksheet will be read?
  • Chris Dahms

    12/01/2025, 5:06 PM
    @kapa.ai we installed Airbyte 1.8.2 with an external PostgreSQL database using abctl, then uninstalled and re-installed Airbyte 1.8.4, and now our connections are not working; they fail with a 502 when testing in the GUI.
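    When everything 502s after a reinstall, it helps to first confirm whether the server itself is answering or the ingress is at fault. A minimal sketch hitting the server health endpoint directly (base URL is a placeholder; abctl publishes on localhost:8000 by default):

```python
# Minimal sketch: query the Airbyte server's health endpoint, bypassing the
# webapp, to see whether the 502 originates at the server or the ingress.
import requests

BASE_URL = "http://localhost:8000"  # placeholder; adjust for your install

resp = requests.get(f"{BASE_URL}/api/v1/health", timeout=10)
print(resp.status_code, resp.text)
```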
  • Tara DeBono

    12/01/2025, 7:01 PM
    @kapa.ai is there a thread for Airbyte support users? I am a product manager and my team uses Airbyte; I have to check logs and review error reporting. We upgraded to 2.0 and would love to see which features have been updated.
  • Finn Bauer

    12/01/2025, 8:16 PM
    Airbyte S3 destination error: broken pipe. It started out of nowhere.
  • Eduardo Ferreira

    12/01/2025, 8:56 PM
    @kapa.ai this is happening on Airbyte on Helm chart v2:
    ERROR	i.a.a.c.c.ClientConfigurationSupport(generateDefaultRetryPolicy$lambda$3):63 - Failed to call unknown.  Last response: null
    java.io.IOException: HTTP error: 500 Internal Server Error
    	at io.airbyte.api.client.interceptor.ThrowOn5xxInterceptor.intercept(ThrowOn5xxInterceptor.kt:16)
    	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
    	at okhttp3.internal.connection.RealCall.getResponseWithInterceptorChain$okhttp(RealCall.kt:201)
    	at okhttp3.internal.connection.RealCall.execute(RealCall.kt:154)
    	at dev.failsafe.okhttp.FailsafeCall.lambda$execute$0(FailsafeCall.java:117)
    	at dev.failsafe.Functions.lambda$get$0(Functions.java:46)
    	at dev.failsafe.internal.RetryPolicyExecutor.lambda$apply$0(RetryPolicyExecutor.java:74)
    	at dev.failsafe.SyncExecutionImpl.executeSync(SyncExecutionImpl.java:187)
    	at dev.failsafe.CallImpl.execute(CallImpl.java:33)
    	at dev.failsafe.okhttp.FailsafeCall.execute(FailsafeCall.java:121)
    	at io.airbyte.api.client.generated.WorkloadOutputApi.writeWorkloadOutputWithHttpInfo(WorkloadOutputApi.kt:376)
    	at io.airbyte.api.client.generated.WorkloadOutputApi.writeWorkloadOutput(WorkloadOutputApi.kt:62)
    	at io.airbyte.workers.workload.WorkloadOutputWriter.writeOutputThroughServer(WorkloadOutputWriter.kt:82)
    	at io.airbyte.workers.workload.WorkloadOutputWriter.write(WorkloadOutputWriter.kt:65)
    	at io.airbyte.connectorSidecar.ConnectorWatcher.handleException(ConnectorWatcher.kt:207)
    	at io.airbyte.connectorSidecar.ConnectorWatcher.run(ConnectorWatcher.kt:83)
    	at io.airbyte.connectorSidecar.ApplicationKt.main(Application.kt:22)
    	at io.airbyte.connectorSidecar.ApplicationKt.main(Application.kt)
  • Joao Pedro Ferreira Canutto

    12/01/2025, 8:58 PM
    @kapa.ai I have an Airbyte 1.8.1 K8s installation, and one of my connections breaks when it reaches about 300 MB of records read:
    2025-11-28 17:25:52 replication-orchestrator INFO Records read: 315000 (299 MB)
    2025-11-28 17:26:14 replication-orchestrator INFO Records read: 320000 (305 MB)
    2025-11-28 17:26:20 destination ERROR Killed
    2025-11-28 17:26:20 replication-orchestrator ERROR DestinationReader error: 
    2025-11-28 17:26:20 replication-orchestrator INFO DestinationReader finished.
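    A `destination ERROR Killed` line with no stack trace is typically the destination container being OOM-killed by Kubernetes. One hedged way to test that theory is to raise the job pods' memory; the values path below is an assumption based on the Airbyte Helm chart's `global.jobs.resources` setting, so verify it against your chart version:

```python
# Minimal sketch: emit a Helm values patch raising job-pod memory.
# The values path and sizes are assumptions; check your chart's docs.
import yaml  # PyYAML

values = {
    "global": {
        "jobs": {
            "resources": {
                "requests": {"memory": "2Gi"},  # placeholder sizing
                "limits": {"memory": "4Gi"},    # placeholder sizing
            }
        }
    }
}

with open("airbyte-values-patch.yaml", "w") as f:
    yaml.safe_dump(values, f, sort_keys=False)
# then: helm upgrade airbyte airbyte/airbyte -f airbyte-values-patch.yaml
```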
  • Zack Mattor

    12/01/2025, 10:29 PM
    @kapa.ai I'm having issues with my MySQL sync... the initial sync with the refresh works fine, but it then sometimes fails with the following:
    2025-12-01 17:18:51 info Failures: [ {
      "failureOrigin" : "source",
      "failureType" : "config_error",
      "internalMessage" : "org.apache.kafka.connect.errors.ConnectException: An exception occurred in the change event producer. This connector will be stopped.",
      "externalMessage" : "MySQL Connector Error: The sync encountered an unexpected error in the change event producer and has stopped. Please check the logs for details and troubleshoot accordingly.\n<https://docs.oracle.com/javase/9/docs/api/java/lang/RuntimeException.html>",
      "metadata" : {
        "attemptNumber" : 0,
        "jobId" : 362,
        "from_trace_message" : true,
        "connector_command" : "read"
      },
      "stacktrace" : "org.apache.kafka.connect.errors.ConnectException: An exception occurred in the change event producer. This connector will be stopped.\n\tat io.debezium.pipeline.ErrorHandler.setProducerThrowable(ErrorHandler.java:67)\n\tat io.debezium.connector.binlog.BinlogStreamingChangeEventSource$ReaderThreadLifecycleListener.onCommunicationFailure(BinlogStreamingChangeEventSource.java:1252)\n\tat com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:1110)\n\tat com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:657)\n\tat com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:959)\n\tat java.base/java.lang.Thread.run(Thread.java:1583)\nCaused by: io.debezium.DebeziumException: Failed to read next byte from position 5349653\n\tat io.debezium.connector.binlog.BinlogStreamingChangeEventSource.wrap(BinlogStreamingChangeEventSource.java:1206)\n\t... 5 more\nCaused by: java.io.EOFException: Failed to read next byte from position 5349653\n\tat com.github.shyiko.mysql.binlog.io.ByteArrayInputStream.read(ByteArrayInputStream.java:226)\n\tat com.github.shyiko.mysql.binlog.io.ByteArrayInputStream.readLong(ByteArrayInputStream.java:66)\n\tat com.github.shyiko.mysql.binlog.event.deserialization.EventHeaderV4Deserializer.deserialize(EventHeaderV4Deserializer.java:36)\n\tat com.github.shyiko.mysql.binlog.event.deserialization.EventHeaderV4Deserializer.deserialize(EventHeaderV4Deserializer.java:27)\n\tat com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.nextEvent(EventDeserializer.java:232)\n\tat io.debezium.connector.binlog.BinlogStreamingChangeEventSource$1.nextEvent(BinlogStreamingChangeEventSource.java:426)\n\tat com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:1082)\n\t... 3 more\n",
      "timestamp" : 1764627313837
    }, {
      "failureOrigin" : "destination",
      "failureType" : "transient_error",
      "internalMessage" : "io.airbyte.cdk.TransientErrorException: Input was fully read, but some streams did not receive a terminal stream status message. If the destination did not encounter other errors, this likely indicates an error in the source or platform. Streams without a status message: [
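    The `EOFException: Failed to read next byte from position ...` in that trace usually means the MySQL server rotated or purged the binlog file the CDC reader was still consuming. A minimal sketch (pymysql, placeholder credentials) to check the retention window and which binlog files still exist:

```python
# Minimal sketch: inspect binlog retention and the surviving binlog files.
import pymysql

conn = pymysql.connect(host="mysql-host", user="me", password="secret")  # placeholders
try:
    with conn.cursor() as cur:
        cur.execute("SHOW VARIABLES LIKE 'binlog_expire_logs_seconds'")
        print(cur.fetchall())  # retention window in seconds (MySQL 8+)
        cur.execute("SHOW BINARY LOGS")
        for name, size, *rest in cur.fetchall():
            print(name, size)
finally:
    conn.close()
```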
  • Yuki Kakegawa

    12/02/2025, 4:05 AM
    I want to keep incremental refresh while deleting rows that are deleted in the source. Is that possible?
  • Ishan Anilbhai Koradiya

    12/02/2025, 4:48 AM
    Hi @kapa.ai, getting this error:
    97 [Activity Executor taskQueue="CONNECTION_UPDATER", namespace="default": 2040] ERROR i.a.w.t.s.a.JobCreationAndStatusUpdateActivityImpl(createNewAttemptNumber):98 - createNewAttemptNumber for job 60048 failed with exception: HTTP error: 500 Internal Server Error
  • Rahul

    12/02/2025, 6:45 AM
    @kapa.ai what happens if I change the destination database after the historical sync is complete and 3-4 incremental syncs have already completed against the old database?
  • Akhil Varghese

    12/02/2025, 9:08 AM
    In a self-hosted Airbyte instance, can we create a new Airbyte application via the REST API?
  • Akhil Varghese

    12/02/2025, 9:22 AM
    @kapa.ai can we create a new Airbyte application via the REST API in Self-Managed Enterprise?
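    If your build exposes the public API's Applications resource, creation should be a plain authenticated POST. The path and payload below are assumptions based on that resource, not a confirmed contract for Self-Managed Enterprise, so check your instance's API reference first:

```python
# Minimal sketch: create an application via the public API.
# URL, token, path, and payload shape are all assumptions/placeholders.
import requests

BASE_URL = "https://airbyte.mycompany.internal"  # placeholder
ACCESS_TOKEN = "..."                             # placeholder bearer token

resp = requests.post(
    f"{BASE_URL}/api/public/v1/applications",    # assumed path
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"name": "my-new-app"},                 # hypothetical payload
    timeout=30,
)
print(resp.status_code, resp.text)
```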
  • Ishan Anilbhai Koradiya

    12/02/2025, 10:00 AM
    Hi @kapa.ai, seeing this error in the server logs:
    2025-12-01 20:56:46,590 [io-executor-thread-6] ERROR i.a.c.s.e.h.UncaughtExceptionHandler(handle):33 - Uncaught exception
    java.lang.IllegalStateException: Cannot create an attempt for a job id: 60048 that has a running attempt: RUNNING for connection id: 6a6a0c43-9ade-4ea0-8979-d28115ff5df3
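    The exception says job 60048 still has an attempt marked RUNNING, which blocks new attempts from being created. A minimal sketch to inspect that row in the internal jobs database; the table and column names are assumptions about Airbyte's schema, and anything beyond reading it should be done with care:

```python
# Minimal sketch: find the attempt Airbyte believes is still running for a job.
# DSN is a placeholder; `attempts` table/columns are assumptions to verify.
import psycopg2

JOBS_DB_DSN = "host=airbyte-db dbname=db-airbyte user=airbyte password=..."  # placeholder
JOB_ID = 60048

with psycopg2.connect(JOBS_DB_DSN) as conn, conn.cursor() as cur:
    cur.execute(
        "SELECT id, job_id, attempt_number, status, updated_at "
        "FROM attempts WHERE job_id = %s AND status = 'running'",
        (JOB_ID,),
    )
    for row in cur.fetchall():
        print(row)
```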
  • Eduardo Ferreira

    12/02/2025, 12:47 PM
    @kapa.ai having problems with airbyte-workload-launcher; auth is disabled.
    2025-12-02 12:43:51,538 [Thread-2] ERROR i.a.w.l.StartupApplicationEventListener(onApplicationEvent$lambda$1):43 - Failed to retrieve and resume claimed workloads, exiting.
    io.airbyte.api.client.ApiException: Unauthorized
    	at io.airbyte.workload.api.client.WorkloadApiClient.workloadList(WorkloadApiClient.kt:288)
    	at io.airbyte.workload.launcher.ClaimedProcessor.getWorkloadList$lambda$9(ClaimedProcessor.kt:112)
    	at dev.failsafe.Functions.lambda$toCtxSupplier$11(Functions.java:243)
    	at dev.failsafe.Functions.lambda$get$0(Functions.java:46)
    	at dev.failsafe.internal.RetryPolicyExecutor.lambda$apply$0(RetryPolicyExecutor.java:74)
    	at dev.failsafe.SyncExecutionImpl.executeSync(SyncExecutionImpl.java:187)
    	at dev.failsafe.FailsafeExecutor.call(FailsafeExecutor.java:376)
    	at dev.failsafe.FailsafeExecutor.get(FailsafeExecutor.java:112)
    	at io.airbyte.workload.launcher.ClaimedProcessor.getWorkloadList(ClaimedProcessor.kt:110)
    	at io.airbyte.workload.launcher.ClaimedProcessor.retrieveAndProcess(ClaimedProcessor.kt:57)
    	at io.airbyte.workload.launcher.StartupApplicationEventListener.onApplicationEvent$lambda$1(StartupApplicationEventListener.kt:39)
    	at kotlin.concurrent.ThreadsKt$thread$thread$1.run(Thread.kt:30)
  • Seckin Dinc

    12/02/2025, 2:06 PM
    TikTok GMV Max Campaign availability in Airbyte
  • Dana Williams

    12/02/2025, 2:31 PM
    @kapa.ai 5 connectors failed to sync with the error: "Warning from source: Check took too long. Check exceeded the timeout."
  • Christopher Vreugdenhil

    12/02/2025, 4:40 PM
    @kapa.ai, when using Airbyte OSS, is the OAuth source flow available? Or is this only for Cloud / Enterprise?
  • Louis Demet

    12/02/2025, 4:58 PM
    Is it possible, with the Gorgias connector, to retrieve the values of the custom fields for each ticket? We can retrieve the tickets and the list of available custom fields, but I can't find the association between the two.
  • Albin Henneberger

    12/02/2025, 6:01 PM
    @kapa.ai - I am getting a gateway 504 error on my self-hosted instance that began last week, between Airbyte and Snowflake. I just forced the update to 4.0.30 and have no idea why this issue is occurring. An unknown error occurred. (HTTP 504)
  • Dan Cook

    12/02/2025, 7:31 PM
    We are Airbyte Cloud customers and have built custom connectors using the Connector Builder UI. Question for @kapa.ai: can we migrate those custom connectors to a self-hosted version of Airbyte?
  • Sam Woodbeck

    12/02/2025, 7:39 PM
    @kapa.ai I'm using the Marketplace Freshservice connector (https://docs.airbyte.com/integrations/sources/freshservice). I'd like to report on a ticket's acknowledged datetime and resolved datetime, but I don't see any fields like this in the `tickets` stream. In the Freshservice documentation (https://api.freshservice.com/v2/#view_all_ticket) for the `/api/v2/tickets` endpoint, they mention that the API parameter `include=stats` will embed additional details in the response, including the `resolved_at` and `first_responded_at` datetime fields. Is there a way to configure the Freshservice connector to pass this API parameter? Or would I need to make a feature request to the connector owner?
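    Before filing a feature request, it may help to confirm what `include=stats` actually returns for your account. A minimal sketch calling the endpoint the docs describe (domain and API key are placeholders; Freshservice takes the API key as the basic-auth username):

```python
# Minimal sketch: fetch tickets with include=stats and print the embedded
# resolved_at / first_responded_at fields. Domain and API key are placeholders.
import requests

DOMAIN = "yourcompany.freshservice.com"  # placeholder
API_KEY = "..."                          # placeholder

resp = requests.get(
    f"https://{DOMAIN}/api/v2/tickets",
    params={"include": "stats"},
    auth=(API_KEY, "X"),  # API key as username, dummy password
    timeout=30,
)
resp.raise_for_status()
for ticket in resp.json().get("tickets", []):
    stats = ticket.get("stats") or {}
    print(ticket.get("id"), stats.get("resolved_at"), stats.get("first_responded_at"))
```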
  • Jared Parco

    12/02/2025, 7:43 PM
    @kapa.ai We are running into issues where a MySQL table isn't capturing all of the CDC changes into Snowflake, and we occasionally have to do a full refresh of certain streams. Why would this be occurring, and where should we look to resolve it? We have these optional parameters on the MySQL source: useCursorFetch=true&defaultFetchSize=1000, with Checkpoint Target Time Interval = 300. We are using the MySQL v3.50.9 source and the Snowflake v4.0.4 destination.
  • Carmela Beiro

    12/02/2025, 8:02 PM
    @kapa.ai what happens with _ab_cdc_lsn when there is a refresh? Is it set to null?
  • soma chandra sekhar attaluri

    12/02/2025, 8:08 PM
    @kapa.ai is there any service named airbyte-webapp-svc in Airbyte 2.0 installed using Helm charts v2?
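    Rather than guessing at service names across chart versions, you can list what the release actually created. A minimal sketch shelling out to kubectl (the namespace is a placeholder):

```python
# Minimal sketch: list services in the Airbyte namespace and flag webapp ones.
import subprocess

NAMESPACE = "airbyte"  # placeholder

out = subprocess.run(
    ["kubectl", "get", "svc", "-n", NAMESPACE, "-o", "name"],
    capture_output=True, text=True, check=True,
).stdout.splitlines()
print("\n".join(out))
print("webapp services:", [s for s in out if "webapp" in s])
```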
  • soma chandra sekhar attaluri

    12/02/2025, 10:00 PM
    @kapa.ai how do I disable the secure cookie in an abctl installation?
  • Jeremy Plummer

    12/02/2025, 10:45 PM
    @kapa.ai Is there a way to copy the Terraform configuration of a connection set up in Airbyte?
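    I'm not aware of a built-in export-to-Terraform, but you can read a connection's current configuration from the public API and translate it into the Terraform provider's `airbyte_connection` resource by hand. A minimal sketch (URL, token, and connection ID are placeholders):

```python
# Minimal sketch: dump a connection's configuration as JSON to hand-translate
# into Terraform. Base URL, token, and connection ID are placeholders.
import json
import requests

BASE_URL = "http://localhost:8000"                     # placeholder
TOKEN = "..."                                          # placeholder
CONNECTION_ID = "00000000-0000-0000-0000-000000000000" # placeholder

resp = requests.get(
    f"{BASE_URL}/api/public/v1/connections/{CONNECTION_ID}",
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
resp.raise_for_status()
print(json.dumps(resp.json(), indent=2))
```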
  • Yuki Kakegawa

    12/02/2025, 11:15 PM
    Is there a way to "Refresh your data" for only one of the tables configured in the sync?
  • Mauricio Pérez

    12/02/2025, 11:56 PM
    @kapa.ai I'm having trouble setting up the Amazon Seller Partner connector; I'm getting this error: "Encountered an error while checking availability of stream Orders. Error: 400 Client Error: Bad Request for url: https://api.amazon.com/auth/o2/token". How can I solve this?
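    The 400 is coming from Amazon's LWA token endpoint, so testing the refresh-token exchange outside Airbyte will show Amazon's actual error body (for example, an expired refresh token or mismatched client credentials). A minimal sketch with placeholder credentials:

```python
# Minimal sketch: exercise the LWA refresh-token exchange directly to see the
# full error Amazon returns. All credential values are placeholders.
import requests

resp = requests.post(
    "https://api.amazon.com/auth/o2/token",
    data={
        "grant_type": "refresh_token",
        "refresh_token": "Atzr|...",                    # placeholder
        "client_id": "amzn1.application-oa2-client...", # placeholder
        "client_secret": "...",                         # placeholder
    },
    timeout=30,
)
print(resp.status_code, resp.json())
```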