# ask-ai
  • Sergey

    11/21/2025, 2:13 PM
    @kapa.ai, what did I miss in my ClickHouse setup?
    ```
    WARN main i.a.c.l.c.CheckOperation(execute):58 Caught throwable during CHECK java.lang.IllegalArgumentException: Failed to insert expected rows into check table. Actual written: 0
    	at io.airbyte.integrations.destination.clickhouse.check.ClickhouseChecker.check(ClickhouseChecker.kt:48) ~[io.airbyte.airbyte-integrations.connectors-destination-clickhouse.jar:?]
    	at io.airbyte.integrations.destination.clickhouse.check.ClickhouseChecker.check(ClickhouseChecker.kt:20) ~[io.airbyte.airbyte-integrations.connectors-destination-clickhouse.jar:?]
    	at io.airbyte.cdk.load.check.CheckOperation.execute(CheckOperation.kt:48) [bulk-cdk-core-load-0.1.78.jar:?]
    	at io.airbyte.cdk.AirbyteConnectorRunnable.run(AirbyteConnectorRunnable.kt:36) [bulk-cdk-core-base-0.1.78.jar:?]
    	at picocli.CommandLine.executeUserObject(CommandLine.java:2030) [picocli-4.7.6.jar:4.7.6]
    	at picocli.CommandLine.access$1500(CommandLine.java:148) [picocli-4.7.6.jar:4.7.6]
    	at picocli.CommandLine$RunLast.executeUserObjectOfLastSubcommandWithSameParent(CommandLine.java:2465) [picocli-4.7.6.jar:4.7.6]
    	at picocli.CommandLine$RunLast.handle(CommandLine.java:2457) [picocli-4.7.6.jar:4.7.6]
    	at picocli.CommandLine$RunLast.handle(CommandLine.java:2419) [picocli-4.7.6.jar:4.7.6]
    	at picocli.CommandLine$AbstractParseResultHandler.execute(CommandLine.java:2277) [picocli-4.7.6.jar:4.7.6]
    	at picocli.CommandLine$RunLast.execute(CommandLine.java:2421) [picocli-4.7.6.jar:4.7.6]
    	at picocli.CommandLine.execute(CommandLine.java:2174) [picocli-4.7.6.jar:4.7.6]
    	at io.airbyte.cdk.AirbyteDestinationRunner$Companion.run(AirbyteConnectorRunner.kt:289) [bulk-cdk-core-base-0.1.78.jar:?]
    	at io.airbyte.cdk.AirbyteDestinationRunner$Companion.run$default(AirbyteConnectorRunner.kt:75) [bulk-cdk-core-base-0.1.78.jar:?]
    	at io.airbyte.integrations.destination.clickhouse.ClickhouseDestinationKt.main(ClickhouseDestination.kt:10) [io.airbyte.airbyte-integrations.connectors-destination-clickhouse.jar:?]
    ```
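    A check failure with "Actual written: 0" means the connector reached ClickHouse but its test insert wrote no rows, which usually points at missing INSERT privileges, a read-only database, or quota settings. A minimal sketch for reproducing the same write outside Airbyte, assuming the clickhouse-connect Python client (host, credentials, and the scratch table name are placeholders):
    ```python
    import datetime
    import clickhouse_connect

    # Placeholder connection details; mirror the values from the Airbyte
    # destination config exactly.
    client = clickhouse_connect.get_client(
        host="your-clickhouse-host",
        port=8443,
        username="airbyte_user",
        password="...",
        secure=True,
    )

    # Roughly what the connector's CHECK does: create a scratch table,
    # insert one row, and verify the row actually landed.
    client.command(
        "CREATE TABLE IF NOT EXISTS _airbyte_check_test (ts DateTime) "
        "ENGINE = MergeTree ORDER BY ts"
    )
    client.insert("_airbyte_check_test", [[datetime.datetime.now()]], column_names=["ts"])
    print(client.command("SELECT count() FROM _airbyte_check_test"))  # 0 here reproduces the error
    client.command("DROP TABLE _airbyte_check_test")
    ```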
  • Renu Fulmali

    11/21/2025, 2:27 PM
    @kapa.ai I am getting this error in the Connector Builder:
    ```
    i.a.w.i.VersionedAirbyteStreamFactory(internalLog$io_airbyte_airbyte_commons_worker):245 - Caught exception that stops the processing of the jobs: dictionary update sequence element #0 has length 10; 2 is required.
    Traceback (most recent call last):
      File "/home/airbyte/.pyenv/versions/3.10.18/lib/python3.10/site-packages/airbyte_cdk/sources/declarative/async_job/job_orchestrator.py", line 464, in create_and_get_completed_partitions
        self._update_jobs_status()
      File "/home/airbyte/.pyenv/versions/3.10.18/lib/python3.10/site-packages/airbyte_cdk/sources/declarative/async_job/job_orchestrator.py", line 302, in _update_jobs_status
        self._job_repository.update_jobs_status(running_jobs)
      File "/home/airbyte/.pyenv/versions/3.10.18/lib/python3.10/site-packages/airbyte_cdk/sources/declarative/requesters/http_job_repository.py", line 183, in update_jobs_status
        stream_slice = self._get_create_job_stream_slice(job)
      File "/home/airbyte/.pyenv/versions/3.10.18/lib/python3.10/site-packages/airbyte_cdk/sources/declarative/requesters/http_job_repository.py", line 328, in _get_create_job_stream_slice
        "creation_response": self._get_creation_response_interpolation_context(job),
      File "/home/airbyte/.pyenv/versions/3.10.18/lib/python3.10/site-packages/airbyte_cdk/sources/declarative/requesters/http_job_repository.py", line 288, in _get_creation_response_interpolation_context
        creation_response_context = dict(self._create_job_response_by_id[job.api_job_id()].json())
    ValueError: dictionary update sequence element #0 has length 10; 2 is required
    ```
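    For context, that ValueError is Python's dict() rejecting input that is neither a mapping nor a sequence of 2-item pairs. In the frame above, the CDK calls dict() on the async job's creation response, so the error suggests the API returned a JSON array (or a bare string) where the connector expected a JSON object. A minimal repro, with a hypothetical 10-character payload to match the "length 10" in the message:
    ```python
    # dict() accepts a mapping or an iterable of (key, value) pairs. A JSON
    # array of strings satisfies neither: each string is treated as a "pair"
    # and its length is checked.
    payload = ["2025-11-21"]  # hypothetical creation-response body
    try:
        dict(payload)
    except ValueError as err:
        print(err)  # dictionary update sequence element #0 has length 10; 2 is required
    ```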
  • Ram Kalbande

    11/21/2025, 4:13 PM
    @kapa.ai, what does this error mean in my Google Ads connection?
    ```
    ERROR main i.a.c.i.u.ConnectorExceptionHandler(handleException):68 caught exception! io.airbyte.commons.exceptions.TransientErrorException: Some streams were unsuccessful due to a source error. See logs for details.
    	at io.airbyte.cdk.integrations.destination.async.AsyncStreamConsumer.close(AsyncStreamConsumer.kt:215) ~[airbyte-cdk-core-0.46.0.jar:?]
    	at kotlin.jdk7.AutoCloseableKt.closeFinally(AutoCloseableJVM.kt:48) ~[kotlin-stdlib-2.0.0.jar:2.0.0-release-341]
    	at io.airbyte.cdk.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.kt:215) [airbyte-cdk-core-0.46.0.jar:?]
    	at io.airbyte.cdk.integrations.base.IntegrationRunner.run(IntegrationRunner.kt:119) [airbyte-cdk-core-0.46.0.jar:?]
    	at io.airbyte.cdk.integrations.base.IntegrationRunner.run$default(IntegrationRunner.kt:113) [airbyte-cdk-core-0.46.0.jar:?]
    	at io.airbyte.integrations.destination.bigquery.BigQueryDestinationKt.main(BigQueryDestination.kt:566) [io.airbyte.airbyte-integrations.connectors-destination-bigquery.jar:?]
    ```
  • mike Trienis

    11/21/2025, 9:19 PM
    @kapa.ai do you have some concrete examples using configs with stream templates?
  • Prabhu Agarwal

    11/22/2025, 8:06 AM
    Getting this error in the download phase of an async stream. The download location is Google Cloud Storage and the file type is CSV:
    ```
    Internal Server Error: com.fasterxml.jackson.databind.JsonMappingException: String value length (20054016) exceeds the maximum allowed (20000000, from StreamReadConstraints.getMaxStringLength()) (through reference chain: io.airbyte.connectorbuilderserver.api.client.model.generated.StreamRead["slices"]->java.util.ArrayList[0]->io.airbyte.connectorbuilderserver.api.client.model.generated.StreamReadSlicesInner["pages"]->java.util.ArrayList[0]->io.airbyte.connectorbuilderserver.api.client.model.generated.StreamReadSlicesInnerPagesInner["response"]->io.airbyte.connectorbuilderserver.api.client.model.generated.HttpResponse["body"])
    ```
  • Ishan Anilbhai Koradiya

    11/23/2025, 5:50 AM
    @kapa.ai I am getting exit code 143 across source and destination processes. What could be the issue? The logs don't say much.
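    Exit code 143 conventionally encodes termination by a signal: 128 plus the signal number, and signal 15 is SIGTERM, which Kubernetes sends when it evicts or deletes a pod (for example under memory pressure or during a node scale-down). A quick check of that arithmetic in Python:
    ```python
    import signal

    # Exit codes above 128 encode the terminating signal: code = 128 + signum.
    print(signal.Signals(143 - 128))  # Signals.SIGTERM
    ```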
  • Wira Tjo

    11/24/2025, 1:41 AM
    @kapa.ai What are the possible causes if the Workload Launcher is not spinning up any new pods?
  • Ajinkya Atiwadkar

    11/24/2025, 7:21 AM
    @kapa.ai I want to deploy Airbyte using ArgoCD (https://docs.airbyte.com/platform/deploying-airbyte/chart-v2-community). The external secrets will be stored in AWS Secrets Manager.
  • Mateo Colina

    11/24/2025, 8:08 AM
    @kapa.ai what is the S3 storage for? Are the logs different from what is available in STDIN/STDOUT/STDERR?
  • Mohamed Badran

    11/24/2025, 9:35 AM
    @kapa.ai For the Snapchat Public Profiles API, is it enough to create one OAuth app, get its refresh token, and store it in Airbyte to start pulling public profile data?
  • Renu Fulmali

    11/24/2025, 9:37 AM
    Hi @kapa.ai, I want to increase the RAM and CPU for the connector job pods. How can I do that?
  • Bogdan

    11/24/2025, 10:18 AM
    I have set max_check_workers=10 and max_sync_workers=10 in my Airbyte setup, but syncs are still running sequentially instead of in parallel. Has anyone encountered this issue before? What could be causing this behavior despite having these values set? Any suggestions would be greatly appreciated. @kapa.ai
  • Rahul

    11/24/2025, 11:11 AM
    How can I connect to MongoDB as a destination in Fivetran? I have DocumentDB in AWS, which I can access through an EC2 instance using SSH. Which are the mandatory fields to fill in for the MongoDB destination in Airbyte OSS? @kapa.ai
  • Mohamed Badran

    11/24/2025, 11:31 AM
    @kapa.ai For Spotify channels, if I want to create a source for them to get analytics, monetization, and so on, what should I do? Give me the steps.
  • Abhijith C

    11/24/2025, 12:48 PM
    @kapa.ai Is there a known issue in the latest Airbyte 2.0 where the source connector gets OOM-killed, but the destination continues its legitimate processing, and the sync still ends up being marked as successful?
  • Anna Bogo

    11/24/2025, 1:33 PM
    @kapa.ai I have a replication from Postgres using the xmin strategy, with incremental + deduped mode to upsert into Snowflake. If a row is deleted in the Postgres DB, what happens to the same row in my Snowflake table?
  • Sabbir Ahmmed

    11/24/2025, 3:09 PM
    What is the workaround for this DynamoDB connector limitation?
    ```
    software.amazon.awssdk.services.dynamodb.model.DynamoDbException: Invalid ProjectionExpression: Attribute name is a reserved keyword; reserved keyword: language (Service: DynamoDb, Status Code: 400, Request ID: 2L8ABPT3DJ6QG6Q00RRUBU3T3NVV4KQNSO5AEMVJF66Q9ASUAAJG)
    ```
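    Outside the connector, DynamoDB's standard escape hatch for reserved keywords is ExpressionAttributeNames, which aliases the reserved name behind a placeholder. A minimal boto3 sketch, with a hypothetical table and attribute list (whether Airbyte's connector exposes this option is a separate question):
    ```python
    import boto3

    # Hypothetical table; "language" is a DynamoDB reserved keyword, so it
    # cannot appear verbatim in a ProjectionExpression.
    table = boto3.resource("dynamodb").Table("my_table")

    response = table.scan(
        # "#lang" is a placeholder that DynamoDB substitutes with the real
        # attribute name, sidestepping the reserved-keyword check.
        ProjectionExpression="id, #lang",
        ExpressionAttributeNames={"#lang": "language"},
    )
    print(response["Items"])
    ```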
  • Kothapalli Venkata Avinash

    11/24/2025, 3:44 PM
    @kapa.ai, we are using connectors from a private Docker repository, but when we use the latest version of the destination connectors we see this error: Failure in destination: You must upgrade your platform version to use this connector version. Either downgrade your connector or upgrade platform to 0.63.7
  • Théo

    11/24/2025, 4:11 PM
    @kapa.ai I installed version 1.6.2 of Airbyte (a new instance) on my Kubernetes cluster, but I no longer see the "s3" connector among the connectors we can install. It's like it disappeared. On the instances of the same version I installed in the past, the connector is still available to install.
  • Joshua Garza

    11/24/2025, 4:21 PM
    #C01AHCD885S I just set up a new Airbyte cluster and am trying to load data into an existing database that I was previously loading with a different Airbyte cluster. I am unable to get the streams to run. What do I do?
  • MTA

    11/24/2025, 4:49 PM
    @kapa.ai I have set up, on Airbyte Cloud, a custom connector that calls an API via POST to return data from the source. The API call returns a maximum of 200 rows. What is happening is that I keep receiving multiple pages of data, but exactly the same data: the 200 rows are repeated on each page, which is a problem because I am not getting the rest of the data. This is what I have configured in the paginator section (see screenshot). How can I solve this problem?
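    When every page returns identical rows, the usual culprit is that the page token or offset is never injected into the outgoing request, so the API answers with the first page every time. A hedged sketch of what working offset pagination over a POST endpoint looks like (the URL and field names are hypothetical; in the Builder, the equivalent is making sure the paginator injects the token into the body or parameter this API actually reads):
    ```python
    import requests

    URL = "https://api.example.com/records"  # hypothetical endpoint
    PAGE_SIZE = 200

    offset = 0
    all_rows = []
    while True:
        # The offset must travel with the request; if it is omitted, the
        # API returns the same first 200 rows on every call.
        resp = requests.post(URL, json={"limit": PAGE_SIZE, "offset": offset})
        resp.raise_for_status()
        rows = resp.json()["data"]  # hypothetical response shape
        all_rows.extend(rows)
        if len(rows) < PAGE_SIZE:  # last, partially filled page
            break
        offset += PAGE_SIZE
    ```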
  • Javier Molina Sánchez

    11/24/2025, 5:04 PM
    @kapa.ai I've set up Airbyte in my EKS cluster. Slack notifications work when I click Test in the UI, but they don't fire when a stream actually succeeds or fails, even though these events are enabled in the UI.
  • Jared Parco

    11/24/2025, 5:30 PM
    @kapa.ai we are running into issues with our S3 source connector performing incredibly slowly. We are using the self-managed version of Airbyte, version 1.6. Looking at the resources consumed on Kubernetes, it looks like the source connector isn't using the majority of the resources allotted to it. What areas should we look at to improve the performance of this source connector?
  • Nicolas Albertini

    11/24/2025, 9:06 PM
    Hi, how can I connect my Postgres database from Supabase to Airbyte? I'm getting the 08001 error code. @kapa.ai
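    SQLSTATE 08001 means the client could not establish a connection at all, so it is worth testing reachability outside Airbyte first. A minimal sketch with psycopg2; the connection details are placeholders to copy from the Supabase dashboard (Supabase generally requires SSL, and some setups connect through the pooler on a different port):
    ```python
    import psycopg2

    # Placeholder values; a wrong host/port, a blocked network path, or a
    # missing SSL requirement is what surfaces as 08001 in Airbyte.
    conn = psycopg2.connect(
        host="db.<project-ref>.supabase.co",
        port=5432,
        dbname="postgres",
        user="postgres",
        password="<password>",
        sslmode="require",
    )
    with conn.cursor() as cur:
        cur.execute("SELECT 1")
        print(cur.fetchone())  # (1,) confirms basic connectivity
    conn.close()
    ```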
  • Eduardo Ferreira

    11/24/2025, 9:28 PM
    @kapa.ai I've migrated to Airbyte 2.0.1 using Helm charts v1, but I now get the following error: `Error: couldn't find key dataplane-client-id in Secret airbyte/airbyte-auth-secrets`. What should this be set to? Note that I'm using the OSS Airbyte Core version. I was previously on chart 1.5.x and we didn't have that requirement; dataplane is not cited in any of the migration docs.
  • Todd Matthews

    11/24/2025, 11:26 PM
    Airbyte is not adding the jwt-signature-secret.
  • Neeraj N

    11/25/2025, 3:59 AM
    Does the Gmail connector support ingestion?
  • Neeraj N

    11/25/2025, 4:00 AM
    How do I create a Gmail source connector?