# ask-ai

    Kothapalli Venkata Avinash

    11/24/2025, 3:44 PM
    @kapa.ai, we are using connectors from a private Docker repository, but when we use the latest versions of the destination connectors we see the error: Failure in destination: You must upgrade your platform version to use this connector version. Either downgrade your connector or upgrade platform to 0.63.7

    Théo

    11/24/2025, 4:11 PM
    @kapa.ai I installed version 1.6.2 of Airbyte (a new instance) on my Kubernetes cluster, but I no longer see the "s3" connector among the connectors available to install. It's as if it disappeared. On instances of the same version that I installed in the past, the connector is still available to install.

    Joshua Garza

    11/24/2025, 4:21 PM
    #C01AHCD885S I just set up a new Airbyte cluster and am trying to load data into an existing database that I was previously loading with a different Airbyte cluster. I am unable to get the streams to run. What do I do?

    MTA

    11/24/2025, 4:49 PM
    @kapa.ai I have set up a custom connector on Airbyte Cloud that makes a POST call to an API returning data from the source. The API call returns a maximum of 200 rows. What is happening is that I keep receiving multiple pages of data, but it is exactly the same data: the same 200 rows are repeated on each page, which is a problem because I am not getting the rest of the data. This is what I have configured in the paginator section (see screenshot). How can I solve this problem?
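A minimal Python sketch (hypothetical names, not Airbyte's code) of the failure mode described above: if the paginator never injects the page token/offset back into the next request, the API serves the first page forever, so every "page" repeats the same rows.

```python
def fetch_page(api_rows, page_size, token):
    """Simulated API endpoint: 'token' is the row offset the client must echo back."""
    start = token or 0
    page = api_rows[start:start + page_size]
    next_token = start + page_size if start + page_size < len(api_rows) else None
    return page, next_token

def sync(api_rows, page_size, inject_token):
    """Paginate through the API; inject_token=False models a paginator
    that never puts the token into the next request."""
    rows, token = [], None
    for _ in range(10):  # hard stop so the buggy variant terminates
        page, next_token = fetch_page(api_rows, page_size, token)
        rows.extend(page)
        if next_token is None:
            break
        token = next_token if inject_token else None  # bug: token dropped
    return rows

data = list(range(5))
assert sync(data, 2, inject_token=True) == [0, 1, 2, 3, 4]    # all rows arrive once
assert sync(data, 2, inject_token=False)[:4] == [0, 1, 0, 1]  # first page repeats
```

In Builder terms, the thing to verify is that the paginator's token is actually inserted into a request parameter (or header/body) of the next request, not merely extracted from the response.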

    Javier Molina Sánchez

    11/24/2025, 5:04 PM
    @kapa.ai I've set up Airbyte in my EKS cluster. Slack notifications work when I click Test in the UI, but they don't fire when a stream actually succeeds or fails, even though these events are enabled in the UI.

    Jared Parco

    11/24/2025, 5:30 PM
    @kapa.ai We are running into issues with our S3 source connector performing incredibly slowly. We are using the self-managed version of Airbyte, version 1.6. Looking at the resources consumed on Kubernetes, it looks like the source connector isn't using the majority of the resources allotted to it. What areas should we look at to improve the performance of this source connector?

    Nicolas Albertini

    11/24/2025, 9:06 PM
    Hi, how can I connect my Postgres database from Supabase to Airbyte? I'm getting the 08001 error code @kapa.ai

    Eduardo Ferreira

    11/24/2025, 9:28 PM
    @kapa.ai I've migrated to Airbyte 2.0.1 using Helm charts v1, but I now get the following error:
    Error: couldn't find key dataplane-client-id in Secret airbyte/airbyte-auth-secrets
    What should this value be set to? Note that I'm using the OSS Airbyte Core version. I was previously on chart 1.5.x and we didn't have that requirement; dataplane is not mentioned in any of the migration docs.

    Todd Matthews

    11/24/2025, 11:26 PM
    Airbyte is not adding jwt-signature-secret

    Neeraj N

    11/25/2025, 3:59 AM
    Does the Gmail connector support ingestion?

    Neeraj N

    11/25/2025, 4:00 AM
    How do I create a source with the Gmail source connector?

    Akshata Shanbhag

    11/25/2025, 6:41 AM
    @kapa.ai What do these logs mean?
    2025-11-25 01:05:43 platform INFO APPLY Stage: BUILD — (workloadId=b4869d37-a0d5-4f17-a25b-195ec5e895a8_20012_0_sync)
    2025-11-25 01:05:44 platform INFO APPLY Stage: CLAIM — (workloadId=b4869d37-a0d5-4f17-a25b-195ec5e895a8_20012_0_sync)
    2025-11-25 01:05:44 platform INFO Claimed: true for workload b4869d37-a0d5-4f17-a25b-195ec5e895a8_20012_0_sync via API in dataplane AUTO (993efb93-3056-43ac-81bd-0b5df76ded34)
    2025-11-25 01:05:44 platform INFO APPLY Stage: LOAD_SHED — (workloadId=b4869d37-a0d5-4f17-a25b-195ec5e895a8_20012_0_sync)
    2025-11-25 01:05:44 platform INFO APPLY Stage: CHECK_STATUS — (workloadId=b4869d37-a0d5-4f17-a25b-195ec5e895a8_20012_0_sync)
    2025-11-25 01:05:44 platform INFO No pod found running for workload b4869d37-a0d5-4f17-a25b-195ec5e895a8_20012_0_sync
    2025-11-25 01:05:44 platform INFO APPLY Stage: MUTEX — (workloadId=b4869d37-a0d5-4f17-a25b-195ec5e895a8_20012_0_sync)
    2025-11-25 01:05:44 platform INFO Mutex key: b4869d37-a0d5-4f17-a25b-195ec5e895a8 specified for workload: b4869d37-a0d5-4f17-a25b-195ec5e895a8_20012_0_sync. Attempting to delete existing pods...
    2025-11-25 01:05:44 platform INFO Mutex key: b4869d37-a0d5-4f17-a25b-195ec5e895a8 specified for workload: b4869d37-a0d5-4f17-a25b-195ec5e895a8_20012_0_sync found no existing pods. Continuing...
    2025-11-25 01:05:44 platform INFO APPLY Stage: ARCHITECTURE — (workloadId=b4869d37-a0d5-4f17-a25b-195ec5e895a8_20012_0_sync)
    2025-11-25 01:05:44 platform INFO APPLY Stage: LAUNCH — (workloadId=b4869d37-a0d5-4f17-a25b-195ec5e895a8_20012_0_sync)
    2025-11-25 01:05:44 platform INFO [initContainer] image: 226479973325.dkr.ecr.eu-west-1.amazonaws.com/docker-hub/airbyte/workload-init-container:2.0.0 resources: ResourceRequirements(claims=[], limits={memory=4Gi, cpu=1}, requests={memory=1Gi, cpu=500m}, additionalProperties={})
    2025-11-25 01:05:44 platform INFO Launching replication pod: replication-job-20012-attempt-0 (selectors = {}) with containers:
    2025-11-25 01:05:44 platform INFO [source] image: 226479973325.dkr.ecr.eu-west-1.amazonaws.com/docker-hub/airbyte/source-google-ads:4.0.2 resources: ResourceRequirements(claims=[], limits={memory=4Gi, cpu=1}, requests={memory=1Gi, cpu=500m}, additionalProperties={})
    2025-11-25 01:05:44 platform INFO [destination] image: 226479973325.dkr.ecr.eu-west-1.amazonaws.com/docker-hub/airbyte/destination-s3-data-lake:0.3.41 resources: ResourceRequirements(claims=[], limits={memory=4Gi, cpu=1}, requests={memory=1Gi, cpu=500m}, additionalProperties={})
    2025-11-25 01:05:44 platform INFO [orchestrator] image: 226479973325.dkr.ecr.eu-west-1.amazonaws.com/docker-hub/airbyte/container-orchestrator:2.0.0 resources: ResourceRequirements(claims=[], limits={memory=4Gi, cpu=1}, requests={memory=1Gi, cpu=500m}, additionalProperties={})
    2025-11-25 01:07:09 platform INFO Attempting to update workload: b4869d37-a0d5-4f17-a25b-195ec5e895a8_20012_0_sync to LAUNCHED.
    2025-11-25 01:07:09 platform INFO Pipeline completed for workload: b4869d37-a0d5-4f17-a25b-195ec5e895a8_20012_0_sync.
    2025-11-25 01:19:25 platform INFO ----- START POST REPLICATION OPERATIONS -----
    2025-11-25 01:19:26 platform INFO No post-replication operation(s) to perform.
    2025-11-25 01:19:26 platform INFO ----- END POST REPLICATION OPERATIONS -----

    Kürşat Yalın

    11/25/2025, 10:10 AM
    Does Airbyte Core support the Catalog Items API in the Amazon Seller Partner connector? @kapa.ai

    Mehul Sanghvi

    11/25/2025, 10:18 AM
    #C01AHCD885S When using the MongoDB source connector and the S3 destination connector, is the data syncing feature available?

    Kürşat Yalın

    11/25/2025, 10:25 AM
    I couldn't see the Amazon Search Terms Report stream in the Amazon Seller Partner connector in the Airbyte Core sources/settings menu. #C01AHCD885S

    Renu Fulmali

    11/25/2025, 11:25 AM
    @kapa.ai i.a.w.s.a.OutputStorageClient(persist):61 - Message: The authorization header is malformed; the region 'us-east-1' is wrong; expecting 'eu-west-1' (Service: S3, Status Code: 400, Request ID: TW1RSEY6MBVYF8ND, Extended Request ID: bynVIKxfhDJJ8CYQUJcYyw4RNw9zaZUAXXY111q7xg8cVKWnQgSTEMXzVNi5bnx5bqVr65JEO4Q=) (SDK Attempt Count: 1)

    Neeraj N

    11/25/2025, 11:45 AM
    Internal Server Error: java.net.UnknownHostException: airbyte-abctl-airbyte-connector-builder-server-svc.airbyte-abctl

    Neeraj N

    11/25/2025, 11:46 AM
    Issue in the REST API case: Internal Server Error: java.net.UnknownHostException: airbyte-abctl-airbyte-connector-builder-server-svc.airbyte-abctl

    Neeraj N

    11/25/2025, 12:11 PM
    version: 7.0.4
    type: DeclarativeSource
    check:
      type: CheckStream
      stream_names:
        - neeraj api test
    streams:
      - type: DeclarativeStream
        retriever:
          type: SimpleRetriever
          record_selector:
            type: RecordSelector
            extractor:
              type: DpathExtractor
              field_path:
                - results
          requester:
            type: HttpRequester
            http_method: GET
            url: https://randomuser.me/api/
          decoder:
            type: JsonDecoder
        name: api test
      - type: DeclarativeStream
        retriever:
          type: AsyncRetriever
          record_selector:
            type: RecordSelector
            extractor:
              type: DpathExtractor
              field_path: []
          creation_requester:
            type: HttpRequester
            http_method: GET
            url: https://randomuser.me/api/
          polling_requester:
            type: HttpRequester
            http_method: GET
          download_requester:
            type: HttpRequester
            http_method: GET
          status_mapping:
            type: AsyncJobStatusMap
            running: []
            completed: []
            failed: []
            timeout: []
          status_extractor:
            type: DpathExtractor
            field_path: []
          download_target_extractor:
            type: DpathExtractor
            field_path: []
          decoder:
            type: CsvDecoder
            encoding: utf-8
            delimiter: ","
        name: neeraj api test
    spec:
      type: Spec
      connection_specification:
        type: object
        properties: {}
    metadata:
      autoImportSchema:
        neeraj api test: true
        api test: true
    Getting issue Internal Server Error: java.net.UnknownHostException: airbyte-abctl-airbyte-connector-builder-server-svc.airbyte-abctl: Name or service not known

    Adrien

    11/25/2025, 12:28 PM
    @kapa.ai Hi there, I'm running into an issue with a Postgres CDC → Snowflake connection. Since Monday, all syncs complete successfully, but 0 records are replicated across all streams. After checking the connection state, I noticed something strange:
    • The shared CDC state (Debezium) contains a valid LSN and is advancing normally.
    • But all streamStates only contain cursor_field: [] and no per-stream state at all.
    • Logs show for every stream: Found cursor field. Original cursor field: null. New cursor field: _ab_cdc_lsn. Resetting cursor value.
    • Debezium logs also show: Received offset commit request… but ignoring it. LSN flushing is not allowed yet.
    The replication slot on Postgres is healthy and active. It looks like Airbyte is stuck in an inconsistent snapshot/CDC state, and the result is that no changes are produced during syncs. Could you help me understand how to fix this state inconsistency, and whether a full "Reset data + Reset state" is required? Is there any way to recover the connection without triggering a full snapshot? Thanks!
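For illustration only (the field names follow the shape of Airbyte's GLOBAL connection state, but the values here are made up): a Python sketch that flags streams whose state carries nothing beyond an empty cursor_field, which is the symptom described above.

```python
# Hypothetical example shaped like Airbyte's GLOBAL state for a CDC connection:
# one shared Debezium offset plus per-stream states.
state = {
    "type": "GLOBAL",
    "global": {
        "shared_state": {"state": {"lsn": "0/5C70D18"}},  # CDC offset (advancing)
        "stream_states": [
            {"stream_descriptor": {"name": "users"},
             "stream_state": {"cursor_field": []}},
            {"stream_descriptor": {"name": "orders"},
             "stream_state": {"cursor_field": []}},
        ],
    },
}

def empty_stream_states(state):
    """Return names of streams whose state holds only an empty cursor_field."""
    out = []
    for s in state["global"]["stream_states"]:
        ss = s["stream_state"]
        if set(ss) <= {"cursor_field"} and not ss.get("cursor_field"):
            out.append(s["stream_descriptor"]["name"])
    return out

assert empty_stream_states(state) == ["users", "orders"]
```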

    Fabrizio Spini

    11/25/2025, 1:47 PM
    @kapa.ai The Postgres DB ran out of space ("no space left") and was rolled back to the previous valid position. Now my connections can't advance. How can I try to resolve this?

    Paweł Jurkiewicz

    11/25/2025, 2:25 PM
    @kapa.ai I'm trying to ingest Excel files with the SFTP Bulk connector into a GCS bucket. I'm getting
    Destination does not support file transfers, but source requires it
    How can this issue be solved?

    Jeremy Plummer

    11/25/2025, 3:05 PM
    @kapa.ai What does this error mean when testing sources? Airbyte is temporarily unavailable. Please try again. (HTTP 502)

    Fabrizio Spini

    11/25/2025, 3:36 PM
    @kapa.ai I have the following entry in my connection state:
    "[\"communications\",{\"server\":\"communications\"}]": "{\"ts_sec\":1764050457,\"file\":\"db05-slave.087107\",\"pos\":90877811,\"row\":1,\"server_id\":347503800,\"event\":3}"
    What are the row and event values?
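Assuming this is a standard Debezium MySQL binlog offset (the format Airbyte's MySQL CDC source persists), the value is itself a JSON-encoded string. A small Python sketch decoding it, with the usual meanings of the fields noted in comments (per Debezium's MySQL connector documentation, event is the number of events to skip past pos when resuming, and row is the number of rows within the last event to skip):

```python
import json

# The offset value from the connection state above, as a plain string.
raw = ('{"ts_sec":1764050457,"file":"db05-slave.087107","pos":90877811,'
       '"row":1,"server_id":347503800,"event":3}')
offset = json.loads(raw)

assert offset["file"] == "db05-slave.087107"  # binlog file being read
assert offset["pos"] == 90877811              # byte position within that file
# On resume, Debezium seeks to pos, then skips "event" events past it,
# and skips "row" rows inside the last (multi-row) event.
assert offset["event"] == 3
assert offset["row"] == 1
```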

    Karim Meguenni-Tani

    11/25/2025, 4:29 PM
    @kapa.ai I have a problem with Airbyte 0.63.5 and Connector Builder 2.3.0. Explain the problems with this version to me.

    Eduardo Ferreira

    11/25/2025, 4:37 PM
    @kapa.ai After updating to Airbyte 2.0.1, I started getting timeout errors when upgrading my source and destination connector versions. What could be happening?

    Zack Mattor

    11/25/2025, 4:53 PM
    @kapa.ai When running syncs I see warnings like this: WARN com.datadog.profiling.uploader.ProfileUploader - Failed to upload profile to http://localhost:8126/profiling/v1/input java.net.ConnectException: Failed to connect to localhost/127.0.0.1:8126 (Will not log warnings for 5 minutes). I thought Datadog was disabled by default in the Helm v2 charts... is there something else I need to disable?

    Jeremy Plummer

    11/25/2025, 5:54 PM
    @kapa.ai What does this error point to in a Postgres destination sync: invalid input syntax for type double precision: "None"
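A common cause (an assumption here, not confirmed by the log line alone) is a null value that was stringified to the literal "None" upstream, which Postgres cannot cast to double precision. A sketch of a defensive coercion, using the hypothetical helper name to_double:

```python
def to_double(value):
    """Coerce a raw upstream value to a float, mapping null-like strings to
    Python None (which becomes SQL NULL) instead of letting the literal
    string "None" reach Postgres's double precision cast."""
    if value is None or value in ("None", "null", ""):
        return None
    return float(value)

assert to_double("3.14") == 3.14
assert to_double("None") is None  # would otherwise fail the Postgres cast
assert to_double(None) is None
```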

    Joao Pedro Ferreira Canutto

    11/25/2025, 7:02 PM
    @kapa.ai Analyze the following logs

    soma chandra sekhar attaluri

    11/25/2025, 7:03 PM
    @kapa.ai Which is the best way to deploy Airbyte? My company is trying to ingest around 1 TB of data per day from 20 different source databases on a daily basis. If we deploy Airbyte on AWS, we are thinking of using a large EC2 instance. Is it best to do it on EKS, or can we deploy it on EC2? Is it better to install it using Helm charts on k3s or using abctl, and which underlying OS is best, Windows or Linux? Explain clearly why I should take a particular option.