# ask-community-for-troubleshooting

    Michal Krawczyk

    11/21/2025, 10:30 AM
    Since upgrading to Airbyte 2.0, I've observed that some jobs don't start even though Airbyte reports a successful status (0 bytes | no records loaded). It looks like Airbyte starts the discover step but then doesn't start the orchestrator, source, or destination pods. My Airbyte is deployed to Kubernetes on AWS EKS using the Helm chart. Has anyone seen similar issues or know what could cause this? Logs in the 🧵

    Mahmoud Mostafa

    11/21/2025, 2:31 PM
    Hello everyone, I need help here. I have an issue related to parent/child streams. I built a custom connector using the low-code UI builder to fetch data from the Rasayel API, an omni-channel platform for WhatsApp messages. I want to fetch conversations along with the messages in each conversation. To do so I use GraphQL; there is a relation between messages and conversations, and to fetch any message you have to pass the conversation ID as an argument in the message query. My current situation is that the connector is configured correctly, but it's stuck in sync running for the messages stream with no visible errors in the logs. If anyone can help with this it would be much appreciated. I am using Airbyte 2.0.1 deployed with abctl on my local laptop.
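In the low-code CDK, this parent/child pattern is usually modeled with a partition router on the child stream. A minimal manifest sketch, assuming stream names `conversations`/`messages` and field names `id`/`conversation_id` (all hypothetical, inferred from the description above):

```yaml
# Hypothetical fragment of the declarative manifest for the messages stream.
retriever:
  type: SimpleRetriever
  partition_router:
    type: SubstreamPartitionRouter
    parent_stream_configs:
      - type: ParentStreamConfig
        stream: "#/definitions/streams/conversations"
        parent_key: "id"                   # field read from each conversation record
        partition_field: "conversation_id" # available as {{ stream_partition.conversation_id }}
```

With this setup the messages stream issues one request per conversation, so a large parent stream can make a sync look stuck even while it is progressing.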

    Diego Dias

    11/21/2025, 5:12 PM
    Hello everyone, I'm trying to use Airbyte to sync my data, and I've created multiple connections, each with many schemas mapped for synchronization. However, I expected that, if the VM resources were available, Airbyte would run multiple connections in parallel, but instead, all connections are running sequentially. Do I need to change any configuration to enable parallel execution? Airbyte is running on a VM with Debian 12, 8 vCPUs, and 32 GB of memory. Below is a snippet of my values.yaml:
    worker:
      enabled: true
      replicaCount: 2
      maxNotifyWorkers: "20"
      maxCheckWorkers: "20"
      maxSyncWorkers: "20"
      resources:
        requests:
          cpu: "500m"
          memory: "1Gi"
        limits:
          cpu: "1000m"
          memory: "2Gi"
      extraEnv:
        - name: MAX_SYNC_WORKERS
          value: "20"
        - name: MAX_CHECK_WORKERS
          value: "20"
        - name: MAX_DISCOVERY_WORKERS
          value: "10"
        - name: MAX_SPEC_WORKERS
          value: "10"
    
    workloadLauncher:
      enabled: true
      replicaCount: 2
      resources:
        requests:
          cpu: "500m"
          memory: "1Gi"
        limits:
          cpu: "1000m"
          memory: "2Gi"
      extraEnv:
        - name: WORKLOAD_LAUNCHER_PARALLELISM
          value: "25"

    Tigran Zalyan

    11/21/2025, 6:36 PM
    Hey everyone! I was trying to migrate Airbyte from Helm chart version 1.8.2 (v1) to 2.0.1 (v1) and I'm running into a lot of errors. When using an external Postgres database with an explicitly set username via
    global.database.user
    , the airbyte-bootloader fails with:
    Error: couldn't find key DATABASE_USER in Secret airbyte-airbyte-secrets
    The workaround was to move the value into the secret under that key. I'm also seeing another error in airbyte-server when using GCS as the storage provider:
    io.micronaut.context.exceptions.BeanInstantiationException: Bean definition [io.airbyte.server.apis.controllers.SourceDefinitionApiController] could not be loaded:
    Error instantiating bean of type [io.airbyte.commons.storage.GcsStorageClient]: Is a directory
    The last error prevents Airbyte from starting successfully. Any ideas why this might be happening?
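Based on the workaround described above (moving the value into the secret under the key the bootloader expects), the fix can be sketched as a patch to the existing secret. The secret name and key come from the error message; the username value is a placeholder:

```yaml
# Sketch: ensure the DATABASE_USER key exists in the secret the bootloader reads.
apiVersion: v1
kind: Secret
metadata:
  name: airbyte-airbyte-secrets
type: Opaque
stringData:
  DATABASE_USER: airbyte  # placeholder; use your actual external Postgres username
```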

    Dan Cook

    11/21/2025, 6:37 PM
    (FYI, we use Airbyte Cloud) If you've been having problems with your Google Ads syncs over the past few days, especially if you have custom reports built from GAQL, then read on. We originally concluded that tab characters in our custom GAQL were the root cause of a bunch of recent failed syncs, and in fact replacing tabs with spaces did improve a few things. But the actual root cause was Airbyte staff moving our version of the Google Ads source connector around. Over the past 3 days it has had the following values, in sequential order; the two versions in the middle of the sequence, variants of v4.1, were the cause of a bunch of failed syncs. Something in that connector version does not work nicely with custom GAQL reports:
    • v4.0.2
    • v4.1.0-rc.8
    • v4.1.0
    • v4.0.2
    The final change back to v4.0.2 took place this AM, presumably after Airbyte staff heard from enough customers about failing syncs. Once this one change was made, our syncs started working again. Some issues:
    1. Why would Airbyte Cloud customers like us get moved to a release candidate (v4.1.0-rc.8) without any foreknowledge?
    2. The first release candidate dropped in August, and the changelog for v4.1.0-rc.1 specifically mentions custom queries. Several of the subsequent PRs do too. How did very obvious, very fatal problems with custom queries make it all the way to an official release?

    Jonathan Clemons

    11/21/2025, 9:56 PM
    Hey All. I am running into an issue with a connector that I built in the connector builder where syncs are getting stuck. From some testing, I think the issue is being caused by rate limiting from the source api. I am working on dialing in the API budget, but this has been tricky because I cannot see logs from the api calls being made to the source. The connection is not failing, so I am not able to diagnose via an error code. Is it possible to get more granular logs so I can troubleshoot why the connector is getting stuck or does anyone have suggestions for further troubleshooting steps?
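For reference, the builder's API Budget settings correspond to an `api_budget` block in the declarative manifest, which can be tuned directly while dialing in limits. A rough sketch, under the assumption that the current declarative schema exposes `HTTPAPIBudget` and `FixedWindowCallRatePolicy` (verify the exact component names against your CDK version):

```yaml
# Hypothetical manifest fragment: cap outbound calls at 60 per minute.
api_budget:
  type: HTTPAPIBudget
  policies:
    - type: FixedWindowCallRatePolicy
      period: PT1M      # ISO-8601 duration of the rate window
      call_limit: 60    # max requests per window
      matchers: []      # empty list applies the policy to all requests
```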

    Gaurav Jain

    11/22/2025, 9:26 AM
    @kapa.ai What do you store in this folder root@airbyte-prod-new:~/.airbyte/abctl/data/airbyte-volume-db/pgdata/base/16384#

    Bogdan

    11/24/2025, 9:53 AM
    Hi everyone, I'm using Airbyte for the first time via abctl, and I've encountered an issue with sync parallelism. It seems that each sync is waiting for the previous one to complete before starting, which is slowing down the process. Is there a way to configure Airbyte to allow multiple syncs to run in parallel?

    Samy-Alexandre LICOUR

    11/24/2025, 11:50 AM
    Hello, I am trying to install Airbyte with abctl, with logs stored in a GCP bucket.
    sudo abctl local install --secret secrets.yaml --values values.yaml
    Here are my values and secrets files: values.yaml
    global:
      storage:
        type: "GCS"
        secretName: airbyte-config-secrets
        bucket: # GCS bucket names that you've created. We recommend storing the following all in one bucket.
          log: dev-airbyte-abctl-logs-
          state: dev-airbyte-abctl-logs
          workloadOutput: dev-airbyte-abctl-logs
        gcs:
          projectId: source-plm-dev
          credentialsJsonPath: /secrets/gcs-log-creds/gcp.json
      jobs:
        resources:
          # -- Job resource requests
          requests:
            memory: 8Gi
            cpu: 2
          # -- Job resource limits
          limits:
            cpu: 4
            memory: 16Gi
      auth:
        enabled: false
    secrets.yaml:
    apiVersion: v1
    kind: Secret
    metadata:
      name: airbyte-config-secrets
    type: Opaque
    stringData:
      gcp.json: "..."
    I am getting this error:
    Caused by: java.io.IOException: Is a directory
    There seems to be a problem with credentialsJsonPath: /secrets/gcs-log-creds/gcp.json even though I followed the documentation. Thanks for your help!

    Akshata Shanbhag

    11/25/2025, 7:36 AM
    Hello, I am out of ideas now, so could you please help me out here. I have a sync job failing with the error "Workload failed, source: workload-monitor-heartbeat". The fact is that the replication pod is not created at all, even though the log shows that it is being launched. I have the workload launcher, worker, server, and temporal all running in non-disrupt mode. No logs indicate this silent failure. The job succeeds on re-attempting. Any ideas on what might be preventing the replication pod from starting? Between 1:07 and 1:19, it looks like it was attempting to create a pod but did not, and failed silently. How could I mitigate this issue?

    kapa.ai

    11/25/2025, 2:55 PM
    You’re connected to kapa.ai, an assistant focused specifically on Airbyte. Your last message only says “Hi guys,” without a question or problem description. If you have an Airbyte-related question (connectors, deployments, errors, features, etc.), please share the details and I’ll help based on the available Airbyte docs, GitHub issues, and forum posts.

    Mihály Dombi

    11/25/2025, 2:55 PM
    Hello Team! I'm trying to add a new stream to a connector where the Retrieval type is Asynchronous Job. There is a creation endpoint that returns a Location (URL with
    report_task_id
    ) in the response header that can be polled to check its status. The problem is that this Location is in the response header, but the response doesn't have a payload. Because of this, the _get_creation_response_interpolation_context is always failing. Also, in the connector builder the HTTP Response Format section doesn't have an option to indicate that the response has no body. Is there a workaround for this, or should the CDK be extended with this option? The report creation endpoint in question: https://developer.ebay.com/api-docs/sell/marketing/resources/ad_report_task/methods/createReportTask#request.dateFrom

    Lucas Segers

    11/25/2025, 2:56 PM
    Hey guys, anyone with a "Plain-text" HTTP Requester on a custom Builder connector facing issues after 2.0? I find it hard to believe that I'm the only one with broken PlainText streams after upgrading, so I must be missing something? 😛

    Dyllan Pascoe

    11/25/2025, 9:04 PM
    Greetings. Is there a way to delete a builder connector from my organization once it's been published?

    Rob Kwark

    11/26/2025, 1:39 AM
    I am trying to do a create-connection API call for MySQL -> S3 on OSS, and it works when I use the same flow for my other connections (Snowflake, Postgres). However, I am getting:
    { "status": 500, "type": "https://reference.airbyte.com/reference/errors", "title": "unexpected-problem", "detail": "An unexpected problem has occurred. If this is an error that needs to be addressed, please submit a pull request or github issue.", "documentationUrl": null, "data": { "message": "Something went wrong in the connector. logs:Workload failed, source: airbyte_platform" } }
    when I make the basic POST request:
    { "name": "badump1", "sourceId": "sourceUUID", "destinationId": "destUUID", "schedule": { "scheduleType": "manual" } }
    This is only happening with the MySQL connection.

    Kevin O'Keefe

    11/26/2025, 4:00 AM
    Hello Airbyte community. I am new here and have set up our first Airbyte community edition deployment. I want to reach out to the community and ask if you have any helpful suggestions for optimizing the environment that you may have discovered during your use of the product. I am on version 1.9.1 of the Helm deployment to K8s using ArgoCD, and things are running well at the moment. If anyone is willing to share any helpful items related to your experience and setup woes, please let me know. Thank you for your attention and participation.

    JadperNL

    11/26/2025, 9:03 AM
    Hi, are there any methods to install Airbyte without Kubernetes? None of our systems can run Kubernetes; we currently use Docker Compose for all our other deployments.

    Isaac Steele

    11/26/2025, 3:45 PM
    I am getting different responses when I use
    requests.get()
    and the python
    airbyte-api
    on sources for my source configuration information. For example, when I run a
    GET
    on
    ...v1/sources
    , my sources have connection information like
    configuration":{"auth_type":"Client","client_id":"myclient123456789","is_sandbox":true,"client_secret":"**********","refresh_token":"**********","streams_criteria":[],"stream_slice_step":"P30D","force_use_bulk_api":false}
    . But when I use the python API
    get_source()
    or
    list_sources()
    the configuration information returned is just:
    configuration=SourceAirtable(credentials=None, SOURCE_TYPE=<SourceAirtableAirtable.AIRTABLE: 'airtable'>)
    . This example is from my Salesforce connector, but the same is true for my other connectors as well; all of them just return
    Airtable
    for the configuration. Is this an airbyte-api bug? How can I get/view my masked source configuration data from the Python API?
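As a stopgap, the full (masked) configuration shown by the REST endpoint can be fetched and parsed directly, bypassing the typed Python client. A sketch using only the standard library; the base URL, bearer token, and the `data`/`sourceId`/`configuration` response shape are assumptions based on the `GET .../v1/sources` output quoted above:

```python
import json
from urllib.request import Request, urlopen

API_URL = "http://localhost:8000/api/public/v1"  # placeholder base URL

def extract_configs(payload):
    """Pull (sourceId, configuration) pairs out of a /v1/sources response body."""
    return [(s["sourceId"], s.get("configuration", {})) for s in payload.get("data", [])]

def fetch_sources(token):
    """Fetch all sources from the raw REST endpoint instead of the typed client."""
    req = Request(f"{API_URL}/sources", headers={"Authorization": f"Bearer {token}"})
    with urlopen(req) as resp:
        return extract_configs(json.load(resp))
```

Because `extract_configs` is a pure function, it can be checked against a captured response body without a live server, which makes it easy to confirm the raw endpoint really returns the masked configuration the typed client drops.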

    Nikita

    11/26/2025, 4:36 PM
    Hello! I'm trying to establish an export from GA4 to ClickHouse. This is a screenshot from the traffic_sources table. Is there a way to also include the time in the endDate and startDate columns? Thanks!

    Francis Carr

    11/26/2025, 5:00 PM
    Hey all. I am experiencing timeout issues when transferring data from Snowflake. I am currently on the official Snowflake Source
    airbyte/source-snowflake
    on version
    1.0.8
    . The source table within Snowflake is very large and it seems I am hitting a credentials timeout of around 6 hours. What seems to be happening:
    1. Airbyte makes a query for all data from the Snowflake table, and this is staged internally within an S3 bucket.
    2. Airbyte then starts transferring this data out of the bucket piece by piece, sending it from the source to the destination, the destination being BigQuery.
    3. After 6 hours or so, sometimes longer, we run into a
    403 Forbidden
    on reading the data, and the job within Airbyte runs another attempt.
    a. After 5 attempts of around 6 hours each, it gives up and reports the job as a failure.
    b. When the job starts again we seem to start from the very beginning; even though the job has written a lot of records to the BigQuery destination, it doesn't seem to save where it has got to in the source.
    The sync mode is
    Incremental | Append
    with a Date as the cursor. Has anyone had any similar experiences with this type of Snowflake timeout? It sounds like it could be something that can be configured on the Snowflake side, but we aren't sure what configuration we could use to control this. There is documentation around the 6-hour timeout in "Using Persisted Query Results" from Snowflake, but it doesn't really say much beyond
    A new token can be retrieved to access results while they are still in cache.
    We are also not sure what we can do on our side, as this seems to be internal to the JDBC driver that reads from Snowflake. Maybe there is a JDBC query param we could add? We know we can break up this data and there are lots of workarounds like that, but that still avoids the core issue, which can arise again. Any recommendations would be greatly appreciated! 🙏

    Alex Tasioulis

    11/26/2025, 5:27 PM
    hi, I am exporting from MongoDB into Snowflake. Everything works fine for smaller tables, but some tables in MongoDB are a bit too big, and I am worried they will take around 2 days to copy over, by which point the MongoDB oplog will have expired and incremental syncing would fail. Is that what will happen if the initial sync takes longer than the oplog window, or does Airbyte handle this gracefully? If not, what can I do to deal with it?

    Baudilio García Hernández

    11/26/2025, 5:54 PM
    185 - Connector entitlement not available. actorDefinitionId=c0b24000-d34d-b33f-fea7-6b96dc0e5f0d organizationId=OrganizationId(value=00000000-0000-0000-0000-000000000000)

    Chanakya Pendem

    11/27/2025, 8:46 AM
    Hi, I am working on creating an API with which I can create a source, but I need to build the connector first. Is there a way to make an API call for building? The publish API gives a sourceDefinitionId as output, which we can then use to create a source.

    Linas Leščinskas

    11/27/2025, 12:51 PM
    hello! I was running my Airbyte instance hosted on a GCP VM successfully for a year, until the certs expired and the syncs stopped running. To renew the certs, I reinstalled Airbyte with
    sudo abctl local uninstall
    and
    sudo abctl local install
    . In the process Airbyte was upgraded from 0.22x to 0.3x. However, I still cannot run any syncs. When I run a sync, it just hangs indefinitely, generating zero logs. When I try to replicate the connection, I encounter
    Airbyte is temporarily unavailable. Please try again. (HTTP 502)
    in the Select streams stage. I tried restarting the workload launchers and the cluster, but this yields no tangible results. P.S. I have made a machine image so I can start anew on a fresh VM (with expired certs). Any guidance appreciated!

    Fernand Ramat

    11/27/2025, 7:29 PM
    Hello, I am trying to run a historical data load of my Mongo cluster on one collection which is quite huge (more than 50 million rows), but at some point the sync fails. I am trying to refresh from there, but got the following message on attempts 3, 4, 5 ... 7:
    io.airbyte.cdk.TransientErrorException: Input was fully read, but some streams did not receive a terminal stream status message. If the destination did not encounter other errors, this likely indicates an error in the source or platform. Streams without a status message: [import_mongodb_hh_production_product2_v2 ......]
    I am not sure how I could fix this pipeline.

    Alex Johnson

    11/28/2025, 3:32 AM
    Hi Team, I have configured the Xero connector using Airbyte Open Source on Ubuntu. The source configures correctly and appears to authenticate using OAuth. I have successfully set up the connection, and the streams sync but always return 0 records. Is there a fix for this?

    Alex Johnson

    11/28/2025, 5:28 AM
    Zoho Inventory Connector - No Organisation ID? Hey guys, does anyone know why the Zoho Inventory connector does not use the organisation ID? I use multiple organisations, so this is a required variable in all API calls. Is there a plan to include this in the future? Thanks! Alex

    Prabhu Agarwal

    11/28/2025, 8:53 AM
    Hello, I am facing an issue while using a custom decoder component. I have set up the custom component in the builder UI and am using it for one of the streams. When I run that stream, it works fine in connector builder UI 2.0, but the rest of the streams, which do not use the custom component, get an error when run in the builder UI: Could not load module
    source_declarative_manifest.components
    . I have checked the other streams for
    source_declarative_manifest
    traces, but that's not the case. Not sure why this error is coming up. How can I fix this issue? Any leads would be highly appreciated. Thanks

    Leon Kozlowski

    11/29/2025, 4:09 PM
    I'm using RDS as the external database for my self-hosted Airbyte deployment. Are there any minimum requirements for the external database? I'm running a t3.large and my CPU is pinned at 100%; Temporal is holding 150 idle connections. Does anyone know the minimum RDS size I should use?

    Sam Riggleman

    11/29/2025, 5:37 PM
    I get this far in setting up a connection for the first time, but then it always hangs. All the tests pass in the configuration screen.