# ask-community-for-troubleshooting

    Johan Holmström

    11/18/2025, 8:25 AM
    Hello everybody. I am new here and new to Airbyte. I work as an IT technician at a hospital in Finland, and I have two questions regarding Airbyte.
    1. I have installed Airbyte on an Ubuntu server using abctl. Is it possible to have multiple users log in to Airbyte? It seems that I can only have one user account, but we would need more.
    2. I have set up a reverse proxy using NGINX in order to get HTTPS to work. When I try to log in to Airbyte it says: "Sorry, something went wrong. Failed to get user after login. Check the network tab for more details. Status: 401 Unauthorized". Can anyone help?
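    For reference, a minimal sketch of the NGINX proxy block this kind of setup usually needs; missing forwarded headers and WebSocket upgrades are a common culprit for post-login 401s behind a proxy. The server name and upstream port are placeholders, not taken from the message:
    ```nginx
    # Hypothetical reverse-proxy config for an abctl install listening on localhost:8000.
    server {
        listen 443 ssl;
        server_name airbyte.example.org;

        location / {
            proxy_pass http://127.0.0.1:8000;
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_set_header X-Forwarded-Proto $scheme;
            # WebSocket support, which the Airbyte UI uses for live updates.
            proxy_http_version 1.1;
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection "upgrade";
        }
    }
    ```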

    Shakar Bakr

    11/18/2025, 10:27 AM
    Hello everyone. After upgrading the chart to the latest version, I got the following error:
    ```
    docker.io/airbyte/webapp:2.0.1: not found
    ```
    I checked the repository and noticed that the `webapp:2.0.1` image tag is missing from the releases. The latest available release is `1.7.8`.
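    In case it helps anyone hitting the same pull error: a hedged sketch of pinning the webapp image back to a published tag in Helm values, assuming the chart exposes a per-component image override (the key layout here is illustrative, not confirmed for this chart version):
    ```yaml
    # Hypothetical values.yaml override pinning the webapp image tag.
    webapp:
      image:
        repository: airbyte/webapp
        tag: "1.7.8"  # last tag visible on Docker Hub per the message above
    ```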

    Pragyash Barman

    11/18/2025, 10:53 AM
    Hi everyone, I am facing an issue where MySQL CDC syncs stall at `global-round-1-acquire-resources` (Airbyte Helm 2.0.19). The job pods create the Unix domain sockets, list tables, then stop with no further output. Logs:
    ```
    2025-11-18 09:03:59,436 [pool-3-thread-1] INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog$io_airbyte_airbyte_commons_worker):248 - INFO main i.a.c.d.JdbcMetadataQuerier$memoizedColumnMetadata$2(invoke):126 Querying column names for catalog discovery.
    2025-11-18 09:03:59,621 [pool-3-thread-1] INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog$io_airbyte_airbyte_commons_worker):248 - INFO main i.a.c.d.JdbcMetadataQuerier$memoizedColumnMetadata$2(invoke):171 Discovered 2488 column(s) and pseudo-column(s).
    2025-11-18 09:03:59,694 [pool-3-thread-1] INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog$io_airbyte_airbyte_commons_worker):248 - INFO main i.a.c.d.JdbcMetadataQuerier(close):382 Closing JDBC connection.
    2025-11-18 09:03:59,705 [pool-3-thread-1] INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog$io_airbyte_airbyte_commons_worker):248 - INFO DefaultDispatcher-worker-55#read i.a.c.r.RootReader(read):91 Read configured with data channel medium: SOCKET. data channel format: PROTOBUF
    2025-11-18 09:03:59,705 [pool-3-thread-1] INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog$io_airbyte_airbyte_commons_worker):248 - INFO DefaultDispatcher-worker-55#read i.a.c.r.RootReader(read):178 Reading feeds of type class io.airbyte.cdk.read.Global.
    2025-11-18 09:03:59,711 [pool-3-thread-1] INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog$io_airbyte_airbyte_commons_worker):248 - INFO DefaultDispatcher-worker-66#global i.a.c.r.FeedReader(createPartitions):107 Attempting bootstrap using class io.airbyte.integrations.source.mysql.MySqlJdbcConcurrentPartitionsCreatorFactory.
    2025-11-18 09:03:59,712 [pool-3-thread-1] INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog$io_airbyte_airbyte_commons_worker):248 - INFO DefaultDispatcher-worker-66#global i.a.c.r.FeedReader(createPartitions):107 Attempting bootstrap using class io.airbyte.cdk.read.cdc.CdcPartitionsCreatorFactory.
    2025-11-18 09:03:59,719 [pool-3-thread-1] INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog$io_airbyte_airbyte_commons_worker):248 - INFO DefaultDispatcher-worker-55#read i.a.c.r.ReadOperation$execute$1$1$1(invokeSuspend):80 coroutine state:
    read
     └─global
        └─global-round-1-acquire-resources
    ```
    Setup:
    • Deployment: self-hosted Airbyte via Helm (airbyte chart v2.0.19)
    • Source: MySQL connector v3.51.5 (CDC mode) reading from RDS MySQL 8.0.40
    • Destination: BigQuery connector v3.0.15
    • MySQL user grants: SELECT, RELOAD, SHOW DATABASES, SHOW VIEW, REPLICATION SLAVE, REPLICATION CLIENT
    Any guidance on how to debug or resolve this would be appreciated. Thanks!
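    Since MySQL CDC depends on binlog configuration, one quick sanity check (a suggestion, not something from the thread) is to confirm the server-side settings CDC expects:
    ```sql
    -- Settings the Debezium-based MySQL CDC source expects.
    SHOW VARIABLES LIKE 'log_bin';            -- expected: ON
    SHOW VARIABLES LIKE 'binlog_format';      -- expected: ROW
    SHOW VARIABLES LIKE 'binlog_row_image';   -- expected: FULL
    SHOW VARIABLES LIKE 'binlog_expire_logs_seconds';  -- retention must cover gaps between syncs
    ```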

    Eloy Eligon

    11/18/2025, 2:58 PM
    Hi! It seems like the HubSpot connector is failing to bring in the `line_items` field for the `deals` stream. This probably has to do with yesterday's release, as before yesterday the connection was working just fine. I already tried running a refresh of the data and the problem persists.

    Danielle Murdock

    11/18/2025, 8:05 PM
    Has anyone had an issue with junction objects in Salesforce not syncing fully? I have `opportunitycontactrole` set up and I'm only getting ~2,300 records in Snowflake, but there are over 8,000 in Salesforce itself. I've tried doing a full overwrite but I'm still missing most of the data. Running the most recent version of the connector on Cloud.

    Santoshi Kalaskar

    11/19/2025, 7:14 AM
    Hi #C021JANJ6TY team, has anyone worked with Full Refresh data sync for unstructured documents? For example, my source is Google Drive and the destination is Azure Blob Storage. When I delete files in the source, the Full Refresh sync does not delete them in the destination. Shouldn't Full Refresh remove deleted files from the destination as well?

    Stefano Messina

    11/19/2025, 8:40 AM
    Hello, we're experiencing some problems with the latest ClickHouse connector (v2). All the connections are throwing the error `Sync completed, but unflushed states were detected.` during syncs. The stack trace in the logs doesn't really give more information; I can post it here if necessary. At the same time, the Mapper configuration is also not working as expected compared to v1. Does anyone have any idea what's going on? Created an issue on GitHub: https://github.com/airbytehq/airbyte/issues/69746

    Giridhar Gopal Vemula

    11/19/2025, 3:39 PM
    Hello Team, one quick question: for storing and using secrets, is it possible to use Akeyless? Does Airbyte support custom integrations?

    Tanuja

    11/19/2025, 4:16 PM
    #C021JANJ6TY Is anyone facing issues with the Google Ads Airbyte connector? Our pipelines have not been coming up for the past few hours.

    Dan Schlosser

    11/19/2025, 10:02 PM
    Hey all, anyone running into issues with the HubSpot connector? We're seeing an issue where, as of yesterday, the `contacts` field is all of a sudden `null` for all of the deals that we sync into BigQuery from Airbyte. No config changes (the field is definitely selected in the Airbyte mapping UI); it's just coming in as null.

    Mochamad Eka Pramudita

    11/20/2025, 4:34 AM
    Hello all, I have a data source located on-premises in Cloudera, and it can be accessed through HDFS, Hive, or Impala. Has anyone here had experience using any of these as a source? I asked an AI assistant, and it seems that Airbyte does not provide a native source connector for HDFS, Hive, or Impala. Any suggestions or workarounds would be appreciated. Thank you.

    Yuvaraj Prem Kumar

    11/20/2025, 9:49 AM
    Hi all, I am experiencing this behaviour with the v2 Helm charts, deployed on AWS EKS. When a replication job completes or when I trigger a sync, the Airbyte UI does not update immediately; instead I have to refresh the page before I see the new status.

    Barun Pattanaik

    11/20/2025, 10:12 AM
    Hi Team, I am using Airbyte to connect from Mongo to BigQuery, but the data I am getting is in a single column as JSON. I need the data split into different columns, as in my original source table. What is the solution for this basic normalization? My connection settings don't show a normalization option; is that because I am using the free trial version?
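    For anyone wondering what "a single column as JSON" refers to: Airbyte's raw tables keep each record in a JSON column, which can be flattened with plain BigQuery SQL. A minimal sketch, where the table name and field paths are illustrative placeholders:
    ```sql
    -- Flatten Airbyte's raw JSON column into typed columns.
    SELECT
      JSON_VALUE(_airbyte_data, '$.name')                AS name,
      CAST(JSON_VALUE(_airbyte_data, '$.age') AS INT64)  AS age,
      JSON_VALUE(_airbyte_data, '$.email')               AS email
    FROM `my_project.airbyte_internal.my_dataset_raw__stream_users`;
    ```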

    Kevin O'Keefe

    11/20/2025, 6:42 PM
    Hello. I am new to the community and having the "best" time trying to figure out deployment. I am currently deploying Airbyte OSS to GCP through Argo CD. I am able to get everything running as long as I do not try to enable auth. I am using External Secrets to pull secrets from GCP into K8s. The documentation is very unclear as to what is needed here. I am adding what I think is expected to the GCP secret:
    ```json
    {
      "dataplane-client-id": "Redacted",
      "dataplane-client-secret": "Redacted==",
      "instance-admin-client-id": "Redacted",
      "instance-admin-client-secret": "Redacted=",
      "jwt-signature-secret": "Redacted=",
      "instance-admin-email": "Redacted=",
      "instance-admin-password": "Redacted"
    }
    ```
    This is getting properly imported into the server container. It seems like if I create any secret with the email and password, it breaks all of the other generated secrets. I have tried splitting these two into another secret and pointing the auth config to that, but then it does not autogenerate the airbyte-auth-secrets secret like it does when I have auth disabled. Has anyone else had to deal with this? Sorry if this has been answered before; I searched and there was not much to go on. Here is the values file I am using:
    ```yaml
    global:
      serviceAccountName: &service-account-name airbyte-sa
      edition: community
      airbyteUrl: "airbyte.k8s.haus"
    
      annotations:
        argocd.argoproj.io/sync-wave: "2"
    
      database:
        type: external
        secretName: "airbyte-config-secrets"
        host: "10.23.1.3"
        port: 5432
        database: "postgres"
        userSecretKey: "database-user"
        passwordSecretKey: "database-password"
    
      auth:
        enabled: true
        secretName: "airbyte-auth2-secrets"
        instanceAdmin:
          firstName: "security"
          lastName: "admin"
          emailSecretKey: "instance-admin-email"
          passwordSecretKey: "instance-admin-password"
    
    minio:
      enabled: false
    
    postgresql:
      enabled: false
    
    serviceAccount:
      create: true
      name: *service-account-name
      annotations:
        iam.gke.io/gcp-service-account: "Redacted"
    ```
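    For reference, a sketch of the split-out auth secret described above as a Kubernetes Secret (the name `airbyte-auth2-secrets` and the key names come from the message; the values are placeholders). Whether the chart still auto-generates `airbyte-auth-secrets` alongside it is exactly the open question:
    ```yaml
    apiVersion: v1
    kind: Secret
    metadata:
      name: airbyte-auth2-secrets
    type: Opaque
    stringData:
      instance-admin-email: "admin@example.com"  # placeholder
      instance-admin-password: "change-me"       # placeholder
    ```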

    Aviran Moshe

    11/20/2025, 10:51 PM
    Hey everyone, is anyone else seeing problems with the Google Ads connector after the auto-update earlier today? Our sync, which normally finishes in ~6 minutes, is now taking hours, and the connector is also producing duplicate data in the raw tables. We didn't change anything on our side; the issues started right after the update. If anyone has more information or knows whether this is a known issue, I'd really appreciate it. Thanks!

    Michal Krawczyk

    11/21/2025, 10:30 AM
    Since upgrading to Airbyte 2.0, I observe that some jobs don't start even though Airbyte reports a successful status (0 bytes | no records loaded). It looks like Airbyte starts the discover step but then doesn't start the orchestrator, source, and destination pods. My Airbyte is deployed to Kubernetes on AWS EKS using the Helm chart. Has anyone seen similar issues or know what could cause it? Logs in the 🧵

    Mahmoud Mostafa

    11/21/2025, 2:31 PM
    Hello everyone, I need help here. I have an issue related to parent/child streams. I built a custom connector using the low-code UI Builder to fetch data from the Rasayel API, an omnichannel platform for WhatsApp messages. I want to fetch conversations along with the messages in each conversation. To do so I use GraphQL; there is a relation between messages and conversations, and to fetch any message you have to pass the conversation ID as an argument in the message query. My current situation is that I have the connector configured correctly, but it's stuck in a running sync for the messages stream with no visible errors in the logs. If anyone can help with this it will be much appreciated. I am using Airbyte 2.0.1 deployed with abctl on my local laptop. (The usual declarative setup for this pattern is sketched below.)
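    For context, the declarative (low-code) way to feed a parent record's ID into a child stream is a `SubstreamPartitionRouter`. A minimal sketch, with stream names taken from the message and the parent key assumed to be `id`:
    ```yaml
    # Child stream "messages" partitioned by the parent "conversations" stream.
    retriever:
      partition_router:
        type: SubstreamPartitionRouter
        parent_stream_configs:
          - type: ParentStreamConfig
            stream: "#/definitions/streams/conversations"
            parent_key: "id"                    # assumed ID field on conversations
            partition_field: "conversation_id"  # available as {{ stream_partition.conversation_id }}
    ```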

    Diego Dias

    11/21/2025, 5:12 PM
    Hello everyone, I'm trying to use Airbyte to sync my data, and I've created multiple connections, each with many schemas mapped for synchronization. I expected that, as long as VM resources were available, Airbyte would run multiple connections in parallel, but instead all connections run sequentially. Do I need to change any configuration to enable parallel execution? Airbyte is running on a VM with Debian 12, 8 vCPUs, and 32 GB of memory. Below is a snippet of my values.yaml:
    ```yaml
    worker:
      enabled: true
      replicaCount: 2
      maxNotifyWorkers: "20"
      maxCheckWorkers: "20"
      maxSyncWorkers: "20"
      resources:
        requests:
          cpu: "500m"
          memory: "1Gi"
        limits:
          cpu: "1000m"
          memory: "2Gi"
      extraEnv:
        - name: MAX_SYNC_WORKERS
          value: "20"
        - name: MAX_CHECK_WORKERS
          value: "20"
        - name: MAX_DISCOVERY_WORKERS
          value: "10"
        - name: MAX_SPEC_WORKERS
          value: "10"
    
    workloadLauncher:
      enabled: true
      replicaCount: 2
      resources:
        requests:
          cpu: "500m"
          memory: "1Gi"
        limits:
          cpu: "1000m"
          memory: "2Gi"
      extraEnv:
        - name: WORKLOAD_LAUNCHER_PARALLELISM
          value: "25"

    Tigran Zalyan

    11/21/2025, 6:36 PM
    Hey everyone! I was trying to migrate Airbyte from Helm chart version 1.8.2 (v1) to 2.0.1 (v1) and I'm running into a lot of errors. When using an external Postgres database with an explicitly set username via `global.database.user`, the airbyte-bootloader fails with:
    ```
    Error: couldn't find key DATABASE_USER in Secret airbyte-airbyte-secrets
    ```
    The workaround was to move the value into the secret under that key (sketched below). I'm also seeing another error in airbyte-server when using GCS as the storage provider:
    ```
    io.micronaut.context.exceptions.BeanInstantiationException: Bean definition [io.airbyte.server.apis.controllers.SourceDefinitionApiController] could not be loaded:
    Error instantiating bean of type [io.airbyte.commons.storage.GcsStorageClient]: Is a directory
    ```
    The last error doesn't let Airbyte start successfully. Any ideas why this might be happening?
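    A sketch of the DATABASE_USER workaround mentioned above, assuming the chart's default secret name from the error message (the username value is a placeholder):
    ```yaml
    apiVersion: v1
    kind: Secret
    metadata:
      name: airbyte-airbyte-secrets
    type: Opaque
    stringData:
      DATABASE_USER: "airbyte"  # the value previously set via global.database.user
    ```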

    Dan Cook

    11/21/2025, 6:37 PM
    (FYI, we use Airbyte Cloud.) If you've been having problems with your Google Ads syncs over the past few days, especially if you have custom reports built from GAQL, read on. We originally concluded that tab characters in our custom GAQL were the root cause of a bunch of recent failed syncs, and in fact replacing tabs with spaces did improve a few things. But the actual root cause was Airbyte staff moving our version of the Google Ads source connector around. Over the past 3 days it has had the following values, in sequential order:
    • v4.0.2
    • v4.1.0-rc.8
    • v4.1.0
    • v4.0.2
    The two versions in the middle of the sequence, variants of v4.1, caused a bunch of failed syncs; something in that connector version does not work nicely with custom GAQL reports. The final change back to v4.0.2 took place this AM, presumably after Airbyte staff heard from enough customers about failing syncs. Once this one change was made, our syncs started working again. Some issues:
    1. Why would Airbyte Cloud customers like us get moved to a release candidate (v4.1.0-rc.8) without any foreknowledge?
    2. The first release candidate dropped in August, and the changelog for v4.1.0-rc.1 specifically mentions custom queries. Several of the subsequent PRs do too. How did very obvious, very fatal problems with custom queries make it all the way to an official release?

    Jonathan Clemons

    11/21/2025, 9:56 PM
    Hey all. I am running into an issue with a connector that I built in the Connector Builder where syncs are getting stuck. From some testing, I think the issue is caused by rate limiting from the source API. I am working on dialing in the API budget, but this has been tricky because I cannot see logs of the API calls being made to the source. The connection is not failing, so I am not able to diagnose via an error code. Is it possible to get more granular logs so I can troubleshoot why the connector is getting stuck, or does anyone have suggestions for further troubleshooting steps?

    Gaurav Jain

    11/22/2025, 9:26 AM
    @kapa.ai What do you store in this folder? `~/.airbyte/abctl/data/airbyte-volume-db/pgdata/base/16384`
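    For context: in PostgreSQL, `pgdata/base/<oid>` is a per-database data directory, where `<oid>` is the database's object ID. To see which database `16384` is, one can run (inside the bundled Postgres):
    ```sql
    -- Map the directory name (a database OID) to a database name.
    SELECT oid, datname FROM pg_database WHERE oid = 16384;
    ```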

    Bogdan

    11/24/2025, 9:53 AM
    Hi everyone, I'm using Airbyte for the first time via abctl, and I've encountered an issue with sync parallelism. It seems that each sync is waiting for the previous one to complete before starting, which is slowing down the process. Is there a way to configure Airbyte to allow multiple syncs to run in parallel?

    Samy-Alexandre LICOUR

    11/24/2025, 11:50 AM
    Hello, I am trying to install Airbyte with abctl, with logs stored in a GCP bucket.
    ```
    sudo abctl local install --secret secrets.yaml --values values.yaml
    ```
    Here are my values and secrets files. values.yaml:
    ```yaml
    global:
      storage:
        type: "GCS"
        secretName: airbyte-config-secrets
        bucket: # GCS bucket names that you've created. We recommend storing the following all in one bucket.
          log: dev-airbyte-abctl-logs-
          state: dev-airbyte-abctl-logs
          workloadOutput: dev-airbyte-abctl-logs
        gcs:
          projectId: source-plm-dev
          credentialsJsonPath: /secrets/gcs-log-creds/gcp.json
      jobs:
        resources:
          requests:
            memory: 8Gi
            cpu: 2
          limits:
            cpu: 4
            memory: 16Gi
      auth:
        enabled: false
    ```
    secrets.yaml:
    ```yaml
    apiVersion: v1
    kind: Secret
    metadata:
      name: airbyte-config-secrets
    type: Opaque
    stringData:
      gcp.json: "..."
    ```
    I am getting this error:
    ```
    Caused by: java.io.IOException: Is a directory
    ```
    There seems to be a problem with `credentialsJsonPath: /secrets/gcs-log-creds/gcp.json`, even though I followed the documentation. Thanks for your help!

    Akshata Shanbhag

    11/25/2025, 7:36 AM
    Hello, I am out of ideas now, so could you please help me out here? I have a sync job failing with the error "Workload failed, source: workload-monitor-heartbeat". The fact is that the replication pod is not created at all, even though the log shows that it is being launched. I have the workload launcher, worker, server, and Temporal all running in non-disrupt mode. No logs point to this silent failure. The job succeeds on re-attempting. Any ideas on what might be keeping the replication pod from starting? Between 1:07 and 1:19, it looks like it was attempting to create a pod but did not, and it failed silently. How could I mitigate this issue?

    kapa.ai

    11/25/2025, 2:55 PM
    You’re connected to kapa.ai, an assistant focused specifically on Airbyte. Your last message only says “Hi guys,” without a question or problem description. If you have an Airbyte-related question (connectors, deployments, errors, features, etc.), please share the details and I’ll help based on the available Airbyte docs, GitHub issues, and forum posts.

    Mihály Dombi

    11/25/2025, 2:55 PM
    Hello Team! I'm trying to add a new stream to a connector where the retrieval type is Asynchronous Job. There is a creation endpoint that returns a Location header (a URL with a `report_task_id`) in the response that can be polled to check the job's status. The problem is that this Location is in the response header while the response has no payload; because of this, `_get_creation_response_interpolation_context` always fails. Also, in the Connector Builder the HTTP Response Format section has no option to say that the response has no body. Is there a workaround for this, or should the CDK be extended with this option? The report creation endpoint in question: https://developer.ebay.com/api-docs/sell/marketing/resources/ad_report_task/methods/createReportTask#request.dateFrom
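    For anyone unfamiliar with the pattern being described, a minimal sketch of the create-then-poll flow in plain Python (the host, token, payload, and status field are placeholders based on the eBay docs linked above, not verified):
    ```python
    import time
    import requests

    HOST = "https://api.ebay.com"                  # placeholder
    HEADERS = {"Authorization": "Bearer <token>"}  # placeholder

    # 1. Create the report task. The response body is empty; the job URL
    #    is returned only in the Location response header.
    create = requests.post(
        f"{HOST}/sell/marketing/v1/ad_report_task",
        json={"dateFrom": "2025-11-01T00:00:00.000Z",
              "dateTo": "2025-11-18T00:00:00.000Z"},  # illustrative payload
        headers=HEADERS,
    )
    location = create.headers["Location"]  # .../ad_report_task/{report_task_id}

    # 2. Poll the Location URL until the task completes.
    while True:
        status = requests.get(location, headers=HEADERS).json()
        if status.get("reportTaskStatus") == "SUCCESS":  # assumed status field
            break
        time.sleep(30)
    ```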

    Lucas Segers

    11/25/2025, 2:56 PM
    Hey guys, anyone with a "Plain-text" HTTP Requester in a custom Builder connector facing issues after 2.0? I find it hard to believe I'm the only one whose plain-text streams broke after upgrading, so I must be missing something? 😛

    Dyllan Pascoe

    11/25/2025, 9:04 PM
    Greetings. Is there a way to delete a builder connector from my organization once it's been published?