# ask-community-for-troubleshooting
  • bollo

    11/12/2025, 3:36 PM
    Hello, I'm using version 1.5.3 of the S3 destination connector. We use the path format to partition our ingestion, like
    ingested_at=${YEAR}-${MONTH}-${DAY}-${HOUR}/stream=${STREAM_NAME}/client=${NAMESPACE}/
    and then use
    ingested_at
    as the bookmark to process the data in our pipeline. The problem is that Airbyte puts all the data from an ingestion in the same partition, no matter whether it takes one hour or several, so our pipeline drops data. Is there a workaround for this? Is it a known bug?
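    A plausible explanation (an assumption, not verified against the connector source): the path-format variables are rendered once from the sync's start timestamp, so a sync that starts at 15:00 and runs for three hours still writes everything under the hour-15 prefix, e.g.:
    Copy code
    # hypothetical keys from a sync started 2025-11-12 15:00 UTC that ran ~3h;
    # files written at 16:00 and 18:00 still land in the hour-15 partition
    ingested_at=2025-11-12-15/stream=orders/client=acme/part_0.parquet
    ingested_at=2025-11-12-15/stream=orders/client=acme/part_1.parquet
    ingested_at=2025-11-12-15/stream=orders/client=acme/part_2.parquet
    If that matches what you see, one workaround is to bookmark downstream on a per-record timestamp column (e.g. _airbyte_extracted_at) instead of the path partition.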
  • Mahmoud Khaled

    11/12/2025, 4:26 PM
    Has anyone used Airbyte to read data from Google Play and the App Store, such as the number of app installs?
  • aidatum

    11/12/2025, 4:51 PM
    Hi, I am facing a challenge deploying the Airbyte 1.8.5 workload-launcher deployment on OpenShift. I generated the manifest file, yet it is failing.
    • Platform: OpenShift Container Platform 4.18 (Kubernetes 1.31.11)
    • Airbyte version: 1.8.5 OSS (Open Source)
    Problem: the workload-launcher pod fails to start with authentication errors, despite running the OSS version, which shouldn't require authentication. Error messages:
    1. "Could not resolve placeholder ${DATAPLANE_CLIENT_ID}"
    2. "Could not resolve placeholder ${CONTROL_PLANE_TOKEN_ENDPOINT}"
    3. "Failed to heartbeat - baseUrl is invalid"
    4. "Failed to obtain or add access token - token request failed"
    Any idea?
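    One way to test whether these are simply unset environment values (a diagnostic sketch: the variable names come from the errors above, while the deployment name and values are placeholders for your environment):
    Copy code
    # hypothetical: inject the unresolved placeholders into the launcher
    # and watch whether it gets past the authentication errors
    oc -n airbyte set env deployment/airbyte-workload-launcher \
      DATAPLANE_CLIENT_ID=<client-id> \
      CONTROL_PLANE_TOKEN_ENDPOINT=<token-endpoint-url>
    oc -n airbyte rollout status deployment/airbyte-workload-launcher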
  • Steve Ma

    11/12/2025, 9:47 PM
    Hi, I am getting an error when setting up the Postgres source connector:
    We detected XMIN transaction wraparound in the database.
    It looks like this was introduced in this PR: https://github.com/airbytehq/airbyte/pull/38836/files. I understand the concern about
    XMIN transaction wraparound
    but could we consider raising a warning message instead of throwing an error? In my case, I am only planning to sync some regular tables, not the really large ones.
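    For context, you can check how close each database actually is to wraparound with a standard catalog query (plain PostgreSQL, not Airbyte-specific):
    Copy code
    # age(datfrozenxid) approximates how many XIDs old the database's oldest
    # unfrozen rows are; values approaching 2^31 (~2.1B) indicate wraparound risk
    psql -c "SELECT datname, age(datfrozenxid) AS xid_age FROM pg_database ORDER BY xid_age DESC;"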
  • Akshata Shanbhag

    11/13/2025, 7:59 AM
    I am noticing behaviour where there are more job attempts per job than usual on Airbyte 2.0.8 compared to the previous version, 1.7.1. How this affects us: the cursor progresses in some of the failed attempts, and the final successful attempt picks up the updated cursor. Since the cursor has already progressed in a previous attempt, we lose the records emitted during that attempt. Here is a screenshot where records are emitted but not committed, and the cursor has progressed in this failed attempt. Any recommendations on how to ensure this does not happen?
  • Santoshi Kalaskar

    11/13/2025, 1:20 PM
    Hi #C021JANJ6TY, I recently started using Airbyte as a data integration platform to fulfill a client requirement. Our objective is to transfer data from SharePoint (specifically from site pages or users' personal drives) to a Google Cloud Storage (GCS) bucket as the destination. While configuring the SharePoint source in Airbyte, we encountered issues performing OAuth authentication. The error occurred during the redirect URL step, even though we had configured the redirect URL in the Azure App Registration as per the available documentation. We need some help setting up the redirect URL. Thanks in advance! https://docs.airbyte.com/integrations/sources/microsoft-sharepoint
  • aidatum

    11/13/2025, 2:41 PM
    I found several issues while deploying Airbyte Community Edition 1.8.5 on OpenShift, as it has several Helm-chart issues. Has anyone faced something similar? The server works fine, yet the worker and workload-launcher have serious config issues.
  • Valeria Tapia

    11/13/2025, 4:25 PM
    Hi! Quick question — does anyone know if it’s possible to do an incremental sync (append) when using Airtable as a source? So far we’ve only been able to get it working with full refresh, but that’s not what we’re aiming for. Any ideas? 🙏
  • yanndata

    11/13/2025, 6:15 PM
    Hi! I want to install Airbyte with abctl on my DigitalOcean droplet (Ubuntu), but I failed.
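    For anyone hitting the same wall, the documented quickstart on a fresh Ubuntu droplet looks roughly like this (assumes Docker is installed and running; posting the actual error output would help narrow down the failure):
    Copy code
    # install abctl via Airbyte's install script, then run a local install
    curl -LsfS https://get.airbyte.com | bash -
    abctl local install
    # print the generated web UI credentials
    abctl local credentials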
  • Pranay S

    11/14/2025, 6:35 AM
    Hello, I have made a connection between Airbyte and my Shopify store, but while syncing I'm getting this error. Can anyone please help me? What I've tried: 1. using a new PostgreSQL connection, 2. using a new store, 3. clearing Airbyte data and syncing again. None of these helped.
  • Alejandro De La Cruz López

    11/14/2025, 9:24 AM
    Hey! I am ingesting HubSpot data with Airbyte. Since the 7th of November, our full-refresh syncs don't ingest all the data available in HubSpot. Nothing changed, but our daily run ingests a random number of rows instead of the total that appears in HubSpot. Any ideas?
  • Alessio Darmanin

    11/14/2025, 11:02 AM
    Hi. I was successfully using Airbyte 1.8.4. Today I upgraded to 2.0.1, and now I get
    Airbyte is temporarily unavailable. Please try again. (HTTP 502)
    when trying to retest a previously working source. From the pods view, the control plane looks healthy; everything is shown as Running. Going through server.log I can see the contents shown below, but the SSH Tunnel Method is set to "No Tunnel" in the source definition, and the JSON also reflects this.
    Copy code
    JSON schema validation failed.
    errors: $.tunnel_method: must be the constant value 'SSH_KEY_AUTH',
    required property 'tunnel_host' not found,
    required property 'tunnel_port' not found,
    required property 'tunnel_user' not found,
    required property 'ssh_key' not found
    Source JSON:
    Copy code
    "tunnel_method": { "tunnel_method": "NO_TUNNEL" }
    What could the issue be please?
  • Diego Quintana

    11/14/2025, 11:34 AM
    Hi! I'm getting a weird error on my clickhouse -> postgres connection
    Copy code
    source connector ClickHouse v0.2.6
    destination Postgres v2.2.1
    airbyte version 0.50.31 (I know, I know)
    The error appears after a while of syncing incremental models, and it seems to be
    Copy code
    java.sql.SQLException: java.io.IOException: Premature EOF
    A full refresh takes around 1.22 hours and does not disconnect, though. I've set
    socket_timeout=300000
    in my connection, with no success. What could it be?
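    Since socket_timeout only covers the client side, it may be worth dumping the server's timeout settings too; a Premature EOF on long reads is often the server or an intermediate proxy closing the connection first (standard ClickHouse system table; which specific setting matters here is an assumption):
    Copy code
    # list timeout-related server settings on the ClickHouse side
    clickhouse-client --query "SELECT name, value FROM system.settings WHERE name ILIKE '%timeout%' ORDER BY name"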
  • Ashok Pothireddy

    11/14/2025, 11:35 AM
    Hello team, a question regarding the SharePoint connector. We currently have multiple Excel files in a folder that we are trying to ingest, but we need to ingest them into separate S3 buckets. Problem one: when choosing files for ingestion, they show up as one Excel file, and the columns of all the files are clubbed into one big file. Problem two: even though we select only a few columns, the destination still has multiple columns, and sometimes all of them. How do we fix this?
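    If the SharePoint source behaves like Airbyte's other file-based sources (an assumption worth verifying), each stream you define gets its own glob pattern, so one stream per workbook keeps the schemas from being merged into one wide table; the stream names and paths below are hypothetical:
    Copy code
    # hypothetical stream layout: one stream per Excel file
    streams:
      - name: sales_report
        globs: ["Shared Documents/reports/sales*.xlsx"]
      - name: inventory_report
        globs: ["Shared Documents/reports/inventory*.xlsx"]
    Separate S3 buckets would then still need one connection per bucket, since an S3 destination is configured with a single bucket.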
  • Rafael Santos

    11/14/2025, 12:34 PM
    Hello, guys. Currently I'm facing an issue while trying to build the connection shown in this tutorial: I get a Broken Pipe, with not many further details, during the replication phase. Also, is there a way to find Pinecone's environment value as of today? The interface for their indexes' information changed and no longer includes such a field, which is required by Airbyte's destination, although the initial test to check that Pinecone is working seems to be successful. I don't know exactly what the source of this issue is, but since I get some logs with a generic DestinationWriter error, a mismatch on this env value could be the culprit. The error logs start like this:
    Copy code
    replication-orchestrator INFO Stream status TRACE received of status: STARTED for stream issues
    replication-orchestrator INFO Sending update for issues - null -> RUNNING
    replication-orchestrator INFO Stream Status Update Received: issues - RUNNING
    replication-orchestrator INFO Creating status: issues - RUNNING
    replication-orchestrator INFO Stream status TRACE received of status: RUNNING for stream issues
    replication-orchestrator INFO Workload successfully transitioned to running state
    destination INFO Writing complete.
    replication-orchestrator ERROR DestinationWriter error:
    replication-orchestrator ERROR DestinationWriter error:
    replication-orchestrator INFO DestinationReader finished.
    replication-orchestrator WARN Attempted to close a destination which is already closed.
    replication-orchestrator INFO DestinationWriter finished.
    replication-orchestrator INFO MessageProcessor finished.
    replication-orchestrator ERROR SourceReader error:
    replication-orchestrator INFO SourceReader finished.
    replication-orchestrator ERROR runJobs failed; recording failure but continuing to finish.
    replication-orchestrator INFO Closing StateCheckSumCountEventHandler
  • Rafael Santos

    11/14/2025, 9:16 PM
    Now I'm trying to connect a local Qdrant instance to a local Airbyte instance. I've already made sure that both containers are on the same network; however, I keep getting this error: Internal message: 'QdrantIndexer' object has no attribute '_client'
  • Kevin Conseil

    11/17/2025, 9:19 AM
    Hi everyone, has anyone had issues with the File source connector using Excel, where date/time fields are not JSON serializable? This suddenly happened at the end of October, while the connector had been working correctly for years and nothing changed in our Excel file.
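    The underlying failure is easy to reproduce outside Airbyte: Python's json module rejects datetime objects unless a serializer is supplied (a generic sketch, not the connector's actual code path):
    Copy code
    python3 -c "
    import json, datetime
    row = {'updated': datetime.datetime(2025, 10, 31, 12, 0)}
    try:
        json.dumps(row)                  # raises TypeError: not JSON serializable
    except TypeError as e:
        print('fails:', e)
    print(json.dumps(row, default=str))  # workaround: stringify datetimes
    "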
  • Eduardo Fernandes

    11/17/2025, 11:46 AM
    We're having problems with the Google Ads custom query. Most of the metrics we select from campaign aren't being sent to the schema. Is anyone else having this problem? I think it's happening because of the new version published 3 days ago here. Details in the thread.
  • Martijn van Maasakkers

    11/17/2025, 7:37 PM
    Hi y'all. I'm trying to reinstall Airbyte to update to the latest version, but after running the uninstall, when I try to run
    abctl local install
    again I run into this error:
    docker: Error response from daemon: failed to set up container networking: failed to create endpoint airbyte-abctl-control-plane on network kind: network 3abd9d5c5005a3ed8563fe69633ac525c72ff3e956cf5309c9b6a69394b897a2 does not exist
    When checking
    docker network ls
    however it seems to be there:
    3abd9d5c5005   kind      bridge    local
    Does any of you have any idea on how to fix this?
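    What ended up working (per the follow-up below) was removing the stale network so the installer can recreate it; roughly, assuming nothing else still uses the kind network:
    Copy code
    # drop the leftover kind network, then reinstall so abctl recreates it
    docker network rm kind
    abctl local install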
  • Martijn van Maasakkers

    11/17/2025, 8:06 PM
    After help from AI, I removed the network and ran the install. Everything looked fine, and Airbyte does seem to start the sync, but the timeline shows nothing and there is also no data transferred.
  • Andrey Souza

    11/17/2025, 9:06 PM
    Hi guys, looking for some help. I was using Airbyte 1.6.0 OSS locally, deployed with abctl, and had some issues related to the secrets configuration when I tried to update to the latest version (2.0.1). Probably because of schema and/or defaults changes in values.yaml, I tried to update my secrets accordingly, but I still got multiple errors like "Error: couldn't find key AWS_ACCESS_KEY_ID in Secret airbyte-abctl/airbyte-config-secrets" and other related missing keys. My values and secrets-template files are shown below:
    Copy code
    abctl local uninstall
    
      INFO    Using Kubernetes provider:
                Provider: kind
                Kubeconfig: /home/residencia/.airbyte/abctl/abctl.kubeconfig
                Context: kind-airbyte-abctl
     SUCCESS  Found Docker installation: version 28.0.0
     SUCCESS  Existing cluster 'airbyte-abctl' found
     SUCCESS  Uninstallation of cluster 'airbyte-abctl' completed successfully
     SUCCESS  Airbyte uninstallation complete
    
    
    kubectl version
    
    Client Version: v1.32.1
    Kustomize Version: v5.5.0
    Error from server (NotFound): the server could not find the requested resource
    
    
    abctl version
    
    version: v0.30.3
    
    
    cat values.yaml
    
    global:
      edition: "community"
      secretName: "airbyte-config-secrets"
      local: true
    
      image:
        tag: "2.0.1"
    
      auth:
        enabled: true
        instanceAdmin:
          emailSecretKey: "instance-admin-email"
          passwordSecretKey: "instance-admin-password"
        security:
          cookieSecureSetting: false
    
    
    cat secrets-template.yaml
    
    apiVersion: v1
    kind: Secret
    metadata:
      name: airbyte-config-secrets
    type: Opaque
    stringData:
      instance-admin-email: "${AIRBYTE_USER}"
      instance-admin-password: "${AIRBYTE_PASSWORD}"
    
    
    envsubst < secrets-template.yaml > secrets.yaml
    
    
    abctl local install --values ./values.yaml --secret ./secrets.yaml
    
      INFO    Using Kubernetes provider:
                Provider: kind
                Kubeconfig: /home/residencia/.airbyte/abctl/abctl.kubeconfig
                Context: kind-airbyte-abctl
     SUCCESS  Found Docker installation: version 28.0.0
      INFO    No existing cluster found, cluster 'airbyte-abctl' will be created
     SUCCESS  Port 8000 appears to be available
     SUCCESS  Cluster 'airbyte-abctl' created
     WARNING  Found MinIO physical volume. Consider migrating it to local storage (see project docs)
     WARNING  PostgreSQL 13 detected. Consider upgrading to PostgreSQL 17
      INFO    Pulling image airbyte/async-profiler:2.0.1
      INFO    Pulling image airbyte/bootloader:2.0.1
      INFO    Pulling image airbyte/connector-builder-server:2.0.1
      INFO    Pulling image airbyte/connector-sidecar:2.0.1
      INFO    Pulling image airbyte/container-orchestrator:2.0.1
      INFO    Pulling image airbyte/cron:2.0.1
      INFO    Pulling image airbyte/db:2.0.1
      INFO    Pulling image airbyte/server:2.0.1
      INFO    Pulling image airbyte/utils:2.0.1
      INFO    Pulling image airbyte/worker:2.0.1
      INFO    Pulling image airbyte/workload-api-server:2.0.1
      INFO    Pulling image airbyte/workload-init-container:2.0.1
      INFO    Pulling image airbyte/workload-launcher:2.0.1
      INFO    Pulling image minio/minio:RELEASE.2023-11-20T22-40-07Z
      INFO    Pulling image temporalio/auto-setup:1.27.2
      INFO    Namespace 'airbyte-abctl' created
      INFO    Persistent volume 'airbyte-minio-pv' created
      INFO    Persistent volume claim 'airbyte-minio-pv-claim-airbyte-minio-0' created
      INFO    Persistent volume 'airbyte-volume-db' created
      INFO    Persistent volume claim 'airbyte-volume-db-airbyte-db-0' created
     SUCCESS  Secret from '/home/residencia/containers/elt/airbyte/airbyte-config/secrets.yaml' created or updated
      INFO    Starting Helm Chart installation of 'airbyte/airbyte' (version: 2.0.19)
     WARNING  Encountered an issue deploying Airbyte:
                Pod: airbyte-minio-0.1878d8db93f57520
                Reason: Failed
                Message: Error: couldn't find key AWS_ACCESS_KEY_ID in Secret airbyte-abctl/airbyte-config-secrets
                Count: 6
     WARNING  Encountered an issue deploying Airbyte:
                Pod: airbyte-abctl-bootloader.1878d8db8cac0600
                Reason: Failed
                Message: Error: couldn't find key AB_INSTANCE_ADMIN_CLIENT_ID in Secret airbyte-abctl/airbyte-config-secrets
                Count: 6
     WARNING  Encountered an issue deploying Airbyte:
                Pod: airbyte-minio-0.1878d8db93f57520
                Reason: Failed
                Message: Error: couldn't find key AWS_ACCESS_KEY_ID in Secret airbyte-abctl/airbyte-config-secrets
                Count: 7
     WARNING  Encountered an issue deploying Airbyte:
                Pod: airbyte-abctl-bootloader.1878d8db8cac0600
                Reason: Failed
                Message: Error: couldn't find key AB_INSTANCE_ADMIN_CLIENT_ID in Secret airbyte-abctl/airbyte-config-secrets
                Count: 7
    
    ...
    
    ▀  Installing 'airbyte/airbyte' (version: 2.0.19) Helm Chart (this may take several minutes) (27m39s)
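    Judging from the warnings, the chart expects more keys in airbyte-config-secrets than the template provides. A hedged extension of secrets-template.yaml covering the keys named in the errors (the values are placeholders, and whether these exact keys are the right fix for your chart version is an assumption based purely on the error text):
    Copy code
    apiVersion: v1
    kind: Secret
    metadata:
      name: airbyte-config-secrets
    type: Opaque
    stringData:
      instance-admin-email: "${AIRBYTE_USER}"
      instance-admin-password: "${AIRBYTE_PASSWORD}"
      # keys the failing pods asked for, per the warnings above (placeholder values):
      AWS_ACCESS_KEY_ID: "minio"
      AWS_SECRET_ACCESS_KEY: "minio123"
      AB_INSTANCE_ADMIN_CLIENT_ID: "<uuid>"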
  • Fabian

    11/18/2025, 7:27 AM
    Hi all, I'm searching for a scriptable way to upgrade the source or destination definition versions of connectors, i.e. their Docker image version (say, 100ms from 0.0.13 to 0.0.14), on Airbyte Core, i.e. the free self-hosted option. I create connectors via Terraform, yet upgrades don't seem to be supported by the Airbyte Terraform provider. All attempts to use e.g. this Airbyte API endpoint https://reference.airbyte.com/reference/updatesourcedefinition did not succeed: the API returned a Forbidden, while all other requests worked fine and the user is an organization_admin and instance_admin according to the permissions API endpoint. I've tried with Airbyte 2.0.1 installed locally via abctl, as well as with Airbyte 2.0.0 installed via Helm chart 2.0.18, both locally and on EKS.

    By inspecting the web GUI while upgrading a source connector, I found the Airbyte Configuration API https://airbyte-public-api-docs.s3.us-east-2.amazonaws.com/rapidoc-api-docs.html#overview. I was able to upgrade a connector with a curl command against it, using the same token as retrieved for the main Airbyte API. Yet since the API documentation's welcome page says the API is not guaranteed to be supported long term and may have breaking changes, I'm hesitant to use it. From my search I also found mentions that the main Airbyte API is meant to be used for custom connector definitions only, not Airbyte-native ones, hence the Forbidden. Given that the Configuration API may be subject to change, does anyone know the correct or best way to upgrade connector versions via code, not via the GUI, on a self-hosted Airbyte Core? Many thanks for any input!
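    For reference, the Configuration API call that worked can be reconstructed roughly like this (a hedged sketch: the endpoint path and body shape follow the rapidoc docs linked above, but the host, IDs, and token are placeholders to verify against your instance):
    Copy code
    # hypothetical: bump a source definition's image tag via the (unstable)
    # Configuration API, using the same bearer token as the main API
    curl -X POST "http://<airbyte-host>/api/v1/source_definitions/update" \
      -H "Authorization: Bearer $TOKEN" \
      -H "Content-Type: application/json" \
      -d '{"sourceDefinitionId": "<definition-uuid>", "dockerImageTag": "0.0.14"}'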
  • Johan Holmström

    11/18/2025, 8:25 AM
    Hello everybody. I am new here and new to Airbyte. I work as an IT technician at a hospital in Finland, and I have two questions regarding Airbyte. 1. I have installed Airbyte on an Ubuntu server using abctl. Is it possible to have multiple users log in to Airbyte? It seems I can only have one user account, but we would need more. 2. I have set up a reverse proxy using NGINX in order to get HTTPS to work. When I try to log in to Airbyte it says: "Sorry, something went wrong. Failed to get user after login. Check the network tab for more details. Status: 401 Unauthorized." Can anyone help?
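    On question 2, a 401 right after login is a classic symptom of missing forwarding headers on the proxy. A minimal sketch of the relevant NGINX location block, assuming Airbyte listens on localhost:8000 (the host, port, and whether this is your actual cause are all assumptions):
    Copy code
    location / {
        proxy_pass http://127.0.0.1:8000;
        proxy_http_version 1.1;
        # forward the original host and scheme so auth redirects and cookies match
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        # websocket upgrade headers used by the web UI
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }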
  • Shakar Bakr

    11/18/2025, 10:27 AM
    Hello everyone, after upgrading the chart to the latest version, I got the following error:
    docker.io/airbyte/webapp:2.0.1: not found
    I checked the repository and noticed that the
    webapp:2.0.1
    image tag is missing from the releases. The latest available release is
    1.7.8
  • Pragyash Barman

    11/18/2025, 10:53 AM
    Hi everyone, I am facing an issue where the MySQL CDC syncs stall at
    global-round-1-acquire-resources
    (Airbyte Helm 2.0.19). The job pods create the Unix domain sockets, list tables, then stop with no further output. Logs:
    Copy code
    2025-11-18 09:03:59,436 [pool-3-thread-1]    INFO    i.a.w.i.VersionedAirbyteStreamFactory(internalLog$io_airbyte_airbyte_commons_worker):248 - INFO main i.a.c.d.JdbcMetadataQuerier$memoizedColumnMetadata$2(invoke):126 Querying column names for catalog discovery.
    2025-11-18 09:03:59,621 [pool-3-thread-1]    INFO    i.a.w.i.VersionedAirbyteStreamFactory(internalLog$io_airbyte_airbyte_commons_worker):248 - INFO main i.a.c.d.JdbcMetadataQuerier$memoizedColumnMetadata$2(invoke):171 Discovered 2488 column(s) and pseudo-column(s).
    2025-11-18 09:03:59,694 [pool-3-thread-1]    INFO    i.a.w.i.VersionedAirbyteStreamFactory(internalLog$io_airbyte_airbyte_commons_worker):248 - INFO main i.a.c.d.JdbcMetadataQuerier(close):382 Closing JDBC connection.
    2025-11-18 09:03:59,705 [pool-3-thread-1]    INFO    i.a.w.i.VersionedAirbyteStreamFactory(internalLog$io_airbyte_airbyte_commons_worker):248 - INFO DefaultDispatcher-worker-55#read i.a.c.r.RootReader(read):91 Read configured with data channel medium: SOCKET. data channel format: PROTOBUF
    2025-11-18 09:03:59,705 [pool-3-thread-1]    INFO    i.a.w.i.VersionedAirbyteStreamFactory(internalLog$io_airbyte_airbyte_commons_worker):248 - INFO DefaultDispatcher-worker-55#read i.a.c.r.RootReader(read):178 Reading feeds of type class io.airbyte.cdk.read.Global.
    2025-11-18 09:03:59,711 [pool-3-thread-1]    INFO    i.a.w.i.VersionedAirbyteStreamFactory(internalLog$io_airbyte_airbyte_commons_worker):248 - INFO DefaultDispatcher-worker-66#global i.a.c.r.FeedReader(createPartitions):107 Attempting bootstrap using class io.airbyte.integrations.source.mysql.MySqlJdbcConcurrentPartitionsCreatorFactory.
    2025-11-18 09:03:59,712 [pool-3-thread-1]    INFO    i.a.w.i.VersionedAirbyteStreamFactory(internalLog$io_airbyte_airbyte_commons_worker):248 - INFO DefaultDispatcher-worker-66#global i.a.c.r.FeedReader(createPartitions):107 Attempting bootstrap using class io.airbyte.cdk.read.cdc.CdcPartitionsCreatorFactory.
    2025-11-18 09:03:59,719 [pool-3-thread-1]    INFO    i.a.w.i.VersionedAirbyteStreamFactory(internalLog$io_airbyte_airbyte_commons_worker):248 - INFO DefaultDispatcher-worker-55#read i.a.c.r.ReadOperation$execute$1$1$1(invokeSuspend):80 coroutine state:
    read
     └─global
        └─global-round-1-acquire-resources
    Setup
    • Deployment: self-hosted Airbyte via Helm (airbyte chart v2.0.19)
    • Source: MySQL connector v3.51.5 (CDC mode) reading from RDS MySQL 8.0.40
    • Destination: BigQuery connector v3.0.15
    • MySQL user grants: SELECT, RELOAD, SHOW DATABASES, SHOW VIEW, REPLICATION SLAVE, REPLICATION CLIENT
    Any guidance on how to debug or resolve this would be appreciated—thanks!
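    Since the stall happens while acquiring CDC resources, it may be worth re-verifying the binlog prerequisites as the connector's own MySQL user (standard MySQL statements; which of them the connector actually blocks on is an assumption):
    Copy code
    # verify CDC prerequisites as the Airbyte MySQL user
    mysql -h <rds-endpoint> -u <airbyte-user> -p -e "
      SHOW VARIABLES LIKE 'binlog_format';    -- expect ROW
      SHOW VARIABLES LIKE 'binlog_row_image'; -- expect FULL
      SHOW MASTER STATUS;                     -- requires REPLICATION CLIENT"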
  • Eloy Eligon

    11/18/2025, 2:58 PM
    Hi! It seems like the HubSpot connector is failing to bring in the
    line_items
    field for the
    deals
    stream. This probably has to do with yesterday's release, as before yesterday the connection was working just fine. I already tried running a refresh of the data, and the problem persists.
  • Danielle Murdock

    11/18/2025, 8:05 PM
    Has anyone had an issue with junction objects in Salesforce not syncing fully? I have
    opportunitycontactrole
    set up, and I'm only getting ~2,300 records in Snowflake, but there are over 8,000 in Salesforce itself. I've tried doing a full overwrite, but I'm still missing most of the data. Running the most recent version of the connector on Cloud.
  • Santoshi Kalaskar

    11/19/2025, 7:14 AM
    Hi #C021JANJ6TY team, has anyone worked with Full Refresh data sync for unstructured documents? For example, my source is Google Drive and the destination is Azure Blob Storage. When I delete files in the source, the Full Refresh sync does not delete them in the destination. Shouldn't Full Refresh remove deleted files from the destination as well?
  • Stefano Messina

    11/19/2025, 8:40 AM
    Hello, we're experiencing some problems with the latest ClickHouse connector (v2). All the connections are throwing this error
    Sync completed, but unflushed states were detected.
    during syncs; the stack trace in the logs doesn't really give more information, but I can post it here if necessary. At the same time, the Mapper configuration is also not working as expected compared to v1. Does anyone have any idea what's going on? Created an issue on GitHub: https://github.com/airbytehq/airbyte/issues/69746
  • Slackbot

    11/19/2025, 9:10 AM
    This message was deleted.