# ask-community-for-troubleshooting
  • loganeales

    09/05/2025, 7:35 AM
    Hi All,
    Goal: upgrade the AB platform from v1.6.0 to v1.7.0.
    Environment: EC2 instance (type t3a.xlarge); upgrading with abctl via SSH into the instance terminal.
    Issue:
    ```
    ERROR failed to determine if any previous psql version exists: error reading pgdata version file: open /home/ssm-user/.airbyte/abctl/data/airbyte-volume-db/pgdata/PG_VERSION: permission denied
    ```
    I have read through the various threads on this issue and the kapa.ai responses (see attached thread), but I don't see any clear or verified solutions and worry that most of what I read leads down a rabbit hole. Has anyone resolved this and can share a step-by-step procedure? https://airbytehq.slack.com/archives/C01AHCD885S/p1752213180000729?thread_ts=1752213163.186179&cid=C01AHCD885S
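    A hedged first check, not a verified fix: the error points at a file the current user can't read, so see who owns the pgdata directory named in the message. The chown below is an assumption; back up ~/.airbyte before touching anything.
    ```
    # who owns the file abctl failed to read?
    ls -ld /home/ssm-user/.airbyte/abctl/data/airbyte-volume-db/pgdata

    # if a previous run created the files as root, one commonly suggested option
    # is to hand ownership back to the invoking user and retry the upgrade
    sudo chown -R "$USER":"$USER" ~/.airbyte
    ```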
  • Sebastien vaudour

    09/05/2025, 8:53 AM
    Hi all! We have set up a new connection in Airbyte (OSS, v1.4) using the Postgres connector (v3.7) with the CDC mechanism. The connector seems to have been set up correctly in Airbyte; however, when creating a new connection, we are not given the option to use an incremental sync mode: only full refresh. Have we missed something during the setup in Airbyte or the configuration of our Postgres instance? Thank you for your help!
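    For context, incremental modes for Postgres CDC typically only appear for streams whose tables have a primary key and whose server is configured for logical decoding; a hedged checklist sketch (host/user/db are placeholders):
    ```
    psql -h <host> -U <user> -d <db> <<'SQL'
    SHOW wal_level;                                      -- must be 'logical' for CDC
    SELECT slot_name, plugin FROM pg_replication_slots;  -- the slot named in the connector must exist
    SELECT pubname FROM pg_publication;                  -- the publication must cover the synced tables
    SQL
    ```
    Tables without a primary key generally surface only full refresh under CDC, so that is worth checking first.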
  • Prugniaud Melchior

    09/05/2025, 9:05 AM
    Hello, when trying to rebuild a failed instance on my EC2 using abctl, I'm getting:
    ```
    abctl local install --timeout 30m --host MY_HOST_NAME
      INFO    Using Kubernetes provider:
                Provider: kind
                Kubeconfig: /home/ec2-user/.airbyte/abctl/abctl.kubeconfig
                Context: kind-airbyte-abctl
     SUCCESS  Found Docker installation: version 25.0.6                                                                                                                                                                                                  
     SUCCESS  Port 8000 appears to be available                                                                                                                                                                                                          
      INFO    No existing cluster found, cluster 'airbyte-abctl' will be created                                                                                                                                                                         
     SUCCESS  Cluster 'airbyte-abctl' created                                                                                                                                                                                                            
      INFO    Namespace 'airbyte-abctl' created                                                                                                                                                                                                          
      INFO    Persistent volume 'airbyte-minio-pv' created                                                                                                                                                                                               
      INFO    Persistent volume 'airbyte-volume-db' created                                                                                                                                                                                              
      INFO    Persistent volume claim 'airbyte-minio-pv-claim-airbyte-minio-0' created                                                                                                                                                                   
      INFO    Persistent volume claim 'airbyte-volume-db-airbyte-db-0' created                                                                                                                                                                           
      INFO    Starting Helm Chart installation of 'airbyte/airbyte' (version: 1.8.2)                                                                                                                                                                     
     SUCCESS  Installed Helm Chart airbyte/airbyte:                                                                                                                                                                                                      
                Name: airbyte-abctl
                Namespace: airbyte-abctl
                Version: 1.8.2
                AppVersion: 1.8.2
                Release: 1
      INFO    Starting Helm Chart installation of 'nginx/ingress-nginx' (version: 4.13.2)                                                                                                                                                                
     SUCCESS  Installed Helm Chart nginx/ingress-nginx:                                                                                                                                                                                                  
                Name: ingress-nginx
                Namespace: ingress-nginx
                Version: 4.13.2
                AppVersion: 1.13.2
                Release: 1
      INFO    No existing Ingress found, creating one                                                                                                                                                                                                    
     SUCCESS  Ingress created                                                                                                                                                                                                                            
      ERROR   Timed out waiting for ingress                                                                                                                                                                                                              
      ERROR   Unable to install Airbyte locally                                                                                                                                                                                                          
      ERROR   browser liveness check failed: context deadline exceeded
    ```
    The current abctl version is v0.13.1. I'm not upgrading it because I have an Airbyte DB backup and I'm afraid of losing all my connection configs (but right now my Airbyte is down).
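    For reference, a few hedged diagnostics (kubeconfig path and context taken from the log above) to see what the ingress wait is actually blocking on:
    ```
    export KUBECONFIG=/home/ec2-user/.airbyte/abctl/abctl.kubeconfig
    kubectl --context kind-airbyte-abctl get pods -A            # look for Pending / CrashLoopBackOff
    kubectl --context kind-airbyte-abctl describe ingress -n airbyte-abctl
    kubectl --context kind-airbyte-abctl logs -n ingress-nginx \
      -l app.kubernetes.io/name=ingress-nginx --tail=50
    ```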
  • keshav

    09/05/2025, 9:48 AM
    Hi Airbyte Community, We’re working on integrating Airbyte Embedded Widget APIs into our web application so our users can connect data sources directly from our UI. While testing the /v1/embedded/widget_token endpoint, we realized that we need an Embedded API key and that this feature has to be enabled for our workspace. Could someone from the Airbyte team please guide us on how to enable the Embedded feature for our workspace and provide access to the Embedded API key / client credentials? Thanks in advance!
  • Mert Ors

    09/05/2025, 9:59 AM
    Conversions/conversion values from the Facebook Marketing API aren't matching anymore, even though I haven't changed anything about my data pipeline. Is anyone else experiencing this issue?
  • Pradyumna Kulkarni

    09/05/2025, 11:50 AM
    I have the same issue when I try to reinstall: abctl local install is not working even on a fresh instance. There is some issue with Airbyte installation in general, and Kapa is not helping. I recently migrated all my production workload to Airbyte, and now all the data is out of sync.
  • Armando Khachatryan

    09/05/2025, 2:39 PM
    Hi there. I'm having trouble creating a MySQL source. I've set up all the configs, but when I press "Set up source", after a long time it shows a 502 and no other errors (not wrong host, wrong user password, or a permission problem; just "Airbyte is temporarily unavailable"). The environment is Windows with Docker, and I've done everything as written in the docs. Other strange things: local credentials don't show an email, and in the settings' Applications section my default application doesn't show a client ID and secret (clicking the eye icon shows nothing).
  • Anil Thapa

    09/05/2025, 3:24 PM
    Hello Team, when I try to test the Google Sheets builder with a particular Google Sheet, hitting Ctrl+Enter as instructed doesn't perform a test. Does anyone know how to resolve this?
  • Tim Santos

    09/05/2025, 4:53 PM
    Hello team, I'm doing a benchmark comparison of Airbyte and Fivetran by ingesting our Hubspot data into Snowflake. We are ingesting about 60k Hubspot deal records daily. Airbyte is taking 1-2 hours to ingest the data while Fivetran can do this in under 10 minutes. I would like to understand what the bottleneck is with Airbyte's Hubspot connector and if there is anything we can do to bring this ingestion time down. Thanks!
  • Faisal

    09/05/2025, 6:58 PM
    I have tried the kapa.ai route but no luck. In a new Airbyte install I am moving data from SQL Server to Snowflake, and the sync completes successfully; however, no tables are created in the destination and 0 bytes are transferred. Per kapa, it's because T+D final table creation is disabled, but I'm not sure where to check for this or where to enable it. Any help is appreciated. This is on a local install, BTW.
  • Carlos Henrique Ramos Dutra

    09/08/2025, 2:19 AM
    Hello everyone, I'm adding to the issues that Nycolas Nascimento described in his recent thread. I am also facing critical failures trying to perform a fresh installation of Airbyte Open Source on a clean Debian 12 VM. https://airbytehq.slack.com/archives/C021JANJ6TY/p1757040662756339 Here is a summary of my attempts:
    1. abctl method: `abctl local install` completes, but authentication is broken. `abctl local credentials` shows `Email: [not set]`. The login UI requires an email format but fails with "Invalid username or password" for any default I try (e.g., user@example.com). Trying to force-set the credentials with `abctl local credentials --email my-email@example.com` fails with a `401 Unauthorized` error, meaning the tool can't manage its own installation.
    2. Docker run methods: To bypass `abctl`, I tried the simpler Docker methods, but the images seem to have been removed from Docker Hub. `docker run ... airbyte/quickstart` fails with a `pull access denied` error, and `docker run ... airbyte/platform-launcher` fails the same way.
    It seems the entire installation pipeline for new open-source users is currently blocked: the official `abctl` tool has authentication bugs, and the simpler Docker-based alternatives are no longer available. Could you please advise on the current, correct, and functional method to deploy a new OSS instance on a Linux VM? Thank you.
  • Cris Navas

    09/08/2025, 10:35 AM
    The `Quotes` stream isn't available for the `Hubspot` connector, right? If so, how can I sync it? I don't quite understand why it's not present.
  • Tanuj Shriyan

    09/08/2025, 10:57 AM
    Hey guys, is the Snowflake source a very slow sync connector? It is only syncing 5,000 records every 2 minutes. Is it possible to increase the sync rate to reduce the connection time?
  • loganeales

    09/08/2025, 11:21 AM
    Hi AB Community, has anyone heard of a solution for this issue: https://github.com/airbytehq/airbyte/issues/44914 I also received this error message:
    ```
    ERROR   unable to create kind cluster: failed to init node with kubeadm: command "docker exec --privileged airbyte-abctl-control-plane kubeadm init --config=/kind/kubeadm.conf --skip-token-print --v=6" failed with error: exit status 1: I0908 08:20:49.610951     203 initconfiguration.go:261] loading configuration from "/kind/kubeadm.conf"
    ```
    When trying to do this:
    ```
    abctl local install \
      --chart-version 1.7.0 \
      --host val-aws-airbyte-ab-test.valentureinstitute.com \
      --low-resource-mode
    ```
    Environment details:
    • AWS EC2 instance, size t3a.xlarge
    • AB version 1.6.0
    • abctl version v0.30.1
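    One hedged recovery path, assuming your DB backup is restorable: a plain `uninstall` removes the kind cluster but keeps the persisted data directory (it is the `--persisted` flag that wipes it), though verify that against your abctl version before running anything.
    ```
    abctl local uninstall       # tears down the kind cluster, keeps ~/.airbyte data
    docker system prune -f      # clear stale kind node containers that can break kubeadm init
    abctl local install \
      --chart-version 1.7.0 \
      --host val-aws-airbyte-ab-test.valentureinstitute.com \
      --low-resource-mode
    ```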
  • Jordi Santacreu Carranco

    09/08/2025, 11:28 AM
    Hi, I'm trying to connect my MySQL database hosted on AWS RDS, doing CDC to Redshift, and I get this issue when refreshing my data: [config_error] Incumbent CDC state is invalid, reason: Saved offset no longer present on the server, please reset the connection, and then increase binlog retention and/or increase sync frequency. Connector last known binlog file mysql-bin-changelog.083668 is not found in the server. Server has [mysql-bin-changelog.083678, mysql-bin-changelog.083679, mysql-bin-changelog.083680]. Anyone have any clue?
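    The error text itself names the two levers: sync more frequently, or keep binlogs longer. On RDS, retention is set with an RDS-specific stored procedure; a hedged sketch (endpoint and user are placeholders, 72 hours is an arbitrary example), after which the connection still needs a reset so the connector can take a fresh snapshot:
    ```
    mysql -h <rds-endpoint> -u <user> -p <<'SQL'
    -- keep binlogs long enough to cover the longest gap between syncs
    CALL mysql.rds_set_configuration('binlog retention hours', 72);
    CALL mysql.rds_show_configuration;
    SQL
    ```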
  • Lillian Jiang

    09/08/2025, 1:29 PM
    I noticed that some stream_partitions use dot (.) notation and some use bracket ([]) notation. Which one should be used in which context?
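    If this is about Jinja interpolation in the low-code YAML (where the variable is `stream_partition`), the two forms read the same value for simple keys; brackets are needed when the key is not a valid identifier or is held in a variable. A hedged sketch with made-up field names:
    ```
    # attribute-style access works for plain keys
    path: "/children/{{ stream_partition.parent_id }}"
    # bracket access is required for keys with dashes, spaces, or dynamic names
    path: "/children/{{ stream_partition['parent-id'] }}"
    ```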
  • Aman Deep

    09/08/2025, 2:45 PM
    Hello Team, I am trying the self-hosted solution for Airbyte. If I use version 1.8.2, I can't connect the source; at the end of the setup I get this error:
    ```
    {
      "message": "Internal Server Error: not yet implemented",
      "exceptionClassName": "java.lang.UnsupportedOperationException",
      "exceptionStack": [],
      "rootCauseExceptionStack": []
    }
    ```
    Version 1.8.1 isn't installing, and 1.7.2 uses deprecated Google APIs (v18). Can anyone help?
  • Ram N

    09/08/2025, 3:24 PM
    Hello Team, I am trying to load an 81 GB table from MySQL to Snowflake. My open-source Airbyte is deployed on an AKS cluster. I am getting the error below while loading big tables; can anyone shed some light on it? Error: "Airbyte could not start the sync process or track the progress of the sync."
  • Ram N

    09/08/2025, 3:24 PM
    I can see it has extracted 40 million records out of 500 million records.
  • Lillian Jiang

    09/08/2025, 3:42 PM
    Hi! I am adding a new stream to a source connector and trying to add a field that comes from its parent stream. However, the field is not being added correctly. Does anyone have any idea why this is occurring?
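    A hedged sketch of the usual low-code pattern, in case it matches this setup: expose the parent value through the partition router, then inject it into child records with an AddFields transformation (stream and field names here are hypothetical):
    ```
    retriever:
      partition_router:
        type: SubstreamPartitionRouter
        parent_stream_configs:
          - stream: "#/definitions/parent_stream"
            parent_key: id               # field on the parent record
            partition_field: parent_id   # name it takes inside stream_partition
    transformations:
      - type: AddFields
        fields:
          - path: ["parent_id"]
            value: "{{ stream_partition.parent_id }}"
    ```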
  • Tiémé Togola

    09/08/2025, 8:25 PM
    There appears to be a very recent regression in the Airbyte Cloud UI: it's no longer possible to switch workspaces. There used to be a dropdown selector where my arrow is pointing below. I'm submitting a support ticket, but posting here because sometimes the answer takes a while. Also, in case it helps anyone, my trick for now is to manually enter the workspace ID in the URL to switch.
  • Louis Demet

    09/09/2025, 8:55 AM
    [MySQL → BigQuery replication issue: date fields replaced by NULL (`DESTINATION_SERIALIZATION_ERROR`)]
    Hi everyone, I've been running a MySQL → BigQuery replication with Airbyte for months without issues, but for the past few days I've noticed that date/datetime fields are being progressively replaced with NULL in BigQuery. Example from `_airbyte_meta`:
    ```
    {
      "changes": [
        {
          "change": "NULLED",
          "field": "checkout_completed_at",
          "reason": "DESTINATION_SERIALIZATION_ERROR"
        },
        {
          "change": "NULLED",
          "field": "created_at",
          "reason": "DESTINATION_SERIALIZATION_ERROR"
        },
        {
          "change": "NULLED",
          "field": "updated_at",
          "reason": "DESTINATION_SERIALIZATION_ERROR"
        }
      ],
      "sync_id": 49221008
    }
    ```
    The source values in MySQL are valid (e.g. `2025-07-30 14:22:03`), so it seems Airbyte fails to serialize them properly when inserting into BigQuery. This was working fine for weeks before suddenly starting to fail. Does this ring a bell for anyone? Any known fixes or workarounds? Thanks!
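    While debugging, a hedged BigQuery sketch to gauge how much data is affected (project/dataset/table are placeholders; `_airbyte_meta` and `_airbyte_extracted_at` are the standard Airbyte columns):
    ```
    bq query --use_legacy_sql=false '
      SELECT DATE(_airbyte_extracted_at) AS day, COUNT(*) AS affected_rows
      FROM `my_project.my_dataset.my_table`
      WHERE TO_JSON_STRING(_airbyte_meta) LIKE "%DESTINATION_SERIALIZATION_ERROR%"
      GROUP BY day
      ORDER BY day'
    ```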
  • evris

    09/09/2025, 9:27 AM
    Hey everyone, I have been experiencing issues with the Facebook marketing source since the 5th of September. I get this error:
    ```
    Status:  400
      Response:
        {
          "error": {
            "message": "Invalid parameter",
            "type": "OAuthException",
            "code": 100,
            "error_subcode": 2446289,
            "is_transient": false,
            "error_user_title": "Ad Creative Is Incomplete",
            "error_user_msg": "The reel you selected for your ad is not available. It could be deleted or you might not have permissions to see it. Please check your ad creative and try again.",
            "fbtrace_id": "ADoKNFtcEVNYzmo_bx0iSHg"
          }
        }
    ```
    I have also opened an issue on Github for this: https://github.com/airbytehq/airbyte/issues/66027 Has anyone else experienced this?
  • Vagner Guilherme Figueira Neto

    09/09/2025, 12:26 PM
    Hello, I'm installing Airbyte with abctl on an Ubuntu server in VMware, and I keep getting "Your credentials were correct, but the server failed to set a cookie. You appear to have deployed over HTTP. Make sure you have disabled secure cookies." when opening the machine IP in my browser. I've re-installed more than once with --insecure-cookies as suggested in the docs but still get the error. Has anyone experienced this or know a solution? Thanks!
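    For reference, the flag only takes effect on the install that configures the ingress, and the UI must then be opened over plain http://; a hedged sketch (the IP is a placeholder, 8000 is abctl's default port):
    ```
    abctl local install --insecure-cookies --host 192.0.2.10
    # then browse to http://192.0.2.10:8000 (not https://)
    ```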
  • Kyle Romines

    09/09/2025, 1:55 PM
    Hello, when trying to use the latest CDK version (7.0.1) for a Python connector, it does not seem to parse config variables correctly; they come back blank. The error I am getting:
    ```
    {
      "type": "CONNECTION_STATUS",
      "connectionStatus": {
        "status": "FAILED",
        "message": "\"Encountered an error while checking availability of stream merchant_account_stream. Error: Invalid URL 'https:///merchants/merchant_accounts': No host supplied\""
      }
    }
    ```
    And this is the yaml:
    ```
    url_base: >-
      https://{% if config['environment'] == 'Development' %}localhost:3000{% elif config['environment'] == 'Sandbox' %}api.sandbox.braintreegateway.com:443{% elif config['environment'] == 'Qa' %}gateway.qa.braintreepayments.com:443{% elif config['environment'] == 'Production' %}api.braintreegateway.com:443{% endif %}/merchants/{{ config['merchant_id'] }}
    path: /merchant_accounts
    ```
    Also I am getting the error:
    ```
    {
      "type": "CONNECTION_STATUS",
      "connectionStatus": {
        "status": "FAILED",
        "message": "\"Encountered an error while discovering streams. Error: time data '' does not match format '%Y-%m-%dT%H:%M:%SZ'\""
      }
    }
    ```
    For yaml:
    ```
    incremental_sync:
      type: DatetimeBasedCursor
      cursor_field: created_at
      datetime_format: "%Y-%m-%d %H:%M:%S"
      cursor_datetime_formats:
        - "%Y-%m-%d %H:%M:%S"
        - "%Y-%m-%dT%H:%M:%S"
      start_datetime:
        type: MinMaxDatetime
        datetime: "{{ config['start_date'] }}"
    ```
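    The empty-string error during discover suggests `config['start_date']` is unset when the cursor is evaluated; a hedged tweak is to give the Jinja expression a fallback and pin its parse format (the default date below is arbitrary):
    ```
    start_datetime:
      type: MinMaxDatetime
      datetime: "{{ config['start_date'] or '2020-01-01 00:00:00' }}"
      datetime_format: "%Y-%m-%d %H:%M:%S"
    ```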
  • Zeev Shteiman

    09/09/2025, 2:49 PM
    Hi, for those of you using Airbyte with Snowflake: I have noticed that the incremental mechanism is not very optimal. The main pain point, as I see it, is that when using the merge strategy, the merge is only on the PK, and you can't add additional columns to be used in the join. For large tables this creates very heavy joins, since finding the IDs requires scanning the whole table. Clustering on the ID column may help, but not necessarily. With the INSERT+DELETE strategy the problem is even bigger, as the delete after the insert needs to scan the whole table to find duplicates to delete. Any ideas how you tackle this?
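    One hedged mitigation on the warehouse side, since the merge key is fixed to the PK: cluster the final table on that key so the MERGE/DELETE can prune micro-partitions instead of scanning everything (table name is a placeholder; automatic reclustering has its own credit cost):
    ```
    snowsql -q "ALTER TABLE analytics.orders CLUSTER BY (id);"
    ```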
  • Ivan Barbosa Pinheiro

    09/09/2025, 11:21 PM
    I’m deploying Airbyte on Kubernetes with an external PostgreSQL database and Google Cloud Storage (GCS) for logs. I created the Kubernetes secret with my GCP service account key:
    ```
    kubectl create secret generic gcs-log-creds -n airbyte --from-file=gcp.json=./gcs-key.json
    ```
    Inside the pod, I see the environment variable:
    ```
    GOOGLE_APPLICATION_CREDENTIALS=/secrets/gcs-log-creds/gcp.json
    ```
    But when I describe the pod (`kubectl describe pod`), the secret is not mounted as a volume. The only mounts are `/config` and the default service account token. Because of this, the pod crashes with the error:
    ```
    java.nio.file.NoSuchFileException: /secrets/gcs-log-creds/gcp.json
    ```
    So the problem is: the secret `gcs-log-creds` exists in Kubernetes, but it's not being mounted into the pod at `/secrets/gcs-log-creds/gcp.json`, even though Airbyte expects it there. values.yaml:
    ```
    # --- Global configuration ---
    global:
      edition: community

      # --- External database configuration ---
      database:
        type: external
        secretName: airbyte-config-secrets
        host: 10.54.242.3
        port: 5432
        name: airbyte
        userSecretKey: DATABASE_USER
        passwordSecretKey: DATABASE_PASSWORD

      # --- External storage configuration (GCS) ---
      storage:
        secretName: gcs-log-creds
        type: gcs
        bucket:
          log: airbyte-guaradata-logs-communal-hen
          auditLogging: airbyte-guaradata-logs-communal-hen
          state: airbyte-guaradata-logs-communal-hen
          workloadOutput: airbyte-guaradata-logs-communal-hen
          activityPayload: airbyte-guaradata-logs-communal-hen
        gcs:
          projectId: guaradata
          credentialsJsonPath: /secrets/gcs-log-creds/gcp.json

      # --- Secrets configuration ---
      secretsManager:
        enabled: false

      # --- CRITICAL: scheduling of the job pods (sync, check, discover) ---
      jobs:
        kube:
          nodeSelector:
            workload-type: airbyte-jobs

    # --- Service account configuration ---
    serviceAccount:
      create: "true"
      name: airbyte-admin

    # --- Disable internal services ---
    postgresql:
      enabled: false

    minio:
      enabled: false

    # --- Components (renamed to the official V2 format) ---
    airbyteBootloader:
      podAnnotations:
        "cluster-autoscaler.kubernetes.io/safe-to-evict": "true"
      nodeSelector:
        workload-type: airbyte-services

    webapp:
      nodeSelector:
        workload-type: airbyte-services

    server:
      nodeSelector:
        workload-type: airbyte-services

    worker:
      nodeSelector:
        workload-type: airbyte-services

    temporal:
      nodeSelector:
        workload-type: airbyte-services

    # --- Global podAnnotations (kept from the original setup) ---
    podAnnotations:
      "cluster-autoscaler.kubernetes.io/safe-to-evict": "true"
    ```
    Install Airbyte in the cluster:
    ```
    kubectl create namespace airbyte

    kubectl create serviceaccount airbyte-admin --namespace airbyte

    kubectl create secret generic airbyte-config-secrets --namespace airbyte `
      --from-literal=DATABASE_USER=airbyte `
      --from-literal=DATABASE_PASSWORD=123

    kubectl create secret generic gcs-log-creds --namespace airbyte `
      --from-file=GOOGLE_APPLICATION_CREDENTIALS_JSON=./gcs-key.json `
      --from-file=gcp.json=./gcs-key.json

    helm install airbyte airbyte-v2/airbyte --namespace airbyte --version 2.0.3 -f values.yaml --set global.image.tag=1.7.0
    ```
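    A hedged workaround while the chart wiring is debugged: mount the secret explicitly on the components that read the log path, so `credentialsJsonPath` resolves. This assumes the chart honors `extraVolumes`/`extraVolumeMounts` for these components; verify against your chart version:
    ```
    server:
      extraVolumes:
        - name: gcs-log-creds
          secret:
            secretName: gcs-log-creds
      extraVolumeMounts:
        - name: gcs-log-creds
          mountPath: /secrets/gcs-log-creds
          readOnly: true
    ```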
  • Florent B

    09/10/2025, 9:44 AM
    Hey, glad to see a community around this tool. I don't know if you have a clue about this, but in my company we use the Hubspot connector to append data into BigQuery for further analysis, and week after week we observe that data is missing, even with incremental sync. Has this happened to anyone here? I'd be glad to hear from someone who has experience setting this up right with Hubspot 😉
  • Faisal

    09/10/2025, 3:05 PM
    Has anyone successfully created connections into a Snowflake destination with over 100 streams per connection and run several connections in parallel? For me, not all tables are transferred into the destination, or some connections have 0 records transferred. The source is SQL Server.