# ask-community-for-troubleshooting
  • k

    Kenneth Fernandez

    06/19/2025, 7:58 PM
    Hi all, does anybody have Mixpanel as a source and Snowflake as a destination? I'm running into impossibly long sync times that are keeping the Snowflake warehouse active. Just curious whether anybody has a different setup that either speeds up the sync or keeps the warehouse inactive while it collects the exports from Mixpanel.
    p
    • 2
    • 2
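    One lever on the Snowflake side for the question above, independent of how the sync itself is tuned, is the warehouse auto-suspend timeout, so the warehouse drops back to suspended as soon as loads pause. A minimal sketch, assuming the snowsql CLI and a hypothetical warehouse named AIRBYTE_WH:
    # Suspend after 60 seconds of idle time; resume automatically when Airbyte writes again
    snowsql -q "ALTER WAREHOUSE AIRBYTE_WH SET AUTO_SUSPEND = 60 AUTO_RESUME = TRUE;"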
  • j

    Justin Frye

    06/22/2025, 9:26 PM
    Okay, I battled through the steps here (https://docs.airbyte.com/platform/using-airbyte/getting-started/oss-quickstart) to get the self-hosted version running. However, even after running abctl local install --insecure-cookies, I still get the error
    Copy code
    your credentials were correct, but the server failed to set a cookie. You appear to have deployed over HTTP. Make sure you have disabled secure cookies.
    This is running on a DigitalOcean droplet on Ubuntu 24.04 LTS, installed via the same steps as above. I am currently trying the --no-browser method, but I would ideally still like to use a browser to set up the connectors. This is simply a PoC, so I am not really looking to set up a cert for this. However, if that is what will resolve my issue, I am all ears and would be extremely grateful if someone had steps to do it.
    • 1
    • 1
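    A minimal sketch of one thing to try for the cookie error above, assuming the droplet is reached over plain HTTP by IP or a bare domain, and that the installed abctl version supports the --host flag:
    # Re-run the install, passing the address you actually browse to and keeping secure cookies disabled
    abctl local install --host <droplet-ip-or-domain> --insecure-cookies
    # Retrieve the generated login credentials
    abctl local credentials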
  • g

    Giacomo Chiarella

    06/23/2025, 8:12 AM
    Hi everybody! I've set up a non-Slack URL for Connection Updates and Connection Updates Requiring Action. I've seen in the documentation that these two notifications do not send a payload. I've also heard from kapa.ai that these two notifications do not call any URL other than the Slack webhook. Is that so? I have not found such a statement in the documentation. Is kapa.ai right or not?
  • p

    Pranay Gangaram Deokar

    06/23/2025, 9:42 AM
    @here We observed that Airbyte connections were not syncing. I restarted the pods and the connections started working again, so we need to identify why syncs keep stopping until the pods are restarted. Airbyte is deployed using Helm on EKS; how can we fix this issue permanently?
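    A minimal diagnostic sketch for a stalled Helm deployment like the one above, assuming the release runs in an airbyte namespace and the worker deployment follows the default naming (adjust names to your install):
    # Look for pending, evicted, or OOMKilled pods
    kubectl -n airbyte get pods -o wide
    # Scan recent worker logs for errors around the time syncs stopped (deployment name is an assumption)
    kubectl -n airbyte logs deploy/airbyte-worker --since=2h | grep -iE 'error|exception' | tail -n 50
    # Check cluster events for resource pressure or scheduling problems
    kubectl get events -A --sort-by=.lastTimestamp | tail -n 30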
  • b

    Bernardo Fernandes

    06/23/2025, 11:51 AM
    Hey everyone! I've been running into an issue lately when creating Airbyte connectors via Terraform. For some reason, Terraform gets stuck on a config called STANDARD_WORKSPACE, and every time I want to make changes or create a new connection, it fails with the following error:
    Copy code
    unknown status code returned: Status 500
    │ {"status":500,"type":"<https://reference.airbyte.com/reference/errors>","title":"unexpected-problem","detail":"An
    │ unexpected problem has occurred. If this is an error that needs to be
    │ addressed, please submit a pull request or github
    │ issue.","documentationUrl":null,"data":{"message":"java.util.concurrent.ExecutionException:
    │ io.airbyte.data.exceptions.ConfigNotFoundException: config type:
    │ STANDARD_WORKSPACE id: cc585b57-9873-44db-823e-4bc3ed02180f"}}
    p
    • 2
    • 14
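    The ConfigNotFoundException above suggests the workspace ID referenced in the Terraform provider configuration no longer exists on the server. A quick check, sketched against the instance's internal config API (host and basic-auth credentials are placeholders):
    # List the workspaces the server knows about and compare their IDs with the one in the error
    curl -u <user>:<password> -X POST "http://<airbyte-host>/api/v1/workspaces/list" \
      -H "Content-Type: application/json" -d '{}'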
  • a

    Aneela Saleem

    06/23/2025, 2:20 PM
    Hi, I'm having the same problem as https://discuss.airbyte.io/t/extracting-string-value-from-custom-connector-response-body/5136: the token returned in the response is either XML-based or a plain string (the whole body). Is there any way of extracting it in a custom connector? I tried '$' as the Session Token Path, but it says response['$'] was not found.
  • p

    Pedro Roque

    06/23/2025, 9:45 PM
    Hey everyone! I installed Airbyte using abctl, but now I'm facing memory issues in one of my connections, so I would like to scale Airbyte to use more RAM. I tried creating a values.yaml file
    Copy code
    global:
      jobs:
        resources:
          requests:
            memory: 16Gi
          limits:
            memory: 16Gi
    and ran
    abctl local install --values .airbyte/values.yaml
    but it looks like it didn't work. Is this the correct way to increase memory?
    p
    • 2
    • 1
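    A sketch for double-checking that the values file above was actually applied, assuming the defaults abctl uses (the ~/.airbyte/abctl/abctl.kubeconfig path and the airbyte-abctl namespace); verify both against your install:
    # Re-run the install, pointing at the path where the edited file actually lives
    abctl local install --values ./values.yaml
    # Confirm the job resource env vars that the chart derives from global.jobs.resources
    kubectl --kubeconfig ~/.airbyte/abctl/abctl.kubeconfig -n airbyte-abctl \
      get deploy -o yaml | grep -i 'JOB_MAIN_CONTAINER_MEMORY'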
  • a

    Aaron Robins

    06/24/2025, 3:32 AM
    Has anyone had a response from Customer Support in the last 5 days? I raised a support request 5 days ago and still have no response. Thanks
    u
    • 2
    • 1
  • l

    Louis Gabilly

    06/24/2025, 4:05 PM
    Hi guys, I deployed both Airbyte and Airflow on the same EKS cluster. I managed to trigger a connection sync from the airflow-webserver pod using the following command:
    Copy code
    curl -X POST "<http://airbyte-prod-airbyte-server-svc.airbyte.svc.cluster.local:8001/api/v1/connections/sync>" -H "Accept: application/json" -H "Content-Type: application/json" -d '{"connectionId":"b971a860-254b-4e0a-b4ff-db3e4761f3be"}'
    When I try to run the following command to get the job status:
    Copy code
    curl --request GET \
        --url http://airbyte-prod-airbyte-server-svc.airbyte.svc.cluster.local:8001/v1/jobs/1518 \
        --header 'accept: application/json'
    I get a weird HTML page saying "You need to enable JavaScript to run this app." I have tried many settings (I'm sharing the last one below) to set up an Airbyte connection within the Airflow interface, but both the Airbyte and HTTP connection types keep failing.
    Copy code
    airflow.exceptions.AirflowException: HTTPConnectionPool(host='airbyte-prod-airbyte-server-svc.airbyte.svc.cluster.local', port=80): Max retries exceeded with url: /v1/applications/token (Caused by ConnectTimeoutError(<urllib3.connection.HTTPConnection object at 0x7ff5616a7590>, 'Connection to airbyte-prod-airbyte-server-svc.airbyte.svc.cluster.local timed out. (connect timeout=None)'))
    Can you guys help me see what I am missing?
    u
    • 2
    • 4
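    The HTML response above usually means the request hit the web app rather than the API: the sync call that worked went through /api/v1, while the status call went to /v1. A sketch of the equivalent status request against the internal config API, using the job ID from the message above:
    # The config API is POST-based; fetch job 1518 by ID
    curl -X POST "http://airbyte-prod-airbyte-server-svc.airbyte.svc.cluster.local:8001/api/v1/jobs/get" \
      -H "Accept: application/json" -H "Content-Type: application/json" \
      -d '{"id": 1518}'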
  • a

    Alec Sharp

    06/25/2025, 8:29 AM
    Hey everyone, does anyone know if it's possible to create tables with a custom suffix when using BigQuery as the destination? I ask because BigQuery has a wildcard-table feature that I want to use: https://cloud.google.com/bigquery/docs/querying-wildcard-tables Thank you!
    p
    j
    • 3
    • 5
  • i

    Idan Moradov

    06/25/2025, 10:15 AM
    We're using PyAirbyte, but several days ago the read() function suddenly stopped working. I looked into the releases and saw that Airbyte published a new version. Has anyone else faced this issue?
    p
    • 2
    • 1
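    If a new PyAirbyte release is the suspect, one common mitigation is to pin back to the last version that worked; a minimal sketch (the version placeholder is deliberately left for you to fill in):
    # Check which version is currently installed
    pip show airbyte
    # Pin back to the last version that worked for your pipeline
    pip install --force-reinstall "airbyte==<last-known-good-version>"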
  • m

    Mert Ors

    06/25/2025, 10:16 AM
    Is anyone else having issues with the Facebook Marketing connection?
    u
    • 2
    • 2
  • o

    Oleksandr Riabyi

    06/25/2025, 10:42 AM
    Debugging a custom Airbyte destination connector (Docker image on Apple Silicon). Hello, I've been working on building a custom Airbyte destination connector and ran into an issue. Here's what I did:
    1. I cloned the Airbyte repository.
    2. Built the MySQL destination connector with the following command:
    Copy code
    ./gradlew :airbyte-integrations:connectors:destination-mysql:build -x test -x integrationTestJava
    3. This created the image airbyte/destination-mysql:dev. I then tagged and pushed it to my public Docker Hub account:
    Copy code
    docker tag df5787833601 olria97/testmysql_3:latest
    docker push olria97/testmysql_3:latest
    4. However, when I try to add this connector as a destination in the Airbyte UI, I receive the following error:
    Copy code
    An unexpected error occurred. Please report this if the issue persists. (HTTP 500)
    I suspect it might be related to the image architecture or compatibility. Here's some info about my setup:
    • Python: 3.11.9
    • Java: OpenJDK 21 (JAVA_HOME=/opt/homebrew/opt/openjdk@21)
    • Architecture: arm64
    • CPU: Apple M3
    I also tried passing platform-specific flags during the build:
    Copy code
    ./gradlew ... -Dos.arch=amd64 -Dos.name=Linux
    …but that didn’t seem to help. Can anyone help me understand what might be going wrong? Should I be building the image differently to support Airbyte on this setup? Thanks in advance!
    p
    • 2
    • 15
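    Since the image above was built on an M3 and then re-tagged, it is worth confirming what architecture actually got pushed; a sketch using standard Docker tooling and the image names from the message:
    # Check the architecture of the locally built image
    docker image inspect airbyte/destination-mysql:dev --format '{{.Os}}/{{.Architecture}}'
    # Check what the registry actually holds
    docker buildx imagetools inspect olria97/testmysql_3:latest
    If these report arm64 only while the Airbyte instance runs on amd64 hosts, that mismatch alone can surface as an HTTP 500 when the platform tries to use the image.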
  • m

    Mert Karabulut

    06/25/2025, 12:00 PM
    Anyone experiencing an issue with the Facebook Marketing connector today? It seems like it's pulling a lot fewer rows than usual
    m
    p
    u
    • 4
    • 14
  • j

    Jacob Batt

    06/25/2025, 2:35 PM
    Hi everyone, ever since Salesforce released their new edition, the Airbyte connector has not been working for us due to a "scopes not allowed" permission error. We are running Airbyte OSS. Has anyone run into this before, and if so, how did you solve it?
  • j

    Jason Friedman

    06/25/2025, 4:05 PM
    Hi, we are running an older version of Airbyte (0.63.11; we plan to upgrade shortly) and see only SFTP-JSON as an available SFTP flavor for a destination. Will the latest version of Airbyte offer the other flavors of SFTP?
    • SFTP
    • SFTP Bulk
    u
    • 2
    • 1
  • m

    Mohd Asad

    06/25/2025, 7:58 PM
    I deployed Airbyte using the Helm chart:
    Copy code
    helm repo add airbyte https://airbytehq.github.io/helm-charts
    helm install my-airbyte airbyte/airbyte --version 1.7.0
    The core components are running fine. However, when I create a source and destination and trigger a sync, a new replication job pod is created. This pod includes three containers (source, destination, and orchestrator) and requests a total of 4 CPUs, which is too high for my environment. I attempted to reduce the CPU and memory usage by setting the following values in my values.yaml:
    Copy code
    global:
      jobs:
        resources:
          requests:
            cpu: 250m
            memory: 256Mi
          limits:
            cpu: 500m
            memory: 512Mi
    I also tried setting these environment variables:
    Copy code
    JOB_MAIN_CONTAINER_CPU_REQUEST  
    JOB_MAIN_CONTAINER_CPU_LIMIT  
    JOB_MAIN_CONTAINER_MEMORY_REQUEST  
    JOB_MAIN_CONTAINER_MEMORY_LIMIT
    Despite these changes, the replication job pods are still requesting 4 CPUs. I’m looking for a reliable way to reduce their resource requests to around 1.5 to 2 CPUs in total.
    j
    • 2
    • 2
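    For the Helm question above, a sketch that applies the same keys from the values.yaml shown via --set and then checks what a replication pod actually requests (the pod name is a placeholder):
    helm upgrade my-airbyte airbyte/airbyte --version 1.7.0 \
      --set global.jobs.resources.requests.cpu=250m \
      --set global.jobs.resources.limits.cpu=500m \
      --set global.jobs.resources.requests.memory=256Mi \
      --set global.jobs.resources.limits.memory=512Mi
    # Once the next sync starts, confirm the per-container requests on the job pod
    kubectl get pod <replication-job-pod> \
      -o jsonpath='{range .spec.containers[*]}{.name}: {.resources.requests.cpu}{"\n"}{end}'
    If the totals still do not change, connector-specific resource requirements can override these global defaults, so they are worth checking as well.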
  • j

    Jordan Lee

    06/26/2025, 4:42 AM
    What determines when "extracted" vs "loaded" values are shown here? It looks like the streams are doing nothing; however, in the mini stats graph in the top right where it says "Records loaded", that number is actually increasing, despite the "Latest sync" column not changing values for extended periods of time.
    • 1
    • 1
  • p

    Phat Luu Huynh

    06/26/2025, 7:00 AM
    Hi team, I'm using the SDK and got this error while trying to init the Airbyte client using SchemeBasicAuth:
    Copy code
    DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): airbyte.staging.data-engineering.myteksi.net:443
    send: b'GET /api/v1/health HTTP/1.1\r\nHost: <valid-host>\r\nuser-agent: speakeasy-sdk/python 0.52.2 2.474.15 1.0.0 airbyte-api\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-alive\r\n\r\n'
    reply: 'HTTP/1.1 401 Unauthorized\r\n'
    header: Content-Type: text/html
    header: Date: Thu, 26 Jun 2025 05:59:26 GMT
    header: WWW-Authenticate: Basic realm=""
    header: Content-Length: 172
    header: Connection: keep-alive
    DEBUG:urllib3.connectionpool:<valid-host> "GET /api/v1/health HTTP/1.1" 401 172
    Error: API error occurred: Status 401
    <html>
    <head><title>401 Authorization Required</title></head>
    <body>
    here's my code:
    Copy code
    client = airbyte_api.AirbyteAPI(
        server_url=Config.AIRBYTE_SERVER_URL,
        security=models.Security(
            basic_auth=models.SchemeBasicAuth(
                username=Config.AIRBYTE_USERNAME,
                password=Config.AIRBYTE_PASSWORD
            )
        )
    )
    
    try:
        health = client.health.get_health_check()
        print(health)
    except errors.SDKError as e:
        print(f"Error: {e}")
    Assume that all my credentials and configs are valid (I've already tested them in Postman and it responded successfully). I asked Cursor to diagnose the error and it said there's a problem with the SDK's Basic Authentication. Did I miss any important steps? Could you kindly advise? TYSM! Versions: • python: 3.10.13 • airbyte-api: 0.52.2
    u
    • 2
    • 1
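    The 401 above comes back as text/html with a Basic realm, and the debug output shows no Authorization header being sent, which often points at a proxy in front of the server rejecting the request before it reaches Airbyte. A sketch for isolating that outside the SDK with the same credentials (the env var names are placeholders):
    # Reproduce the SDK's health call with explicit basic auth
    curl -v -u "$AIRBYTE_USERNAME:$AIRBYTE_PASSWORD" \
      -H "Accept: application/json" \
      "$AIRBYTE_SERVER_URL/api/v1/health"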
  • d

    Durim Gashi

    06/26/2025, 9:18 AM
    Hey everyone, we are experiencing the following issue today with our Postgres -> Redshift connector. I am not able to find more details on the issue. Any help would be appreciated:
    Copy code
    2025-06-26 11:13:54 replication-orchestrator INFO Failures: [ {
      "failureOrigin" : "source",
      "internalMessage" : "Source process exited with non-zero exit code 1",
      "externalMessage" : "Something went wrong within the source connector",
      "metadata" : {
        "attemptNumber" : 4,
        "jobId" : 40576074,
        "connector_command" : "read"
      },
      "stacktrace" : "io.airbyte.workers.internal.exception.SourceException: Source process exited with non-zero exit code 1\n\tat io.airbyte.container.orchestrator.worker.SourceReader.run(ReplicationTask.kt:209)\n\tat io.airbyte.container.orchestrator.worker.SourceReader$run$1.invokeSuspend(ReplicationTask.kt)\n\tat kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)\n\tat kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:100)\n\tat io.micrometer.core.instrument.composite.CompositeTimer.record(CompositeTimer.java:141)\n\tat io.micrometer.core.instrument.Timer.lambda$wrap$2(Timer.java:199)\n\tat datadog.trace.bootstrap.instrumentation.java.concurrent.Wrapper.run(Wrapper.java:47)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)\n\tat java.base/java.lang.Thread.run(Thread.java:1583)\n",
      "timestamp" : 1750929234687
    } ]
    u
    • 2
    • 3
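    The orchestrator output above only says the source exited non-zero; the actual cause is usually in the source container's own log. A sketch for pulling it, assuming the job pods run in the airbyte namespace and use a container literally named source, as described elsewhere in this channel:
    # Find the replication job pod for job 40576074, then read the source container's log
    kubectl -n airbyte get pods | grep -i replication
    kubectl -n airbyte logs <replication-job-pod> -c source --tail=200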
  • a

    Arun Daniel

    06/26/2025, 10:19 AM
    I'm new to Airbyte. I'm trying to run Airbyte on AWS EKS and it works with the ALB. I want to put authentication in front of it so it's not open to anyone who hits the site. I read that Cognito doesn't work, nor do the usual auth settings in the values.yaml file. I heard of and tried Keycloak, but it's just not working. Has anyone else come across a solution that puts an auth layer in front of the Airbyte site on AWS EKS?
    d
    • 2
    • 1
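    One pattern worth considering for the setup above is to enable authentication on the ALB itself, via the AWS Load Balancer Controller's Ingress annotations, rather than inside Airbyte. A heavily hedged sketch using Cognito-backed auth; the ingress name, user pool ARN, client ID, and domain are placeholders, and the exact annotation field names should be verified against the controller's documentation for your version:
    kubectl -n airbyte annotate ingress <airbyte-ingress> \
      alb.ingress.kubernetes.io/auth-type=cognito \
      alb.ingress.kubernetes.io/auth-scope=openid \
      alb.ingress.kubernetes.io/auth-idp-cognito='{"userPoolARN":"<pool-arn>","userPoolClientID":"<client-id>","userPoolDomain":"<domain-prefix>"}'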
  • x

    Xavier Van Ausloos

    06/26/2025, 1:29 PM
    Hi team, I cannot use the API to get all workspaces:
    http://localhost:8081/api/v1/workspaces/list
    I get a 404 Not Found error. The API works well for getting all connections (with basic auth):
    http://localhost:8081/api/v1/connections/list
    I am using Airbyte 1.7.1 deployed via Helm. @kapa.ai any idea?
    u
    p
    • 3
    • 2
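    For the 404 above, one thing worth checking is the HTTP method: the internal config API generally expects POST with a JSON body for its list endpoints, and a GET to the same path can come back as not found depending on routing. A sketch (credentials are placeholders):
    curl -u <user>:<password> -X POST "http://localhost:8081/api/v1/workspaces/list" \
      -H "Content-Type: application/json" -d '{}'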
  • s

    Stockton Fisher

    06/26/2025, 3:00 PM
    I upgraded to the BigQuery destination v3.0.0 and now a sync won't complete. Warning from the destination: Syntax error: Unexpected "9578ed2f_df7c_4757_addb_1c535c789473" at [1:36] (9578ed2f_df7c_4757_addb_1c535c789473 is the name of the namespace). Any suggestions?
    u
    j
    • 3
    • 4
  • t

    Tom Holder

    06/26/2025, 5:04 PM
    I've got a connection running from Salesforce to CSV on S3. One run is generating a 138 MB file called 2025_06_25_1750886136152_0.csv and then another file, 2025_06_25_1750886136152_1.csv. I've never seen it generate multiple files before. What causes this? Presumably the number of rows? Is this a setting somewhere?
    u
    • 2
    • 2
  • p

    Prashanth Mohan

    06/26/2025, 6:19 PM
    Hello! I'm running Airbyte 1.5.0 through the Helm chart in a cluster that requires proxy environment variables. I've been able to set these variables on the normal deployments (worker, server, etc.), but can't seem to get them onto the spawned job pods for connectors. I saw this discussion that mentioned setting CONTAINER_ORCHESTRATOR_ENABLED to false would allow environment variables with the JOB_DEFAULT_ENV prefix to be passed on to spawned jobs, but this strategy did not work. Is there another way that was introduced to accomplish this? (If it comes down to it, maybe we can build custom Docker images with the environment variables, but it would be nice to have a way to do this via the Helm chart directly.)
  • k

    Kanchal Karale

    06/27/2025, 10:14 AM
    Hi all, is Zoho CRM working fine for anyone?
  • u

    Usman Pasha

    06/27/2025, 10:22 AM
    I am getting the following error: "Saved offset is not valid. Please reset the connection, and then increase oplog retention and/or increase sync frequency to prevent this from happening in the future. See https://docs.airbyte.com/integrations/sources/mongodb-v2#mongodb-oplog-and-change-streams for more details", even though I have set the oplog retention to 72 hours and the sync frequency to 5 minutes. Even then, I am getting the above error.
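    For the oplog error above, it can help to confirm the window the oplog actually covers rather than the configured retention, since a busy cluster can churn through 72 hours of nominal retention much faster; a sketch with mongosh (the connection string is a placeholder):
    # Print the configured oplog size and the time window it currently spans
    mongosh "<connection-string>" --eval "rs.printReplicationInfo()"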
  • j

    Justin Beasley

    06/27/2025, 2:11 PM
    Bumping this comment since I put it in an older thread . . .
    u
    • 2
    • 2
  • h

    Hảo Phan

    06/27/2025, 3:50 PM
    Hi guys, I deployed Airbyte following the GitHub document, but I only see the 2 sync mode options shown below. Slack Conversation
  • h

    Hảo Phan

    06/27/2025, 3:50 PM
    The Airbyte documentation lists 5 sync modes. I don't understand why my Airbyte only has 2. Please help me... Slack Conversation
    j
    • 2
    • 6