# ask-community-for-troubleshooting

    Xavier LAI

    06/18/2025, 9:31 AM
    Hi All, Using Custom Connector on Airbyte CLoud, Is Anyone experiencing Base URL issue when paginate over a next link ? Used to work but since I updated today by using the Base URL in config instead of absolute path the pagination doesn't work anymore because base url of next link is malformed. Weird because I used @odata.nextLink that is the absolute path When I get back to my previous version it's working perfectly, but when I create a new version exactly as same as the working one it fails (might be a airbyte 1.7 version issue)

    Sérgio Marçal

    06/18/2025, 9:34 AM
    Hi all, Yesterday I submitted a ticket regarding the Monday.com connector, specifically what appears to be an issue related to pagination and concurrent requests: 🔗 https://github.com/airbytehq/airbyte/issues/61635 Has anyone encountered this before or have any insights on what might be causing it? Thanks in advance!

    David Rubio Piqueras

    06/18/2025, 9:37 AM
Hi community, I'm trying to understand in detail how Airbyte pipeline execution works internally for the MixMax connector (it's mainly an API call). I have configured a pipeline in incremental append + dedupe mode, with the incremental load based on a createdAt timestamp. My assumption is that the first run reads the whole table and inserts the data into the target, and that a second run immediately afterwards should load 0 rows from the source. However, it reads the same amount of data. What am I missing here? I'd appreciate any help. Thanks!
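A general note that may explain this (not specific to MixMax): a source only reads fewer rows on the second run if the stream defines a cursor and the request actually filters on it; if the API endpoint cannot filter by createdAt, every sync re-reads everything and the dedupe only happens in the destination. In the declarative CDK an incremental stream looks roughly like this, where the createdAfter parameter is a hypothetical filter the API would need to support:

```yaml
incremental_sync:
  type: DatetimeBasedCursor
  cursor_field: createdAt
  datetime_format: "%Y-%m-%dT%H:%M:%SZ"
  start_datetime: "{{ config['start_date'] }}"
  start_time_option:
    type: RequestOption
    # hypothetical query parameter; the API must support server-side filtering
    field_name: createdAfter
    inject_into: request_parameter
```

Without the start_time_option (or an equivalent server-side filter), identical read volumes on back-to-back runs are the expected behavior.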

    George Polichronides

    06/18/2025, 10:17 AM
Hi guys, is anyone using the provided Bing Ads connector on Airbyte Cloud? We transfer data from Microsoft Ads to BigQuery daily. Recently there have been failures with the error: "Failure in source: Unauthorized. Please ensure you are authenticated correctly." After removing a few tables from the schema it works normally, then after a few days the error comes back, and so on. For the past few days it was running on only one table (campaign performance daily), and today we hit the same error again, so it's no longer usable. Thanks for your advice!

    Cenk Batman

    06/18/2025, 8:32 PM
Hello, I use the self-managed Community edition installed through Helm. I started with version 1.6.2 and then upgraded to 1.7.0, but I have this issue in both versions. If I don't modify airbyte-auth-secrets to set a custom password, everything works fine. However, if I set my own secrets, the workload launcher starts throwing errors about dataplane-client-id missing from airbyte-auth-secrets. There are GitHub issues open for this problem, but they mention it happened while upgrading to 1.6.0 and that upgrading to 1.6.1 fixed it for them. Here I am on 1.7.0, still having the same problem. To be honest, if I could set a custom secret for just the user ID and password, I wouldn't need to override the automatically created dataplane-related secrets in the first place, but I guess that isn't possible either. Any suggestions?
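In case it helps anyone else: the error suggests that once airbyte-auth-secrets is user-managed, the chart stops generating the dataplane keys, so the secret has to carry all of them. A sketch of what that might look like; the key names are taken from the launcher's error message and the auto-generated secret, so double-check them against what your chart version actually creates:

```yaml
apiVersion: v1
kind: Secret
metadata:
  name: airbyte-auth-secrets
  namespace: airbyte
type: Opaque
stringData:
  instance-admin-password: "<your-password>"
  instance-admin-client-id: "<uuid>"
  instance-admin-client-secret: "<random-string>"
  # dataplane keys the workload launcher also expects to find
  dataplane-client-id: "<uuid>"
  dataplane-client-secret: "<random-string>"
  jwt-signature-secret: "<random-string>"
```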

    Soumya B

    06/19/2025, 6:34 AM
@kapa.ai Hi, we are facing an SSL certificate issue when trying to fetch data from the SAP API.

    Santhosh Chandrasekharan

    06/19/2025, 7:16 AM
Hi, I am new to Airbyte and have set up Airbyte locally via abctl. We have built a custom connector using the Airbyte CDK and created a Docker image for it. We want to establish a connection by creating a source and destination programmatically, to test that data is transferred from source to destination in my local setup. I am making some Airbyte API calls for this but see the error below. Let me know if it's a setup issue, and please help me resolve it.
```shell
curl -X GET "http://localhost:8000/api/v1/workspaces" -H "Content-Type: application/json" -H "Authorization: Bearer <Token>" | jq .
```
    
    Response:
```json
    {
      "message": "Forbidden",
      "_links": {
        "self": {
          "href": "/api/v1/workspaces",
          "templated": false
        }
      },
      "_embedded": {
        "errors": [
          {
            "message": "Forbidden",
            "_links": {},
            "_embedded": {}
          }
        ]
      }
}
```
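A 403 here usually means the bearer token is not valid for the endpoint being called. One approach worth trying with an abctl install (a sketch; the public API lives under /api/public/v1, and the token endpoint path may differ slightly between versions) is to mint a fresh access token from the client credentials that abctl prints:

```shell
# print the client-id / client-secret for the local instance
abctl local credentials

# exchange them for a short-lived access token
TOKEN=$(curl -s -X POST "http://localhost:8000/api/public/v1/applications/token" \
  -H "Content-Type: application/json" \
  -d '{"client_id": "<client-id>", "client_secret": "<client-secret>"}' | jq -r .access_token)

# list workspaces with the fresh token
curl -s "http://localhost:8000/api/public/v1/workspaces" \
  -H "Authorization: Bearer $TOKEN" | jq .
```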

    Paul

    06/19/2025, 7:32 AM
    Can anyone confirm if the Airbyte provider for Airflow is currently working on the new 3.0.2 version? I am getting some very strange errors

    Euan Blackledge

    06/19/2025, 9:23 AM
Hey folks. This isn't a request for help but rather an answer for those struggling to reduce the log levels of the workload pods. I tried setting `JAVA_OPTS` and `LOG_LEVEL` to `ERROR` everywhere with no luck. I noticed that the workload pods always unset the `JAVA_OPTS` env var, meaning the value I was setting was disregarded. In the end I set up Kyverno in our cluster and used it to inject the env vars into all pods. This shouldn't be necessary, and if anyone knows a better way, let me know. Regardless, it works and reduced our hourly log production by over 3 million hits in Kibana.
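For anyone wanting to replicate this, the Kyverno side can be a single mutate policy; a sketch, where the namespace selector and the LOG_LEVEL value are assumptions to adapt to your cluster:

```yaml
apiVersion: kyverno.io/v1
kind: ClusterPolicy
metadata:
  name: inject-log-level
spec:
  rules:
    - name: add-log-env
      match:
        any:
          - resources:
              kinds: [Pod]
              namespaces: [airbyte]
      mutate:
        patchStrategicMerge:
          spec:
            containers:
              # the (name) anchor applies the patch to every container
              - (name): "*"
                env:
                  - name: LOG_LEVEL
                    value: ERROR
```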

    Guilherme Banhudo

    06/19/2025, 11:17 AM
Hey team! I have been exploring the idea of using Airbyte instead of dlt. A critical topic is cost. What I've been thinking is to keep the server / API (to add connections, test them, etc.) in an always-on cluster, and to run the main components (webui, temporal service, worker, workload API, workload launcher, cron and bootloader) in an environment that is turned on or off via Airflow. I was looking at `abctl`, but as far as I can see, the whole thing is bundled together. Has anyone tried this architecture?

    Justin Rea

    06/19/2025, 1:52 PM
When I upgrade to Airbyte 1.7, the Dagster integration no longer works with the open-source version of Airbyte. I get a 403 Forbidden error for /api/v1/workspaces/list. If I downgrade to 1.6.3, it works.

    Santhosh Chandrasekharan

    06/19/2025, 3:00 PM
Hi, when I create a source for a custom connector locally, I receive the response below:
```
Payload URL: http://localhost:8000/api/public/v1/sources
🔧 Payload: {'name': 'Reddit Ads Test Source', 'workspaceId': '<workspace_id>', 'configuration': {'client_id': '<client_id>', 'client_secret': '<client_secret>', 'refresh_token': '<refresh_token>', 'ad_account_id': '<account_id>', 'page_size': 1000}}
📡 Response Status: 422
❌ Source creation failed: 422
📝 Response: {"status":422,"type":"https://reference.airbyte.com/reference/errors#unprocessable-entity","title":"unprocessable-entity","detail":"The body of the request was not understood","documentationUrl":null,"data":null}
```
My payload conforms to the structure in the documentation as far as I can tell, yet the response says the body was not understood (with only null data and documentationUrl in the error). Am I missing something?
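One possibility, based on the public API schema rather than anything verified against this setup: for a custom connector, the create-source body may also need the connector's definitionId, since without it the API tries to infer a standard connector type from the configuration and rejects the request with a 422. Roughly:

```shell
curl -s -X POST "http://localhost:8000/api/public/v1/sources" \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Reddit Ads Test Source",
    "workspaceId": "<workspace_id>",
    "definitionId": "<custom-connector-definition-id>",
    "configuration": {"client_id": "<client_id>", "client_secret": "<client_secret>"}
  }'
```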

    Kenneth Fernandez

    06/19/2025, 7:58 PM
Hi all, does anybody have Mixpanel as a source with Snowflake as a destination? I'm running into impossibly long sync times that keep the Snowflake warehouse active. Just curious whether anybody has a different setup that either speeds up the sync or keeps the warehouse inactive while Airbyte collects the exports from Mixpanel.

    Justin Frye

    06/22/2025, 9:26 PM
Okay, I battled with following the steps here (https://docs.airbyte.com/platform/using-airbyte/getting-started/oss-quickstart) to get the self-hosted version running. However, even after running `abctl local install --insecure-cookies`, I still get the error:
```
your credentials were correct, but the server failed to set a cookie. You appear to have deployed over HTTP. Make sure you have disabled secure cookies.
```
This is running on a DigitalOcean droplet on Ubuntu 24.04 LTS, installed via the same steps as above. I am currently trying the --no-browser method, but I would ideally like to still use a browser to set up the connectors. This is simply a PoC, so I am not really looking to set up a cert for it. However, if that would resolve my issue, I am all ears and would be extremely grateful if someone had steps to do it.

    Giacomo Chiarella

    06/23/2025, 8:12 AM
Hi everybody! I've set up a non-Slack URL for the Connection Updates and Connection Updates Requiring Action notifications. I saw in the documentation that these two notifications do not send a payload. I've also heard from kapa.ai that these two notifications do not call any URL other than a Slack webhook. Is that so? I have not found such a statement in the documentation. Is kapa.ai right or not?

    Pranay Gangaram Deokar

    06/23/2025, 9:42 AM
@here We observed that Airbyte connections were not syncing. I restarted the pods and the connections started working again, so we need to identify why syncs stop until the pods are restarted. Airbyte is deployed using Helm on EKS; how can we fix this issue permanently?

    Bernardo Fernandes

    06/23/2025, 11:51 AM
Hey everyone! I've been running into an issue lately when creating Airbyte connectors via Terraform. For some reason, Terraform is stuck on a config called STANDARD_WORKSPACE, and every time I want to make changes or create a new connection, it fails with the following error:
```
unknown status code returned: Status 500
{"status":500,"type":"https://reference.airbyte.com/reference/errors","title":"unexpected-problem","detail":"An unexpected problem has occurred. If this is an error that needs to be addressed, please submit a pull request or github issue.","documentationUrl":null,"data":{"message":"java.util.concurrent.ExecutionException: io.airbyte.data.exceptions.ConfigNotFoundException: config type: STANDARD_WORKSPACE id: cc585b57-9873-44db-823e-4bc3ed02180f"}}
```

    Aneela Saleem

    06/23/2025, 2:20 PM
Hi, I'm having the same problem as https://discuss.airbyte.io/t/extracting-string-value-from-custom-connector-response-body/5136: the token returned in the response is either XML or a plain string (the whole body). Is there any way to extract it in a custom connector? I tried '$' as the Session Token Path, but it says response['$'] not found.
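If the login response is XML, one thing that may help (a sketch of the declarative CDK's SessionTokenAuthenticator; where the decoder option lives can vary by CDK version, and the token element name here is hypothetical) is decoding the body before extracting the token, so session_token_path has a mapping to index into:

```yaml
authenticator:
  type: SessionTokenAuthenticator
  login_requester:
    type: HttpRequester
    url_base: "https://example.com"
    path: /login
    http_method: POST
  # parse the XML login response into a mapping before token extraction
  decoder:
    type: XmlDecoder
  # 'token' is a hypothetical element name in the XML response
  session_token_path: ["token"]
  request_authentication:
    type: ApiKey
    inject_into:
      type: RequestOption
      field_name: X-Session-Token
      inject_into: header
```

A plain-string body has no key for a path to point at, so it would still need a decoder (or a custom component) that turns the body into a mapping first.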

    Pedro Roque

    06/23/2025, 9:45 PM
Hey everyone! I installed Airbyte using abctl, but now I'm facing memory issues in one of my connections, so I would like to scale Airbyte to use more RAM. I tried creating a values.yaml file:
```yaml
global:
  jobs:
    resources:
      requests:
        memory: 16Gi
      limits:
        memory: 16Gi
```
and ran `abctl local install --values .airbyte/values.yaml`, but it looks like it didn't work. Is this the correct way to increase memory?
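A quick way to verify whether the values took effect, assuming the default abctl kind cluster whose kubeconfig lives under ~/.airbyte/abctl, is to read the requests off a recent replication job pod:

```shell
# list pods with the memory each container requests
kubectl --kubeconfig ~/.airbyte/abctl/abctl.kubeconfig -n airbyte-abctl get pods \
  -o jsonpath='{range .items[*]}{.metadata.name}{": "}{.spec.containers[*].resources.requests.memory}{"\n"}{end}'
```

If the job pods still show the defaults after a fresh sync, the values file likely isn't reaching the chart, which at least narrows down where to look.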

    Aaron Robins

    06/24/2025, 3:32 AM
Has anyone had a response from Customer Support in the last 5 days? I raised a support request 5 days ago and still have no response. Thanks

    Louis Gabilly

    06/24/2025, 4:05 PM
Hi guys, so I deployed both Airbyte and Airflow on the same EKS cluster. I managed to trigger a job sync from the airflow-webserver pod using the following command:
```shell
curl -X POST "http://airbyte-prod-airbyte-server-svc.airbyte.svc.cluster.local:8001/api/v1/connections/sync" -H "Accept: application/json" -H "Content-Type: application/json" -d '{"connectionId":"b971a860-254b-4e0a-b4ff-db3e4761f3be"}'
```
When I try to run the following command to get the job status:
```shell
curl --request GET \
    --url http://airbyte-prod-airbyte-server-svc.airbyte.svc.cluster.local:8001/v1/jobs/1518 \
    --header 'accept: application/json'
```
I get a weird HTML page saying "You need to enable JavaScript to run this app." I have tried many settings (I'm sharing the latest one below) to set up an Airbyte connection within the Airflow interface, but both the Airbyte and HTTP connection types keep failing:
```
airflow.exceptions.AirflowException: HTTPConnectionPool(host='airbyte-prod-airbyte-server-svc.airbyte.svc.cluster.local', port=80): Max retries exceeded with url: /v1/applications/token (Caused by ConnectTimeoutError(<urllib3.connection.HTTPConnection object at 0x7ff5616a7590>, 'Connection to airbyte-prod-airbyte-server-svc.airbyte.svc.cluster.local timed out. (connect timeout=None)'))
```
Can you guys help me see what I am missing?
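A guess at both symptoms, based on the paths in the errors rather than anything verified on this cluster: /v1/jobs/... without an /api prefix is served by the webapp, which is why HTML comes back, and the Airflow exception shows the connection defaulting to port 80, so the Airbyte connection is probably missing the server port. The job-status call would look roughly like:

```shell
# jobs are served under the public API prefix on the server service
curl --request GET \
  --url "http://airbyte-prod-airbyte-server-svc.airbyte.svc.cluster.local:8001/api/public/v1/jobs/1518" \
  --header "accept: application/json" \
  --header "Authorization: Bearer $TOKEN"
```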

    Alec Sharp

    06/25/2025, 8:29 AM
Hey everyone, does anyone know if it's possible to create tables with a custom suffix when using BigQuery as the destination? I ask because BigQuery has a wildcard feature that I want to use: https://cloud.google.com/bigquery/docs/querying-wildcard-tables Thank you!

    Idan Moradov

    06/25/2025, 10:15 AM
We're using PyAirbyte, but several days ago the read() function suddenly stopped working. I looked into the releases and saw that Airbyte published a new version. Has anyone else faced this issue?

    Mert Ors

    06/25/2025, 10:16 AM
Is anyone else having issues with the Facebook Marketing connection?

    Oleksandr Riabyi

    06/25/2025, 10:42 AM
Debugging a Custom Airbyte Destination Connector (Docker Image on Apple Silicon). Hello, I've been working on building a custom Airbyte destination connector and ran into an issue. Here's what I did: 1. I cloned the Airbyte repository. 2. Built the MySQL destination connector with the following command:
```shell
./gradlew :airbyte-integrations:connectors:destination-mysql:build -x test -x integrationTestJava
```
3. This created the image `airbyte/destination-mysql:dev`. I then tagged and pushed it to my public Docker Hub account:
```shell
docker tag df5787833601 olria97/testmysql_3:latest
docker push olria97/testmysql_3:latest
```
    4. However, when I try to add this connector as a destination in the Airbyte UI, I receive the following error:
```
An unexpected error occurred. Please report this if the issue persists. (HTTP 500)
```
I suspect it might be related to the image architecture or compatibility. Some info about my setup:
• Python: 3.11.9
• Java: OpenJDK 21 (`JAVA_HOME=/opt/homebrew/opt/openjdk@21`)
• Architecture: arm64
• CPU: Apple M3
I also tried passing platform-specific flags during the build:
```shell
./gradlew ... -Dos.arch=amd64 -Dos.name=Linux
```
    …but that didn’t seem to help. Can anyone help me understand what might be going wrong? Should I be building the image differently to support Airbyte on this setup? Thanks in advance!
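One avenue worth checking: the Gradle -Dos.arch flags don't change the platform Docker builds for, so the pushed image may still be arm64. A sketch of checking and rebuilding for amd64 with buildx; the build context path is an assumption, since Java connectors may be built through Gradle rather than a plain Dockerfile:

```shell
# check what platform the local image was actually built for
docker image inspect airbyte/destination-mysql:dev --format '{{.Os}}/{{.Architecture}}'

# rebuild and push an amd64 image with buildx
docker buildx build --platform linux/amd64 \
  -t olria97/testmysql_3:latest \
  --push airbyte-integrations/connectors/destination-mysql
```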

    Mert Karabulut

    06/25/2025, 12:00 PM
Anyone experiencing an issue with the Facebook Marketing connector today? It seems like it's pulling a lot fewer rows than usual.

    Jacob Batt

    06/25/2025, 2:35 PM
Hi everyone, ever since Salesforce released their new edition, the Airbyte connector has not been working for us due to a "not allowed scopes" permission error. We are running Airbyte OSS. Has anyone run into this before, and if so, how did you solve it?

    Jason Friedman

    06/25/2025, 4:05 PM
Hi, we are running an older version of Airbyte (0.63.11, and we plan to upgrade shortly) and see only SFTP-JSON as an available SFTP flavor for a destination. Will the latest version of Airbyte offer the other flavors of SFTP?
• SFTP
• SFTP Bulk

    Mohd Asad

    06/25/2025, 7:58 PM
    I deployed Airbyte using the Helm chart:
```shell
helm repo add airbyte https://airbytehq.github.io/helm-charts
helm install my-airbyte airbyte/airbyte --version 1.7.0
```
The core components are running fine. However, when I create a source and destination and trigger a sync, a new replication job pod is created. This pod includes three containers (`source`, `destination`, and `orchestrator`) and requests a total of 4 CPUs, which is too high for my environment. I attempted to reduce the CPU and memory usage by setting the following values in my `values.yaml`:
```yaml
global:
  jobs:
    resources:
      requests:
        cpu: 250m
        memory: 256Mi
      limits:
        cpu: 500m
        memory: 512Mi
```
    I also tried setting these environment variables:
```
JOB_MAIN_CONTAINER_CPU_REQUEST
JOB_MAIN_CONTAINER_CPU_LIMIT
JOB_MAIN_CONTAINER_MEMORY_REQUEST
JOB_MAIN_CONTAINER_MEMORY_LIMIT
```
    Despite these changes, the replication job pods are still requesting 4 CPUs. I’m looking for a reliable way to reduce their resource requests to around 1.5 to 2 CPUs in total.
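One detail worth noting when sizing these (an observation about how the values apply, not verified against this exact chart version): global.jobs.resources is applied per container, and a replication pod runs three of them, so totals are roughly 3x whatever is set. Values in this shape would land around the 1.5 CPU target:

```yaml
global:
  jobs:
    resources:
      requests:
        cpu: 500m      # x3 containers (source, destination, orchestrator) = 1.5 CPUs
        memory: 512Mi
      limits:
        cpu: 500m
        memory: 1Gi
```

If the job pods still request 4 CPUs after a fresh sync, the values key may not be reaching the chart at all, which would also explain why the JOB_MAIN_CONTAINER_* variables appeared to have no effect.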

    Jordan Lee

    06/26/2025, 4:42 AM
What determines when "extracted" vs "loaded" values are shown here? It looks like the streams are doing nothing; however, in the mini stats graph on the top right where it says "Records loaded", that number is actually increasing, despite the "latest sync" column not changing values for extended periods of time.