# ask-community-for-troubleshooting
  • i

    Ivan Demabildo

    08/05/2025, 9:57 PM
    Hi everyone, there's an upcoming change for Bitnami as referenced here: https://github.com/bitnami/charts/issues/35164 We're currently using 1.7.0. Is there going to be any impact on our instance? Currently seeing that pod-sweeper uses the Bitnami kubectl image: https://artifacthub.io/packages/helm/airbyte/pod-sweeper
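    If the Bitnami catalog change does bite (the bitnami/kubectl tags being moved or removed), one mitigation is to mirror the image into your own registry and point the chart at it. A sketch of a values.yaml fragment, assuming your chart version still ships pod-sweeper and exposes its image values (the key names are an assumption; check `helm show values` for the real ones):

```yaml
# values.yaml fragment -- key names are an assumption; confirm against
# the chart's values for your Airbyte version before applying.
pod-sweeper:
  image:
    repository: registry.example.com/mirror/kubectl   # your mirrored copy
    tag: "1.28"
```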
  • a

    Arwa A

    08/07/2025, 8:36 AM
    Hello, I am using an Oracle source connector to pull some tables from a specific schema. Is there a way to add a filter to the connector so that it only takes specific tables from this schema? I do not want to pull all tables from it.
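    For the Oracle question above: the source's schemas option only narrows discovery to whole schemas, and table-level filtering is normally done at the connection, by enabling only the streams you want in the connection's schema screen. Programmatically, the Public API lets you list specific streams when creating a connection; a sketch of the request body shown as YAML (field names per the Public API, IDs and table names are placeholders, so verify against your Airbyte version):

```yaml
# POST /api/public/v1/connections  (request body, rendered as YAML)
sourceId: "<oracle-source-id>"
destinationId: "<destination-id>"
configurations:
  streams:
    - name: "CUSTOMERS"     # only the streams listed here are synced
      syncMode: "full_refresh_overwrite"
    - name: "ORDERS"
      syncMode: "incremental_deduped_history"
```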
  • s

    Sérgio Marçal

    08/07/2025, 11:52 AM
    Hi, is anyone having problems with the Monday connector?
  • s

    Seb J

    08/07/2025, 12:11 PM
    Hello everyone, I have deployed Airbyte v1.7.1 OSS on a Kubernetes cluster, but I’m unable to use the Connector Builder. Error message:
    "Could not validate your connector as specified:
    An unknown error has occurred"
    After some research, I found that starting from this version, certain changes are required regarding ingress configuration. Information found here: https://docs.airbyte.com/release_notes/v-1.7#171 And also here: https://docs.airbyte.com/platform/deploying-airbyte/integrations/ingress-1-7 My question is: Am I required to set up an ingress and expose my Airbyte deployment externally, or is there a way to resolve this issue without doing so? Thanks everyone!
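    On the 1.7 ingress requirement above: the ingress does not have to be internet-facing; a cluster-internal or VPN-only hostname satisfies it. A minimal sketch based on the chart's default service naming for a release named `airbyte` (verify the actual service name and port with `kubectl get svc -n airbyte` before applying):

```yaml
# Minimal NGINX ingress sketch for an internal-only hostname.
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: airbyte-ingress
  namespace: airbyte
  annotations:
    nginx.ingress.kubernetes.io/proxy-body-size: "16m"
spec:
  ingressClassName: nginx
  rules:
    - host: airbyte.internal.example.com   # internal DNS; no public exposure needed
      http:
        paths:
          - path: /
            pathType: Prefix
            backend:
              service:
                name: airbyte-airbyte-server-svc   # chart default; verify
                port:
                  number: 8001
```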
  • m

    Mohit Kumar

    08/07/2025, 2:07 PM
    We are writing to report a critical issue encountered during a recent upgrade attempt of our Airbyte Open Source (OSS) deployment (airbyte-1.6.2) on Kubernetes, performed via its official Helm chart. The upgrade failed in our production environment due to a persistent schema migration error during the Airbyte bootloader phase. Specifically, the bootloader reported an inability to uninstall certain data types, citing their active usage within existing database tables:
    Copy code
    Migration of schema "airbyte" to version "0.50.5.005 - AddScopeToActorDefinitionWo
    2025-08-06 09:41:17,787 [main] ERROR i.a.b.ApplicationKt(main):28 - Unable to bootstrap Airbyte environment.
    org.flywaydb.core.internal.command.DbMigrate$FlywayMigrateException: Migration failed !
    Our subsequent attempts to manually resolve this by directly manipulating the database to remove the implicated data types led to significant complications and data inconsistencies, ultimately preventing us from completing the Helm chart upgrade. Consequently, we had to abandon the upgrade. We are seeking your urgent guidance on how to overcome this schema migration error during the bootloader process. Any insights or recommended procedures to safely address this issue would be greatly appreciated. Thank you for your time and assistance.
  • p

    Pablo Morales

    08/07/2025, 6:15 PM
    Hey, We noticed that we are missing some Facebook Ads - Purchases (for 7 day click 1 day view attribution) since July 29th. It is affecting all the data we have using this metric. We created a custom connector using the Facebook Ads API, and the data is accurate, so we assume it's a connector issue. We also created a new connector to check if it would update the data, but the issue is still there. Is anyone else suffering from this issue? 😓 Thanks!
  • j

    Jonathan Sider

    08/08/2025, 4:59 PM
    Anyone else having issues with the Shopify - Inventory_Levels endpoint? I've tried deleting/recreating connector, deleting the tables, every time it tries to sync inventory_levels it just hangs and tries to get empty records for hours and hours and will never finish. I've had a case open for 3 days but haven't heard anything from Airbyte yet
  • t

    Tom Holder

    08/08/2025, 6:58 PM
    I can't help but feel this is a really bad format for community support. Currently, if you Google an Airbyte issue, it often turns up the old forum. There's no open discoverability with Slack (maybe that's the point), and it makes it hard to find recent and/or common issues?
    ✅ 1
  • j

    Jay Tavares

    08/08/2025, 7:31 PM
    Everyone who uses the Quickbooks connector in “Incremental | Append + Deduped” mode: How do you handle deleted records? We have lots of clients who delete transactions in the past as part of clean-up work. When our sync runs, those records don’t come across the wire anymore. Since Airbyte is just appending and deduping, it doesn’t do anything with the record in our destination that was previously synced but should now be deleted. If we want our destination datastore to match what is actually in Quickbooks as of the most recent sync, is the only option “Full refresh | Overwrite”?
  • w

    Will Skelton

    08/08/2025, 8:26 PM
    Hey All! Is there a non-JDBC option for the Postgresql source connector? Our databases run with SQL_ASCII encoding and have some characters that cause the standard UTF8 encoding to fail on us. It is my understanding that the JDBC Driver Requires UTF8.
  • m

    Mengying Li

    08/08/2025, 9:07 PM
    Hi all! I have a question about the Datadog connector. I’m trying to sync the Metrics stream from Datadog to Snowflake, but I’m only receiving the metric metadata rather than the actual time series data. Could anyone help me understand why this might be happening? Below is the JSON of my setup:
    Copy code
    {
      "name": "Datadog",
      "workspaceId": "xxxxx",
      "definitionId": "xxxx",
      "configuration": {
    "site": "us5.datadoghq.com",
        "api_key": "******",
        "queries": [
          {
            "name": "test_metrics",
            "query": "sum:api.log_data.num_bytes{env:production} by {org_id}.as_rate()",
            "data_source": "metrics"
          }
        ],
        "start_date": "2025-08-06T00:00:00Z",
        "application_key": "******",
        "max_records_per_request": 5000
      }
    }
  • m

    Marcin Siudziński

    08/11/2025, 9:14 AM
    Hi! The new Direct-Load Tables (https://docs.airbyte.com/platform/using-airbyte/core-concepts/direct-load-tables) paradigm was introduced in the BigQuery destination v3. The documentation (https://docs.airbyte.com/integrations/destinations/bigquery-migrations) includes this note:
    Copy code
    If you only interact with the raw tables, make sure that you have the Disable Final Tables option enabled before upgrading. This will automatically enable the Legacy raw tables option after upgrading.
    In the GUI the change is straightforward and we can simply flip the switch, but my question is: how do we add the Legacy raw tables option to a Terraform definition of the destination? The Terraform documentation for the Airbyte provider (https://registry.terraform.io/providers/airbytehq/airbyte/latest/docs/resources/destination_bigquery#nestedatt--configuration) does not say anything about how to configure it. There is a raw_data_dataset option, but it does not define the Legacy raw tables option.
  • y

    Yuki Kakegawa

    08/11/2025, 4:05 PM
    I'm trying to bring expanded data from the Stripe subscriptions table. I see we could use a line like
    expand[]: items.data.discounts
    but it gives me a 400 error every time I try it. With a curl command, I can pass it in like
    -d "expand[]=items.data.discounts"
    and it works just fine. I'd appreciate it if anybody has any insights on this!
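    As far as I know, the hosted Stripe source does not expose arbitrary expand options, but in a Connector Builder stream the bracketed key can usually be set verbatim as a request parameter. A sketch of the relevant manifest fragment (HttpRequester fields per the low-code CDK; authentication, pagination, and record selection omitted):

```yaml
# Builder / low-code manifest fragment -- a sketch, not the full stream.
retriever:
  type: SimpleRetriever
  requester:
    type: HttpRequester
    url_base: "https://api.stripe.com/v1"
    path: "/subscriptions"
    http_method: GET
    request_parameters:
      "expand[]": "items.data.discounts"   # quote the key so the brackets survive
```

If this still 400s, comparing the outgoing request URL in the Builder's test panel against the working curl request should show whether the brackets are being re-encoded.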
  • m

    Mengying Li

    08/12/2025, 5:57 AM
    Hi, I have a question about the Series stream in the Datadog connector, which I built in the Builder. I'm trying to use a stream slicer to sync the data aggregated on a daily basis. However, I couldn't get the
    from
    and
    to
    parameters to work properly after many tries. Would appreciate any pointers! The error is below, the YAML file is in the comments, and the UI setup is in the screenshot.
    Copy code
    Bad request. Please check your request parameters.
    'GET' request to '<https://api.us5.datadoghq.com/api/v1/query?query=avg%3Asystem.cpu.idle%7B%2A%7D>' failed with status code '400' and error message: 'The parameter 'from' is required'. Request (body): 'None'. Response (body): '{'errors': ["The parameter 'from' is required"]}'. Response (headers): '{'date': 'Tue, 12 Aug 2025 05:53:56 GMT', 'content-type': 'application/json', 'content-security-policy': "frame-ancestors 'self'; report-uri <https://logs.browser-intake-datadoghq.com/api/v2/logs?dd-api-key=pube4f163c23bbf91c16b8f57f56af9fc58&dd-evp-origin=content-security-policy&ddsource=csp-report&ddtags=site%3Aus5.datadoghq.com>", 'x-frame-options': 'SAMEORIGIN', 'vary': 'Accept-Encoding', 'content-encoding': 'gzip', 'x-content-type-options': 'nosniff', 'strict-transport-security': 'max-age=31536000; includeSubDomains; preload', 'x-ratelimit-limit': '10', 'x-ratelimit-period': '10', 'x-ratelimit-remaining': '9', 'x-ratelimit-reset': '4', 'x-ratelimit-name': 'batch_query', 'Via': '1.1 google', 'Alt-Svc': 'h3=":443"; ma=2592000,h3-29=":443"; ma=2592000', 'Transfer-Encoding': 'chunked'}'.
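    The 400 ("The parameter 'from' is required") suggests the slice boundaries are never injected into the query string. In the Builder's YAML view, that usually means the incremental sync's start/end time options are not set to inject as request parameters. A sketch of what that section might look like (low-code CDK DatetimeBasedCursor; the cursor_field and formats are assumptions to adapt to your records):

```yaml
# Sketch: inject each daily slice's boundaries as the `from`/`to`
# query parameters the Datadog v1 query API expects (epoch seconds).
incremental_sync:
  type: DatetimeBasedCursor
  cursor_field: "from"            # placeholder; use a field present in your records
  datetime_format: "%s"           # epoch seconds on the wire
  start_datetime:
    datetime: "{{ config['start_date'] }}"
    datetime_format: "%Y-%m-%dT%H:%M:%SZ"
  end_datetime:
    datetime: "{{ now_utc().strftime('%Y-%m-%dT%H:%M:%SZ') }}"
    datetime_format: "%Y-%m-%dT%H:%M:%SZ"
  step: "P1D"                     # one slice per day
  cursor_granularity: "PT1S"
  start_time_option:
    type: RequestOption
    inject_into: request_parameter
    field_name: "from"
  end_time_option:
    type: RequestOption
    inject_into: request_parameter
    field_name: "to"
```

The key pieces are the two `*_time_option` blocks with `inject_into: request_parameter`; without them the cursor slices are computed but never sent.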
  • m

    Muhammad Hassaan Mustafa

    08/12/2025, 8:55 AM
    Hi guys, I am new to Airbyte and am setting up the Community version on my local machine. Everything is working except the APIs. What am I doing wrong? Below is my values.yaml:
    Copy code
    workload-launcher:
      extraEnv:
        - name: AIRBYTE_ENABLE_UNSAFE_CODE
          value: "true"
    connector-builder-server:
      extraEnv:
        - name: AIRBYTE_ENABLE_UNSAFE_CODE
          value: "true"
    server:
      extraEnv:
        - name: AIRBYTE_PUBLIC_API_ENABLED
          value: "true"
    webapp:
      extraEnv:
        - name: AIRBYTE_PUBLIC_API_ENABLED
          value: "true"
  • t

    Thế Minh Huỳnh

    08/12/2025, 11:22 AM
    Hi guys! I’ve set up Airbyte 1.6.2 on EKS and want to create a job to sync data from an S3 bucket to Snowflake. However, I cannot find the S3 source in the UI; there is only an S3 destination. Does anyone know how to address this? Thank you so much.
  • s

    Siarhei Karko

    08/12/2025, 2:01 PM
    Hello community, I have a problem with Airbyte Self-Hosted v1.6.2. I need to create a new Connection using a custom Builder source. When I create and test the custom Builder source I see no issues (all tests pass successfully). Then I create a new Source and again all tests pass successfully. But when I try creating a new Connection, I see an error during Step 3 (when the schema of the data source is fetched):
    Discovering schema failed
    The discover catalog failed due to an internal error for source: 30b0e9b8-ad8c-42bf-86fc-04cdc929e45a
    Copy code
    Internal message: Unexpected error performing DISCOVER. The exit of the connector was: 0
    Failure origin: source
    io.airbyte.workers.exception.WorkerException: Unexpected error performing DISCOVER. The exit of the connector was: 0
    	at io.airbyte.connectorSidecar.ConnectorMessageProcessor.run(ConnectorMessageProcessor.kt:95)
    	at io.airbyte.connectorSidecar.ConnectorWatcher.processConnectorOutput(ConnectorWatcher.kt:138)
    	at io.airbyte.connectorSidecar.ConnectorWatcher.run(ConnectorWatcher.kt:79)
    	at io.airbyte.connectorSidecar.$ConnectorWatcher$Definition.initialize(Unknown Source)
    	at io.airbyte.connectorSidecar.$ConnectorWatcher$Definition.instantiate(Unknown Source)
    	at io.micronaut.context.DefaultBeanContext.resolveByBeanFactory(DefaultBeanContext.java:2334)
    	at io.micronaut.context.DefaultBeanContext.doCreateBean(DefaultBeanContext.java:2304)
    	at io.micronaut.context.DefaultBeanContext.doCreateBean(DefaultBeanContext.java:2316)
    	at io.micronaut.context.DefaultBeanContext.createRegistration(DefaultBeanContext.java:3127)
    	at io.micronaut.context.SingletonScope.getOrCreate(SingletonScope.java:80)
    	at io.micronaut.context.DefaultBeanContext.findOrCreateSingletonBeanRegistration(DefaultBeanContext.java:3029)
    	at io.micronaut.context.DefaultBeanContext.initializeEagerBean(DefaultBeanContext.java:2702)
    	at io.micronaut.context.DefaultBeanContext.initializeContext(DefaultBeanContext.java:1994)
    	at io.micronaut.context.DefaultApplicationContext.initializeContext(DefaultApplicationContext.java:314)
    	at io.micronaut.context.DefaultBeanContext.configureAndStartContext(DefaultBeanContext.java:3318)
    	at io.micronaut.context.DefaultBeanContext.start(DefaultBeanContext.java:345)
    	at io.micronaut.context.DefaultApplicationContext.start(DefaultApplicationContext.java:216)
    	at io.micronaut.runtime.Micronaut.start(Micronaut.java:75)
    	at io.airbyte.connectorSidecar.ApplicationKt.main(Application.kt:18)
    	at io.airbyte.connectorSidecar.ApplicationKt.main(Application.kt)
    Caused by: org.openapitools.client.infrastructure.ClientException: Client error : 409 Conflict
    I tried creating a fresh Airbyte Cloud trial account and added my YAML source files there, and schema detection worked without issues. What could be the root cause of the problem in my self-hosted Airbyte?
  • v

    Vince Pillinger

    08/12/2025, 4:11 PM
    I am trying to set up a local Airbyte installation on a single server with an SSL-secured ingress in front of it. I cannot find any useful documentation related to this. Can anybody point me in the right direction?
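    One way to get an SSL-secured ingress for a Helm-based single-server install is cert-manager plus an NGINX ingress controller. A sketch, assuming an existing ClusterIssuer named `letsencrypt` and the chart's default server service for a release named `airbyte` (verify names and ports in your cluster; if you installed via abctl, TLS is often terminated by a reverse proxy in front of it instead):

```yaml
# TLS ingress sketch -- assumes cert-manager and ingress-nginx are installed.
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: airbyte-ingress
  namespace: airbyte
  annotations:
    cert-manager.io/cluster-issuer: letsencrypt   # issues/renews the certificate
spec:
  ingressClassName: nginx
  tls:
    - hosts:
        - airbyte.example.com
      secretName: airbyte-tls     # cert-manager stores the cert here
  rules:
    - host: airbyte.example.com
      http:
        paths:
          - path: /
            pathType: Prefix
            backend:
              service:
                name: airbyte-airbyte-server-svc   # chart default; verify
                port:
                  number: 8001
```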
  • a

    Abhay Kevat

    08/12/2025, 4:20 PM
    Hi Community, I am facing some issues with the Airbyte Facebook Marketing source connection. We run a social media advertising company and currently pull data from multiple ad platforms, including the Facebook Marketing API, into MySQL using Airbyte. For Facebook specifically, we have 250+ ad accounts connected to a single Airbyte source. The sync is taking a very long time to complete; for example, 25 accounts take around 11 minutes, so scaling up to all accounts becomes impractically slow. Current setup: • Source: Facebook Marketing API (v20.0) • Destination: MySQL • Airbyte deployment: Self-hosted on AWS EC2 (via abctl in Docker) • Sync mode: Incremental | Append + Deduped • Final destination usage: MySQL (downstream transformations via dbt) • Other context: We also collect data from Pinterest, TikTok, Snapchat, and other platforms, so reducing sync time for Facebook is critical to keeping our daily pipelines within schedule. Thank you for your support.
  • n

    Nicholas Roberts

    08/13/2025, 4:40 AM
    Why do I get this error message when setting up a connection to AWS RDS?
    Copy code
    PostgresDestinationStrictEncrypt$Companion(main):73 starting destination: class io.airbyte.integrations.destination.postgres.PostgresDestinationStrictEncrypt
  • s

    Sudeesh Rajeevan

    08/13/2025, 8:30 AM
    I'm getting a sporadic EventDataDeserializationException error in an RDS MySQL -> Redshift connection. I increased slave_net_timeout to 240, but I still see it. Can anyone help me?
  • m

    Mert Ors

    08/13/2025, 9:15 AM
    The re-authenticate button on Airbyte Cloud for Snapchat Marketing isn't working for me; is anyone else having the same issue?
  • m

    Mert Ors

    08/13/2025, 9:32 AM
    I think the issue is that the re-authenticate button doesn't add the client id properly: https://accounts.snapchat.com/accounts/oauth2/auth?client_id=******&redirect_uri=https%[…]ponse_type=code&scope=snapchat-marketing-api&state=yNyknsJ
  • m

    Mert Ors

    08/13/2025, 9:32 AM
    this is what i get
  • c

    cketch Engli

    08/13/2025, 10:00 AM
    Hi, please tell me how to disable email autofill in Airbyte.
  • m

    Madhur Gupta

    08/13/2025, 11:25 AM
    Hello Airbyte users, I’m deploying Airbyte v1.6.1 on Azure Kubernetes Service using a Helm chart, with an NGINX ingress controller and HTTPS enabled. I already have Airflow running on the same cluster, and I want to use an Airbyte connector through an Airflow connection. However, when I run an Airflow DAG that triggers the Airbyte connector, I get the following error: AirflowException: Unexpected status code 405 from token endpoint I’m encountering this error on Airbyte version 1.6.1. Could you help me understand why it’s occurring and what changes I need to make in the Helm configuration to resolve it? Note: I’m using the free community edition of the Airbyte Helm chart.
  • a

    Ami Mehta

    08/13/2025, 11:49 AM
    Hi Airbyte users, I am trying to connect a destination in Airbyte but keep getting the error `Airbyte is temporarily unavailable. Please try again. (HTTP 502)`. Can anyone please help? I am running Docker Desktop and launching Airbyte in the browser. From there, I am adding Postgres as a destination.
  • l

    Lamartine Santana

    08/14/2025, 12:56 PM
    In Airbyte Cloud, is it possible to update stream schemas without having to select all the new fields for ingestion? I would just like to update the schema and keep the previously selected fields and add the ones I want. Note that after I update the schemas, it automatically selects several new fields.
  • v

    Vince Pillinger

    08/14/2025, 3:00 PM
    I am getting this error when trying to use ClickHouse as a source (installed locally using abctl local install): java.lang.RuntimeException: java.sql.SQLFeatureNotSupportedException: getResultSet not implemented. ClickHouse version: 25.6.2.5781.