# give-feedback
  • Alan Chan
    07/18/2025, 12:37 AM
    hey Airbyte, We're migrating data from our old database to a new one. To save costs, we set up Airbyte to only sync recent data and moved the older data directly to the new database. However, we're finding that after each Airbyte sync, the data we moved manually keeps disappearing. Does Airbyte delete existing data that doesn't match its metadata expectations?
  • Jono Kumarich
    07/20/2025, 10:43 PM
    Hey Airbyte, we have been experiencing a breaking issue with the Snowflake source connector ever since it got bumped from
    1.0.1
    to
    1.0.2
    . We are now getting a missing certificate JDBC error and are not able to rectify it on our side. I have confirmed this occurs on our organisation account as well as on a separate Snowflake trial account. As we use cloud, we are unable to roll back to the previous release.
    Copy code
    java.lang.RuntimeException: Column name discovery query failed: JDBC driver internal error: Max retry reached for the download of chunk#0 (Total chunks: 1) retry: 7, 
    error: net.snowflake.client.jdbc.SnowflakeSQLLoggedException: JDBC driver encountered communication error. Message: No trusted certificate found 
    Verify that the hostnames and portnumbers in SYSTEM$ALLOWLIST are added to your firewall's allowed list. To troubleshoot your connection further, you can refer to this article: <https://docs.snowflake.com/en/user-guide/client-connectivity-troubleshooting/overview>.
    I contacted support 5 days ago but have not received a reply. Would love some guidance or assistance, as this has completely broken some of our critical syncs.
  • Shakar Bakr
    07/30/2025, 11:14 AM
    chart version:
    v1.7.2
    Hello good people, I'm having an issue deploying Airbyte. When I try to deploy it using the Helm chart, I get this error:
    Copy code
    Could not determine release state: unable to determine cluster state: Deployment/data-pipelines/airbyte-workload-api-server dry-run failed: failed to create typed patch object (data-pipelines/airbyte-workload-api-server; apps/v1, Kind=Deployment): .spec.template.spec.containers[name="airbyte-workload-api-server-container"].env: duplicate entries for key [name="AB_JWT_SIGNATURE_SECRET"]
    the issue is that the chart has duplicate entries for the
    AB_JWT_SIGNATURE_SECRET
    env variable inside the workload-api-server chart -> templates -> deployment.yaml:
    Copy code
    - name: WORKLOAD_API_BEARER_TOKEN
              valueFrom:
                secretKeyRef:
                  name: {{ index .Values "workloadApi" "bearerTokenSecretName" | default (printf "%s-airbyte-secrets" .Release.Name ) }}
                  key: {{ index .Values "workloadApi" "bearerTokenSecretKey" | default "WORKLOAD_API_BEARER_TOKEN" }}
            - name: AB_JWT_SIGNATURE_SECRET ---------------------------------------> First Entry here <----------------------------------------------------------------------
              valueFrom:
                secretKeyRef:
                  name: {{ .Values.global.auth.secretName | default "airbyte-auth-secrets" | quote }}
                  key: {{ .Values.global.auth.jwtSignatureSecretKey | default "jwt-signature-secret" | quote }}
            - name: MICRONAUT_ENVIRONMENTS
              valueFrom:
                configMapKeyRef:
                  name: {{ .Release.Name }}-airbyte-env
                  key: WORKERS_MICRONAUT_ENVIRONMENTS
            - name: AB_JWT_SIGNATURE_SECRET ---------------------------------------> Second Entry here <---------------------------------------------------------------------
              valueFrom:
                secretKeyRef:
                  name: {{ .Values.global.auth.secretName | default "airbyte-auth-secrets" | quote }}
                  key: {{ .Values.global.auth.jwtSignatureSecretKey | default "jwt-signature-secret" | quote }}
  • Stockton Fisher
    08/06/2025, 3:58 PM
    Pagination on the connections page doesn't seem to be working. When you have a lot of connections, some don't show, not even if you apply filters. This is on Cloud.
  • Jillian Moore
    08/06/2025, 8:46 PM
    Hi all - Does anyone know if there is an update on the Google Ads connector upgrade to Google Ads API v19? I received a notification from Google Ads this morning that said v18 will stop accepting requests starting August 20, 2025.
  • Rowan Moul
    08/14/2025, 9:27 PM
    Starting in v1.7 the webapp container was rolled into the server container, but neither Helm chart v1 nor v2 provides an equivalent to
    webapp.ingress
    in
    values.yaml
    for the server container. I see no github issues about this (open or closed). Surely I am not the only person in the world who was using this portion of the helm chart? Now it seems I have to separately deploy an ingress resource to make the webapp available again, which is a rather large breaking change in the chart itself. This could have easily been migrated seamlessly in the chart by reading
    webapp.ingress
    and using those values to make an ingress resource for the server container. Then this backfill functionality could be removed in the v2 helm chart in place of a dedicated
    server.ingress
    section in
    values.yaml.
    I created the following issue for this: https://github.com/airbytehq/airbyte/issues/64941
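    In the meantime, a standalone Ingress pointing at the server Service is roughly what the old webapp.ingress values used to render. A minimal sketch, assuming the chart's default service naming and port for a release called "airbyte" (verify both with kubectl get svc); the host and ingress class are placeholders:
    Copy code
    apiVersion: networking.k8s.io/v1
    kind: Ingress
    metadata:
      name: airbyte-server
      namespace: airbyte                               # adjust to your release namespace
    spec:
      ingressClassName: nginx                          # placeholder; use your controller's class
      rules:
        - host: airbyte.example.com                    # placeholder host
          http:
            paths:
              - path: /
                pathType: Prefix
                backend:
                  service:
                    name: airbyte-airbyte-server-svc   # assumed default name: <release>-airbyte-server-svc
                    port:
                      number: 8001                     # assumed server service port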
  • Aya Zeplovitch
    08/26/2025, 12:00 AM
    Hey, I think your docs are wrong. I ran
    abctl local uninstall
    and all my data and connectors are now gone. https://docs.airbyte.com/platform/using-airbyte/getting-started/oss-quickstart#uninstall-airbyte
  • gaurav vivek
    09/01/2025, 6:05 AM
    Is there any way to get sample records while using Airbyte? Say, just getting 10 records per stream before a full load.
  • Lillian Jiang
    09/03/2025, 8:16 PM
    Hello! I’m working with the Airbyte Gmail source connector and noticed that it currently only pulls *email attachment metadata*—not the actual binary files of the attachments themselves. I’d like to extend the connector so it automatically downloads all attachments along with the messages.
    • Has anyone here tried modifying an Airbyte source connector before (especially Gmail)?
    • Were there any unexpected challenges when editing the source code?
    • Do you think it’s worth customizing the connector, or is there a better approach for getting attachments into Airbyte?
  • Gavin Acres
    09/11/2025, 4:28 PM
    General UI feedback: We like the new changes to the sidebar so that you can navigate to the organization's settings and a home screen. But with that change we lost the ability to immediately see the workspace name, which was visible before. While we may sometimes navigate into the workspace via the organization, we also navigate directly to the workspace via a Slack notification alert, which doesn't provide the workspace name either. So when we do click on an alert link, we now need to navigate to the "Workspace settings" to see the workspace name.
  • unfrgivn
    09/17/2025, 4:37 PM
    I don't know if Airbyte PMs or support monitor GitHub Issues, but there is a very concerning issue with the 1.8 release web app changes and no response from Airbyte weeks after the release. This is a major issue, as the new web app is essentially bricked and users are forced to roll back and do a database restore. https://github.com/airbytehq/airbyte/issues/65070 There are always release bugs, and prioritization and time are understandable constraints, but this is the first Airbyte issue that has me very concerned about us investing further in the platform. Our goal is to migrate 1000s of workloads to Airbyte Enterprise next year, but this is a major platform issue with the new web app that has nothing to do with licensing. I have to assume you've already patched this for the cloud version, or your product would be unusable?
  • Thiago
    09/26/2025, 2:12 PM
    We're not able to sync any connection or create/update sources after updating to 1.8.3 using abctl. Everything gets stuck at the source test. We tried to reinstall prior versions, but it always errors out. We're completely stuck, with none of our connections working.
  • Anudhyan Datta LT-23
    09/29/2025, 6:02 AM
    Isn't Airbyte participating in Hacktoberfest this year?
  • Alan Chan
    10/01/2025, 3:11 AM
    I really don't understand why the Airbyte folks put an AI bot in channels to answer questions even when Airbyte's documentation is insufficient. Most of the time the AI bot can't help because it doesn't have enough knowledge. Does anyone know how to manually input a schema for the S3 source?
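    On the schema question: if I remember correctly, the file-based S3 source exposes an optional "Input schema" field per stream where you can paste a JSON map of column names to types to skip schema inference. Treat the field name and availability as an assumption and confirm against the S3 source docs; the columns below are just an example:
    Copy code
    { "id": "integer", "email": "string", "created_at": "string", "amount": "number" }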
  • Steven Ayers
    10/02/2025, 6:10 AM
    We've been getting constant "Client Error 409: conflict for URL" errors when calling the Airbyte API (OSS 1.7, and after the 1.8.3 upgrade), even after a completely fresh install. I've noticed a huge number of people on Slack reporting similar issues, especially in the last month or so. Is anyone at Airbyte looking at this? It's made Airbyte unusable for us. We were using OSS with a plan to look at Enterprise after a year, but the way things are going....
  • Tanuj Shriyan
    10/07/2025, 8:56 AM
    Is it a known issue that data transfer from Snowflake to MongoDB is very slow? This is on Airbyte Cloud.
  • Neal Morris
    10/08/2025, 12:39 PM
    I'd like to give some feedback on the self-hosted offering. I have some concerns with the defaults, particularly the Temporal deployment. The airbyte chart uses the "auto-setup" image and deployment, which is totally fine for local experimentation, and perhaps small-scale deployments. As far as I can tell, there aren't any docs caveating this choice or explaining how to use either Temporal Cloud or the production deployment constellation. I do see the chart supports Temporal Cloud, but I think there should be:
    1. A clear statement that the auto-setup configuration has limitations, and the considerations folks should make.
    2. Docs on how to use the full constellation / Temporal Cloud.
    a. In addition, I think support is missing for the full production constellation, e.g. History, Matching, Frontend, and Worker each as their own deployments.
  • Bala Chandar
    10/11/2025, 5:59 PM
    We’re facing a serious issue with Airbyte OSS. Earlier, all our connections (around 10 GB of data) from MySQL to Snowflake were syncing successfully without any problems. However, recently we’ve started encountering OOM (Out of Memory) errors, and the logs show the following line: [ASYNC QUEUE INFO] Global: max: 5.79 GB It seems like the global async queue buffer size has increased significantly. Is there any way to reduce or control this buffer size to prevent OOM issues?
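    Not a direct answer on reducing the queue itself, but as far as I understand the destination sizes that async buffer from the memory available to the connector job container, so pinning the job container memory explicitly is one way to control it. A sketch, assuming the chart exposes global.jobs.resources as recent chart versions do (verify against your values.yaml):
    Copy code
    global:
      jobs:
        resources:
          requests:
            memory: 2Gi
          limits:
            memory: 6Gi   # the async buffer ceiling follows the memory the job container is given; tune to your nodes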
  • Thiago
    10/15/2025, 1:37 PM
    Getting this error after installing the latest Airbyte version:
    Copy code
    "status": 404 },
    "name": "HttpError",
    "requestId": "uzH4abNnv5D4u7wpzQpSVZ",
    "request": {
      "url": "/api/v1/workspaces/get",
      "method": "POST",
      "headers": { "Content-Type": "application/json" },
      "data": { "workspaceId": "48e3540c-07e9-403d-a5b3-a4190b6362d7" }
    },
    "status": 404,
    "response": {
      "message": "Internal Server Error: Could not find configuration for STANDARD_WORKSPACE: 48e3540c-07e9-403d-a5b3-a4190b6362d7.",
      "exceptionClassName": "io.airbyte.commons.server.errors.IdNotFoundKnownException",
      "exceptionStack": [],
      "rootCauseExceptionStack": []
    }
    }
    And now I can't even download abctl; none of the commands listed in the documentation work.
  • Thiago
    10/15/2025, 1:44 PM
    Copy code
    # wget https://github.com/airbytehq/abctl/releases/download/v0.30.2/abctl_linux_amd64
    --2025-10-15 15:44:03--  https://github.com/airbytehq/abctl/releases/download/v0.30.2/abctl_linux_amd64
    Resolving github.com (github.com)... 140.82.121.4
    Connecting to github.com (github.com)|140.82.121.4|:443... connected.
    HTTP request sent, awaiting response... 404 Not Found
    2025-10-15 15:44:04 ERROR 404: Not Found.
    # chmod +x abctl_linux_amd64
    chmod: cannot access 'abctl_linux_amd64': No such file or directory
    # sudo mv abctl_linux_amd64 /usr/local/bin/abctl
    mv: cannot stat 'abctl_linux_amd64': No such file or directory
  • sp33dy
    10/16/2025, 11:54 PM
    Hi, there's a typo in the bootloader's pod manifest in the airbyte-v2 chart:
    Copy code
    {{- if .Values.airbyteBootloader.extraVolumeMount }}
          volumeMounts:
            {{- toYaml .Values.airbyteBootloader.extraVolumeMounts | nindent 8 }}
    First line is missing "s". It should be like this:
    Copy code
    {{- if .Values.airbyteBootloader.extraVolumeMounts }}
    /airbyte/templates/airbyte-bootloader/pod.yaml, line 99. I noticed it in 2.0.12, but I've checked now and it's still there in 2.0.18.
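    For anyone wondering what this typo breaks in practice: any extra volume mounts you define for the bootloader in your values are silently dropped, because the if-condition checks the wrong key. A sketch of the kind of values block affected (the secret and volume names here are made up; extraVolumes follows standard Kubernetes volume syntax):
    Copy code
    airbyteBootloader:
      extraVolumes:
        - name: custom-ca
          secret:
            secretName: my-custom-ca        # hypothetical secret
      extraVolumeMounts:
        - name: custom-ca
          mountPath: /certs/custom-ca       # ignored until the template checks extraVolumeMounts (with the "s")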
  • deng ganyin
    10/20/2025, 8:40 AM
    Copy code
    streams:
      order_list:
        name: order_list
        path: /pb/mp/order/v2/list
        url_base: https://openapi.lingxing.com
        sync_mode: incremental
        http_method: POST
        primary_key:
          - platform_order_no
        cursor_field:
          - end_time
        success:
          api_codes:
            - 0
            - 200
        record_path: list
        pagination:
          type: offset
          page_size: 500
          offset_field: offset
          length_field: length
        body_defaults:
          length: 500
          offset: 0
          date_type: global_purchase_time
          order_status: 6
          include_delete: 0
          platform_payment_status:
            - partially_paid
            - paid
          platform_shipping_status:
            - fulfilled
        body_field_map:
          length: length
          offset: offset
          end_time: end_time
          store_id: store_id
          date_type: date_type
          start_time: start_time
          order_status: order_status
          platform_code: platform_code
          include_delete: include_delete
          platform_order_nos: platform_order_nos
          platform_order_names: platform_order_names
          platform_payment_status: platform_payment_status
          platform_shipping_status: platform_shipping_status
        request_headers:
          Content-Type: application/json
        destination_sync_mode: append_dedup
        schema:
          type: object
          properties:
            platform_order_no:
              type: string
            end_time:
              type: string
              format: date-time
      currency_month:
        name: currency_month
        path: /erp/sc/routing/finance/currency/currencyMonth
        url_base: https://openapi.lingxing.com
        sync_mode: incremental
        http_method: POST
        primary_key:
          - date
        cursor_field:
          - date
        success:
          api_codes:
            - 0
            - 200
        record_path: data
        request_headers:
          Content-Type: application/json
        destination_sync_mode: append_dedup
        schema:
          type: object
          properties:
            date:
              type: string
              format: date
      marketplace_list:
        name: marketplace_list
        path: /erp/sc/data/seller/allMarketplace
        url_base: https://openapi.lingxing.com
        sync_mode: full_refresh
        http_method: GET
        primary_key:
          - mid
        success:
          api_codes:
            - 0
            - 200
        record_path: data
        request_headers:
          Content-Type: application/json
        destination_sync_mode: overwrite
        schema:
          type: object
          properties:
            mid:
              type: integer
            code:
              type: string
            region:
              type: string
            country:
              type: string
            aws_region:
              type: string
            marketplace_id:
              type: string
    I'm a bit curious: I don't know why I keep getting errors when I use AI to modify this, and I can't see which line numbers are wrong.
  • sp33dy
    10/21/2025, 11:06 AM
    Hi, I noticed another mistake: there are livenessProbe and readinessProbe options in the Helm chart values for cron. Unfortunately, these options are missing from the template.
  • Aymen NEGUEZ
    10/24/2025, 1:25 PM
    The current HubSpot destination in Airbyte already supports writing to both standard objects (Contacts, Companies, Deals, etc.) and custom objects. However, it does not yet provide native support for creating associations between these records. This feature request proposes extending the HubSpot destination to handle associations via the HubSpot API. For example:
    • Linking a Contact to a Company
    • Associating a Deal with multiple Contacts
    • Relating custom objects to standard objects (e.g., a custom Subscription object linked to a Contact)
    • Supporting associations between custom objects themselves
    Key benefits:
    • Preserve the relational structure of CRM data when syncing into HubSpot.
    • Ensure that objects written via Airbyte reflect real-world business relationships.
    • Enable more advanced HubSpot use cases by leveraging both default and custom associations.
    Potential implementation details:
    • Extend the destination configuration to define association mappings (e.g., contactId → companyId).
    • Support both default HubSpot associations and custom associations defined in the HubSpot account.
    • Handle upserts gracefully to prevent duplicate or broken associations.
    Adding this functionality would make the HubSpot destination more complete and better aligned with HubSpot's data model.
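    To make the proposal concrete, the association mappings could be expressed in the destination configuration along these lines. This is purely illustrative; none of these keys exist in the connector today:
    Copy code
    # Hypothetical destination config extension for HubSpot associations
    associations:
      - from_object: contact             # object type written by this stream
        to_object: company
        from_key: contact_id             # field on the incoming record
        to_key: company_id               # field identifying the target record
        association_type: default        # built-in HubSpot association
      - from_object: subscription        # custom object
        to_object: contact
        from_key: subscription_id
        to_key: contact_id
        association_type: subscription_to_contact   # custom association label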
  • Vinicius Nunes
    10/27/2025, 4:58 PM
    Feedback on support: I have opened some tickets that apparently never got resolved or never received responses. We just feel that the support experience has not been great lately.
  • Vasil Boshnakov
    11/01/2025, 8:13 AM
    My feedback is this: Our Airbyte instance stopped working unexpectedly yesterday. Today is the second day that we are unable to deploy a fresh Airbyte instance on an EC2 machine using an external database (AWS RDS). We are consistently encountering the error: “Airbyte Bootloader failed to start.” This situation is highly disruptive, and it seems that every time we need to make changes or updates in Airbyte, it results in 1–2 days of downtime. At this point, it raises serious concerns about the reliability of the tool for production use.
  • Steven Ayers
    11/07/2025, 8:08 PM
    Tip for other Snowflake users: I've been using the "Snowflake" destination for ages and have found it to be quite slow depending on the source; I also had issues with COPY INTO trimming spaces in columns. I've just switched to the "S3 Data Lake" destination with the newly released Polaris/Snowflake Open Catalog support and have seen up to a 6x performance increase, especially when the source is Oracle/Postgres/MySQL. The setup is:
    • an "Internal" catalog in a Snowflake Open Catalog account
    • a catalog integration in your main Snowflake account, connecting to your Polaris catalog
    • a catalog-linked database created from the integration, so it's always up to date
    Useful links:
    • This guide is great; I mainly followed it but replaced the Spark bits with the Airbyte setup: https://docs.snowflake.com/en/user-guide/opencatalog/tutorials/open-catalog-gs#use-case-2-sync-apache-iceberg-tables-from-snowflake-to-open-catalog
    • Creating a catalog-linked database: https://docs.snowflake.com/en/user-guide/tables-iceberg-catalog-linked-database
    • Don't forget to set
    ACCESS_DELEGATION_MODE = VENDED_CREDENTIALS
    on your integration: https://docs.snowflake.com/en/user-guide/tables-iceberg-configure-catalog-integration
    Hope this helps reduce your Snowflake bill and your Airbyte infra bill, and gets you some hands-on experience with Snowflake x Iceberg tables ❄️
  • sp33dy
    11/10/2025, 4:33 PM
    There is another mistake in Helm Chart v2. The metrics template looks for `global.metrics.publish`:
    Copy code
    {{/*
    Renders the global.metrics.publish value
    */}}
    {{- define "airbyte.metrics.publish" }}
    	{{- if eq .Values.global.metrics.publish nil }}
        	{{- true }}
    	{{- else }}
        	{{- .Values.global.metrics.publish }}
    	{{- end }}
    {{- end }}
    but values.yaml only contains global.metrics.enabled, which is set to false by default. Since global.metrics.publish is never defined, the template's nil check kicks in and PUBLISH_METRICS ends up set to true.
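    If the template above matches what ships in your chart version, a minimal workaround sketch is to define the key it actually reads in your own values file:
    Copy code
    global:
      metrics:
        enabled: false
        publish: false   # the key the template reads; absent from the default values.yaml, so it otherwise defaults to true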
  • Kevin Conseil
    11/18/2025, 10:08 AM
    Hi everyone and the Airbyte crew 🙂, are updates to all connectors communicated via email or webhook if the user toggles it on? I ask because I was not aware of the update to the File connector, which created some issues on my end, and I'm wondering if I just overlooked the update or never received it. If it is not sent for all connectors, then my feedback would be to enable it for all connectors 🙂
  • Simon Duvergier
    11/19/2025, 10:34 AM
    Hello all 🙂 Quick feedback on the Helm chart V2 migration (which took me a day because the documentation doesn't have the correct values.yaml 😿).
    • global.storage.type changed from GCS in chart V1 to gcs in chart V2
    • For secret management:
    ◦ the chart V2 key global.storage.gcs.credentialsJsonPath is not correct for gcs (I could not make it work)
    ◦ the chart V2 key is global.storage.gcs.credentialsJsonSecretKey, in combination with global.storage.secretName
    So if it can help some people trying the migration:
    • In V1 I had:
    ◦ global.storage.type: 'GCS'
    ◦ global.storage.gcs.storageSecretName: 'airbyte-config-secrets'
    ◦ global.storage.gcs.credentialsPath: '/secrets/gcs-log-creds/gcp.json'
    • In V2 I could make it work with:
    ◦ global.storage.type: 'GCS'
    ◦ global.storage.gcs.secretName: 'airbyte-config-secrets'
    ◦ global.storage.gcs.credentialsJsonSecretKey: 'gcp.json'
    I hope this helps 🙂
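    Pulling those V2 keys together as a values.yaml fragment; this is only a sketch of what is described above, with the casing exactly as reported (double-check against your chart version, since the type casing apparently differs between the docs and what worked here):
    Copy code
    global:
      storage:
        type: GCS                                  # casing as reported to work above
        gcs:
          secretName: airbyte-config-secrets       # Kubernetes secret holding the GCS credentials
          credentialsJsonSecretKey: gcp.json       # key inside that secret with the service-account JSON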