# ask-ai
  • s

    soma chandra sekhar attaluri

    11/26/2025, 6:32 PM
    @kapa.ai I installed Airbyte 2.0.1 with Helm charts V1. Is it OK to install it with Helm charts V1 instead of V2?
    k
    • 2
    • 16
  • s

    soma chandra sekhar attaluri

    11/26/2025, 7:09 PM
    @kapa.ai I want to access the Airbyte DB and reuse those configurations for another Airbyte setup in a prod environment. How can I do that?
    k
    • 2
    • 40
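    A hedged sketch of one way to carry the configuration over for the question above: dumping and restoring the internal config database. It assumes the bundled Postgres with the defaults that appear elsewhere in this channel (namespace airbyte-abctl, pod airbyte-db-0, user airbyte, database db-airbyte) and that both environments run the same Airbyte version; recreating sources and connections via the API or Terraform is the more selective alternative.
    ```bash
    # Dump the config DB from the source environment.
    kubectl exec -n airbyte-abctl airbyte-db-0 -- \
      pg_dump -U airbyte -d db-airbyte --format=custom --file=/tmp/airbyte-config.dump
    kubectl cp airbyte-abctl/airbyte-db-0:/tmp/airbyte-config.dump ./airbyte-config.dump

    # Restore it into the prod environment (same Airbyte version, empty install).
    kubectl cp ./airbyte-config.dump airbyte-abctl/airbyte-db-0:/tmp/airbyte-config.dump
    kubectl exec -n airbyte-abctl airbyte-db-0 -- \
      pg_restore -U airbyte -d db-airbyte --clean --if-exists /tmp/airbyte-config.dump
    ```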
  • e

    Eduardo Ferreira

    11/26/2025, 8:50 PM
    @kapa.ai Error: couldn't find key WORKLOAD_API_BEARER_TOKEN in Secret airbyte/airbyte-airbyte-secrets. This is happening on Airbyte 2.0.1 with the V1 Helm chart.
    k
    • 2
    • 7
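    A hedged sketch for the error above: inspect the secret named in the message and add the missing key. The key name comes from the error itself; generating a random token for it is an assumption to verify against the secret values your chart version expects.
    ```bash
    # List the keys the secret actually contains.
    kubectl get secret airbyte-airbyte-secrets -n airbyte -o jsonpath='{.data}'

    # If WORKLOAD_API_BEARER_TOKEN is absent, add it (random placeholder value).
    kubectl patch secret airbyte-airbyte-secrets -n airbyte --type merge \
      -p "{\"stringData\":{\"WORKLOAD_API_BEARER_TOKEN\":\"$(openssl rand -hex 32)\"}}"
    ```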
  • s

    soma chandra sekhar attaluri

    11/27/2025, 3:53 AM
    @kapa.ai Is there any limit to the number of tables transferred from a database? When I try to run a sync, up to 25 tables get synced; beyond that, the rest never start. Why is that, and how do I rectify this error?
    k
    • 2
    • 4
  • s

    Salman Siddiqui

    11/27/2025, 5:13 AM
    Configuration check failed State code: 08001; Message: The connection attempt failed.
    k
    • 2
    • 21
  • s

    Salman Siddiqui

    11/27/2025, 5:13 AM
    @kapa.ai Configuration check failed State code: 08001; Message: The connection attempt failed.
    k
    • 2
    • 1
  • g

    Guy

    11/27/2025, 5:15 AM
    Does the Airbyte Marketplace RSS connector support pagination?
    k
    • 2
    • 2
  • r

    Russell Tang

    11/27/2025, 5:16 AM
    I hit a 403 'Forbidden' error when connecting to Airbyte from Airflow. What should I do?
    k
    • 2
    • 2
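    A hedged way to narrow down where the 403 above originates: replay the request outside Airflow. The path, port, and bearer-token auth below are assumptions; adjust them to your Airbyte version and to however the Airflow connection is configured (a 403 usually points at missing or wrong credentials on that connection, or at a proxy in front of Airbyte).
    ```bash
    # If this also returns 403, the problem is Airbyte-side auth, not Airflow.
    curl -i \
      -H "Authorization: Bearer $AIRBYTE_TOKEN" \
      "http://<airbyte-host>:8000/api/public/v1/connections"
    ```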
  • c

    Chanakya Pendem

    11/27/2025, 7:05 AM
    @kapa.ai Guide me through making an API call through which I can build a source, for REST APIs.
    k
    • 2
    • 7
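    A hedged sketch of creating a source through the public API, related to the question above; the endpoint path and request-body shape are assumptions to verify against the API reference for the Airbyte version in use.
    ```bash
    curl -X POST "http://<airbyte-host>:8000/api/public/v1/sources" \
      -H "Authorization: Bearer $AIRBYTE_TOKEN" \
      -H "Content-Type: application/json" \
      -d '{
            "name": "my-rest-api-source",
            "workspaceId": "<workspace-uuid>",
            "configuration": { "sourceType": "<connector-type>" }
          }'
    ```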
  • k

    Kothapalli Venkata Avinash

    11/27/2025, 7:54 AM
    @kapa.ai We have moved connectors to a private Docker repository and are getting the error: "You must upgrade your platform version to use this connector version. Either downgrade your connector or upgrade platform to 0.63.7"
    k
    • 2
    • 10
  • k

    Kothapalli Venkata Avinash

    11/27/2025, 1:45 PM
    @kapa.ai We want to update Destination-Databricks and push it to a private Docker registry. Please share the steps.
    k
    • 2
    • 16
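    A minimal sketch of the usual re-tag-and-push flow for the request above, assuming a generic private registry host and a placeholder image tag; after pushing, the connector's repository and tag need to be pointed at the private copy in Airbyte's connector settings.
    ```bash
    docker pull airbyte/destination-databricks:<tag>
    docker tag airbyte/destination-databricks:<tag> \
      registry.example.com/airbyte/destination-databricks:<tag>
    docker push registry.example.com/airbyte/destination-databricks:<tag>
    ```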
  • k

    Kothapalli Venkata Avinash

    11/27/2025, 2:11 PM
    @kapa.ai How do we use an authorized account to pull images from Docker?
    k
    • 2
    • 4
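    A hedged sketch for the question above using a standard Kubernetes image-pull secret; how that secret is referenced from the Airbyte Helm values differs by chart version, so check the chart's values reference for its imagePullSecrets setting.
    ```bash
    kubectl create secret docker-registry regcred \
      -n airbyte \
      --docker-server=registry.example.com \
      --docker-username=<user> \
      --docker-password=<password>
    ```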
  • s

    soma chandra sekhar attaluri

    11/27/2025, 5:20 PM
    @kapa.ai Give me the documentation for increasing the number of tables being ingested. Around 25 tables get ingested, and after 25 the remaining ones never start. What's the reason?
    k
    • 2
    • 1
  • s

    soma chandra sekhar attaluri

    11/27/2025, 5:25 PM
    @kapa.ai Can you explain how Airbyte works when I've installed it on an EC2 Linux instance and it's ingesting data from Postgres to S3? What is the underlying process?
    k
    • 2
    • 1
  • s

    soma chandra sekhar attaluri

    11/27/2025, 6:46 PM
    @kapa.ai In the latest sync I can see the number of records extracted as well as "time elapsed". What is the meaning of "time elapsed"?
    k
    • 2
    • 4
  • i

    Ishan Anilbhai Koradiya

    11/27/2025, 7:08 PM
    Hi @kapa.ai, which API should I use to start a job programmatically in Airbyte?
    k
    • 2
    • 4
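    A hedged sketch of triggering a sync via the public API for the question above; the endpoint and body shape are assumptions to verify against the API reference for the version in use.
    ```bash
    curl -X POST "http://<airbyte-host>:8000/api/public/v1/jobs" \
      -H "Authorization: Bearer $AIRBYTE_TOKEN" \
      -H "Content-Type: application/json" \
      -d '{"connectionId": "<connection-uuid>", "jobType": "sync"}'
    ```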
  • s

    Slackbot

    11/27/2025, 10:17 PM
    This message was deleted.
    k
    • 2
    • 3
  • r

    Renu Fulmali

    11/28/2025, 9:47 AM
    @kapa.ai I am trying to create a custom connector. I have set up Airbyte using the Helm chart and used oauth2-proxy for authentication, but now I am getting an error: "When configuring your OAuth app, set the callback or redirect URL to https://airbyte-test.com/auth_flow"
    k
    • 2
    • 4
  • f

    Fabrizio Spini

    11/28/2025, 11:06 AM
    @kapa.ai How can I change the resources given to a replica job on Airbyte 2.0.1 deployed by abctl? Currently my values.yaml looks like the following, but it is instantiating the default values:
    postgresql:
      enabled: false
      extraVolumes:
      - name: postgres-ssl-cert
        configMap:
          name: postgres-ssl-cert
      extraVolumeMounts:
        - name: postgres-ssl-cert
          mountPath: /etc/ssl/certs/postgres-ca.crt
          subPath: ca.crt
      extraEnv:
        - name: POSTGRES_TLS_ENABLED
          value: "true"
        - name: POSTGRES_TLS_DISABLE_HOST_VERIFICATION
          value: "true"
        - name: SQL_TLS_ENABLED
          value: "true"
        - name: SQL_TLS_DISABLE_HOST_VERIFICATION
          value: "true"
    
    global:
      auth:
        enabled: false
      resources:
        worker:
          jobs:
            requests:
              memory: "6Gi"
            limits:
              memory: "8Gi"
    k
    • 2
    • 28
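    The values.yaml above places the memory overrides under global.resources.worker.jobs. A hedged alternative, assuming the chart reads job-pod resources from global.jobs.resources (or from the JOB_MAIN_CONTAINER_* environment variables); verify the exact path with `helm show values` for the chart version abctl installs.
    ```yaml
    global:
      jobs:
        resources:
          requests:
            memory: "6Gi"
          limits:
            memory: "8Gi"
    ```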
  • k

    Kothapalli Venkata Avinash

    11/28/2025, 1:29 PM
    @kapa.ai After upgrading to 1.3.to we are seeing the error: "/usr/share/nginx/html/api/v1/instance_configuration" failed (2: No such file or directory)
    k
    • 2
    • 1
  • j

    Javier Molina Sánchez

    11/28/2025, 1:56 PM
    @kapa.ai I'm doing a blue/green deployment in RDS MySQL on AWS, and after doing the switchover I get
    java.lang.NullPointerException: Cannot invoke "io.debezium.connector.mysql.gtid.MySqlGtidSet$UUIDSet.getUUID()" because "other" is null
    I've enabled GTID in MySQL, rebooted the machine, did a full load, and then performed the blue/green deployment. Can you help me understand why Airbyte is not able to connect again?
    k
    • 2
    • 4
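    A hedged set of checks on the promoted (green) endpoint for the error above: the Debezium NullPointerException around MySqlGtidSet suggests the saved GTID state no longer lines up with the new instance's binlogs, so comparing GTID variables is a reasonable first step (the SQL is standard MySQL; the interpretation is an assumption).
    ```bash
    mysql -h <new-endpoint> -u <user> -p -e "SHOW VARIABLES LIKE 'gtid_mode';"
    mysql -h <new-endpoint> -u <user> -p -e "SELECT @@global.gtid_executed, @@global.gtid_purged\G"
    mysql -h <new-endpoint> -u <user> -p -e "SHOW BINARY LOGS;"
    ```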
  • d

    David Backx

    12/01/2025, 7:12 AM
    @kapa.ai I'm on self-hosted Airbyte version 0.63.13. Recently I have been getting the error
    readFromSource: exception caught
    java.lang.IllegalStateException: Source process is still alive, cannot retrieve exit value.
    The source is an MSSQL database and the destination is S3. It all started when I had wrong permissions on the database, and when that was fixed I started getting these errors. It reads between 300MB and 370MB on almost every try. What could be the issue?
    k
    • 2
    • 7
  • n

    Neeraj N

    12/01/2025, 7:56 AM
    The ingestion can't ingest files from Gmail to Parquet.
    k
    • 2
    • 1
  • s

    Salman Khan

    12/01/2025, 8:06 AM
    @kapa.ai I deployed Airbyte 2.0.1, then I tried to connect to the DB using this command: kubectl exec -it -n airbyte-abctl airbyte-db-0 -- psql -U airbyte -d db-airbyte. It shows this error: -bash: kubectl: command not found. Can you guide me to fix this issue, or point out any issue with my command?
    k
    • 2
    • 1
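    A hedged sketch for the error above: abctl does not install kubectl itself, so the command has to come from a separate kubectl install pointed at the kubeconfig abctl writes for its embedded cluster (the path below is the usual default, but verify it on the machine).
    ```bash
    # Install kubectl (official release binary), then reuse abctl's kubeconfig.
    curl -LO "https://dl.k8s.io/release/$(curl -Ls https://dl.k8s.io/release/stable.txt)/bin/linux/amd64/kubectl"
    sudo install -m 0755 kubectl /usr/local/bin/kubectl

    kubectl --kubeconfig ~/.airbyte/abctl/abctl.kubeconfig \
      exec -it -n airbyte-abctl airbyte-db-0 -- psql -U airbyte -d db-airbyte
    ```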
  • f

    Fabrizio Spini

    12/01/2025, 8:35 AM
    Hi @kapa.ai, I want to open a bug report with the following content:

    [Bug]: MySQL CDC: Record Duplication Due to Incorrect Offset Restart After Debezium Connector Failure (Error 1236)

    **Airbyte Version:** 2.0.1
    **Source Connector:** MySQL CDC (Debezium)
    **Destination Connector:** BigQuery
    **Sync Mode:** Change Data Capture (CDC)
    **Target Write Schema:** Append

    ### :ladybug: Bug Description

    When a MySQL CDC sync job (Run 1) fails *after* starting the data emission, the subsequent job (Run 2) incorrectly restarts from the **initial binlog offset of Run 1** instead of the last committed offset. This leads to the re-processing and re-writing of records already sent to the BigQuery destination, causing **data duplication** due to the **Append** write mode.

    ### Steps to Reproduce

    1. Configure a **MySQL CDC Source Connector** (Airbyte 2.0.1) syncing to a **BigQuery Destination** using the **Append** write mode.
    2. Start **Run 1** (sync), which successfully begins streaming from a specific position:
       ```log
       2025-11-30 11:46:50 source ERROR : Requesting streaming from position filename: db05-slave.087542, position: 87056536
       ```
    3. Force Run 1 to fail shortly after it starts streaming, specifically triggering the Debezium/MySQL Error 1236 (replica ID conflict). Failure log excerpt:
       ```log
       2025-11-30 11:46:56 source ERROR blc-db05-slave.bravofly.intra:3306 i.d.p.ErrorHandler(setProducerThrowable):52 Producer failure io.debezium.DebeziumException: A replica with the same server_uuid/server_id as this replica has connected to the source; the first event 'db05-slave.087542' at 87056536... Error code: 1236; SQLSTATE: HY000.
       ```
    4. Verify that BigQuery received records before Run 1 terminated.
    5. Correct the failure cause (e.g., resolve the server ID conflict) and start Run 2.
    6. Observe the Run 2 log: the connector logs immediately confirm it is restarting from the exact same binlog position where Run 1 started (db05-slave.087542, position=87056536), demonstrating the incorrect offset retrieval. Run 2 log excerpt:
       ```log
       2025-11-30 12:11:19 source INFO DefaultDispatcher-worker-3#global-round-1-create-partitions i.a.c.r.c.CdcPartitionsCreator(run):144 Current position 'MySqlSourceCdcPosition(fileName=db05-slave.087542, position=87056536)' does not exceed target position 'MySqlSourceCdcPosition(fileName=db05-slave.087546, position=19566958)'.
       ```
    7. Check the BigQuery table: the initial batch of records (those processed between Run 1 start and failure) is duplicated.

    ### Expected Behavior

    The subsequent job (Run 2) should resume from the last binlog offset that was successfully confirmed (committed state) by the BigQuery destination connector. This ensures that records already written to the target are not re-processed and duplicated.

    Do you already have a similar bug reported for this behaviour?
    k
    • 2
    • 1
  • p

    Piyush Shakya

    12/01/2025, 9:33 AM
    @kapa.ai Init container error encountered while processing workload for id: 85283132-f3b9-470d-847f-cea2c0a57301_29938_4_check. Encountered exception of type: class io.micronaut.data.connection.jdbc.exceptions.CannotGetJdbcConnectionException. Exception message: Failed to obtain JDBC Connection.
    k
    • 2
    • 1
  • a

    Aswin

    12/01/2025, 10:13 AM
    @kapa.ai I am using the free version of Airbyte on my own machine. I am migrating around a few TB of data from one Postgres DB to another. Once it's done, will all the indexes and bloat be gone? How is the data brought into the new DB? If it's all insert statements, then it would be fresh, right? I want to know how the initial data transfer occurs and how CDC works here in Airbyte.
    k
    • 2
    • 1
  • p

    Piyush Shakya

    12/01/2025, 3:34 PM
    @kapa.ai Where is the cron data stored in the Airbyte database tables?
    k
    • 2
    • 1
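    Rather than guessing table names for the question above (they shift between versions), a hedged approach is to inspect the config database schema directly; the credentials below reuse the defaults seen earlier in this channel, and the `connection` table name is a guess to confirm from the \dt listing.
    ```bash
    kubectl exec -it -n airbyte-abctl airbyte-db-0 -- \
      psql -U airbyte -d db-airbyte -c '\dt'
    # Then describe whichever table looks relevant, e.g.:
    kubectl exec -it -n airbyte-abctl airbyte-db-0 -- \
      psql -U airbyte -d db-airbyte -c '\d connection'
    ```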
  • l

    Lucas Segers

    12/01/2025, 4:24 PM
    Hi, do the file sources (such as S3 or SFTP) that support the "excel" file format support configuring which worksheet will be read?
    k
    • 2
    • 1
  • c

    Chris Dahms

    12/01/2025, 5:06 PM
    @kapa.ai We installed Airbyte 1.8.2 with an external PostgreSQL database using abctl, then uninstalled and re-installed Airbyte 1.8.4, and now our connections are not working; they fail with a 502 when testing in the GUI.
    k
    • 2
    • 1