# replication-troubleshooting

    Nataliia Cheremshynska

    12/09/2025, 4:30 PM
    Hello, I have an issue with an Airbyte connector. Can someone help me, please?

    vinnu

    12/09/2025, 8:33 PM
    Hi team, I’ve been trying to set up YouTube Analytics as a source in Airbyte, but I keep getting the same error every time I run the connection test:
    ```
    The Youtube account is not valid. Please make sure you're trying to use the active Youtube Account connected to your Google Account.
    ```
    I have already tried all the common solutions but still face the same issue. Here is everything I’ve done so far:
    • Enabled APIs: YouTube Data API v3, YouTube Analytics API, YouTube Reporting API
    • Created both a Desktop OAuth client and a Web App
    • Added all required scopes:
      • <https://www.googleapis.com/auth/youtube.readonly>
      • <https://www.googleapis.com/auth/yt-analytics.readonly>
      • <https://www.googleapis.com/auth/yt-analytics-monetary.readonly>
    • Created the OAuth consent screen (External), in testing mode
    • YouTube channel exists and is verified
    • Not a Brand Account (checked via YouTube account advanced settings)
    • Same Gmail account is used for Google Cloud, for the YouTube channel, and during OAuth login in Airbyte (tested in incognito)
    Even after generating a new refresh token and re-authenticating multiple times, Airbyte still fails the connection test with:
    ```
    The Youtube account is not valid. Please make sure you're trying to use the active Youtube Account connected to your Google Account.
    ```
    I am unsure whether this is:
    • an OAuth bug in the YouTube Analytics connector,
    • a channel ID mismatch issue,
    • or something related to YouTube brand/identity mapping.
    Any guidance would be greatly appreciated. Thank you!

    Vishal Garg

    12/10/2025, 8:03 AM
    @kapa.ai I am moving data from GCS to Snowflake using Incremental | Append + Deduped. If a record with id = 1 came in on October 1st, and the same record comes in again on October 10th with one column updated, will Airbyte update the record?
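    For context, here is a tiny illustration of what Incremental | Append + Deduped is generally expected to do with a repeated primary key, assuming id is the primary key and the October 10th record carries a newer cursor value:
    ```yaml
    # Illustrative data only, not an Airbyte config: one primary key seen in two syncs.
    raw_records_appended:
      - { id: 1, status: "original", extracted_at: "2025-10-01" }
      - { id: 1, status: "updated",  extracted_at: "2025-10-10" }
    final_table_after_dedup:
      - { id: 1, status: "updated",  extracted_at: "2025-10-10" }  # one row per primary key, newest record wins
    ```
    Under those assumptions, the October 10th version should replace the October 1st one in the final, deduped table.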

    Bob De Schutter

    12/10/2025, 9:13 AM
    Hi, I'm struggling to migrate from MinIO to an Azure container for storing the logs. I'm deploying with abctl and, following the Airbyte docs, I have added the following block to my values.yaml file:
    ```
    storage:
      type: "Azure"
      secretName: "airbyte-config-secrets"
      bucket:
        log: "airbyte-container"
        state: "airbyte-container"
        workloadOutput: "airbyte-container"
      azure:
        connectionString: "<my_connection_string>"
    ```
    When I'm running the abctl local install, I'm getting the following error:
    ```
    java.lang.IllegalArgumentException: Invalid connection string.
            at com.azure.storage.common.implementation.connectionstring.StorageConnectionString.create(StorageConnectionString.java:103)
    ```
    What could be wrong? @kapa.ai
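    For reference, the Azure SDK's "Invalid connection string" usually means the value is not in the standard Azure Blob connection-string form (for example, a SAS URL or just the account key was pasted). A minimal sketch of the block, reusing the key names from the snippet above and noting that the docs nest it under global: (placeholders would need real values):
    ```yaml
    # Sketch only: same keys as above, nested under global as in the docs,
    # with the standard Azure Blob connection-string format.
    global:
      storage:
        type: "Azure"
        secretName: "airbyte-config-secrets"
        bucket:
          log: "airbyte-container"
          state: "airbyte-container"
          workloadOutput: "airbyte-container"
        azure:
          connectionString: "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<account-key>;EndpointSuffix=core.windows.net"
    ```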

    s

    12/10/2025, 9:37 AM
    Hi everyone, I need some help with an issue after upgrading the Postgres destination to v3.0.0. I upgraded only the destination connector, and after the upgrade Airbyte stopped writing into my existing table. Even though my sync mode is still Incremental | Append, Airbyte is now creating a new table instead of appending to the old one. No schema changes were made on my side.
    What I’m seeing:
    • Old table: still exists with the same column structure
    • After the upgrade, Airbyte created a new table with a different internal naming pattern
    • Data is going into the new table, not the original one
    • The _airbyte_extracted_at, _airbyte_raw_id, and _airbyte_meta structure looks the same
    • The only change was upgrading to Postgres Destination 3.0
    Questions:
    1. Is this expected behavior with the new “Direct Load” architecture in v3.0?
    2. How can I force Airbyte to continue appending into my original destination table?
    3. Is there a recommended migration path to avoid table recreation during the Postgres 3.0 upgrade?
    I can share table names, logs, and the connection ID if needed. Thanks in advance for any help! 🙏

    Bob De Schutter

    12/10/2025, 10:18 AM
    @kapa.ai I'm getting the following error when trying to deploy airbyte with abctl using chart version 1.8.5:
    ```
    Encountered an issue deploying Airbyte:
        Pod: airbyte-abctl-workload-launcher-5c8f7fdd56-kksdj.187fd31f8c5f4819
        Reason: Failed
        Message: Error: couldn't find key dataplane-client-id in Secret airbyte-abctl/airbyte-auth-secrets
        Count: 13
    ```
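    Since the message points at a missing key in that secret, one way forward is to add the key(s) the workload launcher expects. A hedged sketch follows; the second key is an assumption that commonly accompanies the first, so check which keys your chart version wants, and edit or patch the existing secret rather than replacing it wholesale:
    ```yaml
    # Sketch only: the key named in the error, plus dataplane-client-secret as an assumption.
    apiVersion: v1
    kind: Secret
    metadata:
      name: airbyte-auth-secrets
      namespace: airbyte-abctl
    type: Opaque
    stringData:
      dataplane-client-id: "<client-id>"
      dataplane-client-secret: "<client-secret>"
    ```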

    s

    12/10/2025, 10:51 AM
    Hi, after upgrading the Postgres destination to v3.0.0, Airbyte stopped writing to my old table and started creating a new table instead, even though the sync mode is still Incremental | Append. No schema changes were made. Is this expected in v3.0, and how can I make Airbyte continue appending to my original table? Also, is there a migration step I’m missing? Thanks!

    Daniel Chime

    12/10/2025, 11:06 AM
    Hello everyone! I’m trying to move data from GA4 to MotherDuck using the documentation for the marketplace connector on Airbyte Cloud. The connectors (both GA4 and MotherDuck) authenticate correctly and the data loads, but the majority of the columns show up as null. Has anyone faced something like this before? What might be the issue, and how can I resolve it? Please let me know if you need any additional context or screenshots from my end.

    Bob De Schutter

    12/10/2025, 12:10 PM
    @kapa.ai I'm getting the following error when syncing a connection after upgrading to 1.9.2:
    ```
    2025-12-10 12:09:50,703 [io-executor-thread-2]    ERROR    i.a.c.s.e.h.IdNotFoundExceptionHandler(handle):33 - Not found exception class NotFoundKnownExceptionInfo {
        id: null
        message: Id not found: Could not find attempt stats for job_id: 12483 and attempt no: 0
        exceptionClassName: io.airbyte.commons.server.errors.IdNotFoundKnownException
        exceptionStack: [io.airbyte.commons.server.errors.IdNotFoundKnownException: Id not found: Could not find attempt stats for job_id: 12483 and attempt no: 0,     at io.airbyte.commons.server.errors.handlers.IdNotFoundExceptionHandler.handle(IdNotFoundExceptionHandler.kt:32),     at io.airbyte.commons.ser
        rootCauseExceptionClassName: java.lang.Class
        rootCauseExceptionStack: [io.airbyte.commons.server.errors.IdNotFoundKnownException: Could not find attempt stats for job_id: 12483 and attempt no: 0,     at io.airbyte.commons.server.handlers.AttemptHandler.getAttemptCombinedStats(AttemptHandler.kt:251),     at io.airbyte.server.apis.controllers.Attem
    }
    ```
    The error occurs in the airbyte-abctl-server pod.

    Akhilesh Kumar

    12/10/2025, 5:49 PM
    Hi guys, I am trying to run Airbyte to load data from Postgres to Snowflake. This is the first time I am running it with the CDC method and Incremental | Append + Deduped, and even though no log is generated on that table in the DB, it failed with the error below. Any help would be much appreciated.
    Sync failed: 0 bytes | no records loaded | 10m 49s
    Warning from source: Something went wrong in the connector. See the logs for more details.
    java.lang.RuntimeException: org.postgresql.util.PSQLException: ERROR: requested WAL segment 000000010000000000000006 has already been removed

    V S Supritha

    12/11/2025, 5:26 AM
    sudo abctl local install --insecure-cookies --chart-version=1.1.0 --verbose
    ```
    kubectl get pods -n airbyte-abctl                                                                                
    NAME                                                      READY   STATUS      RESTARTS   AGE                                                         
    airbyte-abctl-airbyte-bootloader                          0/1     Completed   0          16h                                                         
    airbyte-abctl-connector-builder-server-7787844c79-d7ncc   1/1     Running     0          16h                                                         
    airbyte-abctl-cron-7fcc6f85d7-9c8pd                       1/1     Running     0          16h                                                         
    airbyte-abctl-pod-sweeper-pod-sweeper-86db9f55c5-rlr6l    1/1     Running     0          16h                                                         
    airbyte-abctl-server-544d94dcdf-8r7vk                     1/1     Running     0          16h                                                         
    airbyte-abctl-temporal-5fd44f968f-fdmnm                   1/1     Running     0          16h                                                         
    airbyte-abctl-webapp-7b96bc447-n44sk                      1/1     Running     0          16h                                                         
    airbyte-abctl-worker-6486bfdccc-nggrx                     1/1     Running     0          16h                                                         
    airbyte-abctl-workload-api-server-5884589bd9-76jzl        1/1     Running     0          16h                                                         
    airbyte-abctl-workload-launcher-7bcdd8db5f-wxlvn          1/1     Running     0          16h                                                         
    airbyte-db-0                                              1/1     Running     0          16h
    ```
    I am not able to create a source in the Airbyte UI; it fails with an HTTP 500 status. The airbyte-abctl-server pod logs this error:
    ```
    2025-12-10 13:05:58 ERROR i.a.s.a.ApiHelper(execute):49 - Unexpected Exception
    java.lang.RuntimeException: java.nio.file.NoSuchFileException: /workspace/5a749837-102a-47bc-a8a8-8156a1fd850e/0/logs.log
            at io.airbyte.commons.server.converters.JobConverter.getLogRead(JobConverter.java:285) ~[io.airbyte-airbyte-commons-server-1.1.0.jar:?]
            at io.airbyte.commons.server.converters.JobConverter.getSynchronousJobRead(JobConverter.java:318) ~[io.airbyte-airbyte-commons-server-1.1.0.jar:?]
            at io.airbyte.commons.server.converters.JobConverter.getSynchronousJobRead(JobConverter.java:304) ~[io.airbyte-airbyte-commons-server-1.1.0.jar:?]
            at io.airbyte.commons.server.handlers.SchedulerHandler.reportConnectionStatus(SchedulerHandler.java:736) ~[io.airbyte-airbyte-commons-server-1.1.0.jar:?]
            at io.airbyte.commons.server.handlers.SchedulerHandler.checkSourceConnectionFromSourceCreate(SchedulerHandler.java:257) ~[io.airbyte-airbyte-commons-server-1.1.0.jar:?]
            at io.airbyte.server.apis.SchedulerApiController.lambda$executeSourceCheckConnection$1(SchedulerApiController.java:47) ~[io.airbyte-airbyte-server-1.1.0.jar:?]
            at io.airbyte.server.apis.ApiHelper.execute(ApiHelper.java:28) ~[io.airbyte-airbyte-server-1.1.0.jar:?]
            at io.airbyte.server.apis.SchedulerApiController.executeSourceCheckConnection(SchedulerApiController.java:47) ~[io.airbyte-airbyte-server-1.1.0.jar:?]
            at io.airbyte.server.apis.$SchedulerApiController$Definition$Exec.dispatch(Unknown Source) ~[io.airbyte-airbyte-server-1.1.0.jar:?]
            at io.micronaut.context.AbstractExecutableMethodsDefinition$DispatchedExecutableMethod.invokeUnsafe(AbstractExecutableMethodsDefinition.java:461) ~[micronaut-inject-4.6.5.jar:4.6.5]
            at io.micronaut.context.DefaultBeanContext$BeanContextUnsafeExecutionHandle.invokeUnsafe(DefaultBeanContext.java:4350) ~[micronaut-inject-4.6.5.jar:4.6.5]
            at io.micronaut.web.router.AbstractRouteMatch.execute(AbstractRouteMatch.java:272) ~[micronaut-router-4.6.5.jar:4.6.5]
            at io.micronaut.web.router.DefaultUriRouteMatch.execute(DefaultUriRouteMatch.java:38) ~[micronaut-router-4.6.5.jar:4.6.5]
            at io.micronaut.http.server.RouteExecutor.executeRouteAndConvertBody(RouteExecutor.java:498) ~[micronaut-http-server-4.6.5.jar:4.6.5]
            at io.micronaut.http.server.RouteExecutor.lambda$callRoute$5(RouteExecutor.java:475) ~[micronaut-http-server-4.6.5.jar:4.6.5]
            at io.micronaut.core.execution.ExecutionFlow.lambda$async$1(ExecutionFlow.java:87) ~[micronaut-core-4.6.5.jar:4.6.5]
            at io.micronaut.core.propagation.PropagatedContext.lambda$wrap$3(PropagatedContext.java:211) ~[micronaut-core-4.6.5.jar:4.6.5]
            at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
            at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
            at java.base/java.lang.Thread.run(Thread.java:1583) [?:?]
    Caused by: java.nio.file.NoSuchFileException: /workspace/5a749837-102a-47bc-a8a8-8156a1fd850e/0/logs.log
            at java.base/sun.nio.fs.UnixException.translateToIOException(UnixException.java:92) ~[?:?]
            at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:106) ~[?:?]
            at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:111) ~[?:?]
            at java.base/sun.nio.fs.UnixFileSystemProvider.newByteChannel(UnixFileSystemProvider.java:261) ~[?:?]
            at java.base/java.nio.file.Files.newByteChannel(Files.java:379) ~[?:?]
            at java.base/java.nio.file.Files.newByteChannel(Files.java:431) ~[?:?]
            at org.apache.commons.io.input.ReversedLinesFileReader.<init>(ReversedLinesFileReader.java:415) ~[commons-io-2.15.1.jar:2.15.1]
            at org.apache.commons.io.input.ReversedLinesFileReader.<init>(ReversedLinesFileReader.java:358) ~[commons-io-2.15.1.jar:2.15.1]
            at org.apache.commons.io.input.ReversedLinesFileReader.<init>(ReversedLinesFileReader.java:305) ~[commons-io-2.15.1.jar:2.15.1]
            at io.airbyte.commons.logging.LocalLogClient.tailCloudLogs(LogClient.kt:502) ~[io.airbyte-airbyte-commons-storage-1.1.0.jar:?]
            at io.airbyte.commons.logging.LogClientManager.getJobLogFile(LogClientManager.kt:37) ~[io.airbyte-airbyte-commons-storage-1.1.0.jar:?]
            at io.airbyte.commons.server.converters.JobConverter.getLogRead(JobConverter.java:283) ~[io.airbyte-airbyte-commons-server-1.1.0.jar:?]
    ```
    Please help, how can I fix this?

    Brendan Kamp

    12/11/2025, 12:03 PM
    Hi 🙂 I'm busy trying to set up Helm V2, and it seems that a hack used for S3-compatible storage no longer works (GitHub issue here). The docs say "If you are using an S3 compatible solution, use the S3 type and provide an endpoint key/value as needed.", but I'm not entirely sure what that means. Has anybody got this working?
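    Going by that sentence from the docs, here is a minimal sketch of what the values block might look like for an S3-compatible store such as MinIO; the exact placement of the endpoint key is an assumption, so verify it against the values reference for your chart version:
    ```yaml
    # Sketch only: S3-compatible storage, with endpoint pointing at the compatible service.
    global:
      storage:
        type: "S3"
        secretName: "airbyte-config-secrets"
        bucket:
          log: "airbyte-logs"
          state: "airbyte-logs"
          workloadOutput: "airbyte-logs"
        s3:
          region: "us-east-1"                         # often ignored by S3-compatible stores, but usually still required
          endpoint: "https://minio.example.com:9000"  # your S3-compatible endpoint (hypothetical URL)
    ```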

    DK

    12/11/2025, 12:57 PM
    Hello, I am getting this error: "2025-12-11 191049 warn Failed to find output files from connector within timeout of 9 minute(s). Is the connector still running?"

    DK

    12/11/2025, 1:22 PM
    @kapa.ai please check

    Jayesh Jain

    12/11/2025, 1:51 PM
    Hello guys, I am trying to set up Airbyte using Helm V2 via my HelmRelease file, and I want to use Vault as the secrets manager. I followed the documentation and tried several other things, but whenever I create a source in Airbyte I get this error: "Configuration check failed. Workload failed, source: airbyte_platform. Internal message: Init container error encountered while hydrating secrets for id: 6e2-b3-4c7-xxx-4499ad_0_check. Encountered exception of type: class secrets.persistence.SecretCoordinateException. Exception message: That secret was not found in the store! Coordinate: airbyte_workspace_00000000-0000-0000-0000-000000000000_secret_f1fc-6e3c-4877-xxx-933_v1. Failure origin: airbyte_platform." Can someone help? This is really urgent. Thanks in advance!
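    For comparison, here is a hedged sketch of what a Vault secrets-manager block can look like in values.yaml. The key names follow the chart's secretsManager pattern but should be double-checked against your chart version, and the error above can also simply mean the secret coordinate was written under a different prefix than the one Airbyte is now reading from:
    ```yaml
    # Sketch only: Vault as the secrets manager; verify key names against your chart's values reference.
    global:
      secretsManager:
        type: VAULT
        vault:
          address: "https://vault.example.com:8200"   # your Vault server (hypothetical URL)
          prefix: "secret/airbyte"                    # KV path Airbyte reads/writes secret coordinates under
          authToken: "<vault-token>"                  # or mount the token from a Kubernetes secret
    ```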

    Hayden Hansson

    12/11/2025, 2:34 PM
    Is there currently an outage? I'm struggling to get Airbyte Cloud to connect to either my Postgres database or my ClickHouse database. Both databases have the correct firewall rules (the ClickHouse database has no firewall at all), but connections are still failing with timeouts.

    Mathieu Nicolle

    12/11/2025, 4:33 PM
    Hi Airbyte Community 👋, I'm currently configuring a new connection using the HubSpot source connector and encountering an issue with the visibility of an experimental stream.
    Connection configuration:
    • Connector: HubSpot Source
    • Authentication method: OAuth (Public App)
    • Source setting: "Enable experimental streams" is activated (set to true)
    Scope check: I have carefully reviewed and confirmed the scopes granted to my HubSpot public app. They include the necessary scopes documented for the experimental streams:
    • crm.objects.contacts.read
    • business-intelligence
    • crm.schema.contacts.read
    The problem: despite activating the "Enable experimental streams" option and ensuring all required scopes are present in the OAuth app configuration, the contacts_web_analytics stream is not appearing in the list of available streams when I configure the connection.
    Questions:
    1. What could be preventing this specific experimental stream from being listed?
    2. Are there any other prerequisites (e.g., a specific HubSpot tier, feature enablement, or additional scopes not explicitly documented) required for the contacts_web_analytics stream to become visible?
    Any insights or suggestions from others who have successfully used this experimental stream would be greatly appreciated!

    Mihály Dombi

    12/11/2025, 5:58 PM
    Hi Airbyte Community! Has anyone experienced report-generation slowness, or any other stream slowness, related to Amazon Seller Partner in the last 2-3 weeks?

    Harsh Dodiya

    12/12/2025, 5:48 AM
    Hey everyone, I have set up Airbyte using abctl on my VM (4 CPU and 32 GB memory) and I am trying to sync from MySQL to a PostgreSQL DB. When we try to sync a large table, Airbyte caches a large number of records from the source connection, and because of that the replication-orchestrator Kubernetes pod experiences excessive memory consumption during syncs. We can clearly see it in the pod status:
    ```
    airbyte-abctl replication-job-662-attempt-0 0/3 Error 0 4h24m
    airbyte-abctl replication-job-690-attempt-0 2/3 OOMKilled 0 9m15s
    ```
    I installed Airbyte using abctl and didn't set any resource limits during the installation. Now I'd like to adjust the memory allocation and apply new resource limits. Is there a way to update these settings after installation, or do I need to reinstall Airbyte to change them?
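    A full reinstall shouldn't be required: abctl local install accepts a values file, and re-running the install with it applies the new settings. A minimal sketch, assuming the chart's job resource keys sit under global.jobs.resources (worth verifying against the values reference for your version):
    ```yaml
    # values.yaml sketch, applied with: abctl local install --values ./values.yaml
    global:
      jobs:
        resources:
          requests:
            memory: "2Gi"
          limits:
            memory: "8Gi"   # give replication (sync) pods more headroom than the default
    ```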

    Abhijith C

    12/12/2025, 6:07 AM
    Hey everyone, I’m deploying Airbyte on EKS using Helm V2. I want to configure heartbeat and job timeouts (e.g., heartbeat-max-seconds-between-messages) via values.yaml. https://docs.airbyte.com/platform/understanding-airbyte/heartbeats
    Questions:
    1. Is it possible to configure runtime flags (like heartbeat-max-seconds-between-messages) via values.yaml in Helm V2?
    2. If yes, under which section should these flags go (airbyte: vs airbyteBootloader:)?
    3. If no, what is the recommended way to set these values in an EKS deployment?
    Thanks for the clarification!
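    On (1) and (2): one pattern the Helm charts support is injecting environment variables into a component via its extraEnv list, which is how ad-hoc runtime settings are often passed in. Whether the heartbeat timeout is actually read from an environment variable, and what that variable would be called, is an assumption here, so treat the sketch below purely as an illustration of the mechanism:
    ```yaml
    # Illustration of the extraEnv mechanism only; the variable name below is hypothetical,
    # not a documented Airbyte flag.
    worker:
      extraEnv:
        - name: HEARTBEAT_MAX_SECONDS_BETWEEN_MESSAGES
          value: "10800"
    ```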

    Mohammad Ashikur Rahman

    12/12/2025, 9:03 AM
    Hi everyone, I’m currently setting up the TrustPilot connector on a self-hosted Airbyte cluster and encountering some challenges:
    1. When using an API key, the connector successfully passes the connection test. However, when creating a pipeline, none of the stream data is loaded.
    2. I also tested OAuth 2.0 using access token and refresh token details. Unfortunately, I’m encountering the following authorization error, even though the API and access token work perfectly when tested locally via Postman.
    Has anyone faced similar issues with the TrustPilot connector or been able to use it successfully? Any suggestions or guidance would be greatly appreciated! Thank you!
    ```
    Configuration check failed
    'Encountered an error while checking availability of stream configured_business_units. Error: 401 Client Error: Unauthorized for url: <https://api.trustpilot.com/v1/oauth/oauth-business-users-for-applications/accesstoken>'
    ```

    Michal Krawczyk

    12/12/2025, 11:02 AM
    I'm running self-hosted Airbyte deployed to Kubernetes on EKS with the Helm charts. After upgrading to Airbyte 2.0.1 (from 1.6, so earlier versions could also be affected), I noticed that some syncs finish with a "Sync succeeded" status on the dashboard but don't load any data (0 bytes | no records loaded). When I looked into the Airbyte database, I noticed that the attempts are marked with a succeeded status while the attempt output reports sync failures. I see many different reasons why jobs may fail like this (details in the 🧵) and they can be handled and resolved separately, but my primary question is why Airbyte reports failed attempts as succeeded. This causes some major issues:
    • Airbyte doesn't restart failed attempts
    • Airbyte doesn't send failed-connector webhooks
    • It's hard to notice failed attempts when looking at the connectors list
    Has anyone noticed similar behavior? Is this an Airbyte issue that we should report on GitHub?

    Jason Wiener

    12/12/2025, 4:31 PM
    Good day. I am hoping to attract the attention of a maintainer to PR 60892, addressing an open issue that prevents the discover method from working on a Redshift source if the schema contains late-binding views. What's a good way to do so? Thanks.

    Richard Lam

    12/12/2025, 4:33 PM
    Hey all, I noticed some odd behavior in the latest Snowflake connector. Currently we have it set to Full Refresh, and the first sync successfully pulls in all the records, but subsequent syncs bring in 0 records. If we do a data reset, the same thing happens: the first sync succeeds with all records, then subsequent syncs don't bring in anything. It almost seems to act like an incremental sync despite us selecting Full Refresh. Has anyone else run into this, or is this a known issue with the current Snowflake connector? Any help or context is appreciated, thank you!

    Biplove Jaisi

    12/13/2025, 12:23 AM
    Hi everyone 👋 I’m trying to use PyAirbyte to move data from an S3 source to an Iceberg destination. The sync from the source to DuckDB cache works fine, but when loading into Iceberg, I’m getting an error saying that the namespace cannot be null. The issue is that the S3 source doesn’t seem to have any field or config option to populate the namespace, so I’m not sure how to handle this requirement on the destination side. Has anyone faced this before or found a workaround? Any guidance would be really appreciated. Thanks in advance! 🙏

    Py Bot

    12/13/2025, 6:56 AM
    Hi everyone, I'm setting up the TikTok Marketing Ads connector and need some guidance on the recommended configuration. My use case requires the following:
    1. The connection should sync only the previous day’s data, not the current day's incomplete data.
    2. Each sync run should store the extracted data in separate S3 folders (using date-based prefixes).
    3. The sync should run daily at 1:00 AM, after the previous day’s data is finalized in TikTok.
    Could you please advise on the best configuration options or best practices within Airbyte to achieve this setup? Any guidance on the correct sync mode, cursor strategy, and S3 prefix configuration would be greatly appreciated.
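    On point 2, the S3 destination exposes an "S3 Path Format" option that accepts date placeholders, which gives you one folder per sync day. A sketch of the relevant settings is below, shown as YAML for readability rather than as a literal Airbyte config file; the bucket path is made up, and the cron expression for point 3 is only an example in the platform's Quartz-style format:
    ```yaml
    # Illustrative settings only (not a literal config file).
    s3_bucket_path: "tiktok_ads"                                              # hypothetical prefix
    s3_path_format: "${NAMESPACE}/${STREAM_NAME}/${YEAR}-${MONTH}-${DAY}/"   # date-based prefix per sync
    schedule_cron: "0 0 1 * * ?"                                              # daily at 1:00 AM (example; check the time zone)
    ```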

    Rajib Imran

    12/14/2025, 2:57 AM
    Hi experts — I’m stuck and need your help. I created a custom API connection to my on-premises server and receive JSON as output. How can I format this JSON to insert it into a ClickHouse table? The fields/schemas aren’t being auto-mapped. Where do I map the JSON fields to the database table columns? In the schema section I can’t find the JSON fields inside the data array (see attached Postman output).

    omar sayed

    12/14/2025, 5:46 PM
    Hi experts, I have an issue where, after creating my connection from BigQuery to Postgres, the sync runs for more than half an hour and still shows 0 records and 0 bytes, and no data is moved. Any idea what might cause that?

    omar sayed

    12/14/2025, 7:40 PM
    Is this a known issue? Could not setting a default dataset cause this?

    Yury Liavitski

    12/15/2025, 11:54 AM
    Hi everyone! I have a problem with the Google Ads connector in the OSS version of Airbyte (2.0.1). I am using it to load campaign data to Redshift. It worked fine up until December 11th, then at one point it just stopped loading data. What I see is that the sync starts to load data, loads a few thousand rows, then just stays in the running state without loading anything, and I cannot see the problem in the log. I tried a full reload of the campaigns table too, same problem: it loads ~1,360,000 rows, then just keeps logging the same fragment and does not load any more records. Airbyte version 2.0.1, running on K8s, deployed via the Helm chart. Source connector: Google Ads v4.1.3. Destination: Redshift v3.5.3. I will post the relevant section of the logs in the comments. I've tried re-authenticating the source connector; that didn't help.