# replication-ask-ai

    Rafael Felipe

    12/12/2025, 5:07 PM
MongoDB self-hosted... warning before the error:
280 Unable to query for op log stats, exception: {}Command failed with error 13 (Unauthorized): 'not authorized on local to execute command { collStats: "oplog.rs", $db: "local", lsid: { id: UUID("a21208b4-bb4a-4dad-8470-990f6af66e19") } }' on server
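The Unauthorized error on `collStats` against `local.oplog.rs` usually means the database user configured in the Airbyte MongoDB source cannot read the `local` database. A minimal mongosh sketch of the usual fix, assuming the Airbyte user is named `airbyte_user` (placeholder):

```javascript
// Run in mongosh as an admin. Grants read access on the "local" database,
// which contains the oplog collection the connector inspects for CDC stats.
use admin
db.grantRolesToUser("airbyte_user", [{ role: "read", db: "local" }])
```

After granting the role, re-run the connection check; the warning should no longer appear.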

    Gideon Stowell

    12/12/2025, 5:19 PM
@kapa.ai I am noticing that my 3 Airbyte server pods are running at nearly 4 GB of memory almost all the time. I am running version 1.7.2. Is this normal?
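For JVM-based Airbyte services, pod memory is governed both by the chart's resource limits and by the JVM heap settings; capping one without the other often leads to either waste or OOM kills. A sketch of a Helm `values.yaml` fragment one might try (key names follow the common Airbyte chart layout and should be checked against your chart version):

```yaml
# Hypothetical values.yaml fragment -- verify keys against your chart version.
server:
  resources:
    requests:
      memory: 1Gi
    limits:
      memory: 2Gi
  extraEnv:
    - name: JAVA_TOOL_OPTIONS
      value: "-Xmx1536m"   # keep the JVM heap comfortably below the container limit
```

Note that a JVM tends to grow toward its configured max heap and hold it, so steady ~4 GB usage may simply reflect the default heap size rather than a leak.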

    Rafael Felipe

    12/12/2025, 5:50 PM
Before the MongoDB error:
2025-12-12 144905 destination INFO main i.a.c.i.d.a.FlushWorkers(close):227 Waiting for all streams to flush.
2025-12-12 144906 destination INFO main i.a.c.i.d.a.FlushWorkers(close):226 REMAINING_BUFFERS_INFO

    Kevin O'Keefe

    12/12/2025, 7:29 PM
    @kapa.ai I am having issues when adding a google ads source in airbyte oss. I can authenticate fine with the google ads token, but when I click set up source I get a 504 error. I am filtering to one customer ID but I believe that the front end is timing out before the source is completely queried and created.

    Tanuja

    12/12/2025, 7:34 PM
Subject: Setting workspace-level concurrent sync limits in Airbyte Cloud

Hi team! I’m looking for a way to limit the number of concurrent syncs at the workspace level in Airbyte Cloud to comply with our constraints (e.g., max 3 concurrent syncs).

What I need: a way to ensure we never exceed X concurrent syncs (e.g., 3) across our entire workspace, regardless of how syncs are triggered (scheduled, manual, API, or Airflow).

Questions:
1. Does Airbyte Cloud support setting a workspace-level concurrent sync limit?
2. If yes, where can I configure this (UI/API/CLI)?
3. If not available in the UI, can this be configured on your backend for our workspace?
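If no workspace-wide cap is available, a common client-side workaround is to gate your own triggers: count the jobs currently in flight before starting a new sync and only trigger when under the limit. A sketch in Python; the decision logic is pure and the API usage (endpoint, parameters, `API_TOKEN`, `WORKSPACE_ID`, `trigger_sync`) is an assumption to adapt to your setup:

```python
# Client-side concurrency gate for triggering Airbyte syncs (sketch).
# Pure decision logic, separated from any API calls so it can be tested.

def count_running(jobs: list) -> int:
    """Count jobs whose status indicates they are still in flight."""
    return sum(1 for j in jobs if j.get("status") in ("running", "pending"))

def under_sync_limit(running_job_count: int, limit: int = 3) -> bool:
    """Return True if another sync may be triggered without exceeding the limit."""
    return running_job_count < limit

# Hypothetical usage against the Airbyte public API (details assumed):
#
#   import requests
#   resp = requests.get(
#       "https://api.airbyte.com/v1/jobs",
#       headers={"Authorization": f"Bearer {API_TOKEN}"},
#       params={"workspaceIds": WORKSPACE_ID, "status": "running"},
#   )
#   if under_sync_limit(count_running(resp.json().get("data", []))):
#       trigger_sync(connection_id)   # placeholder for your trigger call
```

This guards scheduled, manual, and Airflow-driven triggers alike, as long as every trigger path goes through the gate.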

    Kevin O'Keefe

    12/12/2025, 8:02 PM
@kapa.ai Deploying the v2 Helm chart, I receive this error: mapping key "DATAPLANE_CLIENT_ID_SECRET_KEY" already defined at line 24; line 220: mapping key "DATAPLANE_CLIENT_SECRET_SECRET_KEY" already defined at line 25

    Jerry Bao

    12/12/2025, 8:22 PM
    @kapa.ai Is there a way to set the schedulerName for airbyte jobs running in kubernetes? I would like to enable bin packing of jobs

    Omree Gal-Oz

    12/12/2025, 11:06 PM
@kapa.ai why does Airbyte force non-legacy state messages in Airbyte 2.0+?

    Kevin O'Keefe

    12/12/2025, 11:45 PM
@kapa.ai how can I configure source connector test timeouts? My Google Ads source times out at 30 seconds.

    Mohammad Atif

    12/14/2025, 7:29 AM
@kapa.ai I don't see the login page in my self-hosted Airbyte setup using Helm

    Ishan Anilbhai Koradiya

    12/14/2025, 9:21 AM
    Hi @kapa.ai getting this error from worker

    Aviad Deri

    12/14/2025, 12:21 PM
    @kapa.ai how do i set authentication in gke installation?

    stanley

    12/14/2025, 8:08 PM
I have one source-s3 to Postgres connection. S3 has just one CSV with 417 rows, with a 'scheduled' sync every 24 hours. The job runs at 6:37pm every day without issue and I can see 417 rows in Postgres. But something happens between 9pm and 9am: that same table always shows 390 rows (not 417). I set up a trigger for any insert/modify/delete row operation on that table and it did not capture anything. How to explain this? Could Airbyte or Postgres be doing some internal cleaning/truncating, and how do I troubleshoot this weird issue?

    Lenin Mishra

    12/15/2025, 1:56 AM
    I have an endpoint which provides a field updatedAt in the form of epoch seconds (integer). How can I use this field to build incremental sync using datetime based cursor?
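The low-code CDK's `DatetimeBasedCursor` can parse integer epoch seconds via the `%s` format token, so the `updatedAt` field can drive a datetime cursor directly. A sketch of the `incremental_sync` block; the `since` request-parameter name and the `start_date` config field are assumptions to adapt to your API:

```yaml
# Sketch of a low-code incremental_sync block for an epoch-seconds cursor.
incremental_sync:
  type: DatetimeBasedCursor
  cursor_field: updatedAt
  datetime_format: "%s"          # epoch seconds, as the API emits them
  cursor_datetime_formats:
    - "%s"                       # how to parse the cursor value from records
  start_datetime:
    datetime: "{{ config['start_date'] }}"
    datetime_format: "%Y-%m-%dT%H:%M:%SZ"
  start_time_option:
    type: RequestOption
    field_name: since            # hypothetical query parameter name
    inject_into: request_parameter
```

The key point is `cursor_datetime_formats: ["%s"]`, which tells the cursor how to interpret the integer values coming back in records, independent of the format used on outgoing requests.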

    Fabrizio Spini

    12/15/2025, 8:16 AM
@kapa.ai I have deployed Airbyte 2.0 on Kubernetes, using as its DB an external DB that was previously used with Airbyte version 1.5. On the Airbyte UI I now see all sources/destinations/connections from the old 1.5 server, but on the worker I have the following errors and I cannot proceed, nor can I create new sources. How can I resolve this?
    2025-12-15 00:00:43,067 [Workflow Poller taskQueue="SYNC", namespace="default": 5]	WARN	i.t.i.w.Poller$PollerUncaughtExceptionHandler(logPollErrors):362 - Failure in poller thread Workflow Poller taskQueue="SYNC", namespace="default": 5
    io.grpc.StatusRuntimeException: CANCELLED: context canceled
    	at io.grpc.stub.ClientCalls.toStatusRuntimeException(ClientCalls.java:351)
    	at io.grpc.stub.ClientCalls.getUnchecked(ClientCalls.java:332)
    	at io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:174)
    	at io.temporal.api.workflowservice.v1.WorkflowServiceGrpc$WorkflowServiceBlockingStub.pollWorkflowTaskQueue(WorkflowServiceGrpc.java:5634)
    	at io.temporal.internal.worker.WorkflowPollTask.doPoll(WorkflowPollTask.java:174)
    	at io.temporal.internal.worker.WorkflowPollTask.poll(WorkflowPollTask.java:155)
    	at io.temporal.internal.worker.WorkflowPollTask.poll(WorkflowPollTask.java:49)
    	at io.temporal.internal.worker.Poller$PollExecutionTask.run(Poller.java:336)
    	at io.temporal.internal.worker.Poller$PollLoopTask.run(Poller.java:296)
    	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    	at java.base/java.lang.Thread.run(Thread.java:1583)
    
    2025-12-15 00:02:41,868 [Workflow Poller taskQueue="ui_commands", namespace="default": 2]	WARN	i.t.i.w.Poller$PollerUncaughtExceptionHandler(logPollErrors):362 - Failure in poller thread Workflow Poller taskQueue="ui_commands", namespace="default": 2
    io.grpc.StatusRuntimeException: CANCELLED: context canceled
    	at io.grpc.stub.ClientCalls.toStatusRuntimeException(ClientCalls.java:351)
    	at io.grpc.stub.ClientCalls.getUnchecked(ClientCalls.java:332)
    	at io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:174)
    	at io.temporal.api.workflowservice.v1.WorkflowServiceGrpc$WorkflowServiceBlockingStub.pollWorkflowTaskQueue(WorkflowServiceGrpc.java:5634)
    	at io.temporal.internal.worker.WorkflowPollTask.doPoll(WorkflowPollTask.java:174)
    	at io.temporal.internal.worker.WorkflowPollTask.poll(WorkflowPollTask.java:155)
    	at io.temporal.internal.worker.WorkflowPollTask.poll(WorkflowPollTask.java:49)
    	at io.temporal.internal.worker.Poller$PollExecutionTask.run(Poller.java:336)
    	at io.temporal.internal.worker.Poller$PollLoopTask.run(Poller.java:296)
    	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    	at java.base/java.lang.Thread.run(Thread.java:1583)
    
    2025-12-14 19:36:30,863 [Workflow Poller taskQueue="CONNECTION_UPDATER", namespace="default": 4]	WARN	i.t.i.w.Poller$PollerUncaughtExceptionHandler(logPollErrors):362 - Failure in poller thread Workflow Poller taskQueue="CONNECTION_UPDATER", namespace="default": 4
    io.grpc.StatusRuntimeException: CANCELLED: context canceled
    	at io.grpc.stub.ClientCalls.toStatusRuntimeException(ClientCalls.java:351)
    	at io.grpc.stub.ClientCalls.getUnchecked(ClientCalls.java:332)
    	at io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:174)
    	at io.temporal.api.workflowservice.v1.WorkflowServiceGrpc$WorkflowServiceBlockingStub.pollWorkflowTaskQueue(WorkflowServiceGrpc.java:5634)
    	at io.temporal.internal.worker.WorkflowPollTask.doPoll(WorkflowPollTask.java:174)
    	at io.temporal.internal.worker.WorkflowPollTask.poll(WorkflowPollTask.java:155)
    	at io.temporal.internal.worker.WorkflowPollTask.poll(WorkflowPollTask.java:49)
    	at io.temporal.internal.worker.Poller$PollExecutionTask.run(Poller.java:336)
    	at io.temporal.internal.worker.Poller$PollLoopTask.run(Poller.java:296)
    	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    	at java.base/java.lang.Thread.run(Thread.java:1583)

    Bogdan

    12/15/2025, 9:26 AM
"Why does connector-builder consume so much memory, and is this normal?" @kapa.ai
NAMESPACE       NAME                                                      CPU(cores)  MEMORY(bytes)
airbyte-abctl   airbyte-abctl-connector-builder-server-68f5dd5b48-glzks   2m          3731Mi
airbyte-abctl   airbyte-abctl-cron-7dcf574857-6zgmv                       21m         866Mi
airbyte-abctl   airbyte-abctl-server-6597c64c5d-j9jth                     2m          1191Mi
airbyte-abctl   airbyte-abctl-temporal-67d4dd4845-9mg44                   91m         638Mi
airbyte-abctl   airbyte-abctl-worker-5768484bf4-cf2r7                     2m          539Mi
airbyte-abctl   airbyte-abctl-worker-5768484bf4-qc4jm                     2m          533Mi
airbyte-abctl   airbyte-abctl-worker-5768484bf4-wqnqt                     2m          529Mi
airbyte-abctl   airbyte-abctl-workload-api-server-55bc5c8776-g2226        5m          437Mi
airbyte-abctl   airbyte-abctl-workload-launcher-69d6b95789-mzlgz          2m          459Mi

    Fabrizio Spini

    12/15/2025, 11:16 AM
@kapa.ai trying to downgrade the MySQL source to version 3.46.2; source-mysql-spec failed with:
    Defaulted container "connector-sidecar" out of: connector-sidecar, main, init (init)
    Unsetting empty environment variable 'DATA_PLANE_SERVICE_ACCOUNT_CREDENTIALS_PATH'
    Unsetting empty environment variable 'AIRBYTE_URL'
    Unsetting empty environment variable 'KEYCLOAK_INTERNAL_REALM_ISSUER'
    Unsetting empty environment variable 'DATA_PLANE_SERVICE_ACCOUNT_EMAIL'
    Unsetting empty environment variable 'KEYCLOAK_CLIENT_ID'
    Unsetting empty environment variable 'AIRBYTE_ROLE'
    Unsetting empty environment variable 'CONTROL_PLANE_AUTH_ENDPOINT'
    2025-12-15 10:46:46,950 [main]	INFO	i.a.c.ApplicationKt(main):14 - Sidecar start
    
        ___    _      __          __
       /   |  (_)____/ /_  __  __/ /____
      / /| | / / ___/ __ \/ / / / __/ _ \
     / ___ |/ / /  / /_/ / /_/ / /_/  __/
    /_/  |_/_/_/  /_.___/\__, /\__/\___/
                        /____/
     : airbyte-connector-sidecar :
    
    2025-12-15 10:46:48,334 [main]	INFO	i.m.c.e.DefaultEnvironment(<init>):170 - Established active environments: [worker-v2, k8s, control-plane, edition-community, local-secrets]
    2025-12-15 10:46:49,003 [main]	INFO	i.m.l.PropertiesLoggingLevelsConfigurer(configureLogLevelForPrefix):113 - Setting log level 'INFO' for logger: 'io.netty'
    2025-12-15 10:46:49,004 [main]	INFO	i.m.l.PropertiesLoggingLevelsConfigurer(configureLogLevelForPrefix):113 - Setting log level 'ERROR' for logger: 'com.zaxxer.hikari'
    2025-12-15 10:46:49,004 [main]	INFO	i.m.l.PropertiesLoggingLevelsConfigurer(configureLogLevelForPrefix):113 - Setting log level 'INFO' for logger: 'io.grpc'
    2025-12-15 10:46:49,005 [main]	INFO	i.m.l.PropertiesLoggingLevelsConfigurer(configureLogLevelForPrefix):113 - Setting log level 'INFO' for logger: 'io.temporal'
    2025-12-15 10:46:49,006 [main]	INFO	i.m.l.PropertiesLoggingLevelsConfigurer(configureLogLevelForPrefix):113 - Setting log level 'ERROR' for logger: 'com.zaxxer.hikari.pool'
    2025-12-15 10:46:49,006 [main]	INFO	i.m.l.PropertiesLoggingLevelsConfigurer(configureLogLevelForPrefix):113 - Setting log level 'INFO' for logger: 'io.fabric8.kubernetes.client'
    2025-12-15 10:46:49,037 [main]	INFO	i.m.r.Micronaut(start):183 - No embedded container found. Running as CLI application
    2025-12-15 10:46:52,573 [pool-3-thread-1]	INFO	i.a.c.i.LineGobbler$Companion(gobble$lambda$2):108 -
    2025-12-15 10:46:52,609 [pool-3-thread-1]	INFO	i.a.c.i.LineGobbler$Companion(gobble$lambda$2):108 - ----- START SPEC -----
    2025-12-15 10:46:52,610 [pool-3-thread-1]	INFO	i.a.c.i.LineGobbler$Companion(gobble$lambda$2):108 -
    2025-12-15 10:55:52,599 [main]	WARN	i.a.c.ConnectorWatcher(waitForConnectorOutput):108 - Failed to find output files from connector within timeout of 9 minute(s). Is the connector still running?
    2025-12-15 10:55:52,622 [main]	INFO	i.a.c.ConnectorWatcher(failWorkload):324 - Failing workload e668e35b-1934-4c24-913c-133c3f3312b6_spec.
    2025-12-15 10:55:52,687 [main]	INFO	i.a.c.ConnectorWatcher(exitFileNotFound):242 - Deliberately exiting process with code 2.

    Tom Dobson

    12/15/2025, 12:59 PM
We deploy Airbyte 2.0.1 with Helm chart v2.0.19 on Kubernetes. Airbyte keeps failing after deployment with:
{
  "url": "https://airbyte.staging.tailstech-nonprod.com/workspaces/42482ecd-c993-477d-b92f-daeba56e567e/connections",
  "airbyteVersion": "2.0.1",
  "errorType": "HttpError",
  "errorConstructor": "$d",
  "error": {
    "i18nKey": "errors.http.notFound",
    "i18nParams": { "status": 404 },
    "name": "HttpError",
    "requestId": "98KEVSNf2V1wLfUxoSqE5B",
    "request": {
      "url": "/api/v1/workspaces/get",
      "method": "POST",
      "headers": { "Content-Type": "application/json" },
      "data": { "workspaceId": "42482ecd-c993-477d-b92f-daeba56e567e" }
    },
    "status": 404,
    "response": {
      "message": "Internal Server Error: Could not find configuration for STANDARD_WORKSPACE: 42482ecd-c993-477d-b92f-daeba56e567e.",
      "exceptionClassName": "io.airbyte.commons.server.errors.IdNotFoundKnownException",
      "exceptionStack": [],
      "rootCauseExceptionStack": []
    }
  },
  "stacktrace": "HttpError: errors.http.notFound\n at fyt (https://airbyte.staging.tailstech-nonprod.com/assets/core-l0xde5f173.js:56:1032)",
  "userAgent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/143.0.0.0 Safari/537.36",
  "featureFlags": {}
}
What's going wrong?

    Martin Brummerstedt

    12/15/2025, 1:05 PM
@kapa.ai Everything works with the source in the builder and the destination works in other connections. What could be the reasons for this issue? Sync summary:
{
  "status": "failed",
  "recordsSynced": 0,
  "bytesSynced": 0,
  "startTime": 1765803772778,
  "endTime": 1765803788018,
  "totalStats": {
    "additionalStats": {},
    "bytesCommitted": 0,
    "bytesEmitted": 0,
    "destinationStateMessagesEmitted": 0,
    "destinationWriteEndTime": 0,
    "destinationWriteStartTime": 1765803776888,
    "meanSecondsBeforeSourceStateMessageEmitted": 0,
    "maxSecondsBeforeSourceStateMessageEmitted": 0,
    "meanSecondsBetweenStateMessageEmittedandCommitted": 0,
    "recordsEmitted": 0,
    "recordsCommitted": 0,
    "recordsFilteredOut": 0,
    "bytesFilteredOut": 0,
    "replicationEndTime": 1765803788012,
    "replicationStartTime": 1765803772778,
    "sourceReadEndTime": 1765803787989,
    "sourceReadStartTime": 1765803776888,
    "sourceStateMessagesEmitted": 0
  },
  "streamStats": [],
  "performanceMetrics": {},
  "streamCount": 2
}
2025-12-15 140308 replication-orchestrator INFO Failures: [
  {
    "failureOrigin": "destination",
    "failureType": "config_error",
    "internalMessage": "io.airbyte.cdk.ConfigErrorException: Failed to initialize connector operation",
    "externalMessage": "Failed to initialize connector operation",
    "metadata": { "attemptNumber": 0, "jobId": 61725921, "from_trace_message": true, "connector_command": "write" },
    "timestamp": 1765803777482
  },
  {
    "failureOrigin": "destination",
    "internalMessage": "Destination process exited with non-zero exit code 1",
    "externalMessage": "Something went wrong within the destination connector",
    "metadata": { "attemptNumber": 0, "jobId": 61725921, "connector_command": "write" },
    "stacktrace": "io.airbyte.workers.internal.exception.DestinationException: Destination process exited with non-zero exit code 1\n\tat io.airbyte.container.orchestrator.worker.DestinationReader.run(ReplicationTask.kt:55)\n\tat io.airbyte.container.orchestrator.worker.ReplicationWorker$runJobs$2$tasks$1$1.invokeSuspend(ReplicationWorker.kt:152)\n\tat io.airbyte.container.orchestrator.worker.ReplicationWorker$runJobs$2$tasks$1$1.invoke(ReplicationWorker.kt)\n\tat io.airbyte.container.orchestrator.worker.ReplicationWorker$runJobs$2$tasks$1$1.invoke(ReplicationWorker.kt)\n\tat io.airbyte.container.orchestrator.worker.util.AsyncUtils$runAsync$1$1.invokeSuspend(AsyncUtils.kt:22)\n\tat io.airbyte.container.orchestrator.worker.util.AsyncUtils$runAsync$1$1.invoke(AsyncUtils.kt)\n\tat io.airbyte.container.orchestrator.worker.util.AsyncUtils$runAsync$1$1.invoke(AsyncUtils.kt)\n\tat kotlinx.coroutines.intrinsics.UndispatchedKt.startUndspatched(Undispatched.kt:66)\n\tat kotlinx.coroutines.intrinsics.UndispatchedKt.startUndispatchedOrReturn(Undispatched.kt:43)\n\tat kotlinx.coroutines.BuildersKt__Builders_commonKt.withContext(Builders.common.kt:165)\n\tat kotlinx.coroutines.BuildersKt.withContext(Unknown Source)\n\tat io.airbyte.container.orchestrator.worker.util.AsyncUtils$runAsync$1.invokeSuspend(AsyncUtils.kt:21)\n\tat kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:34)\n\tat kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:100)\n\tat io.micrometer.core.instrument.composite.CompositeTimer.record(CompositeTimer.java:141)\n\tat io.micrometer.core.instrument.Timer.lambda$wrap$2(Timer.java:199)\n\tat datadog.trace.bootstrap.instrumentation.java.concurrent.Wrapper.run(Wrapper.java:47)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)\n\tat java.base/java.lang.Thread.run(Thread.java:1583)\n",
    "timestamp": 1765803788011
  }
]

    Andrea Brenna

    12/15/2025, 1:18 PM
    @kapa.ai I have Airbyte 2.0.1 and a connection between Source MySQL (3.51.5) and Destination BigQuery (3.0.16). After a series of successful syncs, I have a series of syncs that are still successful but that do not import and extract any rows. The only way to fix this is to refresh the entire connection. Why does this happen?

    Adrian Monge

    12/15/2025, 1:24 PM
    @kapa.ai is there an easy way, either through API or code, that we could limit the number of rows synced for every table in a single connection? For context, we are working on a side process that will extract a limited number of rows from some larger sources to facilitate analysis. We don't need all the data; a fixed sample is more than enough for our use case.

    Kevin Lefevre

    12/15/2025, 2:03 PM
@kapa.ai I am developing the source-hubspot connector locally. How can I perform sample tests to extract data and visualize the output locally?

    Pedro Peixoto

    12/15/2025, 2:47 PM
@kapa.ai We have migrated our Airbyte instance to 2.0.1 (from 1.5.1); we are using Helm to deploy the instance in k8s. After the upgrade, we seem to be unable to trigger any source check (stuck with the below stack trace in the worker component). Any known issue with this?
    2025-12-15 14:44:27,904 [workflow-method-connection_manager_5611cad0-f3de-4366-a9cd-0ea1afe...-f74c0650-6d6b-4054-a2df-b414213cbf75]       INFO    i.a.w.t.s.ConnectionManagerWorkflowImpl(checkConnectionsWithCommandApi):650 - SOURCE CHECK: Starting
    2025-12-15 14:44:27,914 [workflow-method-connection_manager_5611cad0-f3de-4366-a9cd-0ea1afe...-f74c0650-6d6b-4054-a2df-b414213cbf75]       INFO    i.a.w.t.s.ConnectionManagerWorkflowImpl(runCheckWithCommandApiInChildWorkflow):1071 - Running command check for source with id fa1c844f-939f-4b31-b6e3-0c01748ef162 with the use of the new command API

    Lucas Segers

    12/15/2025, 4:25 PM
@kapa.ai in the low-code connector builder, can I set a call timeout to prevent hangs?

    Simon Schmitke

    12/15/2025, 4:34 PM
@kapa.ai I have a connector from Postgres RDS with 5 tables that sync to Snowflake using CDC. 4 of the tables are 2B rows large. Each table takes longer than 24 hours to sync on its own for an initial sync, meaning a timeout will occur (since syncs occur every 24 hours). Airbyte seems to be missing a lot of data when it syncs during this initial load. Once it starts incrementing after the initial snapshot, things seem to work 100%. However, we're missing a lot of data (millions of rows) from the initial sync. My initial waiting time is 2400 seconds; size of queue = 10K; LSN commit behaviour is "After loading data in the destination"; initial load timeout in hours = 60. Do you have any tips for me to increase the success rate of our initial snapshot? We use Airbyte version 2.0.1 and Snowflake connector 4.0.27.

    Onur Musaoglu

    12/15/2025, 7:34 PM
    @kapa.ai, I am trying to use feature flags in my Kubernetes deployment. It's a custom Kubernetes deployment, and I am not using Helm charts. I am using Airbyte 1.7. I created a
    flags.yaml
file at
/etc/airbyte/flags.yaml
in the containers for Airbyte services like airbyte-server, airbyte-workload-launcher, etc. Then I provided the following environment variables to these containers:
    "FEATURE_FLAG_PATH": "/etc/airbyte/flags.yaml"
    "FEATURE_FLAG_CLIENT": "OPENFEATURE"
    "OPENFEATURE_SDK_FILE_PATH": "/etc/airbyte/flags.yaml"
    However, this does not work. I couldn't find much documentation about feature flag management. What is the issue with my approach? How can I provide feature flags to Airbyte 1.7 with my Kubernetes deployment?
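To my knowledge, the file-based flag client in OSS Airbyte is selected with `FEATURE_FLAG_CLIENT: "configfile"` rather than `OPENFEATURE`, and the file uses a simple list of name/serve pairs. A sketch of what one might try (the flag name below is a placeholder; check the platform source for the flags your version actually reads):

```yaml
# Hypothetical /etc/airbyte/flags.yaml for the config-file flag client.
flags:
  - name: platform.some-feature-flag
    serve: true
```

With the matching env vars on each service container:

```yaml
# Environment variables, as a sketch:
FEATURE_FLAG_CLIENT: "configfile"
FEATURE_FLAG_PATH: "/etc/airbyte/flags.yaml"
```

If the client is set to an unrecognized value, the platform may silently fall back to a no-op flag client, which would match the "does not work" symptom described above.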

    Henry Mattingly

    12/15/2025, 8:39 PM
    has renamed the channel from "ask-ai" to "replication-ask-ai"

    Ian Hansen

    12/15/2025, 8:41 PM
@kapa.ai I am getting duplicate records based on primary key in a full refresh sync from a Postgres source to the S3 Data Lake connector. How do I troubleshoot this and create a bug report for this issue?
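When filing a bug like this, it helps to quantify the duplication first: export a sample of the synced records and count how many primary-key values appear more than once. A small Python sketch (the `id` field name is a placeholder for your stream's primary key):

```python
from collections import Counter

def duplicate_pks(records: list, pk: str = "id") -> dict:
    """Map each duplicated primary-key value to its occurrence count."""
    counts = Counter(r[pk] for r in records)
    return {k: n for k, n in counts.items() if n > 1}

# Example: two records share id=1, so it is reported with its count.
rows = [{"id": 1}, {"id": 2}, {"id": 1}]
dups = duplicate_pks(rows)  # -> {1: 2}
```

Including the duplicate counts, the connector versions, and whether the duplicates appear within one sync or across syncs makes the bug report much easier to triage.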

    Hari Haran R

    12/16/2025, 6:33 AM
@kapa.ai is there any API that returns a human-readable description of a cron expression like 0 30 11 * * ?. I see the Airbyte UI shows the human-readable format, but when I inspect the network calls I'm not able to see any such API.
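Airbyte schedules use Quartz-style crons (seconds minutes hours day-of-month month day-of-week), so `0 30 11 * * ?` means "at 11:30:00 every day". For the simple fixed-time case, a tiny describer can be written directly; full-featured libraries (e.g. cron-descriptor) handle the general grammar:

```python
# Minimal sketch: describe a Quartz-style cron (sec min hour dom month dow)
# whose time fields are plain numbers. Real parsers cover far more syntax.

def describe_quartz_cron(expr: str) -> str:
    sec, minute, hour, dom, month, dow = expr.split()[:6]
    if not (sec.isdigit() and minute.isdigit() and hour.isdigit()):
        raise ValueError("only fixed sec/min/hour supported in this sketch")
    time_part = f"{int(hour):02d}:{int(minute):02d}:{int(sec):02d}"
    # "?" in Quartz means "no specific value" for dom/dow.
    if dom == "*" and month == "*" and dow in ("?", "*"):
        return f"at {time_part} every day"
    return f"at {time_part} (day-of-month={dom}, month={month}, day-of-week={dow})"
```

Usage: `describe_quartz_cron("0 30 11 * * ?")` returns `"at 11:30:00 every day"`, matching what the UI renders.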

    Fabrizio Spini

    12/16/2025, 7:55 AM
@kapa.ai I have the following error on the source-mysql-check pod:
    2025-12-16 07:52:18,199 [main]	INFO	i.a.c.ConnectorWatcher(saveConnectorOutput):186 - Writing output of 435bb9a5-7887-4809-aa58-28c27df0d7ad_138badd9-c058-4b44-9f21-a2b86a18699a_0_check to the doc store
    2025-12-16 07:52:21,092 [main]	WARN	i.a.a.c.c.ClientConfigurationSupport(generateDefaultRetryPolicy$lambda$5):79 - Retry attempt 1 of 5. Last response: null
    2025-12-16 07:52:23,663 [main]	WARN	i.a.a.c.c.ClientConfigurationSupport(generateDefaultRetryPolicy$lambda$5):79 - Retry attempt 2 of 5. Last response: null
    2025-12-16 07:52:26,543 [main]	WARN	i.a.a.c.c.ClientConfigurationSupport(generateDefaultRetryPolicy$lambda$5):79 - Retry attempt 3 of 5. Last response: null
    2025-12-16 07:52:28,802 [main]	WARN	i.a.a.c.c.ClientConfigurationSupport(generateDefaultRetryPolicy$lambda$5):79 - Retry attempt 4 of 5. Last response: null
    2025-12-16 07:52:31,253 [main]	WARN	i.a.a.c.c.ClientConfigurationSupport(generateDefaultRetryPolicy$lambda$5):79 - Retry attempt 5 of 5. Last response: null
    2025-12-16 07:52:31,512 [main]	ERROR	i.a.a.c.c.ClientConfigurationSupport(generateDefaultRetryPolicy$lambda$7):95 - Retry attempts exceeded.
    java.io.IOException: HTTP error: 500 Internal Server Error
    	at io.airbyte.api.client.interceptor.ThrowOn5xxInterceptor.intercept(ThrowOn5xxInterceptor.kt:16)
    	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
    	at okhttp3.internal.connection.RealCall.getResponseWithInterceptorChain$okhttp(RealCall.kt:201)
    	at okhttp3.internal.connection.RealCall.execute(RealCall.kt:154)
    	at dev.failsafe.okhttp.FailsafeCall.lambda$execute$0(FailsafeCall.java:117)
    	at dev.failsafe.Functions.lambda$get$0(Functions.java:46)
    	at dev.failsafe.internal.RetryPolicyExecutor.lambda$apply$0(RetryPolicyExecutor.java:74)
    	at dev.failsafe.SyncExecutionImpl.executeSync(SyncExecutionImpl.java:187)
    	at dev.failsafe.CallImpl.execute(CallImpl.java:33)
    	at dev.failsafe.okhttp.FailsafeCall.execute(FailsafeCall.java:121)
    	at io.airbyte.api.client.generated.WorkloadOutputApi.writeWorkloadOutputWithHttpInfo(WorkloadOutputApi.kt:376)
    	at io.airbyte.api.client.generated.WorkloadOutputApi.writeWorkloadOutput(WorkloadOutputApi.kt:62)
    	at io.airbyte.workers.workload.WorkloadOutputWriter.writeOutputThroughServer(WorkloadOutputWriter.kt:82)
    	at io.airbyte.workers.workload.WorkloadOutputWriter.write(WorkloadOutputWriter.kt:65)
    	at io.airbyte.connectorSidecar.ConnectorWatcher.saveConnectorOutput(ConnectorWatcher.kt:187)
    	at io.airbyte.connectorSidecar.ConnectorWatcher.run(ConnectorWatcher.kt:80)
    	at io.airbyte.connectorSidecar.ApplicationKt.main(Application.kt:22)
    	at io.airbyte.connectorSidecar.ApplicationKt.main(Application.kt)
    
    2025-12-16 07:52:31,515 [main]	ERROR	i.a.a.c.c.ClientConfigurationSupport(generateDefaultRetryPolicy$lambda$3):63 - Failed to call unknown.  Last response: null
    java.io.IOException: HTTP error: 500 Internal Server Error
    	at io.airbyte.api.client.interceptor.ThrowOn5xxInterceptor.intercept(ThrowOn5xxInterceptor.kt:16)
    	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
    	at okhttp3.internal.connection.RealCall.getResponseWithInterceptorChain$okhttp(RealCall.kt:201)
    	at okhttp3.internal.connection.RealCall.execute(RealCall.kt:154)
    	at dev.failsafe.okhttp.FailsafeCall.lambda$execute$0(FailsafeCall.java:117)
    	at dev.failsafe.Functions.lambda$get$0(Functions.java:46)
    	at dev.failsafe.internal.RetryPolicyExecutor.lambda$apply$0(RetryPolicyExecutor.java:74)
    	at dev.failsafe.SyncExecutionImpl.executeSync(SyncExecutionImpl.java:187)
    	at dev.failsafe.CallImpl.execute(CallImpl.java:33)
    	at dev.failsafe.okhttp.FailsafeCall.execute(FailsafeCall.java:121)
    	at io.airbyte.api.client.generated.WorkloadOutputApi.writeWorkloadOutputWithHttpInfo(WorkloadOutputApi.kt:376)
    	at io.airbyte.api.client.generated.WorkloadOutputApi.writeWorkloadOutput(WorkloadOutputApi.kt:62)
    	at io.airbyte.workers.workload.WorkloadOutputWriter.writeOutputThroughServer(WorkloadOutputWriter.kt:82)
    	at io.airbyte.workers.workload.WorkloadOutputWriter.write(WorkloadOutputWriter.kt:65)
    	at io.airbyte.connectorSidecar.ConnectorWatcher.saveConnectorOutput(ConnectorWatcher.kt:187)
    	at io.airbyte.connectorSidecar.ConnectorWatcher.run(ConnectorWatcher.kt:80)
    	at io.airbyte.connectorSidecar.ApplicationKt.main(Application.kt:22)
    	at io.airbyte.connectorSidecar.ApplicationKt.main(Application.kt)
    
    2025-12-16 07:52:31,516 [main]	ERROR	i.a.c.ConnectorWatcher(handleException):200 - Error performing operation: io.airbyte.workers.workload.exception.DocStoreAccessException
    io.airbyte.workers.workload.exception.DocStoreAccessException: Unable to write output for 435bb9a5-7887-4809-aa58-28c27df0d7ad_138badd9-c058-4b44-9f21-a2b86a18699a_0_check
    	at io.airbyte.workers.workload.WorkloadOutputWriter.writeOutputThroughServer(WorkloadOutputWriter.kt:86)
    	at io.airbyte.workers.workload.WorkloadOutputWriter.write(WorkloadOutputWriter.kt:65)
    	at io.airbyte.connectorSidecar.ConnectorWatcher.saveConnectorOutput(ConnectorWatcher.kt:187)
    	at io.airbyte.connectorSidecar.ConnectorWatcher.run(ConnectorWatcher.kt:80)
    	at io.airbyte.connectorSidecar.ApplicationKt.main(Application.kt:22)
    	at io.airbyte.connectorSidecar.ApplicationKt.main(Application.kt)
    Caused by: java.io.IOException: HTTP error: 500 Internal Server Error
    	at io.airbyte.api.client.interceptor.ThrowOn5xxInterceptor.intercept(ThrowOn5xxInterceptor.kt:16)
    	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
    	at okhttp3.internal.connection.RealCall.getResponseWithInterceptorChain$okhttp(RealCall.kt:201)
    	at okhttp3.internal.connection.RealCall.execute(RealCall.kt:154)
    	at dev.failsafe.okhttp.FailsafeCall.lambda$execute$0(FailsafeCall.java:117)
    	at dev.failsafe.Functions.lambda$get$0(Functions.java:46)
    	at dev.failsafe.internal.RetryPolicyExecutor.lambda$apply$0(RetryPolicyExecutor.java:74)
    	at dev.failsafe.SyncExecutionImpl.executeSync(SyncExecutionImpl.java:187)
    	at dev.failsafe.CallImpl.execute(CallImpl.java:33)
    	at dev.failsafe.okhttp.FailsafeCall.execute(FailsafeCall.java:121)
    	at io.airbyte.api.client.generated.WorkloadOutputApi.writeWorkloadOutputWithHttpInfo(WorkloadOutputApi.kt:376)
    	at io.airbyte.api.client.generated.WorkloadOutputApi.writeWorkloadOutput(WorkloadOutputApi.kt:62)
    	at io.airbyte.workers.workload.WorkloadOutputWriter.writeOutputThroughServer(WorkloadOutputWriter.kt:82)
    	... 5 common frames omitted
    
    2025-12-16 07:52:33,484 [main]	WARN	i.a.a.c.c.ClientConfigurationSupport(generateDefaultRetryPolicy$lambda$5):79 - Retry attempt 1 of 5. Last response: null
    2025-12-16 07:52:36,030 [main]	WARN	i.a.a.c.c.ClientConfigurationSupport(generateDefaultRetryPolicy$lambda$5):79 - Retry attempt 2 of 5. Last response: null
    2025-12-16 07:52:38,146 [main]	WARN	i.a.a.c.c.ClientConfigurationSupport(generateDefaultRetryPolicy$lambda$5):79 - Retry attempt 3 of 5. Last response: null
    2025-12-16 07:52:40,634 [main]	WARN	i.a.a.c.c.ClientConfigurationSupport(generateDefaultRetryPolicy$lambda$5):79 - Retry attempt 4 of 5. Last response: null
    2025-12-16 07:52:43,302 [main]	WARN	i.a.a.c.c.ClientConfigurationSupport(generateDefaultRetryPolicy$lambda$5):79 - Retry attempt 5 of 5. Last response: null
    2025-12-16 07:52:43,663 [main]	ERROR	i.a.a.c.c.ClientConfigurationSupport(generateDefaultRetryPolicy$lambda$7):95 - Retry attempts exceeded.
    java.io.IOException: HTTP error: 500 Internal Server Error
    	at io.airbyte.api.client.interceptor.ThrowOn5xxInterceptor.intercept(ThrowOn5xxInterceptor.kt:16)
    	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
    	at okhttp3.internal.connection.RealCall.getResponseWithInterceptorChain$okhttp(RealCall.kt:201)
    	at okhttp3.internal.connection.RealCall.execute(RealCall.kt:154)
    	at dev.failsafe.okhttp.FailsafeCall.lambda$execute$0(FailsafeCall.java:117)
    	at dev.failsafe.Functions.lambda$get$0(Functions.java:46)
    	at dev.failsafe.internal.RetryPolicyExecutor.lambda$apply$0(RetryPolicyExecutor.java:74)
    	at dev.failsafe.SyncExecutionImpl.executeSync(SyncExecutionImpl.java:187)
    	at dev.failsafe.CallImpl.execute(CallImpl.java:33)
    	at dev.failsafe.okhttp.FailsafeCall.execute(FailsafeCall.java:121)
    	at io.airbyte.api.client.generated.WorkloadOutputApi.writeWorkloadOutputWithHttpInfo(WorkloadOutputApi.kt:376)
    	at io.airbyte.api.client.generated.WorkloadOutputApi.writeWorkloadOutput(WorkloadOutputApi.kt:62)
    	at io.airbyte.workers.workload.WorkloadOutputWriter.writeOutputThroughServer(WorkloadOutputWriter.kt:82)
    	at io.airbyte.workers.workload.WorkloadOutputWriter.write(WorkloadOutputWriter.kt:65)
    	at io.airbyte.connectorSidecar.ConnectorWatcher.handleException(ConnectorWatcher.kt:207)
    	at io.airbyte.connectorSidecar.ConnectorWatcher.run(ConnectorWatcher.kt:83)
    	at io.airbyte.connectorSidecar.ApplicationKt.main(Application.kt:22)
    	at io.airbyte.connectorSidecar.ApplicationKt.main(Application.kt)
    
    2025-12-16 07:52:43,664 [main]	ERROR	i.a.a.c.c.ClientConfigurationSupport(generateDefaultRetryPolicy$lambda$3):63 - Failed to call unknown.  Last response: null
    java.io.IOException: HTTP error: 500 Internal Server Error
    	at io.airbyte.api.client.interceptor.ThrowOn5xxInterceptor.intercept(ThrowOn5xxInterceptor.kt:16)
    	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
    	at okhttp3.internal.connection.RealCall.getResponseWithInterceptorChain$okhttp(RealCall.kt:201)
    	at okhttp3.internal.connection.RealCall.execute(RealCall.kt:154)
    	at dev.failsafe.okhttp.FailsafeCall.lambda$execute$0(FailsafeCall.java:117)
    	at dev.failsafe.Functions.lambda$get$0(Functions.java:46)
    	at dev.failsafe.internal.RetryPolicyExecutor.lambda$apply$0(RetryPolicyExecutor.java:74)
    	at dev.failsafe.SyncExecutionImpl.executeSync(SyncExecutionImpl.java:187)
    	at dev.failsafe.CallImpl.execute(CallImpl.java:33)
    	at dev.failsafe.okhttp.FailsafeCall.execute(FailsafeCall.java:121)
    	at io.airbyte.api.client.generated.WorkloadOutputApi.writeWorkloadOutputWithHttpInfo(WorkloadOutputApi.kt:376)
    	at io.airbyte.api.client.generated.WorkloadOutputApi.writeWorkloadOutput(WorkloadOutputApi.kt:62)
    	at io.airbyte.workers.workload.WorkloadOutputWriter.writeOutputThroughServer(WorkloadOutputWriter.kt:82)
    	at io.airbyte.workers.workload.WorkloadOutputWriter.write(WorkloadOutputWriter.kt:65)
    	at io.airbyte.connectorSidecar.ConnectorWatcher.handleException(ConnectorWatcher.kt:207)
    	at io.airbyte.connectorSidecar.ConnectorWatcher.run(ConnectorWatcher.kt:83)
    	at io.airbyte.connectorSidecar.ApplicationKt.main(Application.kt:22)
    	at io.airbyte.connectorSidecar.ApplicationKt.main(Application.kt)
    
    2025-12-16 07:52:43,666 [main]	INFO	i.a.c.ConnectorWatcher(failWorkload):342 - Failing workload 435bb9a5-7887-4809-aa58-28c27df0d7ad_138badd9-c058-4b44-9f21-a2b86a18699a_0_check.
    2025-12-16 07:52:43,729 [main]	INFO	i.a.c.ConnectorWatcher(exitInternalError):254 - Deliberately exiting process with code 1.