# ask-ai

    Ishan Anilbhai Koradiya

    11/20/2025, 9:46 AM
@kapa.ai I'm getting this error in some of my syncs: `Cannot invoke "retrofit2.Response.raw()" because the return value of "dev.failsafe.event.ExecutionCompletedEvent.getResult()" is null` (Airbyte v1.8.3)

    Barun Pattanaik

    11/20/2025, 10:03 AM
@kapa.ai I am using Airbyte to connect from MongoDB to BigQuery, but the data I am getting is in a single column as JSON. I need the data split into different columns, as in my original native table. What is the solution for this basic normalization?
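As an aside on what "basic normalization" means here (illustration only, outside Airbyte): flattening each JSON document into one column per field, including nested fields. A minimal Python sketch:

```python
import json

# One Mongo-style document per row, as it might land in a single JSON column.
rows = ['{"_id": "1", "name": "Ada", "address": {"city": "London"}}']

def flatten(doc, prefix=""):
    """Recursively flatten nested objects into dotted column names."""
    out = {}
    for key, value in doc.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            out.update(flatten(value, name + "."))
        else:
            out[name] = value
    return out

flat_rows = [flatten(json.loads(r)) for r in rows]
print(flat_rows[0])  # {'_id': '1', 'name': 'Ada', 'address.city': 'London'}
```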

    Tom Dobson

    11/20/2025, 10:53 AM
I've raised the following issue. Is there anything we can do?

[source-google-analytics-data-api] Schema discovery fails with multiple GA4 property IDs on connector versions > 2.7.7 (Airbyte OSS 2.0.1) #69745 (Open)
Description: dobsontom opened yesterday · edited by dobsontom

Connector Name: source-google-analytics-data-api
Connector Version: 2.9.17 (behaviour appears to regress on versions newer than 2.7.7)
What step the error happened? Configuring a new connector

Relevant information

Summary
On Airbyte OSS 2.0.1 (Helm chart 2.0.19), the GA4 source (source-google-analytics-data-api) fails schema discovery on newer connector versions when multiple GA4 property IDs are configured in a single source.
• ✅ `2.7.7`: discovery and syncs succeed with 10 property IDs.
• ❌ `2.9.17`: discovery often fails with 5–10 property IDs and the UI shows: An unknown error occurred. (HTTP 504)
• Fewer property IDs → higher chance of discovery succeeding (1–2 usually works; 5–10 is unreliable).
This effectively blocks upgrading the GA4 connector while keeping our current multi-property configuration.

Environment
• Airbyte edition: Community
• Airbyte version: 2.0.1
• Deployment: Kubernetes
• Helm chart: 2.0.19
• GA4 connector: source-google-analytics-data-api
• Connector versions tested:
  ◦ ✅ `2.7.7` → discovery succeeds with 10 property IDs
  ◦ ❌ `2.9.17` → discovery frequently fails with 5–10 property IDs (HTTP 504 in UI)

Source configuration
• Auth type: GA4 service account (Google Analytics Data API)
• GA4 properties: 10 property IDs in a single source configuration
• Streams: aggregate/report-style GA4 streams (e.g. `events_report`, `_events_times`, etc.) for all 10 properties

Workarounds
• Splitting into two sources (5 + 5) reduces but does not remove failures on `2.9.17`.
• Downgrading to `2.7.7` with the same config/property IDs restores stable discovery and syncs.
The same GA4 properties also work with:
• the Snowflake Connector for Google Analytics Aggregate Data (Snowflake native)
• the Airbyte GA4 connector v2.7.7

Expected behaviour
• Schema discovery should reliably succeed with 5–10 GA4 property IDs in a single source.
• Newer GA4 connector versions should not require reducing the number of property IDs per source just to pass discovery.

Actual behaviour
On connector versions newer than `2.7.7` (e.g. `2.9.17`):
• Connection test / schema discovery often fails when 5–10 GA4 property IDs are configured.
• The Airbyte UI shows: An unknown error occurred. (HTTP 504)
• The failure occurs before any sync jobs start.
From logs:
• The connector check succeeds (`Check succeeded`, exit code 0).
• Discovery runs and appears to write the catalog, but schema validation and/or orchestration then fails.
On `2.7.7` with the same configuration:
• Schema discovery succeeds and syncs run successfully for all 10 GA4 property IDs.

Steps to reproduce
1. Deploy Airbyte OSS 2.0.1 on Kubernetes using Helm chart 2.0.19.
2. Create a GA4 source using source-google-analytics-data-api at version `2.9.17` (or any version > `2.7.7`).
3. Configure:
  ◦ GA4 auth via service account.
  ◦ A list of 10 GA4 property IDs in the source configuration.
4. Click “Set up connection” to trigger schema discovery / connection test.
5. Observe:
  ◦ UI shows An unknown error occurred. (HTTP 504).
  ◦ Discovery fails, although connector workloads appear to complete successfully.
6. Downgrade the same source to connector `2.7.7` with no config changes.
7. Run discovery again → discovery succeeds, syncs run normally.

Impact
• We cannot upgrade the GA4 connector beyond `2.7.7` while keeping our current multi-property setup.
• This blocks adoption of new connector features and fixes.
• Because the failure is at discovery time, we cannot test runtime behaviour on newer versions with our real configuration.

Additional notes
From the logs on a failing setup:
• Source check passes, but schema validation reports:
  ◦ `JSON schema validation failed.`
  ◦ `$.auth_type: must be the constant value 'Client'`
  ◦ required properties `client_id`, `client_secret`, and `refresh_token` reported as missing.
• This is unexpected because we are using service account auth, not OAuth client auth, suggesting a possible mismatch between the config schema and validation for newer versions.
• Despite the eventual HTTP 504 in the UI, logs show the discovery workload:
  ◦ exits with code 0;
  ◦ logs `Check succeeded`, `Writing catalog result to API...`, `Finished writing catalog result to API.`;
  ◦ `Marking workload ..._discover as successful`.
This suggests discovery itself completes, and the failure may occur later in API/schema validation or orchestration when many property IDs are present.

Relevant log output
    Below is a trimmed excerpt around a failing discovery (connector version `2.9.17`, 10 property IDs). Full logs available if needed.
    
        2025-11-18 16:31:42,323 [io-executor-thread-7] INFO  i.a.v.j.JsonSchemaValidator(test):119 - JSON schema validation failed.
        2025-11-18 16:31:42 errors: $.auth_type: must be the constant value 'Client',
          $: required property 'client_id' not found,
          $: required property 'client_secret' not found,
          $: required property 'refresh_token' not found
    
        2025-11-18 16:31:59,147 [pool-3-thread-1] INFO  i.a.c.HeartbeatMonitor$HeartbeatTask(run):92 - Transitioning workload to running state
    
        2025-11-18 16:32:02,994 [pool-4-thread-1] INFO  i.a.c.i.LineGobbler$Companion(gobble$lambda$2):108 - ----- START CHECK -----
    
        2025-11-18 16:32:08,637 [main] INFO  i.a.w.i.VersionedAirbyteStreamFactory(internalLog):248 - Check succeeded
    
        2025-11-18 16:32:09,146 [io-executor-thread-1] INFO  i.a.v.j.JsonSchemaValidator(test):119 - JSON schema validation failed.
        2025-11-18 16:32:09 errors: $.auth_type: must be the constant value 'Client',
          $: required property 'client_id' not found,
          $: required property 'client_secret' not found,
          $: required property 'refresh_token' not found
    
        2025-11-18 16:32:33,104 [pool-4-thread-1] INFO  i.a.c.i.LineGobbler$Companion(gobble$lambda$2):108 - ----- START DISCOVER -----
    
        2025-11-18 16:33:16,013 [main] INFO  i.a.c.ConnectorMessageProcessor(setOutput):199 - Writing catalog result to API...
        2025-11-18 16:33:16,162 [main] INFO  i.a.c.ConnectorMessageProcessor(setOutput):203 - Finished writing catalog result to API.
        2025-11-18 16:33:16,269 [main] INFO  i.a.c.ConnectorWatcher(markWorkloadSuccess):191 - Marking workload ..._discover as successful
        2025-11-18 16:33:16,307 [main] INFO  i.a.c.ConnectorWatcher(exitProperly):248 - Deliberately exiting process with code 0.
    
        2025-11-18 16:33:20,001 [scheduled-executor-thread-4] ERROR i.m.s.DefaultTaskExceptionHandler(handle):47 -
          Error invoking scheduled task for bean [io.airbyte.cron.jobs.SelfHealTemporalWorkflows@f11fd65]
          RESOURCE_EXHAUSTED: namespace rate limit exceeded
    
    In the Airbyte UI, this discovery attempt ultimately surfaces as:
    
        An unknown error occurred. (HTTP 504)

    Mateo Colina

    11/20/2025, 11:30 AM
    @kapa.ai what does this mean for source-sftp-bulk "When the state history of the file store is full, syncs will only read files that were last modified in the provided day range."

    Ishan Anilbhai Koradiya

    11/20/2025, 12:25 PM
Hi @kapa.ai, how do I set node selector labels for the replication and other job-related pods that Airbyte creates dynamically?
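Job pods (replication, check, discover) are launched on the fly, so their node selectors are typically configured once at the deployment level rather than per pod. A minimal Helm values sketch — the exact key path varies across chart versions, so verify these keys against your chart's values.yaml before applying:

```yaml
# values.yaml (sketch; key path varies by chart version)
global:
  jobs:
    kube:
      nodeSelector:
        airbyte-jobs: "true"   # nodes must carry this label
```

Under the hood this is surfaced to the platform as the `JOB_KUBE_NODE_SELECTOR` environment variable (comma-separated `key=value` pairs), which can also be set directly on the worker / workload-launcher deployment.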

    Tom Dobson

    11/20/2025, 12:33 PM
    Everything is working but I'm seeing this in the logs. What's the issue?
    [dd.trace 2025-11-20 12:32:12:668 +0000] [dd-trace-processor] WARN datadog.trace.agent.common.writer.ddagent.DDAgentApi - Error while sending 1 (size=1KB) traces. Total: 99, Received: 99, Sent: 0, Failed: 99. java.net.ConnectException: Failed to connect to localhost/[0:0:0:0:0:0:0:1]:8126 (Will not log warnings for 5 minutes)

    Renu Fulmali

    11/20/2025, 12:45 PM
@kapa.ai I am trying to create a connector and I am getting an error:
i.a.c.s.e.h.UncaughtExceptionHandler(handle):33 - Uncaught exception
java.lang.RuntimeException: java.net.SocketTimeoutException: timeout
at io.airbyte.server.apis.ApiHelper.execute(ApiHelper.kt:44)
at io.airbyte.server.apis.controllers.ConnectorBuilderProjectApiController.readConnectorBuilderProjectStream(ConnectorBuilderProjectApiController.kt:147)
at io.airbyte.server.apis.controllers.$ConnectorBuilderProjectApiController$Definition$Exec.dispatch(Unknown Source)
at io.micronaut.context.AbstractExecutableMethodsDefinition$DispatchedExecutableMethod.invokeUnsafe(AbstractExecutableMethodsDefinition.java:461)
at io.micronaut.context.DefaultBeanContext$BeanContextUnsafeExecutionHandle.invokeUnsafe(DefaultBeanContext.java:4438)
at io.micronaut.web.router.AbstractRouteMatch.execute(AbstractRouteMatch.java:272)
at io.micronaut.web.router.DefaultUriRouteMatch.execute(DefaultUriRouteMatch.java:38)
at io.micronaut.http.server.RouteExecutor.executeRouteAndConvertBody(RouteExecutor.java:465)
at io.micronaut.http.server.RouteExecutor.lambda$callRoute$5(RouteExecutor.java:442)
at io.micronaut.core.execution.ExecutionFlow.lambda$async$0(ExecutionFlow.java:92)
at io.micronaut.core.propagation.PropagatedContext.lambda$wrap$3(PropagatedContext.java:232)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: java.net.SocketTimeoutException: timeout
at okio.SocketAsyncTimeout.newTimeoutException(JvmOkio.kt:146)
at okio.AsyncTimeout.access$newTimeoutException(AsyncTimeout.kt:161)
at okio.AsyncTimeout$source$1.read(AsyncTimeout.kt:339)
at okio.RealBufferedSource.indexOf(RealBufferedSource.kt:430)
at okio.RealBufferedSource.readUtf8LineStrict(RealBufferedSource.kt:323)
at okhttp3.internal.http1.HeadersReader.readLine(HeadersReader.kt:29)
at okhttp3.internal.http1.Http1ExchangeCodec.readResponseHeaders(Http1ExchangeCodec.kt:180)
at okhttp3.internal.connection.Exchange.readResponseHeaders(Exchange.kt:110)
at okhttp3.internal.http.CallServerInterceptor.intercept(CallServerInterceptor.kt:93)
at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
at okhttp3.internal.connection.ConnectInterceptor.intercept(ConnectInterceptor.kt:34)
at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
at okhttp3.internal.cache.CacheInterceptor.intercept(CacheInterceptor.kt:95)
at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
at okhttp3.internal.http.BridgeInterceptor.intercept(BridgeInterceptor.kt:83)
at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
at okhttp3.internal.http.RetryAndFollowUpInterceptor.intercept(RetryAndFollowUpInterceptor.kt:76)
at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
at io.airbyte.api.client.auth.InternalClientTokenInterceptor.intercept(InternalClientTokenInterceptor.kt:32)
at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
at okhttp3.internal.connection.RealCall.getResponseWithInterceptorChain$okhttp(RealCall.kt:201)
at okhttp3.internal.connection.RealCall.execute(RealCall.kt:154)
at dev.failsafe.okhttp.FailsafeCall.lambda$execute$0(FailsafeCall.java:117)
at dev.failsafe.Functions.lambda$get$0(Functions.java:46)
at dev.failsafe.internal.RetryPolicyExecutor.lambda$apply$0(RetryPolicyExecutor.java:74)
at dev.failsafe.SyncExecutionImpl.executeSync(SyncExecutionImpl.java:187)
at dev.failsafe.CallImpl.execute(CallImpl.java:33)
at dev.failsafe.okhttp.FailsafeCall.execute(FailsafeCall.java:121)
at io.airbyte.connectorbuilderserver.api.client.generated.ConnectorBuilderServerApi.readStreamWithHttpInfo(ConnectorBuilderServerApi.kt:2596)
at io.airbyte.connectorbuilderserver.api.client.generated.ConnectorBuilderServerApi.readStream(ConnectorBuilderServerApi.kt:433)
at io.airbyte.commons.server.handlers.ConnectorBuilderProjectsHandler.readConnectorBuilderProjectStream(ConnectorBuilderProjectsHandler.kt:549)
at io.airbyte.server.apis.controllers.ConnectorBuilderProjectApiController.readConnectorBuilderProjectStream$lambda$7(ConnectorBuilderProjectApiController.kt:148)
at io.airbyte.server.apis.ApiHelper.execute(ApiHelper.kt:32)
... 13 common frames omitted
Caused by: java.net.SocketException: Socket closed
at java.base/sun.nio.ch.NioSocketImpl.endRead(NioSocketImpl.java:243)
at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:323)
at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
at okio.InputStreamSource.read(JvmOkio.kt:93)
at okio.AsyncTimeout$source$1.read(AsyncTimeout.kt:128)
... 43 common frames omitted

    Konathala Chaitanya

    11/20/2025, 1:36 PM
    @kapa.ai
    Caused by: org.postgresql.util.PSQLException: FATAL: "base/30724477" is not a valid data directory
    Detail: File "base/30724477/PG_VERSION" is missing.
    at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2733)
    at org.postgresql.core.v3.QueryExecutorImpl.readStartupMessages(QueryExecutorImpl.java:2845)
    at org.postgresql.core.v3.QueryExecutorImpl.<init>(QueryExecutorImpl.java:176)
    at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:323)
    at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:54)
    at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:273)
    at org.postgresql.Driver.makeConnection(Driver.java:446)
    at org.postgresql.Driver.connect(Driver.java:298)
    at com.zaxxer.hikari.util.DriverDataSource.getConnection(DriverDataSource.java:139)
    at com.zaxxer.hikari.pool.PoolBase.newConnection(PoolBase.java:362)
    at com.zaxxer.hikari.pool.PoolBase.newPoolEntry(PoolBase.java:203)
    at com.zaxxer.hikari.pool.HikariPool.createPoolEntry(HikariPool.java:479)
    at com.zaxxer.hikari.pool.HikariPool$PoolEntryCreator.call(HikariPool.java:744)
    at com.zaxxer.hikari.pool.HikariPool$PoolEntryCreator.call(HikariPool.java:723)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    ... 3 common frames omitted
    2025-11-20 13:22:06,390 [health-executor-thread-2]	ERROR	i.a.d.s.i.j.HealthCheckServiceJooqImpl(healthCheck):40 - Health check error:
    org.jooq.exception.DataAccessException: Error getting connection from data source HikariDataSource (HikariPool-1)
    at org.jooq_3.19.7.POSTGRES.debug(Unknown Source)
    at org.jooq.impl.DataSourceConnectionProvider.acquire(DataSourceConnectionProvider.java:90)
    at org.jooq.impl.DefaultExecuteContext.connection(DefaultExecuteContext.java:651)
    at org.jooq.impl.AbstractQuery.connection(AbstractQuery.java:388)
    at org.jooq.impl.AbstractQuery.execute(AbstractQuery.java:308)
    at org.jooq.impl.AbstractResultQuery.fetch(AbstractResultQuery.java:290)
    at org.jooq.impl.DefaultDSLContext.fetch(DefaultDSLContext.java:870)
    at io.airbyte.data.services.impls.jooq.HealthCheckServiceJooqImpl.lambda$healthCheck$0(HealthCheckServiceJooqImpl.java:38)
    at io.airbyte.db.Database.query(Database.java:23)
    at io.airbyte.db.ExceptionWrappingDatabase.query(ExceptionWrappingDatabase.java:31)
    at io.airbyte.data.services.impls.jooq.HealthCheckServiceJooqImpl.healthCheck(HealthCheckServiceJooqImpl.java:38)
    at io.airbyte.commons.server.handlers.HealthCheckHandler.health(HealthCheckHandler.java:24)
    at io.airbyte.server.apis.HealthApiController.getHealthCheck(HealthApiController.java:32)
    at io.airbyte.server.apis.$HealthApiController$Definition$Exec.dispatch(Unknown Source)
    at io.micronaut.context.AbstractExecutableMethodsDefinition$DispatchedExecutableMethod.invokeUnsafe(AbstractExecutableMethodsDefinition.java:461)
    at io.micronaut.context.DefaultBeanContext$BeanContextUnsafeExecutionHandle.invokeUnsafe(DefaultBeanContext.java:4354)
    at io.micronaut.web.router.AbstractRouteMatch.execute(AbstractRouteMatch.java:231)
    at io.micronaut.web.router.DefaultUriRouteMatch.execute(DefaultUriRouteMatch.java:38)
    at io.micronaut.http.server.RouteExecutor.executeRouteAndConvertBody(RouteExecutor.java:488)
    at io.micronaut.http.server.RouteExecutor.lambda$callRoute$5(RouteExecutor.java:465)
    at io.micronaut.core.execution.ExecutionFlow.lambda$async$1(ExecutionFlow.java:87)
    at io.micronaut.core.propagation.PropagatedContext.lambda$wrap$3(PropagatedContext.java:211)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    at java.base/java.lang.Thread.run(Thread.java:1583)
    Caused by: java.sql.SQLTransientConnectionException: HikariPool-1 - Connection is not available, request timed out after 30000ms (total=0, active=0, idle=0, waiting=1)
    at com.zaxxer.hikari.pool.HikariPool.createTimeoutException(HikariPool.java:706)
    at com.zaxxer.hikari.pool.HikariPool.getConnection(HikariPool.java:187)
    at com.zaxxer.hikari.pool.HikariPool.getConnection(HikariPool.java:145)
    at com.zaxxer.hikari.HikariDataSource.getConnection(HikariDataSource.java:99)
    at org.jooq.impl.DataSourceConnectionProvider.acquire(DataSourceConnectionProvider.java:87)
    ... 23 common frames omitted
    Caused by: org.postgresql.util.PSQLException: FATAL: "base/30724477" is not a valid data directory
    Detail: File "base/30724477/PG_VERSION" is missing.
    at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2733)
    at org.postgresql.core.v3.QueryExecutorImpl.readStartupMessages(QueryExecutorImpl.java:2845)
    at org.postgresql.core.v3.QueryExecutorImpl.<init>(QueryExecutorImpl.java:176)
    at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:323)
    at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:54)
    at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:273)
    at org.postgresql.Driver.makeConnection(Driver.java:446)
    at org.postgresql.Driver.connect(Driver.java:298)
    at com.zaxxer.hikari.util.DriverDataSource.getConnection(DriverDataSource.java:139)
    at com.zaxxer.hikari.pool.PoolBase.newConnection(PoolBase.java:362)
    at com.zaxxer.hikari.pool.PoolBase.newPoolEntry(PoolBase.java:203)
    at com.zaxxer.hikari.pool.HikariPool.createPoolEntry(HikariPool.java:479)
    at com.zaxxer.hikari.pool.HikariPool$PoolEntryCreator.call(HikariPool.java:744)
    at com.zaxxer.hikari.pool.HikariPool$PoolEntryCreator.call(HikariPool.java:723)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    ... 3 common frames omitted
    2025-11-20 13:22:36,305 [health-executor-thread-3]	ERROR	i.a.d.s.i.j.HealthCheckServiceJooqImpl(healthCheck):40 - Health check error:
    org.jooq.exception.DataAccessException: Error getting connection from data source HikariDataSource (HikariPool-1)
    at org.jooq_3.19.7.POSTGRES.debug(Unknown Source)
    at org.jooq.impl.DataSourceConnectionProvider.acquire(DataSourceConnectionProvider.java:90)
    at org.jooq.impl.DefaultExecuteContext.connection(DefaultExecuteContext.java:651)
    at org.jooq.impl.AbstractQuery.connection(AbstractQuery.java:388)
    at org.jooq.impl.AbstractQuery.execute(AbstractQuery.java:308)
    at org.jooq.impl.AbstractResultQuery.fetch(AbstractResultQuery.java:290)
    at org.jooq.impl.DefaultDSLContext.fetch(DefaultDSLContext.java:870)
    at io.airbyte.data.services.impls.jooq.HealthCheckServiceJooqImpl.lambda$healthCheck$0(HealthCheckServiceJooqImpl.java:38)
    at io.airbyte.db.Database.query(Database.java:23)
    at io.airbyte.db.ExceptionWrappingDatabase.query(ExceptionWrappingDatabase.java:31)
    at io.airbyte.data.services.impls.jooq.HealthCheckServiceJooqImpl.healthCheck(HealthCheckServiceJooqImpl.java:38)
    at io.airbyte.commons.server.handlers.HealthCheckHandler.health(HealthCheckHandler.java:24)
    at io.airbyte.server.apis.HealthApiController.getHealthCheck(HealthApiController.java:32)
    at io.airbyte.server.apis.$HealthApiController$Definition$Exec.dispatch(Unknown Source)
    at io.micronaut.context.AbstractExecutableMethodsDefinition$DispatchedExecutableMethod.invokeUnsafe(AbstractExecutableMethodsDefinition.java:461)
    at io.micronaut.context.DefaultBeanContext$BeanContextUnsafeExecutionHandle.invokeUnsafe(DefaultBeanContext.java:4354)
    at io.micronaut.web.router.AbstractRouteMatch.execute(AbstractRouteMatch.java:231)
    at io.micronaut.web.router.DefaultUriRouteMatch.execute(DefaultUriRouteMatch.java:38)
    at io.micronaut.http.server.RouteExecutor.executeRouteAndConvertBody(RouteExecutor.java:488)
    at io.micronaut.http.server.RouteExecutor.lambda$callRoute$5(RouteExecutor.java:465)
What is this error I am getting in the server pods?

    Joshua Garza

    11/20/2025, 1:39 PM
    #C01AHCD885S I am unable to update my airbyte database. I get the following error when I try to update the organizational email: ERROR Unable to update the email address. unable to udpate the email address: failed to get organization: unable to fetch token: unable to decode token request: invalid character ‘<’ looking for beginning of value

    Konathala Chaitanya

    11/20/2025, 3:34 PM
@kapa.ai What is this error I am getting in the workload launcher?
    2025-11-20 15:31:39,940 [main]	ERROR	i.m.r.Micronaut(handleStartupException):349 - Error starting Micronaut server: Bean definition [io.micronaut.data.jdbc.config.SchemaGenerator] could not be loaded: Error instantiating bean of type  [io.micronaut.data.runtime.support.DefaultRuntimeEntityRegistry]
    Message: Could not resolve placeholder ${STORAGE_TYPE}
    Path Taken:
    new @j.i.Singleton i.m.d.r.s.DefaultRuntimeEntityRegistry(EntityEventRegistry eventRegistry, Collection<BeanRegistration<PropertyAutoPopulator<Annotation>>> propertyPopulators, ApplicationContext applicationContext, AttributeConverterRegistry attributeConverterRegistry)
    io.micronaut.context.exceptions.BeanInstantiationException: Bean definition [io.micronaut.data.jdbc.config.SchemaGenerator] could not be loaded: Error instantiating bean of type  [io.micronaut.data.runtime.support.DefaultRuntimeEntityRegistry]
    Message: Could not resolve placeholder ${STORAGE_TYPE}
    Path Taken:
    new @j.i.Singleton i.m.d.r.s.DefaultRuntimeEntityRegistry(EntityEventRegistry eventRegistry, Collection<BeanRegistration<PropertyAutoPopulator<Annotation>>> propertyPopulators, ApplicationContext applicationContext, AttributeConverterRegistry attributeConverterRegistry)
    at io.micronaut.context.DefaultBeanContext.initializeContext(DefaultBeanContext.java:2038)
    at io.micronaut.context.DefaultApplicationContext.initializeContext(DefaultApplicationContext.java:323)
    at io.micronaut.context.DefaultBeanContext.configureAndStartContext(DefaultBeanContext.java:3342)
    at io.micronaut.context.DefaultBeanContext.start(DefaultBeanContext.java:353)
    at io.micronaut.context.DefaultApplicationContext.start(DefaultApplicationContext.java:225)
    at io.micronaut.runtime.Micronaut.start(Micronaut.java:75)
    at io.airbyte.workload.launcher.ApplicationKt.main(Application.kt:10)
    Caused by: io.micronaut.context.exceptions.BeanInstantiationException: Error instantiating bean of type  [io.micronaut.data.runtime.support.DefaultRuntimeEntityRegistry]
    Message: Could not resolve placeholder ${STORAGE_TYPE}
    Path Taken:
    new @j.i.Singleton i.m.d.r.s.DefaultRuntimeEntityRegistry(EntityEventRegistry eventRegistry, Collection<BeanRegistration<PropertyAutoPopulator<Annotation>>> propertyPopulators, ApplicationContext applicationContext, AttributeConverterRegistry attributeConverterRegistry)
    at io.micronaut.context.DefaultBeanContext.resolveByBeanFactory(DefaultBeanContext.java:2350)
    at io.micronaut.context.DefaultBeanContext.createRegistration(DefaultBeanContext.java:3146)
    at io.micronaut.context.SingletonScope.getOrCreate(SingletonScope.java:80)
    at io.micronaut.context.DefaultBeanContext.resolveBeanRegistration(DefaultBeanContext.java:2996)
    at io.micronaut.context.DefaultBeanContext.resolveBeanRegistration(DefaultBeanContext.java:2758)
    at io.micronaut.context.DefaultBeanContext.getBean(DefaultBeanContext.java:1779)
    at io.micronaut.context.DefaultBeanContext.getBean(DefaultBeanContext.java:855)
    at io.micronaut.context.DefaultBeanContext.getBean(DefaultBeanContext.java:847)
    at io.micronaut.data.jdbc.config.SchemaGenerator.createSchema(SchemaGenerator.java:90)
    at io.micronaut.data.jdbc.config.$SchemaGenerator$Definition.initialize$intercepted(Unknown Source)
    at io.micronaut.data.jdbc.config.$SchemaGenerator$Definition$InitializeInterceptor.invokeInternal(Unknown Source)
    at io.micronaut.context.AbstractExecutableMethod.invoke(AbstractExecutableMethod.java:166)
    at io.micronaut.aop.chain.MethodInterceptorChain.doIntercept(MethodInterceptorChain.java:285)
    at io.micronaut.aop.chain.MethodInterceptorChain.initialize(MethodInterceptorChain.java:208)
    at io.micronaut.data.jdbc.config.$SchemaGenerator$Definition.initialize(Unknown Source)
    at io.micronaut.data.jdbc.config.$SchemaGenerator$Definition.instantiate(Unknown Source)
    at io.micronaut.context.DefaultBeanContext.resolveByBeanFactory(DefaultBeanContext.java:2335)
    at io.micronaut.context.DefaultBeanContext.createRegistration(DefaultBeanContext.java:3146)
    at io.micronaut.context.SingletonScope.getOrCreate(SingletonScope.java:80)
    at io.micronaut.context.DefaultBeanContext.intializeEagerBean(DefaultBeanContext.java:3035)
    at io.micronaut.context.DefaultBeanContext.initializeEagerBean(DefaultBeanContext.java:2704)
    at io.micronaut.context.DefaultBeanContext.initializeContext(DefaultBeanContext.java:2032)
    ... 6 common frames omitted
    Caused by: io.micronaut.context.exceptions.ConfigurationException: Could not resolve placeholder ${STORAGE_TYPE}
    at io.micronaut.context.env.DefaultPropertyPlaceholderResolver$PlaceholderSegment.getValue(DefaultPropertyPlaceholderResolver.java:391)
    at io.micronaut.context.env.DefaultPropertyPlaceholderResolver.resolveRequiredPlaceholdersObject(DefaultPropertyPlaceholderResolver.java:116)
    at io.micronaut.context.env.PropertySourcePropertyResolver.resolvePlaceHoldersIfNecessary(PropertySourcePropertyResolver.java:857)
    at io.micronaut.context.env.PropertySourcePropertyResolver.getProperty(PropertySourcePropertyResolver.java:398)
    at io.micronaut.context.DefaultApplicationContext.getProperty(DefaultApplicationContext.java:257)
    at io.micronaut.context.conditions.MatchesPropertyCondition.resolvePropertyValue(MatchesPropertyCondition.java:97)
    at io.micronaut.context.conditions.MatchesPropertyCondition.matches(MatchesPropertyCondition.java:66)
    at io.micronaut.context.AbstractInitializableBeanDefinitionAndReference.matches(AbstractInitializableBeanDefinitionAndReference.java:111)
    at io.micronaut.context.AbstractInitializableBeanDefinitionAndReference.isEnabled(AbstractInitializableBeanDefinitionAndReference.java:92)
    at io.micronaut.context.DefaultBeanContext$BeanDefinitionProducer.isReferenceEnabled(DefaultBeanContext.java:4282)
    at io.micronaut.context.DefaultBeanContext$BeanDefinitionProducer.getReferenceIfEnabled(DefaultBeanContext.java:4322)
    at io.micronaut.context.DefaultBeanContext$BeanDefinitionProducer.getReferenceIfEnabled(DefaultBeanContext.java:4313)
    at io.micronaut.context.DefaultBeanContext.getBeanDefinitions(DefaultBeanContext.java:1599)
    at io.micronaut.context.AnnotationProcessorListener.onCreated(AnnotationProcessorListener.java:64)
    at io.micronaut.context.AnnotationProcessorListener.onCreated(AnnotationProcessorListener.java:44)
    at io.micronaut.context.DefaultBeanContext.triggerBeanCreatedEventListener(DefaultBeanContext.java:2388)
    at io.micronaut.context.DefaultBeanContext.postBeanCreated(DefaultBeanContext.java:2365)
    at io.micronaut.context.DefaultBeanContext.createRegistration(DefaultBeanContext.java:3150)
    at io.micronaut.context.SingletonScope.getOrCreate(SingletonScope.java:80)
    at io.micronaut.context.DefaultBeanContext.resolveBeanRegistration(DefaultBeanContext.java:2996)
    at io.micronaut.context.DefaultBeanContext.resolveBeanRegistration(DefaultBeanContext.java:2758)
    at io.micronaut.context.DefaultBeanContext.getBean(DefaultBeanContext.java:1779)
    at io.micronaut.context.AbstractBeanResolutionContext.getBean(AbstractBeanResolutionContext.java:210)
    at io.micronaut.context.AbstractInitializableBeanDefinition.resolveBean(AbstractInitializableBeanDefinition.java:2122)
    at io.micronaut.context.AbstractInitializableBeanDefinition.getBeanForConstructorArgument(AbstractInitializableBeanDefinition.java:1352)
    at io.micronaut.data.runtime.support.$DefaultRuntimeEntityRegistry$Definition.instantiate(Unknown Source)
    at io.micronaut.context.DefaultBeanContext.resolveByBeanFactory(DefaultBeanContext.java:2335)
    ... 27 common frames omitted
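`Could not resolve placeholder ${STORAGE_TYPE}` means the workload launcher started without the storage-related environment variables the platform expects (they are normally rendered into the platform env config map, typically named `<release>-airbyte-env`). A sketch of the Helm values that populate it — key names vary by chart version, so treat this as an assumption to check against your chart:

```yaml
# values.yaml (sketch; verify key names against your chart version)
global:
  storage:
    type: minio        # or s3 / gcs / azure; rendered into STORAGE_TYPE
```

If the values are already set, check that the workload-launcher deployment actually references the config map / secret where `STORAGE_TYPE` is defined.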

    Johann Meier

    11/20/2025, 3:44 PM
Looking at a HubSpot → BigQuery Airbyte connection. The deals object exported to BigQuery should contain the field line_items, but it shows nulls, which can't be right because I see line items in HubSpot.

    Aviad Deri

    11/20/2025, 3:58 PM
@kapa.ai Trying to set up a new MSSQL source on a Kubernetes installation but getting:
Internal message: io.airbyte.workload.launcher.pipeline.stages.model.StageError: io.airbyte.workers.exception.KubeClientException: Failed to create pod source-mssql-check-e355f24f-2a78-4e57-9617-a22c4a57fcae-0-tfiuj.
at io.airbyte.workload.launcher.pipeline.stages.model.Stage.apply(Stage.kt:49)
at io.airbyte.workload.launcher.pipeline.stages.LaunchPodStage.apply(LaunchPodStage.kt:40)
at io.airbyte.workload.launcher.pipeline.stages.$LaunchPodStage$Definition$Intercepted.$$access$$apply(Unknown Source)
at io.airbyte.workload.launcher.pipeline.stages.$LaunchPodStage$Definition$Exec.dispatch(Unknown Source)
at io.micronaut.context.AbstractExecutableMethodsDefinition$DispatchedExecutableMethod.invoke(AbstractExecutableMethodsDefinition.java:456)
at io.micronaut.aop.chain.MethodInterceptorChain.proceed(MethodInterceptorChain.java:134)
at io.airbyte.metrics.interceptors.InstrumentInterceptorBase.doIntercept(InstrumentInterceptorBase.kt:65)
at io.airbyte.metrics.interceptors.InstrumentInterceptorBase.intercept(InstrumentInterceptorBase.kt:48)
at io.micronaut.aop.chain.MethodInterceptorChain.proceed(MethodInterceptorChain.java:143)
at io.airbyte.workload.launcher.pipeline.stages.$LaunchPodStage$Definition$Intercepted.apply(Unknown Source)
at io.airbyte.workload.launcher.pipeline.stages.LaunchPodStage.apply(LaunchPodStage.kt:28)
at reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:132)
at reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:158)
at reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:158)
at reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:158)
at reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:158)
at reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:158)
at reactor.core.publisher.Operators$ScalarSubscription.request(Operators.java:2570)
at reactor.core.publisher.MonoFlatMap$FlatMapMain.request(MonoFlatMap.java:194)
at reactor.core.publisher.MonoFlatMap$FlatMapMain.request(MonoFlatMap.java:194)
at reactor.core.publisher.MonoFlatMap$FlatMapMain.request(MonoFlatMap.java:194)
at reactor.core.publisher.MonoFlatMap$FlatMapMain.request(MonoFlatMap.java:194)
at reactor.core.publisher.MonoFlatMap$FlatMapMain.request(MonoFlatMap.java:194)
at reactor.core.publisher.MonoFlatMap$FlatMapMain.request(MonoFlatMap.java:194)
at reactor.core.publisher.Operators$MultiSubscriptionSubscriber.set(Operators.java:2366)
at reactor.core.publisher.FluxOnErrorResume$ResumeSubscriber.onSubscribe(FluxOnErrorResume.java:74)
at reactor.core.publisher.MonoFlatMap$FlatMapMain.onSubscribe(MonoFlatMap.java:117)
at reactor.core.publisher.MonoFlatMap$FlatMapMain.onSubscribe(MonoFlatMap.java:117)
at reactor.core.publisher.MonoFlatMap$FlatMapMain.onSubscribe(MonoFlatMap.java:117)
at reactor.core.publisher.MonoFlatMap$FlatMapMain.onSubscribe(MonoFlatMap.java:117)
at reactor.core.publisher.MonoFlatMap$FlatMapMain.onSubscribe(MonoFlatMap.java:117)
at reactor.core.publisher.MonoFlatMap$FlatMapMain.onSubscribe(MonoFlatMap.java:117)
at reactor.core.publisher.FluxFlatMap.trySubscribeScalarMap(FluxFlatMap.java:193)
at reactor.core.publisher.MonoFlatMap.subscribeOrReturn(MonoFlatMap.java:53)
at reactor.core.publisher.Mono.subscribe(Mono.java:4560)
at reactor.core.publisher.FluxFlatMap$FlatMapMain.onNext(FluxFlatMap.java:430)
at reactor.core.publisher.FluxPublishOn$PublishOnSubscriber.runAsync(FluxPublishOn.java:446)
at reactor.core.publisher.FluxPublishOn$PublishOnSubscriber.run(FluxPublishOn.java:533)
at reactor.core.scheduler.WorkerTask.call(WorkerTask.java:84)
at reactor.core.scheduler.WorkerTask.call(WorkerTask.java:37)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: io.airbyte.workers.exception.KubeClientException: Failed to create pod source-mssql-check-e355f24f-2a78-4e57-9617-a22c4a57fcae-0-tfiuj.
at io.airbyte.workload.launcher.pods.KubePodClient.launchConnectorWithSidecar$io_airbyte_airbyte_workload_launcher(KubePodClient.kt:258)
at io.airbyte.workload.launcher.pods.KubePodClient.launchCheck(KubePodClient.kt:194)
at io.airbyte.workload.launcher.pipeline.stages.LaunchPodStage.applyStage(LaunchPodStage.kt:50)
at io.airbyte.workload.launcher.pipeline.stages.LaunchPodStage.applyStage(LaunchPodStage.kt:28)
at io.airbyte.workload.launcher.pipeline.stages.model.Stage.apply(Stage.kt:45)
... 44 more
Caused by: io.fabric8.kubernetes.client.KubernetesClientException: Failure executing: PATCH at: https://172.40.0.1:443/api/v1/namespaces/airbyte/pods/source-mssql-check-e355f24f-2a78-4e57-9617-a22c4a57fcae-0-tfiuj?fieldManager=fabric8.
Message: pods "source-mssql-check-e355f24f-2a78-4e57-9617-a22c4a57fcae-0-tfiuj" is forbidden: violates PodSecurity "restricted:latest": allowPrivilegeEscalation != false (containers "init", "connector-sidecar", "main" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "init", "connector-sidecar", "main" must set securityContext.capabilities.drop=["ALL"]), runAsNonRoot != true (pod or containers "init", "connector-sidecar", "main" must set securityContext.runAsNonRoot=true).
Received status: Status(apiVersion=v1, code=403, details=StatusDetails(causes=[], group=null, kind=pods, name=source-mssql-check-e355f24f-2a78-4e57-9617-a22c4a57fcae-0-tfiuj, retryAfterSeconds=null, uid=null, additionalProperties={}), kind=Status, message=pods "source-mssql-check-e355f24f-2a78-4e57-9617-a22c4a57fcae-0-tfiuj" is forbidden: violates PodSecurity "restricted:latest": allowPrivilegeEscalation != false (containers "init", "connector-sidecar", "main" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "init", "connector-sidecar", "main" must set securityContext.capabilities.drop=["ALL"]), runAsNonRoot != true (pod or containers "init", "connector-sidecar", "main" must set securityContext.runAsNonRoot=true), metadata=ListMeta(_continue=null, remainingItemCount=null, resourceVersion=null, selfLink=null, additionalProperties={}), reason=Forbidden, status=Failure, additionalProperties={}). at io.fabric8.kubernetes.client.KubernetesClientException.copyAsCause(KubernetesClientException.java:205) at io.fabric8.kubernetes.client.dsl.internal.OperationSupport.waitForResult(OperationSupport.java:507) at io.fabric8.kubernetes.client.dsl.internal.OperationSupport.handleResponse(OperationSupport.java:524) at io.fabric8.kubernetes.client.dsl.internal.OperationSupport.handlePatch(OperationSupport.java:419) at io.fabric8.kubernetes.client.dsl.internal.OperationSupport.handlePatch(OperationSupport.java:397) at io.fabric8.kubernetes.client.dsl.internal.BaseOperation.handlePatch(BaseOperation.java:764) at io.fabric8.kubernetes.client.dsl.internal.HasMetadataOperation.lambda$patch$2(HasMetadataOperation.java:231) at io.fabric8.kubernetes.client.dsl.internal.HasMetadataOperation.patch(HasMetadataOperation.java:236) at io.fabric8.kubernetes.client.dsl.internal.HasMetadataOperation.patch(HasMetadataOperation.java:251) at io.fabric8.kubernetes.client.dsl.internal.BaseOperation.serverSideApply(BaseOperation.java:1179) at 
io.fabric8.kubernetes.client.dsl.internal.BaseOperation.serverSideApply(BaseOperation.java:98) at io.airbyte.workload.launcher.pods.KubePodLauncher.create$lambda$0(KubePodLauncher.kt:58) at io.airbyte.workload.launcher.pods.KubePodLauncher$runKubeCommand$1.get(KubePodLauncher.kt:324) at dev.failsafe.Functions.lambda$toCtxSupplier$11(Functions.java:243) at dev.failsafe.Functions.lambda$get$0(Functions.java:46) at dev.failsafe.internal.RetryPolicyExecutor.lambda$apply$0(RetryPolicyExecutor.java:74) at dev.failsafe.SyncExecutionImpl.executeSync(SyncExecutionImpl.java:187) at dev.failsafe.FailsafeExecutor.call(FailsafeExecutor.java:376) at dev.failsafe.FailsafeExecutor.get(FailsafeExecutor.java:112) at io.airbyte.workload.launcher.pods.KubePodLauncher.runKubeCommand(KubePodLauncher.kt:322) at io.airbyte.workload.launcher.pods.KubePodLauncher.create(KubePodLauncher.kt:52) at io.airbyte.workload.launcher.pods.KubePodClient.launchConnectorWithSidecar$io_airbyte_airbyte_workload_launcher(KubePodClient.kt:255) ... 48 more Caused by: io.fabric8.kubernetes.client.KubernetesClientException: Failure executing: PATCH at: https://172.40.0.1:443/api/v1/namespaces/airbyte/pods/source-mssql-check-e355f24f-2a78-4e57-9617-a22c4a57fcae-0-tfiuj?fieldManager=fabric8. Message: pods "source-mssql-check-e355f24f-2a78-4e57-9617-a22c4a57fcae-0-tfiuj" is forbidden: violates PodSecurity "restricted:latest": allowPrivilegeEscalation != false (containers "init", "connector-sidecar", "main" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "init", "connector-sidecar", "main" must set securityContext.capabilities.drop=["ALL"]), runAsNonRoot != true (pod or containers "init", "connector-sidecar", "main" must set securityContext.runAsNonRoot=true). 
Received status: Status(apiVersion=v1, code=403, details=StatusDetails(causes=[], group=null, kind=pods, name=source-mssql-check-e355f24f-2a78-4e57-9617-a22c4a57fcae-0-tfiuj, retryAfterSeconds=null, uid=null, additionalProperties={}), kind=Status, message=pods "source-mssql-check-e355f24f-2a78-4e57-9617-a22c4a57fcae-0-tfiuj" is forbidden: violates PodSecurity "restricted:latest": allowPrivilegeEscalation != false (containers "init", "connector-sidecar", "main" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "init", "connector-sidecar", "main" must set securityContext.capabilities.drop=["ALL"]), runAsNonRoot != true (pod or containers "init", "connector-sidecar", "main" must set securityContext.runAsNonRoot=true), metadata=ListMeta(_continue=null, remainingItemCount=null, resourceVersion=null, selfLink=null, additionalProperties={}), reason=Forbidden, status=Failure, additionalProperties={}). at io.fabric8.kubernetes.client.dsl.internal.OperationSupport.requestFailure(OperationSupport.java:642) at io.fabric8.kubernetes.client.dsl.internal.OperationSupport.requestFailure(OperationSupport.java:622) at io.fabric8.kubernetes.client.dsl.internal.OperationSupport.assertResponseCode(OperationSupport.java:582) at io.fabric8.kubernetes.client.dsl.internal.OperationSupport.lambda$handleResponse$0(OperationSupport.java:549) at java.base/java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:646) at java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:510) at java.base/java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2179) at io.fabric8.kubernetes.client.http.StandardHttpClient.lambda$completeOrCancel$10(StandardHttpClient.java:141) at ... why?
    k
    • 2
    • 1
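    The admission failure above lists exactly which securityContext fields the `restricted:latest` PodSecurity profile requires on the `init`, `connector-sidecar`, and `main` containers. As an illustrative sketch (a model of the admission check, not Airbyte's or Kubernetes' actual code), the required shape and the corresponding violations are:

    ```python
    # Sketch: the securityContext fields the "restricted" PodSecurity profile
    # demands on every container, taken straight from the error message above.
    restricted_container_security_context = {
        "allowPrivilegeEscalation": False,             # "allowPrivilegeEscalation != false"
        "capabilities": {"drop": ["ALL"]},             # 'capabilities.drop=["ALL"]'
        "runAsNonRoot": True,                          # "runAsNonRoot != true"
        "seccompProfile": {"type": "RuntimeDefault"},  # also required by the restricted profile
    }

    def violates_restricted(ctx: dict) -> list:
        """Return the violations the admission controller would report for a container."""
        problems = []
        if ctx.get("allowPrivilegeEscalation") is not False:
            problems.append("allowPrivilegeEscalation != false")
        if ctx.get("capabilities", {}).get("drop") != ["ALL"]:
            problems.append('capabilities.drop != ["ALL"]')
        if ctx.get("runAsNonRoot") is not True:
            problems.append("runAsNonRoot != true")
        return problems
    ```

    In practice this means the launched connector pods need those fields set (for example via the Helm chart's pod/container securityContext overrides, or by relaxing the namespace's PodSecurity label); the exact values keys depend on your chart version.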
  • s

    Stefano Messina

    11/20/2025, 4:24 PM
    @kapa.ai what could this error be:
    Copy code
    2025-11-20 17:21:13 destination ERROR io.airbyte.cdk.ConfigErrorException: Failed to initialize connector operation
    2025-11-20 17:21:13 destination ERROR 	at io.airbyte.cdk.AirbyteConnectorRunnable.run(AirbyteConnectorRunnable.kt:33)
    2025-11-20 17:21:13 destination ERROR 	at picocli.CommandLine.executeUserObject(CommandLine.java:2030)
    2025-11-20 17:21:13 destination ERROR 	at picocli.CommandLine.access$1500(CommandLine.java:148)
    2025-11-20 17:21:13 destination ERROR 	at picocli.CommandLine$RunLast.executeUserObjectOfLastSubcommandWithSameParent(CommandLine.java:2465)
    2025-11-20 17:21:13 destination ERROR 	at picocli.CommandLine$RunLast.handle(CommandLine.java:2457)
    2025-11-20 17:21:13 destination ERROR 	at picocli.CommandLine$RunLast.handle(CommandLine.java:2419)
    2025-11-20 17:21:13 destination ERROR 	at picocli.CommandLine$AbstractParseResultHandler.execute(CommandLine.java:2277)
    2025-11-20 17:21:13 destination ERROR 	at picocli.CommandLine$RunLast.execute(CommandLine.java:2421)
    2025-11-20 17:21:13 destination ERROR 	at picocli.CommandLine.execute(CommandLine.java:2174)
    2025-11-20 17:21:13 destination ERROR 	at io.airbyte.cdk.AirbyteDestinationRunner$Companion.run(AirbyteConnectorRunner.kt:289)
    2025-11-20 17:21:13 destination ERROR 	at io.airbyte.cdk.AirbyteDestinationRunner$Companion.run$default(AirbyteConnectorRunner.kt:75)
    2025-11-20 17:21:13 destination ERROR 	at io.airbyte.integrations.destination.clickhouse.ClickhouseDestinationKt.main(ClickhouseDestination.kt:10)
    2025-11-20 17:21:13 destination ERROR Caused by: io.micronaut.context.exceptions.BeanInstantiationException: Error instantiating bean of type  [io.airbyte.cdk.load.orchestration.db.legacy_typing_deduping.TableCatalog]
    syncing from MSSQL to Clickhouse with the latest connector version
    k
    • 2
    • 7
  • j

    Jean-Denis COSTA

    11/20/2025, 5:06 PM
    Using the MSSQL source with the ClickHouse destination, there is a specific big table for which every row seems to be inserted one by one instead of batched, which makes replication very slow. I can see megabytes of logs like this on each run:
    Copy code
    pool-2-thread-1 i.a.c.l.d.a.AggregateStore(removeNextComplete):56 PUBLISH — Reason: Cardinality
    2025-11-20 17:58:15 destination INFO DefaultDispatcher-worker-1 i.a.i.d.c.w.l.BinaryRowInsertBuffer(flush):70 Finished insert of 1 rows into datacache_AdInsertion
    2025-11-20 17:58:15 destination INFO DefaultDispatcher-worker-1 i.a.i.d.c.w.l.BinaryRowInsertBuffer(flush):59 Beginning insert into datacache_AdInsertion
    Why does the connection commit a batch for every row, and how can I improve this behaviour? @kapa.ai
    k
    • 2
    • 2
  • m

    MTA

    11/20/2025, 6:37 PM
    @kapa.ai While configuring a new connection from an API source (in POST mode), the only sync mode available is Full refresh, but I want to do an incremental refresh. Why is only Full refresh available?
    k
    • 2
    • 1
  • k

    Konathala Chaitanya

    11/20/2025, 7:00 PM
    @kapa.ai The Postgres instance backing my Airbyte deployment (also used as a source) was affected, so I reverted to a snapshot from one day before Postgres stopped working. Now, when I try to create new connections, I get the error: "Configuration check failed. Failed to run connection tests."
    k
    • 2
    • 7
  • s

    soma chandra sekhar attaluri

    11/20/2025, 7:33 PM
    @kapa.ai I want to install the Airbyte v2.0 open-source Helm charts on EKS for my company. What kind of challenges will I encounter, and what technical expertise do I need to maintain the deployment and resolve errors? What should I do if a sync fails halfway through? How do I scale to move around 1-2 TB of batch data daily, how do I implement authentication, and how can we monitor sync usage and failures? Please give a detailed description of what to do, with reference documents to support it.
    k
    • 2
    • 1
  • k

    Keegan Haukaas

    11/20/2025, 8:58 PM
    @kapa.ai For an HttpStream source created using the Python CDK, how can I set the timeout for API calls?
    k
    • 2
    • 1
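    For HTTP-based streams, the usual pattern is to override `request_kwargs()` so extra keyword arguments reach the underlying `requests` call. A minimal sketch follows; `MyStream` here is a stand-in (in real code it would subclass the CDK's `HttpStream`), and whether your installed `airbyte_cdk` version honors a `timeout` entry here should be checked against that version's documentation:

    ```python
    from typing import Any, Mapping, Optional

    class MyStream:  # in real code: class MyStream(HttpStream)
        def request_kwargs(
            self,
            stream_state: Mapping[str, Any],
            stream_slice: Optional[Mapping[str, Any]] = None,
            next_page_token: Optional[Mapping[str, Any]] = None,
        ) -> Mapping[str, Any]:
            # 30-second connect/read timeout for every request this stream makes
            return {"timeout": 30}
    ```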
  • w

    Wira Tjo

    11/21/2025, 1:32 AM
    @kapa.ai
    Copy code
    2025-11-21 01:24:45,143 [io-executor-thread-4]	ERROR	i.a.c.s.e.h.UncaughtExceptionHandler(handle):33 - Uncaught exception
    software.amazon.awssdk.core.exception.SdkClientException: Unable to load credentials from any of the providers in the chain AwsCredentialsProviderChain(credentialsProviders=[SystemPropertyCredentialsProvider(), EnvironmentVariableCredentialsProvider(), WebIdentityTokenCredentialsProvider(), ProfileCredentialsProvider(profileName=default, profileFile=ProfileFile(sections=[])), ContainerCredentialsProvider(), InstanceProfileCredentialsProvider()]) : [SystemPropertyCredentialsProvider(): Unable to load credentials from system settings. Access key must be specified either via environment variable (AWS_ACCESS_KEY_ID) or system property (aws.accessKeyId)., EnvironmentVariableCredentialsProvider(): Unable to load credentials from system settings. Access key must be specified either via environment variable (AWS_ACCESS_KEY_ID) or system property (aws.accessKeyId)., WebIdentityTokenCredentialsProvider(): Either the environment variable AWS_WEB_IDENTITY_TOKEN_FILE or the javaproperty aws.webIdentityTokenFile must be set., ProfileCredentialsProvider(profileName=default, profileFile=ProfileFile(sections=[])): Profile file contained no credentials for profile 'default': ProfileFile(sections=[]), ContainerCredentialsProvider(): Cannot fetch credentials from container - neither AWS_CONTAINER_CREDENTIALS_FULL_URI or AWS_CONTAINER_CREDENTIALS_RELATIVE_URI environment variables are set., InstanceProfileCredentialsProvider(): Failed to load credentials from IMDS.]
    What is this issue associated with?
    k
    • 2
    • 2
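    The exception enumerates every provider in the AWS default credentials chain and why each one failed, so the fix is to satisfy any one link: environment variables, a web identity token (IRSA on EKS), a profile file, or container/instance metadata. As a toy illustration of the first link only (not the SDK's code):

    ```python
    from typing import Mapping, Optional

    # Illustrative re-implementation of the first link in the AWS default
    # credentials chain: environment variables. The real SDK then falls through
    # to web identity tokens, profile files, container and instance metadata.
    def env_credentials(environ: Mapping[str, str]) -> Optional[dict]:
        key = environ.get("AWS_ACCESS_KEY_ID")
        secret = environ.get("AWS_SECRET_ACCESS_KEY")
        if key and secret:
            return {"access_key": key, "secret_key": secret}
        return None  # chain moves on to the next provider
    ```

    On Kubernetes, the common resolutions are injecting `AWS_ACCESS_KEY_ID`/`AWS_SECRET_ACCESS_KEY` into the pod, or attaching an IAM role via IRSA so the `WebIdentityTokenCredentialsProvider` link succeeds.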
  • e

    Ed Godalle

    11/21/2025, 2:06 AM
    I have airbyte installed in google vm using abctl. Previously I can add and run connectors without any issue but now even adding connector or doing connection test fails when the same connector runs previously fine. I am getting this failure: Failure in source: Checking source connection failed - please review this connection's configuration to prevent future syncs from failing When I go VM i can connect to the API using "curl -I https://api.hyros.com/... returns 401 immediately" but when i run inside the node "docker exec airbyte-abctl-control-plane curl -I ... never returns" there was no change in the vm or firewall
    k
    • 2
    • 1
  • i

    Ishan Anilbhai Koradiya

    11/21/2025, 4:50 AM
    @kapa.ai getting this error 2025-11-21 052406 source INFO main i.a.c.i.d.i.DebeziumRecordIterator(computeNext):87 CDC events queue poll(): blocked for PT6.511585925S after 10 previous call(s) which were not logged. 2025-11-21 052406 source INFO main i.a.c.i.d.i.DebeziumRecordIterator(computeNext):184 CDC events queue poll(): returned a change event with "source": {"version":"2.6.2.Final","connector":"mongodb","name":"jansahas","ts_ms":1763671676000,"snapshot":null,"db":"jansahas","sequence":null,"ts_us":1763671676000000,"ts_ns":1763671676000000000,"collection":"supportingDocs","ord":1,"lsid":null,"txnNumber":null,"wallTime":1763671676228}. 2025-11-21 052408 source WARN debezium-mongodbconnector-jansahas-replicator-fetcher-0 i.d.c.m.e.BufferingChangeStreamCursor$EventFetcher(enqueue):285 Unable to acquire buffer lock, buffer queue is likely full 2025-11-21 052409 source INFO pool-2-thread-1 i.a.c.i.d.AirbyteDebeziumHandler$CapacityReportingBlockingQueue(reportQueueUtilization):48 CDC events queue stats: size=0, cap=10000, puts=130, polls=0 2025-11-21 052409 source INFO main i.a.c.i.d.i.DebeziumRecordIterator(computeNext):87 CDC events queue poll(): blocked for PT2.503467921S after 9 previous call(s) which were not logged. 2025-11-21 052409 source INFO main i.a.c.i.d.i.DebeziumRecordIterator(computeNext):140 CDC events queue poll(): returned a heartbeat event: progressing to Timestamp{value=7574912216546149853, seconds=1763671687, inc=1501}. 2025-11-21 052415 replication-orchestrator ERROR Source process exited with non-zero exit code 143 2025-11-21 052415 replication-orchestrator ERROR SourceReader error: 2025-11-21 052415 replication-orchestrator INFO Source queue closed — stopping message processor... 2025-11-21 052415 replication-orchestrator INFO SourceReader finished. 2025-11-21 052415 replication-orchestrator INFO MessageProcessor finished. 
2025-11-21 052415 destination INFO main i.a.c.i.d.FlushWorkers(close):188 Closing flush workers -- waiting for all buffers to flush 2025-11-21 052415 replication-orchestrator INFO DestinationWriter finished. 2025-11-21 052415 destination INFO main i.a.c.i.d.FlushWorkers(close):213 REMAINING_BUFFERS_INFO Namespace: dalgo_ingest_mongodb_pci Stream: schemeStatus -- remaining records: 86 Namespace: dalgo_ingest_mongodb_pci Stream: supportingDocs -- remaining records: 52 2025-11-21 052415 destination INFO main i.a.c.i.d.FlushWorkers(close):214 Waiting for all streams to flush. 2025-11-21 052415 replication-orchestrator ERROR DestinationReader error: 2025-11-21 052415 replication-orchestrator INFO DestinationReader finished. 2025-11-21 052415 replication-orchestrator ERROR runJobs failed; recording failure but continuing to finish. 2025-11-21 052415 replication-orchestrator INFO Closing StateCheckSumCountEventHandler 2025-11-21 052415 replication-orchestrator INFO Sync summary: { "status" : "failed", "recordsSynced" : 34893, "bytesSynced" : 63937180, "startTime" : 1763681442139, "endTime" : 1763682855542, "totalStats" : { "bytesCommitted" : 63937180, "bytesEmitted" : 64278845, "destinationStateMessagesEmitted" : 60, "destinationWriteEndTime" : 0, "destinationWriteStartTime" : 1763681443874, "meanSecondsBeforeSourceStateMessageEmitted" : 931, "maxSecondsBeforeSourceStateMessageEmitted" : 4, "maxSecondsBetweenStateMessageEmittedandCommitted" : 302, "meanSecondsBetweenStateMessageEmittedandCommitted" : 289, "recordsEmitted" : 35135, "recordsCommitted" : 34893, "recordsFilteredOut" : 0, "bytesFilteredOut" : 0, "replicationEndTime" : 1763682855540, "replicationStartTime" : 1763681442139, "sourceReadEndTime" : 0, "sourceReadStartTime" : 1763681443891, "sourceStateMessagesEmitted" : 60 }, "streamStats" : [ { "streamName" : "migrants", "streamNamespace" : "jansahas", "stats" : { "bytesCommitted" : 1112681, "bytesEmitted" : 1117967, "estimatedBytes" : 0, "estimatedRecords" : 0, 
"recordsEmitted" : 1513, "recordsCommitted" : 1506, "recordsFilteredOut" : 0, "bytesFilteredOut" : 0 } }, { "streamName" : "benefitTypes", "streamNamespace" : "jansahas", "stats" : { "bytesCommitted" : 0, "bytesEmitted" : 0, "estimatedBytes" : 0, "estimatedRecords" : 0, "recordsEmitted" : 0, "recordsCommitted" : 0, "recordsFilteredOut" : 0, "bytesFilteredOut" : 0 } }, { "streamName" : "schemeCategories", "streamNamespace" : "jansahas", "stats" : { "bytesCommitted" : 0, "bytesEmitted" : 0, "estimatedBytes" : 0, "estimatedRecords" : 0, "recordsEmitted" : 0, "recordsCommitted" : 0, "recordsFilteredOut" : 0, "bytesFilteredOut" : 0 } }, { "streamName" : "surveyAnswers", "streamNamespace" : "jansahas", "stats" : { "bytesCommitted" : 36263163, "bytesEmitted" : 36421935, "estimatedBytes" : 0, "estimatedRecords" : 0, "recordsEmitted" : 3435, "recordsCommitted" : 3420, "recordsFilteredOut" : 0, "bytesFilteredOut" : 0 } }, { "streamName" : "states", "streamNamespace" : "jansahas", "stats" : { "bytesCommitted" : 0, "bytesEmitted" : 0, "estimatedBytes" : 0, "estimatedRecords" : 0, "recordsEmitted" : 0, "recordsCommitted" : 0, "recordsFilteredOut" : 0, "bytesFilteredOut" : 0 } }, { "streamName" : "supportingDocs", "streamNamespace" : "jansahas", "stats" : { "bytesCommitted" : 2540806, "bytesEmitted" : 2564680, "estimatedBytes" : 0, "estimatedRecords" : 0, "recordsEmitted" : 7392, "recordsCommitted" : 7318, "recordsFilteredOut" : 0, "bytesFilteredOut" : 0 } }, { "streamName" : "documentTypes", "streamNamespace" : "jansahas", "stats" : { "bytesCommitted" : 0, "bytesEmitted" : 0, "estimatedBytes" : 0, "estimatedRecords" : 0, "recordsEmitted" : 0, "recordsCommitted" : 0, "recordsFilteredOut" : 0, "bytesFilteredOut" : 0 } }, { "streamName" : "schemes", "streamNamespace" : "jansahas", "stats" : { "bytesCommitted" : 0, "bytesEmitted" : 0, "estimatedBytes" : 0, "estimatedRecords" : 0, "recordsEmitted" : 0, "recordsCommitted" : 0, "recordsFilteredOut" : 0, "bytesFilteredOut" : 0 } }, { 
"streamName" : "users", "streamNamespace" : "jansahas", "stats" : { "bytesCommitted" : 2747253, "bytesEmitted" : 2754756, "estimatedBytes" : 0, "estimatedRecords" : 0, "recordsEmitted" : 1989, "recordsCommitted" : 1984, "recordsFilteredOut" : 0, "bytesFilteredOut" : 0 } }, { "streamName" : "eligibilityCriteria", "streamNamespace" : "jansahas", "stats" : { "bytesCommitted" : 0, "bytesEmitted" : 0, "estimatedBytes" : 0, "estimatedRecords" : 0, "recordsEmitted" : 0, "recordsCommitted" : 0, "recordsFilteredOut" : 0, "bytesFilteredOut" : 0 } }, { "streamName" : "blocks", "streamNamespace" : "jansahas", "stats" : { "bytesCommitted" : 0, "bytesEmitted" : 0, "estimatedBytes" : 0, "estimatedRecords" : 0, "recordsEmitted" : 0, "recordsCommitted" : 0, "recordsFilteredOut" : 0, "bytesFilteredOut" : 0 } }, { "streamName" : "surveys", "streamNamespace" : "jansahas", "stats" : { "bytesCommitted" : 0, "bytesEmitted" : 0, "estimatedBytes" : 0, "estimatedRecords" : 0, "recordsEmitted" : 0, "recordsCommitted" : 0, "recordsFilteredOut" : 0, "bytesFilteredOut" : 0 } }, { "streamName" : "districts", "streamNamespace" : "jansahas", "stats" : { "bytesCommitted" : 0, "bytesEmitted" : 0, "estimatedBytes" : 0, "estimatedRecords" : 0, "recordsEmitted" : 0, "recordsCommitted" : 0, "recordsFilteredOut" : 0, "bytesFilteredOut" : 0 } }, { "streamName" : "roles", "streamNamespace" : "jansahas", "stats" : { "bytesCommitted" : 0, "bytesEmitted" : 0, "estimatedBytes" : 0, "estimatedRecords" : 0, "recordsEmitted" : 0, "recordsCommitted" : 0, "recordsFilteredOut" : 0, "bytesFilteredOut" : 0 } }, { "streamName" : "schemeStatus", "streamNamespace" : "jansahas", "stats" : { "bytesCommitted" : 21273277, "bytesEmitted" : 21419507, "estimatedBytes" : 0, "estimatedRecords" : 0, "recordsEmitted" : 20806, "recordsCommitted" : 20665, "recordsFilteredOut" : 0, "bytesFilteredOut" : 0 } } ], "performanceMetrics" : { } } 2025-11-21 052415 replication-orchestrator INFO Failures: [ { "failureOrigin" : "source", 
"internalMessage" : "Source process exited with non-zero exit code 143", "externalMessage" : "Something went wrong within the source connector", "metadata" : { "attemptNumber" : 0, "jobId" : 58328, "connector_command" : "read" }, "stacktrace" : "io.airbyte.workers.internal.exception.SourceException: Source process exited with non-zero exit code 143\n\tat io.airbyte.container.orchestrator.worker.SourceReader.run(ReplicationTask.kt:206)\n\tat io.airb
    k
    • 2
    • 4
  • k

    Konathala Chaitanya

    11/21/2025, 6:03 AM
    @kapa.ai
    Copy code
    i.m.r.Micronaut(handleStartupException):349 - Error starting Micronaut server: Bean definition [io.micronaut.data.jdbc.config.SchemaGenerator] could not be loaded: Error instantiating bean of type  [io.micronaut.data.runtime.support.DefaultRuntimeEntityRegistry]
    Message: Could not resolve placeholder ${STORAGE_TYPE}
    Path Taken:
    new @j.i.Singleton i.m.d.r.s.DefaultRuntimeEntityRegistry(EntityEventRegistry eventRegistry, Collection<BeanRegistration<PropertyAutoPopulator<Annotation>>> propertyPopulators, ApplicationContext applicationContext, AttributeConverterRegistry attributeConverterRegistry)
    io.micronaut.context.exceptions.BeanInstantiationException: Bean definition [io.micronaut.data.jdbc.config.SchemaGenerator] could not be loaded: Error instantiating bean of type  [io.micronaut.data.runtime.support.DefaultRuntimeEntityRegistry]
    Message: Could not resolve placeholder ${STORAGE_TYPE}
    Path Taken:
    new @j.i.Singleton i.m.d.r.s.DefaultRuntimeEntityRegistry(EntityEventRegistry eventRegistry, Collection<BeanRegistration<PropertyAutoPopulator<Annotation>>> propertyPopulators, ApplicationContext applicationContext, AttributeConverterRegistry attributeConverterRegistry)
    	at io.micronaut.context.DefaultBeanContext.initializeContext(DefaultBeanContext.java:2038)
    	at io.micronaut.context.DefaultApplicationContext.initializeContext(DefaultApplicationContext.java:323)
    	at io.micronaut.context.DefaultBeanContext.configureAndStartContext(DefaultBeanContext.java:3342)
    	at io.micronaut.context.DefaultBeanContext.start(DefaultBeanContext.java:353)
    	at io.micronaut.context.DefaultApplication
    but I have added
    Copy code
    storage:
      type: minio
    What is the issue?
    k
    • 2
    • 1
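    Micronaut fails startup when a `${...}` placeholder cannot be resolved from the environment, so the symptom means the crashing pod never received a `STORAGE_TYPE` environment variable even though `storage.type: minio` was set in the Helm values (often a values-scoping or chart-version mismatch). A toy sketch of the resolution rule (illustrative, not Micronaut's code):

    ```python
    import re

    # Every ${NAME} in configuration must resolve from the environment,
    # otherwise startup aborts with "Could not resolve placeholder ${NAME}".
    def resolve(template: str, env: dict) -> str:
        def sub(match):
            name = match.group(1)
            if name not in env:
                raise KeyError(f"Could not resolve placeholder ${{{name}}}")
            return env[name]
        return re.sub(r"\$\{(\w+)\}", sub, template)
    ```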
  • k

    Konathala Chaitanya

    11/21/2025, 6:20 AM
    @kapa.ai
    Copy code
    ERROR	i.m.r.Micronaut(handleStartupException):349 - Error starting Micronaut server: Bean definition [io.micronaut.data.jdbc.config.SchemaGenerator] could not be loaded: Error instantiating bean of type  [io.micronaut.data.runtime.support.DefaultRuntimeEntityRegistry]
    Message: Could not resolve placeholder ${DATAPLANE_CLIENT_ID}
    Path Taken:
    new @j.i.Singleton i.m.d.r.s.DefaultRuntimeEntityRegistry(EntityEventRegistry eventRegistry, Collection<BeanRegistration<PropertyAutoPopulator<Annotation>>> propertyPopulators, ApplicationContext applicationContext, AttributeConverterRegistry attributeConverterRegistry)
    io.micronaut.context.exceptions.BeanInstantiationException: Bean definition [io.micronaut.data.jdbc.config.SchemaGenerator] could not be loaded: Error instantiating bean of type  [io.micronaut.data.runtime.support.DefaultRuntimeEntityRegistry]
    Message: Could not resolve placeholder ${DATAPLANE_CLIENT_ID}
    Path Taken:
    new @j.i.Singleton i.m.d.r.s.DefaultRuntimeEntityRegistry(EntityEventRegistry eventRegistry, Collection<BeanRegistration<PropertyAutoPopulator<Annotation>>> propertyPopulators, ApplicationContext applicationContext, AttributeConverterRegistry attributeConverterRegistry)
    	at io.micronaut.context.DefaultBeanContext.initializeContext(DefaultBeanContext.java:2038)
    	at io.micronaut.context.DefaultApplicationContext.initializeContext(DefaultApplicationContext.java:323)
    	at io.micronaut.context.DefaultBeanContext.configureAndStartContext(DefaultBeanContext.java:3342)
    	at io.micronaut.context.DefaultBeanContext.start(DefaultBeanContext.java:353)
    	at io.micronaut.context.DefaultApplicationContext.start(DefaultApplicationContext.java:225)
    	at io.micronaut.runtime.Micronaut.start(Micronaut.java:75)
    	at io.airbyte.workload.launcher.ApplicationKt.main(Application.kt:10)
    Caused by: io.micronaut.context.exceptions.BeanInstantiationException: Error instantiating bean of type  [io.micronaut.data.runtime.support.DefaultRuntimeEntityRegistry]
    Message: Could not resolve placeholder ${DATAPLANE_CLIENT_ID}
    What is the issue?
    k
    • 2
    • 1
  • k

    Konathala Chaitanya

    11/21/2025, 7:36 AM
    @kapa.ai
    Copy code
    Unable to bootstrap Airbyte environment.
    java.lang.RuntimeException: com.fasterxml.jackson.core.JsonParseException: Invalid numeric value: Leading zeroes not allowed
     at [Source: REDACTED (`StreamReadFeature.INCLUDE_SOURCE_IN_LOCATION` disabled); line: 1, column: 2]
    	at io.airbyte.commons.json.Jsons.deserialize(Jsons.java:101)
    	at io.airbyte.data.services.impls.jooq.DbConverter.buildStandardWorkspace(DbConverter.java:194)
    	at java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
    	at java.base/java.util.ArrayList$Itr.forEachRemaining(ArrayList.java:1085)
    	at java.base/java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1939)
    	at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
    	at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
    	at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:575)
    	at java.base/java.util.stream.AbstractPipeline.evaluateToArrayNode(AbstractPipeline.java:260)
    	at java.base/java.util.stream.ReferencePipeline.toArray(ReferencePipeline.java:616)
    	at java.base/java.util.stream.ReferencePipeline.toArray(ReferencePipeline.java:622)
    	at java.base/java.util.stream.ReferencePipeline.toList(ReferencePipeline.java:627)
    	at io.airbyte.data.services.impls.jooq.WorkspaceServiceJooqImpl.listStandardWorkspaces(WorkspaceServiceJooqImpl.java:169)
    	at io.airbyte.bootloader.Bootloader.createWorkspaceIfNoneExists(Bootloader.java:202)
    	at io.airbyte.bootloader.Bootloader.load(Bootloader.java:119)
    	at io.airbyte.bootloader.Application.main(Application.java:29)
    Caused by: com.fasterxml.jackson.core.JsonParseException: Invalid numeric value: Leading zeroes not allowed
     at [Source: REDACTED (`StreamReadFeature.INCLUDE_SOURCE_IN_LOCATION` disabled); line: 1, column: 2]
    	at com.fasterxml.jackson.core.JsonParser._constructError(JsonParser.java:2584)
    	at com.fasterxml.jackson.core.JsonParser._constructReadException(JsonParser.java:2610)
    	at com.fasterxml.jackson.core.base.ParserMinimalBase.reportInvalidNumber(ParserMinimalBase.java:655)
    	at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._verifyNLZ2(ReaderBasedJsonParser.java:1705)
    	at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._verifyNoLeadingZeroes(ReaderBasedJsonParser.java:1692)
    	at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._parseNumber2(ReaderBasedJsonParser.java:1554)
    	at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._parseUnsignedNumber(ReaderBasedJsonParser.java:1363)
    	at com.fasterxml.jackson.core.json.ReaderBasedJsonParser.nextToken(ReaderBasedJsonParser.java:777)
    	at com.fasterxml.jackson.databind.ObjectMapper._initForReading(ObjectMapper.java:5004)
    	at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:4910)
    	at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3860)
    	at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3828)
    	at io.airbyte.commons.json.Jsons.deserialize(Jsons.java:99)
    	... 15 common frames omitted
    2025-11-21 07:32:48,648 [Thread-2]	INFO	i.m.r.Micronaut(lambda$start$0):118 - Embedded Application shutting down
    I tried to downgrade Airbyte from 1.7 to 1.5 and got this issue.
    k
    • 2
    • 1
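    The parse error means a JSON blob in the workspace table begins, at column 2, with a number that has a leading zero, which strict JSON (RFC 8259) forbids; the downgrade likely left a row serialized in a form the 1.5 bootloader's parser rejects. A quick demonstration of the rule being violated, using Python's equally strict parser:

    ```python
    import json

    # JSON forbids leading zeroes in numbers: "0" and "10" are valid
    # documents, but "01" is not, which is what Jackson is rejecting above.
    def parses(text: str) -> bool:
        try:
            json.loads(text)
            return True
        except json.JSONDecodeError:
            return False
    ```

    A next step would be inspecting the workspace rows' JSON columns for a value beginning with a leading-zero number.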
  • j

    Jimmy Phommarath

    11/21/2025, 8:16 AM
    Hello, I have a question for you 🙂 I use Terraform to develop my sources, destinations, and connections so I can version them in a repository. How can I do the same versioning for a custom connector (made with the Builder)? Thanks!
    k
    • 2
    • 7
  • j

    Jazz

    11/21/2025, 8:40 AM
    This error occurs when the source MSSQL server uses CDC:
    Copy code
    2025-11-21 15:29:35 source ERROR main i.a.c.i.u.ConnectorExceptionHandler(handleException):68 caught exception! io.airbyte.cdk.integrations.source.relationaldb.state.FailedRecordIteratorException: java.lang.RuntimeException: java.lang.NullPointerException: Cannot invoke "io.airbyte.integrations.source.mssql.initialsync.MssqlInitialReadUtil$OrderedColumnInfo.ocFieldName()" because "this.ocInfo" is null
    	at io.airbyte.cdk.integrations.source.relationaldb.state.SourceStateIterator.computeNext(SourceStateIterator.kt:38) ~[airbyte-cdk-db-sources-0.48.18.jar:?]
    	at io.airbyte.cdk.integrations.source.relationaldb.state.SourceStateIterator.computeNext(SourceStateIterator.kt:18) ~[airbyte-cdk-db-sources-0.48.18.jar:?]
    	at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:145) ~[guava-33.3.0-jre.jar:?]
    	at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:140) ~[guava-33.3.0-jre.jar:?]
    	at io.airbyte.commons.util.DefaultAutoCloseableIterator.computeNext(DefaultAutoCloseableIterator.kt:42) ~[airbyte-cdk-dependencies-0.48.18.jar:?]
    	at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:145) ~[guava-33.3.0-jre.jar:?]
    	at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:140) ~[guava-33.3.0-jre.jar:?]
    	at com.google.common.collect.TransformedIterator.hasNext(TransformedIterator.java:46) ~[guava-33.3.0-jre.jar:?]
    	at io.airbyte.commons.util.DefaultAutoCloseableIterator.computeNext(DefaultAutoCloseableIterator.kt:42) ~[airbyte-cdk-dependencies-0.48.18.jar:?]
    	at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:145) ~[guava-33.3.0-jre.jar:?]
    	at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:140) ~[guava-33.3.0-jre.jar:?]
    	at io.airbyte.commons.util.CompositeIterator.computeNext(CompositeIterator.kt:67) ~[airbyte-cdk-dependencies-0.48.18.jar:?]
    	at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:145) ~[guava-33.3.0-jre.jar:?]
    	at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:140) ~[guava-33.3.0-jre.jar:?]
    	at io.airbyte.commons.util.CompositeIterator.computeNext(CompositeIterator.kt:67) ~[airbyte-cdk-dependencies-0.48.18.jar:?]
    	at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:145) ~[guava-33.3.0-jre.jar:?]
    	at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:140) ~[guava-33.3.0-jre.jar:?]
    	at io.airbyte.commons.util.DefaultAutoCloseableIterator.computeNext(DefaultAutoCloseableIterator.kt:42) ~[airbyte-cdk-dependencies-0.48.18.jar:?]
    	at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:145) ~[guava-33.3.0-jre.jar:?]
    	at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:140) ~[guava-33.3.0-jre.jar:?]
    	at io.airbyte.commons.util.DefaultAutoCloseableIterator.computeNext(DefaultAutoCloseableIterator.kt:42) ~[airbyte-cdk-dependencies-0.48.18.jar:?]
    	at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:145) ~[guava-33.3.0-jre.jar:?]
    	at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:140) ~[guava-33.3.0-jre.jar:?]
    	at java.base/java.util.Iterator.forEachRemaining(Iterator.java:132) ~[?:?]
    	at io.airbyte.cdk.integrations.base.IntegrationRunner.produceMessages(IntegrationRunner.kt:234) ~[airbyte-cdk-core-0.48.18.jar:?]
    	at io.airbyte.cdk.integrations.base.IntegrationRunner.readSerial(IntegrationRunner.kt:291) ~[airbyte-cdk-core-0.48.18.jar:?]
    	at io.airbyte.cdk.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.kt:190) [airbyte-cdk-core-0.48.18.jar:?]
    	at io.airbyte.cdk.integrations.base.IntegrationRunner.run(IntegrationRunner.kt:119) [airbyte-cdk-core-0.48.18.jar:?]
    	at io.airbyte.integrations.source.mssql.MssqlSource.main(MssqlSource.java:636) [io.airbyte.airbyte-integrations.connectors-source-mssql.jar:?]
    k
    • 2
    • 1
  • k

    Kothapalli Venkata Avinash

    11/21/2025, 9:11 AM
    @kapa.ai we are seeing the error Readiness probe failed: Get "<http://*...*:8001/api/v1/health>": dial tcp *8001: connect: connection refused.
    k
    • 2
    • 1
  • d

    David Robinson

    11/21/2025, 9:58 AM
    @kapa.ai how do I configure CPU/memory requests/limits for a custom Docker connector?
    k
    • 2
    • 1
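    On the resource question above: Airbyte OSS exposes default job-container resources through `JOB_MAIN_CONTAINER_*` environment variables, which apply to connector pods launched for jobs. A sketch of a Helm `values.yaml` override, assuming a recent Airbyte OSS Helm chart; the exact key under which env vars are injected may differ by chart version:
    Copy code
    ```yaml
    # values.yaml override — default resources for job (connector) containers.
    # Env var names are documented Airbyte settings; the "global.env_vars"
    # nesting is an assumption that depends on your Helm chart version.
    global:
      env_vars:
        JOB_MAIN_CONTAINER_CPU_REQUEST: "500m"
        JOB_MAIN_CONTAINER_CPU_LIMIT: "1"
        JOB_MAIN_CONTAINER_MEMORY_REQUEST: "1Gi"
        JOB_MAIN_CONTAINER_MEMORY_LIMIT: "2Gi"
    ```
    Per-connector overrides (rather than global defaults) are also possible via connector definition resource requirements, but the global env vars above are the simplest starting point.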
  • d

    David Robinson

    11/21/2025, 10:27 AM
    @kapa.ai how do I export sync timeline info (e.g. duration, volume of data, number of records, etc.)?
    k
    • 2
    • 1
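    On exporting sync timeline info: the Airbyte public API's jobs endpoint (`GET /api/public/v1/jobs`) returns per-job fields such as start time, duration, bytes synced, and rows synced, which cover most of what the timeline shows. A sketch that flattens such a response into CSV, assuming the field names shown in the sample below (they may vary by Airbyte version, so check your instance's API response first):
    Copy code
    ```python
    import csv
    import io

    def jobs_to_csv(jobs_response: dict) -> str:
        """Flatten an Airbyte public-API jobs listing into CSV text.

        Assumes each entry carries jobId/status/startTime/duration/
        bytesSynced/rowsSynced, as returned by GET /api/public/v1/jobs.
        """
        fields = ["jobId", "status", "startTime", "duration",
                  "bytesSynced", "rowsSynced"]
        buf = io.StringIO()
        writer = csv.DictWriter(buf, fieldnames=fields)
        writer.writeheader()
        for job in jobs_response.get("data", []):
            writer.writerow({f: job.get(f, "") for f in fields})
        return buf.getvalue()

    # Sample payload mimicking the API response shape (values illustrative).
    sample = {
        "data": [
            {"jobId": 123, "status": "succeeded",
             "startTime": "2025-11-20T09:00:00Z", "duration": "PT4M32S",
             "bytesSynced": 1048576, "rowsSynced": 4200},
        ]
    }
    print(jobs_to_csv(sample))
    ```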
  • h

    Hari Haran R

    11/21/2025, 10:52 AM
    @kapa.ai I want to clear the Airbyte cache. Our sync jobs have been pending for a long time due to insufficient storage or CPU. Our Airbyte version is 1.4.1. We cannot increase the server size because it would cost too much, so we need to clear any cache or unwanted storage.