# ask-ai

    Jason Wiener

    12/08/2025, 4:05 PM
    @kapa.ai The error below occurs during an attempt to set up Redshift as a source in a connection. The source's connection test succeeds, but this error occurs during setup of a connection that uses the source. The destination is not the issue: that destination works with other sources, and this source fails with multiple destinations.
    2025-12-08 09:00:37  ERROR 2025-12-08 09:00:37 error ERROR i.a.c.i.b.AirbyteExceptionHandler(uncaughtException):64 Something went wrong in the connector. See the logs for more details. java.lang.NullPointerException: null value in entry: isNullable=null
    	at com.google.common.collect.CollectPreconditions.checkEntryNotNull(CollectPreconditions.java:33) ~[guava-33.0.0-jre.jar:?]
    	at com.google.common.collect.ImmutableMapEntry.<init>(ImmutableMapEntry.java:54) ~[guava-33.0.0-jre.jar:?]
    	at com.google.common.collect.ImmutableMap.entryOf(ImmutableMap.java:345) ~[guava-33.0.0-jre.jar:?]
    	at com.google.common.collect.ImmutableMap$Builder.put(ImmutableMap.java:454) ~[guava-33.0.0-jre.jar:?]
    	at io.airbyte.cdk.integrations.source.jdbc.AbstractJdbcSource.getColumnMetadata(AbstractJdbcSource.java:248) ~[airbyte-cdk-db-sources-0.20.4.jar:?]
    	at io.airbyte.cdk.db.jdbc.JdbcDatabase$1.tryAdvance(JdbcDatabase.java:84) ~[airbyte-cdk-core-0.20.4.jar:?]
    	at java.base/java.util.Spliterator.forEachRemaining(Spliterator.java:332) ~[?:?]
    	at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509) ~[?:?]
    	at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499) ~[?:?]
    	at java.base/java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:921) ~[?:?]
    	at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[?:?]
    	at java.base/java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:682) ~[?:?]
    	at io.airbyte.cdk.db.jdbc.DefaultJdbcDatabase.bufferedResultSetQuery(DefaultJdbcDatabase.java:57) ~[airbyte-cdk-core-0.20.4.jar:?]
    	at io.airbyte.cdk.integrations.source.jdbc.AbstractJdbcSource.discoverInternal(AbstractJdbcSource.java:171) ~[airbyte-cdk-db-sources-0.20.4.jar:?]
    	at io.airbyte.integrations.source.redshift.RedshiftSource.discoverInternal(RedshiftSource.java:94) ~[io.airbyte.airbyte-integrations.connectors-source-redshift.jar:?]
    	at io.airbyte.integrations.source.redshift.RedshiftSource.discoverInternal(RedshiftSource.java:30) ~[io.airbyte.airbyte-integrations.connectors-source-redshift.jar:?]
    	at io.airbyte.cdk.integrations.source.relationaldb.AbstractDbSource.discoverWithoutSystemTables(AbstractDbSource.java:268) ~[airbyte-cdk-db-sources-0.20.4.jar:?]
    	at io.airbyte.cdk.integrations.source.relationaldb.AbstractDbSource.discover(AbstractDbSource.java:126) ~[airbyte-cdk-db-sources-0.20.4.jar:?]
    	at io.airbyte.cdk.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:159) ~[airbyte-cdk-core-0.20.4.jar:?]
    	at io.airbyte.cdk.integrations.base.IntegrationRunner.run(IntegrationRunner.java:125) ~[airbyte-cdk-core-0.20.4.jar:?]
    	at io.airbyte.integrations.source.redshift.RedshiftSource.main(RedshiftSource.java:141) ~[io.airbyte.airbyte-integrations.connectors-source-redshift.jar:?]
    
    Stack Trace: java.lang.NullPointerException: null value in entry: isNullable=null
    	at com.google.common.collect.CollectPreconditions.checkEntryNotNull(CollectPreconditions.java:33)
    	at com.google.common.collect.ImmutableMapEntry.<init>(ImmutableMapEntry.java:54)
    	at com.google.common.collect.ImmutableMap.entryOf(ImmutableMap.java:345)
    	at com.google.common.collect.ImmutableMap$Builder.put(ImmutableMap.java:454)
    	at io.airbyte.cdk.integrations.source.jdbc.AbstractJdbcSource.getColumnMetadata(AbstractJdbcSource.java:248)
    	at io.airbyte.cdk.db.jdbc.JdbcDatabase$1.tryAdvance(JdbcDatabase.java:84)
    	at java.base/java.util.Spliterator.forEachRemaining(Spliterator.java:332)
    	at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
    	at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
    	at java.base/java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:921)
    	at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
    	at java.base/java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:682)
    	at io.airbyte.cdk.db.jdbc.DefaultJdbcDatabase.bufferedResultSetQuery(DefaultJdbcDatabase.java:57)
    	at io.airbyte.cdk.integrations.source.jdbc.AbstractJdbcSource.discoverInternal(AbstractJdbcSource.java:171)
    	at io.airbyte.integrations.source.redshift.RedshiftSource.discoverInternal(RedshiftSource.java:94)
    	at io.airbyte.integrations.source.redshift.RedshiftSource.discoverInternal(RedshiftSource.java:30)
    	at io.airbyte.cdk.integrations.source.relationaldb.AbstractDbSource.discoverWithoutSystemTables(AbstractDbSource.java:268)
    	at io.airbyte.cdk.integrations.source.relationaldb.AbstractDbSource.discover(AbstractDbSource.java:126)
    	at io.airbyte.cdk.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:159)
    	at io.airbyte.cdk.integrations.base.IntegrationRunner.run(IntegrationRunner.java:125)
    	at io.airbyte.integrations.source.redshift.RedshiftSource.main(RedshiftSource.java:141)

    Nour Ben Hassen

    12/08/2025, 4:19 PM
    @kapa.ai I'm having issues with the DynamoDB connector: sometimes the sync works, and sometimes I receive the error "message='Airbyte could not track the sync progress. Sync process exited without reporting status.', type='io.airbyte.workers.exception.WorkloadMonitorException', nonRetryable=false" after a very long sync time. (OSS version)

    Lucas Segers

    12/08/2025, 4:49 PM
    Is it possible to have two connections pointed at the same BigQuery destination? I need one of them to be full refresh (once per weekend) and the normal one to run daily.

    Luis Gustavo Macedo Lousada

    12/08/2025, 8:19 PM
    @kapa.ai the airbyte-worker throws this error when trying to test a destination. How can I fix or debug this?
    2025-12-08 19:34:30,213 [Workflow Executor taskQueue="ui_commands", namespace="default": 1]     WARN    i.t.i.r.ReplayWorkflowTaskHandler(failureToWFTResult):302 - Workflow task processing failure. startedEventId=3, WorkflowId=check_b83dd098-8e2d-4741-84f5-0bd4390fd5f8, RunId=019aff75-4b49-7898-aff4-ecf3f6c52233. If seen continuously the workflow might be stuck.
    io.temporal.internal.statemachines.InternalWorkflowTaskException: Failure handling event 3 of type 'EVENT_TYPE_WORKFLOW_TASK_STARTED' during execution. {WorkflowTaskStartedEventId=3, CurrentStartedEventId=3}
    Caused by: io.temporal.internal.sync.PotentialDeadlockException: [TMPRL1101] Potential deadlock detected. Workflow thread "workflow-method-check_b83dd098-8e2d-4741-84f5-0bd4390fd5f8-019aff75-4b49-7898-aff4-ecf3f6c52233" didn't yield control for over a second. {detectionTimestamp=1765222469954, threadDumpTimestamp=1765222469957}
    
    workflow-method-check_b83dd098-8e2d-4741-84f5-0bd4390fd5f8-019aff75-4b49-7898-aff4-ecf3f6c52233
            at kotlin.reflect.jvm.internal.impl.types.KotlinTypeFactory.flexibleType(KotlinTypeFactory.kt:188)

    Hari Haran R

    12/09/2025, 6:11 AM
    @kapa.ai I'm using Airbyte Open Source 1.4 and trying to connect Customer.io, but when I try to enter the credentials it fails. It also says it's maintained by Faros AI ("The Customer.io source is maintained by Faros AI"). So does this connector work in Airbyte or not?

    Ishan Anilbhai Koradiya

    12/09/2025, 6:51 AM
    Hi @kapa.ai, what kind of instance types are recommended on AWS EKS for an Airbyte deployment with medium to high workloads?

    Ed

    12/09/2025, 8:46 AM
    How can I configure my date field in YAML to always use yesterday's date?
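    For reference, the value such a field needs to resolve to is simply "now minus one day" formatted as a date. A minimal Python sketch of that computation (the exact YAML interpolation syntax depends on which framework the field lives in, so this only shows the target value, not the YAML itself):

from datetime import datetime, timedelta, timezone

# Yesterday's date in UTC as an ISO date string, e.g. "2025-12-08".
yesterday = (datetime.now(timezone.utc) - timedelta(days=1)).strftime("%Y-%m-%d")
print(yesterday)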

    Markus Müüripeal

    12/09/2025, 9:02 AM
    @kapa.ai After upgrading from 1.4 to 2.0.1, syncs that took 1-2 minutes now take at least 30 minutes and move 0 bytes.

    Stefano Messina

    12/09/2025, 9:35 AM
    @kapa.ai I keep getting
    Request to Airbyte API failed: 409 Client Error: Conflict for url: <http://airbyte-airbyte-webapp-svc.airbyte.svc:80/api/v1/connections/sync>
    for a group of connections being triggered; the only thing they share is the destination.

    Ami Mehta

    12/09/2025, 10:58 AM
    @kapa.ai I am trying to develop a source connector for Etsy from scratch. I'm using a self-hosted version of Airbyte installed with the abctl command and running in a Docker container. I used the declarative authentication section in the Connector Builder UI to define the OAuth 2.0 authentication configuration, but there is no place to define the redirect URL there. It also prompts to use
    <http://localhost:3000/auth_flow>
    as the redirect_url in the Etsy app, which I did with
    ngrok_url/auth_flow
    When I authenticate the connector, the Etsy login window opens, but on login, instead of redirecting, the page shows an
    A stitch has gone away!
    error. Can you help with this please?

    Pablo Martin Calvo

    12/09/2025, 11:10 AM
    @kapa.ai How can I check the payloads of the api calls made by airbyte to hubspot?

    Ami Mehta

    12/09/2025, 11:25 AM
    @kapa.ai The redirect_url given in Airbyte is
    localhost:3000/auth_flow
    . Etsy won't be able to reach localhost, as it requires a public URL. How can I fix this? Is it possible to complete the OAuth 2.0 flow for the Etsy API in Airbyte?

    Jeremy Juventin

    12/09/2025, 12:44 PM
    @kapa.ai hi, I have deployed airbyte-webapp with helm chart v2, but I'm getting this error:
    2025-12-09 13:30:26.147 CET
    2025/12/09 12:30:26 [notice] 1#1: signal 29 (SIGIO) received
    2025-12-09 13:30:26.147 CET
    2025/12/09 12:30:26 [notice] 1#1: signal 17 (SIGCHLD) received from 35
    2025-12-09 13:30:26.147 CET
    2025/12/09 12:30:26 [notice] 1#1: worker process 35 exited with code 0
    2025-12-09 13:30:26.147 CET
    2025/12/09 12:30:26 [notice] 1#1: signal 29 (SIGIO) received
    2025-12-09 13:30:26.155 CET
    2025/12/09 12:30:26 [notice] 1#1: signal 17 (SIGCHLD) received from 37
    2025-12-09 13:30:26.155 CET
    2025/12/09 12:30:26 [notice] 1#1: worker process 37 exited with code 0
    2025-12-09 13:30:26.155 CET
    2025/12/09 12:30:26 [notice] 1#1: exit

    Jeremy Juventin

    12/09/2025, 12:49 PM
    @kapa.ai I'm trying to install Airbyte V2 on Kubernetes, but it keeps installing version 1.6.0. Here is my command line:
    helm install --version 2.0.19 --values ./helm/v2/values.yaml airbyte airbyte-v2/airbyte
    Any help?

    Yenying Chen

    12/09/2025, 1:14 PM
    @kapa.ai I tried to update a connection's cron expression in its settings, but got an unexpected error: "Failed to update connection."

    Jeremy Juventin

    12/09/2025, 1:55 PM
    @kapa.ai Since upgrading to version 2, my syncs do not start. They stay in "starting ..." with "No logs found for this job." and no processing pod is created.

    Simon Veerman

    12/09/2025, 3:28 PM
    Hi @kapa.ai I have a WooCommerce connector on an open source instance which keeps giving this error:
    2025-12-09 155310 info Backing off _send(...) for 1.0s (airbyte_cdk.sources.streams.http.exceptions.DefaultBackoffException: HTTP Status Code: 503. Error: Service unavailable.)
    2025-12-09 155310 info Caught retryable error 'HTTP Status Code: 503. Error: Service unavailable.' after 1 tries. Waiting 1 seconds then retrying...
    It happens on the orders table, but it can happen on any other table that has a lot of records. I've tried replicating the issue by running requests quickly and concurrently from the same IP, but I can't figure out what is wrong. What do you know about this issue with WooCommerce, and how could I effectively troubleshoot it?
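    A minimal Python sketch of that kind of reproduction test, assuming a placeholder store URL and WooCommerce REST API consumer key/secret, which requests the orders endpoint concurrently and tallies the HTTP status codes to see whether the store starts returning 503s under load:

import collections
from concurrent.futures import ThreadPoolExecutor

import requests

# Placeholders: replace with the real store URL and WooCommerce REST API credentials.
STORE = "https://example-store.com"
AUTH = ("ck_xxxxxxxx", "cs_xxxxxxxx")  # consumer key / consumer secret

def fetch_page(page: int) -> int:
    """Request one page of orders and return the HTTP status code."""
    resp = requests.get(
        f"{STORE}/wp-json/wc/v3/orders",
        params={"per_page": 100, "page": page},
        auth=AUTH,
        timeout=30,
    )
    return resp.status_code

# Fire 50 page requests with 10 concurrent workers and count the status codes.
with ThreadPoolExecutor(max_workers=10) as pool:
    statuses = list(pool.map(fetch_page, range(1, 51)))

print(collections.Counter(statuses))

    If the 503s only show up under concurrent load, the bottleneck is likely the WordPress host or a proxy in front of it rather than the connector itself.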

    Nofar Zeidenshnir

    12/09/2025, 4:02 PM
    @kapa.ai how can I predict the cost of a source without first knowing how much data I will move?

    Rafael Felipe

    12/09/2025, 6:37 PM
    mongodb connector (self-managed) error:
    Connector configuration is not valid. Unable to connect: Timed out after 30000 ms while waiting for a server that matches ReadPreferenceServerSelector{readPreference=primary}. Client view of cluster state is {type=UNKNOWN, servers=[{address=X, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketWriteException: Exception sending message}, caused by {javax.net.ssl.SSLHandshakeException: Remote host terminated the handshake}, caused by {java.io.EOFException: SSL peer shut down incorrectly
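    One way to narrow down an SSLHandshakeException like the one above is to try the TLS handshake to the MongoDB node outside of Airbyte. A minimal Python sketch, assuming a placeholder hostname and the default MongoDB port:

import socket
import ssl

# Placeholders: use the host and port from the connector configuration.
HOST, PORT = "mongodb.example.com", 27017

context = ssl.create_default_context()
with socket.create_connection((HOST, PORT), timeout=10) as raw_sock:
    # If the server terminates the handshake (as in the error above),
    # wrap_socket raises an ssl.SSLError / ConnectionResetError here.
    with context.wrap_socket(raw_sock, server_hostname=HOST) as tls_sock:
        print("TLS handshake OK:", tls_sock.version())
        print("Peer certificate subject:", tls_sock.getpeercert().get("subject"))

    If the handshake also fails here, the problem sits between the Airbyte host and the MongoDB server (TLS settings, firewall, or an endpoint that does not actually speak TLS) rather than in the connector configuration.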

    Joao Pedro Ferreira Canutto

    12/09/2025, 9:40 PM
    @kapa.ai I have a Self-Managed Community Airbyte 1.8.1 K8S installation, and I want to create a File source for a 2GB JSON file in an AWS S3 bucket. When I try to create the source connector, the "Test the source" step runs endlessly and never gives me an error. What could be causing this?

    Jason Wiener

    12/09/2025, 10:45 PM
    @kapa.ai Is there a way to enforce respect for column order when using Redshift as a source?

    Eliran Mualmi

    12/10/2025, 7:32 AM
    @kapa.ai how do I disable job retry attempts so the job runs only once?

    s

    12/10/2025, 9:36 AM
    Hi everyone, I need some help with an issue after upgrading the Postgres destination to v3.0.0. I upgraded only the destination connector, and after the upgrade Airbyte stopped writing into my existing table. Even though my sync mode is still Incremental | Append, Airbyte is now creating a new table instead of appending to the old one. No schema changes were made on my side.
    What I’m seeing:
    • Old table: still exists with the same column structure
    • After upgrade, Airbyte created a new table with a different internal naming pattern
    • Data is going into the new table, not the original one
    • _airbyte_extracted_at, _airbyte_raw_id, and _airbyte_meta structure looks the same
    • Only change was upgrading to Postgres Destination 3.0
    Questions:
    • Is this expected behavior with the new “Direct Load” architecture in v3.0?
    • How can I force Airbyte to continue appending into my original destination table?
    • Is there a recommended migration path to avoid table recreation during the Postgres 3.0 upgrade?
    I can share table names, logs, and the connection ID if needed. Thanks in advance for any help! 🙏

    Tom Dobson

    12/10/2025, 11:39 AM
    We deploy Airbyte v2.0.1 Community on Kubernetes with Helm chart v2.0.19. We use Airbyte's internal Postgres DB, with external state storage and logging in S3. We've had to delete the Airbyte DB and set it up again from scratch. Do we also need to delete the contents of the S3 bucket to get up and running again?

    Abhijith C

    12/10/2025, 12:11 PM
    @kapa.ai Is there an existing issue where an Airbyte sync is marked as succeeded even though the Docker image for the source is not available?

    Alejo Buxeres

    12/10/2025, 1:47 PM
    when building a connector through the UI builder, I'm getting this error for an asynchronous request:
    Internal Server Error: com.fasterxml.jackson.databind.JsonMappingException: String value length (20054016) exceeds the maximum allowed (20000000, from `StreamReadConstraints.getMaxStringLength()`) (through reference chain: io.airbyte.connectorbuilderserver.api.client.model.generated.StreamRead["slices"]->java.util.ArrayList[0]->io.airbyte.connectorbuilderserver.api.client.model.generated.StreamReadSlicesInner["pages"]->java.util.ArrayList[0]->io.airbyte.connectorbuilderserver.api.client.model.generated.StreamReadSlicesInnerPagesInner["response"]->io.airbyte.connectorbuilderserver.api.client.model.generated.HttpResponse["body"])

    Andrey Souza

    12/10/2025, 2:26 PM
    @kapa.ai I created a custom API source using the Connector Builder and exported the manifest JSON using the API. How can I use that manifest to recreate that source on another local Airbyte Core instance (v2.0.1) using the Terraform provider (v0.13.0)? Which Terraform resource template should I use: "airbyte_source_custom", "airbyte_declarative_source_definition", or another? And what should the airbyte/terraform/modules/sources/main.tf file look like?

    Tom Dobson

    12/10/2025, 4:12 PM
    What is the difference between the airbyte and airbyte-data-plane v2 Helm charts?

    Tom Dobson

    12/10/2025, 4:39 PM
    We're trying to install Airbyte 2.0.1 with Helm chart v2.0.19 on Kubernetes. The bootloader fails and it looks like the database creation also fails. Here are some logs. What are we doing wrong?
    2025-12-10 163257 Unsetting empty environment variable 'AB_INSTANCE_ADMIN_CLIENT_SECRET'
    2025-12-10 163257 Unsetting empty environment variable 'DEFAULT_DATAPLANE_GROUP_NAME'
    2025-12-10 163257 Unsetting empty environment variable 'AB_INSTANCE_ADMIN_CLIENT_ID'
    2025-12-10 163257 Unsetting empty environment variable 'AB_JWT_SIGNATURE_SECRET'
    2025-12-10 163257 Unsetting empty environment variable 'AB_INSTANCE_ADMIN_PASSWORD'
    2025-12-10 163300
    2025-12-10 163300 _ _ _ _
    2025-12-10 163300 / | (_)____/ /_ __ __/ /____
    2025-12-10 163300 / /| | / / ___/ __ \/ / / / __/ _ \
    2025-12-10 163300 / _ |/ / / / /_/ / /_/ / /_/ __/
    2025-12-10 163300 /_/ |_/_/_/ /_.___/\__, /\__/\___/
    2025-12-10 163300 /____/
    2025-12-10 163300 : airbyte-bootloader :
    2025-12-10 163300
    2025-12-10 163306 2025-12-10 163306,684 [main] INFO i.m.c.e.DefaultEnvironment(<init>):170 - Established active environments: [k8s, control-plane, edition-community]
    2025-12-10 163309 2025-12-10 163309,864 [main] INFO c.z.h.HikariDataSource(<init>):79 - HikariPool-1 - Starting...
    2025-12-10 163309 2025-12-10 163309,987 [main] INFO c.z.h.HikariDataSource(<init>):81 - HikariPool-1 - Start completed.
    2025-12-10 163309 2025-12-10 163309,996 [main] INFO c.z.h.HikariDataSource(<init>):79 - HikariPool-2 - Starting...
    2025-12-10 163309 2025-12-10 163309,997 [main] INFO c.z.h.HikariDataSource(<init>):81 - HikariPool-2 - Start completed.
    2025-12-10 163310 2025-12-10 163310,563 [main] INFO i.m.l.PropertiesLoggingLevelsConfigurer(configureLogLevelForPrefix):113 - Setting log level 'INFO' for logger: 'io.netty'
    2025-12-10 163310 2025-12-10 163310,564 [main] INFO i.m.l.PropertiesLoggingLevelsConfigurer(configureLogLevelForPrefix):113 - Setting log level 'ERROR' for logger: 'com.zaxxer.hikari'
    2025-12-10 163310 2025-12-10 163310,564 [main] INFO i.m.l.PropertiesLoggingLevelsConfigurer(configureLogLevelForPrefix):113 - Setting log level 'INFO' for logger: 'io.grpc'
    2025-12-10 163310 2025-12-10 163310,564 [main] INFO i.m.l.PropertiesLoggingLevelsConfigurer(configureLogLevelForPrefix):113 - Setting log level 'INFO' for logger: 'io.temporal'
    2025-12-10 163310 2025-12-10 163310,565 [main] INFO i.m.l.PropertiesLoggingLevelsConfigurer(configureLogLevelForPrefix):113 - Setting log level 'ERROR' for logger: 'com.zaxxer.hikari.pool'
    2025-12-10 163310 2025-12-10 163310,565 [main] INFO i.m.l.PropertiesLoggingLevelsConfigurer(configureLogLevelForPrefix):113 - Setting log level 'INFO' for logger: 'io.fabric8.kubernetes.client'
    2025-12-10 163316 2025-12-10 163316,066 [main] INFO i.m.r.Micronaut(start):101 - Startup completed in 15475ms. Server Running: http://airbyte-bootloader:9002
    2025-12-10 163317 2025-12-10 163317,662 [main] INFO i.a.f.ConfigFileClient(<init>):141 - path /etc/launchdarkly/flags.yml does not exist, will return default flag values
    2025-12-10 163319 2025-12-10 163319,466 [main] INFO i.a.c.s.RemoteDefinitionsProvider(<init>):82 - Created remote definitions provider for URL 'https://connectors.airbyte.com/files/' and registry 'oss'...
    2025-12-10 163319 2025-12-10 163319,467 [main] INFO i.a.c.i.c.SeedBeanFactory(seedDefinitionsProvider):46 - Using remote definitions provider for seeding
    2025-12-10 163320 2025-12-10 163320,789 [main] INFO i.a.b.Bootloader(load):71 - Initializing auth secrets...
    2025-12-10 163323 2025-12-10 163323,570 [main] INFO i.a.b.AuthKubernetesSecretInitializer(initializeSecrets):30 - Initializing auth secret in Kubernetes...
    2025-12-10 163323 2025-12-10 163323,673 [main] INFO i.a.b.AuthKubernetesSecretInitializer(getOrCreateSecretValue):108 - Using existing value for secret key instance-admin-password
    2025-12-10 163323 2025-12-10 163323,762 [main] INFO i.a.b.AuthKubernetesSecretInitializer(getOrCreateSecretValue):108 - Using existing value for secret key instance-admin-client-id
    2025-12-10 163323 2025-12-10 163323,775 [main] INFO i.a.b.AuthKubernetesSecretInitializer(getOrCreateSecretValue):108 - Using existing value for secret key instance-admin-client-secret
    2025-12-10 163323 2025-12-10 163323,788 [main] INFO i.a.b.AuthKubernetesSecretInitializer(getOrCreateSecretValue):108 - Using existing value for secret key jwt-signature-secret
    2025-12-10 163323 2025-12-10 163323,877 [main] INFO i.a.b.K8sSecretHelper(createOrUpdateSecret):45 - Secret with name airbyte-auth-secrets already exists. Updating it...
    2025-12-10 163323 2025-12-10 163323,973 [main] INFO i.a.b.K8sSecretHelper(createOrUpdateSecret):53 - Successfully updated secret airbyte-auth-secrets
    2025-12-10 163323 2025-12-10 163323,974 [main] INFO i.a.b.AuthKubernetesSecretInitializer(initializeSecrets):32 - Finished initializing auth secret.
    2025-12-10 163323 2025-12-10 163323,975 [main] INFO i.a.b.Bootloader(load):76 - Initializing databases...
    2025-12-10 163323 2025-12-10 163323,975 [main] INFO i.a.b.Bootloader(initializeDatabases):220 - Initializing databases...
    2025-12-10 163323 2025-12-10 163323,977 [main] WARN i.a.d.c.DatabaseAvailabilityCheck$DefaultImpls(check):65 - Waiting for database to become available...
    2025-12-10 163323 2025-12-10 163323,980 [main] INFO i.a.d.c.DatabaseAvailabilityCheck$DefaultImpls(isDatabaseConnected$lambda$6):96 - Testing airbyte configs database connection...
    2025-12-10 163355 2025-12-10 163355,071 [main] ERROR i.a.d.c.DatabaseAvailabilityCheck$DefaultImpls(isDatabaseConnected$lambda$6):103 - Failed to verify database connection.
    2025-12-10 163355 org.jooq.exception.DataAccessException: Error getting connection from data source HikariDataSource (HikariPool-2)
    2025-12-10 163355 at io.airbyte.db.init.DatabaseInitializer$DefaultImpls.initialize(DatabaseInitializer.kt:85)
    2025-12-10 163355 at io.airbyte.db.init.ConfigsDatabaseInitializer.initialize(ConfigsDatabaseInitializer.kt:20)
    2025-12-10 163355 at io.airbyte.bootloader.Bootloader.initializeDatabases(Bootloader.kt:221)
    2025-12-10 163355 at io.airbyte.bootloader.Bootloader.load(Bootloader.kt:77)
    2025-12-10 163355 at io.airbyte.bootloader.ApplicationKt.main(Application.kt:25)
    2025-12-10 163355 Caused by: java.sql.SQLTransientConnectionException: HikariPool-2 - Connection is not available, request timed out after 30064ms (total=0, active=0, idle=0, waiting=0)
    2025-12-10 163355 at com.zaxxer.hikari.pool.HikariPool.createTimeoutException(HikariPool.java:709)
    2025-12-10 163355 at com.zaxxer.hikari.pool.HikariPool.getConnection(HikariPool.java:188)
    2025-12-10 163355 at com.zaxxer.hikari.pool.HikariPool.getConnection(HikariPool.java:146)
    2025-12-10 163355 at com.zaxxer.hikari.HikariDataSource.getConnection(HikariDataSource.java:99)
    2025-12-10 163355 at org.jooq.impl.DataSourceConnectionProvider.acquire(DataSourceConnectionProvider.java:87)
    2025-12-10 163355 ... 24 common frames omitted
    2025-12-10 163355 Caused by: org.postgresql.util.PSQLException: Connection to airbyte-db-svc.airbyte.svc.cluster.local:5432 refused. Check that the hostname and port are correct and that the postmaster is accepting TCP/IP connections.
    2025-12-10 163540
    2025-12-10 163543 2025-12-10 163543,083 [main] ERROR i.a.d.c.DatabaseAvailabilityCheck$DefaultImpls(isDatabaseConnected$lambda$6):103 - Failed to verify database connection.
    2025-12-10 163543 org.jooq.exception.DataAccessException: Error getting connection from data source HikariDataSource (HikariPool-2)
    2025-12-10 163543 at org.jooq_3.19.18.POSTGRES.debug(Unknown Source)
    2025-12-10 163543 at org.jooq.impl.DataSourceConnectionProvider.acquire(DataSourceConnectionProvider.java:90)
    2025-12-10 163543 at org.jooq.impl.DefaultExecuteContext.connection(DefaultExecuteContext.java:651)
    2025-12-10 163543 at org.jooq.impl.AbstractQuery.connection(AbstractQuery.java:388) ...
    2025-12-10 163543 at io.airbyte.db.check.DatabaseAvailabilityCheck$DefaultImpls.isDatabaseConnected$lambda$6(DatabaseAvailabilityCheck.kt:97)
    2025-12-10 163543 at io.airbyte.db.check.DatabaseAvailabilityCheck$DefaultImpls.check(DatabaseAvailabilityCheck.kt:71)
    2025-12-10 163543 at io.airbyte.db.check.ConfigsDatabaseAvailabilityCheck.check(ConfigsDatabaseAvailabilityCheck.kt:18)
    2025-12-10 163543 at io.airbyte.db.init.DatabaseInitializer$DefaultImpls.initialize(DatabaseInitializer.kt:85)
    2025-12-10 163543 at io.airbyte.db.init.ConfigsDatabaseInitializer.initialize(ConfigsDatabaseInitializer.kt:20)
    2025-12-10 163543 at io.airbyte.bootloader.Bootloader.initializeDatabases(Bootloader.kt:221)
    2025-12-10 163543 at io.airbyte.bootloader.Bootloader.load(Bootloader.kt:77)
    2025-12-10 163543 at io.airbyte.bootloader.ApplicationKt.main(Application.kt:25)
    2025-12-10 163543 Caused by: java.sql.SQLTransientConnectionException: HikariPool-2 - Connection is not available, request timed out after 30000ms (total=0, active=0, idle=0, waiting=0)
    2025-12-10 163543 at com.zaxxer.hikari.pool.HikariPool.createTimeoutException(HikariPool.java:709)
    2025-12-10 163543 at com.zaxxer.hikari.pool.HikariPool.getConnection(HikariPool.java:188)
    2025-12-10 163543 at com.zaxxer.hikari.pool.HikariPool.getConnection(HikariPool.java:146)
    2025-12-10 163543 at com.zaxxer.hikari.HikariDataSource.getConnection(HikariDataSource.java:99)
    2025-12-10 163543 at org.jooq.impl.DataSourceConnectionProvider.acquire(DataSourceConnectionProvider.java:87)
    2025-12-10 163543 ... 24 common frames omitted
    2025-12-10 163543 Caused by: org.postgresql.util.PSQLException: Connection to airbyte-db-svc.airbyte.svc.cluster.local:5432 refused. Check that the hostname and port are correct and that the postmaster is accepting TCP/IP connections.
    2025-12-10 163543 Caused by: java.net.ConnectException: Connection refused
    2025-12-10 163543 at java.base/sun.nio.ch.Net.pollConnect(Native Method)
    2025-12-10 163543 at java.base/sun.nio.ch.Net.pollConnectNow(Net.java:682)
    2025-12-10 163543 at java.base/sun.nio.ch.NioSocketImpl.timedFinishConnect(NioSocketImpl.java:542)
    2025-12-10 163543 at java.base/sun.nio.ch.NioSocketImpl.connect(NioSocketImpl.java:592)
    2025-12-10 163543 at java.base/java.net.SocksSocketImpl.connect(SocksSocketImpl.java:327)
    2025-12-10 163543 at java.base/java.net.Socket.connect(Socket.java:751)
    2025-12-10 163543 at org.postgresql.core.PGStream.createSocket(PGStream.java:261)
    2025-12-10 163543 at org.postgresql.core.PGStream.<init>(PGStream.java:122)
    2025-12-10 163543 at org.postgresql.core.v3.ConnectionFactoryImpl.tryConnect(ConnectionFactoryImpl.java:146)
    2025-12-10 163543 at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:289)
    2025-12-10 163543 ... 14 common frames omitted
    2025-12-10 163655
    2025-12-10 163655 2025-12-10 163655,091 [main] INFO i.a.d.c.DatabaseAvailabilityCheck$DefaultImpls(check):73 - Database is not ready yet. Please wait a moment, it might still be initializing...
    2025-12-10 163701 2025-12-10 163701,091 [main] WARN i.a.d.c.DatabaseAvailabilityCheck$DefaultImpls(check):65 - Waiting for database to become available...
    2025-12-10 163701 2025-12-10 163701,092 [main] INFO i.a.d.c.DatabaseAvailabilityCheck$DefaultImpls(isDatabaseConnected$lambda$6):96 - Testing airbyte configs database connection...

    Kevin O'Keefe

    12/10/2025, 7:20 PM
    @kapa.ai Failed to save Google Ads Test due to the following error: errors.http.default