# ask-ai

    Rahul

    12/08/2025, 10:53 AM
    @kapa.ai How can I check the client ID, client secret, and tenant ID that I used while creating the Microsoft SharePoint source?
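
    For reference, the configured values can be read back from the instance itself. A minimal sketch, assuming a self-hosted deployment and the internal config API; host, credentials, source ID, and the exact config keys are placeholders, and secret fields such as the client secret are typically returned masked (the original secret lives in the Azure app registration):

        import requests

        # Placeholders -- substitute your own host, credentials, and source ID.
        AIRBYTE_URL = "http://localhost:8000/api/v1/sources/get"
        SOURCE_ID = "00000000-0000-0000-0000-000000000000"

        resp = requests.post(
            AIRBYTE_URL,
            json={"sourceId": SOURCE_ID},
            auth=("airbyte", "password"),  # abctl installs usually sit behind basic auth
            timeout=30,
        )
        resp.raise_for_status()
        config = resp.json().get("connectionConfiguration", {})
        # Expect tenant_id and client_id in clear text; the secret stays masked.
        print({k: config.get(k) for k in ("tenant_id", "client_id", "client_secret")})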

    Júlia Lemes

    12/08/2025, 11:09 AM
    @kapa.ai I want to update a table in the destination with a sync mode that lets me keep old records (a history of changes to the same record with the same id). Can I use Incremental + Append?
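
    For context: yes, Incremental | Append keeps old versions -- updated records arrive as new rows rather than overwrites. A toy sketch of that behavior (plain Python, illustrative only):

        rows = []

        def sync(batch, extracted_at):
            # Append-only: no dedup on the primary key.
            rows.extend({**r, "_airbyte_extracted_at": extracted_at} for r in batch)

        sync([{"id": 1, "status": "open"}], "2025-12-08T10:00:00Z")
        sync([{"id": 1, "status": "closed"}], "2025-12-09T10:00:00Z")

        # Both versions of id=1 are kept -- the change history the question asks for.
        for r in rows:
            print(r)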

    Abdullah mail

    12/08/2025, 1:14 PM
    @kapa.ai Could any of the following source data types in ClickHouse cause the error below: Array(String), Bool, DateTime, Int64, Nullable(DateTime), Nullable(String), String, UInt8?
    java.lang.RuntimeException: java.sql.SQLFeatureNotSupportedException: getResultSet not implemented

    A S Yamini

    12/08/2025, 1:28 PM
    2025-12-08 13:23:27.432 UTC [15] FATAL: data directory "/var/lib/postgresql/data/pgdata" has wrong ownership
    2025-12-08 13:23:27.432 UTC [15] HINT: The server must be started by the user that owns the data directory.
    ERROR Failed to install airbyte/airbyte Helm Chart
    ERROR Unable to install Airbyte locally
    ERROR unable to install airbyte chart: unable to install helm: failed pre-install: 1 error occurred:
    * pod airbyte-abctl-bootloader failed

    Kevin Liu

    12/08/2025, 2:57 PM
    @kapa.ai What is this community for?

    Jason Wiener

    12/08/2025, 4:05 PM
    @kapa.ai The error below is occurring during an attempt to set up Redshift as a source in a connection. The test method succeeds in configuring the source but this error occurs during setup of a connection using the source. The destination is not the issue as that destination works with other sources and this source fails with multiple destinations.
    2025-12-08 09:00:37 ERROR i.a.c.i.b.AirbyteExceptionHandler(uncaughtException):64 Something went wrong in the connector. See the logs for more details. java.lang.NullPointerException: null value in entry: isNullable=null
    	at com.google.common.collect.CollectPreconditions.checkEntryNotNull(CollectPreconditions.java:33) ~[guava-33.0.0-jre.jar:?]
    	at com.google.common.collect.ImmutableMapEntry.<init>(ImmutableMapEntry.java:54) ~[guava-33.0.0-jre.jar:?]
    	at com.google.common.collect.ImmutableMap.entryOf(ImmutableMap.java:345) ~[guava-33.0.0-jre.jar:?]
    	at com.google.common.collect.ImmutableMap$Builder.put(ImmutableMap.java:454) ~[guava-33.0.0-jre.jar:?]
    	at io.airbyte.cdk.integrations.source.jdbc.AbstractJdbcSource.getColumnMetadata(AbstractJdbcSource.java:248) ~[airbyte-cdk-db-sources-0.20.4.jar:?]
    	at io.airbyte.cdk.db.jdbc.JdbcDatabase$1.tryAdvance(JdbcDatabase.java:84) ~[airbyte-cdk-core-0.20.4.jar:?]
    	at java.base/java.util.Spliterator.forEachRemaining(Spliterator.java:332) ~[?:?]
    	at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509) ~[?:?]
    	at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499) ~[?:?]
    	at java.base/java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:921) ~[?:?]
    	at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[?:?]
    	at java.base/java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:682) ~[?:?]
    	at io.airbyte.cdk.db.jdbc.DefaultJdbcDatabase.bufferedResultSetQuery(DefaultJdbcDatabase.java:57) ~[airbyte-cdk-core-0.20.4.jar:?]
    	at io.airbyte.cdk.integrations.source.jdbc.AbstractJdbcSource.discoverInternal(AbstractJdbcSource.java:171) ~[airbyte-cdk-db-sources-0.20.4.jar:?]
    	at io.airbyte.integrations.source.redshift.RedshiftSource.discoverInternal(RedshiftSource.java:94) ~[io.airbyte.airbyte-integrations.connectors-source-redshift.jar:?]
    	at io.airbyte.integrations.source.redshift.RedshiftSource.discoverInternal(RedshiftSource.java:30) ~[io.airbyte.airbyte-integrations.connectors-source-redshift.jar:?]
    	at io.airbyte.cdk.integrations.source.relationaldb.AbstractDbSource.discoverWithoutSystemTables(AbstractDbSource.java:268) ~[airbyte-cdk-db-sources-0.20.4.jar:?]
    	at io.airbyte.cdk.integrations.source.relationaldb.AbstractDbSource.discover(AbstractDbSource.java:126) ~[airbyte-cdk-db-sources-0.20.4.jar:?]
    	at io.airbyte.cdk.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:159) ~[airbyte-cdk-core-0.20.4.jar:?]
    	at io.airbyte.cdk.integrations.base.IntegrationRunner.run(IntegrationRunner.java:125) ~[airbyte-cdk-core-0.20.4.jar:?]
    	at io.airbyte.integrations.source.redshift.RedshiftSource.main(RedshiftSource.java:141) ~[io.airbyte.airbyte-integrations.connectors-source-redshift.jar:?]
    
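
    The trace shows discovery failing because a column's JDBC metadata returned a null isNullable. A hedged way to hunt for the offending column outside Airbyte (psycopg2 against Redshift; connection details and schema name are placeholders):

        import psycopg2

        conn = psycopg2.connect(
            host="my-cluster.example.redshift.amazonaws.com",
            port=5439, dbname="dev", user="airbyte_user", password="...",
        )
        with conn, conn.cursor() as cur:
            cur.execute(
                """
                SELECT table_name, column_name, data_type, is_nullable
                FROM information_schema.columns
                WHERE table_schema = %s
                ORDER BY table_name, ordinal_position
                """,
                ("public",),
            )
            for table, column, dtype, nullable in cur.fetchall():
                # Anything other than YES/NO here would match the NPE in getColumnMetadata.
                if nullable not in ("YES", "NO"):
                    print(table, column, dtype, nullable)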

    Nour Ben Hassen

    12/08/2025, 4:19 PM
    @kapa.ai I'm having issues with the DynamoDB connector; sometimes the sync works and sometimes I receive an error "message='Airbyte could not track the sync progress. Sync process exited without reporting status.', type='io.airbyte.workers.exception.WorkloadMonitorException', nonRetryable=false" after a very long sync time. (OSS version)

    Lucas Segers

    12/08/2025, 4:49 PM
    Is it possible to have two connections pointed at the same BigQuery destination? I need one of them to be full refresh (once per weekend) and the normal one to be daily.

    Luis Gustavo Macedo Lousada

    12/08/2025, 8:19 PM
    @kapa.ai the airbyte-worker throws this error when I try to test a destination, how can I fix or debug this?
    2025-12-08 19:34:30,213 [Workflow Executor taskQueue="ui_commands", namespace="default": 1]     WARN    i.t.i.r.ReplayWorkflowTaskHandler(failureToWFTResult):302 - Workflow task processing failure. startedEventId=3, WorkflowId=check_b83dd098-8e2d-4741-84f5-0bd4390fd5f8, RunId=019aff75-4b49-7898-aff4-ecf3f6c52233. If seen continuously the workflow might be stuck.
    io.temporal.internal.statemachines.InternalWorkflowTaskException: Failure handling event 3 of type 'EVENT_TYPE_WORKFLOW_TASK_STARTED' during execution. {WorkflowTaskStartedEventId=3, CurrentStartedEventId=3}
    Caused by: io.temporal.internal.sync.PotentialDeadlockException: [TMPRL1101] Potential deadlock detected. Workflow thread "workflow-method-check_b83dd098-8e2d-4741-84f5-0bd4390fd5f8-019aff75-4b49-7898-aff4-ecf3f6c52233" didn't yield control for over a second. {detectionTimestamp=1765222469954, threadDumpTimestamp=1765222469957}
    
    workflow-method-check_b83dd098-8e2d-4741-84f5-0bd4390fd5f8-019aff75-4b49-7898-aff4-ecf3f6c52233
            at kotlin.reflect.jvm.internal.impl.types.KotlinTypeFactory.flexibleType(KotlinTypeFactory.kt:188)

    Hari Haran R

    12/09/2025, 6:11 AM
    @kapa.ai I'm using Airbyte Open Source 1.4 and trying to connect Customer.io, but when I try to enter the creds it's failing. It also says "The Customer.io source is maintained by Faros AI." So does this connector work in Airbyte or not?

    Ishan Anilbhai Koradiya

    12/09/2025, 6:51 AM
    Hi @kapa.ai what kind of instance types are recommended in AWS EKS for an Airbyte deployment with medium to high workloads?

    Ed

    12/09/2025, 8:46 AM
    How can I configure my date field in YAML to always have a date of yesterday?
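
    If this is Airbyte's low-code connector YAML, the usual route is a Jinja macro in the field, e.g. {{ format_datetime(now_utc() - duration('P1D'), '%Y-%m-%d') }} -- an assumption, so check the CDK macro reference for your version. The equivalent computation, spelled out in Python:

        from datetime import datetime, timedelta, timezone

        # "Yesterday" in UTC, formatted the way most date fields expect.
        yesterday = (datetime.now(timezone.utc) - timedelta(days=1)).strftime("%Y-%m-%d")
        print(yesterday)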

    Markus Müüripeal

    12/09/2025, 9:02 AM
    @kapa.ai After upgrading from 1.4 to 2.0.1, syncs that took 1-2 minutes now take 30 minutes minimum, with 0 bytes synced

    Stefano Messina

    12/09/2025, 9:35 AM
    @kapa.ai I keep getting
    Request to Airbyte API failed: 409 Client Error: Conflict for url: <http://airbyte-airbyte-webapp-svc.airbyte.svc:80/api/v1/connections/sync>
    for a group of connections being triggered; they only share the same destination
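
    A 409 from /connections/sync commonly means the connection already has a job running or being started, which fits several connections being kicked off at once. A minimal sketch of backing off and retrying instead of failing hard (endpoint and connection ID are placeholders):

        import requests
        import time

        API = "http://airbyte-airbyte-webapp-svc.airbyte.svc:80/api/v1"

        def trigger_with_retry(connection_id, attempts=5):
            for i in range(attempts):
                r = requests.post(f"{API}/connections/sync", json={"connectionId": connection_id})
                if r.status_code != 409:
                    r.raise_for_status()
                    return r.json()
                time.sleep(30 * (i + 1))  # back off while the conflicting job clears
            raise RuntimeError("connection still busy after retries")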

    Ami Mehta

    12/09/2025, 10:58 AM
    @kapa.ai I am trying to develop a source connector for Etsy from scratch, using a self-hosted version of Airbyte installed with the abctl command and running in a Docker container. I used the declarative authentication section to define the OAuth 2.0 configuration in the connector builder UI, but there is no place to define the redirect URL there. It also prompts to use
    <http://localhost:3000/auth_flow>
    as the redirect_url in the Etsy app, which I did with
    ngrok_url/auth_flow
    When I authenticate the connector, the Etsy login window opens, but on login, instead of redirecting, the page shows the error
    A stitch has gone away!
    Can you help with this please?

    Pablo Martin Calvo

    12/09/2025, 11:10 AM
    @kapa.ai How can I check the payloads of the api calls made by airbyte to hubspot?
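
    Two common routes: raise the connector's log level to DEBUG (request URLs and parameters generally show up in the sync logs), or replay the call outside Airbyte. A hedged sketch of the latter; the endpoint and parameters are a guess at what the connector hits, and the token is a placeholder:

        import requests

        resp = requests.get(
            "https://api.hubapi.com/crm/v3/objects/contacts",
            headers={"Authorization": "Bearer <private-app-token>"},
            params={"limit": 10},
            timeout=30,
        )
        print(resp.status_code)
        print(resp.json())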

    Ami Mehta

    12/09/2025, 11:25 AM
    @kapa.ai The redirect_url given in Airbyte is
    localhost:3000/auth_flow
    . Etsy won't be able to reach localhost, as it requires a public URL. How can I fix this? Is it possible to complete the OAuth 2.0 flow for the Etsy API in Airbyte?
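
    One workaround while testing: register a public tunnel URL (e.g. from ngrok) in the Etsy app and forward it to the local port, then capture the authorization code when Etsy redirects back. A minimal sketch of a throwaway local receiver (stdlib only; the port and path mirror the defaults mentioned above):

        from http.server import BaseHTTPRequestHandler, HTTPServer
        from urllib.parse import urlparse, parse_qs

        class AuthFlowHandler(BaseHTTPRequestHandler):
            def do_GET(self):
                # Etsy redirects to <tunnel>/auth_flow?code=...; print the code.
                qs = parse_qs(urlparse(self.path).query)
                print("authorization code:", qs.get("code"))
                self.send_response(200)
                self.end_headers()
                self.wfile.write(b"OK - you can close this tab.")

        HTTPServer(("0.0.0.0", 3000), AuthFlowHandler).serve_forever()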

    Jeremy Juventin

    12/09/2025, 12:44 PM
    @kapa.ai hi, I have deployed airbyte-webapp with helm chart v2, but I'm getting this error:
    2025/12/09 12:30:26 [notice] 1#1: signal 29 (SIGIO) received
    2025/12/09 12:30:26 [notice] 1#1: signal 17 (SIGCHLD) received from 35
    2025/12/09 12:30:26 [notice] 1#1: worker process 35 exited with code 0
    2025/12/09 12:30:26 [notice] 1#1: signal 29 (SIGIO) received
    2025/12/09 12:30:26 [notice] 1#1: signal 17 (SIGCHLD) received from 37
    2025/12/09 12:30:26 [notice] 1#1: worker process 37 exited with code 0
    2025/12/09 12:30:26 [notice] 1#1: exit

    Jeremy Juventin

    12/09/2025, 12:49 PM
    @kapa.ai I'm trying to install Airbyte V2 on Kubernetes, but it keeps installing version 1.6.0. Here is my command line:
    helm install --version 2.0.19 --values ./helm/v2/values.yaml airbyte airbyte-v2/airbyte
    Any help?

    Yenying Chen

    12/09/2025, 1:14 PM
    @kapa.ai I tried to update a connection's cron expression in its settings, but got an unexpected error: "Failed to update connection."

    Jeremy Juventin

    12/09/2025, 1:55 PM
    @kapa.ai Since upgrading to version 2, my syncs do not start. They sit at "starting ..." with "No logs found for this job." and no processing pod is created

    Simon Veerman

    12/09/2025, 3:28 PM
    Hi @kapa.ai I have a WooCommerce connector on an open source instance which keeps giving this error: 2025-12-09 15:53:10 info Backing off _send(...) for 1.0s (airbyte_cdk.sources.streams.http.exceptions.DefaultBackoffException: HTTP Status Code: 503. Error: Service unavailable.) 2025-12-09 15:53:10 info Caught retryable error 'HTTP Status Code: 503. Error: Service unavailable.' after 1 tries. Waiting 1 seconds then retrying... It happens on the orders table, but it can happen on any other table which has a lot of records. I've tried replicating the issue by running requests quickly and concurrently from the same IP, but I can't figure out what is wrong. What do you know about this issue with WooCommerce, and how could I effectively troubleshoot it?
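
    One way to make the reproduction closer to what the connector does: page through the orders endpoint with a few concurrent workers and watch for 503s. A hedged sketch (the standard WooCommerce REST API; store URL and keys are placeholders -- many 503s here turn out to be host-level rate limiting rather than WooCommerce itself):

        import requests
        from concurrent.futures import ThreadPoolExecutor

        BASE = "https://my-store.example/wp-json/wc/v3/orders"
        AUTH = ("ck_placeholder", "cs_placeholder")  # consumer key/secret

        def fetch(page):
            r = requests.get(BASE, auth=AUTH, params={"per_page": 100, "page": page}, timeout=60)
            return page, r.status_code

        with ThreadPoolExecutor(max_workers=4) as pool:
            for page, status in pool.map(fetch, range(1, 21)):
                if status != 200:
                    print(f"page {page}: HTTP {status}")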

    Nofar Zeidenshnir

    12/09/2025, 4:02 PM
    @kapa.ai how can I predict the cost of a source without first knowing how much data I will move?
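
    A common approach is a back-of-the-envelope estimate: get a row count and average row size from the source (or run one small bounded sync), then extrapolate. All numbers below are hypothetical:

        rows = 5_000_000      # e.g. SELECT COUNT(*) on the source table
        avg_row_bytes = 400   # e.g. table size divided by row count
        total_gb = rows * avg_row_bytes / 1024**3
        print(f"~{total_gb:.1f} GB per full sync")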

    Rafael Felipe

    12/09/2025, 6:37 PM
    mongodb connector (self-managed) error:
    Connector configuration is not valid. Unable to connect: Timed out after 30000 ms while waiting for a server that matches ReadPreferenceServerSelector{readPreference=primary}. Client view of cluster state is {type=UNKNOWN, servers=[{address=10.19.41.93:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketWriteException: Exception sending message}, caused by {javax.net.ssl.SSLHandshakeException: Remote host terminated the handshake}, caused by {java.io.EOFException: SSL peer shut down incorrectly
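
    The handshake error suggests a TLS mismatch between the connector settings and what mongod actually speaks. A quick probe outside Airbyte (pymongo; the host is taken from the error above, everything else is a placeholder):

        from pymongo import MongoClient
        from pymongo.errors import PyMongoError

        for tls in (True, False):
            try:
                client = MongoClient(
                    "mongodb://10.19.41.93:27017",
                    tls=tls,
                    serverSelectionTimeoutMS=5000,
                )
                client.admin.command("ping")
                print(f"tls={tls}: connected")
            except PyMongoError as e:
                print(f"tls={tls}: {type(e).__name__}: {e}")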

    Joao Pedro Ferreira Canutto

    12/09/2025, 9:40 PM
    @kapa.ai I have a Self-Managed Community Airbyte 1.8.1 K8s installation, and I want to create a File source for a 2 GB JSON file in an AWS bucket. When I try to create the source connector, the "Test the source" part runs endlessly, never giving me any error. What could be causing this?

    Jason Wiener

    12/09/2025, 10:45 PM
    @kapa.ai Is there a way to enforce respect for column order when using Redshift as a source?

    Eliran Mualmi

    12/10/2025, 7:32 AM
    @kapa.ai how do I disable job retry attempts so the job runs only once?

    s

    12/10/2025, 9:36 AM
    Hi everyone, I need some help with an issue after upgrading the Postgres destination to v3.0.0. I upgraded only the destination connector, and after the upgrade Airbyte stopped writing into my existing table. Even though my sync mode is still Incremental | Append, Airbyte is now creating a new table instead of appending to the old one. No schema changes were made on my side.
    What I'm seeing:
    • Old table: still exists with the same column structure
    • After the upgrade, Airbyte created a new table with a different internal naming pattern
    • Data is going into the new table, not the original one
    • The _airbyte_extracted_at, _airbyte_raw_id, and _airbyte_meta structure looks the same
    • The only change was upgrading to Postgres Destination 3.0
    Questions:
    • Is this expected behavior with the new "Direct Load" architecture in v3.0?
    • How can I force Airbyte to continue appending into my original destination table?
    • Is there a recommended migration path to avoid table recreation during the Postgres 3.0 upgrade?
    I can share table names, logs, and the connection ID if needed. Thanks in advance for any help! 🙏
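
    Before forcing anything, it may help to confirm both tables exist as described and compare row counts, so a manual backfill (INSERT INTO old SELECT ... FROM new) can be planned deliberately. A hedged sketch -- connection details and both table names are placeholders for whatever the connector actually created:

        import psycopg2

        conn = psycopg2.connect("dbname=warehouse user=airbyte host=localhost password=...")
        with conn, conn.cursor() as cur:
            # Substitute the real old/new table names observed in the destination.
            for table in ("public.my_stream_old", "public.my_stream_new"):
                cur.execute(f"SELECT count(*) FROM {table}")
                print(table, cur.fetchone()[0])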

    Tom Dobson

    12/10/2025, 11:39 AM
    We deploy Airbyte v2.0.1 community on Kubernetes with Helm chart v2.0.19. We use Airbyte's internal Postgres DB and external state storage and logging in S3. We've had to delete the Airbyte DB and set it up again from scratch. Do we also need to delete the contents of the S3 bucket to get up and running again?
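
    If the goal is a truly fresh start, the state and log prefixes in the bucket can at least be inspected first. A hedged sketch with boto3 -- the bucket name and prefixes are placeholders, so check what your chart actually writes before deleting anything:

        import boto3

        s3 = boto3.client("s3")
        BUCKET = "my-airbyte-bucket"

        for prefix in ("state/", "job-logging/", "workload/"):
            resp = s3.list_objects_v2(Bucket=BUCKET, Prefix=prefix)
            for obj in resp.get("Contents", []):
                print(obj["Key"])
                # s3.delete_object(Bucket=BUCKET, Key=obj["Key"])  # uncomment to clear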

    Abhijith C

    12/10/2025, 12:11 PM
    @kapa.ai Is there an existing issue where an Airbyte sync is marked as succeeded even though the Docker image for the source is not available?