# ask-ai

    Pablo Martin Calvo

    12/05/2025, 2:20 PM
    @kapa.ai When trying to send data to contacts in HubSpot, I'm getting an error
    Warning from destination: Invalid response with status code 409 while starting ingestion: {"status":"error","message":"Contact already exists. Existing ID: id_1","correlationId":"****","category":"CONFLICT"}
    This happens for HubSpot contacts that have more than one email address, when both emails are sent as separate records for upserting. Has anybody else run into this problem? Is a fix in progress?
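For anyone hitting the same 409, here is a minimal sketch of one way a client could react to HubSpot's conflict response: parse the "Existing ID" out of the error body and switch from create to update. The function name and flow are hypothetical illustrations, not Airbyte's actual connector code.

```python
import json
import re

def resolve_conflict(status_code: int, body: str):
    """Given a HubSpot create-contact response, decide the follow-up action.

    Returns ("created", None) on success, or ("update_existing", contact_id)
    when HubSpot reports the contact already exists (409 CONFLICT).
    """
    if status_code != 409:
        return ("created", None)
    payload = json.loads(body)
    # HubSpot embeds the surviving record's ID in the message text,
    # e.g. "Contact already exists. Existing ID: id_1"
    match = re.search(r"Existing ID:\s*(\S+)", payload.get("message", ""))
    return ("update_existing", match.group(1) if match else None)

body409 = '{"status":"error","message":"Contact already exists. Existing ID: id_1","category":"CONFLICT"}'
print(resolve_conflict(409, body409))  # ('update_existing', 'id_1')
```

The caller would then issue a PATCH against the returned ID instead of retrying the create.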

    Lucas Segers

    12/05/2025, 2:42 PM
    @kapa.ai what is this error when trying to set up a new BigQuery data source with a GCS bucket? 2025-12-05 14:41:31,315 [io-executor-thread-6] WARN i.a.c.e.EntitlementServiceImpl(hasEnterpriseConnectorEntitlements-Y0BDzM4):185 - Connector entitlement not available. actorDefinitionId=d2542966-8cc8-4899-9b74-413a7d9bb28e organizationId=OrganizationId(value=00000000-0000-0000-0000-000000000000)

    Alexander Ettingshausen

    12/05/2025, 3:52 PM
    @kapa.ai I am interested in a Firebird source connector to transfer data to BigQuery and found this page https://airbyte.com/integrations/firebird. In the Airbyte UI I can't find any Firebird source connector. Could you support me with that?

    Parry Chen

    12/05/2025, 4:42 PM
    Is there a Sage Intacct connector? I can't find it in the UI or on GitHub.

    Júlia Lemes

    12/05/2025, 4:45 PM
    @kapa.ai I have a pipeline in Airbyte with one stream whose sync mode is Full Refresh | Overwrite. What does this sync mode do behind the scenes? What commands does it run in Redshift, especially if I have the drop cascade option enabled for the Redshift destination connector?

    Juan Fabrega

    12/05/2025, 7:21 PM
    @kapa.ai Is there a way to set an alert for pipelines stalling or taking longer than usual to run?
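Absent a built-in "slow sync" alert, one generic approach is to compare a sync's current runtime against its recent history and alert when it exceeds a multiple of the median. A sketch with illustrative thresholds; `is_stalled` is a hypothetical helper, and the durations would come from your own job history (e.g. the Airbyte jobs API or UI):

```python
from statistics import median

def is_stalled(history_secs, current_secs, factor=2.0):
    """Flag a running sync whose duration exceeds `factor` times the
    median of recent successful runs. The factor is an arbitrary example."""
    if not history_secs:
        return False  # no baseline yet, nothing to compare against
    return current_secs > factor * median(history_secs)

# Durations (seconds) of the last successful syncs for one connection.
recent = [110, 95, 130, 120, 105]
print(is_stalled(recent, 600))  # True: 600s exceeds 2x the 110s median
print(is_stalled(recent, 150))  # False
```

A cron job or monitoring sidecar could run this check and fire a webhook or pager when it returns True.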

    Mateo Graciano

    12/05/2025, 8:18 PM
    @kapa.ai What can I do if my disk has no free space left?
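Common culprits for a full disk on a self-hosted deployment are accumulated job logs and unused Docker images (`docker system prune` can reclaim image space). As a first diagnostic, here is a small sketch that checks free headroom on a path; the 5 GiB threshold is an arbitrary example:

```python
import shutil

def disk_headroom(path="/", min_free_gib=5.0):
    """Return (free space in GiB, whether it is below the threshold)."""
    usage = shutil.disk_usage(path)
    free_gib = usage.free / (1024 ** 3)
    return free_gib, free_gib < min_free_gib

free, low = disk_headroom("/")
print(f"free: {free:.1f} GiB, low: {low}")
```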

    Ляшик Іван

    12/08/2025, 7:21 AM
    Hello! During replication in Airbyte I am periodically encountering the following errors:
    • Workload Heartbeat Error
    • Workload failed, source: workload-monitor-heartbeat
    • Failed to connect to airbyte-workload-api-server-svc:8007 (Connection refused)
    • No exit code found. exitCode.txt does not exist
    In the logs it looks like the sync crashes in the middle of the process, and the source container does not manage to write the exitCode.txt file before termination. The data size per sync is approximately:
    • 3.7M records emitted (~379 MB total)
    • 3 streams (around 1M, 1.58M and 1.126M records)
    • The sync runs for ~4 minutes before failing.
    ❓ Questions for Support — could you please help me understand the root cause and how to resolve this issue?
    1. What could cause the Workload Heartbeat Error?
    2. Specifically, why would the workload be unable to connect to airbyte-workload-api-server-svc:8007 (Connection refused)?
    3. Why does the workload container fail to create exitCode.txt? This seems to indicate the container crashes abruptly rather than exiting gracefully.
    4. Could this problem be related to insufficient resources (RAM/CPU) on the workload container or the workload-api-server? If so, what are the recommended resource requirements for handling large syncs (~4M records / ~400 MB)?
    5. Are there any known issues in Airbyte where the Workload API Server loses connection during high-volume syncs?
    6. What additional logs or metrics would you need from my environment to further diagnose the issue (pod logs, describe output, node metrics, etc.)?
    7. Do you have recommended approaches for optimizing large syncs? For example:
    ◦ Increasing workload / heartbeat timeouts
    ◦ Tuning CPU/RAM for workloads or the API server
    ◦ Splitting the sync into smaller batches
    ◦ Using the new Worker mode or other scaling strategies
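On the resource question above: Airbyte OSS exposes per-job container resources through environment variables. A hedged illustration, assuming a self-hosted deployment where these variables are honored; the values are examples to tune, not recommendations:

```yaml
# Illustrative only: raise the resources of sync job pods via the
# job-container environment variables in your deployment config.
JOB_MAIN_CONTAINER_CPU_REQUEST: "1"
JOB_MAIN_CONTAINER_CPU_LIMIT: "2"
JOB_MAIN_CONTAINER_MEMORY_REQUEST: "2Gi"
JOB_MAIN_CONTAINER_MEMORY_LIMIT: "4Gi"
```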

    Ляшик Іван

    12/08/2025, 7:45 AM
    Subject: Job Failures – Workload Heartbeat Error / No exit code found
    Body: Hello Airbyte Support Team, we are experiencing multiple job failures on Managed Airbyte Cloud. Below are the details:
    • Environment: Managed Airbyte Cloud
    • Job ID: 3033
    • Time: 2025-12-08 03:23 UTC
    • Connectors affected: dok_system_tables.* (multiple connectors)
    • Error messages observed:
    ◦ Workload Heartbeat Error
    ◦ No exit code found
    ◦ Input was fully read, but some streams did not receive a terminal stream status message
    ◦ Streams without terminal status: summary_storage_delivery_date, summary_storage_delivery, product_link_all, landing_url
    • Logs: (attached / see exported logs)
    • Additional notes:
    ◦ The failures happen on multiple connectors.
    ◦ Large datasets are being synced.
    ◦ We are unsure whether the Airbyte version was recently updated.
    ◦ The errors suggest potential instability in the workload API or pods, but as this is Managed Airbyte Cloud, we do not have access to internal pod logs or Kubernetes events.
    Could you please investigate why these jobs are failing and advise on a resolution? We want to ensure reliable replication for these connectors.

    Rahul

    12/08/2025, 9:39 AM
    @kapa.ai How can we automatically install, or be prompted to trust, the TLS certificate for a source database when it is missing? Fivetran provides a prompt to review and trust the source TLS certificate during connection setup. Without adding the TLS certificate, database instances that require custom TLS certificates cannot connect. I am not able to connect with the XAP MongoDB connector; the connection fails because the required TLS certificate is not present in the Airbyte JVM trust store. Attached screenshot - I can see the TLS certificates from the server during the SSL handshake using the command below.

    Michael DeWulf

    12/08/2025, 10:35 AM
    @kapa.ai, in the builder, I'm transforming an "updatedAt" timestamp that Python can't parse into a "cleaned" version that Python can parse, and I'm trying to use that cleaned field to enable incremental sync. Everything looks right with the cleaned field when I examine it, but I keep getting an "Unknown Error" when I test the incremental sync step with it. I need help debugging this.

    Rahul

    12/08/2025, 10:53 AM
    @kapa.ai How can I check the client ID, client secret, and tenant ID that I used while creating the Microsoft SharePoint source?

    Júlia Lemes

    12/08/2025, 11:09 AM
    @kapa.ai I want to update a table in the destination with a sync mode that lets me keep old records (a history of changes to the same record with the same id). Can I use Incremental + Append?
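Incremental | Append is generally described as appending every synced record version without overwriting or deduplicating, so the destination accumulates a change history per id. A toy sketch of those semantics, with a plain Python list standing in for the destination table (not Airbyte's actual write path):

```python
def append_sync(destination, new_records):
    """Incremental | Append semantics: every record version is appended;
    nothing is overwritten or deduplicated."""
    destination.extend(new_records)
    return destination

table = []
append_sync(table, [{"id": 1, "status": "open"}])
append_sync(table, [{"id": 1, "status": "closed"}])  # same id, new version
print([r["status"] for r in table])  # ['open', 'closed'] -- both versions kept
```

By contrast, Incremental | Deduped modes would keep only the latest version per primary key.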

    Abdullah mail

    12/08/2025, 1:14 PM
    @kapa.ai Could any of the following source data types in ClickHouse cause the error below?
    Array(String), Bool, DateTime, Int64, Nullable(DateTime), Nullable(String), String, UInt8
    java.lang.RuntimeException: java.sql.SQLFeatureNotSupportedException: getResultSet not implemented

    A S Yamini

    12/08/2025, 1:28 PM
    2025-12-08 13:23:27.432 UTC [15] FATAL: data directory "/var/lib/postgresql/data/pgdata" has wrong ownership
    2025-12-08 13:23:27.432 UTC [15] HINT: The server must be started by the user that owns the data directory.
    ERROR Failed to install airbyte/airbyte Helm Chart
    ERROR Unable to install Airbyte locally
    ERROR unable to install airbyte chart: unable to install helm: failed pre-install: 1 error occurred: * pod airbyte-abctl-bootloader failed

    Kevin Liu

    12/08/2025, 2:57 PM
    @kapa.ai What is this community for?

    Jason Wiener

    12/08/2025, 4:05 PM
    @kapa.ai The error below occurs during an attempt to set up Redshift as a source in a connection. The test step succeeds when configuring the source, but this error occurs while setting up a connection that uses the source. The destination is not the issue: that destination works with other sources, and this source fails with multiple destinations.
    2025-12-08 09:00:37  ERROR 2025-12-08 09:00:37 error ERROR i.a.c.i.b.AirbyteExceptionHandler(uncaughtException):64 Something went wrong in the connector. See the logs for more details. java.lang.NullPointerException: null value in entry: isNullable=null
    	at com.google.common.collect.CollectPreconditions.checkEntryNotNull(CollectPreconditions.java:33) ~[guava-33.0.0-jre.jar:?]
    	at com.google.common.collect.ImmutableMapEntry.<init>(ImmutableMapEntry.java:54) ~[guava-33.0.0-jre.jar:?]
    	at com.google.common.collect.ImmutableMap.entryOf(ImmutableMap.java:345) ~[guava-33.0.0-jre.jar:?]
    	at com.google.common.collect.ImmutableMap$Builder.put(ImmutableMap.java:454) ~[guava-33.0.0-jre.jar:?]
    	at io.airbyte.cdk.integrations.source.jdbc.AbstractJdbcSource.getColumnMetadata(AbstractJdbcSource.java:248) ~[airbyte-cdk-db-sources-0.20.4.jar:?]
    	at io.airbyte.cdk.db.jdbc.JdbcDatabase$1.tryAdvance(JdbcDatabase.java:84) ~[airbyte-cdk-core-0.20.4.jar:?]
    	at java.base/java.util.Spliterator.forEachRemaining(Spliterator.java:332) ~[?:?]
    	at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509) ~[?:?]
    	at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499) ~[?:?]
    	at java.base/java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:921) ~[?:?]
    	at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[?:?]
    	at java.base/java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:682) ~[?:?]
    	at io.airbyte.cdk.db.jdbc.DefaultJdbcDatabase.bufferedResultSetQuery(DefaultJdbcDatabase.java:57) ~[airbyte-cdk-core-0.20.4.jar:?]
    	at io.airbyte.cdk.integrations.source.jdbc.AbstractJdbcSource.discoverInternal(AbstractJdbcSource.java:171) ~[airbyte-cdk-db-sources-0.20.4.jar:?]
    	at io.airbyte.integrations.source.redshift.RedshiftSource.discoverInternal(RedshiftSource.java:94) ~[io.airbyte.airbyte-integrations.connectors-source-redshift.jar:?]
    	at io.airbyte.integrations.source.redshift.RedshiftSource.discoverInternal(RedshiftSource.java:30) ~[io.airbyte.airbyte-integrations.connectors-source-redshift.jar:?]
    	at io.airbyte.cdk.integrations.source.relationaldb.AbstractDbSource.discoverWithoutSystemTables(AbstractDbSource.java:268) ~[airbyte-cdk-db-sources-0.20.4.jar:?]
    	at io.airbyte.cdk.integrations.source.relationaldb.AbstractDbSource.discover(AbstractDbSource.java:126) ~[airbyte-cdk-db-sources-0.20.4.jar:?]
    	at io.airbyte.cdk.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:159) ~[airbyte-cdk-core-0.20.4.jar:?]
    	at io.airbyte.cdk.integrations.base.IntegrationRunner.run(IntegrationRunner.java:125) ~[airbyte-cdk-core-0.20.4.jar:?]
    	at io.airbyte.integrations.source.redshift.RedshiftSource.main(RedshiftSource.java:141) ~[io.airbyte.airbyte-integrations.connectors-source-redshift.jar:?]
    

    Nour Ben Hassen

    12/08/2025, 4:19 PM
    @kapa.ai I'm having issues with the DynamoDB connector: sometimes the sync works, and sometimes I receive the error "message='Airbyte could not track the sync progress. Sync process exited without reporting status.', type='io.airbyte.workers.exception.WorkloadMonitorException', nonRetryable=false" after a very long sync time. (OSS version)

    Lucas Segers

    12/08/2025, 4:49 PM
    Is it possible to have two connections pointed at the same BigQuery destination? I need one of them to be full refresh (once per weekend) and the normal one to be daily.

    Luis Gustavo Macedo Lousada

    12/08/2025, 8:19 PM
    @kapa.ai the airbyte-worker throws this error when trying to test a destination. How can I fix or debug this?
    2025-12-08 19:34:30,213 [Workflow Executor taskQueue="ui_commands", namespace="default": 1]     WARN    i.t.i.r.ReplayWorkflowTaskHandler(failureToWFTResult):302 - Workflow task processing failure. startedEventId=3, WorkflowId=check_b83dd098-8e2d-4741-84f5-0bd4390fd5f8, RunId=019aff75-4b49-7898-aff4-ecf3f6c52233. If seen continuously the workflow might be stuck.
    io.temporal.internal.statemachines.InternalWorkflowTaskException: Failure handling event 3 of type 'EVENT_TYPE_WORKFLOW_TASK_STARTED' during execution. {WorkflowTaskStartedEventId=3, CurrentStartedEventId=3}
    Caused by: io.temporal.internal.sync.PotentialDeadlockException: [TMPRL1101] Potential deadlock detected. Workflow thread "workflow-method-check_b83dd098-8e2d-4741-84f5-0bd4390fd5f8-019aff75-4b49-7898-aff4-ecf3f6c52233" didn't yield control for over a second. {detectionTimestamp=1765222469954, threadDumpTimestamp=1765222469957}
    
    workflow-method-check_b83dd098-8e2d-4741-84f5-0bd4390fd5f8-019aff75-4b49-7898-aff4-ecf3f6c52233
            at kotlin.reflect.jvm.internal.impl.types.KotlinTypeFactory.flexibleType(KotlinTypeFactory.kt:188)

    Hari Haran R

    12/09/2025, 6:11 AM
    @kapa.ai I'm using Airbyte Open Source 1.4 and trying to connect Customer.io, but when I try to enter the creds it fails. It also says "The Customer.io source is maintained by Faros AI." So does this connector work in Airbyte or not?

    Ishan Anilbhai Koradiya

    12/09/2025, 6:51 AM
    Hi @kapa.ai, what instance types are recommended in AWS EKS for an Airbyte deployment with medium to high workloads?

    Ed

    12/09/2025, 8:46 AM
    How can I configure my date field in YAML to always have a date of yesterday?
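If the field supports the low-code CDK's jinja macros, one sketch (assuming `today_utc`, `duration`, and `format_datetime` are available in that context; the field name `start_date` is an example):

```yaml
# Sketch: render yesterday's date, formatted as YYYY-MM-DD.
start_date: "{{ format_datetime(today_utc() - duration('P1D'), '%Y-%m-%d') }}"
```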

    Markus Müüripeal

    12/09/2025, 9:02 AM
    @kapa.ai After upgrading from 1.4 to 2.0.1, syncs that took 1-2 minutes now take 30 minutes minimum and move 0 bytes.

    Stefano Messina

    12/09/2025, 9:35 AM
    @kapa.ai I keep getting
    Request to Airbyte API failed: 409 Client Error: Conflict for url: <http://airbyte-airbyte-webapp-svc.airbyte.svc:80/api/v1/connections/sync>
    for a group of connections being triggered; they only share the same destination.

    Ami Mehta

    12/09/2025, 10:58 AM
    @kapa.ai I am trying to develop a source connector for Etsy from scratch, using a self-hosted version of Airbyte installed with the abctl command and running in a Docker container. I used the declarative authentication section to define the OAuth 2.0 configuration in the connector builder UI, but there is no place to define the redirect URL there. It also prompts to use <http://localhost:3000/auth_flow> as the redirect_url in the Etsy app, which I did with ngrok_url/auth_flow. When I authenticate the connector, the Etsy login window opens, but on login, instead of redirecting, the page shows the "A stitch has gone away!" error. Can you help with this please?

    Pablo Martin Calvo

    12/09/2025, 11:10 AM
    @kapa.ai How can I check the payloads of the API calls made by Airbyte to HubSpot?

    Ami Mehta

    12/09/2025, 11:25 AM
    @kapa.ai The redirect_url given in Airbyte is localhost:3000/auth_flow. Etsy won't be able to reach localhost, as it requires a public URL. How can I fix this? Is it possible to complete the OAuth 2.0 flow for the Etsy API in Airbyte?