# ask-ai
  • j

    Justin Rea

    09/05/2025, 3:29 PM
    @kapa.ai There is one airbyte_internal schema table in postgres that is 11GB, far larger than any other table in the database. What can safely be done about this while also keeping the incremental append and deduped sync mode?
    k
    • 2
    • 4
  • y

    Yandi Andriamasy

    09/05/2025, 3:51 PM
    @kapa.ai I use a stream that uses a parent stream partition router. The URL of the "child" stream is
    https://graph.facebook.com/v22.0/{{stream_partition.page_id}}/insights
    I want to be able to print the
    {{ stream_partition.page_id }}
    in an error message. For context, I use Airbyte Cloud. What solution do you suggest ?
    k
    • 2
    • 4
  • j

    Júlia Lemes

    09/05/2025, 4:09 PM
    @kapa.ai How does the Airbyte application get the workspace ID?
    k
    • 2
    • 1
  • s

    Slackbot

    09/05/2025, 4:52 PM
    This message was deleted.
    k
    • 2
    • 1
  • l

    Leland Whitlock

    09/05/2025, 4:59 PM
    @kapa.ai Since updating to 1.8.1, I am experiencing more UI outages on my deployment, with the error
    TypeError: Failed to fetch
    . Refreshing tends to fix the issue but it can take 5-10 refreshes every time. What could be the cause of this?
    k
    • 2
    • 4
  • j

    Jhonatas Kleinkauff

    09/05/2025, 5:55 PM
    #C01AHCD885S I'm migrating from helm chart version "0.492.0" to "0.654.0". When rendering the DATABASE_USER key, it seems the new version is breaking the key onto a new line. Do you know anything about it?
    - name: DATABASE_USER
              valueFrom:
                secretKeyRef:
                  name: airbyte-config-secrets
                  key: 
                    database-user
    k
    • 2
    • 1
  • f

    Faisal

    09/05/2025, 6:21 PM
    @kapa.ai For a brand new SQL Server to Snowflake connection, the sync completes successfully; however, no tables are created and no data is moved into the destination.
    k
    • 2
    • 10
  • j

    Joey Sham

    09/05/2025, 6:27 PM
    @kapa.ai I'm trying to load from meta to bigquery, but I keep getting this error:
    Error code 100: Invalid parameter
    k
    • 2
    • 7
  • f

    Faisal

    09/05/2025, 7:25 PM
    @kapa.ai SQL Server source connection errors out with error 504. What does this mean?
    k
    • 2
    • 1
  • f

    Faisal

    09/05/2025, 8:15 PM
    @kapa.ai The Snowflake destination's namespace custom format is being ignored and the schema defined in the destination is being used instead. How can we resolve this so the destination schema is picked up from the destination namespace field?
    k
    • 2
    • 1
  • t

    Tatiane Corrêa da Costa e Silva

    09/05/2025, 8:25 PM
    @kapa.ai When configuring a BigQuery destination (v1.2.13) on Airbyte v0.50.29 I get the following error:
    Internal message: scheduledEventId=6, startedEventId=7, activityType='RunWithJobOutput', activityId='xxx-xxx-xxx-xxx-xxxx', identity='@airbyte-worker-xxxx-xxxxx', retryState=RETRY_STATE_MAXIMUM_ATTEMPTS_REACHED
    Failure origin: destination
    What is the cause?
    k
    • 2
    • 13
  • l

    Lillian Jiang

    09/06/2025, 1:26 AM
    @kapa.ai Is it possible to filter a record selector by equality to a string? For example: record["type"] == "csv"
    k
    • 2
    • 1
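In Airbyte's low-code framework this kind of check is typically written as a record filter with a Jinja condition such as {{ record['type'] == 'csv' }} (verify the exact field name against your builder version). A minimal Python sketch of the equivalent filtering logic, using hypothetical sample records:

```python
# Sample records are hypothetical, for illustration only.
records = [
    {"id": 1, "type": "csv"},
    {"id": 2, "type": "json"},
    {"id": 3, "type": "csv"},
]

def keep_csv(record: dict) -> bool:
    # Equivalent of the Jinja condition: record['type'] == 'csv'
    return record.get("type") == "csv"

# Apply the filter the way a record selector would, record by record.
selected = [r for r in records if keep_csv(r)]
print(selected)  # keeps records 1 and 3
```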
  • l

    Lillian Jiang

    09/06/2025, 2:06 AM
    @kapa.ai Can you have a nested object in a parent_key of a parent_stream_config in a SubstreamPartitionRouter? For example, could I do parent_key: body.attachmentId?
    k
    • 2
    • 1
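Whether a dotted parent_key like body.attachmentId is accepted depends on the connector builder version, so treat this as illustration only: the lookup such a path implies is a simple nested-dict walk. All names below are hypothetical:

```python
def get_by_dot_path(record: dict, path: str):
    """Walk a nested dict using a dotted path such as 'body.attachmentId'."""
    value = record
    for key in path.split("."):
        if not isinstance(value, dict):
            # Path descends into a non-object; nothing to return.
            return None
        value = value.get(key)
    return value

# Hypothetical parent-stream record.
parent_record = {"id": 7, "body": {"attachmentId": "abc-123"}}
print(get_by_dot_path(parent_record, "body.attachmentId"))  # abc-123
```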
  • i

    Iulian

    09/06/2025, 9:50 AM
    I have a connection with a MongoDB v2 source. I have 2 databases in the source that obviously have different names but the exact same collection name. When I try to create a connection with Terraform, I can see the streams in the terraform plan having the same name but different namespaces (being the name of the db); however, when applying the Terraform script I'm getting "Duplicate stream found │ in configuration for"
    k
    • 2
    • 47
  • p

    Pradyumna Kulkarni

    09/06/2025, 7:07 PM
    @kapa.ai I want to connect my existing Airbyte abctl setup to AWS RDS PostgreSQL. Give me the steps.
    k
    l
    • 3
    • 38
  • o

    Ofek Eliahu

    09/07/2025, 3:57 PM
    @kapa.ai does Airbyte run streams in parallel? I have a connection with 10 streams and it takes a lot of time to sync all the data. How can I improve it?
    🙏 1
    k
    • 2
    • 4
  • g

    grocca

    09/08/2025, 1:29 AM
    @kapa.ai we self-host airbyte on EKS and over the past few days our postgres has been throttling high on memory, and most airbyte syncs have been failing. when looking into the logs, there are a few different errors, but mainly this in the worker: Transaction isolation level -1 not supported. the server has also restarted 10 times over the past 3 days and the most recent was due to OOMKilled. we haven't made any updates to the deployment for a long time so we're unsure where this has come from.
    k
    • 2
    • 13
  • r

    Rahul

    09/08/2025, 7:12 AM
    @kapa.ai Why is this issue coming up?
    [ {
      "failureOrigin" : "replication",
      "internalMessage" : "Broken pipe",
      "externalMessage" : "Something went wrong during replication",
      "metadata" : {
        "attemptNumber" : 4,
        "jobId" : 358
      },
      "stacktrace" : "java.io.IOException: Broken pipe\n\tat java.base/sun.nio.ch.UnixFileDispatcherImpl.write0(Native Method)\n\tat java.base/sun.nio.ch.UnixFileDispatcherImpl.write(UnixFileDispatcherImpl.java:65)\n\tat java.base/sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:137)\n\tat java.base/sun.nio.ch.IOUtil.write(IOUtil.java:102)\n\tat java.base/sun.nio.ch.IOUtil.write(IOUtil.java:72)\n\tat java.base/sun.nio.ch.FileChannelImpl.write(FileChannelImpl.java:300)\n\tat java.base/sun.nio.ch.ChannelOutputStream.writeFully(ChannelOutputStream.java:68)\n\tat java.base/sun.nio.ch.ChannelOutputStream.write(ChannelOutputStream.java:105)\n\tat java.base/sun.nio.cs.StreamEncoder.writeBytes(StreamEncoder.java:309)\n\tat java.base/sun.nio.cs.StreamEncoder.implWrite(StreamEncoder.java:381)\n\tat java.base/sun.nio.cs.StreamEncoder.implWrite(StreamEncoder.java:357)\n\tat java.base/sun.nio.cs.StreamEncoder.lockedWrite(StreamEncoder.java:158)\n\tat java.base/sun.nio.cs.StreamEncoder.write(StreamEncoder.java:139)\n\tat java.base/java.io.OutputStreamWriter.write(OutputStreamWriter.java:219)\n\tat java.base/java.io.BufferedWriter.implFlushBuffer(BufferedWriter.java:178)\n\tat java.base/java.io.BufferedWriter.flushBuffer(BufferedWriter.java:163)\n\tat java.base/java.io.BufferedWriter.implWrite(BufferedWriter.java:334)\n\tat java.base/java.io.BufferedWriter.write(BufferedWriter.java:313)\n\tat java.base/java.io.Writer.write(Writer.java:278)\n\tat io.airbyte.container.orchestrator.worker.io.AirbyteMessageBufferedWriter.write(AirbyteMessageBufferedWriter.kt:29)\n\tat io.airbyte.container.orchestrator.worker.io.LocalContainerAirbyteDestination.acceptWithNoTimeoutMonitor(LocalContainerAirbyteDestination.kt:132)\n\tat io.airbyte.container.orchestrator.worker.io.LocalContainerAirbyteDestination.accept(LocalContainerAirbyteDestination.kt:91)\n\tat io.airbyte.container.orchestrator.worker.DestinationWriter.run(ReplicationTask.kt:96)\n\tat 
io.airbyte.container.orchestrator.worker.DestinationWriter$run$1.invokeSuspend(ReplicationTask.kt)\n\tat kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)\n\tat kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:100)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)\n\tat java.base/java.lang.Thread.run(Thread.java:1583)\n",
      "timestamp" : 1757276821400
    } ]
    k
    • 2
    • 1
  • r

    Rachel Ambler

    09/08/2025, 7:59 AM
    @kapa.ai When installing Airbyte with an external PostgreSql database, is the temporal database name customizable or is it fixed?
    k
    • 2
    • 1
  • o

    Ofek Eliahu

    09/08/2025, 8:30 AM
    @kapa.ai what is this log that I get on my job:
    2025-09-07 22:01:47 replication-orchestrator INFO thread status... heartbeat thread: true , replication thread: false
    2025-09-07 22:01:47 replication-orchestrator INFO Do not terminate as feature flag is disable
    k
    • 2
    • 7
  • y

    Youssef HAMROUNI

    09/08/2025, 9:13 AM
    what is the ideal size for the airbyte-db and minio PVCs for a serious production environment?
    k
    • 2
    • 1
  • o

    Ofek Eliahu

    09/08/2025, 9:44 AM
    @kapa.ai how can I use the FEATURE_FLAG_PATH in the helm chart in order to set Airbyte flags?
    k
    • 2
    • 1
  • j

    Jordi Santacreu Carranco

    09/08/2025, 9:45 AM
    @kapa.ai I am working on a connection from MySQL to Redshift, all in an AWS environment, and I have this error on the incremental load -> [config_error] Incumbent CDC state is invalid, reason: Saved offset no longer present on the server, please reset the connection, and then increase binlog retention and/or increase sync frequency. Connector last known binlog file mysql-bin-changelog.083648 is not found in the server. Server has [mysql-bin-changelog.083660, mysql-bin-changelog.083661, mysql-bin-changelog.083662].
    k
    • 2
    • 1
  • k

    Kothapalli Venkata Avinash

    09/08/2025, 10:36 AM
    @kapa.ai One of my Snowflake destinations is failing with error java.util.concurrent.CompletionException: java.sql.SQLTransientConnectionException: HikariPool-1 - Connection is not available, request timed out after 30000ms (total=1, active=1, idle=0, waiting=3)
    k
    • 2
    • 1
  • x

    Xavier LAI

    09/08/2025, 10:57 AM
    Hi @kapa.ai, using the custom builder I want to add a field, via transformations, that holds the year of the fiscal year depending on the date field. Our fiscal year starts on 1 July and ends on 30 June; for example, FY26 started 1 July 2025 and will end 30 June 2026. Give me the formula to put in the value to set this new field.
    k
    • 2
    • 1
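For a July-to-June fiscal year the rule is: add 1 to the calendar year when the month is July or later. A Python sketch of that rule (in the builder this would be written as the equivalent Jinja expression on the record's date field):

```python
from datetime import date

def fiscal_year(d: date) -> int:
    # Fiscal year starting 1 July: dates from July onward belong to the
    # next calendar year's FY, e.g. 1 July 2025 falls in FY26 (2026).
    return d.year + 1 if d.month >= 7 else d.year

print(fiscal_year(date(2025, 7, 1)))   # 2026 -> start of FY26
print(fiscal_year(date(2026, 6, 30)))  # 2026 -> end of FY26
print(fiscal_year(date(2025, 6, 30)))  # 2025 -> still FY25
```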
  • t

    Tanuj Shriyan

    09/08/2025, 11:01 AM
    @kapa.ai Is the snowflake source a very slow sync connector? It is only syncing 5000 records every 2 mins. Is it possible for me to increase the sync rate to reduce the connection time?
    k
    • 2
    • 3
  • m

    Maxime Morelli

    09/08/2025, 12:06 PM
    @kapa.ai, I've installed Airbyte locally on my computer. But as soon as I set up a destination or a source, the instance crashes, I get a 503 error, and then {"message": "Internal Server Error: not yet implemented", "exceptionClassName": "java.lang.UnsupportedOperationException", "exceptionStack": [], "rootCauseExceptionStack": [] }
    k
    • 2
    • 13
  • y

    Youssef HAMROUNI

    09/08/2025, 12:17 PM
    Can I terraform the deployment of Airbyte on EKS? If so, can you describe the necessary steps for me?
    k
    • 2
    • 1
  • o

    Ofek Eliahu

    09/08/2025, 12:57 PM
    @kapa.ai how can I set higher memory for my jobs to use? I am using Airbyte 1.4.1 on Kubernetes with Helm.
    k
    • 2
    • 1
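One common approach is to override the job resources in values.yaml. This sketch assumes the chart exposes a global.jobs.resources block; the exact keys and the memory sizes shown are examples, so check them against the values.yaml of your chart version before relying on them:

```yaml
global:
  jobs:
    resources:
      # Requests/limits applied to sync job pods; sizes are illustrative.
      requests:
        memory: 2Gi
      limits:
        memory: 4Gi
```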
  • i

    Iliyas Shirol

    09/08/2025, 1:16 PM
    @kapa.ai When I use a CSV file as the source and Postgres as the destination, if the CSV has a column named id, then in the destination it creates a column named _id. What might be causing this?
    k
    • 2
    • 1