# ask-ai
  • Bernardo Fernandes (06/25/2025, 7:34 AM)
    @kapa.ai Is it possible to create custom connectors via Terraform using the Airbyte provider?
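    A rough sketch of the Terraform side, assuming the airbytehq/airbyte provider exposes an `airbyte_source_custom` resource with `definition_id` and `configuration` arguments (verify both against the provider registry docs); the sketch emits Terraform's JSON configuration syntax from Python:
    ```python
    # Sketch: generate main.tf.json declaring a custom source through the
    # Airbyte Terraform provider. The resource name and argument names are
    # assumptions to check against the provider documentation.
    import json

    tf = {
        "resource": {
            "airbyte_source_custom": {
                "my_custom_source": {
                    "name": "my-custom-source",
                    "workspace_id": "<workspace-uuid>",
                    # definition_id of a custom connector already published
                    # to the workspace (hypothetical value)
                    "definition_id": "<definition-uuid>",
                    # connector-specific config, serialized as a JSON string
                    "configuration": json.dumps({"api_key": "..."}),
                }
            }
        }
    }

    with open("main.tf.json", "w") as f:
        json.dump(tf, f, indent=2)
    ```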
  • Power School (06/25/2025, 8:01 AM)
    How do I update abctl?
  • Khánh Phạm An (06/25/2025, 8:09 AM)
    @kapa.ai How do I define partitions for tables in the S3 Iceberg destination sync?
  • Alec Sharp (06/25/2025, 8:13 AM)
    hey @kapa.ai, is it possible to create tables with a custom suffix when using BigQuery as the destination? I ask because BigQuery has a wildcard feature that I want to use: https://cloud.google.com/bigquery/docs/querying-wildcard-tables
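    For context, BigQuery wildcard queries match every table sharing a prefix, so any consistent naming scheme works with them; a minimal sketch using the google-cloud-bigquery client (project, dataset, and prefix are hypothetical):
    ```python
    # Sketch: query all tables matching the prefix "events_" and group by
    # the part of the name after the prefix (_TABLE_SUFFIX).
    from google.cloud import bigquery

    client = bigquery.Client()
    sql = """
        SELECT _TABLE_SUFFIX AS suffix, COUNT(*) AS n
        FROM `my_project.my_dataset.events_*`
        GROUP BY suffix
    """
    for row in client.query(sql).result():
        print(row.suffix, row.n)
    ```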
  • Sudeesh Rajeevan (06/25/2025, 8:17 AM)
    How do I clone a source in the Airbyte free version? The MySQL source doesn't support multiple databases, so I want to create another source for the other database. @kapa.ai
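    A minimal sketch of scripting the duplication against the Airbyte API; the endpoint paths and payload fields are assumptions to verify against the API reference, and secrets in a fetched configuration are typically redacted, so they must be re-entered:
    ```python
    # Sketch: copy an existing source's config and create a second source
    # pointed at another database. Base URL, paths, and payload shape are
    # assumptions to check against your Airbyte API reference.
    import requests

    API = "http://localhost:8000/api/public/v1"   # hypothetical base URL
    HEADERS = {"Authorization": "Bearer <token>"}

    src = requests.get(f"{API}/sources/<source-id>", headers=HEADERS).json()

    cfg = dict(src["configuration"])
    cfg["database"] = "other_database"        # point at the second database
    cfg["password"] = "<re-enter secret>"     # redacted in the GET response

    resp = requests.post(f"{API}/sources", headers=HEADERS, json={
        "name": src["name"] + " (other_database)",
        "workspaceId": src["workspaceId"],
        "configuration": cfg,
    })
    resp.raise_for_status()
    ```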
  • Matheus Dantas (06/25/2025, 8:31 AM)
    is there a way to disable notifications about schema changes caused by the creation of new tables? My data source is a Postgres connector. I would like to receive only notifications related to the tables selected for sync.
  • Eitan Hyams (06/25/2025, 11:40 AM)
    does Airbyte truncate the request body? how do I set limits?
  • Nevo (06/25/2025, 12:06 PM)
    Hello @kapa.ai I'm using Mixpanel as a source and consuming data from the 'export' stream. While in some cases everything works fine, I have noticed cases where the stream syncs only a really small amount of partial data, while in my Mixpanel dashboard I can see that many more records exist. I have verified that the 'time' property, which is used as the cursor field, is indeed valid. What could be the cause of this partial data?
  • Chris (06/25/2025, 12:16 PM)
    I am using the GCS source. How do I make sure an empty string (i.e. `,,` in the CSV) becomes NULL in my destination instead of an empty string?
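    The file-based sources' CSV format options are the first place to look (they include a list of strings to treat as null, if the connector version supports it); failing that, a minimal sketch of the same normalization applied to parsed rows:
    ```python
    # Sketch: map empty-string CSV fields (",,") to None, mirroring what a
    # "treat these strings as null" parser option does. Illustrative only.
    import csv
    import io

    raw = "id,name,city\n1,Alice,\n2,,Berlin\n"

    def nullify(row: dict) -> dict:
        return {k: (None if v == "" else v) for k, v in row.items()}

    for row in csv.DictReader(io.StringIO(raw)):
        print(nullify(row))
    ```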
  • sai ram komma (06/25/2025, 12:32 PM)
    @kapa.ai I'm using the MongoDB destination connector with Full Refresh - Overwrite sync mode. I noticed that during the sync, my original MongoDB collection gets dropped before any new data is written. Then it seems like Airbyte writes to a temporary collection, and finally renames it to the original name. Is this the expected behavior? I was under the impression that Airbyte would write to a temp collection first, and only drop the original after a successful write — to avoid data loss if the sync fails. Could you explain how this works and whether there’s a safer way to handle overwrites?
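    For reference, the safer pattern the question describes (write to a temporary collection, swap only after the write succeeds) looks roughly like this in pymongo; the collection names are hypothetical, and this illustrates the pattern rather than what the connector actually does:
    ```python
    # Sketch: overwrite a MongoDB collection without dropping the original
    # until the replacement data has been fully written.
    from pymongo import MongoClient

    db = MongoClient("mongodb://localhost:27017")["mydb"]

    tmp = db["users__tmp"]
    tmp.drop()  # start from a clean temporary collection
    tmp.insert_many([{"_id": 1, "name": "Alice"}, {"_id": 2, "name": "Bob"}])

    # Atomic swap: rename over the target, dropping the old collection
    # only now that the new data is in place.
    tmp.rename("users", dropTarget=True)
    ```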
  • Lui Pillmann (06/25/2025, 1:54 PM)
    I need to integrate with an API that requires each request to be HMAC-signed. Does the Airbyte UI support it?
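    For comparison, HMAC request signing in plain Python looks like this; a minimal sketch with a custom `requests` auth class (the header names and signature layout are hypothetical, since every API defines its own scheme):
    ```python
    # Sketch: sign each outgoing request with an HMAC-SHA256 over the
    # method, path, and body. Header names and layout are hypothetical.
    import hashlib
    import hmac

    import requests
    from requests.auth import AuthBase

    class HmacAuth(AuthBase):
        def __init__(self, key_id: str, secret: bytes):
            self.key_id = key_id
            self.secret = secret

        def __call__(self, r):
            body = r.body or b""
            if isinstance(body, str):
                body = body.encode()
            message = f"{r.method}\n{r.path_url}\n".encode() + body
            sig = hmac.new(self.secret, message, hashlib.sha256).hexdigest()
            r.headers["X-Key-Id"] = self.key_id
            r.headers["X-Signature"] = sig
            return r

    resp = requests.get("https://api.example.com/items",
                        auth=HmacAuth("my-key", b"my-secret"))
    ```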
  • Jacob Batt (06/25/2025, 2:20 PM)
    @kapa.ai What Salesforce OAuth permissions are needed in order to use the Airbyte connector to sync data from Salesforce to Postgres?
  • Lucas Segers (06/25/2025, 2:46 PM)
    does the HTTP client used for making HTTP requests in the Connector Builder have a default timeout?
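    Worth knowing while checking: the `requests` library the Python CDK builds on has no default timeout of its own, so a bare call can block indefinitely unless a timeout is passed explicitly; a minimal illustration:
    ```python
    # Sketch: bound an HTTP call with an explicit (connect, read) timeout
    # in seconds; without the argument, requests waits forever.
    import requests

    resp = requests.get("https://api.example.com/items", timeout=(5, 30))
    resp.raise_for_status()
    ```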
  • Sérgio Marçal (06/25/2025, 4:08 PM)
    hello, experiencing issues on the Monday connector:
    ```
    2025-06-25 014703 source ERROR Exception while syncing stream items
    Traceback (most recent call last):
      File "/usr/local/lib/python3.11/site-packages/airbyte_cdk/sources/streams/concurrent/partition_reader.py", line 40, in process_partition
        for record in partition.read():
      File "/usr/local/lib/python3.11/site-packages/airbyte_cdk/sources/declarative/stream_slicers/declarative_partition_generator.py", line 59, in read
        for stream_data in self._retriever.read_records(self._json_schema, self._stream_slice):
      File "/usr/local/lib/python3.11/site-packages/airbyte_cdk/sources/declarative/retrievers/simple_retriever.py", line 469, in read_records
        for stream_data in self._read_pages(record_generator, self.state, _slice):
      File "/usr/local/lib/python3.11/site-packages/airbyte_cdk/sources/declarative/retrievers/simple_retriever.py", line 366, in _read_pages
        response = self._fetch_next_page(stream_state, stream_slice, next_page_token)
      File "/usr/local/lib/python3.11/site-packages/airbyte_cdk/sources/declarative/retrievers/simple_retriever.py", line 322, in _fetch_next_page
        return self.requester.send_request(
      File "/usr/local/lib/python3.11/site-packages/airbyte_cdk/sources/declarative/requesters/http_requester.py", line 400, in send_request
        request, response = self._http_client.send_request(
      File "/usr/local/lib/python3.11/site-packages/airbyte_cdk/sources/streams/http/http_client.py", line 524, in send_request
        response: requests.Response = self._send_with_retry(
      File "/usr/local/lib/python3.11/site-packages/airbyte_cdk/sources/streams/http/http_client.py", line 270, in _send_with_retry
        response = backoff_handler(rate_limit_backoff_handler(user_backoff_handler))(
      File "/usr/local/lib/python3.11/site-packages/backoff/_sync.py", line 105, in retry
        ret = target(*args, **kwargs)
      File "/usr/local/lib/python3.11/site-packages/backoff/_sync.py", line 105, in retry
        ret = target(*args, **kwargs)
      File "/usr/local/lib/python3.11/site-packages/backoff/_sync.py", line 105, in retry
        ret = target(*args, **kwargs)
      File "/usr/local/lib/python3.11/site-packages/airbyte_cdk/sources/streams/http/http_client.py", line 340, in _send
        self._handle_error_resolution(
      File "/usr/local/lib/python3.11/site-packages/airbyte_cdk/sources/streams/http/http_client.py", line 466, in _handle_error_resolution
        raise UserDefinedBackoffException(
    airbyte_cdk.sources.streams.http.exceptions.UserDefinedBackoffException: Internal server error.

    476 2025-06-25 014703 source ERROR Concurrent read failure
    Traceback (most recent call last):
      File "/airbyte/integration_code/main.py", line 9, in <module>
        run()
      File "/airbyte/integration_code/source_monday/run.py", line 54, in run
        launch(source, _args)
      File "/usr/local/lib/python3.11/site-packages/airbyte_cdk/entrypoint.py", line 341, in launch
        for message in source_entrypoint.run(parsed_args):
      File "/usr/local/lib/python3.11/site-packages/airbyte_cdk/entrypoint.py", line 171, in run
        yield from map(
      File "/usr/local/lib/python3.11/site-packages/airbyte_cdk/entrypoint.py", line 244, in read
        for message in self.source.read(self.logger, config, catalog, state):
      File "/usr/local/lib/python3.11/site-packages/airbyte_cdk/sources/declarative/concurrent_declarative_source.py", line 154, in read
        yield from self._concurrent_source.read(selected_concurrent_streams)
      File "/usr/local/lib/python3.11/site-packages/airbyte_cdk/sources/concurrent_source/concurrent_source.py", line 119, in read
        yield from self._consume_from_queue(
      File "/usr/local/lib/python3.11/site-packages/airbyte_cdk/sources/concurrent_source/concurrent_source.py", line 144, in _consume_from_queue
        if concurrent_stream_processor.is_done() and queue.empty():
      File "/usr/local/lib/python3.11/site-packages/airbyte_cdk/sources/concurrent_source/concurrent_read_processor.py", line 230, in is_done
        raise AirbyteTracedException(
    airbyte_cdk.utils.traced_exception.AirbyteTracedException: Concurrent read failure

    518 2025-06-25 014709 destination ERROR main i.a.c.i.u.ConnectorExceptionHandler(handleException):68 caught exception!
    io.airbyte.commons.exceptions.TransientErrorException: Some streams were unsuccessful due to a source error. See logs for details.
      at io.airbyte.cdk.integrations.destination.async.AsyncStreamConsumer.close(AsyncStreamConsumer.kt:217) ~[airbyte-cdk-core-0.48.9.jar:?]
      at io.airbyte.cdk.integrations.base.SerializedAirbyteMessageConsumer$Companion$appendOnClose$1.close(SerializedAirbyteMessageConsumer.kt:70) ~[airbyte-cdk-core-0.48.9.jar:?]
      at kotlin.jdk7.AutoCloseableKt.closeFinally(AutoCloseableJVM.kt:48) ~[kotlin-stdlib-2.0.0.jar:2.0.0-release-341]
      at io.airbyte.cdk.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.kt:215) [airbyte-cdk-core-0.48.9.jar:?]
      at io.airbyte.cdk.integrations.base.IntegrationRunner.run(IntegrationRunner.kt:119) [airbyte-cdk-core-0.48.9.jar:?]
      at io.airbyte.cdk.integrations.base.IntegrationRunner.run$default(IntegrationRunner.kt:113) [airbyte-cdk-core-0.48.9.jar:?]
      at io.airbyte.integrations.destination.postgres.PostgresDestination$Companion.main(PostgresDestination.kt:230) [io.airbyte.airbyte-integrations.connectors-destination-postgres.jar:?]
      at io.airbyte.integrations.destination.postgres.PostgresDestination.main(PostgresDestination.kt) [io.airbyte.airbyte-integrations.connectors-destination-postgres.jar:?]
    Stack Trace: io.airbyte.commons.exceptions.TransientErrorException: Some streams were unsuccessful due to a source error. See logs for details.
      at io.airbyte.cdk.integrations.destination.async.AsyncStreamConsumer.close(AsyncStreamConsumer.kt:217)
      at io.airbyte.cdk.integrations.base.SerializedAirbyteMessageConsumer$Companion$appendOnClose$1.close(SerializedAirbyteMessageConsumer.kt:70)
      at kotlin.jdk7.AutoCloseableKt.closeFinally(AutoCloseableJVM.kt:48)
      at io.airbyte.cdk.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.kt:215)
      at io.airbyte.cdk.integrations.base.IntegrationRunner.run(IntegrationRunner.kt:119)
      at io.airbyte.cdk.integrations.base.IntegrationRunner.run$default(IntegrationRunner.kt:113)
      at io.airbyte.integrations.destination.postgres.PostgresDestination$Companion.main(PostgresDestination.kt:230)
      at io.airbyte.integrations.destination.postgres.PostgresDestination.main(PostgresDestination.kt)
    ```
    Do you have a clue?
  • Lakin Wecker (06/25/2025, 4:40 PM)
    Hi, I'm still trying to wrap my head around what Airbyte is and how I can make use of it in a project I'm working on. I don't know if I'm just failing to understand what it is and how it works, or whether I even have the correct terminology. Someone else I'm talking to is convinced that we can use it in a specific way, and I'm not sure. I get the idea of using it to enable data teams to move data: if we have an internal team that needs to move data between disparate sources into a common data lake or analytics system, we deploy Airbyte, configure RBAC/UBAC/workspaces etc., and they use the UI or PyAirbyte to configure their workflows/pipelines. What I need is a way for my developers to mostly pre-configure a specific set of pipelines (from anywhere into our product, and from our product out to anywhere), and then for our UI/product to expose these pipelines to end users so they just quickly enter the source name and credentials they need, and then our Airbyte/our system runs the pipeline for them. I assume this is possible, but I'm missing the right terminology to find the documentation/tutorials that would give me a good sense of how this works and how well it would work. What terminology am I looking for? (Also, is this the right channel? I couldn't find a better one.)
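    The pattern described here is usually searched for as "Powered by Airbyte": pre-built connection templates driven programmatically through the Airbyte API, with your product's UI collecting the end user's credentials. A minimal sketch of that flow, assuming the public API's source/connection/job endpoints (paths and payload fields should be verified against the API reference):
    ```python
    # Sketch: create a source from end-user credentials, wire it to a
    # pre-configured destination, and trigger a sync. Endpoint paths and
    # payload shapes are assumptions to check against the API reference.
    import requests

    API = "https://api.airbyte.com/v1"
    HEADERS = {"Authorization": "Bearer <token>"}

    source = requests.post(f"{API}/sources", headers=HEADERS, json={
        "name": "customer-123-postgres",
        "workspaceId": "<workspace-uuid>",
        "configuration": {   # collected from the end user in your UI
            "sourceType": "postgres",
            "host": "db.customer.example", "port": 5432,
            "database": "app", "username": "readonly",
            "password": "<secret>",
        },
    }).json()

    connection = requests.post(f"{API}/connections", headers=HEADERS, json={
        "sourceId": source["sourceId"],
        "destinationId": "<your-product-destination-uuid>",
    }).json()

    requests.post(f"{API}/jobs", headers=HEADERS, json={
        "connectionId": connection["connectionId"],
        "jobType": "sync",
    })
    ```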
  • Konathala Chaitanya (06/25/2025, 6:54 PM)
    @kapa.ai I'm trying to connect to Iceberg with the Glue catalog, but I am getting this error: Could not connect to the Iceberg catalog with the provided configuration. null, root cause: NullPointerException(null)
  • Konathala Chaitanya (06/25/2025, 7:29 PM)
    @kapa.ai I'm trying to connect to Iceberg with the Glue catalog and getting: Could not connect to the Iceberg catalog with the provided configuration. null, root cause: NullPointerException(null). In which pod can I access these error details?
  • Mohd Asad (06/25/2025, 7:58 PM)
    I deployed Airbyte using the Helm chart:
    ```
    helm repo add airbyte https://airbytehq.github.io/helm-charts
    helm install my-airbyte airbyte/airbyte --version 1.7.0
    ```
    The core components are running fine. However, when I create a source and destination and trigger a sync, a new replication job pod is created. This pod includes three containers (`source`, `destination`, and `orchestrator`) and requests a total of 4 CPUs, which is too high for my environment. I attempted to reduce the CPU and memory usage by setting the following values in my `values.yaml`:
    ```yaml
    global:
      jobs:
        resources:
          requests:
            cpu: 250m
            memory: 256Mi
          limits:
            cpu: 500m
            memory: 512Mi
    ```
    I also tried setting these environment variables:
    ```
    JOB_MAIN_CONTAINER_CPU_REQUEST
    JOB_MAIN_CONTAINER_CPU_LIMIT
    JOB_MAIN_CONTAINER_MEMORY_REQUEST
    JOB_MAIN_CONTAINER_MEMORY_LIMIT
    ```
    Despite these changes, the replication job pods are still requesting 4 CPUs. I’m looking for a reliable way to reduce their resource requests to around 1.5 to 2 CPUs in total.
  • Sree Shanthan Kuthuru (06/26/2025, 1:19 AM)
    @kapa.ai I have a sync from S3 to S3 which is failing with the below error:
    ```
    2025-06-25 210804 replication-orchestrator INFO Failures: [ {
      "failureOrigin" : "destination",
      "internalMessage" : "Destination process exited with non-zero exit code 143",
      "externalMessage" : "Something went wrong within the destination connector",
      "metadata" : {
        "attemptNumber" : 0,
        "jobId" : 187744,
        "connector_command" : "write"
      },
      "stacktrace" : "io.airbyte.workers.internal.exception.DestinationException: Destination process exited with non-zero exit code 143\n\tat io.airbyte.container.orchestrator.worker.DestinationReader.run(ReplicationTask.kt:54)\n\tat io.airbyte.container.orchestrator.worker.ReplicationWorker$runJobs$2$tasks$1$1.invokeSuspend(ReplicationWorker.kt:151)\n\tat io.airbyte.container.orchestrator.worker.ReplicationWorker$runJobs$2$tasks$1$1.invoke(ReplicationWorker.kt)\n\tat io.airbyte.container.orchestrator.worker.ReplicationWorker$runJobs$2$tasks$1$1.invoke(ReplicationWorker.kt)\n\tat io.airbyte.container.orchestrator.worker.util.AsyncUtils$runAsync$1$1.invokeSuspend(AsyncUtils.kt:22)\n\tat io.airbyte.container.orchestrator.worker.util.AsyncUtils$runAsync$1$1.invoke(AsyncUtils.kt)\n\tat io.airbyte.container.orchestrator.worker.util.AsyncUtils$runAsync$1$1.invoke(AsyncUtils.kt)\n\tat kotlinx.coroutines.intrinsics.UndispatchedKt.startUndspatched(Undispatched.kt:66)\n\tat kotlinx.coroutines.intrinsics.UndispatchedKt.startUndispatchedOrReturn(Undispatched.kt:43)\n\tat kotlinx.coroutines.BuildersKt__Builders_commonKt.withContext(Builders.common.kt:165)\n\tat kotlinx.coroutines.BuildersKt.withContext(Unknown Source)\n\tat io.airbyte.container.orchestrator.worker.util.AsyncUtils$runAsync$1.invokeSuspend(AsyncUtils.kt:21)\n\tat kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)\n\tat kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:100)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)\n\tat java.base/java.lang.Thread.run(Thread.java:1583)\n\tSuppressed: io.airbyte.workers.internal.exception.DestinationException: Destination process message delivery failed\n\t\tat io.airbyte.container.orchestrator.worker.DestinationWriter.run(ReplicationTask.kt:98)\n\t\tat io.airbyte.container.orchestrator.worker.DestinationWriter$run$1.invokeSuspend(ReplicationTask.kt)\n\t\t... 5 more\n\tCaused by: java.io.IOException: Broken pipe\n\t\tat java.base/sun.nio.ch.UnixFileDispatcherImpl.write0(Native Method)\n\t\tat java.base/sun.nio.ch.UnixFileDispatcherImpl.write(UnixFileDispatcherImpl.java:65)\n\t\tat java.base/sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:137)\n\t\tat java.base/sun.nio.ch.IOUtil.write(IOUtil.java:102)\n\t\tat java.base/sun.nio.ch.IOUtil.write(IOUtil.java:72)\n\t\tat java.base/sun.nio.ch.FileChannelImpl.write(FileChannelImpl.java:300)\n\t\tat java.base/sun.nio.ch.ChannelOutputStream.writeFully(ChannelOutputStream.java:68)\n\t\tat java.base/sun.nio.ch.ChannelOutputStream.write(ChannelOutputStream.java:105)\n\t\tat java.base/sun.nio.cs.StreamEncoder.writeBytes(StreamEncoder.java:309)\n\t\tat java.base/sun.nio.cs.StreamEncoder.implWrite(StreamEncoder.java:381)\n\t\tat java.base/sun.nio.cs.StreamEncoder.implWrite(StreamEncoder.java:357)\n\t\tat java.base/sun.nio.cs.StreamEncoder.lockedWrite(StreamEncoder.java:158)\n\t\tat java.base/sun.nio.cs.StreamEncoder.write(StreamEncoder.java:139)\n\t\tat java.base/java.io.OutputStreamWriter.write(OutputStreamWriter.java:219)\n\t\tat java.base/java.io.BufferedWriter.implFlushBuffer(BufferedWriter.java:178)\n\t\tat java.base/java.io.BufferedWriter.flushBuffer(BufferedWriter.java:163)\n\t\tat java.base/java.io.BufferedWriter.implWrite(BufferedWriter.java:334)\n\t\tat java.base/java.io.BufferedWriter.write(BufferedWriter.java:313)\n\t\tat java.base/java.io.Writer.write(Writer.java:278)\n\t\tat io.airbyte.container.orchestrator.worker.io.AirbyteMessageBufferedWriter.write(AirbyteMessageBufferedWriter.kt:29)\n\t\tat io.airbyte.container.orchestrator.worker.io.LocalContainerAirbyteDestination.acceptWithNoTimeoutMonitor(LocalContainerAirbyteDestination.kt:132)\n\t\tat io.airbyte.container.orchestrator.worker.io.LocalContainerAirbyteDestination.accept(LocalContainerAirbyteDestination.kt:91)\n\t\tat io.airbyte.container.orchestrator.worker.DestinationWriter.run(ReplicationTask.kt:96)\n\t\t... 6 more\n",
      "timestamp" : 1750900084398
    } ]
    ```
  • Glen Aultman-Bettridge (06/26/2025, 1:34 AM)
    @kapa.ai All of my jobs are failing with a "message='com.amazonaws.SdkClientException: Unable to load AWS credentials from any provider in the chain…" error. Is this a known problem?
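    One way to narrow this down outside Airbyte: ask the AWS SDK whether the same default credential provider chain resolves to an identity at all; a minimal sketch with boto3:
    ```python
    # Sketch: check whether the default AWS credential chain (env vars,
    # shared config, instance profile, ...) yields usable credentials.
    import boto3
    from botocore.exceptions import NoCredentialsError

    try:
        identity = boto3.client("sts").get_caller_identity()
        print("Credentials OK, ARN:", identity["Arn"])
    except NoCredentialsError:
        print("No credentials found in the provider chain")
    ```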
  • David Hu (06/26/2025, 2:07 AM)
    @kapa.ai I'm getting this error when syncing from MySQL to BigQuery. This started happening after upgrading Airbyte from 1.5.1 to 1.7.0:
    ```
    2025-06-25 19:05:16 replication-orchestrator INFO Failures: [ {
      "failureOrigin" : "destination",
      "failureType" : "system_error",
      "internalMessage" : "java.lang.NullPointerException: Cannot invoke \"io.airbyte.protocol.models.v0.AirbyteGlobalState.getSharedState()\" because the return value of \"io.airbyte.protocol.models.v0.AirbyteStateMessage.getGlobal()\" is null",
      "externalMessage" : "Cannot invoke \"io.airbyte.protocol.models.v0.AirbyteGlobalState.getSharedState()\" because the return value of \"io.airbyte.protocol.models.v0.AirbyteStateMessage.getGlobal()\" is null",
      "metadata" : {
        "attemptNumber" : 0,
        "jobId" : 18128,
        "from_trace_message" : true,
        "connector_command" : "write"
      },
      "stacktrace" : "java.lang.NullPointerException: Cannot invoke \"io.airbyte.protocol.models.v0.AirbyteGlobalState.getSharedState()\" because the return value of \"io.airbyte.protocol.models.v0.AirbyteStateMessage.getGlobal()\" is null\n\tat io.airbyte.cdk.load.message.DestinationMessageFactory.fromAirbyteProtocolMessage(DestinationMessageFactory.kt:166)\n\tat io.airbyte.cdk.load.message.ProtocolMessageDeserializer.deserialize(DestinationMessageDeserializer.kt:50)\n\tat io.airbyte.cdk.load.task.internal.ReservingDeserializingInputFlow.collect(ReservingDeserializingInputFlow.kt:46)\n\tat io.airbyte.cdk.load.task.internal.InputConsumerTask.execute(InputConsumerTask.kt:120)\n\tat io.airbyte.cdk.load.task.DestinationTaskLauncher$WrappedTask.execute(DestinationTaskLauncher.kt:124)\n\tat io.airbyte.cdk.load.task.TaskScopeProvider$launch$job$1.invokeSuspend(TaskScopeProvider.kt:35)\n\tat kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)\n\tat kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:100)\n\tat kotlinx.coroutines.internal.LimitedDispatcher$Worker.run(LimitedDispatcher.kt:124)\n\tat kotlinx.coroutines.scheduling.TaskImpl.run(Tasks.kt:89)\n\tat kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:586)\n\tat kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.executeTask(CoroutineScheduler.kt:820)\n\tat kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.runWorker(CoroutineScheduler.kt:717)\n\tat kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.run(CoroutineScheduler.kt:704)\n\tSuppressed: io.airbyte.cdk.TransientErrorException: Input was fully read, but some streams did not receive a terminal stream status message. If the destination did not encounter other errors, this likely indicates an error in the source or platform. Streams without a status message: [Account, User, ApiKey, SubscriptionRetentionOffer, UserSubscription, UserNotificationConfig, VideoProjectNotification, ImageProject, SubscriptionCancellation, VideoProject, PaymentIntent, WebhookEventLog, SubscriptionInvoice, Webhook, Tag, WebhookSubscribedEvent, VideoProjectTag, ImageCustomLora]\n\t\tat io.airbyte.cdk.load.state.SyncManager.markInputConsumed(SyncManager.kt:125)\n\t\tat io.airbyte.cdk.load.state.PipelineEventBookkeepingRouter.close(PipelineEventBookkeepingRouter.kt:248)\n\t\tat io.airbyte.cdk.load.task.internal.InputConsumerTask.execute(InputConsumerTask.kt:133)\n\t\t... 10 more\n",
      "timestamp" : 1750903514303
    }, {
      "failureOrigin" : "replication",
      "internalMessage" : "Broken pipe",
      "externalMessage" : "Something went wrong during replication",
      "metadata" : {
        "attemptNumber" : 0,
        "jobId" : 18128
      },
      "stacktrace" : "java.io.IOException: Broken pipe\n\tat java.base/sun.nio.ch.UnixFileDispatcherImpl.write0(Native Method)\n\tat java.base/sun.nio.ch.UnixFileDispatcherImpl.write(UnixFileDispatcherImpl.java:65)\n\tat java.base/sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:137)\n\tat java.base/sun.nio.ch.IOUtil.write(IOUtil.java:102)\n\tat java.base/sun.nio.ch.IOUtil.write(IOUtil.java:72)\n\tat java.base/sun.nio.ch.FileChannelImpl.write(FileChannelImpl.java:300)\n\tat java.base/sun.nio.ch.ChannelOutputStream.writeFully(ChannelOutputStream.java:68)\n\tat java.base/sun.nio.ch.ChannelOutputStream.write(ChannelOutputStream.java:105)\n\tat java.base/sun.nio.cs.StreamEncoder.writeBytes(StreamEncoder.java:309)\n\tat java.base/sun.nio.cs.StreamEncoder.implWrite(StreamEncoder.java:381)\n\tat java.base/sun.nio.cs.StreamEncoder.implWrite(StreamEncoder.java:357)\n\tat java.base/sun.nio.cs.StreamEncoder.lockedWrite(StreamEncoder.java:158)\n\tat java.base/sun.nio.cs.StreamEncoder.write(StreamEncoder.java:139)\n\tat java.base/java.io.OutputStreamWriter.write(OutputStreamWriter.java:219)\n\tat java.base/java.io.BufferedWriter.implFlushBuffer(BufferedWriter.java:178)\n\tat java.base/java.io.BufferedWriter.flushBuffer(BufferedWriter.java:163)\n\tat java.base/java.io.BufferedWriter.implWrite(BufferedWriter.java:334)\n\tat java.base/java.io.BufferedWriter.write(BufferedWriter.java:313)\n\tat java.base/java.io.Writer.write(Writer.java:278)\n\tat io.airbyte.container.orchestrator.worker.io.AirbyteMessageBufferedWriter.write(AirbyteMessageBufferedWriter.kt:29)\n\tat io.airbyte.container.orchestrator.worker.io.LocalContainerAirbyteDestination.acceptWithNoTimeoutMonitor(LocalContainerAirbyteDestination.kt:132)\n\tat io.airbyte.container.orchestrator.worker.io.LocalContainerAirbyteDestination.accept(LocalContainerAirbyteDestination.kt:91)\n\tat io.airbyte.container.orchestrator.worker.DestinationWriter.run(ReplicationTask.kt:96)\n\tat io.airbyte.container.orchestrator.worker.DestinationWriter$run$1.invokeSuspend(ReplicationTask.kt)\n\tat kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)\n\tat kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:100)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)\n\tat java.base/java.lang.Thread.run(Thread.java:1583)\n",
      "timestamp" : 1750903516065
    }
    ```
  • Alex C (06/26/2025, 7:04 AM)
    @kapa.ai The Airbyte UI is experiencing a loading failure. While querying the endpoint list_by_user_id, I encountered a TypeError: Failed to fetch.
  • Alex C (06/26/2025, 7:20 AM)
    @kapa.ai I got an error: Setting attempt to FAILED because the workflow for this connection was restarted, and existing job state was cleaned.
  • Luke Alexander (06/26/2025, 8:17 AM)
    @kapa.ai after upgrading from 1.5.1 to 1.7.0 our Slack notifications no longer format the links to Airbyte correctly: the URL is correct but it is now missing `https://`, so the links don't render as links. How do I fix this?
  • Jacob (06/26/2025, 9:41 AM)
    What's the best way to compute Shopify refunds?
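    A minimal sketch of one common approach, summing the refund line items from synced `refunds` records; the field names mirror Shopify's REST refund object and should be treated as assumptions against the columns your sync actually produces:
    ```python
    # Sketch: total refunded amount per order from refund records.
    from collections import defaultdict

    refunds = [  # stand-in for rows from the synced `refunds` stream
        {"order_id": 1001,
         "refund_line_items": [{"subtotal": 19.99, "total_tax": 1.60}]},
        {"order_id": 1001,
         "refund_line_items": [{"subtotal": 5.00, "total_tax": 0.40}]},
    ]

    totals = defaultdict(float)
    for refund in refunds:
        for item in refund["refund_line_items"]:
            totals[refund["order_id"]] += item["subtotal"] + item["total_tax"]

    print(dict(totals))  # {1001: 26.99}
    ```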
  • Fabrizio Spini (06/26/2025, 10:12 AM)
    @kapa.ai why do I have to grant RELOAD for the MySQL source, as described in https://docs.airbyte.com/integrations/sources/mysql#step-1-create-a-dedicated-read-only-mysql-user ?
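    Context that may help: the grants in the linked docs exist because CDC needs more than plain reads, and RELOAD in particular permits FLUSH TABLES WITH READ LOCK, which the connector can use to take a consistent snapshot before tailing the binlog. The documented statements, issued from Python for illustration (the privilege list follows the linked page; driver and credentials are placeholders):
    ```python
    # Sketch: create the dedicated read-only user with the privileges the
    # MySQL source docs list for CDC. Requires mysql-connector-python.
    import mysql.connector

    conn = mysql.connector.connect(host="localhost", user="root",
                                   password="<root-password>")
    cur = conn.cursor()
    cur.execute("CREATE USER 'airbyte'@'%' IDENTIFIED BY '<password>'")
    cur.execute(
        "GRANT SELECT, RELOAD, SHOW DATABASES, REPLICATION SLAVE, "
        "REPLICATION CLIENT ON *.* TO 'airbyte'@'%'"
    )
    cur.close()
    conn.close()
    ```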
  • Cenk Batman (06/26/2025, 10:31 AM)
    @kapa.ai I am using Helm chart 1.7.0. Any source creation fails at the “test the source” step with An unexpected error occurred. Please report this if the issue persists. (HTTP 500). An example log from the source-s3-check pod:
    ```
    i.a.a.c.c.ClientConfigurationSupport(generateDefaultRetryPolicy$lambda$3):65 - Failed to call unknown. Last response: null
    java.io.IOException: HTTP error: 500 Internal Server Error
      at io.airbyte.api.client.ThrowOn5xxInterceptor.intercept(ThrowOn5xxInterceptor.kt:23)
      at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
      at okhttp3.internal.connection.RealCall.getResponseWithInterceptorChain$okhttp(RealCall.kt:201)
      at okhttp3.internal.connection.RealCall.execute(RealCall.kt:154)
      at dev.failsafe.okhttp.FailsafeCall.lambda$execute$0(FailsafeCall.java:117)
      at dev.failsafe.Functions.lambda$get$0(Functions.java:46)
      at dev.failsafe.internal.RetryPolicyExecutor.lambda$apply$0(RetryPolicyExecutor.java:74)
      at dev.failsafe.SyncExecutionImpl.executeSync(SyncExecutionImpl.java:187)
      at dev.failsafe.CallImpl.execute(CallImpl.java:33)
      at dev.failsafe.okhttp.FailsafeCall.execute(FailsafeCall.java:121)
      at io.airbyte.api.client.generated.WorkloadOutputApi.writeWorkloadOutputWithHttpInfo(WorkloadOutputApi.kt:376)
      at io.airbyte.api.client.generated.WorkloadOutputApi.writeWorkloadOutput(WorkloadOutputApi.kt:62)
      at io.airbyte.workers.workload.WorkloadOutputWriter.writeOutputThroughServer(WorkloadOutputWriter.kt:82)
      at io.airbyte.workers.workload.WorkloadOutputWriter.write(WorkloadOutputWriter.kt:65)
      at io.airbyte.connectorSidecar.ConnectorWatcher.handleException(ConnectorWatcher.kt:191)
      at io.airbyte.connectorSidecar.ConnectorWatcher.run(ConnectorWatcher.kt:83)
      at io.airbyte.connectorSidecar.$ConnectorWatcher$Definition.initialize$intercepted(Unknown Source)
      at io.airbyte.connectorSidecar.$ConnectorWatcher$Definition$InitializeInterceptor.invokeInternal(Unknown Source)
      at io.micronaut.context.AbstractExecutableMethod.invoke(AbstractExecutableMethod.java:166)
      at io.micronaut.aop.chain.MethodInterceptorChain.doIntercept(MethodInterceptorChain.java:285)
      at io.micronaut.aop.chain.MethodInterceptorChain.initialize(MethodInterceptorChain.java:208)
      at io.airbyte.connectorSidecar.$ConnectorWatcher$Definition.initialize(Unknown Source)
      at io.airbyte.connectorSidecar.$ConnectorWatcher$Definition.instantiate(Unknown Source)
      at io.micronaut.context.DefaultBeanContext.resolveByBeanFactory(DefaultBeanContext.java:2335)
      at io.micronaut.context.DefaultBeanContext.createRegistration(DefaultBeanContext.java:3146)
      at io.micronaut.context.SingletonScope.getOrCreate(SingletonScope.java:80)
      at io.micronaut.context.DefaultBeanContext.intializeEagerBean(DefaultBeanContext.java:3035)
      at io.micronaut.context.DefaultBeanContext.initializeEagerBean(DefaultBeanContext.java:2704)
      at io.micronaut.context.DefaultBeanContext.initializeContext(DefaultBeanContext.java:2032)
      at io.micronaut.context.DefaultApplicationContext.initializeContext(DefaultApplicationContext.java:323)
      at io.micronaut.context.DefaultBeanContext.configureAndStartContext(DefaultBeanContext.java:3342)
      at io.micronaut.context.DefaultBeanContext.start(DefaultBeanContext.java:353)
      at io.micronaut.context.DefaultApplicationContext.start(DefaultApplicationContext.java:225)
      at io.micronaut.runtime.Micronaut.start(Micronaut.java:75)
      at io.airbyte.connectorSidecar.ApplicationKt.main(Application.kt:18)
      at io.airbyte.connectorSidecar.ApplicationKt.main(Application.kt)
    ```
  • Konathala Chaitanya (06/26/2025, 11:59 AM)
    @kapa.ai "airbyte_internal"."events_common_raw__stream_requests" what os this table created by airbyte??
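    For background: destinations with Typing and Deduping keep raw records in an `airbyte_internal` schema, in tables named `{namespace}_raw__stream_{stream_name}`, so this appears to be the raw table for a `requests` stream in the `events_common` namespace. A minimal sketch of the usual raw columns to peek at (run the query in your warehouse; treat the column list as an assumption to verify against your table):
    ```python
    # Sketch: build a query against the Airbyte raw table; _airbyte_raw_id,
    # _airbyte_extracted_at, and _airbyte_data are the standard raw columns.
    sql = """
    SELECT _airbyte_raw_id, _airbyte_extracted_at, _airbyte_data
    FROM "airbyte_internal"."events_common_raw__stream_requests"
    LIMIT 10
    """
    print(sql)
    ```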