# ask-ai

    Kürşat Yalın

    11/25/2025, 10:10 AM
    Does Airbyte Core support the Catalog Items API in the Amazon Seller Partner connector? @kapa.ai

    Mehul Sanghvi

    11/25/2025, 10:18 AM
    #C01AHCD885S When using the MongoDB source connector and the S3 destination connector, is the data syncing feature available?

    Kürşat Yalın

    11/25/2025, 10:25 AM
    I couldn't see the Amazon Search Terms Report stream in the Amazon Seller Partner connector in the Airbyte Core sources/settings menu. #C01AHCD885S

    Renu Fulmali

    11/25/2025, 11:25 AM
    @kapa.ai i.a.w.s.a.OutputStorageClient(persist):61 - Message: The authorization header is malformed; the region 'us-east-1' is wrong; expecting 'eu-west-1' (Service: S3, Status Code: 400, Request ID: TW1RSEY6MBVYF8ND, Extended Request ID: bynVIKxfhDJJ8CYQUJcYyw4RNw9zaZUAXXY111q7xg8cVKWnQgSTEMXzVNi5bnx5bqVr65JEO4Q=) (SDK Attempt Count: 1)
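    For context, this error comes from Airbyte's output/log storage client: the S3 region configured for storage does not match the bucket's actual region. A minimal sketch of the Helm storage values, assuming the chart's global.storage block and placeholder bucket/secret names, with the region pointed at the bucket's real region:
    global:
      storage:
        type: "S3"
        secretName: airbyte-config-secrets   # assumed name of the secret holding AWS credentials
        bucket:
          log: my-airbyte-bucket             # placeholder bucket names
          state: my-airbyte-bucket
          workloadOutput: my-airbyte-bucket
        s3:
          region: "eu-west-1"                # must match the bucket's actual region
          authenticationType: credentials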

    Neeraj N

    11/25/2025, 11:45 AM
    Internal Server Error: java.net.UnknownHostException: airbyte-abctl-airbyte-connector-builder-server-svc.airbyte-abctl

    Neeraj N

    11/25/2025, 11:46 AM
    Issue in the REST API case: Internal Server Error: java.net.UnknownHostException: airbyte-abctl-airbyte-connector-builder-server-svc.airbyte-abctl

    Neeraj N

    11/25/2025, 12:11 PM
    version: 7.0.4
    type: DeclarativeSource
    check:
      type: CheckStream
      stream_names:
        - neeraj api test
    streams:
      - type: DeclarativeStream
        retriever:
          type: SimpleRetriever
          record_selector:
            type: RecordSelector
            extractor:
              type: DpathExtractor
              field_path:
                - results
          requester:
            type: HttpRequester
            http_method: GET
            url: https://randomuser.me/api/
          decoder:
            type: JsonDecoder
        name: api test
      - type: DeclarativeStream
        retriever:
          type: AsyncRetriever
          record_selector:
            type: RecordSelector
            extractor:
              type: DpathExtractor
              field_path: []
          creation_requester:
            type: HttpRequester
            http_method: GET
            url: https://randomuser.me/api/
          polling_requester:
            type: HttpRequester
            http_method: GET
          download_requester:
            type: HttpRequester
            http_method: GET
          status_mapping:
            type: AsyncJobStatusMap
            running: []
            completed: []
            failed: []
            timeout: []
          status_extractor:
            type: DpathExtractor
            field_path: []
          download_target_extractor:
            type: DpathExtractor
            field_path: []
          decoder:
            type: CsvDecoder
            encoding: utf-8
            delimiter: ","
        name: neeraj api test
    spec:
      type: Spec
      connection_specification:
        type: object
        properties: {}
    metadata:
      autoImportSchema:
        neeraj api test: true
        api test: true
    I'm getting this issue: Internal Server Error: java.net.UnknownHostException: airbyte-abctl-airbyte-connector-builder-server-svc.airbyte-abctl: Name or service not known

    Adrien

    11/25/2025, 12:28 PM
    @kapa.ai Hi there, I'm running into an issue with a Postgres CDC → Snowflake connection. Since Monday, all syncs complete successfully, but 0 records are replicated across all streams. After checking the connection state, I noticed something strange:
    • The shared CDC state (Debezium) contains a valid LSN and is advancing normally.
    • But all streamStates only contain cursor_field: [] and no per-stream state at all.
    • Logs show for every stream: Found cursor field. Original cursor field: null. New cursor field: _ab_cdc_lsn. Resetting cursor value.
    • Debezium logs also show: Received offset commit request… but ignoring it. LSN flushing is not allowed yet.
    The replication slot on Postgres is healthy and active. It looks like Airbyte is stuck in an inconsistent snapshot/CDC state. The result is that no changes are produced during syncs. Could you help me understand how to fix this state inconsistency, and whether a full “Reset data + Reset state” is required? Is there any way to recover the connection without triggering a full snapshot? Thanks!

    Fabrizio Spini

    11/25/2025, 1:47 PM
    @kapa.ai the Postgres DB had "no space left" and was rolled back to a previous valid position. Now my connections can't advance. How can I try to resolve this?

    Paweł Jurkiewicz

    11/25/2025, 2:25 PM
    @kapa.ai I'm trying to ingest Excel files using SFTP Bulk connector to GCS bucket. I'm getting
    Destination does not support file transfers, but source requires it
    How can this issue be solved?

    Jeremy Plummer

    11/25/2025, 3:05 PM
    @kapa.ai What does this error mean when testing sources? Airbyte is temporarily unavailable. Please try again. (HTTP 502)

    Fabrizio Spini

    11/25/2025, 3:36 PM
    @kapa.ai I have the following part in the connection state:
    "[\"communications\",{\"server\":\"communications\"}]": "{\"ts_sec\":1764050457,\"file\":\"db05-slave.087107\",\"pos\":90877811,\"row\":1,\"server_id\":347503800,\"event\":3}"
    What are the row and event values?

    Karim Meguenni-Tani

    11/25/2025, 4:29 PM
    @kapa.ai I have a problem with Airbyte 0.63.5 and Connector Builder 2.3.0. Explain to me the problems with this version.

    Eduardo Ferreira

    11/25/2025, 4:37 PM
    @kapa.ai after updating to airbyte 2.0.1, I started getting timeout errors when upgrading my source and destination connector versions. What could be happening?

    Zack Mattor

    11/25/2025, 4:53 PM
    @kapa.ai when running syncs I see warnings like this...
    WARN com.datadog.profiling.uploader.ProfileUploader - Failed to upload profile to http://localhost:8126/profiling/v1/input java.net.ConnectException: Failed to connect to localhost/127.0.0.1:8126 (Will not log warnings for 5 minutes)
    I thought Datadog was disabled by default in the Helm v2 charts... is there something else I need to disable?
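    For context, these warnings come from the bundled dd-trace-java agent trying to reach a local Datadog agent. A minimal sketch of values that turn its profiling and tracing off via the agent's standard DD_* environment variables, assuming the chart exposes a global env_vars map (the exact key varies by chart version):
    global:
      env_vars:
        DD_PROFILING_ENABLED: "false"   # standard dd-trace-java switch for the profiler
        DD_TRACE_ENABLED: "false"       # disables the tracer entirely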

    Jeremy Plummer

    11/25/2025, 5:54 PM
    @kapa.ai What does this error point to in a destination postgres sync: invalid input syntax for type double precision: "None"

    Joao Pedro Ferreira Canutto

    11/25/2025, 7:02 PM
    @kapa.ai Analyze the following logs

    soma chandra sekhar attaluri

    11/25/2025, 7:03 PM
    @kapa.ai which is the best way to deploy Airbyte? My company is trying to ingest around 1 TB per day from 20 different source databases on a daily basis. If I want to deploy Airbyte on AWS, we are thinking of deploying it on a large EC2 instance. Is it best to do it on EKS, or can we deploy it on EC2? Is it better to install it using Helm charts on k3s or using abctl, and which underlying OS is best, Windows or Linux? Please explain clearly why I should take a particular option.

    Maxime Broussard

    11/25/2025, 11:17 PM
    @kapa.ai If the page contains fewer than 5 records, then the paginator knows there are no more pages to fetch. If the API returns more records than requested, all records will be processed. The paginator stops by default if the number of records is less than the max page size, but it stops wrongly in some cases. For example, I have an API that returns a max of 200 records; it does return 200 for pages 1 and 2, then returns 199 (for whatever reason), which stops Airbyte even though there are still pages 3, 4, 5, etc. after. Is there a way around this?
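    One possible workaround, sketched under assumptions: the "fewer records than the page size" heuristic is only the default stop rule, and a CursorPagination strategy lets you define an explicit stop_condition based on the response instead. The next_page_token field and page_token parameter below are hypothetical; substitute whatever your API actually returns and expects:
    paginator:
      type: DefaultPaginator
      page_token_option:
        type: RequestOption
        inject_into: request_parameter
        field_name: page_token                                            # hypothetical query parameter
      pagination_strategy:
        type: CursorPagination
        cursor_value: "{{ response.next_page_token }}"                    # hypothetical response field
        stop_condition: "{{ response.next_page_token is not defined }}"   # stop only when the API stops returning a token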

    Aviad Deri

    11/26/2025, 5:58 AM
    @kapa.ai trying to set up a Postgres source on a Kubernetes installation but getting the following:
    2025-11-25 162827 INFO 2025-11-25 162827 info
    2025-11-25 162827 INFO 2025-11-25 162827 info Transitioning workload to running state
    2025-11-25 162827 INFO 2025-11-25 162827 info ----- START CHECK -----
    2025-11-25 162827 INFO 2025-11-25 162827 info
    2025-11-25 162827 INFO 2025-11-25 162827 info Workload successfully transitioned to running state
    2025-11-25 162848 INFO 2025-11-25 162848 info Connector exited, processing output
    2025-11-25 162848 INFO 2025-11-25 162848 info Output file jobOutput.json found
    2025-11-25 162848 INFO 2025-11-25 162848 info Connector exited with exit code 0
    2025-11-25 162848 INFO 2025-11-25 162848 info Reading messages from protocol version 0.2.0
    2025-11-25 162848 INFO 2025-11-25 162848 info INFO main i.a.i.s.p.PostgresSource(main):750 starting source: class io.airbyte.integrations.source.postgres.PostgresSource
    2025-11-25 162848 INFO 2025-11-25 162848 info INFO main i.a.c.i.b.IntegrationCliParser$Companion(parseOptions):144 integration args: {check=null, config=/config/connectionConfiguration.json}
    2025-11-25 162848 INFO 2025-11-25 162848 info INFO main i.a.c.i.b.IntegrationRunner(runInternal):130 Running integration: io.airbyte.cdk.integrations.base.ssh.SshWrappedSource
    2025-11-25 162848 INFO 2025-11-25 162848 info INFO main i.a.c.i.b.IntegrationRunner(runInternal):131 Command: CHECK
    2025-11-25 162848 INFO 2025-11-25 162848 info INFO main i.a.c.i.b.IntegrationRunner(runInternal):132 Integration config: IntegrationConfig{command=CHECK, configPath='/config/connectionConfiguration.json', catalogPath='null', statePath='null'}
    2025-11-25 162848 WARN 2025-11-25 162848 warn WARN main c.n.s.UnknownKeywordFactory(lambda$getKeyword$0):37 Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword or if it should generate annotations AnnotationKeyword
    2025-11-25 162848 WARN 2025-11-25 162848 warn WARN main c.n.s.UnknownKeywordFactory(lambda$getKeyword$0):37 Unknown keyword group - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword or if it should generate annotations AnnotationKeyword
    2025-11-25 162848 WARN 2025-11-25 162848 warn WARN main c.n.s.UnknownKeywordFactory(lambda$getKeyword$0):37 Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword or if it should generate annotations AnnotationKeyword
    2025-11-25 162848 WARN 2025-11-25 162848 warn WARN main c.n.s.UnknownKeywordFactory(lambda$getKeyword$0):37 Unknown keyword always_show - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword or if it should generate annotations AnnotationKeyword
    2025-11-25 162848 WARN 2025-11-25 162848 warn WARN main c.n.s.UnknownKeywordFactory(lambda$getKeyword$0):37 Unknown keyword pattern_descriptor - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword or if it should generate annotations AnnotationKeyword
    2025-11-25 162848 WARN 2025-11-25 162848 warn WARN main c.n.s.UnknownKeywordFactory(lambda$getKeyword$0):37 Unknown keyword multiline - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword or if it should generate annotations AnnotationKeyword
    2025-11-25 162848 WARN 2025-11-25 162848 warn WARN main c.n.s.UnknownKeywordFactory(lambda$getKeyword$0):37 Unknown keyword display_type - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword or if it should generate annotations AnnotationKeyword
    2025-11-25 162848 WARN 2025-11-25 162848 warn WARN main c.n.s.UnknownKeywordFactory(lambda$getKeyword$0):37 Unknown keyword min - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword or if it should generate annotations AnnotationKeyword
    2025-11-25 162848 WARN 2025-11-25 162848 warn WARN main c.n.s.UnknownKeywordFactory(lambda$getKeyword$0):37 Unknown keyword max - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword or if it should generate annotations AnnotationKeyword
    2025-11-25 162848 WARN 2025-11-25 162848 warn WARN main c.n.s.UnknownKeywordFactory(lambda$getKeyword$0):37 Unknown keyword groups - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword or if it should generate annotations AnnotationKeyword
    2025-11-25 162848 INFO 2025-11-25 162848 info INFO main i.a.c.i.b.s.SshTunnel$Companion(getInstance):424 Starting connection with method: NO_TUNNEL
    2025-11-25 162848 INFO 2025-11-25 162848 info INFO main i.a.i.s.p.PostgresUtils(isCdc):70 using CDC: false
    2025-11-25 162848 INFO 2025-11-25 162848 info INFO main i.a.i.s.p.PostgresSource(toSslJdbcParamInternal):976 DISABLED toSslJdbcParam disable
    2025-11-25 162848 INFO 2025-11-25 162848 info INFO main c.z.h.HikariDataSource(<init>):79 HikariPool-1 - Starting...
    2025-11-25 162848 INFO 2025-11-25 162848 info INFO main c.z.h.HikariDataSource(<init>):81 HikariPool-1 - Start completed.
    2025-11-25 162848 INFO 2025-11-25 162848 info INFO main c.z.h.HikariDataSource(close):349 HikariPool-1 - Shutdown initiated...
    2025-11-25 162848 INFO 2025-11-25 162848 info INFO main c.z.h.HikariDataSource(close):351 HikariPool-1 - Shutdown completed.
    2025-11-25 162848 INFO 2025-11-25 162848 info INFO main i.a.c.i.b.IntegrationRunner(runInternal):224 Completed integration: io.airbyte.cdk.integrations.base.ssh.SshWrappedSource
    2025-11-25 162848 INFO 2025-11-25 162848 info INFO main i.a.i.s.p.PostgresSource(main):752 completed source: class io.airbyte.integrations.source.postgres.PostgresSource
    2025-11-25 162848 INFO 2025-11-25 162848 info Checking for optional control message...
    2025-11-25 162848 INFO 2025-11-25 162848 info Optional control message not found. Skipping...
    2025-11-25 162849 INFO 2025-11-25 162849 info Writing output of decd338e-5647-4c0b-adf4-da0e75f5a750_436d097d-c41f-4775-bc3f-186803310e78_0_check to the doc store
    2025-11-25 162849 INFO 2025-11-25 162849 info Marking workload decd338e-5647-4c0b-adf4-da0e75f5a750_436d097d-c41f-4775-bc3f-186803310e78_0_check as successful
    2025-11-25 162849 INFO 2025-11-25 162849 info
    2025-11-25 162849 INFO 2025-11-25 162849 info Deliberately exiting process with code 0.
    2025-11-25 162849 INFO 2025-11-25 162849 info ----- END CHECK -----
    2025-11-25 162849 INFO 2025-11-25 162849 info
    What is the issue?

    Mateo Colina

    11/26/2025, 9:31 AM
    @kapa.ai what does "Days To Sync If History Is Full" mean for source-sftp-bulk? When do I know that the history is full?

    Renu Fulmali

    11/26/2025, 9:40 AM
    @kapa.ai I have set up Airbyte on an EKS cluster using the Helm chart version 8.5.0. When I create a custom connector, after testing it once it doesn't allow me to update it again; if I want to, I need to create a new custom connector. It also converts the UI into YAML.

    Vishal Garg

    11/26/2025, 10:03 AM
    🧵 help

    Abhijith C

    11/26/2025, 10:35 AM
    @kapa.ai why would sync show success even though pod was never allocated?
    2025-11-26 095322 platform INFO APPLY Stage: BUILD — (workloadId=d88dfecc-6847-4445-8733-c2d0ab51ff67_67_0_sync)
    2025-11-26 095322 platform INFO APPLY Stage: CLAIM — (workloadId=d88dfecc-6847-4445-8733-c2d0ab51ff67_67_0_sync)
    2025-11-26 095322 platform INFO Claimed: true for workload d88dfecc-6847-4445-8733-c2d0ab51ff67_67_0_sync via API in dataplane 2894b95b-f14c-446c-9c04-14eb422092fa (acf550f2-6155-4451-8db9-b62443826c92)
    2025-11-26 095322 platform INFO APPLY Stage: LOAD_SHED — (workloadId=d88dfecc-6847-4445-8733-c2d0ab51ff67_67_0_sync)
    2025-11-26 095322 platform INFO APPLY Stage: CHECK_STATUS — (workloadId=d88dfecc-6847-4445-8733-c2d0ab51ff67_67_0_sync)
    2025-11-26 095322 platform INFO No pod found running for workload d88dfecc-6847-4445-8733-c2d0ab51ff67_67_0_sync
    2025-11-26 095322 platform INFO APPLY Stage: MUTEX — (workloadId=d88dfecc-6847-4445-8733-c2d0ab51ff67_67_0_sync)
    2025-11-26 095322 platform INFO Mutex key: d88dfecc-6847-4445-8733-c2d0ab51ff67 specified for workload: d88dfecc-6847-4445-8733-c2d0ab51ff67_67_0_sync. Attempting to delete existing pods...
    2025-11-26 095322 platform INFO Existing pods for mutex key: d88dfecc-6847-4445-8733-c2d0ab51ff67 deleted.
    2025-11-26 095322 platform INFO APPLY Stage: ARCHITECTURE — (workloadId=d88dfecc-6847-4445-8733-c2d0ab51ff67_67_0_sync)
    2025-11-26 095322 platform INFO APPLY Stage: LAUNCH — (workloadId=d88dfecc-6847-4445-8733-c2d0ab51ff67_67_0_sync)
    2025-11-26 095322 platform INFO [initContainer] image: airbyte/workload-init-container:2.0.1 resources: ResourceRequirements(claims=[], limits={memory=7530Mi, cpu=3.6723}, requests={memory=6348Mi, cpu=1.845}, additionalProperties={})
    2025-11-26 095322 platform INFO Launching replication pod: replication-job-67-attempt-0 (selectors = {kube/nodetype=airbyte-job}) with containers:
    2025-11-26 095322 platform INFO [source] image: 454115844779.dkr.ecr.us-east-1.amazonaws.com/conn/teams-dev:e3395cc.4059 resources: ResourceRequirements(claims=[], limits={memory=7530Mi, cpu=3.6723}, requests={memory=6348Mi, cpu=1.845}, additionalProperties={})
    2025-11-26 095322 platform INFO [destination] image: 454115844779.dkr.ecr.us-east-1.amazonaws.com/conn/efsdest-dev:fe5e292.3541 resources: ResourceRequirements(claims=[], limits={memory=7530Mi, cpu=3.6723}, requests={memory=6348Mi, cpu=1.845}, additionalProperties={})
    2025-11-26 095322 platform INFO [orchestrator] image: airbyte/container-orchestrator:2.0.1 resources: ResourceRequirements(claims=[], limits={memory=7530Mi, cpu=3.6723}, requests={memory=6348Mi, cpu=1.845}, additionalProperties={})
    2025-11-26 102408 platform INFO ----- START POST REPLICATION OPERATIONS -----
    2025-11-26 102408 platform INFO No post-replication operation(s) to perform.
    2025-11-26 102408 platform INFO ----- END POST REPLICATION OPERATIONS -----

    David Robinson

    11/26/2025, 12:41 PM
    @kapa.ai how do I increase memory requirements for the orchestrator container?
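    A sketch under assumptions: recent versions size the replication pod (including the orchestrator container) from the instance-wide job container settings, which can be passed as environment variables through the Helm values, assuming the chart exposes a global env_vars map; verify both the key and the variables for your release:
    global:
      env_vars:
        JOB_MAIN_CONTAINER_MEMORY_REQUEST: "2Gi"   # example values, not recommendations
        JOB_MAIN_CONTAINER_MEMORY_LIMIT: "4Gi"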

    Anna Bogo

    11/26/2025, 12:51 PM
    @kapa.ai I have an issue replicating Retool data into Snowflake. The same tables don't have any issues in Redshift. I tried refreshing the stream, clearing the data, adding a new connection with all streams, and adding a new connection with only the stream that is failing. Where else should I look?

    Eduardo Ferreira

    11/26/2025, 1:10 PM
    @kapa.ai UpgradeFailed - Helm upgrade failed for release airbyte/airbyte with chart airbyte@1.8.5: template: airbyte/charts/workload-api-server/templates/secrets.yaml1460: executing "airbyte/charts/workload-api-server/templates/secrets.yaml" at <.Values.global.secrets>: wrong type for value; expected map[string]interface {}; got interface {}
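    The error itself describes the shape problem: the template expects .Values.global.secrets to be a map, so a bare or null secrets: key renders as a plain interface{} and fails. A minimal sketch of the expected shape; either remove the key or make it an explicit map:
    global:
      secrets: {}   # or a real map of key: value entries expected by the chart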

    Rafael Felipe

    11/26/2025, 1:16 PM
    Does Airbyte do any group by on the MongoDB data source during CDC sync?

    Abhijith C

    11/26/2025, 1:24 PM
    @kapa.ai where should one configure the heartbeat timeout? We have a long-running sync; even though the sync is still running, Airbyte has marked the status as success.
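    A sketch under assumptions: the source heartbeat timeout is commonly controlled by the HEARTBEAT_MAX_SECONDS_BETWEEN_MESSAGES environment variable, assuming your version reads it and your chart exposes a global env_vars map; verify both before relying on it:
    global:
      env_vars:
        HEARTBEAT_MAX_SECONDS_BETWEEN_MESSAGES: "10800"   # seconds; example value only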