# ask-ai

    soma chandra sekhar attaluri

    12/02/2025, 8:08 PM
    @kapa.ai is there any svc with the name airbyte webapp svc in Airbyte 2.0 installed using Helm charts v2?

    soma chandra sekhar attaluri

    12/02/2025, 10:00 PM
    @kapa.ai How to disable secure cookie in abctl installation

    Jeremy Plummer

    12/02/2025, 10:45 PM
    @kapa.ai Is there a way to copy the Terraform configuration of a connection set up in Airbyte?

    Yuki Kakegawa

    12/02/2025, 11:15 PM
    Is there a way to "Refresh your data" only one of the tables configured in the sync?

    Mauricio Pérez

    12/02/2025, 11:56 PM
    @kapa.ai I'm having trouble setting up the Amazon Seller Partner connector; I'm getting this error: 'Encountered an error while checking availability of stream Orders. Error: 400 Client Error: Bad Request for url: https://api.amazon.com/auth/o2/token'. How can I solve this?

    Shawna Zhao

    12/03/2025, 1:04 AM
    Cost of self hosted enterprise

    Joshua Garza

    12/03/2025, 4:54 AM
    #C01AHCD885S I see these errors on a simple full refresh overwrite connection:
    2025-12-03 04:50:54,718 [io-executor-thread-4] ERROR i.a.c.s.e.h.UncaughtExceptionHandler(handle):33 - Uncaught exception
    java.lang.IllegalStateException: Duplicate key 54 (attempted merging values {StreamNameAndNamespace(name=core_facility_ratings, namespace=COPAY)=full_refresh, StreamNameAndNamespace(name=stage_plantype, namespace=COPAY)=full_refresh, StreamNameAndNamespace(name=core_mean_prices_combined, namespace=COPAY)=full_refresh, StreamNameAndNamespace(name=core_facility_top_problems, namespace=COPAY)=full_refresh, StreamNameAndNamespace(name=stage_facilities, namespace=COPAY)=full_refresh, StreamNameAndNamespace(name=stage_facility_types, namespace=COPAY)=full_refresh} and {StreamNameAndNamespace(name=core_facility_ratings, namespace=COPAY)=full_refresh, StreamNameAndNamespace(name=stage_plantype, namespace=COPAY)=full_refresh, StreamNameAndNamespace(name=core_mean_prices_combined, namespace=COPAY)=full_refresh, StreamNameAndNamespace(name=core_facility_top_problems, namespace=COPAY)=full_refresh, StreamNameAndNamespace(name=stage_facilities, namespace=COPAY)=full_refresh, StreamNameAndNamespace(name=stage_facility_types, namespace=COPAY)=full_refresh})
        at java.base/java.util.stream.Collectors.duplicateKeyException(Collectors.java:135)
        at java.base/java.util.stream.Collectors.lambda$uniqKeysMapAccumulator$1(Collectors.java:182)
        at java.base/java.util.stream.ReduceOps$3ReducingSink.accept(ReduceOps.java:169)
        at java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1708)
        at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
        at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
        at java.base/java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:921)
        at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
        at java.base/java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:682)
        at io.airbyte.commons.server.handlers.helpers.StatsAggregationHelper.hydrateWithStats(StatsAggregationHelper.kt:181)
        at io.airbyte.commons.server.handlers.helpers.StatsAggregationHelper.getJobIdToJobWithAttemptsReadMap(StatsAggregationHelper.kt:306)
        at io.airbyte.commons.server.handlers.ConnectionsHandler.getConnectionLastJobPerStream(ConnectionsHandler.kt:2188)
        at io.airbyte.server.apis.controllers.ConnectionApiController.getConnectionLastJobPerStream$lambda$18(ConnectionApiController.kt:281)
        at io.airbyte.server.apis.ApiHelper.execute(ApiHelper.kt:32)
        at io.airbyte.server.apis.controllers.ConnectionApiController.getConnectionLastJobPerStream(ConnectionApiController.kt:280)
        at io.airbyte.server.apis.controllers.$ConnectionApiController$Definition$Exec.dispatch(Unknown Source)
        at io.micronaut.context.AbstractExecutableMethodsDefinition$DispatchedExecutableMethod.invokeUnsafe(AbstractExecutableMethodsDefinition.java:461)
        at io.micronaut.context.DefaultBeanContext$BeanContextUnsafeExecutionHandle.invokeUnsafe(DefaultBeanContext.java:4438)
        at io.micronaut.web.router.AbstractRouteMatch.execute(AbstractRouteMatch.java:272)
        at io.micronaut.web.router.DefaultUriRouteMatch.execute(DefaultUriRouteMatch.java:38)
        at io.micronaut.http.server.RouteExecutor.executeRouteAndConvertBody(RouteExecutor.java:465)
        at io.micronaut.http.server.RouteExecutor.lambda$callRoute$5(RouteExecutor.java:442)
        at io.micronaut.core.execution.ExecutionFlow.lambda$async$0(ExecutionFlow.java:92)
        at io.micronaut.core.propagation.PropagatedContext.lambda$wrap$3(PropagatedContext.java:232)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        at java.base/java.lang.Thread.run(Thread.java:1583)

    Yosua Fernandus Simanjuntak

    12/03/2025, 9:23 AM
    @kapa.ai I used a connection from API Builder to BigQuery; why does the BigQuery table store the data as JSON instead of columnar?

    Vitor P. Miguel

    12/03/2025, 9:34 AM
    @kapa.ai I am building a custom connector using the UI. My source app only returns 25 objects per page; by default it returns page 1, and to get the next page I need to add a query parameter page=2, and so on, until no more data (in the default format) is returned. I need it to return everything by default. How can I set this up?
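The paging scheme described (page=1, then page=2, and so on until an empty or short page) can be sketched in Python; `fetch_page` here is a hypothetical stand-in for the source API call, and in the Connector Builder this kind of loop corresponds to a page-increment pagination strategy:

```python
from typing import Callable, List

def fetch_all(fetch_page: Callable[[int], List], page_size: int = 25) -> List:
    """Collect every record by incrementing the `page` query parameter.

    Stops when a page comes back empty or shorter than page_size.
    """
    records: List = []
    page = 1  # the source serves page 1 by default
    while True:
        batch = fetch_page(page)  # e.g. GET /objects?page=<page>
        records.extend(batch)
        if len(batch) < page_size:  # short or empty page: no more data
            break
        page += 1
    return records

# Usage against a fake API serving 60 records, 25 per page:
data = list(range(60))
fake_api = lambda p: data[(p - 1) * 25 : p * 25]
print(len(fetch_all(fake_api)))  # 60
```

The stop condition (a page shorter than the page size) avoids one extra empty request when the last page is partial; if the API can return exactly-full final pages, loop until an empty page instead.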

    Syed Hamza Raza Kazmi

    12/03/2025, 9:51 AM
    @kapa.ai is the issue with Postgres-to-Postgres CDC replication resolved?

    Horia

    12/03/2025, 10:04 AM
    @kapa.ai A sync I was performing from MongoDB to Redshift has been working for a while now. However, suddenly I received the error below: 'Failure in destination: Failed to drop table without the CASCADE option. Consider changing the drop_cascade configuration parameter.' I tried performing a full refresh, but this did not work. What is the issue here, and why did it suddenly start happening?

    kapa.ai

    12/03/2025, 10:04 AM
    The error means the Redshift destination is trying to `DROP TABLE`, but Redshift is refusing because the table has dependent objects (for example, views or foreign keys) and the connector is not allowed to use `DROP ... CASCADE` under your current config. Instead of silently failing, the destination now surfaces this as: "Failed to drop table without the CASCADE option. Consider changing the drop_cascade configuration parameter." [[Redshift ref](https://docs.airbyte.com/integrations/destinations/redshift#reference)]

    Why it started "suddenly":
    - Recent Redshift destination versions added and changed how they respect the `drop_cascade` option, including for raw tables and resets. [[Redshift changelog](https://docs.airbyte.com/integrations/destinations/redshift#changelog)]
    - If your Airbyte platform or Redshift connector auto-upgraded (as has happened to other users), connector logic around table drops may have changed even though your pipeline looked the same. [[Mongo perf forum](https://discuss.airbyte.io/t/1011); [Sudden failures forum](https://discuss.airbyte.io/t/2656)]

    Because the underlying Redshift table now has dependencies (or different DDL), a simple `DROP TABLE` no longer works, and a full refresh alone doesn't fix that.

    What you can do:
    1. Enable "Drop tables with CASCADE" on the Redshift destination: in the destination config, set `drop_cascade` to `true`. [[Redshift ref](https://docs.airbyte.com/integrations/destinations/redshift#reference)] This lets the connector run `DROP TABLE ... CASCADE`, so dependent objects are dropped and the sync can recreate tables cleanly. Be aware this can cause permanent loss of dependent objects (views, FKs, etc.), similar to the caveats documented for Postgres. [[Postgres dependent objects](https://docs.airbyte.com/integrations/destinations/postgres#creating-dependent-objects)]
    2. Alternatively, remove or adjust the dependencies yourself: manually drop or adjust any views, constraints, or sort keys that block drops. For example, a similar issue was fixed by removing a sort key on `_airbyte_emitted_at` before the destination could alter/drop the table. [[Redshift sortkey issue](https://discuss.airbyte.io/t/3219)]
    3. If you rely on dependent objects (views, etc.) and enable `drop_cascade`, use a tool like dbt to recreate them after each sync, as recommended for other destinations. [[Postgres dependent objects](https://docs.airbyte.com/integrations/destinations/postgres#creating-dependent-objects)]

    If the problem persists after enabling `drop_cascade` and rerunning the sync, the knowledge sources don't cover more specific Redshift DDL patterns that could block the drop; in that case, inspect the exact table DDL and its dependencies directly in your cluster.
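A minimal SQL sketch of the situation the answer describes (table and view names are hypothetical): a dependent view blocks a plain DROP TABLE, and the CASCADE form, which `drop_cascade = true` permits the connector to use, drops the dependents too.

```sql
-- Hypothetical dependent object on a synced table:
CREATE VIEW reporting.orders_v AS SELECT * FROM airbyte.orders;

DROP TABLE airbyte.orders;          -- fails: the view depends on the table
DROP TABLE airbyte.orders CASCADE;  -- succeeds, but also drops reporting.orders_v
```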

    Tom

    12/03/2025, 10:11 AM
    @kapa.ai I’ve upgraded the Airbyte Community version from 1.2 to 2.0.1 and spawning pods stopped working. Once I trigger a connector test, it starts, but no job is created in Kubernetes and then it returns a 504. There are no error logs during that time, even at debug verbosity.

    Konathala Chaitanya

    12/03/2025, 10:46 AM
    @kapa.ai
    Sync failed: 0 bytes | no records loaded | 1h 40m 19s
    Warning from replication: Airbyte could not start the sync process or track the progress of the sync.
    message='Airbyte could not track the sync progress. Sync process exited without reporting status.', type='io.airbyte.workers.exception.WorkloadMonitorException', nonRetryable=false
    What is the error?

    Назар Тутин

    12/03/2025, 1:00 PM
    @kapa.ai We have Airbyte running self-hosted in k8s. We want to update to a new version using Helm chart 2.0. How can we migrate all of the data from the old install to the new one? Is there an import/export process? Version change: 1.4.0 -> 1.8.5

    J Bob

    12/03/2025, 1:29 PM
    @kapa.ai I wish to use Terraform to create a custom source connector; I have an exported Builder YAML file.

    Daniel de Cloet

    12/03/2025, 2:22 PM
    @kapa.ai On Airbyte OSS (2.0.1) deployed through the official Helm chart, I am trying to update a custom connector version (Docker container) through the API. Every time I send a PUT request to /api/public/v1/workspaces/<workspace ID>/definitions/sources/<source UUID>, the airbyte-server becomes unresponsive and is killed due to failing liveness checks. Is this a known problem?

    Martin Brummerstedt

    12/03/2025, 2:58 PM
    @kapa.ai I am setting up the Jira source. If I set up the API token without scopes, the connector works as expected, but if I set it up with full read scopes it fails. What scopes does the Jira source connector need to work?

    Kevin Robert

    12/03/2025, 3:48 PM
    hello @kapa.ai trying to sync Aircall data, I am facing this error: 'GET' request to 'https://api.aircall.io/v1/contacts?page=201&per_page=50' failed with status code '400' and error message 'BAD_REQUEST'

    kanchalkumar karale

    12/03/2025, 4:02 PM
    Hi @kapa.ai, can you help me set up Airbyte locally so I can test my code changes?

    Slackbot

    12/03/2025, 4:32 PM
    This message was deleted.

    Jared Parco

    12/03/2025, 4:49 PM
    @kapa.ai how do we prevent Airbyte from deleting fields when MySQL CDC logs that a field has been removed?

    kanchalkumar karale

    12/03/2025, 5:21 PM
    @kapa.ai can you please share the abctl download link? I'm on a Linux machine.

    kanchalkumar karale

    12/03/2025, 6:15 PM
    @kapa.ai How do I build the Docker image locally for the server?

    Ralf Santa Cruz

    12/03/2025, 6:21 PM
    @kapa.ai i am getting this error in my airbyte bootloader:
    2025-12-03 18:18:04,553 [main] ERROR i.a.b.Application(main):32 - Unable to bootstrap Airbyte environment.
    java.lang.RuntimeException: com.fasterxml.jackson.core.JsonParseException: Invalid numeric value: Leading zeroes not allowed
     at [Source: REDACTED (`StreamReadFeature.INCLUDE_SOURCE_IN_LOCATION` disabled); line: 1, column: 2]
        at io.airbyte.commons.json.Jsons.deserialize(Jsons.java:101)
    what could be the issue here?
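For context on the Jackson message itself: the JSON spec forbids numeric literals with leading zeroes, and "line: 1, column: 2" suggests the rejected value sits at the very start of whatever document Airbyte is deserializing (the log redacts it; a plausible culprit is a config value or env var holding a bare zero-padded number like 0123). Any spec-compliant parser rejects it the same way, as this Python illustration shows:

```python
import json

# JSON numbers may not carry leading zeroes, so this document is rejected:
try:
    json.loads('{"build": 0123}')
    result = "parsed"
except json.JSONDecodeError:
    result = "rejected"

# Quoting the value turns it into a legal JSON string instead:
ok = json.loads('{"build": "0123"}')
print(result, ok)  # rejected {'build': '0123'}
```

If a zero-padded value must survive, quoting it as a string (or stripping the leading zeroes) makes the document parse.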

    Moshe Netkin

    12/03/2025, 7:50 PM
    @kapa.ai getting a 502 timeout error when testing a connection:
    Airbyte is temporarily unavailable. Please try again. (HTTP 502)
    Failed to save Walmart due to the following error: errors.http.badGateway

    Andrea Brenna

    12/03/2025, 10:09 PM
    Hi @kapa.ai, all my connections are failing with 'Airbyte Platform is experiencing a higher than usual load, please try again later.'

    Liew Tze Hao Timothy

    12/04/2025, 12:21 AM
    Hey @kapa.ai, I noticed some data missing from one of my tables in the airbyte_raw schema two days ago. Is it possible to do a kind of partial refresh by deleting the past two days of records from the schema? Would Airbyte's cursor be 'reset' to two days ago?

    Ralf Santa Cruz

    12/04/2025, 12:25 AM
    @kapa.ai how does one obtain a new set of dataplane credentials? I am looking to re-issue my dataplane client ID and secret. I am using Airbyte 1.8.x.

    Fabrizio Spini

    12/04/2025, 6:16 AM
    @kapa.ai how can I upgrade abctl?