# ask-community-for-troubleshooting

  • Tiger Sun (02/21/2023, 3:34 AM)
    Hi, I'm quite new to Airbyte, currently trying it out via the UI locally, but things are extremely slow and sometimes requests never terminate (e.g. "test + save this connection" never actually completing). Am I doing something wrong? Somewhat separately, I'm getting a lot of
    The requested image's platform (linux/amd64) does not match the detected host platform (linux/arm64/v8) and no specific platform was requested
    but I'm not sure how to specify a platform (working on arm64 with an M1 MacBook).
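
    For the platform warning, a possible way to pin the image platform explicitly (a sketch, assuming Docker Desktop on Apple Silicon; note that running amd64 images under emulation can itself explain a lot of the slowness):

    # Pin the platform for everything Compose starts in this shell session
    export DOCKER_DEFAULT_PLATFORM=linux/amd64
    docker compose up -d

    # Or pin it for a single pull (the connector image name here is just an example)
    docker pull --platform linux/amd64 airbyte/source-postgres:latest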

  • Tmac Han (02/21/2023, 4:18 AM)
    Hi team, how do I add the normalization Docker image to a custom airbyte-destination? I have built my dbt model and normalization Docker image, and I want to debug it.
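
    One generic way to debug a custom normalization image outside of Airbyte is to open a shell in it and run dbt by hand (a sketch; the image tag and mounted project path are placeholders for your own build):

    # Open an interactive shell inside the custom normalization image
    docker run --rm -it --entrypoint /bin/bash my-org/custom-normalization:dev

    # Or mount a local dbt project into the container and debug the models directly
    docker run --rm -it -v "$(pwd)/dbt_project:/workspace" -w /workspace \
      --entrypoint /bin/bash my-org/custom-normalization:dev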

  • Manav Kothari (02/21/2023, 5:23 AM)
    Hi there, I am trying to connect ClickHouse as the destination. Everything was working fine until yesterday, but today I am getting this error. Am I missing anything?
    2023-02-21 05:11:16 INFO i.a.c.i.LineGobbler(voidCall):114 - Checking if airbyte/destination-clickhouse:0.2.2 exists...
    2023-02-21 05:11:16 INFO i.a.c.i.LineGobbler(voidCall):114 - airbyte/destination-clickhouse:0.2.2 not found locally. Attempting to pull the image...
    2023-02-21 05:11:26 INFO i.a.c.i.LineGobbler(voidCall):114 - Image does not exist.
    2023-02-21 06:46:55 ERROR i.a.w.g.DefaultCheckConnectionWorker(run):124 - Unexpected error while checking connection: 
    io.airbyte.workers.exception.WorkerException: Could not find image: airbyte/destination-clickhouse:0.2.2
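
    A quick way to tell whether the problem is the tag itself or the host's ability to pull it is to try the pull by hand on the machine running Airbyte's Docker daemon (a sketch; the tag is the one from the log above):

    # Does the registry actually serve this tag from this host?
    docker pull airbyte/destination-clickhouse:0.2.2

    # Which destination-clickhouse tags are already present locally?
    docker images airbyte/destination-clickhouse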

  • Manikandan Ruppa Sukabrammam (02/21/2023, 5:42 AM)
    Hi Airbyte team, how can I alter the DISCOVER_LIMIT in the source-mongodb-v2 image? Currently it's set to 10000, but I want to reduce this to 1000.
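
    As far as I can tell that limit is a constant in the connector's Java code, so lowering it means editing the source, rebuilding the image, and pointing the source at your custom tag. A rough sketch, assuming the monorepo layout and the Gradle task name used around that time (check the connector README for the exact commands):

    git clone https://github.com/airbytehq/airbyte.git
    cd airbyte
    # ...edit the DISCOVER_LIMIT constant under airbyte-integrations/connectors/source-mongodb-v2/ (10000 -> 1000)...

    # Build a local connector image (task name assumed from the connector README of that era)
    ./gradlew :airbyte-integrations:connectors:source-mongodb-v2:airbyteDocker

    # Verify the locally built tag, then set it as the source's image tag in the UI
    docker images airbyte/source-mongodb-v2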

  • Asadbek Muminov (02/21/2023, 8:02 AM)
    Hello, I want to pull all data from Stripe, so I'm leaving start_date blank as written in the docs. I'm making a POST API request to /api/v1/sources/create with the following body:
    {
      "sourceDefinitionId": "e094cb9a-26de-4645-8761-65c0c425d1de",
      "workspaceId": "3cef7489-8f6d-4883-b9ae-39370f1db5ac",
      "connectionConfiguration": {
        "account_id": "accountIdHereReplaced",
        "client_secret": "clientSercretReplaced",
        "start_date": ""
      },
      "name": "426542183048155136"
    }
    But I'm getting a 422 Unprocessable Entity (RFC 4918) error when I leave start_date blank in the Stripe source:
    "Caused by: io.airbyte.validation.json.JsonValidationException: json schema validation failed when comparing the data to the json schema. ",
            "Errors: $.start_date: does not match the regex pattern ^[0-9]{4}-[0-9]{2}-[0-9]{2}T[0-9]{2}:[0-9]{2}:[0-9]{2}Z$ ",
            "Schema: ",
    I'm getting the same error for the HubSpot and Shopify sources. Any way to solve this?
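
    The validation error is just the start_date regex rejecting the empty string, so one workaround is to send a very early timestamp in the expected format instead of "" (a sketch; the API host and any auth depend on how your deployment exposes the Airbyte API, and the IDs are the placeholders from the message above):

    curl -X POST http://localhost:8000/api/v1/sources/create \
      -H "Content-Type: application/json" \
      -d '{
        "sourceDefinitionId": "e094cb9a-26de-4645-8761-65c0c425d1de",
        "workspaceId": "3cef7489-8f6d-4883-b9ae-39370f1db5ac",
        "connectionConfiguration": {
          "account_id": "accountIdHereReplaced",
          "client_secret": "clientSercretReplaced",
          "start_date": "2017-01-25T00:00:00Z"
        },
        "name": "426542183048155136"
      }'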

  • Tmac Han (02/21/2023, 8:59 AM)
    Can anyone help me with this issue? https://github.com/airbytehq/airbyte/issues/23267

  • Gautam B (02/21/2023, 10:24 AM)
    We need to sync data from multiple tenants (having different apps like GA4, Shopify, etc.) and run tenant-specific dbt transformations on them. For this we are using the dbtArguments JSON key of the operator_dbt column to pass the --vars argument to dbt, so it knows which source tenant schema to write to. It looks like this:
    select operator_dbt from operation o ;
    
    {"gitRepoUrl": "<https://oauth2>:<key>@github.com/<org>/<repo>-etl.git", "dockerImage": "<http://ghcr.io/dbt-labs/dbt-postgres:1.4.1|ghcr.io/dbt-labs/dbt-postgres:1.4.1>", "dbtArguments": "run --vars \"{source_schema: tenant_5f9d3c9}\"", "gitRepoBranch": "main"}
    For each of the connections we will have a separate operation with a different tenant_id. However, when running it with this JSON, Airbyte is somehow stripping the " characters from the "dbtArguments" value. We tried all combinations and each time the " was stripped, as visible in the logs below.
    2023-02-21 08:50:57 INFO i.a.c.i.LineGobbler(voidCall):114 - Checking if ghcr.io/dbt-labs/dbt-postgres:1.4.1 exists...
    2023-02-21 08:50:57 INFO i.a.c.i.LineGobbler(voidCall):114 - ghcr.io/dbt-labs/dbt-postgres:1.4.1 was found locally.
    2023-02-21 08:50:57 INFO i.a.w.p.DockerProcessFactory(create):125 - Creating docker container = dbt-postgres-custom-21-0-ilecx with resources io.airbyte.config.ResourceRequirements@7a3b5d56[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=] and allowedHosts null
    2023-02-21 08:50:57 INFO i.a.w.p.DockerProcessFactory(create):170 - Preparing command: docker run --rm --init -i -w /data/21/0/transform --log-driver none --name dbt-postgres-custom-21-0-ilecx --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_VERSION=0.40.32 --entrypoint /bin/bash ghcr.io/dbt-labs/dbt-postgres:1.4.1 entrypoint.sh run --vars source_schema: tenant_5f9d3c9
    2023-02-21 08:50:57 dbt > Running from /data/21/0/transform/git_repo
    2023-02-21 08:50:57 dbt > detected no config file for ssh, assuming ssh is off.
    2023-02-21 08:50:57 dbt > Running: dbt run --vars source_schema: tenant_5f9d3c9 --profiles-dir=/data/21/0/transform --project-dir=/data/21/0/transform/git_repo
    2023-02-21 08:50:58 dbt > usage: dbt [-h] [--version] [-r RECORD_TIMING_INFO] [-d]
    2023-02-21 08:50:58 dbt >      [--log-format {text,json,default}] [--no-write-json]
    2023-02-21 08:50:58 dbt >      [--use-colors | --no-use-colors] [--printer-width PRINTER_WIDTH]
    2023-02-21 08:50:58 dbt >      [--warn-error | --warn-error-options WARN_ERROR_OPTIONS]
    2023-02-21 08:50:58 dbt >      [--no-version-check] [--partial-parse | --no-partial-parse]
    2023-02-21 08:50:58 dbt >      [--use-experimental-parser] [--no-static-parser]
    2023-02-21 08:50:58 dbt >      [--profiles-dir PROFILES_DIR] [--no-anonymous-usage-stats] [-x]
    2023-02-21 08:50:58 dbt >      [-q] [--no-print]
    2023-02-21 08:50:58 dbt >      [--cache-selected-only | --no-cache-selected-only]
    2023-02-21 08:50:58 dbt >      {docs,source,init,clean,debug,deps,list,ls,build,snapshot,run,compile,parse,test,seed,run-operation}
    2023-02-21 08:50:58 dbt >      ...
    2023-02-21 08:50:58 dbt > dbt: error: unrecognized arguments: tenant_5f9d3c9
    Some other combinations that we tried:
    "run --vars \"{source_schema: tenant_5f9d3c9}\""
    "run --vars '{source_schema: tenant_5f9d3c9}'"

  • Rithick (02/21/2023, 11:50 AM)
    I'm currently looking into running streams in parallel, and I'm aware of https://github.com/airbytehq/airbyte/issues/7750. What is the current progress on that one, and is there any other way of doing this as of now? I'm currently fetching object metadata from a list of S3 buckets in a single source and dynamically creating a list of streams to do that. As a whole, though, the data is large and it would help to run the streams in parallel.

  • Willi (02/21/2023, 1:03 PM)
    Hello, is it possible to manually modify the cursor of a custom connector configured as "incremental append" with a cursor field? I realized that the data for my cursor field createdAt > '2023-02-20 is wrong and deleted it from my raw table and also the airbyte_raw_... table. However, the Airbyte cursor state is unaffected by that, so a manual "Sync now" loads 0 records. Can I modify the cursor manually somehow, or is my only option to reset the entire data for this connection and backfill it from scratch?
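
    If you do end up editing the cursor by hand, the per-connection state is stored as JSON in Airbyte's internal Postgres database; a sketch for inspecting it first (container, user, database, and table names are the docker-compose defaults as I remember them around that version, so verify them and back up the DB before any UPDATE):

    docker exec -it airbyte-db psql -U docker -d airbyte \
      -c "SELECT connection_id, state FROM state WHERE connection_id = '<your-connection-uuid>';"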

  • Vitaly Stakhov (02/21/2023, 1:16 PM)
    Hello all, what is the plan to get more destinations to at least Beta stage? I can see there are a bunch of destination connectors in Alpha, but at the same time there is a note at the top:
    We strongly discourage using alpha releases for production use cases

  • Tobias Troelsen (02/21/2023, 1:23 PM)
    Discovering schema failed in the Webflow alpha connector. I get an error when trying to connect my Webflow source to a destination. Logs in thread. Any idea as to why that is? 🙂 Thanks!

  • Jean Lorillon (02/21/2023, 1:57 PM)
    Hey - I'm new to Airbyte and wanted to try the product locally. It seems like the open source quick start isn't working: I cloned the repo https://github.com/airbytehq/airbyte.git, but when executing the command to launch it, ./run-ab-platform.sh, I get many errors; docker compose up doesn't work either.

  • Jean Lorillon (02/21/2023, 1:58 PM)
    (errors on bootloader + Unable to create SQL database + Unable to setup SQL schema +)
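
    When the quick start fails like this, the bootloader and database logs usually contain the underlying cause; a sketch (service names come from the bundled docker-compose file and may differ per version, so check docker compose ps for the exact ones):

    # Which services came up, and which exited?
    docker compose ps

    # Read the logs of the pieces that create and migrate the database
    docker compose logs bootloader db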

  • Layth Al-Ani (02/21/2023, 2:13 PM)
    Hello Airbyte geeks, I am still getting the 413 Request Entity Too Large issue, which was supposed to be solved in Dec. 2022. I am running Airbyte version 0.40.32 using Helm chart version 0.43.36.
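
    If the 413 is returned by an nginx ingress in front of the webapp/server rather than by Airbyte itself, the request-body cap can be raised on that Ingress; a sketch with a hypothetical ingress name and size:

    kubectl annotate ingress airbyte-ingress \
      nginx.ingress.kubernetes.io/proxy-body-size=16m --overwrite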

  • Balasubramanian T K (02/21/2023, 2:24 PM)
    Can anyone say why docker-compose up is not working in the base directory?

  • Hamid Shariati (02/21/2023, 2:28 PM)
    Hi, can I use Airbyte to migrate data between two versions of a web app?

  • Vitaly Stakhov (02/21/2023, 2:45 PM)
    Airbyte is an ELT platform, meaning that transformation is supposed to happen within a destination. However, I can see there is the Heap Analytics destination, which is not a generic warehouse/DB/queue; the data mapping for the Heap Analytics destination is configured within Airbyte and done by Airbyte. Does that mean custom application destination connectors can be created that will support ETL from all available Airbyte sources?

  • Mehmet Berk Souksu (02/21/2023, 3:17 PM)
    Hello, I am new to Airbyte and I am trying it with a Freshdesk source and Postgres as the destination.
    2023-02-20 17:16:02 source > Backing off _send(...) for 0.0s (airbyte_cdk.sources.streams.http.exceptions.UserDefinedBackoffException: Request URL: /v2/tickets/799613/conversations?per_page=100, Response Code: 429, Response Text: )
    2023-02-20 17:16:02 source > Retrying. Sleeping for 3044.0 seconds
    There is a waiting time of 3044 seconds, which takes a lot of time. Is there a way to reduce this wait time, or is it a limitation that we have to accept? Also, I see that the Freshdesk API only brings the last 30 days of Tickets; if we give a Start Date to the source, does it bring all the tickets from that date? Thank you

  • Justen Walker (02/21/2023, 3:19 PM)
    Are there any recommendations for setting requests/limits on CPU/memory for the various Airbyte components in Kubernetes? By default they seem to be unset (i.e. unlimited).
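
    There don't seem to be one-size-fits-all numbers, but the job pods that Airbyte launches take their requests/limits from worker environment variables, so setting those explicitly is a reasonable starting point; a sketch with placeholder values to tune (the variable names are the ones Airbyte documents for job containers):

    # Example values only - size these to your own syncs, e.g. via Helm values or the worker env
    JOB_MAIN_CONTAINER_CPU_REQUEST=0.5
    JOB_MAIN_CONTAINER_CPU_LIMIT=2
    JOB_MAIN_CONTAINER_MEMORY_REQUEST=1Gi
    JOB_MAIN_CONTAINER_MEMORY_LIMIT=4Gi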

  • Robert Put (02/21/2023, 3:42 PM)
    Have releases paused for OSS? I know there was the repo re-org happening, but I think that would be done by now?

  • Justen Walker (02/21/2023, 3:58 PM)
    Are there any issues expected when running more than one server component on Kubernetes (for redundancy)? I had 3 instances and I feel like they might be doing some locking on the DB and conflicting with each other. We had an instance with Docker Compose, which only has 1 server, and it seems to work better.

  • Kevin Phan (02/21/2023, 4:13 PM)
    Hi Airbyte team! Where in MinIO(?) does Airbyte store temp data? I see files such as c0563d6c-fa1c-46da-aef9-86687ab77da210194541268806076752.csv.gz coming in and would like to inspect them.
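
    If it is the bundled MinIO, it speaks the S3 API, so any S3 client pointed at its endpoint can list and download those files; a sketch (the endpoint, credentials, and bucket are placeholders - take the real values from your deployment's .env / MinIO settings):

    export AWS_ACCESS_KEY_ID=minio
    export AWS_SECRET_ACCESS_KEY=minio123
    aws --endpoint-url http://localhost:9000 s3 ls
    aws --endpoint-url http://localhost:9000 s3 ls s3://<bucket-from-your-config>/ --recursive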

  • Robert Put (02/21/2023, 6:12 PM)
    On 1/29 I upgraded to version 0.40.30 and since then I have had massive spikes in EBS byte balance usage for my replica DBs that Airbyte is connecting to. This is for Postgres to Snowflake replication. I know 0.40.31/32 had issues; was there anything in .30?

  • Tiger Sun (02/21/2023, 7:09 PM)
    Hi, I'm new to Airbyte and trying to work with it locally, but the experience is quite slow. Things time out and get stuck loading for many minutes before failing. I've restarted via docker-compose down -v many times and worked from a fresh restart. I've spun up a local Postgres instance with just default values (user: postgres) that I'm trying to sync to (using just the default Pokemon API source), but it never syncs correctly and always times out or fails. Does anyone have advice on the best way to debug (network requests aren't that useful) or develop locally? Thanks!
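
    For local debugging, the container logs are usually far more useful than the browser's network tab; a sketch from the repo root (service names may differ slightly per version - docker compose ps lists them):

    # Tail the server and worker while reproducing the slow or failing action
    docker compose logs -f --tail=100 server worker

    # Check whether containers are starved for CPU/memory, a common cause of timeouts on laptops
    docker stats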

  • user (02/21/2023, 7:45 PM)
    Hello Saif Abid, it's been a while without an update from us. Are you still having problems or did you find a solution?

  • Malik (02/21/2023, 8:29 PM)
    I understand that AWS ECS is not currently supported because of the docker-in-docker limitation. I would like to use AWS EKS with Fargate, but I was not sure if the Fargate option has a similar limitation. Is it possible to deploy Airbyte on AWS EKS with Fargate, or do I need to use EKS with provisioned/managed nodes?

  • Carolina Buckler (02/21/2023, 9:17 PM)
    Any plans for a Salesforce destination connector?

  • Gunnar Lykins (02/21/2023, 9:34 PM)
    Hi there, I am trying to deploy my Airbyte Helm chart onto an EKS cluster; however, I am running into the following error on one of the worker pods:
    ERROR i.m.r.Micronaut(handleStartupException):338 - Error starting Micronaut server: null
    java.lang.IllegalArgumentException: null
            at com.google.common.base.Preconditions.checkArgument(Preconditions.java:131) ~[guava-31.1-jre.jar:?]
            at io.airbyte.config.storage.DefaultS3ClientFactory.validate(DefaultS3ClientFactory.java:32) ~[io.airbyte.airbyte-config-config-models-0.40.21.jar:?]
            at io.airbyte.config.storage.DefaultS3ClientFactory.<init>(DefaultS3ClientFactory.java:24) ~[io.airbyte.airbyte-config-config-models-0.40.21.jar:?]
            at io.airbyte.config.helpers.CloudLogs.createCloudLogClient(CloudLogs.java:45) ~[io.airbyte.airbyte-config-config-models-0.40.21.jar:?]
            at io.airbyte.config.helpers.LogClientSingleton.createCloudClientIfNull(LogClientSingleton.java:164) ~[io.airbyte.airbyte-config-config-models-0.40.21.jar:?]
            at io.airbyte.config.helpers.LogClientSingleton.setWorkspaceMdc(LogClientSingleton.java:151) ~[io.airbyte.airbyte-config-config-models-0.40.21.jar:?]
            at io.airbyte.workers.ApplicationInitializer.initializeCommonDependencies(ApplicationInitializer.java:188) ~[io.airbyte-airbyte-workers-0.40.21.jar:?]
            at io.airbyte.workers.ApplicationInitializer.onApplicationEvent(ApplicationInitializer.java:152) ~[io.airbyte-airbyte-workers-0.40.21.jar:?]
            at io.airbyte.workers.ApplicationInitializer.onApplicationEvent(ApplicationInitializer.java:64) ~[io.airbyte-airbyte-workers-0.40.21.jar:?]
            at io.micronaut.context.event.ApplicationEventPublisherFactory.notifyEventListeners(ApplicationEventPublisherFactory.java:262) ~[micronaut-inject-3.7.4.jar:3.7.4]
            at io.micronaut.context.event.ApplicationEventPublisherFactory.access$200(ApplicationEventPublisherFactory.java:60) ~[micronaut-inject-3.7.4.jar:3.7.4]
            at io.micronaut.context.event.ApplicationEventPublisherFactory$2.publishEvent(ApplicationEventPublisherFactory.java:229) ~[micronaut-inject-3.7.4.jar:3.7.4]
            at io.micronaut.http.server.netty.NettyHttpServer.lambda$fireStartupEvents$15(NettyHttpServer.java:587) ~[micronaut-http-server-netty-3.7.4.jar:3.7.4]
            at java.util.Optional.ifPresent(Optional.java:178) ~[?:?]
            at io.micronaut.http.server.netty.NettyHttpServer.fireStartupEvents(NettyHttpServer.java:581) ~[micronaut-http-server-netty-3.7.4.jar:3.7.4]
            at io.micronaut.http.server.netty.NettyHttpServer.start(NettyHttpServer.java:298) ~[micronaut-http-server-netty-3.7.4.jar:3.7.4]
            at io.micronaut.http.server.netty.NettyHttpServer.start(NettyHttpServer.java:104) ~[micronaut-http-server-netty-3.7.4.jar:3.7.4]
            at io.micronaut.runtime.Micronaut.lambda$start$2(Micronaut.java:81) ~[micronaut-context-3.7.4.jar:3.7.4]
            at java.util.Optional.ifPresent(Optional.java:178) ~[?:?]
            at io.micronaut.runtime.Micronaut.start(Micronaut.java:79) ~[micronaut-context-3.7.4.jar:3.7.4]
            at io.micronaut.runtime.Micronaut.run(Micronaut.java:323) ~[micronaut-context-3.7.4.jar:3.7.4]
            at io.micronaut.runtime.Micronaut.run(Micronaut.java:309) ~[micronaut-context-3.7.4.jar:3.7.4]
            at io.airbyte.workers.Application.main(Application.java:12) ~[io.airbyte-airbyte-workers-0.40.21.jar:?]
    I have tried using 0.43.22 and rolled back to 0.41.3 and am still running into the same error. Any recommendations, or has anyone else had luck resolving this error?
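
    That stack trace is the worker validating its S3 log-storage settings at startup, which usually points to the log-storage values in the chart being only partially filled in (for example an empty bucket or region); a sketch of the variables the worker ends up needing when logs go to S3 (names from the Airbyte log-storage docs, values are placeholders):

    S3_LOG_BUCKET=my-airbyte-logs
    S3_LOG_BUCKET_REGION=us-east-1
    AWS_ACCESS_KEY_ID=<key-with-access-to-that-bucket>
    AWS_SECRET_ACCESS_KEY=<secret>
    S3_MINIO_ENDPOINT=            # leave empty when using real S3
    S3_PATH_STYLE_ACCESS=         # leave empty when using real S3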

  • Lucas Wiley (02/21/2023, 11:01 PM)
    I'm getting normalization failures with the error message SQL compilation error: cannot change column ID from type VARIANT to NUMBER(38,0). Oddly, a reset of the stream does not seem to fix the problem. Is this a known issue? Running pg > snowflake on Airbyte v0.40.30.

  • Nathan Chan (02/22/2023, 1:18 AM)
    Hi team, we are having an issue with our Zendesk Support connector, and this error was shown on our ingestion to BigQuery. Any chance you know what this is about?
    at io.airbyte.workers.general.DefaultReplicationWorker.replicate(DefaultReplicationWorker.java:196) ~[io.airbyte-airbyte-commons-worker-0.40.30.jar:?]
    		at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:175) ~[io.airbyte-airbyte-commons-worker-0.40.30.jar:?]
    		at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:91) ~[io.airbyte-airbyte-commons-worker-0.40.30.jar:?]
    		at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$5(TemporalAttemptExecution.java:195) ~[io.airbyte-airbyte-workers-0.40.30.jar:?]
    		at java.lang.Thread.run(Thread.java:1589) ~[?:?]
    Caused by: io.airbyte.workers.internal.exception.SourceException: Source cannot be stopped!
    	at io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromSrcAndWriteToDstRunnable$7(DefaultReplicationWorker.java:392) ~[io.airbyte-airbyte-commons-worker-0.40.30.jar:?]
    	at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
    	... 1 more
    Caused by: io.airbyte.workers.exception.WorkerException: Source process exit with code 1. This warning is normal if the job was cancelled.
    	at io.airbyte.workers.internal.DefaultAirbyteSource.close(DefaultAirbyteSource.java:155) ~[io.airbyte-airbyte-commons-worker-0.40.30.jar:?]
    	at io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromSrcAndWriteToDstRunnable$7(DefaultReplicationWorker.java:390) ~[io.airbyte-airbyte-commons-worker-0.40.30.jar:?]
    	at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
    	... 1 more