# ask-community-for-troubleshooting

    Mike B

    04/11/2023, 8:46 PM
    Hi there. I'm experiencing a lot of problems with recent releases of Airbyte. First with the initial deployment: running the bootstrap shell script would get most of the containers up and running, except for the worker (which could take hours to finally come up and successfully respond to the other containers). Now I'm experiencing intermittent failures when trying to run sync jobs that previously worked perfectly. They're reporting "Failure during reporting of activity result to the server", and then failing the run. The only possible solution I've seen recommended for either problem is to bump the machine specs, as it was running low on resources. I've monitored this server while hitting the errors, and it's never gone over ~6% CPU utilization or ~2% RAM utilization (it's on a very beefy server); there are also no limits imposed on any docker containers or services on this server. Has anyone seen anything similar to this? We've been running Airbyte with no problems since last summer, and just started seeing these problems in the past week or two.
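    The "Failure during reporting of activity result to the server" message appears to come from the Temporal SDK inside the worker, so the worker and Temporal container logs are usually the first place to look, alongside per-container resource stats. A minimal sketch, assuming the default docker-compose container names (adjust if yours differ):

        docker logs --tail 200 airbyte-worker      # worker-side exception around the failed sync
        docker logs --tail 200 airbyte-temporal    # Temporal server health / connectivity errors
        docker stats --no-stream                   # per-container CPU/memory, independent of host-level usage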

    Walker Philips

    04/11/2023, 8:50 PM
    Is there a way to increase the flush size for records used by DefaultReplicationWorker? Right now it seems restricted to 24MB

    Thiago Chiliatto

    04/11/2023, 10:02 PM
    Hi, I'm having trouble using an external config database (PostgreSQL). I've changed everything in the .env file and checked user permissions and access. I need the configuration to persist after an update.
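    For reference, a minimal sketch of the external-database settings in the docker-compose .env; the variable names are the ones documented for roughly this era of Airbyte (verify against your version), and the host below is a made-up example:

        DATABASE_USER=airbyte
        DATABASE_PASSWORD=<password>
        DATABASE_HOST=my-external-postgres.example.com
        DATABASE_PORT=5432
        DATABASE_DB=airbyte
        # the JDBC URL must point at the same external instance
        DATABASE_URL=jdbc:postgresql://my-external-postgres.example.com:5432/airbyte

    The configured user also needs enough privileges to create schemas and tables in that database so the migrations can run after an upgrade.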

    Shreepad Khandve

    04/12/2023, 4:59 AM
    Hello team, I have created a custom connector and uploaded the Docker image on the instance to get the new connector. The connector works fine locally, and on the interface I can see the emitted records as well, but when I look into the schema, the main table is blank. What could be the reason? Is this a version issue, or am I missing something?

    yuan sun

    04/12/2023, 5:44 AM
    After I pull the latest airbyte code, https://github.com/airbytehq/airbyte, the page cannot be accessed. How can I solve this problem?

    King Ho

    04/12/2023, 8:08 AM
    Hi there, we have been having issues with Airbyte deployed on a GCP VM. We are currently using a c2-standard-8, which is an 8-core, 32 GB RAM Cascade Lake machine. We have run into runtime errors during pipeline runs on Airbyte 0.41.0: the Airbyte web interface shows the standard error page (the one with the purple squid), and the Docker container needs to be kicked to start working again. My initial thought was that resource use has not been limited, so I wanted to focus on the WORKER and JOBS sections here. Am I on the right track? Is there something else I can do to find a more specific error? Any other suggestions?
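    If the goal is to cap or raise per-job resources, the JOBS-related knobs in the .env look roughly like the sketch below; the variable names are the documented ones for releases around 0.41.0, but how (and whether) they are enforced depends on the deployment type, so confirm against the docs for your version. Separately, the purple-squid page is a generic error screen, so the underlying exception is usually easier to find with docker logs on the server, worker, or job containers.

        # JOBS section of .env (empty means no explicit request/limit)
        JOB_MAIN_CONTAINER_CPU_REQUEST=
        JOB_MAIN_CONTAINER_CPU_LIMIT=
        JOB_MAIN_CONTAINER_MEMORY_REQUEST=
        JOB_MAIN_CONTAINER_MEMORY_LIMIT=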

    Lenin Mishra

    04/12/2023, 8:41 AM
    Hey folks, when I try to run python main.py spec, I get the following error:
    ValueError: 'global' is not a valid parameter name
    Can anyone help me out here? Why is this happening?

    Andrei Siniakin

    04/12/2023, 9:16 AM
    Hello team, I’m trying to get Facebook Ads data with a daily breakdown, and I found this solution earlier in this chat:
    "Hey guys, thank you for your input. I was indeed able to resolve this. I set up a custom insight with campaign_id, campaign_name, spend as the only fields and set the time increment to 1. This provides daily spend data for each campaign, which you can then aggregate to your liking."
    I did the same thing but got this error (screenshot attached). Has anyone run into the same thing? I tried to ask the person who provided the solution, but as he is offline for a few days, I hope you can also share some advice 🙂

    Asif Raza Khan

    04/12/2023, 9:42 AM
    Hi all, I am trying to use the Klaviyo connector to fetch the events stream and sync it to S3. I am not running any transformation, and when the destination is S3 Airbyte doesn't allow transformations anyway. While running the connector I am getting an error like "Failed to transform value '139.0' of type 'string' to 'number', key path: '.event_properties.items.0.price'". Full logs attached. Please let me know how I can get past this issue. Thanks in advance. Error snippet:
    2023-04-12 09:34:59 source > Starting syncing SourceKlaviyo
    2023-04-12 09:34:59 source > Syncing stream: events
    2023-04-12 09:35:00 destination > INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$addRecord$0):76 Starting a new buffer for stream events_events (current state: 0 bytes in 0 buffers)
    2023-04-12 09:35:00 source > Failed to transform value '139.0' of type 'string' to 'number', key path: '.event_properties.items.0.price'
    2023-04-12 09:35:00 source > Failed to transform value '139.0' of type 'string' to 'number', key path: '.event_properties.items.1.price'
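    The error itself is a type mismatch between the record and the stream's JSON schema: the payload carries the price as the string '139.0' while the schema declares it a number. A minimal Python sketch of the same mismatch, using the jsonschema package rather than Airbyte's internal validator:

        import jsonschema

        schema = {"type": "object", "properties": {"price": {"type": "number"}}}
        record = {"price": "139.0"}  # a string, as in the Klaviyo payload above

        try:
            jsonschema.validate(record, schema)
        except jsonschema.ValidationError as err:
            print(err.message)  # '139.0' is not of type 'number'

    In the connector framework this message is typically logged as a non-fatal warning by the record transformer, so it is worth checking whether the sync actually fails on these lines or on something later in the attached logs.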

    Arun RP

    04/12/2023, 10:20 AM
    Hi Team, we are trying to transfer/stream data from Postgres to Snowflake. We tried logical incremental replication to achieve Change Data Capture (CDC); it works fine for non-partitioned tables, but when we try it for a partitioned table it does not work, whereas we can stream the partitioned table with standard replication.

    Sushant

    04/12/2023, 11:53 AM
    Hi, as per the purchase page of Airbyte, it is mentioned that the Airbyte API will be available for open source. I hope it's the latest Airbyte API and not the Configuration API.

    Thiago Villani

    04/12/2023, 12:09 PM
    Hello, I have an MSSQL source and a MinIO (S3) destination. Due to a previously mentioned problem with syncing to .parquet, I am running the sync in .csv format on the destination. But for a single MSSQL Server table it generates around 4 or 5 CSV files in MinIO (S3). Is there a way to generate a single CSV file, or is there a limit on lines or size?

    Stefano Cascavilla

    04/12/2023, 12:30 PM
    Hello everyone, is there a way to export just the result of a query from PostgreSQL to S3 using Airbyte? Thanks!
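    One approach that needs no custom connector, sketched under the assumption that the Postgres source discovers views as streams alongside tables: wrap the query in a database view and sync that view to S3 (the view name and query below are made up for illustration).

        -- in the source database
        CREATE VIEW reporting.daily_orders AS
        SELECT order_id, customer_id, total_amount, created_at
        FROM public.orders
        WHERE created_at >= CURRENT_DATE - INTERVAL '30 days';

    The view should then appear in the connection's stream list and can be selected on its own.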

    Ba Thien Tran

    04/12/2023, 1:48 PM
    Hey everyone! Trying to set up a connector for MongoDB hosted on EC2. Has anyone successfully done this?

    Sandhya Manimaran

    04/12/2023, 2:01 PM
    Hi team, I need help setting up Salesforce in Airbyte.

    Gonzalo Bottari

    04/12/2023, 3:44 PM
    Team, good morning! I'm trying to pull some data from a MongoDB database with a lot of collections, some of them with huge volumes, but I'm only interested in just one collection. Is there any way to filter the DISCOVER step (when I create the connection)? Maybe some JSON inside one of the Docker containers I could take a look at? Otherwise I get an issue (a non-JSON schema response).

    Sandhya Manimaran

    04/12/2023, 6:55 PM
    Hi Team, I am trying to configure Facebook on Airbyte but am unable to open the Meta page to create an app.

    Vishal Doshi

    04/12/2023, 8:00 PM
    Hi there, I'm testing the S3 source with an S3 destination. The format is required, so I'm guessing not, but can Airbyte replicate arbitrary files? Or do they have to be CSV, Parquet, Avro, or JSONL?

    Nohelia Merino

    04/12/2023, 8:42 PM
    Good afternoon, I'm trying to find out whether there is currently a data source connector for Databricks, https://github.com/airbytehq/airbyte/tree/master/airbyte-integrations/connectors, or whether one is planned. I see there is a destination connector, but I haven't found one for a source.

    Simbazz

    04/12/2023, 9:07 PM
    Hi, I wonder if someone can help. Is it possible to use Podman instead of Docker for a local install? I have Podman, but when I try to start up Airbyte it looks for the Docker daemon. Any help appreciated.

    Ryan Martin-Gawn

    04/13/2023, 12:16 AM
    Hi there, I am having a niggly issue trying to get a fresh installation of Airbyte deployed on an EC2 instance. Everything goes well until I run docker compose up, at which point I get the error service "init" didn't complete successfully: exit 1, detailed only by init | exec /bin/sh: exec format error. Would anyone know how I might fix this? Airbyte version: 0.43.1, using the latest pull of the Airbyte platform repo and following the EC2 deployment instructions.
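    "exec format error" at container startup usually means the image architecture does not match the host CPU, for example an arm64 (Graviton) EC2 instance pulling amd64-only images. A quick check, with the image name and tag below taken as an illustrative example (use whatever your docker-compose file references):

        uname -m                                                                 # host: x86_64 vs aarch64
        docker image inspect airbyte/init:0.43.1 --format '{{.Architecture}}'    # architecture of the pulled image

    If the two disagree, running on an x86_64 instance (or using images built for the host architecture) is the usual fix.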

    Alyson Franklin

    04/13/2023, 12:44 AM
    Hello, I'm trying to set up state storage in AWS S3, but it doesn't work and I'm getting an error that I can't identify. I'm using AWS EKS and the Helm chart. Chart version: 0.45.8, EKS version: 1.21. Error logs below. Airbyte-Worker:
    2023-04-13 00:39:18 INFO i.a.c.t.TemporalUtils(getTemporalClientWhenConnected):276 - Temporal namespace default initialized!
    2023-04-13 00:39:18 INFO i.a.c.EnvConfigs(getEnvOrDefault):1222 - Using default value for environment variable AWS_ACCESS_KEY_ID: ''
    2023-04-13 00:39:18 INFO i.a.c.EnvConfigs(getEnvOrDefault):1222 - Using default value for environment variable AWS_SECRET_ACCESS_KEY: ''
    2023-04-13 00:39:18 INFO i.a.c.EnvConfigs(getEnvOrDefault):1222 - Using default value for environment variable METRIC_CLIENT: ''
    2023-04-13 00:39:18 INFO i.a.c.EnvConfigs(getEnvOrDefault):1222 - Using default value for environment variable METRIC_CLIENT: ''
    2023-04-13 00:39:18 WARN i.a.m.l.MetricClientFactory(initialize):74 - MetricClient was not recognized or not provided. Accepted values are `datadog` or `otel`.
    2023-04-13 00:39:19 INFO i.a.w.ApplicationInitializer(initializeCommonDependencies):186 - Initializing common worker dependencies.
    2023-04-13 00:39:19 ERROR i.m.r.Micronaut(handleStartupException):338 - Error starting Micronaut server: null
    java.lang.IllegalArgumentException: null
            at com.google.common.base.Preconditions.checkArgument(Preconditions.java:131) ~[guava-31.1-jre.jar:?]
            at io.airbyte.config.storage.DefaultS3ClientFactory.validateBase(DefaultS3ClientFactory.java:36) ~[io.airbyte.airbyte-config-config-models-0.42.1.jar:?]
            at io.airbyte.config.storage.DefaultS3ClientFactory.validate(DefaultS3ClientFactory.java:31) ~[io.airbyte.airbyte-config-config-models-0.42.1.jar:?]
            at io.airbyte.config.storage.DefaultS3ClientFactory.<init>(DefaultS3ClientFactory.java:24) ~[io.airbyte.airbyte-config-config-models-0.42.1.jar:?]
            at io.airbyte.config.helpers.CloudLogs.createCloudLogClient(CloudLogs.java:51) ~[io.airbyte.airbyte-config-config-models-0.42.1.jar:?]
            at io.airbyte.config.helpers.LogClientSingleton.createCloudClientIfNull(LogClientSingleton.java:226) ~[io.airbyte.airbyte-config-config-models-0.42.1.jar:?]
            at io.airbyte.config.helpers.LogClientSingleton.setWorkspaceMdc(LogClientSingleton.java:213) ~[io.airbyte.airbyte-config-config-models-0.42.1.jar:?]
            at io.airbyte.workers.ApplicationInitializer.initializeCommonDependencies(ApplicationInitializer.java:189) ~[io.airbyte-airbyte-workers-0.42.1.jar:?]
            at io.airbyte.workers.ApplicationInitializer.onApplicationEvent(ApplicationInitializer.java:156) ~[io.airbyte-airbyte-workers-0.42.1.jar:?]
            at io.airbyte.workers.ApplicationInitializer.onApplicationEvent(ApplicationInitializer.java:62) ~[io.airbyte-airbyte-workers-0.42.1.jar:?]
            at io.micronaut.context.event.ApplicationEventPublisherFactory.notifyEventListeners(ApplicationEventPublisherFactory.java:262) ~[micronaut-inject-3.8.7.jar:3.8.7]
            at io.micronaut.context.event.ApplicationEventPublisherFactory.access$200(ApplicationEventPublisherFactory.java:60) ~[micronaut-inject-3.8.7.jar:3.8.7]
            at io.micronaut.context.event.ApplicationEventPublisherFactory$2.publishEvent(ApplicationEventPublisherFactory.java:229) ~[micronaut-inject-3.8.7.jar:3.8.7]
            at io.micronaut.http.server.netty.NettyHttpServer.lambda$fireStartupEvents$15(NettyHttpServer.java:587) ~[micronaut-http-server-netty-3.8.7.jar:3.8.7]
            at java.util.Optional.ifPresent(Optional.java:178) ~[?:?]
            at io.micronaut.http.server.netty.NettyHttpServer.fireStartupEvents(NettyHttpServer.java:581) ~[micronaut-http-server-netty-3.8.7.jar:3.8.7]
            at io.micronaut.http.server.netty.NettyHttpServer.start(NettyHttpServer.java:298) ~[micronaut-http-server-netty-3.8.7.jar:3.8.7]
            at io.micronaut.http.server.netty.NettyHttpServer.start(NettyHttpServer.java:104) ~[micronaut-http-server-netty-3.8.7.jar:3.8.7]
            at io.micronaut.runtime.Micronaut.lambda$start$2(Micronaut.java:81) ~[micronaut-context-3.8.7.jar:3.8.7]
            at java.util.Optional.ifPresent(Optional.java:178) ~[?:?]
            at io.micronaut.runtime.Micronaut.start(Micronaut.java:79) ~[micronaut-context-3.8.7.jar:3.8.7]
            at io.micronaut.runtime.Micronaut.run(Micronaut.java:323) ~[micronaut-context-3.8.7.jar:3.8.7]
            at io.micronaut.runtime.Micronaut.run(Micronaut.java:309) ~[micronaut-context-3.8.7.jar:3.8.7]
            at io.airbyte.workers.Application.main(Application.java:15) ~[io.airbyte-airbyte-workers-0.42.1.jar:?]
    Airbyte-Server:
    2023-04-13 00:39:44 ERROR i.m.r.Micronaut(handleStartupException):338 - Error starting Micronaut server: null
    java.lang.IllegalArgumentException: null
            at com.google.common.base.Preconditions.checkArgument(Preconditions.java:131) ~[guava-31.1-jre.jar:?]
            at io.airbyte.config.storage.DefaultS3ClientFactory.validateBase(DefaultS3ClientFactory.java:36) ~[io.airbyte.airbyte-config-config-models-0.42.1.jar:?]
            at io.airbyte.config.storage.DefaultS3ClientFactory.validate(DefaultS3ClientFactory.java:31) ~[io.airbyte.airbyte-config-config-models-0.42.1.jar:?]
            at io.airbyte.config.storage.DefaultS3ClientFactory.<init>(DefaultS3ClientFactory.java:24) ~[io.airbyte.airbyte-config-config-models-0.42.1.jar:?]
            at io.airbyte.config.helpers.CloudLogs.createCloudLogClient(CloudLogs.java:51) ~[io.airbyte.airbyte-config-config-models-0.42.1.jar:?]
            at io.airbyte.config.helpers.LogClientSingleton.createCloudClientIfNull(LogClientSingleton.java:226) ~[io.airbyte.airbyte-config-config-models-0.42.1.jar:?]
            at io.airbyte.config.helpers.LogClientSingleton.setWorkspaceMdc(LogClientSingleton.java:213) ~[io.airbyte.airbyte-config-config-models-0.42.1.jar:?]
            at io.airbyte.server.LoggingEventListener.onApplicationEvent(LoggingEventListener.java:34) ~[io.airbyte-airbyte-server-0.42.1.jar:?]
            at io.airbyte.server.LoggingEventListener.onApplicationEvent(LoggingEventListener.java:21) ~[io.airbyte-airbyte-server-0.42.1.jar:?]
            at io.micronaut.context.event.ApplicationEventPublisherFactory.notifyEventListeners(ApplicationEventPublisherFactory.java:262) ~[micronaut-inject-3.8.7.jar:3.8.7]
            at io.micronaut.context.event.ApplicationEventPublisherFactory.access$200(ApplicationEventPublisherFactory.java:60) ~[micronaut-inject-3.8.7.jar:3.8.7]
            at io.micronaut.context.event.ApplicationEventPublisherFactory$2.publishEvent(ApplicationEventPublisherFactory.java:229) ~[micronaut-inject-3.8.7.jar:3.8.7]
            at io.micronaut.http.server.netty.NettyHttpServer.lambda$fireStartupEvents$15(NettyHttpServer.java:587) ~[micronaut-http-server-netty-3.8.7.jar:3.8.7]
            at java.util.Optional.ifPresent(Optional.java:178) ~[?:?]
            at io.micronaut.http.server.netty.NettyHttpServer.fireStartupEvents(NettyHttpServer.java:581) ~[micronaut-http-server-netty-3.8.7.jar:3.8.7]
            at io.micronaut.http.server.netty.NettyHttpServer.start(NettyHttpServer.java:298) ~[micronaut-http-server-netty-3.8.7.jar:3.8.7]
            at io.micronaut.http.server.netty.NettyHttpServer.start(NettyHttpServer.java:104) ~[micronaut-http-server-netty-3.8.7.jar:3.8.7]
            at io.micronaut.runtime.Micronaut.lambda$start$2(Micronaut.java:81) ~[micronaut-context-3.8.7.jar:3.8.7]
            at java.util.Optional.ifPresent(Optional.java:178) ~[?:?]
            at io.micronaut.runtime.Micronaut.start(Micronaut.java:79) ~[micronaut-context-3.8.7.jar:3.8.7]
            at io.micronaut.runtime.Micronaut.run(Micronaut.java:323) ~[micronaut-context-3.8.7.jar:3.8.7]
            at io.micronaut.runtime.Micronaut.run(Micronaut.java:309) ~[micronaut-context-3.8.7.jar:3.8.7]
            at io.airbyte.server.Application.main(Application.java:15) ~[io.airbyte-airbyte-server-0.42.1.jar:?]
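    Two details in these logs narrow it down: Guava's Preconditions.checkArgument throws IllegalArgumentException with no message (hence the bare "null"), and a few lines earlier the worker falls back to empty defaults for AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY, so the S3 validation in DefaultS3ClientFactory is most likely rejecting empty state-storage or log-storage settings. A quick, hedged way to confirm whether the values reach the pods at all (the deployment names below are assumptions based on typical chart defaults; adjust for your release):

        kubectl exec deploy/airbyte-worker -- env | grep -iE 'aws|s3|bucket|state|log'
        kubectl exec deploy/airbyte-server -- env | grep -iE 'aws|s3|bucket|state|log'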

    Anjaneyulu K

    04/13/2023, 2:35 AM
    Hi, I am new to Airbyte. I have Airbyte with Kafka as the source and Snowflake as the destination. The Kafka messages are in Avro format, but sync performance is very slow: even 40k messages took an hour. Should I add any special settings or a schema registry for Avro? Can anyone suggest?

    Saymon_says1985

    04/13/2023, 6:52 AM
    Hi everyone, I’ve got a question. I have a connection in deduped + history mode (or even just incremental append), and my goal is to run a PL/Python function on certain columns in the destination. How can I ensure that this function only runs on newly ingested data? I want it to run over everything on the first full sync and then only on the rows appended to my final table afterwards. Is there a way to do that? Source: Oracle, destination: Postgres.
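    One common pattern, sketched below, is to drive the function off Airbyte's bookkeeping column _airbyte_emitted_at (present in normalized tables of this era; check your destination schema) and keep your own watermark of what has already been processed. Every name other than _airbyte_emitted_at is made up for illustration:

        -- bookkeeping table: one row appended per run
        CREATE TABLE IF NOT EXISTS etl_watermark (processed_up_to timestamptz);

        -- run the function only on rows that arrived since the last recorded watermark
        SELECT my_plpython_fn(t.some_column)
        FROM my_final_table AS t
        WHERE t._airbyte_emitted_at >
              COALESCE((SELECT MAX(processed_up_to) FROM etl_watermark), 'epoch'::timestamptz);

        -- then advance the watermark
        INSERT INTO etl_watermark
        SELECT MAX(_airbyte_emitted_at) FROM my_final_table;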

    kelvin

    04/13/2023, 7:04 AM
    Hi everyone, I’m trying the Xero source and using Custom Connections Authentication to set it up on Airbyte. I then created a connection from the Xero source to a PostgreSQL destination, but got an error that seems to say the schema can't be fetched from Xero. Any idea how to fix this? Thanks.

    Vuong Huynh

    04/13/2023, 7:13 AM
    Hi everyone, I'm trying to load multiple CSV files into Redshift using the File source connector, and I got the error ERROR i.a.w.i.DefaultAirbyteStreamFactory(internalLog):140 - Failed to load s3://my-bucket/ . Please check File Format and Reader Options are set correctly. EmptyDataError('No columns to parse from file'). Does Airbyte support this? Thank you.
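    That EmptyDataError is pandas reporting that what it was handed contains no parseable CSV, which matches the URL in the log pointing at the bucket root (s3://my-bucket/) rather than at a concrete file. A minimal reproduction of the same pandas error outside Airbyte:

        import io

        import pandas as pd

        try:
            pd.read_csv(io.StringIO(""))  # nothing to parse
        except pd.errors.EmptyDataError as err:
            print(err)  # No columns to parse from file

    The File source reads a single file URL; for many CSVs under a prefix, the S3 source with a path pattern is usually the better fit.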

    Theo Sjöstedt

    04/13/2023, 8:08 AM
    Hi! Last week I tried to set up a Pipedrive to BigQuery sync in Airbyte Cloud. It worked, but it only syncs the default fields in Pipedrive. I contacted support and got the response that the fields are hard-coded, which makes it kind of useless, since you always add custom fields to a CRM. The only solution from support's end was to host Airbyte ourselves, something I explicitly do not want to do since we are a two-person tech team. My rambling question is: is there any point in having Airbyte Cloud at all? I've continuously been disappointed by the lack of features and the difficulty of debugging when you run into issues. I want to move away from Airflow as much as anyone, but at this stage it's not feasible.

    laila ribke

    04/13/2023, 8:39 AM
    Hi all, I've upgraded my Google Ads connector to the latest version due to the v11 sunset. The sync works great, but I get a lot of discrepancies in the data, such as costs differing between the Google Ads console and the API data. With API v11 the data matched perfectly, and I can't figure out how the v13 change impacted my query.

    Chau Vu

    04/13/2023, 9:01 AM
    Hi! I'm new to Airbyte and don't know which schedule configs I should use to handle late-arriving data. For example, I want an hourly schedule that syncs at 13:05 and only picks up records that arrived before 13:00; the extra 5 minutes is there to make sure data close to the cutoff has made it to the source. How can I achieve this?
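    If the instance supports cron scheduling on the connection, a Quartz-style expression can pin the sync to a fixed offset past the hour; the "only records before 13:00" part is governed by the source's cursor field rather than by the schedule, so the extra five minutes only helps late-arriving rows reach the source in time. Illustrative expression, assuming the connection is switched to the Cron schedule type:

        # Quartz format: seconds minutes hours day-of-month month day-of-week
        0 5 * * * ?    # at minute 5 of every hour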

    Dale Bradman

    04/13/2023, 9:20 AM
    Where can I find the source code for the standard Airbyte Normalization process?