# ask-community-for-troubleshooting

    Developer Crunchy

    10/10/2022, 7:26 AM
Hello, I have created a MySQL source in Airbyte open source and restored data into it, but the table does not appear after clicking "Refresh schema" in Airbyte. Any help with this?

    Georg Heiler

    10/10/2022, 8:00 AM
When trying to work with the GA4 connector, I get:

```
403 Client Error: Forbidden for url: https://analyticsdata.googleapis.com/v1beta/properties/<my_property>/metadata
```

Is this a bug in the GA4 connector (alpha), or a permission/networking issue on my side?

    Nicola Corda

    10/10/2022, 8:52 AM
Hey all, what is the maximum Postgres version Airbyte supports for the metadata DB? Is 14.4 supported?

    Asmaa Althakafi

    10/10/2022, 12:23 PM
Hi all, I have a question: does Airbyte support Docker Compose version 2 and above, or not?

    Qamarudeen Muhammad

    10/10/2022, 12:35 PM
Hey @channel, I am using this tutorial as the base of my implementation: https://airbyte.com/tutorials/Incremental-data-synchronization. Source: Postgres. Destination: BigQuery. Airbyte version 0.40.4 (original instance) and Airbyte version 0.40.14 (second instance on a separate machine, with nothing on it except Airbyte and the two endpoints).

The issue: during the normalization stage, immediately after "1 of 3 START view model _airbyte__public.*_stg", the following error is displayed: "unhandled error while executing model.airbyte_util.*_stg" (* = table_name). Sync mode = Incremental | deduped + history; Transformation = Tabular Normalization. With this, dbt never normalized the data, so the normalized tables were never generated; however, the intermediate tables (raw, tmp) were generated and populated.

I have the complete log file on this Airbyte Discuss page: https://discuss.airbyte.io/t/normalization-error-when-sync-mode-is-incremental-deduped-history-when-implementing-cdc-with-bigquery-as-destination/2839. Thanks

    Sheshan

    10/10/2022, 1:22 PM
Guys, I'm working on creating a new connector. As per the documentation I created spec.yaml, but the command `python main.py spec` fails with "No such file or directory: 'Airbyte/airbyte/airbyte-integrations/connectors/<>/spec.json'". It still looks for JSON instead of YAML. How do I fix this? Thanks
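
A hedged sketch of a possible fix, assuming the error comes from an older airbyte-cdk pin that only knows about spec.json (newer CDK releases also look for spec.yaml; treat that as an assumption to verify):

```
# Check which CDK version the connector environment has, then upgrade;
# re-run the failing spec command afterwards.
pip show airbyte-cdk
pip install --upgrade airbyte-cdk
python main.py spec
```

If upgrading is not an option, keeping a generated spec.json next to spec.yaml is a crude stopgap.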

    Rocky Appiah

    10/10/2022, 1:35 PM
    Are there any plans to offer a paid support model for the self-hosted versions?

    Luca Moity

    10/10/2022, 1:44 PM
Hi everyone! While working on a new connector, I can't proceed with installing the requirements:

```
Could not find a version that satisfies the requirement airbyte-cdk~=0.1.56
```

My setup: Ubuntu 20.04, Python 3.9.1. I looked for similar posts and found a few, but none of the suggestions helped me.
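
A minimal workaround sketch, assuming the failure comes from an outdated pip or packaging toolchain rather than the package index (the version pin is taken from the message above; the rest is an assumption):

```
# Create an isolated virtualenv and upgrade the packaging toolchain
# first; old pip releases sometimes fail to resolve ~= version ranges.
python3 -m venv .venv
source .venv/bin/activate
pip install --upgrade pip setuptools wheel
pip install "airbyte-cdk~=0.1.56"
```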

    Tony Lewis

    10/10/2022, 1:45 PM
Hello, I'm having an issue setting up Airbyte with Redshift.

    Nicola Corda

    10/10/2022, 2:02 PM
Hey all, is there a way to add simple auth to Airbyte deployed in k8s? A simple password would do the job; no fancy RBAC needed.

    Samantha Duggan

    10/10/2022, 6:39 PM
Hello! I’m evaluating Airbyte and running into an issue where `Incremental | deduped + history` is ‘succeeding’ but finding “no records” on an initial sync from an Aurora Postgres database into Snowflake. I was able to do an initial sync using the same source/destination and the same replication settings for `Full refresh` and `Incremental | append`. My cursor is an `updated_at` timestamp column and the table I’m testing has a primary key. Any initial ideas on why the ‘deduped + history’ mode might not be emitting any records?

    Grant Pendrey

    10/10/2022, 7:28 PM
I am prototyping a project using open source Airbyte to sync about 100 tables from a self-managed cloud SQL Server to BigQuery. For context, some of these tables are small and don’t get updated often, but others have millions of rows with 100,000+ rows written per day. Because Airbyte likes to do a “Reset all streams” for any edit to a connection’s setup, should I make a single connection for each table, so that we can make changes later without resetting every table? Or am I thinking about this the wrong way?

    Pradyumna Thakur Deshmukh

    10/10/2022, 1:35 PM
Hello guys! I am trying to build a connector using the Python CDK. Can someone help me with this error, please?

    Marcelo Pio de Castro

    10/11/2022, 12:10 AM
From my testing, Airbyte does not continue an ingestion if something went wrong in the first attempt, even though the state was saved and is registered on the connection. From what I investigated, this is because Airbyte only loads the state when the job starts and uses that state for all attempts, even though records were already flushed to the destination database. So on any failure, Airbyte just loads everything it just did again (x3 because of the default number of attempts). I opened an issue for this: https://github.com/airbytehq/airbyte/issues/17774.

Bizarrely enough, if the first job fails all its attempts and I then run a second job, it continues from where the first left off. That makes it better to configure only 1 attempt, since then the state is respected and the load is not duplicated unnecessarily.

My question is: is this by design? Can we make each attempt load the current state from the connection and continue from where the previous one stopped? Even if this is by design, there is still a bug here, because the second job should then ignore the state saved by the first attempt and start the sync from the beginning, just like the attempts do; what Airbyte does now is simply inconsistent. For me this is a showstopper, because we have very big syncs that can fail a lot, which means loading dozens of GB on every attempt for no gain at all.

    Nathan Chan

    10/11/2022, 12:46 AM
Hey team, I am very new to Airbyte and I am trying to set up ADC on Airbyte open source. I have already edited the `.env` file to point to the directory where I will place `application_default_credentials.json`, configured the volume mapping in `docker-compose.yaml`, and the `GOOGLE_APPLICATION_CREDENTIALS` env variable is accessible in both the worker and webserver containers, but I still get this error in the UI:

```
java.io.IOException: The Application Default Credentials are not available. They are available if running in Google Compute Engine. Otherwise, the environment variable GOOGLE_APPLICATION_CREDENTIALS must be defined pointing to a file defining the credentials. See https://developers.google.com/accounts/docs/application-default-credentials for more information.
```

Did I miss something?
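
A hedged debugging sketch, assuming the default docker-compose container names (the in-container path is a placeholder; use whatever the env variable actually points to): confirm the variable and the mounted file are really visible inside the running container.

```
# Verify the env variable made it into the worker container.
docker exec airbyte-worker env | grep GOOGLE_APPLICATION_CREDENTIALS
# Verify the mounted file exists at the path the variable points to.
docker exec airbyte-worker ls -l /secrets/application_default_credentials.json
```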

    Matt Webster

    10/11/2022, 1:53 AM
Hi, I’m having a problem connecting to an AWS Aurora MySQL instance with the beta Airbyte MySQL connector. When I hit the “Set up source” button, the progress bar runs for about a minute and then gives me the following message:

```
State code: 08S01; Message: Communications link failure The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
```

There’s an airbyte-worker message:

```
airbyte-worker      | 2022-10-11 01:47:42 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):109 - HikariPool-1 - Start completed.
```

And 60 seconds later:

```
airbyte-worker      | 2022-10-11 01:48:42 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):109 - HikariPool-1 - Shutdown initiated...
```

I’m using the latest server (0.40.14) and the latest connector (airbyte/source-mysql:1.0.3), connecting to MySQL 5.6 in AWS Aurora. I tried the `enabledTLSProtocols=TLSv1.2` JDBC parameter even though I’m not seeing the poolable connection error. I’ve tested my credentials connecting directly to MySQL both from inside my VPC and externally through an SSH tunnel, and it works fine everywhere except from the Airbyte server. Any other ideas for me to try? Any other ways to troubleshoot? Thanks in advance for the help!
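
A hedged connectivity check, assuming the default docker-compose bridge network (the `airbyte_default` name and the endpoint are assumptions; `docker network ls` shows the real name): this tests whether a container on Airbyte’s network can reach the Aurora endpoint at all.

```
# Run a throwaway MySQL client container on Airbyte's docker network
# and attempt a trivial query against the Aurora endpoint.
docker run --rm -it --network airbyte_default mysql:5.7 \
  mysql -h <aurora-endpoint> -P 3306 -u <user> -p -e 'SELECT 1'
```

If this hangs for ~60 seconds the same way, the problem is likely network routing or security groups rather than the connector.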

    Nathan Chan

    10/11/2022, 3:28 AM
Hey team, we get this error in our Zendesk > BQ connection: `Response Code: 429, Response Text: Pagination limit reached for offset model`. Is this something we need to configure on the Zendesk side, or can we somehow set the pagination limit in Airbyte? Thanks!

    Keshav Agarwal

    10/11/2022, 5:03 AM
Hello, how do I cancel all syncs? They have been running for the past 18 days with no movement. I am using 0.40.14 on Ubuntu 20.04, deployed using docker-compose.
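
A hedged sketch for cancelling a stuck job through the OSS HTTP API, assuming the server answers on localhost:8000 (the job id is a placeholder; it appears in the UI’s sync history and in the job URL):

```
# Cancel one running job by id via the config API.
curl -X POST http://localhost:8000/api/v1/jobs/cancel \
  -H "Content-Type: application/json" \
  -d '{"id": 123}'
```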

    Zachary Damcevski

    10/11/2022, 5:11 AM
Hey all, I'm just looking into Airbyte and ran into a similar issue as mentioned here. I'm looking to set up S3 as a destination, but at our organisation we have temporary keys. The main issue is that testing locally becomes impossible, as we aren't allowed to have any users with static IAM credentials. Does anyone know if there has been any resolution to this?

    Aditya Shelke

    10/11/2022, 6:38 AM
I am working on this issue, and for that I need a Dremio personal access token. When I create an account on their platform, they redirect me to AWS to connect my AWS account, and completing the AWS billing details is required. I am a student and I do not have a credit card because of age restrictions in my country. How should I proceed? Is it not possible without an AWS account? In that case, I request to be assigned to work on adding a different low-code CDK connector instead.

    Mark Elayan

    10/11/2022, 7:10 AM
Hi team! Thanks for a wonderful product, best wishes for the future. I have a small issue I'm trying to figure out. I am running Airbyte under Docker hosted on an AWS A1 ARM64 instance (a1.xlarge). Everything runs fine and works great until it comes to basic normalization, which fails with the following error in the logs:

```
airbyte-worker      | 2022-10-11 07:02:14 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/1/0/normalize --log-driver none --name normalization-normalize-1-0-jmoyq --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_VERSION=0.40.14 airbyte/normalization:0.2.22 run --integration-type postgres --config destination_config.json --catalog destination_catalog.json
airbyte-worker      | 2022-10-11 07:02:16 normalization > [FATAL tini (7)] exec /airbyte/entrypoint.sh failed: Exec format error
airbyte-worker      | 2022-10-11 07:02:16 INFO i.a.w.g.DefaultNormalizationWorker(run):82 - Normalization executed in 36 seconds.
airbyte-worker      | 2022-10-11 07:02:16 INFO i.a.w.t.TemporalAttemptExecution(lambda$getWorkerThread$2):164 - Completing future exceptionally...
airbyte-worker      | io.airbyte.workers.exception.WorkerException: Normalization Failed.
airbyte-worker      |   at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:92) ~[io.airbyte-airbyte-workers-0.40.14.jar:?]
airbyte-worker      |   at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:27) ~[io.airbyte-airbyte-workers-0.40.14.jar:?]
airbyte-worker      |   at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:161) ~[io.airbyte-airbyte-workers-0.40.14.jar:?]
airbyte-worker      |   at java.lang.Thread.run(Thread.java:833) ~[?:?]
```

I ran the same installation on my localhost (Apple M1) and everything worked fine with good results. I used the following for both deployments:

```
git clone https://github.com/airbytehq/airbyte.git
cd airbyte
docker-compose up
```

Appreciate any tips if possible.
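
`Exec format error` usually means the container image was built for a different CPU architecture than the host. A quick check that assumes nothing beyond the image tag visible in the log:

```
# If this prints linux/amd64 on an arm64 (A1) host, the normalization
# image is not built for this platform and cannot execute.
docker image inspect airbyte/normalization:0.2.22 \
  --format '{{.Os}}/{{.Architecture}}'
```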

    Rachel RIZK

    10/11/2022, 7:22 AM
Hello! I opened an issue about the Bing Ads connector and I'd like to contribute a PR to fix it. However, the contributing guide says I need to assign the issue to myself, and I don't have the rights to do so (no one else is assigned). How can I proceed? Thanks 🙂

    高松拳人

    10/11/2022, 8:11 AM
I have started Airbyte on GKE. When I try to establish a connection with BigQuery, I get an error like this:

```
Internal Server Error: The specified bucket does not exist.
```

However, the specified bucket does in fact exist, and `job-logging/workspace` has actually been created inside it. What is wrong?

    Ahmad Zamrik

    10/11/2022, 9:02 AM
It’s something related to the MinIO logs config when logging to S3. Does anyone know how to configure it? I’m facing the same issue (tested with BQ & Redshift as the target), and the error logs don’t print the name of the bucket that Airbyte is trying to connect to.

    Selman Ay

    10/11/2022, 9:14 AM
Hello team, is it possible to increase the read fetch size in the MS SQL Server source connector? I’m running a load test with 50 million records and it’s very slow so far, as it’s fetching data in 1k batches. Any help is appreciated, thanks 🙂

    Rahul Borse

    10/11/2022, 10:45 AM
Hello team, I am new to Airbyte and we are planning to use it to export data from a Postgres table to an S3 bucket. I wanted to know how I can integrate Airbyte with a Spring Boot application. Any help is appreciated!
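
One hedged integration sketch: Airbyte exposes an HTTP config API, so a Spring Boot service can drive it with any HTTP client (RestTemplate, WebClient, etc.). Assuming the server runs on localhost:8000 and the Postgres-to-S3 connection is already configured (the connection id is a placeholder):

```
# Trigger a manual sync of an existing connection from any caller.
curl -X POST http://localhost:8000/api/v1/connections/sync \
  -H "Content-Type: application/json" \
  -d '{"connectionId": "<connection-uuid>"}'
```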

    Kazi Khayruzzaman

    10/11/2022, 10:19 AM
Hello team, I'm trying out the Airbyte low-code connector. Airbyte released part 2 of the tutorial on YouTube yesterday, and I was following it along with the steps in the docs. During this step, https://docs.airbyte.com/connector-development/config-based/tutorial/incremental-reads, after adding the `stream_slicer` and just before moving on to https://docs.airbyte.com/connector-development/config-based/tutorial/incremental-reads#supporting-incremental-syncs, we check whether the stream slicer is working. I ran this command:

```
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
```

But I'm getting the error in the attached log. Can anyone help, please?

error_log.txt

    Donk

    10/11/2022, 11:35 AM
Hi all, new to Airbyte as of today. We are very hopeful that we can use it for most of our integrations, but we hit a problem right away. We use IBM Db2 Warehouse on Cloud for our data warehouse, and we need the IBM Db2 connector as a destination. How come it is available as a source but not as a destination? BR,

    Nikita Kogut

    10/11/2022, 12:09 PM
Hi! Can someone please clarify a small point: do `sourceDefinitionId`s differ from machine to machine for built-in connectors like S3 / Postgres? I need to hardcode that value in an automated script, and I'm not sure whether it will be different on another machine and require changes.
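
Rather than hardcoding the value, the id can be resolved by name at runtime through the config API, which sidesteps the question entirely. A hedged sketch, assuming the server on localhost:8000 and `jq` installed:

```
# List all source definitions and extract the id for the Postgres source.
curl -s -X POST http://localhost:8000/api/v1/source_definitions/list \
  -H "Content-Type: application/json" -d '{}' |
  jq -r '.sourceDefinitions[] | select(.name == "Postgres") | .sourceDefinitionId'
```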

    Florian Melki

    10/11/2022, 1:27 PM
Hi! Does someone know what the NEED_STATE_VALIDATION env variable is used for? And what happens if I set it to false?