# ask-community-for-troubleshooting
  • Charlie Getzen (07/21/2022, 9:17 PM)
    👋 I'm creating a connection from CockroachDB to Postgres. I see that Airbyte mangles the schema: bigints become "double precision", and timestamps and uuids become "character varying". Is it possible to maintain the original types? Or can I define the Postgres destination types somewhere? Thanks!
  • Patricio Lozano (07/22/2022, 3:35 AM)
    Hi! I'm using the Python CDK for an HTTP API. When I call the API, the response says the token should be in the header. I can't find any documentation on how to send a header parameter. Appreciate any guidance.
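In the Airbyte Python CDK, HTTP streams expose a `request_headers()` hook whose return value is merged into every outgoing request. A minimal, self-contained sketch of the idea (the class, token name, and URL below are illustrative, not from the thread):

```python
# Sketch of passing an auth token as a request header, assuming the API
# expects something like "Authorization: Bearer <token>". In the Airbyte
# Python CDK the analogous hook is the stream's request_headers() method;
# this standalone class just illustrates the shape.
import urllib.request


class ExampleStream:
    def __init__(self, api_token: str):
        self.api_token = api_token

    def request_headers(self) -> dict:
        # Whatever this returns is attached to every outgoing request.
        return {"Authorization": f"Bearer {self.api_token}"}

    def build_request(self, url: str) -> urllib.request.Request:
        return urllib.request.Request(url, headers=self.request_headers())


req = ExampleStream("my-secret-token").build_request("https://api.example.com/v1/items")
print(req.get_header("Authorization"))  # -> Bearer my-secret-token
```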
  • suman (07/22/2022, 6:10 AM)
    How can I add multiple webhooks to get notifications? Apologies if it is a basic question, but I couldn't find an answer online. Thank you.
  • Sujith Kumar.S (07/22/2022, 7:28 AM)
    Hi Team, do we have support for CleverTap, or is it on any roadmap?
  • Ankit Jain (07/22/2022, 9:22 AM)
    Hi, how is the connection speed between Airbyte and the Google Sheets connector? Is there a possibility to reduce the update frequency on Airbyte to faster than 15 mins? Does it take a long time (more than 5 mins?) to sync data from bigger sheets (say 50k+ rows)?
  • Mathieu Beau (07/22/2022, 11:54 AM)
    Hello, I successfully deployed my first instance of Airbyte and configured sources, destinations, and connections. But when I restarted my server, I lost my configuration and got the Airbyte welcome screen again, with no pipelines configured. Do you know where the data is stored and how to get it back? I'm on a self-hosted version and followed the tutorial to deploy. Thank you.
  • Leandro Queiroz dos Santos (07/22/2022, 12:46 PM)
    Hi, I'm trying to use a custom dbt transformation with Airbyte, but I'm stuck due to the problem described on Airbyte's forum: Airbyte cannot find the adapter on a custom dbt transformation. Does anyone know how to overcome this problem?
  • Evgenii.Sukharev (07/22/2022, 12:58 PM)
    Hello, everyone. I'm trying to integrate Airbyte into our ETL with a Kafka destination, and I've hit a problem with our company's Kafka topic naming convention: names must use dots as separators. I found that the Airbyte Kafka connector always replaces dots with underscores in topic names. For example, if I enter "airbyte.test" in the "*Topic Pattern*" field, the topic created in Kafka is "airbyte_test". Is there any way to configure Airbyte to create topics with dots in their names?
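The behavior described is consistent with a name-sanitization step in the connector that maps characters outside an allowed set to underscores. The sketch below reproduces the observed behavior; the pattern is an assumption, not the connector's actual code (note that dots are in fact legal in Kafka topic names, which is the asker's point):

```python
import re

def sanitize_topic(name: str) -> str:
    # Hypothetical sanitizer: replace anything outside a conservative
    # character set with "_". Kafka itself allows dots, so a connector
    # applying a rule like this would produce the behavior in the message.
    return re.sub(r"[^a-zA-Z0-9\-_]", "_", name)

print(sanitize_topic("airbyte.test"))  # -> airbyte_test
```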
  • nathan gonzalez (07/22/2022, 2:59 PM)
    yeah that broke a lot of things, no ConfigContainer type, etc.
  • nathan gonzalez (07/22/2022, 3:22 PM)
    actually it seems so far like just the ConfigContainer type is missing. Replaced that with json and discover works.
  • Jake Gillberg (07/22/2022, 4:37 PM)
    Hello! I'm in a situation where I will need to develop custom ELs over HTTP from 3rd parties not big enough to have standard Singer taps or Airbyte sources. A spec more modern and potentially more standardized than Singer is interesting to me (I still haven't looked deeply at the differences, tbh, so if someone has a reference there, that would be great), but the fact that Airbyte comes with a whole bunch of other stuff like orchestration is unappealing to me at this point. Is there a way to strip out and run an Airbyte EL similar to how one might run a Singer EL? I'm thinking about orchestrating via AWS Step Functions while we are in the bootstrap phase and evaluating things like Airflow and Airbyte (or a combination) for orchestration as our pipelines get more complex.
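Airbyte connectors are plain Docker images that speak a JSON-lines protocol on stdout (the `spec`/`check`/`discover`/`read` commands), so a source can in principle be run standalone, much like a Singer tap. A sketch of assembling the `read` invocation; the image name and mount path are placeholders:

```python
# Build the docker command line for running an Airbyte source's "read"
# command directly, outside the Airbyte platform. The config and catalog
# files must be mounted into the container; records arrive as JSON lines
# on stdout, which an orchestrator (e.g. a Step Functions task) can consume.
def airbyte_read_command(image: str, config: str, catalog: str) -> list:
    return [
        "docker", "run", "--rm",
        "-v", "/tmp/airbyte:/secrets",   # hypothetical host dir holding the files
        image, "read",
        "--config", f"/secrets/{config}",
        "--catalog", f"/secrets/{catalog}",
    ]

cmd = airbyte_read_command("airbyte/source-github:latest", "config.json", "catalog.json")
print(" ".join(cmd))
```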
  • Luke Morgan-Scott (07/22/2022, 7:49 PM)
    Hey Team 👋 New here and had a quick question — If I’m looking to create a custom connector between a non-supported API source and Snowflake and want an easy way to schedule that to run daily… is there a way for me to use that custom connector on Airbyte Cloud and have Cloud handle the ongoing sync scheduling and running?
  • Lukas Sandmeir (07/22/2022, 11:29 PM)
    Hey 👋 I am evaluating Airbyte, and more specifically the Postgres connector. Before I dive in deeper, I see two limitations in the docs that I would need to resolve to make the connector work for my use case:
    • We are using AWS IAM authentication for RDS. I am not seeing that as supported. Is that on the roadmap, or if not, how would I go about adding that functionality to make the connector work for me?
    • I am not seeing `daterange` supported as a column type. How would I go about adding that functionality to make the connector work for me?
  • Sana Shaikh (07/23/2022, 6:48 PM)
    Has someone used Snowflake as a source in Airbyte?
  • Yage Hu (07/23/2022, 10:40 PM)
    I noticed I can't specify the streams I want in the GitHub connector (for example, I don't want to ingest PR data).
    1. Is there any connector that does this?
    2. Hypothetically, how would I add a webapp UI option on the source connector onboarding page (if I have a custom connector)? For example, for the GitHub source connector I can only specify a few options like "start_date". Where do these options get defined?
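For context on the second question: the fields shown on the source-setup page are rendered from the connector's spec, a JSON Schema called `connectionSpecification`. A pared-down sketch of what a `start_date` option looks like (shown as a Python dict; descriptions and patterns here are illustrative, not copied from the GitHub connector):

```python
# Minimal sketch of a connector spec. The Airbyte UI renders one input
# widget per entry under "properties"; "required" controls validation.
spec = {
    "connectionSpecification": {
        "type": "object",
        "required": ["start_date"],
        "properties": {
            "start_date": {
                "type": "string",
                "description": "Only sync data created after this date.",
                "pattern": "^[0-9]{4}-[0-9]{2}-[0-9]{2}$",
            },
        },
    }
}
print(list(spec["connectionSpecification"]["properties"]))  # -> ['start_date']
```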
  • mahmoud farah (07/24/2022, 9:16 AM)
    Hello guys, I'm trying to connect Mailchimp with a Postgres DB. I made the connection and everything is fine, but some of the data is stored as a JSON value. Is there any way to filter this data and make it more accessible when saving it? Example:
    {"list_id": "7b3b92xzeb", "list_name": "USA", "segment_text": "", "list_is_active": false, "recipient_count": 1}
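One common approach is to parse the JSON value and flatten it into top-level fields in a transformation step (in Postgres itself the `->>` operator does the per-key extraction). A Python sketch using the exact payload from the message; the `segment_` prefix is an arbitrary choice:

```python
import json

# Parse the stored JSON string and flatten it into prefixed top-level
# fields, so each value becomes an ordinary queryable column.
raw = '{"list_id": "7b3b92xzeb", "list_name": "USA", "segment_text": "", "list_is_active": false, "recipient_count": 1}'
record = json.loads(raw)
flat = {f"segment_{key}": value for key, value in record.items()}
print(flat["segment_list_name"])  # -> USA
```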
  • Jaafar (07/25/2022, 9:06 AM)
    Hello. I used to use Fivetran for ingesting data from Shopify and am now trying Airbyte with a client. I noticed I have duplicates in my raw data even though I am using the replication method "Incremental | Dedup + history".
    • Is it a known issue?
    • What would you recommend to fix it? Delete duplicates with manual queries or redo a full resync?
    • Applying a select distinct clause to the raw data can be tricky for 2 reasons: 1) distinct doesn't apply to arrays, and 2) the columns _airbyte_emitted_at and _airbyte_normalized_at differ even for duplicates, so to use distinct I would need to manually list all the columns in my staging tables and exclude _airbyte_emitted_at and _airbyte_normalized_at, which is not ideal. Thanks
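The dedup idea being described (one row per business key, ignoring the per-sync `_airbyte_*` metadata) can be sketched as follows; in SQL the equivalent is typically `row_number() over (partition by <key> order by _airbyte_emitted_at desc)` filtered to 1. Column and key names below are illustrative:

```python
# Keep one row per business key; the _airbyte_emitted_at values differ
# between duplicates, so dedup must key on the business columns only.
rows = [
    {"order_id": 1, "total": 10, "_airbyte_emitted_at": "2022-07-20"},
    {"order_id": 1, "total": 10, "_airbyte_emitted_at": "2022-07-21"},  # duplicate
    {"order_id": 2, "total": 5,  "_airbyte_emitted_at": "2022-07-21"},
]
deduped = {}
for row in rows:
    deduped[row["order_id"]] = row   # last occurrence per key wins
print(len(deduped))  # -> 2
```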
  • Donald Dewulf (07/25/2022, 12:55 PM)
    Hey all, I started using Airbyte yesterday. I managed to set it up and create a source (PlanetScale) and destination (BigQuery). When doing the sync I'm getting a bunch of errors:
    2022-07-24 19:55:48 normalization > 19:55:42.951512 [error] [MainThread]: Database Error in model users_scd (models/generated/airbyte_incremental/scd/dev_planetscale_sync/users_scd.sql)
    2022-07-24 19:55:48 normalization > 19:55:42.952160 [error] [MainThread]:   Bad bool value: 1
    2022-07-24 19:55:48 normalization > 19:55:42.952742 [error] [MainThread]:   compiled SQL at ../build/run/airbyte_utils/models/generated/airbyte_incremental/scd/dev_planetscale_sync/users_scd.sql
    2022-07-24 19:55:48 normalization > 19:55:42.953318 [info ] [MainThread]: 
    2022-07-24 19:55:48 normalization > 19:55:42.953890 [error] [MainThread]: Database Error in model series_scd (models/generated/airbyte_incremental/scd/dev_planetscale_sync/series_scd.sql)
    2022-07-24 19:55:48 normalization > 19:55:42.954516 [error] [MainThread]:   Bad bool value: 0
    2022-07-24 19:55:48 normalization > 19:55:42.955168 [error] [MainThread]:   compiled SQL at ../build/run/airbyte_utils/models/generated/airbyte_incremental/scd/dev_planetscale_sync/series_scd.sql
    Any idea on what I’m doing wrong here?
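The "Bad bool value: 1" errors suggest a column typed as boolean is arriving as 1/0 (MySQL-style tinyint, which PlanetScale is built on), and the destination rejects the integer form. A sketch of the kind of coercion that resolves it if applied upstream; this is an assumption about the data, not a fix inside Airbyte itself:

```python
# Coerce MySQL-style 1/0 (and their string forms) into real booleans,
# raising on anything else, mirroring the normalization error message.
def to_bool(value):
    if value in (1, "1", "true"):
        return True
    if value in (0, "0", "false"):
        return False
    raise ValueError(f"Bad bool value: {value!r}")

print(to_bool("1"), to_bool(0))  # -> True False
```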
  • Leonardo Almeida Reis (07/25/2022, 3:11 PM)
    Hi everyone, I'm trying to set up a Postgres source following the tutorial on this page: https://airbyte.com/tutorials/postgres-replication Apparently it is not finding the necessary Airbyte images. Does anyone know how to solve this? Note: I tried to download the images manually in the terminal.
    errormessage.txt
  • Roberto Malcotti (07/25/2022, 3:15 PM)
    Guys, just want to point out that this core example about incremental syncs not only refers to different code on GitHub, but also contains the deprecated method `get_updated_state`. It seems I need a PhD to develop a proper connector.
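For reference, newer CDK versions favor a `state` property on the stream (the `IncrementalMixin` pattern) over `get_updated_state`. A standalone sketch of that pattern; the cursor field and class names are illustrative, and the details may differ across CDK versions:

```python
# Sketch of the state-property pattern that replaced get_updated_state:
# the stream owns its cursor, exposes it via a state getter/setter, and
# advances it as records are read.
class IncrementalExample:
    cursor_field = "updated_at"

    def __init__(self):
        self._cursor_value = None

    @property
    def state(self) -> dict:
        return {self.cursor_field: self._cursor_value}

    @state.setter
    def state(self, value: dict):
        self._cursor_value = value.get(self.cursor_field)

    def read_records(self, records):
        for record in records:
            # Advance the cursor as records are emitted.
            if self._cursor_value is None or record[self.cursor_field] > self._cursor_value:
                self._cursor_value = record[self.cursor_field]
            yield record

stream = IncrementalExample()
list(stream.read_records([{"updated_at": "2022-07-01"}, {"updated_at": "2022-07-25"}]))
print(stream.state)  # -> {'updated_at': '2022-07-25'}
```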
  • Aditya Nambiar (07/25/2022, 5:50 PM)
    Hey folks, I wanted to understand how _airbyte_ab_id is created. Is it based on some hash of the data or randomly generated? Is it guaranteed that if a row is emitted by Airbyte twice it will have the same _airbyte_ab_id (i.e., if it's based on a hash of the data)?
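The question contrasts two id-generation strategies; the sketch below shows what each would look like (it illustrates the difference the question is asking about, not which one Airbyte actually uses):

```python
import hashlib
import uuid

# A content hash is deterministic: the same row always yields the same id,
# so a re-emitted row would be detectable. A random UUID is not: the same
# row gets a fresh id on every emission.
row = '{"user": "a", "value": 1}'

content_hash = hashlib.sha256(row.encode()).hexdigest()
random_id = str(uuid.uuid4())

print(content_hash == hashlib.sha256(row.encode()).hexdigest())  # -> True
```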
  • Yash Makwana (07/25/2022, 9:55 PM)
    Hello Team, I'm working on a Python connector and my base_url is `http://api.example.com/v1/cust/{custID}/address`. I have a list of custID values which I want to loop through in my base_url. What would be the best approach to do this? Any examples?
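In the Airbyte CDK this is the job of `stream_slices()` together with `path()`: each slice carries one custID, and the path is formatted per slice. A standalone sketch of the mechanics (the id values are made up):

```python
# Iterate a list of ids into the templated path; in a CDK stream,
# stream_slices() would yield {"custID": ...} dicts and path() would
# apply the format, one HTTP request per slice.
BASE_URL = "http://api.example.com/v1/cust/{custID}/address"

def build_urls(cust_ids):
    for cust_id in cust_ids:
        yield BASE_URL.format(custID=cust_id)

urls = list(build_urls(["c1", "c2"]))
print(urls[0])  # -> http://api.example.com/v1/cust/c1/address
```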
  • Simon Thelin (07/26/2022, 12:32 PM)
    Does anyone know if NetSuite is available as a source in the current Airbyte release?
  • Fazimoon Samad (07/26/2022, 2:00 PM)
    👋 Hello, team!
  • Akul Goel (07/26/2022, 4:07 PM)
    Hi, can someone assist me with this issue? https://github.com/airbytehq/airbyte/issues/15038
  • Sujith Kumar.S (07/27/2022, 6:23 AM)
    Hi @Alex Marquardt (Airbyte), can you please help me clarify the points below?
    1. Does the Postgres connector for Airbyte support schema evolution?
    2. Does Airbyte support Hadoop/Hive systems as destinations?
    3. Is there any plan for a CleverTap connector?
  • Mike Smith (07/27/2022, 9:52 AM)
    Are there any good articles, or any advice on where to start in the documentation, for running Airbyte in AWS to ship data from RDS Postgres to BigQuery for analytics purposes? I have plenty of experience with Logstash, but this will be my first time working with Airbyte, so any quick-start guides would be great. The plan is to run it in an ECS cluster using Fargate.
  • Ryan Moore (07/27/2022, 12:36 PM)
    hi Airbyters - question for you… we currently use two different tools for data ingestion - Fivetran for “Cloud available” sources and Azure Data Factory (with their on-premise integration runtime) when our sources are “on-premise” and not available from the Cloud, but need to be replicated to a Cloud destination. It’s obvious to me that Airbyte can accomplish the Cloud=>Cloud syncs and it appears to me that Airbyte would have no problem handling the on-prem=>Cloud with a self-hosted installation. Am I correct about that? Any other considerations for on-premise data sources that are not part of a hybrid cloud network?
  • Kevin Phan (07/27/2022, 4:32 PM)
    hey folks, I am a little confused about the discover catalog portion of the source. Are we manually representing all of the tables/DBs that we want to pull inside of the discover function? Or is it outputting that for us?
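For context: `discover` is expected to *output* the catalog, one stream per table, each with a JSON Schema describing its columns; database sources usually introspect the database rather than hard-coding tables. A hand-rolled sketch of the shape (table and column names are made up, and the real AirbyteCatalog message has more fields):

```python
# Build a minimal discover-style catalog from a mapping of table names to
# their columns. Real connectors get this mapping by querying the source.
def discover(tables: dict) -> dict:
    return {
        "streams": [
            {
                "name": table,
                "json_schema": {
                    "type": "object",
                    "properties": {col: {"type": col_type} for col, col_type in columns.items()},
                },
            }
            for table, columns in tables.items()
        ]
    }

catalog = discover({"users": {"id": "integer", "email": "string"}})
print(catalog["streams"][0]["name"])  # -> users
```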
  • Evan Rosebrook (07/27/2022, 10:18 PM)
    Is there a way to annotate worker pods? For example:
    destination-snowflake-check-5344-0-kgvhe
    We are using the helm chart and are already annotating the worker, but not the "worker-pods". How can this be achieved?