# ask-community-for-troubleshooting
  • Harish Garg

    12/27/2021, 9:58 AM
    Hi, an Airbyte newbie here. Is it possible to connect to sources like Google Analytics or Facebook Ads accounts using an OAuth flow where you select an account and grant access? I am running the open-source version locally.
    ✅ 1
  • Saket Singh

    12/28/2021, 5:13 AM
    Hi everyone! I am new to Airbyte. Could someone help me with this issue? I am trying to figure out how to get the message stream from the read() function of the connector by running the container. The docs mention that the message stream is output to STDOUT, however I can't find it there; all I can see are the container's log messages. I have only tried this with the Postgres connector so far, as it is the only connector relevant to me right now. Please help me out!
    👀 1
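  • Note: connector containers emit newline-delimited JSON Airbyte messages on STDOUT; records have "type": "RECORD", while the log lines you are seeing have "type": "LOG". A minimal sketch of separating the two when running the Postgres source's read command directly (the image tag, mount path and file names below are illustrative assumptions, not taken from this thread):

```python
import json
import subprocess

# Run the source's `read` command and keep only RECORD messages from STDOUT.
# Image tag, mount path and file names are placeholders.
cmd = [
    "docker", "run", "--rm", "-i",
    "-v", "/tmp/airbyte_secrets:/secrets",
    "airbyte/source-postgres:latest", "read",
    "--config", "/secrets/config.json",
    "--catalog", "/secrets/configured_catalog.json",
]

with subprocess.Popen(cmd, stdout=subprocess.PIPE, text=True) as proc:
    for line in proc.stdout:
        line = line.strip()
        if not line:
            continue
        try:
            msg = json.loads(line)
        except json.JSONDecodeError:
            continue  # plain container output, not a protocol message
        if msg.get("type") == "RECORD":
            print(msg["record"]["data"])  # the actual row data
        # messages with type LOG are the log lines seen in the container output
```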
  • wp

    12/28/2021, 4:53 PM
    Hi, I am working on the Facebook Marketing connector with Basic Normalization enabled. Is the dbt repo for the normalization available somewhere? I intend to customize the dbt models so we don't end up with 600 models in the dataset.
    ✅ 1
  • Tyler Buth

    12/28/2021, 6:29 PM
    Is there a way to use an older version of the Intercom connector? I know Intercom has done away with Users in their latest versions but we still have Users on our account and they aren’t being synced.
    ✅ 1
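  • Note: Airbyte OSS generally lets you pin a connector to a specific docker image tag (the Settings > Sources page exposes the version field, and the config API has source_definitions endpoints). A hedged sketch of doing it through the API; the base URL, endpoint names and the tag value are assumptions to verify against your Airbyte version:

```python
import requests

# Pin the Intercom source definition to an older image tag via the config API.
# Base URL, endpoints and the tag value are assumptions; verify before use.
API = "http://localhost:8000/api/v1"

defs = requests.post(f"{API}/source_definitions/list", json={}).json()
intercom = next(d for d in defs["sourceDefinitions"] if d["name"] == "Intercom")

resp = requests.post(
    f"{API}/source_definitions/update",
    json={
        "sourceDefinitionId": intercom["sourceDefinitionId"],
        "dockerImageTag": "0.1.11",  # hypothetical older tag
    },
)
resp.raise_for_status()
print(resp.json())
```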
  • Vincent Dhennin

    12/28/2021, 9:57 PM
    Hi everyone, first of all thanks for Airbyte, the product looks really great, especially considering your first pre-release was only a year ago, which is impressive. I would like some advice on something. I have a use case with hundreds of CSV files. I would like to transform this data according to some business constraints and put everything into a Postgres database. I saw this connector https://airbyte.io/connectors/csv-file but I didn't find a documented way to use it automatically on hundreds of files. Did I miss something? Is there a contribution to be made? Or maybe there is a better way to approach such a problem, but I still feel like an Airbyte newbie for the moment. Afterwards, I'm going to write an article to help newcomers approach such problems with Airbyte, because this product really looks promising.
    ✅ 1
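  • Note: the File source points at a single file, so automating hundreds of files generally means either merging them first or creating the sources programmatically. A sketch of the latter against the OSS config API; the endpoint, port, IDs and the File connector's configuration keys are assumptions based on the public spec, so verify them on your version:

```python
import glob
import requests

# Create one File source per CSV via the config API.
# Workspace/definition IDs, paths and configuration keys are placeholders.
API = "http://localhost:8000/api/v1"
WORKSPACE_ID = "<your-workspace-id>"
FILE_SOURCE_DEFINITION_ID = "<file-source-definition-id>"

for path in sorted(glob.glob("/data/csv/*.csv")):
    name = path.rsplit("/", 1)[-1].rsplit(".", 1)[0]
    payload = {
        "workspaceId": WORKSPACE_ID,
        "sourceDefinitionId": FILE_SOURCE_DEFINITION_ID,
        "name": f"csv-{name}",
        "connectionConfiguration": {
            "dataset_name": name,
            "format": "csv",
            "url": path,
            "provider": {"storage": "local"},
        },
    }
    resp = requests.post(f"{API}/sources/create", json=payload)
    resp.raise_for_status()
    print("created source", resp.json().get("sourceId"))
```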
  • Jonathan Crawford

    12/28/2021, 11:21 PM
    Hi, is anyone using Typeform to sync to a data warehouse? The tables are super confusing and don't seem to have foreign keys to connect the answers/responses with the various forms. Any insights there?
    ✅ 1
    👀 1
  • Bruno Lopes

    12/29/2021, 9:58 AM
    Hi everyone, I've just started testing Airbyte and, while going through the logs of a sync, I noticed that it lists all of the fields in my PostgreSQL source tables, even for tables that are not selected for sync. At first I thought this was to detect schema changes, but it seems schema changes are not detected. So I'm just curious: why are all of the fields and data types listed in the log? What is Airbyte doing with every sync?
    ✅ 1
  • Roy Peter

    12/29/2021, 12:11 PM
    Hi everyone, any recommendations/docs for monitoring Airbyte in a k8s environment?
    ✅ 1
  • Owais Javed

    12/29/2021, 6:03 PM
    Is there a way to schedule the replication time in Airbyte instead of just the interval? For example, if I wanted to kick off a replication process at 1am every morning?
    ✅ 1
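  • Note: a common pattern is to set the connection's schedule to manual and let an external scheduler (cron, Airflow, etc.) trigger the sync at the exact time you want through the API. Sketch below; the base URL and endpoint are assumptions to check against your Airbyte version, and a crontab entry such as `0 1 * * * python /opt/scripts/trigger_sync.py` would give the 1am daily kickoff:

```python
import requests

# Trigger a manual sync for one connection; run this from cron at 1am.
# Base URL, endpoint and connection ID are placeholders.
API = "http://localhost:8000/api/v1"
CONNECTION_ID = "<your-connection-id>"

resp = requests.post(f"{API}/connections/sync", json={"connectionId": CONNECTION_ID})
resp.raise_for_status()
job = resp.json().get("job", {})
print("started job", job.get("id"), job.get("status"))
```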
  • Callum McCaffery

    12/29/2021, 7:08 PM
    Is there a connector available that can download a zip from the open web (https) and load the CSV contained within to a destination?
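  • Note: one lightweight option is a small pre-processing step outside Airbyte: download the zip over HTTPS, extract the CSV it contains, and point the File source (or any other loader) at the extracted file. Sketch; the URL and paths are placeholders:

```python
import io
import zipfile
import requests

# Fetch a zip over HTTPS, pull out the first CSV member, and write it locally.
# URL and output path are placeholders.
resp = requests.get("https://example.com/export.zip", timeout=60)
resp.raise_for_status()

with zipfile.ZipFile(io.BytesIO(resp.content)) as archive:
    csv_name = next(n for n in archive.namelist() if n.endswith(".csv"))
    with archive.open(csv_name) as src, open("/tmp/extracted.csv", "wb") as dst:
        dst.write(src.read())
```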
  • Paul Cothenet

    12/30/2021, 1:03 AM
    Hi all, I'm sure this has been asked several times, but I'm new to BigQuery (as well as Airbyte). Do you recommend: a) one dataset in BigQuery per source, or b) a single dataset (with prefixed tables)?
  • Hans Jónsson

    12/30/2021, 3:18 PM
    Hi, the Shopify source setup guide for OAuth has 2 steps: 1. select OAuth 2.0 as the Shopify Authorization Method, and 2. click on Authenticate. I can see fields for Access Token, Client ID and Client Secret, but an Authenticate button is not visible. Can anyone clarify?
    ✅ 1
  • Manish Ballav

    01/02/2022, 12:26 AM
    Hello there! New to Airbyte. I have a question about the connectors you offer. Can I use, say, the Salesforce source connector in my own Python code? The reason I need this is that I do not intend to send the data from the source to a destination, but instead use it for analytics or some other purpose. If you could provide a code example of how I can use a source connector, I would really appreciate that.
    👀 1
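  • Note: the source connectors ship as docker images that speak the Airbyte protocol over STDOUT, so one way to consume a source from your own Python code without any destination is to run the image's read command and parse the RECORD messages yourself. A sketch; the image tag and the config/catalog paths are placeholders:

```python
import json
import subprocess
from typing import Iterator

def read_records(image: str, config_path: str, catalog_path: str) -> Iterator[dict]:
    """Run a source connector container and yield its record payloads."""
    cmd = [
        "docker", "run", "--rm", "-i",
        "-v", f"{config_path}:/config.json",
        "-v", f"{catalog_path}:/catalog.json",
        image, "read", "--config", "/config.json", "--catalog", "/catalog.json",
    ]
    with subprocess.Popen(cmd, stdout=subprocess.PIPE, text=True) as proc:
        for line in proc.stdout:
            try:
                msg = json.loads(line)
            except json.JSONDecodeError:
                continue  # skip plain log output
            if msg.get("type") == "RECORD":
                yield msg["record"]["data"]

# Example: peek at the first few Salesforce records for ad-hoc analysis.
for i, record in enumerate(read_records("airbyte/source-salesforce:latest",
                                        "/tmp/sf_config.json", "/tmp/sf_catalog.json")):
    print(record)
    if i >= 4:
        break
```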
  • Etienne J

    01/03/2022, 9:21 AM
    Hello there! New to Airbyte. First, I wish you the best for 2022! Second, Airbyte seems to be a 💎, thanks to the team. I've set up Airbyte and am trying to use the Amazon Ads connector. Does anyone know if the Amazon Ads connector works with Amazon DSP campaigns? For the moment, I can connect to the source but cannot extract data. Thanks for helping! 🙂
    ✅ 1
  • Victor

    01/04/2022, 1:45 PM
    Hi! I kept getting this error even after about 3 attempts. The failures had to do with existing tables (that contain previous data). Even when I rename them, the second and third attempts also fail, pointing to other (different) tables as ambiguous.
    ✅ 1
  • Donovan Maree

    01/04/2022, 2:31 PM
    Hi all, and happy new year. I'm looking for a way to split the _airbyte tables into their own dataset, separate from the basic normalization tables, as they enter BigQuery. Is there a way to do this currently? The use case would be to make navigation simpler.
    👀 1
  • Nikzad Khani

    01/04/2022, 4:08 PM
    Hello all, I am looking at the source-greenhouse connector, and I was wondering: if I change the JSON schema in the schemas folder, for example removing primary_email_address from Users.json, will that column then not be created in the destination DB? I see that you can filter out specific streams, but is it possible to filter subfields within the streams (columns in a table)? There might be sensitive data in these fields and I want to filter them out before they land in the destination.
    👀 1
    ✅ 1
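  • Note: if you are editing the connector's schemas folder and rebuilding the image, a small script can strip sensitive properties from a stream schema before the build; whether the field then stays out of the destination depends on how your version handles undeclared fields, so verify on a test destination first. Sketch with placeholder paths and field names:

```python
import json
from pathlib import Path

# Remove sensitive properties from a stream's JSON schema before rebuilding
# the connector image. Path and field names are placeholders.
schema_path = Path("source_greenhouse/schemas/users.json")
sensitive = {"primary_email_address"}

schema = json.loads(schema_path.read_text())
for field in sensitive:
    schema.get("properties", {}).pop(field, None)
schema_path.write_text(json.dumps(schema, indent=2))
```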
  • Nikzad Khani

    01/04/2022, 9:36 PM
    Hello all, is there any documentation to help understand how data is staged, and how the source data is extracted into Postgres and then loaded out during a synchronization?
  • Victor

    01/05/2022, 8:16 AM
    Hi, I was trying to configure QuickBooks as the source. Any time I select QuickBooks, it shows me this page, even though Airbyte is running. If I click on other pages, they load.
    ✅ 1
  • reg

    01/05/2022, 9:19 AM
    Hello, has anyone here already tried using airbyte-helm? I was able to deploy it in an EKS cluster, but every time a node containing the PVC/PV gets replaced, the airbyte-server can't reload anymore.
    👀 1
  • Joey Taleño

    01/05/2022, 9:25 AM
    Hello! 👋🏻 I'm trying to set up my first data source connector, which is Google Sheets, but I'm getting an unknown error. How did you guys set this up?
    ✅ 1
  • Chetan Chaudhari

    01/05/2022, 9:39 AM
    The File source only accepts 1 file, I guess, so if we have 100 CSVs it will require 100 sources to be created. Is that the right understanding?
    👀 1
    ✅ 1
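  • Note: yes, the File source is configured with a single file, so for many same-schema CSVs a common workaround is to concatenate them first and point one File source at the combined file. Sketch with placeholder paths, assuming identical headers:

```python
import glob
import pandas as pd

# Merge many same-schema CSVs into one file for a single Airbyte File source.
# Directory and output path are placeholders.
frames = [pd.read_csv(p) for p in sorted(glob.glob("/data/csv/*.csv"))]
combined = pd.concat(frames, ignore_index=True)
combined.to_csv("/data/csv/combined.csv", index=False)
```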
  • Joey Taleño

    01/05/2022, 10:02 AM
    Hi! Is there no WooCommerce connector in the OSS version?
    👀 1
    ✅ 1
  • Muhammad Haroon Aslam

    01/05/2022, 10:32 AM
    Why am I getting this message?
    👀 1
  • Julien Bovet

    01/05/2022, 12:19 PM
    Hello guys! Quick question regarding the Google Ads connector and the new campaign type "Performance Max". This campaign type is not available in the data delivered by Google Data Transfer, nor in Fivetran. I was wondering if it's the same for the Airbyte connector, but I was not able to find anything in Slack/docs/GitHub issues... Performance Max campaigns were released at the end of 2021 (https://ads-developers.googleblog.com/2021/11/announcing-v9-of-google-ads-api.html). Thanks!
    ✅ 1
  • Omid

    01/05/2022, 7:06 PM
    Hello 🙂 Is there any way to delete the created tables/schemas/databases when I delete a destination?
    👀 1
  • Jiyuan Zheng

    01/05/2022, 11:21 PM
    Hi Airbyte team, my previous attempt to upgrade from 0.29.19-alpha to 0.32.0-alpha-patch-1 failed and I would like to revert everything, but importing my backup is not working 😞 Does anyone know why? Error logs are added in the thread.
    👀 1
  • Alexander Hernandez

    01/06/2022, 2:38 AM
    Hello, I want to learn how to use Airbyte and I just followed the quick start deployment instructions, but when I ran docker-compose up, after a few minutes my terminal just started displaying
    airbyte-scheduler  | 2022-01-06 02:26:11 INFO i.a.s.a.SchedulerApp(waitForServer):210 - Waiting for server to become available...
    What does this mean and how do I fix it? This has been going on for 5 minutes now.
    👀 1
  • Chetan Chaudhari

    01/06/2022, 6:41 AM
    I'm loading a large CSV through Airbyte but keep getting the following
    👀 1
  • Susheel Kumar

    01/06/2022, 3:34 PM
    Hey guys, I was trying out Airbyte earlier today. It looks really cool and I would like to congratulate the devs on a good job! 🙂 However, I came across a couple of issues and was wondering if anyone has any ideas around this. Basically, I was trying to sync a few tables from Postgres into ClickHouse. When I added Postgres as a source and ClickHouse as a destination and tried to make a connection, only some tables from a couple of schemas appeared. There wasn't really a way for me to search for other tables (or perhaps I was not doing it right?). What can I do to ensure I see all tables from Postgres so that I can choose which ones I need to sync?
    👀 1