# ask-community-for-troubleshooting
  • a

    Ariyo Kabir

    01/26/2022, 1:23 PM
    Hi everyone, please, I need someone who can assist me with how I can use dbt to transform data coming from Airbyte before it reaches the destination database.
    👀 1
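A sketch relevant to the question above: Airbyte can run a custom dbt transformation after loading, configured as an "operation" attached to the connection via the config API. The payload shape below (operations/create with an operatorConfiguration of type dbt) and all names in it are assumptions to verify against the API docs for your Airbyte version, not a confirmed schema.

```python
import json

# Hypothetical body for POST /api/v1/operations/create: attach a dbt
# transformation (run from a git repo) that executes after each sync.
def dbt_operation_body(workspace_id: str, git_repo_url: str) -> dict:
    return {
        "workspaceId": workspace_id,
        "name": "dbt transform",
        "operatorConfiguration": {
            "operatorType": "dbt",
            "dbt": {
                "gitRepoUrl": git_repo_url,  # repo holding your dbt project
                "dbtArguments": "run",       # command dbt executes post-load
            },
        },
    }

body = dbt_operation_body("ws-uuid", "https://github.com/acme/dbt-project.git")
print(json.dumps(body))
```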
  • a

    Andreas

    01/26/2022, 1:28 PM
    Hi, I'm trying to set up a Prefect task to schedule Airbyte runs. Is there a way to pass parameters to Airbyte? For example, if I want to have a backfill flow, is there a way to programmatically set the Airbyte source's start_date? Thanks!
    ✅ 1
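One way to approach the question above is to update the source's configuration through the config API before triggering the backfill run. The endpoint path (sources/update) and the exact request fields here are assumptions based on Airbyte's public API docs; a full update may also require fields (such as name) not shown in this sketch.

```python
import json

AIRBYTE_URL = "http://localhost:8000/api/v1"  # hypothetical local deployment

def build_source_update(source_id: str, config: dict, start_date: str) -> dict:
    """Request body for POST {AIRBYTE_URL}/sources/update, replacing
    start_date inside the existing connectionConfiguration."""
    new_config = {**config, "start_date": start_date}
    return {"sourceId": source_id, "connectionConfiguration": new_config}

# Rewind the start_date before kicking off a backfill sync from Prefect.
body = build_source_update("my-source-uuid", {"start_date": "2022-01-01"}, "2021-06-01")
print(json.dumps(body))
```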
  • j

    Jani Sourander

    01/27/2022, 6:50 AM
    Hi! I’m currently using a JDBC connection (on Spark) to fetch a couple of Db2 tables for analytical use. I am considering using an ingestion tool. I wonder how Airbyte performs the Db2 incremental append sync. The connector documentation mentions only `GRANT CONNECT`. Wouldn’t it need a lot more grants? For comparison, AWS DMS wants a user to have `SYSADM` (?!) and `DATAACCESS`. Debezium puts tables into a capture mode using ASN Capture/Apply agents, which sounds a bit scary on any production-related database.
    ✅ 1
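For context on the question above: incremental append in Airbyte is cursor-based, i.e. an ordinary SELECT filtered on a saved cursor column value, which is why the query itself needs only read access rather than DMS-style admin grants. This is an illustrative sketch of the pattern, not the connector's actual code; table and column names are made up.

```python
# Cursor-based incremental append: first sync reads everything, later
# syncs read only rows past the saved cursor value.
def incremental_query(table: str, cursor_col: str, last_cursor):
    if last_cursor is None:  # first sync: full read
        return f"SELECT * FROM {table}", ()
    # subsequent syncs: only rows beyond the saved cursor value
    return f"SELECT * FROM {table} WHERE {cursor_col} > ?", (last_cursor,)

sql, params = incremental_query("ORDERS", "UPDATED_AT", "2022-01-01")
print(sql, params)
```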
  • p

    Preetam Balijepalli

    01/27/2022, 12:20 PM
    Hi, can I do everything that I can do with the Airbyte UI using these APIs? https://airbyte-public-api-docs.s3.us-east-2.amazonaws.com/rapidoc-api-docs.html Warm regards.
    👍 1
    👀 2
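The linked docs describe the Configuration API, whose endpoints mirror the UI's workspace/source/destination/connection operations. A minimal sketch of calling one of them, assuming a local deployment; the endpoint path and payload are taken from the docs' general shape, not verified against a specific version:

```python
import json
import urllib.request

AIRBYTE_URL = "http://localhost:8000/api/v1"  # hypothetical deployment

def list_connections_request(workspace_id: str) -> urllib.request.Request:
    """Build a POST to the (assumed) connections/list endpoint; the UI's
    connection list goes through an endpoint like this one."""
    return urllib.request.Request(
        f"{AIRBYTE_URL}/connections/list",
        data=json.dumps({"workspaceId": workspace_id}).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = list_connections_request("17eb26bb-1d20-4000-89b3-7a47513d8017")
print(req.full_url)
# urllib.request.urlopen(req)  # uncomment against a live server
```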
  • d

    Daniel Eduardo Portugal Revilla

    01/27/2022, 4:36 PM
    Hello!!! I am creating an S3 destination for a POC, but I can see only Key ID and Access Key as authentication options. Is there another method, maybe secrets? And where are the keys stored?
  • p

    Pedro Machado

    01/27/2022, 6:16 PM
    Hi there! I am new to Airbyte and I am trying to load 36 different sets of csv files from s3 to Snowflake. I am able to do this directly on Snowflake but wanted to test Airbyte. Is it possible to define a single S3 source that will understand the s3 "folder" structure and create separate tables in the DB? The structure is: s3://<bucket>/<table name>/yyyy/mm/dd/<table name>.csv.gz and I'd like to have a table for each <table name> created in the DB. Thanks!
    ✅ 1
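On the layout in the question above: the S3 source is configured with a path pattern per stream, so one approach is to generate a glob per top-level prefix and get one table per prefix. The helpers below are illustrative only; treat the idea of a per-table path pattern as an assumption to check against the S3 source's spec.

```python
# Derive per-table patterns from keys shaped like
# s3://<bucket>/<table>/yyyy/mm/dd/<table>.csv.gz
def path_pattern_for(table: str) -> str:
    """Glob matching every daily file for one table."""
    return f"{table}/*/*/*/{table}.csv.gz"

def table_from_key(key: str) -> str:
    """Extract <table name> from a key like 'orders/2022/01/27/orders.csv.gz'."""
    return key.split("/", 1)[0]

print(path_pattern_for("orders"))
print(table_from_key("orders/2022/01/27/orders.csv.gz"))
```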
  • m

    Matthieu Berger

    01/27/2022, 8:18 PM
    Hi there! I’m new to Airbyte. Is there a way to embed the `workspaceId` somehow into the destination?
    ✅ 1
  • r

    Ryan Loveland

    01/27/2022, 11:15 PM
    Hello there - looking to set up a POC for syncing Stripe data to Snowflake. I was reading through the docs and setup wizard that I would need to provide a lookback window to catch updates to records. I am sure each implementation is different, but is there a recommended value here, or is there a way to check my source to see how much data is changing to influence this decision? I am fairly new to Stripe as a source, so any help here would be appreciated. 🙂
  • a

    Arvi

    01/28/2022, 8:00 AM
    Hi Airbyte Team, I am evaluating Airbyte. The source is Cloud SQL (PostgreSQL). Do we need SSH even when Airbyte is deployed in GKE in the same VPC? Any help would be appreciated. FYI, I did sign up for Airbyte Cloud.
    👀 1
  • n

    Nick Booth

    01/28/2022, 9:10 AM
    Morning, I put in a ticket on GitHub for time-based jobs (as our HubSpot import was killing Redshift during the import) and got a response suggesting that cron could be used to trigger jobs. I was wondering if anyone had written any docs on how to set this up?
    ✅ 1
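A minimal sketch of the cron-triggered approach suggested in that ticket: a crontab entry runs a small script that POSTs to the sync endpoint. The connections/sync path and payload shape are assumptions based on Airbyte's public config API docs, and the connection ID is a placeholder.

```python
import json
import urllib.request

# Example crontab entry (runs nightly at 02:00, outside Redshift peak hours):
#   0 2 * * * /usr/bin/python3 /opt/airbyte/trigger_sync.py
AIRBYTE_URL = "http://localhost:8000/api/v1"
CONNECTION_ID = "your-connection-uuid"  # hypothetical ID

def sync_request(connection_id: str) -> urllib.request.Request:
    """Build a POST to the (assumed) connections/sync endpoint."""
    return urllib.request.Request(
        f"{AIRBYTE_URL}/connections/sync",
        data=json.dumps({"connectionId": connection_id}).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = sync_request(CONNECTION_ID)
print(req.full_url)
# urllib.request.urlopen(req)  # uncomment on a live deployment
```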
  • l

    Line Rahal

    01/28/2022, 4:44 PM
    Hello all! 🍀 I would like to create a Google Ads integration to Bigquery using Airbyte, but I’d like to be sure that Google Performance Max campaigns are included in the campaign reports. Do you know if it is the case? Thanks in advance for your help!
    👀 1
    ✅ 1
  • t

    Thao Pham

    01/28/2022, 6:19 PM
    Hi all, new here to the community. Thank you for this! Does anybody know if Airbyte has a PostgreSQL issue where, if there is too much time between successful syncs, it will trigger a vacuum process that fails the connection?
    ✅ 1
  • m

    Matthieu Vegreville

    01/28/2022, 7:49 PM
    Hey all, Airbyte seems fantastic. At Greenly, we are doing carbon footprint reporting for companies. I wanted to try Airbyte Cloud this weekend but got on the waiting list; is there any way to get faster access? If that helps, I solemnly promise to give feedback once I've done my tests 😉
    ✅ 1
  • s

    Saif Mahamood

    01/28/2022, 8:57 PM
    Hey everyone 👋. I understand that right now connector secrets are to be entered in the UI, and I think they are stored in a DB. I was just wondering if anybody has tried defining secrets loaded in as env vars in K8s, and having the secrets read from the environment instead?
    ✅ 1
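One workaround along the lines of the question above: keep the secret in a K8s Secret exposed as an env var, and inject it into the connector config at source-creation time via the API rather than typing it into the UI (note the config, once created, still lives in Airbyte's DB). The env var name and config field below are hypothetical.

```python
import json
import os

# Normally the Secret is mounted by K8s; setdefault keeps this sketch runnable.
os.environ.setdefault("AIRBYTE_DB_PASSWORD_EXAMPLE", "example-secret")

def config_with_secret(base_config: dict, env_var: str, field: str) -> dict:
    """Merge a secret from the environment into a connectionConfiguration."""
    secret = os.environ[env_var]  # raises KeyError if the Secret is missing
    return {**base_config, field: secret}

config = config_with_secret({"host": "db.internal", "username": "airbyte"},
                            "AIRBYTE_DB_PASSWORD_EXAMPLE", "password")
# Mask the secret when logging the assembled config.
print(json.dumps({k: ("***" if k == "password" else v) for k, v in config.items()}))
```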
  • t

    Thao Pham

    01/28/2022, 9:20 PM
    Hi team, another question. I haven't set up Airbyte yet. How long has it taken others to get Airbyte set up and be able to start migrating data into a data lake?
    ✅ 1
  • v

    Victor

    01/28/2022, 10:07 PM
    Hi team, please, I am using QuickBooks as the source but I keep getting this error. No successful migration so far. Is there anything I might have missed?
    👀 1
    ✅ 1
  • s

    Sreenivas Reddy

    01/29/2022, 6:00 AM
    Hi Guys, It is stuck on connection creation
  • s

    Sreenivas Reddy

    01/29/2022, 6:02 AM
    do I need to spin big on e
  • w

    Wahyudinata Setiawan

    01/29/2022, 11:01 PM
    hey all, I wonder if Airbyte fits my use case: we want to enable our end users (users who register on our app) to manipulate their data from multiple sources, and we are thinking of using Airbyte to do the syncing for us. This means each end user has to allow us to access their data, and we pool it into a table per source, so it’s multi-tenant. I see this: https://airbyte.com/embed-airbyte-connectors-with-api which I think is what we need, but I am not able to find much information; it links me to workspaces? Is Airbyte a fit for our needs?
    ✅ 1
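On the workspace pointer in the question above: a common multi-tenant pattern is one Airbyte workspace per end user, created programmatically through the config API. The endpoint name and fields in this sketch are assumptions drawn from the public API docs' general shape.

```python
import json

# Hypothetical body for POST /api/v1/workspaces/create, one per tenant.
def workspace_create_body(tenant_id: str, email: str) -> dict:
    return {"name": f"tenant-{tenant_id}", "email": email}

# Each tenant's sources and connections then live inside their own workspace,
# keeping credentials and synced tables isolated per end user.
body = workspace_create_body("42", "user@example.com")
print(json.dumps(body))
```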
  • d

    Daniel Eduardo Portugal Revilla

    01/31/2022, 1:41 PM
    hello, is there a source connector for Neo4j?
    ✅ 1
  • p

    Pierre CORBEL

    01/31/2022, 2:01 PM
    Match for a specific use case
    Hello there 👋, I wanted to ask you if Airbyte is a good option for a particular use case: not as an ETL but as a reverse ETL (RETL) 🧐. Let me explain: we developed a lot of data models via dbt in our data warehouse, Snowflake, and we unload them to a PostgreSQL instance in order to expose them to our end users (Snowflake is OLAP-oriented and not fast enough for end-user queries, thus the need to unload them to Postgres) ⚡️. So our use case is to keep a perfect sync between our Snowflake tables and our Postgres tables 🔄. By perfect sync, I mean having all the fields needed, regular incremental replication, and handling of deleted rows ⏰. We already set up primary keys in our Snowflake tables, as well as a "meta-field" with the last update date of the row 👍. Thanks a lot! 🙏
    ✅ 1
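The primary key plus last-update "meta-field" described above is exactly what an incremental reverse-ETL leg needs. A sketch of that leg, with illustrative table/column names: extract rows updated since the last sync from Snowflake, upsert them into Postgres. Note that plain incremental sync like this does not propagate deletes; those typically need separate handling (e.g. soft-delete flags).

```python
# Extract side (Snowflake): rows touched since the last successful sync.
def extract_query(table: str, updated_col: str, last_sync: str) -> str:
    return f"SELECT * FROM {table} WHERE {updated_col} > '{last_sync}'"

# Load side (Postgres): upsert keyed on the primary key.
def upsert_statement(table: str, pk: str, cols: list) -> str:
    collist = ", ".join(cols)
    placeholders = ", ".join("%s" for _ in cols)
    updates = ", ".join(f"{c} = EXCLUDED.{c}" for c in cols if c != pk)
    return (f"INSERT INTO {table} ({collist}) VALUES ({placeholders}) "
            f"ON CONFLICT ({pk}) DO UPDATE SET {updates}")

print(extract_query("dim_customers", "updated_at", "2022-01-01"))
print(upsert_statement("dim_customers", "id", ["id", "name", "updated_at"]))
```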
  • d

    Daniel Eduardo Portugal Revilla

    01/31/2022, 5:14 PM
    Hi! I created an S3 bucket and used an S3 Parquet destination, but I get this message when I try to read the Parquet file
    👀 1
  • d

    Dave Lindley

    01/31/2022, 5:18 PM
    Would Airbyte be seen as a 1:1 replacement for Snowpipe (Snowflake's ingestion from S3)?
    👀 2
    🔥 1
  • n

    Nava Gross

    01/31/2022, 8:22 PM
    Hi 👋, just playing around with Airbyte: we set up a connector between a SQL Server DB and BigQuery. When setting up which tables to sync in Airbyte, some of the tables have the option for incremental sync and some only have full refresh. They are all coming from the same DB, and we enabled change detection on all the tables as well, so there is no apparent difference between tables on the SQL side. Is there something I'm missing?
    ✅ 1
    👀 1
  • t

    Tien Nguyen

    01/31/2022, 11:17 PM
    Hello everyone, just getting started with Airbyte. I am trying to connect to a Redshift server from localhost. I was able to connect with a connector. However, I was not able to connect to another Redshift server as a destination point. It keeps telling me the configuration is wrong, and I think I have looked everywhere on the internet for a couple of days but couldn't find the answer. Is anyone able to help me out?
    👀 1
  • u

    Ulf Svensson

    01/31/2022, 11:28 PM
    Hi. I am trying to create a source for Google Sheets using the configuration API: https://airbyte-public-api-docs.s3.us-east-2.amazonaws.com/rapidoc-api-docs.html#post-/v1/connections/create The example schema looks like this:
    Copy code
    {
      "sourceDefinitionId": "17eb26bb-1d20-4000-8338-eab5bc180801",
      "connectionConfiguration": {
        "user": "charles"
      },
      "workspaceId": "17eb26bb-1d20-4000-89b3-7a47513d8017",
      "name": "string"
    }
    What would that look like if I create a Google Sheets source using a service account?
    ✅ 1
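Following the example schema above, only connectionConfiguration changes per connector type. A sketch of a possible Google Sheets body; the field names (spreadsheet_id, credentials_json) are assumptions to check against the Google Sheets source spec for your version, and the IDs are placeholders.

```python
import json

def sheets_source_body(definition_id, workspace_id, spreadsheet_id, sa_json: dict):
    """Possible sources/create body for a Google Sheets service-account source."""
    return {
        "sourceDefinitionId": definition_id,
        "workspaceId": workspace_id,
        "name": "google-sheets",
        "connectionConfiguration": {
            "spreadsheet_id": spreadsheet_id,
            # the service-account key is passed as a JSON *string*
            "credentials_json": json.dumps(sa_json),
        },
    }

body = sheets_source_body(
    "def-uuid", "ws-uuid", "1abcSheetId",
    {"type": "service_account", "client_email": "sa@proj.iam.gserviceaccount.com"},
)
print(body["connectionConfiguration"]["spreadsheet_id"])
```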
  • r

    Reynaldo Mendez

    02/01/2022, 4:37 AM
    Hey guys, hope everyone is doing great. Testing out some sources and connectors: in Postgres I have a JSONB column and I want to convert it to a VARIANT column in Snowflake, but I can't find in the docs how to change the data type in the connection settings. Can you help me figure it out, please?
    ✅ 1
  • a

    Aniruddha Chattopadhyay

    02/01/2022, 8:22 AM
    Hello. I have been stuck on an issue. How do I use Airbyte's API to build a connection between a pre-existing source type (let's say MySQL) and a pre-existing destination (let's say BigQuery)? In both the API documentation and the Python CDK I have found examples of creating a new source, but I can't find an example of how to build a source from one of the given types like MySQL. Can someone please guide me? Thanks.
    ✅ 1
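On the question above: each pre-existing connector type is identified by a sourceDefinitionId, and sources/create takes that ID plus a connectionConfiguration, as in the example schema quoted earlier in this thread. A definitions-list endpoint (assumed here to be source_definitions/list) lets you look the ID up by name; the data below is illustrative.

```python
def find_definition_id(definitions: list, connector_name: str) -> str:
    """Pick the definition ID for e.g. 'MySQL' out of the listed definitions."""
    for d in definitions:
        if d["name"].lower() == connector_name.lower():
            return d["sourceDefinitionId"]
    raise KeyError(connector_name)

# Illustrative response shape from the (assumed) source_definitions/list call.
defs = [{"name": "MySQL", "sourceDefinitionId": "mysql-def-uuid"},
        {"name": "Postgres", "sourceDefinitionId": "pg-def-uuid"}]
print(find_definition_id(defs, "mysql"))
```

With the ID in hand, the rest is sources/create, destinations/create, then connections/create tying the two together.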
  • n

    Nikzad Khani

    02/01/2022, 8:15 PM
    Hi Airbyte team, what access does the airbyte user need on an external database? Also, is there an SQL script to generate this user? edit1: FYI I am using PG 13.4 on AWS RDS. edit2: I am talking about this btw
    👀 1
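For the question above: the Postgres source only needs read access, and a grant script along these lines (mirroring the style of the setup snippet in Airbyte's Postgres docs) creates such a user. Role, password, and schema names are placeholders; verify the exact statements against the connector docs for your version.

```python
# Compose a read-only user setup script for Airbyte's Postgres source.
def grant_script(user: str, schema: str = "public") -> str:
    return "\n".join([
        f"CREATE USER {user} PASSWORD 'change-me';",
        f"GRANT USAGE ON SCHEMA {schema} TO {user};",
        f"GRANT SELECT ON ALL TABLES IN SCHEMA {schema} TO {user};",
        # cover tables created after the grant was issued:
        f"ALTER DEFAULT PRIVILEGES IN SCHEMA {schema} "
        f"GRANT SELECT ON TABLES TO {user};",
    ])

print(grant_script("airbyte"))
```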
  • a

    Armand

    02/02/2022, 2:17 AM
    Running a fresh git clone of airbyte and getting this error on
    docker-compose up
    Copy code
    airbyte-server      | 2022-02-02 02:15:45 ERROR i.a.s.ServerApp(main):236 - Server failed
    airbyte-server      | java.lang.NullPointerException: Cannot invoke "org.flywaydb.core.api.MigrationInfo.getVersion()" because the return value of "io.airbyte.db.instance.DatabaseMigrator.getLatestMigration()" is null
    airbyte-server      |   at io.airbyte.db.instance.MinimumFlywayMigrationVersionCheck.assertMigrations(MinimumFlywayMigrationVersionCheck.java:75) ~[io.airbyte.airbyte-db-lib-0.35.15-alpha.jar:?]
    airbyte-server      |   at io.airbyte.server.ServerApp.assertDatabasesReady(ServerApp.java:131) ~[io.airbyte-airbyte-server-0.35.15-alpha.jar:?]
    airbyte-server      |   at io.airbyte.server.ServerApp.getServer(ServerApp.java:155) ~[io.airbyte-airbyte-server-0.35.15-alpha.jar:?]
    airbyte-server      |   at io.airbyte.server.ServerApp.main(ServerApp.java:234) [io.airbyte-airbyte-server-0.35.15-alpha.jar:?]
    airbyte-server exited with code 1
    ✅ 1