# ask-community-for-troubleshooting

    Naphat Theerawat

    01/20/2022, 4:37 AM
    Hi. Per my understanding, Airbyte supports 4 sync modes: 1. Full Refresh - Overwrite, 2. Full Refresh - Append, 3. Incremental Sync - Append, 4. Incremental Sync - Deduped History (ref: https://docs.airbyte.com/understanding-airbyte/connections). But according to this link https://docs.airbyte.com/quickstart/set-up-a-connection, it says that Airbyte does a
    Full refresh
    on the final destination table:
    At the moment, Airbyte runs a full-refresh to recreate the final tables. This can cause more costs in some destinations like Snowflake, Redshift, and Bigquery. To understand better what sync mode and frequency you should select, read this doc. There is a FAQ topic on our Discourse that more extensively explains the cost issue here.
    Can someone tell me if this is still true for Incremental Sync - Deduped History too?
    ✅ 1

    Moshe Mazuz

    01/20/2022, 10:26 AM
    Hi, we would like to know if Airbyte can help us capture all the data changes in SQL Server (we use Web Edition, and CDC does not work in our version) and in MongoDB?
    ✅ 1

    Soufiane Odf

    01/20/2022, 10:09 PM
    Hi, I just want to know if Airbyte has a REST API that I can use to work with the already built-in sources and destinations, instead of using its UI?
    ✅ 1
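Airbyte's open-source deployment does expose an HTTP API (the same one the web UI calls). As a hedged sketch only: the snippet below builds a request against a local deployment, assuming the default `localhost:8000` address and the `/api/v1/connections/list` endpoint; the workspace id is a placeholder you would look up in your own instance.

```python
import json
import urllib.request

# Assumption: a default local Airbyte deployment; adjust host/port for yours.
API_ROOT = "http://localhost:8000/api/v1"

def build_request(path: str, payload: dict) -> urllib.request.Request:
    """Build a POST request for Airbyte's config API (its endpoints are POST-based)."""
    return urllib.request.Request(
        f"{API_ROOT}/{path}",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# List connections in a workspace; "<your-workspace-id>" is a placeholder.
req = build_request("connections/list", {"workspaceId": "<your-workspace-id>"})
print(req.full_url)

# Sending it requires a running Airbyte instance:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```

The actual send is left commented out because it needs a live server; the builder alone shows the request shape.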

    Jacob Mass

    01/21/2022, 12:55 AM
    Just wanted to say hi! I'm looking to make a first contribution here at Airbyte but might need some guidance!
    ✅ 1

    Soufiane Odf

    01/21/2022, 9:27 AM
    Which programming languages can you contribute in?
    👀 1

    Soufiane Odf

    01/21/2022, 12:32 PM
    Is there any way I can receive a notification that a job is done?
    ✅ 1

    Daniel Eduardo Portugal Revilla

    01/21/2022, 1:31 PM
    Hello! I am trying to create an HTTP connector to consume a ServiceNow API. I am following the Speedrun tutorial, but I do not know where to put the credentials for the API.
    {
      "type": "LOG",
      "log": {
        "level": "FATAL",
        "message": "HttpStream.__init__() got an unexpected keyword argument 'auth'\nTraceback (most recent call last):\n  File \"/Users/daniel_edu/Projects/PERSONAL/airbyte/airbyte-integrations/connectors/source-servicesnow-api/main.py\", line 13, in <module>\n    launch(source, sys.argv[1:])\n  File \"/Users/daniel_edu/Projects/PERSONAL/airbyte/airbyte-integrations/connectors/source-servicesnow-api/.env/lib/python3.10/site-packages/airbyte_cdk/entrypoint.py\", line 127, in launch\n    for message in source_entrypoint.run(parsed_args):\n  File \"/Users/daniel_edu/Projects/PERSONAL/airbyte/airbyte-integrations/connectors/source-servicesnow-api/.env/lib/python3.10/site-packages/airbyte_cdk/entrypoint.py\", line 118, in run\n    for message in generator:\n  File \"/Users/daniel_edu/Projects/PERSONAL/airbyte/airbyte-integrations/connectors/source-servicesnow-api/.env/lib/python3.10/site-packages/airbyte_cdk/sources/abstract_source.py\", line 92, in read\n    stream_instances = {s.name: s for s in self.streams(config)}\n  File \"/Users/daniel_edu/Projects/PERSONAL/airbyte/airbyte-integrations/connectors/source-servicesnow-api/source_servicesnow_api/source.py\", line 55, in streams\n    return [ServicesnowApi(auth=(config[\"psswrd\"], config[\"psswrd\"]), user=config[\"user\"], psswrd=config[\"psswrd\"])]\n  File \"/Users/daniel_edu/Projects/PERSONAL/airbyte/airbyte-integrations/connectors/source-servicesnow-api/source_servicesnow_api/source.py\", line 67, in __init__\n    super().__init__(**kwargs)\nTypeError: HttpStream.__init__() got an unexpected keyword argument 'auth'"
      }
    }
    I was trying this code:
    def streams(self, config: Mapping[str, Any]) -> List[Stream]:
        """
        TODO: Replace the streams below with your own streams.

        :param config: A Mapping of the user input configuration as defined in the connector spec.
        """
        # TODO remove the authenticator if not required.
        auth = TokenAuthenticator(token="api_key")  # Oauth2Authenticator is also available if you need oauth support
        return [ServicesnowApi(authenticator=(config["psswrd"], config["psswrd"]))]
    👀 1
    ✅ 1
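The traceback says the base class's `__init__` does not accept an `auth` keyword. A minimal pure-Python sketch of that failure mode, with stand-in class names rather than the real `airbyte_cdk` classes: when a subclass forwards `**kwargs` untouched, any keyword the base class does not declare raises `TypeError`, and the usual fix is to consume connector-specific arguments in the subclass and pass the base class only the keyword it actually accepts.

```python
class HttpStreamBase:
    """Stand-in for a CDK-style base stream: accepts ``authenticator``, not ``auth``."""
    def __init__(self, authenticator=None):
        self.authenticator = authenticator

class BrokenStream(HttpStreamBase):
    def __init__(self, **kwargs):
        # ``auth`` leaks through to the base class, which rejects it.
        super().__init__(**kwargs)

class FixedStream(HttpStreamBase):
    def __init__(self, user=None, psswrd=None, **kwargs):
        # Consume connector-specific kwargs here, and forward only
        # the keyword the base class actually declares.
        self.user = user
        self.psswrd = psswrd
        super().__init__(**kwargs)

try:
    BrokenStream(auth=("user", "secret"))
except TypeError as err:
    print(err)  # mirrors the "unexpected keyword argument 'auth'" in the log

stream = FixedStream(user="user", psswrd="secret", authenticator="token-authenticator")
print(stream.authenticator)
```

The same shape applies to the pasted `streams()` code: whatever credentials the spec defines should be turned into the keyword the CDK stream constructor expects, not passed through as `auth`.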

    Justin Cole

    01/21/2022, 2:59 PM
    I'm trying to replicate connections in the config files and then upload them, but I am having trouble getting Airbyte to recognize the added connections. I have ensured they all have unique identifiers. Is there something else I am missing?
    ✅ 1

    shunde zhao

    01/21/2022, 10:16 PM
    Hi, I just started looking at using Airbyte to replicate on-prem SQL Server database tables to BigQuery in GCP. I used “Basic normalization” with “No custom transformation”. I am able to replicate the tables from SQL Server to BigQuery. However, I notice the dataset created in BigQuery is automatically named “dbo”, which is the schema name of the SQL Server tables (btw, the database is “XYZ” and the tables are like dbo.tableABC in SQL Server). I would like to force the destination dataset name in BigQuery to use the SQL Server database name “XYZ” (and the table names in BigQuery should be the same as in SQL Server, with or without the dbo prefix). Is there a simple way to do this? Thanks a lot.
    ✅ 1

    Soufiane Odf

    01/22/2022, 3:36 PM
    Hey, I want to know if there is any way to customize the UI, for example changing the logo etc., and do I have the right to modify the UI if I want to use it internally?

    Oluwapelumi Adeosun

    01/22/2022, 3:58 PM
    Hello, I'd like to know if it's possible to extract and load data directly from a Postgres database hosted on AWS RDS to BigQuery?
    ✅ 1

    Steve

    01/22/2022, 5:09 PM
    Hi. Is there a Firestore source or destination please? There seems to be a menu item in the docs, but the page itself appears to be empty? Thanks https://docs.airbyte.com/integrations/destinations/google-firestore
    ✅ 1

    Soufiane Odf

    01/23/2022, 2:52 PM
    or destination
    ✅ 1

    Fridtjof Petersen

    01/23/2022, 6:27 PM
    Heya! We are slowly setting up Airbyte (deployed locally, as Airbyte Cloud is not available in the EU yet). Is it already possible to use the Google Sheets connector with OAuth? Having to download the JSON file with credentials doesn't seem too safe! (And if not, any suggestions on where/how to store those credentials?)
    ✅ 1
    👀 1

    Elias Djurfeldt

    01/24/2022, 10:39 AM
    Hi there! Curious about how the Airbyte Cloud offering is implemented (and I can’t trial as I’m in the EU). Say I were to use Cloud and want to transfer data from source A to destination B, does the data first flow to your cloud and gets staged there before it goes to my destination B? Thanks!
    👀 1

    Eugene Krall

    01/24/2022, 11:01 AM
    Hi everyone! When resetting data, my connection doesn't initiate a full sync; has anyone encountered a problem like that? My understanding is that on resetting the data, the entire dataset in the destination gets deleted and re-synced, but in my case it does indeed get deleted, yet only new data gets synced. The problem can be worked around by creating a new connection.
    👀 1

    Lihan

    01/24/2022, 11:23 AM
    Hi, I have one very simple question about the Airbyte architecture. If Airbyte already has a scheduler and workers, why does it need Airflow? What's the point of triggering an Airbyte job from Airflow?
    ✅ 2

    Victor Francess

    01/24/2022, 12:09 PM
    Hi guys, one question: how can I move an instance of Airbyte to a new host (I don't want to move the VM, just Airbyte)? It's important that I keep the state, so no full resyncs, etc. I'm using Docker. Thanks
    👀 1

    Steve

    01/24/2022, 1:23 PM
    Hi. I've followed the two steps for deployment shown at https://docs.airbyte.com/quickstart/deploy-airbyte but am getting a "_File not found_" issue. Log attached. Can anyone help please? macOS 12.0.1 (Intel), Docker 20.10.6 (build 370c289), docker-compose 1.29.1 (build c34c88b2). Thanks
    airbyteFileNotFound.txt
    👀 1

    Alex Meadows

    01/24/2022, 7:32 PM
    I'm attempting to deploy Airbyte onto an EKS cluster but am hitting an error; all pods are displaying it. Any advice? Solution: it was due to no node group being created on the cluster.
    ✅ 1

    Amir Davidoff

    01/24/2022, 8:27 PM
    Hey guys! I'm a total newbie. Got a question: can I use Airbyte to send data from a DWH to Salesforce (like a reverse ETL)?
    ✅ 1

    Yiyang (Heap.io)

    01/24/2022, 8:58 PM
    Hi guys! I am new to the platform. You have done an amazing job in cracking the APIs for lots of the source connectors. It seems that most APIs are read-only. What if I want to write my data to these apps? For example, I want to sync user data to Intercom.
    ✅ 2

    Fridtjof Petersen

    01/24/2022, 9:01 PM
    Heya! I have a question regarding Airbyte + dbt. We have an important table in Google Sheets with product information that gets updated once a week: new rows get added and others deleted/changed (and the schema changes sometimes). The goal is to use Airbyte + dbt on a Postgres database to save all historic data for downstream aggregation and analysis. My first thought was to sync the table with
    full refresh - append
    but this already broke because someone added a new column and the sync schema didn't stay the same, so we need to stick with the
    overwrite
    option. Does anyone have a suggestion on how to properly archive/store all data with dbt? My idea was to use the
    snapshot
    feature of dbt, as it can also add new columns that don't fit the created schema. Thank you!
    ✅ 1
    👀 1
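The dbt snapshot idea can be sketched roughly as below. This is a hedged illustration only: the snapshot name, schema, key column, and source names are placeholders, not anything from an actual project. The `check` strategy with `check_cols='all'` diffs every column on each run, so columns added later by an overwrite sync are picked up without an `updated_at` field.

```sql
{% snapshot product_sheet_snapshot %}
{{
    config(
      target_schema='snapshots',
      unique_key='product_id',
      strategy='check',
      check_cols='all'
    )
}}

-- placeholder source: the table Airbyte loads from the Google Sheet
select * from {{ source('airbyte_raw', 'product_info') }}

{% endsnapshot %}
```

Running `dbt snapshot` after each Airbyte sync then accumulates row history (with dbt's `dbt_valid_from`/`dbt_valid_to` columns) even though the synced table itself is overwritten.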

    Amol Walanj

    01/25/2022, 3:46 AM
    and even retries get stuck
    👀 1

    Dekel R

    01/25/2022, 8:41 AM
    Hey everyone! I have a question regarding the integration of Prefect and Airbyte. I have a Prefect flow that copies a couple of Google BigQuery tables between 2 different GCP projects. The flow reads a config (JSON) file that I keep in Google Storage. For each region and table in the config file, a copy task is spawned using Prefect's map option (I have multiple regions, each with different tables; each table has its own unique source and destination). Now I was wondering if it's possible to do the same thing using Airbyte: spawning a new connection from Prefect if the connection doesn't exist (let's say, for region A between BQ table a and table b), and if it does exist, just triggering the sync? More specifically, since my configuration is dynamic and regions/tables will get added, I don't want to configure each connection manually in the Airbyte UI. Thanks!!
    ✅ 1
    👀 1
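The "create only if missing, otherwise just sync" pattern described above can be sketched as pure logic over whatever a connections-list API call returns. As a hedged sketch: the field names (`name`, `connectionId`) and endpoint names are assumptions about the API's response shape, shown here as plain dicts so the decision logic stands alone.

```python
def plan_action(existing: list, wanted_name: str) -> dict:
    """Decide the next API call for the connection named wanted_name."""
    for conn in existing:
        if conn.get("name") == wanted_name:
            # Connection already exists: just trigger a sync for it.
            return {"call": "connections/sync", "connectionId": conn["connectionId"]}
    # Not found: create it first. (A real create payload also needs
    # sourceId, destinationId, and a configured catalog.)
    return {"call": "connections/create", "name": wanted_name}

existing = [{"name": "region-a__table-a__table-b", "connectionId": "c-123"}]
print(plan_action(existing, "region-a__table-a__table-b"))
print(plan_action(existing, "region-b__table-x__table-y"))
```

A Prefect task can run this decision per mapped region/table entry, then issue the chosen API call, which keeps the connection inventory driven entirely by the config file.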

    Steve

    01/25/2022, 11:05 AM
    Hi, successful install of Airbyte on AWS Lightsail, but is there a straightforward way of adding authentication / a login & password to secure my Airbyte install? Thanks
    ✅ 1

    Tech Account

    01/25/2022, 8:41 PM
    What is the appropriate channel to ask for help if I am stuck with something?

    Vikram Medishetty

    01/26/2022, 1:32 AM
    Hello, I am trying to install Airbyte from the docs here -> https://docs.airbyte.com/deploying-airbyte/on-gcp-compute-engine but I am getting an error on sudo apt-get install -y docker-ce docker-ce-cli containerd.io: E: Package 'docker-ce' has no installation candidate / E: Unable to locate package docker-ce-cli / E: Unable to locate package containerd.io / E: Couldn't find any package by glob 'containerd.io' / E: Couldn't find any package by regex 'containerd.io'
    ✅ 1

    Mahdir Ishmam

    01/26/2022, 2:24 AM
    hey folks, I'm evaluating Airbyte vs. Fivetran for MySQL -> Redshift CDC data replication. Have a few questions about how this works: 1. How does the Redshift connector handle updates and deletes from the MySQL source data on the Redshift side? 2. How fast can it run? What is the latency between MySQL to Redshift at the maximum possible replication speed Airbyte currently allows?
    👀 1

    Steve

    01/26/2022, 11:56 AM
    Hi! Can someone help me understand the relationship, if any, between the list of connectors here on GitHub https://github.com/airbytehq/airbyte/tree/master/airbyte-integrations/connectors and the connectors listed on the Airbyte website at https://airbyte.com/connectors ? Some seem to be in one place but not the other, e.g., Firestore is on GitHub but not on the web, and eBay is on the web but not in the GitHub list ^^. Apologies if I missed something obvious. :-) Thanks
    ✅ 1