# ask-community-for-troubleshooting

    Vijay Kothareddy

    02/24/2022, 11:58 PM
    Team, the Postgres connector only discovers the public schema? When I look at the connection it only shows public. Any thoughts?
    ✅ 1

    Muhammad Imtiaz

    02/25/2022, 11:07 AM
    Hi, can anyone tell me which pod the Local CSV files are stored in? If we configure Local CSV as the destination in a Docker deployment, the files are stored in the worker under /tmp/airbyte_local, but this is not the case for Kubernetes, as stated in the documentation: https://docs.airbyte.com/integrations/destinations/local-csv
    👀 1

    Matheus Guinezi

    02/25/2022, 1:52 PM
    Hi guys! I am trying to use a custom GAQL query in the Google Ads source, but I don't see any update of this custom report in the destination. How can I fix it?
    ✅ 1

    Lucas Wiley

    02/25/2022, 7:21 PM
    I suppose this is because of the security issue?
    👀 1

    Vesna Djinovic

    02/25/2022, 11:49 PM
    Hello, can you help with the Monday connector? It fails all sync attempts. More details in the thread :D
    ✅ 1

    Saurabh Mathur

    02/26/2022, 7:52 AM
    Hi, I am new to Airbyte and was setting up my first ETL from MongoDB. I understood that I should be able to explode the nested fields via https://docs.airbyte.com/understanding-airbyte/basic-normalization, but when I test it out I only get the basic tables in MySQL, and the nested objects are saved as JSON. In the transformation step I see two options, "Original Structure" and "Serialized JSON", and there is no "Basic Normalization". I am using MySQL 8.0.28, so as per the docs basic normalization should be supported. Can someone suggest what I can do to get normalized tables in MySQL? Thanks.
    ✅ 1

    Arvi

    02/28/2022, 12:17 AM
    Hi all, would we be able to use Terraform to create connections in Airbyte?
    ✅ 1
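
    As far as I know there was no official Terraform provider at the time of this thread, but anything Terraform would manage can be scripted against the OSS config API, which is what such a provider would wrap. A minimal sketch, assuming a local deployment on localhost:8000; the source/destination IDs and the 24-hour schedule are placeholders, not a definitive recipe:

    import requests

    API = "http://localhost:8000/api/v1"   # assumed local OSS deployment
    SOURCE_ID = "<source-uuid>"            # placeholder
    DESTINATION_ID = "<destination-uuid>"  # placeholder

    # Discover the source schema so the connection can reference its streams.
    catalog = requests.post(f"{API}/sources/discover_schema",
                            json={"sourceId": SOURCE_ID}).json()["catalog"]

    # Create a connection that syncs every 24 hours.
    resp = requests.post(f"{API}/connections/create", json={
        "sourceId": SOURCE_ID,
        "destinationId": DESTINATION_ID,
        "syncCatalog": catalog,
        "status": "active",
        "schedule": {"units": 24, "timeUnit": "hours"},
    })
    resp.raise_for_status()
    print(resp.json()["connectionId"])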

    Aakash G

    02/28/2022, 3:06 PM
    Hello team! I am trying to run Airbyte on minikube (I was running into issues running locally on M1 where the Postgres db was not spinning up) and got the UI to show up. However, none of the sources are loading. Any idea why that would be the case?
    👀 1

    Mahesh

    03/01/2022, 10:22 AM
    I am trying to use the Kafka source and the Redshift destination. My messages are JSON, but the JSON is just inserted into the raw table as a blob; I want the JSON message loaded as a proper table with its fields as columns. I have turned “Normalization” on and it is still not working. Help appreciated.
    👀 1

    bertran ruiz

    03/01/2022, 4:01 PM
    Hello team! Do you think it’s a good fit to use Airbyte instead of merge.dev or apideck.com? My use case is to create custom connections between my app and lots of other SaaS apps. Your help / perspective would be really appreciated.
    ✅ 1

    Mike B

    03/01/2022, 5:43 PM
    Hi all! New here, and have a stupid question. 🙂 With the File connector, is there any way to use a dynamic filename? For example, we need to download a CSV file from an SFTP server monthly, and the name changes every month (Monthly Report - January2022.csv, ...February2022.csv, etc.)
    👀 1
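
    The File source takes a static path, so one workaround is to script the rename: update the source's configuration with this month's filename through the config API and then trigger the sync. A rough sketch, assuming a local OSS deployment; the IDs are placeholders and the exact connectionConfiguration keys (here "url") depend on the connector version:

    from datetime import date
    import requests

    API = "http://localhost:8000/api/v1"    # assumed local OSS deployment
    SOURCE_ID = "<file-source-uuid>"        # placeholder
    CONNECTION_ID = "<connection-uuid>"     # placeholder

    # Build this month's filename, e.g. "Monthly Report - March2022.csv".
    filename = f"Monthly Report - {date.today().strftime('%B%Y')}.csv"

    # Fetch the current source so its existing configuration can be reused.
    source = requests.post(f"{API}/sources/get",
                           json={"sourceId": SOURCE_ID}).json()
    config = source["connectionConfiguration"]
    config["url"] = f"/reports/{filename}"  # assumed key for the file path

    # Push the updated configuration, then kick off a manual sync.
    requests.post(f"{API}/sources/update", json={
        "sourceId": SOURCE_ID,
        "name": source["name"],
        "connectionConfiguration": config,
    }).raise_for_status()
    requests.post(f"{API}/connections/sync",
                  json={"connectionId": CONNECTION_ID}).raise_for_status()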

    Vijay Kothareddy

    03/01/2022, 10:11 PM
    Team, has anyone experienced lag while doing CDC on Postgres, together with the source database's disk space filling up? I am running the sync every 5 minutes.
    👀 1

    Yiyang (Heap.io)

    03/02/2022, 3:57 AM
    Hi, team. I am using the API to create a new workspace. Here’s the payload:
    {
      "email": "support@example.com",
      "anonymousDataCollection": false,
      "name": "string",
      "news": false,
      "securityUpdates": false,
      "notifications": [],
      "displaySetupWizard": false
    }
    I specify displaySetupWizard as false, but I still see the wizard after I use the newly generated workspaceId to access the workspace, for example http://localhost:8000/workspaces/7c7922da-9199-4610-b4e3-3de5686abd0a. How do I bypass the UI? I was redirected to workspaces/:id/preferences/. The UI is used by our internal support team.
    👀 1
    ✅ 1
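
    For reference, a minimal sketch of sending that payload to a local OSS deployment, assuming the config API is exposed at localhost:8000 (whether the wizard is actually skipped still depends on how the webapp honours displaySetupWizard):

    import requests

    API = "http://localhost:8000/api/v1"  # assumed local OSS deployment

    resp = requests.post(f"{API}/workspaces/create", json={
        "email": "support@example.com",
        "anonymousDataCollection": False,
        "name": "internal-support",       # placeholder workspace name
        "news": False,
        "securityUpdates": False,
        "notifications": [],
        "displaySetupWizard": False,      # intended to hide the setup wizard
    })
    resp.raise_for_status()
    workspace_id = resp.json()["workspaceId"]
    print(f"http://localhost:8000/workspaces/{workspace_id}")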

    Hampus Åström

    03/02/2022, 8:54 AM
    Hi everyone, I’m following the Getting Started guide and everything works until I come to ‘Check the data of your first sync’. When I run ‘cat /tmp/airbyte_local/json_data/_airbyte_raw_exchange_rate.jsonl’ in the terminal I get ‘No such file or directory’, and when I look in the folder /tmp/airbyte_local/json_data/ it’s empty. Does anyone know why that is?
    👀 1

    Shubham Pinjwani

    03/02/2022, 10:00 AM
    Hello, I want to delete some columns from a table while transferring it from Postgres to BigQuery. If I use dbt, will the columns also get deleted in the airbyte_raw table that's created?
    ✅ 1

    Gal Yadid

    03/02/2022, 10:44 AM
    Hey 🙂, how can I get the token for the API? (Talking about Airbyte Cloud.)
    ✅ 1

    Doğacan Düğmeci

    03/02/2022, 11:03 AM
    Hi, what are Airbyte Cloud's public IPs? I need to whitelist Airbyte in my destination.
    ✅ 1

    Espen Espelund

    03/02/2022, 4:35 PM
    Hey. Evaluating Airbyte after Fivetran quoted us 10-15k USD a month, which seems like a giant waste of money to extract and load some data automatically. Can someone help me out with these questions?
    • Does Airbyte support < 5 min syncing?
    • How stable are the Facebook, AdWords and other major ad connectors? Are they production-ready at scale, or can we expect a lot of troubleshooting or possible downtime over multiple days if things change?
    • How stable is the MySQL CDC?
    • What is the performance like compared to Fivetran?
    • What are the biggest differences from Meltano and Rudderstack, which seem to be the other open-source alternatives?
    • Which of the open-source platforms is most pleasant to integrate a custom REST API source on?
    • Any real-time / streaming data functionality?

    Saif Mahamood

    03/02/2022, 8:04 PM
    👋 Hey. When trying to connect to a Postgres source hosted on AWS RDS, we see this error:
    The connection tests failed.
    Internal Server Error: The Access Key Id you provided does not exist in our records. (Service: S3, Status Code: 403, Request ID: 16D8A8978B2C5C68, Extended Request ID: null)
    The Postgres source doesn’t ask for an AWS Access Key ID, and the Postgres database doesn’t require one. Can anyone help us understand what’s happening?
    🙏 1
    👀 1

    Jonathan Alvarado

    03/02/2022, 9:32 PM
    How is the state storage used? Would there be a problem with in-progress jobs when upgrading a Kubernetes deployment that uses the pre-configured Minio state storage instead of S3? The instructions have you delete the instances and recreate them. I’m wondering if there is some temporary job information stored in Minio/S3 that would affect in-progress or other jobs during an upgrade. Regarding this:
    STATE_STORAGE_MINIO_ACCESS_KEY=minio
    STATE_STORAGE_MINIO_SECRET_ACCESS_KEY=minio123
    👀 1

    Tan Ho

    03/03/2022, 4:21 AM
    Hi guys, I’m looking at the Airbyte API, the source definition list specifically. I noticed that the sourceDefinitionId is the same in my two environments. Is that right? If so, where is it configured, or is it auto-generated when the server is set up?
    {
       "sourceDefinitionId": "71607ba1-c0ac-4799-8049-7f4b90dd50f7",
       "name": "Google Sheets",
       ....
    }
    👀 1
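
    Stock connector definitions are seeded from a catalog bundled with the platform, so their IDs should be identical across installs, while definitions you add yourself get a freshly generated UUID. A quick way to check is to dump the definition list from each environment and compare; a small sketch, with the two base URLs as placeholders:

    import requests

    # Placeholders: point these at the two Airbyte deployments being compared.
    ENVIRONMENTS = {
        "env-a": "http://localhost:8000/api/v1",
        "env-b": "http://other-host:8000/api/v1",
    }

    for name, api in ENVIRONMENTS.items():
        defs = requests.post(f"{api}/source_definitions/list", json={}).json()
        ids = {d["name"]: d["sourceDefinitionId"] for d in defs["sourceDefinitions"]}
        print(name, ids.get("Google Sheets"))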

    Chukwuebuka Akwiwu-Uzoma

    03/03/2022, 4:26 AM
    Hello, I am new to Airbyte and Docker, and I have issues setting both up, on my local machine and on EC2 running Amazon Linux 2. Locally, when trying to clone the git repo I get the "Filename too long" error. How can I successfully clone and get past this stage? Please refer to the first image. On EC2, when trying to access Airbyte via server_ip:8000, I get the error "Cannot reach server. The server may still be starting up." See the second image.
    ✅ 1
    👀 1

    Niclas Grahm

    03/03/2022, 8:00 AM
    Hi friends, my current client hosts several databases (MSSQL and Oracle), all on-prem in a Windows-based intranet. The MSSQL servers all use trusted/Windows LDAP authentication. Am I correct in assuming that there is no way for Airbyte to connect to the databases this way?
    ✅ 1

    Andrei Batomunkuev

    03/03/2022, 10:08 AM
    Source: Shopify. Destination: Postgres. Hello! I have a question regarding duplicate rows. I am using the deduped history sync mode, however there are still duplicate rows in orders_line_items. The parent table "orders" doesn't contain duplicate rows, but the child table "orders_line_items" contains duplicates. What would be a good approach to filter out duplicate rows? I have some ideas in mind using dbt models:
    1. Use id as a filter: SELECT DISTINCT ON (id)
    2. Use _airbyte_line_items_hashid as a filter: SELECT DISTINCT ON (_airbyte_line_items_hashid)
    Is it safe to use this approach?
    👀 1
    👍 1

    Shubham Pinjwani

    03/03/2022, 11:28 AM
    Currently, is there any way we can delete columns or rows, or apply transformations, in the airbyte_raw table?
    ✅ 1

    Sania Zafar

    03/03/2022, 12:21 PM
    Hi, I have created a MongoDB source and a BigQuery destination; however, my data is missing certain rows from the table. How can I fix this?
    👀 1

    Chukwuebuka Akwiwu-Uzoma

    03/03/2022, 6:16 PM
    Hello, a call for help on a data normalization issue. I selected Normalized tabular data for my destination (Redshift) (see the first image). However, my data still inserts as a JSON blob (please see the second image). How can I ensure it inserts into my Redshift warehouse as separate rows and columns (as in my source)?
    ✅ 1

    Shubham Pinjwani

    03/04/2022, 7:22 AM
    Is there any option for using dbt or custom transformations with the BigQuery Denormalized destination? When I tried to Export Plain SQL files it gave an error saying the profiles.yml and dbt_project.yml files were not found.
    ✅ 1

    Jan Tore Stolsvik

    03/04/2022, 8:53 AM
    Hi! We have just started using Airbyte and it looks very promising 🙂 We want to use it in the embedded way, as explained here: https://airbyte.com/embed-airbyte-connectors-with-api. We want to use the Airbyte APIs from our own application, so our customers can set up the integrations they need. I searched Slack and see there are several questions about this; the answer is usually to look at the API docs or sign up for the cloud waiting list. I would love to learn more about the difference between using Airbyte Cloud and self-hosting, but unfortunately we are situated in the EU, so we will have to wait a while I guess. So, I would love to learn in what ways Airbyte Cloud will make “powered by Airbyte” easier, and more about how we should be doing this if we need to self-host for now. Do we create one workspace for each customer, etc.? Are there any resources on the topic, or anyone I could have a chat with?
    ✅ 1

    Chukwuebuka Akwiwu-Uzoma

    03/05/2022, 8:27 AM
    Hello! So, I started off with Airbyte last week and have been able to set up and run data movements between different sources. Now I am trying to integrate dbt custom transformations, but they never run. Notes: I am using dbt Cloud, not self-hosted. My Airbyte is hosted locally. I can connect to my sources and move data, but transformations don't work (see the transformation settings attached). Question: how can I get dbt custom transformations to work? Does the dbt integration work for both dbt Cloud and dbt on-prem? Please help 🥲 Thank you.
    👀 1