# ask-community-for-troubleshooting
  • Ana Loureiro (01/31/2023, 11:24 AM)

    Hi! Could someone help me with this issue? https://github.com/airbytehq/airbyte/issues/22062 Airbyte is killing the job because it reaches the memory limit, but I don't know how to decrease the memory usage in this case. This is a Jira <> Snowflake connection. Thank you!
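A note for readers hitting the same limit: in Docker deployments, the memory given to Airbyte's job containers can be tuned through environment variables in the `.env` file read by `docker-compose`. The values below are illustrative; tune them to the host:

```
# .env -- illustrative values
JOB_MAIN_CONTAINER_MEMORY_REQUEST=1Gi
JOB_MAIN_CONTAINER_MEMORY_LIMIT=2Gi
```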
  • Jean Lorillon (01/31/2023, 11:27 AM)

    Hello everyone 👋 What are the options for syncing two data objects when there's no obvious primary key / foreign key? For example, linking a Stripe Subscription to a Salesforce Deal.
  • Andrzej Lewandowski (01/31/2023, 11:36 AM)

    Hi, how can I speed up the MySQL connector? I'm replicating data from MySQL to Snowflake; after 20 hours I have 106 GB. Airbyte runs on EC2, 5/16 GB of memory is used, ~50% CPU.
  • Emilja Dankevičiūtė (01/31/2023, 1:20 PM)

    Hi all, I can see you're a bit overwhelmed 😅 hopefully I can reach the developers here. What's the purpose of the `actor_definition_workspace_grant` table for OSS deployments? (I can't see anything in the Cloud pricing about limiting connectors either, btw.) We can fully use a connector even though it's not marked as public in the `actor_definition` table, and I can't find any info about it by googling. Is this something unfinished, or is it going to be implemented sometime in the future? At the moment the endpoint `api/v1/source_definitions/list_for_workspace` is causing problems for us because it doesn't return all connectors that are actually available.
  • Alexis Manuel (01/31/2023, 1:43 PM)

    Hi 🙂, I have a question concerning the MongoDB source connector. We are only syncing one stream out of many collections, but we are experiencing latency because of the many queries made against all the collections. From what I've found, this seems to be caused by the discovery phase, which queries every collection to fetch its schema. Are my suspicions accurate? If so, is it possible to discover only the streams that are selected for syncing? We are using Airbyte version 0.40.29 with Mongo source connector 0.1.19, and we schedule the sync with Dagster every 15 minutes.
  • João Cunha (01/31/2023, 1:51 PM)

    Hey, has anyone used the Looker Airbyte connector? If so, do you know what permissions/role the user associated with the API key needs to have? I didn't find it documented anywhere. Thanks 🙏
  • Ron Handler (01/31/2023, 2:19 PM)

    Hi, I'd like to make an attempt at working on and contributing a fix for https://github.com/airbytehq/airbyte/issues/8167 Is there anything I should keep in mind before I begin? I noticed https://docs.airbyte.com/contributing-to-airbyte/ mentions the #general and #dev Slack channels, which may have been deleted.
  • jeremiah ishaya (01/31/2023, 2:34 PM)

    Hello, I am trying to use the AirbyteRecord data type conversion from JSON schemas to the built-in data types defined by Airbyte using `well_known_types.yaml`. However, the file is not available at the link provided in the documentation: https://docs.airbyte.com/understanding-airbyte/supported-data-types/
  • Sharath Chandra (01/31/2023, 2:38 PM)

    Hi friends, I have 100 GB of data in Postgres which I want to load into Redshift. I tried using Airbyte and it took 7 hours. Is there any way I can make this data load faster? Please help.
  • jcachat (01/31/2023, 2:44 PM)

    Is there a way to clear previously used payment information from the Airbyte Stripe page, without having to put in a valid alternative option?
  • user (01/31/2023, 2:59 PM)

    Message test.
  • King Ho (01/31/2023, 3:03 PM)

    Hi all, we're using the HubSpot connector, specifically the email_events table as a source, and we're retrieving negative `created` dates. Has anyone else experienced this, or does anyone know why it's occurring? I can confirm that retrieving the same data with the API key directly in Postman works and returns the correct epoch date.
  • Gery (01/31/2023, 3:26 PM)

    Hello, I'm hesitating between Fivetran and Airbyte Cloud. I was hyped to use Airbyte because it's open source, but the destination connector for Postgres is in Alpha 😢 I need to deploy my stack before the 15th of February. Does anyone have visibility on when this connector will be production-ready, please?
  • Adrian Bakula (01/31/2023, 3:41 PM)

    Hi! We're running a k8s deployment of Airbyte. I'm wondering what the temp files are that Airbyte creates when running a sync on a connection? The logs end up being a bit confusing, since I'm seeing:

        2023-01-04 09:33:11 destination > Finished writing data to ed6c9cf9-98bf-4ac3-a38e-c64fa3bde8121712327310262298089.csv.gz (11 MB)
        2023-01-04 09:33:13 destination > closing connection
        2023-01-04 09:33:13 destination > Deleting tempFile data ed6c9cf9-98bf-4ac3-a38e-c64fa3bde8121712327310262298089.csv.gz

    It looks like the file is written and then immediately deleted. Is this the buffer used to send data to the staging directory for Snowflake? FWIW, this is with the Snowflake destination connector.
  • Jake Beresford (01/31/2023, 3:44 PM)

    👋 Hello! I'm new to Airbyte and doing a spike right now, and I'm having trouble connecting to a MongoDB source (hosted on Atlas). The error I'm running into is `Unable to execute any operation on the source!` If anyone has ideas on what might be causing that, I'd appreciate any pointers. Thanks!
  • kashish ahuja (01/31/2023, 3:56 PM)

    Hi everyone, I am trying to create an MSSQL source, but after providing all the details my connection test fails. Here is the error I get while testing: State code: 08S01; Message: The driver could not establish a secure connection to SQL Server by using Secure Sockets Layer (SSL) encryption. Error: "The server selected protocol version TLS10 is not accepted by client preferences [TLS13, TLS12]". Please help me resolve this issue.
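For context on the error above: it typically means the SQL Server instance only offers TLS 1.0, which recent JVMs disable by default. One workaround, at the cost of weaker transport security (the better fix is enabling TLS 1.2 on the server), is to re-enable TLS 1.0 in the `java.security` file of the JVM running the connector; the exact file path varies by image, so treat this as a sketch:

```
# java.security -- remove TLSv1 and TLSv1.1 from the disabled list, e.g.
# before: jdk.tls.disabledAlgorithms=SSLv3, TLSv1, TLSv1.1, RC4, DES, ...
# after:
jdk.tls.disabledAlgorithms=SSLv3, RC4, DES, MD5withRSA, DH keySize < 1024, EC keySize < 224, anon, NULL
```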
  • Lenin Mishra (01/31/2023, 4:27 PM)

    Hello, can anyone suggest good resources/articles for building incremental sync modes for a custom connector in Airbyte?
  • Eduardo lopes (01/31/2023, 5:06 PM)

    Folks, I can't see LinkedIn Pages in Airbyte Cloud, can anybody help me? https://airbytehq.slack.com/archives/C01A4CAP81L/p1674749015891789
  • saurabh bansal (01/31/2023, 5:30 PM)

    Can I change the raw table name from `__airbyte_raw` to some custom name? And the same goes for metadata columns like `_airbyte_ab_id`.
  • saurabh bansal (01/31/2023, 5:32 PM)

    to something like `__mycompany_raw` and `__mycompany_id`
  • jonty (01/31/2023, 5:39 PM)

    Hi, I recently upgraded our MySQL connector to the latest version (1.0.21). We had some huge issues with syncing after that (basically, everything stopped syncing), so we deleted all of our datasets in the destination, deleted all of Airbyte's Connections, and created new Connections from scratch. The initial syncs in the new Connections worked perfectly, but the issue we're now running into is that every subsequent sync is doing the full dataset again, even though we are using the Incremental + Deduped strategy (which has always worked perfectly for us in the past). The logs show the following message for every single Connection and Stream:

        No cursor field set in catalog but not present in state. Stream: [redacted], New Cursor Field: updated_at. Resetting cursor value

    Any ideas on what is causing this, and how we can solve it? Thank you!
  • Mayank V (01/31/2023, 7:08 PM)

    Hi! Can someone please help me with a sample input file for a destination's `--write` check? It requires input for `--config` and `--catalog`. I am not sure how to build the `ConfiguredAirbyteCatalog` argument, or the `AirbyteMessage`s on the CLI to be written to the destination.
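A minimal sketch of the three inputs a destination's `write` command takes, based on the Airbyte protocol: a destination-specific config, a `ConfiguredAirbyteCatalog`, and newline-delimited `AirbyteMessage`s on stdin. All field values here are illustrative, and the config keys depend on the destination's `spec` (this one is loosely modeled on destination-local-json):

```python
import json

# Destination config: keys are destination-specific (illustrative).
config = {"destination_path": "/local/output"}

# A ConfiguredAirbyteCatalog: one stream, synced full-refresh / overwrite.
catalog = {
    "streams": [
        {
            "stream": {
                "name": "users",
                "json_schema": {
                    "type": "object",
                    "properties": {
                        "id": {"type": "integer"},
                        "name": {"type": "string"},
                    },
                },
                "supported_sync_modes": ["full_refresh"],
            },
            "sync_mode": "full_refresh",
            "destination_sync_mode": "overwrite",
        }
    ]
}

# AirbyteMessages arrive on the destination's stdin as newline-delimited
# JSON; RECORD messages carry the actual rows.
messages = [
    {
        "type": "RECORD",
        "record": {
            "stream": "users",
            "data": {"id": 1, "name": "Ana"},
            "emitted_at": 1675209600000,  # epoch milliseconds
        },
    }
]

with open("config.json", "w") as f:
    json.dump(config, f)
with open("catalog.json", "w") as f:
    json.dump(catalog, f)

stdin_payload = "\n".join(json.dumps(m) for m in messages)
print(stdin_payload)
```

The destination container is then invoked roughly as `cat messages.jsonl | docker run -i -v $(pwd):/data <destination-image> write --config /data/config.json --catalog /data/catalog.json`, where the image name and mount path are placeholders.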
  • Jake Beresford (01/31/2023, 7:17 PM)

    Hello! I'm trying to set up a connection between MongoDB (source) and Snowflake (destination). I'm running Airbyte in Docker and running into `Error: non-json response` after about 20 minutes of schema discovery. The most relevant log I can find is:

        [error] 13#13: *36 upstream timed out (110: Connection timed out) while reading response header from upstream , client: 192.168.240.1, server: , request: "GET /api/v1/health HTTP/1.1", upstream: "http://192.168.240.6:80/api/v1/health", host: "localhost:8000", referrer: "http://localhost:8000/workspaces/032d28bb-e354-4277-915d-eb6567d7d495/connections/new-connection"

    I'm not really sure how to proceed with debugging the actual issue; any pointers would be appreciated!
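If the `non-json response` is the proxy giving up before discovery finishes, one avenue (an assumption, not a confirmed fix) is to raise the read timeout on the nginx that fronts the Airbyte server; the config file location inside the webapp container is deployment-specific:

```
# nginx proxy config (location is deployment-specific) -- allow long API calls
proxy_read_timeout    3600s;
proxy_connect_timeout 75s;
```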
  • Jon Simpson (01/31/2023, 7:30 PM)

    I understand the Shopify source is unsupported. We've been unable to complete a single sync at production volumes; after multiple 23-26 hour runtimes, they all fail for similar reasons. I wanted to provide the logs in case they help. Also, we currently have only 1.7M populated users out of an expected 10M, so fetching metadata one at a time is painful. I believe the customers object fetch already includes metafields.
    (attachment: ShopifySyncFailureLogs)
  • Rafael Paraíso Rossim (01/31/2023, 9:23 PM)

    Hello, I'm using Airbyte open source and trying to configure the BigQuery connector. After filling in all the fields, I get the error:

        com.google.gson.stream.MalformedJsonException: Use JsonReader.setLenient(true) to accept malformed JSON at line 1 column 1 path $

    Note: in the Service Account Key JSON field I am putting the "private_key_id" value from my JSON. Could you help me please?
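For context: the Service Account Key JSON field expects the entire key file downloaded from Google Cloud, not just the `private_key_id` value, which is why the parser reports malformed JSON at column 1. The pasted value should be the whole object, roughly this shape (values redacted/illustrative):

```
{
  "type": "service_account",
  "project_id": "my-project",
  "private_key_id": "0123abcd...",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  "client_email": "airbyte@my-project.iam.gserviceaccount.com",
  "client_id": "123456789012345678901",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token"
}
```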
  • Corbin Beebe (01/31/2023, 9:27 PM)

    Hi! I am currently trying to connect Redshift Serverless as a destination. I'm using Airbyte open source, as this is a proof-of-concept project for some of our data needs. However, I am getting the following error during the connection process; any insight would be greatly appreciated:
  • Michael Yee (02/01/2023, 2:32 AM)

    I'm testing out syncing a bunch of tables from a MySQL DB to AWS S3 in Parquet with Snappy compression. It's taking a long time, and I'm wondering whether I didn't set something up right or it just takes hours and hours. Could anyone point me to documentation or describe how to sync faster? One thought I had was to create a few pipelines that sync different tables at the same time...
  • Daniel Snell (02/01/2023, 7:27 AM)

    Hey team, I'm evaluating Airbyte for a few different use cases.
    • Can workspaces be used to broker my users' connections from their data into our systems?
    • Does Airbyte work with AnalyticsJS natively (or via a connector)?
  • Ihor Konovalenko (02/01/2023, 8:03 AM)

    Hi all. I use an external DB (an AWS RDS Postgres instance) as the Airbyte configuration database. After upgrading the Airbyte minor version (from 0.40.26 to 0.40.32), is it safe to keep using the same DB?