# ask-community-for-troubleshooting

    Disha

    05/31/2023, 6:47 PM
    Can we do transformations if the destination is Teradata Vantage, or is that not supported currently on local Airbyte?

    Disha

    05/31/2023, 7:21 PM
    I have built a local connector container for GitHub in my dev environment and can see it in the Docker images, but I need to run it. Any idea how to run it?
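A locally built Airbyte connector image is normally exercised with docker run and the standard connector verbs (spec, check, discover, read). A minimal sketch, assuming a hypothetical local tag airbyte/source-github:dev — the real tag is whatever the build produced:

```python
import subprocess  # only needed if you actually launch the container

IMAGE = "airbyte/source-github:dev"  # hypothetical tag given at build time

def docker_cmd(verb):
    # Airbyte connector images respond to the standard verbs:
    # spec, check, discover, read.
    return ["docker", "run", "--rm", IMAGE, verb]

# "spec" is the cheapest smoke test: the connector prints its JSON spec.
print(" ".join(docker_cmd("spec")))
# To really run it: subprocess.run(docker_cmd("spec"), check=True)
```

check and read additionally need config (and catalog) files mounted into the container, so spec is the simplest first step to confirm the image runs at all.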

    Octavia Squidington III

    05/31/2023, 7:45 PM
    🔥 Office Hours starts in 15 minutes 🔥 Topic and schedule posted in #C045VK5AF54. At 1PM PDT click here to join us on Zoom!

    Paulo José Ianes Bernardo Filho

    05/31/2023, 11:50 PM
    Guys, what do you do when pandas throws this error: 2023-05-31 23:45:25 destination > Out of bounds nanosecond timestamp: 953-12-03 00:00:00 Source: postgresql Destination: AWS Lake
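This error comes from pandas rather than Airbyte: pandas datetime64[ns] values only cover roughly the years 1677-2262, so a year-953 date coming out of Postgres cannot be represented. A minimal illustration; the errors="coerce" workaround shown is one option, and would have to be applied wherever the conversion actually happens:

```python
import pandas as pd

# pandas stores datetimes as nanoseconds since 1970 in a signed 64-bit int,
# so only ~1677-09-21 .. 2262-04-11 is representable.
print(pd.Timestamp.min, pd.Timestamp.max)

# A year-953 date raises OutOfBoundsDatetime by default; with errors="coerce"
# the unrepresentable value becomes NaT instead of failing the whole run.
ts = pd.to_datetime(["2023-05-31", "0953-12-03"], errors="coerce")
print(ts)  # second value becomes NaT
```

Whether dropping such rows to NaT is acceptable depends on the data; the alternative is to cast the column to a string or date type before it reaches pandas.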

    Nico Gallinal

    06/01/2023, 6:30 AM
    Hi! Was the capability of exporting configurations removed in version 0.44.4?

    Damien Blenkinsopp

    06/01/2023, 8:07 AM
    Re: Amazon Seller Partner connector
    • This was recently removed, as there were some bugs preventing it from connecting with Amazon, and the Airbyte team is working on it.
    • This connector is a big differentiator for Airbyte (other ETLs don’t have it, yet many companies selling on Amazon need it). It’s one of the main reasons we started moving our ETLs from Stitch to Airbyte.
    • Currently we’re using a custom one we built, but maintenance + updates are heavy and it’s clunky, so we would really like to move to Airbyte or another ETL.
    • QUESTIONS
    ◦ Is there any information available on a timeline for the bug fixes on this connector?
    ◦ Has anyone seen an Amazon Seller Partner connection on any other platform?
    ◦ Is this a big need for others of you? It’s actually our most important connector; I’d love to hear from others on this if so.

    Soshi Nakachi仲地早司

    06/01/2023, 10:08 AM
    Hi team. I am currently facing a problem with missing records in Salesforce syncing. I am working with the Salesforce Contact table; the synchronization method is Incremental Sync - Deduped History. The Contact table is heavily updated and pulls in tens of thousands of records daily, and I found that some records are missing.
    1. Synchronization started at 2023-05-28 21:00:00 (state at this point: {"SystemModstamp": "2023-05-27T21:15:08.000+0000"}).
    2. After synchronization, state is updated to {"SystemModstamp": "2023-05-28T21:10:08.000+0000"}.
    3. On the next synchronization, a few records were found to be missing (I checked the raw table), records whose SystemModstamp had the following values:
    ◦ 2023-05-28 21:15:07 UTC
    ◦ 2023-05-28 21:15:06 UTC
    I can see why they are missing: the value of SystemModstamp, which is the cursor field, is later than the state recorded at step 2. How can this be resolved? https://discuss.airbyte.io/t/missing-records-in-salesforce-incremental-synchronization/4577

    Adrian Aioanei

    06/01/2023, 10:42 AM
    Hello, I have one question, guys. I started using the Jira connector, and from what I can see it’s in beta now. Is there any page where I can follow this connector’s development? Or maybe someone knows when a release version will be ready? Thanks

    Nico Gallinal

    06/01/2023, 11:53 AM
    Hi, I have a problem with the destination stream prefix: Airbyte creates a tmp table and a final table, and I would like to change that.

    Octavia Squidington III

    06/01/2023, 1:45 PM
    🔥 Office Hours starts in 15 minutes 🔥 Topic and schedule posted in #C045VK5AF54. At 16:00 CEST / 10am EDT click here to join us on Zoom!

    Henrique Melo

    06/01/2023, 2:55 PM
    Hi everyone, I'm running into the following situation, and this does not seem correct: I have some syncs with an S3 destination and sync mode "full refresh | overwrite". They eventually fail during the sync. These failed attempts report that X records have been emitted and no records have been committed. However, the sync does write files to S3, causing incomplete data to be persisted to my destination. Is this expected?

    Simon Collinson

    06/01/2023, 4:59 PM
    We're trying to use the MSSQL connector to transfer several large tables (100M - 1B rows) to Snowflake. The initial sync is timing out, and it looks like the temp files + logs are filling up storage on our VM. Can anyone point me towards best practices / recommendations for managing the initial load of large tables from MSSQL?

    Luis Valor

    06/01/2023, 7:37 PM
    Hello All - I'm trying to set up Metabase as a source in Airbyte. I'm having trouble completing this process because Airbyte is looking for an API endpoint that Metabase removed a week ago. Below are the default endpoints the connector tries to hit. I am unable to specify which endpoints to hit until after the connection has been set up. However, since the connection keeps failing on the activity endpoint, which no longer exists, I'm sort of stuck with this connector. Is there any way to specify which endpoints to hit, or a way to upgrade the connector? Currently using the latest version.
    1. activity - deprecated by Metabase a week ago
    2. card
    3. collections
    4. dashboard
    5. user

    Justin Flannery

    06/01/2023, 7:57 PM
    Is there a way to override the URL provided in Slack notifications? In my case, since I’m deployed in K8s, I get messages like this:
    You can access its logs here: http://airbyte-airbyte-webapp-svc:80/workspaces/4a392011-d406-4e7a-bb87-710662b8c6e9/connections/3a36f0cf-791e-4a43-8b11-c5368e247a77
    But I’d like to replace http://airbyte-airbyte-webapp-svc:80 with our own DNS name that is configured in the Helm chart.

    Dan Cook

    06/01/2023, 9:33 PM
    I can't seem to create a GA4 (Google Analytics) custom report with the set of dimensions I want, even though I can create a report using the very same dimensions in the analytics.google.com Explorer. The circled dimension always causes Airbyte to throw an error, even though the dimension is an official API dimension listed here. And yet if I create a custom report with only pagePathPlusQueryString as a dimension, that works OK!
    Setup:
    • Airbyte Cloud
    • Destination: Snowflake (1.0.4)
    • Source: GA4 (0.2.4)
    Internal message: 400 Client Error: Bad Request for url: https://analyticsdata.googleapis.com/v1beta/properties/350528718:runReport
    Failure origin: source
    Failure type: system_error

    Lihan Li

    06/02/2023, 5:04 AM
    Hi team, this is not an issue, but it would be great for Airbyte to be able to CDC-stream a Postgres database of at least 8-10 TB.

    Victor C

    06/02/2023, 7:55 AM
    Hello all, I'm trying to import data from Square to BigQuery. We have developed a custom connector, and I'm testing the Airbyte one. There are a lot of missing records in Airbyte and I can't understand why. The job runs fine, I'm using the same identifiers and targeting the same environments, but I only get a fraction of the records I retrieve with my custom connector. I don't have much more to share, as there isn't any error log or failure I can look at. Is this an issue other people have been running into? Thanks!

    Chidambara Ganapathy

    06/02/2023, 9:02 AM
    Hi all, I am trying to configure the Okta source connector. Using an API token it works fine, but while using OAuth 2.0 it does not: I got the client ID, client secret, and refresh token, but the source is not getting configured.

    Abdelmoughit Errachid

    06/02/2023, 9:07 AM
    Hi, any idea how to trigger a full resync for a connection? or for a specific table?
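In open-source Airbyte, a full resync for a connection can be triggered through the configuration API's connections/reset endpoint, which clears the saved state so the next sync re-reads everything; I'm not sure a single-table reset is exposed in every version. A stdlib-only sketch, where the host and connectionId are placeholders you'd replace with your own:

```python
import json
import urllib.request

# Hypothetical values: adjust to your deployment.
AIRBYTE_URL = "http://localhost:8000/api/v1"
CONNECTION_ID = "00000000-0000-0000-0000-000000000000"

def reset_request(connection_id):
    """Build the connections/reset call; POSTing it clears state and data,
    so the following sync is effectively a full resync."""
    return urllib.request.Request(
        f"{AIRBYTE_URL}/connections/reset",
        data=json.dumps({"connectionId": connection_id}).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = reset_request(CONNECTION_ID)
print(req.get_method(), req.full_url)
# To actually fire it against a running instance: urllib.request.urlopen(req)
```

For a single table, the fallback is usually to disable the other streams (or change the one stream's sync mode) and run a reset/sync, then restore the settings.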

    Chidambara Ganapathy

    06/02/2023, 9:13 AM
    Hi all, I am trying to configure the Okta source connector. Using an API token it works fine, but while using OAuth 2.0 it does not: I got the client ID, client secret, and refresh token, but the source is not getting configured.

    Gilberto Vilar

    06/02/2023, 11:46 AM
    Hi all! I am using Airbyte Open Source and I want to build an application on top of Airbyte. To do so, I need to list connection, source, and destination information. What's the best way to do it?
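The open-source configuration API exposes workspace-scoped list endpoints (connections/list, sources/list, destinations/list), which is one way to back such an application. A stdlib-only sketch with placeholder host and workspaceId:

```python
import json
import urllib.request

# Hypothetical values for a local OSS deployment: adjust to yours.
AIRBYTE_URL = "http://localhost:8000/api/v1"
WORKSPACE_ID = "00000000-0000-0000-0000-000000000000"

def list_request(resource):
    """The config API lists resources per workspace via POST .../<resource>/list."""
    return urllib.request.Request(
        f"{AIRBYTE_URL}/{resource}/list",
        data=json.dumps({"workspaceId": WORKSPACE_ID}).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

for resource in ("connections", "sources", "destinations"):
    print(list_request(resource).full_url)
# Actually fetching one list from a running instance:
#   json.load(urllib.request.urlopen(list_request("sources")))
```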

    Abdul Hameed

    06/02/2023, 12:41 PM
    Hi team, I am trying to set up Airbyte on my local machine, but I am unable to run the bash script using VS Code after cloning the Git repository.

    Albert Wong

    06/02/2023, 12:45 PM
    Is it possible to sync a single stream for a connection with hundreds of streams without syncing all streams?

    Hiroto Yamakawa

    06/02/2023, 3:48 PM
    Hello, has anyone been able to set a different table name than the stream name (using the UI, the API, Octavia, or building a connector)? I have the Gsheet connector in mind, but it could be interesting for any of them.

    Daniel Pietschmann

    06/02/2023, 4:08 PM
    Hello folks, I have the Amazon Seller Partner API connected for a client. I compared the results of the data I receive from the Airbyte sync (using the raw JSON) with the report I got from Amazon Seller Central. I am confused, since I see some differing values. For example, I identified that in the raw JSON from Airbyte I have some rows which are a sum of multiple rows from the Amazon Seller Central report. Did someone experience something like that before?

    Nicolas Bucardo

    06/02/2023, 5:12 PM
    Hi everyone, I have an issue with the GCS (Google Cloud Storage) connector. I configured my bucket path as landing/${STREAM_NAME}/partition_date=${YEAR}-${MONTH}-${DAY}/. In my case, ${STREAM_NAME} = feriadosNacionales. But when I synced the table, the output file ended up in the following path: landing/feriadosNacionales/partition_date=${YEAR}-${MONTH}-${DAY}/feriadosNacionales/. It always adds the stream_name to the end of the path, and I don't want that. Do you know how to solve it?

    Ignacio Reyna

    06/02/2023, 7:11 PM
    Hello! Do you know if there are any alternatives to IRSA auth when using Airbyte on EKS?

    Octavia Squidington III

    06/02/2023, 7:45 PM
    🔥 Community Office Hours starts in 15 minutes 🔥 At 1pm PDT click here to join us on Zoom!

    Roman Žydyk

    06/03/2023, 2:48 PM
    Hello, is it possible to load JSON files from S3 and output it to Iceberg?

    Matheus Barbosa

    06/04/2023, 4:22 PM
    Hi, I’m having issues when trying to ingest data from Stripe into a ClickHouse DB. I’m getting this error:
    Failure Origin: normalization, Message: Normalization failed during the dbt run. This may indicate a problem with the data itself.