# ask-community-for-troubleshooting

    Ramesh Shanmugam

    11/10/2022, 10:31 PM
    Downloading data from Snowflake: full refresh works, but incremental sync with a timestamp cursor is not working. Any idea why it fails?
    ```
    "internalMessage" : "io.airbyte.integrations.source.relationaldb.InvalidCursorException: The following tables have invalid columns selected as cursor, please select a column with a well-defined ordering as a cursor. {tableName='tmp.test', cursorColumnName='LAST_MODIFIED_DATE', cursorSqlType=TIMESTAMP_WITH_TIMEZONE}"
    ```
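The error above indicates the Snowflake source won't accept a `TIMESTAMP_WITH_TIMEZONE` column as an incremental cursor. One workaround (a sketch, not official guidance) is to expose a view that casts the column to a type with a well-defined ordering, such as `TIMESTAMP_NTZ` in UTC, and sync the view instead of the table. The schema/table/column names here come from the error message; the `_sync` view name is invented:

```python
def cursor_view_ddl(schema: str, table: str, tz_col: str) -> str:
    """Build DDL for a Snowflake view that adds a TIMESTAMP_NTZ (UTC)
    copy of a TIMESTAMP_TZ column, usable as an incremental cursor.
    Names are taken from the caller; adapt to your own schema."""
    return (
        f"CREATE OR REPLACE VIEW {schema}.{table}_sync AS "
        f"SELECT t.*, CONVERT_TIMEZONE('UTC', {tz_col})::TIMESTAMP_NTZ "
        f"AS {tz_col}_utc FROM {schema}.{table} t"
    )

# For the table from the error message:
print(cursor_view_ddl("tmp", "test", "LAST_MODIFIED_DATE"))
```

Run the emitted DDL in Snowflake, point the connection at `tmp.test_sync`, and select `LAST_MODIFIED_DATE_utc` as the cursor.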

    Aaron Pritchard

    11/10/2022, 10:59 PM
    Hey everyone, I'm looking for the right piece for our EL requirements and have probably assessed over 50 tools by now. I'm looking into Airbyte and have two fundamental questions:
    • Can you read/write from/to Trino?
    • Can you read/write from/to generic REST APIs?

    Eli Rosenberg

    11/10/2022, 11:18 PM
    After I restart an Airbyte VM on GCP, I get the following errors when trying to log in via an SSH tunnel. Before the restart, it works fine. Any ideas?
    ```
    channel 4: open failed: connect failed: Connection refused
    channel 3: open failed: connect failed: Connection refused
    channel 3: open failed: connect failed: Connection refused
    channel 4: open failed: connect failed: Connection refused
    channel 3: open failed: connect failed: Connection refused
    channel 3: open failed: connect failed: Connection refused
    channel 4: open failed: connect failed: Connection refused
    channel 3: open failed: connect failed: Connection refused
    ```

    Alex Sher

    11/11/2022, 4:43 AM
    Hey everyone, I am creating a custom low-code connector for an HTTP API. I have a stream A and described the schema of its data in an `A.json` file. Is it possible to set up Airbyte in such a way that when the target HTTP API modifies its schema, Airbyte will stop the sync and notify me about it? I read in the docs that the general Airbyte approach is to be lenient (i.e. ignore new fields if they appear in the API but not in the Airbyte expected schema). But it would be nice to at least know somehow if the real data doesn't follow the expected schema 1:1 🙂
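Airbyte's default behavior is indeed lenient about extra fields, so stopping the sync on drift isn't built in. One option is a small external check that diffs the property names declared in the stream's schema file against a live record from the API and fails loudly on mismatch. A minimal sketch; the property names and sample record are invented for illustration:

```python
def schema_drift(expected_props: dict, record: dict) -> dict:
    """Compare property names declared in a stream's JSON schema
    against the keys of a live API record."""
    declared = set(expected_props)
    observed = set(record)
    return {
        "unexpected": sorted(observed - declared),  # new in the API
        "missing": sorted(declared - observed),     # gone from the API
    }

# Hypothetical A.json "properties" section and a sample API record:
expected = {"id": {"type": "integer"}, "name": {"type": "string"}}
sample = {"id": 1, "name": "x", "created_at": "2022-11-11"}
drift = schema_drift(expected, sample)
print(drift)  # {'unexpected': ['created_at'], 'missing': []}
```

A check like this can run on a schedule and raise an exception (which most orchestrators turn into a notification) whenever either list is non-empty.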

    Rahul Borse

    11/11/2022, 8:57 AM
    Hi, is there any way we can debug the Airbyte code in IntelliJ or another IDE?

    Muhammad Usman

    11/11/2022, 10:53 AM
    Hi, I am facing this error when connecting with the LinkedIn Ads connector: HTTPError('403 Client Error: Forbidden for url: https://www.linkedin.com/oauth/v2/accessToken'). Has anyone else faced it, or can anyone help, please?

    Agung Pratama

    11/11/2022, 11:01 AM
    Hi everyone. I have tried Airbyte from MySQL to PostgreSQL with the `Normalized tabular data` transformation. So far so good, the data successfully synced to the destination. However, when I tried to use an additional Custom Transformation with `dbt` within Airbyte, it failed. But when I ran my `dbt` project manually with `dbt run`, it succeeded. The error log from the Airbyte dashboard isn't really helpful. Is there any way to investigate/troubleshoot this further? What I did in my dbt project is create a view that base64-decodes some columns on the tables ingested by Airbyte (see the screenshot).

    konrad schlatte

    11/11/2022, 11:54 AM
    Hi, I have made some changes and added some streams to a source connector (Retently) and would like to submit a PR. What is the best way of doing this? I couldn't find instructions in the docs.

    Rahul Borse

    11/11/2022, 12:20 PM
    Hi team, just wanted to understand a use case our company is trying to solve. Consider we have HubSpot as the source and S3 as the destination, where the output format is CSV. Can we modify the writer so that, along with generating the CSV in S3, we also put that data into AWS Athena? How can we achieve this so that our company can leverage Airbyte and its components? Thanks in advance.

    Chandrashekhar S

    11/11/2022, 12:49 PM
    Hi, I'm trying to create an HTTP API source for Airbyte. I'm stuck installing dependencies. How can I get rid of this error (shown in the screenshot)? Err msg: ERROR: Could not find a version that satisfies the requirement airbyte-cdk~=0.2

    Riccardo Fuzzi

    11/11/2022, 1:45 PM
    Hey, we are migrating our SQL Server CDC to Google Pub/Sub pipeline from Debezium to Airbyte. We have two main problems: 1) if we configure Incremental with no history, the pipeline puts all the history into Pub/Sub; 2) the produced JSON event shows only the current state of the changed column, not the old state as in Debezium. Is this the expected behavior?

    Tolani Afolabi

    11/11/2022, 2:58 PM
    Good morning, I need help with tabular normalization in BigQuery. I am able to upload raw JSON into BigQuery from HubSpot, but when I transform using tabular normalization I get the error "Pickling client objects is explicitly not supported". Can someone please help? Has anyone had this issue before?

    laila ribke

    11/11/2022, 3:04 PM
    Hi all, any ideas? I have created a MySQL -> S3 -> Redshift connection. It uses standard normalization and no custom dbt transformation. The source emits 10M rows, but I receive only 3M rows at the destination. I even did a Full Refresh | Overwrite and set up a new source and a new destination. What can it be?

    Venkat Dasari

    11/11/2022, 3:20 PM
    Airbyte folks, since Secret Manager or Azure Key Vault is not yet supported, can we use the REST API to trigger Airbyte jobs? And to build a source, destination, and connector on the fly and ingest data?
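Triggering jobs over the API is possible with Airbyte OSS. Below is a sketch using only the standard library; the base URL and connection ID are placeholders, and the `/connections/sync` path is from the v1 Config API (verify it against your deployment's API docs). Sources, destinations, and connections can be created the same way via their respective `create` endpoints.

```python
import json
import urllib.request

AIRBYTE_URL = "http://localhost:8000/api/v1"  # placeholder host/port
CONNECTION_ID = "00000000-0000-0000-0000-000000000000"  # placeholder

def sync_request(base_url: str, connection_id: str) -> urllib.request.Request:
    """Build the POST /connections/sync request for the Airbyte Config API."""
    body = json.dumps({"connectionId": connection_id}).encode()
    return urllib.request.Request(
        f"{base_url}/connections/sync",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Against a live instance, uncomment to actually trigger the sync:
# with urllib.request.urlopen(sync_request(AIRBYTE_URL, CONNECTION_ID)) as r:
#     print(json.load(r))  # response carries the created job's id/status
```

The same pattern works from any orchestrator that can make HTTP calls, which sidesteps the missing secret-manager integration by keeping credentials in your own store.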

    Venkat Dasari

    11/11/2022, 3:28 PM
    Airbyte folks, another question: if I have a large dataset to transfer and I am running Airbyte on k8s, will it spawn multiple workers to bring in the data, or just one worker?

    João Larrosa

    11/11/2022, 3:47 PM
    Hi, mates! Question: I have a Shopify -> BigQuery connection and I'm bringing in the Inventory Items data. But for some reason, not all of the columns described in Shopify's docs are being brought in by Airbyte 😞. For example, "tracked" and "requires_shipping" aren't coming through. Can someone help me with this? Detail: the Shopify app has all the read authorizations checked. Thank you very much!

    Chetan Dalal

    11/11/2022, 4:06 PM
    Hi team, my sync is failing with the error "Failure Origin: source, Message: Something went wrong within the source connector". PFA the error logs. Source: Elasticsearch. Destination: SQL. Btw, I am using OpenSearch 1.3 as the source.
    6cb3195c_ae87_40a3_8dbd_d3c34047e884_logs_2_txt.txt

    Vincent Koppen

    11/11/2022, 4:47 PM
    Hello everybody! The Amazon Ads connector seems to only allow "reportDate" as the cursor field in incremental sync mode. Why is "updatedAt" not used as a cursor field, and is there any way to configure this? Thank you!

    Adnan

    11/11/2022, 5:43 PM
    Hi all, has anyone tried integrating QuickBooks? I need some help configuring the QuickBooks side, as I'm getting empty tables in the warehouse.

    Venkat Dasari

    11/11/2022, 8:11 PM
    Does Airbyte support serving on port 443 instead of 8080?

    João Larrosa

    11/11/2022, 8:20 PM
    Hey, guys! Can anybody help me with this?

    Rytis Zolubas

    11/12/2022, 10:59 AM
    Hello, how can I move Airbyte's Docker-internal Postgres server to RDS without losing any data?

    Narendra Kadiri

    11/12/2022, 4:27 PM
    Hi team, we recently started data ingestion using Airbyte (installed locally using the Docker image) from a SQL Server source. For one of the sources, we are getting an error while fetching the source schema.

    Rytis Zolubas

    11/13/2022, 9:27 AM
    Hello! In Airbyte v0.40.17 I don't see the export-configuration option. How can I export the configuration?

    Steven Wilber

    11/13/2022, 3:56 PM
    Hi, I'm trying to use Prefect to trigger an Airbyte flow, but it fails at the Airbyte health check. I can see the following in the logs:
    httpx.HTTPStatusError: Client error '401 Unauthorized' for url 'http://localhost:8000/api/v1/health/'
    But when I check that URL myself, it works fine and returns:
    {"available":true}
    Is this some HTTPS thing? Any help is much appreciated. Thanks.
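If the instance is a recent OSS version, the API is typically behind the proxy's HTTP basic auth (defaults `airbyte` / `password` in `.env`), and a browser may silently reuse cached credentials while Prefect sends none, producing exactly this 401. A sketch of the same health check with explicit basic auth, standard library only; adjust the credentials to your `.env`:

```python
import base64
import json
import urllib.request

def health_request(base_url: str, user: str, password: str) -> urllib.request.Request:
    """Build a GET /api/v1/health request carrying HTTP basic auth,
    as the Airbyte proxy expects when auth is enabled."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return urllib.request.Request(
        f"{base_url}/api/v1/health",
        headers={"Authorization": f"Basic {token}"},
    )

# Against a live instance, uncomment to verify the credentials work:
# with urllib.request.urlopen(
#         health_request("http://localhost:8000", "airbyte", "password")) as r:
#     print(json.load(r))
```

If this succeeds where an unauthenticated request gets 401, the fix on the Prefect side is to supply the same username/password in its Airbyte connection block.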

    laila ribke

    11/13/2022, 4:59 PM
    Hi everyone, I'm trying my luck again. I have a MySQL -> Redshift connection with S3 staging and basic normalization. I'm syncing 6 tables, all incremental with history deduped. 5 tables sync perfectly, receiving records from 2018 on. But for one table, which should have 10 million records, I receive only 400k, with data going back only to 2022-20-02. I've set up a new source and a new destination, and it's the same.

    Dr. Ori Cohen

    11/13/2022, 6:43 PM
    Hi, I need a bit of help with pricing. Is anyone here from the team who can tell me how much it costs to sync a 1 GB DB with a single table and 1M records?

    Faris

    11/14/2022, 12:26 AM
    Hello everyone, I have a Postgres source (a read replica) and a Postgres destination, and I face a failure whenever the sync runs. I select only some of the tables to sync; some tables are full refresh whereas others are set to deduplication. The error message:
    ```
    Sync Failed
    Last attempt: 67.48 MB | 512,010 emitted records | 450,000 committed records | 2m 10s
    Failure Origin: source, Message: Something went wrong in the connector. See the logs for more details.
    7:39AM 11/14 · 3 attempts
    2022-11-13 23:44:58 - Additional Failure Information: java.lang.RuntimeException: org.postgresql.util.PSQLException: FATAL: terminating connection due to conflict with recovery Detail: User query might have needed to see row versions that must be removed. Hint: In a moment you should be able to reconnect to the database and repeat your command.
    ```
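The "conflict with recovery" failure is standard Postgres hot-standby behavior: a long-running query on the read replica is cancelled when WAL replay needs to remove row versions the query could still see. Two standby-side settings commonly mitigate it (a config sketch; both trade replica lag or primary-side bloat for query survival, so weigh them for your setup):

```sql
-- On the standby: allow queries to delay WAL replay instead of being cancelled.
ALTER SYSTEM SET max_standby_streaming_delay = '900s';
-- Or: report the standby's oldest snapshot back to the primary so the needed
-- row versions are retained there (at the cost of bloat on the primary).
ALTER SYSTEM SET hot_standby_feedback = on;
SELECT pg_reload_conf();
```

Either setting should let a multi-minute sync finish against the replica; alternatively, sync from the primary or from a dedicated replica where longer replay delay is acceptable.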

    Mukul Gopinath

    11/14/2022, 5:13 AM
    https://discuss.airbyte.io/t/lag-data-in-google-search-console-source/3184 Could someone help me with the data lag for the Google Search Console source?

    Naren Kadiri

    11/14/2022, 6:19 AM
    Hi team, can someone help: is there any size limit when fetching the source schema from SQL Server?