# advice-data-ingestion
  • Lucas Wiley
    08/15/2022, 5:51 PM
    Where does Airbyte Open Source store credentials once they are entered in the UI?
  • Marissa Pagador
    08/15/2022, 9:14 PM
    Hi team, is there a way to set sync frequency for a custom connector greater than 24 hours? For example 36 or 48 hours?
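On sync frequency: the preset intervals in the UI top out at 24 hours, so a longer cadence generally means cron-style scheduling (if your Airbyte version exposes it) or triggering syncs externally through the API; treat both as version-dependent. A 48-hour cadence maps cleanly to "every other day", while 36 hours drifts across the clock, as this sketch shows:

```python
from datetime import datetime, timedelta

def run_times(start, every_hours, count):
    """Sync start times at a fixed hourly interval (hypothetical helper)."""
    return [start + timedelta(hours=every_hours * i) for i in range(count)]

# A 36-hour cadence shifts by 12 hours each run and repeats every 3 runs,
# which is why a plain cron expression can't express it directly.
runs = run_times(datetime(2022, 8, 15, 0, 0), 36, 3)
# runs: Aug 15 00:00, Aug 16 12:00, Aug 18 00:00
```

An external scheduler firing every 36 hours and calling the sync-trigger API endpoint sidesteps the UI limit entirely.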
  • James Egan
    08/16/2022, 10:52 AM
    Is there a reason that a FB Marketing Ads report shows 2 ads in the ads report on FB Ads, but when I pull the report through Airbyte it shows no records?
  • Rocky Appiah
    08/16/2022, 2:49 PM
    Is there a way to speed up the number of rows fetched at a time?
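On rows fetched at a time: for JDBC sources this is usually governed by a fetch size (rows the driver pulls per round trip); whether it is configurable or sized adaptively depends on the connector version, so treat specifics as version-dependent. The general pattern, sketched generically with `chunk_size` standing in for a fetch size:

```python
def fetch_in_chunks(rows, chunk_size=10_000):
    """Yield rows in fixed-size batches instead of one at a time,
    trading memory for fewer round trips (generic sketch, not
    Airbyte's implementation)."""
    buf = []
    for row in rows:
        buf.append(row)
        if len(buf) == chunk_size:
            yield buf
            buf = []
    if buf:
        yield buf  # final short batch

chunks = list(fetch_in_chunks(range(25), chunk_size=10))
# three batches: 10, 10, and 5 rows
```

Larger batches cut round-trip overhead at the cost of worker memory, which is the trade-off behind any fetch-size tuning.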
  • Don
    08/16/2022, 10:39 PM
    Is there a soft limit to the number of tables I should ingest through a source?
  • Alex Banks
    08/17/2022, 6:20 PM
    For reading JSONL content from an S3 Source, getting an error "JSON parse error: Column(/DepositToAccountRef) changed from string to object in row 0", but in the enforced schema I provide, I have 'DepositToAccountRef': 'object'. It seems like the Manually enforced JSON schema isn't actually being enforced/applied?
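On the JSONL error above: the parse failure happens when a field is a string in one record and an object in another, and whether the user-supplied schema overrides type inference depends on the S3 connector version. One pre-processing workaround (a sketch of app-side normalization, not the connector's own behavior; the sample values are hypothetical) is to serialize the offending field to a JSON string so every record agrees on one type:

```python
import json

# Hypothetical records reproducing the mixed-type field from the post.
rows = [
    {"DepositToAccountRef": "Checking"},
    {"DepositToAccountRef": {"value": "84", "name": "Checking"}},
]

def stringify_field(row, field):
    """Serialize non-string values so the column has one consistent type."""
    out = dict(row)
    if field in out and not isinstance(out[field], str):
        out[field] = json.dumps(out[field])
    return out

normalized = [stringify_field(r, "DepositToAccountRef") for r in rows]
# both values are now strings; objects survive as parseable JSON text
```

Downstream consumers can `json.loads` the string when they need the object back.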
  • Rocky Appiah
    08/18/2022, 1:06 PM
    2022-08-18 12:52:00 source > Caused by: org.postgresql.util.PSQLException: ERROR: publication "airbyte_publication" does not exist
    2022-08-18 12:52:00 source >   Where: slot "airbyte_slot", output plugin "pgoutput", in the change callback, associated LSN 338C/B861BCF0
    I'm seeing the above when trying to sync a Postgres 10 DB; the first sync works, the second fails. The publication definitely exists.
  • Daniel Rothamel
    08/18/2022, 1:32 PM
    We're ingesting data from Close.com via the Airbyte connector, which is in Alpha. We're running into a pagination issue on the largest tables, because the Close.com API limit appears to be 250100 rows. Is there a way to batch the sync somehow, through Airbyte?
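On the pagination cap: cursor/offset paging is normally the connector's job, so a hard row cap suggests the Alpha connector isn't advancing pages on that stream (worth filing an issue naming the stream). For reference, offset pagination against an API in the style of Close.com's `_skip`/`_limit` parameters (treat the parameter names as an assumption) looks roughly like:

```python
def fetch_all(fetch_page, page_size=100):
    """Pull every row from an offset-paginated endpoint by advancing
    `skip` until a short page signals the end (generic sketch)."""
    rows, skip = [], 0
    while True:
        page = fetch_page(skip=skip, limit=page_size)
        rows.extend(page)
        if len(page) < page_size:
            return rows
        skip += page_size

# Usage against a fake endpoint holding 250 rows:
data = list(range(250))

def fake_page(skip, limit):
    return data[skip:skip + limit]

all_rows = fetch_all(fake_page, page_size=100)
# all 250 rows retrieved across three requests
```

If the connector stops at exactly one page's worth of rows, the loop above is the part that's missing or broken.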
  • Rocky Appiah
    08/18/2022, 3:44 PM
    Is anyone using the pgoutput plugin with a Postgres DB that has a large table with JSON blob values? I see the recommended approach is to use wal2json, but this is a system whose config I'd rather not modify.
  • Gautam
    08/18/2022, 5:18 PM
    Ingestion Advice, looking for any thoughts/inputs on this https://github.com/airbytehq/airbyte/issues/15611
  • Piyawat Pavachansatit
    08/19/2022, 4:31 AM
    Hi, I tried to sync a table with Incremental | Deduped + history mode in Airbyte. The issue is that when I delete some records in the source (MSSQL Server), the table in the destination (BigQuery) doesn't seem to remove those records; I already enabled CDC, btw. The issue only occurs in the deduped table, not in the history table with the "_scd" suffix. It works fine with update transactions.
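For background on the behaviour above: with CDC, a delete should arrive as a record whose `_ab_cdc_deleted_at` column is non-null, and the deduped table is built by keeping the latest version per primary key and then dropping keys whose latest version is a delete. If deleted rows linger, it's worth checking whether the delete events in the `_scd` table actually carry that column. The intended semantics, sketched in plain Python (not Airbyte's normalization code):

```python
def dedupe(history_rows, pk):
    """Keep the latest version per primary key, then drop keys whose
    latest version is a CDC delete (non-null _ab_cdc_deleted_at)."""
    latest = {}
    for row in history_rows:  # assumed ordered oldest -> newest
        latest[row[pk]] = row
    return [r for r in latest.values() if r["_ab_cdc_deleted_at"] is None]

# Hypothetical history (_scd-style) rows: id=2 was deleted at the source.
history = [
    {"id": 1, "v": "a", "_ab_cdc_deleted_at": None},
    {"id": 2, "v": "b", "_ab_cdc_deleted_at": None},
    {"id": 2, "v": "b", "_ab_cdc_deleted_at": "2022-08-19T04:00:00Z"},
]
final = dedupe(history, "id")
# only id=1 survives; id=2's latest version is a delete
```

If the `_scd` rows for deleted records show a null `_ab_cdc_deleted_at`, the problem is upstream of dedup, in CDC event capture.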
  • Dmytro Vorotyntsev
    08/19/2022, 8:20 AM
    Hi team! I'm receiving an error (and a sync failure after the initial one succeeded) in the logs that I didn't find anywhere in the GitHub issues. The Postgres source with CDC configured is failing with the error
    PSQLException: ERROR: publication "airbyte_publication" does not exist
    which doesn't seem correct, since a few log lines above it was clear that the Airbyte worker got a list of tables that are added to the mentioned publication. Details in thread.
  • Marcelo Santoro
    08/19/2022, 4:17 PM
    Hey guys, I am working with Airbyte locally, but in just one ingestion I am getting this error:
    Additional Failure Information: Unhandled error while executing model.airbyte_utils.CUSTOMER__Account_id__sponsored_display_campaigns Pickling client objects is explicitly not supported. Clients have non-trivial state that is local and unpickleable. 2 of 25 ERROR creating incremental model amazon_ads_api.CUSTOMER__Account_id__sponsored_display_campaigns......
    Does anyone know how to resolve this problem? Do I need to change something in the Airbyte configuration?
  • Rocky Appiah
    08/19/2022, 4:33 PM
    Getting the error below, any insight?
    2022-08-19 15:54:53 source > 2022-08-19 15:54:53 ERROR i.d.p.ErrorHandler(setProducerThrowable):35 - Producer failure
    2022-08-19 15:54:53 source > io.debezium.DebeziumException: java.time.format.DateTimeParseException: Text '0' could not be parsed at index 0
    2022-08-19 15:54:53 source >    at io.debezium.pipeline.source.AbstractSnapshotChangeEventSource.execute(AbstractSnapshotChangeEventSource.java:85) ~[debezium-core-1.9.2.Final.jar:1.9.2.Final]
  • Marissa Pagador
    08/19/2022, 7:02 PM
    Hi team, when you first deploy a connector for incremental sync, can you set the state for the first run? Or does it have to run once without a state in order to save a state to use next time?
  • Gayathri Chakravarthy
    08/20/2022, 4:47 PM
    Hello team, I'm hoping someone can throw some light on this behaviour I encountered whilst configuring a Snowflake to Redshift connector. We made schema changes (added & removed columns) in the Snowflake tables we are trying to bring into Redshift as part of a migration. However, no matter how many times we refreshed the schema from within the Airbyte connector, it wasn't picking up the underlying changes we had made in the Snowflake database objects. We finally dropped and recreated the connector, and this time the schema picked up by the connector reflected the state in Snowflake. Is this by design, or is there something I'm missing in my understanding of the connector? Appreciate any thoughts from the community. Redshift destination connector version: 0.3.47; Snowflake source connector version: 0.1.18. Thanks
  • Cristian Ivanoff
    08/22/2022, 2:38 PM
    Hi all, I'm a new user testing Airbyte to ingest data from a DB2 database. I'm getting an error and I can't find out what's wrong. The source table is approx 4,500 rows and Airbyte stops syncing after 132 rows. This is the error:
    "Caused by: java.sql.SQLException: java.nio.charset.MalformedInputException: Input length = 1"
    I hope someone here can point me in the right direction, but it seems that something is wrong with the source data?
  • Ryan
    08/22/2022, 5:21 PM
    Anyone happen to know if the Elasticsearch destination connector supports more than 1 field as a primary key? I'm getting
    i.a.i.d.e.ElasticsearchConnection(extractPrimaryKey):182 - unable to extract primary key
    for every record attempting to be inserted.
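For context on that error: the destination has to build a single document _id from the configured primary-key field(s), and a missing or null value at any of those paths is one way to get "unable to extract primary key" on every record, so it's worth checking that the key fields are present and non-null in the emitted records. A hypothetical sketch of composite-key assembly (not the connector's actual code):

```python
def doc_id(record, key_fields):
    """Compose a document _id from one or more primary-key fields;
    fails loudly when a key field is absent or null (hypothetical)."""
    parts = []
    for field in key_fields:
        value = record.get(field)
        if value is None:
            raise ValueError(f"unable to extract primary key: {field}")
        parts.append(str(value))
    return "_".join(parts)

composite = doc_id({"account": 7, "region": "eu"}, ["account", "region"])
# composite == "7_eu"
```

Under this scheme two key fields work fine, which suggests the per-record data, not the field count, is the first thing to rule out.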
  • Alex Banks
    08/22/2022, 5:24 PM
    Not sure if this is the right place to ask, but is there an update on: https://github.com/airbytehq/airbyte/issues/15339 ?
  • Rocky Appiah
    08/22/2022, 8:38 PM
    Even after reverting, I’m still getting the error below.
    2022-08-22 20:35:44 source > Caused by: io.debezium.DebeziumException: java.time.format.DateTimeParseException: Text '0' could not be parsed at index 0
  • Manas Hardas
    08/22/2022, 9:20 PM
    Been getting this error when trying to set up Zendesk Support as a source
    Internal Server Error: com.google.gson.stream.MalformedJsonException: Use JsonReader.setLenient(true) to accept malformed JSON at line 1 column 1 path $
    Any advice on how to resolve this?
  • Zaza Javakhishvili
    08/22/2022, 10:00 PM
    Please, can someone merge this change: https://github.com/airbytehq/airbyte/pull/15822
  • Rocky Appiah
    08/23/2022, 11:38 AM
    Will PR 15877 be merged to main today? I’d like to deploy and test.
  • Rocky Appiah
    08/23/2022, 1:55 PM
    I have a table which has a unique key, but no PK (I realize it's effectively the same, since both columns are not null). It is difficult for us to add a PK to this table since it's quite large. I tried to use a replica identity on the unique index, but Airbyte doesn't show an incremental option when I refresh the source schema, only full refresh. Is there a way around this? Or do we NEED to add a PK to the table?
    Table "public.test_20220823"
     Column |  Type   | Collation | Nullable | Default
    --------+---------+-----------+----------+---------
     a      | integer |           | not null |
     b      | integer |           | not null |
    Indexes:
        "test_20220823_a_b_key" UNIQUE CONSTRAINT, btree (a, b) REPLICA IDENTITY
    Publications:
        "airbyte_publication_2"
  • Umang Sheth
    08/23/2022, 9:36 PM
    I had a quick question about replication: does Airbyte support parallel replication when we choose multiple streams for a single source? MySQL -> BigQuery
  • Zaza Javakhishvili
    08/24/2022, 3:12 PM
    Guys, can you add some filters and bulk actions for connections? Like a search or filter by status, bulk pause, or bulk delete connections. P.S. How can I "fully" delete connections? I mean from history too.
  • Hong Zhang
    08/24/2022, 11:53 PM
    Morning all, I am encountering an SSLCertVerificationError issue while playing with the onboarding connections, and I also came across the post in the forum as well as the open issue on GitHub. Can anyone please confirm whether importing a custom certificate is still not supported? Thanks
  • Zaza Javakhishvili
    08/25/2022, 3:49 AM
    Guys please check this issue: https://github.com/airbytehq/airbyte/issues/15927
  • MANOJ KUMAR
    08/25/2022, 5:55 AM
    Hi there. I am using Airbyte to transfer data from MongoDB to Snowflake. I created a source connection with MongoDB successfully. But when I go to use this source connection with a destination connection (Snowflake), it stops the service of my MongoDB. Please help.
  • Howard
    08/26/2022, 6:37 AM
    Hi there, I am trying to use Airbyte to connect to Azure SQL Server. The client only allows the Azure AD login method and MFA. Does Airbyte support Azure AD and MFA?