# ask-community-for-troubleshooting
  • n

    Nataly Merezhuk (Airbyte)

    11/09/2022, 7:01 PM
    Hello Airbyte community! We are hosting an open office hour for the next 60 minutes 👋 If you've got questions, please stop by! https://airbyte.com/weekly-office-hours We do this every Wednesday at 11 AM PST!
  • l

    Lucas Gonthier

    11/09/2022, 8:39 PM
    Hi all, I'm currently working with the Airbyte API and octavia-cli. I noticed that the field names in the schemas aren't the same (probably by convention). However, I think it is very inconvenient. For example, if I create a connection using the API (/v1/connections/create), I can specify the field syncCatalog. However, using octavia-cli it is sync_catalog. Is a standardization planned?
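    The camelCase/snake_case gap above can be bridged mechanically until any standardization lands. A minimal sketch (the field names are from the message; the helper names are mine):

```python
import re

def camel_to_snake(name: str) -> str:
    """Convert an API-style camelCase field name to snake_case."""
    return re.sub(r"(?<!^)(?=[A-Z])", "_", name).lower()

def snake_to_camel(name: str) -> str:
    """Convert an octavia-cli snake_case field name back to camelCase."""
    head, *rest = name.split("_")
    return head + "".join(part.capitalize() for part in rest)

def keys_to_snake(payload: dict) -> dict:
    """Recursively rewrite the keys of an API payload to snake_case."""
    return {
        camel_to_snake(k): keys_to_snake(v) if isinstance(v, dict) else v
        for k, v in payload.items()
    }
```

    With this, one payload can feed both tools, e.g. keys_to_snake({"syncCatalog": {...}}) yields {"sync_catalog": {...}}.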
  • r

    Robert Put

    11/09/2022, 9:04 PM
    So I have a Postgres database with partitioned tables, running incremental dedup replication. These tables should never get new rows, since I replicated them during an initial replication, after the data they are partitioned on. So every time normalization runs, these tables take much longer even though they have no new data, so no diff; but if some manual changes are done, I'd like them to replicate... So now I need some way to minimize the high normalization run time after the initial replication when there are no actual changes. Is there some mock data I could push to Snowflake, or could I edit data there to accomplish this? I don't want to make changes directly to the DB if I can avoid it. Or am I approaching this wrong, and should I just drop the tables from the sync job after the initial replication without removing the data?
  • a

    Ameer Hamza

    11/09/2022, 10:07 PM
    Hi guys, can anybody please tell me how to pass api_key in the exchange rate API? I've been following along with the Airbyte docs, but I think the way they pass api_key (previously access_key) is deprecated. I'm getting the following error; if anybody knows, do let me know. response ----> { "success": false, "error": { "code": 101, "type": "missing_access_key", "info": "You have not supplied an API Access Key. [Required format: access_key=YOUR_ACCESS_KEY]" } }
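    The error above is the service rejecting the request, not Airbyte: exchangeratesapi.io-style endpoints expect the key as an access_key query parameter, exactly as the error's "Required format" hint says. A hedged sketch of what the request URL should look like (the endpoint and key are placeholders; check the provider's docs for your plan):

```python
from urllib.parse import urlencode

# Assumed endpoint for the exchange rates service; adjust to your plan.
BASE_URL = "http://api.exchangeratesapi.io/v1/latest"

def build_request_url(api_key: str, base: str = "EUR") -> str:
    # The error message asks for the `access_key=YOUR_ACCESS_KEY` format,
    # so the key goes in the query string under that exact name.
    params = {"access_key": api_key, "base": base}
    return f"{BASE_URL}?{urlencode(params)}"
```

    If the connector UI field is named api_key, it still has to end up on the wire as access_key for this service.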
  • j

    Joel Olazagasti

    11/09/2022, 11:02 PM
    As an Airbyte Cloud customer, am I still able to orchestrate my jobs with Dagster? If so, how do I go about getting credentials/creating a connection between Dagster and an Airbyte cloud instance?
  • a

    AJ

    11/10/2022, 1:59 AM
    Hi, can someone please help with an "image not found" issue related to source and destination connectors? I can download the source connector image manually, but not via Airbyte in a closed environment, and the server logs don't give enough information. Is there any solution to this issue?
  • a

    AJ

    11/10/2022, 2:00 AM
    @Marcos Marx (Airbyte) can you please provide some suggestions.
  • a

    AJ

    11/10/2022, 2:00 AM
    Thanks in advance
  • a

    Agung Pratama

    11/10/2022, 3:27 AM
    Hi, does anyone have experience configuring AWS RDS MySQL 8.0 to enable CDC? In particular, an RDS instance that already has a replica? I wonder how it would affect the existing replica connection if I were to change some parameters.
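    For the CDC question above: Airbyte's MySQL CDC source needs row-based binlogging, which on RDS means a custom parameter group (and typically a reboot, so test on staging before touching an instance with replicas). A hedged checker for the values SHOW VARIABLES would return; the required values below follow the usual MySQL CDC requirements, so verify them against your Airbyte version's docs:

```python
# Binlog settings Debezium-based MySQL CDC sources generally require;
# the keys and values mirror what `SHOW VARIABLES` reports.
REQUIRED = {
    "binlog_format": "ROW",
    "binlog_row_image": "FULL",
    "log_bin": "ON",
}

def check_cdc_ready(variables: dict) -> list:
    """Return (name, found, wanted) for every setting that doesn't match."""
    return [
        (name, variables.get(name), wanted)
        for name, wanted in REQUIRED.items()
        if str(variables.get(name, "")).upper() != wanted
    ]
```

    An empty list means the binlog side looks ready; on RDS you also generally need enough binlog retention for your sync interval, commonly set via the mysql.rds_set_configuration('binlog retention hours', ...) procedure.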
  • b

    Balaji Seetharaman

    11/10/2022, 3:55 AM
    Team, I am observing this issue: https://github.com/airbytehq/airbyte/issues/19274 Please correct me if I am wrong.
  • b

    Balaji Seetharaman

    11/10/2022, 3:57 AM
    Please let me know if my observations are valid
  • d

    Darshan Bhagat

    11/10/2022, 6:44 AM
    Hi Team, is anyone building a Tally ERP connector? We would want to contribute to the development.
  • g

    godlin ampcome

    11/10/2022, 6:53 AM
    Hi, I inserted table values into my database using Airbyte, but I need to change the column values from text to a time and date format using the Airbyte reader option. Could someone guide me, please?
  • r

    Rishabh D

    11/10/2022, 9:32 AM
    Hi team, in the Salesforce source, when the sync mode is selected as 'Incremental append', there are duplicate records (based on id) coming up. May I know if there is an option to just upsert the records?
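    On the duplicates above: "Incremental | Append" intentionally keeps every emitted version of a record; the upsert behaviour is the "Incremental | Deduped + history" sync mode. The dedup it applies is essentially "latest row per primary key", sketched here (field names assume Salesforce's id key and SystemModstamp cursor; adapt as needed):

```python
def dedupe_latest(records, key="id", cursor="SystemModstamp"):
    """Keep only the most recent version of each record, the same idea
    normalization's dedup applies using the stream's primary key."""
    latest = {}
    for rec in records:
        k = rec[key]
        # Later cursor values win; ties go to the later-emitted record.
        if k not in latest or rec[cursor] >= latest[k][cursor]:
            latest[k] = rec
    return list(latest.values())
```

    The same effect can be had in the warehouse with a ROW_NUMBER()-style query partitioned by id, ordered by the cursor descending.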
  • w

    Wichitchai Buathong

    11/10/2022, 9:47 AM
    Hi Team, I got an error on a connection from MS SQL to Postgres: "2022-11-10 09:33:38 - Additional Failure Information: java.lang.RuntimeException: com.microsoft.sqlserver.jdbc.SQLServerException: The conversion of a varchar data type to a datetime data type resulted in an out-of-range value." How can I edit the type at the destination? I can't edit the source data type, by the way. Big thanks in advance from Thailand ^^
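    The out-of-range error above usually means some varchar values fall outside SQL Server datetime's valid window (1753-01-01 through 9999-12-31) or aren't parseable dates at all. A hedged helper to locate the offending values before deciding how to fix the column (the format string is an assumption about your data):

```python
from datetime import datetime

# SQL Server's datetime type only accepts 1753-01-01 .. 9999-12-31.
SQLSERVER_MIN = datetime(1753, 1, 1)

def out_of_range(values, fmt="%Y-%m-%d %H:%M:%S"):
    """Return the varchar values that can't be cast to SQL Server datetime."""
    bad = []
    for v in values:
        try:
            parsed = datetime.strptime(v, fmt)
            if parsed < SQLSERVER_MIN:
                bad.append(v)
        except ValueError:
            # Unparseable strings would also fail the conversion.
            bad.append(v)
    return bad
```

    Once the bad rows are known, casting the column to datetime2 (wider range) or cleaning the values on the SQL Server side are the usual fixes.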
  • l

    laila ribke

    11/10/2022, 11:32 AM
    Hi all, I have created a MySQL -> S3 -> Redshift connection. It uses standard normalization and no custom dbt transformation. The source emits 10M rows, but I receive only 3M rows at the destination. I even did a full refresh overwrite. Attaching the logs. What could it be?
    9d9aea6f_8f2e_4ef1_9c3e_0bbf57935780_logs_306_txt.txt
  • y

    yijie Wang

    11/10/2022, 11:38 AM
    Hello Team, we are Fivetran users and would like to try Airbyte. Our column names have mostly been transformed to snake_case by Fivetran's naming convention. Please let me know if Airbyte has a mapper or utility that can assist with this. When we tested Meltano, I had to modify the Snowflake loader source code to achieve this, which is not ideal. Thank you.
  • j

    jonty

    11/10/2022, 12:02 PM
    Hey all, we have a major pain point: adding a new stream/table to an existing connection. Airbyte wants to reset the whole connection! I see there is a github issue on this - the issue has been marked as completed recently, but I don't see this feature in the latest version of Airbyte. Am I missing something obvious?
  • m

    Maykon Lopes

    11/10/2022, 1:02 PM
    Hey Airbyte team, is it possible in Airbyte to have a free-style query to be used for connection extraction?
  • l

    Leah Thomas

    11/10/2022, 2:45 PM
    Hi, the airbyte documentation for creating a postgres connector says that creating a dedicated user is optional, and the UI configuration marks the password as optional. If I want to re-use an existing postgres user for my source and skip passing in the password, is there documentation about how I would set that up? TIA!
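    On skipping the password above: Postgres only waives password auth when pg_hba.conf grants that user trust, peer, or cert authentication, so the setup lives on the server side; the client config then simply leaves the password out. A sketch of the resulting libpq-style DSN (the helper name is mine):

```python
def build_dsn(host, port, dbname, user, password=None):
    """Build a libpq-style DSN; omit the password entry entirely when
    the server authenticates the user another way (trust/peer/cert)."""
    parts = {"host": host, "port": str(port), "dbname": dbname, "user": user}
    if password:
        parts["password"] = password
    return " ".join(f"{k}={v}" for k, v in parts.items())
```

    Whether the Airbyte UI accepts an empty password field depends on the connector version, so that part is worth testing.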
  • s

    Satish Chinthanippu

    11/10/2022, 2:54 PM
    Hi team, is anyone facing the issue below while building a connector / the Airbyte platform?
    2022-11-10 14:49:33 INFO o.t.i.RemoteDockerImage(resolve):75 - Pulling docker image: postgres:13-alpine. Please be patient; this may take some time but only needs to be done once.
    2022-11-10 14:49:34 ERROR c.g.d.a.a.ResultCallbackTemplate(onError):52 - Error during callback
    com.github.dockerjava.api.exception.InternalServerErrorException: Status 500: {"message":"toomanyrequests: You have reached your pull rate limit. You may increase the limit by authenticating and upgrading: <https://www.docker.com/increase-rate-limit>"}
    
            at org.testcontainers.shaded.com.github.dockerjava.core.DefaultInvocationBuilder.execute(DefaultInvocationBuilder.java:247) ~[testcontainers-1.17.3.jar:?]
            at org.testcontainers.shaded.com.github.dockerjava.core.DefaultInvocationBuilder.lambda$executeAndStream$1(DefaultInvocationBuilder.java:269) ~[testcontainers-1.17.3.jar:?]
            at java.lang.Thread.run(Thread.java:833) ~[?:?]
    2022-11-10 14:49:34 WARN o.t.i.RemoteDockerImage(resolve):105 - Retrying pull for image: postgres:13-alpine (119s remaining)
    2022-11-10 14:49:35 ERROR c.g.d.a.a.ResultCallbackTemplate(onError):52 - Error during callback
    com.github.dockerjava.api.exception.InternalServerErrorException: Status 500: {"message":"toomanyrequests: You have reached your pull rate limit. You may increase the limit by authenticating and upgrading: <https://www.docker.com/increase-rate-limit>"}
    
            at org.testcontainers.shaded.com.github.dockerjava.core.DefaultInvocationBuilder.execute(DefaultInvocationBuilder.java:247) ~[testcontainers-1.17.3.jar:?]
            at org.testcontainers.shaded.com.github.dockerjava.core.DefaultInvocationBuilder.lambda$executeAndStream$1(DefaultInvocationBuilder.java:269) ~[testcontainers-1.17.3.jar:?]
            at java.lang.Thread.run(Thread.java:833) ~[?:?]
    2022-11-10 14:49:35 WARN o.t.i.RemoteDockerImage(resolve):105 - Retrying pull for image: postgres:13-alpine (118s remaining)
    
    > Task :airbyte-api:compileJava
    Note: /root/airbyte/airbyte-api/build/generated/api/client/src/main/java/io/airbyte/api/client/invoker/generated/JSON.java uses or overrides a deprecated API.
    Note: Recompile with -Xlint:deprecation for details.
    
    > Task :airbyte-db:jooq:generateConfigsDatabaseJooq
    2022-11-10 14:49:35 ERROR c.g.d.a.a.ResultCallbackTemplate(onError):52 - Error during callback
    com.github.dockerjava.api.exception.InternalServerErrorException: Status 500: {"message":"toomanyrequests: You have reached your pull rate limit. You may increase the limit by authenticating and upgrading: <https://www.docker.com/increase-rate-limit>"}
  • d

    Domenic

    11/10/2022, 2:57 PM
    I was successful at setting up _Airbyte_ (0.40.18) using _Docker_ (4.13.1) and connecting to OracleDB (0.3.22). I then did another test against an OracleDB, but with a username that had access to a significant volume of tables/views (the entire database), and it failed. I get the response "non-json response" while trying to load the schemas, even though the connection test was successful. Thoughts?
  • t

    Tolani Afolabi

    11/10/2022, 3:25 PM
    👋 Hello, team!
  • t

    Tolani Afolabi

    11/10/2022, 3:28 PM
    Good morning everyone, I am very new to Airbyte; my only experience with it is connecting HubSpot to BigQuery. I am running into an issue when I perform a sync using tabular normalization. This is the error: "16 of 77 ERROR creating table model Marketing.form_submissions.......................................................... [ERROR in 0.74s] Unhandled error while executing model.airbyte_utils.line_items: Pickling client objects is explicitly not supported. Clients have non-trivial state that is local and unpickleable." Can someone please help? Have you had this issue with BigQuery?
  • s

    Sameer Jyani

    11/10/2022, 3:56 PM
    Hello folks, I am a first-time Airbyte user. We are planning to implement it in our company, and before that I wanted to get some practice. I was trying to connect to Google BigQuery to run some tests and am getting the following error. I would appreciate it if anyone could help me.
  • t

    Thomas Bestfleisch

    11/10/2022, 4:30 PM
    Hi team, not sure if this is the right channel. I'm currently developing a new destination for the Exasol database based on Java and JDBC; I'm following the tutorial and extending AbstractJdbcDestination. Currently I'm blocked at the acceptance-test part, and most tests fail with the following exception:
    io.airbyte.workers.exception.WorkerException: Could not find image: airbyte/destination-exasol:dev
    I'm running the acceptance tests from the airbyte root folder; I have also run the spec and check commands successfully before, and I'm able to load the connector in the local Airbyte UI. It would be great to get some guidance from the Airbyte team or from people with experience building Java-based destinations. Thanks a lot.
  • a

    Adil Karim

    11/10/2022, 5:17 PM
    Hi! I'm running Airbyte on Kubernetes, and it seems like Airbyte is spinning up instances using autoscaling. Does Airbyte come with HPA out of the box, or is there something spooky going on?
  • c

    Callum McCaffery

    11/10/2022, 5:55 PM
    Hi team! For the 'file' source connector, is there a way to force the data types of the columns for CSV when creating a connection? It autos everything to string, but I know ahead of time what the column datatypes will be.
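    For the CSV types question above: the File source forwards its "Reader Options" JSON to pandas.read_csv, so a dtype mapping there should pin the columns instead of letting everything fall back to string; worth verifying against your connector version. The column names below are hypothetical:

```python
import json

# Hypothetical columns mapped to pandas dtypes; anything not listed
# still falls back to the reader's own type inference.
reader_options = {"dtype": {"user_id": "int64", "amount": "float64"}}

# This string is what would go into the File source's "Reader Options"
# field in the connection setup form.
reader_options_json = json.dumps(reader_options)
```

    The same reader_options object can also carry other read_csv keyword arguments (separator, header handling, and so on).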
  • r

    Rami M Theeb

    11/10/2022, 8:49 PM
    Hi team, what's the minimum version I must have to be able to send metrics to Datadog?
  • s

    Sharat Visweswara

    11/10/2022, 9:52 PM
    Hello. I'm using the Facebook Marketing source and connecting it to a Redshift destination, and I need to do this across multiple ad accounts. I understand that at this time each ad account needs to be a separate source and therefore a separate connection. I do, however, need all the data collected into a single set of tables on the Redshift side. If I make a change to any of the connections, Airbyte asks to "reset the streams", which results in all the data being deleted, including data that was sourced from other ad accounts. Is it safe to never reset the streams? Are there any better patterns for handling this?