# ask-community-for-troubleshooting
  • a

    Abdulrahman Abu Tayli

    07/04/2023, 6:49 AM
Hi, I’ve upgraded Airbyte to version 0.50.6 and for some reason the transformation feature has disappeared from all of my connections. I’ve tried the following connections: postgres (0.49) -> postgres (0.3.23), postgres (0.49) -> postgres (0.4.0), custom connector -> postgres (0.3.23), custom connector -> postgres (0.4.0). When I ran the connectors after upgrading, I got this message:
    2023-07-04 06:19:57 INFO i.a.c.i.LineGobbler(voidCall):149 - ----- START DBT TRANSFORMATION -----
    2023-07-04 06:19:57 INFO i.a.c.i.LineGobbler(voidCall):149 - 
    2023-07-04 06:19:57 INFO i.a.w.t.TemporalAttemptExecution(lambda$getWorkerThread$5):198 - Completing future exceptionally...
    io.airbyte.workers.exception.WorkerException: Dbt Transformation Failed.
            at io.airbyte.workers.general.DbtTransformationWorker.run(DbtTransformationWorker.java:75) ~[io.airbyte-airbyte-commons-worker-0.50.6.jar:?]
            at io.airbyte.workers.general.DbtTransformationWorker.run(DbtTransformationWorker.java:29) ~[io.airbyte-airbyte-commons-worker-0.50.6.jar:?]
            at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$5(TemporalAttemptExecution.java:195) ~[io.airbyte-airbyte-workers-0.50.6.jar:?]
            at java.lang.Thread.run(Thread.java:1589) ~[?:?]
    Caused by: java.lang.NullPointerException: Cannot invoke "String.toLowerCase()" because "this.normalizationIntegrationType" is null
            at io.airbyte.workers.normalization.DefaultNormalizationRunner.configureDbt(DefaultNormalizationRunner.java:96) ~[io.airbyte-airbyte-commons-worker-0.50.6.jar:?]
            at io.airbyte.workers.general.DbtTransformationRunner.run(DbtTransformationRunner.java:79) ~[io.airbyte-airbyte-commons-worker-0.50.6.jar:?]
            at io.airbyte.workers.general.DbtTransformationWorker.run(DbtTransformationWorker.java:64) ~[io.airbyte-airbyte-commons-worker-0.50.6.jar:?]
            ... 3 more
When I opened the transformation tab (which was configured perfectly before upgrading), I got this message:
    Normalization and Transformation operations are not supported for this connection.
  • m

    Matheus

    07/04/2023, 1:18 PM
Hey, I keep getting this error when trying to set up a source from GA4:
    Internal message: 'expires_in'
    Failure origin: source
    Failure type: system_error
Does anyone know what that could be?
  • o

    Octavia Squidington III

    07/04/2023, 1:45 PM
🔥 Office Hours starts in 15 minutes 🔥 Topic and schedule posted in #C045VK5AF54. At 16:00 CEST / 10am EDT, join us on Zoom!
  • d

    Dipankar Kumar

    07/04/2023, 6:37 PM
Hi team, I am trying to set up Salesforce as a source but am getting the error below: 'An error occurred: {"error":"invalid_grant","error_description":"expired access/refresh token"}'.
  • a

    Andre de Lima e Silva

    07/04/2023, 8:08 PM
Hello, with the SQL connectors, are we able to use queries to replicate data, or just tables? Are there any alternatives?
  • k

    Karandeep Singh

    07/05/2023, 5:23 AM
Hi Team, I am not able to ingest a big table (~10M rows) from MySQL to BigQuery; I get the same error as https://discuss.airbyte.io/t/mysql-5-7-big-table-ingestion-fails/2511.
  • e

    Ekansh Verma

    07/05/2023, 6:51 AM
With respect to the DynamoDB source connector (and Java source connectors in general): how can we increase the default heap size from 1/4 of memory to something else?
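For context on the "1/4" default: the JVM's ergonomics cap the max heap at roughly a quarter of available RAM, and this can be overridden with -Xmx (absolute size) or -XX:MaxRAMPercentage. A minimal sketch; the JAVA_OPTS variable name is an assumption about what the connector image's entrypoint reads, so check your image before relying on it:

```shell
# Hypothetical override: raise the max heap above the ~1/4-of-RAM JVM default.
# -XX:MaxRAMPercentage takes a percentage of available memory; -Xmx takes an absolute size (e.g. -Xmx2g).
export JAVA_OPTS="-XX:MaxRAMPercentage=50.0"
echo "$JAVA_OPTS"
```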
  • d

    Du Trần

    07/05/2023, 7:01 AM
Hi Team, I'm not able to ingest data into MySQL because of the "CREATE SCHEMA IF NOT EXISTS" statement: it will lock if any session is active. https://dev.mysql.com/doc/refman/8.0/en/create-database.html
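For reference, the statement in question (per the MySQL manual linked above) is the idempotent form below; note the spelling is IF NOT EXISTS, and in MySQL SCHEMA is a synonym for DATABASE. The schema name is a placeholder:

```sql
-- Create the target schema only if it does not already exist.
CREATE SCHEMA IF NOT EXISTS airbyte_raw;
```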
  • r

    Rishav Sinha

    07/05/2023, 7:26 AM
Hi Team, I am pushing data into an ElasticSearch destination from a Microsoft Teams source (users stream). With every sync the data is duplicated (appended), although I have set it to Full Refresh | Overwrite. Need help with this!
  • d

    Du Trần

    07/05/2023, 10:27 AM
Hi, with destination-mysql, how do I set MAX_BATCH_SIZE_BYTES?
  • v

    vijay

    07/05/2023, 11:28 AM
Hey, I have an ETL sync from AWS S3 to MongoDB. It takes a lot of time in Airbyte, around 10 minutes, whereas my Node code takes less than 1 minute to sync. How could I optimize it to increase the speed?
  • l

    Lamiya Nabaova

    07/05/2023, 12:05 PM
Hi, I have Oracle DB 19c and MinIO. I want to get my tables from Oracle into MinIO, but when I create a new connection the Sync button isn't enabled. It looks like this:
  • l

    Lamiya Nabaova

    07/05/2023, 12:05 PM
    image.png
  • p

    Precious Agboado

    07/05/2023, 1:31 PM
Hi, our company is moving all its transformations to Airbyte. There is one huge blocker making us look at other solutions: we keep getting a 60-second timeout when we sync large tables into Redshift. We have spent over a month trying to solve it with no success. Any help we can get?
  • s

    Shaye Pearson

    07/05/2023, 1:37 PM
Hi, since upgrading Airbyte I have a sync that says cancelled, but the time keeps going up and I am unable to reset the sync or start a new one, as it says there is already one running. I have restarted Airbyte and the instance it runs on, with no luck. Has anyone experienced this before, or is anyone able to help?
  • h

    Haki Dere

    07/05/2023, 1:43 PM
Is there a way to configure nginx timeouts for the Airbyte web app, which runs on k8s and was deployed via the official Airbyte Helm chart?
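One common approach, sketched below, is to raise the nginx ingress proxy timeouts via annotations. This assumes the chart exposes ingress annotations in values.yaml; the exact key path varies between chart versions, so treat the webapp/ingress nesting as a placeholder:

```yaml
# Hypothetical values.yaml fragment: ingress-nginx proxy timeouts, in seconds.
webapp:
  ingress:
    annotations:
      nginx.ingress.kubernetes.io/proxy-read-timeout: "600"
      nginx.ingress.kubernetes.io/proxy-send-timeout: "600"
```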
  • b

    Benjamin Edwards

    07/05/2023, 3:42 PM
Hi All, regarding the detect and propagate schema changes feature: does this run a full reset on a table with schema changes? I ask because my source DB does not contain all historical data but the destination table in Snowflake does, and I would like to activate this feature provided it does not result in data loss.
  • o

    Octavia Squidington III

    07/05/2023, 7:45 PM
🔥 Office Hours starts in 15 minutes 🔥 Topic and schedule posted in #C045VK5AF54. At 1PM PDT, join us on Zoom!
  • e

    Ekansh Verma

    07/05/2023, 8:32 PM
Hello everyone! I have a DynamoDB source connector that loads data into a Postgres DB. DynamoDB states that the total size of the data is 700 MB, but Airbyte reports the sync size as 3.07 MB. The sync is of type full refresh | overwrite.
  • l

    Liam Coley

    07/05/2023, 10:47 PM
Hi team, is there any way to force-stop a job? I have one that says it’s cancelled and has a failure message (“Failure Origin: airbyte_platform, Message: Something went wrong within the airbyte platform”), but it’s still running and I’m not able to start a new job (“Failed to start sync: A sync is already running for: UUID”). Using 0.50.6; the connector is Amazon Ads v2.1.0. I’ve tried cancelling it via the API endpoint, but no luck.
  • d

    Du Trần

    07/06/2023, 2:58 AM
Hi Team, I'm using destination-mysql with normalization and transformation, with multiple workers. But I have a problem: a deadlock when concurrently inserting into MySQL. Detailed error:
  • f

    Fernanda Face

    07/06/2023, 3:29 AM
This message contains interactive elements.
    fer_logs_3005703_txt.txt
  • v

    vijay

    07/06/2023, 4:52 AM
Hey, another question: when I duplicate a connection, the source and destination are not duplicated. So if I want a different destination for each of the two connections, I cannot update them independently; if I update the destination in one place, it gets updated in the other places too.
  • c

    Chidambara Ganapathy

    07/06/2023, 9:19 AM
Hi, I tried using the Twitter connector. It is asking for a search query. Any idea what it should look like? Thanks.
  • q

    Quang Dang Vu Nhat

    07/06/2023, 9:54 AM
Hi, I’m facing weird behavior from Airbyte. Some of my connections, if they fail while syncing, will keep syncing forever and I can’t stop them, reset data, or take any action. Even if I try to update the state of that sync to “failed” in the Airbyte database, it only changes the state; I still can’t interact with the connection at all.
  • j

    Jayant Kumar

    07/06/2023, 11:11 AM
Hi, I am trying to set up Airbyte using the official Helm chart on K8s. I am getting the following error from Temporal while using an external Postgres DB.
    2023-07-06T11:03:57.281Z        ERROR   Unable to create SQL database.  {"error": "pq: database \"temporal\" already exists", "logging-call-at": "handler.go:97"}
  • e

    Ekansh Verma

    07/06/2023, 12:31 PM
Hi Team! How can I configure the JOB_MAIN_CONTAINER_MEMORY_REQUEST and JOB_MAIN_CONTAINER_MEMORY_LIMIT env variables as a percentage of the total memory?
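These variables appear to take absolute resource quantities rather than percentages. One workaround, sketched below, is to compute the value from the host's total memory at deploy time before exporting it. This is Linux-specific (it reads /proc/meminfo), and the 50% figure is just an example:

```shell
# Compute 50% of total host memory, in Mi, and export it for the job containers.
PCT=50
MEM_LIMIT="$(awk -v p="$PCT" '/MemTotal/ {printf "%dMi", $2*p/100/1024}' /proc/meminfo)"
export JOB_MAIN_CONTAINER_MEMORY_LIMIT="$MEM_LIMIT"
echo "$JOB_MAIN_CONTAINER_MEMORY_LIMIT"
```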
  • o

    Octavia Squidington III

    07/06/2023, 1:45 PM
🔥 Office Hours starts in 15 minutes 🔥 Topic and schedule posted in #C045VK5AF54. At 16:00 CEST / 10am EDT, join us on Zoom!
  • t

    Thibaut B

    07/06/2023, 2:21 PM
Hello there 👋 I’m trying to release the Salesforce connector (which is GA) to production, but I still experience these bugs: https://github.com/airbytehq/airbyte/issues/27048 and https://github.com/airbytehq/airbyte/issues/27146. I launch an incremental replication, but a huge amount of data is missing from the destination. In the ticket, they mentioned that rolling back to 2.0.5 fixed their bug, but unfortunately I can’t do that, because I need the new logic implemented in the latest versions of the connector: the stream_slices are necessary, because otherwise my replication with Airbyte runs into a TIMEOUT (we have a large amount of data). Note: even 2.1.0 was timing out for us, but thanks to Marcos, I rebuilt the connector, changing the step at https://github.com/airbytehq/airbyte/blob/master/airbyte-integrations/connectors/source-salesforce/source_salesforce/streams.py#L582C5-L582C22 from 30 to 1 day; after that we didn’t hit the TIMEOUT. I don’t mind rebuilding things, but touching the slicing logic seems to require experience and knowledge I don’t have, as I haven’t built a connector on Airbyte. Can you please help? Many thanks 🙏
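On the slicing logic itself: the general idea behind stream_slices is to split the replication window into fixed-size date ranges, so each API request covers a bounded amount of data; shrinking the step (e.g. from 30 days to 1 day, as described above) trades more requests for smaller, timeout-safe ones. A generic sketch of the pattern, not the connector's actual code (names are illustrative):

```python
from datetime import datetime, timedelta
from typing import Iterator, Tuple

def date_slices(start: datetime, end: datetime, step_days: int) -> Iterator[Tuple[datetime, datetime]]:
    """Yield consecutive [window_start, window_end) ranges covering [start, end)."""
    cursor = start
    step = timedelta(days=step_days)
    while cursor < end:
        window_end = min(cursor + step, end)
        yield cursor, window_end
        cursor = window_end

# A 90-day window sliced at 30 days yields 3 requests; sliced at 1 day, 90 requests.
window_start, window_end = datetime(2023, 4, 1), datetime(2023, 6, 30)
```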
  • s

    Semyon Komissarov

    07/06/2023, 3:59 PM
Hello, team! I have the following setup: Airbyte 0.40.23, airbyte/source-postgres 2.0.34, airbyte/destination-bigquery 1.4.4. For the BQ destination I use the “GCS staging” loading method. I am running syncs on a VM in GCP (4 cores, 16 GB RAM). All of the syncs are very long (from 75 minutes to 5 hours), but they are not big: 50-200 MB. How can I speed up these syncs?