# ask-community-for-troubleshooting
  •

    Alex Zeleznikov

    06/15/2022, 1:42 PM
    How can I disable S3 logs?
  •

    Fabián Escobar

    06/15/2022, 2:40 PM
    Hello, I want to use Airbyte with Salesforce. What are the prerequisites that I need on the Salesforce side?
  •

    Akarsh Konchada

    06/15/2022, 3:20 PM
    Is GCS not a source in Airbyte?
  •

    Tomas Perez

    06/15/2022, 7:40 PM
    Has anyone used plural.sh to deploy Airbyte in EKS? Is there any approach to calculating the cost of Airbyte running in EKS?
  •

    Gaurang Swarge

    06/15/2022, 9:54 PM
    Hello everyone, I have an Airbyte instance which has been running quite smoothly for the last 3 months. It is replicating a MySQL data source behind an SSH tunnel to a local PostgreSQL instance. Since yesterday the sync has started failing, and I'm wondering what's going wrong; nothing has changed so far, and I can connect to the MySQL server independently to run any queries. Can anyone point me to where to look for troubleshooting this?
    2022-06-15 21:45:42 INFO i.a.w.t.TemporalAttemptExecution(get):105 - Docker volume job log path: /tmp/workspace/47fdb0c2-775f-4a53-a071-23b4b0f52bbd/0/logs.log
    2022-06-15 21:45:42 INFO i.a.w.t.TemporalAttemptExecution(get):110 - Executing worker wrapper. Airbyte version: 0.35.45-alpha
    2022-06-15 21:45:42 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/source-mysql:0.5.11 exists...
    2022-06-15 21:45:43 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/source-mysql:0.5.11 was found locally.
    2022-06-15 21:45:43 INFO i.a.w.p.DockerProcessFactory(create):104 - Creating docker job ID: 47fdb0c2-775f-4a53-a071-23b4b0f52bbd
    2022-06-15 21:45:43 INFO i.a.w.p.DockerProcessFactory(create):155 - Preparing command: docker run --rm --init -i -w /data/47fdb0c2-775f-4a53-a071-23b4b0f52bbd/0 --log-driver none --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local airbyte/source-mysql:0.5.11 check --config source_config.json
    2022-06-15 21:45:43 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Class path contains multiple SLF4J bindings.
    2022-06-15 21:45:43 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Found binding in [jar:file:/airbyte/lib/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    2022-06-15 21:45:43 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Found binding in [jar:file:/airbyte/lib/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    2022-06-15 21:45:43 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    2022-06-15 21:45:43 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
    2022-06-15 21:45:43 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-15 21:45:43 INFO i.a.i.s.m.MySqlSource(main):210 - starting source: class io.airbyte.integrations.source.mysql.MySqlSource
    2022-06-15 21:45:44 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-15 21:45:44 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {check=null, config=source_config.json}
    2022-06-15 21:45:44 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-15 21:45:44 INFO i.a.i.b.IntegrationRunner(runInternal):121 - Running integration: io.airbyte.integrations.base.ssh.SshWrappedSource
    2022-06-15 21:45:44 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-15 21:45:44 INFO i.a.i.b.IntegrationRunner(runInternal):122 - Command: CHECK
    2022-06-15 21:45:44 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-15 21:45:44 INFO i.a.i.b.IntegrationRunner(runInternal):123 - Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'}
    2022-06-15 21:45:44 ERROR i.a.c.i.LineGobbler(voidCall):82 - Exception in thread "main" java.lang.RuntimeException: java.nio.file.NoSuchFileException: source_config.json
    2022-06-15 21:45:44 ERROR i.a.c.i.LineGobbler(voidCall):82 - 	at io.airbyte.commons.io.IOs.readFile(IOs.java:74)
    2022-06-15 21:45:44 ERROR i.a.c.i.LineGobbler(voidCall):82 - 	at io.airbyte.integrations.base.IntegrationRunner.parseConfig(IntegrationRunner.java:286)
    2022-06-15 21:45:44 ERROR i.a.c.i.LineGobbler(voidCall):82 - 	at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:129)
    2022-06-15 21:45:44 ERROR i.a.c.i.LineGobbler(voidCall):82 - 	at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:105)
    2022-06-15 21:45:44 ERROR i.a.c.i.LineGobbler(voidCall):82 - 	at io.airbyte.integrations.source.mysql.MySqlSource.main(MySqlSource.java:211)
    2022-06-15 21:45:44 ERROR i.a.c.i.LineGobbler(voidCall):82 - Caused by: java.nio.file.NoSuchFileException: source_config.json
    2022-06-15 21:45:44 ERROR i.a.c.i.LineGobbler(voidCall):82 - 	at java.base/sun.nio.fs.UnixException.translateToIOException(UnixException.java:92)
    2022-06-15 21:45:44 ERROR i.a.c.i.LineGobbler(voidCall):82 - 	at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:106)
    2022-06-15 21:45:44 ERROR i.a.c.i.LineGobbler(voidCall):82 - 	at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:111)
    2022-06-15 21:45:44 ERROR i.a.c.i.LineGobbler(voidCall):82 - 	at java.base/sun.nio.fs.UnixFileSystemProvider.newByteChannel(UnixFileSystemProvider.java:219)
    2022-06-15 21:45:44 ERROR i.a.c.i.LineGobbler(voidCall):82 - 	at java.base/java.nio.file.Files.newByteChannel(Files.java:380)
    2022-06-15 21:45:44 ERROR i.a.c.i.LineGobbler(voidCall):82 - 	at java.base/java.nio.file.Files.newByteChannel(Files.java:432)
    2022-06-15 21:45:44 ERROR i.a.c.i.LineGobbler(voidCall):82 - 	at java.base/java.nio.file.Files.readAllBytes(Files.java:3288)
    2022-06-15 21:45:44 ERROR i.a.c.i.LineGobbler(voidCall):82 - 	at java.base/java.nio.file.Files.readString(Files.java:3366)
    2022-06-15 21:45:44 ERROR i.a.c.i.LineGobbler(voidCall):82 - 	at io.airbyte.commons.io.IOs.readFile(IOs.java:72)
    2022-06-15 21:45:44 ERROR i.a.c.i.LineGobbler(voidCall):82 - 	... 4 more
    2022-06-15 21:45:44 INFO i.a.w.t.TemporalAttemptExecution(lambda$getWorkerThread$2):158 - Completing future exceptionally...
    io.airbyte.workers.WorkerException: Error while getting checking connection.
    	at io.airbyte.workers.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:84) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
    	at io.airbyte.workers.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:27) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
    	at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:155) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
    	at java.lang.Thread.run(Thread.java:833) [?:?]
    Caused by: io.airbyte.workers.WorkerException: Error checking connection, status: Optional.empty, exit code: 1
    	at io.airbyte.workers.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:80) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
    	... 3 more
    2022-06-15 21:45:44 INFO i.a.w.t.TemporalAttemptExecution(get):131 - Stopping cancellation check scheduling...
    2022-06-15 21:45:44 WARN i.t.i.a.POJOActivityTaskHandler(activityFailureToResult):307 - Activity failure. ActivityId=8a2dcf61-1139-3673-9310-d155549efa44, activityType=Run, attempt=1
    java.util.concurrent.ExecutionException: io.airbyte.workers.WorkerException: Error while getting checking connection.
    	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:396) ~[?:?]
    	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2073) ~[?:?]
    	at io.airbyte.workers.temporal.TemporalAttemptExecution.get(TemporalAttemptExecution.java:129) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
    	at io.airbyte.workers.temporal.check.connection.CheckConnectionActivityImpl.run(CheckConnectionActivityImpl.java:78) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
    	at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
    	at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) ~[?:?]
    	at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
    	at java.lang.reflect.Method.invoke(Method.java:568) ~[?:?]
    	at io.temporal.internal.activity.POJOActivityTaskHandler$POJOActivityInboundCallsInterceptor.execute(POJOActivityTaskHandler.java:214) ~[temporal-sdk-1.8.1.jar:?]
    	at io.temporal.internal.activity.POJOActivityTaskHandler$POJOActivityImplementation.execute(POJOActivityTaskHandler.java:180) ~[temporal-sdk-1.8.1.jar:?]
    	at io.temporal.internal.activity.POJOActivityTaskHandler.handle(POJOActivityTaskHandler.java:120) ~[temporal-sdk-1.8.1.jar:?]
    	at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handle(ActivityWorker.java:204) ~[temporal-sdk-1.8.1.jar:?]
    	at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handle(ActivityWorker.java:164) ~[temporal-sdk-1.8.1.jar:?]
    	at io.temporal.internal.worker.PollTaskExecutor.lambda$process$0(PollTaskExecutor.java:93) ~[temporal-sdk-1.8.1.jar:?]
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
    	at java.lang.Thread.run(Thread.java:833) [?:?]
    Caused by: io.airbyte.workers.WorkerException: Error while getting checking connection.
    	at io.airbyte.workers.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:84) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
    	at io.airbyte.workers.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:27) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
    	at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:155) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
    	... 1 more
    Caused by: io.airbyte.workers.WorkerException: Error checking connection, status: Optional.empty, exit code: 1
    	at io.airbyte.workers.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:80) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
    	at io.airbyte.workers.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:27) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
    	at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:155) ~[io.airbyte-airbyte-workers-0.35.45-alpha.jar:?]
    	... 1 more
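[Editor's note] The root cause in the trace above is the CHECK container failing to find source_config.json in its working directory. Since the log also shows the exact docker command and mounts, a first step is to inspect that job directory inside the airbyte_workspace volume. A minimal sketch (the job ID comes from the log above; using busybox as a throwaway image is an assumption of convenience):

```shell
# Build the command to list the job workspace inside the airbyte_workspace
# volume, mirroring the -v mount from the failing `docker run` in the log.
JOB_ID="47fdb0c2-775f-4a53-a071-23b4b0f52bbd"
ATTEMPT="0"
INSPECT_CMD="docker run --rm -v airbyte_workspace:/data busybox ls -la /data/${JOB_ID}/${ATTEMPT}"
echo "${INSPECT_CMD}"
```

Run the printed command on the Airbyte host: if source_config.json is absent there, the worker never managed to write it, which often points at disk space or permissions on the volume rather than at the MySQL source itself.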
  •

    KK

    06/16/2022, 7:16 AM
    Hi All, we are trying to get data from Amazon Seller Partner to S3. Just trying to understand whether there is any difference in terms of data processing between loading it directly to S3 using the S3 destination connector, versus streaming it through some intermediary, say Kafka, as the destination connector and then getting the data to S3.
  •

    Zaza Javakhishvili

    06/16/2022, 7:25 AM
    Need help, I am not able to pull data from Shopify:
    2022-06-16 07:18:36 normalization > 07:18:31.133803 [error] [MainThread]:   ) as user_id,
    2022-06-16 07:18:36 normalization > 07:18:31.134448 [error] [MainThread]:       cast(curr...' at line 84
    2022-06-16 07:18:36 normalization > 07:18:31.135066 [error] [MainThread]:   compiled SQL at ../build/run/airbyte_utils/models/generated/airbyte_incremental/airbyte_shopify/transactions.sql
    2022-06-16 07:18:36 normalization > 07:18:31.135707 [info ] [MainThread]: 
    2022-06-16 07:18:36 normalization > 07:18:31.136343 [error] [MainThread]: Database Error in model order_refunds_refund_line_items (models/generated/airbyte_incremental/airbyte_shopify/order_refunds_refund_line_items.sql)
    2022-06-16 07:18:36 normalization > 07:18:31.137057 [error] [MainThread]:   1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MariaDB server version for the right syntax to use near 'json) as line_item,
    2022-06-16 07:18:36 normalization > 07:18:31.137647 [error] [MainThread]:       cast(total_tax as 
    2022-06-16 07:18:36 normalization > 07:18:31.138289 [error] [MainThread]:       float
    2022-06-16 07:18:36 normalization > 07:18:31.138908 [error] [MainThread]:   ) as total_tax,
    2022-06-16 07:18:36 normalization > 07:18:31.139495 [error] [MainThread]:       cast...' at line 111
    2022-06-16 07:18:36 normalization > 07:18:31.140133 [error] [MainThread]:   compiled SQL at ../build/run/airbyte_utils/models/generated/airbyte_incremental/airbyte_shopify/order_refunds_refund_line_items.sql
    2022-06-16 07:18:36 normalization > 07:18:31.140773 [info ] [MainThread]: 
    2022-06-16 07:18:36 normalization > 07:18:31.141397 [error] [MainThread]: Database Error in model order_refunds_transactions (models/generated/airbyte_incremental/airbyte_shopify/order_refunds_transactions.sql)
    2022-06-16 07:18:36 normalization > 07:18:31.142119 [error] [MainThread]:   1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MariaDB server version for the right syntax to use near 'json) as receipt,
    2022-06-16 07:18:36 normalization > 07:18:31.142736 [error] [MainThread]:       cast(user_id as 
    2022-06-16 07:18:36 normalization > 07:18:31.143399 [error] [MainThread]:       signed
    2022-06-16 07:18:36 normalization > 07:18:31.144009 [error] [MainThread]:   ) as user_id,
    2022-06-16 07:18:36 normalization > 07:18:31.144604 [error] [MainThread]:       cast(curr...' at line 127
    2022-06-16 07:18:36 normalization > 07:18:31.145228 [error] [MainThread]:   compiled SQL at ../build/run/airbyte_utils/models/generated/airbyte_incremental/airbyte_shopify/order_refunds_transactions.sql
    I tried getting the raw data too, but it was not able to pull either...
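[Editor's note] Each 1064 error above breaks immediately after a `cast(... as json)` expression. MySQL 8 accepts `CAST(... AS JSON)`, but MariaDB does not (its JSON type is only an alias for LONGTEXT), which would explain why the SQL that normalization generates fails on a MariaDB destination. A sketch of the distinction — `receipt` is taken from the log, and casting to CHAR instead is an assumed workaround, not an Airbyte-sanctioned fix:

```shell
# The generated expression that MariaDB rejects, vs. a form it can parse.
FAILING='cast(receipt as json) as receipt'    # valid on MySQL 8 only
PORTABLE='cast(receipt as char) as receipt'   # parses on MariaDB as well
echo "${PORTABLE}"
```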
  •

    Aamir Butt

    06/16/2022, 7:29 AM
    Hi, is there a way to write a destination connector which takes in binary files (pdf, doc, docx, etc.) from a source (for example, S3)? Because as far as I have seen, Airbyte works with record-based data, such as CSV files from a source.
  •

    Joey Taleño

    06/16/2022, 8:19 AM
    Hi Airbyte Team, hope all is well with everyone! Has anybody here set up a Salesforce to Snowflake data ingestion?
  •

    Martin Fanev

    06/16/2022, 12:46 PM
    Hello everyone, I'm currently working on using Airbyte to load some GitHub data into Snowflake, but I noticed that some data, such as the number of clones for a certain repo, is not provided by Airbyte. My question is: is there any way to include that information? Thank you so much for your support!
  •

    TG

    06/16/2022, 2:57 PM
    Anyone facing this issue in a Kubernetes setup while upgrading the Flyway migrations version via ConfigMap for the Airbyte server?
    Current database migration version 0.30.22.001
    Minimum Flyway version required 0.35.15.00
    I have added this to the ConfigMap, but it's still not getting updated in the pods even after a restart.
    CONFIGS_DATABASE_MINIMUM_FLYWAY_MIGRATION_VERSION: 0.35.15.00
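[Editor's note] One way to verify whether the ConfigMap change actually reached the server pod is to print the environment variable inside it. A hedged sketch — `deploy/airbyte-server` is an assumed resource name, so adjust it to your release:

```shell
# Build the command that prints the Flyway override inside the server pod.
VAR="CONFIGS_DATABASE_MINIMUM_FLYWAY_MIGRATION_VERSION"
CHECK_CMD="kubectl exec deploy/airbyte-server -- printenv ${VAR}"
echo "${CHECK_CMD}"
```

If the printed value is stale, the pod is still running with the old environment; a `kubectl rollout restart` of the deployment forces fresh pods that re-read the ConfigMap, whereas restarting a container in place does not.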
  •

    Gayathri Chakravarthy

    06/16/2022, 4:18 PM
    Hello all, hope this is really basic and someone can help. I've gotten started with the Airbyte installation on EC2 by following the official documentation. However, an error is thrown whilst attempting to install docker-compose - any idea how to fix this, please?
    sudo wget https://github.com/docker/compose/releases/download/1.26.2/docker-compose-$(uname -s)-$(uname -m) -O /usr/local/bin/docker-compose
    --2022-06-16 15:51:35--  https://github.com/docker/compose/releases/download/1.26.2/docker-compose-Linux-aarch64
    Resolving github.com (github.com)... 140.82.121.3
    Connecting to github.com (github.com)|140.82.121.3|:443... connected.
    HTTP request sent, awaiting response... 404 Not Found
    2022-06-16 15:51:35 ERROR 404: Not Found.
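[Editor's note] The resolved URL ends in `docker-compose-Linux-aarch64`, i.e. this is an ARM (Graviton) instance, and the Compose 1.26.2 release never published an aarch64 Linux binary, hence the 404. Compose v2 releases do ship aarch64 builds, so one plausible fix is to download a v2 release instead (the exact version below is an assumption; pick any current v2 tag):

```shell
# Rebuild the download URL cleanly against a Compose v2 release, which
# publishes Linux-aarch64 binaries, unlike 1.26.2.
COMPOSE_VERSION="v2.6.0"   # assumed version; any recent v2 release works
URL="https://github.com/docker/compose/releases/download/${COMPOSE_VERSION}/docker-compose-$(uname -s)-$(uname -m)"
echo "${URL}"
# sudo wget "${URL}" -O /usr/local/bin/docker-compose
# sudo chmod +x /usr/local/bin/docker-compose
```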
  •

    Rocky Appiah

    06/16/2022, 5:39 PM
    How difficult is it to get the hosted version of Airbyte running on ECS? Or should I stick with running it on a traditional EC2 instance?
  •

    Sandesh Kumar

    06/16/2022, 8:55 PM
    hey folks, does Airbyte support transformation (basic normalization) for DynamoDB destinations?
  •

    Pawan

    06/17/2022, 9:17 AM
    Hi Team
  •

    Pawan

    06/17/2022, 9:17 AM
    I am trying to set up Airbyte on AWS EKS
  •

    Pawan

    06/17/2022, 9:18 AM
    some of the pods are crashing
  •

    Pawan

    06/17/2022, 9:18 AM
    can you please help me with this?
  •

    Prakash

    06/17/2022, 1:08 PM
    Hi Team, I'm doing an Airbyte dbt transformation, but after the transformation the column names are coming out in lowercase. I have used aliases in the .sql file to get some specific column names; my destination is a Postgres DB. How do I get the correct column names as per the aliases written?
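[Editor's note] PostgreSQL folds unquoted identifiers to lowercase, so an alias like `OrderId` in a dbt model comes back as `orderid` unless the alias is double-quoted in the model SQL. A minimal sketch of the two forms (table and column names are placeholders):

```shell
# Unquoted vs. quoted aliases as they would appear in a dbt model .sql file.
UNQUOTED='select order_id as OrderId from raw_orders'     # folded to "orderid"
QUOTED='select order_id as "OrderId" from raw_orders'     # case preserved
echo "${QUOTED}"
```

The trade-off: quoted mixed-case columns must be quoted in every downstream query too, so many teams instead standardize on snake_case names and rename only at the BI layer.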
  •

    Lior Solomon

    06/17/2022, 3:55 PM
    Hi guys, has anyone experienced the following error when trying to sync MySQL to Snowflake?
    net.snowflake.client.jdbc.SnowflakeSQLException: Cannot perform DROP. This session does not have a current schema. Call 'USE SCHEMA', or use a qualified name.
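[Editor's note] The error text itself names the two ways out: the JDBC session has no current schema, so either set one with USE SCHEMA or fully qualify the object being dropped. A sketch of both statements (database and schema names are placeholders):

```shell
# Two fixes, per the Snowflake error message; MY_DB / MY_SCHEMA are placeholders.
USE_STMT='USE SCHEMA MY_DB.MY_SCHEMA;'
QUALIFIED='DROP STAGE IF EXISTS MY_DB.MY_SCHEMA.SOME_STAGE;'
echo "${USE_STMT}"
echo "${QUALIFIED}"
```

In an Airbyte context this usually means checking that the Snowflake destination's configured default schema actually exists and that the connector's role can see it.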
  •

    Richard Ighodaro

    06/17/2022, 5:33 PM
    Morning Team, do you know someone who can support us with a Heroku deployment of Airbyte?
  •

    Nikita Kotlyarov

    06/17/2022, 7:08 PM
    Did anyone experience the following "Unknown error occurred" in the UI? I restarted the VM and upgraded the Airbyte version, but it did not help. It is also interesting that Settings -> Sources shows an error while Settings -> Destinations looks as usual. How should I troubleshoot that?
  •

    Lior Solomon

    06/17/2022, 8:03 PM
    There is an issue with the prefix setup on the MySQL connector to Snowflake, specifically around DROP STAGE statements. I added a prefix to my connector, but for some reason when it runs the drop stage command it doesn't add it:
    DROP STAGE IF EXISTS COMPANY_COM_TENANT_ADMIN_ONBOARDING;
    my expectation would be to get
    DROP STAGE IF EXISTS PREFIX.COMPANY_COM_TENANT_ADMIN_ONBOARDING;
  •

    Zaza Javakhishvili

    06/18/2022, 1:51 AM
    anyone?
  •

    Mukul Gopinath

    06/18/2022, 5:18 AM
    Hey team, trying to set up Airbyte over EKS and an external RDS. I have cloned the repo and tried to set it up with the internal pg (default config), but I seem to run into issues where the bootloader is not started.
  •

    Daniel Bartley

    06/19/2022, 9:21 AM
    Hi, was there ever a follow-up to this video with more details about how to use the CDK?

https://www.youtube.com/watch?v=kJ3hLoNfz_E

    I need to ingest from a long list of SaaS services' REST APIs, and about half would need a custom connector if using Airbyte.
  •

    Srinidhi krishnamurthy

    06/20/2022, 5:27 AM
    Hi #C021JANJ6TY, we have set up an Airbyte cluster on AWS EC2. The containers are up and running fine, and we are able to sync the data as well, but the logs are not being written to /tmp/workspace/***, nor has the directory been created. Can we get some help here please?
  •

    Jannik Steinmann

    06/20/2022, 9:24 AM
    Hi 👋 I'm a bit confused about Airbyte's architecture with regard to change data capture. In the blog post Understanding Change Data Capture (CDC) you mention that "To support log-based CDC, Airbyte uses Debezium". But as far as I can understand it, Airbyte only runs on Temporal. Does the Debezium/Kafka part only apply to CDC? The docs on CDC don't mention anything about it (neither do the docs on the Postgres source connector). Thanks for your help! 🙏
  •

    Mukul Gopinath

    06/20/2022, 10:49 AM
    Hey team, I have an existing EC2 instance set up for Airbyte and am looking to move the data and connectors to an EKS setup. I tried copying the airbyte-db container's postgresql data files, but it doesn't seem to work. Any help?
  •

    Naresh Nelluri

    06/20/2022, 1:34 PM
    Hi Team, how does Airbyte use disk optimally? Does it load everything from the source to disk and flush to the target at the end, or does it keep streaming data?