# ask-community-for-troubleshooting

    navod perera

    11/16/2022, 2:48 PM
Hello Team Airbyte, we integrated WooCommerce with Airbyte. The problem is that when we add a correct WooCommerce shop URL with wrong secret keys (Consumer key and Consumer secret), the connection check still succeeds. I need the Airbyte-WooCommerce integration to succeed only if the user's shop URL and secret keys (Consumer key and Consumer secret) are valid. Thanks in advance.
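A stricter connection check for the case described above would fail fast on bad credentials. A minimal sketch, assuming the standard WooCommerce REST API and a hypothetical shop URL: listing one product with HTTP basic auth returns 401 when the consumer key/secret pair is wrong, so the check can key off the status code.

```shell
# Hypothetical shop URL and env vars; over HTTPS, WooCommerce accepts the
# consumer key/secret as basic-auth credentials.
status=$(curl -s -o /dev/null -w '%{http_code}' \
  -u "$CONSUMER_KEY:$CONSUMER_SECRET" \
  "https://example-shop.com/wp-json/wc/v3/products?per_page=1")
# 200 -> credentials valid; 401 -> reject the connection
echo "$status"
```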

    Hao Kuang

    11/16/2022, 3:09 PM
Hello Team, it would be great if someone could look at this PR, in 0.40.18 at least. We are basically blocked by this, since the orchestrator cannot use S3 for saving state objects, which forces us to enable MinIO after the upgrade.

    Ben Cahill

    11/16/2022, 3:16 PM
    hey guys - i want to use the SFTP connector on airbyte cloud, but the source i’m getting the files from uses IP whitelisting. do requests from airbyte cloud come from a static IP?

    Lee Danilek

    11/16/2022, 5:31 PM
    Hi team! I merged a new connector 2 weeks ago and I'd like to request a release so my company's customers can start using Airbyte. I think the only way to use a new connector without a release is to compile airbyte from source, which is onerous

    Rahul Borse

    11/16/2022, 6:44 PM
Hi Team, as per the Airbyte documentation, the command below is required to build Airbyte: SUB_BUILD=PLATFORM ./gradlew build. But when I try that I get an error in the command prompt (screenshot attached). Background: I went to the local folder where I keep the Airbyte code and ran docker-compose up there; it runs perfectly fine. But now I want to make some changes to the code in my local folder and build it before docker-compose. How can I build it? Please help me out with this.
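The build flow the question above refers to can be sketched as follows; this is only a sketch based on the 2022-era developer docs, since docker-compose up by itself pulls the prebuilt images pinned in .env rather than building your local checkout:

```shell
# From the repo root: build the platform from source
SUB_BUILD=PLATFORM ./gradlew build
# then run the freshly built "dev"-tagged images instead of the pinned ones
VERSION=dev docker-compose up
```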

    D Feinstein

    11/16/2022, 8:55 PM
Hi there - I’m requesting a Google Ads basic token and need to put together design documentation. The Airbyte documentation links suggested in previous threads seem to be broken/down. Does anyone have a good example of this that I can work off of, or new links?

    Rafael Gomes

    11/16/2022, 9:27 PM
Hi all, I'm trying Airbyte (POC) to validate whether it will fit my data stack. I'm trying to set up a connection using my own CA certificates, but JDBC cannot find my certificate files. What is the proper way to set that up? I tried adding some docker-compose volumes with my certificates, but no success so far.

    Matt Palmer

    11/16/2022, 10:18 PM
Hi! Could I have some developer eyes on this issue? I think I identified the root of a bug in a postgres <> redshift connection, but I’d like to understand what’s causing it. Four of my table normalizations are failing due to a normalization error that appears to be the result of faulty naming/aliases in /models/. Help would be much appreciated!

    Rahul Borse

    11/16/2022, 11:57 PM
Hi Team, I just cloned the Airbyte repo. I am trying to build Airbyte as per the documentation with SUB_BUILD=PLATFORM ./gradlew build, but I am getting a build failure (screenshot attached). Can someone please help? PS: I have installed the whole tech stack as per the document.

    Jin Gong

    11/17/2022, 12:49 AM
Hi team, I ran into this ERROR with the Airbyte-Stripe connector when syncing the coupons table. I removed the real data from the error log, but I can share it in a DM if someone could help take a look 🙏 Filed a ticket here: https://discuss.airbyte.io/t/failed-to-replicate-coupons-data-to-snowflake-because-of-data-type-casting-issue/3223
    2022-11-17 00:14:10 [ERROR] - Airbyte output line process error: Error storing 17 source objects in [5ynfrw115cdzbio29kt6s.wk1ga7d2r6m4dr8ls20k] destination: failed to insert into temporary table 
    
    ...
    
    cause: 002014 (22000): SQL compilation error:
    Invalid expression [CAST(? AS VARCHAR(16777216))] in VALUES clause. Process will be killed
    2022-11-17 00:15:10 [ERROR] - Error closing airbyte runner: exit status 1

    kelvin

    11/17/2022, 3:28 AM
hi everyone, I intend to use Airbyte to load some data from customer CSVs into our own Postgresql DB. The customer uploads a file daily, so I need to append data day by day on a daily schedule as well. I read about the SFTP and Google directory sources, but neither supports incremental mode. Is there any source that supports appending new data from a CSV file, or does anyone have another solution for it?

    vivek

    11/17/2022, 4:48 AM
    HI

    vivek

    11/17/2022, 4:49 AM
connection failing with a non-JSON response when trying to establish a destination in AWS S3

    vivek

    11/17/2022, 4:49 AM
can anyone help?

    Berzan Yildiz

    11/17/2022, 6:52 AM
    Hello Team. The normalization from pipedrive to postgres uses the wrong type.

    Rahul Borse

    11/17/2022, 7:12 AM
Hi Team, I am new to docker. When we run docker-compose up for the first time on the cloned Airbyte code, does it build the local code and create an image, or what really happens?

    Akash Ghadge

    11/17/2022, 8:59 AM
Hi Team, I am new to Airbyte. I'm trying to deploy Airbyte on my Windows machine. To create an environment I have followed the documentation as well as a video I found on YouTube (video link). I have installed WSL2, which comes with python3.8 support, but I need to install an LTS version of Python, which is python3.10 or above. I installed it with
sudo apt install python3.10
but when I check the python version it is still python3.8. I am not sure what is causing this; please let me know the solution. Thank you
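On the python3.8/python3.10 question above: installing a second interpreter with apt does not repoint the python3 alias by itself. A sketch, assuming Ubuntu under WSL2 with both versions on PATH:

```shell
# Register both interpreters as alternatives, giving 3.10 higher priority
sudo update-alternatives --install /usr/bin/python3 python3 /usr/bin/python3.8 1
sudo update-alternatives --install /usr/bin/python3 python3 /usr/bin/python3.10 2
python3 --version   # should now report 3.10.x
```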

    Gopinath

    11/17/2022, 9:39 AM
Hi Team,
Is this your first time deploying Airbyte: Yes
OS Version / Instance: Windows 10
Deployment: docker
Airbyte Version: 0.40.17
Source name: File
Destination: Postgres
Description: I am trying to set up the File source connector to read my Excel sheet located in the Windows filesystem. I have created a docker volume to mount the Windows filesystem location. While setting up the source, I am getting a file-not-found error.
Error Description: Failed to load file///tmp/airbyte local/PDFull Copy.xlsx FileNotFoundError(2, 'No such file or directory')
Traceback (most recent call last):
  File "/airbyte/integration_code/source_file/source.py", line 101, in check
    with client.reader.open():
  File "/airbyte/integration_code/source_file/client.py", line 86, in open
    self._file = self._open()
  File "/airbyte/integration_code/source_file/client.py", line 130, in _open
    return smart_open.open(self.full_url, **self.args)
  File "/usr/local/lib/python3.9/site-packages/smart_open/smart_open_lib.py", line 177, in open
    fobj = _shortcut_open(
  File "/usr/local/lib/python3.9/site-packages/smart_open/smart_open_lib.py", line 363, in _shortcut_open
    return _builtin_open(local_path, mode, buffering=buffering, **open_kwargs)
FileNotFoundError: [Errno 2] No such file or directory: '/tmp/airbyte_local/PDFull-Copy.xlsx'
Docker volume created for airbyte-server volumes: - /C/Users/XXXX/Downloads/airbyte-mount:${LOCAL_ROOT}
Please could someone help me out here. Regards,
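For the local-file setup above, the stock docker-compose deployment only exposes files placed under LOCAL_ROOT (/tmp/airbyte_local by default), and the connector must reference them by the in-container path, not the Windows one. A sketch of the volume mapping, with an illustrative host path:

```yaml
# docker-compose override sketch; the same volume must be visible to the
# containers that run the connector, and the source URL would then be
# /local/PDFull-Copy.xlsx rather than the Windows path.
services:
  airbyte-server:
    volumes:
      - /c/Users/XXXX/Downloads/airbyte-mount:/tmp/airbyte_local
```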

    Nathan Chan

    11/17/2022, 10:16 AM
    Hi team, my team deployed airbyte on gke, everything seems working fine except the log is not showing on console and on cloud logging we get this:
    ERROR Unable to invoke factory method in class com.van.logging.log4j2.Log4j2Appender for element Log4j2Appender: java.lang.IllegalStateException: No factory method found for class com.van.logging.log4j2.Log4j2Appender java.lang.IllegalStateException: No factory method found for class com.van.logging.log4j2.Log4j2Appender
    	at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.findFactoryMethod(PluginBuilder.java:238)
    	at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.build(PluginBuilder.java:136)
    	at org.apache.logging.log4j.core.config.AbstractConfiguration.createPluginObject(AbstractConfiguration.java:1122)
    	at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:1047)
    	at org.apache.logging.log4j.core.appender.routing.RoutingAppender.createAppender(RoutingAppender.java:310)
    	at org.apache.logging.log4j.core.appender.routing.RoutingAppender.getControl(RoutingAppender.java:282)
    	at org.apache.logging.log4j.core.appender.routing.RoutingAppender.append(RoutingAppender.java:240)
    	at org.apache.logging.log4j.core.config.AppenderControl.tryCallAppender(AppenderControl.java:161)
    	at org.apache.logging.log4j.core.config.AppenderControl.callAppender0(AppenderControl.java:134)
    	at org.apache.logging.log4j.core.config.AppenderControl.callAppenderPreventRecursion(AppenderControl.java:125)
    	at org.apache.logging.log4j.core.config.AppenderControl.callAppender(AppenderControl.java:89)
    	at org.apache.logging.log4j.core.appender.rewrite.RewriteAppender.append(RewriteAppender.java:84)
    	at org.apache.logging.log4j.core.config.AppenderControl.tryCallAppender(AppenderControl.java:161)
    	at org.apache.logging.log4j.core.config.AppenderControl.callAppender0(AppenderControl.java:134)
    	at org.apache.logging.log4j.core.config.AppenderControl.callAppenderPreventRecursion(AppenderControl.java:125)
    	at org.apache.logging.log4j.core.config.AppenderControl.callAppender(AppenderControl.java:89)
    	at org.apache.logging.log4j.core.config.LoggerConfig.callAppenders(LoggerConfig.java:675)
    	at org.apache.logging.log4j.core.config.LoggerConfig.processLogEvent(LoggerConfig.java:633)
    	at org.apache.logging.log4j.core.config.LoggerConfig.log(LoggerConfig.java:616)
    	at org.apache.logging.log4j.core.config.LoggerConfig.log(LoggerConfig.java:552)
    do you have any ideas?

    Lihan Li

    11/17/2022, 10:27 AM
I deployed on GCP (helm chart) but am seeing this error, which relates to the AWS SDK ???
    Caused by: com.amazonaws.SdkClientException: Unable to find a region via the region provider chain. Must provide an explicit region in the builder or setup environment to supply a region.
    	at com.amazonaws.client.builder.AwsClientBuilder.setRegion(AwsClientBuilder.java:462)
    	at com.amazonaws.client.builder.AwsClientBuilder.configureMutableProperties(AwsClientBuilder.java:424)
    	at com.amazonaws.client.builder.AwsSyncClientBuilder.build(AwsSyncClientBuilder.java:46)
    	at com.van.logging.aws.AwsClientHelpers.buildClient(AwsClientHelpers.java:88)
    	at com.van.logging.aws.S3PublishHelper.<init>(S3PublishHelper.java:53)
    	at com.van.logging.log4j2.Log4j2AppenderBuilder.lambda$createCachePublisher$0(Log4j2AppenderBuilder.java:271)
    	at java.base/java.util.Optional.ifPresent(Optional.java:178)
    	at com.van.logging.log4j2.Log4j2AppenderBuilder.createCachePublisher(Log4j2AppenderBuilder.java:266)
    	at com.van.logging.log4j2.Log4j2AppenderBuilder.build(Log4j2AppenderBuilder.java:137)
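The stack trace above comes from the log4j cloud-storage log appender, which expects S3 settings unless another log store is configured. A sketch of helm values for the 2022-era chart, assuming GCS log storage on GCP; the exact keys may differ between chart versions, and the bucket and credentials values are placeholders:

```yaml
logs:
  minio:
    enabled: false
  s3:
    enabled: false
  gcs:
    bucket: "my-airbyte-logs"                       # placeholder bucket
    credentials: "/secrets/gcs-log-creds/gcp.json"  # mounted service-account key
```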

    Lihan Li

    11/17/2022, 12:20 PM
Why does Airbyte have such high connection usage on its Postgres database (the one that is part of the Airbyte setup, not a source or destination)?

    Gustavo Molina

    11/17/2022, 12:23 PM
Hey guys, I'm just starting with airbyte (I've connected just a few sources x destinations) but I'm having trouble adding the clickup connector. I can see that it is correctly in the folder, but it's not listed as a source in the UI... I then tried to add it manually (using airbyte/source-clickup-api as the docker repository name) but still with no luck (I get an Internal Server Error: Get Spec job failed.). What am I doing wrong?

    Dragan

    11/17/2022, 12:46 PM
Hi all, we started using Airbyte on an EC2 instance; we created a small dev instance to evaluate the work before moving to production. Our use case is to move streaming data from BigQuery to Snowflake, but unfortunately as soon as we try to push more data it fails. We managed to move a small amount of data easily (100,000 rows in less than 2 min), but when we tried to scale up (1,000,000 rows; the table was 12 GB) it failed and wasn't able to finish -->
Job either timed out or was cancelled
I would be interested to know if anyone has had similar issues and was able to solve them. We can scale up our instance and plan to do so for prod, but I'm not sure if that is the solution in the end?

    Rahul Borse

    11/17/2022, 1:46 PM
Hi Team, how can I get access to the airbyte db that is running on my local machine?
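On the local-database question above, a sketch assuming the default docker-compose deployment, where the config database lives in the airbyte-db container with the default docker/docker credentials from .env:

```shell
docker exec -ti airbyte-db psql -U docker -d airbyte
# e.g. inspect recent syncs once inside psql:
#   select id, scope, status from jobs order by id desc limit 10;
```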

    Akash Ghadge

    11/17/2022, 1:58 PM
Hi Team, I am facing this issue when I run the command ./generate.sh, instead of getting the custom connector list.

    Karan

    11/17/2022, 2:07 PM
Does anybody here know how we can establish a source connection to Google Cloud Storage via Airbyte? I tried using the File connector, but was not able to make the connection.
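On the Google Cloud Storage question above: the File source can read gs:// URLs directly when its provider is set to GCS. A configuration sketch with placeholder bucket, dataset, and key; the field names follow the 2022-era File source spec and may differ in newer versions:

```yaml
dataset_name: my_gcs_dataset        # placeholder
format: csv
url: gs://my-bucket/path/data.csv   # placeholder object
provider:
  storage: GCS
  service_account_json: '{ ...contents of the service-account key... }'
```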

    Gerard Clos

    11/17/2022, 2:09 PM
    Hey guys 👋 Is there a way to configure the number of retries of a connection from airbyte's API?
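On the retries question above: I'm not aware of a per-connection retry setting in the API at this version; retry behavior is driven by environment variables on the deployment instead. A sketch, assuming the documented SYNC_JOB_MAX_ATTEMPTS setting in the docker-compose .env:

```shell
# Bump the number of attempts per sync job, then restart the stack
echo "SYNC_JOB_MAX_ATTEMPTS=3" >> .env
docker-compose up -d
```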

    Philip Corr

    11/17/2022, 2:20 PM
Hey everyone, I’ve opened a PR to add a shopify stream here. Just wondering if there is somewhere we should post that it’s ready for review or anything. I’m a little worried that it will just sit gathering dust?

    Shangwei Wang

    11/17/2022, 3:44 PM
👋 we are having difficulty understanding the error log from the normalization process (partial log pasted in thread); maybe someone can help us decipher it 🙏 ? More deployment info in thread too. This is for a large job where we are moving about 80 GB (110 million rows) of data. Our other, smaller jobs worked well.

    M Andriansyah Putra

    09/28/2022, 11:49 AM
• Is this your first time deploying Airbyte?: No
• OS Version / Instance: Amazon Linux 2 / t3.large EC2 instance
• Memory / Disk: 8 GB memory / 30 GB EBS volume
• Deployment: Kubernetes on EC2
• Airbyte Version: v0.40.9
• Source name/version: Google Sheets / 0.2.17
• Destination name/version: Postgres / 1.0.10
• Step: The issue is happening during sync
• Description: The Postgres sync is getting stuck because “auth_type: must be a constant value Client, client_id: is missing but it is required, client_secret: is missing but it is required, refresh_token: is missing but it is required” when I'm using Service Account Key Authentication, not the Google Auth one. Do I need to downgrade the version?