# ask-community-for-troubleshooting

    Rahul Borse

    11/14/2022, 8:05 AM
    Hi All, as per the attached screenshot, while creating an S3 destination we have two options for normalization: 1. No flattening 2. Root level normalization. As per the example in the screenshot, root level normalization creates columns for the root-level fields, e.g. user_Id, name. But what if we want to normalize nested fields as well? In this example the name column holds JSON data containing "first" and "last". How can we create columns "name.first" and "name.last" in the destination? How can this transformation be achieved with Airbyte? Can someone please help with this? @Karen (Airbyte) @Marcos Marx (Airbyte) @Siddhant Singh @Jerri Comeau (Airbyte) @John Wasserman
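For illustration only, this is what flattening the nested fields into dotted column names would produce; a minimal pandas sketch with a made-up sample record, not Airbyte's own normalization:

```python
# Illustration: flatten nested JSON into dotted column names ("name.first", "name.last").
import pandas as pd

records = [
    {"user_Id": 123, "name": {"first": "John", "last": "Doe"}},  # hypothetical sample record
]

# json_normalize expands nested objects into columns joined by `sep`.
df = pd.json_normalize(records, sep=".")
print(df.columns.tolist())  # ['user_Id', 'name.first', 'name.last']
```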

    kavi arasu

    11/14/2022, 8:17 AM
    Hi Team, I am trying to replicate data from source database tables to destination database tables. I have a scenario where I want to replicate data from one table into a differently named table in the destination. For example, I want to replicate table_abc data into table_xyz in the destination, but I couldn't achieve this through Airbyte. Please find the screenshot for reference: here I want to replicate the source data from src_user_temp and src_user_attribute_temp into other tables, dm_user_temp and dm_user_attribute_temp, in the destination. The problem is that Airbyte creates new tables in the destination and inserts the source data there. So, team, please tell me how to handle this scenario.
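Airbyte writes destination tables named after the source streams, so one workaround is a rename step that runs after each sync. A minimal sketch, assuming a Postgres destination and the table names from the screenshot; credentials are placeholders:

```python
# Sketch of a post-sync rename step for a Postgres destination.
# Table names follow the example above; connection details are placeholders.
import psycopg2

RENAMES = {
    "src_user_temp": "dm_user_temp",
    "src_user_attribute_temp": "dm_user_attribute_temp",
}

with psycopg2.connect(host="localhost", dbname="warehouse",
                      user="airbyte", password="***") as conn:
    with conn.cursor() as cur:
        for old, new in RENAMES.items():
            # Drop any previous copy of the target, then rename the freshly synced table.
            cur.execute(f'DROP TABLE IF EXISTS "{new}"')
            cur.execute(f'ALTER TABLE "{old}" RENAME TO "{new}"')
    conn.commit()
```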

    Šimon Appelt

    11/14/2022, 11:12 AM
    Hi All, we have recently set up a CDC connection between RDS PostgreSQL and BigQuery (using the pgoutput plugin). The connection runs hourly and emits records correctly; however, we noticed that the WAL is not flushed properly with every run, meaning data accumulates in our Postgres log (sometimes only a partial flush happens). Is anyone facing the same issue, or does anyone know why this happens?
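To see whether the connector's replication slot is actually acknowledging WAL, it can help to compare the slot's LSNs with the current WAL position on the source. A small diagnostic sketch; host and credentials are placeholders:

```python
# Diagnostic: how much WAL the CDC replication slot is still retaining.
import psycopg2

conn = psycopg2.connect(host="my-rds-host", dbname="mydb",
                        user="replication_user", password="***")
cur = conn.cursor()
cur.execute("""
    SELECT slot_name,
           confirmed_flush_lsn,
           restart_lsn,
           pg_size_pretty(pg_wal_lsn_diff(pg_current_wal_lsn(), restart_lsn)) AS retained_wal
    FROM pg_replication_slots
    WHERE plugin = 'pgoutput';
""")
for row in cur.fetchall():
    print(row)
```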

    Naren Kadiri

    11/14/2022, 11:28 AM
    Hi team, how do I increase the nginx proxy read timeout? I'm getting this error.

    Naren Kadiri

    11/14/2022, 11:29 AM
    By default this timeout is 60 seconds?

    Rahul Borse

    11/14/2022, 12:44 PM
    Hi Team, what is the significance of the Airbyte metadata columns like _airbyte_ab_id and _airbyte_emitted_at, and if I don't want these columns in my destination, can we achieve that by writing a custom connector? Since the sync mode will be Full Refresh - Overwrite for all our sync operations, will removing these fields impact anything? I have gone through the documentation and could not see any impact.
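For context, _airbyte_ab_id is a per-record UUID and _airbyte_emitted_at is the extraction timestamp, both used by normalization. If they are only a nuisance downstream, one alternative to forking the connector is hiding them behind a view after each sync. A sketch, assuming a Postgres-compatible destination and a hypothetical `users` table:

```python
# Sketch: expose a synced table without Airbyte's metadata columns via a view.
# Assumes a Postgres-compatible destination; "users" is a hypothetical table name.
import psycopg2

TABLE = "users"

with psycopg2.connect(host="localhost", dbname="warehouse",
                      user="airbyte", password="***") as conn:
    with conn.cursor() as cur:
        # Collect every column except the Airbyte metadata ones.
        cur.execute(
            """SELECT column_name FROM information_schema.columns
               WHERE table_name = %s
                 AND column_name NOT IN ('_airbyte_ab_id', '_airbyte_emitted_at', '_airbyte_normalized_at')
               ORDER BY ordinal_position""",
            (TABLE,),
        )
        cols = ", ".join(f'"{c}"' for (c,) in cur.fetchall())
        cur.execute(f'CREATE OR REPLACE VIEW "{TABLE}_clean" AS SELECT {cols} FROM "{TABLE}"')
```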

    Ameena El-Agha

    11/14/2022, 1:35 PM
    Hi, is the Shopify OAuth 2.0 authentication method no longer supported? Are there any options for connecting to Shopify other than custom app development?

    Roman Naumenko

    11/14/2022, 1:57 PM
    Hi, We were planning to use airbyte cloud for data sync in a saas product, and found that the cloud version doesn’t have API access https://docs.airbyte.com/api-documentation
    Airbyte Cloud does not currently support API access
    Is it on the roadmap?
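For reference, the self-hosted (OSS) deployment does expose the Config API; the kind of call you would want from Cloud looks roughly like this against an OSS instance (host, credentials, and connection ID are placeholders):

```python
# Sketch: trigger a sync through a self-hosted (OSS) Airbyte instance's API.
# Airbyte Cloud did not expose this API at the time of the question.
import requests

AIRBYTE_URL = "http://localhost:8000/api/v1"            # placeholder host
CONNECTION_ID = "00000000-0000-0000-0000-000000000000"  # placeholder connection ID

resp = requests.post(
    f"{AIRBYTE_URL}/connections/sync",
    json={"connectionId": CONNECTION_ID},
    auth=("airbyte", "password"),  # default proxy basic-auth credentials, if enabled
)
resp.raise_for_status()
print(resp.json()["job"]["status"])
```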

    Bernard Notarianni

    11/14/2022, 2:02 PM
    Hi Team, I am a newbie to Airbyte, using a Java connector to pull data from a gRPC endpoint. Is there a tutorial and/or sample showing how to write an incremental Java source? I could find Python examples, but no Java source example.
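Until a Java sample turns up, the incremental pattern itself is small; here is a language-agnostic sketch in Python of the pieces you would port to a Java source (cursor field, filtered read, state emitted after the records). Endpoint and field names are made up:

```python
# Concept sketch of an incremental read: keep a cursor, only request records newer
# than the saved state, and emit updated state after the records.
from datetime import datetime, timezone

def read_incremental(fetch_page, state):
    """fetch_page(cursor) -> list of records carrying an 'updated_at' ISO timestamp."""
    cursor = state.get("updated_at", "1970-01-01T00:00:00+00:00")
    max_seen = cursor
    for record in fetch_page(cursor):
        yield {"type": "RECORD", "record": record}
        max_seen = max(max_seen, record["updated_at"])  # ISO strings compare lexicographically
    # Emit state last so a failed run does not skip records on the next attempt.
    yield {"type": "STATE", "state": {"updated_at": max_seen}}

# Example usage with a fake page function:
def fake_page(cursor):
    return [{"id": 1, "updated_at": datetime.now(timezone.utc).isoformat()}]

for msg in read_incremental(fake_page, state={}):
    print(msg)
```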

    Vishy ganesh

    11/14/2022, 2:10 PM
    I'm sure this question has been asked in some flavor or other before, but I'm having issues testing Airbyte via Docker behind a proxy. I'm trying to set up a Snowflake connection as documented and I get the error below. I've tried adding JDBC parameters, but I presume the issue is getting past the proxy. Has anyone encountered this before?
    airbyte-proxy       | 172.20.0.1 - airbyte [14/Nov/2022:14:06:59 +0000] "GET /api/v1/health HTTP/1.1" 200 18 "<http://localhost:8000/workspaces/3743a9af-d147-4c5a-b3b1-5756ee70ad31/destination/new-destination>" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/107.0.0.0 Safari/537.36"
    airbyte-worker      | 2022-11-14 14:07:03 ERROR i.a.c.i.LineGobbler(voidCall):114 - Nov 14, 2022 2:07:03 PM net.snowflake.client.jdbc.RestRequest execute
    airbyte-worker      | 2022-11-14 14:07:03 ERROR i.a.c.i.LineGobbler(voidCall):114 - SEVERE: Stop retrying since elapsed time due to network issues has reached timeout. Elapsed: 60,556(ms), timeout: 60,000(ms)
    airbyte-worker      | 2022-11-14 14:07:04 ERROR i.a.w.i.DefaultAirbyteStreamFactory(internalLog):116 - HikariPool-1 - Exception during pool initialization.
    airbyte-worker      | Stack Trace: net.snowflake.client.jdbc.SnowflakeSQLException: JDBC driver encountered communication error. Message: Exception encountered for HTTP request: No trusted certificate found.
    airbyte-worker      |   at net.snowflake.client.jdbc.RestRequest.execute(RestRequest.java:301)
    airbyte-worker      |   at net.snowflake.client.core.HttpUtil.executeRequestInternal(HttpUtil.java:737) 
    ...
    ...
    airbyte-worker      |   at io.airbyte.integrations.destination.snowflake.SnowflakeDestination.main(SnowflakeDestination.java:30)
    airbyte-worker      | Caused by: javax.net.ssl.SSLHandshakeException: No trusted certificate found
    airbyte-worker      |   at java.base/sun.security.ssl.Alert.createSSLException(Alert.java:131)
    airbyte-worker      |   at java.base/sun.security.ssl.TransportContext.fatal(TransportContext.java:370)
    airbyte-worker      |   at net.snowflake.client.jdbc.internal.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:108)
    airbyte-worker      |   at net.snowflake.client.jdbc.RestRequest.execute(RestRequest.java:174)
    airbyte-worker      |   ... 25 more  
    airbyte-worker      |   at java.base/sun.security.ssl.CertificateMessage$T12CertificateConsumer.checkServerCerts(CertificateMessage.java:638)
    airbyte-worker      |   ... 49 more
    airbyte-worker      | 
    airbyte-worker      | 2022-11-14 14:07:04 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - Completed integration: io.airbyte.integrations.destination.snowflake.SnowflakeDestination
    airbyte-server      | 2022-11-14 14:07:04 INFO i.a.s.RequestLogger(filter):112 - REQ 172.20.0.3 GET 200 /api/v1/health
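`No trusted certificate found` usually means the TLS connection to Snowflake is being intercepted and re-signed (typically by the corporate proxy) with a CA the JVM inside the worker does not trust. A quick diagnostic, run from the same network as the Airbyte host, that dumps the certificate actually presented for the Snowflake endpoint (the account host is a placeholder); if the issuer is your proxy's CA, that CA needs to be added to the containers' trust store:

```python
# Diagnostic: dump the certificate presented for the Snowflake host on this network path.
# Inspect the output with `openssl x509 -noout -issuer` to see who signed it.
import socket, ssl

HOST = "your_account.snowflakecomputing.com"  # placeholder account host

ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE  # we only want to inspect the cert, not validate it

with socket.create_connection((HOST, 443), timeout=10) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        cert = tls.getpeercert(binary_form=True)

print(ssl.DER_cert_to_PEM_cert(cert))
```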

    Dario Forti

    11/14/2022, 2:49 PM
    Hi team! We have an integration process where we use Airbyte's HubSpot connector as the source and Snowflake as the destination. We need to rename some tables after the data is loaded. What would be the best course of action? I initially thought of using dbt (triggered by Airbyte's custom transformation option), but we're having some issues with it, and I'm not sure whether I should keep trying with this or if there's a better way of solving it. We need to run this as an automated command every time Airbyte finishes the connection:
    ALTER TABLE t1 RENAME TO t2;
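If the dbt custom-transformation route keeps causing trouble, a plain post-sync script is another option; a minimal sketch using the Snowflake Python connector, to be run by whatever schedules or watches the Airbyte sync (credentials and names are placeholders, and the statement is the one from the question):

```python
# Sketch of a post-load rename step on Snowflake, run after an Airbyte sync finishes.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholders
    user="loader",
    password="***",
    warehouse="LOAD_WH",
    database="RAW",
    schema="HUBSPOT",
)
cur = conn.cursor()
try:
    cur.execute("DROP TABLE IF EXISTS t2")        # RENAME fails if the target already exists
    cur.execute("ALTER TABLE t1 RENAME TO t2")    # the statement from the question
finally:
    cur.close()
    conn.close()
```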

    Dudu Vaanunu

    11/14/2022, 2:52 PM
    Hi all 🙂 MySQL ---> Snowflake issue/question. What's the correct way of moving an Airbyte Snowflake table from an old destination schema to another? I don't want to resync the entire set of data since some of the tables may be very big. The behavior at the moment with Airbyte v0.40.17 and the MySQL connector v1.0.13: 1. The original schema contains 2 destination tables: X and X_SCD. 2. Once I change the destination schema, new tables are created; X contains only the increment and X_SCD only the increments with no former versions. Is moving the old data manually from the old schema into both of these new tables the right way to go here? I would appreciate any kind of feedback. Thanks!

    Espoir Murhabazi

    11/14/2022, 3:12 PM
    Hi all, is it possible to have one source that writes to two destinations in parallel? Say I have data I am pulling via an API, which is my source, and I would like the same data on both the dev and production instances of Snowflake, which are my destinations. How can I achieve this? I don't want to set up two different connections since my API has rate limits and I want to avoid hitting them. If this is supported, please point me to the right section in the documentation. Regards.

    Matt Palmer

    11/14/2022, 3:34 PM
    Morning all, I just wanted to follow up on this issue: I'm experiencing the same failure with our self-hosted Airbyte instance. The only possible point of error I can find is that something is off with the
    alias = "addresses_ab1"
    argument in the config block. This is blocking us from implementing Airbyte at our org and I’d love to come to a solution if this is caused by data on our end. Can anyone on the Airbyte team confirm this is a bug?

    Nick Saroki

    11/14/2022, 5:37 PM
    Attempting to launch my first Amazon Selling Partner > Postgres sync. I'm failing with a normalization error. How do I see which record it's failing on, or find `get_merchant_listings_all_data.sql`?
    2022-11-14 15:23:23 INFO i.a.w.g.DefaultNormalizationWorker(run):93 - Normalization executed in 13 seconds.
    2022-11-14 15:23:23 ERROR i.a.w.g.DefaultNormalizationWorker(run):101 - Normalization Failed.
    2022-11-14 15:23:23 INFO i.a.w.g.DefaultNormalizationWorker(run):106 - Normalization summary: io.airbyte.config.NormalizationSummary@68a2f518[startTime=1668439390337,endTime=1668439403632,failures=[io.airbyte.config.FailureReason@fba5e53[failureOrigin=normalization,failureType=system_error,internalMessage=invalid input syntax for type double precision: "",externalMessage=Normalization failed during the dbt run. This may indicate a problem with the data itself.,metadata=io.airbyte.config.Metadata@15c237d4[additionalProperties={attemptNumber=0, jobId=3, from_trace_message=true}],stacktrace=AirbyteDbtError: 
    1 of 1 ERROR creating table model temp_workspace.get_merchant_listings_all_data......................................... [ERROR in 0.16s]
    Database Error in model get_merchant_listings_all_data (models/generated/airbyte_tables/temp_workspace/get_merchant_listings_all_data.sql)
      invalid input syntax for type double precision: ""
      compiled SQL at ../build/run/airbyte_utils/models/generated/airbyte_tables/temp_workspace/get_merchant_listings_all_data.sql
    It's importing JSON into a temp table, but won't move from there.
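The offending rows are usually easiest to find in the raw table that normalization reads from: Airbyte lands the JSON in `_airbyte_raw_<stream>`, and the dbt model is what attempts the casts (the compiled SQL path in the error shows which columns it casts). A sketch for locating records where a numeric-typed field arrived as an empty string; the schema and the `price` field are assumptions, so substitute the real column from the compiled SQL:

```python
# Sketch: find raw records where a field dbt casts to double precision is an empty string.
# Raw table name follows Airbyte's _airbyte_raw_<stream> convention; the schema and the
# 'price' field are assumptions -- substitute the real ones from the compiled SQL.
import psycopg2

with psycopg2.connect(host="localhost", dbname="warehouse",
                      user="airbyte", password="***") as conn:
    with conn.cursor() as cur:
        cur.execute("""
            SELECT _airbyte_ab_id, _airbyte_data
            FROM temp_workspace._airbyte_raw_get_merchant_listings_all_data
            WHERE _airbyte_data->>'price' = ''
            LIMIT 10;
        """)
        for row in cur.fetchall():
            print(row)
```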

    Jerri Comeau (Airbyte)

    11/14/2022, 7:38 PM
    Hello Airbyte Community, and happy Monday. Just a heads up as we head into the new week and prep for the upcoming US holidays:
    • Due to a significant increase in the volume of questions coinciding with staffing changes, responses from the Community Assistance Team may be delayed. We are asking you, the Community, to help us with that in three ways. First, please be patient and respectful with CAT engineers when you engage with us; second, please make an effort to find an answer before you ask a question (our Discourse is a great source of solutions for problems that have cropped up before); and third, if you have the answer, please don't hesitate to jump in and share it on other folks' questions. We have an amazing community of collaborative engineers all working to make things better, and we in CAT, as well as Airbyte as a whole, really appreciate and thank you for all your help.
    • Please note that Airbyte company holidays include 24 and 25 November (US Thanksgiving), 26 December (US Christmas observed), and 2 January (US New Year's Day observed). The Community Assistance Team will be off on these days and will not be responding to any questions, issues, or PRs opened until the next working day.
    If you have any questions about the team, our operating hours, or any sort of feedback about anything, please feel free to reach out to me here or directly; I'm happy to answer what I can. Thanks, and have a fantastic week and a great holiday season!

    Vishy ganesh

    11/14/2022, 8:16 PM
    I tried building the airbyte-server project, which tagged the image with :dev. I then changed docker-compose.yaml from
    image: airbyte/server:${VERSION}
    to
    image: airbyte/server:dev
    After I make this change I get the error
    Mapper(toResponse):22 - Uncaught exception
    airbyte-server      | org.jooq.exception.DataAccessException: SQL [select * from "public"."actor_definition" where "public"."actor_definition"."actor_type" = ?::"public"."actor_type"]; Error while reading field: "public"."actor_definition"."normalization_repository", at JDBC index: 19
    
     The column index is out of range: 19, number of columns: 18.
    Am I supposed to pass any attributes while building?

    Berzan Yildiz

    11/14/2022, 9:08 PM
    Anyone else having issues with the HubSpot source and private app auth? It does not work at all.
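One quick sanity check is to call the HubSpot CRM API directly with the private-app token, outside of Airbyte; if that also fails, the token or its scopes are the problem rather than the connector. A minimal sketch (the token is a placeholder):

```python
# Sanity check: call HubSpot's CRM API directly with the private app access token.
# A 200 here but a failure in Airbyte points at the connector/auth selection;
# a 401/403 here points at the token or its scopes.
import requests

TOKEN = "pat-na1-..."  # placeholder private app token

resp = requests.get(
    "https://api.hubapi.com/crm/v3/objects/contacts",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"limit": 1},
)
print(resp.status_code, resp.json())
```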

    Gustavo Maia

    11/14/2022, 9:30 PM
    Hello guys, I am building a connector from the minimal Python Airbyte source connector (created with the generator), but AirbyteRecordMessage requires valid JSON data as its argument. However, the data I am collecting is quite large and it is getting complicated to put it into a JSON records format. Is there another way to do this? Maybe collect the whole file and not read its contents?
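If the payload is one huge JSON document, a common pattern is to parse it incrementally and emit many small records rather than a single giant one, so the `data` of each AirbyteRecordMessage stays a manageable dict. A sketch using `ijson`; the file path and the assumption that the payload is a top-level JSON array are illustrative:

```python
# Sketch: stream a large JSON array from disk and yield one small dict per element,
# instead of loading the whole payload and emitting it as a single record.
# 'item' assumes the file is a top-level JSON array; adjust the prefix to your data.
import ijson

def read_records(path):
    with open(path, "rb") as f:
        for obj in ijson.items(f, "item"):
            yield obj  # each obj becomes the `data` of one AirbyteRecordMessage

for record in read_records("/tmp/big_payload.json"):
    print(record)
```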

    Aaron Pritchard

    11/14/2022, 11:26 PM
    I'm trying Airbyte for the first time and am seeing a lot of errors when trying to set up sources, destinations and connections. I've been experimenting with MSSQL and Jira connections and it looks very unstable. Fields marked as optional aren't actually optional. Source schemas appear to time out when validating on large MSSQL databases. The Jira connection cannot sync due to "Invalid timezone offset" errors. Does anyone else see these sorts of issues? Is this sort of instability typical, or limited to the ALPHA connectors? Thanks.

    Nikhil Patel

    11/14/2022, 11:44 PM
    Hi Team, Airbyte instance: deployed on Kubernetes using Kustomize. Need help: I am having trouble understanding how I can migrate my Airbyte instance from one GKE Kubernetes cluster to another. Due to resource limit issues I had to create a new GKE cluster, but I don't want to start from scratch and want to keep all the connections that were created on my first cluster. Forum post: https://discuss.airbyte.io/t/best-way-to-migrate-connectors-across-airbyte-instances/1468/2 . I checked the forum post about migration but I did not understand it. Please help me here. Thank you.
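All sources, destinations, and connections live in Airbyte's internal config database, so one commonly used migration path is to back up that Postgres database from the old deployment and restore it into the database backing the new one, rather than recreating everything by hand. A rough sketch with pg_dump/psql; hosts, credentials, and database name are placeholders for your deployments' config DBs:

```python
# Rough sketch: dump the Airbyte config database from the old deployment and restore it
# into the new one. Hosts, credentials, and DB name are placeholders; run it while both
# Airbyte deployments are scaled down so nothing writes to the config DB mid-copy.
import subprocess

OLD = {"host": "old-db-host", "db": "airbyte", "user": "airbyte"}
NEW = {"host": "new-db-host", "db": "airbyte", "user": "airbyte"}

subprocess.run(
    ["pg_dump", "-h", OLD["host"], "-U", OLD["user"], "-d", OLD["db"], "-f", "airbyte_config.sql"],
    check=True,
)
subprocess.run(
    ["psql", "-h", NEW["host"], "-U", NEW["user"], "-d", NEW["db"], "-f", "airbyte_config.sql"],
    check=True,
)
```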

    Faris

    11/14/2022, 11:58 PM
    I find this confusing when connecting Postgres via an SSH host. The documentation says I only need to copy-paste the private key into Airbyte, but the connection fails with this message, which makes it seem like I need to enter both keys? 🆘
    The connection tests failed. 
    Could not connect with provided SSH configuration. Error: Unable to load private key pairs, verify key pairs are properly inputted
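That error is often about the key format rather than a missing field: the connector expects the full PEM-formatted RSA private key pasted in, BEGIN/END lines included. A small local check that the key parses before pasting it into Airbyte; the path is a placeholder, and the `ssh-keygen -t rsa -m PEM` hint is an assumption about the accepted format:

```python
# Sanity check: can the private key be loaded as a PEM RSA key?
# If this raises, regenerating the key with `ssh-keygen -t rsa -m PEM` (an assumption
# about the accepted format) and re-copying the whole file, BEGIN/END lines included,
# is worth trying.
import paramiko

try:
    key = paramiko.RSAKey.from_private_key_file("/path/to/id_rsa")  # placeholder path
    print("Loaded RSA key, fingerprint:", key.get_fingerprint().hex())
except paramiko.SSHException as exc:
    print("Key could not be parsed:", exc)
```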

    Slackbot

    11/15/2022, 4:38 AM
    This message was deleted.

    Resford Rouzer

    11/15/2022, 4:52 AM
    Is anyone having issues with Google Analytics 4 source exhausting property tokens? Anything beyond a few days fails for me.
    "error": {
        "code": 429,
        "message": "Exhausted property tokens for a project per hour. These quota tokens will return in under an hour. To learn more, see <https://developers.google.com/analytics/devguides/reporting/data/v1/quotas>",
        "status": "RESOURCE_EXHAUSTED"

    Sanjeev

    11/15/2022, 5:16 AM
    Hey everyone, is there a way to transform the data before sending it to the destination? Am I correct in assuming that Airbyte is ELT only? For more context, there is a column of type geometry in my source (MySQL) table which is not being extracted properly. I was wondering if I can transform it into a string before sending it to the destination.
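Airbyte is indeed EL(T): transformations happen after load (or via custom dbt), not in flight. One workaround for the geometry column is to cast it on the source side, for example by exposing a view that converts the geometry to WKT with `ST_AsText` and syncing the view instead of the base table. A sketch; table, column, and credentials are placeholders:

```python
# Sketch: create a MySQL view that casts the geometry column to text (WKT),
# then point the Airbyte stream at the view instead of the base table.
import pymysql

conn = pymysql.connect(host="mysql-host", user="airbyte", password="***", database="app")
try:
    with conn.cursor() as cur:
        cur.execute("""
            CREATE OR REPLACE VIEW locations_for_sync AS
            SELECT id, name, ST_AsText(geom) AS geom_wkt
            FROM locations
        """)
    conn.commit()
finally:
    conn.close()
```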

    Arjun Yadav

    11/15/2022, 6:38 AM
    I was trying to do a PoC on Airbyte. I downloaded the open-source version and am having issues. I am using a Mac with M1 and am currently getting the error below; can you please guide me on how to resolve it? 2022-11-15 06:24:59 WARN i.a.c.t.TemporalUtils(getTemporalClientWhenConnected):235 - Waiting for namespace default to be initialized in temporal

    Arjun Yadav

    11/15/2022, 6:38 AM
    Can someone please help me or guide me in resolving this issue?

    Vignesh Sankaran

    11/15/2022, 6:41 AM
    👋 Hello, team!

    kavi arasu

    11/15/2022, 7:03 AM
    Can anyone help on this? Is it possible in Airbyte?

    Vignesh Sankaran

    11/15/2022, 7:17 AM
    Hi Team Airbyte, I am posting this query representing the product team at Pentafox (https://pentafox.in). "We are using Airbyte extensively and replicate data from Postgres 9.6 to GBQ. We run in an ELT mode where we transform more than a billion records through Airbyte every time, using the dbt integration across 200 schemas. We are facing a memory-bloat issue when we use dbt with Airbyte: the repo containing our dbt code is cloned for every single run of every connection, even when the Git repo URL is the same. Ideally this should be checkpointed, with only a git pull when the URL is unchanged. Attaching two consecutive run logs showing the same repo being cloned twice for the same connection run twice. This is a production-impacting bug (I could not find anything in the docs about a setting to purge the cloned repos after every run), and any help will really save us a lot of time." FYI @Subramaniyaswamy Chellakumaran @Mohan Prasath
    airbyte_dbt_temp_72_log.txt airbyte_dbt_temp_73_log.txt