# ask-community-for-troubleshooting

    neeraj sen

    05/04/2023, 9:33 AM
    Hi guys, while setting up AppsFlyer as the source with the required parameter values, I'm getting the error below:
    fetch to failed
    Configuration check failed - 'The supplied API token is invalid'
    I've validated that the required parameters work fine... any suggestions?

    R I

    05/04/2023, 9:56 AM
    Hello! I'm struggling with the integration of GA4 data into ClickHouse via Airbyte for further use with dbt. I'm trying to figure out what is what and find a field-by-field description of the tables created by Airbyte. There are links in the documentation that are probably supposed to explain the preconfigured streams here: https://docs.airbyte.com/integrations/sources/google-analytics-data-api but they lead nowhere. Where can I read about the preconfigured streams, then?

    Piyush Singariya

    05/04/2023, 10:35 AM
    Hi, I was writing a Golang script to work with a destination connector, and I am getting this error from the destination logs:
    Copy code
    ignoring input which can't be deserialized as Airbyte Message: {"type":"RECORD","record":{"stream":"latest_view4","data":{"key":2,"someCol":"ttt","eventTime":"2020-01-01T02:02:02"}}}
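    For anyone hitting the same error when hand-rolling messages in Go: the record in the log above has no `emitted_at` field, which the Airbyte protocol requires on every RECORD message, and its absence is the most likely reason the destination's deserializer rejects the line (an educated guess from the log, not a confirmed diagnosis). A minimal sketch of a well-formed line, reusing the stream and data from the log, with an arbitrary epoch-milliseconds timestamp:

```shell
# Same payload as in the log, plus the required "emitted_at" (epoch millis).
# Airbyte protocol messages are newline-delimited JSON written to stdout.
echo '{"type":"RECORD","record":{"stream":"latest_view4","emitted_at":1683196800000,"data":{"key":2,"someCol":"ttt","eventTime":"2020-01-01T02:02:02"}}}'
```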

    Benjamin Edwards

    05/04/2023, 11:25 AM
    Hi all, I am trying to set up a connection between Braze (source) and Snowflake (destination). I know Braze is in the Alpha stage, but I wanted to test it. Can I confirm that it works with Airbyte Open Source? The documentation only mentions setting it up with Airbyte Cloud.

    Slackbot

    05/04/2023, 12:33 PM
    This message was deleted.

    Miguel Ángel Torres Font - Valencia C.F.

    05/04/2023, 12:38 PM
    Hello all, I am having a problem with a connection from an API to BigQuery. It used to take less than an hour, and now it takes more than an hour and a half to complete. I attach some logs:
    Copy code
    2023-05-04 09:00:19 destination > 2023-05-04 09:00:19 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):104 - Storage Object airbyte_bucket1/data/tour/tour_mestalla_transactions/2023/05/04/09/c27b4d8c-1797-4b94-b591-973cab6e9a7c/ has been created in bucket.
    2023-05-04 09:00:19 destination > 2023-05-04 09:00:19 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onStartFunction$3):107 - Preparing tmp tables in destination completed.
    2023-05-04 09:00:51 WARN c.n.s.JsonMetaSchema(newValidator):278 - Unknown keyword $defs - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
    2023-05-04 09:00:51 INFO i.a.v.j.JsonSchemaValidator(test):130 - JSON schema validation failed. errors: $: integer found, object expected
    2023-05-04 09:00:51 ERROR i.a.w.i.DefaultAirbyteStreamFactory(validate):87 - Validation failed: 2
    2023-05-04 09:01:46 INFO i.a.v.j.JsonSchemaValidator(test):130 - JSON schema validation failed. errors: $: integer found, object expected
    2023-05-04 09:01:46 ERROR i.a.w.i.DefaultAirbyteStreamFactory(validate):87 - Validation failed: 3

    Utkarsh

    05/04/2023, 1:14 PM
    hey guys, having trouble creating a connection between a source and destination. Below are the details. Source: Redshift. Destination: S3. Error:
    Copy code
    Internal message: java.lang.NullPointerException: null value in entry: isNullable=null
    Failure type: system_error
    
    Stacktrace
    java.lang.NullPointerException: null value in entry: isNullable=null
    let me know if you require full stack trace

    Dylan Foster

    05/04/2023, 4:34 PM
    Hey all, I'm having an issue setting up MongoDB standalone source with TLS.
    Copy code
    io.airbyte.db.exception.ConnectionErrorException: com.mongodb.MongoTimeoutException: Timed out after 30000 ms while waiting to connect. Client view of cluster state is {type=UNKNOWN, servers=[{address=preproduction-mongo-1.vpc.travelbank.com:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketWriteException: Exception sending message}, caused by {javax.net.ssl.SSLHandshakeException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target}, caused by {sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target}, caused by {sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target}}]
    	at io.airbyte.integrations.source.mongodb.MongoDbSource.getAuthorizedCollections(MongoDbSource.java:145)
    	at io.airbyte.integrations.source.mongodb.MongoDbSource.lambda$getCheckOperations$0(MongoDbSource.java:79)
    	at io.airbyte.integrations.source.relationaldb.AbstractDbSource.check(AbstractDbSource.java:85)
    	at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:121)
    	at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:97)
    	at io.airbyte.integrations.source.mongodb.MongoDbSource.main(MongoDbSource.java:52)
    Caused by: com.mongodb.MongoTimeoutException: Timed out after 30000 ms while waiting to connect. Client view of cluster state is {type=UNKNOWN, servers=[{address=preproduction-mongo-1.vpc.travelbank.com:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketWriteException: Exception sending message}, caused by {javax.net.ssl.SSLHandshakeException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target}, caused by {sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target}, caused by {sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target}}]
    	at com.mongodb.internal.connection.BaseCluster.getDescription(BaseCluster.java:181)
    	at com.mongodb.internal.connection.SingleServerCluster.getDescription(SingleServerCluster.java:41)
    	at com.mongodb.client.internal.MongoClientDelegate.getConnectedClusterDescription(MongoClientDelegate.java:144)
    	at com.mongodb.client.internal.MongoClientDelegate.createClientSession(MongoClientDelegate.java:101)
    	at com.mongodb.client.internal.MongoClientDelegate$DelegateOperationExecutor.getClientSession(MongoClientDelegate.java:291)
    	at com.mongodb.client.internal.MongoClientDelegate$DelegateOperationExecutor.execute(MongoClientDelegate.java:183)
    	at com.mongodb.client.internal.MongoDatabaseImpl.executeCommand(MongoDatabaseImpl.java:195)
    	at com.mongodb.client.internal.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:164)
    	at com.mongodb.client.internal.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:159)
    	at com.mongodb.client.internal.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:149)
    	at io.airbyte.integrations.source.mongodb.MongoDbSource.getAuthorizedCollections(MongoDbSource.java:130)
    	... 5 more
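    The PKIX errors in the trace above mean the JVM running the connector does not trust the certificate the Mongo server presents. One common fix, sketched here under the assumption that the server uses a self-signed or internally-signed certificate (the host comes from the log; the truststore path and password are JVM defaults, so verify them for your image), is to add the certificate to the JVM truststore:

```shell
# Capture the certificate the server presents (host/port from the log above)
openssl s_client -connect preproduction-mongo-1.vpc.travelbank.com:27017 \
  -showcerts </dev/null | openssl x509 -outform PEM > mongo.pem

# Import it into the JVM's default truststore ("changeit" is the stock password)
keytool -importcert -noprompt -alias mongo-tls \
  -keystore "$JAVA_HOME/lib/security/cacerts" \
  -storepass changeit -file mongo.pem
```

This is a configuration step against a live server and JVM, so treat it as a sketch rather than a drop-in command.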

    Neil Turner

    05/04/2023, 4:46 PM
    Hi all, I've set up a Facebook Ads to BigQuery connection. The connection runs without error, but the data seems wrong. For example, the attached screenshot is a query on the ads_insights table. The amount spent is a lot less than what's reported in the Facebook UI. Any suggestions on how to troubleshoot this? Could this be a datetime thing? The replication happens at 4 PM EST, but the day isn't over yet, so perhaps we're missing ad spend that occurs after the replication?

    Marcos Marx (Airbyte)

    05/04/2023, 5:02 PM
    We’re live on Zoom with Malik, talking about some cool features!

    Jeff Crooks

    05/04/2023, 5:17 PM
    Should we use this channel for help requests or go to Discourse? It seems the only responses recently are from the kapa bot.

    Tony Popov

    05/04/2023, 9:35 PM
    Hey Airbyte team, I’ve a question/issue report on the Xero connector. When syncing Xero Accounts, Airbyte only returns the accounts modified later than the start date. This leads to all sorts of join issues, e.g. transactions not joining to an account that has not been modified for years. Do you think it’s worth adding a stream config option to sync all historic data for a stream regardless of the start date, or just hardcoding it into the Accounts stream for Xero? BTW, not sure if it makes sense to have a similar sync-all approach for other Xero entities; waiting for a review from our analytics team on it. Thanks!

    Johannes Müller

    05/05/2023, 6:14 AM
    Are email addresses automatically hashed in the MySQL source connector? I would expect
    dpo_email
    to be actual email addresses like in the source database:

    Utkarsh

    05/05/2023, 6:42 AM
    it is surprising to have such a huge community, yet no one answers or helps you out with the issues you face in basic tasks such as creating connections! :)

    maddu kiran

    05/05/2023, 7:21 AM
    Hi, I am running Airbyte using the Helm chart. It was working fine, but now when I try to create a connection it shows this (image). What may be the cause? Thanks in advance.

    Yuva

    05/05/2023, 7:41 AM
    Hi all, with the RDS MySQL connector we face this error:
    java.time.DateTimeException: Invalid value for MonthOfYear (valid values 1 - 12): 0
    Similar to this issue on GitHub: https://github.com/airbytehq/airbyte/issues/24603. However, the source table has 37 million rows, so running a select query to isolate the faulty data is expensive. Any idea why this date-parsing issue still appears even with basic normalization turned off in Airbyte? I was hoping to load the data into the raw JSON column and analyze it from there.
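    For context, java.time rejects month 0, while MySQL can store zero dates (0000-00-00) when sql_mode is not strict, so rows like that are the usual culprit for this exception. A hedged sketch for counting only the offending rows instead of scanning all 37 million by hand (the host, table, and column names are placeholders, not from the thread):

```shell
# Count zero-date rows; MONTH() is also 0 for the all-zero date
mysql -h my-rds-host.example.com -u admin -p mydb -e \
  "SELECT COUNT(*) FROM my_table WHERE my_date_col = '0000-00-00' OR MONTH(my_date_col) = 0;"
```

This runs against a live server, so it is a diagnostic sketch rather than a verified command for this schema.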

    aidan

    05/05/2023, 9:15 AM
    Hi, I am getting this error in the UI on the Connections, Sources, and Destinations tabs. The connections are still running, and I can find them via the web URL. Is there a way to fix this? I have multiple Airbyte deployments on this version, and this is the only one with this issue. Version 44.4. Error message: "Oops! Something went wrong… Unknown error occurred"

    Sebastian Calderone

    05/05/2023, 3:27 PM
    Hi, I'm trying to set up Airbyte using Helm and an external Postgres database. The problem I'm facing is with Temporal. Once the temporal pod starts, it tries to create the
    temporal
    and
    temporal_visibility
    databases. This works flawlessly and I see the DBs created. But then for some reason it tries to recreate the DBs and fails with this error:
    Copy code
    ERROR	Unable to create SQL database.	{"error": "pq: database \"temporal\" already exists", "logging-call-at": "handler.go:97"}

    Micky

    05/05/2023, 4:00 PM
    Hi, I have deployed Airbyte on AWS EC2. I moved data to Snowflake successfully yesterday, but this morning I got an error message telling me the destination connection failed with "no space left on device". Any pointers? Thanks

    Roberto Tolosa

    05/05/2023, 4:23 PM
    hi – it's been almost 2 weeks since I got a response on this thread, and the issue is still ongoing. Could someone take another look? 🙏

    Micky

    05/05/2023, 5:14 PM
    Hi, I suddenly got the 'Oops! Something went wrong... Unknown error occurred' page. Can anyone help out? Thanks

    Timothy McFall

    05/05/2023, 11:26 PM
    Hi all, I'm getting an error while trying to do an incremental append sync of a 150M row Postgres table to Redshift:
    Copy code
    org.postgresql.util.PSQLException: ERROR: temporary file size exceeds temp_file_limit (10485760kB)
    The Postgres connector docs have a section on troubleshooting temporary file size limits, but it's a bit light on specifics:
    Some larger tables may encounter an error related to the temporary file size limit such as
    temporary file size exceeds temp_file_limit
    . To correct this error increase the temp_file_limit.
    My question is, are there any guidelines on how much I'd need to increase the
    temp_file_limit
    ? Does it need to be at least as big as the size of the table I'm replicating? Any help would be appreciated
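    For what it's worth, temp_file_limit caps the total temporary file space a single Postgres session may use, and a large sort or scan during an initial sync can exceed it. It does not strictly need to match the table size, but sizing it near the data volume of the biggest sync is the safe upper bound. A sketch of checking and raising it (superuser required; the host and the 20GB figure are placeholders, not recommendations):

```shell
# Show the current per-session limit
psql -h my-postgres-host -U postgres -c "SHOW temp_file_limit;"

# Raise it cluster-wide, then reload the config
# (ALTER SYSTEM must run outside a transaction, hence its own call)
psql -h my-postgres-host -U postgres -c "ALTER SYSTEM SET temp_file_limit = '20GB';"
psql -h my-postgres-host -U postgres -c "SELECT pg_reload_conf();"
```

On managed Postgres (e.g. RDS) this is set via the parameter group rather than ALTER SYSTEM.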

    Yannick Géry

    05/06/2023, 2:08 PM
    Hi everyone! When syncing Pipedrive to BigQuery I'm getting the same error on certain streams, especially notes. The error is always the same, during normalization:
    Copy code
    compiled SQL at ../build/run/airbyte_utils/models/generated/airbyte_incremental/scd/airbyte/pipedrive_notes_scd.sql,retryable=<null>,timestamp=1683380879753,additionalProperties={}], io.airbyte.config.FailureReason@2f2d5ed8[failureOrigin=normalization,failureType=system_error,internalMessage=Bad int64 value: dbe95fa0-ebeb-11ed-95e8-f321add3...,externalMessage=Normalization failed during the dbt run. This may indicate a problem with the data itself.,metadata=io.airbyte.config.Metadata@458f1e5d[additionalProperties={attemptNumber=2, jobId=89, from_trace_message=true}],stacktrace=AirbyteDbtError: 
    2 of 7 ERROR creating incremental model airbyte.pipedrive_notes_scd..................................................... [ERROR in 4.82s]
    Database Error in model pipedrive_notes_scd (models/generated/airbyte_incremental/scd/airbyte/pipedrive_notes_scd.sql)
      Bad int64 value: dbe95fa0-ebeb-11ed-95e8-f321add3...
      compiled SQL at ../build/run/airbyte_utils/models/generated/airbyte_incremental/scd/airbyte/pipedrive_notes_scd.sql

    Lucas Azevedo

    05/06/2023, 4:12 PM
    Hey guys, I'm trying to configure CDC from MySQL to S3. The first syncs worked fine, but now I'm facing sync issues. Looking at the sync logs, I see a loop reading the binlogs looking for new data; this is flushing hundreds or thousands of new files into our bucket, and the sync ends up failing (as if our MySQL is force-closing the connection). Has anyone implemented CDC to S3 and faced something similar? Maybe I'm doing something wrong...
    Copy code
    2023-05-05 16:20:26 destination > INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$getOrCreateBuffer$0):109 Starting a new buffer for stream atendimento (current state: 1 KB in 1 buffers)
    2023-05-05 16:20:26 destination > INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$getOrCreateBuffer$0):109 Starting a new buffer for stream chamada (current state: 3 KB in 2 buffers)

    Newbie

    05/07/2023, 4:20 AM
    Hi everyone, I have set up a standalone MongoDB instance as the source, but
    Error: non-json response
    is thrown when I try to set up BigQuery as the destination, even though both the source and destination connection tests have passed. Not sure what went wrong, as there are no other error logs for troubleshooting; before using Airbyte I was able to retrieve the data using PyMongo. Can someone please help?

    Sebastian Sundet Flaen

    05/07/2023, 9:56 AM
    Hi! I have upgraded Airbyte to version 0.44.2, and the option to choose between Raw data (JSON) and Normalized tabular data has disappeared for all my connections. I want to do all normalization steps in dbt, so I'd select Raw data (JSON). Can someone help me out? 😄

    Svatopluk Chalupa

    05/07/2023, 11:13 AM
    Hi, I need to change the destination table name. Prefixing is not good enough because I have to sync from interface views named "v_tablename" to tables named "tablename". Am I right that I have to use dbt for that and write a custom transformation? Thanks!

    Slackbot

    05/07/2023, 8:23 PM
    This message was deleted.

    Jordan Velich

    05/07/2023, 11:48 PM
    Hi, is there any way to change the heartbeat settings for temporal? We are having issues with the temporal service killing jobs due to an
    activity_timeout
    , despite the connector running just fine!

    Alon Bar

    05/08/2023, 5:15 AM
    Hey everyone, I’m creating a few connectors for Google Analytics 4 (GA4). While I was able to define the source and streams properly and get the initial data flowing into my destination (Postgres), the incremental syncs seem not to work (even with the default streams that come out of the box with the connector). For instance, I have this stream: daily_active_users, which was defined by default by the Airbyte connector. It successfully created the daily_active_users table but did not fetch new data; only after resetting the data and syncing the connector again did I get fresh data from my platform. Did anyone come across such a scenario?