# ask-community-for-troubleshooting

  • Domenic
    01/25/2023, 6:50 PM
    Hey folks. When will the OracleDB source be updated to support 'Replicate Incremental Deletes'? We are waiting on this feature to start using Airbyte.

  • Mayank V
    01/25/2023, 8:05 PM
    Hi! I am trying to add a new connector, but it doesn't show up in the UI locally. How can I troubleshoot it?

  • Jason Carter
    01/25/2023, 8:33 PM
    Hi Airbyte team! Is there any insight on when, or even if, this PR can be merged? It fixes an environment-variables issue when using the S3 config (and, in my case, kustomize). It was opened in October and I just ran into the same issue on v0.40.29: https://github.com/airbytehq/airbyte/pull/18413

  • Kamal Chavda
    01/25/2023, 8:41 PM
    Hi all, I'm using Airbyte to move data from Aurora Postgres to Redshift with the S3 copy strategy. I successfully copied one table with 219 records in two and a half minutes (initial load). However, when I try to copy multiple tables (21), the sync job seems to just sit there. The worker log shows:
    source > Max memory limit: 94670422016, JDBC buffer size: 1073741824
    Any advice, documentation, or blogs that would help with tuning Airbyte to speed things up?
    Airbyte version 0.40.26, running on EKS.
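    (A minimal sketch, not from the thread: per-job container resources are one knob that often matters here. This assumes a 0.40.x release where the worker reads these environment variables; the values below are illustrative only and should be sized to your EKS nodes.)
    # set on the airbyte-worker deployment or in .env
    JOB_MAIN_CONTAINER_CPU_REQUEST=1
    JOB_MAIN_CONTAINER_CPU_LIMIT=2
    JOB_MAIN_CONTAINER_MEMORY_REQUEST=2Gi
    JOB_MAIN_CONTAINER_MEMORY_LIMIT=4Gi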

  • Sean Zicari
    01/25/2023, 8:46 PM
    Is it possible to assign a different service account to Airbyte jobs, other than default?
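    (A minimal sketch, not from the thread, assuming your Airbyte version reads the JOB_KUBE_SERVICEACCOUNT variable; check the environment-variable reference for your release before relying on it. The service account name is a placeholder.)
    # set on the airbyte-worker deployment
    JOB_KUBE_SERVICEACCOUNT=my-airbyte-jobs-sa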

  • Chen Lin
    01/25/2023, 8:47 PM
    What is the Google Ads API that the Airbyte connector calls to get data? Is it https://googleads.googleapis.com/v12/customers/{customer_id}/googleAds:searchStream?
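    (For reference only, not a confirmation of what the connector does internally: a searchStream request against that endpoint looks roughly like this. The customer ID, tokens, and GAQL query are placeholders.)
    curl -X POST "https://googleads.googleapis.com/v12/customers/1234567890/googleAds:searchStream" \
      -H "Authorization: Bearer $OAUTH_ACCESS_TOKEN" \
      -H "developer-token: $DEVELOPER_TOKEN" \
      -H "Content-Type: application/json" \
      -d '{"query": "SELECT campaign.id, campaign.name FROM campaign"}'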

  • Jesus Rivero
    01/25/2023, 8:50 PM
    Hi all, is it possible to set up Airbyte to send logs to S3 using a service account role in EKS, instead of AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY?
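    (A minimal sketch of the usual IRSA setup, not from the thread; it assumes the pods' S3 client falls back to the default AWS credential chain when the key variables are left empty, which is worth verifying for your Airbyte version. The namespace, service account name, and role ARN are placeholders.)
    # attach an IAM role to the service account the Airbyte pods run as
    kubectl -n airbyte annotate serviceaccount airbyte-admin \
      eks.amazonaws.com/role-arn=arn:aws:iam::123456789012:role/airbyte-s3-logs
    # then leave AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY unset in the worker and server env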

  • Walker Philips
    01/25/2023, 8:52 PM
    Is there a way to check the resources currently granted to connections? I'd like to ensure that Airbyte is granted the majority of system resources.
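    (Not from the thread; generic ways to see what the sync containers are actually getting, depending on how Airbyte is deployed.)
    docker stats                                # Docker Compose: live CPU/memory per container
    kubectl -n airbyte top pods                 # Kubernetes: needs metrics-server
    kubectl -n airbyte describe pod <job-pod>   # shows the requests/limits set on a job pod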

  • Emre Arikan
    01/09/2023, 10:43 AM
    Hi guys, I am on GCP and trying to set up Airbyte and Octavia via Terraform on Compute Engine. I wrote scripts that run over SSH through the Terraform remote-exec provisioner after the Compute Engine instance itself is provisioned. I got to the point where I can spin up the Airbyte and Octavia containers with octavia init, but I'm running into problems: • I can run the command by SSHing into the machine with gcloud compute ssh ... and connect to Airbyte at localhost:8000. • But if I run the command through the remote-exec provisioner script, it tells me it cannot reach the Airbyte instance. Any help is appreciated!
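    (A minimal sketch, not from the thread: one common cause is the provisioner running octavia init before the Airbyte server is actually up. The health endpoint path below is the standard OSS one; adjust the host/port to your setup.)
    # wait for the Airbyte API before running octavia init
    until curl -sf http://localhost:8000/api/v1/health > /dev/null; do
      echo "waiting for Airbyte..."; sleep 5
    done
    octavia init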

  • Somasekhar Reddy Palli
    01/25/2023, 8:54 PM
    Hello All, I was trying to extract data from Klaviyo and ingest in Azure Databricks and the sync has failed with the syntax error. What I noticed was few columns were included in create table statement without data type which resulted in the syntax error. Here is the stack trace- ********************************************************************** 2023-01-25 173149 2023-01-25 120149 INFO i.a.w.t.s.ReplicationActivityImpl(lambda$replicate$3):209 - sync summary: io.airbyte.config.StandardSyncOutput@7be643fc[standardSyncSummary=io.airbyte.config.StandardSyncSummary@2c7ec77[status=failed,recordsSynced=41841,bytesSynced=60626678,startTime=1674647748084,endTime=1674648109925,totalStats=io.airbyte.config.SyncStats@7108bc3b[bytesEmitted=60626678,destinationStateMessagesEmitted=0,destinationWriteEndTime=1674648109924,destinationWriteStartTime=1674647748168,estimatedBytes=<null>,estimatedRecords=<null>,meanSecondsBeforeSourceStateMessageEmitted=0,maxSecondsBeforeSourceStateMessageEmitted=0,maxSecondsBetweenStateMessageEmittedandCommitted=0,meanSecondsBetweenStateMessageEmittedandCommitted=0,recordsEmitted=41841,recordsCommitted=0,replicationEndTime=1674648109925,replicationStartTime=1674647748084,sourceReadEndTime=1674648076066,sourceReadStartTime=1674647748126,sourceStateMessagesEmitted=0,additionalProperties={}],streamStats=[io.airbyte.config.StreamSyncStats@22fbe853[streamName=campaigns,streamNamespace=<null>,stats=io.airbyte.config.SyncStats@6bcba653[bytesEmitted=118350,destinationStateMessagesEmitted=<null>,destinationWriteEndTime=<null>,destinationWriteStartTime=<null>,estimatedBytes=<null>,estimatedRecords=<null>,meanSecondsBeforeSourceStateMessageEmitted=<null>,maxSecondsBeforeSourceStateMessageEmitted=<null>,maxSecondsBetweenStateMessageEmittedandCommitted=<null>,meanSecondsBetweenStateMessageEmittedandCommitted=<null>,recordsEmitted=95,recordsCommitted=<null>,replicationEndTime=<null>,replicationStartTime=<null>,sourceReadEndTime=<null>,sourceReadStartTime=<null>,sourceStateMessagesEmitted=<null>,additionalProperties={}]], 
io.airbyte.config.StreamSyncStats@56525b09[streamName=events,streamNamespace=<null>,stats=io.airbyte.config.SyncStats@620fdc9c[bytesEmitted=60508328,destinationStateMessagesEmitted=<null>,destinationWriteEndTime=<null>,destinationWriteStartTime=<null>,estimatedBytes=<null>,estimatedRecords=<null>,meanSecondsBeforeSourceStateMessageEmitted=<null>,maxSecondsBeforeSourceStateMessageEmitted=<null>,maxSecondsBetweenStateMessageEmittedandCommitted=<null>,meanSecondsBetweenStateMessageEmittedandCommitted=<null>,recordsEmitted=41746,recordsCommitted=<null>,replicationEndTime=<null>,replicationStartTime=<null>,sourceReadEndTime=<null>,sourceReadStartTime=<null>,sourceStateMessagesEmitted=<null>,additionalProperties={}]]]],normalizationSummary=<null>,webhookOperationSummary=<null>,state=<null>,outputCatalog=io.airbyte.protocol.models.ConfiguredAirbyteCatalog@1fcb09fe[streams=[io.airbyte.protocol.models.ConfiguredAirbyteStream@2904dd6a[stream=io.airbyte.protocol.models.AirbyteStream@6d4a5150[name=campaigns,jsonSchema={"type":"object","properties":{"id":{"type":"string"},"name":{"type":"string"},"lists":{"type":"array","items":{"type":"object","properties":{"id":{"type":"string"},"name":{"type":"string"},"folder":{"type":["null","string"]},"object":{"type":"string"},"created":{"type":"string","format":"date-time"},"updated":{"type":"string","format":"date-time"},"list_type":{"type":"string"},"person_count":{"type":"integer"}}}},"object":{"type":"string"},"status":{"type":"string"},"created":{"type":["null","string"],"format":"date-time"},"sent_at":{"type":["null","string"],"format":"date-time"},"subject":{"type":["null","string"]},"updated":{"type":["null","string"],"format":"date-time"},"from_name":{"type":"string"},"send_time":{"type":["null","string"],"format":"date-time"},"status_id":{"type":"integer"},"from_email":{"type":"string"},"template_id":{"type":["null","string"]},"is_segmented":{"type":"boolean"},"message_type":{"type":"string"},"status_label":{"type":"string"},"campaign_type":{"type":"string"},"excluded_lists":{"type":"array","items":{"type":"object","properties":{"id":{"type":"string"},"name":{"type":"string"},"folder":{"type":["null","string"]},"object":{"type":"string"},"created":{"type":"string","format":"date-time"},"updated":{"type":"string","format":"date-time"},"list_type":{"type":"string"},"person_count":{"type":"integer"}}}},"num_recipients":{"type":"integer"}}},supportedSyncModes=[full_refresh],sourceDefinedCursor=<null>,defaultCursorField=[],sourceDefinedPrimaryKey=[[id]],namespace=<null>,additionalProperties={}],syncMode=full_refresh,cursorField=[],destinationSyncMode=overwrite,primaryKey=[[id]],additionalProperties={}], 
io.airbyte.protocol.models.ConfiguredAirbyteStream@713d8b59[stream=io.airbyte.protocol.models.AirbyteStream@367ee879[name=events,jsonSchema={"type":"object","properties":{"id":{"type":"string"},"uuid":{"type":"string"},"object":{"type":"string"},"person":{"type":"object","properties":{"id":{"type":"string"},"$email":{"type":"string"},"object":{"type":"string"}}},"flow_id":{"type":["null","string"]},"datetime":{"type":"string"},"timestamp":{"type":"integer"},"event_name":{"type":"string"},"campaign_id":{"type":["null","string"]},"statistic_id":{"type":"string"},"flow_message_id":{"type":["null","string"]},"event_properties":{"type":"object","properties":{"items":{"type":"array","items":{"type":"object","properties":{"sku":{"type":"string"},"name":{"type":"string"},"price":{"type":"integer"},"object":{"type":"string"},"quantity":{"type":"integer"}}}},"$value":{"type":"number"}}}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=true,defaultCursorField=[timestamp],sourceDefinedPrimaryKey=[[id]],namespace=<null>,additionalProperties={}],syncMode=full_refresh,cursorField=[timestamp],destinationSyncMode=overwrite,primaryKey=[[id]],additionalProperties={}]],additionalProperties={}],failures=[io.airbyte.config.FailureReason@391e8c51[failureOrigin=destination,failureType=system_error,internalMessage=java.sql.SQLException: [Databricks][DatabricksJDBCDriver](500051) ERROR processing query/statement. Error Code: 0, SQL state: org.apache.hive.service.cli.HiveSQLException: Error running query: [PARSE_SYNTAX_ERROR] org.apache.spark.sql.catalyst.parser.ParseException: 2023-01-25 173149 [PARSE_SYNTAX_ERROR] Syntax error at or near ',': extra input ','(line 1, pos 149) 2023-01-25 173149 2023-01-25 173149 == SQL == 2023-01-25 173149 CREATE TABLE platformdbtest_sandbox._airbyte_tmp_ihi_campaigns (_airbyte_ab_id string, _airbyte_emitted_at string,
    campaign_type string, created , excluded_lists array, from_email string, from_name string, id string, is_segmented boolean, lists array, message_type string, name string, num_recipients integer, object string, send_time , sent_at , status string, status_id integer, status_label string, subject , template_id , updated
    ) USING csv LOCATION 'abfss:REDACTED_LOCAL_PART@sxsdc.dfs.core.windows.net/1674e209-94ae-4f1b-af7c-f9de6cb8fcf4/platformdbtest_sandbox/_airbyte_tmp_ihi_campaigns/' options ("header" = "true", "multiLine" = "true") 2023-01-25 173149 -----------------------------------------------------------------------------------------------------------------------------------------------------^^^ 2023-01-25 173149 2023-01-25 173149 at org.apache.spark.sql.hive.thriftserver.HiveThriftServerErrors$.runningQueryError(HiveThriftServerErrors.scala:48) 2023-01-25 173149 at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:585) 2023-01-25 173149 at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) 2023-01-25 173149 at com.databricks.unity.UCSEphemeralState$Handle.runWith(UCSEphemeralState.scala:41) 2023-01-25 173149 at com.databricks.unity.HandleImpl.runWith(UCSHandle.scala:99) 2023-01-25 173149 at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:484) 2023-01-25 173149 at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.$anonfun$run$2(SparkExecuteStatementOperation.scala:353) 2023-01-25 173149 at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) 2023-01-25 173149 at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties(ThriftLocalProperties.scala:149) 2023-01-25 173149 at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties$(ThriftLocalProperties.scala:49) 2023-01-25 173149 at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.withLocalProperties(SparkExecuteStatementOperation.scala:60) 2023-01-25 173149 at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:331) 2023-01-25 173149 at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:316) 2023-01-25 173149 at java.security.AccessController.doPrivileged(Native Method) 2023-01-25 173149 at javax.security.auth.Subject.doAs(Subject.java:422) 2023-01-25 173149 at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878) 2023-01-25 173149 at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2.run(SparkExecuteStatementOperation.scala:365) 2023-01-25 173149 at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) 2023-01-25 173149 at java.util.concurrent.FutureTask.run(FutureTask.java:266) 2023-01-25 173149 at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) 2023-01-25 173149 at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) 2023-01-25 173149 at java.lang.Thread.run(Thread.java:750) 2023-01-25 173149 Caused by: org.apache.spark.sql.catalyst.parser.ParseException: 2023-01-25 173149 [PARSE_SYNTAX_ERROR] Syntax error at or near ',': extra input ','(line 1, pos 149) 2023-01-25 173149 ************************************************************************* Is there any open bug related to Klaviyo source connector or did I miss any configuration that resulted in the syntax error?

  • Omar Ayoub Salloum
    01/25/2023, 8:57 PM
    I am trying to restore a backup from cluster version 0.39.38 to the latest version 0.40.29, but I am getting an error:
    curl -H "Content-Type: application/x-gzip" -X POST "http://airbytenew-api.airbyte.svc/api/v1/deployment/import" --data-binary @/tmp/airbyte_archive.tar.gz
    
    Object not found.

  • user
    01/25/2023, 8:58 PM
    Hello god830, it's been a while without an update from us. Are you still having problems or did you find a solution?

  • user
    01/25/2023, 9:35 PM
    Hey Sheshan, sorry for the long delay here. Were you able to get this working?

  • Mark Suemegi
    01/06/2023, 12:30 PM
    Hi all, I have an issue where, for some records, there is a multiple-day difference between _airbyte_emitted_at and _airbyte_normalized_at. We run this job daily (on working days), and in most cases it works as we expect: the difference between these times is at most a few minutes. Could someone clarify what could cause a difference of days, as seen in the row I highlighted on the screenshot? Thank you! 😊

  • user
    01/25/2023, 9:37 PM
    Hello Vikas, it's been a while without an update from us. Are you still having problems or did you find a solution?

  • user
    01/25/2023, 9:37 PM
    Hello Philip Johnson, it's been a while without an update from us. Are you still having problems or did you find a solution?

  • Bob Seehra
    01/25/2023, 9:43 PM
    Hi All! We are upgrading from 0.40.4 to 0.40.29 in our test environment running on GKE and are now experiencing issues with the replication-orchestrator for all of our syncs.
    2023-01-25 01:50:54 INFO i.a.w.t.TemporalAttemptExecution(lambda$getWorkerThread$5):198 - Completing future exceptionally...
    java.lang.RuntimeException: io.airbyte.workers.exception.WorkerException: Running the launcher replication-orchestrator failed
    The sync fails when we have the replication-orchestrator enabled. Disabling it allows the syncs to work, although we still see the exception in the logs (see the thread for the full stack trace). We have set:
    CONTAINER_ORCHESTRATOR_ENABLED=true
    Are there additional environment settings we are missing that need to be added?

  • Jon Simpson
    01/25/2023, 9:52 PM
    It seems like a large number of the connectors, whether sources or destinations, are marked alpha or beta. But from discussions and issues it seems like people are already using them in production. Is there an outline of what Airbyte intends the labels to mean? For example, the website does not mention the beta/alpha status at all for any of the connectors, as far as I can tell (e.g. https://airbyte.com/connectors/shopify), but when using the UI to create a source it's 'alpha'.

  • Joey Taleño
    01/26/2023, 3:02 AM
    Hi team, I'm using Airbyte Cloud and setting up the Google Analytics (UA) source. However, I'm getting this error. Please help!

  • Denis Meng
    01/26/2023, 4:09 AM
    Hi Airbyte team, we are using Airbyte 0.34 right now, and we also have Monte Carlo monitoring our table data to ensure data quality. One of the things we constantly do is, based on a table-data alert, go to Airbyte and fix failed syncs. Because there are so many tables buried in different connections, it can sometimes be challenging to locate a table quickly. Is there an easy way to find the connection based on a table name? TIA
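    (A minimal sketch, not from the thread, assuming your version exposes the standard /api/v1/connections/list endpoint and that the field names match; the host, workspace ID, and table name are placeholders.)
    curl -s -X POST http://localhost:8000/api/v1/connections/list \
      -H "Content-Type: application/json" \
      -d '{"workspaceId": "<workspace-id>"}' \
      | jq -r '.connections[] | select(any(.syncCatalog.streams[]; .stream.name == "my_table")) | .name'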

  • Aman Satya
    01/26/2023, 10:55 AM
    Hello everyone, when I build any existing Java connector using the gradlew command, I get an error: Task :airbyte-integrations:bases:base:airbyteDocker failed. Following the stack trace of the build, the error is caused by an IOException: CreateProcess error=193, %1 is not a valid Win32 application. It is unable to run the build_image.sh script.

  • Aman Satya
    01/26/2023, 10:55 AM
    If I build any Python connector using the docker build command, it does not fail.

  • Aman Satya
    01/26/2023, 10:55 AM
    Only Java connectors are throwing this kind of error. Please help me in this regard.

  • Ola Sehlén
    01/26/2023, 10:56 AM
    Hi, using the Snapchat connector together with BigQuery, I seem to be missing some Ad IDs in the "ads" table vs. the "ads_stats_daily" table.
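    (A minimal sketch, not from the thread, assuming the data lands in BigQuery; the dataset and the column names ad_id/id are guesses and should be adjusted to the actual schema.)
    bq query --use_legacy_sql=false '
      SELECT DISTINCT s.ad_id
      FROM `my_dataset.ads_stats_daily` s
      LEFT JOIN `my_dataset.ads` a ON a.id = s.ad_id
      WHERE a.id IS NULL'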

  • Hampus Sandén
    01/26/2023, 11:42 AM
    Hi, I noticed when setting up a new Bing Ads connector that some fields described in the documentation are missing from the Source Settings. The problem is already explained here by another user: https://discuss.airbyte.io/t/selecting-specific-accounts-when-using-bing-ads-source-connector/3648
    I am using the Open Source Airbyte running on GCP to retrieve data from Bing Ads. Right now, I am retrieving all client accounts' data. How do I choose specific accounts when doing that?
    
    Here on this website, in the Requirements section (Bing Ads - Airbyte Documentation), I can choose whether to retrieve from all accounts or specific accounts, but when I am setting up Bing Ads as a source connector, I don't see any options… Please help
    Has anyone encountered this?

  • Andres Gutierrez
    01/26/2023, 12:01 PM
    Hi, quick question: does anyone know if there is already work in progress to support incremental and delete+insert for ClickHouse as a destination in Airbyte? By delete+insert I mean this. If I understand correctly, this requires dbt-clickhouse@1.3.2, and I see 1.3.1 in Airbyte's master branch. Also, how can I check which version of dbt-clickhouse is installed in my Airbyte installation? In which Docker container should I check?
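    (A minimal sketch, not from the thread, assuming normalization for ClickHouse runs in the airbyte/normalization-clickhouse image; verify the image name and tag against your own deployment.)
    docker run --rm --entrypoint pip airbyte/normalization-clickhouse:<your-tag> show dbt-clickhouse
    # or, against a running normalization container:
    docker exec <container-id> pip show dbt-clickhouse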

  • Mustafa Idris
    01/26/2023, 12:42 PM
    Hi everyone, I'm trying to set up a connection in Airbyte. I've successfully set up the source (PostgreSQL), but I'm getting the following issue with the destination (BigQuery). What should I do?

  • Léon Klung
    01/26/2023, 1:37 PM
    Hi everyone, I installed Docker and Airbyte on a VM instance on GCP. I ran docker compose up, but now it is stuck at airbyte-temporal (see screenshot below). I did not find any resources to resolve this. Any idea how to fix it? Let me know if more info is needed. Thanks! ❤️

  • Sabbiu Shah
    01/26/2023, 1:49 PM
    Hi, while setting up "Square" as a source I am getting some weird issues. When I set the API token, it says Invalid OAuth Token. And when I set the OAuth Token, it says Invalid API token. Weird! I've attached screenshots. It would be great if somebody could look into this. Thanks

  • Andres Gutierrez
    01/26/2023, 1:50 PM
    Hi, I see these options in SyncModes, and they seem to be the only options available in the Airbyte UI. But would it be possible to do incremental + override and just override the changed rows?