Domenic
01/25/2023, 6:50 PM
Mayank V
01/25/2023, 8:05 PM
Jason Carter
01/25/2023, 8:33 PM
Kamal Chavda
01/25/2023, 8:41 PM
source > Max memory limit: 94670422016, JDBC buffer size: 1073741824
Any advice, documentation, or blog posts that would help with tuning Airbyte to speed things up?
Airbyte version 0.40.26 running on EKS.
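A commonly tuned set of knobs for this, assuming your EKS deployment passes the standard worker environment variables through to the job pods, are the job-container resource settings from the Airbyte configuration reference; the values below are purely illustrative and should be sized to your nodes:
JOB_MAIN_CONTAINER_CPU_REQUEST=1
JOB_MAIN_CONTAINER_CPU_LIMIT=2
JOB_MAIN_CONTAINER_MEMORY_REQUEST=2Gi
JOB_MAIN_CONTAINER_MEMORY_LIMIT=4Gi
Raising the memory limit gives the source's JDBC buffer more headroom; how these get wired in depends on the Helm chart or manifests you deploy with.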
Sean Zicari
01/25/2023, 8:46 PM
Chen Lin
01/25/2023, 8:47 PM
Jesus Rivero
01/25/2023, 8:50 PM
AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY?
Walker Philips
01/25/2023, 8:52 PM
Emre Arikan
01/09/2023, 10:43 AM
I am trying to run octavia init but am running into problems:
• I can run the command by SSHing into the machine with gcloud compute ssh ... and connect to Airbyte at localhost:8000.
• But if I run the command through the remote-exec provisioner script, it tells me it cannot reach the Airbyte instance.
Any help is appreciated!
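One likely cause with remote-exec, assuming the provisioner runs as soon as the VM is reachable, is that Airbyte has not finished starting by the time octavia init fires. A minimal sketch of a guard for the provisioner script (the --airbyte-url option is taken from the octavia-cli README; polling port 8000 is just a simple "is it up yet" check):
until curl -s -o /dev/null http://localhost:8000; do sleep 5; done
octavia --airbyte-url http://localhost:8000 init
If it still fails only under remote-exec, comparing the environment (PATH, user, working directory) between the SSH session and the provisioner session is the next thing to check.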
Somasekhar Reddy Palli
01/25/2023, 8:54 PM
campaign_type string, created , excluded_lists array, from_email string, from_name string, id string, is_segmented boolean, lists array, message_type string, name string, num_recipients integer, object string, send_time , sent_at , status string, status_id integer, status_label string, subject , template_id , updated ) USING csv LOCATION 'abfss:REDACTED_LOCAL_PART@sxsdc.dfs.core.windows.net/1674e209-94ae-4f1b-af7c-f9de6cb8fcf4/platformdbtest_sandbox/_airbyte_tmp_ihi_campaigns/' options ("header" = "true", "multiLine" = "true")
2023-01-25 17:31:49 -----------------------------------------------------------------------------------------------------------------------------------------------------^^^
2023-01-25 17:31:49
2023-01-25 17:31:49 at org.apache.spark.sql.hive.thriftserver.HiveThriftServerErrors$.runningQueryError(HiveThriftServerErrors.scala:48)
2023-01-25 17:31:49 at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:585)
2023-01-25 17:31:49 at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
2023-01-25 17:31:49 at com.databricks.unity.UCSEphemeralState$Handle.runWith(UCSEphemeralState.scala:41)
2023-01-25 17:31:49 at com.databricks.unity.HandleImpl.runWith(UCSHandle.scala:99)
2023-01-25 17:31:49 at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:484)
2023-01-25 17:31:49 at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.$anonfun$run$2(SparkExecuteStatementOperation.scala:353)
2023-01-25 17:31:49 at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
2023-01-25 17:31:49 at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties(ThriftLocalProperties.scala:149)
2023-01-25 17:31:49 at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties$(ThriftLocalProperties.scala:49)
2023-01-25 17:31:49 at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.withLocalProperties(SparkExecuteStatementOperation.scala:60)
2023-01-25 17:31:49 at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:331)
2023-01-25 17:31:49 at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:316)
2023-01-25 17:31:49 at java.security.AccessController.doPrivileged(Native Method)
2023-01-25 17:31:49 at javax.security.auth.Subject.doAs(Subject.java:422)
2023-01-25 17:31:49 at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878)
2023-01-25 17:31:49 at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2.run(SparkExecuteStatementOperation.scala:365)
2023-01-25 17:31:49 at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
2023-01-25 17:31:49 at java.util.concurrent.FutureTask.run(FutureTask.java:266)
2023-01-25 17:31:49 at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
2023-01-25 17:31:49 at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
2023-01-25 17:31:49 at java.lang.Thread.run(Thread.java:750)
2023-01-25 17:31:49 Caused by: org.apache.spark.sql.catalyst.parser.ParseException:
2023-01-25 17:31:49 [PARSE_SYNTAX_ERROR] Syntax error at or near ',': extra input ','(line 1, pos 149)
2023-01-25 17:31:49
*************************************************************************
Is there any open bug related to the Klaviyo source connector, or did I miss some configuration that resulted in the syntax error?
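The column list in the failing statement above is the likely trigger: created, send_time, sent_at, subject, template_id and updated are emitted with no data type, and that is exactly what Spark's parser reports as "extra input ','". A minimal illustration with made-up table names:
-- fails with [PARSE_SYNTAX_ERROR] ... extra input ','
CREATE TABLE t_bad (id string, created , status string) USING csv;
-- parses once every column has a type
CREATE TABLE t_ok (id string, created timestamp, status string) USING csv;
So the question to chase is why the destination's schema generation leaves those fields untyped.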
Omar Ayoub Salloum
01/25/2023, 8:57 PM
curl -H "Content-Type: application/x-gzip" -X POST "<http://airbytenew-api.airbyte.svc/api/v1/deployment/import>" --data-binary @/tmp/airbyte_archive.tar.gz
Object not found.
user
01/25/2023, 8:58 PM
user
01/25/2023, 9:35 PM
Mark Suemegi
01/06/2023, 12:30 PM
_airbyte_emitted_at and _airbyte_normalized_at
We run this job daily (on working days), and in most cases it works how we expect it to: the difference between these timestamps is at most a few minutes.
Could someone clarify what could cause days of difference, as seen in the row I highlighted in the screenshot?
Thank you! 😊
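To narrow down which rows and runs produced the gap, a quick check in the destination can help; the table name here is hypothetical and the timestamp arithmetic may need adjusting to your warehouse's dialect:
SELECT _airbyte_emitted_at, _airbyte_normalized_at, _airbyte_normalized_at - _airbyte_emitted_at AS normalization_lag FROM your_normalized_table ORDER BY normalization_lag DESC LIMIT 20;
Grouping the offending rows by sync and normalization run times usually shows whether a whole batch was normalized late or only individual records.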
user
01/25/2023, 9:37 PM
user
01/25/2023, 9:37 PM
Bob Seehra
01/25/2023, 9:43 PM
2023-01-25 01:50:54 INFO i.a.w.t.TemporalAttemptExecution(lambda$getWorkerThread$5):198 - Completing future exceptionally...
java.lang.RuntimeException: io.airbyte.workers.exception.WorkerException: Running the launcher replication-orchestrator failed
The sync fails when we have the replication orchestrator enabled. Disabling this allows the syncs to work, although we still see the exception in the logs (see thread for the full stack trace).
CONTAINER_ORCHESTRATOR_ENABLED=true
Are there additional environment settings we are missing that need to be added?
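Not a definitive answer, but for comparison, the orchestrator-related variables listed alongside CONTAINER_ORCHESTRATOR_ENABLED in Airbyte's environment-variable reference look roughly like this on Kubernetes; the values are placeholders, and which of them your version actually requires should be verified against the docs for 0.40.x:
CONTAINER_ORCHESTRATOR_ENABLED=true
CONTAINER_ORCHESTRATOR_IMAGE=airbyte/container-orchestrator:<your-airbyte-version>
CONTAINER_ORCHESTRATOR_SECRET_NAME=<secret-with-orchestrator-config>
CONTAINER_ORCHESTRATOR_SECRET_MOUNT_PATH=<mount-path-for-that-secret>
The orchestrator pod also needs working state/log storage settings (the S3/GCS/Minio variables the workers use), since it persists state there.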
Jon Simpson
01/25/2023, 9:52 PM
Joey Taleño
01/26/2023, 3:02 AM
Denis Meng
01/26/2023, 4:09 AM
Aman Satya
01/26/2023, 10:55 AM
Aman Satya
01/26/2023, 10:55 AM
Aman Satya
01/26/2023, 10:55 AM
Ola Sehlén
01/26/2023, 10:56 AM
Hampus Sandén
01/26/2023, 11:42 AM
I am using the open-source Airbyte running on GCP to retrieve data from Bing Ads. Right now I am retrieving all client accounts' data. How do I choose specific accounts when doing that?
According to the Requirements section of the documentation (Bing Ads - Airbyte Documentation), I can choose whether to retrieve data from all accounts or from specific accounts, but when I set up Bing Ads as a source connector, I don't see any such option… Please help.
Has anyone encountered this?
Andres Gutierrez
01/26/2023, 12:01 PM
Is it possible to use incremental and delete+insert for ClickHouse as a destination from Airbyte? By delete+insert I mean this.
If I understand correctly, this requires dbt-clickhouse@1.3.2, and I see dbt-clickhouse>=1.3.1 in Airbyte's master branch.
Also, how can I check which version of dbt-clickhouse is installed in my Airbyte installation?
In what Docker container should I check?
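One way to check, assuming normalization for ClickHouse runs from the airbyte/normalization-clickhouse image (the tag below is only an example; use the normalization version shown in your sync logs):
docker run --rm --entrypoint pip airbyte/normalization-clickhouse:0.2.25 show dbt-clickhouse
The dbt adapter is installed with pip inside that image, so pip show prints the installed dbt-clickhouse version.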
Mustafa Idris
01/26/2023, 12:42 PM
Léon Klung
01/26/2023, 1:37 PM
Sabbiu Shah
01/26/2023, 1:49 PM
When I set the API token, it says Invalid OAuth Token. And when I set the OAuth token, it says Invalid API token. Weird!
I've attached screenshots. It would be great if somebody could look into this. Thanks.
Andres Gutierrez
01/26/2023, 1:50 PM
SyncModes
I see these are the only options available in the Airbyte UI.
But would it be possible to do incremental + override and just override the changed rows?