Cristiano Sarmento
12/03/2021, 2:51 PM
Joël Luijmes
12/03/2021, 3:14 PM
Ping
12/06/2021, 10:06 PM
Valentin Nourdin
12/07/2021, 2:28 PM
Harsha Teja Kanna
12/07/2021, 5:23 PM
Manav
12/08/2021, 10:12 PM
io.airbyte.workers.WorkerException: Error while getting checking connection
- has anyone run into this before?
Mohamed Magdy
12/09/2021, 5:09 PM
Davin Chia (Airbyte)
Matthew Tovbin
12/10/2021, 6:25 PM
Jared Rhizor (Airbyte)
Run brew install openjdk@17 to get to the most recent version.
If you see a version error when running Gradle commands, try running ./gradlew clean or removing .gradle from the root directory of Airbyte. If you're switching between master and a current branch you may have to do this multiple times (merging the latest master into your branch after the Java 17 update should prevent you from having to do this repeatedly).
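A minimal sketch of that cleanup sequence, assuming a Homebrew-based macOS setup and that you run it from the Airbyte repo root (the final rebuild step is an assumption, not part of the original message):
brew install openjdk@17   # install the newer JDK mentioned above
./gradlew clean           # clear build outputs produced with the old JDK
rm -rf .gradle            # drop the cached Gradle state in the repo root if the version error persists
./gradlew build           # assumed follow-up: rebuild once Java 17 is the active JDK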
To configure IntelliJ, check that in Project Structure the Project SDK is Java 17 and the language level is set to 17. This may happen automatically, but it likely depends on your configuration.
Please let me know if you see anything surprising locally or on CI.
We're first releasing this for our internal cloud project and then will soon release it for the OSS project.
Rheza
12/12/2021, 2:19 PM
Oleksandr Tsukanov [GL]
12/13/2021, 9:13 AM
airbyte-worker | Exception in thread "main" io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: Deadline exceeded after 4.998952187s.
airbyte-worker | at io.grpc.stub.ClientCalls.toStatusRuntimeException(ClientCalls.java:262)
airbyte-worker | at io.grpc.stub.ClientCalls.getUnchecked(ClientCalls.java:243)
airbyte-worker | at io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:156)
airbyte-worker | at io.grpc.health.v1.HealthGrpc$HealthBlockingStub.check(HealthGrpc.java:252)
airbyte-worker | at io.temporal.serviceclient.WorkflowServiceStubsImpl.lambda$checkHealth$2(WorkflowServiceStubsImpl.java:282)
airbyte-worker | at io.temporal.internal.retryer.GrpcSyncRetryer.retry(GrpcSyncRetryer.java:61)
airbyte-worker | at io.temporal.internal.retryer.GrpcRetryer.retryWithResult(GrpcRetryer.java:51)
airbyte-worker | at io.temporal.serviceclient.WorkflowServiceStubsImpl.checkHealth(WorkflowServiceStubsImpl.java:275)
airbyte-worker | at io.temporal.serviceclient.WorkflowServiceStubsImpl.<init>(WorkflowServiceStubsImpl.java:182)
airbyte-worker | at io.temporal.serviceclient.WorkflowServiceStubs.newInstance(WorkflowServiceStubs.java:51)
airbyte-worker | at io.temporal.serviceclient.WorkflowServiceStubs.newInstance(WorkflowServiceStubs.java:41)
airbyte-worker | at io.airbyte.workers.temporal.TemporalUtils.createTemporalService(TemporalUtils.java:40)
airbyte-worker | at io.airbyte.workers.WorkerApp.main(WorkerApp.java:189)
What I have done:
• Switched global env to JDK 17
• Rebuilt Docker images
• Switched IntelliJ to JDK 17
• Ran VERSION=dev docker-compose up
Has anybody faced a similar issue?
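A rough way to check whether the worker can actually reach Temporal here, assuming the default docker-compose service names (the env var grep is an assumption about how the worker is configured):
docker-compose ps airbyte-temporal                         # is the Temporal container up and healthy?
docker-compose logs --tail=50 airbyte-temporal             # look for startup errors before the worker's 5s health check times out
docker-compose exec airbyte-worker env | grep -i temporal  # confirm the worker points at the expected Temporal host/port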
Bruno
12/13/2021, 2:41 PM
aws_secret_access_key
Is there anywhere in the docs, or Airbyte's repos, where we could find a list of the hidden (and non-hidden) fields for the different connectors? Finding them out by trial and error, taking into account the vast number of connectors available, is extremely difficult.
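For what it's worth, fields the UI masks are flagged in each connector's spec with "airbyte_secret": true, so one hedged way to list them from a local clone of the repo (the path is an assumption; spec.json locations differ between Java and Python connectors) is:
grep -rn --include=spec.json '"airbyte_secret"' airbyte-integrations/connectors/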
Mohamed Magdy
12/13/2021, 11:04 PM
2021-12-13 22:57:58 INFO i.a.s.RequestLogger(filter):95 - {cloud_workspace_app_root=/workspace/server/logs} - REQ 10.30.33.154 GET 200 /api/v1/health
======= service endpoint: http://airbyte-minio:%!s(int=9000)
Ryan N
12/14/2021, 1:08 AM
# Worker pod tolerations and node selectors
JOB_POD_TOLERATIONS=
JOB_POD_NODE_SELECTORS=
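If these behave like the JOB_KUBE_* variables in current Airbyte docs, tolerations are semicolon-separated groups of comma-separated key=value pairs and node selectors are comma-separated key=value pairs; purely illustrative values (not from this thread):
JOB_POD_TOLERATIONS=key=airbyte,operator=Equal,value=jobs,effect=NoSchedule
JOB_POD_NODE_SELECTORS=node-pool=airbyte-jobs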
Eugene Krall
12/14/2021, 9:55 AM
Sai
12/14/2021, 3:09 PM
With Salesforce as the source and GCP GCS as the destination, does Airbyte pull all the data from the source first, stage it on temporary storage, and then start streaming the temporarily stored data to the destination (i.e. GCP GCS)? I see streams usage but would like to understand how this is handled with large datasets like 20 GB or 30 GB of data.
Disclaimer: just started using Airbyte.
Cristiano Sarmento
12/14/2021, 7:19 PM
gunu
12/14/2021, 11:59 PM
Manav
12/15/2021, 3:58 AM
Jason Edwards
12/17/2021, 11:14 PM
Alexander Furer
12/20/2021, 9:51 AM
Eugene Krall
12/21/2021, 8:02 PM
Eugene Krall
12/22/2021, 2:27 PM
Database Error in model messages (models/airbyte_incremental/whatsapp/messages.sql)
Syntax error: Expected "(" or keyword SELECT or keyword WITH but got keyword CREATE at [9:5]
The model is trying to create an intermediary table like it did in the original models.
Remi Salmon
12/22/2021, 8:07 PM
100038 (22018): Numeric value 'KEYWORD' is not recognized
) because of a wrong type definition here: https://github.com/airbytehq/airbyte/blob/13ac480a8b0024360d35c20fd6d640296b57f137[…]source-google-ads/source_google_ads/schemas/keyword_report.json
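A quick, hedged way to see what type each column is declared as in that schema file (the repo path is taken from the link above; the jq filter assumes the usual top-level "properties" layout of these schemas):
jq '.properties | to_entries[] | {field: .key, type: .value.type}' \
  airbyte-integrations/connectors/source-google-ads/source_google_ads/schemas/keyword_report.json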
developersteve
12/23/2021, 1:59 AM
Seth Saperstein
12/25/2021, 5:21 AM
dbt run --select <model>+
However, to get the raw model into my dbt project, I don't love the suggestion of hopping into the Airbyte container, determining the normalization directory, grabbing the generated dbt model, syncing that back to a dbt repo, and then integrating the dbt repo in the Airbyte job configuration.
Has anyone found a better way of exporting dbt models? I'm also planning on running on dbt Cloud, and this configuration means that I cannot "deploy" models via dbt Cloud when the source dataset changes. Alternatively, I could trigger the dbt Cloud API, but that isn't possible directly via Airbyte, which means I would then have to use Airflow to schedule the Airbyte job and then kick off the dbt Cloud job. This means that Airflow code must be written for new data sources, and I'm looking to keep data integration and normalization self-service to speed up development time for new datasets.
If anyone has suggestions, I'm all ears.
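One hedged workaround for pulling the generated models out without a shell session in the container, assuming the default docker-compose deployment where job workspaces live under /tmp/workspace in the airbyte-server container (the job id, attempt number, and target directory below are placeholders):
docker cp airbyte-server:/tmp/workspace/<job_id>/<attempt>/normalize ./generated_normalization_dbt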
Zawar khan
12/27/2021, 1:11 AM
Wisse Jelgersma
12/28/2021, 12:22 PM
Fernando Nava
12/28/2021, 9:55 PM