Krzysztof  11/23/2022, 11:16 AM

Dragan  11/23/2022, 11:31 AM
Response too large to return. Consider specifying a destination table in your job configuration
Has anyone seen this error, and if so, were you able to resolve it? There is no setting when creating a source for how the data is fetched from it.
This seems to be an issue with the old BigQuery API, but is there an option to override it in Airbyte?

Rahul Borse
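For context on the error above: BigQuery refuses to return query results over a size limit unless the job configuration names an explicit destination table (and, for legacy SQL, sets allowLargeResults). There is no such option in the connector UI today; at the BigQuery REST API level the relevant job configuration looks roughly like this (the project, dataset, and table names are placeholders):

```json
{
  "configuration": {
    "query": {
      "query": "SELECT * FROM my_dataset.huge_table",
      "destinationTable": {
        "projectId": "my-project",
        "datasetId": "my_dataset",
        "tableId": "query_results"
      },
      "allowLargeResults": true,
      "writeDisposition": "WRITE_TRUNCATE"
    }
  }
}
```

With standard SQL, specifying the destination table alone is enough; allowLargeResults is only honored for legacy SQL jobs.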
11/23/2022, 12:27 PM

Abhijeet Singh  11/23/2022, 12:32 PM

Rahul Borse  11/23/2022, 1:16 PM

Guy Feldman
11/23/2022, 3:15 PMException: ('Could not discover schema for source',
trying to run discovery in octavia cli. Tailing the logs in the temporal worker I see a null pointer exception
java.lang.NullPointerException: null
at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:889) ~[guava-31.1-jre.jar:?]
at io.airbyte.workers.internal.VersionedAirbyteStreamFactory.<init>(VersionedAirbyteStreamFactory.java:65) ~[io.airbyte-airbyte-commons-worker-0.40.16.jar:?]
at io.airbyte.workers.internal.VersionedAirbyteStreamFactory.<init>(VersionedAirbyteStreamFactory.java:56) ~[io.airbyte-airbyte-commons-worker-0.40.16.jar:?]
at io.airbyte.workers.temporal.discover.catalog.DiscoverCatalogActivityImpl.lambda$getWorkerFactory$2(DiscoverCatalogActivityImpl.java:127) ~[io.airbyte-airbyte-workers-0.40.16.jar:?]
This also happens in the UI. Airbyte version 0.40.16.

Joviano Cicero Costa Junior  11/23/2022, 5:23 PM

Jerri Comeau (Airbyte)

Will Callaghan  11/23/2022, 8:36 PM
wcmatch.globmatch in the S3 connector? In our use case, it would be very helpful if we could use a negation pattern in addition to an inclusion pattern. This would require adding support for the negate flag. The change would essentially be:
What currently exists:
flags = GLOBSTAR | SPLIT
Adding a new flag:
flags = GLOBSTAR | SPLIT | NEGATE
https://github.com/airbytehq/airbyte/blob/8d4f7db4e717a39acb3266497c2e8d77d71ab2b3[…]/connectors/source-s3/source_s3/source_files_abstract/source.py

Shangwei Wang
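For reference, the behavior the NEGATE flag would enable can be sketched with the stdlib fnmatch as a stand-in. This is only an illustration of the desired semantics, not wcmatch's implementation; with wcmatch itself you would pass a pattern string like "**/*.csv|!tmp/*" together with GLOBSTAR | SPLIT | NEGATE (and note that fnmatch's * also crosses "/", unlike wcmatch's globstar-aware matching):

```python
from fnmatch import fnmatchcase

def globmatch_with_negation(path: str, patterns: str) -> bool:
    """Approximate SPLIT + NEGATE semantics: '|'-separated patterns,
    where a leading '!' turns a pattern into an exclusion."""
    includes, excludes = [], []
    for pat in patterns.split("|"):
        (excludes if pat.startswith("!") else includes).append(pat.lstrip("!"))
    # Match if the path hits at least one include pattern (or there are
    # none) and no exclude pattern.
    included = not includes or any(fnmatchcase(path, p) for p in includes)
    excluded = any(fnmatchcase(path, p) for p in excludes)
    return included and not excluded

# Include every CSV, but filter out anything under tmp/:
print(globmatch_with_negation("data/a.csv", "*/*.csv|!tmp/*"))  # True
print(globmatch_with_negation("tmp/b.csv", "*/*.csv|!tmp/*"))   # False
```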
11/23/2022, 11:54 PM
• airbyte version - 0.40.18
• postgres source version - 1.0.23
• redshift destination version - 0.3.51
1. We use “incremental dedup” for 2 tables (postgres -> redshift). When I was examining the queries in the log, I noticed that one query uses updatedAt >= ? while the other uses updatedAt > ?; the difference is the presence of the = sign. Intuitively, I thought all incremental queries should be without the =. What's the deciding factor for adding/omitting the = sign? _Update_: I just found some docs on this; the updatedAt timestamps for both tables are down to millisecond resolution, so I'm not sure why there is a difference in the queries…
2. For the query that has the = sign, the connection state has cursor_record_count = 333.
   a. What does “cursor count” represent in Airbyte?
   b. Is this normal?
Thanks a bunch, and happy holidays!

Jason Maddern
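On question 2a above: in recent postgres source versions the saved state tracks cursor_record_count, the number of records already emitted whose cursor equals the saved cursor value. When that count is known, the next sync can query with updatedAt >= ? and skip the already-seen ties, which is safer than > when several rows share one cursor value (a strict > would silently drop late-arriving rows with the same timestamp). A rough stdlib sketch of the idea, not Airbyte's actual implementation:

```python
def incremental_read(rows, cursor_value, cursor_record_count):
    """One incremental sync. Rows at exactly the saved cursor are
    re-read (hence >=), and the first `cursor_record_count` of them are
    skipped because a previous sync already emitted them."""
    # The >= filter is what the logged "updatedAt >= ?" query does.
    candidates = [r for r in rows
                  if cursor_value is None or r["updatedAt"] >= cursor_value]
    candidates.sort(key=lambda r: r["updatedAt"])
    out, seen_ties = [], 0
    for r in candidates:
        if r["updatedAt"] == cursor_value and seen_ties < cursor_record_count:
            seen_ties += 1  # emitted during a previous sync; skip
            continue
        out.append(r)
    # New state: max cursor seen, plus how many rows share that value.
    if out:
        max_cursor = max(r["updatedAt"] for r in out)
        count = sum(1 for r in candidates if r["updatedAt"] == max_cursor)
    else:
        max_cursor, count = cursor_value, cursor_record_count
    return out, {"cursor": max_cursor, "cursor_record_count": count}
```

Under this reading, cursor_record_count = 333 would mean 333 rows shared the maximum updatedAt value at the end of the sync, which, with millisecond resolution, usually points at a bulk update stamping many rows with the same timestamp.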
11/24/2022, 12:27 AM

Juan Chaves  11/24/2022, 12:35 AM

Hemanta  11/24/2022, 6:13 AM

Faris  11/24/2022, 6:33 AM

caesee  11/24/2022, 6:51 AM

caesee  11/24/2022, 6:54 AM

Rytis Zolubas  11/24/2022, 7:46 AM
b'<html>\r\n<head><title>504 Gateway Time-out</title></head>\r\n<body>\r\n<center><h1>504 Gateway Time-out</h1></center>\r\n<hr><center>nginx/1.23.2</center>\r\n</body>\r\n</html>\r\n'
I can see that the API call goes through correctly and runs the job. Later in the day I don't get this error.

Rytis Zolubas
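On the 504 above: that page is produced by nginx giving up on the upstream Airbyte server before it responds, which fits the error appearing only at busy times of day. If a reverse proxy fronts the Airbyte API, raising its upstream timeouts is the usual knob; a sketch, where the location, upstream address, and timeout values are all assumptions for your setup:

```nginx
location /api/ {
    proxy_pass http://airbyte-server:8001;  # assumed upstream for the Airbyte API
    proxy_connect_timeout 60s;
    proxy_send_timeout    300s;
    proxy_read_timeout    300s;  # how long nginx waits for the response
}
```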
11/24/2022, 8:33 AM

Gerard Clos  11/24/2022, 9:41 AM

Rahul Borse  11/24/2022, 10:20 AM

Andreas  11/24/2022, 10:41 AM
airbyte-worker | 2022-11-24 10:33:25 INFO i.a.c.i.LineGobbler(voidCall):114 - Checking if redacted.pkg.dev/redacted//npg-airbyte-docker-connectors/plenigo-source:0.2 exists...
airbyte-worker | 2022-11-24 10:33:25 INFO i.a.c.i.LineGobbler(voidCall):114 - redacted.pkg.dev/redacted//npg-airbyte-docker-connectors/plenigo-source:0.2 was found locally.
airbyte-worker | 2022-11-24 10:33:25 INFO i.a.w.p.DockerProcessFactory(create):119 - Creating docker container = plenigo-source-spec-ce979a0a-0c83-4913-a759-f6fd056a415d-0-hpdba with resources io.airbyte.config.ResourceRequirements@1b780d57[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=]
airbyte-worker | 2022-11-24 10:33:25 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/ce979a0a-0c83-4913-a759-f6fd056a415d/0 --log-driver none --name plenigo-source-spec-ce979a0a-0c83-4913-a759-f6fd056a415d-0-hpdba --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=true -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e WORKER_CONNECTOR_IMAGE=redacted.pkg.dev/redacted//npg-airbyte-docker-connectors/plenigo-source:0.2 -e WORKER_JOB_ATTEMPT=0 -e AIRBYTE_VERSION=0.40.22 -e WORKER_JOB_ID=ce979a0a-0c83-4913-a759-f6fd056a415d redacted.pkg.dev/redacted//npg-airbyte-docker-connectors
Avi Sagal
11/24/2022, 10:42 AM

laila ribke  11/24/2022, 12:58 PM

Mikhail Masyagin  11/24/2022, 1:14 PM

thomas trividic  11/24/2022, 1:53 PM

thomas trividic  11/24/2022, 1:53 PM

Gustavo Maia  11/24/2022, 2:17 PM
I run docker build <connector-path> -t <tag> and at the pip install . step it takes a long time building dependencies, with messages like "Installing build dependencies: still running...". Eventually it gets through those dependencies, but then it tries to build psutil and fails with:
psutil/_psutil_linux.c:19:10: fatal error: linux/version.h: No such file or directory
Has anyone ever seen something like this?

laila ribke
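On the psutil failure above: linux/version.h comes from the Linux kernel headers, which slim base images often omit, so compiling psutil's C extension fails. A sketch of the usual fix, assuming an Alpine-based connector image (on Debian/Ubuntu bases, apt-get install -y gcc python3-dev plays the same role):

```dockerfile
# Hypothetical fragment for the connector's Dockerfile (Alpine assumed):
# build-base supplies gcc/make; linux-headers supplies linux/version.h.
RUN apk add --no-cache build-base linux-headers
RUN pip install .
```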
11/24/2022, 2:31 PM

Alexander Schmidt  11/24/2022, 2:36 PM

Oriol  11/24/2022, 2:53 PM