Ashish Singh
10/11/2022, 2:02 PM
Eduardo Aviles
10/10/2022, 5:59 AM
Michael Cooper
10/11/2022, 3:18 PM
0.37.1-alpha
Dylan Pierce
10/11/2022, 3:20 PM
Nishant George
10/10/2022, 4:55 PM
find /var/lib/docker/volumes/airbyte_workspace/_data/ -maxdepth 1 -regex '/var/lib/docker/volumes/airbyte_workspace/_data/[0-9]+' -mtime +10 | xargs rm -rf
Does anyone have a more hands-free solution? (Why doesn't Airbyte manage its disk space better out of the box...?)
Steven Herweijer
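A more hands-free sketch of the same cleanup, suitable for running daily from cron. The path, the numeric-directory pattern, and the 10-day threshold are taken from the find command above; the function name and the cron line below are illustrative assumptions, not anything Airbyte ships.

```python
import re
import shutil
import time
from pathlib import Path

# Same location, pattern, and age threshold as the find command above.
WORKSPACE = Path("/var/lib/docker/volumes/airbyte_workspace/_data")
MAX_AGE_DAYS = 10

def clean_old_job_dirs(workspace=WORKSPACE, max_age_days=MAX_AGE_DAYS):
    """Remove all-digit job directories older than max_age_days; return removed paths."""
    cutoff = time.time() - max_age_days * 86400
    removed = []
    if not workspace.exists():
        return removed
    for entry in workspace.iterdir():
        # Only the numeric job directories, mirroring the [0-9]+ regex above.
        if entry.is_dir() and re.fullmatch(r"[0-9]+", entry.name) and entry.stat().st_mtime < cutoff:
            shutil.rmtree(entry, ignore_errors=True)
            removed.append(entry)
    return removed

if __name__ == "__main__":
    clean_old_job_dirs()
```

Scheduling it with a crontab entry such as `0 3 * * * python3 /opt/airbyte/cleanup.py` (path hypothetical) makes it hands-free.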
10/11/2022, 6:09 PM
claudio viera
10/11/2022, 7:02 PM
Teri Eyenike
10/11/2022, 8:58 PM
Jordan Young
10/11/2022, 9:25 PM
Wilter Yee
10/11/2022, 10:45 PM
0 0 12 * * ?), and every time I hit Save Changes, it reverts back to manual. When I select another option in the dropdown and save, it’ll apply properly. Does anyone else know why that’s happening?
gunu
10/12/2022, 2:51 AM
google search console source connector?
Zachary Damcevski
10/12/2022, 3:52 AM
last_sync_max_cursor_field_value, what happens if the previous sync failed for any reason? Does it take the last sync time of a successful run?
https://docs.airbyte.com/understanding-airbyte/connections/incremental-append#known-limitations
gunu
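A toy model of the checkpointing behavior the linked limitations page implies (an illustration of the semantics only, not Airbyte's actual implementation): the committed cursor only advances when a sync attempt succeeds, so a failed run leaves the cursor at the last successful value and the next run re-reads from there.

```python
class CursorState:
    """Toy model: the committed cursor only moves forward on a successful sync."""

    def __init__(self, cursor=None):
        self.cursor = cursor  # last committed cursor value; None before any sync

    def run_sync(self, new_max_cursor, succeeded):
        if succeeded:
            # Checkpoint the new high-water mark only when the sync completes.
            self.cursor = new_max_cursor
        # On failure, keep the previous committed cursor so the next sync
        # picks up from the last successful run.
        return self.cursor
```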
10/12/2022, 4:34 AM
google search console source OAuth credentials: access and refresh tokens, e.g. python code with client.
Existing OAuth clients have been affected
October 3, 2022 - the OOB flow is deprecated for OAuth clients created before February 28, 2022
https://developers.google.com/identity/protocols/oauth2/resources/oob-migration
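For the access/refresh tokens mentioned above, a minimal stdlib sketch of refreshing a Google OAuth access token without the deprecated OOB flow. The token endpoint and parameter names follow Google's OAuth 2.0 documentation; the credential values are placeholders.

```python
import urllib.parse
import urllib.request

# Google's standard OAuth 2.0 token endpoint.
TOKEN_URL = "https://oauth2.googleapis.com/token"

def build_refresh_request(client_id, client_secret, refresh_token):
    """Build the POST that exchanges a refresh token for a new access token."""
    body = urllib.parse.urlencode({
        "grant_type": "refresh_token",
        "client_id": client_id,
        "client_secret": client_secret,
        "refresh_token": refresh_token,
    }).encode()
    return urllib.request.Request(TOKEN_URL, data=body, method="POST")

# Sending it (requires real credentials):
# import json
# with urllib.request.urlopen(build_refresh_request(cid, secret, rtok)) as resp:
#     access_token = json.loads(resp.read())["access_token"]
```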
Santiago Stachuk
10/12/2022, 5:03 AM
Satya Varma
10/12/2022, 6:12 AM
sonti srihari
10/12/2022, 7:27 AM
sonti srihari
10/12/2022, 7:40 AM
Huib
10/12/2022, 9:58 AM
2022-10-12 09:53:41 - Additional Failure Information: tech.allegro.schema.json2avro.converter.AvroConversionException: Failed to convert JSON to Avro: Could not evaluate union, field Location is expected to be one of these: NULL, DOUBLE. If this is a complex type, check if offending field (path: Location) adheres to schema: 9.447
Felipe Soares Costa
10/12/2022, 11:40 AM
Shashank Tiwari
10/12/2022, 12:27 PM
Ramon Vermeulen
10/12/2022, 1:35 PM
aibyte-chart-airbyte-worker pod with the helm release:
Message: No bean of type [io.airbyte.config.persistence.split_secrets.SecretPersistence] exists for the given qualifier: @Named('secretPersistence'). Make sure the bean is not disabled by bean requirements (enable trace logging for 'io.micronaut.context.condition' to check) and if the bean is enabled then ensure the class is declared a bean and annotation processing is enabled (for Java and Kotlin the 'micronaut-inject-java' dependency should be configured as an annotation processor).
Seems the same problem as https://airbytehq.slack.com/archives/C021JANJ6TY/p1665149370573089 ?
UPDATE:
Thanks to @lucien’s answer I know he was running 0.40.6 without problems. Upgrading from 0.40.2 to 0.40.6 succeeded, so this somewhat confirms the problem was introduced between 0.40.6 and 0.40.13.
UPDATE2:
Could it be something related to this commit in version 0.40.10? https://github.com/airbytehq/airbyte/commit/9abcbadd9316d3017a4573bc195f44e15b5a0dfb
Mitch Eccles
10/12/2022, 2:59 PM
https://my-airbyte-endpoint/api/v1/connections/create
and I'm receiving an error: Errors: supply old or new schedule schema but not both
. I'm not sure why I’m seeing this error, as I’ve basically copied the schedule schema from the example in the API documentation. My schedule looks like this:
"schedule": {
  "units": 24,
  "timeUnit": "hours"
},
"scheduleType": "basic",
"scheduleData": {
  "basicSchedule": {
    "timeUnit": "hours",
    "units": 24
  }
},
What am I doing wrong? And where can I find documentation on the old or new schedule schema?
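Judging from that error, the payload above mixes the legacy `schedule` object with the newer `scheduleType`/`scheduleData` pair, and the API accepts exactly one of the two. A sketch of the schedule portion using only the newer form (the remaining connections/create fields are omitted here, and field names are taken from the message above, not verified against every API version):

```python
# Only the newer schedule form; the legacy "schedule" key must be absent.
payload = {
    # ...the other connections/create fields (name, sourceId, destinationId, ...) go here...
    "scheduleType": "basic",
    "scheduleData": {
        "basicSchedule": {
            "timeUnit": "hours",
            "units": 24,
        }
    },
}

# The two schemas are mutually exclusive per the error message.
assert "schedule" not in payload
```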
I'm using Airbyte 0.40.4.
Mohit Reddy
10/12/2022, 3:31 PM
2022-10-12 15:17:22 INFO i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$6):334 - Records read: 4587000 (655 MB)
2022-10-12 15:17:31 INFO i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$6):337 - Source has no more messages, closing connection.
2022-10-12 15:17:33 INFO i.a.w.p.KubePodProcess(close):713 - (pod: t-107 / source-bigquery-read-4529-2-irywx) - Closed all resources for pod
2022-10-12 15:17:33 INFO i.a.v.j.JsonSchemaValidator(test):56 - JSON schema validation failed.
errors: $: null found, object expected
2022-10-12 15:17:33 ERROR i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$1):70 - Validation failed: null
2022-10-12 15:17:34 INFO i.a.w.p.KubePodProcess(close):713 - (pod: t-107 / destination-kafka-write-4529-2-wnxxp) - Closed all resources for pod
2022-10-12 15:17:34 ERROR i.a.w.g.DefaultReplicationWorker(run):177 - Sync worker failed.
java.util.concurrent.ExecutionException: io.airbyte.workers.general.DefaultReplicationWorker$SourceException: Source cannot be stopped!
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:396) ~[?:?]
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2073) ~[?:?]
at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:170) ~[io.airbyte-airbyte-workers-0.39.1-alpha.jar:?]
at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:62) ~[io.airbyte-airbyte-workers-0.39.1-alpha.jar:?]
at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:158) ~[io.airbyte-airbyte-workers-0.39.1-alpha.jar:?]
at java.lang.Thread.run(Thread.java:833) [?:?]
Suppressed: io.airbyte.workers.exception.WorkerException: Source process exit with code 1. This warning is normal if the job was cancelled.
at io.airbyte.workers.internal.DefaultAirbyteSource.close(DefaultAirbyteSource.java:136) ~[io.airbyte-airbyte-workers-0.39.1-alpha.jar:?]
at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:134) ~[io.airbyte-airbyte-workers-0.39.1-alpha.jar:?]
at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:62) ~[io.airbyte-airbyte-workers-0.39.1-alpha.jar:?]
at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:158) ~[io.airbyte-airbyte-workers-0.39.1-alpha.jar:?]
at java.lang.Thread.run(Thread.java:833) [?:?]
Caused by: io.airbyte.workers.general.DefaultReplicationWorker$SourceException: Source cannot be stopped!
at io.airbyte.workers.general.DefaultReplicationWorker.lambda$getReplicationRunnable$6(DefaultReplicationWorker.java:341) ~[io.airbyte-airbyte-workers-0.39.1-alpha.jar:?]
at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) ~[?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) ~[?:?]
... 1 more
Caused by: io.airbyte.workers.exception.WorkerException: Source process exit with code 1. This warning is normal if the job was cancelled.
at io.airbyte.workers.internal.DefaultAirbyteSource.close(DefaultAirbyteSource.java:136) ~[io.airbyte-airbyte-workers-0.39.1-alpha.jar:?]
at io.airbyte.workers.general.DefaultReplicationWorker.lambda$getReplicationRunnable$6(DefaultReplicationWorker.java:339) ~[io.airbyte-airbyte-workers-0.39.1-alpha.jar:?]
at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) ~[?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) ~[?:?]
... 1 more
2022-10-12 15:17:34 INFO i.a.w.g.DefaultReplicationWorker(run):236 - sync summary: io.airbyte.config.ReplicationAttemptSummary@613c6c00[status=failed,recordsSynced=4587950,bytesSynced=687407575,startTime=1665586308868,endTime=1665587854669,totalStats=io.airbyte.config.SyncStats@4fe132af[recordsEmitted=4587950,bytesEmitted=687407575,stateMessagesEmitted=0,recordsCommitted=0],streamStats=[io.airbyte.config.StreamSyncStats@2251ad00[streamName=fennel_impressions,stats=io.airbyte.config.SyncStats@5ffe54a3[recordsEmitted=4587950,bytesEmitted=687407575,stateMessagesEmitted=<null>,recordsCommitted=<null>]]]]
2022-10-12 15:17:34 INFO i.a.w.g.DefaultReplicationWorker(run):265 - Source did not output any state messages
2022-10-12 15:17:34 WARN i.a.w.g.DefaultReplicationWorker(run):273 - State capture: No new state, falling back on input state: io.airbyte.config.State@47f56920[state={"cdc":false,"streams":[{"cursor":"0","stream_name":"fennel_impressions","cursor_field":["added_on"],"stream_namespace":"impression_dataset"}]}]
2022-10-12 15:17:34 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
2022-10-12 15:17:34 INFO i.a.w.t.s.ReplicationActivityImpl(lambda$replicate$3):157 - sync summary: io.airbyte.config.StandardSyncOutput@5f71a285[standardSyncSummary=io.airbyte.config.StandardSyncSummary@47bc3f11[status=failed,recordsSynced=4587950,bytesSynced=687407575,startTime=1665586308868,endTime=1665587854669,totalStats=io.airbyte.config.SyncStats@4fe132af[recordsEmitted=4587950,bytesEmitted=687407575,stateMessagesEmitted=0,recordsCommitted=0],streamStats=[io.airbyte.config.StreamSyncStats@2251ad00[streamName=fennel_impressions,stats=io.airbyte.config.SyncStats@5ffe54a3[recordsEmitted=4587950,bytesEmitted=687407575,stateMessagesEmitted=<null>,recordsCommitted=<null>]]]],normalizationSummary=<null>,state=io.airbyte.config.State@47f56920[state={"cdc":false,"streams":[{"cursor":"0","stream_name":"fennel_impressions","cursor_field":["added_on"],"stream_namespace":"impression_dataset"}]}],outputCatalog=io.airbyte.protocol.models.ConfiguredAirbyteCatalog@3d45e59d[streams=[io.airbyte.protocol.models.ConfiguredAirbyteStream@1aaa2f0[stream=io.airbyte.protocol.models.AirbyteStream@539d5dca[name=fennel_impressions,jsonSchema={"type":"object","properties":{"post_id":{"type":"number"},"user_id":{"type":"number"},"added_on":{"type":"number"},"event_name":{"type":"string"},"event_timestamp":{"type":"number"},"time_spent_secs":{"type":"number"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=<null>,defaultCursorField=[],sourceDefinedPrimaryKey=[],namespace=impression_dataset,additionalProperties={}],syncMode=incremental,cursorField=[added_on],destinationSyncMode=append,primaryKey=[],additionalProperties={}]],additionalProperties={}],failures=[io.airbyte.config.FailureReason@757fb41f[failureOrigin=source,failureType=<null>,internalMessage=io.airbyte.workers.general.DefaultReplicationWorker$SourceException: Source cannot be stopped!,externalMessage=Something went wrong within the source 
connector,metadata=io.airbyte.config.Metadata@6704de20[additionalProperties={attemptNumber=2, jobId=4529}],stacktrace=java.util.concurrent.CompletionException: io.airbyte.workers.general.DefaultReplicationWorker$SourceException: Source cannot be stopped!
at java.base/java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:315)
at java.base/java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:320)
at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1807)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
at java.base/java.lang.Thread.run(Thread.java:833)
Caused by: io.airbyte.workers.general.DefaultReplicationWorker$SourceException: Source cannot be stopped!
at io.airbyte.workers.general.DefaultReplicationWorker.lambda$getReplicationRunnable$6(DefaultReplicationWorker.java:341)
at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)
... 3 more
Caused by: io.airbyte.workers.exception.WorkerException: Source process exit with code 1. This warning is normal if the job was cancelled.
at io.airbyte.workers.internal.DefaultAirbyteSource.close(DefaultAirbyteSource.java:136)
at io.airbyte.workers.general.DefaultReplicationWorker.lambda$getReplicationRunnable$6(DefaultReplicationWorker.java:339)
... 4 more
,retryable=<null>,timestamp=1665587853062]]]
2022-10-12 15:17:34 INFO i.a.w.t.TemporalUtils(withBackgroundHeartbeat):236 - Stopping temporal heartbeating...
2022-10-12 15:17:34 INFO i.a.c.p.ConfigRepository(updateConnectionState):774 - Updating connection 1a755743-0802-4dea-a026-29071d0c55db state: io.airbyte.config.State@2e6b37f2[state={"cdc":false,"streams":[{"cursor":"0","stream_name":"fennel_impressions","cursor_field":["added_on"],"stream_namespace":"impression_dataset"}]}]
2022-10-12 15:17:13 source > 2022-10-12 15:17:13 INFO i.a.i.s.r.AbstractDbSource(lambda$createReadIterator$7):250 - Reading stream fennel_impressions. Records read: 4530000
...
2022-10-12 15:17:21 source > 2022-10-12 15:17:20 INFO i.a.i.s.r.AbstractDbSource(lambda$createReadIterator$7):250 - Reading stream fennel_impressions. Records read: 4580000
2022-10-12 15:17:31 source > 2022-10-12 15:17:31 INFO i.a.i.s.r.AbstractDbSource(lambda$read$2):124 - Closing database connection pool.
2022-10-12 15:17:31 source > 2022-10-12 15:17:31 INFO i.a.i.s.r.AbstractDbSource(lambda$read$2):126 - Closed database connection pool.
2022-10-12 15:17:31 source > Exception in thread "main" com.google.cloud.bigquery.BigQueryException: www.googleapis.com
2022-10-12 15:17:31 source > at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.translate(HttpBigQueryRpc.java:115)
2022-10-12 15:17:31 source > at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.listTableData(HttpBigQueryRpc.java:514)
2022-10-12 15:17:31 source > at com.google.cloud.bigquery.BigQueryImpl$29.call(BigQueryImpl.java:1129)
2022-10-12 15:17:31 source > at com.google.cloud.bigquery.BigQueryImpl$29.call(BigQueryImpl.java:1124)
......
2022-10-12 15:17:31 source > ... 53 more
2022-10-12 15:17:33 destination > 2022-10-12 15:17:33 INFO i.a.i.b.FailureTrackingAirbyteMessageConsumer(close):65 - Airbyte message consumer: succeeded.
2022-10-12 15:17:33 destination > 2022-10-12 15:17:33 INFO i.a.i.b.IntegrationRunner(runInternal):171 - Completed integration: io.airbyte.integrations.destination.kafka.KafkaDestination
2022-10-12 15:17:33 destination > 2022-10-12 15:17:33 INFO i.a.i.d.k.KafkaDestination(main):86 - Completed destination: class io.airbyte.integrations.destination.kafka.KafkaDestination
Bogdan
10/12/2022, 3:42 PM
sonti srihari
10/12/2022, 4:29 PM
Jagruti Tiwari
10/12/2022, 5:23 PM
Joe Swatosh
10/12/2022, 6:45 PM
======================== short test summary info ========================
FAILED ../../bases/source-acceptance-test/source_acceptance_test/tests/test_core.py:TestDiscovery:test_defined_refs_exist_in_schema[inputs0] - AssertionError: Found unresolved values for selected streams: ({'customrecord_advpromo_discount': ['/services/rest/record/v1/metadata-catalog/inventoryitem', '/services/rest/record/v1/metadata-catalog/assem...$refs
FAILED ../../bases/source-acceptance-test/source_acceptance_test/tests/test_core.py:TestBasicRead:test_read[inputs0] - docker.errors.ContainerError: Command 'read --config /data/tap_config.json --catalog /data/catalog.json' in image '<Image: 'airbyte/source-netsuite:dev'>' returned non-zero exit status 1: {"type": "TRACE", "trace": {"type": "ERROR", "emit...
FAILED ../../bases/source-acceptance-test/source_acceptance_test/tests/test_full_refresh.py:TestFullRefresh:test_sequential_reads[inputs0] - docker.errors.ContainerError: Command 'read --config /data/tap_config.json --catalog /data/catalog.json' in image '<Image: 'airbyte/source-netsuite:dev'>' returned non-zero exit status 1: {"type": "TRACE", "trace": {...
FAILED ../../bases/source-acceptance-test/source_acceptance_test/tests/test_incremental.py:TestIncremental:test_two_sequential_reads[inputs0] - AssertionError: Should produce at least one record
FAILED ../../bases/source-acceptance-test/source_acceptance_test/tests/test_incremental.py:TestIncremental:test_read_sequential_slices[inputs0] - AssertionError: Should produce at least one record
Can anyone offer a suggestion on how I might get these tests working so that I can look at the real failure on the branch? Thanks!
laila ribke
10/12/2022, 7:26 PM
Samantha Duggan
10/12/2022, 8:10 PM
Nazih Kalo
10/12/2022, 10:19 PM
docker compose up, but I'm not getting the new options from that repo 🤔 any help/documentation on how to docker compose / build a fork locally?