Mario Beteta (12/19/2022, 6:35 PM):
FileNotFoundError: [Errno 2] No such file or directory: '/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/declarative/declarative_component_schema.yaml'
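The FileNotFoundError above typically means the installed airbyte_cdk package does not actually contain that bundled YAML file (for example after a partial install or a version mismatch). As a generic diagnostic, not an Airbyte API, here is a sketch that checks whether a data file ships with an installed package; `package_data_path` is a hypothetical helper:

```python
import importlib.util
from pathlib import Path


def package_data_path(package, relative):
    """Locate a data file inside an installed package; None if absent."""
    spec = importlib.util.find_spec(package)
    if spec is None or spec.origin is None:
        return None  # package not installed (or a namespace package)
    candidate = Path(spec.origin).parent / relative
    return candidate if candidate.exists() else None


# Check the file from the traceback (prints None if airbyte_cdk is not
# installed or the YAML is missing from the wheel):
print(package_data_path(
    "airbyte_cdk",
    "sources/declarative/declarative_component_schema.yaml",
))
```

If this prints None while airbyte_cdk is installed, reinstalling (e.g. `pip install --force-reinstall airbyte-cdk`) is a reasonable next step.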
Sam Stoelinga (12/19/2022, 6:52 PM)
Stepan Chatalyan (12/19/2022, 7:08 PM)
Chaima ROUZZI (12/19/2022, 7:47 PM)
Leo Qin (12/19/2022, 8:07 PM)
Zaza Javakhishvili (12/19/2022, 8:18 PM)
Keiryn Hart (12/19/2022, 8:57 PM)
Aazam Thakur (12/19/2022, 11:16 PM)
Nancy Pham (12/20/2022, 12:23 AM)
Zaza Javakhishvili (12/20/2022, 4:06 AM):
> Task :airbyte-container-orchestrator:test FAILED
MANOJ KUMAR (12/20/2022, 7:15 AM)
Mario Beteta (12/20/2022, 9:23 AM):
TypeError: Cannot convert undefined or null to object
Rachel RIZK (12/20/2022, 11:16 AM)
Kevin Noguera (12/20/2022, 1:10 PM):
TEMPORAL_HISTORY_RETENTION_IN_DAYS variable.. though I've set TEMPORAL_HISTORY_RETENTION_IN_DAYS=1 in my .env file (running via Docker), it seems it didn't remove any log files for the syncs. Airbyte version: 0.40.22
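For reference, TEMPORAL_HISTORY_RETENTION_IN_DAYS is one of the variables a docker-compose Airbyte deployment reads from .env. A minimal, illustrative .env fragment; note that this setting governs Temporal's workflow-history retention, and whether it also prunes per-sync log files may depend on the Airbyte version:

```shell
# .env fragment for a docker-compose Airbyte deployment (illustrative).
# The value is read when containers are (re)created, so after editing it,
# recreate the stack: docker-compose down && docker-compose up -d
TEMPORAL_HISTORY_RETENTION_IN_DAYS=1
```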
Anthony Souza (12/20/2022, 2:03 PM)
Stepan Chatalyan (12/20/2022, 3:08 PM)
Slackbot (12/20/2022, 3:47 PM)
Murat Cetink (12/20/2022, 5:59 PM):
Incremental | Deduped + history sync mode? My source is Zendesk Support and destination is Snowflake. Airbyte version is 0.40.14,
Jon M (12/20/2022, 6:15 PM)
Santhosh Swaminathan (12/20/2022, 9:00 PM)
Noah Selman (12/20/2022, 9:50 PM)
Slackbot (12/20/2022, 10:40 PM)
Zvika Badalov (12/20/2022, 11:09 PM):
/api/v1/deployment/import endpoint?
Jon M (12/20/2022, 11:18 PM):
Could not find image: airbyte/destination-mssql:0.1.22
laila ribke (12/21/2022, 12:35 AM)
Anurag Jain (12/21/2022, 5:34 AM)
Kristina Ushakova (12/21/2022, 6:49 AM):
"Saved offset is before Replication slot's confirmed_flush_lsn, Airbyte will trigger sync from scratch" points at why full syncs are being initiated.
My question is: why could this be happening? Is it an error on the driver's side, i.e. it's not storing the latest LSN offset? Is the database not acknowledging this offset? Has anyone encountered this before?
Anand Zaveri (12/21/2022, 6:59 AM)
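On the confirmed_flush_lsn question above: Postgres writes LSNs as two hex words (e.g. '16/B374D848'), so the connector's saved offset and the slot's confirmed_flush_lsn can be compared numerically to see which side is ahead. A small sketch of that comparison; `parse_lsn` and `saved_offset_is_stale` are hypothetical helpers, not part of Airbyte:

```python
def parse_lsn(lsn):
    """Convert a Postgres LSN string like '16/B374D848' to an integer."""
    high, low = lsn.split("/")
    return (int(high, 16) << 32) | int(low, 16)


def saved_offset_is_stale(saved_lsn, confirmed_flush_lsn):
    """True when the saved offset lags the slot's confirmed_flush_lsn,
    which is the condition the quoted log line is reporting."""
    return parse_lsn(saved_lsn) < parse_lsn(confirmed_flush_lsn)


print(saved_offset_is_stale("0/3000060", "0/3000148"))  # stale saved offset
```

The slot side can be read with `SELECT slot_name, confirmed_flush_lsn FROM pg_replication_slots;`, while the saved offset lives in the connection's state, so comparing the two shows whether the stored state has simply fallen behind the slot.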
Kevin Noguera (12/21/2022, 7:20 AM):
Zendesk Support and `Klaviyo` connectors. It seems that although some streams are set up as incremental + dedup, they are still pulling all data as if it were a full refresh. Any idea what the issue may be, or what I should be looking for in the logs? The logs don't show any warnings or errors.
Just to add some detail:
• In the Klaviyo -> BigQuery connector, the `events` stream's cursor field 'timestamp' seems to be an INTEGER, whereas all other streams that are working incrementally are of type TIMESTAMP. Could this be the issue?
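On the INTEGER-cursor question above: an epoch-seconds cursor still orders correctly as a plain integer, so incremental comparisons on it are well-defined in principle; one way to sanity-check the state is to render the stored integer cursor as a timestamp and compare it against the newest record seen. A sketch with hypothetical values, not Airbyte internals:

```python
from datetime import datetime, timezone


def epoch_to_iso(epoch_seconds):
    """Render an integer epoch-seconds cursor as a readable UTC timestamp."""
    return datetime.fromtimestamp(int(epoch_seconds), tz=timezone.utc).isoformat()


# Hypothetical cursor values: what the saved state holds vs. the newest record.
saved_cursor = 1671580800
newest_record_cursor = 1671667200

print(epoch_to_iso(saved_cursor))
# Integer cursors still compare correctly, so ordering is not the problem
# by itself; a full refresh would point at something else (state or schema):
print(newest_record_cursor > saved_cursor)
```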
Dany Chepenko (12/21/2022, 7:41 AM):
To request advanced access to this permission, you need to make a successful test API call. It may take up to 24 hours after the first API call for this button to become active. Learn about testing