Marcos Marx (Airbyte)
04/26/2023, 2:55 PM
Ryan
04/26/2023, 3:15 PM
cancellation_details field from the Stripe subscriptions stream doesn't exist in the final tables, it only exists as part of the JSON fetched from Stripe. Is it possible to pull in this extra field?
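A minimal sketch of one possible workaround, assuming a BigQuery destination that keeps the raw JSON in _airbyte_data and a raw table named _airbyte_raw_subscriptions (project, dataset, and table names below are placeholders, not confirmed in the thread): until the connector exposes the field, it can be read straight out of the raw record.

# Sketch: pull cancellation_details out of Airbyte's raw JSON column.
# Assumptions: BigQuery destination, raw table _airbyte_raw_subscriptions,
# and that the raw Stripe record actually contains cancellation_details.
from google.cloud import bigquery

client = bigquery.Client()
query = """
    SELECT
      JSON_EXTRACT_SCALAR(_airbyte_data, '$.id') AS subscription_id,
      JSON_EXTRACT(_airbyte_data, '$.cancellation_details') AS cancellation_details
    FROM `my_project.my_dataset._airbyte_raw_subscriptions`
"""
for row in client.query(query).result():
    print(row.subscription_id, row.cancellation_details)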
Gabriel Levine
04/26/2023, 3:29 PM
WalPositionLocator(<init>):45 Looking for WAL restart position for last commit LSN 'null' and last change LSN 'LSN{13A/C1F3C440}'
PostgresStreamingChangeEventSource(searchWalPosition):344 WAL resume position 'null' discovered
Running Airbyte 0.44.2 on GKE via Helm deploy. Postgres source is 2.0.24 and BigQuery destination is 1.3.0. Postgres source database is GCP CloudSQL.
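A minimal diagnostic sketch for this kind of CDC resume issue, assuming direct access to the source database and psycopg2; the connection details and the slot name airbyte_slot are placeholders. It only prints the state of the logical replication slot so you can see whether a restart LSN is still held for the sync.

# Sketch: inspect the logical replication slot used for CDC on the source
# Postgres (GCP CloudSQL here). Host, credentials, and slot name are placeholders.
import psycopg2

conn = psycopg2.connect(host="CLOUDSQL_HOST", dbname="mydb",
                        user="airbyte", password="***")
with conn, conn.cursor() as cur:
    cur.execute("""
        SELECT slot_name, plugin, active, restart_lsn, confirmed_flush_lsn
        FROM pg_replication_slots
        WHERE slot_name = %s
    """, ("airbyte_slot",))
    for row in cur.fetchall():
        print(row)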
Alejo
04/26/2023, 4:00 PM
Mantas Linkis
12/27/2022, 1:06 PM
could not accept SSL connection: SSLV3_ALERT_CERTIFICATE_UNKNOWN"
We think this is related to the truststore, but we're unsure how to go about this.
Ben Konicek
04/26/2023, 7:40 PM
Failed to start sync: The connection manager workflow for connection bc2a79bb-bb0d-42b4-9492-8dcdbf5e7d4e is deleted, so no further operations cannot be performed on it.
Rafael Rossini
04/26/2023, 7:53 PM
Inayet Hadi
04/26/2023, 10:01 PM
Kundan Kumar
04/27/2023, 11:29 AM
이유진
04/27/2023, 11:42 AM
Gaëtan Podevijn
04/27/2023, 1:34 PM
_airbyte_data), as explained in the documentation on Basic Normalization. However, when writing from Postgres to a Databricks lakehouse in unmanaged tables, the data is written in normalised form (i.e., the columns from the source are created in the Lakehouse external table). But according to the documentation (or at least my understanding of it), the unnormalized form should be the default and normalization would have to be configured (and it is not supported by the Databricks connector yet).
With Kafka to Databricks in managed tables, the data is written in an unnormalized form in _airbyte_data, as I would expect. With unmanaged tables, however, the data is put in _airbyte_additional_properties. I assume that is because the Kafka source does not use the schema registry to create the schema (even though it is correctly configured in the source).
Do you know:
• Why is the data in the unnormalized form depending on the Lakehouse table's type in the Postgres case?
• Does the Kafka source use the schema registry to write the data in the destination?
Thanks!
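A minimal sketch of a possible workaround while this is unclear, assuming the Databricks table keeps the raw record as a JSON string in _airbyte_data (table name and field paths below are placeholders): the fields can still be recovered with Spark's JSON functions regardless of whether normalization ran.

# Sketch: recover fields from unnormalized Airbyte records in Databricks.
# Table and field names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
raw = spark.table("my_schema._airbyte_raw_my_stream")

parsed = raw.select(
    F.get_json_object("_airbyte_data", "$.id").alias("id"),
    F.get_json_object("_airbyte_data", "$.updated_at").alias("updated_at"),
)
parsed.show()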
Benjamin Edwards
04/27/2023, 2:14 PM
Krunal Jagani
04/27/2023, 7:38 PM
Arjunsingh Yadav
04/28/2023, 10:25 AM
Failed to load s3://bucket/folder/*.xlsx: OSError("unable to access bucket: 'bucket' key: 'folder/*.xlsx' version: None error: An error occurred (NoSuchKey) when calling the GetObject operation: The specified key does not exist.")
Any help would be appreciated
Thanks :)
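A likely reading of that error, though the source config isn't shown here: S3's GetObject takes a literal key and does not expand wildcards, so 'folder/*.xlsx' is looked up as an object with exactly that name and fails with NoSuchKey. A small boto3 sketch (bucket and prefix are placeholders) to list which .xlsx keys actually exist under the prefix:

# Sketch: list .xlsx keys under the prefix instead of calling GetObject
# on a wildcard key. Bucket name and prefix are placeholders.
import boto3

s3 = boto3.client("s3")
resp = s3.list_objects_v2(Bucket="bucket", Prefix="folder/")
xlsx_keys = [obj["Key"] for obj in resp.get("Contents", [])
             if obj["Key"].endswith(".xlsx")]
print(xlsx_keys)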
Slackbot
04/28/2023, 10:41 AM
Slackbot
04/28/2023, 10:41 AM
Konrad Ertmanski
04/28/2023, 10:56 AM
0.1.33 but performed just fine for our needs. Recently upgraded it to the newest version, 3.4.1, and the sync times have skyrocketed (and I mean it: from 30 min to 30 h).
Airbyte version: 0.40.32
It seems that the upgrade has massively increased the number of records read/emitted.
More info in the thread - any help/ideas would be appreciated 🙏 Thanks!
Slackbot
04/28/2023, 12:21 PM
Satyam Saxena
04/28/2023, 12:31 PM
apiVersion: v1
kind: Pod
metadata:
  name: airbyte-destination-pod
spec:
  restartPolicy: Never
  containers:
    - name: airbyte-destination
      image: airbyte/source-snowflake:latest
      command: ["/bin/sh", "-c"]
      args:
        # Resolve the connector entrypoint, then run the destination `write`
        # command against the mounted config and catalog files.
        - |
          if [ ! -z "$AIRBYTE_ENTRYPOINT" ]; then
            ENTRYPOINT=$AIRBYTE_ENTRYPOINT
          else
            ENTRYPOINT="/airbyte/base.sh"
          fi
          (eval "$ENTRYPOINT write --config /config/config.json --catalog /catalog/catalog.json")
      ports:
        - containerPort: 9999
Bradley Penwarden
04/28/2023, 1:21 PM
Slackbot
04/28/2023, 3:02 PM
Giuseppe Russo
04/28/2023, 4:02 PM
https://xxxxx.talkdeskid.eu/oauth/token?grant_type=client_credentials
I'm now trying to run a connection but I get the following error:
Unauthorized for url: https://api.talkdeskapp.com/data/reports/calls/jobs
However, this URL is unauthorized because it should have the .eu domain, not .com. Do you know if it is possible to change the region for the TalkDesk connection?
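A minimal sketch for double-checking the region binding, assuming a standard OAuth2 client-credentials flow against the .eu token URL quoted above; CLIENT_ID and CLIENT_SECRET are placeholders, and the exact auth scheme TalkDesk expects is an assumption. It only verifies the credentials against the EU host and does not change which host the connector itself calls.

# Sketch: generic OAuth2 client-credentials token request against the EU host.
# CLIENT_ID / CLIENT_SECRET are placeholders; the auth scheme is assumed.
import requests

resp = requests.post(
    "https://xxxxx.talkdeskid.eu/oauth/token",
    data={"grant_type": "client_credentials"},
    auth=("CLIENT_ID", "CLIENT_SECRET"),
    timeout=30,
)
print(resp.status_code)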
Micky
04/28/2023, 4:46 PM
Gabriel Levine
04/28/2023, 5:25 PM
Slackbot
04/28/2023, 6:28 PM
Murat Cetink
04/28/2023, 7:06 PM
Shubhra Ghosh
04/28/2023, 9:28 PM
Slackbot
04/29/2023, 4:58 PM
Gabriel Levine
04/29/2023, 5:03 PM
Destination process message delivery failed, Failed to upload buffer to stage and commit to destination, and Failed to upload staging files to destination table.
Dhanji Mahto
04/29/2023, 11:18 PM