Thomas
08/26/2022, 12:05 PM

Brendan McDonald
08/30/2022, 7:23 PM
marketing_emails object, however it seems to be limited to 250 records (we have a total of 720 in HubSpot). I am assuming this is because of some sort of rate limit in the API. Is there a way to backfill all the data if the only way around this is an incremental load setup?
For further context, I was able to get all records from the API directly in Python using pagination. I am just not sure how to configure this via the Airbyte UI.
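The cursor-based pagination described above might look something like the sketch below. The function names are illustrative, and the response shape is modeled on HubSpot's v3 `results` / `paging.next.after` format; this is not Airbyte's implementation, and in real use `fetch_page` would wrap an authenticated HTTP GET.

```python
def fetch_all(fetch_page, limit=250):
    """Drain a cursor-paginated endpoint.

    `fetch_page(limit, after)` is expected to return a dict shaped like
    HubSpot's v3 responses: {"results": [...], "paging": {"next":
    {"after": "..."}}}, where "paging" is absent on the last page.
    """
    records, after = [], None
    while True:
        page = fetch_page(limit=limit, after=after)
        records.extend(page.get("results", []))
        nxt = page.get("paging", {}).get("next")
        if nxt is None:  # no next cursor: we have every record
            return records
        after = nxt["after"]


# Demo against a fake 720-record source that serves 250 records per page.
_DATA = list(range(720))

def fake_page(limit, after):
    start = int(after or 0)
    page = {"results": _DATA[start:start + limit]}
    if start + limit < len(_DATA):
        page["paging"] = {"next": {"after": str(start + limit)}}
    return page

print(len(fetch_all(fake_page)))  # all 720 records, not just the first 250
```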
Looking at the source code, it looks like there is a 250-record limit set for each pull. This is definitely a newbie question, but how do you get around the pagination limit here?

Dmytro Vorotyntsev
08/31/2022, 5:11 AM
An analysis of the cluster's workload and database schema identified columns that will significantly benefit from using a different compression encoding. All of the suggested tables are ones configured with Postgres CDC (Deduped History), and the suggested statements are:
ALTER TABLE "public"."table_1_scd" ALTER COLUMN "_airbyte_unique_key_scd" ENCODE lzo;
ALTER TABLE "public"."table_2_scd" ALTER COLUMN "_airbyte_unique_key_scd" ENCODE lzo;
ALTER TABLE "public"."table_3_scd" ALTER COLUMN "_airbyte_unique_key_scd" ENCODE lzo;
ALTER TABLE "public"."table_1_scd" ALTER COLUMN "_airbyte_emitted_at" ENCODE az64;
ALTER TABLE "public"."table_3_scd" ALTER COLUMN "_airbyte_emitted_at" ENCODE az64;
ALTER TABLE "public"."table_4_scd" ALTER COLUMN "_airbyte_emitted_at" ENCODE az64;
Is this a relevant suggestion? Would it break the Airbyte sync logic if the encoding were updated?
Thanks

Shivam Thakkar
08/31/2022, 3:17 PM

Lucas Wiley
09/15/2022, 11:08 PM
Could not connect with provided configuration. net.snowflake.client.jdbc.SnowflakeSQLLoggedException: Private key provided is invalid or not supported: rsa_key.p8: Cannot invoke "net.snowflake.client.jdbc.internal.org.bouncycastle.util.io.pem.PemObject.getContent()" because the return value of "net.snowflake.client.jdbc.internal.org.bouncycastle.util.io.pem.PemReader.readPemObject()" is null
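The `readPemObject() is null` in that stack trace means the driver found no parseable PEM block in `rsa_key.p8` at all (for example, bare base64 without the armor lines). Snowflake's key-pair auth expects a PKCS#8-formatted private key. As a rough sanity check on the file's framing (a hypothetical helper, not part of any Snowflake client):

```python
# PEM armor lines for PKCS#8 keys, unencrypted and encrypted respectively.
_PKCS8_HEADERS = ("-----BEGIN PRIVATE KEY-----",
                  "-----BEGIN ENCRYPTED PRIVATE KEY-----")
_PKCS8_FOOTERS = ("-----END PRIVATE KEY-----",
                  "-----END ENCRYPTED PRIVATE KEY-----")

def looks_like_pkcs8_pem(text):
    """Return True if `text` is framed like a PEM-encoded PKCS#8 key.

    This only checks the armor lines, not the base64 payload; a key that
    fails this check (e.g. a PKCS#1 "BEGIN RSA PRIVATE KEY" file, or raw
    base64 with no headers) would also fail the driver's PemReader.
    """
    lines = [line.strip() for line in text.strip().splitlines() if line.strip()]
    return bool(lines) and lines[0] in _PKCS8_HEADERS and lines[-1] in _PKCS8_FOOTERS
```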
Parham
09/21/2022, 10:19 AM

Alexis Charrier
09/30/2022, 12:28 PM
500 An internal error occurred and the request could not be completed. This is usually caused by a transient issue. Retrying the job with back-off as described in the BigQuery SLA should solve the problem: <https://cloud.google.com/bigquery/sla>. If the error continues to occur please contact support at <https://cloud.google.com/support>. Error: 5423415
The Google status page is not reporting any issue with the BigQuery service 🤔 any ideas?
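The "retrying the job with back-off" advice in that error message maps to a standard exponential-backoff retry loop. A minimal sketch (the function and its status-code interface are illustrative, not part of the BigQuery client library, which handles retries itself when configured to):

```python
import random
import time

def retry_with_backoff(job, attempts=5, base_delay=1.0, retriable=(500, 503)):
    """Call `job` (a callable returning an HTTP-style status code) until it
    returns a non-retriable status, sleeping base_delay * 2**attempt plus
    random jitter between retriable failures."""
    for attempt in range(attempts):
        status = job()
        if status not in retriable:
            return status
        # Exponential backoff with jitter to avoid retrying in lockstep.
        time.sleep(base_delay * (2 ** attempt) + random.uniform(0, base_delay))
    raise RuntimeError(f"still failing after {attempts} attempts")
```

For transient 500s like the one above, the job typically succeeds on a later attempt without any configuration change.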