# ask-community-for-troubleshooting

Emmanuel Orrego

12/10/2021, 1:29 PM
Hi everyone, I'm loading a table into Snowflake. The table is about 1.4 GB and has 160M records, and the sync is taking about 15 hours. How can I optimize the process in Airbyte?
s

12/10/2021, 4:26 PM
are you using internal staging?

Emmanuel Orrego

12/10/2021, 4:54 PM
I'm not sure. I'm using the standard normalization, and nothing else after that.
s

12/10/2021, 4:56 PM
I mean for Snowflake
when configuring the destination settings
what is the loading method?

Emmanuel Orrego

12/10/2021, 5:19 PM
In Snowflake it takes about 11 seconds to load the data, like 300,000 records
but if I take away the normalization and load raw data, it loads 2 times faster
but I need the normalization as well
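As an aside, the figures in this thread can be sanity-checked with a quick back-of-envelope calculation. Assuming the ~300,000 records in ~11 s raw-load figure scales linearly, the raw load alone accounts for only a small fraction of the 15-hour sync, pointing at normalization as the bottleneck:

```python
# Back-of-envelope check using the numbers reported in this thread:
# raw load of ~300,000 records takes ~11 s; the table has 160M records
# and the full sync takes ~15 h.
records_total = 160_000_000
raw_rate = 300_000 / 11                  # ~27,000 records/s raw load rate

load_seconds = records_total / raw_rate
load_hours = load_seconds / 3600         # time the raw load alone would take

sync_hours = 15
other_hours = sync_hours - load_hours    # time spent outside the raw load

print(f"raw load alone: ~{load_hours:.1f} h; remainder: ~{other_hours:.1f} h")
```

At that rate the raw load would finish in well under two hours, so most of the 15 hours is being spent elsewhere (consistent with normalization being the slow step, as discussed below).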

[DEPRECATED] Marcos Marx

12/10/2021, 8:50 PM
Can you check in your sync how long normalization takes to run? Normalization runs on the Snowflake side, and if it's taking a long time you need to increase resources in your Snowflake instance.

Emmanuel Orrego

12/10/2021, 11:06 PM
For example, it's taking about 6-8 min to load 1M records
Is it possible to change the batch size in Airbyte?
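The 6-8 min per 1M records figure above is enough to estimate the total normalization cost over the whole table (a rough linear extrapolation, assuming the per-million rate stays constant):

```python
# Extrapolate the reported 6-8 min per 1M records to the full 160M-record
# table, assuming the rate stays roughly constant as the table grows.
millions = 160                       # 160M records total

low_hours = millions * 6 / 60        # at 6 min per 1M records
high_hours = millions * 8 / 60       # at 8 min per 1M records

print(f"estimated normalization time: {low_hours:.1f}-{high_hours:.1f} h")
```

The estimate lands in the 16-21 hour range, which lines up with the ~15-hour syncs reported at the start of the thread.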

[DEPRECATED] Marcos Marx

12/13/2021, 4:54 PM
Yes, but you need to change the connector code and re-build the connector image.

Emmanuel Orrego

12/13/2021, 5:58 PM
How does it work in the Snowflake connector? I haven't been able to change that.

[DEPRECATED] Marcos Marx

12/14/2021, 12:59 AM
The batch size is related to the source, not the destination.
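To illustrate what a batch size controls in general terms, here is a minimal sketch of record batching. The function and names are illustrative only, not Airbyte's actual connector internals:

```python
# Minimal sketch of batching: records are buffered and flushed once the
# buffer reaches `batch_size`. A larger batch size means fewer flushes
# (round trips) for the same volume of data. Illustrative only; this is
# NOT Airbyte's real connector code.
from typing import Iterable, Iterator, List


def batched(records: Iterable[int], batch_size: int) -> Iterator[List[int]]:
    buf: List[int] = []
    for rec in records:
        buf.append(rec)
        if len(buf) == batch_size:
            yield buf            # one flush per full buffer
            buf = []
    if buf:
        yield buf                # final partial batch


# The same 1M records produce 10x fewer flushes with a 10x larger batch:
flushes_small = sum(1 for _ in batched(range(1_000_000), 10_000))
flushes_large = sum(1 for _ in batched(range(1_000_000), 100_000))
```

This is why batch size is tuned on the side that emits records: the source decides how many records are buffered before each handoff.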