# ask-community-for-troubleshooting
e
Hi everyone, I'm loading a table into Snowflake. The table is about 1.4 GB and has 160M records, and the sync is taking about 15 hours. How can I optimize the process in Airbyte?
s
Are you using internal staging?
e
I'm not sure. I'm using the standard normalization, and after that nothing else.
s
I mean for Snowflake:
when configuring the destination settings,
what is the loading method?
e
In Snowflake it takes about 11 seconds to load the data, around 300,000 records.
But if I take away the normalization and load raw data, it loads about 2x faster.
But I need the normalization as well.
u
Can you check in your sync how long normalization takes to run? Normalization runs on the Snowflake side, and if it's taking a long time you need to increase the resources of your Snowflake warehouse.
e
For example, it's taking about 6-8 minutes to normalize 1M records.
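A quick back-of-envelope check (assuming the 6-8 min per 1M figure holds uniformly across the table) shows that normalization alone would account for most of the reported 15-hour sync:

```python
# Rough estimate: if normalization processes ~1M records every 6-8 minutes,
# how long would a 160M-record table take? (Assumption: uniform throughput.)
records_total = 160_000_000
min_minutes_per_million = 6
max_minutes_per_million = 8

low_hours = records_total / 1_000_000 * min_minutes_per_million / 60
high_hours = records_total / 1_000_000 * max_minutes_per_million / 60
print(f"Estimated normalization time: {low_hours:.1f}-{high_hours:.1f} hours")
# -> Estimated normalization time: 16.0-21.3 hours
```

That range brackets the observed 15 hours, which supports the point above: the bottleneck is normalization in Snowflake, not the load itself.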
Is it possible to change the batch size in Airbyte?
u
Yes, but you need to change the connector code and rebuild the connector image.
e
How does that work in the Snowflake connector? I haven't been able to change it there.
u
The batch size is related to the source, not the destination.
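To illustrate why batch size matters, here is a minimal sketch of the general pattern a destination connector follows. This is not Airbyte's actual connector code; `flush`, `load`, and `BATCH_SIZE` are hypothetical names. The point is that records are buffered and written in batches, so a larger batch size means fewer (expensive) bulk-load operations:

```python
# Illustrative sketch only -- not Airbyte's real implementation.
# Records are buffered in memory and flushed in fixed-size batches;
# each flush stands in for an expensive bulk load (e.g. a staged COPY).
from typing import Iterable, List

BATCH_SIZE = 10_000  # hypothetical tuning knob


def flush(batch: List[dict], flush_sizes: List[int]) -> None:
    """Stand-in for a bulk load; records the size of each flush."""
    flush_sizes.append(len(batch))


def load(records: Iterable[dict], batch_size: int = BATCH_SIZE) -> List[int]:
    """Buffer records and flush whenever the buffer reaches batch_size."""
    flush_sizes: List[int] = []
    batch: List[dict] = []
    for record in records:
        batch.append(record)
        if len(batch) >= batch_size:
            flush(batch, flush_sizes)
            batch = []
    if batch:  # flush the final partial batch
        flush(batch, flush_sizes)
    return flush_sizes


# 25,000 records with a batch size of 10,000 -> 3 flushes (10k, 10k, 5k)
print(load(({"id": i} for i in range(25_000)), batch_size=10_000))
# -> [10000, 10000, 5000]
```

With this shape, doubling the batch size halves the number of flushes, which is why the batch size (set where records are read, i.e. on the source side) affects end-to-end throughput.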