Hi all, new to Airbyte. I've been trying to sync a large table (30 GB, ~300M records) from Snowflake into Google BigQuery, but it keeps failing due to several issues such as memory limits, part limits, partition limits, etc. After several attempts, I gave up trying different things and wasting cloud resources. I would like to know how I can load data from a large table in batches. For example, extract and load records 1-10,000,000, then the next batch, and so on. Do I have to do it as part of a custom transformation in dbt? I have a feeling this is a very common issue and I'm wondering how others are tackling it.
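The range-based batching described above (extract records 1-10,000,000, then the next range, and so on) can be sketched generically. This is a hedged illustration, not an Airbyte feature: the table name `MY_TABLE` and key column `ID` are hypothetical, and it assumes a monotonically increasing numeric key to filter on.

```python
def batch_ranges(total_rows, batch_size):
    """Yield inclusive (start, end) record ranges covering 1..total_rows."""
    start = 1
    while start <= total_rows:
        end = min(start + batch_size - 1, total_rows)
        yield start, end
        start = end + 1

def batch_queries(total_rows, batch_size, table="MY_TABLE", key="ID"):
    """Build one SELECT per chunk, filtering on the assumed numeric key column."""
    return [
        f"SELECT * FROM {table} WHERE {key} BETWEEN {lo} AND {hi}"
        for lo, hi in batch_ranges(total_rows, batch_size)
    ]

# 300M records in 10M-record chunks -> 30 independent extract queries
queries = batch_queries(300_000_000, 10_000_000)
print(len(queries))
print(queries[0])
```

Each chunk can then be extracted and loaded independently, so a failure only forces a retry of one range rather than the whole 30 GB sync.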
Augustin Lafanechere
11/19/2021, 5:19 PM
Hi @Chandini Nekkantti, could you please post this kind of problem in the #troubleshooting channel, fill in our Slack issue template pinned to that channel, and attach the full log of your sync attempt 🙏