# ask-community-for-troubleshooting
Mindaugas Nižauskas:
Hello, I set up Airbyte locally. My source is Redshift and my destination is Snowflake. For the destination I use AWS S3 staging. If I ran such an integration manually, I would use the UNLOAD command in Redshift to place data in S3 and then COPY it into Snowflake. But what Airbyte does is just run SELECT statements in Redshift (no UNLOAD). Can I set some parameter so that UNLOAD is used? I can't imagine how long it will take without the UNLOAD command for 1 TB, 10 TB, or 100 TB tables.
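To illustrate what I mean by the manual path, here is roughly what I would run by hand. This is just a sketch: the bucket, IAM role ARN, stage, table names, and credentials below are placeholders, not anything Airbyte generates.

```sql
-- Redshift side: unload the table to S3 in parallel (Parquet keeps types and compresses well)
UNLOAD ('SELECT * FROM analytics.events')
TO 's3://my-staging-bucket/unload/events/'
IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-unload-role'
FORMAT AS PARQUET
ALLOWOVERWRITE;

-- Snowflake side: point an external stage at the same prefix and copy the files in
CREATE OR REPLACE STAGE events_stage
  URL = 's3://my-staging-bucket/unload/events/'
  CREDENTIALS = (AWS_KEY_ID = '<key>' AWS_SECRET_KEY = '<secret>');

COPY INTO analytics.events
FROM @events_stage
FILE_FORMAT = (TYPE = PARQUET)
MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;
```

That file-based path runs in parallel on the warehouses themselves, versus pulling everything through SELECT statements on the Airbyte worker.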
Should I manually set up one connection with source Redshift and destination S3, and then another with source S3 and destination Snowflake?
Also, what do those logs mean? Is it storing the Redshift data in local machine memory?
Romain LOPEZ:
Hi, I noticed the same issue with the Snowflake reader as well: it uses SELECT instead of an UNLOAD statement. Also, the Snowflake writer doesn't have an option to use Snowflake internal storage. What is the best way to get these features enabled? I'm not familiar yet with open-source contribution. Regards
p
Hi @Mindaugas Nižauskas and @Romain LOPEZ, I am wondering whether Airbyte would be a good fit to use during a migration from Redshift to Snowflake. I am curious: what was your experience? Did you figure out answers to your questions above? I just posted this. Thanks!
Romain LOPEZ:
I have not tested the feature since this comment. And Airbyte has been delivering a lot of features 😍.