Hi all,
We’ve been struggling with a pretty simple issue that doesn’t seem to have a clear solution in Airbyte: the initial load of big tables from RDS (Postgres or MySQL) to a DWH (Snowflake).
Airbyte just can’t get through the initial load of a big table and fails over and over again.
I thought about handling the initial load myself by exporting the table to a file and loading it into Snowflake directly (rough sketch after the list below), but:
1. I’m not sure I can create the target table in the schema Airbyte expects.
2. The target table usually contains internal Airbyte ID columns (the `_airbyte_*` fields) that I can’t generate myself.
3. I’m pretty sure Airbyte won’t recognize the pre-loaded data as its starting point and will try to load the entire data set again.
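For context, this is roughly the kind of manual backfill I have in mind. Everything here (connection details, table name, file path) is a placeholder, and it obviously doesn’t solve the three concerns above:

```python
# Rough sketch of a manual initial load: Postgres -> CSV -> Snowflake.
# All names, credentials and paths below are placeholders.
import subprocess
import snowflake.connector

SOURCE_TABLE = "big_table"          # placeholder source table
LOCAL_FILE = "/tmp/big_table.csv"   # placeholder export path

# 1. Export the table from Postgres to a local CSV.
#    (For very big tables this could be chunked by primary key ranges.)
subprocess.run(
    [
        "psql",
        "postgresql://user:password@rds-host:5432/mydb",  # placeholder DSN
        "-c",
        f"\\copy {SOURCE_TABLE} TO '{LOCAL_FILE}' WITH (FORMAT csv, HEADER true)",
    ],
    check=True,
)

# 2. Stage the file in Snowflake and COPY it into a pre-created table.
conn = snowflake.connector.connect(
    account="my_account",    # placeholder credentials
    user="my_user",
    password="my_password",
    warehouse="my_wh",
    database="my_db",
    schema="public",
)
cur = conn.cursor()
# Upload to the table's internal stage, then load it.
cur.execute(f"PUT file://{LOCAL_FILE} @%{SOURCE_TABLE}")
cur.execute(
    f"COPY INTO {SOURCE_TABLE} "
    "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1 FIELD_OPTIONALLY_ENCLOSED_BY = '\"')"
)
conn.close()
```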
How did you overcome that issue?
Any creative workaround would be greatly appreciated.
Thanks!