Hi!
Do you have any advice on how to set up a dev->prod pipeline?
I have an Airflow + dbt project that I can run locally, develop everything I need, and then commit to GitHub, which updates the production environment on the server. Everything is stored in code and env files.
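For context, the dbt part is just triggered from an Airflow DAG, roughly like the sketch below (paths and the DAG name are simplified placeholders, not my actual setup):

```python
# Simplified sketch of how dbt runs from Airflow today; paths are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt --profiles-dir /opt/dbt",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt --profiles-dir /opt/dbt",
    )
    dbt_run >> dbt_test
```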
I'd like to add Airbyte to that stack, and ideally I'd like to schedule everything from Airflow, including Airbyte connections. I managed to do that in my local env, but I don't see an easy way to recreate the connections on the production server aside from manually dumping the config export and importing it on prod (I've seen that connectionIds are part of the backup, so those shouldn't change and the DAGs in Airflow won't break).
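Locally, triggering the sync itself is simple enough with the Airbyte provider operator; the UUID below is just a placeholder for a connectionId created through the UI, and that's exactly the value I don't know how to keep reproducible across environments:

```python
# Rough sketch of my local DAG; the UUID stands in for an Airbyte connectionId
# that currently only exists because I created it by hand in the Airbyte UI.
from datetime import datetime

from airflow import DAG
from airflow.providers.airbyte.operators.airbyte import AirbyteTriggerSyncOperator

with DAG(
    dag_id="airbyte_elt",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    trigger_sync = AirbyteTriggerSyncOperator(
        task_id="trigger_airbyte_sync",
        airbyte_conn_id="airbyte_default",  # Airflow connection pointing at the Airbyte server
        connection_id="00000000-0000-0000-0000-000000000000",  # placeholder connectionId
        asynchronous=False,
        timeout=3600,
    )
```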
That's not ideal, because I'd like to eventually be able to hand the project over to a different developer (or maybe even develop in parallel), and we would have to sync the config manually each time.
Another solution is to use Airbyte separately from Airflow (just like we do now with Segment), but that means we'd have yet another tool to check for errors, and we wouldn't be using Airflow's scheduling as much as I'd like. I could also call the same prod Airbyte instance from both the dev and prod Airflow instances, but that could create conflicts.
Is there any way to put source, destination, and connection definitions in GitHub alongside the rest of the code? Or maybe there's another way I don't see to build an ELT suite using these tools? Any pointers on how to set up something that is easily redeployable, versionable, and can be worked on efficiently would be appreciated.
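What I'm vaguely imagining is keeping the definitions in a YAML file in the repo and having a small script push them to whichever Airbyte instance the environment points at, something like the sketch below. The endpoint paths are the Airbyte Config API as I understand it, but the payload fields are heavily simplified, so treat this as an illustration of the workflow I'm after rather than working tooling:

```python
# Very rough sketch of the idea: read source/destination/connection definitions
# from a YAML file in the repo and push them to the target environment's
# Airbyte Config API. Payload contents are simplified and just illustrative.
import os

import requests
import yaml

AIRBYTE_URL = os.environ["AIRBYTE_URL"]  # e.g. different per environment


def api(path: str, payload: dict) -> dict:
    resp = requests.post(f"{AIRBYTE_URL}/api/v1/{path}", json=payload)
    resp.raise_for_status()
    return resp.json()


def deploy(config_path: str) -> None:
    with open(config_path) as f:
        cfg = yaml.safe_load(f)

    source = api("sources/create", cfg["source"])                  # returns sourceId
    destination = api("destinations/create", cfg["destination"])   # returns destinationId

    connection = api("connections/create", {
        **cfg["connection"],
        "sourceId": source["sourceId"],
        "destinationId": destination["destinationId"],
    })
    # The resulting connectionId would still need to be fed back to the DAGs somehow.
    print("created connectionId:", connection["connectionId"])


if __name__ == "__main__":
    deploy("airbyte/connections.yaml")
```

Even then I'm not sure how to keep the resulting connectionIds stable between dev and prod, so if there's a more standard way to do this, I'd love to hear it.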