hi airbyte community. my startup is trying to build its first data pipeline, and there are two use cases that have come up:
1. ELT: postgres/mixpanel/salesforce/etc => snowflake, dbt for analytics transforms
2. audit trails so we can ask “what was the state of this object (and related objects) at a specific point in time” (for use by operations, ML, etc)
The first one seems like a pretty clear fit for airbyte, where snowflake holds the latest snapshot of the postgres data. 🙂 🎉
For the second one, it sounds like a problem in the event sourcing space: an append-only log of object changes, so i can replay the changes at any time to get the state of an object at a specific point in time (not just right now). Is anyone using airbyte to solve this problem (i.e. postgres CDC => kafka => snowflake => query to calculate aggregates on the fly)? or do I need to tackle this in the application tier (app => eventstore/kafka => projection application)?
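to make the "replay" part of use case 2 concrete, here's a toy python sketch of what i'm after — all names (`state_at`, the event shape) are made up for illustration, not any airbyte/snowflake/kafka API:

```python
from datetime import datetime, timezone

def state_at(events, object_id, as_of):
    """Fold all change events for object_id with ts <= as_of into a state dict.

    Hypothetical event shape: {"object_id": ..., "ts": ..., "changes": {...}},
    roughly what a CDC feed or event store would give you per row change.
    """
    state = {}
    # replay in timestamp order, last-write-wins per field
    for event in sorted(events, key=lambda e: e["ts"]):
        if event["object_id"] == object_id and event["ts"] <= as_of:
            state.update(event["changes"])
    return state

# toy append-only log: two changes to the same object
log = [
    {"object_id": 1, "ts": datetime(2024, 1, 1, tzinfo=timezone.utc),
     "changes": {"status": "open", "owner": "ana"}},
    {"object_id": 1, "ts": datetime(2024, 2, 1, tzinfo=timezone.utc),
     "changes": {"status": "closed"}},
]

# "what was the state of this object in mid-january?" — only the first event applies
print(state_at(log, 1, datetime(2024, 1, 15, tzinfo=timezone.utc)))
# {'status': 'open', 'owner': 'ana'}
```

basically i want this fold, but done in snowflake over CDC events instead of in application code — hence the question about whether airbyte's postgres CDC stream preserves enough history to support it.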