We don't have very high load at the moment, around 90k requests per day, which works out to roughly 1 request per second. But we are expecting a significant increase in the coming months.
A secondary goal for us is to take this pressure off Postgres and use tools better suited to customer-facing realtime analytics.
We are trying to find the sweet spot between precomputing everything (flat data via Flink + Kafka) and relying on some joins at query time. Since the data comes from multiple tables in the source database, flattening everything into a single Kafka topic while guaranteeing full data consistency can get very complex.
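To illustrate why the "flatten everything upstream" approach gets tricky, here is a minimal sketch of joining two change streams into one denormalized record, the kind of stateful join a Flink job feeding a single Kafka topic has to do. The `orders`/`customers` tables and field names are hypothetical, not from our actual schema; the point is the buffering needed when events arrive out of order across streams:

```python
# In-memory sketch of a stream-stream join producing flattened records.
# A real Flink job would keep this state in managed, checkpointed state.
orders_by_customer = {}   # orders buffered while waiting for customer data
customers = {}            # latest known state per customer id
flattened = []            # stands in for the output Kafka topic

def on_customer_event(event):
    customers[event["id"]] = event
    # Late-arriving customer data: flush any orders buffered for it.
    for order in orders_by_customer.pop(event["id"], []):
        flattened.append({**order, "customer_name": event["name"]})

def on_order_event(event):
    cust = customers.get(event["customer_id"])
    if cust is None:
        # Order arrived before its customer row: buffer it, otherwise
        # we would emit an inconsistent (partially null) flat record.
        orders_by_customer.setdefault(event["customer_id"], []).append(event)
    else:
        flattened.append({**event, "customer_name": cust["name"]})

# Out-of-order arrival across topics is exactly what makes consistency hard:
on_order_event({"order_id": 1, "customer_id": 42, "amount": 10})
on_customer_event({"id": 42, "name": "Acme"})
```

This is only two tables; with every extra source table you add another piece of state and another ordering edge case, which is why we're wondering how far to push precomputation versus just letting the analytics store do some joins.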