# sst
h
Is it possible for an SST stack to connect to a local database instead of one created and running in AWS? Our concern is that if we have multiple engineers working locally, SST will create multiple databases, which doesn't seem right. https://serverless-stack.com/examples/how-to-use-postgresql-in-your-serverless-app.html
j
You could use a shared dev stack for your stateful resources (like databases) that is used by developer-specific stacks for the rest of the application. So: stage-specific stacks for the application, plus a shared stack that those stages depend on. For instance, my dev stage (jp) and your dev stage (hn) would be on different stacks for our API (dev-jp and dev-hn) but use the same stack for the database (dev-shared).
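A minimal sketch of that layout, assuming SST's v0-style app entry point; `DatabaseStack` and `ApiStack` are hypothetical names for your own stacks:

```ts
// stacks/index.ts: only the shared stage owns the database stack.
import * as sst from "@serverless-stack/resources";
import { DatabaseStack } from "./DatabaseStack";
import { ApiStack } from "./ApiStack";

export default function main(app: sst.App): void {
  if (app.stage === "dev-shared") {
    // Stateful resources live only in the shared stage.
    new DatabaseStack(app, "database");
  }

  // Every stage (dev-jp, dev-hn, ...) gets its own API stack; developer
  // stages point at the shared database via configuration (an SSM parameter,
  // a secret, or plain env vars) instead of creating their own cluster.
  new ApiStack(app, "api");
}
```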
Otherwise you could check `app.IS_LOCAL` and use local database credentials if it is set to `true`.
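For example, a sketch only: it assumes `sst start` exposes an `IS_LOCAL` env var to locally-run functions (the exact flag may differ by SST version), uses the `postgres` client mentioned later in the thread, and expects a Docker Postgres on localhost:5432.

```ts
// handler.ts: branch on IS_LOCAL inside the function.
import postgres from "postgres";

const connectionString = process.env.IS_LOCAL
  ? "postgres://postgres:postgres@localhost:5432/app_dev" // local Docker DB
  : process.env.DATABASE_URL!; // set on the deployed function by its stack

const sql = postgres(connectionString);

export async function main() {
  // Trivial query just to prove the connection works.
  const [{ now }] = await sql`select now()`;
  return { statusCode: 200, body: now.toString() };
}
```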
j
• The problem with shared dev stacks like databases is that people could run different code/tests/experiments and mess up the DB for their teammates.
• Do we need to check `app.IS_LOCAL`? Can't we just set up the connection parameters with env variables and connect to a local DB without checking environments?
j
• You could have a developer-specific but feature-agnostic database.
• Env vars should work, yes.
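A rough sketch of the env-var approach on the stack side, assuming SST v0 construct props (`defaultFunctionProps` etc., which may differ in newer releases); the route and handler names are made up. The function code only ever reads `DATABASE_URL`; which database that points at is decided per stage.

```ts
// ApiStack.ts: handlers never check the environment themselves.
import * as sst from "@serverless-stack/resources";

export class ApiStack extends sst.Stack {
  constructor(scope: sst.App, id: string, props?: sst.StackProps) {
    super(scope, id, props);

    // Shared/long-lived stages get the real connection string (injected at
    // deploy time); personal stages target the developer's local Docker
    // Postgres, which works because `sst start` runs handlers on the
    // developer's machine.
    const isSharedStage = ["dev-shared", "staging", "production"].includes(scope.stage);
    const databaseUrl = isSharedStage
      ? process.env.DATABASE_URL!
      : "postgres://postgres:postgres@localhost:5432/app_dev";

    new sst.Api(this, "Api", {
      defaultFunctionProps: {
        environment: { DATABASE_URL: databaseUrl },
      },
      routes: {
        "GET /users": "src/listUsers.main",
      },
    });
  }
}
```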
s
Yeah, this is possible. I'm doing this myself, since the service I'm working with uses RDS for Postgres (the non-serverless offering). Provisioning an RDS DB per developer doesn't really work well (40+ devs, RDS is slow to provision, not pay-per-use, etc.). The local approach works really well, but it requires developers to run a local database. In my case I'm running Postgres within Docker and the setup works quite well.
j
We will try the Docker + Postgres approach for local development. Thanks for your responses 🙏
t
Hey @Josep Segarra let me know if you got this working
s
I'm conditionally creating my RDS for Postgres in my DB stack when the stage is set to "staging" or "production". In the event I'm running under another stage, I connect to my local Postgres DB running in Docker. It occurs to me that I could conditionally spin up a serverless Aurora instance when running in dev to keep all my service's resources in the cloud and ditch Docker entirely. Not sure if that's a good idea or not...
It doesn't feel right to have stacks that conditionally create resources based on stage
I'm also not sure if it's the type of resource you can freely create/tear down, like DynamoDB.
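A sketch of that conditional pattern (CDK v1 package names to match SST v0; the VPC and instance settings are placeholders, not a recommendation):

```ts
// DbStack.ts: only long-lived stages own an RDS instance; every other stage
// falls back to the local Docker Postgres.
import * as sst from "@serverless-stack/resources";
import * as ec2 from "@aws-cdk/aws-ec2";
import * as rds from "@aws-cdk/aws-rds";

const LONG_LIVED_STAGES = ["staging", "production"];

export class DbStack extends sst.Stack {
  constructor(scope: sst.App, id: string, props?: sst.StackProps) {
    super(scope, id, props);

    if (LONG_LIVED_STAGES.includes(scope.stage)) {
      const vpc = new ec2.Vpc(this, "Vpc");
      new rds.DatabaseInstance(this, "Postgres", {
        engine: rds.DatabaseInstanceEngine.postgres({
          version: rds.PostgresEngineVersion.VER_13,
        }),
        vpc,
        instanceType: ec2.InstanceType.of(ec2.InstanceClass.T3, ec2.InstanceSize.MICRO),
      });
    }
    // Other stages: nothing is created here; handlers use the local
    // connection string instead (see the env-var sketch above).
  }
}
```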
p
We use one DB cluster for all our dev deployments and just create new tables in it with prefixes based on the stage name.
s
hmm, that sounds like a promising pattern
t
Yeah, I've done something similar in the past, but did a database per stage.
You can use `sst.Script` to provision it.
p
We use `sst.Script` to create the tables on deploy and delete them on remove.
s
Ah, that's smart. I like this a lot
I bet that makes the deploy much faster when you're just creating tables and not provisioning the entire cluster each time
p
Yup! Plus you don't have to run a load of clusters that you either have to wait on to wake up or keep running at extra expense.
(we're using Aurora Postgres Serverless)
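A sketch of the shared-cluster-plus-prefixed-tables setup with `sst.Script`; prop names follow the newer `onCreate`/`onDelete` API and may differ by SST version, and `SHARED_DATABASE_URL` is a hypothetical setting supplied at deploy time.

```ts
// DbStack.ts: each stage creates/drops its own prefixed tables in the one
// shared dev cluster.
import * as sst from "@serverless-stack/resources";

export class DbStack extends sst.Stack {
  constructor(scope: sst.App, id: string, props?: sst.StackProps) {
    super(scope, id, props);

    const environment = {
      DATABASE_URL: process.env.SHARED_DATABASE_URL!,     // the shared cluster
      TABLE_PREFIX: `${scope.stage.replace(/-/g, "_")}_`, // e.g. "dev_jp_"
    };

    // Runs on `sst deploy` to create this stage's tables and on `sst remove`
    // to drop them again, so every stage cleans up after itself.
    new sst.Script(this, "TableSetup", {
      onCreate: { handler: "src/createTables.main", environment },
      onDelete: { handler: "src/dropTables.main", environment },
    });
  }
}
```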
s
Yeah, I want to get my team onto DynamoDB for the serverless workloads, but that's going to take time to socialize the concepts of NoSQL. Aurora Serverless may be the perfect step in a better direction than the current setup.
j
Hi @thdxr, I set up the local Dockerfile and env variables yesterday and it worked. I used the "postgres" library for simple table creation and data manipulation tests and it went fine 👍
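For reference, a table-creation handler along these lines with the "postgres" library (which could also serve as the `onCreate` script in the earlier sketch) might look like this; the table name and schema are illustrative only.

```ts
// src/createTables.ts: creates a stage-prefixed table in the shared database.
import postgres from "postgres";

const sql = postgres(process.env.DATABASE_URL!);
const prefix = process.env.TABLE_PREFIX ?? "";

export async function main() {
  // sql(...) safely quotes the dynamic, stage-prefixed identifier.
  await sql`
    create table if not exists ${sql(`${prefix}users`)} (
      id serial primary key,
      email text not null
    )
  `;
  await sql.end();
}
```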