https://serverless-stack.com/
#guide

Carlos Daniel

01/14/2022, 8:06 PM
Hey guys, how do you usually warm up SST lambdas, if there is a way to? (sorry if it's a newbie question)

Ross Coundon

01/14/2022, 8:13 PM
I typically don't. If you get a bunch of requests in parallel, you'll get cold starts for each lambda that's spun up to deal with those requests. So to keep all of them warm ahead of time you'd have to mimic multiple parallel requests or use provisioned concurrency. All of this costs money and adds complexity. IMHO it's better to work on keeping your cold starts to the shortest time you can, if that's important to you.

Carlos Daniel

01/14/2022, 8:15 PM
Hmmm interesting!

Adam Fanello

01/14/2022, 8:24 PM
Not an SST issue, just Lambda. Cold starts aren't bad if you:
1. Keep your deployed code small
2. Don't do too much work outside the handler (init code)
3. Avoid putting Lambda inside a VPC if you can
Oh, and avoid Java. 🤷 If I recall, it has the highest cold start penalty.
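A minimal TypeScript sketch of the init-code/handler split Adam describes. The names here (createClient, the event shape) are hypothetical stand-ins, not from any particular SDK:

```typescript
// Stand-in for an expensive one-time setup step, e.g. creating a DB
// or SDK client. createClient is a hypothetical name for illustration.
function createClient(): { query: (q: string) => string } {
  return { query: (q: string) => `result for ${q}` };
}

// Module-level init: runs once per cold start, then is reused for
// every warm invocation of the same container.
const client = createClient();

export async function handler(event: { query: string }) {
  // Per-request work only; no setup in here.
  return { statusCode: 200, body: client.query(event.query) };
}
```

The point is that the cost of createClient is paid once per container, not once per request, while keeping the init work limited to things every request actually needs.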

Garret Harp

01/14/2022, 8:31 PM
Doing more outside the handler is technically better because init code gets the full 2 vCPUs instead of being limited based on your RAM setting
and it will make the subsequent warm requests faster

Adam Fanello

01/14/2022, 8:33 PM
As long as it is useful work. You don't want to do work before you get the request, since the request may not need it. Never heard of cold starts having 2 vCPUs regardless of memory size setting though. Reference?

thdxr

01/14/2022, 8:41 PM
Agree with Ross. Lambda warming is a somewhat complicated subject and it's a bit counterintuitive. Focus on cold starts by keeping the package small. If you do want to keep stuff warm, check out provisioned concurrency. That's the only way to actually keep a bunch of capacity warm.

Carlos Daniel

01/14/2022, 8:41 PM
thanks guys, that makes sense. will focus on reducing the packages then

thdxr

01/14/2022, 9:04 PM
lmk if you need help with analyzing your functions

Michael Robellard

01/15/2022, 4:27 AM
One trick you can use in Python (I would assume it translates to other languages as well) is, if you have a big expensive package to import (Pillow, Pandas, NumPy, etc.), to move that import inside the handler. It will speed up your cold start, but it will move the time to do the import into the handler. It's helpful if you have multiple different code paths in one handler and not all of them need the expensive import. Better to keep multiple code paths on separate lambdas, though. A better bet is to move expensive code (in terms of size and speed) to a non-interactive asynchronous format, like an SQS queue, where the cold start time doesn't matter.
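The same lazy-import trick carries over to Node/TypeScript (the language used elsewhere in this thread) via a dynamic import(). In this sketch, node:zlib just stands in for any heavy dependency that only one code path needs:

```typescript
// Cached module reference: undefined until the expensive path runs.
let heavy: typeof import("node:zlib") | undefined;

export async function handler(event: { compress?: boolean; data: string }) {
  if (event.compress) {
    // Loaded on first use only, then cached for warm invocations.
    heavy ??= await import("node:zlib");
    return heavy.gzipSync(Buffer.from(event.data)).toString("base64");
  }
  // The cheap path never pays the import cost at all.
  return event.data;
}
```

As Michael says, this shifts the cost from cold start into the first request that hits the expensive path, so it only helps when most invocations take the cheap path.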

Sam Hulick

01/15/2022, 7:43 AM
Warming Lambdas seems like it doesn't work. I have an event firing off every minute, and it still doesn't keep the functions warm
Provisioned concurrency can work, but it forces versioning, so you have to be mindful of that. And it can get costly quickly

Carlos Daniel

01/15/2022, 9:35 AM
really, @Sam Hulick? good to know
@thdxr that’d be great! though the side project I’m working on using SST doesn’t actually need warm lambdas all the time (most of the lambdas are cron jobs) - I just asked to know if there is a way, and to implement it if so, but just to learn

Sam Hulick

01/15/2022, 5:18 PM
@Carlos Daniel actually, forget what I said about function warming not working. my function warmer was not working right 😄 It used an env var to determine what functions to keep warm, and I realized when SST deploys, it wipes out the env vars
@thdxr is there a way to have a function’s env vars be left alone upon deploy?

Carlos Daniel

01/15/2022, 5:44 PM
great! i think you can set the env vars on your function props
even add default envs
don't know if that was your question

Sam Hulick

01/15/2022, 5:44 PM
well, I don’t want to hardcode the env var anywhere, that’s the thing
but I don’t want SST to overwrite the env var when it deploys
I don’t know if that’s even possible. Lambda itself might just wipe out the env vars upon every new function deploy

Carlos Daniel

01/15/2022, 6:04 PM
i "hard code" my env vars but get the values from the actual .env file, like this
my ci has the values i want to be deployed so it actually works. would be good tho to know if there is another way

Ross Coundon

01/15/2022, 8:04 PM
With my projects there are typically various groups of lambdas within a stack that have requirements for different env vars. So, rather than clutter up the stack definition with lots of specific assignments I have a helper function that looks like this:
type EnvironmentVariableHolder = {
  [key: string]: string;
};

export function buildEnvVarObject(vars: string[]): EnvironmentVariableHolder {
  const output: EnvironmentVariableHolder = {};
  vars.forEach((varName) => {
    const value = process.env[varName];
    if (value) output[varName] = value;
  });
  return output;
}
Then, for each group of lambdas I have a separate file that contains the keys of the required env vars. e.g.
const vars = [
  'CUSTOMER',
  'CUSTOMER_PUBLIC',
  'DATASET_IDS',
]
export { vars };
Then in the stack definition I can do
import { buildEnvVarObject } from './helpers/buildEnvObject';
import { vars as envVarsCommon } from './helpers/envVarsCommon';
Then I can build the environment with

environment: {
  ...buildEnvVarObject(envVarsCommon),
}
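For anyone wanting to try Ross's helper in isolation, here is a self-contained sketch of its behavior; the variable names and values (CUSTOMER, acme) are made up for the demo:

```typescript
type EnvironmentVariableHolder = { [key: string]: string };

// Same logic as the helper above: copy only the listed env vars
// that are actually set in the current process environment.
function buildEnvVarObject(vars: string[]): EnvironmentVariableHolder {
  const output: EnvironmentVariableHolder = {};
  vars.forEach((varName) => {
    const value = process.env[varName];
    if (value) output[varName] = value;
  });
  return output;
}

process.env.CUSTOMER = "acme";
delete process.env.DATASET_IDS;

// Unset vars are silently skipped instead of coming through as undefined.
const env = buildEnvVarObject(["CUSTOMER", "DATASET_IDS"]);
console.log(env); // { CUSTOMER: 'acme' }
```

The skip-if-unset behavior matters for CloudFormation: Lambda environment values must be strings, so forwarding an undefined var would fail the deploy.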

Carlos Daniel

01/15/2022, 8:06 PM
wow, great idea! will test it here