# sst
j
Hi, what’s the SST approach to Provisioned Concurrency?
t
Can you elaborate on your question
j
Well, I’m dealing with cold starts and don’t want this in Prod
With CDK, the process of setting up an Alias and Auto-scaling group is long
So it seems like something SST could abstract?
t
I do something like this
```ts
const router = new sst.Function(this, "router", {
  handler: "src/functions/router.handler",
  currentVersionOptions: {
    provisionedConcurrentExecutions:
      this.stage === "production" ? 1 : undefined,
  },
});

// https://github.com/aws/aws-cdk/issues/13731#issuecomment-814801449
router.currentVersion;
```
We don't do anything special in SST for this, it's just from CDK. Note that the weird last line is needed
j
But will, for example, AppSync construct link to the versioned ARN or aliased ARN? Probably not, so there are some extra steps involved I think
t
Ah good point it's possible the integration points to the wrong thing, I'd need to do some testing
I can look into this or were you planning on playing with it?
j
I’ll look into it no worries. I will report back
t
ty
j
This is the config:
```ts
const api = new AppSyncApi(this, "Api", {
  ...
  defaultFunctionProps: {
    timeout: Duration.seconds(15),
    currentVersionOptions: {
      provisionedConcurrentExecutions: 2,
    },
  },
});
```
Actual behaviour: The lambda resolvers don’t have a provisioned concurrency config and the appsync data sources don’t point to a version
To be honest, even with provisioned concurrency the lambda won’t stay warm and it’s the initialisation code that is causing a slow initial start
So perhaps the solution is just to keep the lambdas warm with a new construct
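A warmer along those lines can already be approximated with SST's existing Cron construct. This is a minimal sketch, assuming the v0.x Cron API (`schedule` + `job`); the handler can detect the scheduled ping via the EventBridge event's `source: "aws.events"` field and return early:

```ts
// Minimal sketch, assuming SST v0.x's Cron construct (schedule + job).
import * as sst from "@serverless-stack/resources";

export class WarmerStack extends sst.Stack {
  constructor(scope: sst.App, id: string, props?: sst.StackProps) {
    super(scope, id, props);

    // Fire a warm-up invocation every 5 minutes; the target handler
    // must recognise scheduled events and discard them.
    new sst.Cron(this, "RouterWarmer", {
      schedule: "rate(5 minutes)",
      job: "src/functions/router.handler",
    });
  }
}
```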
t
I think you need that weird line that I had
which requires defining the function separately unfortunately
```ts
// https://github.com/aws/aws-cdk/issues/13731#issuecomment-814801449
router.currentVersion;
```
j
oh
The thing is I’m doing it for defaultFunctionProps
t
yeah we should maybe solve this bug inside sst.Function
Let me make an issue
j
I’m not sure how important this edge case is
I think something else is wrong
Causing me to get lambda timeouts on first invocation
Even though it is running locally
t
are you using a sql database?
j
Ddb
t
hm yeah not sure what's causing a slow start
If you send me .build/sst-debug.log I can make sure it's not on sst's side
j
Yeah sorry I don’t think I can blame SST. I have to find out. I will monitor that file in the future.
Even though I said that issue is a minor edge case, I have come across quite an important use case, and that's with the Auth stack. I am using a custom authorizer which spins up some Golang functions, and it would be nice to not have the user wait a few seconds. I'm not anticipating lots of demand, so this latency impacts too many users
It uses all of the cognito triggers so that causes quite a long delay
defaultFunctionProps could support using provisioned concurrency
t
Yeah this is one of the tough things with serverless, if you want a high quality product it can be tricky to get some of these things right
j
I suppose either that or an EventBridge scheduled event, but with some custom logic inside each lambda to discard ping requests
It would be nice if SST had a lambda warmer, but that's not possible without custom logic in each function, or could it just use a lambda layer for all of them?
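The ping-discard part is only a couple of lines per function. A minimal sketch, assuming a hypothetical `warmer: true` flag set by the scheduled event (the flag name is illustrative, not a standard):

```ts
// Guard that short-circuits warm-up pings before any real work runs.
// The `warmer` flag is an illustrative convention, not a standard field.
interface WarmerEvent {
  warmer?: boolean;
}

// True when the event is a scheduled warm-up ping to be discarded.
export function isWarmerPing(event: WarmerEvent): boolean {
  return event.warmer === true;
}

export async function handler(event: WarmerEvent) {
  if (isWarmerPing(event)) {
    // Keep the container warm, skip business logic entirely.
    return { statusCode: 200, body: "warmed" };
  }
  // ... real business logic would go here ...
  return { statusCode: 200, body: "ok" };
}
```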
t
Provisioned concurrency should solve your issue though so ideally we should get that working
warming shouldn't be needed with that
j
That’s what I thought, but it turns out Provisioned Concurrency only guarantees there are VMs with your code downloaded in. It does not run the initialisation code and keep them thawed. So that is causing a few hundred milliseconds delay when invoking provisioned lambdas that aren’t warm.
As impressive as lambda is, it’s still ridiculous that this hasn’t been made an option in 2021 for customers who want to pay more to have the lambda warm and not just provisioned
j
@thdxr so you don't see any use case where warming has benefits over provisioned concurrency? 🤔
t
Ah @Joe Kendal I wasn't aware of that. I was kind of wondering why turning on provisioned concurrency didn't seem to fix my issues but never dug into it
j
@thdxr Yeah I wonder if it’s better to look at using Fargate instead of trying to warm them with events. Do you know if SEED uses Fargate instead of Lambdas? Harder to integrate with other services natively.
t
Do you mean what seed uses under the hood? I believe it's normal ec2
I also was able to get my cold start times down pretty low, how slow are yours?
j
EC2 is extra hard to manage. I can’t imagine doing that in 2021.
Like 800-1400ms
I had a pipeline resolver with 3 golang lambdas so it was literally taking 10 seconds
So I managed to convert two of them into VTL and now it is only 1.5-2.5 seconds cold start
The use case was to cache any mutation results when the API user provides an Idempotency-Key header
Similar to Stripe, quite neat actually. VTL mappers with a Ddb TTL
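The DynamoDB TTL half of that pattern is easy to sketch in TypeScript; the item shape and field names below are hypothetical, just to show the idea:

```ts
// Hypothetical idempotency-cache item for DynamoDB.
// The table must have TTL enabled on the `expiresAt` attribute;
// DynamoDB deletes items once that epoch-seconds timestamp passes.

// Expiry timestamp `ttlHours` from `now`, in epoch seconds.
export function ttlEpochSeconds(now: Date, ttlHours: number): number {
  return Math.floor(now.getTime() / 1000) + ttlHours * 3600;
}

// A cached mutation result, keyed by the client's Idempotency-Key header.
export function idempotencyItem(key: string, result: string) {
  return {
    pk: `IDEMPOTENCY#${key}`,
    result,
    expiresAt: ttlEpochSeconds(new Date(), 24),
  };
}
```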
t
Hm that's still pretty high
This is with GraphQL?
I've been planning on using GraphCDN for caching, pretty incredible
I also tend to not use too many of the AWS pipeline lambdas, like a lambda authorizer. It's less "serverless" but the more I can do in a single lambda the fewer steps and less latency
j
I’m not sure if I can apply GraphCDN in this case. But it seems to do what AppSync and CloudFront can already do, just better.
Yes I’ve learned not to do it anymore. The VTL takes 10ms to do what took a golang lambda 400ms (warm), 1000ms (cold)
l
@thdxr @Frank This seems like the most recent thread on provisioned concurrency (given the amount in the past, no need to go any further :D). Just wanted to make sure I'm doing everything right here...
```ts
const getUserSettings = new Function(this, "getSettings", {
  handler: "src/userAPI/userController.getUserSettingsAPI",
  currentVersionOptions: {
    provisionedConcurrentExecutions: this.isNamedEnv ? 1 : 0,
  },
  environment: {
    ..
  },
});

// REQUIRED TO CREATE PROVISIONED FUNCTIONS
const userSettingsVersion = getUserSettings.currentVersion;

props.api.addRoutes(this, {
  "GET /user/settings": getUserSettings,
});
```
I'm seeing the provisioned instances being created and idle. I'm seeing (oddly) lower latency in the requests, but according to all sources available I'm not calling the provisioned instances (couldn't find any invocation different than "AWS_LAMBDA_INITIALIZATION_TYPE": "on-demand"). Should the version construct be referenced somewhere when adding the actual route?
j
@Lukasz K yes, you need to reference the version otherwise apigw won't call the provisioned lambda
So instead do this: props.api.addRoutes(this, {"GET /user/settings": getUserSettings.currentVersion});
l
@Joe Kendal that was my first thought prior to sending the question. What I get from
sst build
when trying to do it is:
Copy code
Error: Invalid function definition for the "Lambda_GET_/company" Function
since the API construct is looking for:
```ts
FunctionDefinition | ApiFunctionRouteProps | ApiHttpRouteProps | ApiAlbRouteProps
```
j
I think it may be .currentVersion.handler or something, I'm afk sorry
l
Sure, no prob. Felt like it's gonna be something like this since that was the requirement in ApiGwV1. Just need to dig a bit in the object 😀
On the other hand this automatically takes care of the problem noted by @thdxr (the fact that we need to reference .currentVersion for one to be created)
Yeah, the API construct is ignoring the version definition (even if I call currentVersion.lambda it's gonna get right back to the root function def and use $LATEST as the version in integration details)
Taking a break for today. After tinkering with what's available I just can't shrug the feeling that I should simply add a Cron which calls the function every 10-12 minutes during peak hours and be done with it
j
Well if you want your lambdas to stay warm this winter the only option is to use eventbridge cron since provisioned concurrency is misleading and doesn't actually warm them up
f
Hey @Lukasz K, SST currently doesn’t support using a specific version out of the box. We’ve got an open issue for this https://github.com/serverless-stack/serverless-stack/issues/777
If you want to use a specific version, you’d have to hook up the API route manually. Similar to what @Aram is doing in the GitHub issue.
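A hedged sketch of that manual hookup, assuming CDK v1's apigatewayv2 modules and that the SST Api construct exposes the underlying `httpApi` (class and package names may differ in your CDK version):

```ts
import * as apigw from "@aws-cdk/aws-apigatewayv2";
import { LambdaProxyIntegration } from "@aws-cdk/aws-apigatewayv2-integrations";
import * as lambda from "@aws-cdk/aws-lambda";

// Alias pointing at the provisioned version rather than $LATEST.
const alias = new lambda.Alias(this, "LiveAlias", {
  aliasName: "live",
  version: getUserSettings.currentVersion,
});

// Route the API at the alias so invocations hit provisioned capacity.
api.httpApi.addRoutes({
  path: "/user/settings",
  methods: [apigw.HttpMethod.GET],
  integration: new LambdaProxyIntegration({ handler: alias }),
});
```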
a
Hi everyone. Is there any conclusion on how to run Provisioned Concurrency with SST?
l
@Artem Pabacham are you sure you need the provisioned concurrency instead of just keeping the function warm with a set of cron warmers? I did some math and my cron setup is over 100 times cheaper per function compared to provisioned... I'm using Jeremy Daly's lambda-warmer + created a small supporting method in my extension class of Stack, so it only takes one line in the definition and 2 lines (including the import) in the actual function for the warmer to process stuff properly. If you wish I can send you a snippet via PM
a
@Lukasz K That's a good question. My goal is to keep some functions warm for 8 hours. Based on this article https://aws.amazon.com/blogs/aws/new-provisioned-concurrency-for-lambda-functions/ PrCo can help me do that. Making Cron requests is an alternative way. If you think that I should take this option as the first one, I am good and will appreciate your support.