# help
h
Hello! I seem to have a problem with deployment. I've mainly been working with `sst start` for development, and today I tried to deploy my application with `sst deploy`, which seems to have hit a Lambda limit.
```
stack failed: Resource handler returned message: "Unzipped size must be smaller than 47358303 bytes (Service: Lambda, Status Code: 400, Request ID: e9aea816-1578-4523-9a68-2f16d3bb278c, Extended Request ID: null)" (RequestToken: 687bb199-2423-bcc6-df83-ffd2c4f44dce, HandlerErrorCode: InvalidRequest)
```
I've got the exact same application deployed in SLS, and the functions collectively do have a size larger than the limit, but when looking at them in their "stacks" or "microservices" they don't reach more than a few MBs. Any suggestions on how to go about this? I've disabled bundling on a few functions, but I do have quite a few using Layers, and the same layer may be used in multiple stacks - not sure if it's an issue, but I thought I would point it out. Thanks.
f
Hey @Hubert, can you post the entire build log (the CloudFormation events that are printed to the terminal)?
h
Sure. Added as a file as it's chunky.
(this is on a second run)
f
Sorry, was in a meeting.
So it looks like it's this function, `fn_read_cols.lambda_handler`, that's causing the issue.
Can you try this:
1. Go into `.build/cdk.out` and open up the template for the `live-sst-dataset-stack` stack.
2. Look for the `ReadCols5CD479F5` resource in the template.
3. At the bottom of the resource JSON block, there's an `aws:asset:path`.
4. You should see a folder in `.build/cdk.out` with a matching name.

Can you check the size of the folder?
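If it helps, something along these lines should surface it from the command line (a rough sketch - the template filename is inferred from the stack name, and the asset folder name varies per build):
```bash
cd .build/cdk.out

# Print the asset paths recorded in the stack template
# (template filename inferred from the stack name; check the folder for the exact name)
grep '"aws:asset:path"' live-sst-dataset-stack.template.json

# Check the unzipped size of the matching asset folder
# (replace asset.<hash> with the folder name printed above)
du -sh asset.<hash>
```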
h
Apologies for the late reply. I've just checked: it's 182.8 MB.
Any suggestions on how to downsize it?
I'm now also getting this same error on `npx sst start`, despite not making any changes to the stacks.
For some reason `boto3` and `botocore` were bundling together with my stack, causing insane stack sizes - Resolved.
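(For anyone hitting the same thing, a quick way to spot which dependency is inflating a bundle is to list the largest entries inside the offending asset folder - the folder name below is a placeholder from the earlier step:)
```bash
# Largest entries inside the oversized asset folder, biggest first
du -sh .build/cdk.out/asset.<hash>/* | sort -rh | head
```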
f
Oh nice! I was looking up the error message above and usually it should say:
```
Unzipped size must be smaller than 262144000 bytes
```
b/c the unzipped function size limit is 250 MB. I was going to dig deeper into this issue, glad you got it to work.
Btw, we are trying to make some improvements to how SST packages Python code. I'm curious how you are packaging your Python code - are you using any packaging tools? And how are you excluding boto3 and botocore?
h
All my lambda code is packaged by SST; I'm not zipping up anything. If a stack requires a specific package, I have a `requirements.txt` in the folder of the stack that gets bundled with all the Lambda functions in that folder, but I've since removed that in favour of layers because of this sizing problem. For layers I have a docker script that builds a `python` folder with all the packages I need inside. Happy to share some code and setup for that if you would like.
I have a monorepo structure, so each stack has its own `./src/stack/` folder for Lambda functions; that way, if I need a stack-wide package bundled, I can easily add it, as long as it doesn't exceed the limit. Not sure if that answers the question 🙈
As for excluding boto3: it was actually in one of those requirements.txt files and I didn't realise it, so it was bundling `boto3` and `botocore` with the stack and causing that huge 100+ MB size. I just removed them from the requirements file in that folder. For `sst start` I have a separate requirements file in the root folder with all the packages, and a venv that I source before `sst start`. Hope that helps 🙂
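(For context, a minimal sketch of that kind of docker-based layer build - the build image, paths, and layer name here are illustrative assumptions, not the actual script:)
```bash
# Install the layer's dependencies into a python/ folder using a Lambda-compatible
# build image (image tag is an assumption), then zip it for upload as a layer.
docker run --rm -v "$PWD":/work -w /work public.ecr.aws/sam/build-python3.8 \
  pip install -r requirements.txt -t python/

zip -r my-layer.zip python
```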
f
Ah I see. Do you have local packages, i.e. packaging common code into a local package? Are you also packaging them into the layer?
h
Into Layers. I mean, having it packaged with the code didn't work out well with all the above 🙈 I use some packages in multiple stacks, so packaging them into every stack wouldn't make sense. Layers solve that problem: I just import them as needed in the specific function in the stack where I need them. But I have explored both approaches.