# help
h
Hey guys, I'm encountering a problem where the debug stack appears to restart when logging the response of a Lambda that returns a base64 string.
I deployed the stack and didn't encounter the error anymore, so this is something in the debug stack.
Could we perhaps just truncate the body in the logs when the response's "isBase64Encoded" flag is set to true?
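Something roughly like this, just to sketch the idea (the function name and the character limit here are made up):
Copy code
// sketch: truncate the logged body when the Lambda marks it as base64
function formatResponseForLog(response) {
  const MAX_BODY_CHARS = 1000; // arbitrary limit, for illustration only
  if (response.isBase64Encoded && response.body && response.body.length > MAX_BODY_CHARS) {
    return {
      ...response,
      body: response.body.slice(0, MAX_BODY_CHARS) +
        `... (${response.body.length - MAX_BODY_CHARS} more chars truncated)`,
    };
  }
  return response;
}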
f
Hi @Harry Collin, can you share a snippet of what the Lambda is returning?
I just tried this, and it worked for me.
Copy code
export async function main() {
  const response = Buffer.from("hello world").toString('base64');
  console.log(response);

  return {
    statusCode: 200,
    body: response,
  };
}
So it’s probably not base64 related?
Can you update SST to v0.12.1 and give it a try again? There's a recent fix where a large Lambda response caused a similar error.
If you still have the same issue after upgrading, can you share what the Lambda response is? I will give that a try.
h
Hey @Frank, my response looks like this:
Copy code
return {
    "isBase64Encoded": true,
    "statusCode": 200,
    "headers": {
        "content-type": "image/png",
        "cache-control": "max-age=86400",
        "Access-Control-Allow-Headers": "*",
        "Access-Control-Allow-Origin": "*",
        "Access-Control-Allow-Credentials": true,
    },
    "body": base64
}
I upgraded to v0.12.1 but didn't have any luck. The response is far too large for a console view, so it doesn't really make sense to paste it here.
f
I see. Can you do a length or byte check on base64?
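For example, something like this (just a rough check; base64 here is the variable from your snippet):
Copy code
// rough size checks on the encoded payload
console.log("encoded length (chars):", base64.length);
console.log("encoded size (bytes):", Buffer.byteLength(base64));
console.log("decoded size (bytes):", Buffer.from(base64, "base64").length);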
h
Hey Frank, the responses range from around 0.4-0.8 MB.
The exact size from one response: 694,624 bytes.
(Attachment: CloudWatch logs for the function)
I'm not sure if this is related, but the AWS docs mention this under "Amazon API Gateway important notes for WebSocket APIs": API Gateway supports message payloads up to 128 KB with a maximum frame size of 32 KB. If a message exceeds 32 KB, you must split it into multiple frames, each 32 KB or smaller. If a larger message is received, the connection is closed with code 1009. Our responses are well over that 128 KB limit.
To report further on this: I have a similar problem when using POST, although it doesn't even appear in the debug console; it just fails silently and returns status 500. This only happens when the body is large, so I would assume these issues are related somehow.
f
Hey @Harry Collin, you are 100% right, the root cause is the API Gateway limit on the websocket payload. I also ran into a couple of other issues related to large payloads, and it took me a couple of days to put together a solution.
I created a beta version, 0.13.1-next.8, and it would be great if you could help me confirm that it works. To test it out, you just have to change the SST version in your package.json:
Copy code
"@serverless-stack/cli": "0.13.1-next.8",
"@serverless-stack/resources": "0.13.1-next.8"
Then run npm/yarn install.
The fix is that SST now creates an S3 bucket in the debug stack. For Lambda payloads greater than 32 KB, the payload is saved to the S3 bucket, and the websocket message contains the S3 file path instead of the full payload.
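Roughly, the idea looks something like this (a simplified sketch, not the actual SST code; preparePayload, the bucket name, and the key format are made up for illustration):
Copy code
import AWS from "aws-sdk";

const s3 = new AWS.S3();
// API Gateway websocket frame limit
const PAYLOAD_LIMIT = 32 * 1024;

async function preparePayload(payload, bucketName) {
  const body = JSON.stringify(payload);

  // small payloads go straight over the websocket
  if (Buffer.byteLength(body) < PAYLOAD_LIMIT) {
    return { type: "inline", data: body };
  }

  // large payloads are written to S3; only the S3 key goes over the websocket
  const key = `payloads/${Date.now()}.json`;
  await s3.putObject({ Bucket: bucketName, Key: key, Body: body }).promise();
  return { type: "s3", bucket: bucketName, key };
}
The receiving side would then fetch the object from S3 when it sees a pointer instead of the inline data.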
Let me know if this works for you, and I will cut an official release with it.
h
Hey @Frank. I updated the stack and found that it no longer restarts when encountering a large payload. I did notice that it still logs massive responses, though. I copied the body from the console and it was around 686 KB of base64, from a Lambda response.
f
I see. So the Lambda worked; it's just that the massive response is being printed out and cluttering the console. Is that correct?
h
That's correct.
f
Hey @Harry Collin, I just cut a release, v0.14.0, with the large payload fix. It also truncates large output in the console.
h
Hey @Frank, sorry for the late reply, I missed the notification. I'll give it a try now. Thanks for your work on this!