# help
d
Hi there! Firstly I have to say I love SST, debugging lambdas running in the cloud has turned me from "feedback cycles for FaaS are too slow" to "PUT ALL THINGS ON LAMBDAS!" I have a question about the debugger: it works for me sometimes, but sometimes not. I understand the general idea of how it works with the websocket/remote debug thing, but when it doesn't work I'm a bit stuck... this is how I'm thinking to figure out what's wrong:
1. Run mitmproxy and filter all requests to amazonaws.com
2. Get my machine to trust the mitmproxy certificate
3. Figure out how to disable cert pinning in aws-sdk/aws-cdk etc. that's making requests from vscode out to the magical "debug" websockets API Gateway
4. Look at what happens in the happy path
5. Look at what's missing when it breaks
So my questions are:
• Does this sound like a solid approach? Is there anything else you'd suggest? (I'm not getting the answers I need from logs by setting `DEBUG=1`)
• Is the debugger functionality known to be a bit flaky? (if this is the case I'm happy to contribute PRs)
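For steps 1–3, a minimal sketch of the proxy/trust setup (paths and flag names are assumptions: the mitmproxy CA path below is its documented default, flag syntax varies by mitmproxy version, and while Node honors `NODE_EXTRA_CA_CERTS` natively, the AWS SDK for JS may need an explicit proxy agent rather than just `HTTPS_PROXY`):

```shell
# Start mitmproxy showing only AWS traffic (run separately; filter syntax
# and flag name vary by mitmproxy version):
#   mitmproxy --view-filter '~d amazonaws.com'

# Make Node processes trust the mitmproxy CA (default cert location shown)
export NODE_EXTRA_CA_CERTS="$HOME/.mitmproxy/mitmproxy-ca-cert.pem"

# Route traffic through the proxy for tools that honor these variables
export HTTPS_PROXY=http://127.0.0.1:8080
export HTTP_PROXY=http://127.0.0.1:8080
```

Note that the Node AWS SDK doesn't do certificate pinning, so trusting the proxy CA is normally all that's needed on the TLS side.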
f
Hey @Dirk Stewart, can you elaborate on what you are seeing when the debugger isn’t working?
Is it that your Live Lambda dev console isn’t receiving any requests when your Lambda functions are invoked?
d
I'm new to this so I'm not sure what you mean by the Live Lambda dev console. I'm using vscode and the issue is with breakpoints not being triggered (sometimes). The sst process still seems to be reporting those REQUEST and RESPONSE logs from the triggering of the lambdas in the debug console of vscode, and I'm seeing things like `Debugger attached.` and `Sending keep-alive call`, it's just that the breakpoints aren't triggered.
I guess another place I could start is getting a really good grasp of how the node debugger works (what's happening on port 9229) and trying to get more information out of vscode itself... I just wanted to post a message on here to see if anyone was like "nah nah nah, that's dumb, approach it from this angle"
f
lol.. I see. If you are seeing REQUEST and RESPONSE logs in the console, most likely SST is working fine, and your local function is getting triggered.
Let me loop in Jay. @Jay, Dirk is having an issue where the functions are run locally, but the breakpoints in VSCode aren’t getting triggered.
Dirk also mentioned that this happens only sometimes.
d
but when you say locally I'm assuming you mean it's just made to look like it's executed on my machine? Really it's going up to a lambda, executing there, and there's a cool connection between that debug stack and the main stack, where one of the lambdas is running as `node --inspect` somehow and the other is proxying a debug connection somehow back to my machine?
so when I'm stepping through code as per the guides on the sst website, I'm actually seeing the stuff in RAM on a running lambda on AWS?
j
Hmm yeah I’m not totally sure about this. But your code is running locally on your machine.
I’m not totally sure about the VS Code part I mean.
But what’s happening is that the WebSocket connection sends the serialized request, your local machine runs your Lambda functions, then replies with the serialized response.
The VS Code debugger is connecting to that node process, from what I understand. Any clear patterns on when it fails to stop at the breakpoints?
Also, let me mention some folks that are using VS Code. Maybe they might have some insights on the VS Code part of this. @Karo and her team are using it I think.
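The relay Jay describes could be sketched roughly like this (all names, including the `client.lambdaResponse` action, are hypothetical illustrations, not SST's actual code): the local client receives a serialized invocation over the WebSocket, runs your handler locally, and sends back a serialized response.

```javascript
// Hypothetical sketch of the Live Lambda relay loop (not SST's real implementation)

// A local handler, as you'd write for Lambda
async function handler(event) {
  return { statusCode: 200, body: `Hello, ${event.name}!` };
}

// What the local client conceptually does with each WebSocket message
async function onClientMessage(rawMessage, handler) {
  const msg = JSON.parse(rawMessage);        // serialized invocation from the debug stack
  const response = await handler(msg.event); // run the function locally
  return JSON.stringify({
    action: "client.lambdaResponse",         // hypothetical action name
    requestId: msg.requestId,
    response,                                // serialized response goes back over the WebSocket
  });
}

// Simulate one round trip
onClientMessage(
  JSON.stringify({ requestId: "abc123", event: { name: "World" } }),
  handler
).then((reply) => console.log(reply));
// → {"action":"client.lambdaResponse","requestId":"abc123","response":{"statusCode":200,"body":"Hello, World!"}}
```

The key point for the debugging question: the handler really executes in a local node process, which is what the VS Code debugger attaches to.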
d
not really, that's why I thought I'd have a good stab at understanding it before I posted an issue on github saying "it's broken, help"
so the thing that made me think it wasn't running locally was the fact that the breakpoints weren't "bound" (they're empty circles), and that's the same as in the video on your website... I haven't seen that ever happen except in cases where the actual code is running elsewhere... I wonder if I set those breakpoints in the files in the .build folder I'll get a different outcome...
j
Oh that’s interesting, yeah, the code is transpiled.
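Because the code that actually runs is the transpiled output in `.build`, VS Code needs source maps to bind breakpoints set in the original source files. `sourceMaps` and `outFiles` are standard VS Code node-debug options for exactly this; whether the SST build emits maps in this layout, and the exact `runtimeExecutable`/args shown, are assumptions for illustration:

```json
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Debug SST Start",
      "type": "node",
      "request": "launch",
      "runtimeExecutable": "${workspaceRoot}/node_modules/.bin/sst",
      "args": ["start"],
      "sourceMaps": true,
      "outFiles": ["${workspaceFolder}/.build/**/*.js"]
    }
  ]
}
```

If breakpoints stay as empty circles, it usually means the debugger couldn't map your source file to any loaded script, which fits the "code is transpiled" observation.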
d
weird, if I screw up the code in the .build folder and then call the lambda via API Gateway, it errors in the way I expected, but vscode doesn't pause on my breakpoints in either set of files
I think I could get some answers just by sniffing 9229 on my local machine in that case, because it's just the debug websocket client talking to my debugging node process locally
i'll do some more digging and see what i can find, thanks for your help 🙂
j
Yeah, let me know what you find. We’d like to document the VS Code integration better and fix any issues with it.
d
should have read the docs closer, they very carefully describe the thing I was confused about at https://docs.serverless-stack.com/live-lambda-development ! The one thing that did throw me though was the sample `launch.json` at https://docs.serverless-stack.com/debugging-with-vscode, which includes `"port": 9229`, which can be removed. The `port` option is for situations where vscode is connecting to a remotely running node process. For example, you have some node code in a docker container and you want to debug it. Obviously you can't have vscode "inside" the docker container, so you run `node --inspect my-cool-app.js` in the container, which runs the program and also listens for debug connections on 9229. Back on your host you click debug, and vscode connects over port 9229 to establish a debug connection with that remote process.
I think when I saw that I instantly thought "oh, they do some cool reverse proxy thing with API Gateway and get lambdas to run in `--inspect` mode", when really I should have just RTFM 😅
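For contrast, the kind of config where `port` does belong is a plain attach to a remote `--inspect` process (the docker scenario above). This is standard VS Code node debugging, nothing SST-specific; the `remoteRoot` path is a placeholder for wherever the code lives in the container:

```json
{
  "name": "Attach to remote node",
  "type": "node",
  "request": "attach",
  "address": "localhost",
  "port": 9229,
  "localRoot": "${workspaceFolder}",
  "remoteRoot": "/app"
}
```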
in terms of the debugger not triggering, i'm going to have another crack at it tonight. i'll let you know if I find anything of interest 🙂
j
Ah thanks for pointing that out. We’ll test removing that port and update the doc.
d
great!
hmmmm i think i may have something
if I take this: https://github.com/serverless-stack/examples/tree/main/vscode and run it as is, debugging works fine; if I upgrade the dependencies to the most recent versions of cdk/sst I have problems
in the `launch.json` I set `"trace": true` to get detailed information about the debugger process. The difference I can see between the two is:
• v0.13 has debug logs with the tag `runtime.sourcecreate` for all built js files in `.build/...`, and a bunch of `Debugger.breakpointResolved` messages
• v0.21 only has `runtime.sourcecreate` logs for the following files:

```
.build/eslint.js
.build/run.js
.build/lib/index.js
.build/eslint.js
```

• v0.21 doesn't have any `Debugger.breakpointResolved` logs
• In both cases `.build/src/lambda.js` is there and looks perfectly fine
• Testing all the versions up from 0.13, the break appears to happen between 0.18 & 0.19
• Looking at the difference between the sst-debug.log files for 0.18 and 0.19, the only significant-looking difference seems to be that `cli/scripts/start.js:onClientMessage` reports

```
[2021-05-26T00:26:18.336] [DEBUG] client - onClientMessage {"action":"server.clientRegistered","clientConnectionId":"f6IgFfS_LPECHAQ="}
```

in 0.18 (debugging working), while in 0.19, where debugging doesn't work, there's no trace of a `server.clientRegistered` message being received (however in both cases the code runs, and I get the "Hello, World! Your request was received at 25/May/2021 23:26:41 +0000." message returned to my browser)
• @Jay @Frank I'm clocking off for the evening but let me know if anything springs to mind from the above
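A quick way to check a log for that marker (the sample line below is copied from the transcript; swap the here-doc for your real `sst-debug.log`, whose location may vary by setup):

```shell
# Build a sample log file containing one clientRegistered message
cat > /tmp/sst-debug-sample.log <<'EOF'
[2021-05-26T00:26:18.336] [DEBUG] client - onClientMessage {"action":"server.clientRegistered","clientConnectionId":"f6IgFfS_LPECHAQ="}
EOF

# Count clientRegistered messages; in the broken (0.19+) case this prints 0
grep -c 'server.clientRegistered' /tmp/sst-debug-sample.log
# → 1
```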
j
Thank you for taking the time to investigate this and tracking down the version. I can confirm that starting v0.19, there’s an issue. Let me talk to @Frank and see what’s going on and what changed in that release.
f
@Dirk Stewart Thanks for digging into this! I managed to track down the issue: there were a couple of system environment variables we stopped passing down to the invoked Lambda process, and that seemed to prevent VSCode from attaching to it.
I just released v0.21.1 with the fix. Give it a try and let me know if it works for you. To update:
```
$ npm install --save --save-exact @serverless-stack/cli@0.21.1 @serverless-stack/resources@0.21.1
```
d
Yes! Working perfectly 🙂 thanks for the fast release and no worries for the digging!