# help
Hey team, I'm getting an error:
`Runtime.ImportModuleError: Unable to import module 'handlers/alerts/demo_read_db': No module named 'marshmallow'`
and I'm thinking this is a bundling issue. I notice that the deployed Lambda function isn't running in a Docker container. This is my cron job definition:
this.cron = new sst.Cron(this, 'cron', {
    schedule: Duration.minutes(1),
    job: {
        // check the docs here to see what you can pass into the function props
        // https://docs.serverless-stack.com/constructs/Function
        function: {
            runtime: 'python3.9',
            srcPath: 'src',
            handler: 'handlers/alerts/demo_read_db.handler',
            bundle: {
                installCommands: [
                    'yum -y install mariadb-devel gcc',
                    'pip install -r requirements.txt',
                ],
            },
        },
    },
});
This is my requirements file that defines marshmallow
When I run `npx sst deploy --stage prod`, I see on the CLI that Docker builds a container, yum installs packages, and pip installs packages (snippet in thread). However, when I go into the AWS console and look at the code, I just see the bare source (screenshot), and the package source is a zip. The Lambda doesn't seem to have access to any of the packages from the requirements file. Any ideas?
or does sst do packages via lambda layers?
How do Python packages actually get included in the Lambda function? From looking at the sst outputs, it looks like there should be a layer attached to the Lambda, but I don't see it actually linked onto the function.
Found the answer reading through the sst code. My pip install command should have been this:
'pip install -r requirements.txt -t .',
It wasn't installing packages to the right location. The `-t .` installs them into the srcPath directory, so they get zipped up with the handler.
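For reference, here is what the full bundle block looks like with the corrected install command (this is just the snippet from above with the one-flag fix applied; it's a construct fragment, not a standalone program):

```typescript
this.cron = new sst.Cron(this, 'cron', {
    schedule: Duration.minutes(1),
    job: {
        function: {
            runtime: 'python3.9',
            srcPath: 'src',
            handler: 'handlers/alerts/demo_read_db.handler',
            bundle: {
                installCommands: [
                    'yum -y install mariadb-devel gcc',
                    // -t . installs the packages into srcPath itself,
                    // so they end up inside the deployment zip
                    'pip install -r requirements.txt -t .',
                ],
            },
        },
    },
});
```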
Hey @Ryan Barnes, there might be some confusion. For Python runtimes, only the build process happens inside a Docker container. The reason is that, unlike Node.js, some commonly used Python packages are OS-dependent, so on `sst deploy` they are built in a Lambda-like Docker environment.
Your Python code, along with its dependencies, is zipped up and sent to AWS Lambda.
It's a standard Python Lambda; it doesn't run inside Docker on the AWS end.
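To sketch why the `-t .` flag matters (module name and paths below are made up for illustration, not the real marshmallow package): Python imports resolve against `sys.path`, and Lambda puts the unzipped task root (`/var/task`) on it before calling the handler, so a dependency is importable only if its files were installed into the directory that gets zipped.

```python
import pathlib
import sys
import tempfile

# Stand-in for srcPath: whatever lives in this directory is what would
# end up in the deployment zip (and therefore in /var/task at runtime).
src = pathlib.Path(tempfile.mkdtemp())

# Mimic `pip install -t <srcPath>` by dropping a module file there.
(src / "marshmallow_stub.py").write_text("VERSION = 'stub'\n")

# Lambda effectively does this for /var/task before invoking the handler.
sys.path.insert(0, str(src))

# Resolves now; without the file in src, this raises ModuleNotFoundError,
# which is what Runtime.ImportModuleError wraps.
import marshmallow_stub

print(marshmallow_stub.VERSION)  # prints: stub
```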
appreciate the clarification 🙂