I want to add Lambda Layers to my Serverless application. I plan to create three layers, each built from a separate requirements.txt file located in its own directory. Unfortunately, I am having trouble wiring them up in the Serverless configuration. The file tree looks like this:
```
app-directory/
├── lambda functions codes ...
├── layers/
│   ├── layer_A/
│   │   └── requirements.txt
│   ├── layer_B/
│   │   └── requirements.txt
│   └── layer_C/
│       └── requirements.txt
└── serverless.yml
```
Following the instructions in

- https://www.serverless.com/framework/docs/providers/aws/guide/layers
- https://docs.aws.amazon.com/lambda/latest/dg/packaging-layers.html

for the first attempt I created a Python venv with `python3.9 -m venv python`, installed the dependencies with `pip install -r layers/layer_A/requirements.txt --target python/lib/python3.9/site-packages/`, and then zipped the `python` directory. I then added a `layers` property to serverless.yml with the following parameters:
```yaml
layers:
  layerA:
    path: layers/layer_A
    compatibleRuntimes:
      - python3.9
  layerB:
    path: layers/layer_B
    compatibleRuntimes:
      - python3.9
  layerC:
    path: layers/layer_C
    compatibleRuntimes:
      - python3.9
```
The layers are referenced in the function definition as follows:
```yaml
migrations:
  handler: src.migrations.lambda_handler
  layers:
    - !Ref LayerALambdaLayer
    - !Ref LayerBLambdaLayer
    - !Ref LayerCLambdaLayer
```
After running a deployment script containing the `sls deploy` and `sls invoke --stage test --function migrations` commands, which invokes the migrations Lambda, I received this error:
```json
{
  "errorMessage": "Unable to import module 'src.migrations.handler': No module named 'alembic'",
  "errorType": "Runtime.ImportModuleError",
  "requestId": "59a33572-9439-4590-b547-de6084e784b3",
  "stackTrace": []
}
```
```
Environment: darwin, node 20.8.0, framework 3.35.2 (local) 3.38.0v (global), plugin 7.0.5, SDK 4.4.0
Credentials: Local, "default" profile
Docs:        docs.serverless.com
Support:     forum.serverless.com
Bugs:        github.com/serverless/serverless/issues

Error:
Invoked function failed
```
For the second attempt, I created a separate directory just for the requirements files, plus a bash script that creates a `python/` directory per layer, installs the dependencies into it with `pip3`, and zips it:
```bash
#!/bin/bash
set -e

LAYER_TARGET_DIR="layers"
LAYER_REQUIREMENTS_DIR="layers_requirements"
RUNTIME="python3.9"

for LAYER_PATH in "$LAYER_TARGET_DIR"/*/; do
    LAYER=$(basename "$LAYER_PATH")
    printf "Processing %s layer\n" "$LAYER"

    if [ -f "$LAYER_PATH/python.zip" ]; then
        printf "Existing python.zip found in %s layer. Deleting...\n" "$LAYER"
        rm -f "$LAYER_PATH/python.zip"
    fi

    printf "Installing dependencies for %s layer\n" "$LAYER"
    mkdir -p "$LAYER_TARGET_DIR/$LAYER/python"
    pip3 install -r "$LAYER_REQUIREMENTS_DIR/$LAYER/requirements.txt" -t "$LAYER_TARGET_DIR/$LAYER/python/"
    printf "Dependencies installed for %s layer\n" "$LAYER"

    printf "Zipping the python folder\n"
    cd "$LAYER_TARGET_DIR/$LAYER"
    zip -r9 python.zip python
    cd - > /dev/null
    rm -rf "$LAYER_TARGET_DIR/$LAYER/python"
    printf "%s layer processing complete\n" "$LAYER"
done

printf "\nAll layers processed\n--------------------------------------\n"
```
In the end, the directory tree looked like this (the requirements directory matches the `LAYER_REQUIREMENTS_DIR` in the script):

```
app-directory/
├── lambda functions codes ...
├── layers_requirements/
│   ├── layer_A/
│   │   └── requirements.txt
│   ├── layer_B/
│   │   └── requirements.txt
│   └── layer_C/
│       └── requirements.txt
├── layers/
│   ├── layer_A/
│   │   └── python.zip
│   ├── layer_B/
│   │   └── python.zip
│   └── layer_C/
│       └── python.zip
└── serverless.yml
```
This doesn't work either; I received the same error. I have double-checked the directory structure, and it matches what the AWS docs require.
Is something in my configuration incorrect? Am I forgetting something?