Deploy a Django application to AWS Lambda using Serverless Framework
João Marcos
Posted on December 14, 2023
If you have a Django application and want an easy and fast way to deploy it to the AWS Lambda service, you can use the Serverless Framework. It helps you deploy AWS Lambda functions with the necessary resources.
The instructions in the Framework’s documentation are very clear. If you haven’t installed it yet, you can follow the steps outlined in this link to do so.
For this demonstration, I developed a simple Django application. It has only one model and leverages DRF (Django REST Framework) to provide a REST API. You can check out all the code used in this post by accessing this GitHub repository.
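To give you an idea of its shape before we dive into the deployment, here is a minimal sketch of the kind of model and DRF wiring such an app uses. The Book fields and module layout are illustrative assumptions; the actual code is in the repository:

# books/models.py -- illustrative model, actual fields may differ
from django.db import models

class Book(models.Model):
    title = models.CharField(max_length=200)
    author = models.CharField(max_length=200)

# books/serializers.py
from rest_framework import serializers
from .models import Book

class BookSerializer(serializers.ModelSerializer):
    class Meta:
        model = Book
        fields = "__all__"

# books/views.py
from rest_framework import viewsets
from .models import Book
from .serializers import BookSerializer

class BookViewSet(viewsets.ModelViewSet):
    queryset = Book.objects.all()
    serializer_class = BookSerializer

# myapi/urls.py -- exposes the API at /api/books
from django.urls import include, path
from rest_framework.routers import DefaultRouter
from books.views import BookViewSet

router = DefaultRouter()
router.register("books", BookViewSet)

urlpatterns = [path("api/", include(router.urls))]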
Creating the Serverless service
To create a new Serverless service you can run the sls create command and pass a template as a parameter. You can see a list of template examples in the official sls repository. As I write this post, there isn't a Django template available in this list. Therefore, we are going to create a serverless.yml file in the root directory of our project and manually write the code we need in it.
service: django-serverless
frameworkVersion: '3'
useDotenv: true
provider:
  name: aws
  region: ${env:AWS_REGION_NAME}
  runtime: python3.9
  stage: ${opt:stage, 'stg'}
  timeout: 30
  memorySize: 2048
The code above defines the name of our service and some basic configuration (timeout, memory allocation, AWS region) for the Lambda function that will be created when we first deploy our application.
Serverless Plugins
A Serverless plugin is custom JavaScript code that extends the Serverless Framework with new features. To deploy our Django application we will need to install a few plugins, and you'll learn how to do that in the next sections.
Plugin: serverless-wsgi
Django’s primary deployment platform is WSGI (Web Server Gateway Interface), which is a specification that describes how a web server communicates with web applications.
API Gateway, which is going to receive the requests to our application and invoke the Lambda function, does not natively support WSGI-based applications. We need to convert API Gateway requests to and from standard WSGI requests.
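For context, a WSGI application is simply a callable that takes the request environment and a start_response function. Here is a minimal, framework-free sketch of that interface (not part of our project, just an illustration of the spec):

# A bare-bones WSGI application: a callable taking (environ, start_response)
def application(environ, start_response):
    body = b"Hello, WSGI!"
    status = "200 OK"
    headers = [("Content-Type", "text/plain"), ("Content-Length", str(len(body)))]
    start_response(status, headers)
    return [body]

The serverless-wsgi plugin's job is essentially to build an environ dict like this from each API Gateway event and translate the WSGI response back into something API Gateway understands.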
To accomplish this task, we'll install our first plugin: serverless-wsgi.
sls plugin install -n serverless-wsgi
After you run this command in the project's root directory via the terminal, you’ll notice that a new section was created within our serverless.yml file:
plugins:
  - serverless-wsgi
All plugins we install will be declared in this section.
Now we can define the creation of our API Gateway and Lambda function:
functions:
  api:
    handler: wsgi_handler.handler
    events:
      - http: ANY /
      - http: ANY /{proxy+}
The API Gateway will act as a transparent proxy, passing all requests it receives directly to the Lambda function. Our Django application will be responsible for matching them to the appropriate endpoint. The serverless-wsgi plugin requires the function to have wsgi_handler.handler set as the Lambda handler.
The plugin also requires some extra configuration, which goes under the custom section in serverless.yml:
custom:
  wsgi:
    app: myapi.wsgi.application
As I mentioned before, Django is configured for WSGI by default. The application callable, which the application server uses to communicate with our code, is created when you run Django's startproject command. By default, its path is set to <project_name>.wsgi.application.
Plugin: serverless-python-requirements
We need to bundle the dependencies our application requires to work properly. This plugin does exactly that and makes them available on the PYTHONPATH.
sls plugin install -n serverless-python-requirements
Again, the command will add the plugin to the plugins section of our serverless.yml file. That's all that's needed for basic use: Python dependencies specified in requirements.txt will be bundled when you run the command to deploy the app.
You might have noticed that the Django app I'm using for this example has a requirements folder with two files: dev.txt and prod.txt. In this case, we'll need to explicitly point out the file that will be read by the plugin. We do that by editing the custom section again:
custom:
  wsgi:
    app: myapi.wsgi.application
  pythonRequirements:
    fileName: requirements/prod.txt
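For reference, a dev/prod split like this typically looks something like the sketch below. The exact packages are illustrative assumptions; the real lists live in the example repository:

# requirements/prod.txt -- runtime dependencies (illustrative)
Django>=4.2
djangorestframework
django-storages
boto3

# requirements/dev.txt -- everything from prod plus local tooling (illustrative)
-r prod.txt
pytest
pytest-django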
Plugin: serverless-dotenv-plugin
This plugin will automatically load the variables stored in the .env file into the Lambda function as environment variables.
sls plugin install -n serverless-dotenv-plugin
Packaging
Serverless Framework packages up our code into a zip file. You can run sls package to build and save the deployment artifact in the service's .serverless/ directory.
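If you want to sanity-check what will actually be shipped, you can build the artifact locally and inspect it; the exact file names inside .serverless/ may differ from this sketch:

sls package
ls .serverless/
# e.g. django-serverless.zip, cloudformation-template-update-stack.json, ...
unzip -l .serverless/django-serverless.zip | head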
Some files and folders in our app's directory are not needed in the Lambda function. To prevent them from being bundled during the deploy, we'll use the configuration below:
package:
  exclude:
    - venv/**
    - __pycache__/*
    - node_modules/**
    - README.md
    - pytest.ini
    - conftest.py
    - .venv
    - .venv.example
    - package.json
    - package-lock.json
This is how our serverless.yml file looks at this point:
service: django-serverless
frameworkVersion: '3'
useDotenv: true

provider:
  name: aws
  region: ${env:AWS_REGION_NAME}
  runtime: python3.9
  stage: ${opt:stage, 'stg'}
  timeout: 30
  memorySize: 2048

functions:
  api:
    handler: wsgi_handler.handler
    events:
      - http: ANY /
      - http: ANY /{proxy+}

plugins:
  - serverless-wsgi
  - serverless-python-requirements
  - serverless-dotenv-plugin

custom:
  wsgi:
    app: myapi.wsgi.application
  pythonRequirements:
    fileName: requirements/prod.txt

package:
  exclude:
    - venv/**
    - __pycache__/*
    - node_modules/**
    - README.md
    - pytest.ini
    - conftest.py
    - .venv
    - .venv.example
    - package.json
    - package-lock.json
Deployment
To make our first deployment we need a way to authenticate with our AWS account. I’ll use an Access Key from an IAM User that I’ve created in my AWS account. You can learn more about this here.
Once you have the “secret” and “id” values of your credentials, create the following env vars in the terminal window where the deploy command will be run:
export AWS_ACCESS_KEY_ID="your_key_id_value"
export AWS_SECRET_ACCESS_KEY="your_secret_key_value"
Now you can run:
sls deploy
The deploy command bundles up the application and creates a CloudFormation stack that manages all the resources within the AWS account. If you’re familiar with CloudFormation, exploring the stack info will provide deeper insights into the underlying infrastructure.
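After the first deploy, you can also print the service information (stage, region, endpoints, function names) at any time without redeploying. The output below is only a sketch of what to expect; the actual values depend on your account and region:

sls info --stage stg
# service: django-serverless
# stage: stg
# region: us-east-1
# endpoints:
#   ANY - https://xxxxxxxxxx.execute-api.us-east-1.amazonaws.com/stg/{proxy+}
# functions:
#   api: django-serverless-stg-api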
The Lambda Function
Now that the Lambda function has been created and is triggered by API Gateway, we can access our API at the link provided by the deploy command.
The application is working, but it looks weird. Something is missing…
The static files
Websites generally need to serve additional files such as images, JavaScript, or CSS. In Django, we refer to these files as “static files”. Learn more.
There are a few different approaches we can employ to serve the static files. A commonly used tactic is to serve files from a cloud storage provider like AWS S3, and that's the method we will implement.
Let's configure Django to use a custom file storage backend to integrate with S3. The django-storages Python lib provides exactly what we need; you can install it using pip:
pip install django-storages
The Django app I'm using as an example already has this lib in its requirements file, and the changes required in settings.py are already implemented.
Then add the code below to the myapi/settings.py file:
STORAGES = {
    "default": {
        "BACKEND": "storages.backends.s3boto3.S3Boto3Storage",
    },
    "staticfiles": {
        "BACKEND": "storages.backends.s3boto3.S3Boto3Storage",
    },
}

AWS_STORAGE_BUCKET_NAME = os.environ.get("STATIC_FILES_BUCKET_NAME")
AWS_S3_REGION_NAME = os.environ.get("AWS_REGION_NAME")
AWS_QUERYSTRING_AUTH = False
Setting AWS_QUERYSTRING_AUTH to False removes query parameter authentication from generated URLs, which we don't need because we'll use a public bucket. The name of the bucket to which Django will send the static files and the AWS region will be read from environment variables. Add those values to your .env file:
STATIC_FILES_BUCKET_NAME=YOUR_BUCKET_NAME_HERE
AWS_REGION_NAME=us-east-1
DB_NAME=
DB_USER=postgres
DB_PASSWORD=
DB_HOST=
DB_PORT=5432
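The DB_* variables above are what the example app's settings.py uses to configure the database connection. A minimal sketch of that block, assuming a PostgreSQL backend (check the repository for the exact setup), looks like this:

# settings.py (sketch) -- assumes "import os" at the top of the file
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": os.environ.get("DB_NAME"),
        "USER": os.environ.get("DB_USER"),
        "PASSWORD": os.environ.get("DB_PASSWORD"),
        "HOST": os.environ.get("DB_HOST"),
        "PORT": os.environ.get("DB_PORT", "5432"),
    }
}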
The next step is to create the S3 bucket that will store and serve the static files. Using the Serverless Framework, we can define the AWS infrastructure resources we need and easily deploy them. Those resources can be defined in a property titled resources in serverless.yml. What goes in this property is raw CloudFormation template syntax in YAML. So let's create a public S3 bucket:
resources:
  Resources:
    StaticFilesBucket:
      Type: AWS::S3::Bucket
      Properties:
        BucketName: ${env:STATIC_FILES_BUCKET_NAME}
        PublicAccessBlockConfiguration:
          BlockPublicAcls: false
          BlockPublicPolicy: false
          IgnorePublicAcls: false
          RestrictPublicBuckets: false
    StaticFilesBucketPolicy:
      Type: AWS::S3::BucketPolicy
      Properties:
        Bucket:
          Ref: StaticFilesBucket
        PolicyDocument:
          Version: "2012-10-17"
          Statement:
            - Sid: PublicReadGetObject
              Action: "s3:GetObject"
              Effect: Allow
              Principal: "*"
              Resource: "arn:aws:s3:::${env:STATIC_FILES_BUCKET_NAME}/*"
The Lambda function needs permission to upload the static files to the bucket. This permission can be set via the IAM Role that the Serverless Framework automatically creates for our service. We can modify this Role to grant the code running in our function the permissions it needs. Add the code below under the provider property:
iam:
  role:
    statements:
      - Effect: Allow
        Action:
          - s3:*
        Resource:
          - arn:aws:s3:::${env:STATIC_FILES_BUCKET_NAME}
          - arn:aws:s3:::${env:STATIC_FILES_BUCKET_NAME}/*
This is the final version of our serverless.yml file:
service: django-serverless
frameworkVersion: '3'
useDotenv: true

provider:
  name: aws
  region: ${env:AWS_REGION_NAME}
  runtime: python3.9
  stage: ${opt:stage, 'stg'}
  timeout: 30
  memorySize: 2048
  iam:
    role:
      statements:
        - Effect: Allow
          Action:
            - s3:*
          Resource:
            - arn:aws:s3:::${env:STATIC_FILES_BUCKET_NAME}
            - arn:aws:s3:::${env:STATIC_FILES_BUCKET_NAME}/*

functions:
  api:
    handler: wsgi_handler.handler
    events:
      - http: ANY /
      - http: ANY /{proxy+}

plugins:
  - serverless-wsgi
  - serverless-python-requirements
  - serverless-dotenv-plugin

custom:
  wsgi:
    app: myapi.wsgi.application
  pythonRequirements:
    fileName: requirements/prod.txt

package:
  exclude:
    - venv/**
    - __pycache__/*
    - node_modules/**
    - README.md
    - pytest.ini
    - conftest.py
    - .venv
    - .venv.example
    - package.json
    - package-lock.json

resources:
  Resources:
    StaticFilesBucket:
      Type: AWS::S3::Bucket
      Properties:
        BucketName: ${env:STATIC_FILES_BUCKET_NAME}
        PublicAccessBlockConfiguration:
          BlockPublicAcls: false
          BlockPublicPolicy: false
          IgnorePublicAcls: false
          RestrictPublicBuckets: false
    StaticFilesBucketPolicy:
      Type: AWS::S3::BucketPolicy
      Properties:
        Bucket:
          Ref: StaticFilesBucket
        PolicyDocument:
          Version: "2012-10-17"
          Statement:
            - Sid: PublicReadGetObject
              Action: "s3:GetObject"
              Effect: Allow
              Principal: "*"
              Resource: "arn:aws:s3:::${env:STATIC_FILES_BUCKET_NAME}/*"
Run sls deploy again to update the stack. You can access the S3 page of your AWS account to verify that the bucket was created. If the deploy command was successful, you should see 2 buckets: the one we explicitly created and another implicitly generated by the Serverless Framework to store the deployment artifacts of our application.
Now we only need to run Django's collectstatic command to upload the static files to the bucket. Fortunately, the serverless-wsgi plugin already provides an easy way to execute Django management commands remotely: the wsgi manage command. Run it in your terminal window:
sls wsgi manage --command "collectstatic --noinput"
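If you want to confirm that the files actually landed in the bucket, the AWS CLI can list them; replace the bucket name with the value from your .env file:

# Lists the uploaded static files (bucket name is a placeholder)
aws s3 ls s3://YOUR_BUCKET_NAME_HERE/ --recursive | head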
Run the migrations:
sls wsgi manage --command "migrate"
You can access /api/books to interact with the app using DRF's Browsable API.
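If you prefer the command line, a couple of hypothetical curl calls against the API Gateway URL would look like the sketch below; replace the host with the endpoint printed by sls deploy, and note that the Book fields are assumptions about the example model:

# Create a book (URL and field names are illustrative)
curl -X POST "https://xxxxxxxxxx.execute-api.us-east-1.amazonaws.com/stg/api/books/" \
  -H "Content-Type: application/json" \
  -d '{"title": "Example Book", "author": "Jane Doe"}'

# List books
curl "https://xxxxxxxxxx.execute-api.us-east-1.amazonaws.com/stg/api/books/"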
Create a package.json file
It is important to have a package.json file to track the dependencies of our project. In our case, those dependencies are the Serverless plugins we installed. To create the file you can run npm init; the dependencies installed in node_modules will be read and used to create the file.
{
  "name": "django-serverless",
  "version": "1.0.0",
  "description": "",
  "dependencies": {
    "serverless-wsgi": "^3.0.3",
    "serverless-dotenv-plugin": "^6.0.0",
    "serverless-python-requirements": "^6.0.1"
  },
  "devDependencies": {},
  "repository": {
    "type": "git",
    "url": "git+https://github.com/joao-marcos/django-serverless.git"
  },
  "author": "João Marcos"
}
What’s Next?
In the next blog post we’ll build a pipeline using GitLab CI to automate deployments whenever new code is pushed to the repository. See you next time!