Deploying Django + Celery + Amazon SQS to AWS Elastic Beanstalk with Amazon Linux 2
Mahir Mahbub
Posted on October 25, 2022
In this tutorial, we will walk through the configuration needed to deploy Django + Celery + Amazon SQS to AWS Elastic Beanstalk with Amazon Linux 2.
I will assume here that you are already familiar with Django/DRF and have a running Django/DRF application. I will also assume that you know how to create an Elastic Beanstalk application with an Amazon Linux 2 image.
Let's dive into the configuration.
The first step in getting your application to run on Elastic Beanstalk is to create a ".ebextensions" folder in the root of the source.
Now create a file named "02_package.config" and add the code shown in the snippet below.
packages:
  yum:
    libcurl-devel: []
    openssl-static.x86_64: []
    python3-devel: []
    gcc: []
The openssl-static.x86_64, libcurl-devel, and gcc packages are required to build pycurl, which will be used to connect to Amazon SQS.
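For reference, a minimal requirements.txt for this setup might look like the following (an illustrative assumption, not the author's actual file; pin the versions your project really uses):

```text
Django
djangorestframework
celery[sqs]
pycurl
```

The celery[sqs] extra pulls in the SQS transport dependencies alongside Celery itself.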
To install the required Python packages and run database migrations, add the snippet below to a file named "04_db_migrate.config".
container_commands:
  01_install_requirements:
    command: "source /var/app/venv/*/bin/activate && pip3 install -r requirements.txt"
    leader_only: true
  02_migrate:
    command: "source /var/app/venv/*/bin/activate && python3 manage.py migrate"
    leader_only: true
Next, Elastic Beanstalk needs to be configured to find the Django settings and run the WSGI server. Create a file named "05_django.config" and add the code snippet below.
option_settings:
  aws:elasticbeanstalk:container:python:
    WSGIPath: config.wsgi:application
  aws:elasticbeanstalk:application:environment:
    DJANGO_SETTINGS_MODULE: config.settings
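The WSGIPath above assumes the Django project package is named config, i.e. a standard config/wsgi.py like the one Django generates (a sketch; swap in your own project package name):

```python
# config/wsgi.py -- the module WSGIPath points at
# (the package name "config" is an assumption; match your project)
import os

from django.core.wsgi import get_wsgi_application

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings")

application = get_wsgi_application()
```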
Now a pre-deployment action needs to be defined through a platform hook, a mechanism Elastic Beanstalk uses for controlled action execution. Create a file named "03_pycurl_reinstall.sh" under the ".platform/hooks/predeploy" folder. It will reinstall pycurl after the required yum packages mentioned above have been installed.
#!/usr/bin/env bash
source ${PYTHONPATH}/activate && ${PYTHONPATH}/pip3 install pycurl --global-option="--with-openssl" --upgrade
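One easy-to-miss detail: platform hook scripts must be executable, or Elastic Beanstalk will fail the deployment. Before packaging the source bundle, mark both hook scripts (this predeploy one and the postdeploy one created in the next step) accordingly:

```shell
# Run from the project root before zipping the source bundle
chmod +x .platform/hooks/predeploy/03_pycurl_reinstall.sh
chmod +x .platform/hooks/postdeploy/01_create_celery_service.sh
```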
Now create a file named "01_create_celery_service.sh" in the ".platform/hooks/postdeploy" folder and add the code snippet below. Celery will be configured as a systemd service through this file.
#!/usr/bin/env bash
echo "[Unit]
Description=Celery service for __
After=network.target
StartLimitInterval=0

[Service]
Type=simple
Restart=always
RestartSec=30
User=root
WorkingDirectory=/var/app/current
ExecStart=$PYTHONPATH/celery -A config worker --loglevel=INFO
EnvironmentFile=/opt/elasticbeanstalk/deployment/env

[Install]
WantedBy=multi-user.target
" | tee /etc/systemd/system/celery.service

# Reload systemd so it picks up the new unit file
systemctl daemon-reload
# Start celery service
systemctl start celery.service
# Enable celery service to load on system start
systemctl enable celery.service
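The -A config argument in ExecStart assumes a Celery app defined in the project package, e.g. a config/celery.py along these lines (a minimal sketch; the "config" package name and the CELERY_ settings namespace are assumptions to adjust to your project):

```python
# config/celery.py -- minimal Celery bootstrap for the worker started above
import os

from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings")

app = Celery("config")
# Read CELERY_*-prefixed settings (e.g. CELERY_BROKER_TRANSPORT_OPTIONS)
# from the Django settings module
app.config_from_object("django.conf:settings", namespace="CELERY")
# Discover tasks.py modules in all installed Django apps
app.autodiscover_tasks()
```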
Finally, the required AWS configuration, along with the Amazon SQS settings, needs to be added to the settings.py file under the config folder of the Django app.
# AWS config (config() is assumed to come from python-decouple;
# os.environ.get() works just as well)
AWS_ACCESS_KEY_ID = config("AWS_ACCESS_KEY_ID", "")
AWS_SECRET_ACCESS_KEY = config("AWS_SECRET_ACCESS_KEY", "")

# CELERY SETTINGS
BROKER_URL = f"sqs://{AWS_ACCESS_KEY_ID}:{AWS_SECRET_ACCESS_KEY}@"
BROKER_TRANSPORT_OPTIONS = {
    "region": "<region name>",  # example: ap-southeast-2
    "polling_interval": 60,
    "visibility_timeout": 3600,
    "queue_name_prefix": "<app-name>",
}
CELERY_DEFAULT_QUEUE = "sqs"
CELERY_ACCEPT_CONTENT = ["application/json"]
CELERY_TASK_SERIALIZER = "json"
CELERY_RESULT_SERIALIZER = "json"
CELERY_BROKER_TRANSPORT_OPTIONS = BROKER_TRANSPORT_OPTIONS
CELERY_TASK_DEFAULT_QUEUE = "sqs"
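One caveat worth noting: AWS secret keys can contain characters such as "/" or "+" that are not valid unescaped inside a URL, so interpolating them directly into the sqs:// broker URL can break parsing. A small helper using only the standard library can encode them first (kombu also ships a safequote utility for this; the function name below is my own):

```python
from urllib.parse import quote


def sqs_broker_url(access_key: str, secret_key: str) -> str:
    # Percent-encode the credentials so "/" and "+" survive URL parsing
    return f"sqs://{quote(access_key, safe='')}:{quote(secret_key, safe='')}@"


# Example with a dummy secret containing both problem characters
print(sqs_broker_url("AKIAEXAMPLE", "abc/def+ghi"))
# -> sqs://AKIAEXAMPLE:abc%2Fdef%2Bghi@
```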
Predefined queues do not work with polling, so they have been avoided here. Now zip the source code and upload it to Elastic Beanstalk through the AWS console, or deploy it with the command line interface. With that, we have successfully configured the application with Celery and SQS on Elastic Beanstalk with Amazon Linux 2. You can check the log files to verify that Celery is running, and a new queue will be created in the AWS SQS console.