Farhan Masud Aneek
Posted on November 13, 2023
When you are setting up a new Django project, you'll have to configure quite a lot of things depending on the requirements of the project. From database configuration to a custom user model, there is plenty to be done in between.
There are very useful packages for bootstrapping your Django projects in minutes, such as django-cookiecutter and djangox. If you are a seasoned developer, I'd highly recommend using one of those instead of what I'm going to show here. But if you are a beginner to intermediate Django developer who struggles with the project structure of those packages and wants to structure your own Django projects in a better way, I have created a lightweight setup that deals with the basics of setting up a Django project with PostgreSQL as the database and TailwindCSS as the styling library.
This Django starter kit automates creating the virtual environment, installing the Python packages and setting up the database with bash scripts. In addition to PostgreSQL and TailwindCSS, all the sensitive values are kept in a `.env` file using the django-environ package, and the Python dependencies are pinned and installed with pip-tools.
Static files are managed with whitenoise, Gunicorn is used as the WSGI server and Sentry is used for error monitoring and reporting in production.
Since the scripts are written in bash, they will only run on UNIX-based operating systems like Linux and macOS. But you can also run them on Windows with Git Bash or WSL.
Before getting started
Make sure you have PostgreSQL and Node.js + npm installed on your local machine.
Quick Setup
Here are the steps to set up this starter kit. Setting it up shouldn't take more than 5 minutes. I'm going to explain all the steps in detail, and how everything is configured, in the later sections so that you have an in-depth understanding of how this works under the hood (which will take way longer than 5 minutes, but I can assure you it's worth the time). By understanding the structure of the project from the later sections, you can create something of your own that suits your needs best.
- Clone the repo with your project name: `git clone git@github.com:farhanmasud/django-tailwind-starter-template.git your-project-name`
- Set up the `.env` file following the example env file `example.env`
- Make all bash scripts executable with `find . -type f -iname "*.sh" -exec chmod +x {} \;`
- Run the `bash update-git-remote.sh` script to remove the existing git remote link (of this repo) and replace it with your own git repo link. Copy your remote URL from GitHub and run the bash script; it'll prompt you to enter the new link. Paste your new link and hit Enter.
- Set up the database [requires step 2] with `bash setup-db.sh`
- Set up the venv and install dependencies with `bash setup-venv-pip-tools.sh`
- Activate the virtual environment with `source venv/bin/activate`
- Install the Tailwind dependencies with `python manage.py tailwind install`
- Run migrations with `python manage.py migrate`
- Collect static files with `python manage.py collectstatic --no-input`
Setup Explanation
1. Cloning the repository
First, clone this repository on your local machine with the `git clone git@github.com:farhanmasud/django-tailwind-starter-template.git your-project-name` command. If you are going to call this project `quick-django-project`, the command will be:
git clone git@github.com:farhanmasud/django-tailwind-starter-template.git quick-django-project
2. Updating the .env file
Navigate to this newly created `quick-django-project` directory, copy the `.example.env` file and paste it with the name `.env` in the same directory. Open the `.env` file and update the environment variables as follows.
Set `WORK_ENV=local`. We have different settings files for different environments such as production, testing, staging and local development. Setting this to `local` makes sure that only the packages needed for local development are installed and only the settings intended for the local development environment are loaded.
Generate a secret key for Django and update the `SECRET_KEY` variable. You can generate a Django secret key by following this question from StackOverflow or use this web UI. Let's say our generated secret key is `@6jc16lc(+mhjnkfk@^)46xc728u0j&(z)y!qkimo4b#ggnos7`; in our environment file, it'll be:
SECRET_KEY=@6jc16lc(+mhjnkfk@^)46xc728u0j&(z)y!qkimo4b#ggnos7
Set `ALLOWED_HOSTS=localhost,127.0.0.1`, which will allow us to access our Django project on the local machine with localhost or 127.0.0.1.
Grab the password for the `postgres` user on PostgreSQL and use it for the `PGPASSWORD` variable. If your password is `examplepostgrespassword1234`, set `PGPASSWORD=examplepostgrespassword1234`.
Set your database name, database user and database password as you prefer:
DB_NAME=django_project
DB_USER=django_user
DB_PASSWORD=exampledatabasepassword1234
And keep the following as is:
DB_HOST=localhost
DB_PORT=5432
Create an account on Sentry and create a project for Django. You can follow the steps in the Sentry documentation for Django to get your DSN and update the `SENTRY_DSN` variable. Enter the value without quotes here:
SENTRY_DSN=https://examplePublicKey@o0.ingest.sentry.io/0
Setting up the `.env` file is done. Now let's use these environment variables to set up our Django project.
3. Making the helper bash files executable and running the bash files
There are 4 bash scripts in this starter kit:
- The `setup-db.sh` script sets up the database for our Django project with PostgreSQL.
- The `setup-venv-pip-tools.sh` script sets up the virtual environment using `pip-tools` and installs the packages.
- `pip-tools` requires a few commands to compile and then install a new package; the `pip-install.sh` script runs these commands one by one.
- For updating the git origin, you can use `update-git-remote.sh`.
To make these bash scripts executable, open your terminal, navigate to the project directory and run `find . -type f -iname "*.sh" -exec chmod +x {} \;`. Now the bash scripts are ready to run, starting with `bash update-git-remote.sh`.
To start, create a new, empty repository on GitHub. Copy the git remote origin link and, on your terminal, inside your project directory, run `bash update-git-remote.sh`. This will prompt you to enter your git remote origin link; just paste it with `CTRL + Shift + V` or `CMD + Shift + V` and press Enter. Your project is now linked with your GitHub repository.
4. Setting up the Database
We're going to use PostgreSQL for the database here. We'll need to set up the PostgreSQL database and user, and tell Django the database name, user and user password so that Django can connect to the database. For both purposes we just update the `.env` file with the variables (we've already done this in step 2) and run the `setup-db.sh` bash script.
Similar to running the other bash scripts, open up the terminal, navigate to your project directory and run `bash setup-db.sh`, which will ask for the sudo password of your current user. Press Enter after entering the password and the script will set up the database.
5. Setting up the Virtual Environment
This repository has all the packages that you'll need to get up and running with PostgreSQL, Tailwind, environment variable management, static file management and so on. To install these packages in a virtual environment, this starter kit uses pip-tools. `pip-tools` requires a couple of commands to collect, compile and install the packages that you need, and we have a bash script to create our virtual environment and run all of those commands one by one.
Open your terminal, navigate to your project directory and run `bash setup-venv-pip-tools.sh`. This will ask whether you are in a development or a production environment. Enter `D` if you are in a development environment or `P` if you are in a production environment and press Enter on your keyboard. This will create a virtual environment called `venv`, install `pip-tools` and install all the packages required for your development or production environment.
6. Adding further packages
The existing packages in this repository are only for getting the Django project up and running. You'll definitely need a lot more packages while developing the project. The packages are kept within the `.in` files inside the `requirements` directory. You'll find three files in the directory:
- `requirements.in` for packages that will be used both in local and production, such as Django itself.
- `requirements-dev.in` for packages that will only be used in development, such as django-debug-toolbar.
- `requirements-prod.in` for packages that will only be used in production, such as Gunicorn.
When you need to add a new package, figure out where the package will be used and add the package there.
Then run the `pip-install.sh` bash script with the `bash pip-install.sh` command, and enter `D` if you are on your local environment or `P` if you are on your production environment. This will automatically compile the packages and their dependencies and install them in the virtual environment.
7. Installing Tailwind
Most of our installation is done but we haven't configured Tailwind yet. django-tailwind is already installed with the required Python packages, but we'll need to run its install step to fully set up Tailwind and make it ready to work in our local development environment.
To finish the Tailwind configuration, activate the virtual environment that we have created with `source venv/bin/activate` and run `python manage.py tailwind install` to complete the installation. You must have Node.js and npm installed for this to work.
You'll see a Django app in the project directory called `theme`, and this is the app for Tailwind. Inside it you'll find `static_src/tailwind.config.js`, where you can configure Tailwind as per your requirements. You'll also see a base template (`theme/templates/base.html`) which is provided by the `django-tailwind` package. You can use this or make your own following the conventions it shows for loading Tailwind: load the Tailwind template tags with `{% load tailwind_tags %}` at the top of your HTML file and use `{% tailwind_css %}` to load the CSS files into the head of your HTML.
8. Running Migrations
You are probably aware that Django recommends setting up a custom user model for your project before you run any migrations. This starter kit comes with a custom user model that uses `email` instead of the default `username` field. The migrations are already made in the `accounts` app, and there you'll find the custom user model `Account`.
Now we just have to apply these migrations to our database, along with all the default migrations that Django comes with. With the virtual environment activated, run the command `python manage.py migrate` to apply the migrations.
9. Managing Static files
We have some static files coming from the Tailwind package. In your development environment, these will be automatically available when you start the development server. But in production, you'll need to collect the static files using python manage.py collectstatic
command.
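As mentioned at the start, the collected files are served in production by whitenoise. As a rough sketch of how whitenoise is usually enabled in Django settings (the starter kit's actual settings may differ in detail), it looks like this:

```python
# Illustrative sketch of enabling whitenoise in the Django settings;
# the starter kit's own settings files may differ in detail.
MIDDLEWARE = [
    "django.middleware.security.SecurityMiddleware",
    # WhiteNoise sits right after SecurityMiddleware so it can serve static files.
    "whitenoise.middleware.WhiteNoiseMiddleware",
    # ... the rest of the middleware stack ...
]

STATIC_URL = "static/"
STATIC_ROOT = BASE_DIR / "staticfiles"  # where collectstatic puts the files (BASE_DIR comes from the settings module)

# Compressed, cache-busting storage backend provided by whitenoise.
STATICFILES_STORAGE = "whitenoise.storage.CompressedManifestStaticFilesStorage"
```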
10. Run the development server!
Okay, we are done setting up the Django project! With the virtual environment activated, run `python manage.py runserver` and the development server will run on the default port 8000 on your localhost. If you go to `localhost:8000` in your browser, you'll see a `404 Not Found` error. But don't worry, this is intentional. This starter kit doesn't come with any extra app or template that you won't need later. Just create your app from here, configure your URLs and start building models and views as you want.
Structure of the project
Splitting the settings.py file
When you create a new Django project, you'll find a `settings.py` file inside your project directory that holds all the settings related to your Django project. In this starter kit, however, you'll see that there is a directory called `config` and inside it a directory called `settings`. Inside this directory you'll find multiple settings files: `base.py`, `local.py`, `test.py`, `staging.py`, `production.py` and `engage.py`. The `base.py` settings file holds all the settings that apply no matter which environment your project is running in. The `local.py`, `test.py`, `staging.py` and `production.py` files contain the settings specific to the local, testing, staging and production environments. Finally, the `engage.py` file dynamically loads the settings from `local.py`, `test.py`, `staging.py` or `production.py` based on what you have provided in the `WORK_ENV` variable in the `.env` file and combines those settings with `base.py`.
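For illustration, a dynamic loader in the style of `engage.py` can be as small as the sketch below. This is a generic example of the pattern described above, not the exact contents of the repo's `engage.py`; it assumes that `base.py` has already read the `.env` file (as shown later), which also makes `WORK_ENV` available through `os.environ`.

```python
# config/settings/engage.py -- illustrative sketch, not the repo's exact file.
import os

# Shared settings for every environment; base.py is assumed to read the .env
# file, which puts WORK_ENV into os.environ for us to branch on.
from .base import *  # noqa: F401,F403

WORK_ENV = os.environ.get("WORK_ENV", "local")

# Overlay the environment-specific settings on top of base.py.
if WORK_ENV == "production":
    from .production import *  # noqa: F401,F403
elif WORK_ENV == "staging":
    from .staging import *  # noqa: F401,F403
elif WORK_ENV == "test":
    from .test import *  # noqa: F401,F403
else:
    from .local import *  # noqa: F401,F403
```

With a layout like this, `manage.py` and the WSGI entry point would point `DJANGO_SETTINGS_MODULE` at `config.settings.engage` (again, an assumption based on the structure described here).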
The reason behind this split is that we can safely use packages and their related settings only where we need them. For example, this starter kit includes django-debug-toolbar, which is only intended for your development environment and not for production. Using it in production would be very risky, because if your Django project encounters errors, all the debug info will be shown to the user, which is a severe security risk. Similarly, for tracking errors in production we're using Sentry, which is not needed in our local environment since we already have `django-debug-toolbar`. Splitting the settings by environment keeps these settings separate so that they don't conflict with each other.
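For reference, Sentry is usually initialised in the production settings roughly as follows. This is a hedged sketch using the standard `sentry-sdk` Django integration and the `SENTRY_DSN` value from the `.env` file; the repo's `production.py` may look different.

```python
# In the production settings (e.g. config/settings/production.py) -- illustrative sketch.
import environ
import sentry_sdk
from sentry_sdk.integrations.django import DjangoIntegration

env = environ.Env()  # SENTRY_DSN comes from the .env file loaded in base.py

sentry_sdk.init(
    dsn=env("SENTRY_DSN"),
    integrations=[DjangoIntegration()],
    # Sample a small share of performance traces; tune or remove as needed.
    traces_sample_rate=0.1,
)
```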
Keeping sensitive data out of the codebase (and version control system)
Data like `SECRET_KEY` or your database connection details are sensitive and can cause serious security issues if they are not protected. If you keep these (along with other sensitive data like API keys) in your `settings.py` file and commit them to your version control system such as Git, your project can be at risk in production.
To protect this data, it's best to keep it in a separate `.env` file and keep that file out of the version control system. In this starter kit, we are using django-environ to achieve this. For example, you can see in `config/settings/base.py` that we are reading the `SECRET_KEY` with `django-environ` as `SECRET_KEY = env("SECRET_KEY")`, where we have a variable named `SECRET_KEY` in our `.env` file. Obviously, we have to import `django-environ` and the Python `os` package with
import environ
import os
and configure it with
env = environ.Env()
env.read_env(os.path.join(BASE_DIR, ".env"))
Make sure that your `.env` file lives at the root of the project directory and you are good to go.
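The same pattern extends to the other values we put in the `.env` file in step 2. As a hedged sketch (the repo's `base.py` may organise this differently), the settings can consume those variables like so:

```python
# Illustrative sketch of consuming the .env values in the settings;
# assumes env = environ.Env() has been configured as shown above.

# ALLOWED_HOSTS=localhost,127.0.0.1 becomes a Python list.
ALLOWED_HOSTS = env.list("ALLOWED_HOSTS", default=[])

# DB_NAME, DB_USER, DB_PASSWORD, DB_HOST and DB_PORT feed the PostgreSQL connection.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": env("DB_NAME"),
        "USER": env("DB_USER"),
        "PASSWORD": env("DB_PASSWORD"),
        "HOST": env("DB_HOST", default="localhost"),
        "PORT": env("DB_PORT", default="5432"),
    }
}
```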
Using pip-tools
When you are working on a Python project, it's best to pin your Python packages in a separate `requirements.txt` file with exact versions and install them in a virtual environment. Python venv is probably the most popular choice due to its very easy setup process. Keeping required packages pinned with versions helps in many ways. For example, maybe you were working on a project 2 years back and you were using a package that was at version 1.0.0 at that time. Now, after 2 years, that package might have been updated to version 3.0.0. If you install version 3.0.0 and try to run your project, there is a high chance it won't work because the package has changed.
Similarly, if you have that old project that used the package at version 1.0.0, you installed the package globally, and you then start a new project with the same package at version 3.0.0, only one of them will work. If you install version 3.0.0, the one that uses version 1.0.0 won't work, and vice versa.
Python virtual environments solve this issue. You can create virtual environments with `venv` and keep a simple `requirements.txt` file listing the packages and versions your project requires. You can then install them in the virtual environment independently, so that they don't affect other projects.
We still use `venv` for the virtual environment here, but on top of it this starter kit uses `pip-tools`, which takes dependency management further. Check out what pip-tools does in its official GitHub repo. In short, it helps your project find the best match for the dependent packages. For example, you might need two packages `A` and `B` in your project that both require the same package `C` under the hood. But `A` requires any version of `C` from 1.0.1 to 1.0.10, and `B` requires any version of `C` from 1.0.7 to 1.0.15. pip-tools will automatically compile a version of `C` that suits both of your packages.
To keep all the requirements, you'll find a `requirements` directory in the project. There will be 3 files (two more appear after they are compiled, but we don't keep those in version control): `requirements.in`, `requirements-dev.in` and `requirements-prod.in`. The idea here is similar to how we are splitting the `settings.py` file: we are only keeping the packages that are relevant to the related environment. So when you set up the virtual environment with `bash setup-venv-pip-tools.sh` or install the packages with `bash pip-install.sh`, you are prompted to choose between the development and production environments, and with that the bash script compiles the packages with `pip-tools` from the `requirements.in` file as the base plus `requirements-dev.in` or `requirements-prod.in` as per your choice.
Custom User Model
As Django recommends, we are using a custom user model, which you can find as the `Account` model in `accounts/models.py`. Here we are using the user's email field as the unique identifier instead of the default username, and we are using a custom manager for managing the objects. In `accounts/admin.py`, the `Account` model is made compatible with the Django admin so that you can add users from the Django admin.
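If you want to see the shape of this pattern without opening the repo, an email-based user model with a custom manager generally looks like the sketch below. This is a generic illustration, not a copy of the repo's `Account` model, and the exact field list is an assumption.

```python
# accounts/models.py -- generic sketch of an email-based user model.
# The starter kit's actual Account model and manager may differ in detail.
from django.contrib.auth.base_user import BaseUserManager
from django.contrib.auth.models import AbstractBaseUser, PermissionsMixin
from django.db import models


class AccountManager(BaseUserManager):
    """Create users keyed by email instead of username."""

    def create_user(self, email, password=None, **extra_fields):
        if not email:
            raise ValueError("An email address is required")
        user = self.model(email=self.normalize_email(email), **extra_fields)
        user.set_password(password)
        user.save(using=self._db)
        return user

    def create_superuser(self, email, password=None, **extra_fields):
        extra_fields.setdefault("is_staff", True)
        extra_fields.setdefault("is_superuser", True)
        return self.create_user(email, password, **extra_fields)


class Account(AbstractBaseUser, PermissionsMixin):
    email = models.EmailField(unique=True)
    is_active = models.BooleanField(default=True)
    is_staff = models.BooleanField(default=False)
    date_joined = models.DateTimeField(auto_now_add=True)

    objects = AccountManager()

    USERNAME_FIELD = "email"  # log in with email instead of username
    REQUIRED_FIELDS = []      # email and password are enough for createsuperuser

    def __str__(self):
        return self.email
```

With a model like this, the settings point at it via `AUTH_USER_MODEL = "accounts.Account"` (assuming the app label is `accounts`, as the directory name suggests).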
Finally, Bash Scripts
What we have done to set up the Django project with this starter kit is update the `.env` file and run some bash scripts. Bash is a fun scripting tool for UNIX-based systems (although it can vary a bit between different UNIX-based systems). All the scripts do is tell the computer to run a series of related commands one after another. Bash scripting is a lot of fun (and frustration), as you can automate many of your workflows with it. If you are interested, you can start learning bash scripting from here.
Well, that's pretty much all. This ended up longer than I had expected, even after skipping some details I initially wanted to include, but I hope it gives you a good insight into structuring a Django project. If you examine the repository on GitHub, you can get a better idea of what's actually going on, which is really quite simple. From there you can explore django-cookiecutter and djangox to see what suits your needs best, and you might even create something that works best for you.