This time I'm going to show you how to get started with FastAPI and Docker. Our demo project will be a simple API that lets the user create and save notes to a Postgres database.
FastAPI is a modern, fast (high-performance) web framework for building APIs with Python 3.7+ based on standard Python type hints.
It is a relatively new framework that is gaining popularity in the Python community (over 50k stars on GitHub). It is built on top of Starlette (a lightweight ASGI framework/toolkit, ideal for building high-performance asyncio services) and Pydantic. It is fast, easy to use, and easy to learn, making it a great choice for building APIs. It also has built-in support for OpenAPI and Swagger.
As the name implies, FastAPI is fast. It is one of the fastest Python frameworks available: using Uvicorn as the ASGI server, it is capable of handling over 10,000 requests per second.
Audience and Objectives
This tutorial is for anyone, from beginner to intermediate developer, who wants to get started with FastAPI and Docker.
By the end of this tutorial, you will be able to:
Create a Fast-Api project.
Run the project locally.
Connect to a Postgres database to perform CRUD operations.
Dockerize the project.
Commit the code and Push to GitHub.
Use GitHub Actions as our CI/CD pipeline to test the application and build the Docker image and container.
Interact with the API via the browser or third-party tools like Postman, Insomnia, etc.
Optionally create and connect a Vue frontend to the API.
Prerequisites
The main prerequisite for this tutorial is a basic understanding of Python and Docker.
The code is a simple hello-world example. It imports the FastAPI class, creates an instance of it, and defines a route that returns a simple JSON object.
We are using Uvicorn as the ASGI server. It is a lightning-fast ASGI server implementation, using uvloop and httptools.
We are also using the --reload flag to enable hot reloading. This means that whenever we make a change to our code, the server will automatically restart. The --workers 1 flag is used to specify the number of worker processes. The --host and --port flags are used to specify the host and port to run the server on.
Open your browser and navigate to http://localhost:8002/. You should see the following:
Connect to a Postgres Database
Inside the src folder, create a new file called db.py and add the following code.
# src/db.py
import os
from sqlalchemy import (
    Column, Integer, String, Table, create_engine, MetaData
)
from dotenv import load_dotenv
from databases import Database
from datetime import datetime as dt
from pytz import timezone as tz

load_dotenv()

# Database url; if none is passed the default one is used
DATABASE_URL = os.getenv(
    "DATABASE_URL",
    "postgresql://hello_fastapi:hello_fastapi@localhost/hello_fastapi_dev",
)

# SQLAlchemy
engine = create_engine(DATABASE_URL)
metadata = MetaData()
notes = Table(
    "notes",
    metadata,
    Column("id", Integer, primary_key=True),
    Column("title", String(50)),
    Column("description", String(50)),
    Column("completed", String(8), default="False"),
    Column(
        "created_date",
        String(50),
        default=dt.now(tz("Africa/Nairobi")).strftime("%Y-%m-%d %H:%M"),
    ),
)

# Databases query builder
database = Database(DATABASE_URL)
In the code, we are using SQLAlchemy as our ORM (Object Relational Mapper) and Databases as our query builder.
Let's walk through what the code does:
We are importing the required libraries.
from sqlalchemy we are importing the Column, Integer, String, Table, create_engine and MetaData classes. These are important classes that we will be using to create our database schema and perform CRUD operations.
We are also using the dotenv library to load environment variables from a .env file. This is good security practice, especially in a production environment. We also provide a default database URL, which is used when running in a test environment or when the .env file is not found.
We are creating a table called notes with the following columns: id, title, description, completed and created_date. The id column is the primary key and the created_date column is set to the current date and time.
We are also creating a database instance using the Database class from the databases library. This instance will be used to perform CRUD operations.
Create a CRUD API
Inside the src folder, create a new folder called api and inside it create a new file called models.py and add the following code.
# src/api/models.py
from pydantic import BaseModel, Field, NonNegativeInt
from datetime import datetime as dt
from pytz import timezone as tz


class NoteSchema(BaseModel):
    # additional validation for the inputs
    title: str = Field(..., min_length=3, max_length=50)
    description: str = Field(..., min_length=3, max_length=50)
    completed: str = "False"
    created_date: str = dt.now(tz("Africa/Nairobi")).strftime("%Y-%m-%d %H:%M")


class NoteDB(NoteSchema):
    id: int
In the code above, we are creating two classes. The NoteSchema class is used to validate the data that is sent to the API. The NoteDB class extends it with the id field and is used to validate the data that is returned from the database; the id is not sent to the API because it is generated by the database.
Here is a list of the fields that we are using:
title - The title of the note. It is a required field and it must be between 3 and 50 characters.
description - The description of the note. It is a required field and it must be between 3 and 50 characters.
completed - The status of the note. It must be either True or False, and defaults to False.
created_date - The date and time when the note was created, in the format YYYY-MM-DD HH:MM. It defaults to the current time.
Using Pydantic, we can add further validation to the fields. For example, we can add a regex to the title or description field to ensure that it only contains letters and numbers.
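As a sketch, here is a stricter variant of the title field with such a regex constraint. The class name and pattern are illustrative; note that Pydantic v1 takes the constraint as regex= while v2 renamed it to pattern=, which the snippet handles explicitly.

```python
from pydantic import BaseModel, Field, ValidationError
import pydantic

# Pydantic v1 uses `regex=`; Pydantic v2 renamed it to `pattern=`
_kw = "regex" if pydantic.VERSION.startswith("1") else "pattern"


class StrictNoteSchema(BaseModel):
    # only letters, digits and spaces are allowed in the title
    title: str = Field(..., min_length=3, max_length=50, **{_kw: r"^[A-Za-z0-9 ]+$"})
```

With this in place, StrictNoteSchema(title="Note 1") validates, while a title containing punctuation raises a ValidationError.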
Inside the src/api folder, create a new file called crud.py and add the following code.
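A sketch of crud.py consistent with the five functions described below, using the notes table and database instance from db.py. The import paths assume the package is named app, matching the imports used in notes.py later in this tutorial.

```python
# src/api/crud.py (sketch; import paths are assumptions)
from app.api.models import NoteSchema
from app.db import database, notes


async def post(payload: NoteSchema):
    # insert a new note and return its generated id
    query = notes.insert().values(
        title=payload.title,
        description=payload.description,
        completed=payload.completed,
    )
    return await database.execute(query)


async def get(id: int):
    # fetch a single note by its id
    return await database.fetch_one(notes.select().where(id == notes.c.id))


async def get_all():
    # fetch all notes
    return await database.fetch_all(notes.select())


async def put(id: int, payload: NoteSchema):
    # update a note and return its id
    query = (
        notes.update()
        .where(id == notes.c.id)
        .values(
            title=payload.title,
            description=payload.description,
            completed=payload.completed,
        )
        .returning(notes.c.id)
    )
    return await database.execute(query)


async def delete(id: int):
    # remove a note by its id
    return await database.execute(notes.delete().where(id == notes.c.id))
```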
In the code above, we are creating five functions that will be used to perform CRUD operations. The post function is used to create a new note. The get function is used to get a note by its id. The get_all function is used to get all the notes. The put function is used to update a note. The delete function is used to delete a note.
Unlike the normal way of defining Python functions with the def keyword, we define these functions with the async keyword. This uses a core feature of FastAPI: it is asynchronous, so the functions are executed asynchronously, which allows the application to handle multiple requests at the same time.
Routing and API Endpoints
Inside the src/api folder, create a new file called notes.py and add the following code.
# src/api/notes.py
from app.api import crud
from app.api.models import NoteDB, NoteSchema
from fastapi import APIRouter, HTTPException, Path
from typing import List
from datetime import datetime as dt

router = APIRouter()


@router.post("/", response_model=NoteDB, status_code=201)
async def create_note(payload: NoteSchema):
    note_id = await crud.post(payload)
    created_date = dt.now().strftime("%Y-%m-%d %H:%M")
    response_object = {
        "id": note_id,
        "title": payload.title,
        "description": payload.description,
        "completed": payload.completed,
        "created_date": created_date,
    }
    return response_object


@router.get("/{id}/", response_model=NoteDB)
async def read_note(id: int = Path(..., gt=0)):
    note = await crud.get(id)
    if not note:
        raise HTTPException(status_code=404, detail="Note not found")
    return note


@router.get("/", response_model=List[NoteDB])
async def read_all_notes():
    return await crud.get_all()


@router.put("/{id}/", response_model=NoteDB)
async def update_note(payload: NoteSchema, id: int = Path(..., gt=0)):
    # Path(..., gt=0) ensures the input is greater than 0
    note = await crud.get(id)
    if not note:
        raise HTTPException(status_code=404, detail="Note not found")
    note_id = await crud.put(id, payload)
    response_object = {
        "id": note_id,
        "title": payload.title,
        "description": payload.description,
        "completed": payload.completed,
    }
    return response_object


# DELETE route
@router.delete("/{id}/", response_model=NoteDB)
async def delete_note(id: int = Path(..., gt=0)):
    note = await crud.get(id)
    if not note:
        raise HTTPException(status_code=404, detail="Note not found")
    await crud.delete(id)
    return note
In the code above, we are creating five functions that handle the requests: create_note handles the POST request, read_note and read_all_notes handle the GET requests, update_note handles the PUT request, and delete_note handles the DELETE request.
The create_note function takes a payload of type NoteSchema and returns a response of type NoteDB. The read_note function takes an id of type int and returns a NoteDB. The read_all_notes function returns a List[NoteDB]. The update_note function takes a NoteSchema payload and an int id and returns a NoteDB. The delete_note function takes an int id and returns the deleted NoteDB.
Each route is defined with the corresponding decorator: @router.post for create_note, @router.get for read_note and read_all_notes, @router.put for update_note, and @router.delete for delete_note.
Main File
Inside the src folder, create a new file called main.py and add the following code.
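A sketch of main.py matching the description below. It assumes a ping router exists alongside the notes router, and uses the same app package paths as notes.py.

```python
# src/main.py (sketch; the "ping" router and package paths are assumptions)
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

from app.api import notes, ping
from app.db import database

app = FastAPI()

# allow cross-origin requests so the frontend can reach the API
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_methods=["*"],
    allow_headers=["*"],
)


@app.on_event("startup")
async def startup():
    # connect to the database when the app starts
    await database.connect()


@app.on_event("shutdown")
async def shutdown():
    # disconnect from the database when the app shuts down
    await database.disconnect()


# all routes in the notes router are prefixed with /notes; ping has no prefix
app.include_router(notes.router, prefix="/notes", tags=["notes"])
app.include_router(ping.router)
```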
In the code above, we are creating a new FastAPI application and adding a middleware to allow cross-origin resource sharing (CORS), so that the frontend can make requests to the backend. We are also adding the notes and ping routers to the application.
On app startup we connect to the database, and on app shutdown we disconnect from it. The notes router is added with the prefix /notes, meaning all of its routes are prefixed with /notes; the ping router is added without a prefix.
One Database to Rule Them All
In this section, we will create a PostgreSQL database to store the notes, run it with Docker, and use Docker Compose to run the database and the application together.
Docker File
Inside the src folder, create a new file called Dockerfile and add the following code.
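A sketch of the Dockerfile following the steps described below. The base image, Python version, and port are assumptions; adjust them to your setup.

```dockerfile
# src/Dockerfile (sketch; base image and port are assumptions)
FROM python:3.9-slim

# set the working directory
WORKDIR /app

# install the dependencies first to take advantage of Docker layer caching
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# copy the rest of the source code
COPY . .

# run the server when the container starts
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8002"]
```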
In the code above, we are creating a new Docker image: we set the working directory to /app, copy the requirements.txt file into it, install the dependencies, copy all the files in the current directory into the working directory, and set the command to run when the container is started.
Feel free to change the port number in the command to any port number you want.
Docker Compose File
Inside the root folder, create a new file called docker-compose.yml and add the following code.
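A sketch of the compose file with the two services described below. The service names, ports, and credentials are assumptions; the credentials mirror the default DATABASE_URL from db.py, with the host changed to the db service name.

```yaml
# docker-compose.yml (sketch; names, ports and credentials are assumptions)
version: "3.8"

services:
  app:
    build: ./src
    ports:
      - "8002:8002"
    environment:
      # the host "db" resolves to the database service below
      - DATABASE_URL=postgresql://hello_fastapi:hello_fastapi@db/hello_fastapi_dev
    depends_on:
      - db

  db:
    image: postgres:12.1-alpine
    environment:
      - POSTGRES_USER=hello_fastapi
      - POSTGRES_PASSWORD=hello_fastapi
      - POSTGRES_DB=hello_fastapi_dev
```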
In the code above, we are creating two services. The db service runs the database and the app service runs the application. The app service depends on the db service, which means the app service will not start until the db service is running.
Running the Application
To run the application locally, run the following command.
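Assuming the docker-compose setup described above, one way to build the images and start both services in the background is:

```shell
# build the images and start the containers detached
docker-compose up -d --build
```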
The application should be running on port 8002. You can test the application by making requests to the endpoints. You can use Postman/Insomnia to make requests to the endpoints. You can also use the frontend to make requests to the endpoints.
Screenshot of the application running locally.
Testing the API with Thunder Client
Personally, I am using Visual Studio Code as my editor with the Thunder Client extension installed. This allows me to make requests to the endpoints from within the editor. You can install the extension and do the same. Examples are shown below.
CI/CD using GitHub Actions
In this section, we will set up CI/CD using GitHub Actions. We will use GitHub Actions to test the application; this is important because it lets us be sure that proposed changes don't break the API.
We will also implement a workflow to test and build the Docker images for the application and the database, so we can be sure both are working as expected.
Test Workflow
Inside the .github/workflows folder, create a new file called pythonapp.yml and add the following code.
name: Python Application Test

on:
  push:
    branches: [main]
  pull_request:
    branches: [main]

jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: [3.6, 3.7, 3.8, 3.9]
    steps:
      - uses: actions/checkout@v2
      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v2
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -r requirements.txt
      - name: Lint with flake8
        run: |
          pip install flake8
          # stop the build if there are Python syntax errors or undefined names
          flake8 . --count --select=E9,F63,F7,F82 --show-source --statistics
          # exit-zero treats all errors as warnings. The GitHub editor is 127 chars wide
          flake8 . --count --exit-zero --max-complexity=10 --max-line-length=127 --statistics
      - name: Setup PostgreSQL
        uses: Harmon758/postgresql-action@v1.0.0
        with:
          # Version of PostgreSQL to use
          postgresql version: 12.1-alpine
          # POSTGRES_DB - name for the default database that is created
          postgresql db: hello_fastapi_dev
          # POSTGRES_USER - create the specified user with superuser power
          postgresql user: hello_fastapi
          # POSTGRES_PASSWORD - superuser password
          postgresql password: hello_fastapi
      - name: Test with pytest
        run: |
          pip install pytest
          pytest .
In the code above, we are creating a workflow named Python Application Test that runs whenever a push or pull request is made to the main branch. The build job runs on the latest version of Ubuntu across multiple Python versions; it checks out the code, sets up Python, installs the dependencies, lints the code with flake8, sets up PostgreSQL, and runs the tests with pytest.
Build Workflow
Inside the .github/workflows folder, create a new file called docker-image.yml and add the following code.
# .github/workflows/docker-image.yml
# This is a basic workflow to help you get started with Actions
name: Docker Compose Actions Workflow

on:
  push:
    branches: ["main"]
  pull_request:
    branches: ["main"]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Build the Docker image
        run: docker-compose build --no-cache --force-rm

  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Build the stack
        run: docker-compose up -d
In the code above, we are creating a workflow named Docker Compose Actions Workflow that runs whenever a push or pull request is made to the main branch. The build job runs on the latest version of Ubuntu, checks out the code, and builds the Docker images for the application and the database; the test job brings the whole stack up with docker-compose.
Running the Workflows
To run the workflows, push the code to the main branch. The workflows will run automatically. You can check the status of the workflows by going to the Actions tab on GitHub.
Conclusion
In this article, we have built a simple CRUD API using FastAPI backed by a PostgreSQL database, set up CI/CD using GitHub Actions, and written a docker-compose file to run the application locally.
A simple asynchronous API implemented with the FastAPI framework, using Postgres as the database and SQLAlchemy as the ORM, with GitHub Actions as the CI/CD pipeline.
FastAPI Example App
This repository contains code for an asynchronous example API using the FastAPI framework, the Uvicorn server, and a Postgres database to perform CRUD operations on notes.
Ensure you have a Postgres database running locally.
Additionally, create a fast_api_dev database with a user **fast_api** having the required privileges.
OR
Change the DATABASE_URL variable in the .env file inside the app folder to…