Deploy Next.js and NestJS as a single application
Ivan Alekseev
Posted on September 9, 2024
Hey there! I’m excited to share how you can configure Next.js and NestJS to run seamlessly on a single host. But first, let me explain why this setup has been my go-to for managing both the frontend and the backend for so long.
Next.js is a powerhouse when it comes to kickstarting new projects. It comes packed with features like built-in routing, server-side rendering (SSR), and caching that help you hit the ground running. Plus, Next.js has its own internal API capabilities, letting you manage tasks like caching and data prep right within the framework. This means you can focus more on building your app and less on setting up the infrastructure.
But sometimes you need something more powerful on the server. That’s where NestJS steps in. This framework is capable of handling not just the middleware duties between your backend and frontend, but can also act as a robust backend solution all on its own. That makes NestJS a natural companion to Next.js here: you get a full-featured backend while keeping a single programming language across the whole stack.
Why a single host?
Simply put, it’s incredibly convenient. With just a git pull and a docker-compose up -d, you’re ready to go. There is no need to worry about CORS or juggling ports, and it streamlines the delivery process, making everything run more smoothly and efficiently. The main drawback is that this setup doesn’t suit large, high-load projects.
1. First, let's define the folder structure of your repository
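Here is a plain-text sketch of the layout the rest of the article assumes, reconstructed from the file paths used below:

.
├── backend/                        # NestJS app
│   ├── src/
│   │   └── main.ts
│   └── Dockerfile
├── frontend/                       # Next.js app
│   └── Dockerfile
├── docker/
│   ├── nginx/conf.d/default.conf   # Nginx reverse-proxy config
│   └── postgres/                   # Postgres data volume
├── .github/workflows/deploy.yml
├── docker-compose.yml
└── docker-compose.dev.yml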
2. Let's declare a Docker Compose file for the server
File: ./docker-compose.yml
services:
  nginx:
    image: nginx:alpine
    ports:
      - "80:80"
    volumes:
      - "./docker/nginx/conf.d:/etc/nginx/conf.d"
    depends_on:
      - frontend
      - backend
    networks:
      - internal-network
      - external-network
  frontend:
    image: ${FRONTEND_IMAGE}
    restart: always
    networks:
      - internal-network
  backend:
    image: ${BACKEND_IMAGE}
    environment:
      NODE_ENV: ${NODE_ENV}
      POSTGRES_HOST: ${POSTGRES_HOST}
      POSTGRES_USER: ${POSTGRES_USER}
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
      POSTGRES_DB: ${POSTGRES_DB}
    depends_on:
      - postgres
    restart: always
    networks:
      - internal-network
  postgres:
    image: postgres:12.1-alpine
    container_name: postgres
    volumes:
      - "./docker/postgres:/var/lib/postgresql/data"
    environment:
      POSTGRES_USER: ${POSTGRES_USER}
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
      POSTGRES_DB: ${POSTGRES_DB}
    ports:
      - "5432:5432"
    networks:
      # join the internal network so the backend can reach it by host name
      - internal-network
networks:
  internal-network:
    driver: bridge
  external-network:
    driver: bridge
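The compose file expects a handful of variables. A minimal .env sketch might look like this (the image names and credentials are placeholders, not values from the article):

File: ./.env
FRONTEND_IMAGE=your-dockerhub-user/myapp-frontend:latest
BACKEND_IMAGE=your-dockerhub-user/myapp-backend:latest
NODE_ENV=production
POSTGRES_HOST=postgres
POSTGRES_USER=postgres
POSTGRES_PASSWORD=change-me
POSTGRES_DB=postgres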
3. Another Compose file for development mode
For development mode, we don’t need containers for the backend and frontend, because we will run them locally.
File: ./docker-compose.dev.yml
version: '3'
services:
  nginx:
    image: nginx:alpine
    ports:
      - "80:80"
    volumes:
      - "./docker/nginx/conf.d:/etc/nginx/conf.d"
  postgres:
    image: postgres:12.1-alpine
    container_name: postgres
    volumes:
      - "./docker/postgres:/var/lib/postgresql/data"
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
      POSTGRES_DB: postgres
    ports:
      - "5432:5432"
4. Dockerfile for the backend
File: ./backend/Dockerfile
# Stage 1: install dependencies
FROM node:18-alpine AS deps
RUN apk add --no-cache libc6-compat
WORKDIR /app
COPY package.json package-lock.json ./
RUN npm install
# Stage 2: compile the NestJS app
FROM node:18-alpine AS builder
WORKDIR /app
COPY --from=deps /app/node_modules ./node_modules
COPY . .
ENV NEXT_TELEMETRY_DISABLED 1
RUN npm run build
# Stage 3: minimal production runtime
FROM node:18-alpine AS runner
WORKDIR /app
ENV NODE_ENV production
ENV NEXT_TELEMETRY_DISABLED 1
RUN addgroup --system --gid 1001 nodejs
RUN adduser --system --uid 1001 nextjs
COPY --from=builder --chown=nextjs:nodejs /app/dist ./dist
COPY --from=builder /app/node_modules ./node_modules
COPY --from=builder /app/package.json ./package.json
RUN mkdir -p /app/backups && chown -R nextjs:nodejs /app/backups && chmod -R 777 /app/backups
USER nextjs
EXPOSE 3010
ENV PORT 3010
CMD ["node", "dist/src/main"]
5. Dockerfile for the frontend
File: ./frontend/Dockerfile
# Stage 1: install dependencies
FROM node:18-alpine AS deps
RUN apk add --no-cache libc6-compat
WORKDIR /app
COPY package.json package-lock.json ./
RUN npm install
# Stage 2: build the Next.js app
FROM node:18-alpine AS builder
WORKDIR /app
COPY --from=deps /app/node_modules ./node_modules
COPY . .
ENV NEXT_TELEMETRY_DISABLED 1
RUN npm run build
# Stage 3: minimal production runtime
FROM node:18-alpine AS runner
WORKDIR /app
ENV NODE_ENV production
ENV NEXT_TELEMETRY_DISABLED 1
RUN addgroup --system --gid 1001 nodejs
RUN adduser --system --uid 1001 nextjs
COPY --from=builder --chown=nextjs:nodejs /app/.next ./.next
COPY --from=builder --chown=nextjs:nodejs /app/public ./public
COPY --from=builder /app/node_modules ./node_modules
COPY --from=builder /app/package.json ./package.json
USER nextjs
EXPOSE 3000
ENV PORT 3000
CMD ["npm", "start"]
6. Nginx configuration
In this step, we configure Nginx to act as a reverse proxy for our Next.js frontend and Nest.js backend. The Nginx configuration allows you to route requests seamlessly between the frontend and backend, all while serving them from the same host.
File: ./docker/nginx/conf.d/default.conf
server {
    listen 80;

    location / {
        proxy_pass http://host.docker.internal:3000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }

    location /api {
        proxy_pass http://host.docker.internal:3010;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
This configuration listens on port 80 and routes general traffic to the Next.js frontend on port 3000, while any requests to /api are forwarded to the Nest.js backend on port 3010.
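One caveat: host.docker.internal resolves out of the box on Docker Desktop (macOS/Windows), but on a plain Linux server you normally have to map it yourself, for example by adding an extra_hosts entry to the nginx service in docker-compose.yml:

  nginx:
    image: nginx:alpine
    extra_hosts:
      - "host.docker.internal:host-gateway"

Alternatively, since nginx shares a network with the frontend and backend containers in the production compose file, you could point proxy_pass at the service names instead (http://frontend:3000 and http://backend:3010).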
7. NestJS global prefix
Since we serve everything from the same host, the NestJS API needs to be available under the /api path. To do this, we call setGlobalPrefix('api').
File: ./backend/src/main.ts
import { NestFactory } from '@nestjs/core';
import { AppModule } from './app.module';

async function bootstrap() {
  const app = await NestFactory.create(AppModule, { cors: true });
  app.setGlobalPrefix('api');
  await app.listen(3010);
}
bootstrap();
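With the global prefix in place, every controller route is automatically served under /api. For example, a hypothetical health-check controller (registered in AppModule as usual) would be reachable at /api/health:

File: ./backend/src/health.controller.ts (hypothetical example)
import { Controller, Get } from '@nestjs/common';

@Controller('health')
export class HealthController {
  // Served as GET /api/health thanks to setGlobalPrefix('api')
  @Get()
  check() {
    return { status: 'ok' };
  }
}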
8. Frontend
No special configuration is required on the frontend; just make sure all requests to the server are made relative to the /api path, as in the sketch below.
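For instance, a client component could call the hypothetical /api/health endpoint above with a relative URL, and nginx will route the request to NestJS:

File: ./frontend/app/health-badge.tsx (hypothetical example)
'use client';
import { useEffect, useState } from 'react';

export default function HealthBadge() {
  const [status, setStatus] = useState('loading');

  useEffect(() => {
    // Relative URL: nginx forwards /api/* to the NestJS backend
    fetch('/api/health')
      .then((res) => res.json())
      .then((data) => setStatus(data.status))
      .catch(() => setStatus('error'));
  }, []);

  return <span>backend: {status}</span>;
}

For requests made during server-side rendering, you would typically prepend an absolute base URL (for example http://localhost/api), since Node’s fetch cannot resolve relative paths on the server.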
9. Run locally
# Terminal 1: frontend in dev mode
cd frontend
npm run dev

# Terminal 2: backend in watch mode
cd backend
npm run start:dev

# Terminal 3: nginx and postgres in Docker
docker-compose -f docker-compose.dev.yml up -d
Now we can check the website by opening localhost in the browser. In the example, one request is made on the server and another on the client; both are issued from Next.js and handled by NestJS.
10. Deploy and run on the server via GitHub Actions
In this step, we deploy the project to a server using a Docker registry and GitHub Actions. The process begins with creating the backend and frontend image repositories in the Docker registry (Docker Hub). After that, you’ll need to set up a GitHub repository and configure the necessary secrets for seamless deployment:
DOCKERHUB_USERNAME
DOCKERHUB_TOKEN
DOCKER_FRONTEND_IMAGE
DOCKER_BACKEND_IMAGE
REMOTE_SERVER_HOST
REMOTE_SERVER_USERNAME
REMOTE_SERVER_SSH_KEY
REMOTE_SERVER_SSH_PORT
The downside of keeping the backend and frontend in one repository is that every push rebuilds both images. To optimize this, we can use these conditions:
if: contains(github.event_name, 'push') && !startsWith(github.event.head_commit.message, 'frontend')
if: contains(github.event_name, 'push') && !startsWith(github.event.head_commit.message, 'backend')
This makes it possible to rebuild only the image you need by prefixing the commit message accordingly, as in the example below.
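In practice, the commit-message prefix controls which job runs:

git commit -m "frontend: tweak landing page"   # rebuilds only the frontend image
git commit -m "backend: add health endpoint"   # rebuilds only the backend image
git commit -m "update readme"                  # rebuilds both images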
File: ./.github/workflows/deploy.yml
name: deploy nextjs and nestjs to GITHUB
on:
  push:
    branches: [ "main" ]
jobs:
  build-and-push-frontend:
    runs-on: ubuntu-latest
    if: contains(github.event_name, 'push') && !startsWith(github.event.head_commit.message, 'backend')
    steps:
      - name: Checkout
        uses: actions/checkout@v3
      - name: Login to Docker Hub
        uses: docker/login-action@v1
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}
      - name: Build and push frontend to Docker Hub
        uses: docker/build-push-action@v2
        with:
          context: frontend
          file: frontend/Dockerfile
          push: true
          tags: ${{ secrets.DOCKER_FRONTEND_IMAGE }}:latest
      - name: SSH into the remote server and deploy frontend
        uses: appleboy/ssh-action@master
        with:
          host: ${{ secrets.REMOTE_SERVER_HOST }}
          username: ${{ secrets.REMOTE_SERVER_USERNAME }}
          key: ${{ secrets.REMOTE_SERVER_SSH_KEY }}
          port: ${{ secrets.REMOTE_SERVER_SSH_PORT }}
          script: |
            cd website/
            docker rmi -f ${{ secrets.DOCKER_FRONTEND_IMAGE }}:latest
            docker-compose down
            docker-compose up -d
  build-and-push-backend:
    runs-on: ubuntu-latest
    if: contains(github.event_name, 'push') && !startsWith(github.event.head_commit.message, 'frontend')
    steps:
      - name: Checkout
        uses: actions/checkout@v3
      - name: Login to Docker Hub
        uses: docker/login-action@v1
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}
      - name: Build and push backend to Docker Hub
        uses: docker/build-push-action@v2
        with:
          context: backend
          file: backend/Dockerfile
          push: true
          tags: ${{ secrets.DOCKER_BACKEND_IMAGE }}:latest
      - name: SSH into the remote server and deploy backend
        uses: appleboy/ssh-action@master
        with:
          host: ${{ secrets.REMOTE_SERVER_HOST }}
          username: ${{ secrets.REMOTE_SERVER_USERNAME }}
          key: ${{ secrets.REMOTE_SERVER_SSH_KEY }}
          port: ${{ secrets.REMOTE_SERVER_SSH_PORT }}
          script: |
            cd website/
            docker rmi -f ${{ secrets.DOCKER_BACKEND_IMAGE }}:latest
            docker-compose down
            docker-compose up -d
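The deploy script assumes the compose file and a filled-in .env already live in a website/ directory on the server; a one-time setup might look like this (paths and values are placeholders):

ssh user@your-server
mkdir -p ~/website && cd ~/website
# copy docker-compose.yml and .env here, then:
docker-compose up -d

Removing the old image with docker rmi -f forces docker-compose up to pull the freshly pushed :latest tag on the next start.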
Repository: https://github.com/xvandevx/blog-examples/tree/main/nextjs-nestjs-deploy
Recap
This article is a hands-on guide to deploying Next.js and NestJS together on a single server, making it a go-to setup for developers who want a streamlined workflow. By combining the strengths of Next.js on the frontend and NestJS on the backend, I showed how to manage both parts of the application efficiently using Docker and GitHub Actions. This simplifies the deployment process, letting you focus on building your app rather than juggling multiple configurations. It’s perfect for anyone looking to get a full-stack project up and running quickly with minimal hassle.