Jack Hsu
Posted on February 28, 2023
There are many decisions to make when it comes to building a Node API. There are a variety of frameworks to choose from (Express, Fastify, Koa, etc.), and a million different ways to build and deploy the application.
In this article, I’ll show you the easiest way to go from zero to production by using Nx to create a Node API project.
We’ll be using Fastify as the framework of choice. Fastify is a fast (as the name implies) and low-overhead server in Node. It has grown in popularity, recently crossing the 1 million weekly download mark on npm. I’m a fan of Fastify’s plugin architecture, and the ecosystem is quite impressive, boasting over 250 core and community plugins.
Creating the project
You can create a new API with a single command.
$ npx create-nx-workspace@latest \
--preset=node-server \ # create a server project
--framework=fastify \ # other options are express and koa
--docker # we'll touch on this later on
To run the server in dev-mode, use `npx nx serve` (aliased to `npm start`), and you should see the server starting at port 3000.
$ curl http://localhost:3000
{"message":"Hello API"}
A couple of notable files:
- The `src/main.ts` file is responsible for starting the Fastify server and registering plugins.
- The `src/app/app.ts` file is the app plugin that provides an initial endpoint at `/` that replies with `{"message": "Hello API"}`.
When you edit the source code, the server will reload. You can pass `--no-watch` to disable this behavior.
Running tests
In addition to generating the source code, Nx will also create two test suites:
- Unit tests via `npx nx test` (aliased to `npm run test`).
- E2E tests via `npx nx e2e e2e` (aliased to `npm run e2e`).
Unit tests take advantage of Fastify’s plugin architecture, allowing you to test each plugin in isolation. They run on Jest, the most popular test runner in Node.
// src/app/app.spec.ts
// This file is generated by Nx.
import Fastify, { FastifyInstance } from 'fastify';
import { app } from './app';
describe('GET /', () => {
let server: FastifyInstance;
beforeEach(() => {
server = Fastify();
server.register(app);
});
it('should respond with a message', async () => {
const response = await server.inject({
method: 'GET',
url: '/',
});
expect(response.json()).toEqual({ message: 'Hello API' });
});
});
The E2E tests run against the actual server (with all plugins registered), which gives better real-world guarantees.
$ npx nx serve & # run server in background
$ npx nx e2e e2e # run test suite
GET /
✓ should return a message (27 ms)
Test Suites: 1 passed, 1 total
Tests: 1 passed, 1 total
Snapshots: 0 total
Time: 0.429 s
Ran all test suites.
Tearing down...
———————————————————————————————————————————————————————————————————————————————————————————————————————————————————————————
> NX Successfully ran target e2e for project e2e (3s)
$ lsof -i:3000 -t | xargs kill # stop server process
There are trade-offs between speed and confidence when it comes to unit versus E2E tests, which is a topic out of scope for this article. I think you should do both, but this is a discussion to be held with your team. Nx supports both cases out of the box.
Building for production using esbuild
Now that we have our production-ready app, let’s examine how Nx handles the build process using [esbuild](https://esbuild.github.io/).

`esbuild` is a bundler written in Go that is extremely fast. It is much faster than other bundlers like webpack and Parcel, but that may change in the future as other tools make their own speed improvements.

When you run the build command, Nx uses `esbuild` to generate a self-contained app bundle, which does not require `node_modules`.
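Under the hood, the build is configured as a target in the project’s `project.json`. A simplified excerpt is sketched below — the executor name and available options have varied across Nx versions (older releases used the `@nrwl/esbuild` package, for example), so treat this as illustrative rather than exact:

```json
{
  "targets": {
    "build": {
      "executor": "@nx/esbuild:esbuild",
      "options": {
        "main": "src/main.ts",
        "outputPath": "dist/api",
        "bundle": true,
        "platform": "node"
      }
    }
  }
}
```

The key option is `bundle: true`, which is what makes the output self-contained.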
$ npx nx build # aliased to npm run build
> nx run api:build
———————————————————————————————————————————————————————————————————————————————————————————————————————————————————————————
> NX Successfully ran target build for project api (2s)
A cool feature of Nx is that commands such as `build` and `test` are cached if the project (or its dependencies) have not changed. If we run the build a second time, you’ll see it complete in a few milliseconds.
$ npx nx build
> nx run api:build [existing outputs match the cache, left as is]
———————————————————————————————————————————————————————————————————————————————————————————————————————————————————————————
> NX Successfully ran target build for project api (16ms)
Nx read the output from the cache instead of running the command for 1 out of 1 tasks.
We’ll touch more on the caching when we look at Docker support.
And now that the server bundle is ready, we can run it.
$ node dist/api
{"level":30,"time":1675880059720,"pid":70605,"hostname":"mbp.lan","msg":"Server listening at http://[::1]:3000"}
{"level":30,"time":1675880059721,"pid":70605,"hostname":"mbp.lan","msg":"Server listening at http://127.0.0.1:3000"}
[ ready ] http://localhost:3000
You can even run the e2e suite against the production bundle: `npx nx e2e e2e`.
Docker support
Recall that we passed the `--docker` option when creating the API project. With this option, Nx will generate a default `Dockerfile` and a `docker-build` target.
# This file is generated by Nx.
#
# Build the docker image with `npx nx docker-build api`.
# Tip: Modify "docker-build" options in project.json to change docker build args.
#
# Run the container with `docker run -p 3000:3000 -t api`.
FROM docker.io/node:lts-alpine
ENV HOST=0.0.0.0
ENV PORT=3000
WORKDIR /app
RUN addgroup --system api && \
adduser --system -G api api
COPY dist/api api
RUN chown -R api:api .
CMD [ "node", "api" ]
To build the image, run `npx nx docker-build`. The image copies only the self-contained bundle, so no `npm install` at all!
Nx is smart enough to bundle the app before building the Docker image, thanks to the `dependsOn` configuration in `project.json`. You can visualize this dependency with `npx nx graph`.
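A sketch of what that wiring looks like in `project.json` — the exact shape of the generated target (executor versus `command`, option names) may differ by Nx version, so this is an assumption-laden excerpt rather than the literal generated file:

```json
{
  "targets": {
    "docker-build": {
      "dependsOn": ["build"],
      "command": "docker build -f Dockerfile . -t api"
    }
  }
}
```

The `dependsOn: ["build"]` entry is what guarantees `dist/api` exists (and is up to date, or restored from the cache) before `docker build` copies it into the image.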
Now that the image is built, we can run it.
$ docker run -p 3000:3000 -t api
{"level":30,"time":1675880993256,"pid":1,"hostname":"248744de020a","msg":"Server listening at http://0.0.0.0:3000"}
[ ready ] http://0.0.0.0:3000
Note: The server binds to `0.0.0.0` so that it can be accessed from the host machine. You can run curl to verify that it indeed works, or better yet use the E2E test suite (`npx nx e2e e2e`)!
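The server picks its bind address from the `HOST` and `PORT` environment variables, which is why the Dockerfile sets `HOST=0.0.0.0`. The small helper below illustrates that resolution logic — the function name is my own for illustration, not part of the generated code:

```typescript
// Illustrative helper (not from the generated code): resolve the bind
// address the way the server does, falling back to localhost:3000
// when HOST/PORT are unset.
export function resolveAddress(
  env: Record<string, string | undefined>
): { host: string; port: number } {
  const host = env.HOST ?? 'localhost';
  const port = env.PORT ? Number(env.PORT) : 3000;
  return { host, port };
}
```

Inside the container, `HOST=0.0.0.0` makes the server listen on all interfaces so the published port is reachable from the host; during local development the `localhost` default applies.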
Deploying the server
There are numerous platforms that we can deploy our app to. I like Fly.io since it makes it very easy to deploy all over the world using the CLI, and it comes with good Docker support.
If you haven’t used Fly before, please follow their short getting started guide (5-10 mins).
Once you are ready, let’s configure our project.
$ fly launch --generate-name --no-deploy
Follow the prompts and a `fly.toml` file will be generated, which contains the Fly configuration. We need to update this file with the correct port used by our image.
[[services]]
http_checks = []
internal_port = 3000 # Make sure this matches what the app listens on
Now we can deploy the app.
$ fly deploy
Fly will print the monitoring link when the app is successfully deployed, and you can open the deployed server using `fly open`.
That’s it! Our server is now deployed for the world to use.
Summary
In this post, we saw how easy it is to go from zero code to a deployed server using Nx. Here is a quick summary of the points.
- Use `create-nx-workspace --preset=node-server --framework=fastify --docker` to quickly create a Fastify server project.
- Nx provides both unit test and E2E test suites — `npx nx test` and `npx nx e2e e2e`.
- Nx builds the server using esbuild — `npx nx build`.
- Docker support is provided out of the box via the `--docker` option, and Nx understands that the Docker build depends on the app being bundled. Run it via `npx nx docker-build`.
- Deploying to Fly (or other platforms) is easy since we have a Docker image.
To learn more about Nx and what else it can do, refer to the intro page in the docs.
Learn more
- 🧠 Nx Docs
- 👩💻 Nx GitHub
- 💬 Nx Community Slack
- 📹 Nx YouTube Channel
- 🥚 Free Egghead course
- 🚀 Speed up your CI
Also, if you liked this, click the ❤️ and make sure to follow Jack and Nx on Twitter for more!
#nx