Communicating between Node.js microservices with gRPC


Written by Shalitha Suranga✏️

Nowadays, most developers select the microservices pattern for building their web backends to overcome issues in the traditional monolithic architecture. In web development, microservices are typically loosely-coupled web services that web developers integrate through an API gateway. The availability of third-party libraries, fully-featured inbuilt APIs, and helpful developer tools make Node.js a good candidate for building microservices.

Even though microservices typically work as independent services, they can communicate with each other via various communication mechanisms. We can use either a synchronous, request-response-based communication strategy or an asynchronous, event-based communication strategy for inter-microservice communication.

gRPC (a.k.a., Google RPC) offers a fully-featured RPC framework for developers with implementations for almost all popular programming languages, including Node.js. The gRPC framework sends binary messages between clients and servers with the Protobuf serialization technology via the HTTP/2 protocol.

In this tutorial, I will explain how to use gRPC in Node.js by building a practical communication system for three microservices.

Jump ahead:

  • Highlighted features of Node.js gRPC
  • Node.js gRPC tutorial: Inter-microservice communication
  • Architecting the solution
  • Creating the project and installing dependencies
  • Defining services with protocol buffers
  • Developing gRPC servers
  • Testing gRPC servers with Postman
  • Developing a gRPC client and communicating with servers
  • Finalizing microservices with a RESTful interface

Highlighted features of Node.js gRPC

Previously, the gRPC team offered the grpc package for Node.js developers by binding the C++ gRPC implementation to Node.js via the Node.js add-on system. More recently, they rewrote the package in pure JavaScript, without a C++ add-on, creating the @grpc/grpc-js (Node gRPC) package.

In this section, we’ll highlight several of the features offered by @grpc/grpc-js.

A complete, official gRPC implementation for Node.js

Conceptually, gRPC is a specification for implementing an RPC framework over the HTTP/2 protocol with Protobuf serialization. The Node.js gRPC implementation is an official, well-maintained project that lets you use every feature of the gRPC concept in Node.js. For example, you can use Node gRPC to implement the following communication types (sketched in code right after this list):

  • Unary RPC: A traditional request-response-style communication
  • Server streaming RPC: The server sends a stream of data to the client’s request
  • Client streaming RPC: The client sends a stream of data to the server
  • Bi-directional streaming RPC: Stream messages to both sides with two independent streams
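
To make these modes concrete, here is a minimal sketch of how each call style looks from a Node gRPC client stub. The stub and its methods (getItem, listItems, uploadLogs, chat) are hypothetical and only illustrate the call shapes; in a real project, they come from your own Protobuf definitions, as we’ll see later in this tutorial:

// Hypothetical client stub created with grpc.loadPackageDefinition (shown later)

// Unary RPC: one request, one response via an error-first callback
stub.getItem({ id: 1 }, (err, item) => {
    if(err) return console.error(err);
    console.log(item);
});

// Server streaming RPC: one request, a stream of responses
const serverStream = stub.listItems({ category: 'all' });
serverStream.on('data', (item) => console.log(item));
serverStream.on('end', () => console.log('done'));

// Client streaming RPC: a stream of requests, one response
const clientStream = stub.uploadLogs((err, summary) => console.log(summary));
clientStream.write({ line: 'first log line' });
clientStream.write({ line: 'second log line' });
clientStream.end();

// Bidirectional streaming RPC: independent streams in both directions
const chat = stub.chat();
chat.on('data', (message) => console.log(message));
chat.write({ text: 'hello' });
chat.end();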

Apart from these basic gRPC features, this Node package supports automatic re-connections, client interceptors, and more.

A developer-friendly API

When library developers offer minimal, self-explanatory APIs, other developers can become productive with that library quickly. The Node gRPC package offers a friendly API with both runtime and static code generation support. When we use a Protobuf definition with Node gRPC, it attaches the available procedures to the RPC interface at runtime. Alternatively, you can use static code generation if you want to see the RPC methods before running the code (e.g., when using TypeScript).
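
For example, runtime loading takes only a couple of calls with @grpc/proto-loader. The following minimal sketch loads a hypothetical .proto file with the commonly used loader options from the package’s documentation:

const grpc = require('@grpc/grpc-js');
const protoLoader = require('@grpc/proto-loader');

// Load the Protobuf definition at runtime; the options control how field
// names, 64-bit integers, and enum values are represented in JavaScript
const packageDefinition = protoLoader.loadSync('./protos/recipes.proto', {
    keepCase: true,  // keep field names as-is instead of camel-casing them
    longs: String,   // represent 64-bit values as strings
    enums: String,   // represent enum values by their names
    defaults: true,  // set default values on output objects
    oneofs: true     // set virtual oneof properties to the present field's name
});
const recipesProto = grpc.loadPackageDefinition(packageDefinition);

// recipesProto.Recipes is now a client constructor with the service's
// procedures attached at runtime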

Moreover, the Node gRPC API follows the standard event-based and callback-based programming style, so any Node.js developer can get started with it quickly.
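
Because unary calls follow the standard Node.js error-first callback convention, you can also wrap them with util.promisify if you prefer promises. Here is a small sketch, assuming the recipesStub client we will create later in this tutorial:

const util = require('util');

// Wrap the unary find call in a promise; recipesStub is the gRPC client
// stub created later in this tutorial
const findRecipe = util.promisify(recipesStub.find.bind(recipesStub));

async function run() {
    const recipe = await findRecipe({ id: 1000 });
    console.log(recipe);
}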

A pure JavaScript implementation

The @grpc/grpc-js package is a pure JavaScript implementation that uses inbuilt Node.js APIs, like http2. So, this package doesn’t trigger additional Node C++ add-on installation tasks like the legacy grpc package did. Also, this pure JavaScript implementation is written in TypeScript, so it ships with type definitions and we can easily use @grpc/grpc-js with TypeScript.

Node.js gRPC tutorial: Inter-microservice communication

We’ve covered the highlighted features of the @grpc/grpc-js package. Overall, it offers a fully-featured, official, pure JavaScript implementation of the gRPC framework concept.

Now, let’s learn how to use it by implementing a practical system! We’ll develop three Node.js microservices using gRPC for inter-microservice communication:

  1. Main microservice (primary): This microservice accepts a food order via a RESTful API endpoint, then communicates with our two secondary microservices to process the order. The RESTful API also offers an endpoint to check the status of an order
  2. Recipe selector microservice (secondary): The main microservice communicates with this microservice and searches for recipes
  3. Order processor microservice (secondary): This microservice accepts an order request and provides the current order status based on the order status change events

Architecting the solution

We know our product requirements, so let’s define the project architecture. Look at the following high-level design diagram:

[Image: Our demo food-ordering system’s high-level architecture]

As shown in the above diagram, we’ll build our solution based on the following specification:

  • The main microservice uses gRPC to communicate with the secondary microservices and offers the following RESTful endpoints for web clients:
    • POST /orders: Creates a new order
    • GET /orders/{orderId}: Returns order details (including the current order status)
  • When the recipe selector microservice receives a new gRPC message, it selects a recipe and returns it using the unary mode
  • When the order processor microservice receives a new gRPC message, it streams the order status via the server streaming mode

Even though real-world microservices typically reside in separate computers or containers, we’ll create a monorepo-oriented Node.js project for this solution and demonstrate the microservices system with three processes to keep this tutorial simple.

Creating the project and installing dependencies

First, create three directories to logically separate our microservices:

mkdir {main,recipe,processor}-ms 

Create a new Node.js project as follows:

npm init 
# --- or ---
yarn init

Next, install @grpc/grpc-js, @grpc/proto-loader, and express dependencies:

npm install @grpc/grpc-js @grpc/proto-loader express
# --- or ---
yarn add @grpc/grpc-js @grpc/proto-loader express

Note: We use express for the RESTful API implementation. The @grpc/proto-loader package lets you load .proto files. We’ll discuss .proto files soon!

Install the concurrently package to run all microservices with one command:

npm install concurrently -D
# --- or ---
yarn add concurrently -D

Defining services with protocol buffers

RPC frameworks/libraries typically let developers execute remote procedures, so we should define the required procedures first. In the gRPC framework, we have to pre-define procedures with Protobuf definitions. Create a new directory to store the Protobuf files:

mkdir protos

Let’s create a Protobuf file for the communication line between the main microservice and recipe selector. Add the following content to the ./protos/recipes.proto file:

syntax = "proto3";

service Recipes {
  rpc Find (ProductId) returns (Recipe) {}
}

message ProductId {
  uint32 id = 1;
}

message Recipe {
  uint32 id = 1;
  string title = 2;
  string notes = 3;
}

Here, we defined the Find procedure to return a Recipe object based on ProductId, which is a unique identifier for a food product. Note that we typically need to group all procedures with a service definition, like Recipes.
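
When this definition is loaded with @grpc/proto-loader, the request and response messages become plain JavaScript objects keyed by the field names. For example, a Recipe response might look like the following (the values are only illustrative):

const recipe = {
    id: 100,
    title: 'Pizza',
    notes: 'See video: pizza_recipe.mp4. Use oven No. 12'
};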

Next, add the following definition to the ./protos/processing.proto:

syntax = "proto3";

service Processing {
  rpc Process (OrderRequest) returns (stream OrderStatusUpdate) {}
}

message OrderRequest {
  uint32 recipeId = 1;
  uint32 orderId = 2;
}

enum OrderStatus {
  UNKNOWN = 0;
  NEW = 1;
  QUEUED = 2;
  PROCESSING = 3;
  DONE = 4;
}

message OrderStatusUpdate {
  OrderStatus status = 1;
}

We defined the Process procedure to return a stream of OrderStatusUpdate messages to track the order status change events. The Process procedure expects an OrderRequest message as its parameter. Note that proto3 requires the first value of an enum to be zero, so the OrderStatus enum reserves 0 as UNKNOWN; this also matches the initial status that the main microservice assigns to a newly created order later in this tutorial.

Developing gRPC servers

Now that our Protobuf definitions are ready, we can start developing gRPC servers. The main microservice is a gRPC client that communicates with two secondary microservices. So, first, we need to implement two gRPC servers for secondary microservices.

Let’s start with the recipe selector microservice. Add the following code to ./recipe-ms/main.js:

const path = require('path');
const grpc = require('@grpc/grpc-js');
const protoLoader = require('@grpc/proto-loader');
const packageDefinition = protoLoader.
                            loadSync(path.join(__dirname, '../protos/recipes.proto'));
const recipesProto = grpc.loadPackageDefinition(packageDefinition);

const RECIPES = [
    {
        id: 100,
        productId: 1000,
        title: 'Pizza',
        notes: 'See video: pizza_recipe.mp4. Use oven No. 12'
    },
    {
        id: 200,
        productId: 2000,
        title: 'Lasagna',
        notes: 'Ask from John. Use any oven, but make sure to pre-heat it!'
    }
];

function findRecipe(call, callback) {
    let recipe = RECIPES.find((recipe) => recipe.productId == call.request.id);
    if(recipe) {
        callback(null, recipe);
    }
    else {
        callback({
            message: 'Recipe not found',
            code: grpc.status.INVALID_ARGUMENT
        });
    }
}

const server = new grpc.Server();
server.addService(recipesProto.Recipes.service, { find: findRecipe });
server.bindAsync('0.0.0.0:50051', grpc.ServerCredentials.createInsecure(), () => {
    server.start();
});

The above code spawns a gRPC server instance on port 50051 and handles gRPC messages based on the service definition in the recipes.proto file. Whenever a gRPC client executes the find procedure with a valid product identifier, the server finds the appropriate recipe and sends it back via the callback function (using the unary mode).

We’re using one service in our tutorial, but you can attach multiple services to the server as follows:

server.addService(recipesProto.Recipes.service, { find: findRecipe });
server.addService(ingredientsProto.Ingredients.service, { find: findIng });

You can also add multiple procedures as follows:

server.addService(recipesProto.Recipes.service, {
    find: findRecipe,
    add: addRecipe,
    update: updateRecipe,
    remove: removeRecipe
});

Before testing the above microservice, let’s create the other secondary microservice. I will explain how to test both microservices with Postman in the following section.

Add the following code to ./processor-ms/main.js to create the second, secondary microservice:

const path = require('path');
const grpc = require('@grpc/grpc-js');
const protoLoader = require('@grpc/proto-loader');
const packageDefinition = protoLoader.
                            loadSync(path.join(__dirname, '../protos/processing.proto'));
const processingProto = grpc.loadPackageDefinition(packageDefinition);

function process(call) {
    let orderRequest = call.request;
    let time = orderRequest.orderId * 1000 + orderRequest.recipeId * 10;

    call.write({ status: 2 });
    setTimeout(() => {
        call.write({ status: 3 });
        setTimeout(() => {
            call.write({ status: 4 });
            call.end();
        }, time);
    }, time);
}

const server = new grpc.Server();
server.addService(processingProto.Processing.service, { process });
server.bindAsync('0.0.0.0:50052', grpc.ServerCredentials.createInsecure(), () => {
    server.start();
});

In this gRPC server, we also attach a single procedure, but this time it uses the server streaming mode. Whenever the microservice receives a new order request, it streams the order status updates via the call.write function. Here, we call call.end to indicate the end of the stream, instead of invoking a callback as we did in the previous unary implementation.
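
One thing to be aware of with server streaming: if the client disconnects mid-stream, the pending call.write calls will still fire. The following sketch (not part of the tutorial code) guards against that by listening for the cancelled event that @grpc/grpc-js emits on server calls and checking the call.cancelled flag before writing:

function process(call) {
    const timers = [];
    call.on('cancelled', () => {
        // The client went away; drop any pending status updates
        timers.forEach(clearTimeout);
    });

    call.write({ status: 2 });
    timers.push(setTimeout(() => {
        if(call.cancelled) return;
        call.write({ status: 3 });
        timers.push(setTimeout(() => {
            if(call.cancelled) return;
            call.write({ status: 4 });
            call.end();
        }, 1000));
    }, 1000));
}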

Update your package.json with the following scripts to run these two microservices at once:

"scripts": {
  "start-recipe-ms": "node ./recipe-ms/main.js",
  "start-processor-ms": "node ./processor-ms/main.js",
  "start": "concurrently 'npm run start-recipe-ms' 'npm run start-processor-ms'"
},

Now, we can use npm start or yarn start to start both microservices.

Testing gRPC servers with Postman

The Postman app added gRPC client support in v9.7.1, so if you use an older Postman version, download the latest version before continuing with this tutorial.

Start both secondary microservices using the start npm script. First, we can test the recipe selector microservice.

Open the Postman app, click File, then click New (or, press Control+N/Command+N), and create a new gRPC request for 0.0.0.0:50051:

[Image: Creating a new gRPC request with Postman]

When we use Postman for gRPC testing, it acts as a client, so it needs to know the Protobuf service definition. Import the recipes.proto file into the Postman client as follows:

[Image: Importing a Protobuf file into Postman]

Postman will automatically show the Find procedure, so you can test it as follows:

[Image: Testing the find procedure with Postman]

Here, we send the product identifier to receive a recipe object. Use the same steps to test the order processor microservice. It will stream multiple order status change objects via the gRPC server streaming feature. Look at the following preview:

[Image: Testing gRPC streaming (with the Process procedure) with Postman]

Developing a gRPC client and communicating with servers

Our gRPC-based secondary microservices now work as expected, matching the specification we prepared earlier. At the start of this post, we planned to create a primary microservice with a gRPC client to communicate with the secondary microservices. The main microservice’s goal is to accept a food product order request, find a recipe, process the order, and update the order status.

Before implementing a RESTful interface for the main microservice, let’s connect it with other microservices and test it via the terminal.

Add the following code to ./main-ms/main.js:

const path = require('path');
const grpc = require('@grpc/grpc-js');
const protoLoader = require('@grpc/proto-loader');

const packageDefinitionReci = protoLoader.
                            loadSync(path.join(__dirname, '../protos/recipes.proto'));
const packageDefinitionProc = protoLoader.
                            loadSync(path.join(__dirname, '../protos/processing.proto'));
const recipesProto = grpc.loadPackageDefinition(packageDefinitionReci);
const processingProto = grpc.loadPackageDefinition(packageDefinitionProc);

const recipesStub = new recipesProto.Recipes('0.0.0.0:50051',
                        grpc.credentials.createInsecure());
const processingStub = new processingProto.Processing('0.0.0.0:50052',
                        grpc.credentials.createInsecure());

let productId = 1000;
let orderId = 1;

console.log(`Searching a recipe for the product: ${productId}`);

recipesStub.find({ id: productId }, (err, recipe) => {
    console.log('Found a recipe:');
    console.log(recipe);
    console.log('Processing...');
    const call = processingStub.process({ orderId, recipeId: recipe.id });
    call.on('data', (statusUpdate) => {
        console.log('Order status changed:');
        console.log(statusUpdate);
    });
    call.on('end', () => {
        console.log('Processing done.');
    });
});

First, the above code calls the find procedure in the recipe selector microservice to fetch a recipe based on a product identifier. Next, it calls the process procedure in the order processor microservice to detect order status changes.

Look at the client stubs. We typically use the following pattern for the unary mode:

recipesStub.find({ id: productId }, (err, recipe) => {

For streaming, we can attach events to the RPC instance as follows:

call.on('data', (statusUpdate) => {
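
In a real client, you will also want to handle failures on both paths. Here is a minimal sketch: check the err argument of the unary callback, and attach error and status listeners to the streaming call (both events are emitted by @grpc/grpc-js client streams); the recipeId value is only illustrative:

// Unary call: always check the error-first callback argument
recipesStub.find({ id: productId }, (err, recipe) => {
    if(err) {
        console.error(`Recipe lookup failed: ${err.message} (code ${err.code})`);
        return;
    }
    // ... continue processing the order ...
});

// Streaming call: listen for errors and the final call status
const call = processingStub.process({ orderId, recipeId: 100 });
call.on('data', (statusUpdate) => console.log(statusUpdate));
call.on('error', (err) => console.error('Stream failed:', err.message));
call.on('status', (status) => console.log('Final status code:', status.code));
call.on('end', () => console.log('Processing done.'));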

Start both secondary microservices. Run the following command to start the client:

node ./main-ms/main.js

Now, you will see the sample food processing system’s log, as shown below:

[Image: Testing the gRPC client in the terminal]

Finalizing microservices with a RESTful interface

Earlier, the main microservice worked as a console program by writing logs to the terminal. In web development, microservices typically use web protocols and let web clients communicate with them.

Let’s complete the demo order processing system by implementing a RESTful API for the main microservice. Replace the contents of the ./main-ms/main.js file with the following code:

const path = require('path');
const grpc = require('@grpc/grpc-js');
const protoLoader = require('@grpc/proto-loader');
const express = require('express');

const packageDefinitionReci = protoLoader.
                            loadSync(path.join(__dirname, '../protos/recipes.proto'));
const packageDefinitionProc = protoLoader.
                            loadSync(path.join(__dirname, '../protos/processing.proto'));
const recipesProto = grpc.loadPackageDefinition(packageDefinitionReci);
const processingProto = grpc.loadPackageDefinition(packageDefinitionProc);

const recipesStub = new recipesProto.Recipes('0.0.0.0:50051',
                        grpc.credentials.createInsecure());
const processingStub = new processingProto.Processing('0.0.0.0:50052',
                        grpc.credentials.createInsecure());

const app = express();
app.use(express.json());

const restPort = 5000;
let orders = {};

function processAsync(order) {
    recipesStub.find({ id: order.productId }, (err, recipe) => {
        if(err) return;

        orders[order.id].recipe = recipe;
        const call = processingStub.process({
            orderId: order.id,
            recipeId: recipe.id
        });
        call.on('data', (statusUpdate) => {
            orders[order.id].status = statusUpdate.status;
        });
    });
}

app.post('/orders', (req, res) => {
    if(!req.body.productId) {
        res.status(400).send('Product identifier is not set');
        return;
    }
    let orderId = Object.keys(orders).length + 1;
    let order = {
        id: orderId,
        status: 0,
        productId: req.body.productId,
        createdAt : new Date().toLocaleString()
    };
    orders[order.id] = order;
    processAsync(order);
    res.send(order);
});

app.get('/orders/:id', (req, res) => {
    if(!req.params.id || !orders[req.params.id]) {
        res.status(400).send('Order not found');
        return;
    }
    res.send(orders[req.params.id]);
});

app.listen(restPort, () => {
  console.log(`RESTful API is listening on port ${restPort}`)
});

We’ve implemented two RESTful API endpoints. Whenever the microservice receives a new request for the POST /orders endpoint with a valid product identifier, it creates a new order and invokes the processAsync function. The processAsync function communicates with secondary microservices via the gRPC protocol, finds a recipe, and updates the order status.
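
Note that the processAsync function above silently drops gRPC errors (if(err) return;), which would leave an order stuck at its initial status. If you want REST clients to see such failures, one possible variation records the error on the order; the -1 status used below is just an illustrative marker and not part of the Protobuf definition:

function processAsync(order) {
    recipesStub.find({ id: order.productId }, (err, recipe) => {
        if(err) {
            // Surface the failure to REST clients instead of dropping it
            orders[order.id].status = -1;
            orders[order.id].error = err.message;
            return;
        }
        orders[order.id].recipe = recipe;
        const call = processingStub.process({
            orderId: order.id,
            recipeId: recipe.id
        });
        call.on('data', (statusUpdate) => {
            orders[order.id].status = statusUpdate.status;
        });
        call.on('error', (streamErr) => {
            orders[order.id].status = -1;
            orders[order.id].error = streamErr.message;
        });
    });
}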

The POST /orders endpoint returns the newly generated order identifier, which we can use with the GET /orders/{orderId} endpoint to get order details.

Now, we can improve the start npm script by letting it run all of our microservices at once. Use the following script definitions in your package.json:

"scripts": {
  "start-recipe-ms": "node ./recipe-ms/main.js",
  "start-processor-ms": "node ./processor-ms/main.js",
  "start-main-ms": "node ./main-ms/main.js",
  "start": "concurrently 'npm run start-recipe-ms' 'npm run start-processor-ms' 'npm run start-main-ms'"
},

Run npm start or yarn start to start the demo food ordering system. Test the RESTful API with Postman, as follows:

First, create several orders with POST /orders:

[Image: Creating a new order with the POST endpoint]

Next, check order statuses with GET /orders/{orderId}:

[Image: Checking order details with the GET endpoint]
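
If you prefer the terminal over Postman, you can exercise the same endpoints with a short script. The following sketch assumes Node.js 18+ so that the global fetch API is available:

// Create an order, then poll its status a few times (assumes Node.js 18+)
async function testOrderFlow() {
    const createRes = await fetch('http://localhost:5000/orders', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ productId: 1000 })
    });
    const order = await createRes.json();
    console.log('Created order:', order);

    for(let i = 0; i < 5; i++) {
        await new Promise((resolve) => setTimeout(resolve, 2000));
        const statusRes = await fetch(`http://localhost:5000/orders/${order.id}`);
        console.log('Order status:', await statusRes.json());
    }
}

testOrderFlow();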

You can download the complete project source code from my GitHub repository.

Conclusion

In this tutorial, we practiced using gRPC in Node.js by implementing a communication system for three practical microservices. We used three local processes to demonstrate three microservices, but you can use gRPC over local network ports or remote ports on bare-metal servers or container systems, such as Docker.

You can also use WebSockets for inter-microservice communication, but gRPC is a fully-featured framework with an RPC definition language (Protobuf), unlike the WebSocket protocol. WAMP offers a gRPC-like concept over WebSockets, but its implementations aren’t as popular as gRPC’s official implementations.


