Deploying a Node.js API to Cloud Functions with Terraform
Ruan Martinelli
Posted on December 1, 2020
In this tutorial you are going to deploy a simple Node.js API to Google Cloud Functions using Terraform.
Cloud Functions is a compute solution from Google Cloud Platform (GCP). It provides functions as a service (FaaS), a way to run your code "on demand" without managing any servers.
For deployment we will use Terraform, a command-line tool to build and deploy infrastructure using code. Terraform helps create a predictable and reproducible environment to run your code.
Since it's not the main focus of this tutorial, we will go with a super simple Node.js API using Fastify. Feel free to use any other language supported by Cloud Functions for this part.
When you are done, you will have an up-and-running API with a URL you can make requests to.
Prerequisites
To follow this guide you will need:
- Terraform 0.13 or later. You can find the installation instructions here;
- Google Cloud SDK. Any recent version should be fine. Installation instructions here;
- Node.js 12 or later. If you don’t have Node.js installed, I recommend using nvm for that.
1. Setting up the GCP account
If you are using the Google Cloud SDK for the first time, you will need to authenticate with your Google Account. You can run the following command:
# Authenticate with GCP
gcloud auth application-default login
Now create the project on GCP:
# Create a GCP project
gcloud projects create PROJECT_ID --name="My App"
Note: replace PROJECT_ID with a unique project identifier (e.g. "my-app-2847162").
Set the project you just created as the default one. This will make it easier to run the subsequent commands.
# Set the project as the default one
gcloud config set project PROJECT_ID
Many features on GCP require a billing account linked to the project, and Cloud Functions is one of them. For this step, you will need to visit the dashboard:
Create a billing account on GCP.
Note: even though Google asks for your credit card, this tutorial should not cost any money to run. The first 2 million invocations of a Cloud Function per month are free. You will also learn how to shut down and destroy all the resources created here.
After setting up billing, the account will be listed when you run the following command:
# List billing accounts available
gcloud beta billing accounts list
The output will look something like this:
ACCOUNT_ID            NAME        OPEN  MASTER_ACCOUNT_ID
87PPT3-QECCDL-9OLSSQ  my-account  True
Copy the account ID and run the following command to link the billing account to your project:
# Link a billing account to project
gcloud beta billing projects link PROJECT_ID --billing-account=BILLING_ACCOUNT_ID
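To confirm the link worked, you can describe the project's billing info with the same gcloud beta billing command group; the output should include billingEnabled: true.
# Verify that billing is enabled for the project
gcloud beta billing projects describe PROJECT_ID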
Now you are going to structure the project.
2. Structuring the project
Create the files listed below so your repository looks like this:
.
├── terraform
│   ├── modules
│   │   └── function
│   │       ├── main.tf
│   │       ├── outputs.tf
│   │       └── variables.tf
│   ├── main.tf
│   ├── backend.tf
│   ├── outputs.tf
│   └── variables.tf
└── src
    └── index.js
Don’t worry about adding any content now. We will do that on the next step.
Note: This is an opinionated but common structure for Terraform projects. Terraform doesn't require your files to be in a certain layout or have specific names. Although not recommended, you could build an entire system within a single main.tf file.
The terraform/ folder contains the files related to Terraform. The src/ folder hosts the code for the Node.js API. Remember, the API code will be very simple; a single index.js file is enough.
3. Writing the API
Let’s write the API using Fastify.
If you are following this tutorial with a different language, you can add your custom code at this point.
Note: If you have never heard of Fastify before, it has an API very similar to other Node.js frameworks like Express.
First, initialize the project with npm and install fastify as a dependency:
# Initialize project
npm init
# Install fastify
npm install fastify
Add this content to the src/index.js file:
// src/index.js
const fastify = require('fastify')

const app = fastify({ logger: true })

app.get('/', async (req, res) => {
  return { works: true }
})

// Cloud Functions entry point: wait for Fastify to be ready,
// then hand the incoming request to its underlying HTTP server
exports.app = async (req, res) => {
  await app.ready()
  app.server.emit('request', req, res)
}
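If you want to exercise this handler locally before deploying, one optional route (not part of the original setup) is Google's Functions Framework for Node.js, which emulates the Cloud Functions runtime. It resolves the entry point through the package.json main field, so run it after the update in the next step:
# Optional: install the Cloud Functions emulator as a dev dependency
npm install --save-dev @google-cloud/functions-framework
# Serve the exported "app" target on http://localhost:8080
npx functions-framework --target=app
# In another terminal
curl http://localhost:8080/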
Update the entry point for your code in the package.json file:
// package.json
{
-  "main": "index.js",
+  "main": "src/index.js",
  // ...
}
This will tell Cloud Functions where your API is located. Now let's jump to the terraform/ folder and start writing the infrastructure code.
4. Writing the infrastructure code
At this point, you already have all the files and folders created inside your terraform/ folder. Before adding code to them, let's take a look at each file's responsibility:
- backend.tf. Declares which Terraform backend you will use.
- main.tf. Where you will write the logic for creating resources or invoking modules.
- variables.tf. Lists the variables and their values that will be used on main.tf.
- outputs.tf. Lists the values your Terraform code will return.
- modules/. A place for your Terraform modules. In this case, there will be just one, named function.
Note: The name function was chosen for the module because it creates a Cloud Function. Feel free to use a different name.
Note: If you are not familiar with modules in Terraform, think of them as a function or component. It's something that you write once and can then reuse in other parts of your code.
Begin by declaring which Terraform backend you want to use, i.e. where you want to store your Terraform state files.
Let's choose the "local" backend for now, meaning the state files will be stored in your local repository.
# terraform/backend.tf
terraform {
backend "local" {}
}
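Later, if you want the state out of your working directory, switching backends is a one-block change. As a sketch (the bucket name here is hypothetical and must be created beforehand), a Cloud Storage backend would look like this:
# terraform/backend.tf (remote alternative, not used in this tutorial)
terraform {
  backend "gcs" {
    bucket = "my-app-terraform-state" # hypothetical, pre-created bucket
    prefix = "terraform/state"
  }
}
We will come back to this idea in the "Going further" section at the end.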
Now add the following variables to your terraform/variables.tf file:
# terraform/variables.tf
variable "project" {
  default = "PROJECT_ID"
}

variable "region" {
  default = "us-central1" # Choose a region
}
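Hardcoding PROJECT_ID as a default keeps the tutorial simple. If you prefer not to, you can drop the default and pass the value with Terraform's standard -var flag when you run the commands in step 5:
# Example: pass the project ID on the command line instead of a default
terraform plan -var="project=my-app-2847162"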
In terraform/main.tf, declare the provider Terraform will connect to. In your case, the Google Cloud Platform provider (named "google").
The Google provider has two required parameters, project and region. We can reference the values declared in the step above by accessing the properties on the var object.
# terraform/main.tf
provider "google" {
  project = var.project
  region  = var.region
}

# ⚠️ More code here soon
You will go back to this file soon to add more configuration.
Creating the function module
In order to create a Cloud Function on GCP you will need to combine a few resources:
- A storage bucket, to store the code that will be executed by the function
- The function itself, to run the code you wrote
- An IAM policy, to allow users to invoke the function
These resources will be grouped in the Terraform module you are about to create.
This will make it easier if you want to deploy a second environment (e.g. development & staging) or create multiple functions - you can just invoke the module again with different parameters.
In terraform/modules/function/variables.tf, add the arguments needed by the module. All arguments are required, so don't add default values.
# terraform/modules/function/variables.tf
variable "project" {}
variable "function_name" {}
variable "function_entry_point" {}
Proceeding to terraform/modules/function/main.tf, add the logic to create the function and all the resources it needs.
⚠️ This file is a bit dense. Follow the comments in it to get a better idea of what's happening.
# terraform/modules/function/main.tf
locals {
  timestamp = formatdate("YYMMDDhhmmss", timestamp())
  root_dir  = abspath("../")
}

# Compress source code
data "archive_file" "source" {
  type        = "zip"
  source_dir  = local.root_dir
  output_path = "/tmp/function-${local.timestamp}.zip"
}

# Create bucket that will host the source code
resource "google_storage_bucket" "bucket" {
  name = "${var.project}-function"
}

# Add source code zip to bucket
resource "google_storage_bucket_object" "zip" {
  # Append file MD5 to the object name to force the object (and the
  # function below) to be updated whenever the source code changes
  name   = "source.zip#${data.archive_file.source.output_md5}"
  bucket = google_storage_bucket.bucket.name
  source = data.archive_file.source.output_path
}

# Enable Cloud Functions API
resource "google_project_service" "cf" {
  project                    = var.project
  service                    = "cloudfunctions.googleapis.com"
  disable_dependent_services = true
  disable_on_destroy         = false
}

# Enable Cloud Build API (used by GCP to build the function)
resource "google_project_service" "cb" {
  project                    = var.project
  service                    = "cloudbuild.googleapis.com"
  disable_dependent_services = true
  disable_on_destroy         = false
}

# Create Cloud Function
resource "google_cloudfunctions_function" "function" {
  name                  = var.function_name
  runtime               = "nodejs12" # Switch to a different runtime if needed
  available_memory_mb   = 128
  source_archive_bucket = google_storage_bucket.bucket.name
  source_archive_object = google_storage_bucket_object.zip.name
  trigger_http          = true
  entry_point           = var.function_entry_point
}

# Create IAM entry so all users can invoke the function
resource "google_cloudfunctions_function_iam_member" "invoker" {
  project        = google_cloudfunctions_function.function.project
  region         = google_cloudfunctions_function.function.region
  cloud_function = google_cloudfunctions_function.function.name
  role           = "roles/cloudfunctions.invoker"
  member         = "allUsers"
}
This file deals with all the logic of compressing the source code, storing it in a bucket, creating the Cloud Function, and setting the necessary permissions on it.
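One file from the tree is still empty: terraform/modules/function/outputs.tf. The root outputs.tf (coming up shortly) reads a function_url value from the module, so the module has to export it. A minimal version, using the https_trigger_url attribute of the function resource:
# terraform/modules/function/outputs.tf
output "function_url" {
  # URL assigned to the HTTP-triggered function
  value = google_cloudfunctions_function.function.https_trigger_url
}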
Using your module
Now that you have your function module ready, you can invoke it in other parts of your Terraform code.
Go back to the entry point file at terraform/main.tf and add the following:
# terraform/main.tf
provider "google" {
  project = var.project
  region  = var.region
}

+ module "my_function" {
+   source               = "./modules/function"
+   project              = var.project
+   function_name        = "my-function"
+   function_entry_point = "app"
+ }
When running the file above, Terraform will look for a main.tf file on the path declared in the source parameter and run the code there with the given arguments.
Note: The function_entry_point must match the name of the exported variable in your Node.js code. You will find exports.app = … at the bottom of src/index.js.
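Because all the logic lives in the module, a second function would be just another module block. One caveat first: as written, the module names its bucket "${var.project}-function", so a second instance would collide on the bucket name; you would need to make that name unique (for example, by including var.function_name in it) before trying this. With that adjusted, a hypothetical second invocation looks like:
# Hypothetical second function reusing the same module
# (requires the bucket-name tweak described above)
module "another_function" {
  source               = "./modules/function"
  project              = var.project
  function_name        = "another-function"
  function_entry_point = "app"
}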
In the terraform/outputs.tf file, add the return values from the module you want to use. Since the module only returns one output value, your file should look like this:
# terraform/outputs.tf
output "function_url" {
  # Access the module output with module.<module_name>.<output_name>
  value = module.my_function.function_url
}
Now let’s see how to deploy all the resources with the Terraform CLI.
5. Deploying
The hard work is already done! Creating the infrastructure should be an easier step.
Run the following commands from the root of your repository to create all the resources and deploy your code:
# Make sure you are on the terraform folder
cd terraform
# Initialize your configuration
terraform init
# Plan the configuration
terraform plan
# Create all the resources
terraform apply
If everything works well, you will see output similar to this in your terminal:
Apply complete! Resources: 6 added, 0 changed, 0 destroyed.
Outputs:
function_url = https://us-central1-my-project-1234567.cloudfunctions.net/my-function
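If you need the URL again later, you don't have to re-run apply; Terraform's standard output command prints any declared output value from the state:
# Print the function URL from the Terraform state
terraform output function_url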
You can verify that it works with a simple curl command. Remember to replace the URL with your own.
curl https://us-central1-my-project-1234567.cloudfunctions.net/my-function
{"works":true}
Updating the function
Your first deploy is never final. Eventually, you will want to deploy new versions of the code that runs in the Cloud Function.
After changing and testing your code, you can simply run terraform apply in your terminal. Terraform will compress your source files, store them in the Cloud Storage bucket, and update the function with the new code.
Destroying the function
You can clean up all the resources created by running terraform destroy.
The project won’t be deleted this way (it wasn’t created by Terraform). For that, you can run:
# Delete the project
gcloud projects delete PROJECT_ID
6. Going further
This tutorial provides a quick way to get started. Many other good practices can be incorporated to build a more robust application:
Remote Terraform backend. If you check your repository, you will notice that a state file was created by Terraform. It's a good practice to store this file in remote storage. You can change the backend from "local" to a Cloud Storage bucket, for example, as sketched back in step 4. See the list of available backends here.
Multiple environments. You might want to deploy the same infrastructure under different environments (e.g. development & production). There are many ways to do this with Terraform, and you will find plenty of tutorials around.
Continuous deployment. Ideally, you shouldn't be running terraform plan and terraform apply from your local machine. This should be done as part of the automation process of a CI/CD solution such as Cloud Build or GitHub Actions; a rough sketch follows below.
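As a rough illustration only (the workflow name, secret name, and trigger are all assumptions, not part of this tutorial), a GitHub Actions job could run the same two Terraform commands on every push to the main branch. The google provider picks up credentials from the GOOGLE_CREDENTIALS environment variable, and this presumes a remote backend (previous point), since local state would not persist between CI runs:
# .github/workflows/deploy.yml -- hypothetical workflow, adjust to your repo
name: deploy
on:
  push:
    branches: [main]
jobs:
  terraform:
    runs-on: ubuntu-latest
    env:
      # Service account key JSON stored as a repo secret (assumed name)
      GOOGLE_CREDENTIALS: ${{ secrets.GCP_SA_KEY }}
    defaults:
      run:
        working-directory: terraform
    steps:
      - uses: actions/checkout@v2
      - uses: hashicorp/setup-terraform@v1
      - run: terraform init
      - run: terraform apply -auto-approve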
This whole tutorial, plus some of these improvements, is implemented in this repository on GitHub. Check it out!