JAMStack deployment with Azure DevOps Pipeline
OssiDev
Posted on July 18, 2021
I often do freelance work for customers who have their own servers. These are usually the kind of web hosting solutions where you don't have root access: just Apache or Nginx pointing at a specific folder and serving PHP files. Often enough, the customer has a website running on WordPress there.
Not having Node.js available on a server limits a web developer's capabilities, when nowadays a big portion of us work with Node.js apps. I do too, and I often build separate front-end applications with JavaScript frameworks while a headless CMS provides the site with content. With this setup I can often do Jamstack builds.
Since Jamstack requires you to rebuild your website each time your content changes, I would need Node.js running on the server. That isn't happening, so I had to figure out an alternative way to build front-end applications, away from the server. The good thing is I have access to Azure, meaning I could use its DevOps Pipelines to build the application for me, deploy the static output to Blob Storage, and then dispatch an event back to the server. The server would then fetch the built website and extract it to be hosted.
The thing was, I wasn't sure whether external events from a platform like WordPress or Strapi could trigger the build. Luckily, they can, but let me walk you through the process.
Requirements
I will walk you through what I did and provide links to the resources (docs) I read to get everything done. I use the Azure CLI to create resources, as it's much faster and cleaner for the purposes of this tutorial than providing screenshots of the Portal.
Azure provides a lot of documentation for its services, but in my experience Microsoft has a habit of sprinkling things here and there, so you have to know where to look.
You need the following things to build this pipeline:
- An Azure subscription
- A repository that can be integrated with Azure DevOps
- An Azure DevOps organization
- The Event Grid resource provider enabled
- A storage account and a container
- An API with an endpoint that can validate the event subscription and then receive events when blobs are created in Blob Storage. I will refer you to this how-to guide on how to validate your subscription.
I will assume you can set up the first four, and the last one, yourself, so let's simply look at setting up the storage account and the pipeline.
Setting up a storage account
We need to set up a storage account and a container, or use an existing one, to hold our finalized build.
- Use the Azure CLI to create a storage account.

az storage account create --name <storage-account-name> --resource-group <resource-group-name>

Grab the "id" property value (the fully qualified identifier) from the response JSON in your CLI, which looks something like /subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Storage/storageAccounts/{storageAccountName}. (Tip: appending --query id --output tsv to the command prints just this value.)
- Create a container.

az storage container create --name deployment --account-name <storage-account-name> --public-access blob

If you wish for stricter security, consider dropping the --public-access flag and going the private route.
- Create an event subscription.

Before you create an event subscription, make sure your API endpoint is able to complete the validation handshake and receive events (a minimal sketch follows below). Then use the "id" from your storage account creation to create the subscription.

az eventgrid event-subscription create --source-resource-id <storage-account-fully-qualified-identifier> --name nodeAppDeployment --endpoint <your-api-endpoint> --included-event-types Microsoft.Storage.BlobCreated
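To illustrate the handshake, here's a minimal sketch of such an endpoint in TypeScript with Express. The route path, port, and logging are my own assumptions, not part of the original setup; the handshake itself (echoing validationCode back as validationResponse) is the documented Event Grid behavior (see the how-to guide in the resources).

import express, { Request, Response } from "express";

const app = express();
app.use(express.json());

// Event Grid POSTs an array of events to the subscriber endpoint.
app.post("/deployments", (req: Request, res: Response) => {
  for (const event of req.body ?? []) {
    // One-time handshake sent when the event subscription is created:
    // echo the validation code back to prove we own the endpoint.
    if (event.eventType === "Microsoft.EventGrid.SubscriptionValidationEvent") {
      res.json({ validationResponse: event.data.validationCode });
      return;
    }
    if (event.eventType === "Microsoft.Storage.BlobCreated") {
      // event.data.url points at the uploaded release.zip in Blob Storage.
      console.log(`New build available at ${event.data.url}`);
      // ...fetch and extract the archive here (see "Wrapping up").
    }
  }
  res.sendStatus(200);
});

app.listen(3000);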
DevOps
Before we get into setting up a pipeline, we first have to visit Project settings and set up a Service Connection (under the Pipelines section). Select Incoming WebHook as the connection type and add a WebHook name and a Service connection name. Don't grant access to all pipelines unless you want to!
Now, once that's done, we can create a pipeline. Simply click New pipeline on the Pipelines dashboard and add your repository configuration. Select a basic Node.js type task, and once you're on the pipeline YAML configuration page, add the following:
# Use our Service Connection WebHook (change)
resources:
  webhooks:
    - webhook: <webhook-name>
      connection: <service-connection-name>

# CI trigger
trigger:
  - master

pool:
  vmImage: ubuntu-latest

steps:
  # Install Node.js
  - task: NodeTool@0
    inputs:
      versionSpec: '14.x'
    displayName: 'Install Node.js'

  # Install packages and run build command
  - script: |
      npm install
      npm run build
    displayName: 'npm install and build'

  # Archive the /out folder
  - task: ArchiveFiles@2
    inputs:
      rootFolderOrFile: '$(System.DefaultWorkingDirectory)/out'
      includeRootFolder: false
      archiveType: 'zip'
      archiveFile: '$(Build.ArtifactStagingDirectory)/release.zip'
      replaceExistingArchive: true

  # Upload the output zip to blob storage
  - task: AzureCLI@2
    inputs:
      azureSubscription: <azure-subscription>
      scriptType: 'bash'
      scriptLocation: 'inlineScript'
      inlineScript: 'az storage blob upload --account-name <storage-account-name> -c <container-name> -f $(Build.ArtifactStagingDirectory)/release.zip -n release.zip'
The YAML configuration file is commented, but let me point out a few things here.
- First, set your webhook settings (the webhook and connection names) in the configuration file.
- The app being built is a Next.js app, so npm run build executes next build && next export; that's why the output folder is out. If you use a different framework, you have to adjust this configuration.
- You need to authorize your Azure subscription so you can use it in the upload task, and enter your subscription name in the azureSubscription field. What I did was grab a task like Azure file copy, input the configuration via the UI, and from there I got the correct name and ID for the YAML file. Since this is a Linux agent, we have to use the Azure CLI to upload the file; Azure file copy only works with PowerShell, i.e. Windows agents.
- I didn't take any environment variables into consideration here. You will have to deal with those yourself.
That's pretty much it.
Triggering the pipeline
That's simple. You just make a POST request to the following URL:
https://dev.azure.com/<organization-name>/_apis/public/distributedtask/webhooks/<webhook-name>?api-version=6.0-preview
The problem here is that whoever finds this URL can cause you harm and potentially make your pipeline run constantly by pinging it. That's why you can set up a Secret and an HTTP Header in the Service Connection. You generate an HMAC-SHA1 checksum of your payload (I'm not sure if other message digest algorithms are supported) and add it to the specified header. The secret is, of course, the key used to generate that checksum.
Azure will compare the checksums and block any request where the payload is empty or where the checksum does not match.
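As an illustration, here is a minimal TypeScript sketch of a signed trigger request. The header name X-Webhook-Checksum and the payload shape are assumptions; use whatever header name and secret you configured on the service connection. It assumes Node 18+ for the built-in fetch and top-level await.

import crypto from "crypto";

const secret = process.env.WEBHOOK_SECRET!; // the Secret configured on the service connection
const url =
  "https://dev.azure.com/<organization-name>/_apis/public/distributedtask/webhooks/<webhook-name>?api-version=6.0-preview";

// Any non-empty JSON payload works; Azure rejects empty payloads when a secret is set.
const payload = JSON.stringify({ reason: "content-updated" });

// HMAC-SHA1 checksum of the payload, keyed with the shared secret.
const checksum = crypto.createHmac("sha1", secret).update(payload).digest("hex");

await fetch(url, {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "X-Webhook-Checksum": checksum, // must match the header name set in Azure
  },
  body: payload,
});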
Wrapping up
This tutorial does not tell you what to do with the archived output file in your API; you're going to have to figure that out yourself. What I did was simply copy the file from storage (the URL is provided in the BlobCreated event payload) and extract it into the Apache-hosted folder.
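For what it's worth, that last step could look roughly like the sketch below (in TypeScript, though in a setup like this your server-side language may well be PHP). The paths and the use of the system unzip binary are assumptions for illustration.

import { execFileSync } from "child_process";
import { writeFileSync } from "fs";

// Called with event.data.url from the BlobCreated event payload.
async function deploy(blobUrl: string): Promise<void> {
  const res = await fetch(blobUrl);
  writeFileSync("/tmp/release.zip", Buffer.from(await res.arrayBuffer()));
  // Overwrite the Apache-served folder with the new static build.
  execFileSync("unzip", ["-o", "/tmp/release.zip", "-d", "/var/www/html"]);
}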
As always, there are alternatives to Azure and to any kind of pipeline. You can use services like Vercel or Netlify to host your front-end application and skip building any of this. I believe both offer some type of webhook trigger mechanism to initiate a re-build. You can also host the static website in Azure Storage and set up a CDN and everything else in Azure. This is probably cheaper, and definitely more scalable, than some non-cloud-based virtual machine.
The pros of this approach are that I don't have to run Git or Node.js on the server just to fetch the latest code changes or build the app. I can fetch the repo and install packages in an isolated container; if a build fails, it doesn't affect the website in production. You can also run your tests in the same pipeline to make sure you haven't broken anything, and using TypeScript can give you some extra confidence at compile time.
Resources
Below are some resources that helped me figure out what's needed. I've been playing around with Azure DevOps a lot lately, so that experience helped me do this in an hour.
Installing Azure CLI: https://docs.microsoft.com/en-us/cli/azure/install-azure-cli
Jamstack: https://jamstack.org/
Azure CLI - az storage account: https://docs.microsoft.com/en-us/cli/azure/storage/account?view=azure-cli-latest
Azure CLI - az eventgrid event-subscription: https://docs.microsoft.com/en-us/cli/azure/eventgrid/event-subscription?view=azure-cli-latest
Receive events from Azure Event Grid to an HTTP endpoint: https://docs.microsoft.com/en-us/azure/event-grid/receive-events#endpoint-validation
Generic webhook based triggers for YAML pipelines: https://docs.microsoft.com/en-us/azure/devops/release-notes/2020/pipelines/sprint-172-update?WT.mc_id=DOP-MVP-21138#generic-webhook-based-triggers-for-yaml-pipelines
Enable Event Grid resource provider: https://docs.microsoft.com/en-us/azure/event-grid/custom-event-quickstart-portal#enable-event-grid-resource-provider