Stop Using Servers to Handle Webhooks
Richard Moot
Posted on October 11, 2019
Webhooks are increasingly becoming the main method for getting real-time data from different services. GitHub, Slack, SendGrid, and even Square use webhooks to let you see data or be notified of events happening on your account. Webhooks are awesome, since they are fairly easy to deal with and save developers from building some archaic polling system that ends up being fairly wasteful in terms of network requests made vs. actual useful data retrieved.
When creating a service to process webhooks, you have a few choices available: you can extend your application to handle incoming data from a defined URL, you can create a microservice, or you can create a Function as a Service (FaaS) function for processing your webhooks. We'll briefly go through each of those options and the possible tradeoffs, and then we'll wrap up with an example implementation of a FaaS webhook handler for Square.
Extending your Application
Extending your application gives you the benefit of leveraging any helpers or other libraries you already have in it. Your helpers (or other application tools) can assist with processing the incoming data and might make it easier to manage. Your application is likely running continually anyway, so there isn't a problem with having it also listen for incoming webhook data. However, this approach has a drawback: you might be extending your application to handle something that isn't core functionality, or that shouldn't really be coupled to it. Depending on how your application is structured, it might be best to move webhook handling outside the application entirely.
Microservice
Meanwhile, a microservice approach moves webhook handling a step away from your application: the microservice receives and processes the new data, and your application consumes it later. Unfortunately, we still have the drawbacks of scalability and provisioning, since something still needs to be continually listening for new data sent to the webhook handler. Although it's entirely possible to estimate how much data might be coming in and provision accordingly, there is still likely to be a lot of downtime where the service is simply waiting to serve a request.
Function as a Service
At this point it's pretty obvious that I am going to advocate for all the wonderful benefits of using FaaS for processing webhooks, though I do acknowledge there are some pretty annoying tradeoffs. First, the benefits. Using FaaS for processing webhook data allows for nearly unlimited scalability, so you don't have to worry about being over- or under-provisioned. Your function only runs when a new event occurs, so you can save infrastructure costs by not running a server continuously just for processing webhook data. On the other hand, the drawbacks of FaaS usually center on maintainability, testing, and cold starts. There are tools that help with versioning your functions, deploying them, and keeping them warm. And since webhooks don't directly serve users, and most webhook providers are fairly forgiving about response times, FaaS is really well suited to processing webhooks despite the cold-start issue.
Working Example
So this is all good in theory, but it’s better to show an example of how we could implement a webhook handler on a FaaS platform. This example will be on Google Cloud Platform using their Google Cloud Functions, but the majority of what we cover would translate across platforms since we’re using JavaScript.
For starters, we want to service the webhook request as quickly as possible, since we don't want it to time out. If our webhook handler repeatedly takes too long to respond and times out, many webhook systems will stop sending events to our webhook URL and assume that it is no longer working. Our goal is to minimize the amount of processing before sending back our 200 response, so that we can absorb any cold-start lag our function may have.
To keep things easy and fast, we're just going to write the JSON payload we receive into a JSON file and upload it to Google Cloud Storage. This lets our webhook handler respond to the request quickly, and we can periodically check the bucket for new events, or even write another Google Cloud Function that processes the new JSON files.
An easy way to get started, if you're completely new to FaaS, is to use Serverless, a tool that helps facilitate creating and deploying functions to cloud providers. You can use their quick-start guide to generate a template, and they have guides on getting your credentials set up for each provider as well. Here's what a slightly filled-out Serverless template looks like for our webhook handler:
const fs = require('fs');
const { Storage } = require('@google-cloud/storage');

const BUCKET_NAME = ''; // This would actually have the name of our bucket

const storage = new Storage({
  projectId: '', // Your Google Cloud project ID, where the function is deployed and the bucket lives
  keyFilename: './keyfile.json'
});

exports.webhook = (request, response) => {
  const data = JSON.stringify(request.body, null, 2);
  const fileName = `/tmp/${request.body.location_id}_${request.body.entity_id}_${Date.now()}.json`;
  fs.writeFileSync(fileName, data);
  storage
    .bucket(BUCKET_NAME)
    .upload(fileName)
    .then((success) => {
      fs.unlinkSync(fileName);
      console.log(success);
      response.status(200).send();
    })
    .catch((error) => {
      // A storage failure is a server-side problem, so report a 500.
      fs.unlinkSync(fileName);
      console.log(error);
      response.status(500).send(error);
    });
};

exports.event = (event, callback) => {
  callback();
};
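For reference, Serverless deploys a handler like this from a short serverless.yml configuration. The sketch below assumes the Google Cloud Functions provider plugin; the service name, project ID, and credential path are placeholders, not values from this post:

```yaml
# Hypothetical serverless.yml for the handler above; project and
# credential values are placeholders.
service: webhook-handler

provider:
  name: google
  runtime: nodejs10
  project: YOUR_PROJECT_ID
  credentials: ./keyfile.json

plugins:
  - serverless-google-cloudfunctions

functions:
  webhook:
    handler: webhook   # matches exports.webhook
    events:
      - http: webhook  # exposes an HTTP endpoint for the webhook
```

With this in place, `serverless deploy` packages the function and prints the HTTPS URL you'd register with your webhook provider.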
Our example gives a simplified version of how our final webhook handler will function. We stringify the incoming JSON and write it to the /tmp/ directory using the fs module. Then we send it straight to Google Cloud Storage using their Node SDK. Finally, we clean up the temporary JSON file we created locally and log our success before sending our 200 response. The full version below adds one important step: verifying that the request actually came from Square.
'use strict';

require('dotenv').config();

const fs = require('fs');
const crypto = require('crypto');
const { Storage } = require('@google-cloud/storage');

const projectId = 'YOUR_PROJECT_ID';
const storage = new Storage({
  projectId: projectId,
  keyFilename: './keyfile.json'
});
const BUCKET_NAME = 'YOUR_BUCKET_NAME';
const REQUEST_URL = 'https://us-central1-YOUR_PROJECT_ID.cloudfunctions.net/webhook';

// Recompute the HMAC-SHA1 signature Square sends and compare it to the
// X-Square-Signature header. Note: this re-serializes the parsed body;
// if the raw payload is available (request.rawBody on Cloud Functions),
// prefer that to avoid serialization mismatches.
function isFromSquare(url, request, sigKey) {
  const hmac = crypto.createHmac('sha1', sigKey);
  hmac.update(url + JSON.stringify(request.body));
  const hash = hmac.digest('base64');
  return request.get('X-Square-Signature') === hash;
}

exports.webhook = (request, response) => {
  if (isFromSquare(REQUEST_URL, request, process.env.SIG_KEY)) {
    const data = JSON.stringify(request.body, null, 2);
    const fileName = `/tmp/${request.body.location_id}_${request.body.entity_id}_${Date.now()}.json`;
    fs.writeFileSync(fileName, data);
    storage
      .bucket(BUCKET_NAME)
      .upload(fileName)
      .then((success) => {
        fs.unlinkSync(fileName);
        console.log(success);
        response.status(200).send();
      })
      .catch((error) => {
        // A storage failure is a server-side problem, so report a 500.
        fs.unlinkSync(fileName);
        console.log(error);
        response.status(500).send(error);
      });
  } else {
    // Reject payloads that don't carry a valid Square signature.
    console.log(request);
    response.status(401).send();
  }
};

exports.event = (event, callback) => {
  callback();
};
The above webhook handler shows how to handle events coming from our Square account. We've added verification of the X-Square-Signature header to validate that the payload actually came from Square. It's always worth checking whether a webhook service offers some way to verify the data being sent, since bad actors can disrupt or manipulate services by sending malicious data to your webhook handler.
Verifying the signature here lets us be sure we're not storing arbitrary payloads in our Google Cloud Storage bucket. From here, you can create another Google Cloud Function to process the new data as it comes in, or simply have your application periodically check the storage bucket for new events to process.
For example, you could check whether a refund is above a certain limit, monitor your inventory for an item that is running low, or watch for when a high-value item sells. You can find more information about the events you can track using Square's webhooks here.
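As a sketch, the refund check could be a small pure function run over each stored event file. The field names and threshold below are illustrative assumptions, not Square's actual payload schema:

```javascript
// Hypothetical check run over each stored webhook event. The
// event_type/amount_money fields and the limit are assumptions.
function isLargeRefund(event, limitCents = 10000) {
  return (
    event.event_type === 'refund.created' &&
    event.data.amount_money.amount > limitCents
  );
}

// Example event shaped like the JSON files the handler writes.
const stored = {
  event_type: 'refund.created',
  location_id: 'L123',
  data: { amount_money: { amount: 25000, currency: 'USD' } }
};
console.log(isLargeRefund(stored)); // true: 25000 > 10000
```

Keeping checks like this as pure functions makes them trivial to unit test, whether they run in a second Cloud Function or inside your application's periodic bucket scan.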
I highly recommend giving Serverless a try and creating your own webhook handlers as a way to react to different events in your Square account. If you don't already have a Square account, make sure to sign up at https://squareup.com/developers. Let us know how you've used FaaS or webhooks in the comments; we'd love to hear more!
Want more? Sign up for our monthly developer newsletter.