AWS Lambda and DynamoDB - some first steps
Graham Trott
Posted on May 18, 2019
When starting out with any new technology, the first steps are always the most difficult. When you return later to do the same actions again everything seems so simple, it's hard to remember what an uphill task it was the first time. In my case the task was to do something with AWS Lambdas, using JavaScript in the form of Node.js. Since it was only a week or so ago I can still well remember all the problems I encountered, so I figure it's a good time to write about the experience while I'm still at that beginner level.
There are countless resources out there on how to set up an AWS account, so there's little point me going through it in detail, especially as the details tend to change over time so detailed documentation can quickly go out of date. The main thing is to ensure you start with the 'free tier', meaning you don't have to pay anything for a year unless you generate a spectacular amount of traffic. Even after that time, some AWS features remain free for low usage; 'low' being a quite generous allocation of resources for someone just learning about the system. Once you've created your account, go to the AWS Management Console where you'll see a long list of services. I'll be using 4 of them here.
The task
The task I set myself was to create a couple of REST endpoints; one to write stuff to a database and the other to retrieve it. These are the main elements a static website can't provide (as it requires server code), so it could be useful to have them available as standalone functions that can be called from anywhere. I'm increasingly building 100% client-side (JavaScript) websites so it's pretty important to nail the storage issue.
The outline of the task is to write a couple of AWS Lambda functions that deal with the DynamoDB database; one to POST data to it and the other to GET something from it. It turns out that 2 other AWS services will be called into play as well, these being API Gateway and IAM, so I'll run through how the 4 services fit together. I'll start with the ones that have the least dependency on anything outside themselves; that's DynamoDB and IAM.
DynamoDB
DynamoDB is a NoSQL database, which means it doesn't talk SQL. However, the JavaScript interface to it is pretty simple, as you'll see shortly. The console for this service is quite straightforward. I recommend that before starting coding you spend a bit of time creating some tables, manually populating them with test data and doing scans and/or queries to retrieve the data. All this is covered in depth by AWS documentation and the management interface itself is one of the more friendly ones you'll find in AWS.
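As a taste of what that interface looks like, both a scan and a query boil down to building a plain parameter object and handing it to the client. A sketch (the table name `mytable` and the key name `name` are invented for illustration):

```javascript
// The DocumentClient talks in plain JavaScript objects. A scan of a
// whole table and a query by partition key are both just parameter
// objects. (Table and attribute names here are made up.)
const scanParams = {
    TableName: 'mytable',
    Limit: 10            // return at most 10 items
};

// 'name' could clash with a DynamoDB reserved word, so it goes in
// via ExpressionAttributeNames rather than directly in the expression.
const queryParams = {
    TableName: 'mytable',
    KeyConditionExpression: '#k = :v',
    ExpressionAttributeNames: {'#k': 'name'},
    ExpressionAttributeValues: {':v': 'about'}
};

console.log(JSON.stringify(scanParams));
console.log(JSON.stringify(queryParams));
```

These are the objects you would pass to `dynamo.scan()` and `dynamo.query()` respectively once you have a DocumentClient instance.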
I would like my two endpoints to be as generic as possible so they can interface to many different tables. A lot of the tables I'll be using have a rather similar structure; each record has a primary partition key and an arbitrary set of attributes. In my endpoint handlers, the name of the table and of the primary key will both be variables. So for example, one table might contain HTML fragments, keyed by a unique name, and another table holds data about specific keywords where the value includes the name of one of the fragments in the first table. So the partition key for the first table might be 'name' and for the second table will be 'keyword'. Similarly for the data; the first table calls it 'script' and the second one 'value'. When you look up a keyword in the second table you'll get back a chunk of data for it, including the name of a page that describes it, allowing you to search the first table to retrieve the HTML. All a bit artificial but quite valid. The tables are structurally similar, so the same Lambda function should be able to handle either of them.
IAM
That's all I need to say for now about the database, so let's move on. The second service is IAM, or Identity and Access Management. You may already have come across it while setting up your account, because AWS will have recommended you create a user and not do everything in the root account. The main thing you need in here is to set up a "role", which is a block of permissions that allows other services to do their work.
Under the Roles menu item you'll find a button to create a new Role. Give it a name like GenericEndpoint. You'll need to add two sets of permissions; one is AmazonDynamoDBFullAccess and the other is AWSLambdaBasicExecutionRole. Both should be fairly self-explanatory (or will soon be).
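For reference, the trust relationship behind such a role (the part that lets the Lambda service assume it) is a standard policy document like the one below. The console sets this up for you when you choose Lambda as the trusted service, so you shouldn't normally need to type it by hand:

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"Service": "lambda.amazonaws.com"},
            "Action": "sts:AssumeRole"
        }
    ]
}
```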
Lambda
Now we get to the point where we can do some coding. Go to the Lambda service and create a new function. This is a chunk of code that will be called into existence when someone hits your endpoint, do its job, then disappear again. No code is left running so it costs you nothing while it's inactive. You can create Lambdas in a variety of languages but I'll use Node.js here. Again, consult the standard documentation if you need help.
Near the bottom of the Lambda Dashboard is a dropdown for Execution Role. Here you choose the role you created in IAM. Your code now has all the permissions it needs to run and to interact with DynamoDB.
Further up is a code editor, so let's put some code into it.
The POST endpoint
const AWS = require('aws-sdk');
AWS.config.update({region: 'eu-west-2'});
const dynamo = new AWS.DynamoDB.DocumentClient();

exports.handler = (event, context, callback) => {
    // The POSTed JSON arrives as a string in event.body
    const params = JSON.parse(event.body);
    const TableName = params.table;
    // Build the item from the key and value names supplied by the caller
    const Item = {};
    Item[params.kName] = params.kValue;
    Item[params.vName] = params.vValue;
    dynamo.put({TableName, Item}, function (err, data) {
        if (err) {
            console.log('error', err);
            callback(err, null);
        } else {
            const response = {
                statusCode: 200,
                headers: {
                    'Content-Type': 'application/json',
                    'Access-Control-Allow-Methods': 'GET,POST,OPTIONS',
                    'Access-Control-Allow-Origin': 'https://my-domain.com',
                    'Access-Control-Allow-Credentials': 'true'
                },
                isBase64Encoded: false
            };
            // put() returns an empty object unless ReturnValues is set,
            // so just log the success
            console.log('success', data);
            callback(null, response);
        }
    });
};
At the top we create a database client instance to work with, then we have a handler for a POST request. The event argument carries the posted data, which is all in the body element. Here the table is named. The bit that follows creates an Item comprising a named key and its value. Because I wanted to handle multiple tables, the name of the key will not always be the same, so instead of hard-coding everything I've put the name of the table and the names of the key and data into POSTed parameters. The name of the key is passed as kName and its value is passed as kValue. Similarly, the name of the value attribute is taken from vName and its value from vValue.
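To make the mapping concrete, the body-to-item step can be pulled out as a small helper. This is just a sketch; the function name buildPutParams is mine, not part of the handler above:

```javascript
// Build the parameters for dynamo.put() from the POSTed fields.
// The table name and both attribute names arrive as data, so the
// same code serves any table with this one-key-one-value shape.
function buildPutParams(params) {
    const Item = {};
    Item[params.kName] = params.kValue;
    Item[params.vName] = params.vValue;
    return {TableName: params.table, Item};
}

// Example with the values used below in the article:
const putParams = buildPutParams({
    table: 'mytable',
    kName: 'name', kValue: 'about',
    vName: 'script', vValue: 'This is my script content'
});
console.log(JSON.stringify(putParams));
```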
For example, let's assume the table is called mytable, its primary key is name, its data is in an attribute (like a column in SQL) called script, the name of the item we're writing is about and it has the content This is my script content. In this case the POST data would be

{
    "table": "mytable",
    "kName": "name",
    "kValue": "about",
    "vName": "script",
    "vValue": "This is my script content"
}
If this seems a little complex, for comparison here's the code you would use if just one table were to be handled. The table name and the keys are all hard-coded in this version:

const TableName = 'mytable';
const Item = {
    name: params.name,
    script: params.script
};

where the table name is mytable, the primary key is name and the data is script. Here's the POST data that corresponds:

{"name":"about","script":"This is my script content"}
The call to DynamoDB takes the table name and the item and returns either an error or potentially some data. The latter is packaged up into a suitable response and returned to the caller. Important: see the note below about CORS, which is relevant if an error ever occurs.
The GET endpoint
The GET endpoint has a similar script:
const AWS = require('aws-sdk');
AWS.config.update({region: 'eu-west-2'});
const dynamo = new AWS.DynamoDB.DocumentClient();

exports.handler = (event, context, callback) => {
    // For a GET the parameters arrive in the query string
    const TableName = event.queryStringParameters.table;
    const Key = {};
    Key[event.queryStringParameters.key] = event.queryStringParameters.value;
    dynamo.get({TableName, Key}, function (err, data) {
        if (err) {
            callback(err, null);
        } else {
            const response = {
                statusCode: 200,
                headers: {
                    'Content-Type': 'application/json',
                    'Access-Control-Allow-Methods': 'GET,POST,OPTIONS',
                    'Access-Control-Allow-Origin': 'https://my-domain.com'
                },
                body: JSON.stringify(data.Item),
                isBase64Encoded: false
            };
            callback(null, response);
        }
    });
};
The difference here is the element in event that contains your query parameters, and the use of Key instead of Item. The query string in this case, to return the value we just wrote, would be

?table=mytable&key=name&value=about
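Going the other way, a client can build that query string from the same kind of triple. Again just a sketch; buildQuery is my name for it:

```javascript
// Build the GET query string from a table/key/value triple,
// URL-encoding each part so awkward characters survive the trip.
function buildQuery(table, key, value) {
    return '?table=' + encodeURIComponent(table)
         + '&key=' + encodeURIComponent(key)
         + '&value=' + encodeURIComponent(value);
}

console.log(buildQuery('mytable', 'name', 'about'));
// → ?table=mytable&key=name&value=about
```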
API Gateway
The final part of the jigsaw is API Gateway. This, as its name suggests, interfaces the other AWS services to the outside world. One Gateway serves for both GET and POST (and also PUT etc.) so give it a name that relates to your Lambda. In the Dashboard, click the Actions button to create methods for GET and for POST. Then click it again and choose the CORS action, which allows you to specify who will be allowed to access your endpoints (the default being 'all'). For each of GET and POST specify the Lambda that will be invoked, and also select Use Lambda Proxy integration.
Don't forget to click Deploy API in Actions every time you make a change to a method. The stage can be named anything you like but most people choose dev or prod. The endpoint URL will then be revealed to you.
API Gateway has a useful test feature that gives you direct access to log information for when things don't go as expected (as will almost certainly be the case the first few times). For the GET you'll need to go into Method Request and set up URL Query String Parameters, and for POST the body parameters (as shown above) must be typed into the box provided. Then you can click Test and see what happens.
CORS
Judging by the questions being asked about it, CORS is one of the more tricky aspects of client-server programming, yet as far as I can see it's actually quite simple. There are however a couple of wrinkles, at least in the AWS environment.
One problem that had me scratching my head for a couple of days was that my GET endpoint worked fine but the POST endpoint kept reporting CORS errors, complaining that the right permissions weren't set. This was true, but not for the reason I expected. It turns out I had a typo in my parameter list, which caused JSON.parse() to fail. This meant the call to DynamoDB never actually happened and my endpoint returned with an empty response code. The reason why this causes a CORS error is that when using Lambda Proxy integration, API Gateway only sets up for a 200 response. If you want to handle any other response code you have to do it yourself manually, or your browser will refuse the error response because it lacks the required CORS headers.
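One way round this is to catch errors in the handler yourself and always return a well-formed response with the CORS headers attached, whatever the status code. A sketch (the helper name and the exact header set are mine):

```javascript
// Wrap any outcome - success or failure - in a response that
// always carries the CORS headers, so the browser will accept it.
function corsResponse(statusCode, bodyObject) {
    return {
        statusCode,
        headers: {
            'Content-Type': 'application/json',
            'Access-Control-Allow-Methods': 'GET,POST,OPTIONS',
            'Access-Control-Allow-Origin': 'https://my-domain.com'
        },
        body: JSON.stringify(bodyObject),
        isBase64Encoded: false
    };
}

console.log(JSON.stringify(corsResponse(400, {error: 'bad JSON in request body'})));
```

In the POST handler you might then wrap the JSON.parse() in a try/catch and, on failure, call callback(null, corsResponse(400, {error: ...})) instead of letting the Lambda fall over with no headers at all.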
Finally
Once things are working in the test environment you can set up Postman to throw some test data at the endpoint and then retrieve it. Use the endpoint URL you obtained from API Gateway a couple of paragraphs back; for GET add your query parameters to it and for POST put the data in the request body. After that you can try calling your Lambda from a real web page, such as the following:
HTML
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Lambda Test</title>
    <script type='text/javascript' src='/testg.js'></script>
</head>
<body>
</body>
</html>
JavaScript (amend the URL line as appropriate)
window.onload = function () {
    const createCORSRequest = function (method, url) {
        // let, not const: the IE fallback below reassigns xhr
        let xhr = new XMLHttpRequest();
        if ('withCredentials' in xhr) {
            // Most browsers.
            xhr.open(method, url, true);
        } else if (typeof XDomainRequest != 'undefined') {
            // IE8 & IE9
            xhr = new XDomainRequest();
            xhr.open(method, url);
        } else {
            // CORS not supported.
            xhr = null;
        }
        return xhr;
    };

    const method = 'GET';
    const url = 'https://k84msuyg7a.execute-api.eu-west-2.amazonaws.com/prod?table=easycoder-script&key=name&value=fileman';
    const request = createCORSRequest(method, url);
    request.setRequestHeader('Content-Type', 'application/json; charset=UTF-8');
    // Attach the handlers before sending the request
    request.onload = function () {
        const content = request.responseText;
        console.log(content);
    };
    request.onerror = function () {
        const error = request.responseText;
        console.log(error);
    };
    request.send();
};
That's the end of my brief résumé of the joys of AWS, captured while the pain was still fresh. I'm sure there are plenty of inaccuracies; I'll be pleased to hear about them and will make amendments as appropriate. I hope other newcomers to AWS will find the article useful.
Title photo by Jan Antonin Kolar on Unsplash