The serverless development model has become a powerful way for developers to deploy and scale applications without worrying about the complexities of managing servers.
Cloud providers like Amazon Web Services (AWS) do the heavy lifting of provisioning, maintaining, and scaling physical infrastructure so that developers can focus on the end product. In fact, many developers today can and do build entire serverless applications on the cloud.
In this post, you'll learn how Lambda and IAM unlock the power and versatility of the cloud as you implement a serverless User API that you can expand on as you grow and explore the many services AWS has to offer. Let's get started!
What even is a Lambda?
In the spirit of serverless, AWS Lambda is a cloud computing service that lets you run code without provisioning or managing servers. Think of it as an execution environment on AWS’s servers where you can deploy and run your code without having to sweat all the low-level details like updating operating systems or upgrading hardware.
Code deployed to Lambda is organized into functions, with each function ideally designed to accomplish just one job. For example, the application we will build will have two functions, one to read from the database and one to write to the database.
This ensures that each Lambda function has a single responsibility and helps keep things organized and readable. Below is a sneak peek at the create user function we’ll make later on to hopefully show you that Lambda isn't as intimidating and nebulous as its name might sound.
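Here is a minimal sketch of what such a handler might look like; the request fields and the table's "id" key are my assumptions for illustration, and we'll wire up the TABLE_NAME environment variable later with SAM.

```python
# create_user/app.py -- a minimal sketch of a "create user" handler.
# The request fields and the table's "id" key are illustrative assumptions.
import json
import os

import boto3

# The table name is injected as an environment variable by our SAM template
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table(os.environ["TABLE_NAME"])


def lambda_handler(event, context):
    # API Gateway hands us the HTTP request body as a JSON string
    user = json.loads(event["body"])

    # Write the new user to DynamoDB
    table.put_item(Item={"id": user["id"], "name": user.get("name")})

    return {
        "statusCode": 201,
        "body": json.dumps({"message": "User created", "id": user["id"]}),
    }
```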
Since Lambda is an AWS service like any other, using it to communicate with and link together other AWS services is a breeze. It can, however, run almost any arbitrary code, so it can also talk to third-party services or even other cloud providers. Because of this, the use cases for Lambda functions are nearly limitless, and Lambda has become a staple component of many serverless applications.
Unlocking the Cloud with IAM
By default, every resource you create on AWS, be it a Lambda function or a database, is locked down by AWS Identity and Access Management (AWS IAM). This ensures access to resources created in your account is minimized, intentional, and follows the principle of least privilege.
There are three important constructs in IAM used to define access:
Permissions: Permissions make up the atoms of IAM and describe individual allowed actions, such as writing to a database.
Policies: Policies are logical groups of permissions, and come in two main types:
Identity-based policies: Specify what an identity can do, such as what permissions a role is entitled to.
Resource-based policies: Specify who has access to a resource and what actions they can perform on it.
Roles: Roles are identities assumed by users and by resources like Lambda functions in order to receive the privileges they need to accomplish their task. A role can be given one or more policies; a sketch of a simple identity-based policy follows below.
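To make that concrete, here is roughly what a single identity-based policy might look like, written in the YAML form you'd use inside a CloudFormation or SAM template. The account ID and region in the ARN are placeholders, not values from our project.

```yaml
# A minimal identity-based policy: whoever holds it may write items
# to one specific DynamoDB table and nothing else.
# The account ID and region in the ARN are placeholders.
Version: "2012-10-17"
Statement:
  - Effect: Allow
    Action:
      - dynamodb:PutItem
    Resource: arn:aws:dynamodb:us-east-1:123456789012:table/UserTable
```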
It may be tempting at first to be extremely liberal with permissions in order to get things working, but it is important that your permissions are scoped down to the minimum set that your function needs to do its job. Granting permissions that are too broad increases the risk of components being used to obtain or modify data in unwanted ways.
In short, in order to allow our components to communicate with each other we'll need:
A resource-based policy on our Lambda function to specify who can invoke it
A role that our Lambda function can assume, with an identity-based policy that specifies what that role can do
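SAM will end up creating the resource-based policy for us automatically, as we'll see shortly, but written out by hand as a CloudFormation resource it might look roughly like this; the POST /users path in the SourceArn is an assumption for illustration.

```yaml
# A resource-based policy on the Lambda function: it allows API Gateway
# (and only API Gateway) to invoke CreateUserFunction.
# SAM will generate an equivalent permission for us later on.
CreateUserInvokePermission:
  Type: AWS::Lambda::Permission
  Properties:
    FunctionName: !Ref CreateUserFunction
    Action: lambda:InvokeFunction
    Principal: apigateway.amazonaws.com
    # Only requests coming through our API's POST /users route may invoke it
    SourceArn: !Sub arn:aws:execute-api:${AWS::Region}:${AWS::AccountId}:*/*/POST/users
```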
Putting it all together with SAM
To build and deploy our serverless application we’ll be using the AWS Serverless Application Model (SAM). SAM is a framework that allows us to describe our serverless application in code. This enables us to track changes to our architecture using version control as well as avoid trekking through complex interfaces in the AWS console that are subject to frequent change.
To get started you’ll first need to install the AWS SAM CLI. The SAM CLI is used to locally build, test, debug, and deploy serverless applications defined in SAM template files.
You can install the SAM CLI for your operating system by following the instructions in the AWS documentation.
You'll also need the AWS CLI configured with the credentials of an IAM user that has sufficient permissions to deploy the resources the SAM CLI will create on your behalf. You can set this up by following the AWS CLI configuration guide.
Next, you can clone the repository that contains the code for our application below. This repository was initialized by using the sam init command and selecting the Python quick start template when prompted. For brevity I’ve stripped away the excess boilerplate and provided the code we’ll work with.
This project contains source code and supporting files for a serverless application that you can deploy with the SAM CLI. It includes the following files and folders.
create_user - Code for the Lambda function that creates a user.
get_user - Code for the Lambda function that fetches a user.
template.yaml - A template that defines the application's AWS resources.
The application uses several AWS resources, including Lambda functions and an API Gateway API. These resources are defined in the template.yaml file in this project. You can update the template to add AWS resources through the same deployment process that updates your application code.
If you prefer to use an integrated development environment (IDE) to build and test your application, you can use the AWS Toolkit.
The AWS Toolkit is an open source plug-in for popular IDEs that uses the SAM CLI to build and deploy serverless applications on AWS.
But the real core of a serverless project is the template file. This is where all the resources and configuration of your serverless architecture are defined.
It can definitely look intimidating at first, but let's walk through each property to understand what it accomplishes for us.
Under Globals we can define properties for multiple resources at once. In this case, each of the properties under Function applies to every Lambda function defined in this file. Of note are the Handler, which specifies the entry point for our functions, and the environment variable TABLE_NAME, which will be available to our functions so they know exactly which DynamoDB table to read from and write to.
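As a rough sketch of that section (the runtime and timeout values are assumptions, not taken from the original template):

```yaml
# Globals apply to every AWS::Serverless::Function in the template.
# The runtime and timeout below are illustrative assumptions.
Globals:
  Function:
    Runtime: python3.9
    Handler: app.lambda_handler   # entry point: app.py, function lambda_handler
    Timeout: 10
    Environment:
      Variables:
        TABLE_NAME: !Ref UserTable   # tells each function which table to use
```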
Under Resources you can define almost any AWS resource! Feel free to experiment!
Here we've created our two Lambda Functions with type AWS::Serverless::Function and our DynamoDB Table with type AWS::DynamoDB::Table.
Next, let's zoom in on our create user Lambda function.
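The snippet below is a sketch of how that resource might be defined in template.yaml; the /users path and the event name are my assumptions, but the Type, CodeUri, Events, and Policies properties are the ones we'll walk through next.

```yaml
# A sketch of the create user function resource; the /users path and
# event name are illustrative assumptions.
CreateUserFunction:
  Type: AWS::Serverless::Function
  Properties:
    CodeUri: create_user/           # folder containing the function's code
    Events:
      CreateUserApi:
        Type: Api                   # creates an API Gateway endpoint for us
        Properties:
          Path: /users
          Method: post
    Policies:
      - DynamoDBWritePolicy:        # SAM policy template scoped to one table
          TableName: !Ref UserTable
```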
Starting from the top, the Type AWS::Serverless::Function is actually a unique resource type provided by SAM that will implicitly create resources in order to help us quickly accomplish the Lambda configuration we saw earlier.
The CodeUri simply describes what folder the code for our function is contained in.
Under Events we define an Api event source. Defining this under AWS::Serverless::Function will actually create an API Gateway endpoint and a resource-based policy on the function allowing API Gateway to invoke it.
Under Policies, we define a list of policies that describe what the function is allowed to do. Thanks to the AWS::Serverless::Function type, these policies will be attached to an automatically created execution role that the function assumes when it runs.
The DynamoDBWritePolicy is one of the many pre-defined policy templates SAM provides for ease of use. To ensure it doesn't grant privileges that are too broad, we've scoped it down to only allow writes to the specific user table we define further down in the file.
The definition for the GetUser function is very much the same but for reads.
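For completeness, here's a sketch of what that might look like; the GET path and the use of SAM's DynamoDBReadPolicy template mirror the assumptions above.

```yaml
# A sketch of the get user function: same shape, but a GET route
# and read-only access to the table. The path is an assumption.
GetUserFunction:
  Type: AWS::Serverless::Function
  Properties:
    CodeUri: get_user/
    Events:
      GetUserApi:
        Type: Api
        Properties:
          Path: /users/{id}
          Method: get
    Policies:
      - DynamoDBReadPolicy:
          TableName: !Ref UserTable
```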
Finally, the properties on our UserTable simply define its attributes under AttributeDefinitions, its primary key under KeySchema, and its billing mode as PAY_PER_REQUEST, which means we only pay for what we use. You can also configure provisioned capacity if you know you need to support a fairly constant rate of throughput.
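Here's a sketch of that resource, assuming a simple string id as the primary key (the attribute name is my assumption):

```yaml
# A sketch of the user table; the "id" partition key is an assumption.
UserTable:
  Type: AWS::DynamoDB::Table
  Properties:
    AttributeDefinitions:
      - AttributeName: id
        AttributeType: S          # S = string
    KeySchema:
      - AttributeName: id
        KeyType: HASH             # partition (primary) key
    BillingMode: PAY_PER_REQUEST  # pay only for what we use
```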
To see it all in action, open up a terminal in the same folder as the project and run:
sam deploy
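If this is the first time you're deploying the stack, you may want to run sam build first to package the function code, and sam deploy --guided to be walked through choosing a stack name, region, and deployment settings, which the CLI can save for future deploys.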
Verification
Upon successful deployment, you should find all the resources, roles and policies we defined in our template right in your AWS account.
Clicking on the CreateUserFunction and selecting the Configuration tab, you'll find a link to the execution role, with the DynamoDBWritePolicy we defined earlier attached to it.
Scrolling down, you'll also find the resource-based policy that allows our API Gateway to invoke this Lambda function.
The diagram is satisfied!
Feel free to poke around the deployed resources and make sure everything checks out.
When you're ready, make a POST request to the create user API endpoint. If all went well, it should invoke the create user Lambda and create a user in our DynamoDB table!
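For example, assuming the /users path from the sketches above, the request might look something like this; swap in the invoke URL printed in your sam deploy output (or found in the API Gateway console).

```bash
# Replace the URL with the endpoint from your own deployment output;
# the /users path and request fields are assumptions from the sketches above.
curl -X POST \
  -H "Content-Type: application/json" \
  -d '{"id": "123", "name": "Ada Lovelace"}' \
  "https://<api-id>.execute-api.<region>.amazonaws.com/Prod/users"
```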
And that's it! With just a tiny template file you've spun up an entire serverless application, complete with the basic resources, roles, and policies you'll need to continue your serverless journey.
If you're up for an extra challenge, try using what we've shown here to add an endpoint that deletes a user. Good luck!