Mastering Local AWS Lambda Development

skabba

Enri Peters

Posted on December 13, 2022

Introduction

Are you tired of deploying your AWS Lambda functions to AWS just to test them? Do you feel like you're constantly uploading your code just to see if it works?

Well, fear not my fellow Lambda enthusiasts! In this post, I will share with you two simple ways to do local AWS Lambda development. That's right, no more deployments or tedious back-and-forth just to test your code. You'll be able to run your Lambda functions right on your own computer, just like a boss.

So, grab a cup of coffee and get ready to unleash the full potential of your local Lambda development skills. Let's do this!

Prerequisites

To develop Lambda functions locally, you will need to have the following prerequisites installed on your system:

  • Docker to emulate the Lambda runtime on your machine
  • Python 3.6 or later
  • pipenv to not make a mess of your local Python installation

You can install pipenv by running the following command:

pip install pipenv

Once you have these prerequisites installed, you will be ready to start developing your Lambda functions locally.

Method 1 - AWS Serverless Application Model

The first method we will discuss is using AWS SAM (Serverless Application Model). AWS SAM is an open-source framework that allows developers to quickly and easily create, build, and deploy applications that run on the AWS platform.

With AWS SAM, developers can test their AWS Lambda functions locally, without having to deploy them to AWS. It does this by using a local version of the AWS Lambda runtime, which simulates the environment in which Lambda functions execute in AWS.

This is useful because it allows developers to test their Lambda functions quickly and easily, without having to incur the costs or wait for the deployment process associated with deploying to AWS. It also makes it easier to debug and troubleshoot problems with Lambda functions, since developers can see the output and errors directly on their own computers.


To install and set up the AWS SAM CLI on your local machine, follow these steps:

mkdir sam-demo
cd sam-demo
pipenv shell

NOTE: pipenv will launch a subshell in a virtual environment. To exit this subshell, type 'exit'. Don't forget to do this once you're done.

Now within your virtualenv created by pipenv you can install the SAM CLI by running:

pipenv install aws-sam-cli

Once the SAM CLI is installed, you can use the sam init command to initialise your SAM project. Use the sam init command as shown below to create a SAM project based on the "Hello World" template. This template includes a Lambda Function and a REST API. We will not be using the API part, but feel free to play with it.

sam init \
  --name my-sam-app \
  --app-template hello-world \
  --runtime python3.9 \
  --dependency-manager pip \
  --package-type Zip \
  --no-tracing

Make sure you change directories into the 'my-sam-app' folder created by SAM.

cd my-sam-app

You can use and edit the files in this 'hello-world' example to meet your needs. For example, you can change the template.yaml file to add extra serverless resource blocks. Additionally, you can change the code inside the Lambda function Python file: hello_world/app.py.
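As an illustration, a small variation on the hello_world/app.py handler might look like the sketch below. This is not the exact template contents, just a hypothetical example of the kind of change you could make: it reads an optional "name" key from the event and returns an API Gateway-style response.

```python
import json

def lambda_handler(event, context):
    # Read an optional "name" key from the incoming event, defaulting to "world".
    name = (event or {}).get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello {name}"}),
    }
```

After editing, re-run sam build --use-container and sam local invoke to see the new behaviour.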

Now that you know this, it's time to build and run your Lambda Function code by using the following commands:

sam build --use-container
sam local invoke

Congratulations! You have successfully learned the basics of using the SAM CLI tool to develop and test Lambda functions on your local machine.


This approach involves using the AWS SAM CLI and has the advantage of creating a streamlined development flow. The SAM CLI makes it easy to develop and test serverless applications locally, without the need to deploy them to AWS. This allows developers to iterate quickly and test their changes without incurring the cost and overhead of deploying to AWS.

However, it also has the disadvantage that it may not be the best choice for complex or large-scale applications. While the SAM CLI is a great tool for developing and deploying small or medium-sized serverless applications, more complex or large-scale applications may require a more advanced deployment or management solution. In these cases, you may want to consider other tools or frameworks that provide additional features and capabilities.

Method 2 - Docker run

There might come a moment, while happily using method 1, when you think "I do not want to keep running sam build every time I make a change to my Lambda Function code".

Well, there is another way to locally test your Lambda Function, but in a slightly different way.

Before, when you ran sam build, the tool was building a Docker container image behind the scenes, based on the public.ecr.aws/sam/build-python3.9 Docker image.

We can use this base image and take full control of it, moo-hahaha!


Let's give this a try by pulling this container to our machine:

docker pull public.ecr.aws/sam/build-python3.9

Once the image has been downloaded, you can use the docker run command to start a new container based on the image. When starting the container, you will need to specify the following arguments:

  • The -it flag, to be able to interact with the container
  • The --read-only flag, which makes the container's file system read-only, mimicking the AWS Lambda behaviour
  • The --tmpfs flag with the /tmp argument, which creates a writable temporary file system on /tmp inside the container, just like the one you get when running your Lambda Function in AWS
  • The -v flag, which mounts a local directory containing your Lambda function code inside the container
  • The -e flag, for setting environment variables that the Lambda runtime can use while executing your function

NOTE: A Lambda Function can only write into /tmp. In some cases (not entirely known to me), temporary storage is not deleted between warm Lambda invocations; this can help speed up the execution of your Lambda Function.
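A sketch of how a handler might use /tmp as a scratch cache that survives warm invocations (the file name and counter logic are illustrative, not part of any AWS API):

```python
import os

CACHE_FILE = "/tmp/invocation_count"  # /tmp is the only writable path in Lambda

def lambda_handler(event, context):
    # If a warm execution environment kept /tmp around, pick up the old count.
    count = 0
    if os.path.exists(CACHE_FILE):
        with open(CACHE_FILE) as f:
            count = int(f.read())
    count += 1
    with open(CACHE_FILE, "w") as f:
        f.write(str(count))
    return {"invocation": count}
```

On a cold start the file is gone and the count restarts at 1; on a warm invocation it may still be there, which is exactly the behaviour the note above describes.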

Let's look at an example.

This example assumes you have a my_lambda_function.py file in your current directory, with the following content:

# my_lambda_function.py

def lambda_handler(event, context):
    print("Hello from Lambda!")

if __name__ == '__main__':
    lambda_handler(None, None)

Once you have created this my_lambda_function.py file, you can continue by running the command below to start an interactive session with your emulated Lambda Function container.

docker run -it \
--read-only \
--tmpfs /tmp \
-v $(pwd):/var/task \
-e MY_ENV_VAR_1=hello \
-e MY_ENV_VAR_2=world \
public.ecr.aws/sam/build-python3.9

After you hit enter, you will be dropped into the container immediately. You can verify this by looking at your PS1 prompt, which should look something like this:

bash-4.2# 

This prompt indicates that the container is ready to accept commands. Let's verify what we just did, starting with an attempt to write something to the root of the container's filesystem.

bash-4.2# touch /this
touch: cannot touch ‘this’: Read-only file system

Look at that! Our --read-only does indeed help in simulating a Lambda Function. You can't touch this!

Now let me explain the -v flag we passed to the docker run command. The -v flag mounts a host directory as a volume: a persistent storage location that can be accessed from inside the container. The -v flag can be used multiple times to mount multiple volumes (you could do this when you want to simulate Lambda layers). If you want to know more about this, please leave a comment.

For example, the above command uses the -v flag to mount the host's current directory as the volume /var/task in the container.

NOTE: The $(pwd) command returns the current working directory.

This again simulates our Lambda Function nicely. Now let's see how to run our Lambda Function. But first, let's verify the Python version by running:

bash-4.2# python -V
Python 3.9.15

Nice! We are running Python 3.9.x in our local Lambda container, so let's try to run our function's code:

bash-4.2# python /var/task/my_lambda_function.py
Hello from Lambda!

Congratulations on successfully running your Lambda function code from within a Docker container! This is a great achievement, as it allows you to run your code in a consistent and predictable environment, which can be especially useful for testing and debugging.
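Remember the -e MY_ENV_VAR_1=hello and -e MY_ENV_VAR_2=world flags we passed to docker run? Inside the container, those surface as ordinary environment variables, so your function can read them via os.environ. A minimal sketch (the variable names match the docker run example above; the fallback defaults are illustrative):

```python
import os

def lambda_handler(event, context):
    # Environment variables set with docker run -e (or in the Lambda
    # configuration in AWS) are available through os.environ.
    greeting = os.environ.get("MY_ENV_VAR_1", "hello")
    target = os.environ.get("MY_ENV_VAR_2", "world")
    return f"{greeting} {target}"
```

Running this inside the container prints the same result whether the values come from docker run -e locally or from the function's environment configuration in AWS, which is what makes this setup useful for testing configuration-driven code.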


This approach involves using Docker and has the advantage that you can now use your favourite text editor or IDE to make changes to your my_lambda_function.py file. Because the volume is mounted, any changes you make to the file on your host machine will be immediately available within the Docker container, allowing you to quickly test and iterate on your code.

However, it also has the disadvantage that it may not provide an exact replica of the AWS Lambda runtime environment. As a result, there could be differences in how the function behaves when tested locally using Docker compared to when it is actually executed in the AWS Lambda environment.

Conclusion

By using these approaches, you can test your Lambda functions locally without having to deploy them to the AWS cloud. This can be useful for development and testing purposes, as it allows you to iterate quickly and make changes to your code without incurring the overhead of deploying to AWS.

What do you think? Share your thoughts in the comments below!
