Working with API Gateway HTTP API type, Lambda Function, and SQS Queue


Revathi Joshi

Posted on October 8, 2022


In this article, I will show you how an API Gateway HTTP API triggers a Lambda function that sends a message containing the current time to an SQS queue created with Python and boto3.

The function then returns a response to API Gateway so we can see that the message was sent.

AWS Lambda is a serverless, event-driven compute service that lets you run code for virtually any type of application or back-end service without provisioning or managing servers. You can trigger Lambda from over 200 AWS services and only pay for what you use.

Let’s get started!

Objectives:

  • Create a Standard SQS Queue using Python.
  • Create a Lambda function in the console with a Python 3.7 or higher runtime.
  • Modify the Lambda to send a message containing the current time to the SQS queue, and test the function using the built-in test feature.

  • Create an API Gateway HTTP API type trigger for the lambda.
  • Test the trigger to verify the message was sent.
  • Finally, we will check out the SQS queue to see that it is receiving messages from Lambda.
  • Check out your Logs in CloudWatch.

Pre-requisites:

  • AWS user account with admin access, not a root account.
  • IDE of your liking with AWS CLI, Python, and boto3 installed.

I am using Cloud9; I had to install boto3 because it was not pre-installed:

pip3 install boto3

For this article, I used the boto3 documentation.

You can find the complete code on my GitHub Repository.

Create a standard queue in AWS SQS using Python

Here is the Python and boto3 script (with a .py extension) that creates an AWS Standard SQS queue; run it on Cloud9:

python create_sqs_queue.py


The output will give you a URL.

Here is the link to my code for create_sqs_queue.py

# --- boto3_python_projects/trigger_api_lambda_sqs/create_sqs_queue.py ---

# Create a Standard SQS Queue using Python


# AWS Python SDK
import boto3


# Service resource representing Amazon Simple Queue Service (SQS)
sqs_resource = boto3.resource('sqs')


# Create the queue (a Standard queue is created by default)
sqs_queue = sqs_resource.create_queue(QueueName='time_queue')


# Print the output - the Standard queue URL
print('Standard Queue created. Queue URL: ' + sqs_queue.url)


Check Amazon SQS on AWS Console to see that the Queue was created.


  • Click on the queue — time_queue
  • Note down the SQS ARN and the SQS URL (or fetch them with boto3, as sketched below)
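
If you would rather capture these two values from Python instead of the console, here is a minimal sketch (assuming the queue is named time_queue, as above):

# Look up the queue URL and ARN for an existing queue with boto3
import boto3

sqs_client = boto3.client("sqs")

# Resolve the queue URL from the queue name
queue_url = sqs_client.get_queue_url(QueueName="time_queue")["QueueUrl"]

# Ask SQS for the queue's ARN attribute
queue_arn = sqs_client.get_queue_attributes(
    QueueUrl=queue_url,
    AttributeNames=["QueueArn"]
)["Attributes"]["QueueArn"]

print("Queue URL:", queue_url)
print("Queue ARN:", queue_arn)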


Create a Lambda Function with Python

Now I will create a Lambda function that will send a message to the SQS queue I just created.

In the AWS console, navigate to Lambda.

  • Click Create function.


  • Select Author from scratch
  • Name your function — lambda_sqs
  • Select Python 3.9 from the Runtime drop-down
  • Expand Change default execution role and keep Create a new role with basic Lambda permissions selected
  • Click Create function.


Configuring the role with the proper policies and permissions is the key step here. Following the principle of least privilege, the role should be able to access only the AWS resources it needs to send the message.

Once the Lambda function has been successfully created, navigate to your Lambda functions:

  • Click on the function — lambda_sqs


On the Function overview page:

  • Click the Configuration tab, then Permissions
  • Click the newly created IAM role -> lambda_sqs-role


You will now be taken to the IAM console.

  • Click Add Permissions
  • Click Attach policies to add additional permissions


You will now see the role's current permission policies:

  • In the Current permission policies list, click the + sign next to AWSLambdaBasicExecutionRole to see its current permissions


  • Click Edit to add the following permissions
  • Click the JSON tab to view the policy


  • Add “sqs:SendMessage” to the Action element to allow sending messages to SQS.
  • Add the ARN of the SQS queue to the Resource element.

After adding the permissions, the policy should match the JSON shown below.


  • Review policy
  • Save changes


Here is the link to my code for AWSLambdaBasicExecutionRole.json

# --- boto3_python_projects/trigger_api_lambda_sqs/AWSLambdaBasicExecutionRole.json ---

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "logs:CreateLogGroup",
                "sqs:SendMessage"
                ],
            "Resource": [
                "arn:aws:logs:us-east-1:xxxxxxxxxxxx:*",
                "arn:aws:sqs:us-east-1:xxxxxxxxxxxx:time_queue"
                ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "logs:CreateLogStream",
                "logs:PutLogEvents"
            ],
            "Resource": [
                "arn:aws:logs:us-east-1:xxxxxxxxxxxx:log-group:/aws/lambda/lambda_sqs:*"
            ]
        }
    ]
}

Modify the Lambda to send a message to the SQS queue and test the function

Navigate back to Lambda

  • Click Code
  • Add the following code to the Code source of your Lambda function.
  • Change the QueueUrl to the URL of the SQS queue you created in the first step.

This will send a message to SQS that contains the current time — time_date

Here is the link to my code for lambda_sendmsg_sqs.py

# --- boto3_python_projects/trigger_api_lambda_sqs/lambda_sendmsg_sqs.py ---
import json
import boto3
from datetime import datetime


def lambda_handler(event, context):
    # Build a string with the current time and date
    now = datetime.now()
    time_date = now.strftime("%H:%M:%S %m/%d/%Y")

    # Send the timestamp as the message body to the SQS queue
    sqs = boto3.client('sqs')
    sqs.send_message(
        QueueUrl="https://sqs.us-east-1.amazonaws.com/xxxxxxxxxxxx/time_queue",
        MessageBody=time_date
    )

    # Response returned to API Gateway
    return {
        'statusCode': 200,
        'body': json.dumps('Message sent Successfully!')
    }


Once you have copied and pasted the code:

  • Click on Deploy to save the changes
  • Click on Test to configure a test event

You will be brought to a Configure test event screen.

  • Create a new event and name it lambdaevent.
  • Choose the apigateway-aws-proxy template.
  • Click Save.


  • Click on Test

You will see the results in a new Execution results tab.
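
If you prefer to invoke the function from your IDE instead of the console Test button, here is a minimal boto3 sketch (assuming the function is named lambda_sqs, as above; the handler ignores the event payload, so an empty event is enough):

# Invoke the Lambda function directly and print its response
import json
import boto3

lambda_client = boto3.client("lambda")

response = lambda_client.invoke(
    FunctionName="lambda_sqs",
    Payload=json.dumps({}).encode()   # the handler does not read the event, so {} works
)

# The Payload is a streaming body containing the handler's return value
print(json.loads(response["Payload"].read()))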


Create an API Gateway HTTP API type trigger for the Lambda

Now to add the Trigger to our Lambda function.

  • Click on Add Trigger


  • In the Trigger configuration settings:
  • Select API Gateway from the drop-down menu
  • Select Create a new API
  • For API type, choose HTTP API
  • For Security, select Open from the drop-down menu


  • Click Add


Test the trigger to verify the message was sent

  • Navigate to the Function overview
  • On the Configuration tab
  • Click the API endpoint that was created


  • You should see the output of our code
  • “Message sent Successfully!”
  • Hit the endpoint a few times to accumulate more messages in your queue (or call it from Python, as sketched below).
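
If you would rather call the endpoint from a script than from the browser, here is a minimal sketch using only the standard library (the URL below is a placeholder; replace it with your own API endpoint from the trigger configuration):

# Call the API Gateway endpoint a few times; each call triggers the Lambda,
# which sends one message to the SQS queue
import urllib.request

# Placeholder URL - copy the real API endpoint from the Lambda trigger configuration
api_endpoint = "https://xxxxxxxxxx.execute-api.us-east-1.amazonaws.com/default/lambda_sqs"

for _ in range(3):
    with urllib.request.urlopen(api_endpoint) as response:
        print(response.status, response.read().decode())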


Finally, we will check out the SQS queue to see that it is receiving messages from Lambda

  • Head over to SQS in the AWS console.
  • Find your queue; it shows we have a few messages!
  • Click Send and receive messages


  • Click Poll for messages


  • Click on one of the messages
  • Finally, we can verify the date and time the message was sent (you can also poll the queue with boto3, as sketched below)
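
For a quick check from Python instead of the console, here is a minimal sketch that polls the queue with boto3 (it uses the queue URL from earlier; note that receiving does not delete the messages unless you delete them explicitly):

# Poll the SQS queue and print the message bodies (the timestamps sent by Lambda)
import boto3

sqs = boto3.client("sqs")

# Placeholder URL - use the queue URL printed by create_sqs_queue.py
queue_url = "https://sqs.us-east-1.amazonaws.com/xxxxxxxxxxxx/time_queue"

response = sqs.receive_message(
    QueueUrl=queue_url,
    MaxNumberOfMessages=10,   # up to 10 messages per call
    WaitTimeSeconds=5         # long polling
)

for message in response.get("Messages", []):
    print(message["Body"])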


Check out your Logs in CloudWatch

In your Lambda function:

  • Select the Monitor tab
  • Choose Logs (or fetch recent log events with boto3, as sketched below)
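
If you prefer to read the logs from Python, here is a minimal sketch using the CloudWatch Logs client (assuming the default log group name /aws/lambda/lambda_sqs, which also appears in the policy above):

# Print the most recent CloudWatch log events for the Lambda function
import boto3

logs = boto3.client("logs")

# Lambda writes to /aws/lambda/<function-name> by default
response = logs.filter_log_events(
    logGroupName="/aws/lambda/lambda_sqs",
    limit=20
)

for event in response["events"]:
    print(event["message"], end="")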


Conclusion

We created an SQS queue using Python to receive messages from a Lambda function that was triggered by an API Gateway HTTP API.
