#CloudGuruChallenge – Cloud Resume on AWS

Michael Field

Posted on May 11, 2021

Summary

This post is about the Cloud Resume Challenge posted by Forrest Brazeal of "A Cloud Guru". #CloudGuruChallenge

The challenge was to create a static website for your "Cloud Resume". As I journeyed through the cloud I wound up registering and creating four static websites since, as the saying goes, practice makes perfect. I wasn't worried about the extra cost associated with the three other domains as I plan to use them for other personal projects and/or sandboxing.

Now onto the details of the project.


Tools

I used VSCode for my IDE and I'm loving it more and more each time I use it. Check it out! It's free!

Requirements

I organized the requirements into three parts:

Front-end Components

  • Include the AWS Cloud Practitioner certificate (WIP)
  • Be built with HTML and CSS, and stored in an S3 bucket
  • Use Route 53 to host the domain and configure DNS
  • Use HTTPS for security via CloudFront
  • Use JavaScript to send a request to the back-end to get the visitor count

Back-end Components

  • Store the visitor count in DynamoDB
  • An API Gateway endpoint that invokes a Lambda function
  • A Lambda function that can access the database to retrieve/update the visitor count and send a response back to the front-end
  • Infrastructure as Code (IaC) using the SAM CLI to create, build, and deploy the template.yaml file needed to generate the resources in AWS

GitHub CI/CD

  • Two private GitHub repos, one for each component
  • GitHub Actions invoked whenever new changes are merged
    • Front-end: need to invalidate the CloudFront cache
    • Back-end: run Python tests, then build and deploy the SAM 'template.yaml' file

Front-end Components


HTML & CSS

I used a free, open-source Bootstrap starter template and modified the content as needed.

  • index.html: personal and resume-related background info
  • projects.html: highlights other AWS projects & Cloud Challenges
  • 404.html: custom error page returned by the CloudFront distribution whenever a user navigates to a non-existent page
  • default.js: contains the JavaScript which requests the visitor count

JavaScript

I attempted two ways to send a request to the Visitor Count API: the Fetch API and XMLHttpRequest.

Although both methods worked, I eventually went with XMLHttpRequest since it was easier to handle non-200 status codes; Fetch only rejects its promise on a network failure, not on an HTTP error status.
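
Here's a minimal sketch of the XMLHttpRequest version; the endpoint URL, response shape, and element id are placeholder assumptions, not the site's actual values:

```javascript
// Sketch only: the API URL, JSON shape, and element id are assumptions.
function getVisitorCount() {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', 'https://example.execute-api.us-east-1.amazonaws.com/Prod/count');
  xhr.onload = function () {
    if (xhr.status === 200) {
      var count = JSON.parse(xhr.responseText).count;
      sessionStorage.setItem('visitorCount', count); // cache for the session (see below)
      document.getElementById('visitor-count').textContent = count;
    } else {
      // Unlike fetch(), a non-200 status still lands here and is easy to branch on
      console.error('Visitor count request failed with status ' + xhr.status);
    }
  };
  xhr.onerror = function () {
    console.error('Network error while requesting the visitor count');
  };
  xhr.send();
}
```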

When testing the site locally I used a VSCode plugin called Live Server, which launches and loads your site on a local server. It was during this part of testing that I noticed my visitor count kept incrementing each time I refreshed or navigated back to the main index.html page.

To help prevent this, I decided to use the browser's Session Storage, which is checked when the page loads.

If the Session Storage item doesn't exist:

  • Send a request to the API endpoint to update and return the updated Visitor Count. The count value is then parsed from the response, stored in the Session Storage and displayed on the page.

If the Session Storage item does exist:

  • Retrieve the current value from the Session Storage and display the value on the page.

Per MDN documentation, Session Storage persists as long as the browser tab is open. Once the user closes the tab, the session ends and the stored data is cleared.
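
Putting it together, the page-load check might look something like this (the storage key and element id carry over from the sketch above):

```javascript
// Sketch only: 'visitorCount' and 'visitor-count' are assumed names.
window.addEventListener('load', function () {
  var stored = sessionStorage.getItem('visitorCount');
  if (stored === null) {
    getVisitorCount(); // first visit this session: request, cache, and display the count
  } else {
    document.getElementById('visitor-count').textContent = stored; // reuse the cached value
  }
});
```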

AWS Resources

S3 Bucket

AWS documentation found here was my main source of information when setting up and configuring the S3 bucket. I found it very interesting how the "Access" level of the bucket changed throughout the entire project.

  • After creating the bucket
    • Access = Bucket and objects not public
  • After disabling the "Block all public access" setting
    • Access = Objects can be public
  • After editing the bucket policy to allow public reads (an example policy follows this list)
    • Access = Public
  • After creating and configuring CloudFront
    • Access = Objects can be public
    • Details = The bucket is not public but anyone with appropriate permissions can grant public access to objects.
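
For reference, the "Public" stage typically comes from a policy along these lines (the bucket name is a placeholder); once CloudFront's Origin Access Identity is configured, the wildcard principal is replaced with the OAI:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::example-resume-bucket/*"
    }
  ]
}
```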

AWS Certificate Manager

For the most part things went smoothly. However, there were some hiccups along the way.

  • For one of my test sites I figured I'd try creating a site with all the resources in the US West region. When configuring the CloudFront distribution I was surprised to see the cert was not listed, and attempting to manually enter it failed. Luckily the AWS error message contained the answer as to why: when using CloudFront, the certificate must be requested/imported in the US East (N. Virginia) region (us-east-1), as stated here.
  • Remember to click the "Create record in Route 53" button. Otherwise one could spend a while wondering why the request was taking so long. (true story! lol)

CloudFront

Configuring a CloudFront distribution was (and still is) a rather daunting task considering all the available settings and options.

  • General
    • Make sure to fill in the CNAME(s) properly. A few times I quickly skipped over or forgot this part and wondered why I wasn't able to view the site.
  • Origin Access Identity
    • I found creating a new identity worked better than reusing an existing one.
    • Select the "Yes, Update Bucket Policy" option. Doing so automatically updates the S3 bucket's policy, which is how I noticed the bucket's access change from "Public" to "Objects can be public". Otherwise you will have to manually update the bucket policy to grant the distribution access.
  • Custom Error Pages
    • When navigating to a non-existent page, the response returned by CloudFront is a very user-unfriendly '403 Access Denied' page. To mitigate this I created a custom error response which returns the 404.html page from the S3 bucket.

Back-end Components


Infrastructure as Code

Initial setup was rather simple. To start, I used the "Hello World Example" template, which can be initialized through the SAM CLI. From the command line, run the following:

  • Step 1: type in sam init
  • Step 2: Select "AWS Quick Start Templates"
  • Step 3: Select "Zip (artifact is a zip uploaded to S3)"
  • Step 4: Select your runtime of choice (ex: Python3.8)
  • Step 5: Give your sam-app a name
  • Step 6: Select template #1 (Hello World Example)

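In practice, the terminal flow looks roughly like this (prompts abbreviated):

```bash
sam init                 # answer the prompts as listed above
# ...modify template.yaml, the Lambda code, and the tests...
sam build                # package the function and its dependencies
sam deploy --guided      # first deploy; --guided prompts for stack name, region, etc.
```
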
Following the above steps auto-generates a starter project, which you can then modify to suit your needs. For this project I modified the following three pieces:

  • template.yaml file
  • Lambda Function
  • python tests

The template.yaml file defines a serverless Lambda function configured with events, which generate an implicit API. After my modifications, the final version consisted of the following (a sketch follows the list):

  • DynamoDB to store the visitor count
  • Lambda Function with policy allowing access to
    • DynamoDB
    • Parameter Store to retrieve the database name
    • API Events to request the data from the back-end
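
A trimmed sketch of what that template might look like; the resource names, code path, and parameter name are illustrative, not the project's actual values:

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31

Resources:
  VisitorCountTable:
    Type: AWS::Serverless::SimpleTable     # DynamoDB table for the count
    Properties:
      PrimaryKey:
        Name: id
        Type: String

  VisitorCountFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: visitor_count/
      Handler: app.lambda_handler
      Runtime: python3.8
      Policies:
        - DynamoDBCrudPolicy:              # access to the table
            TableName: !Ref VisitorCountTable
        - SSMParameterReadPolicy:          # read the table name from Parameter Store
            ParameterName: visitor-count-table-name
      Events:
        Count:
          Type: Api                        # implicit API Gateway endpoint
          Properties:
            Path: /count
            Method: get
        Health:
          Type: Api
          Properties:
            Path: /health
            Method: get
```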

The Lambda function has two "API Events" (a sketch of the handler and its gating test follows this list):

  • Count
    • This path retrieves, updates, and returns the new visitor count to the front-end.
  • Health
    • I wanted the CI/CD workflow to fail, and not even reach the SAM build/deploy portion, if the API itself was down. To accomplish this I wrote a test which asserts the health response's status. If the returned response code is not the expected value, the test fails, which in turn stops the CI/CD workflow.
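
A rough sketch of what such a handler might look like; the parameter name, item key, and response shapes are assumptions:

```python
# app.py - sketch only; names and shapes are illustrative.
import json

import boto3

# The table name lives in Parameter Store, per the template above
ssm = boto3.client("ssm")
TABLE_NAME = ssm.get_parameter(Name="visitor-count-table-name")["Parameter"]["Value"]
table = boto3.resource("dynamodb").Table(TABLE_NAME)


def lambda_handler(event, context):
    if event.get("path") == "/health":
        return _response(200, {"status": "healthy"})

    # /count: atomically increment the counter and return the new value
    result = table.update_item(
        Key={"id": "visitor_count"},
        UpdateExpression="ADD visits :one",
        ExpressionAttributeValues={":one": 1},
        ReturnValues="UPDATED_NEW",
    )
    return _response(200, {"count": int(result["Attributes"]["visits"])})


def _response(status_code, body):
    return {
        "statusCode": status_code,
        "headers": {"Access-Control-Allow-Origin": "*"},  # let the front-end read it
        "body": json.dumps(body),
    }
```

And the health-gating test can be as simple as one assertion against the live endpoint (the URL is a placeholder):

```python
# test_health.py - fails the workflow if the API is down.
import requests

API_BASE = "https://example.execute-api.us-east-1.amazonaws.com/Prod"


def test_health_endpoint_is_up():
    response = requests.get(f"{API_BASE}/health", timeout=10)
    assert response.status_code == 200
```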

I also used Postman to locally verify the endpoints. Postman is an absolutely awesome tool for testing APIs and I highly recommend it to anyone!

GitHub CI/CD


The final portion of the project was to create a CI/CD pipeline using GitHub Actions workflows. I was nervous at first since I had never used this feature before, but I was able to put together a workflow for both the front-end and back-end code. The initial step was to add a workflow to each private repository, which can be done by going to the "Actions" tab of the repository and creating the suggested "Simple Workflow".


From there it was a matter of using other resources (i.e., Google) as well as trial and error to fine-tune the workflows. Part of the process was using an AWS IAM user with only the permissions necessary for the job at hand. To accomplish this I created a new group with a custom policy containing only the required permissions, then created a new user and assigned it to that group.

Back-End
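
The back-end workflow runs the tests first, then builds and deploys with SAM. A sketch under assumed names (the secret names, region, and paths are illustrative):

```yaml
# .github/workflows/backend.yml (sketch)
name: Deploy back-end

on:
  push:
    branches: [main]

jobs:
  test-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-python@v2
        with:
          python-version: '3.8'
      - name: Run Python tests
        run: |
          pip install -r tests/requirements.txt
          python -m pytest tests/ -v
      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1
      - name: SAM build and deploy
        run: |
          sam build
          sam deploy --no-confirm-changeset --no-fail-on-empty-changeset
```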

Front-End
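
The front-end workflow pushes the site files to S3 and then invalidates the CloudFront cache. Again a sketch; the bucket name, paths, and secret names are placeholders:

```yaml
# .github/workflows/frontend.yml (sketch)
name: Deploy front-end

on:
  push:
    branches: [main]

jobs:
  sync-and-invalidate:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1
      - name: Sync site files to S3
        run: aws s3 sync . s3://example-resume-bucket --exclude ".git/*" --delete
      - name: Invalidate CloudFront cache
        run: aws cloudfront create-invalidation --distribution-id ${{ secrets.CLOUDFRONT_DISTRIBUTION_ID }} --paths "/*"
```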

Closing Thoughts


This challenge is a really great starting point for anyone who is interested in learning what the cloud is and what resources AWS provides. It's a highly recommended stepping stone for anyone who wants to start learning by doing. And once you start, you'll get bitten by the 'cloud bug' and want to learn and do more. What better way to showcase that knowledge than building and deploying your resume using the technologies inside the (AWS) cloud itself?

Where to go from here? Well, as of this writing I am scheduled to take the AWS Cloud Practitioner exam in early June 2021 and will spend the bulk of my time preparing for it, with other areas of interest and projects to follow.

Thank you again Forrest! This was a lot of fun! And thank you to everyone who took the time to read this post.

[Update - June 12, 2021]
I took and passed the AWS Cloud Practitioner certification exam on June 11th! Click here to view my certificate.
