Serverless Cloud Resume Challenge
Chris Etienne
Posted on October 28, 2024
Introduction
A few years back, when I first learned about the Cloud Resume Challenge, developed by Forrest Brazeal, I knew I'd give it a go eventually. Fast forward a bit, and I ended up putting the finishing touches on my attempt around the same time I earned my Bachelor's from Western Governors University. I started a few months back, slowly putting my HTML and CSS together with the help of fancy templates and GitHub Copilot. After taking some cloud-related courses and earning my AWS Solutions Architect certification, I decided to jump into the rest of the challenge. With a lot of research, plus ideas and inspiration from other challengers, I was able to finish my first real project and learn a lot along the way.
Setting Up the Environment
The first thing I did was set up my AWS account. Fortunately, I already had one from when I was studying for the Solutions Architect Associate exam, but I took time to review my setup to ensure it followed security best practices. I made sure the account used only the privileges it needed and enabled Multi-Factor Authentication (MFA). Security is a priority, so configuring Identity and Access Management (IAM) roles around the principle of least privilege was essential.
To interact with AWS resources, I installed the AWS CLI, allowing me to manage my infrastructure directly from the terminal. I decided to use AWS SAM (Serverless Application Model) from the start to streamline building and deploying my backend resources rather than creating them by hand in the console, and it satisfies the Infrastructure as Code (IaC) requirement of the challenge.
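For anyone starting from scratch, the usual SAM workflow from the terminal looks roughly like this (a sketch only; project names and deploy prompts will vary):

sam init              # scaffold a new serverless project from a template
sam build             # build the Lambda code and template
sam deploy --guided   # first deployment; prompts for stack name, region, etc.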
Since my host machine runs Windows 10, I opted to work on this project using a Linux environment for compatibility and simplicity. Windows Subsystem for Linux (WSL2) allowed me to run Ubuntu directly on my Windows machine.
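On a reasonably up-to-date Windows 10 build, getting Ubuntu running under WSL2 comes down to a single command from an elevated PowerShell prompt:

wsl --install -d Ubuntu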
During development, I used Make to streamline my testing process, creating a Makefile with commands to automate test runs. By setting up testing commands in Make, I could run tests consistently with a single command, which reduced manual steps and minimized errors.
A Makefile is a file that helps you automate repetitive tasks like building, testing, and deploying your application. With a Makefile, you can define a series of commands that streamline each part of your workflow. Instead of running each command manually, you can use short, memorable commands like make build, make test, or make deploy, which simplifies working with your project.
For example, in an AWS serverless project using AWS SAM, a Makefile might include commands to:
Build the SAM application.
Run tests to validate code changes.
Deploy the application to an AWS environment.
Clean up any temporary files created during the build process.
Commands I used in my Makefile (recipe lines are indented with a tab):

build:
	sam build

deploy-infra:
	sam build && aws-vault exec [user] --no-session -- sam deploy

deploy-site:
	aws-vault exec [user] --no-session -- aws s3 sync [site folder] s3://[bucket name]

(aws s3 sync needs both a local source and the destination bucket, so the local site folder above is a placeholder.)
Frontend Development
For the frontend, I used basic HTML and CSS to create a simple but effective resume page. This would have been a good opportunity to refine my web development skills, but I wanted to focus on the cloud tools, so I found an HTML and CSS template at bootstrapmade. I deployed my frontend using AWS services:
- Amazon CloudFront: I set up CloudFront to deliver my content through a Content Delivery Network (CDN), improving load times and providing a secure layer with HTTPS.
- Amazon Route 53: I purchased my domain (cetienne.cloud) through Route 53 and mapped it to the CloudFront distribution by configuring the DNS records.
- AWS Certificate Manager (ACM): I obtained an SSL/TLS certificate from ACM so my site is served securely over HTTPS; the certificate is attached to the CloudFront distribution, enabling encryption in transit.
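One detail worth noting: CloudFront only accepts ACM certificates issued in the us-east-1 region. A rough CLI equivalent of requesting a DNS-validated certificate for the domain looks like this (I'm showing the shape of the command, not my exact invocation):

aws acm request-certificate --domain-name cetienne.cloud --validation-method DNS --region us-east-1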
Backend Development
The challenge required me to track the number of visitors to my site, so I built a REST API using:
- API Gateway: This acts as the front door for my API, receiving incoming requests and passing them to my Lambda function.
- AWS Lambda: I wrote my Lambda function in Python, which interacts with DynamoDB to read and update the visitor count (a simplified sketch follows this list).
- DynamoDB: This NoSQL database service stores the visitor count for my site. I used it because of its serverless nature, which complements Lambda and API Gateway perfectly. DynamoDB's scalability and pay-per-use pricing model are a great fit for this small project.
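Here is a simplified sketch of how such a handler can look; the table name and key are placeholders rather than my exact ones, and should match whatever the SAM template defines:

import json
import boto3

# Table name and key are illustrative; they should match what template.yaml defines.
table = boto3.resource("dynamodb").Table("visitor-count")

def lambda_handler(event, context):
    # Atomically increment the counter and read back the new value.
    response = table.update_item(
        Key={"id": "visitors"},
        UpdateExpression="ADD visits :inc",
        ExpressionAttributeValues={":inc": 1},
        ReturnValues="UPDATED_NEW",
    )
    count = int(response["Attributes"]["visits"])
    # CORS header so the resume page can call the API from the browser.
    return {
        "statusCode": 200,
        "headers": {"Access-Control-Allow-Origin": "*"},
        "body": json.dumps({"count": count}),
    }

The ADD update expression is what makes the increment atomic, so concurrent visitors can't overwrite each other's counts.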
Source Control
I used Git for version control and GitHub as my repository, ensuring all code changes were tracked.
CI/CD
To automate deployment, I used GitHub Actions to set up a Continuous Integration and Continuous Deployment (CI/CD) pipeline, defined in a main.yml workflow file that essentially takes over the role of the Makefile. Every time I push changes to the repository, GitHub Actions automatically builds and deploys my SAM template, updating the backend API and frontend as needed. The CI/CD pipeline speeds up development and ensures that my infrastructure stays consistent. Note: I added my access keys to my repo's GitHub Actions secrets to avoid hardcoding credentials.
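A minimal sketch of what a workflow along these lines can look like (the action versions, region, and paths below are illustrative placeholders, not my exact file):

name: deploy
on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: aws-actions/setup-sam@v2
      - uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1   # illustrative region
      - run: sam build
      - run: sam deploy --no-confirm-changeset --no-fail-on-empty-changeset
      - run: aws s3 sync [site folder] s3://[bucket name]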
Architecture Diagram
Finished Project
Conclusion
This was a fun challenge, and I wouldn't have been able to pull it off without a ton of research, tutorials, and ChatGPT helping with HTML and CSS and finding those pesky typos and syntax errors that were breaking everything. Though I'm just scratching the surface, I learned a lot and have a good understanding of what I want to keep learning and what I might move on to next.