The Cloud Resume Challenge: Discovering the power of AWS
joshmears169
Posted on October 28, 2020
Recently, I stumbled across a forum online talking about projects that people had done to demonstrate their skills in the cloud. One commenter linked a post called “The Cloud Resume Challenge by Forrest Brazeal” and it got me curious. Later that day I found myself diving head-first into a new project that really tested my new AWS knowledge to the max!
I started this project as I wanted to have something on my CV which demonstrated my newly acquired knowledge from passing 3 AWS exams. Not coming from an IT background, I was proud of how far I’d come already this year, after using the Covid-19 lockdown as a chance to switch careers from a job in market research. However, I wanted to get a bit more hands-on with the AWS platform and get some valuable practical experience.
I was in a good position conceptually as I had passed the two AWS associate exams that would help the most with this project in my opinion (Solutions Architect and Developer) – it was just a case of putting what I’d learnt into practice. Easy right?!
Projects are a great way to solidify what you think you know and having read through the very vague instructions of the challenge, I thought this project would put into practice everything I had learnt so far on this journey.
The Cloud Resume Challenge involves building a serverless resume website using AWS cloud infrastructure. You are required to host your website in S3 behind the content delivery network, CloudFront, and then use Infrastructure-as-Code (IaC) for the backend. You should then build continuous integration/continuous deployment (CI/CD) practices into both your frontend and backend to make updates easier, quicker and, most importantly, automatic! The backend would be used for updating a visitor count on my webpage, so it was going to involve a little JavaScript in my frontend too. I hadn’t used JavaScript before, as I’d learnt the basics of Python in lockdown instead, so it was going to be interesting learning some new syntax and how to make API calls in another programming language.
Frontend
I had already created a website during lockdown by learning a little HTML and CSS and watching tutorials, so I added another tab to my navigation bar with my resume included. I stripped back a template that I liked online and integrated the CV I had been sending out to recruiters into my webpage. This was very exciting, as I was already thinking of ways I could tell the world about my new project and hopefully land that first role in the cloud! I finished off by coding my JavaScript visitor counter as a template, ready for inserting my API endpoint once I’d created it.
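For anyone curious, the counter template looked something like the sketch below – the endpoint URL is just a placeholder for the API I hadn’t built yet, and the element ID is hypothetical:

```javascript
// Sketch of the visitor counter template. The URL is a placeholder for
// the API Gateway endpoint I hadn't created yet, and "visitor-count" is
// a hypothetical element ID in my HTML.
const apiUrl = "https://REPLACE-ME.execute-api.eu-west-2.amazonaws.com/Prod/count";

async function updateVisitorCount() {
  try {
    const response = await fetch(apiUrl);
    const data = await response.json();
    document.getElementById("visitor-count").textContent = data.count;
  } catch (err) {
    console.error("Could not fetch visitor count:", err);
  }
}

updateVisitorCount();
```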
My website was already hosted in S3 with Route 53 handling DNS, and can be found at www.joshmearsportfolio.com (subtle plug, I know!). However, as part of the challenge I needed to serve it over HTTPS for security. This is where CloudFront came in, and I was able to set this up without too much trouble.
I then pushed my code to my GitHub repository and created a CodePipeline for my frontend so that any changes I made to my code were automatically synced to the S3 bucket hosting my website. This was my first practical use of CI/CD tools and I instantly saw the benefits – it was super useful! I didn’t have much experience with git at this point, so this process was also great for learning some of the basic commands and getting used to pushing new versions to my main branch – it was great seeing changes appear almost instantly once I refreshed my webpage in the browser!
Through this step I learnt a lot about caching behaviour too. When I changed my JavaScript code to play around and pull in different information from my test database, I wasn’t seeing the changes on my website because CloudFront had cached my old .js file. After some googling I realised that, now my website sat behind CloudFront, changes pushed through my pipeline wouldn’t show up in the browser until the cache expired – by default, after 24 hours. I manually invalidated the cache for the relevant files and saw my changes propagate. I hadn’t picked this up when previously testing my pipeline, as I had only been changing the index.html file (the default root object). For development purposes, wanting to make changes to my website and see them play out straight away, I practised changing the Cache-Control max-age header. This was another nice moment for me, as I understood the AWS side of things from my studies and was using my knowledge in practice!
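If you hit the same caching issue, invalidations can also be created from the command line – a one-liner like this (with your own distribution ID in place of the example):

```bash
aws cloudfront create-invalidation \
    --distribution-id E1EXAMPLE123 \
    --paths "/index.html" "/js/*"
```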
Backend
Next was what I considered the hard part. Not coming from an IT background, I found these steps very daunting despite already passing 3 AWS exams. It’s all very well having a good foundation in the concepts and theoretical knowledge but putting it into practice was going to be much harder – which is precisely why I took up this challenge!
I made a plan of what I wanted to do, drawing a simple flow diagram (the easy part, having passed both the Solutions Architect and Developer exams!). I then played around in the console and created my backend manually, to test my JavaScript API fetch and to write and test my Python Lambda function. This was a great way of getting to know the different services better and dusting off some of the Python skills I had learnt at the start of lockdown. I read a lot of AWS documentation to learn some of the calls to DynamoDB (my serverless database) and to see how to integrate API Gateway too.
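The heart of the function is a single DynamoDB call. Here’s a minimal sketch of the idea – the table name, key and attribute names are made up for illustration:

```python
import json
import boto3

# Hypothetical table name - yours will differ
table = boto3.resource("dynamodb").Table("visitor-count")

def lambda_handler(event, context):
    # Atomically increment the counter and read back the new value
    response = table.update_item(
        Key={"id": "resume"},
        UpdateExpression="ADD visits :inc",
        ExpressionAttributeValues={":inc": 1},
        ReturnValues="UPDATED_NEW",
    )
    # DynamoDB returns numbers as Decimal - more on that later!
    count = int(response["Attributes"]["visits"])
    return {
        "statusCode": 200,
        "body": json.dumps({"count": count}),
    }
```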
There were a couple of ‘doh!’ moments for me, one being that I didn’t realise you have to press ‘Deploy’ in the console to save any edits to your Lambda code before testing again. From my studies I was used to a ‘Save’ button, but the console had since been updated, and that change alone was enough to throw me off course for a while!
After watching a few YouTube videos and doing a lot of googling, I eventually finished my fully functioning website with a serverless backend!
But that wasn’t enough. Next was learning how to deploy my backend as Infrastructure-as-Code!
For this, I had to learn how to use AWS SAM (an extension of CloudFormation for serverless applications). This was my favourite part of the whole project, as it required a lot of attention to detail and demonstrated just how powerful Infrastructure-as-Code is. AWS SAM gives you the ability to deploy and remove services with relative ease once you have your template set up.
To practise again, I used a template project from the AWS documentation to see how it all worked and to learn the SAM commands. Once I felt comfortable with it, I cracked on with writing my own template file. This required a lot of patience, as YAML indentation isn’t very forgiving! Making sure all of my resources referenced each other correctly was tricky too. I learnt so much during this part of the project, and it made taking up this challenge feel worthwhile.
Documentation and googling were a lot more fruitful for the SAM part of the challenge, and although it took some time (and plenty of error messages!) I managed to build my template and deploy my backend.
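To give you a flavour, a heavily trimmed-down skeleton of a template like mine might look something like this – the logical names, paths and runtime are placeholders, not my exact file:

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31

Resources:
  VisitorCountFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: backend/
      Handler: app.lambda_handler
      Runtime: python3.8
      Policies:
        - DynamoDBCrudPolicy:
            TableName: !Ref VisitorCountTable
      Events:
        GetCount:
          Type: Api
          Properties:
            Path: /count
            Method: get

  VisitorCountTable:
    Type: AWS::Serverless::SimpleTable
    Properties:
      PrimaryKey:
        Name: id
        Type: String
```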
For those of you who haven’t deployed anything with AWS SAM before and are a beginner like me, I recommend the following link to get you started:
SAM Serverless Hello World App
This YouTube video is also a great tutorial, with diagrams explaining the advantages of serverless, what SAM is and how to use it. It may be helpful for anyone who hasn’t studied for any of the associate AWS exams yet!
Lastly, I also recommend the YouTube channel ‘FooBar Serverless’ for understanding SAM better and learning more about writing a template.yaml file.
Despite managing to deploy my serverless backend, I encountered more problems with my visitor count and integrating the backend with the frontend. API Gateway wasn’t working at first because the output from my Lambda function wasn’t in the correct format (a 502 error). After a couple of long nights googling, I learnt that boto3 returns DynamoDB numbers as Python ‘Decimal’ objects, which the standard json module can’t serialise. This was extremely frustrating, as it meant API Gateway didn’t like the format of my Lambda output when it was expecting pure, simple JSON. I wanted to persevere and get the right format, and found the simplejson package, which was meant to sort this problem in Python. I figured out how to implement it locally, but needed to learn about Lambda layers to be able to import the simplejson module into my function. After going down a few rabbit holes and doing some research, I managed to deploy and configure a layer, but was still getting an error! Lambda layers are something I am still looking into, but for the purpose of simply getting the visitor count into the right format, I resorted to chucking in some Python code I’d found on Stack Overflow that sorted the problem! It was great experience learning more about Lambda layers though, so I’m happy I came across this problem!
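For anyone who hits the same 502, the fix I borrowed was along these lines – converting Decimals to plain numbers before serialising:

```python
import json
from decimal import Decimal

class DecimalEncoder(json.JSONEncoder):
    """Turn DynamoDB's Decimal values into plain ints/floats for JSON."""
    def default(self, obj):
        if isinstance(obj, Decimal):
            return int(obj) if obj % 1 == 0 else float(obj)
        return super().default(obj)

# Used in the Lambda return value, e.g.:
# "body": json.dumps({"count": count}, cls=DecimalEncoder)
```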
One other problem I had when writing my SAM template file was handling CORS throughout my backend. When playing around in the console, enabling CORS is a single click, but it wasn’t quite as simple when using SAM. It involved configuring my Lambda responses with the correct headers/origin and returning them to my website via API Gateway and my JavaScript code in the correct way – you will spend a bit of time googling for answers if you haven’t done this before!
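In practice, that meant the Lambda response itself had to carry the CORS headers – something like this sketch (the helper function is hypothetical, and you’d use your own domain as the origin):

```python
import json

def lambda_handler(event, context):
    count = get_and_increment_count()  # hypothetical helper doing the DynamoDB work
    return {
        "statusCode": 200,
        "headers": {
            # Must match the origin your browser sends; "*" is fine while testing
            "Access-Control-Allow-Origin": "https://www.joshmearsportfolio.com",
            "Access-Control-Allow-Methods": "GET,OPTIONS",
        },
        "body": json.dumps({"count": count}),
    }
```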
CI/CD
Finally, after getting my project fully working, I had to implement CI/CD for my backend. As I had used AWS’s own CI/CD tools for the frontend, I decided to use GitHub Actions for the backend, to learn some different tools and get more exposure to different practices. I had used GitHub before but never come across GitHub Actions, so this was all new to me. Again, the documentation was very useful and helped explain what was needed to get a ‘workflow’ set up. It involved writing a workflow file that would be triggered by a push to my backend repository. After a few late nights I managed to get this working, and again realised the benefits of having this set up for future testing and updating.
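A stripped-down sketch of the sort of workflow file involved – the branch, region and secret names are placeholders, not my exact setup:

```yaml
# .github/workflows/deploy-backend.yml - trimmed-down sketch
name: Deploy backend
on:
  push:
    branches: [main]

jobs:
  test-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-python@v2
        with:
          python-version: '3.8'
      - run: pip install pytest boto3 && pytest tests/
      - uses: aws-actions/setup-sam@v1
      - uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: eu-west-2
      - run: sam build
      - run: sam deploy --no-confirm-changeset --no-fail-on-empty-changeset
```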
As part of this final process, I wrote a test in Python using pytest which checked for a 200 status code from my Lambda function. Once the test passed, the whole of my backend infrastructure would be packaged and deployed using a fancy GitHub Action!
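The test itself was tiny. A minimal sketch (the module path is hypothetical, and it assumes the handler can reach a test table when it runs):

```python
# tests/test_handler.py - minimal sketch
from backend.app import lambda_handler  # hypothetical module path

def test_returns_200():
    response = lambda_handler(event={}, context=None)
    assert response["statusCode"] == 200
```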
Final Words
I am sat here writing this blog, slightly tired from the whole process, but extremely happy with the finished product and feeling like I have learnt just as much as I did when passing the AWS exams! It is true that building something in the cloud teaches you so much – things I wouldn’t necessarily have learnt from a book or an online course. I would recommend it to anyone looking to gain experience with the AWS tools and to practise their coding too. You can find my finished product here and help me get my visitor count up whilst you’re at it 😉
I’d like to say a huge thanks to Forrest Brazeal for designing the challenge! Next for me? I think I’m going to have a stab at one of A Cloud Guru’s ‘challenges of the month’ and continue learning more about the AWS cloud.
If anyone enjoyed reading this and wants to reach out, feel free to connect with me on LinkedIn or email me at joshmears169@hotmail.com.