Building a Serverless ChatGPT Powered Resume Assistant - Foundation

misterjacko

Jakob Ondrey

Posted on June 20, 2023


Project Origin

A few weekends ago I was itching to build something with the OpenAI API. As a career changer into the cloud space and someone who works in the early career talent space, I am acutely aware of how important and difficult it is for candidates to write resumes targeted to the positions they are applying for. So I decided to build a resume assistant powered by ChatGPT.

A lot of ink has been spilled about Large Language Models (LLMs) in general and ChatGPT specifically, and I'm not really trying to add to that torrent. One criticism I would like to address, though, is that LLMs just compose confident and compelling bullshit. Not a great trait for writing an academic paper, but are there applications where that could be an advantage?

A resume, if you think about it, could also be considered a work of confident, compelling bullshit. No, I am not saying that lying is acceptable on a resume, but we often find ourselves stretching our experience a little while writing our resumes so that we conform just a bit better to the requirements of a given job description. Speaking of which...

Job descriptions are a similar sort of aspirational bullshit. The requirements are based on an old job that no longer matches the actual work, or are just a description of the experience the last person had (who left to earn 20% more than they will offer you). Thus, resumes, job descriptions, and LLMs together form what I call "The Triad of Truthiness."

[Image: "The Triad of Truthiness" — a three-circle Venn diagram with LLMs, resumes, and job descriptions as the circles, all intersecting in the middle.]

Having decided that LLMs are, therefore, the perfect resume assistant, we will now build a serverless application that, given a "resume" and a job description, can tailor the resume for the job through a virtual resume review, i.e., a conversation with you.

The Plan

I originally wrote a large portion of this as a single post, but it was entirely too long, so I have decided to split it into four posts.

  1. An introduction and some background (this post)
  2. Building a Python function that we can run and test locally from the command line
  3. Using the AWS CDK (Python) to deploy that function to the cloud (but still interacting with it via CLI)
  4. Deploying a front-end so that we can allow others to use the service

This should keep things a little more bite-sized and allow anyone who wants to build it along with me (or put their own spin on it) to focus on one file or area at a time rather than jump around like a maniac.

How a Conversation with ChatGPT Works

Let's talk a little about the OpenAI library and how ChatGPT conversations work.

First, there are three roles in every conversation: assistant (the chatbot), user (the person chatting with the chatbot), and system (a hidden but important participant that sets the tone of the conversation).

Second, the system message is your "secret sauce": the thing that separates your bot from just chatting with ChatGPT. With the system message, you can set the rules and constraints that apply for the duration of the conversation. It is where we can instruct the chatbot that it's a helpful resume review assistant and that it should "leverage transferable skills if they lack direct experience". It might be something you want to keep secret, otherwise someone else could copy your million-dollar idea!
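For illustration, a system message for this project might look something like the sketch below. The wording here is just a placeholder I made up for this post, not the exact prompt we will settle on later in the series:

# Hypothetical system message for the resume assistant (illustrative wording only)
system_message = {
    'role': 'system',
    'content': (
        'You are a helpful resume review assistant. '
        'Ask the user about their experience and the job they are targeting, '
        'and leverage transferable skills if they lack direct experience.'
    ),
}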

Third, the OpenAI API doesn't "remember" anything. Every time you pass it a request, you need to provide ALL the context necessary to generate a relevant completion. That means that the first request might look like this:

[
  {'role':'system','content':'You are a resume bot named Resume Bot 3000'},
  {'role':'user','content':'My name is Jimmy and I want a job as a Bank Teller!'}
]

And your second request might look like this (we will talk about parsing the response in a minute):

[
  {'role':'system','content':'You are a resume bot named Resume Bot 3000'},
  {'role':'user','content':'My name is Jimmy and I want a job as a Bank Teller!'},
  {'role':'assistant','content':'Hi Jimmy, A bank teller is a job that typically requires money handling experience. Have you ever had a job or other experience where you had to handle cash or facilitate transactions?'},
  {'role':'user','content':'Well, maybe not a job but I volunteered at school basketball games selling tickets for admission and also working at the concession stand. Does that count?'}
]

For ChatGPT to understand what is going on, it basically needs the whole conversation every time. Everything is new to it. This means that the longer the conversation goes on, the more context you build up, and the more expensive your requests become.
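To make that concrete, here is a minimal sketch of one turn of that loop using the openai Python library as it existed at the time of writing (the variable names and example strings are mine, and the API key is a placeholder):

import openai

openai.api_key = 'YOUR_API_KEY'  # placeholder; never hardcode a real key

# The running conversation. Every request sends the ENTIRE list.
messages = [
    {'role': 'system', 'content': 'You are a resume bot named Resume Bot 3000'},
    {'role': 'user', 'content': 'My name is Jimmy and I want a job as a Bank Teller!'},
]

# Send the full context and get a completion back
response = openai.ChatCompletion.create(
    model='gpt-3.5-turbo',
    messages=messages,
)

# Parse the assistant's reply out of the response...
assistant_reply = response['choices'][0]['message']['content']

# ...and append it (plus the user's next message) before the next request
messages.append({'role': 'assistant', 'content': assistant_reply})
messages.append({'role': 'user', 'content': 'Well, maybe not a job, but I sold tickets at school basketball games.'})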

gpt-3.5-turbo is priced at $0.0015/1K tokens (about 750 words, according to the OpenAI pricing page) for the context you send to it and $0.003/1K tokens for responses. That can be pretty cheap, but it can also easily get out of hand if you have a lot of people having long conversations. Be sure to set limits on your account in the billing section.
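As a rough worked example using the rates quoted above (the token counts are invented for illustration, and pricing changes, so check the current pricing page):

# Rough cost estimate for a single request at the rates quoted above
input_rate = 0.0015 / 1000   # dollars per context (input) token
output_rate = 0.003 / 1000   # dollars per response (output) token

context_tokens = 2000        # e.g. system message + resume + conversation so far
response_tokens = 500        # the assistant's reply

cost = context_tokens * input_rate + response_tokens * output_rate
print(f"${cost:.4f}")        # -> $0.0045 for this one request

And remember: because the context grows with every turn, that per-request cost keeps climbing as the conversation gets longer.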

With all of that out of the way, we can move on to building something! Join me in post 2!
