In a World with AI, Where Will the Future of Developers Lie?

Michael La Posta

Posted on April 25, 2024

Ever since OpenAI released ChatGPT, AI has been at the forefront of many discussions, including people wondering about the future of work, among other things.

People are worried that AI will take their jobs, and I personally believe that developers are no exception, a belief that a lot of other developers don't seem to share at the moment.

Now of course, "AI" isn't a new thing; it already exists in more basic forms in many of the tools we use every day. But the kind of AI being developed now is a lot more advanced, and it's starting to encroach on areas that were previously thought to be untouchable by computers and machines.

The Rise of ChatGPT and Large Language Models

When ChatGPT was first released, I'll admit I didn't pay much attention to it. I figured it was just another chatbot, and I didn't see how it could be much different from the ones I had used before in a professional setting.

[Image: AI coming out of a computer screen]

But I kept seeing "LLMs" pop up in discussions online, and I started to see some of the things people were doing with them. I saw people using them mostly to generate code and write content, and eventually for other things as well.

It was at that point that I realized I needed to start paying attention to ChatGPT and LLMs, and get a better understanding of what they were, and how I could use them to my advantage.

What is a Large Language Model?

I won't bother trying to explain what an LLM is, because frankly, I'm not smart enough to fully understand its complexities! 🤪

But my basic understanding is that it's essentially a model (code) that's been trained on a massive amount of existing text (and data), and is able to predict the next most probable word in a sentence based on the words that have come before it.

Geeks for Geeks has a much more in-depth article on the topic if you're interested in learning more.
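To make that "next most probable word" idea a little more concrete, here's a toy sketch in TypeScript. To be clear, this is nothing like a real LLM - actual models use neural networks over tokens and consider a huge amount of context - but a simple bigram counter that always picks the most frequent next word shows the basic prediction loop:

```typescript
// Toy next-word predictor: count word pairs in a tiny corpus, then
// always pick the most frequent follower. Real LLMs use neural
// networks over tokens, but the core idea is the same: predict the
// next most probable word given what came before.
const corpus = "the cat sat on the mat the cat ate the fish";
const words = corpus.split(" ");

// Bigram counts: how often does `next` follow `prev`?
const counts = new Map<string, Map<string, number>>();
for (let i = 0; i < words.length - 1; i++) {
  const prev = words[i];
  const next = words[i + 1];
  const followers = counts.get(prev) ?? new Map<string, number>();
  followers.set(next, (followers.get(next) ?? 0) + 1);
  counts.set(prev, followers);
}

// Return the most probable next word after `prev`, if any.
function predictNext(prev: string): string | undefined {
  const followers = counts.get(prev);
  if (!followers) return undefined;
  let best: string | undefined;
  let bestCount = 0;
  for (const [word, count] of followers) {
    if (count > bestCount) {
      best = word;
      bestCount = count;
    }
  }
  return best;
}

console.log(predictNext("the")); // "cat" ("cat" follows "the" twice)
```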

For things like computer code especially, where the rules are somewhat straightforward, it's easy to see how LLM-based AI can really shine.

Using ChatGPT to be more productive

When I finally decided to look into what all the fuss was about a few years ago, LLMs had already gained quite a bit of momentum, and had grown considerably more capable.

So I initially tried it for some trivial things, like asking it how it was doing or asking it to tell me a joke.

[Image: ChatGPT telling a joke]

But I quickly moved to testing it out with code snippets, using GitHub Copilot in VS Code, and I was blown away by how well it could predict what I was trying to write.

I mean, it was entirely wrong a lot of the time... probably 50% or more of the time. And often it would get into these endless loops, repeating the same 5 to 10 lines of code over and over again.

But when it got the code right, holy crap was it ever right! A bit eerie really, as it was able to predict what I was trying to write before I even finished typing it out. It was like having a pair programmer that was able to read my mind.

[Image: GitHub Copilot in action]

But was it really making me more productive? I was able to write code faster, but I was also spending a lot of time correcting it, and a lot of time trying to coax it into suggesting the code I wanted.

But I digress...

Are Developers Doomed?

Ok, so back to the topic at hand: the future of AI and developers.

Whenever I see this topic come up online, it's quickly shut down by a lot of devs who say that AI will never be able to replace them, because there's just too much creativity and problem-solving involved in what they do.

Developing an application isn't just about hacking code after all.

It's about understanding the initial problem or design request. It's about getting all of the details, and understanding the requirements. It's about architecting the environment and figuring out what framework, DB, and APIs will be needed, how to structure the code, and how to make it maintainable and scalable. It's about testing and debugging, and making sure that the code is secure and efficient.

So there's a lot of creativity and problem-solving involved in what developers do, and it's not something that can be easily replaced by a machine... Yet.

And I think that's the key thing here: LLMs might not be able to do everything yet, but they're getting better all the time. And as they get better, they're going to start to encroach on more and more of the tasks that we do.

It's not just techies that are at risk

If you follow some of the AI-related forums online, then you might have come across some of the images that some people are generating with AI. Or perhaps you've seen some of the videos or "movies" being created by AI.

Some of them have pretty obvious flaws, like people with three arms, extra fingers, faces that are all distorted, or things like cars appearing out of nowhere or seemingly gliding sideways on the pavement. But some of them are pretty convincing, and it's getting harder and harder to tell what's real and what's not.

But AI-generated code, AI-generated images, and AI-generated videos all have one thing in common: "AI Prompting".

AI and Prompting

So what is AI Prompting?

Well, in a nutshell, prompting, as the name implies, is simply the process of giving the AI a starting point - the instructions or directions it needs to get going - and then letting it generate the rest of the content from there.

You can prompt with a simple sentence like "Using TypeScript, provide me with code to connect to a T-SQL database using [insert library name]", or you can prompt it with much more complex, multi-sentence, multi-paragraph instructions that go into a lot of specific detail.
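For illustration, here's roughly the kind of snippet such a prompt might come back with. I'm using the mssql npm package as one possible library choice, and all of the connection details are placeholders, so treat this as a sketch rather than the exact output you'd get:

```typescript
// A sketch of what the prompted code might look like, using the
// `mssql` npm package as an example library (npm install mssql).
// All connection details below are placeholders.
import sql from "mssql";

const config: sql.config = {
  user: "your_username",
  password: "your_password",
  server: "localhost",
  database: "your_database",
  options: {
    encrypt: true, // required for Azure SQL
    trustServerCertificate: true, // dev-only: skip certificate validation
  },
};

async function main(): Promise<void> {
  try {
    // Open a connection pool and run a simple test query.
    const pool = await sql.connect(config);
    const result = await pool
      .request()
      .query("SELECT TOP 10 * FROM Users"); // placeholder table
    console.log(result.recordset);
  } catch (err) {
    console.error("Database connection failed:", err);
  } finally {
    await sql.close();
  }
}

main();
```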

From what I've seen, the really well done AI-generated content is usually prompted with a lot of detail. The devil is, after all, in the details, right?

Creators - or Prompters?

In the context of using AI to generate content - whether code, images, music, videos, etc. - do you even really need to know how to create the content yourself when you have AI at your disposal?

I think the answer is both yes, and no.

I think for simpler AI generation, you can get away with having no knowledge or training in the area you're generating content for. But for more complex AI generation, you're going to need some kind of foundation in the area, so that you can recognize when things go wrong and fix the issues.

And so we come to my point: Developers, more and more, are going to need to learn how to prompt AI.

The Future of Developers

In fact, this is exactly what my prediction revolves around - the need to understand proper AI prompting.

I see companies leaning more and more towards senior developers, and less towards intermediate and junior developers. You can see it already in online job postings, where most are for senior devs, or for "intermediate" devs with a resume (but not a salary!) that reads like a senior dev's.

Now, I'm not necessarily saying the current shift is because of AI. Companies already started trimming the fat after COVID, and as always, they continue to look for any way to reduce the workforce and save $$$. AI is one possible way to do that.

My Prediction

Ok, so finally, my prediction is this:

Going forward, as AI becomes more commonplace in tech companies, I believe that tech companies will stop, or at least limit, hiring beginner and intermediate devs. Instead, they'll hire for a new, hybrid type of dev role: one that requires a lot of coding experience, but also mastery of the art of prompting AI to generate production-ready code.

Essentially, a senior dev/QA type of hybrid role. A bit like an AI code whisperer, if you will.

The job of this "AI code whisperer" will primarily entail prompting the AI to generate code, while being experienced and knowledgeable enough in multiple languages and frameworks to fix any issues or bugs, or to prompt the AI into fixing them when it's wrong. But the main criterion will be knowing how to prompt the AI to architect and code an application from start to finish.

Since when can humans predict the future?

If there's one thing humans are really bad at, though, it's making accurate predictions about the future, and I'm certainly no exception.

So take my prediction with a grain of salt. I'm probably way off.

But I think it's an interesting thought experiment, and I'm curious what other people's thoughts are on my "AI code whisperer" prediction.


Am I crazy? Or do you think I might be onto something?

Let me know what you think in the comments, and if you disagree, why?
