The Adventures of Blink #26: GPT4All! (and All4GPT!)
Ben Link
Posted on June 6, 2024
If you've managed to find my blog, I'm guessing you're at least generally aware of the tech industry and the massive disruption we're currently experiencing as a result of huge advancements in LLMs (Large Language Models). Even if you aren't familiar with the specific jargon, there are very few people with internet access who haven't heard of ChatGPT... which is amazing. The rate at which awareness of this technology has spread is staggering...
The wildest part is that it's not slowing down yet. Many experts predict that this AI revolution will be an even larger disruption than the adoption of the world wide web by everyday users in the 90s.
Paying to play
One of the hardest parts of learning a new technology is that the frontrunners monetize it so fast. Oh, sure, if you just want to talk to ChatGPT or Gemini, you can do it in a web page window for free... but if you're a builder who would like to write code that uses this technology to do something interesting, you need an API key, and you pay for it per use. Even at very low prices, that adds up fast for someone doing a fair amount of experimenting! Getting in on the latest & greatest models can quickly become cost-prohibitive.
An alternative
The thing about this monetization is that you're paying for access to the latest & greatest... but if you're a brand-new builder and you've not done work in this space before, you can go a LONG way with some of the older models!
This is where I find myself - I've studied a lot, even interviewed with some AI-based companies... but when it comes to putting code in an editor I'm still fairly new at the whole AI scene. So I'm always on the lookout for a low-cost way to skill up... and that's what we're doing today!
We're going to experiment with GPT4All - a free platform that gives you access to several GPT-style models that you download and run locally. Running a model on your own hardware will of course be slower than a hosted service, and the models themselves aren't the most cutting-edge tools... but if you're learning the ropes, it's a viable way to start at very little cost. So if you're ready to start working on your AI development skills... come on, let's go build some stuff!
TL;DR: YouTube
As is my custom with posts where I build things, I'm providing a way to follow along on YouTube if you'd rather watch a video than read all the words. Just please leave me a like and subscribe!
Getting started with GPT4All
The first step in getting started with GPT4All is installing the app, located here. Note that the app isn't strictly required to use the models, but it provides a neat interface for interacting with LLMs as you begin experimenting.
Today's post is about getting started with this interface. If you're here to learn how to write code that interacts with the models... sorry! Next week we'll cover how to build a Python app that uses an LLM, so come see me next Thursday!
Uh, isn't this just what I can do from ChatGPT's website?
Well... sort of.
GPT4All runs the model locally: that is, directly on your computer. This means your privacy is protected - you're not calling code that's running on someone else's server. Any data you share with the model is stored only on your computer, and processed only on your computer.
It has the pricey models too, as long as you've paid for them
GPT4All can also connect you easily with the latest from OpenAI or Zilliz or any of the other pay-to-play AI tools: you just provide the API key you've paid for, and the platform integrates them so you have a one-stop shop for experimentation. Just be aware that this doesn't eliminate the cost of using those tools, and those integrations absolutely will send your data to the provider!
Custom Data Stores
GPT4All also has a cool option built into the app where you can create a "data store". You see, LLMs are great at regurgitating data they've been trained on, but maybe you want a bot that knows you well. Or maybe you want it to know about the internal processes of your company... you know, data that isn't readily available on the internet. GPT4All lets you put all those documents in a folder it knows about, and it gives the model access to them so it can "learn" that data! (Again, this is safe & secure because the model doing the learning is the one you've downloaded and run locally - it's not talking to the internet!)
Ok, the installer finished. Now what?
We're going to start with one of the free, open-source models in the GPT4All library today. Once we have it working, we'll add in some data and interact with our data store to see how it works.
When you start up GPT4All, you'll have something like this appear:
The first thing you'll need to do is select a model to use. There are several ways to do this: you can use the "Download Models" button in the middle of the screen, navigate through the gear icon in the top right, or use the "Downloads" button in the bottom left.
You have to choose which model you want to work with, because, well... you have options here! As the screenshot hints, Mistral, Falcon, Llama, MPT, Replit, and more are available to you. For my purposes today, I picked Llama 3. Once you've let the download complete, you'll be able to select your model from the drop-down at the top of the main screen:
This drop-down will be super helpful if you find yourself wanting to try out multiple models and switch between them, for comparative purposes.
After you've done this, you're ready to chat with your model. Just send your message down at the bottom of the screen:
It will, of course, respond in whatever way it sees fit:
We've seen this before, Blink.
Yeah, I know - while there's still so much wonder around the chat-with-an-AI paradigm, it's pretty much old news at this point; we've all tried this with a bot or four already! You're here because you want to see how to do more.
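One quick taste of that "more" before we move on: the same model you just downloaded in the app can also be driven from Python using the gpt4all package, which is exactly where next week's post is headed. Here's a minimal sketch to whet your appetite - note that the model filename below is an assumption on my part; use whatever name appears in your app's download list, and the package will fetch the model if it isn't already on disk.

```python
# A minimal sketch of driving a local GPT4All model from Python.
# Requires: pip install gpt4all
from gpt4all import GPT4All

# Example filename - match it to the model you downloaded in the app.
model = GPT4All("Meta-Llama-3-8B-Instruct.Q4_0.gguf")

# A chat session keeps conversational context between prompts.
with model.chat_session():
    reply = model.generate(
        "Explain what a large language model is, in two sentences.",
        max_tokens=256,
    )
    print(reply)
```

Everything here still runs locally - the package just gives you programmatic access to the same models the app uses.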
So let's look at a really cool feature that's built into GPT4All - you can add a document store to its knowledge and ask questions about it!
Adding some data
You need a good, solid data set to make this work. For my own experimentation, I used my personal journal... I've been working occasionally on my autobiography and other notes as a bit of personal therapy, and I have about 60 pages written so far. So I saved a PDF of it to my Downloads folder to work with.
Obviously I'm not sharing my journal with all of y'all, though! So my encouragement is to find some large body of writing that's unlikely to have been in the model's training sets (that is, stuff you can't find easily on the web).
Aren't you a little worried about putting your diary into an LLM???
Well, no. It's a local copy, remember? What happens in my laptop stays in my laptop!
Back to adding the data
To add a local data set, click the Database Icon in the top right of GPT4All:
From there, select the "add" button and then build your dataset. You can select a folder and all its contents will be indexed:
When you return to the main page, you can select the model you want to use to converse with the data, and begin asking it questions about the content.
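If you're curious what's conceptually happening when you "converse with the data": features like this generally work by retrieval - your documents get split into chunks, the chunks most relevant to your question are looked up, and those excerpts get stuffed into the prompt as context. The sketch below is NOT GPT4All's actual implementation (which, as I understand it, uses proper embedding models under the hood); it's just a toy, plain-Python illustration of the idea using naive word-overlap scoring.

```python
# Toy illustration of retrieval-augmented generation (RAG):
# split documents into chunks, score each chunk against the question,
# and prepend the best matches to the prompt as context.
# Uses naive word-overlap scoring, NOT real embeddings.

def chunk(text: str, size: int = 500) -> list[str]:
    """Split a document into fixed-size character chunks."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def score(question: str, passage: str) -> int:
    """Count how many question words appear in the passage (toy relevance score)."""
    question_words = set(question.lower().split())
    return sum(1 for word in passage.lower().split() if word in question_words)

def build_prompt(question: str, documents: list[str], top_k: int = 3) -> str:
    """Pick the most relevant chunks and build a context-stuffed prompt."""
    chunks = [piece for doc in documents for piece in chunk(doc)]
    best = sorted(chunks, key=lambda c: score(question, c), reverse=True)[:top_k]
    context = "\n---\n".join(best)
    return (
        "Use the following excerpts from my documents to answer.\n\n"
        f"{context}\n\nQuestion: {question}"
    )

if __name__ == "__main__":
    journal = ["I met Sam back in 2003 and we have been close friends ever since..."]
    print(build_prompt("Who would you say was your best friend?", journal))
```

The real thing scores relevance with embedding vectors instead of word overlap, but the shape of the workflow is the same: retrieve the relevant excerpts, then generate an answer from them.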
Suggestions for improving your experience with the data
Most of the fun is exploring on your own, but if you find yourself feeling stuck, here are some things I tried that made a big difference in the experiment for me:
- Initiate role-play.
I was using my own personal journals... so I prompted the model like this:
Please pretend to be the author of this document and answer any questions in the style and tone that he would use.
This made a tremendous difference in the conversational aesthetic; I was able to hear my own emotions and tone (and even a bit of sarcasm!) reflected back to me as I asked it questions. (There's a small code sketch of this role-play idea just after this list.)
- Ask broad questions... but not too broad.
LLMs tend to want to please you with their responses. So if the model doesn't know something, it may completely make something up in order to provide what it thinks would be a "satisfying" answer. One way to combat this tendency is to make sure your questions aren't overly specific or overly general. It's tough to find that "just right" spot, but if you work at it, you'll see a difference!
Who would you say was your best friend?
This query gave me a VERY good answer! It picked a person mentioned in my journal, and explained its choice using specifics from what I'd written.
Tell me something you're really proud of.
This query got VERY hallucinatory. It actually picked an extremely embarrassing memory and then tried to twist it into a "life lesson"... which actually might be something that I would do, but overall I felt like the answer missed the mark.
- Ask analysis questions.
Sometimes picking favorites or recounting stories will lead to hallucination - but if you ask for an analysis of the author and their writing style, you'll get some really intriguing insights.
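And as promised above, here's the small sketch of the role-play idea in code form - a preview of next week, using the gpt4all Python package. The model filename and the exact keyword arguments are assumptions on my part and may vary between versions of the package, so treat this as a sketch rather than gospel.

```python
# Sketch: the same role-play prompt, set up as a system prompt for a Python chat session.
# Requires: pip install gpt4all
from gpt4all import GPT4All

# Example filename - match it to whatever model you downloaded in the app.
model = GPT4All("Meta-Llama-3-8B-Instruct.Q4_0.gguf")

# The system prompt establishes the persona before any questions are asked.
persona = (
    "Please pretend to be the author of this document and answer any "
    "questions in the style and tone that he would use."
)

with model.chat_session(system_prompt=persona):
    answer = model.generate("Who would you say was your best friend?", max_tokens=512)
    print(answer)
```

Note that this snippet on its own doesn't attach your document store; combining the role-play prompt with retrieved excerpts (like in the toy sketch earlier) is left as an exercise.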
Wrapping Up
This has been SUCH a fun project, and even with a "lesser" model, the responses were insightful and interesting... and I haven't even written code yet! If you're not playing with AI and learning how to make it do things yet, you absolutely need to get on board quickly; there's immense power wrapped up in this!
And funny enough... that's my plan for next week's post. We're going to expand on the use of GPT4All and connect a free model with a Python program to see if we can learn how to begin manipulating AI in code! Come back next week to try it out in the next Adventure of Blink!