Local and offline AI code assistant for VS Code with Ollama and Sourcegraph
Thor
Posted on June 7, 2024
I recently learned that Sourcegraph's AI coding assistant Cody can be used offline by connecting it to a locally running Ollama server.
Now, unfortunately, my little old MacBook Air doesn't have enough VRAM to run Mistral's 22B Codestral model, but fear not: I found that the Llama 3 8B model works quite well at powering both code completion and code chat workloads!
Let's have a look at how we can set this up with VS Code for absolute offline / in-flight coding bliss:
Install Ollama and pull Llama 3 8B
- Install Ollama
- Run ollama pull llama3:8b to download the model.
- Once the download has completed, run ollama serve to start the Ollama server.
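If you want to double-check that everything is in place before going offline, you can list the locally available models and ping the running server. This is just a quick sanity check, assuming Ollama's default port of 11434:

ollama list
curl http://127.0.0.1:11434/api/tags

Both should mention llama3:8b; if they don't, the pull didn't finish or the server isn't running yet.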
Configure Sourcegraph Cody in VS Code
- Install the Sourcegraph Cody VS Code extension.
- Add the following to your VS Code settings (settings.json):
{
  // ...
  // Cody autocomplete configuration:
  "cody.autocomplete.advanced.provider": "experimental-ollama",
  "cody.autocomplete.experimental.ollamaOptions": {
    "url": "http://127.0.0.1:11434",
    "model": "llama3:8b"
  },
  // Enable Ollama for Cody Chat:
  "cody.experimental.ollamaChat": true,
  // Optional, but useful for seeing detailed logs in the OUTPUT tab
  // (make sure to select "Cody by Sourcegraph" from the dropdown)
  "cody.debug.verbose": true
  // ...
}
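If completions or chat don't seem to respond, a quick way to rule out the Ollama side is to send a request directly to the endpoint configured above. A minimal smoke test, assuming the default URL and the llama3:8b model from the settings:

curl http://127.0.0.1:11434/api/generate -d '{
  "model": "llama3:8b",
  "prompt": "Write a one-line docstring for a function that reverses a string.",
  "stream": false
}'

If that returns a JSON response with generated text, the server side is fine and any remaining issue is in the Cody / VS Code configuration.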
Start Cody and enjoy your local, offline AI code assistant
That's it! As long as Ollama is running in the background, you should now have a fully functional offline AI code assistant for VS Code with Cody. This setup lets you use both code completion and code chat without relying on any external services or an internet connection. In fact, most of this last paragraph was written by Llama 3 8B itself.
For Cody Chat, make sure to select the llama3:8b Experimental option from the dropdown and you're good to go! Happy Cod(y)ing \o/