Configuring Ollama and Continue VS Code Extension for Local Coding Assistant
Manjush
Posted on June 22, 2024
Prerequisites
- Ollama installed on your system. You can visit the Ollama website and download the application for your operating system.
- The AI model we will be using here is Code Llama, but you can use your preferred model. Code Llama is a model for generating and discussing code, built on top of Llama 2. It supports many of the most popular programming languages, including Python, C++, Java, PHP, TypeScript (JavaScript), C#, Bash, and more. If it is not installed, you can pull it with the following command:
```bash
ollama pull codellama
```
You can also install Starcoder 2 3B for code autocomplete by running:

```bash
ollama pull starcoder2:3b
```
NOTE: It's crucial to choose models that are compatible with your system to ensure smooth operation and avoid any hiccups.
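Before moving on, it is worth confirming that the Ollama server is running and the models were actually pulled. A minimal check, assuming Ollama is listening on its default port 11434, looks like this:

```bash
# List the models that have been pulled locally
ollama list

# Hit the Ollama REST API directly; it should return the installed models as JSON
curl http://localhost:11434/api/tags
```

If either command fails, start the server with `ollama serve` (or launch the desktop app) before continuing.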
Installing and configuring Continue
You can install Continue from the VS Code Marketplace.
After installation, you should see it in the sidebar, as shown below:
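If you prefer the terminal, the extension can also be installed with the VS Code CLI. The sketch below assumes the marketplace identifier is Continue.continue (the publisher/extension ID shown on the extension's marketplace page):

```bash
# Install the Continue extension from the command line
code --install-extension Continue.continue
```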
Configuring Continue to use the local model
Click on the settings icon:
Add the model configuration:

```json
{
  "apiBase": "http://localhost:11434/",
  "model": "codellama",
  "provider": "ollama",
  "title": "CodeLlama"
}
```
Also add a tabAutocompleteModel entry:

```json
"tabAutocompleteModel": {
  "apiBase": "http://localhost:11434/",
  "title": "Starcoder2 3b",
  "provider": "ollama",
  "model": "starcoder2:3b"
}
```
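Putting the two snippets together, the full configuration file (typically ~/.continue/config.json) might look roughly like the sketch below. The chat model entry sits inside Continue's "models" array, and your file may contain additional keys that Continue generates by default:

```json
{
  "models": [
    {
      "title": "CodeLlama",
      "provider": "ollama",
      "model": "codellama",
      "apiBase": "http://localhost:11434/"
    }
  ],
  "tabAutocompleteModel": {
    "title": "Starcoder2 3b",
    "provider": "ollama",
    "model": "starcoder2:3b",
    "apiBase": "http://localhost:11434/"
  }
}
```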
Select CodeLlama, which will be visible in the dropdown once you have added it to the configuration.
You can also chat with the model as usual, as shown below.
You can also select a block of code in a file and ask the AI about it:
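If the chat panel ever stops responding, a quick way to rule out the editor is to query the model through Ollama's generate endpoint directly. This is just a sanity check against the default local endpoint:

```bash
# Ask codellama for a completion outside of VS Code
curl http://localhost:11434/api/generate -d '{
  "model": "codellama",
  "prompt": "Write a Python function that reverses a string.",
  "stream": false
}'
```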