Configuring Ollama and Continue VS Code Extension for Local Coding Assistant


Prerequisites

First, install Ollama (https://ollama.com) and pull the CodeLlama model:

ollama pull codellama

You can also pull StarCoder2 3B for code autocomplete by running:

ollama pull starcoder2:3b

NOTE: Choose models that fit your hardware; a model larger than your available RAM/VRAM will run slowly or fail to load.
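
Before moving on, it can help to confirm that the models downloaded and that the Ollama server is reachable on its default port (11434):

# list locally available models
ollama list

# send a one-off prompt to the server's generate endpoint
curl http://localhost:11434/api/generate -d '{"model": "codellama", "prompt": "Write hello world in Python", "stream": false}'

If the curl call returns JSON containing generated code, Ollama is up and Continue will be able to reach it.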

Installing and configuring Continue

You can install Continue from the VS Code Marketplace by searching for "Continue" in the Extensions view.
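
Alternatively, if the `code` command-line tool is on your PATH, the extension can be installed from a terminal (this assumes the Marketplace ID is still Continue.continue):

# install the Continue extension from the command line
code --install-extension Continue.continue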

After installation, you should see it in the sidebar as shown below:

[Screenshot: Continue in the VS Code sidebar]

Configuring Continue to use a local model

Click on the settings icon:

[Screenshot: Continue settings icon]

This opens Continue's config file (config.json). Add an entry for CodeLlama to the "models" array:

{
  "apiBase": "http://localhost:11434/",
  "model": "codellama",
  "provider": "ollama",
  "title": "CodeLlama"
}
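
For reference, a complete config.json that also wires up StarCoder2 3B for tab autocomplete might look like the sketch below. The "tabAutocompleteModel" block is an assumption based on Continue's config.json schema; adjust the model names to whatever you actually pulled:

{
  "models": [
    {
      "apiBase": "http://localhost:11434/",
      "model": "codellama",
      "provider": "ollama",
      "title": "CodeLlama"
    }
  ],
  "tabAutocompleteModel": {
    "title": "StarCoder2 3B",
    "provider": "ollama",
    "model": "starcoder2:3b"
  }
}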

[Screenshot: updated config]

Select CodeLlama, which will appear in the model dropdown once you add it to the config:

[Screenshot: picking the newly added model from the dropdown]

You can also chat with the model as you normally would:

[Screenshot: chat]

And you can select a code block in a file and ask the AI about it:

[Screenshot: asking about selected code]
