Ollama

Ollama is an open-source tool that lets you run large language models (LLMs) locally on your own computer. To use Ollama, install it, then download and start the model you want with the `ollama run` command.
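A typical first session looks like the sketch below. It assumes Ollama is already installed and on your PATH; the model tag is just an example from the Ollama model library.

```shell
MODEL="llama3.1:8b"            # example model tag

if command -v ollama >/dev/null 2>&1; then
  ollama pull "$MODEL"         # download the model weights
  ollama run "$MODEL"          # start an interactive chat session
else
  echo "ollama is not installed or not on PATH"
fi
```

`ollama pull` is optional here; `ollama run` will download the model first if it is not present locally.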

Chat model

We recommend configuring Llama3.1 8B as your chat model.

config.json
{
  "models": [
    {
      "title": "Llama3.1 8B",
      "provider": "ollama",
      "model": "llama3.1:8b"
    }
  ]
}
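Behind this configuration, Ollama serves the model over a local HTTP API (port 11434 by default). The sketch below, using only the standard library, sends a single non-streaming request to the `/api/chat` endpoint; the helper name `build_chat_request` is illustrative, not part of any library, and the call assumes an Ollama server is running locally.

```python
import json
import urllib.request


def build_chat_request(model: str, prompt: str) -> dict:
    """Assemble a non-streaming /api/chat payload (illustrative helper)."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }


payload = build_chat_request("llama3.1:8b", "Explain list comprehensions briefly.")

try:
    req = urllib.request.Request(
        "http://localhost:11434/api/chat",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        print(json.loads(resp.read())["message"]["content"])
except OSError:
    # Reached when no Ollama server is listening locally.
    print("Ollama is not reachable on localhost:11434")
```

The `"model"` value must match a tag you have already pulled with `ollama pull` or `ollama run`.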

Autocomplete model

We recommend configuring StarCoder2 3B as your autocomplete model.

config.json
{
  "tabAutocompleteModel": {
    "title": "StarCoder2 3B",
    "provider": "ollama",
    "model": "starcoder2:3b"
  }
}
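Autocomplete models are typically queried in fill-in-the-middle (FIM) mode: the model receives the code before and after the cursor and generates the span between them. The sketch below assembles a FIM prompt with the `<fim_prefix>`/`<fim_suffix>`/`<fim_middle>` sentinel tokens used by the StarCoder family (check the model card for the exact tokens of other models); `build_fim_prompt` is an illustrative name, not an Ollama API.

```python
def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Assemble a StarCoder-style fill-in-the-middle prompt (illustrative)."""
    return f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"


# Code before and after the cursor position:
before = "def add(a, b):\n    return "
after = "\n"
prompt = build_fim_prompt(before, after)
# A prompt like this could be sent as
# {"model": "starcoder2:3b", "prompt": prompt, "raw": true, "stream": false}
# to POST http://localhost:11434/api/generate.
```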

Embeddings model

We recommend configuring Nomic Embed Text as your embeddings model.

config.json
{
  "embeddingsProvider": {
    "provider": "ollama",
    "model": "nomic-embed-text"
  }
}
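Embedding models map text to vectors, and retrieval then ranks passages by vector similarity. The sketch below shows the payload shape for Ollama's `/api/embeddings` endpoint (the HTTP call itself is omitted since it needs a running server) and a cosine-similarity helper; `cosine` is an illustrative name, and the vectors are toy stand-ins for real embeddings.

```python
import math


def cosine(u: list[float], v: list[float]) -> float:
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm


# Payload shape for POST http://localhost:11434/api/embeddings:
payload = {"model": "nomic-embed-text", "prompt": "retrieval-augmented generation"}

# Given two response vectors, similarity is a single call:
sim = cosine([1.0, 0.0, 2.0], [1.0, 0.0, 2.0])  # identical vectors -> 1.0
```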

Reranking model

Ollama currently does not offer any reranking models.

See the list of reranking model providers in the documentation for alternatives.