
Running Ollama

Ollama is already configured to work locally. You just have to make sure the server can find it.

This has only been tested on Mac and Windows. If you find any errors with other operating systems, please create an issue.

Download and install Ollama

Go to the Ollama download page and follow Ollama's instructions to install and run it on your machine.

Ollama Base URL

By default, the ServerConfig has OLLAMA_BASE_URL set to http://127.0.0.1:11434/api. If this is not the correct base URL for your setup, set an OLLAMA_BASE_URL environment variable to override it.
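The override logic can be sketched in shell. This assumes the server falls back to the ServerConfig default when the environment variable is unset, which matches the behavior described above:

```shell
# If OLLAMA_BASE_URL is set, it wins; otherwise fall back to the
# ServerConfig default shown above.
BASE_URL="${OLLAMA_BASE_URL:-http://127.0.0.1:11434/api}"
echo "Using Ollama base URL: $BASE_URL"
```

To point the server at an Ollama instance on another host, export the variable before starting the server, e.g. `export OLLAMA_BASE_URL=http://192.168.1.50:11434/api`.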

Restart your server

When the server initializes, it looks for Ollama at the Ollama base URL. If it finds it, it adds Ollama to the available AI providers.
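Before restarting, you can check that Ollama is reachable yourself. This is a sketch assuming the default base URL; it probes Ollama's /api/tags endpoint, which lists locally installed models:

```shell
# Probe the default Ollama base URL. /api/tags lists installed models,
# so a successful response means Ollama is up at this address.
# Adjust the URL if you overrode OLLAMA_BASE_URL.
if curl -sf http://127.0.0.1:11434/api/tags >/dev/null; then
  echo "Ollama is reachable"
else
  echo "Ollama is not reachable"
fi
```

If the probe fails, confirm Ollama is running (`ollama serve` starts it in the foreground) and that the port matches your base URL before restarting the server.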
