# Configuration

Billy reads its configuration from `~/.localai/config.toml` on startup. If the file doesn't exist, built-in defaults are used.

```toml
[backend]
type = "ollama"
url = "http://localhost:11434"

[ollama]
model = "qwen2.5-coder:14b"
temperature = 0.7

[storage]
history_file = "~/.localai/history.db"
```
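The defaults above form a complete config file. As a minimal sketch, this shell snippet writes that file out, using a scratch directory so it can run anywhere (in practice Billy expects it at `~/.localai/config.toml`; the `CONFIG_DIR` variable is just for illustration):

```shell
# Bootstrap a config file with the documented defaults.
# CONFIG_DIR is a stand-in for ~/.localai in this sketch.
CONFIG_DIR="${CONFIG_DIR:-$PWD/.localai}"
mkdir -p "$CONFIG_DIR"

cat > "$CONFIG_DIR/config.toml" <<'EOF'
[backend]
type = "ollama"
url = "http://localhost:11434"

[ollama]
model = "qwen2.5-coder:14b"
temperature = 0.7

[storage]
history_file = "~/.localai/history.db"
EOF
```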
`[backend]`

| Key | Default | Description |
| --- | --- | --- |
| `type` | `ollama` | Backend to use. Currently only `ollama`. |
| `url` | `http://localhost:11434` | URL of the Ollama server. |

`[ollama]`

| Key | Default | Description |
| --- | --- | --- |
| `model` | `qwen2.5-coder:14b` | Default model to use. |
| `temperature` | `0.7` | Sampling temperature (0.0 = deterministic, 1.0 = creative). |

`[storage]`

| Key | Default | Description |
| --- | --- | --- |
| `history_file` | `~/.localai/history.db` | SQLite database for history & memories. |

All config values can be overridden with environment variables:

```shell
BILLY_MODEL=llama3.2 billy
BILLY_BACKEND_URL=http://192.168.1.10:11434 billy
```
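A typo in an override URL only surfaces once Billy tries to reach the backend. As a small sketch (the `check_backend_url` helper is hypothetical, not part of Billy), you can sanity-check the value before exporting it:

```shell
# Hypothetical helper: reject values that are not http(s) URLs
# before they are handed to billy via BILLY_BACKEND_URL.
check_backend_url() {
  case "$1" in
    http://*|https://*) echo "ok: $1" ;;
    *) echo "error: expected an http(s) URL, got: $1" >&2; return 1 ;;
  esac
}
```

For example, `check_backend_url "192.168.1.10:11434"` fails because the scheme is missing, which is an easy mistake to make when copying a host:port pair.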