Installation

Billy needs Ollama to talk to AI models.

  • Billy Slim — install Ollama separately
  • Billy Full — Ollama is bundled, auto-starts on first run
Install Billy with the one-line script:
curl -fsSL https://raw.githubusercontent.com/jd4rider/billy-app/main/scripts/install.sh | bash

If you installed Billy Slim, also install Ollama from ollama.com (Billy Full bundles it).
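Billy needs a running Ollama server to reach. By default Ollama listens on localhost port 11434 (an Ollama default, not something Billy-specific), so a quick way to confirm it is up:

```shell
# Ask the local Ollama server for its version; if nothing answers,
# print a hint instead (Ollama listens on port 11434 by default).
curl -fsS http://localhost:11434/api/version || echo "Ollama is not running; start it with: ollama serve"
```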

Verify the install:
billy --version
Pull at least one model:
ollama pull mistral # great all-rounder, ~4GB
# or
ollama pull llama3.2 # fast 3B model, ~2GB

Then run billy and start chatting.
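If something goes wrong at this point, a common cause is one of the two commands not being on your PATH. A small sanity check (this only probes the shell's command lookup; it assumes the binaries are named billy and ollama, as above):

```shell
# Report whether the billy and ollama commands are installed and on PATH.
for cmd in billy ollama; do
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "$cmd: found at $(command -v "$cmd")"
  else
    echo "$cmd: missing"
  fi
done
```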