
Model Management

Wilson supports 10 LLM providers with dozens of models. Use `/model` to switch between them and `/pull` to download local models.

Run `/model` without arguments to open the interactive model selector:

```
/model
```

1. Pick a provider (Ollama, Transformers.js, OpenAI, Anthropic, etc.)
2. Pick a model from the provider’s list, or type a custom model name
3. Wilson switches immediately — the next message uses the new model

Skip the selector by passing a model name directly:

```
/model ollama:qwen3:8b
/model gpt-4.1
/model claude-sonnet-4-6
/model transformers:HuggingFaceTB/SmolLM3-3B-ONNX
```

Download Ollama or Transformers.js models with progress reporting:

```
/pull qwen3:8b                                    # Ollama model
/pull transformers:HuggingFaceTB/SmolLM3-3B-ONNX  # Transformers.js model
```

For Ollama models, this calls `ollama pull` under the hood. For Transformers.js models, it downloads ONNX weights from the HuggingFace Hub to `~/.openaccountant/models/`.
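The routing described above can be sketched as a small dispatch on the model-name prefix. The function name and return shape below are illustrative assumptions, not Wilson's actual internals:

```typescript
// Illustrative sketch of /pull dispatch, based on the behavior described
// above. Names and shapes are assumptions, not Wilson's real API.
function resolvePullAction(name: string): { backend: string; model: string } {
  if (name.startsWith("transformers:")) {
    // Transformers.js models: ONNX weights are fetched from the
    // HuggingFace Hub into ~/.openaccountant/models/
    return { backend: "transformers.js", model: name.slice("transformers:".length) };
  }
  // Anything else is handled by Ollama, effectively `ollama pull <name>`
  return { backend: "ollama", model: name };
}
```

For example, `resolvePullAction("transformers:HuggingFaceTB/SmolLM3-3B-ONNX")` selects the Transformers.js backend with the bare HuggingFace repo id, while `resolvePullAction("qwen3:8b")` goes to Ollama.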

These models work with Wilson’s agent tool system — they can call tools like `search_transactions`, `categorize`, and `import_csv`.

| Model | Size | Description |
| --- | --- | --- |
| qwen3:0.6b | 523 MB | Alibaba — smallest tool-calling model |
| qwen3:4b | 2.5 GB | Alibaba — rivals 72B quality, tool-calling |
| qwen3:8b | 5.2 GB | Alibaba — best balance, tool-calling |
| granite4:3b | 2.1 GB | IBM Granite 4 — tool-calling, 128K context |
| granite4:tiny-h | 4.2 GB | IBM Granite 4 MoE — 7B, tool-calling |
| ministral-3:3b | 2.1 GB | Mistral — vision + tool-calling, 256K context |

The following models are good for conversation and analysis, but may not reliably call tools.

| Model | Size | Description |
| --- | --- | --- |
| gemma3:4b | 3.3 GB | Google — multimodal, 140+ languages |
| smollm2:1.7b | 1.8 GB | HuggingFace — tiny but capable |
| gemma3:1b | 815 MB | Google — ultralight, 32K context |
| granite4:350m | 708 MB | IBM — smallest Granite, edge-ready |

Set `DEFAULT_MODEL` in `.env` for headless mode and cron jobs:

```
DEFAULT_MODEL=ollama:qwen3:8b
```

This is used when Wilson starts without an interactive model selection — for example, with `wilson --run "..."` or `wilson --sync`.
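A minimal sketch of that fallback, assuming `DEFAULT_MODEL` is read straight from the environment (the function name is illustrative, not Wilson's internals):

```typescript
// Sketch: resolve the model for a headless run. Returns undefined when
// DEFAULT_MODEL is unset or blank, in which case an interactive
// selection would be required instead.
function headlessModel(env: Record<string, string | undefined>): string | undefined {
  const model = env.DEFAULT_MODEL?.trim();
  return model ? model : undefined;
}
```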

The model name prefix determines which provider handles the request:

| Prefix | Provider | Example |
| --- | --- | --- |
| `ollama:` | Ollama | `ollama:qwen3:8b` |
| `transformers:` | Transformers.js | `transformers:HuggingFaceTB/SmolLM3-3B-ONNX` |
| `openrouter:` | OpenRouter | `openrouter:anthropic/claude-3.5-sonnet` |
| `litellm:` | LiteLLM | `litellm:gpt-4o` |
| `claude-` | Anthropic | `claude-sonnet-4-6` |
| `gpt-` | OpenAI | `gpt-4.1` |
| `gemini-` | Google | `gemini-3-flash-preview` |
| `grok-` | xAI | `grok-4-0709` |
| `kimi-` | Moonshot | `kimi-k2-5` |
| `deepseek-` | DeepSeek | `deepseek-chat` |

A model name with no recognized prefix defaults to the OpenAI provider.
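The table above amounts to a prefix lookup with an OpenAI fallback. A hedged sketch of that rule (not Wilson's actual code):

```typescript
// Prefix-to-provider mapping, copied from the routing table above.
const prefixes: [string, string][] = [
  ["ollama:", "Ollama"],
  ["transformers:", "Transformers.js"],
  ["openrouter:", "OpenRouter"],
  ["litellm:", "LiteLLM"],
  ["claude-", "Anthropic"],
  ["gpt-", "OpenAI"],
  ["gemini-", "Google"],
  ["grok-", "xAI"],
  ["kimi-", "Moonshot"],
  ["deepseek-", "DeepSeek"],
];

// Return the provider for a model name; no recognized prefix falls
// back to OpenAI, matching the rule stated above.
function providerFor(model: string): string {
  for (const [prefix, provider] of prefixes) {
    if (model.startsWith(prefix)) return provider;
  }
  return "OpenAI";
}
```

For instance, `providerFor("claude-sonnet-4-6")` resolves to Anthropic, while an unprefixed name falls through to OpenAI.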