## Built-in providers
| Provider | Name | Env var |
|---|---|---|
| OpenAI | openai | OPENAI_API_KEY |
| Anthropic | anthropic | ANTHROPIC_API_KEY |
| Google Gemini | google | GOOGLE_API_KEY |
| Mistral | mistral | MISTRAL_API_KEY |
| Cohere | cohere | COHERE_API_KEY |
| OpenRouter | openrouter | OPENROUTER_API_KEY |
## Inline flags
Pass `--model` (or `-m`) once per model, using the `provider/model` format:
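For example (the binary name `compare` is a placeholder here; substitute the tool's actual command):

```shell
# Hypothetical command name used for illustration only
compare --model openai/gpt-4o \
        --model anthropic/claude-3-5-sonnet \
        -m google/gemini-1.5-pro
```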
## Config file
For more than a couple of models, use a config file; `.yaml`, `.json`, and `.toml` are supported.
The `label` field sets the display name in results. It is optional and defaults to `provider/model`.
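A minimal sketch of a YAML config, assuming a top-level `models` list (the exact schema beyond `label` and the `provider/model` naming is an assumption):

```yaml
# models.yaml — field names other than label are illustrative
models:
  - model: openai/gpt-4o
    label: GPT-4o
  - model: anthropic/claude-3-5-sonnet
    # no label: displays as anthropic/claude-3-5-sonnet
```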
## OpenRouter
OpenRouter gives you access to 200+ models across every major provider with a single API key. Model names follow the `provider/model` format listed on openrouter.ai/models.
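A sketch of an OpenRouter config entry, assuming the same hypothetical schema as above; the model name after `openrouter/` is copied verbatim from openrouter.ai/models:

```yaml
models:
  - model: openrouter/meta-llama/llama-3.1-70b-instruct
    label: Llama 3.1 70B (OpenRouter)
```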
## Local models (vLLM / Ollama)
Any OpenAI-compatible local server works via the `openai` provider with a custom `base_url`.
- vLLM
- Ollama
Start the server, then point a config entry at it:
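For example, both vLLM and Ollama expose OpenAI-compatible endpoints under `/v1` (vLLM on port 8000 by default, Ollama on 11434):

```shell
# vLLM: serves an OpenAI-compatible API at http://localhost:8000/v1
vllm serve meta-llama/Llama-3.1-8B-Instruct

# Ollama: pulls and runs the model; API at http://localhost:11434/v1
ollama run llama3.1
```

The config entry then uses the `openai` provider with that base URL (field names other than `base_url` follow the assumed schema above):

```yaml
models:
  - model: openai/llama3.1        # local model routed through the openai provider
    label: Llama 3.1 (local)
    base_url: http://localhost:11434/v1
```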
## Custom providers
Subclass `Provider` and register it:
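The tool's actual `Provider` base class and registration hook are not shown here, so the following is a self-contained sketch of the pattern: `PROVIDERS`, `register_provider`, the `Provider` stand-in, and its `complete()` method are all illustrative names, not the real API.

```python
# Stand-in registry and base class, mirroring the subclass-and-register
# pattern described above. All names here are hypothetical.
PROVIDERS: dict[str, type] = {}

def register_provider(name: str):
    """Class decorator that records a provider under the given name."""
    def wrap(cls):
        PROVIDERS[name] = cls
        return cls
    return wrap

class Provider:
    """Illustrative base class: subclasses implement complete()."""
    def complete(self, prompt: str) -> str:
        raise NotImplementedError

@register_provider("echo")
class EchoProvider(Provider):
    """Toy provider that echoes the prompt back, for demonstration."""
    def complete(self, prompt: str) -> str:
        return f"echo: {prompt}"
```

Once registered, the provider can be referenced by name like any built-in, e.g. `echo/whatever` in a `--model` flag or config entry.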