# LLM Providers
TAU supports 75+ LLM providers. Bring your own API keys, or connect existing CLI tools and pay nothing beyond your subscription.
## Use Your Existing Subscription
Connect to Claude Code, GitHub Copilot, Gemini CLI, or Codex — no extra API costs:
```bash
# Claude Code (Claude Pro/Max subscription)
tau auth login claude-code

# GitHub Copilot (GitHub subscription)
tau auth login github-copilot

# Gemini CLI (Google AI subscription)
tau auth login gemini-cli

# Codex CLI
tau auth login codex
```

TAU acts as an ACP (Agent Client Protocol) adapter, so you use these models through your existing subscriptions.
## Quick Setup (API Keys)
```bash
# Set the API key for your preferred provider
export ANTHROPIC_API_KEY=sk-ant-xxx   # Claude
export OPENAI_API_KEY=sk-xxx          # GPT
export GOOGLE_API_KEY=xxx             # Gemini
export XAI_API_KEY=xxx                # Grok
export DEEPSEEK_API_KEY=xxx           # DeepSeek

# Run TAU - it auto-detects configured providers
tau
```

## Tier 1: Native Providers
These have dedicated implementations with full feature support:
| Provider | Env Variable | Models | Features |
|---|---|---|---|
| Anthropic | ANTHROPIC_API_KEY | Claude Opus 4.5, Sonnet 4.5, Haiku 4.5 | Streaming, tools, vision, prompt caching |
| OpenAI | OPENAI_API_KEY | GPT-5.2, GPT-4o, o1, o3 | Streaming, tools, vision, function calling |
| Google | GOOGLE_API_KEY | Gemini 3 Pro, 3 Flash | Streaming, tools, vision, context caching |
| xAI | XAI_API_KEY | Grok 4.1, Grok Code Fast 1 | Streaming, tools, fast inference |
| AWS Bedrock | AWS_ACCESS_KEY_ID | Claude, Nova, Llama | Cross-region inference |
| OpenRouter | OPENROUTER_API_KEY | 300+ models | Gateway to all providers |
| Ollama | OLLAMA_HOST | Llama 4, Mistral, Qwen, etc. | Local, free, offline |
| MLX | (auto-detected) | Local embeddings | macOS Apple Silicon only |
## CLI Providers (Use Your Subscription)
Connect to existing CLI agents and use models through your subscription:
### Claude Code (Claude Pro/Max)
```bash
# Requires: Claude Pro/Max subscription + Claude Code CLI installed
# Install: npm install -g @anthropic-ai/claude-code

# Authenticate
tau auth login claude-code

# Use Claude Code models (free with subscription)
tau run --provider claude-code "Explain this codebase"

# Or set as default
export TAU_ACP_AGENT=claude
```

### GitHub Copilot
```bash
# Requires: GitHub Copilot subscription

# Authenticate via OAuth
tau auth login github-copilot

# Use Copilot models (free with subscription)
tau run --model github-copilot/gpt-4o "Hello"
tau run --model github-copilot/claude-4-5-sonnet "Hello"

# Available models depend on your subscription tier
```

### Gemini CLI
```bash
# Requires: Gemini CLI installed + Google AI subscription

# Authenticate
tau auth login gemini-cli

# Use Gemini
tau run --provider gemini-cli "Explain this code"

# Or set as default
export TAU_ACP_AGENT=gemini
```

### Codex CLI
```bash
# Requires: Codex CLI installed

# Authenticate
tau auth login codex

# Use Codex
tau run --provider codex "Refactor this function"

# Or set as default
export TAU_ACP_AGENT=codex
```

Note: CLI providers use ACP (Agent Client Protocol). TAU acts as an adapter; the actual inference happens through the external CLI, not through direct API calls. This means your subscription limits apply.
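Because inference is delegated, the corresponding CLI has to be installed and on your PATH. A quick sanity check along these lines can help; the binary names `claude`, `gemini`, and `codex` are the usual ones for these tools and are assumed here:

```bash
# Check that the external CLIs TAU would delegate to are actually installed
for bin in claude gemini codex; do
  command -v "$bin" >/dev/null && echo "$bin: found" || echo "$bin: missing"
done
```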
## Tier 2: models.dev Providers (70+)
More than 60 additional providers from models.dev are supported via their OpenAI-compatible APIs. Run `curl -s https://models.dev/api.json | jq 'keys'` to see the full list.
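These providers all accept the standard OpenAI chat-completions request shape, which is what TAU sends on your behalf. As a concrete illustration of that format, here is a direct call to one of them; the DeepSeek endpoint and model name are just an example:

```bash
# OpenAI-compatible request format shared by the models.dev providers (DeepSeek shown as an example)
curl -s https://api.deepseek.com/chat/completions \
  -H "Authorization: Bearer $DEEPSEEK_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "deepseek-chat", "messages": [{"role": "user", "content": "Hello"}]}'
```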
## Switching Providers
```bash
# Use a specific provider/model
tau run --model anthropic/claude-sonnet-4-5-20250929 "Hello"
tau run --model openai/gpt-4o "Hello"
tau run --model xai/grok-code-fast-1 "Hello"

# Set the default model
export TAU_MODEL=anthropic/claude-sonnet-4-5-20250929

# List configured providers
tau models list

# List models from a specific provider
tau models list --provider anthropic
```

## Local Models (Ollama)
```bash
# Start Ollama
ollama serve

# Pull a model
ollama pull llama3.2:3b

# Configure TAU to use Ollama
export OLLAMA_HOST=http://localhost:11434
export TAU_ENABLE_OLLAMA=1

# Run with Ollama
tau run --model ollama/llama3.2:3b "Hello"
```

## Enterprise Providers
### AWS Bedrock
```bash
export AWS_ACCESS_KEY_ID=AKIA...
export AWS_SECRET_ACCESS_KEY=...
export AWS_REGION=us-east-1

tau run --model amazon-bedrock/anthropic.claude-sonnet-4-5-20250929-v1:0 "Hello"
```

### Azure OpenAI
```bash
export AZURE_OPENAI_API_KEY=xxx
export AZURE_OPENAI_ENDPOINT=https://my-resource.openai.azure.com

tau run --model azure-openai/my-gpt4-deployment "Hello"
```

### Google Vertex AI
```bash
# Authenticate with GCP
gcloud auth application-default login

export GOOGLE_CLOUD_PROJECT=my-project
export VERTEX_LOCATION=us-central1

tau run --model google-vertex/gemini-3-pro "Hello"
```

## Cost Tracking
TAU tracks costs for all providers using models.dev pricing data:
```bash
# See the cost breakdown after a conversation
tau history --costs

# Real-time cost is shown in the TUI status bar
# CLI providers show $0 (usage goes through your subscription)
```
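The reported figures come down to token counts multiplied by per-million-token prices taken from models.dev. As a back-of-the-envelope check with made-up token counts and example prices (not TAU's internal code):

```bash
# Illustrative only: 12,000 input tokens at $3/M plus 2,400 output tokens at $15/M
awk 'BEGIN { printf "cost: $%.4f\n", 12000 / 1e6 * 3 + 2400 / 1e6 * 15 }'
# cost: $0.0720
```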