Installing OpenClaw – Part 3: Configuring Models (OpenAI, Anthropic, OpenRouter)
OpenClaw supports many AI models – how do you sensibly set up OpenAI, Anthropic, and OpenRouter, manage API keys, and define fallbacks? Here's the practical guide.
📚 Series: Installing & Setting Up OpenClaw — Part 3 of 8
← Part 2: Installation | Part 4: Connecting Telegram & WhatsApp →
In the second part, we installed OpenClaw and started the gateway. Now it’s time to connect AI models to OpenClaw – because without models, your agent can’t generate responses.
OpenClaw supports many providers: OpenAI, Anthropic, OpenRouter, Google Gemini, DeepSeek, and even local LLMs via Ollama. In this part, you’ll learn how to securely manage API keys, configure providers, and set up fallbacks sensibly.
Prerequisites
✅ Running OpenClaw Gateway (see Part 2)
✅ API keys for desired providers (e.g., OpenAI, Anthropic, OpenRouter)
✅ Basic understanding of API keys and environment variables
Understanding the Configuration File
OpenClaw uses a configuration file (typically ~/.openclaw/config.yaml or ~/.openclaw/openclaw.json). There you define:
- Providers (which services do you use?)
- Models per provider (e.g., gpt-5.4, claude-3.7-sonnet, deepseek-v3.2)
- Fallbacks (if a provider is unavailable, OpenClaw switches to a backup provider)
Example: Minimal Config
```yaml
providers:
  openai:
    enabled: true
    apiKey: "{{secrets.openai.apiKey}}"
    models:
      - gpt-5.4
      - gpt-5-mini
  anthropic:
    enabled: true
    apiKey: "{{secrets.anthropic.apiKey}}"
    models:
      - claude-3.7-sonnet
  openrouter:
    enabled: true
    apiKey: "{{secrets.openrouter.apiKey}}"
    models:
      - deepseek-v3.2
      - gemini-3.1-flash
```
💡 Tip: Use {{secrets.<name>}} instead of plain-text keys – the secret value stays outside the version-controlled config.
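If you use the JSON variant of the config (~/.openclaw/openclaw.json), you can sanity-check its syntax before restarting the gateway. This sketch uses Python's standard library and assumes python3 is installed; the path follows the one named above:

```shell
# Validate the JSON config; json.tool exits non-zero on a syntax error.
python3 -m json.tool ~/.openclaw/openclaw.json > /dev/null \
  && echo "config OK" \
  || echo "config invalid: check the JSON syntax"
```

For the YAML variant, any YAML linter works the same way; the point is to catch indentation mistakes before the gateway rejects the file.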
Storing API Keys Securely
Option A: Environment Variables (Simple)
```shell
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-..."
export OPENROUTER_API_KEY="sk-..."
```
Then in the config:
```yaml
apiKey: "{{env.OPENAI_API_KEY}}"
```
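Before starting the gateway, it's worth verifying that the variables are actually set in the current shell. A small helper sketch (not part of OpenClaw – the variable names follow the exports above):

```shell
# check_keys: print the status of each expected API-key variable.
check_keys() {
  for var in OPENAI_API_KEY ANTHROPIC_API_KEY OPENROUTER_API_KEY; do
    eval "val=\${$var:-}"
    if [ -z "$val" ]; then
      echo "missing: $var"
    else
      echo "ok: $var"
    fi
  done
}

check_keys
```

Remember that plain `export` only lasts for the current session; add the exports to your shell profile (e.g., ~/.bashrc) if you want them to persist.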
Option B: Secrets File (Recommended)
Create ~/.openclaw/secrets.json (or similar) and load it via SecretRef in the config:
```yaml
apiKey: "{{secrets.openai.apiKey}}"
```
Important: Do not commit this file to Git repos!
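The exact schema of the secrets file depends on your OpenClaw version; a plausible layout matching the {{secrets.openai.apiKey}} references above would be:

```json
{
  "openai": { "apiKey": "sk-..." },
  "anthropic": { "apiKey": "sk-..." },
  "openrouter": { "apiKey": "sk-..." }
}
```

Also restrict the file permissions so only your user can read it, e.g. with `chmod 600 ~/.openclaw/secrets.json`.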
Providers in Detail
OpenAI
- Models: GPT-5.4, GPT-5-Mini, GPT-5-Nano
- Advantage: High quality, many features (functions, vision)
- Disadvantage: More expensive than many alternatives
Anthropic
- Models: Claude 3.7 Sonnet, Claude 4.6
- Advantage: Very good language quality, good context handling
- Disadvantage: Fewer features than OpenAI, somewhat more expensive
OpenRouter
- Models: DeepSeek V3.2, Gemini 3.1 Flash, GLM-Flash, Qwen, etc.
- Advantage: Affordable, many models on one platform
- Disadvantage: Quality varies depending on the model
Defining Fallbacks
You can specify which model is used by default and what to switch to if a provider is unavailable:
```yaml
defaults:
  model: "openai:gpt-5.4"
  fallback:
    - "anthropic:claude-3.7-sonnet"
    - "openrouter:deepseek-v3.2"
```
This ensures your agent can always respond – even if a provider is temporarily unreachable.
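OpenClaw resolves the fallback chain internally; conceptually it's a first-available lookup over the provider:model references. A standalone sketch of that selection logic (hypothetical helper, not OpenClaw's actual code – availability is simulated via the AVAILABLE_PROVIDERS variable):

```shell
# pick_model REF... : print the first "provider:model" reference whose
# provider appears in the space-separated list $AVAILABLE_PROVIDERS.
pick_model() {
  for ref in "$@"; do
    provider=${ref%%:*}
    case " $AVAILABLE_PROVIDERS " in
      *" $provider "*) echo "$ref"; return 0 ;;
    esac
  done
  return 1  # no provider available
}

AVAILABLE_PROVIDERS="anthropic openrouter"
pick_model "openai:gpt-5.4" "anthropic:claude-3.7-sonnet" "openrouter:deepseek-v3.2"
```

Here the default (openai:gpt-5.4) is skipped because its provider is down, and the first fallback is chosen instead.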
Testing
After updating the config, restart the gateway:

```shell
openclaw gateway restart
```

Then test with:

```shell
openclaw status
```

You should receive a response – if not, check the logs (`openclaw gateway logs`) or the configuration.
Troubleshooting
- “API key invalid”: Check that the key is copied correctly and hasn't expired or been revoked.
- “Model not found”: Check that the model name is available with that provider and spelled exactly as in its documentation.
- “Rate limit exceeded”: Wait a moment or switch to another model/provider.
Sources & Further Links
This article is part of the series “Installing & Setting Up OpenClaw”. The next part will be published soon. If you have questions or suggestions, feel free to write to me on Mastodon or via email.