Switch Your OpenClaw LLM Backend With One Command — Claude, Gemini, Ollama & OpenRouter
Switch Your LLM Backend
With llm-switch
One shell script. Four providers. No config file editing. Switch between Claude, Gemini, Ollama, and OpenRouter from the terminal, then restart the add-on.
Install llm-switch
Download the raw script from the Gist above and save it to your Windows/Mac Samba share as llm-switch.txt. The HAOS Samba add-on mounts your share at /share/ inside the OpenClaw container.
Windows path example: \\192.168.1.42\share\llm-switch.txt
root@openclaw:/# cp /share/llm-switch.txt /usr/local/bin/llm-switch \
  && sed -i 's/\r//' /usr/local/bin/llm-switch \
  && chmod +x /usr/local/bin/llm-switch
/usr/local/bin resets on every add-on restart. Keep the script on your share as the source of truth and run this one-liner to reinstall after each restart.
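Since the copy, CRLF-strip, and chmod steps repeat after every restart, you can wrap them in a small function kept in your shell profile. The function name and default paths below are illustrative, not part of the script itself:

```shell
# Reinstall llm-switch from the Samba share after an add-on restart.
# Defaults are assumptions -- adjust the source path to your own share.
llm_switch_install() {
    src="${1:-/share/llm-switch.txt}"
    dest="${2:-/usr/local/bin/llm-switch}"
    cp "$src" "$dest" || return 1
    sed -i 's/\r//' "$dest"   # strip Windows CRLF line endings
    chmod +x "$dest"
}
```

After each add-on restart, a single `llm_switch_install` restores the command.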
llm-switch — OpenClaw Provider Switcher
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Current model: anthropic/claude-sonnet-4-6
API Keys (providers.env):
✔ Anthropic
✔ Gemini
✔ OpenRouter
✔ Ollama (no key needed — local)
Active gateway_env_vars:
✔ ANTHROPIC_API_KEY
root@openclaw:/# curl -o /usr/local/bin/llm-switch \
  https://gist.githubusercontent.com/imnoor/\
f38b8e32e07a5cb0c01b2b53352e201f/raw/\
4b03300a056f58426fb27952f19e3b4c6a20fa94/llm-switch \
  && chmod +x /usr/local/bin/llm-switch
The OpenClaw container's network egress may be restricted. If curl fails, use the Samba share method (Option A) instead.
root@openclaw:/# llm-switch keys
Configure API keys (press Enter to skip / keep existing)

Anthropic API Key (console.anthropic.com)
  Current: not set
  New key: ••••••••   ← typed but hidden
  Saved.

Google Gemini API Key (aistudio.google.com)
  Current: not set
  New key: ••••••••
  Saved.

✔ Keys saved to /config/.openclaw/providers.env
  Note: File is chmod 600 — readable only by root.
Keys are stored in /config/.openclaw/providers.env — this directory persists across add-on restarts so you only need to set keys once.
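To confirm the key store survived a restart and kept its restrictive permissions, a quick sanity check can help. This is a sketch; the path comes from the article, and `stat -c` assumes GNU coreutils (as in the HAOS container):

```shell
# Verify the providers.env key store exists and is root-only (mode 600).
check_keyfile() {
    f="${1:-/config/.openclaw/providers.env}"
    [ -f "$f" ] || { echo "missing: $f"; return 1; }
    mode=$(stat -c '%a' "$f")
    [ "$mode" = "600" ] || { echo "loose permissions: $mode"; return 1; }
    echo "ok: $f is mode 600"
}
```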
Supported Providers
Every model string is prefixed with its provider: anthropic/, google/, ollama/, or openrouter/.

| Provider | Default Model | API Key | Best For | Cost |
|---|---|---|---|---|
| anthropic | claude-sonnet-4-6 | Required | Best all-rounder, tool use, reasoning | ~$3–8/mo |
| gemini | gemini-3-flash-preview | Required | Fast, large context, free tier | Free / paid |
| ollama | llama3.2 | None needed | Privacy, offline, zero API cost | $0 |
| openrouter | llama-3.3-70b-instruct | Required | Model experimentation, 100+ options | Pay per use |
When you run llm-switch anthropic, the script makes three changes:
# 1. Sets the model in openclaw.json
openclaw config set agents.defaults.model.primary "anthropic/claude-sonnet-4-6"

# 2. Writes API key to auth-profiles.json (where OpenClaw reads it)
auth-profiles.json ← { "anthropic:default": { "type": "api_key", "key": "sk-ant-..." } }

# 3. Injects API key into gateway_env_vars in /data/options.json
options.json ← { "gateway_env_vars": ["ANTHROPIC_API_KEY=sk-ant-..."] }
Ollama must be running on your local network. Set the endpoint with: OLLAMA_ENDPOINT=http://192.168.1.x:11434 llm-switch ollama llama3.2. The default endpoint is http://192.168.1.42:11434 — update the script's OLLAMA_DEFAULT variable to match your setup.
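Before switching, it is worth confirming the Ollama server actually answers; `/api/tags` is Ollama's model-list endpoint, so a 200 there means the server is up. A minimal reachability check (the default address is just an example, as in the article):

```shell
# Return 0 if an Ollama server answers at the given endpoint.
ollama_check() {
    endpoint="${1:-http://192.168.1.42:11434}"
    if curl -fsS --max-time 3 "$endpoint/api/tags" >/dev/null 2>&1; then
        echo "ollama reachable at $endpoint"
    else
        echo "no ollama at $endpoint -- is the server running?" >&2
        return 1
    fi
}
```

Run `ollama_check http://192.168.1.x:11434` with your own address before pointing llm-switch at it.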
Command Reference
# Anthropic
llm-switch anthropic                            # Claude Sonnet 4.6 (default)
llm-switch anthropic claude-opus-4-6            # Claude Opus
llm-switch anthropic claude-haiku-4-5-20251001  # Claude Haiku

# Gemini
llm-switch gemini                               # Gemini 3 Flash (default)
llm-switch gemini gemini-3-pro-preview          # Gemini 3 Pro

# Ollama (local — no API cost)
llm-switch ollama                               # llama3.2 (default)
llm-switch ollama mistral
llm-switch ollama deepseek-r1
OLLAMA_ENDPOINT=http://192.168.1.x:11434 llm-switch ollama llama3.2

# OpenRouter (100+ models)
llm-switch openrouter                           # Llama 3.3 70B (default)
llm-switch openrouter mistralai/mistral-large-2411
llm-switch openrouter google/gemini-2.0-flash
llm-switch status   # current model, key status, active env vars
llm-switch list     # all providers and available models
llm-switch keys     # set or update API keys (hidden input)
root@openclaw:/# cp /share/llm-switch.txt /usr/local/bin/llm-switch \
&& sed -i 's/\r//' /usr/local/bin/llm-switch \
&& chmod +x /usr/local/bin/llm-switch
# 2. Switch provider
root@openclaw:/# llm-switch gemini
llm-switch — OpenClaw Provider Switcher
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Switching to Google Gemini → gemini-3-flash-preview...
auth-profiles.json: google:default set
gateway_env_vars: GEMINI_API_KEY set
✔ Done! Model set to: google/gemini-3-flash-preview
⚡ Restart required: HA → Apps → OpenClaw Assistant → Restart
# 3. Restart add-on from HA UI, then verify on Telegram:
# "What model are you running on?"
# → I'm currently running on Gemini 3 Flash (Preview).
After restarting, always ask Claw: "What model are you running on?" — it will tell you exactly which model and provider are active. This is the fastest way to confirm the switch worked end to end.
How It Works Internally
Path: /config/.openclaw/openclaw.json
Updated via: openclaw config set agents.defaults.model.primary
Contains: the active model string, e.g. google/gemini-3-flash-preview. OpenClaw reads this on startup to know which model to use.
Path: /config/.openclaw/agents/main/agent/auth-profiles.json
Updated via: Python directly
Contains: the API key for the active provider in the format {"type":"api_key","provider":"google","key":"AIza..."}. This is where OpenClaw actually reads the key at runtime — not from environment variables.
Path: /data/options.json
Updated via: Python directly
Contains: gateway_env_vars array with the active API key as an env var. This ensures the key is available as a process environment variable for any part of OpenClaw that reads process.env.ANTHROPIC_API_KEY etc.
OpenClaw reads credentials from multiple places depending on the code path. During development we discovered that auth-profiles.json is the primary source for the agent, while gateway_env_vars covers the gateway process. Both need to be set for a clean switch.
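The gateway_env_vars update can be sketched as an inline python3 edit like the one below. It replaces any existing entry for the same key name, which is roughly what the script must do to avoid duplicates; the function name, file path, and key values in the usage are placeholders, not taken from the script:

```shell
# Set (or replace) an API-key entry in the gateway_env_vars array of an
# options.json-style file. Sketch only -- run it against a copy first.
inject_gateway_key() {
    python3 - "$1" "$2" "$3" <<'PY'
import json, sys
path, name, value = sys.argv[1:4]
with open(path) as f:
    opts = json.load(f)
# Drop any stale entry for this key, then append the fresh one.
env = [e for e in opts.get("gateway_env_vars", []) if not e.startswith(name + "=")]
env.append(f"{name}={value}")
opts["gateway_env_vars"] = env
with open(path, "w") as f:
    json.dump(opts, f, indent=2)
PY
}
```

Usage on a scratch copy: `cp /data/options.json /tmp/options.json && inject_gateway_key /tmp/options.json GEMINI_API_KEY AIza...` — then diff before touching the real file.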
# /config/.openclaw/providers.env (chmod 600)
ANTHROPIC_API_KEY=sk-ant-api03-...
GEMINI_API_KEY=AIzaSy...
OPENROUTER_API_KEY=sk-or-v1-...
Your API keys are stored in /config/.openclaw/providers.env with chmod 600. This file persists across restarts since /config is the only mounted persistent volume in the HAOS add-on container. The script sources this file on every run — keys are never stored in memory between invocations.
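Sourcing a KEY=value file into the environment is a standard shell idiom: `set -a` auto-exports every variable assigned while the file is sourced. A sketch of what "sources this file on every run" likely amounts to (the wrapper name is mine):

```shell
# Load API keys from providers.env into the environment of this process.
load_keys() {
    f="${1:-/config/.openclaw/providers.env}"
    [ -r "$f" ] || { echo "cannot read $f" >&2; return 1; }
    set -a    # auto-export all variables assigned while sourcing
    . "$f"
    set +a
}
```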
# Check current model
openclaw config get agents.defaults.model.primary

# Check auth profiles
cat /config/.openclaw/agents/main/agent/auth-profiles.json \
  | python3 -m json.tool

# Check gateway env vars
cat /data/options.json | python3 -m json.tool \
  | grep -A5 gateway_env_vars
If Claw fails to respond after a switch, check HA → Apps → OpenClaw Assistant → Logs. Error messages like "No API key found for provider X" tell you exactly which file is missing the key.