Switch Your OpenClaw LLM Backend With One Command — Claude, Gemini, Ollama & OpenRouter

🦞 OpenClaw Series · Bonus

Switch Your LLM Backend
With llm-switch

One shell script. Four providers. No config file editing. Switch between Claude, Gemini, Ollama, and OpenRouter straight from the terminal, then restart the add-on.

🥧 Raspberry Pi 5 🦞 OpenClaw 🐚 Bash Script ☁️ GitHub Gist
🟢 Anthropic · 🔵 Gemini · 🟡 Ollama · 🟣 OpenRouter

Install llm-switch

📦
The script lives on GitHub Gist and is installed into the container via your Samba share. Since /usr/local/bin resets on every add-on restart, the install is a one-liner you re-run whenever needed.
📄
llm-switch.sh — GitHub Gist
gist.github.com/imnoor/f38b8e32e07a5cb0c01b2b53352e201f
Option A — Via Samba Share (recommended)
Step 1 — Download and save to your share

Download the raw script from the Gist above. Save it to your Windows/Mac Samba share as llm-switch.txt. The HAOS Samba add-on mounts your share at /share/ inside the OpenClaw container.

Windows path example: \\192.168.1.42\share\llm-switch.txt

Step 2 — Install from SSH terminal
terminal
root@openclaw:/# cp /share/llm-switch.txt /usr/local/bin/llm-switch \
  && sed -i 's/\r//' /usr/local/bin/llm-switch \
  && chmod +x /usr/local/bin/llm-switch
⚠️
Ephemeral container

/usr/local/bin resets on every add-on restart. Keep the script on your share as the source of truth and run this one-liner to reinstall after each restart.
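Because the target directory is ephemeral while /share is not, the reinstall step can be wrapped in a small function. A sketch, demonstrated on temporary files — install_llm_switch is a hypothetical name, not part of the real script; in the container the source is /share/llm-switch.txt and the target /usr/local/bin/llm-switch:

```shell
# Hypothetical reinstall helper (install_llm_switch is my name, not the script's).
install_llm_switch() {
  src="$1"; dst="$2"
  cp "$src" "$dst" &&
    sed -i 's/\r//' "$dst" &&   # strip Windows CRLF endings from the share copy
    chmod +x "$dst"             # make it executable
}

# Demo on temp files instead of the real container paths
src=$(mktemp); dst=$(mktemp)
printf '#!/bin/sh\r\necho ok\r\n' > "$src"   # simulate a file saved from Windows
install_llm_switch "$src" "$dst"
"$dst"   # prints: ok
```

Without the sed step, the carriage returns a Windows editor leaves behind would break the shebang line, which is why the one-liner above includes it.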

Step 3 — Verify it works
llm-switch status
root@openclaw:/# llm-switch status
🦞 llm-switch — OpenClaw Provider Switcher
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

Current model: anthropic/claude-sonnet-4-6

API Keys (providers.env):
✔ Anthropic
✔ Gemini
✔ OpenRouter
✔ Ollama (no key needed — local)

Active gateway_env_vars:
✔ ANTHROPIC_API_KEY
Option B — Via curl (if network is available)
terminal
root@openclaw:/# curl -fLo /usr/local/bin/llm-switch \
  https://gist.githubusercontent.com/imnoor/\
f38b8e32e07a5cb0c01b2b53352e201f/raw/\
4b03300a056f58426fb27952f19e3b4c6a20fa94/llm-switch \
  && chmod +x /usr/local/bin/llm-switch
ℹ️
Note

The OpenClaw container's network egress may be restricted. If curl fails, use the Samba share method (Option A) instead.

Step 4 — Set Your API Keys
terminal
root@openclaw:/# llm-switch keys
Configure API keys (press Enter to skip / keep existing)

Anthropic API Key (console.anthropic.com)
  Current: not set
  New key: ••••••••  ← typed but hidden
  Saved.

Google Gemini API Key (aistudio.google.com)
  Current: not set
  New key: ••••••••
  Saved.

✔ Keys saved to /config/.openclaw/providers.env
Note: File is chmod 600 — readable only by root.

Keys are stored in /config/.openclaw/providers.env — this directory persists across add-on restarts so you only need to set keys once.
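providers.env is plain KEY=value lines, so updating a key amounts to an upsert into that file. A minimal sketch of the pattern, assuming that is roughly what llm-switch keys does internally — set_key is a hypothetical name, not from the script:

```shell
# Hypothetical upsert of a KEY=value line into an env file (set_key is my name).
set_key() {
  file="$1"; key="$2"; value="$3"
  touch "$file"
  grep -v "^${key}=" "$file" > "$file.tmp" || true  # drop any existing entry
  echo "${key}=${value}" >> "$file.tmp"
  chmod 600 "$file.tmp"                             # keys readable by root only
  mv "$file.tmp" "$file"
}

f=$(mktemp)
set_key "$f" GEMINI_API_KEY "AIzaSy-first"
set_key "$f" GEMINI_API_KEY "AIzaSy-rotated"   # replaces, does not append
cat "$f"   # -> GEMINI_API_KEY=AIzaSy-rotated
```

The chmod happens on the temp file before the rename, so the file is never world-readable, matching the chmod 600 behavior described above.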

Supported Providers

🔌
Four providers, each with different cost profiles and strengths. OpenClaw determines the provider from the model string prefix — anthropic/, google/, ollama/, or openrouter/.
🟢 Anthropic · claude-sonnet-4-6 · ~$3–8/mo
🔵 Google Gemini · gemini-3-flash-preview · Free tier
🟡 Ollama · llama3.2 · mistral · qwen2.5 · $0 — local
🟣 OpenRouter · 100+ models · Pay per use
Provider Comparison
| Provider   | Default Model           | API Key     | Best For                              | Cost        |
|------------|-------------------------|-------------|---------------------------------------|-------------|
| anthropic  | claude-sonnet-4-6       | Required    | Best all-rounder, tool use, reasoning | ~$3–8/mo    |
| gemini     | gemini-3-flash-preview  | Required    | Fast, large context, free tier        | Free / paid |
| ollama     | llama3.2                | None needed | Privacy, offline, zero API cost       | $0          |
| openrouter | llama-3.3-70b-instruct  | Required    | Model experimentation, 100+ options   | Pay per use |
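The prefix-based provider detection described above needs nothing more than shell parameter expansion. A sketch (variable names are illustrative, not taken from the script):

```shell
# Provider routing sketch: OpenClaw-style model strings split at the first "/".
model="openrouter/mistralai/mistral-large-2411"
provider="${model%%/*}"   # everything before the first slash
name="${model#*/}"        # everything after the first slash
echo "$provider"          # prints: openrouter
echo "$name"              # prints: mistralai/mistral-large-2411
```

Note that `${model#*/}` removes only the shortest leading match, so OpenRouter model names that themselves contain a slash survive intact.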
How It Works Under the Hood

When you run llm-switch anthropic, the script makes three changes:

What the script does on each switch
# 1. Sets the model in openclaw.json
openclaw config set agents.defaults.model.primary "anthropic/claude-sonnet-4-6"

# 2. Writes API key to auth-profiles.json (where OpenClaw reads it)
auth-profiles.json ← { "anthropic:default": { "type": "api_key", "key": "sk-ant-..." } }

# 3. Injects API key into gateway_env_vars in /data/options.json
options.json ← { "gateway_env_vars": ["ANTHROPIC_API_KEY=sk-ant-..."] }
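Edits 2 and 3 can be reproduced by hand with python3, which is handy when debugging a broken switch. A sketch against scratch files — in the container the real paths are /config/.openclaw/agents/main/agent/auth-profiles.json and /data/options.json, and the key value is a placeholder:

```shell
# Sketch: apply edits 2 and 3 manually, here against scratch copies in a temp dir.
dir=$(mktemp -d)
echo '{}' > "$dir/auth-profiles.json"
echo '{"gateway_env_vars": []}' > "$dir/options.json"

python3 - "$dir" <<'EOF'
import json, os, sys
d = sys.argv[1]

# 2. Register the key under "<provider>:default" in auth-profiles.json
p = os.path.join(d, "auth-profiles.json")
profiles = json.load(open(p))
profiles["anthropic:default"] = {"type": "api_key", "key": "sk-ant-..."}
json.dump(profiles, open(p, "w"), indent=2)

# 3. Expose the same key via gateway_env_vars in options.json
p = os.path.join(d, "options.json")
opts = json.load(open(p))
opts["gateway_env_vars"] = ["ANTHROPIC_API_KEY=sk-ant-..."]
json.dump(opts, open(p, "w"), indent=2)
EOF

cat "$dir/options.json"   # shows the injected gateway_env_vars entry
```

Going through python3 rather than sed keeps the JSON well-formed even when the files already contain other keys.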
💡
Ollama setup

Ollama must be running on your local network. Set the endpoint with: OLLAMA_ENDPOINT=http://192.168.1.x:11434 llm-switch ollama llama3.2. The default endpoint is http://192.168.1.42:11434 — update the script's OLLAMA_DEFAULT variable to match your setup.

Command Reference

๐Ÿš
Every command, model option, and flag. Always restart the add-on after switching — the new model takes effect on the next session start.
Switch Commands
all switch commands
# Anthropic
llm-switch anthropic                           # Claude Sonnet 4.6 (default)
llm-switch anthropic claude-opus-4-6           # Claude Opus
llm-switch anthropic claude-haiku-4-5-20251001 # Claude Haiku

# Gemini
llm-switch gemini                              # Gemini 3 Flash (default)
llm-switch gemini gemini-3-pro-preview         # Gemini 3 Pro

# Ollama (local — no API cost)
llm-switch ollama                              # llama3.2 (default)
llm-switch ollama mistral
llm-switch ollama deepseek-r1
OLLAMA_ENDPOINT=http://192.168.1.x:11434 llm-switch ollama llama3.2

# OpenRouter (100+ models)
llm-switch openrouter                          # Llama 3.3 70B (default)
llm-switch openrouter mistralai/mistral-large-2411
llm-switch openrouter google/gemini-2.0-flash
Utility Commands
utility
llm-switch status   # current model, key status, active env vars
llm-switch list     # all providers and available models
llm-switch keys     # set or update API keys (hidden input)
Full Switch Workflow
switching anthropic → gemini
# 1. Reinstall after restart
root@openclaw:/# cp /share/llm-switch.txt /usr/local/bin/llm-switch \
&& sed -i 's/\r//' /usr/local/bin/llm-switch \
&& chmod +x /usr/local/bin/llm-switch

# 2. Switch provider
root@openclaw:/# llm-switch gemini

🦞 llm-switch — OpenClaw Provider Switcher
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

Switching to Google Gemini (gemini-3-flash-preview)...
auth-profiles.json: google:default set
gateway_env_vars: GEMINI_API_KEY set

✔ Done! Model set to: google/gemini-3-flash-preview
⚡ Restart required: HA → Apps → OpenClaw Assistant → Restart

# 3. Restart add-on from HA UI, then verify on Telegram:
# "What model are you running on?"
# → I'm currently running on Gemini 3 Flash (Preview). 🦞
💡
Confirm the switch on Telegram

After restarting, always ask Claw: "What model are you running on?" — it will tell you exactly which model and provider are active. This is the fastest way to confirm the switch worked end to end.

How It Works Internally

🔧
The script touches three separate files to make a provider switch stick. Understanding this helps if something breaks and you need to debug manually.
The Three Files
File 01 — openclaw.json

Path: /config/.openclaw/openclaw.json
Updated via: openclaw config set agents.defaults.model.primary
Contains: the active model string, e.g. google/gemini-3-flash-preview. OpenClaw reads this on startup to know which model to use.

File 02 — auth-profiles.json

Path: /config/.openclaw/agents/main/agent/auth-profiles.json
Updated via: Python directly
Contains: the API key for the active provider in the format {"type":"api_key","provider":"google","key":"AIza..."}. This is where OpenClaw actually reads the key at runtime — not from environment variables.

File 03 — options.json

Path: /data/options.json
Updated via: Python directly
Contains: gateway_env_vars array with the active API key as an env var. This ensures the key is available as a process environment variable for any part of OpenClaw that reads process.env.ANTHROPIC_API_KEY etc.

⚠️
Why three files?

OpenClaw reads credentials from multiple places depending on the code path. During development we discovered that auth-profiles.json is the primary source for the agent, while gateway_env_vars covers the gateway process. Both need to be set for a clean switch.

Keys Storage
providers.env
# /config/.openclaw/providers.env (chmod 600)
ANTHROPIC_API_KEY=sk-ant-api03-...
GEMINI_API_KEY=AIzaSy...
OPENROUTER_API_KEY=sk-or-v1-...

Your API keys are stored in /config/.openclaw/providers.env with chmod 600. This file persists across restarts since /config is the only mounted persistent volume in the HAOS add-on container. The script sources this file on every run — keys are never stored in memory between invocations.

Debugging a Failed Switch
manual verification
# Check current model
openclaw config get agents.defaults.model.primary

# Check auth profiles
python3 -m json.tool \
  /config/.openclaw/agents/main/agent/auth-profiles.json

# Check gateway env vars
python3 -m json.tool /data/options.json \
  | grep -A5 gateway_env_vars
💡
Check OpenClaw add-on logs

If Claw fails to respond after a switch, check HA → Apps → OpenClaw Assistant → Logs. Error messages like "No API key found for provider X" tell you exactly which file is missing the key.
