# OpenClaw Skills Expansion Plan

## Context

The HomeAI project has 4 custom OpenClaw skills (home-assistant, voice-assistant, image-generation, vtube-studio) and a working voice pipeline. The user wants to build 8 new skills plus a Public/Private mode system to dramatically expand what the assistant can do via voice and chat.
## Skill Format Convention

Every skill follows the established pattern from ha-ctl:

- Lives in `~/.openclaw/skills/<name>/`
- `SKILL.md` with YAML frontmatter (name, description) + agent instructions
- Optional Python CLI (stdlib only: `urllib.request`, `json`, `os`, `sys`, `re`, `datetime`)
- CLI symlinked to `/opt/homebrew/bin/` for PATH access
- Agent invokes via `exec` tool
- Entry added to `~/.openclaw/workspace/TOOLS.md` for reinforcement
- New env vars added to `homeai-agent/launchd/com.homeai.openclaw.plist`
## Phase A — Core Skills (no new services needed)

### 1. Memory Recall (`memory-ctl`)
Purpose: Let the agent actively store, search, and recall memories mid-conversation.
Files:

- `~/.openclaw/skills/memory/SKILL.md`
- `~/.openclaw/skills/memory/memory-ctl` → symlink `/opt/homebrew/bin/memory-ctl`
Commands:

```
memory-ctl add <personal|general> "<content>" [--category preference|fact|routine] [--character-id ID]
memory-ctl search "<query>" [--type personal|general] [--character-id ID]
memory-ctl list [--type personal|general] [--character-id ID] [--limit 10]
memory-ctl delete <memory_id> [--type personal|general] [--character-id ID]
```
Details:

- Reads/writes existing files: `~/homeai-data/memories/personal/{char_id}.json` and `general.json`
- Matches existing schema: `{"memories": [{"id": "m_<timestamp>", "content": "...", "category": "...", "createdAt": "..."}]}`
- Search: keyword token matching (split query, score by substring hits in content)
- `--character-id` defaults to the `HOMEAI_CHARACTER_ID` env var or the satellite-map default
- Dashboard memory UI will immediately reflect agent-created memories (same files)
- Env vars: `HOMEAI_CHARACTER_ID` (optional, set by bridge)
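The keyword search above can be sketched in a few lines of stdlib Python. This is illustrative only — `score_memory` and `search_memories` are hypothetical helper names, not the final memory-ctl internals:

```python
import re

def score_memory(query: str, content: str) -> int:
    """Count query tokens that appear as substrings of the memory content."""
    tokens = re.findall(r"\w+", query.lower())
    text = content.lower()
    return sum(1 for t in tokens if t in text)

def search_memories(query: str, memories: list[dict], limit: int = 10) -> list[dict]:
    """Return the highest-scoring memories, dropping zero-score entries."""
    scored = [(score_memory(query, m.get("content", "")), m) for m in memories]
    scored = [(s, m) for s, m in scored if s > 0]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [m for _, m in scored[:limit]]
```

Because scoring is substring-based, "favorite color" matches "Favorite color is blue" with score 2 while unrelated memories score 0 and are filtered out.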
### 2. Service Monitor (`monitor-ctl`)
Purpose: "Is everything running?" → spoken health report.
Files:

- `~/.openclaw/skills/service-monitor/SKILL.md`
- `~/.openclaw/skills/service-monitor/monitor-ctl` → symlink `/opt/homebrew/bin/monitor-ctl`
Commands:

```
monitor-ctl status           # All services summary
monitor-ctl check <service>  # Single service (ollama, bridge, ha, tts, stt, dashboard, n8n, gitea, kuma)
monitor-ctl ollama           # Ollama-specific: loaded models, VRAM
monitor-ctl docker           # Docker container status (runs: docker ps --format json)
```
Details:

- Checks hardcoded service endpoints with a 3s timeout:
  - Ollama (`localhost:11434/api/ps`), Bridge (`localhost:8081/status`), Gateway (`localhost:8080/status`)
  - Wyoming STT (TCP `localhost:10300`), TTS (TCP `localhost:10301`)
  - Dashboard (`localhost:5173`), n8n (`localhost:5678`), Kuma (`localhost:3001`)
  - HA (`10.0.0.199:8123/api/`), Gitea (`10.0.0.199:3000`)
- `ollama` subcommand parses `/api/ps` for model names, sizes, expiry
- `docker` runs `docker ps --format '{{json .}}'` via subprocess
- Pure stdlib (`urllib.request` + `socket.create_connection` for TCP)
- Env vars: uses existing `HASS_TOKEN`, `HA_URL`
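A minimal sketch of the two probe types, using only the stdlib modules the plan allows. The endpoint dicts below are a subset of the hardcoded table above; the function names are illustrative:

```python
import socket
import urllib.error
import urllib.request

# Subset of the hardcoded endpoint table from the plan above.
HTTP_ENDPOINTS = {
    "ollama": "http://localhost:11434/api/ps",
    "bridge": "http://localhost:8081/status",
    "dashboard": "http://localhost:5173",
}
TCP_ENDPOINTS = {"stt": ("localhost", 10300), "tts": ("localhost", 10301)}

def check_http(url: str, timeout: float = 3.0) -> bool:
    """True if the endpoint answers at all; a 4xx/5xx reply still means 'up'."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        return True  # server responded, just not with 2xx
    except OSError:
        return False  # connection refused, timeout, DNS failure

def check_tcp(host: str, port: int, timeout: float = 3.0) -> bool:
    """True if a plain TCP connect succeeds (Wyoming STT/TTS speak no HTTP)."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

`HTTPError` must be caught before `OSError` since it is a subclass; treating it as "up" matters for endpoints like HA's `/api/` that return 401 without a token.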
### 3. Character Switcher (`character-ctl`)
Purpose: "Talk to Aria" → swap persona, TTS voice, system prompt.
Files:

- `~/.openclaw/skills/character/SKILL.md`
- `~/.openclaw/skills/character/character-ctl` → symlink `/opt/homebrew/bin/character-ctl`
Commands:

```
character-ctl list                            # All characters (name, id, tts engine)
character-ctl active                          # Current default character
character-ctl switch "<name_or_id>"           # Set as default
character-ctl info "<name_or_id>"             # Profile summary
character-ctl map <satellite_id> <character_id>  # Map satellite → character
```
Details:

- Reads character JSONs from `~/homeai-data/characters/`
- `switch` updates the `satellite-map.json` default + writes `active-tts-voice.json`
- Fuzzy name resolution: case-insensitive match on `display_name` → `name` → `id` → partial match
- Switch takes effect on the next bridge request (`SKILL.md` tells the agent to inform the user)
- Env vars: none new
## Phase B — Home Assistant Extensions

### 4. Routine/Scene Builder (`routine-ctl`)
Purpose: Create and trigger multi-device scenes and routines from voice.
Files:

- `~/.openclaw/skills/routine/SKILL.md`
- `~/.openclaw/skills/routine/routine-ctl` → symlink `/opt/homebrew/bin/routine-ctl`
Commands:

```
routine-ctl list-scenes                  # HA scenes
routine-ctl list-scripts                 # HA scripts
routine-ctl trigger "<scene_or_script>"  # Activate
routine-ctl create-scene "<name>" --entities '[{"entity_id":"light.x","state":"on","brightness":128}]'
routine-ctl list-routines                # Local multi-step routines
routine-ctl create-routine "<name>" --steps '[{"type":"scene","target":"movie_mode"},{"type":"delay","seconds":5},{"type":"ha","cmd":"off \"TV Backlight\""}]'
routine-ctl run "<routine_name>"         # Execute steps sequentially
```
Details:

- HA scenes via REST API: `POST /api/services/scene/turn_on`, `POST /api/services/scene/create`
- Local routines stored in `~/homeai-data/routines/*.json`
- Step types: `scene` (trigger HA scene), `ha` (subprocess call to ha-ctl), `delay` (sleep), `tts` (curl to bridge `/api/tts`)
- `run` executes steps sequentially and reports progress
- New data path: `~/homeai-data/routines/`
- Env vars: uses existing `HASS_TOKEN`, `HA_URL`
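The sequential step executor could look roughly like this. Step shapes follow the `create-routine` example above; the `scene`/`tts` handlers are stubbed to log only (they are just HTTP POSTs to HA and the bridge), and `dry_run` is a hypothetical flag added here for testability:

```python
import shlex
import subprocess
import time

def run_routine(steps: list[dict], dry_run: bool = False) -> list[str]:
    """Walk routine steps in order and return a progress log."""
    log = []
    for i, step in enumerate(steps, 1):
        kind = step["type"]
        if kind == "delay":
            if not dry_run:
                time.sleep(step["seconds"])
            log.append(f"step {i}: waited {step['seconds']}s")
        elif kind == "ha":
            # shlex keeps quoted args like 'off "TV Backlight"' as one token
            if not dry_run:
                subprocess.run(["ha-ctl", *shlex.split(step["cmd"])], check=True)
            log.append(f"step {i}: ha-ctl {step['cmd']}")
        elif kind in ("scene", "tts"):
            log.append(f"step {i}: {kind} -> {step.get('target') or step.get('text', '')}")
        else:
            log.append(f"step {i}: unknown step type {kind!r}, skipped")
    return log
```

Unknown step types are logged and skipped rather than raising, so a half-edited routine file degrades gracefully instead of aborting mid-run.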
### 5. Music Control (`music-ctl`)
Purpose: Play/control music with multi-room and Spotify support.
Files:

- `~/.openclaw/skills/music/SKILL.md`
- `~/.openclaw/skills/music/music-ctl` → symlink `/opt/homebrew/bin/music-ctl`
Commands:

```
music-ctl players                      # List media_player entities
music-ctl play ["query"] [--player ID] # Play/resume (search + play if query given)
music-ctl pause [--player ID]          # Pause
music-ctl next / prev [--player ID]    # Skip tracks
music-ctl volume <0-100> [--player ID] # Set volume
music-ctl now-playing [--player ID]    # Current track info
music-ctl queue [--player ID]          # Queue contents
music-ctl shuffle <on|off> [--player ID] # Toggle shuffle
music-ctl search "<query>"             # Search library
```
Details:

- All commands go through HA `media_player` services (same API pattern as ha-ctl)
- `play` with a query uses `media_player/play_media` with `media_content_type: music`
- Spotify appears as a `media_player` entity via the HA Spotify integration — no separate API needed
- `players` lists all `media_player` entities (Music Assistant zones, Spotify Connect, Chromecast, etc.)
- `--player` defaults to the first active player or a configurable default
- Multi-room: Snapcast zones appear as separate `media_player` entities
- `now-playing` reads state attributes: `media_title`, `media_artist`, `media_album`, `media_position`
- Env vars: uses existing `HASS_TOKEN`, `HA_URL`
- Prerequisite: Music Assistant Docker container configured + HA integration, OR the Spotify HA integration
## Phase C — External Service Skills

### 6. n8n Workflow Trigger (`workflow-ctl`)
Purpose: List and trigger n8n workflows by voice.
Files:

- `~/.openclaw/skills/workflow/SKILL.md`
- `~/.openclaw/skills/workflow/workflow-ctl` → symlink `/opt/homebrew/bin/workflow-ctl`
Commands:

```
workflow-ctl list                                           # All workflows (name, active, id)
workflow-ctl trigger "<name_or_id>" [--data '{"key":"val"}'] # Fire webhook
workflow-ctl status <execution_id>                          # Execution status
workflow-ctl history [--limit 10]                           # Recent executions
```
Details:

- n8n REST API at `http://localhost:5678/api/v1/`
- Auth via API key header: `X-N8N-API-KEY`
- `trigger` prefers a webhook trigger (`POST /webhook/<path>`), falls back to `POST /api/v1/workflows/<id>/execute`
- Fuzzy name matching on workflow names
- Env vars (new): `N8N_URL` (default `http://localhost:5678`), `N8N_API_KEY` (generate in n8n Settings → API)
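The webhook-first, execute-fallback logic can be expressed as a pure function that returns candidate URLs in preference order. The workflow-JSON node shape sketched here (a `nodes` list with `type` and `parameters.path`) is an assumption about n8n's export format, not something the plan specifies:

```python
def trigger_urls(workflow: dict, base: str = "http://localhost:5678") -> list[str]:
    """Candidate URLs to POST, webhook paths first, execute endpoint last."""
    urls = []
    for node in workflow.get("nodes", []):
        if node.get("type", "").endswith("webhook"):  # e.g. "n8n-nodes-base.webhook"
            path = node.get("parameters", {}).get("path", "")
            if path:
                urls.append(f"{base}/webhook/{path}")
    # Fallback for workflows without a webhook trigger node
    urls.append(f"{base}/api/v1/workflows/{workflow['id']}/execute")
    return urls
```

`trigger` would then POST to each URL in order (with the `X-N8N-API-KEY` header on the API fallback) and stop at the first success.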
### 7. Gitea Integration (`gitea-ctl`)
Purpose: Query self-hosted repos, commits, issues, PRs.
Files:

- `~/.openclaw/skills/gitea/SKILL.md`
- `~/.openclaw/skills/gitea/gitea-ctl` → symlink `/opt/homebrew/bin/gitea-ctl`
Commands:

```
gitea-ctl repos [--limit 20]                 # List repos
gitea-ctl commits <owner/repo> [--limit 10]  # Recent commits
gitea-ctl issues <owner/repo> [--state open] # List issues
gitea-ctl prs <owner/repo> [--state open]    # List PRs
gitea-ctl create-issue <owner/repo> "<title>" [--body TEXT]
```
Details:

- Gitea REST API v1 at `http://10.0.0.199:3000/api/v1/`
- Auth: `Authorization: token <GITEA_TOKEN>`
- Pure stdlib `urllib.request`
- Env vars (new): `GITEA_URL` (default `http://10.0.0.199:3000`), `GITEA_TOKEN` (generate in Gitea → Settings → Applications)
### 8. Calendar/Reminders (`calendar-ctl`)
Purpose: Read calendar, create events, set voice reminders.
Files:

- `~/.openclaw/skills/calendar/SKILL.md`
- `~/.openclaw/skills/calendar/calendar-ctl` → symlink `/opt/homebrew/bin/calendar-ctl`
Commands:

```
calendar-ctl today [--calendar ID]      # Today's events
calendar-ctl upcoming [--days 7]        # Next N days
calendar-ctl add "<summary>" --start <ISO> --end <ISO> [--calendar ID]
calendar-ctl remind "<message>" --at "<time>"  # Set reminder (e.g. "in 30 minutes", "at 5pm", "tomorrow 9am")
calendar-ctl reminders                  # List pending reminders
calendar-ctl cancel-reminder <id>       # Cancel reminder
```
Details:

- Calendar read: `GET /api/calendars/<entity_id>?start=<ISO>&end=<ISO>` via HA API
- Calendar write: `POST /api/services/calendar/create_event`
- Reminders stored locally in `~/homeai-data/reminders.json`
- Relative time parsing with `datetime` + `re` (stdlib): "in 30 minutes", "at 5pm", "tomorrow 9am"
- Reminder daemon (`com.homeai.reminder-daemon`): Python script checking `reminders.json` every 60s, fires TTS via `POST http://localhost:8081/api/tts` when due
- New data path: `~/homeai-data/reminders.json`
- New daemon: `homeai-agent/reminder-daemon.py` + `homeai-agent/launchd/com.homeai.reminder-daemon.plist`
- Env vars: uses existing `HASS_TOKEN`, `HA_URL`
## Phase D — Public/Private Mode System

### 9. Mode Controller (`mode-ctl`)
Purpose: Route requests to cloud LLMs (speed/power) or local LLMs (privacy) with per-task rules and manual toggle.
Files:

- `~/.openclaw/skills/mode/SKILL.md`
- `~/.openclaw/skills/mode/mode-ctl` → symlink `/opt/homebrew/bin/mode-ctl`
Commands:

```
mode-ctl status                          # Current mode + overrides
mode-ctl private                         # Switch to local-only
mode-ctl public                          # Switch to cloud LLMs
mode-ctl set-provider <anthropic|openai> # Preferred cloud provider
mode-ctl override <category> <private|public> # Per-category routing
mode-ctl list-overrides                  # Show all overrides
```
State file: `~/homeai-data/active-mode.json`

```json
{
  "mode": "private",
  "cloud_provider": "anthropic",
  "cloud_model": "claude-sonnet-4-20250514",
  "overrides": {
    "web_search": "public",
    "coding": "public",
    "personal_finance": "private",
    "health": "private"
  },
  "updated_at": "2026-03-17T..."
}
```
How model routing works — Bridge modification:

The HTTP bridge (`openclaw-http-bridge.py`) is modified to:

- New function `load_mode()` reads `active-mode.json`
- New function `resolve_model(mode, category=None)` returns a model string
- In `_handle_agent_request()`, after character resolution, check mode → pass `--model` flag to the OpenClaw CLI
  - Private: `ollama/qwen3.5:35b-a3b` (current default, no change)
  - Public: `anthropic/claude-sonnet-4-20250514` or `openai/gpt-4o` (per provider setting)
OpenClaw config changes (`openclaw.json`): add cloud providers to `models.providers`:

```json
"anthropic": {
  "baseUrl": "https://api.anthropic.com/v1",
  "apiKey": "${ANTHROPIC_API_KEY}",
  "api": "anthropic",
  "models": [{"id": "claude-sonnet-4-20250514", "contextWindow": 200000, "maxTokens": 8192}]
},
"openai": {
  "baseUrl": "https://api.openai.com/v1",
  "apiKey": "${OPENAI_API_KEY}",
  "api": "openai",
  "models": [{"id": "gpt-4o", "contextWindow": 128000, "maxTokens": 4096}]
}
```
Per-task classification: The SKILL.md provides a category reference table. The agent self-classifies each request and checks overrides. Default categories:
- Always private: personal finance, health, passwords, private conversations
- Always public: web search, coding help, complex reasoning, translation
- Follow global mode: general chat, smart home, music, calendar
Dashboard integration: add a mode toggle to the dashboard sidebar via a new Vite middleware endpoint `GET`/`POST` `/api/mode` reading/writing `active-mode.json`.
- Env vars (new): `ANTHROPIC_API_KEY`, `OPENAI_API_KEY` (add to OpenClaw plist)
- Bridge file modified: `homeai-agent/openclaw-http-bridge.py` — add ~40 lines for mode loading + model resolution
## Implementation Order

| # | Skill | Complexity | Dependencies |
|---|-------|------------|--------------|
| 1 | memory-ctl | Simple | None |
| 2 | monitor-ctl | Simple | None |
| 3 | character-ctl | Simple | None |
| 4 | routine-ctl | Medium | ha-ctl existing |
| 5 | music-ctl | Medium | Music Assistant or Spotify in HA |
| 6 | workflow-ctl | Simple | n8n API key |
| 7 | gitea-ctl | Simple | Gitea API token |
| 8 | calendar-ctl | Medium | HA calendar + new reminder daemon |
| 9 | mode-ctl | High | Cloud API keys + bridge modification |
## Per-Skill Implementation Steps

For each skill:

- Create `SKILL.md` with frontmatter + agent instructions + examples
- Create Python CLI (`chmod +x`), stdlib only
- Symlink to `/opt/homebrew/bin/`
- Test CLI standalone: `<tool> --help`, `<tool> <command>`
- Add env vars to `com.homeai.openclaw.plist` if needed
- Restart OpenClaw: `launchctl kickstart -k gui/501/com.homeai.openclaw`
- Add section to `~/.openclaw/workspace/TOOLS.md`
- Test via: `openclaw agent --message "test prompt" --agent main`
- Test via voice: wake word + spoken command
## Verification

- Unit test each CLI: run each command manually and verify JSON output
- Agent test: `openclaw agent --message "remember that my favorite color is blue"` (memory-ctl)
- Voice test: wake word → "Is everything running?" → spoken health report (monitor-ctl)
- Mode test: `mode-ctl public` → send a complex query → verify it routes to the cloud model in the bridge logs
- Dashboard test: check that the memory UI shows agent-created memories and the mode toggle works
- Cross-skill test: "Switch to Sucy and play some jazz" → character-ctl + music-ctl in one turn
## Critical Files to Modify

| File | Changes |
|------|---------|
| `~/.openclaw/workspace/TOOLS.md` | Add sections for all 9 new skills |
| `homeai-agent/openclaw-http-bridge.py` | Mode routing (Phase D only) |
| `homeai-agent/launchd/com.homeai.openclaw.plist` | New env vars |
| `~/.openclaw/openclaw.json` | Add anthropic + openai providers (Phase D) |
| `homeai-dashboard/vite.config.js` | `/api/mode` endpoint (Phase D) |
## New Files Created

- `~/.openclaw/skills/memory/` (SKILL.md + memory-ctl)
- `~/.openclaw/skills/service-monitor/` (SKILL.md + monitor-ctl)
- `~/.openclaw/skills/character/` (SKILL.md + character-ctl)
- `~/.openclaw/skills/routine/` (SKILL.md + routine-ctl)
- `~/.openclaw/skills/music/` (SKILL.md + music-ctl)
- `~/.openclaw/skills/workflow/` (SKILL.md + workflow-ctl)
- `~/.openclaw/skills/gitea/` (SKILL.md + gitea-ctl)
- `~/.openclaw/skills/calendar/` (SKILL.md + calendar-ctl + reminder-daemon.py)
- `~/.openclaw/skills/mode/` (SKILL.md + mode-ctl)
- `~/homeai-data/routines/` (directory)
- `~/homeai-data/reminders.json` (file)
- `~/homeai-data/active-mode.json` (file)
- `homeai-agent/reminder-daemon.py` + launchd plist