# OpenClaw Conversation - Home Assistant Custom Component

A custom conversation agent for Home Assistant that routes all voice/text queries to OpenClaw for processing.
## Features

- **Direct OpenClaw Integration**: Routes all conversation requests to OpenClaw
- **CLI-based Communication**: Uses the `openclaw` CLI command as a fallback if the HTTP API is unavailable
- **Configurable**: Set host, port, agent name, and timeout via the UI
- **Voice Pipeline Compatible**: Works with Home Assistant's voice assistant pipeline
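For reference, a conversation request to the HTTP bridge can be sketched as below. The `/chat` endpoint path and the `message`/`agent`/`response` field names are assumptions for illustration; check your bridge's actual API before using them:

```python
import json
import urllib.request


def build_payload(message: str, agent: str = "main") -> bytes:
    """Encode the JSON body sent to the bridge (field names are assumptions)."""
    return json.dumps({"message": message, "agent": agent}).encode()


def ask_openclaw(message: str, host: str = "localhost", port: int = 8081,
                 agent: str = "main", timeout: int = 30) -> str:
    """POST a message to the OpenClaw HTTP bridge and return the reply text."""
    req = urllib.request.Request(
        f"http://{host}:{port}/chat",  # hypothetical endpoint path
        data=build_payload(message, agent),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.loads(resp.read())["response"]  # assumed response field
```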
## Installation

### Method 1: Manual Copy

1. Copy the entire `openclaw_conversation` folder to your Home Assistant `custom_components` directory:

   ```shell
   # On the HA host (if using HA OS or Container, use the File Editor add-on)
   cp -r homeai-agent/custom_components/openclaw_conversation \
     /config/custom_components/
   ```

2. Restart Home Assistant
3. Go to **Settings → Devices & Services → Add Integration**
4. Search for "OpenClaw Conversation"
5. Configure the settings:
   - **OpenClaw Host**: `localhost` (or the IP of the Mac Mini)
   - **OpenClaw Port**: `8081` (the HTTP bridge port)
   - **Agent Name**: `main` (or your configured agent)
   - **Timeout**: `30` seconds
### Method 2: Using HACS (if available)
- Add this repository to HACS as a custom repository
- Install "OpenClaw Conversation"
- Restart Home Assistant
## Configuration

### Via UI (Recommended)

After installation, configure via **Settings → Devices & Services → OpenClaw Conversation → Configure**.

### Via YAML (Alternative)

Add to your `configuration.yaml`:

```yaml
openclaw_conversation:
  openclaw_host: localhost
  openclaw_port: 8081
  agent_name: main
  timeout: 30
```
## Usage
Once configured, the OpenClaw agent will be available as a conversation agent in Home Assistant.
### Setting as Default Agent
- Go to Settings → Voice Assistants
- Edit your voice assistant pipeline
- Set Conversation Agent to "OpenClaw Conversation"
- Save
### Testing
- Open the Assist panel in Home Assistant
- Type a query like: "Turn on the reading lamp"
- OpenClaw will process the request and return a response
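Beyond the Assist panel, you can drive the same pipeline from a script via Home Assistant's REST `conversation/process` endpoint. This sketch assumes a long-lived access token exported as `HA_TOKEN` and HA listening on its default port 8123:

```python
import json
import os
import urllib.request


def build_body(text: str, language: str = "en") -> bytes:
    """JSON body for HA's /api/conversation/process endpoint."""
    return json.dumps({"text": text, "language": language}).encode()


def converse(text: str, base_url: str = "http://localhost:8123") -> dict:
    """Send a query through HA's conversation API (uses the default agent)."""
    req = urllib.request.Request(
        f"{base_url}/api/conversation/process",
        data=build_body(text),
        headers={
            "Authorization": f"Bearer {os.environ['HA_TOKEN']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req, timeout=35) as resp:
        return json.loads(resp.read())


# Example: converse("Turn on the reading lamp")
```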
## Architecture

```
[Voice Input] → [HA Voice Pipeline] → [OpenClaw Conversation Agent]
                                              ↓
                                      [OpenClaw CLI/API]
                                              ↓
                                     [Ollama LLM + Skills]
                                              ↓
                                  [HA Actions + TTS Response]
```
## Troubleshooting

### Agent Not Responding

- Check OpenClaw is running: `pgrep -f openclaw`
- Test the CLI directly: `openclaw agent --message "Hello" --agent main`
- Check HA logs: **Settings → System → Logs**
### Connection Errors

- Verify the OpenClaw host/port settings
- Ensure OpenClaw is reachable from the HA container/host
- Check network connectivity: `curl http://localhost:8081/status`
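The same reachability check can be done from Python; this helper (a hypothetical name, not part of the component) probes the bridge's `/status` endpoint:

```python
import urllib.error
import urllib.request


def bridge_reachable(host: str = "localhost", port: int = 8081,
                     timeout: float = 5.0) -> bool:
    """Return True if the OpenClaw HTTP bridge answers its /status endpoint."""
    try:
        url = f"http://{host}:{port}/status"
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Connection refused, DNS failure, or timeout: bridge is unreachable
        return False
```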
## Files

| File | Purpose |
|---|---|
| `manifest.json` | Component metadata |
| `__init__.py` | Component setup and registration |
| `config_flow.py` | Configuration UI flow |
| `const.py` | Constants and defaults |
| `conversation.py` | Conversation agent implementation |
| `strings.json` | UI translations |