OpenClaw Conversation - Home Assistant Custom Component

A custom conversation agent for Home Assistant that routes all voice/text queries to OpenClaw for processing.

Features

  • Direct OpenClaw Integration: Routes all conversation requests to OpenClaw via the HTTP bridge (port 8081)
  • CLI Fallback: Falls back to the openclaw CLI command if the HTTP API is unavailable
  • Configurable: Set host, port, agent name, and timeout via UI
  • Voice Pipeline Compatible: Works with Home Assistant's voice assistant pipeline

Installation

Method 1: Manual Copy

  1. Copy the entire openclaw_conversation folder to your Home Assistant custom_components directory:

    # On the HA host (if using HA OS or Container, use the File Editor add-on)
    cp -r homeai-agent/custom_components/openclaw_conversation \
      /config/custom_components/
    
  2. Restart Home Assistant

  3. Go to Settings → Devices & Services → Add Integration

  4. Search for "OpenClaw Conversation"

  5. Configure the settings:

    • OpenClaw Host: localhost (or IP of Mac Mini)
    • OpenClaw Port: 8081 (HTTP Bridge)
    • Agent Name: main (or your configured agent)
    • Timeout: 30 seconds
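
The UI fields above correspond to the option keys used in the YAML configuration later in this README. As a reference, a hedged sketch of the defaults (the dictionary name here is hypothetical; the actual constants live in const.py and may differ):

```python
# Hypothetical mapping of the configuration UI fields to option keys.
# Key names mirror the YAML example in this README; the real constant
# names are defined in const.py.
DEFAULT_OPTIONS = {
    "openclaw_host": "localhost",  # or the IP of the Mac Mini
    "openclaw_port": 8081,         # HTTP bridge port
    "agent_name": "main",          # or your configured agent
    "timeout": 30,                 # seconds
}
```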

Method 2: Using HACS (if available)

  1. Add this repository to HACS as a custom repository
  2. Install "OpenClaw Conversation"
  3. Restart Home Assistant

Configuration

After installation, configure via Settings → Devices & Services → OpenClaw Conversation → Configure.

Via YAML (Alternative)

Add to your configuration.yaml:

openclaw_conversation:
  openclaw_host: localhost
  openclaw_port: 8081
  agent_name: main
  timeout: 30

Usage

Once configured, the OpenClaw agent will be available as a conversation agent in Home Assistant.

Setting as Default Agent

  1. Go to Settings → Voice Assistants
  2. Edit your voice assistant pipeline
  3. Set Conversation Agent to "OpenClaw Conversation"
  4. Save

Testing

  1. Open the Assist panel in Home Assistant
  2. Type a query like: "Turn on the reading lamp"
  3. OpenClaw will process the request and return a response
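
You can also exercise the pipeline without the Assist panel by posting the same query to Home Assistant's REST conversation endpoint (/api/conversation/process). A minimal sketch; the base URL and token are placeholders for your own instance:

```python
import json
import urllib.request

def assist_query(text, base_url="http://localhost:8123",
                 token="YOUR_LONG_LIVED_TOKEN", timeout=30):
    """Send a text query through HA's conversation API and return the
    parsed JSON response. Requires a long-lived access token created
    under your HA user profile."""
    req = urllib.request.Request(
        f"{base_url}/api/conversation/process",
        data=json.dumps({"text": text}).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.loads(resp.read())
```

With the OpenClaw agent set as the pipeline's conversation agent, `assist_query("Turn on the reading lamp")` should return the same response Assist would show.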

Architecture

[Voice Input] → [HA Voice Pipeline] → [OpenClaw Conversation Agent]
                                              ↓
                                    [OpenClaw CLI/API]
                                              ↓
                                    [Ollama LLM + Skills]
                                              ↓
                                    [HA Actions + TTS Response]
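
The agent's hop to the HTTP bridge can be sketched as a single POST. The endpoint path and payload/response shapes below are assumptions for illustration; check openclaw-http-bridge.py and conversation.py for the actual contract (the real component also uses async I/O rather than urllib):

```python
import json
import urllib.request

def ask_openclaw(message, host="localhost", port=8081,
                 agent="main", timeout=30):
    """POST a message to the OpenClaw HTTP bridge and return the reply
    text. Payload keys and the response field name are hypothetical."""
    payload = json.dumps({"message": message, "agent": agent}).encode()
    req = urllib.request.Request(
        f"http://{host}:{port}/",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.loads(resp.read()).get("response", "")
```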

Troubleshooting

Agent Not Responding

  1. Check OpenClaw is running: pgrep -f openclaw
  2. Test CLI directly: openclaw agent --message "Hello" --agent main
  3. Check HA logs: Settings → System → Logs

Connection Errors

  1. Verify OpenClaw host/port settings
  2. Ensure OpenClaw is accessible from HA container/host
  3. Check network connectivity: curl http://localhost:8081/status
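
If curl is not available inside the HA container, a quick TCP probe from any Python shell covers step 3 (a reachability sketch; a successful connect only proves the port is open, not that the bridge is healthy):

```python
import socket

def bridge_reachable(host="localhost", port=8081, timeout=3):
    """Return True if a TCP connection to the bridge port succeeds.

    Default port matches the HTTP bridge (8081)."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```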

Files

File             Purpose
manifest.json    Component metadata
__init__.py      Component setup and registration
config_flow.py   Configuration UI flow
const.py         Constants and defaults
conversation.py  Conversation agent implementation
strings.json     UI translations

See Also