feat: memory v2, prompt styles, Dream/GAZE integration, Wyoming TTS fix

SQLite + sqlite-vec replaces JSON memory files with semantic search,
follow-up injection, privacy levels, and lifecycle management.

Six prompt styles (quick/standard/creative/roleplayer/game-master/storyteller)
with per-style Claude model tiering (Haiku/Sonnet/Opus), temperature control,
and section stripping. Characters can set default style and per-style overrides.

Dream character import and GAZE character linking in the dashboard editor
with auto-populated fields, cover image resolution, and preset assignment.

Bridge: session isolation (conversation_id / 12h satellite buckets),
model routing refactor, PUT/DELETE support, memory REST endpoints.

Dashboard: mobile-responsive sidebar, retry button, style picker in chat,
follow-up banner, memory lifecycle/privacy UI, cloud model options in editor.

Wyoming TTS: upgraded to v1.8.0 for HA 1.7.2 compatibility.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Author: Aodhan Collins
Date: 2026-03-24 22:31:04 +00:00
Parent: c3bae6fdc0
Commit: 56580a2cb2
34 changed files with 2891 additions and 467 deletions


@@ -10,6 +10,8 @@ DEEPSEEK_API_KEY=
GEMINI_API_KEY=
ELEVENLABS_API_KEY=
GAZE_API_KEY=
DREAM_API_KEY=
ANTHROPIC_API_KEY=
# ─── Data & Paths ──────────────────────────────────────────────────────────────
DATA_DIR=${HOME}/homeai-data
@@ -59,3 +61,5 @@ VTUBE_WS_URL=ws://localhost:8001
# ─── P8: Images ────────────────────────────────────────────────────────────────
COMFYUI_URL=http://localhost:8188
# ─── P9: Character Management ─────────────────────────────────────────────────
DREAM_HOST=http://localhost:3000

.gitignore

@@ -45,3 +45,6 @@ homeai-esp32/esphome/secrets.yaml
homeai-llm/benchmark-results.md
homeai-character/characters/*.json
!homeai-character/characters/.gitkeep
# MCP Files
*.mcp*


@@ -18,14 +18,14 @@ A self-hosted, always-on personal AI assistant running on a **Mac Mini M4 Pro (6
| Storage | 1TB SSD |
| Network | Gigabit Ethernet |
Primary LLMs are Claude 4.5/4.6 family via Anthropic API (Haiku for quick, Sonnet for standard, Opus for creative/RP). Local Ollama models available as fallback. All other inference (STT, TTS, image gen) runs locally.
---
## Core Stack
### AI & LLM
- **Claude 4.5/4.6 family** — primary LLMs via Anthropic API, tiered per prompt style: Haiku 4.5 (quick commands), Sonnet 4.6 (standard/creative), Opus 4.6 (roleplay/storytelling)
- **Ollama** — local LLM runtime (fallback models: Llama 3.3 70B, Qwen 3.5 35B-A3B, Qwen 2.5 7B)
- **Model keep-warm daemon** — `preload-models.sh` runs as a loop, checks every 5 min, re-pins evicted models with `keep_alive=-1`. Keeps `qwen2.5:7b` (small/fast) and `$HOMEAI_MEDIUM_MODEL` (default: `qwen3.5:35b-a3b`) always loaded in VRAM. Medium model is configurable via env var for per-persona model assignment.
- **Open WebUI** — browser-based chat interface, runs as Docker container
@@ -53,13 +53,16 @@ Primary LLM is Claude Sonnet 4 via Anthropic API. Local Ollama models available
- **OpenClaw** — primary AI agent layer; receives voice commands, calls tools, manages personality
- **OpenClaw Skills** — 13 skills total: home-assistant, image-generation, voice-assistant, vtube-studio, memory, service-monitor, character, routine, music, workflow, gitea, calendar, mode
- **n8n** — visual workflow automation (Docker), chains AI actions
- **Character Memory System** — SQLite + sqlite-vec semantic search (personal per-character + general shared + follow-ups), injected into LLM system prompt with context-aware retrieval
- **Prompt Styles** — 6 styles (quick, standard, creative, roleplayer, game-master, storyteller) with per-style model routing, temperature, and section stripping. JSON templates in `homeai-agent/prompt-styles/`
- **Public/Private Mode** — routes requests to local Ollama (private) or cloud LLMs (public) with per-category overrides via `active-mode.json`. Default primary model is Claude Sonnet 4.6, with per-style model tiering (Haiku/Sonnet/Opus).
### Character & Personality
- **Character Schema v2** — JSON spec with background, dialogue_style, appearance, skills, gaze_presets, dream_id, gaze_character, prompt style overrides (v1 auto-migrated)
- **HomeAI Dashboard** — unified web app: character editor, chat, memory manager, service dashboard
- **Dream** — character management service (http://10.0.0.101:3000), REST API for character CRUD with GAZE integration for cover images
- **Character MCP Server** — LLM-assisted character creation via Fandom wiki/Wikipedia lookup (Docker)
- **GAZE** — image generation service (http://10.0.0.101:5782), REST API for presets, characters, and job-based image generation
- Character config stored as JSON files in `~/homeai-data/characters/`, consumed by bridge for system prompt construction
### Visual Representation
@@ -97,7 +100,7 @@ ESP32-S3-BOX-3 (room)
→ Bridge resolves character (satellite_id → character mapping)
→ Bridge builds system prompt (profile + memories) and writes TTS config to state file
→ Bridge checks active-mode.json for model routing (private=local, public=cloud)
→ OpenClaw CLI → LLM generates response (Claude Haiku/Sonnet/Opus per style, Ollama fallback)
→ Response dispatched:
→ Wyoming TTS reads state file → routes to Kokoro (local) or ElevenLabs (cloud)
→ Audio sent back to ESP32-S3-BOX-3 (spoken response)
@@ -130,15 +133,32 @@ Each character is a JSON file in `~/homeai-data/characters/` with:
- **Profile fields** — background, appearance, dialogue_style, skills array
- **TTS config** — engine (kokoro/elevenlabs), kokoro_voice, elevenlabs_voice_id, elevenlabs_model, speed
- **GAZE presets** — array of `{preset, trigger}` for image generation styles
- **Dream link** — `dream_id` for syncing character data from Dream service
- **GAZE link** — `gaze_character` for auto-assigned cover image and presets
- **Prompt style config** — `default_prompt_style`, `prompt_style_overrides` for per-style tuning
- **Custom prompt rules** — trigger/response overrides for specific contexts
### Memory System
SQLite + sqlite-vec database at `~/homeai-data/memories/memories.db`:
- **Personal memories** — per-character, semantic/episodic/relational/opinion types
- **General memories** — shared operational knowledge (character_id = "general")
- **Follow-ups** — LLM-driven questions injected into system prompt, auto-resolve after 2 surfacings or 48h
- **Privacy levels** — public, sensitive, local_only (local_only excluded from cloud model requests)
- **Semantic search** — sentence-transformers all-MiniLM-L6-v2 embeddings (384 dims) for context-aware retrieval
- Core module: `homeai-agent/memory_store.py` (imported by bridge + memory-ctl skill)
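The privacy gate described above can be sketched in a few lines; this is a hedged illustration (not the bridge's actual code), assuming each memory is a dict with a `privacy_level` field:

```python
# Illustrative sketch: memories marked local_only must never be sent to
# cloud models; public and sensitive memories may pass through.
def filter_for_cloud(memories: list[dict]) -> list[dict]:
    """Drop local_only memories before a cloud (public-mode) request."""
    return [m for m in memories if m.get("privacy_level") != "local_only"]
```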
### Prompt Styles
Six response styles in `homeai-agent/prompt-styles/`, each a JSON template with model, temperature, and instructions:
- **quick** — Claude Haiku 4.5, low temp, brief responses, strips profile sections
- **standard** — Claude Sonnet 4.6, balanced
- **creative** — Claude Sonnet 4.6, higher temp, elaborative
- **roleplayer** — Claude Opus 4.6, full personality injection
- **game-master** — Claude Opus 4.6, narrative-focused
- **storyteller** — Claude Opus 4.6, story-centric
Style selection: dashboard chat has a style picker; characters can set `default_prompt_style`; satellites use the global active style. Bridge resolves model per style → group → mode → default.
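The resolution chain above can be sketched as follows; the function name and model labels are illustrative, not the bridge's real identifiers:

```python
# Hypothetical sketch of style resolution: an explicit user pick wins,
# then the character's default_prompt_style, then the global active style.
STYLE_MODELS = {
    "quick": "haiku-4.5",
    "standard": "sonnet-4.6",
    "creative": "sonnet-4.6",
    "roleplayer": "opus-4.6",
    "game-master": "opus-4.6",
    "storyteller": "opus-4.6",
}

def resolve_style(explicit=None, character_default=None, global_active="standard"):
    """First non-empty source wins, mirroring the documented hierarchy."""
    return explicit or character_default or global_active
```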
### TTS Voice Routing
@@ -160,11 +180,12 @@ This works for both ESP32/HA pipeline and dashboard chat.
6. **Character system** — schema v2, dashboard editor, memory system, per-character TTS routing ✅
7. **OpenClaw skills expansion** — 9 new skills (memory, monitor, character, routine, music, workflow, gitea, calendar, mode) + public/private mode routing ✅
8. **Music Assistant** — deployed on Pi (10.0.0.199:8095), Spotify + SMB + Chromecast players ✅
9. **Memory v2 + Prompt Styles + Dream/GAZE** — SQLite memory with semantic search, 6 prompt styles with model tiering, Dream character import, GAZE character linking ✅
10. **Animated visual** — PNG/GIF character visual for the web assistant (initial visual layer)
11. **Android app** — companion app for mobile access to the assistant
12. **ComfyUI** — image generation online, character-consistent model workflows
13. **Extended integrations** — Snapcast, code-server
14. **Polish** — Authelia, Tailscale hardening, iOS widgets
### Stretch Goals
- **Live2D / VTube Studio** — full Live2D model with WebSocket API bridge (requires learning Live2D tooling)
@@ -180,7 +201,10 @@ This works for both ESP32/HA pipeline and dashboard chat.
- OpenClaw workspace tools: `~/.openclaw/workspace/TOOLS.md`
- OpenClaw config: `~/.openclaw/openclaw.json`
- Character configs: `~/homeai-data/characters/`
- Character memories DB: `~/homeai-data/memories/memories.db`
- Memory store module: `homeai-agent/memory_store.py`
- Prompt style templates: `homeai-agent/prompt-styles/`
- Active prompt style: `~/homeai-data/active-prompt-style.json`
- Conversation history: `~/homeai-data/conversations/`
- Active TTS state: `~/homeai-data/active-tts-voice.json`
- Active mode state: `~/homeai-data/active-mode.json`
@@ -194,6 +218,8 @@ This works for both ESP32/HA pipeline and dashboard chat.
- Gitea repos root: `~/gitea/`
- Music Assistant (Pi): `~/docker/selbina/music-assistant/` on 10.0.0.199
- Skills user guide: `homeai-agent/SKILLS_GUIDE.md`
- Dream service: `http://10.0.0.101:3000` (character management, REST API)
- GAZE service: `http://10.0.0.101:5782` (image generation, REST API)
---
@@ -203,8 +229,10 @@ This works for both ESP32/HA pipeline and dashboard chat.
- ESP32-S3-BOX-3 units are dumb satellites — all intelligence stays on Mac Mini
- The character JSON schema (from Character Manager) should be treated as a versioned spec; pipeline components read from it, never hardcode personality values
- OpenClaw skills are the primary extension mechanism — new capabilities = new skills
- Primary LLMs are Claude 4.5/4.6 family (Anthropic API) with per-style tiering; local Ollama models are available as fallback
- Launchd plists are symlinked from repo source to ~/Library/LaunchAgents/ — edit source, then bootout/bootstrap to reload
- Music Assistant runs on Pi (10.0.0.199), not Mac Mini — needs host networking for Chromecast mDNS discovery
- VTube Studio API bridge should be a standalone OpenClaw skill with clear event interface
- Memory DB (`memories.db`) should be backed up as part of regular Gitea commits
- Dream characters can be linked to GAZE characters for cover image fallback and cross-referencing
- Prompt style selection hierarchy: explicit user pick → character default → global active style

PORT_MAP.md

@@ -0,0 +1,89 @@
# HomeAI Port Map
All ports used across the HomeAI stack. Updated 2026-03-20.
**Host: LINDBLUM (10.0.0.101)** — Mac Mini M4 Pro
## Voice Pipeline
| Port | Service | Protocol | Managed By | Binds |
|------|---------|----------|------------|-------|
| 10300 | Wyoming STT (Whisper MLX) | TCP (Wyoming) | launchd `com.homeai.wyoming-stt` | 0.0.0.0 |
| 10301 | Wyoming TTS (Kokoro) | TCP (Wyoming) | launchd `com.homeai.wyoming-tts` | 0.0.0.0 |
| 10302 | Wyoming TTS (ElevenLabs) | TCP (Wyoming) | launchd `com.homeai.wyoming-elevenlabs` | 0.0.0.0 |
| 10700 | Wyoming Satellite | TCP (Wyoming) | launchd `com.homeai.wyoming-satellite` | 0.0.0.0 |
## Agent / Orchestration
| Port | Service | Protocol | Managed By | Binds |
|------|---------|----------|------------|-------|
| 8080 | OpenClaw Gateway | HTTP | launchd `com.homeai.openclaw` | localhost |
| 8081 | OpenClaw HTTP Bridge | HTTP | launchd `com.homeai.openclaw-bridge` | 0.0.0.0 |
| 8002 | VTube Studio Bridge | HTTP | launchd `com.homeai.vtube-bridge` | localhost |
## LLM
| Port | Service | Protocol | Managed By | Binds |
|------|---------|----------|------------|-------|
| 11434 | Ollama | HTTP | launchd `com.homeai.ollama` | 0.0.0.0 |
| 3030 | Open WebUI | HTTP | Docker `homeai-open-webui` | 0.0.0.0 |
## Dashboards / UIs
| Port | Service | Protocol | Managed By | Binds |
|------|---------|----------|------------|-------|
| 5173 | HomeAI Dashboard | HTTP | launchd `com.homeai.dashboard` | localhost |
| 5174 | Desktop Assistant | HTTP | launchd `com.homeai.desktop-assistant` | localhost |
## Image Generation
| Port | Service | Protocol | Managed By | Binds |
|------|---------|----------|------------|-------|
| 5782 | GAZE API | HTTP | — | 10.0.0.101 |
| 8188 | ComfyUI | HTTP | — | localhost |
## Visual
| Port | Service | Protocol | Managed By | Binds |
|------|---------|----------|------------|-------|
| 8001 | VTube Studio (WebSocket) | WS | External app | localhost |
## Infrastructure (Docker)
| Port | Service | Protocol | Managed By | Binds |
|------|---------|----------|------------|-------|
| 3001 | Uptime Kuma | HTTP | Docker `homeai-uptime-kuma` | 0.0.0.0 |
| 5678 | n8n | HTTP | Docker `homeai-n8n` | 0.0.0.0 |
| 8090 | code-server | HTTP | Docker `homeai-code-server` | 0.0.0.0 |
---
**Host: SELBINA (10.0.0.199)** — Raspberry Pi 5
| Port | Service | Protocol | Managed By |
|------|---------|----------|------------|
| 3000 | Gitea | HTTP | Docker |
| 8095 | Music Assistant | HTTP | Docker (host networking) |
| 8123 | Home Assistant | HTTPS | Docker |
| 9443 | Portainer | HTTPS | Docker |
---
## Port Ranges Summary
```
3000–3030   Web UIs (Gitea, Uptime Kuma, Open WebUI)
5173–5174   Vite dev servers (dashboards)
5678        n8n
5782        GAZE API
8001–8002   VTube Studio (app + bridge)
8080–8081   OpenClaw (gateway + bridge)
8090        code-server
8095        Music Assistant
8123        Home Assistant
8188        ComfyUI
9443        Portainer
11434       Ollama
10300–10302 Wyoming voice (STT + TTS)
10700       Wyoming satellite
```


@@ -0,0 +1,865 @@
#!/usr/bin/env python3
"""
HomeAI Memory Store — SQLite + Vector Search
Replaces flat JSON memory files with a structured SQLite database
using sqlite-vec for semantic similarity search.
Used by:
- openclaw-http-bridge.py (memory retrieval + follow-up injection)
- memory-ctl skill (CLI memory management)
- Dashboard API (REST endpoints via bridge)
"""
import json
import os
import sqlite3
import struct
import time
from datetime import datetime, timedelta, timezone
from pathlib import Path
from typing import Optional
import sqlite_vec
# ---------------------------------------------------------------------------
# Configuration
# ---------------------------------------------------------------------------
DATA_DIR = Path(os.environ.get("DATA_DIR", os.path.expanduser("~/homeai-data")))
MEMORIES_DIR = DATA_DIR / "memories"
DB_PATH = MEMORIES_DIR / "memories.db"
EMBEDDING_DIM = 384 # all-MiniLM-L6-v2
# Privacy keywords for rule-based classification
PRIVACY_KEYWORDS = {
"local_only": [
"health", "illness", "sick", "doctor", "medical", "medication", "surgery",
"salary", "bank", "financial", "debt", "mortgage", "tax",
"depression", "anxiety", "therapy", "divorce", "breakup",
],
"sensitive": [
"address", "phone", "email", "password", "birthday",
],
}
# ---------------------------------------------------------------------------
# Embedding model (lazy-loaded singleton)
# ---------------------------------------------------------------------------
_embedder = None
def _get_embedder():
"""Lazy-load the sentence-transformers model."""
global _embedder
if _embedder is None:
from sentence_transformers import SentenceTransformer
_embedder = SentenceTransformer("all-MiniLM-L6-v2")
return _embedder
def get_embedding(text: str) -> list[float]:
"""Compute a 384-dim embedding for the given text."""
model = _get_embedder()
vec = model.encode(text, normalize_embeddings=True)
return vec.tolist()
def _serialize_f32(vec: list[float]) -> bytes:
"""Serialize a float list to little-endian bytes for sqlite-vec."""
return struct.pack(f"<{len(vec)}f", *vec)
def _deserialize_f32(blob: bytes) -> list[float]:
"""Deserialize sqlite-vec float bytes back to a list."""
n = len(blob) // 4
return list(struct.unpack(f"<{n}f", blob))
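A quick round-trip of the two helpers above, restated self-containedly with stdlib `struct` (illustrative only, not called by production code):

```python
# Illustrative round-trip of the f32 serializers: values exactly
# representable in float32 (like 1.0 and 2.5) survive unchanged;
# each float costs 4 bytes in the packed blob.
import struct
packed = struct.pack("<2f", 1.0, 2.5)          # 8 bytes, little-endian f32
restored = list(struct.unpack("<2f", packed))  # -> [1.0, 2.5]
```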
# ---------------------------------------------------------------------------
# Database initialization
# ---------------------------------------------------------------------------
_db: Optional[sqlite3.Connection] = None
def init_db() -> sqlite3.Connection:
"""Initialize the SQLite database with schema and sqlite-vec extension."""
global _db
if _db is not None:
return _db
MEMORIES_DIR.mkdir(parents=True, exist_ok=True)
db = sqlite3.connect(str(DB_PATH), check_same_thread=False)
db.enable_load_extension(True)
sqlite_vec.load(db)
db.enable_load_extension(False)
db.row_factory = sqlite3.Row
db.executescript("""
CREATE TABLE IF NOT EXISTS memories (
id TEXT PRIMARY KEY,
character_id TEXT NOT NULL,
content TEXT NOT NULL,
memory_type TEXT NOT NULL DEFAULT 'semantic',
category TEXT NOT NULL DEFAULT 'other',
privacy_level TEXT NOT NULL DEFAULT 'standard',
importance REAL NOT NULL DEFAULT 0.5,
lifecycle_state TEXT NOT NULL DEFAULT 'active',
follow_up_due TEXT,
follow_up_context TEXT,
source TEXT DEFAULT 'user_explicit',
created_at TEXT NOT NULL,
last_accessed TEXT,
expires_at TEXT,
previous_value TEXT,
tags TEXT,
surfaced_count INTEGER DEFAULT 0
);
CREATE INDEX IF NOT EXISTS idx_memories_character
ON memories(character_id);
CREATE INDEX IF NOT EXISTS idx_memories_lifecycle
ON memories(lifecycle_state);
CREATE INDEX IF NOT EXISTS idx_memories_type
ON memories(memory_type);
""")
# Create the vec0 virtual table for vector search
# sqlite-vec requires this specific syntax
db.execute(f"""
CREATE VIRTUAL TABLE IF NOT EXISTS memory_embeddings USING vec0(
id TEXT PRIMARY KEY,
embedding float[{EMBEDDING_DIM}]
)
""")
# Partial index for follow-ups (wrapped in try/except so repeated
# initialization stays idempotent)
try:
db.execute("""
CREATE INDEX idx_memories_followup
ON memories(lifecycle_state, follow_up_due)
WHERE lifecycle_state = 'pending_followup'
""")
except sqlite3.OperationalError:
pass # index already exists
db.commit()
_db = db
return db
def _get_db() -> sqlite3.Connection:
"""Get or initialize the database connection."""
if _db is None:
return init_db()
return _db
def _row_to_dict(row: sqlite3.Row) -> dict:
"""Convert a sqlite3.Row to a plain dict."""
return dict(row)
def _generate_id() -> str:
    """Generate a unique memory ID (ms timestamp + random suffix to avoid
    collisions when two memories are added within the same millisecond)."""
    return f"m_{int(time.time() * 1000)}_{os.urandom(3).hex()}"
def _now_iso() -> str:
"""Current UTC time as ISO string."""
return datetime.now(timezone.utc).isoformat()
# ---------------------------------------------------------------------------
# Write-time classification (rule-based, Phase 1)
# ---------------------------------------------------------------------------
def classify_memory(content: str) -> dict:
"""Rule-based classification for memory properties.
Returns defaults that can be overridden by explicit parameters."""
content_lower = content.lower()
# Privacy detection
privacy = "standard"
for level, keywords in PRIVACY_KEYWORDS.items():
if any(kw in content_lower for kw in keywords):
privacy = level
break
# Memory type detection
memory_type = "semantic"
temporal_markers = [
"today", "yesterday", "tonight", "this morning", "just now",
"feeling", "right now", "this week", "earlier",
]
if any(kw in content_lower for kw in temporal_markers):
memory_type = "episodic"
# Importance heuristic
importance = 0.5
if privacy == "local_only":
importance = 0.7
elif privacy == "sensitive":
importance = 0.6
return {
"memory_type": memory_type,
"privacy_level": privacy,
"importance": importance,
}
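A worked example of the rules above, restated in condensed form (keyword lists abbreviated; `classify_memory` itself is the authoritative implementation):

```python
# Condensed restatement of the write-time rules for one sample input:
# a health keyword forces local_only privacy (importance 0.7), and a
# temporal marker ("today") flips the type from semantic to episodic.
text = "saw the doctor today about my knee"
t = text.lower()
privacy = "local_only" if any(k in t for k in ("doctor", "medical", "health")) else "standard"
memory_type = "episodic" if any(k in t for k in ("today", "yesterday", "just now")) else "semantic"
importance = 0.7 if privacy == "local_only" else 0.5
```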
# ---------------------------------------------------------------------------
# CRUD operations
# ---------------------------------------------------------------------------
def add_memory(
character_id: str,
content: str,
memory_type: str | None = None,
category: str = "other",
importance: float | None = None,
privacy_level: str | None = None,
tags: list[str] | None = None,
follow_up_due: str | None = None,
follow_up_context: str | None = None,
source: str = "user_explicit",
expires_at: str | None = None,
) -> dict:
"""Add a new memory record. Auto-classifies fields not explicitly set."""
db = _get_db()
classified = classify_memory(content)
memory_type = memory_type or classified["memory_type"]
privacy_level = privacy_level or classified["privacy_level"]
importance = importance if importance is not None else classified["importance"]
lifecycle_state = "active"
if follow_up_due or follow_up_context:
lifecycle_state = "pending_followup"
if not follow_up_due:
follow_up_due = "next_interaction"
mem_id = _generate_id()
now = _now_iso()
# Generate embedding
embedding = get_embedding(content)
db.execute("""
INSERT INTO memories (
id, character_id, content, memory_type, category,
privacy_level, importance, lifecycle_state,
follow_up_due, follow_up_context, source,
created_at, tags, surfaced_count
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, 0)
""", (
mem_id, character_id, content, memory_type, category,
privacy_level, importance, lifecycle_state,
follow_up_due, follow_up_context, source,
now, json.dumps(tags) if tags else None,
))
# Insert embedding into vec0 table
db.execute(
"INSERT INTO memory_embeddings (id, embedding) VALUES (?, ?)",
(mem_id, _serialize_f32(embedding)),
)
db.commit()
return {
"id": mem_id,
"character_id": character_id,
"content": content,
"memory_type": memory_type,
"category": category,
"privacy_level": privacy_level,
"importance": importance,
"lifecycle_state": lifecycle_state,
"follow_up_due": follow_up_due,
"follow_up_context": follow_up_context,
"source": source,
"created_at": now,
"tags": tags,
}
def update_memory(memory_id: str, **fields) -> dict | None:
"""Update specific fields on a memory record."""
db = _get_db()
# Validate that memory exists
row = db.execute("SELECT * FROM memories WHERE id = ?", (memory_id,)).fetchone()
if not row:
return None
allowed = {
"content", "memory_type", "category", "privacy_level", "importance",
"lifecycle_state", "follow_up_due", "follow_up_context", "source",
"last_accessed", "expires_at", "previous_value", "tags", "surfaced_count",
}
updates = {k: v for k, v in fields.items() if k in allowed}
if not updates:
return _row_to_dict(row)
# If content changed, update embedding and store previous value
if "content" in updates:
updates["previous_value"] = row["content"]
embedding = get_embedding(updates["content"])
# Update vec0 table: delete old, insert new
db.execute("DELETE FROM memory_embeddings WHERE id = ?", (memory_id,))
db.execute(
"INSERT INTO memory_embeddings (id, embedding) VALUES (?, ?)",
(memory_id, _serialize_f32(embedding)),
)
if "tags" in updates and isinstance(updates["tags"], list):
updates["tags"] = json.dumps(updates["tags"])
set_clause = ", ".join(f"{k} = ?" for k in updates)
values = list(updates.values()) + [memory_id]
db.execute(f"UPDATE memories SET {set_clause} WHERE id = ?", values)
db.commit()
row = db.execute("SELECT * FROM memories WHERE id = ?", (memory_id,)).fetchone()
return _row_to_dict(row) if row else None
def delete_memory(memory_id: str) -> bool:
"""Delete a memory record and its embedding."""
db = _get_db()
row = db.execute("SELECT id FROM memories WHERE id = ?", (memory_id,)).fetchone()
if not row:
return False
db.execute("DELETE FROM memories WHERE id = ?", (memory_id,))
db.execute("DELETE FROM memory_embeddings WHERE id = ?", (memory_id,))
db.commit()
return True
# ---------------------------------------------------------------------------
# Retrieval
# ---------------------------------------------------------------------------
def retrieve_memories(
character_id: str,
context_text: str = "",
limit: int = 20,
exclude_private_for_cloud: bool = False,
) -> list[dict]:
"""Dual retrieval: semantic similarity + recency, merged and ranked.
If context_text is empty, falls back to recency-only retrieval.
"""
db = _get_db()
privacy_filter = ""
if exclude_private_for_cloud:
privacy_filter = "AND m.privacy_level != 'local_only'"
# Always include high-importance memories
high_importance = db.execute(f"""
SELECT * FROM memories m
WHERE m.character_id = ?
AND m.lifecycle_state IN ('active', 'pending_followup')
AND m.importance > 0.8
{privacy_filter}
ORDER BY m.created_at DESC
LIMIT 5
""", (character_id,)).fetchall()
seen_ids = {r["id"] for r in high_importance}
results = {r["id"]: {**_row_to_dict(r), "_score": 1.0} for r in high_importance}
# Semantic search (if context provided and embeddings exist)
if context_text:
try:
query_emb = get_embedding(context_text)
vec_rows = db.execute("""
SELECT id, distance
FROM memory_embeddings
WHERE embedding MATCH ?
AND k = 30
""", (_serialize_f32(query_emb),)).fetchall()
vec_ids = [r["id"] for r in vec_rows if r["id"] not in seen_ids]
vec_distances = {r["id"]: r["distance"] for r in vec_rows}
if vec_ids:
placeholders = ",".join("?" * len(vec_ids))
sem_rows = db.execute(f"""
SELECT * FROM memories m
WHERE m.id IN ({placeholders})
AND m.character_id = ?
AND m.lifecycle_state IN ('active', 'pending_followup')
{privacy_filter}
""", (*vec_ids, character_id)).fetchall()
for r in sem_rows:
d = _row_to_dict(r)
# vec0 MATCH returns L2 distance; embeddings are normalized, so
# smaller distance tracks higher cosine similarity and 1 - distance
# serves as a rough similarity proxy
dist = vec_distances.get(r["id"], 1.0)
semantic_score = max(0.0, 1.0 - dist)
d["_score"] = 0.6 * semantic_score + 0.1 * d["importance"]
results[r["id"]] = d
seen_ids.add(r["id"])
except Exception as e:
print(f"[MemoryStore] Vector search error: {e}")
# Recency search: last 7 days, ordered by importance + recency
recency_rows = db.execute(f"""
SELECT * FROM memories m
WHERE m.character_id = ?
AND m.lifecycle_state IN ('active', 'pending_followup')
AND m.created_at > datetime('now', '-7 days')
{privacy_filter}
ORDER BY m.importance DESC, m.created_at DESC
LIMIT 10
""", (character_id,)).fetchall()
for r in recency_rows:
if r["id"] not in seen_ids:
d = _row_to_dict(r)
# Recency score based on age in days (newer = higher)
try:
created = datetime.fromisoformat(d["created_at"])
age_days = (datetime.now(timezone.utc) - created).total_seconds() / 86400
recency_score = max(0.0, 1.0 - (age_days / 7.0))
except (ValueError, TypeError):
recency_score = 0.5
d["_score"] = 0.3 * recency_score + 0.1 * d["importance"]
results[r["id"]] = d
seen_ids.add(r["id"])
# Sort by score descending, return top N
ranked = sorted(results.values(), key=lambda x: x.get("_score", 0), reverse=True)
# Update last_accessed for returned memories
returned = ranked[:limit]
now = _now_iso()
for mem in returned:
mem.pop("_score", None)
db.execute(
"UPDATE memories SET last_accessed = ? WHERE id = ?",
(now, mem["id"]),
)
db.commit()
return returned
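The three retrieval channels above feed one ranked pool: high-importance pins enter at 1.0, semantic hits at 0.6·similarity + 0.1·importance, and recency hits at 0.3·recency + 0.1·importance. A standalone sketch of that blend (the helper names below are illustrative, not part of memory_store):

```python
def semantic_score(distance: float, importance: float) -> float:
    # Mirrors the blend in retrieve_memories: similarity = 1 - distance
    similarity = max(0.0, 1.0 - distance)
    return 0.6 * similarity + 0.1 * importance

def recency_score(age_days: float, importance: float) -> float:
    # Newer memories decay linearly to zero over the 7-day window
    recency = max(0.0, 1.0 - (age_days / 7.0))
    return 0.3 * recency + 0.1 * importance

# A close semantic match (distance 0.1) outranks a day-old memory of
# equal importance, so retrieval favours relevance over mere freshness.
close_match = semantic_score(0.1, 0.5)
recent = recency_score(1.0, 0.5)
```

Because the semantic weight (0.6) dominates the recency weight (0.3), a strong match from months ago still beats a weak memory from yesterday.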
def get_pending_followups(character_id: str) -> list[dict]:
"""Get follow-up memories that are due for surfacing."""
db = _get_db()
now = _now_iso()
rows = db.execute("""
SELECT * FROM memories
WHERE character_id = ?
AND lifecycle_state = 'pending_followup'
AND (follow_up_due <= ? OR follow_up_due = 'next_interaction')
ORDER BY importance DESC, created_at DESC
LIMIT 5
""", (character_id, now)).fetchall()
return [_row_to_dict(r) for r in rows]
def search_memories(
character_id: str,
query: str,
memory_type: str | None = None,
limit: int = 10,
) -> list[dict]:
"""Semantic search for memories matching a query."""
db = _get_db()
query_emb = get_embedding(query)
vec_rows = db.execute("""
SELECT id, distance
FROM memory_embeddings
WHERE embedding MATCH ?
AND k = ?
""", (_serialize_f32(query_emb), limit * 3)).fetchall()
if not vec_rows:
return []
vec_ids = [r["id"] for r in vec_rows]
vec_distances = {r["id"]: r["distance"] for r in vec_rows}
placeholders = ",".join("?" * len(vec_ids))
type_filter = "AND m.memory_type = ?" if memory_type else ""
params = [*vec_ids, character_id]
if memory_type:
params.append(memory_type)
rows = db.execute(f"""
SELECT * FROM memories m
WHERE m.id IN ({placeholders})
AND m.character_id = ?
{type_filter}
ORDER BY m.created_at DESC
""", params).fetchall()
# Sort by similarity
results = []
for r in rows:
d = _row_to_dict(r)
d["_distance"] = vec_distances.get(r["id"], 1.0)
results.append(d)
results.sort(key=lambda x: x["_distance"])
for r in results:
r.pop("_distance", None)
return results[:limit]
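Both KNN queries pass the embedding through `_serialize_f32`, which is defined earlier in the module and not shown in this chunk. sqlite-vec expects vectors as packed little-endian float32 blobs, so a typical implementation looks roughly like this (a sketch, not necessarily the module's exact helper):

```python
import struct

def serialize_f32(vector: list[float]) -> bytes:
    """Pack a float vector as little-endian float32 bytes,
    the blob format sqlite-vec's vec0 MATCH operator accepts."""
    return struct.pack(f"<{len(vector)}f", *vector)

blob = serialize_f32([0.1, 0.2, 0.3])  # 3 floats -> 12 bytes
```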
def list_memories(
character_id: str,
memory_type: str | None = None,
lifecycle_state: str | None = None,
category: str | None = None,
limit: int = 20,
offset: int = 0,
) -> list[dict]:
"""List memories with optional filters."""
db = _get_db()
conditions = ["character_id = ?"]
params: list = [character_id]
if memory_type:
conditions.append("memory_type = ?")
params.append(memory_type)
if lifecycle_state:
conditions.append("lifecycle_state = ?")
params.append(lifecycle_state)
if category:
conditions.append("category = ?")
params.append(category)
where = " AND ".join(conditions)
params.extend([limit, offset])
rows = db.execute(f"""
SELECT * FROM memories
WHERE {where}
ORDER BY created_at DESC
LIMIT ? OFFSET ?
""", params).fetchall()
return [_row_to_dict(r) for r in rows]
def count_memories(character_id: str) -> int:
"""Count memories for a character."""
db = _get_db()
row = db.execute(
"SELECT COUNT(*) as cnt FROM memories WHERE character_id = ?",
(character_id,),
).fetchone()
return row["cnt"] if row else 0
# ---------------------------------------------------------------------------
# Lifecycle management
# ---------------------------------------------------------------------------
def resolve_followup(memory_id: str) -> bool:
"""Mark a follow-up as resolved."""
db = _get_db()
result = db.execute("""
UPDATE memories
SET lifecycle_state = 'resolved',
follow_up_due = NULL
WHERE id = ? AND lifecycle_state = 'pending_followup'
""", (memory_id,))
db.commit()
return result.rowcount > 0
def archive_memory(memory_id: str) -> bool:
"""Archive a memory (keeps it for relational inference, not surfaced)."""
db = _get_db()
result = db.execute("""
UPDATE memories
SET lifecycle_state = 'archived'
WHERE id = ?
""", (memory_id,))
db.commit()
return result.rowcount > 0
def auto_resolve_expired_followups() -> int:
"""Auto-resolve follow-ups that are more than 48h past due."""
db = _get_db()
cutoff = (datetime.now(timezone.utc) - timedelta(hours=48)).isoformat()
result = db.execute("""
UPDATE memories
SET lifecycle_state = 'resolved',
follow_up_due = NULL
WHERE lifecycle_state = 'pending_followup'
AND follow_up_due != 'next_interaction'
AND follow_up_due < ?
""", (cutoff,))
db.commit()
return result.rowcount
def auto_archive_old_resolved() -> int:
"""Archive resolved memories older than 7 days."""
db = _get_db()
cutoff = (datetime.now(timezone.utc) - timedelta(days=7)).isoformat()
result = db.execute("""
UPDATE memories
SET lifecycle_state = 'archived'
WHERE lifecycle_state = 'resolved'
AND created_at < ?
""", (cutoff,))
db.commit()
return result.rowcount
def increment_surfaced_count(memory_id: str) -> int:
"""Increment surfaced_count and return new value. Auto-resolves if >= 1."""
db = _get_db()
row = db.execute(
"SELECT surfaced_count FROM memories WHERE id = ?", (memory_id,)
).fetchone()
if not row:
return 0
new_count = (row["surfaced_count"] or 0) + 1
if new_count >= 2:
# Auto-resolve: surfaced twice without user engagement
db.execute("""
UPDATE memories
SET surfaced_count = ?, lifecycle_state = 'resolved', follow_up_due = NULL
WHERE id = ?
""", (new_count, memory_id))
else:
# Update next_interaction to actual timestamp so the 48h timer starts
db.execute("""
UPDATE memories
SET surfaced_count = ?,
follow_up_due = CASE
WHEN follow_up_due = 'next_interaction' THEN ?
ELSE follow_up_due
END
WHERE id = ?
""", (new_count, _now_iso(), memory_id))
db.commit()
return new_count
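The two-strike rule above can be isolated as a pure state transition: the first surfacing keeps the memory pending (and starts the 48h expiry clock via the timestamp swap), the second resolves it. A minimal sketch with an illustrative helper name:

```python
def next_followup_state(surfaced_count: int) -> tuple[int, str]:
    """Return (new_count, lifecycle_state) after one more surfacing."""
    new_count = (surfaced_count or 0) + 1
    if new_count >= 2:
        # Surfaced twice without user engagement: give up gracefully
        return new_count, "resolved"
    # First surfacing: stay pending so the 48h expiry timer can run
    return new_count, "pending_followup"

print(next_followup_state(0))  # (1, 'pending_followup')
print(next_followup_state(1))  # (2, 'resolved')
```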
# ---------------------------------------------------------------------------
# Deduplication
# ---------------------------------------------------------------------------
def find_similar(
character_id: str,
content: str,
memory_type: str = "semantic",
threshold: float = 0.85,
) -> dict | None:
"""Find an existing memory that is semantically similar (>threshold).
Returns the matching memory dict or None."""
db = _get_db()
query_emb = get_embedding(content)
vec_rows = db.execute("""
SELECT id, distance
FROM memory_embeddings
WHERE embedding MATCH ?
AND k = 5
""", (_serialize_f32(query_emb),)).fetchall()
for vr in vec_rows:
similarity = max(0.0, 1.0 - vr["distance"])
if similarity >= threshold:
row = db.execute("""
SELECT * FROM memories
WHERE id = ? AND character_id = ? AND memory_type = ?
AND lifecycle_state = 'active'
""", (vr["id"], character_id, memory_type)).fetchone()
if row:
return _row_to_dict(row)
return None
def add_or_merge_memory(
character_id: str,
content: str,
memory_type: str | None = None,
category: str = "other",
importance: float | None = None,
privacy_level: str | None = None,
tags: list[str] | None = None,
follow_up_due: str | None = None,
follow_up_context: str | None = None,
source: str = "user_explicit",
expires_at: str | None = None,
dedup_threshold: float = 0.85,
) -> dict:
"""Add a memory, or merge with an existing similar one (semantic dedup).
For semantic memories, if a similar one exists (>threshold), update it
instead of creating a new record."""
resolved_type = memory_type or classify_memory(content)["memory_type"]
if resolved_type == "semantic":
existing = find_similar(character_id, content, "semantic", dedup_threshold)
if existing:
updated = update_memory(existing["id"], content=content)
if updated:
return updated
return add_memory(
character_id=character_id,
content=content,
memory_type=memory_type,
category=category,
importance=importance,
privacy_level=privacy_level,
tags=tags,
follow_up_due=follow_up_due,
follow_up_context=follow_up_context,
source=source,
expires_at=expires_at,
)
# ---------------------------------------------------------------------------
# Migration from JSON
# ---------------------------------------------------------------------------
# Mapping from old JSON categories to new memory types
_CATEGORY_TO_TYPE = {
"preference": "semantic",
"personal_info": "semantic",
"interaction": "episodic",
"emotional": "episodic",
"system": "semantic",
"tool_usage": "semantic",
"home_layout": "semantic",
"device": "semantic",
"routine": "semantic",
"other": "semantic",
}
_CATEGORY_TO_IMPORTANCE = {
"personal_info": 0.7,
"preference": 0.6,
"emotional": 0.5,
"interaction": 0.4,
"system": 0.4,
"tool_usage": 0.3,
"home_layout": 0.5,
"device": 0.4,
"routine": 0.5,
"other": 0.4,
}
_CATEGORY_TO_PRIVACY = {
"emotional": "sensitive",
"personal_info": "sensitive",
}
def migrate_from_json(memories_dir: str | None = None) -> dict:
"""Migrate all JSON memory files to SQLite.
Returns {migrated: int, skipped: int, errors: [str]}."""
db = _get_db()
mem_dir = Path(memories_dir) if memories_dir else MEMORIES_DIR
migrated = 0
skipped = 0
errors = []
# Migrate personal memories
personal_dir = mem_dir / "personal"
if personal_dir.exists():
for json_file in personal_dir.glob("*.json"):
try:
with open(json_file) as f:
data = json.load(f)
character_id = data.get("characterId", json_file.stem)
for mem in data.get("memories", []):
content = mem.get("content", "").strip()
if not content:
skipped += 1
continue
category = mem.get("category", "other")
created_at = mem.get("createdAt", _now_iso())
try:
add_memory(
character_id=character_id,
content=content,
memory_type=_CATEGORY_TO_TYPE.get(category, "semantic"),
category=category,
importance=_CATEGORY_TO_IMPORTANCE.get(category, 0.5),
privacy_level=_CATEGORY_TO_PRIVACY.get(category, "standard"),
source="migrated_json",
)
# Restore the original created_at on the row just inserted (migration is single-threaded, so the newest rowid is ours)
db.execute(
"UPDATE memories SET created_at = ? WHERE id = (SELECT id FROM memories ORDER BY rowid DESC LIMIT 1)",
(created_at,),
)
db.commit()
migrated += 1
except Exception as e:
errors.append(f"personal/{json_file.name}: {e}")
# Rename to backup
backup = json_file.with_suffix(".json.bak")
json_file.rename(backup)
except Exception as e:
errors.append(f"personal/{json_file.name}: {e}")
# Migrate general memories
general_file = mem_dir / "general.json"
if general_file.exists():
try:
with open(general_file) as f:
data = json.load(f)
for mem in data.get("memories", []):
content = mem.get("content", "").strip()
if not content:
skipped += 1
continue
category = mem.get("category", "other")
created_at = mem.get("createdAt", _now_iso())
try:
add_memory(
character_id="shared",
content=content,
memory_type=_CATEGORY_TO_TYPE.get(category, "semantic"),
category=category,
importance=_CATEGORY_TO_IMPORTANCE.get(category, 0.5),
privacy_level="standard",
source="migrated_json",
)
db.execute(
"UPDATE memories SET created_at = ? WHERE id = (SELECT id FROM memories ORDER BY rowid DESC LIMIT 1)",
(created_at,),
)
db.commit()
migrated += 1
except Exception as e:
errors.append(f"general.json: {e}")
backup = general_file.with_suffix(".json.bak")
general_file.rename(backup)
except Exception as e:
errors.append(f"general.json: {e}")
return {"migrated": migrated, "skipped": skipped, "errors": errors}

View File

@@ -37,6 +37,26 @@ from pathlib import Path
import wave
import io
import re
from datetime import datetime, timezone
from urllib.parse import parse_qs
from memory_store import (
init_db as init_memory_db,
retrieve_memories as _retrieve_memories,
get_pending_followups,
auto_resolve_expired_followups,
auto_archive_old_resolved,
increment_surfaced_count,
add_memory as _add_memory,
add_or_merge_memory,
update_memory as _update_memory,
delete_memory as _delete_memory,
list_memories as _list_memories,
search_memories as _search_memories,
resolve_followup,
count_memories,
migrate_from_json,
)
from wyoming.client import AsyncTcpClient
from wyoming.tts import Synthesize, SynthesizeVoice
from wyoming.asr import Transcribe, Transcript
@@ -48,7 +68,7 @@ TIMEOUT_WARM = 120 # Model already loaded in VRAM
TIMEOUT_COLD = 180 # Model needs loading first (~10-20s load + inference)
OLLAMA_PS_URL = "http://localhost:11434/api/ps"
VTUBE_BRIDGE_URL = "http://localhost:8002"
DEFAULT_MODEL = "anthropic/claude-sonnet-4-6"
def _vtube_fire_and_forget(path: str, data: dict):
@@ -85,12 +105,21 @@ SATELLITE_MAP_PATH = Path("/Users/aodhan/homeai-data/satellite-map.json")
MEMORIES_DIR = Path("/Users/aodhan/homeai-data/memories")
ACTIVE_TTS_VOICE_PATH = Path("/Users/aodhan/homeai-data/active-tts-voice.json")
ACTIVE_MODE_PATH = Path("/Users/aodhan/homeai-data/active-mode.json")
ACTIVE_STYLE_PATH = Path("/Users/aodhan/homeai-data/active-prompt-style.json")
PROMPT_STYLES_DIR = Path(__file__).parent / "prompt-styles"
# Cloud provider model mappings for mode routing (fallback when style has no model)
CLOUD_MODELS = {
"anthropic": "anthropic/claude-sonnet-4-6",
"openai": "openai/gpt-4o",
}
LOCAL_MODEL = "ollama/qwen3.5:35b-a3b"
# Lock to serialise model-switch + agent-call (openclaw config is global)
_model_lock = threading.Lock()
# Initialize memory database at module load
init_memory_db()
def load_mode() -> dict:
@@ -102,11 +131,56 @@ def load_mode() -> dict:
return {"mode": "private", "cloud_provider": "anthropic", "overrides": {}} return {"mode": "private", "cloud_provider": "anthropic", "overrides": {}}
def resolve_model(mode_data: dict) -> str:
"""Resolve which model to use based on mode."""
mode = mode_data.get("mode", "private")
if mode == "private":
return mode_data.get("local_model", LOCAL_MODEL)
provider = mode_data.get("cloud_provider", "anthropic")
return CLOUD_MODELS.get(provider, CLOUD_MODELS["anthropic"])
def load_prompt_style(style_id: str) -> dict:
"""Load a prompt style template by ID. Returns the style dict or a default."""
if not style_id:
style_id = "standard"
safe_id = style_id.replace("/", "_").replace("..", "")
style_path = PROMPT_STYLES_DIR / f"{safe_id}.json"
try:
with open(style_path) as f:
return json.load(f)
except Exception:
return {"id": "standard", "name": "Standard", "group": "cloud", "instruction": "", "strip_sections": []}
def load_active_style() -> str:
"""Load the active prompt style ID from state file. Defaults to 'standard'."""
try:
with open(ACTIVE_STYLE_PATH) as f:
data = json.load(f)
return data.get("style", "standard")
except Exception:
return "standard"
def resolve_model_for_style(style: dict, mode_data: dict) -> str:
"""Resolve model based on prompt style, falling back to mode config.
Priority: style 'model' field > group-based routing > mode default."""
mode = mode_data.get("mode", "private")
group = style.get("group", "cloud")
# Private mode always uses local model regardless of style
if mode == "private" and group == "local":
return mode_data.get("local_model", LOCAL_MODEL)
# Per-style model override (e.g. haiku for quick, opus for roleplay)
style_model = style.get("model")
if style_model:
return style_model
# Fallback: cloud model from mode config
if group == "local":
return mode_data.get("local_model", LOCAL_MODEL)
provider = mode_data.get("cloud_provider", "anthropic") provider = mode_data.get("cloud_provider", "anthropic")
return CLOUD_MODELS.get(provider, CLOUD_MODELS["anthropic"]) return CLOUD_MODELS.get(provider, CLOUD_MODELS["anthropic"])
@@ -192,31 +266,44 @@ def load_character(character_id: str = None) -> dict:
return {}
def load_character_prompt(satellite_id: str = None, character_id: str = None,
prompt_style: str = None, user_message: str = "",
is_cloud: bool = False) -> str:
"""Load the full system prompt for a character, resolved by satellite or explicit ID. """Load the full system prompt for a character, resolved by satellite or explicit ID.
Builds a rich prompt from system_prompt + profile fields (background, dialogue_style, etc.).""" Builds a rich prompt from style instruction + system_prompt + profile fields + memories.
The prompt_style controls HOW the character responds (brief, conversational, roleplay, etc.)."""
if not character_id:
character_id = resolve_character_id(satellite_id)
char = load_character(character_id)
if not char:
return ""
# Load prompt style template
style_id = prompt_style or load_active_style()
style = load_prompt_style(style_id)
strip_sections = set(style.get("strip_sections", []))
sections = []
# 1. Response style instruction (framing directive — goes first)
instruction = style.get("instruction", "")
if instruction:
sections.append(f"[Response Style: {style.get('name', style_id)}]\n{instruction}")
# 2. Core character identity (system_prompt)
prompt = char.get("system_prompt", "")
if prompt:
sections.append(prompt)
# 3. Character profile fields (filtered by style's strip_sections)
profile_parts = []
if "background" not in strip_sections and char.get("background"):
profile_parts.append(f"## Background\n{char['background']}")
if "appearance" not in strip_sections and char.get("appearance"):
profile_parts.append(f"## Appearance\n{char['appearance']}")
if "dialogue_style" not in strip_sections and char.get("dialogue_style"):
profile_parts.append(f"## Dialogue Style\n{char['dialogue_style']}")
if "skills" not in strip_sections and char.get("skills"):
skills = char["skills"]
if isinstance(skills, list):
skills_text = ", ".join(skills[:15])
@@ -226,7 +313,18 @@ if profile_parts:
if profile_parts:
sections.append("[Character Profile]\n" + "\n\n".join(profile_parts))
# 4. Per-character style overrides (optional customization per style)
style_overrides = char.get("prompt_style_overrides", {}).get(style_id, {})
if style_overrides:
override_parts = []
if style_overrides.get("dialogue_style"):
override_parts.append(f"## Dialogue Style Override\n{style_overrides['dialogue_style']}")
if style_overrides.get("system_prompt_suffix"):
override_parts.append(style_overrides["system_prompt_suffix"])
if override_parts:
sections.append("[Style-Specific Notes]\n" + "\n\n".join(override_parts))
# 5. Character metadata
meta_lines = []
if char.get("display_name"):
meta_lines.append(f"Your name is: {char['display_name']}")
@@ -243,47 +341,86 @@ if meta_lines:
if meta_lines:
sections.append("[Character Metadata]\n" + "\n".join(meta_lines))
# 6. Memories (personal + general, context-aware retrieval)
personal, general, followups = load_memories(character_id, context=user_message, is_cloud=is_cloud)
if personal:
sections.append("[Personal Memories]\n" + "\n".join(f"- {m}" for m in personal))
if general:
sections.append("[General Knowledge]\n" + "\n".join(f"- {m}" for m in general))
# 7. Pending follow-ups (things the character should naturally bring up)
if followups:
followup_lines = [
f"- {fu['follow_up_context']} (from {fu['created_at'][:10]})"
for fu in followups[:3]
]
sections.append(
"[Pending Follow-ups — Bring these up naturally if relevant]\n"
"You have unresolved topics to check on with the user. "
"Weave them into conversation naturally — don't list them. "
"If the user addresses one, use memory-ctl resolve <id> to mark it resolved.\n"
+ "\n".join(followup_lines)
)
return "\n\n".join(sections) return "\n\n".join(sections)
def _truncate_to_budget(contents: list[str], budget: int) -> list[str]:
"""Truncate a list of strings to fit within a character budget."""
result = []
used = 0
for content in contents:
if used + len(content) > budget:
break
result.append(content)
used += len(content)
return result
def load_memories(character_id: str, context: str = "", is_cloud: bool = False) -> tuple[list[str], list[str], list[dict]]:
"""Load personal and general memories using semantic + recency retrieval.
Returns (personal_contents, general_contents, pending_followups)."""
PERSONAL_BUDGET = 4000
GENERAL_BUDGET = 3000
# Check if SQLite has any memories; fall back to JSON if empty (pre-migration)
if count_memories(character_id) == 0 and count_memories("shared") == 0:
return _load_memories_json_fallback(character_id), [], []
personal_mems = _retrieve_memories(character_id, context, limit=15,
exclude_private_for_cloud=is_cloud)
general_mems = _retrieve_memories("shared", context, limit=10,
exclude_private_for_cloud=is_cloud)
followups = get_pending_followups(character_id)
personal = _truncate_to_budget([m["content"] for m in personal_mems], PERSONAL_BUDGET)
general = _truncate_to_budget([m["content"] for m in general_mems], GENERAL_BUDGET)
return personal, general, followups
def _load_memories_json_fallback(character_id: str) -> list[str]:
"""Legacy JSON fallback for pre-migration state."""
def _read(path: Path, budget: int) -> list[str]:
try:
with open(path) as f:
data = json.load(f)
except Exception:
return []
memories = data.get("memories", [])
memories.sort(key=lambda m: m.get("createdAt", ""), reverse=True)
result, used = [], 0
for m in memories:
content = m.get("content", "").strip()
if not content:
continue
if used + len(content) > 4000:
break
result.append(content)
used += len(content)
return result
safe_id = character_id.replace("/", "_")
return _read(MEMORIES_DIR / "personal" / f"{safe_id}.json", 4000)
class OpenClawBridgeHandler(BaseHTTPRequestHandler):
@@ -297,6 +434,7 @@ class OpenClawBridgeHandler(BaseHTTPRequestHandler):
"""Send a JSON response.""" """Send a JSON response."""
self.send_response(status_code) self.send_response(status_code)
self.send_header("Content-Type", "application/json") self.send_header("Content-Type", "application/json")
self.send_header("Access-Control-Allow-Origin", "*")
self.end_headers()
self.wfile.write(json.dumps(data).encode())
@@ -319,11 +457,17 @@ class OpenClawBridgeHandler(BaseHTTPRequestHandler):
self._handle_stt_request()
return
# Agent message endpoint
if parsed_path.path == "/api/agent/message":
self._handle_agent_request()
return
# Memory API: POST /api/memories/...
if parsed_path.path.startswith("/api/memories/"):
parts = parsed_path.path[len("/api/memories/"):].strip("/").split("/")
self._handle_memory_post(parts)
return
self._send_json_response(404, {"error": "Not found"}) self._send_json_response(404, {"error": "Not found"})
def _handle_tts_request(self): def _handle_tts_request(self):
@@ -399,11 +543,29 @@ class OpenClawBridgeHandler(BaseHTTPRequestHandler):
audio_bytes = resp.read()
return audio_bytes, "audio/mpeg"
def do_PUT(self):
"""Handle PUT requests (memory updates)."""
parsed_path = urlparse(self.path)
if parsed_path.path.startswith("/api/memories/"):
parts = parsed_path.path[len("/api/memories/"):].strip("/").split("/")
self._handle_memory_put(parts)
return
self._send_json_response(404, {"error": "Not found"})
def do_DELETE(self):
"""Handle DELETE requests (memory deletion)."""
parsed_path = urlparse(self.path)
if parsed_path.path.startswith("/api/memories/"):
parts = parsed_path.path[len("/api/memories/"):].strip("/").split("/")
self._handle_memory_delete(parts)
return
self._send_json_response(404, {"error": "Not found"})
def do_OPTIONS(self):
"""Handle CORS preflight requests."""
self.send_response(204)
self.send_header("Access-Control-Allow-Origin", "*")
self.send_header("Access-Control-Allow-Methods", "POST, GET, PUT, DELETE, OPTIONS")
self.send_header("Access-Control-Allow-Headers", "Content-Type")
self.end_headers()
@@ -531,19 +693,55 @@ class OpenClawBridgeHandler(BaseHTTPRequestHandler):
self._send_json_response(200, {"status": "ok", "message": "Wake word received"})
@staticmethod
def _config_set(path: str, value: str):
"""Set an OpenClaw config value."""
subprocess.run(
["/opt/homebrew/bin/openclaw", "config", "set", path, value],
capture_output=True, text=True, timeout=5,
)
@staticmethod
def _call_openclaw(message: str, agent: str, timeout: int,
model: str = None, session_id: str = None,
params: dict = None, thinking: str = None) -> str:
"""Call OpenClaw CLI and return stdout.
Temporarily switches the gateway's primary model and inference params
via `openclaw config set`, protected by _model_lock to prevent races."""
cmd = ["/opt/homebrew/bin/openclaw", "agent", "--message", message, "--agent", agent]
if session_id:
cmd.extend(["--session-id", session_id])
if thinking:
cmd.extend(["--thinking", thinking])
with _model_lock:
if model:
OpenClawBridgeHandler._config_set(
"agents.defaults.model.primary", model)
# Set per-style temperature if provided
temp_path = None
if model and params and params.get("temperature") is not None:
temp_path = f'agents.defaults.models["{model}"].params.temperature'
OpenClawBridgeHandler._config_set(
temp_path, str(params["temperature"]))
try:
result = subprocess.run(
cmd,
capture_output=True,
text=True,
timeout=timeout,
check=True,
)
return result.stdout.strip()
finally:
# Restore defaults
if model and model != DEFAULT_MODEL:
OpenClawBridgeHandler._config_set(
"agents.defaults.model.primary", DEFAULT_MODEL)
if temp_path:
# Restore to neutral default
OpenClawBridgeHandler._config_set(temp_path, "0.5")
@staticmethod
def _needs_followup(response: str) -> bool:
@@ -588,6 +786,8 @@ class OpenClawBridgeHandler(BaseHTTPRequestHandler):
agent = data.get("agent", "main") agent = data.get("agent", "main")
satellite_id = data.get("satellite_id") satellite_id = data.get("satellite_id")
explicit_character_id = data.get("character_id") explicit_character_id = data.get("character_id")
requested_style = data.get("prompt_style")
conversation_id = data.get("conversation_id")
if not message:
self._send_json_response(400, {"error": "Message is required"})
@@ -598,10 +798,28 @@ class OpenClawBridgeHandler(BaseHTTPRequestHandler):
character_id = explicit_character_id
else:
character_id = resolve_character_id(satellite_id)
# Resolve prompt style: explicit > character default > global active
char = load_character(character_id)
style_id = requested_style or char.get("default_prompt_style") or load_active_style()
style = load_prompt_style(style_id)
print(f"[OpenClaw Bridge] Prompt style: {style.get('name', style_id)} ({style.get('group', 'cloud')})")
# Determine if routing to cloud (for privacy filtering)
mode_data = load_mode()
active_model = resolve_model_for_style(style, mode_data)
is_cloud = style.get("group", "cloud") == "cloud" and mode_data.get("mode") != "private"
system_prompt = load_character_prompt(
character_id=character_id, prompt_style=style_id,
user_message=message, is_cloud=is_cloud,
)
# Run lifecycle maintenance (cheap SQL updates)
auto_resolve_expired_followups()
auto_archive_old_resolved()
# Set the active TTS config for the Wyoming server to pick up
tts_config = char.get("tts", {})
if tts_config:
set_active_tts_voice(character_id, tts_config)
@@ -616,14 +834,30 @@ class OpenClawBridgeHandler(BaseHTTPRequestHandler):
if system_prompt:
message = f"System Context: {system_prompt}\n\nUser Request: {message}"
group = style.get("group", "cloud")
print(f"[OpenClaw Bridge] Routing: {group.upper()} → {active_model}")
# Resolve session ID for OpenClaw thread isolation
# Dashboard chats: use conversation_id (each "New Chat" = fresh thread)
# Satellites: use rotating 12-hour bucket so old context expires naturally
if conversation_id:
session_id = conversation_id
elif satellite_id:
now = datetime.now(timezone.utc)
half = "am" if now.hour < 12 else "pm"
session_id = f"sat_{satellite_id}_{now.strftime('%Y%m%d')}_{half}"
else:
# API call with no conversation or satellite — use a transient session
session_id = f"api_{int(datetime.now(timezone.utc).timestamp())}"
print(f"[OpenClaw Bridge] Session: {session_id}")
# Extract style inference params (temperature, etc.) and thinking level
style_params = style.get("params", {})
style_thinking = style.get("thinking")
if style_params:
print(f"[OpenClaw Bridge] Style params: {style_params}")
if style_thinking:
print(f"[OpenClaw Bridge] Thinking: {style_thinking}")
# Check if model is warm to set appropriate timeout
warm = is_model_warm()
@@ -635,7 +869,7 @@ class OpenClawBridgeHandler(BaseHTTPRequestHandler):
# Call OpenClaw CLI (use full path for launchd compatibility)
try:
    response_text = self._call_openclaw(message, agent, timeout, model=active_model, session_id=session_id, params=style_params, thinking=style_thinking)
# Re-prompt if the model promised to act but didn't call a tool.
# Detect "I'll do X" / "Let me X" responses that lack any result.
@@ -645,11 +879,19 @@ class OpenClawBridgeHandler(BaseHTTPRequestHandler):
"You just said you would do something but didn't actually call the exec tool. "
"Do NOT explain what you will do — call the tool NOW using exec and return the result."
)
response_text = self._call_openclaw(followup, agent, timeout, model=active_model, session_id=session_id, params=style_params, thinking=style_thinking)
# Increment surfaced_count on follow-ups that were injected into prompt
try:
followups = get_pending_followups(character_id)
for fu in followups[:3]:
increment_surfaced_count(fu["id"])
except Exception as e:
print(f"[OpenClaw Bridge] Follow-up tracking error: {e}")
# Signal avatar: idle (TTS handler will override to 'speaking' if voice is used)
_vtube_fire_and_forget("/expression", {"event": "idle"})
self._send_json_response(200, {"response": response_text, "model": active_model, "prompt_style": style_id})
except subprocess.TimeoutExpired:
    self._send_json_response(504, {"error": f"OpenClaw command timed out after {timeout}s (model was {'warm' if warm else 'cold'})"})
except subprocess.CalledProcessError as e:
@@ -660,18 +902,174 @@ class OpenClawBridgeHandler(BaseHTTPRequestHandler):
except Exception as e:
    self._send_json_response(500, {"error": str(e)})
# ------------------------------------------------------------------
# Memory REST API
# ------------------------------------------------------------------
def _read_json_body(self) -> dict | None:
"""Read and parse JSON body from request. Returns None on error (response already sent)."""
content_length = int(self.headers.get("Content-Length", 0))
if content_length == 0:
self._send_json_response(400, {"error": "Empty body"})
return None
try:
return json.loads(self.rfile.read(content_length).decode())
except json.JSONDecodeError:
self._send_json_response(400, {"error": "Invalid JSON"})
return None
def _handle_memory_get(self, path_parts: list[str], query_params: dict):
"""Handle GET /api/memories/..."""
# GET /api/memories/general
if len(path_parts) == 1 and path_parts[0] == "general":
limit = int(query_params.get("limit", ["50"])[0])
offset = int(query_params.get("offset", ["0"])[0])
memory_type = query_params.get("type", [None])[0]
lifecycle = query_params.get("lifecycle", [None])[0]
category = query_params.get("category", [None])[0]
memories = _list_memories("shared", memory_type=memory_type,
lifecycle_state=lifecycle, category=category,
limit=limit, offset=offset)
self._send_json_response(200, {"memories": memories})
return
if len(path_parts) < 1:
self._send_json_response(400, {"error": "Character ID required"})
return
char_id = path_parts[0]
# GET /api/memories/:characterId/followups
if len(path_parts) == 2 and path_parts[1] == "followups":
followups = get_pending_followups(char_id)
self._send_json_response(200, {"followups": followups})
return
# GET /api/memories/:characterId
limit = int(query_params.get("limit", ["50"])[0])
offset = int(query_params.get("offset", ["0"])[0])
memory_type = query_params.get("type", [None])[0]
lifecycle = query_params.get("lifecycle", [None])[0]
category = query_params.get("category", [None])[0]
query = query_params.get("q", [None])[0]
if query:
memories = _search_memories(char_id, query, memory_type=memory_type, limit=limit)
else:
memories = _list_memories(char_id, memory_type=memory_type,
lifecycle_state=lifecycle, category=category,
limit=limit, offset=offset)
self._send_json_response(200, {"memories": memories, "characterId": char_id})
def _handle_memory_post(self, path_parts: list[str]):
"""Handle POST /api/memories/..."""
data = self._read_json_body()
if data is None:
return
# POST /api/memories/migrate
if len(path_parts) == 1 and path_parts[0] == "migrate":
result = migrate_from_json()
self._send_json_response(200, result)
return
# POST /api/memories/:memoryId/resolve
if len(path_parts) == 2 and path_parts[1] == "resolve":
ok = resolve_followup(path_parts[0])
self._send_json_response(200 if ok else 404,
{"ok": ok, "id": path_parts[0]})
return
# POST /api/memories/general — add general memory
if len(path_parts) == 1 and path_parts[0] == "general":
content = data.get("content", "").strip()
if not content:
self._send_json_response(400, {"error": "content is required"})
return
mem = add_or_merge_memory(
character_id="shared",
content=content,
memory_type=data.get("memory_type"),
category=data.get("category", "other"),
importance=data.get("importance"),
privacy_level=data.get("privacy_level"),
tags=data.get("tags"),
source=data.get("source", "dashboard"),
)
self._send_json_response(200, {"ok": True, "memory": mem})
return
# POST /api/memories/:characterId — add personal memory
if len(path_parts) == 1:
char_id = path_parts[0]
content = data.get("content", "").strip()
if not content:
self._send_json_response(400, {"error": "content is required"})
return
mem = add_or_merge_memory(
character_id=char_id,
content=content,
memory_type=data.get("memory_type"),
category=data.get("category", "other"),
importance=data.get("importance"),
privacy_level=data.get("privacy_level"),
tags=data.get("tags"),
follow_up_due=data.get("follow_up_due"),
follow_up_context=data.get("follow_up_context"),
source=data.get("source", "dashboard"),
)
self._send_json_response(200, {"ok": True, "memory": mem})
return
self._send_json_response(404, {"error": "Not found"})
def _handle_memory_put(self, path_parts: list[str]):
"""Handle PUT /api/memories/:memoryId — update a memory."""
if len(path_parts) != 1:
self._send_json_response(400, {"error": "Memory ID required"})
return
data = self._read_json_body()
if data is None:
return
mem = _update_memory(path_parts[0], **data)
if mem:
self._send_json_response(200, {"ok": True, "memory": mem})
else:
self._send_json_response(404, {"error": "Memory not found"})
def _handle_memory_delete(self, path_parts: list[str]):
"""Handle DELETE /api/memories/:memoryId."""
if len(path_parts) != 1:
self._send_json_response(400, {"error": "Memory ID required"})
return
ok = _delete_memory(path_parts[0])
self._send_json_response(200 if ok else 404, {"ok": ok, "id": path_parts[0]})
# ------------------------------------------------------------------
# HTTP method dispatchers
# ------------------------------------------------------------------
def do_GET(self):
"""Handle GET requests."""
parsed_path = urlparse(self.path)
path = parsed_path.path
if path == "/status" or path == "/":
self._send_json_response(200, {
    "status": "ok",
    "service": "OpenClaw HTTP Bridge",
    "version": "2.0.0"
})
return
# Memory API: GET /api/memories/...
if path.startswith("/api/memories/"):
parts = path[len("/api/memories/"):].strip("/").split("/")
query_params = parse_qs(parsed_path.query)
self._handle_memory_get(parts, query_params)
return
self._send_json_response(404, {"error": "Not found"})
class ThreadingHTTPServer(ThreadingMixIn, HTTPServer):

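For reference, the memory endpoints registered above can be driven from any HTTP client. A minimal sketch of the request shapes using only the standard library; the bridge's bind address is an assumption (it is not shown in this hunk):

```python
import json
from urllib.parse import urlencode

BRIDGE = "http://localhost:8765"  # assumed bridge address — not specified in this diff

def memories_url(character_id: str, **filters) -> str:
    # GET /api/memories/:characterId supports q, type, lifecycle, category, limit, offset
    qs = urlencode({k: v for k, v in filters.items() if v is not None})
    return f"{BRIDGE}/api/memories/{character_id}" + (f"?{qs}" if qs else "")

def add_memory_body(content: str, **fields) -> str:
    # POST body for /api/memories/:characterId — the handler 400s on empty content
    if not content.strip():
        raise ValueError("content is required")
    return json.dumps({"content": content, **fields})
```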

@@ -0,0 +1,13 @@
{
"id": "creative",
"name": "Creative",
"group": "cloud",
"model": "anthropic/claude-sonnet-4-6",
"description": "In-depth answers, longer conversational responses",
"thinking": "low",
"params": {
"temperature": 0.7
},
"instruction": "Give thorough, in-depth answers. Respond at whatever length the topic requires — short for simple things, long for complex ones. Be conversational and engaging, like a knowledgeable friend. Vary your sentence structure and word choice to keep things interesting. Do not use roleplay actions or narration. If a topic has interesting depth worth exploring, offer to continue. This mode is for rich conversation, not commands.",
"strip_sections": []
}


@@ -0,0 +1,13 @@
{
"id": "game-master",
"name": "Game Master",
"group": "cloud",
"model": "anthropic/claude-opus-4-6",
"description": "Second-person interactive narration with user as participant",
"thinking": "off",
"params": {
"temperature": 0.9
},
"instruction": "Narrate in second person — the user is the subject experiencing the scene. Describe what they see, hear, and feel with vivid, varied language. Write your character's dialogue in quotes and their actions in prose. After describing the scene or an interaction, prompt the user for their next action. Keep the user engaged as an active participant. Balance rich description with opportunities for user agency. Avoid repeating descriptive patterns — each scene should feel fresh and unpredictable. This is a 2nd-person interactive experience.",
"strip_sections": []
}


@@ -0,0 +1,13 @@
{
"id": "quick",
"name": "Quick",
"group": "cloud",
"model": "anthropic/claude-haiku-4-5-20251001",
"description": "Brief responses for commands and quick questions",
"thinking": "off",
"params": {
"temperature": 0.15
},
"instruction": "RESPONSE RULES — STRICT:\n- Respond as briefly as possible. For smart home commands, confirm with 1-3 words (\"Done.\", \"Lights on.\", \"Playing jazz.\").\n- For factual questions, give the shortest correct answer. One sentence max.\n- No small talk, no elaboration, no follow-up questions unless the request is genuinely ambiguous.\n- Never describe your actions, emotions, or thought process.\n- Never add flair, personality, or creative embellishments — be a reliable, predictable tool.\n- If a tool call is needed, execute it and report the result. Nothing else.",
"strip_sections": ["background", "appearance", "dialogue_style"]
}


@@ -0,0 +1,13 @@
{
"id": "roleplayer",
"name": "Roleplayer",
"group": "cloud",
"model": "anthropic/claude-opus-4-6",
"description": "First-person roleplay with character actions and expressions",
"thinking": "off",
"params": {
"temperature": 0.85
},
"instruction": "Respond entirely in first person as your character. Use action descriptions enclosed in asterisks (*adjusts glasses*, *leans forward thoughtfully*) to convey body language, emotions, and physical actions. Stay fully in character at all times — your personality, speech patterns, and mannerisms should be consistent with your character profile. React emotionally and physically to what the user says. Vary your expressions, gestures, and phrasings — never repeat the same actions or sentence structures. Surprise the user with unexpected but in-character reactions. This is an immersive 1st-person interaction.",
"strip_sections": []
}


@@ -0,0 +1,13 @@
{
"id": "standard",
"name": "Standard",
"group": "cloud",
"model": "anthropic/claude-sonnet-4-6",
"description": "Conversational responses, concise but informative",
"thinking": "off",
"params": {
"temperature": 0.4
},
"instruction": "Respond naturally and conversationally. Be concise but informative — a few sentences is ideal. Do not use roleplay actions, narration, or describe your expressions/body language. Treat the interaction as a chat, not a performance. Stay helpful, on-topic, and consistent. Prioritise clarity and accuracy over flair.",
"strip_sections": []
}


@@ -0,0 +1,13 @@
{
"id": "storyteller",
"name": "Storyteller",
"group": "cloud",
"model": "anthropic/claude-opus-4-6",
"description": "Third-person narrative with periodic reader check-ins",
"thinking": "off",
"params": {
"temperature": 0.95
},
"instruction": "Narrate in third person as a storyteller. Describe scenes, character actions, dialogue, and atmosphere as a novelist would. Your character should be written about, not speaking as themselves directly to the user. Write rich, evocative prose with varied vocabulary, rhythm, and imagery. Avoid formulaic descriptions — each passage should have its own texture and mood. Periodically check in with the reader about story direction. The user drives the direction but you drive the narrative between check-ins. This is a 3rd-person storytelling experience.",
"strip_sections": []
}

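The six style files share one shape: `model` picks the Claude tier, `params` carries sampling settings, and `strip_sections` trims the character prompt. A sketch of how a resolver might consume them — `resolve_model_for_style`'s real body is not shown in this diff, so the private-mode fallback key and name here are assumptions:

```python
def resolve_model_for_style(style: dict, mode_data: dict) -> str:
    # Private mode pins routing to a local model regardless of the style's tier;
    # otherwise the style's own model (Haiku/Sonnet/Opus) wins.
    if mode_data.get("mode") == "private":
        return mode_data.get("local_model", "local-default")  # assumed fallback key
    return style["model"]

quick = {"id": "quick", "group": "cloud", "model": "anthropic/claude-haiku-4-5-20251001"}
```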

@@ -26,6 +26,8 @@
<string>/Users/aodhan</string>
<key>GAZE_API_KEY</key>
<string>e63401f17e4845e1059f830267f839fe7fc7b6083b1cb1730863318754d799f4</string> <string>e63401f17e4845e1059f830267f839fe7fc7b6083b1cb1730863318754d799f4</string>
<key>HA_TOKEN</key>
<string></string>
</dict>
<key>RunAtLoad</key>


@@ -46,6 +46,16 @@
}
},
"dream_id": {
"type": "string",
"description": "Linked Dream character ID for syncing character data and images"
},
"gaze_character": {
"type": "string",
"description": "Linked GAZE character_id for auto-assigned cover image and default image generation preset"
},
"gaze_presets": {
  "type": "array",
  "description": "GAZE image generation presets with trigger conditions",
@@ -72,7 +82,25 @@
}
},
"notes": { "type": "string" },
"default_prompt_style": {
"type": "string",
"description": "Default prompt style for this character (quick, standard, creative, roleplayer, game-master, storyteller). Overrides global active style when this character is active.",
"enum": ["", "quick", "standard", "creative", "roleplayer", "game-master", "storyteller"]
},
"prompt_style_overrides": {
"type": "object",
"description": "Per-style customizations for this character. Keys are style IDs, values contain override fields.",
"additionalProperties": {
"type": "object",
"properties": {
"dialogue_style": { "type": "string", "description": "Override dialogue style for this prompt style" },
"system_prompt_suffix": { "type": "string", "description": "Additional instructions appended for this prompt style" }
}
}
}
},
"additionalProperties": true
}

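Taken together, the new schema fields let a character pin its own style and patch individual styles. A hypothetical character entry illustrating both fields (the name and override text are invented for illustration):

```python
character = {
    "name": "Nerys",  # hypothetical character
    "default_prompt_style": "roleplayer",
    "prompt_style_overrides": {
        "quick": {
            "dialogue_style": "Clipped, one-line answers.",
            "system_prompt_suffix": "Never exceed one sentence.",
        }
    },
}

def effective_style(char: dict, global_style: str) -> str:
    # Per the schema: a character's default overrides the global active style when set
    return char.get("default_prompt_style") or global_style
```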

@@ -1,14 +1,16 @@
import { useState, useCallback, useEffect } from 'react';
import { BrowserRouter, Routes, Route, NavLink, useLocation } from 'react-router-dom';
import Dashboard from './pages/Dashboard';
import Chat from './pages/Chat';
import Characters from './pages/Characters';
import Editor from './pages/Editor';
import Memories from './pages/Memories';
function NavItem({ to, children, icon, onClick }) {
  return (
    <NavLink
      to={to}
      onClick={onClick}
      className={({ isActive }) =>
        `flex items-center gap-3 px-4 py-2.5 rounded-lg text-sm font-medium transition-colors ${
          isActive
@@ -24,22 +26,78 @@ function NavItem({ to, children, icon }) {
}
function Layout({ children }) {
const [sidebarOpen, setSidebarOpen] = useState(false)
const location = useLocation()
// Close sidebar on route change (mobile)
useEffect(() => {
setSidebarOpen(false)
}, [location.pathname])
const closeSidebar = useCallback(() => setSidebarOpen(false), [])
return (
  <div className="h-screen bg-gray-950 flex overflow-hidden">
{/* Mobile header bar */}
<div className="fixed top-0 left-0 right-0 z-30 flex items-center gap-3 px-4 py-3 bg-gray-900/95 backdrop-blur border-b border-gray-800 md:hidden">
<button
onClick={() => setSidebarOpen(true)}
className="p-1.5 text-gray-400 hover:text-white transition-colors"
aria-label="Open menu"
>
<svg className="w-6 h-6" fill="none" viewBox="0 0 24 24" stroke="currentColor" strokeWidth={2}>
<path strokeLinecap="round" strokeLinejoin="round" d="M3.75 6.75h16.5M3.75 12h16.5m-16.5 5.25h16.5" />
</svg>
</button>
<div className="flex items-center gap-2">
<div className="w-7 h-7 rounded-md bg-gradient-to-br from-indigo-500 to-purple-600 flex items-center justify-center">
<svg className="w-4 h-4 text-white" fill="none" viewBox="0 0 24 24" stroke="currentColor" strokeWidth={2}>
<path strokeLinecap="round" strokeLinejoin="round" d="M2.25 12l8.954-8.955c.44-.439 1.152-.439 1.591 0L21.75 12M4.5 9.75v10.125c0 .621.504 1.125 1.125 1.125H9.75v-4.875c0-.621.504-1.125 1.125-1.125h2.25c.621 0 1.125.504 1.125 1.125V21h4.125c.621 0 1.125-.504 1.125-1.125V9.75M8.25 21h8.25" />
</svg>
</div>
<span className="text-sm font-bold text-white">HomeAI</span>
</div>
</div>
{/* Mobile backdrop */}
{sidebarOpen && (
<div
className="fixed inset-0 bg-black/60 z-40 md:hidden"
onClick={closeSidebar}
/>
)}
{/* Sidebar */}
<aside className={`
fixed inset-y-0 left-0 z-50 w-64 bg-gray-900 border-r border-gray-800 flex flex-col shrink-0
transform transition-transform duration-200 ease-out
${sidebarOpen ? 'translate-x-0' : '-translate-x-full'}
md:static md:translate-x-0
`}>
{/* Logo */}
<div className="px-6 py-5 border-b border-gray-800">
  <div className="flex items-center justify-between">
    <div className="flex items-center gap-3">
      <div className="w-9 h-9 rounded-lg bg-gradient-to-br from-indigo-500 to-purple-600 flex items-center justify-center">
        <svg className="w-5 h-5 text-white" fill="none" viewBox="0 0 24 24" stroke="currentColor" strokeWidth={2}>
          <path strokeLinecap="round" strokeLinejoin="round" d="M2.25 12l8.954-8.955c.44-.439 1.152-.439 1.591 0L21.75 12M4.5 9.75v10.125c0 .621.504 1.125 1.125 1.125H9.75v-4.875c0-.621.504-1.125 1.125-1.125h2.25c.621 0 1.125.504 1.125 1.125V21h4.125c.621 0 1.125-.504 1.125-1.125V9.75M8.25 21h8.25" />
        </svg>
      </div>
      <div>
        <h1 className="text-lg font-bold text-white tracking-tight">HomeAI</h1>
        <p className="text-xs text-gray-500">LINDBLUM</p>
      </div>
    </div>
    {/* Close button on mobile */}
    <button
      onClick={closeSidebar}
      className="p-1 text-gray-500 hover:text-white md:hidden"
      aria-label="Close menu"
    >
      <svg className="w-5 h-5" fill="none" viewBox="0 0 24 24" stroke="currentColor" strokeWidth={2}>
        <path strokeLinecap="round" strokeLinejoin="round" d="M6 18L18 6M6 6l12 12" />
      </svg>
    </button>
  </div>
</div>
@@ -47,6 +105,7 @@ function Layout({ children }) {
<nav className="flex-1 px-3 py-4 space-y-1">
  <NavItem
    to="/"
    onClick={closeSidebar}
    icon={
      <svg className="w-5 h-5" fill="none" viewBox="0 0 24 24" stroke="currentColor" strokeWidth={1.5}>
        <path strokeLinecap="round" strokeLinejoin="round" d="M3.75 6A2.25 2.25 0 016 3.75h2.25A2.25 2.25 0 0110.5 6v2.25a2.25 2.25 0 01-2.25 2.25H6a2.25 2.25 0 01-2.25-2.25V6zM3.75 15.75A2.25 2.25 0 016 13.5h2.25a2.25 2.25 0 012.25 2.25V18a2.25 2.25 0 01-2.25 2.25H6A2.25 2.25 0 013.75 18v-2.25zM13.5 6a2.25 2.25 0 012.25-2.25H18A2.25 2.25 0 0120.25 6v2.25A2.25 2.25 0 0118 10.5h-2.25a2.25 2.25 0 01-2.25-2.25V6zM13.5 15.75a2.25 2.25 0 012.25-2.25H18a2.25 2.25 0 012.25 2.25V18A2.25 2.25 0 0118 20.25h-2.25A2.25 2.25 0 0113.5 18v-2.25z" />
@@ -58,6 +117,7 @@ function Layout({ children }) {
<NavItem
  to="/chat"
  onClick={closeSidebar}
  icon={
    <svg className="w-5 h-5" fill="none" viewBox="0 0 24 24" stroke="currentColor" strokeWidth={1.5}>
      <path strokeLinecap="round" strokeLinejoin="round" d="M8.625 12a.375.375 0 11-.75 0 .375.375 0 01.75 0zm0 0H8.25m4.125 0a.375.375 0 11-.75 0 .375.375 0 01.75 0zm0 0H12m4.125 0a.375.375 0 11-.75 0 .375.375 0 01.75 0zm0 0h-.375M21 12c0 4.556-4.03 8.25-9 8.25a9.764 9.764 0 01-2.555-.337A5.972 5.972 0 015.41 20.97a5.969 5.969 0 01-.474-.065 4.48 4.48 0 00.978-2.025c.09-.457-.133-.901-.467-1.226C3.93 16.178 3 14.189 3 12c0-4.556 4.03-8.25 9-8.25s9 3.694 9 8.25z" />
@@ -69,6 +129,7 @@ function Layout({ children }) {
<NavItem
  to="/characters"
  onClick={closeSidebar}
  icon={
    <svg className="w-5 h-5" fill="none" viewBox="0 0 24 24" stroke="currentColor" strokeWidth={1.5}>
      <path strokeLinecap="round" strokeLinejoin="round" d="M15.75 6a3.75 3.75 0 11-7.5 0 3.75 3.75 0 017.5 0zM4.501 20.118a7.5 7.5 0 0114.998 0A17.933 17.933 0 0112 21.75c-2.676 0-5.216-.584-7.499-1.632z" />
@@ -80,6 +141,7 @@ function Layout({ children }) {
<NavItem
  to="/memories"
  onClick={closeSidebar}
  icon={
    <svg className="w-5 h-5" fill="none" viewBox="0 0 24 24" stroke="currentColor" strokeWidth={1.5}>
      <path strokeLinecap="round" strokeLinejoin="round" d="M12 18v-5.25m0 0a6.01 6.01 0 001.5-.189m-1.5.189a6.01 6.01 0 01-1.5-.189m3.75 7.478a12.06 12.06 0 01-4.5 0m3.75 2.383a14.406 14.406 0 01-3 0M14.25 18v-.192c0-.983.658-1.823 1.508-2.316a7.5 7.5 0 10-7.517 0c.85.493 1.509 1.333 1.509 2.316V18" />
@@ -91,6 +153,7 @@ function Layout({ children }) {
<NavItem
  to="/editor"
  onClick={closeSidebar}
  icon={
    <svg className="w-5 h-5" fill="none" viewBox="0 0 24 24" stroke="currentColor" strokeWidth={1.5}>
      <path strokeLinecap="round" strokeLinejoin="round" d="M9.594 3.94c.09-.542.56-.94 1.11-.94h2.593c.55 0 1.02.398 1.11.94l.213 1.281c.063.374.313.686.645.87.074.04.147.083.22.127.324.196.72.257 1.075.124l1.217-.456a1.125 1.125 0 011.37.49l1.296 2.247a1.125 1.125 0 01-.26 1.431l-1.003.827c-.293.24-.438.613-.431.992a6.759 6.759 0 010 .255c-.007.378.138.75.43.99l1.005.828c.424.35.534.954.26 1.43l-1.298 2.247a1.125 1.125 0 01-1.369.491l-1.217-.456c-.355-.133-.75-.072-1.076.124a6.57 6.57 0 01-.22.128c-.331.183-.581.495-.644.869l-.213 1.28c-.09.543-.56.941-1.11.941h-2.594c-.55 0-1.02-.398-1.11-.94l-.213-1.281c-.062-.374-.312-.686-.644-.87a6.52 6.52 0 01-.22-.127c-.325-.196-.72-.257-1.076-.124l-1.217.456a1.125 1.125 0 01-1.369-.49l-1.297-2.247a1.125 1.125 0 01.26-1.431l1.004-.827c.292-.24.437-.613.43-.992a6.932 6.932 0 010-.255c.007-.378-.138-.75-.43-.99l-1.004-.828a1.125 1.125 0 01-.26-1.43l1.297-2.247a1.125 1.125 0 011.37-.491l1.216.456c.356.133.751.072 1.076-.124.072-.044.146-.087.22-.128.332-.183.582-.495.644-.869l.214-1.281z" />
@@ -109,8 +172,8 @@ function Layout({ children }) {
  </div>
</aside>

{/* Main content — add top padding on mobile for the header bar */}
<main className="flex-1 overflow-hidden flex flex-col pt-14 md:pt-0">
  {children}
</main>
</div>
@@ -122,11 +185,11 @@ function App() {
<BrowserRouter>
  <Layout>
    <Routes>
      <Route path="/" element={<div className="flex-1 overflow-y-auto p-4 md:p-8"><div className="max-w-6xl mx-auto"><Dashboard /></div></div>} />
      <Route path="/chat" element={<Chat />} />
      <Route path="/characters" element={<div className="flex-1 overflow-y-auto p-4 md:p-8"><div className="max-w-6xl mx-auto"><Characters /></div></div>} />
      <Route path="/memories" element={<div className="flex-1 overflow-y-auto p-4 md:p-8"><div className="max-w-6xl mx-auto"><Memories /></div></div>} />
      <Route path="/editor" element={<div className="flex-1 overflow-y-auto p-4 md:p-8"><div className="max-w-6xl mx-auto"><Editor /></div></div>} />
    </Routes>
  </Layout>
</BrowserRouter>


@@ -2,7 +2,7 @@ import { useEffect, useRef } from 'react'
import MessageBubble from './MessageBubble'
import ThinkingIndicator from './ThinkingIndicator'
export default function ChatPanel({ messages, isLoading, onReplay, onRetry, character }) {
  const bottomRef = useRef(null)
  const name = character?.name || 'AI'
  const image = character?.image || null
@@ -32,7 +32,7 @@ export default function ChatPanel({ messages, isLoading, onReplay, character })
return (
  <div className="flex-1 overflow-y-auto py-4">
    {messages.map((msg) => (
      <MessageBubble key={msg.id} message={msg} onReplay={onReplay} onRetry={onRetry} character={character} />
    ))}
    {isLoading && <ThinkingIndicator character={character} />}
    <div ref={bottomRef} />


@@ -10,61 +10,95 @@ function timeAgo(dateStr) {
  return `${days}d ago`
}
export default function ConversationList({ conversations, activeId, onCreate, onSelect, onDelete, isOpen, onToggle }) {
  return (
    <>
      {/* Mobile toggle button */}
      <button
        onClick={onToggle}
        className="md:hidden absolute left-2 top-2 z-10 p-2 text-gray-400 hover:text-white bg-gray-900/80 rounded-lg border border-gray-800"
        aria-label="Toggle conversations"
        title="Conversations"
      >
        <svg className="w-5 h-5" fill="none" viewBox="0 0 24 24" stroke="currentColor" strokeWidth={1.5}>
          <path strokeLinecap="round" strokeLinejoin="round" d="M20.25 8.511c.884.284 1.5 1.128 1.5 2.097v4.286c0 1.136-.847 2.1-1.98 2.193-.34.027-.68.052-1.02.072v3.091l-3-3c-1.354 0-2.694-.055-4.02-.163a2.115 2.115 0 01-.825-.242m9.345-8.334a2.126 2.126 0 00-.476-.095 48.64 48.64 0 00-8.048 0c-1.131.094-1.976 1.057-1.976 2.192v4.286c0 .837.46 1.58 1.155 1.951m9.345-8.334V6.637c0-1.621-1.152-3.026-2.76-3.235A48.455 48.455 0 0011.25 3c-2.115 0-4.198.137-6.24.402-1.608.209-2.76 1.614-2.76 3.235v6.226c0 1.621 1.152 3.026 2.76 3.235.577.075 1.157.14 1.74.194V21l4.155-4.155" />
        </svg>
      </button>

      {/* Mobile backdrop */}
      {isOpen && (
        <div className="fixed inset-0 bg-black/50 z-20 md:hidden" onClick={onToggle} />
      )}

      {/* Conversation panel */}
      <div className={`
        fixed inset-y-0 left-0 z-30 w-72 bg-gray-950 border-r border-gray-800 flex flex-col
        transform transition-transform duration-200 ease-out
        ${isOpen ? 'translate-x-0' : '-translate-x-full'}
        md:static md:translate-x-0 md:shrink-0
      `}>
        {/* Header with close on mobile */}
        <div className="p-3 border-b border-gray-800 flex items-center gap-2">
          <button
            onClick={onCreate}
            className="flex-1 flex items-center justify-center gap-2 px-3 py-2.5 bg-indigo-600 hover:bg-indigo-500 text-white text-sm rounded-lg transition-colors"
          >
            <svg className="w-4 h-4" fill="none" viewBox="0 0 24 24" stroke="currentColor" strokeWidth={2}>
<div className="flex items-center gap-2 mt-0.5"> <path strokeLinecap="round" strokeLinejoin="round" d="M12 4.5v15m7.5-7.5h-15" />
{conv.characterName && ( </svg>
<span className="text-xs text-indigo-400/70">{conv.characterName}</span> New chat
)} </button>
<span className="text-xs text-gray-600">{timeAgo(conv.updatedAt)}</span> <button
</div> onClick={onToggle}
</div> className="p-2 text-gray-500 hover:text-white md:hidden"
<button aria-label="Close"
onClick={(e) => { e.stopPropagation(); onDelete(conv.id) }} >
className="opacity-0 group-hover:opacity-100 p-1 text-gray-500 hover:text-red-400 transition-all shrink-0 mt-0.5" <svg className="w-5 h-5" fill="none" viewBox="0 0 24 24" stroke="currentColor" strokeWidth={2}>
title="Delete" <path strokeLinecap="round" strokeLinejoin="round" d="M6 18L18 6M6 6l12 12" />
</svg>
</button>
</div>
{/* Conversation list */}
<div className="flex-1 overflow-y-auto">
{conversations.length === 0 ? (
<p className="text-xs text-gray-600 text-center py-6">No conversations yet</p>
) : (
conversations.map(conv => (
<div
key={conv.id}
onClick={() => { onSelect(conv.id); if (onToggle) onToggle() }}
className={`group flex items-start gap-2 px-3 py-2.5 cursor-pointer border-b border-gray-800/50 transition-colors ${
conv.id === activeId
? 'bg-gray-800 text-white'
: 'text-gray-400 hover:bg-gray-800/50 hover:text-gray-200'
}`}
> >
<svg className="w-3.5 h-3.5" fill="none" viewBox="0 0 24 24" stroke="currentColor" strokeWidth={2}> <div className="flex-1 min-w-0">
<path strokeLinecap="round" strokeLinejoin="round" d="M14.74 9l-.346 9m-4.788 0L9.26 9m9.968-3.21c.342.052.682.107 1.022.166m-1.022-.165L18.16 19.673a2.25 2.25 0 01-2.244 2.077H8.084a2.25 2.25 0 01-2.244-2.077L4.772 5.79m14.456 0a48.108 48.108 0 00-3.478-.397m-12 .562c.34-.059.68-.114 1.022-.165m0 0a48.11 48.11 0 013.478-.397m7.5 0v-.916c0-1.18-.91-2.164-2.09-2.201a51.964 51.964 0 00-3.32 0c-1.18.037-2.09 1.022-2.09 2.201v.916m7.5 0a48.667 48.667 0 00-7.5 0" /> <p className="text-sm truncate">
</svg> {conv.title || 'New conversation'}
</button> </p>
</div> <div className="flex items-center gap-2 mt-0.5">
)) {conv.characterName && (
)} <span className="text-xs text-indigo-400/70">{conv.characterName}</span>
)}
<span className="text-xs text-gray-600">{timeAgo(conv.updatedAt)}</span>
</div>
</div>
<button
onClick={(e) => { e.stopPropagation(); onDelete(conv.id) }}
className="opacity-0 group-hover:opacity-100 p-1.5 text-gray-500 hover:text-red-400 transition-all shrink-0 mt-0.5"
title="Delete"
>
<svg className="w-4 h-4" fill="none" viewBox="0 0 24 24" stroke="currentColor" strokeWidth={2}>
<path strokeLinecap="round" strokeLinejoin="round" d="M14.74 9l-.346 9m-4.788 0L9.26 9m9.968-3.21c.342.052.682.107 1.022.166m-1.022-.165L18.16 19.673a2.25 2.25 0 01-2.244 2.077H8.084a2.25 2.25 0 01-2.244-2.077L4.772 5.79m14.456 0a48.108 48.108 0 00-3.478-.397m-12 .562c.34-.059.68-.114 1.022-.165m0 0a48.11 48.11 0 013.478-.397m7.5 0v-.916c0-1.18-.91-2.164-2.09-2.201a51.964 51.964 0 00-3.32 0c-1.18.037-2.09 1.022-2.09 2.201v.916m7.5 0a48.667 48.667 0 00-7.5 0" />
</svg>
</button>
</div>
))
)}
</div>
</div> </div>
</div> </>
) )
} }


@@ -20,7 +20,7 @@ export default function InputBar({ onSend, onVoiceToggle, isLoading, isRecording
   }

   return (
-    <form onSubmit={handleSubmit} className="border-t border-gray-800 bg-gray-950 px-4 py-3 shrink-0">
+    <form onSubmit={handleSubmit} className="border-t border-gray-800 bg-gray-950 px-3 sm:px-4 py-2 sm:py-3 shrink-0">
       <div className="flex items-end gap-2 max-w-3xl mx-auto">
         <VoiceButton
           isRecording={isRecording}
@@ -41,7 +41,7 @@ export default function InputBar({ onSend, onVoiceToggle, isLoading, isRecording
         <button
           type="submit"
           disabled={!text.trim() || isLoading}
-          className="w-10 h-10 rounded-full bg-indigo-600 text-white flex items-center justify-center shrink-0 hover:bg-indigo-500 disabled:opacity-40 disabled:hover:bg-indigo-600 transition-colors"
+          className="w-11 h-11 sm:w-10 sm:h-10 rounded-full bg-indigo-600 text-white flex items-center justify-center shrink-0 hover:bg-indigo-500 disabled:opacity-40 disabled:hover:bg-indigo-600 transition-colors"
         >
           <svg className="w-5 h-5" fill="none" viewBox="0 0 24 24" stroke="currentColor" strokeWidth={2}>
             <path strokeLinecap="round" strokeLinejoin="round" d="M6 12L3.269 3.126A59.768 59.768 0 0121.485 12 59.77 59.77 0 013.27 20.876L5.999 12zm0 0h7.5" />


@@ -88,12 +88,12 @@ function RichContent({ text }) {
   )
 }

-export default function MessageBubble({ message, onReplay, character }) {
+export default function MessageBubble({ message, onReplay, onRetry, character }) {
   const isUser = message.role === 'user'

   return (
-    <div className={`flex ${isUser ? 'justify-end' : 'justify-start'} px-4 py-1.5`}>
-      <div className={`flex items-start gap-3 max-w-[80%] ${isUser ? 'flex-row-reverse' : ''}`}>
+    <div className={`flex ${isUser ? 'justify-end' : 'justify-start'} px-3 sm:px-4 py-1.5`}>
+      <div className={`flex items-start gap-2 sm:gap-3 max-w-[92%] sm:max-w-[80%] ${isUser ? 'flex-row-reverse' : ''}`}>
         {!isUser && <Avatar character={character} />}
         <div>
           <div
@@ -114,6 +114,18 @@ export default function MessageBubble({ message, onReplay, character }) {
               {message.model}
             </span>
           )}
+          {message.isError && onRetry && (
+            <button
+              onClick={() => onRetry(message.id)}
+              className="text-red-400 hover:text-red-300 transition-colors flex items-center gap-1 text-xs"
+              title="Retry"
+            >
+              <svg className="w-3.5 h-3.5" fill="none" viewBox="0 0 24 24" stroke="currentColor" strokeWidth={2}>
+                <path strokeLinecap="round" strokeLinejoin="round" d="M16.023 9.348h4.992v-.001M2.985 19.644v-4.992m0 0h4.992m-4.993 0l3.181 3.183a8.25 8.25 0 0013.803-3.7M4.031 9.865a8.25 8.25 0 0113.803-3.7l3.181 3.182" />
+              </svg>
+              Retry
+            </button>
+          )}
           {!message.isError && onReplay && (
             <button
               onClick={() => onReplay(message.content)}


@@ -0,0 +1,52 @@
const GROUP_LABELS = { cloud: 'Cloud', local: 'Local' }

const GROUP_COLORS = {
  cloud: {
    active: 'bg-indigo-600 text-white',
    inactive: 'text-indigo-400 hover:bg-indigo-900/30',
  },
  local: {
    active: 'bg-emerald-600 text-white',
    inactive: 'text-emerald-400 hover:bg-emerald-900/30',
  },
}

export default function PromptStyleSelector({ styles, activeStyle, onSelect }) {
  if (!styles || styles.length === 0) return null

  const groups = { cloud: [], local: [] }
  for (const s of styles) {
    const g = s.group === 'local' ? 'local' : 'cloud'
    groups[g].push(s)
  }

  return (
    <div className="flex items-center gap-2 sm:gap-3 px-3 sm:px-4 py-1.5 border-b border-gray-800/50 shrink-0 overflow-x-auto scrollbar-none">
      {Object.entries(groups).map(([group, groupStyles]) => (
        groupStyles.length > 0 && (
          <div key={group} className="flex items-center gap-1">
            <span className="text-[10px] uppercase tracking-wider text-gray-600 mr-1">
              {GROUP_LABELS[group]}
            </span>
            {groupStyles.map((s) => {
              const isActive = s.id === activeStyle
              const colors = GROUP_COLORS[group] || GROUP_COLORS.cloud
              return (
                <button
                  key={s.id}
                  onClick={() => onSelect(s.id)}
                  className={`text-xs px-2.5 py-1 sm:px-2 sm:py-0.5 rounded-full transition-colors whitespace-nowrap ${
                    isActive ? colors.active : colors.inactive
                  }`}
                  title={s.description || s.name}
                >
                  {s.name}
                </button>
              )
            })}
          </div>
        )
      ))}
    </div>
  )
}
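The bucketing in PromptStyleSelector (any style not explicitly marked `local` falls back to `cloud`) can be isolated as a pure function. This is a sketch with a hypothetical name, `groupStyles`, not an export of the component file:

```javascript
// Hypothetical helper mirroring the grouping loop in PromptStyleSelector:
// styles without group === 'local' are treated as cloud.
function groupStyles(styles) {
  const groups = { cloud: [], local: [] }
  for (const s of styles || []) {
    const g = s.group === 'local' ? 'local' : 'cloud'
    groups[g].push(s)
  }
  return groups
}

const demo = groupStyles([
  { id: 'quick', group: 'cloud' },
  { id: 'standard' },             // no group -> cloud
  { id: 'ollama', group: 'local' },
])
// demo.cloud holds the first two entries, demo.local the third
```

Keeping both buckets present even when empty is what lets the component render `Object.entries(groups)` without null checks.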


@@ -8,7 +8,7 @@ export default function SettingsDrawer({ isOpen, onClose, settings, onUpdate })
   return (
     <>
       <div className="fixed inset-0 bg-black/50 z-40" onClick={onClose} />
-      <div className="fixed right-0 top-0 bottom-0 w-80 bg-gray-900 border-l border-gray-800 z-50 flex flex-col">
+      <div className="fixed right-0 top-0 bottom-0 w-full sm:w-80 bg-gray-900 border-l border-gray-800 z-50 flex flex-col">
         <div className="flex items-center justify-between px-4 py-3 border-b border-gray-800">
           <h2 className="text-sm font-medium text-gray-200">Settings</h2>
           <button onClick={onClose} className="text-gray-500 hover:text-gray-300">


@@ -8,7 +8,7 @@ export default function VoiceButton({ isRecording, isTranscribing, onToggle, dis
     <button
       onClick={handleClick}
       disabled={disabled || isTranscribing}
-      className={`w-10 h-10 rounded-full flex items-center justify-center transition-all shrink-0 ${
+      className={`w-11 h-11 sm:w-10 sm:h-10 rounded-full flex items-center justify-center transition-all shrink-0 ${
         isRecording
           ? 'bg-red-500 text-white shadow-[0_0_0_4px_rgba(239,68,68,0.3)] animate-pulse'
           : isTranscribing


@@ -70,7 +70,8 @@ export function useChat(conversationId, conversationMeta, onConversationUpdate)
   }, [conversationMeta, onConversationUpdate])

   // send accepts an optional overrideId for when the conversation was just created
-  const send = useCallback(async (text, overrideId) => {
+  // and an optional promptStyle to control response style
+  const send = useCallback(async (text, overrideId, promptStyle) => {
     if (!text.trim() || isLoading) return null

     const userMsg = { id: Date.now(), role: 'user', content: text.trim(), timestamp: new Date().toISOString() }
@@ -80,13 +81,15 @@ export function useChat(conversationId, conversationMeta, onConversationUpdate)
     setIsLoading(true)

     try {
-      const { response, model } = await sendMessage(text.trim(), conversationMeta?.characterId || null)
+      const activeConvId = overrideId || idRef.current
+      const { response, model, prompt_style } = await sendMessage(text.trim(), conversationMeta?.characterId || null, promptStyle, activeConvId)
       const assistantMsg = {
         id: Date.now() + 1,
         role: 'assistant',
         content: response,
         timestamp: new Date().toISOString(),
         ...(model && { model }),
+        ...(prompt_style && { prompt_style }),
       }
       const allMessages = [...newMessages, assistantMsg]
       setMessages(allMessages)
@@ -114,6 +117,52 @@ export function useChat(conversationId, conversationMeta, onConversationUpdate)
     }
   }, [isLoading, messages, persist])

+  // Retry: remove the error message, re-send the preceding user message
+  const retry = useCallback(async (errorMsgId, promptStyle) => {
+    const idx = messages.findIndex(m => m.id === errorMsgId)
+    if (idx < 1) return null
+    // Find the user message right before the error
+    const userMsg = messages[idx - 1]
+    if (!userMsg || userMsg.role !== 'user') return null
+    // Remove the error message
+    const cleaned = messages.filter(m => m.id !== errorMsgId)
+    setMessages(cleaned)
+    await persist(cleaned)
+    // Re-send (but we need to temporarily set messages back without the error so send picks up correctly)
+    // Instead, inline the send logic with the cleaned message list
+    setIsLoading(true)
+    try {
+      const activeConvId = idRef.current
+      const { response, model, prompt_style } = await sendMessage(userMsg.content, conversationMeta?.characterId || null, promptStyle, activeConvId)
+      const assistantMsg = {
+        id: Date.now() + 1,
+        role: 'assistant',
+        content: response,
+        timestamp: new Date().toISOString(),
+        ...(model && { model }),
+        ...(prompt_style && { prompt_style }),
+      }
+      const allMessages = [...cleaned, assistantMsg]
+      setMessages(allMessages)
+      await persist(allMessages)
+      return response
+    } catch (err) {
+      const newError = {
+        id: Date.now() + 1,
+        role: 'assistant',
+        content: `Error: ${err.message}`,
+        timestamp: new Date().toISOString(),
+        isError: true,
+      }
+      const allMessages = [...cleaned, newError]
+      setMessages(allMessages)
+      await persist(allMessages)
+      return null
+    } finally {
+      setIsLoading(false)
+    }
+  }, [messages, persist, conversationMeta])
+
   const clearHistory = useCallback(async () => {
     setMessages([])
     if (idRef.current) {
@@ -121,5 +170,5 @@ export function useChat(conversationId, conversationMeta, onConversationUpdate)
     }
   }, [persist])

-  return { messages, isLoading, isLoadingConv, send, clearHistory }
+  return { messages, isLoading, isLoadingConv, send, retry, clearHistory }
 }
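The list surgery at the start of `retry` (drop the error bubble, re-send the user message just before it) is the off-by-one-prone part. A pure sketch of that step, under a hypothetical name `prepareRetry` that is not part of the hook itself:

```javascript
// Hypothetical helper: given the message list and the id of an error
// message, return the user message to re-send plus the list with the
// error removed, or null when retry is not possible.
function prepareRetry(messages, errorMsgId) {
  const idx = messages.findIndex(m => m.id === errorMsgId)
  if (idx < 1) return null                       // not found, or nothing before it
  const userMsg = messages[idx - 1]
  if (!userMsg || userMsg.role !== 'user') return null
  return { userMsg, cleaned: messages.filter(m => m.id !== errorMsgId) }
}
```

Keeping this step pure makes the `idx < 1` guard (an error at position 0 has no preceding user message) easy to unit-test without React.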

@@ -0,0 +1,27 @@
import { useState, useEffect, useCallback } from 'react';
import { getFollowups } from '../lib/memoryApi';

export function useFollowups(characterId) {
  const [followups, setFollowups] = useState([]);
  const [loading, setLoading] = useState(false);

  const refresh = useCallback(async () => {
    if (!characterId) {
      setFollowups([]);
      return;
    }
    setLoading(true);
    try {
      const data = await getFollowups(characterId);
      setFollowups(data.followups || []);
    } catch {
      setFollowups([]);
    } finally {
      setLoading(false);
    }
  }, [characterId]);

  useEffect(() => { refresh(); }, [refresh]);

  return { followups, loading, refresh };
}


@@ -0,0 +1,34 @@
import { useState, useEffect, useCallback } from 'react'
import { getPromptStyles, getActiveStyle, setActiveStyle } from '../lib/api'

export function usePromptStyle() {
  const [styles, setStyles] = useState([])
  const [activeStyle, setActive] = useState('standard')
  const [isLoading, setIsLoading] = useState(true)

  useEffect(() => {
    let cancelled = false
    Promise.all([getPromptStyles(), getActiveStyle()])
      .then(([allStyles, active]) => {
        if (cancelled) return
        setStyles(allStyles)
        setActive(active.style || 'standard')
        setIsLoading(false)
      })
      .catch(() => {
        if (!cancelled) setIsLoading(false)
      })
    return () => { cancelled = true }
  }, [])

  const selectStyle = useCallback(async (styleId) => {
    setActive(styleId)
    try {
      await setActiveStyle(styleId)
    } catch (err) {
      console.error('Failed to set prompt style:', err)
    }
  }, [])

  return { styles, activeStyle, selectStyle, isLoading }
}


@@ -33,3 +33,12 @@ body {
 ::selection {
   background: rgba(99, 102, 241, 0.3);
 }
+
+/* Hide scrollbar for horizontal scroll containers */
+.scrollbar-none {
+  -ms-overflow-style: none;
+  scrollbar-width: none;
+}
+.scrollbar-none::-webkit-scrollbar {
+  display: none;
+}


@@ -18,9 +18,11 @@ async function fetchWithRetry(url, options, retries = MAX_RETRIES) {
   }
 }

-export async function sendMessage(text, characterId = null) {
+export async function sendMessage(text, characterId = null, promptStyle = null, conversationId = null) {
   const payload = { message: text, agent: 'main' }
   if (characterId) payload.character_id = characterId
+  if (promptStyle) payload.prompt_style = promptStyle
+  if (conversationId) payload.conversation_id = conversationId

   const res = await fetchWithRetry('/api/agent/message', {
     method: 'POST',
     headers: { 'Content-Type': 'application/json' },
@@ -31,7 +33,29 @@ export async function sendMessage(text, characterId = null) {
     throw new Error(err.error || `HTTP ${res.status}`)
   }
   const data = await res.json()
-  return { response: data.response, model: data.model || null }
+  return { response: data.response, model: data.model || null, prompt_style: data.prompt_style || null }
+}
+
+export async function getPromptStyles() {
+  const res = await fetch('/api/prompt-styles')
+  if (!res.ok) return []
+  return await res.json()
+}
+
+export async function getActiveStyle() {
+  const res = await fetch('/api/prompt-style')
+  if (!res.ok) return { style: 'standard' }
+  return await res.json()
+}
+
+export async function setActiveStyle(style) {
+  const res = await fetch('/api/prompt-style', {
+    method: 'POST',
+    headers: { 'Content-Type': 'application/json' },
+    body: JSON.stringify({ style }),
+  })
+  if (!res.ok) throw new Error('Failed to set prompt style')
+  return await res.json()
 }

 export async function synthesize(text, voice, engine = 'kokoro', model = null) {
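The request body assembled in `sendMessage` omits optional keys entirely rather than sending them as `null`. That assembly can be sketched as a pure function (hypothetical name `buildPayload`, not an actual export of `api.js`):

```javascript
// Hypothetical mirror of the payload assembly inside sendMessage:
// optional fields are only included when truthy.
function buildPayload(text, characterId = null, promptStyle = null, conversationId = null) {
  const payload = { message: text, agent: 'main' }
  if (characterId) payload.character_id = characterId
  if (promptStyle) payload.prompt_style = promptStyle
  if (conversationId) payload.conversation_id = conversationId
  return payload
}
```

Omitting unset keys lets the bridge distinguish "caller didn't choose a style" (fall back to the character or global default) from an explicit value.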


@@ -1,11 +1,20 @@
-export async function getPersonalMemories(characterId) {
-  const res = await fetch(`/api/memories/personal/${encodeURIComponent(characterId)}`)
+// Memory API — proxied through Vite middleware to bridge (port 8081)
+
+export async function getPersonalMemories(characterId, { type, lifecycle, category, q, limit } = {}) {
+  const params = new URLSearchParams()
+  if (type) params.set('type', type)
+  if (lifecycle) params.set('lifecycle', lifecycle)
+  if (category) params.set('category', category)
+  if (q) params.set('q', q)
+  if (limit) params.set('limit', limit)
+  const qs = params.toString() ? `?${params}` : ''
+  const res = await fetch(`/api/memories/${encodeURIComponent(characterId)}${qs}`)
   if (!res.ok) return { characterId, memories: [] }
   return res.json()
 }

 export async function savePersonalMemory(characterId, memory) {
-  const res = await fetch(`/api/memories/personal/${encodeURIComponent(characterId)}`, {
+  const res = await fetch(`/api/memories/${encodeURIComponent(characterId)}`, {
     method: 'POST',
     headers: { 'Content-Type': 'application/json' },
     body: JSON.stringify(memory),
@@ -15,14 +24,21 @@ export async function savePersonalMemory(characterId, memory) {
 }

 export async function deletePersonalMemory(characterId, memoryId) {
-  const res = await fetch(`/api/memories/personal/${encodeURIComponent(characterId)}/${encodeURIComponent(memoryId)}`, {
+  const res = await fetch(`/api/memories/${encodeURIComponent(memoryId)}`, {
     method: 'DELETE',
   })
   if (!res.ok) throw new Error(`Failed to delete memory: ${res.status}`)
+  return res.json()
 }

-export async function getGeneralMemories() {
-  const res = await fetch('/api/memories/general')
+export async function getGeneralMemories({ type, lifecycle, category, limit } = {}) {
+  const params = new URLSearchParams()
+  if (type) params.set('type', type)
+  if (lifecycle) params.set('lifecycle', lifecycle)
+  if (category) params.set('category', category)
+  if (limit) params.set('limit', limit)
+  const qs = params.toString() ? `?${params}` : ''
+  const res = await fetch(`/api/memories/general${qs}`)
   if (!res.ok) return { memories: [] }
   return res.json()
 }
@@ -38,8 +54,45 @@ export async function saveGeneralMemory(memory) {
 }

 export async function deleteGeneralMemory(memoryId) {
-  const res = await fetch(`/api/memories/general/${encodeURIComponent(memoryId)}`, {
+  const res = await fetch(`/api/memories/${encodeURIComponent(memoryId)}`, {
     method: 'DELETE',
   })
   if (!res.ok) throw new Error(`Failed to delete memory: ${res.status}`)
+  return res.json()
+}
+
+export async function getFollowups(characterId) {
+  const res = await fetch(`/api/memories/${encodeURIComponent(characterId)}/followups`)
+  if (!res.ok) return { followups: [] }
+  return res.json()
+}
+
+export async function resolveFollowup(memoryId) {
+  const res = await fetch(`/api/memories/${encodeURIComponent(memoryId)}/resolve`, {
+    method: 'POST',
+    headers: { 'Content-Type': 'application/json' },
+    body: JSON.stringify({}),
+  })
+  if (!res.ok) throw new Error(`Failed to resolve follow-up: ${res.status}`)
+  return res.json()
+}
+
+export async function updateMemory(memoryId, fields) {
+  const res = await fetch(`/api/memories/${encodeURIComponent(memoryId)}`, {
+    method: 'PUT',
+    headers: { 'Content-Type': 'application/json' },
+    body: JSON.stringify(fields),
+  })
+  if (!res.ok) throw new Error(`Failed to update memory: ${res.status}`)
+  return res.json()
+}
+
+export async function runMigration() {
+  const res = await fetch('/api/memories/migrate', {
+    method: 'POST',
+    headers: { 'Content-Type': 'application/json' },
+    body: JSON.stringify({}),
+  })
+  if (!res.ok) throw new Error(`Migration failed: ${res.status}`)
+  return res.json()
 }


@@ -162,9 +162,9 @@ export default function Characters() {
   return (
     <div className="space-y-8">
       {/* Header */}
-      <div className="flex items-center justify-between">
+      <div className="flex flex-col sm:flex-row sm:items-center justify-between gap-3">
         <div>
-          <h1 className="text-3xl font-bold text-gray-100">Characters</h1>
+          <h1 className="text-2xl sm:text-3xl font-bold text-gray-100">Characters</h1>
           <p className="text-sm text-gray-500 mt-1">
             {profiles.length} profile{profiles.length !== 1 ? 's' : ''} stored
             {activeProfile && (
@@ -174,19 +174,20 @@
           )}
         </p>
       </div>
-      <div className="flex gap-3">
+      <div className="flex gap-2 sm:gap-3">
         <button
           onClick={() => {
             sessionStorage.removeItem('edit_character');
             sessionStorage.removeItem('edit_character_profile_id');
             navigate('/editor');
           }}
-          className="flex items-center gap-2 px-4 py-2 bg-indigo-600 hover:bg-indigo-500 text-white rounded-lg transition-colors"
+          className="flex items-center gap-2 px-3 sm:px-4 py-2 bg-indigo-600 hover:bg-indigo-500 text-white text-sm rounded-lg transition-colors"
         >
           <svg className="w-4 h-4" fill="none" viewBox="0 0 24 24" stroke="currentColor" strokeWidth={2}>
             <path strokeLinecap="round" strokeLinejoin="round" d="M12 4.5v15m7.5-7.5h-15" />
           </svg>
-          New Character
+          <span className="hidden sm:inline">New Character</span>
+          <span className="sm:hidden">New</span>
         </button>
         <label className="flex items-center gap-2 px-4 py-2 bg-gray-800 hover:bg-gray-700 text-gray-300 rounded-lg cursor-pointer border border-gray-700 transition-colors">
           <svg className="w-4 h-4" fill="none" viewBox="0 0 24 24" stroke="currentColor" strokeWidth={2}>
@@ -249,17 +250,30 @@
         >
           {/* Image area */}
           <div className="relative h-48 bg-gray-900 flex items-center justify-center overflow-hidden group">
-            {profile.image ? (
-              <img
-                src={profile.image}
-                alt={char.display_name || char.name}
-                className="w-full h-full object-cover"
-              />
-            ) : (
-              <div className="text-6xl font-bold text-gray-700 select-none">
-                {(char.display_name || char.name || '?')[0].toUpperCase()}
-              </div>
-            )}
+            {(() => {
+              const imgSrc = profile.image
+                || (char.dream_id && `/api/dream/characters/${char.dream_id}/image`)
+                || (char.gaze_character && `/api/gaze/character/${char.gaze_character}/cover`)
+                || null;
+              return imgSrc ? (
+                <img
+                  src={imgSrc}
+                  alt={char.display_name || char.name}
+                  className="w-full h-full object-cover"
+                  onError={(e) => {
+                    e.target.style.display = 'none';
+                    const fallback = e.target.nextElementSibling;
+                    if (fallback) fallback.style.display = '';
+                  }}
+                />
+              ) : null;
+            })()}
+            <div
+              className="text-6xl font-bold text-gray-700 select-none absolute inset-0 flex items-center justify-center"
+              style={(profile.image || char.dream_id || char.gaze_character) ? { display: 'none' } : {}}
+            >
+              {(char.display_name || char.name || '?')[0].toUpperCase()}
+            </div>
             <label className="absolute inset-0 flex items-center justify-center bg-black/50 opacity-0 group-hover:opacity-100 transition-opacity cursor-pointer">
               <div className="text-center">
                 <svg className="w-8 h-8 mx-auto text-white/80 mb-1" fill="none" viewBox="0 0 24 24" stroke="currentColor" strokeWidth={1.5}>
@@ -313,13 +327,18 @@
               {char.tts.voice_ref_path.split('/').pop()}
             </span>
           )}
+          {char.gaze_character && (
+            <span className="px-2 py-0.5 bg-violet-500/20 text-violet-300 text-xs rounded-full border border-violet-500/30" title={`GAZE character: ${char.gaze_character}`}>
+              {char.gaze_character}
+            </span>
+          )}
           {(() => {
             const defaultPreset = char.gaze_presets?.find(gp => gp.trigger === 'self-portrait')?.preset
               || char.gaze_presets?.[0]?.preset
              || char.gaze_preset
              || null;
-            return defaultPreset ? (
-              <span className="px-2 py-0.5 bg-violet-500/20 text-violet-300 text-xs rounded-full border border-violet-500/30" title={`GAZE: ${defaultPreset}`}>
+            return defaultPreset && defaultPreset !== char.gaze_character ? (
+              <span className="px-2 py-0.5 bg-violet-500/20 text-violet-300 text-xs rounded-full border border-violet-500/30" title={`GAZE preset: ${defaultPreset}`}>
                 {defaultPreset}
               </span>
             ) : null;
@@ -386,8 +405,8 @@
       </div>

       {/* Default character */}
-      <div className="flex items-center gap-3">
-        <label className="text-sm text-gray-400 w-32 shrink-0">Default</label>
+      <div className="flex flex-col sm:flex-row sm:items-center gap-1.5 sm:gap-3">
+        <label className="text-sm text-gray-400 sm:w-32 shrink-0">Default</label>
         <select
           value={satMap.default || ''}
           onChange={(e) => saveSatMap({ ...satMap, default: e.target.value })}
@@ -402,8 +421,8 @@
       {/* Per-satellite assignments */}
       {Object.entries(satMap.satellites || {}).map(([satId, charId]) => (
-        <div key={satId} className="flex items-center gap-3">
-          <span className="text-sm text-gray-300 w-32 shrink-0 truncate font-mono" title={satId}>{satId}</span>
+        <div key={satId} className="flex flex-col sm:flex-row sm:items-center gap-1.5 sm:gap-3">
+          <span className="text-sm text-gray-300 sm:w-32 shrink-0 truncate font-mono" title={satId}>{satId}</span>
           <select
             value={charId}
             onChange={(e) => {
@@ -432,13 +451,13 @@
       ))}

       {/* Add new satellite */}
-      <div className="flex items-center gap-3 pt-2 border-t border-gray-800">
+      <div className="flex flex-col sm:flex-row sm:items-center gap-1.5 sm:gap-3 pt-2 border-t border-gray-800">
         <input
           type="text"
           value={newSatId}
           onChange={(e) => setNewSatId(e.target.value)}
           placeholder="Satellite ID (from bridge log)"
-          className="w-32 shrink-0 bg-gray-800 text-gray-200 text-sm rounded-lg px-3 py-2 border border-gray-700 focus:outline-none focus:border-indigo-500 font-mono"
+          className="sm:w-32 shrink-0 bg-gray-800 text-gray-200 text-sm rounded-lg px-3 py-2 border border-gray-700 focus:outline-none focus:border-indigo-500 font-mono"
         />
         <select
           value={newSatChar}
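The card's cover-image fallback chain (explicit profile image, then the Dream character image, then the GAZE cover) can be expressed as a pure function. A sketch under the hypothetical name `resolveCardImage`; the page itself computes this inline as `imgSrc`:

```javascript
// Hypothetical mirror of the imgSrc fallback chain in the Characters card.
function resolveCardImage(profile, char) {
  return profile.image
    || (char.dream_id && `/api/dream/characters/${char.dream_id}/image`)
    || (char.gaze_character && `/api/gaze/character/${char.gaze_character}/cover`)
    || null
}
```

The trailing `|| null` normalizes the falsy short-circuit results (`undefined`, `''`) so callers can test strictly against `null`.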


@@ -4,6 +4,7 @@ import InputBar from '../components/InputBar'
import StatusIndicator from '../components/StatusIndicator' import StatusIndicator from '../components/StatusIndicator'
import SettingsDrawer from '../components/SettingsDrawer' import SettingsDrawer from '../components/SettingsDrawer'
import ConversationList from '../components/ConversationList' import ConversationList from '../components/ConversationList'
import PromptStyleSelector from '../components/PromptStyleSelector'
import { useSettings } from '../hooks/useSettings' import { useSettings } from '../hooks/useSettings'
import { useBridgeHealth } from '../hooks/useBridgeHealth' import { useBridgeHealth } from '../hooks/useBridgeHealth'
import { useChat } from '../hooks/useChat' import { useChat } from '../hooks/useChat'
@@ -11,6 +12,8 @@ import { useTtsPlayback } from '../hooks/useTtsPlayback'
import { useVoiceInput } from '../hooks/useVoiceInput'
import { useActiveCharacter } from '../hooks/useActiveCharacter'
import { useConversations } from '../hooks/useConversations'
import { usePromptStyle } from '../hooks/usePromptStyle'
import { useFollowups } from '../hooks/useFollowups'
export default function Chat() {
const { settings, updateSetting } = useSettings()
@@ -26,7 +29,9 @@ export default function Chat() {
characterName: character?.name || '',
}
-const { messages, isLoading, isLoadingConv, send, clearHistory } = useChat(activeId, convMeta, updateMeta)
+const { messages, isLoading, isLoadingConv, send, retry, clearHistory } = useChat(activeId, convMeta, updateMeta)
const { styles, activeStyle, selectStyle } = usePromptStyle()
const { followups, refresh: refreshFollowups } = useFollowups(character?.id)
// Use character's TTS config if available, fall back to global settings
const ttsEngine = character?.tts?.engine || settings.ttsEngine
@@ -37,6 +42,7 @@ export default function Chat() {
const { isPlaying, speak, stop } = useTtsPlayback(ttsVoice, ttsEngine, ttsModel)
const { isRecording, isTranscribing, startRecording, stopRecording } = useVoiceInput(settings.sttMode)
const [settingsOpen, setSettingsOpen] = useState(false)
const [convListOpen, setConvListOpen] = useState(false)
const handleSend = useCallback(async (text) => {
// Auto-create a conversation if none is active
@@ -44,11 +50,19 @@ export default function Chat() {
if (!activeId) {
newId = await create(convMeta.characterId, convMeta.characterName)
}
-const response = await send(text, newId)
+const response = await send(text, newId, activeStyle)
if (response && settings.autoTts) {
speak(response)
}
-}, [activeId, create, convMeta, send, settings.autoTts, speak])
+refreshFollowups()
+}, [activeId, create, convMeta, send, settings.autoTts, speak, activeStyle, refreshFollowups])
const handleRetry = useCallback(async (errorMsgId) => {
const response = await retry(errorMsgId, activeStyle)
if (response && settings.autoTts) {
speak(response)
}
}, [retry, activeStyle, settings.autoTts, speak])
const handleVoiceToggle = useCallback(async () => {
if (isRecording) {
@@ -63,8 +77,12 @@ export default function Chat() {
create(convMeta.characterId, convMeta.characterName)
}, [create, convMeta])
const toggleConvList = useCallback(() => {
setConvListOpen(prev => !prev)
}, [])
return (
-<div className="flex-1 flex min-h-0">
+<div className="flex-1 flex min-h-0 relative">
{/* Conversation sidebar */}
<ConversationList
conversations={conversations}
@@ -72,19 +90,21 @@ export default function Chat() {
onCreate={handleNewChat}
onSelect={select}
onDelete={remove}
isOpen={convListOpen}
onToggle={toggleConvList}
/>
{/* Chat area */}
<div className="flex-1 flex flex-col min-h-0 min-w-0">
{/* Status bar */}
-<header className="flex items-center justify-between px-4 py-2 border-b border-gray-800/50 shrink-0">
+<header className="flex items-center justify-between px-3 sm:px-4 py-2 border-b border-gray-800/50 shrink-0">
-<div className="flex items-center gap-2">
+<div className="flex items-center gap-2 ml-10 md:ml-0">
<StatusIndicator isOnline={isOnline} />
<span className="text-xs text-gray-500">
{isOnline === null ? 'Connecting...' : isOnline ? 'Connected' : 'Offline'}
</span>
</div>
-<div className="flex items-center gap-2">
+<div className="flex items-center gap-1 sm:gap-2">
{messages.length > 0 && (
<button
onClick={clearHistory}
@@ -105,7 +125,7 @@ export default function Chat() {
)}
<button
onClick={() => setSettingsOpen(true)}
-className="text-gray-500 hover:text-gray-300 transition-colors p-1"
+className="text-gray-500 hover:text-gray-300 transition-colors p-1.5"
title="Settings"
>
<svg className="w-5 h-5" fill="none" viewBox="0 0 24 24" stroke="currentColor" strokeWidth={1.5}>
@@ -116,11 +136,34 @@ export default function Chat() {
</div>
</header>
{/* Prompt style selector */}
<PromptStyleSelector
styles={styles}
activeStyle={activeStyle}
onSelect={selectStyle}
/>
{/* Follow-up banner */}
{followups.length > 0 && (
<div className="px-4 py-2 bg-amber-500/10 border-b border-amber-500/20 text-sm text-amber-300 flex items-center gap-2 shrink-0">
<svg className="w-4 h-4 shrink-0" fill="none" viewBox="0 0 24 24" stroke="currentColor" strokeWidth={2}>
<path strokeLinecap="round" strokeLinejoin="round" d="M12 6v6h4.5m4.5 0a9 9 0 11-18 0 9 9 0 0118 0z" />
</svg>
<span>
{followups.length} pending follow-up{followups.length !== 1 ? 's' : ''}
<span className="text-amber-400/60 ml-1">
{followups.map(f => f.follow_up_context).join('; ')}
</span>
</span>
</div>
)}
{/* Messages */}
<ChatPanel
messages={messages}
isLoading={isLoading || isLoadingConv}
onReplay={speak}
onRetry={handleRetry}
character={character}
/>


@@ -14,7 +14,7 @@ const SERVICES = [
name: 'Open WebUI',
url: 'http://localhost:3030',
healthPath: '/',
-uiUrl: 'http://localhost:3030',
+uiUrl: 'http://10.0.0.101:3030',
description: 'Chat interface',
category: 'AI & LLM',
restart: { type: 'docker', id: 'homeai-open-webui' },
@@ -74,12 +74,13 @@ const SERVICES = [
uiUrl: 'https://10.0.0.199:8123',
description: 'Smart home platform',
category: 'Smart Home',
auth: true,
},
{
name: 'Uptime Kuma',
url: 'http://localhost:3001',
healthPath: '/',
-uiUrl: 'http://localhost:3001',
+uiUrl: 'http://10.0.0.101:3001',
description: 'Service health monitoring',
category: 'Infrastructure',
restart: { type: 'docker', id: 'homeai-uptime-kuma' },
@@ -88,7 +89,7 @@ const SERVICES = [
name: 'n8n',
url: 'http://localhost:5678',
healthPath: '/',
-uiUrl: 'http://localhost:5678',
+uiUrl: 'http://10.0.0.101:5678',
description: 'Workflow automation',
category: 'Infrastructure',
restart: { type: 'docker', id: 'homeai-n8n' },
@@ -97,7 +98,7 @@ const SERVICES = [
name: 'code-server',
url: 'http://localhost:8090',
healthPath: '/',
-uiUrl: 'http://localhost:8090',
+uiUrl: 'http://10.0.0.101:8090',
description: 'Browser-based VS Code',
category: 'Infrastructure',
restart: { type: 'docker', id: 'homeai-code-server' },
@@ -171,10 +172,11 @@ export default function Dashboard() {
try {
const target = encodeURIComponent(service.url + service.healthPath);
const modeParam = service.tcp ? '&mode=tcp' : '';
const authParam = service.auth ? '&auth=1' : '';
const controller = new AbortController();
const timeout = setTimeout(() => controller.abort(), 8000);
-const res = await fetch(`/api/health?url=${target}${modeParam}`, { signal: controller.signal });
+const res = await fetch(`/api/health?url=${target}${modeParam}${authParam}`, { signal: controller.signal });
clearTimeout(timeout);
const data = await res.json();
@@ -249,9 +251,9 @@ export default function Dashboard() {
return (
<div className="space-y-8">
{/* Header */}
-<div className="flex items-center justify-between">
+<div className="flex flex-col sm:flex-row sm:items-center justify-between gap-3">
<div>
-<h1 className="text-3xl font-bold text-gray-100">Service Status</h1>
+<h1 className="text-2xl sm:text-3xl font-bold text-gray-100">Service Status</h1>
<p className="text-sm text-gray-500 mt-1">
{onlineCount}/{totalCount} services online
{lastRefresh && (


@@ -12,8 +12,8 @@ const DEFAULT_CHARACTER = {
skills: [],
system_prompt: "",
model_overrides: {
-primary: "qwen3.5:35b-a3b",
-fast: "qwen2.5:7b"
+primary: "",
+fast: ""
},
tts: {
engine: "kokoro",
@@ -22,9 +22,20 @@ const DEFAULT_CHARACTER = {
},
gaze_presets: [],
custom_rules: [],
default_prompt_style: "",
prompt_style_overrides: {},
notes: ""
};
const PROMPT_STYLES = [
{ id: 'quick', name: 'Quick', group: 'cloud' },
{ id: 'standard', name: 'Standard', group: 'cloud' },
{ id: 'creative', name: 'Creative', group: 'cloud' },
{ id: 'roleplayer', name: 'Roleplayer', group: 'local' },
{ id: 'game-master', name: 'Game Master', group: 'local' },
{ id: 'storyteller', name: 'Storyteller', group: 'local' },
];
export default function Editor() {
const [character, setCharacter] = useState(() => {
const editData = sessionStorage.getItem('edit_character');
@@ -59,10 +70,17 @@ export default function Editor() {
const [elevenLabsModels, setElevenLabsModels] = useState([]);
const [isLoadingElevenLabs, setIsLoadingElevenLabs] = useState(false);
-// GAZE presets state (from API)
+// GAZE state (from API)
const [availableGazePresets, setAvailableGazePresets] = useState([]);
const [availableGazeCharacters, setAvailableGazeCharacters] = useState([]);
const [isLoadingGaze, setIsLoadingGaze] = useState(false);
// Dream import state
const [dreamCharacters, setDreamCharacters] = useState([]);
const [isLoadingDream, setIsLoadingDream] = useState(false);
const [dreamImportDone, setDreamImportDone] = useState(false);
const [selectedDreamId, setSelectedDreamId] = useState('');
// Character lookup state
const [lookupName, setLookupName] = useState('');
const [lookupFranchise, setLookupFranchise] = useState('');
@@ -102,16 +120,31 @@ export default function Editor() {
}
}, [character.tts.engine]);
-// Fetch GAZE presets on mount
+// Fetch GAZE presets + characters on mount
useEffect(() => {
setIsLoadingGaze(true);
-fetch('/api/gaze/presets')
-.then(r => r.ok ? r.json() : { presets: [] })
-.then(data => setAvailableGazePresets(data.presets || []))
+Promise.all([
+fetch('/api/gaze/presets').then(r => r.ok ? r.json() : { presets: [] }),
+fetch('/api/gaze/characters').then(r => r.ok ? r.json() : { characters: [] }),
+])
+.then(([presetsData, charsData]) => {
+setAvailableGazePresets(presetsData.presets || []);
+setAvailableGazeCharacters(charsData.characters || []);
+})
.catch(() => {})
.finally(() => setIsLoadingGaze(false));
}, []);
// Fetch Dream characters on mount
useEffect(() => {
setIsLoadingDream(true);
fetch('/api/dream/characters')
.then(r => r.ok ? r.json() : { characters: [] })
.then(data => setDreamCharacters(data.characters || []))
.catch(() => {})
.finally(() => setIsLoadingDream(false));
}, []);
useEffect(() => {
return () => {
if (audioRef.current) { audioRef.current.pause(); audioRef.current = null; }
@@ -272,6 +305,66 @@ export default function Editor() {
});
};
// Dream character import
const handleDreamImport = async (dreamId) => {
if (!dreamId) return;
setIsLoadingDream(true);
setError(null);
try {
const res = await fetch(`/api/dream/characters/${dreamId}`);
if (!res.ok) throw new Error(`Dream returned ${res.status}`);
const data = await res.json();
const dc = data.character;
setCharacter(prev => ({
...prev,
name: prev.name || dc.name?.toLowerCase().replace(/\s+/g, '_') || prev.name,
display_name: prev.display_name || dc.name || prev.display_name,
description: dc.backstory ? dc.backstory.split('.').slice(0, 2).join('.') + '.' : prev.description,
background: dc.backstory || prev.background,
appearance: dc.appearance || prev.appearance,
dialogue_style: dc.personality || prev.dialogue_style,
system_prompt: prev.system_prompt || dc.systemPrompt || '',
gaze_character: dc.gazeCharacterId || prev.gaze_character,
dream_id: dc.id,
}));
// Auto-add GAZE presets if linked
if (dc.gazeCharacterId && availableGazePresets.length > 0) {
handleGazeCharacterLink(dc.gazeCharacterId);
}
setDreamImportDone(true);
} catch (err) {
setError(`Dream import failed: ${err.message}`);
} finally {
setIsLoadingDream(false);
}
};
// GAZE character linking
const handleGazeCharacterLink = (characterId) => {
setCharacter(prev => {
const updated = { ...prev, gaze_character: characterId || undefined };
// Auto-add matching presets when linking a character
if (characterId && availableGazePresets.length > 0) {
const matching = availableGazePresets.filter(p =>
p.slug.includes(characterId) || characterId.includes(p.slug)
);
if (matching.length > 0) {
const existingSlugs = new Set((prev.gaze_presets || []).map(gp => gp.preset));
const newPresets = matching
.filter(p => !existingSlugs.has(p.slug))
.map(p => ({ preset: p.slug, trigger: 'self-portrait' }));
if (newPresets.length > 0) {
updated.gaze_presets = [...(prev.gaze_presets || []), ...newPresets];
}
}
}
return updated;
});
};
// GAZE preset helpers
const addGazePreset = () => {
setCharacter(prev => ({
@@ -390,9 +483,9 @@ export default function Editor() {
return (
<div className="space-y-6">
-<div className="flex justify-between items-center">
+<div className="flex flex-col sm:flex-row justify-between sm:items-center gap-3">
<div>
-<h1 className="text-3xl font-bold text-gray-100">Character Editor</h1>
+<h1 className="text-2xl sm:text-3xl font-bold text-gray-100">Character Editor</h1>
<p className="text-sm text-gray-500 mt-1">
{character.display_name || character.name
? `Editing: ${character.display_name || character.name}`
@@ -442,6 +535,70 @@ export default function Editor() {
</div>
)}
{/* Import from Dream */}
{!isEditing && (
<div className={cardClass}>
<div className="flex items-center gap-2">
<svg className="w-5 h-5 text-violet-400" fill="none" viewBox="0 0 24 24" stroke="currentColor" strokeWidth={2}>
<path strokeLinecap="round" strokeLinejoin="round" d="M9.813 15.904L9 18.75l-.813-2.846a4.5 4.5 0 00-3.09-3.09L2.25 12l2.846-.813a4.5 4.5 0 003.09-3.09L9 5.25l.813 2.846a4.5 4.5 0 003.09 3.09L15.75 12l-2.846.813a4.5 4.5 0 00-3.09 3.09zM18.259 8.715L18 9.75l-.259-1.035a3.375 3.375 0 00-2.455-2.456L14.25 6l1.036-.259a3.375 3.375 0 002.455-2.456L18 2.25l.259 1.035a3.375 3.375 0 002.455 2.456L21.75 6l-1.036.259a3.375 3.375 0 00-2.455 2.456z" />
</svg>
<h2 className="text-lg font-semibold text-gray-200">Import from Dream</h2>
</div>
<p className="text-xs text-gray-500">Import character data from Dream. Auto-links GAZE character if configured.</p>
<div className="flex gap-3 items-end">
<div className="flex-1">
<label className={labelClass}>Dream Character</label>
{isLoadingDream ? (
<p className="text-sm text-gray-500">Loading Dream characters...</p>
) : dreamCharacters.length > 0 ? (
<select
className={selectClass}
value={selectedDreamId}
onChange={(e) => setSelectedDreamId(e.target.value)}
>
<option value="">-- Select a character --</option>
{dreamCharacters.map(c => (
<option key={c.id} value={c.id}>{c.name}</option>
))}
</select>
) : (
<p className="text-sm text-gray-600 italic">No Dream characters available.</p>
)}
</div>
{selectedDreamId && (
<div className="w-12 h-12 rounded-lg overflow-hidden bg-gray-800 border border-gray-700 shrink-0">
<img
src={`/api/dream/characters/${selectedDreamId}/image`}
alt="Preview"
className="w-full h-full object-cover"
onError={(e) => { e.target.style.display = 'none' }}
/>
</div>
)}
<button
onClick={() => handleDreamImport(selectedDreamId)}
disabled={!selectedDreamId || isLoadingDream}
className={`flex items-center gap-2 px-5 py-2 rounded-lg text-white transition-colors whitespace-nowrap ${
dreamImportDone
? 'bg-emerald-600 hover:bg-emerald-500'
: 'bg-violet-600 hover:bg-violet-500 disabled:bg-gray-700 disabled:text-gray-500'
}`}
>
{isLoadingDream && (
<svg className="w-4 h-4 animate-spin" viewBox="0 0 24 24" fill="none">
<circle className="opacity-25" cx="12" cy="12" r="10" stroke="currentColor" strokeWidth="4" />
<path className="opacity-75" fill="currentColor" d="M4 12a8 8 0 018-8V0C5.373 0 0 5.373 0 12h4z" />
</svg>
)}
{isLoadingDream ? 'Importing...' : dreamImportDone ? 'Imported' : 'Import'}
</button>
</div>
{dreamImportDone && (
<p className="text-xs text-emerald-400">Fields populated from Dream character. Review and edit below.</p>
)}
</div>
)}
{/* Character Lookup — auto-fill from fictional character wiki */}
{!isEditing && (
<div className={cardClass}>
@@ -501,7 +658,7 @@ export default function Editor() {
<div className="grid grid-cols-1 md:grid-cols-2 gap-6">
{/* Basic Info */}
-<div className={cardClass}>
+<div className={cardClass + " flex flex-col"}>
<h2 className="text-lg font-semibold text-gray-200">Basic Info</h2>
<div>
<label className={labelClass}>Name (ID)</label>
@@ -511,9 +668,9 @@ export default function Editor() {
<label className={labelClass}>Display Name</label>
<input type="text" className={inputClass} value={character.display_name || ''} onChange={(e) => handleChange('display_name', e.target.value)} />
</div>
-<div>
+<div className="flex-1 flex flex-col">
<label className={labelClass}>Description</label>
-<input type="text" className={inputClass} value={character.description || ''} onChange={(e) => handleChange('description', e.target.value)} />
+<textarea className={inputClass + " flex-1 min-h-20 resize-y"} value={character.description || ''} onChange={(e) => handleChange('description', e.target.value)} />
</div>
</div>
@@ -757,6 +914,48 @@ export default function Editor() {
</div>
</div>
{/* GAZE Character Link */}
<div className={cardClass}>
<h2 className="text-lg font-semibold text-gray-200">GAZE Character Link</h2>
<p className="text-xs text-gray-500">Link to a GAZE character for automatic cover image and image generation presets.</p>
<div className="flex items-center gap-3">
<div className="flex-1">
{isLoadingGaze ? (
<p className="text-sm text-gray-500">Loading GAZE characters...</p>
) : availableGazeCharacters.length > 0 ? (
<select
className={selectClass}
value={character.gaze_character || ''}
onChange={(e) => handleGazeCharacterLink(e.target.value)}
>
<option value="">-- None --</option>
{availableGazeCharacters.map(c => (
<option key={c.character_id} value={c.character_id}>{c.name} ({c.character_id})</option>
))}
</select>
) : (
<input
type="text"
className={inputClass}
value={character.gaze_character || ''}
onChange={(e) => handleGazeCharacterLink(e.target.value)}
placeholder="GAZE character_id (e.g. tifa_lockhart)"
/>
)}
</div>
{character.gaze_character && (
<div className="w-16 h-16 rounded-lg overflow-hidden bg-gray-800 border border-gray-700 shrink-0">
<img
src={`/api/gaze/character/${character.gaze_character}/cover`}
alt="GAZE cover"
className="w-full h-full object-cover"
onError={(e) => { e.target.style.display = 'none' }}
/>
</div>
)}
</div>
</div>
<div className="grid grid-cols-1 md:grid-cols-2 gap-6">
{/* Image Generation — GAZE presets */}
<div className={cardClass}>
@@ -832,22 +1031,37 @@ export default function Editor() {
<h2 className="text-lg font-semibold text-gray-200">Model Overrides</h2>
<div>
<label className={labelClass}>Primary Model</label>
-<select className={selectClass} value={character.model_overrides?.primary || 'qwen3.5:35b-a3b'} onChange={(e) => handleNestedChange('model_overrides', 'primary', e.target.value)}>
-<option value="llama3.3:70b">llama3.3:70b</option>
-<option value="qwen3.5:35b-a3b">qwen3.5:35b-a3b</option>
-<option value="qwen2.5:7b">qwen2.5:7b</option>
-<option value="qwen3:32b">qwen3:32b</option>
-<option value="codestral:22b">codestral:22b</option>
+<select className={selectClass} value={character.model_overrides?.primary || ''} onChange={(e) => handleNestedChange('model_overrides', 'primary', e.target.value)}>
+<option value="">Default (system assigned)</option>
+<optgroup label="Cloud">
+<option value="anthropic/claude-sonnet-4-6">Claude Sonnet 4.6</option>
+<option value="anthropic/claude-haiku-4-5">Claude Haiku 4.5</option>
+<option value="anthropic/claude-opus-4-6">Claude Opus 4.6</option>
+</optgroup>
+<optgroup label="Local">
+<option value="qwen3.5:35b-a3b">qwen3.5:35b-a3b</option>
+<option value="llama3.3:70b">llama3.3:70b</option>
+<option value="qwen3:32b">qwen3:32b</option>
+<option value="qwen2.5:7b">qwen2.5:7b</option>
+<option value="codestral:22b">codestral:22b</option>
+</optgroup>
</select>
</div>
<div>
<label className={labelClass}>Fast Model</label>
-<select className={selectClass} value={character.model_overrides?.fast || 'qwen2.5:7b'} onChange={(e) => handleNestedChange('model_overrides', 'fast', e.target.value)}>
-<option value="qwen2.5:7b">qwen2.5:7b</option>
-<option value="qwen3.5:35b-a3b">qwen3.5:35b-a3b</option>
-<option value="llama3.3:70b">llama3.3:70b</option>
-<option value="qwen3:32b">qwen3:32b</option>
-<option value="codestral:22b">codestral:22b</option>
+<select className={selectClass} value={character.model_overrides?.fast || ''} onChange={(e) => handleNestedChange('model_overrides', 'fast', e.target.value)}>
+<option value="">Default (system assigned)</option>
+<optgroup label="Cloud">
+<option value="anthropic/claude-haiku-4-5">Claude Haiku 4.5</option>
+<option value="anthropic/claude-sonnet-4-6">Claude Sonnet 4.6</option>
+</optgroup>
+<optgroup label="Local">
+<option value="qwen2.5:7b">qwen2.5:7b</option>
+<option value="qwen3.5:35b-a3b">qwen3.5:35b-a3b</option>
+<option value="llama3.3:70b">llama3.3:70b</option>
+<option value="qwen3:32b">qwen3:32b</option>
+<option value="codestral:22b">codestral:22b</option>
+</optgroup>
</select>
</div>
</div>
@@ -900,6 +1114,82 @@ export default function Editor() {
)}
</div>
{/* Prompt Style Overrides */}
<div className={cardClass}>
<h2 className="text-lg font-semibold text-gray-200">Prompt Style</h2>
<div>
<label className={labelClass}>Default Style</label>
<select
className={inputClass}
value={character.default_prompt_style || ''}
onChange={(e) => handleChange('default_prompt_style', e.target.value || '')}
>
<option value="">Use global default</option>
{PROMPT_STYLES.map((s) => (
<option key={s.id} value={s.id}>{s.name} ({s.group})</option>
))}
</select>
<p className="text-xs text-gray-500 mt-1">Auto-select this style when this character is active</p>
</div>
<details className="group">
<summary className="cursor-pointer text-sm text-gray-400 hover:text-gray-300">
Per-style overrides
</summary>
<div className="mt-3 space-y-3">
{PROMPT_STYLES.map((s) => {
const overrides = character.prompt_style_overrides || {};
const styleOverride = overrides[s.id] || {};
const hasContent = styleOverride.dialogue_style || styleOverride.system_prompt_suffix;
return (
<details key={s.id} className="group/inner">
<summary className={`cursor-pointer text-sm ${hasContent ? 'text-indigo-400' : 'text-gray-500'} hover:text-gray-300`}>
{s.name} {hasContent && '(customized)'}
</summary>
<div className="mt-2 space-y-2 pl-3 border-l border-gray-800">
<div>
<label className={labelClass}>Dialogue Style Override</label>
<textarea
className={inputClass + " h-16 resize-y text-sm"}
value={styleOverride.dialogue_style || ''}
onChange={(e) => {
const val = e.target.value;
setCharacter(prev => {
const newOverrides = { ...(prev.prompt_style_overrides || {}) };
newOverrides[s.id] = { ...(newOverrides[s.id] || {}), dialogue_style: val };
if (!val && !newOverrides[s.id].system_prompt_suffix) delete newOverrides[s.id];
return { ...prev, prompt_style_overrides: newOverrides };
});
}}
placeholder={`Custom dialogue style for ${s.name} mode...`}
/>
</div>
<div>
<label className={labelClass}>Additional Instructions</label>
<textarea
className={inputClass + " h-16 resize-y text-sm"}
value={styleOverride.system_prompt_suffix || ''}
onChange={(e) => {
const val = e.target.value;
setCharacter(prev => {
const newOverrides = { ...(prev.prompt_style_overrides || {}) };
newOverrides[s.id] = { ...(newOverrides[s.id] || {}), system_prompt_suffix: val };
if (!val && !newOverrides[s.id].dialogue_style) delete newOverrides[s.id];
return { ...prev, prompt_style_overrides: newOverrides };
});
}}
placeholder={`Extra instructions for ${s.name} mode...`}
/>
</div>
</div>
</details>
);
})}
</div>
</details>
</div>
{/* Notes */}
<div className={cardClass}>
<h2 className="text-lg font-semibold text-gray-200">Notes</h2>


@@ -2,6 +2,7 @@ import { useState, useEffect, useCallback } from 'react';
import {
getPersonalMemories, savePersonalMemory, deletePersonalMemory,
getGeneralMemories, saveGeneralMemory, deleteGeneralMemory,
getFollowups, resolveFollowup,
} from '../lib/memoryApi';
const PERSONAL_CATEGORIES = [
@@ -21,53 +22,131 @@ const GENERAL_CATEGORIES = [
{ value: 'other', label: 'Other', color: 'bg-gray-500/20 text-gray-300 border-gray-500/30' },
];
const LIFECYCLE_BADGES = {
active: 'bg-emerald-500/20 text-emerald-300 border-emerald-500/30',
pending_followup: 'bg-amber-500/20 text-amber-300 border-amber-500/30',
resolved: 'bg-gray-500/20 text-gray-400 border-gray-500/30',
archived: 'bg-gray-700/30 text-gray-500 border-gray-600/30',
};
const MEMORY_TYPE_BADGES = {
semantic: 'bg-indigo-500/20 text-indigo-300 border-indigo-500/30',
episodic: 'bg-cyan-500/20 text-cyan-300 border-cyan-500/30',
relational: 'bg-purple-500/20 text-purple-300 border-purple-500/30',
opinion: 'bg-rose-500/20 text-rose-300 border-rose-500/30',
};
const ACTIVE_KEY = 'homeai_active_character';
-function CategoryBadge({ category, categories }) {
-const cat = categories.find(c => c.value === category) || categories[categories.length - 1];
+function Badge({ label, colorClass }) {
return (
-<span className={`px-2 py-0.5 text-xs rounded-full border ${cat.color}`}>
+<span className={`px-2 py-0.5 text-xs rounded-full border ${colorClass}`}>
-{cat.label}
+{label}
</span>
);
}
+function CategoryBadge({ category, categories }) {
+const cat = categories.find(c => c.value === category) || categories[categories.length - 1];
+return <Badge label={cat.label} colorClass={cat.color} />;
+}
function PrivacyIcon({ level }) {
if (level === 'local_only') {
return (
<span title="Local only — never sent to cloud" className="text-rose-400">
<svg className="w-3.5 h-3.5" fill="none" viewBox="0 0 24 24" stroke="currentColor" strokeWidth={2}>
<path strokeLinecap="round" strokeLinejoin="round" d="M16.5 10.5V6.75a4.5 4.5 0 10-9 0v3.75m-.75 11.25h10.5a2.25 2.25 0 002.25-2.25v-6.75a2.25 2.25 0 00-2.25-2.25H6.75a2.25 2.25 0 00-2.25 2.25v6.75a2.25 2.25 0 002.25 2.25z" />
</svg>
</span>
);
}
if (level === 'sensitive') {
return (
<span title="Sensitive — cloud allowed but stripped when possible" className="text-amber-400">
<svg className="w-3.5 h-3.5" fill="none" viewBox="0 0 24 24" stroke="currentColor" strokeWidth={2}>
<path strokeLinecap="round" strokeLinejoin="round" d="M9 12.75L11.25 15 15 9.75m-3-7.036A11.959 11.959 0 013.598 6 11.99 11.99 0 003 9.749c0 5.592 3.824 10.29 9 11.623 5.176-1.332 9-6.03 9-11.622 0-1.31-.21-2.571-.598-3.751h-.152c-3.196 0-6.1-1.248-8.25-3.285z" />
</svg>
</span>
);
}
return null;
}
function ImportanceBar({ value }) {
const pct = Math.round((value || 0.5) * 100);
const color = value >= 0.8 ? 'bg-rose-400' : value >= 0.6 ? 'bg-amber-400' : 'bg-gray-500';
return (
<div title={`Importance: ${pct}%`} className="flex items-center gap-1">
<div className="w-12 h-1.5 bg-gray-700 rounded-full overflow-hidden">
<div className={`h-full ${color} rounded-full`} style={{ width: `${pct}%` }} />
</div>
</div>
);
}
function MemoryCard({ memory, categories, onEdit, onDelete }) {
const date = memory.created_at || memory.createdAt;
return ( return (
<div className="border border-gray-700 rounded-lg p-4 bg-gray-800/50 space-y-2"> <div className="border border-gray-700 rounded-lg p-4 bg-gray-800/50 space-y-2">
<div className="flex items-start justify-between gap-3"> <div className="flex items-start justify-between gap-3">
<p className="text-sm text-gray-200 flex-1 whitespace-pre-wrap">{memory.content}</p> <p className="text-sm text-gray-200 flex-1 whitespace-pre-wrap">{memory.content}</p>
<div className="flex gap-1 shrink-0"> <div className="flex gap-1 shrink-0">
<button <button onClick={() => onEdit(memory)} className="p-1.5 text-gray-500 hover:text-gray-300 transition-colors" title="Edit">
onClick={() => onEdit(memory)}
className="p-1.5 text-gray-500 hover:text-gray-300 transition-colors"
title="Edit"
>
<svg className="w-4 h-4" fill="none" viewBox="0 0 24 24" stroke="currentColor" strokeWidth={2}> <svg className="w-4 h-4" fill="none" viewBox="0 0 24 24" stroke="currentColor" strokeWidth={2}>
<path strokeLinecap="round" strokeLinejoin="round" d="M16.862 4.487l1.687-1.688a1.875 1.875 0 112.652 2.652L10.582 16.07a4.5 4.5 0 01-1.897 1.13L6 18l.8-2.685a4.5 4.5 0 011.13-1.897l8.932-8.931z" /> <path strokeLinecap="round" strokeLinejoin="round" d="M16.862 4.487l1.687-1.688a1.875 1.875 0 112.652 2.652L10.582 16.07a4.5 4.5 0 01-1.897 1.13L6 18l.8-2.685a4.5 4.5 0 011.13-1.897l8.932-8.931z" />
</svg> </svg>
</button> </button>
<button <button onClick={() => onDelete(memory.id)} className="p-1.5 text-gray-500 hover:text-red-400 transition-colors" title="Delete">
onClick={() => onDelete(memory.id)}
className="p-1.5 text-gray-500 hover:text-red-400 transition-colors"
title="Delete"
>
<svg className="w-4 h-4" fill="none" viewBox="0 0 24 24" stroke="currentColor" strokeWidth={2}> <svg className="w-4 h-4" fill="none" viewBox="0 0 24 24" stroke="currentColor" strokeWidth={2}>
<path strokeLinecap="round" strokeLinejoin="round" d="M14.74 9l-.346 9m-4.788 0L9.26 9m9.968-3.21c.342.052.682.107 1.022.166m-1.022-.165L18.16 19.673a2.25 2.25 0 01-2.244 2.077H8.084a2.25 2.25 0 01-2.244-2.077L4.772 5.79m14.456 0a48.108 48.108 0 00-3.478-.397m-12 .562c.34-.059.68-.114 1.022-.165m0 0a48.11 48.11 0 013.478-.397m7.5 0v-.916c0-1.18-.91-2.164-2.09-2.201a51.964 51.964 0 00-3.32 0c-1.18.037-2.09 1.022-2.09 2.201v.916m7.5 0a48.667 48.667 0 00-7.5 0" /> <path strokeLinecap="round" strokeLinejoin="round" d="M14.74 9l-.346 9m-4.788 0L9.26 9m9.968-3.21c.342.052.682.107 1.022.166m-1.022-.165L18.16 19.673a2.25 2.25 0 01-2.244 2.077H8.084a2.25 2.25 0 01-2.244-2.077L4.772 5.79m14.456 0a48.108 48.108 0 00-3.478-.397m-12 .562c.34-.059.68-.114 1.022-.165m0 0a48.11 48.11 0 013.478-.397m7.5 0v-.916c0-1.18-.91-2.164-2.09-2.201a51.964 51.964 0 00-3.32 0c-1.18.037-2.09 1.022-2.09 2.201v.916m7.5 0a48.667 48.667 0 00-7.5 0" />
</svg> </svg>
</button> </button>
</div> </div>
</div> </div>
<div className="flex items-center gap-2"> <div className="flex items-center gap-2 flex-wrap">
<CategoryBadge category={memory.category} categories={categories} /> <CategoryBadge category={memory.category} categories={categories} />
{memory.memory_type && (
<Badge label={memory.memory_type} colorClass={MEMORY_TYPE_BADGES[memory.memory_type] || MEMORY_TYPE_BADGES.semantic} />
)}
{memory.lifecycle_state && memory.lifecycle_state !== 'active' && (
<Badge label={memory.lifecycle_state.replace('_', ' ')} colorClass={LIFECYCLE_BADGES[memory.lifecycle_state] || LIFECYCLE_BADGES.active} />
)}
<PrivacyIcon level={memory.privacy_level} />
<ImportanceBar value={memory.importance} />
<span className="text-xs text-gray-600"> <span className="text-xs text-gray-600">
{memory.createdAt ? new Date(memory.createdAt).toLocaleDateString() : ''} {date ? new Date(date).toLocaleDateString() : ''}
</span> </span>
</div> </div>
</div> </div>
); );
} }
function FollowupCard({ followup, onResolve }) {
const date = followup.created_at || followup.createdAt;
return (
<div className="border border-amber-500/30 rounded-lg p-4 bg-amber-500/5 space-y-2">
<div className="flex items-start justify-between gap-3">
<div className="flex-1">
<p className="text-sm text-amber-200 font-medium">{followup.follow_up_context}</p>
<p className="text-xs text-gray-400 mt-1">{followup.content}</p>
</div>
<button
onClick={() => onResolve(followup.id)}
className="px-3 py-1.5 bg-amber-600/20 hover:bg-amber-600/40 text-amber-300 text-xs rounded-lg border border-amber-500/30 transition-colors shrink-0"
>
Resolve
</button>
</div>
<div className="flex items-center gap-3 text-xs text-gray-500">
<span>Due: {followup.follow_up_due === 'next_interaction' ? 'Next interaction' : new Date(followup.follow_up_due).toLocaleString()}</span>
<span>Surfaced: {followup.surfaced_count || 0}x</span>
<span>{date ? new Date(date).toLocaleDateString() : ''}</span>
</div>
</div>
);
}
function MemoryForm({ categories, editing, onSave, onCancel }) {
const [content, setContent] = useState(editing?.content || '');
const [category, setCategory] = useState(editing?.category || categories[0].value);
@@ -104,10 +183,7 @@ function MemoryForm({ categories, editing, onSave, onCancel }) {
))}
</select>
<div className="flex gap-2 ml-auto">
<button onClick={onCancel} className="px-3 py-1.5 bg-gray-700 hover:bg-gray-600 text-gray-300 text-sm rounded-lg transition-colors">
Cancel
</button>
<button
@@ -124,10 +200,11 @@ function MemoryForm({ categories, editing, onSave, onCancel }) {
}
export default function Memories() {
const [tab, setTab] = useState('personal'); // 'personal' | 'general' | 'followups'
const [characters, setCharacters] = useState([]);
const [selectedCharId, setSelectedCharId] = useState('');
const [memories, setMemories] = useState([]);
const [followups, setFollowups] = useState([]);
const [loading, setLoading] = useState(false);
const [showForm, setShowForm] = useState(false);
const [editing, setEditing] = useState(null);
@@ -155,14 +232,21 @@ export default function Memories() {
setLoading(true);
setError(null);
try {
if (tab === 'followups' && selectedCharId) {
const data = await getFollowups(selectedCharId);
setFollowups(data.followups || []);
setMemories([]);
} else if (tab === 'personal' && selectedCharId) {
const data = await getPersonalMemories(selectedCharId);
setMemories(data.memories || []);
setFollowups([]);
} else if (tab === 'general') {
const data = await getGeneralMemories();
setMemories(data.memories || []);
setFollowups([]);
} else {
setMemories([]);
setFollowups([]);
}
} catch (err) {
setError(err.message);
@@ -206,14 +290,27 @@ export default function Memories() {
setShowForm(true);
};
const handleResolve = async (memoryId) => {
try {
await resolveFollowup(memoryId);
await loadMemories();
} catch (err) {
setError(err.message);
}
};
const categories = tab === 'personal' ? PERSONAL_CATEGORIES : GENERAL_CATEGORIES;
const filteredMemories = filter
? memories.filter(m =>
m.content?.toLowerCase().includes(filter.toLowerCase()) ||
m.category === filter ||
m.memory_type === filter
)
: memories;
// Sort newest first
const sortedMemories = [...filteredMemories].sort(
(a, b) => (b.created_at || b.createdAt || '').localeCompare(a.created_at || a.createdAt || '')
);
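The newest-first sort relies on ISO-8601 timestamps comparing correctly as plain strings, and prefers the snake_case `created_at` written by the SQLite backend over the legacy camelCase `createdAt` from the old JSON store. A small sketch of the same logic in isolation (helper names are illustrative):

```javascript
// Newest-first sort over mixed-era memory records.
const ts = (m) => m.created_at || m.createdAt || '';
const sortNewestFirst = (memories) =>
  [...memories].sort((a, b) => ts(b).localeCompare(ts(a)));

const sorted = sortNewestFirst([
  { id: 'old-json', createdAt: '2026-01-01T00:00:00Z' }, // legacy field
  { id: 'sqlite', created_at: '2026-03-01T00:00:00Z' },  // new field
]);
console.log(sorted.map(m => m.id)); // [ 'sqlite', 'old-json' ]
```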
const selectedChar = characters.find(c => c.id === selectedCharId);
@@ -221,27 +318,31 @@ export default function Memories() {
return (
<div className="space-y-6">
{/* Header */}
<div className="flex flex-col sm:flex-row sm:items-center justify-between gap-3">
<div>
<h1 className="text-2xl sm:text-3xl font-bold text-gray-100">Memories</h1>
<p className="text-sm text-gray-500 mt-1">
{tab === 'followups'
? `${followups.length} pending follow-up${followups.length !== 1 ? 's' : ''}`
: `${sortedMemories.length} ${tab} memor${sortedMemories.length !== 1 ? 'ies' : 'y'}`}
{(tab === 'personal' || tab === 'followups') && selectedChar && (
<span className="ml-1 text-indigo-400">
for {selectedChar.data?.display_name || selectedChar.data?.name || selectedCharId}
</span>
)}
</p>
</div>
{tab !== 'followups' && (
<button
onClick={() => { setEditing(null); setShowForm(!showForm); }}
className="flex items-center gap-2 px-4 py-2 bg-indigo-600 hover:bg-indigo-500 text-white rounded-lg transition-colors"
>
<svg className="w-4 h-4" fill="none" viewBox="0 0 24 24" stroke="currentColor" strokeWidth={2}>
<path strokeLinecap="round" strokeLinejoin="round" d="M12 4.5v15m7.5-7.5h-15" />
</svg>
Add Memory
</button>
)}
</div>
{error && (
@@ -253,30 +354,21 @@ export default function Memories() {
{/* Tabs */}
<div className="flex gap-1 bg-gray-900 p-1 rounded-lg border border-gray-800 w-fit">
{['personal', 'general', 'followups'].map(t => (
<button
key={t}
onClick={() => { setTab(t); setShowForm(false); setEditing(null); }}
className={`px-4 py-2 text-sm font-medium rounded-md transition-colors ${
tab === t ? 'bg-gray-800 text-white' : 'text-gray-400 hover:text-gray-200'
}`}
>
{t === 'followups' ? 'Follow-ups' : t.charAt(0).toUpperCase() + t.slice(1)}
</button>
))}
</div>
{/* Character selector (personal + followups tabs) */}
{(tab === 'personal' || tab === 'followups') && (
<div className="flex items-center gap-3">
<label className="text-sm text-gray-400">Character</label>
<select
@@ -293,19 +385,21 @@ export default function Memories() {
</div>
)}
{/* Search filter (not for followups) */}
{tab !== 'followups' && (
<div>
<input
type="text"
className="w-full bg-gray-800 border border-gray-700 text-gray-200 p-2 rounded-lg text-sm focus:border-indigo-500 focus:ring-1 focus:ring-indigo-500 outline-none"
value={filter}
onChange={(e) => setFilter(e.target.value)}
placeholder="Search memories..."
/>
</div>
)}
{/* Add/Edit form */}
{showForm && tab !== 'followups' && (
<MemoryForm
categories={categories}
editing={editing}
@@ -314,11 +408,26 @@ export default function Memories() {
/>
)}
{/* Content */}
{loading ? (
<div className="text-center py-12">
<p className="text-gray-500">Loading...</p>
</div>
) : tab === 'followups' ? (
followups.length === 0 ? (
<div className="text-center py-12">
<svg className="w-12 h-12 mx-auto text-gray-700 mb-3" fill="none" viewBox="0 0 24 24" stroke="currentColor" strokeWidth={1}>
<path strokeLinecap="round" strokeLinejoin="round" d="M9 12.75L11.25 15 15 9.75M21 12a9 9 0 11-18 0 9 9 0 0118 0z" />
</svg>
<p className="text-gray-500 text-sm">No pending follow-ups.</p>
</div>
) : (
<div className="space-y-3">
{followups.map(fu => (
<FollowupCard key={fu.id} followup={fu} onResolve={handleResolve} />
))}
</div>
)
) : sortedMemories.length === 0 ? (
<div className="text-center py-12">
<svg className="w-12 h-12 mx-auto text-gray-700 mb-3" fill="none" viewBox="0 0 24 24" stroke="currentColor" strokeWidth={1}>
View File
@@ -7,8 +7,13 @@ const SATELLITE_MAP_PATH = '/Users/aodhan/homeai-data/satellite-map.json'
const CONVERSATIONS_DIR = '/Users/aodhan/homeai-data/conversations'
const MEMORIES_DIR = '/Users/aodhan/homeai-data/memories'
const MODE_PATH = '/Users/aodhan/homeai-data/active-mode.json'
const ACTIVE_STYLE_PATH = '/Users/aodhan/homeai-data/active-prompt-style.json'
const PROMPT_STYLES_DIR = new URL('./homeai-agent/prompt-styles', import.meta.url).pathname
const HA_TOKEN = process.env.HA_TOKEN || ''
const GAZE_HOST = 'http://10.0.0.101:5782'
const GAZE_API_KEY = process.env.GAZE_API_KEY || ''
const DREAM_HOST = process.env.DREAM_HOST || 'http://10.0.0.101:3000'
const DREAM_API_KEY = process.env.DREAM_API_KEY || ''
function characterStoragePlugin() {
return {
@@ -272,6 +277,7 @@ function healthCheckPlugin() {
const params = new URL(req.url, 'http://localhost').searchParams;
const url = params.get('url');
const mode = params.get('mode'); // 'tcp' for raw TCP port check
const needsAuth = params.get('auth') === '1'; // use server-side HA_TOKEN
if (!url) {
res.writeHead(400, { 'Content-Type': 'application/json' });
res.end(JSON.stringify({ error: 'Missing url param' }));
@@ -298,8 +304,12 @@ function healthCheckPlugin() {
const { default: http } = await import('http');
const client = parsedUrl.protocol === 'https:' ? https : http;
const opts = { rejectUnauthorized: false, timeout: 5000 };
if (needsAuth && HA_TOKEN) {
opts.headers = { 'Authorization': `Bearer ${HA_TOKEN}` };
}
await new Promise((resolve, reject) => {
const reqObj = client.get(url, opts, (resp) => {
resp.resume();
resolve();
});
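The `auth=1` flag keeps the HA token server-side: the browser asks the dev server to run the health check, and the dev server attaches its own `HA_TOKEN` header before making the request. A sketch of the option-building step in isolation (`buildHealthOpts` is an illustrative name):

```javascript
// Build http(s) request options for a health check; the Authorization header
// is only attached when auth is requested AND a token is configured.
function buildHealthOpts(needsAuth, token) {
  const opts = { rejectUnauthorized: false, timeout: 5000 };
  if (needsAuth && token) {
    opts.headers = { Authorization: `Bearer ${token}` };
  }
  return opts;
}

console.log(buildHealthOpts(true, 'secret').headers.Authorization); // 'Bearer secret'
console.log('headers' in buildHealthOpts(true, ''));                // false — no token, no header
```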
@@ -387,15 +397,16 @@ function gazeProxyPlugin() {
return {
name: 'gaze-proxy',
configureServer(server) {
// Helper to proxy a JSON GET to GAZE
const proxyGazeJson = async (apiPath, res, fallback) => {
if (!GAZE_API_KEY) {
res.writeHead(200, { 'Content-Type': 'application/json', 'Access-Control-Allow-Origin': '*' })
res.end(JSON.stringify(fallback))
return
}
try {
const http = await import('http')
const url = new URL(`${GAZE_HOST}${apiPath}`)
const proxyRes = await new Promise((resolve, reject) => {
const r = http.default.get(url, { headers: { 'X-API-Key': GAZE_API_KEY }, timeout: 5000 }, resolve)
r.on('error', reject)
@@ -407,8 +418,110 @@ function gazeProxyPlugin() {
res.end(Buffer.concat(chunks))
} catch {
res.writeHead(200, { 'Content-Type': 'application/json', 'Access-Control-Allow-Origin': '*' })
res.end(JSON.stringify(fallback))
}
}
server.middlewares.use('/api/gaze/presets', async (req, res) => {
await proxyGazeJson('/api/v1/presets', res, { presets: [] })
})
server.middlewares.use('/api/gaze/characters', async (req, res) => {
await proxyGazeJson('/api/v1/characters', res, { characters: [] })
})
// Proxy cover image for a GAZE character (binary passthrough)
server.middlewares.use(async (req, res, next) => {
const match = req.url.match(/^\/api\/gaze\/character\/([a-zA-Z0-9_\-]+)\/cover/)
if (!match) return next()
const characterId = match[1]
if (!GAZE_API_KEY) {
res.writeHead(404)
res.end()
return
}
try {
const { default: http } = await import('http')
const url = new URL(`${GAZE_HOST}/api/v1/character/${characterId}/cover`)
const r = http.get(url, { headers: { 'X-API-Key': GAZE_API_KEY }, timeout: 5000 }, (proxyRes) => {
res.writeHead(proxyRes.statusCode, {
'Content-Type': proxyRes.headers['content-type'] || 'image/png',
'Access-Control-Allow-Origin': '*',
'Cache-Control': 'public, max-age=3600',
})
proxyRes.pipe(res)
})
r.on('error', () => { if (!res.headersSent) { res.writeHead(502); res.end() } })
r.on('timeout', () => { r.destroy(); if (!res.headersSent) { res.writeHead(504); res.end() } })
} catch {
if (!res.headersSent) { res.writeHead(500); res.end() }
}
})
},
}
}
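Connect's `middlewares.use(path, …)` only does prefix mounting, so the cover route above is matched manually with a regex to extract the character id from the URL. That matching step in isolation (the sample id is hypothetical):

```javascript
// Extract the GAZE character id from a cover URL (same regex as the plugin).
const coverRe = /^\/api\/gaze\/character\/([a-zA-Z0-9_\-]+)\/cover/;
const matchCoverId = (url) => {
  const m = url.match(coverRe);
  return m ? m[1] : null; // null → fall through to the next middleware
};

console.log(matchCoverId('/api/gaze/character/luna_v2/cover')); // 'luna_v2'
console.log(matchCoverId('/api/gaze/presets'));                 // null
```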
function dreamProxyPlugin() {
const dreamHeaders = DREAM_API_KEY ? { 'X-API-Key': DREAM_API_KEY } : {}
return {
name: 'dream-proxy',
configureServer(server) {
// Helper: proxy a JSON GET to Dream
const proxyDreamJson = async (apiPath, res) => {
try {
const http = await import('http')
const url = new URL(`${DREAM_HOST}${apiPath}`)
const proxyRes = await new Promise((resolve, reject) => {
const r = http.default.get(url, { headers: dreamHeaders, timeout: 5000 }, resolve)
r.on('error', reject)
r.on('timeout', () => { r.destroy(); reject(new Error('timeout')) })
})
const chunks = []
for await (const chunk of proxyRes) chunks.push(chunk)
res.writeHead(proxyRes.statusCode, { 'Content-Type': 'application/json', 'Access-Control-Allow-Origin': '*' })
res.end(Buffer.concat(chunks))
} catch {
res.writeHead(200, { 'Content-Type': 'application/json', 'Access-Control-Allow-Origin': '*' })
res.end(JSON.stringify({ characters: [], error: 'Dream unreachable' }))
}
}
// List characters (compact)
server.middlewares.use('/api/dream/characters', async (req, res, next) => {
// Only handle exact path (not sub-paths like /api/dream/characters/abc/image)
if (req.url !== '/' && req.url !== '' && !req.url.startsWith('?')) return next()
const qs = req.url === '/' || req.url === '' ? '' : req.url
await proxyDreamJson(`/api/characters${qs}`, res)
})
// Character image (binary passthrough)
server.middlewares.use(async (req, res, next) => {
const match = req.url.match(/^\/api\/dream\/characters\/([^/]+)\/image/)
if (!match) return next()
const charId = match[1]
try {
const { default: http } = await import('http')
const url = new URL(`${DREAM_HOST}/api/characters/${charId}/image`)
const r = http.get(url, { headers: dreamHeaders, timeout: 5000 }, (proxyRes) => {
res.writeHead(proxyRes.statusCode, {
'Content-Type': proxyRes.headers['content-type'] || 'image/png',
'Access-Control-Allow-Origin': '*',
'Cache-Control': 'public, max-age=3600',
})
proxyRes.pipe(res)
})
r.on('error', () => { if (!res.headersSent) { res.writeHead(502); res.end() } })
r.on('timeout', () => { r.destroy(); if (!res.headersSent) { res.writeHead(504); res.end() } })
} catch {
if (!res.headersSent) { res.writeHead(500); res.end() }
}
})
// Get single character (full details)
server.middlewares.use(async (req, res, next) => {
const match = req.url.match(/^\/api\/dream\/characters\/([^/]+)\/?$/)
if (!match) return next()
await proxyDreamJson(`/api/characters/${match[1]}`, res)
})
},
}
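Because the list middleware is mounted at `/api/dream/characters`, Connect strips that prefix and `req.url` carries only the remainder (`/`, an empty string, or a query string for the exact path; `/<id>/image` for sub-paths). The exact-path guard plus query passthrough can be sketched as (`dreamListTarget` is an illustrative name, and the query semantics mirror the checks above):

```javascript
// Decide whether a mount-relative remainder is the character-list route;
// return the upstream Dream path (query string passed through) or null to
// fall through to the sub-path middlewares.
function dreamListTarget(remainder) {
  if (remainder !== '/' && remainder !== '' && !remainder.startsWith('?')) return null;
  const qs = remainder === '/' || remainder === '' ? '' : remainder;
  return `/api/characters${qs}`;
}

console.log(dreamListTarget('?limit=5'));   // '/api/characters?limit=5'
console.log(dreamListTarget('/abc/image')); // null — handled by the image middleware
console.log(dreamListTarget('/'));          // '/api/characters'
```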
@@ -418,163 +531,67 @@ function memoryStoragePlugin() {
return {
name: 'memory-storage',
configureServer(server) {
// Proxy all /api/memories/* requests to the OpenClaw bridge (port 8081)
// The bridge handles SQLite + vector search; dashboard is just a passthrough
const proxyMemoryRequest = async (req, res) => {
if (req.method === 'OPTIONS') {
res.writeHead(204, {
'Access-Control-Allow-Origin': '*',
'Access-Control-Allow-Methods': 'GET,POST,PUT,DELETE,OPTIONS',
'Access-Control-Allow-Headers': 'Content-Type',
})
res.end()
return
}
try {
const { default: http } = await import('http')
const chunks = []
for await (const chunk of req) chunks.push(chunk)
const body = Buffer.concat(chunks)
// Reconstruct full path: /api/memories/... (req.url has the part after /api/memories)
const targetPath = `/api/memories${req.url}`
await new Promise((resolve, reject) => {
const proxyReq = http.request(
`http://localhost:8081${targetPath}`,
{
method: req.method,
headers: {
'Content-Type': req.headers['content-type'] || 'application/json',
...(body.length > 0 ? { 'Content-Length': body.length } : {}),
},
timeout: 30000,
},
(proxyRes) => {
res.writeHead(proxyRes.statusCode, {
'Content-Type': proxyRes.headers['content-type'] || 'application/json',
'Access-Control-Allow-Origin': '*',
})
proxyRes.pipe(res)
proxyRes.on('end', resolve)
proxyRes.on('error', resolve)
}
)
proxyReq.on('error', reject)
proxyReq.on('timeout', () => {
proxyReq.destroy()
reject(new Error('timeout'))
})
if (body.length > 0) proxyReq.write(body)
proxyReq.end()
})
} catch (err) {
console.error(`[memory-proxy] failed:`, err?.message || err)
if (!res.headersSent) {
res.writeHead(502, { 'Content-Type': 'application/json' })
res.end(JSON.stringify({ error: `Bridge unreachable: ${err?.message || 'unknown'}` }))
}
}
}
server.middlewares.use('/api/memories', proxyMemoryRequest)
// Personal memories: /api/memories/personal/:characterId[/:memoryId]
server.middlewares.use('/api/memories/personal', async (req, res, next) => {
if (req.method === 'OPTIONS') {
res.writeHead(204, { 'Access-Control-Allow-Origin': '*', 'Access-Control-Allow-Methods': 'GET,POST,DELETE', 'Access-Control-Allow-Headers': 'Content-Type' })
res.end()
return
}
await ensureDirs()
const url = new URL(req.url, 'http://localhost')
const parts = url.pathname.replace(/^\/+/, '').split('/')
const characterId = parts[0] ? parts[0].replace(/[^a-zA-Z0-9_\-\.]/g, '_') : null
const memoryId = parts[1] || null
if (!characterId) {
res.writeHead(400, { 'Content-Type': 'application/json' })
res.end(JSON.stringify({ error: 'Missing character ID' }))
return
}
const filePath = `${MEMORIES_DIR}/personal/${characterId}.json`
if (req.method === 'GET') {
const data = await readJsonFile(filePath, { characterId, memories: [] })
res.writeHead(200, { 'Content-Type': 'application/json', 'Access-Control-Allow-Origin': '*' })
res.end(JSON.stringify(data))
return
}
if (req.method === 'POST') {
try {
const chunks = []
for await (const chunk of req) chunks.push(chunk)
const memory = JSON.parse(Buffer.concat(chunks).toString())
const data = await readJsonFile(filePath, { characterId, memories: [] })
if (memory.id) {
const idx = data.memories.findIndex(m => m.id === memory.id)
if (idx >= 0) {
data.memories[idx] = { ...data.memories[idx], ...memory }
} else {
data.memories.push(memory)
}
} else {
memory.id = 'm_' + Date.now()
memory.createdAt = memory.createdAt || new Date().toISOString()
data.memories.push(memory)
}
await writeJsonFile(filePath, data)
res.writeHead(200, { 'Content-Type': 'application/json', 'Access-Control-Allow-Origin': '*' })
res.end(JSON.stringify({ ok: true, memory }))
} catch (err) {
res.writeHead(500, { 'Content-Type': 'application/json' })
res.end(JSON.stringify({ error: err.message }))
}
return
}
if (req.method === 'DELETE' && memoryId) {
try {
const data = await readJsonFile(filePath, { characterId, memories: [] })
data.memories = data.memories.filter(m => m.id !== memoryId)
await writeJsonFile(filePath, data)
res.writeHead(200, { 'Content-Type': 'application/json', 'Access-Control-Allow-Origin': '*' })
res.end(JSON.stringify({ ok: true }))
} catch (err) {
res.writeHead(500, { 'Content-Type': 'application/json' })
res.end(JSON.stringify({ error: err.message }))
}
return
}
next()
})
// General memories: /api/memories/general[/:memoryId]
server.middlewares.use('/api/memories/general', async (req, res, next) => {
if (req.method === 'OPTIONS') {
res.writeHead(204, { 'Access-Control-Allow-Origin': '*', 'Access-Control-Allow-Methods': 'GET,POST,DELETE', 'Access-Control-Allow-Headers': 'Content-Type' })
res.end()
return
}
await ensureDirs()
const url = new URL(req.url, 'http://localhost')
const memoryId = url.pathname.replace(/^\/+/, '') || null
const filePath = `${MEMORIES_DIR}/general.json`
if (req.method === 'GET') {
const data = await readJsonFile(filePath, { memories: [] })
res.writeHead(200, { 'Content-Type': 'application/json', 'Access-Control-Allow-Origin': '*' })
res.end(JSON.stringify(data))
return
}
if (req.method === 'POST') {
try {
const chunks = []
for await (const chunk of req) chunks.push(chunk)
const memory = JSON.parse(Buffer.concat(chunks).toString())
const data = await readJsonFile(filePath, { memories: [] })
if (memory.id) {
const idx = data.memories.findIndex(m => m.id === memory.id)
if (idx >= 0) {
data.memories[idx] = { ...data.memories[idx], ...memory }
} else {
data.memories.push(memory)
}
} else {
memory.id = 'm_' + Date.now()
memory.createdAt = memory.createdAt || new Date().toISOString()
data.memories.push(memory)
}
await writeJsonFile(filePath, data)
res.writeHead(200, { 'Content-Type': 'application/json', 'Access-Control-Allow-Origin': '*' })
res.end(JSON.stringify({ ok: true, memory }))
} catch (err) {
res.writeHead(500, { 'Content-Type': 'application/json' })
res.end(JSON.stringify({ error: err.message }))
}
return
}
if (req.method === 'DELETE' && memoryId) {
try {
const data = await readJsonFile(filePath, { memories: [] })
data.memories = data.memories.filter(m => m.id !== memoryId)
await writeJsonFile(filePath, data)
res.writeHead(200, { 'Content-Type': 'application/json', 'Access-Control-Allow-Origin': '*' })
res.end(JSON.stringify({ ok: true }))
} catch (err) {
res.writeHead(500, { 'Content-Type': 'application/json' })
res.end(JSON.stringify({ error: err.message }))
}
return
}
next()
})
},
}
}
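With this change the dashboard no longer owns memory storage; every `/api/memories/*` call is rebuilt into a full path and forwarded to the bridge. Assuming the bridge stays on port 8081, the URL reconstruction reduces to (the character id and route tails here are illustrative, not confirmed endpoints):

```javascript
// Rebuild the bridge URL from the middleware-relative tail, as the proxy does.
const bridgeUrl = (tail) => `http://localhost:8081/api/memories${tail}`;

console.log(bridgeUrl('/personal/luna')); // http://localhost:8081/api/memories/personal/luna
console.log(bridgeUrl('/general'));       // http://localhost:8081/api/memories/general
```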
@@ -698,6 +715,96 @@ function modePlugin() {
}
}
function promptStylePlugin() {
return {
name: 'prompt-style-api',
configureServer(server) {
// GET /api/prompt-styles — list all available styles
server.middlewares.use('/api/prompt-styles', async (req, res, next) => {
if (req.method === 'OPTIONS') {
res.writeHead(204, { 'Access-Control-Allow-Origin': '*', 'Access-Control-Allow-Methods': 'GET', 'Access-Control-Allow-Headers': 'Content-Type' })
res.end()
return
}
if (req.method !== 'GET') { next(); return }
const { readdir, readFile } = await import('fs/promises')
try {
const files = (await readdir(PROMPT_STYLES_DIR)).filter(f => f.endsWith('.json'))
const styles = []
for (const file of files) {
try {
const raw = await readFile(`${PROMPT_STYLES_DIR}/${file}`, 'utf-8')
styles.push(JSON.parse(raw))
} catch { /* skip corrupt files */ }
}
// Sort: cloud group first, then local
styles.sort((a, b) => {
if (a.group !== b.group) return a.group === 'cloud' ? -1 : 1
return (a.id || '').localeCompare(b.id || '')
})
res.writeHead(200, { 'Content-Type': 'application/json', 'Access-Control-Allow-Origin': '*' })
res.end(JSON.stringify(styles))
} catch (err) {
res.writeHead(500, { 'Content-Type': 'application/json' })
res.end(JSON.stringify({ error: err.message }))
}
})
// GET/POST /api/prompt-style — active style
server.middlewares.use('/api/prompt-style', async (req, res, next) => {
// Avoid matching /api/prompt-styles (plural)
const url = new URL(req.url, 'http://localhost')
if (url.pathname !== '/') { next(); return }
if (req.method === 'OPTIONS') {
res.writeHead(204, { 'Access-Control-Allow-Origin': '*', 'Access-Control-Allow-Methods': 'GET,POST', 'Access-Control-Allow-Headers': 'Content-Type' })
res.end()
return
}
const { readFile, writeFile } = await import('fs/promises')
if (req.method === 'GET') {
try {
const raw = await readFile(ACTIVE_STYLE_PATH, 'utf-8')
res.writeHead(200, { 'Content-Type': 'application/json', 'Access-Control-Allow-Origin': '*' })
res.end(raw)
} catch {
res.writeHead(200, { 'Content-Type': 'application/json', 'Access-Control-Allow-Origin': '*' })
res.end(JSON.stringify({ style: 'standard', updated_at: '' }))
}
return
}
if (req.method === 'POST') {
try {
const chunks = []
for await (const chunk of req) chunks.push(chunk)
const data = JSON.parse(Buffer.concat(chunks).toString())
const VALID_STYLES = ['quick', 'standard', 'creative', 'roleplayer', 'game-master', 'storyteller']
if (!data.style || !VALID_STYLES.includes(data.style)) {
res.writeHead(400, { 'Content-Type': 'application/json' })
res.end(JSON.stringify({ error: `Invalid style. Valid: ${VALID_STYLES.join(', ')}` }))
return
}
const state = { style: data.style, updated_at: new Date().toISOString() }
await writeFile(ACTIVE_STYLE_PATH, JSON.stringify(state, null, 2))
res.writeHead(200, { 'Content-Type': 'application/json', 'Access-Control-Allow-Origin': '*' })
res.end(JSON.stringify({ ok: true, ...state }))
} catch (err) {
res.writeHead(500, { 'Content-Type': 'application/json' })
res.end(JSON.stringify({ error: err.message }))
}
return
}
next()
})
},
}
}
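The active-style endpoint validates the requested id against a fixed whitelist before writing the state file. The validation and the persisted shape, sketched as a pure function (`buildStyleState` is an illustrative name):

```javascript
// Validate a requested prompt style and build the persisted state object.
const VALID_STYLES = ['quick', 'standard', 'creative', 'roleplayer', 'game-master', 'storyteller'];

function buildStyleState(style, now = new Date()) {
  if (!style || !VALID_STYLES.includes(style)) {
    throw new Error(`Invalid style. Valid: ${VALID_STYLES.join(', ')}`);
  }
  return { style, updated_at: now.toISOString() };
}

console.log(buildStyleState('creative', new Date('2026-03-24T22:31:04Z')));
// { style: 'creative', updated_at: '2026-03-24T22:31:04.000Z' }
```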
function bridgeProxyPlugin() {
return {
name: 'bridge-proxy',
@@ -771,10 +878,12 @@ export default defineConfig({
satelliteMapPlugin(),
conversationStoragePlugin(),
memoryStoragePlugin(),
dreamProxyPlugin(),
gazeProxyPlugin(),
characterLookupPlugin(),
healthCheckPlugin(),
modePlugin(),
promptStylePlugin(),
bridgeProxyPlugin(),
tailwindcss(),
react(),