Compare commits: frontend-d ... docker (1 commit, SHA da55b0889b)
.dockerignore (new file, 8 lines)

@@ -0,0 +1,8 @@

```
venv/
__pycache__/
*.pyc
instance/
flask_session/
static/uploads/
tools/
.git/
```
CLAUDE.md (67 lines changed)

@@ -96,6 +96,30 @@ Cross-deduplicates tags between the positive and negative prompt strings. For ea
Called as the last step of `_prepare_workflow`, after `_apply_checkpoint_settings` has added `base_positive`/`base_negative`, so it operates on fully-assembled prompts.

### `_IDENTITY_KEYS` / `_WARDROBE_KEYS` (module-level constants)

Lists of canonical field names for the `identity` and `wardrobe` sections. Used by `_ensure_character_fields()` to avoid hard-coding key lists in every route.
### `_resolve_character(character_slug)`

Returns a `Character` ORM object for a given slug string. Handles the `"__random__"` sentinel by selecting a random character. Returns `None` if `character_slug` is falsy or no match is found. Every route that accepts an optional character dropdown (outfit, action, style, scene, detailer, checkpoint, look generate routes) uses this instead of an inline if/elif block.
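The slug-resolution rules above can be sketched standalone. `CHARACTERS` below is a hypothetical stand-in for the real ORM query, and `resolve_character` is an illustrative rename of the helper:

```python
import random

# Hypothetical stand-in for the Character table; the real helper
# queries the database by slug.
CHARACTERS = {"alice": "Character<alice>", "bob": "Character<bob>"}

def resolve_character(character_slug):
    """Sketch of _resolve_character: falsy slug -> None, the
    "__random__" sentinel -> random pick, unknown slug -> None."""
    if not character_slug:
        return None
    if character_slug == "__random__":
        return random.choice(list(CHARACTERS.values()))
    return CHARACTERS.get(character_slug)  # None when no match
```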
### `_ensure_character_fields(character, selected_fields, include_wardrobe=True, include_defaults=False)`

Mutates `selected_fields` in place, appending any populated identity, wardrobe, and optional defaults keys that are not already present. Ensures `"special::name"` is always included. Called in every secondary-category generate route immediately after `_resolve_character()` to guarantee the character's essential fields are sent to `build_prompt`.
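A minimal sketch of that merge behaviour, with toy key lists standing in for the real `_IDENTITY_KEYS`/`_WARDROBE_KEYS` and a plain dict standing in for the ORM object:

```python
_IDENTITY_KEYS = ["hair", "eyes"]      # illustrative, not the real lists
_WARDROBE_KEYS = ["top", "legwear"]

def ensure_character_fields(character, selected_fields, include_wardrobe=True):
    """Sketch of _ensure_character_fields: append populated keys that
    are missing, and always include the special name field."""
    keys = list(_IDENTITY_KEYS) + (list(_WARDROBE_KEYS) if include_wardrobe else [])
    for key in keys:
        if character.get(key) and key not in selected_fields:
            selected_fields.append(key)
    if "special::name" not in selected_fields:
        selected_fields.append("special::name")

fields = ["top"]
ensure_character_fields({"hair": "red", "top": "shirt", "legwear": ""}, fields)
# "hair" is appended (populated, missing), "legwear" skipped (empty),
# "top" skipped (already present), "special::name" always added
```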
### `_append_background(prompts, character=None)`

Appends a `"<primary_color> simple background"` tag (or plain `"simple background"` if no primary color) to `prompts['main']`. Called in outfit, action, style, detailer, and checkpoint generate routes instead of repeating the same inline string construction.
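A sketch under two assumptions not stated above: that `prompts['main']` is a comma-joined tag string, and that the primary color lives at a hypothetical `primary_color` key on the character:

```python
def append_background(prompts, character=None):
    """Sketch of _append_background under the assumptions in the lead-in."""
    color = (character or {}).get("primary_color")
    tag = f"{color} simple background" if color else "simple background"
    # append to the main prompt string, comma-separated
    prompts["main"] = f"{prompts['main']}, {tag}" if prompts.get("main") else tag
    return prompts
```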
### `_make_finalize(category, slug, db_model_class=None, action=None)`

Factory function that returns a `_finalize(comfy_prompt_id, job)` callback closure. The closure:

1. Calls `get_history()` and `get_image()` to retrieve the generated image from ComfyUI.
2. Saves the image to `static/uploads/<category>/<slug>/gen_<timestamp>.png`.
3. Sets `job['result']` with `image_url` and `relative_path`.
4. If `db_model_class` is provided **and** (`action` is `None` or `action == 'replace'`), updates the ORM object's `image_path` and commits.

All `generate` routes pass a `_make_finalize(...)` call as the finalize argument to `_enqueue_job()` instead of defining an inline closure.
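The factory shape can be sketched with the ComfyUI retrieval stubbed out; steps 1-2 are represented by a comment, and `db_obj` is a plain dict standing in for the ORM object:

```python
import time

def make_finalize(category, slug, db_obj=None, action=None):
    """Sketch of _make_finalize: returns a closure the worker calls
    with (comfy_prompt_id, job)."""
    def _finalize(comfy_prompt_id, job):
        # 1-2. (real code: get_history()/get_image(), save PNG to uploads)
        rel_path = f"{category}/{slug}/gen_{int(time.time())}.png"
        # 3. expose the result to the polling frontend
        job["result"] = {"image_url": "/static/uploads/" + rel_path,
                         "relative_path": rel_path}
        # 4. update the DB cover only when action is None or "replace"
        if db_obj is not None and (action is None or action == "replace"):
            db_obj["image_path"] = rel_path
    return _finalize
```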
### `_prune_job_history(max_age_seconds=3600)`

Removes entries from `_job_history` that are in a terminal state (`done`, `failed`, `removed`) and older than `max_age_seconds`. Called at the end of every worker loop iteration to prevent unbounded memory growth.
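A sketch assuming each history entry records its status and a `finished_at` timestamp (hypothetical field name); the `now` parameter is added here only to make the cutoff testable:

```python
import time

_TERMINAL = {"done", "failed", "removed"}

def prune_job_history(history, max_age_seconds=3600, now=None):
    """Sketch of _prune_job_history: drop terminal entries older than
    the cutoff. `history` maps job_id -> {"status", "finished_at"}."""
    now = now if now is not None else time.time()
    stale = [jid for jid, j in history.items()
             if j["status"] in _TERMINAL
             and now - j["finished_at"] > max_age_seconds]
    for jid in stale:
        del history[jid]
```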
### `_queue_generation(character, action, selected_fields, client_id)`

Convenience wrapper for character detail page generation. Loads workflow, calls `build_prompt`, calls `_prepare_workflow`, calls `queue_prompt`.
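The call sequence can be sketched as a pipeline. The four collaborators are injected as parameters here purely to make the flow visible; the real helper calls module-level functions directly:

```python
def queue_generation(character, action, selected_fields, client_id,
                     load_workflow, build_prompt, prepare_workflow, queue_prompt):
    """Sketch of _queue_generation's workflow -> prompt -> queue pipeline."""
    workflow = load_workflow()                                 # load workflow JSON
    prompts = build_prompt(character, action, selected_fields) # compose prompts
    workflow = prepare_workflow(workflow, prompts)             # patch workflow nodes
    return queue_prompt(workflow, client_id)                   # submit to ComfyUI
```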
@@ -225,8 +249,7 @@ Checkpoint JSONs are keyed by `checkpoint_path`. If no JSON exists for a discove

### Characters

- `GET /` — character gallery (index)
- `GET /character/<slug>` — character detail with generation UI
- `POST /character/<slug>/generate` — queue generation (AJAX or form); returns `{"job_id": ...}`
- `POST /character/<slug>/replace_cover_from_preview` — promote preview to cover
- `GET/POST /character/<slug>/edit` — edit character data
- `POST /character/<slug>/upload` — upload cover image
@@ -239,8 +262,7 @@ Checkpoint JSONs are keyed by `checkpoint_path`. If no JSON exists for a discove

Each category follows the same URL pattern:

- `GET /<category>/` — gallery
- `GET /<category>/<slug>` — detail + generation UI
- `POST /<category>/<slug>/generate` — queue generation; returns `{"job_id": ...}`
- `POST /<category>/<slug>/replace_cover_from_preview`
- `GET/POST /<category>/<slug>/edit`
- `POST /<category>/<slug>/upload`
@@ -254,15 +276,13 @@ Each category follows the same URL pattern:

- `GET /looks` — gallery
- `GET /look/<slug>` — detail
- `GET/POST /look/<slug>/edit`
- `POST /look/<slug>/generate` — queue generation; returns `{"job_id": ...}`
- `POST /look/<slug>/replace_cover_from_preview`
- `GET/POST /look/create`
- `POST /looks/rescan`

### Generator (Mix & Match)

- `GET/POST /generator` — freeform generator with multi-select accordion UI
- `POST /generator/preview_prompt` — AJAX: preview composed prompt without generating

### Checkpoints
@@ -271,11 +291,16 @@ Each category follows the same URL pattern:

- `POST /checkpoint/<slug>/save_json`
- `POST /checkpoints/rescan`
### Job Queue API

All generation routes use the background job queue. Frontend polls:

- `GET /api/queue/<job_id>/status` — returns `{"status": "pending"|"running"|"done"|"failed", "result": {...}}`

Image retrieval is handled server-side by the `_make_finalize()` callback; there are no separate client-facing finalize routes.
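The client-side loop can be sketched in Python with the HTTP GET abstracted into a callable (in the app it is a JS fetch against the status route):

```python
import time

def poll_job(fetch_status, interval=1.5, max_polls=80):
    """Sketch of the frontend poll loop for GET /api/queue/<job_id>/status.
    `fetch_status` is any callable returning the parsed JSON payload
    {"status": ..., "result": ...}."""
    for _ in range(max_polls):
        payload = fetch_status()
        if payload["status"] == "done":
            return payload["result"]
        if payload["status"] == "failed":
            raise RuntimeError(payload.get("result"))
        time.sleep(interval)           # pending / running: wait and retry
    raise TimeoutError("job did not finish")
```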
### Utilities

- `POST /set_default_checkpoint` — save default checkpoint to session
- `GET /get_missing_{characters,outfits,actions,scenes}` — AJAX: list items without cover images
- `POST /generate_missing` — batch generate covers for all characters missing one (uses job queue)
- `POST /clear_all_covers` / `clear_all_{outfit,action,scene}_covers`
- `GET /gallery` — global image gallery browsing `static/uploads/`
- `GET/POST /settings` — LLM provider configuration
@@ -295,7 +320,7 @@ Each category follows the same URL pattern:

- `initJsonEditor(saveUrl)` — shared JSON editor modal (simple form + raw textarea tabs)
- Context processors inject `all_checkpoints`, `default_checkpoint_path`, and `COMFYUI_WS_URL` into every template.
- **No `{% block head %}` exists** in layout.html — do not try to use it.
- Generation is async: JS submits the form via AJAX (`X-Requested-With: XMLHttpRequest`), receives a `{"job_id": ...}` response, then polls `/api/queue/<job_id>/status` every ~1.5 seconds until `status == "done"`. The server-side worker handles all ComfyUI polling and image saving via the `_make_finalize()` callback. There are no client-facing finalize HTTP routes.

---
@@ -374,6 +399,8 @@ The Flask filesystem session stores:

## Running the App

### Directly (development)

```bash
cd /mnt/alexander/Projects/character-browser
source venv/bin/activate
```
@@ -384,6 +411,25 @@ The app runs in debug mode on port 5000 by default. ComfyUI must be running at `

The DB is initialised and all sync functions are called inside `with app.app_context():` at the bottom of `app.py` before `app.run()`.

### Docker

```bash
docker compose up -d
```

The compose file (`docker-compose.yml`) runs two services:

- **`danbooru-mcp`** — built from `https://git.liveaodh.com/aodhan/danbooru-mcp.git`; the MCP tag-search container used by `call_llm()`.
- **`app`** — the Flask app, exposed on host port **5782** → container port 5000.

Key environment variables set by compose:

- `COMFYUI_URL=http://10.0.0.200:8188` — points at ComfyUI on the Docker host network.
- `SKIP_MCP_AUTOSTART=true` — disables the app's built-in danbooru-mcp launch logic (compose manages it).

Volumes mounted into the app container:

- `./data`, `./static/uploads`, `./instance`, `./flask_session` — persistent app data.
- `/Volumes/ImageModels:/ImageModels:ro` — model files for checkpoint/LoRA scanning (**requires Docker Desktop file sharing enabled for `/Volumes/ImageModels`**).
- `/var/run/docker.sock` — Docker socket so the app can exec danbooru-mcp tool containers.
---

## Common Pitfalls
@@ -396,3 +442,4 @@ The DB is initialised and all sync functions are called inside `with app.app_con

- **AJAX detection**: `request.headers.get('X-Requested-With') == 'XMLHttpRequest'` determines whether to return JSON or redirect.
- **Session must be marked modified for JSON responses**: After setting session values in AJAX-responding routes, set `session.modified = True`.
- **Detailer `prompt` is a list**: The `prompt` field in detailer JSON is stored as a list of strings (e.g. `["detailed skin", "pores"]`), not a plain string. When merging into `tags` for `build_prompt`, use `extend` for lists and `append` for strings — never append the list object itself or `", ".join()` will fail on the nested list item.
- **`_make_finalize` action semantics**: Pass `action=None` when the route should always update the DB cover (e.g. batch generate, checkpoint generate). Pass `action=request.form.get('action')` for routes that support both "preview" (no DB update) and "replace" (update DB). The factory skips the DB write when `action` is truthy and not `"replace"`.
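The detailer-prompt pitfall above reduces to one branch; a minimal sketch:

```python
def merge_prompt_into_tags(tags, prompt_value):
    """Extend for list-valued detailer prompts, append for plain strings,
    so a later ", ".join(tags) never hits a nested list."""
    if isinstance(prompt_value, list):
        tags.extend(prompt_value)   # never tags.append(prompt_value)
    else:
        tags.append(prompt_value)
    return tags
```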
Dockerfile (new file, 29 lines)

@@ -0,0 +1,29 @@

```dockerfile
FROM python:3.12-slim

# Install system deps: git (for danbooru-mcp repo clone) + docker CLI
RUN apt-get update && apt-get install -y --no-install-recommends \
        git \
        curl \
        ca-certificates \
    && install -m 0755 -d /etc/apt/keyrings \
    && curl -fsSL https://download.docker.com/linux/debian/gpg -o /etc/apt/keyrings/docker.asc \
    && chmod a+r /etc/apt/keyrings/docker.asc \
    && echo "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.asc] https://download.docker.com/linux/debian $(. /etc/os-release && echo "$VERSION_CODENAME") stable" \
        > /etc/apt/sources.list.d/docker.list \
    && apt-get update && apt-get install -y --no-install-recommends docker-ce-cli docker-compose-plugin \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /app

COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# Writable dirs that will typically be bind-mounted at runtime
RUN mkdir -p static/uploads instance flask_session data/characters data/clothing \
    data/actions data/styles data/scenes data/detailers data/checkpoints data/looks

EXPOSE 5000

CMD ["python", "app.py"]
```
README.md (14 lines changed)

@@ -39,6 +39,20 @@ A local web-based GUI for managing character profiles (JSON) and generating cons

## Setup & Installation

### Option A — Docker (recommended)

1. **Clone the repository.**
2. Edit `docker-compose.yml` if needed:
   - Set `COMFYUI_URL` to your ComfyUI host/port.
   - Adjust the `/Volumes/ImageModels` volume path to your model directory. If you're on Docker Desktop, add the path under **Settings → Resources → File Sharing** first.
3. **Start services:**

   ```bash
   docker compose up -d
   ```

   The app will be available at `http://localhost:5782`.

### Option B — Local (development)

1. **Clone the repository** to your local machine.
2. **Configure Paths**: Open `app.py` and update the following variables to match your system:
docker-compose.yml (new file, 31 lines)

@@ -0,0 +1,31 @@

```yaml
services:
  danbooru-mcp:
    build: https://git.liveaodh.com/aodhan/danbooru-mcp.git
    image: danbooru-mcp:latest
    stdin_open: true
    restart: unless-stopped

  app:
    build: .
    ports:
      - "5782:5000"
    environment:
      # ComfyUI runs on the Docker host
      COMFYUI_URL: http://10.0.0.200:8188
      # Compose manages danbooru-mcp — skip the app's auto-start logic
      SKIP_MCP_AUTOSTART: "true"
    volumes:
      # Persistent data
      - ./data:/app/data
      - ./static/uploads:/app/static/uploads
      - ./instance:/app/instance
      - ./flask_session:/app/flask_session
      # Model files (read-only — used for checkpoint/LoRA scanning)
      - /Volumes/ImageModels:/ImageModels:ro
      # Docker socket so the app can run danbooru-mcp tool containers
      - /var/run/docker.sock:/var/run/docker.sock
    extra_hosts:
      # Resolve host.docker.internal on Linux hosts
      - "host.docker.internal:host-gateway"
    depends_on:
      - danbooru-mcp
    restart: unless-stopped
```
Deleted file

@@ -1,63 +0,0 @@

```python
import os
import json
from collections import OrderedDict

ACTIONS_DIR = 'data/actions'


def migrate_actions():
    if not os.path.exists(ACTIONS_DIR):
        print(f"Directory {ACTIONS_DIR} not found.")
        return

    count = 0
    for filename in os.listdir(ACTIONS_DIR):
        if not filename.endswith('.json'):
            continue

        filepath = os.path.join(ACTIONS_DIR, filename)
        try:
            with open(filepath, 'r') as f:
                data = json.load(f, object_pairs_hook=OrderedDict)

            # Check if participants already exists
            if 'participants' in data:
                print(f"Skipping {filename}: 'participants' already exists.")
                continue

            # Create new ordered dict to enforce order
            new_data = OrderedDict()

            # Copy keys up to 'action'
            found_action = False
            for key, value in data.items():
                new_data[key] = value
                if key == 'action':
                    found_action = True
                    # Insert participants here
                    new_data['participants'] = {
                        "solo_focus": "true",
                        "orientation": "MF"
                    }

            # If 'action' wasn't found, append at the end
            if not found_action:
                print(f"Warning: 'action' key not found in {filename}. Appending 'participants' at the end.")
                new_data['participants'] = {
                    "solo_focus": "true",
                    "orientation": "MF"
                }

            # Write back to file
            with open(filepath, 'w') as f:
                json.dump(new_data, f, indent=2)

            count += 1
            # print(f"Updated {filename}")  # Commented out to reduce noise if many files

        except Exception as e:
            print(f"Error processing {filename}: {e}")

    print(f"Migration complete. Updated {count} files.")


if __name__ == "__main__":
    migrate_actions()
```
Deleted file

@@ -1,61 +0,0 @@

```python
import os
import json
import re

DETAILERS_DIR = 'data/detailers'


def migrate_detailers():
    if not os.path.exists(DETAILERS_DIR):
        print(f"Directory {DETAILERS_DIR} does not exist.")
        return

    count = 0
    for filename in os.listdir(DETAILERS_DIR):
        if filename.endswith('.json'):
            file_path = os.path.join(DETAILERS_DIR, filename)
            try:
                with open(file_path, 'r') as f:
                    data = json.load(f)

                # Create a new ordered dictionary
                new_data = {}

                # Copy existing fields up to 'prompt'
                if 'detailer_id' in data:
                    new_data['detailer_id'] = data['detailer_id']
                if 'detailer_name' in data:
                    new_data['detailer_name'] = data['detailer_name']

                # Handle 'prompt'
                prompt_val = data.get('prompt', [])
                if isinstance(prompt_val, str):
                    # Split by comma and strip whitespace
                    new_data['prompt'] = [p.strip() for p in prompt_val.split(',') if p.strip()]
                else:
                    new_data['prompt'] = prompt_val

                # Insert 'focus'
                if 'focus' not in data:
                    new_data['focus'] = ""
                else:
                    new_data['focus'] = data['focus']

                # Copy remaining fields
                for key, value in data.items():
                    if key not in ['detailer_id', 'detailer_name', 'prompt', 'focus']:
                        new_data[key] = value

                # Write back to file
                with open(file_path, 'w') as f:
                    json.dump(new_data, f, indent=2)

                print(f"Migrated {filename}")
                count += 1

            except Exception as e:
                print(f"Error processing {filename}: {e}")

    print(f"Migration complete. Processed {count} files.")


if __name__ == '__main__':
    migrate_detailers()
```
Deleted file

@@ -1,79 +0,0 @@

```python
#!/usr/bin/env python3
"""
Migration: add lora_weight_min / lora_weight_max to every entity JSON.

For each JSON file in the target directories we:
- Read the existing lora_weight value (default 1.0 if missing)
- Write lora_weight_min = lora_weight (if not already set)
- Write lora_weight_max = lora_weight (if not already set)
- Leave lora_weight in place (the resolver still uses it as a fallback)

Directories processed (skip data/checkpoints — no LoRA weight there):
    data/characters/
    data/clothing/
    data/actions/
    data/styles/
    data/scenes/
    data/detailers/
    data/looks/
"""

import glob
import json
import os
import sys

DIRS = [
    'data/characters',
    'data/clothing',
    'data/actions',
    'data/styles',
    'data/scenes',
    'data/detailers',
    'data/looks',
]

dry_run = '--dry-run' in sys.argv
updated = 0
skipped = 0
errors = 0

for directory in DIRS:
    pattern = os.path.join(directory, '*.json')
    for path in sorted(glob.glob(pattern)):
        try:
            with open(path, 'r', encoding='utf-8') as f:
                data = json.load(f)

            lora = data.get('lora')
            if not isinstance(lora, dict):
                skipped += 1
                continue

            weight = float(lora.get('lora_weight', 1.0))
            changed = False

            if 'lora_weight_min' not in lora:
                lora['lora_weight_min'] = weight
                changed = True
            if 'lora_weight_max' not in lora:
                lora['lora_weight_max'] = weight
                changed = True

            if changed:
                if not dry_run:
                    with open(path, 'w', encoding='utf-8') as f:
                        json.dump(data, f, indent=2, ensure_ascii=False)
                        f.write('\n')
                print(f"  [{'DRY' if dry_run else 'OK'}] {path} weight={weight}")
                updated += 1
            else:
                skipped += 1

        except Exception as e:
            print(f"  [ERR] {path}: {e}", file=sys.stderr)
            errors += 1

print(f"\nDone. updated={updated} skipped={skipped} errors={errors}")
if dry_run:
    print("(dry-run — no files were written)")
```
Deleted file

@@ -1,153 +0,0 @@

```python
#!/usr/bin/env python3
"""
Migration script to convert wardrobe structure from flat to nested format.

Before:
    "wardrobe": {
        "headwear": "...",
        "top": "...",
        ...
    }

After:
    "wardrobe": {
        "default": {
            "headwear": "...",
            "top": "...",
            ...
        }
    }

This enables multiple outfits per character.
"""

import os
import json
from pathlib import Path


def migrate_wardrobe(characters_dir: str = "characters", dry_run: bool = False):
    """
    Migrate all character JSON files to the new wardrobe structure.

    Args:
        characters_dir: Path to the directory containing character JSON files
        dry_run: If True, only print what would be changed without modifying files
    """
    characters_path = Path(characters_dir)

    if not characters_path.exists():
        print(f"Error: Directory '{characters_dir}' does not exist")
        return

    json_files = list(characters_path.glob("*.json"))

    if not json_files:
        print(f"No JSON files found in '{characters_dir}'")
        return

    migrated_count = 0
    skipped_count = 0
    error_count = 0

    for json_file in json_files:
        try:
            with open(json_file, 'r', encoding='utf-8') as f:
                data = json.load(f)

            # Check if character has a wardrobe
            if 'wardrobe' not in data:
                print(f"  [SKIP] {json_file.name}: No wardrobe field")
                skipped_count += 1
                continue

            wardrobe = data['wardrobe']

            # Check if already migrated (wardrobe contains 'default' key with nested dict)
            if 'default' in wardrobe and isinstance(wardrobe['default'], dict):
                # Verify it's actually the new format (has wardrobe keys inside)
                expected_keys = {'headwear', 'top', 'legwear', 'footwear', 'hands', 'accessories',
                                 'inner_layer', 'outer_layer', 'lower_body', 'gloves'}
                if any(key in wardrobe['default'] for key in expected_keys):
                    print(f"  [SKIP] {json_file.name}: Already migrated")
                    skipped_count += 1
                    continue

            # Check if wardrobe is a flat structure (not already nested)
            # A flat wardrobe has string values, a nested one has dict values
            if not isinstance(wardrobe, dict):
                print(f"  [ERROR] {json_file.name}: Wardrobe is not a dictionary")
                error_count += 1
                continue

            # Check if any value is a dict (indicating partial migration or different structure)
            has_nested_values = any(isinstance(v, dict) for v in wardrobe.values())
            if has_nested_values:
                print(f"  [SKIP] {json_file.name}: Wardrobe has nested values, may already be migrated")
                skipped_count += 1
                continue

            # Perform migration
            new_wardrobe = {
                "default": wardrobe
            }
            data['wardrobe'] = new_wardrobe

            if dry_run:
                print(f"  [DRY-RUN] {json_file.name}: Would migrate wardrobe")
                print(f"    Old: {json.dumps(wardrobe, indent=2)[:100]}...")
                print(f"    New: {json.dumps(new_wardrobe, indent=2)[:100]}...")
            else:
                with open(json_file, 'w', encoding='utf-8') as f:
                    json.dump(data, f, indent=2, ensure_ascii=False)
                print(f"  [MIGRATED] {json_file.name}")

            migrated_count += 1

        except json.JSONDecodeError as e:
            print(f"  [ERROR] {json_file.name}: Invalid JSON - {e}")
            error_count += 1
        except Exception as e:
            print(f"  [ERROR] {json_file.name}: {e}")
            error_count += 1

    print()
    print("=" * 50)
    print(f"Migration complete:")
    print(f"  - Migrated: {migrated_count}")
    print(f"  - Skipped: {skipped_count}")
    print(f"  - Errors: {error_count}")
    if dry_run:
        print()
        print("This was a dry run. No files were modified.")
        print("Run with --execute to apply changes.")


if __name__ == "__main__":
    import argparse

    parser = argparse.ArgumentParser(
        description="Migrate character wardrobe structure to support multiple outfits"
    )
    parser.add_argument(
        "--execute",
        action="store_true",
        help="Actually modify files (default is dry-run)"
    )
    parser.add_argument(
        "--dir",
        default="characters",
        help="Directory containing character JSON files (default: characters)"
    )

    args = parser.parse_args()

    print("=" * 50)
    print("Wardrobe Migration Script")
    print("=" * 50)
    print(f"Directory: {args.dir}")
    print(f"Mode: {'EXECUTE' if args.execute else 'DRY-RUN'}")
    print("=" * 50)
    print()

    migrate_wardrobe(characters_dir=args.dir, dry_run=not args.execute)
```