Code review fixes: wardrobe migration, response validation, path traversal guard, deduplication
- Migrate 11 character JSONs from old wardrobe keys to _BODY_GROUP_KEYS format
- Add is_favourite/is_nsfw columns to Preset model
- Add HTTP response validation and timeouts to ComfyUI client
- Add path traversal protection on replace cover route
- Deduplicate services/mcp.py (4 functions → 2 generic + 2 wrappers)
- Extract apply_library_filters() and clean_html_text() shared helpers
- Add named constants for 17 ComfyUI workflow node IDs
- Fix bare except clauses in services/llm.py
- Fix tags schema in ensure_default_outfit() (list → dict)
- Convert f-string logging to lazy % formatting
- Add 5-minute polling timeout to frontend waitForJob()
- Improve migration error handling (non-duplicate errors log at WARNING)
- Update CLAUDE.md to reflect all changes

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
CLAUDE.md (69 lines changed)
@@ -30,7 +30,7 @@ services/
 generation.py # Shared generation logic (generate_from_preset)
 routes/
 __init__.py # register_routes(app) — imports and calls all route modules
-shared.py # Factory functions for common routes (favourite, upload, clone, save_json, etc.)
+shared.py # Factory functions for common routes + apply_library_filters() helper
 characters.py # Character CRUD + generation + outfit management
 outfits.py # Outfit routes
 actions.py # Action routes
@@ -115,27 +115,29 @@ All category models (except Settings and Checkpoint) share this pattern:
 
 ## ComfyUI Workflow Node Map
 
-The workflow (`comfy_workflow.json`) uses string node IDs. These are the critical nodes:
+The workflow (`comfy_workflow.json`) uses string node IDs. Named constants are defined in `services/workflow.py`:
 
-| Node | Role |
-|------|------|
-| `3` | Main KSampler |
-| `4` | Checkpoint loader |
-| `5` | Empty latent (width/height) |
-| `6` | Positive prompt — contains `{{POSITIVE_PROMPT}}` placeholder |
-| `7` | Negative prompt |
-| `8` | VAE decode |
-| `9` | Save image |
-| `11` | Face ADetailer |
-| `13` | Hand ADetailer |
-| `14` | Face detailer prompt — contains `{{FACE_PROMPT}}` placeholder |
-| `15` | Hand detailer prompt — contains `{{HAND_PROMPT}}` placeholder |
-| `16` | Character LoRA (or Look LoRA when a Look is active) |
-| `17` | Outfit LoRA |
-| `18` | Action LoRA |
-| `19` | Style / Detailer / Scene LoRA (priority: style > detailer > scene) |
+| Constant | Node | Role |
+|----------|------|------|
+| `NODE_KSAMPLER` | `3` | Main KSampler |
+| `NODE_CHECKPOINT` | `4` | Checkpoint loader |
+| `NODE_LATENT` | `5` | Empty latent (width/height) |
+| `NODE_POSITIVE` | `6` | Positive prompt — contains `{{POSITIVE_PROMPT}}` placeholder |
+| `NODE_NEGATIVE` | `7` | Negative prompt |
+| `NODE_VAE_DECODE` | `8` | VAE decode |
+| `NODE_SAVE` | `9` | Save image |
+| `NODE_FACE_DETAILER` | `11` | Face ADetailer |
+| `NODE_HAND_DETAILER` | `13` | Hand ADetailer |
+| `NODE_FACE_PROMPT` | `14` | Face detailer prompt — contains `{{FACE_PROMPT}}` placeholder |
+| `NODE_HAND_PROMPT` | `15` | Hand detailer prompt — contains `{{HAND_PROMPT}}` placeholder |
+| `NODE_LORA_CHAR` | `16` | Character LoRA (or Look LoRA when a Look is active) |
+| `NODE_LORA_OUTFIT` | `17` | Outfit LoRA |
+| `NODE_LORA_ACTION` | `18` | Action LoRA |
+| `NODE_LORA_STYLE` | `19` | Style / Detailer / Scene LoRA (priority: style > detailer > scene) |
+| `NODE_LORA_CHAR_B` | `20` | Character LoRA B (second character) |
+| `NODE_VAE_LOADER` | `21` | VAE loader |
 
-LoRA nodes chain: `4 → 16 → 17 → 18 → 19`. Unused LoRA nodes are bypassed by pointing `model_source`/`clip_source` directly to the prior node. All model/clip consumers (nodes 3, 6, 7, 11, 13, 14, 15) are wired to the final `model_source`/`clip_source` at the end of `_prepare_workflow`.
+LoRA nodes chain: `4 → 16 → 17 → 18 → 19`. Unused LoRA nodes are bypassed by pointing `model_source`/`clip_source` directly to the prior node. All model/clip consumers (nodes 3, 6, 7, 11, 13, 14, 15) are wired to the final `model_source`/`clip_source` at the end of `_prepare_workflow`. Always use the named constants instead of string literals when referencing node IDs.
 
 ---
 
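The bypass rule described in this hunk can be sketched in Python. The `NODE_*` names match the node-map table; the `chain_loras` helper itself is a hypothetical illustration (the real wiring lives in `_prepare_workflow`, and the exact `inputs` key names are assumptions):

```python
# Node ID constants, matching the node map above (defined in services/workflow.py).
NODE_CHECKPOINT = "4"
NODE_LORA_CHAR = "16"
NODE_LORA_OUTFIT = "17"
NODE_LORA_ACTION = "18"
NODE_LORA_STYLE = "19"

def chain_loras(workflow: dict, active: dict) -> tuple:
    """Hypothetical sketch of the bypass rule: walk 4 -> 16 -> 17 -> 18 -> 19,
    wiring each active LoRA to the prior node and skipping inactive ones."""
    model_source = clip_source = NODE_CHECKPOINT
    for node_id in (NODE_LORA_CHAR, NODE_LORA_OUTFIT,
                    NODE_LORA_ACTION, NODE_LORA_STYLE):
        if active.get(node_id):
            # The "inputs"/"model"/"clip" key names are assumptions for illustration.
            workflow[node_id]["inputs"]["model"] = [model_source, 0]
            workflow[node_id]["inputs"]["clip"] = [clip_source, 1]
            model_source = clip_source = node_id
        # else: bypassed, so downstream consumers keep pointing at the prior node
    return model_source, clip_source
```

Consumers (KSampler, prompt encoders, detailers) would then be wired to whatever pair this returns.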
@@ -143,12 +145,13 @@ LoRA nodes chain: `4 → 16 → 17 → 18 → 19`. Unused LoRA nodes are bypasse
 
 ### `utils.py` — Constants and Pure Helpers
 
-- **`_IDENTITY_KEYS` / `_WARDROBE_KEYS`** — Lists of canonical field names for the `identity` and `wardrobe` sections. Used by `_ensure_character_fields()`.
+- **`_BODY_GROUP_KEYS`** — Canonical list of field names shared by both `identity` and `wardrobe` sections: `['base', 'head', 'upper_body', 'lower_body', 'hands', 'feet', 'additional']`. Used by `build_prompt()`, `_ensure_character_fields()`, and `_resolve_preset_fields()`.
 - **`ALLOWED_EXTENSIONS`** — Permitted upload file extensions.
 - **`_LORA_DEFAULTS`** — Default LoRA directory paths per category.
 - **`parse_orientation(orientation_str)`** — Converts orientation codes (`1F`, `2F`, `1M1F`, etc.) into Danbooru tags.
 - **`_resolve_lora_weight(lora_data)`** — Extracts and validates LoRA weight from a lora data dict.
 - **`allowed_file(filename)`** — Checks file extension against `ALLOWED_EXTENSIONS`.
+- **`clean_html_text(html_raw)`** — Strips HTML tags, scripts, styles, and images from raw HTML, returning plain text. Used by bulk_create routes.
 
 ### `services/prompts.py` — Prompt Building
 
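A sketch of `clean_html_text()`, assuming it centralizes the same regex pipeline the bulk_create route previously ran inline (strip script and style blocks first, then images, then any remaining tag, then collapse whitespace; the inline version appears in the `routes/actions.py` hunk later in this commit):

```python
import re

def clean_html_text(html_raw: str) -> str:
    """Strip scripts, styles, images, and remaining tags; collapse whitespace."""
    text = re.sub(r'<script[^>]*>.*?</script>', '', html_raw, flags=re.DOTALL)
    text = re.sub(r'<style[^>]*>.*?</style>', '', text, flags=re.DOTALL)
    text = re.sub(r'<img[^>]*>', '', text)
    text = re.sub(r'<[^>]+>', ' ', text)  # any remaining tag becomes a space
    return ' '.join(text.split())         # normalize runs of whitespace
```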
@@ -181,10 +184,11 @@ Two independent queues with separate worker threads:
 
 ### `services/comfyui.py` — ComfyUI HTTP Client
 
-- **`queue_prompt(prompt_workflow, client_id)`** — POSTs workflow to ComfyUI's `/prompt` endpoint.
-- **`get_history(prompt_id)`** — Polls ComfyUI for job completion.
-- **`get_image(filename, subfolder, folder_type)`** — Retrieves generated image bytes.
-- **`_ensure_checkpoint_loaded(checkpoint_path)`** — Forces ComfyUI to load a specific checkpoint.
+- **`queue_prompt(prompt_workflow, client_id)`** — POSTs workflow to ComfyUI's `/prompt` endpoint. Validates HTTP response status before parsing JSON; raises `RuntimeError` on non-OK responses. Timeout: 30s.
+- **`get_history(prompt_id)`** — Polls ComfyUI for job completion. Timeout: 10s.
+- **`get_image(filename, subfolder, folder_type)`** — Retrieves generated image bytes. Timeout: 30s.
+- **`get_loaded_checkpoint()`** — Returns the checkpoint path currently loaded in ComfyUI by inspecting the most recent job in `/history`.
+- **`_ensure_checkpoint_loaded(checkpoint_path)`** — Forces ComfyUI to unload all models if the desired checkpoint doesn't match what's currently loaded.
 
 ### `services/llm.py` — LLM Integration
 
@@ -211,8 +215,10 @@ Two independent queues with separate worker threads:
 
 ### `services/mcp.py` — MCP/Docker Lifecycle
 
-- **`ensure_mcp_server_running()`** — Ensures the danbooru-mcp Docker container is running.
-- **`ensure_character_mcp_server_running()`** — Ensures the character-mcp Docker container is running.
+- **`_ensure_repo(compose_dir, repo_url, name)`** — Generic helper: clones a git repo if the directory doesn't exist.
+- **`_ensure_server_running(compose_dir, repo_url, container_name, name)`** — Generic helper: ensures a Docker Compose service is running (clones repo if needed, starts container if not running).
+- **`ensure_mcp_server_running()`** — Ensures the danbooru-mcp Docker container is running (thin wrapper around `_ensure_server_running`).
+- **`ensure_character_mcp_server_running()`** — Ensures the character-mcp Docker container is running (thin wrapper around `_ensure_server_running`).
 
 ### Route-local Helpers
 
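The generic-helper-plus-thin-wrapper shape can be sketched like this. The directory paths, repo URLs, and exact docker invocations are illustrative assumptions, not the project's actual values:

```python
import os
import subprocess

def _ensure_repo(compose_dir: str, repo_url: str, name: str) -> None:
    """Clone the repo for `name` if its directory doesn't exist yet."""
    if not os.path.isdir(compose_dir):
        subprocess.run(['git', 'clone', repo_url, compose_dir], check=True)

def _ensure_server_running(compose_dir: str, repo_url: str,
                           container_name: str, name: str) -> None:
    """Generic lifecycle: clone if missing, start the container if stopped."""
    _ensure_repo(compose_dir, repo_url, name)
    ps = subprocess.run(['docker', 'ps', '--filter', f'name={container_name}',
                         '--format', '{{.Names}}'],
                        capture_output=True, text=True)
    if container_name not in ps.stdout:
        subprocess.run(['docker', 'compose', 'up', '-d'],
                       cwd=compose_dir, check=True)

def ensure_mcp_server_running() -> None:
    # Paths and URLs below are placeholders, not the project's actual values.
    _ensure_server_running('vendor/danbooru-mcp',
                           'https://example.com/danbooru-mcp.git',
                           'danbooru-mcp', 'danbooru-mcp')

def ensure_character_mcp_server_running() -> None:
    _ensure_server_running('vendor/character-mcp',
                           'https://example.com/character-mcp.git',
                           'character-mcp', 'character-mcp')
```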
@@ -239,7 +245,7 @@ Some helpers are defined inside a route module's `register_routes()` since they'
 "identity": { "base_specs": "", "hair": "", "eyes": "", "hands": "", "arms": "", "torso": "", "pelvis": "", "legs": "", "feet": "", "extra": "" },
 "defaults": { "expression": "", "pose": "", "scene": "" },
 "wardrobe": {
-"default": { "full_body": "", "headwear": "", "top": "", "bottom": "", "legwear": "", "footwear": "", "hands": "", "gloves": "", "accessories": "" }
+"default": { "base": "", "head": "", "upper_body": "", "lower_body": "", "hands": "", "feet": "", "additional": "" }
 },
 "styles": { "aesthetic": "", "primary_color": "", "secondary_color": "", "tertiary_color": "" },
 "lora": { "lora_name": "Illustrious/Looks/tifa.safetensors", "lora_weight": 0.8, "lora_triggers": "" },
@@ -254,7 +260,7 @@ Some helpers are defined inside a route module's `register_routes()` since they'
 {
 "outfit_id": "french_maid_01",
 "outfit_name": "French Maid",
-"wardrobe": { "full_body": "", "headwear": "", "top": "", "bottom": "", "legwear": "", "footwear": "", "hands": "", "accessories": "" },
+"wardrobe": { "base": "", "head": "", "upper_body": "", "lower_body": "", "hands": "", "feet": "", "additional": "" },
 "lora": { "lora_name": "Illustrious/Clothing/maid.safetensors", "lora_weight": 0.8, "lora_triggers": "" },
 "tags": { "outfit_type": "Uniform", "nsfw": false }
 }
@@ -439,7 +445,7 @@ Image retrieval is handled server-side by the `_make_finalize()` callback; there
 - `static/js/library-toolbar.js` — Library page toolbar (batch generate, clear covers, missing items)
 - Context processors inject `all_checkpoints`, `default_checkpoint_path`, and `COMFYUI_WS_URL` into every template. The `random_gen_image(category, slug)` template global returns a random image path from `static/uploads/<category>/<slug>/` for use as a fallback cover when `image_path` is not set.
 - **No `{% block head %}` exists** in layout.html — do not try to use it.
-- Generation is async: JS submits the form via AJAX (`X-Requested-With: XMLHttpRequest`), receives a `{"job_id": ...}` response, then polls `/api/queue/<job_id>/status` every ~1.5 seconds until `status == "done"`. The server-side worker handles all ComfyUI polling and image saving via the `_make_finalize()` callback. There are no client-facing finalize HTTP routes.
+- Generation is async: JS submits the form via AJAX (`X-Requested-With: XMLHttpRequest`), receives a `{"job_id": ...}` response, then polls `/api/queue/<job_id>/status` every ~1.5 seconds until `status == "done"` or the 5-minute timeout is reached. The server-side worker handles all ComfyUI polling and image saving via the `_make_finalize()` callback. There are no client-facing finalize HTTP routes.
 - **Batch generation** (library pages): Uses a two-phase pattern:
 1. **Queue phase**: All jobs are submitted upfront via sequential fetch calls, collecting job IDs
 2. **Poll phase**: All jobs are polled concurrently via `Promise.all()`, updating UI as each completes
@@ -501,6 +507,7 @@ All library index pages support query params:
 - `?favourite=on` — show only favourites
 - `?nsfw=sfw|nsfw|all` — filter by NSFW status
 - Results are ordered by `is_favourite DESC, name ASC` (favourites sort first).
+- Filter logic is shared via `apply_library_filters(query, model_class)` in `routes/shared.py`, which returns `(items, fav, nsfw)`.
 
 ### Gallery Image Sidecar Files
 
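A sketch of `apply_library_filters()`, reconstructed from the inline filtering it replaced in `routes/actions.py` (visible later in this commit). The real helper reads Flask's `request.args` directly; here the params mapping is passed in explicitly so the sketch stays self-contained:

```python
def apply_library_filters(query, model_class, args):
    """Shared library filtering: ?favourite=on and ?nsfw=sfw|nsfw|all,
    ordered favourites-first. Returns (items, fav, nsfw)."""
    fav = args.get('favourite') or ''
    nsfw = args.get('nsfw', 'all')
    if fav == 'on':
        query = query.filter_by(is_favourite=True)
    if nsfw == 'sfw':
        query = query.filter_by(is_nsfw=False)
    elif nsfw == 'nsfw':
        query = query.filter_by(is_nsfw=True)
    # is_favourite DESC, name ASC: favourites sort first
    items = query.order_by(model_class.is_favourite.desc(),
                           model_class.name).all()
    return items, fav, nsfw
```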
@@ -609,3 +616,5 @@ Volumes mounted into the app container:
 - **`_make_finalize` action semantics**: Pass `action=None` when the route should always update the DB cover (e.g. batch generate, checkpoint generate). Pass `action=request.form.get('action')` for routes that support both "preview" (no DB update) and "replace" (update DB). The factory skips the DB write when `action` is truthy and not `"replace"`.
 - **LLM queue runs without request context**: `_enqueue_task()` callbacks execute in a background thread with only `app.app_context()`. Do not access `flask.request`, `flask.session`, or other request-scoped objects inside `task_fn`. Use `has_request_context()` guard if code is shared between HTTP handlers and background tasks.
 - **Tags are metadata only**: Tags (`data['tags']`) are never injected into generation prompts. They are purely for UI filtering and search. The old pattern of `parts.extend(data.get('tags', []))` in prompt building has been removed.
+- **Path traversal guard on replace cover**: The replace cover route in `routes/shared.py` validates `preview_path` using `os.path.realpath()` + `startswith()` to prevent path traversal attacks.
+- **Logging uses lazy % formatting**: All logger calls use `logger.info("msg %s", var)` style, not f-strings. This avoids formatting the string when the log level is disabled.
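The realpath guard can be sketched as below. `UPLOAD_ROOT` and the function name are illustrative assumptions; the doc only specifies the `os.path.realpath()` + `startswith()` check. Appending `os.sep` before the prefix test also rejects sibling directories such as `uploads_evil`:

```python
import os

UPLOAD_ROOT = '/app/static/uploads'  # illustrative; the real root comes from app config

def is_safe_preview_path(preview_path: str) -> bool:
    """Reject any preview_path that resolves outside the uploads directory."""
    real = os.path.realpath(preview_path)   # collapses ../ and symlinks
    root = os.path.realpath(UPLOAD_ROOT)
    # Trailing separator so '/app/static/uploads_evil' is not a valid prefix match
    return real.startswith(root + os.sep)
```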
app.py (10 lines changed)
@@ -69,6 +69,7 @@ if __name__ == '__main__':
 from sqlalchemy import text
 
 os.makedirs(app.config['UPLOAD_FOLDER'], exist_ok=True)
+os.makedirs(app.config['SESSION_FILE_DIR'], exist_ok=True)
 db.create_all()
 
 # --- Helper for safe column additions ---
@@ -79,8 +80,11 @@ if __name__ == '__main__':
 logger.info("Added %s.%s column", table, column)
 except Exception as e:
 db.session.rollback()
-if 'duplicate column name' not in str(e).lower() and 'already exists' not in str(e).lower():
-logger.debug("Migration note (%s.%s): %s", table, column, e)
+err_str = str(e).lower()
+if 'duplicate column name' in err_str or 'already exists' in err_str:
+pass # Column already exists, expected
+else:
+logger.warning("Migration failed (%s.%s): %s", table, column, e)
 
 # --- All migrations (grouped before syncs) ---
 _add_column('character', 'active_outfit', "VARCHAR(100) DEFAULT 'default'")
@@ -106,7 +110,7 @@ if __name__ == '__main__':
 _add_column('settings', col_name, col_type)
 
 # is_favourite / is_nsfw on all resource tables
-for tbl in ['character', 'look', 'outfit', 'action', 'style', 'scene', 'detailer', 'checkpoint']:
+for tbl in ['character', 'look', 'outfit', 'action', 'style', 'scene', 'detailer', 'checkpoint', 'preset']:
 _add_column(tbl, 'is_favourite', 'BOOLEAN DEFAULT 0')
 _add_column(tbl, 'is_nsfw', 'BOOLEAN DEFAULT 0')
 
@@ -16,14 +16,13 @@
 "scene": ""
 },
 "wardrobe": {
-"full_body": "black_dress, lace-trimmed_dress, gothic_lolita",
-"headwear": "blindfold",
-"top": "black_dress, cleavage_cutout, feather_trim",
-"bottom": "short_dress",
-"legwear": "thighhighs",
-"footwear": "thigh_boots, black_boots, high_heels",
-"hands": "black_gloves",
-"accessories": "katana, sword_on_back"
+"base": "black_dress, lace-trimmed_dress, gothic_lolita",
+"head": "blindfold",
+"upper_body": "black_dress, cleavage_cutout, feather_trim",
+"lower_body": "short_dress",
+"additional": "katana, sword_on_back",
+"feet": "thighhighs, thigh_boots, black_boots, high_heels",
+"hands": "black_gloves"
 },
 "styles": {
 "aesthetic": "gothic_lolita, science_fiction, dark_atmosphere",
@@ -16,14 +16,13 @@
 "scene": "space_station"
 },
 "wardrobe": {
-"full_body": "off-shoulder_dress, two-tone_dress",
-"headwear": "circlet, hair_ring",
-"top": "neck_bell, white_collar, long_sleeves, cleavage",
-"bottom": "black_belt",
-"legwear": "pantyhose, thigh_strap",
-"footwear": "",
-"hands": "bracelets",
-"accessories": "bell"
+"base": "off-shoulder_dress, two-tone_dress",
+"head": "circlet, hair_ring",
+"upper_body": "neck_bell, white_collar, long_sleeves, cleavage",
+"lower_body": "black_belt",
+"additional": "bell",
+"feet": "pantyhose, thigh_strap",
+"hands": "bracelets"
 },
 "styles": {
 "aesthetic": "retro_anime, 1990s_(style), outlaw_star",
@@ -16,14 +16,13 @@
 "scene": "indoors, laboratory"
 },
 "wardrobe": {
-"full_body": "lab_coat, dress, checkered_pattern",
-"headwear": "",
-"top": "lab_coat, dress",
-"bottom": "",
-"legwear": "thighhighs, black_thighhighs",
-"footwear": "high_heels",
-"hands": "",
-"accessories": "earrings, ring"
+"base": "lab_coat, dress, checkered_pattern",
+"head": "",
+"upper_body": "lab_coat, dress",
+"lower_body": "",
+"additional": "earrings, ring",
+"feet": "thighhighs, black_thighhighs, high_heels",
+"hands": ""
 },
 "styles": {
 "aesthetic": "anime",
@@ -16,14 +16,13 @@
 "scene": ""
 },
 "wardrobe": {
-"full_body": "eden_academy_school_uniform, black_dress, gold_trim",
-"headwear": "hair_ornament",
-"top": "",
-"bottom": "",
-"legwear": "white_socks",
-"footwear": "loafers",
-"hands": "",
-"accessories": ""
+"base": "eden_academy_school_uniform, black_dress, gold_trim",
+"head": "hair_ornament",
+"upper_body": "",
+"lower_body": "",
+"additional": "",
+"feet": "white_socks, loafers",
+"hands": ""
 },
 "styles": {
 "aesthetic": "anime_style",
@@ -16,14 +16,13 @@
 "scene": "cityscape, daytime, sky"
 },
 "wardrobe": {
-"full_body": "pink_dress, sleeveless_dress, a-line_dress",
-"headwear": "red_hair_bow, oversized_bow",
-"top": "",
-"bottom": "",
-"legwear": "white_leggings, white_tights",
-"footwear": "black_shoes, mary_janes",
-"hands": "",
-"accessories": "black_waist_belt"
+"base": "pink_dress, sleeveless_dress, a-line_dress",
+"head": "red_hair_bow, oversized_bow",
+"upper_body": "",
+"lower_body": "",
+"additional": "black_waist_belt",
+"feet": "white_leggings, white_tights, black_shoes, mary_janes",
+"hands": ""
 },
 "styles": {
 "aesthetic": "modern_cartoon, cel_shading, vibrant",
@@ -16,14 +16,13 @@
 "scene": ""
 },
 "wardrobe": {
-"full_body": "blue_dress",
-"headwear": "",
-"top": "",
-"bottom": "black_belt",
-"legwear": "thighhighs, white_socks",
-"footwear": "mary_janes",
-"hands": "",
-"accessories": ""
+"base": "blue_dress",
+"head": "",
+"upper_body": "",
+"lower_body": "black_belt",
+"additional": "",
+"feet": "thighhighs, white_socks, mary_janes",
+"hands": ""
 },
 "styles": {
 "aesthetic": "vibrant_colors",
@@ -16,14 +16,13 @@
 "scene": ""
 },
 "wardrobe": {
-"full_body": "",
-"headwear": "",
-"top": "green_crop_top, sleeveless",
-"bottom": "black_belt, green_shorts",
-"legwear": "white_thighhighs",
-"footwear": "black_boots",
-"hands": "fingerless_gloves",
-"accessories": ""
+"base": "",
+"head": "",
+"upper_body": "green_crop_top, sleeveless",
+"lower_body": "black_belt, green_shorts",
+"additional": "",
+"feet": "white_thighhighs, black_boots",
+"hands": "fingerless_gloves"
 },
 "styles": {
 "aesthetic": "high_contrast, vibrant",
@@ -16,14 +16,13 @@
 "scene": ""
 },
 "wardrobe": {
-"full_body": "red_bodysuit, latex_bodysuit",
-"headwear": "",
-"top": "",
-"bottom": "",
-"legwear": "",
-"footwear": "boots, high_heels",
-"hands": "",
-"accessories": "belt, silver_belt"
+"base": "red_bodysuit, latex_bodysuit",
+"head": "",
+"upper_body": "",
+"lower_body": "",
+"additional": "belt, silver_belt",
+"feet": "boots, high_heels",
+"hands": ""
 },
 "styles": {
 "aesthetic": "anime_style, 2000s_(style)",
@@ -16,14 +16,13 @@
 "scene": ""
 },
 "wardrobe": {
-"full_body": "torn_clothes",
-"headwear": "",
-"top": "yellow_shirt, crop_top",
-"bottom": "jeans, torn_jeans, open_fly, loose_belt",
-"legwear": "",
-"footwear": "boots",
-"hands": "arm_belt",
-"accessories": "leg_belt"
+"base": "torn_clothes",
+"head": "",
+"upper_body": "yellow_shirt, crop_top",
+"lower_body": "jeans, torn_jeans, open_fly, loose_belt",
+"additional": "leg_belt",
+"feet": "boots",
+"hands": "arm_belt"
 },
 "styles": {
 "aesthetic": "anime, video_game",
@@ -16,14 +16,13 @@
 "scene": ""
 },
 "wardrobe": {
-"full_body": "black_dress, frilled_dress, gothic_lolita",
-"headwear": "black_hat, mini_hat",
-"top": "",
-"bottom": "",
-"legwear": "thighhighs, black_thighhighs",
-"footwear": "black_footwear",
-"hands": "",
-"accessories": "cross_necklace, scythe"
+"base": "black_dress, frilled_dress, gothic_lolita",
+"head": "black_hat, mini_hat",
+"upper_body": "",
+"lower_body": "",
+"additional": "cross_necklace, scythe",
+"feet": "thighhighs, black_thighhighs, black_footwear",
+"hands": ""
 },
 "styles": {
 "aesthetic": "gothic_lolita",
@@ -16,14 +16,13 @@
 "scene": "starry_sky, space, night"
 },
 "wardrobe": {
-"full_body": "",
-"headwear": "",
-"top": "crop_top",
-"bottom": "purple_skirt, miniskirt",
-"legwear": "",
-"footwear": "thigh_boots, purple_boots",
-"hands": "vambraces",
-"accessories": "gorget, belt, armlet"
+"base": "",
+"head": "",
+"upper_body": "crop_top",
+"lower_body": "purple_skirt, miniskirt",
+"additional": "gorget, belt, armlet",
+"feet": "thigh_boots, purple_boots",
+"hands": "vambraces"
 },
 "styles": {
 "aesthetic": "cartoon, superhero, dc_comics",
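The eleven character JSON hunks above all apply the same key mapping; a hypothetical one-shot version of that migration (the actual migration script is not shown in this commit):

```python
# Old-schema -> _BODY_GROUP_KEYS mapping, as applied in the hunks above.
# legwear and footwear merge into 'feet'; accessories becomes 'additional'.
_KEY_MAP = {
    'full_body': 'base',
    'headwear': 'head',
    'top': 'upper_body',
    'bottom': 'lower_body',
    'hands': 'hands',
    'accessories': 'additional',
}

def migrate_wardrobe(old: dict) -> dict:
    """One-shot migration of a single wardrobe/outfit dict to _BODY_GROUP_KEYS."""
    new = {v: old.get(k, '') for k, v in _KEY_MAP.items()}
    feet = [old.get('legwear', ''), old.get('footwear', '')]
    new['feet'] = ', '.join(part for part in feet if part)
    return new
```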
@@ -271,7 +271,7 @@ class Checkpoint(db.Model):
 slug = db.Column(db.String(255), unique=True, nullable=False)
 name = db.Column(db.String(255), nullable=False)
 checkpoint_path = db.Column(db.String(255), nullable=False) # e.g. "Illustrious/model.safetensors"
-data = db.Column(db.JSON, nullable=True)
+data = db.Column(db.JSON, nullable=False, default=dict)
 image_path = db.Column(db.String(255), nullable=True)
 is_favourite = db.Column(db.Boolean, default=False)
 is_nsfw = db.Column(db.Boolean, default=False)
@@ -287,6 +287,8 @@ class Preset(db.Model):
 name = db.Column(db.String(100), nullable=False)
 data = db.Column(db.JSON, nullable=False)
 image_path = db.Column(db.String(255), nullable=True)
+is_favourite = db.Column(db.Boolean, default=False)
+is_nsfw = db.Column(db.Boolean, default=False)
 
 def __repr__(self):
 return f'<Preset {self.preset_id}>'
@@ -15,8 +15,8 @@ from services.prompts import build_prompt, _resolve_character, _ensure_character
 from services.sync import sync_actions
 from services.file_io import get_available_loras
 from services.llm import load_prompt, call_llm
-from utils import allowed_file, _LORA_DEFAULTS
-from routes.shared import register_common_routes
+from utils import allowed_file, _LORA_DEFAULTS, clean_html_text
+from routes.shared import register_common_routes, apply_library_filters
 
 logger = logging.getLogger('gaze')
 
@@ -26,17 +26,8 @@ def register_routes(app):
 
 @app.route('/actions')
 def actions_index():
-query = Action.query
-fav = request.args.get('favourite')
-nsfw = request.args.get('nsfw', 'all')
-if fav == 'on':
-query = query.filter_by(is_favourite=True)
-if nsfw == 'sfw':
-query = query.filter_by(is_nsfw=False)
-elif nsfw == 'nsfw':
-query = query.filter_by(is_nsfw=True)
-actions = query.order_by(Action.is_favourite.desc(), Action.name).all()
-return render_template('actions/index.html', actions=actions, favourite_filter=fav or '', nsfw_filter=nsfw)
+actions, fav, nsfw = apply_library_filters(Action.query, Action)
+return render_template('actions/index.html', actions=actions, favourite_filter=fav, nsfw_filter=nsfw)
 
 @app.route('/actions/rescan', methods=['POST'])
 def rescan_actions():
@@ -228,9 +219,9 @@
 selected_fields.append(f'identity::{key}')
 # Add wardrobe fields (unless suppressed)
 if not suppress_wardrobe:
-from utils import _WARDROBE_KEYS
+from utils import _BODY_GROUP_KEYS
 wardrobe = character.get_active_wardrobe()
-for key in _WARDROBE_KEYS:
+for key in _BODY_GROUP_KEYS:
 if wardrobe.get(key):
 selected_fields.append(f'wardrobe::{key}')
 
@@ -302,9 +293,9 @@ def register_routes(app):
|
|||||||
|
|
||||||
# Wardrobe (active outfit) — skip if suppressed
|
# Wardrobe (active outfit) — skip if suppressed
|
||||||
if not suppress_wardrobe:
|
if not suppress_wardrobe:
|
||||||
from utils import _WARDROBE_KEYS
|
from utils import _BODY_GROUP_KEYS
|
||||||
wardrobe = extra_char.get_active_wardrobe()
|
wardrobe = extra_char.get_active_wardrobe()
|
||||||
for key in _WARDROBE_KEYS:
|
for key in _BODY_GROUP_KEYS:
|
||||||
val = wardrobe.get(key)
|
val = wardrobe.get(key)
|
||||||
if val:
|
if val:
|
||||||
extra_parts.append(val)
|
extra_parts.append(val)
|
||||||
@@ -389,11 +380,7 @@ def register_routes(app):
|
|||||||
try:
|
try:
|
||||||
with open(html_path, 'r', encoding='utf-8', errors='ignore') as hf:
|
with open(html_path, 'r', encoding='utf-8', errors='ignore') as hf:
|
||||||
html_raw = hf.read()
|
html_raw = hf.read()
|
||||||
clean_html = re.sub(r'<script[^>]*>.*?</script>', '', html_raw, flags=re.DOTALL)
|
html_content = clean_html_text(html_raw)
|
||||||
clean_html = re.sub(r'<style[^>]*>.*?</style>', '', clean_html, flags=re.DOTALL)
|
|
||||||
clean_html = re.sub(r'<img[^>]*>', '', clean_html)
|
|
||||||
clean_html = re.sub(r'<[^>]+>', ' ', clean_html)
|
|
||||||
html_content = ' '.join(clean_html.split())
|
|
||||||
except Exception:
|
except Exception:
|
||||||
pass
|
pass
|
||||||
|
|
||||||
|
|||||||
@@ -12,7 +12,7 @@ from services.llm import call_character_mcp_tool, call_llm, load_prompt
 from services.prompts import build_prompt
 from services.sync import sync_characters
 from services.workflow import _get_default_checkpoint, _prepare_workflow
-from routes.shared import register_common_routes
+from routes.shared import register_common_routes, apply_library_filters
 
 logger = logging.getLogger('gaze')
 
@@ -22,17 +22,8 @@ def register_routes(app):
 
     @app.route('/')
    def index():
-        query = Character.query
-        fav = request.args.get('favourite')
-        nsfw = request.args.get('nsfw', 'all')
-        if fav == 'on':
-            query = query.filter_by(is_favourite=True)
-        if nsfw == 'sfw':
-            query = query.filter_by(is_nsfw=False)
-        elif nsfw == 'nsfw':
-            query = query.filter_by(is_nsfw=True)
-        characters = query.order_by(Character.is_favourite.desc(), Character.name).all()
-        return render_template('index.html', characters=characters, favourite_filter=fav or '', nsfw_filter=nsfw)
+        characters, fav, nsfw = apply_library_filters(Character.query, Character)
+        return render_template('index.html', characters=characters, favourite_filter=fav, nsfw_filter=nsfw)
 
     @app.route('/rescan', methods=['POST'])
     def rescan():
@@ -274,16 +265,16 @@ def register_routes(app):
         # Fetch reference data from wiki URL if provided
         wiki_reference = ''
         if wiki_url:
-            logger.info(f"Fetching character data from URL: {wiki_url}")
+            logger.info("Fetching character data from URL: %s", wiki_url)
             wiki_data = call_character_mcp_tool('get_character_from_url', {
                 'url': wiki_url,
                 'name': name,
             })
             if wiki_data:
                 wiki_reference = f"\n\nReference data from wiki:\n{wiki_data}\n\nUse this reference to accurately describe the character's appearance, outfit, and features."
-                logger.info(f"Got wiki reference data ({len(wiki_data)} chars)")
+                logger.info("Got wiki reference data (%d chars)", len(wiki_data))
             else:
-                logger.warning(f"Failed to fetch wiki data from {wiki_url}")
+                logger.warning("Failed to fetch wiki data from %s", wiki_url)
 
         # Step 1: Generate or select outfit first
         default_outfit_id = 'default'
@@ -352,7 +343,7 @@ Create an outfit JSON with wardrobe fields appropriate for this character."""
                 db.session.commit()
 
                 default_outfit_id = outfit_slug
-                logger.info(f"Generated outfit: {outfit_name} for character {name}")
+                logger.info("Generated outfit: %s for character %s", outfit_name, name)
 
             except Exception as e:
                 logger.exception("Outfit generation error: %s", e)
@@ -14,8 +14,8 @@ from services.prompts import build_prompt, _resolve_character, _ensure_character
 from services.sync import sync_checkpoints, _default_checkpoint_data
 from services.file_io import get_available_checkpoints
 from services.llm import load_prompt, call_llm
-from utils import allowed_file
-from routes.shared import register_common_routes
+from utils import allowed_file, clean_html_text
+from routes.shared import register_common_routes, apply_library_filters
 
 logger = logging.getLogger('gaze')
 
@@ -58,17 +58,8 @@ def register_routes(app):
 
     @app.route('/checkpoints')
     def checkpoints_index():
-        query = Checkpoint.query
-        fav = request.args.get('favourite')
-        nsfw = request.args.get('nsfw', 'all')
-        if fav == 'on':
-            query = query.filter_by(is_favourite=True)
-        if nsfw == 'sfw':
-            query = query.filter_by(is_nsfw=False)
-        elif nsfw == 'nsfw':
-            query = query.filter_by(is_nsfw=True)
-        checkpoints = query.order_by(Checkpoint.is_favourite.desc(), Checkpoint.name).all()
-        return render_template('checkpoints/index.html', checkpoints=checkpoints, favourite_filter=fav or '', nsfw_filter=nsfw)
+        checkpoints, fav, nsfw = apply_library_filters(Checkpoint.query, Checkpoint)
+        return render_template('checkpoints/index.html', checkpoints=checkpoints, favourite_filter=fav, nsfw_filter=nsfw)
 
     @app.route('/checkpoints/rescan', methods=['POST'])
     def rescan_checkpoints():
@@ -179,11 +170,7 @@ def register_routes(app):
         try:
             with open(html_path, 'r', encoding='utf-8', errors='ignore') as hf:
                 html_raw = hf.read()
-            clean_html = re.sub(r'<script[^>]*>.*?</script>', '', html_raw, flags=re.DOTALL)
-            clean_html = re.sub(r'<style[^>]*>.*?</style>', '', clean_html, flags=re.DOTALL)
-            clean_html = re.sub(r'<img[^>]*>', '', clean_html)
-            clean_html = re.sub(r'<[^>]+>', ' ', clean_html)
-            html_content = ' '.join(clean_html.split())
+            html_content = clean_html_text(html_raw)
         except Exception as e:
             logger.error("Error reading HTML for %s: %s", filename, e)
 
@@ -13,8 +13,8 @@ from services.prompts import build_prompt, _resolve_character, _ensure_character
 from services.sync import sync_detailers
 from services.file_io import get_available_loras
 from services.llm import load_prompt, call_llm
-from utils import _WARDROBE_KEYS
-from routes.shared import register_common_routes
+from utils import _BODY_GROUP_KEYS, clean_html_text
+from routes.shared import register_common_routes, apply_library_filters
 
 logger = logging.getLogger('gaze')
 
@@ -47,7 +47,7 @@ def register_routes(app):
             selected_fields.append(f'identity::{key}')
         selected_fields.append('special::name')
         wardrobe = character.get_active_wardrobe()
-        for key in _WARDROBE_KEYS:
+        for key in _BODY_GROUP_KEYS:
             if wardrobe.get(key):
                 selected_fields.append(f'wardrobe::{key}')
         selected_fields.extend(['lora::lora_triggers'])
@@ -87,17 +87,8 @@ def register_routes(app):
 
     @app.route('/detailers')
     def detailers_index():
-        query = Detailer.query
-        fav = request.args.get('favourite')
-        nsfw = request.args.get('nsfw', 'all')
-        if fav == 'on':
-            query = query.filter_by(is_favourite=True)
-        if nsfw == 'sfw':
-            query = query.filter_by(is_nsfw=False)
-        elif nsfw == 'nsfw':
-            query = query.filter_by(is_nsfw=True)
-        detailers = query.order_by(Detailer.is_favourite.desc(), Detailer.name).all()
-        return render_template('detailers/index.html', detailers=detailers, favourite_filter=fav or '', nsfw_filter=nsfw)
+        detailers, fav, nsfw = apply_library_filters(Detailer.query, Detailer)
+        return render_template('detailers/index.html', detailers=detailers, favourite_filter=fav, nsfw_filter=nsfw)
 
     @app.route('/detailers/rescan', methods=['POST'])
     def rescan_detailers():
@@ -296,11 +287,7 @@ def register_routes(app):
         try:
             with open(html_path, 'r', encoding='utf-8', errors='ignore') as hf:
                 html_raw = hf.read()
-            clean_html = re.sub(r'<script[^>]*>.*?</script>', '', html_raw, flags=re.DOTALL)
-            clean_html = re.sub(r'<style[^>]*>.*?</style>', '', clean_html, flags=re.DOTALL)
-            clean_html = re.sub(r'<img[^>]*>', '', clean_html)
-            clean_html = re.sub(r'<[^>]+>', ' ', clean_html)
-            html_content = ' '.join(clean_html.split())
+            html_content = clean_html_text(html_raw)
         except Exception as e:
             logger.error("Error reading HTML %s: %s", html_filename, e)
 
@@ -2,6 +2,7 @@ import json
 import os
 import re
 import logging
+from utils import clean_html_text
 
 from flask import render_template, request, redirect, url_for, flash, session
 from sqlalchemy.orm.attributes import flag_modified
@@ -13,7 +14,7 @@ from services.prompts import build_prompt, _resolve_character, _ensure_character
 from services.sync import sync_looks
 from services.file_io import get_available_loras, _count_look_assignments
 from services.llm import load_prompt, call_llm
-from routes.shared import register_common_routes
+from routes.shared import register_common_routes, apply_library_filters
 
 logger = logging.getLogger('gaze')
 
@@ -57,18 +58,9 @@ def register_routes(app):
 
     @app.route('/looks')
     def looks_index():
-        query = Look.query
-        fav = request.args.get('favourite')
-        nsfw = request.args.get('nsfw', 'all')
-        if fav == 'on':
-            query = query.filter_by(is_favourite=True)
-        if nsfw == 'sfw':
-            query = query.filter_by(is_nsfw=False)
-        elif nsfw == 'nsfw':
-            query = query.filter_by(is_nsfw=True)
-        looks = query.order_by(Look.is_favourite.desc(), Look.name).all()
+        looks, fav, nsfw = apply_library_filters(Look.query, Look)
         look_assignments = _count_look_assignments()
-        return render_template('looks/index.html', looks=looks, look_assignments=look_assignments, favourite_filter=fav or '', nsfw_filter=nsfw)
+        return render_template('looks/index.html', looks=looks, look_assignments=look_assignments, favourite_filter=fav, nsfw_filter=nsfw)
 
     @app.route('/looks/rescan', methods=['POST'])
     def rescan_looks():
@@ -320,7 +312,7 @@ Character ID: {character_slug}"""
             character_data['lora'] = lora_data
 
         except Exception as e:
-            logger.exception(f"LLM character generation error: {e}")
+            logger.exception("LLM character generation error: %s", e)
             flash(f'Failed to generate character with AI: {e}', 'error')
             return redirect(url_for('look_detail', slug=slug))
         else:
@@ -494,11 +486,7 @@ Character ID: {character_slug}"""
         try:
             with open(html_path, 'r', encoding='utf-8', errors='ignore') as hf:
                 html_raw = hf.read()
-            clean_html = re.sub(r'<script[^>]*>.*?</script>', '', html_raw, flags=re.DOTALL)
-            clean_html = re.sub(r'<style[^>]*>.*?</style>', '', clean_html, flags=re.DOTALL)
-            clean_html = re.sub(r'<img[^>]*>', '', clean_html)
-            clean_html = re.sub(r'<[^>]+>', ' ', clean_html)
-            html_content = ' '.join(clean_html.split())
+            html_content = clean_html_text(html_raw)
         except Exception as e:
             logger.error("Error reading HTML %s: %s", html_filename, e)
 
@@ -13,9 +13,9 @@ from services.job_queue import _enqueue_job, _make_finalize, _enqueue_task
 from services.prompts import build_prompt, _resolve_character, _ensure_character_fields, _append_background
 from services.sync import sync_outfits
 from services.file_io import get_available_loras, _count_outfit_lora_assignments
-from utils import allowed_file, _LORA_DEFAULTS
+from utils import allowed_file, _LORA_DEFAULTS, clean_html_text
 from services.llm import load_prompt, call_llm
-from routes.shared import register_common_routes
+from routes.shared import register_common_routes, apply_library_filters
 
 logger = logging.getLogger('gaze')
 
@@ -25,18 +25,9 @@ def register_routes(app):
 
     @app.route('/outfits')
     def outfits_index():
-        query = Outfit.query
-        fav = request.args.get('favourite')
-        nsfw = request.args.get('nsfw', 'all')
-        if fav == 'on':
-            query = query.filter_by(is_favourite=True)
-        if nsfw == 'sfw':
-            query = query.filter_by(is_nsfw=False)
-        elif nsfw == 'nsfw':
-            query = query.filter_by(is_nsfw=True)
-        outfits = query.order_by(Outfit.is_favourite.desc(), Outfit.name).all()
+        outfits, fav, nsfw = apply_library_filters(Outfit.query, Outfit)
         lora_assignments = _count_outfit_lora_assignments()
-        return render_template('outfits/index.html', outfits=outfits, lora_assignments=lora_assignments, favourite_filter=fav or '', nsfw_filter=nsfw)
+        return render_template('outfits/index.html', outfits=outfits, lora_assignments=lora_assignments, favourite_filter=fav, nsfw_filter=nsfw)
 
     @app.route('/outfits/rescan', methods=['POST'])
     def rescan_outfits():
@@ -90,11 +81,7 @@ def register_routes(app):
         try:
             with open(html_path, 'r', encoding='utf-8', errors='ignore') as hf:
                 html_raw = hf.read()
-            clean_html = re.sub(r'<script[^>]*>.*?</script>', '', html_raw, flags=re.DOTALL)
-            clean_html = re.sub(r'<style[^>]*>.*?</style>', '', clean_html, flags=re.DOTALL)
-            clean_html = re.sub(r'<img[^>]*>', '', clean_html)
-            clean_html = re.sub(r'<[^>]+>', ' ', clean_html)
-            html_content = ' '.join(clean_html.split())
+            html_content = clean_html_text(html_raw)
         except Exception:
             pass
 
@@ -313,12 +300,12 @@ def register_routes(app):
             # No explicit field selection (e.g. batch generation) — build a selection
             # that includes identity + wardrobe + name + lora triggers, but NOT character
             # defaults (expression, pose, scene), so outfit covers stay generic.
-            from utils import _IDENTITY_KEYS, _WARDROBE_KEYS
-            for key in _IDENTITY_KEYS:
+            from utils import _BODY_GROUP_KEYS
+            for key in _BODY_GROUP_KEYS:
                 if character.data.get('identity', {}).get(key):
                     selected_fields.append(f'identity::{key}')
             outfit_wardrobe = outfit.data.get('wardrobe', {})
-            for key in _WARDROBE_KEYS:
+            for key in _BODY_GROUP_KEYS:
                 if outfit_wardrobe.get(key):
                     selected_fields.append(f'wardrobe::{key}')
             selected_fields.append('special::name')
@@ -8,7 +8,7 @@ from sqlalchemy.orm.attributes import flag_modified
 from services.sync import sync_presets
 from services.generation import generate_from_preset
 from services.llm import load_prompt, call_llm
-from routes.shared import register_common_routes
+from routes.shared import register_common_routes, apply_library_filters
 
 logger = logging.getLogger('gaze')
 
@@ -18,8 +18,9 @@ def register_routes(app):
 
     @app.route('/presets')
     def presets_index():
-        presets = Preset.query.order_by(Preset.filename).all()
-        return render_template('presets/index.html', presets=presets)
+        presets, fav, nsfw = apply_library_filters(Preset.query, Preset)
+        return render_template('presets/index.html', presets=presets,
+                               favourite_filter=fav, nsfw_filter=nsfw)
 
     @app.route('/preset/<path:slug>')
     def preset_detail(slug):
@@ -63,7 +63,7 @@
             clean_json = llm_response.replace('```json', '').replace('```', '').strip()
             new_data = json.loads(clean_json)
         except Exception as e:
-            logger.exception(f"Regenerate tags LLM error for {category}/{slug}")
+            logger.exception("Regenerate tags LLM error for %s/%s", category, slug)
             return {'error': f'LLM error: {str(e)}'}, 500
 
         # Preserve protected fields from original
@@ -106,7 +106,7 @@
                 with open(file_path, 'w') as f:
                     json.dump(new_data, f, indent=2)
             except Exception as e:
-                logger.warning(f"Could not write {file_path}: {e}")
+                logger.warning("Could not write %s: %s", file_path, e)
 
             migrated += 1
 
@@ -122,7 +122,7 @@
                 migrated += 1
 
         db.session.commit()
-        logger.info(f"Migrated {migrated} resources from list tags to dict tags")
+        logger.info("Migrated %d resources from list tags to dict tags", migrated)
         return {'success': True, 'migrated': migrated}
 
     def _make_regen_task(category, slug, name, system_prompt):
@@ -13,8 +13,8 @@ from services.prompts import build_prompt, _resolve_character, _ensure_character
 from services.sync import sync_scenes
 from services.file_io import get_available_loras
 from services.llm import load_prompt, call_llm
-from routes.shared import register_common_routes
-from utils import _WARDROBE_KEYS
+from routes.shared import register_common_routes, apply_library_filters
+from utils import _BODY_GROUP_KEYS, clean_html_text
 
 logger = logging.getLogger('gaze')
 
@@ -24,17 +24,8 @@ def register_routes(app):
 
     @app.route('/scenes')
     def scenes_index():
-        query = Scene.query
-        fav = request.args.get('favourite')
-        nsfw = request.args.get('nsfw', 'all')
-        if fav == 'on':
-            query = query.filter_by(is_favourite=True)
-        if nsfw == 'sfw':
-            query = query.filter_by(is_nsfw=False)
-        elif nsfw == 'nsfw':
-            query = query.filter_by(is_nsfw=True)
-        scenes = query.order_by(Scene.is_favourite.desc(), Scene.name).all()
-        return render_template('scenes/index.html', scenes=scenes, favourite_filter=fav or '', nsfw_filter=nsfw)
+        scenes, fav, nsfw = apply_library_filters(Scene.query, Scene)
+        return render_template('scenes/index.html', scenes=scenes, favourite_filter=fav, nsfw_filter=nsfw)
 
     @app.route('/scenes/rescan', methods=['POST'])
     def rescan_scenes():
@@ -177,7 +168,7 @@ def register_routes(app):
             selected_fields.append(f'identity::{key}')
         selected_fields.append('special::name')
         wardrobe = character.get_active_wardrobe()
-        for key in _WARDROBE_KEYS:
+        for key in _BODY_GROUP_KEYS:
             if wardrobe.get(key):
                 selected_fields.append(f'wardrobe::{key}')
         selected_fields.extend(['defaults::scene', 'lora::lora_triggers'])
@@ -312,11 +303,7 @@ def register_routes(app):
         try:
             with open(html_path, 'r', encoding='utf-8', errors='ignore') as hf:
                 html_raw = hf.read()
-            clean_html = re.sub(r'<script[^>]*>.*?</script>', '', html_raw, flags=re.DOTALL)
-            clean_html = re.sub(r'<style[^>]*>.*?</style>', '', clean_html, flags=re.DOTALL)
-            clean_html = re.sub(r'<img[^>]*>', '', clean_html)
-            clean_html = re.sub(r'<[^>]+>', ' ', clean_html)
-            html_content = ' '.join(clean_html.split())
+            html_content = clean_html_text(html_raw)
         except Exception as e:
             logger.error("Error reading HTML %s: %s", html_filename, e)
 
@@ -62,7 +62,7 @@
             db.session.commit()
             logger.info("Default checkpoint saved to database: %s", checkpoint_path)
         except Exception as e:
-            logger.error(f"Failed to persist checkpoint to database: {e}")
+            logger.error("Failed to persist checkpoint to database: %s", e)
             db.session.rollback()
 
         # Also persist to comfy_workflow.json for backwards compatibility
@@ -78,7 +78,7 @@
             with open(workflow_path, 'w') as f:
                 json.dump(workflow, f, indent=2)
         except Exception as e:
-            logger.error(f"Failed to persist checkpoint to workflow file: {e}")
+            logger.error("Failed to persist checkpoint to workflow file: %s", e)
 
         return {'status': 'ok'}
 
@@ -20,6 +20,23 @@ from utils import allowed_file
 logger = logging.getLogger('gaze')
 
 
+def apply_library_filters(query, model_class):
+    """Apply standard favourite/NSFW filters and sorting to a library query.
+
+    Returns (items, favourite_filter, nsfw_filter) tuple.
+    """
+    fav = request.args.get('favourite')
+    nsfw = request.args.get('nsfw', 'all')
+    if fav == 'on':
+        query = query.filter_by(is_favourite=True)
+    if nsfw == 'sfw':
+        query = query.filter_by(is_nsfw=False)
+    elif nsfw == 'nsfw':
+        query = query.filter_by(is_nsfw=True)
+    items = query.order_by(model_class.is_favourite.desc(), model_class.name).all()
+    return items, fav or '', nsfw
+
+
 # ---------------------------------------------------------------------------
 # Category configuration registry
 # ---------------------------------------------------------------------------
@@ -237,11 +254,16 @@ def _register_replace_cover_route(app, cfg):
     def replace_cover(slug):
         entity = Model.query.filter_by(slug=slug).first_or_404()
         preview_path = request.form.get('preview_path')
-        if preview_path and os.path.exists(
-                os.path.join(current_app.config['UPLOAD_FOLDER'], preview_path)):
-            entity.image_path = preview_path
-            db.session.commit()
-            flash('Cover image updated!')
+        if preview_path:
+            full_path = os.path.realpath(
+                os.path.join(current_app.config['UPLOAD_FOLDER'], preview_path))
+            upload_root = os.path.realpath(current_app.config['UPLOAD_FOLDER'])
+            if full_path.startswith(upload_root + os.sep) and os.path.exists(full_path):
+                entity.image_path = preview_path
+                db.session.commit()
+                flash('Cover image updated!')
+            else:
+                flash('Invalid preview path.', 'error')
         else:
             flash('No valid preview image selected.', 'error')
         return redirect(url_for(detail_ep, slug=slug))
@@ -14,8 +14,8 @@ from services.prompts import build_prompt, _resolve_character, _ensure_character
|
|||||||
from services.sync import sync_styles
|
from services.sync import sync_styles
|
||||||
from services.file_io import get_available_loras
|
from services.file_io import get_available_loras
|
||||||
from services.llm import load_prompt, call_llm
|
from services.llm import load_prompt, call_llm
|
||||||
from routes.shared import register_common_routes
|
from routes.shared import register_common_routes, apply_library_filters
|
||||||
from utils import _WARDROBE_KEYS
|
from utils import _BODY_GROUP_KEYS, clean_html_text
|
||||||
|
|
||||||
logger = logging.getLogger('gaze')
|
logger = logging.getLogger('gaze')
|
||||||
|
|
||||||
@@ -47,7 +47,7 @@ def register_routes(app):
|
|||||||
selected_fields.append(f'identity::{key}')
|
selected_fields.append(f'identity::{key}')
|
||||||
selected_fields.append('special::name')
|
selected_fields.append('special::name')
|
||||||
wardrobe = character.get_active_wardrobe()
|
wardrobe = character.get_active_wardrobe()
|
||||||
for key in _WARDROBE_KEYS:
|
for key in _BODY_GROUP_KEYS:
|
||||||
if wardrobe.get(key):
|
if wardrobe.get(key):
|
||||||
selected_fields.append(f'wardrobe::{key}')
|
selected_fields.append(f'wardrobe::{key}')
|
||||||
selected_fields.extend(['style::artist_name', 'style::artistic_style', 'lora::lora_triggers'])
|
selected_fields.extend(['style::artist_name', 'style::artistic_style', 'lora::lora_triggers'])
|
||||||
@@ -82,17 +82,8 @@ def register_routes(app):
|
|||||||
|
|
||||||
@app.route('/styles')
|
@app.route('/styles')
|
||||||
def styles_index():
|
def styles_index():
|
||||||
query = Style.query
|
styles, fav, nsfw = apply_library_filters(Style.query, Style)
|
||||||
fav = request.args.get('favourite')
|
return render_template('styles/index.html', styles=styles, favourite_filter=fav, nsfw_filter=nsfw)
|
||||||
nsfw = request.args.get('nsfw', 'all')
|
|
||||||
if fav == 'on':
|
|
||||||
query = query.filter_by(is_favourite=True)
|
|
||||||
if nsfw == 'sfw':
|
|
||||||
query = query.filter_by(is_nsfw=False)
|
|
||||||
elif nsfw == 'nsfw':
|
|
||||||
query = query.filter_by(is_nsfw=True)
|
|
||||||
styles = query.order_by(Style.is_favourite.desc(), Style.name).all()
|
|
||||||
return render_template('styles/index.html', styles=styles, favourite_filter=fav or '', nsfw_filter=nsfw)
|
|
||||||
|
|
||||||
@app.route('/styles/rescan', methods=['POST'])
|
@app.route('/styles/rescan', methods=['POST'])
|
||||||
def rescan_styles():
|
def rescan_styles():
|
||||||
@@ -323,11 +314,7 @@ def register_routes(app):
|
|||||||
try:
|
try:
|
||||||
with open(html_path, 'r', encoding='utf-8', errors='ignore') as hf:
|
with open(html_path, 'r', encoding='utf-8', errors='ignore') as hf:
|
||||||
html_raw = hf.read()
|
html_raw = hf.read()
|
||||||
clean_html = re.sub(r'<script[^>]*>.*?</script>', '', html_raw, flags=re.DOTALL)
|
html_content = clean_html_text(html_raw)
|
||||||
clean_html = re.sub(r'<style[^>]*>.*?</style>', '', clean_html, flags=re.DOTALL)
|
|
||||||
clean_html = re.sub(r'<img[^>]*>', '', clean_html)
|
|
||||||
clean_html = re.sub(r'<[^>]+>', ' ', clean_html)
|
|
||||||
html_content = ' '.join(clean_html.split())
|
|
||||||
except Exception:
|
except Exception:
|
||||||
pass
|
pass
|
||||||
|
|
||||||
|
|||||||
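The extracted `apply_library_filters()` helper itself is not shown in this hunk; the following is a minimal sketch of the behaviour implied by the deleted inline code, rewritten over plain lists/dicts (no Flask `request` or SQLAlchemy query) so it stands alone — the exact signature and return shape are assumptions:

```python
def apply_library_filters(items, args):
    """Filter and order library records the way the deleted inline code did.

    items -- list of dicts with 'is_favourite', 'is_nsfw', 'name' keys
    args  -- a request-args-like dict with optional 'favourite' / 'nsfw' keys
    Returns (filtered_items, fav, nsfw), mirroring the helper's 3-tuple.
    """
    fav = args.get('favourite') or ''
    nsfw = args.get('nsfw', 'all')
    if fav == 'on':
        items = [i for i in items if i['is_favourite']]
    if nsfw == 'sfw':
        items = [i for i in items if not i['is_nsfw']]
    elif nsfw == 'nsfw':
        items = [i for i in items if i['is_nsfw']]
    # Favourites first, then alphabetical — same intent as the old
    # order_by(is_favourite.desc(), name) clause.
    return sorted(items, key=lambda i: (not i['is_favourite'], i['name'])), fav, nsfw
```

Centralising this means every library index route (styles, outfits, actions, …) applies identical favourite/NSFW semantics instead of each re-implementing the chain.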
@@ -244,7 +244,7 @@ Generate a complete {target_category.rstrip('s')} profile with all required fiel
             new_data[target_name_key] = new_name
 
         except Exception as e:
-            logger.exception(f"LLM transfer error: {e}")
+            logger.exception("LLM transfer error: %s", e)
             flash(f'Failed to generate {target_category.rstrip("s")} with AI: {e}')
             return redirect(url_for('transfer_resource', category=category, slug=slug))
     else:
@@ -290,7 +290,7 @@ Generate a complete {target_category.rstrip('s')} profile with all required fiel
                     lora_moved = True
                     flash(f'Moved LoRA file to {target_lora_dir}')
                 except Exception as lora_e:
-                    logger.exception(f"LoRA move error: {lora_e}")
+                    logger.exception("LoRA move error: %s", lora_e)
                     flash(f'Warning: Failed to move LoRA file: {lora_e}', 'warning')
             else:
                 flash(f'Warning: Source LoRA file not found at {abs_source_path}', 'warning')
@@ -317,7 +317,7 @@ Generate a complete {target_category.rstrip('s')} profile with all required fiel
                 db.session.delete(resource)
                 flash(f'Removed original {category.rstrip("s")}: {resource_name}')
             except Exception as rm_e:
-                logger.exception(f"Error removing original: {rm_e}")
+                logger.exception("Error removing original: %s", rm_e)
                 flash(f'Warning: Failed to remove original: {rm_e}', 'warning')
 
         db.session.commit()
@@ -325,7 +325,7 @@ Generate a complete {target_category.rstrip('s')} profile with all required fiel
         return redirect(url_for(target_config['index_route'], highlight=safe_slug))
 
     except Exception as e:
-        logger.exception(f"Transfer save error: {e}")
+        logger.exception("Transfer save error: %s", e)
         flash(f'Failed to save transferred {target_category.rstrip("s")}: {e}')
         return redirect(url_for('transfer_resource', category=category, slug=slug))
 
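The f-string → lazy `%` change above is not just style: with `%`-style arguments, `logging` only stringifies the values if the record actually passes the level check. A self-contained demonstration (the `Expensive` class is invented for illustration):

```python
import logging

logging.basicConfig(level=logging.ERROR)
logger = logging.getLogger('gaze')

class Expensive:
    """Counts how many times it is stringified."""
    calls = 0
    def __str__(self):
        Expensive.calls += 1
        return 'expensive'

obj = Expensive()

# Lazy %-style: DEBUG is below the ERROR threshold, so the record is
# dropped before formatting — __str__ is never called.
logger.debug("value: %s", obj)

# An f-string pays the formatting cost up front, emitted or not.
msg = f"value: {obj}"
```

The same applies to `logger.exception(...)`: the traceback is still attached automatically, only the message formatting becomes lazy.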
@@ -2,6 +2,7 @@ import json
 import logging
 import requests
 from flask import current_app
+from services.workflow import NODE_CHECKPOINT
 
 logger = logging.getLogger('gaze')
 
@@ -14,9 +15,11 @@ def get_loaded_checkpoint():
         if resp.ok:
             history = resp.json()
             if history:
-                latest = max(history.values(), key=lambda j: j.get('status', {}).get('status_str', ''))
+                # Sort by prompt ID (numeric string) to get the most recent job
+                latest_id = max(history.keys())
+                latest = history[latest_id]
                 nodes = latest.get('prompt', [None, None, {}])[2]
-                return nodes.get('4', {}).get('inputs', {}).get('ckpt_name')
+                return nodes.get(NODE_CHECKPOINT, {}).get('inputs', {}).get('ckpt_name')
     except Exception:
         pass
     return None
@@ -34,26 +37,27 @@ def _ensure_checkpoint_loaded(checkpoint_path):
         if resp.ok:
             history = resp.json()
             if history:
-                latest = max(history.values(), key=lambda j: j.get('status', {}).get('status_str', ''))
+                latest_id = max(history.keys())
+                latest = history[latest_id]
                 nodes = latest.get('prompt', [None, None, {}])[2]
-                loaded_ckpt = nodes.get('4', {}).get('inputs', {}).get('ckpt_name')
+                loaded_ckpt = nodes.get(NODE_CHECKPOINT, {}).get('inputs', {}).get('ckpt_name')
 
         # If the loaded checkpoint matches what we want, no action needed
         if loaded_ckpt == checkpoint_path:
-            logger.info(f"Checkpoint {checkpoint_path} already loaded in ComfyUI")
+            logger.info("Checkpoint %s already loaded in ComfyUI", checkpoint_path)
             return
 
         # Checkpoint doesn't match or couldn't determine - force unload all models
-        logger.info(f"Forcing ComfyUI to unload models to ensure {checkpoint_path} loads")
+        logger.info("Forcing ComfyUI to unload models to ensure %s loads", checkpoint_path)
         requests.post(f'{url}/free', json={'unload_models': True}, timeout=5)
     except Exception as e:
-        logger.warning(f"Failed to check/force checkpoint reload: {e}")
+        logger.warning("Failed to check/force checkpoint reload: %s", e)
 
 
 def queue_prompt(prompt_workflow, client_id=None):
     """POST a workflow to ComfyUI's /prompt endpoint."""
     # Ensure the checkpoint in the workflow is loaded in ComfyUI
-    checkpoint_path = prompt_workflow.get('4', {}).get('inputs', {}).get('ckpt_name')
+    checkpoint_path = prompt_workflow.get(NODE_CHECKPOINT, {}).get('inputs', {}).get('ckpt_name')
     _ensure_checkpoint_loaded(checkpoint_path)
 
     p = {"prompt": prompt_workflow}
@@ -72,7 +76,10 @@ def queue_prompt(prompt_workflow, client_id=None):
     logger.debug("=" * 80)
 
     data = json.dumps(p).encode('utf-8')
-    response = requests.post(f"{url}/prompt", data=data)
+    response = requests.post(f"{url}/prompt", data=data, timeout=30)
+    if not response.ok:
+        logger.error("ComfyUI returned HTTP %s: %s", response.status_code, response.text[:500])
+        raise RuntimeError(f"ComfyUI returned HTTP {response.status_code}")
     response_json = response.json()
 
     # Log the response from ComfyUI
@@ -90,7 +97,7 @@ def queue_prompt(prompt_workflow, client_id=None):
 def get_history(prompt_id):
     """Poll ComfyUI /history for results of a given prompt_id."""
     url = current_app.config['COMFYUI_URL']
-    response = requests.get(f"{url}/history/{prompt_id}")
+    response = requests.get(f"{url}/history/{prompt_id}", timeout=10)
     history_json = response.json()
 
     # Log detailed history response for debugging
@@ -128,6 +135,6 @@ def get_image(filename, subfolder, folder_type):
     data = {"filename": filename, "subfolder": subfolder, "type": folder_type}
     logger.debug("Fetching image from ComfyUI: filename=%s, subfolder=%s, type=%s",
                  filename, subfolder, folder_type)
-    response = requests.get(f"{url}/view", params=data)
+    response = requests.get(f"{url}/view", params=data, timeout=30)
     logger.debug("Image retrieved: %d bytes (status: %s)", len(response.content), response.status_code)
     return response.content
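The fail-fast pattern added to `queue_prompt()` — check `response.ok` and raise with the status and a truncated body before touching `.json()` — could be factored into a small guard. A sketch (helper and `FakeResponse` are hypothetical, written against the `requests.Response` attributes the diff already uses):

```python
def check_response(resp, what="ComfyUI"):
    """Raise early on an HTTP error instead of failing later on bad JSON.

    resp is any object exposing .ok, .status_code and .text
    (e.g. requests.Response).
    """
    if not resp.ok:
        # Truncate the body so a long HTML error page doesn't flood the log.
        raise RuntimeError(f"{what} returned HTTP {resp.status_code}: {resp.text[:500]}")
    return resp


class FakeResponse:
    """Minimal stand-in so the guard can be exercised without a server."""
    def __init__(self, status_code, text=""):
        self.status_code = status_code
        self.text = text

    @property
    def ok(self):
        return self.status_code < 400
```

Without the guard, a 502 HTML error page reaches `response.json()` and surfaces as an opaque `JSONDecodeError`; with it, the log carries the real status code. The added `timeout=` arguments matter for the same reason: `requests` has no default timeout, so a hung ComfyUI would otherwise block the worker forever.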
@@ -205,13 +205,13 @@ def call_llm(prompt, system_prompt="You are a creative assistant."):
     except requests.exceptions.RequestException as e:
         error_body = ""
         try: error_body = f" - Body: {response.text}"
-        except: pass
+        except Exception: pass
         raise RuntimeError(f"LLM API request failed: {str(e)}{error_body}") from e
     except (KeyError, IndexError) as e:
         # Log the raw response to help diagnose the issue
         raw = ""
         try: raw = response.text[:500]
-        except: pass
+        except Exception: pass
         logger.warning("Unexpected LLM response format (key=%s). Raw response: %s", e, raw)
         if format_retries > 0:
             format_retries -= 1
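Why `except Exception:` rather than a bare `except:`: the bare form also swallows `BaseException` subclasses such as `SystemExit` and `KeyboardInterrupt`, so a Ctrl-C or an intentional shutdown can be silently eaten inside a retry loop. A minimal contrast (functions invented for illustration):

```python
def swallow_bare():
    # A bare `except:` catches BaseException — including SystemExit —
    # which can mask an intentional shutdown. Deliberately bad, for contrast.
    try:
        raise SystemExit(1)
    except:  # noqa: E722
        return "swallowed"


def swallow_exception():
    # `except Exception` lets SystemExit / KeyboardInterrupt propagate,
    # while still catching ordinary errors like AttributeError here.
    try:
        raise SystemExit(1)
    except Exception:
        return "swallowed"
```

In `call_llm()` the guarded code is only a best-effort read of `response.text`, so `except Exception:` keeps the same resilience without hiding interrupts.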
176  services/mcp.py
@@ -12,147 +12,83 @@ CHAR_MCP_COMPOSE_DIR = os.path.join(MCP_TOOLS_DIR, 'character-mcp')
 CHAR_MCP_REPO_URL = 'https://git.liveaodh.com/aodhan/character-mcp.git'
 
 
-def _ensure_mcp_repo():
-    """Clone or update the danbooru-mcp source repository inside tools/.
+def _ensure_repo(compose_dir, repo_url, name):
+    """Clone or update an MCP source repository inside tools/.
 
-    - If ``tools/danbooru-mcp/`` does not exist, clone from MCP_REPO_URL.
+    - If the directory does not exist, clone from repo_url.
     - If it already exists, run ``git pull`` to fetch the latest changes.
     Errors are non-fatal.
     """
     os.makedirs(MCP_TOOLS_DIR, exist_ok=True)
     try:
-        if not os.path.isdir(MCP_COMPOSE_DIR):
-            logger.info('Cloning danbooru-mcp from %s …', MCP_REPO_URL)
+        if not os.path.isdir(compose_dir):
+            logger.info('Cloning %s from %s …', name, repo_url)
             subprocess.run(
-                ['git', 'clone', MCP_REPO_URL, MCP_COMPOSE_DIR],
+                ['git', 'clone', repo_url, compose_dir],
                 timeout=120, check=True,
             )
-            logger.info('danbooru-mcp cloned successfully.')
+            logger.info('%s cloned successfully.', name)
         else:
-            logger.info('Updating danbooru-mcp via git pull …')
+            logger.info('Updating %s via git pull …', name)
             subprocess.run(
                 ['git', 'pull'],
-                cwd=MCP_COMPOSE_DIR,
+                cwd=compose_dir,
                 timeout=60, check=True,
             )
-            logger.info('danbooru-mcp updated.')
+            logger.info('%s updated.', name)
     except FileNotFoundError:
-        logger.warning('git not found on PATH — danbooru-mcp repo will not be cloned/updated.')
+        logger.warning('git not found on PATH — %s repo will not be cloned/updated.', name)
     except subprocess.CalledProcessError as e:
-        logger.warning('git operation failed for danbooru-mcp: %s', e)
+        logger.warning('git operation failed for %s: %s', name, e)
     except subprocess.TimeoutExpired:
-        logger.warning('git timed out while cloning/updating danbooru-mcp.')
+        logger.warning('git timed out while cloning/updating %s.', name)
     except Exception as e:
-        logger.warning('Could not clone/update danbooru-mcp repo: %s', e)
+        logger.warning('Could not clone/update %s repo: %s', name, e)
 
 
+def _ensure_server_running(compose_dir, repo_url, container_name, name):
+    """Ensure an MCP repo is present/up-to-date, then start the Docker
+    container if it is not already running.
+
+    Uses ``docker compose up -d`` so the image is built automatically on first
+    run. Errors are non-fatal — the app will still start even if Docker is
+    unavailable.
+
+    Skipped when ``SKIP_MCP_AUTOSTART=true`` (set by docker-compose, where the
+    MCP service is managed by compose instead).
+    """
+    if os.environ.get('SKIP_MCP_AUTOSTART', '').lower() == 'true':
+        logger.info('SKIP_MCP_AUTOSTART set — skipping %s auto-start.', name)
+        return
+    _ensure_repo(compose_dir, repo_url, name)
+    try:
+        result = subprocess.run(
+            ['docker', 'ps', '--filter', f'name={container_name}', '--format', '{{.Names}}'],
+            capture_output=True, text=True, timeout=10,
+        )
+        if container_name in result.stdout:
+            logger.info('%s container already running.', name)
+            return
+        logger.info('Starting %s container via docker compose …', name)
+        subprocess.run(
+            ['docker', 'compose', 'up', '-d'],
+            cwd=compose_dir,
+            timeout=120,
+        )
+        logger.info('%s container started.', name)
+    except FileNotFoundError:
+        logger.warning('docker not found on PATH — %s will not be started automatically.', name)
+    except subprocess.TimeoutExpired:
+        logger.warning('docker timed out while starting %s.', name)
+    except Exception as e:
+        logger.warning('Could not ensure %s is running: %s', name, e)
 
 
 def ensure_mcp_server_running():
-    """Ensure the danbooru-mcp repo is present/up-to-date, then start the
-    Docker container if it is not already running.
-
-    Uses ``docker compose up -d`` so the image is built automatically on first
-    run. Errors are non-fatal — the app will still start even if Docker is
-    unavailable.
-
-    Skipped when ``SKIP_MCP_AUTOSTART=true`` (set by docker-compose, where the
-    danbooru-mcp service is managed by compose instead).
-    """
-    if os.environ.get('SKIP_MCP_AUTOSTART', '').lower() == 'true':
-        logger.info('SKIP_MCP_AUTOSTART set — skipping danbooru-mcp auto-start.')
-        return
-    _ensure_mcp_repo()
-    try:
-        result = subprocess.run(
-            ['docker', 'ps', '--filter', 'name=danbooru-mcp', '--format', '{{.Names}}'],
-            capture_output=True, text=True, timeout=10,
-        )
-        if 'danbooru-mcp' in result.stdout:
-            logger.info('danbooru-mcp container already running.')
-            return
-        # Container not running — start it via docker compose
-        logger.info('Starting danbooru-mcp container via docker compose …')
-        subprocess.run(
-            ['docker', 'compose', 'up', '-d'],
-            cwd=MCP_COMPOSE_DIR,
-            timeout=120,
-        )
-        logger.info('danbooru-mcp container started.')
-    except FileNotFoundError:
-        logger.warning('docker not found on PATH — danbooru-mcp will not be started automatically.')
-    except subprocess.TimeoutExpired:
-        logger.warning('docker timed out while starting danbooru-mcp.')
-    except Exception as e:
-        logger.warning('Could not ensure danbooru-mcp is running: %s', e)
-
-
-def _ensure_character_mcp_repo():
-    """Clone or update the character-mcp source repository inside tools/.
-
-    - If ``tools/character-mcp/`` does not exist, clone from CHAR_MCP_REPO_URL.
-    - If it already exists, run ``git pull`` to fetch the latest changes.
-    Errors are non-fatal.
-    """
-    os.makedirs(MCP_TOOLS_DIR, exist_ok=True)
-    try:
-        if not os.path.isdir(CHAR_MCP_COMPOSE_DIR):
-            logger.info('Cloning character-mcp from %s …', CHAR_MCP_REPO_URL)
-            subprocess.run(
-                ['git', 'clone', CHAR_MCP_REPO_URL, CHAR_MCP_COMPOSE_DIR],
-                timeout=120, check=True,
-            )
-            logger.info('character-mcp cloned successfully.')
-        else:
-            logger.info('Updating character-mcp via git pull …')
-            subprocess.run(
-                ['git', 'pull'],
-                cwd=CHAR_MCP_COMPOSE_DIR,
-                timeout=60, check=True,
-            )
-            logger.info('character-mcp updated.')
-    except FileNotFoundError:
-        logger.warning('git not found on PATH — character-mcp repo will not be cloned/updated.')
-    except subprocess.CalledProcessError as e:
-        logger.warning('git operation failed for character-mcp: %s', e)
-    except subprocess.TimeoutExpired:
-        logger.warning('git timed out while cloning/updating character-mcp.')
-    except Exception as e:
-        logger.warning('Could not clone/update character-mcp repo: %s', e)
+    """Ensure the danbooru-mcp Docker container is running."""
+    _ensure_server_running(MCP_COMPOSE_DIR, MCP_REPO_URL, 'danbooru-mcp', 'danbooru-mcp')
 
 
 def ensure_character_mcp_server_running():
-    """Ensure the character-mcp repo is present/up-to-date, then start the
-    Docker container if it is not already running.
-
-    Uses ``docker compose up -d`` so the image is built automatically on first
-    run. Errors are non-fatal — the app will still start even if Docker is
-    unavailable.
-
-    Skipped when ``SKIP_MCP_AUTOSTART=true`` (set by docker-compose, where the
-    character-mcp service is managed by compose instead).
-    """
-    if os.environ.get('SKIP_MCP_AUTOSTART', '').lower() == 'true':
-        logger.info('SKIP_MCP_AUTOSTART set — skipping character-mcp auto-start.')
-        return
-    _ensure_character_mcp_repo()
-    try:
-        result = subprocess.run(
-            ['docker', 'ps', '--filter', 'name=character-mcp', '--format', '{{.Names}}'],
-            capture_output=True, text=True, timeout=10,
-        )
-        if 'character-mcp' in result.stdout:
-            logger.info('character-mcp container already running.')
-            return
-        # Container not running — start it via docker compose
-        logger.info('Starting character-mcp container via docker compose …')
-        subprocess.run(
-            ['docker', 'compose', 'up', '-d'],
-            cwd=CHAR_MCP_COMPOSE_DIR,
-            timeout=120,
-        )
-        logger.info('character-mcp container started.')
-    except FileNotFoundError:
-        logger.warning('docker not found on PATH — character-mcp will not be started automatically.')
-    except subprocess.TimeoutExpired:
-        logger.warning('docker timed out while starting character-mcp.')
-    except Exception as e:
-        logger.warning('Could not ensure character-mcp is running: %s', e)
 
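The dedup above keeps two explicit one-line wrappers. An alternative with the same effect is `functools.partial`, which binds the per-server arguments without defining a wrapper function; a sketch (the helper body is a stand-in that just records its arguments, and the path/URL values are invented):

```python
from functools import partial

def ensure_server_running(compose_dir, repo_url, container_name, name):
    """Stand-in for the real helper — merely returns its bound arguments."""
    return (compose_dir, repo_url, container_name, name)

# One bound callable per MCP server, instead of a def wrapper each.
ensure_mcp_server_running = partial(
    ensure_server_running,
    'tools/danbooru-mcp', 'https://example.invalid/danbooru-mcp.git',
    'danbooru-mcp', 'danbooru-mcp',
)
```

The explicit `def` wrappers chosen in the diff do have an edge: they carry their own docstrings and show a readable name in tracebacks, which `partial` objects do not.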
@@ -1,6 +1,6 @@
 import re
 from models import db, Character
-from utils import _IDENTITY_KEYS, _WARDROBE_KEYS, _BODY_GROUP_KEYS, parse_orientation
+from utils import _BODY_GROUP_KEYS, parse_orientation
 
 
 def _dedup_tags(prompt_str):
@@ -57,7 +57,7 @@ def _ensure_character_fields(character, selected_fields, include_wardrobe=True,
     include_defaults — also inject defaults::expression and defaults::pose (for outfit/look previews)
     """
     identity = character.data.get('identity', {})
-    for key in _IDENTITY_KEYS:
+    for key in _BODY_GROUP_KEYS:
         if identity.get(key):
             field_key = f'identity::{key}'
             if field_key not in selected_fields:
@@ -72,7 +72,7 @@ def _ensure_character_fields(character, selected_fields, include_wardrobe=True,
     selected_fields.append('special::name')
     if include_wardrobe:
         wardrobe = character.get_active_wardrobe()
-        for key in _WARDROBE_KEYS:
+        for key in _BODY_GROUP_KEYS:
             if wardrobe.get(key):
                 field_key = f'wardrobe::{key}'
                 if field_key not in selected_fields:
@@ -193,7 +193,7 @@ def ensure_default_outfit():
             "lora_weight": 0.8,
             "lora_triggers": ""
         },
-        "tags": []
+        "tags": {"outfit_type": "Default", "nsfw": False}
     }
 
     try:
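Fixing the default schema in `ensure_default_outfit()` does not touch outfits already saved with the old list-form `"tags": []`. A minimal upgrade pass for existing data, mirroring the new default from the hunk above (the function name is hypothetical; real records may carry additional tag keys):

```python
def migrate_tags(outfit):
    """Upgrade an outfit's legacy list-form tags to the dict schema in place.

    Old schema:  "tags": []
    New schema:  "tags": {"outfit_type": ..., "nsfw": ...}
    Dict-form tags are left untouched.
    """
    tags = outfit.get("tags")
    if isinstance(tags, list):
        # Legacy lists were always empty in practice, so nothing to carry over;
        # fall back to the same default ensure_default_outfit() now writes.
        outfit["tags"] = {"outfit_type": "Default", "nsfw": False}
    return outfit
```

Running a pass like this once over stored JSON keeps templates and filters from having to branch on `isinstance(tags, list)` forever.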
@@ -9,6 +9,27 @@ from services.prompts import _cross_dedup_prompts
|
|||||||
|
|
||||||
logger = logging.getLogger('gaze')
|
logger = logging.getLogger('gaze')
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# ComfyUI workflow node IDs (must match comfy_workflow.json)
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
NODE_KSAMPLER = "3"
|
||||||
|
NODE_CHECKPOINT = "4"
|
||||||
|
NODE_LATENT = "5"
|
||||||
|
NODE_POSITIVE = "6"
|
||||||
|
NODE_NEGATIVE = "7"
|
||||||
|
NODE_VAE_DECODE = "8"
|
||||||
|
NODE_SAVE = "9"
|
||||||
|
NODE_FACE_DETAILER = "11"
|
||||||
|
NODE_HAND_DETAILER = "13"
|
||||||
|
NODE_FACE_PROMPT = "14"
|
||||||
|
NODE_HAND_PROMPT = "15"
|
||||||
|
NODE_LORA_CHAR = "16"
|
||||||
|
NODE_LORA_OUTFIT = "17"
|
||||||
|
NODE_LORA_ACTION = "18"
|
||||||
|
NODE_LORA_STYLE = "19"
|
||||||
|
NODE_LORA_CHAR_B = "20"
|
||||||
|
NODE_VAE_LOADER = "21"
|
||||||
|
|
||||||
# Node IDs used by DetailerForEach in multi-char mode
|
# Node IDs used by DetailerForEach in multi-char mode
|
||||||
_SEGS_DETAILER_NODES = ['46', '47', '53', '54']
|
_SEGS_DETAILER_NODES = ['46', '47', '53', '54']
|
||||||
# Node IDs for per-character CLIP prompts in multi-char mode
|
# Node IDs for per-character CLIP prompts in multi-char mode
|
||||||
@@ -22,7 +43,7 @@ def _log_workflow_prompts(label, workflow):
|
|||||||
lora_details = []
|
lora_details = []
|
||||||
|
|
||||||
# Collect detailed LoRA information
|
# Collect detailed LoRA information
|
||||||
for node_id, label_str in [("16", "char/look"), ("17", "outfit"), ("18", "action"), ("19", "style/detail/scene"), ("20", "char_b")]:
|
for node_id, label_str in [(NODE_LORA_CHAR, "char/look"), (NODE_LORA_OUTFIT, "outfit"), (NODE_LORA_ACTION, "action"), (NODE_LORA_STYLE, "style/detail/scene"), (NODE_LORA_CHAR_B, "char_b")]:
|
||||||
if node_id in workflow:
|
if node_id in workflow:
|
||||||
name = workflow[node_id]["inputs"].get("lora_name", "")
|
name = workflow[node_id]["inputs"].get("lora_name", "")
|
||||||
if name:
|
if name:
|
||||||
@@ -41,13 +62,13 @@ def _log_workflow_prompts(label, workflow):
|
|||||||
|
|
||||||
# Extract VAE information
|
# Extract VAE information
|
||||||
vae_info = "(integrated)"
|
vae_info = "(integrated)"
|
||||||
if '21' in workflow:
|
if NODE_VAE_LOADER in workflow:
|
||||||
vae_info = workflow['21']['inputs'].get('vae_name', '(custom)')
|
vae_info = workflow[NODE_VAE_LOADER]['inputs'].get('vae_name', '(custom)')
|
||||||
|
|
||||||
# Extract adetailer information
|
# Extract adetailer information
|
||||||
adetailer_info = []
|
adetailer_info = []
|
||||||
# Single-char mode: FaceDetailer nodes 11 + 13
|
# Single-char mode: FaceDetailer nodes 11 + 13
|
||||||
for node_id, node_name in [("11", "Face"), ("13", "Hand")]:
|
for node_id, node_name in [(NODE_FACE_DETAILER, "Face"), (NODE_HAND_DETAILER, "Hand")]:
|
||||||
if node_id in workflow:
|
if node_id in workflow:
|
||||||
adetailer_info.append(f" {node_name} (Node {node_id}): steps={workflow[node_id]['inputs'].get('steps', '?')}, "
|
adetailer_info.append(f" {node_name} (Node {node_id}): steps={workflow[node_id]['inputs'].get('steps', '?')}, "
|
||||||
f"cfg={workflow[node_id]['inputs'].get('cfg', '?')}, "
|
f"cfg={workflow[node_id]['inputs'].get('cfg', '?')}, "
|
||||||
@@ -59,24 +80,24 @@ def _log_workflow_prompts(label, workflow):
|
|||||||
f"cfg={workflow[node_id]['inputs'].get('cfg', '?')}, "
|
f"cfg={workflow[node_id]['inputs'].get('cfg', '?')}, "
|
||||||
f"denoise={workflow[node_id]['inputs'].get('denoise', '?')}")
|
f"denoise={workflow[node_id]['inputs'].get('denoise', '?')}")
|
||||||
|
|
||||||
face_text = workflow.get('14', {}).get('inputs', {}).get('text', '')
|
face_text = workflow.get(NODE_FACE_PROMPT, {}).get('inputs', {}).get('text', '')
|
||||||
hand_text = workflow.get('15', {}).get('inputs', {}).get('text', '')
|
hand_text = workflow.get(NODE_HAND_PROMPT, {}).get('inputs', {}).get('text', '')
|
||||||
|
|
||||||
lines = [
|
lines = [
|
||||||
sep,
|
sep,
|
||||||
f" WORKFLOW PROMPTS [{label}]",
|
f" WORKFLOW PROMPTS [{label}]",
|
||||||
sep,
|
sep,
|
||||||
" MODEL CONFIGURATION:",
|
" MODEL CONFIGURATION:",
|
||||||
f" Checkpoint : {workflow['4']['inputs'].get('ckpt_name', '(not set)')}",
|
f" Checkpoint : {workflow[NODE_CHECKPOINT]['inputs'].get('ckpt_name', '(not set)')}",
|
||||||
f" VAE : {vae_info}",
|
f" VAE : {vae_info}",
|
||||||
"",
|
"",
|
||||||
" GENERATION SETTINGS:",
|
" GENERATION SETTINGS:",
|
||||||
f" Seed : {workflow['3']['inputs'].get('seed', '(not set)')}",
|
f" Seed : {workflow[NODE_KSAMPLER]['inputs'].get('seed', '(not set)')}",
|
||||||
f" Resolution : {workflow['5']['inputs'].get('width', '?')} x {workflow['5']['inputs'].get('height', '?')}",
|
f" Resolution : {workflow[NODE_LATENT]['inputs'].get('width', '?')} x {workflow[NODE_LATENT]['inputs'].get('height', '?')}",
|
||||||
f" Sampler : {workflow['3']['inputs'].get('sampler_name', '?')} / {workflow['3']['inputs'].get('scheduler', '?')}",
|
f" Sampler : {workflow[NODE_KSAMPLER]['inputs'].get('sampler_name', '?')} / {workflow[NODE_KSAMPLER]['inputs'].get('scheduler', '?')}",
|
||||||
f" Steps : {workflow['3']['inputs'].get('steps', '?')}",
|
f" Steps : {workflow[NODE_KSAMPLER]['inputs'].get('steps', '?')}",
|
||||||
f" CFG Scale : {workflow['3']['inputs'].get('cfg', '?')}",
|
f" CFG Scale : {workflow[NODE_KSAMPLER]['inputs'].get('cfg', '?')}",
|
||||||
f" Denoise : {workflow['3']['inputs'].get('denoise', '1.0')}",
|
f" Denoise : {workflow[NODE_KSAMPLER]['inputs'].get('denoise', '1.0')}",
|
||||||
]
|
]
|
||||||
|
|
||||||
# Add LoRA details
|
# Add LoRA details
|
||||||
@@ -98,8 +119,8 @@ def _log_workflow_prompts(label, workflow):
|
|||||||
lines.extend([
|
lines.extend([
|
||||||
"",
|
"",
|
||||||
" PROMPTS:",
|
" PROMPTS:",
|
||||||
f" [+] Positive : {workflow['6']['inputs'].get('text', '')}",
|
f" [+] Positive : {workflow[NODE_POSITIVE]['inputs'].get('text', '')}",
|
||||||
f" [-] Negative : {workflow['7']['inputs'].get('text', '')}",
|
f" [-] Negative : {workflow[NODE_NEGATIVE]['inputs'].get('text', '')}",
|
||||||
])
|
])
|
||||||
|
|
||||||
if face_text:
|
if face_text:
|
||||||
@@ -128,17 +149,17 @@ def _apply_checkpoint_settings(workflow, ckpt_data):
|
|||||||
vae = ckpt_data.get('vae', 'integrated')
|
vae = ckpt_data.get('vae', 'integrated')
|
||||||
|
|
||||||
# KSampler (node 3)
|
# KSampler (node 3)
|
||||||
if steps and '3' in workflow:
|
if steps and NODE_KSAMPLER in workflow:
|
||||||
workflow['3']['inputs']['steps'] = int(steps)
|
workflow[NODE_KSAMPLER]['inputs']['steps'] = int(steps)
|
||||||
if cfg and '3' in workflow:
|
if cfg and NODE_KSAMPLER in workflow:
|
||||||
workflow['3']['inputs']['cfg'] = float(cfg)
|
workflow[NODE_KSAMPLER]['inputs']['cfg'] = float(cfg)
|
||||||
if sampler_name and '3' in workflow:
|
if sampler_name and NODE_KSAMPLER in workflow:
|
||||||
workflow['3']['inputs']['sampler_name'] = sampler_name
|
workflow[NODE_KSAMPLER]['inputs']['sampler_name'] = sampler_name
|
||||||
if scheduler and '3' in workflow:
|
if scheduler and NODE_KSAMPLER in workflow:
|
||||||
workflow['3']['inputs']['scheduler'] = scheduler
|
workflow[NODE_KSAMPLER]['inputs']['scheduler'] = scheduler
|
||||||
|
|
||||||
# Face/hand detailers (nodes 11, 13) + multi-char SEGS detailers
|
# Face/hand detailers (nodes 11, 13) + multi-char SEGS detailers
|
||||||
for node_id in ['11', '13'] + _SEGS_DETAILER_NODES:
|
for node_id in [NODE_FACE_DETAILER, NODE_HAND_DETAILER] + _SEGS_DETAILER_NODES:
|
||||||
if node_id in workflow:
|
if node_id in workflow:
|
||||||
if steps:
|
if steps:
|
||||||
workflow[node_id]['inputs']['steps'] = int(steps)
|
workflow[node_id]['inputs']['steps'] = int(steps)
|
||||||
@@ -151,25 +172,25 @@ def _apply_checkpoint_settings(workflow, ckpt_data):
 
     # Prepend base_positive to all positive prompt nodes
     if base_positive:
-        for node_id in ['6', '14', '15'] + _SEGS_PROMPT_NODES:
+        for node_id in [NODE_POSITIVE, NODE_FACE_PROMPT, NODE_HAND_PROMPT] + _SEGS_PROMPT_NODES:
             if node_id in workflow:
                 workflow[node_id]['inputs']['text'] = f"{base_positive}, {workflow[node_id]['inputs']['text']}"
 
     # Append base_negative to negative prompt (shared by main + detailers via node 7)
-    if base_negative and '7' in workflow:
+    if base_negative and NODE_NEGATIVE in workflow:
-        workflow['7']['inputs']['text'] = f"{workflow['7']['inputs']['text']}, {base_negative}"
+        workflow[NODE_NEGATIVE]['inputs']['text'] = f"{workflow[NODE_NEGATIVE]['inputs']['text']}, {base_negative}"
 
     # VAE: if not integrated, inject a VAELoader node and rewire
     if vae and vae != 'integrated':
-        workflow['21'] = {
+        workflow[NODE_VAE_LOADER] = {
             'inputs': {'vae_name': vae},
             'class_type': 'VAELoader'
         }
-        if '8' in workflow:
+        if NODE_VAE_DECODE in workflow:
-            workflow['8']['inputs']['vae'] = ['21', 0]
+            workflow[NODE_VAE_DECODE]['inputs']['vae'] = [NODE_VAE_LOADER, 0]
-        for node_id in ['11', '13'] + _SEGS_DETAILER_NODES:
+        for node_id in [NODE_FACE_DETAILER, NODE_HAND_DETAILER] + _SEGS_DETAILER_NODES:
             if node_id in workflow:
-                workflow[node_id]['inputs']['vae'] = ['21', 0]
+                workflow[node_id]['inputs']['vae'] = [NODE_VAE_LOADER, 0]
 
     return workflow
 
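The VAE override in the hunk above can be exercised standalone. This is a sketch under the commit's node-ID conventions (21 = VAELoader, 8 = VAEDecode, 11/13 = detailers), not the module's actual code:

```python
NODE_VAE_LOADER, NODE_VAE_DECODE = '21', '8'

def inject_external_vae(workflow, vae_name, consumer_nodes=('8', '11', '13')):
    """Add a VAELoader node and repoint every VAE consumer to read from it."""
    workflow[NODE_VAE_LOADER] = {
        'inputs': {'vae_name': vae_name},
        'class_type': 'VAELoader',
    }
    for node_id in consumer_nodes:
        # Only rewire nodes that actually exist in this workflow variant
        if node_id in workflow:
            workflow[node_id]['inputs']['vae'] = [NODE_VAE_LOADER, 0]
    return workflow

# Minimal workflow fragment: decode + one detailer, both on the integrated VAE
wf = {
    '8': {'inputs': {'vae': ['4', 2]}, 'class_type': 'VAEDecode'},
    '13': {'inputs': {'vae': ['4', 2]}, 'class_type': 'DetailerForEach'},
}
wf = inject_external_vae(wf, 'sdxl_vae.safetensors')
```

After injection, every consumer reads `[NODE_VAE_LOADER, 0]` instead of the checkpoint's slot-2 output, which is exactly the rewiring the diff performs.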
@@ -190,7 +211,7 @@ def _get_default_checkpoint():
     try:
         with open('comfy_workflow.json', 'r') as f:
             workflow = json.load(f)
-        ckpt_path = workflow.get('4', {}).get('inputs', {}).get('ckpt_name')
+        ckpt_path = workflow.get(NODE_CHECKPOINT, {}).get('inputs', {}).get('ckpt_name')
         logger.debug("Loaded default checkpoint from workflow file: %s", ckpt_path)
     except Exception:
         pass
@@ -231,11 +252,11 @@ def _inject_multi_char_detailers(workflow, prompts, model_source, clip_source):
 
     Image flow: VAEDecode(8) → PersonA(46) → PersonB(47) → FaceA(53) → FaceB(54) → Hand(13)
     """
-    vae_source = ["4", 2]
+    vae_source = [NODE_CHECKPOINT, 2]
 
     # Remove old single face detailer and its prompt — we replace them
-    workflow.pop('11', None)
+    workflow.pop(NODE_FACE_DETAILER, None)
-    workflow.pop('14', None)
+    workflow.pop(NODE_FACE_PROMPT, None)
 
     # --- Person detection ---
     workflow['40'] = {
@@ -246,7 +267,7 @@ def _inject_multi_char_detailers(workflow, prompts, model_source, clip_source):
     workflow['41'] = {
         'inputs': {
             'bbox_detector': ['40', 0],
-            'image': ['8', 0],
+            'image': [NODE_VAE_DECODE, 0],
             'threshold': 0.5,
             'dilation': 10,
             'crop_factor': 3.0,
@@ -313,13 +334,13 @@ def _inject_multi_char_detailers(workflow, prompts, model_source, clip_source):
     workflow['46'] = {
         'inputs': {
             **_person_base,
-            'image': ['8', 0],
+            'image': [NODE_VAE_DECODE, 0],
             'segs': ['42', 0],
             'model': model_source,
             'clip': clip_source,
             'vae': vae_source,
             'positive': ['44', 0],
-            'negative': ['7', 0],
+            'negative': [NODE_NEGATIVE, 0],
         },
         'class_type': 'DetailerForEach'
     }
@@ -333,7 +354,7 @@ def _inject_multi_char_detailers(workflow, prompts, model_source, clip_source):
             'clip': clip_source,
             'vae': vae_source,
             'positive': ['45', 0],
-            'negative': ['7', 0],
+            'negative': [NODE_NEGATIVE, 0],
         },
         'class_type': 'DetailerForEach'
     }
@@ -413,7 +434,7 @@ def _inject_multi_char_detailers(workflow, prompts, model_source, clip_source):
             'clip': clip_source,
             'vae': vae_source,
             'positive': ['51', 0],
-            'negative': ['7', 0],
+            'negative': [NODE_NEGATIVE, 0],
         },
         'class_type': 'DetailerForEach'
     }
@@ -427,29 +448,29 @@ def _inject_multi_char_detailers(workflow, prompts, model_source, clip_source):
             'clip': clip_source,
             'vae': vae_source,
             'positive': ['52', 0],
-            'negative': ['7', 0],
+            'negative': [NODE_NEGATIVE, 0],
         },
         'class_type': 'DetailerForEach'
     }
 
     # Rewire hand detailer: image input from last face detailer instead of old node 11
-    if '13' in workflow:
+    if NODE_HAND_DETAILER in workflow:
-        workflow['13']['inputs']['image'] = ['54', 0]
+        workflow[NODE_HAND_DETAILER]['inputs']['image'] = ['54', 0]
 
     logger.debug("Injected multi-char SEGS detailers (nodes 40-54)")
 
 
 def _prepare_workflow(workflow, character, prompts, checkpoint=None, custom_negative=None, outfit=None, action=None, style=None, detailer=None, scene=None, width=None, height=None, checkpoint_data=None, look=None, fixed_seed=None, character_b=None):
     # 1. Update prompts using replacement to preserve embeddings
-    workflow["6"]["inputs"]["text"] = workflow["6"]["inputs"]["text"].replace("{{POSITIVE_PROMPT}}", prompts["main"])
+    workflow[NODE_POSITIVE]["inputs"]["text"] = workflow[NODE_POSITIVE]["inputs"]["text"].replace("{{POSITIVE_PROMPT}}", prompts["main"])
 
     if custom_negative:
-        workflow["7"]["inputs"]["text"] = f"{custom_negative}, {workflow['7']['inputs']['text']}"
+        workflow[NODE_NEGATIVE]["inputs"]["text"] = f"{custom_negative}, {workflow[NODE_NEGATIVE]['inputs']['text']}"
 
-    if "14" in workflow:
+    if NODE_FACE_PROMPT in workflow:
-        workflow["14"]["inputs"]["text"] = workflow["14"]["inputs"]["text"].replace("{{FACE_PROMPT}}", prompts["face"])
+        workflow[NODE_FACE_PROMPT]["inputs"]["text"] = workflow[NODE_FACE_PROMPT]["inputs"]["text"].replace("{{FACE_PROMPT}}", prompts["face"])
-    if "15" in workflow:
+    if NODE_HAND_PROMPT in workflow:
-        workflow["15"]["inputs"]["text"] = workflow["15"]["inputs"]["text"].replace("{{HAND_PROMPT}}", prompts["hand"])
+        workflow[NODE_HAND_PROMPT]["inputs"]["text"] = workflow[NODE_HAND_PROMPT]["inputs"]["text"].replace("{{HAND_PROMPT}}", prompts["hand"])
 
     # 2. Update Checkpoint - always set one, fall back to default if not provided
     if not checkpoint:
@@ -458,20 +479,20 @@ def _prepare_workflow(workflow, character, prompts, checkpoint=None, custom_nega
         if not checkpoint_data:
             checkpoint_data = default_ckpt_data
     if checkpoint:
-        workflow["4"]["inputs"]["ckpt_name"] = checkpoint
+        workflow[NODE_CHECKPOINT]["inputs"]["ckpt_name"] = checkpoint
     else:
         raise ValueError("No checkpoint specified and no default checkpoint configured")
 
     # 3. Handle LoRAs - Node 16 for character, Node 17 for outfit, Node 18 for action, Node 19 for style/detailer
     # Start with direct checkpoint connections
-    model_source = ["4", 0]
+    model_source = [NODE_CHECKPOINT, 0]
-    clip_source = ["4", 1]
+    clip_source = [NODE_CHECKPOINT, 1]
 
     # Look negative prompt (applied before character LoRA)
     if look:
         look_negative = look.data.get('negative', '')
         if look_negative:
-            workflow["7"]["inputs"]["text"] = f"{look_negative}, {workflow['7']['inputs']['text']}"
+            workflow[NODE_NEGATIVE]["inputs"]["text"] = f"{look_negative}, {workflow[NODE_NEGATIVE]['inputs']['text']}"
 
     # Character LoRA (Node 16) — look LoRA overrides character LoRA when present
     if look:
@@ -480,47 +501,47 @@ def _prepare_workflow(workflow, character, prompts, checkpoint=None, custom_nega
     char_lora_data = character.data.get('lora', {}) if character else {}
     char_lora_name = char_lora_data.get('lora_name')
 
-    if char_lora_name and "16" in workflow:
+    if char_lora_name and NODE_LORA_CHAR in workflow:
         _w16 = _resolve_lora_weight(char_lora_data)
-        workflow["16"]["inputs"]["lora_name"] = char_lora_name
+        workflow[NODE_LORA_CHAR]["inputs"]["lora_name"] = char_lora_name
-        workflow["16"]["inputs"]["strength_model"] = _w16
+        workflow[NODE_LORA_CHAR]["inputs"]["strength_model"] = _w16
-        workflow["16"]["inputs"]["strength_clip"] = _w16
+        workflow[NODE_LORA_CHAR]["inputs"]["strength_clip"] = _w16
-        workflow["16"]["inputs"]["model"] = ["4", 0] # From checkpoint
+        workflow[NODE_LORA_CHAR]["inputs"]["model"] = [NODE_CHECKPOINT, 0] # From checkpoint
-        workflow["16"]["inputs"]["clip"] = ["4", 1] # From checkpoint
+        workflow[NODE_LORA_CHAR]["inputs"]["clip"] = [NODE_CHECKPOINT, 1] # From checkpoint
-        model_source = ["16", 0]
+        model_source = [NODE_LORA_CHAR, 0]
-        clip_source = ["16", 1]
+        clip_source = [NODE_LORA_CHAR, 1]
         logger.debug("Character LoRA: %s @ %s", char_lora_name, _w16)
 
     # Outfit LoRA (Node 17) - chains from character LoRA or checkpoint
     outfit_lora_data = outfit.data.get('lora', {}) if outfit else {}
     outfit_lora_name = outfit_lora_data.get('lora_name')
 
-    if outfit_lora_name and "17" in workflow:
+    if outfit_lora_name and NODE_LORA_OUTFIT in workflow:
         _w17 = _resolve_lora_weight({**{'lora_weight': 0.8}, **outfit_lora_data})
-        workflow["17"]["inputs"]["lora_name"] = outfit_lora_name
+        workflow[NODE_LORA_OUTFIT]["inputs"]["lora_name"] = outfit_lora_name
-        workflow["17"]["inputs"]["strength_model"] = _w17
+        workflow[NODE_LORA_OUTFIT]["inputs"]["strength_model"] = _w17
-        workflow["17"]["inputs"]["strength_clip"] = _w17
+        workflow[NODE_LORA_OUTFIT]["inputs"]["strength_clip"] = _w17
         # Chain from character LoRA (node 16) or checkpoint (node 4)
-        workflow["17"]["inputs"]["model"] = model_source
+        workflow[NODE_LORA_OUTFIT]["inputs"]["model"] = model_source
-        workflow["17"]["inputs"]["clip"] = clip_source
+        workflow[NODE_LORA_OUTFIT]["inputs"]["clip"] = clip_source
-        model_source = ["17", 0]
+        model_source = [NODE_LORA_OUTFIT, 0]
-        clip_source = ["17", 1]
+        clip_source = [NODE_LORA_OUTFIT, 1]
         logger.debug("Outfit LoRA: %s @ %s", outfit_lora_name, _w17)
 
     # Action LoRA (Node 18) - chains from previous LoRA or checkpoint
     action_lora_data = action.data.get('lora', {}) if action else {}
     action_lora_name = action_lora_data.get('lora_name')
 
-    if action_lora_name and "18" in workflow:
+    if action_lora_name and NODE_LORA_ACTION in workflow:
         _w18 = _resolve_lora_weight(action_lora_data)
-        workflow["18"]["inputs"]["lora_name"] = action_lora_name
+        workflow[NODE_LORA_ACTION]["inputs"]["lora_name"] = action_lora_name
-        workflow["18"]["inputs"]["strength_model"] = _w18
+        workflow[NODE_LORA_ACTION]["inputs"]["strength_model"] = _w18
-        workflow["18"]["inputs"]["strength_clip"] = _w18
+        workflow[NODE_LORA_ACTION]["inputs"]["strength_clip"] = _w18
         # Chain from previous source
-        workflow["18"]["inputs"]["model"] = model_source
+        workflow[NODE_LORA_ACTION]["inputs"]["model"] = model_source
-        workflow["18"]["inputs"]["clip"] = clip_source
+        workflow[NODE_LORA_ACTION]["inputs"]["clip"] = clip_source
-        model_source = ["18", 0]
+        model_source = [NODE_LORA_ACTION, 0]
-        clip_source = ["18", 1]
+        clip_source = [NODE_LORA_ACTION, 1]
         logger.debug("Action LoRA: %s @ %s", action_lora_name, _w18)
 
     # Style/Detailer/Scene LoRA (Node 19) - chains from previous LoRA or checkpoint
@@ -529,31 +550,31 @@ def _prepare_workflow(workflow, character, prompts, checkpoint=None, custom_nega
     style_lora_data = target_obj.data.get('lora', {}) if target_obj else {}
     style_lora_name = style_lora_data.get('lora_name')
 
-    if style_lora_name and "19" in workflow:
+    if style_lora_name and NODE_LORA_STYLE in workflow:
         _w19 = _resolve_lora_weight(style_lora_data)
-        workflow["19"]["inputs"]["lora_name"] = style_lora_name
+        workflow[NODE_LORA_STYLE]["inputs"]["lora_name"] = style_lora_name
-        workflow["19"]["inputs"]["strength_model"] = _w19
+        workflow[NODE_LORA_STYLE]["inputs"]["strength_model"] = _w19
-        workflow["19"]["inputs"]["strength_clip"] = _w19
+        workflow[NODE_LORA_STYLE]["inputs"]["strength_clip"] = _w19
         # Chain from previous source
-        workflow["19"]["inputs"]["model"] = model_source
+        workflow[NODE_LORA_STYLE]["inputs"]["model"] = model_source
-        workflow["19"]["inputs"]["clip"] = clip_source
+        workflow[NODE_LORA_STYLE]["inputs"]["clip"] = clip_source
-        model_source = ["19", 0]
+        model_source = [NODE_LORA_STYLE, 0]
-        clip_source = ["19", 1]
+        clip_source = [NODE_LORA_STYLE, 1]
         logger.debug("Style/Detailer LoRA: %s @ %s", style_lora_name, _w19)
 
     # Second character LoRA (Node 20) - for multi-character generation
     if character_b:
         char_b_lora_data = character_b.data.get('lora', {})
         char_b_lora_name = char_b_lora_data.get('lora_name')
-        if char_b_lora_name and "20" in workflow:
+        if char_b_lora_name and NODE_LORA_CHAR_B in workflow:
            _w20 = _resolve_lora_weight(char_b_lora_data)
-            workflow["20"]["inputs"]["lora_name"] = char_b_lora_name
+            workflow[NODE_LORA_CHAR_B]["inputs"]["lora_name"] = char_b_lora_name
-            workflow["20"]["inputs"]["strength_model"] = _w20
+            workflow[NODE_LORA_CHAR_B]["inputs"]["strength_model"] = _w20
-            workflow["20"]["inputs"]["strength_clip"] = _w20
+            workflow[NODE_LORA_CHAR_B]["inputs"]["strength_clip"] = _w20
-            workflow["20"]["inputs"]["model"] = model_source
+            workflow[NODE_LORA_CHAR_B]["inputs"]["model"] = model_source
-            workflow["20"]["inputs"]["clip"] = clip_source
+            workflow[NODE_LORA_CHAR_B]["inputs"]["clip"] = clip_source
-            model_source = ["20", 0]
+            model_source = [NODE_LORA_CHAR_B, 0]
-            clip_source = ["20", 1]
+            clip_source = [NODE_LORA_CHAR_B, 1]
            logger.debug("Character B LoRA: %s @ %s", char_b_lora_name, _w20)
 
     # 3b. Multi-char: inject per-character SEGS detailers (replaces node 11/14)
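All five LoRA nodes above repeat the same wire-then-advance pattern. A generic helper (hypothetical `_wire_lora`, not part of this commit) shows the chaining contract:

```python
def _wire_lora(workflow, node_id, lora_name, weight, model_source, clip_source):
    """Point a LoraLoader node at the current model/clip chain, then advance the chain.

    Returns the (model_source, clip_source) pair downstream nodes should consume.
    """
    if not lora_name or node_id not in workflow:
        # Nothing to wire: the chain passes through unchanged
        return model_source, clip_source
    inputs = workflow[node_id]['inputs']
    inputs['lora_name'] = lora_name
    inputs['strength_model'] = weight
    inputs['strength_clip'] = weight
    inputs['model'] = model_source
    inputs['clip'] = clip_source
    # Downstream consumers now read this LoRA node's outputs (slots 0 and 1)
    return [node_id, 0], [node_id, 1]

# Example: chain character (16) then outfit (17) LoRAs off the checkpoint (4)
workflow = {
    '16': {'inputs': {}, 'class_type': 'LoraLoader'},
    '17': {'inputs': {}, 'class_type': 'LoraLoader'},
}
model_source, clip_source = ['4', 0], ['4', 1]
model_source, clip_source = _wire_lora(workflow, '16', 'char.safetensors', 1.0, model_source, clip_source)
model_source, clip_source = _wire_lora(workflow, '17', 'outfit.safetensors', 0.8, model_source, clip_source)
```

Each call leaves the previous node as the new node's input, so absent LoRAs simply drop out of the chain rather than breaking it.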
@@ -561,35 +582,35 @@ def _prepare_workflow(workflow, character, prompts, checkpoint=None, custom_nega
         _inject_multi_char_detailers(workflow, prompts, model_source, clip_source)
 
     # Apply connections to all model/clip consumers (conditional on node existence)
-    for nid in ["3", "11", "13"] + _SEGS_DETAILER_NODES:
+    for nid in [NODE_KSAMPLER, NODE_FACE_DETAILER, NODE_HAND_DETAILER] + _SEGS_DETAILER_NODES:
         if nid in workflow:
             workflow[nid]["inputs"]["model"] = model_source
 
-    for nid in ["6", "7", "11", "13", "14", "15"] + _SEGS_PROMPT_NODES:
+    for nid in [NODE_POSITIVE, NODE_NEGATIVE, NODE_FACE_DETAILER, NODE_HAND_DETAILER, NODE_FACE_PROMPT, NODE_HAND_PROMPT] + _SEGS_PROMPT_NODES:
         if nid in workflow:
             workflow[nid]["inputs"]["clip"] = clip_source
 
     # 4. Randomize seeds (or use a fixed seed for reproducible batches like Strengths Gallery)
     gen_seed = fixed_seed if fixed_seed is not None else random.randint(1, 10**15)
-    for nid in ["3", "11", "13"] + _SEGS_DETAILER_NODES:
+    for nid in [NODE_KSAMPLER, NODE_FACE_DETAILER, NODE_HAND_DETAILER] + _SEGS_DETAILER_NODES:
         if nid in workflow:
             workflow[nid]["inputs"]["seed"] = gen_seed
 
     # 5. Set image dimensions
-    if "5" in workflow:
+    if NODE_LATENT in workflow:
         if width:
-            workflow["5"]["inputs"]["width"] = int(width)
+            workflow[NODE_LATENT]["inputs"]["width"] = int(width)
         if height:
-            workflow["5"]["inputs"]["height"] = int(height)
+            workflow[NODE_LATENT]["inputs"]["height"] = int(height)
 
     # 6. Apply checkpoint-specific settings (steps, cfg, sampler, base prompts, VAE)
     if checkpoint_data:
         workflow = _apply_checkpoint_settings(workflow, checkpoint_data)
 
     # 7. Sync sampler/scheduler from main KSampler to adetailer nodes
-    sampler_name = workflow["3"]["inputs"].get("sampler_name")
+    sampler_name = workflow[NODE_KSAMPLER]["inputs"].get("sampler_name")
-    scheduler = workflow["3"]["inputs"].get("scheduler")
+    scheduler = workflow[NODE_KSAMPLER]["inputs"].get("scheduler")
-    for node_id in ["11", "13"] + _SEGS_DETAILER_NODES:
+    for node_id in [NODE_FACE_DETAILER, NODE_HAND_DETAILER] + _SEGS_DETAILER_NODES:
         if node_id in workflow:
             if sampler_name:
                 workflow[node_id]["inputs"]["sampler_name"] = sampler_name
@@ -598,11 +619,11 @@ def _prepare_workflow(workflow, character, prompts, checkpoint=None, custom_nega
 
     # 8. Cross-deduplicate: remove tags shared between positive and negative
     pos_text, neg_text = _cross_dedup_prompts(
-        workflow["6"]["inputs"]["text"],
+        workflow[NODE_POSITIVE]["inputs"]["text"],
-        workflow["7"]["inputs"]["text"]
+        workflow[NODE_NEGATIVE]["inputs"]["text"]
     )
-    workflow["6"]["inputs"]["text"] = pos_text
+    workflow[NODE_POSITIVE]["inputs"]["text"] = pos_text
-    workflow["7"]["inputs"]["text"] = neg_text
+    workflow[NODE_NEGATIVE]["inputs"]["text"] = neg_text
 
     # 9. Final prompt debug — logged after all transformations are complete
     _log_workflow_prompts("_prepare_workflow", workflow)
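Step 8's cross-deduplication can be sketched as follows. The tie-break direction (positive wins, so shared tags are dropped from the negative) is an assumption of this sketch; the real `_cross_dedup_prompts` may resolve conflicts differently:

```python
def cross_dedup_prompts(pos_text, neg_text):
    """Drop tags that appear in both prompts from the negative prompt.

    Sketch assumption: the positive prompt is authoritative, so a tag present
    on both sides survives only in the positive.
    """
    pos_tags = [t.strip() for t in pos_text.split(',') if t.strip()]
    pos_set = {t.lower() for t in pos_tags}  # case-insensitive comparison
    neg_tags = [t.strip() for t in neg_text.split(',') if t.strip()]
    neg_kept = [t for t in neg_tags if t.lower() not in pos_set]
    return ', '.join(pos_tags), ', '.join(neg_kept)

pos, neg = cross_dedup_prompts(
    '1girl, smile, Detailed face',
    'blurry, smile, detailed face',
)
```

Without this step, a tag injected by both a base_positive and a base_negative (common with checkpoint presets) would fight itself at sampling time.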
@@ -67,9 +67,15 @@ function initDetailPage(options = {}) {
     // -----------------------------------------------------------------------
     // Job polling
     // -----------------------------------------------------------------------
-    async function waitForJob(jobId) {
+    async function waitForJob(jobId, maxPollMs = 300000) {
         return new Promise((resolve, reject) => {
+            const start = Date.now();
             const poll = setInterval(async () => {
+                if (Date.now() - start > maxPollMs) {
+                    clearInterval(poll);
+                    reject(new Error('Job timed out after ' + Math.round(maxPollMs / 1000) + 's'));
+                    return;
+                }
                 try {
                     const resp = await fetch(`/api/queue/${jobId}/status`);
                     const data = await resp.json();
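The same bounded-polling pattern both `waitForJob` variants now use, sketched in Python; the callback and status names here are illustrative, not this codebase's API:

```python
import time

def wait_for_job(poll_status, job_id, max_poll_s=300, interval_s=0.0):
    """Poll poll_status(job_id) until it reports 'done'/'failed', or raise after max_poll_s."""
    start = time.monotonic()
    while True:
        # Deadline check first, mirroring the JS guard at the top of each tick
        if time.monotonic() - start > max_poll_s:
            raise TimeoutError(f"Job {job_id} timed out after {max_poll_s}s")
        status = poll_status(job_id)
        if status in ('done', 'failed'):
            return status
        time.sleep(interval_s)

# Simulated status endpoint that reports 'done' on the third poll
calls = {'n': 0}
def fake_status(job_id):
    calls['n'] += 1
    return 'done' if calls['n'] >= 3 else 'running'

result = wait_for_job(fake_status, 'job-1')
```

Checking the deadline before each poll (rather than after) guarantees the loop cannot fire one extra request past the timeout, which is the same ordering the JS diff uses.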
@@ -28,9 +28,15 @@ document.addEventListener('DOMContentLoaded', () => {
     const bulkOverwriteBtn = document.getElementById('bulk-overwrite-btn');
 
     // --- Utility: poll a job until done ---
-    function waitForJob(jobId) {
+    function waitForJob(jobId, maxPollMs = 300000) {
         return new Promise((resolve, reject) => {
+            const start = Date.now();
             const poll = setInterval(async () => {
+                if (Date.now() - start > maxPollMs) {
+                    clearInterval(poll);
+                    reject(new Error('Job timed out after ' + Math.round(maxPollMs / 1000) + 's'));
+                    return;
+                }
                 try {
                     const resp = await fetch(`/api/queue/${jobId}/status`);
                     const data = await resp.json();
@@ -19,6 +19,19 @@
     {% endif %}
 {% endwith %}
 
+<!-- Filters -->
+<form method="get" class="mb-3 d-flex gap-3 align-items-center">
+    <div class="form-check">
+        <input class="form-check-input" type="checkbox" name="favourite" value="on" id="favFilter" {% if favourite_filter == 'on' %}checked{% endif %} onchange="this.form.submit()">
+        <label class="form-check-label small" for="favFilter">★ Favourites</label>
+    </div>
+    <select name="nsfw" class="form-select form-select-sm" style="width:auto;" onchange="this.form.submit()">
+        <option value="all" {% if nsfw_filter == 'all' %}selected{% endif %}>All ratings</option>
+        <option value="sfw" {% if nsfw_filter == 'sfw' %}selected{% endif %}>SFW only</option>
+        <option value="nsfw" {% if nsfw_filter == 'nsfw' %}selected{% endif %}>NSFW only</option>
+    </select>
+</form>
 
 <div class="row row-cols-2 row-cols-sm-3 row-cols-md-4 row-cols-lg-5 row-cols-xl-6 g-3">
 {% for preset in presets %}
 <div class="col">
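The filters form above feeds the extracted `apply_library_filters()` helper. A sketch of the expected contract on plain dicts; the real helper presumably filters a SQLAlchemy query, and this stand-in only demonstrates the filter semantics:

```python
def apply_library_filters(items, favourite_filter, nsfw_filter):
    """Filter library items by the 'favourite' checkbox and 'nsfw' select values."""
    if favourite_filter == 'on':
        items = [i for i in items if i.get('is_favourite')]
    if nsfw_filter == 'sfw':
        items = [i for i in items if not i.get('is_nsfw')]
    elif nsfw_filter == 'nsfw':
        items = [i for i in items if i.get('is_nsfw')]
    # 'all' (or any other value) leaves the rating filter untouched
    return items

presets = [
    {'name': 'a', 'is_favourite': True,  'is_nsfw': False},
    {'name': 'b', 'is_favourite': False, 'is_nsfw': True},
    {'name': 'c', 'is_favourite': True,  'is_nsfw': True},
]
filtered = apply_library_filters(presets, 'on', 'sfw')
```

Matching the template's exact form values (`'on'`, `'all'`/`'sfw'`/`'nsfw'`) keeps the helper reusable across every library view that renders the same filter bar.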
utils.py (12 changed lines)
@@ -12,8 +12,16 @@ _LORA_DEFAULTS = {
 }
 
 _BODY_GROUP_KEYS = ['base', 'head', 'upper_body', 'lower_body', 'hands', 'feet', 'additional']
-_IDENTITY_KEYS = _BODY_GROUP_KEYS
-_WARDROBE_KEYS = _BODY_GROUP_KEYS
+def clean_html_text(html_raw):
+    """Strip HTML tags, scripts, styles, and images from raw HTML, returning plain text."""
+    import re
+    text = re.sub(r'<script[^>]*>.*?</script>', '', html_raw, flags=re.DOTALL)
+    text = re.sub(r'<style[^>]*>.*?</style>', '', text, flags=re.DOTALL)
+    text = re.sub(r'<img[^>]*>', '', text)
+    text = re.sub(r'<[^>]+>', ' ', text)
+    return ' '.join(text.split())
 
 
 def allowed_file(filename):
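The new `clean_html_text()` helper from the hunk above, reproduced self-contained with a usage example:

```python
import re

def clean_html_text(html_raw):
    """Strip HTML tags, scripts, styles, and images from raw HTML, returning plain text."""
    # Drop script/style bodies entirely (DOTALL so multi-line blocks match)
    text = re.sub(r'<script[^>]*>.*?</script>', '', html_raw, flags=re.DOTALL)
    text = re.sub(r'<style[^>]*>.*?</style>', '', text, flags=re.DOTALL)
    # Remove images, then replace any remaining tag with a space
    text = re.sub(r'<img[^>]*>', '', text)
    text = re.sub(r'<[^>]+>', ' ', text)
    # Collapse runs of whitespace left behind by the removed markup
    return ' '.join(text.split())

sample = '<div><script>alert(1)</script><p>Hello <b>world</b></p><img src="x.png"></div>'
cleaned = clean_html_text(sample)
```

Regex-based stripping is best-effort: it handles the well-formed HTML this app scrapes, but unbalanced `>` characters or exotic markup would need a real parser.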