{% extends "layout.html" %} {% block content %}
Application Settings
LLM Configuration
Choose where your AI text generation requests are processed.

OpenRouter Configuration
Get your API key at openrouter.ai.
Local LLM Configuration
Ollama default: http://localhost:11434/v1
LMStudio default: http://localhost:1234/v1
Ensure your local LLM server is running and its API is enabled.
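Both Ollama and LMStudio expose OpenAI-compatible endpoints, so a request has the same shape against either base URL above. A minimal sketch of building such a request with the standard library (the model name `llama3` is an assumption; substitute whatever model your server has loaded):

```python
import json
from urllib import request

def build_chat_request(base_url: str, model: str, prompt: str) -> request.Request:
    """Build an OpenAI-compatible chat completion request for a local server."""
    payload = {
        "model": model,  # must match a model loaded on the local server
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Example: target the Ollama default from the settings above.
req = build_chat_request("http://localhost:11434/v1", "llama3", "Hello")
```

The same call with `"http://localhost:1234/v1"` targets the LMStudio default.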

LoRA Directories

Absolute paths on disk that are scanned for LoRA files in each category.


Checkpoint Directories
Comma-separated list of directories to scan for checkpoint files.
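Since the value is a single comma-separated string, parsing it reduces to splitting on commas and discarding empty or whitespace-only entries; a sketch (the exact parsing the app performs may differ):

```python
def parse_dirs(value: str) -> list[str]:
    """Split a comma-separated directory list, trimming whitespace
    and dropping empty entries (e.g. from a trailing comma)."""
    return [part.strip() for part in value.split(",") if part.strip()]

parse_dirs("/models/checkpoints, /mnt/extra/ckpts,")
# → ["/models/checkpoints", "/mnt/extra/ckpts"]
```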

Default Checkpoint
Saved
Sets the checkpoint used for all generation requests. Saved immediately on change.

REST API Key

Generate an API key to use the /api/v1/ endpoints for programmatic image generation.
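A sketch of calling the API with a generated key. The host, route (`/api/v1/generate`), payload fields, and Bearer auth scheme here are all assumptions for illustration; check the API documentation for the actual routes and auth header this app expects:

```python
import json
from urllib import request

def build_generate_request(api_key: str, prompt: str) -> request.Request:
    """Build an authenticated request to a hypothetical /api/v1/ endpoint."""
    payload = {"prompt": prompt}  # assumed field name
    return request.Request(
        "http://localhost:5000/api/v1/generate",  # hypothetical host and route
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",  # assumed auth scheme
        },
        method="POST",
    )

req = build_generate_request("my-secret-key", "a watercolor landscape")
```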

Tag Management
Migrate Tags

Convert old list-format tags to the new structured dict format across all resources.
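The exact tag shapes are not shown on this page. Assuming the old format is a flat list of strings and the structured format groups tags under category keys (both assumptions; the real category names may differ), the per-resource conversion amounts to:

```python
def migrate_tags(tags):
    """Convert old list-format tags to a structured dict.

    Assumed shapes: old = ["tag1", "tag2"], new = {"general": [...]}.
    Resources already in dict form are passed through unchanged.
    """
    if isinstance(tags, dict):
        return tags  # already migrated; leave untouched
    return {"general": list(tags)}

migrate_tags(["portrait", "anime"])  # → {"general": ["portrait", "anime"]}
migrate_tags({"style": ["anime"]})   # → {"style": ["anime"]} (unchanged)
```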

Bulk Regenerate Tags

Use LLM to regenerate structured tags for all resources. This will overwrite existing tags.

{% endblock %} {% block scripts %} {% endblock %}