{% extends "layout.html" %} {% block content %}
Application Settings
LLM Configuration
Choose where your AI text generation requests are processed.

OpenRouter Configuration
Get your API key at openrouter.ai
Local LLM Configuration
Ollama default: http://localhost:11434/v1
LMStudio default: http://localhost:1234/v1
Ensure your local LLM server is running and its API is enabled.
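One quick way to confirm the server is reachable before saving these settings is to query its OpenAI-compatible model list. This sketch assumes the Ollama default URL shown above; adjust the base URL for LM Studio or another server.

```shell
# Hypothetical reachability check against the default Ollama endpoint.
# Prints the model list on success, or a fallback message if unreachable.
curl -sf http://localhost:11434/v1/models || echo "local LLM server not reachable"
```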

LoRA Directories

Absolute paths to the directories scanned for LoRA files in each category.


Checkpoint Directories
Comma-separated list of directories to scan for checkpoint files.
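A hypothetical example of the comma-separated format (the paths are illustrative, not defaults):

```
/models/checkpoints,/mnt/storage/sd-models
```

Whitespace around commas is best avoided, since each segment is treated as a literal path.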

Default Checkpoint
Saved
Sets the checkpoint used for all generation requests. Changes are saved immediately.
{% endblock %} {% block scripts %} {% endblock %}