{% extends "layout.html" %} {% block content %}
Application Settings
LLM Configuration
Choose where your AI text generation requests are processed.

OpenRouter Configuration
Get your API key at openrouter.ai
Local LLM Configuration
Ollama default: http://localhost:11434/v1
LM Studio default: http://localhost:1234/v1
Ensure your local LLM server is running and its OpenAI-compatible API is enabled.
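For context, a backend consuming these settings might resolve the endpoint and auth headers as sketched below. This is a minimal, hypothetical helper (the function and dictionary names are not part of this app); the OpenRouter URL is its public OpenAI-compatible base, and the local defaults match the hints above.

```python
# Default OpenAI-compatible base URLs for the provider options on this page.
# Hypothetical sketch: names and structure are illustrative, not app code.
DEFAULT_BASE_URLS = {
    "openrouter": "https://openrouter.ai/api/v1",  # hosted; needs an API key
    "ollama": "http://localhost:11434/v1",         # local Ollama default
    "lmstudio": "http://localhost:1234/v1",        # local LM Studio default
}

def resolve_llm_config(provider, api_key=None):
    """Return the base URL and request headers for the chosen provider.

    Local providers (Ollama, LM Studio) need no API key; OpenRouter does.
    """
    base_url = DEFAULT_BASE_URLS[provider]
    headers = {"Content-Type": "application/json"}
    if provider == "openrouter":
        if not api_key:
            raise ValueError("OpenRouter requires an API key")
        headers["Authorization"] = f"Bearer {api_key}"
    return {"base_url": base_url, "headers": headers}
```

A request to `{base_url}/chat/completions` with these headers would then work against any of the three backends, since all expose the same OpenAI-style API surface.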
{% endblock %} {% block scripts %} {% endblock %}