Initial commit.
Basic docker deployment with Local LLM integration and simple game state.
architecture.md
# Text-Based LLM Interaction System Architecture

## Overview

This document outlines the architecture for a text-based system that allows users to interact with an LLM running on LM Studio.

## Components

### 1. User Interface Layer

- **Text Input Handler**: Captures user input from the terminal
- **Text Output Display**: Shows LLM responses to the user
- **Session Manager**: Manages conversation history

### 2. Communication Layer

- **LLM Client**: Handles HTTP communication with LM Studio
- **API Interface**: Formats requests/responses according to LM Studio's API

### 3. Core Logic Layer

- **Message Processor**: Processes user input and LLM responses
- **Conversation History**: Maintains context between messages
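
The Conversation History component could be sketched as a small class along these lines (the name `ConversationHistory` and the `max_messages` cap are illustrative assumptions, not part of this commit):

```python
class ConversationHistory:
    """Keeps the running list of chat messages sent to the LLM."""

    def __init__(self, system_prompt=None, max_messages=50):
        self.messages = []
        if system_prompt:
            self.messages.append({"role": "system", "content": system_prompt})
        self.max_messages = max_messages  # cap to keep requests within context limits

    def add(self, role, content):
        self.messages.append({"role": role, "content": content})
        # Trim the oldest non-system message once the cap is exceeded
        while len(self.messages) > self.max_messages:
            drop_index = 1 if self.messages[0]["role"] == "system" else 0
            self.messages.pop(drop_index)

    def as_payload(self):
        """Return the messages list in the shape the chat API expects."""
        return list(self.messages)
```

Keeping the optional system prompt pinned at index 0 while trimming from the front is one common way to preserve instructions as old turns fall out of context.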
## Data Flow

```mermaid
graph TD
    A[User] --> B[Text Input Handler]
    B --> C[Message Processor]
    C --> D[LLM Client]
    D --> E[LM Studio Server]
    E --> D
    D --> C
    C --> F[Text Output Display]
    F --> A
```

## Technical Details

### LM Studio API

- Endpoint: `http://10.0.0.200:1234/v1/chat/completions`
- Method: POST
- Content-Type: application/json

### Request Format

```json
{
  "model": "model_name",
  "messages": [
    {"role": "user", "content": "user message"}
  ],
  "temperature": 0.7,
  "max_tokens": -1
}
```
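
A helper that assembles this request body might look like the following; the function name `build_request` and its defaults are assumptions, while the field names and values come from the format above:

```python
def build_request(messages, model="model_name", temperature=0.7, max_tokens=-1):
    """Assemble the JSON body for LM Studio's /v1/chat/completions endpoint."""
    return {
        "model": model,
        "messages": messages,
        "temperature": temperature,
        "max_tokens": max_tokens,  # -1 leaves the output length up to the server
    }


body = build_request([{"role": "user", "content": "user message"}])
```

Serializing `body` with `json.dumps` and POSTing it with the `Content-Type: application/json` header yields the request described above.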

### Response Format

```json
{
  "id": "chatcmpl-123",
  "object": "chat.completion",
  "created": 1677652288,
  "model": "model_name",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "response message"
      },
      "finish_reason": "stop"
    }
  ]
}
```
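
Given the response shape above, extracting the assistant's text means walking `choices[0].message.content`. A defensive helper (name assumed) might be:

```python
def extract_reply(response: dict) -> str:
    """Pull the assistant's message text out of a chat.completion response."""
    choices = response.get("choices") or []
    if not choices:
        raise ValueError("LM Studio response contained no choices")
    return choices[0]["message"]["content"]
```

Raising on an empty `choices` list keeps a malformed or error response from silently producing an empty reply.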

## Project Structure

```
text_adventure/
├── main.py          # Entry point
├── config.py        # Configuration settings
├── llm_client.py    # LLM communication
├── interface.py     # Text input/output
└── conversation.py  # Conversation history management
```

## Implementation Plan

1. Create basic text input/output interface
2. Implement LLM client for LM Studio communication
3. Add conversation history management
4. Integrate components
5. Test functionality
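
Steps 1–4 above can be tied together in a loop of roughly this shape. The `send_to_llm` callable stands in for the real LLM client, and the reader/writer callables for the interface layer; all three names are illustrative, and injecting them lets the loop be exercised without a running LM Studio server:

```python
def run_session(read_input, write_output, send_to_llm):
    """One REPL-style session: read input, send it with history, display reply."""
    history = []  # conversation history: list of {"role", "content"} dicts
    while True:
        user_text = read_input()
        if user_text is None or user_text.strip().lower() == "quit":
            break
        history.append({"role": "user", "content": user_text})
        reply = send_to_llm(history)  # the real client makes the HTTP call here
        history.append({"role": "assistant", "content": reply})
        write_output(reply)
    return history
```

In the real `main.py`, `read_input`/`write_output` would wrap the terminal and `send_to_llm` would POST the history to the LM Studio endpoint.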