```mermaid
%% MODEL Mode Data Flow (single model)
%% Detailed flows: ./flows/server-flow.mmd, ./flows/models-flow.mmd, ./flows/chat-flow.mmd

sequenceDiagram
    participant User as 👤 User
    participant UI as 🧩 UI
    participant Stores as 🗄️ Stores
    participant DB as 💾 IndexedDB
    participant API as 🌐 llama-server

    Note over User,API: 🚀 Initialization (see: server-flow.mmd, models-flow.mmd)

    UI->>Stores: initialize()
    Stores->>DB: load conversations
    Stores->>API: GET /props
    API-->>Stores: server config + modalities
    Stores->>API: GET /v1/models
    API-->>Stores: single model (auto-selected)

    Note over User,API: 💬 Chat Flow (see: chat-flow.mmd)

    User->>UI: send message
    UI->>Stores: sendMessage()
    Stores->>DB: save user message
    Stores->>API: POST /v1/chat/completions (stream)
    loop streaming
        API-->>Stores: SSE chunks
        Stores-->>UI: reactive update
    end
    API-->>Stores: done + timings
    Stores->>DB: save assistant message
    Note over User,API: 🔄 Regenerate

    User->>UI: regenerate
    UI->>Stores: regenerate message
    Stores->>DB: create message branch
    Note right of Stores: same streaming flow

    Note over User,API: ⏹️ Stop

    User->>UI: stop
    UI->>Stores: stop generation
    Stores->>Stores: abort stream
    Stores->>DB: save partial response
```
|
|