%% MODEL Mode Data Flow (single model)
%% Detailed flows: ./flows/server-flow.mmd, ./flows/models-flow.mmd, ./flows/chat-flow.mmd

sequenceDiagram
    participant User as 👤 User
    participant UI as 🧩 UI
    participant Stores as 🗄️ Stores
    participant DB as 💾 IndexedDB
    participant API as 🌐 llama-server

    Note over User,API: 🚀 Initialization (see: server-flow.mmd, models-flow.mmd)

    UI->>Stores: initialize()
    Stores->>DB: load conversations
    Stores->>API: GET /props
    API-->>Stores: server config + modalities
    Stores->>API: GET /v1/models
    API-->>Stores: single model (auto-selected)

    Note over User,API: 💬 Chat Flow (see: chat-flow.mmd)

    User->>UI: send message
    UI->>Stores: sendMessage()
    Stores->>DB: save user message
    Stores->>API: POST /v1/chat/completions (stream)
    loop streaming
        API-->>Stores: SSE chunks
        Stores-->>UI: reactive update
    end
    API-->>Stores: done + timings
    Stores->>DB: save assistant message

    Note over User,API: 🔁 Regenerate

    User->>UI: regenerate
    Stores->>DB: create message branch
    Note right of Stores: same streaming flow

    Note over User,API: โน๏ธ Stop

    User->>UI: stop
    Stores->>Stores: abort stream
    Stores->>DB: save partial response
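
The streaming loop in the diagram (SSE chunks from `POST /v1/chat/completions`, aborted by the Stop flow) can be sketched as below. The endpoint and `stream: true` request shape follow llama-server's OpenAI-compatible API as shown above; the function names (`parseSseLine`, `streamChat`), the `onDelta` callback, and the store/DB wiring are illustrative assumptions, not the webui's actual identifiers.

```typescript
interface ChatChunk {
  content: string; // assistant-text delta carried by this chunk
  done: boolean;   // true once the "[DONE]" sentinel arrives
}

// Parse one Server-Sent Events line ("data: {...}" or "data: [DONE]")
// into the delta it carries; returns null for non-data lines.
export function parseSseLine(line: string): ChatChunk | null {
  if (!line.startsWith("data: ")) return null;
  const payload = line.slice("data: ".length).trim();
  if (payload === "[DONE]") return { content: "", done: true };
  const json = JSON.parse(payload);
  return { content: json.choices?.[0]?.delta?.content ?? "", done: false };
}

// Drive the streaming step: POST the conversation, read SSE chunks,
// and surface each delta so the store can trigger a reactive UI update.
// Aborting `signal` is the Stop flow; the caller then saves the partial
// response to IndexedDB (omitted here).
export async function streamChat(
  messages: { role: string; content: string }[],
  onDelta: (text: string) => void,
  signal: AbortSignal,
): Promise<void> {
  const res = await fetch("/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ messages, stream: true }),
    signal,
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? ""; // keep any incomplete trailing line
    for (const line of lines) {
      const chunk = parseSseLine(line);
      if (chunk?.done) return;
      if (chunk) onDelta(chunk.content);
    }
  }
}
```

The regenerate flow reuses the same loop on a new message branch; only the persistence step differs.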