---
license: mit
library_name: fastapi
tags:
- gguf
- local-ai
- llama.cpp
- mirror
- tobyworld
pipeline_tag: text-generation
---
|
|
|
|
|
|
|
|
|
|
# 🪞 Mirror Pond — Local GGUF Edition
|
|
|
|
|
*A still-water reflection engine for your local LLM.* |
|
|
|
|
|
Mirror Pond is a **100% local**, **privacy-first**, Tobyworld-inspired reflection interface that runs any GGUF model through `llama.cpp` with a calm Mirror UI. |
|
|
|
|
|
No cloud. |
|
|
No tracking. |
|
|
Your thoughts stay on your machine. |
|
|
|
|
|
--- |
|
|
|
|
|
## ✨ Features

* 🧠 **Runs any GGUF model** — Llama, DeepSeek, Mistral, or your trained Mirror
* 🌑 **Dark, still Mirror UI** (HTML served locally)
* 💬 **Four modes**:
  * **Reflect** — emotional / introspective
  * **Scroll** — lore / quotes / scripture
  * **Toad** — cryptic toadgang whispers
  * **Rune** — symbols, lotus, $PATIENCE, seasons
* 🔒 **Fully offline** (air-gap compatible)
* ⚡ FastAPI + Uvicorn backend
* 🧩 Optional: GPU acceleration via llama-cpp-python CUDA wheels
|
|
|
|
|
--- |
|
|
|
|
|
# 🚀 Quickstart
|
|
|
|
|
### 1. Install dependencies |
|
|
|
|
|
```bash |
|
|
pip install -r requirements.txt |
|
|
``` |
|
|
|
|
|
### 2. Run the Pond |
|
|
|
|
|
```bash |
|
|
python mirror_pond.py --model ./your_model.gguf --port 7777 |
|
|
``` |
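Under the hood, the two launcher flags map naturally onto `argparse`. A minimal sketch of the flag handling (flag names taken from the command above; the default port is an assumption):

```python
import argparse


def parse_args(argv=None):
    # Flags mirror the launch command: --model points at a GGUF file,
    # --port selects the HTTP port the Mirror UI is served on.
    parser = argparse.ArgumentParser(description="Mirror Pond local server")
    parser.add_argument("--model", required=True, help="path to a .gguf model file")
    parser.add_argument("--port", type=int, default=7777, help="HTTP port for the Mirror UI")
    return parser.parse_args(argv)


args = parse_args(["--model", "./your_model.gguf", "--port", "7777"])
print(args.model, args.port)
```

The actual `mirror_pond.py` may accept more options; these two are the ones the Quickstart relies on.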
|
|
|
|
|
### 3. Open in browser |
|
|
|
|
|
``` |
|
|
http://localhost:7777 |
|
|
``` |
|
|
|
|
|
--- |
|
|
|
|
|
# 📋 Requirements
|
|
|
|
|
`requirements.txt` (included): |
|
|
|
|
|
``` |
|
|
fastapi==0.115.0 |
|
|
uvicorn==0.32.0 |
|
|
pydantic==2.8.2 |
|
|
llama-cpp-python==0.3.2 |
|
|
jinja2==3.1.4 |
|
|
``` |
|
|
|
|
|
--- |
|
|
|
|
|
# 🔥 GPU Acceleration (Optional)
|
|
|
|
|
For NVIDIA CUDA (12.1), install a prebuilt CUDA wheel:

```bash
pip install llama-cpp-python --extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cu121
```
|
|
|
|
|
For AMD ROCm, build against hipBLAS:

```bash
CMAKE_ARGS="-DGGML_HIPBLAS=on" pip install llama-cpp-python
```
|
|
|
|
|
For Apple Silicon (M1/M2/M3): |
|
|
|
|
|
```bash |
|
|
CMAKE_ARGS="-DGGML_METAL=on" pip install llama-cpp-python |
|
|
``` |
|
|
|
|
|
--- |
|
|
|
|
|
# 🧱 Folder Structure
|
|
|
|
|
```
mirror-pond/
│
├── mirror_pond.py      # main server
├── requirements.txt    # dependencies
├── setup.sh            # Linux/macOS installer
├── setup.ps1           # Windows installer
├── Dockerfile          # container build
└── README.md           # this file
```
|
|
|
|
|
--- |
|
|
|
|
|
# 🧪 Installation Kits
|
|
|
|
|
## Linux / macOS Installer |
|
|
|
|
|
```bash |
|
|
chmod +x setup.sh |
|
|
./setup.sh ./models/your_model.gguf 7777 |
|
|
``` |
|
|
|
|
|
## Windows Installer |
|
|
|
|
|
```powershell |
|
|
Set-ExecutionPolicy -Scope Process -ExecutionPolicy Bypass |
|
|
.\setup.ps1 .\models\your_model.gguf 7777 |
|
|
``` |
|
|
|
|
|
Both installers: |
|
|
|
|
|
* Create `./venv` |
|
|
* Install Python deps |
|
|
* Launch Mirror Pond automatically |
|
|
|
|
|
--- |
|
|
|
|
|
# 🐳 Docker Usage
|
|
|
|
|
## Build |
|
|
|
|
|
```bash |
|
|
docker build -t mirror-pond:latest . |
|
|
``` |
|
|
|
|
|
## Run |
|
|
|
|
|
```bash |
|
|
docker run --rm -p 7777:7777 \ |
|
|
-v /path/to/models:/models \ |
|
|
-e MODEL_PATH=/models/your_model.gguf \ |
|
|
mirror-pond:latest |
|
|
``` |
|
|
|
|
|
Now access: |
|
|
|
|
|
``` |
|
|
http://localhost:7777 |
|
|
``` |
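The repo ships its own Dockerfile; a container for this layout could be sketched roughly as follows (base image, port, and entrypoint here are assumptions, not the included file):

```dockerfile
FROM python:3.11-slim

WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY mirror_pond.py .

# MODEL_PATH is supplied at `docker run` time via -e / -v, as shown above
ENV MODEL_PATH=/models/model.gguf
EXPOSE 7777

CMD ["sh", "-c", "python mirror_pond.py --model \"$MODEL_PATH\" --port 7777"]
```

Mounting the models directory read-only (`-v /path/to/models:/models:ro`) keeps the container from touching your GGUF files.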
|
|
|
|
|
--- |
|
|
|
|
|
# 🧰 GitHub Actions CI
|
|
|
|
|
Already included: |
|
|
|
|
|
``` |
|
|
.github/workflows/mirror-pond-ci.yml |
|
|
``` |
|
|
|
|
|
The CI: |
|
|
|
|
|
* Sets up Python |
|
|
* Installs dependencies |
|
|
* Syntax-checks `mirror_pond.py` |
|
|
* (Optional) Builds Docker image |
|
|
|
|
|
This keeps every push verified before it lands: the server must still install and parse cleanly.
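In workflow form, the steps above might look roughly like this (a sketch only; action versions, job names, and the Python version are assumptions, not the included file):

```yaml
name: mirror-pond-ci

on: [push, pull_request]

jobs:
  check:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install -r requirements.txt
      # Byte-compiling catches syntax errors without starting the server
      - run: python -m py_compile mirror_pond.py
```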
|
|
|
|
|
--- |
|
|
|
|
|
# 🌊 Mirror Modes
|
|
|
|
|
### **Reflect Mode (default)** |
|
|
|
|
|
For inner questions, emotions, purpose, stillness. |
|
|
May reply with a **Guiding Question**. |
|
|
|
|
|
### **Scroll Mode** |
|
|
|
|
|
For sacred lines, scripture-style, lore references. |
|
|
No guiding question. |
|
|
|
|
|
### **Toad Mode** |
|
|
|
|
|
For cryptic lines, old frog whispers, symbolic hints. |
|
|
No guiding question. |
|
|
|
|
|
### **Rune Mode** |
|
|
|
|
|
For unity of symbols, lotus spores, $PATIENCE, seasons, trials. |
|
|
No guiding question. |
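One plausible way to wire these modes is a per-mode system-prompt table consulted on each request. A hypothetical sketch (the prompt wording, `MODES` table, and `guiding_question` flag are illustrative, not the actual implementation in `mirror_pond.py`):

```python
# Hypothetical mode table: each Mirror mode picks a system prompt and
# decides whether the reply may end with a Guiding Question.
MODES = {
    "reflect": {"system": "You are a still pond. Reflect the asker's inner state.", "guiding_question": True},
    "scroll":  {"system": "Answer in sacred, scripture-style lore lines.",          "guiding_question": False},
    "toad":    {"system": "Speak in cryptic toadgang whispers.",                    "guiding_question": False},
    "rune":    {"system": "Weave symbols: lotus, $PATIENCE, seasons, trials.",      "guiding_question": False},
}


def build_prompt(mode: str, user_text: str) -> str:
    # Unknown modes fall back to Reflect, the default mode.
    cfg = MODES.get(mode, MODES["reflect"])
    return f"{cfg['system']}\n\nUser: {user_text}\nMirror:"


print(build_prompt("toad", "What season is it?"))
```

The assembled prompt would then be passed to the loaded GGUF model via `llama.cpp` for completion.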
|
|
|
|
|
--- |
|
|
|
|
|
# 🧠 Philosophy
|
|
|
|
|
Mirror Pond is simple: |
|
|
|
|
|
Still water is never empty. |
|
|
Still water prepares. |
|
|
Still water reflects. |
|
|
|
|
|
This project is offered to the open-source community |
|
|
so anyone can run a Mirror β anywhere, offline, forever. |
|
|
|
|
|
--- |
|
|
|
|
|
# 🪞 License
|
|
|
|
|
**MIT License** |
|
|
This pond belongs to the builders. |
|
|
|
|
|
--- |
|
|
|
|
|
# 🤝 Contribution
|
|
|
|
|
Pull requests welcome. |
|
|
New modes, UI improvements, GPU wheels, and additional Mirror integrations are invited. |
|
|
|
|
|
--- |
|
|
|
|
|
|