---
license: mit
language:
- en
tags:
- rust
- local-agent
- ollama
- agent-harness
- tui
- ratatui
- persistent-memory
- bio-md
- sandboxed-agent
---
<div align="center">
<pre style="color: #CD7F32; font-size: 10pt; font-weight: bold; line-height: 1.2; margin: 0;">
██████╗ ██╗   ██╗███████╗████████╗██╗   ██╗ ██████╗██╗      █████╗ ██╗    ██╗
██╔══██╗██║   ██║██╔════╝╚══██╔══╝╚██╗ ██╔╝██╔════╝██║     ██╔══██╗██║    ██║
██████╔╝██║   ██║███████╗   ██║    ╚████╔╝ ██║     ██║     ███████║██║ █╗ ██║
██╔══██╗██║   ██║╚════██║   ██║     ╚██╔╝  ██║     ██║     ██╔══██║██║███╗██║
██║  ██║╚██████╔╝███████║   ██║      ██║   ╚██████╗███████╗██║  ██║╚███╔███╔╝
╚═╝  ╚═╝ ╚═════╝ ╚══════╝   ╚═╝      ╚═╝    ╚═════╝╚══════╝╚═╝  ╚═╝ ╚══╝╚══╝
</pre>
</div>
# 🦀 RustyClaw 0.6.0 – Local Agent Harness
[![Rust](https://img.shields.io/badge/Rust-stable-orange.svg)](https://www.rust-lang.org/)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
**RustyClaw** is a minimal, bare-bones, terminal-based agent harness that runs entirely locally, powered by [Ollama](https://ollama.com/).
It combines a TUI chat interface, file system operations, Git versioning, memory consolidation, and a REST API – all inside a single Rust binary.
---
## ✨ Features
- 🧠 **Persistent memory** – `bio.md` evolves with every conversation.
- 🖥️ **Full-screen TUI** – built with `ratatui` and `crossterm`.
- 🤖 **Local Ollama** – no data leaves your machine (supports any model).
- 🔒 **Sandboxed file ops** – read/write files inside `~/.rustyclaw/data/`.
- 🐚 **Whitelisted shell commands** – `ls`, `cat`, `echo`, `git`, `pwd`.
- 📦 **Git versioning** – every file change is auto-committed (optional).
- 🧠 **Memory consolidation** – periodic summarisation of conversations into `bio.md`.
- 🌐 **REST API** – `GET /api/bio` to fetch the current `bio.md`.
- 🎨 **Permanent ASCII logo** – RustyClaw branding stays on screen.
- ⚡ **Non-blocking runtime** – smooth TUI even while background tasks run.
---
## File Structure
```
rustyclaw/
├── src/
│   └── main.rs      # single-file application
├── Cargo.toml       # dependencies
├── start.sh         # launcher script (build + run)
├── config.yaml      # optional – auto-created on first run
└── data/            # sandboxed file storage (Git repo) – auto-created on first run
```
> **Note:** `~/.rustyclaw/` is created automatically on first launch.
> The `data/` folder inside it is initialised as a Git repository if `git` is available.
---
## 🛠️ Installation
### 1. Build and run RustyClaw
Once you have the files downloaded, run:
```bash
cd ~/rustyclaw          # the folder containing the downloaded files
cargo build --release
./start.sh --rebuild
```
### 2. Install Ollama
```bash
curl -fsSL https://ollama.com/install.sh | sh
ollama serve & # start the server
ollama pull qwen2.5:0.5b # pull a small model (or any you like)
```
### 3. Install Git (optional but recommended)
```bash
sudo apt install git # Debian/Ubuntu
# or brew install git on macOS
```
> **Warning:** The first build may take a few minutes. Subsequent runs will reuse the cached binary.
---
## Configuration
On first launch, a default `config.yaml` is created in the current directory.
You can edit it to change behaviour:
```yaml
ollama_url: "http://localhost:11434"
ollama_model: "qwen2.5:0.5b"
api_port: 3030
root_dir: "/home/you/.rustyclaw"
bio_file: "/home/you/.rustyclaw/bio.md"
heartbeat_log: "/home/you/.rustyclaw/data/logs/heartbeat.log"
memory_sync_interval_secs: 3600 # consolidate every hour
max_log_lines: 200
git_auto_commit: true
```
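Because the file is a flat list of `key: value` pairs, even a crate-free sketch can read it. The helper below is purely illustrative (the actual binary presumably uses a YAML crate such as `serde_yaml`; `parse_flat_yaml` is a hypothetical name):

```rust
use std::collections::HashMap;

/// Minimal line-based reader for the flat key: value format of config.yaml
/// shown above. Illustrative only: it strips `#` comments and surrounding
/// quotes, and would mishandle values that themselves contain a `#`.
fn parse_flat_yaml(src: &str) -> HashMap<String, String> {
    src.lines()
        .filter_map(|line| {
            let line = line.split('#').next().unwrap_or("").trim(); // drop comments
            let (key, value) = line.split_once(':')?;               // skip blank lines
            Some((
                key.trim().to_string(),
                value.trim().trim_matches('"').to_string(),
            ))
        })
        .collect()
}
```

Note that splitting at the *first* colon keeps values like `http://localhost:11434` intact.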
| Field | Description |
|----------------------------|-----------------------------------------------------------------------------|
| `ollama_url` | Ollama API endpoint (default `http://localhost:11434`) |
| `ollama_model` | Model to use for chat and consolidation |
| `api_port` | Port for the REST API |
| `root_dir` | Where `bio.md` and `data/` live (default `~/.rustyclaw`) |
| `git_auto_commit` | Automatically commit file writes in the `data/` folder |
| `memory_sync_interval_secs`| How often to run automatic memory consolidation |
---
## `bio.md` – The Living Agent Memory
`bio.md` is a Markdown file that acts as the agent's **persistent long-term memory**.
It is read on every chat and updated during `/consolidate`. The file is structured into six sections:
### 1. `# BIO.MD – Living Agent Identity`
- Contains the **last updated** timestamp (auto-refreshed after each chat).
### 2. `## SOUL`
- Core personality, values, constraints, and behavioural rules.
- Example: *"Stay sandboxed, respect security, be concise and helpful."*
### 3. `## SKILLS`
- Reusable capabilities and "how-to" instructions.
- Example: *"Read/write local files, run whitelisted shell commands."*
### 4. `## MEMORY`
- Curated long-term knowledge.
- During `/consolidate`, the agent summarises recent conversations and appends a new entry here (e.g., `### Summary for 2025-04-02 14:30 …`).
### 5. `## CONTEXT`
- Current runtime state (OS, working directory, active model).
### 6. `## SESSION TREE`
- Pointers or summaries of active conversation branches (currently a placeholder – can be extended).
> **You can edit `bio.md` manually** – the agent will respect your changes in future chats.
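Putting the six sections together, a freshly consolidated `bio.md` might look roughly like this (illustrative values only):

```markdown
# BIO.MD – Living Agent Identity
**Last Updated:** 2025-04-02 14:30

## SOUL
Stay sandboxed, respect security, be concise and helpful.

## SKILLS
Read/write local files, run whitelisted shell commands.

## MEMORY
### Summary for 2025-04-02 14:30
...

## CONTEXT
OS: Linux | CWD: ~/.rustyclaw | Model: qwen2.5:0.5b

## SESSION TREE
(placeholder)
```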
---
## Usage – TUI Commands
Launch the TUI with `./start.sh`.
All commands are typed at the bottom input line and sent with **Enter**.
| Command | Description |
|---------|-------------|
| `/help` | Show all commands |
| `/bio` | Display the current `bio.md` content |
| `/consolidate` | Force memory consolidation (summarises recent chats into `## MEMORY`) |
| `/write_file <path> <content>` | Write a file inside `data/` (supports folders) |
| `/read_file <path>` | Read and display a file from `data/` |
| `/model list` | List all available Ollama models |
| `/model select <name>` | Switch to a different model (persists in `config.yaml`) |
| `/list_dir [path]` | List contents of `data/` or a subfolder |
| `/search <query>` | Search for text in all files under `data/` (regex) |
| `/run <command>` | Run a whitelisted shell command (`ls`, `cat`, `echo`, `git`, `pwd`) inside `data/` |
| `/git status` | Show `git status --short` of the `data/` folder |
| `/git log [n]` | Show last `n` commits (default 10) |
| `/git commit <msg>` | Commit all changes in `data/` with a message |
| `/quit` or `/exit` | Exit RustyClaw |
**Any text not starting with `/` is sent as a chat message to the AI.**
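The whitelist check behind `/run` can be sketched in a few lines (a hypothetical helper; the real dispatcher in `main.rs` may differ):

```rust
/// The shell programs /run is allowed to execute, as listed above.
const WHITELIST: &[&str] = &["ls", "cat", "echo", "git", "pwd"];

/// Returns true only when the first word of the command line is a
/// whitelisted program; everything else is rejected before execution.
fn is_whitelisted(cmd: &str) -> bool {
    cmd.split_whitespace()
        .next()
        .map_or(false, |prog| WHITELIST.contains(&prog))
}
```

Checking only the first token means arguments are unrestricted; the sandbox relies on the command itself being safe and on running it inside `data/`.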
---
## REST API
While the TUI is running, a simple HTTP server listens on `http://127.0.0.1:3030`.
- `GET /health` – `{"status":"ok"}`
- `GET /api/bio` – returns the current `bio.md` as JSON:
```json
{"bio": "# BIO.MD – Living Agent Identity\n**Last Updated:** ..."}
```
You can use `curl` to fetch the agent's memory:
```bash
curl http://127.0.0.1:3030/api/bio
```
---
## How Memory Consolidation Works
1. Every chat interaction is logged as a JSON line in `~/.rustyclaw/data/logs/heartbeat.log`.
2. Periodically (default every 3600 seconds), the agent reads the last 20 entries.
3. It sends a summarisation prompt to Ollama.
4. The summary is inserted into the `## MEMORY` section of `bio.md` with a timestamp.
5. The agent's future chats include the updated `bio.md`, giving it long-term recall.
You can also trigger consolidation manually with `/consolidate`.
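Step 4 amounts to splicing text in directly under the `## MEMORY` heading. A rough sketch (a hypothetical helper, not the actual code):

```rust
/// Insert a timestamped summary directly under the `## MEMORY` heading,
/// leaving the rest of bio.md untouched (sketch of step 4 above).
fn insert_memory_entry(bio: &str, timestamp: &str, summary: &str) -> String {
    let entry = format!("\n### Summary for {timestamp}\n{summary}\n");
    match bio.find("## MEMORY") {
        Some(pos) => {
            // Splice right after the end of the heading line.
            let line_end = bio[pos..].find('\n').map(|i| pos + i).unwrap_or(bio.len());
            format!("{}{}{}", &bio[..line_end], entry, &bio[line_end..])
        }
        // If the section is missing, append it so the entry is never lost.
        None => format!("{bio}\n## MEMORY\n{entry}"),
    }
}
```

Inserting at the top of the section keeps the newest summary closest to the heading while older entries drift downward.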
---
## Tool Functions Explained
The core of RustyClaw is the `run_command` dispatcher in `main.rs`.
Each command is handled in a non-blocking worker task.
| Function | Description |
|-------------------|-------------|
| `Chat` | Sends user message to Ollama together with the full `bio.md` as system prompt. Logs the exchange and updates the timestamp in `bio.md`. |
| `ConsolidateMemory` | Reads heartbeat log, asks Ollama to summarise, inserts summary into `bio.md`. |
| `WriteFile` | Sanitises path (stays inside `data/`), creates parent directories, writes content, then optionally `git add` + `commit`. |
| `ReadFile` | Reads a file from `data/` and displays its content in the logs. |
| `ListModels` | Calls Ollama's `/api/tags` endpoint and lists available models. |
| `SelectModel` | Updates `config.yaml` with the new model name. |
| `ListDir` | Uses `walkdir` to show a one-level directory listing. |
| `SearchFiles` | Recursively walks `data/` and prints paths of files containing a regex match. |
| `RunCommand` | Executes a whitelisted command (`ls`, `cat`, `echo`, `git`, `pwd`) inside `data/`. |
| `GitStatus`, `GitLog`, `GitCommit` | Thin wrappers around `git` commands, always run inside `data/`. |
| `Quit` | Signals the main loop to exit. |
All file operations are **sandboxed** – the `sanitize_path` function ensures no path can escape `~/.rustyclaw/data/`.
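The idea behind `sanitize_path` can be sketched as follows (a simplified illustration of the technique, not the exact implementation in `main.rs`):

```rust
use std::path::{Component, Path, PathBuf};

/// Reject absolute paths and any `..` component, then join the request onto
/// the sandbox root, so the result can never escape data/ (simplified sketch).
fn sanitize_path(root: &Path, requested: &str) -> Option<PathBuf> {
    let requested = Path::new(requested);
    if requested.is_absolute() {
        return None; // e.g. /etc/passwd
    }
    for component in requested.components() {
        match component {
            Component::Normal(_) | Component::CurDir => {}
            _ => return None, // ParentDir (..), RootDir, and Prefix all rejected
        }
    }
    Some(root.join(requested))
}
```

Rejecting `..` outright is stricter than resolving it, but it avoids having to canonicalise paths that may not exist yet.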
---
## 📄 License
MIT License