Update README.md

README.md (changed)
@@ -8,9 +8,6 @@ dependencies: gcc, libcurl, bc, cJSON
 persistence: shadowclaw.bin
 ---

-# Shadowclaw v1.0 (Testing)
-
-
 <div style="max-width: 100%; overflow-x: auto; background: #f6f8fa; padding: 4px; border-radius: 4px;">
 <pre style="font-family: 'Courier New', monospace; font-size: clamp(4px, 2vw, 10px); line-height: 1.2; margin: 0;">
  ____ _ _ _
@@ -21,9 +18,19 @@ persistence: shadowclaw.bin
 </pre>
 </div>

 **Shadowclaw** is a minimal, single‑binary personal AI agent written in C. It follows the *OpenClaw* philosophy: self‑hosted, tool‑using, persistent memory, and minimal dependencies. The core memory management uses **Tsoding's "shadow header" trick** (like `stb_ds` but for a growable arena). All data (conversation history, tool definitions, results) lives inside a single `realloc`‑ed memory block with a hidden header. The agent communicates with a local LLM (Ollama) via curl, can execute shell commands, read/write files, perform HTTP GET, and evaluate simple math expressions. State is automatically saved to disk after every interaction.

-

 ---
@@ -65,55 +72,25 @@ sudo apt install build-essential libcurl4-openssl-dev bc

 ---

-##
-
-Just run `make`:
-
-```bash
-make
-```
-Optionally strip the binary to make it even smaller:
-
-```bash
-make strip
-```

-
-
----
-
-## Usage
-
-1. Make sure Ollama is running and you have a model (change `ollama_model` in `shadowclaw.c` if needed).
-
-2. Run the agent:
-```bash
-./shadowclaw
-```
-3. Type your input and press Enter. The agent will:
-- Build a prompt from recent conversation and system message.
-- Call Ollama.
-- If the LLM outputs a tool call in the format ```tool {"tool":"name","args":"..."} ```, the tool is executed and the result is fed back in the next turn.
-- The conversation and state are saved to `shadowclaw.bin` after each turn.
-
-Example tool invocation (the LLM must produce this):
-
-````
-I need to list the current directory.
-```tool
-{"tool":"shell","args":"ls -la"}
-```
-````

-
-
-- **Change Ollama endpoint/model**: edit `ollama_endpoint` and `ollama_model` in `shadowclaw.c`.
 - **Add new tools**: extend the `tools` array in `shadowclaw.c` with a name and a function that takes a `const char*` argument and returns a `char*` (must be `malloc`ed, the caller will `free` it).
-- **Modify system prompt**: the first blob (kind 1) contains the system message. You can change it directly in the code or, after first run, edit the saved `shadowclaw.bin` (not recommended).

 ---
@@ -148,18 +125,141 @@ Each item (system prompt, user message, tool result, etc.) is stored as:

 ### Persistence

-The whole arena (header + payload) is written to `shadowclaw.bin` with `fwrite`.
-
 ```
-

 ---
@@ -167,7 +267,7 @@ The agent parses the block, executes the tool, and appends the result as a new blob

 - **Tsoding** – for the “insane shadow data trick” (the header‑before‑data arena idea).
 - **cJSON** – Dave Gamble and contributors – the minimal JSON parser used.
-- **
 - **Vibe Code** - xAI Grok 4.20 and Deepseek AI
 - **Openclaw**
 - **webXOS**
@@ -175,5 +275,5 @@ The agent parses the block, executes the tool, and appends the result as a new blob

 ## License

-This project is released under the MIT License.
-cJSON is also MIT licensed.
persistence: shadowclaw.bin
---

<div style="max-width: 100%; overflow-x: auto; background: #f6f8fa; padding: 4px; border-radius: 4px;">
<pre style="font-family: 'Courier New', monospace; font-size: clamp(4px, 2vw, 10px); line-height: 1.2; margin: 0;">
 ____ _ _ _
</pre>
</div>

# Shadowclaw v1.3 (Testing)

**Shadowclaw** is a minimal, single‑binary personal AI agent written in C. It follows the *OpenClaw* philosophy: self‑hosted, tool‑using, persistent memory, and minimal dependencies. The core memory management uses **Tsoding's "shadow header" trick** (like `stb_ds` but for a growable arena). All data (conversation history, tool definitions, results) lives inside a single `realloc`‑ed memory block with a hidden header. The agent communicates with a local LLM (Ollama) via curl, can execute shell commands, read/write files, perform HTTP GET, and evaluate simple math expressions. State is automatically saved to disk after every interaction.

**Niche edge use cases:**

- RPi Zero/IoT: offline sensor scripts (shell + persistent `shadowclaw.bin`)
- Air-gapped systems: USB-stick local LLM agent (file/HTTP/math)
- Embedded routers: 100-200 KB network automation (low-mem Linux)
- Low-power edge nodes: self-hosted persistent AI, no cloud.

---

---

## Define Your Model

- **Change Ollama endpoint/model**: edit `ollama_endpoint` and `ollama_model` in `shadowclaw.c`.

Find this section (around line 507) in `shadowclaw.c`:

```c
// --------------------------------------------------------------------
// Main (with slash commands from v1.2.2)
// --------------------------------------------------------------------
int main(int argc, char **argv) {
    const char *state_file = "shadowclaw.bin";
    const char *ollama_endpoint = "http://localhost:11434";
    const char *ollama_model = "qwen2.5:0.5b"; // change as needed
```

Change `ollama_model = "qwen2.5:0.5b";` to the name of the model you have pulled in Ollama.

- **Add new tools**: extend the `tools` array in `shadowclaw.c` with a name and a function that takes a `const char*` argument and returns a `char*` (it must be `malloc`ed; the caller will `free` it).

---

### Persistence

- The whole arena (header + payload) is written to `shadowclaw.bin` with `fwrite`.
- On startup, if the file exists and has a valid magic number, it is loaded back.
- All conversations and tool results are automatically saved to `shadowclaw.bin` and reloaded on restart.

---

# Updated 3/3/2026: Shadowclaw v1.3

### Built‑in Slash Commands

Type any of these commands directly at the `>` prompt – they are handled without invoking the LLM.

| Command | Description |
|---------|-------------|
| `/help` | Show this help message. |
| `/tools` | List all available tools (shell, file I/O, HTTP, math, `list_dir`). |
| `/state` | Display current arena memory statistics (capacity, used bytes, dirty flag). |
| `/clear` | Clear the conversation history while retaining the system prompt. |
| `/chat` | Remind you that you are already in chat mode (the default behaviour). |
| `/exit` | Quit Shadowclaw. |

## Build

```bash
cd ~/shadowclaw   # the folder where the Shadowclaw files are located
```

```bash
make clean && make
```

## Run

```bash
./shadowclaw
```

Use the slash commands above to get started – even without Ollama running, you can explore the built‑in features.

When the agent is running, typing `/help` should print something like:

```bash
┌──(kali㉿user)-[~/shadowclaw]
└─$ ./shadowclaw
ShadowClaw ready. Type your message (Ctrl-D to exit)
/help
Shadowclaw commands:
/help Show this help
/tools List available tools
/state Show arena memory stats
/clear Clear conversation history (keeps system prompt)
/chat Remind you that chat mode is active
/exit Exit Shadowclaw
```

## Notes

- If Ollama is not running, you'll see "LLM call failed".
- Tool arguments can be a JSON array – they will be joined with spaces (useful if your model outputs `"args":["arg1","arg2"]`).
- All conversations and tool results are saved in `shadowclaw.bin` and reloaded on restart.

---

## How Tool Arguments Work

Shadowclaw parses the tool-call JSON, passes the `args` value to the tool function as a single string, executes the tool, and appends the result as a kind 5 blob so it is visible in the next prompt. As a fallback, when the model emits `args` as an array (e.g. `["arg1","arg2"]`), the elements are accepted and joined, which keeps tool calls working even when the model's JSON encoding is non-standard.

- **Tool call flow**: the agent parses the JSON block, executes the tool, and appends the result.
- **Result visibility**: the tool output is visible to the model in the next prompt.
- **Fallback mechanism**: if `args` is not a plain string, Shadowclaw interprets it as an array.
- **Alternative tooling**: in some scenarios, tool calls can be handled as a single string to avoid parsing errors.

This handling tolerates the malformed JSON that certain LLM providers occasionally emit, so the tool call still succeeds.

For example, if the model emits:

```json
{"tool":"write_file","args":["notes.txt","Hello world"]}
```

Shadowclaw automatically **joins the array elements with spaces**, so the tool receives a single string `"notes.txt Hello world"`. This makes the agent robust to different model behaviours and preserves:

- **Simplicity** – most tools only need a single string argument (a command, a filename, a URL).
- **Flexibility** – if a tool needs multiple pieces of information (like `write_file` needing a filename **and** content), a simple delimiter (newline) is used; the LLM can learn this pattern from the system prompt.
- **Lightweight design** – no complex argument schemas, no extra JSON nesting; the entire tool call stays tiny.
- **A single-string contract** – args are always passed to the tool as one string, to keep things simple.
- **Delimiter-encoded multi-values** – when a tool needs several values, the tool and the LLM agree on the format (e.g. newline-separated).
- **Minimalism** – Shadowclaw's niche is a single binary with persistent memory, running anywhere, with just enough tools to be genuinely useful.

You can easily add new tools by editing the `tools` array – each new function opens up more automation possibilities for your environment.

---

## Niche Use Cases

Shadowclaw is designed for **minimal, self‑contained AI agents** running on resource‑constrained or air‑gapped systems. Its toolset is deliberately small but powerful enough to automate many tasks without spawning heavy processes.

| Tool | What it does | Example args | Unique niche use |
|------|--------------|--------------|------------------|
| `shell` | Executes any shell command | `"ls -la /home"` | Automate system maintenance, run scripts, control services on a headless Raspberry Pi or embedded Linux device. |
| `read_file` | Reads a file from disk | `"/etc/passwd"` | Inspect configuration files, read logs, retrieve data for the LLM to analyse – all without a web interface. |
| `write_file` | Writes content to a file (args: `filename\ncontent`) | `"notes.txt\nBuy milk"` | Create notes, write configuration files, save results – perfect for offline data logging. |
| `http_get` | Fetches a URL (via libcurl) | `"https://api.example.com/data"` | Retrieve weather, fetch RSS feeds, call local REST APIs – even on devices with no browser, just a network stack. |
| `math` | Evaluates an expression using `bc` | `"2 + 2 * 5"` | Let the LLM do arithmetic, unit conversions, or simple calculations without relying on external tools. |
| `list_dir` | Lists directory contents | `"/home/user"` | Explore the filesystem, find documents, check available storage – all natively, without forking a shell. |

---

## Niche Use Case Examples

### 1. Raspberry Pi Zero / IoT Sensor Node
- **Tools used:** `shell` + `write_file`
- **Flow:** the LLM reads a temperature sensor via a shell script, then logs the value to a file.
- **Why Shadowclaw?** Runs in under 200 KB of RAM – no Python, no bloat. Persistent memory keeps the last readings.

### 2. Air‑Gapped Engineering Workstation
- **Tools used:** `http_get` + `math` + `read_file`
- **Flow:** an engineer asks for a component value calculation; the LLM fetches a local datasheet via HTTP, reads a config file, does the math, and writes a report.
- **Why Shadowclaw?** No cloud, no internet required. All data stays on the machine.

### 3. Embedded Router Automation
- **Tools used:** `shell` + `list_dir`
- **Flow:** a network admin asks for a list of active interfaces; the LLM runs `ifconfig`, parses the output, then suggests a config change.
- **Why Shadowclaw?** The tiny binary fits in router storage (often just a few MB) and uses curl for local API calls.

### 4. Low‑Power Field Logger
- **Tools used:** `write_file` + `math`
- **Flow:** a solar‑powered device collects environmental data; Shadowclaw processes and stores it locally, then answers queries about trends.
- **Why Shadowclaw?** No database needed – just a binary and a single file (`shadowclaw.bin`) for persistence.

---

- **Tsoding** – for the “insane shadow data trick” (the header‑before‑data arena idea).
- **cJSON** – Dave Gamble and contributors – the minimal JSON parser used.
- **Ollama** – HTTP requests via curl.
- **Vibe Code** – xAI Grok 4.20 and Deepseek AI.
- **Openclaw**
- **webXOS**

## License

This project is released under the MIT License.
cJSON is also MIT licensed.