---
license: mit
language:
- en
- cpp
library_name: pencilclaw
tags:
- c++
- ollama
- code-generation
- autonomous-agent
- local-ai
- git
- offline
- cpp
- agent
- task-automation
- llm
- code-execution
- self-hosted
- privacy-focused
---
# ✏️ PENCILCLAW – Autonomous C++ Coding Agent (v1.0 Testing)
**PENCILCLAW** is a C++-based autonomous coding agent that uses your local [Ollama](https://ollama.com/) instance to generate, manage, and execute C++ code. It features a persistent task system, Git integration, and a guarded execution environment, all running offline with complete privacy.
---
## Features
- **Code Generation (`/CODE`)** – Generate C++ code for any idea, automatically saved as a `.txt` file.
- **Autonomous Tasks (`/TASK`)** – Start a long-running coding goal; the agent continues working on it in the background via heartbeat.
- **Task Management** – View status (`/TASK_STATUS`) and stop tasks (`/STOP_TASK`).
- **Code Execution (`/EXECUTE`)** – Compile and run the last generated code block (with safety confirmation).
- **Git Integration** – Every saved file is automatically committed to a local Git repository inside `pencil_data/`.
- **Heartbeat & Keep-Alive** – Periodically keeps the Ollama model loaded and advances active tasks.
- **Secure by Design** – No shell command injection, sanitised paths, and explicit confirmation before running AI-generated code.
- **Natural Language Interface** – Commands like *"write code for a fibonacci function"* are understood.
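The heartbeat feature above can be pictured as a background thread that wakes at a fixed interval until told to stop. This is only an illustrative sketch; the class and callback names (`Heartbeat`, `on_tick`) are not PENCILCLAW's actual API:

```cpp
#include <chrono>
#include <condition_variable>
#include <functional>
#include <mutex>
#include <thread>

// Minimal stoppable heartbeat: invokes on_tick() every `interval`
// until stop() is called (or the object is destroyed).
class Heartbeat {
public:
    Heartbeat(std::chrono::milliseconds interval, std::function<void()> on_tick)
        : interval_(interval), on_tick_(std::move(on_tick)),
          worker_([this] { run(); }) {}

    ~Heartbeat() { stop(); }

    void stop() {
        {
            std::lock_guard<std::mutex> lock(mtx_);
            stopping_ = true;
        }
        cv_.notify_all();
        if (worker_.joinable()) worker_.join();
    }

private:
    void run() {
        std::unique_lock<std::mutex> lock(mtx_);
        while (!stopping_) {
            // wait_for returns true early if stop() flips the flag
            if (cv_.wait_for(lock, interval_, [this] { return stopping_; }))
                break;
            on_tick_();  // e.g. ping Ollama to keep the model loaded
        }
    }

    std::chrono::milliseconds interval_;
    std::function<void()> on_tick_;
    std::mutex mtx_;
    std::condition_variable cv_;
    bool stopping_ = false;    // declared before worker_, so it is
    std::thread worker_;       // initialised before the thread starts
};
```

A `std::condition_variable` is used instead of a plain `sleep` so that `stop()` returns promptly rather than waiting out the full interval.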
---
## Project Structure
```
/home/kali/pencilclaw/
├── pencilclaw.cpp        # Main program source
├── pencil_utils.hpp      # Workspace utilities
├── pencilclaw            # Compiled executable
└── pencil_data/          # Created automatically on first run
    ├── session.log       # Full interaction log
    ├── .git/             # Local Git repository (if initialised)
    ├── tasks/            # Autonomous task folders
    │   └── 20260309_123456_build_calculator/
    │       ├── description.txt
    │       ├── log.txt
    │       ├── iteration_1.txt
    │       └── ...
    └── [code files].txt  # Files saved via /CODE or natural language
```
---
## Requirements
- **Compiler** with C++17 support (g++ 7+ or clang 5+)
- **libcurl** development libraries
- **nlohmann/json** (header-only JSON library)
- **Ollama** installed and running
- A model pulled in Ollama (default: `qwen2.5:0.5b`, configurable via the `OLLAMA_MODEL` environment variable)
*Note: PENCILCLAW uses POSIX system calls (`fork`, `pipe`, `execvp`). It runs on Linux, macOS, and Windows Subsystem for Linux (WSL).*
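Under the hood, talking to Ollama amounts to an HTTP POST to its `/api/generate` endpoint with a small JSON body. A sketch of building that body by hand is below; PENCILCLAW itself uses nlohmann/json, so the hand-rolled `json_escape` helper here is purely illustrative:

```cpp
#include <cstdlib>
#include <string>

// Escape a string for embedding inside a JSON document.
std::string json_escape(const std::string& s) {
    std::string out;
    for (char c : s) {
        switch (c) {
            case '"':  out += "\\\""; break;
            case '\\': out += "\\\\"; break;
            case '\n': out += "\\n";  break;
            case '\t': out += "\\t";  break;
            default:   out += c;      break;
        }
    }
    return out;
}

// Build the JSON body for POST http://localhost:11434/api/generate.
// The model defaults to qwen2.5:0.5b, overridable via OLLAMA_MODEL.
std::string build_request(const std::string& prompt) {
    const char* env = std::getenv("OLLAMA_MODEL");
    std::string model = env ? env : "qwen2.5:0.5b";
    return "{\"model\":\"" + json_escape(model) +
           "\",\"prompt\":\"" + json_escape(prompt) +
           "\",\"stream\":false}";
}
```

The resulting string is what libcurl would send as the POST body; with `"stream":false` Ollama returns a single JSON object containing the full `response` field.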
---
## Installation
### 1. Install System Dependencies
```bash
sudo apt update
sudo apt install -y build-essential libcurl4-openssl-dev
```
### 2. Install nlohmann/json
The library is header-only; simply download `json.hpp` and place it in your include path, or install it via your package manager:
```bash
sudo apt install -y nlohmann-json3-dev
```
### 3. Install Ollama
```bash
curl -fsSL https://ollama.com/install.sh | sh
ollama serve & # start the service
ollama pull qwen2.5:0.5b # or your preferred model
```
**Set the model (optional)** – override the default model by setting the environment variable:
```bash
export OLLAMA_MODEL="llama3.2:latest"
```
### 4. Change to the Project Directory
```bash
cd ~/pencilclaw/   # the folder containing the source files
```
### 5. Compile PENCILCLAW
```bash
g++ -std=c++17 -o pencilclaw pencilclaw.cpp -lcurl
```
If `json.hpp` is in a non-standard location, add the appropriate `-I` flag.
---
## Usage
Start the program:
```bash
./pencilclaw
```
You will see the `>` prompt. Commands are case-sensitive and start with `/`. Any line not starting with `/` is treated as natural language and passed to Ollama.
### Available Commands
| Command | Description |
|-----------------------|-----------------------------------------------------------------------------|
| `/HELP` | Show this help message. |
| `/CODE <idea>` | Generate C++ code for the given idea; saved as `<sanitized_idea>.txt`. |
| `/TASK <description>` | Start a new autonomous coding task (creates a timestamped folder). |
| `/TASK_STATUS` | Show the current active task, its folder, and iteration count. |
| `/STOP_TASK` | Clear the active task (does not delete existing task files). |
| `/EXECUTE` | Compile and run the first C++ code block from the last AI output. |
| `/FILES` | List all saved `.txt` files and task folders. |
| `/DEBUG` | Toggle verbose debug output (shows JSON requests/responses). |
| `/EXIT` | Quit the program. |
### Natural Language Examples
- `write code for a fibonacci function`
- `start a task to build a calculator`
- `save it as mycode.txt` (after code generation)
---
## Git Integration
PENCILCLAW automatically initialises a Git repository inside `pencil_data/` on first run. Every file saved via `/CODE` or task iteration is committed with a descriptive message. The repository is configured with a local identity (`pencilclaw@local` / `PencilClaw`) so commits work even without global Git configuration.
If you prefer not to use Git, simply remove the `.git` folder from `pencil_data/`; PENCILCLAW will detect its absence and skip all Git operations.
---
## Security Notes
- **Code execution is potentially dangerous.** PENCILCLAW always shows the code and requires you to type `yes` before running it.
- **Path traversal is prevented** – filenames are sanitised, and all writes are confined to `pencil_data/`.
- **No shell commands are used** – all external commands (`git`, `g++`) are invoked via `fork`+`execvp` with argument vectors, eliminating command injection risks.
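Running external tools without a shell, as described above, looks roughly like this. It is a simplified sketch (no output capture, POSIX only); PENCILCLAW's own wrapper will differ in details:

```cpp
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>
#include <string>
#include <vector>

// Run a command from an argument vector via fork+execvp.
// No shell is involved, so shell metacharacters in arguments are inert.
// Returns the child's exit status, or -1 on failure.
int run_command(const std::vector<std::string>& args) {
    std::vector<char*> argv;
    for (const auto& a : args)
        argv.push_back(const_cast<char*>(a.c_str()));
    argv.push_back(nullptr);   // execvp expects a null-terminated array

    pid_t pid = fork();
    if (pid < 0) return -1;
    if (pid == 0) {            // child: replace this image with the command
        execvp(argv[0], argv.data());
        _exit(127);            // only reached if execvp itself failed
    }
    int status = 0;            // parent: wait for the child to finish
    if (waitpid(pid, &status, 0) < 0) return -1;
    return WIFEXITED(status) ? WEXITSTATUS(status) : -1;
}
```

Because the arguments never pass through `/bin/sh`, an input like `"; rm -rf ~"` is just an ordinary string argument, not an injected command.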
---
## Configuration
| Setting | Method |
|------------------------|------------------------------------------------------------------------|
| Ollama model | Environment variable `OLLAMA_MODEL` (default: `qwen2.5:0.5b`) |
| Workspace directory | Environment variable `PENCIL_DATA` (default: `./pencil_data/`) |
| Heartbeat interval | Edit `HEARTBEAT_INTERVAL` in source (default 120 seconds) |
| Keep-alive interval    | Edit `KEEP_ALIVE_INTERVAL` in source (default 120 seconds)             |
---
## Troubleshooting
| Problem | Solution |
|----------------------------------|----------------------------------------------------------------|
| `json.hpp: No such file or directory` | Install nlohmann/json or add the correct `-I` flag. |
| `curl failed: Couldn't connect to server` | Ensure Ollama is running (`ollama serve`) and the URL `http://localhost:11434` is accessible. |
| Model not found | Run `ollama pull <model_name>` (e.g., `qwen2.5:0.5b`). |
| Git commit fails | PENCILCLAW configures a local identity automatically, so this should not happen; if it does, set `git config user.name`/`user.email` manually inside `pencil_data/`. |
| Compilation errors (C++17) | Use a compiler that supports `-std=c++17` (g++ 7+ or clang 5+). |
---
## License
This project is released under the MIT License. Built with C++ and Ollama.