webxos committed
Commit dbd9d67 · verified · 1 parent: fc8a1d1

Create README.md

Files changed (1):
  1. README.md +174 -0

README.md ADDED
@@ -0,0 +1,174 @@
# ✏️ PENCILCLAW – Autonomous C++ Coding Agent (v1.0 Testing)

```
██████╗ ███████╗███╗   ██╗ ██████╗██╗██╗      ██████╗██╗      █████╗ ██╗    ██╗
██╔══██╗██╔════╝████╗  ██║██╔════╝██║██║     ██╔════╝██║     ██╔══██╗██║    ██║
██████╔╝█████╗  ██╔██╗ ██║██║     ██║██║     ██║     ██║     ███████║██║ █╗ ██║
██╔═══╝ ██╔══╝  ██║╚██╗██║██║     ██║██║     ██║     ██║     ██╔══██║██║███╗██║
██║     ███████╗██║ ╚████║╚██████╗██║███████╗╚██████╗███████╗██║  ██║╚███╔███╔╝
╚═╝     ╚══════╝╚═╝  ╚═══╝ ╚═════╝╚═╝╚══════╝ ╚═════╝╚══════╝╚═╝  ╚═╝ ╚══╝╚══╝
```

**PENCILCLAW** is a C++-based autonomous coding agent that harnesses your local [Ollama](https://ollama.com/) instance to generate, manage, and execute C++ code. It features a persistent task system, Git integration, and a secure execution environment – all running offline with complete privacy.

---

## Features

- **Code Generation (`/CODE`)** – Generate C++ code for any idea, automatically saved as a `.txt` file.
- **Autonomous Tasks (`/TASK`)** – Start a long-running coding goal; the agent keeps working on it in the background via the heartbeat.
- **Task Management** – View status (`/TASK_STATUS`) and stop tasks (`/STOP_TASK`).
- **Code Execution (`/EXECUTE`)** – Compile and run the last generated code block (with a safety confirmation).
- **Git Integration** – Every saved file is automatically committed to a local Git repository inside `pencil_data/`.
- **Heartbeat & Keep-Alive** – Keeps the Ollama model loaded and periodically continues active tasks.
- **Secure by Design** – Command injection is prevented, paths are sanitised, and AI-generated code is never run without explicit confirmation.
- **Natural Language Interface** – Commands like *"write code for a fibonacci function"* are understood.

---
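The heartbeat above can be sketched as a small timer thread that invokes a callback on a fixed interval until stopped. This is an illustrative design only – the class name, interval handling, and callback are assumptions, not the actual PENCILCLAW internals:

```cpp
#include <cassert>
#include <chrono>
#include <condition_variable>
#include <functional>
#include <mutex>
#include <thread>

// Minimal heartbeat loop: calls `tick` every `interval` until destroyed.
// In PENCILCLAW the tick would ping Ollama (keep-alive) and advance the
// active task by one iteration; here it is just an arbitrary callback.
class Heartbeat {
public:
    Heartbeat(std::chrono::milliseconds interval, std::function<void()> tick)
        : interval_(interval), tick_(std::move(tick)),
          worker_([this] { run(); }) {}

    ~Heartbeat() {
        {
            std::lock_guard<std::mutex> lock(m_);
            stop_ = true;
        }
        cv_.notify_one();
        worker_.join();
    }

private:
    void run() {
        std::unique_lock<std::mutex> lock(m_);
        while (!stop_) {
            // wait_for returns false on timeout -> time for a heartbeat;
            // returning true means we were woken up to stop.
            if (!cv_.wait_for(lock, interval_, [this] { return stop_; }))
                tick_();
        }
    }

    std::chrono::milliseconds interval_;
    std::function<void()> tick_;
    std::mutex m_;
    std::condition_variable cv_;
    bool stop_ = false;
    std::thread worker_;   // declared last so it starts after the other members
};
```

Destroying the object wakes the worker immediately rather than waiting out a full interval, which is why a condition variable is used instead of a plain `sleep`.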
## Project Structure

```
/home/kali/pencilclaw/
├── pencilclaw.cpp        # Main program source
├── pencil_utils.hpp      # Workspace utilities
├── pencilclaw            # Compiled executable
└── pencil_data/          # Created automatically on first run
    ├── session.log       # Full interaction log
    ├── .git/             # Local Git repository (if initialised)
    ├── tasks/            # Autonomous task folders
    │   └── 20260309_123456_build_calculator/
    │       ├── description.txt
    │       ├── log.txt
    │       ├── iteration_1.txt
    │       └── ...
    └── [code files].txt  # Files saved via /CODE or natural language
```

---
## Requirements

- **Compiler** with C++17 support (g++ 7+ or clang 5+)
- **libcurl** development libraries
- **nlohmann/json** (header-only JSON library)
- **Ollama** installed and running
- A model pulled in Ollama (default: `qwen2.5:0.5b` – configurable via the `OLLAMA_MODEL` environment variable)

*Note: PENCILCLAW uses POSIX system calls (`fork`, `pipe`, `execvp`). It runs on Linux, macOS, and Windows Subsystem for Linux (WSL).*

---
## Installation

### 1. Install System Dependencies
```bash
sudo apt update
sudo apt install -y build-essential libcurl4-openssl-dev
```

### 2. Install nlohmann/json
The library is header-only; simply download `json.hpp` and place it in your include path, or install it via your package manager:
```bash
sudo apt install -y nlohmann-json3-dev
```

### 3. Install Ollama
```bash
curl -fsSL https://ollama.com/install.sh | sh
ollama serve &            # start the service
ollama pull qwen2.5:0.5b  # or your preferred model
```

#### Set the Model (Optional)

Override the default model by setting the environment variable:
```bash
export OLLAMA_MODEL="llama3.2:latest"
```

### 4. Change to the Project Directory
```bash
cd ~/pencilclaw/  # the folder containing the source files
```

### 5. Compile PENCILCLAW
```bash
g++ -std=c++17 -o pencilclaw pencilclaw.cpp -lcurl
```
If `json.hpp` is in a non-standard location, add the appropriate `-I` flag.

---
## Usage

Start the program:
```bash
./pencilclaw
```

You will see the `>` prompt. Commands are case-sensitive and start with `/`. Any line not starting with `/` is treated as natural language and passed to Ollama.

### Available Commands

| Command | Description |
|-----------------------|------------------------------------------------------------------------|
| `/HELP` | Show this help message. |
| `/CODE <idea>` | Generate C++ code for the given idea; saved as `<sanitized_idea>.txt`. |
| `/TASK <description>` | Start a new autonomous coding task (creates a timestamped folder). |
| `/TASK_STATUS` | Show the current active task, its folder, and iteration count. |
| `/STOP_TASK` | Clear the active task (does not delete existing task files). |
| `/EXECUTE` | Compile and run the first C++ code block from the last AI output. |
| `/FILES` | List all saved `.txt` files and task folders. |
| `/DEBUG` | Toggle verbose debug output (shows JSON requests/responses). |
| `/EXIT` | Quit the program. |

### Natural Language Examples

- `write code for a fibonacci function`
- `start a task to build a calculator`
- `save it as mycode.txt` (after code generation)

---
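Before `/EXECUTE` can compile anything, it has to pull the first fenced code block out of the model's reply. A minimal sketch of that step is shown below; the helper name and exact parsing rules are assumptions, not PENCILCLAW's actual parser:

```cpp
#include <cstddef>
#include <string>

// Return the body of the first ``` fenced block in a model reply, or an
// empty string when no complete block exists. Works for both ```cpp and
// plain ``` fences by skipping everything up to the end of the fence line.
std::string first_code_block(const std::string& reply) {
    const std::string fence = "```";
    std::size_t open = reply.find(fence);
    if (open == std::string::npos) return "";
    // Skip the optional language tag on the opening fence line.
    std::size_t body = reply.find('\n', open);
    if (body == std::string::npos) return "";
    ++body;
    std::size_t close = reply.find(fence, body);
    if (close == std::string::npos) return "";   // unterminated block
    return reply.substr(body, close - body);
}
```

The extracted text would then be written to a temporary `.cpp` file and compiled, but only after the user confirms (see Security Notes).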
## Git Integration

PENCILCLAW automatically initialises a Git repository inside `pencil_data/` on first run. Every file saved via `/CODE` or a task iteration is committed with a descriptive message. The repository is configured with a local identity (`pencilclaw@local` / `PencilClaw`), so commits work even without a global Git configuration.

If you prefer not to use Git, simply remove the `.git` folder from `pencil_data/` – PENCILCLAW will detect its absence and skip all Git operations.

---
## Security Notes

- **Code execution is potentially dangerous.** PENCILCLAW always shows the code and requires you to type `yes` before running it.
- **Path traversal is prevented** – filenames are sanitised, and all writes are confined to `pencil_data/`.
- **No shell commands are used** – all external commands (`git`, `g++`) are invoked via `fork` + `execvp` with argument vectors, eliminating command injection risks.

---
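The last two bullets can be illustrated with a short sketch. The function names are illustrative, not the actual PENCILCLAW internals; the idea is that filenames are reduced to a conservative character set, and external programs receive an argument vector rather than a shell string:

```cpp
#include <cctype>
#include <string>
#include <vector>
#include <sys/wait.h>
#include <unistd.h>

// Illustrative sanitiser: keep only safe characters so a name like
// "../../etc/passwd" can never resolve outside pencil_data/.
std::string sanitize_filename(const std::string& raw) {
    std::string out;
    for (char c : raw) {
        if (std::isalnum(static_cast<unsigned char>(c)) ||
            c == '.' || c == '_' || c == '-')
            out += c;
        else if (c == ' ' || c == '/')
            out += '_';               // separators become underscores
    }
    if (out.find_first_not_of('.') == std::string::npos)
        out = "file";                 // refuse "", "." and ".."
    return out;
}

// Illustrative runner: fork + execvp with an argument vector. No shell ever
// parses these strings, so metacharacters in AI output cannot inject commands.
int run_command(const std::vector<std::string>& args) {
    std::vector<char*> argv;
    for (const auto& a : args)
        argv.push_back(const_cast<char*>(a.c_str()));
    argv.push_back(nullptr);

    pid_t pid = fork();
    if (pid == 0) {                   // child: become the requested program
        execvp(argv[0], argv.data());
        _exit(127);                   // reached only if execvp failed
    }
    int status = 0;
    waitpid(pid, &status, 0);
    return WIFEXITED(status) ? WEXITSTATUS(status) : -1;
}
```

Because the child receives `argv` directly, an argument like `foo; rm -rf /` is just an ordinary (probably failing) string, never an executed command.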
## Configuration

| Setting | Method |
|---------------------|-----------------------------------------------------------------|
| Ollama model | Environment variable `OLLAMA_MODEL` (default: `qwen2.5:0.5b`) |
| Workspace directory | Environment variable `PENCIL_DATA` (default: `./pencil_data/`) |
| Heartbeat interval | Edit `HEARTBEAT_INTERVAL` in the source (default: 120 seconds) |
| Keep-alive interval | Edit `KEEP_ALIVE_INTERVAL` in the source (default: 120 seconds) |

---
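The two environment-variable settings can be resolved at startup with a small helper: `std::getenv` returns `nullptr` when a variable is unset, so each setting falls back to its documented default. (`env_or` is a hypothetical name used for illustration.)

```cpp
#include <cstdlib>
#include <string>

// Return the value of an environment variable, or a fallback when unset.
std::string env_or(const char* name, const std::string& fallback) {
    const char* value = std::getenv(name);
    return value ? std::string(value) : fallback;
}

// Usage at startup, matching the defaults in the table above:
// const std::string model     = env_or("OLLAMA_MODEL", "qwen2.5:0.5b");
// const std::string workspace = env_or("PENCIL_DATA", "./pencil_data/");
```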
## Troubleshooting

| Problem | Solution |
|-------------------------------------------|----------------------------------------------------------------|
| `json.hpp: No such file or directory` | Install nlohmann/json or add the correct `-I` flag. |
| `curl failed: Couldn't connect to server` | Ensure Ollama is running (`ollama serve`) and that `http://localhost:11434` is reachable. |
| Model not found | Run `ollama pull <model_name>` (e.g., `qwen2.5:0.5b`). |
| Git commit fails | PENCILCLAW sets a local identity automatically, so this should be rare. If it happens, run `git config user.name` and `git config user.email` manually inside `pencil_data/`. |
| Compilation errors (C++17) | Use a compiler that supports `-std=c++17` (g++ 7+ or clang 5+). |

---
## License

This project is released under the MIT License. Built with C++ and Ollama.