webxos committed on
Commit d24d3af · verified · 1 parent: 54b2e09

Delete README.md

Files changed (1)
  1. README.md +0 -173
README.md DELETED
@@ -1,173 +0,0 @@
---
license: mit
language:
- en
pipeline_tag: text-generation
library_name: cpp
tags:
- creative-writing
- ollama
- code-execution
- productivity
- offline-ai
- local-llm
---

# Pencilclaw v1.0 (Testing) ✏️

```
██████╗ ███████╗███╗   ██╗ ██████╗██╗██╗      ██████╗██╗      █████╗ ██╗    ██╗
██╔══██╗██╔════╝████╗  ██║██╔════╝██║██║     ██╔════╝██║     ██╔══██╗██║    ██║
██████╔╝█████╗  ██╔██╗ ██║██║     ██║██║     ██║     ██║     ███████║██║ █╗ ██║
██╔═══╝ ██╔══╝  ██║╚██╗██║██║     ██║██║     ██║     ██║     ██╔══██║██║███╗██║
██║     ███████╗██║ ╚████║╚██████╗██║███████╗╚██████╗███████╗██║  ██║╚███╔███╔╝
╚═╝     ╚══════╝╚═╝  ╚═══╝ ╚═════╝╚═╝╚══════╝ ╚═════╝╚══════╝╚═╝  ╚═╝ ╚══╝╚══╝ 
```

**PENCILCLAW** is a C++ agent harness that turns your local [Ollama](https://ollama.com/) instance into a creative-writing partner that can execute the C++ code it generates. It offers a simple ADA-style command interface - perfect for writers, tinkerers, and AI enthusiasts who want to keep their data private and their workflows offline.

---

## Features

- **Story & Poem Generation** - Use `/STORY` or `/POEM` with a title/subject to get creative text from your local LLM.
- **Book Continuation** - The `/BOOK` command appends new chapters to a running `book.txt`, maintaining context from previous content.
- **Code Execution** - If the AI responds with a C++ code block (triple backticks), `/EXECUTE` compiles and runs it - ideal for prototyping or exploring AI-generated algorithms.
- **Session Logging** - All interactions are saved in `pencil_data/session.log` for later reference.
- **Workspace Isolation** - Everything lives in the `./pencil_data/` folder; temporary files are cleaned up after execution.
- **Security Awareness** - Includes filename sanitisation and a confirmation prompt before running any AI-generated code.

---

## Project Structure

All necessary files for PENCILCLAW are contained within the `/home/kali/pencilclaw/` directory. Below is the complete tree:

```
/home/kali/pencilclaw/
├── pencilclaw.cpp         # Main program source
├── pencil_utils.hpp       # Workspace and template helpers
├── pencilclaw             # Compiled executable (after build)
└── pencil_data/           # Created automatically on first run
    ├── session.log        # Full interaction log
    ├── book.txt           # Accumulated book chapters
    ├── temp_code.cpp      # Temporary source file (deleted after execution)
    ├── temp_code          # Temporary executable (deleted after execution)
    └── [story/poem files] # Individual .txt files for each /STORY or /POEM
```

**The `pencil_data` directory is created automatically when you run the program. All generated content and logs reside there.**

---

## Requirements

- **libcurl** development libraries
- **cJSON** library
- **Ollama** installed and running
- A model pulled in Ollama (default: `qwen2.5:0.5b` - change in source if desired)

---

## Installation

### 1. Install System Dependencies
```bash
sudo apt update
sudo apt install -y build-essential libcurl4-openssl-dev
```

### 2. Install cJSON
On Debian-based systems, `sudo apt install -y libcjson-dev` may be sufficient. If your distribution does not provide a package, build from source:
```bash
git clone https://github.com/DaveGamble/cJSON.git
cd cJSON
mkdir build && cd build
cmake ..
make
sudo make install
sudo ldconfig
cd ../..
```

### 3. Install Ollama
```bash
curl -fsSL https://ollama.com/install.sh | sh
ollama serve &           # start the service
ollama pull qwen2.5:0.5b # or another model of your choice
```

### Custom Models

Edit line 36 of `pencilclaw.cpp`:

```cpp
// Model name – change this to match your installed model (e.g., "llama3", "qwen2.5", "mistral")
const std::string MODEL_NAME = "qwen2.5:0.5b";
```

### 4. Compile PENCILCLAW
Place the source files in the same directory and compile:
```bash
g++ -std=c++17 -o pencilclaw pencilclaw.cpp -lcurl -lcjson
```
If the cJSON headers are in a non-standard location (e.g., `/usr/local/include/cjson`), add the appropriate `-I` flag:
```bash
g++ -std=c++17 -o pencilclaw pencilclaw.cpp -lcurl -lcjson -I/usr/local/include/cjson
```

---

## Usage

Start the program:
```bash
./pencilclaw
```

You will see the `>` prompt. Commands are case-sensitive and start with `/`.

### Available Commands

| Command           | Description                                                                |
|-------------------|----------------------------------------------------------------------------|
| `/HELP`           | Show this help message.                                                    |
| `/STORY <title>`  | Generate a short story with the given title. Saved as `<title>.txt`.       |
| `/POEM <subject>` | Compose a poem about the subject. Saved as `<subject>.txt`.                |
| `/BOOK <chapter>` | Append a new chapter to `book.txt` (creates the file if it doesn't exist). |
| `/EXECUTE`        | Compile and run the first C++ code block from the last AI response.        |
| `/DEBUG`          | Toggle verbose debug output (shows JSON requests/responses).               |
| `/EXIT`           | Quit the program.                                                          |

Any line not starting with `/` is sent directly to Ollama as a free prompt; the response is displayed and logged.

---

## Security Notes

- **Code execution is a powerful feature.** PENCILCLAW asks for confirmation before running any AI-generated code. Always review the code if you are unsure.
- **Filename sanitisation** prevents path traversal attacks (e.g., `../../etc/passwd` becomes `____etc_passwd`).
- All operations are confined to the `pencil_data` subdirectory; no system-wide changes are made.

---

## Customisation

- **Model**: Change the `MODEL_NAME` constant in `pencilclaw.cpp` to use a different Ollama model.
- **Prompts**: Edit the templates in `pencil_utils.hpp` (`get_template` function) to adjust the AI's behaviour.
- **Timeout**: The default HTTP timeout is 60 seconds. Adjust `CURLOPT_TIMEOUT` in the source if needed.

---

## Troubleshooting

| Problem                              | Solution                                                           |
|--------------------------------------|--------------------------------------------------------------------|
| `cJSON.h: No such file or directory` | Install cJSON or add the correct `-I` flag during compilation.     |
| `curl failed: Timeout was reached`   | Ensure Ollama is running (`ollama serve`) and the model is pulled. |
| Model not found                      | Run `ollama pull <model_name>` (e.g., `qwen2.5:0.5b`).             |
| Compilation errors (C++17)           | Use a compiler that supports `-std=c++17` (g++ 7+ or clang 5+).    |

---

## License

This project is released under the MIT License. Built with C++ and Ollama.