# Pencilclaw v1.0 (Testing) ✏️
---
```
██████╗ ███████╗███╗   ██╗ ██████╗██╗██╗      ██████╗██╗      █████╗ ██╗    ██╗
██╔══██╗██╔════╝████╗  ██║██╔════╝██║██║     ██╔════╝██║     ██╔══██╗██║    ██║
██████╔╝█████╗  ██╔██╗ ██║██║     ██║██║     ██║     ██║     ███████║██║ █╗ ██║
██╔═══╝ ██╔══╝  ██║╚██╗██║██║     ██║██║     ██║     ██║     ██╔══██║██║███╗██║
██║     ███████╗██║ ╚████║╚██████╗██║███████╗╚██████╗███████╗██║  ██║╚███╔███╔╝
╚═╝     ╚══════╝╚═╝  ╚═══╝ ╚═════╝╚═╝╚══════╝ ╚═════╝╚══════╝╚═╝  ╚═╝ ╚══╝╚══╝
```
**PENCILCLAW** is a C++ command-line tool that turns your local [Ollama](https://ollama.com/) instance into a creative writing partner with the ability to execute generated C++ code. It follows a simple ADA-style command interface - perfect for writers, tinkerers, and AI enthusiasts who want to keep their data private and their workflows offline.

---

## Features

- **Story & Poem Generation** - Use `/STORY` or `/POEM` with a title/subject to get creative text from your local LLM.
- **Book Continuation** - The `/BOOK` command appends new chapters to a running `book.txt`, maintaining context from previous content.
- **Code Execution** - If the AI responds with a C++ code block (triple backticks), `/EXECUTE` compiles and runs it - ideal for prototyping or exploring AI-generated algorithms.
- **Session Logging** - All interactions are saved in `pencil_data/session.log` for later reference.
- **Workspace Isolation** - Everything lives in the `./pencil_data/` folder; temporary files are cleaned up after execution.
- **Security Awareness** - Includes filename sanitisation and a confirmation prompt before running any AI-generated code.

---

## Project Structure

All necessary files for PENCILCLAW are contained within the `/home/kali/pencilclaw/` directory. Below is the complete tree:

```
/home/kali/pencilclaw/
├── pencilclaw.cpp       # Main program source
├── pencil_utils.hpp     # Workspace and template helpers
├── pencilclaw           # Compiled executable (after build)
└── pencil_data/         # Created automatically on first run
    ├── session.log         # Full interaction log
    ├── book.txt            # Accumulated book chapters
    ├── temp_code.cpp       # Temporary source file (deleted after execution)
    ├── temp_code           # Temporary executable (deleted after execution)
    └── [story/poem files]  # Individual .txt files for each /STORY or /POEM
```

**The `pencil_data` directory is created automatically when you run the program. All generated content and logs reside there.**

---

## Requirements

- **C++17** compiler (g++ recommended)
- **libcurl** development libraries
- **cJSON** library
- **Ollama** installed and running
- A model pulled in Ollama (default: `qwen2.5:0.5b` - change in source if desired)

---

## Installation

### 1. Install System Dependencies
```bash
sudo apt update
sudo apt install -y build-essential libcurl4-openssl-dev
```

### 2. Install cJSON
If your distribution does not provide a package, build from source:
```bash
git clone https://github.com/DaveGamble/cJSON.git
cd cJSON
mkdir build && cd build
cmake ..
make
sudo make install
sudo ldconfig
cd ../..
```

### 3. Install Ollama
```bash
curl -fsSL https://ollama.com/install.sh | sh
ollama serve &            # start the service
ollama pull qwen2.5:0.5b  # or another model of your choice
```

### 4. Compile PENCILCLAW
Place the source files in the same directory and compile:
```bash
g++ -std=c++17 -o pencilclaw pencilclaw.cpp -lcurl -lcjson
```
If cJSON headers are in a non-standard location (e.g., `/usr/local/include/cjson`), add the appropriate `-I` flag:
```bash
g++ -std=c++17 -o pencilclaw pencilclaw.cpp -lcurl -lcjson -I/usr/local/include/cjson
```

---

## Usage

Start the program:
```bash
./pencilclaw
```

You will see the `>` prompt. Commands are case-sensitive and start with `/`.

### Available Commands
| Command | Description |
|-------------------|------------------------------------------------------------------------------|
| `/HELP` | Show this help message. |
| `/STORY <title>` | Generate a short story with the given title. Saved as `<title>.txt`. |
| `/POEM <subject>` | Compose a poem about the subject. Saved as `<subject>.txt`. |
| `/BOOK <chapter>` | Append a new chapter to `book.txt` (creates the file if it doesn't exist). |
| `/EXECUTE` | Compile and run the first C++ code block from the last AI response. |
| `/DEBUG` | Toggle verbose debug output (shows JSON requests/responses). |
| `/EXIT` | Quit the program. |

Any line not starting with `/` is sent directly to Ollama as a free prompt; the response is displayed and logged.

---

## Security Notes

- **Code execution is a powerful feature.** PENCILCLAW asks for confirmation before running any AI-generated code. Always review the code if you are unsure.
- **Filename sanitisation** prevents path traversal attacks (e.g., `../../etc/passwd` becomes `____etc_passwd`).
- All operations are confined to the `pencil_data` subdirectory; no system-wide changes are made.

---

## Customisation

- **Model**: Change the `MODEL_NAME` constant in `pencilclaw.cpp` to use a different Ollama model.
- **Prompts**: Edit the templates in `pencil_utils.hpp` (`get_template` function) to adjust the AI's behaviour.
- **Timeout**: The default HTTP timeout is 60 seconds. Adjust `CURLOPT_TIMEOUT` in the source if needed.

---

## Troubleshooting

| Problem | Solution |
|--------------------------------------|--------------------------------------------------------------------|
| `cJSON.h: No such file or directory` | Install cJSON or add the correct `-I` flag during compilation. |
| `curl failed: Timeout was reached` | Ensure Ollama is running (`ollama serve`) and the model is pulled. |
| Model not found | Run `ollama pull <model_name>` (e.g., `qwen2.5:0.5b`). |
| Compilation errors (C++17) | Use a compiler that supports `-std=c++17` (g++ 7+ or clang 5+). |

---

## License

This project is released under the MIT License. Built with C++ and Ollama.