---
license: apache-2.0
base_model: mlx-community/Qwen3-1.7B-4bit
tags:
- mlx
- lora
- doer
- qwen3
- unix
library_name: mlx
---
# cagataydev/doer
The **default checkpoint** for [doer](https://github.com/cagataycali/doer) —
a one-file, pipe-native, self-aware Unix agent.
## what
A LoRA fine-tune of `mlx-community/Qwen3-1.7B-4bit` that knows:
- what doer is, its architecture, its SOUL (creed)
- all `DOER_*` env vars and their defaults
- how to train, upload, round-trip data via `--train*` / `--upload-hf`
- the design rules: one file, lean deps, context over memory, unix over RPC,
env vars over config files
- how to use doer with images, audio, video (mlx-vlm routing)
- provider auto-detection (bedrock → mlx → ollama)
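Auto-detection can be overridden per run. A minimal sketch using only the
`DOER_*` vars documented here (the `ollama` value follows the detection order
above):
```bash
# force a specific provider instead of relying on auto-detection
DOER_PROVIDER=ollama doer "what is doer"

# pin the mlx provider to this checkpoint explicitly
DOER_PROVIDER=mlx DOER_MLX_MODEL=cagataydev/doer doer "what is doer"
```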
## use
```bash
pip install 'doer-cli[mlx]'
# point at this checkpoint
DOER_PROVIDER=mlx \
DOER_MLX_MODEL=cagataydev/doer \
doer "what is doer"
```
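Since doer is pipe-native, context can also arrive on stdin. A hedged sketch,
assuming piped input is treated as context for the query:
```bash
# stdin as context (doer is pipe-native)
git diff | DOER_PROVIDER=mlx DOER_MLX_MODEL=cagataydev/doer \
  doer "summarize these changes"
```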
Future doer builds default `DOER_MLX_MODEL` to `cagataydev/doer`, so:
```bash
pip install 'doer-cli[mlx]'
doer "what is doer" # auto-pulls this checkpoint on first run
```
## training
- **base**: `mlx-community/Qwen3-1.7B-4bit`
- **data**: [cagataydev/doer-training](https://huggingface.co/datasets/cagataydev/doer-training)
(fat, self-contained records: `{ts, query, system, messages, tools}`)
- **method**: LoRA via `mlx_lm.tuner`, 8 layers, rank 8, scale 20
- **fused**: `mlx_lm.fuse --de-quantize` → re-quantized to 4-bit
Trained on self-generated Q/A turns about doer itself — the model learns its
own source, its own prompt, its own philosophy.
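A minimal sketch of that pipeline with stock `mlx-lm` tooling — not doer's
actual training script. Flag and config names (`num_layers`, `--de-quantize`,
`--q-bits`) reflect recent `mlx-lm` releases and may differ in older versions;
the sample record shape is inferred from the keys listed above, with `mlx_lm`'s
chat loader assumed to key off `messages` and ignore the rest:
```bash
pip install mlx-lm

# one fat, self-contained record per line (shape inferred from the dataset card)
mkdir -p data
cat > data/train.jsonl <<'EOF'
{"ts": "2025-01-01T00:00:00Z", "query": "what is doer", "system": "...", "messages": [{"role": "user", "content": "what is doer"}, {"role": "assistant", "content": "a one-file pipe-native self-aware Unix agent"}], "tools": []}
EOF
cp data/train.jsonl data/valid.jsonl   # mlx_lm.lora also wants a valid split

# LoRA hyperparameters live in a YAML config for mlx_lm.lora
cat > lora_config.yaml <<'EOF'
model: mlx-community/Qwen3-1.7B-4bit
train: true
data: ./data          # expects train.jsonl / valid.jsonl here
num_layers: 8         # 8 layers
lora_parameters:
  rank: 8             # rank 8
  scale: 20.0         # scale 20
EOF

mlx_lm.lora --config lora_config.yaml --adapter-path ./adapters

# fuse the adapters into a de-quantized base...
mlx_lm.fuse --model mlx-community/Qwen3-1.7B-4bit \
  --adapter-path ./adapters --save-path ./fused --de-quantize

# ...then re-quantize the fused model to 4-bit
mlx_lm.convert --hf-path ./fused --mlx-path ./doer-4bit -q --q-bits 4
```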