---
license: apache-2.0
base_model: mlx-community/Qwen3-1.7B-4bit
tags:
- mlx
- lora
- doer
- qwen3
- unix
library_name: mlx
---

# cagataydev/doer
|
|
The **default checkpoint** for [doer](https://github.com/cagataycali/doer):
a one-file, pipe-native, self-aware Unix agent.
|
|
## what
|
|
A LoRA fine-tune of `mlx-community/Qwen3-1.7B-4bit` that knows:
|
|
- what doer is, its architecture, its SOUL (creed)
- all `DOER_*` env vars and their defaults
- how to train, upload, and round-trip data via `--train*` / `--upload-hf`
- the design rules: one file, lean deps, context over memory, unix over RPC,
  env vars over config files
- how to use doer with images, audio, and video (mlx-vlm routing)
- provider auto-detection (bedrock → mlx → ollama)
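The bedrock → mlx → ollama fallback order can be sketched as a priority chain. This is a hypothetical illustration, not doer's actual implementation; the `detect_provider` function, its parameters, and the credential/availability checks are all assumptions.

```python
# Hypothetical sketch of provider auto-detection: first usable backend wins.
# The detection heuristics below are assumptions, not doer's real logic.
import os

def detect_provider(env=os.environ, mlx_available=False, ollama_running=False):
    """Return the first provider that looks usable, in priority order."""
    if env.get("DOER_PROVIDER"):       # explicit env-var override wins
        return env["DOER_PROVIDER"]
    if env.get("AWS_ACCESS_KEY_ID"):   # bedrock: AWS credentials present
        return "bedrock"
    if mlx_available:                  # mlx: local Apple-silicon backend
        return "mlx"
    if ollama_running:                 # ollama: local server reachable
        return "ollama"
    raise RuntimeError("no provider available")
```

Setting `DOER_PROVIDER` short-circuits detection entirely, matching the env-vars-over-config-files rule.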
|
|
## use
|
|
```bash
pip install 'doer-cli[mlx]'

# point at this checkpoint
DOER_PROVIDER=mlx \
DOER_MLX_MODEL=cagataydev/doer \
doer "what is doer"
```
|
|
Future doer builds default to `DOER_MLX_MODEL=cagataydev/doer`, so:
|
|
```bash
pip install 'doer-cli[mlx]'
doer "what is doer"  # auto-pulls this checkpoint on first run
```
|
|
## training
|
|
- **base**: `mlx-community/Qwen3-1.7B-4bit`
- **data**: [cagataydev/doer-training](https://huggingface.co/datasets/cagataydev/doer-training)
  (fat, self-contained records: `{ts, query, system, messages, tools}`)
- **method**: LoRA via `mlx_lm.tuner`, 8 layers, rank 8, scale 20
- **fused**: `mlx_lm.fuse --dequantize`, then re-quantized to 4-bit
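A fat record like the ones above can be flattened into the plain chat-messages shape most tuners consume. A minimal sketch, assuming the listed keys `{ts, query, system, messages, tools}` carry their obvious meanings; the `to_chat` helper and the exact field semantics are assumptions, not the dataset's documented schema.

```python
# Hypothetical sketch: flatten one "fat" doer-training record into a
# chat-format training example. Field semantics are assumptions.
def to_chat(record):
    """Prepend the stored system prompt to the stored conversation turns."""
    msgs = [{"role": "system", "content": record["system"]}]
    msgs += record["messages"]  # prior turns, already role-tagged
    return {"messages": msgs}

record = {
    "ts": "2024-01-01T00:00:00Z",
    "query": "what is doer",
    "system": "You are doer.",
    "messages": [
        {"role": "user", "content": "what is doer"},
        {"role": "assistant", "content": "a pipe-native unix agent"},
    ],
    "tools": [],
}
example = to_chat(record)  # 3 messages: system, user, assistant
```

Keeping each record self-contained (system prompt and tools inlined) is what lets the dataset round-trip without external context.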
|
|
Trained on self-generated Q/A turns about doer itself, so the model learns its
own source, its own prompt, and its own philosophy.
|
|