---
license: apache-2.0
base_model: mlx-community/Qwen3-1.7B-4bit
tags:
  - mlx
  - lora
  - doer
  - qwen3
  - unix
library_name: mlx
---

# cagataydev/doer

The default checkpoint for doer, a one-file, pipe-native, self-aware Unix agent.

## what

A LoRA fine-tune of `mlx-community/Qwen3-1.7B-4bit` that knows:

- what doer is, its architecture, its SOUL (creed)
- all `DOER_*` env vars and their defaults
- how to train, upload, and round-trip data via `--train*` / `--upload-hf`
- the design rules: one file, lean deps, context over memory, Unix over RPC, env vars over config files
- how to use doer with images, audio, and video (mlx-vlm routing)
- provider auto-detection (bedrock → mlx → ollama)
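The bedrock → mlx → ollama fallback above can be sketched in a few lines. This is an illustrative assumption, not doer's actual implementation: the function name `detect_provider` and the detection heuristics (AWS credentials for bedrock, an importable `mlx.core` for mlx, an `ollama` binary on `PATH`) are hypothetical.

```python
import os
import shutil

def detect_provider(env=None):
    """Illustrative provider fallback: explicit override, then bedrock → mlx → ollama."""
    env = os.environ if env is None else env
    if env.get("DOER_PROVIDER"):                # explicit override always wins
        return env["DOER_PROVIDER"]
    if env.get("AWS_ACCESS_KEY_ID") or env.get("AWS_PROFILE"):
        return "bedrock"                        # AWS credentials present
    try:
        import mlx.core  # noqa: F401          # MLX runtime (Apple silicon)
        return "mlx"
    except ImportError:
        pass
    if shutil.which("ollama"):
        return "ollama"                         # local ollama binary on PATH
    raise RuntimeError("no provider available")

print(detect_provider({"DOER_PROVIDER": "mlx"}))  # → mlx
```

Setting `DOER_PROVIDER` short-circuits detection, which is why the usage example below pins it explicitly.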

## use

```bash
pip install 'doer-cli[mlx]'

# point at this checkpoint
DOER_PROVIDER=mlx \
DOER_MLX_MODEL=cagataydev/doer \
doer "what is doer"
```

Future doer builds will default `DOER_MLX_MODEL` to `cagataydev/doer`, so:

```bash
pip install 'doer-cli[mlx]'
doer "what is doer"   # auto-pulls this checkpoint on first run
```

## training

- base: `mlx-community/Qwen3-1.7B-4bit`
- data: `cagataydev/doer-training` (fat, self-contained records: `{ts, query, system, messages, tools}`)
- method: LoRA via `mlx_lm.tuner`, 8 layers, rank 8, scale 20
- fused: `mlx_lm.fuse --dequantize` → re-quantized to 4-bit
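The fuse step merges the low-rank adapter into the base weights. A pure-Python toy sketch of the arithmetic, `W' = W + scale · (B @ A)`, follows; `mlx_lm.fuse` does this on real (quantized) MLX tensors, and scaling conventions vary between libraries (some apply `alpha/rank` instead of the raw scale), so treat the direct-scale form here as an assumption.

```python
# Toy sketch of LoRA fusing: W' = W + scale * (B @ A).
# Illustration only; mlx_lm.fuse operates on MLX weight tensors.
def matmul(B, A):
    return [[sum(B[i][k] * A[k][j] for k in range(len(A)))
             for j in range(len(A[0]))] for i in range(len(B))]

def fuse(W, A, B, scale):
    delta = matmul(B, A)  # out_features x in_features, rank-limited update
    return [[W[i][j] + scale * delta[i][j] for j in range(len(W[0]))]
            for i in range(len(W))]

# 2x2 base weight, rank-1 adapter, scale 20 (the scale used for this model)
W = [[1.0, 0.0], [0.0, 1.0]]
A = [[1.0, 0.0]]            # rank x in_features
B = [[0.0], [1.0]]          # out_features x rank
fused = fuse(W, A, B, scale=20)
print(fused)  # → [[1.0, 0.0], [20.0, 1.0]]
```

Because the base is 4-bit, fusing first dequantizes (so the float delta can be added exactly), then re-quantizes the merged weights, which is what `--dequantize` is for.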

Trained on self-generated Q/A turns about doer itself — the model learns its own source, its own prompt, its own philosophy.
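One such self-contained record might look like the following sketch. Only the field names (`ts`, `query`, `system`, `messages`, `tools`) come from the dataset description above; the field contents are made up for illustration.

```python
import json
import time

# Illustrative "fat" training record; field names from the dataset
# description, contents invented for this example.
record = {
    "ts": int(time.time()),
    "query": "what is doer",
    "system": "You are doer, a one-file pipe-native Unix agent.",
    "messages": [
        {"role": "user", "content": "what is doer"},
        {"role": "assistant",
         "content": "doer is a one-file, pipe-native, self-aware Unix agent."},
    ],
    "tools": [],
}

line = json.dumps(record)          # one JSONL line per record
assert json.loads(line) == record  # round-trips cleanly
print(sorted(record))  # → ['messages', 'query', 'system', 'tools', 'ts']
```

Keeping each record self-contained (system prompt and tools inlined rather than referenced) is what makes the data round-trippable via `--train*` / `--upload-hf`.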