Flagship Lion-optimized model: 322M parameters, perplexity 1.11, 256-byte context. Retrained with the Lion optimizer after an initial SGD run.

| Property | Value |
|---|---|
| Architecture | Multi-Scale Transformer |
| d_model | ? |
| Attention Heads | ? |
| Layers per Scale | ? |
| Context Window | 256 bytes |
| Downsample Factors | [1, 2, 4] |
| Vocab Size | 258 (byte-level) |
| Optimizer | Lion |

| Metric | Value |
|---|---|
| Final Loss | 0.1465 |
| Perplexity | 1.11 |
| Training Steps | 1082 |
| Training Time | 45 min |
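The reported perplexity is consistent with reading the final loss as bits per byte (PPL = 2^loss) rather than nats (exp(0.1465) ≈ 1.16). A quick sanity check (my own arithmetic, not from this card):

```python
import math

final_loss = 0.1465  # final training loss from the table above

# If the loss is in bits per byte, perplexity is 2**loss.
ppl_bits = 2 ** final_loss
# If it were in nats, perplexity would be exp(loss) instead.
ppl_nats = math.exp(final_loss)

print(round(ppl_bits, 2))  # 1.11 -- matches the reported perplexity
print(round(ppl_nats, 2))  # 1.16
```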
```shell
ollama create axl-300m -f Modelfile
ollama run axl-300m "def fibonacci():"
```

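A minimal Modelfile sketch for the `ollama create` step above. The GGUF filename and the sampling parameter are illustrative assumptions, not taken from this card:

```
# Hypothetical filename for the F16 GGUF listed below
FROM ./axl-300m-f16.gguf

# Illustrative sampling setting, not part of this model card
PARAMETER temperature 0.8
```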
| File | Size | Format |
|---|---|---|
| F16 GGUF | 645 MB | 16-bit, unquantized |
| Q4_K_M GGUF | --- | 4-bit quantized |