Byte-level code translation between Python, JavaScript, and Rust. 15.2M parameters, perplexity 1.86, 256-byte context window.
| Property | Value |
|---|---|
| Architecture | Multi-Scale Transformer |
| d_model | ? |
| Attention Heads | ? |
| Layers per Scale | ? |
| Context Window | 256 bytes |
| Downsample Factors | [1, 2, 4] |
| Vocab Size | 258 (byte-level) |
| Optimizer | SGD |
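The vocab size of 258 suggests the 256 raw byte values plus two special tokens. A minimal tokenizer sketch under that assumption (the specific ids 256/257 and their BOS/EOS roles are not stated in the table):

```python
# Sketch of a byte-level tokenizer consistent with vocab size 258:
# 256 raw byte values plus two special tokens. The ids 256 (BOS) and
# 257 (EOS) are assumptions; the model card does not specify them.
BOS_ID = 256   # assumed start-of-sequence token
EOS_ID = 257   # assumed end-of-sequence token
CONTEXT = 256  # context window in bytes, from the table

def encode(text: str) -> list[int]:
    """UTF-8 bytes framed with the assumed special tokens, truncated to the context."""
    ids = [BOS_ID] + list(text.encode("utf-8")) + [EOS_ID]
    return ids[:CONTEXT]

def decode(ids: list[int]) -> str:
    """Drop special tokens and decode the remaining bytes."""
    return bytes(i for i in ids if i < 256).decode("utf-8", errors="replace")

ids = encode("def fibonacci():")
print(decode(ids))  # def fibonacci():
```

Byte-level vocabularies avoid any tokenizer training and handle all three source languages uniformly, at the cost of longer sequences per unit of code.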

| Metric | Value |
|---|---|
| Final Loss | 0.6703 |
| Perplexity | 1.86 |
| Training Steps | 3942 |
| Training Time | 10 min |
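Perplexity is the exponential of the mean cross-entropy loss in nats. Note that exp(0.6703) ≈ 1.95, slightly above the reported 1.86, so the two figures were presumably measured on different data or averaging windows:

```python
import math

def perplexity(nats_loss: float) -> float:
    """Perplexity from mean cross-entropy loss measured in nats."""
    return math.exp(nats_loss)

# Reported final loss from the table:
print(round(perplexity(0.6703), 2))  # 1.95
```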
```shell
ollama create axl-translate -f Modelfile
ollama run axl-translate "def fibonacci():"
```
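The `ollama create` command above needs a Modelfile. A minimal sketch, assuming the quantized GGUF filename below (not given in this document) and illustrative parameter values:

```
FROM ./axl-translate-q4_k_m.gguf
PARAMETER num_ctx 256
PARAMETER temperature 0.2
```

`num_ctx 256` matches the model's 256-byte context window; a low temperature is a common choice for code completion, not a value taken from this card.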
| File | Size | Format |
|---|---|---|
| F16 GGUF | 31 MB | Full precision |
| Q4_K_M GGUF | 18 MB | 4-bit quantized |
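As a sanity check on the F16 size: 15.2M parameters at 2 bytes each is about 30.4 MB, consistent with the 31 MB file once GGUF metadata is included. The Q4_K_M figure exceeds a naive pure-4-bit estimate, plausibly because Q4_K_M keeps some tensors at higher precision:

```python
PARAMS = 15.2e6  # parameter count from the summary

# F16 stores 2 bytes per parameter; the small gap to the 31 MB file
# is GGUF header/metadata overhead.
f16_mb = PARAMS * 2 / 1e6
print(round(f16_mb, 1))  # 30.4

# Naive estimate at ~4.5 bits per parameter. The actual 18 MB file is
# larger, likely because Q4_K_M mixes in higher-precision blocks
# (e.g. Q6_K for selected tensors) plus metadata.
naive_q4_mb = PARAMS * 4.5 / 8 / 1e6
print(naive_q4_mb)
```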