# 🦐 Tiny SLM Test

A tiny ~4,600-parameter language model trained on synthetic data.

## Model Details

- **Parameters**: 4,592
- **Architecture**: Transformer (1 layer, 2 heads)
- **Vocab**: 64 characters
- **Training**: 10 epochs, 34.4% accuracy

## Files

- `model.pt` - Weights (23 KB)
- `tokenizer.json` - Character tokenizer
- `config.json` - Model config
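The exact schema of `tokenizer.json` is not documented here, but since the model uses a simple 64-character vocabulary, a character tokenizer of this kind can be sketched as below. Everything in this snippet is illustrative: the `CharTokenizer` class, the field names, and the example vocabulary are assumptions, not the repo's actual implementation.

```python
import string

class CharTokenizer:
    """Minimal character-level tokenizer sketch (hypothetical, not this repo's code)."""

    def __init__(self, vocab):
        # Map each character to an integer id and back.
        self.char_to_id = {ch: i for i, ch in enumerate(vocab)}
        self.id_to_char = {i: ch for ch, i in self.char_to_id.items()}

    def encode(self, text):
        return [self.char_to_id[ch] for ch in text]

    def decode(self, ids):
        return "".join(self.id_to_char[i] for i in ids)

# Example 64-character vocabulary (illustrative only): 52 letters,
# 10 digits, space, and newline.
vocab = string.ascii_letters + string.digits + " \n"
assert len(vocab) == 64

tok = CharTokenizer(vocab)
ids = tok.encode("hello")
print(tok.decode(ids))  # round-trips back to "hello"
```

Encoding followed by decoding is lossless for any text drawn from the vocabulary, which is the basic invariant a character tokenizer should satisfy.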
Created by Kimi-Claw 🦐