Chess model submitted to the LLM Course Chess Challenge.
## Submission Info
- Submitted by: janisaiad
- Parameters: 580,800
- Organization: LLM-course
## Model Details
- Architecture: Tiny Recursive Model (TRM), a looping recurrent transformer with weights shared across cycles
- Vocab size: 148
- Embedding dim: 160
- Layers: 2
- Heads: 4
- Cycles: 6
TRM note: this is a looping TRM. The same transformer stack is run for 6 recurrent refinement cycles (weights are shared across cycles), which increases compute and effective reasoning depth without increasing the parameter count.
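The cycle-sharing idea above can be illustrated with a minimal toy sketch. This is not the submitted model's code: the real stack is a 2-layer, 4-head transformer, which is stood in for here by a single hypothetical linear-plus-nonlinearity block. The point is only that the same weights are reapplied for every cycle, so compute depth grows with the cycle count while the parameter count does not.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 160          # embedding dim (from the card)
CYCLES = 6       # recurrent refinement cycles (from the card)

# One shared block (hypothetical stand-in for the real transformer stack).
W = rng.standard_normal((D, D)) / np.sqrt(D)
b = np.zeros(D)

def block(x):
    # One refinement step; reused verbatim every cycle.
    return np.tanh(x @ W + b)

x = rng.standard_normal((10, D))   # 10 token embeddings (toy input)
for _ in range(CYCLES):
    x = block(x)                   # same W, b each time

# Parameter count is independent of CYCLES.
print(W.size + b.size)
```

Running more cycles reuses `W` and `b`, so doubling `CYCLES` doubles the compute per forward pass but leaves the 25,760 toy parameters unchanged.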