Chess Challenge submission by janisaiad
---
library_name: transformers
tags:
  - chess
  - llm-course
  - chess-challenge
license: mit
---

Chess model submitted to the LLM Course Chess Challenge.

Submission Info

  • Submitted by: janisaiad
  • Parameters: 43,104
  • Organization: LLM-course

Model Details

  • Architecture: Tiny Recursive Model (TRM) - looping recurrent transformer (cycle-shared weights)
  • Vocab size: 148
  • Embedding dim: 48
  • Layers: 1
  • Heads: 2
  • Cycles: 2

TRM note: this is a looping TRM model. At inference and training time, the same transformer stack is run for 2 recurrent refinement cycles, with weights shared across cycles. This increases effective compute and reasoning depth without increasing the parameter count.
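The cycle-shared recurrence above can be sketched as follows. This is a minimal illustration, not the model's actual implementation: the transformer block is replaced by a stand-in (linear map plus residual), and only the dimensions from the card (embedding dim 48, 2 cycles) are taken from the source.

```python
import numpy as np

rng = np.random.default_rng(0)
EMBED_DIM = 48  # embedding dim from the model card
CYCLES = 2      # recurrent refinement cycles from the model card

# One shared weight matrix, reused on every cycle: the parameter
# count stays fixed no matter how many cycles we run.
W = rng.standard_normal((EMBED_DIM, EMBED_DIM)) * 0.02

def block(x, W):
    # Stand-in for the transformer layer: linear map + residual.
    return x + np.tanh(x @ W)

def trm_forward(x, W, cycles=CYCLES):
    # Looping TRM: apply the SAME block `cycles` times,
    # refining the representation on each pass.
    for _ in range(cycles):
        x = block(x, W)
    return x

x = rng.standard_normal((10, EMBED_DIM))  # 10 token embeddings
y = trm_forward(x, W)
print(y.shape)  # -> (10, 48)
```

Running 2 cycles is equivalent to applying the block twice with the same weights, which is why depth grows while the 43,104-parameter budget does not.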