```python
# Load model directly
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("LLM-course/Sunxt25", dtype="auto")
```
# Sunxt25

A chess model submitted to the LLM Course Chess Challenge.
## Submission Info
- Submitted by: Sunxt25
- Parameters: 878,336
- Organization: LLM-course
## Model Details
- Architecture: Chess Transformer (GPT-style)
- Vocab size: 144
- Embedding dim: 128
- Layers: 5
- Heads: 4
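The hyperparameters above can be sanity-checked against the listed parameter count. A minimal sketch, assuming PyTorch is available; `count_parameters` is a hypothetical helper (not part of the model repo), demonstrated here on just the token-embedding table implied by the card's vocab size and embedding dim:

```python
import torch.nn as nn


def count_parameters(model: nn.Module) -> int:
    """Total number of trainable parameters in a module."""
    return sum(p.numel() for p in model.parameters() if p.requires_grad)


# Stand-in for one piece of the model: the token embedding table.
# 144 tokens x 128 dims = 18,432 parameters -- a slice of the full
# 878,336-parameter model reported above.
emb = nn.Embedding(144, 128)
print(count_parameters(emb))  # 18432
```

The same helper applied to the downloaded model (`AutoModelForCausalLM.from_pretrained(...)`) should report the full 878,336 figure.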
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="LLM-course/Sunxt25")
```