---
library_name: transformers
tags:
- chess
- llm-course
- chess-challenge
license: mit
---
# chess-littletestmodel

A small chess model trained for the LLM Course Challenge.

- **By**: [MDaytek](https://huggingface.co/MDaytek)
- **Params**: 790,560
- **Architecture**: GPT-2 with a custom tokenizer

## Usage

The model uses a custom tokenizer vocabulary (`vocab.json`) rather than a standard `AutoTokenizer`, so it has to be loaded manually:

```python
import json

from huggingface_hub import hf_hub_download
from transformers import AutoConfig, GPT2LMHeadModel

repo_id = "LLM-course/chess-littletestmodel"

# Load the config and model weights from the Hub
config = AutoConfig.from_pretrained(repo_id)
model = GPT2LMHeadModel.from_pretrained(repo_id, config=config)

# vocab.json lives in the model repo; download it before opening
vocab_path = hf_hub_download(repo_id, "vocab.json")
with open(vocab_path) as f:
    vocab = json.load(f)  # maps token strings to ids
```
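
Since there is no tokenizer class, encoding and decoding are plain dictionary lookups over `vocab`. Here is a minimal sketch using a toy vocabulary; the token set shown (special tokens plus UCI-style move strings) is an illustrative assumption, not the model's actual vocabulary:

```python
# Toy stand-in for the downloaded vocab (token string -> id).
# The real mapping comes from the repo's vocab.json.
vocab = {"<pad>": 0, "<bos>": 1, "e2e4": 2, "e7e5": 3, "g1f3": 4}
id2token = {i: t for t, i in vocab.items()}

def encode(tokens):
    # Look up each token string's id in the vocabulary
    return [vocab[t] for t in tokens]

def decode(ids):
    # Map ids back to token strings via the inverted vocabulary
    return [id2token[i] for i in ids]

ids = encode(["<bos>", "e2e4", "e7e5"])
print(ids)          # [1, 2, 3]
print(decode(ids))  # ['<bos>', 'e2e4', 'e7e5']
```

Feed the resulting id list to the model as a `torch.LongTensor` of shape `(1, sequence_length)` when generating.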