Bichrai committed
Commit 0369aac · verified · 1 Parent(s): ead5630

Chess Challenge submission by Bichrai

Files changed (3):
  1. README.md +2 -2
  2. config.json +3 -2
  3. model.safetensors +2 -2
README.md CHANGED

@@ -14,13 +14,13 @@ Chess model submitted to the LLM Course Chess Challenge.
 ## Submission Info
 
 - **Submitted by**: [Bichrai](https://huggingface.co/Bichrai)
-- **Parameters**: 821,296
+- **Parameters**: 876,544
 - **Organization**: LLM-course
 
 ## Model Details
 
 - **Architecture**: Chess Transformer (GPT-style)
 - **Vocab size**: 1682
-- **Embedding dim**: 112
+- **Embedding dim**: 128
 - **Layers**: 5
 - **Heads**: 4
config.json CHANGED

@@ -6,11 +6,12 @@
   "dropout": 0.1,
   "dtype": "float32",
   "eos_token_id": 2,
+  "layer_norm_epsilon": 1e-05,
   "model_type": "chess_transformer",
   "n_ctx": 256,
-  "n_embd": 112,
+  "n_embd": 128,
   "n_head": 4,
-  "n_inner": 336,
+  "n_inner": 256,
   "n_layer": 5,
   "pad_token_id": 0,
   "rms_norm_eps": 1e-06,
model.safetensors CHANGED

@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:5f2ecf2379e1f892827eb5d85dc3f5422614d4e0b1d66ce350e6928b00bce750
-size 3289680
+oid sha256:13c9467fbfc32654d0a4b656df524a9a0010009519786db73d3fa8e48847aba2
+size 3510672
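The jump in the reported parameter count (821,296 → 876,544) follows arithmetically from the config change (n_embd 112 → 128, n_inner 336 → 256). A minimal sketch that reproduces both figures, under assumptions inferred from the numbers rather than confirmed by the repo: tied input/output embeddings, no learned positional table, linear layers with biases, and RMSNorm weight vectors (two per layer plus a final one):

```python
def chess_transformer_params(vocab: int, n_embd: int, n_layer: int, n_inner: int) -> int:
    """Estimate the parameter count of the chess transformer.

    Assumptions (inferred, not confirmed by the repo): tied input/output
    embeddings, no learned positional embedding table, biased linear
    layers, and two RMSNorm weights per layer plus one final norm.
    """
    tok_emb = vocab * n_embd                       # token embedding (tied with output head)
    attn = 3 * n_embd * n_embd + 3 * n_embd        # fused QKV projection + bias
    attn += n_embd * n_embd + n_embd               # attention output projection + bias
    mlp = n_embd * n_inner + n_inner               # MLP up-projection + bias
    mlp += n_inner * n_embd + n_embd               # MLP down-projection + bias
    norms = 2 * n_embd                             # two RMSNorm weight vectors per layer
    return tok_emb + n_layer * (attn + mlp + norms) + n_embd  # final norm

old = chess_transformer_params(1682, 112, 5, 336)  # config before this commit
new = chess_transformer_params(1682, 128, 5, 256)  # config after this commit
print(old, new)  # 821296 876544
```

Both float32 checkpoints are also consistent with these counts: each model.safetensors file is exactly 4,496 bytes larger than 4 × its parameter count (3,289,680 vs. 4 × 821,296 = 3,285,184 before; 3,510,672 vs. 4 × 876,544 = 3,506,176 after), with the constant overhead attributable to the safetensors header.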