rebeccaelch committed
Commit ca38808 · verified · 1 Parent(s): 8a2d76f

Chess Challenge submission by rebeccaelch

Files changed (7):
  1. README.md +5 -9
  2. config.json +5 -1
  3. model.py +437 -0
  4. model.safetensors +2 -2
  5. tokenizer_config.json +2 -2
  6. tokenizer_decomposed.py +157 -0
  7. vocab.json +144 -1678
README.md CHANGED
@@ -7,20 +7,16 @@ tags:
 license: mit
 ---
 
-# chess-reb
-
-Chess model submitted to the LLM Course Chess Challenge.
-
-## Submission Info
+## Chess model submitted to the LLM Course Chess Challenge.
 
+### Submission Info
 - **Submitted by**: [rebeccaelch](https://huggingface.co/rebeccaelch)
-- **Parameters**: 909,824
+- **Parameters**: 713,472
 - **Organization**: LLM-course
 
-## Model Details
-
+### Model Details
 - **Architecture**: Chess Transformer (GPT-style)
-- **Vocab size**: 1682
+- **Vocab size**: 148
 - **Embedding dim**: 128
 - **Layers**: 4
 - **Heads**: 4
config.json CHANGED
@@ -2,6 +2,10 @@
   "architectures": [
     "ChessForCausalLM"
   ],
+  "auto_map": {
+    "AutoConfig": "model.ChessConfig",
+    "AutoModelForCausalLM": "model.ChessForCausalLM"
+  },
   "bos_token_id": 1,
   "dropout": 0.1,
   "dtype": "float32",
@@ -16,5 +20,5 @@
   "pad_token_id": 0,
   "tie_weights": true,
   "transformers_version": "4.57.6",
-  "vocab_size": 1682
+  "vocab_size": 148
 }
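The new `vocab_size` of 148, combined with the architecture in the README (4 layers, `n_embd` 128), reproduces the updated parameter count exactly. This sanity check assumes the defaults in model.py for values not shown in this diff (`n_ctx=256`, `n_inner=3*n_embd`, biased linear layers, affine LayerNorms, tied `lm_head`):

```python
# Sanity-check the parameter counts implied by config.json and the README.
# Assumes model.py defaults for values not in the diff: n_ctx=256,
# n_inner=3*n_embd, Linear layers with bias, affine LayerNorms, tied lm_head.

def chess_param_count(vocab_size: int, n_embd: int, n_layer: int, n_ctx: int = 256) -> int:
    n_inner = 3 * n_embd
    wte = vocab_size * n_embd                                               # token embeddings
    wpe = n_ctx * n_embd                                                    # position embeddings
    attn = (n_embd * 3 * n_embd + 3 * n_embd) + (n_embd * n_embd + n_embd)  # c_attn + c_proj
    mlp = (n_embd * n_inner + n_inner) + (n_inner * n_embd + n_embd)        # c_fc + c_proj
    lns = 2 * (2 * n_embd)                                                  # ln_1 + ln_2
    ln_f = 2 * n_embd                                                       # final LayerNorm
    lm_head = 0                                                             # tied with wte
    return wte + wpe + n_layer * (attn + mlp + lns) + ln_f + lm_head

print(chess_param_count(148, 128, 4))   # 713472  -> new README value
print(chess_param_count(1682, 128, 4))  # 909824  -> old README value
```

Both the old (909,824) and new (713,472) README figures fall out of the same formula, so the parameter drop is entirely the smaller vocabulary: (1682 − 148) × 128 = 196,352 fewer embedding weights.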
model.py ADDED
@@ -0,0 +1,437 @@
+"""
+Chess Transformer Model for the Chess Challenge.
+
+This module provides a simple GPT-style transformer architecture
+designed to fit within the 1M parameter constraint.
+
+Key components:
+- ChessConfig: Configuration class for model hyperparameters
+- ChessForCausalLM: The main model class for next-move prediction
+"""
+
+from __future__ import annotations
+
+import math
+from typing import Optional, Tuple, Union
+
+import torch
+import torch.nn as nn
+import torch.nn.functional as F
+from transformers import PretrainedConfig, PreTrainedModel
+from transformers.modeling_outputs import CausalLMOutputWithPast
+
+
+class ChessConfig(PretrainedConfig):
+    """
+    Configuration class for the Chess Transformer model.
+
+    This configuration is designed for a ~1M parameter model.
+    Students can adjust these values to explore different architectures.
+
+    Parameter budget breakdown (with default values):
+    - Embeddings (vocab): 1200 x 128 = 153,600
+    - Position Embeddings: 256 x 128 = 32,768
+    - Transformer Layers: 6 x ~165,400 = ~992,300
+    - LM Head (with weight tying): 0 (shared with embeddings)
+    - Total: ~1,179,000 parameters (over the 1M budget with these
+      defaults; the submitted config uses a smaller vocabulary and
+      fewer layers to stay under it)
+
+    Attributes:
+        vocab_size: Size of the vocabulary (number of unique moves).
+        n_embd: Embedding dimension (d_model).
+        n_layer: Number of transformer layers.
+        n_head: Number of attention heads.
+        n_ctx: Maximum sequence length (context window).
+        n_inner: Feed-forward inner dimension (default: 3 * n_embd).
+        dropout: Dropout probability.
+        layer_norm_epsilon: Epsilon for layer normalization.
+        tie_weights: Whether to tie embedding and output weights.
+    """
+
+    model_type = "chess_transformer"
+
+    def __init__(
+        self,
+        vocab_size: int = 1200,
+        n_embd: int = 128,
+        n_layer: int = 6,
+        n_head: int = 4,
+        n_ctx: int = 256,
+        n_inner: Optional[int] = None,
+        dropout: float = 0.1,
+        layer_norm_epsilon: float = 1e-5,
+        tie_weights: bool = True,
+        pad_token_id: int = 0,
+        bos_token_id: int = 1,
+        eos_token_id: int = 2,
+        **kwargs,
+    ):
+        super().__init__(
+            pad_token_id=pad_token_id,
+            bos_token_id=bos_token_id,
+            eos_token_id=eos_token_id,
+            **kwargs,
+        )
+
+        self.vocab_size = vocab_size
+        self.n_embd = n_embd
+        self.n_layer = n_layer
+        self.n_head = n_head
+        self.n_ctx = n_ctx
+        self.n_inner = n_inner if n_inner is not None else 3 * n_embd  # Reduced from 4x to 3x
+        self.dropout = dropout
+        self.layer_norm_epsilon = layer_norm_epsilon
+        self.tie_weights = tie_weights
+        # Inform HF base class about tying behavior
+        self.tie_word_embeddings = bool(tie_weights)
+
+
+class MultiHeadAttention(nn.Module):
+    """
+    Multi-head self-attention module.
+
+    This is a standard scaled dot-product attention implementation
+    with causal masking for autoregressive generation.
+    """
+
+    def __init__(self, config: ChessConfig):
+        super().__init__()
+
+        assert config.n_embd % config.n_head == 0, \
+            f"n_embd ({config.n_embd}) must be divisible by n_head ({config.n_head})"
+
+        self.n_head = config.n_head
+        self.n_embd = config.n_embd
+        self.head_dim = config.n_embd // config.n_head
+
+        # Combined QKV projection for efficiency
+        self.c_attn = nn.Linear(config.n_embd, 3 * config.n_embd)
+        self.c_proj = nn.Linear(config.n_embd, config.n_embd)
+
+        self.dropout = nn.Dropout(config.dropout)
+
+        # Causal mask, cached once as a non-persistent buffer
+        self.register_buffer(
+            "bias",
+            torch.tril(torch.ones(config.n_ctx, config.n_ctx)).view(
+                1, 1, config.n_ctx, config.n_ctx
+            ),
+            persistent=False,
+        )
+
+    def forward(
+        self,
+        x: torch.Tensor,
+        attention_mask: Optional[torch.Tensor] = None,
+    ) -> torch.Tensor:
+        batch_size, seq_len, _ = x.size()
+
+        # Compute Q, K, V
+        qkv = self.c_attn(x)
+        q, k, v = qkv.split(self.n_embd, dim=2)
+
+        # Reshape for multi-head attention
+        q = q.view(batch_size, seq_len, self.n_head, self.head_dim).transpose(1, 2)
+        k = k.view(batch_size, seq_len, self.n_head, self.head_dim).transpose(1, 2)
+        v = v.view(batch_size, seq_len, self.n_head, self.head_dim).transpose(1, 2)
+
+        # Scaled dot-product attention
+        attn_weights = torch.matmul(q, k.transpose(-2, -1)) / math.sqrt(self.head_dim)
+
+        # Apply causal mask
+        causal_mask = self.bias[:, :, :seq_len, :seq_len]
+        attn_weights = attn_weights.masked_fill(causal_mask == 0, float("-inf"))
+
+        # Apply attention mask (for padding)
+        if attention_mask is not None:
+            # attention_mask shape: (batch_size, seq_len) -> (batch_size, 1, 1, seq_len)
+            attention_mask = attention_mask.unsqueeze(1).unsqueeze(2)
+            attn_weights = attn_weights.masked_fill(attention_mask == 0, float("-inf"))
+
+        attn_weights = F.softmax(attn_weights, dim=-1)
+        attn_weights = self.dropout(attn_weights)
+
+        # Apply attention to values
+        attn_output = torch.matmul(attn_weights, v)
+
+        # Reshape back
+        attn_output = attn_output.transpose(1, 2).contiguous().view(
+            batch_size, seq_len, self.n_embd
+        )
+
+        # Output projection
+        attn_output = self.c_proj(attn_output)
+
+        return attn_output
+
+
+class FeedForward(nn.Module):
+    """
+    Feed-forward network (MLP) module.
+
+    Standard two-layer MLP with GELU activation.
+    """
+
+    def __init__(self, config: ChessConfig):
+        super().__init__()
+
+        self.c_fc = nn.Linear(config.n_embd, config.n_inner)
+        self.c_proj = nn.Linear(config.n_inner, config.n_embd)
+        self.dropout = nn.Dropout(config.dropout)
+
+    def forward(self, x: torch.Tensor) -> torch.Tensor:
+        x = self.c_fc(x)
+        x = F.gelu(x)
+        x = self.c_proj(x)
+        x = self.dropout(x)
+        return x
+
+
+class TransformerBlock(nn.Module):
+    """
+    A single transformer block with attention and feed-forward layers.
+
+    Uses pre-normalization (LayerNorm before attention/FFN) for better
+    training stability.
+    """
+
+    def __init__(self, config: ChessConfig):
+        super().__init__()
+
+        self.ln_1 = nn.LayerNorm(config.n_embd, eps=config.layer_norm_epsilon)
+        self.attn = MultiHeadAttention(config)
+        self.ln_2 = nn.LayerNorm(config.n_embd, eps=config.layer_norm_epsilon)
+        self.mlp = FeedForward(config)
+
+    def forward(
+        self,
+        x: torch.Tensor,
+        attention_mask: Optional[torch.Tensor] = None,
+    ) -> torch.Tensor:
+        # Pre-norm attention
+        x = x + self.attn(self.ln_1(x), attention_mask=attention_mask)
+        # Pre-norm FFN
+        x = x + self.mlp(self.ln_2(x))
+        return x
+
+
+class ChessForCausalLM(PreTrainedModel):
+    """
+    Chess Transformer for Causal Language Modeling (next-move prediction).
+
+    This model is designed to predict the next chess move given a sequence
+    of previous moves. It uses a GPT-style architecture with:
+    - Token embeddings for chess moves
+    - Learned positional embeddings
+    - Stacked transformer blocks
+    - Linear head for next-token prediction
+
+    The model supports weight tying between the embedding layer and the
+    output projection to save parameters.
+
+    Example:
+        >>> config = ChessConfig(vocab_size=1200, n_embd=128, n_layer=6)
+        >>> model = ChessForCausalLM(config)
+        >>> inputs = {"input_ids": torch.tensor([[1, 42, 87]])}
+        >>> outputs = model(**inputs)
+        >>> next_move_logits = outputs.logits[:, -1, :]
+    """
+
+    config_class = ChessConfig
+    base_model_prefix = "transformer"
+    supports_gradient_checkpointing = True
+    # Suppress missing-key warning for tied lm_head when loading
+    _keys_to_ignore_on_load_missing = ["lm_head.weight"]
+
+    def __init__(self, config: ChessConfig):
+        super().__init__(config)
+
+        # Token and position embeddings
+        self.wte = nn.Embedding(config.vocab_size, config.n_embd)
+        self.wpe = nn.Embedding(config.n_ctx, config.n_embd)
+
+        self.drop = nn.Dropout(config.dropout)
+
+        # Transformer blocks
+        self.h = nn.ModuleList([
+            TransformerBlock(config) for _ in range(config.n_layer)
+        ])
+
+        # Final layer norm
+        self.ln_f = nn.LayerNorm(config.n_embd, eps=config.layer_norm_epsilon)
+
+        # Output head
+        self.lm_head = nn.Linear(config.n_embd, config.vocab_size, bias=False)
+
+        # Declare tied weights for proper serialization
+        if config.tie_weights:
+            self._tied_weights_keys = ["lm_head.weight"]
+
+        # Initialize weights
+        self.post_init()
+
+        # Tie weights if configured
+        if config.tie_weights:
+            self.tie_weights()
+
+    def get_input_embeddings(self) -> nn.Module:
+        return self.wte
+
+    def set_input_embeddings(self, new_embeddings: nn.Module):
+        self.wte = new_embeddings
+        if getattr(self.config, "tie_weights", False):
+            self.tie_weights()
+
+    def get_output_embeddings(self) -> nn.Module:
+        return self.lm_head
+
+    def set_output_embeddings(self, new_embeddings: nn.Module):
+        self.lm_head = new_embeddings
+
+    def tie_weights(self):
+        # Use HF helper to tie or clone depending on config
+        if getattr(self.config, "tie_weights", False) or getattr(self.config, "tie_word_embeddings", False):
+            self._tie_or_clone_weights(self.lm_head, self.wte)
+
+    def _init_weights(self, module: nn.Module):
+        """Initialize weights following GPT-2 style."""
+        if isinstance(module, nn.Linear):
+            torch.nn.init.normal_(module.weight, mean=0.0, std=0.02)
+            if module.bias is not None:
+                torch.nn.init.zeros_(module.bias)
+        elif isinstance(module, nn.Embedding):
+            torch.nn.init.normal_(module.weight, mean=0.0, std=0.02)
+        elif isinstance(module, nn.LayerNorm):
+            torch.nn.init.ones_(module.weight)
+            torch.nn.init.zeros_(module.bias)
+
+    def forward(
+        self,
+        input_ids: torch.LongTensor,
+        attention_mask: Optional[torch.Tensor] = None,
+        position_ids: Optional[torch.LongTensor] = None,
+        labels: Optional[torch.LongTensor] = None,
+        return_dict: Optional[bool] = None,
+        **kwargs,
+    ) -> Union[Tuple, CausalLMOutputWithPast]:
+        """
+        Forward pass of the model.
+
+        Args:
+            input_ids: Token IDs of shape (batch_size, seq_len).
+            attention_mask: Attention mask of shape (batch_size, seq_len).
+            position_ids: Position IDs of shape (batch_size, seq_len).
+            labels: Labels for language modeling loss.
+            return_dict: Whether to return a ModelOutput object.
+
+        Returns:
+            CausalLMOutputWithPast containing loss (if labels provided) and logits.
+        """
+        return_dict = return_dict if return_dict is not None else self.config.use_return_dict
+
+        batch_size, seq_len = input_ids.size()
+        device = input_ids.device
+
+        # Create position IDs if not provided
+        if position_ids is None:
+            position_ids = torch.arange(seq_len, device=device).unsqueeze(0).expand(batch_size, -1)
+
+        # Get embeddings
+        token_embeds = self.wte(input_ids)
+        position_embeds = self.wpe(position_ids)
+        hidden_states = self.drop(token_embeds + position_embeds)
+
+        # Pass through transformer blocks
+        for block in self.h:
+            hidden_states = block(hidden_states, attention_mask=attention_mask)
+
+        # Final layer norm
+        hidden_states = self.ln_f(hidden_states)
+
+        # Get logits
+        logits = self.lm_head(hidden_states)
+
+        # Compute loss if labels are provided
+        loss = None
+        if labels is not None:
+            # Shift logits and labels for next-token prediction
+            shift_logits = logits[..., :-1, :].contiguous()
+            shift_labels = labels[..., 1:].contiguous()
+
+            # Flatten for cross-entropy
+            loss_fct = nn.CrossEntropyLoss(ignore_index=-100)
+            loss = loss_fct(
+                shift_logits.view(-1, shift_logits.size(-1)),
+                shift_labels.view(-1),
+            )
+
+        if not return_dict:
+            output = (logits,)
+            return ((loss,) + output) if loss is not None else output
+
+        return CausalLMOutputWithPast(
+            loss=loss,
+            logits=logits,
+            past_key_values=None,
+            hidden_states=None,
+            attentions=None,
+        )
+
+    @torch.no_grad()
+    def generate_move(
+        self,
+        input_ids: torch.LongTensor,
+        temperature: float = 1.0,
+        top_k: Optional[int] = None,
+        top_p: Optional[float] = None,
+    ) -> int:
+        """
+        Generate the next move given a sequence of moves.
+
+        Args:
+            input_ids: Token IDs of shape (1, seq_len).
+            temperature: Sampling temperature (1.0 = no change).
+            top_k: If set, only sample from top k tokens.
+            top_p: If set, use nucleus sampling with this threshold.
+
+        Returns:
+            The token ID of the predicted next move.
+        """
+        self.eval()
+
+        # Get logits for the last position
+        outputs = self(input_ids)
+        logits = outputs.logits[:, -1, :] / temperature
+
+        # Apply top-k filtering
+        if top_k is not None:
+            indices_to_remove = logits < torch.topk(logits, top_k)[0][..., -1, None]
+            logits[indices_to_remove] = float("-inf")
+
+        # Apply top-p (nucleus) filtering
+        if top_p is not None:
+            sorted_logits, sorted_indices = torch.sort(logits, descending=True)
+            cumulative_probs = torch.cumsum(F.softmax(sorted_logits, dim=-1), dim=-1)
+
+            # Remove tokens with cumulative probability above the threshold
+            sorted_indices_to_remove = cumulative_probs > top_p
+            sorted_indices_to_remove[..., 1:] = sorted_indices_to_remove[..., :-1].clone()
+            sorted_indices_to_remove[..., 0] = 0
+
+            indices_to_remove = sorted_indices_to_remove.scatter(
+                dim=-1, index=sorted_indices, src=sorted_indices_to_remove
+            )
+            logits[indices_to_remove] = float("-inf")
+
+        # Sample from the distribution
+        probs = F.softmax(logits, dim=-1)
+        next_token = torch.multinomial(probs, num_samples=1)
+
+        # Return the sampled token ID as a plain Python int
+        return next_token.item()
+
+
+# Register the model with Auto classes for easy loading
+from transformers import AutoConfig, AutoModelForCausalLM
+
+AutoConfig.register("chess_transformer", ChessConfig)
+AutoModelForCausalLM.register(ChessConfig, ChessForCausalLM)
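The nucleus-sampling step in `generate_move` keeps the smallest set of highest-probability tokens whose cumulative mass exceeds `top_p` (the shift-by-one keeps the token that crosses the threshold). A standalone sketch of that filtering rule, without torch, on a plain probability list:

```python
def top_p_filter(probs: list[float], top_p: float) -> set[int]:
    """Return the token indices kept by nucleus (top-p) filtering.

    Mirrors generate_move's rule: sort by probability descending, keep
    tokens until cumulative mass exceeds top_p, including the first
    token that crosses the threshold.
    """
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, cum = [], 0.0
    for i in order:
        kept.append(i)
        cum += probs[i]
        if cum > top_p:
            break  # this token crosses the threshold and is still kept
    return set(kept)

# cum = [0.5, 0.8, 0.95, ...]: token 2 crosses 0.8 and is kept, token 3 is cut
print(top_p_filter([0.5, 0.3, 0.15, 0.05], 0.8))  # {0, 1, 2}
```

After filtering, `generate_move` renormalizes the surviving logits via softmax and samples with `torch.multinomial`, so a low `top_p` trades move diversity for safer, higher-probability moves.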
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:fded355905f2a7fe810b72ec5481a2c2d2b8ca9b6751a8a61190d0118df6d340
-size 3643696
+oid sha256:53ef6ce469b92d0ca51935898f507fea199d569de0662dbf30edb0698f00ef1c
+size 2858288
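The new checkpoint size lines up with the new parameter count, assuming every tensor is stored as float32 (as `config.json`'s `dtype` suggests): 713,472 weights × 4 bytes leaves about 4.4 KB unaccounted for, which is plausibly the safetensors metadata header. A quick check:

```python
# Rough consistency check between parameter count and checkpoint size.
# Assumes all tensors are serialized as float32 (4 bytes each).
params = 713_472            # parameter count from the updated README
weight_bytes = params * 4   # raw float32 payload
file_size = 2_858_288       # from the updated LFS pointer
overhead = file_size - weight_bytes
print(weight_bytes, overhead)  # 2853888 4400
```

The same arithmetic on the old pointer (3,643,696 bytes for 909,824 parameters) leaves a similar few-KB remainder, so both sizes are consistent with a single-precision checkpoint plus a small header.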
tokenizer_config.json CHANGED
@@ -35,7 +35,7 @@
   },
   "auto_map": {
     "AutoTokenizer": [
-      "tokenizer.ChessTokenizer",
+      "tokenizer_decomposed.ChessDecomposedTokenizer",
       null
     ]
   },
@@ -45,6 +45,6 @@
   "extra_special_tokens": {},
   "model_max_length": 1000000000000000019884624838656,
   "pad_token": "[PAD]",
-  "tokenizer_class": "ChessTokenizer",
+  "tokenizer_class": "ChessDecomposedTokenizer",
   "unk_token": "[UNK]"
 }
tokenizer_decomposed.py ADDED
@@ -0,0 +1,157 @@
+"""
+Decomposed Chess Tokenizer.
+
+This tokenizer decomposes each move into 3-4 tokens:
+- color+piece token (e.g., "WP", "BN")
+- from-square token with suffix "_f" (e.g., "e2_f")
+- to-square token with suffix "_t" (e.g., "e4_t")
+- optional promotion token (one of "q", "r", "b", "n")
+
+This avoids UNKs for rare moves and makes legality learning easier because the model
+always emits explicit squares.
+"""
+
+from __future__ import annotations
+
+import json
+import os
+import re
+from typing import Dict, List, Optional
+
+from transformers import PreTrainedTokenizer
+
+
+class ChessDecomposedTokenizer(PreTrainedTokenizer):
+    model_input_names = ["input_ids", "attention_mask"]
+    vocab_files_names = {"vocab_file": "vocab.json"}
+
+    PAD_TOKEN = "[PAD]"
+    BOS_TOKEN = "[BOS]"
+    EOS_TOKEN = "[EOS]"
+    UNK_TOKEN = "[UNK]"
+
+    _MOVE_RE = re.compile(r"^[WB][PNBRQK][a-h][1-8][a-h][1-8].*$")
+
+    def __init__(
+        self,
+        vocab_file: Optional[str] = None,
+        vocab: Optional[Dict[str, int]] = None,
+        **kwargs,
+    ):
+        self._pad_token = self.PAD_TOKEN
+        self._bos_token = self.BOS_TOKEN
+        self._eos_token = self.EOS_TOKEN
+        self._unk_token = self.UNK_TOKEN
+
+        kwargs.pop("pad_token", None)
+        kwargs.pop("bos_token", None)
+        kwargs.pop("eos_token", None)
+        kwargs.pop("unk_token", None)
+
+        if vocab is not None:
+            self._vocab = vocab
+        elif vocab_file is not None and os.path.exists(vocab_file):
+            with open(vocab_file, "r", encoding="utf-8") as f:
+                self._vocab = json.load(f)
+        else:
+            self._vocab = self._create_full_vocab()
+
+        self._ids_to_tokens = {v: k for k, v in self._vocab.items()}
+
+        super().__init__(
+            pad_token=self._pad_token,
+            bos_token=self._bos_token,
+            eos_token=self._eos_token,
+            unk_token=self._unk_token,
+            **kwargs,
+        )
+
+    @staticmethod
+    def _create_full_vocab() -> Dict[str, int]:
+        special_tokens = [
+            ChessDecomposedTokenizer.PAD_TOKEN,
+            ChessDecomposedTokenizer.BOS_TOKEN,
+            ChessDecomposedTokenizer.EOS_TOKEN,
+            ChessDecomposedTokenizer.UNK_TOKEN,
+        ]
+
+        pieces = ["P", "N", "B", "R", "Q", "K"]
+        colors = ["W", "B"]
+        piece_tokens = [f"{c}{p}" for c in colors for p in pieces]
+
+        files = "abcdefgh"
+        ranks = "12345678"
+        squares = [f"{f}{r}" for f in files for r in ranks]
+        from_tokens = [f"{sq}_f" for sq in squares]
+        to_tokens = [f"{sq}_t" for sq in squares]
+
+        promo_tokens = ["q", "r", "b", "n"]
+
+        tokens = special_tokens + piece_tokens + from_tokens + to_tokens + promo_tokens
+        return {tok: idx for idx, tok in enumerate(tokens)}
+
+    @property
+    def vocab_size(self) -> int:
+        return len(self._vocab)
+
+    def get_vocab(self) -> Dict[str, int]:
+        return dict(self._vocab)
+
+    def _tokenize(self, text: str) -> List[str]:
+        raw = text.strip()
+        if not raw:
+            return []
+
+        parts = raw.split()
+        out: List[str] = []
+
+        for part in parts:
+            if part in {self.PAD_TOKEN, self.BOS_TOKEN, self.EOS_TOKEN, self.UNK_TOKEN}:
+                out.append(part)
+                continue
+
+            if not self._MOVE_RE.match(part):
+                out.append(self.UNK_TOKEN)
+                continue
+
+            color = part[0]
+            piece = part[1]
+            from_sq = part[2:4]
+            to_sq = part[4:6]
+            out.append(f"{color}{piece}")
+            out.append(f"{from_sq}_f")
+            out.append(f"{to_sq}_t")
+
+            if "=" in part:
+                promo_idx = part.find("=")
+                if promo_idx != -1 and promo_idx + 1 < len(part):
+                    promo = part[promo_idx + 1].lower()
+                    if promo in {"q", "r", "b", "n"}:
+                        out.append(promo)
+
+        return out
+
+    def _convert_token_to_id(self, token: str) -> int:
+        return self._vocab.get(token, self._vocab.get(self.UNK_TOKEN, 0))
+
+    def _convert_id_to_token(self, index: int) -> str:
+        return self._ids_to_tokens.get(index, self.UNK_TOKEN)
+
+    def convert_tokens_to_string(self, tokens: List[str]) -> str:
+        special = {self.PAD_TOKEN, self.BOS_TOKEN, self.EOS_TOKEN, self.UNK_TOKEN}
+        return " ".join(t for t in tokens if t not in special)
+
+    def save_vocabulary(self, save_directory: str, filename_prefix: Optional[str] = None) -> tuple:
+        if not os.path.isdir(save_directory):
+            os.makedirs(save_directory, exist_ok=True)
+
+        vocab_file = os.path.join(
+            save_directory,
+            (filename_prefix + "-" if filename_prefix else "") + "vocab.json",
+        )
+
+        with open(vocab_file, "w", encoding="utf-8") as f:
+            json.dump(self._vocab, f, ensure_ascii=False, indent=2)
+
+        return (vocab_file,)
+
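The decomposition in `_tokenize` can be exercised standalone. This sketch mirrors the method's per-move logic (the move-string format, including the `=Q` promotion suffix, is inferred from the code and the old vocab.json entries such as `BBb4c3(x+)`); it also confirms that the fixed vocabulary the tokenizer builds has exactly the 148 entries `config.json` now declares:

```python
import re

# Standalone mirror of ChessDecomposedTokenizer._tokenize for a single move.
# Move format assumed from the code and old vocab entries: color + piece +
# from-square + to-square, with optional "(x)"/"(+)" annotations and "=Q"-style
# promotions.
MOVE_RE = re.compile(r"^[WB][PNBRQK][a-h][1-8][a-h][1-8].*$")

def decompose(move: str) -> list[str]:
    if not MOVE_RE.match(move):
        return ["[UNK]"]
    out = [move[:2], f"{move[2:4]}_f", f"{move[4:6]}_t"]  # piece, from, to
    if "=" in move:
        promo = move[move.find("=") + 1].lower()
        if promo in {"q", "r", "b", "n"}:
            out.append(promo)  # optional promotion token
    return out

print(decompose("WPe2e4"))    # ['WP', 'e2_f', 'e4_t']
print(decompose("WPe7e8=Q"))  # ['WP', 'e7_f', 'e8_t', 'q']

# _create_full_vocab: 4 specials + 2*6 color/piece + 64 from + 64 to + 4 promos
vocab_size = 4 + 2 * 6 + 64 + 64 + 4
print(vocab_size)  # 148, matching the new config.json
```

Because every move maps to squares drawn from this closed 148-token set, no legal move can hit `[UNK]`, which is exactly the property the module docstring claims over the old 1682-entry per-move vocabulary.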
vocab.json CHANGED
@@ -3,1682 +3,148 @@
   "[BOS]": 1,
   "[EOS]": 2,
   "[UNK]": 3,
-  "BBa5b6": 4,
-  "BBa6b7": 5,
-  "BBb4a5": 6,
-  "BBb4c3(x)": 7,
-  "BBb4c3(x+)": 8,
-  "BBb4c5": 9,
-  "BBb4d2(x)": 10,
-  "BBb4d2(x+)": 11,
-  "BBb4d6": 12,
-  "BBb4e7": 13,
-  "BBb6c7": 14,
-  "BBb6d4(x)": 15,
-  "BBb7a6": 16,
-  "BBb7c6": 17,
-  "BBb7c6(x)": 18,
-  "BBb7c8": 19,
-  "BBb7d5": 20,
-  "BBb7d5(x)": 21,
-  "BBb7e4(x)": 22,
-  "BBb7f3(x)": 23,
-  "BBb7g2(x)": 24,
-  "BBc5a7": 25,
-  "BBc5b4": 26,
-  "BBc5b4(+)": 27,
-  "BBc5b6": 28,
-  "BBc5d4": 29,
-  "BBc5d4(x)": 30,
-  "BBc5d6": 31,
-  "BBc5e3(x)": 32,
-  "BBc5e7": 33,
-  "BBc5f2(x+)": 34,
-  "BBc6b5": 35,
-  "BBc6d5": 36,
-  "BBc6d5(x)": 37,
-  "BBc6d7": 38,
-  "BBc6e4(x)": 39,
-  "BBc6f3(x)": 40,
-  "BBc8a6": 41,
-  "BBc8b7": 42,
-  "BBc8d7": 43,
-  "BBc8d7(x)": 44,
-  "BBc8e6": 45,
-  "BBc8e6(x)": 46,
-  "BBc8f5": 47,
-  "BBc8f5(x)": 48,
-  "BBc8g4": 49,
-  "BBc8g4(x)": 50,
-  "BBc8h3": 51,
-  "BBc8h3(x)": 52,
-  "BBd5e6": 53,
-  "BBd5f3(x)": 54,
-  "BBd6b4": 55,
-  "BBd6c5": 56,
-  "BBd6c5(x)": 57,
-  "BBd6c7": 58,
-  "BBd6e5": 59,
-  "BBd6e5(x)": 60,
-  "BBd6e7": 61,
-  "BBd6f4": 62,
-  "BBd6f4(x)": 63,
-  "BBd6g3(x)": 64,
-  "BBd6h2(x+)": 65,
-  "BBd7b5": 66,
-  "BBd7b5(x)": 67,
-  "BBd7c6": 68,
-  "BBd7c6(x)": 69,
-  "BBd7c8": 70,
-  "BBd7e6": 71,
-  "BBd7e8": 72,
-  "BBd7f5": 73,
-  "BBd7f5(x)": 74,
-  "BBd7g4": 75,
-  "BBe4f3(x)": 76,
-  "BBe4g6": 77,
-  "BBe5d6": 78,
-  "BBe5f6": 79,
-  "BBe5g7": 80,
-  "BBe6a2(x)": 81,
-  "BBe6b3(x)": 82,
-  "BBe6c4": 83,
-  "BBe6c4(x)": 84,
-  "BBe6d5": 85,
-  "BBe6d5(x)": 86,
-  "BBe6d7": 87,
-  "BBe6f5": 88,
-  "BBe6f5(x)": 89,
-  "BBe6f7": 90,
-  "BBe6g4": 91,
-  "BBe6h3(x)": 92,
-  "BBe7b4": 93,
-  "BBe7c5": 94,
-  "BBe7c5(x)": 95,
-  "BBe7d6": 96,
-  "BBe7d6(x)": 97,
-  "BBe7d8": 98,
-  "BBe7f6": 99,
-  "BBe7f6(x)": 100,
-  "BBe7f8": 101,
-  "BBe7g5": 102,
-  "BBe7g5(x)": 103,
-  "BBe7h4": 104,
-  "BBe7h4(x)": 105,
-  "BBf5c2(x)": 106,
-  "BBf5d3": 107,
-  "BBf5d3(x)": 108,
-  "BBf5d7": 109,
-  "BBf5e4": 110,
-  "BBf5e4(x)": 111,
-  "BBf5e6": 112,
-  "BBf5g4": 113,
-  "BBf5g6": 114,
-  "BBf6b2(x)": 115,
-  "BBf6c3(x)": 116,
-  "BBf6d4(x)": 117,
-  "BBf6e5": 118,
-  "BBf6e5(x)": 119,
-  "BBf6e7": 120,
-  "BBf6g5": 121,
-  "BBf6g7": 122,
-  "BBf8b4": 123,
-  "BBf8b4(+)": 124,
-  "BBf8c5": 125,
-  "BBf8c5(+)": 126,
-  "BBf8c5(x)": 127,
-  "BBf8d6": 128,
-  "BBf8d6(x)": 129,
-  "BBf8e7": 130,
-  "BBf8e7(x)": 131,
-  "BBf8g7": 132,
-  "BBf8h6": 133,
-  "BBg4d1(x)": 134,
-  "BBg4d7": 135,
-  "BBg4e2(x)": 136,
-  "BBg4e6": 137,
-  "BBg4f3(x)": 138,
-  "BBg4f5": 139,
-  "BBg4h5": 140,
-  "BBg5f6": 141,
-  "BBg6d3(x)": 142,
-  "BBg6e4(x)": 143,
-  "BBg6h7": 144,
-  "BBg7b2(x)": 145,
-  "BBg7c3(x)": 146,
-  "BBg7d4(x)": 147,
-  "BBg7e5": 148,
-  "BBg7e5(x)": 149,
-  "BBg7f6": 150,
-  "BBg7f6(x)": 151,
-  "BBg7f8": 152,
-  "BBg7h6": 153,
-  "BBg7h6(x)": 154,
-  "BBh3g2(x)": 155,
-  "BBh5f3(x)": 156,
-  "BBh5g6": 157,
-  "BBh6g7": 158,
-  "BKb6a5": 159,
-  "BKb6b5": 160,
-  "BKb6c5": 161,
-  "BKb6c6": 162,
-  "BKb6c7": 163,
-  "BKb7a6": 164,
-  "BKb7b6": 165,
-  "BKb7c6": 166,
-  "BKb7c7": 167,
-  "BKb8a7": 168,
-  "BKb8a8": 169,
-  "BKb8b7": 170,
-  "BKb8c7": 171,
-  "BKb8c8": 172,
-  "BKc5b4": 173,
-  "BKc5c4": 174,
-  "BKc5d4": 175,
-  "BKc5d6": 176,
-  "BKc6b5": 177,
-  "BKc6b6": 178,
-  "BKc6b7": 179,
-  "BKc6c5": 180,
-  "BKc6c7": 181,
-  "BKc6d5": 182,
-  "BKc6d6": 183,
-  "BKc6d7": 184,
-  "BKc7b6": 185,
-  "BKc7b7": 186,
-  "BKc7b8": 187,
-  "BKc7c6": 188,
-  "BKc7c8": 189,
-  "BKc7d6": 190,
-  "BKc7d7": 191,
-  "BKc7d8": 192,
-  "BKc8b7": 193,
-  "BKc8b8": 194,
-  "BKc8c7": 195,
-  "BKc8d7": 196,
-  "BKc8d8": 197,
-  "BKd4c3": 198,
-  "BKd5c4": 199,
-  "BKd5c5": 200,
-  "BKd5c6": 201,
-  "BKd5d4": 202,
-  "BKd5e4": 203,
-  "BKd5e6": 204,
-  "BKd6c5": 205,
-  "BKd6c6": 206,
-  "BKd6c7": 207,
-  "BKd6d5": 208,
-  "BKd6d7": 209,
-  "BKd6e5": 210,
-  "BKd6e6": 211,
-  "BKd6e7": 212,
-  "BKd7c6": 213,
-  "BKd7c7": 214,
-  "BKd7c8": 215,
-  "BKd7d6": 216,
-  "BKd7d8": 217,
-  "BKd7e6": 218,
-  "BKd7e7": 219,
-  "BKd7e8": 220,
-  "BKd8c7": 221,
-  "BKd8c8": 222,
-  "BKd8d7": 223,
-  "BKd8e7": 224,
-  "BKd8e8": 225,
-  "BKe4d3": 226,
-  "BKe5d4": 227,
-  "BKe5d5": 228,
-  "BKe5d6": 229,
-  "BKe5e4": 230,
-  "BKe5f4": 231,
-  "BKe5f5": 232,
-  "BKe5f6": 233,
-  "BKe6d5": 234,
-  "BKe6d6": 235,
-  "BKe6d7": 236,
-  "BKe6e5": 237,
-  "BKe6e7": 238,
-  "BKe6f5": 239,
-  "BKe6f6": 240,
-  "BKe6f7": 241,
-  "BKe7d6": 242,
-  "BKe7d7": 243,
-  "BKe7d8": 244,
-  "BKe7e6": 245,
-  "BKe7e8": 246,
-  "BKe7f6": 247,
-  "BKe7f7": 248,
-  "BKe7f8": 249,
-  "BKe8c8(O)": 250,
-  "BKe8d7": 251,
-  "BKe8d7(x)": 252,
-  "BKe8d8": 253,
-  "BKe8d8(x)": 254,
-  "BKe8e7": 255,
-  "BKe8e7(x)": 256,
-  "BKe8f7": 257,
-  "BKe8f7(x)": 258,
-  "BKe8f8": 259,
-  "BKe8g8(o)": 260,
-  "BKf4f3": 261,
-  "BKf5e4": 262,
-  "BKf5e5": 263,
-  "BKf5e6": 264,
-  "BKf5f4": 265,
-  "BKf5f6": 266,
-  "BKf5g4": 267,
-  "BKf5g5": 268,
-  "BKf5g6": 269,
-  "BKf6e5": 270,
-  "BKf6e6": 271,
-  "BKf6e7": 272,
-  "BKf6f5": 273,
-  "BKf6f7": 274,
-  "BKf6g5": 275,
-  "BKf6g6": 276,
-  "BKf6g7": 277,
-  "BKf7e6": 278,
-  "BKf7e7": 279,
-  "BKf7e8": 280,
-  "BKf7f6": 281,
-  "BKf7f8": 282,
-  "BKf7g6": 283,
-  "BKf7g7": 284,
-  "BKf7g8": 285,
-  "BKf8e7": 286,
-  "BKf8e8": 287,
-  "BKf8f7": 288,
-  "BKf8g7": 289,
-  "BKf8g8": 290,
-  "BKg5f4": 291,
-  "BKg5f5": 292,
-  "BKg5f6": 293,
-  "BKg5g4": 294,
-  "BKg5h4": 295,
-  "BKg6f5": 296,
-  "BKg6f6": 297,
-  "BKg6f7": 298,
-  "BKg6g5": 299,
-  "BKg6g7": 300,
-  "BKg6h5": 301,
-  "BKg6h6": 302,
-  "BKg6h7": 303,
-  "BKg7f6": 304,
-  "BKg7f7": 305,
-  "BKg7f8": 306,
-  "BKg7g6": 307,
-  "BKg7g8": 308,
-  "BKg7h6": 309,
-  "BKg7h7": 310,
-  "BKg7h8": 311,
-  "BKg8f7": 312,
-  "BKg8f7(x)": 313,
-  "BKg8f8": 314,
-  "BKg8f8(x)": 315,
-  "BKg8g7": 316,
-  "BKg8g7(x)": 317,
-  "BKg8h7": 318,
-  "BKg8h7(x)": 319,
-  "BKg8h8": 320,
-  "BKh5g4": 321,
-  "BKh5g6": 322,
-  "BKh5h4": 323,
-  "BKh6g5": 324,
-  "BKh6g6": 325,
-  "BKh6g7": 326,
-  "BKh6h5": 327,
-  "BKh6h7": 328,
-  "BKh7g6": 329,
-  "BKh7g7": 330,
-  "BKh7g8": 331,
-  "BKh7h6": 332,
-  "BKh7h8": 333,
-  "BKh8g7": 334,
-  "BKh8g8": 335,
-  "BKh8h7": 336,
-  "BNa5b3(x)": 337,
-  "BNa5c4": 338,
-  "BNa5c4(x)": 339,
-  "BNa5c6": 340,
-  "BNa6b4": 341,
-  "BNa6c5": 342,
-  "BNa6c7": 343,
-  "BNb4a6": 344,
-  "BNb4c2": 345,
-  "BNb4c2(x)": 346,
-  "BNb4c6": 347,
-  "BNb4d3": 348,
-  "BNb4d3(x)": 349,
-  "BNb4d5": 350,
-  "BNb6c4": 351,
-  "BNb6c4(x)": 352,
-  "BNb6d5": 353,
-  "BNb6d5(x)": 354,
-  "BNb6d7": 355,
-  "BNb8a6": 356,
-  "BNb8c6": 357,
-  "BNb8c6(x)": 358,
-  "BNb8d7": 359,
-  "BNb8d7(x)": 360,
-  "BNc2a1(x)": 361,
-  "BNc4b2(x)": 362,
-  "BNc4d6": 363,
-  "BNc4e5": 364,
-  "BNc5d3": 365,
-  "BNc5d3(x)": 366,
-  "BNc5d7": 367,
-  "BNc5e4": 368,
-  "BNc5e4(x)": 369,
-  "BNc5e6": 370,
-  "BNc6a5": 371,
-  "BNc6a7": 372,
-  "BNc6b4": 373,
-  "BNc6b4(x)": 374,
-  "BNc6b8": 375,
-  "BNc6d4": 376,
-  "BNc6d4(x)": 377,
-  "BNc6d8": 378,
-  "BNc6d8(x)": 379,
-  "BNc6e5": 380,
-  "BNc6e5(x)": 381,
-  "BNc6e7": 382,
-  "BNc6e7(x)": 383,
-  "BNd3b2(x)": 384,
-  "BNd4b3(x)": 385,
-  "BNd4c2(x)": 386,
-  "BNd4c6": 387,
-  "BNd4e2(+)": 388,
-  "BNd4e2(x+)": 389,
-  "BNd4e6": 390,
-  "BNd4f3(+)": 391,
-  "BNd4f3(x+)": 392,
-  "BNd4f5": 393,
-  "BNd5b4": 394,
-  "BNd5b6": 395,
-  "BNd5c3": 396,
-  "BNd5c3(x)": 397,
-  "BNd5e3": 398,
-  "BNd5e3(x)": 399,
-  "BNd5e7": 400,
-  "BNd5f4": 401,
-  "BNd5f4(x)": 402,
-  "BNd5f6": 403,
-  "BNd6e4": 404,
-  "BNd6f5": 405,
-  "BNd7b6": 406,
-  "BNd7b8": 407,
-  "BNd7c5": 408,
-  "BNd7c5(x)": 409,
-  "BNd7e5": 410,
-  "BNd7e5(x)": 411,
-  "BNd7f6": 412,
-  "BNd7f6(x)": 413,
-  "BNd7f8": 414,
-  "BNe3f1(x)": 415,
-  "BNe4c3": 416,
-  "BNe4c3(x)": 417,
-  "BNe4c5": 418,
-  "BNe4d2": 419,
-  "BNe4d2(x)": 420,
-  "BNe4d6": 421,
-  "BNe4f2(x)": 422,
-  "BNe4f6": 423,
-  "BNe4g3(x)": 424,
-  "BNe4g5": 425,
-  "BNe4g5(x)": 426,
-  "BNe5c4": 427,
-  "BNe5c4(x)": 428,
-  "BNe5c6": 429,
-  "BNe5d3": 430,
-  "BNe5d3(x)": 431,
-  "BNe5d7": 432,
-  "BNe5f3(+)": 433,
-  "BNe5f3(x+)": 434,
-  "BNe5g4": 435,
-  "BNe5g6": 436,
-  "BNe6d4": 437,
-  "BNe6f4": 438,
-  "BNe7c6": 439,
-  "BNe7c6(x)": 440,
-  "BNe7c8": 441,
-  "BNe7d5": 442,
-  "BNe7d5(x)": 443,
-  "BNe7f5": 444,
-  "BNe7f5(x)": 445,
-  "BNe7g6": 446,
-  "BNe8d6": 447,
-  "BNe8f6": 448,
-  "BNf4e2(+)": 449,
-  "BNf5d4": 450,
-  "BNf5d4(x)": 451,
-  "BNf5d6": 452,
-  "BNf5e3": 453,
-  "BNf5e3(x)": 454,
-  "BNf5e7": 455,
-  "BNf5h4": 456,
-  "BNf6d5": 457,
-  "BNf6d5(x)": 458,
-  "BNf6d7": 459,
-  "BNf6d7(x)": 460,
-  "BNf6e4": 461,
-  "BNf6e4(x)": 462,
-  "BNf6e8": 463,
-  "BNf6g4": 464,
-  "BNf6g4(x)": 465,
-  "BNf6g8": 466,
-  "BNf6h5": 467,
-  "BNf6h5(x)": 468,
-  "BNf6h7": 469,
-  "BNf8e6": 470,
-  "BNf8g6": 471,
-  "BNg4e3": 472,
-  "BNg4e3(x)": 473,
-  "BNg4e5": 474,
-  "BNg4e5(x)": 475,
-  "BNg4f2(x)": 476,
479
- "BNg4f6": 477,
480
- "BNg4h6": 478,
481
- "BNg6e5": 479,
482
- "BNg6e5(x)": 480,
483
- "BNg6e7": 481,
484
- "BNg6f4": 482,
485
- "BNg6f4(x)": 483,
486
- "BNg6h4": 484,
487
- "BNg8e7": 485,
488
- "BNg8f6": 486,
489
- "BNg8f6(x)": 487,
490
- "BNg8h6": 488,
491
- "BNh5f4": 489,
492
- "BNh5f4(x)": 490,
493
- "BNh5f6": 491,
494
- "BNh5g3(x)": 492,
495
- "BNh6f5": 493,
496
- "BNh6f7": 494,
497
- "BNh6g4": 495,
498
- "BNh7f6": 496,
499
- "BNh7g5": 497,
500
- "BPa2a1(Q)": 498,
501
- "BPa3a2": 499,
502
- "BPa4a3": 500,
503
- "BPa4b3(x)": 501,
504
- "BPa5a4": 502,
505
- "BPa5b4(x)": 503,
506
- "BPa6a5": 504,
507
- "BPa6b5(x)": 505,
508
- "BPa7a5": 506,
509
- "BPa7a6": 507,
510
- "BPa7b6(x)": 508,
511
- "BPb2b1(Q)": 509,
512
- "BPb3b2": 510,
513
- "BPb4a3(x)": 511,
514
- "BPb4b3": 512,
515
- "BPb4c3(x)": 513,
516
- "BPb5a4(x)": 514,
517
- "BPb5b4": 515,
518
- "BPb5c4(x)": 516,
519
- "BPb6a5(x)": 517,
520
- "BPb6b5": 518,
521
- "BPb6c5(x)": 519,
522
- "BPb7a6(x)": 520,
523
- "BPb7b5": 521,
524
- "BPb7b6": 522,
525
- "BPb7c6(x)": 523,
526
- "BPc2c1(Q)": 524,
527
- "BPc3c2": 525,
528
- "BPc4b3(x)": 526,
529
- "BPc4c3": 527,
530
- "BPc4d3(x)": 528,
531
- "BPc5b4(x)": 529,
532
- "BPc5c4": 530,
533
- "BPc5d4(x)": 531,
534
- "BPc6b5(x)": 532,
535
- "BPc6c5": 533,
536
- "BPc6d5(x)": 534,
537
- "BPc7b6(x)": 535,
538
- "BPc7c5": 536,
539
- "BPc7c6": 537,
540
- "BPc7d6(x)": 538,
541
- "BPd3d2": 539,
542
- "BPd4c3(x)": 540,
543
- "BPd4d3": 541,
544
- "BPd4e3(x)": 542,
545
- "BPd5c4(x)": 543,
546
- "BPd5d4": 544,
547
- "BPd5e4(x)": 545,
548
- "BPd6c5(x)": 546,
549
- "BPd6d5": 547,
550
- "BPd6e5(x)": 548,
551
- "BPd7c6(x)": 549,
552
- "BPd7d5": 550,
553
- "BPd7d6": 551,
554
- "BPe3e2": 552,
555
- "BPe4d3(x)": 553,
556
- "BPe4e3": 554,
557
- "BPe4f3(x)": 555,
558
- "BPe5d4(x)": 556,
559
- "BPe5e4": 557,
560
- "BPe5f4(x)": 558,
561
- "BPe6d5(x)": 559,
562
- "BPe6e5": 560,
563
- "BPe6f5(x)": 561,
564
- "BPe7d6(x)": 562,
565
- "BPe7e5": 563,
566
- "BPe7e6": 564,
567
- "BPe7f6(x)": 565,
568
- "BPf3f2": 566,
569
- "BPf4e3(x)": 567,
570
- "BPf4f3": 568,
571
- "BPf4g3(x)": 569,
572
- "BPf5e4(x)": 570,
573
- "BPf5f4": 571,
574
- "BPf5g4(x)": 572,
575
- "BPf6e5(x)": 573,
576
- "BPf6f5": 574,
577
- "BPf6g5(x)": 575,
578
- "BPf7e6(x)": 576,
579
- "BPf7f5": 577,
580
- "BPf7f6": 578,
581
- "BPf7g6(x)": 579,
582
- "BPg2g1(Q)": 580,
583
- "BPg3g2": 581,
584
- "BPg4f3(x)": 582,
585
- "BPg4g3": 583,
586
- "BPg4h3(x)": 584,
587
- "BPg5f4(x)": 585,
588
- "BPg5g4": 586,
589
- "BPg5h4(x)": 587,
590
- "BPg6f5(x)": 588,
591
- "BPg6g5": 589,
592
- "BPg6h5(x)": 590,
593
- "BPg7f6(x)": 591,
594
- "BPg7g5": 592,
595
- "BPg7g6": 593,
596
- "BPg7h6(x)": 594,
597
- "BPh2h1(Q)": 595,
598
- "BPh3h2": 596,
599
- "BPh4g3(x)": 597,
600
- "BPh4h3": 598,
601
- "BPh5g4(x)": 599,
602
- "BPh5h4": 600,
603
- "BPh6g5(x)": 601,
604
- "BPh6h5": 602,
605
- "BPh7g6(x)": 603,
606
- "BPh7h5": 604,
607
- "BPh7h6": 605,
608
- "BQa5b6": 606,
609
- "BQa5c7": 607,
610
- "BQa5d8": 608,
611
- "BQb2a2(x)": 609,
612
- "BQb4b2(x)": 610,
613
- "BQb6a5": 611,
614
- "BQb6b2(x)": 612,
615
- "BQb6c6": 613,
616
- "BQb6c7": 614,
617
- "BQb6d4(x)": 615,
618
- "BQb6d8": 616,
619
- "BQc7a5": 617,
620
- "BQc7b6": 618,
621
- "BQc7b7": 619,
622
- "BQc7c6": 620,
623
- "BQc7d6": 621,
624
- "BQc7d6(x)": 622,
625
- "BQc7d7": 623,
626
- "BQc7d8": 624,
627
- "BQc7e5(x)": 625,
628
- "BQc7e7": 626,
629
- "BQd5a5": 627,
630
- "BQd5d6": 628,
631
- "BQd5d8": 629,
632
- "BQd6c6": 630,
633
- "BQd6c7": 631,
634
- "BQd6d7": 632,
635
- "BQd6e6": 633,
636
- "BQd6e7": 634,
637
- "BQd7c6": 635,
638
- "BQd7c7": 636,
639
- "BQd7d6": 637,
640
- "BQd7e6": 638,
641
- "BQd7e7": 639,
642
- "BQd7f5": 640,
643
- "BQd7g4": 641,
644
- "BQd8a5": 642,
645
- "BQd8a5(+)": 643,
646
- "BQd8a8(x)": 644,
647
- "BQd8b6": 645,
648
- "BQd8b6(+)": 646,
649
- "BQd8b8": 647,
650
- "BQd8c7": 648,
651
- "BQd8c8": 649,
652
- "BQd8d1(x)": 650,
653
- "BQd8d1(x+)": 651,
654
- "BQd8d4": 652,
655
- "BQd8d4(x)": 653,
656
- "BQd8d5": 654,
657
- "BQd8d5(x)": 655,
658
- "BQd8d6": 656,
659
- "BQd8d6(x)": 657,
660
- "BQd8d7": 658,
661
- "BQd8d7(x)": 659,
662
- "BQd8e7": 660,
663
- "BQd8e7(+)": 661,
664
- "BQd8e7(x)": 662,
665
- "BQd8e8": 663,
666
- "BQd8f6": 664,
667
- "BQd8f6(x)": 665,
668
- "BQd8f8": 666,
669
- "BQd8g5": 667,
670
- "BQd8g5(x)": 668,
671
- "BQd8h4": 669,
672
- "BQd8h4(+)": 670,
673
- "BQd8h4(x)": 671,
674
- "BQe7c5": 672,
675
- "BQe7c7": 673,
676
- "BQe7d6": 674,
677
- "BQe7d7": 675,
678
- "BQe7d8": 676,
679
- "BQe7e5": 677,
680
- "BQe7e5(x)": 678,
681
- "BQe7e6": 679,
682
- "BQe7e6(x)": 680,
683
- "BQe7f6": 681,
684
- "BQe7f6(x)": 682,
685
- "BQe7f7": 683,
686
- "BQe7g5": 684,
687
- "BQe7h4": 685,
688
- "BQf6d8": 686,
689
- "BQf6e5(x)": 687,
690
- "BQf6e6": 688,
691
- "BQf6e7": 689,
692
- "BQf6f3(x)": 690,
693
- "BQf6f5": 691,
694
- "BQf6g5": 692,
695
- "BQf6g6": 693,
696
- "BQg5f6": 694,
697
- "BQg5g6": 695,
698
- "BQg6f6": 696,
699
- "BRa2a1(+)": 697,
700
- "BRa2b2": 698,
701
- "BRa8a1(x)": 699,
702
- "BRa8a2(x)": 700,
703
- "BRa8a6": 701,
704
- "BRa8a6(x)": 702,
705
- "BRa8a7": 703,
706
- "BRa8b8": 704,
707
- "BRa8c8": 705,
708
- "BRa8c8(x)": 706,
709
- "BRa8d8": 707,
710
- "BRa8d8(x)": 708,
711
- "BRa8e8": 709,
712
- "BRa8e8(x)": 710,
713
- "BRa8f8": 711,
714
- "BRa8f8(x)": 712,
715
- "BRa8g8": 713,
716
- "BRa8h8": 714,
717
- "BRb2a2(x)": 715,
718
- "BRb8a8": 716,
719
- "BRb8b2": 717,
720
- "BRb8b2(x)": 718,
721
- "BRb8b6": 719,
722
- "BRb8b7": 720,
723
- "BRb8b7(x)": 721,
724
- "BRb8c8": 722,
725
- "BRb8d8": 723,
726
- "BRb8e8": 724,
727
- "BRb8f8": 725,
728
- "BRc2a2(x)": 726,
729
- "BRc2b2(x)": 727,
730
- "BRc8a8": 728,
731
- "BRc8b8": 729,
732
- "BRc8c1(x)": 730,
733
- "BRc8c1(x+)": 731,
734
- "BRc8c2": 732,
735
- "BRc8c2(x)": 733,
736
- "BRc8c3": 734,
737
- "BRc8c3(x)": 735,
738
- "BRc8c4": 736,
739
- "BRc8c4(x)": 737,
740
- "BRc8c5": 738,
741
- "BRc8c5(x)": 739,
742
- "BRc8c6": 740,
743
- "BRc8c6(x)": 741,
744
- "BRc8c7": 742,
745
- "BRc8c7(x)": 743,
746
- "BRc8d8": 744,
747
- "BRc8e8": 745,
748
- "BRc8f8": 746,
749
- "BRd7c7": 747,
750
- "BRd7e7": 748,
751
- "BRd8a8": 749,
752
- "BRd8b8": 750,
753
- "BRd8c8": 751,
754
- "BRd8d1(+)": 752,
755
- "BRd8d1(x)": 753,
756
- "BRd8d1(x+)": 754,
757
- "BRd8d2": 755,
758
- "BRd8d2(x)": 756,
759
- "BRd8d3": 757,
760
- "BRd8d3(x)": 758,
761
- "BRd8d4": 759,
762
- "BRd8d4(x)": 760,
763
- "BRd8d5": 761,
764
- "BRd8d5(x)": 762,
765
- "BRd8d6": 763,
766
- "BRd8d6(x)": 764,
767
- "BRd8d7": 765,
768
- "BRd8d7(x)": 766,
769
- "BRd8e8": 767,
770
- "BRd8f8": 768,
771
- "BRd8g8": 769,
772
- "BRd8h8": 770,
773
- "BRe7d7": 771,
774
- "BRe8b8": 772,
775
- "BRe8c8": 773,
776
- "BRe8d8": 774,
777
- "BRe8d8(x)": 775,
778
- "BRe8e1(+)": 776,
779
- "BRe8e1(x)": 777,
780
- "BRe8e1(x+)": 778,
781
- "BRe8e2": 779,
782
- "BRe8e2(x)": 780,
783
- "BRe8e3": 781,
784
- "BRe8e3(x)": 782,
785
- "BRe8e4": 783,
786
- "BRe8e4(x)": 784,
787
- "BRe8e5": 785,
788
- "BRe8e5(x)": 786,
789
- "BRe8e6": 787,
790
- "BRe8e6(x)": 788,
791
- "BRe8e7": 789,
792
- "BRe8e7(x)": 790,
793
- "BRe8f8": 791,
794
- "BRe8g8": 792,
795
- "BRf6g6": 793,
796
- "BRf7e7": 794,
797
- "BRf7f8": 795,
798
- "BRf8a8": 796,
799
- "BRf8a8(x)": 797,
800
- "BRf8b8": 798,
801
- "BRf8c8": 799,
802
- "BRf8c8(x)": 800,
803
- "BRf8d8": 801,
804
- "BRf8d8(x)": 802,
805
- "BRf8e8": 803,
806
- "BRf8e8(+)": 804,
807
- "BRf8e8(x)": 805,
808
- "BRf8f1(x+)": 806,
809
- "BRf8f2(x)": 807,
810
- "BRf8f3(x)": 808,
811
- "BRf8f4": 809,
812
- "BRf8f4(x)": 810,
813
- "BRf8f5": 811,
814
- "BRf8f5(x)": 812,
815
- "BRf8f6": 813,
816
- "BRf8f6(x)": 814,
817
- "BRf8f7": 815,
818
- "BRf8f7(x)": 816,
819
- "BRf8g8": 817,
820
- "BRf8h8": 818,
821
- "BRg8f8": 819,
822
- "BRg8g6": 820,
823
- "BRg8g7": 821,
824
- "BRg8h8": 822,
825
- "BRh8c8": 823,
826
- "BRh8d8": 824,
827
- "BRh8e8": 825,
828
- "BRh8f8": 826,
829
- "BRh8g8": 827,
830
- "BRh8h6": 828,
831
- "BRh8h7": 829,
832
- "WBa3b2": 830,
833
- "WBa4b3": 831,
834
- "WBa4c2": 832,
835
- "WBb2a3": 833,
836
- "WBb2c1": 834,
837
- "WBb2c3": 835,
838
- "WBb2c3(x)": 836,
839
- "WBb2d4": 837,
840
- "WBb2d4(x)": 838,
841
- "WBb2e5(x)": 839,
842
- "WBb2f6(x)": 840,
843
- "WBb2g7(x)": 841,
844
- "WBb3a2": 842,
845
- "WBb3c2": 843,
846
- "WBb3d5": 844,
847
- "WBb3d5(x)": 845,
848
- "WBb3e6(x)": 846,
849
- "WBb5a4": 847,
850
- "WBb5c4": 848,
851
- "WBb5c6(x)": 849,
852
- "WBb5c6(x+)": 850,
853
- "WBb5d3": 851,
854
- "WBb5d7(x)": 852,
855
- "WBb5d7(x+)": 853,
856
- "WBb5e2": 854,
857
- "WBc1a3": 855,
858
- "WBc1b2": 856,
859
- "WBc1d2": 857,
860
- "WBc1d2(x)": 858,
861
- "WBc1e3": 859,
862
- "WBc1e3(x)": 860,
863
- "WBc1f4": 861,
864
- "WBc1f4(x)": 862,
865
- "WBc1g5": 863,
866
- "WBc1g5(x)": 864,
867
- "WBc1h6": 865,
868
- "WBc1h6(x)": 866,
869
- "WBc2b3": 867,
870
- "WBc2e4(x)": 868,
871
- "WBc3d2": 869,
872
- "WBc4a2": 870,
873
- "WBc4b3": 871,
874
- "WBc4b5": 872,
875
- "WBc4b5(+)": 873,
876
- "WBc4d3": 874,
877
- "WBc4d5": 875,
878
- "WBc4d5(x)": 876,
879
- "WBc4e2": 877,
880
- "WBc4e6(x)": 878,
881
- "WBc4f7(x)": 879,
882
- "WBc4f7(x+)": 880,
883
- "WBd2b4": 881,
884
- "WBd2b4(x)": 882,
885
- "WBd2c1": 883,
886
- "WBd2c3": 884,
887
- "WBd2c3(x)": 885,
888
- "WBd2e1": 886,
889
- "WBd2e3": 887,
890
- "WBd2f4": 888,
891
- "WBd2f4(x)": 889,
892
- "WBd2g5": 890,
893
- "WBd3a6(x)": 891,
894
- "WBd3b1": 892,
895
- "WBd3b5": 893,
896
- "WBd3b5(x)": 894,
897
- "WBd3c2": 895,
898
- "WBd3c4": 896,
899
- "WBd3c4(x)": 897,
900
- "WBd3e2": 898,
901
- "WBd3e4": 899,
902
- "WBd3e4(x)": 900,
903
- "WBd3f1": 901,
904
- "WBd3f5": 902,
905
- "WBd3f5(x)": 903,
906
- "WBd3g6(x)": 904,
907
- "WBd3h7(x+)": 905,
908
- "WBd4e3": 906,
909
- "WBd4f6(x)": 907,
910
- "WBd5b3": 908,
911
- "WBe2b5": 909,
912
- "WBe2c4": 910,
913
- "WBe2c4(x)": 911,
914
- "WBe2d1": 912,
915
- "WBe2d3": 913,
916
- "WBe2f1": 914,
917
- "WBe2f3": 915,
918
- "WBe2f3(x)": 916,
919
- "WBe2g4": 917,
920
- "WBe2g4(x)": 918,
921
- "WBe2h5": 919,
922
- "WBe2h5(x)": 920,
923
- "WBe3a7(x)": 921,
924
- "WBe3b6(x)": 922,
925
- "WBe3c1": 923,
926
- "WBe3c5": 924,
927
- "WBe3c5(x)": 925,
928
- "WBe3d2": 926,
929
- "WBe3d4": 927,
930
- "WBe3d4(x)": 928,
931
- "WBe3f2": 929,
932
- "WBe3f4": 930,
933
- "WBe3f4(x)": 931,
934
- "WBe3g5": 932,
935
- "WBe3g5(x)": 933,
936
- "WBe3h6": 934,
937
- "WBe3h6(x)": 935,
938
- "WBe4d3": 936,
939
- "WBe4f3": 937,
940
- "WBe5f6(x)": 938,
941
- "WBe5g3": 939,
942
- "WBf1b5": 940,
943
- "WBf1b5(+)": 941,
944
- "WBf1c4": 942,
945
- "WBf1c4(x)": 943,
946
- "WBf1d3": 944,
947
- "WBf1d3(x)": 945,
948
- "WBf1e2": 946,
949
- "WBf1g2": 947,
950
- "WBf1h3": 948,
951
- "WBf3b7(x)": 949,
952
- "WBf3c6(x)": 950,
953
- "WBf3d5(x)": 951,
954
- "WBf3e2": 952,
955
- "WBf3e4": 953,
956
- "WBf3e4(x)": 954,
957
- "WBf3g2": 955,
958
- "WBf3g4": 956,
959
- "WBf4c7(x)": 957,
960
- "WBf4d2": 958,
961
- "WBf4d6": 959,
962
- "WBf4d6(x)": 960,
963
- "WBf4e3": 961,
964
- "WBf4e5": 962,
965
- "WBf4e5(x)": 963,
966
- "WBf4g3": 964,
967
- "WBf4g5": 965,
968
- "WBf4h2": 966,
969
- "WBf4h6": 967,
970
- "WBg2b7(x)": 968,
971
- "WBg2c6(x)": 969,
972
- "WBg2d5(x)": 970,
973
- "WBg2e4": 971,
974
- "WBg2e4(x)": 972,
975
- "WBg2f1": 973,
976
- "WBg2f3": 974,
977
- "WBg2f3(x)": 975,
978
- "WBg2h3": 976,
979
- "WBg3d6(x)": 977,
980
- "WBg3e5": 978,
981
- "WBg3e5(x)": 979,
982
- "WBg3f2": 980,
983
- "WBg3h2": 981,
984
- "WBg3h4": 982,
985
- "WBg4f3": 983,
986
- "WBg5d2": 984,
987
- "WBg5d8(x)": 985,
988
- "WBg5e3": 986,
989
- "WBg5e7(x)": 987,
990
- "WBg5f4": 988,
991
- "WBg5f6": 989,
992
- "WBg5f6(x)": 990,
993
- "WBg5h4": 991,
994
- "WBg5h6": 992,
995
- "WBh4e7(x)": 993,
996
- "WBh4f6(x)": 994,
997
- "WBh4g3": 995,
998
- "WBh6f8(x)": 996,
999
- "WBh6g5": 997,
1000
- "WBh6g7(x)": 998,
1001
- "WKb1a1": 999,
1002
- "WKb1a2": 1000,
1003
- "WKb1b2": 1001,
1004
- "WKb1c1": 1002,
1005
- "WKb1c2": 1003,
1006
- "WKb2a3": 1004,
1007
- "WKb2b3": 1005,
1008
- "WKb2c2": 1006,
1009
- "WKb2c3": 1007,
1010
- "WKb3a4": 1008,
1011
- "WKb3c2": 1009,
1012
- "WKb3c4": 1010,
1013
- "WKc1b1": 1011,
1014
- "WKc1b2": 1012,
1015
- "WKc1c2": 1013,
1016
- "WKc1d1": 1014,
1017
- "WKc1d2": 1015,
1018
- "WKc2b1": 1016,
1019
- "WKc2b2": 1017,
1020
- "WKc2b3": 1018,
1021
- "WKc2c3": 1019,
1022
- "WKc2d2": 1020,
1023
- "WKc2d3": 1021,
1024
- "WKc3b2": 1022,
1025
- "WKc3b3": 1023,
1026
- "WKc3b4": 1024,
1027
- "WKc3c4": 1025,
1028
- "WKc3d2": 1026,
1029
- "WKc3d3": 1027,
1030
- "WKc3d4": 1028,
1031
- "WKc4b5": 1029,
1032
- "WKc4c5": 1030,
1033
- "WKc4d5": 1031,
1034
- "WKd1c1": 1032,
1035
- "WKd1c2": 1033,
1036
- "WKd1d2": 1034,
1037
- "WKd1e1": 1035,
1038
- "WKd1e2": 1036,
1039
- "WKd2c1": 1037,
1040
- "WKd2c2": 1038,
1041
- "WKd2c3": 1039,
1042
- "WKd2d1": 1040,
1043
- "WKd2d3": 1041,
1044
- "WKd2e1": 1042,
1045
- "WKd2e2": 1043,
1046
- "WKd2e3": 1044,
1047
- "WKd3c2": 1045,
1048
- "WKd3c3": 1046,
1049
- "WKd3c4": 1047,
1050
- "WKd3d2": 1048,
1051
- "WKd3d4": 1049,
1052
- "WKd3e2": 1050,
1053
- "WKd3e3": 1051,
1054
- "WKd3e4": 1052,
1055
- "WKd4c3": 1053,
1056
- "WKd4c4": 1054,
1057
- "WKd4c5": 1055,
1058
- "WKd4d5": 1056,
1059
- "WKd4e3": 1057,
1060
- "WKd4e4": 1058,
1061
- "WKd4e5": 1059,
1062
- "WKd5c6": 1060,
1063
- "WKe1c1(O)": 1061,
1064
- "WKe1d1": 1062,
1065
- "WKe1d1(x)": 1063,
1066
- "WKe1d2": 1064,
1067
- "WKe1d2(x)": 1065,
1068
- "WKe1e2": 1066,
1069
- "WKe1e2(x)": 1067,
1070
- "WKe1f1": 1068,
1071
- "WKe1f2": 1069,
1072
- "WKe1f2(x)": 1070,
1073
- "WKe1g1(o)": 1071,
1074
- "WKe2d1": 1072,
1075
- "WKe2d2": 1073,
1076
- "WKe2d3": 1074,
1077
- "WKe2e1": 1075,
1078
- "WKe2e3": 1076,
1079
- "WKe2f1": 1077,
1080
- "WKe2f2": 1078,
1081
- "WKe2f3": 1079,
1082
- "WKe3d2": 1080,
1083
- "WKe3d3": 1081,
1084
- "WKe3d4": 1082,
1085
- "WKe3e2": 1083,
1086
- "WKe3e4": 1084,
1087
- "WKe3f2": 1085,
1088
- "WKe3f3": 1086,
1089
- "WKe3f4": 1087,
1090
- "WKe4d3": 1088,
1091
- "WKe4d4": 1089,
1092
- "WKe4d5": 1090,
1093
- "WKe4e3": 1091,
1094
- "WKe4e5": 1092,
1095
- "WKe4f3": 1093,
1096
- "WKe4f4": 1094,
1097
- "WKe4f5": 1095,
1098
- "WKe5d6": 1096,
1099
- "WKe5f6": 1097,
1100
- "WKf1e1": 1098,
1101
- "WKf1e2": 1099,
1102
- "WKf1f2": 1100,
1103
- "WKf1g1": 1101,
1104
- "WKf1g2": 1102,
1105
- "WKf2e1": 1103,
1106
- "WKf2e2": 1104,
1107
- "WKf2e3": 1105,
1108
- "WKf2f1": 1106,
1109
- "WKf2f3": 1107,
1110
- "WKf2g1": 1108,
1111
- "WKf2g2": 1109,
1112
- "WKf2g3": 1110,
1113
- "WKf3e2": 1111,
1114
- "WKf3e3": 1112,
1115
- "WKf3e4": 1113,
1116
- "WKf3f2": 1114,
1117
- "WKf3f4": 1115,
1118
- "WKf3g2": 1116,
1119
- "WKf3g3": 1117,
1120
- "WKf3g4": 1118,
1121
- "WKf4e3": 1119,
1122
- "WKf4e4": 1120,
1123
- "WKf4e5": 1121,
1124
- "WKf4f3": 1122,
1125
- "WKf4f5": 1123,
1126
- "WKf4g3": 1124,
1127
- "WKf4g4": 1125,
1128
- "WKf4g5": 1126,
1129
- "WKg1f1": 1127,
1130
- "WKg1f1(x)": 1128,
1131
- "WKg1f2": 1129,
1132
- "WKg1f2(x)": 1130,
1133
- "WKg1g2": 1131,
1134
- "WKg1g2(x)": 1132,
1135
- "WKg1h1": 1133,
1136
- "WKg1h2": 1134,
1137
- "WKg1h2(x)": 1135,
1138
- "WKg2f1": 1136,
1139
- "WKg2f2": 1137,
1140
- "WKg2f3": 1138,
1141
- "WKg2g1": 1139,
1142
- "WKg2g3": 1140,
1143
- "WKg2h1": 1141,
1144
- "WKg2h2": 1142,
1145
- "WKg2h3": 1143,
1146
- "WKg3f2": 1144,
1147
- "WKg3f3": 1145,
1148
- "WKg3f4": 1146,
1149
- "WKg3g2": 1147,
1150
- "WKg3g4": 1148,
1151
- "WKg3h2": 1149,
1152
- "WKg3h3": 1150,
1153
- "WKg3h4": 1151,
1154
- "WKg4f3": 1152,
1155
- "WKg4f4": 1153,
1156
- "WKg4f5": 1154,
1157
- "WKg4g3": 1155,
1158
- "WKg4g5": 1156,
1159
- "WKg4h3": 1157,
1160
- "WKg4h5": 1158,
1161
- "WKg5f6": 1159,
1162
- "WKh1g1": 1160,
1163
- "WKh1g2": 1161,
1164
- "WKh1h2": 1162,
1165
- "WKh2g1": 1163,
1166
- "WKh2g2": 1164,
1167
- "WKh2g3": 1165,
1168
- "WKh2h1": 1166,
1169
- "WKh2h3": 1167,
1170
- "WKh3g2": 1168,
1171
- "WKh3g3": 1169,
1172
- "WKh3g4": 1170,
1173
- "WKh3h2": 1171,
1174
- "WKh3h4": 1172,
1175
- "WKh4g3": 1173,
1176
- "WKh4g5": 1174,
1177
- "WKh4h5": 1175,
1178
- "WNa3b5": 1176,
1179
- "WNa3c2": 1177,
1180
- "WNa3c4": 1178,
1181
- "WNa4c3": 1179,
1182
- "WNa4c5": 1180,
1183
- "WNa4c5(x)": 1181,
1184
- "WNb1a3": 1182,
1185
- "WNb1c3": 1183,
1186
- "WNb1c3(x)": 1184,
1187
- "WNb1d2": 1185,
1188
- "WNb1d2(x)": 1186,
1189
- "WNb3c5": 1187,
1190
- "WNb3d2": 1188,
1191
- "WNb3d4": 1189,
1192
- "WNb5a3": 1190,
1193
- "WNb5c3": 1191,
1194
- "WNb5c7": 1192,
1195
- "WNb5d4": 1193,
1196
- "WNb5d6": 1194,
1197
- "WNb5d6(+)": 1195,
1198
- "WNb5d6(x)": 1196,
1199
- "WNc2e3": 1197,
1200
- "WNc3a2": 1198,
1201
- "WNc3a4": 1199,
1202
- "WNc3b1": 1200,
1203
- "WNc3b5": 1201,
1204
- "WNc3b5(x)": 1202,
1205
- "WNc3d1": 1203,
1206
- "WNc3d1(x)": 1204,
1207
- "WNc3d5": 1205,
1208
- "WNc3d5(x)": 1206,
1209
- "WNc3e2": 1207,
1210
- "WNc3e2(x)": 1208,
1211
- "WNc3e4": 1209,
1212
- "WNc3e4(x)": 1210,
1213
- "WNc4d2": 1211,
1214
- "WNc4d6": 1212,
1215
- "WNc4e3": 1213,
1216
- "WNc4e5": 1214,
1217
- "WNc4e5(x)": 1215,
1218
- "WNc5d3": 1216,
1219
- "WNc7a8(x)": 1217,
1220
- "WNd1e3": 1218,
1221
- "WNd2b1": 1219,
1222
- "WNd2b3": 1220,
1223
- "WNd2c4": 1221,
1224
- "WNd2c4(x)": 1222,
1225
- "WNd2e4": 1223,
1226
- "WNd2e4(x)": 1224,
1227
- "WNd2f1": 1225,
1228
- "WNd2f3": 1226,
1229
- "WNd2f3(x)": 1227,
1230
- "WNd3e5": 1228,
1231
- "WNd3f4": 1229,
1232
- "WNd4b3": 1230,
1233
- "WNd4b5": 1231,
1234
- "WNd4c6": 1232,
1235
- "WNd4c6(x)": 1233,
1236
- "WNd4e2": 1234,
1237
- "WNd4e6": 1235,
1238
- "WNd4e6(x)": 1236,
1239
- "WNd4f3": 1237,
1240
- "WNd4f5": 1238,
1241
- "WNd4f5(x)": 1239,
1242
- "WNd5c3": 1240,
1243
- "WNd5c7(x)": 1241,
1244
- "WNd5e3": 1242,
1245
- "WNd5e7(+)": 1243,
1246
- "WNd5e7(x)": 1244,
1247
- "WNd5e7(x+)": 1245,
1248
- "WNd5f4": 1246,
1249
- "WNd5f6(+)": 1247,
1250
- "WNd5f6(x+)": 1248,
1251
- "WNd6b7(x)": 1249,
1252
- "WNe1f3": 1250,
1253
- "WNe2c3": 1251,
1254
- "WNe2c3(x)": 1252,
1255
- "WNe2d4": 1253,
1256
- "WNe2d4(x)": 1254,
1257
- "WNe2f4": 1255,
1258
- "WNe2f4(x)": 1256,
1259
- "WNe2g3": 1257,
1260
- "WNe3c4": 1258,
1261
- "WNe3d5": 1259,
1262
- "WNe3f5": 1260,
1263
- "WNe3g4": 1261,
1264
- "WNe4c3": 1262,
1265
- "WNe4c5": 1263,
1266
- "WNe4c5(x)": 1264,
1267
- "WNe4d2": 1265,
1268
- "WNe4d6": 1266,
1269
- "WNe4d6(+)": 1267,
1270
- "WNe4d6(x)": 1268,
1271
- "WNe4f6(+)": 1269,
1272
- "WNe4f6(x+)": 1270,
1273
- "WNe4g3": 1271,
1274
- "WNe4g5": 1272,
1275
- "WNe5c4": 1273,
1276
- "WNe5c4(x)": 1274,
1277
- "WNe5c6": 1275,
1278
- "WNe5c6(x)": 1276,
1279
- "WNe5d3": 1277,
1280
- "WNe5d7": 1278,
1281
- "WNe5d7(x)": 1279,
1282
- "WNe5f3": 1280,
1283
- "WNe5f7(x)": 1281,
1284
- "WNe5g4": 1282,
1285
- "WNe5g4(x)": 1283,
1286
- "WNe5g6": 1284,
1287
- "WNe5g6(x)": 1285,
1288
- "WNe6f8(x)": 1286,
1289
- "WNf1e3": 1287,
1290
- "WNf1g3": 1288,
1291
- "WNf3d2": 1289,
1292
- "WNf3d2(x)": 1290,
1293
- "WNf3d4": 1291,
1294
- "WNf3d4(x)": 1292,
1295
- "WNf3e1": 1293,
1296
- "WNf3e5": 1294,
1297
- "WNf3e5(+)": 1295,
1298
- "WNf3e5(x)": 1296,
1299
- "WNf3g1": 1297,
1300
- "WNf3g5": 1298,
1301
- "WNf3g5(+)": 1299,
1302
- "WNf3g5(x)": 1300,
1303
- "WNf3h2": 1301,
1304
- "WNf3h4": 1302,
1305
- "WNf3h4(x)": 1303,
1306
- "WNf4d3": 1304,
1307
- "WNf4d5": 1305,
1308
- "WNf4d5(x)": 1306,
1309
- "WNf4e6(x)": 1307,
1310
- "WNf4h5": 1308,
1311
- "WNf5e3": 1309,
1312
- "WNf5e7(+)": 1310,
1313
- "WNf7h8(x)": 1311,
1314
- "WNg1e2": 1312,
1315
- "WNg1f3": 1313,
1316
- "WNg1f3(x)": 1314,
1317
- "WNg1h3": 1315,
1318
- "WNg3e2": 1316,
1319
- "WNg3e4": 1317,
1320
- "WNg3e4(x)": 1318,
1321
- "WNg3f5": 1319,
1322
- "WNg3f5(x)": 1320,
1323
- "WNg3h5": 1321,
1324
- "WNg4e3": 1322,
1325
- "WNg4e5": 1323,
1326
- "WNg5e4": 1324,
1327
- "WNg5e4(x)": 1325,
1328
- "WNg5e6": 1326,
1329
- "WNg5e6(x)": 1327,
1330
- "WNg5f3": 1328,
1331
- "WNg5f7(x)": 1329,
1332
- "WNg5h3": 1330,
1333
- "WNh2f3": 1331,
1334
- "WNh2g4": 1332,
1335
- "WNh3f2": 1333,
1336
- "WNh3f4": 1334,
1337
- "WNh3g5": 1335,
1338
- "WNh4f3": 1336,
1339
- "WNh4f5": 1337,
1340
- "WNh4f5(x)": 1338,
1341
- "WNh4g6(x)": 1339,
1342
- "WPa2a3": 1340,
1343
- "WPa2a4": 1341,
1344
- "WPa2b3(x)": 1342,
1345
- "WPa3a4": 1343,
1346
- "WPa3b4(x)": 1344,
1347
- "WPa4a5": 1345,
1348
- "WPa4b5(x)": 1346,
1349
- "WPa5a6": 1347,
1350
- "WPa5b6(x)": 1348,
1351
- "WPa6a7": 1349,
1352
- "WPa7a8(Q)": 1350,
1353
- "WPb2a3(x)": 1351,
1354
- "WPb2b3": 1352,
1355
- "WPb2b4": 1353,
1356
- "WPb2c3(x)": 1354,
1357
- "WPb3a4(x)": 1355,
1358
- "WPb3b4": 1356,
1359
- "WPb3c4(x)": 1357,
1360
- "WPb4a5(x)": 1358,
1361
- "WPb4b5": 1359,
1362
- "WPb4c5(x)": 1360,
1363
- "WPb5a6(x)": 1361,
1364
- "WPb5b6": 1362,
1365
- "WPb5c6(x)": 1363,
1366
- "WPb6b7": 1364,
1367
- "WPb7b8(Q)": 1365,
1368
- "WPc2b3(x)": 1366,
1369
- "WPc2c3": 1367,
1370
- "WPc2c4": 1368,
1371
- "WPc2d3(x)": 1369,
1372
- "WPc3b4(x)": 1370,
1373
- "WPc3c4": 1371,
1374
- "WPc3d4(x)": 1372,
1375
- "WPc4b5(x)": 1373,
1376
- "WPc4c5": 1374,
1377
- "WPc4d5(x)": 1375,
1378
- "WPc5b6(x)": 1376,
1379
- "WPc5c6": 1377,
1380
- "WPc5d6(x)": 1378,
1381
- "WPc6c7": 1379,
1382
- "WPc7c8(Q)": 1380,
1383
- "WPd2c3(x)": 1381,
1384
- "WPd2d3": 1382,
1385
- "WPd2d4": 1383,
1386
- "WPd3c4(x)": 1384,
1387
- "WPd3d4": 1385,
1388
- "WPd3e4(x)": 1386,
1389
- "WPd4c5(x)": 1387,
1390
- "WPd4d5": 1388,
1391
- "WPd4e5(x)": 1389,
1392
- "WPd5c6(x)": 1390,
1393
- "WPd5d6": 1391,
1394
- "WPd5e6(x)": 1392,
1395
- "WPd6d7": 1393,
1396
- "WPd7d8(Q)": 1394,
1397
- "WPe2e3": 1395,
1398
- "WPe2e4": 1396,
1399
- "WPe3d4(x)": 1397,
1400
- "WPe3e4": 1398,
1401
- "WPe3f4(x)": 1399,
1402
- "WPe4d5(x)": 1400,
1403
- "WPe4e5": 1401,
1404
- "WPe4f5(x)": 1402,
1405
- "WPe5d6(x)": 1403,
1406
- "WPe5e6": 1404,
1407
- "WPe5f6(x)": 1405,
1408
- "WPe5f6(xE)": 1406,
1409
- "WPe6e7": 1407,
1410
- "WPe6f7(x+)": 1408,
1411
- "WPf2e3(x)": 1409,
1412
- "WPf2f3": 1410,
1413
- "WPf2f4": 1411,
1414
- "WPf2g3(x)": 1412,
1415
- "WPf3e4(x)": 1413,
1416
- "WPf3f4": 1414,
1417
- "WPf3g4(x)": 1415,
1418
- "WPf4e5(x)": 1416,
1419
- "WPf4f5": 1417,
1420
- "WPf4g5(x)": 1418,
1421
- "WPf5e6(x)": 1419,
1422
- "WPf5f6": 1420,
1423
- "WPf5g6(x)": 1421,
1424
- "WPf6f7": 1422,
1425
- "WPg2f3(x)": 1423,
1426
- "WPg2g3": 1424,
1427
- "WPg2g4": 1425,
1428
- "WPg2h3(x)": 1426,
1429
- "WPg3f4(x)": 1427,
1430
- "WPg3g4": 1428,
1431
- "WPg3h4(x)": 1429,
1432
- "WPg4f5(x)": 1430,
1433
- "WPg4g5": 1431,
1434
- "WPg4h5(x)": 1432,
1435
- "WPg5f6(x)": 1433,
1436
- "WPg5g6": 1434,
1437
- "WPg5h6(x)": 1435,
1438
- "WPg6g7": 1436,
1439
- "WPg7g8(Q)": 1437,
1440
- "WPh2g3(x)": 1438,
1441
- "WPh2h3": 1439,
1442
- "WPh2h4": 1440,
1443
- "WPh3g4(x)": 1441,
1444
- "WPh3h4": 1442,
1445
- "WPh4g5(x)": 1443,
1446
- "WPh4h5": 1444,
1447
- "WPh5g6(x)": 1445,
1448
- "WPh5h6": 1446,
1449
- "WPh6h7": 1447,
1450
- "WPh7h8(Q)": 1448,
1451
- "WQa4b3": 1449,
1452
- "WQa4c2": 1450,
1453
- "WQb3b7(x)": 1451,
1454
- "WQb3c2": 1452,
1455
- "WQb3d1": 1453,
1456
- "WQc2b3": 1454,
1457
- "WQc2c3": 1455,
1458
- "WQc2d2": 1456,
1459
- "WQc2d3": 1457,
1460
- "WQc2e2": 1458,
1461
- "WQc2e4(x)": 1459,
1462
- "WQd1a1(x)": 1460,
1463
- "WQd1a4": 1461,
1464
- "WQd1a4(+)": 1462,
1465
- "WQd1b3": 1463,
1466
- "WQd1c1": 1464,
1467
- "WQd1c2": 1465,
1468
- "WQd1d2": 1466,
1469
- "WQd1d2(x)": 1467,
1470
- "WQd1d3": 1468,
1471
- "WQd1d3(x)": 1469,
1472
- "WQd1d4": 1470,
1473
- "WQd1d4(x)": 1471,
1474
- "WQd1d5": 1472,
1475
- "WQd1d5(x)": 1473,
1476
- "WQd1d6(x)": 1474,
1477
- "WQd1d8(x)": 1475,
1478
- "WQd1d8(x+)": 1476,
1479
- "WQd1e1": 1477,
1480
- "WQd1e2": 1478,
1481
- "WQd1e2(+)": 1479,
1482
- "WQd1e2(x)": 1480,
1483
- "WQd1f3": 1481,
1484
- "WQd1f3(x)": 1482,
1485
- "WQd1g4": 1483,
1486
- "WQd1g4(x)": 1484,
1487
- "WQd1h5": 1485,
1488
- "WQd1h5(+)": 1486,
1489
- "WQd1h5(x)": 1487,
1490
- "WQd2c2": 1488,
1491
- "WQd2c3": 1489,
1492
- "WQd2d3": 1490,
1493
- "WQd2e2": 1491,
1494
- "WQd2e3": 1492,
1495
- "WQd2e3(x)": 1493,
1496
- "WQd2f2": 1494,
1497
- "WQd2f4": 1495,
1498
- "WQd2f4(x)": 1496,
1499
- "WQd2g5": 1497,
1500
- "WQd2h6(x)": 1498,
1501
- "WQd3c2": 1499,
1502
- "WQd3d2": 1500,
1503
- "WQd3e2": 1501,
1504
- "WQd3e3": 1502,
1505
- "WQd3e4(x)": 1503,
1506
- "WQd3f3": 1504,
1507
- "WQd3g3": 1505,
1508
- "WQd4d1": 1506,
1509
- "WQd4d3": 1507,
1510
- "WQd4e3": 1508,
1511
- "WQe2c2": 1509,
1512
- "WQe2c4": 1510,
1513
- "WQe2c4(x)": 1511,
1514
- "WQe2d1": 1512,
1515
- "WQe2d2": 1513,
1516
- "WQe2d3": 1514,
1517
- "WQe2e3": 1515,
1518
- "WQe2e3(x)": 1516,
1519
- "WQe2e4": 1517,
1520
- "WQe2e4(x)": 1518,
1521
- "WQe2f2": 1519,
1522
- "WQe2f3": 1520,
1523
- "WQe2f3(x)": 1521,
1524
- "WQe2g4": 1522,
1525
- "WQe2h5": 1523,
1526
- "WQe3e2": 1524,
1527
- "WQe3f3": 1525,
1528
- "WQe3g3": 1526,
1529
- "WQf3b7(x)": 1527,
1530
- "WQf3d1": 1528,
1531
- "WQf3d3": 1529,
1532
- "WQf3d5(x)": 1530,
1533
- "WQf3e2": 1531,
1534
- "WQf3e3": 1532,
1535
- "WQf3e4(x)": 1533,
1536
- "WQf3f4": 1534,
1537
- "WQf3f6(x)": 1535,
1538
- "WQf3g3": 1536,
1539
- "WQf3g4": 1537,
1540
- "WQf3h3": 1538,
1541
- "WQf3h5": 1539,
1542
- "WQg3f3": 1540,
1543
- "WQg3g4": 1541,
1544
- "WQg3h4": 1542,
1545
- "WQg4f3": 1543,
1546
- "WQg4g3": 1544,
1547
- "WQh5f3": 1545,
1548
- "WRa1a2": 1546,
1549
- "WRa1a3": 1547,
1550
- "WRa1a6(x)": 1548,
1551
- "WRa1a7(x)": 1549,
1552
- "WRa1a8(x)": 1550,
1553
- "WRa1b1": 1551,
1554
- "WRa1c1": 1552,
1555
- "WRa1c1(x)": 1553,
1556
- "WRa1d1": 1554,
1557
- "WRa1d1(x)": 1555,
1558
- "WRa1e1": 1556,
1559
- "WRa1e1(x)": 1557,
1560
- "WRa1f1": 1558,
1561
- "WRa1f1(x)": 1559,
1562
- "WRa1g1": 1560,
1563
- "WRa1h1": 1561,
1564
- "WRb1a1": 1562,
1565
- "WRb1b2": 1563,
1566
- "WRb1b2(x)": 1564,
1567
- "WRb1b3": 1565,
1568
- "WRb1b7": 1566,
1569
- "WRb1b7(x)": 1567,
1570
- "WRb1c1": 1568,
1571
- "WRb1d1": 1569,
1572
- "WRb1e1": 1570,
1573
- "WRb1f1": 1571,
1574
- "WRb7a7(x)": 1572,
1575
- "WRc1a1": 1573,
1576
- "WRc1b1": 1574,
1577
- "WRc1c2": 1575,
1578
- "WRc1c2(x)": 1576,
1579
- "WRc1c3": 1577,
1580
- "WRc1c3(x)": 1578,
1581
- "WRc1c4(x)": 1579,
1582
- "WRc1c5": 1580,
1583
- "WRc1c5(x)": 1581,
1584
- "WRc1c6(x)": 1582,
1585
- "WRc1c7": 1583,
1586
- "WRc1c7(x)": 1584,
1587
- "WRc1c8(x)": 1585,
1588
- "WRc1d1": 1586,
1589
- "WRc1e1": 1587,
1590
- "WRc1f1": 1588,
1591
- "WRc7b7(x)": 1589,
1592
- "WRd1a1": 1590,
1593
- "WRd1b1": 1591,
1594
- "WRd1c1": 1592,
1595
- "WRd1d2": 1593,
1596
- "WRd1d2(x)": 1594,
1597
- "WRd1d3": 1595,
1598
- "WRd1d3(x)": 1596,
1599
- "WRd1d4": 1597,
1600
- "WRd1d4(x)": 1598,
1601
- "WRd1d5": 1599,
1602
- "WRd1d5(x)": 1600,
1603
- "WRd1d6": 1601,
1604
- "WRd1d6(x)": 1602,
1605
- "WRd1d7": 1603,
1606
- "WRd1d7(+)": 1604,
1607
- "WRd1d7(x)": 1605,
1608
- "WRd1d8(+)": 1606,
1609
- "WRd1d8(x)": 1607,
1610
- "WRd1d8(x+)": 1608,
1611
- "WRd1e1": 1609,
1612
- "WRd1e1(x)": 1610,
1613
- "WRd1f1": 1611,
1614
- "WRd1g1": 1612,
1615
- "WRd1h1": 1613,
1616
- "WRd2e2": 1614,
1617
- "WRd7b7(x)": 1615,
1618
- "WRe1a1": 1616,
1619
- "WRe1b1": 1617,
1620
- "WRe1c1": 1618,
1621
- "WRe1d1": 1619,
1622
- "WRe1d1(x)": 1620,
1623
- "WRe1e2": 1621,
1624
- "WRe1e2(x)": 1622,
1625
- "WRe1e3": 1623,
1626
- "WRe1e3(x)": 1624,
1627
- "WRe1e4": 1625,
1628
- "WRe1e4(x)": 1626,
1629
- "WRe1e5": 1627,
1630
- "WRe1e5(x)": 1628,
1631
- "WRe1e6": 1629,
1632
- "WRe1e6(x)": 1630,
1633
- "WRe1e7": 1631,
1634
- "WRe1e7(x)": 1632,
1635
- "WRe1e8(+)": 1633,
1636
- "WRe1e8(x)": 1634,
1637
- "WRe1e8(x+)": 1635,
1638
- "WRe1f1": 1636,
1639
- "WRe1g1": 1637,
1640
- "WRe1h1": 1638,
1641
- "WRe2d2": 1639,
1642
- "WRe2e3": 1640,
1643
- "WRe3f3": 1641,
1644
- "WRe3g3": 1642,
1645
- "WRf1a1": 1643,
1646
- "WRf1a1(x)": 1644,
1647
- "WRf1b1": 1645,
1648
- "WRf1c1": 1646,
1649
- "WRf1c1(x)": 1647,
1650
- "WRf1d1": 1648,
1651
- "WRf1d1(x)": 1649,
1652
- "WRf1e1": 1650,
1653
- "WRf1e1(+)": 1651,
1654
- "WRf1e1(x)": 1652,
1655
- "WRf1f2": 1653,
1656
- "WRf1f2(x)": 1654,
1657
- "WRf1f3": 1655,
1658
- "WRf1f3(x)": 1656,
1659
- "WRf1f4": 1657,
1660
- "WRf1f4(x)": 1658,
1661
- "WRf1f5(x)": 1659,
1662
- "WRf1f6(x)": 1660,
1663
- "WRf1f7": 1661,
1664
- "WRf1f7(x)": 1662,
1665
- "WRf1f8(x+)": 1663,
1666
- "WRf1g1": 1664,
1667
- "WRf1h1": 1665,
1668
- "WRf2e2": 1666,
1669
- "WRf3g3": 1667,
1670
- "WRf3h3": 1668,
1671
- "WRg1e1": 1669,
1672
- "WRg1f1": 1670,
1673
- "WRg1g2": 1671,
1674
- "WRg1g3": 1672,
1675
- "WRg1h1": 1673,
1676
- "WRh1c1": 1674,
1677
- "WRh1d1": 1675,
1678
- "WRh1e1": 1676,
1679
- "WRh1f1": 1677,
1680
- "WRh1g1": 1678,
1681
- "WRh1h2": 1679,
1682
- "WRh1h3": 1680,
1683
- "WRh1h5(x)": 1681
1684
  }
 
  "[BOS]": 1,
  "[EOS]": 2,
  "[UNK]": 3,
+ "WP": 4,
+ "WN": 5,
+ "WB": 6,
+ "WR": 7,
+ "WQ": 8,
+ "WK": 9,
+ "BP": 10,
+ "BN": 11,
+ "BB": 12,
+ "BR": 13,
+ "BQ": 14,
+ "BK": 15,
+ "a1_f": 16,
+ "a2_f": 17,
+ "a3_f": 18,
+ "a4_f": 19,
+ "a5_f": 20,
+ "a6_f": 21,
+ "a7_f": 22,
+ "a8_f": 23,
+ "b1_f": 24,
+ "b2_f": 25,
+ "b3_f": 26,
+ "b4_f": 27,
+ "b5_f": 28,
+ "b6_f": 29,
+ "b7_f": 30,
+ "b8_f": 31,
+ "c1_f": 32,
+ "c2_f": 33,
+ "c3_f": 34,
+ "c4_f": 35,
+ "c5_f": 36,
+ "c6_f": 37,
+ "c7_f": 38,
+ "c8_f": 39,
+ "d1_f": 40,
+ "d2_f": 41,
+ "d3_f": 42,
+ "d4_f": 43,
+ "d5_f": 44,
+ "d6_f": 45,
+ "d7_f": 46,
+ "d8_f": 47,
+ "e1_f": 48,
+ "e2_f": 49,
+ "e3_f": 50,
+ "e4_f": 51,
+ "e5_f": 52,
+ "e6_f": 53,
+ "e7_f": 54,
+ "e8_f": 55,
+ "f1_f": 56,
+ "f2_f": 57,
+ "f3_f": 58,
+ "f4_f": 59,
+ "f5_f": 60,
+ "f6_f": 61,
+ "f7_f": 62,
+ "f8_f": 63,
+ "g1_f": 64,
+ "g2_f": 65,
+ "g3_f": 66,
+ "g4_f": 67,
+ "g5_f": 68,
+ "g6_f": 69,
+ "g7_f": 70,
+ "g8_f": 71,
+ "h1_f": 72,
+ "h2_f": 73,
+ "h3_f": 74,
+ "h4_f": 75,
+ "h5_f": 76,
+ "h6_f": 77,
+ "h7_f": 78,
+ "h8_f": 79,
+ "a1_t": 80,
+ "a2_t": 81,
+ "a3_t": 82,
+ "a4_t": 83,
+ "a5_t": 84,
+ "a6_t": 85,
+ "a7_t": 86,
+ "a8_t": 87,
+ "b1_t": 88,
+ "b2_t": 89,
+ "b3_t": 90,
+ "b4_t": 91,
+ "b5_t": 92,
+ "b6_t": 93,
+ "b7_t": 94,
+ "b8_t": 95,
+ "c1_t": 96,
+ "c2_t": 97,
+ "c3_t": 98,
+ "c4_t": 99,
+ "c5_t": 100,
+ "c6_t": 101,
+ "c7_t": 102,
+ "c8_t": 103,
+ "d1_t": 104,
+ "d2_t": 105,
+ "d3_t": 106,
+ "d4_t": 107,
+ "d5_t": 108,
+ "d6_t": 109,
+ "d7_t": 110,
+ "d8_t": 111,
+ "e1_t": 112,
+ "e2_t": 113,
+ "e3_t": 114,
+ "e4_t": 115,
+ "e5_t": 116,
+ "e6_t": 117,
+ "e7_t": 118,
+ "e8_t": 119,
+ "f1_t": 120,
+ "f2_t": 121,
+ "f3_t": 122,
+ "f4_t": 123,
+ "f5_t": 124,
+ "f6_t": 125,
+ "f7_t": 126,
+ "f8_t": 127,
+ "g1_t": 128,
+ "g2_t": 129,
+ "g3_t": 130,
+ "g4_t": 131,
+ "g5_t": 132,
+ "g6_t": 133,
+ "g7_t": 134,
+ "g8_t": 135,
+ "h1_t": 136,
+ "h2_t": 137,
+ "h3_t": 138,
+ "h4_t": 139,
+ "h5_t": 140,
+ "h6_t": 141,
+ "h7_t": 142,
+ "h8_t": 143,
+ "q": 144,
+ "r": 145,
+ "b": 146,
+ "n": 147
  }
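
The vocab above decomposes each chess move into a "from"-square token (`*_f`, IDs 16–79), a "to"-square token (`*_t`, IDs 80–143), and an optional promotion-piece token (`q`/`r`/`b`/`n`, IDs 144–147). A minimal sketch of how a UCI move would be encoded under this layout (an illustrative assumption, not the repo's `tokenizer_decomposed.py`; the `[PAD]` entry at ID 0 is inferred from `pad_token_id` in config.json):

```python
# Rebuild the decomposed-move vocab from its regular layout and encode a
# UCI move as [from-square token, to-square token, optional promotion token].

SQUARES = [f + r for f in "abcdefgh" for r in "12345678"]  # a1, a2, ..., h8

vocab = {"[PAD]": 0, "[BOS]": 1, "[EOS]": 2, "[UNK]": 3}
for i, piece in enumerate(["WP", "WN", "WB", "WR", "WQ", "WK",
                           "BP", "BN", "BB", "BR", "BQ", "BK"]):
    vocab[piece] = 4 + i                  # piece tokens: 4..15
for i, sq in enumerate(SQUARES):
    vocab[f"{sq}_f"] = 16 + i             # "from" squares: 16..79
    vocab[f"{sq}_t"] = 80 + i             # "to" squares:   80..143
for i, promo in enumerate("qrbn"):
    vocab[promo] = 144 + i                # promotion pieces: 144..147

def encode_uci(move: str) -> list[int]:
    """Encode a UCI move like 'e2e4' or 'e7e8q' into decomposed token IDs."""
    ids = [vocab[move[0:2] + "_f"], vocab[move[2:4] + "_t"]]
    if len(move) == 5:                    # promotion suffix, e.g. the 'q' in e7e8q
        ids.append(vocab[move[4]])
    return ids

print(encode_uci("e2e4"))    # [49, 115]
print(encode_uci("e7e8q"))   # [54, 119, 144]
```

Each move therefore costs two or three tokens instead of one token per legal move, which is what shrinks the vocab from 1682 to 148 entries.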