Commit 474aa21 by Valmbd · 0 parent(s) · Initial commit
README.md ADDED

# PETIMOT: Protein Motion Inference from Sparse Data

PETIMOT (Protein sEquence and sTructure-based Inference of MOTions) predicts protein conformational changes using SE(3)-equivariant graph neural networks and pre-trained protein language models.

## Installation

```bash
# Create and activate conda environment
conda create -n petimot python=3.9
conda activate petimot

# Clone and install
git clone https://github.com/PhyloSofS-Team/PETIMOT.git
cd PETIMOT
pip install -r requirements.txt
```

## Usage

### Reproduce paper results

1. Download resources from [Figshare](https://figshare.com/s/ab400d852b4669a83b64):
   - Download `default_2025-02-07_21-54-02_epoch_33.pt` into the `weights/` directory
   - Download and extract `ground_truth.zip` into the `ground_truth/` directory

2. Run inference and evaluation:
   ```bash
   python -m petimot infer_and_evaluate \
       --model-path weights/default_2025-02-07_21-54-02_epoch_33.pt \
       --list-path eval_list.txt \
       --ground-truth-path ground_truth/ \
       --prediction-path predictions/ \
       --evaluation-path evaluation/
   ```

### Compare with baseline methods

1. Download baseline predictions from [Figshare](https://figshare.com/s/ab400d852b4669a83b64):
   - Download and extract `baseline_predictions.zip` into the `baselines/` directory

2. Run evaluation:
   ```bash
   python -m petimot evaluate \
       --prediction-path baselines/alphaflow_pdb_distilled/ \
       --ground-truth-path ground_truth/ \
       --output-path evaluation/
   ```

Available baseline predictions:
- AlphaFlow (distilled)
- ESMFlow (distilled)
- Normal Mode Analysis

### Predict motions for your own PDB files

```bash
# Single PDB structure
python -m petimot infer \
    --model-path weights/default_2025-02-07_21-54-02_epoch_33.pt \
    --list-path protein.pdb \
    --output-path predictions/

# Multiple structures (provide paths in a text file)
python -m petimot infer \
    --model-path weights/default_2025-02-07_21-54-02_epoch_33.pt \
    --list-path protein_list.txt \
    --output-path predictions/
```
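The multi-structure mode expects a plain text file with one PDB path per line. A minimal sketch of building such a list with Python's standard library (the directory and function names here are illustrative, not part of PETIMOT):

```python
from pathlib import Path

def write_pdb_list(pdb_dir: str, out_path: str = "protein_list.txt") -> int:
    """Collect *.pdb files under pdb_dir into a one-path-per-line list file.

    Returns the number of structure paths written. Paths are sorted so the
    resulting list is reproducible across runs.
    """
    paths = sorted(str(p) for p in Path(pdb_dir).glob("*.pdb"))
    Path(out_path).write_text("\n".join(paths) + "\n")
    return len(paths)
```

The resulting `protein_list.txt` can then be passed directly to `--list-path`.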
configs/0_dropout_emb.yaml ADDED

model:
  input_embedding_dropout: 0.0

configs/10_layers.yaml ADDED

model:
  num_layers: 10

configs/128_emb_dim.yaml ADDED

model:
  emb_dim: 128

configs/1_mode.yaml ADDED

training:
  nsse_weight: 1.0
  rmsip_weight: 0.0
  ortho_weight: 0.0

model:
  num_modes_pred: 1

configs/2_modes.yaml ADDED

training:
  nsse_weight: 1.0
  rmsip_weight: 0.0
  ortho_weight: 0.0

model:
  num_modes_pred: 2

configs/5_layers.yaml ADDED

training:
  nsse_weight: 0.5
  rmsip_weight: 0.5

model:
  num_layers: 5

configs/8_modes.yaml ADDED

training:
  nsse_weight: 1.0
  rmsip_weight: 0.0
  ortho_weight: 0.0

model:
  num_modes_pred: 8

configs/IS.yaml ADDED

training:
  nsse_weight: 0.0
  rmsip_weight: 0.0
  ortho_weight: 1.0

configs/IS_LS.yaml ADDED

training:
  nsse_weight: 0.5
  rmsip_weight: 0.0
  ortho_weight: 0.5

configs/LS.yaml ADDED

training:
  nsse_weight: 1.0
  rmsip_weight: 0.0
  ortho_weight: 0.0

configs/SS.yaml ADDED

training:
  nsse_weight: 0.0
  rmsip_weight: 1.0
  ortho_weight: 0.0

configs/ablate_struct.yaml ADDED

model:
  ablate_structure: true

configs/change_connectivity_false.yaml ADDED

data:
  change_connectivity: false

configs/default.yaml ADDED

training:
  seed: 7
  nsse_weight: 0.5
  rmsip_weight: 0.5
  ortho_weight: 0.0
  batch_size: 32
  validation_batch_size: 32
  nb_epochs: 500
  num_workers: 6
  num_workers_val: 2
  pin_memory: false
  optimizer: "adamw"
  learning_rate: 0.0005
  weight_decay: 0.01
  scheduler_factor: 0.2
  scheduler_patience: 10
  loss_threshold: 0.6
  use_amp: true
  grad_clip: 10
  early_stop_patience: 50
  weights_dir: "weights"
  debug: false

data:
  training_split_path: "full_train_list.txt"
  validation_split_path: "val_list.txt"
  ground_truth_dir: "ground_truth"
  embedding_dir: "embeddings"
  emb_model: "ProstT5"
  noise: 0.0
  k_nearest: 5
  l_random: 10
  num_modes_gt: 4
  rand_emb: false
  change_connectivity: true

model:
  emb_dim: 256
  edge_dim: 329
  num_modes_pred: 4
  num_layers: 15
  shared_layers: false
  mlp_num_layers: 1
  start_with_zero_v: false
  input_embedding_dropout: 0.8
  dropout: 0.4
  normalize_between_layers: false
  center_between_layers: false
  orthogonalize_between_layers: false
  num_basis: 20
  num_backbone_atoms: 4
  max_dist: 20.0
  sigma: 1.0
  ablate_structure: false

configs/esmc_300m.yaml ADDED

data:
  emb_model: "esmc_300m"

configs/esmc_300m_ablate_struct.yaml ADDED

data:
  emb_model: "esmc_300m"

model:
  ablate_structure: true

configs/esmc_600m.yaml ADDED

data:
  emb_model: "esmc_600m"

configs/esmc_600m_ablate_struct.yaml ADDED

data:
  emb_model: "esmc_600m"

model:
  ablate_structure: true

configs/only_close.yaml ADDED

data:
  k_nearest: 15
  l_random: 0
  change_connectivity: false

configs/only_random.yaml ADDED

data:
  k_nearest: 0
  l_random: 15

configs/rand_emb.yaml ADDED

data:
  rand_emb: true

configs/rand_emb_ablate_struct.yaml ADDED

data:
  rand_emb: true
model:
  ablate_structure: true

configs/shared_layers.yaml ADDED

model:
  shared_layers: true
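Each config file above sets only a few keys, which suggests they are overlaid on `configs/default.yaml`. A sketch of one way such a recursive overlay could work (the `deep_merge` name and merge semantics are assumptions, not PETIMOT's actual config loader):

```python
def deep_merge(base: dict, override: dict) -> dict:
    """Recursively overlay `override` onto `base`, returning a new dict.

    Nested dicts are merged key by key; any other value in `override`
    replaces the corresponding value in `base`.
    """
    merged = dict(base)
    for key, value in override.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = deep_merge(merged[key], value)
        else:
            merged[key] = value
    return merged

# e.g. applying configs/10_layers.yaml on top of the defaults:
default = {"model": {"num_layers": 15, "emb_dim": 256}}
override = {"model": {"num_layers": 10}}
print(deep_merge(default, override))
# → {'model': {'num_layers': 10, 'emb_dim': 256}}
```

Untouched keys (here `emb_dim`) keep their default values, so an override file only needs to name what it changes.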
eval_list.txt ADDED
3F59A_3F59D
1A2KA_1JB5A
6X6AC_6X6AG
3H3PH_4LLVE
1BBHA_1BBHB
3AAPA_4BR4A
1R5ZA_1R5ZA
5A1WH_5A1YH
5ZFQA_6OJZC
1B75A_1B75A
1VVJR0_4V4IU
2JRCA_2JRCA
1TKLA_1TXNA
2A50A_2A52C
1ACPA_1ACPA
1Q8GA_7M0GA
1ECWA_2K4IA
5F17A_5F17D
1EUCA_2FP4A
4V4NA5_4V4NA5
4JVZA_4N8FA
6CYJA_6CYJB
1CODA_1G6MA
4LDVA_4LDVA
1II6A_6TA4K
2BVJA_2BVJB
6JY5A_6JY5E
1LILA_4QF1L
1HURA_1HURA
1LREA_1OV2A
6CQDA_6NFZA
8AB2A_8AB3D
1ZTPA_1ZTPA
7W5ZS_8GZU18
6LKGA_6LKLB
8FY9B_8FYAC
3FPQA_5WE8A
1Z3KA_2CI9A
5QSTA_6RRCC
3J7Oh_6HCFh3
1G57A_1IEZA
1CPQA_1EKYA
1BX4A_5KB6A
8DKEP_8DKWP
1C5WB_5WXFU
3VD8A_6QF7C
6XHPA_6XHTB
6HIVAt_7AOIAt
3IYDE_5MS0O
5CUFA_6N0MA
2XU7A_6G16A
6SSJA_8C9MA
1B1GA_1HT9B
5W1GH_5W1MB
1X4JA_4V3KC
2I80A_7U9KB
1BQLH_4CADH
4NXTA_5X9BA
1SK3A_1SK4A
4CFEB_4RERB
6F2DA_6F2DA
4KGHA_4KGOB
1JQJA_1MMIA
1A9UA_8A8MA
3SL7A_3SL7B
1LAFE_1LAFE
2YRQA_2YRQA
1AYGA_5AUSC
1T94A_1T94A
1GAV0_1UNAB
1LQ8A_1LQ8E
1WZ3A_7EU4K
2GCFA_2GCFA
3JC8Ta_3JC8Ta
1J1DC_6KLUC
6KN7T_7KO5T
3CHXB_7S4MB
1DLFL_6AD0L
6DM9A_6DM9C
1TDHA_6LWQD
1XX9A_6AODC
2OX8A_2OX9C
5L3SB_5L3WA
1Y5OA_1Y5OA
1JGNA_1JGNA
5IPPA_5IUFA
3CC6A_3ET7A
1BVHA_1BVHA
2EB7A_3GFIA
5O3PA_6WUEA
1QCQA_1UR6A
1KO2A_6BM9B
4CPCA_4CPCA
6O6NA_6O6OA
1FAIH_2F19H
1FU5A_1OO3A
1QXHA_4AF2A
5IL0B_7O2FB
5M64M_6RWEM
4V1Ai_6VLZd
2AW6A_2AXZA
1KY9B_3MH4B
1C9KA_1C9KA
7XCNM_7XCNN
4MFJA_6IIXA
1HL6B_1OO0A
4P6VC_4P6VC
4PLQA_4PLSC
4FZQA_4FZQC
3N9IA_8I1WB
5KENC_7UREH
6RXTUE_6RXTUI
4GPKA_5DBKB
1Y7MA_2MTZA
7KHAB_8DFOB
5XUAA_5XUBB
2WFF3_2WS93
4WCEV_5TCULB
1ACMA_1EKXA
2KCNA_2NB0A
5UXBA_5UXDA
6TB9A4_6TB9A5
3AU3A_3AU3A
5F3KA_5F5RB
2LS7A_6P6CA
2K2JA_2K2JA
7KZMZ_7KZNZ
3JTMA_3N7UB
6Q8FA_6Q8IE
2H01A_4L0UJ
8ECOA_8ECOB
1ALUA_4ZS7A
1F9XA_1TFQA
1AGQA_6Q2NA
5OHKA_5OHPA
5Y6P44_5Y6PD2
1BFOA_3B9KC
1GZKA_1GZKA
1JC0A_4W6PH
2EFCB_2EFDD
1A7LA_4O4BA
1DVVA_2EXVA
6W192_6W19p
6NR83_6NRB3
1DAPA_1DAPA
5UDQA_5UDRC
7X5EB_7X5EB
1JS9B_1JS9C
1MFGA_1N7TA
1KTZB_1PLOA
3Q6GH_6P60A
3RJRA_5FFOD
3J9MAF_6NU3AF
5LI0e_6S13e
6SPBF_6SPFF
6YPZA_6YQ3B
5GW0A_5GW0E
1C9PB_1C9TH
7PQ2A_7PQAA
3J7Yc_4CE4h
7OLCLR_7Z3OLR
1I7PA_1NDHA
1EX6A_1EX6A
5CJ8A_5CJ8A
4D0YA_4D0YB
2V8QB_4REWB
2D00A_5GASL
1BMPA_6OMOJ
2FE8A_6WUUB
2K1WA_2K1XA
6HE4H_6HE4M
3IYKA_7RTNB
4DAGH_4XMKJ
5GMUA_5GO2A
1MKIA_3AGFA
1QZZA_1XDUA
3J04B_7UDUE
3J9WAR_6S13r
2ZCHP_2ZCHP
1M5Q1_1M5QM
6X89G1_8E73G2
1AH1A_2X44D
4QJBA_4ZEVB
1A8EA_1A8EA
7C9XA_8GSD1
7DH7A_7DH7D
6NCLc6_8H2IbI
2J28Y_4V5BAY
1AGWA_3RHXA
3WKLA_3WKMA
1GMEB_2BYUC
6IJO2_6IJO2
3J6X59_6T7TLR
1AARA_5J26B
1A4HA_2AKPA
5FS4A_5JZRB
1C9BA_5WH1A
4XHRM_4YTXO
1TJLA_7KHIM
3ECOA_4L9TB
1F3MC_4ZJJD
2B0JA_3DAFA
3KTGA_7FESM
3IABA_6W6VF
1HZ5A_1HZ6B
2N7LC_7SC3A
1AL22_6Q0B2
5ZGB2_5ZGH2
6U0TA_8G3D1X
4QYZD_5CD4G
3GOYA_3SMJA
1U7LA_7TMMO
5ZE7A_5ZE7A
6P8VA_6P8VE
2BNDA_2BNDB
3A1JB_3GGRB
6Q975_7ACR5
1SEIA_1SEIB
3B6YA_3RNUA
2BX9A_2KO8A
3AUXA_3AUYA
2A4EA_3LNDD
1LAFE_6MLPE
8EDUA_8EDUG
2BJ1A_2BJ9B
2EAYA_3FJPA
6TP9A_6TP9G
4OCJA_4OCJA
2ZJTA_7UGWD
1PP7U_1PP8M
3J7OB_4V5ZBb
7D2SA_7D2SA
1XLSE_1XNXA
2ZW4A_2ZW4D
4Z3WE_4Z3ZF
6F1CA_6F39B
5T1PA_5T1PE
1E7WA_2XOXA
4V61BV_4V61BV
1G1QA_1G1RB
1JM4B_1JM4B
2LR6A_3SOOB
3I3YA_3IKHC
1S5LH_1S5Lh
2WEWA_2WEXA
6JITA_6JITB
1AXBA_6B2ND
5SUPA_7LUVM
2Q80A_6C56A
3SOPA_6UQQD
1YZTA_2OT3B
2VRGA_2VRGA
2LXNA_7D97D
1EQFA_7T36A
3DHWA_3DHWB
5YQQA_5YQQA
4V6WAf_7OLDSf
7L6KA_7LF7K
1X8GA_3F9OA
3J9TR_5VOZT
6ECBA_6ECDA
1J2FA_5JEMB
3SW1A_5LUVB
3J0LK_4V7HAK
1L8QA_2HCBC
6ZXBA_6ZXCC
7MWYA_7MWZA
2NTXA_2NTXA
5GUPa_5O31h
6JMXA_6JN8A
2HDAA_4LE9A
2V1SA_2V1TB
2OEXA_4JJYB
1SAWA_6FOHB
6VQ6L_7KHRL
3UO1H_6BPEE
1CP3A_7XN5A
5MYJBP_7NHKP
5H1AA_7CUOB
2VL0A_6V03C
4X8WA_4X8WE
1QMYA_2JQGR
3EI1B_4A0KD
3U5ZB_3U60C
2YKRE_4V6OAH
1K3WA_1Q39A
5WRHA_6JZRA
4W6ZA_4W6ZD
4L7ZA_4L80A
5L1XA_8E15F
7DWNA_7DWND
4QX6A_5JYFC
3NKDA_5VVKA
5LI0b_6S13b
5DPOA_6K3BB
5LI6A_5LI7A
7ADSA_7ADSA
1DFVA_6SUAA
7RGSA_7RGSA
1DV4G_4V4GAG
1AL21_5KU01
1A22A_1Z7CA
4TL6A_6X61G
7QIHB_7QIHB
1L7IH_8HIJH
1UDVA_2A2YB
1APCA_5AWIA
6NJ8A_6NJ8A
2H0BA_6PNPA
6N5MB_7EALC
1F9NA_1F9NA
2J28P_2J28P
1IJ9A_1VSCA
6SPCh_7UNVh
3B5HA_7XY8A
4ARJA_4ARJB
2YX1A_2ZZMA
1HQME_8HSRK
5BS1A_5BS1A
1B62A_7P8VA
3T12A_3T1QA
4YMKA_4ZYOA
7OX1G_7OX6A
5MRCEE_8D8KE
1FOSF_1FOSH
4S04A_4S05B
2LW5A_2LW5A
7Y97A_7Y9OA
1Q05A_4WLSB
6II0A_6II0D
7A6HG_8ITYG
1C7UA_1EGWA
1URKA_2I9BC
4L8PA_4N3VA
7C4PA_7ELKA
4UT9H_6OL6H
4XA8A_4XA8A
3LREA_7RSIC
2JBXA_2JBYA
3JCRD_7ABGD
1H6TA_4AW4C
1FJGQ_4V4Jr
3B1NA_3B1OB
2ETNA_2ETNB
1AP4A_3RV5A
3DL3A_3DL3E
3EZJB_6WARD
4NFTA_7WLPA
5TVFB_5TVMB
5A5TL_7QP65
6SGACd_7ANEq
1QBZA_8DVDA
3GKKA_5IOXA
1VK6A_5IW5B
1V1HA_1V1IC
1EWIA_1EWIA
5FVCA_5FVDC
1J5AM_4V4RB5
2KPWA_3UMNC
4V4NBT_5JB3T
2RHEA_7WE9K
2IYEA_2YJ4A
4OIGA_4OIGA
2YKRS_4V70AS
1UFKA_1UFKA
5CZRA_7N86D
1GPWA_1GPWC
2MR3A_6J2XO
3MSYA_4H83F
3HZSA_3VMTA
4GLRH_8E7MH
2WW9L_6OIGY
5KK2E_6NJLH
7C42A_7C4CA
7F0RA_7F0RA
4EIYA_5UIGA
1C39A_2RL7D
1BA2A_1URPA
7NDRA_7NDRA
1IKUA_1LA3A
2OWQA_2OWQB
2LN0A_2LN0A
8EIZA_8EIZD
6YCXF_6YCYE
3JCKE_3JCKE
1KYWA_1KYWC
4AQ5C_4BOG1
3JCMP_5ZWOb
3X3MA_5EBDA
1GAGA_2B4SB
4KBLA_4KBLB
3QF7A_3THOA
4HKRA_4HKRB
1WT5A_7L57H
3J7Yk_3J7Yk
2XNHA_6HM5A
4CBVA_4CBVD
6W1SE_6W1SE
4K3VA_4NNPA
4D10H_6R7NH
1MWKA_1MWKA
8CJZG_8CJZg
3WT1A_3WT2B
1HQMA_1HQMA
5EXIA_5EXKK
5KKOA_5KKOB
2ZNIA_2ZNIA
8ESQE_8ETJE
3J7Oc_4V5ZB6
8AX4A_8AX4C
4N7WA_4N7XA
4C8VA_4C8WI
1Z3RA_1Z66A
5GJQR_6EPER
6EZOE_6EZOF
2IVSA_6VHGA
7PWFP_7PWFP
1I84U_7QIOP
1IKNA_1LE9A
1A3RH_5W23J
6Y61A_7B6TB
1U4HA_5JRUF
2JPHA_2JPHA
5TJ5B_5VOYR
2AE0X_2PI8D
1BA2A_1DBPA
1ABEA_9ABPA
4REGA_4REGA
4C2MD_6H68D
4AJ5K_4AJ5P
4EJ7A_4FEVE
2W6RA_2W6RA
8AP6M1_8APEM1
1D4M1_1D4M1
1XJ5A_6O64H
1KELH_4HLZI
2CAZB_2P22B
4V0KA_4V0MA
4KU7A_4KU8A
7UQLA_7UQLB
1G85A_2HLVA
7O4HD_7O4HD
5A9RA_5A9RA
7EQEA_7EQEA
3AJMA_3L8JA
5FMGL_6MUVL
5VKUg_7ET3g
6AJMA_6AJMA
3J3VT_3J3WT
4HYEA_4ZMRB
5EGPA_5EGPA
4XVPD_7Q1ED
2D3AA_7V4LC
1QMVA_5IJTF
5W78A_5W7BA
7DT0A_7DT0E
5GKEA_5GKEA
1DYLA_2YEWG
1FJGJ_6GZZJ4
6IFSA_6IFXA
4CI1B_6BN9B
1B7EA_1B7EA
1ONCA_1ONCA
5U07D_6C66D
1NKWL_7A0RK
3J7PSV_6FECb
4V61AE_6ERIBE
2V3JA_7D4IRG
7ZTWB_7ZU4A
1L8LA_6HYYB
3B0WA_6FGQA
1Y9RA_4UDAA
1H2CA_4LDDC
7F9KA_7F9KA
1IAZA_4TSYD
2G76A_6RIHA
1IZNA_6F38K
5A9Q7_7PEQCG
2L7EA_6MIPA
6J3Y11_6J3Z16
7SBDH_7SD2A
6X5ZO_8ENCP
1BXIA_1IMQA
4ML1A_4MLYB
6LTNA_6M66A
4YV6A_6W1AA
6XGPA_6XGQA
1OVZA_1OVZB
2ZPMA_3ID3A
1XWRA_1ZS4B
1AN2A_1R05A
3J7774_6T83Wy
7FBIN_8HEBE
6ANOA_6AOZC
2A74C_2QKIC
3J1OI_3J1OI
6KVNA_6LCTA
5GUPb_7DGSc
2CMRA_3MA9A
1ACAA_2ABDA
5OJ8A_7AIEA
1FNTD_5WVID
1TRJA_4V6IAa
6TAQA_6TAQC
2V5MA_4X83A
7BTWA_7BTWA
4FU3A_4Q96A
1AO8A_2L28A
6VOYA_7PELA
3J1ZP_8F6KC
4JZBA_4K10D
1QOHA_7U6VD
3JCTb_7OHPb
6S6BA_6S8BC
3U1SH_5UXQH
1DFBH_3Q6FF
7EY0A_7WLZP
1A32A_6HTQo
2LVWA_2LVWA
6DJLB_6DJLC
2QF0A_3B8JA
2Y9KA_2Y9KB
7DN2a_7F2Pi
2LE2A_3ZOQC
1SW1A_1SW1A
1F9QA_4HSVD
2WDPA_6DEUA
6H9CA_6H9CA
4X5MA_4X5MB
6LNBB_6PIFF
7PKNQ_7XHNQ
3VD6C_4HC7A
3R90A_6MS4A
1AUTC_6M3CG
2WV0A_4U0WB
6NR85_7WU75
7V9FA_7V9IA
5CRAA_5CRBA
1G0WA_3B6RB
1ES7B_3NH7B
5D0YA_5D0YB
4IHBA_7K6BA
7WHRA_7WHTD
5O6SA_5O6TC
2OQHA_2OQHD
1REPC_2Z9OB
4GC5A_4GC9A
6K7XA_6WDOA
1AJ6A_5MMPA
6HIVDK_7PUBDK
3PILA_3PINB
1C1AB_8E14B
5USCA_6U60B
7SN9A_7SN9i
1IARB_6WGLC
5V44A_5V46A
6KE6RF_7D4IRF
6G18y_6ZXDy
2VGMA_3IZQ0
1A02F_1A02F
1DN0B_1DN0B
1MF2H_1OTSC
3BCHA_4V5ZAb
1BVNT_1HOEA
2Q81A_2Q81B
5GMKP_5MPSK
1ARQA_1QTGA
5J09A_5J09A
4U3MSM_5DATSM
6A27A_6BDUA
6WMAA_6WMCA
2YPLD_3O4LD
1JK9B_5U9MB
3OETA_3OETB
3TGOC_6LYSC
3TTKA_3TTMA
4ETVA_6MM6F
132LA_1E8LA
1EDHA_3Q2LB
3J22A_4C0UA
3JCSH_4V8MBO
3J7PSK_6OLZBK
7P37A_7P37A
7OLCSR_7OLDSR
6PQEA_6PRPA
2M7KA_5XOPE
1MR3F_2K5UA
1DCTA_3UBTB
1VK1A_5X0EA
2CGHA_3L1AB
3J9WB0_6HTQW
1SLQA_2B4HA
1BMQB_1SC1B
1JN0A_7Q54R
4V4NBF_4V4NBF
1PK2A_1PK2A
4JDQA_4JDXF
1S50A_4QF9C
5GJQV_5LN3V
7C4SA_7C4SA
3J6XL5_4V6IBQ
1NY6A_1NY6H
3T5NA_3T5QK
6OECA_6OECH
1O7BT_2N40A
2J28E_4V66B1
1P34E_6ESIE
3KLTA_3TRTB
7Y4L53_7Y5E53
1EJ3A_1SL8A
4V61B5_4V61B5
1CKMA_1CKMA
1HX5A_1P3HA
6VO6A_6VO6C
1AP2A_6LHTL
5ADXA_6F38I
3QILA_3QILC
5GREA_6KDFA
1SA0E_8ASNE
3W6GA_8HLAL
4CSFA_4J4YB
3JCKH_3JCKH
5OEYA_5OEYB
2JTCA_2JTCA
2VPNA_2VPOB
7OODs_7PASs
3J9MAS_6NU3AS
3J6X55_4V7HBL
1ID0A_3CGZC
1VVJR4_4V4ZB3
8ESQu_8EUPu
1Q5HA_7BPDA
5HJIA_5HJJA
3J2WQ_5VU2Q
3GB4A_6VSHA
1NKWE_4V4GBH
2AQLA_6AGOD
2H3HA_3C6QA
1G47A_1G47A
4MSOA_4MSOA
3J7Pq_7A01K2
4DZDA_4DZDA
5LPUA_7PC7A
2POHA_4LINL
6HQCA_6QAYA
1DMWA_6HPOA
6IDEA_6IDEB
5COLA_5COLA
5LF9A_5LORB
3APZA_3AQ0D
2R6AA_2VYFB
2ZU6B_2ZU6B
2ZIXA_2ZIXA
7OC4A_7OC6A
4EHWA_4EHYA
3GLRA_5D7NF
1F4TA_5BV5C
1OCCH_5Z62H
1YNMA_1YNMA
6NR86_7WU76
1GGGA_8EYZG
1PB7A_4KFQB
1ZW0A_1ZW0B
6WXWA_6WXYB
4V8MA2_4V8MA2
4YX1A_4YX1A
6P7OA_6P7QB
3J7ON_4V5ZBm
5DWZC_5DWZC
4GRCA_4GRCA
3FK3A_3RLSA
7ET4A_7ET5A
1NCJA_4NUPB
2OGXA_6RJ4E
5Z1GA_7OHVJ
1BE4A_7KPFE
1I42A_1S3SH
3EEBA_7D5YA
5MQFQ_7ABFQ
6HIVBH_6HIVBH
2KJ3A_2KJ3B
1STFI_4N6V0
5GUPW_5XTDW
1U8YA_5CM8B
1TYFA_1YG6C
5FJ9I_7Z2ZI
2NW2A_4DZBA
1JDNA_1JDNA
2MZWB_2MZWB
1FS0G_5T4QG
5I5DA_5I5HA
1A4YB_1GV7A
1L4UA_2IYUA
1BCXA_3EXUA
3JCTu_6C0Fu
6CAJI_7RLOI
3J9WAL_7ASPc
2ZXYA_3X15D
1NJMT_4V4GDW
2KIIA_2KIIA
1YDEA_5ICSA
3DZUA_4NQAA
2MM4A_2MM4A
1NKSA_1NKSE
6ZJ8A_6ZJ8A
4BM9A_6GLCA
1A7XA_7T64E
1ADQA_6F2ZB
1W33A_1W3ZB
3PV2A_3PV5B
2B69A_2B69A
3PHGA_4FSHB
2WCPA_3MVSA
4IUFA_4Y00B
6GIQi_8EC0R
5IJNG_7WKKk
3JCTr_6ELZr
4JPHA_5HK5E
3FH2A_6CN8A
1VFGA_4X0BA
2I1AA_4Z2ZA
7EWIA_7EWJG
1MUMA_1OQFA
6B4CA_6B4CD
5O5JG_6JMKA
5IBLL_6UULD
1B4AA_1B4AE
6JXNA_6JXNB
2D9SA_2JUJA
1K9UA_1K9UA
1BU2A_2EUFA
4Q37A_4Q37E
7ANCA_7CXTB
4QRZA_4RJZA
5MW5A_5MW7A
2K1BA_4X3TD
4O64A_6ASBI
1Y8GA_2HAKA
2LHNA_5L2LB
2J28D_4V75BD
1Z2CB_2BNXB
1BCCE_1BE3E
6S3DM_6XWIA
5B0UA_7SNYA
7EYIG_7N5SA
4C3BA_6G0YC
1PKQA_7WUHL
3J0JI_3V6IB
5Z6ZA_7DW5A
6FDDA_6FDDC
4V61AF_4V61AF
1B3TA_6NPMA
3ZBEA_5CW7M
1JQJA_1JQJA
1BY4A_6XWGC
1IB2A_3GVOA
7W2BA_7W2HT
3KKBA_4LLCB
1QBHA_1QBHA
2KT6A_3L48D
7F0OA_7F10B
2JSFA_2JSFA
1GXPA_1QQIA
8ECKA_8ECKG
3J40A_3J40G
1VVJRU_5IMRm
1DYTA_2LVZA
1EXIA_7CKQG
1Y0LB_7V8QD
2AJFE_2GHVE
2WZPR_2X531
1QK9A_1QK9A
2CT9A_2E30A
5VNYA_5VNYA
1LM8C_6R7NQ
2OSSA_3UW9A
6HIVDO_6HIVDO
2RR8A_2RR8A
5O5JD_5V93d
2J28U_4V75BU
2MZBA_3RSBC
6P4LA_7N7FB
1YMYA_1YRRA
3AQFB_6UMGc
6YNXR_6YNZR3
1GEFA_1IPIB
2Y7LA_2YLHA
7EMFF_8GXSf
4KZXl_6ZVJN
7N85A_7N85G
1N69A_1N69A
1N88A_6GZXS2
6ACUC_6IIOC
2QFIA_3H90D
8AS8C_8BFNC
1H1DA_4PYKA
3NXCA_4GFLB
3X2RA_4UV2A
4G6FB_4G6FB
1B6FA_6R3CA
4V61BE_4V61BE
3ELLA_3MOLA
1WCWA_1WD7B
3WG9A_3WGIC
5ZZ8r_5ZZ8r
3AAZA_5IIEA
4C2MI_4C2MI
6T3ZA_7T7II
8ECNA_8ECNB
5LC5E_7DGQ9
2J4JA_2J4KA
4QNCA_4QNCB
2AVAA_2AVAA
7TJYX_7TKSX
6TYTA_6TYTA
2CCQA_2HPLA
1AKEA_1AKEA
3DTPE_3JBHE
6GIQc_8EC0O
7K9RA_7K9SD
2X3FA_3AG6A
5WC6A_5WC8A
5MSMA_5MSMA
5LI0i_6S13i
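Each line of `eval_list.txt` names a conformational pair. The entries appear to be a four-character PDB ID followed by a chain identifier for each endpoint (an assumption inferred from the list itself, not documented behavior — note chain identifiers can be more than one character, e.g. `4V4NA5`). A small parsing sketch:

```python
def parse_pair(line: str):
    """Split an eval-list entry like '1A2KA_1JB5A' into two (pdb_id, chain) tuples.

    Assumes the first four characters of each token are the PDB ID and the
    remainder is the chain identifier.
    """
    src, dst = line.strip().split("_")
    return (src[:4], src[4:]), (dst[:4], dst[4:])

print(parse_pair("1A2KA_1JB5A"))
# → (('1A2K', 'A'), ('1JB5', 'A'))
```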
full_list.txt ADDED
The diff for this file is too large to render. See raw diff
 
full_test_list.txt ADDED
@@ -0,0 +1,5585 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
132LA_193LA
132LA_1E8LA
132LA_1LKRB
132LA_6P4BD
132LA_6S7NA
1A02F_1A02F
1A02F_1FOSE
1A02F_1FOSG
1A02F_1S9KD
1A02F_2WT7A
1A22A_1A22A
1A22A_1BP3A
1A22A_1HGUA
1A22A_1HUWA
1A22A_1Z7CA
1A2KA_1A2KA
1A2KA_1GY5B
1A2KA_1GY6A
1A2KA_1JB2A
1A2KA_1JB5A
1A2NA_1EJCA
1A2NA_3VCYA
1A2NA_4E7BA
1A2NA_4E7BD
1A2NA_5VM7B
1A32A_1A32A
1A32A_3J9WAO
1A32A_5NJTO
1A32A_6HA1o
1A32A_6HTQo
1A3RH_1BBDH
1A3RH_1CR9H
1A3RH_3NIDH
1A3RH_3ZE2E
1A3RH_5W23J
1A4HA_1AH6A
1A4HA_2AKPA
1A4HA_2WERA
1A4HA_6CJPA
1A4HA_6CJSA
1A4YB_1GV7A
1A4YB_1H0DC
1A4YB_4AHLA
1A4YB_4B36A
1A4YB_5EOPA
1A7AA_1B3RA
1A7AA_1D4FB
1A7AA_1D4FC
1A7AA_1K0UA
1A7AA_5W49A
1A7LA_2D21A
1A7LA_4O4BA
1A7LA_6CXSD
1A7LA_6LESB
1A7LA_7FFTA
1A7XA_1C9HA
1A7XA_1FKTA
1A7XA_6X36A
1A7XA_7T64E
1A7XA_8DUJE
1A8EA_1A8EA
1A8EA_1A8FA
1A8EA_1BP5B
1A8EA_1BP5C
1A8EA_1TFDA
1A8IA_1FA9A
1A8IA_1LWNA
1A8IA_1XC7A
1A8IA_5GPBA
1A8IA_9GPBB
1A9UA_3BV3A
1A9UA_3OHTA
1A9UA_5WJJA
1A9UA_6ZQSA
1A9UA_8A8MA
1AARA_1AARA
1AARA_2L3ZA
1AARA_5J26B
1AARA_6DJ9K
1AARA_6OB1A
1ABEA_1ABEA
1ABEA_2WRZA
1ABEA_2WRZB
1ABEA_8ABPA
1ABEA_9ABPA
1ACAA_1ACAA
1ACAA_1NTIA
1ACAA_1NVLA
1ACAA_2ABDA
1ACAA_2CB8A
1ACMA_1EKXA
1ACMA_2A0FC
1ACMA_2ATCA
1ACMA_2QGFA
1ACMA_6KJ9A
1ACPA_1ACPA
1ACPA_2FHSC
1ACPA_2K93A
1ACPA_4ETWB
1ACPA_5KOFD
1ADEA_1ADEA
1ADEA_1ADEB
1ADEA_1HOPB
1ADEA_1KKFA
1ADEA_1SOOA
1ADQA_1OQOA
1ADQA_3RY6B
1ADQA_5HVWA
1ADQA_5V43B
1ADQA_6F2ZB
1AFOA_1AFOA
1AFOA_1AFOB
1AFOA_2KPFA
1AFOA_2KPFB
1AFOA_7UZ3B
1AGQA_1AGQB
1AGQA_1AGQD
1AGQA_2V5EB
1AGQA_3FUBB
1AGQA_6Q2NA
1AGWA_3CLYA
1AGWA_3KXXB
1AGWA_3RHXA
1AGWA_6P68C
1AGWA_6PNXB
1AH1A_1AH1A
1AH1A_2X44D
1AH1A_3BX7C
1AH1A_5GGVY
1AH1A_6RQMA
1AJ6A_3G7EA
1AJ6A_5MMPA
1AJ6A_5Z4OB
1AJ6A_7DQMA
1AJ6A_7P2WA
1AKEA_1AKEA
1AKEA_4AKEA
1AKEA_4X8HA
1AKEA_4X8MA
1AKEA_7APUA
1AL21_1AR91
1AL21_3JBG1
1AL21_5KTZ1
1AL21_5KU01
1AL21_6P9O1
1AL22_1HXS2
1AL22_1POV0
1AL22_5KU02
1AL22_6P9O2
1AL22_6Q0B2
1ALUA_1IL6A
1ALUA_1P9MB
1ALUA_2IL6A
1ALUA_4CNIC
1ALUA_4ZS7A
1AN2A_1AN2A
1AN2A_1HLOB
1AN2A_1R05A
1AN2A_5EYOC
1AN2A_8OTSN
1AO8A_1AO8A
1AO8A_1LUDA
1AO8A_2HQPA
1AO8A_2L28A
1AO8A_2LF1A
1AONA_1AONA
1AONA_1GRUH
1AONA_3WVLD
1AONA_4AB3A
1AONA_4HELM
1AP2A_6BSPB
1AP2A_6LHTL
1AP2A_6UJCF
1AP2A_7UZ8N
1AP2A_7WP6I
1AP4A_1AP4A
1AP4A_2L1RA
1AP4A_3RV5A
1AP4A_3RV5C
1AP4A_4GJEA
1APCA_1APCA
1APCA_1YZAA
1APCA_1YZCA
1APCA_5AWIA
1APCA_5AWIB
1ARQA_1ARQA
1ARQA_1ARQB
1ARQA_1NLAA
1ARQA_1PARB
1ARQA_1QTGA
1AUPA_1AUPA
1AUPA_1BGVA
1AUPA_1HRDA
1AUPA_1HRDC
1AUPA_1K89A
1AUTC_1AUTC
1AUTC_3F6UH
1AUTC_6M3CA
1AUTC_6M3CC
1AUTC_6M3CG
1AVGL_1AVGL
1AVGL_1BBRJ
1AVGL_1ETSL
1AVGL_1UCYL
1AVGL_1VITL
1AXBA_1JWPA
1AXBA_4OQIA
1AXBA_4RVAA
1AXBA_5NPOB
1AXBA_6B2ND
1AYGA_1AYGA
1AYGA_3VYMA
1AYGA_5AURG
1AYGA_5AUSA
1AYGA_5AUSC
1AZ6A_1AZ6A
1AZ6A_1AZHA
1AZ6A_1AZKA
1AZ6A_1CBHA
1AZ6A_2MWJA
1B05A_1B05A
1B05A_1RKMA
1B05A_3TCFD
1B05A_3TCFF
1B05A_3TCHA
1B0LA_1CB6A
1B0LA_1LCFA
1B0LA_1LFHA
1B0LA_7JRDB
1B0LA_7N88B
1B1GA_1BOCA
1B1GA_1HT9A
1B1GA_1HT9B
1B1GA_1KCYA
1B1GA_1KQVA
1B3TA_1B3TA
1B3TA_6NPMA
1B3TA_6NPPA
1B3TA_8DLFC
1B3TA_8DLFD
1B4AA_1B4AA
1B4AA_1B4AC
1B4AA_1B4AD
1B4AA_1B4AE
1B4AA_1B4AF
1B62A_1B63A
1B62A_5AKBC
1B62A_5AKCH
1B62A_5AKDD
1B62A_7P8VA
1B6FA_1B6FA
1B6FA_1BTVA
1B6FA_4A80A
1B6FA_4BKCA
1B6FA_6R3CA
1B75A_1B75A
1B75A_1D6KA
1B75A_4V6PBX
1B75A_4V71BV
1B75A_4V76BV
1B7EA_1B7EA
1B7EA_1MM8A
1B7EA_1MUHA
1B7EA_3ECPA
1B7EA_4DM0A
1BA2A_1BA2A
1BA2A_1BA2B
1BA2A_1DBPA
1BA2A_1DRJA
1BA2A_1URPA
1BALA_1BALA
1BALA_1W4HA
1BALA_2BTGA
1BALA_2CYUA
1BALA_2WXCA
1BBHA_1BBHA
1BBHA_1BBHB
1BBHA_3VRCA
1BBHA_5GYRA
1BBHA_5GYRJ
1BCCA_1BE3A
1BCCA_5GUP5
1BCCA_5J4ZAA
1BCCA_6QBXa3
1BCCA_7DGSk
1BCCE_1BE3E
1BCCE_2BCCE
1BCCE_7DGRA0
1BCCE_7DGRp
1BCCE_7DGSp
1BCXA_1BVVA
1BCXA_3EXUA
1BCXA_3HD8B
1BCXA_3VZLC
1BCXA_5TVVB
1BE4A_1BE4A
1BE4A_4ENOA
1BE4A_4ENOB
1BE4A_7KPFD
1BE4A_7KPFE
1BFOA_1C5DA
1BFOA_1FN4A
1BFOA_3B9KC
1BFOA_5W5XL
1BFOA_7TE4L
1BKTA_1BKTA
1BKTA_1SCOA
1BKTA_2CK4A
1BKTA_2K4UA
1BKTA_2MLAA
1BL9A_1HZUA
1BL9A_1HZVA
1BL9A_1N15A
1BL9A_5GUWM
1BL9A_6TPOA
1BMPA_1LXIA
1BMPA_1M4UL
1BMPA_2QCWA
1BMPA_2R53A
1BMPA_6OMOJ
1BMQB_1BMQB
1BMQB_1SC1B
1BMQB_2FQQB
1BMQB_3D6HB
1BMQB_6KN0B
1BNCA_2GPSA
1BNCA_2J9GB
1BNCA_3RUPA
1BNCA_3RV3A
1BNCA_4HR7A
1BQLH_1EO8H
1BQLH_1Q0XH
1BQLH_3VI3F
1BQLH_4CADH
1BQLH_5NBWH
1BQMA_1HVUA
1BQMA_1LWEA
1BQMA_5HLFA
1BQMA_6AOCC
1BQMA_7SJXB
1BU2A_1BU2A
1BU2A_1JOWA
1BU2A_1XO2A
1BU2A_2EUFA
1BU2A_2F2CA
1BVHA_1BVHA
1BVHA_1C0EA
1BVHA_1Z12A
1BVHA_3N8IA
1BVHA_7KH8B
1BVNT_1HOEA
1BVNT_1OK0A
1BVNT_2AITA
1BVNT_3AITA
1BVNT_4AITA
1BX4A_2I6AA
1BX4A_2I6AB
1BX4A_4O1LB
1BX4A_5KB6A
1BX4A_5KB6B
1BXIA_1E0HA
1BXIA_1EMVA
1BXIA_1IMQA
1BXIA_2K5XA
1BXIA_2VLQA
1BY4A_1BY4A
1BY4A_1R0NA
1BY4A_1RXRA
1BY4A_6FBRA
1BY4A_6XWGC
1C1AB_1C1AB
1C1AB_4FW1A
1C1AB_7JN3F
1C1AB_8E14B
1C1AB_8E14F
1C39A_1C39A
1C39A_2RL7C
1C39A_2RL7D
1C39A_2RL8A
1C39A_3K41B
1C5WB_1LMWD
1C5WB_1W12U
1C5WB_4DVAU
1C5WB_4DW2U
1C5WB_5WXFU
1C7UA_1C7UB
1C7UA_1EGWA
1C7UA_1EGWB
1C7UA_3MU6A
1C7UA_6C9LE
1C9BA_1C9BA
1C9BA_1TFBA
1C9BA_2PHGA
1C9BA_5WH1A
1C9BA_5WH1C
1C9KA_1C9KA
1C9KA_1C9KB
1C9KA_1CBUA
1C9KA_1CBUB
1C9KA_1CBUC
1C9PB_1C9PB
1C9PB_1C9TG
1C9PB_1C9TH
1C9PB_1C9TK
1C9PB_1EJAB
1CKMA_1CKMA
1CKMA_1CKMB
1CKMA_1CKNA
1CKMA_1CKNB
1CKMA_1CKOA
1CLQA_1CLQA
1CLQA_1IG9A
1CLQA_1IH7A
1CLQA_3RMBB
1CLQA_4KHNA
1CODA_1CODA
1CODA_1COEA
1CODA_1G6MA
1CODA_1V6PA
1CODA_1V6PB
1CP3A_3DEHB
1CP3A_4JQYB
1CP3A_4PS0A
1CP3A_7XN5A
1CP3A_7XN6A
1CPQA_1CPQA
1CPQA_1CPRA
1CPQA_1EKYA
1CPQA_1NBBB
1CPQA_1RCPB
1D0XA_1FMWA
1D0XA_2AKAA
1D0XA_2XELA
1D0XA_6Z2SA
1D0XA_6Z7UA
1D4M1_1D4M1
1D4M1_3J2JA
1D4M1_8AT5A
1D4M1_8AW6A
1D4M1_8AXXA
1D6MA_1D6MA
1D6MA_1I7DA
1D6MA_2O19B
1D6MA_2O59A
1D6MA_2O5CA
1DAPA_1DAPA
1DAPA_3DAPB
1DAPA_5LOAA
1DAPA_5LOAB
1DAPA_5LOCB
1DCTA_1DCTA
1DCTA_1DCTB
1DCTA_3UBTA
1DCTA_3UBTB
1DCTA_3UBTY
1DFBH_3Q6FB
1DFBH_3Q6FF
1DFBH_4OD2B
1DFBH_7KQ7H
1DFBH_8DB4A
1DFVA_1NGLA
1DFVA_1X89C
1DFVA_6SUAA
1DFVA_6Z2CC
1DFVA_6Z6ZA
1DLFL_1MAJA
1DLFL_1MAKA
1DLFL_2CJUL
1DLFL_6AD0L
1DLFL_7MHYN
1DMWA_1KW0A
1DMWA_1LRMA
1DMWA_1MMTA
1DMWA_2PAHA
1DMWA_6HPOA
1DN0B_1DN0B
1DN0B_1DN0D
1DN0B_1QLRB
1DN0B_2J6EH
1DN0B_2J6EI
1DV4G_1FJGG
1DV4G_1RSSA
1DV4G_4V4GAG
1DV4G_4V4Ih
1DV4G_5OT7F
1DVVA_2EXVA
1DVVA_2PACA
1DVVA_3X39B
1DVVA_5XECC
1DVVA_5XEDC
1DYLA_2YEWA
1DYLA_2YEWD
1DYLA_2YEWG
1DYLA_5H23A
1DYLA_6NK7L
1DYTA_1DYTA
1DYTA_1QMTA
1DYTA_2KB5A
1DYTA_2LVZA
1DYTA_4A2OB
1E4SA_1E4SA
1E4SA_1KJ5A
1E4SA_2NLCA
1E4SA_2NLEB
1E4SA_2NLSA
1E5LA_1E5LA
1E5LA_1E5LB
1E5LA_1E5QB
1E5LA_1E5QD
1E5LA_1FF9A
1E79I_1E79I
1E79I_2WSSR
1E79I_5AREI
1E79I_5ARII
1E79I_7AJFAI
1E7WA_1P33C
1E7WA_1W0CG
1E7WA_2XOXA
1E7WA_2XOXB
1E7WA_5L42A
1E8CA_1E8CA
1E8CA_1E8CB
1E8CA_7B53A
1E8CA_7B61A
1E8CA_7B6OA
1EC5A_1JM0A
1EC5A_1NVOA
1EC5A_1NVOB
1EC5A_1U7MA
1EC5A_2KIKB
1ECWA_1ECWA
1ECWA_1ED1A
1ECWA_2K4EA
1ECWA_2K4HA
1ECWA_2K4IA
1ED0A_1ED0A
1ED0A_1JMNA
1ED0A_1JMPA
1ED0A_1ORLA
1ED0A_2V9BA
1EDHA_1FF5A
1EDHA_1Q1PA
1EDHA_3LNEA
1EDHA_3Q2LB
1EDHA_4QD2J
1EJ3A_1SL8A
1EJ3A_1UHJB
1EJ3A_1UHKA
1EJ3A_5ZABA
1EJ3A_7EG3J
1EPSA_1G6SA
1EPSA_1X8TA
1EPSA_7TM4A
1EPSA_7TM5A
1EPSA_7TM5B
1EQFA_6FICT
1EQFA_7K0DA
1EQFA_7K6FA
1EQFA_7L6XA
1EQFA_7T36A
1ES7B_2K3GA
1ES7B_2QJ9D
1ES7B_2QJAD
1ES7B_3NH7B
1ES7B_3QB4B
1EUCA_2FP4A
1EUCA_4XX0A
1EUCA_6WCVA
1EUCA_7JKRA
1EUCA_7JMKA
1EVUA_1F13B
1EVUA_1GGUB
1EVUA_5MHLA
1EVUA_5MHLB
1EVUA_5MHOA
1EWIA_1EWIA
1EWIA_2B29A
1EWIA_4IJLA
1EWIA_7XUTA
1EWIA_7XUWA
1EWKA_1EWVA
1EWKA_1ISRA
1EWKA_1ISSA
1EWKA_3KS9A
1EWKA_3KS9B
1EX6A_1EX6A
1EX6A_1EX7A
1EX6A_1GKYA
1EX6A_4F4JA
1EX6A_4F4JB
1EXIA_1EXIA
1EXIA_3IAOA
1EXIA_3Q1MA
1EXIA_7CKQG
1EXIA_7CKQI
1F3MC_1F3MD
1F3MC_4O0TA
1F3MC_4ZJJD
1F3MC_4ZY4B
1F3MC_6FD3A
1F4TA_1F4TA
1F4TA_1IO9A
1F4TA_5BV5C
1F4TA_5BV5D
1F4TA_7UORB
1F8PA_1F8PA
1F8PA_1RONA
1F8PA_1TZ4A
1F8PA_7VGXL
1F8PA_7YOOL
1F9NA_1F9NA
1F9NA_1F9NB
1F9NA_1F9NC
1F9NA_1F9ND
1F9NA_1F9NF
1F9QA_1PFNA
1F9QA_4HSVB
1F9QA_4HSVD
1F9QA_4R9YA
1F9QA_4RAUI
1F9XA_1F9XA
1F9XA_1G3FA
1F9XA_1TFQA
1F9XA_3CM2C
1F9XA_3G76H
1FAIH_2A6JB
1FAIH_2F19H
1FAIH_4BH8B
1FAIH_5MTHA
1FAIH_6P8DD
1FGSA_1FGSA
1FGSA_1JBVA
1FGSA_1JBWA
1FGSA_2GC5A
1FGSA_2GCAA
1FJGJ_1FJGJ
1FJGJ_4V4GAJ
1FJGJ_4V4Jk
1FJGJ_6GZXJ3
1FJGJ_6GZZJ4
1FJGQ_1IBLQ
1FJGQ_1VY4AQ
1FJGQ_4V4GAQ
1FJGQ_4V4Jr
1FJGQ_6GZXQ3
1FMKA_1KSWA
1FMKA_1Y57A
1FMKA_2PTKA
1FMKA_4K11A
1FMKA_6F3FA
1FNTD_4NNNC
1FNTD_5MPCd
1FNTD_5WVID
1FNTD_6EF2D
1FNTD_7LSXD
1FOHA_1FOHA
1FOHA_1PN0A
1FOHA_1PN0B
1FOHA_1PN0C
1FOHA_1PN0D
1FOSF_1FOSH
1FOSF_5T01B
1FOSF_5VPBD
1FOSF_7UCCJ
1FOSF_7UCDJ
1FS0G_3OAAO
1FS0G_5T4OG
1FS0G_5T4QG
1FS0G_8DBTG
1FS0G_8DBWG
1FU5A_1FU5A
1FU5A_1FU6A
1FU5A_1OO3A
1FU5A_1OO4A
1FU5A_7RNSA
1FX0A_6FKIC
1FX0A_6VM1C
1FX0A_6VMBA
1FX0A_6VMBC
1FX0A_6VOOA
1G0WA_1QH4A
1G0WA_3B6RB
1G0WA_3DRBB
1G0WA_4Q2RA
1G0WA_7TUNA
1G1QA_1G1QA
1G1QA_1G1RA
1G1QA_1G1RB
1G1QA_1G1SA
1G1QA_1G1SB
1G47A_1G47A
1G47A_2KBXB
1G47A_3F6QB
1G47A_3IXEB
1G47A_4HI8B
1G57A_1G57B
1G57A_1IEZA
1G57A_3LQUA
1G57A_7TYEB
1G57A_7TYEC
1G85A_1GT1A
1G85A_1GT4A
1G85A_1GT4B
1G85A_1PBOA
1G85A_2HLVA
1GAGA_1IRKA
1GAGA_2AUHA
1GAGA_2B4SB
1GAGA_3ETAB
1GAGA_4IBMA
1GAV0_1GAV0
1GAV0_1GAV3
1GAV0_1GAVD
1GAV0_1UNAA
1GAV0_1UNAB
1GC5A_4B8RA
1GC5A_4B8SA
1GC5A_5O5XA
1GC5A_5O5YB
1GC5A_5O5ZA
1GEFA_1GEFA
1GEFA_1GEFB
1GEFA_1GEFE
1GEFA_1IPIA
1GEFA_1IPIB
1GGGA_1GGGA
1GGGA_1GGGB
1GGGA_1WDNA
1GGGA_8EYZG
1GGGA_8EYZI
1GL1I_1KIOA
1GL1I_1PMCA
1GL1I_7SGQD
1GL1I_7SLTA
1GL1I_7SLTD
1GMEB_1GMEB
1GMEB_1GMED
1GMEB_2BYUC
1GMEB_2BYUD
1GMEB_2H50A
1GPWA_1GPWC
1GPWA_1THFD
1GPWA_5D30A
1GPWA_6RTZA
1GPWA_7AC8E
1GWIA_1GWIA
1GWIA_1GWIB
1GWIA_6L69A
1GWIA_7CL8A
1GWIA_7CL9A
1GXPA_1GXPA
1GXPA_1GXPE
1GXPA_1GXQA
1GXPA_1QQIA
1GXPA_2Z33A
1GZKA_1GZKA
1GZKA_1MRYA
1GZKA_3E8DB
1GZKA_3OW4A
1GZKA_6CCYA
1H1DA_3NWEA
1H1DA_4PYKA
1H1DA_5FHRA
1H1DA_5FHRB
1H1DA_5P8WB
1H2CA_1H2DA
1H2CA_1H2DB
1H2CA_4LDDA
1H2CA_4LDDC
1H2CA_7K5LA
1H6TA_2WQUA
1H6TA_2WQXA
1H6TA_4AW4B
1H6TA_4AW4C
1H6TA_4CILA
1HL6B_1HL6B
1HL6B_1OO0A
1HL6B_2HYIA
1HL6B_2X1GB
1HL6B_7ZNJL
1HQMA_1HQMA
1HQMA_2GHOB
1HQMA_2O5JB
1HQMA_4G7HA
1HQMA_6ASGB
1HQME_1L9UE
1HQME_4G7HE
1HQME_8HSGK
1HQME_8HSLK
1HQME_8HSRK
1HURA_1HURA
1HURA_1R8QB
1HURA_1U81A
1HURA_8D4EZ
1HURA_8D9S3
1HX5A_1HX5E
1HX5A_1HX5F
1HX5A_1P3HA
1HX5A_1P3HB
1HX5A_1P3HN
1HZ5A_1HZ5A
1HZ5A_1HZ6B
1HZ5A_1K50D
1HZ5A_1K51A
1HZ5A_2PTLA
1I2DA_1I2DB
1I2DA_1I2DC
1I2DA_1M8PA
1I2DA_1M8PB
1I2DA_1M8PC
1I42A_1I42A
1I42A_1JRUA
1I42A_1S3SH
1I42A_7R7SJ
1I42A_7R7TI
1I7PA_1I7PA
1I7PA_1NDHA
1I7PA_1QX4B
1I7PA_1UMKA
1I7PA_7TNVA
1I84U_1M8QB
1I84U_2W4AB
1I84U_5H53B
1I84U_7NEPR
1I84U_7QIOP
1I9WA_1RERA
1I9WA_2ALAA
1I9WA_8D87B
1I9WA_8IHPH
1I9WA_8IHPK
1IARB_1IARB
1IARB_3BPNB
1IARB_5E4EB
1IARB_6OELB
1IARB_6WGLC
1IAZA_1KD6A
1IAZA_3VWIA
1IAZA_4TSQE
1IAZA_4TSYB
1IAZA_4TSYD
1IB2A_1M8WB
1IB2A_3BSBB
1IB2A_3BSXA
1IB2A_3GVOA
1IB2A_3Q0RA
1ID0A_1ID0A
1ID0A_3CGYA
1ID0A_3CGZA
1ID0A_3CGZB
1ID0A_3CGZC
1II6A_2WOGA
1II6A_3WPNA
1II6A_4BXNA
1II6A_4ZHIA
1II6A_6TA4K
1IJ9A_1IJ9A
1IJ9A_1VCAA
1IJ9A_1VCAB
1IJ9A_1VSCA
1IJ9A_1VSCB
1IKNA_1IKNA
1IKNA_1LE9A
1IKNA_1NFIA
1IKNA_2RAMB
1IKNA_5U01D
1IKUA_1IKUA
1IKUA_1JSAA
1IKUA_1LA3A
1IKUA_2I94A
1IKUA_4MLWA
1IVOC_1IVOC
1IVOC_1JL9B
1IVOC_1NQLB
1IVOC_1P9JA
1IVOC_2KV4A
1IWOA_1IWOA
1IWOA_2C9MA
1IWOA_2C9MB
1IWOA_3FGOA
1IWOA_6LLEA
1IZNA_2KXPA
1IZNA_3AA0A
1IZNA_6F38K
1IZNA_7PDZF
1IZNA_7T5QI
1J1DC_1J1EF
1J1DC_4Y99C
1J1DC_6KLTC
1J1DC_6KLUC
1J1DC_6KN8U
1J2FA_1J2FA
1J2FA_3A77D
1J2FA_5JEJA
1J2FA_5JEMB
1J2FA_5JEME
1J5AM_2ZJPY
1J5AM_4V4GB2
1J5AM_4V4PA5
1J5AM_4V4RB5
1J5AM_7A18Z
1JC0A_3ED8C
1JC0A_4W6PH
1JC0A_4W73A
1JC0A_5NI3A
1JC0A_5WWKD
1JDNA_1JDNA
1JDNA_1JDPB
1JDNA_1YK0A
1JDNA_1YK1A
1JDNA_1YK1B
1JGNA_1JGNA
1JGNA_1JH4A
1JGNA_2RQGB
1JGNA_3KUJA
1JGNA_4IVEA
1JK9B_1JK9B
1JK9B_1JK9D
1JK9B_1QUPA
1JK9B_5U9MB
1JK9B_5U9MD
1JM4B_1JM4B
1JM4B_1N72A
1JM4B_1ZS5A
1JM4B_2RNWA
1JM4B_5FE0A
1JMUB_1JMUD
1JMUB_6XF8K
1JMUB_6ZTZK
1JMUB_6ZTZM
1JMUB_7ELLb
1JN0A_1NBOB
1JN0A_7Q54A
1JN0A_7Q54C
1JN0A_7Q54R
1JN0A_7Q56R
1JQJA_1JQJA
1JQJA_1JQJB
1JQJA_1MMIA
1JQJA_2XURA
1JQJA_5M1SC
1JS9B_1JS9C
1JS9B_1YC63
1JS9B_1YC6N
1JS9B_3J7LB
1JS9B_7PE1AA
1K36A_1K36A
1K36A_1K37A
1K36A_5WB7E
1K36A_7LFSG
1K36A_8HGPC
1K3WA_1K3WA
1K3WA_1Q39A
1K3WA_1Q3BA
1K3WA_2OPFA
1K3WA_2OQ4A
1K9AA_1K9AA
1K9AA_1K9AB
1K9AA_1K9AC
1K9AA_1K9AD
1K9AA_1K9AF
1K9UA_1K9UA
1K9UA_1K9UB
1K9UA_2LVIA
1K9UA_2LVJA
1K9UA_2LVKA
1KELH_1KELH
1KELH_1KEMH
1KELH_4HLZI
1KELH_4HLZK
1KELH_5C0RH
1KO2A_2YZ3B
1KO2A_4BZ3A
1KO2A_6BM9A
1KO2A_6BM9B
1KO2A_6BM9C
1KT0A_1KT0A
1KT0A_1KT1A
1KT0A_5NJXA
1KT0A_5OMPA
1KT0A_7L7IC
1KTZB_1PLOA
1KTZB_2PJYB
1KTZB_3KFDE
1KTZB_3KFDG
1KTZB_4XJJA
1KY9B_1KY9B
1KY9B_3CS0A
1KY9B_3MH4B
1KY9B_6JJKC
1KY9B_6JJLD
1KYWA_1KYWA
1KYWA_1KYWC
1KYWA_1KYZE
1KYWA_6I71A
1KYWA_6I73B
1L4UA_1ZYUA
1L4UA_2DFTB
1L4UA_2IYSA
1L4UA_2IYUA
1L4UA_2IYVA
1L7IH_4LLUA
1L7IH_6DEZH
1L7IH_6WW2H
1L7IH_7CR5H
1L7IH_8HIJH
1L8LA_1L8LB
1L8LA_1L8OA
1L8LA_6HYJA
1L8LA_6HYJB
1L8LA_6HYYB
1L8QA_1L8QA
1L8QA_2HCBB
1L8QA_2HCBC
1L8QA_2HCBD
1L8QA_3R8FA
1LAFE_1LAFE
1LAFE_2LAOA
1LAFE_6ML0A
1LAFE_6MLDA
1LAFE_6MLPE
1LDJA_1LDJA
1LDJA_1U6GA
1LDJA_6TTUC
1LDJA_7B5LC
1LDJA_7Z8TC
1LILA_1LILB
1LILA_4QF1L
1LILA_5T33L
1LILA_6CH7R
1LILA_7KHFB
1LM8C_2JZ3C
1LM8C_2MA9C
1LM8C_3ZTDH
1LM8C_4JGHC
1LM8C_6R7NQ
1LQ8A_1LQ8E
1LQ8A_1LQ8G
1LQ8A_2HI9A
1LQ8A_2OL2A
1LQ8A_3B9FI
1LREA_1LREA
1LREA_1NREA
1LREA_1OP1A
1LREA_1OV2A
1LREA_2FYLA
1LUPA_1LUPA
1LUPA_2N9TA
1LUPA_5O0UA
1LUPA_5TCZA
1LUPA_7A64A
1M3IA_1M3IA
1M3IA_1M3IC
1M3IA_1M3ID
1M3IA_1M3JA
1M3IA_1PFOA
1M5Q1_1M5Q1
1M5Q1_1M5QB
1M5Q1_1M5QC
1M5Q1_1M5QH
1M5Q1_1M5QM
1MF2H_1OTSC
1MF2H_2EZ0E
1MF2H_4FG6E
1MF2H_7U62H
1MF2H_7U62I
1MFGA_1N7TA
1MFGA_2H3LA
1MFGA_2H3LB
1MFGA_6Q0MB
1MFGA_6Q0NA
1MJGM_1OAOD
1MJGM_3I01M
1MJGM_3I01O
1MJGM_3I01P
1MJGM_6X5KN
1MKIA_1MKIA
1MKIA_1MKIB
1MKIA_2OSUA
1MKIA_2OSUB
1MKIA_3AGFA
1MPZA_1MPZA
1MPZA_2W9OA
1MPZA_2W9UA
1MPZA_2W9VA
1MPZA_2W9WA
1MR3F_2K5UA
1MR3F_3TJZA
1MR3F_5NZRM
1MR3F_6PTAC
1MR3F_7UROB
1MUMA_1MUMA
1MUMA_1OQFA
1MUMA_1XG3A
1MUMA_1XG3B
1MUMA_1XG3C
1MWKA_1MWKA
1MWKA_2QU4A
1MWKA_2ZHCA
1MWKA_4A62B
1MWKA_5AEYA
1MWRA_1MWRB
1MWRA_3ZG5A
1MWRA_4CJNB
1MWRA_5M18B
1MWRA_6H5OB
1N69A_1N69A
1N69A_1N69B
1N69A_1N69C
1N69A_4V2OB
1N69A_6SLRC
1N88A_1N88A
1N88A_4V4IR
1N88A_4V6ABX
1N88A_6GZXS2
1N88A_7LH5BX
1NCJA_2QVIA
1NCJA_4NUMB
1NCJA_4NUMC
1NCJA_4NUPB
1NCJA_4NUPC
1NEKA_1NEKA
1NEKA_2ACZA
1NEKA_2WP9I
1NEKA_6C12A
1NEKA_6C12B
1NJMT_2ZJPS
1NJMT_4V4GDW
1NJMT_4V4GFW
1NJMT_4V4GHW
1NJMT_4V4RBZ
1NKSA_1NKSA
1NKSA_1NKSB
1NKSA_1NKSC
1NKSA_1NKSE
1NKSA_1NKSF
1NKWE_2ZJPE
1NKWE_3DLLE
1NKWE_4V4GBH
1NKWE_7A0RE
1NKWE_7A0SE
1NKWL_2ZJPK
1NKWL_4V4GBO
1NKWL_5DM7K
1NKWL_5JVHK
1NKWL_7A0RK
1NQEA_1NQGA
1NQEA_1NQHA
1NQEA_1UJWA
1NQEA_2GSKA
1NQEA_2GUFA
1NY6A_1NY6A
1NY6A_1NY6H
1NY6A_3M0EA
1NY6A_4LY6L
1NY6A_4LZZL
1O7BT_1O7BT
1O7BT_1O7CT
1O7BT_2N40A
1O7BT_2PF5B
1O7BT_2PF5C
1OCCH_1OCRH
1OCCH_1OCZH
1OCCH_5J4ZBH
1OCCH_5Z62H
1OCCH_7W3EU
1OG2A_4NZ2B
1OG2A_5A5JA
1OG2A_5W0CA
1OG2A_5XXIA
1OG2A_6VLTC
1ONCA_1ONCA
1ONCA_1PU3A
1ONCA_2KB6A
1ONCA_7OR6B
1ONCA_7ORDB
1OQEK_1OQEM
1OQEK_1OQEQ
1OQEK_1OQER
1OQEK_1OSXA
1OQEK_4V46B0
1OVZA_1OVZA
1OVZA_1OVZB
1OVZA_1OW0C
1OVZA_1OW0D
1OVZA_1UCTA
1P34E_3B6GE
1P34E_5BO0A
1P34E_5C3IJ
1P34E_6ESIE
1P34E_7CRPM
1PB7A_1Y20A
1PB7A_4KCCA
1PB7A_4KFQB
1PB7A_6USUA
1PB7A_6UZWA
1PJRA_1PJRA
1PJRA_1QHGA
1PJRA_2PJRA
1PJRA_2PJRF
1PJRA_3PJRA
1PK2A_1PK2A
1PK2A_1PMLB
1PK2A_1TPKA
1PK2A_1TPKB
1PK2A_1TPKC
1PKQA_4R96A
1PKQA_5I1IL
1PKQA_7SG4L
1PKQA_7WUHI
1PKQA_7WUHL
1PP7U_1PP8F
1PP7U_1PP8M
1PP7U_1PP8O
1PP7U_1PP8P
1PP7U_1PP8V
1PV6A_2CFQA
1PV6A_2V8NA
1PV6A_2Y5YB
1PV6A_5GXBA
1PV6A_6VBGB
1Q05A_4WLSA
1Q05A_4WLSB
1Q05A_6LDIG
1Q05A_7C17G
1Q05A_7C17H
1Q57A_1Q57A
1Q57A_1Q57E
1Q57A_5IKND
1Q57A_6N9VF
1Q57A_6N9WA
1Q5HA_1Q5HB
1Q5HA_3EHWY
1Q5HA_7BPDA
1Q5HA_7BPDB
1Q5HA_7PWJA
1Q8GA_1Q8XA
1Q8GA_1TVJA
1Q8GA_3J0SM
1Q8GA_5HVKB
1Q8GA_7M0GA
1Q8IA_3K59A
1Q8IA_3K5LA
1Q8IA_3K5NA
1Q8IA_3K5NB
1Q8IA_3K5OA
1QBHA_1QBHA
1QBHA_3OZ1B
1QBHA_4KMNA
1QBHA_4LGEB
1QBHA_5M6NA
1QBZA_1QBZC
1QBZA_1QCEA
1QBZA_2EZSA
1QBZA_7T4GB
1QBZA_8DVDA
1QCQA_1UR6A
1QCQA_1W4UA
1QCQA_2FUHA
1QCQA_3L1YA
1QCQA_5FERE
1QK9A_1QK9A
1QK9A_6C1YA
1QK9A_6OGJA
1QK9A_6OGJB
1QK9A_6OGKA
1QMVA_1QMVF
1QMVA_5IJTB
1QMVA_5IJTE
1QMVA_5IJTF
1QMVA_7KIZA
1QMYA_1QOLA
1QMYA_1QOLG
1QMYA_1QOLH
1QMYA_2JQFR
1QMYA_2JQGR
1QOHA_1R4PB
1QOHA_6FE4A
1QOHA_7U6VC
1QOHA_7U6VD
1QOHA_7U6VF
1QXHA_1QXHA
1QXHA_3HVSA
1QXHA_3HVVA
1QXHA_3HVXB
1QXHA_4AF2A
1QZZA_1QZZA
1QZZA_1R00A
1QZZA_1XDSA
1QZZA_1XDSB
1QZZA_1XDUA
1R1RA_1R1RA
1R1RA_1RLRA
1R1RA_4ERMB
1R1RA_6W4XA
1R1RA_6W4XB
1R59O_1R59X
1R59O_1XUPO
1R59O_1XUPX
1R59O_3FLCO
1R59O_3H3OB
1R5ZA_1R5ZA
1R5ZA_1V9MA
1R5ZA_3J0JM
1R5ZA_5GASM
1R5ZA_6QUMM
1R7RA_5IFWB
1R7RA_7LMZB
1R7RA_7LN6F
1R7RA_7R7TD
1R7RA_7WUBG
1RBAA_1RBAB
1RBAA_2RUSA
1RBAA_7VWXa
1RBAA_9RUBA
1RBAA_9RUBB
1REPC_2Z9OA
1REPC_2Z9OB
1REPC_7RVAC
1REPC_7SGCC
1REPC_7UXYC
1S50A_2PBWB
1S50A_2ZNSA
1S50A_3FUZB
1S50A_3S2VB
1S50A_4QF9C
1S5LH_1S5Lh
1S5LH_1S5LH
1S5LH_3KZIH
1S5LH_7NHOH
1S5LH_7YQ2h
1SA0E_3DU7E
1SA0E_3E22E
1SA0E_4I4TE
1SA0E_7DP8E
1SA0E_8ASNE
1SAWA_6FOGA
1SAWA_6FOGE
1SAWA_6FOHB
1SAWA_6SBIA
1SAWA_6SBJA
1SEIA_1SEIA
1SEIA_1SEIB
1SEIA_5NJTH
1SEIA_6O8Wh
1SEIA_6O8Yh
1SHYB_1SHYB
1SHYB_4K3JB
1SHYB_4O3TB
1SHYB_6WVZM
1SHYB_7MOBC
1SK3A_1SK3A
1SK3A_1SK4A
1SK3A_1TWQA
1SK3A_2APHA
1SK3A_2APHB
1SLQA_2B4HA
1SLQA_2B4IB
1SLQA_2B4IC
1SLQA_6WXG1
1SLQA_7UMT3
1STFI_1STFI
1STFI_2OCTA
1STFI_2OCTB
1STFI_4N6V0
1STFI_4N6V4
1SW1A_1SW1A
1SW1A_1SW2A
1SW1A_1SW5A
1SW1A_1SW5B
1SW1A_3MAMA
1SZQA_1SZQA
1SZQA_1SZQB
1SZQA_5MVIA
1SZQA_5MVIB
1SZQA_5MVID
1T09A_1T09A
1T09A_1T0LD
1T09A_4KZOB
1T09A_6BKZA
1T09A_8HB9D
1T94A_1T94A
1T94A_1T94B
1T94A_3IN5A
1T94A_4U6PA
1T94A_7NV0A
1TDHA_6LWGD
1TDHA_6LWIG
1TDHA_6LWLD
1TDHA_6LWQD
1TDHA_6LWQG
1TJLA_1TJLA
1TJLA_1TJLD
1TJLA_5W1TM
1TJLA_5W1TN
1TJLA_7KHIM
1TKLA_1TKLB
1TKLA_1TLBA
1TKLA_1TLBW
1TKLA_1TXNA
1TKLA_1TXNB
1TRJA_3J6XRA
1TRJA_3RFHD
1TRJA_4V6IAa
1TRJA_6XIQAV
1TRJA_7OSMRACK
1TYFA_1YG6C
1TYFA_1YG6D
1TYFA_1YG6I
1TYFA_2FZSB
1TYFA_7UIVM
1U4HA_1U4HA
1U4HA_1XBNA
1U4HA_3NVUB
1U4HA_5JRUD
1U4HA_5JRUF
1U7LA_3J9TO
1U7LA_7FDCO
1U7LA_7FDEO
1U7LA_7TMMO
1U7LA_7TMSO
1U8YA_1UADA
1U8YA_2KE5A
1U8YA_2KWIA
1U8YA_5CM8B
1U8YA_6P0JB
1UDVA_1UDVA
1UDVA_1UDVB
1UDVA_2A2YA
1UDVA_2A2YB
1UDVA_2BKYY
1UFKA_1UFKA
1UFKA_2NXCA
1UFKA_2NXEA
1UFKA_2NXNA
1UFKA_3CJTA
1URKA_1URKA
1URKA_2I9AA
1URKA_2I9AD
1URKA_2I9BC
1URKA_4K24A
1V1HA_1V1HA
1V1HA_1V1HC
1V1HA_1V1IA
1V1HA_1V1IB
1V1HA_1V1IC
1V9U5_1V9U5
1V9U5_3DPRE
1V9U5_8IHPM
1V9U5_8IHPN
1V9U5_8IHPO
1VFGA_1VFGB
1VFGA_4WBYA
1VFGA_4WC2A
1VFGA_4WC3A
1VFGA_4X0BA
1VK1A_5X0BA
1VK1A_5X0EA
1VK1A_5X0FA
1VK1A_5X0GA
1VK1A_5X0JA
1VK6A_1VK6A
1VK6A_5IW4A
1VK6A_5IW5B
1VK6A_7E44A
1VK6A_7E44B
1VVJR0_1VY4B0
1VVJR0_4V4IU
1VVJR0_4V4JU
1VVJR0_4V4XBZ
1VVJR0_5ZLUt
1VVJR4_1VY4B4
1VVJR4_4V4ZB3
1VVJR4_4V5FB4
1VVJR4_4V90B4
1VVJR4_5ZLUw
1VVJRU_1VVJRU
1VVJRU_4V4IO
1VVJRU_4V4ZBT
1VVJRU_5IMRm
1VVJRU_6GZXP1
1W26A_1W26A
1W26A_5OWJB
1W26A_6D6SA
1W26A_6D6SB
1W26A_7D805
1W33A_1W33B
1W33A_1W3ZB
1W33A_4BL4D
1W33A_5A2UA
1W33A_5A2UB
1W63A_1W63C
1W63A_4HMYA
1W63A_4P6ZG
1W63A_6CM9G
1W63A_8D9VE
1W63B_2JKTE
1W63B_2XA7B
1W63B_6CM9B
1W63B_6YAFB
1W63B_7OHOB
1WAO1_1WAO2
1WAO1_4JA7A
1WAO1_7ZR5P
1WAO1_7ZR6P
1WAO1_8GAEE
1WCWA_1WCWA
1WCWA_1WD7B
1WCWA_3D8RA
1WCWA_3D8TA
1WCWA_3D8TB
1WT5A_6UM5C
1WT5A_7L56F
1WT5A_7L57H
1WT5A_7ND7F
1WT5A_7Q9KD
1WZ3A_1WZ3A
1WZ3A_1WZ3B
1WZ3A_7EU4I
1WZ3A_7EU4K
1WZ3A_7EU4N
1X4JA_1X4JA
1X4JA_4V3KC
1X4JA_4V3KF
1X4JA_4V3LC
1X4JA_7OJXA
1X8GA_1X8GA
1X8GA_3F9OA
1X8GA_3IOGA
1X8GA_3SW3A
1X8GA_3T9MA
1X9PA_2C6SA
1X9PA_2C9GA
1X9PA_4CWUN
1X9PA_6B1TM
1X9PA_6CGVN
1XD4A_1XD4A
1XD4A_1XD4B
1XD4A_1XDVA
1XD4A_1XDVB
1XD4A_3KSYA
1XHXA_1XHXA
1XHXA_1XHXB
1XHXA_2PY5A
1XHXA_2PYJB
1XHXA_2PYLA
1XJ5A_1XJ5B
1XJ5A_6O63A
1XJ5A_6O64H
1XJ5A_6O65F
1XJ5A_6O65G
1XLSE_1XLSE
1XLSE_1XLSG
1XLSE_1XLSH
1XLSE_1XNXA
1XLSE_1XNXB
1XWRA_1XWRA
1XWRA_1XWRB
1XWRA_1ZS4B
1XWRA_1ZS4D
1XWRA_8IGRA
1XX9A_1ZJDA
1XX9A_5EXNA
1XX9A_6AODC
1XX9A_6HHCA
1XX9A_6R8XA
1Y0LB_3CFJB
1Y0LB_6ELJA
1Y0LB_6S5AH
1Y0LB_7V8QD
1Y0LB_7VACB
1Y5OA_1Y5OA
1Y5OA_2K2UA
1Y5OA_2LOXA
1Y5OA_2MKRA
1Y5OA_2N23A
1Y7MA_1Y7MB
1Y7MA_2MTZA
1Y7MA_3ZQDA
1Y7MA_4A1IA
1Y7MA_4A52A
1Y8GA_2HAKA
1Y8GA_2HAKB
1Y8GA_2WZJD
1Y8GA_3IECD
1Y8GA_5EAKA
1Y9RA_1Y9RA
1Y9RA_4UDAA
1Y9RA_6GGGA
1Y9RA_6L88B
1Y9RA_6L88C
1YDEA_1YDEB
1YDEA_5ICSA
1YDEA_5ICSC
1YDEA_5ICSF
1YDEA_6GTUA
1YMYA_1YMYA
1YMYA_1YRRA
1YMYA_2P50A
1YMYA_2P53A
1YMYA_2P53B
1YNMA_1YNMA
1YNMA_2FKCA
1YNMA_2FKCB
1YNMA_2FKHB
1YNMA_2FLCA
1YQTA_1YQTA
1YQTA_3BK7A
1YQTA_3J15B
1YQTA_5LW7B
1YQTA_5YV5A
1YZTA_1YZUB
1YZTA_1Z08B
1YZTA_1Z08C
1YZTA_1Z0IA
1YZTA_2OT3B
1Z2CB_1Z2CB
1Z2CB_2BNXA
1Z2CB_2BNXB
1Z2CB_3EG5B
1Z2CB_3O4XD
1Z3KA_1Z3KA
1Z3KA_2CI8A
1Z3KA_2CI9A
1Z3KA_2CI9B
1Z3KA_2CIAA
1Z3RA_1Z3RA
1Z3RA_1Z66A
1Z3RA_2GG1A
1Z3RA_7LSFE
1Z3RA_7LSGC
1ZTPA_1ZTPA
1ZTPA_1ZTPB
1ZTPA_2Q4KA
1ZTPA_2Q4KB
1ZTPA_2Q4KC
1ZW0A_1ZW0A
1ZW0A_1ZW0B
1ZW0A_1ZW0C
1ZW0A_1ZW0D
1ZW0A_1ZW0E
2A4EA_2A4EA
2A4EA_3LNDD
2A4EA_6CGBA
2A4EA_6CGUA
2A4EA_6CGUB
2A50A_2A52A
2A50A_2A52C
2A50A_2A53A
2A50A_2A56A
2A50A_3CFAM
2A65A_3TT3A
2A65A_4MM4B
2A65A_4MMDA
2A65A_6XWMA
2A65A_7LQJA
2A74C_2A74C
2A74C_2QKIC
2A74C_3OHXF
2A74C_3T4AC
2A74C_3T4AF
2AE0X_2GAEA
2AE0X_2PI8B
2AE0X_2PI8D
2AE0X_2PICA
2AE0X_2PJJA
2AHXA_3U7UB
2AHXA_3U7UC
2AHXA_3U7UF
2AHXA_3U9UE
2AHXA_3U9UF
2AJFE_2GHVC
2AJFE_2GHVE
2AJFE_2GHWA
2AJFE_7RKSR
2AJFE_7U0NF
2AQLA_2N1DB
2AQLA_6AGOC
2AQLA_6AGOD
2AQLA_7S4AC
2AQLA_8BPAD
2AVAA_2AVAA
2AVAA_2FVAA
2AVAA_2FVEA
2AVAA_2FVFA
2AVAA_2XZ0D
2AW6A_2AXUG
2AW6A_2AXUL
2AW6A_2AXZA
2AW6A_2AXZD
2AW6A_2GRMB
2B0JA_2B0JA
2B0JA_3DAFA
2B0JA_3F46A
2B0JA_3F47A
2B0JA_3H65A
2B69A_2B69A
2B69A_4GLLA
2B69A_4GLLB
2B69A_4LK3D
2B69A_4M55D
2BJ1A_2BJ3C
2BJ1A_2BJ7A
2BJ1A_2BJ7B
2BJ1A_2BJ8A
2BJ1A_2BJ9B
2BKUB_2BKUB
2BKUB_3EA5B
2BKUB_3EA5D
2BKUB_3ND2A
2BKUB_5OWUA
2BNDA_2BNDB
2BNDA_2BNEB
2BNDA_2BNFA
2BNDA_2V4YA
2BNDA_2V4YB
2BVJA_2BVJB
1717
+ 2BVJA_2VZMA
1718
+ 2BVJA_2WHWA
1719
+ 2BVJA_3ZPIA
1720
+ 2BVJA_4B7SA
1721
+ 2BX2L_2C0BL
1722
+ 2BX2L_2VMKC
1723
+ 2BX2L_2VMKD
1724
+ 2BX2L_2VRTD
1725
+ 2BX2L_6G63A
1726
+ 2BX9A_2BX9A
1727
+ 2BX9A_2BX9H
1728
+ 2BX9A_2BX9I
1729
+ 2BX9A_2KO8A
1730
+ 2BX9A_2ZP8E
1731
+ 2C9OA_2C9OB
1732
+ 2C9OA_6HTSE
1733
+ 2C9OA_6IGMC
1734
+ 2C9OA_7OLEA
1735
+ 2C9OA_7OLEC
1736
+ 2CAZB_2CAZB
1737
+ 2CAZB_2F66B
1738
+ 2CAZB_2F66E
1739
+ 2CAZB_2F6MB
1740
+ 2CAZB_2P22B
1741
+ 2CCQA_2CCQA
1742
+ 2CCQA_2CM0A
1743
+ 2CCQA_2D5UA
1744
+ 2CCQA_2HPJA
1745
+ 2CCQA_2HPLA
1746
+ 2CGHA_3L1AA
1747
+ 2CGHA_3L1AB
1748
+ 2CGHA_4OP0A
1749
+ 2CGHA_4XTXA
1750
+ 2CGHA_4XU1B
1751
+ 2CMRA_2CMRA
1752
+ 2CMRA_2XRAA
1753
+ 2CMRA_3MA9A
1754
+ 2CMRA_3O3XA
1755
+ 2CMRA_3O43A
1756
+ 2CSBA_2CSBA
1757
+ 2CSBA_2CSBB
1758
+ 2CSBA_2CSDA
1759
+ 2CSBA_2CSDB
1760
+ 2CSBA_4GFJA
1761
+ 2CT9A_2CT9A
1762
+ 2CT9A_2CT9B
1763
+ 2CT9A_2E30A
1764
+ 2CT9A_7DSVC
1765
+ 2CT9A_7X2UC
1766
+ 2D00A_2D00D
1767
+ 2D00A_2D00E
1768
+ 2D00A_3A5CH
1769
+ 2D00A_5GASL
1770
+ 2D00A_7VAMH
1771
+ 2D3AA_2D3AA
1772
+ 2D3AA_4IS4C
1773
+ 2D3AA_7V4LA
1774
+ 2D3AA_7V4LB
1775
+ 2D3AA_7V4LC
1776
+ 2D9SA_2D9SA
1777
+ 2D9SA_2D9SB
1778
+ 2D9SA_2JUJA
1779
+ 2D9SA_2OO9A
1780
+ 2D9SA_2OO9B
1781
+ 2DW4A_2IW5A
1782
+ 2DW4A_6KGQA
1783
+ 2DW4A_6NQMA
1784
+ 2DW4A_6VYPk
1785
+ 2DW4A_6VYPK
1786
+ 2EAYA_2EAYA
1787
+ 2EAYA_3EFRA
1788
+ 2EAYA_3EFRB
1789
+ 2EAYA_3EFSA
1790
+ 2EAYA_3FJPA
1791
+ 2EB7A_2EB7A
1792
+ 2EB7A_2YR2B
1793
+ 2EB7A_3GF2A
1794
+ 2EB7A_3GFIA
1795
+ 2EB7A_3GFIC
1796
+ 2EFCB_2EFCB
1797
+ 2EFCB_2EFDB
1798
+ 2EFCB_2EFDD
1799
+ 2EFCB_2EFEB
1800
+ 2EFCB_4G01B
1801
+ 2ETNA_2ETNB
1802
+ 2ETNA_2EULD
1803
+ 2ETNA_2F23A
1804
+ 2ETNA_3AOIX
1805
+ 2ETNA_4WQTY
1806
+ 2FE8A_5TL7B
1807
+ 2FE8A_6WUUB
1808
+ 2FE8A_6WUUD
1809
+ 2FE8A_7RZCA
1810
+ 2FE8A_7SKRA
1811
+ 2FSFA_2FSHB
1812
+ 2FSFA_2FSIB
1813
+ 2FSFA_2VDAA
1814
+ 2FSFA_6GOXA
1815
+ 2FSFA_6S0Kh
1816
+ 2FTCP_3J7Y1
1817
+ 2FTCP_4CE46
1818
+ 2FTCP_4V196
1819
+ 2FTCP_5AJ4B6
1820
+ 2FTCP_7QH61
1821
+ 2FUG1_2FUGA
1822
+ 2FUG1_2YBB1
1823
+ 2FUG1_6ZIY1
1824
+ 2FUG1_6ZJN1
1825
+ 2FUG1_6ZJY1
1826
+ 2G76A_2G76A
1827
+ 2G76A_5N6CB
1828
+ 2G76A_6PLGC
1829
+ 2G76A_6PLGG
1830
+ 2G76A_6RIHA
1831
+ 2GCFA_2GCFA
1832
+ 2GCFA_2XMWA
1833
+ 2GCFA_4A48A
1834
+ 2GCFA_4A48B
1835
+ 2GCFA_4A4JA
1836
+ 2GICA_2GICA
1837
+ 2GICA_3PTXE
1838
+ 2GICA_6BJYB
1839
+ 2GICA_6BJYC
1840
+ 2GICA_7UMKA
1841
+ 2GTTA_2GTTM
1842
+ 2GTTA_2GTTN
1843
+ 2GTTA_8B8VA
1844
+ 2GTTA_8FFRA
1845
+ 2GTTA_8FWLA
1846
+ 2H01A_2H01A
1847
+ 2H01A_2H66B
1848
+ 2H01A_2I81A
1849
+ 2H01A_4L0UJ
1850
+ 2H01A_4L0WA
1851
+ 2H0BA_2H0BA
1852
+ 2H0BA_2H0BB
1853
+ 2H0BA_2H0BC
1854
+ 2H0BA_6PNPA
1855
+ 2H0BA_6PNQA
1856
+ 2H3HA_2H3HB
1857
+ 2H3HA_2QVCB
1858
+ 2H3HA_2QVCD
1859
+ 2H3HA_3C6QA
1860
+ 2H3HA_3C6QC
1861
+ 2HDAA_4LE9A
1862
+ 2HDAA_4OMMA
1863
+ 2HDAA_4RTUA
1864
+ 2HDAA_4RTXD
1865
+ 2HDAA_7PVXA
1866
+ 2HEPA_2HEPA
1867
+ 2HEPA_2JVDA
1868
+ 2HEPA_3BHPA
1869
+ 2HEPA_3BHPB
1870
+ 2HEPA_3BHPC
1871
+ 2HIGA_2HIGA
1872
+ 2HIGA_6QU3A
1873
+ 2HIGA_6QU3B
1874
+ 2HIGA_6QU5C
1875
+ 2HIGA_6SY7H
1876
+ 2I1AA_2I1AA
1877
+ 2I1AA_2I1AB
1878
+ 2I1AA_2I1AC
1879
+ 2I1AA_4Z2ZA
1880
+ 2I1AA_4Z2ZB
1881
+ 2I4IA_2I4IA
1882
+ 2I4IA_4PXAA
1883
+ 2I4IA_5E7IA
1884
+ 2I4IA_6O5FB
1885
+ 2I4IA_7LIUB
1886
+ 2I80A_2I80A
1887
+ 2I80A_2I87A
1888
+ 2I80A_2I8CB
1889
+ 2I80A_3N8DB
1890
+ 2I80A_7U9KB
1891
+ 2IJZA_2IJZA
1892
+ 2IJZA_2IJZJ
1893
+ 2IJZA_3WT4A
1894
+ 2IJZA_3WT4B
1895
+ 2IJZA_4OIWB
1896
+ 2IOPA_2IOPA
1897
+ 2IOPA_2IOPC
1898
+ 2IOPA_2IOPD
1899
+ 2IOPA_2IOQA
1900
+ 2IOPA_2IOQB
1901
+ 2IS1A_2IS1A
1902
+ 2IS1A_2IS2A
1903
+ 2IS1A_2IS4B
1904
+ 2IS1A_2IS6B
1905
+ 2IS1A_3LFUA
1906
+ 2IVSA_2X2KA
1907
+ 2IVSA_5AMNA
1908
+ 2IVSA_6FEKA
1909
+ 2IVSA_6VHGA
1910
+ 2IVSA_7RUNA
1911
+ 2IYEA_2YJ4A
1912
+ 2IYEA_2YJ4B
1913
+ 2IYEA_2YJ5A
1914
+ 2IYEA_2YJ5B
1915
+ 2IYEA_2YJ6B
1916
+ 2J0JA_2J0JA
1917
+ 2J0JA_2J0KA
1918
+ 2J0JA_2J0KB
1919
+ 2J0JA_6TY3A
1920
+ 2J0JA_6TY4A
1921
+ 2J284_2J284
1922
+ 2J284_4V5BC4
1923
+ 2J284_4V66BX
1924
+ 2J284_4V69B4
1925
+ 2J284_4V72B4
1926
+ 2J28D_2J28D
1927
+ 2J28D_4V4VBB
1928
+ 2J28D_4V4WBB
1929
+ 2J28D_4V66BY
1930
+ 2J28D_4V75BD
1931
+ 2J28E_2J28E
1932
+ 2J28E_4V4VBC
1933
+ 2J28E_4V4WBC
1934
+ 2J28E_4V66B1
1935
+ 2J28E_4V77BE
1936
+ 2J28P_2J28P
1937
+ 2J28P_4V4WBN
1938
+ 2J28P_4V65BI
1939
+ 2J28P_4V6LBQ
1940
+ 2J28P_4V6PBR
1941
+ 2J28U_2J28U
1942
+ 2J28U_4V4VBS
1943
+ 2J28U_4V4WBS
1944
+ 2J28U_4V72BU
1945
+ 2J28U_4V75BU
1946
+ 2J28Y_2J28Y
1947
+ 2J28Y_4V5BAY
1948
+ 2J28Y_4V66BR
1949
+ 2J28Y_4V70BZ
1950
+ 2J28Y_4V75BZ
1951
+ 2J4JA_2J4JA
1952
+ 2J4JA_2J4KA
1953
+ 2J4JA_2J4KD
1954
+ 2J4JA_2J4LA
1955
+ 2J4JA_2J4LC
1956
+ 2J6LA_4X0TA
1957
+ 2J6LA_4X0UA
1958
+ 2J6LA_4X0UC
1959
+ 2J6LA_4ZULG
1960
+ 2J6LA_6O4EH
1961
+ 2JBXA_2JBXA
1962
+ 2JBXA_2JBXB
1963
+ 2JBXA_2JBYA
1964
+ 2JBXA_2O42A
1965
+ 2JBXA_2O42B
1966
+ 2JPHA_2JPHA
1967
+ 2JPHA_2R2OA
1968
+ 2JPHA_2R2OB
1969
+ 2JPHA_2REXA
1970
+ 2JPHA_2REXC
1971
+ 2JRCA_2JRCA
1972
+ 2JRCA_2LGJA
1973
+ 2JRCA_2NAFA
1974
+ 2JRCA_3KK0A
1975
+ 2JRCA_3TCNB
1976
+ 2JSFA_2JSFA
1977
+ 2JSFA_2R29A
1978
+ 2JSFA_3UZVA
1979
+ 2JSFA_6FLAG
1980
+ 2JSFA_6FLBG
1981
+ 2JTCA_2JTCA
1982
+ 2JTCA_4D8BA
1983
+ 2JTCA_4D8EA
1984
+ 2JTCA_6UKDA
1985
+ 2JTCA_6UQDB
1986
+ 2K1BA_2K1BA
1987
+ 2K1BA_4X3KB
1988
+ 2K1BA_4X3SB
1989
+ 2K1BA_4X3TA
1990
+ 2K1BA_4X3TD
1991
+ 2K1WA_2K1WA
1992
+ 2K1WA_2K1XA
1993
+ 2K1WA_3HZ2A
1994
+ 2K1WA_5HT9A
1995
+ 2K1WA_5HT9B
1996
+ 2K28A_2K28A
1997
+ 2K28A_3I8ZA
1998
+ 2K28A_4X3TE
1999
+ 2K28A_5EPLA
2000
+ 2K28A_5EPLB
2001
+ 2K2JA_2K2JA
2002
+ 2K2JA_2W2WE
2003
+ 2K2JA_2W2WF
2004
+ 2K2JA_2W2WL
2005
+ 2K2JA_2W2XC
2006
+ 2KB8A_2KJ7A
2007
+ 2KB8A_5MGQA
2008
+ 2KB8A_7M62A
2009
+ 2KB8A_7M65C
2010
+ 2KB8A_7YL7A
2011
+ 2KCNA_2KCNA
2012
+ 2KCNA_2NB0A
2013
+ 2KCNA_2NBFA
2014
+ 2KCNA_6HAJB
2015
+ 2KCNA_7PGDA
2016
+ 2KHOA_2KHOA
2017
+ 2KHOA_4B9QC
2018
+ 2KHOA_4JN4A
2019
+ 2KHOA_7KRWA
2020
+ 2KHOA_7KZIA
2021
+ 2KIIA_2KIIA
2022
+ 2KIIA_2KILA
2023
+ 2KIIA_4U99B
2024
+ 2KIIA_4U9BA
2025
+ 2KIIA_4U9JA
2026
+ 2KIXA_2KIXA
2027
+ 2KIXA_2KIXB
2028
+ 2KIXA_2KIXC
2029
+ 2KIXA_6PVRA
2030
+ 2KIXA_6PVTA
2031
+ 2KJ3A_2KJ3A
2032
+ 2KJ3A_2KJ3B
2033
+ 2KJ3A_2LBUA
2034
+ 2KJ3A_2RNMD
2035
+ 2KJ3A_2RNME
2036
+ 2KPWA_2KPWA
2037
+ 2KPWA_3UMNA
2038
+ 2KPWA_3UMNC
2039
+ 2KPWA_7DTGA
2040
+ 2KPWA_7DTGD
2041
+ 2KT6A_2KT6A
2042
+ 2KT6A_3L48A
2043
+ 2KT6A_3L48B
2044
+ 2KT6A_3L48C
2045
+ 2KT6A_3L48D
2046
+ 2L7EA_2L7EA
2047
+ 2L7EA_3QRLA
2048
+ 2L7EA_5D7EA
2049
+ 2L7EA_6MIPA
2050
+ 2L7EA_7F4AA
2051
+ 2LE2A_2LE2A
2052
+ 2LE2A_2LE2B
2053
+ 2LE2A_3ZOQB
2054
+ 2LE2A_3ZOQC
2055
+ 2LE2A_4L5NC
2056
+ 2LHNA_2LHNA
2057
+ 2LHNA_5L2LA
2058
+ 2LHNA_5L2LB
2059
+ 2LHNA_5L2LE
2060
+ 2LHNA_5L2LF
2061
+ 2LN0A_2LN0A
2062
+ 2LN0A_3V43A
2063
+ 2LN0A_4LJNA
2064
+ 2LN0A_4LKAA
2065
+ 2LN0A_6LSBA
2066
+ 2LR6A_2LR6A
2067
+ 2LR6A_2LR6B
2068
+ 2LR6A_3SOOA
2069
+ 2LR6A_3SOOB
2070
+ 2LR6A_3SOOC
2071
+ 2LS7A_2LS7A
2072
+ 2LS7A_4IZ5H
2073
+ 2LS7A_4IZAB
2074
+ 2LS7A_6P6BA
2075
+ 2LS7A_6P6CA
2076
+ 2LVWA_2LVWA
2077
+ 2LVWA_2LVWB
2078
+ 2LVWA_5YPPA
2079
+ 2LVWA_5YUMA
2080
+ 2LVWA_6LPIF
2081
+ 2LW5A_2LW5A
2082
+ 2LW5A_5UZ9I
2083
+ 2LW5A_5XLOM
2084
+ 2LW5A_5XLPM
2085
+ 2LW5A_6B46I
2086
+ 2LXNA_2LXNA
2087
+ 2LXNA_7D95A
2088
+ 2LXNA_7D97B
2089
+ 2LXNA_7D97D
2090
+ 2LXNA_7YC6A
2091
+ 2M7KA_2M7KA
2092
+ 2M7KA_2M7MA
2093
+ 2M7KA_2NXQB
2094
+ 2M7KA_3LI6J
2095
+ 2M7KA_5XOPE
2096
+ 2MM4A_2MM4A
2097
+ 2MM4A_5X29A
2098
+ 2MM4A_5X29B
2099
+ 2MM4A_5X29D
2100
+ 2MM4A_5X29E
2101
+ 2MR3A_2MR3A
2102
+ 2MR3A_3JCKF
2103
+ 2MR3A_5MPDO
2104
+ 2MR3A_6J2NO
2105
+ 2MR3A_6J2XO
2106
+ 2MZ7A_6QJMA
2107
+ 2MZ7A_6QJPA
2108
+ 2MZ7A_7QKWD
2109
+ 2MZ7A_7QL2B
2110
+ 2MZ7A_7QL2C
2111
+ 2MZBA_2MZBA
2112
+ 2MZBA_3RSBA
2113
+ 2MZBA_3RSBB
2114
+ 2MZBA_3RSBC
2115
+ 2MZBA_3RSBD
2116
+ 2MZWB_2MZWB
2117
+ 2MZWB_4ADNA
2118
+ 2MZWB_4ADNB
2119
+ 2MZWB_4ADOB
2120
+ 2MZWB_4E4BA
2121
+ 2N2CA_2N3XA
2122
+ 2N2CA_2N4HA
2123
+ 2N2CA_6N3BA
2124
+ 2N2CA_6N3BB
2125
+ 2N2CA_8A6IA
2126
+ 2N7LC_6MV3A
2127
+ 2N7LC_7SC2A
2128
+ 2N7LC_7SC3A
2129
+ 2N7LC_7SUPA
2130
+ 2N7LC_7SXDA
2131
+ 2NTXA_2NTXA
2132
+ 2NTXA_2NTXB
2133
+ 2NTXA_2NTYA
2134
+ 2NTXA_2NTYB
2135
+ 2NTXA_2WBLA
2136
+ 2NW2A_4DZBA
2137
+ 2NW2A_4G8EA
2138
+ 2NW2A_4IIQA
2139
+ 2NW2A_4PJ5G
2140
+ 2NW2A_5L2KD
2141
+ 2OEXA_2OEXA
2142
+ 2OEXA_2OEXB
2143
+ 2OEXA_2OJQA
2144
+ 2OEXA_4JJYA
2145
+ 2OEXA_4JJYB
2146
+ 2OGXA_4NDOA
2147
+ 2OGXA_6GWVD
2148
+ 2OGXA_6RJ4E
2149
+ 2OGXA_6RKDA
2150
+ 2OGXA_6RKEG
2151
+ 2OK5A_2XWJJ
2152
+ 2OK5A_2XWJL
2153
+ 2OK5A_3HS0I
2154
+ 2OK5A_7JTNC
2155
+ 2OK5A_7JTQA
2156
+ 2OQHA_2OQHA
2157
+ 2OQHA_2OQHB
2158
+ 2OQHA_2OQHC
2159
+ 2OQHA_2OQHD
2160
+ 2OQHA_4DYEA
2161
+ 2OSSA_2OSSA
2162
+ 2OSSA_3UW9A
2163
+ 2OSSA_5EGUD
2164
+ 2OSSA_5KU3A
2165
+ 2OSSA_6G0OA
2166
+ 2OWQA_2OWQB
2167
+ 2OWQA_4DOFA
2168
+ 2OWQA_4QC9A
2169
+ 2OWQA_4QCBB
2170
+ 2OWQA_4YIGI
2171
+ 2OX8A_2OX8A
2172
+ 2OX8A_2OX8B
2173
+ 2OX8A_2OX8D
2174
+ 2OX8A_2OX9A
2175
+ 2OX8A_2OX9C
2176
+ 2POHA_2POHF
2177
+ 2POHA_3C9ID
2178
+ 2POHA_4LINA
2179
+ 2POHA_4LINJ
2180
+ 2POHA_4LINL
2181
+ 2PTQA_2PTQB
2182
+ 2PTQA_2PTRB
2183
+ 2PTQA_3GZHA
2184
+ 2PTQA_4NSLA
2185
+ 2PTQA_4NSLD
2186
+ 2Q80A_6C56A
2187
+ 2Q80A_6C57A
2188
+ 2Q80A_6G31A
2189
+ 2Q80A_6G31E
2190
+ 2Q80A_6G31I
2191
+ 2Q81A_2Q81B
2192
+ 2Q81A_2Q81D
2193
+ 2Q81A_3M52A
2194
+ 2Q81A_3M52B
2195
+ 2Q81A_7AZXA
2196
+ 2QF0A_2QF0B
2197
+ 2QF0A_3B8JA
2198
+ 2QF0A_3LGIB
2199
+ 2QF0A_3LGIC
2200
+ 2QF0A_3LH1A
2201
+ 2QFIA_2QFIA
2202
+ 2QFIA_2QFIB
2203
+ 2QFIA_3H90A
2204
+ 2QFIA_3H90B
2205
+ 2QFIA_3H90D
2206
+ 2R5KA_6L31A
2207
+ 2R5KA_6L31B
2208
+ 2R5KA_6L31E
2209
+ 2R5KA_6L31F
2210
+ 2R5KA_7EW5A
2211
+ 2R6AA_2R6EA
2212
+ 2R6AA_2R6EB
2213
+ 2R6AA_2VYFB
2214
+ 2R6AA_4ESVA
2215
+ 2R6AA_4ESVI
2216
+ 2RDO7_3J0EH
2217
+ 2RDO7_4V9ODV
2218
+ 2RDO7_7N2VEF
2219
+ 2RDO7_7PJWx
2220
+ 2RDO7_7PJYx
2221
+ 2RHEA_2RHEA
2222
+ 2RHEA_7T6XB
2223
+ 2RHEA_7WE9J
2224
+ 2RHEA_7WE9K
2225
+ 2RHEA_7WTJL
2226
+ 2RR8A_2RR8A
2227
+ 2RR8A_3I6XA
2228
+ 2RR8A_3I6XD
2229
+ 2RR8A_5L0OA
2230
+ 2RR8A_5L0OB
2231
+ 2V1SA_2V1SB
2232
+ 2V1SA_2V1TB
2233
+ 2V1SA_3AWRA
2234
+ 2V1SA_3AX2G
2235
+ 2V1SA_3AX5C
2236
+ 2V3JA_3OIJA
2237
+ 2V3JA_3OIJB
2238
+ 2V3JA_6LQSRG
2239
+ 2V3JA_6ZQGJG
2240
+ 2V3JA_7D4IRG
2241
+ 2V5MA_2V5RB
2242
+ 2V5MA_2V5SA
2243
+ 2V5MA_4X83A
2244
+ 2V5MA_4X9BA
2245
+ 2V5MA_4X9IB
2246
+ 2V8QB_2V8QB
2247
+ 2V8QB_2V9JB
2248
+ 2V8QB_4REWB
2249
+ 2V8QB_7JHHB
2250
+ 2V8QB_7JIJB
2251
+ 2VBCA_2VBCA
2252
+ 2VBCA_2WHXA
2253
+ 2VBCA_5YVJB
2254
+ 2VBCA_5YVUB
2255
+ 2VBCA_5YW1B
2256
+ 2VGMA_2VGNA
2257
+ 2VGMA_2VGNB
2258
+ 2VGMA_3IZQ0
2259
+ 2VGMA_3J16A
2260
+ 2VGMA_5M1JA1
2261
+ 2VL0A_2YN6A
2262
+ 2VL0A_6HK0J
2263
+ 2VL0A_6V03C
2264
+ 2VL0A_6V03E
2265
+ 2VL0A_8D68A
2266
+ 2VPNA_2VPNB
2267
+ 2VPNA_2VPOB
2268
+ 2VPNA_3GYYB
2269
+ 2VPNA_3GYYC
2270
+ 2VPNA_3GYYD
2271
+ 2VRGA_2VRGA
2272
+ 2VRGA_3A4UB
2273
+ 2VRGA_3LCPD
2274
+ 2VRGA_4YGCB
2275
+ 2VRGA_4YGDH
2276
+ 2W6RA_2W6RA
2277
+ 2W6RA_3OG3A
2278
+ 2W6RA_4FX7A
2279
+ 2W6RA_4FX7B
2280
+ 2W6RA_4J9JA
2281
+ 2WADA_2WADA
2282
+ 2WADA_2WADB
2283
+ 2WADA_2WADC
2284
+ 2WADA_2WAEA
2285
+ 2WADA_2WAFA
2286
+ 2WCPA_2WD0A
2287
+ 2WCPA_2WHVA
2288
+ 2WCPA_3MVSA
2289
+ 2WCPA_4AQEA
2290
+ 2WCPA_7SB6C
2291
+ 2WDPA_3S70A
2292
+ 2WDPA_3S8EF
2293
+ 2WDPA_4HVAB
2294
+ 2WDPA_4N5DA
2295
+ 2WDPA_6DEUA
2296
+ 2WEWA_2WEWA
2297
+ 2WEWA_2WEXA
2298
+ 2WEWA_2XKLA
2299
+ 2WEWA_2YG2A
2300
+ 2WEWA_2YG2B
2301
+ 2WFF3_2WFF3
2302
+ 2WFF3_2WS93
2303
+ 2WFF3_2XBO3
2304
+ 2WFF3_4CTFD0
2305
+ 2WFF3_4CTGC0
2306
+ 2WV0A_2WV0B
2307
+ 2WV0A_4U0WA
2308
+ 2WV0A_4U0WB
2309
+ 2WV0A_4WWCA
2310
+ 2WV0A_4WWCB
2311
+ 2WW9L_2WWAL
2312
+ 2WW9L_2WWBL
2313
+ 2WW9L_3J6X66
2314
+ 2WW9L_5FL8Y
2315
+ 2WW9L_6OIGY
2316
+ 2WZPR_2WZPR
2317
+ 2WZPR_2X531
2318
+ 2WZPR_2X53Y
2319
+ 2WZPR_4V5IAY
2320
+ 2WZPR_4V5IAZ
2321
+ 2X3FA_2X3FB
2322
+ 2X3FA_3AG5A
2323
+ 2X3FA_3AG5B
2324
+ 2X3FA_3AG6A
2325
+ 2X3FA_3AG6B
2326
+ 2XGJA_2XGJA
2327
+ 2XGJA_4QU4A
2328
+ 2XGJA_4U4CA
2329
+ 2XGJA_6FT6MM
2330
+ 2XGJA_7AJTEN
2331
+ 2XNHA_2XNHA
2332
+ 2XNHA_2XNKB
2333
+ 2XNHA_2XNKC
2334
+ 2XNHA_6HM5A
2335
+ 2XNHA_6RMLB
2336
+ 2XU7A_3CFVB
2337
+ 2XU7A_6G16A
2338
+ 2XU7A_6G16C
2339
+ 2XU7A_6WKRN
2340
+ 2XU7A_7KSOD
2341
+ 2XYYA_5UU5F
2342
+ 2XYYA_5UU5G
2343
+ 2XYYA_8I1TA
2344
+ 2XYYA_8I1VC
2345
+ 2XYYA_8I1VG
2346
+ 2Y0FA_2Y0FA
2347
+ 2Y0FA_2Y0FB
2348
+ 2Y0FA_2Y0FD
2349
+ 2Y0FA_4S23A
2350
+ 2Y0FA_4S38A
2351
+ 2Y7LA_2Y7LA
2352
+ 2Y7LA_2Y7MA
2353
+ 2Y7LA_2Y7NA
2354
+ 2Y7LA_2Y7OA
2355
+ 2Y7LA_2YLHA
2356
+ 2Y9KA_2Y9KB
2357
+ 2Y9KA_3J1VA
2358
+ 2Y9KA_4G08A
2359
+ 2Y9KA_6PEMA
2360
+ 2Y9KA_6Q15Q
2361
+ 2YIIA_3NZ4B
2362
+ 2YIIA_4BAAD
2363
+ 2YIIA_4V2QA
2364
+ 2YIIA_4V2QB
2365
+ 2YIIA_4V2RB
2366
+ 2YKRE_4V6LAI
2367
+ 2YKRE_4V6OAH
2368
+ 2YKRE_4V6VAE
2369
+ 2YKRE_5MDVj
2370
+ 2YKRE_7JT3J
2371
+ 2YKRS_2YKRS
2372
+ 2YKRS_4V70AS
2373
+ 2YKRS_4V74AS
2374
+ 2YKRS_4V7IBS
2375
+ 2YKRS_6SPFs
2376
+ 2YPLD_2YPLD
2377
+ 2YPLD_3O4LD
2378
+ 2YPLD_7NMED
2379
+ 2YPLD_7NMFD
2380
+ 2YPLD_7NMGD
2381
+ 2YRQA_2YRQA
2382
+ 2YRQA_5ZDZN
2383
+ 2YRQA_5ZE1N
2384
+ 2YRQA_6CIJN
2385
+ 2YRQA_6CIMN
2386
+ 2YX1A_2YX1A
2387
+ 2YX1A_2ZZMA
2388
+ 2YX1A_2ZZNA
2389
+ 2YX1A_2ZZNB
2390
+ 2YX1A_3AY0B
2391
+ 2Z81A_2Z82A
2392
+ 2Z81A_3A79A
2393
+ 2Z81A_3A7BA
2394
+ 2Z81A_3A7CA
2395
+ 2Z81A_5D3IA
2396
+ 2ZCHP_2ZCHP
2397
+ 2ZCHP_2ZCKP
2398
+ 2ZCHP_2ZCLP
2399
+ 2ZCHP_3QUMP
2400
+ 2ZCHP_3QUMQ
2401
+ 2ZIXA_2ZIXA
2402
+ 2ZIXA_4P0PA
2403
+ 2ZIXA_4P0QA
2404
+ 2ZIXA_4P0RA
2405
+ 2ZIXA_7F6LA
2406
+ 2ZJTA_2ZJTA
2407
+ 2ZJTA_3M4IA
2408
+ 2ZJTA_5BS8B
2409
+ 2ZJTA_5BTCB
2410
+ 2ZJTA_7UGWD
2411
+ 2ZNIA_2ZNIA
2412
+ 2ZNIA_2ZNIB
2413
+ 2ZNIA_2ZNJA
2414
+ 2ZNIA_2ZNJC
2415
+ 2ZNIA_3DSQA
2416
+ 2ZPMA_3ID2A
2417
+ 2ZPMA_3ID3A
2418
+ 2ZPMA_3ID3B
2419
+ 2ZPMA_3ID4A
2420
+ 2ZPMA_7W71B
2421
+ 2ZU6B_2ZU6B
2422
+ 2ZU6B_2ZU6E
2423
+ 2ZU6B_3EIJA
2424
+ 2ZU6B_3EIJB
2425
+ 2ZU6B_3EIQC
2426
+ 2ZW4A_2ZW4A
2427
+ 2ZW4A_2ZW4C
2428
+ 2ZW4A_2ZW4D
2429
+ 2ZW4A_2ZW5A
2430
+ 2ZW4A_2ZW5B
2431
+ 2ZXEA_3WGUA
2432
+ 2ZXEA_4HQJC
2433
+ 2ZXEA_5AW1A
2434
+ 2ZXEA_6KPWC
2435
+ 2ZXEA_7E1ZA
2436
+ 2ZXYA_2ZXYA
2437
+ 2ZXYA_3X15A
2438
+ 2ZXYA_3X15D
2439
+ 2ZXYA_3X15G
2440
+ 2ZXYA_3X15J
2441
+ 3A1JB_3A1JB
2442
+ 3A1JB_3GGRB
2443
+ 3A1JB_6J8YB
2444
+ 3A1JB_7Z6HC
2445
+ 3A1JB_8GNNB
2446
+ 3A69A_6JZTA
2447
+ 3A69A_6K3IEF
2448
+ 3A69A_6K3IFF
2449
+ 3A69A_6K9QI
2450
+ 3A69A_6K9QT
2451
+ 3AAPA_3AARA
2452
+ 3AAPA_4BR4A
2453
+ 3AAPA_4BREA
2454
+ 3AAPA_4BRKB
2455
+ 3AAPA_4BRLA
2456
+ 3AAZA_3AAZH
2457
+ 3AAZA_5IIEA
2458
+ 3AAZA_5IIEH
2459
+ 3AAZA_5T4ZA
2460
+ 3AAZA_5T4ZH
2461
+ 3AJMA_3AJMA
2462
+ 3AJMA_3AJMB
2463
+ 3AJMA_3L8JA
2464
+ 3AJMA_3W8HA
2465
+ 3AJMA_4GEHC
2466
+ 3APZA_3APZA
2467
+ 3APZA_3AQ0A
2468
+ 3APZA_3AQ0D
2469
+ 3APZA_3AQ0E
2470
+ 3APZA_3AQ0G
2471
+ 3AQFB_3N7PA
2472
+ 3AQFB_3N7PB
2473
+ 3AQFB_3N7SA
2474
+ 3AQFB_6UMGc
2475
+ 3AQFB_6UMGC
2476
+ 3AU3A_3AU3A
2477
+ 3AU3A_3NMZA
2478
+ 3AU3A_4YJEA
2479
+ 3AU3A_4YJLD
2480
+ 3AU3A_5IZAA
2481
+ 3AUXA_3AUXA
2482
+ 3AUXA_3AUYA
2483
+ 3AUXA_3AUYB
2484
+ 3AUXA_3AV0B
2485
+ 3AUXA_5DNYD
2486
+ 3B0WA_3KYTA
2487
+ 3B0WA_5IZ0B
2488
+ 3B0WA_5NTQA
2489
+ 3B0WA_6A22A
2490
+ 3B0WA_6FGQA
2491
+ 3B1NA_3B1OA
2492
+ 3B1NA_3B1OB
2493
+ 3B1NA_3B1QB
2494
+ 3B1NA_3B1RA
2495
+ 3B1NA_3B1RC
2496
+ 3B5HA_3B5HA
2497
+ 3B5HA_3B5HB
2498
+ 3B5HA_3B5HC
2499
+ 3B5HA_4U0QD
2500
+ 3B5HA_7XY8A
2501
+ 3B6YA_3B6YA
2502
+ 3B6YA_3B6YB
2503
+ 3B6YA_3RLOA
2504
+ 3B6YA_3RNUA
2505
+ 3B6YA_3RNUB
2506
+ 3BCHA_3BCHA
2507
+ 3BCHA_4D5LA
2508
+ 3BCHA_4V5ZAb
2509
+ 3BCHA_6D9JBB
2510
+ 3BCHA_7SYXB
2511
+ 3BG3A_3BG3A
2512
+ 3BG3A_3BG3D
2513
+ 3BG3A_7WTAA
2514
+ 3BG3A_7WTAB
2515
+ 3BG3A_7WTCD
2516
+ 3CC6A_3ET7A
2517
+ 3CC6A_3FZOA
2518
+ 3CC6A_3FZRA
2519
+ 3CC6A_3FZTA
2520
+ 3CC6A_4H1MA
2521
+ 3CHXB_3CHXB
2522
+ 3CHXB_3CHXJ
2523
+ 3CHXB_3RFRB
2524
+ 3CHXB_4PHZB
2525
+ 3CHXB_7S4MB
2526
+ 3DHWA_3DHWB
2527
+ 3DHWA_3DHWE
2528
+ 3DHWA_3DHWF
2529
+ 3DHWA_3TUIA
2530
+ 3DHWA_6CVLB
2531
+ 3DL3A_3DL3A
2532
+ 3DL3A_3DL3B
2533
+ 3DL3A_3DL3E
2534
+ 3DL3A_3DL3F
2535
+ 3DL3A_3DL3I
2536
+ 3DTPE_3JAXE
2537
+ 3DTPE_3JBHE
2538
+ 3DTPE_3JBHL
2539
+ 3DTPE_6SO3E
2540
+ 3DTPE_6SO3F
2541
+ 3DZUA_3DZUA
2542
+ 3DZUA_3DZYA
2543
+ 3DZUA_4NQAA
2544
+ 3DZUA_4NQAH
2545
+ 3DZUA_5UANA
2546
+ 3ECOA_4L9JA
2547
+ 3ECOA_4L9TB
2548
+ 3ECOA_4L9VA
2549
+ 3ECOA_4LD5C
2550
+ 3ECOA_4LD5D
2551
+ 3EEBA_3EEBA
2552
+ 3EEBA_3FZYA
2553
+ 3EEBA_3FZYB
2554
+ 3EEBA_3GCDA
2555
+ 3EEBA_7D5YA
2556
+ 3EI1B_3EI1B
2557
+ 3EI1B_4A08B
2558
+ 3EI1B_4A0BD
2559
+ 3EI1B_4A0KD
2560
+ 3EI1B_4A0LD
2561
+ 3ELLA_3ELLA
2562
+ 3ELLA_3MOKA
2563
+ 3ELLA_3MOLA
2564
+ 3ELLA_5XA4A
2565
+ 3ELLA_7EMUA
2566
+ 3EOEA_3EOEB
2567
+ 3EOEA_3GG8A
2568
+ 3EOEA_3GG8B
2569
+ 3EOEA_3GG8C
2570
+ 3EOEA_3GG8D
2571
+ 3EPSA_3EPSA
2572
+ 3EPSA_3EPSB
2573
+ 3EPSA_3LC6A
2574
+ 3EPSA_3LC6B
2575
+ 3EPSA_4P69A
2576
+ 3EZJB_3EZJD
2577
+ 3EZJB_3EZJH
2578
+ 3EZJB_6WARD
2579
+ 3EZJB_6WARF
2580
+ 3EZJB_6WARN
2581
+ 3F59A_3F59C
2582
+ 3F59A_3F59D
2583
+ 3F59A_3KBTC
2584
+ 3F59A_3KBTD
2585
+ 3F59A_3KBUD
2586
+ 3FH2A_3WDBA
2587
+ 3FH2A_6CN8A
2588
+ 3FH2A_6PBSG
2589
+ 3FH2A_6UCRA
2590
+ 3FH2A_7AA4A
2591
+ 3FK3A_3FK3A
2592
+ 3FK3A_3RLSA
2593
+ 3FK3A_3RLSB
2594
+ 3FK3A_5WYIA
2595
+ 3FK3A_6AXJC
2596
+ 3FPQA_5O1VA
2597
+ 3FPQA_5O26B
2598
+ 3FPQA_5WE8A
2599
+ 3FPQA_5WE8B
2600
+ 3FPQA_6CN9B
2601
+ 3FVYA_3FVYA
2602
+ 3FVYA_3T6BB
2603
+ 3FVYA_3T6JA
2604
+ 3FVYA_5E33A
2605
+ 3FVYA_5E3CA
2606
+ 3GB4A_3GB4A
2607
+ 3GB4A_3GB4C
2608
+ 3GB4A_3GOBA
2609
+ 3GB4A_3GTEC
2610
+ 3GB4A_6VSHA
2611
+ 3GB8A_3GB8A
2612
+ 3GB8A_3NBZA
2613
+ 3GB8A_3NC1A
2614
+ 3GB8A_4BSMA
2615
+ 3GB8A_5DISA
2616
+ 3GKKA_3GKKA
2617
+ 3GKKA_3GKNA
2618
+ 3GKKA_3GKNB
2619
+ 3GKKA_5IMDA
2620
+ 3GKKA_5IOXA
2621
+ 3GLRA_3GLSA
2622
+ 3GLRA_4BVHC
2623
+ 3GLRA_5D7NF
2624
+ 3GLRA_6ISOA
2625
+ 3GLRA_6ISOG
2626
+ 3GOYA_3GOYB
2627
+ 3GOYA_3SE2C
2628
+ 3GOYA_3SMJA
2629
+ 3GOYA_6WE4B
2630
+ 3GOYA_7L9YA
2631
+ 3GVPA_3GVPB
2632
+ 3GVPA_3GVPC
2633
+ 3GVPA_3GVPD
2634
+ 3GVPA_3MTGA
2635
+ 3GVPA_3MTGB
2636
+ 3GZUA_6OGYI
2637
+ 3GZUA_6OGYK
2638
+ 3GZUA_6OGZE
2639
+ 3GZUA_6OJ3F
2640
+ 3GZUA_6OJ5E
2641
+ 3H3PH_3H3PI
2642
+ 3H3PH_4LLVA
2643
+ 3H3PH_4LLVE
2644
+ 3H3PH_4LRNH
2645
+ 3H3PH_4ODXH
2646
+ 3HZSA_3VMQB
2647
+ 3HZSA_3VMRA
2648
+ 3HZSA_3VMSB
2649
+ 3HZSA_3VMTA
2650
+ 3HZSA_6FTBA
2651
+ 3I3YA_3I3YA
2652
+ 3I3YA_3I3YD
2653
+ 3I3YA_3IKHA
2654
+ 3I3YA_3IKHB
2655
+ 3I3YA_3IKHC
2656
+ 3I4RB_3I4RB
2657
+ 3I4RB_5A9Q3
2658
+ 3I4RB_5A9QC
2659
+ 3I4RB_5A9QL
2660
+ 3I4RB_7PEQCC
2661
+ 3IABA_3IABA
2662
+ 3IABA_6AGBF
2663
+ 3IABA_6AH3F
2664
+ 3IABA_6W6VF
2665
+ 3IABA_7C79F
2666
+ 3IBDA_3IBDA
2667
+ 3IBDA_3UA5A
2668
+ 3IBDA_3UA5B
2669
+ 3IBDA_4ZV8A
2670
+ 3IBDA_5UAPA
2671
+ 3IEPA_3IESA
2672
+ 3IEPA_6AAAA
2673
+ 3IEPA_6AAAB
2674
+ 3IEPA_6Q2MA
2675
+ 3IEPA_6Q2MC
2676
+ 3IFZA_3IFZA
2677
+ 3IFZA_3ILWA
2678
+ 3IFZA_5BS8A
2679
+ 3IFZA_5BTAC
2680
+ 3IFZA_7UGWA
2681
+ 3IYDE_3IYDE
2682
+ 3IYDE_3LU0E
2683
+ 3IYDE_4LJZK
2684
+ 3IYDE_5MS0O
2685
+ 3IYDE_6N4CE
2686
+ 3IYGQ_6NRBH
2687
+ 3IYGQ_7TTNB
2688
+ 3IYGQ_7WU7P
2689
+ 3IYGQ_7X0AQ
2690
+ 3IYGQ_7X6QJ
2691
+ 3IYKA_3J9ED
2692
+ 3IYKA_7RTNA
2693
+ 3IYKA_7RTNB
2694
+ 3IYKA_7RTOA
2695
+ 3IYKA_7RTOB
2696
+ 3IYLU_3IYLV
2697
+ 3IYLU_3K1QD
2698
+ 3IYLU_3K1QE
2699
+ 3IYLU_5ZVTU
2700
+ 3IYLU_8FJKV
2701
+ 3J04B_3J04B
2702
+ 3J04B_6Z47E
2703
+ 3J04B_7MF3E
2704
+ 3J04B_7UDTG
2705
+ 3J04B_7UDUE
2706
+ 3J0JI_3V6IB
2707
+ 3J0JI_3V6IX
2708
+ 3J0JI_5GARI
2709
+ 3J0JI_5GASI
2710
+ 3J0JI_6R10I
2711
+ 3J0LK_4V5ZAk
2712
+ 3J0LK_4V6IAK
2713
+ 3J0LK_4V7EBO
2714
+ 3J0LK_4V7HAK
2715
+ 3J0LK_7MQANG
2716
+ 3J1OI_3J1OI
2717
+ 3J1OI_3RJ1B
2718
+ 3J1OI_3RJ1P
2719
+ 3J1OI_4GWPB
2720
+ 3J1OI_7UIOAq
2721
+ 3J1ZP_3J1ZP
2722
+ 3J1ZP_7KZXA
2723
+ 3J1ZP_8F6KA
2724
+ 3J1ZP_8F6KB
2725
+ 3J1ZP_8F6KC
2726
+ 3J22A_4C0UA
2727
+ 3J22A_4N43A
2728
+ 3J22A_4RR3E
2729
+ 3J22A_4RS5J
2730
+ 3J22A_4YVSA
2731
+ 3J2WQ_3J2WQ
2732
+ 3J2WQ_3J2WS
2733
+ 3J2WQ_5VU2Q
2734
+ 3J2WQ_5VU2R
2735
+ 3J2WQ_5VU2T
2736
+ 3J3VT_3J3VT
2737
+ 3J3VT_3J3WT
2738
+ 3J3VT_3J9WBW
2739
+ 3J3VT_7AS9X
2740
+ 3J3VT_8BUUT
2741
+ 3J40A_3J40A
2742
+ 3J40A_3J40C
2743
+ 3J40A_3J40E
2744
+ 3J40A_3J40F
2745
+ 3J40A_3J40G
2746
+ 3J5PA_3J5QB
2747
+ 3J5PA_7L2PD
2748
+ 3J5PA_7L2TA
2749
+ 3J5PA_7LR0A
2750
+ 3J5PA_7RQZA
2751
+ 3J6X55_3J6X55
2752
+ 3J6X55_4V4BBL
2753
+ 3J6X55_4V7HBL
2754
+ 3J6X55_5FL8N
2755
+ 3J6X55_7U0HN
2756
+ 3J6X59_3J6X59
2757
+ 3J6X59_5FL8R
2758
+ 3J6X59_6OIGR
2759
+ 3J6X59_6T7TLR
2760
+ 3J6X59_7R7AR
2761
+ 3J6X79_3J6X79
2762
+ 3J6X79_4V6IBo
2763
+ 3J6X79_4V7HB1
2764
+ 3J6X79_5FL8l
2765
+ 3J6X79_6OIGl
2766
+ 3J6XL5_4U3Ml5
2767
+ 3J6XL5_4V6IBQ
2768
+ 3J6XL5_4V7RBE
2769
+ 3J6XL5_5FL8D
2770
+ 3J6XL5_7UOOD
2771
+ 3J70D_3J70P
2772
+ 3J70D_6CH9G
2773
+ 3J70D_6EDUD
2774
+ 3J70D_6MEOG
2775
+ 3J70D_6NIJA
2776
+ 3J7774_3J7774
2777
+ 3J7774_5NDWN4
2778
+ 3J7774_6OIGW
2779
+ 3J7774_6T83Wy
2780
+ 3J7774_7ZUXEV
2781
+ 3J7OB_3JAGB
2782
+ 3J7OB_4D5YB
2783
+ 3J7OB_4V5ZBb
2784
+ 3J7OB_4V6XCB
2785
+ 3J7OB_6D9JB
2786
+ 3J7Oc_4V5ZB6
2787
+ 3J7Oc_6HCFc3
2788
+ 3J7Oc_7NWHc
2789
+ 3J7Oc_7O81Bc
2790
+ 3J7Oc_8G5YLc
2791
+ 3J7Oh_4D5Yh
2792
+ 3J7Oh_4V6XCh
2793
+ 3J7Oh_6D9Jh
2794
+ 3J7Oh_6HCFh3
2795
+ 3J7Oh_7OYBh1
2796
+ 3J7ON_3J7ON
2797
+ 3J7ON_4V5ZBm
2798
+ 3J7ON_6D9JN
2799
+ 3J7ON_6ZVK12
2800
+ 3J7ON_7A0112
2801
+ 3J7Pq_3JAJq
2802
+ 3J7Pq_5LZSs
2803
+ 3J7Pq_6HCJt3
2804
+ 3J7Pq_6ZVKK2
2805
+ 3J7Pq_7A01K2
2806
+ 3J7PSK_5K0Yt
2807
+ 3J7PSK_5LZSKK
2808
+ 3J7PSK_6GZ3BK
2809
+ 3J7PSK_6OLZBK
2810
+ 3J7PSK_7A01E3
2811
+ 3J7PSV_4D5LV
2812
+ 3J7PSV_4KZXV
2813
+ 3J7PSV_6D9JWW
2814
+ 3J7PSV_6FECb
2815
+ 3J7PSV_7A01D3
2816
+ 3J7Yc_3J7Yc
2817
+ 3J7Yc_4CE4h
2818
+ 3J7Yc_4V1Ah
2819
+ 3J7Yc_7OI6c
2820
+ 3J7Yc_7PD3c
2821
+ 3J7Yk_3J7Yk
2822
+ 3J7Yk_4V1Ap
2823
+ 3J7Yk_6I9Rk
2824
+ 3J7Yk_6NU2k
2825
+ 3J7Yk_7OI6k
2826
+ 3J96A_3J96A
2827
+ 3J96A_3J96C
2828
+ 3J96A_3J97F
2829
+ 3J96A_3J99E
2830
+ 3J96A_6MDMA
2831
+ 3J9MAF_3JD5G
2832
+ 3J9MAF_6NF8G
2833
+ 3J9MAF_6NU2AF
2834
+ 3J9MAF_6NU3AF
2835
+ 3J9MAF_6RW4F
2836
+ 3J9MAS_3JD5b
2837
+ 3J9MAS_6NU2AS
2838
+ 3J9MAS_6NU3AS
2839
+ 3J9MAS_6RW4S
2840
+ 3J9MAS_7A5KS6
2841
+ 3J9TR_3J9TT
2842
+ 3J9TR_3J9Ua
2843
+ 3J9TR_5VOZT
2844
+ 3J9TR_6O7Ti
2845
+ 3J9TR_7FDCV
2846
+ 3J9WAL_5MYJAL
2847
+ 3J9WAL_5ND8l
2848
+ 3J9WAL_6S13l
2849
+ 3J9WAL_7ASO3
2850
+ 3J9WAL_7ASPc
2851
+ 3J9WAR_5T7VS9
2852
+ 3J9WAR_6S13r
2853
+ 3J9WAR_7ASO9
2854
+ 3J9WAR_7ASPi
2855
+ 3J9WAR_7P48r
2856
+ 3J9WB0_3J9WB0
2857
+ 3J9WB0_6HTQW
2858
+ 3J9WB0_7AS9b
2859
+ 3J9WB0_7NHN1
2860
+ 3J9WB0_8BUUX
2861
+ 3J9Yu_4V6LAY
2862
+ 3J9Yu_4V6QAX
2863
+ 3J9Yu_4V6VAU
2864
+ 3J9Yu_7PJTu
2865
+ 3J9Yu_8A3LU
2866
+ 3JA7A_3JA7B
2867
+ 3JA7A_6UZCG
2868
+ 3JA7A_6UZCH
2869
+ 3JA7A_6UZCN
2870
+ 3JA7A_6UZCO
2871
+ 3JA84_3JC74
2872
+ 3JA84_5V8F4
2873
+ 3JA84_5XF84
2874
+ 3JA84_6PTOH
2875
+ 3JA84_7PT64
2876
+ 3JAAA_4DL3A
2877
+ 3JAAA_4DL5A
2878
+ 3JAAA_4RNOA
2879
+ 3JAAA_6M7UA
2880
+ 3JAAA_6M7VA
2881
+ 3JC8Ta_3JC8Ta
2882
+ 3JC8Ta_3JC8Tb
2883
+ 3JC8Ta_3JC8Tc
2884
+ 3JC8Ta_3JC8Ti
2885
+ 3JC8Ta_3JC9Ta
2886
+ 3JCKC_3JCKC
2887
+ 3JCKC_3JCOQ
2888
+ 3JCKC_4CR2Q
2889
+ 3JCKC_6FVTQ
2890
+ 3JCKC_6FVWQ
2891
+ 3JCKE_3JCKE
2892
+ 3JCKE_5WVIU
2893
+ 3JCKE_6FVUU
2894
+ 3JCKE_6FVWU
2895
+ 3JCKE_6J30U
2896
+ 3JCKH_3JCKH
2897
+ 3JCKH_4CR3T
2898
+ 3JCKH_5MPBT
2899
+ 3JCKH_5WVIT
2900
+ 3JCKH_6J30T
2901
+ 3JCMP_5GAMh
2902
+ 3JCMP_5GAOl
2903
+ 3JCMP_5ZWOb
2904
+ 3JCMP_6N7XL
2905
+ 3JCMP_7DCOb
2906
+ 3JCRD_5MQFF
2907
+ 3JCRD_5XJCE
2908
+ 3JCRD_6QDVN
2909
+ 3JCRD_7ABGD
2910
+ 3JCRD_8CH6o
2911
+ 3JCSH_3JCSH
2912
+ 3JCSH_4V8MBO
2913
+ 3JCSH_5T2AM
2914
+ 3JCSH_5T5HO
2915
+ 3JCSH_6AZ3H
2916
+ 3JCTb_7OHPb
2917
+ 3JCTb_7OHQb
2918
+ 3JCTb_7OHWb
2919
+ 3JCTb_7UG6b
2920
+ 3JCTb_7Z34b
2921
+ 3JCTr_3JCTr
2922
+ 3JCTr_6ELZr
2923
+ 3JCTr_6YLGr
2924
+ 3JCTr_6YLXr
2925
+ 3JCTr_7OHTr
2926
+ 3JCTu_6C0Fu
2927
+ 3JCTu_6N8Ju
2928
+ 3JCTu_6YLYu
2929
+ 3JCTu_7OH3u
2930
+ 3JCTu_7Z34u
2931
+ 3JTMA_3JTMA
2932
+ 3JTMA_3N7UB
2933
+ 3JTMA_3N7UF
2934
+ 3JTMA_3N7UH
2935
+ 3JTMA_3NAQA
2936
+ 3JUAB_5OAQL
2937
+ 3JUAB_6GE3L
2938
+ 3JUAB_6GEKL
2939
+ 3JUAB_8A8QC
2940
+ 3JUAB_8A8RM
2941
+ 3KHXA_3KHXA
2942
+ 3KHXA_3KHXB
2943
+ 3KHXA_3KHZA
2944
+ 3KHXA_3KHZB
2945
+ 3KHXA_3KI9A
2946
+ 3KKBA_3KKBA
2947
+ 3KKBA_3KKBB
2948
+ 3KKBA_3L34A
2949
+ 3KKBA_4LLCA
2950
+ 3KKBA_4LLCB
2951
+ 3KLTA_3KLTA
2952
+ 3KLTA_3KLTB
2953
+ 3KLTA_3KLTC
2954
+ 3KLTA_3TRTA
2955
+ 3KLTA_3TRTB
2956
+ 3KTGA_3KTHA
2957
+ 3KTGA_7FESJ
2958
+ 3KTGA_7FESL
2959
+ 3KTGA_7FESM
2960
+ 3KTGA_7P80B
2961
+ 3LREA_3LREB
2962
+ 3LREA_5OAMK
2963
+ 3LREA_5OCUK
2964
+ 3LREA_5OGCK
2965
+ 3LREA_7RSIC
2966
+ 3MSYA_3MSYB
2967
+ 3MSYA_3MSYD
2968
+ 3MSYA_3NO1A
2969
+ 3MSYA_4H83D
2970
+ 3MSYA_4H83F
2971
+ 3N9IA_3N9IA
2972
+ 3N9IA_3N9IB
2973
+ 3N9IA_8I1WB
2974
+ 3N9IA_8I1ZA
2975
+ 3N9IA_8I2CB
2976
+ 3NBXX_3NBXX
2977
+ 3NBXX_6Q7LV
2978
+ 3NBXX_6Q7LW
2979
+ 3NBXX_6Q7LX
2980
+ 3NBXX_6Q7LZ
2981
+ 3NKDA_3NKDA
2982
+ 3NKDA_5DQZD
2983
+ 3NKDA_5DS5B
2984
+ 3NKDA_5VVKA
2985
+ 3NKDA_5WFEB
2986
+ 3NXCA_3NXCA
2987
+ 3NXCA_4GFLB
2988
+ 3NXCA_5HBUA
2989
+ 3NXCA_5HBUB
2990
+ 3NXCA_5K58F
2991
+ 3OCEA_3OCEA
2992
+ 3OCEA_3OCEC
2993
+ 3OCEA_3OCED
2994
+ 3OCEA_3OCFA
2995
+ 3OCEA_3OCFB
2996
+ 3OETA_3OETB
2997
+ 3OETA_3OETD
2998
+ 3OETA_3OETE
2999
+ 3OETA_3OETG
3000
+ 3OETA_3OETH
3001
+ 3OLMA_3OLMA
3002
+ 3OLMA_4LCDA
3003
+ 3OLMA_4LCDB
3004
+ 3OLMA_5HPLA
3005
+ 3OLMA_5HPLB
3006
+ 3P54A_3P54A
3007
+ 3P54A_5WSNA
3008
+ 3P54A_5YWPE
3009
+ 3P54A_6A0PA
3010
+ 3P54A_7KVBA
3011
+ 3PCOB_3PCOD
3012
+ 3PCOB_6OZ5D
3013
+ 3PCOB_6P24B
3014
+ 3PCOB_7N8YB
3015
+ 3PCOB_7N8YD
3016
+ 3PHGA_3PHGB
3017
+ 3PHGA_4FOSA
3018
+ 3PHGA_4FQ8B
3019
+ 3PHGA_4FSHA
3020
+ 3PHGA_4FSHB
3021
+ 3PILA_3PILA
3022
+ 3PILA_3PILB
3023
+ 3PILA_3PIMA
3024
+ 3PILA_3PIMB
3025
+ 3PILA_3PINB
3026
+ 3PJZA_3PJZA
3027
+ 3PJZA_6V4JA
3028
+ 3PJZA_6V4JB
3029
+ 3PJZA_6V4KC
3030
+ 3PJZA_6V4LB
3031
+ 3PPCA_3PPCA
3032
+ 3PPCA_3PPCB
3033
+ 3PPCA_3PPHB
3034
+ 3PPCA_4L5ZA
3035
+ 3PPCA_4QQUA
3036
+ 3PV2A_3PV2B
3037
+ 3PV2A_3PV3C
3038
+ 3PV2A_3PV3D
3039
+ 3PV2A_3PV5B
3040
+ 3PV2A_3PV5C
3041
+ 3Q6GH_3Q6GH
3042
+ 3Q6GH_4RFEA
3043
+ 3Q6GH_4RFNB
3044
+ 3Q6GH_5UKNH
3045
+ 3Q6GH_6P60A
3046
+ 3QF7A_3QG5A
3047
+ 3QF7A_3QG5B
3048
+ 3QF7A_3THOA
3049
+ 3QF7A_4W9ME
3050
+ 3QF7A_4W9MI
3051
+ 3QILA_3QILA
3052
+ 3QILA_3QILC
3053
+ 3QILA_3QILD
3054
+ 3QILA_3QILM
3055
+ 3QILA_3QILR
3056
+ 3QV9A_4HYVA
3057
+ 3QV9A_4KCVB
3058
+ 3QV9A_4KRZA
3059
+ 3QV9A_4KRZB
3060
+ 3QV9A_4KS0B
3061
+ 3R74A_3R74A
3062
+ 3R74A_3R74B
3063
+ 3R74A_3R75A
3064
+ 3R74A_3R75B
3065
+ 3R74A_3R76A
3066
+ 3R90A_3R90C
3067
+ 3R90A_3R90E
3068
+ 3R90A_3R90J
3069
+ 3R90A_5VYCk4
3070
+ 3R90A_6MS4A
3071
+ 3RGXA_3RGXA
3072
+ 3RGXA_3RGZA
3073
+ 3RGXA_3RJ0A
3074
+ 3RGXA_4M7EA
3075
+ 3RGXA_4M7EB
3076
+ 3RJRA_3RJRA
3077
+ 3RJRA_5FFOD
3078
+ 3RJRA_5FFOH
3079
+ 3RJRA_5VQFC
3080
+ 3RJRA_5VQFD
3081
+ 3RKIA_3RKIA
3082
+ 3RKIA_3RKIB
3083
+ 3RKIA_5TDGB
3084
+ 3RKIA_5TPNA
3085
+ 3RKIA_5UDCA
3086
+ 3S2KA_3S2KA
3087
+ 3S2KA_3S8VA
3088
+ 3S2KA_5FWWA
3089
+ 3S2KA_6H16A
3090
+ 3S2KA_8DVMA
3091
+ 3SL7A_3SL7A
3092
+ 3SL7A_3SL7B
3093
+ 3SL7A_4GQYA
3094
+ 3SL7A_4GQYB
3095
+ 3SL7A_4GQYC
3096
+ 3SN6R_3SN6R
3097
+ 3SN6R_4LDEA
3098
+ 3SN6R_4QKXA
3099
+ 3SN6R_5JQHB
3100
+ 3SN6R_6CSYA
3101
+ 3SOPA_3SOPB
3102
+ 3SOPA_4Z54A
3103
+ 3SOPA_4Z54B
3104
+ 3SOPA_6UQQC
3105
+ 3SOPA_6UQQD
3106
+ 3SW1A_3SW1A
3107
+ 3SW1A_5J3WB
3108
+ 3SW1A_5J4EC
3109
+ 3SW1A_5LUVB
3110
+ 3SW1A_7R56A
3111
+ 3T12A_3T12A
3112
+ 3T12A_3T1OA
3113
+ 3T12A_3T1QA
3114
+ 3T12A_3T1TB
3115
+ 3T12A_3T1VC
3116
+ 3T5NA_3T5QA
3117
+ 3T5NA_3T5QE
3118
+ 3T5NA_3T5QG
3119
+ 3T5NA_3T5QI
3120
+ 3T5NA_3T5QK
3121
+ 3T6CA_3T6CA
3122
+ 3T6CA_3T6CB
3123
+ 3T6CA_3TW9B
3124
+ 3T6CA_3TW9D
3125
+ 3T6CA_3TWAA
3126
+ 3T8SA_3T8SA
3127
+ 3T8SA_3T8SB
3128
+ 3T8SA_3UJ0A
3129
+ 3T8SA_3UJ0B
3130
+ 3T8SA_3UJ4A
3131
+ 3TGOC_3TGOC
3132
+ 3TGOC_5EKQC
3133
+ 3TGOC_5LJOC
3134
+ 3TGOC_6LYSC
3135
+ 3TGOC_6V05C
3136
+ 3TJ1A_3TJ1A
3137
+ 3TJ1A_5G5LO
3138
+ 3TJ1A_6RQTO
3139
+ 3TJ1A_6RUOO
3140
+ 3TJ1A_6RWEO
3141
+ 3TTKA_3TTKA
3142
+ 3TTKA_3TTKB
3143
+ 3TTKA_3TTKC
3144
+ 3TTKA_3TTMA
3145
+ 3TTKA_3TTMB
3146
+ 3U1SH_3U1SH
3147
+ 3U1SH_5U1FH
3148
+ 3U1SH_5UXQA
3149
+ 3U1SH_5UXQH
3150
+ 3U1SH_5UY3H
3151
+ 3U3IA_3U3IA
3152
+ 3U3IA_4AKLA
3153
+ 3U3IA_4AQFA
3154
+ 3U3IA_4AQGA
3155
+ 3U3IA_6Z0OB
3156
+ 3U5ZB_3U5ZB
3157
+ 3U5ZB_3U5ZE
3158
+ 3U5ZB_3U60C
3159
+ 3U5ZB_3U61B
3160
+ 3U5ZB_3U61E
3161
+ 3UO1H_6BPCE
3162
+ 3UO1H_6BPEB
3163
+ 3UO1H_6BPEE
3164
+ 3UO1H_7KYLH
3165
+ 3UO1H_7KYLX
3166
+ 3V44A_3V44A
3167
+ 3V44A_3V47A
3168
+ 3V44A_3V47B
3169
+ 3V44A_5GY2B
3170
+ 3V44A_6BXCB
3171
+ 3V6TA_4OSKB
3172
+ 3V6TA_4OSLA
3173
+ 3V6TA_4OT3B
3174
+ 3V6TA_6JTQB
3175
+ 3V6TA_6JVZB
3176
+ 3VD6C_3VEKC
3177
+ 3VD6C_4HC7A
3178
+ 3VD6C_4HC7B
3179
+ 3VD6C_4HC9A
3180
+ 3VD6C_4HCAA
3VD8A_5VAWA
3VD8A_5YGSD
3VD8A_6QF7C
3VD8A_8AG0B
3VD8A_8DX4B
3VH1A_3VH1A
3VH1A_3VH2A
3VH1A_4GSKA
3VH1A_4GSKB
3VH1A_4GSLB
3VKWA_3VKWA
3VKWA_3WRXC
3VKWA_3WRXD
3VKWA_3WRYC
3VKWA_3WRYD
3W6GA_5XBQI
3W6GA_6IU1A
3W6GA_8HH0A
3W6GA_8HLAD
3W6GA_8HLAL
3WBIA_3WBIA
3WBIA_3WBKA
3WBIA_3WBKB
3WBIA_6UZ71
3WBIA_6WOO1
3WG9A_3WG9B
3WG9A_3WG9C
3WG9A_3WGGA
3WG9A_3WGIB
3WG9A_3WGIC
3WKLA_3WKLA
3WKLA_3WKMA
3WKLA_6AKQA
3WKLA_6ICFA
3WKLA_7CQCA
3WMEA_3WMFA
3WMEA_6A6MA
3WMEA_7DQVA
3WMEA_7FC9A
3WMEA_7VR5A
3WSXA_3WSXA
3WSXA_3WSYA
3WSXA_3WSZA
3WSXA_7VT0A
3WSXA_7VT0B
3WT1A_3WT1A
3WT1A_3WT1D
3WT1A_3WT2A
3WT1A_3WT2B
3WT1A_5CRWA
3X2RA_3X2RC
3X2RA_4Q79E
3X2RA_4UV2A
3X2RA_4UV2P
3X2RA_6L7CA
3X3MA_3X3MA
3X3MA_3X3NA
3X3MA_4KK7A
3X3MA_5EBCA
3X3MA_5EBDA
3ZBEA_3ZBEA
3ZBEA_5CW7E
3ZBEA_5CW7G
3ZBEA_5CW7M
3ZBEA_5CZEA
3ZYYX_3ZYYX
3ZYYX_3ZYYY
3ZYYX_4C1NI
3ZYYX_4C1NJ
3ZYYX_4C1NK
4AJ5K_4AJ5K
4AJ5K_4AJ5L
4AJ5K_4AJ5M
4AJ5K_4AJ5P
4AJ5K_4AJ5R
4AQ5C_4AQ5C
4AQ5C_4AQ9C
4AQ5C_4BOG1
4AQ5C_7QKOC
4AQ5C_8F6ZB
4ARJA_4ARJA
4ARJA_4ARJB
4ARJA_4EPIA
4ARJA_4EXMC
4ARJA_4EXMD
4BM9A_4K95A
4BM9A_5N2WA
4BM9A_5N38A
4BM9A_6GLCA
4BM9A_6HUEA
4BSRA_4BSSF
4BSRA_4BSUE
4BSRA_4KNGB
4BSRA_4UFRA
4BSRA_4UFRC
4C0OA_4C0PC
4C0OA_4C0PD
4C0OA_4C0QA
4C0OA_6GX9A
4C0OA_6GX9B
4C2MD_5M5WD
4C2MD_6H67D
4C2MD_6H68D
4C2MD_6RQHD
4C2MD_6RQTD
4C2MI_4C2MI
4C2MI_6HLQI
4C2MI_6HLSI
4C2MI_6RRDI
4C2MI_6RUOI
4C3BA_4C3BP
4C3BA_4C3EK
4C3BA_4C3EO
4C3BA_6G0YC
4C3BA_6G0YF
4C4VA_4C4VA
4C4VA_4C4VB
4C4VA_7NREA
4C4VA_7TT5A
4C4VA_7TT6A
4C8HA_4C93A
4C8HA_5HOGB
4C8HA_6PTNF
4C8HA_6PTNG
4C8HA_7PMKF
4C8VA_4C8VH
4C8VA_4C8WI
4C8VA_4C99D
4C8VA_4C9EB
4C8VA_4C9EF
4CBVA_4CBVA
4CBVA_4CBVB
4CBVA_4CBVC
4CBVA_4CBVD
4CBVA_4CBVF
4CFEB_4RERB
4CFEB_4ZHXB
4CFEB_5ISOB
4CFEB_6B2EB
4CFEB_7MYJD
4CI1B_6BN9B
4CI1B_6BNBB
4CI1B_6H0FB
4CI1B_7U8FA
4CI1B_8D81B
4CKBA_4CKBD
4CKBA_4CKCA
4CKBA_4CKEA
4CKBA_6RFLO
4CKBA_6RIEO
4COOA_4L0DB
4COOA_4L3VA
4COOA_4PCUA
4COOA_4PCUB
4COOA_7QGTA
4CPCA_4CPCA
4CPCA_4CPCC
4CPCA_4CPCD
4CPCA_4CPCF
4CPCA_4CPCH
4CSFA_4CSFA
4CSFA_4CSGD
4CSFA_4J4YB
4CSFA_5FVAA
4CSFA_8BPKA
4D0YA_4D0YA
4D0YA_4D0YB
4D0YA_4NT9B
4D0YA_4OXDA
4D0YA_4OXDD
4D10H_4D10H
4D10H_6R6HH
4D10H_6R7HH
4D10H_6R7IH
4D10H_6R7NH
4D6WA_4D6WA
4D6WA_4D6WB
4D6WA_5MDMA
4D6WA_5MDMC
4D6WA_5MDMF
4DAGH_4XMKJ
4DAGH_5WCDH
4DAGH_7BEIH
4DAGH_7E5YC
4DAGH_7L7RB
4DJDC_4DJDC
4DJDC_4DJDE
4DJDC_4DJEC
4DJDC_4DJEE
4DJDC_4DJFE
4DPGI_4DPGI
4DPGI_4DPGJ
4DPGI_4DPGL
4DPGI_4YCUC
4DPGI_6ILDC
4DZDA_4DZDA
4DZDA_4TVXA
4DZDA_4U7UP
4DZDA_5CD4A
4DZDA_5H9EK
4E6HA_4E6HA
4E6HA_4EBAA
4E6HA_4EBAB
4E6HA_4EBAD
4E6HA_4EBAE
4EG1A_4EG4A
4EG1A_4EG7B
4EG1A_4EGAA
4EG1A_4MVXA
4EG1A_5J59B
4EHWA_4EHYA
4EHWA_4ITLA
4EHWA_4ITMA
4EHWA_4ITNA
4EHWA_4LKVB
4EIYA_5IUAA
4EIYA_5UIGA
4EIYA_7EZCA
4EIYA_7EZCB
4EIYA_8GNEA
4EJ7A_4EJ7A
4EJ7A_4FEUB
4EJ7A_4FEUC
4EJ7A_4FEVE
4EJ7A_4GKHC
4ETVA_4ETVA
4ETVA_6MM6D
4ETVA_6MM6F
4ETVA_6MM7C
4ETVA_6MM7E
4F3TA_4KXTA
4F3TA_5VM9A
4F3TA_5WEAA
4F3TA_6MDZA
4F3TA_6MFNA
4F5CA_4F5CA
4F5CA_4F5CB
4F5CA_4FKEA
4F5CA_5LDSB
4F5CA_7VPPA
4FU3A_4FU3B
4FU3A_4HFGA
4FU3A_4HFGB
4FU3A_4Q94A
4FU3A_4Q96A
4FXGA_4FXGA
4FXGA_4FXKA
4FXGA_4XAMB
4FXGA_5JPNA
4FXGA_5JTWA
4FZQA_4FZQA
4FZQA_4FZQB
4FZQA_4FZQC
4FZQA_4FZQE
4FZQA_4FZQF
4G6FB_4G6FB
4G6FB_5GHWH
4G6FB_5IQ7H
4G6FB_5IQ9A
4G6FB_5T29H
4G6GA_4G6GA
4G6GA_4G6HA
4G6GA_5YJWA
4G6GA_5YJXA
4G6GA_5YJYB
4GC5A_4GC5A
4GC5A_4GC9A
4GC5A_8CSP5
4GC5A_8CSR5
4GC5A_8CSU5
4GLRH_5IG7J
4GLRH_6UTEI
4GLRH_7CJ2C
4GLRH_7KQKH
4GLRH_8E7MH
4GPKA_4GPKB
4GPKA_4GPKE
4GPKA_4GPKG
4GPKA_5DBKA
4GPKA_5DBKB
4GRCA_4GRCA
4GRCA_6I43A
4GRCA_6Q3DB
4GRCA_6TB8A
4GRCA_6TB8B
4H1SA_6VC9A
4H1SA_6VCAB
4H1SA_6VCAC
4H1SA_6XUEB
4H1SA_7JV8A
4HKRA_4HKRA
4HKRA_4HKRB
4HKRA_4HKSA
4HKRA_4HKSB
4HKRA_6BBGA
4HNWA_4HNYA
4HNWA_4HNYC
4HNWA_4XNHA
4HNWA_4Y49G
4HNWA_6HD5t
4HYEA_4HYEA
4HYEA_4HYEB
4HYEA_4ZMRB
4HYEA_4ZMSA
4HYEA_4ZMSB
4IHBA_4IHBA
4IHBA_4IQHC
4IHBA_7JOFB
4IHBA_7K6BA
4IHBA_7KRBA
4IUFA_4IUFA
4IUFA_4Y00A
4IUFA_4Y00B
4IUFA_4Y00C
4IUFA_4Y00D
4JDQA_4JDQA
4JDQA_4JDQF
4JDQA_4JDXB
4JDQA_4JDXC
4JDQA_4JDXF
4JPHA_4JPHA
4JPHA_4JPHB
4JPHA_4JPHD
4JPHA_5HK5E
4JPHA_5HK5H
4JVZA_4JVZC
4JVZA_4JVZE
4JVZA_4N8FA
4JVZA_4N8FE
4JVZA_7WKCA
4JZBA_4JZBA
4JZBA_4K10B
4JZBA_4K10C
4JZBA_4K10D
4JZBA_6VJCB
4K3VA_4K3VA
4K3VA_4NNOA
4K3VA_4NNPA
4K3VA_4NNPB
4K3VA_5HDQA
4KBLA_4KBLB
4KBLA_4KC9A
4KBLA_5UDHA
4KBLA_7B5LH
4KBLA_7B5NH
4KGHA_4KGHB
4KGHA_4KGOB
4KGHA_5I7JA
4KGHA_5I7LA
4KGHA_5I7LB
4KU7A_4KU8A
4KU7A_5J48A
4KU7A_5JAXA
4KU7A_5JD7A
4KU7A_5L0NA
4KZXl_4KZXl
4KZXl_4KZYl
4KZXl_6YBWp
4KZXl_6ZP4Z
4KZXl_6ZVJN
4L7ZA_4L80A
4L7ZA_6KINC
4L7ZA_6KIND
4L7ZA_6KINE
4L7ZA_6KKHA
4L8PA_4L8PA
4L8PA_4LEHA
4L8PA_4LEHC
4L8PA_4N3VA
4L8PA_4N3VB
4LDVA_4LDVA
4LDVA_4LDXA
4LDVA_4LDXB
4LDVA_4LDYB
4LDVA_6YCQB
4LVCA_4LVCA
4LVCA_4LVCD
4LVCA_5M5KD
4LVCA_5M67B
4LVCA_6EXIB
4MFJA_4MFJA
4MFJA_4MFKA
4MFJA_4MFPA
4MFJA_4MFQA
4MFJA_6IIXA
4ML1A_4ML1C
4ML1A_4ML6C
4ML1A_4ML6D
4ML1A_4MLYB
4ML1A_4MLYC
4MSOA_4MSOA
4MSOA_4MSOB
4MSOA_4N0WA
4MSOA_4N0WD
4MSOA_4OTLD
4MTKA_4MTKA
4MTKA_4UHVA
4MTKA_4UHVB
4MTKA_6H3LA
4MTKA_6H3LC
4MYCA_4MYCA
4MYCA_4MYCB
4MYCA_7PSLA
4MYCA_7PSMA
4MYCA_7PSNA
4N0NA_4N0NA
4N0NA_4N0OA
4N0NA_4N0OC
4N0NA_4N0OE
4N0NA_4N0OG
4N7WA_4N7WA
4N7WA_4N7WB
4N7WA_4N7XA
4N7WA_6LGVA
4N7WA_6LH1A
4NFTA_4NFTD
4NFTA_5CHLB
4NFTA_5VEYA
4NFTA_7VCLA
4NFTA_7WLPA
4NXTA_4NXTB
4NXTA_4NXWA
4NXTA_4OAIZ
4NXTA_5X9BA
4NXTA_5X9CB
4O64A_4O64B
4O64A_4O64C
4O64A_6ASBC
4O64A_6ASBF
4O64A_6ASBI
4OCJA_4OCJA
4OCJA_4OCOA
4OCJA_4OCVA
4OCJA_4WH1A
4OCJA_4WH3A
4OIGA_4OIGA
4OIGA_4OIGB
4OIGA_4OIGD
4OIGA_4OIGE
4OIGA_7BSCA
4OO1J_5C0WK
4OO1J_5C0XK
4OO1J_5K36J
4OO1J_6FSZKK
4OO1J_7D4Ir6
4P6VC_4P6VC
4P6VC_4U9SC
4P6VC_7XK6C
4P6VC_8A1YC
4P6VC_8EVUC
4PL0A_4PL0A
4PL0A_4PL0B
4PL0A_5OFPA
4PL0A_5OFRA
4PL0A_5OFRB
4PLQA_4PLSA
4PLQA_4PLSB
4PLQA_4PLSC
4PLQA_5AEIB
4PLQA_5MFGA
4PMWA_4PMWA
4PMWA_4PMWB
4PMWA_8E28A
4PMWA_8E29A
4PMWA_8E2AA
4Q37A_4Q37A
4Q37A_4Q37C
4Q37A_4Q37D
4Q37A_4Q37E
4Q37A_4Q37F
4QFKA_4QFKD
4QFKA_4QFKE
4QFKA_4QFLA
4QFKA_4QFNB
4QFKA_4QFOA
4QJBA_4QJBA
4QJBA_4QJBB
4QJBA_4ZEVA
4QJBA_4ZEVB
4QJBA_4ZEWB
4QNCA_4QNCB
4QNCA_5UHQA
4QNCA_5UHQB
4QNCA_5UHQD
4QNCA_5UHSA
4QRZA_4QRZA
4QRZA_4QSCA
4QRZA_4QSDA
4QRZA_4QSEA
4QRZA_4RJZA
4QS4A_4QS4A
4QS4A_5AX6A
4QS4A_5YPZA
4QS4A_5YPZB
4QS4A_5YPZC
4QX6A_4QX6C
4QX6A_5JYAD
4QX6A_5JYFA
4QX6A_5JYFC
4QX6A_6ITEQ
4QYZD_4TVXN
4QYZD_4TVXS
4QYZD_4U7UQ
4QYZD_5CD4B
4QYZD_5CD4G
4REGA_4REGA
4REGA_7TR6J
4REGA_7TR9I
4REGA_7TRAH
4REGA_7TRAM
4S04A_4S04A
4S04A_4S04B
4S04A_4S04E
4S04A_4S04F
4S04A_4S05B
4TL6A_4TLAE
4TL6A_4TLBA
4TL6A_4TLBE
4TL6A_6X61E
4TL6A_6X61G
4U3MSM_4U3MSM
4U3MSM_4U4RsM
4U3MSM_5DATsM
4U3MSM_5DATSM
4U3MSM_5ON6i
4UI9J_5G04J
4UI9J_6TM5J
4UI9J_6TM5K
4UI9J_6TNTJ
4UI9J_6TNTK
4UI9X_5G04X
4UI9X_6TNTX
4UI9X_6TNTY
4UI9X_7QE7Y
4UI9X_7QE7Z
4UT9H_4UT9H
4UT9H_5H37I
4UT9H_6OL6H
4UT9H_7M30F
4UT9H_7V3HK
4UX8A_4UX8B
4UX8A_6GL7E
4UX8A_6Q2JE
4UX8A_6Q2OE
4UX8A_6Q2SE
4V0KA_4V0KA
4V0KA_4V0KB
4V0KA_4V0LB
4V0KA_4V0MA
4V0KA_4V0OA
4V0QA_5CCVB
4V0QA_5CCVE
4V0QA_5CCVH
4V0QA_5DTOA
4V0QA_5JJSA
4V1Ai_6VLZd
4V1Ai_6ZM5d
4V1Ai_6ZM6d
4V1Ai_7OI6d
4V1Ai_8OINBu
4V4NA5_4V4NA5
4V4NA5_4V4NAK
4V4NA5_4V6UBK
4V4NA5_6SKFBM
4V4NA5_6TH6BN
4V4NBF_4V4NBF
4V4NBF_4V6UAF
4V4NBF_5JB3F
4V4NBF_6SW9F
4V4NBF_7ZHGF
4V4NBT_4V4NBT
4V4NBT_4V6UAT
4V4NBT_5JB3T
4V4NBT_6SW9T
4V4NBT_7ZAGT
4V61AE_4V61AE
4V61AE_5MMJe
4V61AE_5MMMe
4V61AE_5X8Pe
4V61AE_6ERIBE
4V61AF_4V61AF
4V61AF_5MMJf
4V61AF_5MMMf
4V61AF_5X8Pf
4V61AF_6ERIBF
4V61B2_4V61B2
4V61B2_5MLC2
4V61B2_5MMI1
4V61B2_5X8P1
4V61B2_6ERIAa
4V61B5_4V61B5
4V61B5_5MLC5
4V61B5_5MMI4
4V61B5_5X8P4
4V61B5_6ERIAd
4V61BE_4V61BE
4V61BE_5H1SE
4V61BE_5MMIC
4V61BE_5MMMC
4V61BE_5X8PC
4V61BV_4V61BV
4V61BV_5H1SV
4V61BV_5MLCV
4V61BV_5MMMU
4V61BV_6ERIAT
4V6WAf_4V6WAf
4V6WAf_6XU7Af
4V6WAf_6XU8Af
4V6WAf_7OLDSf
4V6WAf_7Z3OSf
4V81C_4V81K
4V81C_4V8RAG
4V81C_5GW5G
4V81C_7YLUG
4V81C_7YLVg
4V8MA2_4V8MA2
4V8MA2_5OPTO
4V8MA2_5T2AAN
4V8MA2_6AZ1H
4V8MA2_7ASEo
4V92Bf_4V92Bf
4V92Bf_5MC6N
4V92Bf_6EMLN
4V92Bf_6WOOff
4V92Bf_7MPIBf
4W6ZA_4W6ZA
4W6ZA_4W6ZB
4W6ZA_4W6ZD
4W6ZA_7KCBA
4W6ZA_7NTMC
4WCEV_4WCEV
4WCEV_4WFBV
4WCEV_5TCULB
4WCEV_7ASOd
4WCEV_7TTWK
4WGIA_4WMXA
4WGIA_6DM8A
4WGIA_6DM8D
4WGIA_6DM8G
4WGIA_8G3SA
4WVYA_4WVYA
4WVYA_5FM6A
4WVYA_6FHSA
4WVYA_6FHSB
4WVYA_6FHSC
4WZ7C_6GCSC
4WZ7C_6H8KC
4WZ7C_6RFQC
4WZ7C_6RFRC
4WZ7C_7B0ND
4X01A_4X01A
4X01A_4X01E
4X01A_4X01F
4X01A_4X01G
4X01A_4X01H
4X5MA_4X5MA
4X5MA_4X5MB
4X5MA_4X5NA
4X5MA_4X5NC
4X5MA_4X5ND
4X8WA_4X8WA
4X8WA_4X8WB
4X8WA_4X8WC
4X8WA_4X8WD
4X8WA_4X8WE
4XA8A_4XA8A
4XA8A_5VG6A
4XA8A_5VG6B
4XA8A_5VG6D
4XA8A_5VG6J
4XGCD_4XGCD
4XGCD_7JGRD
4XGCD_7JK3D
4XGCD_7JK5D
4XGCD_7JK6D
4XHRM_4XHRM
4XHRM_4YTXO
4XHRM_5JQLB
4XHRM_5JQOA
4XHRM_6KYLA
4XVPD_4XVPE
4XVPD_5DCQB
4XVPD_5DCQC
4XVPD_6GWDI
4XVPD_7Q1ED
4XZYA_4Y04A
4XZYA_5JWFA
4XZYA_5SDCA
4XZYA_7R51A
4XZYA_7R51B
4Y5UA_4Y5UA
4Y5UA_4Y5UB
4Y5UA_4Y5WA
4Y5UA_5D39C
4Y5UA_5D39D
4YMKA_4YMKA
4YMKA_4YMKD
4YMKA_4ZYOA
4YMKA_6WF2A
4YMKA_6WF2B
4YV6A_4YV6A
4YV6A_4YV6B
4YV6A_5W4NA
4YV6A_5W4NB
4YV6A_6W1AA
4YX1A_4YX1A
4YX1A_4YX1B
4YX1A_4YX5B
4YX1A_4YX7B
4YX1A_4YXAB
4Z3WE_4Z3WG
4Z3WE_4Z3XE
4Z3WE_4Z3ZE
4Z3WE_4Z3ZF
4Z3WE_4Z40E
5A1WH_5A1WH
5A1WH_5A1XH
5A1WH_5A1XP
5A1WH_5A1YH
5A1WH_5NZVD
5A5TL_5A5TL
5A5TL_6ZMW5
5A5TL_6ZVJL
5A5TL_7A09L
5A5TL_7QP65
5A9Q7_5A9QG
5A9Q7_7PEQBG
5A9Q7_7PEQCG
5A9Q7_7R5JO0
5A9Q7_7VOPL
5A9RA_5A9RA
5A9RA_5A9SB
5A9RA_5A9TA
5A9RA_5FWNA
5A9RA_5FWNB
5ADXA_5ADXI
5ADXA_6F1TA
5ADXA_6F38F
5ADXA_6F38G
5ADXA_6F38I
5AQ7A_5AQ7B
5AQ7A_5AQ8A
5AQ7A_5AQ9C
5AQ7A_5AQAB
5AQ7A_5AQBA
5B0UA_5B0UA
5B0UA_5IBOA
5B0UA_7SNSD
5B0UA_7SNWA
5B0UA_7SNYA
5BS1A_5BS1A
5BS1A_5BS1B
5BS1A_5BS1C
5BS1A_5BS1D
5BS1A_5BS2A
5BSMA_5BSMA
5BSMA_5BSTA
5BSMA_5BSWB
5BSMA_5U95B
5BSMA_5U95C
5BUQA_5BUQB
5BUQA_5BURA
5BUQA_5GTDA
5BUQA_5X8FA
5BUQA_5X8GB
5C6TA_5C6TA
5C6TA_5CXFB
5C6TA_5CXFC
5C6TA_7KDDA
5C6TA_7KDPA
5CJ8A_5CJ8A
5CJ8A_5CJBA
5CJ8A_5EIQA
5CJ8A_5EIVA
5CJ8A_5EIVB
5COLA_5COLA
5COLA_5COLB
5COLA_5D8HC
5COLA_5DARC
5COLA_5DARF
5CRAA_5CRAA
5CRAA_5CRBA
5CRAA_5CRCA
5CRAA_5CRCB
5CRAA_6WTGA
5CUFA_5CUFB
5CUFA_5CUFC
5CUFA_5DJ4B
5CUFA_5DJ4C
5CUFA_6N0MA
5CXBA_5CXCA
5CXBA_5CYKA
5CXBA_5EM2B
5CXBA_8I9YCD
5CXBA_8IA0CD
5CZRA_5CZRA
5CZRA_5CZRB
5CZRA_7N86A
5CZRA_7N86C
5CZRA_7N86D
5D0YA_5D0YA
5D0YA_5D0YB
5D0YA_5D3MC
5D0YA_5JSZC
5D0YA_7NNUC
5D98B_5D98B
5D98B_6XZDEP1
5D98B_6XZPBP1
5D98B_6XZREP1
5D98B_6Y0CB
5DPOA_5DPOA
5DPOA_6K3BB
5DPOA_6LW4B
5DPOA_7BXFC
5DPOA_7BXHA
5DWZC_5DWZC
5DWZC_5DWZF
5DWZC_6ET1G
5DWZC_6ET2A
5DWZC_6ET3C
5DZXA_5DZXA
5DZXA_5DZYB
5DZXA_5DZYC
5DZXA_5DZYE
5DZXA_5DZYF
5E9TA_5E9TA
5E9TA_5E9TC
5E9TA_5E9UA
5E9TA_5E9UE
5E9TA_5E9UG
5EGPA_5EGPA
5EGPA_5JGJA
5EGPA_5JGKA
5EGPA_5JGLA
5EGPA_5JGLB
5ERDA_5ERDA
5ERDA_7A7DA
5ERDA_7A7DB
5ERDA_7A7DC
5ERDA_7A7DE
5EXIA_5EXIA
5EXIA_5EXJA
5EXIA_5EXKA
5EXIA_5EXKI
5EXIA_5EXKK
5F17A_5F17A
5F17A_5F17D
5F17A_5HC1A
5F17A_5HC1B
5F17A_5HC1C
5F3KA_5F3KA
5F3KA_5F3KB
5F3KA_5F5RA
5F3KA_5F5RB
5F3KA_7C7BA
5F4HA_5F4HA
5F4HA_5F4HC
5F4HA_5F4HD
5F4HA_5F4HF
5F4HA_5YWWA
5FJ9I_5FJAI
5FJ9I_7Z0HI
5FJ9I_7Z1LI
5FJ9I_7Z1MI
5FJ9I_7Z2ZI
5FMGL_5FMGZ
5FMGL_6MUVL
5FMGL_6MUXL
5FMGL_7LXTL
5FMGL_7LXUL
5FS4A_5JZRA
5FS4A_5JZRB
5FS4A_5LQPAB
5FS4A_5LQPAC
5FS4A_5LQPAD
5FVCA_5FVCD
5FVCA_5FVCG
5FVCA_5FVCJ
5FVCA_5FVDA
5FVCA_5FVDC
5FWKA_5FWKB
5FWKA_7KW7A
5FWKA_7KW7B
5FWKA_7ZR6A
5FWKA_7ZR6B
5G2XC_5G2XC
5G2XC_7D0FC
5G2XC_7D0GC
5G2XC_7D1AC
5G2XC_8H2HD
5GGFA_5GGFA
5GGFA_5GGFB
5GGFA_5GGFC
5GGFA_5GGGA
5GGFA_5GGIB
5GJQR_5T0CAY
5GJQR_6EPCR
5GJQR_6EPER
5GJQR_6MSEY
5GJQR_7W3HY
5GJQV_5LN3V
5GJQV_5T0CAc
5GJQV_6EPCV
5GJQV_6EPDV
5GJQV_7W3Jc
5GKEA_5GKEA
5GKEA_5GKGB
5GKEA_5GKHA
5GKEA_5GKJA
5GKEA_5GKJB
5GMKP_5GMKP
5GMKP_5MPSK
5GMKP_5Y88Q
5GMKP_5YLZQ
5GMKP_6J6GP
5GMUA_5GMUA
5GMUA_5GMUB
5GMUA_5GO2A
5GMUA_5GO2B
5GMUA_5GO2D
5GREA_5YVTA
5GREA_6KDFA
5GREA_6KE3E
5GREA_8GRUA
5GREA_8GS5A
5GUPa_5O31h
5GUPa_6G2Jh
5GUPa_6ZKVW
5GUPa_7DGQb
5GUPa_7DH0b
5GUPb_5LNKr
5GUPb_6G72i
5GUPb_6QC7B6
5GUPb_6ZKAr
5GUPb_7DGSc
5GUPW_5XTDW
5GUPW_6QBXAM
5GUPW_7DGSV
5GUPW_7QSDZ
5GUPW_7R4GZ
5GW0A_5GW0A
5GW0A_5GW0E
5GW0A_5GW0F
5GW0A_5GW1A
5GW0A_5GW1F
5H1AA_5H1AA
5H1AA_5H1AB
5H1AA_5H1AC
5H1AA_7CUOA
5H1AA_7CUOB
5HJIA_5HJJA
5HJIA_5HJMA
5HJIA_5WT1A
5HJIA_5WT1B
5HJIA_5WT3A
5HUQA_5HUQA
5HUQA_5HUQB
5HUQA_6C1WB
5HUQA_6C1WC
5HUQA_8EZHB
5I5DA_5I5FB
5I5DA_5I5HA
5I5DA_6VATC
5I5DA_6VATF
5I5DA_6VDFB
5IBLL_5IBLL
5IBLL_6UTKL
5IBLL_6UUHB
5IBLL_6UUHD
5IBLL_6UULD
5IDEB_5IDEB
5IDEB_5IDFB
5IDEB_5IDFD
5IDEB_6NJMC
5IDEB_6NJNC
5IJNG_5IJNG
5IJNG_7R5JI0
5IJNG_7WKKI
5IJNG_7WKKk
5IJNG_7WKKK
5IL0B_5IL2B
5IL0B_6TTVB
5IL0B_6Y4GB
5IL0B_7O2FB
5IL0B_7RX6B
5IPPA_5IPPA
5IPPA_5IUFA
5IPPA_5IUFC
5IPPA_5IZOB
5IPPA_5IZOC
5IQR8_5IQR8
5IQR8_5KPV33
5IQR8_5KPW33
5IQR8_5KPX33
5IQR8_5L3Pz
5IX1A_5IX1A
5IX1A_5IX1B
5IX1A_5IX2A
5IX1A_5IX2B
5IX1A_6O1EA
5J09A_5J09A
5J09A_5J09D
5J09A_5J09F
5J09A_5J36A
5J09A_5J37B
5JPQi_5JPQj
5JPQi_5OQLj
5JPQi_6RXTCR
5JPQi_6RXVCR
5JPQi_6RXZCR
5K47A_5MKEA
5K47A_5MKFA
5K47A_6A70A
5K47A_6A70F
5K47A_6D1WA
5KENC_5KENC
5KENC_5KENG
5KENC_5KENN
5KENC_7URAH
5KENC_7UREH
5KGZA_5KGZA
5KGZA_7T8UA
5KGZA_7T8UB
5KGZA_7T8UC
5KGZA_7T8UD
5KK2E_5KK2E
5KK2E_5KK2F
5KK2E_5VOTE
5KK2E_5VOUF
5KK2E_6NJLH
5KKOA_5KKOA
5KKOA_5KKOB
5KKOA_5KKOC
5KKOA_5KKOD
5KKOA_5KKOE
5L1XA_5L1XA
5L1XA_5L1XK
5L1XA_7SEJB
5L1XA_7SEJC
5L1XA_8E15F
5L3SB_5L3SB
5L3SB_5L3SD
5L3SB_5L3SF
5L3SB_5L3SH
5L3SB_5L3WA
5L5KA_5L5KA
5L5KA_5L5LA
5L5KA_5L5LB
5L5KA_5L5NA
5L5KA_7M0RA
5LC5E_5LDWE
5LC5E_6QBXV2
5LC5E_7DGQ9
5LC5E_7DGS9
5LC5E_7QSDE
5LE2A_5LE2A
5LE2A_5LE2B
5LE2A_5LEBA
5LE2A_5LECA
5LE2A_5LEDA
5LF9A_5LORB
5LF9A_5LOUA
5LF9A_5R50A
5LF9A_5R56A
5LF9A_5R5JA
5LI0b_5ND8b
5LI0b_5NGMAb
5LI0b_6S0Xb
5LI0b_6S13b
5LI0b_7KWGb
5LI0e_5NGMAe
5LI0e_5T7VSD
5LI0e_6S13e
5LI0e_7ASOD
5LI0e_7ASPm
5LI0i_5LI0i
5LI0i_5NGMAi
5LI0i_6S13i
5LI0i_7ASOH
5LI0i_7ASPq
5LI6A_5LI6A
5LI6A_5LI7A
5LI6A_5LI8A
5LI6A_5LIEA
5LI6A_5LIEB
5LJ3T_5LJ3T
5LJ3T_5MQ0T
5LJ3T_5Y88H
5LJ3T_6J6Gv
5LJ3T_7B9VT
5LPUA_5N7DB
5LPUA_7P74A
5LPUA_7PC7A
5LPUA_7QQLB
5LPUA_7QQNA
5M2XA_5M2XA
5M2XA_5M2ZF
5M2XA_5TMHB
5M2XA_5U0BA
5M2XA_6WCZB
5M30B_5M30B
5M30B_6GIYG
5M30B_6GIYH
5M30B_6GJ3G
5M30B_6GJ3H
5M64M_5M64M
5M64M_5W66M
5M64M_6H68M
5M64M_6RQTM
5M64M_6RWEM
5M7NA_5M7NB
5M7NA_5M7OA
5M7NA_5M7OB
5M7NA_5M7PA
5M7NA_5M7PB
5MKKB_5MKKB
5MKKB_6RAFB
5MKKB_6RAHB
5MKKB_6RAJB
5MKKB_6RAMB
5MLTA_5MLTA
5MLTA_5SUOA
5MLTA_5SWAA
5MLTA_5SWBC
5MLTA_5SWBG
5MQFQ_5MQFQ
5MQFQ_5XJCN
5MQFQ_6ZYMQ
5MQFQ_7ABFQ
5MQFQ_7ABGQ
5MRCEE_5MRCEE
5MRCEE_5MREEE
5MRCEE_5MRFEE
5MRCEE_8D8KE
5MRCEE_8D8LE
5MSMA_5MSMA
5MSMA_5OKCB
5MSMA_5OKIC
5MSMA_6S1CB
5MSMA_6S2EB
5MVWD_5MW0C
5MVWD_5MW0D
5MVWD_5MW9C
5MVWD_5MW9G
5MVWD_5MW9H
5MW5A_5MW5A
5MW5A_5MW7A
5MW5A_5MWFA
5MW5A_5MWFB
5MW5A_5MWFD
5MYJBP_5MYJBP
5MYJBP_6O8YN
5MYJBP_6O8ZN
5MYJBP_6W6PN
5MYJBP_7NHKP
5O3PA_6WUEA
5O3PA_6WUEB
5O3PA_7R2ZC
5O3PA_7R30A
5O3PA_7R31C
5O5JD_5V93d
5O5JD_6DZIl
5O5JD_7KGBd
5O5JD_7MSZd
5O5JD_8FR8e
5O5JG_5V93g
5O5JG_6DZIp
5O5JG_6JMKA
5O5JG_6JMKB
5O5JG_7MSCg
5O6SA_5O6SA
5O6SA_5O6SB
5O6SA_5O6SE
5O6SA_5O6TC
5O6SA_5O6TD
5OC9A_5OC9A
5OC9A_6R65A
5OC9A_6R7XA
5OC9A_6R7YA
5OC9A_6R7ZA
5OEYA_5OEYB
5OEYA_5OEYC
5OEYA_5OEYD
5OEYA_5OFUA
5OEYA_5OFUC
5OF9A_5OF9A
5OF9A_5OF9B
5OF9A_5OFAB
5OF9A_5OFBA
5OF9A_5OFBB
5OHKA_5OHKA
5OHKA_5OHNA
5OHKA_5OHPA
5OHKA_8D0AA
5OHKA_8D1TA
5OJ8A_5OJ8A
5OJ8A_7AI4A
5OJ8A_7AI4B
5OJ8A_7AIEA
5OJ8A_7AIEB
5QSTA_5QSTD
5QSTA_5QSVB
5QSTA_6QB5C
5QSTA_6RRCA
5QSTA_6RRCC
5RL6A_5RLKB
5RL6A_7CXNE
5RL6A_7EGQR
5RL6A_7RDXE
5RL6A_7RE3F
5SUPA_5SUPA
5SUPA_5SUPC
5SUPA_5SUQA
5SUPA_7LUVM
5SUPA_7V2YF
5SVDA_5SVDA
5SVDA_5WTXA
5SVDA_5WTYA
5SVDA_5WTYB
5SVDA_6WPIA
5T1PA_5T1PB
5T1PA_5T1PC
5T1PA_5T1PE
5T1PA_5T1PF
5T1PA_5T1PH
5TJ5B_5VOXR
5TJ5B_5VOYR
5TJ5B_5VOZR
5TJ5B_6M0RC
5TJ5B_7FDCT
5TVFB_5TVFB
5TVFB_5TVMB
5TVFB_5TVMD
5TVFB_5TVOA
5TVFB_6BM7B
5U07D_5U07E
5U07D_5U07H
5U07D_5U0AG
5U07D_6C66B
5U07D_6C66D
5UDQA_5UDRC
5UDQA_5UDTA
5UDQA_5UDWE
5UDQA_5UDXA
5UDQA_6UTQD
5UNFA_5UNFA
5UNFA_5UNGB
5UNFA_5UNHA
5UNFA_7JNIA
5UNFA_7JNIB
5USCA_5USCA
5USCA_5UYYB
5USCA_5UYYC
5USCA_6U60A
5USCA_6U60B
5UXBA_5UXBA
5UXBA_5UXBB
5UXBA_5UXCA
5UXBA_5UXDA
5UXBA_5UXDB
5UZ9A_5UZ9A
5UZ9A_6NE0A
5UZ9A_6VQVC
5UZ9A_7ELMA
5UZ9A_7YHSA
5V44A_5V44B
5V44A_5V44C
5V44A_5V45A
5V44A_5V45B
5V44A_5V46A
5VKUg_5VKUg
5VKUg_5VKUj
5VKUg_7ET3g
5VKUg_7ET3m
5VKUg_7ETJm
5VNYA_5VNYA
5VNYA_7ZRVE
5VNYA_7ZSDP
5VNYA_7ZSSh
5VNYA_7ZSSP
5W1GH_5W1GH
5W1GH_5W1MB
5W1GH_5W1MD
5W1GH_5W1MF
5W1GH_5W1MH
5W78A_5W78A
5W78A_5W7AA
5W78A_5W7BA
5W78A_5W7BB
5W78A_5W7CA
5WC6A_5WC6M
5WC6A_5WC8A
5WC6A_5WC8M
5WC6A_5WCNA
5WC6A_5WD7A
5WK5A_5WK5B
5WK5A_5WK6A
5WK5A_8SUGF
5WK5A_8SUGL
5WK5A_8SUGM
5WRHA_5WRHA
5WRHA_6JZRA
5WRHA_7BIN1
5WRHA_7BIN4
5WRHA_7CBMW
5X8OA_5X8OA
5X8OA_6JIGA
5X8OA_6JL8A
5X8OA_6JL8B
5X8OA_6LK4A
5XTSA_5XTSA
5XTSA_5XTWC
5XTSA_6INUA
5XTSA_6INVA
5XTSA_6IOEB
5XUAA_5XUAA
5XUAA_5XUAB
5XUAA_5XUBB
5XUAA_6ITSB
5XUAA_7WRMB
5XVIA_5XVIA
5XVIA_5XVIE
5XVIA_5XVVE
5XVIA_5XVXA
5XVIA_7ECTB
5Y6P44_5Y6P44
5Y6P44_5Y6Pc8
5Y6P44_5Y6PD2
5Y6P44_5Y6Pd8
5Y6P44_5Y6Pe8
5YQQA_5YQQA
5YQQA_5YQQB
5YQQA_5YS0A
5YQQA_5YS0B
5YQQA_5YS0C
5Z1GA_5Z1GA
5Z1GA_5Z1GC
5Z1GA_5Z3Gb
5Z1GA_7OHVJ
5Z1GA_7R6KJ
5Z6ZA_6U81A
5Z6ZA_6U82A
5Z6ZA_6U82D
5Z6ZA_7DW5A
5Z6ZA_7DW5B
5ZE7A_5ZE7A
5ZE7A_5ZE7B
5ZE7A_5ZERB
5ZE7A_5ZESB
5ZE7A_5ZFKA
5ZFQA_6OJXA
5ZFQA_6OJZC
5ZFQA_6OJZD
5ZFQA_6OJZF
5ZFQA_6OLLC
5ZGB2_5ZGB2
5ZGB2_5ZGH2
5ZGB2_6FOS2
5ZGB2_6FOS3
5ZGB2_7BLZ2
5ZZ8r_5ZZ8r
5ZZ8r_6CGRm
5ZZ8r_6M6Gl
5ZZ8r_6M6HI
5ZZ8r_6ODMG
6A27A_6A29A
6A27A_6A29C
6A27A_6A29E
6A27A_6BDUA
6A27A_6MC6B
6ACUC_6AD1C
6ACUC_6AKUC
6ACUC_6IIOC
6ACUC_6SNWC
6ACUC_7BZOC
6AJMA_6AJMA
6AJMA_6AJNA
6AJMA_6GTPA
6AJMA_7CHDA
6AJMA_7CHDB
6ANOA_6ANOA
6ANOA_6ANOB
6ANOA_6AOZC
6ANOA_6AOZD
6ANOA_6AP0A
6AVHA_6AVHA
6AVHA_6AVHB
6AVHA_6AVHC
6AVHA_6AVHD
6AVHA_6E1QA
6B4CA_6B4CB
6B4CA_6B4CD
6B4CA_6B4CK
6B4CA_7N7IA
6B4CA_7N7IC
6CAJI_6CAJJ
6CAJI_6O9ZI
6CAJI_7F66E
6CAJI_7RLOI
6CAJI_7TRJI
6CQDA_6CQDB
6CQDA_6NFZA
6CQDA_7L25A
6CQDA_7M0KA
6CQDA_7SIUB
6CYJA_6CYJB
6CYJA_6CYYA
6CYJA_6CYYB
6CYJA_6CZ6A
6CYJA_6D2SA
6DJLB_6DJLC
6DJLB_6DJLD
6DJLB_6IXFA
6DJLB_6IXVC
6DJLB_6IXVD
6DM9A_6DM9A
6DM9A_6DM9C
6DM9A_6DMAA
6DM9A_6DMAE
6DM9A_6DMAG
6ECBA_6ECBA
6ECBA_6ECDA
6ECBA_6ECFC
6ECBA_6ECFD
6ECBA_6ECFF
6EU2Q_6EU2Q
6EU2Q_6EU3Q
6EU2Q_6F40Q
6EU2Q_6F41Q
6EU2Q_6F44Q
6EZOE_6EZOF
6EZOE_6K71E
6EZOE_6K72E
6EZOE_7D43F
6EZOE_7D46E
6F1CA_6F1CA
6F1CA_6F1CC
6F1CA_6F1HC
6F1CA_6F39A
6F1CA_6F39B
6F2DA_6F2DA
6F2DA_6R69A
6F2DA_7BINA
6F2DA_7CGOCF
6F2DA_7CGOx
6FDDA_6FDDA
6FDDA_6FDDC
6FDDA_6FDDD
6FDDA_6FDDF
6FDDA_6FDEA
6G18y_6G18y
6G18y_6G5Iy
6G18y_6ZUOy
6G18y_6ZXDy
6G18y_6ZXFy
6GCS5_6GCS5
6GCS5_6RFQ5
6GCS5_6RFS5
6GCS5_7B0NL
6GCS5_7O715
6GIQc_6GIQc
6GIQc_6HU9c
6GIQc_7Z10c
6GIQc_8E7SO
6GIQc_8EC0O
6GIQi_6GIQi
6GIQi_6HU9i
6GIQi_6T0Bv
6GIQi_8E7SR
6GIQi_8EC0R
6H9CA_6H9CA
6H9CA_6QT9A
6H9CA_6QT9G
6H9CA_6QT9H
6H9CA_6QT9I
6HE4H_6HE4H
6HE4H_6HE4I
6HE4H_6HE4J
6HE4H_6HE4L
6HE4H_6HE4M
6HIVAt_6HIVAt
6HIVAt_6HIXAt
6HIVAt_6YXXAt
6HIVAt_6YXYAt
6HIVAt_7AOIAt
6HIVBH_6HIVBH
6HIVBH_6HIXBH
6HIVBH_6YXXBH
6HIVBH_6YXYBH
6HIVBH_7AOIBH
6HIVCa_6HIVCa
6HIVCa_6SGACa
6HIVCa_6SGBCa
6HIVCa_7AORCa
6HIVCa_7PUBCa
6HIVCI_6SG9CI
6HIVCI_6SGBCI
6HIVCI_7AORd
6HIVCI_7PUACI
6HIVCI_7PUBCI
6HIVDD_6HIVDD
6HIVDD_6SGADD
6HIVDD_6SGBDD
6HIVDD_7AORad
6HIVDD_7PUBDD
6HIVDH_6HIVDH
6HIVDH_6SG9DH
6HIVDH_6SGBDH
6HIVDH_7PUADH
6HIVDH_7PUBDH
6HIVDK_6HIVDK
6HIVDK_6SG9DK
6HIVDK_6SGBDK
6HIVDK_7PUADK
6HIVDK_7PUBDK
6HIVDO_6HIVDO
6HIVDO_6SGADO
6HIVDO_6SGBDO
6HIVDO_7PUADO
6HIVDO_7PUBDO
6HQCA_6HQCA
6HQCA_6QAYA
6HQCA_8AIFA
6HQCA_8AIFB
6HQCA_8AIFC
6HWJA_6HWJA
6HWJA_6HWJB
6HWJA_6HWKA
6HWJA_6HWKB
6HWJA_6HWLB
6IDEA_6IDEA
6IDEA_6IDEB
6IDEA_6KJUA
6IDEA_6KJUB
6IDEA_6UGLB
6IFSA_6IFSA
6IFSA_6IFXA
6IFSA_6IFXB
6IFSA_7V2MU
6IFSA_7V2NU
6II0A_6II0B
6II0A_6II0C
6II0A_6II0D
6II0A_6II6A
6II0A_6II6B
6IJO2_6IJO2
6IJO2_7BGI2
6IJO2_7D0J2
6IJO2_7DZ72
6IJO2_7DZ82
6IMJA_6IMJA
6IMJA_6IMJB
6IMJA_6IMLA
6IMJA_6IMNA
6IMJA_6IMNB
6J3Y11_6J3Y11
6J3Y11_6J3Z15
6J3Y11_6J3Z16
6J3Y11_7VD511
6J3Y11_7VD517
6J54i_6J5Ii
6J54i_6J5KAi
6J54i_6TT7O
6J54i_6ZBBk
6J54i_7AJFAk
6JEOaM_6K61m
6JEOaM_6PNJM
6JEOaM_6TCLMM
6JEOaM_7LX0M
6JEOaM_7QCOM
6JITA_6JITA
6JITA_6JITB
6JITA_6JITC
6JITA_6JIZB
6JITA_6JIZC
6JMXA_6JMYA
6JMXA_6JMZA
6JMXA_6JN7B
6JMXA_6JN8A
6JMXA_7E65A
6JNFB_6JNFG
6JNFB_6UCUE
6JNFB_6UCVE
6JNFB_6UCVm
6JNFB_6UCVM
6JXNA_6JXNB
6JXNA_6JXND
6JXNA_6JXSB
6JXNA_6QU0A
6JXNA_6QU0B
6JY5A_6JY5A
6JY5A_6JY5B
6JY5A_6JY5C
6JY5A_6JY5D
6JY5A_6JY5E
6K7XA_6K7XA
6K7XA_6K7YB
6K7XA_6K7YC
6K7XA_6O58M
6K7XA_6WDOA
6KE6RF_6KE6RF
6KE6RF_6LQSRF
6KE6RF_6ZQACN
6KE6RF_6ZQDCN
6KE6RF_7D4IRF
6KN7T_6KN7a
6KN7T_6KN7T
6KN7T_6KN8a
6KN7T_7KO5T
6KN7T_7KO7a
6KVNA_6KVNA
6KVNA_6KVOA
6KVNA_6KVOB
6KVNA_6LCTA
6KVNA_6LCTB
6LKGA_6LKGA
6LKGA_6LKIA
6LKGA_6LKKA
6LKGA_6LKLA
6LKGA_6LKLB
6LNBB_6LNBE
6LNBB_6LNCB
6LNBB_6PIFB
6LNBB_6PIFF
6LNBB_6UVNH
6LTNA_6M02A
6LTNA_6M66A
6LTNA_6M67A
6LTNA_6WBMA
6LTNA_7DWBA
6MZEE_6MZEL
6MZEE_6MZES
6MZEE_6MZFE
6MZEE_6MZGE
6MZEE_6MZGK
6N5MB_6N5MB
6N5MB_6N5MD
6N5MB_7EALB
6N5MB_7EALC
6N5MB_7EALE
6NCLc6_6NCLc6
6NCLc6_6NCLc8
6NCLc6_8H2IbI
6NCLc6_8H2IbJ
6NCLc6_8H2IbK
6NJ8A_6NJ8A
6NJ8A_6NJ8B
6NJ8A_6NJ8C
6NJ8A_7MH2A
6NJ8A_7MH2C
6NR83_6NR83
6NR83_6NR93
6NR83_6NRB3
6NR83_6NRD3
6NR83_7WU73
6NR85_6NR85
6NR85_6NRB5
6NR85_6NRC5
6NR85_6NRD5
6NR85_7WU75
6NR86_6NR86
6NR86_6NR96
6NR86_6NRB6
6NR86_6NRD6
6NR86_7WU76
6NROB_6NROD
6NROB_6NTXC
6NROB_6O40B
6NROB_6PRLD
6NROB_6V3VF
6O6JA_6O6JA
6O6JA_6O7AA
6O6JA_6O7AB
6O6JA_6O7AD
6O6JA_6O7CA
6O6NA_6O6NA
6O6NA_6O6OA
6O6NA_6O6OB
6O6NA_6O6PA
6O6NA_6O6PB
6O6SA_6O6SB
6O6SA_6O71A
6O6SA_6O71B
6O6SA_6OV0A
6O6SA_6OV0B
6OECA_6OECA
6OECA_6OECH
6OECA_6OECI
6OECA_6OECK
6OECA_6OECL
6P4LA_6P4LA
6P4LA_6P4LB
6P4LA_7N7FA
6P4LA_7N7FB
6P4LA_7N7FC
6P7OA_6P7OA
6P7OA_6P7PA
6P7OA_6P7PB
6P7OA_6P7PC
6P7OA_6P7QB
6P8VA_6P8VA
6P8VA_6P8VB
6P8VA_6P8VD
6P8VA_6P8VE
6P8VA_6P8VF
6PQEA_6PQEA
6PQEA_6PQMA
6PQEA_6PRIA
6PQEA_6PRJA
6PQEA_6PRPA
6PQNA_6PQNA
6PQNA_6PQNB
6PQNA_6PQRA
6PQNA_6PQXA
6PQNA_6PQYA
6Q2CA_6Q2CB
6Q2CA_6UW2B
6Q2CA_6UW2D
6Q2CA_6UW2F
6Q2CA_7LESB
6Q45D_6Q45D
6Q45D_6Q45E
6Q45D_6Q45L
6Q45D_6Q45M
6Q45D_6Q45N
6Q8FA_6Q8FA
6Q8FA_6Q8FB
6Q8FA_6Q8IE
6Q8FA_6Q8IN
6Q8FA_6Q8JA
6Q975_6Q975
6Q975_6Q985
6Q975_7ABZ5
6Q975_7AC75
6Q975_7ACR5
6RM8C_6RM8C
6RM8C_6RM9C
6RM8C_6RMBD
6RM8C_6RMCC
6RM8C_6RMCD
6RXTUE_6RXTUE
6RXTUE_6RXTUI
6RXTUE_6RXVUE
6RXTUE_6RXYUE
6RXTUE_6RXZUE
6S2CA_6S2CA
6S2CA_6S2CB
6S2CA_7D0KA
6S2CA_7D0LA
6S2CA_7D0LB
6S3DM_6S3DM
6S3DM_6S3DN
6S3DM_6S3DO
6S3DM_6S3DP
6S3DM_6XWIA
6S6BA_6S6BB
6S6BA_6S8BC
6S6BA_6S8EA
6S6BA_6S91B
6S6BA_6S91C
6SB3A_6SB3B
6SB3A_6SB5A
6SB3A_8A1DD
6SB3A_8A1DG
6SB3A_8A1DN
6SGACd_6SGACd
6SGACd_7ANEq
6SGACd_7AORq
6SGACd_7PUACd
6SGACd_7PUBCd
6SPBF_6SPBF
6SPBF_6SPDF
6SPBF_6SPFF
6SPBF_6SPGF
6SPBF_7UNRF
6SPCh_6SPCh
6SPCh_6SPEh
6SPCh_6SPFh
6SPCh_7UNUh
6SPCh_7UNVh
6SSJA_6SSKC
6SSJA_6SSKF
6SSJA_6SSLE
6SSJA_6SSLI
6SSJA_8C9MA
6T3ZA_6T3ZA
6T3ZA_7T7IA
6T3ZA_7T7IC
6T3ZA_7T7IE
6T3ZA_7T7II
6TAQA_6TAQA
6TAQA_6TAQB
6TAQA_6TAQC
6TAQA_6TAQD
6TAQA_6TAUD
6TB9A4_6TB9A5
6TB9A4_6TB9M4
6TB9A4_6TB9U4
6TB9A4_6TSUP4
6TB9A4_6TSUU4
6TDUAD_6TDUAD
6TDUAD_6TDUAE
6TDUAD_6TDUAF
6TDUAD_6TE0D
6TDUAD_6TE0E
6TMTA_6TMTA
6TMTA_6TMUA
6TMTA_6TMUF
6TMTA_6TMVA
6TMTA_6TMVF
6TP9A_6TP9A
6TP9A_6TP9E
6TP9A_6TP9G
6TP9A_6TP9H
6TP9A_6TP9I
6TYTA_6TYTA
6TYTA_6TYUA
6TYTA_6TYWA
6TYTA_6TYXA
6TYTA_6TYXB
6U0TA_6U0TA
6U0TA_8G2Z3X
6U0TA_8G2Z4X
6U0TA_8G3D1X
6U0TA_8G3D4X
6VLZv_6VLZv
6VLZv_6VMIv
6VLZv_6YDPBC
6VLZv_6YDWBC
6VLZv_7A5Kr1
6VO6A_6VO6B
6VO6A_6VO6C
6VO6A_6VO6D
6VO6A_6VO8A
6VO6A_6VO8B
6VOYA_6VOYB
6VOYA_6VOYC
6VOYA_6VOYD
6VOYA_7OUFB
6VOYA_7PELA
6VQ6L_6VQ6L
6VQ6L_6WLZN
6VQ6L_6XBWL
6VQ6L_7KHRL
6VQ6L_7U8QL
6W192_6W192
6W192_6W19p
6W192_6W2Em
6W192_7BR77
6W192_7BSIc
6W1SE_6W1SE
6W1SE_7EMFH
6W1SE_7ENAh
6W1SE_7ENCh
6W1SE_7LBMh
6WMAA_6WMAA
6WMAA_6WMBA
6WMAA_6WMCA
6WMAA_8CX1F
6WMAA_8CX2A
6WXWA_6WXWA
6WXWA_6WXXC
6WXWA_6WXXD
6WXWA_6WXYB
6WXWA_6XL1A
6X5ZO_6X5ZO
6X5ZO_6X5ZP
6X5ZO_8EFHO
6X5ZO_8EFIO
6X5ZO_8ENCP
6X6AC_6X6AC
6X6AC_6X6AF
6X6AC_6X6AG
6X6AC_6X6AI
6X6AC_6X6CF
6X89G1_6X89G1
6X89G1_7A23p
6X89G1_7A23q
5059
+ 6X89G1_7AQQz
5060
+ 6X89G1_8E73G2
5061
+ 6X91A_6X91A
5062
+ 6X91A_6X91B
5063
+ 6X91A_6X91E
5064
+ 6X91A_6X91F
5065
+ 6X91A_6X91G
5066
+ 6XGPA_6XGPA
5067
+ 6XGPA_6XGPB
5068
+ 6XGPA_6XGQA
5069
+ 6XGPA_6XGQC
5070
+ 6XGPA_6XGQG
5071
+ 6XHPA_6XHQA
5072
+ 6XHPA_6XHSD
5073
+ 6XHPA_6XHSE
5074
+ 6XHPA_6XHSF
5075
+ 6XHPA_6XHTB
5076
+ 6Y61A_6Y61B
5077
+ 6Y61A_7B6SF
5078
+ 6Y61A_7B6TB
5079
+ 6Y61A_7B6TI
5080
+ 6Y61A_7B6VH
5081
+ 6YCXF_6YCXF
5082
+ 6YCXF_6YCXG
5083
+ 6YCXF_6YCYE
5084
+ 6YCXF_6YCZC
5085
+ 6YCXF_6ZN3A
5086
+ 6YNXR_6YNXr
5087
+ 6YNXR_6YNXR
5088
+ 6YNXR_6YNZr3
5089
+ 6YNXR_6YNZR3
5090
+ 6YNXR_6YNZR
5091
+ 6YPZA_6YPZB
5092
+ 6YPZA_6YQ0A
5093
+ 6YPZA_6YQ0B
5094
+ 6YPZA_6YQ3B
5095
+ 6YPZA_6YQ6B
5096
+ 6ZHFA_6ZHFA
5097
+ 6ZHFA_6ZHGA
5098
+ 6ZHFA_6ZHHE
5099
+ 6ZHFA_6ZHHF
5100
+ 6ZHFA_6ZHHG
5101
+ 6ZJ8A_6ZJ8A
5102
+ 6ZJ8A_6ZJ8C
5103
+ 6ZJ8A_6ZJ8F
5104
+ 6ZJ8A_6ZJ8G
5105
+ 6ZJ8A_6ZJ8H
5106
+ 6ZXBA_6ZXBA
5107
+ 6ZXBA_6ZXBB
5108
+ 6ZXBA_6ZXCC
5109
+ 6ZXBA_6ZXMC
5110
+ 6ZXBA_6ZXMD
5111
+ 7A6HG_7AEAG
5112
+ 7A6HG_7ASTJ
5113
+ 7A6HG_7D58G
5114
+ 7A6HG_8ITYG
5115
+ 7A6HG_8IUHG
5116
+ 7ADSA_7ADSA
5117
+ 7ADSA_7ADTA
5118
+ 7ADSA_7P0SA
5119
+ 7ADSA_7P0UA
5120
+ 7ADSA_7P0UF
5121
+ 7ANCA_7ANCA
5122
+ 7ANCA_7CXTA
5123
+ 7ANCA_7CXTB
5124
+ 7ANCA_7M13A
5125
+ 7ANCA_7M13B
5126
+ 7AQCR_7AQCR
5127
+ 7AQCR_7AS80
5128
+ 7AQCR_7AS90
5129
+ 7AQCR_7ASA0
5130
+ 7AQCR_7OPE0
5131
+ 7BTWA_7BTWA
5132
+ 7BTWA_7BTWD
5133
+ 7BTWA_7BTXA
5134
+ 7BTWA_7E4HA
5135
+ 7BTWA_7VKUA
5136
+ 7C42A_7C42A
5137
+ 7C42A_7C43A
5138
+ 7C42A_7C45A
5139
+ 7C42A_7C47A
5140
+ 7C42A_7C4CA
5141
+ 7C4PA_7C4PA
5142
+ 7C4PA_7C4QA
5143
+ 7C4PA_7C4QB
5144
+ 7C4PA_7C4RB
5145
+ 7C4PA_7ELKA
5146
+ 7C4SA_7C4SA
5147
+ 7C4SA_7C4SB
5148
+ 7C4SA_7EW2R
5149
+ 7C4SA_7EW3R
5150
+ 7C4SA_7EW4R
5151
+ 7C9XA_7EAIA
5152
+ 7C9XA_7EAJ1
5153
+ 7C9XA_8GSC1
5154
+ 7C9XA_8GSD1
5155
+ 7C9XA_8GSF1
5156
+ 7CK6E_7CK6E
5157
+ 7CK6E_7CK6F
5158
+ 7CK6E_7CP9C
5159
+ 7CK6E_7VBYA
5160
+ 7CK6E_7VDDF
5161
+ 7D2SA_7D2SA
5162
+ 7D2SA_7D2TA
5163
+ 7D2SA_7D2UA
5164
+ 7D2SA_7LT8A
5165
+ 7D2SA_7LT9A
5166
+ 7D7RA_7D7RB
5167
+ 7D7RA_7DNYA
5168
+ 7D7RA_7DNZB
5169
+ 7D7RA_7EKLA
5170
+ 7D7RA_7EKMA
5171
+ 7DDQa_7DDQn
5172
+ 7DDQa_7DDQu
5173
+ 7DDQa_7F0L1
5174
+ 7DDQa_7VOR6
5175
+ 7DDQa_7VOYC
5176
+ 7DH7A_7DH7A
5177
+ 7DH7A_7DH7B
5178
+ 7DH7A_7DH7C
5179
+ 7DH7A_7DH7D
5180
+ 7DH7A_7DH8A
5181
+ 7DN2a_7DN2a
5182
+ 7DN2a_7DN2g
5183
+ 7DN2a_7DN2i
5184
+ 7DN2a_7F2Pg
5185
+ 7DN2a_7F2Pi
5186
+ 7DT0A_7DT0A
5187
+ 7DT0A_7DT0C
5188
+ 7DT0A_7DT0E
5189
+ 7DT0A_7DZZB
5190
+ 7DT0A_8H7TB
5191
+ 7DWNA_7DWNA
5192
+ 7DWNA_7DWNC
5193
+ 7DWNA_7DWND
5194
+ 7DWNA_7DWOA
5195
+ 7DWNA_7DWOB
5196
+ 7E2CI_7E2CI
5197
+ 7E2CI_7E8SI
5198
+ 7E2CI_7EA3I
5199
+ 7E2CI_7U06b
5200
+ 7E2CI_7U06B
5201
+ 7EC1A_7EC1A
5202
+ 7EC1A_7EC3C
5203
+ 7EC1A_7VFKB
5204
+ 7EC1A_7VFNC
5205
+ 7EC1A_7VFOB
5206
+ 7EDXA_7EG7A
5207
+ 7EDXA_7EGCA
5208
+ 7EDXA_7EGEA
5209
+ 7EDXA_7EGIA
5210
+ 7EDXA_7ENADA
5211
+ 7EMFF_7EMFF
5212
+ 7EMFF_7ENAf
5213
+ 7EMFF_7ENCf
5214
+ 7EMFF_8GXQf
5215
+ 7EMFF_8GXSf
5216
+ 7EQEA_7EQEA
5217
+ 7EQEA_7EQEB
5218
+ 7EQEA_7EQFA
5219
+ 7EQEA_7EQFB
5220
+ 7EQEA_7EQFC
5221
+ 7ET4A_7ET4A
5222
+ 7ET4A_7ET4D
5223
+ 7ET4A_7ET4G
5224
+ 7ET4A_7ET4J
5225
+ 7ET4A_7ET5A
5226
+ 7EWIA_7EWIA
5227
+ 7EWIA_7EWIC
5228
+ 7EWIA_7EWJB
5229
+ 7EWIA_7EWJG
5230
+ 7EWIA_7EWJH
5231
+ 7EY0A_7WLZP
5232
+ 7EY0A_7WS2E
5233
+ 7EY0A_7ZR7F
5234
+ 7EY0A_8F0GX
5235
+ 7EY0A_8F0HB
5236
+ 7EYIG_7EYIG
5237
+ 7EYIG_7N5SA
5238
+ 7EYIG_7N5VB
5239
+ 7EYIG_8E3EB
5240
+ 7EYIG_8E3EF
5241
+ 7F0OA_7F0OA
5242
+ 7F0OA_7F0OB
5243
+ 7F0OA_7F0YA
5244
+ 7F0OA_7F10B
5245
+ 7F0OA_7F11A
5246
+ 7F0RA_7F0RA
5247
+ 7F0RA_7F0RB
5248
+ 7F0RA_7XL3A
5249
+ 7F0RA_7XL4B
5250
+ 7F0RA_7XYAB
5251
+ 7F9KA_7F9KA
5252
+ 7F9KA_7F9LA
5253
+ 7F9KA_7F9LB
5254
+ 7F9KA_7F9LC
5255
+ 7F9KA_7F9LD
5256
+ 7FBIN_7FBIN
5257
+ 7FBIN_8HEBE
5258
+ 7FBIN_8HECE
5259
+ 7FBIN_8HECG
5260
+ 7FBIN_8HECI
5261
+ 7FJMA_7FJPB
5262
+ 7FJMA_7FJQA
5263
+ 7FJMA_7N70A
5264
+ 7FJMA_7N73A
5265
+ 7FJMA_7VPLA
5266
+ 7K9RA_7K9RB
5267
+ 7K9RA_7K9RC
5268
+ 7K9RA_7K9SD
5269
+ 7K9RA_7K9VA
5270
+ 7K9RA_7K9WB
5271
+ 7KHAB_7KHAB
5272
+ 7KHAB_8DEJB
5273
+ 7KHAB_8DFAG
5274
+ 7KHAB_8DFAH
5275
+ 7KHAB_8DFOB
5276
+ 7KHAI_7KHAI
5277
+ 7KHAI_8DEJI
5278
+ 7KHAI_8DFAI
5279
+ 7KHAI_8DFOI
5280
+ 7KHAI_8DFSI
5281
+ 7KRZA_7KSMA
5282
+ 7KRZA_7KSMB
5283
+ 7KRZA_7KSME
5284
+ 7KRZA_7P09C
5285
+ 7KRZA_7P0BA
5286
+ 7KZMZ_7KZMZ
5287
+ 7KZMZ_7KZNZ
5288
+ 7KZMZ_8GLVFn
5289
+ 7KZMZ_8GLVLN
5290
+ 7KZMZ_8GLVMO
5291
+ 7L6KA_7L6KA
5292
+ 7L6KA_7LF7K
5293
+ 7L6KA_7LF7M
5294
+ 7L6KA_7LFAA
5295
+ 7L6KA_7LFAC
5296
+ 7LFTA_7LFWA
5297
+ 7LFTA_7O4HA
5298
+ 7LFTA_7O4HB
5299
+ 7LFTA_7O4HC
5300
+ 7LFTA_7RHJD
5301
+ 7LGUA_7LGWA
5302
+ 7LGUA_7S9BA
5303
+ 7LGUA_7S9CA
5304
+ 7LGUA_7S9DA
5305
+ 7LGUA_7SUNA
5306
+ 7MNLA_7MNLA
5307
+ 7MNLA_7MNNA
5308
+ 7MNLA_7R5K00
5309
+ 7MNLA_7R5K01
5310
+ 7MNLA_7TBMc1
5311
+ 7MWYA_7MWYA
5312
+ 7MWYA_7MWZA
5313
+ 7MWYA_7MWZB
5314
+ 7MWYA_7MWZC
5315
+ 7MWYA_7MWZD
5316
+ 7N85A_7N85G
5317
+ 7N85A_7N9FD
5318
+ 7N85A_7TBIX1
5319
+ 7N85A_7TBIX2
5320
+ 7N85A_7WOOL
5321
+ 7NDRA_7NDRA
5322
+ 7NDRA_7NDRB
5323
+ 7NDRA_7NDRC
5324
+ 7NDRA_7NDRE
5325
+ 7NDRA_7NDSA
5326
+ 7O4HD_7O4HD
5327
+ 7O4HD_7RH9B
5328
+ 7O4HD_7RHHB
5329
+ 7O4HD_7RHIB
5330
+ 7O4HD_8BX7D
5331
+ 7OC4A_7OC4A
5332
+ 7OC4A_7OC4B
5333
+ 7OC4A_7OC5A
5334
+ 7OC4A_7OC5B
5335
+ 7OC4A_7OC6A
5336
+ 7OGKA_7OGKA
5337
+ 7OGKA_7OGKB
5338
+ 7OGKA_7OGKC
5339
+ 7OGKA_7OGLB
5340
+ 7OGKA_7OGLC
5341
+ 7OLCLR_7OLCLR
5342
+ 7OLCLR_7R81T1
5343
+ 7OLCLR_7Z3OLR
5344
+ 7OLCLR_8I9ZLR
5345
+ 7OLCLR_8IA0LR
5346
+ 7OLCSR_7OLCSR
5347
+ 7OLCSR_7OLDSR
5348
+ 7OLCSR_7R81S2
5349
+ 7OLCSR_7Z3NSR
5350
+ 7OLCSR_7Z3OSR
5351
+ 7OODs_7OODs
5352
+ 7OODs_7PASs
5353
+ 7OODs_7PIAs
5354
+ 7OODs_7PIRs
5355
+ 7OODs_7PISs
5356
+ 7OX1G_7OX1G
5357
+ 7OX1G_7OX2N
5358
+ 7OX1G_7OX5H
5359
+ 7OX1G_7OX5N
5360
+ 7OX1G_7OX6A
5361
+ 7P37A_7P37A
5362
+ 7P37A_7P37B
5363
+ 7P37A_7P3FB
5364
+ 7P37A_7P3QA
5365
+ 7P37A_7P3QB
5366
+ 7PKNQ_7PKNQ
5367
+ 7PKNQ_7QOOQ
5368
+ 7PKNQ_7R5VQ
5369
+ 7PKNQ_7XHNQ
5370
+ 7PKNQ_7XHOQ
5371
+ 7PQ2A_7PQ2B
5372
+ 7PQ2A_7PQ3A
5373
+ 7PQ2A_7PQ3B
5374
+ 7PQ2A_7PQ6A
5375
+ 7PQ2A_7PQAA
5376
+ 7PWFP_7PWFP
5377
+ 7PWFP_8BRMSR
5378
+ 7PWFP_8BSISR
5379
+ 7PWFP_8BSJSR
5380
+ 7PWFP_8BTDSR
5381
+ 7Q0BA_7Q0BB
5382
+ 7Q0BA_7Q12A
5383
+ 7Q0BA_7Q13A
5384
+ 7Q0BA_7ZBNB
5385
+ 7Q0BA_8CVZB
5386
+ 7QIHB_7QIHB
5387
+ 7QIHB_7QIIB
5388
+ 7QIHB_7QIJE
5389
+ 7QIHB_7QIJf
5390
+ 7QIHB_7QIJT
5391
+ 7RGSA_7RGSA
5392
+ 7RGSA_7RGSB
5393
+ 7RGSA_7RGTB
5394
+ 7RGSA_7RGUG
5395
+ 7RGSA_7RGUM
5396
+ 7S6BA_7S6BA
5397
+ 7S6BA_7S6BB
5398
+ 7S6BA_7S6CA
5399
+ 7S6BA_7S6CB
5400
+ 7S6BA_7S6DA
5401
+ 7SAXC_7SAXC
5402
+ 7SAXC_7SAXD
5403
+ 7SAXC_7SAXE
5404
+ 7SAXC_7SAXF
5405
+ 7SAXC_7SAXG
5406
+ 7SBDH_7SBDH
5407
+ 7SBDH_7SBGH
5408
+ 7SBDH_7SD2A
5409
+ 7SBDH_7SD2C
5410
+ 7SBDH_7SD2H
5411
+ 7SN4A_7SN4A
5412
+ 7SN4A_7SN4c
5413
+ 7SN4A_7SN4p
5414
+ 7SN4A_7SN4S
5415
+ 7SN4A_7SN4W
5416
+ 7SN9A_7SN9A
5417
+ 7SN9A_7SN9i
5418
+ 7SN9A_7SN9K
5419
+ 7SN9A_7SN9P
5420
+ 7SN9A_7SN9T
5421
+ 7SY9A_7SY9A
5422
+ 7SY9A_7SY9B
5423
+ 7SY9A_7U35B
5424
+ 7SY9A_7U35C
5425
+ 7SY9A_8DP2A
5426
+ 7TJYX_7TJYX
5427
+ 7TJYX_7TKDX
5428
+ 7TJYX_7TKJX
5429
+ 7TJYX_7TKOX
5430
+ 7TJYX_7TKSX
5431
+ 7UQLA_7UQLA
5432
+ 7UQLA_7UQLB
5433
+ 7UQLA_7UQMA
5434
+ 7UQLA_7UQNA
5435
+ 7UQLA_7UQOA
5436
+ 7V9FA_7V9FA
5437
+ 7V9FA_7V9GA
5438
+ 7V9FA_7V9GD
5439
+ 7V9FA_7V9IA
5440
+ 7V9FA_7W27C
5441
+ 7VUFA_7VUFA
5442
+ 7VUFA_7VUFB
5443
+ 7VUFA_7VUFD
5444
+ 7VUFA_7VUKA
5445
+ 7VUFA_7VUKB
5446
+ 7W2BA_7W2BA
5447
+ 7W2BA_7W2BJ
5448
+ 7W2BA_7W2DK
5449
+ 7W2BA_7W2HT
5450
+ 7W2BA_7W2HW
5451
+ 7W5ZS_7W5Zs
5452
+ 7W5ZS_8B6HEK
5453
+ 7W5ZS_8GYMS
5454
+ 7W5ZS_8GZU18
5455
+ 7W5ZS_8GZU73
5456
+ 7WBBA_7WBBA
5457
+ 7WBBA_7WBBF
5458
+ 7WBBA_7YKKA
5459
+ 7WBBA_7YKLB
5460
+ 7WBBA_7Z34n
5461
+ 7WHRA_7WHRE
5462
+ 7WHRA_7WHRF
5463
+ 7WHRA_7WHSA
5464
+ 7WHRA_7WHTB
5465
+ 7WHRA_7WHTD
5466
+ 7WK1A_7WK1A
5467
+ 7WK1A_7WK7B
5468
+ 7WK1A_7WLAA
5469
+ 7WK1A_7WLAB
5470
+ 7WK1A_7WLEA
5471
+ 7X5EB_7X5EB
5472
+ 7X5EB_7X5EF
5473
+ 7X5EB_7X5FB
5474
+ 7X5EB_7X5FF
5475
+ 7X5EB_7X5GF
5476
+ 7XCNM_7XCNM
5477
+ 7XCNM_7XCNN
5478
+ 7XCNM_7XCNP
5479
+ 7XCNM_7XCNQ
5480
+ 7XCNM_7XCNR
5481
+ 7Y4L53_7Y4LY3
5482
+ 7Y4L53_7Y5E53
5483
+ 7Y4L53_7Y5EY3
5484
+ 7Y4L53_7Y7A5B
5485
+ 7Y4L53_7Y7AYd
5486
+ 7Y97A_7Y97A
5487
+ 7Y97A_7Y97B
5488
+ 7Y97A_7Y98A
5489
+ 7Y97A_7Y98B
5490
+ 7Y97A_7Y9OA
5491
+ 7ZD5C_7ZDAC
5492
+ 7ZD5C_7ZDBC
5493
+ 7ZD5C_7ZDFC
5494
+ 7ZD5C_7ZDTC
5495
+ 7ZD5C_8IPTC
5496
+ 7ZTWB_7ZTWB
5497
+ 7ZTWB_7ZU4A
5498
+ 7ZTWB_8A2EBBB
5499
+ 7ZTWB_8A2ECCC
5500
+ 7ZTWB_8A2EHHH
5501
+ 8AB2A_8AB2A
5502
+ 8AB2A_8AB3A
5503
+ 8AB2A_8AB3B
5504
+ 8AB2A_8AB3C
5505
+ 8AB2A_8AB3D
5506
+ 8AP6M1_8AP6M1
5507
+ 8AP6M1_8APCM1
5508
+ 8AP6M1_8APEM1
5509
+ 8AP6M1_8APFM1
5510
+ 8AP6M1_8APHM1
5511
+ 8AS8C_8AS8C
5512
+ 8AS8C_8AS8D
5513
+ 8AS8C_8BFNC
5514
+ 8AS8C_8BFNH
5515
+ 8AS8C_8BFNI
5516
+ 8AX4A_8AX4A
5517
+ 8AX4A_8AX4B
5518
+ 8AX4A_8AX4C
5519
+ 8AX4A_8AX4D
5520
+ 8AX4A_8AXFC
5521
+ 8CJZG_8CJZg
5522
+ 8CJZG_8CJZG
5523
+ 8CJZG_8CJZH
5524
+ 8CJZG_8CJZI
5525
+ 8CJZG_8CJZY
5526
+ 8DKEP_8DKEP
5527
+ 8DKEP_8DKIP
5528
+ 8DKEP_8DKMP
5529
+ 8DKEP_8DKWP
5530
+ 8DKEP_8DYPA
5531
+ 8ECKA_8ECKA
5532
+ 8ECKA_8ECKB
5533
+ 8ECKA_8ECKC
5534
+ 8ECKA_8ECKF
5535
+ 8ECKA_8ECKG
5536
+ 8ECNA_8ECNA
5537
+ 8ECNA_8ECNB
5538
+ 8ECNA_8ECNF
5539
+ 8ECNA_8ECNG
5540
+ 8ECNA_8ECNH
5541
+ 8ECOA_8ECOA
5542
+ 8ECOA_8ECOB
5543
+ 8ECOA_8ECOE
5544
+ 8ECOA_8ECOF
5545
+ 8ECOA_8ECOG
5546
+ 8EDUA_8EDUA
5547
+ 8EDUA_8EDUB
5548
+ 8EDUA_8EDUC
5549
+ 8EDUA_8EDUF
5550
+ 8EDUA_8EDUG
5551
+ 8EGRA_8EGRA
5552
+ 8EGRA_8EGRC
5553
+ 8EGRA_8EGRD
5554
+ 8EGRA_8EGRE
5555
+ 8EGRA_8EGRF
5556
+ 8EIZA_8EIZA
5557
+ 8EIZA_8EIZB
5558
+ 8EIZA_8EIZC
5559
+ 8EIZA_8EIZD
5560
+ 8EIZA_8EIZE
5561
+ 8ESQE_8ESQE
5562
+ 8ESQE_8ETHE
5563
+ 8ESQE_8ETIE
5564
+ 8ESQE_8ETJE
5565
+ 8ESQE_8EV3E
5566
+ 8ESQu_8ETCu
5567
+ 8ESQu_8ETIu
5568
+ 8ESQu_8ETJu
5569
+ 8ESQu_8EUPu
5570
+ 8ESQu_8EUYu
5571
+ 8FY9B_8FY9B
5572
+ 8FY9B_8FY9E
5573
+ 8FY9B_8FYAC
5574
+ 8FY9B_8FYAE
5575
+ 8FY9B_8FYCC
5576
+ 8H2IbA_8H2IbA
5577
+ 8H2IbA_8H2IbB
5578
+ 8H2IbA_8H2Ibu
5579
+ 8H2IbA_8H2Iby
5580
+ 8H2IbA_8H2Ibz
5581
+ 8OVNA_8OVNA
5582
+ 8OVNA_8OVOA
5583
+ 8OVNA_8OVOB
5584
+ 8OVNA_8OVPA
5585
+ 8OVNA_8OVPB
full_train_list.txt ADDED
The diff for this file is too large to render. See raw diff
 
full_val_list.txt ADDED
@@ -0,0 +1,5495 @@
1
+ 121PA_6PQ3A
2
+ 121PA_6YXWC
3
+ 121PA_7F0WA
4
+ 121PA_7KMRA
5
+ 121PA_7KYZA
6
+ 12CAA_1AVNA
7
+ 12CAA_1V9EB
8
+ 12CAA_3RLDA
9
+ 12CAA_6HD2A
10
+ 12CAA_6QEBA
11
+ 13PKA_13PKA
12
+ 13PKA_13PKB
13
+ 13PKA_13PKC
14
+ 13PKA_13PKD
15
+ 13PKA_16PKA
16
+ 1A22B_1AXIB
17
+ 1A22B_1HWHB
18
+ 1A22B_2AEWA
19
+ 1A22B_2AEWB
20
+ 1A22B_3HHRC
21
+ 1A3LL_1QKZL
22
+ 1A3LL_1UZ6M
23
+ 1A3LL_2VQ1E
24
+ 1A3LL_3BZ4G
25
+ 1A3LL_6FN4B
26
+ 1A3PA_1A3PA
27
+ 1A3PA_1EGFA
28
+ 1A3PA_1EPHA
29
+ 1A3PA_1EPJA
30
+ 1A3PA_1GK5A
31
+ 1A4RA_1AJEA
32
+ 1A4RA_1CF4A
33
+ 1A4RA_1E0AA
34
+ 1A4RA_1EESA
35
+ 1A4RA_2ASEA
36
+ 1A50B_1K3UB
37
+ 1A50B_2DH6A
38
+ 1A50B_2J9XB
39
+ 1A50B_5KZMB
40
+ 1A50B_7KA1B
41
+ 1A53A_1A53A
42
+ 1A53A_4IJBA
43
+ 1A53A_4LT9A
44
+ 1A53A_5K7JB
45
+ 1A53A_6TF8C
46
+ 1A5TA_1JR3E
47
+ 1A5TA_3GLFJ
48
+ 1A5TA_3GLHE
49
+ 1A5TA_3GLHO
50
+ 1A5TA_3GLIJ
51
+ 1A6IA_1QPIA
52
+ 1A6IA_2TCTA
53
+ 1A6IA_3FK6A
54
+ 1A6IA_3FK6B
55
+ 1A6IA_6YR2B
56
+ 1AACA_1AACA
57
+ 1AACA_1MDAA
58
+ 1AACA_1MDAB
59
+ 1AACA_1T5KB
60
+ 1AACA_3RYMB
61
+ 1AE6H_1YEDB
62
+ 1AE6H_2NTFB
63
+ 1AE6H_2NTFH
64
+ 1AE6H_2UYLB
65
+ 1AE6H_4I2XD
66
+ 1AEWA_2Z5PA
67
+ 1AEWA_2Z5RA
68
+ 1AEWA_4V6BAO
69
+ 1AEWA_7VIOA
70
+ 1AEWA_7VITA
71
+ 1AFIA_1AFIA
72
+ 1AFIA_1AFJA
73
+ 1AFIA_1OSDA
74
+ 1AFIA_1OSDB
75
+ 1AFIA_2HQIA
76
+ 1AGTA_1AGTA
77
+ 1AGTA_1KTXA
78
+ 1AGTA_1XSWA
79
+ 1AGTA_2KIRA
80
+ 1AGTA_2KTXA
81
+ 1AILA_2N74A
82
+ 1AILA_2N74B
83
+ 1AILA_2Z0AA
84
+ 1AILA_2Z0AD
85
+ 1AILA_3M8AE
86
+ 1AIVA_1AIVA
87
+ 1AIVA_1AOVA
88
+ 1AIVA_1DOTA
89
+ 1AIVA_1RYXA
90
+ 1AIVA_8FEIA
91
+ 1AJ7H_1AJ7H
92
+ 1AJ7H_2HH0H
93
+ 1AJ7H_2RCSH
94
+ 1AJ7H_6SVLH
95
+ 1AJ7H_7PI7E
96
+ 1AKKA_2B4ZA
97
+ 1AKKA_3NBSC
98
+ 1AKKA_3NBTD
99
+ 1AKKA_5ZKVA
100
+ 1AKKA_6XNKG
101
+ 1AM7A_1AM7A
102
+ 1AM7A_1AM7B
103
+ 1AM7A_1AM7C
104
+ 1AM7A_1D9UA
105
+ 1AM7A_3D3DB
106
+ 1AMEA_1AMEA
107
+ 1AMEA_1HG7A
108
+ 1AMEA_1KDFA
109
+ 1AMEA_3NLAA
110
+ 1AMEA_3RDNA
111
+ 1AQ2A_1K3CA
112
+ 1AQ2A_1OENA
113
+ 1AQ2A_1OS1A
114
+ 1AQ2A_6ASNA
115
+ 1AQ2A_6AT4A
116
+ 1AQKH_4OCRH
117
+ 1AQKH_4ZYKA
118
+ 1AQKH_4ZYKH
119
+ 1AQKH_7MMND
120
+ 1AQKH_7Z0YH
121
+ 1ATNA_2VCPA
122
+ 1ATNA_2Y83S
123
+ 1ATNA_3LUEA
124
+ 1ATNA_6FM2A
125
+ 1ATNA_7NVMK
126
+ 1ATUA_1IZ2A
127
+ 1ATUA_1KCTA
128
+ 1ATUA_1QLPA
129
+ 1ATUA_3T1PA
130
+ 1ATUA_6HX4B
131
+ 1AXSA_1D5IL
132
+ 1AXSA_6NYQL
133
+ 1AXSA_7BH8F
134
+ 1AXSA_7BH8H
135
+ 1AXSA_7BM5L
136
+ 1AXSB_1NGXH
137
+ 1AXSB_5VL3E
138
+ 1AXSB_6BJZH
139
+ 1AXSB_6EV1K
140
+ 1AXSB_7YDIH
141
+ 1AY9A_1AY9A
142
+ 1AY9A_1AY9B
143
+ 1AY9A_1I4VA
144
+ 1AY9A_1I4VB
145
+ 1AY9A_1UMUA
146
+ 1AYKA_1AYKA
147
+ 1AYKA_1CGFA
148
+ 1AYKA_1CGLA
149
+ 1AYKA_2J0TC
150
+ 1AYKA_4AYKA
151
+ 1AYRA_1AYRA
152
+ 1AYRA_1CF1C
153
+ 1AYRA_3UGXA
154
+ 1AYRA_4J2QA
155
+ 1AYRA_4J2QB
156
+ 1B2YA_1B2YA
157
+ 1B2YA_1BVNP
158
+ 1B2YA_1DHKA
159
+ 1B2YA_1KXQC
160
+ 1B2YA_4GQQA
161
+ 1B3QA_1B3QA
162
+ 1B3QA_1B3QB
163
+ 1B3QA_2CH4A
164
+ 1B3QA_3JA6C
165
+ 1B3QA_3JA6E
166
+ 1B4CA_1QLKA
167
+ 1B4CA_1SYMA
168
+ 1B4CA_2H61A
169
+ 1B4CA_2M49B
170
+ 1B4CA_2PRUA
171
+ 1B4FA_1B4FA
172
+ 1B4FA_1B4FB
173
+ 1B4FA_1B4FD
174
+ 1B4FA_1B4FF
175
+ 1B4FA_1B4FG
176
+ 1B50A_1B50B
177
+ 1B50A_1B53A
178
+ 1B50A_2X69A
179
+ 1B50A_3FPUB
180
+ 1B50A_5CORJ
181
+ 1B8XA_1M99A
182
+ 1B8XA_4ECBA
183
+ 1B8XA_5GZZB
184
+ 1B8XA_5GZZE
185
+ 1B8XA_5GZZF
186
+ 1B9KA_1KY6A
187
+ 1B9KA_1KYDA
188
+ 1B9KA_1KYUA
189
+ 1B9KA_1W80A
190
+ 1B9KA_7OHIA
191
+ 1BBSA_2IKUA
192
+ 1BBSA_3G70B
193
+ 1BBSA_3VCMA
194
+ 1BBSA_3VCMB
195
+ 1BBSA_3VSWB
196
+ 1BCCJ_1L0LJ
197
+ 1BCCJ_1NTZJ
198
+ 1BCCJ_2BCCJ
199
+ 1BCCJ_5J4ZAJ
200
+ 1BCCJ_7DGSB4
201
+ 1BE2A_1BE2A
202
+ 1BE2A_1JTBA
203
+ 1BE2A_1LIPA
204
+ 1BE2A_1MIDA
205
+ 1BE2A_3GSHB
206
+ 1BF5A_1BF5A
207
+ 1BF5A_1YVLA
208
+ 1BF5A_1YVLB
209
+ 1BF5A_7NUFA
210
+ 1BF5A_8D3FA
211
+ 1BF9A_1BF9A
212
+ 1BF9A_1F7EA
213
+ 1BF9A_1F7MA
214
+ 1BF9A_1FF7A
215
+ 1BF9A_1FFMA
216
+ 1BFIA_1BFIA
217
+ 1BFIA_1BFJA
218
+ 1BFIA_1PICA
219
+ 1BFIA_1QADA
220
+ 1BFIA_5AULA
221
+ 1BFSA_1BFSA
222
+ 1BFSA_1IKNC
223
+ 1BFSA_1U42A
224
+ 1BFSA_3JV4B
225
+ 1BFSA_3JV4D
226
+ 1BGTA_1IXYA
227
+ 1BGTA_1JEJA
228
+ 1BGTA_1JIXA
229
+ 1BGTA_1M5RA
230
+ 1BGTA_2BGTA
231
+ 1BI2B_1DPRA
232
+ 1BI2B_1DPRB
233
+ 1BI2B_1F5TA
234
+ 1BI2B_1F5TB
235
+ 1BI2B_1F5TD
236
+ 1BJ1H_1BJ1H
237
+ 1BJ1H_1CZ8H
238
+ 1BJ1H_4EOWH
239
+ 1BJ1H_5CGYC
240
+ 1BJ1H_5ZMJH
241
+ 1BJFA_1BJFA
242
+ 1BJFA_1BJFB
243
+ 1BJFA_5G4PA
244
+ 1BJFA_5M6CE
245
+ 1BJFA_5T7CA
246
+ 1BKUA_1BYVA
247
+ 1BKUA_1BZBA
248
+ 1BKUA_1FB9A
249
+ 1BKUA_2GLHA
250
+ 1BKUA_7TYWP
251
+ 1BL8A_2A9HC
252
+ 1BL8A_2HVJC
253
+ 1BL8A_3F5WC
254
+ 1BL8A_3OGCC
255
+ 1BL8A_6W0DC
256
+ 1BOGB_3J2YB
257
+ 1BOGB_4FFZZ
258
+ 1BOGB_5DFVE
259
+ 1BOGB_5KVGH
260
+ 1BOGB_5UHYH
261
+ 1BORA_1BORA
262
+ 1BORA_2MWXA
263
+ 1BORA_5YUFA
264
+ 1BORA_5YUFB
265
+ 1BORA_5YUFC
266
+ 1BUQA_1BUQA
267
+ 1BUQA_1ISKA
268
+ 1BUQA_1ISKB
269
+ 1BUQA_3NUVA
270
+ 1BUQA_6UAEA
271
+ 1BXZA_1BXZA
272
+ 1BXZA_3FPLA
273
+ 1BXZA_6SDMC
274
+ 1BXZA_7XY9B
275
+ 1BXZA_7XY9D
276
+ 1BZ7B_1BZ7B
277
+ 1BZ7B_1CLZH
278
+ 1BZ7B_2H1PH
279
+ 1BZ7B_4HDIB
280
+ 1BZ7B_4HDIH
281
+ 1C04D_3J3VK
282
+ 1C04D_3J3WK
283
+ 1C04D_5MYJBN
284
+ 1C04D_6SJ5D
285
+ 1C04D_6WRUW
286
+ 1C9BB_1NGMA
287
+ 1C9BB_1NVPA
288
+ 1C9BB_1TBAB
289
+ 1C9BB_1TBPB
290
+ 1C9BB_6EU0Y
291
+ 1CD0A_2MMXA
292
+ 1CD0A_5IR3A
293
+ 1CD0A_6HUDA
294
+ 1CD0A_7WOGB
295
+ 1CD0A_8HHXI
296
+ 1CL3A_1CL3A
297
+ 1CL3A_2JHBA
298
+ 1CL3A_3WTXB
299
+ 1CL3A_4N9Fi
300
+ 1CL3A_6P59A
301
+ 1CLHA_1CLHA
302
+ 1CLHA_1J2AA
303
+ 1CLHA_1V9TA
304
+ 1CLHA_1VAIA
305
+ 1CLHA_7ZFMB
306
+ 1CM9A_1CM9A
307
+ 1CM9A_1HFGA
308
+ 1CM9A_1HFNA
309
+ 1CM9A_1HHVA
310
+ 1CM9A_1VMPA
311
+ 1COV3_6ZMSC
312
+ 1COV3_7DPG3
313
+ 1COV3_7VXLC
314
+ 1COV3_7WL33
315
+ 1COV3_7X2IC
316
+ 1CQTA_1CQTA
317
+ 1CQTA_1CQTB
318
+ 1CQTA_1E3OC
319
+ 1CQTA_1HF0B
320
+ 1CQTA_1OCTC
321
+ 1CRXA_1CRXA
322
+ 1CRXA_1DRGA
323
+ 1CRXA_1OUQF
324
+ 1CRXA_1Q3VB
325
+ 1CRXA_7RHZA
326
+ 1CVLA_1CVLA
327
+ 1CVLA_1HQDA
328
+ 1CVLA_1TAHB
329
+ 1CVLA_1TAHD
330
+ 1CVLA_7COGC
331
+ 1D1DA_1D1DA
332
+ 1D1DA_5A9EA
333
+ 1D1DA_5A9EI
334
+ 1D1DA_7NO2B
335
+ 1D1DA_7NO6B
336
+ 1D2EA_1D2EA
337
+ 1D2EA_1D2EC
338
+ 1D2EA_1XB2A
339
+ 1D2EA_7A5GZ
340
+ 1D2EA_7O9Kt
341
+ 1D4M3_3J2JB
342
+ 1D4M3_6LA3C
343
+ 1D4M3_6LB1C
344
+ 1D4M3_8AW6C
345
+ 1D4M3_8GSF3
346
+ 1D4UA_1D4UA
347
+ 1D4UA_1XPAA
348
+ 1D4UA_6LAEA
349
+ 1D4UA_6RO4G
350
+ 1D4UA_7AD8G
351
+ 1D5GA_1D5GA
352
+ 1D5GA_1OZIA
353
+ 1D5GA_1Q7XA
354
+ 1D5GA_7QCYA
355
+ 1D5GA_7XTYA
356
+ 1D6JA_1D6JA
357
+ 1D6JA_1M7GA
358
+ 1D6JA_1M7GC
359
+ 1D6JA_1M7HA
360
+ 1D6JA_1M7HD
361
+ 1D7EA_1NWZA
362
+ 1D7EA_2KX6A
363
+ 1D7EA_3PHYA
364
+ 1D7EA_7AVAA
365
+ 1D7EA_7AVBA
366
+ 1D8UA_2GNVA
367
+ 1D8UA_2GNVB
368
+ 1D8UA_2GNWB
369
+ 1D8UA_2OIFD
370
+ 1D8UA_2OIFG
371
+ 1DDFA_1DDFA
372
+ 1DDFA_3EZQA
373
+ 1DDFA_3EZQC
374
+ 1DDFA_3EZQM
375
+ 1DDFA_3EZQO
376
+ 1DJSA_1E0OB
377
+ 1DJSA_1E0OD
378
+ 1DJSA_1II4E
379
+ 1DJSA_2FDBP
380
+ 1DJSA_4J23A
381
+ 1DMZA_1DMZA
382
+ 1DMZA_1FHQA
383
+ 1DMZA_1K2MA
384
+ 1DMZA_1K2NA
385
+ 1DMZA_1QU5A
386
+ 1DNYA_2GDWA
387
+ 1DNYA_2GDXA
388
+ 1DNYA_2GDYA
389
+ 1DNYA_2K2QA
390
+ 1DNYA_2MD9A
391
+ 1DPMA_1I0BB
392
+ 1DPMA_1PTAA
393
+ 1DPMA_6B2FG
394
+ 1DPMA_6FEEA
395
+ 1DPMA_7P85A
396
+ 1DQYA_1DQZA
397
+ 1DQYA_1DQZB
398
+ 1DQYA_1VA5A
399
+ 1DQYA_4QDUA
400
+ 1DQYA_5KWIA
401
+ 1DX5I_1DX5I
402
+ 1DX5I_1DX5L
403
+ 1DX5I_3GISX
404
+ 1DX5I_3GISY
405
+ 1DX5I_3GISZ
406
+ 1DZFA_1DZFA
407
+ 1DZFA_1I6HE
408
+ 1DZFA_5FJAE
409
+ 1DZFA_5VVSE
410
+ 1DZFA_6UPZE
411
+ 1E08A_1E08A
412
+ 1E08A_1GX7A
413
+ 1E08A_1HFEL
414
+ 1E08A_1HFEM
415
+ 1E08A_6SG2A
416
+ 1E0DA_1E0DA
417
+ 1E0DA_1EEHA
418
+ 1E0DA_2UAGA
419
+ 1E0DA_2WJPA
420
+ 1E0DA_5A5FA
421
+ 1E1CA_2REQA
422
+ 1E1CA_2REQC
423
+ 1E1CA_3REQA
424
+ 1E1CA_4REQA
425
+ 1E1CA_7REQA
426
+ 1E4JA_1E4KC
427
+ 1E4JA_3SGJC
428
+ 1E4JA_3WN5F
429
+ 1E4JA_5MN2A
430
+ 1E4JA_5MN2B
431
+ 1E4WH_1MJUH
432
+ 1E4WH_3J2XB
433
+ 1E4WH_7U63H
434
+ 1E4WH_7XRZB
435
+ 1E4WH_8DK6H
436
+ 1E6CA_1E6CB
437
+ 1E6CA_1SHKA
438
+ 1E6CA_1SHKB
439
+ 1E6CA_2SHKA
440
+ 1E6CA_2SHKB
441
+ 1E7JA_1E7JA
442
+ 1E7JA_1HMAA
443
+ 1E7JA_1QRVA
444
+ 1E7JA_3NM9A
445
+ 1E7JA_3NM9M
446
+ 1EDIA_1EDIA
447
+ 1EDIA_1EDJA
448
+ 1EDIA_1EDKA
449
+ 1EDIA_1EDLA
450
+ 1EDIA_1ZXGA
451
+ 1EG0G_1RIPA
452
+ 1EG0G_5MYJAQ
453
+ 1EG0G_6HA1q
454
+ 1EG0G_6S13q
455
+ 1EG0G_7ASO8
456
+ 1EG0K_1MMSA
457
+ 1EG0K_2K3FA
458
+ 1EG0K_4V4PAL
459
+ 1EG0K_4V4RBK
460
+ 1EG0K_4V4SBK
461
+ 1EH5A_1EH5A
462
+ 1EH5A_1EI9A
463
+ 1EH5A_1EXWA
464
+ 1EH5A_3GROA
465
+ 1EH5A_3GROB
466
+ 1EJ1A_1EJ4A
467
+ 1EJ1A_1EJHA
468
+ 1EJ1A_1EJHC
469
+ 1EJ1A_2GPQA
470
+ 1EJ1A_4TQBA
471
+ 1EJLI_1EJLI
472
+ 1EJLI_2C1MA
473
+ 1EJLI_4WV6A
474
+ 1EJLI_7S1EB
475
+ 1EJLI_8FZKD
476
+ 1EK8A_1EK8A
477
+ 1EK8A_1ISEA
478
+ 1EK8A_4V9CCY
479
+ 1EK8A_4V9DAY
480
+ 1EK8A_4WOIAV
481
+ 1EK9A_1EK9A
482
+ 1EK9A_5BUNB
483
+ 1EK9A_5NIKA
484
+ 1EK9A_5O66B
485
+ 1EK9A_5V5SB
486
+ 1ELVA_1ELVA
487
+ 1ELVA_4J1YA
488
+ 1ELVA_4J1YB
489
+ 1ELVA_8GMNA
490
+ 1ELVA_8GMNB
491
+ 1EO8A_6NZ7A
492
+ 1EO8A_6NZ7E
493
+ 1EO8A_6Y5KA
494
+ 1EO8A_7K37A
495
+ 1EO8A_7X6LA
496
+ 1ESMA_1ESMA
497
+ 1ESMA_1ESMB
498
+ 1ESMA_1ESNA
499
+ 1ESMA_4F7WA
500
+ 1ESMA_4F7WH
501
+ 1ETZB_1ETZH
502
+ 1ETZB_2IPUG
503
+ 1ETZB_3BKMH
504
+ 1ETZB_5EOQH
505
+ 1ETZB_6YWCD
506
+ 1EUCB_2FP4B
507
+ 1EUCB_6WCVB
508
+ 1EUCB_6XRUB
509
+ 1EUCB_7JJ0B
510
+ 1EUCB_7JMKB
511
+ 1EYSC_1EYSC
512
+ 1EYSC_3WMMC
513
+ 1EYSC_4V8KBC
514
+ 1EYSC_5B5MC
515
+ 1EYSC_7VRJC
516
+ 1EZVE_1EZVE
517
+ 1EZVE_4PD4E
518
+ 1EZVE_6T15E
519
+ 1EZVE_6YMXP
520
+ 1EZVE_8EC0C
521
+ 1F11B_3RHWI
522
+ 1F11B_3RHWJ
523
+ 1F11B_4TNVF
524
+ 1F11B_5JHLH
525
+ 1F11B_5MYOD
526
+ 1F3OA_1F3OA
527
+ 1F3OA_1L2TA
528
+ 1F3OA_1L2TB
529
+ 1F3OA_3TIFA
530
+ 1F3OA_3TIFB
531
+ 1F3UB_1F3UD
532
+ 1F3UB_1F3UF
533
+ 1F3UB_5IY6S
534
+ 1F3UB_7EDXS
535
+ 1F3UB_7LBMS
536
+ 1F46A_1F46A
537
+ 1F46A_1F47B
538
+ 1F46A_1F7WA
539
+ 1F46A_1F7XA
540
+ 1F46A_1Y2FA
541
+ 1F59A_1F59A
542
+ 1F59A_1GCJA
543
+ 1F59A_1IBRB
544
+ 1F59A_1IBRD
545
+ 1F59A_1O6PB
546
+ 1F88A_1JFPA
547
+ 1F88A_1LN6A
548
+ 1F88A_2I36A
549
+ 1F88A_6OFJA
550
+ 1F88A_7MT9R
551
+ 1FA0A_1FA0A
552
+ 1FA0A_2O1PA
553
+ 1FA0A_2O1PB
554
+ 1FA0A_2Q66A
555
+ 1FA0A_3C66B
556
+ 1FBRA_1FBRA
557
+ 1FBRA_2RKYA
558
+ 1FBRA_2RL0A
559
+ 1FBRA_2RL0F
560
+ 1FBRA_2RL0K
561
+ 1FEZA_1RQLA
562
+ 1FEZA_1SWVB
563
+ 1FEZA_2IOFA
564
+ 1FEZA_2IOHC
565
+ 1FEZA_2IOHD
566
+ 1FFKB_1JJ2B
567
+ 1FFKB_2QA4B
568
+ 1FFKB_3G4SB
569
+ 1FFKB_4V4PAE
570
+ 1FFKB_4V4RBE
571
+ 1FFKF_1JJ2H
572
+ 1FFKF_1Q81J
573
+ 1FFKF_1Q86J
574
+ 1FFKF_4V4PAP
575
+ 1FFKF_4V4RBQ
576
+ 1FFKR_1JJ2T
577
+ 1FFKR_3CCJU
578
+ 1FFKR_3CCRU
579
+ 1FFKR_4V4PAR
580
+ 1FFKR_4V4RBT
581
+ 1FJ1E_2G8CO
582
+ 1FJ1E_3CKGA
583
+ 1FJ1E_6IDCA
584
+ 1FJ1E_6IEIA
585
+ 1FJ1E_6KWJB
586
+ 1FJGN_1FJGN
587
+ 1FJGN_1IBLN
588
+ 1FJGN_4V4Jo
589
+ 1FJGN_5OT7M
590
+ 1FJGN_6GZXN3
591
+ 1FNTc_1FNTm
592
+ 1FNTc_1FNTn
593
+ 1FNTc_1YA7O
594
+ 1FNTc_1Z7Qi
595
+ 1FNTc_3IPMS
596
+ 1FNTN_1FNTN
597
+ 1FNTN_4CR37
598
+ 1FNTN_4CR47
599
+ 1FNTN_5MPBn
600
+ 1FNTN_5WVI7
601
+ 1FUUB_1FUUB
602
+ 1FUUB_2VSOA
603
+ 1FUUB_2VSOB
604
+ 1FUUB_2VSXA
605
+ 1FUUB_2VSXB
606
+ 1G0YR_1G0YR
607
+ 1G0YR_1IRAY
608
+ 1G0YR_1ITBB
609
+ 1G0YR_4DEPE
610
+ 1G0YR_4GAFB
611
+ 1G1AA_1G1AA
612
+ 1G1AA_1G1AB
613
+ 1G1AA_1G1AD
614
+ 1G1AA_1KEUA
615
+ 1G1AA_1KEWB
616
+ 1G2OA_1G2OA
617
+ 1G2OA_1I80B
618
+ 1G2OA_3IX2A
619
+ 1G2OA_7ZSQB
620
+ 1G2OA_7ZSQC
621
+ 1G5MA_1GJHA
622
+ 1G5MA_2O2FA
623
+ 1G5MA_4LVTA
624
+ 1G5MA_5VAYB
625
+ 1G5MA_6IWBB
626
+ 1G6OA_1G6OA
627
+ 1G6OA_1NLZF
628
+ 1G6OA_2PT7B
629
+ 1G6OA_2PT7C
630
+ 1G6OA_2PT7D
631
+ 1G96A_1R4CA
632
+ 1G96A_1TIJA
633
+ 1G96A_3NX0A
634
+ 1G96A_3QRDB
635
+ 1G96A_6RPVA
636
+ 1G9EA_1G9EA
637
+ 1G9EA_1HCVA
638
+ 1G9EA_1SHMA
639
+ 1G9EA_1SHMB
640
+ 1G9EA_1SHME
641
+ 1G9OA_1G9OA
642
+ 1G9OA_4JL7A
643
+ 1G9OA_4LMMA
644
+ 1G9OA_4MPAA
645
+ 1G9OA_4N6XA
646
+ 1GDCA_1LATA
647
+ 1GDCA_1RGDA
648
+ 1GDCA_4HN5A
649
+ 1GDCA_4OORF
650
+ 1GDCA_5CC1B
651
+ 1GJZA_1GJZB
652
+ 1GJZA_5TGMp1
653
+ 1GJZA_5TGMp2
654
+ 1GJZA_6OA9K
655
+ 1GJZA_6OAAH
656
+ 1GNKA_1GNKB
657
+ 1GNKA_2NS1B
658
+ 1GNKA_2NUUI
659
+ 1GNKA_2NUUK
660
+ 1GNKA_2NUUL
661
+ 1GNUA_2L8JA
662
+ 1GNUA_3D32B
663
+ 1GNUA_5DPTB
664
+ 1GNUA_5LXHC
665
+ 1GNUA_7BRQA
666
+ 1GS0A_1GS0A
667
+ 1GS0A_1GS0B
668
+ 1GS0A_4TVJA
669
+ 1GS0A_4ZZYA
670
+ 1GS0A_7R59A
671
+ 1GXJA_1GXJA
672
+ 1GXJA_1GXJB
673
+ 1GXJA_1GXKB
674
+ 1GXJA_1GXKC
675
+ 1GXJA_1GXLD
676
+ 1HAVA_1HAVB
677
+ 1HAVA_1QA7A
678
+ 1HAVA_1QA7B
679
+ 1HAVA_1QA7C
680
+ 1HAVA_2H6MA
681
+ 1HOVA_1HOVA
682
+ 1HOVA_3AYUA
683
+ 1HOVA_7XGJC
684
+ 1HOVA_7XJOA
685
+ 1HOVA_8H78A
686
+ 1HUEA_1HUEA
687
+ 1HUEA_1HUEB
688
+ 1HUEA_4QJNA
689
+ 1HUEA_4QJNC
690
+ 1HUEA_4QJUA
691
+ 1HUMA_1HUMA
692
+ 1HUMA_1HUNA
693
+ 1HUMA_1JE4A
694
+ 1HUMA_2X6LC
695
+ 1HUMA_3TN2A
696
+ 1HVVA_1HVVA
697
+ 1HVVA_1HVVC
698
+ 1HVVA_1HVVD
699
+ 1HVVA_1L4AB
700
+ 1HVVA_3J96L
701
+ 1HVYA_2RDAA
702
+ 1HVYA_3EHIX
703
+ 1HVYA_3GH0A
704
+ 1HVYA_4E28A
705
+ 1HVYA_5X5DD
706
+ 1HZEA_1HZEA
707
+ 1HZEA_1I18A
708
+ 1HZEA_1I18B
709
+ 1HZEA_1PKVA
710
+ 1HZEA_1PKVB
711
+ 1I0IA_1I0IB
712
+ 1I0IA_1I0LA
713
+ 1I0IA_1I13B
714
+ 1I0IA_1P18A
715
+ 1I0IA_5EUCC
716
+ 1I1IP_1I1IP
717
+ 1I1IP_2O3EA
718
+ 1I1IP_4FXYP
719
+ 1I1IP_5LUZB
720
+ 1I1IP_5LV0B
721
+ 1I7GA_1KKQA
722
+ 1I7GA_1KKQD
723
+ 1I7GA_3ET1B
724
+ 1I7GA_5AZTA
725
+ 1I7GA_6LX6A
726
+ 1ICWA_1IKLA
727
+ 1ICWA_1IL8A
728
+ 1ICWA_1RODA
729
+ 1ICWA_2IL8A
730
+ 1ICWA_4XDXA
731
+ 1IDRA_1RTEA
732
+ 1IDRA_1RTEB
733
+ 1IDRA_1S61A
734
+ 1IDRA_2GLNB
735
+ 1IDRA_5AB8A
736
+ 1IG1A_1JKTA
737
+ 1IG1A_2J90B
738
+ 1IG1A_3BQRA
739
+ 1IG1A_5A6NB
740
+ 1IG1A_5VJAC
741
+ 1IHRA_1IHRB
742
+ 1IHRA_1QXXA
743
+ 1IHRA_1U07A
744
+ 1IHRA_1XX3A
745
+ 1IHRA_2GRXC
746
+ 1IOMA_1IOMA
747
+ 1IOMA_1IXEA
748
+ 1IOMA_1IXEB
749
+ 1IOMA_1IXEC
750
+ 1IOMA_1IXED
751
+ 1IQPA_1IQPA
752
+ 1IQPA_1IQPC
753
+ 1IQPA_1IQPD
754
+ 1IQPA_1IQPE
755
+ 1IQPA_1IQPF
756
+ 1IRUG_5LE5T
757
+ 1IRUG_5LN3G
758
+ 1IRUG_5T0IM
759
+ 1IRUG_6EPCG
760
+ 1IRUG_6EPDG
761
+ 1IU3C_1IU3C
762
+ 1IU3C_1IU3F
763
+ 1IU3C_1J3EA
764
+ 1IU3C_1LRRA
765
+ 1IU3C_1LRRD
766
+ 1IWLA_2ZPCA
767
+ 1IWLA_6FHMA
768
+ 1IWLA_6FHMB
769
+ 1IWLA_7ARMA
770
+ 1IWLA_7Z6XA
771
+ 1IZLC_1IZLC
772
+ 1IZLC_1IZLM
773
+ 1IZLC_3WU2C
774
+ 1IZLC_6YP7C
775
+ 1IZLC_7NHQC
776
+ 1IZLD_1IZLD
777
+ 1IZLD_1IZLN
778
+ 1IZLD_1W5CD
779
+ 1IZLD_4PBUd
780
+ 1IZLD_7NHQD
781
+ 1J1HA_1J1HA
782
+ 1J1HA_2CZJA
783
+ 1J1HA_3IYQB
784
+ 1J1HA_3IYRB
785
+ 1J1HA_3IZ4B
786
+ 1J5AK_2ZJPC
787
+ 1J5AK_4V4GBF
788
+ 1J5AK_5DM7C
789
+ 1J5AK_5JVGC
790
+ 1J5AK_7A0RC
791
+ 1J5HA_1J5HA
792
+ 1J5HA_1J5IA
793
+ 1J5HA_1O5PA
794
+ 1J5HA_2G0KA
795
+ 1J5HA_2G0LA
796
+ 1J8BA_1J8BA
797
+ 1J8BA_1PUGA
798
+ 1J8BA_1PUGB
799
+ 1J8BA_1PUGC
800
+ 1J8BA_1PUGD
801
+ 1JGSA_1JGSA
802
+ 1JGSA_3VB2A
803
+ 1JGSA_3VOEA
804
+ 1JGSA_5H3RA
805
+ 1JGSA_5H3RB
806
+ 1JPYA_1JPYX
807
+ 1JPYA_5N92F
808
+ 1JPYA_5NANE
809
+ 1JPYA_5NANF
810
+ 1JPYA_6HGOC
811
+ 1JQHA_1P4OB
812
+ 1JQHA_3D94A
813
+ 1JQHA_3F5PB
814
+ 1JQHA_3LVPD
815
+ 1JQHA_3O23A
816
+ 1JRKA_1JRKA
817
+ 1JRKA_1JRKD
818
+ 1JRKA_1K26A
819
+ 1JRKA_1K26B
820
+ 1JRKA_1K2EB
821
+ 1JS3A_1JS3B
822
+ 1JS3A_1JS6A
823
+ 1JS3A_3RBFB
824
+ 1JS3A_3RBLB
825
+ 1JS3A_3RCHA
826
+ 1JSUC_1JSUC
827
+ 1JSUC_6ATHC
828
+ 1JSUC_6P8EC
829
+ 1JSUC_6P8FC
830
+ 1JSUC_6P8GC
831
+ 1JT0A_1JT0A
832
+ 1JT0A_1JT0D
833
+ 1JT0A_1JT6A
834
+ 1JT0A_1JUMA
835
+ 1JT0A_3BT9B
836
+ 1JWHA_3BQCA
837
+ 1JWHA_4DGLC
838
+ 1JWHA_5M44A
839
+ 1JWHA_5Y9MA
840
+ 1JWHA_6EHUA
841
+ 1K04A_1OW6A
842
+ 1K04A_1PV3A
843
+ 1K04A_3S9OA
844
+ 1K04A_3S9OC
845
+ 1K04A_7W7ZA
846
+ 1K2PA_1K2PA
847
+ 1K2PA_1K2PB
848
+ 1K2PA_3P08B
849
+ 1K2PA_3PIYA
850
+ 1K2PA_5P9GA
851
+ 1K92A_1K92A
852
+ 1K92A_1K97A
853
+ 1K92A_1KP3A
854
+ 1K92A_5US8A
855
+ 1K92A_5US8B
856
+ 1KAMA_1KAMB
857
+ 1KAMA_1KAMC
858
+ 1KAMA_1KAMD
859
+ 1KAMA_1KAQA
860
+ 1KAMA_1KAQB
861
+ 1KBAA_1KBAA
862
+ 1KBAA_1KBAB
863
+ 1KBAA_2NBTA
864
+ 1KBAA_7ULRA
865
+ 1KBAA_7ULRB
866
+ 1KGQA_1KGQA
867
+ 1KGQA_1KGTA
868
+ 1KGQA_3BXYA
869
+ 1KGQA_3GOSA
870
+ 1KGQA_3GOSC
871
+ 1KGYA_1KGYA
872
+ 1KGYA_1NUKA
873
+ 1KGYA_1SHWB
874
+ 1KGYA_2QBXB
875
+ 1KGYA_3ETPA
876
+ 1KILE_1KILE
877
+ 1KILE_3RK3E
878
+ 1KILE_3RL0g
879
+ 1KILE_3RL0i
880
+ 1KILE_5W5DE
881
+ 1KJ0A_1KJ0A
882
+ 1KJ0A_1WO9A
883
+ 1KJ0A_2F91B
884
+ 1KJ0A_2VU8I
885
+ 1KJ0A_2XTTA
886
+ 1KPKA_1KPLD
887
+ 1KPKA_1OTSA
888
+ 1KPKA_7N9WA
889
+ 1KPKA_7N9WB
890
+ 1KPKA_7RO0A
891
+ 1KPSB_1KPSD
892
+ 1KPSB_1Z5SC
893
+ 1KPSB_2GRPB
894
+ 1KPSB_2IO2C
895
+ 1KPSB_2IO3C
896
+ 1L9VA_1L9VA
897
+ 1L9VA_4G0AA
898
+ 1L9VA_4G0AD
899
+ 1L9VA_7PKOC
900
+ 1L9VA_7PKPA
901
+ 1LISA_2LISA
902
+ 1LISA_2LYNA
903
+ 1LISA_5II7A
904
+ 1LISA_5II9B
905
+ 1LISA_5UTGA
906
+ 1LNQA_5BKKA
907
+ 1LNQA_6U68A
908
+ 1LNQA_6U68B
909
+ 1LNQA_6U68C
910
+ 1LNQA_6U68D
911
+ 1LTXR_1LTXR
912
+ 1LTXR_1VG9A
913
+ 1LTXR_1VG9C
914
+ 1LTXR_1VG9E
915
+ 1LTXR_1VG9G
916
+ 1LVAA_1LVAA
917
+ 1LVAA_2PLYA
918
+ 1LVAA_2PLYB
919
+ 1LVAA_2UWMA
920
+ 1LVAA_2UWMB
921
+ 1LWUC_1LWUC
922
+ 1LWUC_1LWUF
923
+ 1LWUC_1LWUI
924
+ 1LWUC_1N73C
925
+ 1LWUC_1N73F
926
+ 1LZWA_1MBUC
927
+ 1LZWA_1R6OC
928
+ 1LZWA_1R6OD
929
+ 1LZWA_2W9RA
930
+ 1LZWA_2WA8C
931
+ 1M08A_1M08A
932
+ 1M08A_1PT3A
933
+ 1M08A_3FBDA
934
+ 1M08A_3FBDD
935
+ 1M08A_7CEIB
936
+ 1M12A_1M12A
937
+ 1M12A_1SN6A
938
+ 1M12A_2GTGA
939
+ 1M12A_2QYPA
940
+ 1M12A_2QYPB
941
+ 1MHPX_1VHPA
942
+ 1MHPX_3ZHKA
943
+ 1MHPX_6OROD
944
+ 1MHPX_8DT8D
945
+ 1MHPX_8DT8H
946
+ 1MI1A_1MI1A
947
+ 1MI1A_1T77A
948
+ 1MI1A_1T77B
949
+ 1MI1A_1T77C
950
+ 1MI1A_1T77D
951
+ 1MIQA_1MIQA
952
+ 1MIQA_1MIQB
953
+ 1MIQA_1QS8A
954
+ 1MIQA_1QS8B
955
+ 1MIQA_2ANLB
956
+ 1MU5A_1MU5A
957
+ 1MU5A_1MX0C
958
+ 1MU5A_1Z5AA
959
+ 1MU5A_2ZBKF
960
+ 1MU5A_2ZBKH
961
+ 1MYOA_1MYOA
962
+ 1MYOA_2KXPC
963
+ 1MYOA_2MYOA
964
+ 1MYOA_3AAAC
965
+ 1MYOA_7DF7A
966
+ 1N0UA_1N0UA
967
+ 1N0UA_1ZM2A
968
+ 1N0UA_2E1RA
969
+ 1N0UA_2P8YT
970
+ 1N0UA_3B82E
971
+ 1N3BA_1N3BB
972
+ 1N3BA_1VHLA
973
+ 1N3BA_1VIYA
974
+ 1N3BA_6ARIA
975
+ 1N3BA_6ARIB
976
+ 1NKW1_3PIO1
977
+ 1NKW1_4V4GB3
978
+ 1NKW1_5DM61
979
+ 1NKW1_5DM71
980
+ 1NKW1_5JVG1
981
+ 1NKWG_2ZJPF
982
+ 1NKWG_2ZJQF
983
+ 1NKWG_4V4GBJ
984
+ 1NKWG_5DM6F
985
+ 1NKWG_5DM7F
986
+ 1NKWR_2D3OR
987
+ 1NKWR_4V4GBU
988
+ 1NKWR_5DM6Q
989
+ 1NKWR_7A0RQ
990
+ 1NKWR_7A0SQ
991
+ 1NO4A_1NO4B
992
+ 1NO4A_1NO4C
993
+ 1NO4A_1NO4D
994
+ 1NO4A_1NOHA
995
+ 1NO4A_1NOHB
996
+ 1NSOA_1NSOA
997
+ 1NSOA_3SQFA
998
+ 1NSOA_6S1WB
999
+ 1NSOA_7BGTA
1000
+ 1NSOA_7BGTB
1001
+ 1NW4A_1NW4A
1002
+ 1NW4A_2B94A
1003
+ 1NW4A_2BSXA
1004
+ 1NW4A_3EMVA
1005
+ 1NW4A_3ENZD
1006
+ 1OCCF_1OCCF
1007
+ 1OCCF_1OCZF
1008
+ 1OCCF_2OCCF
1009
+ 1OCCF_5J4ZBP
1010
+ 1OCCF_5ZCPF
1011
+ 1ODDA_2JPBA
1012
+ 1ODDA_6LXMA
1013
+ 1ODDA_6LXMB
1014
+ 1ODDA_6LXMC
1015
+ 1ODDA_6LXNB
1016
+ 1OHUA_1OHUA
1017
+ 1OHUA_1OHUB
1018
+ 1OHUA_1TY4A
1019
+ 1OHUA_1TY4B
1020
+ 1OHUA_2A5YA
1021
+ 1OJLA_1OJLA
1022
+ 1OJLA_1OJLB
1023
+ 1OJLA_1OJLC
1024
+ 1OJLA_1OJLD
1025
+ 1OJLA_1OJLE
1026
+ 1OMSA_1OMSA
1027
+ 1OMSA_1OMSC
1028
+ 1OMSA_1P9YA
1029
+ 1OMSA_4URDA
1030
+ 1OMSA_7D6Zh
1031
+ 1ONQA_4X6CA
1032
+ 1ONQA_4X6DC
1033
+ 1ONQA_7KOZA
1034
+ 1ONQA_7RYNA
1035
+ 1ONQA_7RYOA
1036
+ 1OPMA_1OPMA
1037
+ 1OPMA_5WJAA
1038
+ 1OPMA_6ALAA
1039
+ 1OPMA_6ALAC
1040
+ 1OPMA_8DSNK
1041
+ 1OWTA_1OWTA
1042
+ 1OWTA_2FJZA
1043
+ 1OWTA_2FK3B
1044
+ 1OWTA_2FKLB
1045
+ 1OWTA_7MRSB
1046
+ 1P0ZA_1P0ZA
1047
+ 1P0ZA_1P0ZE
1048
+ 1P0ZA_2J80A
1049
+ 1P0ZA_2V9AA
1050
+ 1P0ZA_2V9AB
1051
+ 1P8DA_4DK7C
1052
+ 1P8DA_4DK8A
1053
+ 1P8DA_5JY3A
1054
+ 1P8DA_5JY3C
1055
+ 1P8DA_6JIOD
1056
+ 1P9MC_1P9MC
1057
+ 1P9MC_5FUCC
1058
+ 1P9MC_5FUCD
1059
+ 1P9MC_7DC8C
1060
+ 1P9MC_7DC8F
1061
+ 1PA7A_1V5KA
1062
+ 1PA7A_1VKAA
1063
+ 1PA7A_1WYOA
1064
+ 1PA7A_3CO1A
1065
+ 1PA7A_3JAKM
1066
+ 1PGNA_2JKVC
1067
+ 1PGNA_2JKVF
1068
+ 1PGNA_4GWGA
1069
+ 1PGNA_5UQ9B
1070
+ 1PGNA_5UQ9F
1071
+ 1PJWA_1PJWA
1072
+ 1PJWA_6S93B
1073
+ 1PJWA_6S93C
1074
+ 1PJWA_6S94B
1075
+ 1PJWA_6S95A
1076
+ 1PKLA_1PKLD
1077
+ 1PKLA_1PKLG
1078
+ 1PKLA_3E0WA
1079
+ 1PKLA_3HQOA
1080
+ 1PKLA_3HQPB
1081
+ 1PVLA_1PVLA
1082
+ 1PVLA_4Q7GA
1083
+ 1PVLA_6U2SA
1084
+ 1PVLA_6U33A
1085
+ 1PVLA_6U3YA
1086
+ 1Q14A_1Q14A
1087
+ 1Q14A_1Q17A
1088
+ 1Q14A_1Q17C
1089
+ 1Q14A_1SZCA
1090
+ 1Q14A_2OD2A
1091
+ 1Q59A_1Q59A
1092
+ 1Q59A_2WH6A
1093
+ 1Q59A_4OYDC
1094
+ 1Q59A_7P33A
1095
+ 1Q59A_7P33B
1096
+ 1Q8MA_1Q8MA
1097
+ 1Q8MA_1Q8MB
1098
+ 1Q8MA_1Q8MD
1099
+ 1Q8MA_1SMOA
1100
+ 1Q8MA_1SMOB
1101
+ 1Q90B_1VF5A
1102
+ 1Q90B_2D2CA
1103
+ 1Q90B_2E74A
1104
+ 1Q90B_7ZXYA
1105
+ 1Q90B_7ZYVA
1106
+ 1QG3A_3F7PC
1107
+ 1QG3A_3F7PD
1108
+ 1QG3A_3F7QA
1109
+ 1QG3A_3F7QB
1110
+ 1QG3A_3F7RA
1111
+ 1QMGA_1QMGA
1112
+ 1QMGA_1QMGC
1113
+ 1QMGA_3FR7A
1114
+ 1QMGA_3FR7B
1115
+ 1QMGA_3FR8A
1116
+ 1QPCA_1QPCA
1117
+ 1QPCA_2PL0A
1118
+ 1QPCA_3MPMA
1119
+ 1QPCA_4C3FA
1120
+ 1QPCA_6PDJA
1121
+ 1QYMA_1QYMA
1122
+ 1QYMA_1TR4A
1123
+ 1QYMA_4NIKA
1124
+ 1QYMA_5VHQG
1125
+ 1QYMA_5VHRG
1126
+ 1R3BA_1R3BA
1127
+ 1R3BA_5B5VA
1128
+ 1R3BA_5B6BB
1129
+ 1R3BA_5B6BH
1130
+ 1R3BA_5TWFB
1131
+ 1R4MB_1R4MD
1132
+ 1R4MB_1R4ND
1133
+ 1R4MB_1YOVB
1134
+ 1R4MB_3DBLF
1135
+ 1R4MB_3GZND
1136
+ 1R5PA_1VGLA
1137
+ 1R5PA_2QKEB
1138
+ 1R5PA_2QKEC
1139
+ 1R5PA_5JYTA
1140
+ 1R5PA_5JYVB
1141
+ 1RA6A_1RDRA
1142
+ 1RA6A_3OL6A
1143
+ 1RA6A_4K4TA
1144
+ 1RA6A_4K4WA
1145
+ 1RA6A_4NLQA
1146
+ 1RFNB_1RFNB
1147
+ 1RFNB_2WPME
1148
+ 1RFNB_3LC3B
1149
+ 1RFNB_5EGMB
1150
+ 1RFNB_7AHVL
1151
+ 1RH5B_1RHZB
1152
+ 1RH5B_3BO0B
1153
+ 1RH5B_3DKNB
1154
+ 1RH5B_4V4NA7
1155
+ 1RH5B_4V7IAB
1156
+ 1RJVA_1RJVA
1157
+ 1RJVA_1RK9A
1158
+ 1RJVA_1RTP1
1159
+ 1RJVA_2JWWA
1160
+ 1RJVA_3F45A
1161
+ 1RKRA_1RKRB
1162
+ 1RKRA_1RKRC
1163
+ 1RKRA_6L1VA
1164
+ 1RKRA_6L1VC
1165
+ 1RKRA_6L1VE
1166
+ 1RLGA_4BW0B
1167
+ 1RLGA_5G4UG
1168
+ 1RLGA_6HCTC
1169
+ 1RLGA_6HCTD
1170
+ 1RLGA_6HCTG
1171
+ 1RRMA_1RRMB
1172
+ 1RRMA_2BI4A
1173
+ 1RRMA_7QNIA
1174
+ 1RRMA_7QNJA
1175
+ 1RRMA_7R0PB
1176
+ 1RZ4A_1RZ4A
1177
+ 1RZ4A_5A5TK
1178
+ 1RZ4A_7A09K
1179
+ 1RZ4A_7QP63
1180
+ 1RZ4A_7QP73
1181
+ 1S5LB_3WU2B
1182
+ 1S5LB_5KAIB
1183
+ 1S5LB_7EDAB
1184
+ 1S5LB_7NHPB
1185
+ 1S5LB_7NHQB
1186
+ 1S5LI_1S5Li
1187
+ 1S5LI_1S5LI
1188
+ 1S5LI_3A0Bi
1189
+ 1S5LI_3A0HI
1190
+ 1S5LI_3WU2I
1191
+ 1SCMA_1WDCA
1192
+ 1SCMA_3JTDA
1193
+ 1SCMA_3PN7A
1194
+ 1SCMA_3PN7D
1195
+ 1SCMA_3TUYA
1196
+ 1SJ2A_2CCDB
1197
+ 1SJ2A_6ZJIBP1
1198
+ 1SJ2A_7A2IA
1199
+ 1SJ2A_7A7AA
1200
+ 1SJ2A_7A8ZB
1201
+ 1SQEA_1SQEA
1202
+ 1SQEA_1SQEB
1203
+ 1SQEA_2ZDPA
1204
+ 1SQEA_4FNHA
1205
+ 1SQEA_4FNHB
1206
+ 1SR9A_3FIGB
1207
+ 1SR9A_3HPSA
1208
+ 1SR9A_3HPSB
1209
+ 1SR9A_3HPZB
1210
+ 1SR9A_3HQ1A
1211
+ 1ST0A_1ST0A
1212
+ 1ST0A_1VLRA
1213
+ 1ST0A_4QEBB
1214
+ 1ST0A_5OSYA
1215
+ 1ST0A_5OSYB
1216
+ 1SXJC_1SXJC
1217
+ 1SXJC_7STBC
1218
+ 1SXJC_7STEC
1219
+ 1SXJC_7TFLC
1220
+ 1SXJC_7THJC
1221
+ 1T2KA_1T2KB
1222
+ 1T2KA_2O6GE
1223
+ 1T2KA_2O6GG
1224
+ 1T2KA_2PI0C
1225
+ 1T2KA_2PI0D
1226
+ 1T3WA_1T3WA
1227
+ 1T3WA_1T3WB
1228
+ 1T3WA_2HAJA
1229
+ 1T3WA_6CBRA
1230
+ 1T3WA_6CBSA
1231
+ 1TENA_1TENA
1232
+ 1TENA_2RB8A
1233
+ 1TENA_2RBLA
1234
+ 1TENA_2RBLB
1235
+ 1TENA_6BRBD
1236
+ 1TJGH_3IDGB
1237
+ 1TJGH_3LZFH
1238
+ 1TJGH_6URMD
1239
+ 1TJGH_7MMOA
1240
+ 1TJGH_7MU4H
1241
+ 1TW2A_1TW2B
1242
+ 1TW2A_1TW3A
1243
+ 1TW2A_7PGJA
1244
+ 1TW2A_7PHEB
1245
+ 1TW2A_7PHFB
1246
+ 1TZNA_1TZOA
1247
+ 1TZNA_6UJIA
1248
+ 1TZNA_6UZBB
1249
+ 1TZNA_6UZBG
1250
+ 1TZNA_7KXRB
1251
+ 1U4CA_1U4CB
1252
+ 1U4CA_1YFQA
1253
+ 1U4CA_2I3SC
1254
+ 1U4CA_2I3TA
1255
+ 1U4CA_4BL0D
1256
+ 1V0JA_4RPHB
1257
+ 1V0JA_4XGKA
1258
+ 1V0JA_5ER9B
1259
+ 1V0JA_5F3RA
1260
+ 1V0JA_6D9CA
1261
+ 1V76A_1V76A
1262
+ 1V76A_1V76B
1263
+ 1V76A_2KI7A
1264
+ 1V76A_2ZAEA
1265
+ 1V76A_2ZAEC
1266
+ 1V8JA_1V8JA
1267
+ 1V8JA_1V8KA
1268
+ 1V8JA_4UBFA
1269
+ 1V8JA_4UBFD
1270
+ 1V8JA_4Y05A
1271
+ 1VCLA_1VCLA
1272
+ 1VCLA_1VCLB
1273
+ 1VCLA_2Z49A
1274
+ 1VCLA_3W9TC
1275
+ 1VCLA_3W9TG
1276
+ 1VDZA_3I72A
1277
+ 1VDZA_3QIAA
1278
+ 1VDZA_3SDZA
1279
+ 1VDZA_3SE0A
1280
+ 1VDZA_5X09A
1281
+ 1VGYA_1VGYB
1282
+ 1VGYA_4O23B
1283
+ 1VGYA_4PPZA
1284
+ 1VGYA_5UEJA
1285
+ 1VGYA_5UEJB
1286
+ 1VR4A_1VR4A
1287
+ 1VR4A_1VR4C
1288
+ 1VR4A_1VR4D
1289
+ 1VR4A_1VR4E
1290
+ 1VR4A_2GTCB
1291
+ 1VVJR7_1VY4B7
1292
+ 1VVJR7_4V4JZ
1293
+ 1VVJR7_4V4XB6
1294
+ 1VVJR7_4V9LD7
1295
+ 1VVJR7_5IMRy
1296
+ 1VVJRO_1VVJRO
1297
+ 1VVJRO_4V4JI
1298
+ 1VVJRO_4V4XBN
1299
+ 1VVJRO_4W29BO
1300
+ 1VVJRO_5ZLUh
1301
+ 1VVJRT_1VVJRT
1302
+ 1VVJRT_4V4IN
1303
+ 1VVJRT_4V9HBT
1304
+ 1VVJRT_5IMRl
1305
+ 1VVJRT_5ZLUm
1306
+ 1VZSA_1VZSA
1307
+ 1VZSA_2CLYC
1308
+ 1VZSA_5ARAV
1309
+ 1VZSA_5ARHV
1310
+ 1VZSA_6J5Ic
1311
+ 1W63M_1W63N
1312
+ 1W63M_4HMYM
1313
+ 1W63M_4P6ZM
1314
+ 1W63M_6D83M
1315
+ 1W63M_8D4FJ
1316
+ 1WAQA_2BHKA
1317
+ 1WAQA_5HK5C
1318
+ 1WAQA_6Z3LA
1319
+ 1WAQA_6Z3MH
1320
+ 1WAQA_7ZJFB
1321
+ 1WDKA_1WDLA
1322
+ 1WDKA_1WDLB
1323
+ 1WDKA_1WDMB
1324
+ 1WDKA_2D3TA
1325
+ 1WDKA_2D3TB
1326
+ 1WFIA_1WFIA
1327
+ 1WFIA_2CR0A
1328
+ 1WFIA_3QORA
1329
+ 1WFIA_3QORC
1330
+ 1WFIA_3QORE
1331
+ 1X03A_1X03A
1332
+ 1X03A_1X04A
1333
+ 1X03A_2C08A
1334
+ 1X03A_2D4CA
1335
+ 1X03A_2D4CB
1336
+ 1X0JA_2L5EA
1337
+ 1X0JA_4A9OB
1338
+ 1X0JA_6C7RA
1339
+ 1X0JA_7TO7B
1340
+ 1X0JA_7TUQB
1341
+ 1X79B_1X79C
1342
+ 1X79B_4N3YB
1343
+ 1X79B_4N3YC
1344
+ 1X79B_4N3ZB
1345
+ 1X79B_4N3ZC
1346
+ 1XF1A_1XF1A
1347
+ 1XF1A_1XF1B
1348
+ 1XF1A_3EIFA
1349
+ 1XF1A_7BJ3A
1350
+ 1XF1A_7YZXA
1351
+ 1XFUA_1XFUA
1352
+ 1XFUA_1XFVD
1353
+ 1XFUA_6UZEH
1354
+ 1XFUA_6UZEI
1355
+ 1XFUA_6VRAI
1356
+ 1XJSA_1XJSA
1357
+ 1XJSA_2AZHA
1358
+ 1XJSA_5XT6D
1359
+ 1XJSA_6JZVB
1360
+ 1XJSA_6JZWA
1361
+ 1XU7A_1XU9A
1362
+ 1XU7A_2BELB
1363
+ 1XU7A_2BELD
1364
+ 1XU7A_2RBED
1365
+ 1XU7A_3PDJA
1366
+ 1YBIA_1YBIA
1367
+ 1YBIA_3WINA
1368
+ 1YBIA_4LO0A
1369
+ 1YBIA_4OUJA
1370
+ 1YBIA_4QD2H
1371
+ 1YEWC_1YEWC
1372
+ 1YEWC_1YEWK
1373
+ 1YEWC_7S4HC
1374
+ 1YEWC_7T4PC
1375
+ 1YEWC_7YZYC
1376
+ 1YH3A_2I66A
1377
+ 1YH3A_3F6YA
1378
+ 1YH3A_3OFSC
1379
+ 1YH3A_6EDRA
1380
+ 1YH3A_8IL3C
1381
+ 1YJ5A_1YJ5A
1382
+ 1YJ5A_3U7GA
1383
+ 1YJ5A_3ZVLA
1384
+ 1YJ5A_3ZVMA
1385
+ 1YJ5A_3ZVMB
1386
+ 1YKEB_1YKHB
1387
+ 1YKEB_5OQMn
1388
+ 1YKEB_5SVAW
1389
+ 1YKEB_7UI9u
1390
+ 1YKEB_8CEOn
1391
+ 1YMTA_1YP0A
1392
+ 1YMTA_1ZDTA
1393
+ 1YMTA_1ZDTB
1394
+ 1YMTA_3F7DA
1395
+ 1YMTA_7KHTA
1396
+ 1YQ3A_1ZOYA
1397
+ 1YQ3A_2H89A
1398
+ 1YQ3A_6VAXA
1399
+ 1YQ3A_6VAXC
1400
+ 1YQ3A_8GS8A
1401
+ 1YVUA_1YVUA
1402
+ 1YVUA_2F8SA
1403
+ 1YVUA_2F8SB
1404
+ 1YVUA_2F8TB
1405
+ 1YVUA_2NUBA
1406
+ 1YWTA_3SPRA
1407
+ 1YWTA_5OMAD
1408
+ 1YWTA_6QZRH
1409
+ 1YWTA_6T5FC
1410
+ 1YWTA_7NIXA
1411
+ 1YWUA_1YWUA
1412
+ 1YWUA_2L74A
1413
+ 1YWUA_5XLYB
1414
+ 1YWUA_5Y4RC
1415
+ 1YWUA_5Y4RD
1416
+ 1Z3AA_1Z3AA
1417
+ 1Z3AA_8E2PC
1418
+ 1Z3AA_8E2RB
1419
+ 1Z3AA_8E2SE
1420
+ 1Z3AA_8E2SH
1421
+ 1Z4VA_1Z4VA
1422
+ 1Z4VA_4JF7A
1423
+ 1Z4VA_4JF7B
1424
+ 1Z4VA_4JF7C
1425
+ 1Z4VA_4JF7D
1426
+ 1Z7PA_1Z7PA
1427
+ 1Z7PA_1Z7RA
1428
+ 1Z7PA_2E7PA
1429
+ 1Z7PA_2E7PB
1430
+ 1Z7PA_2E7PC
1431
+ 1Z98A_2B5FA
1432
+ 1Z98A_2B5FB
1433
+ 1Z98A_3CLLA
1434
+ 1Z98A_3CN6A
1435
+ 1Z98A_4IA4D
1436
+ 1ZIWA_1ZIWA
1437
+ 1ZIWA_2A0ZA
1438
+ 1ZIWA_3ULUA
1439
+ 1ZIWA_3ULVA
1440
+ 1ZIWA_5GS0B
1441
+ 1ZKKA_1ZKKA
1442
+ 1ZKKA_4IJ8A
1443
+ 1ZKKA_5V2NA
1444
+ 1ZKKA_5W1YA
1445
+ 1ZKKA_6BOZA
1446
+ 1ZO1I_1ZO1I
1447
+ 1ZO1I_3JCJf
1448
+ 1ZO1I_5ME0W
1449
+ 1ZO1I_6O7Kf
1450
+ 1ZO1I_6O9Kz
1451
+ 1ZTYA_1ZTYA
1452
+ 1ZTYA_1ZU0A
1453
+ 1ZTYA_4GF8A
1454
+ 1ZTYA_5YQWA
1455
+ 1ZTYA_6LZWA
1456
+ 1ZWMA_2M3UA
1457
+ 1ZWMA_6FD8A
1458
+ 1ZWMA_6IF9A
1459
+ 1ZWMA_6MYHA
1460
+ 1ZWMA_6MYHB
1461
+ 2A19B_2A19B
1462
+ 2A19B_2A1AB
1463
+ 2A19B_3UIUB
1464
+ 2A19B_6D3KA
1465
+ 2A19B_6D3KB
1466
+ 2A73A_2XWJE
1467
+ 2A73A_3G6JC
1468
+ 2A73A_6RU5A
1469
+ 2A73A_7AKKB
1470
+ 2A73A_7ZGJA
1471
+ 2AHME_2AHME
1472
+ 2AHME_2AHMG
1473
+ 2AHME_6XEZB
1474
+ 2AHME_6YYTD
1475
+ 2AHME_8GWEB
1476
+ 2B0SH_4NRXD
1477
+ 2B0SH_6DCWH
1478
+ 2B0SH_7M7WC
1479
+ 2B0SH_7M7WE
1480
+ 2B0SH_7N3EH
1481
+ 2B3TB_2B3TB
1482
+ 2B3TB_6DNCMA
1483
+ 2B3TB_6GXNv
1484
+ 2B3TB_7M5DA
1485
+ 2B3TB_8AKNW
1486
+ 2B8WA_2B8WB
1487
+ 2B8WA_2B92B
1488
+ 2B8WA_2BC9A
1489
+ 2B8WA_2D4HA
1490
+ 2B8WA_2D4HB
1491
+ 2BBJA_3JCGA
1492
+ 2BBJA_3JCHA
1493
+ 2BBJA_3JCHB
1494
+ 2BBJA_3JCHD
1495
+ 2BBJA_4I0UD
1496
+ 2BDEA_2BDEA
1497
+ 2BDEA_4G63A
1498
+ 2BDEA_4OHFA
1499
+ 2BDEA_4OHFB
1500
+ 2BDEA_4OHFC
1501
+ 2BISA_2BISB
1502
+ 2BISA_3FROA
1503
+ 2BISA_3FROB
1504
+ 2BISA_3FROC
1505
+ 2BISA_3L01B
1506
+ 2BJNA_2BJNA
1507
+ 2BJNA_2BJNB
1508
+ 2BJNA_2CFHC
1509
+ 2BJNA_2CFHD
1510
+ 2BJNA_3KXCC
1511
+ 2BVAA_2BVAA
1512
+ 2BVAA_2F57B
1513
+ 2BVAA_2J0IA
1514
+ 2BVAA_2Q0NA
1515
+ 2BVAA_5XVGA
1516
+ 2BW3A_2BW3A
1517
+ 2BW3A_4D1QA
1518
+ 2BW3A_4D1QB
1519
+ 2BW3A_4D1QG
1520
+ 2BW3A_4D1QH
1521
+ 2C35B_2C35B
1522
+ 2C35B_6DRDG
1523
+ 2C35B_6XREG
1524
+ 2C35B_7ENAPG
1525
+ 2C35B_7LBMG
1526
+ 2C6FA_2XYDA
1527
+ 2C6FA_5AM8D
1528
+ 2C6FA_5AM9B
1529
+ 2C6FA_7Q49A
1530
+ 2C6FA_7Q4DA
1531
+ 2CA5A_2CA5A
1532
+ 2CA5A_2CA5B
1533
+ 2CA5A_8AXKh
1534
+ 2CA5A_8AXKS
1535
+ 2CA5A_8AXKU
1536
+ 2CDEA_2EYSA
1537
+ 2CDEA_3O8XC
1538
+ 2CDEA_3TZVA
1539
+ 2CDEA_6TROD
1540
+ 2CDEA_6XNGC
1541
+ 2CEOA_2CEOA
1542
+ 2CEOA_2CEOB
1543
+ 2CEOA_2RIWA
1544
+ 2CEOA_2XN5A
1545
+ 2CEOA_4X30A
1546
+ 2CEXA_2CEXA
1547
+ 2CEXA_2CEXC
1548
+ 2CEXA_2CEXD
1549
+ 2CEXA_2V4CA
1550
+ 2CEXA_2WYKA
1551
+ 2CH7A_2CH7A
1552
+ 2CH7A_3JA6G
1553
+ 2CH7A_3JA6H
1554
+ 2CH7A_3JA6Q
1555
+ 2CH7A_3JA6R
1556
+ 2CKZC_2CKZC
1557
+ 2CKZC_5FJ8D
1558
+ 2CKZC_6EU0D
1559
+ 2CKZC_7Z1MD
1560
+ 2CKZC_7Z30D
1561
+ 2COMA_2COMA
1562
+ 2COMA_2L3DA
1563
+ 2COMA_6E1FA
1564
+ 2COMA_6E1FB
1565
+ 2COMA_6E1FD
1566
+ 2CQBA_2KYXA
1567
+ 2CQBA_6ICZy
1568
+ 2CQBA_6ID1y
1569
+ 2CQBA_7ZEXA
1570
+ 2CQBA_7ZEZA
1571
+ 2CSKA_2CSKA
1572
+ 2CSKA_2MXCA
1573
+ 2CSKA_5F0JC
1574
+ 2CSKA_5F0LC
1575
+ 2CSKA_5F0PC
1576
+ 2CY9A_2CY9A
1577
+ 2CY9A_2F0XA
1578
+ 2CY9A_2H4UD
1579
+ 2CY9A_3F5OC
1580
+ 2CY9A_3F5OH
1581
+ 2D9IA_2D9IA
1582
+ 2D9IA_2VKCA
1583
+ 2D9IA_3FAUB
1584
+ 2D9IA_3FAUC
1585
+ 2D9IA_3FAUD
1586
+ 2DGRA_2DGRA
1587
+ 2DGRA_5WWXA
1588
+ 2DGRA_5WWZA
1589
+ 2DGRA_5WWZB
1590
+ 2DGRA_5WWZC
1591
+ 2DMJA_2DMJA
1592
+ 2DMJA_2L30A
1593
+ 2DMJA_3OD8A
1594
+ 2DMJA_3OD8F
1595
+ 2DMJA_4DQYD
1596
+ 2DMWA_2DMWA
1597
+ 2DMWA_2VX8A
1598
+ 2DMWA_2VX8B
1599
+ 2DMWA_2VX8C
1600
+ 2DMWA_2VX8D
1601
+ 2DNVA_2DNVA
1602
+ 2DNVA_3GV6A
1603
+ 2DNVA_3I91A
1604
+ 2DNVA_3I91B
1605
+ 2DNVA_5EQ0A
1606
+ 2DU9A_2DU9A
1607
+ 2DU9A_2EK5A
1608
+ 2DU9A_2EK5B
1609
+ 2DU9A_2EK5C
1610
+ 2DU9A_2EK5D
1611
+ 2DUTA_2DUTA
1612
+ 2DUTA_2DUTC
1613
+ 2DUTA_2DUTD
1614
+ 2DUTA_2H4PA
1615
+ 2DUTA_2H4QA
1616
+ 2E5QA_2E5QA
1617
+ 2E5QA_4BD3A
1618
+ 2E5QA_6WAUA
1619
+ 2E5QA_6WAUB
1620
+ 2E5QA_6WAUD
1621
+ 2EKOA_2EKOA
1622
+ 2EKOA_4QQGA
1623
+ 2EKOA_4QQGE
1624
+ 2EKOA_4QQGF
1625
+ 2EKOA_4QQGG
1626
+ 2ELBA_2ELBA
1627
+ 2ELBA_2Q13A
1628
+ 2ELBA_2Z0OA
1629
+ 2ELBA_5C5BA
1630
+ 2ELBA_5C5BC
1631
+ 2EWVA_2EWWA
1632
+ 2EWVA_2GSZB
1633
+ 2EWVA_2GSZC
1634
+ 2EWVA_2GSZD
1635
+ 2EWVA_2GSZE
1636
+ 2F1CX_2IWVA
1637
+ 2F1CX_2IWWB
1638
+ 2F1CX_2JQYA
1639
+ 2F1CX_5MWVA
1640
+ 2F1CX_6OQHA
1641
+ 2F1TA_2F1TA
1642
+ 2F1TA_2F1TC
1643
+ 2F1TA_2F1VA
1644
+ 2F1TA_2F1VB
1645
+ 2F1TA_2MHLA
1646
+ 2F3IA_2F3IA
1647
+ 2F3IA_5FLMH
1648
+ 2F3IA_6XREH
1649
+ 2F3IA_7ASTD
1650
+ 2F3IA_8IUEH
1651
+ 2F8XC_2F8XC
1652
+ 2F8XC_3IAGC
1653
+ 2F8XC_5E24E
1654
+ 2F8XC_5E24F
1655
+ 2F8XC_6PY8E
1656
+ 2FA4A_2FA4A
1657
+ 2FA4A_2FA4B
1658
+ 2FA4A_2HSYA
1659
+ 2FA4A_3PINA
1660
+ 2FA4A_7BVVB
1661
+ 2FMYA_2FMYA
1662
+ 2FMYA_2FMYB
1663
+ 2FMYA_2FMYC
1664
+ 2FMYA_2FMYD
1665
+ 2FMYA_2HKXA
1666
+ 2FPDA_2FPDD
1667
+ 2FPDA_2FPED
1668
+ 2FPDA_7NYKD
1669
+ 2FPDA_7NYLB
1670
+ 2FPDA_7NYMD
1671
+ 2FSSA_2J6ID
1672
+ 2FSSA_6D4BA
1673
+ 2FSSA_6D4BB
1674
+ 2FSSA_8HTYC
1675
+ 2FSSA_8IQ7D
1676
+ 2FTXA_2FTXA
1677
+ 2FTXA_2FV4A
1678
+ 2FTXA_5T6JB
1679
+ 2FTXA_5TCSD
1680
+ 2FTXA_7KDFD
1681
+ 2FXMA_2FXMA
1682
+ 2FXMA_2FXOA
1683
+ 2FXMA_2FXOB
1684
+ 2FXMA_2FXOC
1685
+ 2FXMA_2FXOD
1686
+ 2GC9A_2GC9B
1687
+ 2GC9A_2W2AA
1688
+ 2GC9A_2W2BA
1689
+ 2GC9A_2W2FB
1690
+ 2GC9A_2WSJB
1691
+ 2GESA_2GETA
1692
+ 2GESA_2ZS7A
1693
+ 2GESA_5XLVB
1694
+ 2GESA_5XLWB
1695
+ 2GESA_5XMBC
1696
+ 2GHWB_2GHWB
1697
+ 2GHWB_5WYMA
1698
+ 2GHWB_5WYMC
1699
+ 2GHWB_7AH1A
1700
+ 2GHWB_7ZCEE
1701
+ 2GI7A_5OU7A
1702
+ 2GI7A_5OU7D
1703
+ 2GI7A_7NMUA
1704
+ 2GI7A_7NMUB
1705
+ 2GI7A_7R58A
1706
+ 2GJ2A_2GJ2A
1707
+ 2GJ2A_2GJ2B
1708
+ 2GJ2A_2GJIA
1709
+ 2GJ2A_2ZUGA
1710
+ 2GJ2A_2ZUGB
1711
+ 2GTYA_2GTYA
1712
+ 2GTYA_3LKYA
1713
+ 2GTYA_3LL0A
1714
+ 2GTYA_7RIAB
1715
+ 2GTYA_7RIBA
1716
+ 2GX9A_3EE8A
1717
+ 2GX9A_3O9TA
1718
+ 2GX9A_3O9UB
1719
+ 2GX9A_6E5UV
1720
+ 2GX9A_6NU0A
1721
+ 2H3KA_2H3KA
1722
+ 2H3KA_3OVUB
1723
+ 2H3KA_3S48A
1724
+ 2H3KA_3S48B
1725
+ 2H3KA_4WJGX
1726
+ 2H57A_2H57A
1727
+ 2H57A_2H57B
1728
+ 2H57A_2H57C
1729
+ 2H57A_6VBV3
1730
+ 2H57A_6VOAA
1731
+ 2H5EA_2H5EA
1732
+ 2H5EA_2H5EB
1733
+ 2H5EA_4V85AW
1734
+ 2H5EA_4V8OAY
1735
+ 2H5EA_6GXPw
1736
+ 2HAUA_2HAVA
1737
+ 2HAUA_3S9MC
1738
+ 2HAUA_5DYHA
1739
+ 2HAUA_6D03D
1740
+ 2HAUA_6SOZC
1741
+ 2HDPA_2HDPA
1742
+ 2HDPA_2HDPB
1743
+ 2HDPA_2VJFA
1744
+ 2HDPA_6SQPA
1745
+ 2HDPA_7THLC
1746
+ 2HFDA_2HFDA
1747
+ 2HFDA_2QGVA
1748
+ 2HFDA_2QGVE
1749
+ 2HFDA_2QGVH
1750
+ 2HFDA_2QGVI
1751
+ 2HG0A_2HG0A
1752
+ 2HG0A_2I69A
1753
+ 2HG0A_3IYWA
1754
+ 2HG0A_3IYWC
1755
+ 2HG0A_7KV9A
1756
+ 2HJ8A_2HJ8A
1757
+ 2HJ8A_3PHXB
1758
+ 2HJ8A_5TL6C
1759
+ 2HJ8A_6XA9D
1760
+ 2HJ8A_6XA9F
1761
+ 2HLDI_3OE7I
1762
+ 2HLDI_3ZRYI
1763
+ 2HLDI_6CP6I
1764
+ 2HLDI_7TK5I
1765
+ 2HLDI_7TKSI
1766
+ 2HVQA_2HVQA
1767
+ 2HVQA_2HVRA
1768
+ 2HVQA_2HVRB
1769
+ 2HVQA_2HVSA
1770
+ 2HVQA_2HVSB
1771
+ 2I62A_2IIPD
1772
+ 2I62A_5XVKB
1773
+ 2I62A_7EGUA
1774
+ 2I62A_7EHZA
1775
+ 2I62A_7WMCA
1776
+ 2ID5A_2ID5A
1777
+ 2ID5A_2ID5B
1778
+ 2ID5A_2ID5C
1779
+ 2ID5A_2ID5D
1780
+ 2ID5A_4OQTA
1781
+ 2IW3A_2IW3B
1782
+ 2IW3A_2IWHA
1783
+ 2IW3A_2IX3A
1784
+ 2IW3A_2IX8A
1785
+ 2IW3A_7B7DEF
1786
+ 2IYFA_2IYFB
1787
+ 2IYFA_4M7PA
1788
+ 2IYFA_4M83B
1789
+ 2IYFA_7XX4A
1790
+ 2IYFA_7XX4B
1791
+ 2J28I_2J28I
1792
+ 2J28I_4V70BI
1793
+ 2J28I_4V74BI
1794
+ 2J28I_4V7ABI
1795
+ 2J28I_6O9KI
1796
+ 2JJNA_2JJNA
1797
+ 2JJNA_2JJOA
1798
+ 2JJNA_2JJPA
1799
+ 2JJNA_2WIOA
1800
+ 2JJNA_2XFHA
1801
+ 2JJSC_2JJTC
1802
+ 2JJSC_2VSCC
1803
+ 2JJSC_2VSCD
1804
+ 2JJSC_7MYZD
1805
+ 2JJSC_7XJFC
1806
+ 2JK4A_2K4TA
1807
+ 2JK4A_5JDPA
1808
+ 2JK4A_5XDOA
1809
+ 2JK4A_6TIRA
1810
+ 2JK4A_7QI2A
1811
+ 2JLNA_2JLNA
1812
+ 2JLNA_2X79A
1813
+ 2JLNA_4D1AA
1814
+ 2JLNA_4D1BA
1815
+ 2JLNA_4D1DA
1816
+ 2JX0A_2JX0A
1817
+ 2JX0A_6IUHA
1818
+ 2JX0A_6IUIB
1819
+ 2JX0A_6JMUA
1820
+ 2JX0A_6JMUB
1821
+ 2K07A_2K07A
1822
+ 2K07A_2Z6OA
1823
+ 2K07A_2Z6PA
1824
+ 2K07A_7NW1A
1825
+ 2K07A_7OVCA
1826
+ 2K0AA_2K0AA
1827
+ 2K0AA_5GM6J
1828
+ 2K0AA_5NRLS
1829
+ 2K0AA_5ZWM5
1830
+ 2K0AA_7DCO5
1831
+ 2KCAA_2KCAA
1832
+ 2KCAA_5A20E
1833
+ 2KCAA_5A20F
1834
+ 2KCAA_5A21E
1835
+ 2KCAA_7Z4W1
1836
+ 2KGKA_2KGKA
1837
+ 2KGKA_3FL8A
1838
+ 2KGKA_3FL9C
1839
+ 2KGKA_3SA2A
1840
+ 2KGKA_8SSXA
1841
+ 2KIHA_2L0JC
1842
+ 2KIHA_2L0JD
1843
+ 2KIHA_2N70B
1844
+ 2KIHA_2RLFA
1845
+ 2KIHA_6OUGH
1846
+ 2KL3A_2KL3A
1847
+ 2KL3A_3ILMA
1848
+ 2KL3A_3ILMB
1849
+ 2KL3A_3ILMC
1850
+ 2KL3A_3ILMD
1851
+ 2KRTA_2KRTA
1852
+ 2KRTA_3JVCA
1853
+ 2KRTA_3JVCB
1854
+ 2KRTA_3JVCC
1855
+ 2KRTA_3K63A
1856
+ 2KS9A_2KSAA
1857
+ 2KS9A_7P00R
1858
+ 2KS9A_7P02R
1859
+ 2KS9A_7RMGR
1860
+ 2KS9A_7RMHR
1861
+ 2KXFA_2KXFA
1862
+ 2KXFA_2KXHA
1863
+ 2KXFA_5KW1B
1864
+ 2KXFA_5KWQB
1865
+ 2KXFA_7Q8AB
1866
+ 2LZGA_2LZGA
1867
+ 2LZGA_2MPSA
1868
+ 2LZGA_2RUHA
1869
+ 2LZGA_4HBMB
1870
+ 2LZGA_4UE1B
1871
+ 2M5RA_2M5RA
1872
+ 2M5RA_5ZWSA
1873
+ 2M5RA_5ZWTB
1874
+ 2M5RA_6SGAFc
1875
+ 2M5RA_7AM2CC
1876
+ 2MOEA_2MOEA
1877
+ 2MOEA_3VXVA
1878
+ 2MOEA_3VXXA
1879
+ 2MOEA_3VYQD
1880
+ 2MOEA_4LG7A
1881
+ 2MUQA_2MUQA
1882
+ 2MUQA_2MURA
1883
+ 2MUQA_3WWQC
1884
+ 2MUQA_3WWQF
1885
+ 2MUQA_3WWQI
1886
+ 2MY1A_2MY1A
1887
+ 2MY1A_5GM6T
1888
+ 2MY1A_5LJ3L
1889
+ 2MY1A_5MPSL
1890
+ 2MY1A_5WSGT
1891
+ 2MYIA_2MYIA
1892
+ 2MYIA_4JG3A
1893
+ 2MYIA_6O1LR
1894
+ 2MYIA_6O1MS
1895
+ 2MYIA_8BVJA
1896
+ 2MZEA_2MZEA
1897
+ 2MZEA_2MZHA
1898
+ 2MZEA_2MZIA
1899
+ 2MZEA_5UE2A
1900
+ 2MZEA_5UE5A
1901
+ 2N4PA_2N4PA
1902
+ 2N4PA_5X4FA
1903
+ 2N4PA_6B1GA
1904
+ 2N4PA_6B1GB
1905
+ 2N4PA_6T4BC
1906
+ 2N5ZA_2N5ZA
1907
+ 2N5ZA_4OW1A
1908
+ 2N5ZA_4OW1E
1909
+ 2N5ZA_4OW1S
1910
+ 2N5ZA_4OW1T
1911
+ 2NBMA_2NBMA
1912
+ 2NBMA_2NBNA
1913
+ 2NBMA_2QZTA
1914
+ 2NBMA_2QZTB
1915
+ 2NBMA_3BDQA
1916
+ 2NNWA_3NMUA
1917
+ 2NNWA_4BY9C
1918
+ 2NNWA_4BY9F
1919
+ 2NNWA_4BY9I
1920
+ 2NNWA_4BY9L
1921
+ 2NO8A_2NO8A
1922
+ 2NO8A_2WPTA
1923
+ 2NO8A_3U43A
1924
+ 2NO8A_6EREC
1925
+ 2NO8A_6ERED
1926
+ 2NOQG_2NOQG
1927
+ 2NOQG_4V7RDA
1928
+ 2NOQG_6OIGz
1929
+ 2NOQG_6XIQL1
1930
+ 2NOQG_8BQXBT
1931
+ 2NOQH_2NOQH
1932
+ 2NOQH_3J0Qk
1933
+ 2NOQH_3J6X51
1934
+ 2NOQH_4V7HBJ
1935
+ 2NOQH_7MPJAJ
1936
+ 2NXWA_2Q5JB
1937
+ 2NXWA_2Q5LA
1938
+ 2NXWA_2Q5LB
1939
+ 2NXWA_2Q5OB
1940
+ 2NXWA_2Q5QB
1941
+ 2NYTA_2NYTA
1942
+ 2NYTA_2NYTB
1943
+ 2NYTA_2NYTC
1944
+ 2NYTA_2NYTD
1945
+ 2NYTA_2RPZA
1946
+ 2O013_2O013
1947
+ 2O013_2WSC3
1948
+ 2O013_2WSE3
1949
+ 2O013_2WSF3
1950
+ 2O013_3LW53
1951
+ 2O01L_2O01L
1952
+ 2O01L_2WSFL
1953
+ 2O01L_3LW5L
1954
+ 2O01L_6YEZL
1955
+ 2O01L_6ZXSL
1956
+ 2O66A_2O66A
1957
+ 2O66A_2O66B
1958
+ 2O66A_2O67B
1959
+ 2O66A_2RD5C
1960
+ 2O66A_2RD5D
1961
+ 2O7KA_2O7KA
1962
+ 2O7KA_2O85A
1963
+ 2O7KA_2O89A
1964
+ 2O7KA_3DIEA
1965
+ 2O7KA_3DIEB
1966
+ 2O8BB_2O8BB
1967
+ 2O8BB_2O8DB
1968
+ 2O8BB_2O8EB
1969
+ 2O8BB_2O8FB
1970
+ 2O8BB_8AG6B
1971
+ 2OAYA_2OAYA
1972
+ 2OAYA_5DU3A
1973
+ 2OAYA_5DU3B
1974
+ 2OAYA_5DUQB
1975
+ 2OAYA_7AKVA
1976
+ 2OGWA_2OSVA
1977
+ 2OGWA_2PRSB
1978
+ 2OGWA_2XH8A
1979
+ 2OGWA_4BBPA
1980
+ 2OGWA_7RCJF
1981
+ 2OIVA_5JP1A
1982
+ 2OIVA_5JP3A
1983
+ 2OIVA_5JP3C
1984
+ 2OIVA_5JP3E
1985
+ 2OIVA_5JP3G
1986
+ 2OLUA_2OLVA
1987
+ 2OLUA_2OLVB
1988
+ 2OLUA_3DWKA
1989
+ 2OLUA_3DWKB
1990
+ 2OLUA_3DWKC
1991
+ 2OZ6A_7FEWB
1992
+ 2OZ6A_7FF0B
1993
+ 2OZ6A_7FF9E
1994
+ 2OZ6A_7FF9F
1995
+ 2OZ6A_7FF9G
1996
+ 2P4VA_2P4VA
1997
+ 2P4VA_2P4VE
1998
+ 2P4VA_6RI7F
1999
+ 2P4VA_6RI7G
2000
+ 2P4VA_6RINF
2001
+ 2PA8D_2PA8D
2002
+ 2PA8D_2WAQD
2003
+ 2PA8D_2WB1D
2004
+ 2PA8D_3HKZD
2005
+ 2PA8D_4V8SAS
2006
+ 2PM6A_2PM6A
2007
+ 2PM6A_2PM7C
2008
+ 2PM6A_3MZLB
2009
+ 2PM6A_3MZLF
2010
+ 2PM6A_3MZLH
2011
+ 2PMZP_2WAQP
2012
+ 2PMZP_3HKZP
2013
+ 2PMZP_4AYBP
2014
+ 2PMZP_7OQ4P
2015
+ 2PMZP_7OQYP
2016
+ 2PROA_2PROA
2017
+ 2PROA_2PROB
2018
+ 2PROA_3PROC
2019
+ 2PROA_3PROD
2020
+ 2PROA_4PROD
2021
+ 2Q3EA_2Q3EA
2022
+ 2Q3EA_3ITKE
2023
+ 2Q3EA_3ITKF
2024
+ 2Q3EA_3TF5A
2025
+ 2Q3EA_4EDFC
2026
+ 2QL2A_2YPAB
2027
+ 2QL2A_6MGNA
2028
+ 2QL2A_6OD3G
2029
+ 2QL2A_6OD3H
2030
+ 2QL2A_6OD3I
2031
+ 2QM4A_2R9AA
2032
+ 2QM4A_7NFCQ
2033
+ 2QM4A_7NFCR
2034
+ 2QM4A_7NFEF
2035
+ 2QM4A_7NFEG
2036
+ 2QTSA_2QTSA
2037
+ 2QTSA_2QTSB
2038
+ 2QTSA_2QTSC
2039
+ 2QTSA_6L6IA
2040
+ 2QTSA_7CFSA
2041
+ 2QUOA_2QUOA
2042
+ 2QUOA_3X29B
2043
+ 2QUOA_7KP4B
2044
+ 2QUOA_7TDMB
2045
+ 2QUOA_7TDNB
2046
+ 2R4RA_2R4RA
2047
+ 2R4RA_3KJ6A
2048
+ 2R4RA_6KR8A
2049
+ 2R4RA_6NI3R
2050
+ 2R4RA_7BZ2R
2051
+ 2R84A_2R85A
2052
+ 2R84A_2R85B
2053
+ 2R84A_2R86A
2054
+ 2R84A_2R87A
2055
+ 2R84A_2R87C
2056
+ 2R8VA_2R8VA
2057
+ 2R8VA_3D2MA
2058
+ 2R8VA_3D2PA
2059
+ 2R8VA_3D2PB
2060
+ 2R8VA_4I49A
2061
+ 2RCNA_2RCNA
2062
+ 2RCNA_2YKRW
2063
+ 2RCNA_4A2IV
2064
+ 2RCNA_5NO2Z
2065
+ 2RCNA_5UZ4Z
2066
+ 2RFJA_2RFJA
2067
+ 2RFJA_2RFJB
2068
+ 2RFJA_2WP2A
2069
+ 2RFJA_5VBRB
2070
+ 2RFJA_8CZAA
2071
+ 2RFTB_2RFTB
2072
+ 2RFTB_4FQKD
2073
+ 2RFTB_4FQMJ
2074
+ 2RFTB_4NKJA
2075
+ 2RFTB_6CNVB
2076
+ 2RVJA_2RVJA
2077
+ 2RVJA_7VKJA
2078
+ 2RVJA_7VKJF
2079
+ 2RVJA_7VKJG
2080
+ 2RVJA_7VKJH
2081
+ 2UY6B_2UY6B
2082
+ 2UY6B_2UY6C
2083
+ 2UY6B_2UY7B
2084
+ 2UY6B_2UY7H
2085
+ 2UY6B_5FLUA
2086
+ 2UZXB_2UZYB
2087
+ 2UZXB_2UZYD
2088
+ 2UZXB_6GCUD
2089
+ 2UZXB_7MO7B
2090
+ 2UZXB_7MO7E
2091
+ 2V6XA_2V6XA
2092
+ 2V6XA_4NIQA
2093
+ 2V6XA_5FVKA
2094
+ 2V6XA_5FVKB
2095
+ 2V6XA_5FVLA
2096
+ 2V7FA_2V7FA
2097
+ 2V7FA_4V4NBU
+ 2V7FA_4V6UAU
+ 2V7FA_6SW9U
+ 2V7FA_6TMFV
+ 2VDCG_2VDCH
+ 2VDCG_6S6SG
+ 2VDCG_6S6SH
+ 2VDCG_6S6TE
+ 2VDCG_6S6TG
+ 2VJEB_2VJED
+ 2VJEB_2VJFB
+ 2VJEB_2VJFD
+ 2VJEB_5MNJH
+ 2VJEB_7MLAA
+ 2VSMA_3D11A
+ 2VSMA_7TXZA
+ 2VSMA_7TXZB
+ 2VSMA_7TY0C
+ 2VSMA_7TY0D
+ 2VTVA_2VTVA
+ 2VTVA_2X76A
+ 2VTVA_4BRSA
+ 2VTVA_4BVLB
+ 2VTVA_5MLXA
+ 2VXQH_5UBZH
+ 2VXQH_6GLXC
+ 2VXQH_6NIPA
+ 2VXQH_6NIPH
+ 2VXQH_6R2SB
+ 2WAQQ_2WAQQ
+ 2WAQQ_3HKZY
+ 2WAQQ_3HKZZ
+ 2WAQQ_4AYBQ
+ 2WAQQ_4V8SBQ
+ 2WG5A_2WG5C
+ 2WG5A_2WG5G
+ 2WG5A_2WG5L
+ 2WG5A_2WG6A
+ 2WG5A_2WG6H
+ 2WKBA_2WKBA
+ 2WKBA_2WKBE
+ 2WKBA_2WKBF
+ 2WKBA_3GACA
+ 2WKBA_3GADF
+ 2WPDJ_3U2FM
+ 2WPDJ_5LQXN
+ 2WPDJ_6WTDK
+ 2WPDJ_6WTDL
+ 2WPDJ_6WTDO
+ 2WQMA_2WQNA
+ 2WQMA_5DE2B
+ 2WQMA_6GT1A
+ 2WQMA_6S76C
+ 2WQMA_6S76D
+ 2WVRA_2ZXXA
+ 2WVRA_2ZXXB
+ 2WVRA_2ZXXD
+ 2WVRA_2ZXXE
+ 2WVRA_4BRYA
+ 2WW9B_2WW9B
+ 2WW9B_2WWAB
+ 2WW9B_6N3QC
+ 2WW9B_7AFTC
+ 2WW9B_7KB5C
+ 2WWBB_2WWBB
+ 2WWBB_3J7Q2
+ 2WWBB_4CG6B
+ 2WWBB_8B6Cq
+ 2WWBB_8B6LC
+ 2WXWA_2WXWA
+ 2WXWA_2X0BB
+ 2WXWA_2X0BF
+ 2WXWA_6I3FA
+ 2WXWA_6I3IA
+ 2X2ZA_2X2ZA
+ 2X2ZA_2X2ZD
+ 2X2ZA_2Y8RA
+ 2X2ZA_2Y8SD
+ 2X2ZA_2Y8TD
+ 2X4IA_2X4IA
+ 2X4IA_2X4IC
+ 2X4IA_2X4ID
+ 2X4IA_6SCFA
+ 2X4IA_6SCFF
+ 2X4MA_2X4MA
+ 2X4MA_2X4MC
+ 2X4MA_2X4MD
+ 2X4MA_2X55A
+ 2X4MA_4DCBA
+ 2X5SA_2X5SA
+ 2X5SA_2X5ZB
+ 2X5SA_2X60A
+ 2X5SA_2X60B
+ 2X5SA_2X65A
+ 2XFBB_2XFBB
+ 2XFBB_2XFBE
+ 2XFBB_2XFBG
+ 2XFBB_2XFBI
+ 2XFBB_5VU2D
+ 2XIKA_3GGFA
+ 2XIKA_3ZHPC
+ 2XIKA_4FZAB
+ 2XIKA_4QMQA
+ 2XIKA_4QO9B
+ 2XJYB_2XJYB
+ 2XJYB_2XJZI
+ 2XJYB_2XJZJ
+ 2XJYB_2XJZM
+ 2XJYB_2YPAD
+ 2XNDJ_6TT71
+ 2XNDJ_7AJFAM
+ 2XNDJ_7AJFAO
+ 2XNDJ_7AJFAP
+ 2XNDJ_7AJFAQ
+ 2XS2A_2XS2A
+ 2XS2A_2XS5A
+ 2XS2A_2XS5B
+ 2XS2A_2XS7A
+ 2XS2A_2XSFA
+ 2XSAA_2XSAA
+ 2XSAA_2XSBA
+ 2XSAA_7KHSB
+ 2XSAA_7KHSC
+ 2XSAA_7KHSD
+ 2XT3A_2XT3A
+ 2XT3A_4A14A
+ 2XT3A_6MLQC
+ 2XT3A_6MLRC
+ 2XT3A_7RX0C
+ 2XTBA_2XTBA
+ 2XTBA_3OTXB
+ 2XTBA_4N08A
+ 2XTBA_4N09A
+ 2XTBA_4N09B
+ 2XZBA_2YN9A
+ 2XZBA_4UX2A
+ 2XZBA_5YLUA
+ 2XZBA_5YLVA
+ 2XZBA_6JXJA
+ 2Y2ZA_2Y2ZA
+ 2Y2ZA_2Y30A
+ 2Y2ZA_2Y31A
+ 2Y2ZA_3ZQLA
+ 2Y2ZA_3ZQLB
+ 2Y3VA_2Y3VA
+ 2Y3VA_2Y3VB
+ 2Y3VA_2Y3VC
+ 2Y3VA_2Y3VD
+ 2Y3VA_2Y3WA
+ 2Y4TA_2Y4TA
+ 2Y4TA_2Y4TC
+ 2Y4TA_2Y4UA
+ 2Y4TA_3IEGA
+ 2Y4TA_3IEGB
+ 2YKOA_2YKOB
+ 2YKOA_2YKOC
+ 2YKOA_2YKQA
+ 2YKOA_2YKQB
+ 2YKOA_2YKQC
+ 2YKRB_2YKRB
+ 2YKRB_4V6PAE
+ 2YKRB_4V77AB
+ 2YKRB_4V79AB
+ 2YKRB_6VWMa
+ 2YKRC_2YKRC
+ 2YKRC_4V6LAG
+ 2YKRC_4V74AC
+ 2YKRC_4V78AC
+ 2YKRC_4V7IBC
+ 2YKRK_2YKRK
+ 2YKRK_4V72AK
+ 2YKRK_4V77AK
+ 2YKRK_4V78AK
+ 2YKRK_4V9OHK
+ 2YKRL_2YKRL
+ 2YKRL_4V65AD
+ 2YKRL_4V66AD
+ 2YKRL_4V70AL
+ 2YKRL_4V71AL
+ 2YKRP_2YKRP
+ 2YKRP_4V65AI
+ 2YKRP_4V6OAS
+ 2YKRP_4V73AP
+ 2YKRP_4V76AP
+ 2YKRT_2YKRT
+ 2YKRT_4V6LAX
+ 2YKRT_4V77AT
+ 2YKRT_4V78AT
+ 2YKRT_4V79AT
+ 2YM9A_2YM9A
+ 2YM9A_2YM9B
+ 2YM9A_3O00A
+ 2YM9A_3O01A
+ 2YM9A_3O01B
+ 2YNAA_2YNAA
+ 2YNAA_4WMFC
+ 2YNAA_4YOIA
+ 2YNAA_6VH3B
+ 2YNAA_7D3CA
+ 2YQ2A_2YQ2A
+ 2YQ2A_2YQ2B
+ 2YQ2A_2YQ3B
+ 2YQ2A_4JNTA
+ 2YQ2A_4JNTB
+ 2YT4A_2YT4A
+ 2YT4A_6V5BB
+ 2YT4A_6V5BC
+ 2YT4A_6V5CB
+ 2YT4A_6V5CC
+ 2YULA_2YULA
+ 2YULA_3F27D
+ 2YULA_4Y60C
+ 2YULA_6L6YD
+ 2YULA_6L6YF
+ 2Z1EA_2Z1EA
+ 2Z1EA_2Z1FA
+ 2Z1EA_3VYSC
+ 2Z1EA_3VYTC
+ 2Z1EA_3WJPA
+ 2ZIUB_2ZIUB
+ 2ZIUB_2ZIWB
+ 2ZIUB_2ZIXB
+ 2ZIUB_4P0PB
+ 2ZIUB_4P0QB
+ 2ZNLA_2ZNLA
+ 2ZNLA_5IF2A
+ 2ZNLA_5IF5A
+ 2ZNLA_6QX8A
+ 2ZNLA_7NISA
+ 2ZYKA_2ZYKA
+ 2ZYKA_2ZYKB
+ 2ZYKA_2ZYMA
+ 2ZYKA_2ZYNA
+ 2ZYKA_2ZYOA
+ 3A1JA_3A1JA
+ 3A1JA_3G65A
+ 3A1JA_3GGRA
+ 3A1JA_7Z6HA
+ 3A1JA_8GNNA
+ 3AFEA_3AFEA
+ 3AFEA_3AFEB
+ 3AFEA_3AFEC
+ 3AFEA_3AFFA
+ 3AFEA_3AFFB
+ 3AKCA_3AKCA
+ 3AKCA_3AKDA
+ 3AKCA_3AKEA
+ 3AKCA_3W8NA
+ 3AKCA_3W90A
+ 3B5WA_3B60C
+ 3B5WA_5TTPA
+ 3B5WA_7SELB
+ 3B5WA_8DMOA
+ 3B5WA_8DMOB
+ 3B7BA_3B7BA
+ 3B7BA_3B7BB
+ 3B7BA_3B95A
+ 3B7BA_3B95B
+ 3B7BA_6BY9A
+ 3BC8A_3HL2B
+ 3BC8A_3HL2C
+ 3BC8A_4ZDPD
+ 3BC8A_7MDLA
+ 3BC8A_7MDLB
+ 3BE5A_3BE5A
+ 3BE5A_3BE5B
+ 3BE5A_3BE5D
+ 3BE5A_3BE6C
+ 3BE5A_3BE6D
+ 3BW1A_3BW1A
+ 3BW1A_3BW1B
+ 3BW1A_3JCMd
+ 3BW1A_5VSUC
+ 3BW1A_6ASOC
+ 3CJJA_3CJJA
+ 3CJJA_4IM8A
+ 3CJJA_4LP4A
+ 3CJJA_4OI8A
+ 3CJJA_6XQ8B
+ 3CMMA_5TR4C
+ 3CMMA_6ZHSA
+ 3CMMA_6ZHTC
+ 3CMMA_6ZQHC
+ 3CMMA_7K5JK
+ 3COIA_3COIA
+ 3COIA_4EYJA
+ 3COIA_4MYGA
+ 3COIA_4MYGB
+ 3COIA_4YNOA
+ 3CPFA_3CPFA
+ 3CPFA_5DLQF
+ 3CPFA_7OYA11
+ 3CPFA_8A0EE
+ 3CPFA_8G615A
+ 3CW1E_4WZJE
+ 3CW1E_5XJSE
+ 3CW1E_6FF7c
+ 3CW1E_7ABIc
+ 3CW1E_8CH6i
+ 3D4RA_3D4RA
+ 3D4RA_3D4RB
+ 3D4RA_3D4RD
+ 3D4RA_3D4RE
+ 3D4RA_3D4RF
+ 3D63A_3D63A
+ 3D63A_3D63B
+ 3D63A_3D63C
+ 3D63A_3EIYA
+ 3D63A_3EIZA
+ 3DD7A_3DD9A
+ 3DD7A_3DD9C
+ 3DD7A_3DD9D
+ 3DD7A_3K33A
+ 3DD7A_3KH2B
+ 3DGPB_3DOMD
+ 3DGPB_7ML05
+ 3DGPB_7ML15
+ 3DGPB_7ML45
+ 3DGPB_8CEN5
+ 3DMBA_3DMBA
+ 3DMBA_3DMBB
+ 3DMBA_3DMBC
+ 3DMBA_3U34A
+ 3DMBA_3U34B
+ 3DWWA_3DWWA
+ 3DWWA_3DWWB
+ 3DWWA_3DWWC
+ 3DWWA_4AL0A
+ 3DWWA_4WABA
+ 3E9LA_3E9LA
+ 3E9LA_4JK7A
+ 3E9LA_4JKAB
+ 3E9LA_4JKDB
+ 3E9LA_4JKFA
+ 3EI4B_3EI4B
+ 3EI4B_3EI4F
+ 3EI4B_4E54B
+ 3EI4B_4E5ZB
+ 3EI4B_6R8YL
+ 3ENMA_3ENMA
+ 3ENMA_3ENMC
+ 3ENMA_3FMEA
+ 3ENMA_3VN9A
+ 3ENMA_8A8MB
+ 3EZLA_3EZLA
+ 3EZLA_3VZPC
+ 3EZLA_3VZRA
+ 3EZLA_3VZRB
+ 3EZLA_4N5MB
+ 3F6KA_4PO7A
+ 3F6KA_5MRIA
+ 3F6KA_5NNIA
+ 3F6KA_5NNIB
+ 3F6KA_6EHOA
+ 3FIED_3FIED
+ 3FIED_3RK3A
+ 3FIED_3RL0A
+ 3FIED_3RL0E
+ 3FIED_3RL0M
+ 3FOEC_3FOED
+ 3FOEC_3FOFC
+ 3FOEC_3KSBC
+ 3FOEC_3LTND
+ 3FOEC_3RAFD
+ 3GK8H_3J3ZH
+ 3GK8H_4OTXH
+ 3GK8H_5EN2A
+ 3GK8H_5YWYH
+ 3GK8H_6BT3B
+ 3GNSA_3GR6G
+ 3GNSA_4ALMA
+ 3GNSA_4ALMB
+ 3GNSA_4ALMD
+ 3GNSA_4CV1A
+ 3GPGA_3GPOB
+ 3GPGA_3GPQA
+ 3GPGA_6W8YB
+ 3GPGA_6W8ZC
+ 3GPGA_7P27A
+ 3GQEA_3GQEA
+ 3GQEA_3GQEB
+ 3GQEA_3GQOA
+ 3GQEA_5ISNA
+ 3GQEA_5MQXA
+ 3H9VA_3H9VA
+ 3H9VA_3I5DC
+ 3H9VA_4DW0A
+ 3H9VA_4DW1A
+ 3H9VA_5WZYA
+ 3HIXA_3HIXB
+ 3HIXA_3K9RA
+ 3HIXA_3K9RB
+ 3HIXA_3K9RC
+ 3HIXA_3K9RD
+ 3HMEA_4NQNA
+ 3HMEA_5IGMB
+ 3HMEA_5TWXA
+ 3HMEA_5TWXB
+ 3HMEA_6Y7JA
+ 3HPGA_3HPGA
+ 3HPGA_3HPGB
+ 3HPGA_3HPHA
+ 3HPGA_3HPHD
+ 3HPGA_7Z1ZB
+ 3HRYA_3HRYA
+ 3HRYA_3HRYB
+ 3HRYA_3HRYC
+ 3HRYA_4ZM0A
+ 3HRYA_4ZM0B
+ 3ID5B_3ID5B
+ 3ID5B_3ID5F
+ 3ID5B_3PLAE
+ 3ID5B_3PLAM
+ 3ID5B_5GIOE
+ 3IFAA_3IFAA
+ 3IFAA_3IFCC
+ 3IFAA_4HE0A
+ 3IFAA_5ET5A
+ 3IFAA_5Q0CC
+ 3IK5A_3IK5A
+ 3IK5A_3IK5C
+ 3IK5A_5NUHA
+ 3IK5A_5NUHB
+ 3IK5A_5NUIB
+ 3IYGE_5UYXC
+ 3IYGE_6NRDE
+ 3IYGE_7TTND
+ 3IYGE_7X0AE
+ 3IYGE_8AJOC
+ 3IYGH_6NRCG
+ 3IYGH_7NVLH
+ 3IYGH_7TTNC
+ 3IYGH_7WU7O
+ 3IYGH_7X7YH
+ 3IYJA_3IYJA
+ 3IYJA_3IYJB
+ 3IYJA_3IYJD
+ 3IYJA_3IYJE
+ 3IYJA_3IYJF
+ 3J1OJ_3J1OJ
+ 3J1OJ_3RJ1Q
+ 3J1OJ_5OQMb
+ 3J1OJ_5SVAN
+ 3J1OJ_8CENb
+ 3J22B_3ZFGB
+ 3J22B_4BIQB
+ 3J22B_4RR3C
+ 3J22B_4RS5O
+ 3J22B_4YVWL
+ 3J3RA_3J3RA
+ 3J3RA_3J3SE
+ 3J3RA_3J3TA
+ 3J3RA_3PXIC
+ 3J3RA_7ABRD
+ 3J3V6_3J3V6
+ 3J3V6_3J3W6
+ 3J3V6_3J9WBK
+ 3J3V6_7AS9K
+ 3J3V6_8A57E
+ 3J3VU_3J3VU
+ 3J3VU_3J3WU
+ 3J3VU_6HA1U
+ 3J3VU_6PPKU
+ 3J3VU_6TPQn
+ 3J5LX_3J5LX
+ 3J5LX_4V6PBZ
+ 3J5LX_4V74BX
+ 3J5LX_4V78BX
+ 3J5LX_4V79BX
+ 3J6X10_3J6X10
+ 3J6X10_5DGVc0
+ 3J6X10_6GSMK
+ 3J6X10_6WDRK
+ 3J6X10_6XIRAA
+ 3J6X23_3J6X23
+ 3J6X23_4V4BAL
+ 3J6X23_4V6IAL
+ 3J6X23_4V7HAL
+ 3J6X23_6ZQDDX
+ 3J6X69_3J6X69
+ 3J6X69_5JUOGA
+ 3J6X69_6GQBb
+ 3J6X69_6HD7d
+ 3J6X69_6OIGb
+ 3J6X72_3J6X72
+ 3J6X72_4V4BB0
+ 3J6X72_4V6IBh
+ 3J6X72_4V7HB0
+ 3J6X72_6OIGe
+ 3J6X78_3J6X78
+ 3J6X78_4V6IBn
+ 3J6X78_5JUOPA
+ 3J6X78_6OIGk
+ 3J6X78_7MPIAk
+ 3J6X80_3J6X80
+ 3J6X80_5AN9H
+ 3J6X80_5ANBH
+ 3J6X80_5JUSRA
+ 3J6X80_6XIQm
+ 3J6XL3_3J6XL3
+ 3J6XL3_4V4BBC
+ 3J6XL3_4V6IBC
+ 3J6XL3_4V7HBC
+ 3J6XL3_4V7RBC
+ 3J6XS2_3J6XS2
+ 3J6XS2_4V6IAE
+ 3J6XS2_4V7RAB
+ 3J6XS2_6XIRs
+ 3J6XS2_7OSMuS5
+ 3J79G_3J79G
+ 3J79G_3JBNAG
+ 3J79G_3JBOAG
+ 3J79G_3JBPAG
+ 3J79G_5UMDG
+ 3J79L_3J79L
+ 3J79L_3JBNAL
+ 3J79L_3JBOAL
+ 3J79L_3JBPAL
+ 3J79L_5UMDL
+ 3J7OR_3J7OR
+ 3J7OR_4V5ZBp
+ 3J7OR_6D90R
+ 3J7OR_6HCFR3
+ 3J7OR_6HCJR3
+ 3J7PE_3J7PE
+ 3J7PE_4D5YE
+ 3J7PE_5AJ0AE
+ 3J7PE_6GZ4AE
+ 3J7PE_6OLGAE
+ 3J7PSd_4V5ZAn
+ 3J7PSd_5A2Qd
+ 3J7PSd_6FECJ
+ 3J7PSd_6YALe
+ 3J7PSd_6ZONl
+ 3J7PSe_3J7PSe
+ 3J7PSE_3J7PSE
+ 3J7PSe_4D5Le
+ 3J7PSE_6FECI
+ 3J7PSe_6FECV
+ 3J7PSE_6OM0SE
+ 3J7PSE_6OM7SE
+ 3J7PSE_7MQ8L4
+ 3J7PSe_7SYNf
+ 3J7PSe_7SYOf
+ 3J7PSJ_3J7PSJ
+ 3J7PSJ_6FECK
+ 3J7PSJ_6HCJK2
+ 3J7PSJ_7A01d3
+ 3J7PSJ_7NWGK2
+ 3J7PSL_3J7PSL
+ 3J7PSL_4D5LL
+ 3J7PSL_6FECG
+ 3J7PSL_6OM0SL
+ 3J7PSL_6OM7SL
+ 3J7PSM_5T2CAM
+ 3J7PSM_6IP83J
+ 3J7PSM_6OM0SM
+ 3J7PSM_6QZPSM
+ 3J7PSM_7A01e3
+ 3J7PSU_4V5ZAj
+ 3J7PSU_5A2QU
+ 3J7PSU_5VYCU1
+ 3J7PSU_6XU7AU
+ 3J7PSU_7A01I3
+ 3J7Yf_3J9Mf
+ 3J7Yf_7NSJBk
+ 3J7Yf_7O9Kf
+ 3J7Yf_7ODRf
+ 3J7Yf_7PO4f
+ 3J7YI_7O9KI
+ 3J7YI_7ODSI
+ 3J7YI_7ODTI
+ 3J7YI_7OG4XI
+ 3J7YI_7OI6I
+ 3J9MA3_3JD5n
+ 3J9MA3_6NEQn
+ 3J9MA3_6NU3A3
+ 3J9MA3_6RW43
+ 3J9MA3_6VMIA3
+ 3J9MAC_3J9MAC
+ 3J9MAC_3JD5C
+ 3J9MAC_6NU2AC
+ 3J9MAC_6NU3AC
+ 3J9MAC_6VLZAC
+ 3J9MAY_3JD5h
+ 3J9MAY_6NU2AY
+ 3J9MAY_6NU3AY
+ 3J9MAY_6ZS9AY
+ 3J9MAY_7A5IY6
+ 3J9QA_3J9R0
+ 3J9QA_6U5BM
+ 3J9QA_6U5FG
+ 3J9QA_6U5FS
+ 3J9QA_6U5Ja
+ 3JC1Ab_3JC1Ad
+ 3JC1Ab_6E8GAA
+ 3JC1Ab_6TZ402
+ 3JC1Ab_6TZ5AA
+ 3JC1Ab_6TZ9A
+ 3JCBC_4FQCH
+ 3JCBC_4JY6B
+ 3JCBC_4R2GQ
+ 3JCBC_4TVPH
+ 3JCBC_5WDUC
+ 3JCMX_5GANr
+ 3JCMX_5NRLy
+ 3JCMX_6EXNg
+ 3JCMX_6EXNr
+ 3JCMX_6N7PQ
+ 3JCOL_6EF3L
+ 3JCOL_6FVVL
+ 3JCOL_6FVWL
+ 3JCOL_6J2NL
+ 3JCOL_6J2QL
+ 3JCON_3JCPN
+ 3JCON_5MPEN
+ 3JCON_5WVIN
+ 3JCON_6FVTN
+ 3JCON_6FVWN
+ 3JCOS_3JCOS
+ 3JCOS_3JCPS
+ 3JCOS_5MPBS
+ 3JCOS_5MPDS
+ 3JCOS_6FVWS
+ 3JCSC_3JCSC
+ 3JCSC_4V8MBr
+ 3JCSC_5T2Ap
+ 3JCSC_5T5Hr
+ 3JCSC_6AZ3C
+ 3JCST_3JCST
+ 3JCST_4V8MBR
+ 3JCST_5T2AP
+ 3JCST_5T5HR
+ 3JCST_6AZ3T
+ 3K1MA_3K1MA
+ 3K1MA_3K1MB
+ 3K1MA_3K1NA
+ 3K1MA_3K1PA
+ 3K1MA_3K1PB
+ 3K6SB_3K6SB
+ 3K6SB_3K6SD
+ 3K6SB_4NENB
+ 3K6SB_5ES4D
+ 3K6SB_7USLB
+ 3K8GA_3K8GA
+ 3K8GA_3K8GB
+ 3K8GA_3K8HA
+ 3K8GA_3K8IA
+ 3K8GA_3K8JA
+ 3KG2A_4U5DD
+ 3KG2A_5IDEA
+ 3KG2A_5IDFC
+ 3KG2A_6XSRC
+ 3KG2A_7LDEB
+ 3LYFA_3LYFA
+ 3LYFA_3LYFB
+ 3LYFA_4H5PB
+ 3LYFA_4H5QB
+ 3LYFA_4V9EB
+ 3M8OH_3M8OH
+ 3M8OH_3QNXB
+ 3M8OH_3QNYB
+ 3M8OH_3QNYD
+ 3M8OH_7UVLH
+ 3MD0A_3MD0A
+ 3MD0A_3NXSA
+ 3MD0A_3P32A
+ 3MD0A_3TK1A
+ 3MD0A_3TK1B
+ 3MFFA_3MFFA
+ 3MFFA_3MFFC
+ 3MFFA_4P5TE
+ 3MFFA_5SWSD
+ 3MFFA_6MNOA
+ 3MG1A_5UI2A
+ 3MG1A_6T6MA
+ 3MG1A_7SC9BH
+ 3MG1A_7SC9CQ
+ 3MG1A_7SCCBP
+ 3MT5A_3NAFA
+ 3MT5A_3U6NA
+ 3MT5A_3U6NH
+ 3MT5A_6ND0A
+ 3MT5A_6ND0C
+ 3MTNB_3MTNB
+ 3MTNB_5JG6B
+ 3MTNB_5ZD0A
+ 3MTNB_6OB1B
+ 3MTNB_7M3QB
+ 3MV7D_3MV7D
+ 3MV7D_5KS9E
+ 3MV7D_5KSAC
+ 3MV7D_6XCOD
+ 3MV7D_8ES9A
+ 3N0XA_3N0XA
+ 3N0XA_3NNDA
+ 3N0XA_3NNDB
+ 3N0XA_3NNDC
+ 3N0XA_3NNDD
+ 3NTCH_3NTCH
+ 3NTCH_4YGVA
+ 3NTCH_6ANII
+ 3NTCH_6T3JH
+ 3NTCH_7SJOI
+ 3NUHB_4TMAB
+ 3NUHB_4TMAF
+ 3NUHB_6RKVB
+ 3NUHB_7Z9CB
+ 3NUHB_7Z9CD
+ 3NZHH_6M3CF
+ 3NZHH_6PHDH
+ 3NZHH_6UGSA
+ 3NZHH_6UGVH
+ 3NZHH_7N0AB
+ 3O4OB_3O4OB
+ 3O4OB_4DEPC
+ 3O4OB_4DEPF
+ 3O4OB_4YFDB
+ 3O4OB_5VI4F
+ 3OJYA_3OJYA
+ 3OJYA_6H03F
+ 3OJYA_6H04F
+ 3OJYA_8B0FE
+ 3OJYA_8B0HE
+ 3OJYB_3OJYB
+ 3OJYB_6H03C
+ 3OJYB_6H04C
+ 3OJYB_8B0GD
+ 3OJYB_8B0HD
+ 3OSYA_3OSYA
+ 3OSYA_3OSYB
+ 3OSYA_5C1UA
+ 3OSYA_5GSOA
+ 3OSYA_5GSWE
+ 3P7IA_3QK6A
+ 3P7IA_3QK6B
+ 3P7IA_3QUJB
+ 3P7IA_3QUJD
+ 3P7IA_3S4UA
+ 3PBLA_3PBLA
+ 3PBLA_3PBLB
+ 3PBLA_6C38A
+ 3PBLA_6CM4A
+ 3PBLA_6LUQA
+ 3PNTB_3PNTB
+ 3PNTB_3QB2B
+ 3PNTB_3QB2C
+ 3PNTB_4KT6B
+ 3PNTB_4KT6D
+ 3PXGA_3PXGA
+ 3PXGA_3PXGB
+ 3PXGA_3PXGC
+ 3PXGA_3PXGD
+ 3PXGA_3PXGF
+ 3Q0XA_6ZZ8B
+ 3Q0XA_6ZZ8D
+ 3Q0XA_6ZZCA
+ 3Q0XA_6ZZCB
+ 3Q0XA_6ZZCC
+ 3Q5DA_3Q5DA
+ 3Q5DA_3Q5EA
+ 3Q5DA_3QNUA
+ 3Q5DA_6B9EA
+ 3Q5DA_6XJNA
+ 3QLEA_3QLEA
+ 3QLEA_4QQFA
+ 3QLEA_4QQFC
+ 3QLEA_4QQFD
+ 3QLEA_4QQFF
+ 3QS2A_3QS2A
+ 3QS2A_3QS2B
+ 3QS2A_3QS3B
+ 3QS2A_3QS3E
+ 3QS2A_3QS3H
+ 3R23A_3R23A
+ 3R23A_3R23B
+ 3R23A_3R5XA
+ 3R23A_3R5XB
+ 3R23A_3R5XD
+ 3R7WB_3R7WB
+ 3R7WB_3R7WD
+ 3R7WB_4ARZB
+ 3R7WB_6JWPB
+ 3R7WB_6JWPG
+ 3R8JA_3R8JA
+ 3R8JA_3R8JB
+ 3R8JA_3R8KA
+ 3R8JA_3R8KB
+ 3R8JA_4AYZA
+ 3RFUA_3RFUA
+ 3RFUA_3RFUC
+ 3RFUA_4BBJA
+ 3RFUA_4BEVA
+ 3RFUA_4BYGA
+ 3RJQA_4I53B
+ 3RJQA_4LSVG
+ 3RJQA_5F6JE
+ 3RJQA_5TE7G
+ 3RJQA_6XCJG
+ 3RKOA_3RKOA
+ 3RKOA_3RKOE
+ 3RKOA_7NYRA
+ 3RKOA_7P64A
+ 3RKOA_7Z80A
+ 3RONA_3RONA
+ 3RONA_3RONB
+ 3RONA_6T14A
+ 3RONA_6T1AA
+ 3RONA_6T1CA
+ 3RUKA_5IRQB
+ 3RUKA_6CHIC
+ 3RUKA_6CIRD
+ 3RUKA_6WR1A
+ 3RUKA_6WR1B
+ 3SBYA_3SBYA
+ 3SBYA_3SBZA
+ 3SBYA_3SC0A
+ 3SBYA_3SOMP
+ 3SBYA_5UOSA
+ 3SDEB_3SDEB
+ 3SDEB_5IFMG
+ 3SDEB_5IFMJ
+ 3SDEB_6WMZD
+ 3SDEB_7LRQB
+ 3SIAA_3SIAA
+ 3SIAA_3SIBA
+ 3SIAA_3SJSA
+ 3SIAA_4Q04A
+ 3SIAA_4Q04B
+ 3SWKA_3SWKA
+ 3SWKA_3SWKB
+ 3SWKA_5WHFD
+ 3SWKA_5WHFE
+ 3SWKA_5WHFH
+ 3SYKA_3SYKA
+ 3SYKA_3SYKB
+ 3SYKA_3SYLB
+ 3SYKA_3ZUHA
+ 3SYKA_3ZUHC
+ 3TESA_3TESA
+ 3TESA_4LPTC
+ 3TESA_4LPXA
+ 3TESA_4LPYA
+ 3TESA_4M6AJ
+ 3TFKC_3TFKC
+ 3TFKC_3VXMD
+ 3TFKC_4Z7WE
+ 3TFKC_6VM8D
+ 3TFKC_8GVIA
+ 3TGXA_3TGXC
+ 3TGXA_4NZDA
+ 3TGXA_4NZDB
+ 3TGXA_4NZDC
+ 3TGXA_7KQ7B
+ 3TNXA_3TNXA
+ 3TNXA_3USVA
+ 3TNXA_3USVC
+ 3TNXA_4QRVA
+ 3TNXA_4QRVB
+ 3TV0A_5ADXU
+ 3TV0A_5AFUU
+ 3TV0A_6F1TU
+ 3TV0A_6ZNLU
+ 3TV0A_7Z8FU
+ 3TV3H_3TV3H
+ 3TV3H_3TWCH
+ 3TV3H_3TYGH
+ 3TV3H_5C7KA
+ 3TV3H_5JS9A
+ 3U2SC_3U2SC
+ 3U2SC_3U2SG
+ 3U2SC_4DQOC
+ 3U2SC_4YWGG
+ 3U2SC_4YWGQ
+ 3UF8A_4G50B
+ 3UF8A_4GGQA
+ 3UF8A_5KLXB
+ 3UF8A_5V8TB
+ 3UF8A_6O49A
+ 3UTMA_3UTMA
+ 3UTMA_5HKPA
+ 3UTMA_6CF6B
+ 3UTMA_6URQA
+ 3UTMA_6URQB
+ 3UX4A_3UX4A
+ 3UX4A_3UX4C
+ 3UX4A_6NSJA
+ 3UX4A_6NSJC
+ 3UX4A_6NSKA
+ 3VBFA_3ZFGA
+ 3VBFA_4AEDA
+ 3VBFA_6DIZA
+ 3VBFA_6LQDA
+ 3VBFA_8C6DA
+ 3VGYD_3VGYD
+ 3VGYD_3VH7B
+ 3VGYD_3VH7D
+ 3VGYD_3VH7F
+ 3VGYD_3W19D
+ 3VVNA_3VVOA
+ 3VVNA_3VVPA
+ 3VVNA_3VVPB
+ 3VVNA_4MLBC
+ 3VVNA_6FHZA
+ 3WBPA_3WBPA
+ 3WBPA_3WBQB
+ 3WBPA_3WBRB
+ 3WBPA_4ZESB
+ 3WBPA_4ZETA
+ 3WDKA_3WDKA
+ 3WDKA_3WDKD
+ 3WDKA_3WDLC
+ 3WDKA_3WDMD
+ 3WDKA_4MB0D
+ 3WGOA_3WGOB
+ 3WGOA_3WGQA
+ 3WGOA_3WGYA
+ 3WGOA_3WGYB
+ 3WGOA_3WGZA
+ 3WJKA_5ZE6A
+ 3WJKA_5ZE6B
+ 3WJKA_5ZE6D
+ 3WJKA_5ZLFB
+ 3WJKA_5ZLFD
+ 3WSVA_3WSVB
+ 3WSVA_3WSVC
+ 3WSVA_3WSVD
+ 3WSVA_3WSWA
+ 3WSVA_3WSWD
+ 3WWSA_3WWSA
+ 3WWSA_4HKDD
+ 3WWSA_4L0NA
+ 3WWSA_4L0NE
+ 3WWSA_4L0NI
+ 3X0XA_3X0YA
+ 3X0XA_3X0YH
+ 3X0XA_4DOYA
+ 3X0XA_4DOYB
+ 3X0XA_4DOYH
+ 3ZFHA_4DQWA
+ 3ZFHA_4DQWB
+ 3ZFHA_6GJVE
+ 3ZFHA_6GJVF
+ 3ZFHA_7PJIA
+ 3ZGHA_3ZGIA
+ 3ZGHA_3ZGIB
+ 3ZGHA_5JUIA
+ 3ZGHA_5JUIB
+ 3ZGHA_5JUIC
+ 3ZGQA_3ZGQA
+ 3ZGQA_4HOQA
+ 3ZGQA_4HORA
+ 3ZGQA_4HOTA
+ 3ZGQA_4J0UA
+ 3ZLAA_4IJSA
+ 3ZLAA_4IJSB
+ 3ZLAA_4IJSC
+ 3ZLAA_4IJSD
+ 3ZLAA_7AOYA
+ 3ZLDA_3ZLDA
+ 3ZLDA_3ZLEA
+ 3ZLDA_3ZLED
+ 3ZLDA_3ZLEE
+ 3ZLDA_3ZLEF
+ 3ZZFA_3ZZFA
+ 3ZZFA_3ZZFD
+ 3ZZFA_3ZZGA
+ 3ZZFA_3ZZGD
+ 3ZZFA_3ZZHC
+ 4A1NA_5T5CA
+ 4A1NA_5ZKJA
+ 4A1NA_6IIDA
+ 4A1NA_7R6TD
+ 4A1NA_7R6VI
+ 4AFNA_4AFNB
+ 4AFNA_4AG3C
+ 4AFNA_4BNVD
+ 4AFNA_4BNWC
+ 4AFNA_4BO4B
+ 4AKFA_4AKFA
+ 4AKFA_4KYIA
+ 4AKFA_4KYIC
+ 4AKFA_4KYIE
+ 4AKFA_4KYIG
+ 4AVPA_4CO8A
+ 4AVPA_4UNOA
+ 4AVPA_4UUVA
+ 4AVPA_4UUVD
+ 4AVPA_5ILVA
+ 4B8WA_4B8WA
+ 4B8WA_4BKPC
+ 4B8WA_4BKPD
+ 4B8WA_4BL5C
+ 4B8WA_4E5YC
+ 4B99A_4B99A
+ 4B99A_4IC7D
+ 4B99A_4IC8A
+ 4B99A_4IC8B
+ 4B99A_4ZSGA
+ 4BB9A_4BB9A
+ 4BB9A_4BBAA
+ 4BB9A_4LC9A
+ 4BB9A_4LY9B
+ 4BB9A_4OLHA
+ 4BJMA_4BJMD
+ 4BJMA_4BJNA
+ 4BJMA_4BJNC
+ 4BJMA_4BJNF
+ 4BJMA_4BJNG
+ 4BL8A_4BL8A
+ 4BL8A_4BL9C
+ 4BL8A_4BLAA
+ 4BL8A_4BLAD
+ 4BL8A_4BLDA
+ 4BSOA_4BSOA
+ 4BSOA_4BSPA
+ 4BSOA_4BSRD
+ 4BSOA_4CDKH
+ 4BSOA_4LI2B
+ 4BTGA_4BTGA
+ 4BTGA_4BTGB
+ 4BTGA_4BTQA
+ 4BTGA_4BTQB
+ 4BTGA_6HY0B
+ 4C03A_4C04A
+ 4C03A_4C06A
+ 4C03A_5E8RA
+ 4C03A_6SQ4B
+ 4C03A_6W6DA
+ 4C1QA_4C1QA
+ 4C1QA_4IJDA
+ 4C1QA_4IJDB
+ 4C1QA_6NM4A
+ 4C1QA_6NM4B
+ 4C2CA_4C2CA
+ 4C2CA_4C2DA
+ 4C2CA_4C2DB
+ 4C2CA_4C2HA
+ 4C2CA_4C2HB
+ 4C48C_5NC5F
+ 4C48C_5NC5G
+ 4C48C_5NG5N
+ 4C48C_5NG5O
+ 4C48C_6SGSG
+ 4COFA_4COFC
+ 4COFA_6D6TA
+ 4COFA_6D6UA
+ 4COFA_6D6UC
+ 4COFA_6DW0E
+ 4CSU9_4CSU9
+ 4CSU9_5M04A
+ 4CSU9_7BL29P1
+ 4CSU9_7BL49
+ 4CSU9_7BL59
+ 4CT8A_4CT8A
+ 4CT8A_4CT9B
+ 4CT8A_4CTAA
+ 4CT8A_4UOCB
+ 4CT8A_4UUXA
+ 4CVNE_4CVNE
+ 4CVNE_4CW7D
+ 4CVNE_4V4NBM
+ 4CVNE_4V6UAM
+ 4CVNE_7ZAGM
+ 4CXIA_4CXIA
+ 4CXIA_5DAFA
+ 4CXIA_5NLBA
+ 4CXIA_6W66C
+ 4CXIA_6WCQC
+ 4CYIA_4CYIF
+ 4CYIA_4CYIG
+ 4CYIA_4CYJA
+ 4CYIA_4CYJB
+ 4CYIA_4CYJD
+ 4CZTA_4CZTA
+ 4CZTA_4CZTB
+ 4CZTA_4CZTD
+ 4CZTA_4CZUB
+ 4CZTA_4CZUD
+ 4D0PA_4D10D
+ 4D0PA_4WSNT
+ 4D0PA_6R7FD
+ 4D0PA_6R7HD
+ 4D0PA_6R7ND
+ 4D10A_4D10A
+ 4D10A_6R6HA
+ 4D10A_6R7HA
+ 4D10A_6R7IA
+ 4D10A_6R7NA
+ 4D10F_4WSNF
+ 4D10F_6R6HF
+ 4D10F_6R7FF
+ 4D10F_6R7HF
+ 4D10F_6R7IF
+ 4D8JA_4D8JA
+ 4D8JA_4D8JC
+ 4D8JA_4D8JG
+ 4D8JA_4D8JH
+ 4D8JA_4D8JL
+ 4DAJA_4DAJC
+ 4DAJA_4U14A
+ 4DAJA_4U15B
+ 4DAJA_5ZHPA
+ 4DAJA_5ZHPB
+ 4DB6A_4DB6A
+ 4DB6A_4DB8D
+ 4DB6A_4DB9A
+ 4DB6A_4DBAC
+ 4DB6A_5MFGD
+ 4DX8A_4DX9e
+ 4DX8A_4DX9o
+ 4DX8A_4DX9O
+ 4DX8A_4DX9q
+ 4DX8A_4DX9w
+ 4E29A_4E2HC
+ 4E29A_4ZM1A
+ 4E29A_4ZM5C
+ 4E29A_5NBZA
+ 4E29A_5NBZB
+ 4E7XA_4E7XA
+ 4E7XA_4E7XC
+ 4E7XA_4E8FB
+ 4E7XA_4UD4B
+ 4E7XA_4UD5B
+ 4EGYA_4EGYA
+ 4EGYA_4EGZA
+ 4EGYA_4EGZB
+ 4EGYA_5D4RA
+ 4EGYA_5D4SB
+ 4EPLA_4EPLA
+ 4EPLA_5ECHA
+ 4EPLA_5ECND
+ 4EPLA_5ECPD
+ 4EPLA_5GZZA
+ 4F0FA_4F0FA
+ 4F0FA_4F1TA
+ 4F0FA_4YZMA
+ 4F0FA_4YZMB
+ 4F0FA_4YZNA
+ 4F15A_4F15A
+ 4F15A_4F15D
+ 4F15A_4F15J
+ 4F15A_4LVHA
+ 4F15A_8DIMA
+ 4F2ME_4F2ME
+ 4F2ME_4F2MF
+ 4F2ME_4F5CE
+ 4F2ME_4F5CF
+ 4F2ME_7U0LB
+ 4F57H_4F57H
+ 4F57H_4F58I
+ 4F57H_4F58K
+ 4F57H_8D6ZE
+ 4F57H_8D6ZG
+ 4F7BA_4F7BA
+ 4F7BA_4F7BB
+ 4F7BA_4TVZB
+ 4F7BA_5UPHA
+ 4F7BA_5UPHB
+ 4FR4A_4FR4A
+ 4FR4A_4FR4B
+ 4FR4A_4FR4D
+ 4FR4A_4FR4E
+ 4FR4A_4FR4F
+ 4GEIA_4GEIA
+ 4GEIA_4GEJC
+ 4GEIA_4GEJD
+ 4GEIA_4GEJG
+ 4GEIA_4GFXA
+ 4GLRI_4GLRI
+ 4GLRI_4GLRK
+ 4GLRI_6NMVL
+ 4GLRI_7KPGL
+ 4GLRI_7KQKL
+ 4I5IA_4I5IB
+ 4I5IA_4IF6A
+ 4I5IA_4IG9C
+ 4I5IA_4IG9E
+ 4I5IA_4KXQA
+ 4I8OA_4I8OA
+ 4I8OA_6Y2PA
+ 4I8OA_6Y2PB
+ 4I8OA_6Y2QB
+ 4I8OA_6Y2RA
+ 4ICQA_4ICQB
+ 4ICQA_4ICRA
+ 4ICQA_4ICRB
+ 4ICQA_4ICSA
+ 4ICQA_4ICSB
+ 4IFTA_4IFTA
+ 4IFTA_4IG4A
+ 4IFTA_4IG4B
+ 4IFTA_4KN8A
+ 4IFTA_4O8CB
+ 4IPEA_5TVUA
+ 4IPEA_6XG6A
+ 4IPEA_6XG6B
+ 4IPEA_7KCMA
+ 4IPEA_7KLUA
+ 4IRPA_4IRPA
+ 4IRPA_4IRPB
+ 4IRPA_4IRQA
+ 4IRPA_4IRQC
+ 4IRPA_4IRQD
+ 4IY0A_4IY2A
+ 4IY0A_4IY3C
+ 4IY0A_4IYSA
+ 4IY0A_5MMZA
+ 4IY0A_6RS2C
+ 4J2NA_4J2NA
+ 4J2NA_4J2NB
+ 4J2NA_4J2NC
+ 4J2NA_4J2ND
+ 4J2NA_4J2NE
+ 4J4RA_4J4RA
+ 4J4RA_4J4SA
+ 4J4RA_4J4SB
+ 4J4RA_4J4SD
+ 4J4RA_4J4UB
+ 4JXYA_4JXYA
+ 4JXYA_4JYXA
+ 4JXYA_4JYXC
+ 4JXYA_4JYXD
+ 4JXYA_4JYXE
+ 4K46A_4K46A
+ 4K46A_4NP6A
+ 4K46A_4NP6B
+ 4K46A_4NP6C
+ 4K46A_4NP6D
+ 4K5YA_4K5YA
+ 4K5YA_4K5YB
+ 4K5YA_4Z9GA
+ 4K5YA_4Z9GB
+ 4K5YA_4Z9GC
+ 4KT0K_4L6V0
+ 4KT0K_4L6Vk
+ 4KT0K_5OY0K
+ 4KT0K_6UZV8
+ 4KT0K_7O1VK
+ 4L0EA_4L0EA
+ 4L0EA_4L0FA
+ 4L0EA_4PWVA
+ 4L0EA_4PXHA
+ 4L0EA_4PXHC
+ 4L3TA_4NGEA
+ 4L3TA_4RPUA
+ 4L3TA_6XOTA
+ 4L3TA_6XOUA
+ 4L3TA_6XOWA
+ 4L5GA_4L5GA
+ 4L5GA_4L5GB
+ 4L5GA_4XAXB
+ 4L5GA_4XLRN
+ 4L5GA_4XLSN
+ 4LGDA_4LGDA
+ 4LGDA_4LGDB
+ 4LGDA_4LGDC
+ 4LGDA_4LGDD
+ 4LGDA_6AO5A
+ 4MTQA_4MTQB
+ 4MTQA_4MTSA
+ 4MTQA_4MTSB
+ 4MTQA_4MTTA
+ 4MTQA_4MTTB
+ 4MVCA_4MVCA
+ 4MVCA_4MVCB
+ 4MVCA_4MVDB
+ 4MVCA_4MVDC
+ 4MVCA_4MVDF
+ 4N0TA_4N0TA
+ 4N0TA_5TF6A
+ 4N0TA_5TF6C
+ 4N0TA_5VSUA
+ 4N0TA_6ASOA
+ 4N44A_4N44A
+ 4N44A_4N44B
+ 4N44A_4N45A
+ 4N44A_4N45B
+ 4N44A_4WYRB
+ 4N9F1_4N9FM
+ 4N9F1_4N9FS
+ 4N9F1_6NILC
+ 4N9F1_8CX1G
+ 4N9F1_8CX2G
+ 4NKOB_4NKOD
+ 4NKOB_6LHQH
+ 4NKOB_6LHTH
+ 4NKOB_7EJ5J
+ 4NKOB_7X3CF
+ 4O6BA_6WERB
+ 4O6BA_7K93D
+ 4O6BA_7WURA
+ 4O6BA_7WUSA
+ 4O6BA_7WUTA
+ 4OCSH_4OCSH
+ 4OCSH_4OCWH
+ 4OCSH_4OD1H
+ 4OCSH_4ORGC
+ 4OCSH_5DT1H
+ 4OM2A_4OM2A
+ 4OM2A_4OM3A
+ 4OM2A_4OM3B
+ 4OM2A_4OM3C
+ 4OM2A_4OM3D
+ 4OOJA_4OOJD
+ 4OOJA_4PAYA
+ 4OOJA_4PAYB
+ 4OOJA_4TRGA
+ 4OOJA_6CP2A
+ 4P6CA_4P6DA
+ 4P6CA_4P6DB
+ 4P6CA_4P6PA
+ 4P6CA_4P6PB
+ 4P6CA_4P8JB
+ 4P6VB_4P6VB
+ 4P6VB_7XK5B
+ 4P6VB_7XK6B
+ 4P6VB_8A1XB
+ 4P6VB_8A1YB
+ 4PE5A_4TLMA
+ 4PE5A_5IPRA
+ 4PE5A_6MMIA
+ 4PE5A_6WHUA
+ 4PE5A_8E93A
+ 4PEOA_4PEOB
+ 4PEOA_7JVSB
+ 4PEOA_7JVSD
+ 4PEOA_7KLDB
+ 4PEOA_7KLDD
+ 4PH0A_4PH0A
+ 4PH0A_4PH0B
+ 4PH0A_4PH0C
+ 4PH0A_4PH0E
+ 4PH0A_4PH0F
+ 4PJUA_4PJUA
+ 4PJUA_6QNXA
+ 4PJUA_6QNYA
+ 4PJUA_7ZJSA
+ 4PJUA_7ZJSC
+ 4PU3A_4PU3A
+ 4PU3A_4PU3B
+ 4PU3A_4PU4A
+ 4PU3A_4PU4B
+ 4PU3A_4PU5A
+ 4QHFA_4QHFA
+ 4QHFA_4QHGA
+ 4QHFA_4QHHA
+ 4QHFA_4QHIB
+ 4QHFA_4QHID
+ 4QICA_4QICA
+ 4QICA_4QICC
+ 4QICA_5UXVA
+ 4QICA_5UXVB
+ 4QICA_5UXWA
+ 4QIWA_4QIWA
+ 4QIWA_4QIWI
+ 4QIWA_6KF3A
+ 4QIWA_6KF4A
+ 4QIWA_6KF9A
+ 4QQAA_4QQAA
+ 4QQAA_4ZGHA
+ 4QQAA_5AOFA
+ 4QQAA_5CR6D
+ 4QQAA_5LY6B
+ 4R7QA_4R7QA
+ 4R7QA_7KB3A
+ 4R7QA_7KB3D
+ 4R7QA_7KB7A
+ 4R7QA_7KB9A
+ 4RFXA_4RFXA
+ 4RFXA_4RFXB
+ 4RFXA_4RFXC
+ 4RFXA_4RFXD
+ 4RFXA_4RFXE
+ 4TW1B_4TW1D
+ 4TW1B_4TW1H
+ 4TW1B_4TW1L
+ 4TW1B_5K59C
+ 4TW1B_7T87B
+ 4U3AA_4U5IB
+ 4U3AA_4U5KA
+ 4U3AA_4U5KB
+ 4U3AA_5BYWD
+ 4U3AA_5BYWE
+ 4U64A_4U64A
+ 4U64A_4U65A
+ 4U64A_4U65B
+ 4U64A_4U65C
+ 4U64A_4U65D
+ 4UT6L_4UT6L
+ 4UT6L_5NGVL
+ 4UT6L_6NB4L
+ 4UT6L_8SASI
+ 4UT6L_8SB2D
+ 4UTAH_4UTAH
+ 4UTAH_4UTAI
+ 4UTAH_7A3TH
+ 4UTAH_7CBPD
+ 4UTAH_7CBPR
+ 4V2EA_4V2EA
+ 4V2EA_4V2EB
+ 4V2EA_4YEBB
+ 4V2EA_5CMNA
+ 4V2EA_6JBUB
+ 4V3PLT_4V3PLT
+ 4V3PLT_4V7ECR
+ 4V3PLT_7QIWT
+ 4V3PLT_7QIZT
+ 4V3PLT_8AZWH
+ 4V3PSE_4V3PSE
+ 4V3PSE_4V7EBC
+ 4V3PSE_7QIXO
+ 4V3PSE_7QIZEA
+ 4V3PSE_8AUVc
+ 4V4BBE_4V4BBE
+ 4V4BBE_4V7HBE
+ 4V4BBE_7PZYm
+ 4V4BBE_7Q08m
+ 4V4BBE_7Q0Fm
+ 4V4BBP_4V4BBP
+ 4V4BBP_4V7HBP
+ 4V4BBP_6QIKR
+ 4V4BBP_7NACR
+ 4V4BBP_7OHVR
+ 4V4NAD_4V4NAD
+ 4V4NAD_4V6UBD
+ 4V4NAD_6SKFBE
+ 4V4NAD_6SKGBE
+ 4V4NAD_6TH6BE
+ 4V4NBG_4V4NBG
+ 4V4NBG_4V6UAG
+ 4V4NBG_5JB3G
+ 4V4NBG_6SW9G
+ 4V4NBG_7ZHGG
+ 4V51B2_4V51B2
+ 4V51B2_4V5AB2
+ 4V51B2_6Q95Y
+ 4V51B2_7LH5B2
+ 4V51B2_7LH5D2
+ 4V61AD_4V61AD
+ 4V61AD_5MMJd
+ 4V61AD_5MMMd
+ 4V61AD_5X8Pd
+ 4V61AD_6ERIBD
+ 4V61AK_4V61AK
+ 4V61AK_5MMJk
+ 4V61AK_5MMMk
+ 4V61AK_5X8Pk
+ 4V61AK_6ERIBK
+ 4V61BN_4V61BN
+ 4V61BN_5H1SN
+ 4V61BN_5MLCN
+ 4V61BN_5X8PM
+ 4V61BN_6ERIAL
+ 4V61BQ_4V61BQ
+ 4V61BQ_5H1SQ
+ 4V61BQ_5MLCQ
+ 4V61BQ_5X8PP
+ 4V61BQ_6ERIAO
+ 4V81D_4V81L
+ 4V81D_4V8RAD
+ 4V81D_5GW4D
+ 4V81D_7YLUD
+ 4V81D_7YLWd
+ 4V81H_4V81P
+ 4V81H_5GW4Q
+ 4V81H_5GW5Q
+ 4V81H_7YLUq
+ 4V81H_7YLYQ
+ 4V8MA6_4V8MA6
+ 4V8MA6_5OPTb
+ 4V8MA6_5T2A6
+ 4V8MA6_6AZ1D
+ 4V8MA6_7ASEB
+ 4V8MAK_4V8MAK
+ 4V8MAK_5OPTr
+ 4V8MAK_5T2AAK
+ 4V8MAK_6AZ1L
+ 4V8MAK_7ASEr
+ 4V8MAL_4V8MAL
+ 4V8MAL_5OPTk
+ 4V8MAL_5T2AAL
+ 4V8MAL_6AZ1V
+ 4V8MAL_7ASEi
+ 4WCE3_4WF93
+ 4WCE3_5LI07
+ 4WCE3_5NGMA3
+ 4WCE3_5NRG3
+ 4WCE3_6SJ67
+ 4WCER_4WF9R
+ 4WCER_5ND9X
+ 4WCER_5NRGR
+ 4WCER_6WQQG
+ 4WCER_7ASOZ
+ 4WFTA_4WFTC
+ 4WFTA_5OC5A
+ 4WFTA_5OC6A
+ 4WFTA_6EI8A
+ 4WFTA_6F00B
+ 4WGVA_4WGVA
+ 4WGVA_4WGWC
+ 4WGVA_5M94A
+ 4WGVA_5M94C
+ 4WGVA_7PHQB
+ 4WHJA_4WHJA
+ 4WHJA_4WHJB
+ 4WHJA_5UOT0
+ 4WHJA_5UOTB
+ 4WHJA_5UOTY
+ 4WHTB_4WHTB
+ 4WHTB_4WHYH
+ 4WHTB_4WHYJ
+ 4WHTB_5AUMB
+ 4WHTB_5AUML
+ 4WPHA_4WPHA
+ 4WPHA_4WPHB
+ 4WPHA_4WPIA
+ 4WPHA_5GG4A
+ 4WPHA_6P5LA
+ 4WSAC_5EPIG
+ 4WSAC_5EPIS
+ 4WSAC_5EPIW
+ 4WSAC_6QCTC
+ 4WSAC_6QCVC
+ 4WXQA_4WXQA
+ 4WXQA_6NAXA
+ 4WXQA_6PKDB
+ 4WXQA_6PKEB
+ 4WXQA_6PKFA
+ 4WZ72_6GCS2
+ 4WZ72_6H8K2
+ 4WZ72_6RFQ2
+ 4WZ72_7ZKP2
+ 4WZ72_7ZKQ2
+ 4XHAA_4XHAB
+ 4XHAA_5CYRB
+ 4XHAA_7OM6B
+ 4XHAA_7OM9A
+ 4XHAA_7OM9B
+ 4XL1B_4XLWB
+ 4XL1B_4XLWD
+ 4XL1B_4XLWF
+ 4XL1B_4XLWH
+ 4XL1B_5MVXA
+ 4Y7JA_4Y7JA
+ 4Y7JA_4Y7JB
+ 4Y7JA_4Y7JE
+ 4Y7JA_4Y7KA
+ 4Y7JA_4Y7KD
+ 4Y97A_4Y97A
+ 4Y97A_4Y97E
+ 4Y97A_5EXRD
+ 4Y97A_7U5CD
+ 4Y97A_8D0KG
+ 4YB5A_4YB5B
+ 4YB5A_4YB5E
+ 4YB5A_4YB6A
+ 4YB5A_4YB7B
+ 4YB5A_4YB7D
+ 4YONA_4YONA
+ 4YONA_5FI0A
+ 4YONA_5FI0C
+ 4YONA_5FI0G
+ 4YONA_5FI1A
+ 4Z2CA_4Z2CB
+ 4Z2CA_4Z2DA
+ 4Z2CA_6N1PA
+ 4Z2CA_6N1PB
+ 4Z2CA_6N1RA
+ 4Z87A_4Z87B
+ 4Z87A_5MCPA
+ 4Z87A_5MCPC
+ 4Z87A_5MCPH
+ 4Z87A_5TC3B
+ 4ZBWA_5H31A
+ 4ZBWA_5H31B
+ 4ZBWA_5H33A
+ 4ZBWA_5L08I
+ 4ZBWA_7LVJA
+ 4ZJFC_4ZJFC
+ 4ZJFC_7PVDA
+ 4ZJFC_7SGFA
+ 4ZJFC_8EJEA
+ 4ZJFC_8EJIB
+ 4ZV3A_4ZV3A
+ 4ZV3A_4ZV3C
+ 4ZV3A_6VFYD
+ 4ZV3A_6VFYE
+ 4ZV3A_6VFYF
+ 5A31C_5A31P
+ 5A31C_5G04C
+ 5A31C_6TNTC
+ 5A31C_6TNTP
+ 5A31C_7QE7U
+ 5A5UB_5A5UB
+ 5A5UB_5K1HB
+ 5A5UB_6YBT1
+ 5A5UB_6ZP4B
+ 5A5UB_6ZVJB
+ 5A63B_5FN2B
+ 5A63B_5FN4B
+ 5A63B_5FN5B
+ 5A63B_6IDFB
+ 5A63B_6LR4B
+ 5AKPA_5UYRA
+ 5AKPA_5UYRB
+ 5AKPA_6NDOB
+ 5AKPA_6PL0A
+ 5AKPA_6PL0B
+ 5AOSA_5AOSA
+ 5AOSA_5AOTA
+ 5AOSA_5FU2A
+ 5AOSA_5FU2B
+ 5AOSA_5FU3B
+ 5B04A_5B04A
+ 5B04A_5B04B
+ 5B04A_6JLYA
+ 5B04A_6JLYB
+ 5B04A_6JLZB
+ 5B3TA_5B3TB
+ 5B3TA_5B3UA
+ 5B3TA_5B3UB
+ 5B3TA_5B3VA
+ 5B3TA_5B3VB
+ 5BXBA_5BXBD
+ 5BXBA_5BXDA
+ 5BXBA_5BXDB
+ 5BXBA_5BXDD
+ 5BXBA_5BXDE
+ 5C21A_5C21A
+ 5C21A_5C21B
+ 5C21A_5C22A
+ 5C21A_5C22B
+ 5C21A_5C22C
+ 5C73A_5C73A
+ 5C73A_5C78A
+ 5C73A_5C78D
+ 5C73A_5NBDB
+ 5C73A_6HRCA
+ 5C8SB_5NFYB
+ 5C8SB_5SKWD
+ 5C8SB_5SL1D
+ 5C8SB_7EIZK
+ 5C8SB_7R2VB
+ 5CBNA_5CBNA
+ 5CBNA_5CBOA
+ 5CBNA_5CBOG
+ 5CBNA_5CBOH
+ 5CBNA_5CBOK
+ 5COCA_5COCA
+ 5COCA_5H7DE
+ 5COCA_5H7DF
+ 5COCA_5H7DH
+ 5COCA_5H7DL
+ 5D8CA_5D8CA
+ 5D8CA_5D8CB
+ 5D8CA_5D90A
+ 5D8CA_5D90C
+ 5D8CA_5D90D
+ 5DHVB_5DHVB
+ 5DHVB_5DHZL
+ 5DHVB_6CF2B
+ 5DHVB_6OSHL
+ 5DHVB_6OSVL
+ 5DO7B_5DO7B
+ 5DO7B_5DO7D
+ 5DO7B_7JR7B
+ 5DO7B_7R87B
+ 5DO7B_8CUBD
+ 5DOFA_5DOFA
+ 5DOFA_5DOFD
+ 5DOFA_5DOIA
+ 5DOFA_7UY5K
+ 5DOFA_7UY7C
+ 5DZUA_5DZUA
+ 5DZUA_5FNWA
+ 5DZUA_5FZUA
+ 5DZUA_5FZYA
+ 5DZUA_5FZZA
+ 5EGBA_5EGBA
+ 5EGBA_5EH2E
+ 5EGBA_5EH2F
+ 5EGBA_5EI9E
+ 5EGBA_5EI9F
+ 5EJJA_5EJJB
+ 5EJJA_5XDAA
+ 5EJJA_5XDAB
+ 5EJJA_5XDAC
+ 5EJJA_5XDAD
+ 5ELUA_5ELUA
+ 5ELUA_5OHLB
+ 5ELUA_5OHLE
+ 5ELUA_5OHMD
+ 5ELUA_6HJLH
+ 5ERPA_5ERPB
+ 5ERPA_7A7Da
+ 5ERPA_7A7Dc
+ 5ERPA_7A7Dd
+ 5ERPA_7A7Df
+ 5EY2A_5EY2A
+ 5EY2A_5EY2B
+ 5EY2A_5LOEC
+ 5EY2A_5LOOA
+ 5EY2A_5LOON
+ 5F72S_5GRWA
+ 5F72S_5GS1C
+ 5F72S_5GS3A
+ 5F72S_6DSIA
+ 5F72S_6KN9E
+ 5F7PA_5F7PA
+ 5F7PA_5F7QE
+ 5F7PA_5F7QL
+ 5F7PA_5F7RA
+ 5F7PA_5F7RE
+ 5FD5A_5FD5C
+ 5FD5A_5FD5D
+ 5FD5A_5FD6B
+ 5FD5A_5FD6C
+ 5FD5A_5FD6D
+ 5FFIA_5FFIB
+ 5FFIA_5FFID
+ 5FFIA_5FFIE
+ 5FFIA_6YAVC
+ 5FFIA_6YAVE
+ 5FLME_6EXVE
+ 5FLME_6O9LE
+ 5FLME_6XREE
+ 5FLME_7ASTF
+ 5FLME_8IUHE
+ 5FLZB_5FLZB
+ 5FLZB_5FM1B
+ 5FLZB_7M2WF
+ 5FLZB_7M2YC
+ 5FLZB_7M2ZC
+ 5FMWA_6DLWB
+ 5FMWA_6H03G
+ 5FMWA_6H04G
+ 5FMWA_8B0GJ
+ 5FMWA_8B0HI
+ 5FYWX_5FYWX
+ 5FYWX_5FZ5X
+ 5FYWX_5OQJX
+ 5FYWX_7ML2X
+ 5FYWX_7ML4X
+ 5G04R_5G04R
+ 5G04R_5LCWQ
+ 5G04R_5LCWR
+ 5G04R_6Q6GR
+ 5G04R_6TLJR
+ 5G1MA_5G1MA
+ 5G1MA_5G3RB
+ 5G1MA_5G5KB
+ 5G1MA_5G5UA
+ 5G1MA_5G6TA
+ 5G59A_5G59A
+ 5G59A_5G5MB
+ 5G59A_5LCNA
+ 5G59A_5LCNC
+ 5G59A_5LCND
+ 5GI0A_5GI0A
+ 5GI0A_5IWBA
+ 5GI0A_5IWWC
+ 5GI0A_5MPYA
+ 5GI0A_5MPYB
+ 5GJQO_5LN3O
+ 5GJQO_5T0CAa
+ 5GJQO_6EPCO
+ 5GJQO_6EPEO
+ 5GJQO_8CVTa
+ 5GJQQ_5LN3Q
+ 5GJQQ_5T0CAX
+ 5GJQQ_6EPDQ
+ 5GJQQ_6EPEQ
+ 5GJQQ_6WJNX
+ 5GM6o_5GMKo
+ 5GM6o_5MQ0v
+ 5GM6o_6EXNt
+ 5GM6o_7B9Vu
+ 5GM6o_7B9Vv
+ 5GPNh_5LC5M
+ 5GPNh_5LNKM
+ 5GPNh_6ZKMM
+ 5GPNh_7DH04
+ 5GPNh_7R4GM
+ 5GPNV_5GPNV
+ 5GPNV_7DGQA4
+ 5GPNV_7DGRA3
+ 5GPNV_7DGSA3
+ 5GPNV_7DGSt
+ 5GPYA_5GPYA
+ 5GPYA_5IY8Q
+ 5GPYA_7EG9U
+ 5GPYA_7EGCU
+ 5GPYA_7NVUW
+ 5GUDA_5GUDB
+ 5GUDA_5GUDD
+ 5GUDA_5IJZA
+ 5GUDA_5IJZJ
+ 5GUDA_5IJZL
+ 5GUPc_5GUPc
+ 5GUPc_5LNKv
+ 5GUPc_5XTCc
+ 5GUPc_6Q9BB8
+ 5GUPc_7DGZe
+ 5GUPM_5GUPM
+ 5GUPM_5LNKh
+ 5GUPM_5O31r
+ 5GUPM_7DH0P
+ 5GUPM_7V2CI
+ 5GUPs_5O31X
+ 5GUPs_6QC4A8
+ 5GUPs_7DGRQ
+ 5GUPs_7DH0Q
+ 5GUPs_7V2Cu
+ 5GUPt_5LNKs
+ 5GUPt_6G2Jo
+ 5GUPt_6QC7B7
+ 5GUPt_6ZKAs
+ 5GUPt_7DH0d
+ 5GUPZ_5GUPZ
+ 5GUPZ_5LNKn
+ 5GUPZ_7DGQZ
+ 5GUPZ_7DGSZ
+ 5GUPZ_7QSDk
+ 5H1BA_5JZCA
+ 5H1BA_5JZCG
+ 5H1BA_5NWLD
+ 5H1BA_8BR2A
+ 5H1BA_8BSCA
+ 5H1SZ_5H1SZ
+ 5H1SZ_5MLCZ
+ 5H1SZ_5MMIZ
+ 5H1SZ_5X8PZ
+ 5H1SZ_6ERIAY
+ 5H2GA_5H2GA
+ 5H2GA_5H2GB
+ 5H2GA_5H2YA
+ 5H2GA_5H2YB
+ 5H2GA_5M47A
+ 5H64B_5H64b
+ 5H64B_6BCUW
+ 5H64B_6SB0N
+ 5H64B_7OWGY
+ 5H64B_8ERAY
+ 5H77A_5H77B
+ 5H77A_5H77J
+ 5H77A_5H77K
+ 5H77A_5XBYB
+ 5H77A_5XBYC
+ 5HK1A_5HK1B
+ 5HK1A_5HK2A
+ 5HK1A_6DK0A
+ 5HK1A_6DK0C
+ 5HK1A_6DK1C
+ 5HWTA_5HWTA
+ 5HWTA_5HWTB
+ 5HWTA_5HWVA
+ 5HWTA_5HWVB
+ 5HWTA_5HWWA
+ 5HX2D_5HX2E
+ 5HX2D_5IV5A
+ 5HX2D_5IV5B
+ 5HX2D_5IV7A
+ 5HX2D_5IV7B
+ 5HZIA_5HZIA
+ 5HZIA_5HZJA
+ 5HZIA_5HZJB
+ 5HZIA_5HZKB
+ 5HZIA_5HZKD
+ 5ICJA_6C31E
+ 5ICJA_6C31H
+ 5ICJA_6HRYA
+ 5ICJA_6HRZA
+ 5ICJA_6HS0B
+ 5IFEC_6EN4C
+ 5IFEC_6FF7u
+ 5IFEC_7ABGu
+ 5IFEC_7QTTC
+ 5IFEC_8CH6C
3976
+ 5IITA_5IITB
3977
+ 5IITA_5IITC
3978
+ 5IITA_5IITD
3979
+ 5IITA_5LNCA
3980
+ 5IITA_5LNCB
3981
+ 5IJNC_5IJNU
3982
+ 5IJNC_7MW1A
3983
+ 5IJNC_7R5JA2
3984
+ 5IJNC_7R5JA5
3985
+ 5IJNC_7VCIU
3986
+ 5IPLF_5IPLF
3987
+ 5IPLF_5IPMF
3988
+ 5IPLF_6KJ6F
3989
+ 5IPLF_6UTYH
3990
+ 5IPLF_6UU7I
3991
+ 5IQLA_5IQLA
3992
+ 5IQLA_5XNVA
3993
+ 5IQLA_6LSDB
3994
+ 5IQLA_7EIEA
3995
+ 5IQLA_7EIEB
3996
+ 5IT5A_5IT5A
3997
+ 5IT5A_5OIUD
3998
+ 5IT5A_6EJFA
3999
+ 5IT5A_6EJFC
4000
+ 5IT5A_6F8LF
4001
+ 5IX3A_5IX3A
4002
+ 5IX3A_7KY3B
4003
+ 5IX3A_7KY4B
4004
+ 5IX3A_8FV0A
4005
+ 5IX3A_8FV1A
4006
+ 5IZKA_5IZKA
4007
+ 5IZKA_5IZKB
4008
+ 5IZKA_5IZLA
4009
+ 5IZKA_5IZLB
4010
+ 5IZKA_7ZJWE
4011
+ 5J1JA_5J1JA
4012
+ 5J1JA_5J1JB
4013
+ 5J1JA_5JVFA
4014
+ 5J1JA_7EJWA
4015
+ 5J1JA_7EJWB
4016
+ 5JFLA_5JFLA
4017
+ 5JFLA_5JFMA
4018
+ 5JFLA_5JFME
4019
+ 5JFLA_5JFNC
4020
+ 5JFLA_6GVSB
4021
+ 5JGHA_5JGHA
4022
+ 5JGHA_5JGHD
4023
+ 5JGHA_5JGHG
4024
+ 5JGHA_5JGHJ
4025
+ 5JGHA_5JH0D
4026
+ 5JUBA_5JUBA
4027
+ 5JUBA_5JUBB
4028
+ 5JUBA_6HU8A
4029
+ 5JUBA_6HUAA
4030
+ 5JUBA_6QERA
4031
+ 5JXTA_5JXTA
4032
+ 5JXTA_5JXTB
4033
+ 5JXTA_5JXTF
4034
+ 5JXTA_5JXTH
4035
+ 5JXTA_5JXTM
4036
+ 5JY7I_5JY7I
4037
+ 5JY7I_5JY7J
4038
+ 5JY7I_5JY7L
4039
+ 5JY7I_5JY7M
4040
+ 5JY7I_5JY7N
4041
+ 5JZJA_5JZNB
4042
+ 5JZJA_6KYQA
4043
+ 5JZJA_7F3GB
4044
+ 5JZJA_7KX6B
4045
+ 5JZJA_7KXWA
4046
+ 5KC1B_5KC1D
4047
+ 5KC1B_5KC1F
4048
+ 5KC1B_5KC1H
4049
+ 5KC1B_5KC1J
4050
+ 5KC1B_5KC1L
4051
+ 5KTEA_5KTEA
4052
+ 5KTEA_6BU5A
4053
+ 5KTEA_6D91A
4054
+ 5KTEA_6D9WA
4055
+ 5KTEA_8E60A
4056
+ 5KZ51_5KZ53
4057
+ 5KZ51_5KZ54
4058
+ 5KZ51_5KZ5O
4059
+ 5KZ51_6NZUA
4060
+ 5KZ51_6WI2A
4061
+ 5L3SA_5L3SA
4062
+ 5L3SA_5L3SE
4063
+ 5L3SA_5L3SG
4064
+ 5L3SA_5L3VA
4065
+ 5L3SA_5L3VB
4066
+ 5L7DA_5V57A
4067
+ 5L7DA_5V57B
4068
+ 5L7DA_6D32A
4069
+ 5L7DA_6D35A
4070
+ 5L7DA_7ZI0A
4071
+ 5L90A_5L90B
4072
+ 5L90A_5L91B
4073
+ 5L90A_5L92A
4074
+ 5L90A_5L92B
4075
+ 5L90A_5L94A
4076
+ 5LI0c_6S0Xc
4077
+ 5LI0c_6S13c
4078
+ 5LI0c_7ASOB
4079
+ 5LI0c_7ASPk
4080
+ 5LI0c_7P48c
4081
+ 5LMNX_5LMNX
4082
+ 5LMNX_5LMPX
4083
+ 5LMNX_5LMSX
4084
+ 5LMNX_5LMTX
4085
+ 5LMNX_5LMVX
4086
+ 5LTPA_5LTPE
4087
+ 5LTPA_5LTQF
4088
+ 5LTPA_5LTQJ
4089
+ 5LTPA_7Z7OA
4090
+ 5LTPA_7Z7PC
4091
+ 5LTVA_5LTVA
4092
+ 5LTVA_5LTVB
4093
+ 5LTVA_5LTVC
4094
+ 5LTVA_5LTVD
4095
+ 5LTVA_5LTVF
4096
+ 5LTWC_5LTWC
4097
+ 5LTWC_5LTWD
4098
+ 5LTWC_5LTWG
4099
+ 5LTWC_5LTWK
4100
+ 5LTWC_5LTWL
4101
+ 5M0RD_5M0RH
4102
+ 5M0RD_5T3AA
4103
+ 5M0RD_7U32D
4104
+ 5M0RD_7U32H
4105
+ 5M0RD_7ZPPD
4106
+ 5MQFT_5XJCV
4107
+ 5MQFT_6FF7T
4108
+ 5MQFT_7DVQV
4109
+ 5MQFT_7QTTW
4110
+ 5MQFT_7W5BV
4111
+ 5MWRA_6EL2B
4112
+ 5MWRA_6EN8B
4113
+ 5MWRA_6EN8D
4114
+ 5MWRA_6EN8E
4115
+ 5MWRA_6EN8F
4116
+ 5N4BA_5N4BA
4117
+ 5N4BA_5N4CB
4118
+ 5N4BA_5N4DB
4119
+ 5N4BA_5N4EB
4120
+ 5N4BA_5N4FA
4121
+ 5N6XA_5N6XA
4122
+ 5N6XA_5N6XB
4123
+ 5N6XA_5N72A
4124
+ 5N6XA_5Y9OA
4125
+ 5N6XA_5Y9OB
4126
+ 5N7HA_5N7HA
4127
+ 5N7HA_5N7IB
4128
+ 5N7HA_5N7KA
4129
+ 5N7HA_5N7KB
4130
+ 5N7HA_5N7KC
4131
+ 5NBCA_5NBCA
4132
+ 5NBCA_5NHKA
4133
+ 5NBCA_5NHKB
4134
+ 5NBCA_5NHKC
4135
+ 5NBCA_5NHKD
4136
+ 5NGMAY_5NGMAY
4137
+ 5NGMAY_6S0XY
4138
+ 5NGMAY_6S0ZY
4139
+ 5NGMAY_6S12Y
4140
+ 5NGMAY_6S13Y
4141
+ 5NJ3A_5NJ3B
4142
+ 5NJ3A_6HBUA
4143
+ 5NJ3A_6HCOA
4144
+ 5NJ3A_6HZMA
4145
+ 5NJ3A_7OJHA
4146
+ 5NMUA_5NMUA
4147
+ 5NMUA_5NPLA
4148
+ 5NMUA_5NPLB
4149
+ 5NMUA_5NPLC
4150
+ 5NMUA_5NVDA
4151
+ 5NRLT_5NRLT
4152
+ 5NRLT_5ZWMu
4153
+ 5NRLT_6G90T
4154
+ 5NRLT_7DCOu
4155
+ 5NRLT_7OQBT
4156
+ 5NYFA_5NYGA
4157
+ 5NYFA_5NYJA
4158
+ 5NYFA_5NYJJ
4159
+ 5NYFA_5NYPA
4160
+ 5NYFA_5NYRA
4161
+ 5O1UA_5O25A
4162
+ 5O1UA_5O25B
4163
+ 5O1UA_5O58A
4164
+ 5O1UA_5O70A
4165
+ 5O1UA_5O7FA
4166
+ 5O5JM_5V93m
4167
+ 5O5JM_5XYUM
4168
+ 5O5JM_7KGBm
4169
+ 5O5JM_7MSHm
4170
+ 5O5JM_8FR8j
4171
+ 5OQMk_5OQMk
4172
+ 5OQMk_7UI9j
4173
+ 5OQMk_7UIOAj
4174
+ 5OQMk_8CENk
4175
+ 5OQMk_8CEOk
4176
+ 5SV0A_5SV1A
4177
+ 5SV0A_5ZFUC
4178
+ 5SV0A_5ZFUD
4179
+ 5SV0A_5ZFVA
4180
+ 5SV0A_5ZFVD
4181
+ 5SZDA_5SZDA
4182
+ 5SZDA_5SZDC
4183
+ 5SZDA_5SZDF
4184
+ 5SZDA_5SZDG
4185
+ 5SZDA_5SZEA
4186
+ 5T0IX_5T0IX
4187
+ 5T0IX_5T0JX
4188
+ 5T0IX_5VFRX
4189
+ 5T0IX_5VHSX
4190
+ 5T0IX_7QY7X
4191
+ 5T0WA_5T0WA
4192
+ 5T0WA_5T0WB
4193
+ 5T0WA_5T0WC
4194
+ 5T0WA_5T0WD
4195
+ 5T0WA_5TUJC
4196
+ 5T2UA_5T2UA
4197
+ 5T2UA_5T2UB
4198
+ 5T2UA_5T2UC
4199
+ 5T2UA_5T2UD
4200
+ 5T2UA_5T2VA
4201
+ 5T3OA_5T3OA
4202
+ 5T3OA_5T3OB
4203
+ 5T3OA_5T3OC
4204
+ 5T3OA_7AWOA
4205
+ 5T3OA_7PN0B
4206
+ 5T4MA_5T4MA
4207
+ 5T4MA_5T4NA
4208
+ 5T4MA_5T4NB
4209
+ 5T4MA_6E8FA
4210
+ 5T4MA_6E8FC
4211
+ 5T4OK_5T4OK
4212
+ 5T4OK_5T4PK
4213
+ 5T4OK_5T4QK
4214
+ 5T4OK_6OQRa
4215
+ 5T4OK_8DBWa
4216
+ 5TCSB_5TCSB
4217
+ 5TCSB_5TD8B
4218
+ 5TCSB_7KDFB
4219
+ 5TCSB_8G0QB
4220
+ 5TCSB_8G0QD
4221
+ 5TEJA_5TEJA
4222
+ 5TEJA_5TENA
4223
+ 5TEJA_5TENH
4224
+ 5TEJA_5US6B
4225
+ 5TEJA_5US6L
4226
+ 5TGQA_5TGQA
4227
+ 5TGQA_5TGXA
4228
+ 5TGQA_5TGXB
4229
+ 5TGQA_5TH3B
4230
+ 5TGQA_5TH3C
4231
+ 5TMBA_5TMBA
4232
+ 5TMBA_5TMDA
4233
+ 5TMBA_6BK0A
4234
+ 5TMBA_6BK2A
4235
+ 5TMBA_6BK3A
4236
+ 5U8QH_5U8RH
4237
+ 5U8QH_5W9HB
4238
+ 5U8QH_5W9JB
4239
+ 5U8QH_6AJ9D
4240
+ 5U8QH_6PV8F
4241
+ 5UF5A_5UF5A
4242
+ 5UF5A_5UF5B
4243
+ 5UF5A_5UFKA
4244
+ 5UF5A_6WLZX
4245
+ 5UF5A_6WM2Z
4246
+ 5UHKB_5UHOB
4247
+ 5UHKB_5UHOD
4248
+ 5UHKB_5UHPF
4249
+ 5UHKB_5UHPH
4250
+ 5UHKB_6PM9E
4251
+ 5UJMB_5UJMB
4252
+ 5UJMB_7CTEB
4253
+ 5UJMB_7CTFB
4254
+ 5UJMB_7JPPB
4255
+ 5UJMB_7JPQB
4256
+ 5UN1A_5UN1A
4257
+ 5UN1A_5UN1C
4258
+ 5UN1A_5UN1E
4259
+ 5UN1A_5UN1G
4260
+ 5UN1A_8E93C
4261
+ 5UOWB_6MMAB
4262
+ 5UOWB_6MMBB
4263
+ 5UOWB_6MMBD
4264
+ 5UOWB_6MMJD
4265
+ 5UOWB_6MMND
4266
+ 5V2OA_5V2OA
4267
+ 5V2OA_5V2OB
4268
+ 5V2OA_5V2OC
4269
+ 5V2OA_5V2OE
4270
+ 5V2OA_5V2OF
4271
+ 5V6PA_5V6PB
4272
+ 5V6PA_6VJYB
4273
+ 5V6PA_6VJZB
4274
+ 5V6PA_6VK0B
4275
+ 5V6PA_6VK1B
4276
+ 5V7QV_5V7QV
4277
+ 5V7QV_5V93V
4278
+ 5V7QV_7F0DV
4279
+ 5V7QV_7MSHV
4280
+ 5V7QV_7MSZV
4281
+ 5VF3A_6UZCA
4282
+ 5VF3A_7VRTcc
4283
+ 5VF3A_7VRTcf
4284
+ 5VF3A_8GMO8
4285
+ 5VF3A_8GMOAN
4286
+ 5VOKB_5VOKD
4287
+ 5VOKB_5VOKH
4288
+ 5VOKB_5Y39I
4289
+ 5VOKB_5YK3D
4290
+ 5VOKB_7T3CI
4291
+ 5VPAA_5VPBA
4292
+ 5VPAA_5VPDC
4293
+ 5VPAA_6UCIB
4294
+ 5VPAA_6UCID
4295
+ 5VPAA_6UCMC
4296
+ 5W78B_5W78B
4297
+ 5W78B_5W7BC
4298
+ 5W78B_5W7BD
4299
+ 5W78B_5W7CC
4300
+ 5W78B_5W7CD
4301
+ 5W9Hp_5W9Hp
4302
+ 5W9Hp_5W9JJ
4303
+ 5W9Hp_5W9LC
4304
+ 5W9Hp_5W9NH
4305
+ 5W9Hp_5W9NJ
4306
+ 5WAIB_5WAIB
4307
+ 5WAIB_5WAIF
4308
+ 5WAIB_5WAKB
4309
+ 5WAIB_6NQ3B
4310
+ 5WAIB_6NQ3F
4311
+ 5WI8A_5WI8C
4312
+ 5WI8A_5WI8D
4313
+ 5WI8A_5WIWA
4314
+ 5WI8A_5WJFA
4315
+ 5WI8A_5WJFB
4316
+ 5WKRA_5WKSA
4317
+ 5WKRA_5WL3A
4318
+ 5WKRA_5WL5A
4319
+ 5WKRA_5WL7A
4320
+ 5WKRA_5WL8B
4321
+ 5WQLC_5WQLC
4322
+ 5WQLC_5WQLD
4323
+ 5WQLC_6IQQC
4324
+ 5WQLC_6IQQD
4325
+ 5WQLC_6IQRA
4326
+ 5WYJBC_5WYJBC
4327
+ 5WYJBC_5WYKBC
4328
+ 5WYJBC_6KE6B3
4329
+ 5WYJBC_6LQUB3
4330
+ 5WYJBC_6ZQFUM
4331
+ 5XEZA_5XEZA
4332
+ 5XEZA_5XEZB
4333
+ 5XEZA_5XF1A
4334
+ 5XEZA_5XF1B
4335
+ 5XEZA_5YQZR
4336
+ 5XONW_5XONW
4337
+ 5XONW_6IR9W
4338
+ 5XONW_6J50W
4339
+ 5XONW_7XN7W
4340
+ 5XONW_7XTIW
4341
+ 5XPYA_5XPYA
4342
+ 5XPYA_5XPZB
4343
+ 5XPYA_5XQ0A
4344
+ 5XPYA_5XQ0B
4345
+ 5XPYA_6XTJA
4346
+ 5XSDA_5XSDA
4347
+ 5XSDA_5XSDB
4348
+ 5XSDA_5XSJX
4349
+ 5XSDA_5XSSA
4350
+ 5XSDA_5XSSB
4351
+ 5XVNA_5XVNB
4352
+ 5XVNA_5XVNC
4353
+ 5XVNA_5XVOI
4354
+ 5XVNA_5XVPA
4355
+ 5XVNA_5XVPD
4356
+ 5Y6RA_5Y6RA
4357
+ 5Y6RA_5YF7A
4358
+ 5Y6RA_5YF8A
4359
+ 5Y6RA_6AE6A
4360
+ 5Y6RA_6AE7A
4361
+ 5YC8A_5YC8A
4362
+ 5YC8A_5ZK3A
4363
+ 5YC8A_5ZK8A
4364
+ 5YC8A_5ZKBA
4365
+ 5YC8A_5ZKCA
4366
+ 5YHXA_5YHXM
4367
+ 5YHXA_5YHYA
4368
+ 5YHXA_5YI0A
4369
+ 5YHXA_5YI2F
4370
+ 5YHXA_5YI3I
4371
+ 5YIUA_5YIVA
4372
+ 5YIUA_5YIVC
4373
+ 5YIUA_5YIWA
4374
+ 5YIUA_5YIWC
4375
+ 5YIUA_5Z7IB
4376
+ 5Z56v_5Z56v
4377
+ 5Z56v_6QX9A2
4378
+ 5Z56v_7ABIF
4379
+ 5Z56v_7VPXB
4380
+ 5Z56v_8CH6I
4381
+ 5Z56Y_6FF41
4382
+ 5Z56Y_6FF71
4383
+ 5Z56Y_7ABH1
4384
+ 5Z56Y_7ABI1
4385
+ 5Z56Y_7DVQY
4386
+ 5ZAPQ_5ZAPa
4387
+ 5ZAPQ_5ZAPZ
4388
+ 5ZAPQ_5ZZ82
4389
+ 5ZAPQ_5ZZ8X
4390
+ 5ZAPQ_6ODMH
4391
+ 5ZYFA_6QXTK
4392
+ 5ZYFA_6QXTQ
4393
+ 5ZYFA_6QXTt
4394
+ 5ZYFA_6QXTT
4395
+ 5ZYFA_6QXTx
4396
+ 6A2QA_6A2RD
4397
+ 6A2QA_6A2RE
4398
+ 6A2QA_6A2SA
4399
+ 6A2QA_6A2SG
4400
+ 6A2QA_6A2TA
4401
+ 6A93A_6A93A
4402
+ 6A93A_6A94A
4403
+ 6A93A_6WH4B
4404
+ 6A93A_6WH4C
4405
+ 6A93A_7WC5A
4406
+ 6AN7C_6AN7C
4407
+ 6AN7C_6AN7D
4408
+ 6AN7C_6M96B
4409
+ 6AN7C_7K2TD
4410
+ 6AN7C_8DNCD
4411
+ 6B02A_6B02A
4412
+ 6B02A_6B02B
4413
+ 6B02A_6B04A
4414
+ 6B02A_6B04C
4415
+ 6B02A_6B06B
4416
+ 6B0XA_6B0XA
4417
+ 6B0XA_6B23A
4418
+ 6B0XA_6C21G
4419
+ 6B0XA_6C22A
4420
+ 6B0XA_6C22D
4421
+ 6B435_6B435
4422
+ 6B435_6B438
4423
+ 6B435_6B43e
4424
+ 6B435_6PPD5
4425
+ 6B435_6PPH5
4426
+ 6B4FC_6B4FC
4427
+ 6B4FC_6B4FD
4428
+ 6B4FC_6B4IC
4429
+ 6B4FC_6B4JC
4430
+ 6B4FC_7TBLf
4431
+ 6B9SA_6B9SB
4432
+ 6B9SA_6B9SE
4433
+ 6B9SA_6B9SG
4434
+ 6B9SA_6B9TE
4435
+ 6B9SA_6B9TF
4436
+ 6BUZN_6BUZN
4437
+ 6BUZN_6C0WK
4438
+ 6BUZN_6EQTB
4439
+ 6BUZN_6MUOM
4440
+ 6BUZN_6MUPM
4441
+ 6C5RA_6C5RB
4442
+ 6C5RA_6C5RC
4443
+ 6C5RA_6C5RE
4444
+ 6C5RA_6C5RF
4445
+ 6C5RA_6C5RH
4446
+ 6C9UH_6WHKC
4447
+ 6C9UH_8C3VH
4448
+ 6C9UH_8EE0H
4449
+ 6C9UH_8GB5B
4450
+ 6C9UH_8GB5I
4451
+ 6CG8A_6K2JB
4452
+ 6CG8A_6K2JC
4453
+ 6CG8A_6K2JD
4454
+ 6CG8A_6OZYB
4455
+ 6CG8A_6OZZB
4456
+ 6CGQA_6CGQA
4457
+ 6CGQA_6CGQB
4458
+ 6CGQA_6NMXA
4459
+ 6CGQA_6NMXB
4460
+ 6CGQA_6NMXC
4461
+ 6CM3A_6CM3B
4462
+ 6CM3A_7UGND
4463
+ 6CM3A_7UGOD
4464
+ 6CM3A_7UGPD
4465
+ 6CM3A_7UGQE
4466
+ 6CP3Z_6CP3Z
4467
+ 6CP3Z_7TJYU
4468
+ 6CP3Z_7TKDU
4469
+ 6CP3Z_7TKKU
4470
+ 6CP3Z_7TKOU
4471
+ 6D6UE_6D6UE
4472
+ 6D6UE_6HUGC
4473
+ 6D6UE_6X3SE
4474
+ 6D6UE_6X3WE
4475
+ 6D6UE_8DD2E
4476
+ 6D6VE_6D6VE
4477
+ 6D6VE_7LMAE
4478
+ 6D6VE_7LMBE
4479
+ 6D6VE_7UY5E
4480
+ 6D6VE_7UY6E
4481
+ 6D6VF_6D6VF
4482
+ 6D6VF_7LMAF
4483
+ 6D6VF_7LMBF
4484
+ 6D6VF_7UY5F
4485
+ 6D6VF_7UY6F
4486
+ 6DFPA_6DFPA
4487
+ 6DFPA_6EZVX
4488
+ 6DFPA_7P3RA
4489
+ 6DFPA_7P3RB
4490
+ 6DFPA_7P3RC
4491
+ 6DK7A_6DK7C
4492
+ 6DK7A_6DK7H
4493
+ 6DK7A_6DK8B
4494
+ 6DK7A_6DK8E
4495
+ 6DK7A_7N0EB
4496
+ 6E101_6E101
4497
+ 6E101_6E102
4498
+ 6E101_6E106
4499
+ 6E101_6E114
4500
+ 6E101_6E116
4501
+ 6E8WA_6E8WA
4502
+ 6E8WA_6E8WB
4503
+ 6E8WA_6V4TA
4504
+ 6E8WA_6V4TB
4505
+ 6E8WA_6V4TC
4506
+ 6EFRA_6URUA
4507
+ 6EFRA_6URUB
4508
+ 6EFRA_7S7UA
4509
+ 6EFRA_7S7VB
4510
+ 6EFRA_7S7VC
4511
+ 6EM8A_6EM8A
4512
+ 6EM8A_6EM8E
4513
+ 6EM8A_6EM8H
4514
+ 6EM8A_6EM8L
4515
+ 6EM8A_6EM9L
4516
+ 6F5DD_6F5DE
4517
+ 6F5DD_8APEF1
4518
+ 6F5DD_8APFD1
4519
+ 6F5DD_8APHF1
4520
+ 6F5DD_8APKD1
4521
+ 6FJUA_6FJUA
4522
+ 6FJUA_6FJUB
4523
+ 6FJUA_6FJVA
4524
+ 6FJUA_6FNNB
4525
+ 6FJUA_6FNOA
4526
+ 6FONA_6FONA
4527
+ 6FONA_6FONC
4528
+ 6FONA_6FP6D
4529
+ 6FONA_6FP6L
4530
+ 6FONA_6FP6X
4531
+ 6FPEB_6FPEB
4532
+ 6FPEB_6FPEG
4533
+ 6FPEB_6N9AD
4534
+ 6FPEB_6NAKB
4535
+ 6FPEB_6S84A
4536
+ 6FYXo_6FYXo
4537
+ 6FYXo_6FYYo
4538
+ 6FYXo_6ZCEo
4539
+ 6FYXo_6ZU9o
4540
+ 6FYXo_8CASo
4541
+ 6FYXq_6GSMq
4542
+ 6FYXq_6GSNq
4543
+ 6FYXq_6ZCEq
4544
+ 6FYXq_8CAHq
4545
+ 6FYXq_8CASq
4546
+ 6G1BB_6G1BB
4547
+ 6G1BB_6G1BJ
4548
+ 6G1BB_6G1DA
4549
+ 6G1BB_6G4RA
4550
+ 6G1BB_6G4RB
4551
+ 6G1HA_6G1HA
4552
+ 6G1HA_6G1MA
4553
+ 6G1HA_6G1MB
4554
+ 6G1HA_6G1MC
4555
+ 6G1HA_6G1MD
4556
+ 6G3BA_6G3BB
4557
+ 6G3BA_6S48A
4558
+ 6G3BA_6S48B
4559
+ 6G3BA_6S58A
4560
+ 6G3BA_6S58C
4561
+ 6GAWBq_6I9Rl
4562
+ 6GAWBq_6VLZl
4563
+ 6GAWBq_6YDPBq
4564
+ 6GAWBq_7L20l
4565
+ 6GAWBq_8OINBc
4566
+ 6GCSH_6GCSH
4567
+ 6GCSH_6RFRH
4568
+ 6GCSH_6RFSH
4569
+ 6GCSH_6YJ4E
4570
+ 6GCSH_7B0NE
4571
+ 6GY6B_6GY6B
4572
+ 6GY6B_6GY7A
4573
+ 6GY6B_6GY7B
4574
+ 6GY6B_6GY7C
4575
+ 6GY6B_6GY7D
4576
+ 6H9XA_6HHZA
4577
+ 6H9XA_6R1MA
4578
+ 6H9XA_6R1MB
4579
+ 6H9XA_6R1OA
4580
+ 6H9XA_6S30A
4581
+ 6HB0A_6HB0A
4582
+ 6HB0A_6HB0B
4583
+ 6HB0A_6HBDA
4584
+ 6HB0A_6HBDB
4585
+ 6HB0A_6HBMB
4586
+ 6HBGA_6HBGA
4587
+ 6HBGA_6HBHA
4588
+ 6HBGA_7XXAA
4589
+ 6HBGA_7XXGA
4590
+ 6HBGA_7XXJA
4591
+ 6HIVA1_6HIVA1
4592
+ 6HIVA1_6HIXA1
4593
+ 6HIVA1_6YXXA1
4594
+ 6HIVA1_6YXYA1
4595
+ 6HIVA1_7AOIA1
4596
+ 6HIVAN_6HIVAN
4597
+ 6HIVAN_6HIXAN
4598
+ 6HIVAN_6YXXAN
4599
+ 6HIVAN_6YXYAN
4600
+ 6HIVAN_7ANEF
4601
+ 6HIVBf_6HIVBf
4602
+ 6HIVBf_6YXXBf
4603
+ 6HIVBf_6YXYBf
4604
+ 6HIVBf_7AIHBE
4605
+ 6HIVBf_7AOIBf
4606
+ 6HIVCC_6HIVCC
4607
+ 6HIVCC_6SG9CC
4608
+ 6HIVCC_6SGBCC
4609
+ 6HIVCC_7PUACC
4610
+ 6HIVCC_7PUBCC
4611
+ 6HIVCH_6HIVCH
4612
+ 6HIVCH_6SGACH
4613
+ 6HIVCH_6SGBCH
4614
+ 6HIVCH_7AORc
4615
+ 6HIVCH_7PUBCH
4616
+ 6HIVDU_6HIVDU
4617
+ 6HIVDU_6HIWDU
4618
+ 6HIVDU_6SGADU
4619
+ 6HIVDU_6SGBDU
4620
+ 6HIVDU_7PUBDU
4621
+ 6HS7A_6HS7a
4622
+ 6HS7A_6HS7A
4623
+ 6HS7A_6HS7D
4624
+ 6HS7A_6IXHP
4625
+ 6HS7A_6IXHU
4626
+ 6HU9d_6HU9d
4627
+ 6HU9d_6T0Bq
4628
+ 6HU9d_8E7St
4629
+ 6HU9d_8E7ST
4630
+ 6HU9d_8EC0T
4631
+ 6HU9j_6HU9j
4632
+ 6HU9j_6T0Bw
4633
+ 6HU9j_6T15j
4634
+ 6HU9j_8E7Su
4635
+ 6HU9j_8EC0U
4636
+ 6HUMA_6L7OA
4637
+ 6HUMA_6L7PA
4638
+ 6HUMA_6NBQA
4639
+ 6HUMA_6NBXA
4640
+ 6HUMA_6TJVA
4641
+ 6HUMF_6KHIF
4642
+ 6HUMF_6L7PF
4643
+ 6HUMF_6NBQF
4644
+ 6HUMF_6NBXF
4645
+ 6HUMF_6NBYF
4646
+ 6HUML_6HUML
4647
+ 6HUML_6KHJL
4648
+ 6HUML_6L7OL
4649
+ 6HUML_6NBQL
4650
+ 6HUML_6NBXL
4651
+ 6HWWA_6HWWB
4652
+ 6HWWA_6HWXA
4653
+ 6HWWA_6HWYA
4654
+ 6HWWA_6HWYB
4655
+ 6HWWA_6HWYD
4656
+ 6I3MC_6I3MD
4657
+ 6I3MC_6I7TC
4658
+ 6I3MC_6QG0G
4659
+ 6I3MC_6QG2G
4660
+ 6I3MC_6QG2H
4661
+ 6I6BA_6I6BA
4662
+ 6I6BA_6I6JA
4663
+ 6I6BA_6ZXRA
4664
+ 6I6BA_7OYEA
4665
+ 6I6BA_7RXCK
4666
+ 6I7DA_6TU7AP1
4667
+ 6I7DA_6YCXA
4668
+ 6I7DA_6YCXB
4669
+ 6I7DA_6YCYA
4670
+ 6I7DA_6YCZA
4671
+ 6IEEA_7BHQA
4672
+ 6IEEA_7BHQB
4673
+ 6IEEA_7BHQC
4674
+ 6IEEA_7BHQD
4675
+ 6IEEA_7BHQE
4676
+ 6IJJD_6JO5D
4677
+ 6IJJD_6QPHD
4678
+ 6IJJD_6RHZD
4679
+ 6IJJD_7WYID
4680
+ 6IJJD_8H2UD
4681
+ 6IKNA_6IKNA
4682
+ 6IKNA_6IKNB
4683
+ 6IKNA_6IKNC
4684
+ 6IKNA_6IKND
4685
+ 6IKNA_6IKOB
4686
+ 6JCVA_6JCWA
4687
+ 6JCVA_6JD1K
4688
+ 6JCVA_6JD2F
4689
+ 6JCVA_6KPEA
4690
+ 6JCVA_6KPHA
4691
+ 6JSAA_6JSAA
4692
+ 6JSAA_6JSCA
4693
+ 6JSAA_6JSCB
4694
+ 6JSAA_6JSDA
4695
+ 6JSAA_6JSDB
4696
+ 6JZKA_6JZKA
4697
+ 6JZKA_6JZKB
4698
+ 6JZKA_6KMFA
4699
+ 6JZKA_6KMFB
4700
+ 6JZKA_6KMFC
4701
+ 6K82B_6K83C
4702
+ 6K82B_6K87B
4703
+ 6K82B_6K88A
4704
+ 6K82B_6K89B
4705
+ 6K82B_6K8AA
4706
+ 6KACR_6KACr
4707
+ 6KACR_6KACR
4708
+ 6KACR_6KADR
4709
+ 6KACR_6KAFr
4710
+ 6KACR_6KAFR
4711
+ 6KFVA_6KFVA
4712
+ 6KFVA_6KFVH
4713
+ 6KFVA_6KFVI
4714
+ 6KFVA_6KFVK
4715
+ 6KFVA_6KFVL
4716
+ 6KGX21_6KGX21
4717
+ 6KGX21_6KGX24
4718
+ 6KGX21_6KGXM1
4719
+ 6KGX21_7Y4LM9
4720
+ 6KGX21_7Y5EZ9
4721
+ 6KLOA_6KLOA
4722
+ 6KLOA_6KLWF
4723
+ 6KLOA_6OKRA
4724
+ 6KLOA_6OKRB
4725
+ 6KLOA_6V1SA
4726
+ 6L1TA_6L1TB
4727
+ 6L1TA_6RT0A
4728
+ 6L1TA_6XYQA
4729
+ 6L1TA_8ADWA
4730
+ 6L1TA_8BQWC
4731
+ 6L54B_6L54B
4732
+ 6L54B_6SYTB
4733
+ 6L54B_6Z3RB
4734
+ 6L54B_7PW4B
4735
+ 6L54B_7PW8B
4736
+ 6M0VA_6M0VA
4737
+ 6M0VA_6M0WA
4738
+ 6M0VA_6RJ9C
4739
+ 6M0VA_6RJAC
4740
+ 6M0VA_6RJDC
4741
+ 6M25A_6M25A
4742
+ 6M25A_6M25B
4743
+ 6M25A_6M26A
4744
+ 6M25A_6M2AA
4745
+ 6M25A_6M2AB
4746
+ 6M3WA_7T5OA
4747
+ 6M3WA_7TPLA
4748
+ 6M3WA_8DYAA
4749
+ 6M3WA_8FDWA
4750
+ 6M3WA_8FDWC
4751
+ 6MB3K_7M6FL
4752
+ 6MB3K_7M6HF
4753
+ 6MB3K_7M6HG
4754
+ 6MB3K_7TZ5L
4755
+ 6MB3K_7WONL
4756
+ 6NBFR_6NBFR
4757
+ 6NBFR_6NBIR
4758
+ 6NBFR_7VVKR
4759
+ 6NBFR_7VVNR
4760
+ 6NBFR_8FLQR
4761
+ 6NF5E_6NF5H
4762
+ 6NF5E_7BELE
4763
+ 6NF5E_7BELH
4764
+ 6NF5E_7ND4D
4765
+ 6NF5E_7RQ6D
4766
+ 6NHJ1_6NHJb
4767
+ 6NHJ1_6NHJc
4768
+ 6NHJ1_6NHJY
4769
+ 6NHJ1_6NHJz
4770
+ 6NHJ1_6NHJZ
4771
+ 6NMIE_6O9L6
4772
+ 6NMIE_6O9M6
4773
+ 6NMIE_8EBSE
4774
+ 6NMIE_8EBUE
4775
+ 6NMIE_8EBYE
4776
+ 6NR81_6NR81
4777
+ 6NR81_6NRB1
4778
+ 6NR81_6NRC1
4779
+ 6NR81_6NRD1
4780
+ 6NR81_7WU71
4781
+ 6NUWY_6NUWY
4782
+ 6NUWY_6OUAI
4783
+ 6NUWY_6QLDI
4784
+ 6NUWY_6QLEI
4785
+ 6NUWY_7L7QI
4786
+ 6NYYA_6NYYA
4787
+ 6NYYA_6NYYB
4788
+ 6NYYA_6NYYC
4789
+ 6NYYA_6NYYD
4790
+ 6NYYA_6NYYE
4791
+ 6O0QA_6O0QA
4792
+ 6O0QA_6O0VA
4793
+ 6O0QA_7NAKF
4794
+ 6O0QA_8GNIB
4795
+ 6O0QA_8GNJB
4796
+ 6O1WA_6O1WA
4797
+ 6O1WA_6O1XA
4798
+ 6O1WA_6O1XB
4799
+ 6O1WA_6O1YB
4800
+ 6O1WA_6O1ZA
4801
+ 6O9L3_6O9L3
4802
+ 6O9L3_7EGB0
4803
+ 6O9L3_7EGC0
4804
+ 6O9L3_7ENC0
4805
+ 6O9L3_7LBMd
4806
+ 6OAPA_6OAPA
4807
+ 6OAPA_6OAPB
4808
+ 6OAPA_6OAQA
4809
+ 6OAPA_6OB8A
4810
+ 6OAPA_6OB8B
4811
+ 6OJNA_6OJNA
4812
+ 6OJNA_6OJNB
4813
+ 6OJNA_8HIFB3
4814
+ 6OJNA_8HIFD3
4815
+ 6OJNA_8HIFF3
4816
+ 6OKUC_6OKUC
4817
+ 6OKUC_7VNJA
4818
+ 6OKUC_7VNJF
4819
+ 6OKUC_7YVQD
4820
+ 6OKUC_7YVQG
4821
+ 6ON2A_6ON2A
4822
+ 6ON2A_6ON2D
4823
+ 6ON2A_6ON2F
4824
+ 6ON2A_6V11B
4825
+ 6ON2A_6V11E
4826
+ 6PCOA_6PCOA
4827
+ 6PCOA_6PCOC
4828
+ 6PCOA_6PCOD
4829
+ 6PCOA_6PCPC
4830
+ 6PCOA_6PCPF
4831
+ 6PEM0_6PEM3
4832
+ 6PEM0_6PEM4
4833
+ 6PEM0_6PEP3
4834
+ 6PEM0_7AH91A
4835
+ 6PEM0_7AH91E
4836
+ 6PEPAN_6PEPAP
4837
+ 6PEPAN_6Q15AN
4838
+ 6PEPAN_7AGX1K
4839
+ 6PEPAN_7AH91K
4840
+ 6PEPAN_7AH91P
4841
+ 6PZTA_7D10A
4842
+ 6PZTA_7S1YB
4843
+ 6PZTA_7S1ZA
4844
+ 6PZTA_7S1ZB
4845
+ 6PZTA_7ZGOA
4846
+ 6Q89A_6YKQA
4847
+ 6Q89A_7AP2A
4848
+ 6Q89A_7NU8A
4849
+ 6Q89A_7NU9A
4850
+ 6Q89A_7NUCA
4851
+ 6QJQA_6QJQB
4852
+ 6QJQA_6QJQD
4853
+ 6QJQA_7QKFB
4854
+ 6QJQA_7QKFD
4855
+ 6QJQA_7QKFF
4856
+ 6QNWB_6QPFE
4857
+ 6QNWB_6TU5B
4858
+ 6QNWB_6TU5E
4859
+ 6QNWB_7NHXB
4860
+ 6QNWB_7QTLB
4861
+ 6QNWC_6QPGC
4862
+ 6QNWC_6QPGI
4863
+ 6QNWC_6RR7C
4864
+ 6QNWC_7NHXC
4865
+ 6QNWC_7QTLC
4866
+ 6QTIA_6QTIA
4867
+ 6QTIA_6QTIB
4868
+ 6QTIA_6QUEA
4869
+ 6QTIA_6QUEB
4870
+ 6QTIA_6S59B
4871
+ 6R24A_6R24A
4872
+ 6R24A_6R24D
4873
+ 6R24A_6R24E
4874
+ 6R24A_6R24F
4875
+ 6R24A_6R24H
4876
+ 6RMDA_6RMEC
4877
+ 6RMDA_6RMEE
4878
+ 6RMDA_6RMOM
4879
+ 6RMDA_6RMWC
4880
+ 6RMDA_6RNHA
4881
+ 6RTBH_6RTBH
4882
+ 6RTBH_7NCHA
4883
+ 6RTBH_7NCHF
4884
+ 6RTBH_7NCID
4885
+ 6RTBH_8BQWA
4886
+ 6S3EA_6S3EA
4887
+ 6S3EA_6S3EB
4888
+ 6S3EA_6S3NA
4889
+ 6S3EA_6S3PA
4890
+ 6S3EA_7OARA
4891
+ 6S7IA_6S7IB
4892
+ 6S7IA_6S7LA
4893
+ 6S7IA_6S7LB
4894
+ 6S7IA_7Q0NA
4895
+ 6S7IA_7Q0NB
4896
+ 6SB1A_6SB1A
4897
+ 6SB1A_6SB4A
4898
+ 6SB1A_6SB4B
4899
+ 6SB1A_6SB4D
4900
+ 6SB1A_6SB4F
4901
+ 6SCTA_6SCTA
4902
+ 6SCTA_6WCJA
4903
+ 6SCTA_6WCJC
4904
+ 6SCTA_6WCJD
4905
+ 6SCTA_6YAIA
4906
+ 6SCTE_6SCTE
4907
+ 6SCTE_6WCJJ
4908
+ 6SCTE_6WCJN
4909
+ 6SCTE_6WCJO
4910
+ 6SCTE_6YAIO
4911
+ 6SGEB_6SGEB
4912
+ 6SGEB_6SGED
4913
+ 6SGEB_7AZBB
4914
+ 6SGEB_7TE8A
4915
+ 6SGEB_7TE8D
4916
+ 6SNCB_6SNCH
4917
+ 6SNCB_6SNDH
4918
+ 6SNCB_6SNED
4919
+ 6SNCB_6SNEF
4920
+ 6SNCB_6SNEH
4921
+ 6T1FA_6T1FB
4922
+ 6T1FA_6T1FC
4923
+ 6T1FA_6T1FD
4924
+ 6T1FA_7BM8A
4925
+ 6T1FA_7BM8B
4926
+ 6TMHA_6TMHA
4927
+ 6TMHA_6TMKA1
4928
+ 6TMHA_6TMKC1
4929
+ 6TMHA_6TMKC2
4930
+ 6TMHA_6TMLC2
4931
+ 6TMHB_6TMHD
4932
+ 6TMHB_6TMHF
4933
+ 6TMHB_6TMKB1
4934
+ 6TMHB_6TMKD1
4935
+ 6TMHB_6TMKF2
4936
+ 6UT3A_6UT3A
4937
+ 6UT3A_6UT3B
4938
+ 6UT3A_6UT3E
4939
+ 6UT3A_6UT4F
4940
+ 6UT3A_6UT7D
4941
+ 6VLBA_6VLBA
4942
+ 6VLBA_6VLCA
4943
+ 6VLBA_6VLCB
4944
+ 6VLBA_6VLCC
4945
+ 6VLBA_6VLCD
4946
+ 6VYHA_6VYHA
4947
+ 6VYHA_6WBVA
4948
+ 6VYHA_6WIKA
4949
+ 6VYHA_8BZYA
4950
+ 6VYHA_8C02A
4951
+ 6VYVA_6VYVA
4952
+ 6VYVA_6VYVB
4953
+ 6VYVA_7V4TE
4954
+ 6VYVA_7WC2A
4955
+ 6VYVA_7WC2D
4956
+ 6VZ0A_6VZ0A
4957
+ 6VZ0A_6VZ0B
4958
+ 6VZ0A_6VZEC
4959
+ 6VZ0A_6VZEF
4960
+ 6VZ0A_6VZEH
4961
+ 6VZQA_6VZSB
4962
+ 6VZQA_6VZTA
4963
+ 6VZQA_6VZTB
4964
+ 6VZQA_6VZVA
4965
+ 6VZQA_6VZVB
4966
+ 6W1SD_6W1SD
4967
+ 6W1SD_7EMFG
4968
+ 6W1SD_7ENAg
4969
+ 6W1SD_7ENCg
4970
+ 6W1SD_7LBMt
4971
+ 6W1SH_6W1SH
4972
+ 6W1SH_7EMFK
4973
+ 6W1SH_7ENAk
4974
+ 6W1SH_7ENCk
4975
+ 6W1SH_7NVRc
4976
+ 6W1SJ_6W1SJ
4977
+ 6W1SJ_7EMFO
4978
+ 6W1SJ_7ENAo
4979
+ 6W1SJ_7LBMz
4980
+ 6W1SJ_8GXQo
4981
+ 6W1SO_6W1SO
4982
+ 6W1SO_7EMFT
4983
+ 6W1SO_7ENCt
4984
+ 6W1SO_7LBMl
4985
+ 6W1SO_7NVRf
4986
+ 6W5CA_6W5CA
4987
+ 6W5CA_6W62A
4988
+ 6W5CA_7D2LA
4989
+ 6W5CA_7D3JA
4990
+ 6W5CA_7D8CA
4991
+ 6W9OA_6W9OA
4992
+ 6W9OA_6W9RA
4993
+ 6W9OA_6W9RD
4994
+ 6W9OA_6W9RF
4995
+ 6W9OA_6W9RK
4996
+ 6WCTA_6WCTA
4997
+ 6WCTA_6WCTD
4998
+ 6WCTA_7S5EA
4999
+ 6WCTA_7S5EC
5000
+ 6WCTA_7S5ED
5001
+ 6WK5A_6WK5A
5002
+ 6WK5A_6WK8A
5003
+ 6WK5A_6WK8B
5004
+ 6WK5A_6WK9B
5005
+ 6WK5A_7SZTB
5006
+ 6WV3A_6WV3A
5007
+ 6WV3A_6WV4A
5008
+ 6WV3A_6WV5A
5009
+ 6WV3A_6WV6A
5010
+ 6WV3A_6WV7A
5011
+ 6X896M_6X896M
5012
+ 6X896M_7A23N
5013
+ 6X896M_7A24N
5014
+ 6X896M_8BQ6J
5015
+ 6X896M_8E736M
5016
+ 6X89AM_6X89AM
5017
+ 6X89AM_7A23Z
5018
+ 6X89AM_7A24Z
5019
+ 6X89AM_7AR7Z
5020
+ 6X89AM_8BQ5Z
5021
+ 6X89S6_6X89S6
5022
+ 6X89S6_7A23P
5023
+ 6X89S6_7AR7R
5024
+ 6X89S6_7AR8R
5025
+ 6X89S6_8BEDR
5026
+ 6XFIA_6XFIA
5027
+ 6XFIA_6XI2B
5028
+ 6XFIA_6XI2D
5029
+ 6XFIA_7E9KD
5030
+ 6XFIA_7E9LA
5031
+ 6YEXA_6YEXA
5032
+ 6YEXA_6YEXB
5033
+ 6YEXA_6YJBA
5034
+ 6YEXA_6YMGA
5035
+ 6YEXA_6YMGB
5036
+ 6YG8A_6YG8D
5037
+ 6YG8A_7L2ZA
5038
+ 6YG8A_7L2ZD
5039
+ 6YG8A_7L2ZF
5040
+ 6YG8A_7LBYB
5041
+ 6YNXS_6YNXs
5042
+ 6YNXS_6YNXS
5043
+ 6YNXS_6YNYS
5044
+ 6YNXS_6YNZs
5045
+ 6YNXS_6YNZS
5046
+ 6ZFWA_6ZFWB
5047
+ 6ZFWA_6ZFWC
5048
+ 6ZFWA_6ZFWD
5049
+ 6ZFWA_7QRCA
5050
+ 6ZFWA_7QRCB
5051
+ 6ZLZA_6ZMLA
5052
+ 6ZLZA_6ZMLB
5053
+ 6ZLZA_6ZMLC
5054
+ 6ZLZA_6ZMLD
5055
+ 6ZLZA_6ZMLF
5056
+ 6ZNGA_6ZNGC
5057
+ 6ZNGA_6ZNGE
5058
+ 6ZNGA_6ZNGF
5059
+ 6ZNGA_6ZNJB
5060
+ 6ZNGA_6ZNJC
5061
+ 6ZQDJD_6ZQDJD
5062
+ 6ZQDJD_6ZQEJD
5063
+ 6ZQDJD_6ZQFJD
5064
+ 6ZQDJD_6ZQGJD
5065
+ 6ZQDJD_7AJUJD
5066
+ 6ZQIA_6ZQIA
5067
+ 6ZQIA_6ZQJG
5068
+ 6ZQIA_6ZQVA
5069
+ 6ZQIA_6ZQVC
5070
+ 6ZQIA_6ZQVE
5071
+ 6ZVPA_6ZVPA
5072
+ 6ZVPA_7A2GA
5073
+ 6ZVPA_7A2GB
5074
+ 6ZVPA_7A2GC
5075
+ 6ZVPA_7A2GD
5076
+ 7A23a_7A23a
5077
+ 7A23a_7AR7L
5078
+ 7A23a_7AR8L
5079
+ 7A23a_8BPXL
5080
+ 7A23a_8E735M
5081
+ 7A23f_7A23f
5082
+ 7A23f_7AR7p
5083
+ 7A23f_7AR8p
5084
+ 7A23f_8BEHp
5085
+ 7A23f_8E73BJ
5086
+ 7A5SA_7A5SB
5087
+ 7A5SA_7A92A
5088
+ 7A5SA_7S0EA
5089
+ 7A5SA_7UZBA
5090
+ 7A5SA_7ZBUA
5091
+ 7A6HK_7A6HK
5092
+ 7A6HK_7ASTG
5093
+ 7A6HK_7DN3K
5094
+ 7A6HK_7OBAK
5095
+ 7A6HK_8A43K
5096
+ 7A6HM_7A6HM
5097
+ 7A6HM_7ASTK
5098
+ 7A6HM_7DN3M
5099
+ 7A6HM_7FJIM
5100
+ 7A6HM_7FJJM
5101
+ 7A6HQ_7A6HQ
5102
+ 7A6HQ_7AEAQ
5103
+ 7A6HQ_7D58Q
5104
+ 7A6HQ_7DN3Q
5105
+ 7A6HQ_8IUEQ
5106
+ 7AE4A_7AE4B
5107
+ 7AE4A_7AE4E
5108
+ 7AE4A_7AE5D
5109
+ 7AE4A_7AE7A
5110
+ 7AE4A_7AE7E
5111
+ 7AG6A_7AG6B
5112
+ 7AG6A_7AGHB
5113
+ 7AG6A_7AGHC
5114
+ 7AG6A_7AGKC
5115
+ 7AG6A_7AGKD
5116
+ 7APEA_7APEA
5117
+ 7APEA_7APEB
5118
+ 7APEA_7CAEE
5119
+ 7APEA_7CAFE
5120
+ 7APEA_7CAGE
5121
+ 7ARHC_7ARHC
5122
+ 7ARHC_7ARIC
5123
+ 7ARHC_7ARKC
5124
+ 7ARHC_7V8IC
5125
+ 7ARHC_7V8LC
5126
+ 7BUUA_7BUVA
5127
+ 7BUUA_7BUVB
5128
+ 7BUUA_7BUVD
5129
+ 7BUUA_7BUWA
5130
+ 7BUUA_7BUWB
5131
+ 7BVRA_7BVRA
5132
+ 7BVRA_7BVRC
5133
+ 7BVRA_7XRFC
5134
+ 7BVRA_7XRFE
5135
+ 7BVRA_7XRFG
5136
+ 7CADC_7CAEC
5137
+ 7CADC_7CAED
5138
+ 7CADC_7CAFD
5139
+ 7CADC_7CAGC
5140
+ 7CADC_7CAGD
5141
+ 7CO2A_7CO5B
5142
+ 7CO2A_7CO5H
5143
+ 7CO2A_7CO5J
5144
+ 7CO2A_7CO5L
5145
+ 7CO2A_7CO7D
5146
+ 7DD5A_7DD5B
5147
+ 7DD5A_7DTUA
5148
+ 7DD5A_7E6UA
5149
+ 7DD5A_7M3GB
5150
+ 7DD5A_7M3JA
5151
+ 7DDQA_7DDQA
5152
+ 7DDQA_7DDQB
5153
+ 7DDQA_7DDQS
5154
+ 7DDQA_7DDQT
5155
+ 7DDQA_7DDQU
5156
+ 7DKHA_7DKHA
5157
+ 7DKHA_7DKHE
5158
+ 7DKHA_7DKHI
5159
+ 7DKHA_8J8PC
5160
+ 7DKHA_8J8QC
5161
+ 7DP3A_7DP3A
5162
+ 7DP3A_7DP3B
5163
+ 7DP3A_7W7PB
5164
+ 7DP3A_7W7PD
5165
+ 7DP3A_8S92A
5166
+ 7E2CB_7E2CB
5167
+ 7E2CB_7E2DB
5168
+ 7E2CB_7E8TB
5169
+ 7E2CB_7U06e
5170
+ 7E2CB_7U06E
5171
+ 7E2CJ_7E2CJ
5172
+ 7E2CJ_7E8SJ
5173
+ 7E2CJ_7E8TJ
5174
+ 7E2CJ_7E93J
5175
+ 7E2CJ_7EA3J
5176
+ 7E9GR_7E9GS
5177
+ 7E9GR_7EPDA
5178
+ 7E9GR_7MTQA
5179
+ 7E9GR_7MTQB
5180
+ 7E9GR_7MTRA
5181
+ 7EKIA_7EKIA
5182
+ 7EKIA_7EKTA
5183
+ 7EKIA_7KOOA
5184
+ 7EKIA_7KOQA
5185
+ 7EKIA_7KOXA
5186
+ 7EMFJ_7EMFJ
5187
+ 7EMFJ_7ENCj
5188
+ 7EMFJ_7LBMv
5189
+ 7EMFJ_7NVRk
5190
+ 7EMFJ_8GXQj
5191
+ 7EWCA_7EWDB
5192
+ 7EWCA_7EWDC
5193
+ 7EWCA_7EWEB
5194
+ 7EWCA_7EWED
5195
+ 7EWCA_7EWEE
5196
+ 7F5BA_7F5BA
5197
+ 7F5BA_7F5BB
5198
+ 7F5BA_7F5BC
5199
+ 7F5BA_7F5BD
5200
+ 7F5BA_8FWWA
5201
+ 7FAXA_7FAXA
5202
+ 7FAXA_7XGWA
5203
+ 7FAXA_7XGWB
5204
+ 7FAXA_7XGWE
5205
+ 7FAXA_7XGWF
5206
+ 7JNAA_7JNAA
5207
+ 7JNAA_7JNCA
5208
+ 7JNAA_7SQFA
5209
+ 7JNAA_7SQGA
5210
+ 7JNAA_7SQHA
5211
+ 7KUFB_7KUGD
5212
+ 7KUFB_8CWTD
5213
+ 7KUFB_8CWTF
5214
+ 7KUFB_8CYFB
5215
+ 7KUFB_8D5VB
5216
+ 7KUIC_7KUIC
5217
+ 7KUIC_7KUIH
5218
+ 7KUIC_8E14D
5219
+ 7KUIC_8E14G
5220
+ 7KUIC_8E14H
5221
+ 7KZPB_7KZPB
5222
+ 7KZPB_7KZQO
5223
+ 7KZPB_7KZSO
5224
+ 7KZPB_7KZTB
5225
+ 7KZPB_7KZVB
5226
+ 7LARA_7LARA
5227
+ 7LARA_7LARB
5228
+ 7LARA_7LARD
5229
+ 7LARA_7LARE
5230
+ 7LARA_7LARF
5231
+ 7LJ7A_7LJ7A
5232
+ 7LJ7A_7LJ7B
5233
+ 7LJ7A_7M1TA
5234
+ 7LJ7A_7M1UA
5235
+ 7LJ7A_7M1UB
5236
+ 7LSYH_7LSYH
5237
+ 7LSYH_7LSYI
5238
+ 7LSYH_8BHVQ
5239
+ 7LSYH_8BHVR
5240
+ 7LSYH_8EZAI
5241
+ 7MDWB_7MDWB
5242
+ 7MDWB_7ME7B
5243
+ 7MDWB_8CXND
5244
+ 7MDWB_8CXNE
5245
+ 7MDWB_8CXNF
5246
+ 7MONB_7MONB
5247
+ 7MONB_7MX3A
5248
+ 7MONB_7MX3B
5249
+ 7MONB_7MX3C
5250
+ 7MONB_7MX3D
5251
+ 7MSC8_7MSC8
5252
+ 7MSC8_7MSH8
5253
+ 7MSC8_7MSM8
5254
+ 7MSC8_7MT28
5255
+ 7MSC8_7MT78
5256
+ 7N6G4T_7N6G4U
5257
+ 7N6G4T_7SQC5A
5258
+ 7N6G4T_7SQC5B
5259
+ 7N6G4T_7SQC5C
5260
+ 7N6G4T_7SQC5D
5261
+ 7N6IA_7N6IB
5262
+ 7N6IA_7OXDA
5263
+ 7N6IA_7SVUX
5264
+ 7N6IA_8BD5Q
5265
+ 7N6IA_8EA4Q
5266
+ 7N85C_7N85F
5267
+ 7N85C_7N9FC
5268
+ 7N85C_7N9FF
5269
+ 7N85C_7N9FI
5270
+ 7N85C_7WOOJ
5271
+ 7NDGB_7NDGB
5272
+ 7NDGB_7NDGE
5273
+ 7NDGB_7NDGH
5274
+ 7NDGB_7NE0B
5275
+ 7NDGB_7NE1B
5276
+ 7NS8A_7NS8A
5277
+ 7NS8A_7NSAA
5278
+ 7NS8A_7NSDA
5279
+ 7NS8A_7NSFA
5280
+ 7NS8A_7OA2A
5281
+ 7NYRB_7NYRB
5282
+ 7NYRB_7NYVB
5283
+ 7NYRB_7P62B
5284
+ 7NYRB_7P64B
5285
+ 7NYRB_7Z7SB
5286
+ 7NYWC_7NYWC
5287
+ 7NYWC_7NZ0C
5288
+ 7NYWC_7NZ2C2
5289
+ 7NYWC_7NZ4C1
5290
+ 7NYWC_7NZ4D1
5291
+ 7O0Uaa_7O0Uaa
5292
+ 7O0Uaa_7O0Vak
5293
+ 7O0Uaa_7O0Van
5294
+ 7O0Uaa_7O0Xak
5295
+ 7O0Uaa_7O0Xap
5296
+ 7O3WA_7O3WE
5297
+ 7O3WA_7O3WF
5298
+ 7O3WA_7O3XB
5299
+ 7O3WA_7O3ZA
5300
+ 7O3WA_7O40A
5301
+ 7OLCLG_7OLDLG
5302
+ 7OLCLG_7R81J1
5303
+ 7OLCLG_8I9WLG
5304
+ 7OLCLG_8I9ZLG
5305
+ 7OLCLG_8IA0LG
5306
+ 7OLCLP_7OLCLP
5307
+ 7OLCLP_7Z3NLP
5308
+ 7OLCLP_8I9PLP
5309
+ 7OLCLP_8I9RLP
5310
+ 7OLCLP_8I9VLP
5311
+ 7OODc_7OODc
5312
+ 7OODc_7PASc
5313
+ 7OODc_7PH9c
5314
+ 7OODc_7PIAc
5315
+ 7OODc_7PICc
5316
+ 7PAO9_7PAO9
5317
+ 7PAO9_7PAR9
5318
+ 7PAO9_7PIB9
5319
+ 7PAO9_7PIS9
5320
+ 7PAO9_7PIT9
5321
+ 7PQXA_7PQXB
5322
+ 7PQXA_7PR1A
5323
+ 7PQXA_7PROA
5324
+ 7PQXA_7PRUA
5325
+ 7PQXA_7PSDA
5326
+ 7PXEA_7PXEA
5327
+ 7PXEA_7PXFA
5328
+ 7PXEA_7PXGA
5329
+ 7PXEA_7PXHA
5330
+ 7PXEA_7PXHC
5331
+ 7QOOH_7QOOH
5332
+ 7QOOH_7XHNH
5333
+ 7QOOH_7XHOH
5334
+ 7QOOH_7YWXH
5335
+ 7QOOH_7YYHH
5336
+ 7QWKA_7QWKB
5337
+ 7QWKA_7QWKD
5338
+ 7QWKA_7QWKE
5339
+ 7QWKA_7QWKF
5340
+ 7QWKA_7QWKH
5341
+ 7R6OA_7R6OB
5342
+ 7R6OA_7R6OC
5343
+ 7R6OA_8IFJD
5344
+ 7R6OA_8IFJF
5345
+ 7R6OA_8IFJJ
5346
+ 7RYJA_7RYJA
5347
+ 7RYJA_7RYJB
5348
+ 7RYJA_7RYJD
5349
+ 7RYJA_7RYJE
5350
+ 7RYJA_7RYJF
5351
+ 7S2JA_7S2JA
5352
+ 7S2JA_7S2JB
5353
+ 7S2JA_7S2JC
5354
+ 7S2JA_7S2JD
5355
+ 7S2JA_7S2KA
5356
+ 7SG7A_7SG7A
5357
+ 7SG7A_7SG7D
5358
+ 7SG7A_7SG7K
5359
+ 7SG7A_7SG7O
5360
+ 7SG7A_7SG7R
5361
+ 7SXKa_7SXKc
5362
+ 7SXKa_7SXKg
5363
+ 7SXKa_7SXKh
5364
+ 7SXKa_7SYAa
5365
+ 7SXKa_7SZ4f
5366
+ 7T10R_7T11R
5367
+ 7T10R_7UL5A
5368
+ 7T10R_7XATA
5369
+ 7T10R_7Y24E
5370
+ 7T10R_7YAEE
5371
+ 7T4ZA_7T4ZA
5372
+ 7T4ZA_7T4ZB
5373
+ 7T4ZA_7T51A
5374
+ 7T4ZA_7T51B
5375
+ 7T4ZA_7T5AB
5376
+ 7TTSA_7TTSB
5377
+ 7TTSA_7TTSC
5378
+ 7TTSA_7TTSD
5379
+ 7TTSA_7TTSE
5380
+ 7TTSA_7TTSF
5381
+ 7TYFR_7TYFR
5382
+ 7TYFR_7TYHR
5383
+ 7TYFR_7TYIR
5384
+ 7TYFR_7TYNR
5385
+ 7TYFR_7TZFR
5386
+ 7UM4A_7UM4A
5387
+ 7UM4A_7UM5A
5388
+ 7UM4A_7UM6A
5389
+ 7UM4A_7UM7A
5390
+ 7UM4A_7X5HR
5391
+ 7UTIX_7UTIX
5392
+ 7UTIX_7UTLe
5393
+ 7UTIX_7UTLX
5394
+ 7UTIX_8DD0K
5395
+ 7UTIX_8DD0P
5396
+ 7UWGA_7UWGA
5397
+ 7UWGA_7UWGB
5398
+ 7UWGA_7UWGC
5399
+ 7UWGA_7UWGD
5400
+ 7UWGA_7UXUA
5401
+ 7V2BA_7V2VA
5402
+ 7V2BA_7V2VD
5403
+ 7V2BA_7V3WA
5404
+ 7V2BA_7V3WD
5405
+ 7V2BA_7V4ED
5406
+ 7V5YB_7V5YB
5407
+ 7V5YB_7V5ZA
5408
+ 7V5YB_7V5ZB
5409
+ 7V5YB_7V6WB
5410
+ 7V5YB_7V6WL
5411
+ 7W75C_7W75D
5412
+ 7W75C_7W76C
5413
+ 7W75C_7W76D
5414
+ 7W75C_7W76E
5415
+ 7W75C_7W76F
5416
+ 7WUYA_7WUYA
5417
+ 7WUYA_7WUYB
5418
+ 7WUYA_7WVSA
5419
+ 7WUYA_7WVSB
5420
+ 7WUYA_7WW0B
5421
+ 7X3HA_7X3HA
5422
+ 7X3HA_7X54A
5423
+ 7X3HA_7X55A
5424
+ 7X3HA_7X56A
5425
+ 7X3HA_7X59A
5426
+ 7X8AD_7XC7D
5427
+ 7X8AD_7XSRC
5428
+ 7X8AD_7Y85D
5429
+ 7X8AD_8D9GA
5430
+ 7X8AD_8D9HE
5431
+ 7XR2A_7XR2B
5432
+ 7XR2A_7XR3A
5433
+ 7XR2A_7XR3C
5434
+ 7XR2A_7XR3F
5435
+ 7XR2A_7XR3J
5436
+ 7Y4IA_7Y4IA
5437
+ 7Y4IA_7Y4IB
5438
+ 7Y4IA_8DTFA
5439
+ 7Y4IA_8DTGA
5440
+ 7Y4IA_8DTHA
5441
+ 7YCVA_7YCVA
5442
+ 7YCVA_7YCVB
5443
+ 7YCVA_7YCWB
5444
+ 7YCVA_7YCWC
5445
+ 7YCVA_7YCWD
5446
+ 7Z45A_7Z45A
5447
+ 7Z45A_7Z46A
5448
+ 7Z45A_7Z48P
5449
+ 7Z45A_7Z48T
5450
+ 7Z45A_7Z49DP
5451
+ 8A3TJ_8A3TJ
5452
+ 8A3TJ_8A3TK
5453
+ 8A3TJ_8A5YJ
5454
+ 8A3TJ_8A5YK
5455
+ 8A3TJ_8A61J
5456
+ 8AP6D_8AP6D
5457
+ 8AP6D_8APCd
5458
+ 8AP6D_8APEd
5459
+ 8AP6D_8APFd
5460
+ 8AP6D_8APHd
5461
+ 8B40A_8B40A
5462
+ 8B40A_8B40D
5463
+ 8B40A_8B42E
5464
+ 8B40A_8DS3F
5465
+ 8B40A_8F79F
5466
+ 8D3LA_8D3LA
5467
+ 8D3LA_8D3MB
5468
+ 8D3LA_8D3MD
5469
+ 8D3LA_8D3QB
5470
+ 8D3LA_8D3QC
5471
+ 8DTPA_8DTPA
5472
+ 8DTPA_8DTPF
5473
+ 8DTPA_8DUEA
5474
+ 8DTPA_8DUOF
5475
+ 8DTPA_8G0ZD
5476
+ 8EC2A_8EC2B
5477
+ 8EC2A_8EC2G
5478
+ 8EC2A_8ECJA
5479
+ 8EC2A_8ECJB
5480
+ 8EC2A_8ECJG
5481
+ 8EMCA_8EMCD
5482
+ 8EMCA_8EMHB
5483
+ 8EMCA_8EMHC
5484
+ 8EMCA_8EMHG
5485
+ 8EMCA_8EMHJ
5486
+ 8ESQG_8ESQG
5487
+ 8ESQG_8ETGG
5488
+ 8ESQG_8ETJG
5489
+ 8ESQG_8EUGG
5490
+ 8ESQG_8EUIG
5491
+ 8FXHA_8FXHA
5492
+ 8FXHA_8FXHB
5493
+ 8FXHA_8FXHC
5494
+ 8FXHA_8FXIB
5495
+ 8FXHA_8FXIC
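Each entry in the list above appears to follow a `<reference>_<target>` naming convention, where each half is a 4-character PDB code followed by a chain identifier (e.g. `5DO7B_8CUBD` pairs chain B of 5DO7 with chain D of 8CUB). A minimal sketch of splitting such an ID, assuming this convention (the helper name `split_pair` is hypothetical, not part of the repository):

```python
def split_pair(sample_id: str) -> tuple[str, str]:
    # Assumes IDs of the form <ref>_<target>, e.g. "5DO7B_8CUBD",
    # where each half is a 4-char PDB code plus chain identifier(s).
    ref, target = sample_id.split("_", 1)
    return ref, target

# Example: the first entry of the list splits into its two chains.
assert split_pair("5DO7B_8CUBD") == ("5DO7B", "8CUBD")
```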
petimot/__init__.py ADDED
File without changes
petimot/__main__.py ADDED
@@ -0,0 +1,231 @@
+ import typer
+
+ app = typer.Typer()
+
+
+ @app.command(name="infer")
+ def run_infer(
+     model_path: str = typer.Option(
+         "",
+         "--model-path",
+         "-m",
+         help="Path to model checkpoint",
+     ),
+     config_file: str = typer.Option(
+         "configs/default.yaml",
+         "--config-file",
+         "-c",
+         help="Path to configuration file",
+     ),
+     list_path: str = typer.Option(
+         "",
+         "--list-path",
+         "-l",
+         help="Path to file containing input files or sample indices, or direct path to a PDB/PT file",
+     ),
+     output_path: str = typer.Option(
+         "predictions",
+         "--output-path",
+         "-o",
+         help="Output directory path",
+     ),
+ ):
+     """
+     Run inference using a pretrained model. The input can be:
+     1. A direct path to a .pdb or .pt file
+     2. A text file containing paths or sample names
+     """
+     from petimot.infer.infer import infer
+
+     if list_path.endswith((".pdb", ".pt")):
+         input_list = [list_path]
+     else:
+         try:
+             with open(list_path, "r") as f:
+                 input_list = [line.strip() for line in f if line.strip()]
+         except Exception as e:
+             raise typer.BadParameter(f"Error reading list file: {e}")
+
+     if not input_list:
+         raise typer.BadParameter("No input files specified")
+
+     infer(
+         model_path=model_path,
+         config_file=config_file,
+         input_list=input_list,
+         output_path=output_path,
+     )
+
+
60
+ @app.command(name="evaluate")
61
+ def run_evaluate(
62
+ prediction_path: str = typer.Option(
63
+ "",
64
+ "--prediction-path",
65
+ "-p",
66
+ help="Path to the directory containing predicted modes.",
67
+ ),
68
+ sample_ids_file: str = typer.Option(
69
+ "eval_list.txt",
70
+ "--list-path",
71
+ "-l",
72
+ help="Path to a text file containing sample IDs (one per line).",
73
+ ),
74
+ ground_truth_path: str = typer.Option(
75
+ "ground_truth",
76
+ "--ground-truth-path",
77
+ "-g",
78
+ help="Path to the directory containing ground truth .pt files.",
79
+ ),
80
+ output_path: str = typer.Option(
81
+ "evaluation",
82
+ "--output-path",
83
+ "-o",
84
+ help="Path to the directory where evaluation results will be saved.",
85
+ ),
86
+ num_modes_pred: int = typer.Option(
87
+ 4,
88
+ "--num-modes-pred",
89
+ "-np",
90
+ help="Number of predicted modes.",
91
+ ),
92
+ num_modes_gt: int = typer.Option(
93
+ 4,
94
+ "--num-modes-gt",
95
+ "-ng",
96
+ help="Number of ground truth modes.",
97
+ ),
98
+ device: str = typer.Option(
99
+ "cuda",
100
+ "--device",
101
+ "-d",
102
+ help="Device to use for evaluation, e.g., 'cuda' or 'cpu'.",
103
+ ),
104
+ ):
105
+ """
106
+ Evaluate predicted modes against ground truth data stored in .pt files.
107
+ """
108
+ from petimot.eval.eval import evaluate
109
+
110
+ sample_ids = None
111
+ if sample_ids_file:
112
+ try:
113
+ with open(sample_ids_file, "r") as f:
114
+ sample_ids = [line.strip() for line in f if line.strip()]
115
+ print(f"Loaded {len(sample_ids)} sample IDs from {sample_ids_file}")
116
+ except Exception as e:
117
+ raise typer.BadParameter(f"Error reading sample IDs file: {e}")
118
+
119
+ evaluate(
120
+ prediction_path=prediction_path,
121
+ ground_truth_path=ground_truth_path,
122
+ output_path=output_path,
123
+ sample_ids=sample_ids,
124
+ num_modes_pred=num_modes_pred,
125
+ num_modes_gt=num_modes_gt,
126
+ device=device,
127
+ )
128
+
129
+
130
+ @app.command(name="infer_and_evaluate")
131
+ def run_infer_and_evaluate(
132
+ model_path: str = typer.Option(
133
+ "",
134
+ "--model-path",
135
+ "-m",
136
+ help="Path to model checkpoint",
137
+ ),
138
+ config_file: str = typer.Option(
139
+ "configs/default.yaml",
140
+ "--config-file",
141
+ "-c",
142
+ help="Path to configuration file",
143
+ ),
144
+ list_path: str = typer.Option(
145
+ "eval_list.txt",
146
+ "--list-path",
147
+ "-l",
148
+ help="Path to file containing input files or sample indices",
149
+ ),
150
+ ground_truth_path: str = typer.Option(
151
+ "ground_truth",
152
+ "--ground-truth-path",
153
+ "-g",
154
+ help="Path to the directory containing ground truth .pt files.",
155
+ ),
156
+ prediction_path: str = typer.Option(
157
+ "predictions",
158
+ "--prediction-path",
159
+ "-p",
160
+ help="Directory path for saving predictions",
161
+ ),
162
+ evaluation_path: str = typer.Option(
163
+ "evaluation",
164
+ "--evaluation-path",
165
+ "-e",
166
+ help="Directory path for saving evaluation results",
167
+ ),
168
+ num_modes_pred: int = typer.Option(
169
+ 4,
170
+ "--num-modes-pred",
171
+ "-np",
172
+ help="Number of predicted modes.",
173
+ ),
174
+ num_modes_gt: int = typer.Option(
175
+ 4,
176
+ "--num-modes-gt",
177
+ "-ng",
178
+ help="Number of ground truth modes.",
179
+ ),
180
+ device: str = typer.Option(
181
+ "cuda",
182
+ "--device",
183
+ "-d",
184
+ help="Device to use for inference and evaluation.",
185
+ ),
186
+ ):
187
+ """
188
+ Run inference using a pretrained model and immediately evaluate the results.
189
+ This combined command streamlines the workflow by performing both inference
190
+ and evaluation in one step, while keeping predictions and evaluation results
191
+ in separate directories.
192
+ """
193
+ from petimot.infer.infer import infer
194
+ from petimot.eval.eval import evaluate
195
+ import os
196
+
197
+ with open(list_path, "r") as f:
198
+ input_list = [line.strip() for line in f if line.strip()]
199
+
200
+ print(f"Starting inference phase...")
201
+ infer(
202
+ model_path=model_path,
203
+ config_file=config_file,
204
+ input_list=input_list,
205
+ output_path=prediction_path,
206
+ )
207
+
208
+ model_name = os.path.splitext(os.path.basename(model_path))[0]
209
+ model_prediction_path = os.path.join(prediction_path, model_name)
210
+
211
+ print(f"\nStarting evaluation phase...")
212
+
213
+ results = evaluate(
214
+ prediction_path=model_prediction_path,
215
+ ground_truth_path=ground_truth_path,
216
+ output_path=evaluation_path,
217
+ sample_ids=input_list,
218
+ num_modes_pred=num_modes_pred,
219
+ num_modes_gt=num_modes_gt,
220
+ device=device,
221
+ )
222
+
223
+ print("\nInference and evaluation complete!")
224
+ print(f"Predictions saved to: {model_prediction_path}")
225
+ print(f"Evaluation results saved to: {evaluation_path}")
226
+
227
+ return results
228
+
229
+
230
+ if __name__ == "__main__":
231
+ app()
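Reviewer note on the input handling above: `run_infer` treats `--list-path` either as a direct `.pdb`/`.pt` file or as a text file of one entry per line. A minimal standalone sketch of that resolution logic (the function name `resolve_inputs` and the injected `read_lines` callable are ours, added for testability; the real command opens the file directly):

```python
def resolve_inputs(list_path, read_lines):
    """Mirror run_infer's input handling: direct structure file vs. list file."""
    if list_path.endswith((".pdb", ".pt")):
        # A direct path to a single structure is wrapped in a one-element list.
        return [list_path]
    # Otherwise list_path is a text file with one entry per line;
    # blank lines are skipped, surrounding whitespace is stripped.
    return [line.strip() for line in read_lines(list_path) if line.strip()]

print(resolve_inputs("structure.pdb", lambda p: []))  # → ['structure.pdb']
```

Passing an empty list file would yield an empty result, which is why the command raises `typer.BadParameter("No input files specified")` afterwards.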
petimot/config/__init__.py ADDED
File without changes
petimot/config/config.py ADDED
@@ -0,0 +1,107 @@
+ from dataclasses import dataclass, field
+ import torch
+ import yaml
+
+
+ @dataclass
+ class ModelConfig:
+     # Architecture parameters
+     emb_dim: int = 256
+     edge_dim: int = 329
+     num_modes_pred: int = 4
+     num_layers: int = 15
+     shared_layers: bool = False
+     mlp_num_layers: int = 1
+     start_with_zero_v: bool = False
+
+     # Regularization
+     input_embedding_dropout: float = 0.8
+     dropout: float = 0.4
+     normalize_between_layers: bool = False
+     center_between_layers: bool = False
+     orthogonalize_between_layers: bool = False
+
+     # Geometric features
+     num_basis: int = 20
+     num_backbone_atoms: int = 4
+     max_dist: float = 20.0
+     sigma: float = 1.0
+
+     # Ablation
+     ablate_structure: bool = False
+
+     def __post_init__(self):
+         expected_edge_dim = 9 + (self.num_backbone_atoms**2 * self.num_basis)
+         if self.edge_dim != expected_edge_dim:
+             raise ValueError(
+                 f"edge_dim must be 9 + (num_backbone_atoms² * num_basis) = "
+                 f"9 + ({self.num_backbone_atoms}² * {self.num_basis}) = {expected_edge_dim}. "
+                 f"Got {self.edge_dim}"
+             )
+
+
+ @dataclass
+ class DataConfig:
+     training_split_path: str = "full_train_list.txt"
+     validation_split_path: str = "val_list.txt"
+     ground_truth_dir: str = "ground_truth"
+     embedding_dir: str = "embeddings"
+     emb_model: str = "ProstT5"
+     noise: float = 0.0
+     k_nearest: int = 5
+     l_random: int = 10
+     num_modes_gt: int = 4
+     rand_emb: bool = False
+     change_connectivity: bool = True
+
+
+ @dataclass
+ class TrainingConfig:
+     seed: int = 7
+     nsse_weight: float = 0.5  # LS loss
+     rmsip_weight: float = 0.5  # SS loss
+     ortho_weight: float = 0.0  # IS loss
+     batch_size: int = 32
+     validation_batch_size: int = 32
+     nb_epochs: int = 500
+     device: str = "cuda" if torch.cuda.is_available() else "cpu"
+     optimizer: str = "adamw"
+     learning_rate: float = 5e-4
+     weight_decay: float = 0.01
+     grad_clip: float = 10
+     use_amp: bool = True
+     scheduler_factor: float = 0.2
+     scheduler_patience: int = 10
+     early_stop_patience: int = 50
+     loss_threshold: float = 0.6
+     num_workers: int = 6
+     num_workers_val: int = 2
+     pin_memory: bool = False
+     weights_dir: str = "weights"
+     debug: bool = False
+
+
+ @dataclass
+ class Config:
+     model: ModelConfig = field(default_factory=ModelConfig)
+     data: DataConfig = field(default_factory=DataConfig)
+     training: TrainingConfig = field(default_factory=TrainingConfig)
+
+     @classmethod
+     def from_file(cls, path: str) -> "Config":
+         with open(path, "r") as f:
+             config_dict = yaml.safe_load(f) or {}
+         return cls(
+             model=ModelConfig(**config_dict.get("model", {})),
+             data=DataConfig(**config_dict.get("data", {})),
+             training=TrainingConfig(**config_dict.get("training", {})),
+         )
+
+     def save(self, path: str) -> None:
+         config_dict = {
+             "model": self.model.__dict__,
+             "data": self.data.__dict__,
+             "training": self.training.__dict__,
+         }
+         with open(path, "w") as f:
+             yaml.dump(config_dict, f, sort_keys=False)
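Reviewer note on `ModelConfig.__post_init__` above: the check ties `edge_dim` to the geometric feature sizes with the formula `9 + num_backbone_atoms² * num_basis`, so the defaults (4 backbone atoms, 20 basis functions) force `edge_dim = 329`. A quick standalone check of the arithmetic (the helper name is ours; the interpretation of the constant 9 as the fixed per-edge scalar features is our reading of the formula, not stated in the code):

```python
def expected_edge_dim(num_backbone_atoms, num_basis):
    # Same formula as ModelConfig.__post_init__: a fixed block of 9 edge
    # features plus num_basis values per ordered pair of backbone atoms.
    return 9 + (num_backbone_atoms**2 * num_basis)

print(expected_edge_dim(4, 20))  # → 329, the default edge_dim
```

Changing `num_backbone_atoms` or `num_basis` in a YAML config therefore requires updating `edge_dim` to match, or construction fails with the `ValueError` above.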
petimot/data/__init__.py ADDED
File without changes
petimot/data/data_set.py ADDED
@@ -0,0 +1,282 @@
+ import torch
+ from torch_geometric.data import Data, Dataset
+ from abc import ABC, abstractmethod
+ from typing import Tuple, List, Union
+ from pathlib import Path
+ import logging
+ from .embeddings import EmbeddingManager
+ from .pdb_utils import load_backbone_coordinates
+
+ logger = logging.getLogger(__name__)
+ logging.basicConfig(level=logging.INFO)
+
+
+ def generate_edges_no_self(num_nodes: int) -> torch.Tensor:
+     i, j = torch.meshgrid(
+         torch.arange(num_nodes), torch.arange(num_nodes), indexing="ij"
+     )
+     mask = i != j
+     # Returns a single (2, E) tensor; callers unpack its two rows.
+     return torch.vstack((i[mask], j[mask]))
+
+
+ def build_knn_edges(
+     dist_matrix: torch.Tensor, k: int
+ ) -> tuple[torch.Tensor, torch.Tensor, torch.Tensor]:
+     num_nodes = dist_matrix.size(0)
+     _, nearest_neighbors = torch.topk(dist_matrix, k=k + 1, largest=False)
+     nearest_neighbors = nearest_neighbors[:, 1:]
+     k = nearest_neighbors.size(1)
+     row_indices = (
+         torch.arange(num_nodes, device=dist_matrix.device)[:, None]
+         .expand(-1, k)
+         .flatten()
+     )
+     col_indices = nearest_neighbors.flatten()
+     mask = torch.ones(
+         (num_nodes, num_nodes), dtype=torch.bool, device=dist_matrix.device
+     )
+     mask[torch.arange(num_nodes), torch.arange(num_nodes)] = False
+     mask.scatter_(1, nearest_neighbors, False)
+
+     return row_indices, col_indices, mask
+
+
+ def build_random_edges(
+     num_nodes: int, mask: torch.Tensor, l: int
+ ) -> tuple[torch.Tensor, torch.Tensor]:
+     rand_probs = torch.rand((num_nodes, num_nodes), device=mask.device).masked_fill(
+         ~mask, -float("inf")
+     )
+     _, random_indices = torch.topk(rand_probs, l, dim=1)
+     row_indices = (
+         torch.arange(num_nodes, device=mask.device)[:, None].expand(-1, l).flatten()
+     )
+     col_indices = random_indices.flatten()
+
+     return row_indices, col_indices
+
+
+ def build_connectivity(
+     seq_len: int,
+     dist_matrix: torch.Tensor,
+     k_nearest: int,
+     l_random: int,
+     num_layers: int,
+     change_connectivity: bool,
+ ) -> tuple[torch.Tensor, torch.Tensor]:
+     if k_nearest + l_random >= seq_len - 1:
+         row_index, col_index_fixed = generate_edges_no_self(seq_len)
+         if change_connectivity:
+             col_index = col_index_fixed.unsqueeze(0).repeat(num_layers, 1)
+         else:
+             col_index = col_index_fixed
+         return row_index, col_index
+
+     if k_nearest > 0:
+         row_index_knn, col_index_knn, mask = build_knn_edges(dist_matrix, k_nearest)
+     else:
+         mask = torch.ones((seq_len, seq_len), dtype=torch.bool)
+         mask.fill_diagonal_(False)
+         row_index_knn = torch.empty(0, dtype=torch.long)
+         col_index_knn = torch.empty(0, dtype=torch.long)
+
+     if not change_connectivity:
+         if l_random > 0:
+             row_index_rand, col_index_rand = build_random_edges(seq_len, mask, l_random)
+         else:
+             row_index_rand = torch.empty(0, dtype=torch.long)
+             col_index_rand = torch.empty(0, dtype=torch.long)
+         row_index_total = torch.cat([row_index_knn, row_index_rand], dim=0)
+         col_index_total = torch.cat([col_index_knn, col_index_rand], dim=0)
+         return row_index_total, col_index_total
+     else:
+         fixed_row = row_index_knn
+         fixed_col = col_index_knn
+         col_indices_layers = []
+         for _ in range(num_layers):
+             if l_random > 0:
+                 row_index_rand_layer, col_index_rand_layer = build_random_edges(
+                     seq_len, mask, l_random
+                 )
+             else:
+                 row_index_rand_layer = torch.empty(0, dtype=torch.long)
+                 col_index_rand_layer = torch.empty(0, dtype=torch.long)
+             combined_col = torch.cat([fixed_col, col_index_rand_layer], dim=0)
+             col_indices_layers.append(combined_col)
+
+         if l_random > 0:
+             row_index_rand = row_index_rand_layer
+         else:
+             row_index_rand = torch.empty(0, dtype=torch.long)
+         total_row_index = torch.cat([fixed_row, row_index_rand], dim=0)
+         col_index = torch.stack(col_indices_layers, dim=0)
+         return total_row_index, col_index
+
+
+ class BaseDataset(Dataset, ABC):
+     def __init__(
+         self,
+         ground_truth_dir: Union[str, Path],
+         embedding_dir: Union[str, Path],
+         emb_model: str,
+         device: str,
+         k_nearest: int,
+         l_random: int,
+         num_layers: int,
+         change_connectivity: bool,
+     ):
+         super().__init__()
+         self.ground_truth_dir = Path(ground_truth_dir)
+         self.embedding_dir = Path(embedding_dir)
+         self.emb_model = emb_model.lower()
+         self.device = device
+         self.k_nearest = k_nearest
+         self.l_random = l_random
+         self.num_layers = num_layers
+         self.change_connectivity = change_connectivity
+
+     def _build_connectivity(
+         self, ca_coords: torch.Tensor
+     ) -> Tuple[torch.Tensor, torch.Tensor]:
+         dist_matrix = torch.cdist(ca_coords, ca_coords)
+         return build_connectivity(
+             seq_len=ca_coords.size(0),
+             dist_matrix=dist_matrix,
+             k_nearest=self.k_nearest,
+             l_random=self.l_random,
+             num_layers=self.num_layers,
+             change_connectivity=self.change_connectivity,
+         )
+
+     def _load_embedding(self, name: str, seq_len: int) -> torch.Tensor:
+         emb_path = self.embedding_dir / f"{name}_{self.emb_model}.pt"
+         if not emb_path.exists():
+             raise FileNotFoundError(f"Embedding {emb_path} not found")
+
+         emb = torch.load(emb_path, map_location="cpu", weights_only=True)
+         if emb.size(0) == seq_len + 2:  # Handle BOS/EOS tokens
+             emb = emb[1:-1]
+         elif emb.size(0) != seq_len:
+             raise ValueError(f"Embedding size mismatch: {emb.size(0)} vs {seq_len}")
+
+         return emb
+
+     def _compute_embeddings(self, names: List[str], sequences: List[str]) -> None:
+         """Shared logic for computing missing embeddings."""
+         if not names:
+             return
+
+         logger.info(f"Computing embeddings for {len(names)} samples")
+         EmbeddingManager(
+             embedding_dir=self.embedding_dir,
+             emb_model=self.emb_model,
+             device=self.device,
+         ).get_or_compute_embeddings(names, sequences)
+
+     @abstractmethod
+     def __getitem__(self, idx: int) -> Data:
+         pass
+
+     @abstractmethod
+     def __len__(self) -> int:
+         pass
+
+
+ class InferenceDataset(BaseDataset):
+     def __init__(
+         self,
+         entries: List[str],
+         ground_truth_dir: Union[str, Path],
+         embedding_dir: Union[str, Path],
+         emb_model: str,
+         device: str,
+         k_nearest: int,
+         l_random: int,
+         num_layers: int,
+         change_connectivity: bool,
+     ):
+         super().__init__(
+             ground_truth_dir=ground_truth_dir,
+             embedding_dir=embedding_dir,
+             emb_model=emb_model,
+             device=device,
+             k_nearest=k_nearest,
+             l_random=l_random,
+             num_layers=num_layers,
+             change_connectivity=change_connectivity,
+         )
+         self.file_paths = self._resolve_entries(entries)
+         self._compute_missing_embeddings()
+
+     def _resolve_entries(self, entries: List[str]) -> List[Path]:
+         resolved = []
+         for entry in entries:
+             path = Path(entry)
+             if path.exists():
+                 resolved.append(path.resolve())
+                 continue
+
+             for ext in [".pt"]:
+                 candidate = self.ground_truth_dir / f"{entry}{ext}"
+                 if candidate.exists():
+                     resolved.append(candidate)
+                     break
+             else:
+                 logger.warning(f"Couldn't resolve entry: {entry}")
+
+         if not resolved:
+             raise ValueError("No valid input files found")
+         return resolved
+
+     def _compute_missing_embeddings(self) -> None:
+         valid_names, sequences = [], []
+         for path in self.file_paths:
+             emb_path = self.embedding_dir / f"{path.stem}_{self.emb_model}.pt"
+             if emb_path.exists():
+                 continue
+
+             try:
+                 sequences.append(self._extract_sequence(path))
+                 valid_names.append(path.stem)
+             except Exception as e:
+                 logger.error(f"Failed processing {path.name}: {e}")
+
+         self._compute_embeddings(valid_names, sequences)
+
+     def _extract_sequence(self, path: Path) -> str:
+         if path.suffix == ".pt":
+             return torch.load(path, map_location="cpu", weights_only=True).get(
+                 "seq", ""
+             )
+         elif path.suffix == ".pdb":
+             seq = load_backbone_coordinates(str(path), allow_hetatm=True)["seq"]
+             return seq
+         raise ValueError(f"Unsupported file type: {path.suffix}")
+
+     def __len__(self) -> int:
+         return len(self.file_paths)
+
+     def __getitem__(self, idx: int) -> Data:
+         path = self.file_paths[idx]
+
+         if path.suffix == ".pt":
+             data = torch.load(path, map_location="cpu", weights_only=True)
+             bb, coverage = data["bb"], data.get("coverage", torch.ones(len(data["bb"])))
+         else:
+             bb = load_backbone_coordinates(str(path), allow_hetatm=True)["bb"]
+             coverage = torch.ones(len(bb))
+
+         row_idx, col_idx = self._build_connectivity(bb[:, 1])
+
+         return Data(
+             x=self._load_embedding(path.stem, bb.size(0)),
+             row_index=row_idx,
+             col_index=col_idx,
+             bb=bb,
+             coverage=coverage,
+             name=path.stem,
+             path=str(path),
+         )
petimot/data/embeddings.py ADDED
@@ -0,0 +1,84 @@
+ from pathlib import Path
+ import torch
+ import re
+ from transformers import T5Tokenizer, T5EncoderModel
+ from typing import List, Union
+
+
+ class EmbeddingManager:
+     def __init__(
+         self,
+         embedding_dir: Path,
+         emb_model: str,
+         device: Union[str, torch.device] = "cuda",
+     ):
+         emb_model = emb_model.lower()
+         valid_models = ["prostt5", "esmc_300m", "esmc_600m"]
+         if emb_model not in valid_models:
+             raise ValueError(
+                 f"Unsupported model: {emb_model}. Choose from {valid_models}."
+             )
+
+         self.embedding_dir = Path(embedding_dir)
+         self.embedding_dir.mkdir(exist_ok=True)
+         self.emb_model = emb_model
+
+         if isinstance(device, str):
+             self.device = torch.device(device)
+         else:
+             self.device = device
+
+         if emb_model == "prostt5":
+             self.tokenizer = T5Tokenizer.from_pretrained(
+                 "Rostlab/ProstT5", do_lower_case=False, legacy=True
+             )
+             self.model = T5EncoderModel.from_pretrained("Rostlab/ProstT5").to(
+                 self.device
+             )
+
+             if self.device.type == "cuda":
+                 self.model = self.model.half()
+             else:
+                 self.model = self.model.float()
+
+         elif emb_model.startswith("esmc"):
+             from esm.models.esmc import ESMC
+             from esm.sdk.api import ESMProtein, LogitsConfig
+
+             self.model = ESMC.from_pretrained(emb_model).to(self.device)
+             self.model = self.model.float()
+             self.ESMProtein = ESMProtein
+             self.LogitsConfig = LogitsConfig
+             self.tokenizer = None
+
+         self.model.eval()
+
+     def get_or_compute_embeddings(self, names: List[str], sequences: List[str]) -> None:
+         for name, seq in zip(names, sequences):
+             emb_path = self.embedding_dir / f"{name}_{self.emb_model}.pt"
+             if not emb_path.exists():
+                 emb = self._compute_embedding(seq)
+                 torch.save(emb.cpu(), emb_path)
+
+     @torch.no_grad()
+     def _compute_embedding(self, sequence: str) -> torch.Tensor:
+         sequence = re.sub(r"[UZOB]", "X", sequence)
+
+         if self.emb_model == "prostt5":
+             sequence = "<AA2fold> " + " ".join(list(sequence))
+             ids = self.tokenizer.encode_plus(
+                 sequence, add_special_tokens=True, return_tensors="pt"
+             ).to(self.device)
+             outputs = self.model(ids.input_ids, attention_mask=ids.attention_mask)
+             embeddings = outputs.last_hidden_state.squeeze(0)[1:-1].float()
+
+         elif self.emb_model.startswith("esmc"):
+             protein = self.ESMProtein(sequence=sequence)
+             protein_tensor = self.model.encode(protein)
+             logits_output = self.model.logits(
+                 protein_tensor, self.LogitsConfig(sequence=True, return_embeddings=True)
+             )
+             embeddings = logits_output.embeddings.squeeze(0)
+
+         return embeddings
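Reviewer note on `_compute_embedding` above: before tokenization, rare or ambiguous residues (U, Z, O, B) are mapped to X, and for ProstT5 the residues are space-separated behind the `<AA2fold>` direction tag. A standalone sketch of just that preprocessing step (the helper name `prostt5_input` is ours):

```python
import re

def prostt5_input(sequence):
    # Map rare/ambiguous residues to X, as in _compute_embedding.
    sequence = re.sub(r"[UZOB]", "X", sequence)
    # ProstT5 expects space-separated residues behind a direction tag.
    return "<AA2fold> " + " ".join(sequence)

print(prostt5_input("MUSE"))  # → "<AA2fold> M X S E"
```

The `[1:-1]` slice on the model output then drops the tokens corresponding to the tag and the end-of-sequence marker, leaving one embedding per residue.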
petimot/data/pdb_utils.py ADDED
@@ -0,0 +1,157 @@
+ from typing import Dict
+ import torch
+ from pathlib import Path
+ import os
+
+
+ def load_backbone_coordinates(
+     pdb_path: str,
+     allow_hetatm: bool = False,
+ ) -> Dict:
+     THREE_TO_ONE = {
+         "ALA": "A",
+         "ARG": "R",
+         "ASN": "N",
+         "ASP": "D",
+         "CYS": "C",
+         "GLN": "Q",
+         "GLU": "E",
+         "GLY": "G",
+         "HIS": "H",
+         "ILE": "I",
+         "LEU": "L",
+         "LYS": "K",
+         "MET": "M",
+         "PHE": "F",
+         "PRO": "P",
+         "SER": "S",
+         "THR": "T",
+         "TRP": "W",
+         "TYR": "Y",
+         "VAL": "V",
+         "SEC": "U",
+         "PYL": "O",
+         "UNK": "X",
+         # Non-standard and modified amino acids
+         "ABA": "A",
+         "ALY": "K",
+         "BFD": "D",
+         "CAF": "C",
+         "CAS": "C",
+         "CGU": "E",
+         "CME": "C",
+         "CSD": "C",
+         "CSO": "C",
+         "CSS": "C",
+         "CSX": "C",
+         "CXM": "M",
+         "DAL": "A",
+         "DCY": "C",
+         "DHA": "S",
+         "DLE": "L",
+         "DSN": "S",
+         "FME": "M",
+         "HIC": "H",
+         "HYP": "P",
+         "IAS": "D",
+         "KCX": "K",
+         "LLP": "K",
+         "M3L": "K",
+         "MDO": "A",
+         "MEN": "N",
+         "MEQ": "Q",
+         "MHO": "M",
+         "MLE": "L",
+         "MLY": "K",
+         "MLZ": "K",
+         "MSE": "M",
+         "MVA": "V",
+         "NEP": "H",
+         "NLE": "L",
+         "OCS": "C",
+         "PCA": "E",
+         "PHD": "D",
+         "PTR": "Y",
+         "SAR": "G",
+         "SCH": "C",
+         "SCY": "C",
+         "SEP": "S",
+         "SMC": "C",
+         "SME": "M",
+         "SNC": "C",
+         "TPO": "T",
+         "TYS": "Y",
+         "YCM": "C",
+     }
+
+     residues, types, numbers = [], [], []
+     current = {"coords": [], "type": None, "num": None}
+
+     try:
+         with open(pdb_path, "r") as f:
+             for line in f:
+                 if not (
+                     line.startswith("ATOM")
+                     or (allow_hetatm and line.startswith("HETATM"))
+                 ):
+                     continue
+
+                 atom = line[12:16].strip()
+                 res_type = line[17:20].strip()
+
+                 if res_type == "HOH" or atom not in ["N", "CA", "C", "O"]:
+                     continue
+
+                 coords = torch.tensor(
+                     [float(line[30:38]), float(line[38:46]), float(line[46:54])]
+                 )
+
+                 current["coords"].append(coords)
+
+                 if not current["type"]:
+                     current["type"] = res_type
+                     current["num"] = int(line[22:26])
+
+                 if len(current["coords"]) == 4:
+                     residues.append(torch.stack(current["coords"]))
+                     types.append(current["type"])
+                     numbers.append(current["num"])
+                     current = {"coords": [], "type": None, "num": None}
+
+     except FileNotFoundError:
+         raise FileNotFoundError(f"PDB file not found: {pdb_path}")
+     except Exception as e:
+         raise ValueError(f"Error parsing PDB file: {e}")
+
+     if not residues:
+         raise ValueError("No valid backbone atoms found")
+
+     backbone = torch.stack(residues)
+
+     output_path = f"extracted_{Path(pdb_path).name}"
+     if not os.path.exists(output_path):
+         with open(output_path, "w") as f:
+             atom_num = 1
+             res_num = 1
+             for res_idx, residue in enumerate(residues):
+                 res_type = types[res_idx]
+                 for atom_idx, (atom_name, coords) in enumerate(
+                     zip(["N", "CA", "C", "O"], residue)
+                 ):
+                     # Standard PDB format
+                     f.write(
+                         f"ATOM  {atom_num:5d}  {atom_name:<3s} {res_type:3s} A{res_num:4d}    "
+                         f"{coords[0]:8.3f}{coords[1]:8.3f}{coords[2]:8.3f}"
+                         f"  1.00  0.00          {atom_name[0]:>2s}\n"
+                     )
+                     atom_num += 1
+                 res_num += 1
+             f.write("END\n")
+
+     outputs = {"bb": backbone}
+     seq = "".join(THREE_TO_ONE.get(t, "X") for t in types)
+     outputs["seq"] = seq
+     outputs["residue_types"] = types
+     outputs["residue_numbers"] = numbers
+     return outputs
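Reviewer note on the sequence extraction above: the parser translates three-letter residue names (including modified residues such as MSE for selenomethionine) to one-letter codes, falling back to "X" for anything unrecognized. A minimal sketch of that mapping logic with a trimmed subset of the table (the full `THREE_TO_ONE` dictionary is in the file above; the helper name `to_one_letter` is ours):

```python
# Trimmed subset of the THREE_TO_ONE table; unknown residues map to "X".
THREE_TO_ONE = {"ALA": "A", "MET": "M", "MSE": "M", "SEP": "S", "UNK": "X"}

def to_one_letter(residue_types):
    # Same fallback behavior as in load_backbone_coordinates.
    return "".join(THREE_TO_ONE.get(t, "X") for t in residue_types)

print(to_one_letter(["MSE", "ALA", "XYZ"]))  # → "MAX"
```

This fallback matters for the embedding step: downstream, `_compute_embedding` further replaces the rare codes U, Z, O, and B with X before tokenization.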
petimot/eval/eval.py ADDED
@@ -0,0 +1,286 @@
+ import torch
+ import numpy as np
+ import os
+ import json
+ from pathlib import Path
+ from typing import Dict, List, Optional, Tuple
+ from tqdm import tqdm
+ from petimot.model.loss import (
+     compute_NSSE_matrix,
+     compute_rmsip_sq,
+     select_minimum_indices,
+ )
+
+
+ def compute_magnitude_error_matrix(
+     eigvects: torch.Tensor, coverage: torch.Tensor, modes_pred: torch.Tensor
+ ) -> torch.Tensor:
+     N, nmode_gt, _ = eigvects.shape
+
+     modes_pred = modes_pred - modes_pred.mean(dim=0, keepdim=True)
+     eigvects = eigvects - eigvects.mean(dim=0, keepdim=True)
+
+     gt_magnitudes = torch.norm(eigvects, dim=2)[:, :, None]  # Shape: (N, nmode_gt, 1)
+     pred_magnitudes = torch.norm(modes_pred, dim=2)[
+         :, None, :
+     ]  # Shape: (N, 1, nmode_pred)
+
+     coverage = coverage[:, None, None]
+     sqrt_cov = torch.sqrt(coverage)  # Shape: (N, 1, 1)
+
+     numerator = torch.sum(sqrt_cov * gt_magnitudes * pred_magnitudes, dim=0)
+     denominator = torch.sum(coverage * pred_magnitudes.pow(2), dim=0) + 1e-8
+     c_optimal = numerator / denominator
+
+     pred_magnitudes = (
+         sqrt_cov * pred_magnitudes * c_optimal[None, :, :]
+     )  # Broadcasts to shape: (N, nmode_gt, nmode_pred)
+
+     sum_squared_error_matrix = (
+         torch.sum((gt_magnitudes - pred_magnitudes) ** 2, dim=0) / N
+     )
+
+     return sum_squared_error_matrix
+
+
+ def compute_optimal_assignment_metrics(
+     matrix: torch.Tensor, maximize: bool = False
+ ) -> float:
+     cost_matrix = -matrix if maximize else matrix
+
+     # Get optimal indices using the Hungarian algorithm
+     indices = select_minimum_indices(cost_matrix)
+     optimal_cost = matrix[indices[:, 0], indices[:, 1]].mean().item()
+
+     return optimal_cost
+
+
+ def load_ground_truth(
+     file_path: str, num_modes_gt: int, device: str
+ ) -> Optional[Tuple[torch.Tensor, torch.Tensor]]:
+     try:
+         data = torch.load(file_path, map_location=device)
+         eigvects = data["eigvects"]
+         seq_length = int(len(eigvects) / 3)
+
+         eigvects = eigvects[:, :num_modes_gt]
+         eigvects *= seq_length**0.5
+         eigvects = eigvects.reshape(-1, 3, num_modes_gt).permute(0, 2, 1)
+
+         coverage = data.get("coverage", torch.ones(eigvects.shape[0], device=device))
+         if not isinstance(coverage, torch.Tensor):
+             coverage = torch.tensor(coverage, device=device)
+
+         return eigvects, coverage
+
+     except Exception as e:
+         print(f"Error loading ground truth: {e}")
+         return None
+
+
+ def load_predictions(
+     base_path: str, base_name: str, num_modes: int, device: str
+ ) -> Optional[torch.Tensor]:
+     try:
+         modes = []
+         for k in range(num_modes):
+             pred_file = os.path.join(base_path, f"{base_name}_mode_{k}.txt")
+             modes.append(np.loadtxt(pred_file))
+         return torch.tensor(np.stack(modes, axis=1), device=device, dtype=torch.float32)
+     except Exception as e:
+         print(f"Error loading predictions for {base_name}: {e}")
+         return None
+
+
+ def save_matrix(
+     output_path: str,
+     base_name: str,
+     matrix: torch.Tensor,
+     num_modes_gt: int,
+     metric_name: str,
+ ):
+     matrix_path = os.path.join(output_path, f"{base_name}_{metric_name}_matrix.csv")
+     with open(matrix_path, "w") as f:
+         f.write("mode_name," + ",".join(f"gt_{j}" for j in range(num_modes_gt)) + "\n")
+
+         matrix_cpu = matrix.cpu()
+         for i in range(len(matrix_cpu)):
+             row = f"pred_{i}," + ",".join(f"{val:.6f}" for val in matrix_cpu[i])
+             f.write(f"{row}\n")
+
+
+ def save_sample_metrics(output_path: str, base_name: str, metrics: Dict) -> None:
+     metrics_path = os.path.join(output_path, f"{base_name}_metrics.json")
+     with open(metrics_path, "w") as f:
+         json.dump(metrics, f, indent=2)
+
+
+ def evaluate(
+     prediction_path: str,
+     ground_truth_path: str,
+     output_path: str,
+     sample_ids: List[str],
+     num_modes_pred: int = 4,
+     num_modes_gt: int = 4,
+     device: str = "cuda",
+     success_threshold: float = 0.6,
+ ) -> Dict:
+     if device == "cuda" and not torch.cuda.is_available():
+         device = "cpu"
+
+     prediction_subdir = os.path.basename(prediction_path.rstrip("/"))
+     output_path = os.path.join(output_path, prediction_subdir)
+     os.makedirs(output_path, exist_ok=True)
+
+     if not sample_ids:
+         all_files = os.listdir(prediction_path)
+         mode0_files = [f for f in all_files if f.endswith("_mode_0.txt")]
+         if not mode0_files:
+             raise ValueError(
+                 f"No files ending with '_mode_0.txt' found in {prediction_path}"
+             )
+         sample_ids = [Path(f).stem.rsplit("_mode", 1)[0] for f in mode0_files]
+
+     sample_ids = [
+         Path(sample_id).stem.rsplit("_mode", 1)[0] for sample_id in sample_ids
+     ]
+
+     min_losses = []
+     min_magnitude_errors = []
+     rmsip_sq_scores = []
+     optimal_losses = []
+     optimal_magnitudes = []
+     stats = {"total": 0, "success": 0}
+
+     missing_files = []
+     for sample_id in sample_ids:
+         mode0_file = os.path.join(prediction_path, f"{sample_id}_mode_0.txt")
+         gt_file = os.path.join(ground_truth_path, f"{sample_id}.pt")
+
+         if not os.path.exists(mode0_file):
+             missing_files.append(f"Missing prediction file: {mode0_file}")
+         if not os.path.exists(gt_file):
+             missing_files.append(f"Missing ground truth file: {gt_file}")
+
+     if missing_files:
+         print("Warning: Some files are missing:")
+         for msg in missing_files:
+             print(f"  {msg}")
+         print("Proceeding with available files...")
+
+     for base_name in tqdm(sample_ids, desc="Evaluating samples"):
+         gt_data = load_ground_truth(
+             os.path.join(ground_truth_path, f"{base_name}.pt"), num_modes_gt, device
+         )
+         if gt_data is None:
+             continue
+
+         modes_pred = load_predictions(
+             prediction_path, base_name, num_modes_pred, device
+         )
+         if modes_pred is None:
+             continue
+
+         eigvects, coverage = gt_data
+         loss_matrix = compute_NSSE_matrix(eigvects, coverage, modes_pred).T
+         magnitude_error_matrix = compute_magnitude_error_matrix(
+             eigvects, coverage, modes_pred
+         ).T
+
+         rmsip_sq = compute_rmsip_sq(eigvects, coverage, modes_pred).item()
+
+         optimal_loss = compute_optimal_assignment_metrics(loss_matrix, maximize=False)
+         optimal_magnitude = compute_optimal_assignment_metrics(
+             magnitude_error_matrix, maximize=False
+         )
+
+         stats["total"] += 1
+         min_loss = torch.min(loss_matrix).item()
+         min_magnitude_error = torch.min(magnitude_error_matrix).item()
+
+         stats["success"] += int(min_loss < success_threshold)
+         min_losses.append(min_loss)
+         min_magnitude_errors.append(min_magnitude_error)
+         rmsip_sq_scores.append(rmsip_sq)
+
+         optimal_losses.append(optimal_loss)
+         optimal_magnitudes.append(optimal_magnitude)
216
+
217
+ sample_metrics = {
218
+ "nsse_metrics": {"min_loss": min_loss, "optimal_assignment": optimal_loss},
219
+ "magnitude_metrics": {
220
+ "min_error": min_magnitude_error,
221
+ "optimal_assignment": optimal_magnitude,
222
+ },
223
+ "rmsip_sq": rmsip_sq,
224
+ "success": min_loss < success_threshold,
225
+ }
226
+ save_sample_metrics(output_path, base_name, sample_metrics)
227
+
228
+ save_matrix(output_path, base_name, loss_matrix, num_modes_gt, "loss")
229
+ save_matrix(
230
+ output_path,
231
+ base_name,
232
+ magnitude_error_matrix,
233
+ num_modes_gt,
234
+ "magnitude_error",
235
+ )
236
+
237
+ results = {
238
+ "total_samples": stats["total"],
239
+ "success_rate": stats["success"] / stats["total"] if stats["total"] > 0 else 0,
240
+ "nsse_metrics": {
241
+ "mean_min_loss": float(np.mean(min_losses)) if min_losses else 0,
242
+ "std_min_loss": float(np.std(min_losses)) if min_losses else 0,
243
+ "optimal_assignment_mean": (
244
+ float(np.mean(optimal_losses)) if optimal_losses else 0
245
+ ),
246
+ "optimal_assignment_std": (
247
+ float(np.std(optimal_losses)) if optimal_losses else 0
248
+ ),
249
+ },
250
+ "magnitude_metrics": {
251
+ "mean_min_error": (
252
+ float(np.mean(min_magnitude_errors)) if min_magnitude_errors else 0
253
+ ),
254
+ "std_min_error": (
255
+ float(np.std(min_magnitude_errors)) if min_magnitude_errors else 0
256
+ ),
257
+ "optimal_assignment_mean": (
258
+ float(np.mean(optimal_magnitudes)) if optimal_magnitudes else 0
259
+ ),
260
+ "optimal_assignment_std": (
261
+ float(np.std(optimal_magnitudes)) if optimal_magnitudes else 0
262
+ ),
263
+ },
264
+ "rmsip_sq_metrics": {
265
+ "mean_rmsip_sq": float(np.mean(rmsip_sq_scores)) if rmsip_sq_scores else 0,
266
+ "std_rmsip_sq": float(np.std(rmsip_sq_scores)) if rmsip_sq_scores else 0,
267
+ },
268
+ }
269
+
270
+ with open(os.path.join(output_path, "evaluation_results.json"), "w") as f:
271
+ json.dump(results, f, indent=2)
272
+
273
+ print(f"\nEvaluation complete. Results saved to {output_path}")
274
+ print(f"Success rate: {results['success_rate']:.2%}")
275
+ print(f"NSSE metrics:")
276
+ print(
277
+ f" Min loss (mean ± std): {results['nsse_metrics']['mean_min_loss']:.4f} ± {results['nsse_metrics']['std_min_loss']:.4f}"
278
+ )
279
+ print(
280
+ f" Optimal assignment (mean ± std): {results['nsse_metrics']['optimal_assignment_mean']:.4f} ± {results['nsse_metrics']['optimal_assignment_std']:.4f}"
281
+ )
282
+ print(
283
+ f"Mean rmsip_sq: {results['rmsip_sq_metrics']['mean_rmsip_sq']:.4f} ± {results['rmsip_sq_metrics']['std_rmsip_sq']:.4f}"
284
+ )
285
+
286
+ return results
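`save_matrix` above writes each per-sample loss matrix as a small CSV with one `gt_j` column per ground-truth mode and one `pred_i` row per predicted mode, and `evaluate` takes the global minimum over all (pred, gt) pairs as `min_loss`. A minimal sketch of that layout and of recovering the minimum, using toy values and only the standard library:

```python
import csv
import io

# Toy 2x3 loss matrix in the same layout save_matrix writes.
matrix = [[0.41, 0.92, 0.75], [0.88, 0.33, 0.67]]
buf = io.StringIO()
buf.write("mode_name," + ",".join(f"gt_{j}" for j in range(3)) + "\n")
for i, row in enumerate(matrix):
    buf.write(f"pred_{i}," + ",".join(f"{v:.6f}" for v in row) + "\n")

# Read it back and take the minimum over all (pred, gt) pairs,
# mirroring how min_loss is computed from the full matrix.
buf.seek(0)
rows = list(csv.reader(buf))
values = [float(v) for row in rows[1:] for v in row[1:]]
print(min(values))  # 0.33
```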
petimot/infer/infer.py ADDED
@@ -0,0 +1,148 @@
import torch
import logging
import random
import numpy as np
from pathlib import Path
from typing import List
from tqdm import tqdm

from torch_geometric.loader import DataLoader

from petimot.model.neural_net import ProteinMotionMPNN
from petimot.config.config import Config
from petimot.data.data_set import InferenceDataset
from petimot.utils.seeding import set_seed

logger = logging.getLogger(__name__)
logging.basicConfig(level=logging.INFO)

EMBEDDING_DIM_MAP = {"prostt5": 1024, "esmc_300m": 960, "esmc_600m": 1152}


def setup_model(model_path: str, config: Config, device: str) -> ProteinMotionMPNN:

    model = ProteinMotionMPNN(
        input_dim=EMBEDDING_DIM_MAP[config.data.emb_model.lower()],
        emb_dim=config.model.emb_dim,
        edge_dim=config.model.edge_dim,
        num_modes_pred=config.model.num_modes_pred,
        num_layers=config.model.num_layers,
        shared_layers=config.model.shared_layers,
        mlp_num_layers=config.model.mlp_num_layers,
        input_embedding_dropout=0.0,
        dropout=0.0,
        num_basis=config.model.num_basis,
        max_dist=config.model.max_dist,
        sigma=config.model.sigma,
        change_connectivity=config.data.change_connectivity,
        normalize_between_layers=config.model.normalize_between_layers,
        center_between_layers=config.model.center_between_layers,
        orthogonalize_between_layers=config.model.orthogonalize_between_layers,
        start_with_zero_v=config.model.start_with_zero_v,
        ablate_structure=config.model.ablate_structure,
    ).to(device)

    try:
        state_dict = torch.load(model_path, map_location=device, weights_only=True)
        model.load_state_dict(state_dict)
        model.eval()
        logger.info(f"Successfully loaded model weights from {model_path}")
    except Exception as e:
        error_msg = f"Failed to load model weights from {model_path}: {e}"
        logger.error(error_msg)
        raise RuntimeError(error_msg)

    return model


def save_predictions(output_dir: Path, name: str, modes_pred: torch.Tensor) -> None:

    modes_pred = modes_pred - modes_pred.mean(dim=0, keepdim=True)
    modes_pred_np = modes_pred.cpu().numpy()
    num_modes = modes_pred_np.shape[1]

    for k in range(num_modes):
        out_path = output_dir / f"{name}_mode_{k}.txt"
        try:
            np.savetxt(out_path, modes_pred_np[:, k, :], fmt="%.6f")
            logger.debug(f"Saved mode {k} predictions to {out_path}")
        except Exception as e:
            logger.error(f"Failed to save prediction to {out_path}: {e}")


def infer(
    model_path: str,
    config_file: str,
    input_list: List[str],
    output_path: str,
    device: str = "cuda",
) -> None:

    if device == "cuda" and not torch.cuda.is_available():
        logger.warning("CUDA is not available. Falling back to CPU.")
        device = "cpu"

    output_dir = Path(output_path)
    output_dir.mkdir(parents=True, exist_ok=True)

    model_stem = Path(model_path).stem
    output_subdir = output_dir / model_stem
    output_subdir.mkdir(parents=True, exist_ok=True)

    config = Config.from_file(config_file)
    set_seed(config.training.seed, deterministic_algorithms=True)

    model = setup_model(model_path, config, device)

    dataset = InferenceDataset(
        entries=input_list,
        ground_truth_dir=config.data.ground_truth_dir,
        embedding_dir=config.data.embedding_dir,
        emb_model=config.data.emb_model,
        device=device,
        k_nearest=config.data.k_nearest,
        l_random=config.data.l_random,
        num_layers=config.model.num_layers,
        change_connectivity=config.data.change_connectivity,
    )

    def seed_worker(worker_id):
        worker_seed = torch.initial_seed() % 2**32
        np.random.seed(worker_seed)
        random.seed(worker_seed)

    g = torch.Generator()
    g.manual_seed(config.training.seed)

    dataloader = DataLoader(
        dataset,
        batch_size=config.training.validation_batch_size,
        num_workers=config.training.num_workers_val,
        worker_init_fn=seed_worker,
        generator=g,
        shuffle=False,
    )

    logger.info(f"Starting inference on {len(dataset)} protein(s)")

    with torch.no_grad():
        for batch in tqdm(dataloader, desc="Processing proteins"):
            try:
                batch = batch.to(device)

                _, modes_pred = model(batch)  # (N, emb_dim), (N, num_modes, 3)

                ptr = batch.ptr  # shape [num_graphs + 1]
                for graph_idx, (start_idx, end_idx) in enumerate(
                    zip(ptr[:-1], ptr[1:])
                ):
                    name = batch.name[graph_idx]
                    modes_pred_graph = modes_pred[start_idx:end_idx]

                    save_predictions(output_subdir, name, modes_pred_graph)

            except Exception as e:
                logger.error(f"Error processing batch: {e}")
                continue

    logger.info(f"Inference complete. Results saved to {output_subdir}")
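The inference loop above splits the flattened per-node predictions back into per-protein arrays using the cumulative `ptr` offsets that torch_geometric's `Batch` provides (`ptr[i]`..`ptr[i+1]` are the nodes of graph `i`). A toy sketch of the same slicing with plain lists:

```python
# Toy flattened batch: node-level predictions for two graphs
# of 3 and 2 nodes, concatenated along the node dimension.
modes_pred = [f"node{i}" for i in range(5)]
ptr = [0, 3, 5]  # cumulative node counts, as in Batch.ptr

# Consecutive ptr pairs delimit each graph's slice.
per_graph = [modes_pred[s:e] for s, e in zip(ptr[:-1], ptr[1:])]
print(per_graph)  # [['node0', 'node1', 'node2'], ['node3', 'node4']]
```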
petimot/model/__init__.py ADDED
File without changes
petimot/model/loss.py ADDED
@@ -0,0 +1,227 @@
import torch
from torch_geometric.data import Batch
from scipy.optimize import linear_sum_assignment
import sys


def compute_NSSE_matrix(
    eigvects: torch.Tensor,
    coverage: torch.Tensor,
    modes_pred: torch.Tensor,
) -> torch.Tensor:
    N, _, _ = eigvects.shape

    modes_pred = modes_pred - modes_pred.mean(dim=0, keepdim=True)

    # Expand tensors for broadcasting
    coverage = coverage.view(N, 1, 1, 1)
    sqrt_cov = torch.sqrt(coverage)  # (N, 1, 1, 1)
    eigvects_expanded = eigvects.unsqueeze(2)  # (N, gt, 1, 3)
    modes_pred_expanded = modes_pred.unsqueeze(1)  # (N, 1, pred, 3)

    # Optimal scaling calculation
    numerator = torch.sum(
        sqrt_cov * eigvects_expanded * modes_pred_expanded, dim=(0, -1)
    )
    denominator = torch.sum(coverage * modes_pred_expanded.pow(2), dim=(0, -1)) + 1e-8
    c_optimal = numerator / denominator

    modes_adjusted = (
        sqrt_cov * modes_pred_expanded * c_optimal.unsqueeze(0).unsqueeze(-1)
    )

    loss_matrix = (
        torch.sum((eigvects_expanded - modes_adjusted).pow(2), dim=(0, -1)) / N
    )  # nb_gt, nb_pred

    return loss_matrix


def select_minimum_indices(cost_matrix: torch.Tensor) -> torch.Tensor:

    cost_np = cost_matrix.detach().cpu().numpy()

    # Get optimal indices
    row_idx, col_idx = linear_sum_assignment(cost_np)

    indices = torch.stack(
        [
            torch.tensor(row_idx, device=cost_matrix.device),
            torch.tensor(col_idx, device=cost_matrix.device),
        ],
        dim=1,
    )

    return indices


def elementwise_loss(
    eigvects: torch.Tensor, coverage: torch.Tensor, modes_pred: torch.Tensor
) -> torch.Tensor:
    loss_matrix = compute_NSSE_matrix(eigvects, coverage, modes_pred)
    matched_indices = select_minimum_indices(loss_matrix)
    return loss_matrix[matched_indices[:, 0], matched_indices[:, 1]]  # LS loss


def compute_nsse_loss(batch: Batch, modes_pred: torch.Tensor) -> torch.Tensor:

    eigvects, coverage, ptr = batch.eigvects, batch.coverage, batch.ptr
    losses = []

    for i in range(len(ptr) - 1):
        start, end = ptr[i], ptr[i + 1]
        losses.append(
            elementwise_loss(
                eigvects[start:end],
                coverage[start:end],
                modes_pred[start:end],
            )
        )

    return torch.stack(losses)


def orthogonalize_modes(modes: torch.Tensor) -> torch.Tensor:
    N, K, D = modes.shape  # [number of residues, number of modes, 3]

    modes_reshaped = modes.transpose(0, 1).reshape(K, -1)  # shape [K, N*D]

    Q, _ = torch.linalg.qr(modes_reshaped.T)  # Q: [N*D, K], so that Q^T Q = I

    modes_ortho = Q.reshape(N, D, K).transpose(1, 2)  # [N, K, D]
    return modes_ortho  # this is Q, with Q^T Q = I


def compute_rmsip_sq(
    eigvects: torch.Tensor, coverage: torch.Tensor, modes_pred: torch.Tensor
) -> float:

    eigvects = eigvects - eigvects.mean(dim=0, keepdim=True)
    eigvects = eigvects / torch.norm(eigvects, dim=(0, 2), keepdim=True)

    modes_pred = modes_pred - modes_pred.mean(dim=0, keepdim=True)

    n_modes = min(eigvects.shape[1], modes_pred.shape[1])

    sqrt_cov = torch.sqrt(coverage)[:, None, None]  # shape [N, 1, 1]
    weighted_modes_pred = modes_pred * sqrt_cov  # shape [N, K, D]

    weighted_modes_pred_ortho = orthogonalize_modes(weighted_modes_pred)[:, :n_modes, :]

    eigvects = eigvects[:, :n_modes, :]

    inner_products = torch.einsum("nkd,nld->kl", eigvects, weighted_modes_pred_ortho)

    rmsip_sq = torch.sum(inner_products**2) / n_modes

    return rmsip_sq


def compute_rmsip(
    eigvects: torch.Tensor, coverage: torch.Tensor, modes_pred: torch.Tensor
) -> float:
    return torch.sqrt(compute_rmsip_sq(eigvects, coverage, modes_pred)).item()


def compute_rmsip_loss_sample(
    eigvects: torch.Tensor, coverage: torch.Tensor, modes_pred: torch.Tensor
) -> torch.Tensor:

    rmsip_sq = compute_rmsip_sq(eigvects, coverage, modes_pred)

    loss = 1.0 - rmsip_sq  # SS loss

    return loss


def compute_rmsip_loss(batch: Batch, modes_pred: torch.Tensor) -> torch.Tensor:

    eigvects, coverage, ptr = batch.eigvects, batch.coverage, batch.ptr
    losses = []

    for i in range(len(ptr) - 1):
        start, end = ptr[i], ptr[i + 1]
        loss = compute_rmsip_loss_sample(
            eigvects[start:end], coverage[start:end], modes_pred[start:end]
        )
        losses.append(loss)

    return torch.stack(losses)


def compute_rmsip_sq_without_ortho(
    eigvects: torch.Tensor, coverage: torch.Tensor, modes_pred: torch.Tensor
) -> float:

    eigvects = eigvects - eigvects.mean(dim=0, keepdim=True)
    eigvects = eigvects / torch.norm(eigvects, dim=(0, 2), keepdim=True)

    modes_pred = modes_pred - modes_pred.mean(dim=0, keepdim=True)

    n_modes = min(eigvects.shape[1], modes_pred.shape[1])

    sqrt_cov = torch.sqrt(coverage)[:, None, None]  # shape [N, 1, 1]
    weighted_modes_pred = modes_pred * sqrt_cov  # shape [N, K, D]
    weighted_modes_pred = weighted_modes_pred[:, :n_modes, :]
    weighted_modes_pred = weighted_modes_pred / torch.norm(
        weighted_modes_pred, dim=(0, 2), keepdim=True
    )

    eigvects = eigvects[:, :n_modes, :]

    inner_products = torch.einsum("nkd,nld->kl", eigvects, weighted_modes_pred)

    rmsip_sq = torch.sum(inner_products**2) / (n_modes * n_modes)

    return rmsip_sq


def compute_self_cosine_loss(
    modes_pred: torch.Tensor, coverage: torch.Tensor
) -> torch.Tensor:

    modes_pred = modes_pred - modes_pred.mean(dim=0, keepdim=True)

    coverage = coverage[:, None, None]  # [N, 1, 1]

    weighted_modes = modes_pred * torch.sqrt(coverage)  # [N, K, 3]

    norms = torch.norm(weighted_modes, dim=(0, 2), keepdim=True)  # [1, K, 1]
    normalized_modes = weighted_modes / norms  # [N, K, 3]

    cosine_matrix = torch.einsum("nid,njd->ij", normalized_modes, normalized_modes)

    # Mask out diagonal elements (we don't want to include self-similarity)
    mask = ~torch.eye(cosine_matrix.shape[0], dtype=bool, device=cosine_matrix.device)

    n_modes = cosine_matrix.shape[0]
    cosine_loss = torch.sum(cosine_matrix[mask] ** 2) / (n_modes * n_modes)

    return cosine_loss


def compute_combined_loss_sample(
    eigvects: torch.Tensor, coverage: torch.Tensor, modes_pred: torch.Tensor
) -> torch.Tensor:
    rmsip_sq = compute_rmsip_sq_without_ortho(eigvects, coverage, modes_pred)
    rmsip_loss = 1.0 - rmsip_sq

    cosine_loss = compute_self_cosine_loss(modes_pred, coverage)

    combined_loss = rmsip_loss + cosine_loss  # IS loss

    return combined_loss


def compute_ortho_loss(batch: Batch, modes_pred: torch.Tensor) -> torch.Tensor:
    eigvects, coverage, ptr = batch.eigvects, batch.coverage, batch.ptr
    losses = []

    for i in range(len(ptr) - 1):
        start, end = ptr[i], ptr[i + 1]
        loss = compute_combined_loss_sample(
            eigvects[start:end], coverage[start:end], modes_pred[start:end]
        )
        losses.append(loss)

    return torch.stack(losses)
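For each ground-truth/predicted mode pair, `compute_NSSE_matrix` uses the closed-form optimal scale c* = Σ√w·e·m / (Σw·‖m‖² + ε) before measuring the coverage-weighted squared error, normalized by N. A NumPy sketch of the same formula for a single pair (toy data, not the project's API; note the scale-invariance this buys):

```python
import numpy as np

def nsse(eig, cov, mode):
    """NSSE between one ground-truth mode `eig` (N, 3) and one
    predicted mode `mode` (N, 3), with coverage weights `cov` (N,),
    using the closed-form optimal scaling c*."""
    sqrt_cov = np.sqrt(cov)[:, None]             # (N, 1)
    num = np.sum(sqrt_cov * eig * mode)          # <sqrt(w) e, m>
    den = np.sum(cov[:, None] * mode**2) + 1e-8  # ||sqrt(w) m||^2
    c = num / den                                # optimal scale
    return np.sum((eig - sqrt_cov * mode * c) ** 2) / len(eig)

rng = np.random.default_rng(0)
e = rng.normal(size=(10, 3))
w = np.ones(10)
loss_same = nsse(e, w, e)        # same direction -> ~0
loss_scaled = nsse(e, w, 2.0 * e)  # global rescaling is absorbed -> ~0
print(loss_same, loss_scaled)
```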
petimot/model/neural_net.py ADDED
@@ -0,0 +1,374 @@
import torch
import torch.nn as nn
from torch_geometric.nn import MessagePassing
from typing import Tuple
from petimot.utils.rigid_utils import Rigid


class MLP(nn.Module):
    def __init__(
        self,
        input_dim: int,
        output_dim: int,
        hidden_dim: int,
        num_layers: int,
        dropout: float,
    ):
        super(MLP, self).__init__()

        if num_layers <= 0:
            self.network = nn.Linear(input_dim, output_dim)
        else:
            layers = [nn.Linear(input_dim, hidden_dim), nn.GELU(), nn.Dropout(dropout)]

            for _ in range(num_layers - 1):
                layers.append(nn.Linear(hidden_dim, hidden_dim))
                layers.append(nn.GELU())
                layers.append(nn.Dropout(dropout))

            layers.append(nn.Linear(hidden_dim, output_dim))

            self.network = nn.Sequential(*layers)

    def forward(self, x):
        return self.network(x)


class MPNNLayer(MessagePassing):

    def __init__(
        self,
        emb_dim: int,
        edge_dim: int,
        num_modes_pred: int,
        mlp_num_layers: int,
        dropout: float,
        normalize_between_layers: bool,
        center_between_layers: bool,
        orthogonalize_between_layers: bool,
        ablate_structure: bool,
    ):
        # We use `flow="target_to_source"`,
        # meaning the first row of edge_index are target nodes
        # and the second row are source nodes.
        super().__init__(aggr="mean", flow="target_to_source")
        self.normalize_between_layers = normalize_between_layers
        self.center_between_layers = center_between_layers
        self.orthogonalize_between_layers = orthogonalize_between_layers
        self.ablate_structure = ablate_structure

        if self.ablate_structure:
            message_input_dim = 2 * emb_dim + 3 * (num_modes_pred * 3)
        else:
            message_input_dim = 2 * emb_dim + edge_dim + 3 * (num_modes_pred * 3)
        # Message MLP => outputs new node embeddings (emb_dim)
        self.message_mlp = MLP(
            input_dim=message_input_dim,
            output_dim=emb_dim,
            hidden_dim=emb_dim,
            num_layers=mlp_num_layers,
            dropout=dropout,
        )

        # A linear layer to update v from [x, v]. (No bias if you prefer.)
        self.vect_mlp = nn.Linear(
            emb_dim + num_modes_pred * 3, num_modes_pred * 3, bias=False
        )

        self.num_modes_pred = num_modes_pred
        self.norm_x = nn.LayerNorm(emb_dim)

    def forward(
        self,
        x: torch.Tensor,
        v: torch.Tensor,
        edge_index: torch.Tensor,
        edge_rots: Rigid,
        edge_attr: torch.Tensor,
        ptr: torch.Tensor,
    ) -> Tuple[torch.Tensor, torch.Tensor]:

        x_update = self.propagate(
            edge_index=edge_index,
            x=x,
            v=v,
            edge_rots=edge_rots,
            edge_attr=edge_attr,
        )

        x = x + x_update
        x = self.norm_x(x)

        v_update = self.vect_mlp(torch.cat([x, v], dim=-1))
        v = v + v_update

        if (
            self.normalize_between_layers
            or self.center_between_layers
            or self.orthogonalize_between_layers
        ):
            v = self.normalize_vectors(
                v,
                ptr,
                orthogonalize=self.orthogonalize_between_layers,
                normalize=self.normalize_between_layers,
                center=self.center_between_layers,
            )

        return x, v

    def message(
        self,
        x_i: torch.Tensor,
        x_j: torch.Tensor,
        v_i: torch.Tensor,
        v_j: torch.Tensor,
        edge_rots: Rigid,
        edge_attr: torch.Tensor,
    ) -> torch.Tensor:

        vj = v_j.reshape(-1, self.num_modes_pred, 3)
        vj_rotated = edge_rots.apply_batch(vj)
        vj_rotated = vj_rotated.view(vj_rotated.shape[0], -1)

        s1 = v_i - vj_rotated

        if self.ablate_structure:
            message = self.message_mlp(
                torch.cat([x_i, x_j, v_i, vj_rotated, s1], dim=-1)
            )
        else:
            message = self.message_mlp(
                torch.cat([x_i, x_j, v_i, vj_rotated, s1, edge_attr], dim=-1)
            )

        return message

    def normalize_vectors(
        self,
        v: torch.Tensor,
        ptr: torch.Tensor,
        orthogonalize: bool,
        normalize: bool,
        center: bool,
    ) -> torch.Tensor:

        v_norm = torch.empty_like(v)
        v_reshaped = v.view(
            v.shape[0], self.num_modes_pred, 3
        )  # [N, num_modes_pred, 3]

        for i in range(len(ptr) - 1):
            start, end = ptr[i], ptr[i + 1]
            n_residues = end - start
            v_graph = v_reshaped[start:end]

            if center:
                v_graph = v_graph - v_graph.mean(dim=0, keepdim=True)

            if orthogonalize and not normalize:
                norms_input = torch.norm(v_graph, dim=(0, 2))

            if orthogonalize:
                v_flat = v_graph.reshape(-1, self.num_modes_pred)

                q, _ = torch.linalg.qr(v_flat)

                if normalize:
                    # Scale Q to maintain magnitude proportional to sqrt(N)
                    scale = n_residues.to(v.device).sqrt()
                    q = q * scale
                else:
                    # Restore original norms
                    current_norms = torch.norm(q, dim=0)
                    scale_factors = norms_input / current_norms
                    q = q * scale_factors.unsqueeze(0)

                q_reshaped = q.reshape(n_residues, -1)
                v_norm[start:end] = q_reshaped

            else:
                if normalize:
                    # Just normalize without orthogonalization
                    scale = n_residues.to(v.device).sqrt()
                    current_norms = torch.norm(v_graph, dim=(0, 2))
                    scale_factors = scale / current_norms
                    v_graph = v_graph * scale_factors.unsqueeze(0).unsqueeze(-1)
                    v_norm[start:end] = v_graph.reshape(n_residues, -1)
                else:
                    # Neither normalize nor orthogonalize, just return (possibly centered) vectors
                    v_norm[start:end] = v_graph.reshape(n_residues, -1)

        return v_norm


class ProteinMotionMPNN(nn.Module):

    def __init__(
        self,
        input_dim: int,
        emb_dim: int,
        edge_dim: int,
        num_modes_pred: int,
        num_layers: int,
        shared_layers: bool,
        mlp_num_layers: int,
        input_embedding_dropout: float,
        dropout: float,
        num_basis: int,
        max_dist: float,
        sigma: float,
        change_connectivity: bool,
        normalize_between_layers: bool,
        center_between_layers: bool,
        orthogonalize_between_layers: bool,
        start_with_zero_v: bool,
        ablate_structure: bool,
    ):
        super().__init__()

        self.num_modes_pred = num_modes_pred
        self.num_layers = num_layers
        self.shared_layers = shared_layers
        self.edge_dim = edge_dim

        self.input_proj = nn.Linear(input_dim, emb_dim)
        self.input_norm = nn.LayerNorm(input_dim)
        self.input_dropout = nn.Dropout(input_embedding_dropout)

        self.num_basis = num_basis
        self.max_dist = max_dist
        self.sigma = sigma

        self.change_connectivity = change_connectivity
        self.start_with_zero_v = start_with_zero_v

        if shared_layers:
            self.mpnn = MPNNLayer(
                emb_dim=emb_dim,
                edge_dim=edge_dim,
                num_modes_pred=num_modes_pred,
                mlp_num_layers=mlp_num_layers,
                dropout=dropout,
                normalize_between_layers=normalize_between_layers,
                center_between_layers=center_between_layers,
                orthogonalize_between_layers=orthogonalize_between_layers,
                ablate_structure=ablate_structure,
            )
        else:
            self.mpnn_layers = nn.ModuleList()
            for _ in range(num_layers):
                layer = MPNNLayer(
                    emb_dim=emb_dim,
                    edge_dim=edge_dim,
                    num_modes_pred=num_modes_pred,
                    mlp_num_layers=mlp_num_layers,
                    dropout=dropout,
                    normalize_between_layers=normalize_between_layers,
                    center_between_layers=center_between_layers,
                    orthogonalize_between_layers=orthogonalize_between_layers,
                    ablate_structure=ablate_structure,
                )
                self.mpnn_layers.append(layer)

    def compute_radial_basis_fullbb(
        self, atoms_i: torch.Tensor, atoms_j: torch.Tensor
    ) -> torch.Tensor:

        # Pairwise distances among the sets of atoms
        # e.g. [E, 3, 3] for each edge => cdist => [E, 3, 3]
        distances = torch.cdist(atoms_i, atoms_j)  # shape [E, X, X]

        # Expand distances along a new dimension for the radial basis
        # => shape [E, X, X, num_basis]
        mu = torch.linspace(0, self.max_dist, self.num_basis, device=distances.device)
        radial = torch.exp(-((distances.unsqueeze(-1) - mu) ** 2) / (2 * self.sigma**2))

        # Flatten the X*X dimension
        # => shape [E, X*X*num_basis]
        radial = radial.view(distances.size(0), -1)
        return radial

    def compute_edge_features_from_index(
        self, rigids, row_index: torch.Tensor, col_index: torch.Tensor, bb: torch.Tensor
    ) -> Tuple:

        T_i = rigids[row_index]
        T_j = rigids[col_index]
        T_i_inv = T_i.invert()
        composed_ij = T_i_inv.compose(T_j)
        edge_rots = composed_ij.get_rots()
        edge_quats = edge_rots.get_quats()
        edge_trans = composed_ij.get_trans()
        edge_log_dist = torch.log(torch.norm(edge_trans, dim=-1) + 1e-8)
        chain_distance = torch.log(torch.abs(row_index - col_index) + 1)
        atoms_i = bb[row_index]
        atoms_j = bb[col_index]
        radial_features = self.compute_radial_basis_fullbb(atoms_i, atoms_j)
        edge_attr = torch.cat(
            [
                edge_quats,
                edge_trans,
                chain_distance.unsqueeze(-1),
                edge_log_dist.unsqueeze(-1),
                radial_features,
            ],
            dim=-1,
        )
        return edge_rots, edge_attr

    def forward(self, data) -> Tuple[torch.Tensor, torch.Tensor]:

        emb = self.input_norm(data.x)
        emb = self.input_dropout(emb)

        emb = self.input_proj(emb)

        if self.start_with_zero_v:
            v = torch.zeros(emb.shape[0], self.num_modes_pred * 3, device=emb.device)
        else:
            v = torch.rand(emb.shape[0], self.num_modes_pred * 3, device=emb.device)

        row_index, col_index, ptr = data.row_index, data.col_index, data.ptr

        rigids = Rigid.make_transform_from_reference(
            data.bb[:, 0, :], data.bb[:, 1, :], data.bb[:, 2, :]
        )

        if self.change_connectivity:

            for layer_idx in range(self.num_layers):
                current_edge_index = torch.stack(
                    [row_index, col_index[layer_idx]], dim=0
                )

                edge_rots, edge_attr = self.compute_edge_features_from_index(
                    rigids, row_index, col_index[layer_idx], data.bb
                )

                if self.shared_layers:
                    emb, v = self.mpnn(
                        emb, v, current_edge_index, edge_rots, edge_attr, ptr
                    )
                else:
                    emb, v = self.mpnn_layers[layer_idx](
                        emb, v, current_edge_index, edge_rots, edge_attr, ptr
                    )
        else:
            edge_index = torch.stack([row_index, col_index], dim=0)
            edge_rots, edge_attr = self.compute_edge_features_from_index(
                rigids, row_index, col_index, data.bb
            )

            if self.shared_layers:
                for _ in range(self.num_layers):
                    emb, v = self.mpnn(emb, v, edge_index, edge_rots, edge_attr, ptr)
            else:
                for layer in self.mpnn_layers:
                    emb, v = layer(emb, v, edge_index, edge_rots, edge_attr, ptr)

        # Convert local vectors v to global coordinates
        v_reshaped = v.view(v.shape[0], self.num_modes_pred, 3)
        v_global = rigids.get_rots().apply_batch(v_reshaped)

        return emb, v_global
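`compute_radial_basis_fullbb` expands raw atom-pair distances into smooth Gaussian radial basis features, which an MLP can use more easily than scalar distances. A NumPy sketch of the same expansion, with illustrative values for `num_basis`, `max_dist`, and `sigma` (the project reads these from its config):

```python
import numpy as np

def radial_basis(distances, num_basis=4, max_dist=10.0, sigma=1.0):
    """Expand pairwise distances (E, X, X) into Gaussian radial basis
    features centered on a linspace of means, flattened per edge."""
    mu = np.linspace(0.0, max_dist, num_basis)  # basis centers
    radial = np.exp(-((distances[..., None] - mu) ** 2) / (2 * sigma**2))
    return radial.reshape(distances.shape[0], -1)  # (E, X*X*num_basis)

d = np.zeros((2, 3, 3))  # two edges, 3x3 atom-pair distances, all zero
feats = radial_basis(d)
print(feats.shape)  # (2, 36)
```

A distance exactly at a basis center gives that feature the value 1, and the response decays smoothly with |distance − mu|.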
petimot/model/optimizer.py ADDED
@@ -0,0 +1,31 @@
import torch


def get_optimizer(parameters, optimizer_name, learning_rate, **kwargs):

    optimizer_name = optimizer_name.lower()
    optimizer_map = {
        "adam": torch.optim.Adam,
        "sgd": torch.optim.SGD,
        "adagrad": torch.optim.Adagrad,
        "adadelta": torch.optim.Adadelta,
        "rmsprop": torch.optim.RMSprop,
        "adamw": torch.optim.AdamW,
    }

    if optimizer_name not in optimizer_map:
        raise ValueError(
            f"Invalid optimizer name: {optimizer_name}. "
            f"Valid options: {list(optimizer_map.keys())}"
        )

    defaults = {
        "adam": {"weight_decay": 0.0, "amsgrad": False},
        "adamw": {"weight_decay": 0.01},
        "sgd": {"momentum": 0.9, "nesterov": True},
        "rmsprop": {"alpha": 0.99, "momentum": 0.0},
    }.get(optimizer_name, {})

    final_params = {**defaults, **kwargs}

    return optimizer_map[optimizer_name](parameters, lr=learning_rate, **final_params)
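`get_optimizer` follows a dispatch-with-defaults pattern: a name-to-constructor map, per-optimizer default kwargs, and caller kwargs overriding the defaults via dict merging. A minimal sketch of the same pattern, with stub tuples standing in for the `torch.optim` classes so it runs without torch:

```python
def make_optimizer(name, lr, **kwargs):
    """Dispatch-with-defaults sketch; stubs stand in for torch.optim."""
    registry = {
        "adam": lambda lr, **kw: ("Adam", lr, kw),
        "sgd": lambda lr, **kw: ("SGD", lr, kw),
    }
    name = name.lower()
    if name not in registry:
        raise ValueError(
            f"Invalid optimizer name: {name}. Valid options: {list(registry)}"
        )
    defaults = {
        "adam": {"weight_decay": 0.0},
        "sgd": {"momentum": 0.9},
    }.get(name, {})
    # Later keys win in a dict merge, so caller kwargs override defaults.
    return registry[name](lr, **{**defaults, **kwargs})

print(make_optimizer("SGD", 0.01, momentum=0.5))
# ('SGD', 0.01, {'momentum': 0.5})
```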
petimot/utils/__init__.py ADDED
File without changes
petimot/utils/rigid_utils.py ADDED
@@ -0,0 +1,1445 @@
+ # Copyright 2021 AlQuraishi Laboratory
+ # Copyright 2021 DeepMind Technologies Limited
+ #
+ # Licensed under the Apache License, Version 2.0 (the "License");
+ # you may not use this file except in compliance with the License.
+ # You may obtain a copy of the License at
+ #
+ #      http://www.apache.org/licenses/LICENSE-2.0
+ #
+ # Unless required by applicable law or agreed to in writing, software
+ # distributed under the License is distributed on an "AS IS" BASIS,
+ # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ # See the License for the specific language governing permissions and
+ # limitations under the License.
+
+ # Modifications to original file:
+ # - `rot_vec_mul_batch`: Applies a batch of rotations to a batch of vectors.
+ # - `apply_batch`: Applies rotations to a batch of 3D coordinates.
+ # - `invert_apply_batch`: Applies the inverse of the batch rotation to coordinates.
+
+ from __future__ import annotations
+ from functools import lru_cache
+ from typing import Tuple, Any, Sequence, Callable, Optional
+
+ import numpy as np
+ import torch
+
+
+ def rot_vec_mul_batch(r: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
+     """
+     Applies a batch of rotations to a batch of vectors. Supports t with shape [Batch, N, 3].
+
+     Args:
+         r: [Batch, 3, 3] rotation matrices
+         t: [Batch, N, 3] coordinate tensors
+     Returns:
+         [Batch, N, 3] rotated coordinates
+     """
+     x, y, z = torch.unbind(t, dim=-1)  # x, y, z have shape [Batch, N]
+
+     return torch.stack(
+         [
+             r[..., 0, 0].unsqueeze(-1) * x
+             + r[..., 0, 1].unsqueeze(-1) * y
+             + r[..., 0, 2].unsqueeze(-1) * z,
+             r[..., 1, 0].unsqueeze(-1) * x
+             + r[..., 1, 1].unsqueeze(-1) * y
+             + r[..., 1, 2].unsqueeze(-1) * z,
+             r[..., 2, 0].unsqueeze(-1) * x
+             + r[..., 2, 1].unsqueeze(-1) * y
+             + r[..., 2, 2].unsqueeze(-1) * z,
+         ],
+         dim=-1,
+     )
+
+
+ def rot_matmul(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
+     """
+     Performs matrix multiplication of two rotation matrix tensors. Written
+     out by hand to avoid AMP downcasting.
+
+     Args:
+         a: [*, 3, 3] left multiplicand
+         b: [*, 3, 3] right multiplicand
+     Returns:
+         The product ab
+     """
+
+     def row_mul(i):
+         return torch.stack(
+             [
+                 a[..., i, 0] * b[..., 0, 0]
+                 + a[..., i, 1] * b[..., 1, 0]
+                 + a[..., i, 2] * b[..., 2, 0],
+                 a[..., i, 0] * b[..., 0, 1]
+                 + a[..., i, 1] * b[..., 1, 1]
+                 + a[..., i, 2] * b[..., 2, 1],
+                 a[..., i, 0] * b[..., 0, 2]
+                 + a[..., i, 1] * b[..., 1, 2]
+                 + a[..., i, 2] * b[..., 2, 2],
+             ],
+             dim=-1,
+         )
+
+     return torch.stack(
+         [
+             row_mul(0),
+             row_mul(1),
+             row_mul(2),
+         ],
+         dim=-2,
+     )
+
+
+ def rot_vec_mul(r: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
+     """
+     Applies a rotation to a vector. Written out by hand to avoid AMP
+     downcasting.
+
+     Args:
+         r: [*, 3, 3] rotation matrices
+         t: [*, 3] coordinate tensors
+     Returns:
+         [*, 3] rotated coordinates
+     """
+     x, y, z = torch.unbind(t, dim=-1)
+     return torch.stack(
+         [
+             r[..., 0, 0] * x + r[..., 0, 1] * y + r[..., 0, 2] * z,
+             r[..., 1, 0] * x + r[..., 1, 1] * y + r[..., 1, 2] * z,
+             r[..., 2, 0] * x + r[..., 2, 1] * y + r[..., 2, 2] * z,
+         ],
+         dim=-1,
+     )
+
+
+ @lru_cache(maxsize=None)
+ def identity_rot_mats(
+     batch_dims: Tuple[int],
+     dtype: Optional[torch.dtype] = None,
+     device: Optional[torch.device] = None,
+     requires_grad: bool = True,
+ ) -> torch.Tensor:
+     rots = torch.eye(3, dtype=dtype, device=device, requires_grad=requires_grad)
+     rots = rots.view(*((1,) * len(batch_dims)), 3, 3)
+     rots = rots.expand(*batch_dims, -1, -1)
+     rots = rots.contiguous()
+
+     return rots
+
+
+ @lru_cache(maxsize=None)
+ def identity_trans(
+     batch_dims: Tuple[int],
+     dtype: Optional[torch.dtype] = None,
+     device: Optional[torch.device] = None,
+     requires_grad: bool = True,
+ ) -> torch.Tensor:
+     trans = torch.zeros(
+         (*batch_dims, 3), dtype=dtype, device=device, requires_grad=requires_grad
+     )
+     return trans
+
+
+ @lru_cache(maxsize=None)
+ def identity_quats(
+     batch_dims: Tuple[int],
+     dtype: Optional[torch.dtype] = None,
+     device: Optional[torch.device] = None,
+     requires_grad: bool = True,
+ ) -> torch.Tensor:
+     quat = torch.zeros(
+         (*batch_dims, 4), dtype=dtype, device=device, requires_grad=requires_grad
+     )
+
+     with torch.no_grad():
+         quat[..., 0] = 1
+
+     return quat
+
+
+ _quat_elements = ["a", "b", "c", "d"]
+ _qtr_keys = [l1 + l2 for l1 in _quat_elements for l2 in _quat_elements]
+ _qtr_ind_dict = {key: ind for ind, key in enumerate(_qtr_keys)}
+
+
+ def _to_mat(pairs):
+     mat = np.zeros((4, 4))
+     for pair in pairs:
+         key, value = pair
+         ind = _qtr_ind_dict[key]
+         mat[ind // 4][ind % 4] = value
+
+     return mat
+
+
+ _QTR_MAT = np.zeros((4, 4, 3, 3))
+ _QTR_MAT[..., 0, 0] = _to_mat([("aa", 1), ("bb", 1), ("cc", -1), ("dd", -1)])
+ _QTR_MAT[..., 0, 1] = _to_mat([("bc", 2), ("ad", -2)])
+ _QTR_MAT[..., 0, 2] = _to_mat([("bd", 2), ("ac", 2)])
+ _QTR_MAT[..., 1, 0] = _to_mat([("bc", 2), ("ad", 2)])
+ _QTR_MAT[..., 1, 1] = _to_mat([("aa", 1), ("bb", -1), ("cc", 1), ("dd", -1)])
+ _QTR_MAT[..., 1, 2] = _to_mat([("cd", 2), ("ab", -2)])
+ _QTR_MAT[..., 2, 0] = _to_mat([("bd", 2), ("ac", -2)])
+ _QTR_MAT[..., 2, 1] = _to_mat([("cd", 2), ("ab", 2)])
+ _QTR_MAT[..., 2, 2] = _to_mat([("aa", 1), ("bb", -1), ("cc", -1), ("dd", 1)])
+
+
+ def quat_to_rot(quat: torch.Tensor) -> torch.Tensor:
+     """
+     Converts a quaternion to a rotation matrix.
+
+     Args:
+         quat: [*, 4] quaternions
+     Returns:
+         [*, 3, 3] rotation matrices
+     """
+     # [*, 4, 4]
+     quat = quat[..., None] * quat[..., None, :]
+
+     # [4, 4, 3, 3]
+     mat = _get_quat("_QTR_MAT", dtype=quat.dtype, device=quat.device)
+
+     # [*, 4, 4, 3, 3]
+     shaped_qtr_mat = mat.view((1,) * len(quat.shape[:-2]) + mat.shape)
+     quat = quat[..., None, None] * shaped_qtr_mat
+
+     # [*, 3, 3]
+     return torch.sum(quat, dim=(-3, -4))
+
+
+ def rot_to_quat(
+     rot: torch.Tensor,
+ ):
+     if rot.shape[-2:] != (3, 3):
+         raise ValueError("Input rotation is incorrectly shaped")
+
+     rot = [[rot[..., i, j] for j in range(3)] for i in range(3)]
+     [[xx, xy, xz], [yx, yy, yz], [zx, zy, zz]] = rot
+
+     k = [
+         [
+             xx + yy + zz,
+             zy - yz,
+             xz - zx,
+             yx - xy,
+         ],
+         [
+             zy - yz,
+             xx - yy - zz,
+             xy + yx,
+             xz + zx,
+         ],
+         [
+             xz - zx,
+             xy + yx,
+             yy - xx - zz,
+             yz + zy,
+         ],
+         [
+             yx - xy,
+             xz + zx,
+             yz + zy,
+             zz - xx - yy,
+         ],
+     ]
+
+     k = (1.0 / 3.0) * torch.stack([torch.stack(t, dim=-1) for t in k], dim=-2)
+
+     _, vectors = torch.linalg.eigh(k)
+     return vectors[..., -1]
+
+
+ _QUAT_MULTIPLY = np.zeros((4, 4, 4))
+ _QUAT_MULTIPLY[:, :, 0] = [[1, 0, 0, 0], [0, -1, 0, 0], [0, 0, -1, 0], [0, 0, 0, -1]]
+
+ _QUAT_MULTIPLY[:, :, 1] = [[0, 1, 0, 0], [1, 0, 0, 0], [0, 0, 0, 1], [0, 0, -1, 0]]
+
+ _QUAT_MULTIPLY[:, :, 2] = [[0, 0, 1, 0], [0, 0, 0, -1], [1, 0, 0, 0], [0, 1, 0, 0]]
+
+ _QUAT_MULTIPLY[:, :, 3] = [[0, 0, 0, 1], [0, 0, 1, 0], [0, -1, 0, 0], [1, 0, 0, 0]]
+
+ _QUAT_MULTIPLY_BY_VEC = _QUAT_MULTIPLY[:, 1:, :]
+
+ _CACHED_QUATS = {
+     "_QTR_MAT": _QTR_MAT,
+     "_QUAT_MULTIPLY": _QUAT_MULTIPLY,
+     "_QUAT_MULTIPLY_BY_VEC": _QUAT_MULTIPLY_BY_VEC,
+ }
+
+
+ @lru_cache(maxsize=None)
+ def _get_quat(quat_key, dtype, device):
+     return torch.tensor(_CACHED_QUATS[quat_key], dtype=dtype, device=device)
+
+
+ def quat_multiply(quat1, quat2):
+     """Multiply a quaternion by another quaternion."""
+     mat = _get_quat("_QUAT_MULTIPLY", dtype=quat1.dtype, device=quat1.device)
+     reshaped_mat = mat.view((1,) * len(quat1.shape[:-1]) + mat.shape)
+     return torch.sum(
+         reshaped_mat * quat1[..., :, None, None] * quat2[..., None, :, None],
+         dim=(-3, -2),
+     )
+
+
+ def quat_multiply_by_vec(quat, vec):
+     """Multiply a quaternion by a pure-vector quaternion."""
+     mat = _get_quat("_QUAT_MULTIPLY_BY_VEC", dtype=quat.dtype, device=quat.device)
+     reshaped_mat = mat.view((1,) * len(quat.shape[:-1]) + mat.shape)
+     return torch.sum(
+         reshaped_mat * quat[..., :, None, None] * vec[..., None, :, None], dim=(-3, -2)
+     )
+
+
+ def invert_rot_mat(rot_mat: torch.Tensor):
+     return rot_mat.transpose(-1, -2)
+
+
+ def invert_quat(quat: torch.Tensor):
+     quat_prime = quat.clone()
+     quat_prime[..., 1:] *= -1
+     inv = quat_prime / torch.sum(quat**2, dim=-1, keepdim=True)
+     return inv
+
+
+ class Rotation:
+     """
+     A 3D rotation. Depending on how the object is initialized, the
+     rotation is represented by either a rotation matrix or a
+     quaternion, though both formats are made available by helper functions.
+     To simplify gradient computation, the underlying format of the
+     rotation cannot be changed in-place. Like Rigid, the class is designed
+     to mimic the behavior of a torch Tensor, almost as if each Rotation
+     object were a tensor of rotations, in one format or another.
+     """
+
+     def __init__(
+         self,
+         rot_mats: Optional[torch.Tensor] = None,
+         quats: Optional[torch.Tensor] = None,
+         normalize_quats: bool = True,
+     ):
+         """
+         Args:
+             rot_mats:
+                 A [*, 3, 3] rotation matrix tensor. Mutually exclusive with
+                 quats
+             quats:
+                 A [*, 4] quaternion. Mutually exclusive with rot_mats. If
+                 normalize_quats is not True, must be a unit quaternion
+             normalize_quats:
+                 If quats is specified, whether to normalize quats
+         """
+         if (rot_mats is None and quats is None) or (
+             rot_mats is not None and quats is not None
+         ):
+             raise ValueError("Exactly one input argument must be specified")
+
+         if (rot_mats is not None and rot_mats.shape[-2:] != (3, 3)) or (
+             quats is not None and quats.shape[-1] != 4
+         ):
+             raise ValueError("Incorrectly shaped rotation matrix or quaternion")
+
+         # Force full-precision
+         if quats is not None:
+             quats = quats.to(dtype=torch.float32)
+         if rot_mats is not None:
+             rot_mats = rot_mats.to(dtype=torch.float32)
+
+         if quats is not None and normalize_quats:
+             quats = quats / torch.linalg.norm(quats, dim=-1, keepdim=True)
+
+         self._rot_mats = rot_mats
+         self._quats = quats
+
+     @staticmethod
+     def identity(
+         shape,
+         dtype: Optional[torch.dtype] = None,
+         device: Optional[torch.device] = None,
+         requires_grad: bool = True,
+         fmt: str = "quat",
+     ) -> Rotation:
+         """
+         Returns an identity Rotation.
+
+         Args:
+             shape:
+                 The "shape" of the resulting Rotation object. See documentation
+                 for the shape property
+             dtype:
+                 The torch dtype for the rotation
+             device:
+                 The torch device for the new rotation
+             requires_grad:
+                 Whether the underlying tensors in the new rotation object
+                 should require gradient computation
+             fmt:
+                 One of "quat" or "rot_mat". Determines the underlying format
+                 of the new object's rotation
+         Returns:
+             A new identity rotation
+         """
+         if fmt == "rot_mat":
+             rot_mats = identity_rot_mats(
+                 shape,
+                 dtype,
+                 device,
+                 requires_grad,
+             )
+             return Rotation(rot_mats=rot_mats, quats=None)
+         elif fmt == "quat":
+             quats = identity_quats(shape, dtype, device, requires_grad)
+             return Rotation(rot_mats=None, quats=quats, normalize_quats=False)
+         else:
+             raise ValueError(f"Invalid format: {fmt}")
+
+     # Magic methods
+
+     def __getitem__(self, index: Any) -> Rotation:
+         """
+         Allows torch-style indexing over the virtual shape of the rotation
+         object. See documentation for the shape property.
+
+         Args:
+             index:
+                 A torch index. E.g. (1, 3, 2), or (slice(None,))
+         Returns:
+             The indexed rotation
+         """
+         if type(index) != tuple:
+             index = (index,)
+
+         if self._rot_mats is not None:
+             rot_mats = self._rot_mats[index + (slice(None), slice(None))]
+             return Rotation(rot_mats=rot_mats)
+         elif self._quats is not None:
+             quats = self._quats[index + (slice(None),)]
+             return Rotation(quats=quats, normalize_quats=False)
+         else:
+             raise ValueError("Both rotations are None")
+
+     def __mul__(
+         self,
+         right: torch.Tensor,
+     ) -> Rotation:
+         """
+         Pointwise left multiplication of the rotation with a tensor. Can be
+         used to e.g. mask the Rotation.
+
+         Args:
+             right:
+                 The tensor multiplicand
+         Returns:
+             The product
+         """
+         if not (isinstance(right, torch.Tensor)):
+             raise TypeError("The other multiplicand must be a Tensor")
+
+         if self._rot_mats is not None:
+             rot_mats = self._rot_mats * right[..., None, None]
+             return Rotation(rot_mats=rot_mats, quats=None)
+         elif self._quats is not None:
+             quats = self._quats * right[..., None]
+             return Rotation(rot_mats=None, quats=quats, normalize_quats=False)
+         else:
+             raise ValueError("Both rotations are None")
+
+     def __rmul__(
+         self,
+         left: torch.Tensor,
+     ) -> Rotation:
+         """
+         Reverse pointwise multiplication of the rotation with a tensor.
+
+         Args:
+             left:
+                 The left multiplicand
+         Returns:
+             The product
+         """
+         return self.__mul__(left)
+
+     # Properties
+
+     @property
+     def shape(self) -> torch.Size:
+         """
+         Returns the virtual shape of the rotation object. This shape is
+         defined as the batch dimensions of the underlying rotation matrix
+         or quaternion. If the Rotation was initialized with a [10, 3, 3]
+         rotation matrix tensor, for example, the resulting shape would be
+         [10].
+
+         Returns:
+             The virtual shape of the rotation object
+         """
+         s = None
+         if self._quats is not None:
+             s = self._quats.shape[:-1]
+         else:
+             s = self._rot_mats.shape[:-2]
+
+         return s
+
+     @property
+     def dtype(self) -> torch.dtype:
+         """
+         Returns the dtype of the underlying rotation.
+
+         Returns:
+             The dtype of the underlying rotation
+         """
+         if self._rot_mats is not None:
+             return self._rot_mats.dtype
+         elif self._quats is not None:
+             return self._quats.dtype
+         else:
+             raise ValueError("Both rotations are None")
+
+     @property
+     def device(self) -> torch.device:
+         """
+         The device of the underlying rotation
+
+         Returns:
+             The device of the underlying rotation
+         """
+         if self._rot_mats is not None:
+             return self._rot_mats.device
+         elif self._quats is not None:
+             return self._quats.device
+         else:
+             raise ValueError("Both rotations are None")
+
+     @property
+     def requires_grad(self) -> bool:
+         """
+         Returns the requires_grad property of the underlying rotation
+
+         Returns:
+             The requires_grad property of the underlying tensor
+         """
+         if self._rot_mats is not None:
+             return self._rot_mats.requires_grad
+         elif self._quats is not None:
+             return self._quats.requires_grad
+         else:
+             raise ValueError("Both rotations are None")
+
+     def get_rot_mats(self) -> torch.Tensor:
+         """
+         Returns the underlying rotation as a rotation matrix tensor.
+
+         Returns:
+             The rotation as a rotation matrix tensor
+         """
+         rot_mats = self._rot_mats
+         if rot_mats is None:
+             if self._quats is None:
+                 raise ValueError("Both rotations are None")
+             else:
+                 rot_mats = quat_to_rot(self._quats)
+
+         return rot_mats
+
+     def get_quats(self) -> torch.Tensor:
+         """
+         Returns the underlying rotation as a quaternion tensor.
+
+         Depending on whether the Rotation was initialized with a
+         quaternion, this function may call torch.linalg.eigh.
+
+         Returns:
+             The rotation as a quaternion tensor.
+         """
+         quats = self._quats
+         if quats is None:
+             if self._rot_mats is None:
+                 raise ValueError("Both rotations are None")
+             else:
+                 quats = rot_to_quat(self._rot_mats)
+
+         return quats
+
+     def get_cur_rot(self) -> torch.Tensor:
+         """
+         Return the underlying rotation in its current form
+
+         Returns:
+             The stored rotation
+         """
+         if self._rot_mats is not None:
+             return self._rot_mats
+         elif self._quats is not None:
+             return self._quats
+         else:
+             raise ValueError("Both rotations are None")
+
+     # Rotation functions
+
+     def compose_q_update_vec(
+         self, q_update_vec: torch.Tensor, normalize_quats: bool = True
+     ) -> Rotation:
+         """
+         Returns a new quaternion Rotation after updating the current
+         object's underlying rotation with a quaternion update, formatted
+         as a [*, 3] tensor whose final three columns represent x, y, z such
+         that (1, x, y, z) is the desired (not necessarily unit) quaternion
+         update.
+
+         Args:
+             q_update_vec:
+                 A [*, 3] quaternion update tensor
+             normalize_quats:
+                 Whether to normalize the output quaternion
+         Returns:
+             An updated Rotation
+         """
+         quats = self.get_quats()
+         new_quats = quats + quat_multiply_by_vec(quats, q_update_vec)
+         return Rotation(
+             rot_mats=None,
+             quats=new_quats,
+             normalize_quats=normalize_quats,
+         )
+
+     def compose_r(self, r: Rotation) -> Rotation:
+         """
+         Compose the rotation matrices of the current Rotation object with
+         those of another.
+
+         Args:
+             r:
+                 An update rotation object
+         Returns:
+             An updated rotation object
+         """
+         r1 = self.get_rot_mats()
+         r2 = r.get_rot_mats()
+         new_rot_mats = rot_matmul(r1, r2)
+         return Rotation(rot_mats=new_rot_mats, quats=None)
+
+     def compose_q(self, r: Rotation, normalize_quats: bool = True) -> Rotation:
+         """
+         Compose the quaternions of the current Rotation object with those
+         of another.
+
+         Depending on whether either Rotation was initialized with
+         quaternions, this function may call torch.linalg.eigh.
+
+         Args:
+             r:
+                 An update rotation object
+         Returns:
+             An updated rotation object
+         """
+         q1 = self.get_quats()
+         q2 = r.get_quats()
+         new_quats = quat_multiply(q1, q2)
+         return Rotation(rot_mats=None, quats=new_quats, normalize_quats=normalize_quats)
+
+     def apply(self, pts: torch.Tensor) -> torch.Tensor:
+         """
+         Apply the current Rotation as a rotation matrix to a set of 3D
+         coordinates.
+
+         Args:
+             pts:
+                 A [*, 3] set of points
+         Returns:
+             [*, 3] rotated points
+         """
+         rot_mats = self.get_rot_mats()
+         return rot_vec_mul(rot_mats, pts)
+
+     def apply_batch(self, pts: torch.Tensor) -> torch.Tensor:
+         """
+         Apply the current Rotation as a rotation matrix to a batch of 3D
+         coordinates.
+
+         Args:
+             pts:
+                 A [Batch, N, 3] set of points
+         Returns:
+             [Batch, N, 3] rotated points
+         """
+         rot_mats = self.get_rot_mats()  # Assume shape [Batch, 3, 3]
+         return rot_vec_mul_batch(rot_mats, pts)
+
+     def invert_apply(self, pts: torch.Tensor) -> torch.Tensor:
+         """
+         The inverse of the apply() method.
+
+         Args:
+             pts:
+                 A [*, 3] set of points
+         Returns:
+             [*, 3] inverse-rotated points
+         """
+         rot_mats = self.get_rot_mats()
+         inv_rot_mats = invert_rot_mat(rot_mats)
+         return rot_vec_mul(inv_rot_mats, pts)
+
+     def invert_apply_batch(self, pts: torch.Tensor) -> torch.Tensor:
+         """
+         The inverse of the apply_batch() method, applying the inverse rotation
+         to a batch of 3D coordinates.
+
+         Args:
+             pts:
+                 A [Batch, N, 3] set of points
+         Returns:
+             [Batch, N, 3] inverse-rotated points
+         """
+         rot_mats = self.get_rot_mats()  # Assume shape [Batch, 3, 3]
+         inv_rot_mats = rot_mats.transpose(
+             -1, -2
+         )  # Transpose to get the inverse rotation matrices
+         return rot_vec_mul_batch(inv_rot_mats, pts)
+
+     def invert(self) -> Rotation:
+         """
+         Returns the inverse of the current Rotation.
+
+         Returns:
+             The inverse of the current Rotation
+         """
+         if self._rot_mats is not None:
+             return Rotation(rot_mats=invert_rot_mat(self._rot_mats), quats=None)
+         elif self._quats is not None:
+             return Rotation(
+                 rot_mats=None,
+                 quats=invert_quat(self._quats),
+                 normalize_quats=False,
+             )
+         else:
+             raise ValueError("Both rotations are None")
+
+     # "Tensor" stuff
+
+     def unsqueeze(
+         self,
+         dim: int,
+     ) -> Rotation:
+         """
+         Analogous to torch.unsqueeze. The dimension is relative to the
+         shape of the Rotation object.
+
+         Args:
+             dim: A positive or negative dimension index.
+         Returns:
+             The unsqueezed Rotation.
+         """
+         if dim >= len(self.shape):
+             raise ValueError("Invalid dimension")
+
+         if self._rot_mats is not None:
+             rot_mats = self._rot_mats.unsqueeze(dim if dim >= 0 else dim - 2)
+             return Rotation(rot_mats=rot_mats, quats=None)
+         elif self._quats is not None:
+             quats = self._quats.unsqueeze(dim if dim >= 0 else dim - 1)
+             return Rotation(rot_mats=None, quats=quats, normalize_quats=False)
+         else:
+             raise ValueError("Both rotations are None")
+
+     @staticmethod
+     def cat(
+         rs: Sequence[Rotation],
+         dim: int,
+     ) -> Rotation:
+         """
+         Concatenates rotations along one of the batch dimensions. Analogous
+         to torch.cat().
+
+         Note that the output of this operation is always a rotation matrix,
+         regardless of the format of input rotations.
+
+         Args:
+             rs:
+                 A list of rotation objects
+             dim:
+                 The dimension along which the rotations should be
+                 concatenated
+         Returns:
+             A concatenated Rotation object in rotation matrix format
+         """
+         rot_mats = [r.get_rot_mats() for r in rs]
+         rot_mats = torch.cat(rot_mats, dim=dim if dim >= 0 else dim - 2)
+
+         return Rotation(rot_mats=rot_mats, quats=None)
+
+     def map_tensor_fn(self, fn: Callable[[torch.Tensor], torch.Tensor]) -> Rotation:
+         """
+         Apply a Tensor -> Tensor function to underlying rotation tensors,
+         mapping over the rotation dimension(s). Can be used e.g. to sum out
+         a one-hot batch dimension.
+
+         Args:
+             fn:
+                 A Tensor -> Tensor function to be mapped over the Rotation
+         Returns:
+             The transformed Rotation object
+         """
+         if self._rot_mats is not None:
+             rot_mats = self._rot_mats.view(self._rot_mats.shape[:-2] + (9,))
+             rot_mats = torch.stack(
+                 list(map(fn, torch.unbind(rot_mats, dim=-1))), dim=-1
+             )
+             rot_mats = rot_mats.view(rot_mats.shape[:-1] + (3, 3))
+             return Rotation(rot_mats=rot_mats, quats=None)
+         elif self._quats is not None:
+             quats = torch.stack(
+                 list(map(fn, torch.unbind(self._quats, dim=-1))), dim=-1
+             )
+             return Rotation(rot_mats=None, quats=quats, normalize_quats=False)
+         else:
+             raise ValueError("Both rotations are None")
+
+     def cuda(self) -> Rotation:
+         """
+         Analogous to the cuda() method of torch Tensors
+
+         Returns:
+             A copy of the Rotation in CUDA memory
+         """
+         if self._rot_mats is not None:
+             return Rotation(rot_mats=self._rot_mats.cuda(), quats=None)
+         elif self._quats is not None:
+             return Rotation(
+                 rot_mats=None, quats=self._quats.cuda(), normalize_quats=False
+             )
+         else:
+             raise ValueError("Both rotations are None")
+
+     def to(
+         self, device: Optional[torch.device], dtype: Optional[torch.dtype]
+     ) -> Rotation:
+         """
+         Analogous to the to() method of torch Tensors
+
+         Args:
+             device:
+                 A torch device
+             dtype:
+                 A torch dtype
+         Returns:
+             A copy of the Rotation using the new device and dtype
+         """
+         if self._rot_mats is not None:
+             return Rotation(
+                 rot_mats=self._rot_mats.to(device=device, dtype=dtype),
+                 quats=None,
+             )
+         elif self._quats is not None:
+             return Rotation(
+                 rot_mats=None,
+                 quats=self._quats.to(device=device, dtype=dtype),
+                 normalize_quats=False,
+             )
+         else:
+             raise ValueError("Both rotations are None")
+
+     def detach(self) -> Rotation:
+         """
+         Returns a copy of the Rotation whose underlying Tensor has been
+         detached from its torch graph.
+
+         Returns:
+             A copy of the Rotation whose underlying Tensor has been detached
+             from its torch graph
+         """
+         if self._rot_mats is not None:
+             return Rotation(rot_mats=self._rot_mats.detach(), quats=None)
+         elif self._quats is not None:
+             return Rotation(
+                 rot_mats=None,
+                 quats=self._quats.detach(),
+                 normalize_quats=False,
+             )
+         else:
+             raise ValueError("Both rotations are None")
+
+
+ class Rigid:
+     """
+     A class representing a rigid transformation. Little more than a wrapper
+     around two objects: a Rotation object and a [*, 3] translation tensor.
+     Designed to behave approximately like a single torch tensor with the
+     shape of the shared batch dimensions of its component parts.
+     """
873
+
874
+ def __init__(
875
+ self,
876
+ rots: Optional[Rotation],
877
+ trans: Optional[torch.Tensor],
878
+ ):
879
+ """
880
+ Args:
881
+ rots: A [*, 3, 3] rotation tensor
882
+ trans: A corresponding [*, 3] translation tensor
883
+ """
884
+ # (we need device, dtype, etc. from at least one input)
885
+
886
+ batch_dims, dtype, device, requires_grad = None, None, None, None
887
+ if trans is not None:
888
+ batch_dims = trans.shape[:-1]
889
+ dtype = trans.dtype
890
+ device = trans.device
891
+ requires_grad = trans.requires_grad
892
+ elif rots is not None:
893
+ batch_dims = rots.shape
894
+ dtype = rots.dtype
895
+ device = rots.device
896
+ requires_grad = rots.requires_grad
897
+ else:
898
+ raise ValueError("At least one input argument must be specified")
899
+
900
+ if rots is None:
901
+ rots = Rotation.identity(
902
+ batch_dims,
903
+ dtype,
904
+ device,
905
+ requires_grad,
906
+ )
907
+ elif trans is None:
908
+ trans = identity_trans(
909
+ batch_dims,
910
+ dtype,
911
+ device,
912
+ requires_grad,
913
+ )
914
+
915
+ if (rots.shape != trans.shape[:-1]) or (rots.device != trans.device):
916
+ raise ValueError("Rots and trans incompatible")
917
+
918
+ # Force full precision. Happens to the rotations automatically.
919
+ trans = trans.to(dtype=torch.float32)
920
+
921
+ self._rots = rots
922
+ self._trans = trans
923
+
924
+ @staticmethod
925
+ def identity(
926
+ shape: Tuple[int],
927
+ dtype: Optional[torch.dtype] = None,
928
+ device: Optional[torch.device] = None,
929
+ requires_grad: bool = True,
930
+ fmt: str = "quat",
931
+ ) -> Rigid:
932
+ """
933
+ Constructs an identity transformation.
934
+
935
+ Args:
936
+ shape:
937
+ The desired shape
938
+ dtype:
939
+ The dtype of both internal tensors
940
+ device:
941
+ The device of both internal tensors
942
+ requires_grad:
943
+ Whether grad should be enabled for the internal tensors
944
+ Returns:
945
+ The identity transformation
946
+ """
947
+ return Rigid(
948
+ Rotation.identity(shape, dtype, device, requires_grad, fmt=fmt),
949
+ identity_trans(shape, dtype, device, requires_grad),
950
+ )
951
+
952
+ def __getitem__(
953
+ self,
954
+ index: Any,
955
+ ) -> Rigid:
956
+ """
957
+ Indexes the affine transformation with PyTorch-style indices.
958
+ The index is applied to the shared dimensions of both the rotation
959
+ and the translation.
960
+
961
+ E.g.::
962
+
963
+ r = Rotation(rot_mats=torch.rand(10, 10, 3, 3), quats=None)
964
+ t = Rigid(r, torch.rand(10, 10, 3))
965
+ indexed = t[3, 4:6]
966
+ assert(indexed.shape == (2,))
967
+ assert(indexed.get_rots().shape == (2,))
968
+ assert(indexed.get_trans().shape == (2, 3))
969
+
970
+ Args:
971
+ index: A standard torch tensor index. E.g. 8, (10, None, 3),
972
+ or (3, slice(0, 1, None))
973
+ Returns:
974
+ The indexed tensor
975
+ """
976
+ if type(index) != tuple:
977
+ index = (index,)
978
+
979
+ return Rigid(
980
+ self._rots[index],
981
+ self._trans[index + (slice(None),)],
982
+ )
983
+
+     def __mul__(
+         self,
+         right: torch.Tensor,
+     ) -> Rigid:
+         """
+         Pointwise left multiplication of the transformation with a tensor.
+         Can be used to e.g. mask the Rigid.
+
+         Args:
+             right:
+                 The tensor multiplicand
+         Returns:
+             The product
+         """
+         if not isinstance(right, torch.Tensor):
+             raise TypeError("The other multiplicand must be a Tensor")
+
+         new_rots = self._rots * right
+         new_trans = self._trans * right[..., None]
+
+         return Rigid(new_rots, new_trans)
+
+     def __rmul__(
+         self,
+         left: torch.Tensor,
+     ) -> Rigid:
+         """
+         Reverse pointwise multiplication of the transformation with a
+         tensor.
+
+         Args:
+             left:
+                 The left multiplicand
+         Returns:
+             The product
+         """
+         return self.__mul__(left)
+
+     @property
+     def shape(self) -> torch.Size:
+         """
+         Returns the shape of the shared dimensions of the rotation and
+         the translation.
+
+         Returns:
+             The shape of the transformation
+         """
+         return self._trans.shape[:-1]
+
+     @property
+     def device(self) -> torch.device:
+         """
+         Returns the device on which the Rigid's tensors are located.
+
+         Returns:
+             The device on which the Rigid's tensors are located
+         """
+         return self._trans.device
+
+     @property
+     def dtype(self) -> torch.dtype:
+         """
+         Returns the dtype of the Rigid tensors.
+
+         Returns:
+             The dtype of the Rigid tensors
+         """
+         return self._rots.dtype
+
+     def get_rots(self) -> Rotation:
+         """
+         Getter for the rotation.
+
+         Returns:
+             The rotation object
+         """
+         return self._rots
+
+     def get_trans(self) -> torch.Tensor:
+         """
+         Getter for the translation.
+
+         Returns:
+             The stored translation
+         """
+         return self._trans
+
+     def compose_q_update_vec(
+         self,
+         q_update_vec: torch.Tensor,
+     ) -> Rigid:
+         """
+         Composes the transformation with a quaternion update vector of
+         shape [*, 6], where the first 3 columns represent the x, y, and
+         z values of a quaternion of form (1, x, y, z) and the final 3
+         columns represent a 3D translation.
+
+         Args:
+             q_update_vec: The quaternion update vector.
+         Returns:
+             The composed transformation.
+         """
+         q_vec, t_vec = q_update_vec[..., :3], q_update_vec[..., 3:]
+         new_rots = self._rots.compose_q_update_vec(q_vec)
+
+         trans_update = self._rots.apply(t_vec)
+         new_translation = self._trans + trans_update
+
+         return Rigid(new_rots, new_translation)
+
+     def compose(
+         self,
+         r: Rigid,
+     ) -> Rigid:
+         """
+         Composes the current rigid object with another.
+
+         Args:
+             r:
+                 Another Rigid object
+         Returns:
+             The composition of the two transformations
+         """
+         new_rot = self._rots.compose_r(r._rots)
+         new_trans = self._rots.apply(r._trans) + self._trans
+         return Rigid(new_rot, new_trans)
+
+     def apply(
+         self,
+         pts: torch.Tensor,
+     ) -> torch.Tensor:
+         """
+         Applies the transformation to a coordinate tensor.
+
+         Args:
+             pts: A [*, 3] coordinate tensor.
+         Returns:
+             The transformed points.
+         """
+         rotated = self._rots.apply(pts)
+         return rotated + self._trans
+
+     def invert_apply(self, pts: torch.Tensor) -> torch.Tensor:
+         """
+         Applies the inverse of the transformation to a coordinate tensor.
+
+         Args:
+             pts: A [*, 3] coordinate tensor
+         Returns:
+             The transformed points.
+         """
+         pts = pts - self._trans
+         return self._rots.invert_apply(pts)
+
+     def invert(self) -> Rigid:
+         """
+         Inverts the transformation.
+
+         Returns:
+             The inverse transformation.
+         """
+         rot_inv = self._rots.invert()
+         trn_inv = rot_inv.apply(self._trans)
+
+         return Rigid(rot_inv, -1 * trn_inv)
+
+     def map_tensor_fn(self, fn: Callable[[torch.Tensor], torch.Tensor]) -> Rigid:
+         """
+         Apply a Tensor -> Tensor function to underlying translation and
+         rotation tensors, mapping over the translation/rotation dimensions
+         respectively.
+
+         Args:
+             fn:
+                 A Tensor -> Tensor function to be mapped over the Rigid
+         Returns:
+             The transformed Rigid object
+         """
+         new_rots = self._rots.map_tensor_fn(fn)
+         new_trans = torch.stack(
+             list(map(fn, torch.unbind(self._trans, dim=-1))), dim=-1
+         )
+
+         return Rigid(new_rots, new_trans)
+
+     def to_tensor_4x4(self) -> torch.Tensor:
+         """
+         Converts a transformation to a homogeneous transformation tensor.
+
+         Returns:
+             A [*, 4, 4] homogeneous transformation tensor
+         """
+         tensor = self._trans.new_zeros((*self.shape, 4, 4))
+         tensor[..., :3, :3] = self._rots.get_rot_mats()
+         tensor[..., :3, 3] = self._trans
+         tensor[..., 3, 3] = 1
+         return tensor
+
+     @staticmethod
+     def from_tensor_4x4(t: torch.Tensor) -> Rigid:
+         """
+         Constructs a transformation from a homogeneous transformation
+         tensor.
+
+         Args:
+             t: [*, 4, 4] homogeneous transformation tensor
+         Returns:
+             A Rigid object with shape [*]
+         """
+         if t.shape[-2:] != (4, 4):
+             raise ValueError("Incorrectly shaped input tensor")
+
+         rots = Rotation(rot_mats=t[..., :3, :3], quats=None)
+         trans = t[..., :3, 3]
+
+         return Rigid(rots, trans)
+
+     def to_tensor_7(self) -> torch.Tensor:
+         """
+         Converts a transformation to a tensor with 7 final columns, four
+         for the quaternion followed by three for the translation.
+
+         Returns:
+             A [*, 7] tensor representation of the transformation
+         """
+         tensor = self._trans.new_zeros((*self.shape, 7))
+         tensor[..., :4] = self._rots.get_quats()
+         tensor[..., 4:] = self._trans
+
+         return tensor
+
+     @staticmethod
+     def from_tensor_7(
+         t: torch.Tensor,
+         normalize_quats: bool = False,
+     ) -> Rigid:
+         """
+         Constructs a transformation from a [*, 7] tensor of a quaternion
+         followed by a translation, the inverse of to_tensor_7.
+
+         Args:
+             t: A [*, 7] tensor
+             normalize_quats: Whether to normalize the quaternion part
+         Returns:
+             A Rigid object with shape [*]
+         """
+         if t.shape[-1] != 7:
+             raise ValueError("Incorrectly shaped input tensor")
+
+         quats, trans = t[..., :4], t[..., 4:]
+
+         rots = Rotation(rot_mats=None, quats=quats, normalize_quats=normalize_quats)
+
+         return Rigid(rots, trans)
+
+     @staticmethod
+     def from_3_points(
+         p_neg_x_axis: torch.Tensor,
+         origin: torch.Tensor,
+         p_xy_plane: torch.Tensor,
+         eps: float = 1e-8,
+     ) -> Rigid:
+         """
+         Implements algorithm 21. Constructs transformations from sets of 3
+         points using the Gram-Schmidt algorithm.
+
+         Args:
+             p_neg_x_axis: [*, 3] coordinates
+             origin: [*, 3] coordinates used as frame origins
+             p_xy_plane: [*, 3] coordinates
+             eps: Small epsilon value
+         Returns:
+             A transformation object of shape [*]
+         """
+         p_neg_x_axis = torch.unbind(p_neg_x_axis, dim=-1)
+         origin = torch.unbind(origin, dim=-1)
+         p_xy_plane = torch.unbind(p_xy_plane, dim=-1)
+
+         e0 = [c1 - c2 for c1, c2 in zip(origin, p_neg_x_axis)]
+         e1 = [c1 - c2 for c1, c2 in zip(p_xy_plane, origin)]
+
+         denom = torch.sqrt(sum(c * c for c in e0) + eps)
+         e0 = [c / denom for c in e0]
+         dot = sum(c1 * c2 for c1, c2 in zip(e0, e1))
+         e1 = [c2 - c1 * dot for c1, c2 in zip(e0, e1)]
+         denom = torch.sqrt(sum(c * c for c in e1) + eps)
+         e1 = [c / denom for c in e1]
+         e2 = [
+             e0[1] * e1[2] - e0[2] * e1[1],
+             e0[2] * e1[0] - e0[0] * e1[2],
+             e0[0] * e1[1] - e0[1] * e1[0],
+         ]
+
+         rots = torch.stack([c for tup in zip(e0, e1, e2) for c in tup], dim=-1)
+         rots = rots.reshape(rots.shape[:-1] + (3, 3))
+
+         rot_obj = Rotation(rot_mats=rots, quats=None)
+
+         return Rigid(rot_obj, torch.stack(origin, dim=-1))
+
+     def unsqueeze(
+         self,
+         dim: int,
+     ) -> Rigid:
+         """
+         Analogous to torch.unsqueeze. The dimension is relative to the
+         shared dimensions of the rotation/translation.
+
+         Args:
+             dim: A positive or negative dimension index.
+         Returns:
+             The unsqueezed transformation.
+         """
+         if dim >= len(self.shape):
+             raise ValueError("Invalid dimension")
+         rots = self._rots.unsqueeze(dim)
+         trans = self._trans.unsqueeze(dim if dim >= 0 else dim - 1)
+
+         return Rigid(rots, trans)
+
+     @staticmethod
+     def cat(
+         ts: Sequence[Rigid],
+         dim: int,
+     ) -> Rigid:
+         """
+         Concatenates transformations along an existing dimension.
+
+         Args:
+             ts:
+                 A list of Rigid objects
+             dim:
+                 The dimension along which the transformations should be
+                 concatenated
+         Returns:
+             A concatenated transformation object
+         """
+         rots = Rotation.cat([t._rots for t in ts], dim)
+         trans = torch.cat([t._trans for t in ts], dim=dim if dim >= 0 else dim - 1)
+
+         return Rigid(rots, trans)
+
+     def apply_rot_fn(self, fn: Callable[[Rotation], Rotation]) -> Rigid:
+         """
+         Applies a Rotation -> Rotation function to the stored rotation
+         object.
+
+         Args:
+             fn: A function of type Rotation -> Rotation
+         Returns:
+             A transformation object with a transformed rotation.
+         """
+         return Rigid(fn(self._rots), self._trans)
+
+     def apply_trans_fn(self, fn: Callable[[torch.Tensor], torch.Tensor]) -> Rigid:
+         """
+         Applies a Tensor -> Tensor function to the stored translation.
+
+         Args:
+             fn:
+                 A function of type Tensor -> Tensor to be applied to the
+                 translation
+         Returns:
+             A transformation object with a transformed translation.
+         """
+         return Rigid(self._rots, fn(self._trans))
+
+     def scale_translation(self, trans_scale_factor: float) -> Rigid:
+         """
+         Scales the translation by a constant factor.
+
+         Args:
+             trans_scale_factor:
+                 The constant factor
+         Returns:
+             A transformation object with a scaled translation.
+         """
+         return self.apply_trans_fn(lambda t: t * trans_scale_factor)
+
+     def stop_rot_gradient(self) -> Rigid:
+         """
+         Detaches the underlying rotation object.
+
+         Returns:
+             A transformation object with detached rotations
+         """
+         return self.apply_rot_fn(lambda r: r.detach())
+
+     @staticmethod
+     def make_transform_from_reference(n_xyz, ca_xyz, c_xyz, eps=1e-20):
+         """
+         Returns a transformation object from reference coordinates.
+
+         Note that this method does not take care of symmetries. If you
+         provide the atom positions in the non-standard way, the N atom will
+         end up not at [-0.527250, 1.359329, 0.0] but instead at
+         [-0.527250, -1.359329, 0.0]. You need to take care of such cases in
+         your code.
+
+         Args:
+             n_xyz: A [*, 3] tensor of nitrogen xyz coordinates.
+             ca_xyz: A [*, 3] tensor of carbon alpha xyz coordinates.
+             c_xyz: A [*, 3] tensor of carbon xyz coordinates.
+         Returns:
+             A transformation object. After applying the translation and
+             rotation to the reference backbone, the coordinates will
+             approximately equal the input coordinates.
+         """
+         translation = -1 * ca_xyz
+         n_xyz = n_xyz + translation
+         c_xyz = c_xyz + translation
+
+         c_x, c_y, c_z = [c_xyz[..., i] for i in range(3)]
+         norm = torch.sqrt(eps + c_x**2 + c_y**2)
+         sin_c1 = -c_y / norm
+         cos_c1 = c_x / norm
+
+         c1_rots = sin_c1.new_zeros((*sin_c1.shape, 3, 3))
+         c1_rots[..., 0, 0] = cos_c1
+         c1_rots[..., 0, 1] = -1 * sin_c1
+         c1_rots[..., 1, 0] = sin_c1
+         c1_rots[..., 1, 1] = cos_c1
+         c1_rots[..., 2, 2] = 1
+
+         norm = torch.sqrt(eps + c_x**2 + c_y**2 + c_z**2)
+         sin_c2 = c_z / norm
+         cos_c2 = torch.sqrt(c_x**2 + c_y**2) / norm
+
+         c2_rots = sin_c2.new_zeros((*sin_c2.shape, 3, 3))
+         c2_rots[..., 0, 0] = cos_c2
+         c2_rots[..., 0, 2] = sin_c2
+         c2_rots[..., 1, 1] = 1
+         c2_rots[..., 2, 0] = -1 * sin_c2
+         c2_rots[..., 2, 2] = cos_c2
+
+         c_rots = rot_matmul(c2_rots, c1_rots)
+         n_xyz = rot_vec_mul(c_rots, n_xyz)
+
+         _, n_y, n_z = [n_xyz[..., i] for i in range(3)]
+         norm = torch.sqrt(eps + n_y**2 + n_z**2)
+         sin_n = -n_z / norm
+         cos_n = n_y / norm
+
+         n_rots = sin_c2.new_zeros((*sin_c2.shape, 3, 3))
+         n_rots[..., 0, 0] = 1
+         n_rots[..., 1, 1] = cos_n
+         n_rots[..., 1, 2] = -1 * sin_n
+         n_rots[..., 2, 1] = sin_n
+         n_rots[..., 2, 2] = cos_n
+
+         rots = rot_matmul(n_rots, c_rots)
+
+         rots = rots.transpose(-1, -2)
+         translation = -1 * translation
+
+         rot_obj = Rotation(rot_mats=rots, quats=None)
+
+         return Rigid(rot_obj, translation)
+
+     def cuda(self) -> Rigid:
+         """
+         Moves the transformation object to GPU memory.
+
+         Returns:
+             A version of the transformation on GPU
+         """
+         return Rigid(self._rots.cuda(), self._trans.cuda())
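To make the transform conventions above concrete, here is a minimal pure-Python sketch of the math behind `Rigid.apply` and `Rigid.invert_apply` (x -> R x + t, and its inverse x -> R^T (x - t)). This is illustrative only, not the PETIMOT or OpenFold API; `apply_rigid` and `invert_apply_rigid` are hypothetical names, and plain lists stand in for torch tensors:

```python
import math

def mat_vec(m, v):
    # Multiply a 3x3 matrix (list of rows) by a 3-vector.
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def transpose(m):
    return [[m[j][i] for j in range(3)] for i in range(3)]

def apply_rigid(rot, trans, pt):
    # x -> R x + t, as in Rigid.apply.
    rx = mat_vec(rot, pt)
    return [rx[i] + trans[i] for i in range(3)]

def invert_apply_rigid(rot, trans, pt):
    # x -> R^T (x - t), as in Rigid.invert_apply (R^T = R^-1 for rotations).
    shifted = [pt[i] - trans[i] for i in range(3)]
    return mat_vec(transpose(rot), shifted)

# A 90-degree rotation about z, plus a translation.
theta = math.pi / 2
rot = [[math.cos(theta), -math.sin(theta), 0.0],
       [math.sin(theta),  math.cos(theta), 0.0],
       [0.0,              0.0,             1.0]]
trans = [1.0, 2.0, 3.0]

p = [1.0, 0.0, 0.0]
q = apply_rigid(rot, trans, p)           # forward transform
p_back = invert_apply_rigid(rot, trans, q)  # round-trips back to p
```

The round trip `invert_apply_rigid(rot, trans, apply_rigid(rot, trans, p))` recovers `p` up to floating-point error, which is the invariant the class maintains in batched, differentiable form.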
petimot/utils/seeding.py ADDED
@@ -0,0 +1,42 @@
+ import os
+ import random
+
+ import numpy as np
+ import torch
+ import torch_geometric
+
+
+ def set_seed(seed: int, deterministic_algorithms: bool = False):
+     """
+     Set seeds for reproducibility and optionally enable deterministic algorithms.
+
+     Args:
+         seed: Integer seed for reproducibility
+         deterministic_algorithms: If True, enables deterministic CUDA algorithms
+             and disables cuDNN benchmarking. This will impact performance.
+     """
+     # Basic seeding
+     random.seed(seed)
+     np.random.seed(seed)
+     torch.manual_seed(seed)
+     if torch.cuda.is_available():
+         torch.cuda.manual_seed_all(seed)
+
+     # PyG-specific seeding
+     torch_geometric.seed_everything(seed)
+
+     if deterministic_algorithms:
+         # These settings impact performance but ensure reproducibility
+         torch.backends.cudnn.deterministic = True
+         torch.backends.cudnn.benchmark = False
+         os.environ["CUBLAS_WORKSPACE_CONFIG"] = ":4096:8"
+         torch.use_deterministic_algorithms(True)
+     else:
+         # Better-performance settings
+         torch.backends.cudnn.deterministic = False
+         torch.backends.cudnn.benchmark = True
+         os.environ.pop("CUBLAS_WORKSPACE_CONFIG", None)
+         torch.use_deterministic_algorithms(False)
+
+     print(f"Set seed to {seed}")
+     print(f"Deterministic mode: {deterministic_algorithms}")
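The seeding helper rests on the usual RNG contract: the same seed reproduces the same draws. A stdlib-only sketch of that contract (`draw_after_seed` is an illustrative name; `set_seed` additionally seeds NumPy, PyTorch, and PyTorch Geometric, which follow the same pattern):

```python
import random

def draw_after_seed(seed, n=5):
    # Seed the stdlib RNG, then draw n floats.
    random.seed(seed)
    return [random.random() for _ in range(n)]

run_a = draw_after_seed(42)
run_b = draw_after_seed(42)  # same seed -> identical sequence
run_c = draw_after_seed(43)  # different seed -> (almost surely) different sequence
```

Note that identical seeds only guarantee identical results on GPU when the deterministic-algorithms path is also enabled, which is why `set_seed` exposes that flag separately.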
requirements.txt ADDED
@@ -0,0 +1,8 @@
+ torch>=2.0.0
+ torch_geometric>=2.0.0
+ wandb>=0.19.0
+ transformers==4.48.3
+ sentencepiece==0.2.0
+ tqdm>=4.65.0
+ scipy>=1.13.0
+ typer==0.15.1