insomnia7 committed
Commit a91c09f · verified · Parent: aa120f9

Upload folder using huggingface_hub

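The commit message indicates the folder was pushed with the huggingface_hub client. As a rough sketch, an upload like this one can be reproduced with upload_folder; the repo id and local path below are placeholder assumptions, not values recorded in this commit:

from huggingface_hub import upload_folder

# Minimal sketch: push a local folder to the Hub as a single commit.
# "user/repo" and "./folder" are placeholders.
upload_folder(
    repo_id="user/repo",
    folder_path="./folder",
    commit_message="Upload folder using huggingface_hub",
)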
.gitattributes CHANGED
@@ -33,3 +33,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
  *.zip filter=lfs diff=lfs merge=lfs -text
  *.zst filter=lfs diff=lfs merge=lfs -text
  *tfevents* filter=lfs diff=lfs merge=lfs -text
+ tokenizer.json filter=lfs diff=lfs merge=lfs -text
added_tokens.json ADDED
@@ -0,0 +1,2855 @@
+ {
+ "</think>": 151668,
+ "</tool_call>": 151658,
+ "</tool_response>": 151666,
+ "<mask0>": 153984,
+ "<mask100>": 154084,
+ "<mask101>": 154085,
+ "<mask102>": 154086,
+ "<mask103>": 154087,
+ "<mask104>": 154088,
+ "<mask105>": 154089,
+ "<mask106>": 154090,
+ "<mask107>": 154091,
+ "<mask108>": 154092,
+ "<mask109>": 154093,
+ "<mask10>": 153994,
+ "<mask110>": 154094,
+ "<mask111>": 154095,
+ "<mask112>": 154096,
+ "<mask113>": 154097,
+ "<mask114>": 154098,
+ "<mask115>": 154099,
+ "<mask116>": 154100,
+ "<mask117>": 154101,
+ "<mask118>": 154102,
+ "<mask119>": 154103,
+ "<mask11>": 153995,
+ "<mask120>": 154104,
+ "<mask121>": 154105,
+ "<mask122>": 154106,
+ "<mask123>": 154107,
+ "<mask124>": 154108,
+ "<mask125>": 154109,
+ "<mask126>": 154110,
+ "<mask127>": 154111,
+ "<mask128>": 154112,
+ "<mask129>": 154113,
+ "<mask12>": 153996,
+ "<mask130>": 154114,
+ "<mask131>": 154115,
+ "<mask132>": 154116,
+ "<mask133>": 154117,
+ "<mask134>": 154118,
+ "<mask135>": 154119,
+ "<mask136>": 154120,
+ "<mask137>": 154121,
+ "<mask138>": 154122,
+ "<mask139>": 154123,
+ "<mask13>": 153997,
+ "<mask140>": 154124,
+ "<mask141>": 154125,
+ "<mask142>": 154126,
+ "<mask143>": 154127,
+ "<mask144>": 154128,
+ "<mask145>": 154129,
+ "<mask146>": 154130,
+ "<mask147>": 154131,
+ "<mask148>": 154132,
+ "<mask149>": 154133,
+ "<mask14>": 153998,
+ "<mask150>": 154134,
+ "<mask151>": 154135,
+ "<mask152>": 154136,
+ "<mask153>": 154137,
+ "<mask154>": 154138,
+ "<mask155>": 154139,
+ "<mask156>": 154140,
+ "<mask157>": 154141,
+ "<mask158>": 154142,
+ "<mask159>": 154143,
+ "<mask15>": 153999,
+ "<mask160>": 154144,
+ "<mask161>": 154145,
+ "<mask162>": 154146,
+ "<mask163>": 154147,
+ "<mask164>": 154148,
+ "<mask165>": 154149,
+ "<mask166>": 154150,
+ "<mask167>": 154151,
+ "<mask168>": 154152,
+ "<mask169>": 154153,
+ "<mask16>": 154000,
+ "<mask170>": 154154,
+ "<mask171>": 154155,
+ "<mask172>": 154156,
+ "<mask173>": 154157,
+ "<mask174>": 154158,
+ "<mask175>": 154159,
+ "<mask176>": 154160,
+ "<mask177>": 154161,
+ "<mask178>": 154162,
+ "<mask179>": 154163,
+ "<mask17>": 154001,
+ "<mask180>": 154164,
+ "<mask181>": 154165,
+ "<mask182>": 154166,
+ "<mask183>": 154167,
+ "<mask184>": 154168,
+ "<mask185>": 154169,
+ "<mask186>": 154170,
+ "<mask187>": 154171,
+ "<mask188>": 154172,
+ "<mask189>": 154173,
+ "<mask18>": 154002,
+ "<mask190>": 154174,
+ "<mask191>": 154175,
+ "<mask192>": 154176,
+ "<mask193>": 154177,
+ "<mask194>": 154178,
+ "<mask195>": 154179,
+ "<mask196>": 154180,
+ "<mask197>": 154181,
+ "<mask198>": 154182,
+ "<mask199>": 154183,
+ "<mask19>": 154003,
+ "<mask1>": 153985,
+ "<mask200>": 154184,
+ "<mask201>": 154185,
+ "<mask202>": 154186,
+ "<mask203>": 154187,
+ "<mask204>": 154188,
+ "<mask205>": 154189,
+ "<mask206>": 154190,
+ "<mask207>": 154191,
+ "<mask208>": 154192,
+ "<mask209>": 154193,
+ "<mask20>": 154004,
+ "<mask210>": 154194,
+ "<mask211>": 154195,
+ "<mask212>": 154196,
+ "<mask213>": 154197,
+ "<mask214>": 154198,
+ "<mask215>": 154199,
+ "<mask216>": 154200,
+ "<mask217>": 154201,
+ "<mask218>": 154202,
+ "<mask219>": 154203,
+ "<mask21>": 154005,
+ "<mask220>": 154204,
+ "<mask221>": 154205,
+ "<mask222>": 154206,
+ "<mask223>": 154207,
+ "<mask224>": 154208,
+ "<mask225>": 154209,
+ "<mask226>": 154210,
+ "<mask227>": 154211,
+ "<mask228>": 154212,
+ "<mask229>": 154213,
+ "<mask22>": 154006,
+ "<mask230>": 154214,
+ "<mask231>": 154215,
+ "<mask232>": 154216,
+ "<mask233>": 154217,
+ "<mask234>": 154218,
+ "<mask235>": 154219,
+ "<mask236>": 154220,
+ "<mask237>": 154221,
+ "<mask238>": 154222,
+ "<mask239>": 154223,
+ "<mask23>": 154007,
+ "<mask240>": 154224,
+ "<mask241>": 154225,
+ "<mask242>": 154226,
+ "<mask243>": 154227,
+ "<mask244>": 154228,
+ "<mask245>": 154229,
+ "<mask246>": 154230,
+ "<mask247>": 154231,
+ "<mask248>": 154232,
+ "<mask249>": 154233,
+ "<mask24>": 154008,
+ "<mask250>": 154234,
+ "<mask251>": 154235,
+ "<mask252>": 154236,
+ "<mask253>": 154237,
+ "<mask254>": 154238,
+ "<mask255>": 154239,
+ "<mask256>": 154240,
+ "<mask257>": 154241,
+ "<mask258>": 154242,
+ "<mask259>": 154243,
+ "<mask25>": 154009,
+ "<mask260>": 154244,
+ "<mask261>": 154245,
+ "<mask262>": 154246,
+ "<mask263>": 154247,
+ "<mask264>": 154248,
+ "<mask265>": 154249,
+ "<mask266>": 154250,
+ "<mask267>": 154251,
+ "<mask268>": 154252,
+ "<mask269>": 154253,
+ "<mask26>": 154010,
+ "<mask270>": 154254,
+ "<mask271>": 154255,
+ "<mask272>": 154256,
+ "<mask273>": 154257,
+ "<mask274>": 154258,
+ "<mask275>": 154259,
+ "<mask276>": 154260,
+ "<mask277>": 154261,
+ "<mask278>": 154262,
+ "<mask279>": 154263,
+ "<mask27>": 154011,
+ "<mask280>": 154264,
+ "<mask281>": 154265,
+ "<mask282>": 154266,
+ "<mask283>": 154267,
+ "<mask284>": 154268,
+ "<mask285>": 154269,
+ "<mask286>": 154270,
+ "<mask287>": 154271,
+ "<mask288>": 154272,
+ "<mask289>": 154273,
+ "<mask28>": 154012,
+ "<mask290>": 154274,
+ "<mask291>": 154275,
+ "<mask292>": 154276,
+ "<mask293>": 154277,
+ "<mask294>": 154278,
+ "<mask295>": 154279,
+ "<mask296>": 154280,
+ "<mask297>": 154281,
+ "<mask298>": 154282,
+ "<mask299>": 154283,
+ "<mask29>": 154013,
+ "<mask2>": 153986,
+ "<mask300>": 154284,
+ "<mask301>": 154285,
+ "<mask302>": 154286,
+ "<mask303>": 154287,
+ "<mask304>": 154288,
+ "<mask305>": 154289,
+ "<mask306>": 154290,
+ "<mask307>": 154291,
+ "<mask308>": 154292,
+ "<mask309>": 154293,
+ "<mask30>": 154014,
+ "<mask310>": 154294,
+ "<mask311>": 154295,
+ "<mask312>": 154296,
+ "<mask313>": 154297,
+ "<mask314>": 154298,
+ "<mask315>": 154299,
+ "<mask316>": 154300,
+ "<mask317>": 154301,
+ "<mask318>": 154302,
+ "<mask319>": 154303,
+ "<mask31>": 154015,
+ "<mask320>": 154304,
+ "<mask321>": 154305,
+ "<mask322>": 154306,
+ "<mask323>": 154307,
+ "<mask324>": 154308,
+ "<mask325>": 154309,
+ "<mask326>": 154310,
+ "<mask327>": 154311,
+ "<mask328>": 154312,
+ "<mask329>": 154313,
+ "<mask32>": 154016,
+ "<mask330>": 154314,
+ "<mask331>": 154315,
+ "<mask332>": 154316,
+ "<mask333>": 154317,
+ "<mask334>": 154318,
+ "<mask335>": 154319,
+ "<mask336>": 154320,
+ "<mask337>": 154321,
+ "<mask338>": 154322,
+ "<mask339>": 154323,
+ "<mask33>": 154017,
+ "<mask340>": 154324,
+ "<mask341>": 154325,
+ "<mask342>": 154326,
+ "<mask343>": 154327,
+ "<mask344>": 154328,
+ "<mask345>": 154329,
+ "<mask346>": 154330,
+ "<mask347>": 154331,
+ "<mask348>": 154332,
+ "<mask349>": 154333,
+ "<mask34>": 154018,
+ "<mask350>": 154334,
+ "<mask351>": 154335,
+ "<mask352>": 154336,
+ "<mask353>": 154337,
+ "<mask354>": 154338,
+ "<mask355>": 154339,
+ "<mask356>": 154340,
+ "<mask357>": 154341,
+ "<mask358>": 154342,
+ "<mask359>": 154343,
+ "<mask35>": 154019,
+ "<mask360>": 154344,
+ "<mask361>": 154345,
+ "<mask362>": 154346,
+ "<mask363>": 154347,
+ "<mask364>": 154348,
+ "<mask365>": 154349,
+ "<mask366>": 154350,
+ "<mask367>": 154351,
+ "<mask368>": 154352,
+ "<mask369>": 154353,
+ "<mask36>": 154020,
+ "<mask370>": 154354,
+ "<mask371>": 154355,
+ "<mask372>": 154356,
+ "<mask373>": 154357,
+ "<mask374>": 154358,
+ "<mask375>": 154359,
+ "<mask376>": 154360,
+ "<mask377>": 154361,
+ "<mask378>": 154362,
+ "<mask379>": 154363,
+ "<mask37>": 154021,
+ "<mask380>": 154364,
+ "<mask381>": 154365,
+ "<mask382>": 154366,
+ "<mask383>": 154367,
+ "<mask384>": 154368,
+ "<mask385>": 154369,
+ "<mask386>": 154370,
+ "<mask387>": 154371,
+ "<mask388>": 154372,
+ "<mask389>": 154373,
+ "<mask38>": 154022,
+ "<mask390>": 154374,
+ "<mask391>": 154375,
+ "<mask392>": 154376,
+ "<mask393>": 154377,
+ "<mask394>": 154378,
+ "<mask395>": 154379,
+ "<mask396>": 154380,
+ "<mask397>": 154381,
+ "<mask398>": 154382,
+ "<mask399>": 154383,
+ "<mask39>": 154023,
+ "<mask3>": 153987,
+ "<mask400>": 154384,
+ "<mask401>": 154385,
+ "<mask402>": 154386,
+ "<mask403>": 154387,
+ "<mask404>": 154388,
+ "<mask405>": 154389,
+ "<mask406>": 154390,
+ "<mask407>": 154391,
+ "<mask408>": 154392,
+ "<mask409>": 154393,
+ "<mask40>": 154024,
+ "<mask410>": 154394,
+ "<mask411>": 154395,
+ "<mask412>": 154396,
+ "<mask413>": 154397,
+ "<mask414>": 154398,
+ "<mask415>": 154399,
+ "<mask416>": 154400,
+ "<mask417>": 154401,
+ "<mask418>": 154402,
+ "<mask419>": 154403,
+ "<mask41>": 154025,
+ "<mask420>": 154404,
+ "<mask421>": 154405,
+ "<mask422>": 154406,
+ "<mask423>": 154407,
+ "<mask424>": 154408,
+ "<mask425>": 154409,
+ "<mask426>": 154410,
+ "<mask427>": 154411,
+ "<mask428>": 154412,
+ "<mask429>": 154413,
+ "<mask42>": 154026,
+ "<mask430>": 154414,
+ "<mask431>": 154415,
+ "<mask432>": 154416,
+ "<mask433>": 154417,
+ "<mask434>": 154418,
+ "<mask435>": 154419,
+ "<mask436>": 154420,
+ "<mask437>": 154421,
+ "<mask438>": 154422,
+ "<mask439>": 154423,
+ "<mask43>": 154027,
+ "<mask440>": 154424,
+ "<mask441>": 154425,
+ "<mask442>": 154426,
+ "<mask443>": 154427,
+ "<mask444>": 154428,
+ "<mask445>": 154429,
+ "<mask446>": 154430,
+ "<mask447>": 154431,
+ "<mask448>": 154432,
+ "<mask449>": 154433,
+ "<mask44>": 154028,
+ "<mask450>": 154434,
+ "<mask451>": 154435,
+ "<mask452>": 154436,
+ "<mask453>": 154437,
+ "<mask454>": 154438,
+ "<mask455>": 154439,
+ "<mask456>": 154440,
+ "<mask457>": 154441,
+ "<mask458>": 154442,
+ "<mask459>": 154443,
+ "<mask45>": 154029,
+ "<mask460>": 154444,
+ "<mask461>": 154445,
+ "<mask462>": 154446,
+ "<mask463>": 154447,
+ "<mask464>": 154448,
+ "<mask465>": 154449,
+ "<mask466>": 154450,
+ "<mask467>": 154451,
+ "<mask468>": 154452,
+ "<mask469>": 154453,
+ "<mask46>": 154030,
+ "<mask470>": 154454,
+ "<mask471>": 154455,
+ "<mask472>": 154456,
+ "<mask473>": 154457,
+ "<mask474>": 154458,
+ "<mask475>": 154459,
+ "<mask476>": 154460,
+ "<mask477>": 154461,
+ "<mask478>": 154462,
+ "<mask479>": 154463,
+ "<mask47>": 154031,
+ "<mask480>": 154464,
+ "<mask481>": 154465,
+ "<mask482>": 154466,
+ "<mask483>": 154467,
+ "<mask484>": 154468,
+ "<mask485>": 154469,
+ "<mask486>": 154470,
+ "<mask487>": 154471,
+ "<mask488>": 154472,
+ "<mask489>": 154473,
+ "<mask48>": 154032,
+ "<mask490>": 154474,
+ "<mask491>": 154475,
+ "<mask492>": 154476,
+ "<mask493>": 154477,
+ "<mask494>": 154478,
+ "<mask495>": 154479,
+ "<mask496>": 154480,
+ "<mask497>": 154481,
+ "<mask498>": 154482,
+ "<mask499>": 154483,
+ "<mask49>": 154033,
+ "<mask4>": 153988,
+ "<mask500>": 154484,
+ "<mask501>": 154485,
+ "<mask502>": 154486,
+ "<mask503>": 154487,
+ "<mask504>": 154488,
+ "<mask505>": 154489,
+ "<mask506>": 154490,
+ "<mask507>": 154491,
+ "<mask508>": 154492,
+ "<mask509>": 154493,
+ "<mask50>": 154034,
+ "<mask510>": 154494,
+ "<mask511>": 154495,
+ "<mask51>": 154035,
+ "<mask52>": 154036,
+ "<mask53>": 154037,
+ "<mask54>": 154038,
+ "<mask55>": 154039,
+ "<mask56>": 154040,
+ "<mask57>": 154041,
+ "<mask58>": 154042,
+ "<mask59>": 154043,
+ "<mask5>": 153989,
+ "<mask60>": 154044,
+ "<mask61>": 154045,
+ "<mask62>": 154046,
+ "<mask63>": 154047,
+ "<mask64>": 154048,
+ "<mask65>": 154049,
+ "<mask66>": 154050,
+ "<mask67>": 154051,
+ "<mask68>": 154052,
+ "<mask69>": 154053,
+ "<mask6>": 153990,
+ "<mask70>": 154054,
+ "<mask71>": 154055,
+ "<mask72>": 154056,
+ "<mask73>": 154057,
+ "<mask74>": 154058,
+ "<mask75>": 154059,
+ "<mask76>": 154060,
+ "<mask77>": 154061,
+ "<mask78>": 154062,
+ "<mask79>": 154063,
+ "<mask7>": 153991,
+ "<mask80>": 154064,
+ "<mask81>": 154065,
+ "<mask82>": 154066,
+ "<mask83>": 154067,
+ "<mask84>": 154068,
+ "<mask85>": 154069,
+ "<mask86>": 154070,
+ "<mask87>": 154071,
+ "<mask88>": 154072,
+ "<mask89>": 154073,
+ "<mask8>": 153992,
+ "<mask90>": 154074,
+ "<mask91>": 154075,
+ "<mask92>": 154076,
+ "<mask93>": 154077,
+ "<mask94>": 154078,
+ "<mask95>": 154079,
+ "<mask96>": 154080,
+ "<mask97>": 154081,
+ "<mask98>": 154082,
+ "<mask99>": 154083,
+ "<mask9>": 153993,
+ "<none0>": 151669,
+ "<none100>": 151769,
+ "<none101>": 151770,
+ "<none102>": 151771,
+ "<none103>": 151772,
+ "<none104>": 151773,
+ "<none105>": 151774,
+ "<none106>": 151775,
+ "<none107>": 151776,
+ "<none108>": 151777,
+ "<none109>": 151778,
+ "<none10>": 151679,
+ "<none110>": 151779,
+ "<none111>": 151780,
+ "<none112>": 151781,
+ "<none113>": 151782,
+ "<none114>": 151783,
+ "<none115>": 151784,
+ "<none116>": 151785,
+ "<none117>": 151786,
+ "<none118>": 151787,
+ "<none119>": 151788,
+ "<none11>": 151680,
+ "<none120>": 151789,
+ "<none121>": 151790,
+ "<none122>": 151791,
+ "<none123>": 151792,
+ "<none124>": 151793,
+ "<none125>": 151794,
+ "<none126>": 151795,
+ "<none127>": 151796,
+ "<none128>": 151797,
+ "<none129>": 151798,
+ "<none12>": 151681,
+ "<none130>": 151799,
+ "<none131>": 151800,
+ "<none132>": 151801,
+ "<none133>": 151802,
+ "<none134>": 151803,
+ "<none135>": 151804,
+ "<none136>": 151805,
+ "<none137>": 151806,
+ "<none138>": 151807,
+ "<none139>": 151808,
+ "<none13>": 151682,
+ "<none140>": 151809,
+ "<none141>": 151810,
+ "<none142>": 151811,
+ "<none143>": 151812,
+ "<none144>": 151813,
+ "<none145>": 151814,
+ "<none146>": 151815,
+ "<none147>": 151816,
+ "<none148>": 151817,
+ "<none149>": 151818,
+ "<none14>": 151683,
+ "<none150>": 151819,
+ "<none151>": 151820,
+ "<none152>": 151821,
+ "<none153>": 151822,
+ "<none154>": 151823,
+ "<none155>": 151824,
+ "<none156>": 151825,
+ "<none157>": 151826,
+ "<none158>": 151827,
+ "<none159>": 151828,
+ "<none15>": 151684,
+ "<none160>": 151829,
+ "<none161>": 151830,
+ "<none162>": 151831,
+ "<none163>": 151832,
+ "<none164>": 151833,
+ "<none165>": 151834,
+ "<none166>": 151835,
+ "<none167>": 151836,
+ "<none168>": 151837,
+ "<none169>": 151838,
+ "<none16>": 151685,
+ "<none170>": 151839,
+ "<none171>": 151840,
+ "<none172>": 151841,
+ "<none173>": 151842,
+ "<none174>": 151843,
+ "<none175>": 151844,
+ "<none176>": 151845,
+ "<none177>": 151846,
+ "<none178>": 151847,
+ "<none179>": 151848,
+ "<none17>": 151686,
+ "<none180>": 151849,
+ "<none181>": 151850,
+ "<none182>": 151851,
+ "<none183>": 151852,
+ "<none184>": 151853,
+ "<none185>": 151854,
+ "<none186>": 151855,
+ "<none187>": 151856,
+ "<none188>": 151857,
+ "<none189>": 151858,
+ "<none18>": 151687,
+ "<none190>": 151859,
+ "<none191>": 151860,
+ "<none192>": 151861,
+ "<none193>": 151862,
+ "<none194>": 151863,
+ "<none195>": 151864,
+ "<none196>": 151865,
+ "<none197>": 151866,
+ "<none198>": 151867,
+ "<none199>": 151868,
+ "<none19>": 151688,
+ "<none1>": 151670,
+ "<none200>": 151869,
+ "<none201>": 151870,
+ "<none202>": 151871,
+ "<none203>": 151872,
+ "<none204>": 151873,
+ "<none205>": 151874,
+ "<none206>": 151875,
+ "<none207>": 151876,
+ "<none208>": 151877,
+ "<none209>": 151878,
+ "<none20>": 151689,
+ "<none210>": 151879,
+ "<none211>": 151880,
+ "<none212>": 151881,
+ "<none213>": 151882,
+ "<none214>": 151883,
+ "<none215>": 151884,
+ "<none216>": 151885,
+ "<none217>": 151886,
+ "<none218>": 151887,
+ "<none219>": 151888,
+ "<none21>": 151690,
+ "<none220>": 151889,
+ "<none221>": 151890,
+ "<none222>": 151891,
+ "<none223>": 151892,
+ "<none224>": 151893,
+ "<none225>": 151894,
+ "<none226>": 151895,
+ "<none227>": 151896,
+ "<none228>": 151897,
+ "<none229>": 151898,
+ "<none22>": 151691,
+ "<none230>": 151899,
+ "<none231>": 151900,
+ "<none232>": 151901,
+ "<none233>": 151902,
+ "<none234>": 151903,
+ "<none235>": 151904,
+ "<none236>": 151905,
+ "<none237>": 151906,
+ "<none238>": 151907,
+ "<none239>": 151908,
+ "<none23>": 151692,
+ "<none240>": 151909,
+ "<none241>": 151910,
+ "<none242>": 151911,
+ "<none243>": 151912,
+ "<none244>": 151913,
+ "<none245>": 151914,
+ "<none246>": 151915,
+ "<none247>": 151916,
+ "<none248>": 151917,
+ "<none249>": 151918,
+ "<none24>": 151693,
+ "<none250>": 151919,
+ "<none251>": 151920,
+ "<none252>": 151921,
+ "<none253>": 151922,
+ "<none254>": 151923,
+ "<none255>": 151924,
+ "<none256>": 151925,
+ "<none257>": 151926,
+ "<none258>": 151927,
+ "<none259>": 151928,
+ "<none25>": 151694,
+ "<none260>": 151929,
+ "<none261>": 151930,
+ "<none262>": 151931,
+ "<none263>": 151932,
+ "<none264>": 151933,
+ "<none265>": 151934,
+ "<none266>": 151935,
+ "<none26>": 151695,
+ "<none27>": 151696,
+ "<none28>": 151697,
+ "<none29>": 151698,
+ "<none2>": 151671,
+ "<none30>": 151699,
+ "<none31>": 151700,
+ "<none32>": 151701,
+ "<none33>": 151702,
+ "<none34>": 151703,
+ "<none35>": 151704,
+ "<none36>": 151705,
+ "<none37>": 151706,
+ "<none38>": 151707,
+ "<none39>": 151708,
+ "<none3>": 151672,
+ "<none40>": 151709,
+ "<none41>": 151710,
+ "<none42>": 151711,
+ "<none43>": 151712,
+ "<none44>": 151713,
+ "<none45>": 151714,
+ "<none46>": 151715,
+ "<none47>": 151716,
+ "<none48>": 151717,
+ "<none49>": 151718,
+ "<none4>": 151673,
+ "<none50>": 151719,
+ "<none51>": 151720,
+ "<none52>": 151721,
+ "<none53>": 151722,
+ "<none54>": 151723,
+ "<none55>": 151724,
+ "<none56>": 151725,
+ "<none57>": 151726,
+ "<none58>": 151727,
+ "<none59>": 151728,
+ "<none5>": 151674,
+ "<none60>": 151729,
+ "<none61>": 151730,
+ "<none62>": 151731,
+ "<none63>": 151732,
+ "<none64>": 151733,
+ "<none65>": 151734,
+ "<none66>": 151735,
+ "<none67>": 151736,
+ "<none68>": 151737,
+ "<none69>": 151738,
+ "<none6>": 151675,
+ "<none70>": 151739,
+ "<none71>": 151740,
+ "<none72>": 151741,
+ "<none73>": 151742,
+ "<none74>": 151743,
+ "<none75>": 151744,
+ "<none76>": 151745,
+ "<none77>": 151746,
+ "<none78>": 151747,
+ "<none79>": 151748,
+ "<none7>": 151676,
+ "<none80>": 151749,
+ "<none81>": 151750,
+ "<none82>": 151751,
+ "<none83>": 151752,
+ "<none84>": 151753,
+ "<none85>": 151754,
+ "<none86>": 151755,
+ "<none87>": 151756,
+ "<none88>": 151757,
+ "<none89>": 151758,
+ "<none8>": 151677,
+ "<none90>": 151759,
+ "<none91>": 151760,
+ "<none92>": 151761,
+ "<none93>": 151762,
+ "<none94>": 151763,
+ "<none95>": 151764,
+ "<none96>": 151765,
+ "<none97>": 151766,
+ "<none98>": 151767,
+ "<none99>": 151768,
+ "<none9>": 151678,
+ "<think>": 151667,
+ "<tool_call>": 151657,
+ "<tool_response>": 151665,
+ "<x0>": 151936,
+ "<x1000>": 153936,
+ "<x1001>": 153938,
+ "<x1002>": 153940,
+ "<x1003>": 153942,
+ "<x1004>": 153944,
+ "<x1005>": 153946,
+ "<x1006>": 153948,
+ "<x1007>": 153950,
+ "<x1008>": 153952,
+ "<x1009>": 153954,
+ "<x100>": 152136,
+ "<x1010>": 153956,
+ "<x1011>": 153958,
+ "<x1012>": 153960,
+ "<x1013>": 153962,
+ "<x1014>": 153964,
+ "<x1015>": 153966,
+ "<x1016>": 153968,
+ "<x1017>": 153970,
+ "<x1018>": 153972,
+ "<x1019>": 153974,
+ "<x101>": 152138,
+ "<x1020>": 153976,
+ "<x1021>": 153978,
+ "<x1022>": 153980,
+ "<x1023>": 153982,
+ "<x102>": 152140,
+ "<x103>": 152142,
+ "<x104>": 152144,
+ "<x105>": 152146,
+ "<x106>": 152148,
+ "<x107>": 152150,
+ "<x108>": 152152,
+ "<x109>": 152154,
+ "<x10>": 151956,
+ "<x110>": 152156,
+ "<x111>": 152158,
+ "<x112>": 152160,
+ "<x113>": 152162,
+ "<x114>": 152164,
+ "<x115>": 152166,
+ "<x116>": 152168,
+ "<x117>": 152170,
+ "<x118>": 152172,
+ "<x119>": 152174,
+ "<x11>": 151958,
+ "<x120>": 152176,
+ "<x121>": 152178,
+ "<x122>": 152180,
+ "<x123>": 152182,
+ "<x124>": 152184,
+ "<x125>": 152186,
+ "<x126>": 152188,
+ "<x127>": 152190,
+ "<x128>": 152192,
+ "<x129>": 152194,
+ "<x12>": 151960,
+ "<x130>": 152196,
+ "<x131>": 152198,
+ "<x132>": 152200,
+ "<x133>": 152202,
+ "<x134>": 152204,
+ "<x135>": 152206,
+ "<x136>": 152208,
+ "<x137>": 152210,
+ "<x138>": 152212,
+ "<x139>": 152214,
+ "<x13>": 151962,
+ "<x140>": 152216,
+ "<x141>": 152218,
+ "<x142>": 152220,
+ "<x143>": 152222,
+ "<x144>": 152224,
+ "<x145>": 152226,
+ "<x146>": 152228,
+ "<x147>": 152230,
+ "<x148>": 152232,
+ "<x149>": 152234,
+ "<x14>": 151964,
+ "<x150>": 152236,
+ "<x151>": 152238,
+ "<x152>": 152240,
+ "<x153>": 152242,
+ "<x154>": 152244,
+ "<x155>": 152246,
+ "<x156>": 152248,
+ "<x157>": 152250,
+ "<x158>": 152252,
+ "<x159>": 152254,
+ "<x15>": 151966,
+ "<x160>": 152256,
+ "<x161>": 152258,
+ "<x162>": 152260,
+ "<x163>": 152262,
+ "<x164>": 152264,
+ "<x165>": 152266,
+ "<x166>": 152268,
+ "<x167>": 152270,
+ "<x168>": 152272,
+ "<x169>": 152274,
+ "<x16>": 151968,
+ "<x170>": 152276,
+ "<x171>": 152278,
+ "<x172>": 152280,
+ "<x173>": 152282,
+ "<x174>": 152284,
+ "<x175>": 152286,
+ "<x176>": 152288,
+ "<x177>": 152290,
+ "<x178>": 152292,
+ "<x179>": 152294,
+ "<x17>": 151970,
+ "<x180>": 152296,
+ "<x181>": 152298,
+ "<x182>": 152300,
+ "<x183>": 152302,
+ "<x184>": 152304,
+ "<x185>": 152306,
+ "<x186>": 152308,
+ "<x187>": 152310,
+ "<x188>": 152312,
+ "<x189>": 152314,
+ "<x18>": 151972,
+ "<x190>": 152316,
+ "<x191>": 152318,
+ "<x192>": 152320,
+ "<x193>": 152322,
+ "<x194>": 152324,
+ "<x195>": 152326,
+ "<x196>": 152328,
+ "<x197>": 152330,
+ "<x198>": 152332,
+ "<x199>": 152334,
+ "<x19>": 151974,
+ "<x1>": 151938,
+ "<x200>": 152336,
+ "<x201>": 152338,
+ "<x202>": 152340,
+ "<x203>": 152342,
+ "<x204>": 152344,
+ "<x205>": 152346,
+ "<x206>": 152348,
+ "<x207>": 152350,
+ "<x208>": 152352,
+ "<x209>": 152354,
+ "<x20>": 151976,
+ "<x210>": 152356,
+ "<x211>": 152358,
+ "<x212>": 152360,
+ "<x213>": 152362,
+ "<x214>": 152364,
+ "<x215>": 152366,
+ "<x216>": 152368,
+ "<x217>": 152370,
+ "<x218>": 152372,
+ "<x219>": 152374,
+ "<x21>": 151978,
+ "<x220>": 152376,
+ "<x221>": 152378,
+ "<x222>": 152380,
+ "<x223>": 152382,
+ "<x224>": 152384,
+ "<x225>": 152386,
+ "<x226>": 152388,
+ "<x227>": 152390,
+ "<x228>": 152392,
+ "<x229>": 152394,
+ "<x22>": 151980,
+ "<x230>": 152396,
+ "<x231>": 152398,
+ "<x232>": 152400,
+ "<x233>": 152402,
+ "<x234>": 152404,
+ "<x235>": 152406,
+ "<x236>": 152408,
+ "<x237>": 152410,
+ "<x238>": 152412,
+ "<x239>": 152414,
+ "<x23>": 151982,
+ "<x240>": 152416,
+ "<x241>": 152418,
+ "<x242>": 152420,
+ "<x243>": 152422,
+ "<x244>": 152424,
+ "<x245>": 152426,
+ "<x246>": 152428,
+ "<x247>": 152430,
+ "<x248>": 152432,
+ "<x249>": 152434,
+ "<x24>": 151984,
+ "<x250>": 152436,
+ "<x251>": 152438,
+ "<x252>": 152440,
+ "<x253>": 152442,
+ "<x254>": 152444,
+ "<x255>": 152446,
+ "<x256>": 152448,
+ "<x257>": 152450,
+ "<x258>": 152452,
+ "<x259>": 152454,
+ "<x25>": 151986,
+ "<x260>": 152456,
+ "<x261>": 152458,
+ "<x262>": 152460,
+ "<x263>": 152462,
+ "<x264>": 152464,
+ "<x265>": 152466,
+ "<x266>": 152468,
+ "<x267>": 152470,
+ "<x268>": 152472,
+ "<x269>": 152474,
+ "<x26>": 151988,
+ "<x270>": 152476,
+ "<x271>": 152478,
+ "<x272>": 152480,
+ "<x273>": 152482,
+ "<x274>": 152484,
+ "<x275>": 152486,
+ "<x276>": 152488,
+ "<x277>": 152490,
+ "<x278>": 152492,
+ "<x279>": 152494,
+ "<x27>": 151990,
+ "<x280>": 152496,
+ "<x281>": 152498,
+ "<x282>": 152500,
+ "<x283>": 152502,
+ "<x284>": 152504,
+ "<x285>": 152506,
+ "<x286>": 152508,
+ "<x287>": 152510,
+ "<x288>": 152512,
+ "<x289>": 152514,
+ "<x28>": 151992,
+ "<x290>": 152516,
+ "<x291>": 152518,
+ "<x292>": 152520,
+ "<x293>": 152522,
+ "<x294>": 152524,
+ "<x295>": 152526,
+ "<x296>": 152528,
+ "<x297>": 152530,
+ "<x298>": 152532,
+ "<x299>": 152534,
+ "<x29>": 151994,
+ "<x2>": 151940,
+ "<x300>": 152536,
+ "<x301>": 152538,
+ "<x302>": 152540,
+ "<x303>": 152542,
+ "<x304>": 152544,
+ "<x305>": 152546,
+ "<x306>": 152548,
+ "<x307>": 152550,
+ "<x308>": 152552,
+ "<x309>": 152554,
+ "<x30>": 151996,
+ "<x310>": 152556,
+ "<x311>": 152558,
+ "<x312>": 152560,
+ "<x313>": 152562,
+ "<x314>": 152564,
+ "<x315>": 152566,
+ "<x316>": 152568,
+ "<x317>": 152570,
+ "<x318>": 152572,
+ "<x319>": 152574,
+ "<x31>": 151998,
+ "<x320>": 152576,
+ "<x321>": 152578,
+ "<x322>": 152580,
+ "<x323>": 152582,
+ "<x324>": 152584,
+ "<x325>": 152586,
+ "<x326>": 152588,
+ "<x327>": 152590,
+ "<x328>": 152592,
+ "<x329>": 152594,
+ "<x32>": 152000,
+ "<x330>": 152596,
+ "<x331>": 152598,
+ "<x332>": 152600,
+ "<x333>": 152602,
+ "<x334>": 152604,
+ "<x335>": 152606,
+ "<x336>": 152608,
+ "<x337>": 152610,
+ "<x338>": 152612,
+ "<x339>": 152614,
+ "<x33>": 152002,
+ "<x340>": 152616,
+ "<x341>": 152618,
+ "<x342>": 152620,
+ "<x343>": 152622,
+ "<x344>": 152624,
+ "<x345>": 152626,
+ "<x346>": 152628,
+ "<x347>": 152630,
+ "<x348>": 152632,
+ "<x349>": 152634,
+ "<x34>": 152004,
+ "<x350>": 152636,
+ "<x351>": 152638,
+ "<x352>": 152640,
+ "<x353>": 152642,
+ "<x354>": 152644,
+ "<x355>": 152646,
+ "<x356>": 152648,
+ "<x357>": 152650,
+ "<x358>": 152652,
+ "<x359>": 152654,
+ "<x35>": 152006,
+ "<x360>": 152656,
+ "<x361>": 152658,
+ "<x362>": 152660,
+ "<x363>": 152662,
+ "<x364>": 152664,
+ "<x365>": 152666,
+ "<x366>": 152668,
+ "<x367>": 152670,
+ "<x368>": 152672,
+ "<x369>": 152674,
+ "<x36>": 152008,
+ "<x370>": 152676,
+ "<x371>": 152678,
+ "<x372>": 152680,
+ "<x373>": 152682,
+ "<x374>": 152684,
+ "<x375>": 152686,
+ "<x376>": 152688,
+ "<x377>": 152690,
+ "<x378>": 152692,
+ "<x379>": 152694,
+ "<x37>": 152010,
+ "<x380>": 152696,
+ "<x381>": 152698,
+ "<x382>": 152700,
+ "<x383>": 152702,
+ "<x384>": 152704,
+ "<x385>": 152706,
+ "<x386>": 152708,
+ "<x387>": 152710,
+ "<x388>": 152712,
+ "<x389>": 152714,
+ "<x38>": 152012,
+ "<x390>": 152716,
+ "<x391>": 152718,
+ "<x392>": 152720,
+ "<x393>": 152722,
+ "<x394>": 152724,
+ "<x395>": 152726,
+ "<x396>": 152728,
+ "<x397>": 152730,
+ "<x398>": 152732,
+ "<x399>": 152734,
+ "<x39>": 152014,
+ "<x3>": 151942,
+ "<x400>": 152736,
+ "<x401>": 152738,
+ "<x402>": 152740,
+ "<x403>": 152742,
+ "<x404>": 152744,
+ "<x405>": 152746,
+ "<x406>": 152748,
+ "<x407>": 152750,
+ "<x408>": 152752,
+ "<x409>": 152754,
+ "<x40>": 152016,
+ "<x410>": 152756,
+ "<x411>": 152758,
+ "<x412>": 152760,
+ "<x413>": 152762,
+ "<x414>": 152764,
+ "<x415>": 152766,
+ "<x416>": 152768,
+ "<x417>": 152770,
+ "<x418>": 152772,
+ "<x419>": 152774,
+ "<x41>": 152018,
+ "<x420>": 152776,
+ "<x421>": 152778,
+ "<x422>": 152780,
+ "<x423>": 152782,
+ "<x424>": 152784,
+ "<x425>": 152786,
+ "<x426>": 152788,
+ "<x427>": 152790,
+ "<x428>": 152792,
+ "<x429>": 152794,
+ "<x42>": 152020,
+ "<x430>": 152796,
+ "<x431>": 152798,
+ "<x432>": 152800,
+ "<x433>": 152802,
+ "<x434>": 152804,
+ "<x435>": 152806,
+ "<x436>": 152808,
+ "<x437>": 152810,
+ "<x438>": 152812,
+ "<x439>": 152814,
+ "<x43>": 152022,
+ "<x440>": 152816,
+ "<x441>": 152818,
+ "<x442>": 152820,
+ "<x443>": 152822,
+ "<x444>": 152824,
+ "<x445>": 152826,
+ "<x446>": 152828,
+ "<x447>": 152830,
+ "<x448>": 152832,
+ "<x449>": 152834,
+ "<x44>": 152024,
+ "<x450>": 152836,
+ "<x451>": 152838,
+ "<x452>": 152840,
+ "<x453>": 152842,
+ "<x454>": 152844,
+ "<x455>": 152846,
+ "<x456>": 152848,
+ "<x457>": 152850,
+ "<x458>": 152852,
+ "<x459>": 152854,
+ "<x45>": 152026,
+ "<x460>": 152856,
+ "<x461>": 152858,
+ "<x462>": 152860,
+ "<x463>": 152862,
+ "<x464>": 152864,
+ "<x465>": 152866,
+ "<x466>": 152868,
+ "<x467>": 152870,
+ "<x468>": 152872,
+ "<x469>": 152874,
+ "<x46>": 152028,
+ "<x470>": 152876,
+ "<x471>": 152878,
+ "<x472>": 152880,
+ "<x473>": 152882,
+ "<x474>": 152884,
+ "<x475>": 152886,
+ "<x476>": 152888,
+ "<x477>": 152890,
+ "<x478>": 152892,
+ "<x479>": 152894,
+ "<x47>": 152030,
+ "<x480>": 152896,
+ "<x481>": 152898,
+ "<x482>": 152900,
+ "<x483>": 152902,
+ "<x484>": 152904,
+ "<x485>": 152906,
+ "<x486>": 152908,
+ "<x487>": 152910,
+ "<x488>": 152912,
+ "<x489>": 152914,
+ "<x48>": 152032,
+ "<x490>": 152916,
+ "<x491>": 152918,
+ "<x492>": 152920,
+ "<x493>": 152922,
+ "<x494>": 152924,
+ "<x495>": 152926,
+ "<x496>": 152928,
+ "<x497>": 152930,
+ "<x498>": 152932,
+ "<x499>": 152934,
+ "<x49>": 152034,
+ "<x4>": 151944,
+ "<x500>": 152936,
+ "<x501>": 152938,
+ "<x502>": 152940,
+ "<x503>": 152942,
+ "<x504>": 152944,
+ "<x505>": 152946,
+ "<x506>": 152948,
+ "<x507>": 152950,
+ "<x508>": 152952,
+ "<x509>": 152954,
+ "<x50>": 152036,
+ "<x510>": 152956,
+ "<x511>": 152958,
+ "<x512>": 152960,
+ "<x513>": 152962,
+ "<x514>": 152964,
+ "<x515>": 152966,
+ "<x516>": 152968,
+ "<x517>": 152970,
+ "<x518>": 152972,
+ "<x519>": 152974,
+ "<x51>": 152038,
+ "<x520>": 152976,
+ "<x521>": 152978,
+ "<x522>": 152980,
+ "<x523>": 152982,
+ "<x524>": 152984,
+ "<x525>": 152986,
+ "<x526>": 152988,
+ "<x527>": 152990,
+ "<x528>": 152992,
+ "<x529>": 152994,
+ "<x52>": 152040,
+ "<x530>": 152996,
+ "<x531>": 152998,
+ "<x532>": 153000,
+ "<x533>": 153002,
+ "<x534>": 153004,
+ "<x535>": 153006,
+ "<x536>": 153008,
+ "<x537>": 153010,
+ "<x538>": 153012,
+ "<x539>": 153014,
+ "<x53>": 152042,
+ "<x540>": 153016,
+ "<x541>": 153018,
+ "<x542>": 153020,
+ "<x543>": 153022,
+ "<x544>": 153024,
+ "<x545>": 153026,
+ "<x546>": 153028,
+ "<x547>": 153030,
+ "<x548>": 153032,
+ "<x549>": 153034,
+ "<x54>": 152044,
+ "<x550>": 153036,
+ "<x551>": 153038,
+ "<x552>": 153040,
+ "<x553>": 153042,
+ "<x554>": 153044,
+ "<x555>": 153046,
+ "<x556>": 153048,
+ "<x557>": 153050,
+ "<x558>": 153052,
+ "<x559>": 153054,
+ "<x55>": 152046,
+ "<x560>": 153056,
+ "<x561>": 153058,
+ "<x562>": 153060,
+ "<x563>": 153062,
+ "<x564>": 153064,
+ "<x565>": 153066,
+ "<x566>": 153068,
+ "<x567>": 153070,
+ "<x568>": 153072,
+ "<x569>": 153074,
+ "<x56>": 152048,
+ "<x570>": 153076,
+ "<x571>": 153078,
+ "<x572>": 153080,
+ "<x573>": 153082,
+ "<x574>": 153084,
+ "<x575>": 153086,
+ "<x576>": 153088,
+ "<x577>": 153090,
+ "<x578>": 153092,
+ "<x579>": 153094,
+ "<x57>": 152050,
+ "<x580>": 153096,
+ "<x581>": 153098,
+ "<x582>": 153100,
+ "<x583>": 153102,
+ "<x584>": 153104,
+ "<x585>": 153106,
+ "<x586>": 153108,
+ "<x587>": 153110,
+ "<x588>": 153112,
+ "<x589>": 153114,
+ "<x58>": 152052,
+ "<x590>": 153116,
+ "<x591>": 153118,
+ "<x592>": 153120,
+ "<x593>": 153122,
+ "<x594>": 153124,
+ "<x595>": 153126,
+ "<x596>": 153128,
+ "<x597>": 153130,
+ "<x598>": 153132,
+ "<x599>": 153134,
+ "<x59>": 152054,
+ "<x5>": 151946,
+ "<x600>": 153136,
+ "<x601>": 153138,
+ "<x602>": 153140,
+ "<x603>": 153142,
+ "<x604>": 153144,
+ "<x605>": 153146,
+ "<x606>": 153148,
+ "<x607>": 153150,
+ "<x608>": 153152,
+ "<x609>": 153154,
+ "<x60>": 152056,
+ "<x610>": 153156,
+ "<x611>": 153158,
+ "<x612>": 153160,
+ "<x613>": 153162,
+ "<x614>": 153164,
+ "<x615>": 153166,
+ "<x616>": 153168,
+ "<x617>": 153170,
+ "<x618>": 153172,
+ "<x619>": 153174,
+ "<x61>": 152058,
+ "<x620>": 153176,
+ "<x621>": 153178,
+ "<x622>": 153180,
+ "<x623>": 153182,
+ "<x624>": 153184,
+ "<x625>": 153186,
+ "<x626>": 153188,
+ "<x627>": 153190,
+ "<x628>": 153192,
+ "<x629>": 153194,
+ "<x62>": 152060,
+ "<x630>": 153196,
+ "<x631>": 153198,
+ "<x632>": 153200,
+ "<x633>": 153202,
+ "<x634>": 153204,
+ "<x635>": 153206,
+ "<x636>": 153208,
+ "<x637>": 153210,
+ "<x638>": 153212,
+ "<x639>": 153214,
+ "<x63>": 152062,
+ "<x640>": 153216,
+ "<x641>": 153218,
+ "<x642>": 153220,
+ "<x643>": 153222,
+ "<x644>": 153224,
+ "<x645>": 153226,
+ "<x646>": 153228,
+ "<x647>": 153230,
+ "<x648>": 153232,
+ "<x649>": 153234,
+ "<x64>": 152064,
+ "<x650>": 153236,
+ "<x651>": 153238,
+ "<x652>": 153240,
+ "<x653>": 153242,
+ "<x654>": 153244,
+ "<x655>": 153246,
+ "<x656>": 153248,
+ "<x657>": 153250,
+ "<x658>": 153252,
+ "<x659>": 153254,
+ "<x65>": 152066,
+ "<x660>": 153256,
+ "<x661>": 153258,
+ "<x662>": 153260,
+ "<x663>": 153262,
+ "<x664>": 153264,
+ "<x665>": 153266,
+ "<x666>": 153268,
+ "<x667>": 153270,
+ "<x668>": 153272,
+ "<x669>": 153274,
+ "<x66>": 152068,
+ "<x670>": 153276,
+ "<x671>": 153278,
+ "<x672>": 153280,
+ "<x673>": 153282,
+ "<x674>": 153284,
+ "<x675>": 153286,
+ "<x676>": 153288,
+ "<x677>": 153290,
+ "<x678>": 153292,
+ "<x679>": 153294,
+ "<x67>": 152070,
+ "<x680>": 153296,
+ "<x681>": 153298,
+ "<x682>": 153300,
+ "<x683>": 153302,
+ "<x684>": 153304,
+ "<x685>": 153306,
+ "<x686>": 153308,
+ "<x687>": 153310,
+ "<x688>": 153312,
+ "<x689>": 153314,
+ "<x68>": 152072,
+ "<x690>": 153316,
+ "<x691>": 153318,
+ "<x692>": 153320,
+ "<x693>": 153322,
+ "<x694>": 153324,
+ "<x695>": 153326,
+ "<x696>": 153328,
+ "<x697>": 153330,
+ "<x698>": 153332,
+ "<x699>": 153334,
+ "<x69>": 152074,
+ "<x6>": 151948,
+ "<x700>": 153336,
+ "<x701>": 153338,
+ "<x702>": 153340,
+ "<x703>": 153342,
+ "<x704>": 153344,
+ "<x705>": 153346,
+ "<x706>": 153348,
+ "<x707>": 153350,
+ "<x708>": 153352,
+ "<x709>": 153354,
+ "<x70>": 152076,
+ "<x710>": 153356,
+ "<x711>": 153358,
+ "<x712>": 153360,
+ "<x713>": 153362,
+ "<x714>": 153364,
+ "<x715>": 153366,
+ "<x716>": 153368,
+ "<x717>": 153370,
+ "<x718>": 153372,
+ "<x719>": 153374,
+ "<x71>": 152078,
+ "<x720>": 153376,
+ "<x721>": 153378,
+ "<x722>": 153380,
+ "<x723>": 153382,
+ "<x724>": 153384,
+ "<x725>": 153386,
+ "<x726>": 153388,
+ "<x727>": 153390,
+ "<x728>": 153392,
+ "<x729>": 153394,
+ "<x72>": 152080,
+ "<x730>": 153396,
+ "<x731>": 153398,
+ "<x732>": 153400,
+ "<x733>": 153402,
+ "<x734>": 153404,
+ "<x735>": 153406,
+ "<x736>": 153408,
+ "<x737>": 153410,
+ "<x738>": 153412,
+ "<x739>": 153414,
+ "<x73>": 152082,
+ "<x740>": 153416,
+ "<x741>": 153418,
+ "<x742>": 153420,
+ "<x743>": 153422,
+ "<x744>": 153424,
+ "<x745>": 153426,
+ "<x746>": 153428,
+ "<x747>": 153430,
+ "<x748>": 153432,
+ "<x749>": 153434,
+ "<x74>": 152084,
+ "<x750>": 153436,
+ "<x751>": 153438,
+ "<x752>": 153440,
+ "<x753>": 153442,
+ "<x754>": 153444,
+ "<x755>": 153446,
+ "<x756>": 153448,
+ "<x757>": 153450,
+ "<x758>": 153452,
+ "<x759>": 153454,
+ "<x75>": 152086,
+ "<x760>": 153456,
+ "<x761>": 153458,
+ "<x762>": 153460,
+ "<x763>": 153462,
+ "<x764>": 153464,
+ "<x765>": 153466,
+ "<x766>": 153468,
+ "<x767>": 153470,
+ "<x768>": 153472,
+ "<x769>": 153474,
+ "<x76>": 152088,
+ "<x770>": 153476,
+ "<x771>": 153478,
+ "<x772>": 153480,
+ "<x773>": 153482,
+ "<x774>": 153484,
+ "<x775>": 153486,
+ "<x776>": 153488,
+ "<x777>": 153490,
+ "<x778>": 153492,
+ "<x779>": 153494,
+ "<x77>": 152090,
+ "<x780>": 153496,
+ "<x781>": 153498,
+ "<x782>": 153500,
+ "<x783>": 153502,
+ "<x784>": 153504,
+ "<x785>": 153506,
+ "<x786>": 153508,
+ "<x787>": 153510,
+ "<x788>": 153512,
+ "<x789>": 153514,
+ "<x78>": 152092,
+ "<x790>": 153516,
+ "<x791>": 153518,
+ "<x792>": 153520,
+ "<x793>": 153522,
+ "<x794>": 153524,
+ "<x795>": 153526,
+ "<x796>": 153528,
+ "<x797>": 153530,
+ "<x798>": 153532,
+ "<x799>": 153534,
+ "<x79>": 152094,
+ "<x7>": 151950,
+ "<x800>": 153536,
+ "<x801>": 153538,
+ "<x802>": 153540,
+ "<x803>": 153542,
+ "<x804>": 153544,
+ "<x805>": 153546,
+ "<x806>": 153548,
+ "<x807>": 153550,
+ "<x808>": 153552,
+ "<x809>": 153554,
+ "<x80>": 152096,
+ "<x810>": 153556,
+ "<x811>": 153558,
+ "<x812>": 153560,
+ "<x813>": 153562,
+ "<x814>": 153564,
+ "<x815>": 153566,
+ "<x816>": 153568,
+ "<x817>": 153570,
+ "<x818>": 153572,
+ "<x819>": 153574,
+ "<x81>": 152098,
+ "<x820>": 153576,
+ "<x821>": 153578,
+ "<x822>": 153580,
+ "<x823>": 153582,
+ "<x824>": 153584,
+ "<x825>": 153586,
+ "<x826>": 153588,
+ "<x827>": 153590,
+ "<x828>": 153592,
+ "<x829>": 153594,
+ "<x82>": 152100,
+ "<x830>": 153596,
+ "<x831>": 153598,
+ "<x832>": 153600,
+ "<x833>": 153602,
+ "<x834>": 153604,
+ "<x835>": 153606,
+ "<x836>": 153608,
+ "<x837>": 153610,
+ "<x838>": 153612,
+ "<x839>": 153614,
+ "<x83>": 152102,
+ "<x840>": 153616,
+ "<x841>": 153618,
+ "<x842>": 153620,
+ "<x843>": 153622,
+ "<x844>": 153624,
+ "<x845>": 153626,
+ "<x846>": 153628,
+ "<x847>": 153630,
+ "<x848>": 153632,
+ "<x849>": 153634,
+ "<x84>": 152104,
+ "<x850>": 153636,
+ "<x851>": 153638,
+ "<x852>": 153640,
+ "<x853>": 153642,
+ "<x854>": 153644,
+ "<x855>": 153646,
+ "<x856>": 153648,
+ "<x857>": 153650,
+ "<x858>": 153652,
+ "<x859>": 153654,
+ "<x85>": 152106,
+ "<x860>": 153656,
+ "<x861>": 153658,
+ "<x862>": 153660,
+ "<x863>": 153662,
+ "<x864>": 153664,
+ "<x865>": 153666,
+ "<x866>": 153668,
+ "<x867>": 153670,
+ "<x868>": 153672,
+ "<x869>": 153674,
+ "<x86>": 152108,
+ "<x870>": 153676,
+ "<x871>": 153678,
+ "<x872>": 153680,
+ "<x873>": 153682,
+ "<x874>": 153684,
+ "<x875>": 153686,
+ "<x876>": 153688,
+ "<x877>": 153690,
+ "<x878>": 153692,
+ "<x879>": 153694,
+ "<x87>": 152110,
+ "<x880>": 153696,
+ "<x881>": 153698,
+ "<x882>": 153700,
+ "<x883>": 153702,
+ "<x884>": 153704,
+ "<x885>": 153706,
+ "<x886>": 153708,
+ "<x887>": 153710,
+ "<x888>": 153712,
+ "<x889>": 153714,
+ "<x88>": 152112,
+ "<x890>": 153716,
+ "<x891>": 153718,
+ "<x892>": 153720,
+ "<x893>": 153722,
+ "<x894>": 153724,
+ "<x895>": 153726,
+ "<x896>": 153728,
+ "<x897>": 153730,
+ "<x898>": 153732,
+ "<x899>": 153734,
+ "<x89>": 152114,
+ "<x8>": 151952,
+ "<x900>": 153736,
+ "<x901>": 153738,
+ "<x902>": 153740,
+ "<x903>": 153742,
+ "<x904>": 153744,
+ "<x905>": 153746,
+ "<x906>": 153748,
+ "<x907>": 153750,
+ "<x908>": 153752,
+ "<x909>": 153754,
+ "<x90>": 152116,
+ "<x910>": 153756,
+ "<x911>": 153758,
+ "<x912>": 153760,
+ "<x913>": 153762,
+ "<x914>": 153764,
+ "<x915>": 153766,
+ "<x916>": 153768,
+ "<x917>": 153770,
+ "<x918>": 153772,
+ "<x919>": 153774,
+ "<x91>": 152118,
+ "<x920>": 153776,
+ "<x921>": 153778,
+ "<x922>": 153780,
+ "<x923>": 153782,
+ "<x924>": 153784,
+ "<x925>": 153786,
+ "<x926>": 153788,
+ "<x927>": 153790,
+ "<x928>": 153792,
+ "<x929>": 153794,
+ "<x92>": 152120,
+ "<x930>": 153796,
+ "<x931>": 153798,
+ "<x932>": 153800,
+ "<x933>": 153802,
+ "<x934>": 153804,
+ "<x935>": 153806,
+ "<x936>": 153808,
+ "<x937>": 153810,
+ "<x938>": 153812,
+ "<x939>": 153814,
+ "<x93>": 152122,
+ "<x940>": 153816,
+ "<x941>": 153818,
+ "<x942>": 153820,
+ "<x943>": 153822,
+ "<x944>": 153824,
+ "<x945>": 153826,
+ "<x946>": 153828,
+ "<x947>": 153830,
+ "<x948>": 153832,
+ "<x949>": 153834,
+ "<x94>": 152124,
+ "<x950>": 153836,
+ "<x951>": 153838,
+ "<x952>": 153840,
+ "<x953>": 153842,
+ "<x954>": 153844,
+ "<x955>": 153846,
+ "<x956>": 153848,
+ "<x957>": 153850,
+ "<x958>": 153852,
+ "<x959>": 153854,
+ "<x95>": 152126,
+ "<x960>": 153856,
+ "<x961>": 153858,
+ "<x962>": 153860,
+ "<x963>": 153862,
+ "<x964>": 153864,
+ "<x965>": 153866,
+ "<x966>": 153868,
+ "<x967>": 153870,
+ "<x968>": 153872,
+ "<x969>": 153874,
+ "<x96>": 152128,
+ "<x970>": 153876,
+ "<x971>": 153878,
+ "<x972>": 153880,
+ "<x973>": 153882,
+ "<x974>": 153884,
+ "<x975>": 153886,
+ "<x976>": 153888,
+ "<x977>": 153890,
+ "<x978>": 153892,
+ "<x979>": 153894,
+ "<x97>": 152130,
+ "<x980>": 153896,
+ "<x981>": 153898,
+ "<x982>": 153900,
+ "<x983>": 153902,
+ "<x984>": 153904,
+ "<x985>": 153906,
+ "<x986>": 153908,
+ "<x987>": 153910,
+ "<x988>": 153912,
+ "<x989>": 153914,
+ "<x98>": 152132,
+ "<x990>": 153916,
+ "<x991>": 153918,
+ "<x992>": 153920,
+ "<x993>": 153922,
+ "<x994>": 153924,
+ "<x995>": 153926,
+ "<x996>": 153928,
+ "<x997>": 153930,
+ "<x998>": 153932,
+ "<x999>": 153934,
+ "<x99>": 152134,
+ "<x9>": 151954,
+ "<y0>": 151937,
+ "<y1000>": 153937,
+ "<y1001>": 153939,
+ "<y1002>": 153941,
+ "<y1003>": 153943,
+ "<y1004>": 153945,
+ "<y1005>": 153947,
+ "<y1006>": 153949,
+ "<y1007>": 153951,
+ "<y1008>": 153953,
+ "<y1009>": 153955,
+ "<y100>": 152137,
+ "<y1010>": 153957,
+ "<y1011>": 153959,
+ "<y1012>": 153961,
+ "<y1013>": 153963,
+ "<y1014>": 153965,
+ "<y1015>": 153967,
+ "<y1016>": 153969,
+ "<y1017>": 153971,
+ "<y1018>": 153973,
+ "<y1019>": 153975,
+ "<y101>": 152139,
+ "<y1020>": 153977,
+ "<y1021>": 153979,
+ "<y1022>": 153981,
+ "<y1023>": 153983,
+ "<y102>": 152141,
+ "<y103>": 152143,
+ "<y104>": 152145,
+ "<y105>": 152147,
+ "<y106>": 152149,
+ "<y107>": 152151,
+ "<y108>": 152153,
+ "<y109>": 152155,
+ "<y10>": 151957,
+ "<y110>": 152157,
+ "<y111>": 152159,
+ "<y112>": 152161,
+ "<y113>": 152163,
+ "<y114>": 152165,
+ "<y115>": 152167,
+ "<y116>": 152169,
+ "<y117>": 152171,
+ "<y118>": 152173,
+ "<y119>": 152175,
+ "<y11>": 151959,
+ "<y120>": 152177,
+ "<y121>": 152179,
+ "<y122>": 152181,
+ "<y123>": 152183,
+ "<y124>": 152185,
+ "<y125>": 152187,
+ "<y126>": 152189,
+ "<y127>": 152191,
+ "<y128>": 152193,
+ "<y129>": 152195,
+ "<y12>": 151961,
+ "<y130>": 152197,
+ "<y131>": 152199,
+ "<y132>": 152201,
+ "<y133>": 152203,
+ "<y134>": 152205,
+ "<y135>": 152207,
+ "<y136>": 152209,
+ "<y137>": 152211,
+ "<y138>": 152213,
+ "<y139>": 152215,
+ "<y13>": 151963,
+ "<y140>": 152217,
+ "<y141>": 152219,
+ "<y142>": 152221,
+ "<y143>": 152223,
+ "<y144>": 152225,
+ "<y145>": 152227,
+ "<y146>": 152229,
+ "<y147>": 152231,
+ "<y148>": 152233,
+ "<y149>": 152235,
+ "<y14>": 151965,
+ "<y150>": 152237,
+ "<y151>": 152239,
+ "<y152>": 152241,
+ "<y153>": 152243,
+ "<y154>": 152245,
+ "<y155>": 152247,
+ "<y156>": 152249,
+ "<y157>": 152251,
+ "<y158>": 152253,
+ "<y159>": 152255,
+ "<y15>": 151967,
+ "<y160>": 152257,
+ "<y161>": 152259,
+ "<y162>": 152261,
+ "<y163>": 152263,
+ "<y164>": 152265,
+ "<y165>": 152267,
+ "<y166>": 152269,
+ "<y167>": 152271,
+ "<y168>": 152273,
+ "<y169>": 152275,
+ "<y16>": 151969,
+ "<y170>": 152277,
+ "<y171>": 152279,
+ "<y172>": 152281,
+ "<y173>": 152283,
+ "<y174>": 152285,
+ "<y175>": 152287,
+ "<y176>": 152289,
+ "<y177>": 152291,
+ "<y178>": 152293,
+ "<y179>": 152295,
+ "<y17>": 151971,
+ "<y180>": 152297,
+ "<y181>": 152299,
+ "<y182>": 152301,
+ "<y183>": 152303,
+ "<y184>": 152305,
+ "<y185>": 152307,
+ "<y186>": 152309,
+ "<y187>": 152311,
+ "<y188>": 152313,
+ "<y189>": 152315,
+ "<y18>": 151973,
+ "<y190>": 152317,
+ "<y191>": 152319,
+ "<y192>": 152321,
+ "<y193>": 152323,
+ "<y194>": 152325,
+ "<y195>": 152327,
+ "<y196>": 152329,
+ "<y197>": 152331,
+ "<y198>": 152333,
+ "<y199>": 152335,
+ "<y19>": 151975,
+ "<y1>": 151939,
+ "<y200>": 152337,
+ "<y201>": 152339,
+ "<y202>": 152341,
+ "<y203>": 152343,
+ "<y204>": 152345,
+ "<y205>": 152347,
+ "<y206>": 152349,
+ "<y207>": 152351,
+ "<y208>": 152353,
+ "<y209>": 152355,
+ "<y20>": 151977,
+ "<y210>": 152357,
+ "<y211>": 152359,
+ "<y212>": 152361,
+ "<y213>": 152363,
+ "<y214>": 152365,
+ "<y215>": 152367,
+ "<y216>": 152369,
+ "<y217>": 152371,
+ "<y218>": 152373,
+ "<y219>": 152375,
+ "<y21>": 151979,
+ "<y220>": 152377,
+ "<y221>": 152379,
+ "<y222>": 152381,
+ "<y223>": 152383,
+ "<y224>": 152385,
+ "<y225>": 152387,
+ "<y226>": 152389,
+ "<y227>": 152391,
+ "<y228>": 152393,
+ "<y229>": 152395,
+ "<y22>": 151981,
+ "<y230>": 152397,
+ "<y231>": 152399,
+ "<y232>": 152401,
+ "<y233>": 152403,
+ "<y234>": 152405,
+ "<y235>": 152407,
+ "<y236>": 152409,
+ "<y237>": 152411,
+ "<y238>": 152413,
+ "<y239>": 152415,
+ "<y23>": 151983,
+ "<y240>": 152417,
+ "<y241>": 152419,
+ "<y242>": 152421,
+ "<y243>": 152423,
+ "<y244>": 152425,
+ "<y245>": 152427,
+ "<y246>": 152429,
+ "<y247>": 152431,
+ "<y248>": 152433,
+ "<y249>": 152435,
+ "<y24>": 151985,
+ "<y250>": 152437,
+ "<y251>": 152439,
+ "<y252>": 152441,
+ "<y253>": 152443,
+ "<y254>": 152445,
+ "<y255>": 152447,
+ "<y256>": 152449,
+ "<y257>": 152451,
+ "<y258>": 152453,
+ "<y259>": 152455,
+ "<y25>": 151987,
+ "<y260>": 152457,
+ "<y261>": 152459,
+ "<y262>": 152461,
+ "<y263>": 152463,
+ "<y264>": 152465,
+ "<y265>": 152467,
+ "<y266>": 152469,
+ "<y267>": 152471,
+ "<y268>": 152473,
+ "<y269>": 152475,
+ "<y26>": 151989,
+ "<y270>": 152477,
+ "<y271>": 152479,
+ "<y272>": 152481,
+ "<y273>": 152483,
+ "<y274>": 152485,
+ "<y275>": 152487,
+ "<y276>": 152489,
+ "<y277>": 152491,
+ "<y278>": 152493,
+ "<y279>": 152495,
+ "<y27>": 151991,
+ "<y280>": 152497,
+ "<y281>": 152499,
+ "<y282>": 152501,
+ "<y283>": 152503,
+ "<y284>": 152505,
+ "<y285>": 152507,
+ "<y286>": 152509,
+ "<y287>": 152511,
+ "<y288>": 152513,
+ "<y289>": 152515,
+ "<y28>": 151993,
+ "<y290>": 152517,
+ "<y291>": 152519,
+ "<y292>": 152521,
+ "<y293>": 152523,
+ "<y294>": 152525,
+ "<y295>": 152527,
+ "<y296>": 152529,
+ "<y297>": 152531,
+ "<y298>": 152533,
+ "<y299>": 152535,
+ "<y29>": 151995,
+ "<y2>": 151941,
+ "<y300>": 152537,
+ "<y301>": 152539,
+ "<y302>": 152541,
+ "<y303>": 152543,
+ "<y304>": 152545,
+ "<y305>": 152547,
+ "<y306>": 152549,
+ "<y307>": 152551,
+ "<y308>": 152553,
+ "<y309>": 152555,
+ "<y30>": 151997,
+ "<y310>": 152557,
+ "<y311>": 152559,
+ "<y312>": 152561,
+ "<y313>": 152563,
+ "<y314>": 152565,
+ "<y315>": 152567,
+ "<y316>": 152569,
+ "<y317>": 152571,
+ "<y318>": 152573,
+ "<y319>": 152575,
+ "<y31>": 151999,
+ "<y320>": 152577,
+ "<y321>": 152579,
+ "<y322>": 152581,
+ "<y323>": 152583,
+ "<y324>": 152585,
+ "<y325>": 152587,
+ "<y326>": 152589,
+ "<y327>": 152591,
+ "<y328>": 152593,
+ "<y329>": 152595,
+ "<y32>": 152001,
+ "<y330>": 152597,
+ "<y331>": 152599,
+ "<y332>": 152601,
+ "<y333>": 152603,
+ "<y334>": 152605,
+ "<y335>": 152607,
+ "<y336>": 152609,
+ "<y337>": 152611,
+ "<y338>": 152613,
+ "<y339>": 152615,
+ "<y33>": 152003,
+ "<y340>": 152617,
+ "<y341>": 152619,
+ "<y342>": 152621,
+ "<y343>": 152623,
+ "<y344>": 152625,
+ "<y345>": 152627,
+ "<y346>": 152629,
+ "<y347>": 152631,
+ "<y348>": 152633,
+ "<y349>": 152635,
+ "<y34>": 152005,
+ "<y350>": 152637,
+ "<y351>": 152639,
+ "<y352>": 152641,
+ "<y353>": 152643,
+ "<y354>": 152645,
+ "<y355>": 152647,
+ "<y356>": 152649,
+ "<y357>": 152651,
+ "<y358>": 152653,
+ "<y359>": 152655,
+ "<y35>": 152007,
+ "<y360>": 152657,
+ "<y361>": 152659,
+ "<y362>": 152661,
+ "<y363>": 152663,
+ "<y364>": 152665,
2129
+ "<y365>": 152667,
2130
+ "<y366>": 152669,
2131
+ "<y367>": 152671,
2132
+ "<y368>": 152673,
2133
+ "<y369>": 152675,
2134
+ "<y36>": 152009,
2135
+ "<y370>": 152677,
2136
+ "<y371>": 152679,
2137
+ "<y372>": 152681,
2138
+ "<y373>": 152683,
2139
+ "<y374>": 152685,
2140
+ "<y375>": 152687,
2141
+ "<y376>": 152689,
2142
+ "<y377>": 152691,
2143
+ "<y378>": 152693,
2144
+ "<y379>": 152695,
2145
+ "<y37>": 152011,
2146
+ "<y380>": 152697,
2147
+ "<y381>": 152699,
2148
+ "<y382>": 152701,
2149
+ "<y383>": 152703,
2150
+ "<y384>": 152705,
2151
+ "<y385>": 152707,
2152
+ "<y386>": 152709,
2153
+ "<y387>": 152711,
2154
+ "<y388>": 152713,
2155
+ "<y389>": 152715,
2156
+ "<y38>": 152013,
2157
+ "<y390>": 152717,
2158
+ "<y391>": 152719,
2159
+ "<y392>": 152721,
2160
+ "<y393>": 152723,
2161
+ "<y394>": 152725,
2162
+ "<y395>": 152727,
2163
+ "<y396>": 152729,
2164
+ "<y397>": 152731,
2165
+ "<y398>": 152733,
2166
+ "<y399>": 152735,
2167
+ "<y39>": 152015,
2168
+ "<y3>": 151943,
2169
+ "<y400>": 152737,
2170
+ "<y401>": 152739,
2171
+ "<y402>": 152741,
2172
+ "<y403>": 152743,
2173
+ "<y404>": 152745,
2174
+ "<y405>": 152747,
2175
+ "<y406>": 152749,
2176
+ "<y407>": 152751,
2177
+ "<y408>": 152753,
2178
+ "<y409>": 152755,
2179
+ "<y40>": 152017,
2180
+ "<y410>": 152757,
2181
+ "<y411>": 152759,
2182
+ "<y412>": 152761,
2183
+ "<y413>": 152763,
2184
+ "<y414>": 152765,
2185
+ "<y415>": 152767,
2186
+ "<y416>": 152769,
2187
+ "<y417>": 152771,
2188
+ "<y418>": 152773,
2189
+ "<y419>": 152775,
2190
+ "<y41>": 152019,
2191
+ "<y420>": 152777,
2192
+ "<y421>": 152779,
2193
+ "<y422>": 152781,
2194
+ "<y423>": 152783,
2195
+ "<y424>": 152785,
2196
+ "<y425>": 152787,
2197
+ "<y426>": 152789,
2198
+ "<y427>": 152791,
2199
+ "<y428>": 152793,
2200
+ "<y429>": 152795,
2201
+ "<y42>": 152021,
2202
+ "<y430>": 152797,
2203
+ "<y431>": 152799,
2204
+ "<y432>": 152801,
2205
+ "<y433>": 152803,
2206
+ "<y434>": 152805,
2207
+ "<y435>": 152807,
2208
+ "<y436>": 152809,
2209
+ "<y437>": 152811,
2210
+ "<y438>": 152813,
2211
+ "<y439>": 152815,
2212
+ "<y43>": 152023,
2213
+ "<y440>": 152817,
2214
+ "<y441>": 152819,
2215
+ "<y442>": 152821,
2216
+ "<y443>": 152823,
2217
+ "<y444>": 152825,
2218
+ "<y445>": 152827,
2219
+ "<y446>": 152829,
2220
+ "<y447>": 152831,
2221
+ "<y448>": 152833,
2222
+ "<y449>": 152835,
2223
+ "<y44>": 152025,
2224
+ "<y450>": 152837,
2225
+ "<y451>": 152839,
2226
+ "<y452>": 152841,
2227
+ "<y453>": 152843,
2228
+ "<y454>": 152845,
2229
+ "<y455>": 152847,
2230
+ "<y456>": 152849,
2231
+ "<y457>": 152851,
2232
+ "<y458>": 152853,
2233
+ "<y459>": 152855,
2234
+ "<y45>": 152027,
2235
+ "<y460>": 152857,
2236
+ "<y461>": 152859,
2237
+ "<y462>": 152861,
2238
+ "<y463>": 152863,
2239
+ "<y464>": 152865,
2240
+ "<y465>": 152867,
2241
+ "<y466>": 152869,
2242
+ "<y467>": 152871,
2243
+ "<y468>": 152873,
2244
+ "<y469>": 152875,
2245
+ "<y46>": 152029,
2246
+ "<y470>": 152877,
2247
+ "<y471>": 152879,
2248
+ "<y472>": 152881,
2249
+ "<y473>": 152883,
2250
+ "<y474>": 152885,
2251
+ "<y475>": 152887,
2252
+ "<y476>": 152889,
2253
+ "<y477>": 152891,
2254
+ "<y478>": 152893,
2255
+ "<y479>": 152895,
2256
+ "<y47>": 152031,
2257
+ "<y480>": 152897,
2258
+ "<y481>": 152899,
2259
+ "<y482>": 152901,
2260
+ "<y483>": 152903,
2261
+ "<y484>": 152905,
2262
+ "<y485>": 152907,
2263
+ "<y486>": 152909,
2264
+ "<y487>": 152911,
2265
+ "<y488>": 152913,
2266
+ "<y489>": 152915,
2267
+ "<y48>": 152033,
2268
+ "<y490>": 152917,
2269
+ "<y491>": 152919,
2270
+ "<y492>": 152921,
2271
+ "<y493>": 152923,
2272
+ "<y494>": 152925,
2273
+ "<y495>": 152927,
2274
+ "<y496>": 152929,
2275
+ "<y497>": 152931,
2276
+ "<y498>": 152933,
2277
+ "<y499>": 152935,
2278
+ "<y49>": 152035,
2279
+ "<y4>": 151945,
2280
+ "<y500>": 152937,
2281
+ "<y501>": 152939,
2282
+ "<y502>": 152941,
2283
+ "<y503>": 152943,
2284
+ "<y504>": 152945,
2285
+ "<y505>": 152947,
2286
+ "<y506>": 152949,
2287
+ "<y507>": 152951,
2288
+ "<y508>": 152953,
2289
+ "<y509>": 152955,
2290
+ "<y50>": 152037,
2291
+ "<y510>": 152957,
2292
+ "<y511>": 152959,
2293
+ "<y512>": 152961,
2294
+ "<y513>": 152963,
2295
+ "<y514>": 152965,
2296
+ "<y515>": 152967,
2297
+ "<y516>": 152969,
2298
+ "<y517>": 152971,
2299
+ "<y518>": 152973,
2300
+ "<y519>": 152975,
2301
+ "<y51>": 152039,
2302
+ "<y520>": 152977,
2303
+ "<y521>": 152979,
2304
+ "<y522>": 152981,
2305
+ "<y523>": 152983,
2306
+ "<y524>": 152985,
2307
+ "<y525>": 152987,
2308
+ "<y526>": 152989,
2309
+ "<y527>": 152991,
2310
+ "<y528>": 152993,
2311
+ "<y529>": 152995,
2312
+ "<y52>": 152041,
2313
+ "<y530>": 152997,
2314
+ "<y531>": 152999,
2315
+ "<y532>": 153001,
2316
+ "<y533>": 153003,
2317
+ "<y534>": 153005,
2318
+ "<y535>": 153007,
2319
+ "<y536>": 153009,
2320
+ "<y537>": 153011,
2321
+ "<y538>": 153013,
2322
+ "<y539>": 153015,
2323
+ "<y53>": 152043,
2324
+ "<y540>": 153017,
2325
+ "<y541>": 153019,
2326
+ "<y542>": 153021,
2327
+ "<y543>": 153023,
2328
+ "<y544>": 153025,
2329
+ "<y545>": 153027,
2330
+ "<y546>": 153029,
2331
+ "<y547>": 153031,
2332
+ "<y548>": 153033,
2333
+ "<y549>": 153035,
2334
+ "<y54>": 152045,
2335
+ "<y550>": 153037,
2336
+ "<y551>": 153039,
2337
+ "<y552>": 153041,
2338
+ "<y553>": 153043,
2339
+ "<y554>": 153045,
2340
+ "<y555>": 153047,
2341
+ "<y556>": 153049,
2342
+ "<y557>": 153051,
2343
+ "<y558>": 153053,
2344
+ "<y559>": 153055,
2345
+ "<y55>": 152047,
2346
+ "<y560>": 153057,
2347
+ "<y561>": 153059,
2348
+ "<y562>": 153061,
2349
+ "<y563>": 153063,
2350
+ "<y564>": 153065,
2351
+ "<y565>": 153067,
2352
+ "<y566>": 153069,
2353
+ "<y567>": 153071,
2354
+ "<y568>": 153073,
2355
+ "<y569>": 153075,
2356
+ "<y56>": 152049,
2357
+ "<y570>": 153077,
2358
+ "<y571>": 153079,
2359
+ "<y572>": 153081,
2360
+ "<y573>": 153083,
2361
+ "<y574>": 153085,
2362
+ "<y575>": 153087,
2363
+ "<y576>": 153089,
2364
+ "<y577>": 153091,
2365
+ "<y578>": 153093,
2366
+ "<y579>": 153095,
2367
+ "<y57>": 152051,
2368
+ "<y580>": 153097,
2369
+ "<y581>": 153099,
2370
+ "<y582>": 153101,
2371
+ "<y583>": 153103,
2372
+ "<y584>": 153105,
2373
+ "<y585>": 153107,
2374
+ "<y586>": 153109,
2375
+ "<y587>": 153111,
2376
+ "<y588>": 153113,
2377
+ "<y589>": 153115,
2378
+ "<y58>": 152053,
2379
+ "<y590>": 153117,
2380
+ "<y591>": 153119,
2381
+ "<y592>": 153121,
2382
+ "<y593>": 153123,
2383
+ "<y594>": 153125,
2384
+ "<y595>": 153127,
2385
+ "<y596>": 153129,
2386
+ "<y597>": 153131,
2387
+ "<y598>": 153133,
2388
+ "<y599>": 153135,
2389
+ "<y59>": 152055,
2390
+ "<y5>": 151947,
2391
+ "<y600>": 153137,
2392
+ "<y601>": 153139,
2393
+ "<y602>": 153141,
2394
+ "<y603>": 153143,
2395
+ "<y604>": 153145,
2396
+ "<y605>": 153147,
2397
+ "<y606>": 153149,
2398
+ "<y607>": 153151,
2399
+ "<y608>": 153153,
2400
+ "<y609>": 153155,
2401
+ "<y60>": 152057,
2402
+ "<y610>": 153157,
2403
+ "<y611>": 153159,
2404
+ "<y612>": 153161,
2405
+ "<y613>": 153163,
2406
+ "<y614>": 153165,
2407
+ "<y615>": 153167,
2408
+ "<y616>": 153169,
2409
+ "<y617>": 153171,
2410
+ "<y618>": 153173,
2411
+ "<y619>": 153175,
2412
+ "<y61>": 152059,
2413
+ "<y620>": 153177,
2414
+ "<y621>": 153179,
2415
+ "<y622>": 153181,
2416
+ "<y623>": 153183,
2417
+ "<y624>": 153185,
2418
+ "<y625>": 153187,
2419
+ "<y626>": 153189,
2420
+ "<y627>": 153191,
2421
+ "<y628>": 153193,
2422
+ "<y629>": 153195,
2423
+ "<y62>": 152061,
2424
+ "<y630>": 153197,
2425
+ "<y631>": 153199,
2426
+ "<y632>": 153201,
2427
+ "<y633>": 153203,
2428
+ "<y634>": 153205,
2429
+ "<y635>": 153207,
2430
+ "<y636>": 153209,
2431
+ "<y637>": 153211,
2432
+ "<y638>": 153213,
2433
+ "<y639>": 153215,
2434
+ "<y63>": 152063,
2435
+ "<y640>": 153217,
2436
+ "<y641>": 153219,
2437
+ "<y642>": 153221,
2438
+ "<y643>": 153223,
2439
+ "<y644>": 153225,
2440
+ "<y645>": 153227,
2441
+ "<y646>": 153229,
2442
+ "<y647>": 153231,
2443
+ "<y648>": 153233,
2444
+ "<y649>": 153235,
2445
+ "<y64>": 152065,
2446
+ "<y650>": 153237,
2447
+ "<y651>": 153239,
2448
+ "<y652>": 153241,
2449
+ "<y653>": 153243,
2450
+ "<y654>": 153245,
2451
+ "<y655>": 153247,
2452
+ "<y656>": 153249,
2453
+ "<y657>": 153251,
2454
+ "<y658>": 153253,
2455
+ "<y659>": 153255,
2456
+ "<y65>": 152067,
2457
+ "<y660>": 153257,
2458
+ "<y661>": 153259,
2459
+ "<y662>": 153261,
2460
+ "<y663>": 153263,
2461
+ "<y664>": 153265,
2462
+ "<y665>": 153267,
2463
+ "<y666>": 153269,
2464
+ "<y667>": 153271,
2465
+ "<y668>": 153273,
2466
+ "<y669>": 153275,
2467
+ "<y66>": 152069,
2468
+ "<y670>": 153277,
2469
+ "<y671>": 153279,
2470
+ "<y672>": 153281,
2471
+ "<y673>": 153283,
2472
+ "<y674>": 153285,
2473
+ "<y675>": 153287,
2474
+ "<y676>": 153289,
2475
+ "<y677>": 153291,
2476
+ "<y678>": 153293,
2477
+ "<y679>": 153295,
2478
+ "<y67>": 152071,
2479
+ "<y680>": 153297,
2480
+ "<y681>": 153299,
2481
+ "<y682>": 153301,
2482
+ "<y683>": 153303,
2483
+ "<y684>": 153305,
2484
+ "<y685>": 153307,
2485
+ "<y686>": 153309,
2486
+ "<y687>": 153311,
2487
+ "<y688>": 153313,
2488
+ "<y689>": 153315,
2489
+ "<y68>": 152073,
2490
+ "<y690>": 153317,
2491
+ "<y691>": 153319,
2492
+ "<y692>": 153321,
2493
+ "<y693>": 153323,
2494
+ "<y694>": 153325,
2495
+ "<y695>": 153327,
2496
+ "<y696>": 153329,
2497
+ "<y697>": 153331,
2498
+ "<y698>": 153333,
2499
+ "<y699>": 153335,
2500
+ "<y69>": 152075,
2501
+ "<y6>": 151949,
2502
+ "<y700>": 153337,
2503
+ "<y701>": 153339,
2504
+ "<y702>": 153341,
2505
+ "<y703>": 153343,
2506
+ "<y704>": 153345,
2507
+ "<y705>": 153347,
2508
+ "<y706>": 153349,
2509
+ "<y707>": 153351,
2510
+ "<y708>": 153353,
2511
+ "<y709>": 153355,
2512
+ "<y70>": 152077,
2513
+ "<y710>": 153357,
2514
+ "<y711>": 153359,
2515
+ "<y712>": 153361,
2516
+ "<y713>": 153363,
2517
+ "<y714>": 153365,
2518
+ "<y715>": 153367,
2519
+ "<y716>": 153369,
2520
+ "<y717>": 153371,
2521
+ "<y718>": 153373,
2522
+ "<y719>": 153375,
2523
+ "<y71>": 152079,
2524
+ "<y720>": 153377,
2525
+ "<y721>": 153379,
2526
+ "<y722>": 153381,
2527
+ "<y723>": 153383,
2528
+ "<y724>": 153385,
2529
+ "<y725>": 153387,
2530
+ "<y726>": 153389,
2531
+ "<y727>": 153391,
2532
+ "<y728>": 153393,
2533
+ "<y729>": 153395,
2534
+ "<y72>": 152081,
2535
+ "<y730>": 153397,
2536
+ "<y731>": 153399,
2537
+ "<y732>": 153401,
2538
+ "<y733>": 153403,
2539
+ "<y734>": 153405,
2540
+ "<y735>": 153407,
2541
+ "<y736>": 153409,
2542
+ "<y737>": 153411,
2543
+ "<y738>": 153413,
2544
+ "<y739>": 153415,
2545
+ "<y73>": 152083,
2546
+ "<y740>": 153417,
2547
+ "<y741>": 153419,
2548
+ "<y742>": 153421,
2549
+ "<y743>": 153423,
2550
+ "<y744>": 153425,
2551
+ "<y745>": 153427,
2552
+ "<y746>": 153429,
2553
+ "<y747>": 153431,
2554
+ "<y748>": 153433,
2555
+ "<y749>": 153435,
2556
+ "<y74>": 152085,
2557
+ "<y750>": 153437,
2558
+ "<y751>": 153439,
2559
+ "<y752>": 153441,
2560
+ "<y753>": 153443,
2561
+ "<y754>": 153445,
2562
+ "<y755>": 153447,
2563
+ "<y756>": 153449,
2564
+ "<y757>": 153451,
2565
+ "<y758>": 153453,
2566
+ "<y759>": 153455,
2567
+ "<y75>": 152087,
2568
+ "<y760>": 153457,
2569
+ "<y761>": 153459,
2570
+ "<y762>": 153461,
2571
+ "<y763>": 153463,
2572
+ "<y764>": 153465,
2573
+ "<y765>": 153467,
2574
+ "<y766>": 153469,
2575
+ "<y767>": 153471,
2576
+ "<y768>": 153473,
2577
+ "<y769>": 153475,
2578
+ "<y76>": 152089,
2579
+ "<y770>": 153477,
2580
+ "<y771>": 153479,
2581
+ "<y772>": 153481,
2582
+ "<y773>": 153483,
2583
+ "<y774>": 153485,
2584
+ "<y775>": 153487,
2585
+ "<y776>": 153489,
2586
+ "<y777>": 153491,
2587
+ "<y778>": 153493,
2588
+ "<y779>": 153495,
2589
+ "<y77>": 152091,
2590
+ "<y780>": 153497,
2591
+ "<y781>": 153499,
2592
+ "<y782>": 153501,
2593
+ "<y783>": 153503,
2594
+ "<y784>": 153505,
2595
+ "<y785>": 153507,
2596
+ "<y786>": 153509,
2597
+ "<y787>": 153511,
2598
+ "<y788>": 153513,
2599
+ "<y789>": 153515,
2600
+ "<y78>": 152093,
2601
+ "<y790>": 153517,
2602
+ "<y791>": 153519,
2603
+ "<y792>": 153521,
2604
+ "<y793>": 153523,
2605
+ "<y794>": 153525,
2606
+ "<y795>": 153527,
2607
+ "<y796>": 153529,
2608
+ "<y797>": 153531,
2609
+ "<y798>": 153533,
2610
+ "<y799>": 153535,
2611
+ "<y79>": 152095,
2612
+ "<y7>": 151951,
2613
+ "<y800>": 153537,
2614
+ "<y801>": 153539,
2615
+ "<y802>": 153541,
2616
+ "<y803>": 153543,
2617
+ "<y804>": 153545,
2618
+ "<y805>": 153547,
2619
+ "<y806>": 153549,
2620
+ "<y807>": 153551,
2621
+ "<y808>": 153553,
2622
+ "<y809>": 153555,
2623
+ "<y80>": 152097,
2624
+ "<y810>": 153557,
2625
+ "<y811>": 153559,
2626
+ "<y812>": 153561,
2627
+ "<y813>": 153563,
2628
+ "<y814>": 153565,
2629
+ "<y815>": 153567,
2630
+ "<y816>": 153569,
2631
+ "<y817>": 153571,
2632
+ "<y818>": 153573,
2633
+ "<y819>": 153575,
2634
+ "<y81>": 152099,
2635
+ "<y820>": 153577,
2636
+ "<y821>": 153579,
2637
+ "<y822>": 153581,
2638
+ "<y823>": 153583,
2639
+ "<y824>": 153585,
2640
+ "<y825>": 153587,
2641
+ "<y826>": 153589,
2642
+ "<y827>": 153591,
2643
+ "<y828>": 153593,
2644
+ "<y829>": 153595,
2645
+ "<y82>": 152101,
2646
+ "<y830>": 153597,
2647
+ "<y831>": 153599,
2648
+ "<y832>": 153601,
2649
+ "<y833>": 153603,
2650
+ "<y834>": 153605,
2651
+ "<y835>": 153607,
2652
+ "<y836>": 153609,
2653
+ "<y837>": 153611,
2654
+ "<y838>": 153613,
2655
+ "<y839>": 153615,
2656
+ "<y83>": 152103,
2657
+ "<y840>": 153617,
2658
+ "<y841>": 153619,
2659
+ "<y842>": 153621,
2660
+ "<y843>": 153623,
2661
+ "<y844>": 153625,
2662
+ "<y845>": 153627,
2663
+ "<y846>": 153629,
2664
+ "<y847>": 153631,
2665
+ "<y848>": 153633,
2666
+ "<y849>": 153635,
2667
+ "<y84>": 152105,
2668
+ "<y850>": 153637,
2669
+ "<y851>": 153639,
2670
+ "<y852>": 153641,
2671
+ "<y853>": 153643,
2672
+ "<y854>": 153645,
2673
+ "<y855>": 153647,
2674
+ "<y856>": 153649,
2675
+ "<y857>": 153651,
2676
+ "<y858>": 153653,
2677
+ "<y859>": 153655,
2678
+ "<y85>": 152107,
2679
+ "<y860>": 153657,
2680
+ "<y861>": 153659,
2681
+ "<y862>": 153661,
2682
+ "<y863>": 153663,
2683
+ "<y864>": 153665,
2684
+ "<y865>": 153667,
2685
+ "<y866>": 153669,
2686
+ "<y867>": 153671,
2687
+ "<y868>": 153673,
2688
+ "<y869>": 153675,
2689
+ "<y86>": 152109,
2690
+ "<y870>": 153677,
2691
+ "<y871>": 153679,
2692
+ "<y872>": 153681,
2693
+ "<y873>": 153683,
2694
+ "<y874>": 153685,
2695
+ "<y875>": 153687,
2696
+ "<y876>": 153689,
2697
+ "<y877>": 153691,
2698
+ "<y878>": 153693,
2699
+ "<y879>": 153695,
2700
+ "<y87>": 152111,
2701
+ "<y880>": 153697,
2702
+ "<y881>": 153699,
2703
+ "<y882>": 153701,
2704
+ "<y883>": 153703,
2705
+ "<y884>": 153705,
2706
+ "<y885>": 153707,
2707
+ "<y886>": 153709,
2708
+ "<y887>": 153711,
2709
+ "<y888>": 153713,
2710
+ "<y889>": 153715,
2711
+ "<y88>": 152113,
2712
+ "<y890>": 153717,
2713
+ "<y891>": 153719,
2714
+ "<y892>": 153721,
2715
+ "<y893>": 153723,
2716
+ "<y894>": 153725,
2717
+ "<y895>": 153727,
2718
+ "<y896>": 153729,
2719
+ "<y897>": 153731,
2720
+ "<y898>": 153733,
2721
+ "<y899>": 153735,
2722
+ "<y89>": 152115,
2723
+ "<y8>": 151953,
2724
+ "<y900>": 153737,
2725
+ "<y901>": 153739,
2726
+ "<y902>": 153741,
2727
+ "<y903>": 153743,
2728
+ "<y904>": 153745,
2729
+ "<y905>": 153747,
2730
+ "<y906>": 153749,
2731
+ "<y907>": 153751,
2732
+ "<y908>": 153753,
2733
+ "<y909>": 153755,
2734
+ "<y90>": 152117,
2735
+ "<y910>": 153757,
2736
+ "<y911>": 153759,
2737
+ "<y912>": 153761,
2738
+ "<y913>": 153763,
2739
+ "<y914>": 153765,
2740
+ "<y915>": 153767,
2741
+ "<y916>": 153769,
2742
+ "<y917>": 153771,
2743
+ "<y918>": 153773,
2744
+ "<y919>": 153775,
2745
+ "<y91>": 152119,
2746
+ "<y920>": 153777,
2747
+ "<y921>": 153779,
2748
+ "<y922>": 153781,
2749
+ "<y923>": 153783,
2750
+ "<y924>": 153785,
2751
+ "<y925>": 153787,
2752
+ "<y926>": 153789,
2753
+ "<y927>": 153791,
2754
+ "<y928>": 153793,
2755
+ "<y929>": 153795,
2756
+ "<y92>": 152121,
2757
+ "<y930>": 153797,
2758
+ "<y931>": 153799,
2759
+ "<y932>": 153801,
2760
+ "<y933>": 153803,
2761
+ "<y934>": 153805,
2762
+ "<y935>": 153807,
2763
+ "<y936>": 153809,
2764
+ "<y937>": 153811,
2765
+ "<y938>": 153813,
2766
+ "<y939>": 153815,
2767
+ "<y93>": 152123,
2768
+ "<y940>": 153817,
2769
+ "<y941>": 153819,
2770
+ "<y942>": 153821,
2771
+ "<y943>": 153823,
2772
+ "<y944>": 153825,
2773
+ "<y945>": 153827,
2774
+ "<y946>": 153829,
2775
+ "<y947>": 153831,
2776
+ "<y948>": 153833,
2777
+ "<y949>": 153835,
2778
+ "<y94>": 152125,
2779
+ "<y950>": 153837,
2780
+ "<y951>": 153839,
2781
+ "<y952>": 153841,
2782
+ "<y953>": 153843,
2783
+ "<y954>": 153845,
2784
+ "<y955>": 153847,
2785
+ "<y956>": 153849,
2786
+ "<y957>": 153851,
2787
+ "<y958>": 153853,
2788
+ "<y959>": 153855,
2789
+ "<y95>": 152127,
2790
+ "<y960>": 153857,
2791
+ "<y961>": 153859,
2792
+ "<y962>": 153861,
2793
+ "<y963>": 153863,
2794
+ "<y964>": 153865,
2795
+ "<y965>": 153867,
2796
+ "<y966>": 153869,
2797
+ "<y967>": 153871,
2798
+ "<y968>": 153873,
2799
+ "<y969>": 153875,
2800
+ "<y96>": 152129,
2801
+ "<y970>": 153877,
2802
+ "<y971>": 153879,
2803
+ "<y972>": 153881,
2804
+ "<y973>": 153883,
2805
+ "<y974>": 153885,
2806
+ "<y975>": 153887,
2807
+ "<y976>": 153889,
2808
+ "<y977>": 153891,
2809
+ "<y978>": 153893,
2810
+ "<y979>": 153895,
2811
+ "<y97>": 152131,
2812
+ "<y980>": 153897,
2813
+ "<y981>": 153899,
2814
+ "<y982>": 153901,
2815
+ "<y983>": 153903,
2816
+ "<y984>": 153905,
2817
+ "<y985>": 153907,
2818
+ "<y986>": 153909,
2819
+ "<y987>": 153911,
2820
+ "<y988>": 153913,
2821
+ "<y989>": 153915,
2822
+ "<y98>": 152133,
2823
+ "<y990>": 153917,
2824
+ "<y991>": 153919,
2825
+ "<y992>": 153921,
2826
+ "<y993>": 153923,
2827
+ "<y994>": 153925,
2828
+ "<y995>": 153927,
2829
+ "<y996>": 153929,
2830
+ "<y997>": 153931,
2831
+ "<y998>": 153933,
2832
+ "<y999>": 153935,
2833
+ "<y99>": 152135,
2834
+ "<y9>": 151955,
2835
+ "<|box_end|>": 151649,
2836
+ "<|box_start|>": 151648,
2837
+ "<|endoftext|>": 151643,
2838
+ "<|file_sep|>": 151664,
2839
+ "<|fim_middle|>": 151660,
2840
+ "<|fim_pad|>": 151662,
2841
+ "<|fim_prefix|>": 151659,
2842
+ "<|fim_suffix|>": 151661,
2843
+ "<|im_end|>": 151645,
2844
+ "<|im_start|>": 151644,
2845
+ "<|image_pad|>": 151655,
2846
+ "<|object_ref_end|>": 151647,
2847
+ "<|object_ref_start|>": 151646,
2848
+ "<|quad_end|>": 151651,
2849
+ "<|quad_start|>": 151650,
2850
+ "<|repo_name|>": 151663,
2851
+ "<|video_pad|>": 151656,
2852
+ "<|vision_end|>": 151653,
2853
+ "<|vision_pad|>": 151654,
2854
+ "<|vision_start|>": 151652
2855
+ }
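
The `<yN>` entries above read as a grid of coordinate special tokens appended to the base Qwen vocabulary; the IDs advance by 2, which suggests a companion token (presumably an `<xN>` counterpart) occupies each intervening slot. A minimal sketch of how `added_tokens.json` is consumed, assuming a local checkout of this repo and a recent `transformers`:

```python
from transformers import AutoTokenizer

# Local path to this repo is an assumption; swap in the hub repo id if preferred.
tok = AutoTokenizer.from_pretrained(".")

# Each special token above should round-trip to exactly the ID listed in added_tokens.json.
print(tok.convert_tokens_to_ids(["<y1>", "<y107>", "<|im_start|>"]))
# expected per the mapping above: [151939, 152151, 151644]

# Added tokens are atomic: they tokenize to a single piece, not character fragments.
print(tok.tokenize("<y107>"))  # -> ['<y107>']
```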
chat_template.jinja ADDED
@@ -0,0 +1,120 @@
1
+ {%- if tools %}
2
+ {{- '<|im_start|>system\n' }}
3
+ {%- if messages[0].role == 'system' %}
4
+ {%- if messages[0].content is string %}
5
+ {{- messages[0].content }}
6
+ {%- else %}
7
+ {%- for content in messages[0].content %}
8
+ {%- if 'text' in content %}
9
+ {{- content.text }}
10
+ {%- endif %}
11
+ {%- endfor %}
12
+ {%- endif %}
13
+ {{- '\n\n' }}
14
+ {%- endif %}
15
+ {{- "# Tools\n\nYou may call one or more functions to assist with the user query.\n\nYou are provided with function signatures within <tools></tools> XML tags:\n<tools>" }}
16
+ {%- for tool in tools %}
17
+ {{- "\n" }}
18
+ {{- tool | tojson }}
19
+ {%- endfor %}
20
+ {{- "\n</tools>\n\nFor each function call, return a json object with function name and arguments within <tool_call></tool_call> XML tags:\n<tool_call>\n{\"name\": <function-name>, \"arguments\": <args-json-object>}\n</tool_call><|im_end|>\n" }}
21
+ {%- else %}
22
+ {%- if messages[0].role == 'system' %}
23
+ {{- '<|im_start|>system\n' }}
24
+ {%- if messages[0].content is string %}
25
+ {{- messages[0].content }}
26
+ {%- else %}
27
+ {%- for content in messages[0].content %}
28
+ {%- if 'text' in content %}
29
+ {{- content.text }}
30
+ {%- endif %}
31
+ {%- endfor %}
32
+ {%- endif %}
33
+ {{- '<|im_end|>\n' }}
34
+ {%- endif %}
35
+ {%- endif %}
36
+ {%- set image_count = namespace(value=0) %}
37
+ {%- set video_count = namespace(value=0) %}
38
+ {%- for message in messages %}
39
+ {%- if message.role == "user" %}
40
+ {{- '<|im_start|>' + message.role + '\n' }}
41
+ {%- if message.content is string %}
42
+ {{- message.content }}
43
+ {%- else %}
44
+ {%- for content in message.content %}
45
+ {%- if content.type == 'image' or 'image' in content or 'image_url' in content %}
46
+ {%- set image_count.value = image_count.value + 1 %}
47
+ {%- if add_vision_id %}Picture {{ image_count.value }}: {% endif -%}
48
+ <|vision_start|><|image_pad|><|vision_end|>
49
+ {%- elif content.type == 'video' or 'video' in content %}
50
+ {%- set video_count.value = video_count.value + 1 %}
51
+ {%- if add_vision_id %}Video {{ video_count.value }}: {% endif -%}
52
+ <|vision_start|><|video_pad|><|vision_end|>
53
+ {%- elif 'text' in content %}
54
+ {{- content.text }}
55
+ {%- endif %}
56
+ {%- endfor %}
57
+ {%- endif %}
58
+ {{- '<|im_end|>\n' }}
59
+ {%- elif message.role == "assistant" %}
60
+ {{- '<|im_start|>' + message.role + '\n' }}
61
+ {%- if message.content is string %}
62
+ {{- message.content }}
63
+ {%- else %}
64
+ {%- for content_item in message.content %}
65
+ {%- if 'text' in content_item %}
66
+ {{- content_item.text }}
67
+ {%- endif %}
68
+ {%- endfor %}
69
+ {%- endif %}
70
+ {%- if message.tool_calls %}
71
+ {%- for tool_call in message.tool_calls %}
72
+ {%- if (loop.first and message.content) or (not loop.first) %}
73
+ {{- '\n' }}
74
+ {%- endif %}
75
+ {%- if tool_call.function %}
76
+ {%- set tool_call = tool_call.function %}
77
+ {%- endif %}
78
+ {{- '<tool_call>\n{"name": "' }}
79
+ {{- tool_call.name }}
80
+ {{- '", "arguments": ' }}
81
+ {%- if tool_call.arguments is string %}
82
+ {{- tool_call.arguments }}
83
+ {%- else %}
84
+ {{- tool_call.arguments | tojson }}
85
+ {%- endif %}
86
+ {{- '}\n</tool_call>' }}
87
+ {%- endfor %}
88
+ {%- endif %}
89
+ {{- '<|im_end|>\n' }}
90
+ {%- elif message.role == "tool" %}
91
+ {%- if loop.first or (messages[loop.index0 - 1].role != "tool") %}
92
+ {{- '<|im_start|>user' }}
93
+ {%- endif %}
94
+ {{- '\n<tool_response>\n' }}
95
+ {%- if message.content is string %}
96
+ {{- message.content }}
97
+ {%- else %}
98
+ {%- for content in message.content %}
99
+ {%- if content.type == 'image' or 'image' in content or 'image_url' in content %}
100
+ {%- set image_count.value = image_count.value + 1 %}
101
+ {%- if add_vision_id %}Picture {{ image_count.value }}: {% endif -%}
102
+ <|vision_start|><|image_pad|><|vision_end|>
103
+ {%- elif content.type == 'video' or 'video' in content %}
104
+ {%- set video_count.value = video_count.value + 1 %}
105
+ {%- if add_vision_id %}Video {{ video_count.value }}: {% endif -%}
106
+ <|vision_start|><|video_pad|><|vision_end|>
107
+ {%- elif 'text' in content %}
108
+ {{- content.text }}
109
+ {%- endif %}
110
+ {%- endfor %}
111
+ {%- endif %}
112
+ {{- '\n</tool_response>' }}
113
+ {%- if loop.last or (messages[loop.index0 + 1].role != "tool") %}
114
+ {{- '<|im_end|>\n' }}
115
+ {%- endif %}
116
+ {%- endif %}
117
+ {%- endfor %}
118
+ {%- if add_generation_prompt %}
119
+ {{- '<|im_start|>assistant\n' }}
120
+ {%- endif %}
config.json ADDED
@@ -0,0 +1,67 @@
1
+ {
2
+ "architectures": [
3
+ "Qwen3VLForConditionalGeneration"
4
+ ],
5
+ "dtype": "bfloat16",
6
+ "eos_token_id": 151645,
7
+ "image_token_id": 151655,
8
+ "model_type": "qwen3_vl",
9
+ "pad_token_id": 151643,
10
+ "text_config": {
11
+ "attention_bias": false,
12
+ "attention_dropout": 0.0,
13
+ "bos_token_id": 151643,
14
+ "dtype": "bfloat16",
15
+ "eos_token_id": 151645,
16
+ "head_dim": 128,
17
+ "hidden_act": "silu",
18
+ "hidden_size": 4096,
19
+ "initializer_range": 0.02,
20
+ "intermediate_size": 12288,
21
+ "max_position_embeddings": 262144,
22
+ "model_type": "qwen3_vl_text",
23
+ "num_attention_heads": 32,
24
+ "num_hidden_layers": 36,
25
+ "num_key_value_heads": 8,
26
+ "rms_norm_eps": 1e-06,
27
+ "rope_scaling": {
28
+ "mrope_interleaved": true,
29
+ "mrope_section": [
30
+ 24,
31
+ 20,
32
+ 20
33
+ ],
34
+ "rope_type": "default"
35
+ },
36
+ "rope_theta": 5000000,
37
+ "use_cache": true,
38
+ "vocab_size": 154496
39
+ },
40
+ "tie_word_embeddings": false,
41
+ "transformers_version": "4.57.5",
42
+ "use_cache": true,
43
+ "video_token_id": 151656,
44
+ "vision_config": {
45
+ "deepstack_visual_indexes": [
46
+ 8,
47
+ 16,
48
+ 24
49
+ ],
50
+ "depth": 27,
51
+ "dtype": "bfloat16",
52
+ "hidden_act": "gelu_pytorch_tanh",
53
+ "hidden_size": 1152,
54
+ "in_channels": 3,
55
+ "initializer_range": 0.02,
56
+ "intermediate_size": 4304,
57
+ "model_type": "qwen3_vl",
58
+ "num_heads": 16,
59
+ "num_position_embeddings": 2304,
60
+ "out_hidden_size": 4096,
61
+ "patch_size": 16,
62
+ "spatial_merge_size": 2,
63
+ "temporal_patch_size": 2
64
+ },
65
+ "vision_end_token_id": 151653,
66
+ "vision_start_token_id": 151652
67
+ }
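
A few sanity checks read straight off this config: the text stack is a 36-layer, 4096-wide decoder with grouped-query attention (32 query heads over 8 KV heads), and the vision tower's 16-pixel patches are merged 2x2 before projection into the 4096-dim text space. A small sketch, assuming `config.json` sits in the working directory:

```python
import json

with open("config.json") as f:
    cfg = json.load(f)

text, vision = cfg["text_config"], cfg["vision_config"]

# Grouped-query attention: 4 query heads share each KV head.
print(text["num_attention_heads"] // text["num_key_value_heads"])   # -> 4

# 32 heads x 128 head_dim recovers the 4096 hidden size exactly.
print(text["num_attention_heads"] * text["head_dim"])               # -> 4096

# Effective spatial stride of the vision tower: 16 px patches merged 2x2 -> 32 px.
print(vision["patch_size"] * vision["spatial_merge_size"])          # -> 32
```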
generation_config.json ADDED
@@ -0,0 +1,13 @@
1
+ {
2
+ "do_sample": true,
3
+ "eos_token_id": [
4
+ 151645,
5
+ 151645,
6
+ 151643
7
+ ],
8
+ "pad_token_id": 151643,
9
+ "temperature": 0.7,
10
+ "top_k": 20,
11
+ "top_p": 0.8,
12
+ "transformers_version": "4.57.5"
13
+ }
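
These are the sampling defaults that `generate()` picks up automatically; they can also be overridden per call. A sketch that mirrors the file, under the assumption that a model and inputs are prepared elsewhere:

```python
from transformers import GenerationConfig

# Mirror the defaults recorded in generation_config.json above.
gen_cfg = GenerationConfig(
    do_sample=True,
    temperature=0.7,
    top_k=20,
    top_p=0.8,
    eos_token_id=[151645, 151643],  # the file lists 151645 twice; once suffices
    pad_token_id=151643,
)

# Hypothetical usage, assuming `model` and `inputs` are already built:
# outputs = model.generate(**inputs, generation_config=gen_cfg, max_new_tokens=256)
print(gen_cfg.temperature, gen_cfg.top_k, gen_cfg.top_p)
```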
merges.txt ADDED
The diff for this file is too large to render. See raw diff
model-00001-of-00004.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:067f6d0cb7bda5a67b2f2b86152750eeddf749f4740e960fea7ff4c2c908b839
3
+ size 4918364648
model-00002-of-00004.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:e2a3dcd328fbf7fedb6afc31d026b8c0e3e2ca8cbb58173e370fe6e6cec8b74a
3
+ size 4915962464
model-00003-of-00004.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:fcdae57aa7edfbb84f428bc7598150653a44daf448c2a94ee7fd674c3fac4f17
3
+ size 4983070600
model-00004-of-00004.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:947597fa168ea34d1e43d217291d32e4998e074f23205c7cdb6eeef1afd7c790
3
+ size 2758884832
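
Each three-line block above is a Git LFS pointer, not the weight file itself: the `oid sha256:` line records the digest of the multi-gigabyte payload fetched on checkout. A stdlib-only sketch for verifying a downloaded shard against its pointer (the local path is an assumption):

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a large file and return its hex SHA-256 digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk_size):
            digest.update(block)
    return digest.hexdigest()

# Digest copied from the LFS pointer for shard 4 of 4 above.
expected = "947597fa168ea34d1e43d217291d32e4998e074f23205c7cdb6eeef1afd7c790"
assert sha256_of("model-00004-of-00004.safetensors") == expected, \
    "shard corrupted, truncated, or still an un-smudged LFS pointer"
```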
model.safetensors.index.json ADDED
@@ -0,0 +1,758 @@
1
+ {
2
+ "metadata": {
3
+ "total_parameters": 8788095216,
4
+ "total_size": 17576190432
5
+ },
6
+ "weight_map": {
7
+ "lm_head.weight": "model-00004-of-00004.safetensors",
8
+ "model.language_model.embed_tokens.weight": "model-00001-of-00004.safetensors",
9
+ "model.language_model.layers.0.input_layernorm.weight": "model-00001-of-00004.safetensors",
10
+ "model.language_model.layers.0.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
11
+ "model.language_model.layers.0.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
12
+ "model.language_model.layers.0.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
13
+ "model.language_model.layers.0.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
14
+ "model.language_model.layers.0.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
15
+ "model.language_model.layers.0.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
16
+ "model.language_model.layers.0.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
17
+ "model.language_model.layers.0.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
18
+ "model.language_model.layers.0.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
19
+ "model.language_model.layers.0.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
20
+ "model.language_model.layers.1.input_layernorm.weight": "model-00001-of-00004.safetensors",
21
+ "model.language_model.layers.1.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
22
+ "model.language_model.layers.1.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
23
+ "model.language_model.layers.1.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
24
+ "model.language_model.layers.1.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
25
+ "model.language_model.layers.1.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
26
+ "model.language_model.layers.1.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
27
+ "model.language_model.layers.1.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
28
+ "model.language_model.layers.1.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
29
+ "model.language_model.layers.1.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
30
+ "model.language_model.layers.1.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
31
+ "model.language_model.layers.10.input_layernorm.weight": "model-00002-of-00004.safetensors",
32
+ "model.language_model.layers.10.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
33
+ "model.language_model.layers.10.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
34
+ "model.language_model.layers.10.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
35
+ "model.language_model.layers.10.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
36
+ "model.language_model.layers.10.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
37
+ "model.language_model.layers.10.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
38
+ "model.language_model.layers.10.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
39
+ "model.language_model.layers.10.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
40
+ "model.language_model.layers.10.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
41
+ "model.language_model.layers.10.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
42
+ "model.language_model.layers.11.input_layernorm.weight": "model-00002-of-00004.safetensors",
43
+ "model.language_model.layers.11.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
44
+ "model.language_model.layers.11.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
45
+ "model.language_model.layers.11.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
46
+ "model.language_model.layers.11.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
47
+ "model.language_model.layers.11.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
48
+ "model.language_model.layers.11.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
49
+ "model.language_model.layers.11.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
50
+ "model.language_model.layers.11.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
51
+ "model.language_model.layers.11.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
52
+ "model.language_model.layers.11.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
53
+ "model.language_model.layers.12.input_layernorm.weight": "model-00002-of-00004.safetensors",
54
+ "model.language_model.layers.12.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
55
+ "model.language_model.layers.12.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
56
+ "model.language_model.layers.12.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
57
+ "model.language_model.layers.12.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
58
+ "model.language_model.layers.12.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
59
+ "model.language_model.layers.12.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
60
+ "model.language_model.layers.12.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
61
+ "model.language_model.layers.12.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
62
+ "model.language_model.layers.12.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
63
+ "model.language_model.layers.12.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
64
+ "model.language_model.layers.13.input_layernorm.weight": "model-00002-of-00004.safetensors",
65
+ "model.language_model.layers.13.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
66
+ "model.language_model.layers.13.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
67
+ "model.language_model.layers.13.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
68
+ "model.language_model.layers.13.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
69
+ "model.language_model.layers.13.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
70
+ "model.language_model.layers.13.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
71
+ "model.language_model.layers.13.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
72
+ "model.language_model.layers.13.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
73
+ "model.language_model.layers.13.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
74
+ "model.language_model.layers.13.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
75
+ "model.language_model.layers.14.input_layernorm.weight": "model-00002-of-00004.safetensors",
76
+ "model.language_model.layers.14.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
77
+ "model.language_model.layers.14.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
78
+ "model.language_model.layers.14.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
79
+ "model.language_model.layers.14.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
80
+ "model.language_model.layers.14.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
81
+ "model.language_model.layers.14.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
82
+ "model.language_model.layers.14.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
83
+ "model.language_model.layers.14.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
84
+ "model.language_model.layers.14.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
85
+ "model.language_model.layers.14.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
86
+ "model.language_model.layers.15.input_layernorm.weight": "model-00002-of-00004.safetensors",
87
+ "model.language_model.layers.15.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
88
+ "model.language_model.layers.15.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
89
+ "model.language_model.layers.15.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
90
+ "model.language_model.layers.15.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
91
+ "model.language_model.layers.15.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
92
+ "model.language_model.layers.15.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
93
+ "model.language_model.layers.15.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
94
+ "model.language_model.layers.15.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
95
+ "model.language_model.layers.15.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
96
+ "model.language_model.layers.15.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
97
+ "model.language_model.layers.16.input_layernorm.weight": "model-00002-of-00004.safetensors",
98
+ "model.language_model.layers.16.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
99
+ "model.language_model.layers.16.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
100
+ "model.language_model.layers.16.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
101
+ "model.language_model.layers.16.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
102
+ "model.language_model.layers.16.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
103
+ "model.language_model.layers.16.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
104
+ "model.language_model.layers.16.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
105
+ "model.language_model.layers.16.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
106
+ "model.language_model.layers.16.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
107
+ "model.language_model.layers.16.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
108
+ "model.language_model.layers.17.input_layernorm.weight": "model-00002-of-00004.safetensors",
109
+ "model.language_model.layers.17.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
110
+ "model.language_model.layers.17.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
111
+ "model.language_model.layers.17.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
112
+ "model.language_model.layers.17.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
113
+ "model.language_model.layers.17.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
114
+ "model.language_model.layers.17.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
115
+ "model.language_model.layers.17.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
116
+ "model.language_model.layers.17.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
117
+ "model.language_model.layers.17.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
118
+ "model.language_model.layers.17.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
119
+ "model.language_model.layers.18.input_layernorm.weight": "model-00002-of-00004.safetensors",
120
+ "model.language_model.layers.18.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
121
+ "model.language_model.layers.18.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
122
+ "model.language_model.layers.18.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
123
+ "model.language_model.layers.18.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
124
+ "model.language_model.layers.18.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
125
+ "model.language_model.layers.18.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
126
+ "model.language_model.layers.18.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
127
+ "model.language_model.layers.18.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
128
+ "model.language_model.layers.18.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
129
+ "model.language_model.layers.18.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
130
+ "model.language_model.layers.19.input_layernorm.weight": "model-00003-of-00004.safetensors",
131
+ "model.language_model.layers.19.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
132
+ "model.language_model.layers.19.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
133
+ "model.language_model.layers.19.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
134
+ "model.language_model.layers.19.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
135
+ "model.language_model.layers.19.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
136
+ "model.language_model.layers.19.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
137
+ "model.language_model.layers.19.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
138
+ "model.language_model.layers.19.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
139
+ "model.language_model.layers.19.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
140
+ "model.language_model.layers.19.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
141
+ "model.language_model.layers.2.input_layernorm.weight": "model-00001-of-00004.safetensors",
142
+ "model.language_model.layers.2.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
143
+ "model.language_model.layers.2.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
144
+ "model.language_model.layers.2.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
145
+ "model.language_model.layers.2.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
146
+ "model.language_model.layers.2.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
147
+ "model.language_model.layers.2.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
148
+ "model.language_model.layers.2.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
149
+ "model.language_model.layers.2.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
150
+ "model.language_model.layers.2.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
151
+ "model.language_model.layers.2.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
152
+ "model.language_model.layers.20.input_layernorm.weight": "model-00003-of-00004.safetensors",
153
+ "model.language_model.layers.20.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
154
+ "model.language_model.layers.20.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
155
+ "model.language_model.layers.20.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
156
+ "model.language_model.layers.20.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
157
+ "model.language_model.layers.20.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
158
+ "model.language_model.layers.20.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
159
+ "model.language_model.layers.20.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
160
+ "model.language_model.layers.20.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
161
+ "model.language_model.layers.20.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
162
+ "model.language_model.layers.20.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
163
+ "model.language_model.layers.21.input_layernorm.weight": "model-00003-of-00004.safetensors",
164
+ "model.language_model.layers.21.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
165
+ "model.language_model.layers.21.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
166
+ "model.language_model.layers.21.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
167
+ "model.language_model.layers.21.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
168
+ "model.language_model.layers.21.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
169
+ "model.language_model.layers.21.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
170
+ "model.language_model.layers.21.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
171
+ "model.language_model.layers.21.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
172
+ "model.language_model.layers.21.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
173
+ "model.language_model.layers.21.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
174
+ "model.language_model.layers.22.input_layernorm.weight": "model-00003-of-00004.safetensors",
175
+ "model.language_model.layers.22.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
176
+ "model.language_model.layers.22.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
177
+ "model.language_model.layers.22.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
178
+ "model.language_model.layers.22.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
179
+ "model.language_model.layers.22.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
180
+ "model.language_model.layers.22.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
181
+ "model.language_model.layers.22.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
182
+ "model.language_model.layers.22.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
183
+ "model.language_model.layers.22.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
184
+ "model.language_model.layers.22.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
185
+ "model.language_model.layers.23.input_layernorm.weight": "model-00003-of-00004.safetensors",
186
+ "model.language_model.layers.23.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
187
+ "model.language_model.layers.23.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
188
+ "model.language_model.layers.23.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
189
+ "model.language_model.layers.23.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
190
+ "model.language_model.layers.23.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
191
+ "model.language_model.layers.23.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
192
+ "model.language_model.layers.23.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
193
+ "model.language_model.layers.23.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
194
+ "model.language_model.layers.23.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
195
+ "model.language_model.layers.23.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
196
+ "model.language_model.layers.24.input_layernorm.weight": "model-00003-of-00004.safetensors",
197
+ "model.language_model.layers.24.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
198
+ "model.language_model.layers.24.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
199
+ "model.language_model.layers.24.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
200
+ "model.language_model.layers.24.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
201
+ "model.language_model.layers.24.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
202
+ "model.language_model.layers.24.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
203
+ "model.language_model.layers.24.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
204
+ "model.language_model.layers.24.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
205
+ "model.language_model.layers.24.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
206
+ "model.language_model.layers.24.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
207
+ "model.language_model.layers.25.input_layernorm.weight": "model-00003-of-00004.safetensors",
208
+ "model.language_model.layers.25.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
209
+ "model.language_model.layers.25.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
210
+ "model.language_model.layers.25.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
211
+ "model.language_model.layers.25.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
212
+ "model.language_model.layers.25.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
213
+ "model.language_model.layers.25.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
214
+ "model.language_model.layers.25.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
215
+ "model.language_model.layers.25.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
216
+ "model.language_model.layers.25.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
217
+ "model.language_model.layers.25.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
218
+ "model.language_model.layers.26.input_layernorm.weight": "model-00003-of-00004.safetensors",
219
+ "model.language_model.layers.26.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
220
+ "model.language_model.layers.26.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
221
+ "model.language_model.layers.26.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
222
+ "model.language_model.layers.26.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
223
+ "model.language_model.layers.26.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
224
+ "model.language_model.layers.26.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
225
+ "model.language_model.layers.26.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
226
+ "model.language_model.layers.26.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
227
+ "model.language_model.layers.26.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
228
+ "model.language_model.layers.26.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
229
+ "model.language_model.layers.27.input_layernorm.weight": "model-00003-of-00004.safetensors",
230
+ "model.language_model.layers.27.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
231
+ "model.language_model.layers.27.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
232
+ "model.language_model.layers.27.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
233
+ "model.language_model.layers.27.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
234
+ "model.language_model.layers.27.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
235
+ "model.language_model.layers.27.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
236
+ "model.language_model.layers.27.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
237
+ "model.language_model.layers.27.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
238
+ "model.language_model.layers.27.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
239
+ "model.language_model.layers.27.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
240
+ "model.language_model.layers.28.input_layernorm.weight": "model-00003-of-00004.safetensors",
241
+ "model.language_model.layers.28.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
242
+ "model.language_model.layers.28.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
243
+ "model.language_model.layers.28.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
244
+ "model.language_model.layers.28.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
245
+ "model.language_model.layers.28.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
246
+ "model.language_model.layers.28.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
247
+ "model.language_model.layers.28.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
248
+ "model.language_model.layers.28.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
249
+ "model.language_model.layers.28.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
250
+ "model.language_model.layers.28.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
251
+ "model.language_model.layers.29.input_layernorm.weight": "model-00003-of-00004.safetensors",
252
+ "model.language_model.layers.29.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
253
+ "model.language_model.layers.29.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
254
+ "model.language_model.layers.29.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
255
+ "model.language_model.layers.29.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
256
+ "model.language_model.layers.29.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
257
+ "model.language_model.layers.29.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
258
+ "model.language_model.layers.29.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
259
+ "model.language_model.layers.29.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
260
+ "model.language_model.layers.29.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
261
+ "model.language_model.layers.29.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
262
+ "model.language_model.layers.3.input_layernorm.weight": "model-00001-of-00004.safetensors",
263
+ "model.language_model.layers.3.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
264
+ "model.language_model.layers.3.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
265
+ "model.language_model.layers.3.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
266
+ "model.language_model.layers.3.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
267
+ "model.language_model.layers.3.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
268
+ "model.language_model.layers.3.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
269
+ "model.language_model.layers.3.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
270
+ "model.language_model.layers.3.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
271
+ "model.language_model.layers.3.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
272
+ "model.language_model.layers.3.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
273
+ "model.language_model.layers.30.input_layernorm.weight": "model-00003-of-00004.safetensors",
274
+ "model.language_model.layers.30.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
275
+ "model.language_model.layers.30.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
276
+ "model.language_model.layers.30.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
277
+ "model.language_model.layers.30.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
278
+ "model.language_model.layers.30.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
279
+ "model.language_model.layers.30.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
280
+ "model.language_model.layers.30.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
281
+ "model.language_model.layers.30.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
282
+ "model.language_model.layers.30.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
283
+ "model.language_model.layers.30.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
284
+ "model.language_model.layers.31.input_layernorm.weight": "model-00003-of-00004.safetensors",
285
+ "model.language_model.layers.31.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
286
+ "model.language_model.layers.31.mlp.gate_proj.weight": "model-00003-of-00004.safetensors",
287
+ "model.language_model.layers.31.mlp.up_proj.weight": "model-00003-of-00004.safetensors",
288
+ "model.language_model.layers.31.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
289
+ "model.language_model.layers.31.self_attn.k_norm.weight": "model-00003-of-00004.safetensors",
290
+ "model.language_model.layers.31.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
291
+ "model.language_model.layers.31.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
292
+ "model.language_model.layers.31.self_attn.q_norm.weight": "model-00003-of-00004.safetensors",
293
+ "model.language_model.layers.31.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
294
+ "model.language_model.layers.31.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
295
+ "model.language_model.layers.32.input_layernorm.weight": "model-00004-of-00004.safetensors",
296
+ "model.language_model.layers.32.mlp.down_proj.weight": "model-00004-of-00004.safetensors",
297
+ "model.language_model.layers.32.mlp.gate_proj.weight": "model-00004-of-00004.safetensors",
298
+ "model.language_model.layers.32.mlp.up_proj.weight": "model-00004-of-00004.safetensors",
299
+ "model.language_model.layers.32.post_attention_layernorm.weight": "model-00004-of-00004.safetensors",
300
+ "model.language_model.layers.32.self_attn.k_norm.weight": "model-00004-of-00004.safetensors",
301
+ "model.language_model.layers.32.self_attn.k_proj.weight": "model-00003-of-00004.safetensors",
302
+ "model.language_model.layers.32.self_attn.o_proj.weight": "model-00004-of-00004.safetensors",
303
+ "model.language_model.layers.32.self_attn.q_norm.weight": "model-00004-of-00004.safetensors",
304
+ "model.language_model.layers.32.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
305
+ "model.language_model.layers.32.self_attn.v_proj.weight": "model-00003-of-00004.safetensors",
306
+ "model.language_model.layers.33.input_layernorm.weight": "model-00004-of-00004.safetensors",
307
+ "model.language_model.layers.33.mlp.down_proj.weight": "model-00004-of-00004.safetensors",
308
+ "model.language_model.layers.33.mlp.gate_proj.weight": "model-00004-of-00004.safetensors",
309
+ "model.language_model.layers.33.mlp.up_proj.weight": "model-00004-of-00004.safetensors",
310
+ "model.language_model.layers.33.post_attention_layernorm.weight": "model-00004-of-00004.safetensors",
311
+ "model.language_model.layers.33.self_attn.k_norm.weight": "model-00004-of-00004.safetensors",
312
+ "model.language_model.layers.33.self_attn.k_proj.weight": "model-00004-of-00004.safetensors",
313
+ "model.language_model.layers.33.self_attn.o_proj.weight": "model-00004-of-00004.safetensors",
314
+ "model.language_model.layers.33.self_attn.q_norm.weight": "model-00004-of-00004.safetensors",
315
+ "model.language_model.layers.33.self_attn.q_proj.weight": "model-00004-of-00004.safetensors",
316
+ "model.language_model.layers.33.self_attn.v_proj.weight": "model-00004-of-00004.safetensors",
317
+ "model.language_model.layers.34.input_layernorm.weight": "model-00004-of-00004.safetensors",
318
+ "model.language_model.layers.34.mlp.down_proj.weight": "model-00004-of-00004.safetensors",
319
+ "model.language_model.layers.34.mlp.gate_proj.weight": "model-00004-of-00004.safetensors",
320
+ "model.language_model.layers.34.mlp.up_proj.weight": "model-00004-of-00004.safetensors",
321
+ "model.language_model.layers.34.post_attention_layernorm.weight": "model-00004-of-00004.safetensors",
322
+ "model.language_model.layers.34.self_attn.k_norm.weight": "model-00004-of-00004.safetensors",
323
+ "model.language_model.layers.34.self_attn.k_proj.weight": "model-00004-of-00004.safetensors",
324
+ "model.language_model.layers.34.self_attn.o_proj.weight": "model-00004-of-00004.safetensors",
325
+ "model.language_model.layers.34.self_attn.q_norm.weight": "model-00004-of-00004.safetensors",
326
+ "model.language_model.layers.34.self_attn.q_proj.weight": "model-00004-of-00004.safetensors",
327
+ "model.language_model.layers.34.self_attn.v_proj.weight": "model-00004-of-00004.safetensors",
328
+ "model.language_model.layers.35.input_layernorm.weight": "model-00004-of-00004.safetensors",
329
+ "model.language_model.layers.35.mlp.down_proj.weight": "model-00004-of-00004.safetensors",
330
+ "model.language_model.layers.35.mlp.gate_proj.weight": "model-00004-of-00004.safetensors",
331
+ "model.language_model.layers.35.mlp.up_proj.weight": "model-00004-of-00004.safetensors",
332
+ "model.language_model.layers.35.post_attention_layernorm.weight": "model-00004-of-00004.safetensors",
333
+ "model.language_model.layers.35.self_attn.k_norm.weight": "model-00004-of-00004.safetensors",
334
+ "model.language_model.layers.35.self_attn.k_proj.weight": "model-00004-of-00004.safetensors",
335
+ "model.language_model.layers.35.self_attn.o_proj.weight": "model-00004-of-00004.safetensors",
336
+ "model.language_model.layers.35.self_attn.q_norm.weight": "model-00004-of-00004.safetensors",
337
+ "model.language_model.layers.35.self_attn.q_proj.weight": "model-00004-of-00004.safetensors",
338
+ "model.language_model.layers.35.self_attn.v_proj.weight": "model-00004-of-00004.safetensors",
339
+ "model.language_model.layers.4.input_layernorm.weight": "model-00001-of-00004.safetensors",
340
+ "model.language_model.layers.4.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
341
+ "model.language_model.layers.4.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
342
+ "model.language_model.layers.4.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
343
+ "model.language_model.layers.4.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
344
+ "model.language_model.layers.4.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
345
+ "model.language_model.layers.4.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
346
+ "model.language_model.layers.4.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
347
+ "model.language_model.layers.4.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
348
+ "model.language_model.layers.4.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
349
+ "model.language_model.layers.4.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
350
+ "model.language_model.layers.5.input_layernorm.weight": "model-00001-of-00004.safetensors",
351
+ "model.language_model.layers.5.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
352
+ "model.language_model.layers.5.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
353
+ "model.language_model.layers.5.mlp.up_proj.weight": "model-00001-of-00004.safetensors",
354
+ "model.language_model.layers.5.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
355
+ "model.language_model.layers.5.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
356
+ "model.language_model.layers.5.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
357
+ "model.language_model.layers.5.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
358
+ "model.language_model.layers.5.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
359
+ "model.language_model.layers.5.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
360
+ "model.language_model.layers.5.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
361
+ "model.language_model.layers.6.input_layernorm.weight": "model-00002-of-00004.safetensors",
362
+ "model.language_model.layers.6.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
363
+ "model.language_model.layers.6.mlp.gate_proj.weight": "model-00001-of-00004.safetensors",
364
+ "model.language_model.layers.6.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
365
+ "model.language_model.layers.6.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
366
+ "model.language_model.layers.6.self_attn.k_norm.weight": "model-00001-of-00004.safetensors",
367
+ "model.language_model.layers.6.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
368
+ "model.language_model.layers.6.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
369
+ "model.language_model.layers.6.self_attn.q_norm.weight": "model-00001-of-00004.safetensors",
370
+ "model.language_model.layers.6.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
371
+ "model.language_model.layers.6.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
372
+ "model.language_model.layers.7.input_layernorm.weight": "model-00002-of-00004.safetensors",
373
+ "model.language_model.layers.7.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
374
+ "model.language_model.layers.7.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
375
+ "model.language_model.layers.7.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
376
+ "model.language_model.layers.7.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
377
+ "model.language_model.layers.7.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
378
+ "model.language_model.layers.7.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
379
+ "model.language_model.layers.7.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
380
+ "model.language_model.layers.7.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
381
+ "model.language_model.layers.7.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
382
+ "model.language_model.layers.7.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
383
+ "model.language_model.layers.8.input_layernorm.weight": "model-00002-of-00004.safetensors",
384
+ "model.language_model.layers.8.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
385
+ "model.language_model.layers.8.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
386
+ "model.language_model.layers.8.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
387
+ "model.language_model.layers.8.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
388
+ "model.language_model.layers.8.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
389
+ "model.language_model.layers.8.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
390
+ "model.language_model.layers.8.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
391
+ "model.language_model.layers.8.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
392
+ "model.language_model.layers.8.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
393
+ "model.language_model.layers.8.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
394
+ "model.language_model.layers.9.input_layernorm.weight": "model-00002-of-00004.safetensors",
395
+ "model.language_model.layers.9.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
396
+ "model.language_model.layers.9.mlp.gate_proj.weight": "model-00002-of-00004.safetensors",
397
+ "model.language_model.layers.9.mlp.up_proj.weight": "model-00002-of-00004.safetensors",
398
+ "model.language_model.layers.9.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
399
+ "model.language_model.layers.9.self_attn.k_norm.weight": "model-00002-of-00004.safetensors",
400
+ "model.language_model.layers.9.self_attn.k_proj.weight": "model-00002-of-00004.safetensors",
401
+ "model.language_model.layers.9.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
402
+ "model.language_model.layers.9.self_attn.q_norm.weight": "model-00002-of-00004.safetensors",
403
+ "model.language_model.layers.9.self_attn.q_proj.weight": "model-00002-of-00004.safetensors",
404
+ "model.language_model.layers.9.self_attn.v_proj.weight": "model-00002-of-00004.safetensors",
405
+ "model.language_model.norm.weight": "model-00004-of-00004.safetensors",
406
+ "model.visual.blocks.0.attn.proj.bias": "model-00001-of-00004.safetensors",
407
+ "model.visual.blocks.0.attn.proj.weight": "model-00001-of-00004.safetensors",
408
+ "model.visual.blocks.0.attn.qkv.bias": "model-00001-of-00004.safetensors",
409
+ "model.visual.blocks.0.attn.qkv.weight": "model-00001-of-00004.safetensors",
410
+ "model.visual.blocks.0.mlp.linear_fc1.bias": "model-00001-of-00004.safetensors",
411
+ "model.visual.blocks.0.mlp.linear_fc1.weight": "model-00001-of-00004.safetensors",
412
+ "model.visual.blocks.0.mlp.linear_fc2.bias": "model-00001-of-00004.safetensors",
413
+ "model.visual.blocks.0.mlp.linear_fc2.weight": "model-00001-of-00004.safetensors",
414
+ "model.visual.blocks.0.norm1.bias": "model-00001-of-00004.safetensors",
415
+ "model.visual.blocks.0.norm1.weight": "model-00001-of-00004.safetensors",
416
+ "model.visual.blocks.0.norm2.bias": "model-00001-of-00004.safetensors",
417
+ "model.visual.blocks.0.norm2.weight": "model-00001-of-00004.safetensors",
418
+ "model.visual.blocks.1.attn.proj.bias": "model-00001-of-00004.safetensors",
419
+ "model.visual.blocks.1.attn.proj.weight": "model-00001-of-00004.safetensors",
420
+ "model.visual.blocks.1.attn.qkv.bias": "model-00001-of-00004.safetensors",
421
+ "model.visual.blocks.1.attn.qkv.weight": "model-00001-of-00004.safetensors",
422
+ "model.visual.blocks.1.mlp.linear_fc1.bias": "model-00001-of-00004.safetensors",
423
+ "model.visual.blocks.1.mlp.linear_fc1.weight": "model-00001-of-00004.safetensors",
424
+ "model.visual.blocks.1.mlp.linear_fc2.bias": "model-00001-of-00004.safetensors",
425
+ "model.visual.blocks.1.mlp.linear_fc2.weight": "model-00001-of-00004.safetensors",
426
+ "model.visual.blocks.1.norm1.bias": "model-00001-of-00004.safetensors",
427
+ "model.visual.blocks.1.norm1.weight": "model-00001-of-00004.safetensors",
428
+ "model.visual.blocks.1.norm2.bias": "model-00001-of-00004.safetensors",
429
+ "model.visual.blocks.1.norm2.weight": "model-00001-of-00004.safetensors",
430
+ "model.visual.blocks.10.attn.proj.bias": "model-00001-of-00004.safetensors",
431
+ "model.visual.blocks.10.attn.proj.weight": "model-00001-of-00004.safetensors",
432
+ "model.visual.blocks.10.attn.qkv.bias": "model-00001-of-00004.safetensors",
433
+ "model.visual.blocks.10.attn.qkv.weight": "model-00001-of-00004.safetensors",
434
+ "model.visual.blocks.10.mlp.linear_fc1.bias": "model-00001-of-00004.safetensors",
435
+ "model.visual.blocks.10.mlp.linear_fc1.weight": "model-00001-of-00004.safetensors",
436
+ "model.visual.blocks.10.mlp.linear_fc2.bias": "model-00001-of-00004.safetensors",
437
+ "model.visual.blocks.10.mlp.linear_fc2.weight": "model-00001-of-00004.safetensors",
438
+ "model.visual.blocks.10.norm1.bias": "model-00001-of-00004.safetensors",
439
+ "model.visual.blocks.10.norm1.weight": "model-00001-of-00004.safetensors",
440
+ "model.visual.blocks.10.norm2.bias": "model-00001-of-00004.safetensors",
441
+ "model.visual.blocks.10.norm2.weight": "model-00001-of-00004.safetensors",
442
+ "model.visual.blocks.11.attn.proj.bias": "model-00001-of-00004.safetensors",
443
+ "model.visual.blocks.11.attn.proj.weight": "model-00001-of-00004.safetensors",
444
+ "model.visual.blocks.11.attn.qkv.bias": "model-00001-of-00004.safetensors",
445
+ "model.visual.blocks.11.attn.qkv.weight": "model-00001-of-00004.safetensors",
446
+ "model.visual.blocks.11.mlp.linear_fc1.bias": "model-00001-of-00004.safetensors",
447
+ "model.visual.blocks.11.mlp.linear_fc1.weight": "model-00001-of-00004.safetensors",
448
+ "model.visual.blocks.11.mlp.linear_fc2.bias": "model-00001-of-00004.safetensors",
449
+ "model.visual.blocks.11.mlp.linear_fc2.weight": "model-00001-of-00004.safetensors",
450
+ "model.visual.blocks.11.norm1.bias": "model-00001-of-00004.safetensors",
451
+ "model.visual.blocks.11.norm1.weight": "model-00001-of-00004.safetensors",
452
+ "model.visual.blocks.11.norm2.bias": "model-00001-of-00004.safetensors",
453
+ "model.visual.blocks.11.norm2.weight": "model-00001-of-00004.safetensors",
454
+ "model.visual.blocks.12.attn.proj.bias": "model-00001-of-00004.safetensors",
455
+ "model.visual.blocks.12.attn.proj.weight": "model-00001-of-00004.safetensors",
456
+ "model.visual.blocks.12.attn.qkv.bias": "model-00001-of-00004.safetensors",
457
+ "model.visual.blocks.12.attn.qkv.weight": "model-00001-of-00004.safetensors",
458
+ "model.visual.blocks.12.mlp.linear_fc1.bias": "model-00001-of-00004.safetensors",
459
+ "model.visual.blocks.12.mlp.linear_fc1.weight": "model-00001-of-00004.safetensors",
460
+ "model.visual.blocks.12.mlp.linear_fc2.bias": "model-00001-of-00004.safetensors",
461
+ "model.visual.blocks.12.mlp.linear_fc2.weight": "model-00001-of-00004.safetensors",
462
+ "model.visual.blocks.12.norm1.bias": "model-00001-of-00004.safetensors",
463
+ "model.visual.blocks.12.norm1.weight": "model-00001-of-00004.safetensors",
464
+ "model.visual.blocks.12.norm2.bias": "model-00001-of-00004.safetensors",
465
+ "model.visual.blocks.12.norm2.weight": "model-00001-of-00004.safetensors",
466
+ "model.visual.blocks.13.attn.proj.bias": "model-00001-of-00004.safetensors",
467
+ "model.visual.blocks.13.attn.proj.weight": "model-00001-of-00004.safetensors",
468
+ "model.visual.blocks.13.attn.qkv.bias": "model-00001-of-00004.safetensors",
469
+ "model.visual.blocks.13.attn.qkv.weight": "model-00001-of-00004.safetensors",
470
+ "model.visual.blocks.13.mlp.linear_fc1.bias": "model-00001-of-00004.safetensors",
471
+ "model.visual.blocks.13.mlp.linear_fc1.weight": "model-00001-of-00004.safetensors",
472
+ "model.visual.blocks.13.mlp.linear_fc2.bias": "model-00001-of-00004.safetensors",
473
+ "model.visual.blocks.13.mlp.linear_fc2.weight": "model-00001-of-00004.safetensors",
474
+ "model.visual.blocks.13.norm1.bias": "model-00001-of-00004.safetensors",
475
+ "model.visual.blocks.13.norm1.weight": "model-00001-of-00004.safetensors",
476
+ "model.visual.blocks.13.norm2.bias": "model-00001-of-00004.safetensors",
477
+ "model.visual.blocks.13.norm2.weight": "model-00001-of-00004.safetensors",
478
+ "model.visual.blocks.14.attn.proj.bias": "model-00001-of-00004.safetensors",
479
+ "model.visual.blocks.14.attn.proj.weight": "model-00001-of-00004.safetensors",
480
+ "model.visual.blocks.14.attn.qkv.bias": "model-00001-of-00004.safetensors",
481
+ "model.visual.blocks.14.attn.qkv.weight": "model-00001-of-00004.safetensors",
482
+ "model.visual.blocks.14.mlp.linear_fc1.bias": "model-00001-of-00004.safetensors",
483
+ "model.visual.blocks.14.mlp.linear_fc1.weight": "model-00001-of-00004.safetensors",
484
+ "model.visual.blocks.14.mlp.linear_fc2.bias": "model-00001-of-00004.safetensors",
485
+ "model.visual.blocks.14.mlp.linear_fc2.weight": "model-00001-of-00004.safetensors",
486
+ "model.visual.blocks.14.norm1.bias": "model-00001-of-00004.safetensors",
487
+ "model.visual.blocks.14.norm1.weight": "model-00001-of-00004.safetensors",
488
+ "model.visual.blocks.14.norm2.bias": "model-00001-of-00004.safetensors",
489
+ "model.visual.blocks.14.norm2.weight": "model-00001-of-00004.safetensors",
490
+ "model.visual.blocks.15.attn.proj.bias": "model-00001-of-00004.safetensors",
491
+ "model.visual.blocks.15.attn.proj.weight": "model-00001-of-00004.safetensors",
492
+ "model.visual.blocks.15.attn.qkv.bias": "model-00001-of-00004.safetensors",
493
+ "model.visual.blocks.15.attn.qkv.weight": "model-00001-of-00004.safetensors",
494
+ "model.visual.blocks.15.mlp.linear_fc1.bias": "model-00001-of-00004.safetensors",
495
+ "model.visual.blocks.15.mlp.linear_fc1.weight": "model-00001-of-00004.safetensors",
496
+ "model.visual.blocks.15.mlp.linear_fc2.bias": "model-00001-of-00004.safetensors",
497
+ "model.visual.blocks.15.mlp.linear_fc2.weight": "model-00001-of-00004.safetensors",
498
+ "model.visual.blocks.15.norm1.bias": "model-00001-of-00004.safetensors",
499
+ "model.visual.blocks.15.norm1.weight": "model-00001-of-00004.safetensors",
500
+ "model.visual.blocks.15.norm2.bias": "model-00001-of-00004.safetensors",
501
+ "model.visual.blocks.15.norm2.weight": "model-00001-of-00004.safetensors",
502
+ "model.visual.blocks.16.attn.proj.bias": "model-00001-of-00004.safetensors",
503
+ "model.visual.blocks.16.attn.proj.weight": "model-00001-of-00004.safetensors",
504
+ "model.visual.blocks.16.attn.qkv.bias": "model-00001-of-00004.safetensors",
505
+ "model.visual.blocks.16.attn.qkv.weight": "model-00001-of-00004.safetensors",
506
+ "model.visual.blocks.16.mlp.linear_fc1.bias": "model-00001-of-00004.safetensors",
507
+ "model.visual.blocks.16.mlp.linear_fc1.weight": "model-00001-of-00004.safetensors",
508
+ "model.visual.blocks.16.mlp.linear_fc2.bias": "model-00001-of-00004.safetensors",
509
+ "model.visual.blocks.16.mlp.linear_fc2.weight": "model-00001-of-00004.safetensors",
510
+ "model.visual.blocks.16.norm1.bias": "model-00001-of-00004.safetensors",
511
+ "model.visual.blocks.16.norm1.weight": "model-00001-of-00004.safetensors",
512
+ "model.visual.blocks.16.norm2.bias": "model-00001-of-00004.safetensors",
513
+ "model.visual.blocks.16.norm2.weight": "model-00001-of-00004.safetensors",
514
+ "model.visual.blocks.17.attn.proj.bias": "model-00001-of-00004.safetensors",
515
+ "model.visual.blocks.17.attn.proj.weight": "model-00001-of-00004.safetensors",
516
+ "model.visual.blocks.17.attn.qkv.bias": "model-00001-of-00004.safetensors",
517
+ "model.visual.blocks.17.attn.qkv.weight": "model-00001-of-00004.safetensors",
518
+ "model.visual.blocks.17.mlp.linear_fc1.bias": "model-00001-of-00004.safetensors",
519
+ "model.visual.blocks.17.mlp.linear_fc1.weight": "model-00001-of-00004.safetensors",
520
+ "model.visual.blocks.17.mlp.linear_fc2.bias": "model-00001-of-00004.safetensors",
521
+ "model.visual.blocks.17.mlp.linear_fc2.weight": "model-00001-of-00004.safetensors",
522
+ "model.visual.blocks.17.norm1.bias": "model-00001-of-00004.safetensors",
523
+ "model.visual.blocks.17.norm1.weight": "model-00001-of-00004.safetensors",
524
+ "model.visual.blocks.17.norm2.bias": "model-00001-of-00004.safetensors",
525
+ "model.visual.blocks.17.norm2.weight": "model-00001-of-00004.safetensors",
526
+ "model.visual.blocks.18.attn.proj.bias": "model-00001-of-00004.safetensors",
527
+ "model.visual.blocks.18.attn.proj.weight": "model-00001-of-00004.safetensors",
528
+ "model.visual.blocks.18.attn.qkv.bias": "model-00001-of-00004.safetensors",
529
+ "model.visual.blocks.18.attn.qkv.weight": "model-00001-of-00004.safetensors",
530
+ "model.visual.blocks.18.mlp.linear_fc1.bias": "model-00001-of-00004.safetensors",
531
+ "model.visual.blocks.18.mlp.linear_fc1.weight": "model-00001-of-00004.safetensors",
532
+ "model.visual.blocks.18.mlp.linear_fc2.bias": "model-00001-of-00004.safetensors",
533
+ "model.visual.blocks.18.mlp.linear_fc2.weight": "model-00001-of-00004.safetensors",
534
+ "model.visual.blocks.18.norm1.bias": "model-00001-of-00004.safetensors",
535
+ "model.visual.blocks.18.norm1.weight": "model-00001-of-00004.safetensors",
536
+ "model.visual.blocks.18.norm2.bias": "model-00001-of-00004.safetensors",
537
+ "model.visual.blocks.18.norm2.weight": "model-00001-of-00004.safetensors",
538
+ "model.visual.blocks.19.attn.proj.bias": "model-00001-of-00004.safetensors",
539
+ "model.visual.blocks.19.attn.proj.weight": "model-00001-of-00004.safetensors",
540
+ "model.visual.blocks.19.attn.qkv.bias": "model-00001-of-00004.safetensors",
541
+ "model.visual.blocks.19.attn.qkv.weight": "model-00001-of-00004.safetensors",
542
+ "model.visual.blocks.19.mlp.linear_fc1.bias": "model-00001-of-00004.safetensors",
543
+ "model.visual.blocks.19.mlp.linear_fc1.weight": "model-00001-of-00004.safetensors",
544
+ "model.visual.blocks.19.mlp.linear_fc2.bias": "model-00001-of-00004.safetensors",
545
+ "model.visual.blocks.19.mlp.linear_fc2.weight": "model-00001-of-00004.safetensors",
546
+ "model.visual.blocks.19.norm1.bias": "model-00001-of-00004.safetensors",
547
+ "model.visual.blocks.19.norm1.weight": "model-00001-of-00004.safetensors",
548
+ "model.visual.blocks.19.norm2.bias": "model-00001-of-00004.safetensors",
549
+ "model.visual.blocks.19.norm2.weight": "model-00001-of-00004.safetensors",
550
+ "model.visual.blocks.2.attn.proj.bias": "model-00001-of-00004.safetensors",
551
+ "model.visual.blocks.2.attn.proj.weight": "model-00001-of-00004.safetensors",
552
+ "model.visual.blocks.2.attn.qkv.bias": "model-00001-of-00004.safetensors",
553
+ "model.visual.blocks.2.attn.qkv.weight": "model-00001-of-00004.safetensors",
554
+ "model.visual.blocks.2.mlp.linear_fc1.bias": "model-00001-of-00004.safetensors",
555
+ "model.visual.blocks.2.mlp.linear_fc1.weight": "model-00001-of-00004.safetensors",
556
+ "model.visual.blocks.2.mlp.linear_fc2.bias": "model-00001-of-00004.safetensors",
557
+ "model.visual.blocks.2.mlp.linear_fc2.weight": "model-00001-of-00004.safetensors",
558
+ "model.visual.blocks.2.norm1.bias": "model-00001-of-00004.safetensors",
559
+ "model.visual.blocks.2.norm1.weight": "model-00001-of-00004.safetensors",
560
+ "model.visual.blocks.2.norm2.bias": "model-00001-of-00004.safetensors",
561
+ "model.visual.blocks.2.norm2.weight": "model-00001-of-00004.safetensors",
562
+ "model.visual.blocks.20.attn.proj.bias": "model-00001-of-00004.safetensors",
563
+ "model.visual.blocks.20.attn.proj.weight": "model-00001-of-00004.safetensors",
564
+ "model.visual.blocks.20.attn.qkv.bias": "model-00001-of-00004.safetensors",
565
+ "model.visual.blocks.20.attn.qkv.weight": "model-00001-of-00004.safetensors",
566
+ "model.visual.blocks.20.mlp.linear_fc1.bias": "model-00001-of-00004.safetensors",
567
+ "model.visual.blocks.20.mlp.linear_fc1.weight": "model-00001-of-00004.safetensors",
568
+ "model.visual.blocks.20.mlp.linear_fc2.bias": "model-00001-of-00004.safetensors",
569
+ "model.visual.blocks.20.mlp.linear_fc2.weight": "model-00001-of-00004.safetensors",
570
+ "model.visual.blocks.20.norm1.bias": "model-00001-of-00004.safetensors",
571
+ "model.visual.blocks.20.norm1.weight": "model-00001-of-00004.safetensors",
572
+ "model.visual.blocks.20.norm2.bias": "model-00001-of-00004.safetensors",
573
+ "model.visual.blocks.20.norm2.weight": "model-00001-of-00004.safetensors",
574
+ "model.visual.blocks.21.attn.proj.bias": "model-00001-of-00004.safetensors",
575
+ "model.visual.blocks.21.attn.proj.weight": "model-00001-of-00004.safetensors",
576
+ "model.visual.blocks.21.attn.qkv.bias": "model-00001-of-00004.safetensors",
577
+ "model.visual.blocks.21.attn.qkv.weight": "model-00001-of-00004.safetensors",
578
+ "model.visual.blocks.21.mlp.linear_fc1.bias": "model-00001-of-00004.safetensors",
579
+ "model.visual.blocks.21.mlp.linear_fc1.weight": "model-00001-of-00004.safetensors",
580
+ "model.visual.blocks.21.mlp.linear_fc2.bias": "model-00001-of-00004.safetensors",
581
+ "model.visual.blocks.21.mlp.linear_fc2.weight": "model-00001-of-00004.safetensors",
582
+ "model.visual.blocks.21.norm1.bias": "model-00001-of-00004.safetensors",
583
+ "model.visual.blocks.21.norm1.weight": "model-00001-of-00004.safetensors",
584
+ "model.visual.blocks.21.norm2.bias": "model-00001-of-00004.safetensors",
585
+ "model.visual.blocks.21.norm2.weight": "model-00001-of-00004.safetensors",
586
+ "model.visual.blocks.22.attn.proj.bias": "model-00001-of-00004.safetensors",
587
+ "model.visual.blocks.22.attn.proj.weight": "model-00001-of-00004.safetensors",
588
+ "model.visual.blocks.22.attn.qkv.bias": "model-00001-of-00004.safetensors",
589
+ "model.visual.blocks.22.attn.qkv.weight": "model-00001-of-00004.safetensors",
590
+ "model.visual.blocks.22.mlp.linear_fc1.bias": "model-00001-of-00004.safetensors",
591
+ "model.visual.blocks.22.mlp.linear_fc1.weight": "model-00001-of-00004.safetensors",
592
+ "model.visual.blocks.22.mlp.linear_fc2.bias": "model-00001-of-00004.safetensors",
593
+ "model.visual.blocks.22.mlp.linear_fc2.weight": "model-00001-of-00004.safetensors",
594
+ "model.visual.blocks.22.norm1.bias": "model-00001-of-00004.safetensors",
595
+ "model.visual.blocks.22.norm1.weight": "model-00001-of-00004.safetensors",
596
+ "model.visual.blocks.22.norm2.bias": "model-00001-of-00004.safetensors",
597
+ "model.visual.blocks.22.norm2.weight": "model-00001-of-00004.safetensors",
598
+ "model.visual.blocks.23.attn.proj.bias": "model-00001-of-00004.safetensors",
599
+ "model.visual.blocks.23.attn.proj.weight": "model-00001-of-00004.safetensors",
600
+ "model.visual.blocks.23.attn.qkv.bias": "model-00001-of-00004.safetensors",
601
+ "model.visual.blocks.23.attn.qkv.weight": "model-00001-of-00004.safetensors",
602
+ "model.visual.blocks.23.mlp.linear_fc1.bias": "model-00001-of-00004.safetensors",
603
+ "model.visual.blocks.23.mlp.linear_fc1.weight": "model-00001-of-00004.safetensors",
604
+ "model.visual.blocks.23.mlp.linear_fc2.bias": "model-00001-of-00004.safetensors",
605
+ "model.visual.blocks.23.mlp.linear_fc2.weight": "model-00001-of-00004.safetensors",
606
+ "model.visual.blocks.23.norm1.bias": "model-00001-of-00004.safetensors",
607
+ "model.visual.blocks.23.norm1.weight": "model-00001-of-00004.safetensors",
608
+ "model.visual.blocks.23.norm2.bias": "model-00001-of-00004.safetensors",
609
+ "model.visual.blocks.23.norm2.weight": "model-00001-of-00004.safetensors",
610
+ "model.visual.blocks.24.attn.proj.bias": "model-00001-of-00004.safetensors",
611
+ "model.visual.blocks.24.attn.proj.weight": "model-00001-of-00004.safetensors",
612
+ "model.visual.blocks.24.attn.qkv.bias": "model-00001-of-00004.safetensors",
613
+ "model.visual.blocks.24.attn.qkv.weight": "model-00001-of-00004.safetensors",
614
+ "model.visual.blocks.24.mlp.linear_fc1.bias": "model-00001-of-00004.safetensors",
615
+ "model.visual.blocks.24.mlp.linear_fc1.weight": "model-00001-of-00004.safetensors",
616
+ "model.visual.blocks.24.mlp.linear_fc2.bias": "model-00001-of-00004.safetensors",
617
+ "model.visual.blocks.24.mlp.linear_fc2.weight": "model-00001-of-00004.safetensors",
618
+ "model.visual.blocks.24.norm1.bias": "model-00001-of-00004.safetensors",
619
+ "model.visual.blocks.24.norm1.weight": "model-00001-of-00004.safetensors",
620
+ "model.visual.blocks.24.norm2.bias": "model-00001-of-00004.safetensors",
621
+ "model.visual.blocks.24.norm2.weight": "model-00001-of-00004.safetensors",
622
+ "model.visual.blocks.25.attn.proj.bias": "model-00001-of-00004.safetensors",
623
+ "model.visual.blocks.25.attn.proj.weight": "model-00001-of-00004.safetensors",
624
+ "model.visual.blocks.25.attn.qkv.bias": "model-00001-of-00004.safetensors",
625
+ "model.visual.blocks.25.attn.qkv.weight": "model-00001-of-00004.safetensors",
626
+ "model.visual.blocks.25.mlp.linear_fc1.bias": "model-00001-of-00004.safetensors",
627
+ "model.visual.blocks.25.mlp.linear_fc1.weight": "model-00001-of-00004.safetensors",
628
+ "model.visual.blocks.25.mlp.linear_fc2.bias": "model-00001-of-00004.safetensors",
629
+ "model.visual.blocks.25.mlp.linear_fc2.weight": "model-00001-of-00004.safetensors",
630
+ "model.visual.blocks.25.norm1.bias": "model-00001-of-00004.safetensors",
631
+ "model.visual.blocks.25.norm1.weight": "model-00001-of-00004.safetensors",
632
+ "model.visual.blocks.25.norm2.bias": "model-00001-of-00004.safetensors",
633
+ "model.visual.blocks.25.norm2.weight": "model-00001-of-00004.safetensors",
634
+ "model.visual.blocks.26.attn.proj.bias": "model-00001-of-00004.safetensors",
635
+ "model.visual.blocks.26.attn.proj.weight": "model-00001-of-00004.safetensors",
636
+ "model.visual.blocks.26.attn.qkv.bias": "model-00001-of-00004.safetensors",
637
+ "model.visual.blocks.26.attn.qkv.weight": "model-00001-of-00004.safetensors",
638
+ "model.visual.blocks.26.mlp.linear_fc1.bias": "model-00001-of-00004.safetensors",
639
+ "model.visual.blocks.26.mlp.linear_fc1.weight": "model-00001-of-00004.safetensors",
640
+ "model.visual.blocks.26.mlp.linear_fc2.bias": "model-00001-of-00004.safetensors",
641
+ "model.visual.blocks.26.mlp.linear_fc2.weight": "model-00001-of-00004.safetensors",
642
+ "model.visual.blocks.26.norm1.bias": "model-00001-of-00004.safetensors",
643
+ "model.visual.blocks.26.norm1.weight": "model-00001-of-00004.safetensors",
644
+ "model.visual.blocks.26.norm2.bias": "model-00001-of-00004.safetensors",
645
+ "model.visual.blocks.26.norm2.weight": "model-00001-of-00004.safetensors",
646
+ "model.visual.blocks.3.attn.proj.bias": "model-00001-of-00004.safetensors",
647
+ "model.visual.blocks.3.attn.proj.weight": "model-00001-of-00004.safetensors",
648
+ "model.visual.blocks.3.attn.qkv.bias": "model-00001-of-00004.safetensors",
649
+ "model.visual.blocks.3.attn.qkv.weight": "model-00001-of-00004.safetensors",
650
+ "model.visual.blocks.3.mlp.linear_fc1.bias": "model-00001-of-00004.safetensors",
651
+ "model.visual.blocks.3.mlp.linear_fc1.weight": "model-00001-of-00004.safetensors",
652
+ "model.visual.blocks.3.mlp.linear_fc2.bias": "model-00001-of-00004.safetensors",
653
+ "model.visual.blocks.3.mlp.linear_fc2.weight": "model-00001-of-00004.safetensors",
654
+ "model.visual.blocks.3.norm1.bias": "model-00001-of-00004.safetensors",
655
+ "model.visual.blocks.3.norm1.weight": "model-00001-of-00004.safetensors",
656
+ "model.visual.blocks.3.norm2.bias": "model-00001-of-00004.safetensors",
657
+ "model.visual.blocks.3.norm2.weight": "model-00001-of-00004.safetensors",
658
+ "model.visual.blocks.4.attn.proj.bias": "model-00001-of-00004.safetensors",
659
+ "model.visual.blocks.4.attn.proj.weight": "model-00001-of-00004.safetensors",
660
+ "model.visual.blocks.4.attn.qkv.bias": "model-00001-of-00004.safetensors",
661
+ "model.visual.blocks.4.attn.qkv.weight": "model-00001-of-00004.safetensors",
662
+ "model.visual.blocks.4.mlp.linear_fc1.bias": "model-00001-of-00004.safetensors",
663
+ "model.visual.blocks.4.mlp.linear_fc1.weight": "model-00001-of-00004.safetensors",
664
+ "model.visual.blocks.4.mlp.linear_fc2.bias": "model-00001-of-00004.safetensors",
665
+ "model.visual.blocks.4.mlp.linear_fc2.weight": "model-00001-of-00004.safetensors",
666
+ "model.visual.blocks.4.norm1.bias": "model-00001-of-00004.safetensors",
667
+ "model.visual.blocks.4.norm1.weight": "model-00001-of-00004.safetensors",
668
+ "model.visual.blocks.4.norm2.bias": "model-00001-of-00004.safetensors",
669
+ "model.visual.blocks.4.norm2.weight": "model-00001-of-00004.safetensors",
670
+ "model.visual.blocks.5.attn.proj.bias": "model-00001-of-00004.safetensors",
671
+ "model.visual.blocks.5.attn.proj.weight": "model-00001-of-00004.safetensors",
672
+ "model.visual.blocks.5.attn.qkv.bias": "model-00001-of-00004.safetensors",
673
+ "model.visual.blocks.5.attn.qkv.weight": "model-00001-of-00004.safetensors",
674
+ "model.visual.blocks.5.mlp.linear_fc1.bias": "model-00001-of-00004.safetensors",
675
+ "model.visual.blocks.5.mlp.linear_fc1.weight": "model-00001-of-00004.safetensors",
676
+ "model.visual.blocks.5.mlp.linear_fc2.bias": "model-00001-of-00004.safetensors",
677
+ "model.visual.blocks.5.mlp.linear_fc2.weight": "model-00001-of-00004.safetensors",
678
+ "model.visual.blocks.5.norm1.bias": "model-00001-of-00004.safetensors",
679
+ "model.visual.blocks.5.norm1.weight": "model-00001-of-00004.safetensors",
680
+ "model.visual.blocks.5.norm2.bias": "model-00001-of-00004.safetensors",
681
+ "model.visual.blocks.5.norm2.weight": "model-00001-of-00004.safetensors",
682
+ "model.visual.blocks.6.attn.proj.bias": "model-00001-of-00004.safetensors",
683
+ "model.visual.blocks.6.attn.proj.weight": "model-00001-of-00004.safetensors",
684
+ "model.visual.blocks.6.attn.qkv.bias": "model-00001-of-00004.safetensors",
685
+ "model.visual.blocks.6.attn.qkv.weight": "model-00001-of-00004.safetensors",
686
+ "model.visual.blocks.6.mlp.linear_fc1.bias": "model-00001-of-00004.safetensors",
687
+ "model.visual.blocks.6.mlp.linear_fc1.weight": "model-00001-of-00004.safetensors",
688
+ "model.visual.blocks.6.mlp.linear_fc2.bias": "model-00001-of-00004.safetensors",
689
+ "model.visual.blocks.6.mlp.linear_fc2.weight": "model-00001-of-00004.safetensors",
690
+ "model.visual.blocks.6.norm1.bias": "model-00001-of-00004.safetensors",
691
+ "model.visual.blocks.6.norm1.weight": "model-00001-of-00004.safetensors",
692
+ "model.visual.blocks.6.norm2.bias": "model-00001-of-00004.safetensors",
693
+ "model.visual.blocks.6.norm2.weight": "model-00001-of-00004.safetensors",
694
+ "model.visual.blocks.7.attn.proj.bias": "model-00001-of-00004.safetensors",
695
+ "model.visual.blocks.7.attn.proj.weight": "model-00001-of-00004.safetensors",
696
+ "model.visual.blocks.7.attn.qkv.bias": "model-00001-of-00004.safetensors",
697
+ "model.visual.blocks.7.attn.qkv.weight": "model-00001-of-00004.safetensors",
698
+ "model.visual.blocks.7.mlp.linear_fc1.bias": "model-00001-of-00004.safetensors",
699
+ "model.visual.blocks.7.mlp.linear_fc1.weight": "model-00001-of-00004.safetensors",
700
+ "model.visual.blocks.7.mlp.linear_fc2.bias": "model-00001-of-00004.safetensors",
701
+ "model.visual.blocks.7.mlp.linear_fc2.weight": "model-00001-of-00004.safetensors",
702
+ "model.visual.blocks.7.norm1.bias": "model-00001-of-00004.safetensors",
703
+ "model.visual.blocks.7.norm1.weight": "model-00001-of-00004.safetensors",
704
+ "model.visual.blocks.7.norm2.bias": "model-00001-of-00004.safetensors",
705
+ "model.visual.blocks.7.norm2.weight": "model-00001-of-00004.safetensors",
706
+ "model.visual.blocks.8.attn.proj.bias": "model-00001-of-00004.safetensors",
707
+ "model.visual.blocks.8.attn.proj.weight": "model-00001-of-00004.safetensors",
708
+ "model.visual.blocks.8.attn.qkv.bias": "model-00001-of-00004.safetensors",
709
+ "model.visual.blocks.8.attn.qkv.weight": "model-00001-of-00004.safetensors",
710
+ "model.visual.blocks.8.mlp.linear_fc1.bias": "model-00001-of-00004.safetensors",
711
+ "model.visual.blocks.8.mlp.linear_fc1.weight": "model-00001-of-00004.safetensors",
712
+ "model.visual.blocks.8.mlp.linear_fc2.bias": "model-00001-of-00004.safetensors",
713
+ "model.visual.blocks.8.mlp.linear_fc2.weight": "model-00001-of-00004.safetensors",
714
+ "model.visual.blocks.8.norm1.bias": "model-00001-of-00004.safetensors",
715
+ "model.visual.blocks.8.norm1.weight": "model-00001-of-00004.safetensors",
716
+ "model.visual.blocks.8.norm2.bias": "model-00001-of-00004.safetensors",
717
+ "model.visual.blocks.8.norm2.weight": "model-00001-of-00004.safetensors",
718
+ "model.visual.blocks.9.attn.proj.bias": "model-00001-of-00004.safetensors",
719
+ "model.visual.blocks.9.attn.proj.weight": "model-00001-of-00004.safetensors",
720
+ "model.visual.blocks.9.attn.qkv.bias": "model-00001-of-00004.safetensors",
721
+ "model.visual.blocks.9.attn.qkv.weight": "model-00001-of-00004.safetensors",
722
+ "model.visual.blocks.9.mlp.linear_fc1.bias": "model-00001-of-00004.safetensors",
723
+ "model.visual.blocks.9.mlp.linear_fc1.weight": "model-00001-of-00004.safetensors",
724
+ "model.visual.blocks.9.mlp.linear_fc2.bias": "model-00001-of-00004.safetensors",
725
+ "model.visual.blocks.9.mlp.linear_fc2.weight": "model-00001-of-00004.safetensors",
726
+ "model.visual.blocks.9.norm1.bias": "model-00001-of-00004.safetensors",
727
+ "model.visual.blocks.9.norm1.weight": "model-00001-of-00004.safetensors",
728
+ "model.visual.blocks.9.norm2.bias": "model-00001-of-00004.safetensors",
729
+ "model.visual.blocks.9.norm2.weight": "model-00001-of-00004.safetensors",
730
+ "model.visual.deepstack_merger_list.0.linear_fc1.bias": "model-00001-of-00004.safetensors",
731
+ "model.visual.deepstack_merger_list.0.linear_fc1.weight": "model-00001-of-00004.safetensors",
732
+ "model.visual.deepstack_merger_list.0.linear_fc2.bias": "model-00001-of-00004.safetensors",
733
+ "model.visual.deepstack_merger_list.0.linear_fc2.weight": "model-00001-of-00004.safetensors",
734
+ "model.visual.deepstack_merger_list.0.norm.bias": "model-00001-of-00004.safetensors",
735
+ "model.visual.deepstack_merger_list.0.norm.weight": "model-00001-of-00004.safetensors",
736
+ "model.visual.deepstack_merger_list.1.linear_fc1.bias": "model-00001-of-00004.safetensors",
737
+ "model.visual.deepstack_merger_list.1.linear_fc1.weight": "model-00001-of-00004.safetensors",
738
+ "model.visual.deepstack_merger_list.1.linear_fc2.bias": "model-00001-of-00004.safetensors",
739
+ "model.visual.deepstack_merger_list.1.linear_fc2.weight": "model-00001-of-00004.safetensors",
740
+ "model.visual.deepstack_merger_list.1.norm.bias": "model-00001-of-00004.safetensors",
741
+ "model.visual.deepstack_merger_list.1.norm.weight": "model-00001-of-00004.safetensors",
742
+ "model.visual.deepstack_merger_list.2.linear_fc1.bias": "model-00001-of-00004.safetensors",
743
+ "model.visual.deepstack_merger_list.2.linear_fc1.weight": "model-00001-of-00004.safetensors",
744
+ "model.visual.deepstack_merger_list.2.linear_fc2.bias": "model-00001-of-00004.safetensors",
745
+ "model.visual.deepstack_merger_list.2.linear_fc2.weight": "model-00001-of-00004.safetensors",
746
+ "model.visual.deepstack_merger_list.2.norm.bias": "model-00001-of-00004.safetensors",
747
+ "model.visual.deepstack_merger_list.2.norm.weight": "model-00001-of-00004.safetensors",
748
+ "model.visual.merger.linear_fc1.bias": "model-00001-of-00004.safetensors",
749
+ "model.visual.merger.linear_fc1.weight": "model-00001-of-00004.safetensors",
750
+ "model.visual.merger.linear_fc2.bias": "model-00001-of-00004.safetensors",
751
+ "model.visual.merger.linear_fc2.weight": "model-00001-of-00004.safetensors",
752
+ "model.visual.merger.norm.bias": "model-00001-of-00004.safetensors",
753
+ "model.visual.merger.norm.weight": "model-00001-of-00004.safetensors",
754
+ "model.visual.patch_embed.proj.bias": "model-00001-of-00004.safetensors",
755
+ "model.visual.patch_embed.proj.weight": "model-00001-of-00004.safetensors",
756
+ "model.visual.pos_embed.weight": "model-00001-of-00004.safetensors"
757
+ }
758
+ }
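The weight_map above closes out the four-shard index: every parameter name resolves to exactly one of model-00001 through model-00004. Note that layer 32 straddles a shard boundary (its q/k/v projection weights sit in model-00003 while the rest of the layer is in model-00004), so a loader must consult the index per tensor rather than assume one file per layer. A minimal sketch of resolving a single tensor through this index, assuming the shards have been downloaded to a local directory (the path is a placeholder):

import json
from pathlib import Path
from safetensors import safe_open

ckpt_dir = Path("./checkpoint")  # placeholder: folder holding the shards and the index

# The index maps each parameter name to the shard file that stores it.
index = json.loads((ckpt_dir / "model.safetensors.index.json").read_text())
weight_map = index["weight_map"]

name = "model.language_model.layers.32.self_attn.k_proj.weight"
shard = weight_map[name]  # -> "model-00003-of-00004.safetensors"

# safetensors reads lazily, so only the shard that holds the tensor is opened.
with safe_open(str(ckpt_dir / shard), framework="pt") as f:
    tensor = f.get_tensor(name)
print(name, tuple(tensor.shape), "from", shard)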
preprocessor_config.json ADDED
@@ -0,0 +1,39 @@
+ {
+ "crop_size": null,
+ "data_format": "channels_first",
+ "default_to_square": true,
+ "device": null,
+ "disable_grouping": null,
+ "do_center_crop": null,
+ "do_convert_rgb": true,
+ "do_normalize": true,
+ "do_pad": null,
+ "do_rescale": true,
+ "do_resize": true,
+ "image_mean": [
+ 0.5,
+ 0.5,
+ 0.5
+ ],
+ "image_processor_type": "Qwen2VLImageProcessorFast",
+ "image_std": [
+ 0.5,
+ 0.5,
+ 0.5
+ ],
+ "input_data_format": null,
+ "max_pixels": 2007040,
+ "merge_size": 2,
+ "min_pixels": 12544,
+ "pad_size": null,
+ "patch_size": 16,
+ "processor_class": "Qwen3VLProcessor",
+ "resample": 3,
+ "rescale_factor": 0.00392156862745098,
+ "return_tensors": null,
+ "size": {
+ "longest_edge": 2007040,
+ "shortest_edge": 12544
+ },
+ "temporal_patch_size": 2
+ }
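Read together, these fields pin down the image pipeline: pixels are rescaled by 1/255, normalized with mean/std 0.5, and resized so the total area lands between min_pixels (12544) and max_pixels (2007040), with both sides divisible by patch_size * merge_size = 32 so that 16x16 patches merge cleanly 2x2 into vision tokens. A rough sketch of that resizing rule, modeled on the smart-resize logic Qwen's processors use (the exact rounding here is an assumption of this sketch, not a verbatim copy of the library code):

import math

MIN_PIXELS, MAX_PIXELS = 12544, 2007040  # from the config above
UNIT = 16 * 2                            # patch_size * merge_size

def smart_resize(h: int, w: int) -> tuple[int, int]:
    # Scale the area into [MIN_PIXELS, MAX_PIXELS] at a fixed aspect ratio.
    if h * w > MAX_PIXELS:
        s = math.sqrt(MAX_PIXELS / (h * w))
        h, w = int(h * s), int(w * s)
    elif h * w < MIN_PIXELS:
        s = math.sqrt(MIN_PIXELS / (h * w))
        h, w = math.ceil(h * s), math.ceil(w * s)
    # Snap both sides to multiples of UNIT so patches tile exactly.
    return max(UNIT, round(h / UNIT) * UNIT), max(UNIT, round(w / UNIT) * UNIT)

h, w = smart_resize(1080, 1920)       # -> (1056, 1888)
print((h // UNIT) * (w // UNIT))      # vision tokens after 2x2 merging: 1947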
special_tokens_map.json ADDED
@@ -0,0 +1,31 @@
+ {
+ "additional_special_tokens": [
+ "<|im_start|>",
+ "<|im_end|>",
+ "<|object_ref_start|>",
+ "<|object_ref_end|>",
+ "<|box_start|>",
+ "<|box_end|>",
+ "<|quad_start|>",
+ "<|quad_end|>",
+ "<|vision_start|>",
+ "<|vision_end|>",
+ "<|vision_pad|>",
+ "<|image_pad|>",
+ "<|video_pad|>"
+ ],
+ "eos_token": {
+ "content": "<|im_end|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "pad_token": {
+ "content": "<|endoftext|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ }
+ }
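This map marks <|im_end|> as the end-of-sequence token and <|endoftext|> as padding, and registers the ChatML and vision markers as additional special tokens so the BPE never splits them. A quick sketch of how these surface once the tokenizer is loaded (the repo id is a placeholder, and the chat-template behavior assumes the usual ChatML template shipped in tokenizer_config.json):

from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("user/repo-id")  # placeholder

# eos/pad come straight from special_tokens_map.json above.
assert tok.eos_token == "<|im_end|>"
assert tok.pad_token == "<|endoftext|>"

# The chat template wraps turns in the <|im_start|>/<|im_end|> markers above.
msgs = [{"role": "user", "content": "Describe the image."}]
print(tok.apply_chat_template(msgs, tokenize=False, add_generation_prompt=True))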
tokenizer.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:644af6b530a749b416d780642826025cbba56352a2afe45ba5bf842ea0e85ce6
+ size 11941934
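Because tokenizer.json is tracked with Git LFS (per the .gitattributes change in this commit), the diff records only the pointer: a spec line, a sha256 oid, and the blob size in bytes (about 11.9 MB here). A small sketch for validating a downloaded blob against such a pointer (the helper names are hypothetical):

import hashlib
from pathlib import Path

def parse_lfs_pointer(path: str) -> dict:
    # Pointer files are "key value" lines, exactly as shown above.
    fields = dict(line.split(" ", 1) for line in Path(path).read_text().splitlines())
    return {"oid": fields["oid"].removeprefix("sha256:"), "size": int(fields["size"])}

def blob_matches(blob_path: str, ptr: dict) -> bool:
    data = Path(blob_path).read_bytes()
    return len(data) == ptr["size"] and hashlib.sha256(data).hexdigest() == ptr["oid"]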
tokenizer_config.json ADDED
The diff for this file is too large to render. See raw diff
 
trainer_state.json ADDED
The diff for this file is too large to render. See raw diff
 
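trainer_state.json (too large to render above) is the HF Trainer's bookkeeping file: global_step, checkpoint metadata, and a log_history list of per-step metric dicts. A minimal sketch for pulling the loss curve out of it, assuming the standard Trainer layout:

import json

with open("trainer_state.json") as f:
    state = json.load(f)

losses = [(e["step"], e["loss"]) for e in state["log_history"] if "loss" in e]
print("steps trained:", state["global_step"])
print("last logged losses:", losses[-3:])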
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:11138cd38f0abd6233387920cbc4d66e4a8fa5b65e4afe1b0ed9115b97582f3b
+ size 7505
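training_args.bin is the pickled TrainingArguments object the HF Trainer saves next to its checkpoints; the tiny size (about 7.5 kB) reflects that it holds hyperparameters, not weights. Inspecting it requires full unpickling, so only do this with files you trust:

import torch

# weights_only=False is required because this is a pickled Python object,
# not a tensor file; never unpickle checkpoints from untrusted sources.
args = torch.load("training_args.bin", weights_only=False)
print(args.learning_rate, args.per_device_train_batch_size, args.num_train_epochs)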
video_preprocessor_config.json ADDED
@@ -0,0 +1,47 @@
+ {
+ "crop_size": null,
+ "data_format": "channels_first",
+ "default_to_square": true,
+ "device": null,
+ "disable_grouping": null,
+ "do_center_crop": null,
+ "do_convert_rgb": true,
+ "do_normalize": true,
+ "do_pad": null,
+ "do_rescale": true,
+ "do_resize": true,
+ "do_sample_frames": true,
+ "fps": 2,
+ "image_mean": [
+ 0.5,
+ 0.5,
+ 0.5
+ ],
+ "image_processor_type": "Qwen2VLImageProcessorFast",
+ "image_std": [
+ 0.5,
+ 0.5,
+ 0.5
+ ],
+ "input_data_format": null,
+ "max_frames": 8,
+ "max_pixels": 802816,
+ "merge_size": 2,
+ "min_frames": 4,
+ "min_pixels": 200704,
+ "num_frames": null,
+ "pad_size": null,
+ "patch_size": 16,
+ "processor_class": "Qwen3VLProcessor",
+ "resample": 3,
+ "rescale_factor": 0.00392156862745098,
+ "return_metadata": false,
+ "return_tensors": null,
+ "size": {
+ "longest_edge": 802816,
+ "shortest_edge": 200704
+ },
+ "temporal_patch_size": 2,
+ "video_metadata": null,
+ "video_processor_type": "Qwen3VLVideoProcessor"
+ }
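Relative to the image config, the video path shrinks the per-frame pixel budget (max_pixels 802816, i.e. at most 784 tokens per frame after 2x2 merging) and samples frames at 2 fps, clamped to between 4 and 8 frames, which are then packed into temporal patches of size 2. A sketch of the frame-count rule these fields suggest (the exact rounding inside Qwen3VLVideoProcessor may differ):

def sample_frame_count(duration_s: float) -> int:
    TARGET_FPS, MIN_F, MAX_F, T_PATCH = 2, 4, 8, 2   # from the config above
    n = round(duration_s * TARGET_FPS)               # aim for 2 frames per second
    n = max(MIN_F, min(MAX_F, n))                    # clamp to [min_frames, max_frames]
    return max(T_PATCH, (n // T_PATCH) * T_PATCH)    # multiple of temporal_patch_size

print(sample_frame_count(1.0))   # 4  (short clip, min_frames floor kicks in)
print(sample_frame_count(10.0))  # 8  (long clip, capped at max_frames)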
vocab.json ADDED
The diff for this file is too large to render. See raw diff