WatsonOverHere committed on
Commit 3abd97c · verified · 1 Parent(s): ff69bde

Upload 11 files

.gitattributes CHANGED
@@ -33,3 +33,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
33
  *.zip filter=lfs diff=lfs merge=lfs -text
34
  *.zst filter=lfs diff=lfs merge=lfs -text
35
  *tfevents* filter=lfs diff=lfs merge=lfs -text
36
+ tokenizer.json filter=lfs diff=lfs merge=lfs -text
README.md ADDED
@@ -0,0 +1,202 @@
1
+ ---
2
+ base_model: WatsonOverHere/mysterious_mistral-small-3.1-24b
3
+ library_name: peft
4
+ ---
5
+
6
+ # Model Card for Model ID
7
+
8
+ <!-- Provide a quick summary of what the model is/does. -->
9
+
10
+
11
+
12
+ ## Model Details
13
+
14
+ ### Model Description
15
+
16
+ <!-- Provide a longer summary of what this model is. -->
17
+
18
+
19
+
20
+ - **Developed by:** [More Information Needed]
21
+ - **Funded by [optional]:** [More Information Needed]
22
+ - **Shared by [optional]:** [More Information Needed]
23
+ - **Model type:** [More Information Needed]
24
+ - **Language(s) (NLP):** [More Information Needed]
25
+ - **License:** [More Information Needed]
26
+ - **Finetuned from model [optional]:** [More Information Needed]
27
+
28
+ ### Model Sources [optional]
29
+
30
+ <!-- Provide the basic links for the model. -->
31
+
32
+ - **Repository:** [More Information Needed]
33
+ - **Paper [optional]:** [More Information Needed]
34
+ - **Demo [optional]:** [More Information Needed]
35
+
36
+ ## Uses
37
+
38
+ <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
39
+
40
+ ### Direct Use
41
+
42
+ <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
43
+
44
+ [More Information Needed]
45
+
46
+ ### Downstream Use [optional]
47
+
48
+ <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
49
+
50
+ [More Information Needed]
51
+
52
+ ### Out-of-Scope Use
53
+
54
+ <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
55
+
56
+ [More Information Needed]
57
+
58
+ ## Bias, Risks, and Limitations
59
+
60
+ <!-- This section is meant to convey both technical and sociotechnical limitations. -->
61
+
62
+ [More Information Needed]
63
+
64
+ ### Recommendations
65
+
66
+ <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
67
+
68
+ Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
69
+
70
+ ## How to Get Started with the Model
71
+
72
+ Use the code below to get started with the model.
73
+
74
+ [More Information Needed]
75
+
76
+ ## Training Details
77
+
78
+ ### Training Data
79
+
80
+ <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
81
+
82
+ [More Information Needed]
83
+
84
+ ### Training Procedure
85
+
86
+ <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
87
+
88
+ #### Preprocessing [optional]
89
+
90
+ [More Information Needed]
91
+
92
+
93
+ #### Training Hyperparameters
94
+
95
+ - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
96
+
97
+ #### Speeds, Sizes, Times [optional]
98
+
99
+ <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
100
+
101
+ [More Information Needed]
102
+
103
+ ## Evaluation
104
+
105
+ <!-- This section describes the evaluation protocols and provides the results. -->
106
+
107
+ ### Testing Data, Factors & Metrics
108
+
109
+ #### Testing Data
110
+
111
+ <!-- This should link to a Dataset Card if possible. -->
112
+
113
+ [More Information Needed]
114
+
115
+ #### Factors
116
+
117
+ <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
118
+
119
+ [More Information Needed]
120
+
121
+ #### Metrics
122
+
123
+ <!-- These are the evaluation metrics being used, ideally with a description of why. -->
124
+
125
+ [More Information Needed]
126
+
127
+ ### Results
128
+
129
+ [More Information Needed]
130
+
131
+ #### Summary
132
+
133
+
134
+
135
+ ## Model Examination [optional]
136
+
137
+ <!-- Relevant interpretability work for the model goes here -->
138
+
139
+ [More Information Needed]
140
+
141
+ ## Environmental Impact
142
+
143
+ <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
144
+
145
+ Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
146
+
147
+ - **Hardware Type:** [More Information Needed]
148
+ - **Hours used:** [More Information Needed]
149
+ - **Cloud Provider:** [More Information Needed]
150
+ - **Compute Region:** [More Information Needed]
151
+ - **Carbon Emitted:** [More Information Needed]
152
+
153
+ ## Technical Specifications [optional]
154
+
155
+ ### Model Architecture and Objective
156
+
157
+ [More Information Needed]
158
+
159
+ ### Compute Infrastructure
160
+
161
+ [More Information Needed]
162
+
163
+ #### Hardware
164
+
165
+ [More Information Needed]
166
+
167
+ #### Software
168
+
169
+ [More Information Needed]
170
+
171
+ ## Citation [optional]
172
+
173
+ <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
174
+
175
+ **BibTeX:**
176
+
177
+ [More Information Needed]
178
+
179
+ **APA:**
180
+
181
+ [More Information Needed]
182
+
183
+ ## Glossary [optional]
184
+
185
+ <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
186
+
187
+ [More Information Needed]
188
+
189
+ ## More Information [optional]
190
+
191
+ [More Information Needed]
192
+
193
+ ## Model Card Authors [optional]
194
+
195
+ [More Information Needed]
196
+
197
+ ## Model Card Contact
198
+
199
+ [More Information Needed]
200
+ ### Framework versions
201
+
202
+ - PEFT 0.15.2
adapter_config.json ADDED
@@ -0,0 +1,39 @@
1
+ {
2
+ "alpha_pattern": {},
3
+ "auto_mapping": null,
4
+ "base_model_name_or_path": "WatsonOverHere/mysterious_mistral-small-3.1-24b",
5
+ "bias": "none",
6
+ "corda_config": null,
7
+ "eva_config": null,
8
+ "exclude_modules": null,
9
+ "fan_in_fan_out": null,
10
+ "inference_mode": true,
11
+ "init_lora_weights": true,
12
+ "layer_replication": null,
13
+ "layers_pattern": null,
14
+ "layers_to_transform": null,
15
+ "loftq_config": {},
16
+ "lora_alpha": 32,
17
+ "lora_bias": false,
18
+ "lora_dropout": 0.05,
19
+ "megatron_config": null,
20
+ "megatron_core": "megatron.core",
21
+ "modules_to_save": null,
22
+ "peft_type": "LORA",
23
+ "r": 16,
24
+ "rank_pattern": {},
25
+ "revision": null,
26
+ "target_modules": [
27
+ "up_proj",
28
+ "o_proj",
29
+ "down_proj",
30
+ "v_proj",
31
+ "gate_proj",
32
+ "k_proj",
33
+ "q_proj"
34
+ ],
35
+ "task_type": "CAUSAL_LM",
36
+ "trainable_token_indices": null,
37
+ "use_dora": false,
38
+ "use_rslora": false
39
+ }
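
For reference, the configuration above corresponds roughly to the following PEFT `LoraConfig` (a sketch of how such an adapter would be declared at training time; the surrounding trainer code is not part of this commit):

```python
from peft import LoraConfig

# Mirrors the key fields of adapter_config.json above: r=16, alpha=32, dropout=0.05,
# no bias, LoRA applied to all attention and MLP projections, causal-LM task.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
    target_modules=[
        "q_proj", "k_proj", "v_proj", "o_proj",
        "gate_proj", "up_proj", "down_proj",
    ],
)
```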
adapter_model.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:33c5ff04d02fcacd73800d5113ee1fa48bea1b47d5cbd03fadd63d4ca5fe39ad
3
+ size 369698576
optimizer.pt ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:3d803477909a017eb1ab1365680092c542c1eb7f7aab8ac718a82f40d36b8131
3
+ size 739727010
rng_state.pth ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:27d7d2cd33cdfee32dced20dba811ed75980189f275cd064b6efa4f8d11f2744
3
+ size 14244
scheduler.pt ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:64b39287064711950e37c2e52db8f6899534b70c52b0efeb8d0a1e18024dd697
3
+ size 1064
special_tokens_map.json ADDED
@@ -0,0 +1,1032 @@
1
+ {
2
+ "additional_special_tokens": [
3
+ "<unk>",
4
+ "<s>",
5
+ "</s>",
6
+ "[INST]",
7
+ "[/INST]",
8
+ "[AVAILABLE_TOOLS]",
9
+ "[/AVAILABLE_TOOLS]",
10
+ "[TOOL_RESULTS]",
11
+ "[/TOOL_RESULTS]",
12
+ "[TOOL_CALLS]",
13
+ "[IMG]",
14
+ "<pad>",
15
+ "[IMG_BREAK]",
16
+ "[IMG_END]",
17
+ "[PREFIX]",
18
+ "[MIDDLE]",
19
+ "[SUFFIX]",
20
+ "[SYSTEM_PROMPT]",
21
+ "[/SYSTEM_PROMPT]",
22
+ "[TOOL_CONTENT]",
23
+ "<SPECIAL_20>",
24
+ "<SPECIAL_21>",
25
+ "<SPECIAL_22>",
26
+ "<SPECIAL_23>",
27
+ "<SPECIAL_24>",
28
+ "<SPECIAL_25>",
29
+ "<SPECIAL_26>",
30
+ "<SPECIAL_27>",
31
+ "<SPECIAL_28>",
32
+ "<SPECIAL_29>",
33
+ "<SPECIAL_30>",
34
+ "<SPECIAL_31>",
35
+ "<SPECIAL_32>",
36
+ "<SPECIAL_33>",
37
+ "<SPECIAL_34>",
38
+ "<SPECIAL_35>",
39
+ "<SPECIAL_36>",
40
+ "<SPECIAL_37>",
41
+ "<SPECIAL_38>",
42
+ "<SPECIAL_39>",
43
+ "<SPECIAL_40>",
44
+ "<SPECIAL_41>",
45
+ "<SPECIAL_42>",
46
+ "<SPECIAL_43>",
47
+ "<SPECIAL_44>",
48
+ "<SPECIAL_45>",
49
+ "<SPECIAL_46>",
50
+ "<SPECIAL_47>",
51
+ "<SPECIAL_48>",
52
+ "<SPECIAL_49>",
53
+ "<SPECIAL_50>",
54
+ "<SPECIAL_51>",
55
+ "<SPECIAL_52>",
56
+ "<SPECIAL_53>",
57
+ "<SPECIAL_54>",
58
+ "<SPECIAL_55>",
59
+ "<SPECIAL_56>",
60
+ "<SPECIAL_57>",
61
+ "<SPECIAL_58>",
62
+ "<SPECIAL_59>",
63
+ "<SPECIAL_60>",
64
+ "<SPECIAL_61>",
65
+ "<SPECIAL_62>",
66
+ "<SPECIAL_63>",
67
+ "<SPECIAL_64>",
68
+ "<SPECIAL_65>",
69
+ "<SPECIAL_66>",
70
+ "<SPECIAL_67>",
71
+ "<SPECIAL_68>",
72
+ "<SPECIAL_69>",
73
+ "<SPECIAL_70>",
74
+ "<SPECIAL_71>",
75
+ "<SPECIAL_72>",
76
+ "<SPECIAL_73>",
77
+ "<SPECIAL_74>",
78
+ "<SPECIAL_75>",
79
+ "<SPECIAL_76>",
80
+ "<SPECIAL_77>",
81
+ "<SPECIAL_78>",
82
+ "<SPECIAL_79>",
83
+ "<SPECIAL_80>",
84
+ "<SPECIAL_81>",
85
+ "<SPECIAL_82>",
86
+ "<SPECIAL_83>",
87
+ "<SPECIAL_84>",
88
+ "<SPECIAL_85>",
89
+ "<SPECIAL_86>",
90
+ "<SPECIAL_87>",
91
+ "<SPECIAL_88>",
92
+ "<SPECIAL_89>",
93
+ "<SPECIAL_90>",
94
+ "<SPECIAL_91>",
95
+ "<SPECIAL_92>",
96
+ "<SPECIAL_93>",
97
+ "<SPECIAL_94>",
98
+ "<SPECIAL_95>",
99
+ "<SPECIAL_96>",
100
+ "<SPECIAL_97>",
101
+ "<SPECIAL_98>",
102
+ "<SPECIAL_99>",
103
+ "<SPECIAL_100>",
104
+ "<SPECIAL_101>",
105
+ "<SPECIAL_102>",
106
+ "<SPECIAL_103>",
107
+ "<SPECIAL_104>",
108
+ "<SPECIAL_105>",
109
+ "<SPECIAL_106>",
110
+ "<SPECIAL_107>",
111
+ "<SPECIAL_108>",
112
+ "<SPECIAL_109>",
113
+ "<SPECIAL_110>",
114
+ "<SPECIAL_111>",
115
+ "<SPECIAL_112>",
116
+ "<SPECIAL_113>",
117
+ "<SPECIAL_114>",
118
+ "<SPECIAL_115>",
119
+ "<SPECIAL_116>",
120
+ "<SPECIAL_117>",
121
+ "<SPECIAL_118>",
122
+ "<SPECIAL_119>",
123
+ "<SPECIAL_120>",
124
+ "<SPECIAL_121>",
125
+ "<SPECIAL_122>",
126
+ "<SPECIAL_123>",
127
+ "<SPECIAL_124>",
128
+ "<SPECIAL_125>",
129
+ "<SPECIAL_126>",
130
+ "<SPECIAL_127>",
131
+ "<SPECIAL_128>",
132
+ "<SPECIAL_129>",
133
+ "<SPECIAL_130>",
134
+ "<SPECIAL_131>",
135
+ "<SPECIAL_132>",
136
+ "<SPECIAL_133>",
137
+ "<SPECIAL_134>",
138
+ "<SPECIAL_135>",
139
+ "<SPECIAL_136>",
140
+ "<SPECIAL_137>",
141
+ "<SPECIAL_138>",
142
+ "<SPECIAL_139>",
143
+ "<SPECIAL_140>",
144
+ "<SPECIAL_141>",
145
+ "<SPECIAL_142>",
146
+ "<SPECIAL_143>",
147
+ "<SPECIAL_144>",
148
+ "<SPECIAL_145>",
149
+ "<SPECIAL_146>",
150
+ "<SPECIAL_147>",
151
+ "<SPECIAL_148>",
152
+ "<SPECIAL_149>",
153
+ "<SPECIAL_150>",
154
+ "<SPECIAL_151>",
155
+ "<SPECIAL_152>",
156
+ "<SPECIAL_153>",
157
+ "<SPECIAL_154>",
158
+ "<SPECIAL_155>",
159
+ "<SPECIAL_156>",
160
+ "<SPECIAL_157>",
161
+ "<SPECIAL_158>",
162
+ "<SPECIAL_159>",
163
+ "<SPECIAL_160>",
164
+ "<SPECIAL_161>",
165
+ "<SPECIAL_162>",
166
+ "<SPECIAL_163>",
167
+ "<SPECIAL_164>",
168
+ "<SPECIAL_165>",
169
+ "<SPECIAL_166>",
170
+ "<SPECIAL_167>",
171
+ "<SPECIAL_168>",
172
+ "<SPECIAL_169>",
173
+ "<SPECIAL_170>",
174
+ "<SPECIAL_171>",
175
+ "<SPECIAL_172>",
176
+ "<SPECIAL_173>",
177
+ "<SPECIAL_174>",
178
+ "<SPECIAL_175>",
179
+ "<SPECIAL_176>",
180
+ "<SPECIAL_177>",
181
+ "<SPECIAL_178>",
182
+ "<SPECIAL_179>",
183
+ "<SPECIAL_180>",
184
+ "<SPECIAL_181>",
185
+ "<SPECIAL_182>",
186
+ "<SPECIAL_183>",
187
+ "<SPECIAL_184>",
188
+ "<SPECIAL_185>",
189
+ "<SPECIAL_186>",
190
+ "<SPECIAL_187>",
191
+ "<SPECIAL_188>",
192
+ "<SPECIAL_189>",
193
+ "<SPECIAL_190>",
194
+ "<SPECIAL_191>",
195
+ "<SPECIAL_192>",
196
+ "<SPECIAL_193>",
197
+ "<SPECIAL_194>",
198
+ "<SPECIAL_195>",
199
+ "<SPECIAL_196>",
200
+ "<SPECIAL_197>",
201
+ "<SPECIAL_198>",
202
+ "<SPECIAL_199>",
203
+ "<SPECIAL_200>",
204
+ "<SPECIAL_201>",
205
+ "<SPECIAL_202>",
206
+ "<SPECIAL_203>",
207
+ "<SPECIAL_204>",
208
+ "<SPECIAL_205>",
209
+ "<SPECIAL_206>",
210
+ "<SPECIAL_207>",
211
+ "<SPECIAL_208>",
212
+ "<SPECIAL_209>",
213
+ "<SPECIAL_210>",
214
+ "<SPECIAL_211>",
215
+ "<SPECIAL_212>",
216
+ "<SPECIAL_213>",
217
+ "<SPECIAL_214>",
218
+ "<SPECIAL_215>",
219
+ "<SPECIAL_216>",
220
+ "<SPECIAL_217>",
221
+ "<SPECIAL_218>",
222
+ "<SPECIAL_219>",
223
+ "<SPECIAL_220>",
224
+ "<SPECIAL_221>",
225
+ "<SPECIAL_222>",
226
+ "<SPECIAL_223>",
227
+ "<SPECIAL_224>",
228
+ "<SPECIAL_225>",
229
+ "<SPECIAL_226>",
230
+ "<SPECIAL_227>",
231
+ "<SPECIAL_228>",
232
+ "<SPECIAL_229>",
233
+ "<SPECIAL_230>",
234
+ "<SPECIAL_231>",
235
+ "<SPECIAL_232>",
236
+ "<SPECIAL_233>",
237
+ "<SPECIAL_234>",
238
+ "<SPECIAL_235>",
239
+ "<SPECIAL_236>",
240
+ "<SPECIAL_237>",
241
+ "<SPECIAL_238>",
242
+ "<SPECIAL_239>",
243
+ "<SPECIAL_240>",
244
+ "<SPECIAL_241>",
245
+ "<SPECIAL_242>",
246
+ "<SPECIAL_243>",
247
+ "<SPECIAL_244>",
248
+ "<SPECIAL_245>",
249
+ "<SPECIAL_246>",
250
+ "<SPECIAL_247>",
251
+ "<SPECIAL_248>",
252
+ "<SPECIAL_249>",
253
+ "<SPECIAL_250>",
254
+ "<SPECIAL_251>",
255
+ "<SPECIAL_252>",
256
+ "<SPECIAL_253>",
257
+ "<SPECIAL_254>",
258
+ "<SPECIAL_255>",
259
+ "<SPECIAL_256>",
260
+ "<SPECIAL_257>",
261
+ "<SPECIAL_258>",
262
+ "<SPECIAL_259>",
263
+ "<SPECIAL_260>",
264
+ "<SPECIAL_261>",
265
+ "<SPECIAL_262>",
266
+ "<SPECIAL_263>",
267
+ "<SPECIAL_264>",
268
+ "<SPECIAL_265>",
269
+ "<SPECIAL_266>",
270
+ "<SPECIAL_267>",
271
+ "<SPECIAL_268>",
272
+ "<SPECIAL_269>",
273
+ "<SPECIAL_270>",
274
+ "<SPECIAL_271>",
275
+ "<SPECIAL_272>",
276
+ "<SPECIAL_273>",
277
+ "<SPECIAL_274>",
278
+ "<SPECIAL_275>",
279
+ "<SPECIAL_276>",
280
+ "<SPECIAL_277>",
281
+ "<SPECIAL_278>",
282
+ "<SPECIAL_279>",
283
+ "<SPECIAL_280>",
284
+ "<SPECIAL_281>",
285
+ "<SPECIAL_282>",
286
+ "<SPECIAL_283>",
287
+ "<SPECIAL_284>",
288
+ "<SPECIAL_285>",
289
+ "<SPECIAL_286>",
290
+ "<SPECIAL_287>",
291
+ "<SPECIAL_288>",
292
+ "<SPECIAL_289>",
293
+ "<SPECIAL_290>",
294
+ "<SPECIAL_291>",
295
+ "<SPECIAL_292>",
296
+ "<SPECIAL_293>",
297
+ "<SPECIAL_294>",
298
+ "<SPECIAL_295>",
299
+ "<SPECIAL_296>",
300
+ "<SPECIAL_297>",
301
+ "<SPECIAL_298>",
302
+ "<SPECIAL_299>",
303
+ "<SPECIAL_300>",
304
+ "<SPECIAL_301>",
305
+ "<SPECIAL_302>",
306
+ "<SPECIAL_303>",
307
+ "<SPECIAL_304>",
308
+ "<SPECIAL_305>",
309
+ "<SPECIAL_306>",
310
+ "<SPECIAL_307>",
311
+ "<SPECIAL_308>",
312
+ "<SPECIAL_309>",
313
+ "<SPECIAL_310>",
314
+ "<SPECIAL_311>",
315
+ "<SPECIAL_312>",
316
+ "<SPECIAL_313>",
317
+ "<SPECIAL_314>",
318
+ "<SPECIAL_315>",
319
+ "<SPECIAL_316>",
320
+ "<SPECIAL_317>",
321
+ "<SPECIAL_318>",
322
+ "<SPECIAL_319>",
323
+ "<SPECIAL_320>",
324
+ "<SPECIAL_321>",
325
+ "<SPECIAL_322>",
326
+ "<SPECIAL_323>",
327
+ "<SPECIAL_324>",
328
+ "<SPECIAL_325>",
329
+ "<SPECIAL_326>",
330
+ "<SPECIAL_327>",
331
+ "<SPECIAL_328>",
332
+ "<SPECIAL_329>",
333
+ "<SPECIAL_330>",
334
+ "<SPECIAL_331>",
335
+ "<SPECIAL_332>",
336
+ "<SPECIAL_333>",
337
+ "<SPECIAL_334>",
338
+ "<SPECIAL_335>",
339
+ "<SPECIAL_336>",
340
+ "<SPECIAL_337>",
341
+ "<SPECIAL_338>",
342
+ "<SPECIAL_339>",
343
+ "<SPECIAL_340>",
344
+ "<SPECIAL_341>",
345
+ "<SPECIAL_342>",
346
+ "<SPECIAL_343>",
347
+ "<SPECIAL_344>",
348
+ "<SPECIAL_345>",
349
+ "<SPECIAL_346>",
350
+ "<SPECIAL_347>",
351
+ "<SPECIAL_348>",
352
+ "<SPECIAL_349>",
353
+ "<SPECIAL_350>",
354
+ "<SPECIAL_351>",
355
+ "<SPECIAL_352>",
356
+ "<SPECIAL_353>",
357
+ "<SPECIAL_354>",
358
+ "<SPECIAL_355>",
359
+ "<SPECIAL_356>",
360
+ "<SPECIAL_357>",
361
+ "<SPECIAL_358>",
362
+ "<SPECIAL_359>",
363
+ "<SPECIAL_360>",
364
+ "<SPECIAL_361>",
365
+ "<SPECIAL_362>",
366
+ "<SPECIAL_363>",
367
+ "<SPECIAL_364>",
368
+ "<SPECIAL_365>",
369
+ "<SPECIAL_366>",
370
+ "<SPECIAL_367>",
371
+ "<SPECIAL_368>",
372
+ "<SPECIAL_369>",
373
+ "<SPECIAL_370>",
374
+ "<SPECIAL_371>",
375
+ "<SPECIAL_372>",
376
+ "<SPECIAL_373>",
377
+ "<SPECIAL_374>",
378
+ "<SPECIAL_375>",
379
+ "<SPECIAL_376>",
380
+ "<SPECIAL_377>",
381
+ "<SPECIAL_378>",
382
+ "<SPECIAL_379>",
383
+ "<SPECIAL_380>",
384
+ "<SPECIAL_381>",
385
+ "<SPECIAL_382>",
386
+ "<SPECIAL_383>",
387
+ "<SPECIAL_384>",
388
+ "<SPECIAL_385>",
389
+ "<SPECIAL_386>",
390
+ "<SPECIAL_387>",
391
+ "<SPECIAL_388>",
392
+ "<SPECIAL_389>",
393
+ "<SPECIAL_390>",
394
+ "<SPECIAL_391>",
395
+ "<SPECIAL_392>",
396
+ "<SPECIAL_393>",
397
+ "<SPECIAL_394>",
398
+ "<SPECIAL_395>",
399
+ "<SPECIAL_396>",
400
+ "<SPECIAL_397>",
401
+ "<SPECIAL_398>",
402
+ "<SPECIAL_399>",
403
+ "<SPECIAL_400>",
404
+ "<SPECIAL_401>",
405
+ "<SPECIAL_402>",
406
+ "<SPECIAL_403>",
407
+ "<SPECIAL_404>",
408
+ "<SPECIAL_405>",
409
+ "<SPECIAL_406>",
410
+ "<SPECIAL_407>",
411
+ "<SPECIAL_408>",
412
+ "<SPECIAL_409>",
413
+ "<SPECIAL_410>",
414
+ "<SPECIAL_411>",
415
+ "<SPECIAL_412>",
416
+ "<SPECIAL_413>",
417
+ "<SPECIAL_414>",
418
+ "<SPECIAL_415>",
419
+ "<SPECIAL_416>",
420
+ "<SPECIAL_417>",
421
+ "<SPECIAL_418>",
422
+ "<SPECIAL_419>",
423
+ "<SPECIAL_420>",
424
+ "<SPECIAL_421>",
425
+ "<SPECIAL_422>",
426
+ "<SPECIAL_423>",
427
+ "<SPECIAL_424>",
428
+ "<SPECIAL_425>",
429
+ "<SPECIAL_426>",
430
+ "<SPECIAL_427>",
431
+ "<SPECIAL_428>",
432
+ "<SPECIAL_429>",
433
+ "<SPECIAL_430>",
434
+ "<SPECIAL_431>",
435
+ "<SPECIAL_432>",
436
+ "<SPECIAL_433>",
437
+ "<SPECIAL_434>",
438
+ "<SPECIAL_435>",
439
+ "<SPECIAL_436>",
440
+ "<SPECIAL_437>",
441
+ "<SPECIAL_438>",
442
+ "<SPECIAL_439>",
443
+ "<SPECIAL_440>",
444
+ "<SPECIAL_441>",
445
+ "<SPECIAL_442>",
446
+ "<SPECIAL_443>",
447
+ "<SPECIAL_444>",
448
+ "<SPECIAL_445>",
449
+ "<SPECIAL_446>",
450
+ "<SPECIAL_447>",
451
+ "<SPECIAL_448>",
452
+ "<SPECIAL_449>",
453
+ "<SPECIAL_450>",
454
+ "<SPECIAL_451>",
455
+ "<SPECIAL_452>",
456
+ "<SPECIAL_453>",
457
+ "<SPECIAL_454>",
458
+ "<SPECIAL_455>",
459
+ "<SPECIAL_456>",
460
+ "<SPECIAL_457>",
461
+ "<SPECIAL_458>",
462
+ "<SPECIAL_459>",
463
+ "<SPECIAL_460>",
464
+ "<SPECIAL_461>",
465
+ "<SPECIAL_462>",
466
+ "<SPECIAL_463>",
467
+ "<SPECIAL_464>",
468
+ "<SPECIAL_465>",
469
+ "<SPECIAL_466>",
470
+ "<SPECIAL_467>",
471
+ "<SPECIAL_468>",
472
+ "<SPECIAL_469>",
473
+ "<SPECIAL_470>",
474
+ "<SPECIAL_471>",
475
+ "<SPECIAL_472>",
476
+ "<SPECIAL_473>",
477
+ "<SPECIAL_474>",
478
+ "<SPECIAL_475>",
479
+ "<SPECIAL_476>",
480
+ "<SPECIAL_477>",
481
+ "<SPECIAL_478>",
482
+ "<SPECIAL_479>",
483
+ "<SPECIAL_480>",
484
+ "<SPECIAL_481>",
485
+ "<SPECIAL_482>",
486
+ "<SPECIAL_483>",
487
+ "<SPECIAL_484>",
488
+ "<SPECIAL_485>",
489
+ "<SPECIAL_486>",
490
+ "<SPECIAL_487>",
491
+ "<SPECIAL_488>",
492
+ "<SPECIAL_489>",
493
+ "<SPECIAL_490>",
494
+ "<SPECIAL_491>",
495
+ "<SPECIAL_492>",
496
+ "<SPECIAL_493>",
497
+ "<SPECIAL_494>",
498
+ "<SPECIAL_495>",
499
+ "<SPECIAL_496>",
500
+ "<SPECIAL_497>",
501
+ "<SPECIAL_498>",
502
+ "<SPECIAL_499>",
503
+ "<SPECIAL_500>",
504
+ "<SPECIAL_501>",
505
+ "<SPECIAL_502>",
506
+ "<SPECIAL_503>",
507
+ "<SPECIAL_504>",
508
+ "<SPECIAL_505>",
509
+ "<SPECIAL_506>",
510
+ "<SPECIAL_507>",
511
+ "<SPECIAL_508>",
512
+ "<SPECIAL_509>",
513
+ "<SPECIAL_510>",
514
+ "<SPECIAL_511>",
515
+ "<SPECIAL_512>",
516
+ "<SPECIAL_513>",
517
+ "<SPECIAL_514>",
518
+ "<SPECIAL_515>",
519
+ "<SPECIAL_516>",
520
+ "<SPECIAL_517>",
521
+ "<SPECIAL_518>",
522
+ "<SPECIAL_519>",
523
+ "<SPECIAL_520>",
524
+ "<SPECIAL_521>",
525
+ "<SPECIAL_522>",
526
+ "<SPECIAL_523>",
527
+ "<SPECIAL_524>",
528
+ "<SPECIAL_525>",
529
+ "<SPECIAL_526>",
530
+ "<SPECIAL_527>",
531
+ "<SPECIAL_528>",
532
+ "<SPECIAL_529>",
533
+ "<SPECIAL_530>",
534
+ "<SPECIAL_531>",
535
+ "<SPECIAL_532>",
536
+ "<SPECIAL_533>",
537
+ "<SPECIAL_534>",
538
+ "<SPECIAL_535>",
539
+ "<SPECIAL_536>",
540
+ "<SPECIAL_537>",
541
+ "<SPECIAL_538>",
542
+ "<SPECIAL_539>",
543
+ "<SPECIAL_540>",
544
+ "<SPECIAL_541>",
545
+ "<SPECIAL_542>",
546
+ "<SPECIAL_543>",
547
+ "<SPECIAL_544>",
548
+ "<SPECIAL_545>",
549
+ "<SPECIAL_546>",
550
+ "<SPECIAL_547>",
551
+ "<SPECIAL_548>",
552
+ "<SPECIAL_549>",
553
+ "<SPECIAL_550>",
554
+ "<SPECIAL_551>",
555
+ "<SPECIAL_552>",
556
+ "<SPECIAL_553>",
557
+ "<SPECIAL_554>",
558
+ "<SPECIAL_555>",
559
+ "<SPECIAL_556>",
560
+ "<SPECIAL_557>",
561
+ "<SPECIAL_558>",
562
+ "<SPECIAL_559>",
563
+ "<SPECIAL_560>",
564
+ "<SPECIAL_561>",
565
+ "<SPECIAL_562>",
566
+ "<SPECIAL_563>",
567
+ "<SPECIAL_564>",
568
+ "<SPECIAL_565>",
569
+ "<SPECIAL_566>",
570
+ "<SPECIAL_567>",
571
+ "<SPECIAL_568>",
572
+ "<SPECIAL_569>",
573
+ "<SPECIAL_570>",
574
+ "<SPECIAL_571>",
575
+ "<SPECIAL_572>",
576
+ "<SPECIAL_573>",
577
+ "<SPECIAL_574>",
578
+ "<SPECIAL_575>",
579
+ "<SPECIAL_576>",
580
+ "<SPECIAL_577>",
581
+ "<SPECIAL_578>",
582
+ "<SPECIAL_579>",
583
+ "<SPECIAL_580>",
584
+ "<SPECIAL_581>",
585
+ "<SPECIAL_582>",
586
+ "<SPECIAL_583>",
587
+ "<SPECIAL_584>",
588
+ "<SPECIAL_585>",
589
+ "<SPECIAL_586>",
590
+ "<SPECIAL_587>",
591
+ "<SPECIAL_588>",
592
+ "<SPECIAL_589>",
593
+ "<SPECIAL_590>",
594
+ "<SPECIAL_591>",
595
+ "<SPECIAL_592>",
596
+ "<SPECIAL_593>",
597
+ "<SPECIAL_594>",
598
+ "<SPECIAL_595>",
599
+ "<SPECIAL_596>",
600
+ "<SPECIAL_597>",
601
+ "<SPECIAL_598>",
602
+ "<SPECIAL_599>",
603
+ "<SPECIAL_600>",
604
+ "<SPECIAL_601>",
605
+ "<SPECIAL_602>",
606
+ "<SPECIAL_603>",
607
+ "<SPECIAL_604>",
608
+ "<SPECIAL_605>",
609
+ "<SPECIAL_606>",
610
+ "<SPECIAL_607>",
611
+ "<SPECIAL_608>",
612
+ "<SPECIAL_609>",
613
+ "<SPECIAL_610>",
614
+ "<SPECIAL_611>",
615
+ "<SPECIAL_612>",
616
+ "<SPECIAL_613>",
617
+ "<SPECIAL_614>",
618
+ "<SPECIAL_615>",
619
+ "<SPECIAL_616>",
620
+ "<SPECIAL_617>",
621
+ "<SPECIAL_618>",
622
+ "<SPECIAL_619>",
623
+ "<SPECIAL_620>",
624
+ "<SPECIAL_621>",
625
+ "<SPECIAL_622>",
626
+ "<SPECIAL_623>",
627
+ "<SPECIAL_624>",
628
+ "<SPECIAL_625>",
629
+ "<SPECIAL_626>",
630
+ "<SPECIAL_627>",
631
+ "<SPECIAL_628>",
632
+ "<SPECIAL_629>",
633
+ "<SPECIAL_630>",
634
+ "<SPECIAL_631>",
635
+ "<SPECIAL_632>",
636
+ "<SPECIAL_633>",
637
+ "<SPECIAL_634>",
638
+ "<SPECIAL_635>",
639
+ "<SPECIAL_636>",
640
+ "<SPECIAL_637>",
641
+ "<SPECIAL_638>",
642
+ "<SPECIAL_639>",
643
+ "<SPECIAL_640>",
644
+ "<SPECIAL_641>",
645
+ "<SPECIAL_642>",
646
+ "<SPECIAL_643>",
647
+ "<SPECIAL_644>",
648
+ "<SPECIAL_645>",
649
+ "<SPECIAL_646>",
650
+ "<SPECIAL_647>",
651
+ "<SPECIAL_648>",
652
+ "<SPECIAL_649>",
653
+ "<SPECIAL_650>",
654
+ "<SPECIAL_651>",
655
+ "<SPECIAL_652>",
656
+ "<SPECIAL_653>",
657
+ "<SPECIAL_654>",
658
+ "<SPECIAL_655>",
659
+ "<SPECIAL_656>",
660
+ "<SPECIAL_657>",
661
+ "<SPECIAL_658>",
662
+ "<SPECIAL_659>",
663
+ "<SPECIAL_660>",
664
+ "<SPECIAL_661>",
665
+ "<SPECIAL_662>",
666
+ "<SPECIAL_663>",
667
+ "<SPECIAL_664>",
668
+ "<SPECIAL_665>",
669
+ "<SPECIAL_666>",
670
+ "<SPECIAL_667>",
671
+ "<SPECIAL_668>",
672
+ "<SPECIAL_669>",
673
+ "<SPECIAL_670>",
674
+ "<SPECIAL_671>",
675
+ "<SPECIAL_672>",
676
+ "<SPECIAL_673>",
677
+ "<SPECIAL_674>",
678
+ "<SPECIAL_675>",
679
+ "<SPECIAL_676>",
680
+ "<SPECIAL_677>",
681
+ "<SPECIAL_678>",
682
+ "<SPECIAL_679>",
683
+ "<SPECIAL_680>",
684
+ "<SPECIAL_681>",
685
+ "<SPECIAL_682>",
686
+ "<SPECIAL_683>",
687
+ "<SPECIAL_684>",
688
+ "<SPECIAL_685>",
689
+ "<SPECIAL_686>",
690
+ "<SPECIAL_687>",
691
+ "<SPECIAL_688>",
692
+ "<SPECIAL_689>",
693
+ "<SPECIAL_690>",
694
+ "<SPECIAL_691>",
695
+ "<SPECIAL_692>",
696
+ "<SPECIAL_693>",
697
+ "<SPECIAL_694>",
698
+ "<SPECIAL_695>",
699
+ "<SPECIAL_696>",
700
+ "<SPECIAL_697>",
701
+ "<SPECIAL_698>",
702
+ "<SPECIAL_699>",
703
+ "<SPECIAL_700>",
704
+ "<SPECIAL_701>",
705
+ "<SPECIAL_702>",
706
+ "<SPECIAL_703>",
707
+ "<SPECIAL_704>",
708
+ "<SPECIAL_705>",
709
+ "<SPECIAL_706>",
710
+ "<SPECIAL_707>",
711
+ "<SPECIAL_708>",
712
+ "<SPECIAL_709>",
713
+ "<SPECIAL_710>",
714
+ "<SPECIAL_711>",
715
+ "<SPECIAL_712>",
716
+ "<SPECIAL_713>",
717
+ "<SPECIAL_714>",
718
+ "<SPECIAL_715>",
719
+ "<SPECIAL_716>",
720
+ "<SPECIAL_717>",
721
+ "<SPECIAL_718>",
722
+ "<SPECIAL_719>",
723
+ "<SPECIAL_720>",
724
+ "<SPECIAL_721>",
725
+ "<SPECIAL_722>",
726
+ "<SPECIAL_723>",
727
+ "<SPECIAL_724>",
728
+ "<SPECIAL_725>",
729
+ "<SPECIAL_726>",
730
+ "<SPECIAL_727>",
731
+ "<SPECIAL_728>",
732
+ "<SPECIAL_729>",
733
+ "<SPECIAL_730>",
734
+ "<SPECIAL_731>",
735
+ "<SPECIAL_732>",
736
+ "<SPECIAL_733>",
737
+ "<SPECIAL_734>",
738
+ "<SPECIAL_735>",
739
+ "<SPECIAL_736>",
740
+ "<SPECIAL_737>",
741
+ "<SPECIAL_738>",
742
+ "<SPECIAL_739>",
743
+ "<SPECIAL_740>",
744
+ "<SPECIAL_741>",
745
+ "<SPECIAL_742>",
746
+ "<SPECIAL_743>",
747
+ "<SPECIAL_744>",
748
+ "<SPECIAL_745>",
749
+ "<SPECIAL_746>",
750
+ "<SPECIAL_747>",
751
+ "<SPECIAL_748>",
752
+ "<SPECIAL_749>",
753
+ "<SPECIAL_750>",
754
+ "<SPECIAL_751>",
755
+ "<SPECIAL_752>",
756
+ "<SPECIAL_753>",
757
+ "<SPECIAL_754>",
758
+ "<SPECIAL_755>",
759
+ "<SPECIAL_756>",
760
+ "<SPECIAL_757>",
761
+ "<SPECIAL_758>",
762
+ "<SPECIAL_759>",
763
+ "<SPECIAL_760>",
764
+ "<SPECIAL_761>",
765
+ "<SPECIAL_762>",
766
+ "<SPECIAL_763>",
767
+ "<SPECIAL_764>",
768
+ "<SPECIAL_765>",
769
+ "<SPECIAL_766>",
770
+ "<SPECIAL_767>",
771
+ "<SPECIAL_768>",
772
+ "<SPECIAL_769>",
773
+ "<SPECIAL_770>",
774
+ "<SPECIAL_771>",
775
+ "<SPECIAL_772>",
776
+ "<SPECIAL_773>",
777
+ "<SPECIAL_774>",
778
+ "<SPECIAL_775>",
779
+ "<SPECIAL_776>",
780
+ "<SPECIAL_777>",
781
+ "<SPECIAL_778>",
782
+ "<SPECIAL_779>",
783
+ "<SPECIAL_780>",
784
+ "<SPECIAL_781>",
785
+ "<SPECIAL_782>",
786
+ "<SPECIAL_783>",
787
+ "<SPECIAL_784>",
788
+ "<SPECIAL_785>",
789
+ "<SPECIAL_786>",
790
+ "<SPECIAL_787>",
791
+ "<SPECIAL_788>",
792
+ "<SPECIAL_789>",
793
+ "<SPECIAL_790>",
794
+ "<SPECIAL_791>",
795
+ "<SPECIAL_792>",
796
+ "<SPECIAL_793>",
797
+ "<SPECIAL_794>",
798
+ "<SPECIAL_795>",
799
+ "<SPECIAL_796>",
800
+ "<SPECIAL_797>",
801
+ "<SPECIAL_798>",
802
+ "<SPECIAL_799>",
803
+ "<SPECIAL_800>",
804
+ "<SPECIAL_801>",
805
+ "<SPECIAL_802>",
806
+ "<SPECIAL_803>",
807
+ "<SPECIAL_804>",
808
+ "<SPECIAL_805>",
809
+ "<SPECIAL_806>",
810
+ "<SPECIAL_807>",
811
+ "<SPECIAL_808>",
812
+ "<SPECIAL_809>",
813
+ "<SPECIAL_810>",
814
+ "<SPECIAL_811>",
815
+ "<SPECIAL_812>",
816
+ "<SPECIAL_813>",
817
+ "<SPECIAL_814>",
818
+ "<SPECIAL_815>",
819
+ "<SPECIAL_816>",
820
+ "<SPECIAL_817>",
821
+ "<SPECIAL_818>",
822
+ "<SPECIAL_819>",
823
+ "<SPECIAL_820>",
824
+ "<SPECIAL_821>",
825
+ "<SPECIAL_822>",
826
+ "<SPECIAL_823>",
827
+ "<SPECIAL_824>",
828
+ "<SPECIAL_825>",
829
+ "<SPECIAL_826>",
830
+ "<SPECIAL_827>",
831
+ "<SPECIAL_828>",
832
+ "<SPECIAL_829>",
833
+ "<SPECIAL_830>",
834
+ "<SPECIAL_831>",
835
+ "<SPECIAL_832>",
836
+ "<SPECIAL_833>",
837
+ "<SPECIAL_834>",
838
+ "<SPECIAL_835>",
839
+ "<SPECIAL_836>",
840
+ "<SPECIAL_837>",
841
+ "<SPECIAL_838>",
842
+ "<SPECIAL_839>",
843
+ "<SPECIAL_840>",
844
+ "<SPECIAL_841>",
845
+ "<SPECIAL_842>",
846
+ "<SPECIAL_843>",
847
+ "<SPECIAL_844>",
848
+ "<SPECIAL_845>",
849
+ "<SPECIAL_846>",
850
+ "<SPECIAL_847>",
851
+ "<SPECIAL_848>",
852
+ "<SPECIAL_849>",
853
+ "<SPECIAL_850>",
854
+ "<SPECIAL_851>",
855
+ "<SPECIAL_852>",
856
+ "<SPECIAL_853>",
857
+ "<SPECIAL_854>",
858
+ "<SPECIAL_855>",
859
+ "<SPECIAL_856>",
860
+ "<SPECIAL_857>",
861
+ "<SPECIAL_858>",
862
+ "<SPECIAL_859>",
863
+ "<SPECIAL_860>",
864
+ "<SPECIAL_861>",
865
+ "<SPECIAL_862>",
866
+ "<SPECIAL_863>",
867
+ "<SPECIAL_864>",
868
+ "<SPECIAL_865>",
869
+ "<SPECIAL_866>",
870
+ "<SPECIAL_867>",
871
+ "<SPECIAL_868>",
872
+ "<SPECIAL_869>",
873
+ "<SPECIAL_870>",
874
+ "<SPECIAL_871>",
875
+ "<SPECIAL_872>",
876
+ "<SPECIAL_873>",
877
+ "<SPECIAL_874>",
878
+ "<SPECIAL_875>",
879
+ "<SPECIAL_876>",
880
+ "<SPECIAL_877>",
881
+ "<SPECIAL_878>",
882
+ "<SPECIAL_879>",
883
+ "<SPECIAL_880>",
884
+ "<SPECIAL_881>",
885
+ "<SPECIAL_882>",
886
+ "<SPECIAL_883>",
887
+ "<SPECIAL_884>",
888
+ "<SPECIAL_885>",
889
+ "<SPECIAL_886>",
890
+ "<SPECIAL_887>",
891
+ "<SPECIAL_888>",
892
+ "<SPECIAL_889>",
893
+ "<SPECIAL_890>",
894
+ "<SPECIAL_891>",
895
+ "<SPECIAL_892>",
896
+ "<SPECIAL_893>",
897
+ "<SPECIAL_894>",
898
+ "<SPECIAL_895>",
899
+ "<SPECIAL_896>",
900
+ "<SPECIAL_897>",
901
+ "<SPECIAL_898>",
902
+ "<SPECIAL_899>",
903
+ "<SPECIAL_900>",
904
+ "<SPECIAL_901>",
905
+ "<SPECIAL_902>",
906
+ "<SPECIAL_903>",
907
+ "<SPECIAL_904>",
908
+ "<SPECIAL_905>",
909
+ "<SPECIAL_906>",
910
+ "<SPECIAL_907>",
911
+ "<SPECIAL_908>",
912
+ "<SPECIAL_909>",
913
+ "<SPECIAL_910>",
914
+ "<SPECIAL_911>",
915
+ "<SPECIAL_912>",
916
+ "<SPECIAL_913>",
917
+ "<SPECIAL_914>",
918
+ "<SPECIAL_915>",
919
+ "<SPECIAL_916>",
920
+ "<SPECIAL_917>",
921
+ "<SPECIAL_918>",
922
+ "<SPECIAL_919>",
923
+ "<SPECIAL_920>",
924
+ "<SPECIAL_921>",
925
+ "<SPECIAL_922>",
926
+ "<SPECIAL_923>",
927
+ "<SPECIAL_924>",
928
+ "<SPECIAL_925>",
929
+ "<SPECIAL_926>",
930
+ "<SPECIAL_927>",
931
+ "<SPECIAL_928>",
932
+ "<SPECIAL_929>",
933
+ "<SPECIAL_930>",
934
+ "<SPECIAL_931>",
935
+ "<SPECIAL_932>",
936
+ "<SPECIAL_933>",
937
+ "<SPECIAL_934>",
938
+ "<SPECIAL_935>",
939
+ "<SPECIAL_936>",
940
+ "<SPECIAL_937>",
941
+ "<SPECIAL_938>",
942
+ "<SPECIAL_939>",
943
+ "<SPECIAL_940>",
944
+ "<SPECIAL_941>",
945
+ "<SPECIAL_942>",
946
+ "<SPECIAL_943>",
947
+ "<SPECIAL_944>",
948
+ "<SPECIAL_945>",
949
+ "<SPECIAL_946>",
950
+ "<SPECIAL_947>",
951
+ "<SPECIAL_948>",
952
+ "<SPECIAL_949>",
953
+ "<SPECIAL_950>",
954
+ "<SPECIAL_951>",
955
+ "<SPECIAL_952>",
956
+ "<SPECIAL_953>",
957
+ "<SPECIAL_954>",
958
+ "<SPECIAL_955>",
959
+ "<SPECIAL_956>",
960
+ "<SPECIAL_957>",
961
+ "<SPECIAL_958>",
962
+ "<SPECIAL_959>",
963
+ "<SPECIAL_960>",
964
+ "<SPECIAL_961>",
965
+ "<SPECIAL_962>",
966
+ "<SPECIAL_963>",
967
+ "<SPECIAL_964>",
968
+ "<SPECIAL_965>",
969
+ "<SPECIAL_966>",
970
+ "<SPECIAL_967>",
971
+ "<SPECIAL_968>",
972
+ "<SPECIAL_969>",
973
+ "<SPECIAL_970>",
974
+ "<SPECIAL_971>",
975
+ "<SPECIAL_972>",
976
+ "<SPECIAL_973>",
977
+ "<SPECIAL_974>",
978
+ "<SPECIAL_975>",
979
+ "<SPECIAL_976>",
980
+ "<SPECIAL_977>",
981
+ "<SPECIAL_978>",
982
+ "<SPECIAL_979>",
983
+ "<SPECIAL_980>",
984
+ "<SPECIAL_981>",
985
+ "<SPECIAL_982>",
986
+ "<SPECIAL_983>",
987
+ "<SPECIAL_984>",
988
+ "<SPECIAL_985>",
989
+ "<SPECIAL_986>",
990
+ "<SPECIAL_987>",
991
+ "<SPECIAL_988>",
992
+ "<SPECIAL_989>",
993
+ "<SPECIAL_990>",
994
+ "<SPECIAL_991>",
995
+ "<SPECIAL_992>",
996
+ "<SPECIAL_993>",
997
+ "<SPECIAL_994>",
998
+ "<SPECIAL_995>",
999
+ "<SPECIAL_996>",
1000
+ "<SPECIAL_997>",
1001
+ "<SPECIAL_998>",
1002
+ "<SPECIAL_999>"
1003
+ ],
1004
+ "bos_token": {
1005
+ "content": "<s>",
1006
+ "lstrip": false,
1007
+ "normalized": false,
1008
+ "rstrip": false,
1009
+ "single_word": false
1010
+ },
1011
+ "eos_token": {
1012
+ "content": "</s>",
1013
+ "lstrip": false,
1014
+ "normalized": false,
1015
+ "rstrip": false,
1016
+ "single_word": false
1017
+ },
1018
+ "pad_token": {
1019
+ "content": "</s>",
1020
+ "lstrip": false,
1021
+ "normalized": false,
1022
+ "rstrip": false,
1023
+ "single_word": false
1024
+ },
1025
+ "unk_token": {
1026
+ "content": "<unk>",
1027
+ "lstrip": false,
1028
+ "normalized": false,
1029
+ "rstrip": false,
1030
+ "single_word": false
1031
+ }
1032
+ }
tokenizer.json ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:b76085f9923309d873994d444989f7eb6ec074b06f25b58f1e8d7b7741070949
3
+ size 17078037
tokenizer_config.json ADDED
The diff for this file is too large to render. See raw diff
 
trainer_state.json ADDED
@@ -0,0 +1,2554 @@
1
+ {
2
+ "best_global_step": null,
3
+ "best_metric": null,
4
+ "best_model_checkpoint": null,
5
+ "epoch": 2.9812889812889813,
6
+ "eval_steps": 500,
7
+ "global_step": 360,
8
+ "is_hyper_param_search": false,
9
+ "is_local_process_zero": true,
10
+ "is_world_process_zero": true,
11
+ "log_history": [
12
+ {
13
+ "epoch": 0.008316008316008316,
14
+ "grad_norm": 8.161552429199219,
15
+ "learning_rate": 0.0,
16
+ "loss": 4.8011,
17
+ "step": 1
18
+ },
19
+ {
20
+ "epoch": 0.016632016632016633,
21
+ "grad_norm": 8.788352966308594,
22
+ "learning_rate": 2.0000000000000003e-06,
23
+ "loss": 4.8042,
24
+ "step": 2
25
+ },
26
+ {
27
+ "epoch": 0.02494802494802495,
28
+ "grad_norm": 7.977554798126221,
29
+ "learning_rate": 4.000000000000001e-06,
30
+ "loss": 4.8113,
31
+ "step": 3
32
+ },
33
+ {
34
+ "epoch": 0.033264033264033266,
35
+ "grad_norm": 8.319157600402832,
36
+ "learning_rate": 6e-06,
37
+ "loss": 4.8424,
38
+ "step": 4
39
+ },
40
+ {
41
+ "epoch": 0.04158004158004158,
42
+ "grad_norm": 6.344866752624512,
43
+ "learning_rate": 8.000000000000001e-06,
44
+ "loss": 4.7736,
45
+ "step": 5
46
+ },
47
+ {
48
+ "epoch": 0.0498960498960499,
49
+ "grad_norm": 5.9034647941589355,
50
+ "learning_rate": 1e-05,
51
+ "loss": 4.7142,
52
+ "step": 6
53
+ },
54
+ {
55
+ "epoch": 0.058212058212058215,
56
+ "grad_norm": 5.377430438995361,
57
+ "learning_rate": 1.2e-05,
58
+ "loss": 4.6148,
59
+ "step": 7
60
+ },
61
+ {
62
+ "epoch": 0.06652806652806653,
63
+ "grad_norm": 6.9711222648620605,
64
+ "learning_rate": 1.4000000000000001e-05,
65
+ "loss": 4.5419,
66
+ "step": 8
67
+ },
68
+ {
69
+ "epoch": 0.07484407484407485,
70
+ "grad_norm": 6.142094612121582,
71
+ "learning_rate": 1.6000000000000003e-05,
72
+ "loss": 4.4492,
73
+ "step": 9
74
+ },
75
+ {
76
+ "epoch": 0.08316008316008316,
77
+ "grad_norm": 4.155233383178711,
78
+ "learning_rate": 1.8e-05,
79
+ "loss": 4.2954,
80
+ "step": 10
81
+ },
82
+ {
83
+ "epoch": 0.09147609147609148,
84
+ "grad_norm": 3.6919031143188477,
85
+ "learning_rate": 2e-05,
86
+ "loss": 4.208,
87
+ "step": 11
88
+ },
89
+ {
90
+ "epoch": 0.0997920997920998,
91
+ "grad_norm": 3.5758368968963623,
92
+ "learning_rate": 2.2000000000000003e-05,
93
+ "loss": 4.0939,
94
+ "step": 12
95
+ },
96
+ {
97
+ "epoch": 0.10810810810810811,
98
+ "grad_norm": 2.7365787029266357,
99
+ "learning_rate": 2.4e-05,
100
+ "loss": 4.0046,
101
+ "step": 13
102
+ },
103
+ {
104
+ "epoch": 0.11642411642411643,
105
+ "grad_norm": 2.0502889156341553,
106
+ "learning_rate": 2.6000000000000002e-05,
107
+ "loss": 3.9557,
108
+ "step": 14
109
+ },
110
+ {
111
+ "epoch": 0.12474012474012475,
112
+ "grad_norm": 2.2007663249969482,
113
+ "learning_rate": 2.8000000000000003e-05,
114
+ "loss": 3.8579,
115
+ "step": 15
116
+ },
117
+ {
118
+ "epoch": 0.13305613305613306,
119
+ "grad_norm": 2.462459087371826,
120
+ "learning_rate": 3e-05,
121
+ "loss": 3.8167,
122
+ "step": 16
123
+ },
124
+ {
125
+ "epoch": 0.14137214137214138,
126
+ "grad_norm": 2.7364518642425537,
127
+ "learning_rate": 3.2000000000000005e-05,
128
+ "loss": 3.8004,
129
+ "step": 17
130
+ },
131
+ {
132
+ "epoch": 0.1496881496881497,
133
+ "grad_norm": 2.85400128364563,
134
+ "learning_rate": 3.4000000000000007e-05,
135
+ "loss": 3.6991,
136
+ "step": 18
137
+ },
138
+ {
139
+ "epoch": 0.158004158004158,
140
+ "grad_norm": 2.499318838119507,
141
+ "learning_rate": 3.6e-05,
142
+ "loss": 3.6456,
143
+ "step": 19
144
+ },
145
+ {
146
+ "epoch": 0.16632016632016633,
147
+ "grad_norm": 2.214195728302002,
148
+ "learning_rate": 3.8e-05,
149
+ "loss": 3.6195,
150
+ "step": 20
151
+ },
152
+ {
153
+ "epoch": 0.17463617463617465,
154
+ "grad_norm": 2.195711135864258,
155
+ "learning_rate": 4e-05,
156
+ "loss": 3.5519,
157
+ "step": 21
158
+ },
159
+ {
160
+ "epoch": 0.18295218295218296,
161
+ "grad_norm": 1.992714762687683,
162
+ "learning_rate": 4.2e-05,
163
+ "loss": 3.5034,
164
+ "step": 22
165
+ },
166
+ {
167
+ "epoch": 0.19126819126819128,
168
+ "grad_norm": 2.6501753330230713,
169
+ "learning_rate": 4.4000000000000006e-05,
170
+ "loss": 3.4722,
171
+ "step": 23
172
+ },
173
+ {
174
+ "epoch": 0.1995841995841996,
175
+ "grad_norm": 2.2270214557647705,
176
+ "learning_rate": 4.600000000000001e-05,
177
+ "loss": 3.4126,
178
+ "step": 24
179
+ },
180
+ {
181
+ "epoch": 0.2079002079002079,
182
+ "grad_norm": 3.029968023300171,
183
+ "learning_rate": 4.8e-05,
184
+ "loss": 3.3517,
185
+ "step": 25
186
+ },
187
+ {
188
+ "epoch": 0.21621621621621623,
189
+ "grad_norm": 1.7878334522247314,
190
+ "learning_rate": 5e-05,
191
+ "loss": 3.3321,
192
+ "step": 26
193
+ },
194
+ {
195
+ "epoch": 0.22453222453222454,
196
+ "grad_norm": 1.6116269826889038,
197
+ "learning_rate": 5.2000000000000004e-05,
198
+ "loss": 3.2636,
199
+ "step": 27
200
+ },
201
+ {
202
+ "epoch": 0.23284823284823286,
203
+ "grad_norm": 2.6376936435699463,
204
+ "learning_rate": 5.4000000000000005e-05,
205
+ "loss": 3.2541,
206
+ "step": 28
207
+ },
208
+ {
209
+ "epoch": 0.24116424116424118,
210
+ "grad_norm": 10.313434600830078,
211
+ "learning_rate": 5.6000000000000006e-05,
212
+ "loss": 3.2762,
213
+ "step": 29
214
+ },
215
+ {
216
+ "epoch": 0.2494802494802495,
217
+ "grad_norm": 5.4037251472473145,
218
+ "learning_rate": 5.8e-05,
219
+ "loss": 3.2097,
220
+ "step": 30
221
+ },
222
+ {
223
+ "epoch": 0.2577962577962578,
224
+ "grad_norm": 15.105055809020996,
225
+ "learning_rate": 6e-05,
226
+ "loss": 3.2342,
227
+ "step": 31
228
+ },
229
+ {
230
+ "epoch": 0.2661122661122661,
231
+ "grad_norm": 15.099345207214355,
232
+ "learning_rate": 6.2e-05,
233
+ "loss": 3.2316,
234
+ "step": 32
235
+ },
236
+ {
237
+ "epoch": 0.27442827442827444,
238
+ "grad_norm": 1.7807673215866089,
239
+ "learning_rate": 6.400000000000001e-05,
240
+ "loss": 3.1318,
241
+ "step": 33
242
+ },
243
+ {
244
+ "epoch": 0.28274428274428276,
245
+ "grad_norm": 7.983642101287842,
246
+ "learning_rate": 6.6e-05,
247
+ "loss": 3.1442,
248
+ "step": 34
249
+ },
250
+ {
251
+ "epoch": 0.2910602910602911,
252
+ "grad_norm": 4.913079738616943,
253
+ "learning_rate": 6.800000000000001e-05,
254
+ "loss": 3.1082,
255
+ "step": 35
256
+ },
257
+ {
258
+ "epoch": 0.2993762993762994,
259
+ "grad_norm": 12.353167533874512,
260
+ "learning_rate": 7e-05,
261
+ "loss": 3.1656,
262
+ "step": 36
263
+ },
264
+ {
265
+ "epoch": 0.3076923076923077,
266
+ "grad_norm": 10.911690711975098,
267
+ "learning_rate": 7.2e-05,
268
+ "loss": 3.1361,
269
+ "step": 37
270
+ },
271
+ {
272
+ "epoch": 0.316008316008316,
273
+ "grad_norm": 1.3475183248519897,
274
+ "learning_rate": 7.4e-05,
275
+ "loss": 3.057,
276
+ "step": 38
277
+ },
278
+ {
279
+ "epoch": 0.32432432432432434,
280
+ "grad_norm": 7.743471145629883,
281
+ "learning_rate": 7.6e-05,
282
+ "loss": 3.0549,
283
+ "step": 39
284
+ },
285
+ {
286
+ "epoch": 0.33264033264033266,
287
+ "grad_norm": 4.499805450439453,
288
+ "learning_rate": 7.800000000000001e-05,
289
+ "loss": 2.9976,
290
+ "step": 40
291
+ },
292
+ {
293
+ "epoch": 0.340956340956341,
294
+ "grad_norm": 4.716672420501709,
295
+ "learning_rate": 8e-05,
296
+ "loss": 2.9922,
297
+ "step": 41
298
+ },
299
+ {
300
+ "epoch": 0.3492723492723493,
301
+ "grad_norm": 5.240478038787842,
302
+ "learning_rate": 8.2e-05,
303
+ "loss": 3.0115,
304
+ "step": 42
305
+ },
306
+ {
307
+ "epoch": 0.3575883575883576,
308
+ "grad_norm": 1.5897458791732788,
309
+ "learning_rate": 8.4e-05,
310
+ "loss": 2.9899,
311
+ "step": 43
312
+ },
313
+ {
314
+ "epoch": 0.3659043659043659,
315
+ "grad_norm": 2.463665723800659,
316
+ "learning_rate": 8.6e-05,
317
+ "loss": 2.9799,
318
+ "step": 44
319
+ },
320
+ {
321
+ "epoch": 0.37422037422037424,
322
+ "grad_norm": 1.827469825744629,
323
+ "learning_rate": 8.800000000000001e-05,
324
+ "loss": 2.928,
325
+ "step": 45
326
+ },
327
+ {
328
+ "epoch": 0.38253638253638256,
329
+ "grad_norm": 1.878302812576294,
330
+ "learning_rate": 9e-05,
331
+ "loss": 2.9717,
332
+ "step": 46
333
+ },
334
+ {
335
+ "epoch": 0.3908523908523909,
336
+ "grad_norm": 2.1960086822509766,
337
+ "learning_rate": 9.200000000000001e-05,
338
+ "loss": 2.9037,
339
+ "step": 47
340
+ },
341
+ {
342
+ "epoch": 0.3991683991683992,
343
+ "grad_norm": 3.8804755210876465,
344
+ "learning_rate": 9.4e-05,
345
+ "loss": 2.9174,
346
+ "step": 48
347
+ },
348
+ {
349
+ "epoch": 0.4074844074844075,
350
+ "grad_norm": 1.1289819478988647,
351
+ "learning_rate": 9.6e-05,
352
+ "loss": 2.8997,
353
+ "step": 49
354
+ },
355
+ {
356
+ "epoch": 0.4158004158004158,
357
+ "grad_norm": 4.365386009216309,
358
+ "learning_rate": 9.8e-05,
359
+ "loss": 2.8705,
360
+ "step": 50
361
+ },
362
+ {
363
+ "epoch": 0.42411642411642414,
364
+ "grad_norm": 4.703887462615967,
365
+ "learning_rate": 0.0001,
366
+ "loss": 2.876,
367
+ "step": 51
368
+ },
369
+ {
370
+ "epoch": 0.43243243243243246,
371
+ "grad_norm": 2.8481767177581787,
372
+ "learning_rate": 0.00010200000000000001,
373
+ "loss": 2.8617,
374
+ "step": 52
375
+ },
376
+ {
377
+ "epoch": 0.4407484407484408,
378
+ "grad_norm": 10.58674144744873,
379
+ "learning_rate": 0.00010400000000000001,
380
+ "loss": 2.8998,
381
+ "step": 53
382
+ },
383
+ {
384
+ "epoch": 0.4490644490644491,
385
+ "grad_norm": 6.807188510894775,
386
+ "learning_rate": 0.00010600000000000002,
387
+ "loss": 2.8795,
388
+ "step": 54
389
+ },
390
+ {
391
+ "epoch": 0.4573804573804574,
392
+ "grad_norm": 2.453004837036133,
393
+ "learning_rate": 0.00010800000000000001,
394
+ "loss": 2.8707,
395
+ "step": 55
396
+ },
397
+ {
398
+ "epoch": 0.4656964656964657,
399
+ "grad_norm": 1.8895411491394043,
400
+ "learning_rate": 0.00011000000000000002,
401
+ "loss": 2.8367,
402
+ "step": 56
403
+ },
404
+ {
405
+ "epoch": 0.47401247401247404,
406
+ "grad_norm": 3.585893154144287,
407
+ "learning_rate": 0.00011200000000000001,
408
+ "loss": 2.8564,
409
+ "step": 57
410
+ },
411
+ {
412
+ "epoch": 0.48232848232848236,
413
+ "grad_norm": 2.117868661880493,
414
+ "learning_rate": 0.00011399999999999999,
415
+ "loss": 2.8326,
416
+ "step": 58
417
+ },
418
+ {
419
+ "epoch": 0.49064449064449067,
420
+ "grad_norm": 1.4010989665985107,
421
+ "learning_rate": 0.000116,
422
+ "loss": 2.784,
423
+ "step": 59
424
+ },
425
+ {
426
+ "epoch": 0.498960498960499,
427
+ "grad_norm": 0.8910171985626221,
428
+ "learning_rate": 0.000118,
429
+ "loss": 2.7828,
430
+ "step": 60
431
+ },
432
+ {
433
+ "epoch": 0.5072765072765073,
434
+ "grad_norm": 0.8965553641319275,
435
+ "learning_rate": 0.00012,
436
+ "loss": 2.7768,
437
+ "step": 61
438
+ },
439
+ {
440
+ "epoch": 0.5155925155925156,
441
+ "grad_norm": 1.0502556562423706,
442
+ "learning_rate": 0.000122,
443
+ "loss": 2.7733,
444
+ "step": 62
445
+ },
446
+ {
447
+ "epoch": 0.5239085239085239,
448
+ "grad_norm": 2.425708532333374,
449
+ "learning_rate": 0.000124,
450
+ "loss": 2.7897,
451
+ "step": 63
452
+ },
453
+ {
454
+ "epoch": 0.5322245322245323,
455
+ "grad_norm": 0.8981500864028931,
456
+ "learning_rate": 0.000126,
457
+ "loss": 2.7452,
458
+ "step": 64
459
+ },
460
+ {
461
+ "epoch": 0.5405405405405406,
462
+ "grad_norm": 0.7442967891693115,
463
+ "learning_rate": 0.00012800000000000002,
464
+ "loss": 2.7297,
465
+ "step": 65
466
+ },
467
+ {
468
+ "epoch": 0.5488565488565489,
469
+ "grad_norm": 1.2096165418624878,
470
+ "learning_rate": 0.00013000000000000002,
471
+ "loss": 2.7521,
472
+ "step": 66
473
+ },
474
+ {
475
+ "epoch": 0.5571725571725572,
476
+ "grad_norm": 5.993701457977295,
477
+ "learning_rate": 0.000132,
478
+ "loss": 2.7633,
479
+ "step": 67
480
+ },
481
+ {
482
+ "epoch": 0.5654885654885655,
483
+ "grad_norm": 1.6413137912750244,
484
+ "learning_rate": 0.000134,
485
+ "loss": 2.7269,
486
+ "step": 68
487
+ },
488
+ {
489
+ "epoch": 0.5738045738045738,
490
+ "grad_norm": 2.8430962562561035,
491
+ "learning_rate": 0.00013600000000000003,
492
+ "loss": 2.7717,
493
+ "step": 69
494
+ },
495
+ {
496
+ "epoch": 0.5821205821205822,
497
+ "grad_norm": 2.3216440677642822,
498
+ "learning_rate": 0.000138,
499
+ "loss": 2.8099,
500
+ "step": 70
501
+ },
502
+ {
503
+ "epoch": 0.5904365904365905,
504
+ "grad_norm": 1.4732354879379272,
505
+ "learning_rate": 0.00014,
506
+ "loss": 2.7525,
507
+ "step": 71
508
+ },
509
+ {
510
+ "epoch": 0.5987525987525988,
511
+ "grad_norm": 1.524367332458496,
512
+ "learning_rate": 0.000142,
513
+ "loss": 2.7422,
514
+ "step": 72
515
+ },
516
+ {
517
+ "epoch": 0.6070686070686071,
518
+ "grad_norm": 1.230338215827942,
519
+ "learning_rate": 0.000144,
520
+ "loss": 2.7555,
521
+ "step": 73
522
+ },
523
+ {
524
+ "epoch": 0.6153846153846154,
525
+ "grad_norm": 0.9941631555557251,
526
+ "learning_rate": 0.000146,
527
+ "loss": 2.7042,
528
+ "step": 74
529
+ },
530
+ {
531
+ "epoch": 0.6237006237006237,
532
+ "grad_norm": 1.3642252683639526,
533
+ "learning_rate": 0.000148,
534
+ "loss": 2.7088,
535
+ "step": 75
536
+ },
537
+ {
538
+ "epoch": 0.632016632016632,
539
+ "grad_norm": 0.681107223033905,
540
+ "learning_rate": 0.00015000000000000001,
541
+ "loss": 2.7168,
542
+ "step": 76
543
+ },
544
+ {
545
+ "epoch": 0.6403326403326404,
546
+ "grad_norm": 0.8406685590744019,
547
+ "learning_rate": 0.000152,
548
+ "loss": 2.6938,
549
+ "step": 77
550
+ },
551
+ {
552
+ "epoch": 0.6486486486486487,
553
+ "grad_norm": 0.6661787033081055,
554
+ "learning_rate": 0.000154,
555
+ "loss": 2.7035,
556
+ "step": 78
557
+ },
558
+ {
559
+ "epoch": 0.656964656964657,
560
+ "grad_norm": 0.5472131967544556,
561
+ "learning_rate": 0.00015600000000000002,
562
+ "loss": 2.6913,
563
+ "step": 79
564
+ },
565
+ {
566
+ "epoch": 0.6652806652806653,
567
+ "grad_norm": 0.5465010404586792,
568
+ "learning_rate": 0.00015800000000000002,
569
+ "loss": 2.6913,
570
+ "step": 80
571
+ },
572
+ {
573
+ "epoch": 0.6735966735966736,
574
+ "grad_norm": 0.6352857351303101,
575
+ "learning_rate": 0.00016,
576
+ "loss": 2.6834,
577
+ "step": 81
578
+ },
579
+ {
580
+ "epoch": 0.681912681912682,
581
+ "grad_norm": 0.7992680668830872,
582
+ "learning_rate": 0.000162,
583
+ "loss": 2.6984,
584
+ "step": 82
585
+ },
586
+ {
587
+ "epoch": 0.6902286902286903,
588
+ "grad_norm": 0.543773889541626,
589
+ "learning_rate": 0.000164,
590
+ "loss": 2.6631,
591
+ "step": 83
592
+ },
593
+ {
594
+ "epoch": 0.6985446985446986,
595
+ "grad_norm": 0.4968494474887848,
596
+ "learning_rate": 0.000166,
597
+ "loss": 2.6707,
598
+ "step": 84
599
+ },
600
+ {
601
+ "epoch": 0.7068607068607069,
602
+ "grad_norm": 0.5793298482894897,
603
+ "learning_rate": 0.000168,
604
+ "loss": 2.6776,
605
+ "step": 85
606
+ },
607
+ {
608
+ "epoch": 0.7151767151767152,
609
+ "grad_norm": 1.137745976448059,
610
+ "learning_rate": 0.00017,
611
+ "loss": 2.6852,
612
+ "step": 86
613
+ },
614
+ {
615
+ "epoch": 0.7234927234927235,
616
+ "grad_norm": 1.2862417697906494,
617
+ "learning_rate": 0.000172,
618
+ "loss": 2.6967,
619
+ "step": 87
620
+ },
621
+ {
622
+ "epoch": 0.7318087318087318,
623
+ "grad_norm": 0.5603763461112976,
624
+ "learning_rate": 0.000174,
625
+ "loss": 2.6928,
626
+ "step": 88
627
+ },
628
+ {
629
+ "epoch": 0.7401247401247402,
630
+ "grad_norm": 0.9993265867233276,
631
+ "learning_rate": 0.00017600000000000002,
632
+ "loss": 2.667,
633
+ "step": 89
634
+ },
635
+ {
636
+ "epoch": 0.7484407484407485,
637
+ "grad_norm": 0.982528567314148,
638
+ "learning_rate": 0.00017800000000000002,
639
+ "loss": 2.6673,
640
+ "step": 90
641
+ },
642
+ {
643
+ "epoch": 0.7567567567567568,
644
+ "grad_norm": 0.6244588494300842,
645
+ "learning_rate": 0.00018,
646
+ "loss": 2.6803,
647
+ "step": 91
648
+ },
649
+ {
650
+ "epoch": 0.7650727650727651,
651
+ "grad_norm": 0.5460201501846313,
652
+ "learning_rate": 0.000182,
653
+ "loss": 2.6341,
654
+ "step": 92
655
+ },
656
+ {
657
+ "epoch": 0.7733887733887734,
658
+ "grad_norm": 0.618466854095459,
659
+ "learning_rate": 0.00018400000000000003,
660
+ "loss": 2.6338,
661
+ "step": 93
662
+ },
663
+ {
664
+ "epoch": 0.7817047817047817,
665
+ "grad_norm": 0.5517793893814087,
666
+ "learning_rate": 0.00018600000000000002,
667
+ "loss": 2.6288,
668
+ "step": 94
669
+ },
670
+ {
671
+ "epoch": 0.7900207900207901,
672
+ "grad_norm": 0.5810732841491699,
673
+ "learning_rate": 0.000188,
674
+ "loss": 2.6575,
675
+ "step": 95
676
+ },
677
+ {
678
+ "epoch": 0.7983367983367984,
679
+ "grad_norm": 0.5090439915657043,
680
+ "learning_rate": 0.00019,
681
+ "loss": 2.6646,
682
+ "step": 96
683
+ },
684
+ {
685
+ "epoch": 0.8066528066528067,
686
+ "grad_norm": 0.5052196979522705,
687
+ "learning_rate": 0.000192,
688
+ "loss": 2.6515,
689
+ "step": 97
690
+ },
691
+ {
692
+ "epoch": 0.814968814968815,
693
+ "grad_norm": 0.4681943953037262,
694
+ "learning_rate": 0.000194,
695
+ "loss": 2.6507,
696
+ "step": 98
697
+ },
698
+ {
699
+ "epoch": 0.8232848232848233,
700
+ "grad_norm": 0.47244778275489807,
701
+ "learning_rate": 0.000196,
702
+ "loss": 2.6259,
703
+ "step": 99
704
+ },
705
+ {
706
+ "epoch": 0.8316008316008316,
707
+ "grad_norm": 0.6378790140151978,
708
+ "learning_rate": 0.00019800000000000002,
709
+ "loss": 2.6371,
710
+ "step": 100
711
+ },
712
+ {
713
+ "epoch": 0.83991683991684,
714
+ "grad_norm": 0.5825613737106323,
715
+ "learning_rate": 0.0002,
716
+ "loss": 2.6397,
717
+ "step": 101
718
+ },
719
+ {
720
+ "epoch": 0.8482328482328483,
721
+ "grad_norm": 0.4540935754776001,
722
+ "learning_rate": 0.00019999270008556108,
723
+ "loss": 2.6295,
724
+ "step": 102
725
+ },
726
+ {
727
+ "epoch": 0.8565488565488566,
728
+ "grad_norm": 0.4203695058822632,
729
+ "learning_rate": 0.00019997080140801932,
730
+ "loss": 2.6144,
731
+ "step": 103
732
+ },
733
+ {
734
+ "epoch": 0.8648648648648649,
735
+ "grad_norm": 0.4233225882053375,
736
+ "learning_rate": 0.00019993430716454413,
737
+ "loss": 2.6378,
738
+ "step": 104
739
+ },
740
+ {
741
+ "epoch": 0.8731808731808732,
742
+ "grad_norm": 0.4105020761489868,
743
+ "learning_rate": 0.00019988322268323268,
744
+ "loss": 2.6502,
745
+ "step": 105
746
+ },
747
+ {
748
+ "epoch": 0.8814968814968815,
749
+ "grad_norm": 0.38928887248039246,
750
+ "learning_rate": 0.00019981755542233177,
751
+ "loss": 2.6154,
752
+ "step": 106
753
+ },
754
+ {
755
+ "epoch": 0.8898128898128899,
756
+ "grad_norm": 0.4148002862930298,
757
+ "learning_rate": 0.00019973731496914914,
758
+ "loss": 2.6405,
759
+ "step": 107
760
+ },
761
+ {
762
+ "epoch": 0.8981288981288982,
763
+ "grad_norm": 0.39650681614875793,
764
+ "learning_rate": 0.00019964251303865362,
765
+ "loss": 2.6215,
766
+ "step": 108
767
+ },
768
+ {
769
+ "epoch": 0.9064449064449065,
770
+ "grad_norm": 0.4093710780143738,
771
+ "learning_rate": 0.00019953316347176488,
772
+ "loss": 2.6382,
773
+ "step": 109
774
+ },
775
+ {
776
+ "epoch": 0.9147609147609148,
777
+ "grad_norm": 0.43350887298583984,
778
+ "learning_rate": 0.00019940928223333252,
779
+ "loss": 2.6281,
780
+ "step": 110
781
+ },
782
+ {
783
+ "epoch": 0.9230769230769231,
784
+ "grad_norm": 0.5919923782348633,
785
+ "learning_rate": 0.0001992708874098054,
786
+ "loss": 2.6348,
787
+ "step": 111
788
+ },
789
+ {
790
+ "epoch": 0.9313929313929314,
791
+ "grad_norm": 2.143718957901001,
792
+ "learning_rate": 0.00019911799920659093,
793
+ "loss": 2.6295,
794
+ "step": 112
795
+ },
796
+ {
797
+ "epoch": 0.9397089397089398,
798
+ "grad_norm": 0.8955861330032349,
799
+ "learning_rate": 0.0001989506399451051,
800
+ "loss": 2.5963,
801
+ "step": 113
802
+ },
803
+ {
804
+ "epoch": 0.9480249480249481,
805
+ "grad_norm": 0.7444430589675903,
806
+ "learning_rate": 0.00019876883405951377,
807
+ "loss": 2.6131,
808
+ "step": 114
809
+ },
810
+ {
811
+ "epoch": 0.9563409563409564,
812
+ "grad_norm": 0.6534228324890137,
813
+ "learning_rate": 0.0001985726080931651,
814
+ "loss": 2.621,
815
+ "step": 115
816
+ },
817
+ {
818
+ "epoch": 0.9646569646569647,
819
+ "grad_norm": 0.6738382577896118,
820
+ "learning_rate": 0.00019836199069471437,
821
+ "loss": 2.6123,
822
+ "step": 116
823
+ },
824
+ {
825
+ "epoch": 0.972972972972973,
826
+ "grad_norm": 0.46829137206077576,
827
+ "learning_rate": 0.00019813701261394136,
828
+ "loss": 2.6216,
829
+ "step": 117
830
+ },
831
+ {
832
+ "epoch": 0.9812889812889813,
833
+ "grad_norm": 0.5436323881149292,
834
+ "learning_rate": 0.00019789770669726087,
835
+ "loss": 2.6364,
836
+ "step": 118
837
+ },
838
+ {
839
+ "epoch": 0.9896049896049897,
840
+ "grad_norm": 0.3988193869590759,
841
+ "learning_rate": 0.00019764410788292722,
842
+ "loss": 2.6036,
843
+ "step": 119
844
+ },
845
+ {
846
+ "epoch": 0.997920997920998,
847
+ "grad_norm": 0.43007829785346985,
848
+ "learning_rate": 0.00019737625319593335,
849
+ "loss": 2.6067,
850
+ "step": 120
851
+ },
852
+ {
853
+ "epoch": 1.0,
854
+ "grad_norm": 0.6972622871398926,
855
+ "learning_rate": 0.0001970941817426052,
856
+ "loss": 2.5875,
857
+ "step": 121
858
+ },
859
+ {
860
+ "epoch": 1.0083160083160083,
861
+ "grad_norm": 0.7436782121658325,
862
+ "learning_rate": 0.00019679793470489228,
863
+ "loss": 2.6127,
864
+ "step": 122
865
+ },
866
+ {
867
+ "epoch": 1.0166320166320166,
868
+ "grad_norm": 2.4193379878997803,
869
+ "learning_rate": 0.00019648755533435518,
870
+ "loss": 2.575,
871
+ "step": 123
872
+ },
873
+ {
874
+ "epoch": 1.024948024948025,
875
+ "grad_norm": 0.6707261204719543,
876
+ "learning_rate": 0.00019616308894585078,
877
+ "loss": 2.5704,
878
+ "step": 124
879
+ },
880
+ {
881
+ "epoch": 1.0332640332640333,
882
+ "grad_norm": 0.46323344111442566,
883
+ "learning_rate": 0.00019582458291091663,
884
+ "loss": 2.5438,
885
+ "step": 125
886
+ },
887
+ {
888
+ "epoch": 1.0415800415800416,
889
+ "grad_norm": 0.5376307964324951,
890
+ "learning_rate": 0.00019547208665085457,
891
+ "loss": 2.5629,
892
+ "step": 126
893
+ },
894
+ {
895
+ "epoch": 1.04989604989605,
896
+ "grad_norm": 0.4404616951942444,
897
+ "learning_rate": 0.00019510565162951537,
898
+ "loss": 2.5539,
899
+ "step": 127
900
+ },
901
+ {
902
+ "epoch": 1.0582120582120582,
903
+ "grad_norm": 0.45081833004951477,
904
+ "learning_rate": 0.00019472533134578507,
905
+ "loss": 2.558,
906
+ "step": 128
907
+ },
908
+ {
909
+ "epoch": 1.0665280665280665,
910
+ "grad_norm": 0.6300471425056458,
911
+ "learning_rate": 0.0001943311813257743,
912
+ "loss": 2.5688,
913
+ "step": 129
914
+ },
915
+ {
916
+ "epoch": 1.0748440748440748,
917
+ "grad_norm": 0.49277371168136597,
918
+ "learning_rate": 0.00019392325911471155,
919
+ "loss": 2.5545,
920
+ "step": 130
921
+ },
922
+ {
923
+ "epoch": 1.0831600831600832,
924
+ "grad_norm": 0.4726185202598572,
925
+ "learning_rate": 0.0001935016242685415,
926
+ "loss": 2.569,
927
+ "step": 131
928
+ },
929
+ {
930
+ "epoch": 1.0914760914760915,
931
+ "grad_norm": 0.5317509174346924,
932
+ "learning_rate": 0.00019306633834523024,
933
+ "loss": 2.5738,
934
+ "step": 132
935
+ },
936
+ {
937
+ "epoch": 1.0997920997920998,
938
+ "grad_norm": 0.4991399645805359,
939
+ "learning_rate": 0.00019261746489577765,
940
+ "loss": 2.5491,
941
+ "step": 133
942
+ },
943
+ {
944
+ "epoch": 1.1081081081081081,
945
+ "grad_norm": 0.6071000099182129,
946
+ "learning_rate": 0.0001921550694549393,
947
+ "loss": 2.5667,
948
+ "step": 134
949
+ },
950
+ {
951
+ "epoch": 1.1164241164241164,
952
+ "grad_norm": 0.4629747271537781,
953
+ "learning_rate": 0.00019167921953165825,
954
+ "loss": 2.5514,
955
+ "step": 135
956
+ },
957
+ {
958
+ "epoch": 1.1247401247401247,
959
+ "grad_norm": 0.41420885920524597,
960
+ "learning_rate": 0.00019118998459920902,
961
+ "loss": 2.5432,
962
+ "step": 136
963
+ },
964
+ {
965
+ "epoch": 1.133056133056133,
966
+ "grad_norm": 0.41042062640190125,
967
+ "learning_rate": 0.00019068743608505455,
968
+ "loss": 2.5783,
969
+ "step": 137
970
+ },
971
+ {
972
+ "epoch": 1.1413721413721414,
973
+ "grad_norm": 0.40009400248527527,
974
+ "learning_rate": 0.00019017164736041795,
975
+ "loss": 2.5598,
976
+ "step": 138
977
+ },
978
+ {
979
+ "epoch": 1.1496881496881497,
980
+ "grad_norm": 0.44067704677581787,
981
+ "learning_rate": 0.00018964269372957038,
982
+ "loss": 2.56,
983
+ "step": 139
984
+ },
985
+ {
986
+ "epoch": 1.158004158004158,
987
+ "grad_norm": 0.4079325795173645,
988
+ "learning_rate": 0.0001891006524188368,
989
+ "loss": 2.5748,
990
+ "step": 140
991
+ },
992
+ {
993
+ "epoch": 1.1663201663201663,
994
+ "grad_norm": 0.4190751314163208,
995
+ "learning_rate": 0.000188545602565321,
996
+ "loss": 2.5569,
997
+ "step": 141
998
+ },
999
+ {
1000
+ "epoch": 1.1746361746361746,
1001
+ "grad_norm": 0.39320653676986694,
1002
+ "learning_rate": 0.00018797762520535177,
1003
+ "loss": 2.5689,
1004
+ "step": 142
1005
+ },
1006
+ {
1007
+ "epoch": 1.182952182952183,
1008
+ "grad_norm": 0.3822016716003418,
1009
+ "learning_rate": 0.0001873968032626518,
1010
+ "loss": 2.543,
1011
+ "step": 143
1012
+ },
1013
+ {
1014
+ "epoch": 1.1912681912681913,
1015
+ "grad_norm": 0.37677645683288574,
1016
+ "learning_rate": 0.00018680322153623075,
1017
+ "loss": 2.5315,
1018
+ "step": 144
1019
+ },
1020
+ {
1021
+ "epoch": 1.1995841995841996,
1022
+ "grad_norm": 0.3912084996700287,
1023
+ "learning_rate": 0.00018619696668800492,
1024
+ "loss": 2.5344,
1025
+ "step": 145
1026
+ },
1027
+ {
1028
+ "epoch": 1.207900207900208,
1029
+ "grad_norm": 0.3857469856739044,
1030
+ "learning_rate": 0.00018557812723014476,
1031
+ "loss": 2.5631,
1032
+ "step": 146
1033
+ },
1034
+ {
1035
+ "epoch": 1.2162162162162162,
1036
+ "grad_norm": 0.38990500569343567,
1037
+ "learning_rate": 0.0001849467935121521,
1038
+ "loss": 2.5472,
1039
+ "step": 147
1040
+ },
1041
+ {
1042
+ "epoch": 1.2245322245322245,
1043
+ "grad_norm": 0.38198038935661316,
1044
+ "learning_rate": 0.00018430305770766948,
1045
+ "loss": 2.5287,
1046
+ "step": 148
1047
+ },
1048
+ {
1049
+ "epoch": 1.2328482328482329,
1050
+ "grad_norm": 0.5182828307151794,
1051
+ "learning_rate": 0.00018364701380102266,
1052
+ "loss": 2.5404,
1053
+ "step": 149
1054
+ },
1055
+ {
1056
+ "epoch": 1.2411642411642412,
1057
+ "grad_norm": 0.4881569445133209,
1058
+ "learning_rate": 0.00018297875757349952,
1059
+ "loss": 2.5547,
1060
+ "step": 150
1061
+ },
1062
+ {
1063
+ "epoch": 1.2494802494802495,
1064
+ "grad_norm": 0.4347280263900757,
1065
+ "learning_rate": 0.00018229838658936564,
1066
+ "loss": 2.5338,
1067
+ "step": 151
1068
+ },
1069
+ {
1070
+ "epoch": 1.2577962577962578,
1071
+ "grad_norm": 0.47588396072387695,
1072
+ "learning_rate": 0.0001816060001816205,
1073
+ "loss": 2.5507,
1074
+ "step": 152
1075
+ },
1076
+ {
1077
+ "epoch": 1.2661122661122661,
1078
+ "grad_norm": 0.3899973928928375,
1079
+ "learning_rate": 0.00018090169943749476,
1080
+ "loss": 2.5596,
1081
+ "step": 153
1082
+ },
1083
+ {
1084
+ "epoch": 1.2744282744282744,
1085
+ "grad_norm": 0.39740344882011414,
1086
+ "learning_rate": 0.00018018558718369186,
1087
+ "loss": 2.539,
1088
+ "step": 154
1089
+ },
1090
+ {
1091
+ "epoch": 1.2827442827442828,
1092
+ "grad_norm": 0.42894020676612854,
1093
+ "learning_rate": 0.00017945776797137543,
1094
+ "loss": 2.5211,
1095
+ "step": 155
1096
+ },
1097
+ {
1098
+ "epoch": 1.291060291060291,
1099
+ "grad_norm": 0.42803955078125,
1100
+ "learning_rate": 0.00017871834806090501,
1101
+ "loss": 2.5267,
1102
+ "step": 156
1103
+ },
1104
+ {
1105
+ "epoch": 1.2993762993762994,
1106
+ "grad_norm": 0.4332488477230072,
1107
+ "learning_rate": 0.00017796743540632223,
1108
+ "loss": 2.5422,
1109
+ "step": 157
1110
+ },
1111
+ {
1112
+ "epoch": 1.3076923076923077,
1113
+ "grad_norm": 0.5245593786239624,
1114
+ "learning_rate": 0.00017720513963958968,
1115
+ "loss": 2.5275,
1116
+ "step": 158
1117
+ },
1118
+ {
1119
+ "epoch": 1.316008316008316,
1120
+ "grad_norm": 0.4242333173751831,
1121
+ "learning_rate": 0.00017643157205458483,
1122
+ "loss": 2.5284,
1123
+ "step": 159
1124
+ },
1125
+ {
1126
+ "epoch": 1.3243243243243243,
1127
+ "grad_norm": 0.4454910457134247,
1128
+ "learning_rate": 0.00017564684559085136,
1129
+ "loss": 2.5204,
1130
+ "step": 160
1131
+ },
1132
+ {
1133
+ "epoch": 1.3326403326403327,
1134
+ "grad_norm": 0.5117389559745789,
1135
+ "learning_rate": 0.00017485107481711012,
1136
+ "loss": 2.5484,
1137
+ "step": 161
1138
+ },
1139
+ {
1140
+ "epoch": 1.340956340956341,
1141
+ "grad_norm": 0.5533547401428223,
1142
+ "learning_rate": 0.00017404437591453235,
1143
+ "loss": 2.5256,
1144
+ "step": 162
1145
+ },
1146
+ {
1147
+ "epoch": 1.3492723492723493,
1148
+ "grad_norm": 0.4423222541809082,
1149
+ "learning_rate": 0.00017322686665977737,
1150
+ "loss": 2.5516,
1151
+ "step": 163
1152
+ },
1153
+ {
1154
+ "epoch": 1.3575883575883576,
1155
+ "grad_norm": 0.6984363198280334,
1156
+ "learning_rate": 0.00017239866640779745,
1157
+ "loss": 2.5392,
1158
+ "step": 164
1159
+ },
1160
+ {
1161
+ "epoch": 1.365904365904366,
1162
+ "grad_norm": 0.46573567390441895,
1163
+ "learning_rate": 0.00017155989607441213,
1164
+ "loss": 2.5215,
1165
+ "step": 165
1166
+ },
1167
+ {
1168
+ "epoch": 1.3742203742203742,
1169
+ "grad_norm": 0.43190667033195496,
1170
+ "learning_rate": 0.00017071067811865476,
1171
+ "loss": 2.5109,
1172
+ "step": 166
1173
+ },
1174
+ {
1175
+ "epoch": 1.3825363825363826,
1176
+ "grad_norm": 0.6366004943847656,
1177
+ "learning_rate": 0.00016985113652489374,
1178
+ "loss": 2.5607,
1179
+ "step": 167
1180
+ },
1181
+ {
1182
+ "epoch": 1.3908523908523909,
1183
+ "grad_norm": 0.4087092876434326,
1184
+ "learning_rate": 0.00016898139678473076,
1185
+ "loss": 2.5446,
1186
+ "step": 168
1187
+ },
1188
+ {
1189
+ "epoch": 1.3991683991683992,
1190
+ "grad_norm": 0.40001702308654785,
1191
+ "learning_rate": 0.00016810158587867973,
1192
+ "loss": 2.5087,
1193
+ "step": 169
1194
+ },
1195
+ {
1196
+ "epoch": 1.4074844074844075,
1197
+ "grad_norm": 0.40865230560302734,
1198
+ "learning_rate": 0.00016721183225762727,
1199
+ "loss": 2.5256,
1200
+ "step": 170
1201
+ },
1202
+ {
1203
+ "epoch": 1.4158004158004158,
1204
+ "grad_norm": 0.41592201590538025,
1205
+ "learning_rate": 0.00016631226582407952,
1206
+ "loss": 2.5457,
1207
+ "step": 171
1208
+ },
1209
+ {
1210
+ "epoch": 1.4241164241164241,
1211
+ "grad_norm": 0.47380271553993225,
1212
+ "learning_rate": 0.00016540301791319645,
1213
+ "loss": 2.5239,
1214
+ "step": 172
1215
+ },
1216
+ {
1217
+ "epoch": 1.4324324324324325,
1218
+ "grad_norm": 0.3751722574234009,
1219
+ "learning_rate": 0.00016448422127361706,
1220
+ "loss": 2.5277,
1221
+ "step": 173
1222
+ },
1223
+ {
1224
+ "epoch": 1.4407484407484408,
1225
+ "grad_norm": 0.531104326248169,
1226
+ "learning_rate": 0.00016355601004807856,
1227
+ "loss": 2.5095,
1228
+ "step": 174
1229
+ },
1230
+ {
1231
+ "epoch": 1.449064449064449,
1232
+ "grad_norm": 0.3729030191898346,
1233
+ "learning_rate": 0.00016261851975383137,
1234
+ "loss": 2.508,
1235
+ "step": 175
1236
+ },
1237
+ {
1238
+ "epoch": 1.4573804573804574,
1239
+ "grad_norm": 0.4520551562309265,
1240
+ "learning_rate": 0.00016167188726285434,
1241
+ "loss": 2.5198,
1242
+ "step": 176
1243
+ },
1244
+ {
1245
+ "epoch": 1.4656964656964657,
1246
+ "grad_norm": 0.4487551152706146,
1247
+ "learning_rate": 0.00016071625078187114,
1248
+ "loss": 2.5153,
1249
+ "step": 177
1250
+ },
1251
+ {
1252
+ "epoch": 1.474012474012474,
1253
+ "grad_norm": 0.41815704107284546,
1254
+ "learning_rate": 0.00015975174983217275,
1255
+ "loss": 2.5218,
1256
+ "step": 178
1257
+ },
1258
+ {
1259
+ "epoch": 1.4823284823284824,
1260
+ "grad_norm": 0.4343583583831787,
1261
+ "learning_rate": 0.00015877852522924732,
1262
+ "loss": 2.518,
1263
+ "step": 179
1264
+ },
1265
+ {
1266
+ "epoch": 1.4906444906444907,
1267
+ "grad_norm": 0.4115164279937744,
1268
+ "learning_rate": 0.0001577967190622215,
1269
+ "loss": 2.5137,
1270
+ "step": 180
1271
+ },
1272
+ {
1273
+ "epoch": 1.498960498960499,
1274
+ "grad_norm": 0.366964191198349,
1275
+ "learning_rate": 0.00015680647467311557,
1276
+ "loss": 2.5379,
1277
+ "step": 181
1278
+ },
1279
+ {
1280
+ "epoch": 1.5072765072765073,
1281
+ "grad_norm": 0.42494460940361023,
1282
+ "learning_rate": 0.00015580793663591585,
1283
+ "loss": 2.5471,
1284
+ "step": 182
1285
+ },
1286
+ {
1287
+ "epoch": 1.5155925155925156,
1288
+ "grad_norm": 0.39186251163482666,
1289
+ "learning_rate": 0.00015480125073546704,
1290
+ "loss": 2.5131,
1291
+ "step": 183
1292
+ },
1293
+ {
1294
+ "epoch": 1.523908523908524,
1295
+ "grad_norm": 0.3860762119293213,
1296
+ "learning_rate": 0.00015378656394618787,
1297
+ "loss": 2.5355,
1298
+ "step": 184
1299
+ },
1300
+ {
1301
+ "epoch": 1.5322245322245323,
1302
+ "grad_norm": 0.4578627943992615,
1303
+ "learning_rate": 0.0001527640244106133,
1304
+ "loss": 2.5119,
1305
+ "step": 185
1306
+ },
1307
+ {
1308
+ "epoch": 1.5405405405405406,
1309
+ "grad_norm": 0.3482917547225952,
1310
+ "learning_rate": 0.00015173378141776568,
1311
+ "loss": 2.512,
1312
+ "step": 186
1313
+ },
1314
+ {
1315
+ "epoch": 1.5488565488565489,
1316
+ "grad_norm": 0.3526865541934967,
1317
+ "learning_rate": 0.00015069598538135906,
1318
+ "loss": 2.5142,
1319
+ "step": 187
1320
+ },
1321
+ {
1322
+ "epoch": 1.5571725571725572,
1323
+ "grad_norm": 0.3888881206512451,
1324
+ "learning_rate": 0.0001496507878178388,
1325
+ "loss": 2.5202,
1326
+ "step": 188
1327
+ },
1328
+ {
1329
+ "epoch": 1.5654885654885655,
1330
+ "grad_norm": 0.39839354157447815,
1331
+ "learning_rate": 0.0001485983413242606,
1332
+ "loss": 2.5178,
1333
+ "step": 189
1334
+ },
1335
+ {
1336
+ "epoch": 1.5738045738045738,
1337
+ "grad_norm": 0.42406460642814636,
1338
+ "learning_rate": 0.00014753879955601163,
1339
+ "loss": 2.4996,
1340
+ "step": 190
1341
+ },
1342
+ {
1343
+ "epoch": 1.5821205821205822,
1344
+ "grad_norm": 0.4432888627052307,
1345
+ "learning_rate": 0.00014647231720437686,
1346
+ "loss": 2.5232,
1347
+ "step": 191
1348
+ },
1349
+ {
1350
+ "epoch": 1.5904365904365905,
1351
+ "grad_norm": 0.43413785099983215,
1352
+ "learning_rate": 0.00014539904997395468,
1353
+ "loss": 2.5243,
1354
+ "step": 192
1355
+ },
1356
+ {
1357
+ "epoch": 1.5987525987525988,
1358
+ "grad_norm": 0.3551112413406372,
1359
+ "learning_rate": 0.00014431915455992414,
1360
+ "loss": 2.5134,
1361
+ "step": 193
1362
+ },
1363
+ {
1364
+ "epoch": 1.607068607068607,
1365
+ "grad_norm": 0.36720749735832214,
1366
+ "learning_rate": 0.00014323278862516775,
1367
+ "loss": 2.502,
1368
+ "step": 194
1369
+ },
1370
+ {
1371
+ "epoch": 1.6153846153846154,
1372
+ "grad_norm": 0.35447460412979126,
1373
+ "learning_rate": 0.00014214011077725292,
1374
+ "loss": 2.5036,
1375
+ "step": 195
1376
+ },
1377
+ {
1378
+ "epoch": 1.6237006237006237,
1379
+ "grad_norm": 0.3743102550506592,
1380
+ "learning_rate": 0.0001410412805452757,
1381
+ "loss": 2.4936,
1382
+ "step": 196
1383
+ },
1384
+ {
1385
+ "epoch": 1.632016632016632,
1386
+ "grad_norm": 0.36030158400535583,
1387
+ "learning_rate": 0.00013993645835656953,
1388
+ "loss": 2.4851,
1389
+ "step": 197
1390
+ },
1391
+ {
1392
+ "epoch": 1.6403326403326404,
1393
+ "grad_norm": 0.3682660162448883,
1394
+ "learning_rate": 0.0001388258055132835,
1395
+ "loss": 2.5107,
1396
+ "step": 198
1397
+ },
1398
+ {
1399
+ "epoch": 1.6486486486486487,
1400
+ "grad_norm": 0.37054452300071716,
1401
+ "learning_rate": 0.00013770948416883205,
1402
+ "loss": 2.4875,
1403
+ "step": 199
1404
+ },
1405
+ {
1406
+ "epoch": 1.656964656964657,
1407
+ "grad_norm": 0.4111086428165436,
1408
+ "learning_rate": 0.00013658765730422125,
1409
+ "loss": 2.5055,
1410
+ "step": 200
1411
+ },
1412
+ {
1413
+ "epoch": 1.6652806652806653,
1414
+ "grad_norm": 0.36359384655952454,
1415
+ "learning_rate": 0.00013546048870425356,
1416
+ "loss": 2.5099,
1417
+ "step": 201
1418
+ },
1419
+ {
1420
+ "epoch": 1.6735966735966736,
1421
+ "grad_norm": 0.40381795167922974,
1422
+ "learning_rate": 0.00013432814293361584,
1423
+ "loss": 2.5162,
1424
+ "step": 202
1425
+ },
1426
+ {
1427
+ "epoch": 1.681912681912682,
1428
+ "grad_norm": 0.3458056151866913,
1429
+ "learning_rate": 0.00013319078531285285,
1430
+ "loss": 2.4827,
1431
+ "step": 203
1432
+ },
1433
+ {
1434
+ "epoch": 1.6902286902286903,
1435
+ "grad_norm": 0.35546690225601196,
1436
+ "learning_rate": 0.00013204858189423097,
1437
+ "loss": 2.5162,
1438
+ "step": 204
1439
+ },
1440
+ {
1441
+ "epoch": 1.6985446985446986,
1442
+ "grad_norm": 0.37076064944267273,
1443
+ "learning_rate": 0.00013090169943749476,
1444
+ "loss": 2.5201,
1445
+ "step": 205
1446
+ },
1447
+ {
1448
+ "epoch": 1.706860706860707,
1449
+ "grad_norm": 0.3626950681209564,
1450
+ "learning_rate": 0.00012975030538552032,
1451
+ "loss": 2.5196,
1452
+ "step": 206
1453
+ },
1454
+ {
1455
+ "epoch": 1.7151767151767152,
1456
+ "grad_norm": 0.38436341285705566,
1457
+ "learning_rate": 0.00012859456783986893,
1458
+ "loss": 2.5157,
1459
+ "step": 207
1460
+ },
1461
+ {
1462
+ "epoch": 1.7234927234927235,
1463
+ "grad_norm": 0.38007652759552,
1464
+ "learning_rate": 0.0001274346555362446,
1465
+ "loss": 2.4935,
1466
+ "step": 208
1467
+ },
1468
+ {
1469
+ "epoch": 1.7318087318087318,
1470
+ "grad_norm": 0.38916251063346863,
1471
+ "learning_rate": 0.0001262707378198587,
1472
+ "loss": 2.5083,
1473
+ "step": 209
1474
+ },
1475
+ {
1476
+ "epoch": 1.7401247401247402,
1477
+ "grad_norm": 0.37072858214378357,
1478
+ "learning_rate": 0.00012510298462070619,
1479
+ "loss": 2.4915,
1480
+ "step": 210
1481
+ },
1482
+ {
1483
+ "epoch": 1.7484407484407485,
1484
+ "grad_norm": 0.4798509478569031,
1485
+ "learning_rate": 0.0001239315664287558,
1486
+ "loss": 2.5189,
1487
+ "step": 211
1488
+ },
1489
+ {
1490
+ "epoch": 1.7567567567567568,
1491
+ "grad_norm": 0.4451827108860016,
1492
+ "learning_rate": 0.000122756654269059,
1493
+ "loss": 2.5231,
1494
+ "step": 212
1495
+ },
1496
+ {
1497
+ "epoch": 1.7650727650727651,
1498
+ "grad_norm": 0.35244473814964294,
1499
+ "learning_rate": 0.00012157841967678063,
1500
+ "loss": 2.5061,
1501
+ "step": 213
1502
+ },
1503
+ {
1504
+ "epoch": 1.7733887733887734,
1505
+ "grad_norm": 0.34157419204711914,
1506
+ "learning_rate": 0.00012039703467215488,
1507
+ "loss": 2.5044,
1508
+ "step": 214
1509
+ },
1510
+ {
1511
+ "epoch": 1.7817047817047817,
1512
+ "grad_norm": 0.40339285135269165,
1513
+ "learning_rate": 0.00011921267173537086,
1514
+ "loss": 2.506,
1515
+ "step": 215
1516
+ },
1517
+ {
1518
+ "epoch": 1.79002079002079,
1519
+ "grad_norm": 0.3662133514881134,
1520
+ "learning_rate": 0.0001180255037813906,
1521
+ "loss": 2.5012,
1522
+ "step": 216
1523
+ },
1524
+ {
1525
+ "epoch": 1.7983367983367984,
1526
+ "grad_norm": 0.3495447039604187,
1527
+ "learning_rate": 0.00011683570413470383,
1528
+ "loss": 2.5,
1529
+ "step": 217
1530
+ },
1531
+ {
1532
+ "epoch": 1.8066528066528067,
1533
+ "grad_norm": 0.36315813660621643,
1534
+ "learning_rate": 0.0001156434465040231,
1535
+ "loss": 2.4923,
1536
+ "step": 218
1537
+ },
1538
+ {
1539
+ "epoch": 1.814968814968815,
1540
+ "grad_norm": 0.3807941973209381,
1541
+ "learning_rate": 0.00011444890495692213,
1542
+ "loss": 2.5123,
1543
+ "step": 219
1544
+ },
1545
+ {
1546
+ "epoch": 1.8232848232848233,
1547
+ "grad_norm": 0.36616984009742737,
1548
+ "learning_rate": 0.00011325225389442277,
1549
+ "loss": 2.4954,
1550
+ "step": 220
1551
+ },
1552
+ {
1553
+ "epoch": 1.8316008316008316,
1554
+ "grad_norm": 0.372138112783432,
1555
+ "learning_rate": 0.0001120536680255323,
1556
+ "loss": 2.488,
1557
+ "step": 221
1558
+ },
1559
+ {
1560
+ "epoch": 1.83991683991684,
1561
+ "grad_norm": 0.37164467573165894,
1562
+ "learning_rate": 0.00011085332234173664,
1563
+ "loss": 2.4883,
1564
+ "step": 222
1565
+ },
1566
+ {
1567
+ "epoch": 1.8482328482328483,
1568
+ "grad_norm": 0.34948042035102844,
1569
+ "learning_rate": 0.00010965139209145152,
1570
+ "loss": 2.4864,
1571
+ "step": 223
1572
+ },
1573
+ {
1574
+ "epoch": 1.8565488565488566,
1575
+ "grad_norm": 0.35938721895217896,
1576
+ "learning_rate": 0.00010844805275443673,
1577
+ "loss": 2.4928,
1578
+ "step": 224
1579
+ },
1580
+ {
1581
+ "epoch": 1.864864864864865,
1582
+ "grad_norm": 0.3879775404930115,
1583
+ "learning_rate": 0.00010724348001617625,
1584
+ "loss": 2.4938,
1585
+ "step": 225
1586
+ },
1587
+ {
1588
+ "epoch": 1.8731808731808732,
1589
+ "grad_norm": 0.36195841431617737,
1590
+ "learning_rate": 0.00010603784974222861,
1591
+ "loss": 2.4925,
1592
+ "step": 226
1593
+ },
1594
+ {
1595
+ "epoch": 1.8814968814968815,
1596
+ "grad_norm": 0.35239464044570923,
1597
+ "learning_rate": 0.00010483133795255071,
1598
+ "loss": 2.4973,
1599
+ "step": 227
1600
+ },
1601
+ {
1602
+ "epoch": 1.8898128898128899,
1603
+ "grad_norm": 0.3456074297428131,
1604
+ "learning_rate": 0.00010362412079579924,
1605
+ "loss": 2.4966,
1606
+ "step": 228
1607
+ },
1608
+ {
1609
+ "epoch": 1.8981288981288982,
1610
+ "grad_norm": 0.3808579444885254,
1611
+ "learning_rate": 0.00010241637452361323,
1612
+ "loss": 2.5087,
1613
+ "step": 229
1614
+ },
1615
+ {
1616
+ "epoch": 1.9064449064449065,
1617
+ "grad_norm": 0.39099177718162537,
1618
+ "learning_rate": 0.00010120827546488174,
1619
+ "loss": 2.4894,
1620
+ "step": 230
1621
+ },
1622
+ {
1623
+ "epoch": 1.9147609147609148,
1624
+ "grad_norm": 0.3821989595890045,
1625
+ "learning_rate": 0.0001,
1626
+ "loss": 2.4971,
1627
+ "step": 231
1628
+ },
1629
+ {
1630
+ "epoch": 1.9230769230769231,
1631
+ "grad_norm": 0.3640426695346832,
1632
+ "learning_rate": 9.879172453511827e-05,
1633
+ "loss": 2.4893,
1634
+ "step": 232
1635
+ },
1636
+ {
1637
+ "epoch": 1.9313929313929314,
1638
+ "grad_norm": 0.3754567801952362,
1639
+ "learning_rate": 9.75836254763868e-05,
1640
+ "loss": 2.4871,
1641
+ "step": 233
1642
+ },
1643
+ {
1644
+ "epoch": 1.9397089397089398,
1645
+ "grad_norm": 0.3793177902698517,
1646
+ "learning_rate": 9.63758792042008e-05,
1647
+ "loss": 2.4929,
1648
+ "step": 234
1649
+ },
1650
+ {
1651
+ "epoch": 1.948024948024948,
1652
+ "grad_norm": 0.35912153124809265,
1653
+ "learning_rate": 9.516866204744931e-05,
1654
+ "loss": 2.503,
1655
+ "step": 235
1656
+ },
1657
+ {
1658
+ "epoch": 1.9563409563409564,
1659
+ "grad_norm": 0.3523949384689331,
1660
+ "learning_rate": 9.396215025777139e-05,
1661
+ "loss": 2.4869,
1662
+ "step": 236
1663
+ },
1664
+ {
1665
+ "epoch": 1.9646569646569647,
1666
+ "grad_norm": 0.38750869035720825,
1667
+ "learning_rate": 9.275651998382377e-05,
1668
+ "loss": 2.4957,
1669
+ "step": 237
1670
+ },
1671
+ {
1672
+ "epoch": 1.972972972972973,
1673
+ "grad_norm": 0.3791241943836212,
1674
+ "learning_rate": 9.155194724556331e-05,
1675
+ "loss": 2.506,
1676
+ "step": 238
1677
+ },
1678
+ {
1679
+ "epoch": 1.9812889812889813,
1680
+ "grad_norm": 0.34388408064842224,
1681
+ "learning_rate": 9.034860790854849e-05,
1682
+ "loss": 2.5003,
1683
+ "step": 239
1684
+ },
1685
+ {
1686
+ "epoch": 1.9896049896049897,
1687
+ "grad_norm": 0.3853475749492645,
1688
+ "learning_rate": 8.914667765826338e-05,
1689
+ "loss": 2.4585,
1690
+ "step": 240
1691
+ },
1692
+ {
1693
+ "epoch": 1.997920997920998,
1694
+ "grad_norm": 0.3552019000053406,
1695
+ "learning_rate": 8.79463319744677e-05,
1696
+ "loss": 2.4799,
1697
+ "step": 241
1698
+ },
1699
+ {
1700
+ "epoch": 2.0,
1701
+ "grad_norm": 0.6879647374153137,
1702
+ "learning_rate": 8.674774610557728e-05,
1703
+ "loss": 2.485,
1704
+ "step": 242
1705
+ },
1706
+ {
1707
+ "epoch": 2.008316008316008,
1708
+ "grad_norm": 0.4841165542602539,
1709
+ "learning_rate": 8.55510950430779e-05,
1710
+ "loss": 2.3719,
1711
+ "step": 243
1712
+ },
1713
+ {
1714
+ "epoch": 2.0166320166320166,
1715
+ "grad_norm": 0.40582990646362305,
1716
+ "learning_rate": 8.435655349597689e-05,
1717
+ "loss": 2.3914,
1718
+ "step": 244
1719
+ },
1720
+ {
1721
+ "epoch": 2.024948024948025,
1722
+ "grad_norm": 0.44056642055511475,
1723
+ "learning_rate": 8.316429586529615e-05,
1724
+ "loss": 2.3862,
1725
+ "step": 245
1726
+ },
1727
+ {
1728
+ "epoch": 2.0332640332640333,
1729
+ "grad_norm": 0.5893064141273499,
1730
+ "learning_rate": 8.197449621860943e-05,
1731
+ "loss": 2.3759,
1732
+ "step": 246
1733
+ },
1734
+ {
1735
+ "epoch": 2.0415800415800414,
1736
+ "grad_norm": 0.5335010290145874,
1737
+ "learning_rate": 8.078732826462915e-05,
1738
+ "loss": 2.3966,
1739
+ "step": 247
1740
+ },
1741
+ {
1742
+ "epoch": 2.04989604989605,
1743
+ "grad_norm": 0.40315303206443787,
1744
+ "learning_rate": 7.960296532784515e-05,
1745
+ "loss": 2.3886,
1746
+ "step": 248
1747
+ },
1748
+ {
1749
+ "epoch": 2.0582120582120584,
1750
+ "grad_norm": 0.45403429865837097,
1751
+ "learning_rate": 7.84215803232194e-05,
1752
+ "loss": 2.3664,
1753
+ "step": 249
1754
+ },
1755
+ {
1756
+ "epoch": 2.0665280665280665,
1757
+ "grad_norm": 0.44575178623199463,
1758
+ "learning_rate": 7.7243345730941e-05,
1759
+ "loss": 2.3705,
1760
+ "step": 250
1761
+ },
1762
+ {
1763
+ "epoch": 2.0748440748440746,
1764
+ "grad_norm": 0.41231396794319153,
1765
+ "learning_rate": 7.606843357124426e-05,
1766
+ "loss": 2.3417,
1767
+ "step": 251
1768
+ },
1769
+ {
1770
+ "epoch": 2.083160083160083,
1771
+ "grad_norm": 0.45150724053382874,
1772
+ "learning_rate": 7.489701537929384e-05,
1773
+ "loss": 2.3467,
1774
+ "step": 252
1775
+ },
1776
+ {
1777
+ "epoch": 2.0914760914760917,
1778
+ "grad_norm": 0.45270782709121704,
1779
+ "learning_rate": 7.372926218014131e-05,
1780
+ "loss": 2.3589,
1781
+ "step": 253
1782
+ },
1783
+ {
1784
+ "epoch": 2.0997920997921,
1785
+ "grad_norm": 0.49504727125167847,
1786
+ "learning_rate": 7.256534446375542e-05,
1787
+ "loss": 2.3821,
1788
+ "step": 254
1789
+ },
1790
+ {
1791
+ "epoch": 2.108108108108108,
1792
+ "grad_norm": 0.40953919291496277,
1793
+ "learning_rate": 7.14054321601311e-05,
1794
+ "loss": 2.3754,
1795
+ "step": 255
1796
+ },
1797
+ {
1798
+ "epoch": 2.1164241164241164,
1799
+ "grad_norm": 0.429299533367157,
1800
+ "learning_rate": 7.024969461447972e-05,
1801
+ "loss": 2.3582,
1802
+ "step": 256
1803
+ },
1804
+ {
1805
+ "epoch": 2.124740124740125,
1806
+ "grad_norm": 0.4189452826976776,
1807
+ "learning_rate": 6.909830056250527e-05,
1808
+ "loss": 2.3636,
1809
+ "step": 257
1810
+ },
1811
+ {
1812
+ "epoch": 2.133056133056133,
1813
+ "grad_norm": 0.4161284267902374,
1814
+ "learning_rate": 6.795141810576906e-05,
1815
+ "loss": 2.3543,
1816
+ "step": 258
1817
+ },
1818
+ {
1819
+ "epoch": 2.141372141372141,
1820
+ "grad_norm": 0.4045678675174713,
1821
+ "learning_rate": 6.680921468714719e-05,
1822
+ "loss": 2.362,
1823
+ "step": 259
1824
+ },
1825
+ {
1826
+ "epoch": 2.1496881496881497,
1827
+ "grad_norm": 0.47194382548332214,
1828
+ "learning_rate": 6.567185706638417e-05,
1829
+ "loss": 2.377,
1830
+ "step": 260
1831
+ },
1832
+ {
1833
+ "epoch": 2.1580041580041582,
1834
+ "grad_norm": 0.47539058327674866,
1835
+ "learning_rate": 6.453951129574644e-05,
1836
+ "loss": 2.3645,
1837
+ "step": 261
1838
+ },
1839
+ {
1840
+ "epoch": 2.1663201663201663,
1841
+ "grad_norm": 0.43471434712409973,
1842
+ "learning_rate": 6.341234269577879e-05,
1843
+ "loss": 2.3441,
1844
+ "step": 262
1845
+ },
1846
+ {
1847
+ "epoch": 2.1746361746361744,
1848
+ "grad_norm": 0.45001259446144104,
1849
+ "learning_rate": 6.229051583116796e-05,
1850
+ "loss": 2.3124,
1851
+ "step": 263
1852
+ },
1853
+ {
1854
+ "epoch": 2.182952182952183,
1855
+ "grad_norm": 0.40879690647125244,
1856
+ "learning_rate": 6.117419448671651e-05,
1857
+ "loss": 2.3396,
1858
+ "step": 264
1859
+ },
1860
+ {
1861
+ "epoch": 2.1912681912681915,
1862
+ "grad_norm": 0.42216163873672485,
1863
+ "learning_rate": 6.006354164343046e-05,
1864
+ "loss": 2.3724,
1865
+ "step": 265
1866
+ },
1867
+ {
1868
+ "epoch": 2.1995841995841996,
1869
+ "grad_norm": 0.41845160722732544,
1870
+ "learning_rate": 5.8958719454724346e-05,
1871
+ "loss": 2.3541,
1872
+ "step": 266
1873
+ },
1874
+ {
1875
+ "epoch": 2.2079002079002077,
1876
+ "grad_norm": 0.423265665769577,
1877
+ "learning_rate": 5.785988922274711e-05,
1878
+ "loss": 2.3594,
1879
+ "step": 267
1880
+ },
1881
+ {
1882
+ "epoch": 2.2162162162162162,
1883
+ "grad_norm": 0.4284776747226715,
1884
+ "learning_rate": 5.676721137483225e-05,
1885
+ "loss": 2.3676,
1886
+ "step": 268
1887
+ },
1888
+ {
1889
+ "epoch": 2.2245322245322248,
1890
+ "grad_norm": 0.4213842749595642,
1891
+ "learning_rate": 5.568084544007588e-05,
1892
+ "loss": 2.3392,
1893
+ "step": 269
1894
+ },
1895
+ {
1896
+ "epoch": 2.232848232848233,
1897
+ "grad_norm": 0.4187883138656616,
1898
+ "learning_rate": 5.4600950026045326e-05,
1899
+ "loss": 2.3501,
1900
+ "step": 270
1901
+ },
1902
+ {
1903
+ "epoch": 2.241164241164241,
1904
+ "grad_norm": 0.41858911514282227,
1905
+ "learning_rate": 5.3527682795623146e-05,
1906
+ "loss": 2.3563,
1907
+ "step": 271
1908
+ },
1909
+ {
1910
+ "epoch": 2.2494802494802495,
1911
+ "grad_norm": 0.4647063910961151,
1912
+ "learning_rate": 5.246120044398839e-05,
1913
+ "loss": 2.3575,
1914
+ "step": 272
1915
+ },
1916
+ {
1917
+ "epoch": 2.257796257796258,
1918
+ "grad_norm": 0.41441211104393005,
1919
+ "learning_rate": 5.14016586757394e-05,
1920
+ "loss": 2.3534,
1921
+ "step": 273
1922
+ },
1923
+ {
1924
+ "epoch": 2.266112266112266,
1925
+ "grad_norm": 0.4543509781360626,
1926
+ "learning_rate": 5.0349212182161254e-05,
1927
+ "loss": 2.3562,
1928
+ "step": 274
1929
+ },
1930
+ {
1931
+ "epoch": 2.274428274428274,
1932
+ "grad_norm": 0.4140624403953552,
1933
+ "learning_rate": 4.9304014618640995e-05,
1934
+ "loss": 2.3488,
1935
+ "step": 275
1936
+ },
1937
+ {
1938
+ "epoch": 2.2827442827442828,
1939
+ "grad_norm": 0.4368607699871063,
1940
+ "learning_rate": 4.826621858223431e-05,
1941
+ "loss": 2.3532,
1942
+ "step": 276
1943
+ },
1944
+ {
1945
+ "epoch": 2.2910602910602913,
1946
+ "grad_norm": 0.41528797149658203,
1947
+ "learning_rate": 4.723597558938672e-05,
1948
+ "loss": 2.3591,
1949
+ "step": 277
1950
+ },
1951
+ {
1952
+ "epoch": 2.2993762993762994,
1953
+ "grad_norm": 0.43311697244644165,
1954
+ "learning_rate": 4.6213436053812144e-05,
1955
+ "loss": 2.3347,
1956
+ "step": 278
1957
+ },
1958
+ {
1959
+ "epoch": 2.3076923076923075,
1960
+ "grad_norm": 0.42939522862434387,
1961
+ "learning_rate": 4.519874926453302e-05,
1962
+ "loss": 2.3505,
1963
+ "step": 279
1964
+ },
1965
+ {
1966
+ "epoch": 2.316008316008316,
1967
+ "grad_norm": 0.42948731780052185,
1968
+ "learning_rate": 4.419206336408418e-05,
1969
+ "loss": 2.3377,
1970
+ "step": 280
1971
+ },
1972
+ {
1973
+ "epoch": 2.3243243243243246,
1974
+ "grad_norm": 0.44148650765419006,
1975
+ "learning_rate": 4.3193525326884435e-05,
1976
+ "loss": 2.3413,
1977
+ "step": 281
1978
+ },
1979
+ {
1980
+ "epoch": 2.3326403326403327,
1981
+ "grad_norm": 0.424143522977829,
1982
+ "learning_rate": 4.220328093777851e-05,
1983
+ "loss": 2.3377,
1984
+ "step": 282
1985
+ },
1986
+ {
1987
+ "epoch": 2.3409563409563408,
1988
+ "grad_norm": 0.4698318541049957,
1989
+ "learning_rate": 4.12214747707527e-05,
1990
+ "loss": 2.3692,
1991
+ "step": 283
1992
+ },
1993
+ {
1994
+ "epoch": 2.3492723492723493,
1995
+ "grad_norm": 0.42582592368125916,
1996
+ "learning_rate": 4.0248250167827275e-05,
1997
+ "loss": 2.3572,
1998
+ "step": 284
1999
+ },
2000
+ {
2001
+ "epoch": 2.357588357588358,
2002
+ "grad_norm": 0.4405396282672882,
2003
+ "learning_rate": 3.9283749218128885e-05,
2004
+ "loss": 2.352,
2005
+ "step": 285
2006
+ },
2007
+ {
2008
+ "epoch": 2.365904365904366,
2009
+ "grad_norm": 0.4368214011192322,
2010
+ "learning_rate": 3.832811273714569e-05,
2011
+ "loss": 2.3425,
2012
+ "step": 286
2013
+ },
2014
+ {
2015
+ "epoch": 2.374220374220374,
2016
+ "grad_norm": 0.42838993668556213,
2017
+ "learning_rate": 3.738148024616863e-05,
2018
+ "loss": 2.3472,
2019
+ "step": 287
2020
+ },
2021
+ {
2022
+ "epoch": 2.3825363825363826,
2023
+ "grad_norm": 0.4351613223552704,
2024
+ "learning_rate": 3.644398995192147e-05,
2025
+ "loss": 2.3441,
2026
+ "step": 288
2027
+ },
2028
+ {
2029
+ "epoch": 2.390852390852391,
2030
+ "grad_norm": 0.4328824281692505,
2031
+ "learning_rate": 3.5515778726382966e-05,
2032
+ "loss": 2.3573,
2033
+ "step": 289
2034
+ },
2035
+ {
2036
+ "epoch": 2.399168399168399,
2037
+ "grad_norm": 0.42902839183807373,
2038
+ "learning_rate": 3.459698208680359e-05,
2039
+ "loss": 2.3672,
2040
+ "step": 290
2041
+ },
2042
+ {
2043
+ "epoch": 2.4074844074844073,
2044
+ "grad_norm": 0.44229575991630554,
2045
+ "learning_rate": 3.36877341759205e-05,
2046
+ "loss": 2.3371,
2047
+ "step": 291
2048
+ },
2049
+ {
2050
+ "epoch": 2.415800415800416,
2051
+ "grad_norm": 0.4500284492969513,
2052
+ "learning_rate": 3.2788167742372725e-05,
2053
+ "loss": 2.329,
2054
+ "step": 292
2055
+ },
2056
+ {
2057
+ "epoch": 2.4241164241164244,
2058
+ "grad_norm": 0.6344565749168396,
2059
+ "learning_rate": 3.1898414121320276e-05,
2060
+ "loss": 2.3618,
2061
+ "step": 293
2062
+ },
2063
+ {
2064
+ "epoch": 2.4324324324324325,
2065
+ "grad_norm": 0.42112040519714355,
2066
+ "learning_rate": 3.101860321526924e-05,
2067
+ "loss": 2.3349,
2068
+ "step": 294
2069
+ },
2070
+ {
2071
+ "epoch": 2.4407484407484406,
2072
+ "grad_norm": 0.49494612216949463,
2073
+ "learning_rate": 3.0148863475106314e-05,
2074
+ "loss": 2.3389,
2075
+ "step": 295
2076
+ },
2077
+ {
2078
+ "epoch": 2.449064449064449,
2079
+ "grad_norm": 0.4607335925102234,
2080
+ "learning_rate": 2.9289321881345254e-05,
2081
+ "loss": 2.3245,
2082
+ "step": 296
2083
+ },
2084
+ {
2085
+ "epoch": 2.4573804573804576,
2086
+ "grad_norm": 0.4578121602535248,
2087
+ "learning_rate": 2.84401039255879e-05,
2088
+ "loss": 2.3573,
2089
+ "step": 297
2090
+ },
2091
+ {
2092
+ "epoch": 2.4656964656964657,
2093
+ "grad_norm": 0.4579460620880127,
2094
+ "learning_rate": 2.7601333592202583e-05,
2095
+ "loss": 2.3372,
2096
+ "step": 298
2097
+ },
2098
+ {
2099
+ "epoch": 2.474012474012474,
2100
+ "grad_norm": 0.4420205354690552,
2101
+ "learning_rate": 2.677313334022268e-05,
2102
+ "loss": 2.3221,
2103
+ "step": 299
2104
+ },
2105
+ {
2106
+ "epoch": 2.4823284823284824,
2107
+ "grad_norm": 0.4563603103160858,
2108
+ "learning_rate": 2.59556240854677e-05,
2109
+ "loss": 2.3446,
2110
+ "step": 300
2111
+ },
2112
+ {
2113
+ "epoch": 2.490644490644491,
2114
+ "grad_norm": 0.44042691588401794,
2115
+ "learning_rate": 2.514892518288988e-05,
2116
+ "loss": 2.3279,
2117
+ "step": 301
2118
+ },
2119
+ {
2120
+ "epoch": 2.498960498960499,
2121
+ "grad_norm": 0.4372948408126831,
2122
+ "learning_rate": 2.4353154409148637e-05,
2123
+ "loss": 2.3629,
2124
+ "step": 302
2125
+ },
2126
+ {
2127
+ "epoch": 2.507276507276507,
2128
+ "grad_norm": 0.4709435701370239,
2129
+ "learning_rate": 2.356842794541516e-05,
2130
+ "loss": 2.353,
2131
+ "step": 303
2132
+ },
2133
+ {
2134
+ "epoch": 2.5155925155925156,
2135
+ "grad_norm": 0.42820531129837036,
2136
+ "learning_rate": 2.2794860360410342e-05,
2137
+ "loss": 2.3419,
2138
+ "step": 304
2139
+ },
2140
+ {
2141
+ "epoch": 2.523908523908524,
2142
+ "grad_norm": 0.4281867742538452,
2143
+ "learning_rate": 2.2032564593677774e-05,
2144
+ "loss": 2.3369,
2145
+ "step": 305
2146
+ },
2147
+ {
2148
+ "epoch": 2.5322245322245323,
2149
+ "grad_norm": 0.4326595664024353,
2150
+ "learning_rate": 2.1281651939094992e-05,
2151
+ "loss": 2.3215,
2152
+ "step": 306
2153
+ },
2154
+ {
2155
+ "epoch": 2.5405405405405403,
2156
+ "grad_norm": 0.4328608810901642,
2157
+ "learning_rate": 2.0542232028624586e-05,
2158
+ "loss": 2.3094,
2159
+ "step": 307
2160
+ },
2161
+ {
2162
+ "epoch": 2.548856548856549,
2163
+ "grad_norm": 0.42794665694236755,
2164
+ "learning_rate": 1.981441281630816e-05,
2165
+ "loss": 2.3303,
2166
+ "step": 308
2167
+ },
2168
+ {
2169
+ "epoch": 2.5571725571725574,
2170
+ "grad_norm": 0.43002575635910034,
2171
+ "learning_rate": 1.9098300562505266e-05,
2172
+ "loss": 2.3516,
2173
+ "step": 309
2174
+ },
2175
+ {
2176
+ "epoch": 2.5654885654885655,
2177
+ "grad_norm": 0.43052583932876587,
2178
+ "learning_rate": 1.8393999818379525e-05,
2179
+ "loss": 2.3592,
2180
+ "step": 310
2181
+ },
2182
+ {
2183
+ "epoch": 2.5738045738045736,
2184
+ "grad_norm": 0.41541633009910583,
2185
+ "learning_rate": 1.7701613410634365e-05,
2186
+ "loss": 2.333,
2187
+ "step": 311
2188
+ },
2189
+ {
2190
+ "epoch": 2.582120582120582,
2191
+ "grad_norm": 0.44588619470596313,
2192
+ "learning_rate": 1.7021242426500493e-05,
2193
+ "loss": 2.3445,
2194
+ "step": 312
2195
+ },
2196
+ {
2197
+ "epoch": 2.5904365904365907,
2198
+ "grad_norm": 0.4385538399219513,
2199
+ "learning_rate": 1.6352986198977325e-05,
2200
+ "loss": 2.3406,
2201
+ "step": 313
2202
+ },
2203
+ {
2204
+ "epoch": 2.598752598752599,
2205
+ "grad_norm": 0.43057897686958313,
2206
+ "learning_rate": 1.5696942292330576e-05,
2207
+ "loss": 2.3426,
2208
+ "step": 314
2209
+ },
2210
+ {
2211
+ "epoch": 2.607068607068607,
2212
+ "grad_norm": 0.4492436945438385,
2213
+ "learning_rate": 1.5053206487847914e-05,
2214
+ "loss": 2.33,
2215
+ "step": 315
2216
+ },
2217
+ {
2218
+ "epoch": 2.6153846153846154,
2219
+ "grad_norm": 0.43555155396461487,
2220
+ "learning_rate": 1.442187276985526e-05,
2221
+ "loss": 2.342,
2222
+ "step": 316
2223
+ },
2224
+ {
2225
+ "epoch": 2.623700623700624,
2226
+ "grad_norm": 0.4379512071609497,
2227
+ "learning_rate": 1.3803033311995072e-05,
2228
+ "loss": 2.3202,
2229
+ "step": 317
2230
+ },
2231
+ {
2232
+ "epoch": 2.632016632016632,
2233
+ "grad_norm": 0.42732083797454834,
2234
+ "learning_rate": 1.3196778463769255e-05,
2235
+ "loss": 2.3104,
2236
+ "step": 318
2237
+ },
2238
+ {
2239
+ "epoch": 2.64033264033264,
2240
+ "grad_norm": 0.43830588459968567,
2241
+ "learning_rate": 1.260319673734821e-05,
2242
+ "loss": 2.3252,
2243
+ "step": 319
2244
+ },
2245
+ {
2246
+ "epoch": 2.6486486486486487,
2247
+ "grad_norm": 0.43504393100738525,
2248
+ "learning_rate": 1.2022374794648228e-05,
2249
+ "loss": 2.3416,
2250
+ "step": 320
2251
+ },
2252
+ {
2253
+ "epoch": 2.6569646569646572,
2254
+ "grad_norm": 0.43797558546066284,
2255
+ "learning_rate": 1.1454397434679021e-05,
2256
+ "loss": 2.3361,
2257
+ "step": 321
2258
+ },
2259
+ {
2260
+ "epoch": 2.6652806652806653,
2261
+ "grad_norm": 0.4464263916015625,
2262
+ "learning_rate": 1.0899347581163221e-05,
2263
+ "loss": 2.3515,
2264
+ "step": 322
2265
+ },
2266
+ {
2267
+ "epoch": 2.6735966735966734,
2268
+ "grad_norm": 0.43725448846817017,
2269
+ "learning_rate": 1.0357306270429624e-05,
2270
+ "loss": 2.3236,
2271
+ "step": 323
2272
+ },
2273
+ {
2274
+ "epoch": 2.681912681912682,
2275
+ "grad_norm": 0.4281315803527832,
2276
+ "learning_rate": 9.828352639582072e-06,
2277
+ "loss": 2.3296,
2278
+ "step": 324
2279
+ },
2280
+ {
2281
+ "epoch": 2.6902286902286905,
2282
+ "grad_norm": 0.4342389702796936,
2283
+ "learning_rate": 9.31256391494546e-06,
2284
+ "loss": 2.3097,
2285
+ "step": 325
2286
+ },
2287
+ {
2288
+ "epoch": 2.6985446985446986,
2289
+ "grad_norm": 0.44206634163856506,
2290
+ "learning_rate": 8.810015400790994e-06,
2291
+ "loss": 2.3303,
2292
+ "step": 326
2293
+ },
2294
+ {
2295
+ "epoch": 2.7068607068607067,
2296
+ "grad_norm": 0.4294271767139435,
2297
+ "learning_rate": 8.32078046834176e-06,
2298
+ "loss": 2.3183,
2299
+ "step": 327
2300
+ },
2301
+ {
2302
+ "epoch": 2.715176715176715,
2303
+ "grad_norm": 0.43030035495758057,
2304
+ "learning_rate": 7.844930545060703e-06,
2305
+ "loss": 2.3368,
2306
+ "step": 328
2307
+ },
2308
+ {
2309
+ "epoch": 2.7234927234927238,
2310
+ "grad_norm": 0.435197651386261,
2311
+ "learning_rate": 7.382535104222366e-06,
2312
+ "loss": 2.3312,
2313
+ "step": 329
2314
+ },
2315
+ {
2316
+ "epoch": 2.731808731808732,
2317
+ "grad_norm": 0.4398477375507355,
2318
+ "learning_rate": 6.9336616547697965e-06,
2319
+ "loss": 2.3146,
2320
+ "step": 330
2321
+ },
2322
+ {
2323
+ "epoch": 2.74012474012474,
2324
+ "grad_norm": 0.43261393904685974,
2325
+ "learning_rate": 6.498375731458528e-06,
2326
+ "loss": 2.3277,
2327
+ "step": 331
2328
+ },
2329
+ {
2330
+ "epoch": 2.7484407484407485,
2331
+ "grad_norm": 0.46077948808670044,
2332
+ "learning_rate": 6.076740885288479e-06,
2333
+ "loss": 2.3493,
2334
+ "step": 332
2335
+ },
2336
+ {
2337
+ "epoch": 2.756756756756757,
2338
+ "grad_norm": 0.4285019636154175,
2339
+ "learning_rate": 5.668818674225685e-06,
2340
+ "loss": 2.3537,
2341
+ "step": 333
2342
+ },
2343
+ {
2344
+ "epoch": 2.765072765072765,
2345
+ "grad_norm": 0.43278998136520386,
2346
+ "learning_rate": 5.274668654214932e-06,
2347
+ "loss": 2.3504,
2348
+ "step": 334
2349
+ },
2350
+ {
2351
+ "epoch": 2.773388773388773,
2352
+ "grad_norm": 0.43840229511260986,
2353
+ "learning_rate": 4.8943483704846475e-06,
2354
+ "loss": 2.333,
2355
+ "step": 335
2356
+ },
2357
+ {
2358
+ "epoch": 2.7817047817047817,
2359
+ "grad_norm": 0.4274958670139313,
2360
+ "learning_rate": 4.527913349145441e-06,
2361
+ "loss": 2.3467,
2362
+ "step": 336
2363
+ },
2364
+ {
2365
+ "epoch": 2.7900207900207903,
2366
+ "grad_norm": 0.4174635410308838,
2367
+ "learning_rate": 4.175417089083378e-06,
2368
+ "loss": 2.3248,
2369
+ "step": 337
2370
+ },
2371
+ {
2372
+ "epoch": 2.7983367983367984,
2373
+ "grad_norm": 0.4231819808483124,
2374
+ "learning_rate": 3.836911054149239e-06,
2375
+ "loss": 2.3169,
2376
+ "step": 338
2377
+ },
2378
+ {
2379
+ "epoch": 2.8066528066528065,
2380
+ "grad_norm": 0.4251345098018646,
2381
+ "learning_rate": 3.512444665644865e-06,
2382
+ "loss": 2.333,
2383
+ "step": 339
2384
+ },
2385
+ {
2386
+ "epoch": 2.814968814968815,
2387
+ "grad_norm": 0.42021313309669495,
2388
+ "learning_rate": 3.202065295107726e-06,
2389
+ "loss": 2.3475,
2390
+ "step": 340
2391
+ },
2392
+ {
2393
+ "epoch": 2.8232848232848236,
2394
+ "grad_norm": 0.4197041094303131,
2395
+ "learning_rate": 2.905818257394799e-06,
2396
+ "loss": 2.3128,
2397
+ "step": 341
2398
+ },
2399
+ {
2400
+ "epoch": 2.8316008316008316,
2401
+ "grad_norm": 0.4266311228275299,
2402
+ "learning_rate": 2.6237468040666512e-06,
2403
+ "loss": 2.3249,
2404
+ "step": 342
2405
+ },
2406
+ {
2407
+ "epoch": 2.8399168399168397,
2408
+ "grad_norm": 0.442434698343277,
2409
+ "learning_rate": 2.3558921170727888e-06,
2410
+ "loss": 2.3422,
2411
+ "step": 343
2412
+ },
2413
+ {
2414
+ "epoch": 2.8482328482328483,
2415
+ "grad_norm": 0.4329405725002289,
2416
+ "learning_rate": 2.1022933027391555e-06,
2417
+ "loss": 2.3188,
2418
+ "step": 344
2419
+ },
2420
+ {
2421
+ "epoch": 2.856548856548857,
2422
+ "grad_norm": 0.43024739623069763,
2423
+ "learning_rate": 1.8629873860586566e-06,
2424
+ "loss": 2.3365,
2425
+ "step": 345
2426
+ },
2427
+ {
2428
+ "epoch": 2.864864864864865,
2429
+ "grad_norm": 0.43431299924850464,
2430
+ "learning_rate": 1.6380093052856483e-06,
2431
+ "loss": 2.3476,
2432
+ "step": 346
2433
+ },
2434
+ {
2435
+ "epoch": 2.873180873180873,
2436
+ "grad_norm": 0.43452492356300354,
2437
+ "learning_rate": 1.4273919068349184e-06,
2438
+ "loss": 2.3327,
2439
+ "step": 347
2440
+ },
2441
+ {
2442
+ "epoch": 2.8814968814968815,
2443
+ "grad_norm": 0.4286072552204132,
2444
+ "learning_rate": 1.231165940486234e-06,
2445
+ "loss": 2.334,
2446
+ "step": 348
2447
+ },
2448
+ {
2449
+ "epoch": 2.88981288981289,
2450
+ "grad_norm": 0.4313005805015564,
2451
+ "learning_rate": 1.0493600548948878e-06,
2452
+ "loss": 2.3216,
2453
+ "step": 349
2454
+ },
2455
+ {
2456
+ "epoch": 2.898128898128898,
2457
+ "grad_norm": 0.4270586371421814,
2458
+ "learning_rate": 8.820007934090879e-07,
2459
+ "loss": 2.3438,
2460
+ "step": 350
2461
+ },
2462
+ {
2463
+ "epoch": 2.9064449064449063,
2464
+ "grad_norm": 0.4391648769378662,
2465
+ "learning_rate": 7.291125901946027e-07,
2466
+ "loss": 2.3528,
2467
+ "step": 351
2468
+ },
2469
+ {
2470
+ "epoch": 2.914760914760915,
2471
+ "grad_norm": 0.4252782166004181,
2472
+ "learning_rate": 5.907177666674812e-07,
2473
+ "loss": 2.3422,
2474
+ "step": 352
2475
+ },
2476
+ {
2477
+ "epoch": 2.9230769230769234,
2478
+ "grad_norm": 0.4223313629627228,
2479
+ "learning_rate": 4.668365282351372e-07,
2480
+ "loss": 2.3233,
2481
+ "step": 353
2482
+ },
2483
+ {
2484
+ "epoch": 2.9313929313929314,
2485
+ "grad_norm": 0.43984875082969666,
2486
+ "learning_rate": 3.5748696134639825e-07,
2487
+ "loss": 2.3437,
2488
+ "step": 354
2489
+ },
2490
+ {
2491
+ "epoch": 2.9397089397089395,
2492
+ "grad_norm": 0.4258709251880646,
2493
+ "learning_rate": 2.6268503085089547e-07,
2494
+ "loss": 2.3406,
2495
+ "step": 355
2496
+ },
2497
+ {
2498
+ "epoch": 2.948024948024948,
2499
+ "grad_norm": 0.42968347668647766,
2500
+ "learning_rate": 1.824445776682504e-07,
2501
+ "loss": 2.3277,
2502
+ "step": 356
2503
+ },
2504
+ {
2505
+ "epoch": 2.9563409563409566,
2506
+ "grad_norm": 0.4366348683834076,
2507
+ "learning_rate": 1.1677731676733584e-07,
2508
+ "loss": 2.3263,
2509
+ "step": 357
2510
+ },
2511
+ {
2512
+ "epoch": 2.9646569646569647,
2513
+ "grad_norm": 0.42044591903686523,
2514
+ "learning_rate": 6.569283545587724e-08,
2515
+ "loss": 2.3439,
2516
+ "step": 358
2517
+ },
2518
+ {
2519
+ "epoch": 2.972972972972973,
2520
+ "grad_norm": 0.43199941515922546,
2521
+ "learning_rate": 2.9198591980705848e-08,
2522
+ "loss": 2.3411,
2523
+ "step": 359
2524
+ },
2525
+ {
2526
+ "epoch": 2.9812889812889813,
2527
+ "grad_norm": 0.4222280979156494,
2528
+ "learning_rate": 7.2999144389296335e-09,
2529
+ "loss": 2.3358,
2530
+ "step": 360
2531
+ }
2532
+ ],
2533
+ "logging_steps": 1,
2534
+ "max_steps": 360,
2535
+ "num_input_tokens_seen": 0,
2536
+ "num_train_epochs": 3,
2537
+ "save_steps": 200,
2538
+ "stateful_callbacks": {
2539
+ "TrainerControl": {
2540
+ "args": {
2541
+ "should_epoch_stop": false,
2542
+ "should_evaluate": false,
2543
+ "should_log": false,
2544
+ "should_save": true,
2545
+ "should_training_stop": true
2546
+ },
2547
+ "attributes": {}
2548
+ }
2549
+ },
2550
+ "total_flos": 3.2413772944716595e+18,
2551
+ "train_batch_size": 4,
2552
+ "trial_name": null,
2553
+ "trial_params": null
2554
+ }
training_args.bin ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:e81fbdd6cff1861c240817ac43dadc969e51ad90ec3bffb314ccf89c9d2c6d32
3
+ size 6584