nikJ13 committed on
Commit f0b0a53 · verified · Parent: cf9bfc4

Delete Qwen2.5-Coder-7B-Instruct-math-solver-config_1

Files changed (26)
  1. Qwen2.5-Coder-7B-Instruct-math-solver-config_1/added_tokens.json +0 -24
  2. Qwen2.5-Coder-7B-Instruct-math-solver-config_1/checkpoint-500/README.md +0 -202
  3. Qwen2.5-Coder-7B-Instruct-math-solver-config_1/checkpoint-500/adapter_config.json +0 -34
  4. Qwen2.5-Coder-7B-Instruct-math-solver-config_1/checkpoint-500/adapter_model.safetensors +0 -3
  5. Qwen2.5-Coder-7B-Instruct-math-solver-config_1/checkpoint-500/added_tokens.json +0 -24
  6. Qwen2.5-Coder-7B-Instruct-math-solver-config_1/checkpoint-500/merges.txt +0 -0
  7. Qwen2.5-Coder-7B-Instruct-math-solver-config_1/checkpoint-500/optimizer.pt +0 -3
  8. Qwen2.5-Coder-7B-Instruct-math-solver-config_1/checkpoint-500/rng_state.pth +0 -3
  9. Qwen2.5-Coder-7B-Instruct-math-solver-config_1/checkpoint-500/scheduler.pt +0 -3
  10. Qwen2.5-Coder-7B-Instruct-math-solver-config_1/checkpoint-500/special_tokens_map.json +0 -31
  11. Qwen2.5-Coder-7B-Instruct-math-solver-config_1/checkpoint-500/tokenizer.json +0 -3
  12. Qwen2.5-Coder-7B-Instruct-math-solver-config_1/checkpoint-500/tokenizer_config.json +0 -207
  13. Qwen2.5-Coder-7B-Instruct-math-solver-config_1/checkpoint-500/trainer_state.json +0 -3533
  14. Qwen2.5-Coder-7B-Instruct-math-solver-config_1/checkpoint-500/training_args.bin +0 -3
  15. Qwen2.5-Coder-7B-Instruct-math-solver-config_1/checkpoint-500/vocab.json +0 -0
  16. Qwen2.5-Coder-7B-Instruct-math-solver-config_1/config.json +0 -45
  17. Qwen2.5-Coder-7B-Instruct-math-solver-config_1/generation_config.json +0 -14
  18. Qwen2.5-Coder-7B-Instruct-math-solver-config_1/merges.txt +0 -0
  19. Qwen2.5-Coder-7B-Instruct-math-solver-config_1/model-00001-of-00003.safetensors +0 -3
  20. Qwen2.5-Coder-7B-Instruct-math-solver-config_1/model-00002-of-00003.safetensors +0 -3
  21. Qwen2.5-Coder-7B-Instruct-math-solver-config_1/model-00003-of-00003.safetensors +0 -3
  22. Qwen2.5-Coder-7B-Instruct-math-solver-config_1/model.safetensors.index.json +0 -0
  23. Qwen2.5-Coder-7B-Instruct-math-solver-config_1/special_tokens_map.json +0 -31
  24. Qwen2.5-Coder-7B-Instruct-math-solver-config_1/tokenizer.json +0 -3
  25. Qwen2.5-Coder-7B-Instruct-math-solver-config_1/tokenizer_config.json +0 -207
  26. Qwen2.5-Coder-7B-Instruct-math-solver-config_1/vocab.json +0 -0
Qwen2.5-Coder-7B-Instruct-math-solver-config_1/added_tokens.json DELETED
@@ -1,24 +0,0 @@
- {
- "</tool_call>": 151658,
- "<tool_call>": 151657,
- "<|box_end|>": 151649,
- "<|box_start|>": 151648,
- "<|endoftext|>": 151643,
- "<|file_sep|>": 151664,
- "<|fim_middle|>": 151660,
- "<|fim_pad|>": 151662,
- "<|fim_prefix|>": 151659,
- "<|fim_suffix|>": 151661,
- "<|im_end|>": 151645,
- "<|im_start|>": 151644,
- "<|image_pad|>": 151655,
- "<|object_ref_end|>": 151647,
- "<|object_ref_start|>": 151646,
- "<|quad_end|>": 151651,
- "<|quad_start|>": 151650,
- "<|repo_name|>": 151663,
- "<|video_pad|>": 151656,
- "<|vision_end|>": 151653,
- "<|vision_pad|>": 151654,
- "<|vision_start|>": 151652
- }
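The deleted `added_tokens.json` maps Qwen2.5's 22 extra control tokens onto IDs 151643–151664. A small sketch (plain Python, no dependencies; the dict literal is copied from the file above) confirming the IDs form one contiguous block appended after the base vocabulary:

```python
# Token-to-ID map copied from the deleted added_tokens.json.
added_tokens = {
    "</tool_call>": 151658, "<tool_call>": 151657,
    "<|box_end|>": 151649, "<|box_start|>": 151648,
    "<|endoftext|>": 151643, "<|file_sep|>": 151664,
    "<|fim_middle|>": 151660, "<|fim_pad|>": 151662,
    "<|fim_prefix|>": 151659, "<|fim_suffix|>": 151661,
    "<|im_end|>": 151645, "<|im_start|>": 151644,
    "<|image_pad|>": 151655, "<|object_ref_end|>": 151647,
    "<|object_ref_start|>": 151646, "<|quad_end|>": 151651,
    "<|quad_start|>": 151650, "<|repo_name|>": 151663,
    "<|video_pad|>": 151656, "<|vision_end|>": 151653,
    "<|vision_pad|>": 151654, "<|vision_start|>": 151652,
}

ids = sorted(added_tokens.values())
# The 22 IDs should be one contiguous block starting at 151643.
assert ids == list(range(151643, 151665))
```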
 
Qwen2.5-Coder-7B-Instruct-math-solver-config_1/checkpoint-500/README.md DELETED
@@ -1,202 +0,0 @@
- ---
- base_model: Qwen/Qwen2.5-Coder-7B-Instruct
- library_name: peft
- ---
-
- # Model Card for Model ID
-
- <!-- Provide a quick summary of what the model is/does. -->
-
-
-
- ## Model Details
-
- ### Model Description
-
- <!-- Provide a longer summary of what this model is. -->
-
-
-
- - **Developed by:** [More Information Needed]
- - **Funded by [optional]:** [More Information Needed]
- - **Shared by [optional]:** [More Information Needed]
- - **Model type:** [More Information Needed]
- - **Language(s) (NLP):** [More Information Needed]
- - **License:** [More Information Needed]
- - **Finetuned from model [optional]:** [More Information Needed]
-
- ### Model Sources [optional]
-
- <!-- Provide the basic links for the model. -->
-
- - **Repository:** [More Information Needed]
- - **Paper [optional]:** [More Information Needed]
- - **Demo [optional]:** [More Information Needed]
-
- ## Uses
-
- <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
-
- ### Direct Use
-
- <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
-
- [More Information Needed]
-
- ### Downstream Use [optional]
-
- <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
-
- [More Information Needed]
-
- ### Out-of-Scope Use
-
- <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
-
- [More Information Needed]
-
- ## Bias, Risks, and Limitations
-
- <!-- This section is meant to convey both technical and sociotechnical limitations. -->
-
- [More Information Needed]
-
- ### Recommendations
-
- <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
-
- Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
-
- ## How to Get Started with the Model
-
- Use the code below to get started with the model.
-
- [More Information Needed]
-
- ## Training Details
-
- ### Training Data
-
- <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
-
- [More Information Needed]
-
- ### Training Procedure
-
- <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
-
- #### Preprocessing [optional]
-
- [More Information Needed]
-
-
- #### Training Hyperparameters
-
- - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
-
- #### Speeds, Sizes, Times [optional]
-
- <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
-
- [More Information Needed]
-
- ## Evaluation
-
- <!-- This section describes the evaluation protocols and provides the results. -->
-
- ### Testing Data, Factors & Metrics
-
- #### Testing Data
-
- <!-- This should link to a Dataset Card if possible. -->
-
- [More Information Needed]
-
- #### Factors
-
- <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
-
- [More Information Needed]
-
- #### Metrics
-
- <!-- These are the evaluation metrics being used, ideally with a description of why. -->
-
- [More Information Needed]
-
- ### Results
-
- [More Information Needed]
-
- #### Summary
-
-
-
- ## Model Examination [optional]
-
- <!-- Relevant interpretability work for the model goes here -->
-
- [More Information Needed]
-
- ## Environmental Impact
-
- <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
-
- Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
-
- - **Hardware Type:** [More Information Needed]
- - **Hours used:** [More Information Needed]
- - **Cloud Provider:** [More Information Needed]
- - **Compute Region:** [More Information Needed]
- - **Carbon Emitted:** [More Information Needed]
-
- ## Technical Specifications [optional]
-
- ### Model Architecture and Objective
-
- [More Information Needed]
-
- ### Compute Infrastructure
-
- [More Information Needed]
-
- #### Hardware
-
- [More Information Needed]
-
- #### Software
-
- [More Information Needed]
-
- ## Citation [optional]
-
- <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
-
- **BibTeX:**
-
- [More Information Needed]
-
- **APA:**
-
- [More Information Needed]
-
- ## Glossary [optional]
-
- <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
-
- [More Information Needed]
-
- ## More Information [optional]
-
- [More Information Needed]
-
- ## Model Card Authors [optional]
-
- [More Information Needed]
-
- ## Model Card Contact
-
- [More Information Needed]
- ### Framework versions
-
- - PEFT 0.13.2
 
Qwen2.5-Coder-7B-Instruct-math-solver-config_1/checkpoint-500/adapter_config.json DELETED
@@ -1,34 +0,0 @@
- {
- "alpha_pattern": {},
- "auto_mapping": null,
- "base_model_name_or_path": "Qwen/Qwen2.5-Coder-7B-Instruct",
- "bias": "none",
- "fan_in_fan_out": false,
- "inference_mode": true,
- "init_lora_weights": true,
- "layer_replication": null,
- "layers_pattern": null,
- "layers_to_transform": null,
- "loftq_config": {},
- "lora_alpha": 64,
- "lora_dropout": 0.1,
- "megatron_config": null,
- "megatron_core": "megatron.core",
- "modules_to_save": null,
- "peft_type": "LORA",
- "r": 32,
- "rank_pattern": {},
- "revision": null,
- "target_modules": [
- "v_proj",
- "gate_proj",
- "down_proj",
- "k_proj",
- "o_proj",
- "q_proj",
- "up_proj"
- ],
- "task_type": "CAUSAL_LM",
- "use_dora": false,
- "use_rslora": false
- }
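The adapter config above (r=32 LoRA on all seven attention/MLP projections) pins down the adapter size. A sketch of the implied trainable-parameter count; the model dimensions are assumptions taken from the published Qwen2.5-7B config (hidden 3584, 28 layers, 4 KV heads of head_dim 128, FFN 18944), not stated in this commit:

```python
# Estimate the trainable-parameter count implied by adapter_config.json:
# r=32 LoRA on q/k/v/o/gate/up/down projections. Model dimensions below
# are assumptions from the published Qwen2.5-7B config.
r = 32
hidden, ffn, layers = 3584, 18944, 28
kv_dim = 4 * 128  # num_key_value_heads * head_dim (GQA)

# LoRA adds r * (d_in + d_out) parameters per wrapped Linear(d_in, d_out).
shapes = {
    "q_proj": (hidden, hidden), "k_proj": (hidden, kv_dim),
    "v_proj": (hidden, kv_dim), "o_proj": (hidden, hidden),
    "gate_proj": (hidden, ffn), "up_proj": (hidden, ffn),
    "down_proj": (ffn, hidden),
}
per_layer = sum(r * (d_in + d_out) for d_in, d_out in shapes.values())
total = per_layer * layers
print(total)  # 80,740,352 adapter parameters

# Sanity check against the deleted adapter_model.safetensors below
# (323,014,168 bytes): ~80.7M fp32 values plus a small safetensors header.
assert 0 < 323_014_168 - total * 4 < 100_000
```

The byte count lining up with fp32 storage is what makes the assumed dimensions plausible.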
 
Qwen2.5-Coder-7B-Instruct-math-solver-config_1/checkpoint-500/adapter_model.safetensors DELETED
@@ -1,3 +0,0 @@
- version https://git-lfs.github.com/spec/v1
- oid sha256:6a56f1d978fe8b8d15861855743fa2977366bf85d3eca23677a1b96719c09aa1
- size 323014168
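The weight files removed in this commit are Git LFS pointers: the three-line format above (spec version, sha256 oid, byte size) is all that lives in the Git tree. A minimal parser sketch for that format:

```python
# Parse a Git LFS pointer file into its fields. The pointer text is copied
# from the deleted adapter_model.safetensors entry above.
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:6a56f1d978fe8b8d15861855743fa2977366bf85d3eca23677a1b96719c09aa1
size 323014168
"""

def parse_lfs_pointer(text: str) -> dict:
    # Each line is "key value"; oid embeds the hash algorithm as "algo:digest".
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    algo, digest = fields["oid"].split(":", 1)
    return {"version": fields["version"], "algo": algo,
            "digest": digest, "size": int(fields["size"])}

info = parse_lfs_pointer(pointer)
assert info["algo"] == "sha256" and info["size"] == 323014168
```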
 
Qwen2.5-Coder-7B-Instruct-math-solver-config_1/checkpoint-500/added_tokens.json DELETED
@@ -1,24 +0,0 @@
- {
- "</tool_call>": 151658,
- "<tool_call>": 151657,
- "<|box_end|>": 151649,
- "<|box_start|>": 151648,
- "<|endoftext|>": 151643,
- "<|file_sep|>": 151664,
- "<|fim_middle|>": 151660,
- "<|fim_pad|>": 151662,
- "<|fim_prefix|>": 151659,
- "<|fim_suffix|>": 151661,
- "<|im_end|>": 151645,
- "<|im_start|>": 151644,
- "<|image_pad|>": 151655,
- "<|object_ref_end|>": 151647,
- "<|object_ref_start|>": 151646,
- "<|quad_end|>": 151651,
- "<|quad_start|>": 151650,
- "<|repo_name|>": 151663,
- "<|video_pad|>": 151656,
- "<|vision_end|>": 151653,
- "<|vision_pad|>": 151654,
- "<|vision_start|>": 151652
- }
 
Qwen2.5-Coder-7B-Instruct-math-solver-config_1/checkpoint-500/merges.txt DELETED
The diff for this file is too large to render. See raw diff
 
Qwen2.5-Coder-7B-Instruct-math-solver-config_1/checkpoint-500/optimizer.pt DELETED
@@ -1,3 +0,0 @@
- version https://git-lfs.github.com/spec/v1
- oid sha256:24f9e7efb2241934ed613fce2e0d797a4fb461bcb664ddec37c8562b1ef82e47
- size 646164282
 
Qwen2.5-Coder-7B-Instruct-math-solver-config_1/checkpoint-500/rng_state.pth DELETED
@@ -1,3 +0,0 @@
- version https://git-lfs.github.com/spec/v1
- oid sha256:b24603eb35ff0afc28f3c86f8d0d8ed12985ffe26a3776fe0a59e232508bc6c6
- size 14244
 
Qwen2.5-Coder-7B-Instruct-math-solver-config_1/checkpoint-500/scheduler.pt DELETED
@@ -1,3 +0,0 @@
- version https://git-lfs.github.com/spec/v1
- oid sha256:5f4fe8e328be6e10e053e14666b2e571c45c73d9a8291556b08910e3da67b3e6
- size 1064
 
Qwen2.5-Coder-7B-Instruct-math-solver-config_1/checkpoint-500/special_tokens_map.json DELETED
@@ -1,31 +0,0 @@
- {
- "additional_special_tokens": [
- "<|im_start|>",
- "<|im_end|>",
- "<|object_ref_start|>",
- "<|object_ref_end|>",
- "<|box_start|>",
- "<|box_end|>",
- "<|quad_start|>",
- "<|quad_end|>",
- "<|vision_start|>",
- "<|vision_end|>",
- "<|vision_pad|>",
- "<|image_pad|>",
- "<|video_pad|>"
- ],
- "eos_token": {
- "content": "<|im_end|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false
- },
- "pad_token": {
- "content": "<|endoftext|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false
- }
- }
 
Qwen2.5-Coder-7B-Instruct-math-solver-config_1/checkpoint-500/tokenizer.json DELETED
@@ -1,3 +0,0 @@
- version https://git-lfs.github.com/spec/v1
- oid sha256:962b8d8c521fefa934665afddae177326e974ddd6a26e69ff31ad6bccbb5593b
- size 11421994
 
Qwen2.5-Coder-7B-Instruct-math-solver-config_1/checkpoint-500/tokenizer_config.json DELETED
@@ -1,207 +0,0 @@
- {
- "add_bos_token": false,
- "add_prefix_space": false,
- "added_tokens_decoder": {
- "151643": {
- "content": "<|endoftext|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- },
- "151644": {
- "content": "<|im_start|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- },
- "151645": {
- "content": "<|im_end|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- },
- "151646": {
- "content": "<|object_ref_start|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- },
- "151647": {
- "content": "<|object_ref_end|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- },
- "151648": {
- "content": "<|box_start|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- },
- "151649": {
- "content": "<|box_end|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- },
- "151650": {
- "content": "<|quad_start|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- },
- "151651": {
- "content": "<|quad_end|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- },
- "151652": {
- "content": "<|vision_start|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- },
- "151653": {
- "content": "<|vision_end|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- },
- "151654": {
- "content": "<|vision_pad|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- },
- "151655": {
- "content": "<|image_pad|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- },
- "151656": {
- "content": "<|video_pad|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- },
- "151657": {
- "content": "<tool_call>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": false
- },
- "151658": {
- "content": "</tool_call>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": false
- },
- "151659": {
- "content": "<|fim_prefix|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": false
- },
- "151660": {
- "content": "<|fim_middle|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": false
- },
- "151661": {
- "content": "<|fim_suffix|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": false
- },
- "151662": {
- "content": "<|fim_pad|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": false
- },
- "151663": {
- "content": "<|repo_name|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": false
- },
- "151664": {
- "content": "<|file_sep|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": false
- }
- },
- "additional_special_tokens": [
- "<|im_start|>",
- "<|im_end|>",
- "<|object_ref_start|>",
- "<|object_ref_end|>",
- "<|box_start|>",
- "<|box_end|>",
- "<|quad_start|>",
- "<|quad_end|>",
- "<|vision_start|>",
- "<|vision_end|>",
- "<|vision_pad|>",
- "<|image_pad|>",
- "<|video_pad|>"
- ],
- "bos_token": null,
- "chat_template": "{%- if tools %}\n    {{- '<|im_start|>system\\n' }}\n    {%- if messages[0]['role'] == 'system' %}\n        {{- messages[0]['content'] }}\n    {%- else %}\n        {{- 'You are Qwen, created by Alibaba Cloud. You are a helpful assistant.' }}\n    {%- endif %}\n    {{- \"\\n\\n# Tools\\n\\nYou may call one or more functions to assist with the user query.\\n\\nYou are provided with function signatures within <tools></tools> XML tags:\\n<tools>\" }}\n    {%- for tool in tools %}\n        {{- \"\\n\" }}\n        {{- tool | tojson }}\n    {%- endfor %}\n    {{- \"\\n</tools>\\n\\nFor each function call, return a json object with function name and arguments within <tool_call></tool_call> XML tags:\\n<tool_call>\\n{\\\"name\\\": <function-name>, \\\"arguments\\\": <args-json-object>}\\n</tool_call><|im_end|>\\n\" }}\n{%- else %}\n    {%- if messages[0]['role'] == 'system' %}\n        {{- '<|im_start|>system\\n' + messages[0]['content'] + '<|im_end|>\\n' }}\n    {%- else %}\n        {{- '<|im_start|>system\\nYou are Qwen, created by Alibaba Cloud. You are a helpful assistant.<|im_end|>\\n' }}\n    {%- endif %}\n{%- endif %}\n{%- for message in messages %}\n    {%- if (message.role == \"user\") or (message.role == \"system\" and not loop.first) or (message.role == \"assistant\" and not message.tool_calls) %}\n        {{- '<|im_start|>' + message.role + '\\n' + message.content + '<|im_end|>' + '\\n' }}\n    {%- elif message.role == \"assistant\" %}\n        {{- '<|im_start|>' + message.role }}\n        {%- if message.content %}\n            {{- '\\n' + message.content }}\n        {%- endif %}\n        {%- for tool_call in message.tool_calls %}\n            {%- if tool_call.function is defined %}\n                {%- set tool_call = tool_call.function %}\n            {%- endif %}\n            {{- '\\n<tool_call>\\n{\"name\": \"' }}\n            {{- tool_call.name }}\n            {{- '\", \"arguments\": ' }}\n            {{- tool_call.arguments | tojson }}\n            {{- '}\\n</tool_call>' }}\n        {%- endfor %}\n        {{- '<|im_end|>\\n' }}\n    {%- elif message.role == \"tool\" %}\n        {%- if (loop.index0 == 0) or (messages[loop.index0 - 1].role != \"tool\") %}\n            {{- '<|im_start|>user' }}\n        {%- endif %}\n        {{- '\\n<tool_response>\\n' }}\n        {{- message.content }}\n        {{- '\\n</tool_response>' }}\n        {%- if loop.last or (messages[loop.index0 + 1].role != \"tool\") %}\n            {{- '<|im_end|>\\n' }}\n        {%- endif %}\n    {%- endif %}\n{%- endfor %}\n{%- if add_generation_prompt %}\n    {{- '<|im_start|>assistant\\n' }}\n{%- endif %}\n",
- "clean_up_tokenization_spaces": false,
- "eos_token": "<|im_end|>",
- "errors": "replace",
- "model_max_length": 32768,
- "pad_token": "<|endoftext|>",
- "split_special_tokens": false,
- "tokenizer_class": "Qwen2Tokenizer",
- "unk_token": null
- }
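The `chat_template` above renders conversations into Qwen's ChatML layout. For the common no-tools path, it can be reproduced without Jinja; the sketch below is a simplified re-implementation covering only plain system/user/assistant turns (the tool-calling branches of the template are intentionally omitted):

```python
# Simplified re-implementation of the no-tools path of the chat_template:
# each turn becomes <|im_start|>{role}\n{content}<|im_end|>\n, with Qwen's
# default system prompt injected when none is given.
DEFAULT_SYSTEM = ("You are Qwen, created by Alibaba Cloud. "
                  "You are a helpful assistant.")

def apply_chatml(messages, add_generation_prompt=True):
    if not messages or messages[0]["role"] != "system":
        messages = [{"role": "system", "content": DEFAULT_SYSTEM}] + messages
    text = "".join(
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n" for m in messages
    )
    if add_generation_prompt:
        text += "<|im_start|>assistant\n"
    return text

prompt = apply_chatml([{"role": "user", "content": "Solve 2x + 3 = 7."}])
assert prompt.endswith("<|im_end|>\n<|im_start|>assistant\n")
```

In practice `tokenizer.apply_chat_template` performs this expansion from the Jinja source directly.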
 
Qwen2.5-Coder-7B-Instruct-math-solver-config_1/checkpoint-500/trainer_state.json DELETED
@@ -1,3533 +0,0 @@
- {
- "best_metric": null,
- "best_model_checkpoint": null,
- "epoch": 2.0,
- "eval_steps": 500,
- "global_step": 500,
- "is_hyper_param_search": false,
- "is_local_process_zero": true,
- "is_world_process_zero": true,
- "log_history": [
- {
- "epoch": 0.004,
- "grad_norm": 0.4955354928970337,
- "learning_rate": 8.000000000000001e-06,
- "loss": 1.1902,
- "step": 1
- },
- {
- "epoch": 0.008,
- "grad_norm": 0.5662614107131958,
- "learning_rate": 1.6000000000000003e-05,
- "loss": 1.1255,
- "step": 2
- },
- {
- "epoch": 0.012,
- "grad_norm": 0.6702236533164978,
- "learning_rate": 2.4e-05,
- "loss": 1.4075,
- "step": 3
- },
- {
- "epoch": 0.016,
- "grad_norm": 0.6998797655105591,
- "learning_rate": 3.2000000000000005e-05,
- "loss": 1.4778,
- "step": 4
- },
- {
- "epoch": 0.02,
- "grad_norm": 0.7086893320083618,
- "learning_rate": 4e-05,
- "loss": 1.4316,
- "step": 5
- },
- {
- "epoch": 0.024,
- "grad_norm": 0.7096680402755737,
- "learning_rate": 4.8e-05,
- "loss": 1.3811,
- "step": 6
- },
- {
- "epoch": 0.028,
- "grad_norm": 0.7006363868713379,
- "learning_rate": 5.6000000000000006e-05,
- "loss": 1.2483,
- "step": 7
- },
- {
- "epoch": 0.032,
- "grad_norm": 0.6772950291633606,
- "learning_rate": 6.400000000000001e-05,
- "loss": 1.2064,
- "step": 8
- },
- {
- "epoch": 0.036,
- "grad_norm": 0.678734540939331,
- "learning_rate": 7.2e-05,
- "loss": 1.0959,
- "step": 9
- },
- {
- "epoch": 0.04,
- "grad_norm": 0.7308094501495361,
- "learning_rate": 8e-05,
- "loss": 1.1092,
- "step": 10
- },
- {
- "epoch": 0.044,
- "grad_norm": 0.7465188503265381,
- "learning_rate": 8.800000000000001e-05,
- "loss": 1.064,
- "step": 11
- },
- {
- "epoch": 0.048,
- "grad_norm": 0.9013694524765015,
- "learning_rate": 9.6e-05,
- "loss": 0.8864,
- "step": 12
- },
- {
- "epoch": 0.052,
- "grad_norm": 1.035233736038208,
- "learning_rate": 0.00010400000000000001,
- "loss": 0.7344,
- "step": 13
- },
- {
- "epoch": 0.056,
- "grad_norm": 0.9891907572746277,
- "learning_rate": 0.00011200000000000001,
- "loss": 0.7008,
- "step": 14
- },
- {
- "epoch": 0.06,
- "grad_norm": 0.8915588855743408,
- "learning_rate": 0.00012,
- "loss": 0.4322,
- "step": 15
- },
- {
- "epoch": 0.064,
- "grad_norm": 0.6444385051727295,
- "learning_rate": 0.00012800000000000002,
- "loss": 0.4706,
- "step": 16
- },
- {
- "epoch": 0.068,
- "grad_norm": 0.716277003288269,
- "learning_rate": 0.00013600000000000003,
- "loss": 0.4235,
- "step": 17
- },
- {
- "epoch": 0.072,
- "grad_norm": 0.9461649656295776,
- "learning_rate": 0.000144,
- "loss": 0.4431,
- "step": 18
- },
- {
- "epoch": 0.076,
- "grad_norm": 0.6718122363090515,
- "learning_rate": 0.000152,
- "loss": 0.3785,
- "step": 19
- },
- {
- "epoch": 0.08,
- "grad_norm": 1.019701361656189,
- "learning_rate": 0.00016,
- "loss": 0.4057,
- "step": 20
- },
- {
- "epoch": 0.084,
- "grad_norm": 0.7621087431907654,
- "learning_rate": 0.000168,
- "loss": 0.4109,
- "step": 21
- },
- {
- "epoch": 0.088,
- "grad_norm": 0.68868088722229,
- "learning_rate": 0.00017600000000000002,
- "loss": 0.3947,
- "step": 22
- },
- {
- "epoch": 0.092,
- "grad_norm": 0.6749639511108398,
- "learning_rate": 0.00018400000000000003,
- "loss": 0.3161,
- "step": 23
- },
- {
- "epoch": 0.096,
- "grad_norm": 0.7607600688934326,
- "learning_rate": 0.000192,
- "loss": 0.3791,
- "step": 24
- },
- {
- "epoch": 0.1,
- "grad_norm": 1.090657353401184,
- "learning_rate": 0.0002,
- "loss": 0.2962,
- "step": 25
- },
- {
- "epoch": 0.104,
- "grad_norm": 0.7308134436607361,
- "learning_rate": 0.0001999978128380225,
- "loss": 0.2859,
- "step": 26
- },
- {
- "epoch": 0.108,
- "grad_norm": 0.6777300834655762,
- "learning_rate": 0.0001999912514477634,
- "loss": 0.2505,
- "step": 27
- },
- {
- "epoch": 0.112,
- "grad_norm": 0.8053539991378784,
- "learning_rate": 0.0001999803161162393,
- "loss": 0.3348,
- "step": 28
- },
- {
- "epoch": 0.116,
- "grad_norm": 0.7658929824829102,
- "learning_rate": 0.00019996500732179695,
- "loss": 0.3712,
- "step": 29
- },
- {
- "epoch": 0.12,
- "grad_norm": 0.6466525793075562,
- "learning_rate": 0.00019994532573409262,
- "loss": 0.2495,
- "step": 30
- },
- {
- "epoch": 0.124,
- "grad_norm": 0.6649733185768127,
- "learning_rate": 0.00019992127221406275,
- "loss": 0.2955,
- "step": 31
- },
- {
- "epoch": 0.128,
- "grad_norm": 0.9709344506263733,
- "learning_rate": 0.00019989284781388617,
- "loss": 0.369,
- "step": 32
- },
- {
- "epoch": 0.132,
- "grad_norm": 0.62021404504776,
- "learning_rate": 0.00019986005377693825,
- "loss": 0.2186,
- "step": 33
- },
- {
- "epoch": 0.136,
- "grad_norm": 0.6729118227958679,
- "learning_rate": 0.00019982289153773646,
- "loss": 0.2409,
- "step": 34
- },
- {
- "epoch": 0.14,
- "grad_norm": 0.6611261367797852,
- "learning_rate": 0.00019978136272187747,
- "loss": 0.3017,
- "step": 35
- },
- {
- "epoch": 0.144,
- "grad_norm": 0.5344218611717224,
- "learning_rate": 0.00019973546914596623,
- "loss": 0.2103,
- "step": 36
- },
- {
- "epoch": 0.148,
- "grad_norm": 0.6153568029403687,
- "learning_rate": 0.00019968521281753642,
- "loss": 0.2758,
- "step": 37
- },
- {
- "epoch": 0.152,
- "grad_norm": 1.0064842700958252,
- "learning_rate": 0.00019963059593496268,
- "loss": 0.2552,
- "step": 38
- },
- {
- "epoch": 0.156,
- "grad_norm": 0.7223315238952637,
- "learning_rate": 0.0001995716208873644,
- "loss": 0.3121,
- "step": 39
- },
- {
- "epoch": 0.16,
- "grad_norm": 0.8822200298309326,
- "learning_rate": 0.00019950829025450114,
- "loss": 0.3772,
- "step": 40
- },
- {
- "epoch": 0.164,
- "grad_norm": 0.5373098850250244,
- "learning_rate": 0.00019944060680666002,
- "loss": 0.1898,
- "step": 41
- },
- {
- "epoch": 0.168,
- "grad_norm": 0.6075126528739929,
- "learning_rate": 0.0001993685735045343,
- "loss": 0.2623,
- "step": 42
- },
- {
- "epoch": 0.172,
- "grad_norm": 0.5642558336257935,
- "learning_rate": 0.00019929219349909392,
- "loss": 0.1945,
- "step": 43
- },
- {
- "epoch": 0.176,
- "grad_norm": 0.6643729209899902,
- "learning_rate": 0.0001992114701314478,
- "loss": 0.2449,
- "step": 44
- },
- {
- "epoch": 0.18,
- "grad_norm": 0.5862522125244141,
- "learning_rate": 0.00019912640693269752,
- "loss": 0.3278,
- "step": 45
- },
- {
- "epoch": 0.184,
- "grad_norm": 0.5162032842636108,
- "learning_rate": 0.000199037007623783,
- "loss": 0.1748,
- "step": 46
- },
- {
- "epoch": 0.188,
- "grad_norm": 0.6573117971420288,
- "learning_rate": 0.0001989432761153196,
- "loss": 0.2473,
- "step": 47
- },
- {
- "epoch": 0.192,
- "grad_norm": 0.6575373411178589,
- "learning_rate": 0.00019884521650742715,
- "loss": 0.3077,
- "step": 48
- },
- {
- "epoch": 0.196,
- "grad_norm": 0.5843667387962341,
- "learning_rate": 0.00019874283308955057,
- "loss": 0.2452,
- "step": 49
- },
- {
- "epoch": 0.2,
- "grad_norm": 0.6257981061935425,
- "learning_rate": 0.00019863613034027224,
- "loss": 0.2562,
- "step": 50
- },
- {
- "epoch": 0.204,
- "grad_norm": 0.8069740533828735,
- "learning_rate": 0.00019852511292711608,
- "loss": 0.4475,
- "step": 51
- },
- {
- "epoch": 0.208,
- "grad_norm": 0.6240782737731934,
- "learning_rate": 0.0001984097857063434,
- "loss": 0.3188,
- "step": 52
- },
- {
- "epoch": 0.212,
- "grad_norm": 0.7743021249771118,
- "learning_rate": 0.00019829015372274038,
- "loss": 0.3873,
- "step": 53
- },
- {
- "epoch": 0.216,
- "grad_norm": 0.6066705584526062,
- "learning_rate": 0.0001981662222093976,
- "loss": 0.346,
- "step": 54
- },
- {
- "epoch": 0.22,
- "grad_norm": 0.48520612716674805,
- "learning_rate": 0.00019803799658748094,
- "loss": 0.2641,
- "step": 55
- },
- {
- "epoch": 0.224,
- "grad_norm": 0.4940112829208374,
- "learning_rate": 0.00019790548246599447,
- "loss": 0.2565,
- "step": 56
- },
- {
- "epoch": 0.228,
- "grad_norm": 0.5925392508506775,
- "learning_rate": 0.00019776868564153516,
- "loss": 0.243,
- "step": 57
- },
- {
- "epoch": 0.232,
- "grad_norm": 0.4947235584259033,
- "learning_rate": 0.00019762761209803927,
- "loss": 0.3149,
- "step": 58
- },
- {
- "epoch": 0.236,
- "grad_norm": 0.6175550818443298,
- "learning_rate": 0.0001974822680065206,
- "loss": 0.3092,
- "step": 59
- },
- {
- "epoch": 0.24,
- "grad_norm": 0.652012288570404,
- "learning_rate": 0.0001973326597248006,
- "loss": 0.2711,
- "step": 60
- },
- {
- "epoch": 0.244,
- "grad_norm": 0.6104367971420288,
- "learning_rate": 0.00019717879379723012,
- "loss": 0.3271,
- "step": 61
- },
- {
- "epoch": 0.248,
- "grad_norm": 0.5613756775856018,
- "learning_rate": 0.00019702067695440332,
- "loss": 0.2217,
- "step": 62
- },
- {
- "epoch": 0.252,
- "grad_norm": 0.7702831029891968,
- "learning_rate": 0.0001968583161128631,
- "loss": 0.3048,
- "step": 63
- },
- {
- "epoch": 0.256,
- "grad_norm": 0.6488165855407715,
- "learning_rate": 0.00019669171837479873,
- "loss": 0.3584,
- "step": 64
- },
- {
- "epoch": 0.26,
- "grad_norm": 0.7235074639320374,
- "learning_rate": 0.00019652089102773488,
- "loss": 0.345,
- "step": 65
- },
- {
- "epoch": 0.264,
- "grad_norm": 0.6242514252662659,
- "learning_rate": 0.00019634584154421317,
- "loss": 0.2212,
- "step": 66
- },
- {
- "epoch": 0.268,
- "grad_norm": 0.5614833831787109,
- "learning_rate": 0.00019616657758146503,
- "loss": 0.2072,
- "step": 67
- },
- {
- "epoch": 0.272,
- "grad_norm": 0.5229408144950867,
- "learning_rate": 0.00019598310698107702,
- "loss": 0.2463,
- "step": 68
- },
- {
- "epoch": 0.276,
- "grad_norm": 0.5220629572868347,
- "learning_rate": 0.0001957954377686475,
- "loss": 0.2801,
- "step": 69
- },
- {
- "epoch": 0.28,
- "grad_norm": 0.4703560471534729,
- "learning_rate": 0.00019560357815343577,
- "loss": 0.2463,
- "step": 70
- },
- {
- "epoch": 0.284,
- "grad_norm": 0.40801766514778137,
- "learning_rate": 0.000195407536528003,
- "loss": 0.1775,
- "step": 71
- },
- {
- "epoch": 0.288,
- "grad_norm": 0.5604711174964905,
- "learning_rate": 0.00019520732146784491,
- "loss": 0.2848,
- "step": 72
- },
- {
- "epoch": 0.292,
- "grad_norm": 0.5292896032333374,
- "learning_rate": 0.00019500294173101687,
- "loss": 0.2403,
- "step": 73
- },
- {
- "epoch": 0.296,
- "grad_norm": 0.5667911767959595,
- "learning_rate": 0.0001947944062577507,
- "loss": 0.2056,
- "step": 74
- },
- {
- "epoch": 0.3,
- "grad_norm": 0.5002603530883789,
- "learning_rate": 0.00019458172417006347,
- "loss": 0.1944,
- "step": 75
- },
- {
- "epoch": 0.304,
- "grad_norm": 0.6352862119674683,
- "learning_rate": 0.00019436490477135878,
- "loss": 0.2756,
- "step": 76
- },
- {
- "epoch": 0.308,
545
- "grad_norm": 0.5134908556938171,
546
- "learning_rate": 0.00019414395754601947,
547
- "loss": 0.2528,
548
- "step": 77
549
- },
550
- {
551
- "epoch": 0.312,
552
- "grad_norm": 0.6464311480522156,
553
- "learning_rate": 0.00019391889215899299,
554
- "loss": 0.2434,
555
- "step": 78
556
- },
557
- {
558
- "epoch": 0.316,
559
- "grad_norm": 0.4323250353336334,
560
- "learning_rate": 0.00019368971845536845,
561
- "loss": 0.2583,
562
- "step": 79
563
- },
564
- {
565
- "epoch": 0.32,
566
- "grad_norm": 0.798538088798523,
567
- "learning_rate": 0.0001934564464599461,
568
- "loss": 0.361,
569
- "step": 80
570
- },
571
- {
572
- "epoch": 0.324,
573
- "grad_norm": 0.490371972322464,
574
- "learning_rate": 0.00019321908637679865,
575
- "loss": 0.2141,
576
- "step": 81
577
- },
578
- {
579
- "epoch": 0.328,
580
- "grad_norm": 0.8492386341094971,
581
- "learning_rate": 0.00019297764858882514,
582
- "loss": 0.3176,
583
- "step": 82
584
- },
585
- {
586
- "epoch": 0.332,
587
- "grad_norm": 0.6251209378242493,
588
- "learning_rate": 0.00019273214365729655,
589
- "loss": 0.306,
590
- "step": 83
591
- },
592
- {
593
- "epoch": 0.336,
594
- "grad_norm": 0.6417653560638428,
595
- "learning_rate": 0.00019248258232139388,
596
- "loss": 0.3239,
597
- "step": 84
598
- },
599
- {
600
- "epoch": 0.34,
601
- "grad_norm": 0.5241423845291138,
602
- "learning_rate": 0.00019222897549773848,
603
- "loss": 0.2226,
604
- "step": 85
605
- },
606
- {
607
- "epoch": 0.344,
608
- "grad_norm": 0.5505190491676331,
609
- "learning_rate": 0.00019197133427991436,
610
- "loss": 0.3094,
611
- "step": 86
612
- },
613
- {
614
- "epoch": 0.348,
615
- "grad_norm": 0.4747926592826843,
616
- "learning_rate": 0.000191709669937983,
617
- "loss": 0.1867,
618
- "step": 87
619
- },
620
- {
621
- "epoch": 0.352,
622
- "grad_norm": 0.601978600025177,
623
- "learning_rate": 0.00019144399391799043,
624
- "loss": 0.3268,
625
- "step": 88
626
- },
627
- {
628
- "epoch": 0.356,
629
- "grad_norm": 0.42518407106399536,
630
- "learning_rate": 0.00019117431784146645,
631
- "loss": 0.1916,
632
- "step": 89
633
- },
634
- {
635
- "epoch": 0.36,
636
- "grad_norm": 0.44538766145706177,
637
- "learning_rate": 0.00019090065350491626,
638
- "loss": 0.1583,
639
- "step": 90
640
- },
641
- {
642
- "epoch": 0.364,
643
- "grad_norm": 0.49884578585624695,
644
- "learning_rate": 0.00019062301287930446,
645
- "loss": 0.1941,
646
- "step": 91
647
- },
648
- {
649
- "epoch": 0.368,
650
- "grad_norm": 0.44037410616874695,
651
- "learning_rate": 0.0001903414081095315,
652
- "loss": 0.2379,
653
- "step": 92
654
- },
655
- {
656
- "epoch": 0.372,
657
- "grad_norm": 0.6624314188957214,
658
- "learning_rate": 0.00019005585151390223,
659
- "loss": 0.2281,
660
- "step": 93
661
- },
662
- {
663
- "epoch": 0.376,
664
- "grad_norm": 0.43455690145492554,
665
- "learning_rate": 0.00018976635558358722,
666
- "loss": 0.183,
667
- "step": 94
668
- },
669
- {
670
- "epoch": 0.38,
671
- "grad_norm": 0.5635223388671875,
672
- "learning_rate": 0.00018947293298207635,
673
- "loss": 0.1923,
674
- "step": 95
675
- },
676
- {
677
- "epoch": 0.384,
678
- "grad_norm": 0.5557594299316406,
679
- "learning_rate": 0.00018917559654462474,
680
- "loss": 0.1513,
681
- "step": 96
682
- },
683
- {
684
- "epoch": 0.388,
685
- "grad_norm": 0.4074317216873169,
686
- "learning_rate": 0.00018887435927769137,
687
- "loss": 0.1558,
688
- "step": 97
689
- },
690
- {
691
- "epoch": 0.392,
692
- "grad_norm": 0.6130122542381287,
693
- "learning_rate": 0.00018856923435837022,
694
- "loss": 0.2345,
695
- "step": 98
696
- },
697
- {
698
- "epoch": 0.396,
699
- "grad_norm": 0.6227589249610901,
700
- "learning_rate": 0.0001882602351338137,
701
- "loss": 0.2241,
702
- "step": 99
703
- },
704
- {
705
- "epoch": 0.4,
706
- "grad_norm": 0.688264787197113,
707
- "learning_rate": 0.0001879473751206489,
708
- "loss": 0.2274,
709
- "step": 100
710
- },
711
- {
712
- "epoch": 0.404,
713
- "grad_norm": 0.4848058819770813,
714
- "learning_rate": 0.00018763066800438636,
715
- "loss": 0.3741,
716
- "step": 101
717
- },
718
- {
719
- "epoch": 0.408,
720
- "grad_norm": 0.40063992142677307,
721
- "learning_rate": 0.00018731012763882133,
722
- "loss": 0.1983,
723
- "step": 102
724
- },
725
- {
726
- "epoch": 0.412,
727
- "grad_norm": 0.6468407511711121,
728
- "learning_rate": 0.00018698576804542777,
729
- "loss": 0.308,
730
- "step": 103
731
- },
732
- {
733
- "epoch": 0.416,
734
- "grad_norm": 0.6126744747161865,
735
- "learning_rate": 0.00018665760341274505,
736
- "loss": 0.2815,
737
- "step": 104
738
- },
739
- {
740
- "epoch": 0.42,
741
- "grad_norm": 0.541337788105011,
742
- "learning_rate": 0.00018632564809575742,
743
- "loss": 0.2284,
744
- "step": 105
745
- },
746
- {
747
- "epoch": 0.424,
748
- "grad_norm": 0.5728098154067993,
749
- "learning_rate": 0.00018598991661526572,
750
- "loss": 0.2112,
751
- "step": 106
752
- },
753
- {
754
- "epoch": 0.428,
755
- "grad_norm": 0.5167835354804993,
756
- "learning_rate": 0.00018565042365725258,
757
- "loss": 0.2298,
758
- "step": 107
759
- },
760
- {
761
- "epoch": 0.432,
762
- "grad_norm": 0.5101655125617981,
763
- "learning_rate": 0.00018530718407223974,
764
- "loss": 0.2157,
765
- "step": 108
766
- },
767
- {
768
- "epoch": 0.436,
769
- "grad_norm": 0.5442182421684265,
770
- "learning_rate": 0.0001849602128746387,
771
- "loss": 0.191,
772
- "step": 109
773
- },
774
- {
775
- "epoch": 0.44,
776
- "grad_norm": 0.5199810266494751,
777
- "learning_rate": 0.00018460952524209355,
778
- "loss": 0.3364,
779
- "step": 110
780
- },
781
- {
782
- "epoch": 0.444,
783
- "grad_norm": 0.6269917488098145,
784
- "learning_rate": 0.00018425513651481747,
785
- "loss": 0.2628,
786
- "step": 111
787
- },
788
- {
789
- "epoch": 0.448,
790
- "grad_norm": 0.6553277373313904,
791
- "learning_rate": 0.00018389706219492147,
792
- "loss": 0.305,
793
- "step": 112
794
- },
795
- {
796
- "epoch": 0.452,
797
- "grad_norm": 0.6149172782897949,
798
- "learning_rate": 0.00018353531794573625,
799
- "loss": 0.2882,
800
- "step": 113
801
- },
802
- {
803
- "epoch": 0.456,
804
- "grad_norm": 0.6855000853538513,
805
- "learning_rate": 0.00018316991959112716,
806
- "loss": 0.3806,
807
- "step": 114
808
- },
809
- {
810
- "epoch": 0.46,
811
- "grad_norm": 0.5103464722633362,
812
- "learning_rate": 0.00018280088311480201,
813
- "loss": 0.2593,
814
- "step": 115
815
- },
816
- {
817
- "epoch": 0.464,
818
- "grad_norm": 0.49645787477493286,
819
- "learning_rate": 0.00018242822465961176,
820
- "loss": 0.2276,
821
- "step": 116
822
- },
823
- {
824
- "epoch": 0.468,
825
- "grad_norm": 0.5269503593444824,
826
- "learning_rate": 0.00018205196052684445,
827
- "loss": 0.3434,
828
- "step": 117
829
- },
830
- {
831
- "epoch": 0.472,
832
- "grad_norm": 0.5436558723449707,
833
- "learning_rate": 0.00018167210717551224,
834
- "loss": 0.3803,
835
- "step": 118
836
- },
837
- {
838
- "epoch": 0.476,
839
- "grad_norm": 0.442690372467041,
840
- "learning_rate": 0.00018128868122163123,
841
- "loss": 0.2289,
842
- "step": 119
843
- },
844
- {
845
- "epoch": 0.48,
846
- "grad_norm": 0.3994387090206146,
847
- "learning_rate": 0.00018090169943749476,
848
- "loss": 0.2825,
849
- "step": 120
850
- },
851
- {
852
- "epoch": 0.484,
853
- "grad_norm": 0.5957876443862915,
854
- "learning_rate": 0.00018051117875093976,
855
- "loss": 0.2747,
856
- "step": 121
857
- },
858
- {
859
- "epoch": 0.488,
860
- "grad_norm": 0.4253046214580536,
861
- "learning_rate": 0.00018011713624460608,
862
- "loss": 0.2245,
863
- "step": 122
864
- },
865
- {
866
- "epoch": 0.492,
867
- "grad_norm": 0.4820120334625244,
868
- "learning_rate": 0.0001797195891551896,
869
- "loss": 0.2685,
870
- "step": 123
871
- },
872
- {
873
- "epoch": 0.496,
874
- "grad_norm": 0.4225156307220459,
875
- "learning_rate": 0.00017931855487268782,
876
- "loss": 0.2172,
877
- "step": 124
878
- },
879
- {
880
- "epoch": 0.5,
881
- "grad_norm": 0.41555771231651306,
882
- "learning_rate": 0.00017891405093963938,
883
- "loss": 0.2742,
884
- "step": 125
885
- },
886
- {
887
- "epoch": 0.504,
888
- "grad_norm": 0.5232601761817932,
889
- "learning_rate": 0.0001785060950503568,
890
- "loss": 0.1839,
891
- "step": 126
892
- },
893
- {
894
- "epoch": 0.508,
895
- "grad_norm": 0.5760388374328613,
896
- "learning_rate": 0.0001780947050501522,
897
- "loss": 0.3365,
898
- "step": 127
899
- },
900
- {
901
- "epoch": 0.512,
902
- "grad_norm": 0.6275510191917419,
903
- "learning_rate": 0.00017767989893455698,
904
- "loss": 0.2565,
905
- "step": 128
906
- },
907
- {
908
- "epoch": 0.516,
909
- "grad_norm": 0.595443844795227,
910
- "learning_rate": 0.00017726169484853438,
911
- "loss": 0.358,
912
- "step": 129
913
- },
914
- {
915
- "epoch": 0.52,
916
- "grad_norm": 0.5733467936515808,
917
- "learning_rate": 0.00017684011108568592,
918
- "loss": 0.1629,
919
- "step": 130
920
- },
921
- {
922
- "epoch": 0.524,
923
- "grad_norm": 0.5721938014030457,
924
- "learning_rate": 0.00017641516608745114,
925
- "loss": 0.2287,
926
- "step": 131
927
- },
928
- {
929
- "epoch": 0.528,
930
- "grad_norm": 0.5880502462387085,
931
- "learning_rate": 0.00017598687844230088,
932
- "loss": 0.3348,
933
- "step": 132
934
- },
935
- {
936
- "epoch": 0.532,
937
- "grad_norm": 0.6357387900352478,
938
- "learning_rate": 0.0001755552668849242,
939
- "loss": 0.2448,
940
- "step": 133
941
- },
942
- {
943
- "epoch": 0.536,
944
- "grad_norm": 0.5971753001213074,
945
- "learning_rate": 0.00017512035029540885,
946
- "loss": 0.2081,
947
- "step": 134
948
- },
949
- {
950
- "epoch": 0.54,
951
- "grad_norm": 0.5828693509101868,
952
- "learning_rate": 0.0001746821476984154,
953
- "loss": 0.2584,
954
- "step": 135
955
- },
956
- {
957
- "epoch": 0.544,
958
- "grad_norm": 0.6158702373504639,
959
- "learning_rate": 0.000174240678262345,
960
- "loss": 0.2242,
961
- "step": 136
962
- },
963
- {
964
- "epoch": 0.548,
965
- "grad_norm": 0.5678686499595642,
966
- "learning_rate": 0.00017379596129850098,
967
- "loss": 0.19,
968
- "step": 137
969
- },
970
- {
971
- "epoch": 0.552,
972
- "grad_norm": 0.603077232837677,
973
- "learning_rate": 0.000173348016260244,
974
- "loss": 0.2917,
975
- "step": 138
976
- },
977
- {
978
- "epoch": 0.556,
979
- "grad_norm": 0.449489027261734,
980
- "learning_rate": 0.00017289686274214118,
981
- "loss": 0.1648,
982
- "step": 139
983
- },
984
- {
985
- "epoch": 0.56,
986
- "grad_norm": 0.6623111367225647,
987
- "learning_rate": 0.00017244252047910892,
988
- "loss": 0.3078,
989
- "step": 140
990
- },
991
- {
992
- "epoch": 0.564,
993
- "grad_norm": 0.7190758585929871,
994
- "learning_rate": 0.00017198500934554966,
995
- "loss": 0.2665,
996
- "step": 141
997
- },
998
- {
999
- "epoch": 0.568,
1000
- "grad_norm": 0.7329822182655334,
1001
- "learning_rate": 0.00017152434935448256,
1002
- "loss": 0.2134,
1003
- "step": 142
1004
- },
1005
- {
1006
- "epoch": 0.572,
1007
- "grad_norm": 0.8009594082832336,
1008
- "learning_rate": 0.00017106056065666793,
1009
- "loss": 0.2929,
1010
- "step": 143
1011
- },
1012
- {
1013
- "epoch": 0.576,
1014
- "grad_norm": 0.4459131360054016,
1015
- "learning_rate": 0.0001705936635397259,
1016
- "loss": 0.2282,
1017
- "step": 144
1018
- },
1019
- {
1020
- "epoch": 0.58,
1021
- "grad_norm": 0.5884144902229309,
1022
- "learning_rate": 0.00017012367842724887,
1023
- "loss": 0.2961,
1024
- "step": 145
1025
- },
1026
- {
1027
- "epoch": 0.584,
1028
- "grad_norm": 0.48679161071777344,
1029
- "learning_rate": 0.00016965062587790823,
1030
- "loss": 0.199,
1031
- "step": 146
1032
- },
1033
- {
1034
- "epoch": 0.588,
1035
- "grad_norm": 0.4284805953502655,
1036
- "learning_rate": 0.00016917452658455495,
1037
- "loss": 0.1794,
1038
- "step": 147
1039
- },
1040
- {
1041
- "epoch": 0.592,
1042
- "grad_norm": 0.4210096299648285,
1043
- "learning_rate": 0.00016869540137331445,
1044
- "loss": 0.1528,
1045
- "step": 148
1046
- },
1047
- {
1048
- "epoch": 0.596,
1049
- "grad_norm": 0.5506507754325867,
1050
- "learning_rate": 0.00016821327120267567,
1051
- "loss": 0.1551,
1052
- "step": 149
1053
- },
1054
- {
1055
- "epoch": 0.6,
1056
- "grad_norm": 0.5550769567489624,
1057
- "learning_rate": 0.00016772815716257412,
1058
- "loss": 0.2835,
1059
- "step": 150
1060
- },
1061
- {
1062
- "epoch": 0.604,
1063
- "grad_norm": 0.4100673794746399,
1064
- "learning_rate": 0.00016724008047346947,
1065
- "loss": 0.4128,
1066
- "step": 151
1067
- },
1068
- {
1069
- "epoch": 0.608,
1070
- "grad_norm": 0.5355508327484131,
1071
- "learning_rate": 0.00016674906248541726,
1072
- "loss": 0.3101,
1073
- "step": 152
1074
- },
1075
- {
1076
- "epoch": 0.612,
1077
- "grad_norm": 0.3811154067516327,
1078
- "learning_rate": 0.000166255124677135,
1079
- "loss": 0.2393,
1080
- "step": 153
1081
- },
1082
- {
1083
- "epoch": 0.616,
1084
- "grad_norm": 0.5073661804199219,
1085
- "learning_rate": 0.00016575828865506245,
1086
- "loss": 0.3463,
1087
- "step": 154
1088
- },
1089
- {
1090
- "epoch": 0.62,
1091
- "grad_norm": 0.5104597210884094,
1092
- "learning_rate": 0.00016525857615241687,
1093
- "loss": 0.3262,
1094
- "step": 155
1095
- },
1096
- {
1097
- "epoch": 0.624,
1098
- "grad_norm": 0.3945644795894623,
1099
- "learning_rate": 0.0001647560090282419,
1100
- "loss": 0.2162,
1101
- "step": 156
1102
- },
1103
- {
1104
- "epoch": 0.628,
1105
- "grad_norm": 0.5298649668693542,
1106
- "learning_rate": 0.00016425060926645167,
1107
- "loss": 0.2206,
1108
- "step": 157
1109
- },
1110
- {
1111
- "epoch": 0.632,
1112
- "grad_norm": 0.46345198154449463,
1113
- "learning_rate": 0.000163742398974869,
1114
- "loss": 0.1806,
1115
- "step": 158
1116
- },
1117
- {
1118
- "epoch": 0.636,
1119
- "grad_norm": 0.6795451641082764,
1120
- "learning_rate": 0.00016323140038425842,
1121
- "loss": 0.4104,
1122
- "step": 159
1123
- },
1124
- {
1125
- "epoch": 0.64,
1126
- "grad_norm": 0.4154146611690521,
1127
- "learning_rate": 0.0001627176358473537,
1128
- "loss": 0.2155,
1129
- "step": 160
1130
- },
1131
- {
1132
- "epoch": 0.644,
1133
- "grad_norm": 0.5460222363471985,
1134
- "learning_rate": 0.0001622011278378801,
1135
- "loss": 0.2937,
1136
- "step": 161
1137
- },
1138
- {
1139
- "epoch": 0.648,
1140
- "grad_norm": 0.5266145467758179,
1141
- "learning_rate": 0.0001616818989495711,
1142
- "loss": 0.252,
1143
- "step": 162
1144
- },
1145
- {
1146
- "epoch": 0.652,
1147
- "grad_norm": 0.6369134783744812,
1148
- "learning_rate": 0.00016115997189518043,
1149
- "loss": 0.3568,
1150
- "step": 163
1151
- },
1152
- {
1153
- "epoch": 0.656,
1154
- "grad_norm": 0.5750764012336731,
1155
- "learning_rate": 0.00016063536950548826,
1156
- "loss": 0.3061,
1157
- "step": 164
1158
- },
1159
- {
1160
- "epoch": 0.66,
1161
- "grad_norm": 0.5703745484352112,
1162
- "learning_rate": 0.00016010811472830252,
1163
- "loss": 0.275,
1164
- "step": 165
1165
- },
1166
- {
1167
- "epoch": 0.664,
1168
- "grad_norm": 0.6912633180618286,
1169
- "learning_rate": 0.0001595782306274553,
1170
- "loss": 0.3468,
1171
- "step": 166
1172
- },
1173
- {
1174
- "epoch": 0.668,
1175
- "grad_norm": 0.5048147439956665,
1176
- "learning_rate": 0.0001590457403817937,
1177
- "loss": 0.2633,
1178
- "step": 167
1179
- },
1180
- {
1181
- "epoch": 0.672,
1182
- "grad_norm": 0.4091242849826813,
1183
- "learning_rate": 0.00015851066728416618,
1184
- "loss": 0.2074,
1185
- "step": 168
1186
- },
1187
- {
1188
- "epoch": 0.676,
1189
- "grad_norm": 0.481308251619339,
1190
- "learning_rate": 0.00015797303474040332,
1191
- "loss": 0.1691,
1192
- "step": 169
1193
- },
1194
- {
1195
- "epoch": 0.68,
1196
- "grad_norm": 0.537234902381897,
1197
- "learning_rate": 0.00015743286626829437,
1198
- "loss": 0.2094,
1199
- "step": 170
1200
- },
1201
- {
1202
- "epoch": 0.684,
1203
- "grad_norm": 0.5328525900840759,
1204
- "learning_rate": 0.00015689018549655813,
1205
- "loss": 0.1987,
1206
- "step": 171
1207
- },
1208
- {
1209
- "epoch": 0.688,
1210
- "grad_norm": 0.5353541970252991,
1211
- "learning_rate": 0.00015634501616380967,
1212
- "loss": 0.251,
1213
- "step": 172
1214
- },
1215
- {
1216
- "epoch": 0.692,
1217
- "grad_norm": 0.5319244265556335,
1218
- "learning_rate": 0.00015579738211752165,
1219
- "loss": 0.1889,
1220
- "step": 173
1221
- },
1222
- {
1223
- "epoch": 0.696,
1224
- "grad_norm": 0.5995360612869263,
1225
- "learning_rate": 0.00015524730731298134,
1226
- "loss": 0.288,
1227
- "step": 174
1228
- },
1229
- {
1230
- "epoch": 0.7,
1231
- "grad_norm": 0.48707112669944763,
1232
- "learning_rate": 0.00015469481581224272,
1233
- "loss": 0.2073,
1234
- "step": 175
1235
- },
1236
- {
1237
- "epoch": 0.704,
1238
- "grad_norm": 0.44136014580726624,
1239
- "learning_rate": 0.0001541399317830738,
1240
- "loss": 0.2649,
1241
- "step": 176
1242
- },
1243
- {
1244
- "epoch": 0.708,
1245
- "grad_norm": 0.48200029134750366,
1246
- "learning_rate": 0.00015358267949789966,
1247
- "loss": 0.1662,
1248
- "step": 177
1249
- },
1250
- {
1251
- "epoch": 0.712,
1252
- "grad_norm": 0.5385119318962097,
1253
- "learning_rate": 0.0001530230833327405,
1254
- "loss": 0.2222,
1255
- "step": 178
1256
- },
1257
- {
1258
- "epoch": 0.716,
1259
- "grad_norm": 0.48744550347328186,
1260
- "learning_rate": 0.00015246116776614538,
1261
- "loss": 0.2671,
1262
- "step": 179
1263
- },
1264
- {
1265
- "epoch": 0.72,
1266
- "grad_norm": 0.7354145050048828,
1267
- "learning_rate": 0.00015189695737812152,
1268
- "loss": 0.2752,
1269
- "step": 180
1270
- },
1271
- {
1272
- "epoch": 0.724,
1273
- "grad_norm": 0.5340560078620911,
1274
- "learning_rate": 0.00015133047684905916,
1275
- "loss": 0.3146,
1276
- "step": 181
1277
- },
1278
- {
1279
- "epoch": 0.728,
1280
- "grad_norm": 0.4831385910511017,
1281
- "learning_rate": 0.0001507617509586517,
1282
- "loss": 0.2547,
1283
- "step": 182
1284
- },
1285
- {
1286
- "epoch": 0.732,
1287
- "grad_norm": 0.5295563340187073,
1288
- "learning_rate": 0.00015019080458481202,
1289
- "loss": 0.2962,
1290
- "step": 183
1291
- },
1292
- {
1293
- "epoch": 0.736,
1294
- "grad_norm": 0.5048829913139343,
1295
- "learning_rate": 0.00014961766270258422,
1296
- "loss": 0.1909,
1297
- "step": 184
1298
- },
1299
- {
1300
- "epoch": 0.74,
1301
- "grad_norm": 0.5644073486328125,
1302
- "learning_rate": 0.00014904235038305083,
1303
- "loss": 0.216,
1304
- "step": 185
1305
- },
1306
- {
1307
- "epoch": 0.744,
1308
- "grad_norm": 0.5821675062179565,
1309
- "learning_rate": 0.00014846489279223652,
1310
- "loss": 0.2375,
1311
- "step": 186
1312
- },
1313
- {
1314
- "epoch": 0.748,
1315
- "grad_norm": 0.7688628435134888,
1316
- "learning_rate": 0.00014788531519000696,
1317
- "loss": 0.2767,
1318
- "step": 187
1319
- },
1320
- {
1321
- "epoch": 0.752,
1322
- "grad_norm": 0.5115137100219727,
1323
- "learning_rate": 0.0001473036429289641,
1324
- "loss": 0.175,
1325
- "step": 188
1326
- },
1327
- {
1328
- "epoch": 0.756,
1329
- "grad_norm": 0.48442262411117554,
1330
- "learning_rate": 0.00014671990145333696,
1331
- "loss": 0.2332,
1332
- "step": 189
1333
- },
1334
- {
1335
- "epoch": 0.76,
1336
- "grad_norm": 0.6630519032478333,
1337
- "learning_rate": 0.0001461341162978688,
1338
- "loss": 0.3098,
1339
- "step": 190
1340
- },
1341
- {
1342
- "epoch": 0.764,
1343
- "grad_norm": 0.5805238485336304,
1344
- "learning_rate": 0.00014554631308669994,
1345
- "loss": 0.1861,
1346
- "step": 191
1347
- },
1348
- {
1349
- "epoch": 0.768,
1350
- "grad_norm": 0.542442798614502,
1351
- "learning_rate": 0.00014495651753224705,
1352
- "loss": 0.196,
1353
- "step": 192
1354
- },
1355
- {
1356
- "epoch": 0.772,
1357
- "grad_norm": 0.5466201305389404,
1358
- "learning_rate": 0.00014436475543407843,
1359
- "loss": 0.266,
1360
- "step": 193
1361
- },
1362
- {
1363
- "epoch": 0.776,
1364
- "grad_norm": 0.6598832607269287,
1365
- "learning_rate": 0.00014377105267778518,
1366
- "loss": 0.2226,
1367
- "step": 194
1368
- },
1369
- {
1370
- "epoch": 0.78,
1371
- "grad_norm": 0.6290153861045837,
1372
- "learning_rate": 0.00014317543523384928,
1373
- "loss": 0.1933,
1374
- "step": 195
1375
- },
1376
- {
1377
- "epoch": 0.784,
1378
- "grad_norm": 0.8984250426292419,
1379
- "learning_rate": 0.00014257792915650728,
1380
- "loss": 0.259,
1381
- "step": 196
1382
- },
1383
- {
1384
- "epoch": 0.788,
1385
- "grad_norm": 0.6985680460929871,
1386
- "learning_rate": 0.0001419785605826106,
1387
- "loss": 0.2459,
1388
- "step": 197
1389
- },
1390
- {
1391
- "epoch": 0.792,
1392
- "grad_norm": 0.9643673896789551,
1393
- "learning_rate": 0.00014137735573048233,
1394
- "loss": 0.42,
1395
- "step": 198
1396
- },
1397
- {
1398
- "epoch": 0.796,
1399
- "grad_norm": 0.509860634803772,
1400
- "learning_rate": 0.00014077434089877037,
1401
- "loss": 0.1935,
1402
- "step": 199
1403
- },
1404
- {
1405
- "epoch": 0.8,
1406
- "grad_norm": 0.5520833134651184,
1407
- "learning_rate": 0.00014016954246529696,
1408
- "loss": 0.2356,
1409
- "step": 200
1410
- },
1411
- {
1412
- "epoch": 0.804,
1413
- "grad_norm": 0.5402956008911133,
1414
- "learning_rate": 0.00013956298688590484,
1415
- "loss": 0.2148,
1416
- "step": 201
1417
- },
1418
- {
1419
- "epoch": 0.808,
1420
- "grad_norm": 0.5454432368278503,
1421
- "learning_rate": 0.00013895470069330004,
1422
- "loss": 0.2209,
1423
- "step": 202
1424
- },
1425
- {
1426
- "epoch": 0.812,
1427
- "grad_norm": 0.6620739698410034,
1428
- "learning_rate": 0.00013834471049589117,
1429
- "loss": 0.2851,
1430
- "step": 203
1431
- },
1432
- {
1433
- "epoch": 0.816,
1434
- "grad_norm": 0.6007181406021118,
1435
- "learning_rate": 0.00013773304297662559,
1436
- "loss": 0.3732,
1437
- "step": 204
1438
- },
1439
- {
1440
- "epoch": 0.82,
1441
- "grad_norm": 0.5536758303642273,
1442
- "learning_rate": 0.00013711972489182208,
1443
- "loss": 0.2656,
1444
- "step": 205
1445
- },
1446
- {
1447
- "epoch": 0.824,
1448
- "grad_norm": 0.6447663903236389,
1449
- "learning_rate": 0.00013650478307000057,
1450
- "loss": 0.225,
1451
- "step": 206
1452
- },
1453
- {
1454
- "epoch": 0.828,
1455
- "grad_norm": 0.5073426961898804,
1456
- "learning_rate": 0.00013588824441070852,
1457
- "loss": 0.3077,
1458
- "step": 207
1459
- },
1460
- {
1461
- "epoch": 0.832,
1462
- "grad_norm": 0.7368581295013428,
1463
- "learning_rate": 0.00013527013588334415,
1464
- "loss": 0.268,
1465
- "step": 208
1466
- },
1467
- {
1468
- "epoch": 0.836,
1469
- "grad_norm": 0.48935043811798096,
1470
- "learning_rate": 0.00013465048452597682,
1471
- "loss": 0.2053,
1472
- "step": 209
1473
- },
1474
- {
1475
- "epoch": 0.84,
1476
- "grad_norm": 0.5928518772125244,
1477
- "learning_rate": 0.00013402931744416433,
1478
- "loss": 0.2556,
1479
- "step": 210
1480
- },
1481
- {
1482
- "epoch": 0.844,
1483
- "grad_norm": 0.5154832601547241,
1484
- "learning_rate": 0.00013340666180976712,
1485
- "loss": 0.2993,
1486
- "step": 211
1487
- },
1488
- {
1489
- "epoch": 0.848,
1490
- "grad_norm": 0.6863323450088501,
1491
- "learning_rate": 0.00013278254485975976,
1492
- "loss": 0.337,
1493
- "step": 212
1494
- },
1495
- {
1496
- "epoch": 0.852,
1497
- "grad_norm": 0.4001295864582062,
1498
- "learning_rate": 0.00013215699389503954,
1499
- "loss": 0.2058,
1500
- "step": 213
1501
- },
1502
- {
1503
- "epoch": 0.856,
1504
- "grad_norm": 0.5775720477104187,
1505
- "learning_rate": 0.00013153003627923218,
1506
- "loss": 0.364,
1507
- "step": 214
1508
- },
1509
- {
1510
- "epoch": 0.86,
1511
- "grad_norm": 0.4591532051563263,
1512
- "learning_rate": 0.00013090169943749476,
1513
- "loss": 0.3343,
1514
- "step": 215
1515
- },
1516
- {
1517
- "epoch": 0.864,
1518
- "grad_norm": 0.6201801300048828,
1519
- "learning_rate": 0.00013027201085531634,
1520
- "loss": 0.2138,
1521
- "step": 216
1522
- },
1523
- {
1524
- "epoch": 0.868,
1525
- "grad_norm": 0.5922008752822876,
1526
- "learning_rate": 0.0001296409980773154,
1527
- "loss": 0.2251,
1528
- "step": 217
1529
- },
1530
- {
1531
- "epoch": 0.872,
1532
- "grad_norm": 0.7046107649803162,
1533
- "learning_rate": 0.00012900868870603503,
1534
- "loss": 0.3299,
1535
- "step": 218
1536
- },
1537
- {
1538
- "epoch": 0.876,
1539
- "grad_norm": 0.5001659393310547,
1540
- "learning_rate": 0.0001283751104007355,
1541
- "loss": 0.2286,
1542
- "step": 219
1543
- },
1544
- {
1545
- "epoch": 0.88,
1546
- "grad_norm": 0.753545343875885,
1547
- "learning_rate": 0.00012774029087618446,
1548
- "loss": 0.3051,
1549
- "step": 220
1550
- },
1551
- {
1552
- "epoch": 0.884,
1553
- "grad_norm": 0.4501713216304779,
1554
- "learning_rate": 0.00012710425790144446,
1555
- "loss": 0.2878,
1556
- "step": 221
1557
- },
1558
- {
1559
- "epoch": 0.888,
1560
- "grad_norm": 0.6050700545310974,
1561
- "learning_rate": 0.00012646703929865817,
1562
- "loss": 0.2571,
1563
- "step": 222
1564
- },
1565
- {
1566
- "epoch": 0.892,
1567
- "grad_norm": 0.4856882393360138,
1568
- "learning_rate": 0.00012582866294183167,
1569
- "loss": 0.2012,
1570
- "step": 223
1571
- },
1572
- {
1573
- "epoch": 0.896,
1574
- "grad_norm": 0.6917821168899536,
1575
- "learning_rate": 0.00012518915675561483,
1576
- "loss": 0.2586,
1577
- "step": 224
1578
- },
1579
- {
1580
- "epoch": 0.9,
1581
- "grad_norm": 0.35256296396255493,
1582
- "learning_rate": 0.00012454854871407994,
1583
- "loss": 0.1788,
1584
- "step": 225
1585
- },
1586
- {
1587
- "epoch": 0.904,
1588
- "grad_norm": 0.5019899010658264,
1589
- "learning_rate": 0.00012390686683949798,
1590
- "loss": 0.2638,
1591
- "step": 226
1592
- },
1593
- {
1594
- "epoch": 0.908,
1595
- "grad_norm": 0.59697026014328,
1596
- "learning_rate": 0.00012326413920111303,
1597
- "loss": 0.2432,
1598
- "step": 227
1599
- },
1600
- {
1601
- "epoch": 0.912,
1602
- "grad_norm": 0.6440879702568054,
1603
- "learning_rate": 0.00012262039391391404,
1604
- "loss": 0.2774,
1605
- "step": 228
1606
- },
1607
- {
1608
- "epoch": 0.916,
1609
- "grad_norm": 0.6482043266296387,
1610
- "learning_rate": 0.00012197565913740531,
1611
- "loss": 0.3204,
1612
- "step": 229
1613
- },
1614
- {
1615
- "epoch": 0.92,
1616
- "grad_norm": 0.48348715901374817,
1617
- "learning_rate": 0.0001213299630743747,
1618
- "loss": 0.2025,
1619
- "step": 230
1620
- },
1621
- {
1622
- "epoch": 0.924,
1623
- "grad_norm": 0.49052107334136963,
1624
- "learning_rate": 0.00012068333396965968,
1625
- "loss": 0.2144,
1626
- "step": 231
1627
- },
1628
- {
1629
- "epoch": 0.928,
1630
- "grad_norm": 0.7402703762054443,
1631
- "learning_rate": 0.00012003580010891213,
1632
- "loss": 0.3353,
1633
- "step": 232
1634
- },
1635
- {
1636
- "epoch": 0.932,
1637
- "grad_norm": 0.5001167058944702,
1638
- "learning_rate": 0.00011938738981736085,
1639
- "loss": 0.2622,
1640
- "step": 233
1641
- },
1642
- {
1643
- "epoch": 0.936,
1644
- "grad_norm": 0.6015510559082031,
1645
- "learning_rate": 0.00011873813145857249,
1646
- "loss": 0.26,
1647
- "step": 234
1648
- },
1649
- {
1650
- "epoch": 0.94,
1651
- "grad_norm": 0.5365880727767944,
1652
- "learning_rate": 0.000118088053433211,
1653
- "loss": 0.2161,
1654
- "step": 235
1655
- },
1656
- {
1657
- "epoch": 0.944,
1658
- "grad_norm": 0.513251543045044,
1659
- "learning_rate": 0.00011743718417779517,
1660
- "loss": 0.1221,
1661
- "step": 236
1662
- },
1663
- {
1664
- "epoch": 0.948,
1665
- "grad_norm": 0.5267243385314941,
1666
- "learning_rate": 0.00011678555216345477,
1667
- "loss": 0.1956,
1668
- "step": 237
1669
- },
1670
- {
1671
- "epoch": 0.952,
1672
- "grad_norm": 0.488800585269928,
1673
- "learning_rate": 0.00011613318589468511,
1674
- "loss": 0.2646,
1675
- "step": 238
1676
- },
1677
- {
1678
- "epoch": 0.956,
1679
- "grad_norm": 0.4522203207015991,
1680
- "learning_rate": 0.00011548011390810017,
1681
- "loss": 0.1728,
1682
- "step": 239
1683
- },
1684
- {
1685
- "epoch": 0.96,
1686
- "grad_norm": 0.5678312182426453,
1687
- "learning_rate": 0.0001148263647711842,
1688
- "loss": 0.2214,
1689
- "step": 240
1690
- },
1691
- {
1692
- "epoch": 0.964,
1693
- "grad_norm": 0.5377548933029175,
1694
- "learning_rate": 0.00011417196708104243,
1695
- "loss": 0.2833,
1696
- "step": 241
1697
- },
1698
- {
1699
- "epoch": 0.968,
1700
- "grad_norm": 0.5589195489883423,
1701
- "learning_rate": 0.0001135169494631497,
1702
- "loss": 0.2017,
1703
- "step": 242
1704
- },
1705
- {
1706
- "epoch": 0.972,
1707
- "grad_norm": 0.6018348932266235,
- "learning_rate": 0.00011286134057009863,
- "loss": 0.2382,
- "step": 243
- },
- {
- "epoch": 0.976,
- "grad_norm": 0.7269315123558044,
- "learning_rate": 0.00011220516908034601,
- "loss": 0.3555,
- "step": 244
- },
- {
- "epoch": 0.98,
- "grad_norm": 0.6661584973335266,
- "learning_rate": 0.00011154846369695863,
- "loss": 0.2729,
- "step": 245
- },
- {
- "epoch": 0.984,
- "grad_norm": 0.5153563022613525,
- "learning_rate": 0.00011089125314635726,
- "loss": 0.1617,
- "step": 246
- },
- {
- "epoch": 0.988,
- "grad_norm": 0.6063748002052307,
- "learning_rate": 0.00011023356617706052,
- "loss": 0.2208,
- "step": 247
- },
- {
- "epoch": 0.992,
- "grad_norm": 0.6304444074630737,
- "learning_rate": 0.00010957543155842702,
- "loss": 0.1528,
- "step": 248
- },
- {
- "epoch": 0.996,
- "grad_norm": 0.5737876892089844,
- "learning_rate": 0.00010891687807939707,
- "loss": 0.3462,
- "step": 249
- },
- {
- "epoch": 1.0,
- "grad_norm": 0.6061635613441467,
- "learning_rate": 0.00010825793454723325,
- "loss": 0.2215,
- "step": 250
- },
- {
- "epoch": 1.004,
- "grad_norm": 0.3539637327194214,
- "learning_rate": 0.00010759862978626031,
- "loss": 0.2342,
- "step": 251
- },
- {
- "epoch": 1.008,
- "grad_norm": 0.40586933493614197,
- "learning_rate": 0.00010693899263660441,
- "loss": 0.2442,
- "step": 252
- },
- {
- "epoch": 1.012,
- "grad_norm": 0.33828112483024597,
- "learning_rate": 0.00010627905195293135,
- "loss": 0.1908,
- "step": 253
- },
- {
- "epoch": 1.016,
- "grad_norm": 0.4040011763572693,
- "learning_rate": 0.00010561883660318455,
- "loss": 0.1456,
- "step": 254
- },
- {
- "epoch": 1.02,
- "grad_norm": 0.3069443106651306,
- "learning_rate": 0.00010495837546732224,
- "loss": 0.1898,
- "step": 255
- },
- {
- "epoch": 1.024,
- "grad_norm": 0.4217762351036072,
- "learning_rate": 0.00010429769743605407,
- "loss": 0.1974,
- "step": 256
- },
- {
- "epoch": 1.028,
- "grad_norm": 0.3466332256793976,
- "learning_rate": 0.00010363683140957745,
- "loss": 0.1383,
- "step": 257
- },
- {
- "epoch": 1.032,
- "grad_norm": 0.3328801691532135,
- "learning_rate": 0.00010297580629631325,
- "loss": 0.1422,
- "step": 258
- },
- {
- "epoch": 1.036,
- "grad_norm": 0.3091568946838379,
- "learning_rate": 0.00010231465101164139,
- "loss": 0.1491,
- "step": 259
- },
- {
- "epoch": 1.04,
- "grad_norm": 0.5561888813972473,
- "learning_rate": 0.00010165339447663587,
- "loss": 0.1849,
- "step": 260
- },
- {
- "epoch": 1.044,
- "grad_norm": 0.31464457511901855,
- "learning_rate": 0.00010099206561679963,
- "loss": 0.1414,
- "step": 261
- },
- {
- "epoch": 1.048,
- "grad_norm": 0.40212392807006836,
- "learning_rate": 0.00010033069336079952,
- "loss": 0.1574,
- "step": 262
- },
- {
- "epoch": 1.052,
- "grad_norm": 0.3913220763206482,
- "learning_rate": 9.966930663920049e-05,
- "loss": 0.1772,
- "step": 263
- },
- {
- "epoch": 1.056,
- "grad_norm": 0.42926493287086487,
- "learning_rate": 9.900793438320037e-05,
- "loss": 0.257,
- "step": 264
- },
- {
- "epoch": 1.06,
- "grad_norm": 0.39840856194496155,
- "learning_rate": 9.834660552336415e-05,
- "loss": 0.1256,
- "step": 265
- },
- {
- "epoch": 1.064,
- "grad_norm": 0.4913635849952698,
- "learning_rate": 9.768534898835862e-05,
- "loss": 0.15,
- "step": 266
- },
- {
- "epoch": 1.068,
- "grad_norm": 0.4526960253715515,
- "learning_rate": 9.702419370368676e-05,
- "loss": 0.1826,
- "step": 267
- },
- {
- "epoch": 1.072,
- "grad_norm": 0.4643208384513855,
- "learning_rate": 9.636316859042259e-05,
- "loss": 0.1776,
- "step": 268
- },
- {
- "epoch": 1.076,
- "grad_norm": 0.42755165696144104,
- "learning_rate": 9.570230256394596e-05,
- "loss": 0.1121,
- "step": 269
- },
- {
- "epoch": 1.08,
- "grad_norm": 0.3962644040584564,
- "learning_rate": 9.504162453267777e-05,
- "loss": 0.127,
- "step": 270
- },
- {
- "epoch": 1.084,
- "grad_norm": 0.31402885913848877,
- "learning_rate": 9.438116339681545e-05,
- "loss": 0.1078,
- "step": 271
- },
- {
- "epoch": 1.088,
- "grad_norm": 0.3931761384010315,
- "learning_rate": 9.372094804706867e-05,
- "loss": 0.1714,
- "step": 272
- },
- {
- "epoch": 1.092,
- "grad_norm": 0.3037507236003876,
- "learning_rate": 9.30610073633956e-05,
- "loss": 0.116,
- "step": 273
- },
- {
- "epoch": 1.096,
- "grad_norm": 0.2935754358768463,
- "learning_rate": 9.24013702137397e-05,
- "loss": 0.0831,
- "step": 274
- },
- {
- "epoch": 1.1,
- "grad_norm": 0.35577020049095154,
- "learning_rate": 9.174206545276677e-05,
- "loss": 0.1468,
- "step": 275
- },
- {
- "epoch": 1.104,
- "grad_norm": 0.7628803849220276,
- "learning_rate": 9.108312192060298e-05,
- "loss": 0.1636,
- "step": 276
- },
- {
- "epoch": 1.108,
- "grad_norm": 0.33394256234169006,
- "learning_rate": 9.042456844157299e-05,
- "loss": 0.1084,
- "step": 277
- },
- {
- "epoch": 1.112,
- "grad_norm": 0.30702200531959534,
- "learning_rate": 8.97664338229395e-05,
- "loss": 0.1174,
- "step": 278
- },
- {
- "epoch": 1.116,
- "grad_norm": 0.5347734093666077,
- "learning_rate": 8.910874685364275e-05,
- "loss": 0.1334,
- "step": 279
- },
- {
- "epoch": 1.12,
- "grad_norm": 0.3750351667404175,
- "learning_rate": 8.845153630304139e-05,
- "loss": 0.1125,
- "step": 280
- },
- {
- "epoch": 1.124,
- "grad_norm": 0.50673508644104,
- "learning_rate": 8.7794830919654e-05,
- "loss": 0.1851,
- "step": 281
- },
- {
- "epoch": 1.1280000000000001,
- "grad_norm": 0.4266781210899353,
- "learning_rate": 8.713865942990141e-05,
- "loss": 0.1175,
- "step": 282
- },
- {
- "epoch": 1.1320000000000001,
- "grad_norm": 0.34388792514801025,
- "learning_rate": 8.648305053685034e-05,
- "loss": 0.086,
- "step": 283
- },
- {
- "epoch": 1.1360000000000001,
- "grad_norm": 0.45312806963920593,
- "learning_rate": 8.582803291895758e-05,
- "loss": 0.2005,
- "step": 284
- },
- {
- "epoch": 1.1400000000000001,
- "grad_norm": 0.4231245517730713,
- "learning_rate": 8.517363522881579e-05,
- "loss": 0.1609,
- "step": 285
- },
- {
- "epoch": 1.144,
- "grad_norm": 0.4056156873703003,
- "learning_rate": 8.451988609189987e-05,
- "loss": 0.1026,
- "step": 286
- },
- {
- "epoch": 1.148,
- "grad_norm": 0.4729995131492615,
- "learning_rate": 8.386681410531491e-05,
- "loss": 0.12,
- "step": 287
- },
- {
- "epoch": 1.152,
- "grad_norm": 0.3991888761520386,
- "learning_rate": 8.321444783654524e-05,
- "loss": 0.1323,
- "step": 288
- },
- {
- "epoch": 1.156,
- "grad_norm": 0.44730663299560547,
- "learning_rate": 8.256281582220485e-05,
- "loss": 0.1452,
- "step": 289
- },
- {
- "epoch": 1.16,
- "grad_norm": 0.37809377908706665,
- "learning_rate": 8.191194656678904e-05,
- "loss": 0.1033,
- "step": 290
- },
- {
- "epoch": 1.164,
- "grad_norm": 0.44346722960472107,
- "learning_rate": 8.126186854142752e-05,
- "loss": 0.169,
- "step": 291
- },
- {
- "epoch": 1.168,
- "grad_norm": 0.48387813568115234,
- "learning_rate": 8.061261018263919e-05,
- "loss": 0.1055,
- "step": 292
- },
- {
- "epoch": 1.172,
- "grad_norm": 0.49735358357429504,
- "learning_rate": 7.996419989108789e-05,
- "loss": 0.1325,
- "step": 293
- },
- {
- "epoch": 1.176,
- "grad_norm": 0.585444450378418,
- "learning_rate": 7.931666603034033e-05,
- "loss": 0.1194,
- "step": 294
- },
- {
- "epoch": 1.18,
- "grad_norm": 0.43533700704574585,
- "learning_rate": 7.867003692562534e-05,
- "loss": 0.1043,
- "step": 295
- },
- {
- "epoch": 1.184,
- "grad_norm": 0.6790356040000916,
- "learning_rate": 7.80243408625947e-05,
- "loss": 0.105,
- "step": 296
- },
- {
- "epoch": 1.188,
- "grad_norm": 0.44777408242225647,
- "learning_rate": 7.7379606086086e-05,
- "loss": 0.1563,
- "step": 297
- },
- {
- "epoch": 1.192,
- "grad_norm": 0.2888951003551483,
- "learning_rate": 7.673586079888698e-05,
- "loss": 0.0877,
- "step": 298
- },
- {
- "epoch": 1.196,
- "grad_norm": 0.47348588705062866,
- "learning_rate": 7.6093133160502e-05,
- "loss": 0.1123,
- "step": 299
- },
- {
- "epoch": 1.2,
- "grad_norm": 0.4997583031654358,
- "learning_rate": 7.54514512859201e-05,
- "loss": 0.1525,
- "step": 300
- },
- {
- "epoch": 1.204,
- "grad_norm": 0.38823431730270386,
- "learning_rate": 7.48108432443852e-05,
- "loss": 0.1763,
- "step": 301
- },
- {
- "epoch": 1.208,
- "grad_norm": 0.3676242530345917,
- "learning_rate": 7.417133705816837e-05,
- "loss": 0.1335,
- "step": 302
- },
- {
- "epoch": 1.212,
- "grad_norm": 0.49667099118232727,
- "learning_rate": 7.353296070134186e-05,
- "loss": 0.2112,
- "step": 303
- },
- {
- "epoch": 1.216,
- "grad_norm": 0.3953639566898346,
- "learning_rate": 7.289574209855559e-05,
- "loss": 0.1548,
- "step": 304
- },
- {
- "epoch": 1.22,
- "grad_norm": 0.30921855568885803,
- "learning_rate": 7.225970912381556e-05,
- "loss": 0.1059,
- "step": 305
- },
- {
- "epoch": 1.224,
- "grad_norm": 0.41073694825172424,
- "learning_rate": 7.16248895992645e-05,
- "loss": 0.1507,
- "step": 306
- },
- {
- "epoch": 1.228,
- "grad_norm": 0.4784708321094513,
- "learning_rate": 7.099131129396501e-05,
- "loss": 0.1406,
- "step": 307
- },
- {
- "epoch": 1.232,
- "grad_norm": 0.45144668221473694,
- "learning_rate": 7.035900192268464e-05,
- "loss": 0.1963,
- "step": 308
- },
- {
- "epoch": 1.236,
- "grad_norm": 0.5429076552391052,
- "learning_rate": 6.972798914468369e-05,
- "loss": 0.1192,
- "step": 309
- },
- {
- "epoch": 1.24,
- "grad_norm": 0.45704060792922974,
- "learning_rate": 6.909830056250527e-05,
- "loss": 0.1136,
- "step": 310
- },
- {
- "epoch": 1.244,
- "grad_norm": 0.6119317412376404,
- "learning_rate": 6.846996372076786e-05,
- "loss": 0.2982,
- "step": 311
- },
- {
- "epoch": 1.248,
- "grad_norm": 0.29200586676597595,
- "learning_rate": 6.784300610496048e-05,
- "loss": 0.1593,
- "step": 312
- },
- {
- "epoch": 1.252,
- "grad_norm": 0.4559411406517029,
- "learning_rate": 6.721745514024022e-05,
- "loss": 0.1875,
- "step": 313
- },
- {
- "epoch": 1.256,
- "grad_norm": 0.48311662673950195,
- "learning_rate": 6.65933381902329e-05,
- "loss": 0.1629,
- "step": 314
- },
- {
- "epoch": 1.26,
- "grad_norm": 0.7097484469413757,
- "learning_rate": 6.59706825558357e-05,
- "loss": 0.2506,
- "step": 315
- },
- {
- "epoch": 1.264,
- "grad_norm": 0.333810955286026,
- "learning_rate": 6.534951547402322e-05,
- "loss": 0.155,
- "step": 316
- },
- {
- "epoch": 1.268,
- "grad_norm": 0.42433595657348633,
- "learning_rate": 6.47298641166559e-05,
- "loss": 0.1285,
- "step": 317
- },
- {
- "epoch": 1.272,
- "grad_norm": 0.5455209016799927,
- "learning_rate": 6.411175558929152e-05,
- "loss": 0.1528,
- "step": 318
- },
- {
- "epoch": 1.276,
- "grad_norm": 0.6677869558334351,
- "learning_rate": 6.349521692999945e-05,
- "loss": 0.163,
- "step": 319
- },
- {
- "epoch": 1.28,
- "grad_norm": 0.7256897687911987,
- "learning_rate": 6.28802751081779e-05,
- "loss": 0.1464,
- "step": 320
- },
- {
- "epoch": 1.284,
- "grad_norm": 0.4139724373817444,
- "learning_rate": 6.226695702337442e-05,
- "loss": 0.1309,
- "step": 321
- },
- {
- "epoch": 1.288,
- "grad_norm": 0.28360605239868164,
- "learning_rate": 6.165528950410884e-05,
- "loss": 0.1043,
- "step": 322
- },
- {
- "epoch": 1.292,
- "grad_norm": 0.25463399291038513,
- "learning_rate": 6.10452993067e-05,
- "loss": 0.1072,
- "step": 323
- },
- {
- "epoch": 1.296,
- "grad_norm": 0.38707828521728516,
- "learning_rate": 6.0437013114095195e-05,
- "loss": 0.1612,
- "step": 324
- },
- {
- "epoch": 1.3,
- "grad_norm": 0.32273241877555847,
- "learning_rate": 5.983045753470308e-05,
- "loss": 0.1164,
- "step": 325
- },
- {
- "epoch": 1.304,
- "grad_norm": 0.33470627665519714,
- "learning_rate": 5.922565910122967e-05,
- "loss": 0.0981,
- "step": 326
- },
- {
- "epoch": 1.308,
- "grad_norm": 0.4948393404483795,
- "learning_rate": 5.862264426951768e-05,
- "loss": 0.174,
- "step": 327
- },
- {
- "epoch": 1.312,
- "grad_norm": 0.3687496781349182,
- "learning_rate": 5.8021439417389444e-05,
- "loss": 0.12,
- "step": 328
- },
- {
- "epoch": 1.316,
- "grad_norm": 0.4321393668651581,
- "learning_rate": 5.7422070843492734e-05,
- "loss": 0.1437,
- "step": 329
- },
- {
- "epoch": 1.32,
- "grad_norm": 0.34234917163848877,
- "learning_rate": 5.6824564766150726e-05,
- "loss": 0.1428,
- "step": 330
- },
- {
- "epoch": 1.324,
- "grad_norm": 0.6654959321022034,
- "learning_rate": 5.622894732221482e-05,
- "loss": 0.101,
- "step": 331
- },
- {
- "epoch": 1.328,
- "grad_norm": 0.5414759516716003,
- "learning_rate": 5.563524456592163e-05,
- "loss": 0.1286,
- "step": 332
- },
- {
- "epoch": 1.332,
- "grad_norm": 0.5127768516540527,
- "learning_rate": 5.504348246775299e-05,
- "loss": 0.1485,
- "step": 333
- },
- {
- "epoch": 1.336,
- "grad_norm": 0.41076546907424927,
- "learning_rate": 5.4453686913300074e-05,
- "loss": 0.1587,
- "step": 334
- },
- {
- "epoch": 1.34,
- "grad_norm": 0.3729254901409149,
- "learning_rate": 5.386588370213124e-05,
- "loss": 0.0936,
- "step": 335
- },
- {
- "epoch": 1.3439999999999999,
- "grad_norm": 0.4239449203014374,
- "learning_rate": 5.328009854666303e-05,
- "loss": 0.0969,
- "step": 336
- },
- {
- "epoch": 1.3479999999999999,
- "grad_norm": 0.36368995904922485,
- "learning_rate": 5.269635707103593e-05,
- "loss": 0.1938,
- "step": 337
- },
- {
- "epoch": 1.3519999999999999,
- "grad_norm": 0.43767890334129333,
- "learning_rate": 5.2114684809993044e-05,
- "loss": 0.1299,
- "step": 338
- },
- {
- "epoch": 1.3559999999999999,
- "grad_norm": 0.3987194299697876,
- "learning_rate": 5.1535107207763534e-05,
- "loss": 0.1095,
- "step": 339
- },
- {
- "epoch": 1.3599999999999999,
- "grad_norm": 0.5002091526985168,
- "learning_rate": 5.095764961694922e-05,
- "loss": 0.184,
- "step": 340
- },
- {
- "epoch": 1.3639999999999999,
- "grad_norm": 0.37880387902259827,
- "learning_rate": 5.0382337297415773e-05,
- "loss": 0.1458,
- "step": 341
- },
- {
- "epoch": 1.3679999999999999,
- "grad_norm": 0.35637861490249634,
- "learning_rate": 4.980919541518796e-05,
- "loss": 0.1333,
- "step": 342
- },
- {
- "epoch": 1.3719999999999999,
- "grad_norm": 0.31971898674964905,
- "learning_rate": 4.923824904134829e-05,
- "loss": 0.0977,
- "step": 343
- },
- {
- "epoch": 1.376,
- "grad_norm": 0.5161877870559692,
- "learning_rate": 4.866952315094088e-05,
- "loss": 0.1384,
- "step": 344
- },
- {
- "epoch": 1.38,
- "grad_norm": 0.4512530267238617,
- "learning_rate": 4.810304262187852e-05,
- "loss": 0.1191,
- "step": 345
- },
- {
- "epoch": 1.384,
- "grad_norm": 0.2885878384113312,
- "learning_rate": 4.753883223385467e-05,
- "loss": 0.0874,
- "step": 346
- },
- {
- "epoch": 1.388,
- "grad_norm": 0.3796810805797577,
- "learning_rate": 4.697691666725955e-05,
- "loss": 0.1044,
- "step": 347
- },
- {
- "epoch": 1.392,
- "grad_norm": 0.4405852258205414,
- "learning_rate": 4.6417320502100316e-05,
- "loss": 0.1161,
- "step": 348
- },
- {
- "epoch": 1.396,
- "grad_norm": 0.3295149803161621,
- "learning_rate": 4.58600682169262e-05,
- "loss": 0.102,
- "step": 349
- },
- {
- "epoch": 1.4,
- "grad_norm": 0.38261374831199646,
- "learning_rate": 4.530518418775733e-05,
- "loss": 0.1315,
- "step": 350
- },
- {
- "epoch": 1.404,
- "grad_norm": 0.5392604470252991,
- "learning_rate": 4.475269268701868e-05,
- "loss": 0.3177,
- "step": 351
- },
- {
- "epoch": 1.408,
- "grad_norm": 0.5153935551643372,
- "learning_rate": 4.4202617882478405e-05,
- "loss": 0.1475,
- "step": 352
- },
- {
- "epoch": 1.412,
- "grad_norm": 0.5613036155700684,
- "learning_rate": 4.365498383619036e-05,
- "loss": 0.2047,
- "step": 353
- },
- {
- "epoch": 1.416,
- "grad_norm": 0.42411884665489197,
- "learning_rate": 4.310981450344189e-05,
- "loss": 0.1342,
- "step": 354
- },
- {
- "epoch": 1.42,
- "grad_norm": 0.3150688409805298,
- "learning_rate": 4.256713373170564e-05,
- "loss": 0.1505,
- "step": 355
- },
- {
- "epoch": 1.424,
- "grad_norm": 0.3344944715499878,
- "learning_rate": 4.2026965259596666e-05,
- "loss": 0.1368,
- "step": 356
- },
- {
- "epoch": 1.428,
- "grad_norm": 0.3764253258705139,
- "learning_rate": 4.148933271583385e-05,
- "loss": 0.1208,
- "step": 357
- },
- {
- "epoch": 1.432,
- "grad_norm": 0.4218865931034088,
- "learning_rate": 4.0954259618206295e-05,
- "loss": 0.1534,
- "step": 358
- },
- {
- "epoch": 1.436,
- "grad_norm": 0.35453012585639954,
- "learning_rate": 4.0421769372544736e-05,
- "loss": 0.128,
- "step": 359
- },
- {
- "epoch": 1.44,
- "grad_norm": 0.5020301938056946,
- "learning_rate": 3.9891885271697496e-05,
- "loss": 0.1302,
- "step": 360
- },
- {
- "epoch": 1.444,
- "grad_norm": 0.34255918860435486,
- "learning_rate": 3.936463049451179e-05,
- "loss": 0.0989,
- "step": 361
- },
- {
- "epoch": 1.448,
- "grad_norm": 0.545989990234375,
- "learning_rate": 3.884002810481958e-05,
- "loss": 0.2236,
- "step": 362
- },
- {
- "epoch": 1.452,
- "grad_norm": 0.3559330999851227,
- "learning_rate": 3.8318101050428904e-05,
- "loss": 0.1779,
- "step": 363
- },
- {
- "epoch": 1.456,
- "grad_norm": 0.45797303318977356,
- "learning_rate": 3.779887216211995e-05,
- "loss": 0.1647,
- "step": 364
- },
- {
- "epoch": 1.46,
- "grad_norm": 0.5930070877075195,
- "learning_rate": 3.7282364152646297e-05,
- "loss": 0.1644,
- "step": 365
- },
- {
- "epoch": 1.464,
- "grad_norm": 0.3515298664569855,
- "learning_rate": 3.676859961574162e-05,
- "loss": 0.1432,
- "step": 366
- },
- {
- "epoch": 1.468,
- "grad_norm": 0.5396482348442078,
- "learning_rate": 3.6257601025131026e-05,
- "loss": 0.1687,
- "step": 367
- },
- {
- "epoch": 1.472,
- "grad_norm": 0.40738555788993835,
- "learning_rate": 3.574939073354838e-05,
- "loss": 0.1318,
- "step": 368
- },
- {
- "epoch": 1.476,
- "grad_norm": 0.4788297116756439,
- "learning_rate": 3.5243990971758125e-05,
- "loss": 0.1273,
- "step": 369
- },
- {
- "epoch": 1.48,
- "grad_norm": 0.4082763195037842,
- "learning_rate": 3.4741423847583134e-05,
- "loss": 0.1614,
- "step": 370
- },
- {
- "epoch": 1.484,
- "grad_norm": 0.46562305092811584,
- "learning_rate": 3.424171134493756e-05,
- "loss": 0.1349,
- "step": 371
- },
- {
- "epoch": 1.488,
- "grad_norm": 0.47657540440559387,
- "learning_rate": 3.3744875322865034e-05,
- "loss": 0.1437,
- "step": 372
- },
- {
- "epoch": 1.492,
- "grad_norm": 0.31126895546913147,
- "learning_rate": 3.325093751458276e-05,
- "loss": 0.0907,
- "step": 373
- },
- {
- "epoch": 1.496,
- "grad_norm": 0.3878353536128998,
- "learning_rate": 3.275991952653054e-05,
- "loss": 0.1437,
- "step": 374
- },
- {
- "epoch": 1.5,
- "grad_norm": 0.3266257643699646,
- "learning_rate": 3.227184283742591e-05,
- "loss": 0.0822,
- "step": 375
- },
- {
- "epoch": 1.504,
- "grad_norm": 0.4265097975730896,
- "learning_rate": 3.178672879732435e-05,
- "loss": 0.1449,
- "step": 376
- },
- {
- "epoch": 1.508,
- "grad_norm": 0.3834059536457062,
- "learning_rate": 3.1304598626685545e-05,
- "loss": 0.1253,
- "step": 377
- },
- {
- "epoch": 1.512,
- "grad_norm": 0.28038114309310913,
- "learning_rate": 3.0825473415445074e-05,
- "loss": 0.0913,
- "step": 378
- },
- {
- "epoch": 1.516,
- "grad_norm": 0.3404386341571808,
- "learning_rate": 3.034937412209178e-05,
- "loss": 0.1449,
- "step": 379
- },
- {
- "epoch": 1.52,
- "grad_norm": 0.3864313066005707,
- "learning_rate": 2.9876321572751144e-05,
- "loss": 0.1743,
- "step": 380
- },
- {
- "epoch": 1.524,
- "grad_norm": 0.48347947001457214,
- "learning_rate": 2.940633646027414e-05,
- "loss": 0.1275,
- "step": 381
- },
- {
- "epoch": 1.528,
- "grad_norm": 0.4755510687828064,
- "learning_rate": 2.8939439343332086e-05,
- "loss": 0.1151,
- "step": 382
- },
- {
- "epoch": 1.532,
- "grad_norm": 0.33736056089401245,
- "learning_rate": 2.8475650645517472e-05,
- "loss": 0.1085,
- "step": 383
- },
- {
- "epoch": 1.536,
- "grad_norm": 0.46070510149002075,
- "learning_rate": 2.8014990654450325e-05,
- "loss": 0.1103,
- "step": 384
- },
- {
- "epoch": 1.54,
- "grad_norm": 0.3870835602283478,
- "learning_rate": 2.7557479520891104e-05,
- "loss": 0.1214,
- "step": 385
- },
- {
- "epoch": 1.544,
- "grad_norm": 0.42631757259368896,
- "learning_rate": 2.7103137257858868e-05,
- "loss": 0.0753,
- "step": 386
- },
- {
- "epoch": 1.548,
- "grad_norm": 0.48006054759025574,
- "learning_rate": 2.6651983739756026e-05,
- "loss": 0.0988,
- "step": 387
- },
- {
- "epoch": 1.552,
- "grad_norm": 0.2611723840236664,
- "learning_rate": 2.6204038701499056e-05,
- "loss": 0.0659,
- "step": 388
- },
- {
- "epoch": 1.556,
- "grad_norm": 0.32546359300613403,
- "learning_rate": 2.5759321737655017e-05,
- "loss": 0.1045,
- "step": 389
- },
- {
- "epoch": 1.56,
- "grad_norm": 0.3296236991882324,
- "learning_rate": 2.5317852301584643e-05,
- "loss": 0.0994,
- "step": 390
- },
- {
- "epoch": 1.564,
- "grad_norm": 0.3907070457935333,
- "learning_rate": 2.487964970459118e-05,
- "loss": 0.083,
- "step": 391
- },
- {
- "epoch": 1.568,
- "grad_norm": 0.4443913400173187,
- "learning_rate": 2.4444733115075823e-05,
- "loss": 0.0778,
- "step": 392
- },
- {
- "epoch": 1.572,
- "grad_norm": 0.5356037616729736,
- "learning_rate": 2.4013121557699157e-05,
- "loss": 0.1191,
- "step": 393
- },
- {
- "epoch": 1.576,
- "grad_norm": 0.48386839032173157,
- "learning_rate": 2.3584833912548888e-05,
- "loss": 0.1227,
- "step": 394
- },
- {
- "epoch": 1.58,
- "grad_norm": 0.31592532992362976,
- "learning_rate": 2.315988891431412e-05,
- "loss": 0.0986,
- "step": 395
- },
- {
- "epoch": 1.584,
- "grad_norm": 0.417259156703949,
- "learning_rate": 2.2738305151465645e-05,
- "loss": 0.1178,
- "step": 396
- },
- {
- "epoch": 1.588,
- "grad_norm": 0.3802774250507355,
- "learning_rate": 2.2320101065443056e-05,
- "loss": 0.0846,
- "step": 397
- },
- {
- "epoch": 1.592,
- "grad_norm": 0.36296606063842773,
- "learning_rate": 2.190529494984782e-05,
- "loss": 0.1005,
- "step": 398
- },
- {
- "epoch": 1.596,
- "grad_norm": 0.4494307041168213,
- "learning_rate": 2.149390494964323e-05,
- "loss": 0.1336,
- "step": 399
- },
- {
- "epoch": 1.6,
- "grad_norm": 0.41138237714767456,
- "learning_rate": 2.1085949060360654e-05,
- "loss": 0.101,
- "step": 400
- },
- {
- "epoch": 1.604,
- "grad_norm": 0.3313177227973938,
- "learning_rate": 2.0681445127312214e-05,
- "loss": 0.1191,
- "step": 401
- },
- {
- "epoch": 1.608,
- "grad_norm": 0.4874570369720459,
- "learning_rate": 2.0280410844810428e-05,
- "loss": 0.1677,
- "step": 402
- },
- {
- "epoch": 1.612,
- "grad_norm": 0.756597638130188,
- "learning_rate": 1.988286375539391e-05,
- "loss": 0.2832,
- "step": 403
- },
- {
- "epoch": 1.616,
- "grad_norm": 0.428316205739975,
- "learning_rate": 1.9488821249060297e-05,
- "loss": 0.1366,
- "step": 404
- },
- {
- "epoch": 1.62,
- "grad_norm": 0.3996226191520691,
- "learning_rate": 1.9098300562505266e-05,
- "loss": 0.1348,
- "step": 405
- },
- {
- "epoch": 1.624,
- "grad_norm": 0.45633479952812195,
- "learning_rate": 1.871131877836879e-05,
- "loss": 0.1589,
- "step": 406
- },
- {
- "epoch": 1.6280000000000001,
- "grad_norm": 0.35015976428985596,
- "learning_rate": 1.8327892824487792e-05,
- "loss": 0.0803,
- "step": 407
- },
- {
- "epoch": 1.6320000000000001,
- "grad_norm": 0.39677464962005615,
- "learning_rate": 1.7948039473155554e-05,
- "loss": 0.1358,
- "step": 408
- },
- {
- "epoch": 1.6360000000000001,
- "grad_norm": 0.5173831582069397,
- "learning_rate": 1.7571775340388276e-05,
- "loss": 0.2196,
- "step": 409
- },
- {
- "epoch": 1.6400000000000001,
- "grad_norm": 0.48408329486846924,
- "learning_rate": 1.7199116885197995e-05,
- "loss": 0.1526,
- "step": 410
- },
- {
- "epoch": 1.6440000000000001,
- "grad_norm": 0.4577184021472931,
- "learning_rate": 1.683008040887285e-05,
- "loss": 0.1362,
- "step": 411
- },
- {
- "epoch": 1.6480000000000001,
- "grad_norm": 0.35686102509498596,
- "learning_rate": 1.646468205426377e-05,
- "loss": 0.1556,
- "step": 412
- },
- {
- "epoch": 1.6520000000000001,
- "grad_norm": 0.8025571703910828,
- "learning_rate": 1.6102937805078544e-05,
- "loss": 0.1942,
- "step": 413
- },
- {
- "epoch": 1.6560000000000001,
- "grad_norm": 0.46714654564857483,
- "learning_rate": 1.5744863485182537e-05,
- "loss": 0.1719,
- "step": 414
- },
- {
- "epoch": 1.6600000000000001,
- "grad_norm": 0.5622069239616394,
- "learning_rate": 1.5390474757906446e-05,
- "loss": 0.1739,
- "step": 415
- },
- {
- "epoch": 1.6640000000000001,
- "grad_norm": 0.34492602944374084,
- "learning_rate": 1.5039787125361326e-05,
- "loss": 0.1297,
- "step": 416
- },
- {
- "epoch": 1.6680000000000001,
- "grad_norm": 0.5172492265701294,
- "learning_rate": 1.4692815927760273e-05,
- "loss": 0.126,
- "step": 417
- },
- {
- "epoch": 1.6720000000000002,
- "grad_norm": 0.359210342168808,
- "learning_rate": 1.4349576342747462e-05,
- "loss": 0.0991,
- "step": 418
- },
- {
- "epoch": 1.6760000000000002,
- "grad_norm": 0.3213334381580353,
- "learning_rate": 1.4010083384734308e-05,
- "loss": 0.0872,
- "step": 419
- },
- {
- "epoch": 1.6800000000000002,
- "grad_norm": 0.4676114320755005,
- "learning_rate": 1.3674351904242611e-05,
- "loss": 0.172,
- "step": 420
- },
- {
- "epoch": 1.6840000000000002,
- "grad_norm": 0.43466052412986755,
- "learning_rate": 1.3342396587254958e-05,
- "loss": 0.2017,
- "step": 421
- },
- {
- "epoch": 1.688,
- "grad_norm": 0.6130461692810059,
- "learning_rate": 1.3014231954572287e-05,
- "loss": 0.1756,
- "step": 422
- },
- {
- "epoch": 1.692,
- "grad_norm": 0.4119471311569214,
- "learning_rate": 1.2689872361178701e-05,
- "loss": 0.1323,
- "step": 423
- },
- {
- "epoch": 1.696,
- "grad_norm": 0.3372259736061096,
- "learning_rate": 1.2369331995613665e-05,
- "loss": 0.1182,
- "step": 424
- },
- {
- "epoch": 1.7,
- "grad_norm": 0.301332950592041,
- "learning_rate": 1.2052624879351104e-05,
- "loss": 0.0827,
- "step": 425
- },
- {
- "epoch": 1.704,
- "grad_norm": 0.37877362966537476,
- "learning_rate": 1.173976486618631e-05,
- "loss": 0.1456,
- "step": 426
- },
- {
- "epoch": 1.708,
- "grad_norm": 0.3089151084423065,
- "learning_rate": 1.143076564162977e-05,
- "loss": 0.0929,
- "step": 427
- },
- {
- "epoch": 1.712,
- "grad_norm": 0.3376375138759613,
- "learning_rate": 1.1125640722308628e-05,
- "loss": 0.1195,
- "step": 428
- },
- {
- "epoch": 1.716,
- "grad_norm": 0.4322645664215088,
- "learning_rate": 1.0824403455375288e-05,
- "loss": 0.1728,
- "step": 429
- },
- {
- "epoch": 1.72,
- "grad_norm": 0.3470593988895416,
- "learning_rate": 1.0527067017923654e-05,
- "loss": 0.0866,
- "step": 430
- },
- {
- "epoch": 1.724,
- "grad_norm": 0.42513084411621094,
- "learning_rate": 1.0233644416412791e-05,
- "loss": 0.1535,
- "step": 431
- },
- {
- "epoch": 1.728,
- "grad_norm": 0.4393742084503174,
- "learning_rate": 9.944148486097793e-06,
- "loss": 0.1625,
- "step": 432
- },
- {
- "epoch": 1.732,
- "grad_norm": 0.4160197675228119,
- "learning_rate": 9.658591890468515e-06,
- "loss": 0.1082,
- "step": 433
- },
- {
- "epoch": 1.736,
- "grad_norm": 0.47929707169532776,
- "learning_rate": 9.376987120695545e-06,
- "loss": 0.1419,
- "step": 434
- },
- {
- "epoch": 1.74,
- "grad_norm": 0.3390033543109894,
- "learning_rate": 9.09934649508375e-06,
- "loss": 0.0783,
- "step": 435
- },
- {
- "epoch": 1.744,
- "grad_norm": 0.3353302478790283,
- "learning_rate": 8.825682158533554e-06,
- "loss": 0.0993,
- "step": 436
- },
- {
- "epoch": 1.748,
- "grad_norm": 0.5262099504470825,
- "learning_rate": 8.55600608200956e-06,
- "loss": 0.1207,
- "step": 437
- },
- {
- "epoch": 1.752,
- "grad_norm": 0.46003034710884094,
- "learning_rate": 8.290330062017016e-06,
- "loss": 0.1326,
- "step": 438
- },
- {
- "epoch": 1.756,
- "grad_norm": 0.38699957728385925,
- "learning_rate": 8.02866572008566e-06,
- "loss": 0.1148,
- "step": 439
- },
- {
- "epoch": 1.76,
- "grad_norm": 0.42611733078956604,
- "learning_rate": 7.771024502261526e-06,
- "loss": 0.0938,
- "step": 440
- },
- {
- "epoch": 1.764,
- "grad_norm": 0.3002888560295105,
- "learning_rate": 7.51741767860612e-06,
- "loss": 0.0959,
- "step": 441
- },
- {
- "epoch": 1.768,
- "grad_norm": 0.49262702465057373,
- "learning_rate": 7.267856342703461e-06,
- "loss": 0.0973,
- "step": 442
- },
- {
- "epoch": 1.772,
- "grad_norm": 0.2750943899154663,
- "learning_rate": 7.022351411174866e-06,
- "loss": 0.0891,
- "step": 443
- },
- {
- "epoch": 1.776,
- "grad_norm": 0.3401097059249878,
- "learning_rate": 6.780913623201346e-06,
- "loss": 0.1159,
- "step": 444
- },
- {
- "epoch": 1.78,
- "grad_norm": 0.4844938814640045,
- "learning_rate": 6.543553540053926e-06,
- "loss": 0.1143,
- "step": 445
- },
- {
- "epoch": 1.784,
- "grad_norm": 0.3894381523132324,
- "learning_rate": 6.310281544631546e-06,
- "loss": 0.1017,
- "step": 446
- },
- {
- "epoch": 1.788,
- "grad_norm": 0.294817179441452,
- "learning_rate": 6.081107841007006e-06,
- "loss": 0.0887,
- "step": 447
- },
- {
- "epoch": 1.792,
- "grad_norm": 0.2947250008583069,
- "learning_rate": 5.856042453980526e-06,
- "loss": 0.0937,
- "step": 448
- },
- {
- "epoch": 1.796,
- "grad_norm": 0.5969186425209045,
- "learning_rate": 5.63509522864123e-06,
- "loss": 0.1236,
- "step": 449
- },
- {
- "epoch": 1.8,
- "grad_norm": 0.2769240140914917,
- "learning_rate": 5.418275829936537e-06,
- "loss": 0.0848,
- "step": 450
- },
- {
- "epoch": 1.804,
- "grad_norm": 0.4774380326271057,
- "learning_rate": 5.205593742249326e-06,
3165
- "loss": 0.3201,
3166
- "step": 451
3167
- },
3168
- {
3169
- "epoch": 1.808,
3170
- "grad_norm": 0.45032572746276855,
3171
- "learning_rate": 4.997058268983135e-06,
3172
- "loss": 0.151,
3173
- "step": 452
3174
- },
3175
- {
3176
- "epoch": 1.812,
3177
- "grad_norm": 0.5476038455963135,
3178
- "learning_rate": 4.792678532155115e-06,
3179
- "loss": 0.1886,
3180
- "step": 453
3181
- },
3182
- {
3183
- "epoch": 1.8159999999999998,
3184
- "grad_norm": 0.4461387097835541,
3185
- "learning_rate": 4.592463471997022e-06,
3186
- "loss": 0.1338,
3187
- "step": 454
3188
- },
3189
- {
3190
- "epoch": 1.8199999999999998,
3191
- "grad_norm": 0.38340190052986145,
3192
- "learning_rate": 4.3964218465642355e-06,
3193
- "loss": 0.183,
3194
- "step": 455
3195
- },
3196
- {
3197
- "epoch": 1.8239999999999998,
3198
- "grad_norm": 0.8330069184303284,
3199
- "learning_rate": 4.204562231352516e-06,
3200
- "loss": 0.2425,
3201
- "step": 456
3202
- },
3203
- {
3204
- "epoch": 1.8279999999999998,
3205
- "grad_norm": 0.28832969069480896,
3206
- "learning_rate": 4.016893018922996e-06,
3207
- "loss": 0.0919,
3208
- "step": 457
3209
- },
3210
- {
3211
- "epoch": 1.8319999999999999,
3212
- "grad_norm": 0.43037348985671997,
3213
- "learning_rate": 3.83342241853496e-06,
3214
- "loss": 0.1836,
3215
- "step": 458
3216
- },
3217
- {
3218
- "epoch": 1.8359999999999999,
3219
- "grad_norm": 0.4096948802471161,
3220
- "learning_rate": 3.6541584557868604e-06,
3221
- "loss": 0.1419,
3222
- "step": 459
3223
- },
3224
- {
3225
- "epoch": 1.8399999999999999,
3226
- "grad_norm": 0.2975824177265167,
3227
- "learning_rate": 3.4791089722651436e-06,
3228
- "loss": 0.1334,
3229
- "step": 460
3230
- },
3231
- {
3232
- "epoch": 1.8439999999999999,
3233
- "grad_norm": 0.29744744300842285,
3234
- "learning_rate": 3.3082816252012926e-06,
3235
- "loss": 0.1077,
3236
- "step": 461
3237
- },
3238
- {
3239
- "epoch": 1.8479999999999999,
3240
- "grad_norm": 0.46424493193626404,
3241
- "learning_rate": 3.1416838871368924e-06,
3242
- "loss": 0.1574,
3243
- "step": 462
3244
- },
3245
- {
3246
- "epoch": 1.8519999999999999,
3247
- "grad_norm": 0.3721751570701599,
3248
- "learning_rate": 2.9793230455966937e-06,
3249
- "loss": 0.1318,
3250
- "step": 463
3251
- },
3252
- {
3253
- "epoch": 1.8559999999999999,
3254
- "grad_norm": 0.43020930886268616,
3255
- "learning_rate": 2.821206202769899e-06,
3256
- "loss": 0.1169,
3257
- "step": 464
3258
- },
3259
- {
3260
- "epoch": 1.8599999999999999,
3261
- "grad_norm": 0.44107383489608765,
3262
- "learning_rate": 2.667340275199426e-06,
3263
- "loss": 0.2362,
3264
- "step": 465
3265
- },
3266
- {
3267
- "epoch": 1.8639999999999999,
3268
- "grad_norm": 0.55259770154953,
3269
- "learning_rate": 2.5177319934794e-06,
3270
- "loss": 0.2054,
3271
- "step": 466
3272
- },
3273
- {
3274
- "epoch": 1.8679999999999999,
3275
- "grad_norm": 0.35051223635673523,
3276
- "learning_rate": 2.3723879019607374e-06,
3277
- "loss": 0.0925,
3278
- "step": 467
3279
- },
3280
- {
3281
- "epoch": 1.8719999999999999,
3282
- "grad_norm": 0.33733323216438293,
3283
- "learning_rate": 2.2313143584648423e-06,
3284
- "loss": 0.1012,
3285
- "step": 468
3286
- },
3287
- {
3288
- "epoch": 1.876,
3289
- "grad_norm": 0.36968204379081726,
3290
- "learning_rate": 2.0945175340055357e-06,
3291
- "loss": 0.1222,
3292
- "step": 469
3293
- },
3294
- {
3295
- "epoch": 1.88,
3296
- "grad_norm": 0.541716456413269,
3297
- "learning_rate": 1.9620034125190644e-06,
3298
- "loss": 0.1721,
3299
- "step": 470
3300
- },
3301
- {
3302
- "epoch": 1.884,
3303
- "grad_norm": 0.5674886107444763,
3304
- "learning_rate": 1.8337777906023978e-06,
3305
- "loss": 0.1989,
3306
- "step": 471
3307
- },
3308
- {
3309
- "epoch": 1.888,
3310
- "grad_norm": 0.3306349813938141,
3311
- "learning_rate": 1.7098462772596302e-06,
3312
- "loss": 0.0797,
3313
- "step": 472
3314
- },
3315
- {
3316
- "epoch": 1.892,
3317
- "grad_norm": 0.5260130167007446,
3318
- "learning_rate": 1.5902142936566334e-06,
3319
- "loss": 0.1403,
3320
- "step": 473
3321
- },
3322
- {
3323
- "epoch": 1.896,
3324
- "grad_norm": 0.32538896799087524,
3325
- "learning_rate": 1.4748870728839347e-06,
3326
- "loss": 0.0952,
3327
- "step": 474
3328
- },
3329
- {
3330
- "epoch": 1.9,
3331
- "grad_norm": 0.5289676785469055,
3332
- "learning_rate": 1.3638696597277679e-06,
3333
- "loss": 0.1712,
3334
- "step": 475
3335
- },
3336
- {
3337
- "epoch": 1.904,
3338
- "grad_norm": 0.3857274353504181,
3339
- "learning_rate": 1.2571669104494256e-06,
3340
- "loss": 0.09,
3341
- "step": 476
3342
- },
3343
- {
3344
- "epoch": 1.908,
3345
- "grad_norm": 0.3752627968788147,
3346
- "learning_rate": 1.1547834925728528e-06,
3347
- "loss": 0.1206,
3348
- "step": 477
3349
- },
3350
- {
3351
- "epoch": 1.912,
3352
- "grad_norm": 0.3532065451145172,
3353
- "learning_rate": 1.0567238846803996e-06,
3354
- "loss": 0.1066,
3355
- "step": 478
3356
- },
3357
- {
3358
- "epoch": 1.916,
3359
- "grad_norm": 0.70602947473526,
3360
- "learning_rate": 9.62992376217009e-07,
3361
- "loss": 0.1734,
3362
- "step": 479
3363
- },
3364
- {
3365
- "epoch": 1.92,
3366
- "grad_norm": 0.4085240066051483,
3367
- "learning_rate": 8.735930673024806e-07,
3368
- "loss": 0.0987,
3369
- "step": 480
3370
- },
3371
- {
3372
- "epoch": 1.924,
3373
- "grad_norm": 0.44681504368782043,
3374
- "learning_rate": 7.885298685522235e-07,
3375
- "loss": 0.1204,
3376
- "step": 481
3377
- },
3378
- {
3379
- "epoch": 1.928,
3380
- "grad_norm": 0.4097270965576172,
3381
- "learning_rate": 7.078065009060941e-07,
3382
- "loss": 0.151,
3383
- "step": 482
3384
- },
3385
- {
3386
- "epoch": 1.932,
3387
- "grad_norm": 0.31006622314453125,
3388
- "learning_rate": 6.314264954657256e-07,
3389
- "loss": 0.096,
3390
- "step": 483
3391
- },
3392
- {
3393
- "epoch": 1.936,
3394
- "grad_norm": 0.3499917984008789,
3395
- "learning_rate": 5.593931933399854e-07,
3396
- "loss": 0.0817,
3397
- "step": 484
3398
- },
3399
- {
3400
- "epoch": 1.94,
3401
- "grad_norm": 0.41455402970314026,
3402
- "learning_rate": 4.917097454988584e-07,
3403
- "loss": 0.1434,
3404
- "step": 485
3405
- },
3406
- {
3407
- "epoch": 1.944,
3408
- "grad_norm": 0.3714580535888672,
3409
- "learning_rate": 4.2837911263562404e-07,
3410
- "loss": 0.1134,
3411
- "step": 486
3412
- },
3413
- {
3414
- "epoch": 1.948,
3415
- "grad_norm": 0.44166669249534607,
3416
- "learning_rate": 3.694040650373154e-07,
3417
- "loss": 0.0831,
3418
- "step": 487
3419
- },
3420
- {
3421
- "epoch": 1.952,
3422
- "grad_norm": 0.41021236777305603,
3423
- "learning_rate": 3.1478718246357173e-07,
3424
- "loss": 0.088,
3425
- "step": 488
3426
- },
3427
- {
3428
- "epoch": 1.956,
3429
- "grad_norm": 0.36902716755867004,
3430
- "learning_rate": 2.645308540337843e-07,
3431
- "loss": 0.1097,
3432
- "step": 489
3433
- },
3434
- {
3435
- "epoch": 1.96,
3436
- "grad_norm": 0.4132952094078064,
3437
- "learning_rate": 2.1863727812254653e-07,
3438
- "loss": 0.1133,
3439
- "step": 490
3440
- },
3441
- {
3442
- "epoch": 1.964,
3443
- "grad_norm": 0.3286338746547699,
3444
- "learning_rate": 1.7710846226355328e-07,
3445
- "loss": 0.0856,
3446
- "step": 491
3447
- },
3448
- {
3449
- "epoch": 1.968,
3450
- "grad_norm": 0.4310964345932007,
3451
- "learning_rate": 1.3994622306173765e-07,
3452
- "loss": 0.1155,
3453
- "step": 492
3454
- },
3455
- {
3456
- "epoch": 1.972,
3457
- "grad_norm": 0.3993656039237976,
3458
- "learning_rate": 1.0715218611384581e-07,
3459
- "loss": 0.1045,
3460
- "step": 493
3461
- },
3462
- {
3463
- "epoch": 1.976,
3464
- "grad_norm": 0.38035959005355835,
3465
- "learning_rate": 7.872778593728258e-08,
3466
- "loss": 0.0979,
3467
- "step": 494
3468
- },
3469
- {
3470
- "epoch": 1.98,
3471
- "grad_norm": 0.32513630390167236,
3472
- "learning_rate": 5.467426590739511e-08,
3473
- "loss": 0.0829,
3474
- "step": 495
3475
- },
3476
- {
3477
- "epoch": 1.984,
3478
- "grad_norm": 0.718855082988739,
3479
- "learning_rate": 3.499267820307184e-08,
3480
- "loss": 0.1378,
3481
- "step": 496
3482
- },
3483
- {
3484
- "epoch": 1.988,
3485
- "grad_norm": 0.3467668294906616,
3486
- "learning_rate": 1.9683883760723832e-08,
3487
- "loss": 0.0919,
3488
- "step": 497
3489
- },
3490
- {
3491
- "epoch": 1.992,
3492
- "grad_norm": 0.34150242805480957,
3493
- "learning_rate": 8.748552236603757e-09,
3494
- "loss": 0.0997,
3495
- "step": 498
3496
- },
3497
- {
3498
- "epoch": 1.996,
3499
- "grad_norm": 0.4084968566894531,
3500
- "learning_rate": 2.187161977540431e-09,
3501
- "loss": 0.105,
3502
- "step": 499
3503
- },
3504
- {
3505
- "epoch": 2.0,
3506
- "grad_norm": 0.47725334763526917,
3507
- "learning_rate": 0.0,
3508
- "loss": 0.1549,
3509
- "step": 500
3510
- }
3511
- ],
3512
- "logging_steps": 1,
3513
- "max_steps": 500,
3514
- "num_input_tokens_seen": 0,
3515
- "num_train_epochs": 2,
3516
- "save_steps": 500,
3517
- "stateful_callbacks": {
3518
- "TrainerControl": {
3519
- "args": {
3520
- "should_epoch_stop": false,
3521
- "should_evaluate": false,
3522
- "should_log": false,
3523
- "should_save": true,
3524
- "should_training_stop": true
3525
- },
3526
- "attributes": {}
3527
- }
3528
- },
3529
- "total_flos": 1.994087940177715e+16,
3530
- "train_batch_size": 2,
3531
- "trial_name": null,
3532
- "trial_params": null
3533
- }
 
Qwen2.5-Coder-7B-Instruct-math-solver-config_1/checkpoint-500/training_args.bin DELETED
@@ -1,3 +0,0 @@
- version https://git-lfs.github.com/spec/v1
- oid sha256:8fa1f4789ace6ab0638df1e6c46e84824bfc38e14f122f371f3235d65b751dfe
- size 5752
 
Qwen2.5-Coder-7B-Instruct-math-solver-config_1/checkpoint-500/vocab.json DELETED
The diff for this file is too large to render. See raw diff
 
Qwen2.5-Coder-7B-Instruct-math-solver-config_1/config.json DELETED
@@ -1,45 +0,0 @@
- {
- "_name_or_path": "Qwen/Qwen2.5-Coder-7B-Instruct",
- "architectures": [
- "Qwen2ForCausalLM"
- ],
- "attention_dropout": 0.0,
- "bos_token_id": 151643,
- "eos_token_id": 151645,
- "hidden_act": "silu",
- "hidden_size": 3584,
- "initializer_range": 0.02,
- "intermediate_size": 18944,
- "max_position_embeddings": 32768,
- "max_window_layers": 28,
- "model_type": "qwen2",
- "num_attention_heads": 28,
- "num_hidden_layers": 28,
- "num_key_value_heads": 4,
- "pad_token_id": 151645,
- "quantization_config": {
- "_load_in_4bit": true,
- "_load_in_8bit": false,
- "bnb_4bit_compute_dtype": "bfloat16",
- "bnb_4bit_quant_storage": "uint8",
- "bnb_4bit_quant_type": "nf4",
- "bnb_4bit_use_double_quant": true,
- "llm_int8_enable_fp32_cpu_offload": false,
- "llm_int8_has_fp16_weight": false,
- "llm_int8_skip_modules": null,
- "llm_int8_threshold": 6.0,
- "load_in_4bit": true,
- "load_in_8bit": false,
- "quant_method": "bitsandbytes"
- },
- "rms_norm_eps": 1e-06,
- "rope_scaling": null,
- "rope_theta": 1000000.0,
- "sliding_window": null,
- "tie_word_embeddings": false,
- "torch_dtype": "bfloat16",
- "transformers_version": "4.46.3",
- "use_cache": true,
- "use_sliding_window": false,
- "vocab_size": 152064
- }
 
Qwen2.5-Coder-7B-Instruct-math-solver-config_1/generation_config.json DELETED
@@ -1,14 +0,0 @@
- {
- "bos_token_id": 151643,
- "do_sample": true,
- "eos_token_id": [
- 151645,
- 151643
- ],
- "pad_token_id": 151643,
- "repetition_penalty": 1.1,
- "temperature": 0.7,
- "top_k": 20,
- "top_p": 0.8,
- "transformers_version": "4.46.3"
- }
 
Qwen2.5-Coder-7B-Instruct-math-solver-config_1/merges.txt DELETED
The diff for this file is too large to render. See raw diff
 
Qwen2.5-Coder-7B-Instruct-math-solver-config_1/model-00001-of-00003.safetensors DELETED
@@ -1,3 +0,0 @@
- version https://git-lfs.github.com/spec/v1
- oid sha256:95ff0353514811cd0cbf9bd4d55e770997161ed2fb009e31355b86f9510e3264
- size 1982173099
 
Qwen2.5-Coder-7B-Instruct-math-solver-config_1/model-00002-of-00003.safetensors DELETED
@@ -1,3 +0,0 @@
- version https://git-lfs.github.com/spec/v1
- oid sha256:7f8e25deb577819e4955650af650353b1c0ec8b080339aa603476705323db9fb
- size 1994606136
 
Qwen2.5-Coder-7B-Instruct-math-solver-config_1/model-00003-of-00003.safetensors DELETED
@@ -1,3 +0,0 @@
- version https://git-lfs.github.com/spec/v1
- oid sha256:e5d3675b62139592329b80d71a217acd77df2a80ff7d86dd17c26dbcec86e321
- size 1571140061
 
Qwen2.5-Coder-7B-Instruct-math-solver-config_1/model.safetensors.index.json DELETED
The diff for this file is too large to render. See raw diff
 
Qwen2.5-Coder-7B-Instruct-math-solver-config_1/special_tokens_map.json DELETED
@@ -1,31 +0,0 @@
- {
- "additional_special_tokens": [
- "<|im_start|>",
- "<|im_end|>",
- "<|object_ref_start|>",
- "<|object_ref_end|>",
- "<|box_start|>",
- "<|box_end|>",
- "<|quad_start|>",
- "<|quad_end|>",
- "<|vision_start|>",
- "<|vision_end|>",
- "<|vision_pad|>",
- "<|image_pad|>",
- "<|video_pad|>"
- ],
- "eos_token": {
- "content": "<|im_end|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false
- },
- "pad_token": {
- "content": "<|endoftext|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false
- }
- }
 
Qwen2.5-Coder-7B-Instruct-math-solver-config_1/tokenizer.json DELETED
@@ -1,3 +0,0 @@
- version https://git-lfs.github.com/spec/v1
- oid sha256:962b8d8c521fefa934665afddae177326e974ddd6a26e69ff31ad6bccbb5593b
- size 11421994
 
Qwen2.5-Coder-7B-Instruct-math-solver-config_1/tokenizer_config.json DELETED
@@ -1,207 +0,0 @@
- {
- "add_bos_token": false,
- "add_prefix_space": false,
- "added_tokens_decoder": {
- "151643": {
- "content": "<|endoftext|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- },
- "151644": {
- "content": "<|im_start|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- },
- "151645": {
- "content": "<|im_end|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- },
- "151646": {
- "content": "<|object_ref_start|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- },
- "151647": {
- "content": "<|object_ref_end|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- },
- "151648": {
- "content": "<|box_start|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- },
- "151649": {
- "content": "<|box_end|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- },
- "151650": {
- "content": "<|quad_start|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- },
- "151651": {
- "content": "<|quad_end|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- },
- "151652": {
- "content": "<|vision_start|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- },
- "151653": {
- "content": "<|vision_end|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- },
- "151654": {
- "content": "<|vision_pad|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- },
- "151655": {
- "content": "<|image_pad|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- },
- "151656": {
- "content": "<|video_pad|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- },
- "151657": {
- "content": "<tool_call>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": false
- },
- "151658": {
- "content": "</tool_call>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": false
- },
- "151659": {
- "content": "<|fim_prefix|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": false
- },
- "151660": {
- "content": "<|fim_middle|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": false
- },
- "151661": {
- "content": "<|fim_suffix|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": false
- },
- "151662": {
- "content": "<|fim_pad|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": false
- },
- "151663": {
- "content": "<|repo_name|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": false
- },
- "151664": {
- "content": "<|file_sep|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": false
- }
- },
- "additional_special_tokens": [
- "<|im_start|>",
- "<|im_end|>",
- "<|object_ref_start|>",
- "<|object_ref_end|>",
- "<|box_start|>",
- "<|box_end|>",
- "<|quad_start|>",
- "<|quad_end|>",
- "<|vision_start|>",
- "<|vision_end|>",
- "<|vision_pad|>",
- "<|image_pad|>",
- "<|video_pad|>"
- ],
- "bos_token": null,
- "chat_template": "{%- if tools %}\n {{- '<|im_start|>system\\n' }}\n {%- if messages[0]['role'] == 'system' %}\n {{- messages[0]['content'] }}\n {%- else %}\n {{- 'You are Qwen, created by Alibaba Cloud. You are a helpful assistant.' }}\n {%- endif %}\n {{- \"\\n\\n# Tools\\n\\nYou may call one or more functions to assist with the user query.\\n\\nYou are provided with function signatures within <tools></tools> XML tags:\\n<tools>\" }}\n {%- for tool in tools %}\n {{- \"\\n\" }}\n {{- tool | tojson }}\n {%- endfor %}\n {{- \"\\n</tools>\\n\\nFor each function call, return a json object with function name and arguments within <tool_call></tool_call> XML tags:\\n<tool_call>\\n{\\\"name\\\": <function-name>, \\\"arguments\\\": <args-json-object>}\\n</tool_call><|im_end|>\\n\" }}\n{%- else %}\n {%- if messages[0]['role'] == 'system' %}\n {{- '<|im_start|>system\\n' + messages[0]['content'] + '<|im_end|>\\n' }}\n {%- else %}\n {{- '<|im_start|>system\\nYou are Qwen, created by Alibaba Cloud. You are a helpful assistant.<|im_end|>\\n' }}\n {%- endif %}\n{%- endif %}\n{%- for message in messages %}\n {%- if (message.role == \"user\") or (message.role == \"system\" and not loop.first) or (message.role == \"assistant\" and not message.tool_calls) %}\n {{- '<|im_start|>' + message.role + '\\n' + message.content + '<|im_end|>' + '\\n' }}\n {%- elif message.role == \"assistant\" %}\n {{- '<|im_start|>' + message.role }}\n {%- if message.content %}\n {{- '\\n' + message.content }}\n {%- endif %}\n {%- for tool_call in message.tool_calls %}\n {%- if tool_call.function is defined %}\n {%- set tool_call = tool_call.function %}\n {%- endif %}\n {{- '\\n<tool_call>\\n{\"name\": \"' }}\n {{- tool_call.name }}\n {{- '\", \"arguments\": ' }}\n {{- tool_call.arguments | tojson }}\n {{- '}\\n</tool_call>' }}\n {%- endfor %}\n {{- '<|im_end|>\\n' }}\n {%- elif message.role == \"tool\" %}\n {%- if (loop.index0 == 0) or (messages[loop.index0 - 1].role != \"tool\") %}\n {{- '<|im_start|>user' }}\n {%- endif %}\n {{- '\\n<tool_response>\\n' }}\n {{- message.content }}\n {{- '\\n</tool_response>' }}\n {%- if loop.last or (messages[loop.index0 + 1].role != \"tool\") %}\n {{- '<|im_end|>\\n' }}\n {%- endif %}\n {%- endif %}\n{%- endfor %}\n{%- if add_generation_prompt %}\n {{- '<|im_start|>assistant\\n' }}\n{%- endif %}\n",
- "clean_up_tokenization_spaces": false,
- "eos_token": "<|im_end|>",
- "errors": "replace",
- "model_max_length": 32768,
- "pad_token": "<|endoftext|>",
- "split_special_tokens": false,
- "tokenizer_class": "Qwen2Tokenizer",
- "unk_token": null
- }
 
Qwen2.5-Coder-7B-Instruct-math-solver-config_1/vocab.json DELETED
The diff for this file is too large to render. See raw diff