mschonhardt committed on
Commit
223bbe7
·
verified ·
1 Parent(s): 294713d

Add files using upload-large-folder tool

README.md ADDED
@@ -0,0 +1,101 @@
---
language: la
library_name: transformers
license: cc-by-sa-4.0
base_model: google/byt5-large
pipeline_tag: text2text-generation
tags:
- latin
- medieval-latin
- normalization
- legal-history
- digital-humanities
- ocr-postprocessing
widget:
- text: "viiii vt in sabbato sancto ieiunium ante noctis initium non soluatur"
  example_title: "Medieval Legal Latin"
---

# Medieval Latin Normalizer (ByT5-Large)

This model is a **ByT5-Large** transformer fine-tuned to normalize medieval Latin text. It transforms diplomatic transcriptions or noisy HTR/OCR output into a standardized orthography, facilitating downstream processing such as POS tagging, lemmatization, and linguistic analysis. The model was developed as part of the research projects **"Embedding the Past"** (LOEWE-Exploration, TU Darmstadt) and **"Burchards Dekret Digital"** (Academy of Sciences and Literature | Mainz).

## Model Logic

Medieval Latin normalization involves handling inconsistent orthography (e.g., `u/v`, `i/j`, or `ae/e` variation) and resolving phonetic spellings common in legal and ecclesiastical manuscripts.

By using **ByT5-Large**, the model operates directly on **UTF-8 bytes**. This is a significant advantage for medieval Latin, as it allows the model to process non-standard characters without the information loss typical of subword tokenizers (such as those of BERT or standard T5).
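
The byte-level vocabulary can be illustrated without downloading the tokenizer: ByT5 reserves ids 0–2 for `<pad>`, `</s>`, and `<unk>`, so every UTF-8 byte `b` maps to id `b + 3`. A minimal sketch (plain Python, mirroring `ByT5Tokenizer` behavior):

```python
# ByT5 maps each UTF-8 byte b to token id b + 3 (ids 0-2 are <pad>, </s>, <unk>).
def byt5_ids(text: str) -> list[int]:
    """One token id per UTF-8 byte, offset by the 3 reserved special tokens."""
    return [b + 3 for b in text.encode("utf-8")]

# Diplomatic and normalized spellings differ by single bytes (u -> v),
# so normalization is a sequence of local byte edits, not subword swaps.
print(byt5_ids("vt"))   # [121, 119]
print(byt5_ids("æ"))    # [198, 169] -- two bytes, two ids, never <unk>
print(byt5_ids("soluatur") == byt5_ids("solvatur"))  # False: one position differs
```

This is also why the `config.json` vocabulary is only 384 entries: 256 byte values, 3 special tokens, and the `<extra_id_*>` sentinels.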

- **Input:** Raw/diplomatic medieval Latin text.
- **Output:** Standardized/normalized Latin text.

## Technical Specifications

- **Architecture:** [ByT5-Large](https://huggingface.co/google/byt5-large) (~1.2B parameters).
- **Hardware:** Trained on NVIDIA Blackwell GPUs using `bf16` precision and the `adamw_torch_fused` optimizer.
- **Training Parameters:**
  - **Learning Rate:** 2e-4
  - **Epochs:** 20
  - **Label Smoothing:** 0.1 (to improve robustness against transcription noise)
  - **Batch Size:** 48
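
The hyperparameters above map onto Hugging Face `Seq2SeqTrainingArguments` field names roughly as follows. The actual training script is not published, so this is a reconstruction for reference, not the authors' code:

```python
# Reported hyperparameters, keyed by their transformers
# Seq2SeqTrainingArguments field names (assumed mapping, not the original script).
training_args = {
    "learning_rate": 2e-4,
    "num_train_epochs": 20,
    "label_smoothing_factor": 0.1,    # softens targets against transcription noise
    "per_device_train_batch_size": 48,
    "bf16": True,                     # bfloat16 precision on Blackwell GPUs
    "optim": "adamw_torch_fused",
}
```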

## Performance (Test Set)

The model was evaluated on a held-out test set (85 samples) from medieval legal corpora:

| Metric | Value |
| :--- | :--- |
| **Character Error Rate (CER)** | **1.62%** |
| **Word-Level F1-Score** | **94.12%** |
| **Evaluation Loss** | 0.143 |
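
CER is conventionally computed as character-level edit distance divided by reference length; the exact evaluation script is not given here, but a standard implementation looks like this:

```python
def levenshtein(a: str, b: str) -> int:
    """Character-level edit distance via dynamic programming (two-row variant)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def cer(prediction: str, reference: str) -> float:
    """Character Error Rate: edits needed to reach the reference / its length."""
    return levenshtein(prediction, reference) / len(reference)

print(cer("vt in sabbato", "ut in sabbato"))  # 1 substitution / 13 chars ≈ 0.077
```

A CER of 1.62% therefore means roughly one wrong character per 62 characters of normalized output.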

## Usage

You can use this model through the Hugging Face `pipeline` API:

```python
from transformers import pipeline

# Initialize the normalizer
normalizer = pipeline("text2text-generation", model="mschonhardt/latin-normalizer")

# Example input: diplomatic transcription with u/v variance and a Roman numeral
raw_text = "viiii vt in sabbato sancto ieiunium ante noctis initium non soluatur"
result = normalizer(raw_text, max_length=128)

print(f"Normalized: {result[0]['generated_text']}")
```
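
Because ByT5 counts length in bytes, `max_length=128` only covers short passages. For longer diplomatic transcriptions, a simple pre-chunker (a hypothetical helper, not part of the model) can split the text into word-aligned pieces that fit the window; the `pipeline` accepts the resulting list directly:

```python
def chunk_text(text: str, max_bytes: int = 100) -> list[str]:
    """Greedily pack whole words into chunks of at most max_bytes UTF-8 bytes."""
    chunks, current = [], ""
    for word in text.split():
        candidate = f"{current} {word}".strip()
        if len(candidate.encode("utf-8")) <= max_bytes or not current:
            current = candidate  # fits, or a single oversized word we must keep
        else:
            chunks.append(current)
            current = word
    if current:
        chunks.append(current)
    return chunks

# chunks = chunk_text(long_diplomatic_text)
# normalized = " ".join(r["generated_text"] for r in normalizer(chunks, max_length=128))
```

Chunking at word boundaries keeps each piece self-contained, at the cost of losing cross-chunk context for the model.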

## Citation

If you use this model in your research, please cite:

```bibtex
@software{schonhardt_michael_2026_normalization,
  author    = "Schonhardt, Michael",
  title     = "Medieval Latin Normalizer",
  year      = "2026",
  publisher = "Zenodo",
  doi       = "10.5281/zenodo.18416639",
  url       = "https://doi.org/10.5281/zenodo.18416639"
}

@article{xue-etal-2022-byt5,
  title     = "{B}y{T}5: Towards a Token-Free Future with Pre-trained Byte-to-Byte Models",
  author    = "Xue, Linting and Barua, Aditya and Constant, Noah and Al-Rfou, Rami and Narang, Sharan and Kale, Mihir and Roberts, Adam and Raffel, Colin",
  editor    = "Roark, Brian and Nenkova, Ani",
  journal   = "Transactions of the Association for Computational Linguistics",
  volume    = "10",
  year      = "2022",
  address   = "Cambridge, MA",
  publisher = "MIT Press",
  url       = "https://aclanthology.org/2022.tacl-1.17/",
  doi       = "10.1162/tacl_a_00461",
  pages     = "291--306"
}
```
added_tokens.json ADDED
@@ -0,0 +1,127 @@
{
  "<extra_id_0>": 259,
  "<extra_id_100>": 359,
  "<extra_id_101>": 360,
  "<extra_id_102>": 361,
  "<extra_id_103>": 362,
  "<extra_id_104>": 363,
  "<extra_id_105>": 364,
  "<extra_id_106>": 365,
  "<extra_id_107>": 366,
  "<extra_id_108>": 367,
  "<extra_id_109>": 368,
  "<extra_id_10>": 269,
  "<extra_id_110>": 369,
  "<extra_id_111>": 370,
  "<extra_id_112>": 371,
  "<extra_id_113>": 372,
  "<extra_id_114>": 373,
  "<extra_id_115>": 374,
  "<extra_id_116>": 375,
  "<extra_id_117>": 376,
  "<extra_id_118>": 377,
  "<extra_id_119>": 378,
  "<extra_id_11>": 270,
  "<extra_id_120>": 379,
  "<extra_id_121>": 380,
  "<extra_id_122>": 381,
  "<extra_id_123>": 382,
  "<extra_id_124>": 383,
  "<extra_id_12>": 271,
  "<extra_id_13>": 272,
  "<extra_id_14>": 273,
  "<extra_id_15>": 274,
  "<extra_id_16>": 275,
  "<extra_id_17>": 276,
  "<extra_id_18>": 277,
  "<extra_id_19>": 278,
  "<extra_id_1>": 260,
  "<extra_id_20>": 279,
  "<extra_id_21>": 280,
  "<extra_id_22>": 281,
  "<extra_id_23>": 282,
  "<extra_id_24>": 283,
  "<extra_id_25>": 284,
  "<extra_id_26>": 285,
  "<extra_id_27>": 286,
  "<extra_id_28>": 287,
  "<extra_id_29>": 288,
  "<extra_id_2>": 261,
  "<extra_id_30>": 289,
  "<extra_id_31>": 290,
  "<extra_id_32>": 291,
  "<extra_id_33>": 292,
  "<extra_id_34>": 293,
  "<extra_id_35>": 294,
  "<extra_id_36>": 295,
  "<extra_id_37>": 296,
  "<extra_id_38>": 297,
  "<extra_id_39>": 298,
  "<extra_id_3>": 262,
  "<extra_id_40>": 299,
  "<extra_id_41>": 300,
  "<extra_id_42>": 301,
  "<extra_id_43>": 302,
  "<extra_id_44>": 303,
  "<extra_id_45>": 304,
  "<extra_id_46>": 305,
  "<extra_id_47>": 306,
  "<extra_id_48>": 307,
  "<extra_id_49>": 308,
  "<extra_id_4>": 263,
  "<extra_id_50>": 309,
  "<extra_id_51>": 310,
  "<extra_id_52>": 311,
  "<extra_id_53>": 312,
  "<extra_id_54>": 313,
  "<extra_id_55>": 314,
  "<extra_id_56>": 315,
  "<extra_id_57>": 316,
  "<extra_id_58>": 317,
  "<extra_id_59>": 318,
  "<extra_id_5>": 264,
  "<extra_id_60>": 319,
  "<extra_id_61>": 320,
  "<extra_id_62>": 321,
  "<extra_id_63>": 322,
  "<extra_id_64>": 323,
  "<extra_id_65>": 324,
  "<extra_id_66>": 325,
  "<extra_id_67>": 326,
  "<extra_id_68>": 327,
  "<extra_id_69>": 328,
  "<extra_id_6>": 265,
  "<extra_id_70>": 329,
  "<extra_id_71>": 330,
  "<extra_id_72>": 331,
  "<extra_id_73>": 332,
  "<extra_id_74>": 333,
  "<extra_id_75>": 334,
  "<extra_id_76>": 335,
  "<extra_id_77>": 336,
  "<extra_id_78>": 337,
  "<extra_id_79>": 338,
  "<extra_id_7>": 266,
  "<extra_id_80>": 339,
  "<extra_id_81>": 340,
  "<extra_id_82>": 341,
  "<extra_id_83>": 342,
  "<extra_id_84>": 343,
  "<extra_id_85>": 344,
  "<extra_id_86>": 345,
  "<extra_id_87>": 346,
  "<extra_id_88>": 347,
  "<extra_id_89>": 348,
  "<extra_id_8>": 267,
  "<extra_id_90>": 349,
  "<extra_id_91>": 350,
  "<extra_id_92>": 351,
  "<extra_id_93>": 352,
  "<extra_id_94>": 353,
  "<extra_id_95>": 354,
  "<extra_id_96>": 355,
  "<extra_id_97>": 356,
  "<extra_id_98>": 357,
  "<extra_id_99>": 358,
  "<extra_id_9>": 268
}
config.json ADDED
@@ -0,0 +1,33 @@
{
  "architectures": [
    "T5ForConditionalGeneration"
  ],
  "classifier_dropout": 0.0,
  "d_ff": 3840,
  "d_kv": 64,
  "d_model": 1536,
  "decoder_start_token_id": 0,
  "dense_act_fn": "gelu_new",
  "dropout_rate": 0.1,
  "dtype": "float32",
  "eos_token_id": 1,
  "feed_forward_proj": "gated-gelu",
  "gradient_checkpointing": false,
  "initializer_factor": 1.0,
  "is_encoder_decoder": true,
  "is_gated_act": true,
  "layer_norm_epsilon": 1e-06,
  "model_type": "t5",
  "num_decoder_layers": 12,
  "num_heads": 16,
  "num_layers": 36,
  "output_past": true,
  "pad_token_id": 0,
  "relative_attention_max_distance": 128,
  "relative_attention_num_buckets": 32,
  "tie_word_embeddings": false,
  "tokenizer_class": "ByT5Tokenizer",
  "transformers_version": "4.57.6",
  "use_cache": true,
  "vocab_size": 384
}
generation_config.json ADDED
@@ -0,0 +1,9 @@
{
  "_from_model_config": true,
  "decoder_start_token_id": 0,
  "eos_token_id": [
    1
  ],
  "pad_token_id": 0,
  "transformers_version": "4.57.6"
}
model.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:005949de6d559363bec3846254764b6ee3c766f10c81444da1f3dc967f85f2a2
size 4912795416
optimizer.pt ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:eab44ef2046e1bdfb96816e7877e73bbeb0159c6ca0568abe6eaeb24b9b7ee68
size 9825908525
rng_state.pth ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:88f3a10d9c6247faa3d188e5121618bc7a3138568cc4e9c9c55f52106126daa5
size 14645
scheduler.pt ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:2efd8a83a01ce30e710e01e70684c97fc5098f0cfc80467284ff94ac2ba163d5
size 1465
special_tokens_map.json ADDED
@@ -0,0 +1,150 @@
{
  "additional_special_tokens": [
    "<extra_id_0>",
    "<extra_id_1>",
    "<extra_id_2>",
    "<extra_id_3>",
    "<extra_id_4>",
    "<extra_id_5>",
    "<extra_id_6>",
    "<extra_id_7>",
    "<extra_id_8>",
    "<extra_id_9>",
    "<extra_id_10>",
    "<extra_id_11>",
    "<extra_id_12>",
    "<extra_id_13>",
    "<extra_id_14>",
    "<extra_id_15>",
    "<extra_id_16>",
    "<extra_id_17>",
    "<extra_id_18>",
    "<extra_id_19>",
    "<extra_id_20>",
    "<extra_id_21>",
    "<extra_id_22>",
    "<extra_id_23>",
    "<extra_id_24>",
    "<extra_id_25>",
    "<extra_id_26>",
    "<extra_id_27>",
    "<extra_id_28>",
    "<extra_id_29>",
    "<extra_id_30>",
    "<extra_id_31>",
    "<extra_id_32>",
    "<extra_id_33>",
    "<extra_id_34>",
    "<extra_id_35>",
    "<extra_id_36>",
    "<extra_id_37>",
    "<extra_id_38>",
    "<extra_id_39>",
    "<extra_id_40>",
    "<extra_id_41>",
    "<extra_id_42>",
    "<extra_id_43>",
    "<extra_id_44>",
    "<extra_id_45>",
    "<extra_id_46>",
    "<extra_id_47>",
    "<extra_id_48>",
    "<extra_id_49>",
    "<extra_id_50>",
    "<extra_id_51>",
    "<extra_id_52>",
    "<extra_id_53>",
    "<extra_id_54>",
    "<extra_id_55>",
    "<extra_id_56>",
    "<extra_id_57>",
    "<extra_id_58>",
    "<extra_id_59>",
    "<extra_id_60>",
    "<extra_id_61>",
    "<extra_id_62>",
    "<extra_id_63>",
    "<extra_id_64>",
    "<extra_id_65>",
    "<extra_id_66>",
    "<extra_id_67>",
    "<extra_id_68>",
    "<extra_id_69>",
    "<extra_id_70>",
    "<extra_id_71>",
    "<extra_id_72>",
    "<extra_id_73>",
    "<extra_id_74>",
    "<extra_id_75>",
    "<extra_id_76>",
    "<extra_id_77>",
    "<extra_id_78>",
    "<extra_id_79>",
    "<extra_id_80>",
    "<extra_id_81>",
    "<extra_id_82>",
    "<extra_id_83>",
    "<extra_id_84>",
    "<extra_id_85>",
    "<extra_id_86>",
    "<extra_id_87>",
    "<extra_id_88>",
    "<extra_id_89>",
    "<extra_id_90>",
    "<extra_id_91>",
    "<extra_id_92>",
    "<extra_id_93>",
    "<extra_id_94>",
    "<extra_id_95>",
    "<extra_id_96>",
    "<extra_id_97>",
    "<extra_id_98>",
    "<extra_id_99>",
    "<extra_id_100>",
    "<extra_id_101>",
    "<extra_id_102>",
    "<extra_id_103>",
    "<extra_id_104>",
    "<extra_id_105>",
    "<extra_id_106>",
    "<extra_id_107>",
    "<extra_id_108>",
    "<extra_id_109>",
    "<extra_id_110>",
    "<extra_id_111>",
    "<extra_id_112>",
    "<extra_id_113>",
    "<extra_id_114>",
    "<extra_id_115>",
    "<extra_id_116>",
    "<extra_id_117>",
    "<extra_id_118>",
    "<extra_id_119>",
    "<extra_id_120>",
    "<extra_id_121>",
    "<extra_id_122>",
    "<extra_id_123>",
    "<extra_id_124>"
  ],
  "eos_token": {
    "content": "</s>",
    "lstrip": false,
    "normalized": true,
    "rstrip": false,
    "single_word": false
  },
  "pad_token": {
    "content": "<pad>",
    "lstrip": false,
    "normalized": true,
    "rstrip": false,
    "single_word": false
  },
  "unk_token": {
    "content": "<unk>",
    "lstrip": false,
    "normalized": true,
    "rstrip": false,
    "single_word": false
  }
}
tokenizer_config.json ADDED
@@ -0,0 +1,1163 @@
1
+ {
2
+ "added_tokens_decoder": {
3
+ "0": {
4
+ "content": "<pad>",
5
+ "lstrip": false,
6
+ "normalized": true,
7
+ "rstrip": false,
8
+ "single_word": false,
9
+ "special": true
10
+ },
11
+ "1": {
12
+ "content": "</s>",
13
+ "lstrip": false,
14
+ "normalized": true,
15
+ "rstrip": false,
16
+ "single_word": false,
17
+ "special": true
18
+ },
19
+ "2": {
20
+ "content": "<unk>",
21
+ "lstrip": false,
22
+ "normalized": true,
23
+ "rstrip": false,
24
+ "single_word": false,
25
+ "special": true
26
+ },
27
+ "259": {
28
+ "content": "<extra_id_0>",
29
+ "lstrip": false,
30
+ "normalized": false,
31
+ "rstrip": false,
32
+ "single_word": false,
33
+ "special": true
34
+ },
35
+ "260": {
36
+ "content": "<extra_id_1>",
37
+ "lstrip": false,
38
+ "normalized": false,
39
+ "rstrip": false,
40
+ "single_word": false,
41
+ "special": true
42
+ },
43
+ "261": {
44
+ "content": "<extra_id_2>",
45
+ "lstrip": false,
46
+ "normalized": false,
47
+ "rstrip": false,
48
+ "single_word": false,
49
+ "special": true
50
+ },
51
+ "262": {
52
+ "content": "<extra_id_3>",
53
+ "lstrip": false,
54
+ "normalized": false,
55
+ "rstrip": false,
56
+ "single_word": false,
57
+ "special": true
58
+ },
59
+ "263": {
60
+ "content": "<extra_id_4>",
61
+ "lstrip": false,
62
+ "normalized": false,
63
+ "rstrip": false,
64
+ "single_word": false,
65
+ "special": true
66
+ },
67
+ "264": {
68
+ "content": "<extra_id_5>",
69
+ "lstrip": false,
70
+ "normalized": false,
71
+ "rstrip": false,
72
+ "single_word": false,
73
+ "special": true
74
+ },
75
+ "265": {
76
+ "content": "<extra_id_6>",
77
+ "lstrip": false,
78
+ "normalized": false,
79
+ "rstrip": false,
80
+ "single_word": false,
81
+ "special": true
82
+ },
83
+ "266": {
84
+ "content": "<extra_id_7>",
85
+ "lstrip": false,
86
+ "normalized": false,
87
+ "rstrip": false,
88
+ "single_word": false,
89
+ "special": true
90
+ },
91
+ "267": {
92
+ "content": "<extra_id_8>",
93
+ "lstrip": false,
94
+ "normalized": false,
95
+ "rstrip": false,
96
+ "single_word": false,
97
+ "special": true
98
+ },
99
+ "268": {
100
+ "content": "<extra_id_9>",
101
+ "lstrip": false,
102
+ "normalized": false,
103
+ "rstrip": false,
104
+ "single_word": false,
105
+ "special": true
106
+ },
107
+ "269": {
108
+ "content": "<extra_id_10>",
109
+ "lstrip": false,
110
+ "normalized": false,
111
+ "rstrip": false,
112
+ "single_word": false,
113
+ "special": true
114
+ },
115
+ "270": {
116
+ "content": "<extra_id_11>",
117
+ "lstrip": false,
118
+ "normalized": false,
119
+ "rstrip": false,
120
+ "single_word": false,
121
+ "special": true
122
+ },
123
+ "271": {
124
+ "content": "<extra_id_12>",
125
+ "lstrip": false,
126
+ "normalized": false,
127
+ "rstrip": false,
128
+ "single_word": false,
129
+ "special": true
130
+ },
131
+ "272": {
132
+ "content": "<extra_id_13>",
133
+ "lstrip": false,
134
+ "normalized": false,
135
+ "rstrip": false,
136
+ "single_word": false,
137
+ "special": true
138
+ },
139
+ "273": {
140
+ "content": "<extra_id_14>",
141
+ "lstrip": false,
142
+ "normalized": false,
143
+ "rstrip": false,
144
+ "single_word": false,
145
+ "special": true
146
+ },
147
+ "274": {
148
+ "content": "<extra_id_15>",
149
+ "lstrip": false,
150
+ "normalized": false,
151
+ "rstrip": false,
152
+ "single_word": false,
153
+ "special": true
154
+ },
155
+ "275": {
156
+ "content": "<extra_id_16>",
157
+ "lstrip": false,
158
+ "normalized": false,
159
+ "rstrip": false,
160
+ "single_word": false,
161
+ "special": true
162
+ },
163
+ "276": {
164
+ "content": "<extra_id_17>",
165
+ "lstrip": false,
166
+ "normalized": false,
167
+ "rstrip": false,
168
+ "single_word": false,
169
+ "special": true
170
+ },
171
+ "277": {
172
+ "content": "<extra_id_18>",
173
+ "lstrip": false,
174
+ "normalized": false,
175
+ "rstrip": false,
176
+ "single_word": false,
177
+ "special": true
178
+ },
179
+ "278": {
180
+ "content": "<extra_id_19>",
181
+ "lstrip": false,
182
+ "normalized": false,
183
+ "rstrip": false,
184
+ "single_word": false,
185
+ "special": true
186
+ },
187
+ "279": {
188
+ "content": "<extra_id_20>",
189
+ "lstrip": false,
190
+ "normalized": false,
191
+ "rstrip": false,
192
+ "single_word": false,
193
+ "special": true
194
+ },
195
+ "280": {
196
+ "content": "<extra_id_21>",
197
+ "lstrip": false,
198
+ "normalized": false,
199
+ "rstrip": false,
200
+ "single_word": false,
201
+ "special": true
202
+ },
203
+ "281": {
204
+ "content": "<extra_id_22>",
205
+ "lstrip": false,
206
+ "normalized": false,
207
+ "rstrip": false,
208
+ "single_word": false,
209
+ "special": true
210
+ },
211
+ "282": {
212
+ "content": "<extra_id_23>",
213
+ "lstrip": false,
214
+ "normalized": false,
215
+ "rstrip": false,
216
+ "single_word": false,
217
+ "special": true
218
+ },
219
+ "283": {
220
+ "content": "<extra_id_24>",
221
+ "lstrip": false,
222
+ "normalized": false,
223
+ "rstrip": false,
224
+ "single_word": false,
225
+ "special": true
226
+ },
227
+ "284": {
228
+ "content": "<extra_id_25>",
229
+ "lstrip": false,
230
+ "normalized": false,
231
+ "rstrip": false,
232
+ "single_word": false,
233
+ "special": true
234
+ },
235
+ "285": {
236
+ "content": "<extra_id_26>",
237
+ "lstrip": false,
238
+ "normalized": false,
239
+ "rstrip": false,
240
+ "single_word": false,
241
+ "special": true
242
+ },
243
+ "286": {
244
+ "content": "<extra_id_27>",
245
+ "lstrip": false,
246
+ "normalized": false,
247
+ "rstrip": false,
248
+ "single_word": false,
249
+ "special": true
250
+ },
251
+ "287": {
252
+ "content": "<extra_id_28>",
253
+ "lstrip": false,
254
+ "normalized": false,
255
+ "rstrip": false,
256
+ "single_word": false,
257
+ "special": true
258
+ },
259
+ "288": {
260
+ "content": "<extra_id_29>",
261
+ "lstrip": false,
262
+ "normalized": false,
263
+ "rstrip": false,
264
+ "single_word": false,
265
+ "special": true
266
+ },
267
+ "289": {
268
+ "content": "<extra_id_30>",
269
+ "lstrip": false,
270
+ "normalized": false,
271
+ "rstrip": false,
272
+ "single_word": false,
273
+ "special": true
274
+ },
275
+ "290": {
276
+ "content": "<extra_id_31>",
277
+ "lstrip": false,
278
+ "normalized": false,
279
+ "rstrip": false,
280
+ "single_word": false,
281
+ "special": true
282
+ },
283
+ "291": {
284
+ "content": "<extra_id_32>",
285
+ "lstrip": false,
286
+ "normalized": false,
287
+ "rstrip": false,
288
+ "single_word": false,
289
+ "special": true
290
+ },
291
+ "292": {
292
+ "content": "<extra_id_33>",
293
+ "lstrip": false,
294
+ "normalized": false,
295
+ "rstrip": false,
296
+ "single_word": false,
297
+ "special": true
298
+ },
299
+ "293": {
300
+ "content": "<extra_id_34>",
301
+ "lstrip": false,
302
+ "normalized": false,
303
+ "rstrip": false,
304
+ "single_word": false,
305
+ "special": true
306
+ },
307
+ "294": {
308
+ "content": "<extra_id_35>",
309
+ "lstrip": false,
310
+ "normalized": false,
311
+ "rstrip": false,
312
+ "single_word": false,
313
+ "special": true
314
+ },
315
+ "295": {
316
+ "content": "<extra_id_36>",
317
+ "lstrip": false,
318
+ "normalized": false,
319
+ "rstrip": false,
320
+ "single_word": false,
321
+ "special": true
322
+ },
323
+ "296": {
324
+ "content": "<extra_id_37>",
325
+ "lstrip": false,
326
+ "normalized": false,
327
+ "rstrip": false,
328
+ "single_word": false,
329
+ "special": true
330
+ },
331
+ "297": {
332
+ "content": "<extra_id_38>",
333
+ "lstrip": false,
334
+ "normalized": false,
335
+ "rstrip": false,
336
+ "single_word": false,
337
+ "special": true
338
+ },
339
+ "298": {
340
+ "content": "<extra_id_39>",
341
+ "lstrip": false,
342
+ "normalized": false,
343
+ "rstrip": false,
344
+ "single_word": false,
345
+ "special": true
346
+ },
347
+ "299": {
348
+ "content": "<extra_id_40>",
349
+ "lstrip": false,
350
+ "normalized": false,
351
+ "rstrip": false,
352
+ "single_word": false,
353
+ "special": true
354
+ },
355
+ "300": {
356
+ "content": "<extra_id_41>",
357
+ "lstrip": false,
358
+ "normalized": false,
359
+ "rstrip": false,
360
+ "single_word": false,
361
+ "special": true
362
+ },
363
+ "301": {
364
+ "content": "<extra_id_42>",
365
+ "lstrip": false,
366
+ "normalized": false,
367
+ "rstrip": false,
368
+ "single_word": false,
369
+ "special": true
370
+ },
371
+ "302": {
372
+ "content": "<extra_id_43>",
373
+ "lstrip": false,
374
+ "normalized": false,
375
+ "rstrip": false,
376
+ "single_word": false,
377
+ "special": true
378
+ },
379
+ "303": {
380
+ "content": "<extra_id_44>",
381
+ "lstrip": false,
382
+ "normalized": false,
383
+ "rstrip": false,
384
+ "single_word": false,
385
+ "special": true
386
+ },
387
+ "304": {
388
+ "content": "<extra_id_45>",
389
+ "lstrip": false,
390
+ "normalized": false,
391
+ "rstrip": false,
392
+ "single_word": false,
393
+ "special": true
394
+ },
395
+ "305": {
396
+ "content": "<extra_id_46>",
397
+ "lstrip": false,
398
+ "normalized": false,
399
+ "rstrip": false,
400
+ "single_word": false,
401
+ "special": true
402
+ },
403
+ "306": {
404
+ "content": "<extra_id_47>",
405
+ "lstrip": false,
406
+ "normalized": false,
407
+ "rstrip": false,
408
+ "single_word": false,
409
+ "special": true
410
+ },
411
+ "307": {
412
+ "content": "<extra_id_48>",
413
+ "lstrip": false,
414
+ "normalized": false,
415
+ "rstrip": false,
416
+ "single_word": false,
417
+ "special": true
418
+ },
419
+ "308": {
420
+ "content": "<extra_id_49>",
421
+ "lstrip": false,
422
+ "normalized": false,
423
+ "rstrip": false,
424
+ "single_word": false,
425
+ "special": true
426
+ },
427
+ "309": {
428
+ "content": "<extra_id_50>",
429
+ "lstrip": false,
430
+ "normalized": false,
431
+ "rstrip": false,
432
+ "single_word": false,
433
+ "special": true
434
+ },
435
+ "310": {
436
+ "content": "<extra_id_51>",
437
+ "lstrip": false,
438
+ "normalized": false,
439
+ "rstrip": false,
440
+ "single_word": false,
441
+ "special": true
442
+ },
443
+ "311": {
444
+ "content": "<extra_id_52>",
445
+ "lstrip": false,
446
+ "normalized": false,
447
+ "rstrip": false,
448
+ "single_word": false,
449
+ "special": true
450
+ },
451
+ "312": {
452
+ "content": "<extra_id_53>",
453
+ "lstrip": false,
454
+ "normalized": false,
455
+ "rstrip": false,
456
+ "single_word": false,
457
+ "special": true
458
+ },
459
+ "313": {
460
+ "content": "<extra_id_54>",
461
+ "lstrip": false,
462
+ "normalized": false,
463
+ "rstrip": false,
464
+ "single_word": false,
465
+ "special": true
466
+ },
467
+ "314": {
468
+ "content": "<extra_id_55>",
469
+ "lstrip": false,
470
+ "normalized": false,
471
+ "rstrip": false,
472
+ "single_word": false,
473
+ "special": true
474
+ },
475
+ "315": {
476
+ "content": "<extra_id_56>",
477
+ "lstrip": false,
478
+ "normalized": false,
479
+ "rstrip": false,
480
+ "single_word": false,
481
+ "special": true
482
+ },
483
+ "316": {
484
+ "content": "<extra_id_57>",
485
+ "lstrip": false,
486
+ "normalized": false,
487
+ "rstrip": false,
488
+ "single_word": false,
489
+ "special": true
490
+ },
491
+ "317": {
492
+ "content": "<extra_id_58>",
493
+ "lstrip": false,
494
+ "normalized": false,
495
+ "rstrip": false,
496
+ "single_word": false,
497
+ "special": true
498
+ },
499
+ "318": {
500
+ "content": "<extra_id_59>",
501
+ "lstrip": false,
502
+ "normalized": false,
503
+ "rstrip": false,
504
+ "single_word": false,
505
+ "special": true
506
+ },
507
+ "319": {
508
+ "content": "<extra_id_60>",
509
+ "lstrip": false,
510
+ "normalized": false,
511
+ "rstrip": false,
512
+ "single_word": false,
513
+ "special": true
514
+ },
515
+ "320": {
516
+ "content": "<extra_id_61>",
517
+ "lstrip": false,
518
+ "normalized": false,
519
+ "rstrip": false,
520
+ "single_word": false,
521
+ "special": true
522
+ },
523
+ "321": {
524
+ "content": "<extra_id_62>",
525
+ "lstrip": false,
526
+ "normalized": false,
527
+ "rstrip": false,
528
+ "single_word": false,
529
+ "special": true
530
+ },
531
+ "322": {
532
+ "content": "<extra_id_63>",
533
+ "lstrip": false,
534
+ "normalized": false,
535
+ "rstrip": false,
536
+ "single_word": false,
537
+ "special": true
538
+ },
539
+ "323": {
540
+ "content": "<extra_id_64>",
541
+ "lstrip": false,
542
+ "normalized": false,
543
+ "rstrip": false,
544
+ "single_word": false,
545
+ "special": true
546
+ },
547
+ "324": {
548
+ "content": "<extra_id_65>",
549
+ "lstrip": false,
550
+ "normalized": false,
551
+ "rstrip": false,
552
+ "single_word": false,
553
+ "special": true
554
+ },
555
+ "325": {
556
+ "content": "<extra_id_66>",
557
+ "lstrip": false,
558
+ "normalized": false,
559
+ "rstrip": false,
560
+ "single_word": false,
561
+ "special": true
562
+ },
563
+ "326": {
564
+ "content": "<extra_id_67>",
565
+ "lstrip": false,
566
+ "normalized": false,
567
+ "rstrip": false,
568
+ "single_word": false,
569
+ "special": true
570
+ },
571
+ "327": {
572
+ "content": "<extra_id_68>",
573
+ "lstrip": false,
574
+ "normalized": false,
575
+ "rstrip": false,
576
+ "single_word": false,
577
+ "special": true
578
+ },
579
+ "328": {
580
+ "content": "<extra_id_69>",
581
+ "lstrip": false,
582
+ "normalized": false,
583
+ "rstrip": false,
584
+ "single_word": false,
585
+ "special": true
586
+ },
587
+ "329": {
588
+ "content": "<extra_id_70>",
589
+ "lstrip": false,
590
+ "normalized": false,
591
+ "rstrip": false,
592
+ "single_word": false,
593
+ "special": true
594
+ },
595
+ "330": {
596
+ "content": "<extra_id_71>",
597
+ "lstrip": false,
598
+ "normalized": false,
599
+ "rstrip": false,
600
+ "single_word": false,
601
+ "special": true
602
+ },
603
+ "331": {
604
+ "content": "<extra_id_72>",
605
+ "lstrip": false,
606
+ "normalized": false,
607
+ "rstrip": false,
608
+ "single_word": false,
609
+ "special": true
610
+ },
611
+ "332": {
612
+ "content": "<extra_id_73>",
613
+ "lstrip": false,
614
+ "normalized": false,
615
+ "rstrip": false,
616
+ "single_word": false,
617
+ "special": true
618
+ },
619
+ "333": {
620
+ "content": "<extra_id_74>",
621
+ "lstrip": false,
622
+ "normalized": false,
623
+ "rstrip": false,
624
+ "single_word": false,
625
+ "special": true
626
+ },
627
+ "334": {
628
+ "content": "<extra_id_75>",
629
+ "lstrip": false,
630
+ "normalized": false,
631
+ "rstrip": false,
632
+ "single_word": false,
633
+ "special": true
634
+ },
635
+ "335": {
636
+ "content": "<extra_id_76>",
637
+ "lstrip": false,
638
+ "normalized": false,
639
+ "rstrip": false,
640
+ "single_word": false,
641
+ "special": true
642
+ },
643
+ "336": {
644
+ "content": "<extra_id_77>",
645
+ "lstrip": false,
646
+ "normalized": false,
647
+ "rstrip": false,
648
+ "single_word": false,
649
+ "special": true
650
+ },
651
+ "337": {
652
+ "content": "<extra_id_78>",
653
+ "lstrip": false,
654
+ "normalized": false,
655
+ "rstrip": false,
656
+ "single_word": false,
657
+ "special": true
658
+ },
659
+ "338": {
660
+ "content": "<extra_id_79>",
661
+ "lstrip": false,
662
+ "normalized": false,
663
+ "rstrip": false,
664
+ "single_word": false,
665
+ "special": true
666
+ },
667
+ "339": {
668
+ "content": "<extra_id_80>",
669
+ "lstrip": false,
670
+ "normalized": false,
671
+ "rstrip": false,
672
+ "single_word": false,
673
+ "special": true
674
+ },
675
+ "340": {
676
+ "content": "<extra_id_81>",
677
+ "lstrip": false,
678
+ "normalized": false,
679
+ "rstrip": false,
680
+ "single_word": false,
681
+ "special": true
682
+ },
683
+ "341": {
684
+ "content": "<extra_id_82>",
685
+ "lstrip": false,
686
+ "normalized": false,
687
+ "rstrip": false,
688
+ "single_word": false,
689
+ "special": true
690
+ },
691
+ "342": {
692
+ "content": "<extra_id_83>",
693
+ "lstrip": false,
694
+ "normalized": false,
695
+ "rstrip": false,
696
+ "single_word": false,
697
+ "special": true
698
+ },
699
+ "343": {
700
+ "content": "<extra_id_84>",
701
+ "lstrip": false,
702
+ "normalized": false,
703
+ "rstrip": false,
704
+ "single_word": false,
705
+ "special": true
706
+ },
707
+ "344": {
708
+ "content": "<extra_id_85>",
709
+ "lstrip": false,
710
+ "normalized": false,
711
+ "rstrip": false,
712
+ "single_word": false,
713
+ "special": true
714
+ },
715
+ "345": {
716
+ "content": "<extra_id_86>",
717
+ "lstrip": false,
718
+ "normalized": false,
719
+ "rstrip": false,
720
+ "single_word": false,
721
+ "special": true
722
+ },
723
+ "346": {
724
+ "content": "<extra_id_87>",
725
+ "lstrip": false,
726
+ "normalized": false,
727
+ "rstrip": false,
728
+ "single_word": false,
729
+ "special": true
730
+ },
731
+ "347": {
732
+ "content": "<extra_id_88>",
733
+ "lstrip": false,
734
+ "normalized": false,
735
+ "rstrip": false,
736
+ "single_word": false,
737
+ "special": true
738
+ },
739
+ "348": {
740
+ "content": "<extra_id_89>",
741
+ "lstrip": false,
742
+ "normalized": false,
743
+ "rstrip": false,
744
+ "single_word": false,
745
+ "special": true
746
+ },
747
+ "349": {
748
+ "content": "<extra_id_90>",
749
+ "lstrip": false,
750
+ "normalized": false,
751
+ "rstrip": false,
752
+ "single_word": false,
753
+ "special": true
754
+ },
755
+ "350": {
756
+ "content": "<extra_id_91>",
757
+ "lstrip": false,
758
+ "normalized": false,
759
+ "rstrip": false,
760
+ "single_word": false,
761
+ "special": true
762
+ },
763
+ "351": {
764
+ "content": "<extra_id_92>",
765
+ "lstrip": false,
766
+ "normalized": false,
767
+ "rstrip": false,
768
+ "single_word": false,
769
+ "special": true
770
+ },
771
+ "352": {
772
+ "content": "<extra_id_93>",
773
+ "lstrip": false,
774
+ "normalized": false,
775
+ "rstrip": false,
776
+ "single_word": false,
777
+ "special": true
778
+ },
779
+ "353": {
780
+ "content": "<extra_id_94>",
781
+ "lstrip": false,
782
+ "normalized": false,
783
+ "rstrip": false,
784
+ "single_word": false,
785
+ "special": true
786
+ },
787
+ "354": {
788
+ "content": "<extra_id_95>",
789
+ "lstrip": false,
790
+ "normalized": false,
791
+ "rstrip": false,
792
+ "single_word": false,
793
+ "special": true
794
+ },
795
+ "355": {
796
+ "content": "<extra_id_96>",
797
+ "lstrip": false,
798
+ "normalized": false,
799
+ "rstrip": false,
800
+ "single_word": false,
801
+ "special": true
802
+ },
803
+ "356": {
804
+ "content": "<extra_id_97>",
805
+ "lstrip": false,
806
+ "normalized": false,
807
+ "rstrip": false,
808
+ "single_word": false,
809
+ "special": true
810
+ },
811
+ "357": {
812
+ "content": "<extra_id_98>",
813
+ "lstrip": false,
814
+ "normalized": false,
815
+ "rstrip": false,
816
+ "single_word": false,
817
+ "special": true
818
+ },
819
+ "358": {
820
+ "content": "<extra_id_99>",
821
+ "lstrip": false,
822
+ "normalized": false,
823
+ "rstrip": false,
824
+ "single_word": false,
825
+ "special": true
826
+ },
827
+ "359": {
828
+ "content": "<extra_id_100>",
829
+ "lstrip": false,
830
+ "normalized": false,
831
+ "rstrip": false,
832
+ "single_word": false,
833
+ "special": true
834
+ },
835
+ "360": {
836
+ "content": "<extra_id_101>",
837
+ "lstrip": false,
838
+ "normalized": false,
839
+ "rstrip": false,
840
+ "single_word": false,
841
+ "special": true
842
+ },
843
+ "361": {
844
+ "content": "<extra_id_102>",
845
+ "lstrip": false,
846
+ "normalized": false,
847
+ "rstrip": false,
848
+ "single_word": false,
849
+ "special": true
850
+ },
851
+ "362": {
852
+ "content": "<extra_id_103>",
853
+ "lstrip": false,
854
+ "normalized": false,
855
+ "rstrip": false,
856
+ "single_word": false,
857
+ "special": true
858
+ },
859
+ "363": {
860
+ "content": "<extra_id_104>",
861
+ "lstrip": false,
862
+ "normalized": false,
863
+ "rstrip": false,
864
+ "single_word": false,
865
+ "special": true
866
+ },
867
+ "364": {
868
+ "content": "<extra_id_105>",
869
+ "lstrip": false,
870
+ "normalized": false,
871
+ "rstrip": false,
872
+ "single_word": false,
873
+ "special": true
874
+ },
875
+ "365": {
876
+ "content": "<extra_id_106>",
877
+ "lstrip": false,
878
+ "normalized": false,
879
+ "rstrip": false,
880
+ "single_word": false,
881
+ "special": true
882
+ },
883
+ "366": {
884
+ "content": "<extra_id_107>",
885
+ "lstrip": false,
886
+ "normalized": false,
887
+ "rstrip": false,
888
+ "single_word": false,
889
+ "special": true
890
+ },
891
+ "367": {
892
+ "content": "<extra_id_108>",
893
+ "lstrip": false,
894
+ "normalized": false,
895
+ "rstrip": false,
896
+ "single_word": false,
897
+ "special": true
898
+ },
899
+ "368": {
900
+ "content": "<extra_id_109>",
901
+ "lstrip": false,
902
+ "normalized": false,
903
+ "rstrip": false,
904
+ "single_word": false,
905
+ "special": true
906
+ },
907
+ "369": {
908
+ "content": "<extra_id_110>",
909
+ "lstrip": false,
910
+ "normalized": false,
911
+ "rstrip": false,
912
+ "single_word": false,
913
+ "special": true
914
+ },
915
+ "370": {
916
+ "content": "<extra_id_111>",
917
+ "lstrip": false,
918
+ "normalized": false,
919
+ "rstrip": false,
920
+ "single_word": false,
921
+ "special": true
922
+ },
923
+ "371": {
924
+ "content": "<extra_id_112>",
925
+ "lstrip": false,
926
+ "normalized": false,
927
+ "rstrip": false,
928
+ "single_word": false,
929
+ "special": true
930
+ },
931
+ "372": {
932
+ "content": "<extra_id_113>",
933
+ "lstrip": false,
934
+ "normalized": false,
935
+ "rstrip": false,
936
+ "single_word": false,
937
+ "special": true
938
+ },
939
+ "373": {
940
+ "content": "<extra_id_114>",
941
+ "lstrip": false,
942
+ "normalized": false,
943
+ "rstrip": false,
944
+ "single_word": false,
945
+ "special": true
946
+ },
947
+ "374": {
948
+ "content": "<extra_id_115>",
949
+ "lstrip": false,
950
+ "normalized": false,
951
+ "rstrip": false,
952
+ "single_word": false,
953
+ "special": true
954
+ },
955
+ "375": {
956
+ "content": "<extra_id_116>",
957
+ "lstrip": false,
958
+ "normalized": false,
959
+ "rstrip": false,
960
+ "single_word": false,
961
+ "special": true
962
+ },
963
+ "376": {
964
+ "content": "<extra_id_117>",
965
+ "lstrip": false,
966
+ "normalized": false,
967
+ "rstrip": false,
968
+ "single_word": false,
969
+ "special": true
970
+ },
971
+ "377": {
972
+ "content": "<extra_id_118>",
973
+ "lstrip": false,
974
+ "normalized": false,
975
+ "rstrip": false,
976
+ "single_word": false,
977
+ "special": true
978
+ },
979
+ "378": {
980
+ "content": "<extra_id_119>",
981
+ "lstrip": false,
982
+ "normalized": false,
983
+ "rstrip": false,
984
+ "single_word": false,
985
+ "special": true
986
+ },
987
+ "379": {
988
+ "content": "<extra_id_120>",
989
+ "lstrip": false,
990
+ "normalized": false,
991
+ "rstrip": false,
992
+ "single_word": false,
993
+ "special": true
994
+ },
995
+ "380": {
996
+ "content": "<extra_id_121>",
997
+ "lstrip": false,
998
+ "normalized": false,
999
+ "rstrip": false,
1000
+ "single_word": false,
1001
+ "special": true
1002
+ },
1003
+ "381": {
1004
+ "content": "<extra_id_122>",
1005
+ "lstrip": false,
1006
+ "normalized": false,
1007
+ "rstrip": false,
1008
+ "single_word": false,
1009
+ "special": true
1010
+ },
1011
+ "382": {
1012
+ "content": "<extra_id_123>",
1013
+ "lstrip": false,
1014
+ "normalized": false,
1015
+ "rstrip": false,
1016
+ "single_word": false,
1017
+ "special": true
1018
+ },
1019
+ "383": {
1020
+ "content": "<extra_id_124>",
1021
+ "lstrip": false,
1022
+ "normalized": false,
1023
+ "rstrip": false,
1024
+ "single_word": false,
1025
+ "special": true
1026
+ }
1027
+ },
1028
+ "additional_special_tokens": [
1029
+ "<extra_id_0>",
1030
+ "<extra_id_1>",
1031
+ "<extra_id_2>",
1032
+ "<extra_id_3>",
1033
+ "<extra_id_4>",
1034
+ "<extra_id_5>",
1035
+ "<extra_id_6>",
1036
+ "<extra_id_7>",
1037
+ "<extra_id_8>",
1038
+ "<extra_id_9>",
1039
+ "<extra_id_10>",
1040
+ "<extra_id_11>",
1041
+ "<extra_id_12>",
1042
+ "<extra_id_13>",
1043
+ "<extra_id_14>",
1044
+ "<extra_id_15>",
1045
+ "<extra_id_16>",
1046
+ "<extra_id_17>",
1047
+ "<extra_id_18>",
1048
+ "<extra_id_19>",
1049
+ "<extra_id_20>",
1050
+ "<extra_id_21>",
1051
+ "<extra_id_22>",
1052
+ "<extra_id_23>",
1053
+ "<extra_id_24>",
1054
+ "<extra_id_25>",
1055
+ "<extra_id_26>",
1056
+ "<extra_id_27>",
1057
+ "<extra_id_28>",
1058
+ "<extra_id_29>",
1059
+ "<extra_id_30>",
1060
+ "<extra_id_31>",
1061
+ "<extra_id_32>",
1062
+ "<extra_id_33>",
1063
+ "<extra_id_34>",
1064
+ "<extra_id_35>",
1065
+ "<extra_id_36>",
1066
+ "<extra_id_37>",
1067
+ "<extra_id_38>",
1068
+ "<extra_id_39>",
1069
+ "<extra_id_40>",
1070
+ "<extra_id_41>",
1071
+ "<extra_id_42>",
1072
+ "<extra_id_43>",
1073
+ "<extra_id_44>",
1074
+ "<extra_id_45>",
1075
+ "<extra_id_46>",
1076
+ "<extra_id_47>",
1077
+ "<extra_id_48>",
1078
+ "<extra_id_49>",
1079
+ "<extra_id_50>",
1080
+ "<extra_id_51>",
1081
+ "<extra_id_52>",
1082
+ "<extra_id_53>",
1083
+ "<extra_id_54>",
1084
+ "<extra_id_55>",
1085
+ "<extra_id_56>",
1086
+ "<extra_id_57>",
1087
+ "<extra_id_58>",
1088
+ "<extra_id_59>",
1089
+ "<extra_id_60>",
1090
+ "<extra_id_61>",
1091
+ "<extra_id_62>",
1092
+ "<extra_id_63>",
1093
+ "<extra_id_64>",
1094
+ "<extra_id_65>",
1095
+ "<extra_id_66>",
1096
+ "<extra_id_67>",
1097
+ "<extra_id_68>",
1098
+ "<extra_id_69>",
1099
+ "<extra_id_70>",
1100
+ "<extra_id_71>",
1101
+ "<extra_id_72>",
1102
+ "<extra_id_73>",
1103
+ "<extra_id_74>",
1104
+ "<extra_id_75>",
1105
+ "<extra_id_76>",
1106
+ "<extra_id_77>",
1107
+ "<extra_id_78>",
1108
+ "<extra_id_79>",
1109
+ "<extra_id_80>",
1110
+ "<extra_id_81>",
1111
+ "<extra_id_82>",
1112
+ "<extra_id_83>",
1113
+ "<extra_id_84>",
1114
+ "<extra_id_85>",
1115
+ "<extra_id_86>",
1116
+ "<extra_id_87>",
1117
+ "<extra_id_88>",
1118
+ "<extra_id_89>",
1119
+ "<extra_id_90>",
1120
+ "<extra_id_91>",
1121
+ "<extra_id_92>",
1122
+ "<extra_id_93>",
1123
+ "<extra_id_94>",
1124
+ "<extra_id_95>",
1125
+ "<extra_id_96>",
1126
+ "<extra_id_97>",
1127
+ "<extra_id_98>",
1128
+ "<extra_id_99>",
1129
+ "<extra_id_100>",
1130
+ "<extra_id_101>",
1131
+ "<extra_id_102>",
1132
+ "<extra_id_103>",
1133
+ "<extra_id_104>",
1134
+ "<extra_id_105>",
1135
+ "<extra_id_106>",
1136
+ "<extra_id_107>",
1137
+ "<extra_id_108>",
1138
+ "<extra_id_109>",
1139
+ "<extra_id_110>",
1140
+ "<extra_id_111>",
1141
+ "<extra_id_112>",
1142
+ "<extra_id_113>",
1143
+ "<extra_id_114>",
1144
+ "<extra_id_115>",
1145
+ "<extra_id_116>",
1146
+ "<extra_id_117>",
1147
+ "<extra_id_118>",
1148
+ "<extra_id_119>",
1149
+ "<extra_id_120>",
1150
+ "<extra_id_121>",
1151
+ "<extra_id_122>",
1152
+ "<extra_id_123>",
1153
+ "<extra_id_124>"
1154
+ ],
1155
+ "clean_up_tokenization_spaces": false,
1156
+ "eos_token": "</s>",
1157
+ "extra_ids": 0,
1158
+ "extra_special_tokens": {},
1159
+ "model_max_length": 1000000000000000019884624838656,
1160
+ "pad_token": "<pad>",
1161
+ "tokenizer_class": "ByT5Tokenizer",
1162
+ "unk_token": "<unk>"
1163
+ }
trainer_state.json ADDED
@@ -0,0 +1,2182 @@
+ {
+ "best_global_step": null,
+ "best_metric": null,
+ "best_model_checkpoint": null,
+ "epoch": 20.0,
+ "eval_steps": 500,
+ "global_step": 1420,
+ "is_hyper_param_search": false,
+ "is_local_process_zero": true,
+ "is_world_process_zero": true,
+ "log_history": [
+ {
+ "epoch": 0.07042253521126761,
+ "grad_norm": 298.37652587890625,
+ "learning_rate": 0.00019943661971830986,
+ "loss": 32.3915,
+ "step": 5
+ },
+ {
+ "epoch": 0.14084507042253522,
+ "grad_norm": 93.51129150390625,
+ "learning_rate": 0.0001987323943661972,
+ "loss": 28.743,
+ "step": 10
+ },
+ {
+ "epoch": 0.2112676056338028,
+ "grad_norm": 47.780601501464844,
+ "learning_rate": 0.00019802816901408452,
+ "loss": 25.2619,
+ "step": 15
+ },
+ {
+ "epoch": 0.28169014084507044,
+ "grad_norm": 50.333736419677734,
+ "learning_rate": 0.00019732394366197184,
+ "loss": 22.2152,
+ "step": 20
+ },
+ {
+ "epoch": 0.352112676056338,
+ "grad_norm": 16.70964241027832,
+ "learning_rate": 0.00019661971830985917,
+ "loss": 18.6695,
+ "step": 25
+ },
+ {
+ "epoch": 0.4225352112676056,
+ "grad_norm": 15.14587688446045,
+ "learning_rate": 0.0001959154929577465,
+ "loss": 14.7068,
+ "step": 30
+ },
+ {
+ "epoch": 0.49295774647887325,
+ "grad_norm": 9.351914405822754,
+ "learning_rate": 0.00019521126760563382,
+ "loss": 8.1565,
+ "step": 35
+ },
+ {
+ "epoch": 0.5633802816901409,
+ "grad_norm": 8.563234329223633,
+ "learning_rate": 0.00019450704225352114,
+ "loss": 5.4416,
+ "step": 40
+ },
+ {
+ "epoch": 0.6338028169014085,
+ "grad_norm": 6.7714691162109375,
+ "learning_rate": 0.00019380281690140847,
+ "loss": 5.0701,
+ "step": 45
+ },
+ {
+ "epoch": 0.704225352112676,
+ "grad_norm": 4.738491535186768,
+ "learning_rate": 0.0001930985915492958,
+ "loss": 4.6657,
+ "step": 50
+ },
+ {
+ "epoch": 0.7746478873239436,
+ "grad_norm": 5.511097431182861,
+ "learning_rate": 0.00019239436619718312,
+ "loss": 4.2706,
+ "step": 55
+ },
+ {
+ "epoch": 0.8450704225352113,
+ "grad_norm": 5.683650970458984,
+ "learning_rate": 0.00019169014084507045,
+ "loss": 3.8374,
+ "step": 60
+ },
+ {
+ "epoch": 0.9154929577464789,
+ "grad_norm": 4.763166427612305,
+ "learning_rate": 0.00019098591549295774,
+ "loss": 3.3471,
+ "step": 65
+ },
+ {
+ "epoch": 0.9859154929577465,
+ "grad_norm": 2.353513240814209,
+ "learning_rate": 0.00019028169014084507,
+ "loss": 2.9019,
+ "step": 70
+ },
+ {
+ "epoch": 1.0,
+ "eval_loss": 2.327265977859497,
+ "eval_runtime": 1.3559,
+ "eval_samples_per_second": 279.528,
+ "eval_steps_per_second": 2.213,
+ "step": 71
+ },
+ {
+ "epoch": 1.056338028169014,
+ "grad_norm": 2.5209953784942627,
+ "learning_rate": 0.0001895774647887324,
+ "loss": 2.5521,
+ "step": 75
+ },
+ {
+ "epoch": 1.1267605633802817,
+ "grad_norm": 0.8227373957633972,
+ "learning_rate": 0.00018887323943661972,
+ "loss": 2.3236,
+ "step": 80
+ },
+ {
+ "epoch": 1.1971830985915493,
+ "grad_norm": 0.8206098675727844,
+ "learning_rate": 0.00018816901408450705,
+ "loss": 2.1289,
+ "step": 85
+ },
+ {
+ "epoch": 1.267605633802817,
+ "grad_norm": 0.748960018157959,
+ "learning_rate": 0.00018746478873239437,
+ "loss": 1.6767,
+ "step": 90
+ },
+ {
+ "epoch": 1.3380281690140845,
+ "grad_norm": 0.4808853566646576,
+ "learning_rate": 0.0001867605633802817,
+ "loss": 1.419,
+ "step": 95
+ },
+ {
+ "epoch": 1.408450704225352,
+ "grad_norm": 1.964240312576294,
+ "learning_rate": 0.00018605633802816902,
+ "loss": 1.3036,
+ "step": 100
+ },
+ {
+ "epoch": 1.4788732394366197,
+ "grad_norm": 0.2643977403640747,
+ "learning_rate": 0.00018535211267605635,
+ "loss": 1.2568,
+ "step": 105
+ },
+ {
+ "epoch": 1.5492957746478875,
+ "grad_norm": 0.38880959153175354,
+ "learning_rate": 0.00018464788732394367,
+ "loss": 1.2002,
+ "step": 110
+ },
+ {
+ "epoch": 1.619718309859155,
+ "grad_norm": 0.22605347633361816,
+ "learning_rate": 0.000183943661971831,
+ "loss": 1.1694,
+ "step": 115
+ },
+ {
+ "epoch": 1.6901408450704225,
+ "grad_norm": 0.23560728132724762,
+ "learning_rate": 0.00018323943661971832,
+ "loss": 1.1492,
+ "step": 120
+ },
+ {
+ "epoch": 1.76056338028169,
+ "grad_norm": 0.15073446929454803,
+ "learning_rate": 0.00018253521126760565,
+ "loss": 1.1263,
+ "step": 125
+ },
+ {
+ "epoch": 1.8309859154929577,
+ "grad_norm": 0.12264095991849899,
+ "learning_rate": 0.00018183098591549298,
+ "loss": 1.1051,
+ "step": 130
+ },
+ {
+ "epoch": 1.9014084507042255,
+ "grad_norm": 0.1781918853521347,
+ "learning_rate": 0.0001811267605633803,
+ "loss": 1.0964,
+ "step": 135
+ },
+ {
+ "epoch": 1.971830985915493,
+ "grad_norm": 0.15171071887016296,
+ "learning_rate": 0.00018042253521126763,
+ "loss": 1.0779,
+ "step": 140
+ },
+ {
+ "epoch": 2.0,
+ "eval_loss": 1.0098963975906372,
+ "eval_runtime": 1.3563,
+ "eval_samples_per_second": 279.438,
+ "eval_steps_per_second": 2.212,
+ "step": 142
+ },
+ {
+ "epoch": 2.0422535211267605,
+ "grad_norm": 0.11982790380716324,
+ "learning_rate": 0.00017971830985915495,
+ "loss": 1.0746,
+ "step": 145
+ },
+ {
+ "epoch": 2.112676056338028,
+ "grad_norm": 0.11876315623521805,
+ "learning_rate": 0.00017901408450704228,
+ "loss": 1.0658,
+ "step": 150
+ },
+ {
+ "epoch": 2.183098591549296,
+ "grad_norm": 0.09626977145671844,
+ "learning_rate": 0.0001783098591549296,
+ "loss": 1.0535,
+ "step": 155
+ },
+ {
+ "epoch": 2.2535211267605635,
+ "grad_norm": 0.11824846267700195,
+ "learning_rate": 0.00017760563380281693,
+ "loss": 1.0492,
+ "step": 160
+ },
+ {
+ "epoch": 2.323943661971831,
+ "grad_norm": 0.1524844616651535,
+ "learning_rate": 0.00017690140845070425,
+ "loss": 1.04,
+ "step": 165
+ },
+ {
+ "epoch": 2.3943661971830985,
+ "grad_norm": 0.10610729455947876,
+ "learning_rate": 0.00017619718309859158,
+ "loss": 1.0355,
+ "step": 170
+ },
+ {
+ "epoch": 2.464788732394366,
+ "grad_norm": 0.09342360496520996,
+ "learning_rate": 0.0001754929577464789,
+ "loss": 1.0333,
+ "step": 175
+ },
+ {
+ "epoch": 2.535211267605634,
+ "grad_norm": 0.10479816794395447,
+ "learning_rate": 0.0001747887323943662,
+ "loss": 1.0318,
+ "step": 180
+ },
+ {
+ "epoch": 2.6056338028169015,
+ "grad_norm": 0.0711260586977005,
+ "learning_rate": 0.00017408450704225353,
+ "loss": 1.0241,
+ "step": 185
+ },
+ {
+ "epoch": 2.676056338028169,
+ "grad_norm": 0.11890017986297607,
+ "learning_rate": 0.00017338028169014086,
+ "loss": 1.0241,
+ "step": 190
+ },
+ {
+ "epoch": 2.7464788732394365,
+ "grad_norm": 0.10851440578699112,
+ "learning_rate": 0.00017267605633802818,
+ "loss": 1.018,
+ "step": 195
+ },
+ {
+ "epoch": 2.816901408450704,
+ "grad_norm": 0.1457432061433792,
+ "learning_rate": 0.0001719718309859155,
+ "loss": 1.0149,
+ "step": 200
+ },
+ {
+ "epoch": 2.887323943661972,
+ "grad_norm": 0.0658695176243782,
+ "learning_rate": 0.00017126760563380283,
+ "loss": 1.0093,
+ "step": 205
+ },
+ {
+ "epoch": 2.9577464788732395,
+ "grad_norm": 0.09453574568033218,
+ "learning_rate": 0.00017056338028169016,
+ "loss": 1.0081,
+ "step": 210
+ },
+ {
+ "epoch": 3.0,
+ "eval_loss": 0.9575695395469666,
+ "eval_runtime": 1.3565,
+ "eval_samples_per_second": 279.401,
+ "eval_steps_per_second": 2.212,
+ "step": 213
+ },
+ {
+ "epoch": 3.028169014084507,
+ "grad_norm": 0.13185155391693115,
+ "learning_rate": 0.00016985915492957746,
+ "loss": 1.0059,
+ "step": 215
+ },
+ {
+ "epoch": 3.0985915492957745,
+ "grad_norm": 0.10181483626365662,
+ "learning_rate": 0.00016915492957746478,
+ "loss": 0.9993,
+ "step": 220
+ },
+ {
+ "epoch": 3.169014084507042,
+ "grad_norm": 0.07031014561653137,
+ "learning_rate": 0.0001684507042253521,
+ "loss": 1.0002,
+ "step": 225
+ },
+ {
+ "epoch": 3.23943661971831,
+ "grad_norm": 0.08129975944757462,
+ "learning_rate": 0.00016774647887323943,
+ "loss": 0.9966,
+ "step": 230
+ },
+ {
+ "epoch": 3.3098591549295775,
+ "grad_norm": 0.061685919761657715,
+ "learning_rate": 0.00016704225352112676,
+ "loss": 0.9986,
+ "step": 235
+ },
+ {
+ "epoch": 3.380281690140845,
+ "grad_norm": 0.10376816242933273,
+ "learning_rate": 0.00016633802816901408,
+ "loss": 0.9914,
+ "step": 240
+ },
+ {
+ "epoch": 3.4507042253521125,
+ "grad_norm": 0.16595374047756195,
+ "learning_rate": 0.0001656338028169014,
+ "loss": 0.9944,
+ "step": 245
+ },
+ {
+ "epoch": 3.52112676056338,
+ "grad_norm": 0.0959262102842331,
+ "learning_rate": 0.00016492957746478873,
+ "loss": 0.99,
+ "step": 250
+ },
+ {
+ "epoch": 3.591549295774648,
+ "grad_norm": 0.04954326152801514,
+ "learning_rate": 0.00016422535211267606,
+ "loss": 0.9887,
+ "step": 255
+ },
+ {
+ "epoch": 3.6619718309859155,
+ "grad_norm": 0.09042917937040329,
+ "learning_rate": 0.00016352112676056339,
+ "loss": 0.989,
+ "step": 260
+ },
+ {
+ "epoch": 3.732394366197183,
+ "grad_norm": 0.1556464284658432,
+ "learning_rate": 0.0001628169014084507,
+ "loss": 0.9914,
+ "step": 265
+ },
+ {
+ "epoch": 3.802816901408451,
+ "grad_norm": 0.0660383403301239,
+ "learning_rate": 0.00016211267605633804,
+ "loss": 0.9874,
+ "step": 270
+ },
+ {
+ "epoch": 3.873239436619718,
+ "grad_norm": 0.08755503594875336,
+ "learning_rate": 0.00016140845070422536,
+ "loss": 0.9823,
+ "step": 275
+ },
+ {
+ "epoch": 3.943661971830986,
+ "grad_norm": 0.07488470524549484,
+ "learning_rate": 0.0001607042253521127,
+ "loss": 0.9846,
+ "step": 280
+ },
+ {
+ "epoch": 4.0,
+ "eval_loss": 0.9477686285972595,
+ "eval_runtime": 1.356,
+ "eval_samples_per_second": 279.49,
+ "eval_steps_per_second": 2.212,
+ "step": 284
+ },
+ {
+ "epoch": 4.014084507042254,
+ "grad_norm": 0.0663798451423645,
+ "learning_rate": 0.00016,
+ "loss": 0.9813,
+ "step": 285
+ },
+ {
+ "epoch": 4.084507042253521,
+ "grad_norm": 0.07960853725671768,
+ "learning_rate": 0.00015929577464788734,
+ "loss": 0.9834,
+ "step": 290
+ },
+ {
+ "epoch": 4.154929577464789,
+ "grad_norm": 0.0459803082048893,
+ "learning_rate": 0.00015859154929577466,
+ "loss": 0.9813,
+ "step": 295
+ },
+ {
+ "epoch": 4.225352112676056,
+ "grad_norm": 1.369140386581421,
+ "learning_rate": 0.00015788732394366196,
+ "loss": 0.9847,
+ "step": 300
+ },
+ {
+ "epoch": 4.295774647887324,
+ "grad_norm": 5.1048102378845215,
+ "learning_rate": 0.0001571830985915493,
+ "loss": 0.9825,
+ "step": 305
+ },
+ {
+ "epoch": 4.366197183098592,
+ "grad_norm": 0.722586452960968,
+ "learning_rate": 0.00015647887323943661,
+ "loss": 0.9782,
+ "step": 310
+ },
+ {
+ "epoch": 4.436619718309859,
+ "grad_norm": 0.11551281064748764,
+ "learning_rate": 0.00015577464788732394,
+ "loss": 0.9775,
+ "step": 315
+ },
+ {
+ "epoch": 4.507042253521127,
+ "grad_norm": 0.07706153392791748,
+ "learning_rate": 0.00015507042253521126,
+ "loss": 0.9765,
+ "step": 320
+ },
+ {
+ "epoch": 4.577464788732394,
+ "grad_norm": 0.13360825181007385,
+ "learning_rate": 0.0001543661971830986,
+ "loss": 0.9825,
+ "step": 325
+ },
+ {
+ "epoch": 4.647887323943662,
+ "grad_norm": 0.0996808111667633,
+ "learning_rate": 0.00015366197183098592,
+ "loss": 0.9755,
+ "step": 330
+ },
+ {
+ "epoch": 4.71830985915493,
+ "grad_norm": 0.064101941883564,
+ "learning_rate": 0.00015295774647887324,
+ "loss": 0.9747,
+ "step": 335
+ },
+ {
+ "epoch": 4.788732394366197,
+ "grad_norm": 0.0587599016726017,
+ "learning_rate": 0.00015225352112676057,
+ "loss": 0.9762,
+ "step": 340
+ },
+ {
+ "epoch": 4.859154929577465,
+ "grad_norm": 0.08781857788562775,
+ "learning_rate": 0.0001515492957746479,
+ "loss": 0.9725,
+ "step": 345
+ },
+ {
+ "epoch": 4.929577464788732,
+ "grad_norm": 0.09201648831367493,
+ "learning_rate": 0.00015084507042253522,
+ "loss": 0.9751,
+ "step": 350
+ },
+ {
+ "epoch": 5.0,
+ "grad_norm": 0.07896555215120316,
+ "learning_rate": 0.00015014084507042254,
+ "loss": 0.974,
+ "step": 355
+ },
+ {
+ "epoch": 5.0,
+ "eval_loss": 0.9434144496917725,
+ "eval_runtime": 1.3282,
+ "eval_samples_per_second": 285.356,
+ "eval_steps_per_second": 2.259,
+ "step": 355
+ },
+ {
+ "epoch": 5.070422535211268,
+ "grad_norm": 0.06868085265159607,
+ "learning_rate": 0.00014943661971830987,
+ "loss": 0.9773,
+ "step": 360
+ },
+ {
+ "epoch": 5.140845070422535,
+ "grad_norm": 0.06904036551713943,
+ "learning_rate": 0.0001487323943661972,
+ "loss": 0.9724,
+ "step": 365
+ },
+ {
+ "epoch": 5.211267605633803,
+ "grad_norm": 0.05451333895325661,
+ "learning_rate": 0.00014802816901408452,
+ "loss": 0.9716,
+ "step": 370
+ },
+ {
+ "epoch": 5.28169014084507,
+ "grad_norm": 0.08863852173089981,
+ "learning_rate": 0.00014732394366197185,
+ "loss": 0.9692,
+ "step": 375
+ },
+ {
+ "epoch": 5.352112676056338,
+ "grad_norm": 0.06055117025971413,
+ "learning_rate": 0.00014661971830985917,
+ "loss": 0.9713,
+ "step": 380
+ },
+ {
+ "epoch": 5.422535211267606,
+ "grad_norm": 0.10994315892457962,
+ "learning_rate": 0.0001459154929577465,
+ "loss": 0.9707,
+ "step": 385
+ },
+ {
+ "epoch": 5.492957746478873,
+ "grad_norm": 0.09731490910053253,
+ "learning_rate": 0.00014521126760563382,
+ "loss": 0.9694,
+ "step": 390
+ },
+ {
+ "epoch": 5.563380281690141,
+ "grad_norm": 0.05324326828122139,
+ "learning_rate": 0.00014450704225352115,
+ "loss": 0.9666,
+ "step": 395
+ },
+ {
+ "epoch": 5.633802816901408,
+ "grad_norm": 0.05008189380168915,
+ "learning_rate": 0.00014380281690140847,
+ "loss": 0.9672,
+ "step": 400
+ },
+ {
+ "epoch": 5.704225352112676,
+ "grad_norm": 0.0279605221003294,
+ "learning_rate": 0.0001430985915492958,
+ "loss": 0.9676,
+ "step": 405
+ },
+ {
+ "epoch": 5.774647887323944,
+ "grad_norm": 0.0670681744813919,
+ "learning_rate": 0.00014239436619718312,
+ "loss": 0.9654,
+ "step": 410
+ },
+ {
+ "epoch": 5.845070422535211,
+ "grad_norm": 0.04129045829176903,
+ "learning_rate": 0.00014169014084507045,
+ "loss": 0.9659,
+ "step": 415
+ },
+ {
+ "epoch": 5.915492957746479,
+ "grad_norm": 0.04552697390317917,
+ "learning_rate": 0.00014098591549295775,
+ "loss": 0.9675,
+ "step": 420
+ },
+ {
+ "epoch": 5.985915492957746,
+ "grad_norm": 0.05515826866030693,
+ "learning_rate": 0.00014028169014084507,
+ "loss": 0.967,
+ "step": 425
+ },
+ {
+ "epoch": 6.0,
+ "eval_loss": 0.9389934539794922,
+ "eval_runtime": 1.3536,
+ "eval_samples_per_second": 279.994,
+ "eval_steps_per_second": 2.216,
+ "step": 426
+ },
+ {
+ "epoch": 6.056338028169014,
+ "grad_norm": 0.060634512454271317,
+ "learning_rate": 0.0001395774647887324,
+ "loss": 0.9661,
+ "step": 430
+ },
+ {
+ "epoch": 6.126760563380282,
+ "grad_norm": 0.035951510071754456,
+ "learning_rate": 0.00013887323943661972,
+ "loss": 0.9673,
+ "step": 435
+ },
+ {
+ "epoch": 6.197183098591549,
+ "grad_norm": 0.07630197703838348,
+ "learning_rate": 0.00013816901408450705,
+ "loss": 0.969,
+ "step": 440
+ },
+ {
+ "epoch": 6.267605633802817,
+ "grad_norm": 0.048709649592638016,
+ "learning_rate": 0.00013746478873239438,
+ "loss": 0.9605,
+ "step": 445
+ },
+ {
+ "epoch": 6.338028169014084,
+ "grad_norm": 0.08284774422645569,
+ "learning_rate": 0.0001367605633802817,
+ "loss": 0.9647,
+ "step": 450
+ },
+ {
+ "epoch": 6.408450704225352,
+ "grad_norm": 0.10234501212835312,
+ "learning_rate": 0.00013605633802816903,
+ "loss": 0.9649,
+ "step": 455
+ },
+ {
+ "epoch": 6.47887323943662,
+ "grad_norm": 0.07582994550466537,
+ "learning_rate": 0.00013535211267605635,
+ "loss": 0.9637,
+ "step": 460
+ },
+ {
+ "epoch": 6.549295774647887,
+ "grad_norm": 0.032033782452344894,
+ "learning_rate": 0.00013464788732394368,
+ "loss": 0.9616,
+ "step": 465
+ },
+ {
+ "epoch": 6.619718309859155,
+ "grad_norm": 0.029730021953582764,
+ "learning_rate": 0.000133943661971831,
+ "loss": 0.961,
+ "step": 470
+ },
+ {
+ "epoch": 6.690140845070422,
+ "grad_norm": 0.036565788090229034,
+ "learning_rate": 0.00013323943661971833,
+ "loss": 0.9613,
+ "step": 475
+ },
+ {
+ "epoch": 6.76056338028169,
+ "grad_norm": 0.046678755432367325,
+ "learning_rate": 0.00013253521126760565,
+ "loss": 0.9631,
+ "step": 480
+ },
+ {
+ "epoch": 6.830985915492958,
+ "grad_norm": 0.11235543340444565,
+ "learning_rate": 0.00013183098591549295,
+ "loss": 0.9617,
+ "step": 485
+ },
+ {
+ "epoch": 6.901408450704225,
+ "grad_norm": 0.04812972992658615,
+ "learning_rate": 0.00013112676056338028,
+ "loss": 0.9599,
+ "step": 490
+ },
+ {
+ "epoch": 6.971830985915493,
+ "grad_norm": 0.05159539356827736,
+ "learning_rate": 0.0001304225352112676,
+ "loss": 0.9622,
+ "step": 495
+ },
+ {
+ "epoch": 7.0,
+ "eval_loss": 0.938995361328125,
+ "eval_runtime": 1.3575,
757
+ "eval_samples_per_second": 279.181,
758
+ "eval_steps_per_second": 2.21,
759
+ "step": 497
760
+ },
761
+ {
762
+ "epoch": 7.042253521126761,
763
+ "grad_norm": 0.046111464500427246,
764
+ "learning_rate": 0.00012971830985915493,
765
+ "loss": 0.9592,
766
+ "step": 500
767
+ },
768
+ {
769
+ "epoch": 7.112676056338028,
770
+ "grad_norm": 0.09057960659265518,
771
+ "learning_rate": 0.00012901408450704226,
772
+ "loss": 0.9605,
773
+ "step": 505
774
+ },
775
+ {
776
+ "epoch": 7.183098591549296,
777
+ "grad_norm": 0.06652987748384476,
778
+ "learning_rate": 0.00012830985915492958,
779
+ "loss": 0.9595,
780
+ "step": 510
781
+ },
782
+ {
783
+ "epoch": 7.253521126760563,
784
+ "grad_norm": 0.05949646607041359,
785
+ "learning_rate": 0.0001276056338028169,
786
+ "loss": 0.9586,
787
+ "step": 515
788
+ },
789
+ {
790
+ "epoch": 7.323943661971831,
791
+ "grad_norm": 0.04339510202407837,
792
+ "learning_rate": 0.00012690140845070423,
793
+ "loss": 0.9594,
794
+ "step": 520
795
+ },
796
+ {
797
+ "epoch": 7.394366197183099,
798
+ "grad_norm": 0.06728670001029968,
799
+ "learning_rate": 0.00012619718309859156,
800
+ "loss": 0.9613,
801
+ "step": 525
802
+ },
803
+ {
804
+ "epoch": 7.464788732394366,
805
+ "grad_norm": 0.03795556724071503,
806
+ "learning_rate": 0.00012549295774647888,
807
+ "loss": 0.9563,
808
+ "step": 530
809
+ },
810
+ {
811
+ "epoch": 7.535211267605634,
812
+ "grad_norm": 0.0375334694981575,
813
+ "learning_rate": 0.00012478873239436618,
814
+ "loss": 0.9601,
815
+ "step": 535
816
+ },
817
+ {
818
+ "epoch": 7.605633802816901,
819
+ "grad_norm": 0.057856108993291855,
820
+ "learning_rate": 0.0001240845070422535,
821
+ "loss": 0.9616,
822
+ "step": 540
823
+ },
824
+ {
825
+ "epoch": 7.676056338028169,
826
+ "grad_norm": 0.03536657989025116,
827
+ "learning_rate": 0.00012338028169014083,
828
+ "loss": 0.9557,
829
+ "step": 545
830
+ },
831
+ {
832
+ "epoch": 7.746478873239437,
833
+ "grad_norm": 0.05437963828444481,
834
+ "learning_rate": 0.00012267605633802816,
835
+ "loss": 0.9596,
836
+ "step": 550
837
+ },
838
+ {
839
+ "epoch": 7.816901408450704,
840
+ "grad_norm": 0.0574585422873497,
841
+ "learning_rate": 0.0001219718309859155,
842
+ "loss": 0.9583,
843
+ "step": 555
844
+ },
845
+ {
846
+ "epoch": 7.887323943661972,
847
+ "grad_norm": 0.0447322279214859,
848
+ "learning_rate": 0.00012126760563380282,
849
+ "loss": 0.9599,
850
+ "step": 560
851
+ },
852
+ {
853
+ "epoch": 7.957746478873239,
854
+ "grad_norm": 0.06250979751348495,
855
+ "learning_rate": 0.00012056338028169015,
856
+ "loss": 0.9581,
857
+ "step": 565
858
+ },
859
+ {
860
+ "epoch": 8.0,
861
+ "eval_loss": 0.9374349117279053,
862
+ "eval_runtime": 1.356,
863
+ "eval_samples_per_second": 279.498,
864
+ "eval_steps_per_second": 2.212,
865
+ "step": 568
866
+ },
867
+ {
868
+ "epoch": 8.028169014084508,
869
+ "grad_norm": 0.02564265951514244,
870
+ "learning_rate": 0.00011985915492957746,
871
+ "loss": 0.9574,
872
+ "step": 570
873
+ },
874
+ {
875
+ "epoch": 8.098591549295774,
876
+ "grad_norm": 0.047478966414928436,
877
+ "learning_rate": 0.00011915492957746479,
878
+ "loss": 0.9538,
879
+ "step": 575
880
+ },
881
+ {
882
+ "epoch": 8.169014084507042,
883
+ "grad_norm": 0.04486701637506485,
884
+ "learning_rate": 0.00011845070422535211,
885
+ "loss": 0.957,
886
+ "step": 580
887
+ },
888
+ {
889
+ "epoch": 8.23943661971831,
890
+ "grad_norm": 0.03243768587708473,
891
+ "learning_rate": 0.00011774647887323944,
892
+ "loss": 0.9549,
893
+ "step": 585
894
+ },
895
+ {
896
+ "epoch": 8.309859154929578,
897
+ "grad_norm": 0.023309363052248955,
898
+ "learning_rate": 0.00011704225352112676,
899
+ "loss": 0.9552,
900
+ "step": 590
901
+ },
902
+ {
903
+ "epoch": 8.380281690140846,
904
+ "grad_norm": 0.06186755374073982,
905
+ "learning_rate": 0.00011633802816901409,
906
+ "loss": 0.956,
907
+ "step": 595
908
+ },
909
+ {
910
+ "epoch": 8.450704225352112,
911
+ "grad_norm": 0.033569905906915665,
912
+ "learning_rate": 0.00011563380281690141,
913
+ "loss": 0.9558,
914
+ "step": 600
915
+ },
916
+ {
917
+ "epoch": 8.52112676056338,
918
+ "grad_norm": 0.02437894605100155,
919
+ "learning_rate": 0.00011492957746478874,
920
+ "loss": 0.9539,
921
+ "step": 605
922
+ },
923
+ {
924
+ "epoch": 8.591549295774648,
925
+ "grad_norm": 0.0500415675342083,
926
+ "learning_rate": 0.00011422535211267606,
927
+ "loss": 0.9567,
928
+ "step": 610
929
+ },
930
+ {
931
+ "epoch": 8.661971830985916,
932
+ "grad_norm": 0.03755724057555199,
933
+ "learning_rate": 0.00011352112676056339,
934
+ "loss": 0.9561,
935
+ "step": 615
936
+ },
937
+ {
938
+ "epoch": 8.732394366197184,
939
+ "grad_norm": 0.06001519784331322,
940
+ "learning_rate": 0.00011281690140845072,
941
+ "loss": 0.9587,
942
+ "step": 620
943
+ },
944
+ {
945
+ "epoch": 8.80281690140845,
946
+ "grad_norm": 0.03620074316859245,
947
+ "learning_rate": 0.00011211267605633804,
948
+ "loss": 0.9541,
949
+ "step": 625
950
+ },
951
+ {
952
+ "epoch": 8.873239436619718,
953
+ "grad_norm": 0.052633192390203476,
954
+ "learning_rate": 0.00011140845070422537,
955
+ "loss": 0.9552,
956
+ "step": 630
957
+ },
958
+ {
959
+ "epoch": 8.943661971830986,
960
+ "grad_norm": 0.06923436373472214,
961
+ "learning_rate": 0.00011070422535211269,
962
+ "loss": 0.9562,
963
+ "step": 635
964
+ },
965
+ {
966
+ "epoch": 9.0,
967
+ "eval_loss": 0.9356324672698975,
968
+ "eval_runtime": 1.356,
969
+ "eval_samples_per_second": 279.507,
970
+ "eval_steps_per_second": 2.212,
971
+ "step": 639
972
+ },
973
+ {
974
+ "epoch": 9.014084507042254,
975
+ "grad_norm": 0.030308526009321213,
976
+ "learning_rate": 0.00011000000000000002,
977
+ "loss": 0.9567,
978
+ "step": 640
979
+ },
980
+ {
981
+ "epoch": 9.084507042253522,
982
+ "grad_norm": 0.04463675990700722,
983
+ "learning_rate": 0.00010929577464788734,
984
+ "loss": 0.9534,
985
+ "step": 645
986
+ },
987
+ {
988
+ "epoch": 9.154929577464788,
989
+ "grad_norm": 0.07801490277051926,
990
+ "learning_rate": 0.00010859154929577467,
991
+ "loss": 0.9532,
992
+ "step": 650
993
+ },
994
+ {
995
+ "epoch": 9.225352112676056,
996
+ "grad_norm": 0.07350599020719528,
997
+ "learning_rate": 0.00010788732394366197,
998
+ "loss": 0.9573,
999
+ "step": 655
1000
+ },
1001
+ {
1002
+ "epoch": 9.295774647887324,
1003
+ "grad_norm": 0.04489121586084366,
1004
+ "learning_rate": 0.00010718309859154929,
1005
+ "loss": 0.9529,
1006
+ "step": 660
1007
+ },
1008
+ {
1009
+ "epoch": 9.366197183098592,
1010
+ "grad_norm": 0.03186199814081192,
1011
+ "learning_rate": 0.00010647887323943662,
1012
+ "loss": 0.9547,
1013
+ "step": 665
1014
+ },
1015
+ {
1016
+ "epoch": 9.43661971830986,
1017
+ "grad_norm": 0.050086986273527145,
1018
+ "learning_rate": 0.00010577464788732394,
1019
+ "loss": 0.9537,
1020
+ "step": 670
1021
+ },
1022
+ {
1023
+ "epoch": 9.507042253521126,
1024
+ "grad_norm": 0.037186432629823685,
1025
+ "learning_rate": 0.00010507042253521127,
1026
+ "loss": 0.9526,
1027
+ "step": 675
1028
+ },
1029
+ {
1030
+ "epoch": 9.577464788732394,
1031
+ "grad_norm": 0.07604589313268661,
1032
+ "learning_rate": 0.0001043661971830986,
1033
+ "loss": 0.9548,
1034
+ "step": 680
1035
+ },
1036
+ {
1037
+ "epoch": 9.647887323943662,
1038
+ "grad_norm": 0.0647517517209053,
1039
+ "learning_rate": 0.00010366197183098592,
1040
+ "loss": 0.9565,
1041
+ "step": 685
1042
+ },
1043
+ {
1044
+ "epoch": 9.71830985915493,
1045
+ "grad_norm": 0.0681651309132576,
1046
+ "learning_rate": 0.00010295774647887325,
1047
+ "loss": 0.9533,
1048
+ "step": 690
1049
+ },
1050
+ {
1051
+ "epoch": 9.788732394366198,
1052
+ "grad_norm": 0.04616454616189003,
1053
+ "learning_rate": 0.00010225352112676057,
1054
+ "loss": 0.9529,
1055
+ "step": 695
1056
+ },
1057
+ {
1058
+ "epoch": 9.859154929577464,
1059
+ "grad_norm": 0.029675917699933052,
1060
+ "learning_rate": 0.0001015492957746479,
1061
+ "loss": 0.9521,
1062
+ "step": 700
1063
+ },
1064
+ {
1065
+ "epoch": 9.929577464788732,
1066
+ "grad_norm": 0.025709936395287514,
1067
+ "learning_rate": 0.00010084507042253521,
1068
+ "loss": 0.9494,
1069
+ "step": 705
1070
+ },
1071
+ {
1072
+ "epoch": 10.0,
1073
+ "grad_norm": 0.02850981615483761,
1074
+ "learning_rate": 0.00010014084507042253,
1075
+ "loss": 0.9522,
1076
+ "step": 710
1077
+ },
1078
+ {
1079
+ "epoch": 10.0,
1080
+ "eval_loss": 0.9344412684440613,
1081
+ "eval_runtime": 1.3287,
1082
+ "eval_samples_per_second": 285.24,
1083
+ "eval_steps_per_second": 2.258,
1084
+ "step": 710
1085
+ },
1086
+ {
1087
+ "epoch": 10.070422535211268,
1088
+ "grad_norm": 0.05556517466902733,
1089
+ "learning_rate": 9.943661971830986e-05,
1090
+ "loss": 0.9529,
1091
+ "step": 715
1092
+ },
1093
+ {
1094
+ "epoch": 10.140845070422536,
1095
+ "grad_norm": 0.06735611706972122,
1096
+ "learning_rate": 9.873239436619719e-05,
1097
+ "loss": 0.9531,
1098
+ "step": 720
1099
+ },
1100
+ {
1101
+ "epoch": 10.211267605633802,
1102
+ "grad_norm": 0.05679089203476906,
1103
+ "learning_rate": 9.802816901408451e-05,
1104
+ "loss": 0.9511,
1105
+ "step": 725
1106
+ },
1107
+ {
1108
+ "epoch": 10.28169014084507,
1109
+ "grad_norm": 0.09331786632537842,
1110
+ "learning_rate": 9.732394366197184e-05,
1111
+ "loss": 0.9509,
1112
+ "step": 730
1113
+ },
1114
+ {
1115
+ "epoch": 10.352112676056338,
1116
+ "grad_norm": 0.06282954663038254,
1117
+ "learning_rate": 9.661971830985916e-05,
1118
+ "loss": 0.952,
1119
+ "step": 735
1120
+ },
1121
+ {
1122
+ "epoch": 10.422535211267606,
1123
+ "grad_norm": 0.06508725136518478,
1124
+ "learning_rate": 9.591549295774649e-05,
1125
+ "loss": 0.9534,
1126
+ "step": 740
1127
+ },
1128
+ {
1129
+ "epoch": 10.492957746478874,
1130
+ "grad_norm": 0.041396528482437134,
1131
+ "learning_rate": 9.52112676056338e-05,
1132
+ "loss": 0.9537,
1133
+ "step": 745
1134
+ },
1135
+ {
1136
+ "epoch": 10.56338028169014,
1137
+ "grad_norm": 0.04602223262190819,
1138
+ "learning_rate": 9.450704225352112e-05,
1139
+ "loss": 0.951,
1140
+ "step": 750
1141
+ },
1142
+ {
1143
+ "epoch": 10.633802816901408,
1144
+ "grad_norm": 0.03372340276837349,
1145
+ "learning_rate": 9.380281690140845e-05,
1146
+ "loss": 0.9498,
1147
+ "step": 755
1148
+ },
1149
+ {
1150
+ "epoch": 10.704225352112676,
1151
+ "grad_norm": 0.03808495029807091,
1152
+ "learning_rate": 9.309859154929578e-05,
1153
+ "loss": 0.9504,
1154
+ "step": 760
1155
+ },
1156
+ {
1157
+ "epoch": 10.774647887323944,
1158
+ "grad_norm": 0.027203522622585297,
1159
+ "learning_rate": 9.23943661971831e-05,
1160
+ "loss": 0.9514,
1161
+ "step": 765
1162
+ },
1163
+ {
1164
+ "epoch": 10.845070422535212,
1165
+ "grad_norm": 0.03273136168718338,
1166
+ "learning_rate": 9.169014084507043e-05,
1167
+ "loss": 0.9496,
1168
+ "step": 770
1169
+ },
1170
+ {
1171
+ "epoch": 10.915492957746478,
1172
+ "grad_norm": 0.03632248193025589,
1173
+ "learning_rate": 9.098591549295775e-05,
1174
+ "loss": 0.9518,
1175
+ "step": 775
1176
+ },
1177
+ {
1178
+ "epoch": 10.985915492957746,
1179
+ "grad_norm": 0.04196527600288391,
1180
+ "learning_rate": 9.028169014084508e-05,
1181
+ "loss": 0.9493,
1182
+ "step": 780
1183
+ },
1184
+ {
1185
+ "epoch": 11.0,
1186
+ "eval_loss": 0.9340574145317078,
1187
+ "eval_runtime": 1.3554,
1188
+ "eval_samples_per_second": 279.624,
1189
+ "eval_steps_per_second": 2.213,
1190
+ "step": 781
1191
+ },
1192
+ {
1193
+ "epoch": 11.056338028169014,
1194
+ "grad_norm": 0.02801569737493992,
1195
+ "learning_rate": 8.95774647887324e-05,
1196
+ "loss": 0.9506,
1197
+ "step": 785
1198
+ },
1199
+ {
1200
+ "epoch": 11.126760563380282,
1201
+ "grad_norm": 0.07875282317399979,
1202
+ "learning_rate": 8.887323943661973e-05,
1203
+ "loss": 0.9527,
1204
+ "step": 790
1205
+ },
1206
+ {
1207
+ "epoch": 11.19718309859155,
1208
+ "grad_norm": 0.06978793442249298,
1209
+ "learning_rate": 8.816901408450705e-05,
1210
+ "loss": 0.9513,
1211
+ "step": 795
1212
+ },
1213
+ {
1214
+ "epoch": 11.267605633802816,
1215
+ "grad_norm": 0.05102715268731117,
1216
+ "learning_rate": 8.746478873239437e-05,
1217
+ "loss": 0.9493,
1218
+ "step": 800
1219
+ },
1220
+ {
1221
+ "epoch": 11.338028169014084,
1222
+ "grad_norm": 0.037087179720401764,
1223
+ "learning_rate": 8.676056338028169e-05,
1224
+ "loss": 0.9482,
1225
+ "step": 805
1226
+ },
1227
+ {
1228
+ "epoch": 11.408450704225352,
1229
+ "grad_norm": 0.026025516912341118,
1230
+ "learning_rate": 8.605633802816902e-05,
1231
+ "loss": 0.9499,
1232
+ "step": 810
1233
+ },
1234
+ {
1235
+ "epoch": 11.47887323943662,
1236
+ "grad_norm": 0.046463027596473694,
1237
+ "learning_rate": 8.535211267605634e-05,
1238
+ "loss": 0.9526,
1239
+ "step": 815
1240
+ },
1241
+ {
1242
+ "epoch": 11.549295774647888,
1243
+ "grad_norm": 0.03885842487215996,
1244
+ "learning_rate": 8.464788732394367e-05,
1245
+ "loss": 0.9508,
1246
+ "step": 820
1247
+ },
1248
+ {
1249
+ "epoch": 11.619718309859154,
1250
+ "grad_norm": 0.042792994529008865,
1251
+ "learning_rate": 8.3943661971831e-05,
1252
+ "loss": 0.9491,
1253
+ "step": 825
1254
+ },
1255
+ {
1256
+ "epoch": 11.690140845070422,
1257
+ "grad_norm": 0.040800243616104126,
1258
+ "learning_rate": 8.323943661971832e-05,
1259
+ "loss": 0.9502,
1260
+ "step": 830
1261
+ },
1262
+ {
1263
+ "epoch": 11.76056338028169,
1264
+ "grad_norm": 0.04244118928909302,
1265
+ "learning_rate": 8.253521126760565e-05,
1266
+ "loss": 0.9463,
1267
+ "step": 835
1268
+ },
1269
+ {
1270
+ "epoch": 11.830985915492958,
1271
+ "grad_norm": 0.027077561244368553,
1272
+ "learning_rate": 8.183098591549296e-05,
1273
+ "loss": 0.9486,
1274
+ "step": 840
1275
+ },
1276
+ {
1277
+ "epoch": 11.901408450704226,
1278
+ "grad_norm": 0.0710897371172905,
1279
+ "learning_rate": 8.112676056338028e-05,
1280
+ "loss": 0.9498,
1281
+ "step": 845
1282
+ },
1283
+ {
1284
+ "epoch": 11.971830985915492,
1285
+ "grad_norm": 0.027009285986423492,
1286
+ "learning_rate": 8.042253521126761e-05,
1287
+ "loss": 0.9492,
1288
+ "step": 850
1289
+ },
1290
+ {
1291
+ "epoch": 12.0,
1292
+ "eval_loss": 0.934472382068634,
1293
+ "eval_runtime": 1.3559,
1294
+ "eval_samples_per_second": 279.512,
1295
+ "eval_steps_per_second": 2.212,
1296
+ "step": 852
1297
+ },
1298
+ {
1299
+ "epoch": 12.04225352112676,
1300
+ "grad_norm": 0.027241826057434082,
1301
+ "learning_rate": 7.971830985915493e-05,
1302
+ "loss": 0.9485,
1303
+ "step": 855
1304
+ },
1305
+ {
1306
+ "epoch": 12.112676056338028,
1307
+ "grad_norm": 17.60222816467285,
1308
+ "learning_rate": 7.901408450704225e-05,
1309
+ "loss": 0.9505,
1310
+ "step": 860
1311
+ },
1312
+ {
1313
+ "epoch": 12.183098591549296,
1314
+ "grad_norm": 0.04522601515054703,
1315
+ "learning_rate": 7.830985915492957e-05,
1316
+ "loss": 0.9511,
1317
+ "step": 865
1318
+ },
1319
+ {
1320
+ "epoch": 12.253521126760564,
1321
+ "grad_norm": 0.0453733466565609,
1322
+ "learning_rate": 7.76056338028169e-05,
1323
+ "loss": 0.9488,
1324
+ "step": 870
1325
+ },
1326
+ {
1327
+ "epoch": 12.323943661971832,
1328
+ "grad_norm": 0.04800290986895561,
1329
+ "learning_rate": 7.690140845070422e-05,
1330
+ "loss": 0.9484,
1331
+ "step": 875
1332
+ },
1333
+ {
1334
+ "epoch": 12.394366197183098,
1335
+ "grad_norm": 0.056329064071178436,
1336
+ "learning_rate": 7.619718309859155e-05,
1337
+ "loss": 0.9488,
1338
+ "step": 880
1339
+ },
1340
+ {
1341
+ "epoch": 12.464788732394366,
1342
+ "grad_norm": 0.04998723790049553,
1343
+ "learning_rate": 7.549295774647887e-05,
1344
+ "loss": 0.9491,
1345
+ "step": 885
1346
+ },
1347
+ {
1348
+ "epoch": 12.535211267605634,
1349
+ "grad_norm": 0.024647079408168793,
1350
+ "learning_rate": 7.47887323943662e-05,
1351
+ "loss": 0.9474,
1352
+ "step": 890
1353
+ },
1354
+ {
1355
+ "epoch": 12.605633802816902,
1356
+ "grad_norm": 0.06089835241436958,
1357
+ "learning_rate": 7.408450704225352e-05,
1358
+ "loss": 0.9487,
1359
+ "step": 895
1360
+ },
1361
+ {
1362
+ "epoch": 12.676056338028168,
1363
+ "grad_norm": 0.07820327579975128,
1364
+ "learning_rate": 7.338028169014085e-05,
1365
+ "loss": 0.9486,
1366
+ "step": 900
1367
+ },
1368
+ {
1369
+ "epoch": 12.746478873239436,
1370
+ "grad_norm": 0.03777018189430237,
1371
+ "learning_rate": 7.267605633802818e-05,
1372
+ "loss": 0.9483,
1373
+ "step": 905
1374
+ },
1375
+ {
1376
+ "epoch": 12.816901408450704,
1377
+ "grad_norm": 0.038660142570734024,
1378
+ "learning_rate": 7.19718309859155e-05,
1379
+ "loss": 0.9478,
1380
+ "step": 910
1381
+ },
1382
+ {
1383
+ "epoch": 12.887323943661972,
1384
+ "grad_norm": 0.028829969465732574,
1385
+ "learning_rate": 7.126760563380283e-05,
1386
+ "loss": 0.9471,
1387
+ "step": 915
1388
+ },
1389
+ {
1390
+ "epoch": 12.95774647887324,
1391
+ "grad_norm": 0.02644144371151924,
1392
+ "learning_rate": 7.056338028169014e-05,
1393
+ "loss": 0.9493,
1394
+ "step": 920
1395
+ },
1396
+ {
1397
+ "epoch": 13.0,
1398
+ "eval_loss": 0.9329857230186462,
1399
+ "eval_runtime": 1.3566,
1400
+ "eval_samples_per_second": 279.384,
1401
+ "eval_steps_per_second": 2.211,
1402
+ "step": 923
1403
+ },
1404
+ {
1405
+ "epoch": 13.028169014084508,
1406
+ "grad_norm": 0.027125298976898193,
1407
+ "learning_rate": 6.985915492957746e-05,
1408
+ "loss": 0.948,
1409
+ "step": 925
1410
+ },
1411
+ {
1412
+ "epoch": 13.098591549295774,
1413
+ "grad_norm": 0.025396650657057762,
1414
+ "learning_rate": 6.915492957746479e-05,
1415
+ "loss": 0.9485,
1416
+ "step": 930
1417
+ },
1418
+ {
1419
+ "epoch": 13.169014084507042,
1420
+ "grad_norm": 0.039986852556467056,
1421
+ "learning_rate": 6.845070422535212e-05,
1422
+ "loss": 0.948,
1423
+ "step": 935
1424
+ },
1425
+ {
1426
+ "epoch": 13.23943661971831,
1427
+ "grad_norm": 0.04669572040438652,
1428
+ "learning_rate": 6.774647887323944e-05,
1429
+ "loss": 0.9484,
1430
+ "step": 940
1431
+ },
1432
+ {
1433
+ "epoch": 13.309859154929578,
1434
+ "grad_norm": 0.05597477778792381,
1435
+ "learning_rate": 6.704225352112677e-05,
1436
+ "loss": 0.9492,
1437
+ "step": 945
1438
+ },
1439
+ {
1440
+ "epoch": 13.380281690140846,
1441
+ "grad_norm": 0.03363762050867081,
1442
+ "learning_rate": 6.633802816901409e-05,
1443
+ "loss": 0.9479,
1444
+ "step": 950
1445
+ },
1446
+ {
1447
+ "epoch": 13.450704225352112,
1448
+ "grad_norm": 0.0501028336584568,
1449
+ "learning_rate": 6.563380281690142e-05,
1450
+ "loss": 0.9472,
1451
+ "step": 955
1452
+ },
1453
+ {
1454
+ "epoch": 13.52112676056338,
1455
+ "grad_norm": 0.02368428371846676,
1456
+ "learning_rate": 6.492957746478874e-05,
1457
+ "loss": 0.9473,
1458
+ "step": 960
1459
+ },
1460
+ {
1461
+ "epoch": 13.591549295774648,
1462
+ "grad_norm": 0.024430401623249054,
1463
+ "learning_rate": 6.422535211267607e-05,
1464
+ "loss": 0.9461,
1465
+ "step": 965
1466
+ },
1467
+ {
1468
+ "epoch": 13.661971830985916,
1469
+ "grad_norm": 0.036314696073532104,
1470
+ "learning_rate": 6.35211267605634e-05,
1471
+ "loss": 0.9469,
1472
+ "step": 970
1473
+ },
1474
+ {
1475
+ "epoch": 13.732394366197184,
1476
+ "grad_norm": 0.03353135287761688,
1477
+ "learning_rate": 6.28169014084507e-05,
1478
+ "loss": 0.9468,
1479
+ "step": 975
1480
+ },
1481
+ {
1482
+ "epoch": 13.80281690140845,
1483
+ "grad_norm": 0.0263178963214159,
1484
+ "learning_rate": 6.211267605633803e-05,
1485
+ "loss": 0.9484,
1486
+ "step": 980
1487
+ },
1488
+ {
1489
+ "epoch": 13.873239436619718,
1490
+ "grad_norm": 0.06421630084514618,
1491
+ "learning_rate": 6.140845070422536e-05,
1492
+ "loss": 0.9481,
1493
+ "step": 985
1494
+ },
1495
+ {
1496
+ "epoch": 13.943661971830986,
1497
+ "grad_norm": 0.03617294877767563,
1498
+ "learning_rate": 6.0704225352112676e-05,
1499
+ "loss": 0.9467,
1500
+ "step": 990
1501
+ },
1502
+ {
1503
+ "epoch": 14.0,
1504
+ "eval_loss": 0.9325685501098633,
1505
+ "eval_runtime": 1.3557,
1506
+ "eval_samples_per_second": 279.559,
1507
+ "eval_steps_per_second": 2.213,
1508
+ "step": 994
1509
+ },
1510
+ {
1511
+ "epoch": 14.014084507042254,
1512
+ "grad_norm": 0.05025329068303108,
1513
+ "learning_rate": 6e-05,
1514
+ "loss": 0.9447,
1515
+ "step": 995
1516
+ },
1517
+ {
1518
+ "epoch": 14.084507042253522,
1519
+ "grad_norm": 0.03912140801548958,
1520
+ "learning_rate": 5.929577464788733e-05,
1521
+ "loss": 0.9468,
1522
+ "step": 1000
1523
+ },
1524
+ {
1525
+ "epoch": 14.154929577464788,
1526
+ "grad_norm": 0.030682148411870003,
1527
+ "learning_rate": 5.859154929577465e-05,
1528
+ "loss": 0.9454,
1529
+ "step": 1005
1530
+ },
1531
+ {
1532
+ "epoch": 14.225352112676056,
1533
+ "grad_norm": 0.06476651132106781,
1534
+ "learning_rate": 5.788732394366198e-05,
1535
+ "loss": 0.946,
1536
+ "step": 1010
1537
+ },
1538
+ {
1539
+ "epoch": 14.295774647887324,
1540
+ "grad_norm": 0.01931784488260746,
1541
+ "learning_rate": 5.71830985915493e-05,
1542
+ "loss": 0.9458,
1543
+ "step": 1015
1544
+ },
1545
+ {
1546
+ "epoch": 14.366197183098592,
1547
+ "grad_norm": 0.030390406027436256,
1548
+ "learning_rate": 5.647887323943662e-05,
1549
+ "loss": 0.9464,
1550
+ "step": 1020
1551
+ },
1552
+ {
1553
+ "epoch": 14.43661971830986,
1554
+ "grad_norm": 0.060740068554878235,
1555
+ "learning_rate": 5.577464788732395e-05,
1556
+ "loss": 0.9459,
1557
+ "step": 1025
1558
+ },
1559
+ {
1560
+ "epoch": 14.507042253521126,
1561
+ "grad_norm": 0.034336596727371216,
1562
+ "learning_rate": 5.5070422535211273e-05,
1563
+ "loss": 0.9454,
1564
+ "step": 1030
1565
+ },
1566
+ {
1567
+ "epoch": 14.577464788732394,
1568
+ "grad_norm": 0.04022248461842537,
1569
+ "learning_rate": 5.43661971830986e-05,
1570
+ "loss": 0.9459,
1571
+ "step": 1035
1572
+ },
1573
+ {
1574
+ "epoch": 14.647887323943662,
1575
+ "grad_norm": 0.03897751495242119,
1576
+ "learning_rate": 5.366197183098591e-05,
1577
+ "loss": 0.948,
1578
+ "step": 1040
1579
+ },
1580
+ {
1581
+ "epoch": 14.71830985915493,
1582
+ "grad_norm": 0.017990631982684135,
1583
+ "learning_rate": 5.2957746478873237e-05,
1584
+ "loss": 0.9461,
1585
+ "step": 1045
1586
+ },
1587
+ {
1588
+ "epoch": 14.788732394366198,
1589
+ "grad_norm": 0.04828361049294472,
1590
+ "learning_rate": 5.225352112676056e-05,
1591
+ "loss": 0.9473,
1592
+ "step": 1050
1593
+ },
1594
+ {
1595
+ "epoch": 14.859154929577464,
1596
+ "grad_norm": 0.040516018867492676,
1597
+ "learning_rate": 5.154929577464789e-05,
1598
+ "loss": 0.9466,
1599
+ "step": 1055
1600
+ },
1601
+ {
1602
+ "epoch": 14.929577464788732,
1603
+ "grad_norm": 0.023633386939764023,
1604
+ "learning_rate": 5.084507042253521e-05,
1605
+ "loss": 0.9483,
1606
+ "step": 1060
1607
+ },
1608
+ {
1609
+ "epoch": 15.0,
1610
+ "grad_norm": 0.03356311842799187,
1611
+ "learning_rate": 5.014084507042254e-05,
1612
+ "loss": 0.9433,
1613
+ "step": 1065
1614
+ },
1615
+ {
1616
+ "epoch": 15.0,
1617
+ "eval_loss": 0.9321981072425842,
1618
+ "eval_runtime": 1.3288,
1619
+ "eval_samples_per_second": 285.221,
1620
+ "eval_steps_per_second": 2.258,
1621
+ "step": 1065
1622
+ },
1623
+ {
1624
+ "epoch": 15.070422535211268,
1625
+ "grad_norm": 0.031062249094247818,
1626
+ "learning_rate": 4.9436619718309864e-05,
1627
+ "loss": 0.9449,
1628
+ "step": 1070
1629
+ },
1630
+ {
1631
+ "epoch": 15.140845070422536,
1632
+ "grad_norm": 0.03606761246919632,
1633
+ "learning_rate": 4.873239436619719e-05,
1634
+ "loss": 0.9452,
1635
+ "step": 1075
1636
+ },
1637
+ {
1638
+ "epoch": 15.211267605633802,
1639
+ "grad_norm": 0.03723893314599991,
1640
+ "learning_rate": 4.8028169014084515e-05,
1641
+ "loss": 0.9495,
1642
+ "step": 1080
1643
+ },
1644
+ {
1645
+ "epoch": 15.28169014084507,
1646
+ "grad_norm": 0.022561002522706985,
1647
+ "learning_rate": 4.7323943661971834e-05,
1648
+ "loss": 0.9451,
1649
+ "step": 1085
1650
+ },
1651
+ {
1652
+ "epoch": 15.352112676056338,
1653
+ "grad_norm": 0.03934817761182785,
1654
+ "learning_rate": 4.661971830985915e-05,
1655
+ "loss": 0.9465,
1656
+ "step": 1090
1657
+ },
1658
+ {
1659
+ "epoch": 15.422535211267606,
1660
+ "grad_norm": 0.021020477637648582,
1661
+ "learning_rate": 4.591549295774648e-05,
1662
+ "loss": 0.9458,
1663
+ "step": 1095
1664
+ },
1665
+ {
1666
+ "epoch": 15.492957746478874,
1667
+ "grad_norm": 0.029691854491829872,
1668
+ "learning_rate": 4.5211267605633804e-05,
1669
+ "loss": 0.9464,
1670
+ "step": 1100
1671
+ },
1672
+ {
1673
+ "epoch": 15.56338028169014,
1674
+ "grad_norm": 0.030643166974186897,
1675
+ "learning_rate": 4.450704225352113e-05,
1676
+ "loss": 0.9438,
1677
+ "step": 1105
1678
+ },
1679
+ {
1680
+ "epoch": 15.633802816901408,
1681
+ "grad_norm": 0.0261048823595047,
1682
+ "learning_rate": 4.3802816901408455e-05,
1683
+ "loss": 0.9442,
1684
+ "step": 1110
1685
+ },
1686
+ {
1687
+ "epoch": 15.704225352112676,
1688
+ "grad_norm": 0.04185587912797928,
1689
+ "learning_rate": 4.3098591549295774e-05,
1690
+ "loss": 0.9455,
1691
+ "step": 1115
1692
+ },
1693
+ {
1694
+ "epoch": 15.774647887323944,
1695
+ "grad_norm": 0.03181832283735275,
1696
+ "learning_rate": 4.23943661971831e-05,
1697
+ "loss": 0.9452,
1698
+ "step": 1120
1699
+ },
1700
+ {
1701
+ "epoch": 15.845070422535212,
1702
+ "grad_norm": 0.017347080633044243,
1703
+ "learning_rate": 4.1690140845070425e-05,
1704
+ "loss": 0.9445,
1705
+ "step": 1125
1706
+ },
1707
+ {
1708
+ "epoch": 15.915492957746478,
1709
+ "grad_norm": 0.0175313837826252,
1710
+ "learning_rate": 4.098591549295775e-05,
1711
+ "loss": 0.9462,
1712
+ "step": 1130
1713
+ },
1714
+ {
1715
+ "epoch": 15.985915492957746,
1716
+ "grad_norm": 0.016237597912549973,
1717
+ "learning_rate": 4.0281690140845076e-05,
1718
+ "loss": 0.9441,
1719
+ "step": 1135
1720
+ },
1721
+ {
1722
+ "epoch": 16.0,
1723
+ "eval_loss": 0.932042121887207,
1724
+ "eval_runtime": 1.3559,
1725
+ "eval_samples_per_second": 279.512,
1726
+ "eval_steps_per_second": 2.212,
1727
+ "step": 1136
1728
+ },
1729
+ {
1730
+ "epoch": 16.056338028169016,
1731
+ "grad_norm": 0.02838326431810856,
1732
+ "learning_rate": 3.9577464788732395e-05,
1733
+ "loss": 0.9443,
1734
+ "step": 1140
1735
+ },
1736
+ {
1737
+ "epoch": 16.12676056338028,
1738
+ "grad_norm": 0.03755342960357666,
1739
+ "learning_rate": 3.887323943661972e-05,
1740
+ "loss": 0.9448,
1741
+ "step": 1145
1742
+ },
1743
+ {
1744
+ "epoch": 16.197183098591548,
1745
+ "grad_norm": 0.019139522686600685,
1746
+ "learning_rate": 3.8169014084507046e-05,
1747
+ "loss": 0.945,
1748
+ "step": 1150
1749
+ },
1750
+ {
1751
+ "epoch": 16.267605633802816,
1752
+ "grad_norm": 0.029503419995307922,
1753
+ "learning_rate": 3.746478873239437e-05,
1754
+ "loss": 0.9437,
1755
+ "step": 1155
1756
+ },
1757
+ {
1758
+ "epoch": 16.338028169014084,
1759
+ "grad_norm": 0.02175034210085869,
1760
+ "learning_rate": 3.676056338028169e-05,
1761
+ "loss": 0.9458,
1762
+ "step": 1160
1763
+ },
1764
+ {
1765
+ "epoch": 16.408450704225352,
1766
+ "grad_norm": 0.041107743978500366,
1767
+ "learning_rate": 3.6056338028169015e-05,
1768
+ "loss": 0.9462,
1769
+ "step": 1165
1770
+ },
1771
+ {
1772
+ "epoch": 16.47887323943662,
1773
+ "grad_norm": 0.03304363787174225,
1774
+ "learning_rate": 3.5352112676056334e-05,
1775
+ "loss": 0.9456,
1776
+ "step": 1170
1777
+ },
1778
+ {
1779
+ "epoch": 16.549295774647888,
1780
+ "grad_norm": 0.02730601467192173,
1781
+ "learning_rate": 3.464788732394366e-05,
1782
+ "loss": 0.9447,
1783
+ "step": 1175
1784
+ },
1785
+ {
1786
+ "epoch": 16.619718309859156,
1787
+ "grad_norm": 0.03291373327374458,
1788
+ "learning_rate": 3.3943661971830985e-05,
1789
+ "loss": 0.9452,
1790
+ "step": 1180
1791
+ },
1792
+ {
1793
+ "epoch": 16.690140845070424,
1794
+ "grad_norm": 0.023141978308558464,
1795
+ "learning_rate": 3.323943661971831e-05,
1796
+ "loss": 0.9437,
1797
+ "step": 1185
1798
+ },
1799
+ {
1800
+ "epoch": 16.760563380281692,
1801
+ "grad_norm": 0.02332274429500103,
1802
+ "learning_rate": 3.2535211267605636e-05,
1803
+ "loss": 0.9438,
1804
+ "step": 1190
1805
+ },
1806
+ {
1807
+ "epoch": 16.830985915492956,
1808
+ "grad_norm": 0.050522513687610626,
1809
+ "learning_rate": 3.183098591549296e-05,
1810
+ "loss": 0.9437,
1811
+ "step": 1195
1812
+ },
1813
+ {
1814
+ "epoch": 16.901408450704224,
1815
+ "grad_norm": 0.02046947181224823,
1816
+ "learning_rate": 3.112676056338028e-05,
1817
+ "loss": 0.9453,
1818
+ "step": 1200
1819
+ },
1820
+ {
1821
+ "epoch": 16.971830985915492,
1822
+ "grad_norm": 0.06999664008617401,
1823
+ "learning_rate": 3.0422535211267606e-05,
+ "loss": 0.9453,
+ "step": 1205
+ },
+ {
+ "epoch": 17.0,
+ "eval_loss": 0.9319289326667786,
+ "eval_runtime": 1.3556,
+ "eval_samples_per_second": 279.59,
+ "eval_steps_per_second": 2.213,
+ "step": 1207
+ },
+ {
+ "epoch": 17.04225352112676,
+ "grad_norm": 0.027964303269982338,
+ "learning_rate": 2.971830985915493e-05,
+ "loss": 0.9437,
+ "step": 1210
+ },
+ {
+ "epoch": 17.112676056338028,
+ "grad_norm": 0.021849192678928375,
+ "learning_rate": 2.9014084507042254e-05,
+ "loss": 0.9451,
+ "step": 1215
+ },
+ {
+ "epoch": 17.183098591549296,
+ "grad_norm": 0.06382541358470917,
+ "learning_rate": 2.830985915492958e-05,
+ "loss": 0.9446,
+ "step": 1220
+ },
+ {
+ "epoch": 17.253521126760564,
+ "grad_norm": 0.018395431339740753,
+ "learning_rate": 2.7605633802816905e-05,
+ "loss": 0.9449,
+ "step": 1225
+ },
+ {
+ "epoch": 17.323943661971832,
+ "grad_norm": 0.02647424302995205,
+ "learning_rate": 2.6901408450704224e-05,
+ "loss": 0.943,
+ "step": 1230
+ },
+ {
+ "epoch": 17.3943661971831,
+ "grad_norm": 0.053954143077135086,
+ "learning_rate": 2.619718309859155e-05,
+ "loss": 0.9451,
+ "step": 1235
+ },
+ {
+ "epoch": 17.464788732394368,
+ "grad_norm": 0.03957201540470123,
+ "learning_rate": 2.5492957746478875e-05,
+ "loss": 0.9448,
+ "step": 1240
+ },
+ {
+ "epoch": 17.535211267605632,
+ "grad_norm": 0.0455038957297802,
+ "learning_rate": 2.47887323943662e-05,
+ "loss": 0.9441,
+ "step": 1245
+ },
+ {
+ "epoch": 17.6056338028169,
+ "grad_norm": 0.03737751021981239,
+ "learning_rate": 2.4084507042253522e-05,
+ "loss": 0.9469,
+ "step": 1250
+ },
+ {
+ "epoch": 17.676056338028168,
+ "grad_norm": 0.038107406347990036,
+ "learning_rate": 2.3380281690140845e-05,
+ "loss": 0.9445,
+ "step": 1255
+ },
+ {
+ "epoch": 17.746478873239436,
+ "grad_norm": 0.04548390954732895,
+ "learning_rate": 2.267605633802817e-05,
+ "loss": 0.9451,
+ "step": 1260
+ },
+ {
+ "epoch": 17.816901408450704,
+ "grad_norm": 0.03373611345887184,
+ "learning_rate": 2.1971830985915496e-05,
+ "loss": 0.9438,
+ "step": 1265
+ },
+ {
+ "epoch": 17.887323943661972,
+ "grad_norm": 0.029089247807860374,
+ "learning_rate": 2.1267605633802818e-05,
+ "loss": 0.9435,
+ "step": 1270
+ },
+ {
+ "epoch": 17.95774647887324,
+ "grad_norm": 0.027908792719244957,
+ "learning_rate": 2.0563380281690143e-05,
+ "loss": 0.9433,
+ "step": 1275
+ },
+ {
+ "epoch": 18.0,
+ "eval_loss": 0.9321526885032654,
+ "eval_runtime": 1.3562,
+ "eval_samples_per_second": 279.464,
+ "eval_steps_per_second": 2.212,
+ "step": 1278
+ },
+ {
+ "epoch": 18.028169014084508,
+ "grad_norm": 0.038007136434316635,
+ "learning_rate": 1.9859154929577465e-05,
+ "loss": 0.9433,
+ "step": 1280
+ },
+ {
+ "epoch": 18.098591549295776,
+ "grad_norm": 0.055055730044841766,
+ "learning_rate": 1.9154929577464788e-05,
+ "loss": 0.9444,
+ "step": 1285
+ },
+ {
+ "epoch": 18.169014084507044,
+ "grad_norm": 0.024233724921941757,
+ "learning_rate": 1.8450704225352113e-05,
+ "loss": 0.9448,
+ "step": 1290
+ },
+ {
+ "epoch": 18.239436619718308,
+ "grad_norm": 0.02525465376675129,
+ "learning_rate": 1.774647887323944e-05,
+ "loss": 0.9452,
+ "step": 1295
+ },
+ {
+ "epoch": 18.309859154929576,
+ "grad_norm": 0.03696692734956741,
+ "learning_rate": 1.704225352112676e-05,
+ "loss": 0.9451,
+ "step": 1300
+ },
+ {
+ "epoch": 18.380281690140844,
+ "grad_norm": 0.012775393202900887,
+ "learning_rate": 1.6338028169014086e-05,
+ "loss": 0.9424,
+ "step": 1305
+ },
+ {
+ "epoch": 18.450704225352112,
+ "grad_norm": 0.04282708466053009,
+ "learning_rate": 1.5633802816901412e-05,
+ "loss": 0.9439,
+ "step": 1310
+ },
+ {
+ "epoch": 18.52112676056338,
+ "grad_norm": 0.026728734374046326,
+ "learning_rate": 1.4929577464788732e-05,
+ "loss": 0.9432,
+ "step": 1315
+ },
+ {
+ "epoch": 18.591549295774648,
+ "grad_norm": 0.038278259336948395,
+ "learning_rate": 1.4225352112676058e-05,
+ "loss": 0.9437,
+ "step": 1320
+ },
+ {
+ "epoch": 18.661971830985916,
+ "grad_norm": 0.0476648211479187,
+ "learning_rate": 1.352112676056338e-05,
+ "loss": 0.9442,
+ "step": 1325
+ },
+ {
+ "epoch": 18.732394366197184,
+ "grad_norm": 0.017152875661849976,
+ "learning_rate": 1.2816901408450704e-05,
+ "loss": 0.9439,
+ "step": 1330
+ },
+ {
+ "epoch": 18.802816901408452,
+ "grad_norm": 0.02195524424314499,
+ "learning_rate": 1.211267605633803e-05,
+ "loss": 0.9441,
+ "step": 1335
+ },
+ {
+ "epoch": 18.87323943661972,
+ "grad_norm": 0.03179704770445824,
+ "learning_rate": 1.1408450704225353e-05,
+ "loss": 0.944,
+ "step": 1340
+ },
+ {
+ "epoch": 18.943661971830984,
+ "grad_norm": 0.01666986383497715,
+ "learning_rate": 1.0704225352112677e-05,
+ "loss": 0.9436,
+ "step": 1345
+ },
+ {
+ "epoch": 19.0,
+ "eval_loss": 0.9319114685058594,
+ "eval_runtime": 1.3555,
+ "eval_samples_per_second": 279.596,
+ "eval_steps_per_second": 2.213,
+ "step": 1349
+ },
+ {
+ "epoch": 19.014084507042252,
+ "grad_norm": 0.027670254930853844,
+ "learning_rate": 1e-05,
+ "loss": 0.9441,
+ "step": 1350
+ },
+ {
+ "epoch": 19.08450704225352,
+ "grad_norm": 0.05388191342353821,
+ "learning_rate": 9.295774647887325e-06,
+ "loss": 0.9445,
+ "step": 1355
+ },
+ {
+ "epoch": 19.154929577464788,
+ "grad_norm": 0.02966221235692501,
+ "learning_rate": 8.591549295774648e-06,
+ "loss": 0.9432,
+ "step": 1360
+ },
+ {
+ "epoch": 19.225352112676056,
+ "grad_norm": 0.02553911693394184,
+ "learning_rate": 7.887323943661972e-06,
+ "loss": 0.9434,
+ "step": 1365
+ },
+ {
+ "epoch": 19.295774647887324,
+ "grad_norm": 0.03100278414785862,
+ "learning_rate": 7.183098591549296e-06,
+ "loss": 0.9438,
+ "step": 1370
+ },
+ {
+ "epoch": 19.366197183098592,
+ "grad_norm": 0.03018755465745926,
+ "learning_rate": 6.47887323943662e-06,
+ "loss": 0.9442,
+ "step": 1375
+ },
+ {
+ "epoch": 19.43661971830986,
+ "grad_norm": 0.019777249544858932,
+ "learning_rate": 5.774647887323944e-06,
+ "loss": 0.9438,
+ "step": 1380
+ },
+ {
+ "epoch": 19.507042253521128,
+ "grad_norm": 0.036921270191669464,
+ "learning_rate": 5.070422535211268e-06,
+ "loss": 0.9435,
+ "step": 1385
+ },
+ {
+ "epoch": 19.577464788732396,
+ "grad_norm": 0.038511138409376144,
+ "learning_rate": 4.3661971830985915e-06,
+ "loss": 0.9442,
+ "step": 1390
+ },
+ {
+ "epoch": 19.647887323943664,
+ "grad_norm": 0.03045705333352089,
+ "learning_rate": 3.6619718309859158e-06,
+ "loss": 0.9443,
+ "step": 1395
+ },
+ {
+ "epoch": 19.718309859154928,
+ "grad_norm": 0.037487465888261795,
+ "learning_rate": 2.9577464788732396e-06,
+ "loss": 0.9435,
+ "step": 1400
+ },
+ {
+ "epoch": 19.788732394366196,
+ "grad_norm": 0.022748373448848724,
+ "learning_rate": 2.2535211267605635e-06,
+ "loss": 0.9439,
+ "step": 1405
+ },
+ {
+ "epoch": 19.859154929577464,
+ "grad_norm": 0.037895116955041885,
+ "learning_rate": 1.5492957746478875e-06,
+ "loss": 0.9438,
+ "step": 1410
+ },
+ {
+ "epoch": 19.929577464788732,
+ "grad_norm": 0.02999270148575306,
+ "learning_rate": 8.450704225352112e-07,
+ "loss": 0.9429,
+ "step": 1415
+ },
+ {
+ "epoch": 20.0,
+ "grad_norm": 0.03205498680472374,
+ "learning_rate": 1.4084507042253522e-07,
+ "loss": 0.9441,
+ "step": 1420
+ },
+ {
+ "epoch": 20.0,
+ "eval_loss": 0.931971549987793,
+ "eval_runtime": 1.3285,
+ "eval_samples_per_second": 285.277,
+ "eval_steps_per_second": 2.258,
+ "step": 1420
+ }
+ ],
+ "logging_steps": 5,
+ "max_steps": 1420,
+ "num_input_tokens_seen": 0,
+ "num_train_epochs": 20,
+ "save_steps": 500,
+ "stateful_callbacks": {
+ "TrainerControl": {
+ "args": {
+ "should_epoch_stop": false,
+ "should_evaluate": false,
+ "should_log": false,
+ "should_save": true,
+ "should_training_stop": true
+ },
+ "attributes": {}
+ }
+ },
+ "total_flos": 6.424179214123008e+16,
+ "train_batch_size": 48,
+ "trial_name": null,
+ "trial_params": null
+ }
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:041f2ee6df165b5d22004b622affcf417f382986127d7a972adaf8f8946ea8ed
+ size 6033
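
The `trainer_state.json` added in this commit follows the standard Hugging Face `Trainer` checkpoint format: a `log_history` list mixing per-step training records (with `loss`) and per-epoch evaluation records (with `eval_loss`). As a minimal sketch of how such a state file can be inspected, the snippet below inlines a small excerpt of the log above; on a real checkpoint the `state` dict would instead come from `json.load(open("trainer_state.json"))` (path assumed):

```python
# Minimal excerpt of the trainer state above (values copied from the log);
# a real checkpoint would be loaded with json.load(open("trainer_state.json")).
state = {
    "log_history": [
        {"epoch": 19.929577464788732, "loss": 0.9429, "step": 1415},
        {"epoch": 20.0, "loss": 0.9441, "step": 1420},
        {"epoch": 20.0, "eval_loss": 0.931971549987793, "step": 1420},
    ],
    "num_train_epochs": 20,
    "max_steps": 1420,
}

# Evaluation records are the entries carrying "eval_loss";
# everything else is a per-step training log.
evals = [e for e in state["log_history"] if "eval_loss" in e]
best = min(evals, key=lambda e: e["eval_loss"])
print(f"best eval_loss {best['eval_loss']:.4f} at step {best['step']}")
```

Applied to the full file, the same filter recovers one evaluation record per epoch, which is convenient for plotting convergence or picking the best checkpoint.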