Toxotes committed on
Commit f39ac71 · verified · 1 parent: ff5e5fa

feat: Turkish Financial BERT (fin-bert-tr mini)

README.md ADDED
@@ -0,0 +1,78 @@
+ ---
+ language:
+ - tr
+ license: apache-2.0
+ tags:
+ - bert
+ - turkish
+ - finance
+ - masked-language-modeling
+ - custom-trained
+ library_name: transformers
+ datasets:
+ - custom
+ ---
+
+ # fin-bert-tr
+
+ **Turkish Financial BERT**: a BERT model pre-trained from scratch on Turkish financial text.
+
+ ## Model Details
+
+ | Parameter | Value |
+ |-----------|-------|
+ | Architecture | BERT (mini variant) |
+ | Vocab size | 32,000 |
+ | Tokenizer | Zemberek morphology + BPE |
+ | Pre-training objective | Masked Language Modeling (MLM, 15%) |
+ | Training date | 2026-03-12 |
+
+ ## Training Data
+
+ ~0 MB of Turkish financial text in total:
+
+ | Source | Description |
+ |--------|----------|
+ | Bloomberg HT | Economy and market news from bloomberg.com.tr |
+ | Investing TR | Finance news from tr.investing.com |
+ | Bigpara | BIST and FX news from bigpara.hurriyet.com.tr |
+ | Dünya Gazetesi | Business and economy coverage from dunya.com |
+ | MASSIVE TR | Turkish utterances from the Amazon/massive dataset |
+ | WikiANN TR | Turkish Wikipedia NER sentences |
+ | TCMB/BDDK | Central bank and regulatory announcements |
+ | Synthetic | Template-based Turkish financial texts |
+
+ ## Usage
+
+ ```python
+ from transformers import BertForMaskedLM, PreTrainedTokenizerFast
+ import torch
+
+ tokenizer = PreTrainedTokenizerFast.from_pretrained("Toxotes/fin-bert-tr")
+ model = BertForMaskedLM.from_pretrained("Toxotes/fin-bert-tr")
+
+ text = "Merkez Bankası [MASK] oranını artırdı."
+ inputs = tokenizer(text, return_tensors="pt")
+ with torch.no_grad():
+     outputs = model(**inputs)
+ logits = outputs.logits
+
+ # Highest-probability token at the masked position
+ mask_idx = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero()[0, 1]
+ top_token = tokenizer.decode(logits[0, mask_idx].argmax().item())
+ print(top_token)  # → faiz
+ ```
+
+ ## Fine-Tuning
+
+ This model can be fine-tuned for the following tasks:
+ - Financial text classification
+ - NER (named entity recognition): bank, company, and rate names
+ - RAG query routing (see `fin-bert-tr-router`)
+ - Turkish financial sentiment analysis
+
+ ## Project
+
+ MOSAIC — Federated Financial RAG System
+ [GitHub](https://github.com/tahatoy/MOSAIC)
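The model card above states a 15% MLM pre-training objective. That corruption step can be sketched with BERT's standard recipe; the 80/10/10 split and the `[MASK]` token id used below are assumptions for illustration, not values read from this checkpoint:

```python
import random

MASK_ID, VOCAB_SIZE = 4, 32000  # [MASK] id is a placeholder; vocab size is from config.json

def mask_tokens(ids, mlm_prob=0.15, rng=None):
    """Standard BERT MLM corruption: select ~15% of positions; of those,
    80% become [MASK], 10% a random token, 10% stay unchanged."""
    rng = rng or random.Random(0)
    inputs, labels = list(ids), [-100] * len(ids)  # -100 = ignored by the loss
    for i, tok in enumerate(ids):
        if rng.random() < mlm_prob:
            labels[i] = tok                # predict the original token here
            r = rng.random()
            if r < 0.8:
                inputs[i] = MASK_ID        # 80%: replace with [MASK]
            elif r < 0.9:
                inputs[i] = rng.randrange(VOCAB_SIZE)  # 10%: random token
            # remaining 10%: keep the original token
    return inputs, labels

inputs, labels = mask_tokens(list(range(10_000)))
```

Only positions with a label other than -100 contribute to the masked-language-modeling loss.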
checkpoint-4653/config.json ADDED
@@ -0,0 +1,28 @@
+ {
+ "add_cross_attention": false,
+ "architectures": [
+ "BertForMaskedLM"
+ ],
+ "attention_probs_dropout_prob": 0.1,
+ "bos_token_id": 2,
+ "classifier_dropout": null,
+ "dtype": "float32",
+ "eos_token_id": 3,
+ "hidden_act": "gelu",
+ "hidden_dropout_prob": 0.1,
+ "hidden_size": 384,
+ "initializer_range": 0.02,
+ "intermediate_size": 1536,
+ "is_decoder": false,
+ "layer_norm_eps": 1e-12,
+ "max_position_embeddings": 512,
+ "model_type": "bert",
+ "num_attention_heads": 12,
+ "num_hidden_layers": 6,
+ "pad_token_id": 0,
+ "tie_word_embeddings": true,
+ "transformers_version": "5.3.0",
+ "type_vocab_size": 2,
+ "use_cache": false,
+ "vocab_size": 32000
+ }
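As a sanity check, the parameter count implied by this config can be derived by hand. The breakdown below assumes the standard `BertForMaskedLM` layout with the decoder weights tied to the word embeddings, as `tie_word_embeddings: true` suggests:

```python
# Sizes read from config.json: hidden, intermediate, layers, vocab, positions, type vocab
H, I, L, V, P, T = 384, 1536, 6, 32000, 512, 2

embeddings = V * H + P * H + T * H + 2 * H  # word + position + token-type + LayerNorm
per_layer = (
    3 * (H * H + H)   # Q, K, V projections
    + (H * H + H)     # attention output dense
    + 2 * H           # attention LayerNorm
    + (H * I + I)     # intermediate dense
    + (I * H + H)     # output dense
    + 2 * H           # output LayerNorm
)
mlm_head = (H * H + H) + 2 * H + V  # transform dense + LayerNorm + decoder bias (weights tied)

total = embeddings + L * per_layer + mlm_head
print(total)      # 23313536
print(total * 4)  # 93254144 bytes in float32
```

At 4 bytes per float32 parameter this comes to roughly 93.3 MB, close to the 93,266,320-byte safetensors file below (the small difference is file-format overhead).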
checkpoint-4653/model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4fb9e7f5cb8d561f4fe7d39994fabd80704c779a2937addec7f936bbc9755f7f
+ size 93266320
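The checkpoint binaries in this commit are stored as Git LFS pointer files like the one above: three `key value` lines giving the spec version, the content hash, and the payload size. A minimal sketch of a parser (not an official LFS client):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file into version, hash algorithm, oid, and size."""
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    algo, oid = fields["oid"].split(":", 1)
    return {"version": fields["version"], "algo": algo, "oid": oid,
            "size": int(fields["size"])}

ptr = parse_lfs_pointer(
    "version https://git-lfs.github.com/spec/v1\n"
    "oid sha256:4fb9e7f5cb8d561f4fe7d39994fabd80704c779a2937addec7f936bbc9755f7f\n"
    "size 93266320\n"
)
print(ptr["algo"], ptr["size"])  # sha256 93266320
```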
checkpoint-4653/optimizer.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:34ec055a42de0bc48b4bc20f6eadbfa0ba74a7cc9adb5f16901eb47a9399e201
+ size 186597643
checkpoint-4653/rng_state.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b46c6e5715866a80b4850b4655b5b4973a6a52716d8697c4eb12709d9a02c2e5
+ size 14645
checkpoint-4653/scaler.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:ffd45aef1235be42429a4dd97dd125f8d8ee2467c730fc66acff315aa95dc941
+ size 1383
checkpoint-4653/scheduler.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:7d7c8d507a5ff068fe05808d8247169e8b4fc7c3c6c69f386013702b6370dadf
+ size 1465
checkpoint-4653/tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
checkpoint-4653/tokenizer_config.json ADDED
@@ -0,0 +1,13 @@
+ {
+ "backend": "tokenizers",
+ "bos_token": "[CLS]",
+ "cls_token": "[CLS]",
+ "eos_token": "[SEP]",
+ "is_local": true,
+ "mask_token": "[MASK]",
+ "model_max_length": 1000000000000000019884624838656,
+ "pad_token": "[PAD]",
+ "sep_token": "[SEP]",
+ "tokenizer_class": "TokenizersBackend",
+ "unk_token": "[UNK]"
+ }
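Note the odd `model_max_length` value: it is the Transformers "no limit" sentinel, which is simply `1e30` coerced through float arithmetic to an int, not a real limit. Since config.json sets `max_position_embeddings` to 512, callers should truncate inputs to 512 themselves; the guard below is a hypothetical sketch, not part of this repo:

```python
# The sentinel written into tokenizer_config.json is int(1e30) after float rounding:
SENTINEL = int(1e30)
assert SENTINEL == 1000000000000000019884624838656

# Hypothetical guard: 512 comes from max_position_embeddings in config.json,
# not from tokenizer_config.json.
MODEL_MAX_LEN = 512

def effective_max_length(configured: int) -> int:
    """Fall back to the model's position limit when the tokenizer claims no limit."""
    return MODEL_MAX_LEN if configured >= SENTINEL else configured

print(effective_max_length(SENTINEL))  # 512
```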
checkpoint-4653/trainer_state.json ADDED
@@ -0,0 +1,1749 @@
1
+ {
2
+ "best_global_step": null,
3
+ "best_metric": null,
4
+ "best_model_checkpoint": null,
5
+ "epoch": 9.0,
6
+ "eval_steps": 500,
7
+ "global_step": 4653,
8
+ "is_hyper_param_search": false,
9
+ "is_local_process_zero": true,
10
+ "is_world_process_zero": true,
11
+ "log_history": [
12
+ {
13
+ "epoch": 0.0025906735751295338,
14
+ "grad_norm": 1.9656798839569092,
15
+ "learning_rate": 0.0,
16
+ "loss": 10.453109741210938,
17
+ "step": 1
18
+ },
19
+ {
20
+ "epoch": 0.04922279792746114,
21
+ "grad_norm": 1.8898563385009766,
22
+ "learning_rate": 7.792207792207792e-06,
23
+ "loss": 10.416481018066406,
24
+ "step": 19
25
+ },
26
+ {
27
+ "epoch": 0.09844559585492228,
28
+ "grad_norm": 1.5997825860977173,
29
+ "learning_rate": 1.6017316017316017e-05,
30
+ "loss": 10.20559371145148,
31
+ "step": 38
32
+ },
33
+ {
34
+ "epoch": 0.14766839378238342,
35
+ "grad_norm": 1.4691131114959717,
36
+ "learning_rate": 2.4242424242424244e-05,
37
+ "loss": 9.938501458418997,
38
+ "step": 57
39
+ },
40
+ {
41
+ "epoch": 0.19689119170984457,
42
+ "grad_norm": 1.5397157669067383,
43
+ "learning_rate": 3.246753246753247e-05,
44
+ "loss": 9.666742425215872,
45
+ "step": 76
46
+ },
47
+ {
48
+ "epoch": 0.24611398963730569,
49
+ "grad_norm": 1.5672590732574463,
50
+ "learning_rate": 4.0692640692640695e-05,
51
+ "loss": 9.390644274259868,
52
+ "step": 95
53
+ },
54
+ {
55
+ "epoch": 0.29533678756476683,
56
+ "grad_norm": 1.4179465770721436,
57
+ "learning_rate": 4.8917748917748915e-05,
58
+ "loss": 9.110911319130345,
59
+ "step": 114
60
+ },
61
+ {
62
+ "epoch": 0.344559585492228,
63
+ "grad_norm": 1.3159687519073486,
64
+ "learning_rate": 5.714285714285714e-05,
65
+ "loss": 8.792612176192435,
66
+ "step": 133
67
+ },
68
+ {
69
+ "epoch": 0.39378238341968913,
70
+ "grad_norm": 1.0485610961914062,
71
+ "learning_rate": 6.536796536796536e-05,
72
+ "loss": 8.488385652240954,
73
+ "step": 152
74
+ },
75
+ {
76
+ "epoch": 0.4430051813471503,
77
+ "grad_norm": 0.8901123404502869,
78
+ "learning_rate": 7.35930735930736e-05,
79
+ "loss": 8.201339721679688,
80
+ "step": 171
81
+ },
82
+ {
83
+ "epoch": 0.49222797927461137,
84
+ "grad_norm": 0.7018402218818665,
85
+ "learning_rate": 8.181818181818183e-05,
86
+ "loss": 7.978334125719573,
87
+ "step": 190
88
+ },
89
+ {
90
+ "epoch": 0.5414507772020726,
91
+ "grad_norm": 0.5120431184768677,
92
+ "learning_rate": 9.004329004329005e-05,
93
+ "loss": 7.839283993369655,
94
+ "step": 209
95
+ },
96
+ {
97
+ "epoch": 0.5906735751295337,
98
+ "grad_norm": 0.5026654601097107,
99
+ "learning_rate": 9.826839826839827e-05,
100
+ "loss": 7.7916211579975325,
101
+ "step": 228
102
+ },
103
+ {
104
+ "epoch": 0.6398963730569949,
105
+ "grad_norm": 0.638586163520813,
106
+ "learning_rate": 9.999578456659054e-05,
107
+ "loss": 7.715636403937089,
108
+ "step": 247
109
+ },
110
+ {
111
+ "epoch": 0.689119170984456,
112
+ "grad_norm": 0.6381962895393372,
113
+ "learning_rate": 9.997834329912887e-05,
114
+ "loss": 7.697683233963816,
115
+ "step": 266
116
+ },
117
+ {
118
+ "epoch": 0.7383419689119171,
119
+ "grad_norm": 0.6784098148345947,
120
+ "learning_rate": 9.994738114801949e-05,
121
+ "loss": 7.658172607421875,
122
+ "step": 285
123
+ },
124
+ {
125
+ "epoch": 0.7875647668393783,
126
+ "grad_norm": 0.7522804141044617,
127
+ "learning_rate": 9.990290648960332e-05,
128
+ "loss": 7.618246781198602,
129
+ "step": 304
130
+ },
131
+ {
132
+ "epoch": 0.8367875647668394,
133
+ "grad_norm": 1.413805365562439,
134
+ "learning_rate": 9.984493135582543e-05,
135
+ "loss": 7.569692511307566,
136
+ "step": 323
137
+ },
138
+ {
139
+ "epoch": 0.8860103626943006,
140
+ "grad_norm": 0.769112229347229,
141
+ "learning_rate": 9.977347143098e-05,
142
+ "loss": 7.520751953125,
143
+ "step": 342
144
+ },
145
+ {
146
+ "epoch": 0.9352331606217616,
147
+ "grad_norm": 0.8767187595367432,
148
+ "learning_rate": 9.96885460474671e-05,
149
+ "loss": 7.502451043379934,
150
+ "step": 361
151
+ },
152
+ {
153
+ "epoch": 0.9844559585492227,
154
+ "grad_norm": 0.7236428260803223,
155
+ "learning_rate": 9.959017818056273e-05,
156
+ "loss": 7.4918670654296875,
157
+ "step": 380
158
+ },
159
+ {
160
+ "epoch": 1.0336787564766838,
161
+ "grad_norm": 0.5859951376914978,
162
+ "learning_rate": 9.947839444220306e-05,
163
+ "loss": 7.4534374036287,
164
+ "step": 399
165
+ },
166
+ {
167
+ "epoch": 1.0829015544041452,
168
+ "grad_norm": 0.770540714263916,
169
+ "learning_rate": 9.935322507378509e-05,
170
+ "loss": 7.40715187474301,
171
+ "step": 418
172
+ },
173
+ {
174
+ "epoch": 1.1321243523316062,
175
+ "grad_norm": 0.6562390327453613,
176
+ "learning_rate": 9.921470393798522e-05,
177
+ "loss": 7.423827321905839,
178
+ "step": 437
179
+ },
180
+ {
181
+ "epoch": 1.1813471502590673,
182
+ "grad_norm": 0.7159621715545654,
183
+ "learning_rate": 9.906286850959825e-05,
184
+ "loss": 7.380163895456414,
185
+ "step": 456
186
+ },
187
+ {
188
+ "epoch": 1.2305699481865284,
189
+ "grad_norm": 0.6420731544494629,
190
+ "learning_rate": 9.889775986539913e-05,
191
+ "loss": 7.33871781198602,
192
+ "step": 475
193
+ },
194
+ {
195
+ "epoch": 1.2797927461139897,
196
+ "grad_norm": 0.7125868797302246,
197
+ "learning_rate": 9.871942267303034e-05,
198
+ "loss": 7.3665418122944075,
199
+ "step": 494
200
+ },
201
+ {
202
+ "epoch": 1.3290155440414508,
203
+ "grad_norm": 0.7381535768508911,
204
+ "learning_rate": 9.852790517891754e-05,
205
+ "loss": 7.347101311934622,
206
+ "step": 513
207
+ },
208
+ {
209
+ "epoch": 1.378238341968912,
210
+ "grad_norm": 0.7120394110679626,
211
+ "learning_rate": 9.83232591952175e-05,
212
+ "loss": 7.310685810289885,
213
+ "step": 532
214
+ },
215
+ {
216
+ "epoch": 1.427461139896373,
217
+ "grad_norm": 0.7593790888786316,
218
+ "learning_rate": 9.810554008580081e-05,
219
+ "loss": 7.298673127826891,
220
+ "step": 551
221
+ },
222
+ {
223
+ "epoch": 1.4766839378238341,
224
+ "grad_norm": 0.7787287831306458,
225
+ "learning_rate": 9.787480675127431e-05,
226
+ "loss": 7.280764931126645,
227
+ "step": 570
228
+ },
229
+ {
230
+ "epoch": 1.5259067357512954,
231
+ "grad_norm": 0.8570913076400757,
232
+ "learning_rate": 9.763112161304621e-05,
233
+ "loss": 7.271910014905427,
234
+ "step": 589
235
+ },
236
+ {
237
+ "epoch": 1.5751295336787565,
238
+ "grad_norm": 0.6566023826599121,
239
+ "learning_rate": 9.737455059643903e-05,
240
+ "loss": 7.260608070775082,
241
+ "step": 608
242
+ },
243
+ {
244
+ "epoch": 1.6243523316062176,
245
+ "grad_norm": 0.6554204821586609,
246
+ "learning_rate": 9.710516311285445e-05,
247
+ "loss": 7.235391717208059,
248
+ "step": 627
249
+ },
250
+ {
251
+ "epoch": 1.6735751295336787,
252
+ "grad_norm": 0.8252356648445129,
253
+ "learning_rate": 9.682303204099517e-05,
254
+ "loss": 7.23214400442023,
255
+ "step": 646
256
+ },
257
+ {
258
+ "epoch": 1.7227979274611398,
259
+ "grad_norm": 0.7991335988044739,
260
+ "learning_rate": 9.652823370714861e-05,
261
+ "loss": 7.189540662263569,
262
+ "step": 665
263
+ },
264
+ {
265
+ "epoch": 1.7720207253886011,
266
+ "grad_norm": 0.6291921734809875,
267
+ "learning_rate": 9.622084786453804e-05,
268
+ "loss": 7.1787647448088,
269
+ "step": 684
270
+ },
271
+ {
272
+ "epoch": 1.8212435233160622,
273
+ "grad_norm": 0.6076451539993286,
274
+ "learning_rate": 9.590095767174654e-05,
275
+ "loss": 7.1707924290707235,
276
+ "step": 703
277
+ },
278
+ {
279
+ "epoch": 1.8704663212435233,
280
+ "grad_norm": 0.7197585105895996,
281
+ "learning_rate": 9.556864967021966e-05,
282
+ "loss": 7.1632947419819075,
283
+ "step": 722
284
+ },
285
+ {
286
+ "epoch": 1.9196891191709846,
287
+ "grad_norm": 0.8579983711242676,
288
+ "learning_rate": 9.522401376085302e-05,
289
+ "loss": 7.2033129240337175,
290
+ "step": 741
291
+ },
292
+ {
293
+ "epoch": 1.9689119170984455,
294
+ "grad_norm": 0.7442916631698608,
295
+ "learning_rate": 9.486714317967097e-05,
296
+ "loss": 7.15290671900699,
297
+ "step": 760
298
+ },
299
+ {
300
+ "epoch": 2.018134715025907,
301
+ "grad_norm": 0.6029990911483765,
302
+ "learning_rate": 9.449813447260292e-05,
303
+ "loss": 7.155892623098273,
304
+ "step": 779
305
+ },
306
+ {
307
+ "epoch": 2.0673575129533677,
308
+ "grad_norm": 0.8712852597236633,
309
+ "learning_rate": 9.411708746936439e-05,
310
+ "loss": 7.1117409153988485,
311
+ "step": 798
312
+ },
313
+ {
314
+ "epoch": 2.116580310880829,
315
+ "grad_norm": 0.7027563452720642,
316
+ "learning_rate": 9.372410525644952e-05,
317
+ "loss": 7.13547074167352,
318
+ "step": 817
319
+ },
320
+ {
321
+ "epoch": 2.1658031088082903,
322
+ "grad_norm": 0.7402092814445496,
323
+ "learning_rate": 9.33192941492427e-05,
324
+ "loss": 7.12723099557977,
325
+ "step": 836
326
+ },
327
+ {
328
+ "epoch": 2.215025906735751,
329
+ "grad_norm": 0.6641818284988403,
330
+ "learning_rate": 9.290276366325638e-05,
331
+ "loss": 7.079131276983964,
332
+ "step": 855
333
+ },
334
+ {
335
+ "epoch": 2.2642487046632125,
336
+ "grad_norm": 0.654100239276886,
337
+ "learning_rate": 9.247462648450348e-05,
338
+ "loss": 7.120608681126645,
339
+ "step": 874
340
+ },
341
+ {
342
+ "epoch": 2.313471502590674,
343
+ "grad_norm": 0.6241465210914612,
344
+ "learning_rate": 9.203499843901173e-05,
345
+ "loss": 7.0452880859375,
346
+ "step": 893
347
+ },
348
+ {
349
+ "epoch": 2.3626943005181347,
350
+ "grad_norm": 0.6422539353370667,
351
+ "learning_rate": 9.158399846148886e-05,
352
+ "loss": 7.0627602025082235,
353
+ "step": 912
354
+ },
355
+ {
356
+ "epoch": 2.411917098445596,
357
+ "grad_norm": 0.9347654581069946,
358
+ "learning_rate": 9.11217485631465e-05,
359
+ "loss": 7.087360582853618,
360
+ "step": 931
361
+ },
362
+ {
363
+ "epoch": 2.461139896373057,
364
+ "grad_norm": 0.6797104477882385,
365
+ "learning_rate": 9.064837379869189e-05,
366
+ "loss": 7.03591597707648,
367
+ "step": 950
368
+ },
369
+ {
370
+ "epoch": 2.510362694300518,
371
+ "grad_norm": 0.7567751407623291,
372
+ "learning_rate": 9.016400223249635e-05,
373
+ "loss": 7.0485181306537825,
374
+ "step": 969
375
+ },
376
+ {
377
+ "epoch": 2.5595854922279795,
378
+ "grad_norm": 0.9307654500007629,
379
+ "learning_rate": 8.966876490394927e-05,
380
+ "loss": 7.068600303248355,
381
+ "step": 988
382
+ },
383
+ {
384
+ "epoch": 2.6088082901554404,
385
+ "grad_norm": 0.6442763805389404,
386
+ "learning_rate": 8.91627957920074e-05,
387
+ "loss": 7.0317848607113485,
388
+ "step": 1007
389
+ },
390
+ {
391
+ "epoch": 2.6580310880829017,
392
+ "grad_norm": 0.755132257938385,
393
+ "learning_rate": 8.8646231778949e-05,
394
+ "loss": 7.035190783048931,
395
+ "step": 1026
396
+ },
397
+ {
398
+ "epoch": 2.7072538860103625,
399
+ "grad_norm": 0.9993160367012024,
400
+ "learning_rate": 8.811921261334224e-05,
401
+ "loss": 7.045703285618832,
402
+ "step": 1045
403
+ },
404
+ {
405
+ "epoch": 2.756476683937824,
406
+ "grad_norm": 0.6708552837371826,
407
+ "learning_rate": 8.758188087223845e-05,
408
+ "loss": 7.088768406918175,
409
+ "step": 1064
410
+ },
411
+ {
412
+ "epoch": 2.805699481865285,
413
+ "grad_norm": 0.6677550077438354,
414
+ "learning_rate": 8.703438192260007e-05,
415
+ "loss": 7.025689376027961,
416
+ "step": 1083
417
+ },
418
+ {
419
+ "epoch": 2.854922279792746,
420
+ "grad_norm": 0.8610202074050903,
421
+ "learning_rate": 8.647686388197374e-05,
422
+ "loss": 6.9846753572162825,
423
+ "step": 1102
424
+ },
425
+ {
426
+ "epoch": 2.9041450777202074,
427
+ "grad_norm": 0.6401988863945007,
428
+ "learning_rate": 8.59094775784194e-05,
429
+ "loss": 7.020596955951891,
430
+ "step": 1121
431
+ },
432
+ {
433
+ "epoch": 2.9533678756476682,
434
+ "grad_norm": 0.7033092975616455,
435
+ "learning_rate": 8.533237650970602e-05,
436
+ "loss": 7.012691297029194,
437
+ "step": 1140
438
+ },
439
+ {
440
+ "epoch": 2.241779497098646,
441
+ "grad_norm": 1.8401119709014893,
442
+ "learning_rate": 8.474571680178515e-05,
443
+ "loss": 7.131598949432373,
444
+ "step": 1159
445
+ },
446
+ {
447
+ "epoch": 2.2785299806576402,
448
+ "grad_norm": 0.9385874271392822,
449
+ "learning_rate": 9.233682395815343e-05,
450
+ "loss": 6.8397754869963,
451
+ "step": 1178
452
+ },
453
+ {
454
+ "epoch": 2.3152804642166345,
455
+ "grad_norm": 0.9496298432350159,
456
+ "learning_rate": 9.200700008023644e-05,
457
+ "loss": 6.565899096037212,
458
+ "step": 1197
459
+ },
460
+ {
461
+ "epoch": 2.3520309477756287,
462
+ "grad_norm": 0.7686163187026978,
463
+ "learning_rate": 9.167084229191691e-05,
464
+ "loss": 6.4427024439761515,
465
+ "step": 1216
466
+ },
467
+ {
468
+ "epoch": 2.388781431334623,
469
+ "grad_norm": 0.6687182188034058,
470
+ "learning_rate": 9.132840127982587e-05,
471
+ "loss": 6.356098375822368,
472
+ "step": 1235
473
+ },
474
+ {
475
+ "epoch": 2.425531914893617,
476
+ "grad_norm": 0.9565212726593018,
477
+ "learning_rate": 9.097972867799301e-05,
478
+ "loss": 6.326987818667763,
479
+ "step": 1254
480
+ },
481
+ {
482
+ "epoch": 2.4622823984526114,
483
+ "grad_norm": 0.9246956706047058,
484
+ "learning_rate": 9.062487706006115e-05,
485
+ "loss": 6.313424762926604,
486
+ "step": 1273
487
+ },
488
+ {
489
+ "epoch": 2.4990328820116052,
490
+ "grad_norm": 0.7130828499794006,
491
+ "learning_rate": 9.026389993135918e-05,
492
+ "loss": 6.297733106111226,
493
+ "step": 1292
494
+ },
495
+ {
496
+ "epoch": 2.5357833655705995,
497
+ "grad_norm": 0.6469439268112183,
498
+ "learning_rate": 8.989685172083433e-05,
499
+ "loss": 6.265300549958882,
500
+ "step": 1311
501
+ },
502
+ {
503
+ "epoch": 2.5725338491295937,
504
+ "grad_norm": 0.7045807838439941,
505
+ "learning_rate": 8.952378777284526e-05,
506
+ "loss": 6.24040422941509,
507
+ "step": 1330
508
+ },
509
+ {
510
+ "epoch": 2.609284332688588,
511
+ "grad_norm": 0.8098029494285583,
512
+ "learning_rate": 8.914476433881713e-05,
513
+ "loss": 6.2236998708624585,
514
+ "step": 1349
515
+ },
516
+ {
517
+ "epoch": 2.646034816247582,
518
+ "grad_norm": 0.737579345703125,
519
+ "learning_rate": 8.875983856875986e-05,
520
+ "loss": 6.20395901328639,
521
+ "step": 1368
522
+ },
523
+ {
524
+ "epoch": 2.6827852998065764,
525
+ "grad_norm": 0.8462916612625122,
526
+ "learning_rate": 8.836906850265096e-05,
527
+ "loss": 6.195942125822368,
528
+ "step": 1387
529
+ },
530
+ {
531
+ "epoch": 2.7195357833655707,
532
+ "grad_norm": 0.8085110187530518,
533
+ "learning_rate": 8.797251306168407e-05,
534
+ "loss": 6.188254908511513,
535
+ "step": 1406
536
+ },
537
+ {
538
+ "epoch": 2.756286266924565,
539
+ "grad_norm": 1.0851175785064697,
540
+ "learning_rate": 8.757023203938474e-05,
541
+ "loss": 6.1910757767526725,
542
+ "step": 1425
543
+ },
544
+ {
545
+ "epoch": 2.793036750483559,
546
+ "grad_norm": 1.278171420097351,
547
+ "learning_rate": 8.716228609259462e-05,
548
+ "loss": 6.186161643580387,
549
+ "step": 1444
550
+ },
551
+ {
552
+ "epoch": 2.829787234042553,
553
+ "grad_norm": 0.7596274614334106,
554
+ "learning_rate": 8.674873673232546e-05,
555
+ "loss": 6.173150313527961,
556
+ "step": 1463
557
+ },
558
+ {
559
+ "epoch": 2.866537717601547,
560
+ "grad_norm": 0.7558140754699707,
561
+ "learning_rate": 8.632964631448441e-05,
562
+ "loss": 6.155330457185444,
563
+ "step": 1482
564
+ },
565
+ {
566
+ "epoch": 2.9032882011605414,
567
+ "grad_norm": 0.6216167211532593,
568
+ "learning_rate": 8.590507803047172e-05,
569
+ "loss": 6.167594106573808,
570
+ "step": 1501
571
+ },
572
+ {
573
+ "epoch": 2.9400386847195357,
574
+ "grad_norm": 0.643828272819519,
575
+ "learning_rate": 8.547509589765275e-05,
576
+ "loss": 6.1355847810444075,
577
+ "step": 1520
578
+ },
579
+ {
580
+ "epoch": 2.97678916827853,
581
+ "grad_norm": 0.8299704194068909,
582
+ "learning_rate": 8.503976474970517e-05,
583
+ "loss": 6.138166327225535,
584
+ "step": 1539
585
+ },
586
+ {
587
+ "epoch": 3.013539651837524,
588
+ "grad_norm": 1.1598100662231445,
589
+ "learning_rate": 8.459915022684329e-05,
590
+ "loss": 6.094761497096012,
591
+ "step": 1558
592
+ },
593
+ {
594
+ "epoch": 3.0502901353965184,
595
+ "grad_norm": 0.6949910521507263,
596
+ "learning_rate": 8.415331876592055e-05,
597
+ "loss": 6.0926979466488485,
598
+ "step": 1577
599
+ },
600
+ {
601
+ "epoch": 3.0870406189555126,
602
+ "grad_norm": 0.6773774027824402,
603
+ "learning_rate": 8.370233759041219e-05,
604
+ "loss": 6.107613814504523,
605
+ "step": 1596
606
+ },
607
+ {
608
+ "epoch": 3.123791102514507,
609
+ "grad_norm": 0.6850952506065369,
610
+ "learning_rate": 8.324627470027901e-05,
611
+ "loss": 6.105125025699013,
612
+ "step": 1615
613
+ },
614
+ {
615
+ "epoch": 3.160541586073501,
616
+ "grad_norm": 0.7101475596427917,
617
+ "learning_rate": 8.278519886171423e-05,
618
+ "loss": 6.1307517603824015,
619
+ "step": 1634
620
+ },
621
+ {
622
+ "epoch": 3.1972920696324953,
623
+ "grad_norm": 0.8034424781799316,
624
+ "learning_rate": 8.231917959677473e-05,
625
+ "loss": 6.124847412109375,
626
+ "step": 1653
627
+ },
628
+ {
629
+ "epoch": 3.2340425531914896,
630
+ "grad_norm": 0.7378991842269897,
631
+ "learning_rate": 8.184828717289845e-05,
632
+ "loss": 6.102732608192845,
633
+ "step": 1672
634
+ },
635
+ {
636
+ "epoch": 3.2707930367504834,
637
+ "grad_norm": 0.8416258692741394,
638
+ "learning_rate": 8.13725925923092e-05,
639
+ "loss": 6.12664112291838,
640
+ "step": 1691
641
+ },
642
+ {
643
+ "epoch": 3.3075435203094776,
644
+ "grad_norm": 0.7685152292251587,
645
+ "learning_rate": 8.089216758131087e-05,
646
+ "loss": 6.120386224043997,
647
+ "step": 1710
648
+ },
649
+ {
650
+ "epoch": 3.344294003868472,
651
+ "grad_norm": 0.8043097853660583,
652
+ "learning_rate": 8.04070845794723e-05,
653
+ "loss": 6.096702575683594,
654
+ "step": 1729
655
+ },
656
+ {
657
+ "epoch": 3.381044487427466,
658
+ "grad_norm": 0.8346231579780579,
659
+ "learning_rate": 7.991741672870475e-05,
660
+ "loss": 6.11234564530222,
661
+ "step": 1748
662
+ },
663
+ {
664
+ "epoch": 3.4177949709864603,
665
+ "grad_norm": 0.6343578100204468,
666
+ "learning_rate": 7.942323786223333e-05,
667
+ "loss": 6.072033932334499,
668
+ "step": 1767
669
+ },
670
+ {
671
+ "epoch": 3.4545454545454546,
672
+ "grad_norm": 0.7200827598571777,
673
+ "learning_rate": 7.892462249346432e-05,
674
+ "loss": 6.075145922209087,
675
+ "step": 1786
676
+ },
677
+ {
678
+ "epoch": 3.491295938104449,
679
+ "grad_norm": 0.9759312868118286,
680
+ "learning_rate": 7.84216458047498e-05,
681
+ "loss": 6.069999694824219,
682
+ "step": 1805
683
+ },
684
+ {
685
+ "epoch": 3.528046421663443,
686
+ "grad_norm": 0.8275994658470154,
687
+ "learning_rate": 7.79143836360516e-05,
688
+ "loss": 6.080008255807977,
689
+ "step": 1824
690
+ },
691
+ {
692
+ "epoch": 3.564796905222437,
693
+ "grad_norm": 0.7960038185119629,
694
+ "learning_rate": 7.740291247350581e-05,
695
+ "loss": 6.059996353952508,
696
+ "step": 1843
697
+ },
698
+ {
699
+ "epoch": 3.601547388781431,
700
+ "grad_norm": 0.7582727074623108,
701
+ "learning_rate": 7.688730943789023e-05,
702
+ "loss": 6.085317511307566,
703
+ "step": 1862
704
+ },
705
+ {
706
+ "epoch": 3.6382978723404253,
707
+ "grad_norm": 0.804373025894165,
708
+ "learning_rate": 7.636765227299576e-05,
709
+ "loss": 6.070657027395148,
710
+ "step": 1881
711
+ },
712
+ {
713
+ "epoch": 3.6750483558994196,
714
+ "grad_norm": 0.8290799856185913,
715
+ "learning_rate": 7.584401933390404e-05,
716
+ "loss": 6.05766457005551,
717
+ "step": 1900
718
+ },
719
+ {
720
+ "epoch": 3.711798839458414,
721
+ "grad_norm": 0.9162618517875671,
722
+ "learning_rate": 7.531648957517301e-05,
723
+ "loss": 6.049548098915501,
724
+ "step": 1919
725
+ },
726
+ {
727
+ "epoch": 3.748549323017408,
728
+ "grad_norm": 0.8927680253982544,
729
+ "learning_rate": 7.478514253893181e-05,
730
+ "loss": 6.04520697342722,
731
+ "step": 1938
732
+ },
733
+ {
734
+ "epoch": 3.7852998065764023,
735
+ "grad_norm": 0.7886133790016174,
736
+ "learning_rate": 7.425005834288738e-05,
737
+ "loss": 6.05512157239412,
738
+ "step": 1957
739
+ },
740
+ {
741
+ "epoch": 3.8220502901353965,
742
+ "grad_norm": 0.9593386054039001,
743
+ "learning_rate": 7.371131766824399e-05,
744
+ "loss": 6.040921261436061,
745
+ "step": 1976
746
+ },
747
+ {
748
+ "epoch": 3.858800773694391,
749
+ "grad_norm": 0.7840445637702942,
750
+ "learning_rate": 7.316900174753806e-05,
751
+ "loss": 6.032860203793175,
752
+ "step": 1995
753
+ },
754
+ {
755
+ "epoch": 3.895551257253385,
756
+ "grad_norm": 0.7778313159942627,
757
+ "learning_rate": 7.262319235238967e-05,
758
+ "loss": 6.04224355597245,
759
+ "step": 2014
760
+ },
761
+ {
762
+ "epoch": 3.9323017408123793,
763
+ "grad_norm": 0.6466183662414551,
764
+ "learning_rate": 7.207397178117286e-05,
765
+ "loss": 6.039953934518914,
766
+ "step": 2033
767
+ },
768
+ {
769
+ "epoch": 3.9690522243713735,
770
+ "grad_norm": 0.8412506580352783,
771
+ "learning_rate": 7.152142284660659e-05,
772
+ "loss": 6.043909173262747,
773
+ "step": 2052
774
+ },
775
+ {
776
+ "epoch": 4.005802707930368,
777
+ "grad_norm": 0.7424644231796265,
778
+ "learning_rate": 7.096562886326784e-05,
779
+ "loss": 6.020910965768914,
780
+ "step": 2071
781
+ },
782
+ {
783
+ "epoch": 4.042553191489362,
784
+ "grad_norm": 0.8717330694198608,
785
+ "learning_rate": 7.040667363502946e-05,
786
+ "loss": 6.038821973298726,
787
+ "step": 2090
788
+ },
789
+ {
790
+ "epoch": 4.079303675048356,
791
+ "grad_norm": 0.6821705102920532,
792
+ "learning_rate": 6.984464144242395e-05,
793
+ "loss": 5.9910033376593335,
794
+ "step": 2109
795
+ },
796
+ {
797
+ "epoch": 4.1160541586073505,
798
+ "grad_norm": 0.8593968749046326,
799
+ "learning_rate": 6.92796170299354e-05,
800
+ "loss": 6.038429260253906,
801
+ "step": 2128
802
+ },
803
+ {
804
+ "epoch": 4.152804642166345,
805
+ "grad_norm": 0.6946661472320557,
806
+ "learning_rate": 6.871168559322163e-05,
807
+ "loss": 6.043051468698602,
808
+ "step": 2147
809
+ },
810
+ {
811
+ "epoch": 4.189555125725338,
812
+ "grad_norm": 0.872968316078186,
813
+ "learning_rate": 6.814093276626812e-05,
814
+ "loss": 6.0379281294973275,
815
+ "step": 2166
816
+ },
817
+ {
818
+ "epoch": 4.226305609284332,
819
+ "grad_norm": 0.8149793744087219,
820
+ "learning_rate": 6.756744460847593e-05,
821
+ "loss": 6.025306300113075,
822
+ "step": 2185
823
+ },
824
+ {
825
+ "epoch": 4.2630560928433265,
826
+ "grad_norm": 0.8134008646011353,
827
+ "learning_rate": 6.699130759168552e-05,
828
+ "loss": 6.029500860916941,
829
+ "step": 2204
830
+ },
831
+ {
832
+ "epoch": 4.299806576402321,
833
+ "grad_norm": 0.6807850003242493,
834
+ "learning_rate": 6.641260858713825e-05,
835
+ "loss": 6.02039899324116,
836
+ "step": 2223
837
+ },
838
+ {
839
+ "epoch": 4.336557059961315,
840
+ "grad_norm": 0.7424802780151367,
841
+ "learning_rate": 6.583143485237783e-05,
842
+ "loss": 6.042245965254934,
843
+ "step": 2242
844
+ },
845
+ {
846
+ "epoch": 4.373307543520309,
847
+ "grad_norm": 1.0088832378387451,
848
+ "learning_rate": 6.524787401809335e-05,
849
+ "loss": 5.990176953767476,
850
+ "step": 2261
851
+ },
852
+ {
853
+ "epoch": 4.4100580270793035,
854
+ "grad_norm": 0.8164299130439758,
855
+ "learning_rate": 6.466201407490622e-05,
856
+ "loss": 6.0073804353412825,
857
+ "step": 2280
858
+ },
859
+ {
860
+ "epoch": 4.446808510638298,
861
+ "grad_norm": 0.8493334650993347,
862
+ "learning_rate": 6.40739433601026e-05,
863
+ "loss": 6.001832259328742,
864
+ "step": 2299
865
+ },
866
+ {
867
+ "epoch": 4.483558994197292,
868
+ "grad_norm": 0.8219246864318848,
869
+ "learning_rate": 6.348375054431385e-05,
870
+ "loss": 6.0019788240131575,
871
+ "step": 2318
872
+ },
873
+ {
874
+ "epoch": 4.520309477756286,
875
+ "grad_norm": 0.8987888693809509,
876
+ "learning_rate": 6.289152461814648e-05,
877
+ "loss": 5.987865648771587,
878
+ "step": 2337
879
+ },
880
+ {
881
+ "epoch": 4.5570599613152805,
882
+ "grad_norm": 0.7489388585090637,
883
+ "learning_rate": 6.229735487876398e-05,
884
+ "loss": 6.025086252312911,
885
+ "step": 2356
886
+ },
887
+ {
888
+ "epoch": 4.593810444874275,
889
+ "grad_norm": 0.8458240032196045,
890
+ "learning_rate": 6.170133091642245e-05,
891
+ "loss": 5.987234015213816,
892
+ "step": 2375
893
+ },
894
+ {
895
+ "epoch": 4.630560928433269,
896
+ "grad_norm": 0.7269588112831116,
897
+ "learning_rate": 6.110354260096183e-05,
898
+ "loss": 5.985632645456414,
899
+ "step": 2394
900
+ },
901
+ {
902
+ "epoch": 4.667311411992263,
903
+ "grad_norm": 0.7618328928947449,
904
+ "learning_rate": 6.050408006825525e-05,
905
+ "loss": 5.984134071751645,
906
+ "step": 2413
907
+ },
908
+ {
909
+ "epoch": 4.704061895551257,
910
+ "grad_norm": 0.8415816426277161,
911
+ "learning_rate": 5.9903033706618116e-05,
912
+ "loss": 6.002414904142681,
913
+ "step": 2432
914
+ },
915
+ {
916
+ "epoch": 4.740812379110252,
917
+ "grad_norm": 0.8867862224578857,
918
+ "learning_rate": 5.930049414317913e-05,
919
+ "loss": 5.995708264802632,
920
+ "step": 2451
921
+ },
922
+ {
923
+ "epoch": 4.777562862669246,
924
+ "grad_norm": 1.2235946655273438,
925
+ "learning_rate": 5.869655223021529e-05,
926
+ "loss": 6.0039624665912825,
927
+ "step": 2470
928
+ },
929
+ {
930
+ "epoch": 4.81431334622824,
931
+ "grad_norm": 0.7073670625686646,
932
+ "learning_rate": 5.8091299031453106e-05,
933
+ "loss": 6.0098114013671875,
934
+ "step": 2489
935
+ },
936
+ {
937
+ "epoch": 4.851063829787234,
938
+ "grad_norm": 0.7804876565933228,
939
+ "learning_rate": 5.748482580833766e-05,
940
+ "loss": 5.9925079345703125,
941
+ "step": 2508
942
+ },
943
+ {
944
+ "epoch": 4.887814313346229,
945
+ "grad_norm": 0.8264277577400208,
946
+ "learning_rate": 5.6877224006272086e-05,
947
+ "loss": 5.97403275339227,
948
+ "step": 2527
949
+ },
950
+ {
951
+ "epoch": 4.924564796905223,
952
+ "grad_norm": 0.9556435942649841,
953
+ "learning_rate": 5.626858524082922e-05,
954
+ "loss": 6.007706893117804,
955
+ "step": 2546
956
+ },
957
+ {
958
+ "epoch": 4.961315280464216,
959
+ "grad_norm": 1.08724045753479,
960
+ "learning_rate": 5.5659001283937526e-05,
961
+ "loss": 5.989010057951274,
962
+ "step": 2565
963
+ },
964
+ {
965
+ "epoch": 4.9980657640232105,
966
+ "grad_norm": 0.8453856706619263,
967
+ "learning_rate": 5.5048564050043637e-05,
968
+ "loss": 5.995357714201274,
969
+ "step": 2584
970
+ },
971
+ {
972
+ "epoch": 5.034816247582205,
973
+ "grad_norm": 1.1737676858901978,
974
+ "learning_rate": 5.4437365582253185e-05,
975
+ "loss": 5.977565564607319,
976
+ "step": 2603
977
+ },
978
+ {
979
+ "epoch": 5.071566731141199,
980
+ "grad_norm": 0.934160590171814,
981
+ "learning_rate": 5.382549803845235e-05,
982
+ "loss": 5.981942427785773,
983
+ "step": 2622
984
+ },
985
+ {
986
+ "epoch": 5.108317214700193,
987
+ "grad_norm": 0.8727831840515137,
988
+ "learning_rate": 5.321305367741215e-05,
989
+ "loss": 5.968893352307771,
990
+ "step": 2641
991
+ },
992
+ {
993
+ "epoch": 5.145067698259187,
994
+ "grad_norm": 0.8856471180915833,
995
+ "learning_rate": 5.260012484487739e-05,
996
+ "loss": 5.98333057604338,
997
+ "step": 2660
998
+ },
999
+ {
1000
+ "epoch": 5.181818181818182,
1001
+ "grad_norm": 0.7901120781898499,
1002
+ "learning_rate": 5.198680395964256e-05,
1003
+ "loss": 5.964969434236226,
1004
+ "step": 2679
1005
+ },
1006
+ {
1007
+ "epoch": 5.218568665377176,
1008
+ "grad_norm": 0.7979159355163574,
1009
+ "learning_rate": 5.137318349961677e-05,
1010
+ "loss": 5.9825082076223275,
1011
+ "step": 2698
1012
+ },
1013
+ {
1014
+ "epoch": 5.25531914893617,
1015
+ "grad_norm": 0.9568471312522888,
1016
+ "learning_rate": 5.07593559878797e-05,
1017
+ "loss": 5.916827954744038,
1018
+ "step": 2717
1019
+ },
1020
+ {
1021
+ "epoch": 5.292069632495164,
1022
+ "grad_norm": 0.6639050245285034,
1023
+ "learning_rate": 5.0145413978730726e-05,
1024
+ "loss": 5.972771895559211,
1025
+ "step": 2736
1026
+ },
1027
+ {
1028
+ "epoch": 5.328820116054159,
1029
+ "grad_norm": 1.054398536682129,
1030
+ "learning_rate": 4.9531450043733424e-05,
1031
+ "loss": 5.95155173853824,
1032
+ "step": 2755
1033
+ },
1034
+ {
1035
+ "epoch": 5.365570599613153,
1036
+ "grad_norm": 0.8115559220314026,
1037
+ "learning_rate": 4.891755675775739e-05,
1038
+ "loss": 5.972399259868421,
1039
+ "step": 2774
1040
+ },
1041
+ {
1042
+ "epoch": 5.402321083172147,
1043
+ "grad_norm": 0.8311302661895752,
1044
+ "learning_rate": 4.830382668501961e-05,
1045
+ "loss": 5.989575436240749,
1046
+ "step": 2793
1047
+ },
1048
+ {
1049
+ "epoch": 5.439071566731141,
1050
+ "grad_norm": 0.7544533014297485,
1051
+ "learning_rate": 4.7690352365127384e-05,
1052
+ "loss": 5.947163230494449,
1053
+ "step": 2812
1054
+ },
1055
+ {
1056
+ "epoch": 5.475822050290136,
1057
+ "grad_norm": 0.9242804050445557,
1058
+ "learning_rate": 4.7077226299125066e-05,
1059
+ "loss": 5.953185633609169,
1060
+ "step": 2831
1061
+ },
1062
+ {
1063
+ "epoch": 5.51257253384913,
1064
+ "grad_norm": 0.8004162311553955,
1065
+ "learning_rate": 4.646454093554644e-05,
1066
+ "loss": 5.965155350534539,
1067
+ "step": 2850
1068
+ },
1069
+ {
1070
+ "epoch": 5.549323017408124,
1071
+ "grad_norm": 0.9713261127471924,
1072
+ "learning_rate": 4.5852388656475256e-05,
1073
+ "loss": 5.955127916837993,
1074
+ "step": 2869
1075
+ },
1076
+ {
1077
+ "epoch": 5.586073500967118,
1078
+ "grad_norm": 0.795760452747345,
1079
+ "learning_rate": 4.524086176361549e-05,
1080
+ "loss": 5.981726395456414,
1081
+ "step": 2888
1082
+ },
1083
+ {
1084
+ "epoch": 5.6228239845261125,
1085
+ "grad_norm": 0.9622194170951843,
1086
+ "learning_rate": 4.463005246437407e-05,
1087
+ "loss": 5.9348289088199015,
1088
+ "step": 2907
1089
+ },
1090
+ {
1091
+ "epoch": 5.659574468085106,
1092
+ "grad_norm": 0.9427851438522339,
1093
+ "learning_rate": 4.402005285795745e-05,
1094
+ "loss": 5.9512381302682975,
1095
+ "step": 2926
1096
+ },
1097
+ {
1098
+ "epoch": 5.696324951644101,
1099
+ "grad_norm": 0.8677796721458435,
1100
+ "learning_rate": 4.341095492148483e-05,
1101
+ "loss": 5.980510109349301,
1102
+ "step": 2945
1103
+ },
1104
+ {
1105
+ "epoch": 5.733075435203094,
1106
+ "grad_norm": 0.8452844619750977,
1107
+ "learning_rate": 4.2802850496119536e-05,
1108
+ "loss": 5.963108665064762,
1109
+ "step": 2964
1110
+ },
1111
+ {
1112
+ "epoch": 5.769825918762089,
1113
+ "grad_norm": 0.8970301747322083,
1114
+ "learning_rate": 4.219583127322104e-05,
1115
+ "loss": 5.97346335963199,
1116
+ "step": 2983
1117
+ },
1118
+ {
1119
+ "epoch": 5.806576402321083,
1120
+ "grad_norm": 0.8443690538406372,
1121
+ "learning_rate": 4.158998878051962e-05,
1122
+ "loss": 5.9706971017937915,
1123
+ "step": 3002
1124
+ },
1125
+ {
1126
+ "epoch": 5.843326885880077,
1127
+ "grad_norm": 0.9244300723075867,
1128
+ "learning_rate": 4.098541436831541e-05,
1129
+ "loss": 5.951765361585115,
1130
+ "step": 3021
1131
+ },
1132
+ {
1133
+ "epoch": 5.880077369439071,
1134
+ "grad_norm": 0.784065842628479,
1135
+ "learning_rate": 4.038219919570455e-05,
1136
+ "loss": 5.960685328433388,
1137
+ "step": 3040
1138
+ },
1139
+ {
1140
+ "epoch": 5.916827852998066,
1141
+ "grad_norm": 0.9272547960281372,
1142
+ "learning_rate": 3.978043421683395e-05,
1143
+ "loss": 5.95731634842722,
1144
+ "step": 3059
1145
+ },
1146
+ {
1147
+ "epoch": 5.95357833655706,
1148
+ "grad_norm": 0.7737032771110535,
1149
+ "learning_rate": 3.918021016718704e-05,
1150
+ "loss": 5.947649905556126,
1151
+ "step": 3078
1152
+ },
1153
+ {
1154
+ "epoch": 5.990328820116054,
1155
+ "grad_norm": 0.7086819410324097,
1156
+ "learning_rate": 3.858161754990245e-05,
1157
+ "loss": 5.95235162032278,
1158
+ "step": 3097
1159
+ },
1160
+ {
1161
+ "epoch": 6.027079303675048,
1162
+ "grad_norm": 0.7864798307418823,
1163
+ "learning_rate": 3.7984746622127765e-05,
1164
+ "loss": 5.9433951126901725,
1165
+ "step": 3116
1166
+ },
1167
+ {
1168
+ "epoch": 6.0638297872340425,
1169
+ "grad_norm": 0.8476478457450867,
1170
+ "learning_rate": 3.738968738141033e-05,
1171
+ "loss": 5.926896346242804,
1172
+ "step": 3135
1173
+ },
1174
+ {
1175
+ "epoch": 6.100580270793037,
1176
+ "grad_norm": 0.7427075505256653,
1177
+ "learning_rate": 3.679652955212719e-05,
1178
+ "loss": 5.956519277472245,
1179
+ "step": 3154
1180
+ },
1181
+ {
1182
+ "epoch": 6.137330754352031,
1183
+ "grad_norm": 0.9161301851272583,
1184
+ "learning_rate": 3.620536257195635e-05,
1185
+ "loss": 5.917147184673109,
1186
+ "step": 3173
1187
+ },
1188
+ {
1189
+ "epoch": 6.174081237911025,
1190
+ "grad_norm": 0.8627088665962219,
1191
+ "learning_rate": 3.561627557839099e-05,
1192
+ "loss": 5.942029451069079,
1193
+ "step": 3192
1194
+ },
1195
+ {
1196
+ "epoch": 6.2108317214700195,
1197
+ "grad_norm": 0.7476882338523865,
1198
+ "learning_rate": 3.502935739529928e-05,
1199
+ "loss": 5.934722097296464,
1200
+ "step": 3211
1201
+ },
1202
+ {
1203
+ "epoch": 6.247582205029014,
1204
+ "grad_norm": 0.793505847454071,
1205
+ "learning_rate": 3.444469651953126e-05,
1206
+ "loss": 5.916718733938117,
1207
+ "step": 3230
1208
+ },
1209
+ {
1210
+ "epoch": 6.284332688588008,
1211
+ "grad_norm": 0.7478469610214233,
1212
+ "learning_rate": 3.3862381107575005e-05,
1213
+ "loss": 5.954738416169819,
1214
+ "step": 3249
1215
+ },
1216
+ {
1217
+ "epoch": 6.321083172147002,
1218
+ "grad_norm": 0.8610250353813171,
1219
+ "learning_rate": 3.328249896226428e-05,
1220
+ "loss": 5.922407852975946,
1221
+ "step": 3268
1222
+ },
1223
+ {
1224
+ "epoch": 6.3578336557059965,
1225
+ "grad_norm": 0.8182870745658875,
1226
+ "learning_rate": 3.270513751953944e-05,
1227
+ "loss": 5.919796190763774,
1228
+ "step": 3287
1229
+ },
1230
+ {
1231
+ "epoch": 6.394584139264991,
1232
+ "grad_norm": 0.7998473644256592,
1233
+ "learning_rate": 3.213038383526355e-05,
1234
+ "loss": 5.920766730057566,
1235
+ "step": 3306
1236
+ },
1237
+ {
1238
+ "epoch": 6.431334622823985,
1239
+ "grad_norm": 0.8772637248039246,
1240
+ "learning_rate": 3.155832457209603e-05,
1241
+ "loss": 5.93222367136102,
1242
+ "step": 3325
1243
+ },
1244
+ {
1245
+ "epoch": 6.468085106382979,
1246
+ "grad_norm": 0.7384529709815979,
1247
+ "learning_rate": 3.0989045986425325e-05,
1248
+ "loss": 5.92653415077611,
1249
+ "step": 3344
1250
+ },
1251
+ {
1252
+ "epoch": 6.5048355899419725,
1253
+ "grad_norm": 0.8872863054275513,
1254
+ "learning_rate": 3.0422633915363115e-05,
1255
+ "loss": 5.924022072239926,
1256
+ "step": 3363
1257
+ },
1258
+ {
1259
+ "epoch": 6.541586073500967,
1260
+ "grad_norm": 0.7358129024505615,
1261
+ "learning_rate": 2.9859173763801457e-05,
1262
+ "loss": 5.946694625051398,
1263
+ "step": 3382
1264
+ },
1265
+ {
1266
+ "epoch": 6.578336557059961,
1267
+ "grad_norm": 0.9394431114196777,
1268
+ "learning_rate": 2.9298750491535382e-05,
1269
+ "loss": 5.954012017501028,
1270
+ "step": 3401
1271
+ },
1272
+ {
1273
+ "epoch": 6.615087040618955,
1274
+ "grad_norm": 0.8594652414321899,
1275
+ "learning_rate": 2.8741448600452326e-05,
1276
+ "loss": 5.915107727050781,
1277
+ "step": 3420
1278
+ },
1279
+ {
1280
+ "epoch": 6.6518375241779495,
1281
+ "grad_norm": 0.7383516430854797,
1282
+ "learning_rate": 2.818735212179091e-05,
1283
+ "loss": 5.930320739746094,
1284
+ "step": 3439
1285
+ },
1286
+ {
1287
+ "epoch": 6.688588007736944,
1288
+ "grad_norm": 0.7550167441368103,
1289
+ "learning_rate": 2.763654460347035e-05,
1290
+ "loss": 5.959585892526727,
1291
+ "step": 3458
1292
+ },
1293
+ {
1294
+ "epoch": 6.725338491295938,
1295
+ "grad_norm": 0.9746566414833069,
1296
+ "learning_rate": 2.7089109097493003e-05,
1297
+ "loss": 5.915300469649465,
1298
+ "step": 3477
1299
+ },
1300
+ {
1301
+ "epoch": 6.762088974854932,
1302
+ "grad_norm": 0.8552682995796204,
1303
+ "learning_rate": 2.654512814742159e-05,
1304
+ "loss": 5.918191608629729,
1305
+ "step": 3496
1306
+ },
1307
+ {
1308
+ "epoch": 6.7988394584139265,
1309
+ "grad_norm": 0.7178594470024109,
1310
+ "learning_rate": 2.6004683775933116e-05,
1311
+ "loss": 5.931622153834293,
1312
+ "step": 3515
1313
+ },
1314
+ {
1315
+ "epoch": 6.835589941972921,
1316
+ "grad_norm": 0.8271778225898743,
1317
+ "learning_rate": 2.5467857472451234e-05,
1318
+ "loss": 5.90688042891653,
1319
+ "step": 3534
1320
+ },
1321
+ {
1322
+ "epoch": 6.872340425531915,
1323
+ "grad_norm": 0.8247523903846741,
1324
+ "learning_rate": 2.4934730180859138e-05,
1325
+ "loss": 5.911947149979441,
1326
+ "step": 3553
1327
+ },
1328
+ {
1329
+ "epoch": 6.909090909090909,
1330
+ "grad_norm": 0.9226091504096985,
1331
+ "learning_rate": 2.4405382287294666e-05,
1332
+ "loss": 5.909151579204359,
1333
+ "step": 3572
1334
+ },
1335
+ {
1336
+ "epoch": 6.945841392649903,
1337
+ "grad_norm": 0.8541250228881836,
1338
+ "learning_rate": 2.387989360802943e-05,
1339
+ "loss": 5.93184380782278,
1340
+ "step": 3591
1341
+ },
1342
+ {
1343
+ "epoch": 6.982591876208898,
1344
+ "grad_norm": 0.7963822484016418,
1345
+ "learning_rate": 2.3358343377434074e-05,
1346
+ "loss": 5.926949752004523,
1347
+ "step": 3610
1348
+ },
1349
+ {
1350
+ "epoch": 7.019342359767892,
1351
+ "grad_norm": 0.9335833191871643,
1352
+ "learning_rate": 2.2840810236030986e-05,
1353
+ "loss": 5.90260114167866,
1354
+ "step": 3629
1355
+ },
1356
+ {
1357
+ "epoch": 7.056092843326886,
1358
+ "grad_norm": 0.8136786222457886,
1359
+ "learning_rate": 2.2327372218636767e-05,
1360
+ "loss": 5.914011101973684,
1361
+ "step": 3648
1362
+ },
1363
+ {
1364
+ "epoch": 7.09284332688588,
1365
+ "grad_norm": 0.9287970066070557,
1366
+ "learning_rate": 2.181810674259601e-05,
1367
+ "loss": 5.9164786087839225,
1368
+ "step": 3667
1369
+ },
1370
+ {
1371
+ "epoch": 7.129593810444875,
1372
+ "grad_norm": 0.9184285998344421,
1373
+ "learning_rate": 2.1313090596108043e-05,
1374
+ "loss": 5.9290771484375,
1375
+ "step": 3686
1376
+ },
1377
+ {
1378
+ "epoch": 7.166344294003869,
1379
+ "grad_norm": 0.7558146715164185,
1380
+ "learning_rate": 2.081239992664874e-05,
1381
+ "loss": 5.8995819091796875,
1382
+ "step": 3705
1383
+ },
1384
+ {
1385
+ "epoch": 7.203094777562863,
1386
+ "grad_norm": 0.903976321220398,
1387
+ "learning_rate": 2.0316110229488718e-05,
1388
+ "loss": 5.905699880499589,
1389
+ "step": 3724
1390
+ },
1391
+ {
1392
+ "epoch": 7.2398452611218564,
1393
+ "grad_norm": 0.6883618235588074,
1394
+ "learning_rate": 1.9824296336310056e-05,
1395
+ "loss": 5.935149744937294,
1396
+ "step": 3743
1397
+ },
1398
+ {
1399
+ "epoch": 7.276595744680851,
1400
+ "grad_norm": 0.82213294506073,
1401
+ "learning_rate": 1.9337032403923018e-05,
1402
+ "loss": 5.902831228155839,
1403
+ "step": 3762
1404
+ },
1405
+ {
1406
+ "epoch": 7.313346228239845,
1407
+ "grad_norm": 0.8082081079483032,
1408
+ "learning_rate": 1.8854391903084457e-05,
1409
+ "loss": 5.928005419279399,
1410
+ "step": 3781
1411
+ },
1412
+ {
1413
+ "epoch": 7.350096711798839,
1414
+ "grad_norm": 0.8317619562149048,
1415
+ "learning_rate": 1.8376447607419833e-05,
1416
+ "loss": 5.936038368626645,
1417
+ "step": 3800
1418
+ },
1419
+ {
1420
+ "epoch": 7.386847195357833,
1421
+ "grad_norm": 0.8557573556900024,
1422
+ "learning_rate": 1.790327158245012e-05,
1423
+ "loss": 5.898858321340461,
1424
+ "step": 3819
1425
+ },
1426
+ {
1427
+ "epoch": 7.423597678916828,
1428
+ "grad_norm": 0.8798925876617432,
1429
+ "learning_rate": 1.7434935174725686e-05,
1430
+ "loss": 5.881325972707648,
1431
+ "step": 3838
1432
+ },
1433
+ {
1434
+ "epoch": 7.460348162475822,
1435
+ "grad_norm": 0.7644050121307373,
1436
+ "learning_rate": 1.697150900106844e-05,
1437
+ "loss": 5.888987491005345,
1438
+ "step": 3857
1439
+ },
1440
+ {
1441
+ "epoch": 7.497098646034816,
1442
+ "grad_norm": 0.8622159361839294,
1443
+ "learning_rate": 1.6513062937924155e-05,
1444
+ "loss": 5.928788837633635,
1445
+ "step": 3876
1446
+ },
1447
+ {
1448
+ "epoch": 7.53384912959381,
1449
+ "grad_norm": 0.9436091780662537,
1450
+ "learning_rate": 1.6059666110826277e-05,
1451
+ "loss": 5.897299114026521,
1452
+ "step": 3895
1453
+ },
1454
+ {
1455
+ "epoch": 7.570599613152805,
1456
+ "grad_norm": 0.8347904682159424,
1457
+ "learning_rate": 1.5611386883972995e-05,
1458
+ "loss": 5.9460095857319075,
1459
+ "step": 3914
1460
+ },
1461
+ {
1462
+ "epoch": 7.607350096711799,
1463
+ "grad_norm": 0.7909878492355347,
1464
+ "learning_rate": 1.5168292849919185e-05,
1465
+ "loss": 5.919348465768914,
1466
+ "step": 3933
1467
+ },
1468
+ {
1469
+ "epoch": 7.644100580270793,
1470
+ "grad_norm": 0.9378625750541687,
1471
+ "learning_rate": 1.4730450819384622e-05,
1472
+ "loss": 5.925768400493421,
1473
+ "step": 3952
1474
+ },
1475
+ {
1476
+ "epoch": 7.680851063829787,
1477
+ "grad_norm": 0.7907970547676086,
1478
+ "learning_rate": 1.4297926811180174e-05,
1479
+ "loss": 5.891129744680304,
1480
+ "step": 3971
1481
+ },
1482
+ {
1483
+ "epoch": 7.717601547388782,
1484
+ "grad_norm": 0.7062816619873047,
1485
+ "learning_rate": 1.3870786042253225e-05,
1486
+ "loss": 5.924658925909745,
1487
+ "step": 3990
1488
+ },
1489
+ {
1490
+ "epoch": 7.754352030947776,
1491
+ "grad_norm": 0.7920564413070679,
1492
+ "learning_rate": 1.34490929178542e-05,
1493
+ "loss": 5.9171387521844165,
1494
+ "step": 4009
1495
+ },
1496
+ {
1497
+ "epoch": 7.79110251450677,
1498
+ "grad_norm": 0.8155921697616577,
1499
+ "learning_rate": 1.3032911021825366e-05,
1500
+ "loss": 5.90830471641139,
1501
+ "step": 4028
1502
+ },
1503
+ {
1504
+ "epoch": 7.827852998065764,
1505
+ "grad_norm": 0.7282528877258301,
1506
+ "learning_rate": 1.2622303107013512e-05,
1507
+ "loss": 5.909604925858347,
1508
+ "step": 4047
1509
+ },
1510
+ {
1511
+ "epoch": 7.8646034816247585,
1512
+ "grad_norm": 0.9870123267173767,
1513
+ "learning_rate": 1.2217331085807982e-05,
1514
+ "loss": 5.930417111045436,
1515
+ "step": 4066
1516
+ },
1517
+ {
1518
+ "epoch": 7.901353965183753,
1519
+ "grad_norm": 0.7872030138969421,
1520
+ "learning_rate": 1.1818056020805302e-05,
1521
+ "loss": 5.9119214509662825,
1522
+ "step": 4085
1523
+ },
1524
+ {
1525
+ "epoch": 7.938104448742747,
1526
+ "grad_norm": 0.8964416980743408,
1527
+ "learning_rate": 1.1424538115602073e-05,
1528
+ "loss": 5.888004503752056,
1529
+ "step": 4104
1530
+ },
1531
+ {
1532
+ "epoch": 7.97485493230174,
1533
+ "grad_norm": 0.7404572367668152,
1534
+ "learning_rate": 1.1036836705717363e-05,
1535
+ "loss": 5.905365391781456,
1536
+ "step": 4123
1537
+ },
1538
+ {
1539
+ "epoch": 8.011605415860735,
1540
+ "grad_norm": 0.8973370790481567,
1541
+ "learning_rate": 1.0655010249645891e-05,
1542
+ "loss": 5.92196334035773,
1543
+ "step": 4142
1544
+ },
1545
+ {
1546
+ "epoch": 8.048355899419729,
1547
+ "grad_norm": 0.8033064603805542,
1548
+ "learning_rate": 1.0279116320043603e-05,
1549
+ "loss": 5.9180245650442025,
1550
+ "step": 4161
1551
+ },
1552
+ {
1553
+ "epoch": 8.085106382978724,
1554
+ "grad_norm": 0.8662456274032593,
1555
+ "learning_rate": 9.909211595046663e-06,
1556
+ "loss": 5.927662899619655,
1557
+ "step": 4180
1558
+ },
1559
+ {
1560
+ "epoch": 8.121856866537717,
1561
+ "grad_norm": 0.881951630115509,
1562
+ "learning_rate": 9.545351849725448e-06,
1563
+ "loss": 5.897458126670436,
1564
+ "step": 4199
1565
+ },
1566
+ {
1567
+ "epoch": 8.158607350096712,
1568
+ "grad_norm": 0.9466047286987305,
1569
+ "learning_rate": 9.187591947674612e-06,
1570
+ "loss": 5.879381681743421,
1571
+ "step": 4218
1572
+ },
1573
+ {
1574
+ "epoch": 8.195357833655706,
1575
+ "grad_norm": 0.7660216093063354,
1576
+ "learning_rate": 8.835985832740712e-06,
1577
+ "loss": 5.892818651701274,
1578
+ "step": 4237
1579
+ },
1580
+ {
1581
+ "epoch": 8.232108317214701,
1582
+ "grad_norm": 0.88617342710495,
1583
+ "learning_rate": 8.490586520888321e-06,
1584
+ "loss": 5.913487083033512,
1585
+ "step": 4256
1586
+ },
1587
+ {
1588
+ "epoch": 8.268858800773694,
1589
+ "grad_norm": 0.8038977384567261,
1590
+ "learning_rate": 8.15144609220625e-06,
1591
+ "loss": 5.897986562628495,
1592
+ "step": 4275
1593
+ },
1594
+ {
1595
+ "epoch": 8.30560928433269,
1596
+ "grad_norm": 0.8149951100349426,
1597
+ "learning_rate": 7.818615683054737e-06,
1598
+ "loss": 5.905342503597862,
1599
+ "step": 4294
1600
+ },
1601
+ {
1602
+ "epoch": 8.342359767891683,
1603
+ "grad_norm": 0.8831230401992798,
1604
+ "learning_rate": 7.492145478355023e-06,
1605
+ "loss": 5.904563903808594,
1606
+ "step": 4313
1607
+ },
1608
+ {
1609
+ "epoch": 8.379110251450676,
1610
+ "grad_norm": 0.7993362545967102,
1611
+ "learning_rate": 7.172084704022364e-06,
1612
+ "loss": 5.920640242727179,
1613
+ "step": 4332
1614
+ },
1615
+ {
1616
+ "epoch": 8.415860735009671,
1617
+ "grad_norm": 0.8131271004676819,
1618
+ "learning_rate": 6.8584816195436215e-06,
1619
+ "loss": 5.89423972681949,
1620
+ "step": 4351
1621
+ },
1622
+ {
1623
+ "epoch": 8.452611218568665,
1624
+ "grad_norm": 0.8914048075675964,
1625
+ "learning_rate": 6.551383510700565e-06,
1626
+ "loss": 5.895122327302632,
1627
+ "step": 4370
1628
+ },
1629
+ {
1630
+ "epoch": 8.48936170212766,
1631
+ "grad_norm": 0.8953275084495544,
1632
+ "learning_rate": 6.250836682440047e-06,
1633
+ "loss": 5.9110107421875,
1634
+ "step": 4389
1635
+ },
1636
+ {
1637
+ "epoch": 8.526112185686653,
1638
+ "grad_norm": 0.7585981488227844,
1639
+ "learning_rate": 5.956886451892019e-06,
1640
+ "loss": 5.89650485390111,
1641
+ "step": 4408
1642
+ },
1643
+ {
1644
+ "epoch": 8.562862669245648,
1645
+ "grad_norm": 0.7543755173683167,
1646
+ "learning_rate": 5.669577141536553e-06,
1647
+ "loss": 5.915409288908306,
1648
+ "step": 4427
1649
+ },
1650
+ {
1651
+ "epoch": 8.599613152804642,
1652
+ "grad_norm": 0.7309451103210449,
1653
+ "learning_rate": 5.3889520725207366e-06,
1654
+ "loss": 5.88703717683491,
1655
+ "step": 4446
1656
+ },
1657
+ {
1658
+ "epoch": 8.636363636363637,
1659
+ "grad_norm": 0.9153041839599609,
1660
+ "learning_rate": 5.115053558126653e-06,
1661
+ "loss": 5.926216125488281,
1662
+ "step": 4465
1663
+ },
1664
+ {
1665
+ "epoch": 8.67311411992263,
1666
+ "grad_norm": 0.9788889288902283,
1667
+ "learning_rate": 4.847922897391266e-06,
1668
+ "loss": 5.901534632632607,
1669
+ "step": 4484
1670
+ },
1671
+ {
1672
+ "epoch": 8.709864603481625,
1673
+ "grad_norm": 0.7951791882514954,
1674
+ "learning_rate": 4.587600368879308e-06,
1675
+ "loss": 5.907471907766242,
1676
+ "step": 4503
1677
+ },
1678
+ {
1679
+ "epoch": 8.746615087040619,
1680
+ "grad_norm": 0.8469536900520325,
1681
+ "learning_rate": 4.334125224609903e-06,
1682
+ "loss": 5.8754529451069075,
1683
+ "step": 4522
1684
+ },
1685
+ {
1686
+ "epoch": 8.783365570599614,
1687
+ "grad_norm": 0.8822078108787537,
1688
+ "learning_rate": 4.087535684138127e-06,
1689
+ "loss": 5.920900445235403,
1690
+ "step": 4541
1691
+ },
1692
+ {
1693
+ "epoch": 8.820116054158607,
1694
+ "grad_norm": 0.8397153615951538,
1695
+ "learning_rate": 3.84786892879217e-06,
1696
+ "loss": 5.893511320415296,
1697
+ "step": 4560
1698
+ },
1699
+ {
1700
+ "epoch": 8.856866537717602,
1701
+ "grad_norm": 0.8214257955551147,
1702
+ "learning_rate": 3.615161096066999e-06,
1703
+ "loss": 5.91493706954153,
1704
+ "step": 4579
1705
+ },
1706
+ {
1707
+ "epoch": 8.893617021276595,
1708
+ "grad_norm": 0.7739570140838623,
1709
+ "learning_rate": 3.389447274175528e-06,
1710
+ "loss": 5.91633405183491,
1711
+ "step": 4598
1712
+ },
1713
+ {
1714
+ "epoch": 8.93036750483559,
1715
+ "grad_norm": 0.7047730684280396,
1716
+ "learning_rate": 3.1707614967579122e-06,
1717
+ "loss": 5.886983771073191,
1718
+ "step": 4617
1719
+ },
1720
+ {
1721
+ "epoch": 8.967117988394584,
1722
+ "grad_norm": 0.9191731214523315,
1723
+ "learning_rate": 2.959136737749868e-06,
1724
+ "loss": 5.887207834344161,
1725
+ "step": 4636
1726
+ }
1727
+ ],
1728
+ "logging_steps": 19,
1729
+ "max_steps": 5170,
1730
+ "num_input_tokens_seen": 0,
1731
+ "num_train_epochs": 10,
1732
+ "save_steps": 500,
1733
+ "stateful_callbacks": {
1734
+ "TrainerControl": {
1735
+ "args": {
1736
+ "should_epoch_stop": false,
1737
+ "should_evaluate": false,
1738
+ "should_log": false,
1739
+ "should_save": true,
1740
+ "should_training_stop": false
1741
+ },
1742
+ "attributes": {}
1743
+ }
1744
+ },
1745
+ "total_flos": 9896071987200000.0,
1746
+ "train_batch_size": 64,
1747
+ "trial_name": null,
1748
+ "trial_params": null
1749
+ }
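The `log_history` array above reduces to a plain loss curve with a few lines of Python. A minimal sketch, using three entries copied from the log above as inline sample data (the same pattern works on the full `checkpoint-4653/trainer_state.json` via `json.load`):

```python
import json

# A few log_history entries copied (rounded) from the trainer_state.json above.
log_history = json.loads("""
[
  {"epoch": 4.15, "grad_norm": 0.69, "learning_rate": 6.87e-05, "loss": 6.043, "step": 2147},
  {"epoch": 5.03, "grad_norm": 1.17, "learning_rate": 5.44e-05, "loss": 5.978, "step": 2603},
  {"epoch": 8.97, "grad_norm": 0.92, "learning_rate": 2.96e-06, "loss": 5.887, "step": 4636}
]
""")

# Extract (step, loss) pairs for plotting or monitoring.
curve = [(e["step"], e["loss"]) for e in log_history]
print(curve)  # [(2147, 6.043), (2603, 5.978), (4636, 5.887)]

# With logging_steps=19 and max_steps=5170 (from the trainer state above),
# the trainer emits roughly this many periodic log entries:
print(5170 // 19)  # 272
```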
checkpoint-4653/training_args.bin ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:655a39837857415e4b7e97f5e1babb75c8c7355fae7a986299c7a8effdb82e4b
3
+ size 5265
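The three lines above are not the binary itself but a Git LFS pointer file (`version` / `oid` / `size`), which the LFS client resolves to the real blob on checkout. A minimal sketch parsing that format, using the pointer above as sample input:

```python
# Parse a Git LFS pointer file: one "key value" pair per line.
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:655a39837857415e4b7e97f5e1babb75c8c7355fae7a986299c7a8effdb82e4b
size 5265
"""

fields = dict(line.split(" ", 1) for line in pointer.strip().splitlines())
algo, digest = fields["oid"].split(":", 1)
print(algo, int(fields["size"]))  # sha256 5265
```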
checkpoint-5170/config.json ADDED
@@ -0,0 +1,28 @@
1
+ {
2
+ "add_cross_attention": false,
3
+ "architectures": [
4
+ "BertForMaskedLM"
5
+ ],
6
+ "attention_probs_dropout_prob": 0.1,
7
+ "bos_token_id": 2,
8
+ "classifier_dropout": null,
9
+ "dtype": "float32",
10
+ "eos_token_id": 3,
11
+ "hidden_act": "gelu",
12
+ "hidden_dropout_prob": 0.1,
13
+ "hidden_size": 384,
14
+ "initializer_range": 0.02,
15
+ "intermediate_size": 1536,
16
+ "is_decoder": false,
17
+ "layer_norm_eps": 1e-12,
18
+ "max_position_embeddings": 512,
19
+ "model_type": "bert",
20
+ "num_attention_heads": 12,
21
+ "num_hidden_layers": 6,
22
+ "pad_token_id": 0,
23
+ "tie_word_embeddings": true,
24
+ "transformers_version": "5.3.0",
25
+ "type_vocab_size": 2,
26
+ "use_cache": false,
27
+ "vocab_size": 32000
28
+ }
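The config above pins down the model size. A back-of-the-envelope parameter count, assuming the standard `BertForMaskedLM` layout with tied word embeddings (the per-term breakdown below is an estimate from the config, not read from the checkpoint):

```python
# Mini config from config.json above.
H, L, I, V, P = 384, 6, 1536, 32000, 512  # hidden, layers, intermediate, vocab, max_pos

embeddings = V * H + P * H + 2 * H + 2 * H   # word + position + token_type + LayerNorm
per_layer = (
    4 * (H * H + H)      # Q, K, V and attention output projections
    + 2 * H              # attention output LayerNorm
    + (H * I + I)        # intermediate dense
    + (I * H + H)        # output dense
    + 2 * H              # output LayerNorm
)
mlm_head = (H * H + H) + 2 * H + V           # transform dense + LayerNorm + decoder bias

total = embeddings + L * per_layer + mlm_head
print(total)      # 23313536  (~23.3M parameters)
print(total * 4)  # 93254144 float32 bytes, matching the ~93 MB safetensors above
```

The float32 byte count lands within a few kilobytes of the 93,266,320-byte `model.safetensors`, the remainder being the safetensors header.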
checkpoint-5170/model.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:d9eb7e7c59ea3bfcbcd6be2a7dfb1fd25f2dae1ba766eca3e302f8b44b8a07cd
3
+ size 93266320
checkpoint-5170/optimizer.pt ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:86d330b176b3241f93377f2da26c974c696b29f46b7549bbd383bf2fcd1ec964
3
+ size 186597643
checkpoint-5170/rng_state.pth ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:ad1072e52d81484f1fedd98983b8395289e98c0f984f673fad30820cf329ef5c
3
+ size 14645
checkpoint-5170/scaler.pt ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:4b8b4d95211da64b86a7cf4dc99f0f9f6dd94eb910710f8c757e6759e19e36ba
3
+ size 1383
checkpoint-5170/scheduler.pt ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:d827a4e4d7b8fbb4d8790a89b06e50e0a12911921e04c665bb30e9352802046f
3
+ size 1465
checkpoint-5170/tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
checkpoint-5170/tokenizer_config.json ADDED
@@ -0,0 +1,13 @@
1
+ {
2
+ "backend": "tokenizers",
3
+ "bos_token": "[CLS]",
4
+ "cls_token": "[CLS]",
5
+ "eos_token": "[SEP]",
6
+ "is_local": true,
7
+ "mask_token": "[MASK]",
8
+ "model_max_length": 1000000000000000019884624838656,
9
+ "pad_token": "[PAD]",
10
+ "sep_token": "[SEP]",
11
+ "tokenizer_class": "TokenizersBackend",
12
+ "unk_token": "[UNK]"
13
+ }
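The huge `model_max_length` above is not a real limit: it is the `transformers` "effectively unlimited" sentinel, simply `int(1e30)`, whose exact digits leak float rounding. The practical limit comes from the model config (`max_position_embeddings: 512`), so callers should truncate explicitly, e.g. `tokenizer(text, truncation=True, max_length=512)`:

```python
# The sentinel value stored in tokenizer_config.json above.
sentinel = int(1e30)
print(sentinel)  # 1000000000000000019884624838656
```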
checkpoint-5170/trainer_state.json ADDED
@@ -0,0 +1,1945 @@
{
  "best_global_step": null,
  "best_metric": null,
  "best_model_checkpoint": null,
  "epoch": 10.0,
  "eval_steps": 500,
  "global_step": 5170,
  "is_hyper_param_search": false,
  "is_local_process_zero": true,
  "is_world_process_zero": true,
  "log_history": [
    {
      "epoch": 0.0025906735751295338,
      "grad_norm": 1.9656798839569092,
      "learning_rate": 0.0,
      "loss": 10.453109741210938,
      "step": 1
    },
    {
      "epoch": 0.04922279792746114,
      "grad_norm": 1.8898563385009766,
      "learning_rate": 7.792207792207792e-06,
      "loss": 10.416481018066406,
      "step": 19
    },
    {
      "epoch": 0.09844559585492228,
      "grad_norm": 1.5997825860977173,
      "learning_rate": 1.6017316017316017e-05,
      "loss": 10.20559371145148,
      "step": 38
    },
    {
      "epoch": 0.14766839378238342,
      "grad_norm": 1.4691131114959717,
      "learning_rate": 2.4242424242424244e-05,
      "loss": 9.938501458418997,
      "step": 57
    },
    {
      "epoch": 0.19689119170984457,
      "grad_norm": 1.5397157669067383,
      "learning_rate": 3.246753246753247e-05,
      "loss": 9.666742425215872,
      "step": 76
    },
    {
      "epoch": 0.24611398963730569,
      "grad_norm": 1.5672590732574463,
      "learning_rate": 4.0692640692640695e-05,
      "loss": 9.390644274259868,
      "step": 95
    },
    {
      "epoch": 0.29533678756476683,
      "grad_norm": 1.4179465770721436,
      "learning_rate": 4.8917748917748915e-05,
      "loss": 9.110911319130345,
      "step": 114
    },
    {
      "epoch": 0.344559585492228,
      "grad_norm": 1.3159687519073486,
      "learning_rate": 5.714285714285714e-05,
      "loss": 8.792612176192435,
      "step": 133
    },
    {
      "epoch": 0.39378238341968913,
      "grad_norm": 1.0485610961914062,
      "learning_rate": 6.536796536796536e-05,
      "loss": 8.488385652240954,
      "step": 152
    },
    {
      "epoch": 0.4430051813471503,
      "grad_norm": 0.8901123404502869,
      "learning_rate": 7.35930735930736e-05,
      "loss": 8.201339721679688,
      "step": 171
    },
    {
      "epoch": 0.49222797927461137,
      "grad_norm": 0.7018402218818665,
      "learning_rate": 8.181818181818183e-05,
      "loss": 7.978334125719573,
      "step": 190
    },
    {
      "epoch": 0.5414507772020726,
      "grad_norm": 0.5120431184768677,
      "learning_rate": 9.004329004329005e-05,
      "loss": 7.839283993369655,
      "step": 209
    },
    {
      "epoch": 0.5906735751295337,
      "grad_norm": 0.5026654601097107,
      "learning_rate": 9.826839826839827e-05,
      "loss": 7.7916211579975325,
      "step": 228
    },
    {
      "epoch": 0.6398963730569949,
      "grad_norm": 0.638586163520813,
      "learning_rate": 9.999578456659054e-05,
      "loss": 7.715636403937089,
      "step": 247
    },
    {
      "epoch": 0.689119170984456,
      "grad_norm": 0.6381962895393372,
      "learning_rate": 9.997834329912887e-05,
      "loss": 7.697683233963816,
      "step": 266
    },
    {
      "epoch": 0.7383419689119171,
      "grad_norm": 0.6784098148345947,
      "learning_rate": 9.994738114801949e-05,
      "loss": 7.658172607421875,
      "step": 285
    },
    {
      "epoch": 0.7875647668393783,
      "grad_norm": 0.7522804141044617,
      "learning_rate": 9.990290648960332e-05,
      "loss": 7.618246781198602,
      "step": 304
    },
    {
      "epoch": 0.8367875647668394,
      "grad_norm": 1.413805365562439,
      "learning_rate": 9.984493135582543e-05,
      "loss": 7.569692511307566,
      "step": 323
    },
    {
      "epoch": 0.8860103626943006,
      "grad_norm": 0.769112229347229,
      "learning_rate": 9.977347143098e-05,
      "loss": 7.520751953125,
      "step": 342
    },
    {
      "epoch": 0.9352331606217616,
      "grad_norm": 0.8767187595367432,
      "learning_rate": 9.96885460474671e-05,
      "loss": 7.502451043379934,
      "step": 361
    },
    {
      "epoch": 0.9844559585492227,
      "grad_norm": 0.7236428260803223,
      "learning_rate": 9.959017818056273e-05,
      "loss": 7.4918670654296875,
      "step": 380
    },
    {
      "epoch": 1.0336787564766838,
      "grad_norm": 0.5859951376914978,
      "learning_rate": 9.947839444220306e-05,
      "loss": 7.4534374036287,
      "step": 399
    },
    {
      "epoch": 1.0829015544041452,
      "grad_norm": 0.770540714263916,
      "learning_rate": 9.935322507378509e-05,
      "loss": 7.40715187474301,
      "step": 418
    },
    {
      "epoch": 1.1321243523316062,
      "grad_norm": 0.6562390327453613,
      "learning_rate": 9.921470393798522e-05,
      "loss": 7.423827321905839,
      "step": 437
    },
    {
      "epoch": 1.1813471502590673,
      "grad_norm": 0.7159621715545654,
      "learning_rate": 9.906286850959825e-05,
      "loss": 7.380163895456414,
      "step": 456
    },
    {
      "epoch": 1.2305699481865284,
      "grad_norm": 0.6420731544494629,
      "learning_rate": 9.889775986539913e-05,
      "loss": 7.33871781198602,
      "step": 475
    },
    {
      "epoch": 1.2797927461139897,
      "grad_norm": 0.7125868797302246,
      "learning_rate": 9.871942267303034e-05,
      "loss": 7.3665418122944075,
      "step": 494
    },
    {
      "epoch": 1.3290155440414508,
      "grad_norm": 0.7381535768508911,
      "learning_rate": 9.852790517891754e-05,
      "loss": 7.347101311934622,
      "step": 513
    },
    {
      "epoch": 1.378238341968912,
      "grad_norm": 0.7120394110679626,
      "learning_rate": 9.83232591952175e-05,
      "loss": 7.310685810289885,
      "step": 532
    },
    {
      "epoch": 1.427461139896373,
      "grad_norm": 0.7593790888786316,
      "learning_rate": 9.810554008580081e-05,
      "loss": 7.298673127826891,
      "step": 551
    },
    {
      "epoch": 1.4766839378238341,
      "grad_norm": 0.7787287831306458,
      "learning_rate": 9.787480675127431e-05,
      "loss": 7.280764931126645,
      "step": 570
    },
    {
      "epoch": 1.5259067357512954,
      "grad_norm": 0.8570913076400757,
      "learning_rate": 9.763112161304621e-05,
      "loss": 7.271910014905427,
      "step": 589
    },
    {
      "epoch": 1.5751295336787565,
      "grad_norm": 0.6566023826599121,
      "learning_rate": 9.737455059643903e-05,
      "loss": 7.260608070775082,
      "step": 608
    },
    {
      "epoch": 1.6243523316062176,
      "grad_norm": 0.6554204821586609,
      "learning_rate": 9.710516311285445e-05,
      "loss": 7.235391717208059,
      "step": 627
    },
    {
      "epoch": 1.6735751295336787,
      "grad_norm": 0.8252356648445129,
      "learning_rate": 9.682303204099517e-05,
      "loss": 7.23214400442023,
      "step": 646
    },
    {
      "epoch": 1.7227979274611398,
      "grad_norm": 0.7991335988044739,
      "learning_rate": 9.652823370714861e-05,
      "loss": 7.189540662263569,
      "step": 665
    },
    {
      "epoch": 1.7720207253886011,
      "grad_norm": 0.6291921734809875,
      "learning_rate": 9.622084786453804e-05,
      "loss": 7.1787647448088,
      "step": 684
    },
    {
      "epoch": 1.8212435233160622,
      "grad_norm": 0.6076451539993286,
      "learning_rate": 9.590095767174654e-05,
      "loss": 7.1707924290707235,
      "step": 703
    },
    {
      "epoch": 1.8704663212435233,
      "grad_norm": 0.7197585105895996,
      "learning_rate": 9.556864967021966e-05,
      "loss": 7.1632947419819075,
      "step": 722
    },
    {
      "epoch": 1.9196891191709846,
      "grad_norm": 0.8579983711242676,
      "learning_rate": 9.522401376085302e-05,
      "loss": 7.2033129240337175,
      "step": 741
    },
    {
      "epoch": 1.9689119170984455,
      "grad_norm": 0.7442916631698608,
      "learning_rate": 9.486714317967097e-05,
      "loss": 7.15290671900699,
      "step": 760
    },
    {
      "epoch": 2.018134715025907,
      "grad_norm": 0.6029990911483765,
      "learning_rate": 9.449813447260292e-05,
      "loss": 7.155892623098273,
      "step": 779
    },
    {
      "epoch": 2.0673575129533677,
      "grad_norm": 0.8712852597236633,
      "learning_rate": 9.411708746936439e-05,
      "loss": 7.1117409153988485,
      "step": 798
    },
    {
      "epoch": 2.116580310880829,
      "grad_norm": 0.7027563452720642,
      "learning_rate": 9.372410525644952e-05,
      "loss": 7.13547074167352,
      "step": 817
    },
    {
      "epoch": 2.1658031088082903,
      "grad_norm": 0.7402092814445496,
      "learning_rate": 9.33192941492427e-05,
      "loss": 7.12723099557977,
      "step": 836
    },
    {
      "epoch": 2.215025906735751,
      "grad_norm": 0.6641818284988403,
      "learning_rate": 9.290276366325638e-05,
      "loss": 7.079131276983964,
      "step": 855
    },
    {
      "epoch": 2.2642487046632125,
      "grad_norm": 0.654100239276886,
      "learning_rate": 9.247462648450348e-05,
      "loss": 7.120608681126645,
      "step": 874
    },
    {
      "epoch": 2.313471502590674,
      "grad_norm": 0.6241465210914612,
      "learning_rate": 9.203499843901173e-05,
      "loss": 7.0452880859375,
      "step": 893
    },
    {
      "epoch": 2.3626943005181347,
      "grad_norm": 0.6422539353370667,
      "learning_rate": 9.158399846148886e-05,
      "loss": 7.0627602025082235,
      "step": 912
    },
    {
      "epoch": 2.411917098445596,
      "grad_norm": 0.9347654581069946,
      "learning_rate": 9.11217485631465e-05,
      "loss": 7.087360582853618,
      "step": 931
    },
    {
      "epoch": 2.461139896373057,
      "grad_norm": 0.6797104477882385,
      "learning_rate": 9.064837379869189e-05,
      "loss": 7.03591597707648,
      "step": 950
    },
    {
      "epoch": 2.510362694300518,
      "grad_norm": 0.7567751407623291,
      "learning_rate": 9.016400223249635e-05,
      "loss": 7.0485181306537825,
      "step": 969
    },
    {
      "epoch": 2.5595854922279795,
      "grad_norm": 0.9307654500007629,
      "learning_rate": 8.966876490394927e-05,
      "loss": 7.068600303248355,
      "step": 988
    },
    {
      "epoch": 2.6088082901554404,
      "grad_norm": 0.6442763805389404,
      "learning_rate": 8.91627957920074e-05,
      "loss": 7.0317848607113485,
      "step": 1007
    },
    {
      "epoch": 2.6580310880829017,
      "grad_norm": 0.755132257938385,
      "learning_rate": 8.8646231778949e-05,
      "loss": 7.035190783048931,
      "step": 1026
    },
    {
      "epoch": 2.7072538860103625,
      "grad_norm": 0.9993160367012024,
      "learning_rate": 8.811921261334224e-05,
      "loss": 7.045703285618832,
      "step": 1045
    },
    {
      "epoch": 2.756476683937824,
      "grad_norm": 0.6708552837371826,
      "learning_rate": 8.758188087223845e-05,
      "loss": 7.088768406918175,
      "step": 1064
    },
    {
      "epoch": 2.805699481865285,
      "grad_norm": 0.6677550077438354,
      "learning_rate": 8.703438192260007e-05,
      "loss": 7.025689376027961,
      "step": 1083
    },
    {
      "epoch": 2.854922279792746,
      "grad_norm": 0.8610202074050903,
      "learning_rate": 8.647686388197374e-05,
      "loss": 6.9846753572162825,
      "step": 1102
    },
    {
      "epoch": 2.9041450777202074,
      "grad_norm": 0.6401988863945007,
      "learning_rate": 8.59094775784194e-05,
      "loss": 7.020596955951891,
      "step": 1121
    },
    {
      "epoch": 2.9533678756476682,
      "grad_norm": 0.7033092975616455,
      "learning_rate": 8.533237650970602e-05,
      "loss": 7.012691297029194,
      "step": 1140
    },
    {
      "epoch": 2.241779497098646,
      "grad_norm": 1.8401119709014893,
      "learning_rate": 8.474571680178515e-05,
      "loss": 7.131598949432373,
      "step": 1159
    },
    {
      "epoch": 2.2785299806576402,
      "grad_norm": 0.9385874271392822,
      "learning_rate": 9.233682395815343e-05,
      "loss": 6.8397754869963,
      "step": 1178
    },
    {
      "epoch": 2.3152804642166345,
      "grad_norm": 0.9496298432350159,
      "learning_rate": 9.200700008023644e-05,
      "loss": 6.565899096037212,
      "step": 1197
    },
    {
      "epoch": 2.3520309477756287,
      "grad_norm": 0.7686163187026978,
      "learning_rate": 9.167084229191691e-05,
      "loss": 6.4427024439761515,
      "step": 1216
    },
    {
      "epoch": 2.388781431334623,
      "grad_norm": 0.6687182188034058,
      "learning_rate": 9.132840127982587e-05,
      "loss": 6.356098375822368,
      "step": 1235
    },
    {
      "epoch": 2.425531914893617,
      "grad_norm": 0.9565212726593018,
      "learning_rate": 9.097972867799301e-05,
      "loss": 6.326987818667763,
      "step": 1254
    },
    {
      "epoch": 2.4622823984526114,
      "grad_norm": 0.9246956706047058,
      "learning_rate": 9.062487706006115e-05,
      "loss": 6.313424762926604,
      "step": 1273
    },
    {
      "epoch": 2.4990328820116052,
      "grad_norm": 0.7130828499794006,
      "learning_rate": 9.026389993135918e-05,
      "loss": 6.297733106111226,
      "step": 1292
    },
    {
      "epoch": 2.5357833655705995,
      "grad_norm": 0.6469439268112183,
      "learning_rate": 8.989685172083433e-05,
      "loss": 6.265300549958882,
      "step": 1311
    },
    {
      "epoch": 2.5725338491295937,
      "grad_norm": 0.7045807838439941,
      "learning_rate": 8.952378777284526e-05,
      "loss": 6.24040422941509,
      "step": 1330
    },
    {
      "epoch": 2.609284332688588,
      "grad_norm": 0.8098029494285583,
      "learning_rate": 8.914476433881713e-05,
      "loss": 6.2236998708624585,
      "step": 1349
    },
    {
      "epoch": 2.646034816247582,
      "grad_norm": 0.737579345703125,
      "learning_rate": 8.875983856875986e-05,
      "loss": 6.20395901328639,
      "step": 1368
    },
    {
      "epoch": 2.6827852998065764,
      "grad_norm": 0.8462916612625122,
      "learning_rate": 8.836906850265096e-05,
      "loss": 6.195942125822368,
      "step": 1387
    },
    {
      "epoch": 2.7195357833655707,
      "grad_norm": 0.8085110187530518,
      "learning_rate": 8.797251306168407e-05,
      "loss": 6.188254908511513,
      "step": 1406
    },
    {
      "epoch": 2.756286266924565,
      "grad_norm": 1.0851175785064697,
      "learning_rate": 8.757023203938474e-05,
      "loss": 6.1910757767526725,
      "step": 1425
    },
    {
      "epoch": 2.793036750483559,
      "grad_norm": 1.278171420097351,
      "learning_rate": 8.716228609259462e-05,
      "loss": 6.186161643580387,
      "step": 1444
    },
    {
      "epoch": 2.829787234042553,
      "grad_norm": 0.7596274614334106,
      "learning_rate": 8.674873673232546e-05,
      "loss": 6.173150313527961,
      "step": 1463
    },
    {
      "epoch": 2.866537717601547,
      "grad_norm": 0.7558140754699707,
      "learning_rate": 8.632964631448441e-05,
      "loss": 6.155330457185444,
      "step": 1482
    },
    {
      "epoch": 2.9032882011605414,
      "grad_norm": 0.6216167211532593,
      "learning_rate": 8.590507803047172e-05,
      "loss": 6.167594106573808,
      "step": 1501
    },
    {
      "epoch": 2.9400386847195357,
      "grad_norm": 0.643828272819519,
      "learning_rate": 8.547509589765275e-05,
      "loss": 6.1355847810444075,
      "step": 1520
    },
    {
      "epoch": 2.97678916827853,
      "grad_norm": 0.8299704194068909,
      "learning_rate": 8.503976474970517e-05,
      "loss": 6.138166327225535,
      "step": 1539
    },
    {
      "epoch": 3.013539651837524,
      "grad_norm": 1.1598100662231445,
      "learning_rate": 8.459915022684329e-05,
      "loss": 6.094761497096012,
      "step": 1558
    },
    {
      "epoch": 3.0502901353965184,
      "grad_norm": 0.6949910521507263,
      "learning_rate": 8.415331876592055e-05,
      "loss": 6.0926979466488485,
      "step": 1577
    },
    {
      "epoch": 3.0870406189555126,
      "grad_norm": 0.6773774027824402,
      "learning_rate": 8.370233759041219e-05,
      "loss": 6.107613814504523,
      "step": 1596
    },
    {
      "epoch": 3.123791102514507,
      "grad_norm": 0.6850952506065369,
      "learning_rate": 8.324627470027901e-05,
      "loss": 6.105125025699013,
      "step": 1615
    },
    {
      "epoch": 3.160541586073501,
      "grad_norm": 0.7101475596427917,
      "learning_rate": 8.278519886171423e-05,
      "loss": 6.1307517603824015,
      "step": 1634
    },
    {
      "epoch": 3.1972920696324953,
      "grad_norm": 0.8034424781799316,
      "learning_rate": 8.231917959677473e-05,
      "loss": 6.124847412109375,
      "step": 1653
    },
    {
      "epoch": 3.2340425531914896,
      "grad_norm": 0.7378991842269897,
      "learning_rate": 8.184828717289845e-05,
      "loss": 6.102732608192845,
      "step": 1672
    },
    {
      "epoch": 3.2707930367504834,
      "grad_norm": 0.8416258692741394,
      "learning_rate": 8.13725925923092e-05,
      "loss": 6.12664112291838,
      "step": 1691
    },
    {
      "epoch": 3.3075435203094776,
      "grad_norm": 0.7685152292251587,
      "learning_rate": 8.089216758131087e-05,
      "loss": 6.120386224043997,
      "step": 1710
    },
    {
      "epoch": 3.344294003868472,
      "grad_norm": 0.8043097853660583,
      "learning_rate": 8.04070845794723e-05,
      "loss": 6.096702575683594,
      "step": 1729
    },
    {
      "epoch": 3.381044487427466,
      "grad_norm": 0.8346231579780579,
      "learning_rate": 7.991741672870475e-05,
      "loss": 6.11234564530222,
      "step": 1748
    },
    {
      "epoch": 3.4177949709864603,
      "grad_norm": 0.6343578100204468,
      "learning_rate": 7.942323786223333e-05,
      "loss": 6.072033932334499,
      "step": 1767
    },
    {
      "epoch": 3.4545454545454546,
      "grad_norm": 0.7200827598571777,
      "learning_rate": 7.892462249346432e-05,
      "loss": 6.075145922209087,
      "step": 1786
    },
    {
      "epoch": 3.491295938104449,
      "grad_norm": 0.9759312868118286,
      "learning_rate": 7.84216458047498e-05,
      "loss": 6.069999694824219,
      "step": 1805
    },
    {
      "epoch": 3.528046421663443,
      "grad_norm": 0.8275994658470154,
      "learning_rate": 7.79143836360516e-05,
      "loss": 6.080008255807977,
      "step": 1824
    },
    {
      "epoch": 3.564796905222437,
      "grad_norm": 0.7960038185119629,
      "learning_rate": 7.740291247350581e-05,
      "loss": 6.059996353952508,
      "step": 1843
    },
    {
      "epoch": 3.601547388781431,
      "grad_norm": 0.7582727074623108,
      "learning_rate": 7.688730943789023e-05,
      "loss": 6.085317511307566,
      "step": 1862
    },
    {
      "epoch": 3.6382978723404253,
      "grad_norm": 0.804373025894165,
      "learning_rate": 7.636765227299576e-05,
      "loss": 6.070657027395148,
      "step": 1881
    },
    {
      "epoch": 3.6750483558994196,
      "grad_norm": 0.8290799856185913,
      "learning_rate": 7.584401933390404e-05,
      "loss": 6.05766457005551,
      "step": 1900
    },
    {
      "epoch": 3.711798839458414,
      "grad_norm": 0.9162618517875671,
      "learning_rate": 7.531648957517301e-05,
      "loss": 6.049548098915501,
      "step": 1919
    },
    {
      "epoch": 3.748549323017408,
      "grad_norm": 0.8927680253982544,
      "learning_rate": 7.478514253893181e-05,
      "loss": 6.04520697342722,
      "step": 1938
    },
    {
      "epoch": 3.7852998065764023,
      "grad_norm": 0.7886133790016174,
      "learning_rate": 7.425005834288738e-05,
      "loss": 6.05512157239412,
      "step": 1957
    },
    {
      "epoch": 3.8220502901353965,
      "grad_norm": 0.9593386054039001,
      "learning_rate": 7.371131766824399e-05,
      "loss": 6.040921261436061,
      "step": 1976
    },
    {
      "epoch": 3.858800773694391,
      "grad_norm": 0.7840445637702942,
      "learning_rate": 7.316900174753806e-05,
      "loss": 6.032860203793175,
      "step": 1995
    },
    {
      "epoch": 3.895551257253385,
      "grad_norm": 0.7778313159942627,
      "learning_rate": 7.262319235238967e-05,
      "loss": 6.04224355597245,
      "step": 2014
    },
    {
      "epoch": 3.9323017408123793,
      "grad_norm": 0.6466183662414551,
      "learning_rate": 7.207397178117286e-05,
      "loss": 6.039953934518914,
      "step": 2033
    },
    {
      "epoch": 3.9690522243713735,
      "grad_norm": 0.8412506580352783,
      "learning_rate": 7.152142284660659e-05,
      "loss": 6.043909173262747,
      "step": 2052
    },
    {
      "epoch": 4.005802707930368,
      "grad_norm": 0.7424644231796265,
      "learning_rate": 7.096562886326784e-05,
      "loss": 6.020910965768914,
      "step": 2071
    },
    {
      "epoch": 4.042553191489362,
      "grad_norm": 0.8717330694198608,
      "learning_rate": 7.040667363502946e-05,
      "loss": 6.038821973298726,
      "step": 2090
    },
    {
      "epoch": 4.079303675048356,
      "grad_norm": 0.6821705102920532,
      "learning_rate": 6.984464144242395e-05,
      "loss": 5.9910033376593335,
      "step": 2109
    },
    {
      "epoch": 4.1160541586073505,
      "grad_norm": 0.8593968749046326,
      "learning_rate": 6.92796170299354e-05,
      "loss": 6.038429260253906,
      "step": 2128
    },
    {
      "epoch": 4.152804642166345,
      "grad_norm": 0.6946661472320557,
      "learning_rate": 6.871168559322163e-05,
      "loss": 6.043051468698602,
      "step": 2147
    },
    {
      "epoch": 4.189555125725338,
      "grad_norm": 0.872968316078186,
      "learning_rate": 6.814093276626812e-05,
      "loss": 6.0379281294973275,
      "step": 2166
    },
    {
      "epoch": 4.226305609284332,
      "grad_norm": 0.8149793744087219,
      "learning_rate": 6.756744460847593e-05,
      "loss": 6.025306300113075,
      "step": 2185
    },
    {
      "epoch": 4.2630560928433265,
      "grad_norm": 0.8134008646011353,
      "learning_rate": 6.699130759168552e-05,
      "loss": 6.029500860916941,
      "step": 2204
    },
    {
      "epoch": 4.299806576402321,
      "grad_norm": 0.6807850003242493,
      "learning_rate": 6.641260858713825e-05,
      "loss": 6.02039899324116,
      "step": 2223
    },
    {
      "epoch": 4.336557059961315,
      "grad_norm": 0.7424802780151367,
      "learning_rate": 6.583143485237783e-05,
      "loss": 6.042245965254934,
      "step": 2242
    },
    {
      "epoch": 4.373307543520309,
      "grad_norm": 1.0088832378387451,
      "learning_rate": 6.524787401809335e-05,
      "loss": 5.990176953767476,
      "step": 2261
    },
    {
      "epoch": 4.4100580270793035,
      "grad_norm": 0.8164299130439758,
      "learning_rate": 6.466201407490622e-05,
      "loss": 6.0073804353412825,
      "step": 2280
    },
    {
      "epoch": 4.446808510638298,
      "grad_norm": 0.8493334650993347,
      "learning_rate": 6.40739433601026e-05,
      "loss": 6.001832259328742,
      "step": 2299
    },
    {
      "epoch": 4.483558994197292,
      "grad_norm": 0.8219246864318848,
      "learning_rate": 6.348375054431385e-05,
      "loss": 6.0019788240131575,
      "step": 2318
    },
    {
      "epoch": 4.520309477756286,
      "grad_norm": 0.8987888693809509,
      "learning_rate": 6.289152461814648e-05,
      "loss": 5.987865648771587,
      "step": 2337
    },
    {
      "epoch": 4.5570599613152805,
      "grad_norm": 0.7489388585090637,
      "learning_rate": 6.229735487876398e-05,
      "loss": 6.025086252312911,
      "step": 2356
    },
    {
      "epoch": 4.593810444874275,
      "grad_norm": 0.8458240032196045,
      "learning_rate": 6.170133091642245e-05,
      "loss": 5.987234015213816,
      "step": 2375
    },
    {
      "epoch": 4.630560928433269,
      "grad_norm": 0.7269588112831116,
      "learning_rate": 6.110354260096183e-05,
      "loss": 5.985632645456414,
      "step": 2394
    },
    {
      "epoch": 4.667311411992263,
      "grad_norm": 0.7618328928947449,
      "learning_rate": 6.050408006825525e-05,
      "loss": 5.984134071751645,
      "step": 2413
    },
    {
      "epoch": 4.704061895551257,
      "grad_norm": 0.8415816426277161,
      "learning_rate": 5.9903033706618116e-05,
      "loss": 6.002414904142681,
      "step": 2432
    },
    {
      "epoch": 4.740812379110252,
      "grad_norm": 0.8867862224578857,
      "learning_rate": 5.930049414317913e-05,
      "loss": 5.995708264802632,
      "step": 2451
    },
    {
      "epoch": 4.777562862669246,
      "grad_norm": 1.2235946655273438,
      "learning_rate": 5.869655223021529e-05,
      "loss": 6.0039624665912825,
      "step": 2470
    },
    {
      "epoch": 4.81431334622824,
      "grad_norm": 0.7073670625686646,
      "learning_rate": 5.8091299031453106e-05,
      "loss": 6.0098114013671875,
      "step": 2489
    },
    {
      "epoch": 4.851063829787234,
      "grad_norm": 0.7804876565933228,
      "learning_rate": 5.748482580833766e-05,
      "loss": 5.9925079345703125,
      "step": 2508
    },
    {
      "epoch": 4.887814313346229,
      "grad_norm": 0.8264277577400208,
      "learning_rate": 5.6877224006272086e-05,
      "loss": 5.97403275339227,
      "step": 2527
    },
    {
      "epoch": 4.924564796905223,
      "grad_norm": 0.9556435942649841,
      "learning_rate": 5.626858524082922e-05,
      "loss": 6.007706893117804,
      "step": 2546
    },
    {
      "epoch": 4.961315280464216,
      "grad_norm": 1.08724045753479,
      "learning_rate": 5.5659001283937526e-05,
      "loss": 5.989010057951274,
      "step": 2565
    },
    {
      "epoch": 4.9980657640232105,
      "grad_norm": 0.8453856706619263,
      "learning_rate": 5.5048564050043637e-05,
      "loss": 5.995357714201274,
      "step": 2584
    },
    {
      "epoch": 5.034816247582205,
      "grad_norm": 1.1737676858901978,
      "learning_rate": 5.4437365582253185e-05,
      "loss": 5.977565564607319,
      "step": 2603
    },
    {
      "epoch": 5.071566731141199,
      "grad_norm": 0.934160590171814,
      "learning_rate": 5.382549803845235e-05,
      "loss": 5.981942427785773,
      "step": 2622
    },
    {
      "epoch": 5.108317214700193,
      "grad_norm": 0.8727831840515137,
      "learning_rate": 5.321305367741215e-05,
      "loss": 5.968893352307771,
      "step": 2641
    },
    {
      "epoch": 5.145067698259187,
      "grad_norm": 0.8856471180915833,
      "learning_rate": 5.260012484487739e-05,
      "loss": 5.98333057604338,
      "step": 2660
    },
    {
      "epoch": 5.181818181818182,
      "grad_norm": 0.7901120781898499,
      "learning_rate": 5.198680395964256e-05,
      "loss": 5.964969434236226,
      "step": 2679
    },
    {
      "epoch": 5.218568665377176,
      "grad_norm": 0.7979159355163574,
      "learning_rate": 5.137318349961677e-05,
      "loss": 5.9825082076223275,
      "step": 2698
    },
    {
      "epoch": 5.25531914893617,
      "grad_norm": 0.9568471312522888,
      "learning_rate": 5.07593559878797e-05,
      "loss": 5.916827954744038,
      "step": 2717
    },
    {
      "epoch": 5.292069632495164,
      "grad_norm": 0.6639050245285034,
      "learning_rate": 5.0145413978730726e-05,
      "loss": 5.972771895559211,
      "step": 2736
    },
    {
      "epoch": 5.328820116054159,
      "grad_norm": 1.054398536682129,
      "learning_rate": 4.9531450043733424e-05,
      "loss": 5.95155173853824,
      "step": 2755
    },
    {
      "epoch": 5.365570599613153,
      "grad_norm": 0.8115559220314026,
      "learning_rate": 4.891755675775739e-05,
      "loss": 5.972399259868421,
      "step": 2774
    },
    {
      "epoch": 5.402321083172147,
      "grad_norm": 0.8311302661895752,
      "learning_rate": 4.830382668501961e-05,
      "loss": 5.989575436240749,
      "step": 2793
    },
    {
      "epoch": 5.439071566731141,
      "grad_norm": 0.7544533014297485,
      "learning_rate": 4.7690352365127384e-05,
      "loss": 5.947163230494449,
      "step": 2812
    },
    {
      "epoch": 5.475822050290136,
      "grad_norm": 0.9242804050445557,
      "learning_rate": 4.7077226299125066e-05,
      "loss": 5.953185633609169,
      "step": 2831
    },
    {
      "epoch": 5.51257253384913,
      "grad_norm": 0.8004162311553955,
      "learning_rate": 4.646454093554644e-05,
      "loss": 5.965155350534539,
      "step": 2850
    },
    {
      "epoch": 5.549323017408124,
      "grad_norm": 0.9713261127471924,
      "learning_rate": 4.5852388656475256e-05,
      "loss": 5.955127916837993,
      "step": 2869
    },
    {
      "epoch": 5.586073500967118,
      "grad_norm": 0.795760452747345,
      "learning_rate": 4.524086176361549e-05,
      "loss": 5.981726395456414,
      "step": 2888
    },
    {
      "epoch": 5.6228239845261125,
      "grad_norm": 0.9622194170951843,
      "learning_rate": 4.463005246437407e-05,
      "loss": 5.9348289088199015,
      "step": 2907
    },
    {
      "epoch": 5.659574468085106,
      "grad_norm": 0.9427851438522339,
      "learning_rate": 4.402005285795745e-05,
      "loss": 5.9512381302682975,
      "step": 2926
    },
    {
      "epoch": 5.696324951644101,
      "grad_norm": 0.8677796721458435,
      "learning_rate": 4.341095492148483e-05,
      "loss": 5.980510109349301,
      "step": 2945
    },
    {
      "epoch": 5.733075435203094,
      "grad_norm": 0.8452844619750977,
      "learning_rate": 4.2802850496119536e-05,
      "loss": 5.963108665064762,
      "step": 2964
    },
    {
      "epoch": 5.769825918762089,
      "grad_norm": 0.8970301747322083,
      "learning_rate": 4.219583127322104e-05,
      "loss": 5.97346335963199,
      "step": 2983
    },
    {
      "epoch": 5.806576402321083,
      "grad_norm": 0.8443690538406372,
      "learning_rate": 4.158998878051962e-05,
      "loss": 5.9706971017937915,
      "step": 3002
    },
    {
      "epoch": 5.843326885880077,
      "grad_norm": 0.9244300723075867,
      "learning_rate": 4.098541436831541e-05,
      "loss": 5.951765361585115,
      "step": 3021
    },
    {
      "epoch": 5.880077369439071,
      "grad_norm": 0.784065842628479,
      "learning_rate": 4.038219919570455e-05,
      "loss": 5.960685328433388,
      "step": 3040
    },
    {
      "epoch": 5.916827852998066,
      "grad_norm": 0.9272547960281372,
      "learning_rate": 3.978043421683395e-05,
      "loss": 5.95731634842722,
      "step": 3059
    },
    {
      "epoch": 5.95357833655706,
      "grad_norm": 0.7737032771110535,
      "learning_rate": 3.918021016718704e-05,
      "loss": 5.947649905556126,
      "step": 3078
    },
    {
      "epoch": 5.990328820116054,
      "grad_norm": 0.7086819410324097,
      "learning_rate": 3.858161754990245e-05,
      "loss": 5.95235162032278,
      "step": 3097
    },
    {
      "epoch": 6.027079303675048,
      "grad_norm": 0.7864798307418823,
      "learning_rate": 3.7984746622127765e-05,
      "loss": 5.9433951126901725,
      "step": 3116
    },
    {
      "epoch": 6.0638297872340425,
      "grad_norm": 0.8476478457450867,
      "learning_rate": 3.738968738141033e-05,
      "loss": 5.926896346242804,
      "step": 3135
    },
    {
      "epoch": 6.100580270793037,
      "grad_norm": 0.7427075505256653,
      "learning_rate": 3.679652955212719e-05,
      "loss": 5.956519277472245,
      "step": 3154
    },
    {
      "epoch": 6.137330754352031,
      "grad_norm": 0.9161301851272583,
      "learning_rate": 3.620536257195635e-05,
      "loss": 5.917147184673109,
      "step": 3173
    },
    {
      "epoch": 6.174081237911025,
      "grad_norm": 0.8627088665962219,
      "learning_rate": 3.561627557839099e-05,
      "loss": 5.942029451069079,
      "step": 3192
    },
    {
      "epoch": 6.2108317214700195,
      "grad_norm": 0.7476882338523865,
      "learning_rate": 3.502935739529928e-05,
      "loss": 5.934722097296464,
      "step": 3211
    },
    {
      "epoch": 6.247582205029014,
      "grad_norm": 0.793505847454071,
      "learning_rate": 3.444469651953126e-05,
      "loss": 5.916718733938117,
      "step": 3230
    },
    {
      "epoch": 6.284332688588008,
      "grad_norm": 0.7478469610214233,
      "learning_rate": 3.3862381107575005e-05,
      "loss": 5.954738416169819,
      "step": 3249
    },
    {
      "epoch": 6.321083172147002,
      "grad_norm": 0.8610250353813171,
      "learning_rate": 3.328249896226428e-05,
      "loss": 5.922407852975946,
      "step": 3268
    },
    {
      "epoch": 6.3578336557059965,
      "grad_norm": 0.8182870745658875,
      "learning_rate": 3.270513751953944e-05,
      "loss": 5.919796190763774,
      "step": 3287
    },
    {
      "epoch": 6.394584139264991,
      "grad_norm": 0.7998473644256592,
      "learning_rate": 3.213038383526355e-05,
      "loss": 5.920766730057566,
      "step": 3306
    },
    {
      "epoch": 6.431334622823985,
      "grad_norm": 0.8772637248039246,
      "learning_rate": 3.155832457209603e-05,
      "loss": 5.93222367136102,
      "step": 3325
    },
    {
      "epoch": 6.468085106382979,
      "grad_norm": 0.7384529709815979,
      "learning_rate": 3.0989045986425325e-05,
      "loss": 5.92653415077611,
      "step": 3344
    },
    {
      "epoch": 6.5048355899419725,
      "grad_norm": 0.8872863054275513,
      "learning_rate": 3.0422633915363115e-05,
      "loss": 5.924022072239926,
      "step": 3363
    },
    {
      "epoch": 6.541586073500967,
      "grad_norm": 0.7358129024505615,
      "learning_rate": 2.9859173763801457e-05,
      "loss": 5.946694625051398,
      "step": 3382
    },
    {
      "epoch": 6.578336557059961,
      "grad_norm": 0.9394431114196777,
      "learning_rate": 2.9298750491535382e-05,
      "loss": 5.954012017501028,
      "step": 3401
    },
    {
      "epoch": 6.615087040618955,
      "grad_norm": 0.8594652414321899,
      "learning_rate": 2.8741448600452326e-05,
      "loss": 5.915107727050781,
      "step": 3420
    },
    {
      "epoch": 6.6518375241779495,
      "grad_norm": 0.7383516430854797,
      "learning_rate": 2.818735212179091e-05,
      "loss": 5.930320739746094,
      "step": 3439
    },
    {
      "epoch": 6.688588007736944,
      "grad_norm": 0.7550167441368103,
      "learning_rate": 2.763654460347035e-05,
      "loss": 5.959585892526727,
      "step": 3458
    },
    {
      "epoch": 6.725338491295938,
      "grad_norm": 0.9746566414833069,
      "learning_rate": 2.7089109097493003e-05,
      "loss": 5.915300469649465,
      "step": 3477
    },
    {
      "epoch": 6.762088974854932,
      "grad_norm": 0.8552682995796204,
      "learning_rate": 2.654512814742159
1304
+ "loss": 5.918191608629729,
1305
+ "step": 3496
1306
+ },
1307
+ {
1308
+ "epoch": 6.7988394584139265,
1309
+ "grad_norm": 0.7178594470024109,
1310
+ "learning_rate": 2.6004683775933116e-05,
1311
+ "loss": 5.931622153834293,
1312
+ "step": 3515
1313
+ },
1314
+ {
1315
+ "epoch": 6.835589941972921,
1316
+ "grad_norm": 0.8271778225898743,
1317
+ "learning_rate": 2.5467857472451234e-05,
1318
+ "loss": 5.90688042891653,
1319
+ "step": 3534
1320
+ },
1321
+ {
1322
+ "epoch": 6.872340425531915,
1323
+ "grad_norm": 0.8247523903846741,
1324
+ "learning_rate": 2.4934730180859138e-05,
1325
+ "loss": 5.911947149979441,
1326
+ "step": 3553
1327
+ },
1328
+ {
1329
+ "epoch": 6.909090909090909,
1330
+ "grad_norm": 0.9226091504096985,
1331
+ "learning_rate": 2.4405382287294666e-05,
1332
+ "loss": 5.909151579204359,
1333
+ "step": 3572
1334
+ },
1335
+ {
1336
+ "epoch": 6.945841392649903,
1337
+ "grad_norm": 0.8541250228881836,
1338
+ "learning_rate": 2.387989360802943e-05,
1339
+ "loss": 5.93184380782278,
1340
+ "step": 3591
1341
+ },
1342
+ {
1343
+ "epoch": 6.982591876208898,
1344
+ "grad_norm": 0.7963822484016418,
1345
+ "learning_rate": 2.3358343377434074e-05,
1346
+ "loss": 5.926949752004523,
1347
+ "step": 3610
1348
+ },
+ {
+ "epoch": 7.019342359767892,
+ "grad_norm": 0.9335833191871643,
+ "learning_rate": 2.2840810236030986e-05,
+ "loss": 5.90260114167866,
+ "step": 3629
+ },
+ {
+ "epoch": 7.056092843326886,
+ "grad_norm": 0.8136786222457886,
+ "learning_rate": 2.2327372218636767e-05,
+ "loss": 5.914011101973684,
+ "step": 3648
+ },
+ {
+ "epoch": 7.09284332688588,
+ "grad_norm": 0.9287970066070557,
+ "learning_rate": 2.181810674259601e-05,
+ "loss": 5.9164786087839225,
+ "step": 3667
+ },
+ {
+ "epoch": 7.129593810444875,
+ "grad_norm": 0.9184285998344421,
+ "learning_rate": 2.1313090596108043e-05,
+ "loss": 5.9290771484375,
+ "step": 3686
+ },
+ {
+ "epoch": 7.166344294003869,
+ "grad_norm": 0.7558146715164185,
+ "learning_rate": 2.081239992664874e-05,
+ "loss": 5.8995819091796875,
+ "step": 3705
+ },
+ {
+ "epoch": 7.203094777562863,
+ "grad_norm": 0.903976321220398,
+ "learning_rate": 2.0316110229488718e-05,
+ "loss": 5.905699880499589,
+ "step": 3724
+ },
+ {
+ "epoch": 7.2398452611218564,
+ "grad_norm": 0.6883618235588074,
+ "learning_rate": 1.9824296336310056e-05,
+ "loss": 5.935149744937294,
+ "step": 3743
+ },
+ {
+ "epoch": 7.276595744680851,
+ "grad_norm": 0.82213294506073,
+ "learning_rate": 1.9337032403923018e-05,
+ "loss": 5.902831228155839,
+ "step": 3762
+ },
+ {
+ "epoch": 7.313346228239845,
+ "grad_norm": 0.8082081079483032,
+ "learning_rate": 1.8854391903084457e-05,
+ "loss": 5.928005419279399,
+ "step": 3781
+ },
+ {
+ "epoch": 7.350096711798839,
+ "grad_norm": 0.8317619562149048,
+ "learning_rate": 1.8376447607419833e-05,
+ "loss": 5.936038368626645,
+ "step": 3800
+ },
+ {
+ "epoch": 7.386847195357833,
+ "grad_norm": 0.8557573556900024,
+ "learning_rate": 1.790327158245012e-05,
+ "loss": 5.898858321340461,
+ "step": 3819
+ },
+ {
+ "epoch": 7.423597678916828,
+ "grad_norm": 0.8798925876617432,
+ "learning_rate": 1.7434935174725686e-05,
+ "loss": 5.881325972707648,
+ "step": 3838
+ },
+ {
+ "epoch": 7.460348162475822,
+ "grad_norm": 0.7644050121307373,
+ "learning_rate": 1.697150900106844e-05,
+ "loss": 5.888987491005345,
+ "step": 3857
+ },
+ {
+ "epoch": 7.497098646034816,
+ "grad_norm": 0.8622159361839294,
+ "learning_rate": 1.6513062937924155e-05,
+ "loss": 5.928788837633635,
+ "step": 3876
+ },
+ {
+ "epoch": 7.53384912959381,
+ "grad_norm": 0.9436091780662537,
+ "learning_rate": 1.6059666110826277e-05,
+ "loss": 5.897299114026521,
+ "step": 3895
+ },
+ {
+ "epoch": 7.570599613152805,
+ "grad_norm": 0.8347904682159424,
+ "learning_rate": 1.5611386883972995e-05,
+ "loss": 5.9460095857319075,
+ "step": 3914
+ },
+ {
+ "epoch": 7.607350096711799,
+ "grad_norm": 0.7909878492355347,
+ "learning_rate": 1.5168292849919185e-05,
+ "loss": 5.919348465768914,
+ "step": 3933
+ },
+ {
+ "epoch": 7.644100580270793,
+ "grad_norm": 0.9378625750541687,
+ "learning_rate": 1.4730450819384622e-05,
+ "loss": 5.925768400493421,
+ "step": 3952
+ },
+ {
+ "epoch": 7.680851063829787,
+ "grad_norm": 0.7907970547676086,
+ "learning_rate": 1.4297926811180174e-05,
+ "loss": 5.891129744680304,
+ "step": 3971
+ },
+ {
+ "epoch": 7.717601547388782,
+ "grad_norm": 0.7062816619873047,
+ "learning_rate": 1.3870786042253225e-05,
+ "loss": 5.924658925909745,
+ "step": 3990
+ },
+ {
+ "epoch": 7.754352030947776,
+ "grad_norm": 0.7920564413070679,
+ "learning_rate": 1.34490929178542e-05,
+ "loss": 5.9171387521844165,
+ "step": 4009
+ },
+ {
+ "epoch": 7.79110251450677,
+ "grad_norm": 0.8155921697616577,
+ "learning_rate": 1.3032911021825366e-05,
+ "loss": 5.90830471641139,
+ "step": 4028
+ },
+ {
+ "epoch": 7.827852998065764,
+ "grad_norm": 0.7282528877258301,
+ "learning_rate": 1.2622303107013512e-05,
+ "loss": 5.909604925858347,
+ "step": 4047
+ },
+ {
+ "epoch": 7.8646034816247585,
+ "grad_norm": 0.9870123267173767,
+ "learning_rate": 1.2217331085807982e-05,
+ "loss": 5.930417111045436,
+ "step": 4066
+ },
+ {
+ "epoch": 7.901353965183753,
+ "grad_norm": 0.7872030138969421,
+ "learning_rate": 1.1818056020805302e-05,
+ "loss": 5.9119214509662825,
+ "step": 4085
+ },
+ {
+ "epoch": 7.938104448742747,
+ "grad_norm": 0.8964416980743408,
+ "learning_rate": 1.1424538115602073e-05,
+ "loss": 5.888004503752056,
+ "step": 4104
+ },
+ {
+ "epoch": 7.97485493230174,
+ "grad_norm": 0.7404572367668152,
+ "learning_rate": 1.1036836705717363e-05,
+ "loss": 5.905365391781456,
+ "step": 4123
+ },
+ {
+ "epoch": 8.011605415860735,
+ "grad_norm": 0.8973370790481567,
+ "learning_rate": 1.0655010249645891e-05,
+ "loss": 5.92196334035773,
+ "step": 4142
+ },
+ {
+ "epoch": 8.048355899419729,
+ "grad_norm": 0.8033064603805542,
+ "learning_rate": 1.0279116320043603e-05,
+ "loss": 5.9180245650442025,
+ "step": 4161
+ },
+ {
+ "epoch": 8.085106382978724,
+ "grad_norm": 0.8662456274032593,
+ "learning_rate": 9.909211595046663e-06,
+ "loss": 5.927662899619655,
+ "step": 4180
+ },
+ {
+ "epoch": 8.121856866537717,
+ "grad_norm": 0.881951630115509,
+ "learning_rate": 9.545351849725448e-06,
+ "loss": 5.897458126670436,
+ "step": 4199
+ },
+ {
+ "epoch": 8.158607350096712,
+ "grad_norm": 0.9466047286987305,
+ "learning_rate": 9.187591947674612e-06,
+ "loss": 5.879381681743421,
+ "step": 4218
+ },
+ {
+ "epoch": 8.195357833655706,
+ "grad_norm": 0.7660216093063354,
+ "learning_rate": 8.835985832740712e-06,
+ "loss": 5.892818651701274,
+ "step": 4237
+ },
+ {
+ "epoch": 8.232108317214701,
+ "grad_norm": 0.88617342710495,
+ "learning_rate": 8.490586520888321e-06,
+ "loss": 5.913487083033512,
+ "step": 4256
+ },
+ {
+ "epoch": 8.268858800773694,
+ "grad_norm": 0.8038977384567261,
+ "learning_rate": 8.15144609220625e-06,
+ "loss": 5.897986562628495,
+ "step": 4275
+ },
+ {
+ "epoch": 8.30560928433269,
+ "grad_norm": 0.8149951100349426,
+ "learning_rate": 7.818615683054737e-06,
+ "loss": 5.905342503597862,
+ "step": 4294
+ },
+ {
+ "epoch": 8.342359767891683,
+ "grad_norm": 0.8831230401992798,
+ "learning_rate": 7.492145478355023e-06,
+ "loss": 5.904563903808594,
+ "step": 4313
+ },
+ {
+ "epoch": 8.379110251450676,
+ "grad_norm": 0.7993362545967102,
+ "learning_rate": 7.172084704022364e-06,
+ "loss": 5.920640242727179,
+ "step": 4332
+ },
+ {
+ "epoch": 8.415860735009671,
+ "grad_norm": 0.8131271004676819,
+ "learning_rate": 6.8584816195436215e-06,
+ "loss": 5.89423972681949,
+ "step": 4351
+ },
+ {
+ "epoch": 8.452611218568665,
+ "grad_norm": 0.8914048075675964,
+ "learning_rate": 6.551383510700565e-06,
+ "loss": 5.895122327302632,
+ "step": 4370
+ },
+ {
+ "epoch": 8.48936170212766,
+ "grad_norm": 0.8953275084495544,
+ "learning_rate": 6.250836682440047e-06,
+ "loss": 5.9110107421875,
+ "step": 4389
+ },
+ {
+ "epoch": 8.526112185686653,
+ "grad_norm": 0.7585981488227844,
+ "learning_rate": 5.956886451892019e-06,
+ "loss": 5.89650485390111,
+ "step": 4408
+ },
+ {
+ "epoch": 8.562862669245648,
+ "grad_norm": 0.7543755173683167,
+ "learning_rate": 5.669577141536553e-06,
+ "loss": 5.915409288908306,
+ "step": 4427
+ },
+ {
+ "epoch": 8.599613152804642,
+ "grad_norm": 0.7309451103210449,
+ "learning_rate": 5.3889520725207366e-06,
+ "loss": 5.88703717683491,
+ "step": 4446
+ },
+ {
+ "epoch": 8.636363636363637,
+ "grad_norm": 0.9153041839599609,
+ "learning_rate": 5.115053558126653e-06,
+ "loss": 5.926216125488281,
+ "step": 4465
+ },
+ {
+ "epoch": 8.67311411992263,
+ "grad_norm": 0.9788889288902283,
+ "learning_rate": 4.847922897391266e-06,
+ "loss": 5.901534632632607,
+ "step": 4484
+ },
+ {
+ "epoch": 8.709864603481625,
+ "grad_norm": 0.7951791882514954,
+ "learning_rate": 4.587600368879308e-06,
+ "loss": 5.907471907766242,
+ "step": 4503
+ },
+ {
+ "epoch": 8.746615087040619,
+ "grad_norm": 0.8469536900520325,
+ "learning_rate": 4.334125224609903e-06,
+ "loss": 5.8754529451069075,
+ "step": 4522
+ },
+ {
+ "epoch": 8.783365570599614,
+ "grad_norm": 0.8822078108787537,
+ "learning_rate": 4.087535684138127e-06,
+ "loss": 5.920900445235403,
+ "step": 4541
+ },
+ {
+ "epoch": 8.820116054158607,
+ "grad_norm": 0.8397153615951538,
+ "learning_rate": 3.84786892879217e-06,
+ "loss": 5.893511320415296,
+ "step": 4560
+ },
+ {
+ "epoch": 8.856866537717602,
+ "grad_norm": 0.8214257955551147,
+ "learning_rate": 3.615161096066999e-06,
+ "loss": 5.91493706954153,
+ "step": 4579
+ },
+ {
+ "epoch": 8.893617021276595,
+ "grad_norm": 0.7739570140838623,
+ "learning_rate": 3.389447274175528e-06,
+ "loss": 5.91633405183491,
+ "step": 4598
+ },
+ {
+ "epoch": 8.93036750483559,
+ "grad_norm": 0.7047730684280396,
+ "learning_rate": 3.1707614967579122e-06,
+ "loss": 5.886983771073191,
+ "step": 4617
+ },
+ {
+ "epoch": 8.967117988394584,
+ "grad_norm": 0.9191731214523315,
+ "learning_rate": 2.959136737749868e-06,
+ "loss": 5.887207834344161,
+ "step": 4636
+ },
+ {
+ "epoch": 9.003868471953579,
+ "grad_norm": 0.7081539630889893,
+ "learning_rate": 2.7546049064108013e-06,
+ "loss": 5.916957252903988,
+ "step": 4655
+ },
+ {
+ "epoch": 9.040618955512572,
+ "grad_norm": 0.9107328653335571,
+ "learning_rate": 2.557196842512455e-06,
+ "loss": 5.887527867367393,
+ "step": 4674
+ },
+ {
+ "epoch": 9.077369439071568,
+ "grad_norm": 0.7918136715888977,
+ "learning_rate": 2.36694231168883e-06,
+ "loss": 5.882832175806949,
+ "step": 4693
+ },
+ {
+ "epoch": 9.114119922630561,
+ "grad_norm": 0.8014242649078369,
+ "learning_rate": 2.1838700009480293e-06,
+ "loss": 5.885363528603001,
+ "step": 4712
+ },
+ {
+ "epoch": 9.150870406189554,
+ "grad_norm": 0.8799481987953186,
+ "learning_rate": 2.008007514346788e-06,
+ "loss": 5.917357193796258,
+ "step": 4731
+ },
+ {
+ "epoch": 9.18762088974855,
+ "grad_norm": 0.8959386944770813,
+ "learning_rate": 1.8393813688282524e-06,
+ "loss": 5.908062181974712,
+ "step": 4750
+ },
+ {
+ "epoch": 9.224371373307543,
+ "grad_norm": 0.8766935467720032,
+ "learning_rate": 1.6780169902237241e-06,
+ "loss": 5.88951954088713,
+ "step": 4769
+ },
+ {
+ "epoch": 9.261121856866538,
+ "grad_norm": 0.7538824677467346,
+ "learning_rate": 1.5239387094188818e-06,
+ "loss": 5.902222482781661,
+ "step": 4788
+ },
+ {
+ "epoch": 9.297872340425531,
+ "grad_norm": 0.805749773979187,
+ "learning_rate": 1.3771697586850929e-06,
+ "loss": 5.913856104800575,
+ "step": 4807
+ },
+ {
+ "epoch": 9.334622823984526,
+ "grad_norm": 0.8077756762504578,
+ "learning_rate": 1.237732268176428e-06,
+ "loss": 5.920063621119449,
+ "step": 4826
+ },
+ {
+ "epoch": 9.37137330754352,
+ "grad_norm": 0.833474338054657,
+ "learning_rate": 1.1056472625928127e-06,
+ "loss": 5.908903021561472,
+ "step": 4845
+ },
+ {
+ "epoch": 9.408123791102515,
+ "grad_norm": 0.7950544357299805,
+ "learning_rate": 9.80934658009891e-07,
+ "loss": 5.9271697998046875,
+ "step": 4864
+ },
+ {
+ "epoch": 9.444874274661508,
+ "grad_norm": 0.8724213242530823,
+ "learning_rate": 8.63613258876017e-07,
+ "loss": 5.9004058837890625,
+ "step": 4883
+ },
+ {
+ "epoch": 9.481624758220503,
+ "grad_norm": 0.7703186273574829,
+ "learning_rate": 7.537007551768782e-07,
+ "loss": 5.912384836297286,
+ "step": 4902
+ },
+ {
+ "epoch": 9.518375241779497,
+ "grad_norm": 0.8170893788337708,
+ "learning_rate": 6.512137197681733e-07,
+ "loss": 5.891664203844573,
+ "step": 4921
+ },
+ {
+ "epoch": 9.555125725338492,
+ "grad_norm": 0.8812070488929749,
+ "learning_rate": 5.561676058767007e-07,
+ "loss": 5.869388781095806,
+ "step": 4940
+ },
+ {
+ "epoch": 9.591876208897485,
+ "grad_norm": 0.744022786617279,
+ "learning_rate": 4.6857674477032154e-07,
+ "loss": 5.892197859914679,
+ "step": 4959
+ },
+ {
+ "epoch": 9.62862669245648,
+ "grad_norm": 0.8482499122619629,
+ "learning_rate": 3.884543435970056e-07,
+ "loss": 5.940224095394737,
+ "step": 4978
+ },
+ {
+ "epoch": 9.665377176015474,
+ "grad_norm": 0.8590576648712158,
+ "learning_rate": 3.158124833934684e-07,
+ "loss": 5.907253466154399,
+ "step": 4997
+ },
+ {
+ "epoch": 9.702127659574469,
+ "grad_norm": 0.7035248875617981,
+ "learning_rate": 2.506621172635615e-07,
+ "loss": 5.891074732730263,
+ "step": 5016
+ },
+ {
+ "epoch": 9.738878143133462,
+ "grad_norm": 0.8274122476577759,
+ "learning_rate": 1.930130687267051e-07,
+ "loss": 5.884301436574836,
+ "step": 5035
+ },
+ {
+ "epoch": 9.775628626692457,
+ "grad_norm": 0.8533028960227966,
+ "learning_rate": 1.4287403023673373e-07,
+ "loss": 5.917849490517064,
+ "step": 5054
+ },
+ {
+ "epoch": 9.81237911025145,
+ "grad_norm": 0.9782279133796692,
+ "learning_rate": 1.0025256187117249e-07,
+ "loss": 5.881855211759868,
+ "step": 5073
+ },
+ {
+ "epoch": 9.849129593810446,
+ "grad_norm": 0.7845222353935242,
+ "learning_rate": 6.515509019133781e-08,
+ "loss": 5.9063768888774675,
+ "step": 5092
+ },
+ {
+ "epoch": 9.885880077369439,
+ "grad_norm": 0.7692595720291138,
+ "learning_rate": 3.758690727332925e-08,
+ "loss": 5.8881988525390625,
+ "step": 5111
+ },
+ {
+ "epoch": 9.922630560928432,
+ "grad_norm": 0.8751763701438904,
+ "learning_rate": 1.7552169910067807e-08,
+ "loss": 5.899998313502262,
+ "step": 5130
+ },
+ {
+ "epoch": 9.959381044487428,
+ "grad_norm": 0.815253496170044,
+ "learning_rate": 5.053898984519467e-09,
+ "loss": 5.934799595883018,
+ "step": 5149
+ },
+ {
+ "epoch": 9.996131528046421,
+ "grad_norm": 0.7818523645401001,
+ "learning_rate": 9.397901423180422e-11,
+ "loss": 5.88957415129009,
+ "step": 5168
+ }
+ ],
+ "logging_steps": 19,
+ "max_steps": 5170,
+ "num_input_tokens_seen": 0,
+ "num_train_epochs": 10,
+ "save_steps": 500,
+ "stateful_callbacks": {
+ "TrainerControl": {
+ "args": {
+ "should_epoch_stop": false,
+ "should_evaluate": false,
+ "should_log": false,
+ "should_save": true,
+ "should_training_stop": true
+ },
+ "attributes": {}
+ }
+ },
+ "total_flos": 1.099568358948864e+16,
+ "train_batch_size": 64,
+ "trial_name": null,
+ "trial_params": null
+ }
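The `log_history` entries above follow the Trainer's bookkeeping: with `max_steps` 5170 over `num_train_epochs` 10, each epoch spans 517 optimizer steps, the fractional `epoch` field is `step / 517`, and an entry is written every `logging_steps` = 19 steps. A minimal sketch checking this against two entries copied verbatim from the tail of the log:

```python
import json

# Last two entries copied from checkpoint-5170/trainer_state.json above.
log_tail = json.loads("""[
  {"epoch": 9.959381044487428, "loss": 5.934799595883018, "step": 5149},
  {"epoch": 9.996131528046421, "loss": 5.88957415129009, "step": 5168}
]""")

# max_steps = 5170 over num_train_epochs = 10 -> 517 optimizer steps per epoch,
# so the fractional "epoch" field is step / 517.
steps_per_epoch = 5170 / 10
for entry in log_tail:
    assert abs(entry["step"] / steps_per_epoch - entry["epoch"]) < 1e-9

# logging_steps = 19: consecutive log entries are 19 optimizer steps apart.
assert log_tail[1]["step"] - log_tail[0]["step"] == 19
```

The same loop applied to the full file reconstructs the loss curve (step vs. loss) for plotting.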
checkpoint-5170/training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:655a39837857415e4b7e97f5e1babb75c8c7355fae7a986299c7a8effdb82e4b
+ size 5265
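The three-line blob above is a Git LFS pointer, not the binary itself: the repo stores only the object's SHA-256 and byte size, and the content lives in LFS storage. A minimal parser for this pointer format, assuming well-formed `key value` lines as in the LFS v1 spec:

```python
# Parse a Git LFS v1 pointer file into its oid and size fields.
def parse_lfs_pointer(text: str) -> tuple[str, int]:
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    return fields["oid"], int(fields["size"])

# The pointer shown above for checkpoint-5170/training_args.bin.
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:655a39837857415e4b7e97f5e1babb75c8c7355fae7a986299c7a8effdb82e4b
size 5265
"""
oid, size = parse_lfs_pointer(pointer)
assert oid.startswith("sha256:") and size == 5265
```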
config.json ADDED
@@ -0,0 +1,28 @@
+ {
+ "add_cross_attention": false,
+ "architectures": [
+ "BertForMaskedLM"
+ ],
+ "attention_probs_dropout_prob": 0.1,
+ "bos_token_id": 2,
+ "classifier_dropout": null,
+ "dtype": "float32",
+ "eos_token_id": 3,
+ "hidden_act": "gelu",
+ "hidden_dropout_prob": 0.1,
+ "hidden_size": 384,
+ "initializer_range": 0.02,
+ "intermediate_size": 1536,
+ "is_decoder": false,
+ "layer_norm_eps": 1e-12,
+ "max_position_embeddings": 512,
+ "model_type": "bert",
+ "num_attention_heads": 12,
+ "num_hidden_layers": 6,
+ "pad_token_id": 0,
+ "tie_word_embeddings": true,
+ "transformers_version": "5.3.0",
+ "type_vocab_size": 2,
+ "use_cache": false,
+ "vocab_size": 32000
+ }
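The config describes the "mini" variant advertised in the README: 6 layers, hidden size 384, intermediate size 1536, 32k vocab. A back-of-the-envelope parameter count from these numbers lines up with the ~93 MB float32 `model.safetensors` below; this is a sketch, not the library's own accounting, and the safetensors file also carries a small header, so the byte figure is approximate:

```python
# Rough parameter count for the BertForMaskedLM config above (tied embeddings).
V, H, L, I, P, T = 32000, 384, 6, 1536, 512, 2  # vocab, hidden, layers, intermediate, positions, token types

embeddings = V * H + P * H + T * H + 2 * H  # word + position + type embeddings, embedding LayerNorm
per_layer = (
    4 * (H * H + H)   # Q, K, V, and attention-output projections (weight + bias)
    + (H * I + I)     # feed-forward up-projection
    + (I * H + H)     # feed-forward down-projection
    + 2 * (2 * H)     # two LayerNorms
)
mlm_head = (H * H + H) + 2 * H + V  # prediction-head dense + LayerNorm + decoder bias (weights tied)

total = embeddings + L * per_layer + mlm_head
print(total)      # ~23.3M parameters
print(4 * total)  # float32 bytes, close to the 93,266,320-byte model.safetensors
```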
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d9eb7e7c59ea3bfcbcd6be2a7dfb1fd25f2dae1ba766eca3e302f8b44b8a07cd
+ size 93266320
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,13 @@
+ {
+ "backend": "tokenizers",
+ "bos_token": "[CLS]",
+ "cls_token": "[CLS]",
+ "eos_token": "[SEP]",
+ "is_local": true,
+ "mask_token": "[MASK]",
+ "model_max_length": 1000000000000000019884624838656,
+ "pad_token": "[PAD]",
+ "sep_token": "[SEP]",
+ "tokenizer_class": "TokenizersBackend",
+ "unk_token": "[UNK]"
+ }
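Note that `model_max_length` here is the library's "unset" sentinel (`int(1e30)`), so the tokenizer itself imposes no length limit; the practical cap comes from `max_position_embeddings` = 512 in `config.json`, and callers should truncate accordingly. A sketch of resolving the effective limit (the threshold value is an assumption for detecting the sentinel, not something from these files):

```python
# tokenizer_config.json leaves "model_max_length" at the unset sentinel (int(1e30)),
# so fall back to the positional-embedding limit from config.json.
SENTINEL_THRESHOLD = 10 ** 20                       # hypothetical cutoff for "effectively unbounded"
MODEL_MAX_LENGTH = 1000000000000000019884624838656  # from tokenizer_config.json
MAX_POSITION_EMBEDDINGS = 512                       # from config.json

def effective_max_length(model_max_length: int, position_limit: int) -> int:
    """Use the positional-embedding limit when the tokenizer sentinel is set."""
    return position_limit if model_max_length >= SENTINEL_THRESHOLD else model_max_length

print(effective_max_length(MODEL_MAX_LENGTH, MAX_POSITION_EMBEDDINGS))  # 512
```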
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:655a39837857415e4b7e97f5e1babb75c8c7355fae7a986299c7a8effdb82e4b
+ size 5265