EvaristeL committed
Commit ff2795c · verified · Parent: 18cdc12

Upload folder using huggingface_hub
config.json ADDED
@@ -0,0 +1,37 @@
{
  "_name_or_path": "/mnt/data/yule/.cache/roberta-base",
  "architectures": [
    "RobertaForSequenceClassification"
  ],
  "attention_probs_dropout_prob": 0.1,
  "bos_token_id": 0,
  "classifier_dropout": null,
  "eos_token_id": 2,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "id2label": {
    "0": "LABEL_0",
    "1": "LABEL_1",
    "2": "LABEL_2"
  },
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "label2id": {
    "LABEL_0": 0,
    "LABEL_1": 1,
    "LABEL_2": 2
  },
  "layer_norm_eps": 1e-05,
  "max_position_embeddings": 514,
  "model_type": "roberta",
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "pad_token_id": 1,
  "position_embedding_type": "absolute",
  "torch_dtype": "float32",
  "transformers_version": "4.33.1",
  "type_vocab_size": 1,
  "use_cache": true,
  "vocab_size": 50265
}
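A few fields in this config determine the classification setup: `id2label` with three entries makes this a 3-way classifier (consistent with the MNLI checkpoint path in trainer_state.json), and `hidden_size` / `num_attention_heads` fix the per-head dimension. A minimal stdlib-only sketch of reading those fields (in practice loading goes through `transformers`' `AutoConfig.from_pretrained`; the inlined JSON here is an abbreviated copy of the file above):

```python
import json

# Abbreviated copy of the uploaded config.json, kept to the fields
# this sketch inspects (assumption: stdlib json only, no transformers).
config_text = """
{
  "architectures": ["RobertaForSequenceClassification"],
  "hidden_size": 768,
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "id2label": {"0": "LABEL_0", "1": "LABEL_1", "2": "LABEL_2"},
  "vocab_size": 50265
}
"""
config = json.loads(config_text)

num_labels = len(config["id2label"])                               # 3-way classification head
head_dim = config["hidden_size"] // config["num_attention_heads"]  # 768 / 12 = 64 per head
print(num_labels, head_dim)  # 3 64
```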
merges.txt ADDED
The diff for this file is too large to render. See raw diff
 
pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:f8e37e128bef6b3cca696c691d2f3481399caa55e6d79cc7e1558cad5bdf3245
size 498657905
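The three lines above are not the model weights themselves but a Git LFS pointer: version line, SHA-256 object id, and byte size of the real blob (about 476 MiB, which matches a float32 roberta-base classifier). A small sketch of parsing such a pointer with the standard library (a real checkout would let `git lfs` fetch and verify the blob):

```python
# Git LFS pointer text, copied from the pytorch_model.bin stub above.
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:f8e37e128bef6b3cca696c691d2f3481399caa55e6d79cc7e1558cad5bdf3245
size 498657905
"""

def parse_lfs_pointer(text):
    # Each pointer line is "key value"; oid packs "algorithm:hex-digest".
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    algo, digest = fields["oid"].split(":", 1)
    return {"version": fields["version"], "algo": algo,
            "digest": digest, "size": int(fields["size"])}

info = parse_lfs_pointer(pointer)
print(info["algo"], info["size"])  # sha256 498657905
```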
special_tokens_map.json ADDED
@@ -0,0 +1,15 @@
{
  "bos_token": "<s>",
  "cls_token": "<s>",
  "eos_token": "</s>",
  "mask_token": {
    "content": "<mask>",
    "lstrip": true,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "pad_token": "<pad>",
  "sep_token": "</s>",
  "unk_token": "<unk>"
}
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,15 @@
{
  "add_prefix_space": false,
  "bos_token": "<s>",
  "clean_up_tokenization_spaces": true,
  "cls_token": "<s>",
  "eos_token": "</s>",
  "errors": "replace",
  "mask_token": "<mask>",
  "model_max_length": 1000000000000000019884624838656,
  "pad_token": "<pad>",
  "sep_token": "</s>",
  "tokenizer_class": "RobertaTokenizer",
  "trim_offsets": true,
  "unk_token": "<unk>"
}
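The odd-looking `model_max_length` is not a real limit: when a tokenizer has no configured maximum, transformers stores the sentinel `VERY_LARGE_INTEGER = int(1e30)`, and the strange trailing digits are just float rounding, since 1e30 is not exactly representable as a double. A quick check, plus the limit one would actually enforce for this model (an assumption for illustration: RoBERTa reserves 2 of its 514 positions for special offsets, leaving 512 usable tokens):

```python
# transformers' VERY_LARGE_INTEGER sentinel: int(1e30) picks up rounding
# error because 1e30 is a binary64 float, not an exact integer.
sentinel = int(1e30)
print(sentinel)  # 1000000000000000019884624838656, as serialized above

# The practical cap comes from the model config instead:
# max_position_embeddings (514) minus the 2 reserved positions.
effective_max = 514 - 2
print(effective_max)  # 512
```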
trainer_state.json ADDED
@@ -0,0 +1,2764 @@
1
+ {
2
+ "best_metric": 0.867408520282142,
3
+ "best_model_checkpoint": "./save_models/mnli/roberta-base_lr1e-05_run2/checkpoint-220900",
4
+ "epoch": 10.0,
5
+ "eval_steps": 500,
6
+ "global_step": 220900,
7
+ "is_hyper_param_search": false,
8
+ "is_local_process_zero": true,
9
+ "is_world_process_zero": true,
10
+ "log_history": [
11
+ {
12
+ "epoch": 0.02,
13
+ "learning_rate": 3.772446054021428e-07,
14
+ "loss": 1.1005,
15
+ "step": 500
16
+ },
17
+ {
18
+ "epoch": 0.05,
19
+ "learning_rate": 7.544892108042856e-07,
20
+ "loss": 1.1004,
21
+ "step": 1000
22
+ },
23
+ {
24
+ "epoch": 0.07,
25
+ "learning_rate": 1.1317338162064282e-06,
26
+ "loss": 1.0915,
27
+ "step": 1500
28
+ },
29
+ {
30
+ "epoch": 0.09,
31
+ "learning_rate": 1.5089784216085712e-06,
32
+ "loss": 0.9695,
33
+ "step": 2000
34
+ },
35
+ {
36
+ "epoch": 0.11,
37
+ "learning_rate": 1.886223027010714e-06,
38
+ "loss": 0.8466,
39
+ "step": 2500
40
+ },
41
+ {
42
+ "epoch": 0.14,
43
+ "learning_rate": 2.2634676324128565e-06,
44
+ "loss": 0.6904,
45
+ "step": 3000
46
+ },
47
+ {
48
+ "epoch": 0.16,
49
+ "learning_rate": 2.6407122378149996e-06,
50
+ "loss": 0.6293,
51
+ "step": 3500
52
+ },
53
+ {
54
+ "epoch": 0.18,
55
+ "learning_rate": 3.0179568432171424e-06,
56
+ "loss": 0.5855,
57
+ "step": 4000
58
+ },
59
+ {
60
+ "epoch": 0.2,
61
+ "learning_rate": 3.395201448619285e-06,
62
+ "loss": 0.5505,
63
+ "step": 4500
64
+ },
65
+ {
66
+ "epoch": 0.23,
67
+ "learning_rate": 3.772446054021428e-06,
68
+ "loss": 0.5519,
69
+ "step": 5000
70
+ },
71
+ {
72
+ "epoch": 0.25,
73
+ "learning_rate": 4.149690659423571e-06,
74
+ "loss": 0.5192,
75
+ "step": 5500
76
+ },
77
+ {
78
+ "epoch": 0.27,
79
+ "learning_rate": 4.526935264825713e-06,
80
+ "loss": 0.5156,
81
+ "step": 6000
82
+ },
83
+ {
84
+ "epoch": 0.29,
85
+ "learning_rate": 4.904179870227856e-06,
86
+ "loss": 0.5082,
87
+ "step": 6500
88
+ },
89
+ {
90
+ "epoch": 0.32,
91
+ "learning_rate": 5.281424475629999e-06,
92
+ "loss": 0.5001,
93
+ "step": 7000
94
+ },
95
+ {
96
+ "epoch": 0.34,
97
+ "learning_rate": 5.658669081032142e-06,
98
+ "loss": 0.489,
99
+ "step": 7500
100
+ },
101
+ {
102
+ "epoch": 0.36,
103
+ "learning_rate": 6.035913686434285e-06,
104
+ "loss": 0.4944,
105
+ "step": 8000
106
+ },
107
+ {
108
+ "epoch": 0.38,
109
+ "learning_rate": 6.4131582918364275e-06,
110
+ "loss": 0.4852,
111
+ "step": 8500
112
+ },
113
+ {
114
+ "epoch": 0.41,
115
+ "learning_rate": 6.79040289723857e-06,
116
+ "loss": 0.4802,
117
+ "step": 9000
118
+ },
119
+ {
120
+ "epoch": 0.43,
121
+ "learning_rate": 7.167647502640713e-06,
122
+ "loss": 0.4721,
123
+ "step": 9500
124
+ },
125
+ {
126
+ "epoch": 0.45,
127
+ "learning_rate": 7.544892108042856e-06,
128
+ "loss": 0.4506,
129
+ "step": 10000
130
+ },
131
+ {
132
+ "epoch": 0.48,
133
+ "learning_rate": 7.922136713445e-06,
134
+ "loss": 0.4756,
135
+ "step": 10500
136
+ },
137
+ {
138
+ "epoch": 0.5,
139
+ "learning_rate": 8.299381318847142e-06,
140
+ "loss": 0.4595,
141
+ "step": 11000
142
+ },
143
+ {
144
+ "epoch": 0.52,
145
+ "learning_rate": 8.676625924249283e-06,
146
+ "loss": 0.474,
147
+ "step": 11500
148
+ },
149
+ {
150
+ "epoch": 0.54,
151
+ "learning_rate": 9.053870529651426e-06,
152
+ "loss": 0.4419,
153
+ "step": 12000
154
+ },
155
+ {
156
+ "epoch": 0.57,
157
+ "learning_rate": 9.431115135053569e-06,
158
+ "loss": 0.4571,
159
+ "step": 12500
160
+ },
161
+ {
162
+ "epoch": 0.59,
163
+ "learning_rate": 9.808359740455711e-06,
164
+ "loss": 0.4544,
165
+ "step": 13000
166
+ },
167
+ {
168
+ "epoch": 0.61,
169
+ "learning_rate": 9.98815291409418e-06,
170
+ "loss": 0.4595,
171
+ "step": 13500
172
+ },
173
+ {
174
+ "epoch": 0.63,
175
+ "learning_rate": 9.964073471196171e-06,
176
+ "loss": 0.441,
177
+ "step": 14000
178
+ },
179
+ {
180
+ "epoch": 0.66,
181
+ "learning_rate": 9.939994028298163e-06,
182
+ "loss": 0.4483,
183
+ "step": 14500
184
+ },
185
+ {
186
+ "epoch": 0.68,
187
+ "learning_rate": 9.915914585400153e-06,
188
+ "loss": 0.4395,
189
+ "step": 15000
190
+ },
191
+ {
192
+ "epoch": 0.7,
193
+ "learning_rate": 9.891835142502145e-06,
194
+ "loss": 0.4484,
195
+ "step": 15500
196
+ },
197
+ {
198
+ "epoch": 0.72,
199
+ "learning_rate": 9.867755699604135e-06,
200
+ "loss": 0.4378,
201
+ "step": 16000
202
+ },
203
+ {
204
+ "epoch": 0.75,
205
+ "learning_rate": 9.843676256706126e-06,
206
+ "loss": 0.4422,
207
+ "step": 16500
208
+ },
209
+ {
210
+ "epoch": 0.77,
211
+ "learning_rate": 9.819596813808116e-06,
212
+ "loss": 0.4293,
213
+ "step": 17000
214
+ },
215
+ {
216
+ "epoch": 0.79,
217
+ "learning_rate": 9.795517370910108e-06,
218
+ "loss": 0.4349,
219
+ "step": 17500
220
+ },
221
+ {
222
+ "epoch": 0.81,
223
+ "learning_rate": 9.771437928012098e-06,
224
+ "loss": 0.4292,
225
+ "step": 18000
226
+ },
227
+ {
228
+ "epoch": 0.84,
229
+ "learning_rate": 9.74735848511409e-06,
230
+ "loss": 0.4403,
231
+ "step": 18500
232
+ },
233
+ {
234
+ "epoch": 0.86,
235
+ "learning_rate": 9.723279042216081e-06,
236
+ "loss": 0.4299,
237
+ "step": 19000
238
+ },
239
+ {
240
+ "epoch": 0.88,
241
+ "learning_rate": 9.699199599318071e-06,
242
+ "loss": 0.4308,
243
+ "step": 19500
244
+ },
245
+ {
246
+ "epoch": 0.91,
247
+ "learning_rate": 9.675120156420061e-06,
248
+ "loss": 0.4141,
249
+ "step": 20000
250
+ },
251
+ {
252
+ "epoch": 0.93,
253
+ "learning_rate": 9.651040713522053e-06,
254
+ "loss": 0.4163,
255
+ "step": 20500
256
+ },
257
+ {
258
+ "epoch": 0.95,
259
+ "learning_rate": 9.626961270624043e-06,
260
+ "loss": 0.41,
261
+ "step": 21000
262
+ },
263
+ {
264
+ "epoch": 0.97,
265
+ "learning_rate": 9.602881827726035e-06,
266
+ "loss": 0.4285,
267
+ "step": 21500
268
+ },
269
+ {
270
+ "epoch": 1.0,
271
+ "learning_rate": 9.578802384828026e-06,
272
+ "loss": 0.4202,
273
+ "step": 22000
274
+ },
275
+ {
276
+ "epoch": 1.0,
277
+ "eval_accuracy": 0.8505512973950243,
278
+ "eval_loss": 0.4125332534313202,
279
+ "eval_runtime": 48.6401,
280
+ "eval_samples_per_second": 807.379,
281
+ "eval_steps_per_second": 50.473,
282
+ "step": 22090
283
+ },
284
+ {
285
+ "epoch": 1.02,
286
+ "learning_rate": 9.554722941930016e-06,
287
+ "loss": 0.3827,
288
+ "step": 22500
289
+ },
290
+ {
291
+ "epoch": 1.04,
292
+ "learning_rate": 9.530643499032008e-06,
293
+ "loss": 0.3678,
294
+ "step": 23000
295
+ },
296
+ {
297
+ "epoch": 1.06,
298
+ "learning_rate": 9.506564056133998e-06,
299
+ "loss": 0.365,
300
+ "step": 23500
301
+ },
302
+ {
303
+ "epoch": 1.09,
304
+ "learning_rate": 9.48248461323599e-06,
305
+ "loss": 0.3697,
306
+ "step": 24000
307
+ },
308
+ {
309
+ "epoch": 1.11,
310
+ "learning_rate": 9.45840517033798e-06,
311
+ "loss": 0.3585,
312
+ "step": 24500
313
+ },
314
+ {
315
+ "epoch": 1.13,
316
+ "learning_rate": 9.43432572743997e-06,
317
+ "loss": 0.3727,
318
+ "step": 25000
319
+ },
320
+ {
321
+ "epoch": 1.15,
322
+ "learning_rate": 9.410246284541961e-06,
323
+ "loss": 0.3513,
324
+ "step": 25500
325
+ },
326
+ {
327
+ "epoch": 1.18,
328
+ "learning_rate": 9.386166841643953e-06,
329
+ "loss": 0.3754,
330
+ "step": 26000
331
+ },
332
+ {
333
+ "epoch": 1.2,
334
+ "learning_rate": 9.362087398745945e-06,
335
+ "loss": 0.3585,
336
+ "step": 26500
337
+ },
338
+ {
339
+ "epoch": 1.22,
340
+ "learning_rate": 9.338007955847935e-06,
341
+ "loss": 0.3661,
342
+ "step": 27000
343
+ },
344
+ {
345
+ "epoch": 1.24,
346
+ "learning_rate": 9.313928512949925e-06,
347
+ "loss": 0.3646,
348
+ "step": 27500
349
+ },
350
+ {
351
+ "epoch": 1.27,
352
+ "learning_rate": 9.289849070051916e-06,
353
+ "loss": 0.3604,
354
+ "step": 28000
355
+ },
356
+ {
357
+ "epoch": 1.29,
358
+ "learning_rate": 9.265769627153906e-06,
359
+ "loss": 0.3721,
360
+ "step": 28500
361
+ },
362
+ {
363
+ "epoch": 1.31,
364
+ "learning_rate": 9.241690184255898e-06,
365
+ "loss": 0.3673,
366
+ "step": 29000
367
+ },
368
+ {
369
+ "epoch": 1.34,
370
+ "learning_rate": 9.217610741357888e-06,
371
+ "loss": 0.3508,
372
+ "step": 29500
373
+ },
374
+ {
375
+ "epoch": 1.36,
376
+ "learning_rate": 9.19353129845988e-06,
377
+ "loss": 0.3543,
378
+ "step": 30000
379
+ },
380
+ {
381
+ "epoch": 1.38,
382
+ "learning_rate": 9.169451855561871e-06,
383
+ "loss": 0.3585,
384
+ "step": 30500
385
+ },
386
+ {
387
+ "epoch": 1.4,
388
+ "learning_rate": 9.145372412663861e-06,
389
+ "loss": 0.359,
390
+ "step": 31000
391
+ },
392
+ {
393
+ "epoch": 1.43,
394
+ "learning_rate": 9.121292969765853e-06,
395
+ "loss": 0.3597,
396
+ "step": 31500
397
+ },
398
+ {
399
+ "epoch": 1.45,
400
+ "learning_rate": 9.097213526867843e-06,
401
+ "loss": 0.3513,
402
+ "step": 32000
403
+ },
404
+ {
405
+ "epoch": 1.47,
406
+ "learning_rate": 9.073134083969835e-06,
407
+ "loss": 0.3618,
408
+ "step": 32500
409
+ },
410
+ {
411
+ "epoch": 1.49,
412
+ "learning_rate": 9.049054641071825e-06,
413
+ "loss": 0.3643,
414
+ "step": 33000
415
+ },
416
+ {
417
+ "epoch": 1.52,
418
+ "learning_rate": 9.024975198173815e-06,
419
+ "loss": 0.3737,
420
+ "step": 33500
421
+ },
422
+ {
423
+ "epoch": 1.54,
424
+ "learning_rate": 9.000895755275806e-06,
425
+ "loss": 0.3586,
426
+ "step": 34000
427
+ },
428
+ {
429
+ "epoch": 1.56,
430
+ "learning_rate": 8.976816312377798e-06,
431
+ "loss": 0.3577,
432
+ "step": 34500
433
+ },
434
+ {
435
+ "epoch": 1.58,
436
+ "learning_rate": 8.95273686947979e-06,
437
+ "loss": 0.3594,
438
+ "step": 35000
439
+ },
440
+ {
441
+ "epoch": 1.61,
442
+ "learning_rate": 8.92865742658178e-06,
443
+ "loss": 0.3438,
444
+ "step": 35500
445
+ },
446
+ {
447
+ "epoch": 1.63,
448
+ "learning_rate": 8.90457798368377e-06,
449
+ "loss": 0.3631,
450
+ "step": 36000
451
+ },
452
+ {
453
+ "epoch": 1.65,
454
+ "learning_rate": 8.880498540785761e-06,
455
+ "loss": 0.3498,
456
+ "step": 36500
457
+ },
458
+ {
459
+ "epoch": 1.67,
460
+ "learning_rate": 8.856419097887751e-06,
461
+ "loss": 0.3611,
462
+ "step": 37000
463
+ },
464
+ {
465
+ "epoch": 1.7,
466
+ "learning_rate": 8.832339654989743e-06,
467
+ "loss": 0.3473,
468
+ "step": 37500
469
+ },
470
+ {
471
+ "epoch": 1.72,
472
+ "learning_rate": 8.808260212091733e-06,
473
+ "loss": 0.3476,
474
+ "step": 38000
475
+ },
476
+ {
477
+ "epoch": 1.74,
478
+ "learning_rate": 8.784180769193725e-06,
479
+ "loss": 0.3565,
480
+ "step": 38500
481
+ },
482
+ {
483
+ "epoch": 1.77,
484
+ "learning_rate": 8.760101326295716e-06,
485
+ "loss": 0.3572,
486
+ "step": 39000
487
+ },
488
+ {
489
+ "epoch": 1.79,
490
+ "learning_rate": 8.736021883397706e-06,
491
+ "loss": 0.3501,
492
+ "step": 39500
493
+ },
494
+ {
495
+ "epoch": 1.81,
496
+ "learning_rate": 8.711942440499698e-06,
497
+ "loss": 0.3535,
498
+ "step": 40000
499
+ },
500
+ {
501
+ "epoch": 1.83,
502
+ "learning_rate": 8.687862997601688e-06,
503
+ "loss": 0.3545,
504
+ "step": 40500
505
+ },
506
+ {
507
+ "epoch": 1.86,
508
+ "learning_rate": 8.663783554703678e-06,
509
+ "loss": 0.3615,
510
+ "step": 41000
511
+ },
512
+ {
513
+ "epoch": 1.88,
514
+ "learning_rate": 8.63970411180567e-06,
515
+ "loss": 0.3651,
516
+ "step": 41500
517
+ },
518
+ {
519
+ "epoch": 1.9,
520
+ "learning_rate": 8.615624668907661e-06,
521
+ "loss": 0.3508,
522
+ "step": 42000
523
+ },
524
+ {
525
+ "epoch": 1.92,
526
+ "learning_rate": 8.591545226009653e-06,
527
+ "loss": 0.344,
528
+ "step": 42500
529
+ },
530
+ {
531
+ "epoch": 1.95,
532
+ "learning_rate": 8.567465783111643e-06,
533
+ "loss": 0.3536,
534
+ "step": 43000
535
+ },
536
+ {
537
+ "epoch": 1.97,
538
+ "learning_rate": 8.543386340213633e-06,
539
+ "loss": 0.359,
540
+ "step": 43500
541
+ },
542
+ {
543
+ "epoch": 1.99,
544
+ "learning_rate": 8.519306897315625e-06,
545
+ "loss": 0.3471,
546
+ "step": 44000
547
+ },
548
+ {
549
+ "epoch": 2.0,
550
+ "eval_accuracy": 0.8628759135239744,
551
+ "eval_loss": 0.3846328854560852,
552
+ "eval_runtime": 48.5564,
553
+ "eval_samples_per_second": 808.77,
554
+ "eval_steps_per_second": 50.56,
555
+ "step": 44180
556
+ },
557
+ {
558
+ "epoch": 2.01,
559
+ "learning_rate": 8.495227454417615e-06,
560
+ "loss": 0.2986,
561
+ "step": 44500
562
+ },
563
+ {
564
+ "epoch": 2.04,
565
+ "learning_rate": 8.471148011519606e-06,
566
+ "loss": 0.2883,
567
+ "step": 45000
568
+ },
569
+ {
570
+ "epoch": 2.06,
571
+ "learning_rate": 8.447068568621596e-06,
572
+ "loss": 0.2719,
573
+ "step": 45500
574
+ },
575
+ {
576
+ "epoch": 2.08,
577
+ "learning_rate": 8.422989125723588e-06,
578
+ "loss": 0.2827,
579
+ "step": 46000
580
+ },
581
+ {
582
+ "epoch": 2.11,
583
+ "learning_rate": 8.39890968282558e-06,
584
+ "loss": 0.2827,
585
+ "step": 46500
586
+ },
587
+ {
588
+ "epoch": 2.13,
589
+ "learning_rate": 8.37483023992757e-06,
590
+ "loss": 0.2976,
591
+ "step": 47000
592
+ },
593
+ {
594
+ "epoch": 2.15,
595
+ "learning_rate": 8.350750797029561e-06,
596
+ "loss": 0.2714,
597
+ "step": 47500
598
+ },
599
+ {
600
+ "epoch": 2.17,
601
+ "learning_rate": 8.326671354131551e-06,
602
+ "loss": 0.2895,
603
+ "step": 48000
604
+ },
605
+ {
606
+ "epoch": 2.2,
607
+ "learning_rate": 8.302591911233543e-06,
608
+ "loss": 0.2826,
609
+ "step": 48500
610
+ },
611
+ {
612
+ "epoch": 2.22,
613
+ "learning_rate": 8.278512468335533e-06,
614
+ "loss": 0.2783,
615
+ "step": 49000
616
+ },
617
+ {
618
+ "epoch": 2.24,
619
+ "learning_rate": 8.254433025437523e-06,
620
+ "loss": 0.2843,
621
+ "step": 49500
622
+ },
623
+ {
624
+ "epoch": 2.26,
625
+ "learning_rate": 8.230353582539515e-06,
626
+ "loss": 0.2913,
627
+ "step": 50000
628
+ },
629
+ {
630
+ "epoch": 2.29,
631
+ "learning_rate": 8.206274139641506e-06,
632
+ "loss": 0.2694,
633
+ "step": 50500
634
+ },
635
+ {
636
+ "epoch": 2.31,
637
+ "learning_rate": 8.182194696743498e-06,
638
+ "loss": 0.2847,
639
+ "step": 51000
640
+ },
641
+ {
642
+ "epoch": 2.33,
643
+ "learning_rate": 8.158115253845488e-06,
644
+ "loss": 0.2911,
645
+ "step": 51500
646
+ },
647
+ {
648
+ "epoch": 2.35,
649
+ "learning_rate": 8.134035810947478e-06,
650
+ "loss": 0.2818,
651
+ "step": 52000
652
+ },
653
+ {
654
+ "epoch": 2.38,
655
+ "learning_rate": 8.10995636804947e-06,
656
+ "loss": 0.282,
657
+ "step": 52500
658
+ },
659
+ {
660
+ "epoch": 2.4,
661
+ "learning_rate": 8.08587692515146e-06,
662
+ "loss": 0.2868,
663
+ "step": 53000
664
+ },
665
+ {
666
+ "epoch": 2.42,
667
+ "learning_rate": 8.061797482253451e-06,
668
+ "loss": 0.2849,
669
+ "step": 53500
670
+ },
671
+ {
672
+ "epoch": 2.44,
673
+ "learning_rate": 8.037718039355441e-06,
674
+ "loss": 0.2791,
675
+ "step": 54000
676
+ },
677
+ {
678
+ "epoch": 2.47,
679
+ "learning_rate": 8.013638596457433e-06,
680
+ "loss": 0.2833,
681
+ "step": 54500
682
+ },
683
+ {
684
+ "epoch": 2.49,
685
+ "learning_rate": 7.989559153559425e-06,
686
+ "loss": 0.2771,
687
+ "step": 55000
688
+ },
689
+ {
690
+ "epoch": 2.51,
691
+ "learning_rate": 7.965479710661415e-06,
692
+ "loss": 0.2865,
693
+ "step": 55500
694
+ },
695
+ {
696
+ "epoch": 2.54,
697
+ "learning_rate": 7.941400267763406e-06,
698
+ "loss": 0.2856,
699
+ "step": 56000
700
+ },
701
+ {
702
+ "epoch": 2.56,
703
+ "learning_rate": 7.917320824865396e-06,
704
+ "loss": 0.2989,
705
+ "step": 56500
706
+ },
707
+ {
708
+ "epoch": 2.58,
709
+ "learning_rate": 7.893241381967386e-06,
710
+ "loss": 0.286,
711
+ "step": 57000
712
+ },
713
+ {
714
+ "epoch": 2.6,
715
+ "learning_rate": 7.869161939069378e-06,
716
+ "loss": 0.2818,
717
+ "step": 57500
718
+ },
719
+ {
720
+ "epoch": 2.63,
721
+ "learning_rate": 7.84508249617137e-06,
722
+ "loss": 0.2781,
723
+ "step": 58000
724
+ },
725
+ {
726
+ "epoch": 2.65,
727
+ "learning_rate": 7.82100305327336e-06,
728
+ "loss": 0.2822,
729
+ "step": 58500
730
+ },
731
+ {
732
+ "epoch": 2.67,
733
+ "learning_rate": 7.796923610375351e-06,
734
+ "loss": 0.2888,
735
+ "step": 59000
736
+ },
737
+ {
738
+ "epoch": 2.69,
739
+ "learning_rate": 7.772844167477341e-06,
740
+ "loss": 0.2922,
741
+ "step": 59500
742
+ },
743
+ {
744
+ "epoch": 2.72,
745
+ "learning_rate": 7.748764724579333e-06,
746
+ "loss": 0.2817,
747
+ "step": 60000
748
+ },
749
+ {
750
+ "epoch": 2.74,
751
+ "learning_rate": 7.724685281681323e-06,
752
+ "loss": 0.2779,
753
+ "step": 60500
754
+ },
755
+ {
756
+ "epoch": 2.76,
757
+ "learning_rate": 7.700605838783315e-06,
758
+ "loss": 0.2846,
759
+ "step": 61000
760
+ },
761
+ {
762
+ "epoch": 2.78,
763
+ "learning_rate": 7.676526395885305e-06,
764
+ "loss": 0.2935,
765
+ "step": 61500
766
+ },
767
+ {
768
+ "epoch": 2.81,
769
+ "learning_rate": 7.652446952987296e-06,
770
+ "loss": 0.2787,
771
+ "step": 62000
772
+ },
773
+ {
774
+ "epoch": 2.83,
775
+ "learning_rate": 7.628367510089287e-06,
776
+ "loss": 0.2701,
777
+ "step": 62500
778
+ },
779
+ {
780
+ "epoch": 2.85,
781
+ "learning_rate": 7.604288067191278e-06,
782
+ "loss": 0.2888,
783
+ "step": 63000
784
+ },
785
+ {
786
+ "epoch": 2.87,
787
+ "learning_rate": 7.58020862429327e-06,
788
+ "loss": 0.2827,
789
+ "step": 63500
790
+ },
791
+ {
792
+ "epoch": 2.9,
793
+ "learning_rate": 7.55612918139526e-06,
794
+ "loss": 0.2746,
795
+ "step": 64000
796
+ },
797
+ {
798
+ "epoch": 2.92,
799
+ "learning_rate": 7.532049738497251e-06,
800
+ "loss": 0.2766,
801
+ "step": 64500
802
+ },
803
+ {
804
+ "epoch": 2.94,
805
+ "learning_rate": 7.507970295599241e-06,
806
+ "loss": 0.2976,
807
+ "step": 65000
808
+ },
809
+ {
810
+ "epoch": 2.97,
811
+ "learning_rate": 7.483890852701232e-06,
812
+ "loss": 0.2902,
813
+ "step": 65500
814
+ },
815
+ {
816
+ "epoch": 2.99,
817
+ "learning_rate": 7.459811409803224e-06,
818
+ "loss": 0.2989,
819
+ "step": 66000
820
+ },
821
+ {
822
+ "epoch": 3.0,
823
+ "eval_accuracy": 0.8646838634106593,
824
+ "eval_loss": 0.4122258126735687,
825
+ "eval_runtime": 48.5877,
826
+ "eval_samples_per_second": 808.249,
827
+ "eval_steps_per_second": 50.527,
828
+ "step": 66270
829
+ },
830
+ {
831
+ "epoch": 3.01,
832
+ "learning_rate": 7.435731966905214e-06,
833
+ "loss": 0.2587,
834
+ "step": 66500
835
+ },
836
+ {
837
+ "epoch": 3.03,
838
+ "learning_rate": 7.4116525240072056e-06,
839
+ "loss": 0.219,
840
+ "step": 67000
841
+ },
842
+ {
843
+ "epoch": 3.06,
844
+ "learning_rate": 7.387573081109196e-06,
845
+ "loss": 0.2385,
846
+ "step": 67500
847
+ },
848
+ {
849
+ "epoch": 3.08,
850
+ "learning_rate": 7.363493638211186e-06,
851
+ "loss": 0.2408,
852
+ "step": 68000
853
+ },
854
+ {
855
+ "epoch": 3.1,
856
+ "learning_rate": 7.339414195313178e-06,
857
+ "loss": 0.2248,
858
+ "step": 68500
859
+ },
860
+ {
861
+ "epoch": 3.12,
862
+ "learning_rate": 7.315334752415169e-06,
863
+ "loss": 0.2335,
864
+ "step": 69000
865
+ },
866
+ {
867
+ "epoch": 3.15,
868
+ "learning_rate": 7.29125530951716e-06,
869
+ "loss": 0.2361,
870
+ "step": 69500
871
+ },
872
+ {
873
+ "epoch": 3.17,
874
+ "learning_rate": 7.2671758666191506e-06,
875
+ "loss": 0.2272,
876
+ "step": 70000
877
+ },
878
+ {
879
+ "epoch": 3.19,
880
+ "learning_rate": 7.2430964237211406e-06,
881
+ "loss": 0.2342,
882
+ "step": 70500
883
+ },
884
+ {
885
+ "epoch": 3.21,
886
+ "learning_rate": 7.219016980823132e-06,
887
+ "loss": 0.2288,
888
+ "step": 71000
889
+ },
890
+ {
891
+ "epoch": 3.24,
892
+ "learning_rate": 7.194937537925123e-06,
893
+ "loss": 0.2266,
894
+ "step": 71500
895
+ },
896
+ {
897
+ "epoch": 3.26,
898
+ "learning_rate": 7.170858095027115e-06,
899
+ "loss": 0.2295,
900
+ "step": 72000
901
+ },
902
+ {
903
+ "epoch": 3.28,
904
+ "learning_rate": 7.146778652129105e-06,
905
+ "loss": 0.2118,
906
+ "step": 72500
907
+ },
908
+ {
909
+ "epoch": 3.3,
910
+ "learning_rate": 7.1226992092310956e-06,
911
+ "loss": 0.2187,
912
+ "step": 73000
913
+ },
914
+ {
915
+ "epoch": 3.33,
916
+ "learning_rate": 7.098619766333087e-06,
917
+ "loss": 0.2349,
918
+ "step": 73500
919
+ },
920
+ {
921
+ "epoch": 3.35,
922
+ "learning_rate": 7.074540323435077e-06,
923
+ "loss": 0.2389,
924
+ "step": 74000
925
+ },
926
+ {
927
+ "epoch": 3.37,
928
+ "learning_rate": 7.050460880537069e-06,
929
+ "loss": 0.2285,
930
+ "step": 74500
931
+ },
932
+ {
933
+ "epoch": 3.4,
934
+ "learning_rate": 7.026381437639059e-06,
935
+ "loss": 0.2197,
936
+ "step": 75000
937
+ },
938
+ {
939
+ "epoch": 3.42,
940
+ "learning_rate": 7.00230199474105e-06,
941
+ "loss": 0.2359,
942
+ "step": 75500
943
+ },
944
+ {
945
+ "epoch": 3.44,
946
+ "learning_rate": 6.9782225518430414e-06,
947
+ "loss": 0.2424,
948
+ "step": 76000
949
+ },
950
+ {
951
+ "epoch": 3.46,
952
+ "learning_rate": 6.954143108945031e-06,
953
+ "loss": 0.2292,
954
+ "step": 76500
955
+ },
956
+ {
957
+ "epoch": 3.49,
958
+ "learning_rate": 6.930063666047023e-06,
959
+ "loss": 0.234,
960
+ "step": 77000
961
+ },
962
+ {
963
+ "epoch": 3.51,
964
+ "learning_rate": 6.905984223149014e-06,
965
+ "loss": 0.2342,
966
+ "step": 77500
967
+ },
968
+ {
969
+ "epoch": 3.53,
970
+ "learning_rate": 6.881904780251004e-06,
971
+ "loss": 0.227,
972
+ "step": 78000
973
+ },
974
+ {
975
+ "epoch": 3.55,
976
+ "learning_rate": 6.857825337352996e-06,
977
+ "loss": 0.2349,
978
+ "step": 78500
979
+ },
980
+ {
981
+ "epoch": 3.58,
982
+ "learning_rate": 6.8337458944549864e-06,
983
+ "loss": 0.2356,
984
+ "step": 79000
985
+ },
986
+ {
987
+ "epoch": 3.6,
988
+ "learning_rate": 6.809666451556978e-06,
989
+ "loss": 0.2322,
990
+ "step": 79500
991
+ },
992
+ {
993
+ "epoch": 3.62,
994
+ "learning_rate": 6.785587008658968e-06,
995
+ "loss": 0.2368,
996
+ "step": 80000
997
+ },
998
+ {
999
+ "epoch": 3.64,
1000
+ "learning_rate": 6.761507565760959e-06,
1001
+ "loss": 0.2254,
1002
+ "step": 80500
1003
+ },
1004
+ {
1005
+ "epoch": 3.67,
1006
+ "learning_rate": 6.73742812286295e-06,
1007
+ "loss": 0.2488,
1008
+ "step": 81000
1009
+ },
1010
+ {
1011
+ "epoch": 3.69,
1012
+ "learning_rate": 6.713348679964941e-06,
1013
+ "loss": 0.2403,
1014
+ "step": 81500
1015
+ },
1016
+ {
1017
+ "epoch": 3.71,
1018
+ "learning_rate": 6.689269237066932e-06,
1019
+ "loss": 0.2371,
1020
+ "step": 82000
1021
+ },
1022
+ {
1023
+ "epoch": 3.73,
1024
+ "learning_rate": 6.665189794168922e-06,
1025
+ "loss": 0.2305,
1026
+ "step": 82500
1027
+ },
1028
+ {
1029
+ "epoch": 3.76,
1030
+ "learning_rate": 6.641110351270914e-06,
1031
+ "loss": 0.2355,
1032
+ "step": 83000
1033
+ },
1034
+ {
1035
+ "epoch": 3.78,
1036
+ "learning_rate": 6.617030908372905e-06,
1037
+ "loss": 0.2453,
1038
+ "step": 83500
1039
+ },
1040
+ {
1041
+ "epoch": 3.8,
1042
+ "learning_rate": 6.592951465474895e-06,
1043
+ "loss": 0.2406,
1044
+ "step": 84000
1045
+ },
1046
+ {
1047
+ "epoch": 3.83,
1048
+ "learning_rate": 6.5688720225768865e-06,
1049
+ "loss": 0.2417,
1050
+ "step": 84500
1051
+ },
1052
+ {
1053
+ "epoch": 3.85,
1054
+ "learning_rate": 6.544792579678877e-06,
1055
+ "loss": 0.2453,
1056
+ "step": 85000
1057
+ },
1058
+ {
1059
+ "epoch": 3.87,
1060
+ "learning_rate": 6.520713136780868e-06,
1061
+ "loss": 0.2365,
1062
+ "step": 85500
1063
+ },
1064
+ {
1065
+ "epoch": 3.89,
1066
+ "learning_rate": 6.496633693882859e-06,
1067
+ "loss": 0.2447,
1068
+ "step": 86000
1069
+ },
1070
+ {
1071
+ "epoch": 3.92,
1072
+ "learning_rate": 6.472554250984849e-06,
1073
+ "loss": 0.2441,
1074
+ "step": 86500
1075
+ },
1076
+ {
1077
+ "epoch": 3.94,
1078
+ "learning_rate": 6.448474808086841e-06,
1079
+ "loss": 0.2314,
1080
+ "step": 87000
1081
+ },
1082
+ {
1083
+ "epoch": 3.96,
1084
+ "learning_rate": 6.4243953651888315e-06,
1085
+ "loss": 0.2402,
1086
+ "step": 87500
1087
+ },
1088
+ {
1089
+ "epoch": 3.98,
1090
+ "learning_rate": 6.400315922290823e-06,
1091
+ "loss": 0.235,
1092
+ "step": 88000
1093
+ },
1094
+ {
1095
+ "epoch": 4.0,
1096
+ "eval_accuracy": 0.8673575921163199,
1097
+ "eval_loss": 0.500419020652771,
1098
+ "eval_runtime": 48.6048,
1099
+ "eval_samples_per_second": 807.965,
1100
+ "eval_steps_per_second": 50.509,
1101
+ "step": 88360
1102
+ },
1103
+ {
1104
+ "epoch": 4.01,
1105
+ "learning_rate": 6.376236479392813e-06,
1106
+ "loss": 0.2287,
1107
+ "step": 88500
1108
+ },
1109
+ {
1110
+ "epoch": 4.03,
1111
+ "learning_rate": 6.352157036494804e-06,
1112
+ "loss": 0.1927,
1113
+ "step": 89000
1114
+ },
1115
+ {
1116
+ "epoch": 4.05,
1117
+ "learning_rate": 6.328077593596796e-06,
1118
+ "loss": 0.1863,
1119
+ "step": 89500
1120
+ },
1121
+ {
1122
+ "epoch": 4.07,
1123
+ "learning_rate": 6.303998150698786e-06,
1124
+ "loss": 0.1964,
1125
+ "step": 90000
1126
+ },
1127
+ {
1128
+ "epoch": 4.1,
1129
+ "learning_rate": 6.279918707800777e-06,
1130
+ "loss": 0.2044,
1131
+ "step": 90500
1132
+ },
1133
+ {
1134
+ "epoch": 4.12,
1135
+ "learning_rate": 6.255839264902767e-06,
1136
+ "loss": 0.1911,
1137
+ "step": 91000
1138
+ },
1139
+ {
1140
+ "epoch": 4.14,
1141
+ "learning_rate": 6.231759822004758e-06,
1142
+ "loss": 0.2,
1143
+ "step": 91500
1144
+ },
1145
+ {
1146
+ "epoch": 4.16,
1147
+ "learning_rate": 6.20768037910675e-06,
1148
+ "loss": 0.1866,
1149
+ "step": 92000
1150
+ },
1151
+ {
1152
+ "epoch": 4.19,
1153
+ "learning_rate": 6.18360093620874e-06,
1154
+ "loss": 0.2139,
1155
+ "step": 92500
1156
+ },
1157
+ {
1158
+ "epoch": 4.21,
1159
+ "learning_rate": 6.1595214933107315e-06,
1160
+ "loss": 0.2048,
1161
+ "step": 93000
1162
+ },
1163
+ {
1164
+ "epoch": 4.23,
1165
+ "learning_rate": 6.135442050412722e-06,
1166
+ "loss": 0.1956,
1167
+ "step": 93500
1168
+ },
1169
+ {
1170
+ "epoch": 4.26,
1171
+ "learning_rate": 6.111362607514712e-06,
1172
+ "loss": 0.2057,
1173
+ "step": 94000
1174
+ },
1175
+ {
1176
+ "epoch": 4.28,
1177
+ "learning_rate": 6.087283164616704e-06,
1178
+ "loss": 0.1986,
1179
+ "step": 94500
1180
+ },
1181
+ {
1182
+ "epoch": 4.3,
1183
+ "learning_rate": 6.063203721718695e-06,
1184
+ "loss": 0.2022,
1185
+ "step": 95000
1186
+ },
1187
+ {
1188
+ "epoch": 4.32,
1189
+ "learning_rate": 6.039124278820686e-06,
1190
+ "loss": 0.1933,
1191
+ "step": 95500
1192
+ },
1193
+ {
1194
+ "epoch": 4.35,
1195
+ "learning_rate": 6.0150448359226765e-06,
1196
+ "loss": 0.1929,
1197
+ "step": 96000
1198
+ },
1199
+ {
1200
+ "epoch": 4.37,
1201
+ "learning_rate": 5.990965393024667e-06,
1202
+ "loss": 0.2034,
1203
+ "step": 96500
1204
+ },
1205
+ {
1206
+ "epoch": 4.39,
1207
+ "learning_rate": 5.966885950126658e-06,
1208
+ "loss": 0.1978,
1209
+ "step": 97000
1210
+ },
1211
+ {
1212
+ "epoch": 4.41,
1213
+ "learning_rate": 5.942806507228649e-06,
1214
+ "loss": 0.1988,
1215
+ "step": 97500
1216
+ },
1217
+ {
1218
+ "epoch": 4.44,
1219
+ "learning_rate": 5.918727064330641e-06,
1220
+ "loss": 0.2131,
1221
+ "step": 98000
1222
+ },
1223
+ {
1224
+ "epoch": 4.46,
1225
+ "learning_rate": 5.894647621432631e-06,
1226
+ "loss": 0.1982,
1227
+ "step": 98500
1228
+ },
1229
+ {
1230
+ "epoch": 4.48,
1231
+ "learning_rate": 5.8705681785346215e-06,
1232
+ "loss": 0.2076,
1233
+ "step": 99000
1234
+ },
1235
+ {
1236
+ "epoch": 4.5,
1237
+ "learning_rate": 5.846488735636613e-06,
1238
+ "loss": 0.1987,
1239
+ "step": 99500
1240
+ },
1241
+ {
1242
+ "epoch": 4.53,
1243
+ "learning_rate": 5.822409292738603e-06,
1244
+ "loss": 0.2144,
1245
+ "step": 100000
1246
+ },
1247
+ {
1248
+ "epoch": 4.55,
1249
+ "learning_rate": 5.798329849840595e-06,
1250
+ "loss": 0.2038,
1251
+ "step": 100500
1252
+ },
1253
+ {
1254
+ "epoch": 4.57,
1255
+ "learning_rate": 5.774250406942586e-06,
1256
+ "loss": 0.1908,
1257
+ "step": 101000
1258
+ },
1259
+ {
1260
+ "epoch": 4.59,
1261
+ "learning_rate": 5.7501709640445765e-06,
1262
+ "loss": 0.2056,
1263
+ "step": 101500
1264
+ },
1265
+ {
1266
+ "epoch": 4.62,
1267
+ "learning_rate": 5.726091521146567e-06,
1268
+ "loss": 0.2038,
1269
+ "step": 102000
1270
+ },
1271
+ {
1272
+ "epoch": 4.64,
1273
+ "learning_rate": 5.702012078248557e-06,
1274
+ "loss": 0.2099,
1275
+ "step": 102500
1276
+ },
1277
+ {
1278
+ "epoch": 4.66,
1279
+ "learning_rate": 5.677932635350549e-06,
1280
+ "loss": 0.2048,
1281
+ "step": 103000
1282
+ },
1283
+ {
1284
+ "epoch": 4.69,
1285
+ "learning_rate": 5.65385319245254e-06,
1286
+ "loss": 0.2072,
1287
+ "step": 103500
1288
+ },
1289
+ {
1290
+ "epoch": 4.71,
1291
+ "learning_rate": 5.6297737495545315e-06,
1292
+ "loss": 0.2155,
1293
+ "step": 104000
1294
+ },
1295
+ {
1296
+ "epoch": 4.73,
1297
+ "learning_rate": 5.6056943066565215e-06,
1298
+ "loss": 0.2044,
1299
+ "step": 104500
1300
+ },
1301
+ {
1302
+ "epoch": 4.75,
1303
+ "learning_rate": 5.581614863758512e-06,
1304
+ "loss": 0.203,
1305
+ "step": 105000
1306
+ },
1307
+ {
1308
+ "epoch": 4.78,
1309
+ "learning_rate": 5.557535420860504e-06,
1310
+ "loss": 0.2003,
1311
+ "step": 105500
1312
+ },
1313
+ {
1314
+ "epoch": 4.8,
1315
+ "learning_rate": 5.533455977962494e-06,
1316
+ "loss": 0.1985,
1317
+ "step": 106000
1318
+ },
1319
+ {
1320
+ "epoch": 4.82,
1321
+ "learning_rate": 5.509376535064486e-06,
1322
+ "loss": 0.2197,
1323
+ "step": 106500
1324
+ },
1325
+ {
1326
+ "epoch": 4.84,
1327
+ "learning_rate": 5.485297092166476e-06,
1328
+ "loss": 0.1965,
1329
+ "step": 107000
1330
+ },
1331
+ {
1332
+ "epoch": 4.87,
1333
+ "learning_rate": 5.4612176492684665e-06,
1334
+ "loss": 0.2217,
1335
+ "step": 107500
1336
+ },
1337
+ {
1338
+ "epoch": 4.89,
1339
+ "learning_rate": 5.437138206370458e-06,
1340
+ "loss": 0.1977,
1341
+ "step": 108000
1342
+ },
1343
+ {
1344
+ "epoch": 4.91,
1345
+ "learning_rate": 5.413058763472448e-06,
1346
+ "loss": 0.214,
1347
+ "step": 108500
1348
+ },
1349
+ {
1350
+ "epoch": 4.93,
1351
+ "learning_rate": 5.38897932057444e-06,
1352
+ "loss": 0.196,
1353
+ "step": 109000
1354
+ },
1355
+ {
1356
+ "epoch": 4.96,
1357
+ "learning_rate": 5.364899877676431e-06,
1358
+ "loss": 0.2213,
1359
+ "step": 109500
1360
+ },
1361
+ {
1362
+ "epoch": 4.98,
1363
+ "learning_rate": 5.340820434778421e-06,
1364
+ "loss": 0.2114,
1365
+ "step": 110000
1366
+ },
1367
+ {
1368
+ "epoch": 5.0,
1369
+ "eval_accuracy": 0.8639454050062387,
1370
+ "eval_loss": 0.601565957069397,
1371
+ "eval_runtime": 48.6094,
1372
+ "eval_samples_per_second": 807.89,
1373
+ "eval_steps_per_second": 50.505,
1374
+ "step": 110450
1375
+ },
1376
+ {
1377
+ "epoch": 5.0,
1378
+ "learning_rate": 5.316740991880412e-06,
1379
+ "loss": 0.2042,
1380
+ "step": 110500
1381
+ },
1382
+ {
1383
+ "epoch": 5.02,
1384
+ "learning_rate": 5.292661548982403e-06,
1385
+ "loss": 0.1637,
1386
+ "step": 111000
1387
+ },
1388
+ {
1389
+ "epoch": 5.05,
1390
+ "learning_rate": 5.268582106084394e-06,
1391
+ "loss": 0.157,
1392
+ "step": 111500
1393
+ },
1394
+ {
1395
+ "epoch": 5.07,
1396
+ "learning_rate": 5.244502663186385e-06,
1397
+ "loss": 0.1711,
1398
+ "step": 112000
1399
+ },
1400
+ {
1401
+ "epoch": 5.09,
1402
+ "learning_rate": 5.220423220288376e-06,
1403
+ "loss": 0.1664,
1404
+ "step": 112500
1405
+ },
1406
+ {
1407
+ "epoch": 5.12,
1408
+ "learning_rate": 5.1963437773903666e-06,
1409
+ "loss": 0.181,
1410
+ "step": 113000
1411
+ },
1412
+ {
1413
+ "epoch": 5.14,
1414
+ "learning_rate": 5.172264334492357e-06,
1415
+ "loss": 0.1686,
1416
+ "step": 113500
1417
+ },
1418
+ {
1419
+ "epoch": 5.16,
1420
+ "learning_rate": 5.148184891594349e-06,
1421
+ "loss": 0.165,
1422
+ "step": 114000
1423
+ },
1424
+ {
1425
+ "epoch": 5.18,
1426
+ "learning_rate": 5.124105448696339e-06,
1427
+ "loss": 0.1811,
1428
+ "step": 114500
1429
+ },
1430
+ {
1431
+ "epoch": 5.21,
1432
+ "learning_rate": 5.10002600579833e-06,
1433
+ "loss": 0.1765,
1434
+ "step": 115000
1435
+ },
1436
+ {
1437
+ "epoch": 5.23,
1438
+ "learning_rate": 5.075946562900322e-06,
1439
+ "loss": 0.1722,
1440
+ "step": 115500
1441
+ },
1442
+ {
1443
+ "epoch": 5.25,
1444
+ "learning_rate": 5.0518671200023116e-06,
1445
+ "loss": 0.1681,
1446
+ "step": 116000
1447
+ },
1448
+ {
1449
+ "epoch": 5.27,
1450
+ "learning_rate": 5.027787677104303e-06,
1451
+ "loss": 0.1981,
1452
+ "step": 116500
1453
+ },
1454
+ {
1455
+ "epoch": 5.3,
1456
+ "learning_rate": 5.003708234206294e-06,
1457
+ "loss": 0.1709,
1458
+ "step": 117000
1459
+ },
1460
+ {
1461
+ "epoch": 5.32,
1462
+ "learning_rate": 4.979628791308285e-06,
1463
+ "loss": 0.1647,
1464
+ "step": 117500
1465
+ },
1466
+ {
1467
+ "epoch": 5.34,
1468
+ "learning_rate": 4.955549348410276e-06,
1469
+ "loss": 0.1816,
1470
+ "step": 118000
1471
+ },
1472
+ {
1473
+ "epoch": 5.36,
1474
+ "learning_rate": 4.931469905512267e-06,
1475
+ "loss": 0.175,
1476
+ "step": 118500
1477
+ },
1478
+ {
1479
+ "epoch": 5.39,
1480
+ "learning_rate": 4.9073904626142574e-06,
1481
+ "loss": 0.187,
1482
+ "step": 119000
1483
+ },
1484
+ {
1485
+ "epoch": 5.41,
1486
+ "learning_rate": 4.883311019716248e-06,
1487
+ "loss": 0.1799,
1488
+ "step": 119500
1489
+ },
1490
+ {
1491
+ "epoch": 5.43,
1492
+ "learning_rate": 4.859231576818239e-06,
1493
+ "loss": 0.1813,
1494
+ "step": 120000
1495
+ },
1496
+ {
1497
+ "epoch": 5.45,
1498
+ "learning_rate": 4.83515213392023e-06,
1499
+ "loss": 0.1883,
1500
+ "step": 120500
1501
+ },
1502
+ {
1503
+ "epoch": 5.48,
1504
+ "learning_rate": 4.811072691022221e-06,
1505
+ "loss": 0.1863,
1506
+ "step": 121000
1507
+ },
1508
+ {
1509
+ "epoch": 5.5,
1510
+ "learning_rate": 4.7869932481242124e-06,
1511
+ "loss": 0.1856,
1512
+ "step": 121500
1513
+ },
1514
+ {
1515
+ "epoch": 5.52,
1516
+ "learning_rate": 4.7629138052262024e-06,
1517
+ "loss": 0.1633,
1518
+ "step": 122000
1519
+ },
1520
+ {
1521
+ "epoch": 5.55,
1522
+ "learning_rate": 4.738834362328193e-06,
1523
+ "loss": 0.1863,
1524
+ "step": 122500
1525
+ },
1526
+ {
1527
+ "epoch": 5.57,
1528
+ "learning_rate": 4.714754919430184e-06,
1529
+ "loss": 0.1779,
1530
+ "step": 123000
1531
+ },
1532
+ {
1533
+ "epoch": 5.59,
1534
+ "learning_rate": 4.690675476532176e-06,
1535
+ "loss": 0.1851,
1536
+ "step": 123500
1537
+ },
1538
+ {
1539
+ "epoch": 5.61,
1540
+ "learning_rate": 4.666596033634167e-06,
1541
+ "loss": 0.173,
1542
+ "step": 124000
1543
+ },
1544
+ {
1545
+ "epoch": 5.64,
1546
+ "learning_rate": 4.642516590736157e-06,
1547
+ "loss": 0.1748,
1548
+ "step": 124500
1549
+ },
1550
+ {
1551
+ "epoch": 5.66,
1552
+ "learning_rate": 4.618437147838148e-06,
1553
+ "loss": 0.1782,
1554
+ "step": 125000
1555
+ },
1556
+ {
1557
+ "epoch": 5.68,
1558
+ "learning_rate": 4.594357704940139e-06,
1559
+ "loss": 0.1777,
1560
+ "step": 125500
1561
+ },
1562
+ {
1563
+ "epoch": 5.7,
1564
+ "learning_rate": 4.57027826204213e-06,
1565
+ "loss": 0.1807,
1566
+ "step": 126000
1567
+ },
1568
+ {
1569
+ "epoch": 5.73,
1570
+ "learning_rate": 4.546198819144121e-06,
1571
+ "loss": 0.179,
1572
+ "step": 126500
1573
+ },
1574
+ {
1575
+ "epoch": 5.75,
1576
+ "learning_rate": 4.522119376246112e-06,
1577
+ "loss": 0.1745,
1578
+ "step": 127000
1579
+ },
1580
+ {
1581
+ "epoch": 5.77,
1582
+ "learning_rate": 4.4980399333481025e-06,
1583
+ "loss": 0.1886,
1584
+ "step": 127500
1585
+ },
1586
+ {
1587
+ "epoch": 5.79,
1588
+ "learning_rate": 4.473960490450093e-06,
1589
+ "loss": 0.1708,
1590
+ "step": 128000
1591
+ },
1592
+ {
1593
+ "epoch": 5.82,
1594
+ "learning_rate": 4.449881047552084e-06,
1595
+ "loss": 0.1712,
1596
+ "step": 128500
1597
+ },
1598
+ {
1599
+ "epoch": 5.84,
1600
+ "learning_rate": 4.425801604654075e-06,
1601
+ "loss": 0.1946,
1602
+ "step": 129000
1603
+ },
1604
+ {
1605
+ "epoch": 5.86,
1606
+ "learning_rate": 4.401722161756066e-06,
1607
+ "loss": 0.1772,
1608
+ "step": 129500
1609
+ },
1610
+ {
1611
+ "epoch": 5.89,
1612
+ "learning_rate": 4.377642718858057e-06,
1613
+ "loss": 0.1873,
1614
+ "step": 130000
1615
+ },
1616
+ {
1617
+ "epoch": 5.91,
1618
+ "learning_rate": 4.3535632759600475e-06,
1619
+ "loss": 0.1895,
1620
+ "step": 130500
1621
+ },
1622
+ {
1623
+ "epoch": 5.93,
1624
+ "learning_rate": 4.329483833062038e-06,
1625
+ "loss": 0.1782,
1626
+ "step": 131000
1627
+ },
1628
+ {
1629
+ "epoch": 5.95,
1630
+ "learning_rate": 4.30540439016403e-06,
1631
+ "loss": 0.1915,
1632
+ "step": 131500
1633
+ },
1634
+ {
1635
+ "epoch": 5.98,
1636
+ "learning_rate": 4.281324947266021e-06,
1637
+ "loss": 0.1826,
1638
+ "step": 132000
1639
+ },
1640
+ {
1641
+ "epoch": 6.0,
1642
+ "learning_rate": 4.257245504368011e-06,
1643
+ "loss": 0.1833,
1644
+ "step": 132500
1645
+ },
1646
+ {
1647
+ "epoch": 6.0,
1648
+ "eval_accuracy": 0.8642509740011713,
1649
+ "eval_loss": 0.6854547262191772,
1650
+ "eval_runtime": 48.5441,
1651
+ "eval_samples_per_second": 808.975,
1652
+ "eval_steps_per_second": 50.573,
1653
+ "step": 132540
1654
+ },
1655
+ {
1656
+ "epoch": 6.02,
1657
+ "learning_rate": 4.2331660614700025e-06,
1658
+ "loss": 0.1409,
1659
+ "step": 133000
1660
+ },
1661
+ {
1662
+ "epoch": 6.04,
1663
+ "learning_rate": 4.209086618571993e-06,
1664
+ "loss": 0.1583,
1665
+ "step": 133500
1666
+ },
1667
+ {
1668
+ "epoch": 6.07,
1669
+ "learning_rate": 4.185007175673984e-06,
1670
+ "loss": 0.1503,
1671
+ "step": 134000
1672
+ },
1673
+ {
1674
+ "epoch": 6.09,
1675
+ "learning_rate": 4.160927732775975e-06,
1676
+ "loss": 0.1372,
1677
+ "step": 134500
1678
+ },
1679
+ {
1680
+ "epoch": 6.11,
1681
+ "learning_rate": 4.136848289877966e-06,
1682
+ "loss": 0.1332,
1683
+ "step": 135000
1684
+ },
1685
+ {
1686
+ "epoch": 6.13,
1687
+ "learning_rate": 4.112768846979957e-06,
1688
+ "loss": 0.1528,
1689
+ "step": 135500
1690
+ },
1691
+ {
1692
+ "epoch": 6.16,
1693
+ "learning_rate": 4.0886894040819475e-06,
1694
+ "loss": 0.1367,
1695
+ "step": 136000
1696
+ },
1697
+ {
1698
+ "epoch": 6.18,
1699
+ "learning_rate": 4.064609961183938e-06,
1700
+ "loss": 0.1456,
1701
+ "step": 136500
1702
+ },
1703
+ {
1704
+ "epoch": 6.2,
1705
+ "learning_rate": 4.040530518285929e-06,
1706
+ "loss": 0.1674,
1707
+ "step": 137000
1708
+ },
1709
+ {
1710
+ "epoch": 6.22,
1711
+ "learning_rate": 4.01645107538792e-06,
1712
+ "loss": 0.1465,
1713
+ "step": 137500
1714
+ },
1715
+ {
1716
+ "epoch": 6.25,
1717
+ "learning_rate": 3.992371632489911e-06,
1718
+ "loss": 0.1492,
1719
+ "step": 138000
1720
+ },
1721
+ {
1722
+ "epoch": 6.27,
1723
+ "learning_rate": 3.968292189591902e-06,
1724
+ "loss": 0.149,
1725
+ "step": 138500
1726
+ },
1727
+ {
1728
+ "epoch": 6.29,
1729
+ "learning_rate": 3.9442127466938925e-06,
1730
+ "loss": 0.1539,
1731
+ "step": 139000
1732
+ },
1733
+ {
1734
+ "epoch": 6.32,
1735
+ "learning_rate": 3.920133303795884e-06,
1736
+ "loss": 0.1339,
1737
+ "step": 139500
1738
+ },
1739
+ {
1740
+ "epoch": 6.34,
1741
+ "learning_rate": 3.896053860897875e-06,
1742
+ "loss": 0.142,
1743
+ "step": 140000
1744
+ },
1745
+ {
1746
+ "epoch": 6.36,
1747
+ "learning_rate": 3.871974417999865e-06,
1748
+ "loss": 0.1504,
1749
+ "step": 140500
1750
+ },
1751
+ {
1752
+ "epoch": 6.38,
1753
+ "learning_rate": 3.847894975101857e-06,
1754
+ "loss": 0.1578,
1755
+ "step": 141000
1756
+ },
1757
+ {
1758
+ "epoch": 6.41,
1759
+ "learning_rate": 3.8238155322038475e-06,
1760
+ "loss": 0.1491,
1761
+ "step": 141500
1762
+ },
1763
+ {
1764
+ "epoch": 6.43,
1765
+ "learning_rate": 3.7997360893058384e-06,
1766
+ "loss": 0.1673,
1767
+ "step": 142000
1768
+ },
1769
+ {
1770
+ "epoch": 6.45,
1771
+ "learning_rate": 3.775656646407829e-06,
1772
+ "loss": 0.1416,
1773
+ "step": 142500
1774
+ },
1775
+ {
1776
+ "epoch": 6.47,
1777
+ "learning_rate": 3.7515772035098196e-06,
1778
+ "loss": 0.1607,
1779
+ "step": 143000
1780
+ },
1781
+ {
1782
+ "epoch": 6.5,
1783
+ "learning_rate": 3.727497760611811e-06,
1784
+ "loss": 0.1386,
1785
+ "step": 143500
1786
+ },
1787
+ {
1788
+ "epoch": 6.52,
1789
+ "learning_rate": 3.7034183177138017e-06,
1790
+ "loss": 0.1455,
1791
+ "step": 144000
1792
+ },
1793
+ {
1794
+ "epoch": 6.54,
1795
+ "learning_rate": 3.6793388748157925e-06,
1796
+ "loss": 0.152,
1797
+ "step": 144500
1798
+ },
1799
+ {
1800
+ "epoch": 6.56,
1801
+ "learning_rate": 3.6552594319177838e-06,
1802
+ "loss": 0.1533,
1803
+ "step": 145000
1804
+ },
1805
+ {
1806
+ "epoch": 6.59,
1807
+ "learning_rate": 3.631179989019774e-06,
1808
+ "loss": 0.1491,
1809
+ "step": 145500
1810
+ },
1811
+ {
1812
+ "epoch": 6.61,
1813
+ "learning_rate": 3.607100546121765e-06,
1814
+ "loss": 0.1392,
1815
+ "step": 146000
1816
+ },
1817
+ {
1818
+ "epoch": 6.63,
1819
+ "learning_rate": 3.583021103223756e-06,
1820
+ "loss": 0.1454,
1821
+ "step": 146500
1822
+ },
1823
+ {
1824
+ "epoch": 6.65,
1825
+ "learning_rate": 3.558941660325747e-06,
1826
+ "loss": 0.1639,
1827
+ "step": 147000
1828
+ },
1829
+ {
1830
+ "epoch": 6.68,
1831
+ "learning_rate": 3.534862217427738e-06,
1832
+ "loss": 0.1458,
1833
+ "step": 147500
1834
+ },
1835
+ {
1836
+ "epoch": 6.7,
1837
+ "learning_rate": 3.5107827745297292e-06,
1838
+ "loss": 0.1563,
1839
+ "step": 148000
1840
+ },
1841
+ {
1842
+ "epoch": 6.72,
1843
+ "learning_rate": 3.4867033316317196e-06,
1844
+ "loss": 0.1613,
1845
+ "step": 148500
1846
+ },
1847
+ {
1848
+ "epoch": 6.75,
1849
+ "learning_rate": 3.4626238887337105e-06,
1850
+ "loss": 0.1332,
1851
+ "step": 149000
1852
+ },
1853
+ {
1854
+ "epoch": 6.77,
1855
+ "learning_rate": 3.4385444458357013e-06,
1856
+ "loss": 0.1543,
1857
+ "step": 149500
1858
+ },
1859
+ {
1860
+ "epoch": 6.79,
1861
+ "learning_rate": 3.4144650029376926e-06,
1862
+ "loss": 0.1649,
1863
+ "step": 150000
1864
+ },
1865
+ {
1866
+ "epoch": 6.81,
1867
+ "learning_rate": 3.3903855600396834e-06,
1868
+ "loss": 0.1393,
1869
+ "step": 150500
1870
+ },
1871
+ {
1872
+ "epoch": 6.84,
1873
+ "learning_rate": 3.366306117141674e-06,
1874
+ "loss": 0.1463,
1875
+ "step": 151000
1876
+ },
1877
+ {
1878
+ "epoch": 6.86,
1879
+ "learning_rate": 3.342226674243665e-06,
1880
+ "loss": 0.1468,
1881
+ "step": 151500
1882
+ },
1883
+ {
1884
+ "epoch": 6.88,
1885
+ "learning_rate": 3.318147231345656e-06,
1886
+ "loss": 0.152,
1887
+ "step": 152000
1888
+ },
1889
+ {
1890
+ "epoch": 6.9,
1891
+ "learning_rate": 3.2940677884476467e-06,
1892
+ "loss": 0.1509,
1893
+ "step": 152500
1894
+ },
1895
+ {
1896
+ "epoch": 6.93,
1897
+ "learning_rate": 3.269988345549638e-06,
1898
+ "loss": 0.1548,
1899
+ "step": 153000
1900
+ },
1901
+ {
1902
+ "epoch": 6.95,
1903
+ "learning_rate": 3.2459089026516284e-06,
1904
+ "loss": 0.1452,
1905
+ "step": 153500
1906
+ },
1907
+ {
1908
+ "epoch": 6.97,
1909
+ "learning_rate": 3.2218294597536192e-06,
1910
+ "loss": 0.1461,
1911
+ "step": 154000
1912
+ },
1913
+ {
1914
+ "epoch": 6.99,
1915
+ "learning_rate": 3.19775001685561e-06,
1916
+ "loss": 0.1568,
1917
+ "step": 154500
1918
+ },
1919
+ {
1920
+ "epoch": 7.0,
1921
+ "eval_accuracy": 0.8665936696289883,
1922
+ "eval_loss": 0.773478090763092,
1923
+ "eval_runtime": 48.6355,
1924
+ "eval_samples_per_second": 807.455,
1925
+ "eval_steps_per_second": 50.478,
1926
+ "step": 154630
1927
+ },
1928
+ {
1929
+ "epoch": 7.02,
1930
+ "learning_rate": 3.1736705739576013e-06,
1931
+ "loss": 0.1257,
1932
+ "step": 155000
1933
+ },
1934
+ {
1935
+ "epoch": 7.04,
1936
+ "learning_rate": 3.149591131059592e-06,
1937
+ "loss": 0.1142,
1938
+ "step": 155500
1939
+ },
1940
+ {
1941
+ "epoch": 7.06,
1942
+ "learning_rate": 3.1255116881615826e-06,
1943
+ "loss": 0.1063,
1944
+ "step": 156000
1945
+ },
1946
+ {
1947
+ "epoch": 7.08,
1948
+ "learning_rate": 3.101432245263574e-06,
1949
+ "loss": 0.1294,
1950
+ "step": 156500
1951
+ },
1952
+ {
1953
+ "epoch": 7.11,
1954
+ "learning_rate": 3.0773528023655647e-06,
1955
+ "loss": 0.1166,
1956
+ "step": 157000
1957
+ },
1958
+ {
1959
+ "epoch": 7.13,
1960
+ "learning_rate": 3.0532733594675555e-06,
1961
+ "loss": 0.1169,
1962
+ "step": 157500
1963
+ },
1964
+ {
1965
+ "epoch": 7.15,
1966
+ "learning_rate": 3.0291939165695468e-06,
1967
+ "loss": 0.1244,
1968
+ "step": 158000
1969
+ },
1970
+ {
1971
+ "epoch": 7.18,
1972
+ "learning_rate": 3.0051144736715376e-06,
1973
+ "loss": 0.1211,
1974
+ "step": 158500
1975
+ },
1976
+ {
1977
+ "epoch": 7.2,
1978
+ "learning_rate": 2.981035030773528e-06,
1979
+ "loss": 0.131,
1980
+ "step": 159000
1981
+ },
1982
+ {
1983
+ "epoch": 7.22,
1984
+ "learning_rate": 2.9569555878755193e-06,
1985
+ "loss": 0.1229,
1986
+ "step": 159500
1987
+ },
1988
+ {
1989
+ "epoch": 7.24,
1990
+ "learning_rate": 2.93287614497751e-06,
1991
+ "loss": 0.1208,
1992
+ "step": 160000
1993
+ },
1994
+ {
1995
+ "epoch": 7.27,
1996
+ "learning_rate": 2.908796702079501e-06,
1997
+ "loss": 0.1207,
1998
+ "step": 160500
1999
+ },
2000
+ {
2001
+ "epoch": 7.29,
2002
+ "learning_rate": 2.884717259181492e-06,
2003
+ "loss": 0.1161,
2004
+ "step": 161000
2005
+ },
2006
+ {
2007
+ "epoch": 7.31,
2008
+ "learning_rate": 2.8606378162834826e-06,
2009
+ "loss": 0.1362,
2010
+ "step": 161500
2011
+ },
2012
+ {
2013
+ "epoch": 7.33,
2014
+ "learning_rate": 2.8365583733854734e-06,
2015
+ "loss": 0.1261,
2016
+ "step": 162000
2017
+ },
2018
+ {
2019
+ "epoch": 7.36,
2020
+ "learning_rate": 2.8124789304874643e-06,
2021
+ "loss": 0.1091,
2022
+ "step": 162500
2023
+ },
2024
+ {
2025
+ "epoch": 7.38,
2026
+ "learning_rate": 2.7883994875894555e-06,
2027
+ "loss": 0.126,
2028
+ "step": 163000
2029
+ },
2030
+ {
2031
+ "epoch": 7.4,
2032
+ "learning_rate": 2.7643200446914464e-06,
2033
+ "loss": 0.127,
2034
+ "step": 163500
2035
+ },
2036
+ {
2037
+ "epoch": 7.42,
2038
+ "learning_rate": 2.7402406017934368e-06,
2039
+ "loss": 0.118,
2040
+ "step": 164000
2041
+ },
2042
+ {
2043
+ "epoch": 7.45,
2044
+ "learning_rate": 2.716161158895428e-06,
2045
+ "loss": 0.1368,
2046
+ "step": 164500
2047
+ },
2048
+ {
2049
+ "epoch": 7.47,
2050
+ "learning_rate": 2.692081715997419e-06,
2051
+ "loss": 0.1225,
2052
+ "step": 165000
2053
+ },
2054
+ {
2055
+ "epoch": 7.49,
2056
+ "learning_rate": 2.6680022730994097e-06,
2057
+ "loss": 0.1095,
2058
+ "step": 165500
2059
+ },
2060
+ {
2061
+ "epoch": 7.51,
2062
+ "learning_rate": 2.643922830201401e-06,
2063
+ "loss": 0.126,
2064
+ "step": 166000
2065
+ },
2066
+ {
2067
+ "epoch": 7.54,
2068
+ "learning_rate": 2.6198433873033918e-06,
2069
+ "loss": 0.1212,
2070
+ "step": 166500
2071
+ },
2072
+ {
2073
+ "epoch": 7.56,
2074
+ "learning_rate": 2.595763944405382e-06,
2075
+ "loss": 0.1326,
2076
+ "step": 167000
2077
+ },
2078
+ {
2079
+ "epoch": 7.58,
2080
+ "learning_rate": 2.5716845015073735e-06,
2081
+ "loss": 0.1343,
2082
+ "step": 167500
2083
+ },
2084
+ {
2085
+ "epoch": 7.61,
2086
+ "learning_rate": 2.5476050586093643e-06,
2087
+ "loss": 0.1089,
2088
+ "step": 168000
2089
+ },
2090
+ {
2091
+ "epoch": 7.63,
2092
+ "learning_rate": 2.523525615711355e-06,
2093
+ "loss": 0.1358,
2094
+ "step": 168500
2095
+ },
2096
+ {
2097
+ "epoch": 7.65,
2098
+ "learning_rate": 2.499446172813346e-06,
2099
+ "loss": 0.1231,
2100
+ "step": 169000
2101
+ },
2102
+ {
2103
+ "epoch": 7.67,
2104
+ "learning_rate": 2.475366729915337e-06,
2105
+ "loss": 0.1261,
2106
+ "step": 169500
2107
+ },
2108
+ {
2109
+ "epoch": 7.7,
2110
+ "learning_rate": 2.4512872870173276e-06,
2111
+ "loss": 0.122,
2112
+ "step": 170000
2113
+ },
2114
+ {
2115
+ "epoch": 7.72,
2116
+ "learning_rate": 2.4272078441193185e-06,
2117
+ "loss": 0.1233,
2118
+ "step": 170500
2119
+ },
2120
+ {
2121
+ "epoch": 7.74,
2122
+ "learning_rate": 2.4031284012213097e-06,
2123
+ "loss": 0.1327,
2124
+ "step": 171000
2125
+ },
2126
+ {
2127
+ "epoch": 7.76,
2128
+ "learning_rate": 2.3790489583233006e-06,
2129
+ "loss": 0.1214,
2130
+ "step": 171500
2131
+ },
2132
+ {
2133
+ "epoch": 7.79,
2134
+ "learning_rate": 2.3549695154252914e-06,
2135
+ "loss": 0.1225,
2136
+ "step": 172000
2137
+ },
2138
+ {
2139
+ "epoch": 7.81,
2140
+ "learning_rate": 2.3308900725272822e-06,
2141
+ "loss": 0.1247,
2142
+ "step": 172500
2143
+ },
2144
+ {
2145
+ "epoch": 7.83,
2146
+ "learning_rate": 2.306810629629273e-06,
2147
+ "loss": 0.119,
2148
+ "step": 173000
2149
+ },
2150
+ {
2151
+ "epoch": 7.85,
2152
+ "learning_rate": 2.282731186731264e-06,
2153
+ "loss": 0.1275,
2154
+ "step": 173500
2155
+ },
2156
+ {
2157
+ "epoch": 7.88,
2158
+ "learning_rate": 2.2586517438332547e-06,
2159
+ "loss": 0.1158,
2160
+ "step": 174000
2161
+ },
2162
+ {
2163
+ "epoch": 7.9,
2164
+ "learning_rate": 2.2345723009352456e-06,
2165
+ "loss": 0.1156,
2166
+ "step": 174500
2167
+ },
2168
+ {
2169
+ "epoch": 7.92,
2170
+ "learning_rate": 2.210492858037237e-06,
2171
+ "loss": 0.1313,
2172
+ "step": 175000
2173
+ },
2174
+ {
2175
+ "epoch": 7.94,
2176
+ "learning_rate": 2.1864134151392277e-06,
2177
+ "loss": 0.1301,
2178
+ "step": 175500
2179
+ },
2180
+ {
2181
+ "epoch": 7.97,
2182
+ "learning_rate": 2.1623339722412185e-06,
2183
+ "loss": 0.1198,
2184
+ "step": 176000
2185
+ },
2186
+ {
2187
+ "epoch": 7.99,
2188
+ "learning_rate": 2.1382545293432093e-06,
2189
+ "loss": 0.1237,
2190
+ "step": 176500
2191
+ },
2192
+ {
2193
+ "epoch": 8.0,
2194
+ "eval_accuracy": 0.8664918132973441,
2195
+ "eval_loss": 0.8249724507331848,
2196
+ "eval_runtime": 48.4679,
2197
+ "eval_samples_per_second": 810.247,
2198
+ "eval_steps_per_second": 50.652,
2199
+ "step": 176720
2200
+ },
2201
+ {
2202
+ "epoch": 8.01,
2203
+ "learning_rate": 2.1141750864452e-06,
2204
+ "loss": 0.1008,
2205
+ "step": 177000
2206
+ },
2207
+ {
2208
+ "epoch": 8.04,
2209
+ "learning_rate": 2.090095643547191e-06,
2210
+ "loss": 0.0971,
2211
+ "step": 177500
2212
+ },
2213
+ {
2214
+ "epoch": 8.06,
2215
+ "learning_rate": 2.066016200649182e-06,
2216
+ "loss": 0.1021,
2217
+ "step": 178000
2218
+ },
2219
+ {
2220
+ "epoch": 8.08,
2221
+ "learning_rate": 2.0419367577511727e-06,
2222
+ "loss": 0.1067,
2223
+ "step": 178500
2224
+ },
2225
+ {
2226
+ "epoch": 8.1,
2227
+ "learning_rate": 2.017857314853164e-06,
2228
+ "loss": 0.0989,
2229
+ "step": 179000
2230
+ },
2231
+ {
2232
+ "epoch": 8.13,
2233
+ "learning_rate": 1.9937778719551548e-06,
2234
+ "loss": 0.1049,
2235
+ "step": 179500
2236
+ },
2237
+ {
2238
+ "epoch": 8.15,
2239
+ "learning_rate": 1.9696984290571456e-06,
2240
+ "loss": 0.1154,
2241
+ "step": 180000
2242
+ },
2243
+ {
2244
+ "epoch": 8.17,
2245
+ "learning_rate": 1.9456189861591364e-06,
2246
+ "loss": 0.0919,
2247
+ "step": 180500
2248
+ },
2249
+ {
2250
+ "epoch": 8.19,
2251
+ "learning_rate": 1.9215395432611273e-06,
2252
+ "loss": 0.1185,
2253
+ "step": 181000
2254
+ },
2255
+ {
2256
+ "epoch": 8.22,
2257
+ "learning_rate": 1.8974601003631183e-06,
2258
+ "loss": 0.0943,
2259
+ "step": 181500
2260
+ },
2261
+ {
2262
+ "epoch": 8.24,
2263
+ "learning_rate": 1.873380657465109e-06,
2264
+ "loss": 0.1093,
2265
+ "step": 182000
2266
+ },
2267
+ {
2268
+ "epoch": 8.26,
2269
+ "learning_rate": 1.8493012145671e-06,
2270
+ "loss": 0.0948,
2271
+ "step": 182500
2272
+ },
2273
+ {
2274
+ "epoch": 8.28,
2275
+ "learning_rate": 1.825221771669091e-06,
2276
+ "loss": 0.0984,
2277
+ "step": 183000
2278
+ },
2279
+ {
2280
+ "epoch": 8.31,
2281
+ "learning_rate": 1.8011423287710816e-06,
2282
+ "loss": 0.1063,
2283
+ "step": 183500
2284
+ },
2285
+ {
2286
+ "epoch": 8.33,
2287
+ "learning_rate": 1.7770628858730727e-06,
2288
+ "loss": 0.1084,
2289
+ "step": 184000
2290
+ },
2291
+ {
2292
+ "epoch": 8.35,
2293
+ "learning_rate": 1.7529834429750633e-06,
2294
+ "loss": 0.0988,
2295
+ "step": 184500
2296
+ },
2297
+ {
2298
+ "epoch": 8.37,
2299
+ "learning_rate": 1.7289040000770544e-06,
2300
+ "loss": 0.091,
2301
+ "step": 185000
2302
+ },
2303
+ {
2304
+ "epoch": 8.4,
2305
+ "learning_rate": 1.7048245571790454e-06,
2306
+ "loss": 0.1077,
2307
+ "step": 185500
2308
+ },
2309
+ {
2310
+ "epoch": 8.42,
2311
+ "learning_rate": 1.680745114281036e-06,
2312
+ "loss": 0.0997,
2313
+ "step": 186000
2314
+ },
2315
+ {
2316
+ "epoch": 8.44,
2317
+ "learning_rate": 1.656665671383027e-06,
2318
+ "loss": 0.0898,
2319
+ "step": 186500
2320
+ },
2321
+ {
2322
+ "epoch": 8.47,
2323
+ "learning_rate": 1.6325862284850181e-06,
2324
+ "loss": 0.1011,
2325
+ "step": 187000
2326
+ },
2327
+ {
2328
+ "epoch": 8.49,
2329
+ "learning_rate": 1.6085067855870087e-06,
2330
+ "loss": 0.1043,
2331
+ "step": 187500
2332
+ },
2333
+ {
2334
+ "epoch": 8.51,
2335
+ "learning_rate": 1.5844273426889998e-06,
2336
+ "loss": 0.0891,
2337
+ "step": 188000
2338
+ },
2339
+ {
2340
+ "epoch": 8.53,
2341
+ "learning_rate": 1.5603478997909904e-06,
2342
+ "loss": 0.0992,
2343
+ "step": 188500
2344
+ },
2345
+ {
2346
+ "epoch": 8.56,
2347
+ "learning_rate": 1.5362684568929815e-06,
2348
+ "loss": 0.1055,
2349
+ "step": 189000
2350
+ },
2351
+ {
2352
+ "epoch": 8.58,
2353
+ "learning_rate": 1.5121890139949725e-06,
2354
+ "loss": 0.1032,
2355
+ "step": 189500
2356
+ },
2357
+ {
2358
+ "epoch": 8.6,
2359
+ "learning_rate": 1.4881095710969631e-06,
2360
+ "loss": 0.0945,
2361
+ "step": 190000
2362
+ },
2363
+ {
2364
+ "epoch": 8.62,
2365
+ "learning_rate": 1.4640301281989542e-06,
2366
+ "loss": 0.1159,
2367
+ "step": 190500
2368
+ },
2369
+ {
2370
+ "epoch": 8.65,
2371
+ "learning_rate": 1.4399506853009452e-06,
2372
+ "loss": 0.1016,
2373
+ "step": 191000
2374
+ },
2375
+ {
2376
+ "epoch": 8.67,
2377
+ "learning_rate": 1.4158712424029358e-06,
2378
+ "loss": 0.1024,
2379
+ "step": 191500
2380
+ },
2381
+ {
2382
+ "epoch": 8.69,
2383
+ "learning_rate": 1.3917917995049269e-06,
2384
+ "loss": 0.101,
2385
+ "step": 192000
2386
+ },
2387
+ {
2388
+ "epoch": 8.71,
2389
+ "learning_rate": 1.3677123566069175e-06,
2390
+ "loss": 0.0962,
2391
+ "step": 192500
2392
+ },
2393
+ {
2394
+ "epoch": 8.74,
2395
+ "learning_rate": 1.3436329137089086e-06,
2396
+ "loss": 0.0986,
2397
+ "step": 193000
2398
+ },
2399
+ {
2400
+ "epoch": 8.76,
2401
+ "learning_rate": 1.3195534708108996e-06,
2402
+ "loss": 0.0963,
2403
+ "step": 193500
2404
+ },
2405
+ {
2406
+ "epoch": 8.78,
2407
+ "learning_rate": 1.2954740279128902e-06,
2408
+ "loss": 0.1238,
2409
+ "step": 194000
2410
+ },
2411
+ {
2412
+ "epoch": 8.8,
2413
+ "learning_rate": 1.2713945850148813e-06,
2414
+ "loss": 0.1001,
2415
+ "step": 194500
2416
+ },
2417
+ {
2418
+ "epoch": 8.83,
2419
+ "learning_rate": 1.247315142116872e-06,
2420
+ "loss": 0.0972,
2421
+ "step": 195000
2422
+ },
2423
+ {
2424
+ "epoch": 8.85,
2425
+ "learning_rate": 1.223235699218863e-06,
2426
+ "loss": 0.091,
2427
+ "step": 195500
2428
+ },
2429
+ {
2430
+ "epoch": 8.87,
2431
+ "learning_rate": 1.199156256320854e-06,
2432
+ "loss": 0.1058,
2433
+ "step": 196000
2434
+ },
2435
+ {
2436
+ "epoch": 8.9,
2437
+ "learning_rate": 1.1750768134228448e-06,
2438
+ "loss": 0.0957,
2439
+ "step": 196500
2440
+ },
2441
+ {
2442
+ "epoch": 8.92,
2443
+ "learning_rate": 1.1509973705248357e-06,
2444
+ "loss": 0.0979,
2445
+ "step": 197000
2446
+ },
2447
+ {
2448
+ "epoch": 8.94,
2449
+ "learning_rate": 1.1269179276268265e-06,
2450
+ "loss": 0.1084,
2451
+ "step": 197500
2452
+ },
2453
+ {
2454
+ "epoch": 8.96,
2455
+ "learning_rate": 1.1028384847288175e-06,
2456
+ "loss": 0.0963,
2457
+ "step": 198000
2458
+ },
2459
+ {
2460
+ "epoch": 8.99,
2461
+ "learning_rate": 1.0787590418308084e-06,
2462
+ "loss": 0.0908,
2463
+ "step": 198500
2464
+ },
2465
+ {
2466
+ "epoch": 9.0,
2467
+ "eval_accuracy": 0.8667973822922768,
2468
+       "eval_loss": 0.9133633375167847,
+       "eval_runtime": 48.4125,
+       "eval_samples_per_second": 811.174,
+       "eval_steps_per_second": 50.71,
+       "step": 198810
+     },
+     {
+       "epoch": 9.01,
+       "learning_rate": 1.0546795989327992e-06,
+       "loss": 0.084,
+       "step": 199000
+     },
+     {
+       "epoch": 9.03,
+       "learning_rate": 1.03060015603479e-06,
+       "loss": 0.0891,
+       "step": 199500
+     },
+     {
+       "epoch": 9.05,
+       "learning_rate": 1.006520713136781e-06,
+       "loss": 0.1038,
+       "step": 200000
+     },
+     {
+       "epoch": 9.08,
+       "learning_rate": 9.82441270238772e-07,
+       "loss": 0.0778,
+       "step": 200500
+     },
+     {
+       "epoch": 9.1,
+       "learning_rate": 9.583618273407628e-07,
+       "loss": 0.0917,
+       "step": 201000
+     },
+     {
+       "epoch": 9.12,
+       "learning_rate": 9.342823844427536e-07,
+       "loss": 0.0924,
+       "step": 201500
+     },
+     {
+       "epoch": 9.14,
+       "learning_rate": 9.102029415447445e-07,
+       "loss": 0.077,
+       "step": 202000
+     },
+     {
+       "epoch": 9.17,
+       "learning_rate": 8.861234986467354e-07,
+       "loss": 0.0851,
+       "step": 202500
+     },
+     {
+       "epoch": 9.19,
+       "learning_rate": 8.620440557487263e-07,
+       "loss": 0.0897,
+       "step": 203000
+     },
+     {
+       "epoch": 9.21,
+       "learning_rate": 8.379646128507171e-07,
+       "loss": 0.0844,
+       "step": 203500
+     },
+     {
+       "epoch": 9.23,
+       "learning_rate": 8.138851699527081e-07,
+       "loss": 0.0845,
+       "step": 204000
+     },
+     {
+       "epoch": 9.26,
+       "learning_rate": 7.898057270546989e-07,
+       "loss": 0.0918,
+       "step": 204500
+     },
+     {
+       "epoch": 9.28,
+       "learning_rate": 7.657262841566899e-07,
+       "loss": 0.0792,
+       "step": 205000
+     },
+     {
+       "epoch": 9.3,
+       "learning_rate": 7.416468412586807e-07,
+       "loss": 0.0737,
+       "step": 205500
+     },
+     {
+       "epoch": 9.33,
+       "learning_rate": 7.175673983606715e-07,
+       "loss": 0.0806,
+       "step": 206000
+     },
+     {
+       "epoch": 9.35,
+       "learning_rate": 6.934879554626625e-07,
+       "loss": 0.0843,
+       "step": 206500
+     },
+     {
+       "epoch": 9.37,
+       "learning_rate": 6.694085125646534e-07,
+       "loss": 0.0823,
+       "step": 207000
+     },
+     {
+       "epoch": 9.39,
+       "learning_rate": 6.453290696666442e-07,
+       "loss": 0.0851,
+       "step": 207500
+     },
+     {
+       "epoch": 9.42,
+       "learning_rate": 6.212496267686352e-07,
+       "loss": 0.073,
+       "step": 208000
+     },
+     {
+       "epoch": 9.44,
+       "learning_rate": 5.97170183870626e-07,
+       "loss": 0.0872,
+       "step": 208500
+     },
+     {
+       "epoch": 9.46,
+       "learning_rate": 5.73090740972617e-07,
+       "loss": 0.0905,
+       "step": 209000
+     },
+     {
+       "epoch": 9.48,
+       "learning_rate": 5.490112980746078e-07,
+       "loss": 0.0846,
+       "step": 209500
+     },
+     {
+       "epoch": 9.51,
+       "learning_rate": 5.249318551765987e-07,
+       "loss": 0.0733,
+       "step": 210000
+     },
+     {
+       "epoch": 9.53,
+       "learning_rate": 5.008524122785896e-07,
+       "loss": 0.0926,
+       "step": 210500
+     },
+     {
+       "epoch": 9.55,
+       "learning_rate": 4.7677296938058045e-07,
+       "loss": 0.0767,
+       "step": 211000
+     },
+     {
+       "epoch": 9.57,
+       "learning_rate": 4.526935264825713e-07,
+       "loss": 0.0852,
+       "step": 211500
+     },
+     {
+       "epoch": 9.6,
+       "learning_rate": 4.286140835845622e-07,
+       "loss": 0.0815,
+       "step": 212000
+     },
+     {
+       "epoch": 9.62,
+       "learning_rate": 4.0453464068655306e-07,
+       "loss": 0.0965,
+       "step": 212500
+     },
+     {
+       "epoch": 9.64,
+       "learning_rate": 3.80455197788544e-07,
+       "loss": 0.0858,
+       "step": 213000
+     },
+     {
+       "epoch": 9.67,
+       "learning_rate": 3.5637575489053483e-07,
+       "loss": 0.0845,
+       "step": 213500
+     },
+     {
+       "epoch": 9.69,
+       "learning_rate": 3.322963119925258e-07,
+       "loss": 0.0901,
+       "step": 214000
+     },
+     {
+       "epoch": 9.71,
+       "learning_rate": 3.0821686909451666e-07,
+       "loss": 0.0871,
+       "step": 214500
+     },
+     {
+       "epoch": 9.73,
+       "learning_rate": 2.8413742619650755e-07,
+       "loss": 0.0868,
+       "step": 215000
+     },
+     {
+       "epoch": 9.76,
+       "learning_rate": 2.6005798329849844e-07,
+       "loss": 0.0817,
+       "step": 215500
+     },
+     {
+       "epoch": 9.78,
+       "learning_rate": 2.3597854040048932e-07,
+       "loss": 0.0793,
+       "step": 216000
+     },
+     {
+       "epoch": 9.8,
+       "learning_rate": 2.118990975024802e-07,
+       "loss": 0.0981,
+       "step": 216500
+     },
+     {
+       "epoch": 9.82,
+       "learning_rate": 1.878196546044711e-07,
+       "loss": 0.0746,
+       "step": 217000
+     },
+     {
+       "epoch": 9.85,
+       "learning_rate": 1.6374021170646199e-07,
+       "loss": 0.0812,
+       "step": 217500
+     },
+     {
+       "epoch": 9.87,
+       "learning_rate": 1.3966076880845285e-07,
+       "loss": 0.0848,
+       "step": 218000
+     },
+     {
+       "epoch": 9.89,
+       "learning_rate": 1.1558132591044375e-07,
+       "loss": 0.0777,
+       "step": 218500
+     },
+     {
+       "epoch": 9.91,
+       "learning_rate": 9.150188301243464e-08,
+       "loss": 0.0876,
+       "step": 219000
+     },
+     {
+       "epoch": 9.94,
+       "learning_rate": 6.742244011442552e-08,
+       "loss": 0.0948,
+       "step": 219500
+     },
+     {
+       "epoch": 9.96,
+       "learning_rate": 4.3342997216416404e-08,
+       "loss": 0.0859,
+       "step": 220000
+     },
+     {
+       "epoch": 9.98,
+       "learning_rate": 1.926355431840729e-08,
+       "loss": 0.0832,
+       "step": 220500
+     },
+     {
+       "epoch": 10.0,
+       "eval_accuracy": 0.867408520282142,
+       "eval_loss": 0.9269108176231384,
+       "eval_runtime": 48.3924,
+       "eval_samples_per_second": 811.511,
+       "eval_steps_per_second": 50.731,
+       "step": 220900
+     },
+     {
+       "epoch": 10.0,
+       "step": 220900,
+       "total_flos": 1.4385611002158144e+17,
+       "train_loss": 0.2250993330244459,
+       "train_runtime": 14146.4436,
+       "train_samples_per_second": 249.837,
+       "train_steps_per_second": 15.615
+     }
+   ],
+   "logging_steps": 500,
+   "max_steps": 220900,
+   "num_train_epochs": 10,
+   "save_steps": 500,
+   "total_flos": 1.4385611002158144e+17,
+   "trial_name": null,
+   "trial_params": null
+ }
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:239b779aeb51d9b42758e26ba4d8bd72821222f0005a38f5c8122c9300ce9470
+ size 4091
vocab.json ADDED
The diff for this file is too large to render. See raw diff
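The `trainer_state.json` added in this commit carries the full `log_history` shown above (per-step training loss, learning rate, and per-epoch eval metrics). A minimal sketch of reading it back and summarizing the run; the function name `summarize_log_history` and the default path are illustrative, not part of the upload:

```python
import json

def summarize_log_history(path="trainer_state.json"):
    """Summarize a Hugging Face Trainer state file.

    Entries with a "loss" key are training logs; entries with
    "eval_loss" are evaluation logs, matching the structure above.
    """
    with open(path) as f:
        state = json.load(f)
    train_logs = [e for e in state["log_history"] if "loss" in e]
    eval_logs = [e for e in state["log_history"] if "eval_loss" in e]
    return {
        "last_train_loss": train_logs[-1]["loss"] if train_logs else None,
        "best_eval_loss": min(e["eval_loss"] for e in eval_logs) if eval_logs else None,
        "max_steps": state.get("max_steps"),
    }
```

For this run, such a summary would reflect the values logged above: 220900 max steps over 10 epochs, with the final eval accuracy around 0.867.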