craa committed (verified)
Commit c55c190 · 1 Parent(s): f7afdec

Training in progress, step 10000

README.md ADDED
@@ -0,0 +1,152 @@
---
library_name: transformers
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: 100M_high_2000_1208
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# 100M_high_2000_1208

This model is a fine-tuned version of [](https://huggingface.co/) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 3.3070
- Accuracy: 0.3938
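
For a quick smoke test, the checkpoint loads through the standard `transformers` auto classes. A minimal sketch, assuming the repository id is `craa/100M_high_2000_1208` (inferred from the commit author and model name; adjust if the repo lives elsewhere):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "craa/100M_high_2000_1208"  # assumed repo id, not stated in the card

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)  # GPT2LMHeadModel per config.json
```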

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows this list):
- learning_rate: 0.0006
- train_batch_size: 32
- eval_batch_size: 16
- seed: 1208
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.98), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 10
- mixed_precision_training: Native AMP

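A minimal sketch of how these settings map onto `transformers.TrainingArguments`; the `output_dir` is assumed, and the model/dataset wiring is omitted because it is not documented in this card:

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters listed above.
args = TrainingArguments(
    output_dir="100M_high_2000_1208",  # assumed; not stated in the card
    learning_rate=6e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    seed=1208,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.98,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=10,
    fp16=True,  # "Native AMP" mixed precision
)
```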

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:-----:|:---------------:|:--------:|
| 5.081 | 0.1076 | 1000 | 5.0342 | 0.2257 |
| 4.5993 | 0.2153 | 2000 | 4.5159 | 0.2697 |
| 4.2969 | 0.3229 | 3000 | 4.2418 | 0.2986 |
| 4.1676 | 0.4305 | 4000 | 4.0957 | 0.3121 |
| 4.0642 | 0.5382 | 5000 | 4.0004 | 0.3206 |
| 4.0038 | 0.6458 | 6000 | 3.9248 | 0.3278 |
| 3.9256 | 0.7534 | 7000 | 3.8684 | 0.3326 |
| 3.8889 | 0.8610 | 8000 | 3.8204 | 0.3380 |
| 3.8508 | 0.9687 | 9000 | 3.7829 | 0.3410 |
| 3.7758 | 1.0763 | 10000 | 3.7507 | 0.3437 |
| 3.7585 | 1.1839 | 11000 | 3.7283 | 0.3462 |
| 3.7348 | 1.2916 | 12000 | 3.6997 | 0.3491 |
| 3.7217 | 1.3992 | 13000 | 3.6811 | 0.3510 |
| 3.7135 | 1.5068 | 14000 | 3.6597 | 0.3530 |
| 3.6762 | 1.6145 | 15000 | 3.6410 | 0.3550 |
| 3.6543 | 1.7221 | 16000 | 3.6245 | 0.3562 |
| 3.6543 | 1.8297 | 17000 | 3.6093 | 0.3583 |
| 3.6548 | 1.9374 | 18000 | 3.5946 | 0.3598 |
| 3.5527 | 2.0450 | 19000 | 3.5837 | 0.3612 |
| 3.57 | 2.1526 | 20000 | 3.5745 | 0.3621 |
| 3.5501 | 2.2603 | 21000 | 3.5634 | 0.3636 |
| 3.5595 | 2.3679 | 22000 | 3.5533 | 0.3647 |
| 3.5527 | 2.4755 | 23000 | 3.5411 | 0.3653 |
| 3.5441 | 2.5831 | 24000 | 3.5310 | 0.3671 |
| 3.5373 | 2.6908 | 25000 | 3.5247 | 0.3670 |
| 3.5324 | 2.7984 | 26000 | 3.5163 | 0.3685 |
| 3.5449 | 2.9060 | 27000 | 3.5066 | 0.3695 |
| 3.4313 | 3.0137 | 28000 | 3.5014 | 0.3701 |
| 3.4477 | 3.1213 | 29000 | 3.5001 | 0.3706 |
| 3.4502 | 3.2289 | 30000 | 3.4922 | 0.3713 |
| 3.4542 | 3.3366 | 31000 | 3.4868 | 0.3718 |
| 3.4608 | 3.4442 | 32000 | 3.4783 | 0.3729 |
| 3.4684 | 3.5518 | 33000 | 3.4740 | 0.3731 |
| 3.4391 | 3.6595 | 34000 | 3.4672 | 0.3741 |
| 3.4523 | 3.7671 | 35000 | 3.4613 | 0.3746 |
| 3.4294 | 3.8747 | 36000 | 3.4532 | 0.3755 |
| 3.4562 | 3.9823 | 37000 | 3.4500 | 0.3757 |
| 3.3673 | 4.0900 | 38000 | 3.4507 | 0.3761 |
| 3.3763 | 4.1976 | 39000 | 3.4469 | 0.3767 |
| 3.4159 | 4.3052 | 40000 | 3.4418 | 0.3772 |
| 3.3961 | 4.4129 | 41000 | 3.4349 | 0.3776 |
| 3.3853 | 4.5205 | 42000 | 3.4292 | 0.3785 |
| 3.3831 | 4.6281 | 43000 | 3.4263 | 0.3788 |
| 3.4007 | 4.7358 | 44000 | 3.4229 | 0.3791 |
| 3.3982 | 4.8434 | 45000 | 3.4168 | 0.3797 |
| 3.4117 | 4.9510 | 46000 | 3.4113 | 0.3800 |
| 3.2918 | 5.0587 | 47000 | 3.4153 | 0.3807 |
| 3.3228 | 5.1663 | 48000 | 3.4128 | 0.3808 |
| 3.3358 | 5.2739 | 49000 | 3.4116 | 0.3811 |
| 3.3283 | 5.3816 | 50000 | 3.4034 | 0.3818 |
| 3.3431 | 5.4892 | 51000 | 3.4026 | 0.3819 |
| 3.3383 | 5.5968 | 52000 | 3.3939 | 0.3826 |
| 3.3608 | 5.7044 | 53000 | 3.3909 | 0.3833 |
| 3.3379 | 5.8121 | 54000 | 3.3868 | 0.3834 |
| 3.335 | 5.9197 | 55000 | 3.3835 | 0.3836 |
| 3.2537 | 6.0273 | 56000 | 3.3842 | 0.3841 |
| 3.2644 | 6.1350 | 57000 | 3.3861 | 0.3839 |
| 3.2737 | 6.2426 | 58000 | 3.3820 | 0.3845 |
| 3.2764 | 6.3502 | 59000 | 3.3795 | 0.3847 |
| 3.2804 | 6.4579 | 60000 | 3.3746 | 0.3853 |
| 3.2691 | 6.5655 | 61000 | 3.3686 | 0.3855 |
| 3.2979 | 6.6731 | 62000 | 3.3662 | 0.3861 |
| 3.2994 | 6.7808 | 63000 | 3.3624 | 0.3864 |
| 3.2885 | 6.8884 | 64000 | 3.3589 | 0.3868 |
| 3.2941 | 6.9960 | 65000 | 3.3537 | 0.3873 |
| 3.2105 | 7.1036 | 66000 | 3.3634 | 0.3869 |
| 3.2258 | 7.2113 | 67000 | 3.3590 | 0.3872 |
| 3.2207 | 7.3189 | 68000 | 3.3566 | 0.3877 |
| 3.229 | 7.4265 | 69000 | 3.3524 | 0.3879 |
| 3.2185 | 7.5342 | 70000 | 3.3491 | 0.3884 |
| 3.2483 | 7.6418 | 71000 | 3.3465 | 0.3885 |
| 3.2444 | 7.7494 | 72000 | 3.3405 | 0.3893 |
| 3.2322 | 7.8571 | 73000 | 3.3408 | 0.3894 |
| 3.2364 | 7.9647 | 74000 | 3.3331 | 0.3899 |
| 3.1419 | 8.0723 | 75000 | 3.3419 | 0.3896 |
| 3.1676 | 8.1800 | 76000 | 3.3360 | 0.3904 |
| 3.1775 | 8.2876 | 77000 | 3.3360 | 0.3903 |
| 3.1892 | 8.3952 | 78000 | 3.3310 | 0.3907 |
| 3.1954 | 8.5029 | 79000 | 3.3294 | 0.3910 |
| 3.2003 | 8.6105 | 80000 | 3.3247 | 0.3915 |
| 3.1731 | 8.7181 | 81000 | 3.3210 | 0.3917 |
| 3.1837 | 8.8257 | 82000 | 3.3180 | 0.3921 |
| 3.1898 | 8.9334 | 83000 | 3.3148 | 0.3925 |
| 3.1178 | 9.0410 | 84000 | 3.3185 | 0.3924 |
| 3.138 | 9.1486 | 85000 | 3.3186 | 0.3926 |
| 3.1277 | 9.2563 | 86000 | 3.3160 | 0.3928 |
| 3.1482 | 9.3639 | 87000 | 3.3136 | 0.3931 |
| 3.1287 | 9.4715 | 88000 | 3.3120 | 0.3934 |
| 3.1232 | 9.5792 | 89000 | 3.3092 | 0.3936 |
| 3.124 | 9.6868 | 90000 | 3.3070 | 0.3938 |
| 3.1261 | 9.7944 | 91000 | 3.3048 | 0.3941 |
| 3.1251 | 9.9021 | 92000 | 3.3040 | 0.3943 |
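
A quick consistency check on the table's epoch/step bookkeeping (my arithmetic, not part of the generated card): with 297,300 training samples and a train batch size of 32, one epoch is 9,291 optimizer steps, so 10 epochs are 92,910 steps, and the final eval row at step 92,000 lands at epoch ≈ 9.9021 as shown.

```python
import math

samples, batch, epochs = 297_300, 32, 10     # from train_results.json and the card
steps_per_epoch = math.ceil(samples / batch)
print(steps_per_epoch)                       # 9291
print(steps_per_epoch * epochs)              # 92910 total steps
print(92_000 / steps_per_epoch)              # ~9.9021, matching the final row
```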

### Framework versions

- Transformers 4.47.0.dev0
- Pytorch 2.5.0+cu124
- Datasets 3.0.2
- Tokenizers 0.20.1
all_results.json ADDED
@@ -0,0 +1,16 @@
{
    "epoch": 10.0,
    "eval_accuracy": 0.39382314851947947,
    "eval_loss": 3.307018518447876,
    "eval_runtime": 185.3957,
    "eval_samples": 18011,
    "eval_samples_per_second": 97.149,
    "eval_steps_per_second": 6.073,
    "perplexity": 27.30359882100141,
    "total_flos": 7.76821211136e+17,
    "train_loss": 3.458438355611428,
    "train_runtime": 80109.3033,
    "train_samples": 297300,
    "train_samples_per_second": 37.112,
    "train_steps_per_second": 1.16
}
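
The `perplexity` field is simply the exponential of the evaluation cross-entropy loss, as is standard for causal language models:

```python
import math

eval_loss = 3.307018518447876
print(math.exp(eval_loss))  # 27.30359882100141, the reported perplexity
```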
config.json ADDED
@@ -0,0 +1,31 @@
{
  "activation_function": "gelu_new",
  "architectures": [
    "GPT2LMHeadModel"
  ],
  "attn_pdrop": 0.1,
  "bos_token_id": 50256,
  "embd_pdrop": 0.1,
  "eos_token_id": 50256,
  "initializer_range": 0.02,
  "layer_norm_epsilon": 1e-05,
  "model_type": "gpt2",
  "n_embd": 768,
  "n_head": 12,
  "n_inner": null,
  "n_layer": 12,
  "n_positions": 1024,
  "reorder_and_upcast_attn": false,
  "resid_pdrop": 0.1,
  "scale_attn_by_inverse_layer_idx": false,
  "scale_attn_weights": true,
  "summary_activation": null,
  "summary_first_dropout": 0.1,
  "summary_proj_to_labels": true,
  "summary_type": "cls_index",
  "summary_use_proj": true,
  "torch_dtype": "float32",
  "transformers_version": "4.47.0.dev0",
  "use_cache": true,
  "vocab_size": 52000
}
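
These are GPT-2 base dimensions (12 layers, 12 heads, 768-dim embeddings) with the vocabulary enlarged to 52,000 tokens. A back-of-the-envelope parameter count from this config, assuming the standard `GPT2LMHeadModel` layout with a tied `lm_head`:

```python
n_embd, n_layer, n_positions, vocab = 768, 12, 1024, 52_000
n_inner = 4 * n_embd  # config has "n_inner": null, which defaults to 4 * n_embd

embeddings = vocab * n_embd + n_positions * n_embd                      # wte + wpe
attn = (n_embd * 3 * n_embd + 3 * n_embd) + (n_embd * n_embd + n_embd)  # c_attn + c_proj
mlp = (n_embd * n_inner + n_inner) + (n_inner * n_embd + n_embd)        # c_fc + c_proj
norms = 4 * n_embd                                                      # ln_1 + ln_2
total = embeddings + n_layer * (attn + mlp + norms) + 2 * n_embd        # + final ln_f

print(f"{total:,}")  # 125,778,432 parameters
```

At float32 that is roughly 503 MB of weights, which lines up with the 503,128,704-byte `model.safetensors` below (the small remainder is the safetensors header).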
eval_results.json ADDED
@@ -0,0 +1,10 @@
{
    "epoch": 10.0,
    "eval_accuracy": 0.39382314851947947,
    "eval_loss": 3.307018518447876,
    "eval_runtime": 185.3957,
    "eval_samples": 18011,
    "eval_samples_per_second": 97.149,
    "eval_steps_per_second": 6.073,
    "perplexity": 27.30359882100141
}
generation_config.json ADDED
@@ -0,0 +1,6 @@
{
  "_from_model_config": true,
  "bos_token_id": 50256,
  "eos_token_id": 50256,
  "transformers_version": "4.47.0.dev0"
}
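
This default generation config only pins the BOS/EOS token ids, so decoding behaviour must be supplied at call time. A minimal sketch (sampling settings are arbitrary), reusing the hypothetical `model` and `tokenizer` from the loading example above:

```python
inputs = tokenizer("Once upon a time", return_tensors="pt")
output = model.generate(
    **inputs,
    max_new_tokens=50,
    do_sample=True,
    top_p=0.95,
    eos_token_id=50256,  # from generation_config.json
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```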
model.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:1099585a27bc2664976699ef07430b471f8adb299f2627b2b6f01fc1fc76f4bf
size 503128704
special_tokens_map.json ADDED
@@ -0,0 +1 @@
{}
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
The diff for this file is too large to render. See raw diff
 
train_results.json ADDED
@@ -0,0 +1,9 @@
{
    "epoch": 10.0,
    "total_flos": 7.76821211136e+17,
    "train_loss": 3.458438355611428,
    "train_runtime": 80109.3033,
    "train_samples": 297300,
    "train_samples_per_second": 37.112,
    "train_steps_per_second": 1.16
}
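
The throughput figures are internally consistent with the runtime (my arithmetic): 297,300 samples × 10 epochs over 80,109.3 s of training gives the reported 37.112 samples per second, and 92,910 optimizer steps over the same runtime gives about 1.16 steps per second:

```python
runtime = 80_109.3033          # train_runtime in seconds (~22.25 hours)
print(297_300 * 10 / runtime)  # ~37.112 samples/s
print(92_910 / runtime)        # ~1.16 steps/s
```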
trainer_state.json ADDED
The diff for this file is too large to render. See raw diff
 
training_args.bin ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:4c14097bf0485436abbb8ea58a3fa7cbfcf531587b05a96565978beaa601e6c0
size 5304