mtzig committed · Commit dead6ad · verified · 1 Parent(s): 0da7703

Model save

Files changed (5)
  1. README.md +216 -0
  2. config.json +19 -0
  3. generation_config.json +4 -0
  4. model.safetensors +3 -0
  5. training_args.bin +3 -0
README.md ADDED
@@ -0,0 +1,216 @@
+ ---
+ library_name: transformers
+ tags:
+ - generated_from_trainer
+ metrics:
+ - accuracy
+ model-index:
+ - name: reverseadd_grad_lr5e-4_batch128_train1-16_eval20
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # reverseadd_grad_lr5e-4_batch128_train1-16_eval20
+
+ This model is a fine-tuned version of [](https://huggingface.co/) on an unknown dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.3078
+ - Accuracy: 0.4417
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 0.0005
+ - train_batch_size: 128
+ - eval_batch_size: 512
+ - seed: 23452399
+ - optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
+ - lr_scheduler_type: cosine
+ - lr_scheduler_warmup_ratio: 0.1
+ - num_epochs: 1
+
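For orientation, here is a minimal sketch of how the hyperparameters above would typically map onto `transformers` `TrainingArguments`. The `output_dir` and the 100-step evaluation/logging cadence are illustrative assumptions rather than values recovered from `training_args.bin`, which is where the actual arguments are serialized in this commit.

```python
# Hypothetical reconstruction of the training setup from the hyperparameters
# listed above; output_dir and the 100-step eval/logging cadence are assumed.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="reverseadd_grad_lr5e-4_batch128_train1-16_eval20",  # assumed
    learning_rate=5e-4,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=512,
    seed=23452399,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    num_train_epochs=1,
    eval_strategy="steps",  # the results table below evaluates every 100 steps
    eval_steps=100,
    logging_steps=100,
)
```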
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Accuracy |
+ |:-------------:|:------:|:-----:|:---------------:|:--------:|
+ | No log | 0 | 0 | 2.6848 | 0.0 |
+ | 2.3202 | 0.0064 | 100 | 2.3492 | 0.0 |
+ | 2.1846 | 0.0128 | 200 | 2.2728 | 0.0 |
+ | 2.0767 | 0.0192 | 300 | 2.2164 | 0.0 |
+ | 2.0467 | 0.0256 | 400 | 2.2578 | 0.0 |
+ | 1.9785 | 0.032 | 500 | 2.1610 | 0.0 |
+ | 1.8153 | 0.0384 | 600 | 2.0537 | 0.0 |
+ | 1.5812 | 0.0448 | 700 | 1.9282 | 0.0 |
+ | 1.5535 | 0.0512 | 800 | 1.9193 | 0.0 |
+ | 1.4959 | 0.0576 | 900 | 1.7550 | 0.0 |
+ | 1.443 | 0.064 | 1000 | 2.0522 | 0.0 |
+ | 1.743 | 0.0704 | 1100 | 1.7900 | 0.0 |
+ | 1.5763 | 0.0768 | 1200 | 2.0261 | 0.0 |
+ | 1.2627 | 0.0832 | 1300 | 1.5576 | 0.0 |
+ | 1.3884 | 0.0896 | 1400 | 1.4530 | 0.0 |
+ | 1.2309 | 0.096 | 1500 | 1.4167 | 0.0 |
+ | 1.2115 | 0.1024 | 1600 | 1.4832 | 0.0 |
+ | 1.2817 | 0.1088 | 1700 | 1.3474 | 0.0 |
+ | 1.2665 | 0.1152 | 1800 | 1.4130 | 0.0 |
+ | 1.1102 | 0.1216 | 1900 | 1.4014 | 0.0 |
+ | 1.0674 | 0.128 | 2000 | 1.4528 | 0.0 |
+ | 1.157 | 0.1344 | 2100 | 1.4099 | 0.0 |
+ | 1.0901 | 0.1408 | 2200 | 1.6544 | 0.0 |
+ | 1.0968 | 0.1472 | 2300 | 1.3503 | 0.0 |
+ | 1.1047 | 0.1536 | 2400 | 1.2690 | 0.0004 |
+ | 1.0916 | 0.16 | 2500 | 1.2901 | 0.0 |
+ | 1.0961 | 0.1664 | 2600 | 1.4174 | 0.0 |
+ | 1.2723 | 0.1728 | 2700 | 1.5270 | 0.0 |
+ | 1.037 | 0.1792 | 2800 | 1.2813 | 0.0001 |
+ | 1.1286 | 0.1856 | 2900 | 1.4213 | 0.0 |
+ | 1.2635 | 0.192 | 3000 | 1.6571 | 0.0 |
+ | 1.0575 | 0.1984 | 3100 | 1.3091 | 0.0 |
+ | 1.0101 | 0.2048 | 3200 | 1.2913 | 0.0002 |
+ | 1.3288 | 0.2112 | 3300 | 1.4699 | 0.0 |
+ | 0.8451 | 0.2176 | 3400 | 1.1301 | 0.0025 |
+ | 0.8022 | 0.224 | 3500 | 1.3349 | 0.0 |
+ | 0.6218 | 0.2304 | 3600 | 0.8613 | 0.0021 |
+ | 0.6183 | 0.2368 | 3700 | 1.2587 | 0.0016 |
+ | 0.4205 | 0.2432 | 3800 | 1.4093 | 0.002 |
+ | 0.3145 | 0.2496 | 3900 | 1.1882 | 0.0004 |
+ | 0.2012 | 0.256 | 4000 | 0.9237 | 0.0023 |
+ | 0.139 | 0.2624 | 4100 | 1.0225 | 0.0285 |
+ | 0.1649 | 0.2688 | 4200 | 0.5374 | 0.0808 |
+ | 0.3063 | 0.2752 | 4300 | 1.4660 | 0.0002 |
+ | 0.1848 | 0.2816 | 4400 | 0.9580 | 0.0072 |
+ | 0.3014 | 0.288 | 4500 | 1.5051 | 0.0015 |
+ | 0.2235 | 0.2944 | 4600 | 1.1891 | 0.0019 |
+ | 0.1976 | 0.3008 | 4700 | 0.7024 | 0.0392 |
+ | 0.0846 | 0.3072 | 4800 | 0.4221 | 0.1536 |
+ | 0.0846 | 0.3136 | 4900 | 0.6276 | 0.0739 |
+ | 0.136 | 0.32 | 5000 | 0.5679 | 0.1152 |
+ | 0.4901 | 0.3264 | 5100 | 1.1334 | 0.0057 |
+ | 0.1441 | 0.3328 | 5200 | 0.7550 | 0.0424 |
+ | 0.1863 | 0.3392 | 5300 | 0.5842 | 0.1928 |
+ | 0.145 | 0.3456 | 5400 | 0.6484 | 0.1382 |
+ | 0.081 | 0.352 | 5500 | 0.7676 | 0.1447 |
+ | 0.2879 | 0.3584 | 5600 | 1.2772 | 0.0018 |
+ | 0.1987 | 0.3648 | 5700 | 1.2486 | 0.0179 |
+ | 0.2972 | 0.3712 | 5800 | 1.0603 | 0.0024 |
+ | 0.0939 | 0.3776 | 5900 | 0.9522 | 0.0007 |
+ | 0.1428 | 0.384 | 6000 | 0.8437 | 0.0858 |
+ | 0.1049 | 0.3904 | 6100 | 0.7130 | 0.0739 |
+ | 0.1106 | 0.3968 | 6200 | 0.8737 | 0.0048 |
+ | 0.0611 | 0.4032 | 6300 | 0.8740 | 0.0134 |
+ | 0.1152 | 0.4096 | 6400 | 1.2177 | 0.0029 |
+ | 0.0402 | 0.416 | 6500 | 0.5074 | 0.1471 |
+ | 0.0873 | 0.4224 | 6600 | 0.8076 | 0.1639 |
+ | 0.1039 | 0.4288 | 6700 | 1.5710 | 0.0042 |
+ | 0.0787 | 0.4352 | 6800 | 0.5226 | 0.1542 |
+ | 0.1065 | 0.4416 | 6900 | 0.7633 | 0.0542 |
+ | 0.0838 | 0.448 | 7000 | 0.9892 | 0.0525 |
+ | 0.0473 | 0.4544 | 7100 | 0.9520 | 0.0231 |
+ | 0.0582 | 0.4608 | 7200 | 0.3465 | 0.313 |
+ | 0.025 | 0.4672 | 7300 | 0.2506 | 0.4354 |
+ | 0.0211 | 0.4736 | 7400 | 0.3364 | 0.3074 |
+ | 0.0996 | 0.48 | 7500 | 0.3570 | 0.2133 |
+ | 0.0407 | 0.4864 | 7600 | 0.7728 | 0.1108 |
+ | 0.014 | 0.4928 | 7700 | 0.1289 | 0.544 |
+ | 0.0497 | 0.4992 | 7800 | 0.5174 | 0.0975 |
+ | 0.0424 | 0.5056 | 7900 | 0.5479 | 0.0945 |
+ | 0.0039 | 0.512 | 8000 | 0.5395 | 0.1422 |
+ | 0.0195 | 0.5184 | 8100 | 0.4601 | 0.2033 |
+ | 0.0157 | 0.5248 | 8200 | 0.8103 | 0.0714 |
+ | 0.0351 | 0.5312 | 8300 | 0.5075 | 0.253 |
+ | 0.0128 | 0.5376 | 8400 | 0.4663 | 0.2743 |
+ | 0.0256 | 0.544 | 8500 | 0.8656 | 0.3863 |
+ | 0.1718 | 0.5504 | 8600 | 0.6020 | 0.1902 |
+ | 0.026 | 0.5568 | 8700 | 0.3962 | 0.3037 |
+ | 0.1089 | 0.5632 | 8800 | 0.6295 | 0.1404 |
+ | 0.0324 | 0.5696 | 8900 | 0.1578 | 0.4956 |
+ | 0.0196 | 0.576 | 9000 | 0.3137 | 0.2351 |
+ | 0.123 | 0.5824 | 9100 | 1.5833 | 0.0069 |
+ | 0.0056 | 0.5888 | 9200 | 0.9272 | 0.2582 |
+ | 0.0047 | 0.5952 | 9300 | 0.8148 | 0.1233 |
+ | 0.0313 | 0.6016 | 9400 | 1.0792 | 0.1242 |
+ | 0.0237 | 0.608 | 9500 | 0.2779 | 0.4696 |
+ | 0.0004 | 0.6144 | 9600 | 0.1443 | 0.5383 |
+ | 0.003 | 0.6208 | 9700 | 0.4303 | 0.1525 |
+ | 0.032 | 0.6272 | 9800 | 0.9887 | 0.092 |
+ | 0.0351 | 0.6336 | 9900 | 0.0637 | 0.7268 |
+ | 0.0075 | 0.64 | 10000 | 1.3108 | 0.0794 |
+ | 0.0139 | 0.6464 | 10100 | 0.4740 | 0.2227 |
+ | 0.0143 | 0.6528 | 10200 | 0.2752 | 0.2844 |
+ | 0.001 | 0.6592 | 10300 | 0.2274 | 0.291 |
+ | 0.0156 | 0.6656 | 10400 | 1.2667 | 0.0493 |
+ | 0.0069 | 0.672 | 10500 | 0.3401 | 0.2609 |
+ | 0.0023 | 0.6784 | 10600 | 0.6972 | 0.2885 |
+ | 0.0014 | 0.6848 | 10700 | 0.5544 | 0.2411 |
+ | 0.0095 | 0.6912 | 10800 | 0.4980 | 0.0827 |
+ | 0.0046 | 0.6976 | 10900 | 0.6637 | 0.1629 |
+ | 0.008 | 0.704 | 11000 | 1.2150 | 0.0922 |
+ | 0.0368 | 0.7104 | 11100 | 0.3636 | 0.3067 |
+ | 0.0007 | 0.7168 | 11200 | 0.5153 | 0.2979 |
+ | 0.0003 | 0.7232 | 11300 | 0.7362 | 0.2725 |
+ | 0.0004 | 0.7296 | 11400 | 0.4342 | 0.4889 |
+ | 0.0006 | 0.736 | 11500 | 0.0555 | 0.7811 |
+ | 0.0005 | 0.7424 | 11600 | 0.4003 | 0.2908 |
+ | 0.0 | 0.7488 | 11700 | 0.5867 | 0.3251 |
+ | 0.0023 | 0.7552 | 11800 | 0.2589 | 0.5022 |
+ | 0.0001 | 0.7616 | 11900 | 0.3106 | 0.4596 |
+ | 0.0002 | 0.768 | 12000 | 0.2065 | 0.4589 |
+ | 0.0023 | 0.7744 | 12100 | 0.1422 | 0.6436 |
+ | 0.0014 | 0.7808 | 12200 | 0.0345 | 0.8808 |
+ | 0.0011 | 0.7872 | 12300 | 0.1220 | 0.6067 |
+ | 0.0007 | 0.7936 | 12400 | 0.1234 | 0.5929 |
+ | 0.0001 | 0.8 | 12500 | 0.0184 | 0.9259 |
+ | 0.0001 | 0.8064 | 12600 | 0.2383 | 0.3727 |
+ | 0.0002 | 0.8128 | 12700 | 0.6462 | 0.2219 |
+ | 0.0001 | 0.8192 | 12800 | 0.2457 | 0.4784 |
+ | 0.0001 | 0.8256 | 12900 | 0.2115 | 0.3723 |
+ | 0.0001 | 0.832 | 13000 | 0.0998 | 0.7704 |
+ | 0.0001 | 0.8384 | 13100 | 0.3560 | 0.2942 |
+ | 0.0005 | 0.8448 | 13200 | 0.1943 | 0.4046 |
+ | 0.0029 | 0.8512 | 13300 | 0.5677 | 0.2144 |
+ | 0.0001 | 0.8576 | 13400 | 1.0426 | 0.1691 |
+ | 0.0 | 0.864 | 13500 | 0.0897 | 0.6932 |
+ | 0.0 | 0.8704 | 13600 | 0.0242 | 0.865 |
+ | 0.0008 | 0.8768 | 13700 | 0.0395 | 0.8246 |
+ | 0.0015 | 0.8832 | 13800 | 0.0476 | 0.7952 |
+ | 0.0075 | 0.8896 | 13900 | 0.4049 | 0.4015 |
+ | 0.001 | 0.896 | 14000 | 0.4570 | 0.332 |
+ | 0.0001 | 0.9024 | 14100 | 0.1231 | 0.6003 |
+ | 0.0005 | 0.9088 | 14200 | 0.2371 | 0.5059 |
+ | 0.0 | 0.9152 | 14300 | 0.1634 | 0.5424 |
+ | 0.0018 | 0.9216 | 14400 | 0.2533 | 0.4458 |
+ | 0.0 | 0.928 | 14500 | 0.3422 | 0.4327 |
+ | 0.0 | 0.9344 | 14600 | 0.2430 | 0.4213 |
+ | 0.0001 | 0.9408 | 14700 | 0.3305 | 0.3746 |
+ | 0.0 | 0.9472 | 14800 | 0.3879 | 0.367 |
+ | 0.0 | 0.9536 | 14900 | 0.4769 | 0.3365 |
+ | 0.0 | 0.96 | 15000 | 0.4236 | 0.3831 |
+ | 0.0 | 0.9664 | 15100 | 0.3237 | 0.4232 |
+ | 0.0 | 0.9728 | 15200 | 0.2917 | 0.4483 |
+ | 0.0 | 0.9792 | 15300 | 0.3166 | 0.4387 |
+ | 0.0 | 0.9856 | 15400 | 0.3159 | 0.4399 |
+ | 0.0 | 0.992 | 15500 | 0.3080 | 0.4413 |
+ | 0.0 | 0.9984 | 15600 | 0.3078 | 0.4417 |
+
+
+ ### Framework versions
+
+ - Transformers 4.46.0
+ - Pytorch 2.5.1
+ - Datasets 3.1.0
+ - Tokenizers 0.20.1
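As a quick sanity check, an installed environment can be compared against the framework versions listed above before loading the checkpoint. This snippet is an illustrative convenience, not part of the repository.

```python
# Check that the installed packages match the framework versions listed above.
import datasets
import tokenizers
import torch
import transformers

for module, expected in [
    (transformers, "4.46.0"),
    (torch, "2.5.1"),
    (datasets, "3.1.0"),
    (tokenizers, "0.20.1"),
]:
    installed = module.__version__
    status = "OK" if installed.startswith(expected) else "MISMATCH"
    print(f"{module.__name__:<12} expected {expected:<8} installed {installed:<14} {status}")
```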
config.json ADDED
@@ -0,0 +1,19 @@
+ {
+   "architectures": [
+     "NanoGPT"
+   ],
+   "bias": true,
+   "block_size": 256,
+   "dropout": 0.0,
+   "mlp_dim": 4,
+   "model_type": "nanogpt",
+   "n_embd": 384,
+   "n_head": 6,
+   "n_layer": 6,
+   "nonlinearity": "RELU",
+   "torch_dtype": "float32",
+   "transformers_version": "4.46.0",
+   "use_NoPE": true,
+   "use_layernorm": true,
+   "vocab_size": 14
+ }
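The `model_type` here is `nanogpt`, a custom architecture rather than a stock `transformers` one, so loading it presumably requires the repository's (or a locally importable) NanoGPT implementation. Below is a minimal sketch under that assumption; the repo id `mtzig/reverseadd_grad_lr5e-4_batch128_train1-16_eval20` and the availability of remote code are assumptions, since this commit only adds the config and weights.

```python
# Hypothetical loading sketch; the repo id and trust_remote_code support are
# assumptions, as this commit itself ships no modeling code.
from transformers import AutoConfig, AutoModel

repo_id = "mtzig/reverseadd_grad_lr5e-4_batch128_train1-16_eval20"  # assumed

config = AutoConfig.from_pretrained(repo_id, trust_remote_code=True)
print(config.n_layer, config.n_head, config.n_embd, config.vocab_size)  # 6 6 384 14

model = AutoModel.from_pretrained(repo_id, trust_remote_code=True)
model.eval()
```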
generation_config.json ADDED
@@ -0,0 +1,4 @@
+ {
+   "_from_model_config": true,
+   "transformers_version": "4.46.0"
+ }
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1a8e2acc27a826ecc065bd34a42c81bcb5936433efbbdefec5be99eab70084e3
+ size 42640744
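This entry is a Git LFS pointer; the actual tensor data (42,640,744 bytes, which at 4 bytes per float32 weight suggests roughly 10.7M parameters) must be fetched separately. A minimal inspection sketch, assuming the file has been downloaded locally as `model.safetensors`:

```python
# Enumerate tensors in the checkpoint and count parameters.
# Assumes model.safetensors was fetched (e.g. via `git lfs pull`) into the cwd.
from safetensors import safe_open

total_params = 0
with safe_open("model.safetensors", framework="pt") as f:
    for name in f.keys():
        tensor = f.get_tensor(name)
        total_params += tensor.numel()
        print(f"{name}: {tuple(tensor.shape)}")
print(f"total parameters: {total_params:,}")
```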
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:9a90d0d0481886dd144e1c62602d3f721e72710677b594f1413fda00b7b9c13d
+ size 5304