jszot committed
Commit 74e5f47 · verified · 1 parent: db9c5c2

End of training

Files changed (1)
README.md (+74 −53)
README.md CHANGED
@@ -15,7 +15,7 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [jszot/calculator_model_test](https://huggingface.co/jszot/calculator_model_test) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.6156
+- Loss: nan
 
 ## Model description
 
@@ -34,68 +34,89 @@ More information needed
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
-- learning_rate: 0.0001
+- learning_rate: 0.03
 - train_batch_size: 512
 - eval_batch_size: 512
 - seed: 42
 - optimizer: Use OptimizerNames.ADAMW_TORCH_FUSED with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
 - lr_scheduler_type: linear
-- num_epochs: 50
+- num_epochs: 70
+- mixed_precision_training: Native AMP
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss |
 |:-------------:|:-----:|:----:|:---------------:|
-| 0.8978 | 1.0 | 6 | 0.7935 |
-| 0.8605 | 2.0 | 12 | 0.8017 |
-| 0.8521 | 3.0 | 18 | 0.7763 |
-| 0.8392 | 4.0 | 24 | 0.7644 |
-| 0.8363 | 5.0 | 30 | 0.7725 |
-| 0.8190 | 6.0 | 36 | 0.7506 |
-| 0.8195 | 7.0 | 42 | 0.7411 |
-| 0.8048 | 8.0 | 48 | 0.7332 |
-| 0.8040 | 9.0 | 54 | 0.7260 |
-| 0.7897 | 10.0 | 60 | 0.7216 |
-| 0.7921 | 11.0 | 66 | 0.7229 |
-| 0.7888 | 12.0 | 72 | 0.7279 |
-| 0.7827 | 13.0 | 78 | 0.7076 |
-| 0.8073 | 14.0 | 84 | 0.7025 |
-| 0.7854 | 15.0 | 90 | 0.6939 |
-| 0.7717 | 16.0 | 96 | 0.6911 |
-| 0.7699 | 17.0 | 102 | 0.6865 |
-| 0.7613 | 18.0 | 108 | 0.6901 |
-| 0.7597 | 19.0 | 114 | 0.6784 |
-| 0.7514 | 20.0 | 120 | 0.6749 |
-| 0.7470 | 21.0 | 126 | 0.6698 |
-| 0.7455 | 22.0 | 132 | 0.6671 |
-| 0.7419 | 23.0 | 138 | 0.6668 |
-| 0.7473 | 24.0 | 144 | 0.6612 |
-| 0.7371 | 25.0 | 150 | 0.6587 |
-| 0.7319 | 26.0 | 156 | 0.6635 |
-| 0.7364 | 27.0 | 162 | 0.6493 |
-| 0.7187 | 28.0 | 168 | 0.6625 |
-| 0.7257 | 29.0 | 174 | 0.6474 |
-| 0.7151 | 30.0 | 180 | 0.6469 |
-| 0.7132 | 31.0 | 186 | 0.6395 |
-| 0.7167 | 32.0 | 192 | 0.6377 |
-| 0.7137 | 33.0 | 198 | 0.6347 |
-| 0.7127 | 34.0 | 204 | 0.6311 |
-| 0.7077 | 35.0 | 210 | 0.6355 |
-| 0.7205 | 36.0 | 216 | 0.6278 |
-| 0.7156 | 37.0 | 222 | 0.6316 |
-| 0.7032 | 38.0 | 228 | 0.6247 |
-| 0.7178 | 39.0 | 234 | 0.6248 |
-| 0.7151 | 40.0 | 240 | 0.6226 |
-| 0.7077 | 41.0 | 246 | 0.6223 |
-| 0.7253 | 42.0 | 252 | 0.6258 |
-| 0.7038 | 43.0 | 258 | 0.6187 |
-| 0.7244 | 44.0 | 264 | 0.6185 |
-| 0.7039 | 45.0 | 270 | 0.6185 |
-| 0.7290 | 46.0 | 276 | 0.6161 |
-| 0.7084 | 47.0 | 282 | 0.6164 |
-| 0.7099 | 48.0 | 288 | 0.6164 |
-| 0.6998 | 49.0 | 294 | 0.6158 |
-| 0.6985 | 50.0 | 300 | 0.6156 |
+| 2.4915 | 1.0 | 6 | 2.4641 |
+| 2.3810 | 2.0 | 12 | 2.4631 |
+| 0.4020 | 3.0 | 18 | 2.4760 |
+| 0.4152 | 4.0 | 24 | 2.5013 |
+| 0.4167 | 5.0 | 30 | 2.5159 |
+| 0.4259 | 6.0 | 36 | 2.5125 |
+| 0.4180 | 7.0 | 42 | 2.4938 |
+| 0.0 | 8.0 | 48 | 2.4938 |
+| 0.4142 | 9.0 | 54 | 2.4873 |
+| 0.4113 | 10.0 | 60 | 2.4874 |
+| 0.4333 | 11.0 | 66 | 2.4917 |
+| 0.4189 | 12.0 | 72 | 2.4985 |
+| 0.8576 | 13.0 | 78 | 2.5152 |
+| 0.4302 | 14.0 | 84 | 2.5250 |
+| 0.4158 | 15.0 | 90 | 2.5355 |
+| 0.4416 | 16.0 | 96 | 2.5463 |
+| 0.4541 | 17.0 | 102 | 2.5573 |
+| 0.4295 | 18.0 | 108 | 2.5683 |
+| 0.4440 | 19.0 | 114 | 2.5796 |
+| 1.2993 | 20.0 | 120 | 2.6013 |
+| 0.4239 | 21.0 | 126 | 2.6117 |
+| 0.0 | 22.0 | 132 | 2.6117 |
+| 0.0 | 23.0 | 138 | 2.6117 |
+| 0.8906 | 24.0 | 144 | 2.6300 |
+| 0.4285 | 25.0 | 150 | 2.6385 |
+| 0.4323 | 26.0 | 156 | 2.6461 |
+| 0.4449 | 27.0 | 162 | 2.6537 |
+| 0.0 | 28.0 | 168 | 2.6537 |
+| 0.4491 | 29.0 | 174 | 2.6605 |
+| 0.4529 | 30.0 | 180 | 2.6669 |
+| 1.7849 | 31.0 | 186 | nan |
+| 0.0 | 32.0 | 192 | nan |
+| 0.0 | 33.0 | 198 | nan |
+| 0.0 | 34.0 | 204 | nan |
+| 0.0 | 35.0 | 210 | nan |
+| 0.0 | 36.0 | 216 | nan |
+| 0.0 | 37.0 | 222 | nan |
+| 0.0 | 38.0 | 228 | nan |
+| 0.0 | 39.0 | 234 | nan |
+| 0.0 | 40.0 | 240 | nan |
+| 0.0 | 41.0 | 246 | nan |
+| 0.0 | 42.0 | 252 | nan |
+| 0.0 | 43.0 | 258 | nan |
+| 0.0 | 44.0 | 264 | nan |
+| 0.0 | 45.0 | 270 | nan |
+| 0.0 | 46.0 | 276 | nan |
+| 0.0 | 47.0 | 282 | nan |
+| 0.0 | 48.0 | 288 | nan |
+| 0.0 | 49.0 | 294 | nan |
+| 0.0 | 50.0 | 300 | nan |
+| 0.0 | 51.0 | 306 | nan |
+| 0.0 | 52.0 | 312 | nan |
+| 0.0 | 53.0 | 318 | nan |
+| 0.0 | 54.0 | 324 | nan |
+| 0.0 | 55.0 | 330 | nan |
+| 0.0 | 56.0 | 336 | nan |
+| 0.0 | 57.0 | 342 | nan |
+| 0.0 | 58.0 | 348 | nan |
+| 0.0 | 59.0 | 354 | nan |
+| 0.0 | 60.0 | 360 | nan |
+| 0.0 | 61.0 | 366 | nan |
+| 0.0 | 62.0 | 372 | nan |
+| 0.0 | 63.0 | 378 | nan |
+| 0.0 | 64.0 | 384 | nan |
+| 0.0 | 65.0 | 390 | nan |
+| 0.0 | 66.0 | 396 | nan |
+| 0.0 | 67.0 | 402 | nan |
+| 0.0 | 68.0 | 408 | nan |
+| 0.0 | 69.0 | 414 | nan |
+| 0.0 | 70.0 | 420 | nan |
 
 
 ### Framework versions
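
As a sanity check on the tables above, the Step column is the epoch count times a fixed six optimizer steps per epoch; the sketch below verifies that bookkeeping. Note that `steps_per_epoch = 6` is inferred from the table itself (it is not stated in the card), which would imply a training set of at most 6 × 512 = 3072 examples.

```python
# Step bookkeeping implied by the training-results tables.
# Assumption: steps_per_epoch is inferred from the table rows (epoch 1.0 -> step 6).
train_batch_size = 512
steps_per_epoch = 6


def total_steps(num_epochs: int) -> int:
    """Global optimizer steps after num_epochs epochs."""
    return num_epochs * steps_per_epoch


# Old run: 50 epochs -> step 300 (last row of the removed table).
# New run: 70 epochs -> step 420 (last row of the added table).
print(total_steps(50), total_steps(70))
```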