mousaazari committed on
Commit ce5d154 · 1 Parent(s): aee3fa1

update model card README.md

Files changed (1): README.md (+25 -55)
README.md CHANGED
@@ -14,10 +14,10 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [t5-base](https://huggingface.co/t5-base) on the None dataset.
 It achieves the following results on the evaluation set:
- - Loss: 0.3649
- - Rouge2 Precision: 0.9142
- - Rouge2 Recall: 0.3702
- - Rouge2 Fmeasure: 0.5091
 
 ## Model description
 
@@ -42,62 +42,32 @@ The following hyperparameters were used during training:
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
- - num_epochs: 50
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | Rouge2 Precision | Rouge2 Recall | Rouge2 Fmeasure |
 |:-------------:|:-----:|:----:|:---------------:|:----------------:|:-------------:|:---------------:|
- | No log | 1.0 | 11 | 1.9185 | 0.0196 | 0.0039 | 0.0065 |
- | No log | 2.0 | 22 | 0.9761 | 0.4113 | 0.1687 | 0.2307 |
- | No log | 3.0 | 33 | 0.6916 | 0.8798 | 0.3676 | 0.5019 |
- | No log | 4.0 | 44 | 0.5450 | 0.8498 | 0.3402 | 0.4735 |
- | No log | 5.0 | 55 | 0.4648 | 0.8247 | 0.3186 | 0.4482 |
- | No log | 6.0 | 66 | 0.4248 | 0.9028 | 0.3695 | 0.5061 |
- | No log | 7.0 | 77 | 0.4157 | 0.9048 | 0.3713 | 0.509 |
- | No log | 8.0 | 88 | 0.3755 | 0.9067 | 0.3721 | 0.5093 |
- | No log | 9.0 | 99 | 0.3457 | 0.9107 | 0.3721 | 0.5098 |
- | No log | 10.0 | 110 | 0.3538 | 0.9325 | 0.3753 | 0.5175 |
- | No log | 11.0 | 121 | 0.3433 | 0.9603 | 0.3853 | 0.532 |
- | No log | 12.0 | 132 | 0.3296 | 0.9325 | 0.3765 | 0.518 |
- | No log | 13.0 | 143 | 0.3442 | 0.9167 | 0.3716 | 0.5098 |
- | No log | 14.0 | 154 | 0.3281 | 0.9286 | 0.3739 | 0.5161 |
- | No log | 15.0 | 165 | 0.3321 | 0.9127 | 0.3689 | 0.5074 |
- | No log | 16.0 | 176 | 0.3261 | 0.9286 | 0.3725 | 0.5142 |
- | No log | 17.0 | 187 | 0.3118 | 0.9286 | 0.3739 | 0.5161 |
- | No log | 18.0 | 198 | 0.3279 | 0.9325 | 0.3755 | 0.5167 |
- | No log | 19.0 | 209 | 0.3313 | 0.8864 | 0.3593 | 0.4928 |
- | No log | 20.0 | 220 | 0.3252 | 0.9286 | 0.3725 | 0.5142 |
- | No log | 21.0 | 231 | 0.3449 | 0.8907 | 0.3639 | 0.4983 |
- | No log | 22.0 | 242 | 0.3434 | 0.9286 | 0.3725 | 0.5142 |
- | No log | 23.0 | 253 | 0.3528 | 0.9103 | 0.3679 | 0.5055 |
- | No log | 24.0 | 264 | 0.3445 | 0.9286 | 0.3725 | 0.5142 |
- | No log | 25.0 | 275 | 0.3317 | 0.9286 | 0.3725 | 0.5142 |
- | No log | 26.0 | 286 | 0.3437 | 0.9286 | 0.3737 | 0.5155 |
- | No log | 27.0 | 297 | 0.3434 | 0.9127 | 0.358 | 0.498 |
- | No log | 28.0 | 308 | 0.3264 | 0.9325 | 0.3741 | 0.5162 |
- | No log | 29.0 | 319 | 0.3309 | 0.9142 | 0.3702 | 0.5091 |
- | No log | 30.0 | 330 | 0.3367 | 0.9142 | 0.3702 | 0.5091 |
- | No log | 31.0 | 341 | 0.3376 | 0.8999 | 0.3565 | 0.4937 |
- | No log | 32.0 | 352 | 0.3468 | 0.9167 | 0.3593 | 0.5001 |
- | No log | 33.0 | 363 | 0.3507 | 0.9142 | 0.3702 | 0.5091 |
- | No log | 34.0 | 374 | 0.3422 | 0.9142 | 0.3702 | 0.5091 |
- | No log | 35.0 | 385 | 0.3395 | 0.8968 | 0.3548 | 0.4912 |
- | No log | 36.0 | 396 | 0.3500 | 0.8999 | 0.3565 | 0.4937 |
- | No log | 37.0 | 407 | 0.3583 | 0.8776 | 0.3504 | 0.4854 |
- | No log | 38.0 | 418 | 0.3575 | 0.8776 | 0.3504 | 0.4854 |
- | No log | 39.0 | 429 | 0.3554 | 0.8999 | 0.3565 | 0.4937 |
- | No log | 40.0 | 440 | 0.3515 | 0.9142 | 0.3702 | 0.5091 |
- | No log | 41.0 | 451 | 0.3581 | 0.8999 | 0.3565 | 0.4937 |
- | No log | 42.0 | 462 | 0.3623 | 0.8999 | 0.3565 | 0.4937 |
- | No log | 43.0 | 473 | 0.3627 | 0.8999 | 0.3565 | 0.4937 |
- | No log | 44.0 | 484 | 0.3616 | 0.9142 | 0.3702 | 0.5091 |
- | No log | 45.0 | 495 | 0.3635 | 0.9142 | 0.3702 | 0.5091 |
- | 0.2628 | 46.0 | 506 | 0.3631 | 0.9142 | 0.3702 | 0.5091 |
- | 0.2628 | 47.0 | 517 | 0.3641 | 0.9142 | 0.3702 | 0.5091 |
- | 0.2628 | 48.0 | 528 | 0.3643 | 0.9142 | 0.3702 | 0.5091 |
- | 0.2628 | 49.0 | 539 | 0.3646 | 0.9142 | 0.3702 | 0.5091 |
- | 0.2628 | 50.0 | 550 | 0.3649 | 0.9142 | 0.3702 | 0.5091 |
 
 
 ### Framework versions
 
 
 This model is a fine-tuned version of [t5-base](https://huggingface.co/t5-base) on the None dataset.
 It achieves the following results on the evaluation set:
+ - Loss: 0.3329
+ - Rouge2 Precision: 0.9286
+ - Rouge2 Recall: 0.3749
+ - Rouge2 Fmeasure: 0.5159
 
 ## Model description
 
 
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
+ - num_epochs: 20
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | Rouge2 Precision | Rouge2 Recall | Rouge2 Fmeasure |
 |:-------------:|:-----:|:----:|:---------------:|:----------------:|:-------------:|:---------------:|
+ | No log | 1.0 | 11 | 1.9239 | 0.0179 | 0.0039 | 0.0065 |
+ | No log | 2.0 | 22 | 1.0254 | 0.5905 | 0.2378 | 0.3272 |
+ | No log | 3.0 | 33 | 0.7086 | 0.8679 | 0.3524 | 0.4872 |
+ | No log | 4.0 | 44 | 0.5639 | 0.8779 | 0.348 | 0.4874 |
+ | No log | 5.0 | 55 | 0.4922 | 0.8425 | 0.3264 | 0.4583 |
+ | No log | 6.0 | 66 | 0.4356 | 0.8609 | 0.3306 | 0.4665 |
+ | No log | 7.0 | 77 | 0.4279 | 0.9338 | 0.3731 | 0.5156 |
+ | No log | 8.0 | 88 | 0.4056 | 0.9226 | 0.3752 | 0.5163 |
+ | No log | 9.0 | 99 | 0.3803 | 0.9444 | 0.3815 | 0.5255 |
+ | No log | 10.0 | 110 | 0.3655 | 0.9385 | 0.3783 | 0.5209 |
+ | No log | 11.0 | 121 | 0.3657 | 0.9385 | 0.381 | 0.5238 |
+ | No log | 12.0 | 132 | 0.3596 | 0.9544 | 0.3842 | 0.5295 |
+ | No log | 13.0 | 143 | 0.3552 | 0.9563 | 0.3828 | 0.5278 |
+ | No log | 14.0 | 154 | 0.3502 | 0.9286 | 0.3749 | 0.5159 |
+ | No log | 15.0 | 165 | 0.3496 | 0.9286 | 0.3749 | 0.5159 |
+ | No log | 16.0 | 176 | 0.3484 | 0.9246 | 0.3731 | 0.5131 |
+ | No log | 17.0 | 187 | 0.3346 | 0.9603 | 0.3842 | 0.53 |
+ | No log | 18.0 | 198 | 0.3326 | 0.9286 | 0.3749 | 0.5159 |
+ | No log | 19.0 | 209 | 0.3335 | 0.9286 | 0.3749 | 0.5159 |
+ | No log | 20.0 | 220 | 0.3329 | 0.9286 | 0.3749 | 0.5159 |
 
 
 ### Framework versions
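The Rouge2 columns above follow the standard bigram-overlap definitions. The exact scorer used for this card is not named in the diff, so the sketch below is only a plain whitespace-tokenized illustration (no stemming or special tokenization) of how the three numbers relate. It also shows the pattern visible in the table, where precision stays near 0.93 while recall sits near 0.37: that gap is what clipped bigram overlap produces when generated sequences are much shorter than their references.

```python
from collections import Counter

def rouge2(candidate: str, reference: str) -> tuple[float, float, float]:
    """ROUGE-2 precision, recall, and F-measure from clipped bigram overlap.

    Whitespace tokenization only; real scorers may lowercase, stem, or
    tokenize differently, so treat this as an illustration of the formulas.
    """
    def bigrams(text: str) -> Counter:
        tokens = text.lower().split()
        return Counter(zip(tokens, tokens[1:]))

    cand, ref = bigrams(candidate), bigrams(reference)
    # Counter intersection clips each bigram's count at the smaller side.
    overlap = sum((cand & ref).values())
    precision = overlap / max(1, sum(cand.values()))
    recall = overlap / max(1, sum(ref.values()))
    f = 2 * precision * recall / (precision + recall) if overlap else 0.0
    return precision, recall, f

# A short candidate against a longer reference: perfect precision, low recall.
p, r, f = rouge2("the cat sat", "the cat sat on the mat")
```

Here every candidate bigram appears in the reference (precision 1.0) but the reference has five bigrams of which only two are covered (recall 0.4), mirroring the high-precision/low-recall shape of the evaluation table.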
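The visible hunk fixes the optimizer (Adam with betas=(0.9,0.999), epsilon=1e-08) and a linear lr_scheduler_type, and drops num_epochs from 50 to 20; the Step column shows 11 optimizer steps per epoch, i.e. 220 steps total for the new run. Assuming the usual warmup-then-linear-decay shape (as in transformers' `get_linear_schedule_with_warmup`), and a placeholder base learning rate since the `learning_rate` line falls outside the visible hunk, the schedule can be sketched as:

```python
def linear_lr(step: int, total_steps: int, base_lr: float, warmup_steps: int = 0) -> float:
    """Linear schedule: ramp from 0 over warmup_steps, then decay to 0 at total_steps.

    Mirrors the shape of transformers' get_linear_schedule_with_warmup; the
    actual training may have used different warmup/base-lr values.
    """
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# 20 epochs x 11 optimizer steps per epoch (from the Step column) = 220 steps.
TOTAL_STEPS = 220
BASE_LR = 2e-5  # placeholder: the learning_rate line is outside the visible hunk
schedule = [linear_lr(s, TOTAL_STEPS, BASE_LR) for s in range(TOTAL_STEPS + 1)]
```

With no warmup the rate starts at the base value and reaches exactly zero at step 220, the last step logged in the new training table.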