mousaazari committed

Commit 27d9012 · 1 Parent(s): b1509f2

update model card README.md

Files changed (1):
  1. README.md +105 -45

README.md CHANGED
@@ -14,10 +14,10 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [t5-base](https://huggingface.co/t5-base) on the None dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.4977
- - Rouge2 Precision: 0.8135
- - Rouge2 Recall: 0.3117
- - Rouge2 Fmeasure: 0.4298
 
  ## Model description
 
@@ -42,52 +42,112 @@ The following hyperparameters were used during training:
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
- - num_epochs: 40
 
  ### Training results
 
  | Training Loss | Epoch | Step | Validation Loss | Rouge2 Precision | Rouge2 Recall | Rouge2 Fmeasure |
  |:-------------:|:-----:|:----:|:---------------:|:----------------:|:-------------:|:---------------:|
- | No log | 1.0 | 11 | 2.8516 | 0.0306 | 0.0054 | 0.0091 |
- | No log | 2.0 | 22 | 1.9542 | 0.0391 | 0.0119 | 0.0173 |
- | No log | 3.0 | 33 | 1.4651 | 0.2077 | 0.0735 | 0.1042 |
- | No log | 4.0 | 44 | 1.0618 | 0.5873 | 0.2521 | 0.3435 |
- | No log | 5.0 | 55 | 0.8918 | 0.7148 | 0.3032 | 0.4136 |
- | No log | 6.0 | 66 | 0.7871 | 0.7623 | 0.2922 | 0.4067 |
- | No log | 7.0 | 77 | 0.7451 | 0.7756 | 0.286 | 0.4002 |
- | No log | 8.0 | 88 | 0.6902 | 0.7734 | 0.2876 | 0.4022 |
- | No log | 9.0 | 99 | 0.6638 | 0.7913 | 0.295 | 0.4125 |
- | No log | 10.0 | 110 | 0.6434 | 0.7857 | 0.2923 | 0.4101 |
- | No log | 11.0 | 121 | 0.6248 | 0.8016 | 0.309 | 0.4262 |
- | No log | 12.0 | 132 | 0.6035 | 0.7976 | 0.3003 | 0.42 |
- | No log | 13.0 | 143 | 0.5936 | 0.8294 | 0.3186 | 0.4405 |
- | No log | 14.0 | 154 | 0.5608 | 0.8333 | 0.3204 | 0.4427 |
- | No log | 15.0 | 165 | 0.5717 | 0.8333 | 0.3204 | 0.4427 |
- | No log | 16.0 | 176 | 0.5615 | 0.8292 | 0.3183 | 0.4404 |
- | No log | 17.0 | 187 | 0.5550 | 0.8274 | 0.3175 | 0.4394 |
- | No log | 18.0 | 198 | 0.5473 | 0.8254 | 0.3176 | 0.4387 |
- | No log | 19.0 | 209 | 0.5422 | 0.8333 | 0.3192 | 0.441 |
- | No log | 20.0 | 220 | 0.5335 | 0.8373 | 0.3208 | 0.4429 |
- | No log | 21.0 | 231 | 0.5276 | 0.8294 | 0.3191 | 0.4409 |
- | No log | 22.0 | 242 | 0.5257 | 0.8611 | 0.3291 | 0.4558 |
- | No log | 23.0 | 253 | 0.5339 | 0.8387 | 0.3198 | 0.4433 |
- | No log | 24.0 | 264 | 0.5302 | 0.8492 | 0.325 | 0.4495 |
- | No log | 25.0 | 275 | 0.5187 | 0.8562 | 0.3284 | 0.4527 |
- | No log | 26.0 | 286 | 0.5208 | 0.8562 | 0.3284 | 0.4527 |
- | No log | 27.0 | 297 | 0.5170 | 0.8611 | 0.3295 | 0.4555 |
- | No log | 28.0 | 308 | 0.5095 | 0.8671 | 0.3318 | 0.4604 |
- | No log | 29.0 | 319 | 0.5128 | 0.8651 | 0.3307 | 0.4584 |
- | No log | 30.0 | 330 | 0.5074 | 0.8175 | 0.3145 | 0.4334 |
- | No log | 31.0 | 341 | 0.4990 | 0.8175 | 0.3145 | 0.4334 |
- | No log | 32.0 | 352 | 0.4985 | 0.8175 | 0.3145 | 0.4334 |
- | No log | 33.0 | 363 | 0.5006 | 0.8229 | 0.316 | 0.4359 |
- | No log | 34.0 | 374 | 0.4975 | 0.869 | 0.3318 | 0.4606 |
- | No log | 35.0 | 385 | 0.4987 | 0.8591 | 0.3294 | 0.4564 |
- | No log | 36.0 | 396 | 0.4968 | 0.8591 | 0.3294 | 0.4564 |
- | No log | 37.0 | 407 | 0.4960 | 0.8591 | 0.3283 | 0.455 |
- | No log | 38.0 | 418 | 0.4965 | 0.8353 | 0.3192 | 0.4425 |
- | No log | 39.0 | 429 | 0.4978 | 0.8135 | 0.3117 | 0.4298 |
- | No log | 40.0 | 440 | 0.4977 | 0.8135 | 0.3117 | 0.4298 |
 
 
  ### Framework versions
 
 
  This model is a fine-tuned version of [t5-base](https://huggingface.co/t5-base) on the None dataset.
  It achieves the following results on the evaluation set:
+ - Loss: 0.5209
+ - Rouge2 Precision: 0.8022
+ - Rouge2 Recall: 0.2938
+ - Rouge2 Fmeasure: 0.417
 
  ## Model description
 
 
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
+ - num_epochs: 100
 
  ### Training results
 
  | Training Loss | Epoch | Step | Validation Loss | Rouge2 Precision | Rouge2 Recall | Rouge2 Fmeasure |
  |:-------------:|:-----:|:----:|:---------------:|:----------------:|:-------------:|:---------------:|
+ | No log | 1.0 | 11 | 2.6610 | 0.0196 | 0.0042 | 0.0069 |
+ | No log | 2.0 | 22 | 1.8527 | 0.0675 | 0.0175 | 0.0277 |
+ | No log | 3.0 | 33 | 1.3227 | 0.167 | 0.0651 | 0.0896 |
+ | No log | 4.0 | 44 | 0.9633 | 0.6745 | 0.2883 | 0.3935 |
+ | No log | 5.0 | 55 | 0.8185 | 0.7441 | 0.2801 | 0.3907 |
+ | No log | 6.0 | 66 | 0.7481 | 0.7495 | 0.2879 | 0.4024 |
+ | No log | 7.0 | 77 | 0.7140 | 0.7484 | 0.2778 | 0.3899 |
+ | No log | 8.0 | 88 | 0.6503 | 0.748 | 0.2754 | 0.3887 |
+ | No log | 9.0 | 99 | 0.6324 | 0.767 | 0.2843 | 0.4012 |
+ | No log | 10.0 | 110 | 0.6202 | 0.7565 | 0.2813 | 0.396 |
+ | No log | 11.0 | 121 | 0.6121 | 0.7964 | 0.3047 | 0.4244 |
+ | No log | 12.0 | 132 | 0.5746 | 0.8135 | 0.3111 | 0.4342 |
+ | No log | 13.0 | 143 | 0.5831 | 0.8169 | 0.3103 | 0.4327 |
+ | No log | 14.0 | 154 | 0.5618 | 0.8407 | 0.3226 | 0.4489 |
+ | No log | 15.0 | 165 | 0.5700 | 0.8021 | 0.3098 | 0.4289 |
+ | No log | 16.0 | 176 | 0.5497 | 0.8407 | 0.3226 | 0.4489 |
+ | No log | 17.0 | 187 | 0.5387 | 0.8274 | 0.3144 | 0.4378 |
+ | No log | 18.0 | 198 | 0.5438 | 0.8393 | 0.3174 | 0.4431 |
+ | No log | 19.0 | 209 | 0.5225 | 0.775 | 0.2954 | 0.4116 |
+ | No log | 20.0 | 220 | 0.5096 | 0.7914 | 0.3064 | 0.4233 |
+ | No log | 21.0 | 231 | 0.5104 | 0.8268 | 0.3179 | 0.441 |
+ | No log | 22.0 | 242 | 0.5105 | 0.8472 | 0.3253 | 0.4522 |
+ | No log | 23.0 | 253 | 0.5098 | 0.8236 | 0.3172 | 0.4393 |
+ | No log | 24.0 | 264 | 0.5157 | 0.8402 | 0.322 | 0.4469 |
+ | No log | 25.0 | 275 | 0.5090 | 0.8283 | 0.3181 | 0.4407 |
+ | No log | 26.0 | 286 | 0.5025 | 0.8283 | 0.3181 | 0.4407 |
+ | No log | 27.0 | 297 | 0.4818 | 0.8532 | 0.327 | 0.4546 |
+ | No log | 28.0 | 308 | 0.5056 | 0.8492 | 0.3249 | 0.4505 |
+ | No log | 29.0 | 319 | 0.5214 | 0.8571 | 0.3282 | 0.4572 |
+ | No log | 30.0 | 330 | 0.5013 | 0.8492 | 0.3255 | 0.4521 |
+ | No log | 31.0 | 341 | 0.5016 | 0.8433 | 0.3152 | 0.4418 |
+ | No log | 32.0 | 352 | 0.4953 | 0.8472 | 0.3186 | 0.4463 |
+ | No log | 33.0 | 363 | 0.5084 | 0.8433 | 0.3152 | 0.4418 |
+ | No log | 34.0 | 374 | 0.5109 | 0.8135 | 0.2944 | 0.419 |
+ | No log | 35.0 | 385 | 0.5135 | 0.8532 | 0.3217 | 0.4499 |
+ | No log | 36.0 | 396 | 0.5103 | 0.8016 | 0.2908 | 0.4137 |
+ | No log | 37.0 | 407 | 0.4987 | 0.8016 | 0.2908 | 0.4137 |
+ | No log | 38.0 | 418 | 0.4973 | 0.8294 | 0.3186 | 0.4439 |
+ | No log | 39.0 | 429 | 0.5054 | 0.8016 | 0.2959 | 0.419 |
+ | No log | 40.0 | 440 | 0.5005 | 0.7918 | 0.2918 | 0.4134 |
+ | No log | 41.0 | 451 | 0.5084 | 0.8124 | 0.293 | 0.4168 |
+ | No log | 42.0 | 462 | 0.4983 | 0.8095 | 0.2932 | 0.4176 |
+ | No log | 43.0 | 473 | 0.4975 | 0.8016 | 0.2958 | 0.4192 |
+ | No log | 44.0 | 484 | 0.4925 | 0.7937 | 0.293 | 0.4149 |
+ | No log | 45.0 | 495 | 0.4883 | 0.7817 | 0.2893 | 0.4091 |
+ | 0.4328 | 46.0 | 506 | 0.4961 | 0.7976 | 0.2944 | 0.4174 |
+ | 0.4328 | 47.0 | 517 | 0.4942 | 0.8056 | 0.2918 | 0.4143 |
+ | 0.4328 | 48.0 | 528 | 0.4982 | 0.8095 | 0.2969 | 0.4213 |
+ | 0.4328 | 49.0 | 539 | 0.4962 | 0.8095 | 0.293 | 0.4166 |
+ | 0.4328 | 50.0 | 550 | 0.4963 | 0.8095 | 0.293 | 0.4166 |
+ | 0.4328 | 51.0 | 561 | 0.4914 | 0.7976 | 0.2893 | 0.4115 |
+ | 0.4328 | 52.0 | 572 | 0.5019 | 0.7976 | 0.2893 | 0.4115 |
+ | 0.4328 | 53.0 | 583 | 0.4972 | 0.7976 | 0.2944 | 0.4174 |
+ | 0.4328 | 54.0 | 594 | 0.4899 | 0.7976 | 0.2944 | 0.4174 |
+ | 0.4328 | 55.0 | 605 | 0.4990 | 0.7897 | 0.2918 | 0.4133 |
+ | 0.4328 | 56.0 | 616 | 0.5094 | 0.7817 | 0.2859 | 0.4042 |
+ | 0.4328 | 57.0 | 627 | 0.5109 | 0.8056 | 0.2918 | 0.4143 |
+ | 0.4328 | 58.0 | 638 | 0.5133 | 0.8095 | 0.293 | 0.4166 |
+ | 0.4328 | 59.0 | 649 | 0.4982 | 0.8095 | 0.293 | 0.4166 |
+ | 0.4328 | 60.0 | 660 | 0.4938 | 0.7865 | 0.2871 | 0.4065 |
+ | 0.4328 | 61.0 | 671 | 0.4947 | 0.7817 | 0.2859 | 0.4042 |
+ | 0.4328 | 62.0 | 682 | 0.4981 | 0.7817 | 0.2859 | 0.4042 |
+ | 0.4328 | 63.0 | 693 | 0.4957 | 0.7817 | 0.2859 | 0.4042 |
+ | 0.4328 | 64.0 | 704 | 0.5030 | 0.771 | 0.283 | 0.3999 |
+ | 0.4328 | 65.0 | 715 | 0.5109 | 0.7783 | 0.286 | 0.4042 |
+ | 0.4328 | 66.0 | 726 | 0.5142 | 0.7868 | 0.2883 | 0.4081 |
+ | 0.4328 | 67.0 | 737 | 0.5134 | 0.7868 | 0.2883 | 0.4081 |
+ | 0.4328 | 68.0 | 748 | 0.5152 | 0.7868 | 0.2883 | 0.4081 |
+ | 0.4328 | 69.0 | 759 | 0.5157 | 0.7868 | 0.2883 | 0.4081 |
+ | 0.4328 | 70.0 | 770 | 0.5112 | 0.8135 | 0.2944 | 0.4188 |
+ | 0.4328 | 71.0 | 781 | 0.5105 | 0.8135 | 0.2944 | 0.4188 |
+ | 0.4328 | 72.0 | 792 | 0.5188 | 0.7868 | 0.2883 | 0.4081 |
+ | 0.4328 | 73.0 | 803 | 0.5260 | 0.818 | 0.3114 | 0.4329 |
+ | 0.4328 | 74.0 | 814 | 0.5227 | 0.7868 | 0.2883 | 0.4081 |
+ | 0.4328 | 75.0 | 825 | 0.5179 | 0.7868 | 0.2883 | 0.4081 |
+ | 0.4328 | 76.0 | 836 | 0.5158 | 0.8135 | 0.2944 | 0.4188 |
+ | 0.4328 | 77.0 | 847 | 0.5142 | 0.7868 | 0.2883 | 0.4081 |
+ | 0.4328 | 78.0 | 858 | 0.5144 | 0.818 | 0.3114 | 0.4329 |
+ | 0.4328 | 79.0 | 869 | 0.5147 | 0.818 | 0.3114 | 0.4329 |
+ | 0.4328 | 80.0 | 880 | 0.5148 | 0.818 | 0.3114 | 0.4329 |
+ | 0.4328 | 81.0 | 891 | 0.5150 | 0.818 | 0.3114 | 0.4329 |
+ | 0.4328 | 82.0 | 902 | 0.5131 | 0.818 | 0.3114 | 0.4329 |
+ | 0.4328 | 83.0 | 913 | 0.5123 | 0.818 | 0.3114 | 0.4329 |
+ | 0.4328 | 84.0 | 924 | 0.5143 | 0.8305 | 0.3161 | 0.441 |
+ | 0.4328 | 85.0 | 935 | 0.5161 | 0.8305 | 0.3161 | 0.441 |
+ | 0.4328 | 86.0 | 946 | 0.5178 | 0.8305 | 0.3161 | 0.441 |
+ | 0.4328 | 87.0 | 957 | 0.5159 | 0.8305 | 0.3161 | 0.441 |
+ | 0.4328 | 88.0 | 968 | 0.5164 | 0.8305 | 0.3161 | 0.441 |
+ | 0.4328 | 89.0 | 979 | 0.5156 | 0.8022 | 0.2938 | 0.417 |
+ | 0.4328 | 90.0 | 990 | 0.5173 | 0.8022 | 0.2938 | 0.417 |
+ | 0.0488 | 91.0 | 1001 | 0.5196 | 0.8022 | 0.2938 | 0.417 |
+ | 0.0488 | 92.0 | 1012 | 0.5193 | 0.8022 | 0.2938 | 0.417 |
+ | 0.0488 | 93.0 | 1023 | 0.5205 | 0.8022 | 0.2938 | 0.417 |
+ | 0.0488 | 94.0 | 1034 | 0.5222 | 0.8022 | 0.2938 | 0.417 |
+ | 0.0488 | 95.0 | 1045 | 0.5221 | 0.8022 | 0.2938 | 0.417 |
+ | 0.0488 | 96.0 | 1056 | 0.5214 | 0.8022 | 0.2938 | 0.417 |
+ | 0.0488 | 97.0 | 1067 | 0.5211 | 0.8022 | 0.2938 | 0.417 |
+ | 0.0488 | 98.0 | 1078 | 0.5209 | 0.8022 | 0.2938 | 0.417 |
+ | 0.0488 | 99.0 | 1089 | 0.5208 | 0.8022 | 0.2938 | 0.417 |
+ | 0.0488 | 100.0 | 1100 | 0.5209 | 0.8022 | 0.2938 | 0.417 |
 
 
  ### Framework versions
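
For context on the metric columns in the tables above: ROUGE-2 scores bigram overlap between a generated sequence and a reference. The card's numbers were presumably produced by a ROUGE library with its own tokenization and stemming; the minimal pure-Python sketch below (the `rouge2` helper is illustrative, not the evaluation code used here) shows what the three columns measure. The consistently high precision and much lower recall in both the old and new results suggest the generated sequences are substantially shorter than the references.

```python
from collections import Counter


def rouge2(reference: str, candidate: str):
    """Illustrative ROUGE-2: bigram-overlap precision, recall, F-measure.

    Real evaluations typically use a library such as rouge_score,
    which adds tokenization and stemming rules on top of this idea.
    """
    def bigrams(text: str) -> Counter:
        tokens = text.split()
        # Count each adjacent token pair (bigram) in the text.
        return Counter(zip(tokens, tokens[1:]))

    ref, cand = bigrams(reference), bigrams(candidate)
    # Clipped overlap: a bigram counts at most as often as it appears
    # in each side (multiset intersection).
    overlap = sum((ref & cand).values())
    if overlap == 0:
        return 0.0, 0.0, 0.0
    precision = overlap / sum(cand.values())  # fraction of candidate bigrams that match
    recall = overlap / sum(ref.values())      # fraction of reference bigrams recovered
    fmeasure = 2 * precision * recall / (precision + recall)
    return precision, recall, fmeasure


# A candidate shorter than its reference gives high precision but low
# recall, the same pattern as the scores reported in the tables above.
print(rouge2("the cat sat on the mat", "the cat sat"))
```

Here `rouge2("the cat sat on the mat", "the cat sat")` yields precision 1.0 (both candidate bigrams appear in the reference) but recall 0.4 (only 2 of the 5 reference bigrams are recovered).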