End of training

Browse files:
- README.md +114 -167
- model.safetensors +1 -1
- training_args.bin +1 -1

README.md CHANGED
@@ -18,11 +18,11 @@ should probably proofread and complete it, then remove this comment. -->

 This model is a fine-tuned version of [google/flan-t5-small](https://huggingface.co/google/flan-t5-small) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.
-- Rouge1: 0.
-- Rouge2: 0.
-- Rougel: 0.
-- Rougelsum: 0.

 ## Model description
@@ -41,7 +41,7 @@ More information needed

 ### Training hyperparameters

 The following hyperparameters were used during training:
-- learning_rate:
 - train_batch_size: 80
 - eval_batch_size: 80
 - seed: 42
@@ -49,172 +49,119 @@ The following hyperparameters were used during training:

 - total_train_batch_size: 320
 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
 - lr_scheduler_type: linear
-- num_epochs:

 ### Training results

 | Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum |
 |:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|
-| 2.2205 | 108.0 | 324 | 0.6955 | 0.3493 | 0.1134 | 0.3167 | 0.3335 |
-| 2.1966 | 109.0 | 327 | 0.6866 | 0.3493 | 0.1134 | 0.3167 | 0.3335 |
-| 2.1647 | 110.0 | 330 | 0.6781 | 0.3493 | 0.1134 | 0.3167 | 0.3335 |
-| 2.1735 | 111.0 | 333 | 0.6703 | 0.3528 | 0.1131 | 0.3192 | 0.3375 |
-| 2.1282 | 112.0 | 336 | 0.6621 | 0.3528 | 0.1131 | 0.3192 | 0.3375 |
-| 2.128 | 113.0 | 339 | 0.6539 | 0.3528 | 0.1131 | 0.3192 | 0.3375 |
-| 2.0928 | 114.0 | 342 | 0.6461 | 0.3528 | 0.1131 | 0.3192 | 0.3375 |
-| 2.1138 | 115.0 | 345 | 0.6380 | 0.3528 | 0.1133 | 0.3197 | 0.3374 |
-| 2.0628 | 116.0 | 348 | 0.6300 | 0.3528 | 0.1133 | 0.3197 | 0.3374 |
-| 2.1074 | 117.0 | 351 | 0.6225 | 0.3528 | 0.1133 | 0.3197 | 0.3374 |
-| 2.0309 | 118.0 | 354 | 0.6154 | 0.3528 | 0.1133 | 0.3197 | 0.3374 |
-| 2.0194 | 119.0 | 357 | 0.6086 | 0.3528 | 0.1133 | 0.3197 | 0.3374 |
-| 2.003 | 120.0 | 360 | 0.6025 | 0.3528 | 0.1133 | 0.3197 | 0.3374 |
-| 1.9763 | 121.0 | 363 | 0.5971 | 0.3528 | 0.1133 | 0.3197 | 0.3374 |
-| 1.9839 | 122.0 | 366 | 0.5924 | 0.3528 | 0.1133 | 0.3197 | 0.3374 |
-| 1.9457 | 123.0 | 369 | 0.5875 | 0.3528 | 0.1133 | 0.3197 | 0.3374 |
-| 1.9712 | 124.0 | 372 | 0.5831 | 0.3528 | 0.1133 | 0.3197 | 0.3374 |
-| 1.8996 | 125.0 | 375 | 0.5784 | 0.3528 | 0.1133 | 0.3197 | 0.3374 |
-| 1.9026 | 126.0 | 378 | 0.5735 | 0.3528 | 0.1133 | 0.3197 | 0.3374 |
-| 1.8806 | 127.0 | 381 | 0.5687 | 0.3528 | 0.1133 | 0.3197 | 0.3374 |
-| 1.9003 | 128.0 | 384 | 0.5641 | 0.3528 | 0.1133 | 0.3197 | 0.3374 |
-| 1.856 | 129.0 | 387 | 0.5596 | 0.3528 | 0.1133 | 0.3197 | 0.3374 |
-| 1.8529 | 130.0 | 390 | 0.5552 | 0.3528 | 0.1133 | 0.3197 | 0.3374 |
-| 1.7915 | 131.0 | 393 | 0.5510 | 0.3528 | 0.1133 | 0.3197 | 0.3374 |
-| 1.8471 | 132.0 | 396 | 0.5469 | 0.3606 | 0.1154 | 0.3268 | 0.3445 |
-| 1.8399 | 133.0 | 399 | 0.5432 | 0.3606 | 0.1154 | 0.3268 | 0.3445 |
-| 1.8121 | 134.0 | 402 | 0.5400 | 0.3606 | 0.1154 | 0.3268 | 0.3445 |
-| 1.77 | 135.0 | 405 | 0.5370 | 0.3606 | 0.1155 | 0.3272 | 0.3445 |
-| 1.8023 | 136.0 | 408 | 0.5344 | 0.3606 | 0.1155 | 0.3272 | 0.3445 |
-| 1.7561 | 137.0 | 411 | 0.5318 | 0.3606 | 0.1155 | 0.3272 | 0.3445 |
-| 1.7662 | 138.0 | 414 | 0.5292 | 0.3606 | 0.1155 | 0.3272 | 0.3445 |
-| 1.7538 | 139.0 | 417 | 0.5264 | 0.3606 | 0.1155 | 0.3272 | 0.3445 |
-| 1.7416 | 140.0 | 420 | 0.5239 | 0.3606 | 0.1155 | 0.3272 | 0.3445 |
-| 1.7705 | 141.0 | 423 | 0.5215 | 0.3606 | 0.1155 | 0.3272 | 0.3445 |
-| 1.6753 | 142.0 | 426 | 0.5193 | 0.3606 | 0.1155 | 0.3272 | 0.3445 |
-| 1.7438 | 143.0 | 429 | 0.5173 | 0.3606 | 0.1155 | 0.3272 | 0.3445 |
-| 1.7012 | 144.0 | 432 | 0.5155 | 0.3606 | 0.1155 | 0.3272 | 0.3445 |
-| 1.7532 | 145.0 | 435 | 0.5138 | 0.3606 | 0.1155 | 0.3272 | 0.3445 |
-| 1.7072 | 146.0 | 438 | 0.5121 | 0.3606 | 0.1155 | 0.3272 | 0.3445 |
-| 1.7409 | 147.0 | 441 | 0.5106 | 0.3606 | 0.1155 | 0.3272 | 0.3445 |
-| 1.685 | 148.0 | 444 | 0.5092 | 0.3606 | 0.1155 | 0.3272 | 0.3445 |
-| 1.7463 | 149.0 | 447 | 0.5079 | 0.3606 | 0.1155 | 0.3272 | 0.3445 |
-| 1.6827 | 150.0 | 450 | 0.5067 | 0.3606 | 0.1155 | 0.3272 | 0.3445 |
-| 1.7278 | 151.0 | 453 | 0.5056 | 0.3606 | 0.1155 | 0.3272 | 0.3445 |
-| 1.6815 | 152.0 | 456 | 0.5046 | 0.3621 | 0.1184 | 0.3287 | 0.3462 |
-| 1.6731 | 153.0 | 459 | 0.5037 | 0.3621 | 0.1184 | 0.3287 | 0.3462 |
-| 1.7415 | 154.0 | 462 | 0.5030 | 0.3621 | 0.1184 | 0.3287 | 0.3462 |
-| 1.6871 | 155.0 | 465 | 0.5024 | 0.3621 | 0.1184 | 0.3287 | 0.3462 |
-| 1.7124 | 156.0 | 468 | 0.5019 | 0.3621 | 0.1184 | 0.3287 | 0.3462 |
-| 1.6678 | 157.0 | 471 | 0.5015 | 0.3621 | 0.1184 | 0.3287 | 0.3462 |
-| 1.7196 | 158.0 | 474 | 0.5012 | 0.3621 | 0.1184 | 0.3287 | 0.3462 |
-| 1.6954 | 159.0 | 477 | 0.5010 | 0.3621 | 0.1184 | 0.3287 | 0.3462 |
-| 1.705 | 160.0 | 480 | 0.5009 | 0.3621 | 0.1184 | 0.3287 | 0.3462 |

 ### Framework versions

 This model is a fine-tuned version of [google/flan-t5-small](https://huggingface.co/google/flan-t5-small) on an unknown dataset.
 It achieves the following results on the evaluation set:
+- Loss: 0.4416
+- Rouge1: 0.3489
+- Rouge2: 0.1081
+- Rougel: 0.3225
+- Rougelsum: 0.3335

 ## Model description
 ### Training hyperparameters

 The following hyperparameters were used during training:
+- learning_rate: 0.0005
 - train_batch_size: 80
 - eval_batch_size: 80
 - seed: 42
 - total_train_batch_size: 320
 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
 - lr_scheduler_type: linear
+- num_epochs: 160
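The hyperparameter figures fit together arithmetically: the total train batch size of 320 is 4x the per-device batch size of 80 (gradient accumulation and/or data parallelism), and the linear scheduler decays the learning rate from 0.0005 toward zero over the scheduled steps. A minimal sketch of that relationship (illustrative only, not the actual training script; the 3 steps per epoch is inferred from the results table, not stated in the card):

```python
# Sketch of how the reported hyperparameters relate; step counts are
# inferred from the results table (epoch 1.0 lands on step 3).
train_batch_size = 80            # per-device batch size
total_train_batch_size = 320     # reported effective batch size
# 4-way gradient accumulation and/or data parallelism:
parallel_factor = total_train_batch_size // train_batch_size

num_epochs = 160
steps_per_epoch = 3
total_steps = num_epochs * steps_per_epoch  # 480 scheduled steps

def linear_lr(step, base_lr=0.0005, total=total_steps):
    """Linear decay from base_lr to 0, assuming no warmup."""
    return base_lr * max(0.0, 1.0 - step / total)

print(parallel_factor)   # 4
print(total_steps)       # 480
print(linear_lr(240))    # halfway: 0.00025
```

Note that the new results table below stops at step 320 (epoch 106.8), i.e. training ended before the full 160 scheduled epochs.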

 ### Training results

 | Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum |
 |:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|
+| 41.7986 | 1.0 | 3 | 14.9730 | 0.0645 | 0.0187 | 0.0619 | 0.0626 |
+| 18.0367 | 2.0 | 6 | 6.4506 | 0.0696 | 0.0369 | 0.0673 | 0.0690 |
+| 11.6807 | 3.0 | 9 | 4.8843 | 0.1134 | 0.0385 | 0.0991 | 0.1016 |
+| 9.5977 | 4.0 | 12 | 4.1902 | 0.0615 | 0.0227 | 0.0529 | 0.0562 |
+| 8.382 | 5.0 | 15 | 3.7084 | 0.0108 | 0.0017 | 0.0108 | 0.0107 |
+| 7.3099 | 6.0 | 18 | 3.1393 | 0.0334 | 0.0139 | 0.0321 | 0.0319 |
+| 6.2255 | 7.0 | 21 | 2.6959 | 0.0484 | 0.0206 | 0.0471 | 0.0478 |
+| 5.3866 | 8.0 | 24 | 2.2886 | 0.0942 | 0.0388 | 0.0902 | 0.0928 |
+| 4.4362 | 9.0 | 27 | 1.6919 | 0.1476 | 0.0517 | 0.1244 | 0.1336 |
+| 3.5819 | 10.0 | 30 | 1.2444 | 0.2204 | 0.0785 | 0.1939 | 0.2064 |
+| 2.7713 | 11.0 | 33 | 0.9173 | 0.3423 | 0.1261 | 0.3144 | 0.3277 |
+| 2.1415 | 12.0 | 36 | 0.6726 | 0.3756 | 0.1188 | 0.3440 | 0.3607 |
+| 1.6248 | 13.0 | 39 | 0.4801 | 0.3757 | 0.1220 | 0.3387 | 0.3573 |
+| 1.3172 | 14.0 | 42 | 0.3892 | 0.3855 | 0.1316 | 0.3478 | 0.3655 |
+| 1.0707 | 15.0 | 45 | 0.3425 | 0.3863 | 0.1358 | 0.3514 | 0.3691 |
+| 0.8661 | 16.0 | 48 | 0.3104 | 0.3820 | 0.1376 | 0.3447 | 0.3624 |
+| 0.7925 | 17.0 | 51 | 0.2946 | 0.3937 | 0.1408 | 0.3620 | 0.3738 |
+| 0.6878 | 18.0 | 54 | 0.2863 | 0.3893 | 0.1375 | 0.3568 | 0.3672 |
+| 0.6841 | 19.0 | 57 | 0.2810 | 0.3959 | 0.1427 | 0.3635 | 0.3731 |
+| 0.6014 | 20.0 | 60 | 0.2782 | 0.3991 | 0.1447 | 0.3663 | 0.3794 |
+| 0.5921 | 21.0 | 63 | 0.2786 | 0.4018 | 0.1467 | 0.3696 | 0.3841 |
+| 0.5582 | 22.0 | 66 | 0.2776 | 0.3967 | 0.1414 | 0.3618 | 0.3767 |
+| 0.5268 | 23.0 | 69 | 0.2785 | 0.3984 | 0.1479 | 0.3669 | 0.3822 |
+| 0.4784 | 24.0 | 72 | 0.2796 | 0.4031 | 0.1519 | 0.3709 | 0.3851 |
+| 0.4378 | 25.0 | 75 | 0.2831 | 0.4001 | 0.1501 | 0.3667 | 0.3805 |
+| 0.4395 | 26.0 | 78 | 0.2876 | 0.4015 | 0.1522 | 0.3691 | 0.3811 |
+| 0.4269 | 27.0 | 81 | 0.2897 | 0.4055 | 0.1455 | 0.3747 | 0.3845 |
+| 0.3955 | 28.0 | 84 | 0.2925 | 0.3912 | 0.1330 | 0.3595 | 0.3694 |
+| 0.3876 | 29.0 | 87 | 0.2976 | 0.3881 | 0.1354 | 0.3592 | 0.3677 |
+| 0.3593 | 30.0 | 90 | 0.3008 | 0.3875 | 0.1374 | 0.3580 | 0.3686 |
+| 0.3477 | 31.0 | 93 | 0.3038 | 0.3792 | 0.1303 | 0.3507 | 0.3609 |
+| 0.3368 | 32.0 | 96 | 0.3079 | 0.3854 | 0.1331 | 0.3601 | 0.3677 |
+| 0.3019 | 33.0 | 99 | 0.3134 | 0.3820 | 0.1272 | 0.3524 | 0.3633 |
+| 0.3141 | 34.0 | 102 | 0.3202 | 0.3733 | 0.1229 | 0.3431 | 0.3541 |
+| 0.2914 | 35.0 | 105 | 0.3233 | 0.3814 | 0.1257 | 0.3514 | 0.3638 |
+| 0.2817 | 36.0 | 108 | 0.3250 | 0.3822 | 0.1316 | 0.3563 | 0.3636 |
+| 0.2875 | 37.0 | 111 | 0.3280 | 0.3898 | 0.1405 | 0.3650 | 0.3737 |
+| 0.267 | 38.0 | 114 | 0.3343 | 0.3878 | 0.1353 | 0.3616 | 0.3708 |
+| 0.264 | 39.0 | 117 | 0.3375 | 0.3761 | 0.1182 | 0.3484 | 0.3589 |
+| 0.2519 | 40.0 | 120 | 0.3372 | 0.3781 | 0.1228 | 0.3504 | 0.3606 |
+| 0.2508 | 41.0 | 123 | 0.3382 | 0.3810 | 0.1244 | 0.3538 | 0.3635 |
+| 0.2373 | 42.0 | 126 | 0.3460 | 0.3805 | 0.1230 | 0.3533 | 0.3632 |
+| 0.2316 | 43.0 | 129 | 0.3533 | 0.3692 | 0.1125 | 0.3396 | 0.3514 |
+| 0.2271 | 44.0 | 132 | 0.3552 | 0.3576 | 0.1133 | 0.3313 | 0.3394 |
+| 0.2133 | 45.0 | 135 | 0.3565 | 0.3643 | 0.1244 | 0.3401 | 0.3481 |
+| 0.2167 | 46.0 | 138 | 0.3602 | 0.3683 | 0.1245 | 0.3408 | 0.3490 |
+| 0.2119 | 47.0 | 141 | 0.3647 | 0.3694 | 0.1278 | 0.3399 | 0.3493 |
+| 0.1976 | 48.0 | 144 | 0.3677 | 0.3590 | 0.1194 | 0.3322 | 0.3414 |
+| 0.2133 | 49.0 | 147 | 0.3720 | 0.3531 | 0.1115 | 0.3275 | 0.3351 |
+| 0.1923 | 50.0 | 150 | 0.3746 | 0.3621 | 0.1189 | 0.3339 | 0.3413 |
+| 0.1854 | 51.0 | 153 | 0.3760 | 0.3707 | 0.1280 | 0.3438 | 0.3528 |
+| 0.1872 | 52.0 | 156 | 0.3767 | 0.3635 | 0.1219 | 0.3358 | 0.3463 |
+| 0.1827 | 53.0 | 159 | 0.3790 | 0.3657 | 0.1196 | 0.3384 | 0.3494 |
+| 0.1801 | 54.0 | 162 | 0.3833 | 0.3611 | 0.1195 | 0.3276 | 0.3426 |
+| 0.1787 | 55.0 | 165 | 0.3903 | 0.3595 | 0.1202 | 0.3285 | 0.3411 |
+| 0.1713 | 56.0 | 168 | 0.3923 | 0.3566 | 0.1179 | 0.3258 | 0.3379 |
+| 0.1626 | 57.0 | 171 | 0.3941 | 0.3497 | 0.1152 | 0.3185 | 0.3325 |
+| 0.1599 | 58.0 | 174 | 0.3922 | 0.3605 | 0.1216 | 0.3305 | 0.3448 |
+| 0.1603 | 59.0 | 177 | 0.3929 | 0.3478 | 0.1079 | 0.3188 | 0.3329 |
+| 0.1794 | 60.0 | 180 | 0.3958 | 0.3455 | 0.1057 | 0.3179 | 0.3319 |
+| 0.1626 | 61.0 | 183 | 0.3997 | 0.3481 | 0.1078 | 0.3203 | 0.3320 |
+| 0.1433 | 62.0 | 186 | 0.4019 | 0.3529 | 0.1129 | 0.3278 | 0.3386 |
+| 0.1489 | 63.0 | 189 | 0.4008 | 0.3446 | 0.1137 | 0.3220 | 0.3291 |
+| 0.1595 | 64.0 | 192 | 0.4009 | 0.3579 | 0.1159 | 0.3345 | 0.3421 |
+| 0.1557 | 65.0 | 195 | 0.4044 | 0.3506 | 0.1165 | 0.3269 | 0.3342 |
+| 0.1435 | 66.0 | 198 | 0.4094 | 0.3404 | 0.1082 | 0.3159 | 0.3257 |
+| 0.1427 | 67.0 | 201 | 0.4140 | 0.3450 | 0.1103 | 0.3193 | 0.3301 |
+| 0.1494 | 68.0 | 204 | 0.4163 | 0.3421 | 0.1090 | 0.3198 | 0.3276 |
+| 0.1493 | 69.0 | 207 | 0.4137 | 0.3481 | 0.1101 | 0.3230 | 0.3318 |
+| 0.14 | 70.0 | 210 | 0.4107 | 0.3438 | 0.1083 | 0.3193 | 0.3277 |
+| 0.1338 | 71.0 | 213 | 0.4107 | 0.3432 | 0.1068 | 0.3199 | 0.3270 |
+| 0.1302 | 72.0 | 216 | 0.4134 | 0.3573 | 0.1097 | 0.3317 | 0.3428 |
+| 0.1354 | 73.0 | 219 | 0.4162 | 0.3525 | 0.1092 | 0.3270 | 0.3376 |
+| 0.1379 | 74.0 | 222 | 0.4193 | 0.3402 | 0.1069 | 0.3177 | 0.3249 |
+| 0.1272 | 75.0 | 225 | 0.4233 | 0.3397 | 0.1059 | 0.3173 | 0.3244 |
+| 0.1331 | 76.0 | 228 | 0.4248 | 0.3364 | 0.1021 | 0.3149 | 0.3223 |
+| 0.1211 | 77.0 | 231 | 0.4258 | 0.3459 | 0.1076 | 0.3235 | 0.3312 |
+| 0.1324 | 78.0 | 234 | 0.4267 | 0.3488 | 0.1066 | 0.3257 | 0.3335 |
+| 0.1275 | 79.0 | 237 | 0.4272 | 0.3458 | 0.1165 | 0.3201 | 0.3301 |
+| 0.1265 | 80.0 | 240 | 0.4279 | 0.3519 | 0.1188 | 0.3288 | 0.3366 |
+| 0.1227 | 81.0 | 243 | 0.4293 | 0.3458 | 0.1093 | 0.3261 | 0.3317 |
+| 0.1213 | 82.0 | 246 | 0.4323 | 0.3437 | 0.1051 | 0.3189 | 0.3288 |
+| 0.1275 | 83.0 | 249 | 0.4347 | 0.3457 | 0.1065 | 0.3212 | 0.3318 |
+| 0.1233 | 84.0 | 252 | 0.4346 | 0.3491 | 0.1048 | 0.3235 | 0.3337 |
+| 0.1168 | 85.0 | 255 | 0.4349 | 0.3450 | 0.1035 | 0.3208 | 0.3314 |
+| 0.1184 | 86.0 | 258 | 0.4347 | 0.3480 | 0.1050 | 0.3255 | 0.3336 |
+| 0.1246 | 87.0 | 261 | 0.4336 | 0.3483 | 0.1058 | 0.3272 | 0.3347 |
+| 0.1167 | 88.0 | 264 | 0.4333 | 0.3470 | 0.1065 | 0.3269 | 0.3343 |
+| 0.1203 | 89.0 | 267 | 0.4334 | 0.3494 | 0.1112 | 0.3278 | 0.3351 |
+| 0.1139 | 90.0 | 270 | 0.4339 | 0.3460 | 0.1114 | 0.3253 | 0.3314 |
+| 0.1202 | 91.0 | 273 | 0.4341 | 0.3497 | 0.1103 | 0.3252 | 0.3352 |
+| 0.1174 | 92.0 | 276 | 0.4344 | 0.3497 | 0.1103 | 0.3252 | 0.3352 |
+| 0.1164 | 93.0 | 279 | 0.4350 | 0.3504 | 0.1099 | 0.3249 | 0.3365 |
+| 0.1114 | 94.0 | 282 | 0.4357 | 0.3445 | 0.1073 | 0.3188 | 0.3299 |
+| 0.1094 | 95.0 | 285 | 0.4368 | 0.3455 | 0.1076 | 0.3197 | 0.3308 |
+| 0.114 | 96.0 | 288 | 0.4376 | 0.3483 | 0.1105 | 0.3236 | 0.3336 |
+| 0.1147 | 97.0 | 291 | 0.4381 | 0.3458 | 0.1099 | 0.3207 | 0.3303 |
+| 0.116 | 98.0 | 294 | 0.4386 | 0.3458 | 0.1099 | 0.3207 | 0.3303 |
+| 0.1187 | 99.0 | 297 | 0.4393 | 0.3499 | 0.1100 | 0.3234 | 0.3341 |
+| 0.1112 | 100.0 | 300 | 0.4399 | 0.3519 | 0.1146 | 0.3260 | 0.3368 |
+| 0.1124 | 101.0 | 303 | 0.4404 | 0.3519 | 0.1146 | 0.3260 | 0.3368 |
+| 0.117 | 102.0 | 306 | 0.4408 | 0.3489 | 0.1081 | 0.3225 | 0.3335 |
+| 0.1101 | 103.0 | 309 | 0.4412 | 0.3489 | 0.1081 | 0.3225 | 0.3335 |
+| 0.1135 | 104.0 | 312 | 0.4415 | 0.3472 | 0.1075 | 0.3208 | 0.3311 |
+| 0.1141 | 105.0 | 315 | 0.4416 | 0.3489 | 0.1081 | 0.3225 | 0.3335 |
+| 0.1201 | 106.0 | 318 | 0.4416 | 0.3489 | 0.1081 | 0.3225 | 0.3335 |
+| 0.2258 | 106.8 | 320 | 0.4416 | 0.3489 | 0.1081 | 0.3225 | 0.3335 |
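The Rouge1/Rouge2/Rougel/Rougelsum columns above are F1-style n-gram overlap scores between generated and reference summaries. A toy unigram ROUGE-1 F1 computation for intuition (illustrative only; the card's numbers come from the standard `rouge_score` implementation, which additionally handles stemming and the longest-common-subsequence alignment used by ROUGE-L):

```python
from collections import Counter

def rouge1_f1(prediction: str, reference: str) -> float:
    """Toy unigram-overlap F1 (ROUGE-1 style), case-insensitive."""
    pred = Counter(prediction.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((pred & ref).values())   # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(pred.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

score = rouge1_f1("the cat sat", "the cat sat down")
print(round(score, 4))  # 0.8571
```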

 ### Framework versions
model.safetensors CHANGED

@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:5b743102833d290f35e9f7b0131bc84fdb434945088cafcd15157ad61bab7603
 size 307867048
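The model.safetensors and training_args.bin entries in this commit are Git LFS pointer files: three "key value" lines (spec version, sha256 object id, byte size) are stored in git in place of the binary itself, which is why each file diff is only three lines. A minimal parser sketch (`parse_lfs_pointer` is a hypothetical helper, not part of any library):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file: one 'key value' pair per line."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = parse_lfs_pointer(
    "version https://git-lfs.github.com/spec/v1\n"
    "oid sha256:5b743102833d290f35e9f7b0131bc84fdb434945088cafcd15157ad61bab7603\n"
    "size 307867048\n"
)
print(pointer["size"])  # 307867048 bytes, the actual weight file size
```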
training_args.bin CHANGED

@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:bb1acb0dfc573cb37c5e73de80b48f409ce5379778d9a002f0614d2aa3e5da8a
 size 5496