|
---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- rouge
model-index:
- name: bart-large-finetuned-bart
  results: []
---

# bart-large-finetuned-bart

This model is a fine-tuned version of [facebook/bart-large](https://huggingface.co/facebook/bart-large) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8644
- Rouge1: 80.883
- Rouge2: 72.0268
- Rougel: 77.0146
- Rougelsum: 77.3408
- Gen Len: 19.2969

## Model description
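The checkpoint can be used for summarization through the `transformers` pipeline API. A minimal sketch, with one assumption: the bare name `bart-large-finetuned-bart` from this card is used as a placeholder model id — replace it with the checkpoint's actual Hub path (including the user/organization prefix) or a local directory.

```python
def summarize(text: str, model_id: str = "bart-large-finetuned-bart") -> str:
    """Summarize `text` with the fine-tuned BART checkpoint.

    `model_id` is a placeholder; point it at the real Hub repository
    or a local directory containing this checkpoint.
    """
    # Deferred import so the helper can be defined without transformers installed.
    from transformers import pipeline

    summarizer = pipeline("summarization", model=model_id)
    # Gen Len in the results above averages ~19 tokens, so short outputs
    # are expected; max_length caps the generated sequence length.
    output = summarizer(text, max_length=60, min_length=5)
    return output[0]["summary_text"]
```

Usage: `summarize("Long input document ...")` downloads the checkpoint on first call and returns the generated summary string.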
### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
| No log | 1.0 | 12 | 0.8158 | 70.5188 | 55.0313 | 61.668 | 62.0697 | 18.8281 |
| No log | 2.0 | 24 | 0.6610 | 78.9718 | 67.9238 | 71.2508 | 71.5416 | 19.2188 |
| No log | 3.0 | 36 | 0.6257 | 79.2603 | 69.4298 | 73.031 | 73.4136 | 19.1875 |
| No log | 4.0 | 48 | 0.6354 | 79.0425 | 69.7772 | 73.6893 | 73.8356 | 19.3125 |
| No log | 5.0 | 60 | 0.6126 | 79.9003 | 69.6148 | 73.3622 | 73.7154 | 19.2656 |
| No log | 6.0 | 72 | 0.6537 | 80.0209 | 69.9125 | 73.6876 | 73.9664 | 19.1875 |
| No log | 7.0 | 84 | 0.7168 | 80.7559 | 71.8251 | 75.5574 | 75.6627 | 19.2031 |
| No log | 8.0 | 96 | 0.6980 | 80.9116 | 72.2951 | 75.9015 | 76.2205 | 19.2656 |
| No log | 9.0 | 108 | 0.7569 | 80.1034 | 70.8352 | 75.1723 | 75.4102 | 19.25 |
| No log | 10.0 | 120 | 0.7523 | 80.0436 | 71.0855 | 75.8337 | 76.1002 | 19.3281 |
| No log | 11.0 | 132 | 0.7742 | 80.5982 | 71.713 | 75.9081 | 76.1564 | 19.4375 |
| No log | 12.0 | 144 | 0.7570 | 79.4243 | 70.5775 | 75.3521 | 75.9334 | 19.4844 |
| No log | 13.0 | 156 | 0.8225 | 80.2529 | 72.6598 | 76.6931 | 76.8326 | 19.7344 |
| No log | 14.0 | 168 | 0.8696 | 79.821 | 71.2369 | 75.836 | 76.0734 | 19.4688 |
| No log | 15.0 | 180 | 0.8820 | 80.9234 | 72.5022 | 76.7848 | 77.0368 | 19.375 |
| No log | 16.0 | 192 | 0.8400 | 80.5926 | 72.0275 | 76.8664 | 77.0773 | 19.2969 |
| No log | 17.0 | 204 | 0.8461 | 80.6004 | 72.2808 | 76.8693 | 77.0517 | 19.375 |
| No log | 18.0 | 216 | 0.8577 | 81.0069 | 73.1954 | 77.467 | 77.7083 | 19.3906 |
| No log | 19.0 | 228 | 0.8695 | 81.0333 | 72.5324 | 77.4248 | 77.5971 | 19.2969 |
| No log | 20.0 | 240 | 0.8644 | 80.883 | 72.0268 | 77.0146 | 77.3408 | 19.2969 |
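For readers unfamiliar with the Rouge1 column: it is the F1 score of unigram overlap between the generated and reference summaries, scaled to 0–100. The scores in this card were presumably computed with the `rouge_score`/`evaluate` libraries (which also apply stemming and longest-common-subsequence variants for RougeL); the sketch below keeps only the core overlap idea as an illustration.

```python
from collections import Counter

def rouge1_f1(prediction: str, reference: str) -> float:
    """F1 of unigram overlap between a predicted and a reference summary."""
    pred = Counter(prediction.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((pred & ref).values())  # clipped count of matching unigrams
    if overlap == 0:
        return 0.0
    precision = overlap / sum(pred.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

# 5 of 6 unigrams match in each direction, so precision = recall = 5/6.
print(round(100 * rouge1_f1("the cat sat on the mat",
                            "the cat is on the mat"), 2))  # → 83.33
```

Library-computed scores differ slightly because of stemming and tokenization, but the same precision/recall/F1 structure underlies the Rouge2 (bigram) and RougeL (longest common subsequence) columns as well.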

### Framework versions