# mbart_summarization_ilpost

This model is a fine-tuned version of [facebook/mbart-large-cc25](https://huggingface.co/facebook/mbart-large-cc25) on the IlPost dataset for abstractive summarization.

It achieves the following results:

- Loss: 2.3640
- Gen Len: 39.8843

## Usage

```python
from transformers import MBartTokenizer, MBartForConditionalGeneration

tokenizer = MBartTokenizer.from_pretrained("ARTeLab/mbart-summarization-ilpost")
model = MBartForConditionalGeneration.from_pretrained("ARTeLab/mbart-summarization-ilpost")
```