---
license: apache-2.0
language:
- en
tags:
- summarization
- bart
- custom
- control-length
datasets:
- cnn_dailymail
base_model:
- facebook/bart-large
---

# ReBART with Reverse Positional Embeddings

This is a custom BART model fine-tuned on the CNN/DailyMail dataset. It uses reverse positional embeddings to give finer control over the length of generated summaries.
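The intuition can be sketched in a few lines (this is an illustration of the idea only, not the model's actual implementation; the helper name is hypothetical):

```python
def reverse_positions(target_length: int) -> list[int]:
    """Return position ids that count down to the end of the sequence."""
    # Standard positions count up from the start (0, 1, ..., n-1), so the
    # model has no direct signal for how far away the end is. Reversed
    # positions count down (n-1, ..., 1, 0): the embedding at each step
    # encodes how many tokens remain, which lets the decoder target a
    # summary of a chosen length.
    return list(range(target_length - 1, -1, -1))

print(reverse_positions(5))  # [4, 3, 2, 1, 0]
```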

## How to use

```python
from transformers import AutoTokenizer

# ReBartForConditionalGeneration is the custom model class shipped with this
# repository (ReBART/modeling_rebart.py); clone the repository so the module
# is importable before running this snippet.
from ReBART.modeling_rebart import ReBartForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("Ivanhoe9/rpebart-large-cnn")
model = ReBartForConditionalGeneration.from_pretrained("Ivanhoe9/rpebart-large-cnn")
```