---
license: apache-2.0
language:
- en
---
# ReBART with Reverse Positional Embeddings
This is a custom BART model fine-tuned on the CNN/DailyMail dataset with reverse positional embeddings to better control the length of generated summaries.
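The name refers to positional embeddings indexed from the end of the sequence rather than the beginning, so at every decoding step the model knows how many tokens remain before the target length is reached. A minimal sketch of that countdown indexing (the function name and exact offset scheme are illustrative assumptions, not ReBART's actual modeling code):

```python
def reverse_position_ids(target_length: int, num_steps: int) -> list[int]:
    """Assign each decoding step a position counting down from the
    desired output length (hypothetical sketch, not ReBART's code)."""
    return [target_length - t for t in range(num_steps)]

# With a target length of 8, the first five tokens see positions 8..4,
# letting the model anticipate how close the summary is to its end.
print(reverse_position_ids(8, 5))  # [8, 7, 6, 5, 4]
```

Because the position of the final token is always the same (here, 1), the embedding table learns a consistent "end of summary" signal, which is what gives the length control described above.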
## How to use
```python
from transformers import AutoTokenizer
from ReBART.modeling_rebart import ReBartForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("Ivanhoe9/rpebart-large-cnn")
# Load the weights into the custom architecture (the standard
# from_pretrained interface is assumed to be supported here).
model = ReBartForConditionalGeneration.from_pretrained("Ivanhoe9/rpebart-large-cnn")
```