# ReBART with Reverse Positional Embeddings
This is a custom BART model fine-tuned on the CNN/DailyMail dataset. It uses reverse positional embeddings, where positions are counted from the end of the sequence rather than the start, to give the model an explicit signal of how many tokens remain and thereby better control the length of generated summaries.
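The card does not show the implementation, but a common way to realize this idea is to build position ids that count down to zero at the final token. A minimal sketch under that assumption (the helper `reverse_position_ids` is hypothetical, not part of the released code):

```python
import torch

def reverse_position_ids(input_ids: torch.Tensor, pad_token_id: int) -> torch.Tensor:
    """Assign position n-1-i to token i of a length-n sequence, so the
    positional embedding at each step encodes how many tokens remain."""
    mask = input_ids.ne(pad_token_id).long()   # 1 for real tokens, 0 for padding
    lengths = mask.sum(dim=-1, keepdim=True)   # true length of each sequence
    forward = mask.cumsum(dim=-1) - 1          # 0, 1, 2, ... over real tokens
    return (lengths - 1 - forward).clamp(min=0) * mask

# Example: one sequence of length 4 followed by padding (pad id 1, as in BART)
ids = torch.tensor([[0, 42, 43, 2, 1, 1]])
print(reverse_position_ids(ids, pad_token_id=1))  # tensor([[3, 2, 1, 0, 0, 0]])
```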
## How to use
```python
from transformers import AutoTokenizer
from ReBART.modeling_rebart import ReBartForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("Ivanhoe9/rpebart-large-cnn")
# Load the weights into the custom model class shipped with the repo
model = ReBartForConditionalGeneration.from_pretrained("Ivanhoe9/rpebart-large-cnn")
```
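Assuming `ReBartForConditionalGeneration` follows the standard `transformers` generation API (the custom class is not shown on this card), summarization should then work as with any BART checkpoint:

```python
article = "The quick brown fox jumped over the lazy dog. " * 20  # placeholder input

inputs = tokenizer(article, return_tensors="pt", truncation=True, max_length=1024)
summary_ids = model.generate(**inputs, num_beams=4, max_new_tokens=60)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```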
## Model details

- **License:** apache-2.0
- **Language:** English
- **Tags:** summarization, bart, custom, control-length
- **Dataset:** cnn_dailymail
- **Base model:** facebook/bart-large