---
license: apache-2.0
language:
  - en
tags:
  - summarization
  - bart
  - custom
  - control-length
datasets:
  - cnn_dailymail
base_model:
  - facebook/bart-large
---

# ReBART with Reverse Positional Embeddings

This is a custom BART model fine-tuned on the CNN/DailyMail dataset with reverse positional embeddings to better control the length of generated summaries.
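The card does not spell out the mechanism, but the intuition behind reverse positional embeddings can be sketched as follows. The `reverse_position_ids` helper below is illustrative only, not part of the released code: instead of numbering positions left to right as standard BART does, positions are indexed from the end of the sequence, so each embedding encodes how many tokens remain, an explicit length signal the model can condition on.

```python
def reverse_position_ids(seq_len: int) -> list[int]:
    """Illustrative helper (not part of this repository's code).

    Standard BART numbers positions 0, 1, 2, ... left to right.
    With reverse positional embeddings, each position instead encodes
    the number of tokens remaining before the end of the sequence,
    which gives the model a direct handle on output length.
    """
    return list(range(seq_len - 1, -1, -1))

# For a 5-token sequence, the first token "sees" that 4 tokens remain.
print(reverse_position_ids(5))  # [4, 3, 2, 1, 0]
```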

## How to use

```python
from transformers import AutoTokenizer
from ReBART.modeling_rebart import ReBartForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("Ivanhoe9/rpebart-large-cnn")
# Load the custom model class from the same checkpoint (assumes the
# repository's ReBART module is available on your Python path).
model = ReBartForConditionalGeneration.from_pretrained("Ivanhoe9/rpebart-large-cnn")
```
