Dataset: kmfoda/booksum
How to use pszemraj/pegasus-large-summary-explain with Transformers:
# Use a pipeline as a high-level helper
# Warning: Pipeline type "summarization" is no longer supported in transformers v5.
# You must load the model directly (see below) or downgrade to v4.x with:
# pip install "transformers<5.0.0"
from transformers import pipeline
pipe = pipeline("summarization", model="pszemraj/pegasus-large-summary-explain")

# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
tokenizer = AutoTokenizer.from_pretrained("pszemraj/pegasus-large-summary-explain")
model = AutoModelForSeq2SeqLM.from_pretrained("pszemraj/pegasus-large-summary-explain")

This model is a fine-tuned version of google/pegasus-large on the booksum dataset for four total epochs.
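Once the tokenizer and model are loaded, a summary can be produced with `model.generate`. Below is a minimal sketch; the input text and the generation settings (`max_length`, `num_beams`) are illustrative choices, not values from this model card:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "pszemraj/pegasus-large-summary-explain"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Illustrative input; truncate long documents to the model's input limit.
text = (
    "The quick brown fox jumps over the lazy dog. "
    "It then explains, at some length, why it chose to do so."
)
inputs = tokenizer(text, truncation=True, max_length=1024, return_tensors="pt")

# Beam-search decoding; these settings are example values, tune for your use case.
summary_ids = model.generate(**inputs, max_length=128, num_beams=4)
summary = tokenizer.decode(summary_ids[0], skip_special_tokens=True)
print(summary)
```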
It achieves the following results on the evaluation set: More information needed

A 1-epoch checkpoint can be found at pszemraj/pegasus-large-book-summary, which is where the second training session started from.
The following hyperparameters were used during training: