Dataset: FiscalNote/billsum
How to use bogdancazan/pegasus_summarization_pretrained with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("bogdancazan/pegasus_summarization_pretrained")
model = AutoModelForSeq2SeqLM.from_pretrained("bogdancazan/pegasus_summarization_pretrained")
```

This model is a fine-tuned version of google/pegasus-xsum on the billsum dataset. Its evaluation results are reported in the training table below.
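Once the tokenizer and model are loaded, summaries are produced with `model.generate()`. The sketch below wraps this in a hypothetical `summarize()` helper; the beam-search settings are illustrative defaults, not values published with this model (only the 512-token PEGASUS input limit and the roughly 62-token generation length reported in the table below are grounded in the card).

```python
def summarize(text: str,
              checkpoint: str = "bogdancazan/pegasus_summarization_pretrained") -> str:
    # Imported lazily so the helper can be defined without the model present.
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

    # PEGASUS-XSum accepts at most 512 input tokens, so long bills are truncated.
    inputs = tokenizer(text, truncation=True, max_length=512, return_tensors="pt")
    summary_ids = model.generate(
        **inputs,
        num_beams=4,        # illustrative choice, not from this card
        max_length=96,      # illustrative; the card reports Gen Len around 62 tokens
        early_stopping=True,
    )
    return tokenizer.decode(summary_ids[0], skip_special_tokens=True)
```

Calling `summarize(bill_text)` then returns a single decoded summary string.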
Model description: More information needed.
Intended uses and limitations: More information needed.
Training and evaluation data: More information needed.
Training results (per-epoch evaluation on the validation set):
| Training Loss | Epoch | Step | Validation Loss | ROUGE-1 | ROUGE-2 | ROUGE-L | ROUGE-Lsum | Gen Len |
|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 124 | 2.0226 | 0.3896 | 0.1882 | 0.2838 | 0.2839 | 61.5444 |
| No log | 2.0 | 248 | 1.9736 | 0.3991 | 0.1963 | 0.2910 | 0.2907 | 61.9194 |
| No log | 3.0 | 372 | 1.9542 | 0.3977 | 0.1960 | 0.2889 | 0.2885 | 61.9718 |
| No log | 4.0 | 496 | 1.9463 | 0.3979 | 0.1963 | 0.2889 | 0.2887 | 61.9919 |
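The ROUGE columns in the table are F1 scores over n-gram overlap between generated and reference summaries. The snippet below is a simplified from-scratch illustration of ROUGE-1 and ROUGE-2 on a toy pair (plain whitespace tokenization, no stemming), so its numbers only approximate what the `rouge_score` package used by Trainer-generated cards would report.

```python
from collections import Counter

def ngrams(tokens, n):
    """Multiset of n-grams in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def rouge_n_f1(candidate, reference, n):
    """F1 of n-gram overlap between a candidate and a reference summary."""
    cand = ngrams(candidate.split(), n)
    ref = ngrams(reference.split(), n)
    overlap = sum((cand & ref).values())  # clipped n-gram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

reference = "the bill amends the tax code"
candidate = "the bill changes the tax code"
print(round(rouge_n_f1(candidate, reference, 1), 4))  # 0.8333 (5 of 6 unigrams match)
print(round(rouge_n_f1(candidate, reference, 2), 4))  # 0.6 (3 of 5 bigrams match)
```

Scores in the table (e.g. ROUGE-1 near 0.40) are these F1 values averaged over the billsum validation set.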