How to use bogdancazan/prophetnet_summarization_pretrained with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("bogdancazan/prophetnet_summarization_pretrained")
model = AutoModelForSeq2SeqLM.from_pretrained("bogdancazan/prophetnet_summarization_pretrained")
```

This model is a fine-tuned version of microsoft/prophetnet-large-uncased on the billsum dataset. It achieves the following results on the evaluation set:
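The loading snippet above can be extended into an end-to-end summarization call. This is a minimal sketch, assuming the checkpoint loads as shown; the bill text is an invented placeholder, and the generation settings (`num_beams=4`, `max_length=140`) are illustrative assumptions, the latter chosen to roughly match the Gen Len column in the results table.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

checkpoint = "bogdancazan/prophetnet_summarization_pretrained"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

# Placeholder input; any billsum-style legislative text works here.
bill_text = (
    "SECTION 1. SHORT TITLE. This Act may be cited as the Example Act. "
    "SEC. 2. FINDINGS. Congress finds that further study is required."
)

# Truncate long bills to the encoder's input limit.
inputs = tokenizer(bill_text, return_tensors="pt", truncation=True, max_length=512)

# Beam search with an assumed max_length of ~140 tokens (see Gen Len above).
summary_ids = model.generate(**inputs, num_beams=4, max_length=140, early_stopping=True)
summary = tokenizer.decode(summary_ids[0], skip_special_tokens=True)
print(summary)
```

The same checkpoint name also works with `pipeline("summarization", model=checkpoint)` if you prefer the higher-level API.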
More information needed
The following results were obtained during training:
| Training Loss | Epoch | Step | Validation Loss | Rouge-1 | Rouge-2 | Rouge-L | Rouge-Lsum | Gen Len |
|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 124 | 2.5178 | 0.4894 | 0.2223 | 0.2903 | 0.2903 | 139.8105 |
| No log | 2.0 | 248 | 2.4170 | 0.4973 | 0.2279 | 0.2975 | 0.2970 | 140.6492 |
| No log | 3.0 | 372 | 2.3895 | 0.4964 | 0.2282 | 0.2984 | 0.2981 | 138.5323 |
| No log | 4.0 | 496 | 2.3683 | 0.4982 | 0.2267 | 0.2983 | 0.2985 | 139.3831 |