Dataset: Gabriel/citesum_swe
How to use Gabriel/bart-base-cnn-xsum-cite-swe with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("summarization", model="Gabriel/bart-base-cnn-xsum-cite-swe")
```
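A minimal usage sketch for the pipeline route, assuming a summarization call with standard generation parameters; the Swedish input text is an illustrative placeholder, not from the model card:

```python
# Run the summarization pipeline on a short Swedish passage.
# The example text below is a hypothetical placeholder.
from transformers import pipeline

pipe = pipeline("summarization", model="Gabriel/bart-base-cnn-xsum-cite-swe")

text = (
    "Stockholm är Sveriges huvudstad och landets största stad. "
    "Staden ligger vid Mälarens utlopp i Östersjön och är uppdelad "
    "på flera öar, vilket har gett den smeknamnet Nordens Venedig."
)
result = pipe(text, max_length=60, min_length=10, do_sample=False)
print(result[0]["summary_text"])
```

The pipeline handles tokenization, generation, and decoding in one call; `max_length` and `min_length` here are assumed defaults, not values taken from the card.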
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Gabriel/bart-base-cnn-xsum-cite-swe")
model = AutoModelForSeq2SeqLM.from_pretrained("Gabriel/bart-base-cnn-xsum-cite-swe")
```

This model is a fine-tuned version of Gabriel/bart-base-cnn-xsum-swe on the Gabriel/citesum_swe dataset. It achieves the following results on the evaluation set:
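When loading the model directly, generation has to be done explicitly. A sketch of that flow, assuming beam search with commonly used parameters (the Swedish input text is a placeholder, not from the model card):

```python
# Tokenize Swedish input, generate a summary with beam search, and decode it.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Gabriel/bart-base-cnn-xsum-cite-swe")
model = AutoModelForSeq2SeqLM.from_pretrained("Gabriel/bart-base-cnn-xsum-cite-swe")

# Hypothetical example text; replace with your own article.
text = (
    "Stockholm är Sveriges huvudstad och landets största stad. "
    "Staden ligger vid Mälarens utlopp i Östersjön och är uppdelad "
    "på flera öar, vilket har gett den smeknamnet Nordens Venedig."
)

inputs = tokenizer(text, max_length=1024, truncation=True, return_tensors="pt")
summary_ids = model.generate(
    inputs["input_ids"],
    num_beams=4,        # assumed default, not specified in the card
    max_length=60,
    early_stopping=True,
)
summary = tokenizer.decode(summary_ids[0], skip_special_tokens=True)
print(summary)
```

The reported Gen Len of ~20 tokens suggests short outputs are typical, so a `max_length` around 60 leaves ample headroom.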
More information needed
Training results:
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | RougeL | RougeLsum | Gen Len |
|---|---|---|---|---|---|---|---|---|
| 2.4833 | 1.0 | 2558 | 2.4203 | 29.6279 | 11.5697 | 24.2429 | 24.4557 | 19.9371 |