How to use namanpundir/theus_concepttagger with Transformers:

```python
# Load the tokenizer and seq2seq model directly from the Hugging Face Hub
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("namanpundir/theus_concepttagger")
model = AutoModelForSeq2SeqLM.from_pretrained("namanpundir/theus_concepttagger")
```

This model is a fine-tuned version of facebook/bart-large-cnn on the xsum dataset. It achieves the following results on the evaluation set:

- Loss: 1.6249
- Rouge1: 34.8663
- Rouge2: 15.1526
- Rougel: 26.1224
- Rougelsum: 26.5164
- Gen Len: 62.4475
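Once loaded, the checkpoint can be used for end-to-end summarization. The sketch below is illustrative and not part of the card: the `summarize` helper and its generation settings (beam search, length limits) are assumptions chosen as reasonable defaults for a BART-based summarizer, and running it downloads the checkpoint from the Hub.

```python
# Illustrative sketch (not from the model card): generate a summary with the
# fine-tuned checkpoint. The helper name and generation settings are assumed
# defaults, not values documented by the author.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM


def summarize(text: str, model_name: str = "namanpundir/theus_concepttagger") -> str:
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForSeq2SeqLM.from_pretrained(model_name)
    # Truncate long articles to BART's 1024-token context window
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=1024)
    # Beam search with a length cap in the region of the reported Gen Len
    ids = model.generate(**inputs, num_beams=4, max_length=64)
    return tokenizer.decode(ids[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(summarize("The government announced a new infrastructure plan on Monday, "
                    "pledging investment in rail links across the north of England."))
```

Note that xsum summaries are single-sentence and highly abstractive, so outputs tend to be short rephrasings rather than extracted sentences.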
Model description: More information needed.

Intended uses & limitations: More information needed.

Training and evaluation data: More information needed.
The following hyperparameters were used during training: More information needed.

Training results:
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|---|---|---|---|---|---|---|---|---|
| 1.4096 | 1.0 | 12753 | 1.6249 | 34.8663 | 15.1526 | 26.1224 | 26.5164 | 62.4475 |
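The Rouge columns in the table above measure n-gram overlap between generated and reference summaries. As a minimal sketch of what the Rouge1 score captures, the function below computes unigram-overlap F1; this is a simplified illustration, not the stemmed and tokenized scorer used to produce the table (the real values are reported on a 0-100 scale).

```python
# Simplified illustration of Rouge1: F1 over unigram overlap between a
# candidate summary and a reference. Not the official rouge_score package.
from collections import Counter


def rouge1_f1(candidate: str, reference: str) -> float:
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)


print(round(rouge1_f1("the cat sat on the mat", "the cat lay on the mat"), 4))  # → 0.8333
```

Rouge2 applies the same idea to bigrams, and RougeL scores the longest common subsequence instead of fixed n-grams.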