How to use David-Xu/t5-small_arxiv_model with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("David-Xu/t5-small_arxiv_model")
model = AutoModelForSeq2SeqLM.from_pretrained("David-Xu/t5-small_arxiv_model")
```

This model is a fine-tuned version of google-t5/t5-small on the scientific_papers dataset. It achieves the following results on the evaluation set:
- Loss: 2.5070
- Rouge1: 0.1782
- Rouge2: 0.0681
- RougeL: 0.1422
- RougeLsum: 0.1423
- Gen Len: 19.0

Model description, intended uses and limitations, and training and evaluation data: more information needed.
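As a sketch of how this checkpoint might be used for summarization: the input text, the `summarize:` prefix (the T5 convention the base model was trained with), and the generation settings below are illustrative assumptions, not taken from the original card.

```python
# Summarization sketch: load the fine-tuned checkpoint and generate a short
# summary. The input text here is a made-up example; max_length=20 mirrors
# the ~19-token generation length reported in the results table.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("David-Xu/t5-small_arxiv_model")
model = AutoModelForSeq2SeqLM.from_pretrained("David-Xu/t5-small_arxiv_model")

text = ("summarize: We study the problem of summarizing long scientific "
        "documents and propose a sequence-to-sequence baseline trained on "
        "arXiv papers.")
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
summary_ids = model.generate(**inputs, max_length=20, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```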
Training results per epoch:
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | RougeL | RougeLsum | Gen Len |
|---|---|---|---|---|---|---|---|---|
| 2.7744 | 1.0 | 20303 | 2.5639 | 0.1793 | 0.0691 | 0.1438 | 0.1439 | 19.0 |
| 2.6041 | 2.0 | 40606 | 2.5171 | 0.1778 | 0.0677 | 0.142 | 0.142 | 19.0 |
| 2.5843 | 3.0 | 60909 | 2.5070 | 0.1782 | 0.0681 | 0.1422 | 0.1423 | 19.0 |
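The Rouge columns above measure n-gram overlap between generated and reference summaries. As a simplified illustration of the idea (not the `rouge_score` package actually used for evaluation), Rouge-1 F1 can be sketched as a unigram-overlap score:

```python
# Simplified Rouge-1 F1: harmonic mean of unigram precision and recall.
# Illustrative only; the real metric also applies stemming and tokenization rules.
from collections import Counter

def rouge1_f1(candidate: str, reference: str) -> float:
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

print(rouge1_f1("the model summarizes papers",
                "the model summarizes scientific papers"))  # ≈ 0.889
```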
Base model: google-t5/t5-small