How to use manasuma/my_awesome_eli5_mlm_model with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="manasuma/my_awesome_eli5_mlm_model")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("manasuma/my_awesome_eli5_mlm_model")
model = AutoModelForMaskedLM.from_pretrained("manasuma/my_awesome_eli5_mlm_model")
```

This model is a fine-tuned version of distilroberta-base on the eli5_category dataset. It achieves the following results on the evaluation set:
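As a quick sanity check, the loaded pipeline can be called directly on a sentence containing the mask token (the example sentence below is my own; RoBERTa-family tokenizers such as distilroberta-base use `<mask>` as the mask token):

```python
from transformers import pipeline

pipe = pipeline("fill-mask", model="manasuma/my_awesome_eli5_mlm_model")

# RoBERTa-family tokenizers use "<mask>" as the mask placeholder.
preds = pipe("The sun is actually a giant <mask>.")

# Each prediction is a dict with the candidate token and its probability.
for p in preds:
    print(p["token_str"], round(p["score"], 4))
```

The pipeline returns the top candidate fills sorted by score, which is a convenient way to verify the fine-tuned weights load and produce sensible completions.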
- Loss: 2.0252

Model description: More information needed

Intended uses & limitations: More information needed

Training and evaluation data: More information needed
Training results:
| Training Loss | Epoch | Step | Validation Loss |
|---|---|---|---|
| 2.2369 | 1.0 | 1316 | 2.0908 |
| 2.1794 | 2.0 | 2632 | 2.0406 |
| 2.1437 | 3.0 | 3948 | 2.0252 |
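The validation loss in the table above is the token-level cross-entropy, so the corresponding perplexity is simply `exp(loss)` — a small sketch (the loss values are taken from the table; the conversion is standard, not something stated in the original card):

```python
import math

# Validation losses from the table above, one per epoch (1-3).
val_losses = [2.0908, 2.0406, 2.0252]

# Perplexity for a masked-LM is exp(cross-entropy loss).
perplexities = [math.exp(loss) for loss in val_losses]

for epoch, ppl in enumerate(perplexities, start=1):
    print(f"epoch {epoch}: perplexity ~= {ppl:.2f}")
```

The final-epoch validation loss of 2.0252 corresponds to a perplexity of roughly 7.58, and the monotonically decreasing losses show the model still improving slightly at epoch 3.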
Base model: distilbert/distilroberta-base