How to use joey234/cuenb-mnli with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="joey234/cuenb-mnli")
```

```python
# Load the tokenizer and model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("joey234/cuenb-mnli")
model = AutoModelForSequenceClassification.from_pretrained("joey234/cuenb-mnli")
```

This model is a fine-tuned version of cuenb on the GLUE MNLI dataset. It achieves the results reported in the training table below.
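Once the model produces logits for a premise/hypothesis pair, they map to MNLI's three classes via a softmax. A minimal sketch of that post-processing step, using hypothetical logit values and assuming the standard MNLI label order (verify against the model's `config.id2label` before relying on it):

```python
import math

# Hypothetical raw logits for one premise/hypothesis pair; real values come
# from model(**tokenizer(premise, hypothesis, return_tensors="pt")).logits
logits = [2.1, -0.3, -1.4]

# Softmax converts raw logits into class probabilities
exps = [math.exp(x) for x in logits]
probs = [e / sum(exps) for e in exps]

# Assumed label order -- check the model's config.id2label mapping
labels = ["entailment", "neutral", "contradiction"]
pred = labels[probs.index(max(probs))]
print(pred)  # -> entailment
```

The `pipeline` helper performs this softmax-and-argmax step internally; the manual version is only needed when working with the model's raw outputs.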
Model description, intended uses and limitations, and training and evaluation data: more information needed.
Training results (loss and validation accuracy by step; the training hyperparameters are not listed):
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|---|---|---|---|---|
| 0.5569 | 0.41 | 5000 | 0.4415 | 0.8273 |
| 0.4598 | 0.81 | 10000 | 0.4234 | 0.8425 |
| 0.3832 | 1.22 | 15000 | 0.4398 | 0.8475 |
| 0.3314 | 1.63 | 20000 | 0.4137 | 0.8494 |
| 0.3158 | 2.04 | 25000 | 0.4484 | 0.8527 |
| 0.2294 | 2.44 | 30000 | 0.4471 | 0.8552 |
| 0.2283 | 2.85 | 35000 | 0.4541 | 0.8557 |
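Validation accuracy improves monotonically across checkpoints even as validation loss drifts upward after epoch 2, so which checkpoint is "best" depends on the metric used. A small sketch of accuracy-based checkpoint selection over the values transcribed from the table above:

```python
# Validation accuracy per training step, transcribed from the table above
accuracy_by_step = {
    5000: 0.8273, 10000: 0.8425, 15000: 0.8475, 20000: 0.8494,
    25000: 0.8527, 30000: 0.8552, 35000: 0.8557,
}

# Pick the checkpoint with the highest validation accuracy
best_step = max(accuracy_by_step, key=accuracy_by_step.get)
print(best_step, accuracy_by_step[best_step])  # -> 35000 0.8557
```

Selecting on validation loss instead would favor the step-20000 checkpoint (loss 0.4137), illustrating the accuracy/loss divergence visible in the table.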