Dataset used to train this model: nyu-mll/glue (MRPC).
How to use alup/bert-uncased-finetuned-mrpc with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="alup/bert-uncased-finetuned-mrpc")
```
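MRPC is a sentence-pair (paraphrase) task, so the pipeline needs both sentences. A minimal sketch with an invented sentence pair; the label names (`LABEL_0`/`LABEL_1` vs. human-readable) depend on this checkpoint's config:

```python
# MRPC is a paraphrase task: pass the sentence pair as a dict
result = pipe({
    "text": "The company said quarterly profits rose 10 percent.",
    "text_pair": "Quarterly profits at the company increased by 10 percent, it said.",
})
print(result)  # e.g. {'label': 'LABEL_1', 'score': 0.98}; LABEL_1 = paraphrase in GLUE's label order
```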
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("alup/bert-uncased-finetuned-mrpc")
model = AutoModelForSequenceClassification.from_pretrained("alup/bert-uncased-finetuned-mrpc")
```
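With the tokenizer and model loaded directly, scoring a pair is the usual tokenize → forward → softmax sequence. A minimal sketch; the sentences are invented, and index 1 is the paraphrase class only if the checkpoint keeps GLUE's label order:

```python
import torch

# Encode the sentence pair; BERT handles the [SEP]-joined pair format internally
inputs = tokenizer(
    "The company said quarterly profits rose 10 percent.",
    "Quarterly profits at the company increased by 10 percent, it said.",
    return_tensors="pt",
)
with torch.no_grad():
    logits = model(**inputs).logits
probs = logits.softmax(dim=-1)
print(probs, probs.argmax(dim=-1).item())  # index 1 = paraphrase, assuming GLUE label order
```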
This model is a fine-tuned version of bert-base-uncased on the GLUE MRPC dataset; its per-epoch results on the evaluation set appear in the training results table below.

Model description, intended uses & limitations, and training and evaluation data: more information needed.
Training results per epoch:
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|---|---|---|---|---|---|
| No log | 1.0 | 230 | 0.3924 | 0.8554 | 0.9015 |
| No log | 2.0 | 460 | 0.3575 | 0.8750 | 0.9128 |
| 0.3857 | 3.0 | 690 | 0.6265 | 0.8676 | 0.9094 |
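For reference, a sketch of how the accuracy and F1 columns could be reproduced on the GLUE MRPC validation split. This reconstruction assumes the `datasets` and `scikit-learn` packages, which the original card does not mention, and reuses the `tokenizer` and `model` loaded above:

```python
import torch
from datasets import load_dataset
from sklearn.metrics import accuracy_score, f1_score

val = load_dataset("nyu-mll/glue", "mrpc", split="validation")

preds = []
for start in range(0, len(val), 32):  # simple manual batching
    batch = val[start:start + 32]
    enc = tokenizer(batch["sentence1"], batch["sentence2"],
                    padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        preds.extend(model(**enc).logits.argmax(dim=-1).tolist())

print("accuracy:", accuracy_score(val["label"], preds))
print("f1:", f1_score(val["label"], preds))  # MRPC reports F1 of the paraphrase class
```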