How to use Riad/finetuned-bert-mrpc with Transformers:

Use a pipeline as a high-level helper:

```python
from transformers import pipeline

pipe = pipeline("text-classification", model="Riad/finetuned-bert-mrpc")
```

Or load the tokenizer and model directly:

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("Riad/finetuned-bert-mrpc")
model = AutoModelForSequenceClassification.from_pretrained("Riad/finetuned-bert-mrpc")
```

This model is a fine-tuned version of bert-base-cased on the GLUE MRPC dataset. Its results on the evaluation set are shown in the training results table below.
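MRPC is a sentence-pair paraphrase task, so inference should pass both sentences together so BERT encodes them as one pair. A minimal sketch using the tokenizer and model loaded above; the example sentences are invented, and the label names depend on the checkpoint's config (it may only expose the generic LABEL_0/LABEL_1):

```python
import torch

# Invented example pair; the model scores whether the two sentences are paraphrases.
sentence1 = "The company said the deal was signed on Monday."
sentence2 = "According to the company, the deal was signed on Monday."

# Encode both sentences as a single input pair (segment A / segment B).
inputs = tokenizer(sentence1, sentence2, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(dim=-1).item()
# In the GLUE/MRPC convention, index 1 means "paraphrase"; the config's
# id2label mapping may still be the generic LABEL_0/LABEL_1.
print(pred, model.config.id2label[pred])
```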
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|---|---|---|---|---|---|
| 0.5454 | 1.0 | 230 | 0.4396 | 0.8309 | 0.8871 |
| 0.3387 | 2.0 | 460 | 0.3783 | 0.8529 | 0.8976 |
| 0.1956 | 3.0 | 690 | 0.4382 | 0.8676 | 0.9085 |
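The exact hyperparameters are not listed above, but the table's 230 steps per epoch are consistent with MRPC's 3,668 training pairs at a batch size of 16 (3,668 / 16 ≈ 230). Below is a minimal reproduction sketch under that assumption, with a typical BERT fine-tuning learning rate of 2e-5; treat both values as guesses, not the card's recorded settings:

```python
import numpy as np
import evaluate
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

raw = load_dataset("glue", "mrpc")
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")

def tokenize(batch):
    # MRPC examples are sentence pairs; encode them jointly.
    return tokenizer(batch["sentence1"], batch["sentence2"], truncation=True)

encoded = raw.map(tokenize, batched=True)
model = AutoModelForSequenceClassification.from_pretrained("bert-base-cased", num_labels=2)

metric = evaluate.load("glue", "mrpc")

def compute_metrics(eval_pred):
    # Reports the same accuracy and F1 columns as the table above.
    logits, labels = eval_pred
    return metric.compute(predictions=np.argmax(logits, axis=-1), references=labels)

args = TrainingArguments(
    output_dir="finetuned-bert-mrpc",
    num_train_epochs=3,              # 3 epochs, as in the table
    per_device_train_batch_size=16,  # assumed: 3,668 examples / 16 ≈ 230 steps per epoch
    learning_rate=2e-5,              # assumed typical value, not the recorded setting
    evaluation_strategy="epoch",     # evaluate once per epoch, as the table suggests
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded["train"],
    eval_dataset=encoded["validation"],
    tokenizer=tokenizer,             # enables dynamic padding via the default collator
    compute_metrics=compute_metrics,
)
trainer.train()
```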