Dataset: nyu-mll/glue
How to use JeffreyWong/roberta-base-relu-mrpc with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="JeffreyWong/roberta-base-relu-mrpc")
```

```python
# Load the model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("JeffreyWong/roberta-base-relu-mrpc")
model = AutoModelForSequenceClassification.from_pretrained("JeffreyWong/roberta-base-relu-mrpc")
```

This model is a fine-tuned version of JeremiahZ/roberta-base-mrpc on the GLUE MRPC dataset. It achieves the following results on the evaluation set:
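Since MRPC is a sentence-pair paraphrase task, inference with the directly loaded model takes two sentences. A minimal sketch (the example sentences below are illustrative, not from the dataset):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "JeffreyWong/roberta-base-relu-mrpc"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# MRPC asks whether two sentences are paraphrases of each other
sent1 = "The company posted strong quarterly earnings."  # illustrative example
sent2 = "Quarterly profits at the firm were robust."     # illustrative example

# Passing two strings encodes them as a single sentence pair
inputs = tokenizer(sent1, sent2, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring class index back to its label name
pred = logits.argmax(dim=-1).item()
print(model.config.id2label[pred])
```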
The following hyperparameters were used during training:
The best model was selected based on the highest accuracy, which is the key evaluation metric for this task.