How to use jamesdborin/Roberta-Large-MRPC with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="jamesdborin/Roberta-Large-MRPC")
```
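The pipeline call above can then be run on an input. Since MRPC is a sentence-pair (paraphrase) task, the pipeline should be given both sentences, e.g. as a `{"text": ..., "text_pair": ...}` dict. A minimal sketch, assuming the checkpoint follows the standard MRPC setup; the example sentences are illustrative and the exact label names depend on the checkpoint's config:

```python
# Sketch of sentence-pair inference via the pipeline.
# MRPC classifies whether two sentences are paraphrases, so we pass a pair.
# Label names (e.g. "LABEL_0"/"LABEL_1") come from the model config and
# are not guaranteed to be human-readable.
from transformers import pipeline

pipe = pipeline("text-classification", model="jamesdborin/Roberta-Large-MRPC")

result = pipe({
    "text": "The company said profits rose sharply.",
    "text_pair": "Profits increased significantly, the company reported.",
})
print(result)  # a dict (or one-element list) with "label" and "score" keys
```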
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("jamesdborin/Roberta-Large-MRPC")
model = AutoModelForSequenceClassification.from_pretrained("jamesdborin/Roberta-Large-MRPC")
```
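With the tokenizer and model loaded directly, inference is a tokenize-forward-softmax loop. A minimal sketch, assuming the checkpoint is fine-tuned for MRPC sentence pairs; the example sentences are illustrative and the label strings are read from `model.config.id2label` rather than assumed:

```python
# Sketch of manual sentence-pair inference with the directly loaded model.
# The tokenizer encodes the two sentences as one pair input; the predicted
# class index is mapped to a name via the checkpoint's id2label config.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

name = "jamesdborin/Roberta-Large-MRPC"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name)

inputs = tokenizer(
    "The company said profits rose sharply.",
    "Profits increased significantly, the company reported.",
    return_tensors="pt",
)
with torch.no_grad():
    logits = model(**inputs).logits

probs = logits.softmax(dim=-1).squeeze()
pred = int(probs.argmax())
print(model.config.id2label[pred], float(probs[pred]))
```

Running the model under `torch.no_grad()` skips gradient bookkeeping, which is the idiomatic choice for pure inference.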