prajjwal1/roberta-base-mnli can be used with the Transformers library, via inference providers, in notebooks, or in local apps.
How to use prajjwal1/roberta-base-mnli with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="prajjwal1/roberta-base-mnli")

# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("prajjwal1/roberta-base-mnli")
model = AutoModelForSequenceClassification.from_pretrained("prajjwal1/roberta-base-mnli")
```
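For NLI, the model expects a premise/hypothesis sentence pair. A minimal sketch of pair classification with the directly loaded model is below; the example sentences are made up, and because the repo card is missing its YAML metadata, `config.id2label` may only return generic names like `LABEL_0` rather than `entailment`/`neutral`/`contradiction`:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("prajjwal1/roberta-base-mnli")
model = AutoModelForSequenceClassification.from_pretrained("prajjwal1/roberta-base-mnli")

# Hypothetical premise/hypothesis pair for illustration
premise = "A man is playing a guitar on stage."
hypothesis = "A person is performing music."

# Encode the two sentences as a single sequence pair
inputs = tokenizer(premise, hypothesis, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(dim=-1).item()
print(model.config.id2label[pred])
```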
RoBERTa-base fine-tuned on MNLI.
| Task | Accuracy (%) |
|---|---|
| MNLI (matched) | 86.32 |
| MNLI (mismatched) | 86.43 |
You can also check out:
- prajjwal1/roberta-base-mnli
- prajjwal1/roberta-large-mnli
- prajjwal1/albert-base-v2-mnli
- prajjwal1/albert-base-v1-mnli
- prajjwal1/albert-large-v2-mnli