Instructions for using facebook/bart-large-mnli with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use facebook/bart-large-mnli with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("facebook/bart-large-mnli")
model = AutoModelForSequenceClassification.from_pretrained("facebook/bart-large-mnli")
```

- Inference
- Notebooks
- Google Colab
- Kaggle
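As a sketch of what a pipeline call returns for this model (the input sentence and candidate labels below are illustrative, not part of the page):

```python
from transformers import pipeline

# Zero-shot classification: the model scores each candidate label by posing it
# as an NLI hypothesis ("This example is {label}.") against the input sequence.
pipe = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

out = pipe(
    "one day I will see the world",
    candidate_labels=["travel", "cooking", "dancing"],  # illustrative labels
)

# The result is a dict with the input sequence, the labels sorted by score,
# and the matching probabilities (summing to 1 in single-label mode).
print(out["labels"][0], round(out["scores"][0], 3))
```

In single-label mode the entailment scores are softmaxed across all candidate labels; pass `multi_label=True` to score each label independently instead.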
Fix broken link to multi_nli dataset
#48
by brunoalr - opened
README.md CHANGED

```diff
@@ -8,7 +8,7 @@ datasets:
 
 # bart-large-mnli
 
-This is the checkpoint for [bart-large](https://huggingface.co/facebook/bart-large) after being trained on the [MultiNLI (MNLI)](https://huggingface.co/datasets/multi_nli) dataset.
+This is the checkpoint for [bart-large](https://huggingface.co/facebook/bart-large) after being trained on the [MultiNLI (MNLI)](https://huggingface.co/datasets/nyu-mll/multi_nli) dataset.
 
 Additional information about this model:
 - The [bart-large](https://huggingface.co/facebook/bart-large) model page
```