How to use matekadlicsko/distilroberta-nli with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("zero-shot-classification", model="matekadlicsko/distilroberta-nli")
```

```python
# Load the model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("matekadlicsko/distilroberta-nli")
model = AutoModelForSequenceClassification.from_pretrained("matekadlicsko/distilroberta-nli")
```

This model can be used for Natural Language Inference (NLI) tasks. It is a version of roberta-base fine-tuned on multi_nli and the English portion of xnli.
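When the model is loaded directly, `AutoModelForSequenceClassification` returns raw logits over the three NLI labels, and a softmax turns them into probabilities. A minimal sketch with made-up logits, assuming the common contradiction/neutral/entailment label order (verify against `model.config.id2label` for this checkpoint):

```python
import math

# Hypothetical logits for one premise/hypothesis pair
# (assumed label order: contradiction, neutral, entailment --
# check model.config.id2label before relying on the indices)
logits = [-2.1, 0.3, 3.4]

# Softmax: exponentiate and normalize so the scores sum to 1
exps = [math.exp(x) for x in logits]
total = sum(exps)
probs = [e / total for e in exps]

# The highest-probability index is the predicted relation
pred = max(range(len(probs)), key=probs.__getitem__)
```

The zero-shot pipeline applies the same idea under the hood: it phrases each candidate label as a hypothesis and ranks labels by their entailment probability.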
The model's performance on NLI tasks is as follows: