Trained on the will4381/textual-inference-with-confidence dataset, split into 40k training and 10k validation examples.
Example Usage:
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

tokenizer = AutoTokenizer.from_pretrained("will4381/inference_confidence_model")
model = AutoModelForSequenceClassification.from_pretrained("will4381/inference_confidence_model")

passage = "The Great Barrier Reef, located off the coast of Queensland in northeastern Australia, is the world's largest coral reef system. Stretching for over 2,300 kilometers, it is composed of over 2,900 individual reefs and 900 islands. The reef is home to more than 1,500 species of fish, 400 species of hard coral, one-third of the world's soft corals, 134 species of sharks and rays, six of the world's seven species of threatened marine turtles, and more than 30 species of marine mammals."
inference = "The Great Barrier Reef is experiencing a decline in biodiversity due to climate change and ocean acidification."

# Tokenize the (passage, inference) pair as a single sequence-pair input.
inputs = tokenizer(passage, inference, return_tensors="pt", truncation=True, max_length=512, padding="max_length")

model.eval()
with torch.no_grad():
    outputs = model(**inputs)
    # The model emits a single logit; sigmoid maps it to a [0, 1] confidence.
    predicted_confidence = torch.sigmoid(outputs.logits).item()

print(f"Predicted confidence: {predicted_confidence:.4f}")
```
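The sigmoid step above also works on batched logits, which is useful when scoring several candidate inferences at once. A minimal sketch, assuming the same single-logit output head; the `logits_to_confidence` helper name is ours, and the example logits below are illustrative stand-ins for `model(**inputs).logits`:

```python
import torch

def logits_to_confidence(logits: torch.Tensor) -> list[float]:
    """Map a batch of single-logit outputs (shape [N, 1]) to [0, 1] confidence scores."""
    return torch.sigmoid(logits.squeeze(-1)).tolist()

# Illustrative logits; a real run would take these from model(**inputs).logits.
example_logits = torch.tensor([[0.0], [2.0], [-2.0]])
print(logits_to_confidence(example_logits))  # → approximately [0.5, 0.881, 0.119]
```

A logit of 0 maps to a confidence of exactly 0.5, so positive logits indicate the model leans toward the inference being supported by the passage.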