RoBERTa Distilled for HPV Health Communication
Overview
This model is a RoBERTa-based, encoder-only student model trained via knowledge distillation for multi-label classification of HPV-related health communication content.
Its goal is efficient, scalable information extraction from public health resources while maintaining strong predictive performance.
Task
Multi-label classification of HPV-related content categories.
Model Details
- Base model: RoBERTa (FacebookAI/roberta-large-mnli)
- Architecture: Encoder-only
- Training paradigm: Knowledge Distillation
- Objective: Efficient information extraction
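The model card does not publish the exact distillation recipe, but a standard multi-label setup blends a hard binary cross-entropy loss against ground-truth labels with a soft loss against the teacher's per-label sigmoid outputs. The sketch below illustrates that pattern; the `alpha` and `T` hyperparameters and the function name are illustrative assumptions, not the model's actual training configuration.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, alpha=0.5, T=2.0):
    """Blend hard-label BCE with soft-target BCE against the teacher.

    alpha (loss mix) and T (temperature) are illustrative values, not the
    published training settings for this model.
    """
    # Soft targets: teacher probabilities at temperature T. Per-label sigmoid
    # is used (rather than softmax) because the task is multi-label.
    soft_targets = torch.sigmoid(teacher_logits / T)
    soft_loss = F.binary_cross_entropy_with_logits(student_logits / T, soft_targets)
    # Hard loss: standard multi-label BCE against the ground-truth label vector.
    hard_loss = F.binary_cross_entropy_with_logits(student_logits, labels)
    return alpha * soft_loss + (1 - alpha) * hard_loss
```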
Usage
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo = "skhan225/RoBERTa_distilled_HPV"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)
model.eval()  # disable dropout for inference

text = "HPV vaccination prevents cervical cancer."
inputs = tokenizer(text, return_tensors="pt")
outputs = model(**inputs)  # outputs.logits holds one raw score per label
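Because the task is multi-label, the raw logits are typically decoded with an independent sigmoid per label and a probability threshold, rather than a single argmax. A minimal sketch, assuming the label names are available in `model.config.id2label`; the 0.5 threshold and the helper's name are illustrative choices, not settings published with this model:

```python
import torch

def predict_labels(logits, id2label, threshold=0.5):
    """Return the names of all labels whose sigmoid probability clears the
    threshold (0.5 is an illustrative default, not a published setting)."""
    probs = torch.sigmoid(logits).squeeze(0)  # shape: (num_labels,)
    return [id2label[i] for i, p in enumerate(probs) if p.item() >= threshold]
```

For the snippet above, this would be called as `predict_labels(outputs.logits, model.config.id2label)`.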
Model tree for skhan225/RoBERTa_distilled_HPV
- Base model: FacebookAI/roberta-large-mnli