Dataset: `stanfordnlp/imdb`
This is a DistilBERT model fine-tuned for IMDB sentiment analysis using knowledge distillation from a BERT teacher model.

Teacher model: `Sachin-Rathore-1234/bert-imdb-sentiment`

| Setting | Value |
|---|---|
| Teacher | BERT-base |
| Student | DistilBERT |
| Temperature | 4.0 |
| Alpha | 0.7 |
| Epochs | 3 |
| Dataset | IMDB (3000) |
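The Temperature and Alpha settings above correspond to the standard distillation loss: a KL term on temperature-softened teacher/student distributions, blended with ordinary cross-entropy on the true labels. A minimal sketch (function and variable names are illustrative, not taken from this repository):

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.7):
    """alpha weights the soft (teacher) loss; (1 - alpha) the hard label loss."""
    # Soft targets: KL divergence between temperature-scaled distributions.
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)
    # Hard targets: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```

With alpha = 0.7, the student's training signal comes mostly from the teacher's softened probabilities, with the remaining 30% from the true labels.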

| Metric | Teacher (BERT) | Student (DistilBERT) |
|---|---|---|
| Accuracy | ~91% | ~89% |
| Parameters | 110M | 66M |
| Size Reduction | - | 40% smaller |
| Speed Improvement | - | 1.6x faster |
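The "40% smaller" figure in the table follows directly from the parameter counts; a quick arithmetic check:

```python
# Parameter counts from the comparison table above (approximate).
teacher_params = 110_000_000  # BERT-base
student_params = 66_000_000   # DistilBERT

reduction = 1 - student_params / teacher_params
print(f"{reduction:.0%} smaller")  # 40% smaller
```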

Usage:

```python
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="Sachin-Rathore-1234/distilbert-imdb-distilled",
)

result = classifier("This movie was absolutely amazing!")
print(result)
# e.g. [{'label': 'POSITIVE', 'score': 0.98}]
```