# TinyBanglaClickbaitBERT

A compact Bengali clickbait-detection model distilled from BanglaClickbaitBERT.
## Model Details
- Architecture: 4-layer Transformer (384-d, 6 heads)
- Parameters: 20.1M (vs 110.2M teacher → 5.5x compression)
- Tasks: binary classification (clickbait vs. non-clickbait) and 11-class multiclass classification
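As a rough sanity check on the 20.1M figure, the encoder-layer parameters can be estimated from the stated dimensions, assuming a standard BERT-style layer with a 4×d feed-forward expansion (an assumption; the card does not state the feed-forward or vocabulary size):

```python
# Back-of-the-envelope encoder parameter count for a BERT-style layer,
# ignoring biases and LayerNorm terms (they add comparatively little).
d = 384          # hidden size (from the card)
n_layers = 4     # number of layers (from the card)
ffn = 4 * d      # ASSUMED feed-forward dim (standard 4x expansion)

attn = 4 * d * d        # Q, K, V and output projections
ff = 2 * d * ffn        # the two feed-forward matrices
per_layer = attn + ff   # = 12 * d^2 under the 4x assumption
encoder = n_layers * per_layer
print(encoder)          # 7077888 — ~7.1M in the encoder stack
```

Under these assumptions the encoder accounts for roughly 7.1M parameters, leaving most of the remaining ~13M of the 20.1M total in the token embeddings.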
## Performance
| Task | Student F1 | Teacher F1 | Retention |
|---|---|---|---|
| Binary (macro) | 0.8543 | 0.8849 | 96.5% |
| Multiclass (macro) | 0.4201 | 0.5088 | 82.6% |
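The retention column is simply student F1 divided by teacher F1, expressed as a percentage; a quick check of the figures above:

```python
def retention(student_f1: float, teacher_f1: float) -> float:
    """Percentage of the teacher's F1 score retained by the student."""
    return student_f1 / teacher_f1 * 100

print(round(retention(0.8543, 0.8849), 1))  # 96.5 (binary, macro)
print(round(retention(0.4201, 0.5088), 1))  # 82.6 (multiclass, macro)
```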
## Efficiency

| Metric | Teacher | Student | Ratio |
|---|---|---|---|
| Latency | 7.9 ms | 3.4 ms | 2.3x speedup |
| Size | 440.9 MB | 80.3 MB | 5.5x compression |
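Latency figures like those above can be reproduced with a simple wall-clock harness along these lines (a sketch; the lambda is a stand-in for one forward pass, and real measurements should also fix batch size, sequence length, and hardware):

```python
import time
import statistics

def measure_latency_ms(fn, n_warmup: int = 10, n_runs: int = 100) -> float:
    """Median wall-clock latency of fn() in milliseconds."""
    for _ in range(n_warmup):   # warm up caches / lazy initialization
        fn()
    samples = []
    for _ in range(n_runs):
        t0 = time.perf_counter()
        fn()
        samples.append((time.perf_counter() - t0) * 1000.0)
    return statistics.median(samples)

# Example with a stand-in workload; replace with the model's forward pass.
latency = measure_latency_ms(lambda: sum(range(1000)))
```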
## Usage

```python
import torch

# Load the distilled checkpoint onto CPU; move to GPU as needed.
checkpoint = torch.load("tiny_bangla_clickbait_bert.pt", map_location="cpu")
# See label_config.json for the class-id-to-label mappings.
```
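A sketch of turning a predicted class id into a label via `label_config.json`. The config layout and label names below are illustrative assumptions, not the file's actual schema:

```python
import json

# Illustrative stand-in for label_config.json — the real file ships with
# the model and may use different keys or label names.
label_config = json.loads('{"binary": {"0": "non-clickbait", "1": "clickbait"}}')

def decode(pred_id: int, task: str = "binary") -> str:
    """Map a predicted class id to its human-readable label."""
    return label_config[task][str(pred_id)]

print(decode(1))  # clickbait (under the illustrative mapping above)
```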
## Distilled From

BanglaClickbaitBERT (the 110.2M-parameter teacher model).