---
license: other
language: en
datasets:
- valurank/news-small
---
# DistilBERT fine-tuned for news classification
This model is based on the pretrained weights of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased), with a classification head fine-tuned to classify news articles into 3 categories (bad, medium, good).
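The classification head outputs one logit per category; a minimal sketch of how raw logits are mapped to one of the three labels (the logit values and helper names below are illustrative, not part of the model):

```python
import math

# Label order is assumed for illustration; check the model's config
# (id2label) for the actual mapping.
LABELS = ["bad", "medium", "good"]

def softmax(logits):
    """Convert raw logits into a probability distribution."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def predict_label(logits):
    """Return the label with the highest probability."""
    probs = softmax(logits)
    return LABELS[probs.index(max(probs))]

example_logits = [-1.2, 0.3, 2.1]  # hypothetical head output for one article
print(predict_label(example_logits))  # → good
```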
## Training data
The model was fine-tuned on [news-small](https://huggingface.co/datasets/valurank/news-small), a dataset of 300 news articles manually annotated by Alex.
## Inputs
Like its base model, this model accepts inputs with a maximum length of 512 tokens; longer articles must be truncated before inference.
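A minimal illustration of what that truncation amounts to (the whitespace split here is a stand-in for the real DistilBERT subword tokenizer, which would normally handle this via `truncation=True` and also add special tokens):

```python
MAX_LENGTH = 512  # DistilBERT's maximum sequence length

def truncate_tokens(tokens, max_length=MAX_LENGTH):
    """Keep at most max_length tokens, dropping the tail of the article."""
    return tokens[:max_length]

# Stand-in tokenization for illustration only; real usage goes through
# the model's own tokenizer rather than a whitespace split.
article = "word " * 1000
tokens = article.split()
print(len(truncate_tokens(tokens)))  # → 512
```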