# DistilBERT fine-tuned on AG News for Text Classification
This is a distilbert-base-uncased model fine-tuned for text classification on the AG News dataset. It classifies news headlines into one of four categories: World, Sports, Business, and Sci/Tech.
This model was trained as part of a larger project, End-to-End NLP: From-Scratch Transformer vs. Deployed DistilBERT, which compares the performance of a Transformer built from scratch against this fine-tuned DistilBERT model.
## Live Demo
You can interact with this model directly in the browser using the live Gradio demo hosted on Hugging Face Spaces.
## Model Details
- **Model type:** distilbert-base-uncased, fine-tuned for sequence classification
- **Language:** English
- **Dataset:** AG News
- **Performance:** 94.79% accuracy on the test set
- **Labels:**
  - 0: World
  - 1: Sports
  - 2: Business
  - 3: Sci/Tech
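For quick reference, the id-to-label mapping can be kept as a plain dictionary. A minimal sketch; the `ID2LABEL` name and helper below are illustrative, not part of the model's API:

```python
# Id-to-label mapping for this model's four AG News classes.
ID2LABEL = {0: "World", 1: "Sports", 2: "Business", 3: "Sci/Tech"}

def label_name(label_id: int) -> str:
    """Return the human-readable class name for a predicted label id."""
    return ID2LABEL[label_id]

print(label_name(3))  # Sci/Tech
```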
## How to Use
You can use this model directly with a text-classification pipeline.
```python
from transformers import pipeline

# Load the classification pipeline
# (top_k=None returns the scores for all four labels;
# it replaces the deprecated return_all_scores=True)
classifier = pipeline(
    "text-classification",
    model="nabeelshan/distilbert-finetuned-agnews",
    top_k=None,
)

# Example news headline
text = "Formula 1: Hamilton wins the British Grand Prix after a dramatic last lap."

# Get prediction
result = classifier(text)
print(result)
```
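The per-label scores the pipeline returns come from a softmax over the model's raw logits. A minimal, self-contained sketch of that final step, using made-up logits rather than real model outputs:

```python
import math

# Hypothetical logits for one headline, in label order
# [World, Sports, Business, Sci/Tech] (illustrative values only).
logits = [-1.2, 4.8, -0.9, -1.6]

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

labels = ["World", "Sports", "Business", "Sci/Tech"]
probs = softmax(logits)
pred = labels[probs.index(max(probs))]
print(pred)  # Sports
```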
## Evaluation Results

- Accuracy on the AG News test set (self-reported): 0.948
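Accuracy here is simply the fraction of test examples whose predicted label matches the gold label. A minimal sketch with toy labels, not the actual evaluation run:

```python
def accuracy(preds, golds):
    """Fraction of predictions that exactly match the gold labels."""
    assert len(preds) == len(golds), "prediction/gold lists must align"
    return sum(p == g for p, g in zip(preds, golds)) / len(golds)

# Toy example: 3 of 4 predictions correct.
print(accuracy([1, 0, 2, 3], [1, 0, 2, 2]))  # 0.75
```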