DistilBERT fine-tuned on AG News for Text Classification

This is a distilbert-base-uncased model fine-tuned for text classification on the AG News dataset. It classifies news headlines into one of four categories: World, Sports, Business, and Sci/Tech.

This model was trained as part of a larger project, End-to-End NLP: From-Scratch Transformer vs. Deployed DistilBERT, which compares the performance of a Transformer built from scratch against this fine-tuned DistilBERT model.

πŸš€ Live Demo

You can interact with this model directly in the browser via the live Gradio demo hosted on Hugging Face Spaces.

Model Details

  • Model type: distilbert-base-uncased, fine-tuned for sequence classification.
  • Language: English
  • Dataset: AG News
  • Performance: 94.79% accuracy on the AG News test set.
  • Model size: ~67M parameters (F32, stored as safetensors).
  • Labels:
    • 0: World
    • 1: Sports
    • 2: Business
    • 3: Sci/Tech
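
If the pipeline returns generic labels such as "LABEL_2" (i.e., the checkpoint's config does not carry human-readable names), the integer ids above can be mapped back to categories with a small helper. This is a minimal sketch; `label_name` is a hypothetical helper, not part of the model or the transformers library:

```python
# AG News label mapping for this model (from the list above)
ID2LABEL = {0: "World", 1: "Sports", 2: "Business", 3: "Sci/Tech"}

def label_name(raw_label: str) -> str:
    """Map a raw pipeline label such as "LABEL_2" to its AG News category.

    Hypothetical helper: if the checkpoint's config defines id2label,
    the pipeline already returns readable names and this is unnecessary.
    """
    idx = int(raw_label.rsplit("_", 1)[-1])  # "LABEL_2" -> 2
    return ID2LABEL[idx]
```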

How to Use

You can use this model directly with a text-classification pipeline.

from transformers import pipeline

# Load the classification pipeline
classifier = pipeline(
    "text-classification",
    model="nabeelshan/distilbert-finetuned-agnews",
    top_k=None,  # return scores for all classes (replaces the deprecated return_all_scores=True)
)

# Example news headline
text = "Formula 1: Hamilton wins the British Grand Prix after a dramatic last lap."

# Get prediction
result = classifier(text)
print(result)
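
For reference, the per-class scores the pipeline returns come from a softmax over the model's four output logits. A minimal pure-Python sketch of that post-processing step (the logits below are illustrative values, not real model output):

```python
import math

LABELS = ["World", "Sports", "Business", "Sci/Tech"]

def softmax(logits):
    """Numerically stable softmax: subtract the max before exponentiating."""
    exps = [math.exp(x - max(logits)) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def postprocess(logits):
    """Convert raw classifier logits into a {label: probability} dict."""
    return dict(zip(LABELS, softmax(logits)))

# Illustrative logits for a sports headline (not actual model output)
scores = postprocess([-1.2, 4.5, -0.8, -1.9])
best = max(scores, key=scores.get)  # highest-probability category
```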