---
language: en
license: apache-2.0
datasets:
- ag_news
tags:
- text-classification
- bert
- ag-news
---
# BERT-base-uncased fine-tuned on AG News
This model is a fine-tuned version of bert-base-uncased on the AG News dataset, achieving 94.36% accuracy on the test set.
## Model Details
- Model Type: Text Classification (BERT)
- Base Model: bert-base-uncased
- Dataset: AG News
- Fine-tuning Approach: Sequence Classification
## Training Results
| Epoch | Training Loss | Validation Loss | Accuracy | F1 (Weighted) |
|---|---|---|---|---|
| 1 | 0.231600 | 0.212338 | 0.9359 | 0.9359 |
| 2 | 0.176300 | 0.213332 | 0.9439 | 0.9439 |
| 3 | 0.119100 | 0.230517 | 0.9450 | 0.9450 |
| 4 | 0.074500 | 0.286154 | 0.9447 | 0.9448 |
| 5 | 0.031700 | 0.344374 | 0.9436 | 0.9435 |
## Confusion Matrix

Confusion matrix values (rows: true label, columns: predicted label):

| True \ Predicted | World | Sports | Business | Sci/Tech |
|---|---|---|---|---|
| World | 1812 | 13 | 43 | 32 |
| Sports | 7 | 1880 | 7 | 6 |
| Business | 39 | 9 | 1728 | 124 |
| Sci/Tech | 34 | 10 | 105 | 1751 |
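The headline metrics can be reproduced directly from this matrix. A minimal sketch in plain Python (rows are true labels, columns are predictions):

```python
# Confusion matrix from the table above: rows = true labels, columns = predictions.
labels = ["World", "Sports", "Business", "Sci/Tech"]
cm = [
    [1812, 13, 43, 32],
    [7, 1880, 7, 6],
    [39, 9, 1728, 124],
    [34, 10, 105, 1751],
]

total = sum(sum(row) for row in cm)                # 7,600 test examples
correct = sum(cm[i][i] for i in range(4))          # diagonal = correct predictions
accuracy = correct / total

f1_weighted = 0.0
for i, name in enumerate(labels):
    support = sum(cm[i])                           # true instances of this class
    predicted = sum(cm[r][i] for r in range(4))    # predictions of this class
    precision = cm[i][i] / predicted
    recall = cm[i][i] / support
    f1 = 2 * precision * recall / (precision + recall)
    f1_weighted += (support / total) * f1
    print(f"{name:>8}: precision={precision:.4f} recall={recall:.4f} f1={f1:.4f}")

print(f"accuracy={accuracy:.4f} weighted_f1={f1_weighted:.4f}")
# accuracy=0.9436 weighted_f1=0.9435
```

The diagonal sums to 7,171 of 7,600, recovering the reported 94.36% accuracy and 94.35% weighted F1; the largest confusions are between Business and Sci/Tech.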
## How to Use

```python
from transformers import pipeline

classifier = pipeline("text-classification", model="ShahzaibAli-1/News_Classifier-bert-base-uncased")
result = classifier("Apple reported record profits last quarter.")
print(result)
```
## Performance

### Training Hyperparameters
- Learning Rate: 5e-5
- Batch Size: 8
- Epochs: 5
- Warmup Ratio: 0.1
- Max Sequence Length: 128
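To see what the warmup ratio means in optimizer steps, a quick back-of-the-envelope calculation — assuming the standard AG News train split of 120,000 examples and no gradient accumulation (neither is stated explicitly above):

```python
# Translate the warmup ratio into warmup steps, assuming the standard
# AG News train split (120,000 examples) and no gradient accumulation.
train_examples = 120_000
batch_size = 8
epochs = 5
warmup_ratio = 0.1

steps_per_epoch = train_examples // batch_size   # optimizer steps per epoch
total_steps = steps_per_epoch * epochs           # steps over the full run
warmup_steps = int(total_steps * warmup_ratio)   # linear LR warmup length

print(steps_per_epoch, total_steps, warmup_steps)
# 15000 75000 7500
```

Under these assumptions the learning rate warms up over the first 7,500 of 75,000 steps, i.e. the first half of epoch 1.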
- Final Test Accuracy: 94.36%
- Final Test F1-Score (Weighted): 94.35%
## Gradio Demo

To try the model in an interactive Gradio demo:
```python
import gradio as gr
from transformers import pipeline

# Load the fine-tuned model
classifier = pipeline("text-classification", model="ShahzaibAli-1/News_Classifier-bert-base-uncased")

# Define label mapping (must match the training labels)
label_map = {
    0: "World",
    1: "Sports",
    2: "Business",
    3: "Sci/Tech",
}

def predict(text):
    result = classifier(text)[0]
    # Extract the numeric label (e.g., "LABEL_1" -> 1)
    label_num = int(result["label"].split("_")[-1])
    # Map it to the human-readable category
    label_text = label_map[label_num]
    return f"{label_text} (confidence: {result['score']:.2%})"

# Create the interface
iface = gr.Interface(
    fn=predict,
    inputs=gr.Textbox(lines=2, placeholder="Enter news text here..."),
    outputs="text",
    title="AG News Classifier",
    description="Classify news articles into World, Sports, Business, or Sci/Tech categories",
)

iface.launch()
```
## Example Outputs

Here are some example outputs for various test cases:

**Sports News**
- Prompt: "New Zealand won the Test Championship today"
- Output: Sports (confidence: 99.99%)

**Business News**
- Prompt: "The stock market saw a significant increase following the tech boom"
- Output: Business (confidence: 98.50%)

**World News**
- Prompt: "The political unrest in Eastern Europe has escalated this week"
- Output: World (confidence: 97.70%)

**Sci/Tech News**
- Prompt: "Scientists have developed a new battery that can last twice as long as current models"
- Output: Sci/Tech (confidence: 96.30%)
## Evaluation Metrics
The following evaluation metrics were used to assess the model's performance:
- Accuracy: The percentage of correct predictions over the total number of predictions.
- Precision: The proportion of positive predictions that were actually correct.
- Recall: The proportion of actual positives that were correctly identified.
- F1-Score: The harmonic mean of precision and recall.
The model demonstrated strong performance across all metrics, particularly with an accuracy of 94.36%.
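As a toy illustration of these definitions on a hypothetical binary example (the labels below are made up and are not outputs of this model):

```python
# Toy binary example illustrating accuracy, precision, recall, and F1.
# Hypothetical labels, not outputs of this model.
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]

tp = sum(t == p == 1 for t, p in zip(y_true, y_pred))          # true positives
fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))    # false positives
fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))    # false negatives

accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
precision = tp / (tp + fp)   # correct among positive predictions
recall = tp / (tp + fn)      # correct among actual positives
f1 = 2 * precision * recall / (precision + recall)  # harmonic mean

print(accuracy, precision, recall, f1)
# 0.75 0.75 0.75 0.75
```

For the multi-class AG News setting, the weighted F1 reported above averages the per-class F1 scores weighted by each class's support.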
## Citation
If you use this model in your research or projects, please cite it as follows:
```bibtex
@article{shahzaib2025news,
  title={Fine-Tuning BERT for AG News Classification},
  author={Shahzaib Ali},
  journal={Hugging Face Model Hub},
  year={2025},
  url={https://huggingface.co/ShahzaibAli-1/News_Classifier-bert-base-uncased}
}
```
## License
The model is released under the Apache-2.0 License. Feel free to use it in your applications and research.
## Contact
For any questions or suggestions, feel free to open an issue or contact the model creator at:
- Hugging Face: ShahzaibAli-1