
bert-twitter-sentiment-classifier

Fine-tuned model: bert-base-uncased → bert-twitter-sentiment-classifier

  • Author: Aakash (Aakash22134)
  • Contact: saiaakash33333@gmail.com
  • License: apache-2.0
  • Languages: en


Model description

This model is a fine-tuned BERT classifier for multi-class emotion / sentiment classification on short Twitter text. It predicts one of the following classes: sadness, joy, love, anger, fear, surprise.

The model was trained on the twitter_multi_class_sentiment dataset and reaches roughly 0.90 accuracy and weighted F1 on the held-out test set.


Training data

  • Dataset: twitter_multi_class_sentiment (public CSV from example notebook)
  • Train / Validation / Test: 11200 / 1600 / 3200
  • Preprocessing: tokenized with the bert-base-uncased tokenizer, with padding and truncation to BERT's default maximum sequence length (512 tokens), as in the example notebook
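The preprocessing step above can be sketched as follows (a minimal example; the exact batching and column names in the notebook may differ):

```python
from transformers import AutoTokenizer

# Tokenize a short tweet the same way the training notebook does:
# bert-base-uncased tokenizer with padding and truncation enabled.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
batch = tokenizer(
    ["i feel so happy today"],
    padding=True,
    truncation=True,   # truncates to the model's max length (512 tokens)
    return_tensors="pt",
)
print(batch["input_ids"].shape)
```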

Training procedure & hyperparameters

  • Base model: bert-base-uncased
  • Training epochs: 2
  • Batch size (train/eval): 64 / 64
  • Learning rate: 2e-05
  • Weight decay: 0.01
  • Trainer: transformers.Trainer (Hugging Face Transformers)
  • Notes: model was trained for 2 epochs in a Colab environment; consider longer training or more data for further improvements.

Evaluation

Test set results (approx):

  • Accuracy: 0.900625
  • F1 (weighted): 0.900321

Per-class metrics (precision / recall / F1 / support):

  • sadness: 0.93 / 0.95 / 0.94 / 933
  • joy: 0.92 / 0.91 / 0.92 / 1072
  • love: 0.76 / 0.75 / 0.76 / 261
  • anger: 0.91 / 0.91 / 0.91 / 432
  • fear: 0.89 / 0.88 / 0.88 / 387
  • surprise: 0.75 / 0.72 / 0.74 / 115

Evaluation details: computed with sklearn.metrics.classification_report.
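The metrics above come from `sklearn.metrics.classification_report`. A self-contained sketch of that evaluation step, using toy predictions in place of the real test-set outputs (the actual `y_true` / `y_pred` come from the Trainer's predictions):

```python
from sklearn.metrics import accuracy_score, classification_report, f1_score

labels = ["sadness", "joy", "love", "anger", "fear", "surprise"]

# Toy label indices standing in for the real test-set outputs.
y_true = [0, 1, 1, 2, 3, 4, 5, 0]
y_pred = [0, 1, 1, 2, 3, 4, 0, 0]

print(classification_report(y_true, y_pred, target_names=labels))
print("accuracy:", accuracy_score(y_true, y_pred))
print("weighted f1:", round(f1_score(y_true, y_pred, average="weighted"), 4))
```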

  • WandB project: https://wandb.ai/saiaakash33333-gitam/huggingface
  • Run: https://wandb.ai/saiaakash33333-gitam/huggingface/runs/qtmurwgd


Usage

Last updated: 2025-09-29 09:20:19 UTC
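The checkpoint can be loaded from the Hub with the `text-classification` pipeline (a minimal sketch; if `id2label` was not set in the config during training, predictions come back as `LABEL_0` … `LABEL_5` in the class order listed above rather than as emotion names):

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hub and classify a tweet.
clf = pipeline(
    "text-classification",
    model="Aakash22134/bert-twitter-sentiment-classifier",
)
preds = clf("i can't stop smiling today")
print(preds)
```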

Model format: safetensors, ~0.1B parameters, F32 tensors.

One Hugging Face Space uses Aakash22134/bert-twitter-sentiment-classifier.