DistilBERT Tone Classification Model

This model is a fine-tuned `distilbert-base-uncased` that classifies text into seven tone categories relevant to community and mentorship transcripts.

πŸ“Œ Labels

- uplifting
- thoughtful
- practical
- reflective
- motivational
- informative
- optimistic
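When the model is loaded, these seven labels correspond to integer class ids. A minimal sketch of that mapping follows; note the id ordering here is an assumption for illustration, and the authoritative `id2label` mapping lives in the model's `config.json`:

```python
# Hypothetical label <-> id mapping for the 7 tone classes.
# The actual ordering is defined in the model's config.json (id2label);
# this order is only an assumption for illustration.
labels = [
    "uplifting", "thoughtful", "practical", "reflective",
    "motivational", "informative", "optimistic",
]
id2label = {i: name for i, name in enumerate(labels)}
label2id = {name: i for i, name in enumerate(labels)}
```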

πŸ“Š Dataset

The model is trained on the tone-dataset, which contains 1,000+ labeled examples created for the MyVillageProject tone classification task. The data includes first-person and third-person statements, anecdotes, factual notes, and reflective entries.

πŸš€ Training

- Base model: `distilbert-base-uncased`
- Optimizer: AdamW (lr=2e-5)
- Batch size: 16
- Epochs: 8
- Loss: CrossEntropy
- Metrics: Accuracy + Weighted F1
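The two evaluation metrics can be computed with `sklearn.metrics` or the `evaluate` library; as a dependency-free sketch, accuracy and support-weighted F1 reduce to:

```python
from collections import Counter

def accuracy(y_true, y_pred):
    # Fraction of predictions that match the gold labels.
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def weighted_f1(y_true, y_pred):
    # Per-class F1, averaged with each class weighted by its support
    # (number of true examples) -- the same definition as
    # sklearn's f1_score(..., average="weighted").
    support = Counter(y_true)
    total = 0.0
    for label, n in support.items():
        tp = sum(t == label and p == label for t, p in zip(y_true, y_pred))
        fp = sum(t != label and p == label for t, p in zip(y_true, y_pred))
        fn = sum(t == label and p != label for t, p in zip(y_true, y_pred))
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
        total += n * f1
    return total / len(y_true)
```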

πŸ“ˆ Validation Metrics

| Epoch | Training Loss | Validation Loss | Accuracy | F1 |
|-------|---------------|-----------------|----------|----------|
| 1 | No log | 1.281651 | 0.782288 | 0.778880 |
| 2 | No log | 0.779447 | 0.845018 | 0.843397 |
| 3 | No log | 0.566092 | 0.859779 | 0.856186 |
| 4 | No log | 0.415437 | 0.892989 | 0.892445 |
| 5 | No log | 0.340598 | 0.915129 | 0.914765 |
| 6 | 0.729500 | 0.307513 | 0.922509 | 0.922262 |
| 7 | 0.729500 | 0.296827 | 0.915129 | 0.915210 |
| 8 | 0.729500 | 0.285301 | 0.922509 | 0.922262 |

Final Training Summary:

```
TrainOutput(global_step=704, training_loss=0.5666945034807379,
            metrics={'train_runtime': 42.6317,
                     'train_samples_per_second': 261.402,
                     'train_steps_per_second': 16.514,
                     'total_flos': 369087080441856.0,
                     'train_loss': 0.5666945034807379,
                     'epoch': 8.0})
```

πŸ’» Usage

```python
from transformers import pipeline

classifier = pipeline("text-classification", model="Dc-4nderson/tone-distilbert")

text = "Ronnie mentioned the turnout was twice what they expected, and it felt like a victory."
print(classifier(text))
```

Output:

```
[{'label': 'uplifting'}]
```
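Under the hood, the pipeline applies a softmax to the model's raw logits over the seven classes and returns the top label. A dependency-free sketch of that final step, using made-up logits (the values below are illustrative, not actual model output):

```python
import math

def softmax(logits):
    # Numerically stable softmax: subtract the max before exponentiating.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

labels = ["uplifting", "thoughtful", "practical", "reflective",
          "motivational", "informative", "optimistic"]
logits = [3.1, 0.2, -0.5, 0.1, 1.2, -1.0, 0.4]  # hypothetical example
probs = softmax(logits)
best = max(range(len(probs)), key=probs.__getitem__)
print({"label": labels[best], "score": round(probs[best], 4)})
```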

πŸ‘₯ Maintainer

Dequan Anderson / Dc-4nderson

