Dataset: google-research-datasets/go_emotions
Fine-tuned RoBERTa model for emotion classification over 7 emotions: happy, sad, angry, fear, disgust, surprise, neutral.

How to use VanshajR/roberta-emotion-7class with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="VanshajR/roberta-emotion-7class")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("VanshajR/roberta-emotion-7class")
model = AutoModelForSequenceClassification.from_pretrained("VanshajR/roberta-emotion-7class")
```
Base model: roberta-base (125M parameters). Evaluated on the GoEmotions test set:
| Metric | Score |
|---|---|
| Accuracy | 57.77% |
| Macro F1 | 0.4787 |
| Precision | 0.5289 |
| Recall | 0.4958 |
Per-emotion results:

| Emotion | Precision | Recall | F1-Score | Support |
|---|---|---|---|---|
| Happy | 0.62 | 0.67 | 0.64 | 2,362 |
| Sad | 0.54 | 0.51 | 0.52 | 1,210 |
| Angry | 0.58 | 0.43 | 0.49 | 1,145 |
| Fear | 0.42 | 0.31 | 0.36 | 428 |
| Disgust | 0.48 | 0.26 | 0.34 | 361 |
| Surprise | 0.43 | 0.43 | 0.43 | 623 |
| Neutral | 0.64 | 0.86 | 0.73 | 8,711 |
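Macro averaging weights every emotion equally, so the weak minority classes (Fear, Disgust) pull the score well below what the dominant Neutral class alone would suggest. A small sketch contrasting macro and support-weighted averages, computed from the rounded per-class F1 values in the table above (illustrative only; working from two-decimal table values will not exactly reproduce the reported 0.4787, which comes from the raw predictions):

```python
# Per-class F1 and support, copied from the table above (rounded values)
f1 = [0.64, 0.52, 0.49, 0.36, 0.34, 0.43, 0.73]
support = [2362, 1210, 1145, 428, 361, 623, 8711]

# Macro F1: unweighted mean -- every emotion counts equally
macro_f1 = sum(f1) / len(f1)

# Weighted F1: each class weighted by its support -- dominated by Neutral,
# which makes up well over half of the test set
total = sum(support)
weighted_f1 = sum(f * s for f, s in zip(f1, support)) / total

print(f"Macro F1:    {macro_f1:.3f}")
print(f"Weighted F1: {weighted_f1:.3f}")
```

The weighted average lands far above the macro one because Neutral, the largest class by a wide margin, is also the best-scoring class.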
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

# Load model and tokenizer
tokenizer = AutoTokenizer.from_pretrained("VanshajR/roberta-emotion-7class")
model = AutoModelForSequenceClassification.from_pretrained("VanshajR/roberta-emotion-7class")

# Classify emotion
text = "I'm so excited about this project!"
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=128)
with torch.no_grad():
    outputs = model(**inputs)
predictions = torch.nn.functional.softmax(outputs.logits, dim=-1)
predicted_class = torch.argmax(predictions, dim=-1).item()

# Emotion labels
emotions = ["happy", "sad", "angry", "fear", "disgust", "surprise", "neutral"]
print(f"Predicted emotion: {emotions[predicted_class]}")
print(f"Confidence: {predictions[0][predicted_class].item():.2%}")
```
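The softmax and argmax steps above are plain arithmetic; for intuition, here is the same confidence computation in pure Python on hand-picked example logits (the values are made up for illustration, not real model outputs):

```python
import math

# Same label order as the snippet above; logits are hypothetical
emotions = ["happy", "sad", "angry", "fear", "disgust", "surprise", "neutral"]
logits = [3.1, 0.2, -0.5, -1.0, -1.2, 0.4, 1.8]

# Numerically stable softmax: subtract the max before exponentiating
m = max(logits)
exps = [math.exp(x - m) for x in logits]
total = sum(exps)
probs = [e / total for e in exps]

# Argmax over the probabilities gives the predicted class index
predicted = max(range(len(probs)), key=probs.__getitem__)
print(f"Predicted emotion: {emotions[predicted]}")
print(f"Confidence: {probs[predicted]:.2%}")
```

Because softmax is monotonic, the argmax of the probabilities is always the argmax of the raw logits; the softmax only matters for reading off a confidence value.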
✅ Recommended:
❌ Not Recommended:
@misc{vanshajr2024roberta,
author = {Vanshaj R},
title = {RoBERTa Emotion Classifier for 7-Class Emotion Detection},
year = {2024},
publisher = {HuggingFace},
url = {https://huggingface.co/VanshajR/roberta-emotion-7class}
}
Part of the Emotion-Controlled Response Generation project.