---
library_name: transformers
tags:
- text-classification
- emotion-detection
- sentiment-analysis
- distilbert
language:
- en
license: apache-2.0
base_model: distilbert-base-uncased
pipeline_tag: text-classification
metrics:
- accuracy
- f1
---
# DistilBERT Emotion Classifier
## Model Description
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) for multi-class emotion classification. The model classifies text into different emotional categories, enabling applications in sentiment analysis, customer feedback analysis, and social media monitoring.
- **Developed by:** Sathwik3
- **Model type:** Text Classification (Emotion Detection)
- **Language(s):** English
- **License:** Apache 2.0
- **Base model:** [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased)
## Model Details
### Architecture
The model is based on DistilBERT, a distilled version of BERT that retains 97% of BERT's language understanding while being 40% smaller and 60% faster. The architecture consists of:
- 6 transformer layers
- 768 hidden dimensions
- 12 attention heads
- ~66M parameters
- Classification head for emotion prediction
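These values can be confirmed directly from the base model's configuration; the sketch below reads them with `AutoConfig` (the attribute names are specific to `DistilBertConfig`):
```python
from transformers import AutoConfig

# Load the base model's configuration from the Hub
config = AutoConfig.from_pretrained("distilbert-base-uncased")

print(config.n_layers)    # 6 transformer layers
print(config.dim)         # 768 hidden dimensions
print(config.n_heads)     # 12 attention heads
print(config.hidden_dim)  # 3072 intermediate (FFN) size
```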
### Training Objective
The model was fine-tuned using cross-entropy loss for multi-class classification, optimizing for accurate emotion categorization across multiple emotional states.
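In the Transformers API this objective is applied automatically whenever integer class labels are passed to a sequence classification model. The sketch below illustrates the mechanism on the base model; the example sentence and gold label are illustrative only:
```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=6  # six emotion categories
)

inputs = tokenizer("I can't stop smiling today!", return_tensors="pt")
labels = torch.tensor([1])  # illustrative gold label index

# With labels provided, the model computes cross-entropy loss internally
outputs = model(**inputs, labels=labels)
print(outputs.loss)
```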
## Intended Uses
### Direct Use
The model can be directly used for:
- **Emotion detection** in text documents
- **Sentiment analysis** of customer reviews and feedback
- **Social media monitoring** to understand emotional tone
- **Content moderation** based on emotional content
- **Mental health applications** for emotion tracking in journals
- **Chatbot enhancement** for emotion-aware responses
### Downstream Use
This model can be integrated into larger systems for:
- Customer service platforms for automated response routing
- Market research tools for analyzing consumer sentiment
- Educational platforms for emotional intelligence training
- Healthcare applications for mental wellness monitoring
### Out-of-Scope Use
The model should **not** be used for:
- Clinical diagnosis or medical decision-making
- Making critical decisions about individuals without human oversight
- Applications where misclassification could cause harm
- Languages other than English (without additional fine-tuning)
- Real-time crisis intervention or emergency response
## Limitations and Bias
### Limitations
- **Language limitation:** The model is trained primarily on English text and may not perform well on other languages or code-switched text
- **Context sensitivity:** Short texts or texts lacking context may be misclassified
- **Domain specificity:** Performance may vary across different domains (e.g., formal vs. informal text)
- **Sarcasm and irony:** The model may struggle with non-literal expressions
- **Cultural nuances:** Emotion expression varies across cultures, which may affect performance
### Bias Considerations
- The model's predictions may reflect biases present in the training data
- Emotion categories may not universally apply across all cultures and contexts
- Performance may vary across demographic groups depending on training data representation
- Users should validate model outputs, especially in sensitive applications
### Recommendations
- Always review model predictions in high-stakes applications
- Use the model as a decision support tool, not a sole decision-maker
- Evaluate performance on your specific use case before deployment
- Monitor for bias and fairness issues in production
- Provide clear communication to end users about the model's capabilities and limitations
## How to Get Started with the Model
Use the code below to get started with the model:
```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Load model and tokenizer
model_name = "Sathwik3/distilbert-emotion-classifier"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)
model.eval()

# Example text
text = "I am so happy and excited about this amazing opportunity!"

# Tokenize and predict
inputs = tokenizer(text, return_tensors="pt", truncation=True, padding=True, max_length=512)
with torch.no_grad():
    outputs = model(**inputs)

predictions = torch.nn.functional.softmax(outputs.logits, dim=-1)
predicted_class = torch.argmax(predictions, dim=-1).item()

# Map the class index to its emotion label via the model config
predicted_emotion = model.config.id2label[predicted_class]
print(f"Predicted emotion: {predicted_emotion}")
print(f"Confidence scores: {predictions}")
```
For pipeline usage:
```python
from transformers import pipeline
# Create emotion classification pipeline
emotion_classifier = pipeline("text-classification", model="Sathwik3/distilbert-emotion-classifier")
# Classify emotion
result = emotion_classifier("I am so happy and excited about this amazing opportunity!")
print(result)
```
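To inspect scores for every emotion rather than only the top prediction, recent versions of transformers accept a `top_k` argument on classification pipelines (`top_k=None` returns all classes):
```python
all_scores = emotion_classifier(
    "I am so happy and excited about this amazing opportunity!", top_k=None
)
print(all_scores)  # one score per emotion label
```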
## Training Details
### Training Data
The model was fine-tuned on an emotion classification dataset. Specific dataset details:
- **Dataset:** Emotion dataset
- **Size:** 16,000 examples
- **Emotion categories:** sadness, joy, love, anger, fear, surprise
- **Data split:** train / validation / test
### Training Procedure
#### Preprocessing
- Text tokenization using DistilBERT tokenizer
- Maximum sequence length: 512 tokens
- Truncation and padding applied as needed
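A minimal sketch of this preprocessing step, assuming the data is loaded with the datasets library; the `dair-ai/emotion` dataset ID and the `text` field name are assumptions for illustration, since the card only names an "Emotion dataset":
```python
from datasets import load_dataset
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

def tokenize(batch):
    # Truncate/pad to the 512-token maximum used at training time
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=512)

# "dair-ai/emotion" is an assumption; the card does not name a specific Hub dataset
dataset = load_dataset("dair-ai/emotion")
tokenized = dataset.map(tokenize, batched=True)
```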
#### Training Hyperparameters
- **Training regime:** Mixed precision (fp16)
- **Optimizer:** AdamW
- **Learning rate:** 2e-5
- **Batch size:** 64
- **Number of epochs:** 2
- **Weight decay:** 0.01
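The original training script is not published with this card; the sketch below shows how these hyperparameters would map onto the Trainer API, reusing the `tokenized` splits from the preprocessing sketch above (the output path is a placeholder):
```python
from transformers import (
    AutoModelForSequenceClassification, Trainer, TrainingArguments
)

model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=6
)

training_args = TrainingArguments(
    output_dir="distilbert-emotion-classifier",  # placeholder output path
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    num_train_epochs=2,
    weight_decay=0.01,
    fp16=True,  # mixed-precision training, as listed above
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=tokenized["train"],       # splits from the preprocessing sketch
    eval_dataset=tokenized["validation"],
)
trainer.train()
```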
## Evaluation
### Testing Data & Metrics
#### Testing Data
- **Test set:** [Description of test data - placeholder]
- **Test set size:** [Number of examples - placeholder]
- **Distribution:** [Class distribution information - placeholder]
#### Metrics
The model's performance is evaluated using:
- **Accuracy:** Overall classification accuracy
- **F1 Score:** Macro and weighted F1 scores for balanced evaluation
- **Precision:** Per-class and average precision
- **Recall:** Per-class and average recall
- **Confusion Matrix:** For detailed error analysis
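These metrics can be computed on any labeled test set with scikit-learn; in the minimal sketch below, `y_true` and `y_pred` are placeholder lists of gold and predicted class indices:
```python
from sklearn.metrics import (
    accuracy_score, confusion_matrix, f1_score, precision_score, recall_score
)

# Placeholder gold and predicted class indices
y_true = [0, 1, 1, 3]
y_pred = [0, 1, 2, 3]

print("Accuracy:", accuracy_score(y_true, y_pred))
print("Macro F1:", f1_score(y_true, y_pred, average="macro"))
print("Weighted F1:", f1_score(y_true, y_pred, average="weighted"))
print("Precision (macro):", precision_score(y_true, y_pred, average="macro"))
print("Recall (macro):", recall_score(y_true, y_pred, average="macro"))
print(confusion_matrix(y_true, y_pred))
```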
### Results
#### Overall Performance
| Metric | Value |
|--------|-------|
| Accuracy | 0.9295 |
| Weighted F1 | 0.9292 |
## Technical Specifications
### Model Architecture
- **Base Model:** DistilBERT (distilbert-base-uncased)
- **Model Size:** ~66M parameters (base) + classification head
- **Layers:** 6 transformer layers
- **Hidden Size:** 768
- **Attention Heads:** 12
- **Intermediate Size:** 3072
- **Max Sequence Length:** 512 tokens
- **Vocabulary Size:** 30,522 tokens
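The parameter count is easy to verify once the model is loaded (reusing `model` from the quick-start example above):
```python
# Count all parameters, including the classification head
num_params = sum(p.numel() for p in model.parameters())
print(f"{num_params / 1e6:.1f}M parameters")  # roughly 66-67M for DistilBERT plus the head
```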
### Software
- **Framework:** PyTorch
- **Library:** Hugging Face Transformers
- **Python Version:** 3.10
- **Key Dependencies:**
- transformers
- torch
- tokenizers
## Citation
If you use this model in your research or applications, please cite:
**BibTeX:**
```bibtex
@misc{sathwik3-distilbert-emotion,
author = {Sathwik3},
title = {DistilBERT Emotion Classifier},
year = {2024},
publisher = {Hugging Face},
howpublished = {\url{https://huggingface.co/Sathwik3/distilbert-emotion-classifier}}
}
```
Please also cite the original DistilBERT paper:
```bibtex
@article{sanh2019distilbert,
title={DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter},
author={Sanh, Victor and Debut, Lysandre and Chaumond, Julien and Wolf, Thomas},
journal={arXiv preprint arXiv:1910.01108},
year={2019}
}
```
**APA:**
Sathwik3. (2024). *DistilBERT Emotion Classifier*. Hugging Face. https://huggingface.co/Sathwik3/distilbert-emotion-classifier
## Model Card Authors
Sathwik3
## Model Card Contact
For questions or feedback about this model, please open an issue in the model's repository or contact via Hugging Face.
---
*This model card follows the guidelines from [Mitchell et al. (2019)](https://arxiv.org/abs/1810.03993) and the Hugging Face Model Card template.*