---
language: en
tags:
- emotion-classification
- multilabel-classification
- text-classification
- pytorch
- transformers
datasets:
- emotion
metrics:
- f1
- accuracy
library_name: transformers
pipeline_tag: text-classification
---
# Multilabel Emotion Classification Model - FirstTimeUp
This model is fine-tuned for multilabel emotion classification, using `distilbert-base-uncased` as the base model.
## Model Details
- **Model Name**: FirstTimeUp
- **Base Model**: distilbert-base-uncased
- **Task**: Multilabel Emotion Classification
- **Emotions**: amusement, anger, annoyance, caring, confusion, disappointment, disgust, embarrassment, excitement, fear, gratitude, joy, love, sadness
- **Total Parameters**: 66,373,646
- **Trainable Parameters**: 66,373,646
## Quick Start
### Installation
```bash
pip install torch transformers huggingface_hub
```
### Usage
```python
# Download the repository
from huggingface_hub import snapshot_download
import sys
# Download model files
model_path = snapshot_download(repo_id="EnJiZ/FirstTimeUp")
# Add to path and import
sys.path.append(model_path)
from model import predict_emotions
# Predict emotions
text = "I am so happy and excited!"
emotions = predict_emotions(text, model_path)
print(emotions)
```
### Advanced Usage
```python
import sys
import torch
from transformers import AutoTokenizer
from huggingface_hub import snapshot_download

# Download model files and make the custom model code importable
model_path = snapshot_download(repo_id="EnJiZ/FirstTimeUp")
sys.path.append(model_path)
from model import MultiLabelEmotionClassifier, load_model

# Load model manually
model, config = load_model(model_path)
tokenizer = AutoTokenizer.from_pretrained(model_path)

# Custom prediction with a configurable threshold
def custom_predict(text, threshold=0.3):
    encoding = tokenizer(
        text,
        truncation=True,
        padding='max_length',
        max_length=128,
        return_tensors='pt'
    )
    model.eval()
    with torch.no_grad():
        logits = model(encoding['input_ids'], encoding['attention_mask'])
        probabilities = torch.sigmoid(logits)
        predictions = (probabilities > threshold).int()
    emotion_labels = ['amusement', 'anger', 'annoyance', 'caring', 'confusion',
                      'disappointment', 'disgust', 'embarrassment', 'excitement',
                      'fear', 'gratitude', 'joy', 'love', 'sadness']
    result = {
        emotion: {
            'predicted': bool(pred),
            'probability': float(prob)
        }
        for emotion, pred, prob in zip(emotion_labels, predictions[0], probabilities[0])
    }
    return result

# Example with probabilities
result = custom_predict("I feel great today!", threshold=0.3)
print(result)
```
## Model Architecture
- **Base**: distilbert-base-uncased
- **Classification Head**: Linear layer with dropout (dropout_rate=0.3)
- **Loss Function**: BCEWithLogitsLoss
- **Activation**: Sigmoid (for multilabel classification)
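The head and loss listed above can be sketched in plain PyTorch. This is an illustrative sketch, not the actual `MultiLabelEmotionClassifier` from `model.py`; the hidden size of 768 is assumed from `distilbert-base-uncased`, and the dummy tensors stand in for real encoder outputs and labels.

```python
import torch
import torch.nn as nn

NUM_EMOTIONS = 14   # amusement ... sadness
HIDDEN_SIZE = 768   # distilbert-base-uncased hidden size

# Classification head: dropout followed by a single linear layer
head = nn.Sequential(
    nn.Dropout(p=0.3),
    nn.Linear(HIDDEN_SIZE, NUM_EMOTIONS),
)

# BCEWithLogitsLoss applies sigmoid internally, so the head outputs raw logits
criterion = nn.BCEWithLogitsLoss()

# Dummy pooled encoder outputs for a batch of 2 texts, plus multi-hot targets
pooled = torch.randn(2, HIDDEN_SIZE)
targets = torch.zeros(2, NUM_EMOTIONS)
targets[0, 11] = 1.0   # joy
targets[0, 8] = 1.0    # excitement

logits = head(pooled)
loss = criterion(logits, targets)
probabilities = torch.sigmoid(logits)   # per-emotion probabilities in [0, 1]
```

Because the loss already includes the sigmoid, the explicit `torch.sigmoid` is only applied at inference time to turn logits into per-emotion probabilities.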
## Training Details
- **Epochs**: 3
- **Batch Size**: 32
- **Learning Rate**: 2e-05
- **Max Sequence Length**: 128
- **Optimizer**: AdamW with weight decay (0.01)
- **Scheduler**: Linear warmup + decay
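The linear warmup + decay schedule can be sketched as a plain function of the step count. The steps-per-epoch value and the 10% warmup fraction below are assumptions for illustration; the card does not state them.

```python
EPOCHS = 3
STEPS_PER_EPOCH = 1000                  # depends on dataset size; assumed
TOTAL_STEPS = EPOCHS * STEPS_PER_EPOCH
WARMUP_STEPS = int(0.1 * TOTAL_STEPS)   # assumed 10% warmup
BASE_LR = 2e-5

def learning_rate(step):
    """Linear warmup from 0 to BASE_LR, then linear decay back to 0."""
    if step < WARMUP_STEPS:
        return BASE_LR * step / max(1, WARMUP_STEPS)
    remaining = max(0, TOTAL_STEPS - step)
    return BASE_LR * remaining / max(1, TOTAL_STEPS - WARMUP_STEPS)
```

The learning rate peaks at `BASE_LR` at the end of warmup and reaches zero at the final training step; in practice this shape is typically produced by `transformers.get_linear_schedule_with_warmup`.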
## Files in this Repository
- `config.json`: Model configuration
- `pytorch_model.bin`: Model weights
- `tokenizer.json`, `tokenizer_config.json`: Tokenizer files
- `model.py`: Custom model class and utility functions
- `README.md`: This file
## Performance
- **Task**: Multilabel Emotion Classification
- **Metrics**: F1-Score (Micro & Macro), Accuracy
- **Validation Strategy**: 80/20 train-validation split
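For multilabel classification, micro F1 pools true/false positives and false negatives across all emotion labels, while macro F1 averages the per-label F1 scores. A minimal sketch of both metrics on multi-hot label matrices (toy data for illustration; in practice `sklearn.metrics.f1_score` with `average='micro'` or `'macro'` does this):

```python
def f1_scores(y_true, y_pred):
    """Return (micro, macro) F1 for multi-hot label matrices."""
    n_labels = len(y_true[0])
    tp = [0] * n_labels
    fp = [0] * n_labels
    fn = [0] * n_labels
    for t_row, p_row in zip(y_true, y_pred):
        for i, (t, p) in enumerate(zip(t_row, p_row)):
            if p and t:
                tp[i] += 1
            elif p and not t:
                fp[i] += 1
            elif t and not p:
                fn[i] += 1

    def f1(t, f, n):
        denom = 2 * t + f + n
        return 2 * t / denom if denom else 0.0

    micro = f1(sum(tp), sum(fp), sum(fn))
    macro = sum(f1(tp[i], fp[i], fn[i]) for i in range(n_labels)) / n_labels
    return micro, macro

# Toy example: 2 texts, 3 labels; one true label is missed by the predictions
y_true = [[1, 0, 1], [0, 1, 0]]
y_pred = [[1, 0, 0], [0, 1, 0]]
micro, macro = f1_scores(y_true, y_pred)
```

The missed label hurts macro F1 more than micro F1, since macro weights every label equally regardless of how often it occurs.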
## Supported Emotions
amusement, anger, annoyance, caring, confusion, disappointment, disgust, embarrassment, excitement, fear, gratitude, joy, love, sadness
## License
This model is released under the Apache 2.0 license.
## Citation
```bibtex
@misc{firsttimeup2024,
  title={FirstTimeUp: Multilabel Emotion Classification Model},
  author={EnJiZ},
  year={2024},
  url={https://huggingface.co/EnJiZ/FirstTimeUp}
}
```
## Contact
For questions or issues, please open an issue in the repository.