Update README.md
README.md (CHANGED)
- emotions-classifier
---

# Fast Emotion-X: Fine-Tuned DeBERTa V3 Small for Emotion Detection

This model is a fine-tuned version of [microsoft/deberta-v3-small](https://huggingface.co/microsoft/deberta-v3-small) for emotion detection, trained on the [dair-ai/emotion](https://huggingface.co/dair-ai/emotion) dataset.

## Overview

Fast Emotion-X is a state-of-the-art emotion detection model fine-tuned from Microsoft's DeBERTa V3 Small model. It is designed to accurately classify text into one of six emotional categories. Leveraging the robust capabilities of DeBERTa, it is fine-tuned on a comprehensive emotion dataset for high accuracy and reliability.

## Model Details

- **Model Name:** `AnkitAI/deberta-v3-small-base-emotions-classifier`
- **Base Model:** `microsoft/deberta-v3-small`
- **Dataset:** [dair-ai/emotion](https://huggingface.co/dair-ai/emotion)
- **Fine-tuning:** The model is fine-tuned with a classification head for six emotional categories: anger, disgust, fear, joy, sadness, and surprise.

## Emotion Labels

- Anger
- Disgust
- Fear
- Joy
- Sadness
- Surprise
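When decoding model outputs, these labels are typically keyed by class index. A minimal sketch of such a mapping (the index order shown is an illustrative assumption, not read from the model's config):

```python
# Illustrative id -> label mapping; the real ordering is defined by the
# model's config (id2label), so treat this ordering as an assumption.
EMOTIONS = ["anger", "disgust", "fear", "joy", "sadness", "surprise"]
id2label = {i: name for i, name in enumerate(EMOTIONS)}
print(id2label[3])  # joy
```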

## Usage
You can use this model directly with the provided Python package or the Hugging Face `transformers` library.

### Installation

Install the package using pip:

```bash
pip install emotionclassifier
```
```python
result = classifier.predict("I am very happy today!")
plot_emotion_distribution(result['probabilities'], classifier.labels.values())
```
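`result['probabilities']` feeds the distribution plot above; picking the single top emotion from such a mapping is a one-liner. A minimal sketch (the exact shape of `result` is an assumption based on the snippet above):

```python
# Assumed shape of result['probabilities']: a label -> probability dict.
probabilities = {
    "anger": 0.01, "disgust": 0.01, "fear": 0.02,
    "joy": 0.93, "sadness": 0.02, "surprise": 0.01,
}

# The most likely emotion is the key with the highest probability.
top_emotion = max(probabilities, key=probabilities.get)
print(top_emotion)  # joy
```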
### Command-Line Interface (CLI) Usage

You can also use the package from the command line:
Fine-tune a pre-trained model on your own dataset:

```python
from emotionclassifier.fine_tune import fine_tune_model

# Define your training and validation datasets
train_dataset = ...
val_dataset = ...

fine_tune_model(classifier.model, classifier.tokenizer, train_dataset, val_dataset, output_dir='fine_tuned_model')
```
### Using transformers Library

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

emotion = predict_emotion(text)
print("Detected Emotion:", emotion)
```
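The body of `predict_emotion` is not shown above. Whatever its exact implementation, the decoding step reduces to a softmax over the six class logits followed by an argmax; a dependency-free sketch of that step (the label order and logit values here are illustrative assumptions, not read from the model's config):

```python
import math

# Illustrative label order -- an assumption for this sketch; the real
# ordering comes from the fine-tuned model's id2label config.
LABELS = ["anger", "disgust", "fear", "joy", "sadness", "surprise"]

def decode(logits):
    """Softmax over raw logits, then pick the most likely label."""
    m = max(logits)                              # subtract max for stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    return LABELS[probs.index(max(probs))], probs

# Example logits for a clearly joyful sentence (made up for illustration).
label, probs = decode([-1.2, -2.0, -0.5, 4.3, 0.1, -0.7])
print(label)  # joy
```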
## Training

The model was trained using the following parameters:

- **Learning Rate:** 2e-5
- **Batch Size:** 4
- **Weight Decay:** 0.01
- **Evaluation Strategy:** Epoch
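These hyperparameters map directly onto Hugging Face `TrainingArguments`. A hypothetical configuration sketch, not the exact setup used for this model (the output directory is a placeholder, and the epoch count of 20 is taken from the model card table):

```python
from transformers import TrainingArguments

# Hypothetical sketch of the hyperparameters above as TrainingArguments.
training_args = TrainingArguments(
    output_dir="fine_tuned_model",   # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=4,
    weight_decay=0.01,
    evaluation_strategy="epoch",     # evaluate once per epoch
    num_train_epochs=20,             # from the model card table
)
```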
### Training Details

- **Evaluation Loss:** 0.0858
- **Evaluation Runtime:** 110070.6349 seconds
- **Evaluation Samples/Second:** 78.495
- **Evaluation Steps/Second:** 2.453
- **Training Loss:** 0.1049
- **Evaluation Accuracy:** 94.6%
- **Evaluation Precision:** 94.8%
- **Evaluation Recall:** 94.5%
- **Evaluation F1 Score:** 94.7%
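As a sanity check, the reported figures hang together: F1 is the harmonic mean of precision and recall (94.65%, matching the reported 94.7% up to rounding), and samples-per-second divided by steps-per-second recovers the per-step evaluation batch size. A quick verification in plain Python:

```python
# Sanity-check the reported evaluation metrics.
precision, recall = 0.948, 0.945

# F1 is the harmonic mean of precision and recall.
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # 0.9465 -- the reported 94.7%, up to rounding

# samples/sec divided by steps/sec gives samples per step,
# i.e. the per-step evaluation batch size.
batch = 78.495 / 2.453
print(round(batch))  # 32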
| 185 |
|
| 186 |
+
## Model Card Data
|
| 187 |
|
| 188 |
+
| Parameter | Value |
|
| 189 |
+
|-------------------------------|----------------------------|
|
|
|
|
|
|
|
|
|
|
| 190 |
| Model Name | microsoft/deberta-v3-small |
|
| 191 |
+
| Training Dataset | dair-ai/emotion |
|
| 192 |
| Number of Training Epochs | 20 |
|
| 193 |
+
| Learning Rate | 2e-5 |
|
| 194 |
+
| Per Device Train Batch Size | 4 |
|
| 195 |
+
| Evaluation Strategy | Epoch |
|
| 196 |
+
| Best Model Accuracy | 94.6% |
|
|
|
|
| 197 |
|
| 198 |
+
## License
|
| 199 |
|
| 200 |
This model is licensed under the [MIT License](LICENSE).
|