Panda0116 committed · Commit d9068cc · verified · 1 Parent(s): de3bda1

Upload README.md with huggingface_hub

Files changed (1): README.md (+66 -48)
README.md CHANGED
@@ -1,64 +1,82 @@
- ---
- library_name: transformers
- license: apache-2.0
- base_model: distilbert-base-uncased
- tags:
- - generated_from_trainer
- metrics:
- - accuracy
- model-index:
- - name: emotion-classification-model
-   results: []
- ---
-
- <!-- This model card has been generated automatically according to the information the Trainer had access to. You
- should probably proofread and complete it, then remove this comment. -->
-
  # emotion-classification-model

- This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
- It achieves the following results on the evaluation set:
- - Loss: 0.1565
- - Accuracy: 0.9415
-
- ## Model description
-
- More information needed
-
- ## Intended uses & limitations
-
- More information needed
-
- ## Training and evaluation data
-
- More information needed
-
- ## Training procedure
-
- ### Training hyperparameters
-
- The following hyperparameters were used during training:
- - learning_rate: 5e-05
- - train_batch_size: 16
- - eval_batch_size: 16
- - seed: 42
- - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- - lr_scheduler_type: linear
- - num_epochs: 3
- - mixed_precision_training: Native AMP
-
- ### Training results
-
- | Training Loss | Epoch | Step | Validation Loss | Accuracy |
- |:-------------:|:-----:|:----:|:---------------:|:--------:|
- | 0.225         | 1.0   | 1000 | 0.1815          | 0.9295   |
- | 0.1279        | 2.0   | 2000 | 0.1561          | 0.933    |
- | 0.0795        | 3.0   | 3000 | 0.1565          | 0.9415   |
-
- ### Framework versions
-
- - Transformers 4.46.2
- - Pytorch 2.5.1+cu124
- - Datasets 3.1.0
- - Tokenizers 0.20.3

  # emotion-classification-model

+ This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the [dair-ai/emotion](https://huggingface.co/datasets/dair-ai/emotion) dataset. It is designed to classify English text into six emotional categories.
+
+ It achieves the following results:
+ - **Validation Accuracy:** 93.55%
+ - **Test Accuracy:** 93.3%
+
+ ## Model Description
+
+ This model uses the DistilBERT architecture, a lighter and faster distilled variant of BERT. It has been fine-tuned specifically for emotion classification, making it suitable for tasks such as sentiment analysis, customer feedback analysis, and user emotion detection. A direct-loading sketch follows the feature list below.
+
+ ### Key Features
+ - Efficient and lightweight to deploy.
+ - High accuracy on emotion detection tasks.
+ - Pretrained on a large general-purpose corpus, then fine-tuned specifically for emotion labels.
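+
+ A minimal sketch of loading the model directly rather than through `pipeline`, using the standard `transformers` API (not part of the original card):
+
+ ```python
+ import torch
+ from transformers import AutoModelForSequenceClassification, AutoTokenizer
+
+ # Load the fine-tuned checkpoint and its tokenizer from the Hub
+ tokenizer = AutoTokenizer.from_pretrained("Panda0116/emotion-classification-model")
+ model = AutoModelForSequenceClassification.from_pretrained("Panda0116/emotion-classification-model")
+
+ inputs = tokenizer("I am so happy to see you!", return_tensors="pt")
+ with torch.no_grad():
+     logits = model(**inputs).logits
+ print(logits.argmax(dim=-1).item())  # predicted class id in [0, 5]
+ ```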
+
+ ## Intended Uses & Limitations
+
+ ### Intended Uses
+ - Emotion analysis in text data.
+ - Sentiment detection in customer reviews, tweets, or user feedback.
+ - Psychological or behavioral studies to analyze emotional tone in communications.
+
+ ### Limitations
+ - May not generalize well to datasets with highly domain-specific language.
+ - Might struggle with sarcasm, irony, or other nuanced forms of language.
+ - The model is English-specific and may not perform well on non-English text.
+
+ ## Training and Evaluation Data
+
+ ### Training Dataset
+ - **Dataset:** [dair-ai/emotion](https://huggingface.co/datasets/dair-ai/emotion)
+ - **Training Set Size:** 16,000 examples
+ - **Dataset Description:** The dataset contains English sentences labeled with six emotional categories: sadness, joy, love, anger, fear, and surprise (see the loading sketch below).
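+
+ A short sketch for loading and inspecting the dataset with the standard `datasets` API (added for illustration):
+
+ ```python
+ from datasets import load_dataset
+
+ # Load the dataset used for fine-tuning
+ dataset = load_dataset("dair-ai/emotion")
+
+ print(dataset["train"].num_rows)                 # 16000
+ print(dataset["train"].features["label"].names)  # ['sadness', 'joy', 'love', 'anger', 'fear', 'surprise']
+ ```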
+
+ ### Results
+ - **Training Time:** ~204 seconds
+ - **Training Loss:** 0.2034
+ - **Validation Accuracy:** 93.55%
+ - **Test Accuracy:** 93.3%
+
+ ## Training Procedure
+
+ ### Hyperparameters
+ The following settings were used (a sketch of equivalent `TrainingArguments` follows this list):
+ - **Learning Rate:** 5e-05
+ - **Batch Size:** 16 (train and evaluation)
+ - **Epochs:** 3
+ - **Seed:** 42
+ - **Optimizer:** AdamW (betas=(0.9, 0.999), epsilon=1e-08)
+ - **Learning Rate Scheduler:** Linear
+ - **Mixed Precision Training:** Native AMP
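+
+ A hypothetical reconstruction of this configuration as `TrainingArguments`; the actual training script was not published with the card:
+
+ ```python
+ from transformers import TrainingArguments
+
+ training_args = TrainingArguments(
+     output_dir="emotion-classification-model",
+     learning_rate=5e-5,
+     per_device_train_batch_size=16,
+     per_device_eval_batch_size=16,
+     num_train_epochs=3,
+     seed=42,
+     optim="adamw_torch",        # AdamW with the default betas/epsilon listed above
+     lr_scheduler_type="linear",
+     fp16=True,                  # native AMP mixed precision
+ )
+ ```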
+
+ ### Training and Validation Results
+
+ | Epoch | Training Loss | Validation Loss | Validation Accuracy |
+ |-------|---------------|-----------------|---------------------|
+ | 1     | 0.2293        | 0.1746          | 93.35%              |
+ | 2     | 0.1315        | 0.1529          | 93.70%              |
+ | 3     | 0.0798        | 0.1554          | 93.55%              |
+
+ ### Test Results
+ - **Loss:** 0.1642
+ - **Accuracy:** 93.3%
+
+ ### Performance Metrics
+ - **Training Speed:** ~204 samples/second
+ - **Evaluation Speed:** ~986 samples/second
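+
+ A hypothetical sketch of how the test-set accuracy above could be recomputed with the `evaluate` library (not the author's evaluation script; the label-name handling is an assumption):
+
+ ```python
+ import evaluate
+ from datasets import load_dataset
+ from transformers import pipeline
+
+ classifier = pipeline("text-classification", model="Panda0116/emotion-classification-model")
+ test = load_dataset("dair-ai/emotion", split="test")
+
+ # Map predicted labels back to integer ids; handles both named labels and
+ # default "LABEL_<i>" ids, since the card does not say which the config uses.
+ label2id = {name: i for i, name in enumerate(test.features["label"].names)}
+ def to_id(label):
+     return label2id[label] if label in label2id else int(label.rsplit("_", 1)[-1])
+
+ outputs = classifier(test["text"], batch_size=16)
+ predictions = [to_id(out["label"]) for out in outputs]
+
+ accuracy = evaluate.load("accuracy")
+ print(accuracy.compute(predictions=predictions, references=test["label"]))
+ ```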
+
+ ## Usage Example
+
+ ```python
+ from transformers import pipeline
+
+ # Load the fine-tuned model
+ classifier = pipeline("text-classification", model="Panda0116/emotion-classification-model")
+
+ # Example usage
+ text = "I am so happy to see you!"
+ emotion = classifier(text)
+ print(emotion)
+ ```
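+
+ The pipeline returns a list with the top label and score for each input, e.g. `[{'label': 'joy', 'score': 0.99}]` (or a generic `LABEL_<i>` id if the checkpoint's config does not store label names). Passing `top_k=None` to the pipeline call returns scores for all six classes.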