---
language: en
license: mit
tags:
- text-classification
- multi-label
- emotion-detection
- huggingface
---

# emo-detector

**emo-detector** is a multi-label emotion detection model for text. It predicts one or more emotions from the following labels:

- anger
- fear
- joy
- sadness
- surprise

## Model Details

- **Architecture:** Pretrained DeBERTa encoder + custom FFNN classification head
- **Task:** Multi-label text classification
- **Tokenizer:** DeBERTa tokenizer (`microsoft/deberta-v3-base`)
- **Output:** Per-label probabilities → thresholded to 0/1
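
The last bullet can be sketched in plain Python: because this is multi-label classification, each logit is passed through a sigmoid independently (rather than a softmax across labels), and a fixed threshold converts probabilities into 0/1 flags. The logit values below are illustrative only, not real model output:

```python
import math

def sigmoid(x: float) -> float:
    # Maps a raw logit to an independent probability in (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

# Toy logits for one input over the five labels (illustrative values)
logits = [-2.0, 0.3, 4.1, -1.5, -0.2]

probs = [sigmoid(z) for z in logits]      # one probability per label
preds = [int(p > 0.5) for p in probs]     # multi-hot vector of 0/1 flags

print(preds)  # → [0, 1, 1, 0, 0]
```

Because each label is decided independently, any number of labels (including zero) can be active for a single input.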

## Custom Model Class

This model uses a custom architecture defined inside the `emo_detector/` module:

- `emo_detector/configuration_bert_ffnn.py` → `BertFFNNConfig`
- `emo_detector/modeling_bert_ffnn.py` → `BERT_FFNN`

To load or fine-tune this model, you must download the full repository (including the `emo_detector/` folder).
The recommended way is to use `snapshot_download()` from Hugging Face Hub.

## Installation
```bash
pip install torch transformers huggingface_hub
```

## Usage

```python
import sys
import torch
from transformers import AutoTokenizer
from huggingface_hub import snapshot_download

# Download entire repository
repo_dir = snapshot_download("NeuralNest05/emo-detector")
sys.path.append(repo_dir)

# Import custom architecture + config
from emo_detector.configuration_bert_ffnn import BertFFNNConfig
from emo_detector.modeling_bert_ffnn import BERT_FFNN

DEVICE = "cuda" if torch.cuda.is_available() else "cpu"

# Load tokenizer
tokenizer = AutoTokenizer.from_pretrained("NeuralNest05/emo-detector")

# Load model config and architecture
config = BertFFNNConfig.from_pretrained("NeuralNest05/emo-detector")
model = BERT_FFNN(config)

# Load weights
model.load_state_dict(torch.load(f"{repo_dir}/pytorch_model.bin", map_location=DEVICE))
model.to(DEVICE)
model.eval()

# Example prediction
texts = ["I am very happy today!", "This is scary..."]
encodings = tokenizer(texts, truncation=True, padding=True, return_tensors="pt").to(DEVICE)

with torch.no_grad():
    logits = model(**encodings)
    probs = torch.sigmoid(logits)
    threshold = 0.5
    preds = (probs > threshold).int()

print(preds)
```

## Output Format

Each prediction corresponds to the five emotion labels in this order:
```
["anger", "fear", "joy", "sadness", "surprise"]
```
Output is a multi-hot vector, e.g.:
```
[0, 0, 1, 0, 0] → joy
```
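
To map a multi-hot vector back to emotion names, you can zip it with the label list above. The `decode_labels` helper shown here is illustrative and not part of the repository:

```python
LABELS = ["anger", "fear", "joy", "sadness", "surprise"]

def decode_labels(multi_hot):
    """Return the emotion names whose flag is 1 in a multi-hot vector."""
    return [label for label, flag in zip(LABELS, multi_hot) if flag]

print(decode_labels([0, 0, 1, 0, 0]))  # → ['joy']
print(decode_labels([1, 0, 0, 0, 1]))  # → ['anger', 'surprise']
```

For batched predictions from the usage example, call `decode_labels(row)` on each row of `preds.tolist()`.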

## License
MIT License

## Acknowledgements

- Microsoft DeBERTa-v3
- Hugging Face Transformers
- PyTorch