Dataset: dair-ai/emotion
How to use gokuls/bert_12_layer_model_v4_48_emotion with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="gokuls/bert_12_layer_model_v4_48_emotion")
```

```python
# Load the model directly
from transformers import AutoModelForSequenceClassification

model = AutoModelForSequenceClassification.from_pretrained(
    "gokuls/bert_12_layer_model_v4_48_emotion", dtype="auto"
)
```

This model is a fine-tuned version of gokuls/bert_12_layer_model_v4_complete_training_48 on the emotion dataset. It achieves the following results on the evaluation set (per-epoch values are in the training results table below).
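The emotion dataset has six classes. The mapping below assumes the model's output indices follow the dair-ai/emotion label order, which is an assumption about this particular checkpoint; verify against `model.config.id2label` before relying on it:

```python
# Label order of the dair-ai/emotion dataset. This is an assumption for this
# model's output head -- check model.config.id2label to confirm.
EMOTION_LABELS = ["sadness", "joy", "love", "anger", "fear", "surprise"]

def label_for(class_id: int) -> str:
    """Map a predicted class index to its emotion name."""
    return EMOTION_LABELS[class_id]

print(label_for(1))  # -> joy
```

With the pipeline helper this mapping is applied for you; it only matters when you call the model directly and take an argmax over the logits.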
Model description: More information needed.

Intended uses & limitations: More information needed.

Training and evaluation data: More information needed.
The following results were recorded during training:
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|---|---|---|---|---|
| 1.605 | 1.0 | 250 | 1.5438 | 0.4455 |
| 1.5564 | 2.0 | 500 | 1.6343 | 0.318 |
| 1.5893 | 3.0 | 750 | 1.5894 | 0.31 |
| 1.5839 | 4.0 | 1000 | 1.5841 | 0.3505 |
| 1.5879 | 5.0 | 1250 | 1.6087 | 0.275 |
| 1.5892 | 6.0 | 1500 | 1.5838 | 0.352 |
| 1.5819 | 7.0 | 1750 | 1.5755 | 0.3465 |
| 1.5766 | 8.0 | 2000 | 1.5800 | 0.347 |
| 1.5745 | 9.0 | 2250 | 1.5768 | 0.3505 |
| 1.5717 | 10.0 | 2500 | 1.5774 | 0.3455 |