Dataset: dair-ai/emotion
How to use gokuls/bert_12_layer_model_v3_48_emotion with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="gokuls/bert_12_layer_model_v3_48_emotion")
```

```python
# Load model directly
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gokuls/bert_12_layer_model_v3_48_emotion")
model = AutoModelForSequenceClassification.from_pretrained("gokuls/bert_12_layer_model_v3_48_emotion", dtype="auto")
```

This model is a fine-tuned version of gokuls/bert_12_layer_model_v3_complete_training_48 on the emotion dataset. It achieves the per-epoch results on the evaluation set shown in the table below.
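Under the hood, a `text-classification` pipeline converts the model's raw logits into per-label probabilities with a softmax and reports the highest-scoring label. A minimal sketch of that post-processing step, assuming the six-label id order of the dair-ai/emotion dataset (the logit values here are illustrative, not real model output):

```python
import math

# Label order assumed from the dair-ai/emotion dataset (ids 0..5).
EMOTION_LABELS = ["sadness", "joy", "love", "anger", "fear", "surprise"]

def softmax(logits):
    """Convert raw logits to probabilities (numerically stable form)."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def top_label(logits):
    """Return (label, probability) for the highest-scoring class."""
    probs = softmax(logits)
    i = max(range(len(probs)), key=probs.__getitem__)
    return EMOTION_LABELS[i], probs[i]

# Illustrative logits for a sentence the model might score as "joy".
label, prob = top_label([-1.2, 3.4, 0.1, -0.5, -0.8, -1.0])
print(label, round(prob, 3))
```

This mirrors what `pipe("I love this!")` returns as `{"label": ..., "score": ...}`, without downloading the model.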
Model description, intended uses and limitations, and training/evaluation data: more information needed.
Per-epoch training and evaluation results:
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|---|---|---|---|---|
| 0.9112 | 1.0 | 250 | 0.5176 | 0.8495 |
| 0.3890 | 2.0 | 500 | 0.3617 | 0.8755 |
| 0.2894 | 3.0 | 750 | 0.3037 | 0.8905 |
| 0.2359 | 4.0 | 1000 | 0.3346 | 0.8950 |
| 0.1883 | 5.0 | 1250 | 0.3178 | 0.8955 |
| 0.1638 | 6.0 | 1500 | 0.3597 | 0.8970 |
| 0.1217 | 7.0 | 1750 | 0.4075 | 0.8895 |
| 0.0962 | 8.0 | 2000 | 0.4023 | 0.8990 |
| 0.0732 | 9.0 | 2250 | 0.4479 | 0.8955 |
| 0.0569 | 10.0 | 2500 | 0.4894 | 0.8985 |
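The table suggests mild overfitting: validation loss bottoms out at epoch 3 and climbs afterwards, while accuracy plateaus around 0.89–0.90. If checkpoint selection matters for your use case, you can pick the epoch by either criterion; a small sketch over the table's values (copied verbatim from above):

```python
# (epoch, validation_loss, accuracy) tuples copied from the results table.
results = [
    (1, 0.5176, 0.8495),
    (2, 0.3617, 0.8755),
    (3, 0.3037, 0.8905),
    (4, 0.3346, 0.8950),
    (5, 0.3178, 0.8955),
    (6, 0.3597, 0.8970),
    (7, 0.4075, 0.8895),
    (8, 0.4023, 0.8990),
    (9, 0.4479, 0.8955),
    (10, 0.4894, 0.8985),
]

best_by_loss = min(results, key=lambda r: r[1])  # lowest validation loss
best_by_acc = max(results, key=lambda r: r[2])   # highest accuracy
print("best by loss:", best_by_loss)  # epoch 3
print("best by accuracy:", best_by_acc)  # epoch 8
```

With the Trainer API, `load_best_model_at_end=True` together with `metric_for_best_model` automates this choice during training; which checkpoint this published model corresponds to is not stated in the card.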