# koelectra_emotion_v3
This model is a fine-tuned version of monologg/koelectra-small-finetuned-sentiment on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 2.5162
- Accuracy: 0.4089
- F1: 0.4177
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: fused AdamW (`adamw_torch_fused`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
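The cosine schedule with 10% warmup listed above can be sketched in plain Python. This is a minimal stand-alone approximation of what Transformers computes for `lr_scheduler_type: cosine` (e.g. via `get_cosine_schedule_with_warmup`), not the exact implementation; the 3723 steps-per-epoch figure is taken from the results table below.

```python
import math

def cosine_lr_with_warmup(step, total_steps, base_lr=5e-5, warmup_ratio=0.1):
    """Cosine-decay learning rate with linear warmup (approximation).

    Mirrors the hyperparameters above: learning_rate=5e-05,
    lr_scheduler_type=cosine, lr_scheduler_warmup_ratio=0.1.
    """
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        # Linear warmup from 0 up to base_lr over the first 10% of steps.
        return base_lr * step / max(1, warmup_steps)
    # Cosine decay from base_lr down to 0 over the remaining steps.
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

total = 3723 * 50          # steps per epoch × num_epochs
peak = cosine_lr_with_warmup(int(total * 0.1), total)  # end of warmup = base_lr
```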
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|---|---|---|---|---|---|
| 1.8421 | 1.0 | 3723 | 1.7489 | 0.2754 | 0.2071 |
| 1.6975 | 2.0 | 7446 | 1.6315 | 0.3388 | 0.3192 |
| 1.6025 | 3.0 | 11169 | 1.5650 | 0.3819 | 0.3749 |
| 1.5331 | 4.0 | 14892 | 1.5103 | 0.4024 | 0.3932 |
| 1.4824 | 5.0 | 18615 | 1.4976 | 0.4120 | 0.4063 |
| 1.4378 | 6.0 | 22338 | 1.4715 | 0.4249 | 0.4243 |
| 1.393 | 7.0 | 26061 | 1.4635 | 0.4313 | 0.4309 |
| 1.3493 | 8.0 | 29784 | 1.4839 | 0.4304 | 0.4288 |
| 1.3069 | 9.0 | 33507 | 1.4705 | 0.4342 | 0.4360 |
| 1.2639 | 10.0 | 37230 | 1.4816 | 0.4363 | 0.4409 |
| 1.2218 | 11.0 | 40953 | 1.5249 | 0.4311 | 0.4352 |
| 1.1793 | 12.0 | 44676 | 1.5316 | 0.4309 | 0.4372 |
| 1.1342 | 13.0 | 48399 | 1.5984 | 0.4302 | 0.4355 |
| 1.0891 | 14.0 | 52122 | 1.6145 | 0.4265 | 0.4329 |
| 1.0445 | 15.0 | 55845 | 1.6517 | 0.4273 | 0.4354 |
| 0.9999 | 16.0 | 59568 | 1.6906 | 0.4266 | 0.4307 |
| 0.9593 | 17.0 | 63291 | 1.7398 | 0.4217 | 0.4297 |
| 0.9134 | 18.0 | 67014 | 1.8011 | 0.4162 | 0.4270 |
| 0.8697 | 19.0 | 70737 | 1.8496 | 0.4200 | 0.4262 |
| 0.8281 | 20.0 | 74460 | 1.8960 | 0.4169 | 0.4241 |
| 0.7899 | 21.0 | 78183 | 1.9289 | 0.4157 | 0.4246 |
| 0.7519 | 22.0 | 81906 | 2.0116 | 0.4166 | 0.4243 |
| 0.7125 | 23.0 | 85629 | 2.0726 | 0.4148 | 0.4226 |
| 0.6757 | 24.0 | 89352 | 2.1433 | 0.4081 | 0.4179 |
| 0.6417 | 25.0 | 93075 | 2.1700 | 0.4025 | 0.4160 |
| 0.6139 | 26.0 | 96798 | 2.2486 | 0.4061 | 0.4159 |
| 0.5799 | 27.0 | 100521 | 2.3329 | 0.4051 | 0.4155 |
| 0.5529 | 28.0 | 104244 | 2.3371 | 0.4102 | 0.4174 |
| 0.5229 | 29.0 | 107967 | 2.4037 | 0.4021 | 0.4151 |
| 0.4992 | 30.0 | 111690 | 2.5162 | 0.4089 | 0.4177 |
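The headline metrics above correspond to the final (epoch-30) checkpoint, but validation loss bottoms out around epoch 7 and F1 peaks at epoch 10, after which training loss keeps falling while validation loss rises (overfitting). A small sketch, using a few rows copied from the table, of how one might instead pick a checkpoint to deploy:

```python
# Rows copied from the training-results table: (epoch, val_loss, accuracy, f1).
log = [
    (7, 1.4635, 0.4313, 0.4309),
    (9, 1.4705, 0.4342, 0.4360),
    (10, 1.4816, 0.4363, 0.4409),
    (30, 2.5162, 0.4089, 0.4177),
]

# Select by highest validation F1 (epoch 10) or lowest validation loss (epoch 7),
# depending on which metric matters for the downstream task.
best_by_f1 = max(log, key=lambda row: row[3])
best_by_loss = min(log, key=lambda row: row[1])
```

With `Trainer`, the same effect can be had by setting `load_best_model_at_end=True` together with `metric_for_best_model`.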
### Framework versions
- Transformers 4.56.1
- Pytorch 2.8.0+cu126
- Datasets 4.0.0
- Tokenizers 0.22.0