---
library_name: transformers
base_model: monologg/koelectra-small-finetuned-sentiment
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
model-index:
- name: koelectra_emotion_v2
  results: []
---

# koelectra_emotion_v2

This model is a fine-tuned version of [monologg/koelectra-small-finetuned-sentiment](https://huggingface.co/monologg/koelectra-small-finetuned-sentiment) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.4260
- Accuracy: 0.6118
- F1: 0.6172

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: AdamW (torch fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 20

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy | F1     |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:------:|
| 1.5444        | 1.0   | 601   | 1.3725          | 0.4626   | 0.4567 |
| 1.218         | 2.0   | 1202  | 1.2090          | 0.5664   | 0.5693 |
| 1.0556        | 3.0   | 1803  | 1.1186          | 0.6031   | 0.6075 |
| 0.9497        | 4.0   | 2404  | 1.1470          | 0.5981   | 0.6056 |
| 0.8734        | 5.0   | 3005  | 1.1450          | 0.6005   | 0.6067 |
| 0.808         | 6.0   | 3606  | 1.1479          | 0.6136   | 0.6181 |
| 0.7489        | 7.0   | 4207  | 1.1225          | 0.6287   | 0.6348 |
| 0.6926        | 8.0   | 4808  | 1.2075          | 0.6096   | 0.6177 |
| 0.6472        | 9.0   | 5409  | 1.2047          | 0.6180   | 0.6227 |
| 0.6028        | 10.0  | 6010  | 1.2248          | 0.6194   | 0.6249 |
| 0.5624        | 11.0  | 6611  | 1.2474          | 0.6154   | 0.6215 |
| 0.5303        | 12.0  | 7212  | 1.2627          | 0.6203   | 0.6257 |
| 0.4956        | 13.0  | 7813  | 1.2977          | 0.6191   | 0.6245 |
| 0.4662        | 14.0  | 8414  | 1.3655          | 0.6081   | 0.6144 |
| 0.4439        | 15.0  | 9015  | 1.3801          | 0.6067   | 0.6124 |
| 0.4221        | 16.0  | 9616  | 1.3854          | 0.6124   | 0.6167 |
| 0.4097        | 17.0  | 10217 | 1.4101          | 0.6105   | 0.6164 |
| 0.3921        | 18.0  | 10818 | 1.4359          | 0.6054   | 0.6109 |
| 0.3799        | 19.0  | 11419 | 1.4269          | 0.6111   | 0.6169 |
| 0.369         | 20.0  | 12020 | 1.4260          | 0.6118   | 0.6172 |

### Framework versions

- Transformers 4.56.1
- Pytorch 2.8.0+cu126
- Datasets 4.0.0
- Tokenizers 0.22.0
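
### Example usage

The card does not document the emotion label set, so the sketch below reads label names from the checkpoint's own `id2label` config. It is a minimal inference example, assuming the fine-tuned checkpoint is available locally or on the Hub under the placeholder id `koelectra_emotion_v2`; adjust the path to wherever the checkpoint actually lives.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Placeholder path/id for the fine-tuned checkpoint described in this card.
model_id = "koelectra_emotion_v2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "오늘 하루가 정말 행복했어요."  # "I was really happy today."

inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(dim=-1).item()
# Label names come from the checkpoint config, not from this card.
print(model.config.id2label[predicted_id])
```

### Reproducing the training configuration

As a rough guide only, the following sketch maps the listed hyperparameters onto `TrainingArguments`. The dataset, preprocessing, and the `Trainer` call itself are omitted because the training data is not documented in this card; the `output_dir` name is a placeholder.

```python
from transformers import TrainingArguments

# Minimal TrainingArguments mirroring the hyperparameters listed above.
# "adamw_torch_fused" corresponds to OptimizerNames.ADAMW_TORCH_FUSED;
# AdamW's default betas=(0.9, 0.999) and epsilon=1e-08 are left unchanged.
training_args = TrainingArguments(
    output_dir="koelectra_emotion_v2",   # placeholder output directory
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    optim="adamw_torch_fused",
    lr_scheduler_type="linear",
    num_train_epochs=20,
)
```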
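Note that per-epoch evaluation (as shown in the training results table) requires the corresponding evaluation settings (e.g. `eval_strategy="epoch"` in Transformers 4.56) and a tokenized evaluation dataset passed to the `Trainer`, both of which are left out of the sketch above.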