---
library_name: transformers
base_model: monologg/koelectra-small-finetuned-sentiment
tags:
  - generated_from_trainer
metrics:
  - accuracy
  - f1
model-index:
  - name: koelectra_emotion_v2_2
    results: []
---

# koelectra_emotion_v2_2

This model is a fine-tuned version of [monologg/koelectra-small-finetuned-sentiment](https://huggingface.co/monologg/koelectra-small-finetuned-sentiment) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 1.4392
- Accuracy: 0.5249
- F1: 0.5227
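The card does not state how accuracy and F1 were computed. A typical `compute_metrics` function for the `transformers` Trainer might look like the sketch below; the use of scikit-learn and weighted F1 averaging are assumptions, not something stated in the card.

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

def compute_metrics(eval_pred):
    """Hypothetical metric function in the style passed to transformers.Trainer."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_score(labels, preds),
        # Weighted averaging is a guess; the card does not name the F1 mode.
        "f1": f1_score(labels, preds, average="weighted"),
    }

# Tiny worked example with 2 classes and 4 samples
logits = np.array([[2.0, 0.1], [0.2, 1.5], [1.0, 0.0], [0.1, 0.9]])
labels = np.array([0, 1, 1, 1])
print(compute_metrics((logits, labels)))
```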

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: adamw_torch_fused with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 20
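The hyperparameters above map onto `transformers.TrainingArguments` roughly as follows; this is a reconstruction, not the author's actual training script, and `output_dir` is a placeholder.

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the listed hyperparameters; all other
# settings are assumed to be library defaults.
training_args = TrainingArguments(
    output_dir="koelectra_emotion_v2_2",  # placeholder path
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    optim="adamw_torch_fused",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=20,
)
```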

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy | F1     |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:------:|
| 1.8251        | 1.0   | 601   | 1.7175          | 0.3130   | 0.2585 |
| 1.6569        | 2.0   | 1202  | 1.5616          | 0.4007   | 0.3821 |
| 1.5086        | 3.0   | 1803  | 1.4698          | 0.4397   | 0.4258 |
| 1.4099        | 4.0   | 2404  | 1.4268          | 0.4661   | 0.4592 |
| 1.3318        | 5.0   | 3005  | 1.3885          | 0.4779   | 0.4770 |
| 1.2688        | 6.0   | 3606  | 1.3679          | 0.4927   | 0.4849 |
| 1.2213        | 7.0   | 4207  | 1.3497          | 0.5071   | 0.5015 |
| 1.1756        | 8.0   | 4808  | 1.3405          | 0.5144   | 0.5099 |
| 1.1328        | 9.0   | 5409  | 1.3548          | 0.5162   | 0.5125 |
| 1.0964        | 10.0  | 6010  | 1.3864          | 0.5151   | 0.5081 |
| 1.0675        | 11.0  | 6611  | 1.3606          | 0.5198   | 0.5162 |
| 1.0368        | 12.0  | 7212  | 1.3719          | 0.5195   | 0.5170 |
| 1.0044        | 13.0  | 7813  | 1.3695          | 0.5256   | 0.5222 |
| 0.9808        | 14.0  | 8414  | 1.3896          | 0.5216   | 0.5196 |
| 0.9554        | 15.0  | 9015  | 1.4033          | 0.5228   | 0.5192 |
| 0.9307        | 16.0  | 9616  | 1.4135          | 0.5247   | 0.5236 |
| 0.9131        | 17.0  | 10217 | 1.4193          | 0.5198   | 0.5181 |
| 0.8969        | 18.0  | 10818 | 1.4259          | 0.5276   | 0.5243 |
| 0.8825        | 19.0  | 11419 | 1.4385          | 0.5245   | 0.5224 |
| 0.8779        | 20.0  | 12020 | 1.4392          | 0.5249   | 0.5227 |
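Validation loss reaches its minimum at epoch 8 and climbs afterward while training loss keeps falling, a typical sign of overfitting; a quick check over the table's numbers confirms this:

```python
# Validation loss per epoch, copied from the training results table above
val_loss = {
    1: 1.7175, 2: 1.5616, 3: 1.4698, 4: 1.4268, 5: 1.3885,
    6: 1.3679, 7: 1.3497, 8: 1.3405, 9: 1.3548, 10: 1.3864,
    11: 1.3606, 12: 1.3719, 13: 1.3695, 14: 1.3896, 15: 1.4033,
    16: 1.4135, 17: 1.4193, 18: 1.4259, 19: 1.4385, 20: 1.4392,
}
best_epoch = min(val_loss, key=val_loss.get)
print(best_epoch, val_loss[best_epoch])  # → 8 1.3405
```

Checkpointing on best validation loss (or early stopping) would pick the epoch-8 weights rather than the final epoch-20 ones.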

### Framework versions

- Transformers 4.56.1
- Pytorch 2.8.0+cu126
- Datasets 4.0.0
- Tokenizers 0.22.0