---
library_name: transformers
tags:
  - generated_from_trainer
metrics:
  - accuracy
  - f1
  - precision
model-index:
  - name: student_s3d_default_not_learning_RWF2000
    results: []
---

student_s3d_default_not_learning_RWF2000

This model is a fine-tuned version of an unspecified base model on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3789
  • Accuracy: 0.885
  • F1: 0.8847
  • Precision: 0.8893
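As a point of reference, the reported metrics can be computed from binary predictions and labels as below. This is a minimal stdlib-only sketch, not the evaluation script actually used for this model (which likely uses the `evaluate` library and possibly weighted averaging); the function name `binary_metrics` is illustrative.

```python
def binary_metrics(preds, labels):
    """Accuracy, precision, and F1 for binary (0/1) predictions."""
    tp = sum(1 for p, y in zip(preds, labels) if p == 1 and y == 1)
    fp = sum(1 for p, y in zip(preds, labels) if p == 1 and y == 0)
    fn = sum(1 for p, y in zip(preds, labels) if p == 0 and y == 1)
    correct = sum(1 for p, y in zip(preds, labels) if p == y)
    accuracy = correct / len(labels)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return accuracy, precision, f1
```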

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 75
  • eval_batch_size: 75
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 47
  • training_steps: 475
  • mixed_precision_training: Native AMP
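The `linear` scheduler with warmup used above ramps the learning rate from 0 to `learning_rate` over the warmup steps, then decays it linearly back to 0 by the final training step. A small sketch of that schedule with this run's values (an approximation of the behavior of `transformers`' linear scheduler, not the trainer's actual code):

```python
def linear_schedule_lr(step, base_lr=1e-5, warmup_steps=47, total_steps=475):
    """Learning rate at a given optimizer step under linear warmup + linear decay."""
    if step < warmup_steps:
        # Linear warmup: 0 -> base_lr over warmup_steps.
        return base_lr * step / warmup_steps
    # Linear decay: base_lr -> 0 between warmup_steps and total_steps.
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```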

Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy | F1     | Precision |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:------:|:---------:|
| 0.6345        | 2.0147  | 47   | 0.5707          | 0.7656   | 0.7621 | 0.7824    |
| 0.4332        | 4.0295  | 94   | 0.3666          | 0.8219   | 0.8218 | 0.8222    |
| 0.3306        | 7.0021  | 141  | 0.4903          | 0.825    | 0.8246 | 0.8283    |
| 0.2447        | 9.0168  | 188  | 0.3415          | 0.8375   | 0.8361 | 0.8498    |
| 0.1972        | 11.0316 | 235  | 0.4171          | 0.8375   | 0.8359 | 0.8516    |
| 0.173         | 14.0042 | 282  | 0.3911          | 0.8656   | 0.8650 | 0.8720    |
| 0.1481        | 16.0189 | 329  | 0.4326          | 0.8719   | 0.8714 | 0.8772    |
| 0.1165        | 18.0337 | 376  | 0.2364          | 0.8812   | 0.8811 | 0.8834    |
| 0.098         | 21.0063 | 423  | 0.5760          | 0.8844   | 0.8843 | 0.8856    |
| 0.0857        | 23.0211 | 470  | 0.5217          | 0.8812   | 0.8811 | 0.8827    |

Framework versions

  • Transformers 4.45.2
  • PyTorch 2.0.1+cu118
  • Datasets 3.0.1
  • Tokenizers 0.20.0