---
library_name: transformers
tags:
  - generated_from_trainer
metrics:
  - accuracy
  - f1
  - precision
  - recall
model-index:
  - name: profesor_MViT_B_RWF2000
    results: []
---

# profesor_MViT_B_RWF2000

This model is a fine-tuned version of MViT-B (inferred from the model name; the base checkpoint is not specified in the card) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.2639
- Accuracy: 0.9225
- F1: 0.9225
- Precision: 0.9225
- Recall: 0.9225
- ROC AUC: 0.9782
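
The card does not include a usage snippet. The sketch below is a hedged example, assuming the checkpoint is compatible with the transformers `video-classification` pipeline; the repository id is inferred from the card and the input video path is illustrative.

```python
from transformers import pipeline

# Hedged usage sketch: the repo id is inferred, and trust_remote_code is an
# assumption that only matters if MViT ships as custom code in the repo.
classifier = pipeline(
    "video-classification",
    model="DanJoshua/profesor_MViT_B_RWF2000",
    trust_remote_code=True,
)

predictions = classifier("path/to/clip.mp4")  # hypothetical video file
print(predictions)  # e.g. [{"label": ..., "score": ...}, ...]
```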

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 1e-05
- train_batch_size: 20
- eval_batch_size: 20
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 240
- training_steps: 2400
- mixed_precision_training: Native AMP
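
For reference, a minimal sketch of how these hyperparameters map onto `transformers.TrainingArguments`; this is not the authors' actual training script, and the `output_dir` is illustrative.

```python
from transformers import TrainingArguments

# Sketch only: every value below is copied from the list above; output_dir is
# a placeholder.
training_args = TrainingArguments(
    output_dir="profesor_MViT_B_RWF2000",
    learning_rate=1e-5,
    per_device_train_batch_size=20,
    per_device_eval_batch_size=20,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=240,
    max_steps=2400,
    fp16=True,  # "Native AMP" mixed-precision training
)
```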

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy | F1     | Precision | Recall | ROC AUC |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|:-------:|
| 0.5302        | 2.0333  | 240  | 0.4018          | 0.895    | 0.895  | 0.895     | 0.895  | 0.9609  |
| 0.2816        | 5.0333  | 480  | 0.2559          | 0.9125   | 0.9125 | 0.9126    | 0.9125 | 0.9782  |
| 0.1884        | 8.0333  | 720  | 0.2456          | 0.91     | 0.9099 | 0.9115    | 0.91   | 0.9799  |
| 0.1991        | 11.0333 | 960  | 0.2289          | 0.9225   | 0.9225 | 0.9225    | 0.9225 | 0.9815  |
| 0.1298        | 14.0333 | 1200 | 0.2186          | 0.9275   | 0.9275 | 0.9275    | 0.9275 | 0.9834  |
| 0.1518        | 17.0333 | 1440 | 0.2484          | 0.9275   | 0.9275 | 0.9276    | 0.9275 | 0.9798  |
| 0.107         | 20.0333 | 1680 | 0.2442          | 0.93     | 0.9300 | 0.9300    | 0.93   | 0.9834  |
| 0.1021        | 23.0333 | 1920 | 0.2653          | 0.925    | 0.9250 | 0.9252    | 0.925  | 0.9813  |
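
The card does not say how the metric columns were computed. The sketch below is one plausible `compute_metrics` function for the Trainer that would produce these columns; the weighted averaging is an assumption (it matches the near-identical accuracy/F1/precision/recall values on a roughly balanced binary task).

```python
import numpy as np
from sklearn.metrics import (accuracy_score, f1_score, precision_score,
                             recall_score, roc_auc_score)

def compute_metrics(eval_pred):
    # eval_pred is the (logits, labels) pair the Trainer passes in.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # Numerically stable softmax; positive-class probability for binary ROC AUC.
    shifted = logits - logits.max(axis=-1, keepdims=True)
    probs = np.exp(shifted) / np.exp(shifted).sum(axis=-1, keepdims=True)
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1_score(labels, preds, average="weighted"),       # averaging is an assumption
        "precision": precision_score(labels, preds, average="weighted"),
        "recall": recall_score(labels, preds, average="weighted"),
        "roc_auc": roc_auc_score(labels, probs[:, 1]),
    }
```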

### Framework versions

- Transformers 4.46.1
- PyTorch 2.0.1+cu118
- Datasets 3.0.2
- Tokenizers 0.20.1