hiera-finetuned-stroke-multi-ct-part2-multiw-ct

This model is a fine-tuned version of BTX24/hiera-finetuned-stroke-multi-to-ct on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0060
  • Accuracy: 0.9987
  • F1: 0.9987
  • Precision: 0.9987
  • Recall: 0.9987
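The F1/precision/recall above appear to be support-weighted averages (with weighted averaging, recall always equals accuracy, which matches the identical values). A minimal pure-Python sketch of how such weighted scores are computed — the stroke-class label names below are hypothetical, not the model's actual labels:

```python
from collections import Counter

def weighted_prf(y_true, y_pred):
    """Support-weighted precision, recall, and F1
    (same averaging as sklearn's average='weighted')."""
    support = Counter(y_true)
    n = len(y_true)
    prec = rec = f1 = 0.0
    for c in sorted(support):
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        pred_c = sum(p == c for p in y_pred)
        true_c = support[c]
        p_c = tp / pred_c if pred_c else 0.0
        r_c = tp / true_c if true_c else 0.0
        f_c = 2 * p_c * r_c / (p_c + r_c) if (p_c + r_c) else 0.0
        w = true_c / n  # weight each class by its support
        prec += w * p_c
        rec += w * r_c
        f1 += w * f_c
    return prec, rec, f1

# Hypothetical labels for illustration only.
y_true = ["ischemic", "ischemic", "hemorrhagic", "normal"]
y_pred = ["ischemic", "hemorrhagic", "hemorrhagic", "normal"]
p, r, f = weighted_prf(y_true, y_pred)
```

Note that `r` here equals plain accuracy (3/4 correct), illustrating why the Recall and Accuracy rows of this card coincide.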

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 64
  • optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: cosine_with_restarts
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 24
  • mixed_precision_training: Native AMP
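The hyperparameters above map onto the 🤗 Transformers `TrainingArguments` API roughly as follows. This is a reconstruction, not the author's actual training script: `output_dir` is a placeholder, and the `Trainer`/model/dataset wiring is omitted.

```python
from transformers import TrainingArguments

# Sketch of the configuration listed above (names follow the Trainer API).
args = TrainingArguments(
    output_dir="hiera-finetuned-stroke-multi-ct-part2-multiw-ct",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=4,   # 16 x 4 = 64 effective train batch size
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine_with_restarts",
    warmup_ratio=0.1,
    num_train_epochs=24,
    fp16=True,                       # Native AMP mixed precision
)
```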

Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy | F1     | Precision | Recall |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|
| 1.0912        | 1.1765  | 100  | 0.8830          | 0.5194   | 0.4534 | 0.5942    | 0.5194 |
| 0.7259        | 2.3529  | 200  | 0.6586          | 0.6942   | 0.6873 | 0.6972    | 0.6942 |
| 0.5083        | 3.5294  | 300  | 0.4742          | 0.7948   | 0.7908 | 0.8169    | 0.7948 |
| 0.3477        | 4.7059  | 400  | 0.2835          | 0.8903   | 0.8902 | 0.8918    | 0.8903 |
| 0.2662        | 5.8824  | 500  | 0.1988          | 0.9252   | 0.9251 | 0.9275    | 0.9252 |
| 0.1948        | 7.0588  | 600  | 0.1340          | 0.9510   | 0.9510 | 0.9510    | 0.9510 |
| 0.1468        | 8.2353  | 700  | 0.0793          | 0.9761   | 0.9761 | 0.9761    | 0.9761 |
| 0.1276        | 9.4118  | 800  | 0.0844          | 0.9735   | 0.9736 | 0.9741    | 0.9735 |
| 0.1267        | 10.5882 | 900  | 0.0894          | 0.9677   | 0.9678 | 0.9689    | 0.9677 |
| 0.0888        | 11.7647 | 1000 | 0.0393          | 0.9890   | 0.9890 | 0.9890    | 0.9890 |
| 0.0775        | 12.9412 | 1100 | 0.0419          | 0.9871   | 0.9871 | 0.9872    | 0.9871 |
| 0.0707        | 14.1176 | 1200 | 0.0347          | 0.9903   | 0.9903 | 0.9904    | 0.9903 |
| 0.0584        | 15.2941 | 1300 | 0.0131          | 0.9974   | 0.9974 | 0.9974    | 0.9974 |
| 0.0292        | 16.4706 | 1400 | 0.0175          | 0.9942   | 0.9942 | 0.9942    | 0.9942 |
| 0.0547        | 17.6471 | 1500 | 0.0119          | 0.9961   | 0.9961 | 0.9961    | 0.9961 |
| 0.0365        | 18.8235 | 1600 | 0.0076          | 0.9981   | 0.9981 | 0.9981    | 0.9981 |
| 0.0298        | 20.0    | 1700 | 0.0060          | 0.9987   | 0.9987 | 0.9987    | 0.9987 |
| 0.024         | 21.1765 | 1800 | 0.0059          | 0.9981   | 0.9981 | 0.9981    | 0.9981 |
| 0.0284        | 22.3529 | 1900 | 0.0059          | 0.9981   | 0.9981 | 0.9981    | 0.9981 |
| 0.0234        | 23.5294 | 2000 | 0.0056          | 0.9981   | 0.9981 | 0.9981    | 0.9981 |
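The learning-rate trajectory behind these results — linear warmup over the first 10% of steps, then cosine decay with hard restarts — can be sketched in pure Python. The step counts below are illustrative (roughly matching the ~2000 optimizer steps in the log), and the formula mirrors the shape of the Transformers `cosine_with_restarts` schedule rather than reproducing its exact implementation:

```python
import math

def lr_multiplier(step, total_steps=2000, warmup_ratio=0.1, num_cycles=1):
    """Fraction of the base learning rate at a given optimizer step:
    linear warmup, then cosine decay with num_cycles hard restarts."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return step / max(1, warmup_steps)          # linear ramp 0 -> 1
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    if progress >= 1.0:
        return 0.0                                  # schedule exhausted
    # (num_cycles * progress) % 1 restarts the cosine at each cycle boundary
    return max(0.0, 0.5 * (1.0 + math.cos(math.pi * ((num_cycles * progress) % 1.0))))
```

With `warmup_ratio=0.1` and 2000 total steps, the multiplier ramps to 1.0 by step 200, reaches 0.5 at the cycle midpoint, and decays to 0 at the end.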

Framework versions

  • Transformers 4.52.4
  • Pytorch 2.7.1+cu126
  • Datasets 3.6.0
  • Tokenizers 0.21.1


Model size: 50.8M params (Safetensors, F32)