# hiera-base-224-in1k-hf-finetuned-stroke-binary
This model is a fine-tuned version of [facebook/hiera-base-224-in1k-hf](https://huggingface.co/facebook/hiera-base-224-in1k-hf) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.1032
- Accuracy: 0.9828
- F1: 0.9828
- Precision: 0.9829
- Recall: 0.9828
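The card does not yet include a usage snippet; the sketch below shows one plausible way to run inference with the `transformers` auto classes. The repo id and the input file name are placeholders, not confirmed by the card, and the binary-classification task is assumed from the model name.

```python
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Hypothetical repo id -- replace with the actual Hub path of this checkpoint.
model_id = "hiera-base-224-in1k-hf-finetuned-stroke-binary"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

# Placeholder input image; preprocessing resizes it to the 224x224 input the model expects.
image = Image.open("example.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted = logits.argmax(-1).item()
print(model.config.id2label[predicted])
```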
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 48
- mixed_precision_training: Native AMP
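For reference, these values map onto the `Trainer` API roughly as follows. This is a minimal sketch: `output_dir` is a placeholder, and the dataset and metric wiring are omitted because the card does not specify them.

```python
from transformers import TrainingArguments

# Sketch of the reported configuration; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="hiera-base-224-in1k-hf-finetuned-stroke-binary",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=4,  # effective train batch size: 8 * 4 = 32
    num_train_epochs=48,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    seed=42,
    optim="adamw_torch",  # AdamW with betas=(0.9, 0.999), epsilon=1e-08 (the defaults)
    fp16=True,            # "Native AMP" mixed precision
)
```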
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|---|---|---|---|---|---|---|---|
| 0.6239 | 0.6202 | 100 | 0.5808 | 0.6920 | 0.6271 | 0.7644 | 0.6920 |
| 0.4449 | 1.2357 | 200 | 0.4067 | 0.8304 | 0.8208 | 0.8482 | 0.8304 |
| 0.3462 | 1.8558 | 300 | 0.3286 | 0.8675 | 0.8620 | 0.8807 | 0.8675 |
| 0.271 | 2.4713 | 400 | 0.2234 | 0.9204 | 0.9201 | 0.9202 | 0.9204 |
| 0.2472 | 3.0868 | 500 | 0.1890 | 0.9272 | 0.9264 | 0.9285 | 0.9272 |
| 0.2765 | 3.7070 | 600 | 0.1951 | 0.9313 | 0.9313 | 0.9314 | 0.9313 |
| 0.2277 | 4.3225 | 700 | 0.1878 | 0.9299 | 0.9297 | 0.9297 | 0.9299 |
| 0.2463 | 4.9426 | 800 | 0.1769 | 0.9313 | 0.9307 | 0.9318 | 0.9313 |
| 0.2181 | 5.5581 | 900 | 0.1948 | 0.9258 | 0.9244 | 0.9308 | 0.9258 |
| 0.223 | 6.1736 | 1000 | 0.1736 | 0.9362 | 0.9358 | 0.9366 | 0.9362 |
| 0.231 | 6.7938 | 1100 | 0.3210 | 0.8765 | 0.8705 | 0.8963 | 0.8765 |
| 0.2022 | 7.4093 | 1200 | 0.1485 | 0.9475 | 0.9471 | 0.9483 | 0.9475 |
| 0.2114 | 8.0248 | 1300 | 0.2115 | 0.9118 | 0.9092 | 0.9215 | 0.9118 |
| 0.2093 | 8.6450 | 1400 | 0.1446 | 0.9480 | 0.9477 | 0.9482 | 0.9480 |
| 0.1736 | 9.2605 | 1500 | 0.1410 | 0.9484 | 0.9481 | 0.9490 | 0.9484 |
| 0.1587 | 9.8806 | 1600 | 0.1912 | 0.9317 | 0.9321 | 0.9340 | 0.9317 |
| 0.1561 | 10.4961 | 1700 | 0.1361 | 0.9512 | 0.9507 | 0.9524 | 0.9512 |
| 0.1536 | 11.1116 | 1800 | 0.1555 | 0.9389 | 0.9388 | 0.9388 | 0.9389 |
| 0.155 | 11.7318 | 1900 | 0.1216 | 0.9602 | 0.9600 | 0.9604 | 0.9602 |
| 0.1533 | 12.3473 | 2000 | 0.1359 | 0.9502 | 0.9499 | 0.9509 | 0.9502 |
| 0.1382 | 12.9674 | 2100 | 0.1394 | 0.9539 | 0.9535 | 0.9549 | 0.9539 |
| 0.1581 | 13.5829 | 2200 | 0.1107 | 0.9602 | 0.9599 | 0.9611 | 0.9602 |
| 0.1256 | 14.1984 | 2300 | 0.1255 | 0.9588 | 0.9586 | 0.9592 | 0.9588 |
| 0.1284 | 14.8186 | 2400 | 0.1777 | 0.9448 | 0.9440 | 0.9485 | 0.9448 |
| 0.1763 | 15.4341 | 2500 | 0.1201 | 0.9543 | 0.9540 | 0.9552 | 0.9543 |
| 0.1532 | 16.0496 | 2600 | 0.1272 | 0.9539 | 0.9535 | 0.9547 | 0.9539 |
| 0.1286 | 16.6698 | 2700 | 0.1221 | 0.9638 | 0.9638 | 0.9638 | 0.9638 |
| 0.1291 | 17.2853 | 2800 | 0.1137 | 0.9548 | 0.9544 | 0.9560 | 0.9548 |
| 0.1127 | 17.9054 | 2900 | 0.1271 | 0.9607 | 0.9608 | 0.9612 | 0.9607 |
| 0.1346 | 18.5209 | 3000 | 0.1024 | 0.9647 | 0.9645 | 0.9654 | 0.9647 |
| 0.1032 | 19.1364 | 3100 | 0.1438 | 0.9593 | 0.9590 | 0.9602 | 0.9593 |
| 0.1125 | 19.7566 | 3200 | 0.1158 | 0.9575 | 0.9574 | 0.9574 | 0.9575 |
| 0.1147 | 20.3721 | 3300 | 0.1350 | 0.9552 | 0.9548 | 0.9571 | 0.9552 |
| 0.1013 | 20.9922 | 3400 | 0.1616 | 0.9534 | 0.9529 | 0.9555 | 0.9534 |
| 0.1071 | 21.6078 | 3500 | 0.1299 | 0.9625 | 0.9622 | 0.9632 | 0.9625 |
| 0.0795 | 22.2233 | 3600 | 0.0954 | 0.9729 | 0.9729 | 0.9731 | 0.9729 |
| 0.1168 | 22.8434 | 3700 | 0.1308 | 0.9661 | 0.9660 | 0.9662 | 0.9661 |
| 0.1007 | 23.4589 | 3800 | 0.1528 | 0.9566 | 0.9561 | 0.9587 | 0.9566 |
| 0.08 | 24.0744 | 3900 | 0.1415 | 0.9607 | 0.9604 | 0.9616 | 0.9607 |
| 0.086 | 24.6946 | 4000 | 0.0843 | 0.9729 | 0.9728 | 0.9729 | 0.9729 |
| 0.0945 | 25.3101 | 4100 | 0.1114 | 0.9711 | 0.9710 | 0.9711 | 0.9711 |
| 0.0904 | 25.9302 | 4200 | 0.1066 | 0.9747 | 0.9746 | 0.9748 | 0.9747 |
| 0.0836 | 26.5457 | 4300 | 0.1061 | 0.9715 | 0.9714 | 0.9717 | 0.9715 |
| 0.0851 | 27.1612 | 4400 | 0.1305 | 0.9683 | 0.9681 | 0.9691 | 0.9683 |
| 0.0561 | 27.7814 | 4500 | 0.1142 | 0.9720 | 0.9718 | 0.9725 | 0.9720 |
| 0.0681 | 28.3969 | 4600 | 0.1083 | 0.9733 | 0.9732 | 0.9734 | 0.9733 |
| 0.0672 | 29.0124 | 4700 | 0.1157 | 0.9724 | 0.9724 | 0.9724 | 0.9724 |
| 0.0728 | 29.6326 | 4800 | 0.1083 | 0.9751 | 0.9750 | 0.9754 | 0.9751 |
| 0.0915 | 30.2481 | 4900 | 0.1164 | 0.9729 | 0.9728 | 0.9730 | 0.9729 |
| 0.0637 | 30.8682 | 5000 | 0.1171 | 0.9738 | 0.9738 | 0.9739 | 0.9738 |
| 0.0764 | 31.4837 | 5100 | 0.1100 | 0.9701 | 0.9701 | 0.9703 | 0.9701 |
| 0.063 | 32.0992 | 5200 | 0.0833 | 0.9778 | 0.9778 | 0.9778 | 0.9778 |
| 0.0494 | 32.7194 | 5300 | 0.0947 | 0.9778 | 0.9778 | 0.9779 | 0.9778 |
| 0.0481 | 33.3349 | 5400 | 0.0962 | 0.9792 | 0.9792 | 0.9792 | 0.9792 |
| 0.0648 | 33.9550 | 5500 | 0.0997 | 0.9783 | 0.9782 | 0.9784 | 0.9783 |
| 0.0516 | 34.5705 | 5600 | 0.1097 | 0.9796 | 0.9796 | 0.9798 | 0.9796 |
| 0.0533 | 35.1860 | 5700 | 0.1054 | 0.9769 | 0.9769 | 0.9769 | 0.9769 |
| 0.0413 | 35.8062 | 5800 | 0.1080 | 0.9769 | 0.9769 | 0.9771 | 0.9769 |
| 0.0454 | 36.4217 | 5900 | 0.1113 | 0.9774 | 0.9773 | 0.9777 | 0.9774 |
| 0.0436 | 37.0372 | 6000 | 0.1058 | 0.9787 | 0.9787 | 0.9788 | 0.9787 |
| 0.0425 | 37.6574 | 6100 | 0.0945 | 0.9810 | 0.9810 | 0.9810 | 0.9810 |
| 0.0529 | 38.2729 | 6200 | 0.0931 | 0.9815 | 0.9814 | 0.9815 | 0.9815 |
| 0.0613 | 38.8930 | 6300 | 0.1019 | 0.9801 | 0.9801 | 0.9801 | 0.9801 |
| 0.0605 | 39.5085 | 6400 | 0.0960 | 0.9810 | 0.9810 | 0.9811 | 0.9810 |
| 0.0402 | 40.1240 | 6500 | 0.0935 | 0.9824 | 0.9823 | 0.9824 | 0.9824 |
| 0.0467 | 40.7442 | 6600 | 0.0992 | 0.9810 | 0.9810 | 0.9811 | 0.9810 |
| 0.0471 | 41.3597 | 6700 | 0.0957 | 0.9801 | 0.9801 | 0.9801 | 0.9801 |
| 0.0524 | 41.9798 | 6800 | 0.0981 | 0.9810 | 0.9810 | 0.9810 | 0.9810 |
| 0.0308 | 42.5953 | 6900 | 0.1033 | 0.9824 | 0.9823 | 0.9825 | 0.9824 |
| 0.0472 | 43.2109 | 7000 | 0.1014 | 0.9815 | 0.9814 | 0.9815 | 0.9815 |
| 0.0453 | 43.8310 | 7100 | 0.0970 | 0.9801 | 0.9801 | 0.9801 | 0.9801 |
| 0.0303 | 44.4465 | 7200 | 0.1041 | 0.9828 | 0.9828 | 0.9830 | 0.9828 |
| 0.0311 | 45.0620 | 7300 | 0.1032 | 0.9828 | 0.9828 | 0.9829 | 0.9828 |
| 0.0443 | 45.6822 | 7400 | 0.1004 | 0.9824 | 0.9823 | 0.9825 | 0.9824 |
| 0.045 | 46.2977 | 7500 | 0.1004 | 0.9824 | 0.9823 | 0.9825 | 0.9824 |
| 0.036 | 46.9178 | 7600 | 0.1007 | 0.9824 | 0.9823 | 0.9825 | 0.9824 |
| 0.0379 | 47.5333 | 7700 | 0.1007 | 0.9824 | 0.9823 | 0.9825 | 0.9824 |
### Framework versions
- Transformers 4.48.3
- Pytorch 2.6.0+cu124
- Datasets 3.4.0
- Tokenizers 0.21.0