# beit-finetuned-stroke-diff-mri

This model is a fine-tuned version of [microsoft/beit-base-patch16-224-pt22k-ft22k](https://huggingface.co/microsoft/beit-base-patch16-224-pt22k-ft22k) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.1236
- Accuracy: 0.9603
- F1: 0.9601
- Precision: 0.9602
- Recall: 0.9603
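The card does not state the averaging mode for F1, precision, and recall, but the fact that Recall equals Accuracy on every row of the results below suggests support-weighted averaging, where weighted recall reduces exactly to accuracy. A minimal pure-Python sketch (toy labels, not the actual evaluation data):

```python
from collections import Counter

def weighted_recall(y_true, y_pred):
    # Per-class recall weighted by class support: algebraically this
    # reduces to overall accuracy, which is why the Recall and Accuracy
    # columns in this card are identical.
    support = Counter(y_true)
    total = len(y_true)
    recall = 0.0
    for cls, n in support.items():
        hits = sum(1 for t, p in zip(y_true, y_pred) if t == cls and p == cls)
        recall += (n / total) * (hits / n)
    return recall

# Toy 3-class example for illustration only
y_true = [0, 0, 1, 1, 2, 2]
y_pred = [0, 1, 1, 1, 2, 0]
accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
print(weighted_recall(y_true, y_pred), accuracy)
```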
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: cosine_with_restarts
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 48
- mixed_precision_training: Native AMP
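The total train batch size of 64 is derived, not set directly: it is the per-device batch size multiplied by the gradient accumulation steps. A small sanity check of the numbers listed above (plain dict, not the actual training config):

```python
# Hyperparameters copied from the list above, as a plain dict.
hparams = {
    "learning_rate": 2e-5,
    "train_batch_size": 16,
    "eval_batch_size": 8,
    "seed": 42,
    "gradient_accumulation_steps": 4,
    "lr_scheduler_warmup_ratio": 0.1,
    "num_epochs": 48,
}

# Effective (total) train batch size = per-device batch * accumulation steps.
effective_batch = hparams["train_batch_size"] * hparams["gradient_accumulation_steps"]
print(effective_batch)  # 64, matching total_train_batch_size above
```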
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|---|---|---|---|---|---|---|---|
| 0.8636 | 2.0415 | 100 | 0.8385 | 0.5936 | 0.5522 | 0.6381 | 0.5936 |
| 0.7416 | 4.0829 | 200 | 0.6827 | 0.7151 | 0.6866 | 0.7520 | 0.7151 |
| 0.6439 | 6.1244 | 300 | 0.5197 | 0.7877 | 0.7731 | 0.7974 | 0.7877 |
| 0.5512 | 8.1658 | 400 | 0.4541 | 0.8252 | 0.8197 | 0.8244 | 0.8252 |
| 0.4854 | 10.2073 | 500 | 0.3876 | 0.8536 | 0.8502 | 0.8672 | 0.8536 |
| 0.4142 | 12.2487 | 600 | 0.3112 | 0.8910 | 0.8904 | 0.8901 | 0.8910 |
| 0.3505 | 14.2902 | 700 | 0.2970 | 0.8990 | 0.8999 | 0.9059 | 0.8990 |
| 0.2846 | 16.3316 | 800 | 0.2522 | 0.9069 | 0.9058 | 0.9107 | 0.9069 |
| 0.2574 | 18.3731 | 900 | 0.2203 | 0.9262 | 0.9261 | 0.9262 | 0.9262 |
| 0.2361 | 20.4145 | 1000 | 0.1889 | 0.9364 | 0.9360 | 0.9369 | 0.9364 |
| 0.1952 | 22.4560 | 1100 | 0.1715 | 0.9421 | 0.9420 | 0.9422 | 0.9421 |
| 0.1869 | 24.4974 | 1200 | 0.1511 | 0.9444 | 0.9444 | 0.9448 | 0.9444 |
| 0.1594 | 26.5389 | 1300 | 0.1478 | 0.9523 | 0.9523 | 0.9526 | 0.9523 |
| 0.1368 | 28.5803 | 1400 | 0.1554 | 0.9478 | 0.9479 | 0.9482 | 0.9478 |
| 0.1257 | 30.6218 | 1500 | 0.1458 | 0.9535 | 0.9532 | 0.9534 | 0.9535 |
| 0.1091 | 32.6632 | 1600 | 0.1519 | 0.9546 | 0.9546 | 0.9555 | 0.9546 |
| 0.1034 | 34.7047 | 1700 | 0.1389 | 0.9546 | 0.9544 | 0.9552 | 0.9546 |
| 0.1026 | 36.7461 | 1800 | 0.1373 | 0.9535 | 0.9532 | 0.9537 | 0.9535 |
| 0.0977 | 38.7876 | 1900 | 0.1311 | 0.9580 | 0.9579 | 0.9579 | 0.9580 |
| 0.0876 | 40.8290 | 2000 | 0.1261 | 0.9591 | 0.9590 | 0.9591 | 0.9591 |
| 0.0866 | 42.8705 | 2100 | 0.1236 | 0.9603 | 0.9601 | 0.9602 | 0.9603 |
| 0.0735 | 44.9119 | 2200 | 0.1273 | 0.9557 | 0.9556 | 0.9557 | 0.9557 |
| 0.0738 | 46.9534 | 2300 | 0.1266 | 0.9580 | 0.9578 | 0.9581 | 0.9580 |
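The headline results (Loss 0.1236, Accuracy 0.9603) match the row at step 2100 rather than the final row, which is consistent with best-checkpoint selection keyed on validation loss (e.g. `load_best_model_at_end` in the Trainer; an assumption, not stated in this card). A minimal sketch of that selection over the last few logged rows:

```python
# (step, validation loss) pairs taken from the tail of the results table.
log = [
    (1900, 0.1311),
    (2000, 0.1261),
    (2100, 0.1236),
    (2200, 0.1273),
    (2300, 0.1266),
]

# Pick the checkpoint with the lowest validation loss.
best_step, best_loss = min(log, key=lambda row: row[1])
print(best_step, best_loss)  # 2100 0.1236
```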
### Framework versions
- Transformers 4.52.4
- Pytorch 2.7.1+cu126
- Datasets 3.6.0
- Tokenizers 0.21.1