# swinv2-tiny-patch4-window8-256-dmae-humeda-DAV3
This model is a fine-tuned version of [microsoft/swinv2-tiny-patch4-window8-256](https://huggingface.co/microsoft/swinv2-tiny-patch4-window8-256) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.2067
- Accuracy: 0.9589
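
The card does not include a usage snippet, so here is a minimal inference sketch. It assumes the checkpoint is published on the Hub under `RobertoSonic/swinv2-tiny-patch4-window8-256-dmae-humeda-DAV3` and that the standard `transformers` image-classification API applies; `example.jpg` is a placeholder path.

```python
# Minimal inference sketch (assumptions: the checkpoint is public on the Hub
# and follows the standard image-classification API; example.jpg is a placeholder).
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "RobertoSonic/swinv2-tiny-patch4-window8-256-dmae-humeda-DAV3"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

# The processor resizes and normalizes the image to the 256x256 input the model expects.
image = Image.open("example.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted = logits.argmax(-1).item()
print(model.config.id2label[predicted])
```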
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 30
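
For reproducibility, the hyperparameters above map onto `transformers.TrainingArguments` roughly as follows. This is a sketch, not the original training script: `output_dir` is a hypothetical name, and the surrounding `Trainer` setup (model, datasets, `compute_metrics`) is assumed and not shown.

```python
from transformers import TrainingArguments

# Sketch of the hyperparameters listed above; output_dir is hypothetical and
# the model/dataset/Trainer wiring is omitted.
training_args = TrainingArguments(
    output_dir="swinv2-tiny-patch4-window8-256-dmae-humeda-DAV3",
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 16 * 2 = 32
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    num_train_epochs=30,
)
```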
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|---|---|---|---|---|
| 3.9845 | 1.0 | 21 | 1.6832 | 0.3425 |
| 2.4369 | 2.0 | 42 | 1.1981 | 0.4384 |
| 1.7752 | 3.0 | 63 | 0.8412 | 0.6301 |
| 1.3772 | 4.0 | 84 | 0.7895 | 0.7123 |
| 1.1556 | 5.0 | 105 | 0.7385 | 0.7808 |
| 1.0059 | 6.0 | 126 | 0.6626 | 0.8082 |
| 0.8598 | 7.0 | 147 | 0.5403 | 0.7808 |
| 0.8724 | 8.0 | 168 | 0.5520 | 0.8219 |
| 0.7096 | 9.0 | 189 | 0.5182 | 0.8356 |
| 0.5038 | 10.0 | 210 | 0.4133 | 0.8493 |
| 0.4951 | 11.0 | 231 | 0.3548 | 0.8767 |
| 0.4692 | 12.0 | 252 | 0.3845 | 0.8493 |
| 0.5339 | 13.0 | 273 | 0.3178 | 0.8904 |
| 0.4536 | 14.0 | 294 | 0.3252 | 0.8904 |
| 0.4369 | 15.0 | 315 | 0.2785 | 0.8904 |
| 0.3941 | 16.0 | 336 | 0.2900 | 0.9041 |
| 0.4363 | 17.0 | 357 | 0.3426 | 0.8630 |
| 0.2819 | 18.0 | 378 | 0.2839 | 0.9041 |
| 0.361 | 19.0 | 399 | 0.2223 | 0.9041 |
| 0.1857 | 20.0 | 420 | 0.2522 | 0.9178 |
| 0.3161 | 21.0 | 441 | 0.2164 | 0.9178 |
| 0.3273 | 22.0 | 462 | 0.2224 | 0.9315 |
| 0.3458 | 23.0 | 483 | 0.2199 | 0.9452 |
| 0.337 | 24.0 | 504 | 0.2377 | 0.9315 |
| 0.1801 | 25.0 | 525 | 0.2067 | 0.9589 |
| 0.3283 | 26.0 | 546 | 0.2401 | 0.9315 |
| 0.2211 | 27.0 | 567 | 0.2167 | 0.9315 |
| 0.1783 | 28.0 | 588 | 0.2180 | 0.9315 |
| 0.2783 | 28.5854 | 600 | 0.2223 | 0.9315 |
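
The evaluation results reported at the top of this card (loss 0.2067, accuracy 0.9589) correspond to the epoch-25 checkpoint, the best validation accuracy reached during training.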
### Framework versions
- Transformers 4.47.1
- PyTorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0