# finetuned-dermnet
This model is a fine-tuned version of google/vit-base-patch16-224-in21k on the dermnet-images dataset. It achieves the following results on the evaluation set:
- Loss: 1.4149
- Accuracy: 0.7052
## Model description
More information needed
## Intended uses & limitations
More information needed
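No usage guidance has been provided yet. As an illustrative sketch only (the image path and post-processing below are assumptions, not part of this card), the model can be loaded with the standard `transformers` image-classification API:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "WahajRaza/finetuned-dermnet"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

image = Image.open("skin_lesion.jpg")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
pred = logits.argmax(-1).item()
print(model.config.id2label[pred])
```

Given the evaluation accuracy of roughly 0.71, predictions should be treated as a research demo, not a clinical diagnosis.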
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 32
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 20
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|---|---|---|---|---|
| 2.7259 | 0.1209 | 100 | 2.6351 | 0.2344 |
| 2.4984 | 0.2418 | 200 | 2.4934 | 0.2759 |
| 2.4756 | 0.3628 | 300 | 2.4343 | 0.2956 |
| 2.3697 | 0.4837 | 400 | 2.2332 | 0.3569 |
| 2.3429 | 0.6046 | 500 | 2.2275 | 0.3449 |
| 2.2098 | 0.7255 | 600 | 2.1461 | 0.3740 |
| 2.097 | 0.8464 | 700 | 2.0719 | 0.4027 |
| 2.1721 | 0.9674 | 800 | 2.0249 | 0.4280 |
| 2.0066 | 1.0883 | 900 | 1.9906 | 0.4212 |
| 1.9893 | 1.2092 | 1000 | 1.9690 | 0.4250 |
| 1.9437 | 1.3301 | 1100 | 1.9122 | 0.4387 |
| 1.8935 | 1.4510 | 1200 | 1.8618 | 0.4482 |
| 1.947 | 1.5719 | 1300 | 1.8229 | 0.4520 |
| 1.983 | 1.6929 | 1400 | 1.8269 | 0.4554 |
| 1.8011 | 1.8138 | 1500 | 1.7483 | 0.4841 |
| 1.8234 | 1.9347 | 1600 | 1.8248 | 0.4529 |
| 1.6741 | 2.0556 | 1700 | 1.7227 | 0.4867 |
| 1.7061 | 2.1765 | 1800 | 1.7463 | 0.4837 |
| 1.5089 | 2.2975 | 1900 | 1.7026 | 0.4931 |
| 1.6389 | 2.4184 | 2000 | 1.6726 | 0.5073 |
| 1.5872 | 2.5393 | 2100 | 1.7186 | 0.4923 |
| 1.5651 | 2.6602 | 2200 | 1.6471 | 0.5171 |
| 1.54 | 2.7811 | 2300 | 1.6291 | 0.5189 |
| 1.6464 | 2.9021 | 2400 | 1.5212 | 0.5381 |
| 1.4522 | 3.0230 | 2500 | 1.5721 | 0.5300 |
| 1.3173 | 3.1439 | 2600 | 1.5308 | 0.5467 |
| 1.3144 | 3.2648 | 2700 | 1.5011 | 0.5608 |
| 1.38 | 3.3857 | 2800 | 1.5539 | 0.5394 |
| 1.2156 | 3.5067 | 2900 | 1.4841 | 0.5510 |
| 1.3692 | 3.6276 | 3000 | 1.4552 | 0.5656 |
| 1.4016 | 3.7485 | 3100 | 1.5167 | 0.5497 |
| 1.2821 | 3.8694 | 3200 | 1.4833 | 0.5690 |
| 1.1618 | 3.9903 | 3300 | 1.4224 | 0.5677 |
| 1.4415 | 4.1112 | 3400 | 1.4014 | 0.5870 |
| 1.1131 | 4.2322 | 3500 | 1.4399 | 0.5831 |
| 1.0672 | 4.3531 | 3600 | 1.4385 | 0.5668 |
| 1.2124 | 4.4740 | 3700 | 1.4179 | 0.5698 |
| 1.1765 | 4.5949 | 3800 | 1.3597 | 0.5943 |
| 1.0993 | 4.7158 | 3900 | 1.3415 | 0.6003 |
| 1.1414 | 4.8368 | 4000 | 1.3966 | 0.5968 |
| 1.1284 | 4.9577 | 4100 | 1.3665 | 0.5994 |
| 0.9258 | 5.0786 | 4200 | 1.3508 | 0.6165 |
| 0.684 | 5.1995 | 4300 | 1.3676 | 0.6058 |
| 1.0152 | 5.3204 | 4400 | 1.3588 | 0.6067 |
| 0.7438 | 5.4414 | 4500 | 1.3133 | 0.6170 |
| 0.8849 | 5.5623 | 4600 | 1.2907 | 0.6264 |
| 0.7456 | 5.6832 | 4700 | 1.4062 | 0.6058 |
| 1.1013 | 5.8041 | 4800 | 1.3282 | 0.6110 |
| 0.982 | 5.9250 | 4900 | 1.2998 | 0.6243 |
| 0.7012 | 6.0459 | 5000 | 1.3006 | 0.6230 |
| 0.6848 | 6.1669 | 5100 | 1.3672 | 0.6191 |
| 0.7298 | 6.2878 | 5200 | 1.3138 | 0.6290 |
| 0.542 | 6.4087 | 5300 | 1.3664 | 0.6217 |
| 0.7623 | 6.5296 | 5400 | 1.3301 | 0.6354 |
| 1.0472 | 6.6505 | 5500 | 1.2836 | 0.6298 |
| 0.964 | 6.7715 | 5600 | 1.3024 | 0.6345 |
| 0.7213 | 6.8924 | 5700 | 1.3025 | 0.6401 |
| 0.6109 | 7.0133 | 5800 | 1.3086 | 0.6384 |
| 0.5563 | 7.1342 | 5900 | 1.3405 | 0.6307 |
| 0.4472 | 7.2551 | 6000 | 1.2843 | 0.6470 |
| 0.5637 | 7.3761 | 6100 | 1.3159 | 0.6255 |
| 0.6429 | 7.4970 | 6200 | 1.3515 | 0.6298 |
| 0.4535 | 7.6179 | 6300 | 1.3600 | 0.6320 |
| 0.4351 | 7.7388 | 6400 | 1.3419 | 0.6431 |
| 0.6521 | 7.8597 | 6500 | 1.3131 | 0.6384 |
| 0.6632 | 7.9807 | 6600 | 1.3271 | 0.6320 |
| 0.6364 | 8.1016 | 6700 | 1.3336 | 0.6440 |
| 0.3828 | 8.2225 | 6800 | 1.4081 | 0.6337 |
| 0.5726 | 8.3434 | 6900 | 1.3465 | 0.6487 |
| 0.5724 | 8.4643 | 7000 | 1.3892 | 0.6397 |
| 0.6399 | 8.5852 | 7100 | 1.4268 | 0.6238 |
| 0.4594 | 8.7062 | 7200 | 1.3526 | 0.6495 |
| 0.4738 | 8.8271 | 7300 | 1.3674 | 0.6470 |
| 0.5154 | 8.9480 | 7400 | 1.3398 | 0.6414 |
| 0.3716 | 9.0689 | 7500 | 1.3825 | 0.6487 |
| 0.4229 | 9.1898 | 7600 | 1.3579 | 0.6525 |
| 0.396 | 9.3108 | 7700 | 1.4205 | 0.6474 |
| 0.4992 | 9.4317 | 7800 | 1.3717 | 0.6534 |
| 0.5165 | 9.5526 | 7900 | 1.3134 | 0.6594 |
| 0.3848 | 9.6735 | 8000 | 1.3695 | 0.6620 |
| 0.4414 | 9.7944 | 8100 | 1.3554 | 0.6624 |
| 0.5408 | 9.9154 | 8200 | 1.3660 | 0.6620 |
| 0.3946 | 10.0363 | 8300 | 1.3243 | 0.6658 |
| 0.3157 | 10.1572 | 8400 | 1.3912 | 0.6542 |
| 0.385 | 10.2781 | 8500 | 1.3961 | 0.6602 |
| 0.3742 | 10.3990 | 8600 | 1.3357 | 0.6611 |
| 0.3976 | 10.5200 | 8700 | 1.3715 | 0.6632 |
| 0.355 | 10.6409 | 8800 | 1.3365 | 0.6722 |
| 0.5399 | 10.7618 | 8900 | 1.3486 | 0.6787 |
| 0.4398 | 10.8827 | 9000 | 1.2953 | 0.6769 |
| 0.2445 | 11.0036 | 9100 | 1.2773 | 0.6847 |
| 0.3286 | 11.1245 | 9200 | 1.3179 | 0.6727 |
| 0.1964 | 11.2455 | 9300 | 1.3526 | 0.6817 |
| 0.3503 | 11.3664 | 9400 | 1.3517 | 0.6795 |
| 0.2261 | 11.4873 | 9500 | 1.3236 | 0.6787 |
| 0.4133 | 11.6082 | 9600 | 1.3401 | 0.6744 |
| 0.3857 | 11.7291 | 9700 | 1.3169 | 0.6834 |
| 0.3831 | 11.8501 | 9800 | 1.3116 | 0.6782 |
| 0.3891 | 11.9710 | 9900 | 1.3644 | 0.6740 |
| 0.4093 | 12.0919 | 10000 | 1.3590 | 0.6748 |
| 0.5045 | 12.2128 | 10100 | 1.3527 | 0.6791 |
| 0.2819 | 12.3337 | 10200 | 1.3897 | 0.6740 |
| 0.2815 | 12.4547 | 10300 | 1.3712 | 0.6847 |
| 0.4357 | 12.5756 | 10400 | 1.3475 | 0.6787 |
| 0.318 | 12.6965 | 10500 | 1.3712 | 0.6859 |
| 0.2357 | 12.8174 | 10600 | 1.3942 | 0.6782 |
| 0.3216 | 12.9383 | 10700 | 1.3630 | 0.6808 |
| 0.2873 | 13.0593 | 10800 | 1.4015 | 0.6727 |
| 0.2433 | 13.1802 | 10900 | 1.3585 | 0.6872 |
| 0.2962 | 13.3011 | 11000 | 1.4138 | 0.6795 |
| 0.2134 | 13.4220 | 11100 | 1.3382 | 0.6834 |
| 0.2922 | 13.5429 | 11200 | 1.3553 | 0.6898 |
| 0.2562 | 13.6638 | 11300 | 1.3986 | 0.6855 |
| 0.1831 | 13.7848 | 11400 | 1.4005 | 0.6855 |
| 0.2235 | 13.9057 | 11500 | 1.3770 | 0.6855 |
| 0.2411 | 14.0266 | 11600 | 1.4194 | 0.6637 |
| 0.1687 | 14.1475 | 11700 | 1.3968 | 0.6804 |
| 0.1913 | 14.2684 | 11800 | 1.4210 | 0.6787 |
| 0.2395 | 14.3894 | 11900 | 1.4085 | 0.6718 |
| 0.111 | 14.5103 | 12000 | 1.4555 | 0.6812 |
| 0.1616 | 14.6312 | 12100 | 1.3750 | 0.6859 |
| 0.2003 | 14.7521 | 12200 | 1.3594 | 0.6954 |
| 0.313 | 14.8730 | 12300 | 1.3914 | 0.6877 |
| 0.2766 | 14.9940 | 12400 | 1.3821 | 0.6855 |
| 0.2937 | 15.1149 | 12500 | 1.3909 | 0.6889 |
| 0.2221 | 15.2358 | 12600 | 1.4073 | 0.6907 |
| 0.1867 | 15.3567 | 12700 | 1.4243 | 0.6825 |
| 0.2371 | 15.4776 | 12800 | 1.4190 | 0.6872 |
| 0.215 | 15.5985 | 12900 | 1.4330 | 0.6851 |
| 0.2075 | 15.7195 | 13000 | 1.4656 | 0.6812 |
| 0.1663 | 15.8404 | 13100 | 1.4386 | 0.6791 |
| 0.2015 | 15.9613 | 13200 | 1.4236 | 0.6868 |
| 0.2444 | 16.0822 | 13300 | 1.4427 | 0.6872 |
| 0.2799 | 16.2031 | 13400 | 1.4151 | 0.6881 |
| 0.1378 | 16.3241 | 13500 | 1.4102 | 0.6949 |
| 0.2701 | 16.4450 | 13600 | 1.3858 | 0.7018 |
| 0.2951 | 16.5659 | 13700 | 1.4027 | 0.6954 |
| 0.1788 | 16.6868 | 13800 | 1.4067 | 0.6949 |
| 0.185 | 16.8077 | 13900 | 1.4164 | 0.6889 |
| 0.241 | 16.9287 | 14000 | 1.3851 | 0.7001 |
| 0.2172 | 17.0496 | 14100 | 1.4145 | 0.6924 |
| 0.1449 | 17.1705 | 14200 | 1.3958 | 0.6979 |
| 0.21 | 17.2914 | 14300 | 1.3992 | 0.6924 |
| 0.2003 | 17.4123 | 14400 | 1.3995 | 0.7027 |
| 0.1851 | 17.5333 | 14500 | 1.3837 | 0.7001 |
| 0.0763 | 17.6542 | 14600 | 1.3951 | 0.6949 |
| 0.2952 | 17.7751 | 14700 | 1.4049 | 0.6945 |
| 0.1609 | 17.8960 | 14800 | 1.4123 | 0.6932 |
| 0.1816 | 18.0169 | 14900 | 1.4050 | 0.6984 |
| 0.1211 | 18.1378 | 15000 | 1.4065 | 0.6962 |
| 0.1513 | 18.2588 | 15100 | 1.4139 | 0.6937 |
| 0.1249 | 18.3797 | 15200 | 1.4142 | 0.6988 |
| 0.1939 | 18.5006 | 15300 | 1.4139 | 0.7018 |
| 0.0724 | 18.6215 | 15400 | 1.4093 | 0.7018 |
| 0.2841 | 18.7424 | 15500 | 1.4191 | 0.6988 |
| 0.2753 | 18.8634 | 15600 | 1.4229 | 0.6954 |
| 0.0368 | 18.9843 | 15700 | 1.4186 | 0.6937 |
| 0.0901 | 19.1052 | 15800 | 1.4220 | 0.6979 |
| 0.153 | 19.2261 | 15900 | 1.4193 | 0.6954 |
| 0.1448 | 19.3470 | 16000 | 1.4176 | 0.6988 |
| 0.157 | 19.4680 | 16100 | 1.4154 | 0.7018 |
| 0.1827 | 19.5889 | 16200 | 1.4165 | 0.7014 |
| 0.0809 | 19.7098 | 16300 | 1.4149 | 0.7052 |
| 0.1651 | 19.8307 | 16400 | 1.4129 | 0.7044 |
| 0.1256 | 19.9516 | 16500 | 1.4133 | 0.7044 |
### Framework versions
- Transformers 4.51.3
- PyTorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.1