# dino-vitb16
This model is a fine-tuned version of microsoft/resnet-18 on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.0641
- Accuracy: 0.9856
- Precision: 0.9932
- Recall: 0.9756
- F1: 0.9843
- TP (true positives): 1598
- TN (true negatives): 1899
- FP (false positives): 11
- FN (false negatives): 40
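The headline metrics follow directly from the confusion-matrix counts above. A minimal sketch in plain Python (no dependencies) that reproduces them:

```python
# Confusion-matrix counts from the evaluation set above.
tp, tn, fp, fn = 1598, 1899, 11, 40

accuracy = (tp + tn) / (tp + tn + fp + fn)          # fraction of all predictions that are correct
precision = tp / (tp + fp)                          # of predicted positives, how many are real
recall = tp / (tp + fn)                             # of real positives, how many were found
f1 = 2 * precision * recall / (precision + recall)  # harmonic mean of precision and recall

print(round(accuracy, 4), round(precision, 4), round(recall, 4), round(f1, 4))
# 0.9856 0.9932 0.9756 0.9843 — matching the reported values
```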
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: adamw_torch_fused with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 55
- num_epochs: 5
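The `linear` schedule with warmup can be sketched as a pure function of the optimizer step: the learning rate ramps from 0 to `learning_rate` over the first 55 steps, then decays linearly to 0. This is a hedged reimplementation of the usual Transformers `linear` schedule semantics; the total of 550 steps is inferred from the training-results table (5 epochs × 110 steps/epoch), not a logged value:

```python
def linear_lr(step, base_lr=3e-4, warmup_steps=55, total_steps=550):
    """Learning rate at a given optimizer step: linear warmup to base_lr
    over warmup_steps, then linear decay to 0 at total_steps.

    total_steps=550 is an assumption inferred from the results table.
    """
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(linear_lr(0), linear_lr(55), linear_lr(550))
# 0.0 at step 0, the full 3e-4 at the end of warmup, 0.0 at the final step
```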
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 | TP | TN | FP | FN |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.4827 | 0.0991 | 11 | 0.6050 | 0.6254 | 0.5541 | 0.9664 | 0.7043 | 1583 | 636 | 1274 | 55 |
| 0.2848 | 0.1982 | 22 | 0.6828 | 0.6945 | 0.6111 | 0.9304 | 0.7377 | 1524 | 940 | 970 | 114 |
| 0.2555 | 0.2973 | 33 | 0.5213 | 0.7452 | 0.6491 | 0.9756 | 0.7795 | 1598 | 1046 | 864 | 40 |
| 0.2000 | 0.3964 | 44 | 0.1739 | 0.9436 | 0.9422 | 0.9353 | 0.9387 | 1532 | 1816 | 94 | 106 |
| 0.2370 | 0.4955 | 55 | 0.2988 | 0.8926 | 0.8338 | 0.9585 | 0.8918 | 1570 | 1597 | 313 | 68 |
| 0.2251 | 0.5946 | 66 | 0.2926 | 0.8929 | 0.8249 | 0.9750 | 0.8937 | 1597 | 1571 | 339 | 41 |
| 0.1936 | 0.6937 | 77 | 0.1269 | 0.9704 | 0.9936 | 0.9420 | 0.9671 | 1543 | 1900 | 10 | 95 |
| 0.1959 | 0.7928 | 88 | 0.1577 | 0.9549 | 0.9362 | 0.9683 | 0.9520 | 1586 | 1802 | 108 | 52 |
| 0.2104 | 0.8919 | 99 | 0.1320 | 0.9651 | 0.9621 | 0.9621 | 0.9621 | 1576 | 1848 | 62 | 62 |
| 0.1969 | 0.9910 | 110 | 0.1945 | 0.9453 | 0.9252 | 0.9591 | 0.9418 | 1571 | 1783 | 127 | 67 |
| 0.2055 | 1.0901 | 121 | 0.1051 | 0.9783 | 0.9962 | 0.9567 | 0.9760 | 1567 | 1904 | 6 | 71 |
| 0.1698 | 1.1892 | 132 | 0.1323 | 0.9670 | 0.9617 | 0.9670 | 0.9644 | 1584 | 1847 | 63 | 54 |
| 0.2012 | 1.2883 | 143 | 0.1589 | 0.9591 | 0.9467 | 0.9658 | 0.9562 | 1582 | 1821 | 89 | 56 |
| 0.1713 | 1.3874 | 154 | 0.1327 | 0.9583 | 0.9499 | 0.9603 | 0.9551 | 1573 | 1827 | 83 | 65 |
| 0.1866 | 1.4865 | 165 | 0.1833 | 0.9419 | 0.9877 | 0.8852 | 0.9337 | 1450 | 1892 | 18 | 188 |
| 0.1641 | 1.5856 | 176 | 0.1471 | 0.9597 | 0.9379 | 0.9774 | 0.9572 | 1601 | 1804 | 106 | 37 |
| 0.1749 | 1.6847 | 187 | 0.1130 | 0.9698 | 0.9637 | 0.9713 | 0.9675 | 1591 | 1850 | 60 | 47 |
| 0.1916 | 1.7838 | 198 | 0.1272 | 0.9746 | 0.9754 | 0.9695 | 0.9724 | 1588 | 1870 | 40 | 50 |
| 0.1888 | 1.8829 | 209 | 0.0958 | 0.9777 | 0.9779 | 0.9737 | 0.9758 | 1595 | 1874 | 36 | 43 |
| 0.1605 | 1.9820 | 220 | 0.0682 | 0.9825 | 0.9975 | 0.9646 | 0.9808 | 1580 | 1906 | 4 | 58 |
| 0.1522 | 2.0811 | 231 | 0.1065 | 0.9710 | 0.9735 | 0.9634 | 0.9684 | 1578 | 1867 | 43 | 60 |
| 0.1574 | 2.1802 | 242 | 0.0794 | 0.9789 | 0.9821 | 0.9719 | 0.9770 | 1592 | 1881 | 29 | 46 |
| 0.1552 | 2.2793 | 253 | 0.0707 | 0.9820 | 0.9931 | 0.9676 | 0.9802 | 1585 | 1899 | 11 | 53 |
| 0.1476 | 2.3784 | 264 | 0.0820 | 0.9837 | 0.9901 | 0.9744 | 0.9822 | 1596 | 1894 | 16 | 42 |
| 0.1670 | 2.4775 | 275 | 0.0803 | 0.9806 | 0.9834 | 0.9744 | 0.9788 | 1596 | 1883 | 27 | 42 |
| 0.1729 | 2.5766 | 286 | 0.0762 | 0.9825 | 0.9888 | 0.9731 | 0.9809 | 1594 | 1892 | 18 | 44 |
| 0.1524 | 2.6757 | 297 | 0.0713 | 0.9837 | 0.9907 | 0.9737 | 0.9821 | 1595 | 1895 | 15 | 43 |
| 0.1697 | 2.7748 | 308 | 0.0913 | 0.9820 | 0.9882 | 0.9725 | 0.9803 | 1593 | 1891 | 19 | 45 |
| 0.1429 | 2.8739 | 319 | 0.0824 | 0.9839 | 0.9931 | 0.9719 | 0.9824 | 1592 | 1899 | 11 | 46 |
| 0.1686 | 2.9730 | 330 | 0.0747 | 0.9831 | 0.9962 | 0.9670 | 0.9814 | 1584 | 1904 | 6 | 54 |
| 0.1357 | 3.0721 | 341 | 0.0739 | 0.9825 | 0.9888 | 0.9731 | 0.9809 | 1594 | 1892 | 18 | 44 |
| 0.1606 | 3.1712 | 352 | 0.0705 | 0.9834 | 0.9882 | 0.9756 | 0.9819 | 1598 | 1891 | 19 | 40 |
| 0.1697 | 3.2703 | 363 | 0.0640 | 0.9839 | 0.9919 | 0.9731 | 0.9824 | 1594 | 1897 | 13 | 44 |
| 0.1471 | 3.3694 | 374 | 0.0602 | 0.9873 | 0.9963 | 0.9762 | 0.9861 | 1599 | 1904 | 6 | 39 |
| 0.1425 | 3.4685 | 385 | 0.0775 | 0.9822 | 0.9828 | 0.9786 | 0.9807 | 1603 | 1882 | 28 | 35 |
| 0.1326 | 3.5676 | 396 | 0.0649 | 0.9865 | 0.9920 | 0.9786 | 0.9852 | 1603 | 1897 | 13 | 35 |
| 0.1342 | 3.6667 | 407 | 0.0559 | 0.9870 | 0.9956 | 0.9762 | 0.9858 | 1599 | 1903 | 7 | 39 |
| 0.1524 | 3.7658 | 418 | 0.0611 | 0.9865 | 0.9969 | 0.9737 | 0.9852 | 1595 | 1905 | 5 | 43 |
| 0.1387 | 3.8649 | 429 | 0.0740 | 0.9853 | 0.9913 | 0.9768 | 0.9840 | 1600 | 1896 | 14 | 38 |
| 0.1449 | 3.9640 | 440 | 0.0733 | 0.9848 | 0.9901 | 0.9768 | 0.9834 | 1600 | 1894 | 16 | 38 |
| 0.1605 | 4.0631 | 451 | 0.0756 | 0.9825 | 0.9858 | 0.9762 | 0.9810 | 1599 | 1887 | 23 | 39 |
| 0.1522 | 4.1622 | 462 | 0.0846 | 0.9800 | 0.9775 | 0.9792 | 0.9783 | 1604 | 1873 | 37 | 34 |
| 0.1622 | 4.2613 | 473 | 0.0798 | 0.9851 | 0.9877 | 0.9799 | 0.9838 | 1605 | 1890 | 20 | 33 |
| 0.1103 | 4.3604 | 484 | 0.0717 | 0.9870 | 0.9938 | 0.9780 | 0.9858 | 1602 | 1900 | 10 | 36 |
| 0.1381 | 4.4595 | 495 | 0.0676 | 0.9856 | 0.9907 | 0.9780 | 0.9843 | 1602 | 1895 | 15 | 36 |
| 0.1402 | 4.5586 | 506 | 0.0683 | 0.9845 | 0.9907 | 0.9756 | 0.9831 | 1598 | 1895 | 15 | 40 |
| 0.1635 | 4.6577 | 517 | 0.0672 | 0.9853 | 0.9913 | 0.9768 | 0.9840 | 1600 | 1896 | 14 | 38 |
| 0.1253 | 4.7568 | 528 | 0.0635 | 0.9865 | 0.9938 | 0.9768 | 0.9852 | 1600 | 1900 | 10 | 38 |
| 0.1409 | 4.8559 | 539 | 0.0630 | 0.9865 | 0.9956 | 0.9750 | 0.9852 | 1597 | 1903 | 7 | 41 |
| 0.1359 | 4.9550 | 550 | 0.0641 | 0.9856 | 0.9932 | 0.9756 | 0.9843 | 1598 | 1899 | 11 | 40 |
### Framework versions
- Transformers 5.0.0
- Pytorch 2.10.0+cu128
- Datasets 4.0.0
- Tokenizers 0.22.2