# square_run_second_vote_full_pic_50
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 1.6568
- F1 Macro: 0.2803
- F1 Micro: 0.3939
- F1 Weighted: 0.3344
- Precision Macro: 0.3642
- Precision Micro: 0.3939
- Precision Weighted: 0.4123
- Recall Macro: 0.3362
- Recall Micro: 0.3939
- Recall Weighted: 0.3939
- Accuracy: 0.3939
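Note that the micro-averaged precision, recall, and F1 above all coincide with accuracy (0.3939). This is expected for single-label multiclass classification: every misclassification counts as one false positive (for the predicted class) and one false negative (for the true class), so the pooled counts are identical. A minimal pure-Python check with toy labels (illustrative only, not this model's outputs):

```python
def micro_scores(y_true, y_pred):
    """Micro-averaged precision/recall/F1 pool TP/FP/FN over all classes."""
    tp = sum(t == p for t, p in zip(y_true, y_pred))
    # Each wrong prediction is one FP for the predicted class and one FN
    # for the true class, so pooled FP equals pooled FN.
    fp = sum(t != p for t, p in zip(y_true, y_pred))
    fn = fp
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    accuracy = tp / len(y_true)
    return precision, recall, f1, accuracy

# toy 4-class example
y_true = [0, 1, 2, 3, 0, 1, 2, 3]
y_pred = [0, 1, 2, 0, 3, 1, 2, 2]
p, r, f, acc = micro_scores(y_true, y_pred)
assert p == r == f == acc  # all four metrics agree, as in the table above
```

Macro and weighted averages, by contrast, compute per-class scores first and so diverge from accuracy whenever classes are imbalanced or unevenly predicted.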
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: 8-bit AdamW (`adamw_bnb_8bit`, via bitsandbytes) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 30
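With 58 optimizer steps per epoch (see the table below) and 30 epochs, training runs for 1740 steps, so the 0.1 warmup ratio corresponds to 174 warmup steps. A minimal sketch of the resulting linear schedule, assuming the standard warmup-then-decay shape used by `transformers` (function name here is illustrative):

```python
def linear_lr(step, base_lr=1e-4, total_steps=1740, warmup_ratio=0.1):
    """Linear warmup from 0 to base_lr, then linear decay back to 0."""
    warmup_steps = int(total_steps * warmup_ratio)  # 174 here
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * (total_steps - step) / (total_steps - warmup_steps)

assert linear_lr(0) == 0.0        # cold start
assert linear_lr(174) == 1e-4     # peak learning rate after warmup
assert linear_lr(1740) == 0.0     # fully decayed at the final step
```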
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 Macro | F1 Micro | F1 Weighted | Precision Macro | Precision Micro | Precision Weighted | Recall Macro | Recall Micro | Recall Weighted | Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1.8125 | 1.0 | 58 | 1.8594 | 0.1226 | 0.2121 | 0.1692 | 0.1180 | 0.2121 | 0.1586 | 0.1501 | 0.2121 | 0.2121 | 0.2121 |
| 1.8401 | 2.0 | 116 | 1.9425 | 0.0860 | 0.1742 | 0.1036 | 0.0668 | 0.1742 | 0.0824 | 0.1580 | 0.1742 | 0.1742 | 0.1742 |
| 1.7455 | 3.0 | 174 | 1.8949 | 0.1450 | 0.2424 | 0.1731 | 0.2634 | 0.2424 | 0.3029 | 0.1819 | 0.2424 | 0.2424 | 0.2424 |
| 1.8283 | 4.0 | 232 | 1.8868 | 0.0989 | 0.2121 | 0.1383 | 0.0794 | 0.2121 | 0.1089 | 0.1482 | 0.2121 | 0.2121 | 0.2121 |
| 1.729 | 5.0 | 290 | 1.8830 | 0.1271 | 0.1894 | 0.1496 | 0.1438 | 0.1894 | 0.1799 | 0.1663 | 0.1894 | 0.1894 | 0.1894 |
| 1.6643 | 6.0 | 348 | 1.8247 | 0.1450 | 0.2424 | 0.1852 | 0.1921 | 0.2424 | 0.2200 | 0.1749 | 0.2424 | 0.2424 | 0.2424 |
| 1.9317 | 7.0 | 406 | 1.8338 | 0.1470 | 0.1894 | 0.1785 | 0.1535 | 0.1894 | 0.1869 | 0.1574 | 0.1894 | 0.1894 | 0.1894 |
| 1.4753 | 8.0 | 464 | 1.7873 | 0.1617 | 0.2652 | 0.2071 | 0.1458 | 0.2652 | 0.1843 | 0.2046 | 0.2652 | 0.2652 | 0.2652 |
| 2.0844 | 9.0 | 522 | 1.8694 | 0.2562 | 0.3106 | 0.3029 | 0.2622 | 0.3106 | 0.3076 | 0.2610 | 0.3106 | 0.3106 | 0.3106 |
| 1.558 | 10.0 | 580 | 1.8684 | 0.2203 | 0.2803 | 0.2542 | 0.2140 | 0.2803 | 0.2502 | 0.2442 | 0.2803 | 0.2803 | 0.2803 |
| 1.6059 | 11.0 | 638 | 1.9295 | 0.2746 | 0.3182 | 0.3103 | 0.3107 | 0.3182 | 0.3453 | 0.2849 | 0.3182 | 0.3182 | 0.3182 |
| 1.0749 | 12.0 | 696 | 2.0512 | 0.2284 | 0.3182 | 0.2797 | 0.2882 | 0.3182 | 0.3204 | 0.2409 | 0.3182 | 0.3182 | 0.3182 |
| 1.5171 | 13.0 | 754 | 2.1976 | 0.2193 | 0.2955 | 0.2645 | 0.2698 | 0.2955 | 0.3064 | 0.2359 | 0.2955 | 0.2955 | 0.2955 |
| 0.6995 | 14.0 | 812 | 2.3271 | 0.2159 | 0.3030 | 0.2658 | 0.2928 | 0.3030 | 0.3244 | 0.2312 | 0.3030 | 0.3030 | 0.3030 |
| 1.2603 | 15.0 | 870 | 2.6123 | 0.2353 | 0.2727 | 0.2714 | 0.2778 | 0.2727 | 0.3214 | 0.2418 | 0.2727 | 0.2727 | 0.2727 |
| 0.6293 | 16.0 | 928 | 2.5967 | 0.1990 | 0.2576 | 0.2312 | 0.2149 | 0.2576 | 0.2568 | 0.2202 | 0.2576 | 0.2576 | 0.2576 |
| 0.3242 | 17.0 | 986 | 2.7596 | 0.2242 | 0.2727 | 0.2580 | 0.2423 | 0.2727 | 0.2818 | 0.2348 | 0.2727 | 0.2727 | 0.2727 |
| 0.6081 | 18.0 | 1044 | 2.8475 | 0.2060 | 0.2500 | 0.2401 | 0.2329 | 0.2500 | 0.2604 | 0.2054 | 0.2500 | 0.2500 | 0.2500 |
| 0.3241 | 19.0 | 1102 | 3.1226 | 0.1989 | 0.2500 | 0.2334 | 0.2199 | 0.2500 | 0.2494 | 0.2033 | 0.2500 | 0.2500 | 0.2500 |
| 0.1119 | 20.0 | 1160 | 3.1286 | 0.2302 | 0.2803 | 0.2653 | 0.2654 | 0.2803 | 0.2992 | 0.2332 | 0.2803 | 0.2803 | 0.2803 |
| 0.0946 | 21.0 | 1218 | 3.2789 | 0.2265 | 0.2955 | 0.2698 | 0.2472 | 0.2955 | 0.2835 | 0.2359 | 0.2955 | 0.2955 | 0.2955 |
| 0.0434 | 22.0 | 1276 | 3.2405 | 0.2357 | 0.2652 | 0.2666 | 0.2398 | 0.2652 | 0.2744 | 0.2360 | 0.2652 | 0.2652 | 0.2652 |
| 0.0926 | 23.0 | 1334 | 3.3668 | 0.2435 | 0.2955 | 0.2829 | 0.2650 | 0.2955 | 0.2973 | 0.2461 | 0.2955 | 0.2955 | 0.2955 |
| 0.1002 | 24.0 | 1392 | 3.4633 | 0.2105 | 0.2727 | 0.2544 | 0.2310 | 0.2727 | 0.2643 | 0.2149 | 0.2727 | 0.2727 | 0.2727 |
| 0.0602 | 25.0 | 1450 | 3.4614 | 0.2575 | 0.3030 | 0.2990 | 0.2662 | 0.3030 | 0.3027 | 0.2555 | 0.3030 | 0.3030 | 0.3030 |
| 0.0079 | 26.0 | 1508 | 3.7489 | 0.2416 | 0.2879 | 0.2764 | 0.2489 | 0.2879 | 0.2847 | 0.2503 | 0.2879 | 0.2879 | 0.2879 |
| 0.1364 | 27.0 | 1566 | 3.8018 | 0.2234 | 0.2727 | 0.2626 | 0.2312 | 0.2727 | 0.2655 | 0.2253 | 0.2727 | 0.2727 | 0.2727 |
| 0.0141 | 28.0 | 1624 | 3.7614 | 0.2435 | 0.2879 | 0.2816 | 0.2527 | 0.2879 | 0.2858 | 0.2437 | 0.2879 | 0.2879 | 0.2879 |
| 0.1638 | 29.0 | 1682 | 3.7921 | 0.2341 | 0.2803 | 0.2745 | 0.2423 | 0.2803 | 0.2795 | 0.2345 | 0.2803 | 0.2803 | 0.2803 |
| 0.0049 | 30.0 | 1740 | 3.7955 | 0.2345 | 0.2803 | 0.2743 | 0.2431 | 0.2803 | 0.2792 | 0.2345 | 0.2803 | 0.2803 | 0.2803 |
### Framework versions
- Transformers 4.49.0
- Pytorch 2.6.0+cu124
- Datasets 3.3.1
- Tokenizers 0.21.0