---
library_name: transformers
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: vit-tiny-patch16-224
  results: []
---


# vit-tiny-patch16-224

This model was trained from scratch on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6141
- F1 Macro: 0.4385
- F1 Micro: 0.5303
- F1 Weighted: 0.4856
- Precision Macro: 0.5225
- Precision Micro: 0.5303
- Precision Weighted: 0.5788
- Recall Macro: 0.4858
- Recall Micro: 0.5303
- Recall Weighted: 0.5303
- Accuracy: 0.5303
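
Note that F1 Micro, Precision Micro, Recall Micro, and Accuracy are all 0.5303. This is expected rather than a coincidence: for single-label classification, micro-averaged precision, recall, and F1 all reduce to plain accuracy, because every wrong prediction counts as exactly one false positive and one false negative. A minimal pure-Python sketch (using made-up labels, not this model's predictions) illustrating why:

```python
def micro_metrics(y_true, y_pred):
    """Micro-averaged precision/recall/F1 for single-label classification.

    Micro-averaging pools true positives and errors over all classes.
    In the single-label case, TP = number of correct predictions and
    FP = FN = number of incorrect ones, so precision, recall, F1, and
    accuracy all coincide.
    """
    tp = sum(t == p for t, p in zip(y_true, y_pred))
    fp = len(y_pred) - tp  # every wrong prediction is a false positive...
    fn = len(y_true) - tp  # ...and a false negative for the true class
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    accuracy = tp / len(y_true)
    return precision, recall, f1, accuracy

# Toy example (not this model's outputs): 6 of 8 predictions correct.
y_true = [0, 1, 2, 2, 1, 0, 3, 3]
y_pred = [0, 2, 2, 2, 1, 0, 1, 3]
p, r, f1, acc = micro_metrics(y_true, y_pred)
print(p, r, f1, acc)  # all four equal 0.75
```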

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
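
The hyperparameters above could be expressed as a `TrainingArguments` configuration along these lines (a sketch, not the exact script used for this run; the `output_dir` is a placeholder). The total train batch size of 16 follows from `per_device_train_batch_size × gradient_accumulation_steps = 8 × 2`:

```python
from transformers import TrainingArguments

# Sketch reconstructing the hyperparameters listed above;
# output_dir is a placeholder, not the directory used for this run.
training_args = TrainingArguments(
    output_dir="vit-tiny-patch16-224",
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,  # effective train batch size: 8 * 2 = 16
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=25,
    optim="adamw_torch",            # AdamW, betas=(0.9, 0.999), eps=1e-8
)
```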

### Training results

| Training Loss | Epoch | Step | Validation Loss | F1 Macro | F1 Micro | F1 Weighted | Precision Macro | Precision Micro | Precision Weighted | Recall Macro | Recall Micro | Recall Weighted | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--------:|:-----------:|:---------------:|:---------------:|:------------------:|:------------:|:------------:|:---------------:|:--------:|
| 1.9719        | 1.0   | 29   | 1.9209          | 0.0949   | 0.2197   | 0.1155      | 0.0891          | 0.2197          | 0.1051             | 0.1722       | 0.2197       | 0.2197          | 0.2197   |
| 1.8717        | 2.0   | 58   | 2.0378          | 0.0953   | 0.1970   | 0.1069      | 0.1996          | 0.1970          | 0.2660             | 0.1794       | 0.1970       | 0.1970          | 0.1970   |
| 1.9326        | 3.0   | 87   | 1.7680          | 0.2290   | 0.3939   | 0.2939      | 0.2151          | 0.3939          | 0.2682             | 0.3004       | 0.3939       | 0.3939          | 0.3939   |
| 1.2873        | 4.0   | 116  | 1.5892          | 0.3502   | 0.4470   | 0.4082      | 0.4831          | 0.4470          | 0.5140             | 0.3646       | 0.4470       | 0.4470          | 0.4470   |
| 1.3997        | 5.0   | 145  | 1.4773          | 0.3481   | 0.5000   | 0.4245      | 0.3463          | 0.5000          | 0.4119             | 0.4052       | 0.5000       | 0.5000          | 0.5000   |
| 1.7041        | 6.0   | 174  | 1.4406          | 0.4266   | 0.5379   | 0.5005      | 0.5011          | 0.5379          | 0.5628             | 0.4529       | 0.5379       | 0.5379          | 0.5379   |
| 1.1863        | 7.0   | 203  | 1.3680          | 0.4759   | 0.5682   | 0.5400      | 0.5559          | 0.5682          | 0.6032             | 0.4831       | 0.5682       | 0.5682          | 0.5682   |
| 0.9817        | 8.0   | 232  | 1.3515          | 0.4399   | 0.5227   | 0.4969      | 0.4445          | 0.5227          | 0.5088             | 0.4722       | 0.5227       | 0.5227          | 0.5227   |
| 0.617         | 9.0   | 261  | 1.3867          | 0.4895   | 0.5909   | 0.5555      | 0.5136          | 0.5909          | 0.5776             | 0.5183       | 0.5909       | 0.5909          | 0.5909   |
| 1.0365        | 10.0  | 290  | 1.4607          | 0.4313   | 0.5379   | 0.4961      | 0.4371          | 0.5379          | 0.4997             | 0.4674       | 0.5379       | 0.5379          | 0.5379   |
| 0.6815        | 11.0  | 319  | 1.3133          | 0.4962   | 0.5909   | 0.5664      | 0.5087          | 0.5909          | 0.5742             | 0.5133       | 0.5909       | 0.5909          | 0.5909   |
| 0.4153        | 12.0  | 348  | 1.3528          | 0.5082   | 0.5909   | 0.5735      | 0.5185          | 0.5909          | 0.5820             | 0.5202       | 0.5909       | 0.5909          | 0.5909   |
| 0.3396        | 13.0  | 377  | 1.3856          | 0.5372   | 0.5909   | 0.5830      | 0.5623          | 0.5909          | 0.6018             | 0.5387       | 0.5909       | 0.5909          | 0.5909   |
| 0.5415        | 14.0  | 406  | 1.4252          | 0.5132   | 0.5909   | 0.5795      | 0.5223          | 0.5909          | 0.5893             | 0.5255       | 0.5909       | 0.5909          | 0.5909   |
| 0.4421        | 15.0  | 435  | 1.4081          | 0.5574   | 0.6136   | 0.6086      | 0.5753          | 0.6136          | 0.6149             | 0.5532       | 0.6136       | 0.6136          | 0.6136   |
| 0.2893        | 16.0  | 464  | 1.5285          | 0.5127   | 0.5985   | 0.5833      | 0.5059          | 0.5985          | 0.5752             | 0.5253       | 0.5985       | 0.5985          | 0.5985   |
| 0.2403        | 17.0  | 493  | 1.4820          | 0.5395   | 0.6288   | 0.6065      | 0.5808          | 0.6288          | 0.6380             | 0.5460       | 0.6288       | 0.6288          | 0.6288   |
| 0.1087        | 18.0  | 522  | 1.3999          | 0.5320   | 0.6061   | 0.6009      | 0.5612          | 0.6061          | 0.6211             | 0.5261       | 0.6061       | 0.6061          | 0.6061   |
| 0.2619        | 19.0  | 551  | 1.4408          | 0.5618   | 0.6136   | 0.6037      | 0.6154          | 0.6136          | 0.6225             | 0.5501       | 0.6136       | 0.6136          | 0.6136   |
| 0.1154        | 20.0  | 580  | 1.4516          | 0.5402   | 0.6288   | 0.6090      | 0.5538          | 0.6288          | 0.6145             | 0.5492       | 0.6288       | 0.6288          | 0.6288   |
| 0.1367        | 21.0  | 609  | 1.5306          | 0.5254   | 0.6136   | 0.5942      | 0.5321          | 0.6136          | 0.5923             | 0.5340       | 0.6136       | 0.6136          | 0.6136   |
| 0.0839        | 22.0  | 638  | 1.6397          | 0.5154   | 0.5833   | 0.5756      | 0.5274          | 0.5833          | 0.5895             | 0.5252       | 0.5833       | 0.5833          | 0.5833   |
| 0.1818        | 23.0  | 667  | 1.6416          | 0.5656   | 0.6515   | 0.6359      | 0.5848          | 0.6515          | 0.6456             | 0.5696       | 0.6515       | 0.6515          | 0.6515   |
| 0.0781        | 24.0  | 696  | 1.6026          | 0.5393   | 0.6212   | 0.6079      | 0.5524          | 0.6212          | 0.6118             | 0.5412       | 0.6212       | 0.6212          | 0.6212   |
| 0.0792        | 25.0  | 725  | 1.5997          | 0.5494   | 0.6288   | 0.6180      | 0.5716          | 0.6288          | 0.6297             | 0.5480       | 0.6288       | 0.6288          | 0.6288   |
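
Throughout the table, the macro-averaged scores sit below the weighted ones (e.g. final F1 Macro 0.5494 vs. F1 Weighted 0.6180), which suggests the model performs worse on the smaller classes: macro averaging gives every class equal weight, while weighted averaging scales each class's score by its support. A minimal sketch of the difference, using made-up per-class F1 scores and supports (not this model's actual per-class results):

```python
def macro_and_weighted(per_class_f1, supports):
    """Average per-class F1 two ways.

    macro:    unweighted mean -- every class counts equally.
    weighted: mean weighted by class support -- large classes dominate.
    """
    macro = sum(per_class_f1) / len(per_class_f1)
    total = sum(supports)
    weighted = sum(f * s for f, s in zip(per_class_f1, supports)) / total
    return macro, weighted

# Made-up example: one large class the model handles well and two small
# classes it handles poorly pull the macro average below the weighted one.
f1_scores = [0.8, 0.3, 0.4]
supports = [100, 10, 10]
macro, weighted = macro_and_weighted(f1_scores, supports)
print(round(macro, 6))     # 0.5
print(round(weighted, 6))  # (0.8*100 + 0.3*10 + 0.4*10) / 120 = 0.725
```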


### Framework versions

- Transformers 4.48.2
- Pytorch 2.6.0+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0