---
library_name: transformers
license: apache-2.0
base_model: google/vit-base-patch16-224
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: square_run_32_batch
  results: []
---


# square_run_32_batch

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6241
- F1 Macro: 0.5019
- F1 Micro: 0.5758
- F1 Weighted: 0.5679
- Precision Macro: 0.5021
- Precision Micro: 0.5758
- Precision Weighted: 0.5657
- Recall Macro: 0.5073
- Recall Micro: 0.5758
- Recall Weighted: 0.5758
- Accuracy: 0.5758

## Model description

More information needed

## Intended uses & limitations

More information needed
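
The card does not document usage, but the checkpoint is presumably a standard ViT image classifier, so it should load through the usual `transformers` image-classification API. Below is a minimal inference sketch; the repository id (`your-username/square_run_32_batch`) and the image path are placeholders, not values taken from this card.

```python
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Placeholder repo id: substitute the actual Hub repository or a local checkpoint path.
model_id = "your-username/square_run_32_batch"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("example.jpg").convert("RGB")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(-1).item()
print(model.config.id2label[predicted_class])
```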

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: 8-bit AdamW via bitsandbytes (`adamw_bnb_8bit`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 30
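
These settings map directly onto `TrainingArguments`. The sketch below reconstructs them, assuming training ran on a single device (so `train_batch_size` becomes `per_device_train_batch_size`) and that `output_dir` matches the run name; neither assumption is stated in the card.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="square_run_32_batch",  # assumption: output dir named after the run
    learning_rate=1e-4,
    per_device_train_batch_size=32,    # assumption: single device
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_bnb_8bit",            # 8-bit AdamW from bitsandbytes
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=30,
)
```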

### Training results

| Training Loss | Epoch | Step | Validation Loss | F1 Macro | F1 Micro | F1 Weighted | Precision Macro | Precision Micro | Precision Weighted | Recall Macro | Recall Micro | Recall Weighted | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--------:|:-----------:|:---------------:|:---------------:|:------------------:|:------------:|:------------:|:---------------:|:--------:|
| 1.9373        | 1.0   | 15   | 1.8818          | 0.0464   | 0.1894   | 0.0615      | 0.0277          | 0.1894          | 0.0367             | 0.1429       | 0.1894       | 0.1894          | 0.1894   |
| 1.869         | 2.0   | 30   | 1.8642          | 0.1100   | 0.2652   | 0.1418      | 0.075           | 0.2652          | 0.0968             | 0.2063       | 0.2652       | 0.2652          | 0.2652   |
| 1.9218        | 3.0   | 45   | 1.8754          | 0.1163   | 0.2576   | 0.1460      | 0.1316          | 0.2576          | 0.1566             | 0.1905       | 0.2576       | 0.2576          | 0.2576   |
| 1.6733        | 4.0   | 60   | 1.6881          | 0.2445   | 0.3864   | 0.3053      | 0.2427          | 0.3864          | 0.2917             | 0.2992       | 0.3864       | 0.3864          | 0.3864   |
| 1.54          | 5.0   | 75   | 1.5528          | 0.3252   | 0.4242   | 0.3856      | 0.3429          | 0.4242          | 0.4101             | 0.3570       | 0.4242       | 0.4242          | 0.4242   |
| 1.4418        | 6.0   | 90   | 1.5737          | 0.2858   | 0.3864   | 0.3213      | 0.2846          | 0.3864          | 0.3243             | 0.3398       | 0.3864       | 0.3864          | 0.3864   |
| 0.8592        | 7.0   | 105  | 1.5408          | 0.3444   | 0.4394   | 0.3965      | 0.3208          | 0.4394          | 0.3674             | 0.3791       | 0.4394       | 0.4394          | 0.4394   |
| 1.1427        | 8.0   | 120  | 1.2804          | 0.4638   | 0.5606   | 0.5317      | 0.4698          | 0.5606          | 0.5280             | 0.4831       | 0.5606       | 0.5606          | 0.5606   |
| 0.7849        | 9.0   | 135  | 1.2880          | 0.4649   | 0.5530   | 0.5291      | 0.4804          | 0.5530          | 0.5401             | 0.4823       | 0.5530       | 0.5530          | 0.5530   |
| 0.6846        | 10.0  | 150  | 1.3130          | 0.4298   | 0.5152   | 0.4811      | 0.4404          | 0.5152          | 0.5005             | 0.4671       | 0.5152       | 0.5152          | 0.5152   |
| 0.4006        | 11.0  | 165  | 1.2958          | 0.4931   | 0.5833   | 0.5598      | 0.4983          | 0.5833          | 0.5756             | 0.5229       | 0.5833       | 0.5833          | 0.5833   |
| 0.4329        | 12.0  | 180  | 1.2990          | 0.5062   | 0.5530   | 0.5562      | 0.5315          | 0.5530          | 0.5874             | 0.5133       | 0.5530       | 0.5530          | 0.5530   |
| 0.482         | 13.0  | 195  | 1.3831          | 0.4842   | 0.5152   | 0.5233      | 0.5517          | 0.5152          | 0.5803             | 0.4839       | 0.5152       | 0.5152          | 0.5152   |
| 0.6409        | 14.0  | 210  | 1.4066          | 0.5081   | 0.5985   | 0.5765      | 0.5194          | 0.5985          | 0.5820             | 0.5232       | 0.5985       | 0.5985          | 0.5985   |
| 0.3206        | 15.0  | 225  | 1.3690          | 0.5155   | 0.5606   | 0.5520      | 0.6158          | 0.5606          | 0.5890             | 0.5170       | 0.5606       | 0.5606          | 0.5606   |
| 0.1773        | 16.0  | 240  | 1.2568          | 0.5920   | 0.6515   | 0.6408      | 0.6894          | 0.6515          | 0.6623             | 0.5843       | 0.6515       | 0.6515          | 0.6515   |
| 0.3259        | 17.0  | 255  | 1.3406          | 0.5467   | 0.6061   | 0.5961      | 0.5615          | 0.6061          | 0.6033             | 0.5467       | 0.6061       | 0.6061          | 0.6061   |
| 0.1123        | 18.0  | 270  | 1.3767          | 0.5868   | 0.6364   | 0.6306      | 0.6258          | 0.6364          | 0.6413             | 0.5785       | 0.6364       | 0.6364          | 0.6364   |
| 0.1129        | 19.0  | 285  | 1.4680          | 0.5879   | 0.6439   | 0.6306      | 0.6809          | 0.6439          | 0.6933             | 0.5806       | 0.6439       | 0.6439          | 0.6439   |
| 0.0651        | 20.0  | 300  | 1.4981          | 0.6655   | 0.6894   | 0.6876      | 0.7115          | 0.6894          | 0.7224             | 0.6511       | 0.6894       | 0.6894          | 0.6894   |
| 0.0685        | 21.0  | 315  | 1.4621          | 0.6091   | 0.6515   | 0.6494      | 0.6303          | 0.6515          | 0.6641             | 0.6040       | 0.6515       | 0.6515          | 0.6515   |
| 0.1469        | 22.0  | 330  | 1.5347          | 0.5330   | 0.6212   | 0.6040      | 0.5477          | 0.6212          | 0.6149             | 0.5440       | 0.6212       | 0.6212          | 0.6212   |
| 0.0289        | 23.0  | 345  | 1.5417          | 0.5466   | 0.6288   | 0.6180      | 0.5409          | 0.6288          | 0.6108             | 0.5549       | 0.6288       | 0.6288          | 0.6288   |
| 0.01          | 24.0  | 360  | 1.5670          | 0.5475   | 0.6364   | 0.6187      | 0.5435          | 0.6364          | 0.6104             | 0.5594       | 0.6364       | 0.6364          | 0.6364   |
| 0.035         | 25.0  | 375  | 1.6037          | 0.5529   | 0.6364   | 0.6209      | 0.5470          | 0.6364          | 0.6156             | 0.5679       | 0.6364       | 0.6364          | 0.6364   |
| 0.0109        | 26.0  | 390  | 1.6752          | 0.5897   | 0.6212   | 0.6203      | 0.6145          | 0.6212          | 0.6527             | 0.6000       | 0.6212       | 0.6212          | 0.6212   |
| 0.038         | 27.0  | 405  | 1.6724          | 0.5344   | 0.6136   | 0.6008      | 0.5332          | 0.6136          | 0.6005             | 0.5468       | 0.6136       | 0.6136          | 0.6136   |
| 0.0116        | 28.0  | 420  | 1.6252          | 0.5384   | 0.6212   | 0.6090      | 0.5337          | 0.6212          | 0.6033             | 0.5491       | 0.6212       | 0.6212          | 0.6212   |
| 0.006         | 29.0  | 435  | 1.5980          | 0.5572   | 0.6364   | 0.6294      | 0.5529          | 0.6364          | 0.6246             | 0.5634       | 0.6364       | 0.6364          | 0.6364   |
| 0.0046        | 30.0  | 450  | 1.5939          | 0.5605   | 0.6439   | 0.6342      | 0.5546          | 0.6439          | 0.6269             | 0.5687       | 0.6439       | 0.6439          | 0.6439   |
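
Validation accuracy peaks at 0.6894 in epoch 20 and drifts down afterwards while the training loss falls toward zero, a typical overfitting pattern; the epoch-20 checkpoint may therefore be preferable to the final one.

The per-average F1/precision/recall columns are consistent with a `compute_metrics` callback built on scikit-learn. The exact function used for this run is not documented; the following is a plausible reconstruction of one that would produce these columns.

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score

def compute_metrics(eval_pred):
    """Reconstruction (not the documented original) of a metrics callback
    producing the columns reported in the table above."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    metrics = {"accuracy": accuracy_score(labels, preds)}
    for avg in ("macro", "micro", "weighted"):
        metrics[f"f1_{avg}"] = f1_score(labels, preds, average=avg)
        metrics[f"precision_{avg}"] = precision_score(labels, preds, average=avg, zero_division=0)
        metrics[f"recall_{avg}"] = recall_score(labels, preds, average=avg, zero_division=0)
    return metrics
```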


### Framework versions

- Transformers 4.48.2
- Pytorch 2.6.0+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0