---
library_name: transformers
license: apache-2.0
base_model: google/vit-base-patch16-224
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: ViT_B16
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# ViT_B16

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0999
- Accuracy: 0.9729
- Precision: 0.9874
- Recall: 0.9536
- F1: 0.9702
- Tp: 1562
- Tn: 1890
- Fp: 20
- Fn: 76
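
Here Tp, Tn, Fp, and Fn are the raw true-positive, true-negative, false-positive, and false-negative counts on the evaluation set, which indicates a binary classification task. As a sanity check, the headline metrics can be reproduced directly from these counts; the short sketch below (plain Python, no extra dependencies) does exactly that:

```python
# Sanity check: derive the headline metrics from the reported confusion counts.
tp, tn, fp, fn = 1562, 1890, 20, 76

accuracy  = (tp + tn) / (tp + tn + fp + fn)                 # 3452 / 3548 ≈ 0.9729
precision = tp / (tp + fp)                                  # 1562 / 1582 ≈ 0.9874
recall    = tp / (tp + fn)                                  # 1562 / 1638 ≈ 0.9536
f1        = 2 * precision * recall / (precision + recall)   # ≈ 0.9702

print(f"{accuracy:.4f} {precision:.4f} {recall:.4f} {f1:.4f}")
```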

## Model description

Limited information is available. Judging from the base model name and the confusion-matrix counts reported above, this is a binary image classifier fine-tuned from ViT-B/16 (16×16 patches, 224×224 input resolution); the training dataset and class names are not documented.

## Intended uses & limitations

More information needed
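
In the meantime, a minimal inference sketch is given below. It assumes the checkpoint is published under a repo id of the form `<user>/ViT_B16` (hypothetical, replace with the real location), that the image processor was saved alongside the model, and that the class names are exposed through the checkpoint's `id2label` mapping.

```python
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Hypothetical repo id; replace with the actual location of this checkpoint.
repo_id = "<user>/ViT_B16"

processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)

image = Image.open("example.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

logits = model(**inputs).logits
predicted_id = logits.argmax(-1).item()
print(model.config.id2label[predicted_id])
```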

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-06
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: adamw_torch_fused with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 276
- num_epochs: 5
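
For readers who want to reproduce this setup, the list above maps onto the Hugging Face `TrainingArguments` roughly as sketched below. This is an approximation rather than the author's actual script: the `output_dir` and the evaluation cadence of every 55 steps (read off the results table) are assumptions.

```python
from transformers import TrainingArguments

# Sketch of the logged hyperparameters; output_dir and eval cadence are assumptions.
training_args = TrainingArguments(
    output_dir="ViT_B16",
    learning_rate=5e-6,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    optim="adamw_torch_fused",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=276,
    num_train_epochs=5,
    eval_strategy="steps",  # the results table logs an evaluation every 55 steps
    eval_steps=55,
)
```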

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Accuracy | Precision | Recall | F1     | Tp   | Tn   | Fp  | Fn  |
|:-------------:|:------:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|:----:|:----:|:---:|:---:|
| 0.6533        | 0.2477 | 55   | 0.5250          | 0.8470   | 0.8296    | 0.8413 | 0.8354 | 1378 | 1627 | 283 | 260 |
| 0.4246        | 0.4955 | 110  | 0.3119          | 0.9081   | 0.9379    | 0.8578 | 0.8960 | 1405 | 1817 | 93  | 233 |
| 0.2834        | 0.7432 | 165  | 0.2395          | 0.9194   | 0.9033    | 0.9243 | 0.9137 | 1514 | 1748 | 162 | 124 |
| 0.2425        | 0.9910 | 220  | 0.1882          | 0.9369   | 0.9348    | 0.9280 | 0.9314 | 1520 | 1804 | 106 | 118 |
| 0.2127        | 1.2387 | 275  | 0.1657          | 0.9501   | 0.9551    | 0.9359 | 0.9454 | 1533 | 1838 | 72  | 105 |
| 0.1973        | 1.4865 | 330  | 0.1446          | 0.9580   | 0.9709    | 0.9371 | 0.9537 | 1535 | 1864 | 46  | 103 |
| 0.1943        | 1.7342 | 385  | 0.1417          | 0.9628   | 0.9772    | 0.9414 | 0.9590 | 1542 | 1874 | 36  | 96  |
| 0.1934        | 1.9820 | 440  | 0.1173          | 0.9696   | 0.9904    | 0.9432 | 0.9662 | 1545 | 1895 | 15  | 93  |
| 0.1671        | 2.2297 | 495  | 0.1085          | 0.9707   | 0.9968    | 0.9396 | 0.9673 | 1539 | 1905 | 5   | 99  |
| 0.1755        | 2.4775 | 550  | 0.1140          | 0.9713   | 0.9898    | 0.9475 | 0.9682 | 1552 | 1894 | 16  | 86  |
| 0.1836        | 2.7252 | 605  | 0.1238          | 0.9659   | 0.9720    | 0.9536 | 0.9627 | 1562 | 1865 | 45  | 76  |
| 0.1664        | 2.9730 | 660  | 0.1199          | 0.9667   | 0.9750    | 0.9524 | 0.9636 | 1560 | 1870 | 40  | 78  |
| 0.1693        | 3.2207 | 715  | 0.1189          | 0.9679   | 0.9745    | 0.9554 | 0.9649 | 1565 | 1869 | 41  | 73  |
| 0.1646        | 3.4685 | 770  | 0.1073          | 0.9701   | 0.9867    | 0.9481 | 0.9670 | 1553 | 1889 | 21  | 85  |
| 0.1585        | 3.7162 | 825  | 0.1076          | 0.9687   | 0.9805    | 0.9512 | 0.9656 | 1558 | 1879 | 31  | 80  |
| 0.1604        | 3.9640 | 880  | 0.1054          | 0.9729   | 0.9892    | 0.9518 | 0.9701 | 1559 | 1893 | 17  | 79  |
| 0.1701        | 4.2117 | 935  | 0.1046          | 0.9704   | 0.9806    | 0.9548 | 0.9675 | 1564 | 1879 | 31  | 74  |
| 0.1607        | 4.4595 | 990  | 0.1039          | 0.9713   | 0.9830    | 0.9542 | 0.9684 | 1563 | 1883 | 27  | 75  |
| 0.1631        | 4.7072 | 1045 | 0.1010          | 0.9727   | 0.9873    | 0.9530 | 0.9699 | 1561 | 1890 | 20  | 77  |
| 0.1483        | 4.9550 | 1100 | 0.0999          | 0.9729   | 0.9874    | 0.9536 | 0.9702 | 1562 | 1890 | 20  | 76  |


### Framework versions

- Transformers 5.0.0
- PyTorch 2.10.0+cu128
- Datasets 4.0.0
- Tokenizers 0.22.2