---
license: apache-2.0
tags:
- image-classification
- generated_from_trainer
metrics:
- accuracy
- recall
- f1
- precision
model-index:
- name: vit-huge-binary-isic-patch-14
  results: []
---

# vit-huge-binary-isic-patch-14

This model is a fine-tuned version of [google/vit-huge-patch14-224-in21k](https://huggingface.co/google/vit-huge-patch14-224-in21k) on the ahishamm/isic_binary_augmented dataset.
It achieves the following results on the evaluation set (these match the step-800 row of the training table below, consistent with the best checkpoint by validation loss being kept):
- Loss: 0.2524
- Accuracy: 0.8910
- Recall: 0.8910
- F1: 0.8910
- Precision: 0.8910
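
For quick experimentation, a minimal inference sketch using the `transformers` image-classification pipeline is shown below. The repository id `ahishamm/vit-huge-binary-isic-patch-14` is an assumption inferred from the model name and the dataset namespace; point `model=` at your own checkpoint path if it differs.

```python
from transformers import pipeline

# Repo id is an assumption inferred from this card; swap in a local path if needed.
classifier = pipeline(
    "image-classification",
    model="ahishamm/vit-huge-binary-isic-patch-14",
)

# Accepts a local file path, URL, or PIL.Image; the filename here is a placeholder.
predictions = classifier("lesion_example.jpg")
print(predictions)  # [{'label': '...', 'score': ...}, ...]
```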

## Model description

Based on the base checkpoint and dataset names, this is a ViT-Huge model (14×14 patches, 224×224 input resolution, pre-trained on ImageNet-21k) fine-tuned for binary classification of ISIC dermoscopic skin-lesion images. Beyond what those names imply, more information is needed.

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch mirroring them follows the list):
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
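
A minimal sketch reproducing the hyperparameters above is given here; `output_dir` and the 100-step evaluation/logging cadence are assumptions (the cadence is inferred from the step spacing of the results table), and the Adam settings are simply the `Trainer` defaults.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="vit-huge-binary-isic-patch-14",  # assumed output directory
    learning_rate=2e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=4,
    # Adam betas/epsilon below are the Trainer defaults, which match the card.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    # Inferred from the 100-step spacing in the results table, not stated directly.
    evaluation_strategy="steps",
    eval_steps=100,
    logging_steps=100,
)
```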

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Recall | F1     | Precision |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:------:|:---------:|
| 0.3123        | 0.09  | 100  | 0.2954          | 0.8634   | 0.8634 | 0.8634 | 0.8634    |
| 0.2976        | 0.19  | 200  | 0.3114          | 0.8551   | 0.8551 | 0.8551 | 0.8551    |
| 0.3031        | 0.28  | 300  | 0.3799          | 0.8070   | 0.8070 | 0.8070 | 0.8070    |
| 0.2181        | 0.37  | 400  | 0.2541          | 0.8938   | 0.8938 | 0.8938 | 0.8938    |
| 0.2653        | 0.46  | 500  | 0.2780          | 0.8580   | 0.8580 | 0.8580 | 0.8580    |
| 0.1503        | 0.56  | 600  | 0.2552          | 0.8957   | 0.8957 | 0.8957 | 0.8957    |
| 0.2676        | 0.65  | 700  | 0.3207          | 0.8717   | 0.8717 | 0.8717 | 0.8717    |
| 0.1188        | 0.74  | 800  | 0.2524          | 0.8910   | 0.8910 | 0.8910 | 0.8910    |
| 0.0759        | 0.84  | 900  | 0.3596          | 0.8874   | 0.8874 | 0.8874 | 0.8874    |
| 0.0516        | 0.93  | 1000 | 0.3129          | 0.8981   | 0.8981 | 0.8981 | 0.8981    |
| 0.038         | 1.02  | 1100 | 0.3258          | 0.8890   | 0.8890 | 0.8890 | 0.8890    |
| 0.1238        | 1.12  | 1200 | 0.3292          | 0.8828   | 0.8828 | 0.8828 | 0.8828    |
| 0.0557        | 1.21  | 1300 | 0.3667          | 0.8770   | 0.8770 | 0.8770 | 0.8770    |
| 0.0586        | 1.3   | 1400 | 0.3858          | 0.9015   | 0.9015 | 0.9015 | 0.9015    |
| 0.063         | 1.39  | 1500 | 0.3371          | 0.9061   | 0.9061 | 0.9061 | 0.9061    |
| 0.0351        | 1.49  | 1600 | 0.3462          | 0.8995   | 0.8995 | 0.8995 | 0.8995    |
| 0.0149        | 1.58  | 1700 | 0.4622          | 0.8861   | 0.8861 | 0.8861 | 0.8861    |
| 0.0404        | 1.67  | 1800 | 0.4071          | 0.8903   | 0.8903 | 0.8903 | 0.8903    |
| 0.002         | 1.77  | 1900 | 0.4530          | 0.8971   | 0.8971 | 0.8971 | 0.8971    |
| 0.0459        | 1.86  | 2000 | 0.3853          | 0.8898   | 0.8898 | 0.8898 | 0.8898    |
| 0.0067        | 1.95  | 2100 | 0.4223          | 0.8968   | 0.8968 | 0.8968 | 0.8968    |
| 0.0041        | 2.04  | 2200 | 0.4549          | 0.8948   | 0.8948 | 0.8948 | 0.8948    |
| 0.0012        | 2.14  | 2300 | 0.4800          | 0.8962   | 0.8962 | 0.8962 | 0.8962    |
| 0.0159        | 2.23  | 2400 | 0.5657          | 0.8916   | 0.8916 | 0.8916 | 0.8916    |
| 0.0327        | 2.32  | 2500 | 0.5150          | 0.8884   | 0.8884 | 0.8884 | 0.8884    |
| 0.0011        | 2.42  | 2600 | 0.5171          | 0.8962   | 0.8962 | 0.8962 | 0.8962    |
| 0.027         | 2.51  | 2700 | 0.5732          | 0.8865   | 0.8865 | 0.8865 | 0.8865    |
| 0.0031        | 2.6   | 2800 | 0.4335          | 0.9075   | 0.9075 | 0.9075 | 0.9075    |
| 0.0006        | 2.7   | 2900 | 0.4453          | 0.9084   | 0.9084 | 0.9084 | 0.9084    |
| 0.0008        | 2.79  | 3000 | 0.4262          | 0.9047   | 0.9047 | 0.9047 | 0.9047    |
| 0.0031        | 2.88  | 3100 | 0.4823          | 0.9003   | 0.9003 | 0.9003 | 0.9003    |
| 0.0005        | 2.97  | 3200 | 0.5086          | 0.9022   | 0.9022 | 0.9022 | 0.9022    |
| 0.0004        | 3.07  | 3300 | 0.4912          | 0.9061   | 0.9061 | 0.9061 | 0.9061    |
| 0.0005        | 3.16  | 3400 | 0.5218          | 0.9027   | 0.9027 | 0.9027 | 0.9027    |
| 0.0037        | 3.25  | 3500 | 0.5006          | 0.9054   | 0.9054 | 0.9054 | 0.9054    |
| 0.0004        | 3.35  | 3600 | 0.5137          | 0.9040   | 0.9040 | 0.9040 | 0.9040    |
| 0.0003        | 3.44  | 3700 | 0.5105          | 0.9057   | 0.9057 | 0.9057 | 0.9057    |
| 0.0004        | 3.53  | 3800 | 0.5087          | 0.9058   | 0.9058 | 0.9058 | 0.9058    |
| 0.0003        | 3.62  | 3900 | 0.5095          | 0.9079   | 0.9079 | 0.9079 | 0.9079    |
| 0.0003        | 3.72  | 4000 | 0.5294          | 0.9045   | 0.9045 | 0.9045 | 0.9045    |
| 0.0003        | 3.81  | 4100 | 0.5242          | 0.9049   | 0.9049 | 0.9049 | 0.9049    |
| 0.0003        | 3.9   | 4200 | 0.5256          | 0.9050   | 0.9050 | 0.9050 | 0.9050    |
| 0.0096        | 4.0   | 4300 | 0.5267          | 0.9050   | 0.9050 | 0.9050 | 0.9050    |
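
Accuracy, recall, F1, and precision are identical in every row, which is the expected signature of micro averaging: in single-label classification, every misclassification counts as one false positive and one false negative, so micro-averaged precision, recall, and F1 all collapse to accuracy. A sketch of a `compute_metrics` function that would produce this pattern, using the `evaluate` library, follows; the micro-averaging choice is an inference from the numbers, not something the card states.

```python
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")
precision = evaluate.load("precision")
recall = evaluate.load("recall")
f1 = evaluate.load("f1")

def compute_metrics(eval_pred):
    # With micro averaging, precision == recall == f1 == accuracy for
    # single-label predictions, matching the identical columns above.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy.compute(predictions=preds, references=labels)["accuracy"],
        "recall": recall.compute(predictions=preds, references=labels, average="micro")["recall"],
        "f1": f1.compute(predictions=preds, references=labels, average="micro")["f1"],
        "precision": precision.compute(predictions=preds, references=labels, average="micro")["precision"],
    }
```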


### Framework versions

- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3