Commit 95cba1e (verified) by dacunaq · Parent: ffc80a4

Model save

README.md ADDED
---
library_name: transformers
license: apache-2.0
base_model: google/vit-base-patch16-384
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: vit-base-patch16-384-finetuned-humid-classes-13
  results:
  - task:
      name: Image Classification
      type: image-classification
    dataset:
      name: imagefolder
      type: imagefolder
      config: default
      split: validation
      args: default
    metrics:
    - name: Accuracy
      type: accuracy
      value: 1.0
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# vit-base-patch16-384-finetuned-humid-classes-13

This model is a fine-tuned version of [google/vit-base-patch16-384](https://huggingface.co/google/vit-base-patch16-384) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0117
- Accuracy: 1.0
- F1 Macro: 1.0
- Precision Macro: 1.0
- Recall Macro: 1.0
- Precision Dry: 1.0
- Recall Dry: 1.0
- F1 Dry: 1.0
- Precision Firm: 1.0
- Recall Firm: 1.0
- F1 Firm: 1.0
- Precision Humid: 1.0
- Recall Humid: 1.0
- F1 Humid: 1.0
- Precision Lump: 1.0
- Recall Lump: 1.0
- F1 Lump: 1.0
- Precision Rockies: 1.0
- Recall Rockies: 1.0
- F1 Rockies: 1.0

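The snippet below is a minimal, hedged sketch of how this checkpoint could be loaded for inference with the `transformers` image-classification pipeline. The repository id `dacunaq/vit-base-patch16-384-finetuned-humid-classes-13` and the example image path are assumptions, not something stated in this card.

```python
from transformers import pipeline

# Assumed repository id; adjust to wherever this checkpoint is actually hosted.
model_id = "dacunaq/vit-base-patch16-384-finetuned-humid-classes-13"

# The pipeline resizes to 384x384 and normalizes using the image processor
# saved alongside the model weights.
classifier = pipeline("image-classification", model=model_id)

# "sample.jpg" is a placeholder path to a local image of the material to classify.
predictions = classifier("sample.jpg")
for p in predictions:
    print(f"{p['label']}: {p['score']:.3f}")
```
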
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

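The card only states that an `imagefolder` dataset was used. The sketch below shows how such a dataset is typically loaded with the `datasets` library; the data directory, the class sub-folder names, and the 80/20 split are placeholders and assumptions, not details documented in this card.

```python
from datasets import load_dataset

# "path/to/images" is a placeholder: an imagefolder layout expects one
# sub-directory per class (e.g. dry/, firm/, humid/, lump/, rockies/).
dataset = load_dataset("imagefolder", data_dir="path/to/images")

# The card reports metrics on a "validation" split; how that split was
# produced is not documented, so this 80/20 split is only illustrative.
dataset = dataset["train"].train_test_split(test_size=0.2, seed=42)
print(dataset)
```
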
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows this list):
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64 (16 per device × 4 accumulation steps)
- optimizer: adamw_torch_fused with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50

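The configuration below is a minimal sketch of how the listed hyperparameters could be expressed with `transformers.TrainingArguments`; it assumes a standard `Trainer` setup and is not the exact training script used for this run. The output directory and the per-epoch evaluation/logging strategies are assumptions.

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters listed above, not the original script.
training_args = TrainingArguments(
    output_dir="vit-base-patch16-384-finetuned-humid-classes-13",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=4,   # effective train batch size 16 * 4 = 64
    num_train_epochs=50,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    optim="adamw_torch_fused",       # AdamW with betas=(0.9, 0.999), eps=1e-8
    seed=42,
    eval_strategy="epoch",           # assumption: evaluate once per epoch, as the table below suggests
    logging_strategy="epoch",
)
```

These arguments would then be passed to a `Trainer` together with the model, the datasets, and a `compute_metrics` callback such as the one sketched after the results table.
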
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 Macro | Precision Macro | Recall Macro | Precision Dry | Recall Dry | F1 Dry | Precision Firm | Recall Firm | F1 Firm | Precision Humid | Recall Humid | F1 Humid | Precision Lump | Recall Lump | F1 Lump | Precision Rockies | Recall Rockies | F1 Rockies |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--------:|:---------------:|:------------:|:-------------:|:----------:|:------:|:--------------:|:-----------:|:-------:|:---------------:|:------------:|:--------:|:--------------:|:-----------:|:-------:|:-----------------:|:--------------:|:----------:|
| No log | 1.0 | 5 | 1.5040 | 0.3387 | 0.2863 | 0.5885 | 0.3848 | 1.0 | 0.2 | 0.3333 | 0.4062 | 0.9286 | 0.5652 | 0.1364 | 0.6 | 0.2222 | 1.0 | 0.0526 | 0.1 | 0.4 | 0.1429 | 0.2105 |
| 1.4809 | 2.0 | 10 | 1.1305 | 0.6613 | 0.4908 | 0.4555 | 0.5600 | 0.8889 | 0.8 | 0.8421 | 0.875 | 1.0 | 0.9333 | 0.0 | 0.0 | 0.0 | 0.5135 | 1.0 | 0.6786 | 0.0 | 0.0 | 0.0 |
| 1.4809 | 3.0 | 15 | 0.7686 | 0.7903 | 0.6533 | 0.6844 | 0.6857 | 0.8333 | 1.0 | 0.9091 | 0.9333 | 1.0 | 0.9655 | 0.0 | 0.0 | 0.0 | 0.6552 | 1.0 | 0.7917 | 1.0 | 0.4286 | 0.6 |
| 0.6702 | 4.0 | 20 | 0.3308 | 0.9032 | 0.7653 | 0.752 | 0.7857 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.76 | 1.0 | 0.8636 | 1.0 | 0.9286 | 0.9630 |
| 0.6702 | 5.0 | 25 | 0.1514 | 0.9677 | 0.9400 | 0.9810 | 0.9200 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.6 | 0.75 | 0.9048 | 1.0 | 0.95 | 1.0 | 1.0 | 1.0 |
| 0.204 | 6.0 | 30 | 0.1756 | 0.9516 | 0.9280 | 0.9727 | 0.9057 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.6 | 0.75 | 0.8636 | 1.0 | 0.9268 | 1.0 | 0.9286 | 0.9630 |
| 0.204 | 7.0 | 35 | 0.1464 | 0.9355 | 0.9235 | 0.9247 | 0.9284 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8 | 0.8 | 0.8 | 1.0 | 0.8421 | 0.9143 | 0.8235 | 1.0 | 0.9032 |
| 0.0809 | 8.0 | 40 | 0.0892 | 0.9839 | 0.9744 | 0.9667 | 0.9857 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8333 | 1.0 | 0.9091 | 1.0 | 1.0 | 1.0 | 1.0 | 0.9286 | 0.9630 |
| 0.0809 | 9.0 | 45 | 0.1692 | 0.9355 | 0.9156 | 0.9652 | 0.8914 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.6 | 0.75 | 0.8261 | 1.0 | 0.9048 | 1.0 | 0.8571 | 0.9231 |
| 0.068 | 10.0 | 50 | 0.1271 | 0.9839 | 0.9877 | 0.9867 | 0.9895 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.9474 | 0.9730 | 0.9333 | 1.0 | 0.9655 |
| 0.068 | 11.0 | 55 | 0.2046 | 0.9194 | 0.8756 | 0.9583 | 0.8514 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.4 | 0.5714 | 0.7917 | 1.0 | 0.8837 | 1.0 | 0.8571 | 0.9231 |
| 0.0383 | 12.0 | 60 | 0.0567 | 0.9677 | 0.9621 | 0.9524 | 0.9752 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8333 | 1.0 | 0.9091 | 1.0 | 0.9474 | 0.9730 | 0.9286 | 0.9286 | 0.9286 |
| 0.0383 | 13.0 | 65 | 0.2568 | 0.9032 | 0.8634 | 0.9432 | 0.8409 | 1.0 | 1.0 | 1.0 | 0.9333 | 1.0 | 0.9655 | 1.0 | 0.4 | 0.5714 | 0.7826 | 0.9474 | 0.8571 | 1.0 | 0.8571 | 0.9231 |
| 0.0365 | 14.0 | 70 | 0.1315 | 0.9516 | 0.9353 | 0.925 | 0.9647 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.625 | 1.0 | 0.7692 | 1.0 | 0.8947 | 0.9444 | 1.0 | 0.9286 | 0.9630 |
| 0.0365 | 15.0 | 75 | 0.1447 | 0.9677 | 0.9400 | 0.9810 | 0.9200 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.6 | 0.75 | 0.9048 | 1.0 | 0.95 | 1.0 | 1.0 | 1.0 |
| 0.027 | 16.0 | 80 | 0.1043 | 0.9839 | 0.9877 | 0.9867 | 0.9895 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.9474 | 0.9730 | 0.9333 | 1.0 | 0.9655 |
| 0.027 | 17.0 | 85 | 0.0157 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.026 | 18.0 | 90 | 0.0247 | 0.9839 | 0.9726 | 0.99 | 0.96 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8 | 0.8889 | 0.95 | 1.0 | 0.9744 | 1.0 | 1.0 | 1.0 |
| 0.026 | 19.0 | 95 | 0.1236 | 0.9839 | 0.9877 | 0.9867 | 0.9895 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.9474 | 0.9730 | 0.9333 | 1.0 | 0.9655 |
| 0.008 | 20.0 | 100 | 0.0419 | 0.9839 | 0.9726 | 0.99 | 0.96 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8 | 0.8889 | 0.95 | 1.0 | 0.9744 | 1.0 | 1.0 | 1.0 |
| 0.008 | 21.0 | 105 | 0.0141 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0069 | 22.0 | 110 | 0.0880 | 0.9839 | 0.9877 | 0.9867 | 0.9895 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.9474 | 0.9730 | 0.9333 | 1.0 | 0.9655 |
| 0.0069 | 23.0 | 115 | 0.0351 | 0.9839 | 0.9877 | 0.9867 | 0.9895 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.9474 | 0.9730 | 0.9333 | 1.0 | 0.9655 |
| 0.002 | 24.0 | 120 | 0.0368 | 0.9839 | 0.9726 | 0.99 | 0.96 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8 | 0.8889 | 0.95 | 1.0 | 0.9744 | 1.0 | 1.0 | 1.0 |
| 0.002 | 25.0 | 125 | 0.0596 | 0.9839 | 0.9726 | 0.99 | 0.96 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8 | 0.8889 | 0.95 | 1.0 | 0.9744 | 1.0 | 1.0 | 1.0 |
| 0.0034 | 26.0 | 130 | 0.0777 | 0.9839 | 0.9877 | 0.9867 | 0.9895 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.9474 | 0.9730 | 0.9333 | 1.0 | 0.9655 |
| 0.0034 | 27.0 | 135 | 0.0405 | 0.9839 | 0.9877 | 0.9867 | 0.9895 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.9474 | 0.9730 | 0.9333 | 1.0 | 0.9655 |
| 0.0017 | 28.0 | 140 | 0.0516 | 0.9839 | 0.9726 | 0.99 | 0.96 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8 | 0.8889 | 0.95 | 1.0 | 0.9744 | 1.0 | 1.0 | 1.0 |
| 0.0017 | 29.0 | 145 | 0.0420 | 0.9839 | 0.9726 | 0.99 | 0.96 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8 | 0.8889 | 0.95 | 1.0 | 0.9744 | 1.0 | 1.0 | 1.0 |
| 0.0017 | 30.0 | 150 | 0.0147 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0017 | 31.0 | 155 | 0.0250 | 0.9839 | 0.9877 | 0.9867 | 0.9895 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.9474 | 0.9730 | 0.9333 | 1.0 | 0.9655 |
| 0.0007 | 32.0 | 160 | 0.0346 | 0.9839 | 0.9877 | 0.9867 | 0.9895 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.9474 | 0.9730 | 0.9333 | 1.0 | 0.9655 |
| 0.0007 | 33.0 | 165 | 0.0379 | 0.9839 | 0.9877 | 0.9867 | 0.9895 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.9474 | 0.9730 | 0.9333 | 1.0 | 0.9655 |
| 0.0007 | 34.0 | 170 | 0.0330 | 0.9839 | 0.9877 | 0.9867 | 0.9895 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.9474 | 0.9730 | 0.9333 | 1.0 | 0.9655 |
| 0.0007 | 35.0 | 175 | 0.0284 | 0.9839 | 0.9877 | 0.9867 | 0.9895 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.9474 | 0.9730 | 0.9333 | 1.0 | 0.9655 |
| 0.0006 | 36.0 | 180 | 0.0249 | 0.9839 | 0.9877 | 0.9867 | 0.9895 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.9474 | 0.9730 | 0.9333 | 1.0 | 0.9655 |
| 0.0006 | 37.0 | 185 | 0.0221 | 0.9839 | 0.9877 | 0.9867 | 0.9895 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.9474 | 0.9730 | 0.9333 | 1.0 | 0.9655 |
| 0.0005 | 38.0 | 190 | 0.0198 | 0.9839 | 0.9877 | 0.9867 | 0.9895 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.9474 | 0.9730 | 0.9333 | 1.0 | 0.9655 |
| 0.0005 | 39.0 | 195 | 0.0178 | 0.9839 | 0.9877 | 0.9867 | 0.9895 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.9474 | 0.9730 | 0.9333 | 1.0 | 0.9655 |
| 0.0005 | 40.0 | 200 | 0.0160 | 0.9839 | 0.9877 | 0.9867 | 0.9895 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.9474 | 0.9730 | 0.9333 | 1.0 | 0.9655 |
| 0.0005 | 41.0 | 205 | 0.0147 | 0.9839 | 0.9877 | 0.9867 | 0.9895 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.9474 | 0.9730 | 0.9333 | 1.0 | 0.9655 |
| 0.0005 | 42.0 | 210 | 0.0139 | 0.9839 | 0.9877 | 0.9867 | 0.9895 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.9474 | 0.9730 | 0.9333 | 1.0 | 0.9655 |
| 0.0005 | 43.0 | 215 | 0.0134 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0005 | 44.0 | 220 | 0.0129 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0005 | 45.0 | 225 | 0.0125 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0005 | 46.0 | 230 | 0.0122 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0005 | 47.0 | 235 | 0.0120 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0005 | 48.0 | 240 | 0.0118 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0005 | 49.0 | 245 | 0.0118 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0005 | 50.0 | 250 | 0.0117 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |

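The per-class precision, recall, and F1 columns above suggest a `compute_metrics` callback along the following lines. This is a hedged sketch, not the actual evaluation code; in particular, the class order `["dry", "firm", "humid", "lump", "rockies"]` is an assumption inferred from the metric names.

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

# Assumed label order, inferred from the per-class metric names in this card.
LABELS = ["dry", "firm", "humid", "lump", "rockies"]

def compute_metrics(eval_pred):
    """Return accuracy, macro averages, and per-class precision/recall/F1."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)

    metrics = {"accuracy": accuracy_score(labels, preds)}

    # Macro-averaged scores over all five classes.
    p_macro, r_macro, f1_macro, _ = precision_recall_fscore_support(
        labels, preds, average="macro", zero_division=0
    )
    metrics.update({"f1_macro": f1_macro, "precision_macro": p_macro, "recall_macro": r_macro})

    # Per-class scores, one entry per label in LABELS.
    p, r, f1, _ = precision_recall_fscore_support(
        labels, preds, labels=list(range(len(LABELS))), zero_division=0
    )
    for i, name in enumerate(LABELS):
        metrics[f"precision_{name}"] = p[i]
        metrics[f"recall_{name}"] = r[i]
        metrics[f"f1_{name}"] = f1[i]
    return metrics
```
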
### Framework versions

- Transformers 4.56.1
- Pytorch 2.9.0+cu126
- Datasets 4.0.0
- Tokenizers 0.22.0
model.safetensors CHANGED (Git LFS pointer)
- old oid sha256:a6321209c206f2df1107c679db7ed1be1c70e92f212306b26425718a70b729f7
+ new oid sha256:552f1658ea4c9ccaa347adf3caeed69fd751c24aafb74422bc19ad7c88d784c3
  size 344400564 (unchanged)
runs/Oct28_11-13-42_tech/events.out.tfevents.1761668025.tech.78812.0 CHANGED (Git LFS pointer)
- old oid sha256:4ff24509971bcb2dee7e12d64f673c256e48b6b7f78115cbee7b54fb9d591cb6, size 74229
+ new oid sha256:bdb57fa8ee380c2c55b961ea8c9d266548a1a918726cb200e629aee41b0c43a7, size 76113