---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: vit-base_tobacco
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# vit-base_tobacco

This model is a fine-tuned version of [jordyvl/vit-base_tobacco](https://huggingface.co/jordyvl/vit-base_tobacco) on an unspecified dataset.
It achieves the following results on the evaluation set (a usage sketch follows the list):
- Loss: 0.7324
- Accuracy: 0.8
- Brier Loss: 0.3049
- NLL: 1.3070
- F1 Micro: 0.8000
- F1 Macro: 0.7733
- ECE: 0.2124
- AURC: 0.0840
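
For reference, the sketch below shows one way to run inference with the 🤗 Transformers Auto classes. The repo id `ayanban011/vit-base_tobacco` (uploader namespace plus the card name) and the input path are assumptions, not confirmed by this card.

```python
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Assumed repo id; replace with the actual model location if it differs.
model_id = "ayanban011/vit-base_tobacco"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

# Classify a single document image (the file name is a placeholder).
image = Image.open("document.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")
logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```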

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 40
- eval_batch_size: 40
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 640
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
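
As a reproducibility aid, the list above maps onto `TrainingArguments` roughly as sketched below, assuming a single device (so 40 × 16 = 640 total train batch size); `output_dir` is a placeholder.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="vit-base_tobacco",   # placeholder output directory
    learning_rate=2e-5,
    per_device_train_batch_size=40,
    per_device_eval_batch_size=40,
    seed=42,
    gradient_accumulation_steps=16,  # 40 * 16 = 640 total train batch size
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
    adam_beta1=0.9,                  # Adam settings as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```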

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 0.8 | 1 | 0.7434 | 0.815 | 0.3073 | 1.1863 | 0.815 | 0.7942 | 0.2217 | 0.0720 |
| No log | 1.6 | 2 | 0.7569 | 0.81 | 0.3117 | 1.2131 | 0.81 | 0.7893 | 0.2153 | 0.0800 |
| No log | 2.4 | 3 | 0.7491 | 0.82 | 0.3107 | 1.2631 | 0.82 | 0.8063 | 0.2311 | 0.0777 |
| No log | 4.0 | 5 | 0.7489 | 0.795 | 0.3088 | 1.1544 | 0.795 | 0.7766 | 0.2427 | 0.0730 |
| No log | 4.8 | 6 | 0.7658 | 0.81 | 0.3171 | 1.3766 | 0.81 | 0.7983 | 0.2434 | 0.0822 |
| No log | 5.6 | 7 | 0.7496 | 0.815 | 0.3097 | 1.1920 | 0.815 | 0.8014 | 0.2434 | 0.0848 |
| No log | 6.4 | 8 | 0.7468 | 0.8 | 0.3090 | 1.0732 | 0.8000 | 0.7750 | 0.2195 | 0.0774 |
| No log | 8.0 | 10 | 0.7563 | 0.815 | 0.3131 | 1.3472 | 0.815 | 0.8082 | 0.2255 | 0.0741 |
| No log | 8.8 | 11 | 0.7548 | 0.81 | 0.3116 | 1.2016 | 0.81 | 0.7930 | 0.2496 | 0.0868 |
| No log | 9.6 | 12 | 0.7395 | 0.805 | 0.3071 | 1.1664 | 0.805 | 0.7841 | 0.2432 | 0.0772 |
| No log | 10.4 | 13 | 0.7296 | 0.82 | 0.3018 | 1.1776 | 0.82 | 0.8078 | 0.2214 | 0.0676 |
| No log | 12.0 | 15 | 0.7515 | 0.815 | 0.3104 | 1.2034 | 0.815 | 0.7987 | 0.2307 | 0.0835 |
| No log | 12.8 | 16 | 0.7350 | 0.81 | 0.3053 | 1.1762 | 0.81 | 0.7978 | 0.2196 | 0.0747 |
| No log | 13.6 | 17 | 0.7281 | 0.805 | 0.3023 | 1.1664 | 0.805 | 0.7841 | 0.2144 | 0.0718 |
| No log | 14.4 | 18 | 0.7395 | 0.81 | 0.3064 | 1.1750 | 0.81 | 0.7871 | 0.2306 | 0.0778 |
| No log | 16.0 | 20 | 0.7427 | 0.81 | 0.3076 | 1.2637 | 0.81 | 0.7986 | 0.2194 | 0.0808 |
| No log | 16.8 | 21 | 0.7337 | 0.81 | 0.3044 | 1.2447 | 0.81 | 0.7948 | 0.2321 | 0.0743 |
| No log | 17.6 | 22 | 0.7340 | 0.805 | 0.3050 | 1.1681 | 0.805 | 0.7841 | 0.2307 | 0.0743 |
| No log | 18.4 | 23 | 0.7338 | 0.805 | 0.3047 | 1.1708 | 0.805 | 0.7841 | 0.2290 | 0.0759 |
| No log | 20.0 | 25 | 0.7390 | 0.815 | 0.3058 | 1.2551 | 0.815 | 0.7984 | 0.2489 | 0.0818 |
| No log | 20.8 | 26 | 0.7390 | 0.815 | 0.3063 | 1.1894 | 0.815 | 0.7984 | 0.2294 | 0.0818 |
| No log | 21.6 | 27 | 0.7349 | 0.805 | 0.3054 | 1.1714 | 0.805 | 0.7847 | 0.2011 | 0.0791 |
| No log | 22.4 | 28 | 0.7308 | 0.81 | 0.3037 | 1.1694 | 0.81 | 0.7948 | 0.2128 | 0.0766 |
| No log | 24.0 | 30 | 0.7353 | 0.81 | 0.3051 | 1.1852 | 0.81 | 0.7956 | 0.2282 | 0.0794 |
| No log | 24.8 | 31 | 0.7378 | 0.81 | 0.3062 | 1.1870 | 0.81 | 0.7956 | 0.2293 | 0.0819 |
| No log | 25.6 | 32 | 0.7356 | 0.81 | 0.3054 | 1.1863 | 0.81 | 0.7956 | 0.2287 | 0.0817 |
| No log | 26.4 | 33 | 0.7309 | 0.81 | 0.3037 | 1.1801 | 0.81 | 0.7954 | 0.2209 | 0.0795 |
| No log | 28.0 | 35 | 0.7336 | 0.805 | 0.3050 | 1.1733 | 0.805 | 0.7850 | 0.2082 | 0.0789 |
| No log | 28.8 | 36 | 0.7334 | 0.81 | 0.3045 | 1.1799 | 0.81 | 0.7956 | 0.2207 | 0.0797 |
| No log | 29.6 | 37 | 0.7320 | 0.81 | 0.3040 | 1.2447 | 0.81 | 0.7956 | 0.2279 | 0.0804 |
| No log | 30.4 | 38 | 0.7328 | 0.81 | 0.3045 | 1.2473 | 0.81 | 0.7956 | 0.2154 | 0.0812 |
| No log | 32.0 | 40 | 0.7322 | 0.805 | 0.3044 | 1.1796 | 0.805 | 0.7850 | 0.2384 | 0.0804 |
| No log | 32.8 | 41 | 0.7318 | 0.81 | 0.3045 | 1.1792 | 0.81 | 0.7954 | 0.2291 | 0.0794 |
| No log | 33.6 | 42 | 0.7302 | 0.81 | 0.3034 | 1.2401 | 0.81 | 0.7954 | 0.2086 | 0.0794 |
| No log | 34.4 | 43 | 0.7311 | 0.805 | 0.3036 | 1.2424 | 0.805 | 0.7850 | 0.2278 | 0.0804 |
| No log | 36.0 | 45 | 0.7323 | 0.805 | 0.3043 | 1.1902 | 0.805 | 0.7850 | 0.2119 | 0.0816 |
| No log | 36.8 | 46 | 0.7304 | 0.805 | 0.3034 | 1.2428 | 0.805 | 0.7850 | 0.2330 | 0.0807 |
| No log | 37.6 | 47 | 0.7297 | 0.805 | 0.3032 | 1.2413 | 0.805 | 0.7850 | 0.2447 | 0.0801 |
| No log | 38.4 | 48 | 0.7310 | 0.805 | 0.3039 | 1.2424 | 0.805 | 0.7850 | 0.2233 | 0.0802 |
| No log | 40.0 | 50 | 0.7316 | 0.805 | 0.3040 | 1.2451 | 0.805 | 0.7850 | 0.2094 | 0.0809 |
| No log | 40.8 | 51 | 0.7313 | 0.805 | 0.3041 | 1.2450 | 0.805 | 0.7850 | 0.2093 | 0.0810 |
| No log | 41.6 | 52 | 0.7313 | 0.805 | 0.3041 | 1.2445 | 0.805 | 0.7850 | 0.2073 | 0.0814 |
| No log | 42.4 | 53 | 0.7315 | 0.805 | 0.3040 | 1.2447 | 0.805 | 0.7850 | 0.2198 | 0.0821 |
| No log | 44.0 | 55 | 0.7303 | 0.805 | 0.3034 | 1.2441 | 0.805 | 0.7850 | 0.2048 | 0.0813 |
| No log | 44.8 | 56 | 0.7306 | 0.805 | 0.3038 | 1.2444 | 0.805 | 0.7850 | 0.1966 | 0.0809 |
| No log | 45.6 | 57 | 0.7317 | 0.805 | 0.3043 | 1.2449 | 0.805 | 0.7850 | 0.1976 | 0.0821 |
| No log | 46.4 | 58 | 0.7317 | 0.805 | 0.3041 | 1.2466 | 0.805 | 0.7850 | 0.2007 | 0.0822 |
| No log | 48.0 | 60 | 0.7316 | 0.805 | 0.3041 | 1.2499 | 0.805 | 0.7850 | 0.2137 | 0.0820 |
| No log | 48.8 | 61 | 0.7320 | 0.8 | 0.3043 | 1.2536 | 0.8000 | 0.7733 | 0.2081 | 0.0822 |
| No log | 49.6 | 62 | 0.7319 | 0.805 | 0.3044 | 1.2494 | 0.805 | 0.7850 | 0.1998 | 0.0825 |
| No log | 50.4 | 63 | 0.7326 | 0.805 | 0.3048 | 1.2476 | 0.805 | 0.7850 | 0.1936 | 0.0828 |
| No log | 52.0 | 65 | 0.7313 | 0.8 | 0.3044 | 1.2495 | 0.8000 | 0.7733 | 0.2117 | 0.0822 |
| No log | 52.8 | 66 | 0.7304 | 0.8 | 0.3039 | 1.2524 | 0.8000 | 0.7733 | 0.2009 | 0.0818 |
| No log | 53.6 | 67 | 0.7306 | 0.8 | 0.3038 | 1.2505 | 0.8000 | 0.7733 | 0.2182 | 0.0818 |
| No log | 54.4 | 68 | 0.7321 | 0.8 | 0.3044 | 1.2513 | 0.8000 | 0.7733 | 0.2185 | 0.0833 |
| No log | 56.0 | 70 | 0.7326 | 0.8 | 0.3049 | 1.2519 | 0.8000 | 0.7733 | 0.2014 | 0.0833 |
| No log | 56.8 | 71 | 0.7320 | 0.8 | 0.3047 | 1.2580 | 0.8000 | 0.7733 | 0.2175 | 0.0829 |
| No log | 57.6 | 72 | 0.7313 | 0.8 | 0.3043 | 1.2571 | 0.8000 | 0.7733 | 0.2045 | 0.0828 |
| No log | 58.4 | 73 | 0.7314 | 0.8 | 0.3043 | 1.3065 | 0.8000 | 0.7733 | 0.2038 | 0.0827 |
| No log | 60.0 | 75 | 0.7322 | 0.8 | 0.3046 | 1.3081 | 0.8000 | 0.7733 | 0.2047 | 0.0840 |
| No log | 60.8 | 76 | 0.7323 | 0.8 | 0.3047 | 1.3078 | 0.8000 | 0.7733 | 0.2053 | 0.0839 |
| No log | 61.6 | 77 | 0.7322 | 0.8 | 0.3047 | 1.3070 | 0.8000 | 0.7733 | 0.2051 | 0.0837 |
| No log | 62.4 | 78 | 0.7316 | 0.8 | 0.3045 | 1.3062 | 0.8000 | 0.7733 | 0.2145 | 0.0835 |
| No log | 64.0 | 80 | 0.7315 | 0.8 | 0.3044 | 1.3063 | 0.8000 | 0.7733 | 0.2067 | 0.0836 |
| No log | 64.8 | 81 | 0.7320 | 0.8 | 0.3047 | 1.3064 | 0.8000 | 0.7733 | 0.2041 | 0.0839 |
| No log | 65.6 | 82 | 0.7323 | 0.8 | 0.3048 | 1.3070 | 0.8000 | 0.7733 | 0.2046 | 0.0839 |
| No log | 66.4 | 83 | 0.7323 | 0.8 | 0.3048 | 1.3068 | 0.8000 | 0.7733 | 0.2045 | 0.0838 |
| No log | 68.0 | 85 | 0.7320 | 0.8 | 0.3046 | 1.3068 | 0.8000 | 0.7733 | 0.2046 | 0.0840 |
| No log | 68.8 | 86 | 0.7318 | 0.8 | 0.3045 | 1.3069 | 0.8000 | 0.7733 | 0.2114 | 0.0838 |
| No log | 69.6 | 87 | 0.7316 | 0.8 | 0.3045 | 1.3066 | 0.8000 | 0.7733 | 0.2149 | 0.0836 |
| No log | 70.4 | 88 | 0.7316 | 0.8 | 0.3045 | 1.3066 | 0.8000 | 0.7733 | 0.2244 | 0.0834 |
| No log | 72.0 | 90 | 0.7321 | 0.8 | 0.3047 | 1.3069 | 0.8000 | 0.7733 | 0.2151 | 0.0837 |
| No log | 72.8 | 91 | 0.7322 | 0.8 | 0.3048 | 1.3070 | 0.8000 | 0.7733 | 0.2151 | 0.0839 |
| No log | 73.6 | 92 | 0.7322 | 0.8 | 0.3048 | 1.3070 | 0.8000 | 0.7733 | 0.2155 | 0.0840 |
| No log | 74.4 | 93 | 0.7323 | 0.8 | 0.3048 | 1.3071 | 0.8000 | 0.7733 | 0.2129 | 0.0842 |
| No log | 76.0 | 95 | 0.7324 | 0.8 | 0.3049 | 1.3071 | 0.8000 | 0.7733 | 0.2084 | 0.0841 |
| No log | 76.8 | 96 | 0.7324 | 0.8 | 0.3049 | 1.3071 | 0.8000 | 0.7733 | 0.2141 | 0.0842 |
| No log | 77.6 | 97 | 0.7324 | 0.8 | 0.3049 | 1.3070 | 0.8000 | 0.7733 | 0.2136 | 0.0841 |
| No log | 78.4 | 98 | 0.7324 | 0.8 | 0.3049 | 1.3070 | 0.8000 | 0.7733 | 0.2136 | 0.0841 |
| No log | 80.0 | 100 | 0.7324 | 0.8 | 0.3049 | 1.3070 | 0.8000 | 0.7733 | 0.2124 | 0.0840 |
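
The Brier Loss, NLL, ECE, and AURC columns track calibration and selective-prediction quality rather than plain accuracy. The card does not state the exact definitions its evaluation script used; a common formulation of the Brier score and a 10-bin top-1 expected calibration error is sketched below with NumPy.

```python
import numpy as np

def brier_and_ece(probs, labels, n_bins=10):
    """Multi-class Brier score and top-1 ECE (common definitions; the
    card's evaluation script may differ in binning or normalization).

    probs:  (N, C) softmax probabilities; labels: (N,) integer class ids.
    """
    c = probs.shape[1]
    onehot = np.eye(c)[labels]
    brier = np.mean(np.sum((probs - onehot) ** 2, axis=1))

    conf = probs.max(axis=1)                  # top-1 confidence
    correct = probs.argmax(axis=1) == labels  # top-1 correctness
    ece = 0.0
    bin_edges = np.linspace(0.0, 1.0, n_bins + 1)
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        in_bin = (conf > lo) & (conf <= hi)
        if in_bin.any():                      # weight each bin by its share
            ece += in_bin.mean() * abs(correct[in_bin].mean() - conf[in_bin].mean())
    return brier, ece
```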

### Framework versions

- Transformers 4.30.2
- PyTorch 1.13.1
- Datasets 2.13.1
- Tokenizers 0.13.3