thethinkmachine committed
Commit bc6d38a · verified · 1 Parent(s): 98bd0e2

Model save
Files changed (3)
  1. README.md +42 -97
  2. model.safetensors +1 -1
  3. training_args.bin +1 -1
README.md CHANGED
@@ -21,11 +21,11 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [microsoft/resnet-18](https://huggingface.co/microsoft/resnet-18) on an unknown dataset.
 It achieves the following results on the evaluation set:
- - Loss: 0.9548
- - Accuracy: 0.5947
- - Precision: 0.6223
- - Recall: 0.5947
- - F1: 0.5855
+ - Loss: 2.0839
+ - Accuracy: 0.4336
+ - Precision: 0.5017
+ - Recall: 0.4336
+ - F1: 0.4223
 
 ## Model description
 
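Note that in both the old and new metric blocks, Recall is identical to Accuracy (0.5947 before this commit, 0.4336 after). That is expected if recall is support-weighted: averaging per-class recall weighted by class frequency reduces algebraically to plain accuracy. A minimal sketch of that identity, assuming the card's metrics were computed with scikit-learn's `average="weighted"` (the card does not say):

```python
# Sketch: support-weighted recall equals accuracy (assumed metric setup,
# not confirmed by the model card).
import math
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

y_true = [0, 0, 1, 2, 2, 2]  # toy labels, not the card's evaluation data
y_pred = [0, 1, 1, 2, 2, 0]

precision, recall, f1, _ = precision_recall_fscore_support(
    y_true, y_pred, average="weighted", zero_division=0
)
# Weighted recall = sum_c (support_c / N) * (TP_c / support_c) = TP / N = accuracy.
assert math.isclose(recall, accuracy_score(y_true, y_pred))
```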
@@ -51,102 +51,47 @@ The following hyperparameters were used during training:
 - optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
 - lr_scheduler_type: cosine
 - lr_scheduler_warmup_ratio: 0.1
- - num_epochs: 50
+ - num_epochs: 1
 - mixed_precision_training: Native AMP
 
 ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 |
- |:-------------:|:-----:|:-----:|:---------------:|:--------:|:---------:|:------:|:------:|
- | 4.5541 | 0.012 | 150 | 4.5227 | 0.0051 | 0.0020 | 0.0051 | 0.0017 |
- | 4.5317 | 0.024 | 300 | 4.5042 | 0.0054 | 0.0016 | 0.0054 | 0.0017 |
- | 4.4762 | 0.036 | 450 | 4.4747 | 0.0072 | 0.0025 | 0.0072 | 0.0025 |
- | 4.3982 | 0.048 | 600 | 4.4355 | 0.0075 | 0.0022 | 0.0075 | 0.0026 |
- | 4.326 | 0.06 | 750 | 4.3926 | 0.0096 | 0.0035 | 0.0096 | 0.0040 |
- | 4.3798 | 0.072 | 900 | 4.3494 | 0.012 | 0.0056 | 0.012 | 0.0053 |
- | 4.3094 | 0.084 | 1050 | 4.2877 | 0.0142 | 0.0062 | 0.0142 | 0.0063 |
- | 4.2576 | 0.096 | 1200 | 4.2247 | 0.019 | 0.0117 | 0.019 | 0.0104 |
- | 4.2497 | 0.108 | 1350 | 4.1559 | 0.0251 | 0.0232 | 0.0251 | 0.0153 |
- | 4.1446 | 0.12 | 1500 | 4.0773 | 0.0364 | 0.0305 | 0.0364 | 0.0230 |
- | 4.0553 | 0.132 | 1650 | 3.9926 | 0.0455 | 0.0388 | 0.0455 | 0.0302 |
- | 4.0142 | 0.144 | 1800 | 3.9018 | 0.0586 | 0.0483 | 0.0586 | 0.0402 |
- | 3.872 | 0.156 | 1950 | 3.8018 | 0.0773 | 0.0658 | 0.0773 | 0.0521 |
- | 3.8058 | 0.168 | 2100 | 3.6973 | 0.0989 | 0.0990 | 0.0989 | 0.0712 |
- | 3.7513 | 0.18 | 2250 | 3.5953 | 0.1253 | 0.1247 | 0.1253 | 0.0940 |
- | 3.648 | 0.192 | 2400 | 3.4597 | 0.1449 | 0.1545 | 0.1449 | 0.1105 |
- | 3.6208 | 0.204 | 2550 | 3.3697 | 0.157 | 0.1812 | 0.157 | 0.1241 |
- | 3.5246 | 0.216 | 2700 | 3.2463 | 0.1778 | 0.2016 | 0.1778 | 0.1403 |
- | 3.4365 | 0.228 | 2850 | 3.1238 | 0.1939 | 0.2451 | 0.1939 | 0.1554 |
- | 3.3472 | 0.24 | 3000 | 3.0007 | 0.2131 | 0.2737 | 0.2131 | 0.1739 |
- | 3.2302 | 0.252 | 3150 | 2.9187 | 0.2359 | 0.2880 | 0.2359 | 0.1974 |
- | 3.1533 | 0.264 | 3300 | 2.7929 | 0.2481 | 0.3081 | 0.2481 | 0.2060 |
- | 3.0656 | 0.276 | 3450 | 2.6988 | 0.27 | 0.3309 | 0.27 | 0.2260 |
- | 3.063 | 0.288 | 3600 | 2.5655 | 0.281 | 0.3197 | 0.281 | 0.2362 |
- | 2.9569 | 0.3 | 3750 | 2.4821 | 0.2967 | 0.3373 | 0.2967 | 0.2549 |
- | 2.8611 | 0.312 | 3900 | 2.4055 | 0.309 | 0.3637 | 0.309 | 0.2648 |
- | 2.781 | 0.324 | 4050 | 2.3112 | 0.322 | 0.3734 | 0.322 | 0.2795 |
- | 2.7003 | 0.336 | 4200 | 2.2378 | 0.3324 | 0.4156 | 0.3324 | 0.2938 |
- | 2.6487 | 0.348 | 4350 | 2.1506 | 0.3429 | 0.3989 | 0.3429 | 0.2954 |
- | 2.5996 | 0.36 | 4500 | 2.0626 | 0.3579 | 0.4166 | 0.3579 | 0.3147 |
- | 2.5326 | 0.372 | 4650 | 2.0180 | 0.3704 | 0.4375 | 0.3704 | 0.3278 |
- | 2.4018 | 0.384 | 4800 | 1.9384 | 0.377 | 0.4627 | 0.377 | 0.3364 |
- | 2.4614 | 0.396 | 4950 | 1.8644 | 0.384 | 0.4613 | 0.384 | 0.3455 |
- | 2.3182 | 0.408 | 5100 | 1.8123 | 0.4027 | 0.4720 | 0.4027 | 0.3670 |
- | 2.2881 | 0.42 | 5250 | 1.7507 | 0.4034 | 0.4682 | 0.4034 | 0.3632 |
- | 2.2158 | 0.432 | 5400 | 1.6979 | 0.4167 | 0.4887 | 0.4167 | 0.3772 |
- | 2.1465 | 0.444 | 5550 | 1.6531 | 0.4263 | 0.4948 | 0.4263 | 0.3880 |
- | 2.0682 | 0.456 | 5700 | 1.6165 | 0.4376 | 0.4780 | 0.4376 | 0.3996 |
- | 2.0762 | 0.468 | 5850 | 1.5328 | 0.4426 | 0.5060 | 0.4426 | 0.4118 |
- | 2.1077 | 0.48 | 6000 | 1.4685 | 0.4571 | 0.4994 | 0.4571 | 0.4228 |
- | 1.9077 | 0.492 | 6150 | 1.4592 | 0.4618 | 0.5109 | 0.4618 | 0.4266 |
- | 1.9145 | 0.504 | 6300 | 1.4162 | 0.4639 | 0.5076 | 0.4639 | 0.4300 |
- | 1.9404 | 0.516 | 6450 | 1.3810 | 0.4712 | 0.5104 | 0.4712 | 0.4363 |
- | 1.8828 | 0.528 | 6600 | 1.3470 | 0.4856 | 0.5284 | 0.4856 | 0.4559 |
- | 1.7646 | 0.54 | 6750 | 1.3309 | 0.4826 | 0.5324 | 0.4826 | 0.4528 |
- | 1.7944 | 0.552 | 6900 | 1.3001 | 0.4899 | 0.5276 | 0.4899 | 0.4598 |
- | 1.7257 | 0.564 | 7050 | 1.2816 | 0.4967 | 0.5387 | 0.4967 | 0.4656 |
- | 1.6668 | 0.576 | 7200 | 1.2616 | 0.4982 | 0.5521 | 0.4982 | 0.4665 |
- | 1.6566 | 0.588 | 7350 | 1.2272 | 0.5009 | 0.5527 | 0.5009 | 0.4706 |
- | 1.6628 | 0.6 | 7500 | 1.2193 | 0.5071 | 0.5603 | 0.5071 | 0.4803 |
- | 1.6117 | 0.612 | 7650 | 1.1694 | 0.5155 | 0.5609 | 0.5155 | 0.4902 |
- | 1.6219 | 0.624 | 7800 | 1.1470 | 0.5285 | 0.5607 | 0.5285 | 0.5063 |
- | 1.5716 | 0.636 | 7950 | 1.1622 | 0.5227 | 0.5544 | 0.5227 | 0.4976 |
- | 1.6103 | 0.648 | 8100 | 1.1415 | 0.5246 | 0.5598 | 0.5246 | 0.5040 |
- | 1.5735 | 0.66 | 8250 | 1.1258 | 0.5297 | 0.5787 | 0.5297 | 0.5081 |
- | 1.6088 | 0.672 | 8400 | 1.1037 | 0.5293 | 0.5643 | 0.5293 | 0.5082 |
- | 1.4725 | 0.684 | 8550 | 1.0931 | 0.538 | 0.5782 | 0.538 | 0.5173 |
- | 1.5742 | 0.696 | 8700 | 1.0870 | 0.5481 | 0.5788 | 0.5481 | 0.5290 |
- | 1.4093 | 0.708 | 8850 | 1.0697 | 0.5458 | 0.5886 | 0.5458 | 0.5261 |
- | 1.5218 | 0.72 | 9000 | 1.0427 | 0.5505 | 0.5948 | 0.5505 | 0.5347 |
- | 1.4483 | 0.732 | 9150 | 1.0418 | 0.5473 | 0.5752 | 0.5473 | 0.5265 |
- | 1.449 | 0.744 | 9300 | 1.0421 | 0.548 | 0.5785 | 0.548 | 0.5278 |
- | 1.4383 | 0.756 | 9450 | 1.0196 | 0.5548 | 0.5866 | 0.5548 | 0.5385 |
- | 1.3789 | 0.768 | 9600 | 1.0251 | 0.5577 | 0.5867 | 0.5577 | 0.5418 |
- | 1.3798 | 0.78 | 9750 | 1.0285 | 0.5605 | 0.5989 | 0.5605 | 0.5441 |
- | 1.423 | 0.792 | 9900 | 0.9976 | 0.5674 | 0.5980 | 0.5674 | 0.5504 |
- | 1.3519 | 0.804 | 10050 | 1.0080 | 0.5657 | 0.5970 | 0.5657 | 0.5509 |
- | 1.3074 | 0.816 | 10200 | 1.0114 | 0.5644 | 0.5944 | 0.5644 | 0.5482 |
- | 1.402 | 0.828 | 10350 | 0.9869 | 0.5678 | 0.5957 | 0.5678 | 0.5535 |
- | 1.3303 | 0.84 | 10500 | 0.9798 | 0.5737 | 0.6028 | 0.5737 | 0.5608 |
- | 1.3275 | 0.852 | 10650 | 0.9861 | 0.5815 | 0.6149 | 0.5815 | 0.5670 |
- | 1.2912 | 0.864 | 10800 | 0.9868 | 0.571 | 0.6022 | 0.571 | 0.5564 |
- | 1.3826 | 0.876 | 10950 | 1.0269 | 0.5629 | 0.5967 | 0.5629 | 0.5502 |
- | 1.2715 | 0.888 | 11100 | 0.9918 | 0.571 | 0.5987 | 0.571 | 0.5543 |
- | 1.253 | 0.9 | 11250 | 0.9830 | 0.5783 | 0.5958 | 0.5783 | 0.5623 |
- | 1.253 | 0.912 | 11400 | 0.9723 | 0.5774 | 0.6037 | 0.5774 | 0.5643 |
- | 1.2421 | 0.924 | 11550 | 0.9607 | 0.5748 | 0.6144 | 0.5748 | 0.5588 |
- | 1.2315 | 0.936 | 11700 | 0.9664 | 0.571 | 0.6087 | 0.571 | 0.5561 |
- | 1.2553 | 0.948 | 11850 | 0.9032 | 0.5683 | 0.6157 | 0.5683 | 0.5563 |
- | 1.2328 | 0.96 | 12000 | 0.9590 | 0.5705 | 0.6025 | 0.5705 | 0.5548 |
- | 1.2263 | 0.972 | 12150 | 0.9738 | 0.5786 | 0.6082 | 0.5786 | 0.5675 |
- | 1.167 | 0.984 | 12300 | 0.9485 | 0.5829 | 0.6118 | 0.5829 | 0.5718 |
- | 1.2948 | 0.996 | 12450 | 0.9758 | 0.586 | 0.6171 | 0.586 | 0.5732 |
- | 1.0593 | 1.008 | 12600 | 0.9690 | 0.5862 | 0.6092 | 0.5862 | 0.5756 |
- | 1.17 | 1.02 | 12750 | 0.9561 | 0.5927 | 0.6204 | 0.5927 | 0.5841 |
- | 1.1304 | 1.032 | 12900 | 0.9885 | 0.5828 | 0.6161 | 0.5828 | 0.5708 |
- | 1.2012 | 1.044 | 13050 | 0.9700 | 0.592 | 0.6144 | 0.592 | 0.5807 |
- | 1.1335 | 1.056 | 13200 | 0.9563 | 0.5949 | 0.6194 | 0.5949 | 0.5843 |
- | 1.0747 | 1.068 | 13350 | 0.9548 | 0.5947 | 0.6223 | 0.5947 | 0.5855 |
+ | Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 |
+ |:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
+ | 4.3903 | 0.012 | 150 | 4.2290 | 0.0146 | 0.0134 | 0.0146 | 0.0083 |
+ | 3.846 | 0.024 | 300 | 3.6356 | 0.0838 | 0.0946 | 0.0838 | 0.0592 |
+ | 3.3778 | 0.036 | 450 | 3.0217 | 0.1558 | 0.2091 | 0.1558 | 0.1167 |
+ | 2.9266 | 0.048 | 600 | 2.6267 | 0.1918 | 0.2768 | 0.1918 | 0.1498 |
+ | 2.7657 | 0.06 | 750 | 2.3280 | 0.2335 | 0.3684 | 0.2335 | 0.1946 |
+ | 2.6257 | 0.072 | 900 | 2.1951 | 0.2758 | 0.3689 | 0.2758 | 0.2381 |
+ | 2.4699 | 0.084 | 1050 | 2.3175 | 0.2424 | 0.3960 | 0.2424 | 0.2101 |
+ | 2.5352 | 0.096 | 1200 | 2.2917 | 0.2533 | 0.3728 | 0.2533 | 0.2284 |
+ | 2.4032 | 0.108 | 1350 | 2.4920 | 0.251 | 0.3818 | 0.251 | 0.2225 |
+ | 2.332 | 0.12 | 1500 | 2.3880 | 0.2639 | 0.3638 | 0.2639 | 0.2324 |
+ | 2.3968 | 0.132 | 1650 | 2.4804 | 0.2687 | 0.3862 | 0.2687 | 0.2531 |
+ | 2.3922 | 0.144 | 1800 | 2.3411 | 0.2886 | 0.4126 | 0.2886 | 0.2600 |
+ | 2.3328 | 0.156 | 1950 | 2.2690 | 0.3191 | 0.3973 | 0.3191 | 0.2896 |
+ | 2.3191 | 0.168 | 2100 | 2.1504 | 0.3387 | 0.4172 | 0.3387 | 0.3161 |
+ | 2.1208 | 0.18 | 2250 | 2.1226 | 0.3369 | 0.4232 | 0.3369 | 0.3154 |
+ | 2.2256 | 0.192 | 2400 | 2.0580 | 0.3629 | 0.4330 | 0.3629 | 0.3372 |
+ | 2.1618 | 0.204 | 2550 | 2.0567 | 0.3585 | 0.4509 | 0.3585 | 0.3360 |
+ | 2.2237 | 0.216 | 2700 | 2.3808 | 0.3446 | 0.4299 | 0.3446 | 0.3254 |
+ | 2.0754 | 0.228 | 2850 | 2.2442 | 0.3718 | 0.4656 | 0.3718 | 0.3529 |
+ | 1.9684 | 0.24 | 3000 | 2.1301 | 0.3848 | 0.4569 | 0.3848 | 0.3590 |
+ | 2.082 | 0.252 | 3150 | 2.0963 | 0.3734 | 0.4557 | 0.3734 | 0.3533 |
+ | 2.0737 | 0.264 | 3300 | 2.2619 | 0.3621 | 0.4506 | 0.3621 | 0.3443 |
+ | 2.0049 | 0.276 | 3450 | 2.3372 | 0.3748 | 0.4527 | 0.3748 | 0.3542 |
+ | 1.9876 | 0.288 | 3600 | 2.0522 | 0.4025 | 0.4759 | 0.4025 | 0.3818 |
+ | 1.9218 | 0.3 | 3750 | 2.1785 | 0.4002 | 0.4704 | 0.4002 | 0.3863 |
+ | 1.9899 | 0.312 | 3900 | 2.3298 | 0.4059 | 0.4758 | 0.4059 | 0.3852 |
+ | 1.9478 | 0.324 | 4050 | 2.0669 | 0.4245 | 0.4732 | 0.4245 | 0.4033 |
+ | 1.9293 | 0.336 | 4200 | 2.1866 | 0.4154 | 0.4885 | 0.4154 | 0.3986 |
+ | 1.8939 | 0.348 | 4350 | 2.1652 | 0.4159 | 0.4788 | 0.4159 | 0.3944 |
+ | 1.8356 | 0.36 | 4500 | 2.1702 | 0.4245 | 0.4706 | 0.4245 | 0.4002 |
+ | 1.8724 | 0.372 | 4650 | 2.1267 | 0.4282 | 0.4825 | 0.4282 | 0.4089 |
+ | 1.7633 | 0.384 | 4800 | 2.1603 | 0.4262 | 0.4896 | 0.4262 | 0.4065 |
+ | 1.8592 | 0.396 | 4950 | 2.0575 | 0.4393 | 0.4837 | 0.4393 | 0.4187 |
+ | 1.7407 | 0.408 | 5100 | 2.0839 | 0.4336 | 0.5017 | 0.4336 | 0.4223 |
 
 
 ### Framework versions
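The hyperparameter lines in this hunk correspond to standard `transformers.TrainingArguments` fields. A minimal sketch of how they might have been set, assuming a stock Trainer setup; the output directory is a placeholder, and learning rate and batch sizes are not visible in this diff, so they are omitted:

```python
# Sketch of TrainingArguments matching the card's listed hyperparameters.
# Values not shown in the diff (output_dir, learning rate, batch sizes)
# are placeholders or omitted, not the author's settings.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="resnet-18-finetune",  # placeholder name
    optim="adamw_torch",              # OptimizerNames.ADAMW_TORCH
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    num_train_epochs=1,               # was 50 before this commit
    fp16=True,                        # "Native AMP" mixed precision
)
```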
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:930baaee64226f00da15e96424e932b8f8c59a2751d76960f24543f731238738
+ oid sha256:84ddfcee95cdfb400602504e0a9c8c1cc58f997fbd3f8407f9a100593a9fd088
 size 45170672
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:e89761b57eac78a19f520b9779e52d15b541c49fd7b4f1fcedd2842fb3f5b6b3
+ oid sha256:68091efc800d6c0790d161cd0ea5442255e2e93b59e53f55952a7a530594448a
 size 5841
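Both binary files are tracked with Git LFS, so the diffs above show only pointer files: each object's SHA-256 `oid` changed while its byte `size` stayed the same, i.e. the weights and training args were rewritten in place. A hedged sketch of checking a downloaded object against its pointer; `verify_lfs_object` is a hypothetical helper, not part of any library:

```python
# Verify a downloaded LFS object against the "oid sha256:..." and "size"
# fields of its pointer file (hypothetical helper for illustration).
import hashlib
from pathlib import Path

def verify_lfs_object(path: str, oid_hex: str, size: int) -> bool:
    data = Path(path).read_bytes()
    return len(data) == size and hashlib.sha256(data).hexdigest() == oid_hex

# e.g. after downloading model.safetensors at this revision:
# verify_lfs_object(
#     "model.safetensors",
#     "84ddfcee95cdfb400602504e0a9c8c1cc58f997fbd3f8407f9a100593a9fd088",
#     45170672,
# )
```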