amSOwO committed on
Commit 8860342 · verified · 1 Parent(s): f39a20e

End of training

Files changed (2)
  1. README.md +44 -54
  2. model.safetensors +1 -1
README.md CHANGED
@@ -16,29 +16,29 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on an unknown dataset.
 It achieves the following results on the evaluation set:
- - Loss: 1.2104
- - Map: 0.2196
- - Map 50: 0.4356
- - Map 75: 0.188
- - Map Small: 0.0612
- - Map Medium: 0.171
- - Map Large: 0.329
- - Mar 1: 0.2502
- - Mar 10: 0.4112
- - Mar 100: 0.4367
- - Mar Small: 0.1848
- - Mar Medium: 0.3804
- - Mar Large: 0.6025
- - Map Coverall: 0.4813
- - Mar 100 Coverall: 0.6212
- - Map Face Shield: 0.1332
- - Mar 100 Face Shield: 0.4506
- - Map Gloves: 0.1463
- - Mar 100 Gloves: 0.3647
- - Map Goggles: 0.0995
- - Mar 100 Goggles: 0.3523
- - Map Mask: 0.2377
- - Mar 100 Mask: 0.3947
+ - Loss: 1.2957
+ - Map: 0.1854
+ - Map 50: 0.3967
+ - Map 75: 0.1509
+ - Map Small: 0.075
+ - Map Medium: 0.1538
+ - Map Large: 0.2634
+ - Mar 1: 0.2171
+ - Mar 10: 0.382
+ - Mar 100: 0.4106
+ - Mar Small: 0.1542
+ - Mar Medium: 0.3591
+ - Mar Large: 0.5679
+ - Map Coverall: 0.4319
+ - Mar 100 Coverall: 0.632
+ - Map Face Shield: 0.1239
+ - Mar 100 Face Shield: 0.3861
+ - Map Gloves: 0.1038
+ - Mar 100 Gloves: 0.3379
+ - Map Goggles: 0.0372
+ - Mar 100 Goggles: 0.3092
+ - Map Mask: 0.2304
+ - Mar 100 Mask: 0.3876
 
 ## Model description
 
@@ -63,42 +63,32 @@ The following hyperparameters were used during training:
 - seed: 42
 - optimizer: Use OptimizerNames.ADAMW_TORCH_FUSED with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
 - lr_scheduler_type: cosine
- - num_epochs: 30
+ - num_epochs: 20
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
 |:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
- | No log | 1.0 | 107 | 2.0315 | 0.0165 | 0.0436 | 0.0104 | 0.0023 | 0.0079 | 0.0265 | 0.0334 | 0.0864 | 0.1169 | 0.0399 | 0.0793 | 0.181 | 0.0583 | 0.2405 | 0.0038 | 0.0494 | 0.0025 | 0.1522 | 0.0 | 0.0 | 0.0179 | 0.1422 |
- | No log | 2.0 | 214 | 2.2366 | 0.0044 | 0.0176 | 0.0013 | 0.0016 | 0.0055 | 0.0094 | 0.0167 | 0.0616 | 0.0858 | 0.024 | 0.0953 | 0.1012 | 0.0045 | 0.1198 | 0.0034 | 0.0899 | 0.0006 | 0.054 | 0.0006 | 0.0046 | 0.013 | 0.1609 |
- | No log | 3.0 | 321 | 1.8060 | 0.0264 | 0.0605 | 0.0222 | 0.0059 | 0.0294 | 0.029 | 0.0567 | 0.1329 | 0.1814 | 0.0609 | 0.1167 | 0.2592 | 0.0924 | 0.4739 | 0.0053 | 0.062 | 0.0032 | 0.133 | 0.0003 | 0.0031 | 0.0305 | 0.2351 |
- | No log | 4.0 | 428 | 1.7634 | 0.0346 | 0.0796 | 0.0268 | 0.0086 | 0.0327 | 0.0393 | 0.0812 | 0.1752 | 0.2054 | 0.0939 | 0.1558 | 0.2369 | 0.1179 | 0.4527 | 0.0153 | 0.2278 | 0.0031 | 0.1174 | 0.0005 | 0.0246 | 0.0363 | 0.2044 |
- | 2.5211 | 5.0 | 535 | 1.6345 | 0.045 | 0.1153 | 0.0336 | 0.0076 | 0.0427 | 0.0531 | 0.0818 | 0.1939 | 0.2483 | 0.0982 | 0.179 | 0.3275 | 0.1687 | 0.5405 | 0.0259 | 0.2228 | 0.004 | 0.1888 | 0.0014 | 0.0692 | 0.025 | 0.22 |
- | 2.5211 | 6.0 | 642 | 1.5649 | 0.0655 | 0.161 | 0.0507 | 0.0099 | 0.0602 | 0.09 | 0.1084 | 0.2515 | 0.287 | 0.1106 | 0.2183 | 0.3845 | 0.2197 | 0.5757 | 0.0358 | 0.2747 | 0.0145 | 0.1857 | 0.0026 | 0.1369 | 0.0548 | 0.2618 |
- | 2.5211 | 7.0 | 749 | 1.5948 | 0.0845 | 0.2049 | 0.0633 | 0.0269 | 0.0742 | 0.0995 | 0.1297 | 0.2626 | 0.2992 | 0.1088 | 0.2235 | 0.4293 | 0.2717 | 0.5743 | 0.0459 | 0.2734 | 0.0163 | 0.2062 | 0.0079 | 0.1785 | 0.0805 | 0.2636 |
- | 2.5211 | 8.0 | 856 | 1.5967 | 0.0919 | 0.2207 | 0.0707 | 0.0167 | 0.0633 | 0.1232 | 0.1217 | 0.2607 | 0.2909 | 0.0852 | 0.235 | 0.4226 | 0.3033 | 0.5194 | 0.034 | 0.2544 | 0.0215 | 0.2308 | 0.0119 | 0.1723 | 0.0887 | 0.2778 |
- | 2.5211 | 9.0 | 963 | 1.4514 | 0.1109 | 0.2538 | 0.0845 | 0.0411 | 0.0777 | 0.1563 | 0.1422 | 0.3009 | 0.3289 | 0.1791 | 0.2695 | 0.446 | 0.3386 | 0.5432 | 0.071 | 0.3544 | 0.0471 | 0.2562 | 0.0088 | 0.16 | 0.089 | 0.3307 |
- | 1.4523 | 10.0 | 1070 | 1.4644 | 0.1133 | 0.2619 | 0.0875 | 0.0273 | 0.0882 | 0.1606 | 0.1461 | 0.3134 | 0.3443 | 0.1686 | 0.2875 | 0.4779 | 0.3493 | 0.5441 | 0.0458 | 0.3418 | 0.0312 | 0.2612 | 0.0294 | 0.2738 | 0.1106 | 0.3004 |
- | 1.4523 | 11.0 | 1177 | 1.4809 | 0.1275 | 0.2867 | 0.1025 | 0.0281 | 0.0897 | 0.1935 | 0.1579 | 0.321 | 0.3499 | 0.1611 | 0.277 | 0.4931 | 0.351 | 0.5509 | 0.06 | 0.3633 | 0.0392 | 0.2446 | 0.0435 | 0.2462 | 0.1438 | 0.3444 |
- | 1.4523 | 12.0 | 1284 | 1.3664 | 0.1275 | 0.2856 | 0.1075 | 0.0271 | 0.0913 | 0.1987 | 0.1684 | 0.3425 | 0.3685 | 0.1711 | 0.2964 | 0.5251 | 0.3792 | 0.5743 | 0.0735 | 0.3873 | 0.0572 | 0.2906 | 0.0123 | 0.2431 | 0.1154 | 0.3471 |
- | 1.4523 | 13.0 | 1391 | 1.3664 | 0.1526 | 0.3248 | 0.1317 | 0.0353 | 0.1067 | 0.2337 | 0.1812 | 0.3475 | 0.3781 | 0.1792 | 0.3069 | 0.5293 | 0.4254 | 0.5937 | 0.0769 | 0.3911 | 0.0681 | 0.3013 | 0.035 | 0.2538 | 0.1574 | 0.3507 |
- | 1.4523 | 14.0 | 1498 | 1.3835 | 0.151 | 0.3341 | 0.1152 | 0.0286 | 0.1038 | 0.2345 | 0.1763 | 0.3344 | 0.3599 | 0.1439 | 0.2864 | 0.5153 | 0.4127 | 0.5932 | 0.091 | 0.3481 | 0.066 | 0.2759 | 0.0391 | 0.2354 | 0.1464 | 0.3467 |
- | 1.276 | 15.0 | 1605 | 1.3411 | 0.1543 | 0.335 | 0.1298 | 0.0292 | 0.1057 | 0.2407 | 0.183 | 0.3381 | 0.3701 | 0.1465 | 0.3124 | 0.5277 | 0.4324 | 0.605 | 0.088 | 0.3671 | 0.0771 | 0.2857 | 0.0228 | 0.2585 | 0.1515 | 0.3342 |
- | 1.276 | 16.0 | 1712 | 1.2893 | 0.1661 | 0.3537 | 0.133 | 0.0342 | 0.1156 | 0.2717 | 0.1948 | 0.3721 | 0.4002 | 0.1815 | 0.3359 | 0.5539 | 0.454 | 0.6095 | 0.0686 | 0.3873 | 0.0894 | 0.3192 | 0.0365 | 0.3123 | 0.1823 | 0.3729 |
- | 1.276 | 17.0 | 1819 | 1.2780 | 0.1755 | 0.3802 | 0.1453 | 0.0373 | 0.1345 | 0.2636 | 0.2135 | 0.3861 | 0.4123 | 0.1981 | 0.3469 | 0.5757 | 0.4376 | 0.609 | 0.0986 | 0.4203 | 0.0941 | 0.3424 | 0.0588 | 0.3169 | 0.1884 | 0.3729 |
- | 1.276 | 18.0 | 1926 | 1.2497 | 0.1847 | 0.3934 | 0.1655 | 0.0461 | 0.1455 | 0.2714 | 0.2171 | 0.3867 | 0.4139 | 0.1944 | 0.3529 | 0.5756 | 0.4483 | 0.6036 | 0.1029 | 0.4329 | 0.11 | 0.3375 | 0.0513 | 0.3154 | 0.2112 | 0.38 |
- | 1.1271 | 19.0 | 2033 | 1.2770 | 0.1816 | 0.3797 | 0.1593 | 0.0439 | 0.1411 | 0.2703 | 0.2128 | 0.3874 | 0.4184 | 0.2093 | 0.3576 | 0.5749 | 0.4459 | 0.605 | 0.0961 | 0.4304 | 0.0994 | 0.3277 | 0.0591 | 0.3508 | 0.2078 | 0.3782 |
- | 1.1271 | 20.0 | 2140 | 1.2571 | 0.1942 | 0.3991 | 0.1714 | 0.0551 | 0.1523 | 0.293 | 0.2153 | 0.3873 | 0.4114 | 0.169 | 0.3529 | 0.5763 | 0.4696 | 0.6239 | 0.1069 | 0.4127 | 0.1192 | 0.3424 | 0.0523 | 0.3108 | 0.2229 | 0.3671 |
- | 1.1271 | 21.0 | 2247 | 1.2282 | 0.2044 | 0.4048 | 0.1707 | 0.0499 | 0.1545 | 0.3103 | 0.2349 | 0.3961 | 0.4282 | 0.1929 | 0.3695 | 0.5866 | 0.4778 | 0.6257 | 0.1088 | 0.4304 | 0.12 | 0.3549 | 0.0889 | 0.3231 | 0.2264 | 0.4071 |
- | 1.1271 | 22.0 | 2354 | 1.2395 | 0.2017 | 0.4202 | 0.1791 | 0.0511 | 0.1581 | 0.3156 | 0.2329 | 0.3961 | 0.4245 | 0.2095 | 0.3691 | 0.5774 | 0.4595 | 0.6077 | 0.1162 | 0.4456 | 0.125 | 0.3473 | 0.0944 | 0.3477 | 0.2134 | 0.3742 |
- | 1.1271 | 23.0 | 2461 | 1.2307 | 0.2147 | 0.4289 | 0.1954 | 0.0565 | 0.1687 | 0.3282 | 0.2424 | 0.4024 | 0.4304 | 0.197 | 0.3702 | 0.5957 | 0.4719 | 0.6167 | 0.1274 | 0.4392 | 0.1368 | 0.3527 | 0.1057 | 0.3492 | 0.2317 | 0.3942 |
- | 1.0136 | 24.0 | 2568 | 1.2164 | 0.2162 | 0.4366 | 0.1905 | 0.059 | 0.1676 | 0.3325 | 0.2435 | 0.4075 | 0.4355 | 0.1878 | 0.3812 | 0.5951 | 0.4729 | 0.6243 | 0.1291 | 0.4342 | 0.1433 | 0.3643 | 0.102 | 0.3585 | 0.234 | 0.3964 |
- | 1.0136 | 25.0 | 2675 | 1.2099 | 0.2199 | 0.4399 | 0.1918 | 0.064 | 0.1717 | 0.3337 | 0.2482 | 0.4094 | 0.4356 | 0.1865 | 0.3821 | 0.5947 | 0.4834 | 0.6315 | 0.1333 | 0.443 | 0.141 | 0.3554 | 0.1059 | 0.3538 | 0.2359 | 0.3942 |
- | 1.0136 | 26.0 | 2782 | 1.2123 | 0.2161 | 0.4387 | 0.186 | 0.0618 | 0.1664 | 0.3262 | 0.2468 | 0.4108 | 0.4358 | 0.1817 | 0.3803 | 0.599 | 0.4759 | 0.6243 | 0.1314 | 0.4506 | 0.1421 | 0.3567 | 0.0978 | 0.3431 | 0.2332 | 0.4044 |
- | 1.0136 | 27.0 | 2889 | 1.2133 | 0.217 | 0.4372 | 0.1832 | 0.0612 | 0.1665 | 0.3297 | 0.2473 | 0.4104 | 0.4363 | 0.1845 | 0.3771 | 0.6027 | 0.4812 | 0.6185 | 0.1362 | 0.4506 | 0.1442 | 0.3634 | 0.0917 | 0.3554 | 0.2317 | 0.3938 |
- | 1.0136 | 28.0 | 2996 | 1.2098 | 0.2195 | 0.4411 | 0.1871 | 0.06 | 0.1716 | 0.3288 | 0.2476 | 0.4124 | 0.4377 | 0.1818 | 0.3807 | 0.6065 | 0.48 | 0.6185 | 0.134 | 0.4557 | 0.1452 | 0.3603 | 0.0996 | 0.3554 | 0.2385 | 0.3987 |
- | 0.9389 | 29.0 | 3103 | 1.2105 | 0.2195 | 0.4354 | 0.1875 | 0.061 | 0.1712 | 0.3295 | 0.2504 | 0.4115 | 0.436 | 0.1845 | 0.3805 | 0.6004 | 0.4809 | 0.6212 | 0.1317 | 0.4481 | 0.1469 | 0.3638 | 0.1004 | 0.3523 | 0.2374 | 0.3947 |
- | 0.9389 | 30.0 | 3210 | 1.2104 | 0.2196 | 0.4356 | 0.188 | 0.0612 | 0.171 | 0.329 | 0.2502 | 0.4112 | 0.4367 | 0.1848 | 0.3804 | 0.6025 | 0.4813 | 0.6212 | 0.1332 | 0.4506 | 0.1463 | 0.3647 | 0.0995 | 0.3523 | 0.2377 | 0.3947 |
+ | No log | 1.0 | 107 | 2.4345 | 0.0025 | 0.0117 | 0.0001 | 0.0031 | 0.0042 | 0.0014 | 0.0104 | 0.0384 | 0.0714 | 0.0336 | 0.0987 | 0.0523 | 0.0007 | 0.0856 | 0.0026 | 0.1076 | 0.0002 | 0.0308 | 0.0 | 0.0 | 0.0089 | 0.1329 |
+ | No log | 2.0 | 214 | 2.1177 | 0.0058 | 0.0186 | 0.0039 | 0.0029 | 0.0042 | 0.0104 | 0.0224 | 0.0799 | 0.1055 | 0.0656 | 0.1019 | 0.1304 | 0.0173 | 0.1644 | 0.0007 | 0.0367 | 0.0026 | 0.1063 | 0.0008 | 0.0492 | 0.0077 | 0.1711 |
+ | No log | 3.0 | 321 | 1.8639 | 0.0236 | 0.0653 | 0.0116 | 0.0025 | 0.014 | 0.027 | 0.0513 | 0.1213 | 0.1498 | 0.0563 | 0.119 | 0.1881 | 0.088 | 0.3297 | 0.0068 | 0.0595 | 0.0033 | 0.1415 | 0.0004 | 0.0092 | 0.0195 | 0.2089 |
+ | No log | 4.0 | 428 | 1.6789 | 0.0346 | 0.0803 | 0.0268 | 0.0064 | 0.0251 | 0.041 | 0.0747 | 0.1761 | 0.2189 | 0.0694 | 0.163 | 0.2987 | 0.1429 | 0.5486 | 0.0092 | 0.1291 | 0.003 | 0.1571 | 0.001 | 0.0369 | 0.017 | 0.2227 |
+ | 2.5173 | 5.0 | 535 | 1.6678 | 0.0414 | 0.1 | 0.0309 | 0.0123 | 0.0344 | 0.0556 | 0.1003 | 0.1985 | 0.2444 | 0.0932 | 0.197 | 0.3174 | 0.1336 | 0.4887 | 0.0332 | 0.2481 | 0.0064 | 0.1973 | 0.002 | 0.0369 | 0.0317 | 0.2511 |
+ | 2.5173 | 6.0 | 642 | 1.8747 | 0.0341 | 0.0891 | 0.0217 | 0.0074 | 0.0423 | 0.0392 | 0.0872 | 0.1979 | 0.2412 | 0.0691 | 0.1803 | 0.3318 | 0.0923 | 0.5198 | 0.0171 | 0.1911 | 0.0183 | 0.1826 | 0.0008 | 0.0985 | 0.0421 | 0.2138 |
+ | 2.5173 | 7.0 | 749 | 1.6153 | 0.0555 | 0.1413 | 0.0372 | 0.0126 | 0.0497 | 0.0812 | 0.1012 | 0.2199 | 0.255 | 0.1117 | 0.2002 | 0.3413 | 0.1711 | 0.5023 | 0.0311 | 0.1886 | 0.0234 | 0.1978 | 0.0019 | 0.1 | 0.0499 | 0.2862 |
+ | 2.5173 | 8.0 | 856 | 1.5458 | 0.085 | 0.207 | 0.0597 | 0.0218 | 0.0735 | 0.1101 | 0.1362 | 0.2698 | 0.3038 | 0.107 | 0.2324 | 0.4444 | 0.2387 | 0.5613 | 0.0491 | 0.2595 | 0.0393 | 0.2491 | 0.0051 | 0.1385 | 0.0926 | 0.3107 |
+ | 2.5173 | 9.0 | 963 | 1.4737 | 0.0998 | 0.2394 | 0.0738 | 0.0257 | 0.09 | 0.1373 | 0.1328 | 0.2961 | 0.3372 | 0.1365 | 0.286 | 0.4551 | 0.2987 | 0.5833 | 0.0463 | 0.2722 | 0.0393 | 0.2705 | 0.0093 | 0.2108 | 0.1055 | 0.3493 |
+ | 1.4606 | 10.0 | 1070 | 1.4459 | 0.1192 | 0.2641 | 0.0884 | 0.0249 | 0.1004 | 0.1666 | 0.1615 | 0.3041 | 0.3404 | 0.1114 | 0.2709 | 0.5104 | 0.3475 | 0.6045 | 0.0504 | 0.3051 | 0.0607 | 0.2603 | 0.0077 | 0.1785 | 0.1298 | 0.3538 |
+ | 1.4606 | 11.0 | 1177 | 1.4111 | 0.139 | 0.3108 | 0.106 | 0.0335 | 0.1135 | 0.1957 | 0.1748 | 0.3278 | 0.3602 | 0.1332 | 0.2968 | 0.5332 | 0.3809 | 0.6113 | 0.0683 | 0.3215 | 0.0676 | 0.2879 | 0.0069 | 0.24 | 0.1712 | 0.3404 |
+ | 1.4606 | 12.0 | 1284 | 1.3864 | 0.1518 | 0.3352 | 0.1224 | 0.0325 | 0.1291 | 0.2138 | 0.1757 | 0.3332 | 0.3674 | 0.1373 | 0.3102 | 0.521 | 0.4031 | 0.6167 | 0.0681 | 0.2937 | 0.0721 | 0.3013 | 0.0194 | 0.2615 | 0.1964 | 0.3636 |
+ | 1.4606 | 13.0 | 1391 | 1.3704 | 0.159 | 0.3398 | 0.1218 | 0.0382 | 0.1262 | 0.2277 | 0.1829 | 0.3494 | 0.3807 | 0.1273 | 0.3245 | 0.5416 | 0.4058 | 0.6243 | 0.0825 | 0.3582 | 0.0824 | 0.2969 | 0.0206 | 0.2569 | 0.2039 | 0.3671 |
+ | 1.4606 | 14.0 | 1498 | 1.3534 | 0.1553 | 0.3483 | 0.1227 | 0.0371 | 0.1157 | 0.2372 | 0.1878 | 0.3518 | 0.3822 | 0.1263 | 0.3257 | 0.5434 | 0.4125 | 0.6113 | 0.0796 | 0.3684 | 0.0846 | 0.3027 | 0.0137 | 0.2523 | 0.1863 | 0.3764 |
+ | 1.2325 | 15.0 | 1605 | 1.3488 | 0.1642 | 0.3747 | 0.1284 | 0.0514 | 0.1233 | 0.2522 | 0.1958 | 0.3479 | 0.378 | 0.1332 | 0.3223 | 0.5373 | 0.4146 | 0.5995 | 0.0956 | 0.3203 | 0.0784 | 0.2951 | 0.0295 | 0.3 | 0.203 | 0.3751 |
+ | 1.2325 | 16.0 | 1712 | 1.3200 | 0.1806 | 0.393 | 0.1471 | 0.0662 | 0.1429 | 0.2668 | 0.2091 | 0.367 | 0.3922 | 0.1413 | 0.3381 | 0.5545 | 0.424 | 0.6221 | 0.1245 | 0.3709 | 0.1015 | 0.3156 | 0.0293 | 0.2846 | 0.2234 | 0.368 |
+ | 1.2325 | 17.0 | 1819 | 1.3110 | 0.1825 | 0.391 | 0.1407 | 0.0665 | 0.149 | 0.2635 | 0.2116 | 0.3724 | 0.3998 | 0.1443 | 0.3565 | 0.5489 | 0.4204 | 0.6162 | 0.1207 | 0.3772 | 0.1011 | 0.329 | 0.0432 | 0.2954 | 0.2268 | 0.3813 |
+ | 1.2325 | 18.0 | 1926 | 1.2965 | 0.1859 | 0.3958 | 0.1484 | 0.0751 | 0.1529 | 0.2665 | 0.2154 | 0.3792 | 0.41 | 0.155 | 0.3626 | 0.559 | 0.4314 | 0.6279 | 0.1269 | 0.3975 | 0.1034 | 0.3321 | 0.0401 | 0.3169 | 0.2275 | 0.3756 |
+ | 1.1097 | 19.0 | 2033 | 1.2951 | 0.1859 | 0.3982 | 0.1491 | 0.075 | 0.1541 | 0.2661 | 0.2167 | 0.3814 | 0.4114 | 0.1527 | 0.3623 | 0.5655 | 0.4313 | 0.632 | 0.126 | 0.3937 | 0.1031 | 0.3362 | 0.0389 | 0.3138 | 0.2302 | 0.3813 |
+ | 1.1097 | 20.0 | 2140 | 1.2957 | 0.1854 | 0.3967 | 0.1509 | 0.075 | 0.1538 | 0.2634 | 0.2171 | 0.382 | 0.4106 | 0.1542 | 0.3591 | 0.5679 | 0.4319 | 0.632 | 0.1239 | 0.3861 | 0.1038 | 0.3379 | 0.0372 | 0.3092 | 0.2304 | 0.3876 |
 
 
 ### Framework versions
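For context on the metrics in the diff above: the Map 50 and Map 75 columns differ only in the IoU threshold a predicted box must clear against a ground-truth box to count as a true positive (0.50 vs. 0.75), which is why Map 50 is always the larger of the two. A minimal stdlib-only sketch with illustrative boxes (not actual model output):

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned (x_min, y_min, x_max, y_max) boxes."""
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))  # overlap width
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))  # overlap height
    inter = ix * iy
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

gt = (0, 0, 10, 10)    # ground-truth box (illustrative)
pred = (2, 0, 12, 10)  # slightly shifted prediction

overlap = iou(gt, pred)        # 80 / 120 = 0.666...
print(overlap >= 0.50)         # True  -> a hit at the mAP@50 threshold
print(overlap >= 0.75)         # False -> a miss at the mAP@75 threshold
```

The per-class columns (Map Coverall, Map Mask, ...) apply the same matching per category; the headline Map averages over classes and over IoU thresholds 0.50:0.95, COCO-style.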
model.safetensors CHANGED
@@ -1,3 +1,3 @@
1
  version https://git-lfs.github.com/spec/v1
2
- oid sha256:09d92d7a9e5ea23f9e86c11861413a120ed09652e29706a42a7d5c9f674f1020
3
  size 174079796
 
1
  version https://git-lfs.github.com/spec/v1
2
+ oid sha256:3892c0740ea54c0b3ac10ffb764210f33aba5df6e87fcf924b81208caf146689
3
  size 174079796
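The hyperparameter lines in the README diff (seed, fused AdamW betas/epsilon, cosine schedule, and the num_epochs change from 30 to 20) map onto a `transformers.TrainingArguments` config roughly as sketched below. This is a hedged reconstruction, not the author's training script: `output_dir` and any values not shown in the diff (learning rate, batch sizes) are placeholders.

```python
from transformers import TrainingArguments

# Sketch of the settings visible in this commit's README diff.
args = TrainingArguments(
    output_dir="conditional-detr-finetune",  # placeholder, not from the diff
    seed=42,
    optim="adamw_torch_fused",  # OptimizerNames.ADAMW_TORCH_FUSED
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    num_train_epochs=20,  # this commit's value; the previous run used 30
)
```

With 107 optimizer steps per epoch (per the Step column), 20 epochs gives the 2140 total steps shown in the final table row.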