End of training

Browse files

- README.md: +44 -54
- model.safetensors: +1 -1

README.md CHANGED
@@ -16,29 +16,29 @@ should probably proofread and complete it, then remove this comment. -->
@@ -63,42 +63,32 @@ The following hyperparameters were used during training:

Rows removed from the previous revision's training results (epochs 21–30 of the earlier 30-epoch run):

| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
| 1.1271 | 21.0 | 2247 | 1.2282 | 0.2044 | 0.4048 | 0.1707 | 0.0499 | 0.1545 | 0.3103 | 0.2349 | 0.3961 | 0.4282 | 0.1929 | 0.3695 | 0.5866 | 0.4778 | 0.6257 | 0.1088 | 0.4304 | 0.12 | 0.3549 | 0.0889 | 0.3231 | 0.2264 | 0.4071 |
| 1.1271 | 22.0 | 2354 | 1.2395 | 0.2017 | 0.4202 | 0.1791 | 0.0511 | 0.1581 | 0.3156 | 0.2329 | 0.3961 | 0.4245 | 0.2095 | 0.3691 | 0.5774 | 0.4595 | 0.6077 | 0.1162 | 0.4456 | 0.125 | 0.3473 | 0.0944 | 0.3477 | 0.2134 | 0.3742 |
| 1.1271 | 23.0 | 2461 | 1.2307 | 0.2147 | 0.4289 | 0.1954 | 0.0565 | 0.1687 | 0.3282 | 0.2424 | 0.4024 | 0.4304 | 0.197 | 0.3702 | 0.5957 | 0.4719 | 0.6167 | 0.1274 | 0.4392 | 0.1368 | 0.3527 | 0.1057 | 0.3492 | 0.2317 | 0.3942 |
| 1.0136 | 24.0 | 2568 | 1.2164 | 0.2162 | 0.4366 | 0.1905 | 0.059 | 0.1676 | 0.3325 | 0.2435 | 0.4075 | 0.4355 | 0.1878 | 0.3812 | 0.5951 | 0.4729 | 0.6243 | 0.1291 | 0.4342 | 0.1433 | 0.3643 | 0.102 | 0.3585 | 0.234 | 0.3964 |
| 1.0136 | 25.0 | 2675 | 1.2099 | 0.2199 | 0.4399 | 0.1918 | 0.064 | 0.1717 | 0.3337 | 0.2482 | 0.4094 | 0.4356 | 0.1865 | 0.3821 | 0.5947 | 0.4834 | 0.6315 | 0.1333 | 0.443 | 0.141 | 0.3554 | 0.1059 | 0.3538 | 0.2359 | 0.3942 |
| 1.0136 | 26.0 | 2782 | 1.2123 | 0.2161 | 0.4387 | 0.186 | 0.0618 | 0.1664 | 0.3262 | 0.2468 | 0.4108 | 0.4358 | 0.1817 | 0.3803 | 0.599 | 0.4759 | 0.6243 | 0.1314 | 0.4506 | 0.1421 | 0.3567 | 0.0978 | 0.3431 | 0.2332 | 0.4044 |
| 1.0136 | 27.0 | 2889 | 1.2133 | 0.217 | 0.4372 | 0.1832 | 0.0612 | 0.1665 | 0.3297 | 0.2473 | 0.4104 | 0.4363 | 0.1845 | 0.3771 | 0.6027 | 0.4812 | 0.6185 | 0.1362 | 0.4506 | 0.1442 | 0.3634 | 0.0917 | 0.3554 | 0.2317 | 0.3938 |
| 1.0136 | 28.0 | 2996 | 1.2098 | 0.2195 | 0.4411 | 0.1871 | 0.06 | 0.1716 | 0.3288 | 0.2476 | 0.4124 | 0.4377 | 0.1818 | 0.3807 | 0.6065 | 0.48 | 0.6185 | 0.134 | 0.4557 | 0.1452 | 0.3603 | 0.0996 | 0.3554 | 0.2385 | 0.3987 |
| 0.9389 | 29.0 | 3103 | 1.2105 | 0.2195 | 0.4354 | 0.1875 | 0.061 | 0.1712 | 0.3295 | 0.2504 | 0.4115 | 0.436 | 0.1845 | 0.3805 | 0.6004 | 0.4809 | 0.6212 | 0.1317 | 0.4481 | 0.1469 | 0.3638 | 0.1004 | 0.3523 | 0.2374 | 0.3947 |
| 0.9389 | 30.0 | 3210 | 1.2104 | 0.2196 | 0.4356 | 0.188 | 0.0612 | 0.171 | 0.329 | 0.2502 | 0.4112 | 0.4367 | 0.1848 | 0.3804 | 0.6025 | 0.4813 | 0.6212 | 0.1332 | 0.4506 | 0.1463 | 0.3647 | 0.0995 | 0.3523 | 0.2377 | 0.3947 |
This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:

- Loss: 1.2957
- Map: 0.1854
- Map 50: 0.3967
- Map 75: 0.1509
- Map Small: 0.075
- Map Medium: 0.1538
- Map Large: 0.2634
- Mar 1: 0.2171
- Mar 10: 0.382
- Mar 100: 0.4106
- Mar Small: 0.1542
- Mar Medium: 0.3591
- Mar Large: 0.5679
- Map Coverall: 0.4319
- Mar 100 Coverall: 0.632
- Map Face Shield: 0.1239
- Mar 100 Face Shield: 0.3861
- Map Gloves: 0.1038
- Mar 100 Gloves: 0.3379
- Map Goggles: 0.0372
- Mar 100 Goggles: 0.3092
- Map Mask: 0.2304
- Mar 100 Mask: 0.3876
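The Map / Mar values above are COCO-style mean average precision and mean average recall (Map 50 is AP at an IoU threshold of 0.50, Map 75 at 0.75, and so on). As a rough illustration of what one such number measures — not the torchmetrics/pycocotools code actually used to produce this card — here is a minimal single-class average precision at a fixed IoU threshold:

```python
def iou(a, b):
    """IoU of two boxes in (x1, y1, x2, y2) format."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def average_precision(preds, gts, iou_thr=0.5):
    """AP for one class: preds is a list of (score, box), gts a list of boxes.

    Predictions are matched greedily in descending score order; each ground
    truth may be matched at most once. AP is accumulated as precision at each
    true-positive rank, divided by the number of ground truths.
    """
    preds = sorted(preds, key=lambda p: -p[0])
    matched = set()
    hits = []
    for score, box in preds:
        best, best_i = 0.0, None
        for i, gt in enumerate(gts):
            if i in matched:
                continue
            o = iou(box, gt)
            if o > best:
                best, best_i = o, i
        if best_i is not None and best >= iou_thr:
            matched.add(best_i)
            hits.append(1)
        else:
            hits.append(0)
    ap, tp = 0.0, 0
    for rank, hit in enumerate(hits, start=1):
        if hit:
            tp += 1
            ap += (tp / rank) / len(gts)
    return ap
```

A full COCO evaluation averages this over classes and over IoU thresholds from 0.50 to 0.95, which is what the headline Map figure reports.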
## Model description
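Conditional DETR, like other DETR variants, predicts boxes as normalized (cx, cy, w, h) values in [0, 1]. A minimal sketch of converting one such prediction to pixel (x1, y1, x2, y2) coordinates — the core of what `post_process_object_detection` in transformers does before thresholding (the sample box values are illustrative only):

```python
def cxcywh_to_xyxy_pixels(box, height, width):
    """Convert a normalized (cx, cy, w, h) box to pixel (x1, y1, x2, y2)."""
    cx, cy, w, h = box
    return (
        (cx - 0.5 * w) * width,   # x1
        (cy - 0.5 * h) * height,  # y1
        (cx + 0.5 * w) * width,   # x2
        (cy + 0.5 * h) * height,  # y2
    )

# A centered box covering half the image in each dimension:
# cxcywh_to_xyxy_pixels((0.5, 0.5, 0.5, 0.5), height=100, width=200)
# -> (50.0, 25.0, 150.0, 75.0)
```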
The following hyperparameters were used during training:

- seed: 42
- optimizer: AdamW (torch fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 20
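With `lr_scheduler_type: cosine`, the learning rate decays from its initial value to roughly zero over training. A minimal sketch of that schedule (the base learning rate and warmup length are assumptions — this hunk of the card does not list them; the total of 2140 steps is 20 epochs x 107 steps per epoch, as in the results table below):

```python
import math

def cosine_lr(step, total_steps, base_lr, warmup_steps=0):
    """Linear warmup (if any) followed by cosine decay to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return 0.5 * base_lr * (1.0 + math.cos(math.pi * progress))

# At step 0 this returns base_lr; at step 2140 (the final step of this run)
# it has decayed to ~0, and at the midpoint it is half the base rate.
```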
### Training results

| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
| No log | 1.0 | 107 | 2.4345 | 0.0025 | 0.0117 | 0.0001 | 0.0031 | 0.0042 | 0.0014 | 0.0104 | 0.0384 | 0.0714 | 0.0336 | 0.0987 | 0.0523 | 0.0007 | 0.0856 | 0.0026 | 0.1076 | 0.0002 | 0.0308 | 0.0 | 0.0 | 0.0089 | 0.1329 |
| No log | 2.0 | 214 | 2.1177 | 0.0058 | 0.0186 | 0.0039 | 0.0029 | 0.0042 | 0.0104 | 0.0224 | 0.0799 | 0.1055 | 0.0656 | 0.1019 | 0.1304 | 0.0173 | 0.1644 | 0.0007 | 0.0367 | 0.0026 | 0.1063 | 0.0008 | 0.0492 | 0.0077 | 0.1711 |
| No log | 3.0 | 321 | 1.8639 | 0.0236 | 0.0653 | 0.0116 | 0.0025 | 0.014 | 0.027 | 0.0513 | 0.1213 | 0.1498 | 0.0563 | 0.119 | 0.1881 | 0.088 | 0.3297 | 0.0068 | 0.0595 | 0.0033 | 0.1415 | 0.0004 | 0.0092 | 0.0195 | 0.2089 |
| No log | 4.0 | 428 | 1.6789 | 0.0346 | 0.0803 | 0.0268 | 0.0064 | 0.0251 | 0.041 | 0.0747 | 0.1761 | 0.2189 | 0.0694 | 0.163 | 0.2987 | 0.1429 | 0.5486 | 0.0092 | 0.1291 | 0.003 | 0.1571 | 0.001 | 0.0369 | 0.017 | 0.2227 |
| 2.5173 | 5.0 | 535 | 1.6678 | 0.0414 | 0.1 | 0.0309 | 0.0123 | 0.0344 | 0.0556 | 0.1003 | 0.1985 | 0.2444 | 0.0932 | 0.197 | 0.3174 | 0.1336 | 0.4887 | 0.0332 | 0.2481 | 0.0064 | 0.1973 | 0.002 | 0.0369 | 0.0317 | 0.2511 |
| 2.5173 | 6.0 | 642 | 1.8747 | 0.0341 | 0.0891 | 0.0217 | 0.0074 | 0.0423 | 0.0392 | 0.0872 | 0.1979 | 0.2412 | 0.0691 | 0.1803 | 0.3318 | 0.0923 | 0.5198 | 0.0171 | 0.1911 | 0.0183 | 0.1826 | 0.0008 | 0.0985 | 0.0421 | 0.2138 |
| 2.5173 | 7.0 | 749 | 1.6153 | 0.0555 | 0.1413 | 0.0372 | 0.0126 | 0.0497 | 0.0812 | 0.1012 | 0.2199 | 0.255 | 0.1117 | 0.2002 | 0.3413 | 0.1711 | 0.5023 | 0.0311 | 0.1886 | 0.0234 | 0.1978 | 0.0019 | 0.1 | 0.0499 | 0.2862 |
| 2.5173 | 8.0 | 856 | 1.5458 | 0.085 | 0.207 | 0.0597 | 0.0218 | 0.0735 | 0.1101 | 0.1362 | 0.2698 | 0.3038 | 0.107 | 0.2324 | 0.4444 | 0.2387 | 0.5613 | 0.0491 | 0.2595 | 0.0393 | 0.2491 | 0.0051 | 0.1385 | 0.0926 | 0.3107 |
| 2.5173 | 9.0 | 963 | 1.4737 | 0.0998 | 0.2394 | 0.0738 | 0.0257 | 0.09 | 0.1373 | 0.1328 | 0.2961 | 0.3372 | 0.1365 | 0.286 | 0.4551 | 0.2987 | 0.5833 | 0.0463 | 0.2722 | 0.0393 | 0.2705 | 0.0093 | 0.2108 | 0.1055 | 0.3493 |
| 1.4606 | 10.0 | 1070 | 1.4459 | 0.1192 | 0.2641 | 0.0884 | 0.0249 | 0.1004 | 0.1666 | 0.1615 | 0.3041 | 0.3404 | 0.1114 | 0.2709 | 0.5104 | 0.3475 | 0.6045 | 0.0504 | 0.3051 | 0.0607 | 0.2603 | 0.0077 | 0.1785 | 0.1298 | 0.3538 |
| 1.4606 | 11.0 | 1177 | 1.4111 | 0.139 | 0.3108 | 0.106 | 0.0335 | 0.1135 | 0.1957 | 0.1748 | 0.3278 | 0.3602 | 0.1332 | 0.2968 | 0.5332 | 0.3809 | 0.6113 | 0.0683 | 0.3215 | 0.0676 | 0.2879 | 0.0069 | 0.24 | 0.1712 | 0.3404 |
| 1.4606 | 12.0 | 1284 | 1.3864 | 0.1518 | 0.3352 | 0.1224 | 0.0325 | 0.1291 | 0.2138 | 0.1757 | 0.3332 | 0.3674 | 0.1373 | 0.3102 | 0.521 | 0.4031 | 0.6167 | 0.0681 | 0.2937 | 0.0721 | 0.3013 | 0.0194 | 0.2615 | 0.1964 | 0.3636 |
| 1.4606 | 13.0 | 1391 | 1.3704 | 0.159 | 0.3398 | 0.1218 | 0.0382 | 0.1262 | 0.2277 | 0.1829 | 0.3494 | 0.3807 | 0.1273 | 0.3245 | 0.5416 | 0.4058 | 0.6243 | 0.0825 | 0.3582 | 0.0824 | 0.2969 | 0.0206 | 0.2569 | 0.2039 | 0.3671 |
| 1.4606 | 14.0 | 1498 | 1.3534 | 0.1553 | 0.3483 | 0.1227 | 0.0371 | 0.1157 | 0.2372 | 0.1878 | 0.3518 | 0.3822 | 0.1263 | 0.3257 | 0.5434 | 0.4125 | 0.6113 | 0.0796 | 0.3684 | 0.0846 | 0.3027 | 0.0137 | 0.2523 | 0.1863 | 0.3764 |
| 1.2325 | 15.0 | 1605 | 1.3488 | 0.1642 | 0.3747 | 0.1284 | 0.0514 | 0.1233 | 0.2522 | 0.1958 | 0.3479 | 0.378 | 0.1332 | 0.3223 | 0.5373 | 0.4146 | 0.5995 | 0.0956 | 0.3203 | 0.0784 | 0.2951 | 0.0295 | 0.3 | 0.203 | 0.3751 |
| 1.2325 | 16.0 | 1712 | 1.3200 | 0.1806 | 0.393 | 0.1471 | 0.0662 | 0.1429 | 0.2668 | 0.2091 | 0.367 | 0.3922 | 0.1413 | 0.3381 | 0.5545 | 0.424 | 0.6221 | 0.1245 | 0.3709 | 0.1015 | 0.3156 | 0.0293 | 0.2846 | 0.2234 | 0.368 |
| 1.2325 | 17.0 | 1819 | 1.3110 | 0.1825 | 0.391 | 0.1407 | 0.0665 | 0.149 | 0.2635 | 0.2116 | 0.3724 | 0.3998 | 0.1443 | 0.3565 | 0.5489 | 0.4204 | 0.6162 | 0.1207 | 0.3772 | 0.1011 | 0.329 | 0.0432 | 0.2954 | 0.2268 | 0.3813 |
| 1.2325 | 18.0 | 1926 | 1.2965 | 0.1859 | 0.3958 | 0.1484 | 0.0751 | 0.1529 | 0.2665 | 0.2154 | 0.3792 | 0.41 | 0.155 | 0.3626 | 0.559 | 0.4314 | 0.6279 | 0.1269 | 0.3975 | 0.1034 | 0.3321 | 0.0401 | 0.3169 | 0.2275 | 0.3756 |
| 1.1097 | 19.0 | 2033 | 1.2951 | 0.1859 | 0.3982 | 0.1491 | 0.075 | 0.1541 | 0.2661 | 0.2167 | 0.3814 | 0.4114 | 0.1527 | 0.3623 | 0.5655 | 0.4313 | 0.632 | 0.126 | 0.3937 | 0.1031 | 0.3362 | 0.0389 | 0.3138 | 0.2302 | 0.3813 |
| 1.1097 | 20.0 | 2140 | 1.2957 | 0.1854 | 0.3967 | 0.1509 | 0.075 | 0.1538 | 0.2634 | 0.2171 | 0.382 | 0.4106 | 0.1542 | 0.3591 | 0.5679 | 0.4319 | 0.632 | 0.1239 | 0.3861 | 0.1038 | 0.3379 | 0.0372 | 0.3092 | 0.2304 | 0.3876 |
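The overall Map and Mar 100 figures are the unweighted mean of the five per-class values, which can be checked directly from the final-epoch (20.0) row:

```python
# Per-class values copied from the epoch-20.0 row of the table above.
per_class_map = {
    "Coverall": 0.4319,
    "Face Shield": 0.1239,
    "Gloves": 0.1038,
    "Goggles": 0.0372,
    "Mask": 0.2304,
}
per_class_mar100 = {
    "Coverall": 0.632,
    "Face Shield": 0.3861,
    "Gloves": 0.3379,
    "Goggles": 0.3092,
    "Mask": 0.3876,
}

mean_map = sum(per_class_map.values()) / len(per_class_map)
mean_mar100 = sum(per_class_mar100.values()) / len(per_class_mar100)
# mean_map rounds to 0.1854 and mean_mar100 to 0.4106, matching the
# reported Map and Mar 100 for this checkpoint.
```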
### Framework versions

model.safetensors CHANGED

@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:3892c0740ea54c0b3ac10ffb764210f33aba5df6e87fcf924b81208caf146689
 size 174079796
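The weights file is stored as a Git LFS pointer; the `oid sha256:` line above is the checksum of the real ~174 MB `model.safetensors` object. A minimal sketch of verifying a downloaded file against that pointer (the local path is an assumption):

```python
import hashlib

def sha256_of(path):
    """Stream a file through SHA-256 in 1 MiB chunks, as Git LFS does
    when verifying an object against its pointer's oid."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

# e.g., after downloading the weights locally:
# sha256_of("model.safetensors") should equal
# "3892c0740ea54c0b3ac10ffb764210f33aba5df6e87fcf924b81208caf146689"
```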