---
library_name: transformers
license: apache-2.0
base_model: microsoft/conditional-detr-resnet-50
tags:
- generated_from_trainer
model-index:
- name: detr_finetuned_cppe5
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# detr_finetuned_cppe5

This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on the CPPE-5 dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2104
- Map: 0.2196
- Map 50: 0.4356
- Map 75: 0.188
- Map Small: 0.0612
- Map Medium: 0.171
- Map Large: 0.329
- Mar 1: 0.2502
- Mar 10: 0.4112
- Mar 100: 0.4367
- Mar Small: 0.1848
- Mar Medium: 0.3804
- Mar Large: 0.6025
- Map Coverall: 0.4813
- Mar 100 Coverall: 0.6212
- Map Face Shield: 0.1332
- Mar 100 Face Shield: 0.4506
- Map Gloves: 0.1463
- Mar 100 Gloves: 0.3647
- Map Goggles: 0.0995
- Mar 100 Goggles: 0.3523
- Map Mask: 0.2377
- Mar 100 Mask: 0.3947
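
Map 50 and Map 75 above are COCO-style average precision at IoU thresholds of 0.50 and 0.75: a predicted box counts as a true positive only if its intersection-over-union with a ground-truth box reaches the threshold. A minimal IoU sketch in plain Python (illustrative only, not part of the training code):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two (x0, y0, x1, y1) boxes."""
    # Intersection rectangle (empty if the boxes do not overlap)
    x0 = max(box_a[0], box_b[0])
    y0 = max(box_a[1], box_b[1])
    x1 = min(box_a[2], box_b[2])
    y1 = min(box_a[3], box_b[3])
    inter = max(0.0, x1 - x0) * max(0.0, y1 - y0)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

# A detection shifted by a quarter of its width has IoU = 0.6, so it
# counts as a hit at the 0.50 threshold but as a miss at 0.75.
print(iou((0, 0, 100, 100), (25, 0, 125, 100)))  # 0.6
```

This is why Map 50 (0.4356) is much higher than Map 75 (0.188): many detections here are roughly, but not tightly, localized.
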

## Model description

More information needed

## Intended uses & limitations

More information needed
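
At inference time, Conditional DETR heads emit boxes as normalized (center-x, center-y, width, height) values in [0, 1]; to draw or evaluate them you rescale to absolute (x0, y0, x1, y1) pixel corners, which the image processor's `post_process_object_detection` does for you. A minimal sketch of that conversion (illustrative, not the library code):

```python
def to_pixel_corners(box, img_w, img_h):
    """Convert a normalized (cx, cy, w, h) box to absolute (x0, y0, x1, y1)."""
    cx, cy, w, h = box
    x0 = (cx - w / 2) * img_w
    y0 = (cy - h / 2) * img_h
    x1 = (cx + w / 2) * img_w
    y1 = (cy + h / 2) * img_h
    return (x0, y0, x1, y1)

# A box centered in a 640x480 image covering half of each dimension:
print(to_pixel_corners((0.5, 0.5, 0.5, 0.5), 640, 480))  # (160.0, 120.0, 480.0, 360.0)
```
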

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
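
With `lr_scheduler_type: cosine`, the learning rate follows half a cosine wave from the initial 1e-4 down to roughly zero over the run. A sketch of the shape, assuming no warmup steps (the card does not list any):

```python
import math

def cosine_lr(step, total_steps, base_lr=1e-4):
    """Cosine-annealed learning rate with no warmup (assumption)."""
    progress = step / total_steps
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

total = 3210  # 30 epochs x 107 steps per epoch, per the results table below
print(cosine_lr(0, total))           # starts at the base rate, 1e-4
print(cosine_lr(total // 2, total))  # half the base rate mid-run
print(cosine_lr(total, total))       # annealed to ~0 at the end
```
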

### Training results

| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
| No log | 1.0 | 107 | 2.0315 | 0.0165 | 0.0436 | 0.0104 | 0.0023 | 0.0079 | 0.0265 | 0.0334 | 0.0864 | 0.1169 | 0.0399 | 0.0793 | 0.181 | 0.0583 | 0.2405 | 0.0038 | 0.0494 | 0.0025 | 0.1522 | 0.0 | 0.0 | 0.0179 | 0.1422 |
| No log | 2.0 | 214 | 2.2366 | 0.0044 | 0.0176 | 0.0013 | 0.0016 | 0.0055 | 0.0094 | 0.0167 | 0.0616 | 0.0858 | 0.024 | 0.0953 | 0.1012 | 0.0045 | 0.1198 | 0.0034 | 0.0899 | 0.0006 | 0.054 | 0.0006 | 0.0046 | 0.013 | 0.1609 |
| No log | 3.0 | 321 | 1.8060 | 0.0264 | 0.0605 | 0.0222 | 0.0059 | 0.0294 | 0.029 | 0.0567 | 0.1329 | 0.1814 | 0.0609 | 0.1167 | 0.2592 | 0.0924 | 0.4739 | 0.0053 | 0.062 | 0.0032 | 0.133 | 0.0003 | 0.0031 | 0.0305 | 0.2351 |
| No log | 4.0 | 428 | 1.7634 | 0.0346 | 0.0796 | 0.0268 | 0.0086 | 0.0327 | 0.0393 | 0.0812 | 0.1752 | 0.2054 | 0.0939 | 0.1558 | 0.2369 | 0.1179 | 0.4527 | 0.0153 | 0.2278 | 0.0031 | 0.1174 | 0.0005 | 0.0246 | 0.0363 | 0.2044 |
| 2.5211 | 5.0 | 535 | 1.6345 | 0.045 | 0.1153 | 0.0336 | 0.0076 | 0.0427 | 0.0531 | 0.0818 | 0.1939 | 0.2483 | 0.0982 | 0.179 | 0.3275 | 0.1687 | 0.5405 | 0.0259 | 0.2228 | 0.004 | 0.1888 | 0.0014 | 0.0692 | 0.025 | 0.22 |
| 2.5211 | 6.0 | 642 | 1.5649 | 0.0655 | 0.161 | 0.0507 | 0.0099 | 0.0602 | 0.09 | 0.1084 | 0.2515 | 0.287 | 0.1106 | 0.2183 | 0.3845 | 0.2197 | 0.5757 | 0.0358 | 0.2747 | 0.0145 | 0.1857 | 0.0026 | 0.1369 | 0.0548 | 0.2618 |
| 2.5211 | 7.0 | 749 | 1.5948 | 0.0845 | 0.2049 | 0.0633 | 0.0269 | 0.0742 | 0.0995 | 0.1297 | 0.2626 | 0.2992 | 0.1088 | 0.2235 | 0.4293 | 0.2717 | 0.5743 | 0.0459 | 0.2734 | 0.0163 | 0.2062 | 0.0079 | 0.1785 | 0.0805 | 0.2636 |
| 2.5211 | 8.0 | 856 | 1.5967 | 0.0919 | 0.2207 | 0.0707 | 0.0167 | 0.0633 | 0.1232 | 0.1217 | 0.2607 | 0.2909 | 0.0852 | 0.235 | 0.4226 | 0.3033 | 0.5194 | 0.034 | 0.2544 | 0.0215 | 0.2308 | 0.0119 | 0.1723 | 0.0887 | 0.2778 |
| 2.5211 | 9.0 | 963 | 1.4514 | 0.1109 | 0.2538 | 0.0845 | 0.0411 | 0.0777 | 0.1563 | 0.1422 | 0.3009 | 0.3289 | 0.1791 | 0.2695 | 0.446 | 0.3386 | 0.5432 | 0.071 | 0.3544 | 0.0471 | 0.2562 | 0.0088 | 0.16 | 0.089 | 0.3307 |
| 1.4523 | 10.0 | 1070 | 1.4644 | 0.1133 | 0.2619 | 0.0875 | 0.0273 | 0.0882 | 0.1606 | 0.1461 | 0.3134 | 0.3443 | 0.1686 | 0.2875 | 0.4779 | 0.3493 | 0.5441 | 0.0458 | 0.3418 | 0.0312 | 0.2612 | 0.0294 | 0.2738 | 0.1106 | 0.3004 |
| 1.4523 | 11.0 | 1177 | 1.4809 | 0.1275 | 0.2867 | 0.1025 | 0.0281 | 0.0897 | 0.1935 | 0.1579 | 0.321 | 0.3499 | 0.1611 | 0.277 | 0.4931 | 0.351 | 0.5509 | 0.06 | 0.3633 | 0.0392 | 0.2446 | 0.0435 | 0.2462 | 0.1438 | 0.3444 |
| 1.4523 | 12.0 | 1284 | 1.3664 | 0.1275 | 0.2856 | 0.1075 | 0.0271 | 0.0913 | 0.1987 | 0.1684 | 0.3425 | 0.3685 | 0.1711 | 0.2964 | 0.5251 | 0.3792 | 0.5743 | 0.0735 | 0.3873 | 0.0572 | 0.2906 | 0.0123 | 0.2431 | 0.1154 | 0.3471 |
| 1.4523 | 13.0 | 1391 | 1.3664 | 0.1526 | 0.3248 | 0.1317 | 0.0353 | 0.1067 | 0.2337 | 0.1812 | 0.3475 | 0.3781 | 0.1792 | 0.3069 | 0.5293 | 0.4254 | 0.5937 | 0.0769 | 0.3911 | 0.0681 | 0.3013 | 0.035 | 0.2538 | 0.1574 | 0.3507 |
| 1.4523 | 14.0 | 1498 | 1.3835 | 0.151 | 0.3341 | 0.1152 | 0.0286 | 0.1038 | 0.2345 | 0.1763 | 0.3344 | 0.3599 | 0.1439 | 0.2864 | 0.5153 | 0.4127 | 0.5932 | 0.091 | 0.3481 | 0.066 | 0.2759 | 0.0391 | 0.2354 | 0.1464 | 0.3467 |
| 1.276 | 15.0 | 1605 | 1.3411 | 0.1543 | 0.335 | 0.1298 | 0.0292 | 0.1057 | 0.2407 | 0.183 | 0.3381 | 0.3701 | 0.1465 | 0.3124 | 0.5277 | 0.4324 | 0.605 | 0.088 | 0.3671 | 0.0771 | 0.2857 | 0.0228 | 0.2585 | 0.1515 | 0.3342 |
| 1.276 | 16.0 | 1712 | 1.2893 | 0.1661 | 0.3537 | 0.133 | 0.0342 | 0.1156 | 0.2717 | 0.1948 | 0.3721 | 0.4002 | 0.1815 | 0.3359 | 0.5539 | 0.454 | 0.6095 | 0.0686 | 0.3873 | 0.0894 | 0.3192 | 0.0365 | 0.3123 | 0.1823 | 0.3729 |
| 1.276 | 17.0 | 1819 | 1.2780 | 0.1755 | 0.3802 | 0.1453 | 0.0373 | 0.1345 | 0.2636 | 0.2135 | 0.3861 | 0.4123 | 0.1981 | 0.3469 | 0.5757 | 0.4376 | 0.609 | 0.0986 | 0.4203 | 0.0941 | 0.3424 | 0.0588 | 0.3169 | 0.1884 | 0.3729 |
| 1.276 | 18.0 | 1926 | 1.2497 | 0.1847 | 0.3934 | 0.1655 | 0.0461 | 0.1455 | 0.2714 | 0.2171 | 0.3867 | 0.4139 | 0.1944 | 0.3529 | 0.5756 | 0.4483 | 0.6036 | 0.1029 | 0.4329 | 0.11 | 0.3375 | 0.0513 | 0.3154 | 0.2112 | 0.38 |
| 1.1271 | 19.0 | 2033 | 1.2770 | 0.1816 | 0.3797 | 0.1593 | 0.0439 | 0.1411 | 0.2703 | 0.2128 | 0.3874 | 0.4184 | 0.2093 | 0.3576 | 0.5749 | 0.4459 | 0.605 | 0.0961 | 0.4304 | 0.0994 | 0.3277 | 0.0591 | 0.3508 | 0.2078 | 0.3782 |
| 1.1271 | 20.0 | 2140 | 1.2571 | 0.1942 | 0.3991 | 0.1714 | 0.0551 | 0.1523 | 0.293 | 0.2153 | 0.3873 | 0.4114 | 0.169 | 0.3529 | 0.5763 | 0.4696 | 0.6239 | 0.1069 | 0.4127 | 0.1192 | 0.3424 | 0.0523 | 0.3108 | 0.2229 | 0.3671 |
| 1.1271 | 21.0 | 2247 | 1.2282 | 0.2044 | 0.4048 | 0.1707 | 0.0499 | 0.1545 | 0.3103 | 0.2349 | 0.3961 | 0.4282 | 0.1929 | 0.3695 | 0.5866 | 0.4778 | 0.6257 | 0.1088 | 0.4304 | 0.12 | 0.3549 | 0.0889 | 0.3231 | 0.2264 | 0.4071 |
| 1.1271 | 22.0 | 2354 | 1.2395 | 0.2017 | 0.4202 | 0.1791 | 0.0511 | 0.1581 | 0.3156 | 0.2329 | 0.3961 | 0.4245 | 0.2095 | 0.3691 | 0.5774 | 0.4595 | 0.6077 | 0.1162 | 0.4456 | 0.125 | 0.3473 | 0.0944 | 0.3477 | 0.2134 | 0.3742 |
| 1.1271 | 23.0 | 2461 | 1.2307 | 0.2147 | 0.4289 | 0.1954 | 0.0565 | 0.1687 | 0.3282 | 0.2424 | 0.4024 | 0.4304 | 0.197 | 0.3702 | 0.5957 | 0.4719 | 0.6167 | 0.1274 | 0.4392 | 0.1368 | 0.3527 | 0.1057 | 0.3492 | 0.2317 | 0.3942 |
| 1.0136 | 24.0 | 2568 | 1.2164 | 0.2162 | 0.4366 | 0.1905 | 0.059 | 0.1676 | 0.3325 | 0.2435 | 0.4075 | 0.4355 | 0.1878 | 0.3812 | 0.5951 | 0.4729 | 0.6243 | 0.1291 | 0.4342 | 0.1433 | 0.3643 | 0.102 | 0.3585 | 0.234 | 0.3964 |
| 1.0136 | 25.0 | 2675 | 1.2099 | 0.2199 | 0.4399 | 0.1918 | 0.064 | 0.1717 | 0.3337 | 0.2482 | 0.4094 | 0.4356 | 0.1865 | 0.3821 | 0.5947 | 0.4834 | 0.6315 | 0.1333 | 0.443 | 0.141 | 0.3554 | 0.1059 | 0.3538 | 0.2359 | 0.3942 |
| 1.0136 | 26.0 | 2782 | 1.2123 | 0.2161 | 0.4387 | 0.186 | 0.0618 | 0.1664 | 0.3262 | 0.2468 | 0.4108 | 0.4358 | 0.1817 | 0.3803 | 0.599 | 0.4759 | 0.6243 | 0.1314 | 0.4506 | 0.1421 | 0.3567 | 0.0978 | 0.3431 | 0.2332 | 0.4044 |
| 1.0136 | 27.0 | 2889 | 1.2133 | 0.217 | 0.4372 | 0.1832 | 0.0612 | 0.1665 | 0.3297 | 0.2473 | 0.4104 | 0.4363 | 0.1845 | 0.3771 | 0.6027 | 0.4812 | 0.6185 | 0.1362 | 0.4506 | 0.1442 | 0.3634 | 0.0917 | 0.3554 | 0.2317 | 0.3938 |
| 1.0136 | 28.0 | 2996 | 1.2098 | 0.2195 | 0.4411 | 0.1871 | 0.06 | 0.1716 | 0.3288 | 0.2476 | 0.4124 | 0.4377 | 0.1818 | 0.3807 | 0.6065 | 0.48 | 0.6185 | 0.134 | 0.4557 | 0.1452 | 0.3603 | 0.0996 | 0.3554 | 0.2385 | 0.3987 |
| 0.9389 | 29.0 | 3103 | 1.2105 | 0.2195 | 0.4354 | 0.1875 | 0.061 | 0.1712 | 0.3295 | 0.2504 | 0.4115 | 0.436 | 0.1845 | 0.3805 | 0.6004 | 0.4809 | 0.6212 | 0.1317 | 0.4481 | 0.1469 | 0.3638 | 0.1004 | 0.3523 | 0.2374 | 0.3947 |
| 0.9389 | 30.0 | 3210 | 1.2104 | 0.2196 | 0.4356 | 0.188 | 0.0612 | 0.171 | 0.329 | 0.2502 | 0.4112 | 0.4367 | 0.1848 | 0.3804 | 0.6025 | 0.4813 | 0.6212 | 0.1332 | 0.4506 | 0.1463 | 0.3647 | 0.0995 | 0.3523 | 0.2377 | 0.3947 |

### Framework versions

- Transformers 4.55.4
- PyTorch 2.8.0+cu126
- Datasets 4.0.0
- Tokenizers 0.21.4