---
library_name: transformers
license: apache-2.0
base_model: microsoft/conditional-detr-resnet-50
tags:
- generated_from_trainer
model-index:
- name: conditional-detr-resnet-50_finetuned_cppe5
  results: []
---

# conditional-detr-resnet-50_finetuned_cppe5

This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on the CPPE-5 (medical personal protective equipment) dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0625
- Map: 0.2889
- Map 50: 0.5213
- Map 75: 0.2755
- Map Small: 0.1022
- Map Medium: 0.229
- Map Large: 0.4272
- Mar 1: 0.3222
- Mar 10: 0.4831
- Mar 100: 0.5016
- Mar Small: 0.2614
- Mar Medium: 0.4179
- Mar Large: 0.6587
- Map Coverall: 0.5718
- Mar 100 Coverall: 0.6954
- Map Face Shield: 0.2333
- Mar 100 Face Shield: 0.4986
- Map Gloves: 0.2051
- Mar 100 Gloves: 0.4271
- Map Goggles: 0.1597
- Mar 100 Goggles: 0.4629
- Map Mask: 0.2745
- Mar 100 Mask: 0.4239
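
To read the metrics above: `Map 50` and `Map 75` are mean average precision with a detection counted as correct when its intersection-over-union (IoU) with a ground-truth box exceeds 0.50 or 0.75, respectively. A minimal IoU computation (plain Python, `[x_min, y_min, x_max, y_max]` boxes) illustrates why the 0.75 threshold is stricter:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes [x_min, y_min, x_max, y_max]."""
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# A prediction shifted by a quarter of the box width:
print(iou([0, 0, 100, 100], [25, 0, 125, 100]))  # 0.6 -> a hit at IoU 0.50, a miss at 0.75
```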

## Model description

Conditional DETR with a ResNet-50 backbone, fine-tuned for object detection of personal protective equipment across five classes: coverall, face shield, gloves, goggles, and mask.

## Intended uses & limitations

The model is intended for detecting personal protective equipment in images. The per-size and per-class metrics above point to notable limitations: performance on small objects is weak (Map Small: 0.1022), and goggles are the hardest class (Map Goggles: 0.1597 vs. Map Coverall: 0.5718).
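
A minimal inference sketch using the Transformers object-detection pipeline. The repo id below is a placeholder (replace it with wherever this checkpoint is actually hosted), and `ppe_photo.jpg` is a hypothetical input image:

```python
from transformers import pipeline

# Placeholder repo id -- substitute the actual location of this checkpoint.
detector = pipeline(
    "object-detection",
    model="<your-username>/conditional-detr-resnet-50_finetuned_cppe5",
)

# Returns a list of dicts with "score", "label", and "box" keys.
results = detector("ppe_photo.jpg", threshold=0.5)
for det in results:
    print(det["label"], round(det["score"], 3), det["box"])
```

Raising `threshold` trades recall for precision; with this model's Map 50 of 0.52, a threshold around 0.5 is a reasonable starting point rather than a tuned value.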

## Training and evaluation data

The class names in the evaluation metrics (coverall, face shield, gloves, goggles, mask) correspond to the five categories of the CPPE-5 medical personal protective equipment dataset.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
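
The hyperparameters above can be expressed as a `TrainingArguments` configuration; this is a sketch mirroring the listed values, with `output_dir` as an assumption:

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; output_dir is an assumed name.
training_args = TrainingArguments(
    output_dir="conditional-detr-resnet-50_finetuned_cppe5",
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",          # AdamW, betas=(0.9, 0.999), eps=1e-08 by default
    lr_scheduler_type="cosine",
    num_train_epochs=30,
)
```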

### Training results

| Training Loss | Epoch | Step | Validation Loss | Map    | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1  | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
| 1.9147        | 1.0   | 106  | 1.8194          | 0.018  | 0.0388 | 0.0143 | 0.0021    | 0.0124     | 0.0239    | 0.0327 | 0.0914 | 0.1369  | 0.0511    | 0.1158     | 0.1784    | 0.066        | 0.2723           | 0.0016          | 0.0141              | 0.002      | 0.1633         | 0.0         | 0.0242          | 0.0204   | 0.2105       |
| 1.7779        | 2.0   | 212  | 1.6795          | 0.0258 | 0.0566 | 0.0213 | 0.0028    | 0.0269     | 0.045     | 0.0581 | 0.1398 | 0.2003  | 0.0736    | 0.1451     | 0.276     | 0.0948       | 0.4672           | 0.0018          | 0.0183              | 0.0034     | 0.2156         | 0.0         | 0.0161          | 0.0288   | 0.2842       |
| 1.5384        | 3.0   | 318  | 1.5198          | 0.0564 | 0.1102 | 0.0497 | 0.011     | 0.0461     | 0.0728    | 0.1032 | 0.2112 | 0.2611  | 0.0954    | 0.19       | 0.3563    | 0.1985       | 0.5672           | 0.0195          | 0.0803              | 0.006      | 0.2704         | 0.0026      | 0.0452          | 0.0554   | 0.3426       |
| 1.4396        | 4.0   | 424  | 1.4693          | 0.068  | 0.1268 | 0.0605 | 0.0083    | 0.0559     | 0.0844    | 0.1045 | 0.2328 | 0.2914  | 0.0959    | 0.2147     | 0.3865    | 0.2221       | 0.6308           | 0.0243          | 0.1169              | 0.0062     | 0.2503         | 0.0022      | 0.0887          | 0.0854   | 0.3703       |
| 1.346         | 5.0   | 530  | 1.3865          | 0.1092 | 0.211  | 0.0886 | 0.0299    | 0.0947     | 0.1306    | 0.147  | 0.3056 | 0.3498  | 0.1688    | 0.2721     | 0.4676    | 0.3744       | 0.6385           | 0.0243          | 0.2676              | 0.0141     | 0.299          | 0.017       | 0.1774          | 0.1161   | 0.3665       |
| 1.4945        | 6.0   | 636  | 1.3785          | 0.1192 | 0.2391 | 0.1004 | 0.0284    | 0.1022     | 0.14      | 0.1468 | 0.3166 | 0.3547  | 0.1625    | 0.2961     | 0.468     | 0.4067       | 0.6149           | 0.0358          | 0.269               | 0.0376     | 0.3347         | 0.0148      | 0.2032          | 0.1011   | 0.3517       |
| 1.3374        | 7.0   | 742  | 1.3494          | 0.1227 | 0.2446 | 0.1104 | 0.0224    | 0.0971     | 0.1504    | 0.1771 | 0.3492 | 0.3849  | 0.1914    | 0.3082     | 0.5158    | 0.4317       | 0.619            | 0.0343          | 0.3535              | 0.0329     | 0.3397         | 0.009       | 0.2403          | 0.1056   | 0.3718       |
| 1.2146        | 8.0   | 848  | 1.2911          | 0.1438 | 0.2789 | 0.1279 | 0.07      | 0.1074     | 0.2004    | 0.1887 | 0.3793 | 0.4099  | 0.2098    | 0.3261     | 0.5631    | 0.4433       | 0.6128           | 0.0426          | 0.369               | 0.073      | 0.3935         | 0.0302      | 0.2903          | 0.1298   | 0.3837       |
| 1.2948        | 9.0   | 954  | 1.2916          | 0.142  | 0.2761 | 0.1166 | 0.0521    | 0.1147     | 0.1801    | 0.1982 | 0.3779 | 0.4107  | 0.1976    | 0.3197     | 0.5616    | 0.4229       | 0.6103           | 0.0509          | 0.3718              | 0.0672     | 0.3935         | 0.0368      | 0.2871          | 0.1321   | 0.3909       |
| 1.2078        | 10.0  | 1060 | 1.2753          | 0.1507 | 0.3144 | 0.1235 | 0.0328    | 0.1191     | 0.2269    | 0.2093 | 0.3851 | 0.4132  | 0.1723    | 0.332      | 0.5746    | 0.4389       | 0.6256           | 0.0735          | 0.369               | 0.0662     | 0.3714         | 0.0271      | 0.3177          | 0.1477   | 0.3823       |
| 1.2587        | 11.0  | 1166 | 1.2508          | 0.1734 | 0.3524 | 0.1442 | 0.0515    | 0.1426     | 0.2446    | 0.2113 | 0.3937 | 0.4257  | 0.1954    | 0.3409     | 0.5844    | 0.463        | 0.6436           | 0.0627          | 0.3859              | 0.1096     | 0.3799         | 0.0553      | 0.3387          | 0.1762   | 0.3804       |
| 1.1698        | 12.0  | 1272 | 1.1786          | 0.2005 | 0.3877 | 0.1683 | 0.1016    | 0.1568     | 0.2834    | 0.238  | 0.424  | 0.4521  | 0.2352    | 0.3716     | 0.5995    | 0.506        | 0.6631           | 0.1129          | 0.4676              | 0.1215     | 0.4095         | 0.0575      | 0.3081          | 0.2048   | 0.412        |
| 1.2067        | 13.0  | 1378 | 1.2097          | 0.1984 | 0.387  | 0.1682 | 0.0614    | 0.1482     | 0.2911    | 0.229  | 0.4181 | 0.4454  | 0.217     | 0.3519     | 0.6179    | 0.5025       | 0.6697           | 0.103           | 0.4366              | 0.1196     | 0.3899         | 0.0618      | 0.3565          | 0.205    | 0.3742       |
| 1.0912        | 14.0  | 1484 | 1.1591          | 0.2216 | 0.419  | 0.2006 | 0.0819    | 0.1857     | 0.3055    | 0.2625 | 0.4402 | 0.4714  | 0.2437    | 0.3902     | 0.615     | 0.5025       | 0.6759           | 0.1505          | 0.469               | 0.13       | 0.4055         | 0.083       | 0.3887          | 0.2422   | 0.4177       |
| 1.0154        | 15.0  | 1590 | 1.1474          | 0.2306 | 0.4327 | 0.2099 | 0.1035    | 0.1847     | 0.333     | 0.2706 | 0.4434 | 0.4683  | 0.2241    | 0.3922     | 0.6141    | 0.5212       | 0.6672           | 0.159           | 0.4718              | 0.1442     | 0.3935         | 0.0876      | 0.3935          | 0.2411   | 0.4153       |
| 1.0772        | 16.0  | 1696 | 1.1345          | 0.2388 | 0.4498 | 0.2175 | 0.1053    | 0.1906     | 0.3505    | 0.2811 | 0.4483 | 0.4791  | 0.2289    | 0.3974     | 0.6313    | 0.5251       | 0.6692           | 0.1643          | 0.4845              | 0.1485     | 0.4131         | 0.1073      | 0.4194          | 0.2489   | 0.4096       |
| 0.9185        | 17.0  | 1802 | 1.1253          | 0.2409 | 0.4471 | 0.2173 | 0.1159    | 0.1991     | 0.3476    | 0.279  | 0.4538 | 0.4783  | 0.253     | 0.4055     | 0.6226    | 0.5242       | 0.6559           | 0.1469          | 0.4606              | 0.1676     | 0.4211         | 0.1137      | 0.429           | 0.2523   | 0.4249       |
| 1.0736        | 18.0  | 1908 | 1.1187          | 0.2507 | 0.4656 | 0.2378 | 0.1003    | 0.2144     | 0.3554    | 0.2878 | 0.4592 | 0.4858  | 0.2265    | 0.426      | 0.6152    | 0.5353       | 0.6785           | 0.1635          | 0.4606              | 0.1699     | 0.4387         | 0.1412      | 0.4339          | 0.2437   | 0.4172       |
| 1.0021        | 19.0  | 2014 | 1.0987          | 0.2608 | 0.4736 | 0.2522 | 0.1262    | 0.2172     | 0.3734    | 0.2981 | 0.4526 | 0.4809  | 0.237     | 0.4049     | 0.6321    | 0.5396       | 0.6713           | 0.1529          | 0.4803              | 0.1888     | 0.4201         | 0.1616      | 0.421           | 0.2611   | 0.412        |
| 1.0613        | 20.0  | 2120 | 1.1144          | 0.2563 | 0.4785 | 0.2276 | 0.0917    | 0.2102     | 0.3691    | 0.2895 | 0.4472 | 0.4732  | 0.2493    | 0.3923     | 0.6189    | 0.5421       | 0.6621           | 0.1623          | 0.4521              | 0.1872     | 0.4317         | 0.1383      | 0.4145          | 0.2515   | 0.4057       |
| 1.0576        | 21.0  | 2226 | 1.0888          | 0.2827 | 0.5059 | 0.271  | 0.1041    | 0.2359     | 0.4061    | 0.3106 | 0.4658 | 0.4922  | 0.2639    | 0.4076     | 0.6432    | 0.5516       | 0.6831           | 0.2145          | 0.5056              | 0.1897     | 0.4276         | 0.1896      | 0.4274          | 0.2678   | 0.4172       |
| 0.8949        | 22.0  | 2332 | 1.0893          | 0.2762 | 0.5078 | 0.2489 | 0.1207    | 0.2241     | 0.4014    | 0.3038 | 0.4662 | 0.489   | 0.2413    | 0.4026     | 0.6519    | 0.5583       | 0.6856           | 0.1908          | 0.4789              | 0.2028     | 0.4226         | 0.1652      | 0.4403          | 0.2635   | 0.4177       |
| 1.0853        | 23.0  | 2438 | 1.0747          | 0.275  | 0.501  | 0.2602 | 0.1144    | 0.2267     | 0.3966    | 0.3049 | 0.4722 | 0.4958  | 0.2632    | 0.412      | 0.646     | 0.5569       | 0.6908           | 0.1743          | 0.493               | 0.202      | 0.4281         | 0.1771      | 0.4581          | 0.2649   | 0.4091       |
| 1.0098        | 24.0  | 2544 | 1.0623          | 0.28   | 0.5081 | 0.263  | 0.1136    | 0.2199     | 0.4101    | 0.311  | 0.479  | 0.5011  | 0.2629    | 0.4175     | 0.653     | 0.5705       | 0.6949           | 0.2057          | 0.5014              | 0.2023     | 0.4256         | 0.1551      | 0.4613          | 0.2665   | 0.4225       |
| 0.8466        | 25.0  | 2650 | 1.0682          | 0.2862 | 0.514  | 0.2748 | 0.105     | 0.2306     | 0.4217    | 0.3173 | 0.4813 | 0.5009  | 0.2669    | 0.4195     | 0.6565    | 0.5691       | 0.6944           | 0.221           | 0.4972              | 0.2088     | 0.4241         | 0.1613      | 0.4645          | 0.271    | 0.4244       |
| 0.9067        | 26.0  | 2756 | 1.0638          | 0.2881 | 0.5225 | 0.276  | 0.1085    | 0.2309     | 0.4219    | 0.3183 | 0.4802 | 0.4991  | 0.2475    | 0.4205     | 0.6539    | 0.5723       | 0.6938           | 0.232           | 0.4944              | 0.2051     | 0.4246         | 0.158       | 0.4597          | 0.2734   | 0.423        |
| 0.8462        | 27.0  | 2862 | 1.0629          | 0.288  | 0.5182 | 0.2679 | 0.1013    | 0.2252     | 0.4272    | 0.3191 | 0.4807 | 0.5004  | 0.261     | 0.4176     | 0.6546    | 0.57         | 0.6933           | 0.2337          | 0.4986              | 0.2057     | 0.4286         | 0.158       | 0.4565          | 0.2728   | 0.4249       |
| 0.8771        | 28.0  | 2968 | 1.0642          | 0.2901 | 0.524  | 0.2781 | 0.1028    | 0.2299     | 0.4267    | 0.3215 | 0.4821 | 0.5009  | 0.2603    | 0.4192     | 0.6555    | 0.5725       | 0.6949           | 0.2346          | 0.4972              | 0.2055     | 0.4251         | 0.1634      | 0.4629          | 0.2744   | 0.4244       |
| 0.8022        | 29.0  | 3074 | 1.0626          | 0.2885 | 0.5212 | 0.2758 | 0.1017    | 0.229      | 0.427     | 0.322  | 0.4829 | 0.5013  | 0.2589    | 0.4183     | 0.6588    | 0.5718       | 0.6959           | 0.2309          | 0.4972              | 0.206      | 0.4276         | 0.1595      | 0.4613          | 0.2744   | 0.4244       |
| 0.8245        | 30.0  | 3180 | 1.0625          | 0.2889 | 0.5213 | 0.2755 | 0.1022    | 0.229      | 0.4272    | 0.3222 | 0.4831 | 0.5016  | 0.2614    | 0.4179     | 0.6587    | 0.5718       | 0.6954           | 0.2333          | 0.4986              | 0.2051     | 0.4271         | 0.1597      | 0.4629          | 0.2745   | 0.4239       |


### Framework versions

- Transformers 4.52.4
- PyTorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.2