---
library_name: transformers
license: apache-2.0
base_model: microsoft/conditional-detr-resnet-50
tags:
- generated_from_trainer
model-index:
- name: detr_finetuned_cppe5
  results: []
---


# detr_finetuned_cppe5

This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on the CPPE-5 dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1482
- Map: 0.3013
- Map 50: 0.6037
- Map 75: 0.2625
- Map Small: 0.0935
- Map Medium: 0.2354
- Map Large: 0.4705
- Mar 1: 0.293
- Mar 10: 0.4358
- Mar 100: 0.4539
- Mar Small: 0.2086
- Mar Medium: 0.3954
- Mar Large: 0.6332
- Map Coverall: 0.5417
- Mar 100 Coverall: 0.6608
- Map Face Shield: 0.3015
- Mar 100 Face Shield: 0.4709
- Map Gloves: 0.1881
- Mar 100 Gloves: 0.3589
- Map Goggles: 0.1751
- Mar 100 Goggles: 0.3723
- Map Mask: 0.2998
- Mar 100 Mask: 0.4067

## Model description

This model is a [Conditional DETR](https://huggingface.co/microsoft/conditional-detr-resnet-50) object detector with a ResNet-50 backbone, fine-tuned to detect five classes of personal protective equipment: coverall, face shield, gloves, goggles, and mask.

## Intended uses & limitations

The model is intended for detecting personal protective equipment (coveralls, face shields, gloves, goggles, and masks) in images. Detection quality varies considerably by class and by object size: coveralls are detected most reliably (mAP 0.54), gloves and goggles remain difficult (mAP below 0.19), and small objects score far lower (mAP 0.09) than large ones (mAP 0.47). A minimal inference sketch is shown below.
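The sketch assumes the checkpoint is available locally or on the Hub under the hypothetical name `detr_finetuned_cppe5` and that a test image `example.jpg` exists; it uses the standard `transformers` object-detection API.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

# Placeholder checkpoint name; replace with the actual repo id or local directory.
checkpoint = "detr_finetuned_cppe5"

image_processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("example.jpg")  # any RGB image containing PPE

inputs = image_processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into (score, label, box) triples above a confidence threshold.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
results = image_processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```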

## Training and evaluation data

The model was fine-tuned and evaluated on the CPPE-5 dataset of medical personal protective equipment, which provides bounding-box annotations for the five classes listed above. The per-class and per-size metrics reported here were computed on its evaluation split.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch reproducing them follows the list):
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (fused Torch implementation, `adamw_torch_fused`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
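
As a hedged sketch (not the exact training script), these settings map onto `transformers.TrainingArguments` roughly as follows; the `output_dir` and any options not listed above are illustrative assumptions.

```python
from transformers import TrainingArguments

# Sketch of TrainingArguments matching the hyperparameters listed above.
# output_dir is a placeholder; unlisted options keep their library defaults.
training_args = TrainingArguments(
    output_dir="detr_finetuned_cppe5",
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch_fused",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    num_train_epochs=30,
)
```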

### Training results

| Training Loss | Epoch | Step | Validation Loss | Map    | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1  | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
| No log        | 1.0   | 107  | 1.3103          | 0.2366 | 0.5251 | 0.1954 | 0.0483    | 0.1595     | 0.4051    | 0.2415 | 0.372  | 0.3916  | 0.0973    | 0.3279     | 0.5894    | 0.5019       | 0.6234           | 0.1952          | 0.3873              | 0.1421     | 0.3067         | 0.1166      | 0.3123          | 0.2272   | 0.328        |
| No log        | 2.0   | 214  | 1.2553          | 0.2407 | 0.5311 | 0.1895 | 0.0818    | 0.1888     | 0.3807    | 0.2457 | 0.4052 | 0.4239  | 0.2015    | 0.3461     | 0.6037    | 0.4952       | 0.6351           | 0.2002          | 0.4405              | 0.1491     | 0.317          | 0.1145      | 0.3508          | 0.2445   | 0.376        |
| No log        | 3.0   | 321  | 1.2643          | 0.2545 | 0.5418 | 0.2059 | 0.0836    | 0.1862     | 0.4223    | 0.2582 | 0.4086 | 0.4288  | 0.1915    | 0.354      | 0.6306    | 0.4996       | 0.6315           | 0.2537          | 0.4911              | 0.1503     | 0.3174         | 0.1181      | 0.34            | 0.2506   | 0.364        |
| No log        | 4.0   | 428  | 1.2615          | 0.2529 | 0.5449 | 0.2072 | 0.0696    | 0.1878     | 0.4059    | 0.2609 | 0.3967 | 0.4207  | 0.1807    | 0.3445     | 0.6116    | 0.5063       | 0.6261           | 0.2257          | 0.438               | 0.1622     | 0.3304         | 0.1233      | 0.3385          | 0.247    | 0.3707       |
| 0.809         | 5.0   | 535  | 1.2607          | 0.2579 | 0.531  | 0.2227 | 0.073     | 0.2017     | 0.3961    | 0.2645 | 0.4083 | 0.4268  | 0.1903    | 0.372      | 0.5884    | 0.4998       | 0.6248           | 0.2598          | 0.4519              | 0.1353     | 0.304          | 0.1357      | 0.3677          | 0.2589   | 0.3858       |
| 0.809         | 6.0   | 642  | 1.2593          | 0.2519 | 0.5188 | 0.1991 | 0.0871    | 0.1865     | 0.3902    | 0.2603 | 0.3996 | 0.4208  | 0.2027    | 0.3592     | 0.5902    | 0.493        | 0.6302           | 0.2515          | 0.443               | 0.1459     | 0.3241         | 0.1331      | 0.3538          | 0.2359   | 0.3529       |
| 0.809         | 7.0   | 749  | 1.2608          | 0.2343 | 0.5174 | 0.1746 | 0.0704    | 0.1827     | 0.359     | 0.2465 | 0.3923 | 0.4159  | 0.1648    | 0.3577     | 0.5925    | 0.4923       | 0.6392           | 0.2226          | 0.419               | 0.143      | 0.3121         | 0.0839      | 0.3462          | 0.2296   | 0.3631       |
| 0.809         | 8.0   | 856  | 1.2420          | 0.2518 | 0.5359 | 0.2112 | 0.0678    | 0.1871     | 0.3933    | 0.2665 | 0.4027 | 0.4247  | 0.1816    | 0.3587     | 0.6093    | 0.5093       | 0.6333           | 0.2541          | 0.457               | 0.161      | 0.304          | 0.1149      | 0.3708          | 0.2194   | 0.3582       |
| 0.809         | 9.0   | 963  | 1.2877          | 0.2389 | 0.5124 | 0.1925 | 0.0765    | 0.1861     | 0.3552    | 0.2418 | 0.3911 | 0.4139  | 0.1845    | 0.3416     | 0.5759    | 0.4878       | 0.6405           | 0.2406          | 0.457               | 0.16       | 0.3219         | 0.0767      | 0.3154          | 0.2292   | 0.3347       |
| 0.8045        | 10.0  | 1070 | 1.2622          | 0.2465 | 0.5363 | 0.1849 | 0.0681    | 0.1901     | 0.3804    | 0.2551 | 0.3944 | 0.4152  | 0.1688    | 0.347      | 0.5884    | 0.4993       | 0.6248           | 0.235           | 0.419               | 0.1353     | 0.325          | 0.1121      | 0.3323          | 0.2506   | 0.3751       |
| 0.8045        | 11.0  | 1177 | 1.2450          | 0.2567 | 0.5325 | 0.212  | 0.0769    | 0.2089     | 0.3991    | 0.2561 | 0.4013 | 0.4236  | 0.1756    | 0.38       | 0.583     | 0.5152       | 0.6311           | 0.2205          | 0.4228              | 0.1471     | 0.3152         | 0.1476      | 0.3892          | 0.2528   | 0.3596       |
| 0.8045        | 12.0  | 1284 | 1.2581          | 0.2482 | 0.5347 | 0.1968 | 0.0744    | 0.1923     | 0.3948    | 0.2679 | 0.4031 | 0.4202  | 0.1677    | 0.365      | 0.5949    | 0.511        | 0.6423           | 0.2382          | 0.443               | 0.1513     | 0.3107         | 0.1024      | 0.3523          | 0.2383   | 0.3524       |
| 0.8045        | 13.0  | 1391 | 1.2723          | 0.2523 | 0.5343 | 0.1932 | 0.0752    | 0.1776     | 0.4016    | 0.2564 | 0.3901 | 0.413   | 0.1807    | 0.3553     | 0.5787    | 0.5103       | 0.6374           | 0.2291          | 0.4165              | 0.1744     | 0.3304         | 0.1068      | 0.3308          | 0.2407   | 0.3502       |
| 0.8045        | 14.0  | 1498 | 1.2605          | 0.2557 | 0.5429 | 0.2066 | 0.064     | 0.2004     | 0.4036    | 0.26   | 0.3913 | 0.4058  | 0.1734    | 0.349      | 0.5638    | 0.506        | 0.6284           | 0.2267          | 0.4101              | 0.1439     | 0.283          | 0.1458      | 0.3431          | 0.2562   | 0.3644       |
| 0.7483        | 15.0  | 1605 | 1.1914          | 0.2745 | 0.5581 | 0.2287 | 0.0771    | 0.2257     | 0.436     | 0.2752 | 0.418  | 0.4318  | 0.182     | 0.3714     | 0.5948    | 0.5256       | 0.6437           | 0.265           | 0.4506              | 0.1654     | 0.3304         | 0.1388      | 0.3508          | 0.2776   | 0.3836       |
| 0.7483        | 16.0  | 1712 | 1.2263          | 0.2707 | 0.5439 | 0.2323 | 0.0806    | 0.2134     | 0.4229    | 0.271  | 0.4093 | 0.4281  | 0.2211    | 0.353      | 0.5989    | 0.5228       | 0.6329           | 0.2682          | 0.438               | 0.1485     | 0.3138         | 0.1464      | 0.3769          | 0.2675   | 0.3787       |
| 0.7483        | 17.0  | 1819 | 1.1937          | 0.2768 | 0.5753 | 0.2219 | 0.0905    | 0.2061     | 0.4351    | 0.2744 | 0.4158 | 0.4374  | 0.1936    | 0.3639     | 0.6148    | 0.5191       | 0.6464           | 0.2555          | 0.4582              | 0.175      | 0.3348         | 0.1594      | 0.3646          | 0.2752   | 0.3831       |
| 0.7483        | 18.0  | 1926 | 1.1891          | 0.2819 | 0.5779 | 0.2195 | 0.0856    | 0.2146     | 0.4421    | 0.2804 | 0.4272 | 0.4425  | 0.1994    | 0.3761     | 0.6186    | 0.528        | 0.6491           | 0.2863          | 0.4797              | 0.1772     | 0.3366         | 0.1458      | 0.3646          | 0.2721   | 0.3822       |
| 0.6667        | 19.0  | 2033 | 1.1921          | 0.2873 | 0.5852 | 0.2319 | 0.0904    | 0.2222     | 0.4537    | 0.2785 | 0.4251 | 0.4483  | 0.1838    | 0.4064     | 0.6159    | 0.539        | 0.6568           | 0.2849          | 0.4747              | 0.1765     | 0.3411         | 0.1539      | 0.3754          | 0.2821   | 0.3938       |
| 0.6667        | 20.0  | 2140 | 1.1818          | 0.293  | 0.5831 | 0.2489 | 0.091     | 0.2229     | 0.4721    | 0.2877 | 0.4307 | 0.448   | 0.2008    | 0.3913     | 0.6292    | 0.5318       | 0.6536           | 0.3021          | 0.4886              | 0.1806     | 0.3496         | 0.1786      | 0.3662          | 0.2721   | 0.3822       |
| 0.6667        | 21.0  | 2247 | 1.1679          | 0.2914 | 0.5889 | 0.2375 | 0.0971    | 0.2219     | 0.4566    | 0.2834 | 0.4302 | 0.4503  | 0.1951    | 0.3953     | 0.6284    | 0.5388       | 0.6577           | 0.2947          | 0.4608              | 0.1848     | 0.3567         | 0.1545      | 0.3815          | 0.2844   | 0.3947       |
| 0.6667        | 22.0  | 2354 | 1.1684          | 0.2901 | 0.5869 | 0.2451 | 0.0909    | 0.2182     | 0.4556    | 0.284  | 0.4314 | 0.4457  | 0.2025    | 0.375      | 0.6275    | 0.5342       | 0.6568           | 0.2854          | 0.4608              | 0.1817     | 0.3473         | 0.1645      | 0.3677          | 0.2848   | 0.396        |
| 0.6667        | 23.0  | 2461 | 1.1606          | 0.2927 | 0.5861 | 0.2426 | 0.0869    | 0.2221     | 0.4582    | 0.2868 | 0.4278 | 0.4471  | 0.2063    | 0.3833     | 0.6147    | 0.546        | 0.6568           | 0.292           | 0.4709              | 0.1828     | 0.3522         | 0.1601      | 0.3646          | 0.2825   | 0.3911       |
| 0.5873        | 24.0  | 2568 | 1.1582          | 0.2924 | 0.5898 | 0.2543 | 0.0918    | 0.2298     | 0.4546    | 0.2898 | 0.4311 | 0.4497  | 0.2008    | 0.3965     | 0.6166    | 0.5473       | 0.6653           | 0.2906          | 0.4747              | 0.1811     | 0.346          | 0.149       | 0.3662          | 0.2939   | 0.3964       |
| 0.5873        | 25.0  | 2675 | 1.1534          | 0.2987 | 0.6015 | 0.2658 | 0.0897    | 0.2329     | 0.4705    | 0.2899 | 0.4312 | 0.4515  | 0.2032    | 0.3874     | 0.6358    | 0.5507       | 0.6653           | 0.2975          | 0.4797              | 0.186      | 0.3536         | 0.1632      | 0.3569          | 0.2963   | 0.4018       |
| 0.5873        | 26.0  | 2782 | 1.1536          | 0.2976 | 0.5984 | 0.2627 | 0.093     | 0.2331     | 0.4708    | 0.2882 | 0.4335 | 0.4536  | 0.2074    | 0.3973     | 0.6307    | 0.5414       | 0.6604           | 0.2947          | 0.4785              | 0.1831     | 0.3562         | 0.1733      | 0.3708          | 0.2955   | 0.4022       |
| 0.5873        | 27.0  | 2889 | 1.1492          | 0.2992 | 0.6031 | 0.2633 | 0.0931    | 0.2331     | 0.4692    | 0.2913 | 0.4358 | 0.4528  | 0.2119    | 0.3922     | 0.6313    | 0.5442       | 0.6617           | 0.2975          | 0.4785              | 0.185      | 0.3558         | 0.174       | 0.3662          | 0.2951   | 0.4018       |
| 0.5873        | 28.0  | 2996 | 1.1491          | 0.3007 | 0.6027 | 0.2606 | 0.0927    | 0.2337     | 0.4742    | 0.2928 | 0.4342 | 0.4528  | 0.2026    | 0.3965     | 0.6313    | 0.5406       | 0.6581           | 0.2982          | 0.4709              | 0.1881     | 0.358          | 0.1799      | 0.3738          | 0.2965   | 0.4031       |
| 0.5379        | 29.0  | 3103 | 1.1484          | 0.301  | 0.6039 | 0.2632 | 0.0932    | 0.2356     | 0.4706    | 0.2935 | 0.4356 | 0.4532  | 0.2035    | 0.395      | 0.6327    | 0.541        | 0.6599           | 0.3019          | 0.4696              | 0.1885     | 0.3589         | 0.1739      | 0.3723          | 0.2995   | 0.4053       |
| 0.5379        | 30.0  | 3210 | 1.1482          | 0.3013 | 0.6037 | 0.2625 | 0.0935    | 0.2354     | 0.4705    | 0.293  | 0.4358 | 0.4539  | 0.2086    | 0.3954     | 0.6332    | 0.5417       | 0.6608           | 0.3015          | 0.4709              | 0.1881     | 0.3589         | 0.1751      | 0.3723          | 0.2998   | 0.4067       |
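
The Map and Mar columns are COCO-style detection metrics: mean average precision averaged over IoU thresholds 0.50:0.95 (plus the 0.50 and 0.75 cut-offs and small/medium/large object sizes) and mean average recall at 1, 10, and 100 detections per image. The card does not state which tool computed them; the sketch below uses `torchmetrics` as one assumed way to obtain the same metric names from model predictions and ground-truth boxes.

```python
import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision

# Toy example: one image with a single predicted box compared to one ground-truth box.
metric = MeanAveragePrecision(box_format="xyxy")

preds = [{
    "boxes": torch.tensor([[10.0, 10.0, 100.0, 100.0]]),  # xmin, ymin, xmax, ymax
    "scores": torch.tensor([0.9]),
    "labels": torch.tensor([0]),
}]
targets = [{
    "boxes": torch.tensor([[12.0, 8.0, 98.0, 105.0]]),
    "labels": torch.tensor([0]),
}]

metric.update(preds, targets)
# Returns a dict with map, map_50, map_75, map_small/medium/large,
# mar_1, mar_10, mar_100, and size-bucketed recall, matching the columns above.
print(metric.compute())
```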


### Framework versions

- Transformers 4.55.4
- PyTorch 2.8.0+cu126
- Datasets 4.0.0
- Tokenizers 0.21.4