ironmanfcf committed on
Commit 2910c41 · verified · 1 Parent(s): 7a86682

Update README.md

Files changed (1):
  1. README.md +93 -364
README.md CHANGED
@@ -1,50 +1,85 @@
  ---
  license: cc-by-nc-4.0
  ---
- # HazyDet: Open-Source Benchmark for Drone-View Object Detection With Depth-Cues in Hazy Scenes
- This repository is the official implementation of HazyDet.
-
- - [HazyDet](#hazydet)
- - [Leaderboard and Model Zoo](#leaderboard-and-model-zoo)
- - [Detectors](#detectors)
- - [Dehazing](#dehazing)
- - [DeCoDet](#decodet)
- - [Installation](#installation)
- - [Step 1: Create a conda environment](#step-1-create-a-conda-environment)
- - [Step 2: Install PyTorch](#step-2-install-pytorch)
- - [Step 3: Install OpenMMLab 2.x Codebases](#step-3-install-openmmlab-2x-codebases)
- - [Step 4: Install `HazyDet`](#step-4-install-hazydet)
- - [Training](#training)
- - [Inference](#inference)
- - [Depth Maps](#depth-maps)
- - [Acknowledgement](#acknowledgement)
- - [Citation](#citation)
 
  ## HazyDet

- ![HazyDet](./docs/dataset_samples.jpg)

- You can **download** our HazyDet dataset from [**Baidu Netdisk**](https://pan.baidu.com/s/1KKWqTbG1oBAdlIZrTzTceQ?pwd=grok) or [**OneDrive**](https://1drv.ms/f/s!AmElF7K4aY9p83CqLdm4N-JSo9rg?e=H06ghJ).
  For both training and inference, the following dataset structure is required:

  ```
- HazyDet
- |-- train
-     |-- clean images
-     |-- hazy images
-     |-- labels
- |-- val
-     |-- clean images
-     |-- hazy images
-     |-- labels
- |-- test
-     |-- clean images
-     |-- hazy images
-     |-- labels
- |-- RDDTS
-     |-- hazy images
-     |-- labels
  ```

  **Note: The password for both Baidu Netdisk and OneDrive is `grok`.**
@@ -57,270 +92,28 @@ All the weight files in the model zoo can be accessed on [Baidu Cloud](https://p
  ### Detectors

- <table>
-   <tr> <td>Model</td> <td>Backbone</td> <td>#Params (M)</td> <td>GFLOPs</td> <td>mAP on<br>Test-set</td> <td>mAP on<br>RDDTS</td> <td>Config</td> <td>Weight</td> </tr>
-   <tr> <td>One Stage</td> <td></td> <td></td> <td></td> <td></td> <td></td> <td></td> <td></td> </tr>
-   <tr> <td>YOLOv3</td> <td>Darknet53</td> <td>61.63</td> <td>20.19</td> <td>35.0</td> <td>19.2</td> <td><a href="./configs/yolov3/yolov3_d53_8xb8-ms-416-273e_hazydet.py">config</a></td> <td><a href="https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok">weight</a></td> </tr>
-   <tr> <td>GFL</td> <td>ResNet50</td> <td>32.26</td> <td>198.65</td> <td>36.8</td> <td>13.9</td> <td><a href="./configs/gfl/gfl_r50_fpn_1x_hazydet.py">config</a></td> <td><a href="https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok">weight</a></td> </tr>
-   <tr> <td>YOLOX</td> <td>CSPDarkNet</td> <td>8.94</td> <td>13.32</td> <td>42.3</td> <td>24.7</td> <td><a href="./configs/yolox/yolox_s_8xb8-300e_hazydet.py">config</a></td> <td><a href="https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok">weight</a></td> </tr>
-   <tr> <td>RepPoints</td> <td>ResNet50</td> <td>36.83</td> <td>184.32</td> <td>43.8</td> <td>21.3</td> <td><a href="./configs/reppoints/reppoints-moment_r50_fpn-gn_head-gn_1x_hazydet.py">config</a></td> <td><a href="https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok">weight</a></td> </tr>
-   <tr> <td>FCOS</td> <td>ResNet50</td> <td>32.11</td> <td>191.48</td> <td>45.9</td> <td>22.8</td> <td><a href="./configs/fcos/fcos_r50_1x_hazydet.py">config</a></td> <td><a href="https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok">weight</a></td> </tr>
-   <tr> <td>CenterNet</td> <td>ResNet50</td> <td>32.11</td> <td>191.49</td> <td>47.2</td> <td>23.8</td> <td><a href="./configs/centernet/centernet-update_r50-caffe_fpn_ms-1x_hazydet.py">config</a></td> <td><a href="https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok">weight</a></td> </tr>
-   <tr> <td>ATSS</td> <td>ResNet50</td> <td>32.12</td> <td>195.58</td> <td>50.4</td> <td>25.1</td> <td><a href="./configs/atts/atss_r50_fpn_1x_hazydet.py">config</a></td> <td><a href="https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok">weight</a></td> </tr>
-   <tr> <td>DDOD</td> <td>ResNet50</td> <td>32.20</td> <td>173.05</td> <td>50.7</td> <td>26.1</td> <td><a href="./configs/ddod/ddod_r50_fpn_1x_hazydet.py">config</a></td> <td><a href="https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok">weight</a></td> </tr>
-   <tr> <td>VFNet</td> <td>ResNet50</td> <td>32.89</td> <td>187.39</td> <td>51.1</td> <td>25.6</td> <td><a href="./configs/vfnet/vfnet_r50_fpn_1x_hazydet.py">config</a></td> <td><a href="https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok">weight</a></td> </tr>
-   <tr> <td>TOOD</td> <td>ResNet50</td> <td>32.02</td> <td>192.51</td> <td>51.4</td> <td>25.8</td> <td><a href="./configs/tood/tood_r50_fpn_1x_hazydet.py">config</a></td> <td><a href="https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok">weight</a></td> </tr>
-   <tr> <td>Two Stage</td> <td></td> <td></td> <td></td> <td></td> <td></td> <td></td> <td></td> </tr>
-   <tr> <td>Sparse RCNN</td> <td>ResNet50</td> <td>108.54</td> <td>147.45</td> <td>27.7</td> <td>10.4</td> <td><a href="./configs/sparse_rcnn/sparse-rcnn_r50_fpn_1x_hazydet.py">config</a></td> <td><a href="https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok">weight</a></td> </tr>
-   <tr> <td>Dynamic RCNN</td> <td>ResNet50</td> <td>41.35</td> <td>201.72</td> <td>47.6</td> <td>22.5</td> <td><a href="./configs/dynamic_rcnn/dynamic-rcnn_r50_fpn_1x_hazydet.py">config</a></td> <td><a href="https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok">weight</a></td> </tr>
-   <tr> <td>Faster RCNN</td> <td>ResNet50</td> <td>41.35</td> <td>201.72</td> <td>48.7</td> <td>23.6</td> <td><a href="./configs/faster_rcnn/faster-rcnn_r50_fpn_1x_hazydet.py">config</a></td> <td><a href="https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok">weight</a></td> </tr>
-   <tr> <td>Libra RCNN</td> <td>ResNet50</td> <td>41.62</td> <td>209.92</td> <td>49.0</td> <td>23.7</td> <td><a href="./configs/libra_rcnn/libra-faster-rcnn_r50_fpn_1x_hazydet.py">config</a></td> <td><a href="https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok">weight</a></td> </tr>
-   <tr> <td>Grid RCNN</td> <td>ResNet50</td> <td>64.46</td> <td>317.44</td> <td>50.5</td> <td>25.2</td> <td><a href="./configs/grid_rcnn/grid-rcnn_r50_fpn_gn-head_1x_hazydet.py">config</a></td> <td><a href="https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok">weight</a></td> </tr>
-   <tr> <td>Cascade RCNN</td> <td>ResNet50</td> <td>69.15</td> <td>230.40</td> <td>51.6</td> <td>26.0</td> <td><a href="./configs/cascade_rcnn/cascade-rcnn_r50_fpn_1x_hazydet.py">config</a></td> <td><a href="https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok">weight</a></td> </tr>
-   <tr> <td>End-to-End</td> <td></td> <td></td> <td></td> <td></td> <td></td> <td></td> <td></td> </tr>
-   <tr> <td>Conditional DETR</td> <td>ResNet50</td> <td>43.55</td> <td>94.17</td> <td>30.5</td> <td>11.7</td> <td><a href="./configs/conditional_detr/conditional-detr_r50_8xb2-50e_hazydet.py">config</a></td> <td><a href="https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok">weight</a></td> </tr>
-   <tr> <td>DAB DETR</td> <td>ResNet50</td> <td>43.70</td> <td>97.02</td> <td>31.3</td> <td>11.7</td> <td><a href="./configs/dab_detr/dab-detr_r50_8xb2-50e_hazydet.py">config</a></td> <td><a href="https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok">weight</a></td> </tr>
-   <tr> <td>Deform DETR</td> <td>ResNet50</td> <td>40.01</td> <td>192.51</td> <td>51.9</td> <td>26.5</td> <td><a href="./configs/deform_detr/deformable-detr_r50_16xb2-50e_hazydet.py">config</a></td> <td><a href="https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok">weight</a></td> </tr>
-   <tr> <td>Plug-and-Play</td> <td></td> <td></td> <td></td> <td></td> <td></td> <td></td> <td></td> </tr>
-   <tr> <td>FCOS-DeCoDet</td> <td>ResNet50</td> <td>34.62</td> <td>225.37</td> <td>47.4</td> <td>24.3</td> <td><a href="./configs/DeCoDet/DeCoDet_r50_1x_hazydet.py">config</a></td> <td><a href="https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok">weight</a></td> </tr>
-   <tr> <td>VFNet-DeCoDet</td> <td>ResNet50</td> <td>34.61</td> <td>249.91</td> <td>51.5</td> <td>25.9</td> <td><a href="./configs/DeCoDet/DeCoDet_r50_1x_hazydet.py">config</a></td> <td><a href="https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok">weight</a></td> </tr>
- </table>
-
 
  ### Dehazing
 
@@ -438,70 +231,6 @@ All the weight files in the model zoo can be accessed on [Baidu Cloud](https://p
  </table>


- ## DeCoDet
- ![HazyDet](./docs/Fig_network.jpg)
-
- ### Installation
-
- #### Step 1: Create a conda environment
-
- ```shell
- conda create --name HazyDet python=3.9
- conda activate HazyDet
- ```
-
- #### Step 2: Install PyTorch
-
- ```shell
- conda install pytorch torchvision torchaudio pytorch-cuda=12.1 -c pytorch -c nvidia
- ```
-
- #### Step 3: Install OpenMMLab 2.x Codebases
-
- ```shell
- # OpenMMLab codebases
- pip install -U openmim --no-input
- mim install mmengine "mmcv>=2.0.0" "mmdet>=3.0.0" "mmsegmentation>=1.0.0" "mmrotate>=1.0.0rc1" mmyolo "mmpretrain>=1.0.0rc7" 'mmagic'
- # other dependencies
- pip install -U ninja scikit-image --no-input
- ```
-
- #### Step 4: Install `HazyDet`
-
- **Note**: make sure you have `cd`'d into the root directory of `HazyDet` first:
-
- ```shell
- git clone git@github.com:GrokCV/HazyDet.git
- cd HazyDet
- ```
-
- ```shell
- python setup.py develop
- ```
-
- ### Training
-
- ```shell
- python tools/train_det.py configs/DeCoDet/DeCoDet_r50_1x_hazydet.py
- ```
-
- ### Inference
-
- ```shell
- python tools/test.py configs/DeCoDet/DeCoDet_r50_1x_hazydet.py weights/fcos_DeCoDet_r50_1x_hazydet.pth
- ```
-
- We released our [checkpoint](https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok) trained on HazyDet.
-
- ### Depth Maps
-
- The depth maps required for training can be obtained with [Metric3D](https://github.com/YvanYin/Metric3D); they can also be produced by other depth-estimation models.
-
- If you want to use our depth data, please download it and place it in the specified path. For convenient storage and viewing, we save relative depth as PNG images and the maximum depth in a text file, but we use absolute depth during training.
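The note above says depth is stored as a relative PNG plus a per-image maximum, while training consumes absolute depth. A minimal sketch of such a conversion, assuming the PNG encodes depth normalized over the full 16-bit range — the normalization and the function name are our illustration, not the repository's documented format:

```python
# Hypothetical helper: recover absolute depth from a normalized relative-depth
# map plus the per-image maximum depth. The 16-bit normalization below is an
# assumption for illustration, not HazyDet's documented on-disk layout.
def relative_to_absolute(rel_depth, max_depth, bit_depth=16):
    """Scale relative depth codes (0 .. 2**bit_depth - 1) to absolute depth."""
    scale = max_depth / float(2 ** bit_depth - 1)
    return [v * scale for v in rel_depth]

# Usage: the maximum code value maps back to the recorded maximum depth.
row = relative_to_absolute([0, 32767, 65535], max_depth=120.0)
```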
-
- ## Acknowledgement
-
- We are grateful to the Tianjin Key Laboratory of Visual Computing and Intelligent Perception (VCIP) for providing essential resources. Our sincere appreciation goes to Professor Pengfei Zhu and the dedicated AISKYEYE team at Tianjin University for their invaluable support with data, which has been crucial to our research. We also deeply thank Xianghui Li, Yuxin Feng, and other researchers for granting us access to their datasets, significantly advancing our work in this field. Additionally, our thanks extend to [Metric3D](https://github.com/YvanYin/Metric3D) for its contributions to the methodology presented in this article.
-
  ## Citation

  If you use this toolbox or benchmark in your research, please cite this project.
 
  ---
  license: cc-by-nc-4.0
+ task_categories:
+ - object-detection
  ---
+
+ # HazyDet: Open-Source Benchmark for Drone-View Object Detection With Depth-Cues in Hazy Scenes [(paper)](https://arxiv.org/abs/2409.19833)
+
+ **HazyDet** is the first benchmark for object detection in hazy drone imagery. It couples physics-driven synthetic data with real foggy drone photos, providing a controlled yet realistic test bed for designing haze-robust detectors.
+
+ ---
+
+ ## Abstract
+
+ Object detection from aerial platforms under adverse atmospheric conditions, particularly haze, is paramount for robust drone autonomy. Yet this domain remains largely underexplored, primarily hindered by the absence of specialized benchmarks. To bridge this gap, we present HazyDet, the first large-scale benchmark specifically designed for drone-view object detection in hazy conditions. Comprising 383,000 real-world instances derived from both naturally hazy captures and synthetically hazed scenes augmented from clear images, HazyDet provides a challenging and realistic testbed for advancing detection algorithms. To address the severe visual degradation induced by haze, we propose the Depth-Conditioned Detector (DeCoDet), a novel architecture that integrates a Depth-Conditioned Kernel to dynamically modulate feature representations based on depth cues. The practical efficacy and robustness of DeCoDet are further enhanced by training with a Progressive Domain Fine-Tuning (PDFT) strategy to navigate synthetic-to-real domain shifts, and a Scale-Invariant Refurbishment Loss (SIRLoss) to ensure resilient learning from potentially noisy depth annotations. Comprehensive empirical validation on HazyDet substantiates the superiority of our unified DeCoDet framework, which achieves state-of-the-art performance, surpassing the closest competitor by a notable +1.5% mAP on challenging real-world hazy test scenarios. Our dataset and toolkit are available on [GitHub](https://github.com/GrokCV/HazyDet).
+
 
  ## HazyDet

+ ![HazyDet](./docs/dataset_pipeline_sample.jpg)
+
+ ---
+
+ ### 📦 Dataset at a Glance
+
+ _Target size buckets: Small < 0.1% of image area, Medium 0.1–1%, Large > 1%._
+
+ | Split | #Images | #Instances | Class | Small | Medium | Large |
+ |-------|:-------:|:----------:|-------|------:|-------:|------:|
+ | **Train** | 8,000 | 264,511 | Car | 159,491 | 77,527 | 5,177 |
+ | | | | Truck | 4,197 | 6,262 | 1,167 |
+ | | | | Bus | 1,990 | 7,879 | 861 |
+ | **Val** | 1,000 | 34,560 | Car | 21,051 | 9,881 | 630 |
+ | | | | Truck | 552 | 853 | 103 |
+ | | | | Bus | 243 | 1,122 | 125 |
+ | **Test** | 2,000 | 65,322 | Car | 38,910 | 19,860 | 1,256 |
+ | | | | Truck | 881 | 1,409 | 263 |
+ | | | | Bus | 473 | 1,991 | 279 |
+ | **Real-world Train** | 400 | 13,753 | Car | 5,816 | 6,487 | 695 |
+ | | | | Truck | 86 | 204 | 57 |
+ | | | | Bus | 52 | 256 | 100 |
+ | **Real-world Test** | 200 | 5,543 | Car | 2,351 | 2,506 | 365 |
+ | | | | Truck | 26 | 86 | 30 |
+ | | | | Bus | 17 | 107 | 55 |
+
+ ---
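The size buckets in the table above are defined by a target's fraction of the image area. A minimal sketch of that rule — the function name is ours, not part of the HazyDet toolkit:

```python
# Hypothetical helper illustrating the bucket rule stated above:
# Small < 0.1% of image area, Medium 0.1-1%, Large > 1%.
def size_bucket(box_w, box_h, img_w, img_h):
    """Classify a bounding box by its fraction of the image area."""
    frac = (box_w * box_h) / float(img_w * img_h)
    if frac < 0.001:
        return "Small"
    elif frac <= 0.01:
        return "Medium"
    return "Large"

# e.g. a 30x30 box in a 1920x1080 frame covers ~0.04% of the image -> "Small"
bucket = size_bucket(30, 30, 1920, 1080)
```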
+
+ You can also **download** our HazyDet dataset from [**Baidu Netdisk**](https://pan.baidu.com/s/1KKWqTbG1oBAdlIZrTzTceQ?pwd=grok) or [**OneDrive**](https://1drv.ms/f/s!AmElF7K4aY9p83CqLdm4N-JSo9rg?e=H06ghJ).
+
  For both training and inference, the following dataset structure is required:
 
+ ![HazyDet](./docs/dataset_samples.jpg)
+
  ```
+ HazyDet/
+ ├── train/
+ │   ├── clean images/
+ │   ├── hazy images/
+ │   └── labels/
+ ├── val/
+ │   ├── clean images/
+ │   ├── hazy images/
+ │   └── labels/
+ ├── test/
+ │   ├── clean images/
+ │   ├── hazy images/
+ │   └── labels/
+ ├── Real-world/
+ │   ├── train/
+ │   ├── test/
+ │   └── labels/
+ └── README.md   <-- you are here
  ```

  **Note: The password for both Baidu Netdisk and OneDrive is `grok`.**
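Since training fails late when a download is incomplete, a quick sanity check of the layout above can help; this checker (names and structure are ours, not part of the toolkit) reports any expected folder that is missing:

```python
import os

# Expected layout, transcribed from the directory tree above.
# This helper is a hypothetical convenience, not part of the HazyDet toolkit.
EXPECTED = {
    "train": ["clean images", "hazy images", "labels"],
    "val": ["clean images", "hazy images", "labels"],
    "test": ["clean images", "hazy images", "labels"],
    "Real-world": ["train", "test", "labels"],
}

def missing_dirs(root):
    """Return the list of expected directories absent under `root`."""
    missing = []
    for split, subdirs in EXPECTED.items():
        for sub in subdirs:
            path = os.path.join(root, split, sub)
            if not os.path.isdir(path):
                missing.append(path)
    return missing

# Usage: missing = missing_dirs("/path/to/HazyDet"); an empty list means OK.
```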
 

  ### Detectors

+ | Model | Backbone | #Params (M) | GFLOPs | mAP on<br>Synthetic Test-set | mAP on<br>Real-world Test-set | Weight |
+ |-------|----------|-------------|--------|:---------------------------:|:-----------------------------:|--------|
+ | **One Stage** | | | | | | |
+ | YOLOv3 | Darknet53 | 61.63 | 20.19 | 35.0 | 30.7 | [weight](https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok) |
+ | GFL | ResNet50 | 32.26 | 198.65 | 36.8 | 32.5 | [weight](https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok) |
+ | YOLOX | CSPDarkNet | 8.94 | 13.32 | 42.3 | 35.4 | [weight](https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok) |
+ | FCOS | ResNet50 | 32.11 | 191.48 | 45.9 | 32.7 | [weight](https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok) |
+ | VFNet | ResNet50 | 32.71 | 184.32 | 49.5 | 35.6 | [weight](https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok) |
+ | ATSS | ResNet50 | 32.12 | 195.58 | 50.4 | 36.4 | [weight](https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok) |
+ | DDOD | ResNet50 | 32.20 | 173.05 | 50.7 | 37.1 | [weight](https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok) |
+ | TOOD | ResNet50 | 32.02 | 192.51 | 51.4 | 36.7 | [weight](https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok) |
+ | **Two Stage** | | | | | | |
+ | Faster RCNN | ResNet50 | 41.35 | 201.72 | 48.7 | 33.4 | [weight](https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok) |
+ | Libra RCNN | ResNet50 | 41.62 | 209.92 | 49.0 | 34.5 | [weight](https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok) |
+ | Grid RCNN | ResNet50 | 64.46 | 317.44 | 50.5 | 35.2 | [weight](https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok) |
+ | Cascade RCNN | ResNet50 | 69.15 | 230.40 | <u>51.6</u> | <u>37.2</u> | [weight](https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok) |
+ | **End-to-End** | | | | | | |
+ | Conditional DETR | ResNet50 | 43.55 | 91.47 | 30.5 | 25.8 | [weight](https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok) |
+ | DAB DETR | ResNet50 | 43.70 | 91.02 | 31.3 | 27.2 | [weight](https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok) |
+ | Deform DETR | ResNet50 | 40.01 | 203.11 | 51.5 | 36.9 | [weight](https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok) |
+ | **DeCoDet** | | | | | | |
+ | **DeCoDet (Ours)** | ResNet50 | 34.62 | 225.37 | **52.0** | **38.7** | [weight](https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok) |

  ### Dehazing
 
 
  </table>

  ## Citation

  If you use this toolbox or benchmark in your research, please cite this project.