Update README.md

README.md CHANGED

@@ -1,50 +1,85 @@
---
license: cc-by-nc-4.0
---

## HazyDet



### Installation

#### Step 1: Create a conda environment

```shell
$ conda create --name HazyDet python=3.9
$ source activate HazyDet
```

#### Step 2: Install PyTorch

```shell
conda install pytorch torchvision torchaudio pytorch-cuda=12.1 -c pytorch -c nvidia
```

#### Step 3: Install OpenMMLab 2.x Codebases

```shell
# openmmlab codebases
pip install -U openmim --no-input
mim install mmengine "mmcv>=2.0.0" "mmdet>=3.0.0" "mmsegmentation>=1.0.0" "mmrotate>=1.0.0rc1" mmyolo "mmpretrain>=1.0.0rc7" 'mmagic'
# other dependencies
pip install -U ninja scikit-image --no-input
```

#### Step 4: Install `HazyDet`

```shell
python setup.py develop
```

**Note**: make sure you have `cd`'d into the root directory of `HazyDet` before running the command above:

```shell
$ git clone git@github.com:GrokCV/HazyDet.git
$ cd HazyDet
```

### Training

```shell
$ python tools/train_det.py configs/DeCoDet/DeCoDet_r50_1x_hazydet.py
```

### Inference

```shell
$ python tools/test.py configs/DeCoDet/DeCoDet_r50_1x_hazydet.py weights/fcos_DeCoDet_r50_1x_hazydet.pth
```

We have released our [checkpoint](https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok) trained on HazyDet.

### Depth Maps

The depth maps required for training can be obtained with [Metric3D](https://github.com/YvanYin/Metric3D); other depth estimation models can also be used.

If you want to use our depth data, please download it and place it in the specified path. For convenient storage and viewing, we store the relative depth as PNG images and the corresponding maximum depth as text files, but absolute depth is used during training.
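
Recovering the training-time depth from the released files is a one-line rescaling. The sketch below shows one way to do it; the file names, the 16-bit PNG assumption, and the plain `relative * max_depth` scaling are illustrative assumptions based on the description above, not the toolkit's actual I/O code.

```python
# Illustrative sketch: rebuild absolute depth from a relative-depth PNG
# plus its companion max-depth text file (file names are hypothetical).
import numpy as np
from PIL import Image

def load_absolute_depth(png_path: str, max_depth_txt: str) -> np.ndarray:
    # 16-bit PNG assumed to store relative depth scaled to [0, 65535]
    rel = np.asarray(Image.open(png_path), dtype=np.float32) / np.iinfo(np.uint16).max
    with open(max_depth_txt) as f:
        max_depth = float(f.read().strip())  # per-image maximum depth
    return rel * max_depth                   # absolute depth map

depth = load_absolute_depth("0001_depth.png", "0001_maxdepth.txt")
print(depth.shape, depth.min(), depth.max())
```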

## Acknowledgement

We are grateful to the Tianjin Key Laboratory of Visual Computing and Intelligent Perception (VCIP) for providing essential resources. Our sincere appreciation goes to Professor Pengfei Zhu and the dedicated AISKYEYE team at Tianjin University for their invaluable support with data, which has been crucial to our research efforts. We also deeply thank Xianghui Li, Yuxin Feng, and other researchers for granting us access to their datasets, significantly advancing and promoting our work in this field. Additionally, our thanks extend to [Metric3D](https://github.com/YvanYin/Metric3D) for its contributions to the methodology presented in this article.

## Citation

If you use this toolbox or benchmark in your research, please cite this project.

---
license: cc-by-nc-4.0
task_categories:
- object-detection
---

# HazyDet: Open-Source Benchmark for Drone-View Object Detection With Depth-Cues in Hazy Scenes [(paper)](https://arxiv.org/abs/2409.19833)

**HazyDet** is the first benchmark for object detection in hazy drone imagery. It couples physics-driven synthetic data with real foggy drone photos, providing a controlled yet realistic test-bed for designing haze-robust detectors.

---

## Abstract

Object detection from aerial platforms under adverse atmospheric conditions, particularly haze, is paramount for robust drone autonomy.
Yet, this domain remains largely underexplored, primarily hindered by the absence of specialized benchmarks.
To bridge this gap, we present HazyDet, the first large-scale benchmark specifically designed for drone-view object detection in hazy conditions.
Comprising 383,000 real-world instances derived from both naturally hazy captures and synthetically hazed scenes augmented from clear images,
HazyDet provides a challenging and realistic testbed for advancing detection algorithms. To address the severe visual degradation induced by haze,
we propose the Depth-Conditioned Detector (DeCoDet), a novel architecture that integrates a Depth-Conditioned Kernel to dynamically modulate feature representations
based on depth cues. The practical efficacy and robustness of DeCoDet are further enhanced by training with a Progressive Domain Fine-Tuning (PDFT) strategy
to navigate synthetic-to-real domain shifts, and a Scale-Invariant Refurbishment Loss (SIRLoss) to ensure resilient learning from potentially noisy depth annotations.
Comprehensive empirical validation on HazyDet substantiates the superiority of our unified DeCoDet framework,
which achieves state-of-the-art performance, surpassing the closest competitor by a notable +1.5% mAP on challenging real-world
hazy test scenarios. Our dataset and toolkit are available on [GitHub](https://github.com/GrokCV/HazyDet).

## HazyDet



---

### 📦 Dataset at a Glance

_Target size buckets: Small < 0.1 % of image area, Medium 0.1–1 %, Large > 1 % (see the sketch below the table)._

| Split | #Images | #Instances | Class | Small | Medium | Large |
|-------|:-------:|:----------:|-------|------:|-------:|------:|
| **Train** | 8 000 | 264 511 | Car | 159 491 | 77 527 | 5 177 |
| | | | Truck | 4 197 | 6 262 | 1 167 |
| | | | Bus | 1 990 | 7 879 | 861 |
| **Val** | 1 000 | 34 560 | Car | 21 051 | 9 881 | 630 |
| | | | Truck | 552 | 853 | 103 |
| | | | Bus | 243 | 1 122 | 125 |
| **Test** | 2 000 | 65 322 | Car | 38 910 | 19 860 | 1 256 |
| | | | Truck | 881 | 1 409 | 263 |
| | | | Bus | 473 | 1 991 | 279 |
| **Real-world Train** | 400 | 13 753 | Car | 5 816 | 6 487 | 695 |
| | | | Truck | 86 | 204 | 57 |
| | | | Bus | 52 | 256 | 100 |
| **Real-world Test** | 200 | 5 543 | Car | 2 351 | 2 506 | 365 |
| | | | Truck | 26 | 86 | 30 |
| | | | Bus | 17 | 107 | 55 |
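
The size buckets used above are defined purely by the fraction of the image area a box occupies. As a quick reference, here is a minimal sketch of that rule; the function name and the decimal thresholds (0.001 and 0.01 for 0.1 % and 1 %) are written out for illustration and are not code from the toolkit.

```python
# Illustrative only: assign a size bucket from box area / image area.
def size_bucket(box_w: float, box_h: float, img_w: int, img_h: int) -> str:
    frac = (box_w * box_h) / (img_w * img_h)  # fraction of image area covered
    if frac < 0.001:       # < 0.1 %
        return "Small"
    if frac <= 0.01:       # 0.1 % - 1 %
        return "Medium"
    return "Large"         # > 1 %

# e.g. a 40 x 30 px car box in a 1920 x 1080 frame is a "Small" target
print(size_bucket(40, 30, 1920, 1080))
```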

---

You can also **download** our HazyDet dataset from [**Baidu Netdisk**](https://pan.baidu.com/s/1KKWqTbG1oBAdlIZrTzTceQ?pwd=grok) or [**OneDrive**](https://1drv.ms/f/s!AmElF7K4aY9p83CqLdm4N-JSo9rg?e=H06ghJ).

For both training and inference, the following dataset structure is required:



```
HazyDet/
├── train/
│   ├── clean images/
│   ├── hazy images/
│   └── lables/
├── val/
│   ├── clean images/
│   ├── hazy images/
│   └── lables/
├── test/
│   ├── clean images/
│   ├── hazy images/
│   └── lables/
├── Real-world/
│   ├── train/
│   ├── test/
│   └── lables/
└── README.md  <-- you are here
```

**Note: The password for both Baidu Netdisk and OneDrive is `grok`.**
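
Before training, it can be worth checking that the extracted archive matches this layout. The snippet below is an optional, illustrative check and is not part of the toolkit; the root path and folder names are copied verbatim from the tree above.

```python
# Optional sanity check: does the HazyDet root match the layout in this README?
from pathlib import Path

def check_layout(root: str = "HazyDet") -> None:
    root_dir = Path(root)
    expected = [
        *(f"{split}/{sub}" for split in ("train", "val", "test")
          for sub in ("clean images", "hazy images", "lables")),
        "Real-world/train", "Real-world/test", "Real-world/lables",
    ]
    missing = [p for p in expected if not (root_dir / p).is_dir()]
    if missing:
        print("Missing folders:", ", ".join(missing))
    else:
        print("Dataset layout looks good.")

check_layout("HazyDet")
```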

### Detectors

| Model | Backbone | #Params (M) | GFLOPs | mAP on<br>Synthetic Test-set | mAP on<br>Real-world Test-set | Weight |
|-------------------------|----------|-------------|--------|-----------------------------|-------------------------------|--------|
| **One Stage** | | | | | | |
| YOLOv3 | Darknet53 | 61.63 | 20.19 | 35.0 | 30.7 | [weight](https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok) |
| GFL | ResNet50 | 32.26 | 198.65 | 36.8 | 32.5 | [weight](https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok) |
| YOLOX | CSPDarkNet | 8.94 | 13.32 | 42.3 | 35.4 | [weight](https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok) |
| FCOS | ResNet50 | 32.11 | 191.48 | 45.9 | 32.7 | [weight](https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok) |
| VFNet | ResNet50 | 32.71 | 184.32 | 49.5 | 35.6 | [weight](https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok) |
| ATSS | ResNet50 | 32.12 | 195.58 | 50.4 | 36.4 | [weight](https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok) |
| DDOD | ResNet50 | 32.20 | 173.05 | 50.7 | 37.1 | [weight](https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok) |
| TOOD | ResNet50 | 32.02 | 192.51 | 51.4 | 36.7 | [weight](https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok) |
| **Two Stage** | | | | | | |
| Faster RCNN | ResNet50 | 41.35 | 201.72 | 48.7 | 33.4 | [weight](https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok) |
| Libra RCNN | ResNet50 | 41.62 | 209.92 | 49.0 | 34.5 | [weight](https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok) |
| Grid RCNN | ResNet50 | 64.46 | 317.44 | 50.5 | 35.2 | [weight](https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok) |
| Cascade RCNN | ResNet50 | 69.15 | 230.40 | <u>51.6</u> | <u>37.2</u> | [weight](https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok) |
| **End-to-End** | | | | | | |
| Conditional DETR | ResNet50 | 43.55 | 91.47 | 30.5 | 25.8 | [weight](https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok) |
| DAB DETR | ResNet50 | 43.70 | 91.02 | 31.3 | 27.2 | [weight](https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok) |
| Deformable DETR | ResNet50 | 40.01 | 203.11 | 51.5 | 36.9 | [weight](https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok) |
| **DeCoDet (Ours)** | ResNet50 | 34.62 | 225.37 | **52.0** | **38.7** | [weight](https://pan.baidu.com/s/1EEX_934Q421RkHCx53akJQ?pwd=grok) |
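
All of the checkpoints above can be evaluated with `tools/test.py` exactly as shown in the Inference section. For quick single-image visualization, the mmdet 3.x `DetInferencer` is another option; the sketch below assumes the environment from the Installation steps, and the config, weight, and image paths are placeholders for whichever row of the table you downloaded. Custom models such as DeCoDet may additionally require the HazyDet repo to be installed (Step 4) so that their modules are registered.

```python
# Sketch: single-image inference with a downloaded checkpoint (mmdet >= 3.0).
# All paths are placeholders; point them at a config under configs/ and the
# matching .pth file from the table above.
from mmdet.apis import DetInferencer

inferencer = DetInferencer(
    model="configs/DeCoDet/DeCoDet_r50_1x_hazydet.py",
    weights="weights/fcos_DeCoDet_r50_1x_hazydet.pth",
    device="cuda:0",
)
# Writes visualizations and prediction files under ./outputs/
inferencer("demo/hazy_example.jpg", out_dir="outputs/")
```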
### Dehazing

## Citation

If you use this toolbox or benchmark in your research, please cite this project.