  - split: test
    path: data/*
pretty_name: CHIP
---
# CHIP: A multi-sensor dataset for 6D pose estimation of chairs in industrial settings

<div style="display: flex; gap: 10px; margin: 20px 0;">
  <div style="background-color: #007bff; color: white; padding: 10px 20px; border-radius: 5px; text-decoration: none; display: inline-block;">
    <a href="https://tev-fbk.github.io/CHIP/" style="color: white; text-decoration: none;">🏠 Homepage</a>
  </div>
  <div style="background-color: #28a745; color: white; padding: 10px 20px; border-radius: 5px; text-decoration: none; display: inline-block;">
    <a href="https://arxiv.org/abs/2506.09699" style="color: white; text-decoration: none;">📄 Paper</a>
  </div>
</div>

## Introduction

Accurate 6D pose estimation of complex objects in 3D environments is essential for effective robotic manipulation. Yet, existing benchmarks fall short in evaluating 6D pose estimation methods under realistic industrial conditions, as most datasets focus on household objects in domestic settings, while the few available industrial datasets are limited to artificial setups with objects placed on tables. To bridge this gap, we introduce CHIP, the first dataset designed for 6D pose estimation of chairs manipulated by a robotic arm in a real-world industrial environment. CHIP includes seven distinct chairs captured using three different RGBD sensing technologies and presents unique challenges, such as distractor objects with fine-grained differences and severe occlusions caused by the robotic arm and human operators. CHIP comprises 77,811 RGBD images annotated with ground-truth 6D poses automatically derived from the robot's kinematics, averaging 11,115 annotations per chair. We benchmark CHIP using three zero-shot 6D pose estimation methods, assessing performance across different sensor types, localization priors, and occlusion levels. Results show substantial room for improvement, highlighting the unique challenges posed by the dataset.

## Dataset Summary

- **Number of images:** 77,811 RGBD images
- **Number of object classes:** 7 distinct chair models
- **Sensors used:** Intel RealSense D435, Intel RealSense L515, Stereo Labs ZED
- **Annotations:** Ground-truth 6D poses derived from robot kinematics (~11,115 annotations per chair)
- **Occlusion levels:** No occlusions, moderate occlusions
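
Each annotation is a 6D pose, i.e. a 3D rotation plus a 3D translation. A common way to work with such poses is a 4×4 homogeneous transform; the sketch below illustrates that convention in pure Python. Note this is an illustrative sketch only: the exact serialization used by CHIP's annotation files is not described here, and all names are hypothetical.

```python
import math

def pose_matrix(rotation, translation):
    """Compose a 4x4 homogeneous transform from a 3x3 rotation matrix
    (row-major nested lists) and a 3-vector translation."""
    return [
        [*rotation[0], translation[0]],
        [*rotation[1], translation[1]],
        [*rotation[2], translation[2]],
        [0.0, 0.0, 0.0, 1.0],
    ]

# Example: a 90-degree rotation about the z-axis plus a translation.
theta = math.pi / 2
Rz = [
    [math.cos(theta), -math.sin(theta), 0.0],
    [math.sin(theta),  math.cos(theta), 0.0],
    [0.0,              0.0,             1.0],
]
T = pose_matrix(Rz, [0.5, 0.0, 1.2])
```

Chaining such transforms (camera-to-robot, robot-to-object) is how kinematics-derived ground truth like CHIP's is typically propagated to each camera frame.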
### Object Classes

The dataset contains seven distinct chair models, each represented by a unique object class: three solid-wood designs and four frame-only designs that originally included cushions. The chairs exhibit a variety of shapes and structures, providing a diverse set of challenges for 6D pose estimation algorithms.

Solid-wood chairs:

- 000001: si0325 [Andreu World link](https://andreuworld.com/en/products/smile-si0325)

Model code to object class mapping:

- si0325 -> 000001
- si0374 -> 000002
- si0991 -> 000003
- si2750 -> 000004
- si7291 -> 000005
- so0903 -> 000006
- so2043 -> 000007
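
For programmatic use, the mapping above can be captured as a small dictionary. This is a convenience sketch, not a helper shipped with the dataset:

```python
# Mapping between Andreu World model codes and CHIP object-class IDs,
# transcribed from the list above.
CODE_TO_CLASS = {
    "si0325": "000001",
    "si0374": "000002",
    "si0991": "000003",
    "si2750": "000004",
    "si7291": "000005",
    "so0903": "000006",
    "so2043": "000007",
}

# Reverse lookup: object-class ID back to model code.
CLASS_TO_CODE = {cls: code for code, cls in CODE_TO_CLASS.items()}
```

Class IDs are kept as zero-padded strings, matching how they appear in the dataset's directory names.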

### Citation

If you find CHIP useful for your work, please cite:

```bibtex
@inproceedings{nardon2025chip,
  title={CHIP: A multi-sensor dataset for 6D pose estimation of chairs in industrial settings},
  author={Nardon, Mattia and Mujika Agirre, Mikel and González Tomé, Ander and Sedano Algarabel, Daniel and Rueda Collell, Josep and Caro, Ana Paola and Caraffa, Andrea and Poiesi, Fabio and Chippendale, Paul Ian and Boscaini, Davide},
  booktitle={British Machine Vision Conference (BMVC)},
  year={2025}
}
```