---
license: apache-2.0
tags:
- robotics
- humanoid
---
# Click-and-Traverse: Collision-Free Humanoid Traversal in Cluttered Indoor Scenes
<p style="font-size: 1.2em;">
<a href="https://axian12138.github.io/CAT/"><strong>Project Page</strong></a> |
<a href="https://github.com/GalaxyGeneralRobotics/Click-and-Traverse/"><strong>Code</strong></a> |
<a href="https://arxiv.org/abs/2601.16035"><strong>Paper</strong></a> |
<a href="https://www.youtube.com/watch?v=blek__Qf0Vc"><strong>Video Demo</strong></a> |
<a href="https://huggingface.co/Axian12138/Click-and-Traverse"><strong>Models</strong></a> |
<a href="https://huggingface.co/datasets/Axian12138/Click-and-Traverse/"><strong>Datasets</strong></a>
</p>
Click-and-Traverse is designed for **general humanoid traversal tasks** that require coordinated whole-body motion planning and collision avoidance.
---
## Key Capabilities
<table width="100%">
<tr>
<th width="50%">Full-Spatial Constraints</th>
<th width="50%">Intricate Geometry Handling</th>
</tr>
<tr>
<td valign="top">
Indoor environments impose constraints at multiple spatial levels. Obstacles can appear simultaneously at the <strong>ground, lateral, and overhead</strong> levels, restricting humanoid motion in all spatial dimensions and requiring coordinated whole-body movement.
</td>
<td valign="top">
The method handles <strong>complex and irregular obstacle geometries</strong> that commonly appear in real-world environments, going beyond simplified primitives such as cubes or regular polyhedra used in many prior benchmarks.
</td>
</tr>
</table>
<div align="center">
<img
src="https://img.youtube.com/vi/blek__Qf0Vc/maxresdefault.jpg"
style="width:100%; max-width:1000px; height:auto;"
alt="Collision-Free Humanoid Traversal Demo"
/>
</div>
---
## Citation
```bibtex
@misc{xue2026collisionfreehumanoidtraversalcluttered,
  title         = {Collision-Free Humanoid Traversal in Cluttered Indoor Scenes},
  author        = {Xue, Han and Liang, Sikai and Zhang, Zhikai and Zeng, Zicheng and Liu, Yun and Lian, Yunrui and Wang, Jilong and Liu, Qingtao and Shi, Xuesong and Li, Yi},
  year          = {2026},
  eprint        = {2601.16035},
  archivePrefix = {arXiv},
  primaryClass  = {cs.RO},
  url           = {https://arxiv.org/abs/2601.16035}
}
```