---
title: README
emoji: 🤖
colorFrom: blue
colorTo: indigo
sdk: static
pinned: false
---
# Rice RobotΠ Lab

**Where robots Physically Interact with the world.**

The Rice RobotΠ Lab at Rice University, led by Prof. Kaiyu Hang, studies how robots can physically interact with an uncertain, contact-rich world. Our work spans manipulation planning, interactive perception, learning from in-the-wild video, and benchmarking infrastructure for the field.

## What you'll find here

This Hugging Face organization hosts the lab's open-source datasets, interactive demos, and model artifacts. For papers, members, and full project listings, please visit our [lab website](https://robotpilab.github.io/).

### On Hugging Face

- **EgoInfinity** *(coming soon)*: Large-scale automated reconstruction of 3D hand-object interactions from in-the-wild egocentric video, designed for scaling up robot training data.

### Other ongoing lab projects

- [**ManiDreams**](https://robotpilab.github.io/post/manidreams/): An open-source modular framework for uncertainty-aware manipulation planning over intuitive physics models.
- [**ManipulationNet**](https://robotpilab.github.io/post/manipulationnet/): Community-driven global infrastructure for benchmarking robot manipulation research at scale.

## Links

- 🌐 [Lab website](https://robotpilab.github.io/)
- 💻 [GitHub](https://github.com/Rice-RobotPI-Lab)
- 📝 [Publications](https://robotpilab.github.io/publication/)
- 📨 [Join the lab](https://robotpilab.github.io/join/)

## Contact

| robotpilab@gmail.com |