Enhance dataset card: Add metadata, paper, project, code links, and sample usage
#2
by nielsr (HF Staff) - opened

README.md CHANGED
@@ -30,4 +30,51 @@ configs:
     path: data/evaluation-*
   - split: all_data
     path: data/all_data-*
+task_categories:
+- robotics
+language:
+- en
+tags:
+- human-robot-interaction
+- multi-agent
+- reinforcement-learning
+- collaboration
 ---
+
+# Moving Out: Physically-grounded Human-AI Collaboration Dataset
+
+This repository contains human-human interaction data collected for the [Moving Out: Physically-grounded Human-AI Collaboration](https://huggingface.co/papers/2507.18623) benchmark.
+
+**Moving Out** is a human-AI collaboration benchmark designed to evaluate embodied agents' ability to adapt to physical actions and constraints in their environment. The dataset includes human-human interaction data collected for two tasks in this environment, covering collaboration modes such as moving heavy items together and coordinating actions around corners. It supports studying how models adapt to diverse human behaviors and unseen physical attributes in physically grounded scenarios.
+
+**Project Page:** [https://live-robotics-uva.github.io/movingout_ai/](https://live-robotics-uva.github.io/movingout_ai/)
+
+**Code Repository:** [https://github.com/live-robotics-uva/Moving-Out](https://github.com/live-robotics-uva/Moving-Out)
+
+## Sample Usage
+
+You can visualize trajectories from this dataset with the `dataset_to_video.py` script provided in the project's GitHub repository.
+
+**Save a trajectory as MP4:**
+```bash
+python dataset_to_video.py -f ShuaKang/movingout_task2 -m HandOff -t 4 -v video
+```
+
+**Show a trajectory in a pop-up window:**
+```bash
+python dataset_to_video.py -f ShuaKang/movingout_task2 -m HandOff -t 4 -v human
+```
+Use `-f` for the dataset ID, `-m` for the map name, and `-t` for the trajectory ID. Replace `ShuaKang/movingout_task2` with the dataset you are using (e.g., `ShuaKang/movingout_task1` or `ShuaKang/movingout_task2`).
+
+## Citation
+
+If you use this dataset in your research, please cite the following paper:
+
+```bibtex
+@article{li2025moving,
+  title={Moving Out: Physically-grounded Human-AI Collaboration},
+  author={Li, Shua and Liu, Zi and Guo, Ruohan and Bai, Min and Li, Erran and Levine, Sergey and Lin, Dahua},
+  journal={arXiv preprint arXiv:2507.18623},
+  year={2025}
+}
+```
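
Besides the visualization script, the configs in this diff declare `evaluation` and `all_data` splits, so the data can also be loaded directly. Below is a minimal sketch (not part of the PR itself), assuming the standard `datasets` library; the column names depend on the underlying Parquet files and are not shown in this diff:

```python
# Minimal sketch: load the splits declared in the card's configs block.
# Assumes the standard Hugging Face `datasets` library; the record schema
# is defined by the Parquet files and is not shown in this diff.
from datasets import load_dataset

# Swap in ShuaKang/movingout_task1 if you are working with the task-1 data.
ds = load_dataset("ShuaKang/movingout_task2")

print(ds)                            # DatasetDict with 'evaluation' and 'all_data' splits
print(ds["all_data"].column_names)   # inspect the trajectory schema
print(ds["evaluation"][0])           # first record of the evaluation split
```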