---
dataset_info:
features:
- name: map_name
dtype: string
- name: trajectory_id
dtype: int64
- name: steps_data
dtype: string
- name: num_steps
dtype: int64
splits:
- name: train
num_bytes: 307264902
num_examples: 624
- name: evaluation
num_bytes: 46719495
num_examples: 96
- name: all_data
num_bytes: 353984397
num_examples: 720
download_size: 148078966
dataset_size: 707968794
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: evaluation
path: data/evaluation-*
- split: all_data
path: data/all_data-*
task_categories:
- robotics
language:
- en
tags:
- human-robot-interaction
- multi-agent
- reinforcement-learning
- collaboration
---
# Moving Out: Physically-grounded Human-AI Collaboration Dataset
This repository contains human-human interaction data collected for the [Moving Out: Physically-grounded Human-AI Collaboration](https://huggingface.co/papers/2507.18623) benchmark.
**Moving Out** is a human-AI collaboration benchmark designed to evaluate embodied agents' ability to adapt to physical actions and constraints in their environment. This dataset contains human-human interaction data collected for two tasks in that environment, covering collaboration modes such as carrying heavy items together and coordinating maneuvers around corners. The data supports studying how models adapt to diverse human behaviors and unseen physical attributes in physically grounded scenarios.
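Each record follows the schema declared in the YAML header above: `map_name` (string), `trajectory_id` (int64), `steps_data` (string), and `num_steps` (int64), available in `train`, `evaluation`, and `all_data` splits (loadable via `datasets.load_dataset("ShuaKang/movingout_task2")`). The sketch below parses a single record locally; it assumes, as an illustration only, that `steps_data` is a JSON-serialized list of per-step dictionaries, and the field names inside each step (`step`, `agent_positions`) are hypothetical:

```python
import json

# Hypothetical record mirroring the schema in the YAML header:
# map_name (string), trajectory_id (int64), steps_data (string), num_steps (int64).
record = {
    "map_name": "HandOff",
    "trajectory_id": 4,
    "num_steps": 3,
    # Assumption: steps_data holds a JSON-serialized list of per-step dicts;
    # the keys below are illustrative, not the dataset's documented format.
    "steps_data": json.dumps([
        {"step": 0, "agent_positions": [[0.0, 0.0], [1.0, 1.0]]},
        {"step": 1, "agent_positions": [[0.1, 0.0], [0.9, 1.0]]},
        {"step": 2, "agent_positions": [[0.2, 0.0], [0.8, 1.0]]},
    ]),
}

# Deserialize the step string and sanity-check it against num_steps.
steps = json.loads(record["steps_data"])
assert len(steps) == record["num_steps"]
print(record["map_name"], record["trajectory_id"], len(steps))
```

Inspect a few real records (e.g. `ds["train"][0]["steps_data"][:200]`) to confirm the actual serialization before relying on any particular step layout.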
**Project Page:** [https://live-robotics-uva.github.io/movingout_ai/](https://live-robotics-uva.github.io/movingout_ai/)
**Code Repository:** [https://github.com/live-robotics-uva/Moving-Out](https://github.com/live-robotics-uva/Moving-Out)
## Sample Usage
You can visualize trajectories from this dataset using the provided `dataset_to_video.py` script available in the project's GitHub repository.
**Save trajectory as MP4:**
```bash
python dataset_to_video.py -f ShuaKang/movingout_task2 -m HandOff -t 4 -v video
```
**Show trajectory in popup window:**
```bash
python dataset_to_video.py -f ShuaKang/movingout_task2 -m HandOff -t 4 -v human
```
Use `-m` for map name and `-t` for trajectory ID. Remember to replace `ShuaKang/movingout_task2` with the specific dataset ID you are using (e.g., `ShuaKang/movingout_task1` or `ShuaKang/movingout_task2`).
## Citation
If you use this dataset in your research, please cite the following paper:
```bibtex
@article{li2025moving,
title={Moving Out: Physically-grounded Human-AI Collaboration},
author={Li, Shua and Liu, Zi and Guo, Ruohan and Bai, Min and Li, Erran and Levine, Sergey and Lin, Dahua},
journal={arXiv preprint arXiv:2507.18623},
year={2025}
}
```