---
dataset_info:
  features:
  - name: map_name
    dtype: string
  - name: trajectory_id
    dtype: int64
  - name: steps_data
    dtype: string
  - name: num_steps
    dtype: int64
  splits:
  - name: train
    num_bytes: 307264902
    num_examples: 624
  - name: evaluation
    num_bytes: 46719495
    num_examples: 96
  - name: all_data
    num_bytes: 353984397
    num_examples: 720
  download_size: 148078966
  dataset_size: 707968794
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: evaluation
    path: data/evaluation-*
  - split: all_data
    path: data/all_data-*
task_categories:
- robotics
language:
- en
tags:
- human-robot-interaction
- multi-agent
- reinforcement-learning
- collaboration
---

# Moving Out: Physically-grounded Human-AI Collaboration Dataset

This repository contains human-human interaction data collected for the [Moving Out: Physically-grounded Human-AI Collaboration](https://huggingface.co/papers/2507.18623) benchmark.

**Moving Out** is a human-AI collaboration benchmark designed to evaluate embodied agents' ability to adapt to physical actions and constraints in a shared environment. This dataset contains human-human interaction data collected for two tasks within that environment, covering collaboration modes such as moving heavy items together and coordinating actions around corners. These data support studying how models adapt to diverse human behaviors and unseen physical attributes in physically grounded scenarios.

**Project Page:** [https://live-robotics-uva.github.io/movingout_ai/](https://live-robotics-uva.github.io/movingout_ai/)

**Code Repository:** [https://github.com/live-robotics-uva/Moving-Out](https://github.com/live-robotics-uva/Moving-Out)

## Sample Usage

You can visualize trajectories from this dataset using the provided `dataset_to_video.py` script available in the project's GitHub repository.

**Save trajectory as MP4:**
```bash
python dataset_to_video.py -f ShuaKang/movingout_task2 -m HandOff -t 4 -v video
```

**Show trajectory in popup window:**
```bash
python dataset_to_video.py -f ShuaKang/movingout_task2 -m HandOff -t 4 -v human
```
Use `-m` for map name and `-t` for trajectory ID. Remember to replace `ShuaKang/movingout_task2` with the specific dataset ID you are using (e.g., `ShuaKang/movingout_task1` or `ShuaKang/movingout_task2`).
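Beyond video export, the records can be loaded directly with the Hugging Face `datasets` library and inspected in Python. The sketch below shows the card's declared schema (`map_name`, `trajectory_id`, `steps_data`, `num_steps`); the JSON layout used inside `steps_data` here is a hypothetical illustration, so inspect a real record to confirm the actual encoding before relying on it:

```python
import json

# The splits declared in the card can be loaded as, e.g.:
#   from datasets import load_dataset
#   ds = load_dataset("ShuaKang/movingout_task2", split="train")
#   record = ds[0]
# Each record follows the schema in the card's front matter:
# map_name (string), trajectory_id (int64), steps_data (string),
# num_steps (int64).

# Toy record mimicking that schema; the per-step JSON fields below
# are an assumption for illustration, not the dataset's actual keys.
record = {
    "map_name": "HandOff",
    "trajectory_id": 4,
    "steps_data": json.dumps([
        {"step": 0, "agent_0": [0.1, 0.2], "agent_1": [0.8, 0.9]},
        {"step": 1, "agent_0": [0.2, 0.2], "agent_1": [0.7, 0.9]},
    ]),
    "num_steps": 2,
}

# Decode the serialized step sequence and sanity-check its length.
steps = json.loads(record["steps_data"])
assert len(steps) == record["num_steps"]
print(record["map_name"], record["trajectory_id"], len(steps))
```

Storing the step sequence as a JSON string keeps the Arrow schema flat while allowing variable-length trajectories per record.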

## Citation

If you use this dataset in your research, please cite the following paper:

```bibtex
@article{li2025moving,
  title={Moving Out: Physically-grounded Human-AI Collaboration},
  author={Li, Shua and Liu, Zi and Guo, Ruohan and Bai, Min and Li, Erran and Levine, Sergey and Lin, Dahua},
  journal={arXiv preprint arXiv:2507.18623},
  year={2025}
}
```