Improve dataset card: Add paper, project, code, task category, tags, and usage

#2
by nielsr - opened
Files changed (1)
  1. README.md +49 -0
README.md CHANGED
@@ -1,4 +1,11 @@
  ---
+ task_categories:
+ - robotics
+ tags:
+ - human-robot-interaction
+ - multi-agent
+ - embodied-ai
+ - reinforcement-learning
  dataset_info:
    features:
    - name: map_name
@@ -31,3 +38,45 @@ configs:
    - split: all_data
      path: data/all_data-*
  ---
+
+ # Moving Out: Physically-grounded Human-AI Collaboration Dataset
+
+ This repository contains the human-human interaction dataset for the paper [Moving Out: Physically-grounded Human-AI Collaboration](https://huggingface.co/papers/2507.18623).
+
+ **Moving Out** is a benchmark for human-AI collaboration in physical environments, built around embodied agents (e.g., robots). It covers a wide range of collaboration modes shaped by physical attributes and constraints, such as moving heavy items together or keeping actions consistent while carrying a large item around a corner. The dataset contains human-human interaction data collected for two tasks and is designed to evaluate a model's ability to adapt to diverse human behaviors and unseen physical attributes in continuous state-action spaces with constrained dynamics.
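+
+ The data can be loaded directly with the `datasets` library. A minimal sketch, assuming the repository id `ShuaKang/movingout_task2` used in the commands below and the single `all_data` split declared in this card's configs:
+
+ ```python
+ from datasets import load_dataset
+
+ # Load the single `all_data` split defined in this card's configs.
+ ds = load_dataset("ShuaKang/movingout_task2", split="all_data")
+
+ # Inspect the schema (e.g., the `map_name` feature) and the first record.
+ print(ds.features)
+ print(ds[0]["map_name"])
+ ```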
+
+ **Project Page:** [https://live-robotics-uva.github.io/movingout_ai/](https://live-robotics-uva.github.io/movingout_ai/)
+
+ **Code:** [https://github.com/live-robotics-uva/moving_out_ai](https://github.com/live-robotics-uva/moving_out_ai)
+
+ ## Sample Usage
+
+ You can visualize trajectories from this dataset with the `dataset_to_video.py` script provided in the associated GitHub repository.
+
+ **Save a trajectory as MP4:**
+
+ ```bash
+ python dataset_to_video.py -f ShuaKang/movingout_task2 -m HandOff -t 4 -v video
+ ```
+
+ **Show a trajectory in a pop-up window:**
+
+ ```bash
+ python dataset_to_video.py -f ShuaKang/movingout_task2 -m HandOff -t 4 -v human
+ ```
+
+ Use `-m` to select the map name and `-t` the trajectory ID.
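+
+ To work with raw trajectories instead of rendered video, you can also filter records by map directly. A minimal sketch using `datasets`; the `map_name` field comes from this card's schema, and `"HandOff"` simply mirrors the example commands above:
+
+ ```python
+ from datasets import load_dataset
+
+ ds = load_dataset("ShuaKang/movingout_task2", split="all_data")
+
+ # Keep only trajectories recorded on the HandOff map.
+ handoff = ds.filter(lambda row: row["map_name"] == "HandOff")
+ print(f"{len(handoff)} HandOff trajectories out of {len(ds)} total")
+ ```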
+
+ ## Citation
+
+ If you use this dataset in your research, please cite the original paper:
+
+ ```bibtex
+ @article{kang2025movingout,
+   title={Moving Out: Physically-grounded Human-AI Collaboration},
+   author={Shua Kang and Wenxin Xia and Ziqi Li and Mingxing Yuan and Xufan Wu and Min Bai and Boyi Liu and David Abel and Wenhao Zhang},
+   journal={arXiv preprint arXiv:2507.18623},
+   year={2025},
+   url={https://huggingface.co/papers/2507.18623}
+ }
+ ```