---
task_categories:
  - robotics
tags:
  - human-robot-interaction
  - multi-agent
  - embodied-ai
  - reinforcement-learning
dataset_info:
  features:
    - name: map_name
      dtype: string
    - name: trajectory_id
      dtype: int64
    - name: steps_data
      dtype: string
    - name: num_steps
      dtype: int64
  splits:
    - name: train
      num_bytes: 562424179
      num_examples: 912
    - name: evaluation
      num_bytes: 54684865
      num_examples: 96
    - name: all_data
      num_bytes: 617109044
      num_examples: 1008
  download_size: 261118576
  dataset_size: 1234218088
configs:
  - config_name: default
    data_files:
      - split: train
        path: data/train-*
      - split: evaluation
        path: data/evaluation-*
      - split: all_data
        path: data/all_data-*
---

# Moving Out: Physically-grounded Human-AI Collaboration Dataset

This repository contains the human-human interaction dataset for the paper *Moving Out: Physically-grounded Human-AI Collaboration*.

Moving Out is a benchmark for human-AI collaboration in physical environments, targeting embodied agents (e.g., robots). It covers a wide range of collaboration modes shaped by physical attributes and constraints, such as jointly carrying heavy items and keeping actions consistent while maneuvering a large item around a corner. The dataset contains human-human interaction data collected for two tasks, designed to evaluate a model's ability to adapt to diverse human behaviors and unseen physical attributes in continuous state-action spaces with constrained dynamics.

**Project Page:** https://live-robotics-uva.github.io/movingout_ai/

**Code:** https://github.com/live-robotics-uva/moving_out_ai

## Sample Usage

You can visualize trajectories from this dataset with the `dataset_to_video.py` script provided in the associated GitHub repository.

Save a trajectory as an MP4 file:

```shell
python dataset_to_video.py -f ShuaKang/movingout_task2 -m HandOff -t 4 -v video
```

Show a trajectory in a pop-up window:

```shell
python dataset_to_video.py -f ShuaKang/movingout_task2 -m HandOff -t 4 -v human
```

Use `-m` to select the map name and `-t` to select the trajectory ID.
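Beyond the visualization script, records can also be consumed directly. A minimal sketch of decoding one record, assuming the schema in the metadata above and assuming `steps_data` is a JSON-serialized string (the card stores it as `dtype: string` but does not state the encoding):

```python
import json


def decode_example(example: dict) -> dict:
    """Decode one dataset record into Python objects.

    Assumes the features listed in the card's metadata (map_name,
    trajectory_id, steps_data, num_steps) and that steps_data holds a
    JSON-serialized list of per-step records -- an assumption, not
    confirmed by the card.
    """
    return {
        "map_name": example["map_name"],
        "trajectory_id": example["trajectory_id"],
        "num_steps": example["num_steps"],
        # Parse the serialized step sequence into a Python list.
        "steps": json.loads(example["steps_data"]),
    }


# With the Hugging Face `datasets` library, records could be loaded and
# decoded like this (requires network access; repo id taken from this card):
#   from datasets import load_dataset
#   ds = load_dataset("ShuaKang/movingout_task1", split="train")
#   record = decode_example(ds[0])
```

If `steps_data` turns out to use a different serialization, only the `json.loads` call needs to change.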

## Citation

If you use this dataset in your research, please cite the original paper:

```bibtex
@article{kang2025movingout,
      title={Moving Out: Physically-grounded Human-AI Collaboration},
      author={Shua Kang and Wenxin Xia and Ziqi Li and Mingxing Yuan and Xufan Wu and Min Bai and Boyi Liu and David Abel and Wenhao Zhang},
      journal={arXiv preprint arXiv:2507.18623},
      year={2025},
      url={https://huggingface.co/papers/2507.18623}
}
```