---
task_categories:
  - robotics
tags:
  - humanoid
  - world-model
  - contact-planning
  - ego-vision
---

# EgoVCP Dataset

This repository contains the dataset for the paper *Ego-Vision World Model for Humanoid Contact Planning*.

Project Page | GitHub | arXiv

The EgoVCP dataset is a demonstration-free offline dataset used to train a learned world model for humanoid contact planning. It contains data collected with a G1 humanoid robot in the Isaac Lab simulation environment, enabling the world model to predict outcomes in a compressed latent space for contact-aware tasks.

## Dataset Tasks

The dataset includes rollouts for three primary manipulation and navigation scenarios:

- **Wall Support** (`wall`): Supporting the robot against a wall after a perturbation.
- **Ball Blocking** (`ball`): Interacting with and blocking incoming objects (e.g., a yoga ball).
- **Tunnel Traversal** (`tunnel`): Traversing height-limited arches or tunnels.

## Dataset Structure

The dataset contains:

- **Ego-centric depth images**: Captured from the robot's perspective.
- **Proprioception data**: Internal robot state information.
- **Actions and rewards**: For offline world model training.

The data is organized by task into subdirectories: `wall`, `ball`, and `tunnel`.
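This card does not specify the on-disk file format inside each subdirectory, so any loading code will depend on the actual serialization. As a minimal illustrative sketch (the `list_task_files` helper and the assumption of one file per episode are hypothetical, not part of the official tooling), the three task subdirectories could be enumerated like this:

```python
from pathlib import Path

# The three task subdirectories described above. The exact file layout and
# serialization format inside each directory are not documented here, so this
# helper only collects whatever files are present.
TASKS = ["wall", "ball", "tunnel"]

def list_task_files(dataset_root: str) -> dict:
    """Map each task name to the sorted list of files under its subdirectory."""
    root = Path(dataset_root)
    files = {}
    for task in TASKS:
        task_dir = root / task
        if task_dir.is_dir():
            # Recursively gather regular files under the task directory.
            files[task] = sorted(p for p in task_dir.rglob("*") if p.is_file())
        else:
            files[task] = []
    return files
```

Once the per-episode format is known, the returned file lists can be fed into whatever deserialization the official implementation uses.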

## Usage

To use this dataset with the official implementation, you can clone it directly into your project directory:

```bash
cd Ego-VCP
mkdir dataset
git clone https://huggingface.co/datasets/Hang917/EgoVCP_Dataset.git dataset
```

## Citation

If you use this dataset in your research, please cite the following paper:

```bibtex
@article{liu2025egovcp,
  title={Ego-Vision World Model for Humanoid Contact Planning},
  author={Liu, Hang and others},
  journal={arXiv preprint arXiv:2510.11682},
  year={2025}
}
```