---
task_categories:
- robotics
---
# From Prior to Pro: Efficient Skill Mastery via Distribution Contractive RL Finetuning (DICE-RL)
This repository contains the datasets used in the paper [From Prior to Pro: Efficient Skill Mastery via Distribution Contractive RL Finetuning](https://huggingface.co/papers/2603.10263).
[**Project Website**](https://zhanyisun.github.io/dice.rl.2026/) | [**GitHub Repository**](https://github.com/zhanyisun/dice-rl)
## Dataset Description
Distribution Contractive Reinforcement Learning (DICE-RL) is a framework that uses reinforcement learning (RL) to refine pretrained generative robot policies. This repository hosts the data used for pretraining Behavior Cloning (BC) policies and finetuning them with DICE-RL across various Robomimic environments.
The data covers both:
- **Low-dimensional (state-based)** observations.
- **Image-based (pixel-based)** observations.
### Data Splits
- `ph_pretrain`: Datasets used for pretraining the BC policies for broad behavioral coverage.
- `ph_finetune`: Datasets used for DICE-RL finetuning. These trajectories are truncated to have exactly one success at the end to ensure consistent value learning.
## Dataset Structure
The datasets are provided as NumPy files (`.npy`/`.npz`). Once downloaded, they are organized as follows:
```
data_dir/
└── robomimic
├── {env_name}-low-dim
│ ├── ph_pretrain
│ └── ph_finetune
└── {env_name}-img
├── ph_pretrain
└── ph_finetune
```
Each folder contains:
- `train.npy`: The trajectory data.
- `normalization.npz`: Statistics used for data normalization.
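A minimal sketch of how these files can be loaded and used with NumPy (the file names follow the structure above, but the array contents below are synthetic placeholders; the actual layout of `train.npy` may differ per release):

```python
import numpy as np

# Hypothetical stand-in data: the real train.npy holds trajectory data whose
# exact layout depends on the environment and observation type.
obs = np.random.randn(100, 19).astype(np.float32)
np.save("train.npy", obs)
np.savez("normalization.npz", mean=obs.mean(axis=0), std=obs.std(axis=0))

# allow_pickle=True is typically required if trajectories are stored as object arrays.
traj = np.load("train.npy", allow_pickle=True)
stats = np.load("normalization.npz")

# Apply the stored statistics to normalize observations.
norm_obs = (traj - stats["mean"]) / (stats["std"] + 1e-8)
print(norm_obs.shape)
```

The keys inside `normalization.npz` (here assumed to be `mean` and `std`) can be inspected with `list(stats.keys())`.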
## Sample Usage
To download the datasets as intended by the authors, you can use the script provided in the [official repository](https://github.com/zhanyisun/dice-rl):
```console
bash script/download_hf.sh
```
## Citation
```bibtex
@article{sun2026prior,
title={From Prior to Pro: Efficient Skill Mastery via Distribution Contractive RL Finetuning},
author={Sun, Zhanyi and Song, Shuran},
journal={arXiv preprint arXiv:2603.10263},
year={2026}
}
```