---
license: cc-by-sa-4.0
task_categories:
- robotics
---

# Latent Particle World Models (LPWM)

Project Website | Paper | GitHub

The Latent Particle World Model (LPWM) is a self-supervised, object-centric world model that scales to real-world multi-object datasets and is applicable to decision-making. LPWM autonomously discovers keypoints, bounding boxes, and object masks directly from video, learning rich scene decompositions without supervision. The architecture is trained end-to-end purely from videos and supports flexible conditioning on actions, language, and image goals.

## Sample Usage

To train LPWM on a dataset like Sketchy using the official implementation, you can use the following commands:

```bash
# Install environment
conda env create -f environment.yml
conda activate dlp

# Train LPWM on Sketchy
python train_lpwm.py --dataset sketchy
```
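Since this card accompanies the BAIR 256 video dataset, a minimal sketch of loading one episode's frames into an array is shown below. The per-episode folder-of-PNG-frames layout and the `load_episode` helper are illustrative assumptions, not the documented format of this dataset or the official LPWM data pipeline.

```python
from pathlib import Path

import numpy as np
from PIL import Image


def load_episode(episode_dir):
    """Stack an episode's RGB frames into a (T, H, W, 3) uint8 array.

    Assumes each episode is a directory of PNG frames whose filenames
    sort in temporal order (e.g. frame_000.png, frame_001.png, ...).
    """
    frame_paths = sorted(Path(episode_dir).glob("*.png"))
    frames = [np.asarray(Image.open(p).convert("RGB")) for p in frame_paths]
    return np.stack(frames)
```

For BAIR 256 the resulting array per episode would have shape `(T, 256, 256, 3)`, ready to batch into a video world-model training loop.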

## Citation

```bibtex
@inproceedings{daniel2026latent,
    title={Latent Particle World Models: Self-supervised Object-centric Stochastic Dynamics Modeling},
    author={Tal Daniel and Carl Qi and Dan Haramati and Amir Zadeh and Chuan Li and Aviv Tamar and Deepak Pathak and David Held},
    booktitle={The Fourteenth International Conference on Learning Representations},
    year={2026},
    url={https://openreview.net/forum?id=lTaPtGiUUc}
}
```