Commit 8ce4bc5 (verified) by nielsr (HF Staff) · 1 Parent(s): c87bfa0

Improve dataset card and add metadata

Hi! I'm Niels from the community science team at Hugging Face.

This PR improves the dataset card for the preprocessed Bridge dataset:
- Added `task_categories: robotics` to the YAML metadata.
- Fixed a typo in the tags (`robotc` -> `robotics`).
- Added links to the [LPWM paper](https://huggingface.co/papers/2603.04553), project page, and GitHub repository.
- Included the BibTeX citation from the official repository.
- Added a more detailed description of the preprocessing performed on this data.

Files changed (1):
  1. README.md (+30 −2)
README.md CHANGED
````diff
@@ -1,13 +1,41 @@
 ---
 license: cc-by-4.0
+task_categories:
+- robotics
 tags:
 - video
 - prediction
-- robotc
+- robotics
 - manipulation
 - language
 - t5
 - bridge
 - open-x
 ---
-Bridge data from Open-x, preprocessed with T5-large embeddings for the instructions.
+
+# Bridge Dataset (Preprocessed for LPWM)
+
+This repository contains preprocessed videos from the BRIDGE dataset (part of [Open-X Embodiment](https://github.com/google-deepmind/open_x_embodiment)) as used in the paper [Latent Particle World Models: Self-supervised Object-centric Stochastic Dynamics Modeling](https://huggingface.co/papers/2603.04553).
+
+The videos are preprocessed with T5-large embeddings for the text instructions to support language-conditioned world modeling.
+
+- **Project Page:** [https://taldatech.github.io/lpwm-web](https://taldatech.github.io/lpwm-web)
+- **GitHub Repository:** [https://github.com/taldatech/lpwm](https://github.com/taldatech/lpwm)
+- **Paper:** [arXiv:2603.04553](https://arxiv.org/abs/2603.04553)
+
+## Description
+
+Latent Particle World Model (LPWM) is a self-supervised object-centric world model that autonomously discovers keypoints, bounding boxes, and object masks directly from video data. This specific dataset contains Bridge data formatted for training LPWM, including language embeddings for goal conditioning.
+
+## Citation
+
+```bibtex
+@inproceedings{
+daniel2026latent,
+title={Latent Particle World Models: Self-supervised Object-centric Stochastic Dynamics Modeling},
+author={Tal Daniel and Carl Qi and Dan Haramati and Amir Zadeh and Chuan Li and Aviv Tamar and Deepak Pathak and David Held},
+booktitle={The Fourteenth International Conference on Learning Representations},
+year={2026},
+url={https://openreview.net/forum?id=lTaPtGiUUc}
+}
+```
````
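The card above says each example pairs a video clip with a precomputed T5-large embedding of its language instruction. A minimal sketch of what one such example might look like, assuming hypothetical field names and clip dimensions (the repository's actual schema is not shown here; the only factual anchor is that T5-large's encoder hidden size is 1024):

```python
import numpy as np

# d_model of T5-large: per-token encoder outputs have this width.
T5_LARGE_DIM = 1024

def make_dummy_example(num_frames=16, height=64, width=64, instr_tokens=8):
    """Build a placeholder example with shapes such a dataset might use.

    Field names ("video", "instruction_embedding") and the clip/resolution
    values are illustrative assumptions, not the repository's real schema.
    """
    return {
        # uint8 RGB frames, shape (T, H, W, 3)
        "video": np.zeros((num_frames, height, width, 3), dtype=np.uint8),
        # Precomputed T5-large encoder outputs, shape (tokens, 1024)
        "instruction_embedding": np.zeros(
            (instr_tokens, T5_LARGE_DIM), dtype=np.float32
        ),
    }

example = make_dummy_example()
print(example["video"].shape, example["instruction_embedding"].shape)
```

Storing the embeddings alongside the frames lets a world-model training loop condition on language without running the T5 encoder online.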