Tags: Robotics · PyTorch · world-model · jepa · planning
Basile-Terv committed · Commit a06a805 · verified · 1 Parent(s): 653a2d3

Upload README.md with huggingface_hub

Files changed (1):
  1. README.md +32 -3
README.md CHANGED
@@ -12,7 +12,30 @@ datasets:
  - facebook/jepa-wms
  ---
 
- # JEPA-WMs: Pretrained World Models
+ <h1 align="center">
+ <p>🤖 <b>JEPA-WMs Pretrained Models</b></p>
+ </h1>
+
+ <h2 align="center">
+ <p><i>World models for robot planning with latent imagination 🧠</i></p>
+ </h2>
+
+ <div align="center" style="line-height: 1;">
+ <a href="https://github.com/facebookresearch/jepa-wms" target="_blank" style="margin: 2px;"><img alt="Github" src="https://img.shields.io/badge/Github-facebookresearch/jepa--wms-black?logo=github" style="display: inline-block; vertical-align: middle;"/></a>
+ <a href="https://huggingface.co/facebook/jepa-wms" target="_blank" style="margin: 2px;"><img alt="HuggingFace" src="https://img.shields.io/badge/🤗%20HuggingFace-facebook/jepa--wms-ffc107" style="display: inline-block; vertical-align: middle;"/></a>
+ <a href="https://arxiv.org/abs/2512.24497" target="_blank" style="margin: 2px;"><img alt="ArXiv" src="https://img.shields.io/badge/arXiv-2512.24497-b5212f?logo=arxiv" style="display: inline-block; vertical-align: middle;"/></a>
+ </div>
+
+ <br>
+
+ <p align="center">
+ <b><a href="https://ai.facebook.com/research/">Meta AI Research, FAIR</a></b>
+ </p>
+
+ <p align="center">
+ This 🤗 HuggingFace repository hosts pretrained <b>JEPA-WM</b> world models.<br>
+ 👉 See the <a href="https://github.com/facebookresearch/jepa-wms">main repository</a> for training code and datasets.
+ </p>
 
  This repository contains pretrained world model checkpoints from the paper
  ["What Drives Success in Physical Planning with Joint-Embedding Predictive World Models?"](https://arxiv.org/abs/2512.24497)
@@ -85,10 +108,15 @@ checkpoint_path = hf_hub_download(
      filename="jepa_wm_droid.pth.tar"
  )
 
- # Load with PyTorch
+ # Load checkpoint (contains 'encoder', 'predictor', and 'heads' state dicts)
  checkpoint = torch.load(checkpoint_path, map_location="cpu")
+ print(checkpoint.keys())  # dict_keys(['encoder', 'predictor', 'heads', 'opt', 'scaler', 'epoch', 'batch_size', 'lr', 'amp'])
  ```
 
+ > **Note**: This only downloads the weights. To instantiate the full model with the correct
+ > architecture and load the weights, we recommend using PyTorch Hub (see above) or cloning the
+ > [jepa-wms repository](https://github.com/facebookresearch/jepa-wms) and using the training/eval scripts.
+
  ## Citation
 
  ```bibtex
@@ -111,4 +139,5 @@ These models are licensed under [CC-BY-NC 4.0](https://creativecommons.org/licen
 
  - 📄 [Paper](https://arxiv.org/abs/2512.24497)
  - 💻 [GitHub Repository](https://github.com/facebookresearch/jepa-wms)
- - 🤗 [Dataset](https://huggingface.co/datasets/facebook/jepa-wms)
+ - 🤗 [Datasets](https://huggingface.co/datasets/facebook/jepa-wms)
+ - 🤗 [Models](https://huggingface.co/facebook/jepa-wms)
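
The `torch.load` pattern added to the README can be sanity-checked offline by round-tripping a checkpoint-shaped dict with `torch.save`/`torch.load` (a minimal sketch: the file name and tensor shapes here are made up; the real checkpoint is fetched with `hf_hub_download` as shown in the diff above):

```python
import os
import tempfile

import torch

# Mimic the checkpoint layout named in the diff: 'encoder',
# 'predictor', and 'heads' state dicts plus training metadata.
ckpt = {
    "encoder": {"weight": torch.zeros(4, 4)},
    "predictor": {"weight": torch.ones(4)},
    "heads": {},
    "epoch": 0,
}

path = os.path.join(tempfile.mkdtemp(), "jepa_wm_demo.pth.tar")
torch.save(ckpt, path)

# Same call as in the README snippet.
loaded = torch.load(path, map_location="cpu")
print(sorted(loaded.keys()))  # ['encoder', 'epoch', 'heads', 'predictor']
```

As the README's note says, this only inspects raw weights; instantiating the actual model architecture requires the jepa-wms repository.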