---
license: other
license_name: nvidia-open-model-license
license_link: LICENSE
tags:
- robotics
- humanoid
- whole-body-control
- reinforcement-learning
- motion-tracking
- teleoperation
- pytorch
- isaac-lab
pipeline_tag: reinforcement-learning
---

# GEAR-SONIC: Supersizing Motion Tracking for Natural Humanoid Whole-Body Control

<div align="center">
  <img src="sonic-preview-gif-480P.gif" width="800">
</div>

## Model Description

**SONIC** (Supersizing Motion Tracking) is a humanoid behavior foundation model developed by NVIDIA that gives robots a core set of motor skills learned from large-scale human motion data. Rather than building separate controllers for predefined motions, SONIC uses motion tracking as a scalable training task, enabling a single unified policy to produce natural, whole-body movement and support a wide range of behaviors.

### Key Features

- 🤖 **Unified Whole-Body Control**: Single policy handles walking, running, crawling, jumping, manipulation, and more
- 🎯 **Motion Tracking**: Trained on large-scale human motion data for natural movements
- 🎮 **Real-Time Teleoperation**: VR-based whole-body teleoperation via PICO headset
- 🚀 **Hardware Deployment**: C++ inference stack for real-time control on humanoid robots
- 🎨 **Kinematic Planner**: Real-time locomotion generation with multiple movement styles
- 🔄 **Multi-Modal Control**: Supports keyboard, gamepad, VR, and high-level planning

## VR Whole-Body Teleoperation

SONIC supports real-time whole-body teleoperation via PICO VR headset, enabling natural human-to-robot motion transfer for data collection and interactive control.

<div align="center">
<table>
<tr>
<td align="center"><b>Walking</b></td>
<td align="center"><b>Running</b></td>
</tr>
<tr>
<td align="center"><img src="media/teleop_walking.gif" width="400"></td>
<td align="center"><img src="media/teleop_running.gif" width="400"></td>
</tr>
<tr>
<td align="center"><b>Sideways Movement</b></td>
<td align="center"><b>Kneeling</b></td>
</tr>
<tr>
<td align="center"><img src="media/teleop_sideways.gif" width="400"></td>
<td align="center"><img src="media/teleop_kneeling.gif" width="400"></td>
</tr>
<tr>
<td align="center"><b>Getting Up</b></td>
<td align="center"><b>Jumping</b></td>
</tr>
<tr>
<td align="center"><img src="media/teleop_getup.gif" width="400"></td>
<td align="center"><img src="media/teleop_jumping.gif" width="400"></td>
</tr>
<tr>
<td align="center"><b>Bimanual Manipulation</b></td>
<td align="center"><b>Object Hand-off</b></td>
</tr>
<tr>
<td align="center"><img src="media/teleop_bimanual.gif" width="400"></td>
<td align="center"><img src="media/teleop_switch_hands.gif" width="400"></td>
</tr>
</table>
</div>

## Kinematic Planner

SONIC includes a kinematic planner for real-time locomotion generation: choose a movement style, steer with keyboard/gamepad, and adjust speed and height on the fly.

<div align="center">
<table>
<tr>
<td align="center" colspan="2"><b>In-the-Wild Navigation</b></td>
</tr>
<tr>
<td align="center" colspan="2"><img src="media/planner/planner_in_the_wild_navigation.gif" width="800"></td>
</tr>
<tr>
<td align="center"><b>Run</b></td>
<td align="center"><b>Happy</b></td>
</tr>
<tr>
<td align="center"><img src="media/planner/planner_run.gif" width="400"></td>
<td align="center"><img src="media/planner/planner_happy.gif" width="400"></td>
</tr>
<tr>
<td align="center"><b>Stealth</b></td>
<td align="center"><b>Injured</b></td>
</tr>
<tr>
<td align="center"><img src="media/planner/planner_stealth.gif" width="400"></td>
<td align="center"><img src="media/planner/planner_injured.gif" width="400"></td>
</tr>
<tr>
<td align="center"><b>Kneeling</b></td>
<td align="center"><b>Hand Crawling</b></td>
</tr>
<tr>
<td align="center"><img src="media/planner/planner_kneeling.gif" width="400"></td>
<td align="center"><img src="media/planner/planner_hand_crawling.gif" width="400"></td>
</tr>
<tr>
<td align="center"><b>Elbow Crawling</b></td>
<td align="center"><b>Boxing</b></td>
</tr>
<tr>
<td align="center"><img src="media/planner/planner_elbow_crawling.gif" width="400"></td>
<td align="center"><img src="media/planner/planner_boxing.gif" width="400"></td>
</tr>
</table>
</div>

## Quick Start

📚 **See the [Quick Start Guide](https://nvlabs.github.io/GR00T-WholeBodyControl/getting_started/quickstart.html)** for step-by-step instructions on:
- Installation and setup
- Running SONIC with different control modes (keyboard, gamepad, VR)
- Deploying on real hardware
- Using the kinematic planner

**Key Resources:**
- [Installation Guide](https://nvlabs.github.io/GR00T-WholeBodyControl/getting_started/installation_deploy.html) - Complete setup instructions
- [Keyboard Control Tutorial](https://nvlabs.github.io/GR00T-WholeBodyControl/tutorials/keyboard.html) - Get started with keyboard control
- [Gamepad Control Tutorial](https://nvlabs.github.io/GR00T-WholeBodyControl/tutorials/gamepad.html) - Set up gamepad control
- [VR Teleoperation Setup](https://nvlabs.github.io/GR00T-WholeBodyControl/getting_started/vr_teleop_setup.html) - Full-body VR control

## Model Checkpoints

All checkpoints (ONNX format) are available directly in this repository. Inference is powered by TensorRT and runs on both desktop and Jetson hardware.

| Checkpoint | File | Description |
|---|---|---|
| Policy encoder | `model_encoder.onnx` | Encodes motion reference into latent |
| Policy decoder | `model_decoder.onnx` | Decodes latent into joint actions |
| Kinematic planner | `planner_sonic.onnx` | Real-time locomotion style planner |
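
The table above implies a two-stage pipeline: the encoder maps a motion reference to a latent, and the decoder maps that latent to joint actions. Below is a minimal sketch of locating the checkpoints and inspecting the encoder's inputs, using `onnxruntime` as a stand-in for the shipped TensorRT/C++ stack. The directory layout follows the `snapshot_download` example; the tensor names and shapes printed here are whatever the models declare, not documented values.

```python
# Sketch: locate the SONIC ONNX checkpoints and inspect the encoder.
# Assumes files were downloaded to ./gear_sonic_deploy (see snapshot_download
# above); onnxruntime is used only for inspection -- the deployment stack
# described in this card runs TensorRT in C++.
from pathlib import Path

CHECKPOINTS = {
    "encoder": "model_encoder.onnx",   # motion reference -> latent
    "decoder": "model_decoder.onnx",   # latent -> joint actions
    "planner": "planner_sonic.onnx",   # real-time locomotion style planner
}

def checkpoint_paths(root="gear_sonic_deploy"):
    """Map each pipeline stage to its expected file path under the download dir."""
    return {name: Path(root) / fname for name, fname in CHECKPOINTS.items()}

def missing_checkpoints(root="gear_sonic_deploy"):
    """Return the stage names whose ONNX file is not present locally."""
    return [name for name, path in checkpoint_paths(root).items() if not path.exists()]

if __name__ == "__main__":
    missing = missing_checkpoints()
    if missing:
        print("Run snapshot_download first; missing stages:", missing)
    else:
        import onnxruntime as ort  # pip install onnxruntime
        enc = ort.InferenceSession(str(checkpoint_paths()["encoder"]))
        # Read the real input names/shapes from the model before wiring a loop.
        for inp in enc.get_inputs():
            print(inp.name, inp.shape, inp.type)
```

Inspecting `get_inputs()`/`get_outputs()` first is the safe way to chain the encoder and decoder, since the observation layout is defined by the exported graphs rather than by this card.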

**Quick download** (requires `pip install huggingface_hub`):

```python
from huggingface_hub import snapshot_download
snapshot_download(repo_id="nvidia/GEAR-SONIC", local_dir="gear_sonic_deploy")
```

Or use the download script from the GitHub repo:

```bash
python download_from_hf.py             # policy + planner (default)
python download_from_hf.py --no-planner # policy only
```

See the [Download Models guide](https://nvlabs.github.io/GR00T-WholeBodyControl/getting_started/download_models.html) for full instructions.

## Documentation

📚 **[Full Documentation](https://nvlabs.github.io/GR00T-WholeBodyControl/)**

### Guides
- [Installation (Deployment)](https://nvlabs.github.io/GR00T-WholeBodyControl/getting_started/installation_deploy.html)
- [Installation (Training)](https://nvlabs.github.io/GR00T-WholeBodyControl/getting_started/installation_training.html)
- [Quick Start](https://nvlabs.github.io/GR00T-WholeBodyControl/getting_started/quickstart.html)
- [VR Teleoperation Setup](https://nvlabs.github.io/GR00T-WholeBodyControl/getting_started/vr_teleop_setup.html)

### Tutorials
- [Keyboard Control](https://nvlabs.github.io/GR00T-WholeBodyControl/tutorials/keyboard.html)
- [Gamepad Control](https://nvlabs.github.io/GR00T-WholeBodyControl/tutorials/gamepad.html)
- [VR Whole-Body Teleoperation](https://nvlabs.github.io/GR00T-WholeBodyControl/tutorials/vr_wholebody_teleop.html)

## Repository Structure

```
GR00T-WholeBodyControl/
├── gear_sonic_deploy/     # C++ inference stack for deployment
├── gear_sonic/            # Teleoperation and data collection tools
├── decoupled_wbc/         # Decoupled WBC (GR00T N1.5/N1.6)
├── docs/                  # Documentation source
└── media/                 # Videos and images
```

## Related Projects

This repository is part of NVIDIA's GR00T (Generalist Robot 00 Technology) initiative:
- **[GR00T N1.5](https://research.nvidia.com/labs/gear/gr00t-n1_5/)**: Previous generation decoupled controller
- **[GR00T N1.6](https://research.nvidia.com/labs/gear/gr00t-n1_6/)**: Improved decoupled WBC approach
- **[GEAR-SONIC Website](https://nvlabs.github.io/GEAR-SONIC/)**: Project page with videos and details

## Citation

If you use GEAR-SONIC in your research, please cite:

```bibtex
@article{luo2025sonic,
    title={SONIC: Supersizing Motion Tracking for Natural Humanoid Whole-Body Control},
    author={Luo, Zhengyi and Yuan, Ye and Wang, Tingwu and Li, Chenran and Chen, Sirui and Casta\~neda, Fernando and Cao, Zi-Ang and Li, Jiefeng and Minor, David and Ben, Qingwei and Da, Xingye and Ding, Runyu and Hogg, Cyrus and Song, Lina and Lim, Edy and Jeong, Eugene and He, Tairan and Xue, Haoru and Xiao, Wenli and Wang, Zi and Yuen, Simon and Kautz, Jan and Chang, Yan and Iqbal, Umar and Fan, Linxi and Zhu, Yuke},
    journal={arXiv preprint arXiv:2511.07820},
    year={2025}
}
```

## License

This project uses **dual licensing**:

- **Source Code**: Apache License 2.0 - applies to all code, scripts, and software components
- **Model Weights**: NVIDIA Open Model License - applies to all trained model checkpoints

**Key points of the NVIDIA Open Model License:**
- ✅ Commercial use permitted with attribution
- ✅ Modification and distribution allowed
- ⚠️ Must comply with NVIDIA's Trustworthy AI terms
- ⚠️ Model outputs subject to responsible use guidelines

See [LICENSE](https://github.com/NVlabs/GR00T-WholeBodyControl/blob/main/LICENSE) for complete terms.

## Support & Contact

- 📧 **Email**: [gear-wbc@nvidia.com](mailto:gear-wbc@nvidia.com)
- 🐛 **Issues**: [GitHub Issues](https://github.com/NVlabs/GR00T-WholeBodyControl/issues)
- 📖 **Documentation**: [https://nvlabs.github.io/GR00T-WholeBodyControl/](https://nvlabs.github.io/GR00T-WholeBodyControl/)
- 🌐 **Website**: [https://nvlabs.github.io/GEAR-SONIC/](https://nvlabs.github.io/GEAR-SONIC/)

## Acknowledgments

This work builds upon and acknowledges:
- [Beyond Mimic](https://github.com/HybridRobotics/whole_body_tracking) - Whole-body tracking foundation
- [Isaac Lab](https://github.com/isaac-sim/IsaacLab) - Robot learning framework
- NVIDIA Research GEAR Lab team
- All contributors and collaborators

## Model Card Contact

For questions about this model card or responsible AI considerations, contact: [gear-wbc@nvidia.com](mailto:gear-wbc@nvidia.com)