Remove code snippets from pipeline, make cards clickable links, GB10 envs to 1024
- CLAUDE.md +1 -0
- index.html +16 -45
- style.css +7 -23
CLAUDE.md
CHANGED

@@ -7,6 +7,7 @@
 | ID | Time | T | Title | Read |
 |----|------|---|-------|------|
+| #2374 | 6:52 PM | π | Pipeline Tool Cards Refactored as Clickable Links | ~316 |
 | #2364 | 6:25 PM | β | Dell Pro Max GB10 Equipment Specs Updated | ~254 |
 | #2361 | 6:17 PM | β | Updated team credits with role reorganization and new contributors | ~250 |
 | #2360 | " | β | Updated credits to reflect development role | ~190 |
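The hunk above prepends the new entry directly beneath the table's separator row, keeping the changelog newest-first. A hypothetical helper (not part of the repo) that performs the same prepend on a Markdown table:

```python
def prepend_changelog_row(md: str, row: str) -> str:
    """Insert a new table row immediately after the |---| separator line."""
    lines = md.splitlines()
    for i, line in enumerate(lines):
        if line.startswith("|--"):
            return "\n".join(lines[: i + 1] + [row] + lines[i + 1 :])
    raise ValueError("no table separator found")

table = (
    "| ID | Time | T | Title | Read |\n"
    "|----|------|---|-------|------|\n"
    "| #2364 | 6:25 PM | β | Dell Pro Max GB10 Equipment Specs Updated | ~254 |"
)
new = prepend_changelog_row(
    table,
    "| #2374 | 6:52 PM | π | Pipeline Tool Cards Refactored as Clickable Links | ~316 |",
)
```

The new row lands at index 2 (right after the header and separator), leaving older entries below it.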
index.html
CHANGED

@@ -126,65 +126,36 @@
 <h2>PIPELINE</h2>
 <p class="deploy-intro">The complete motion-to-policy pipeline uses six tools. Every trained policy is available as an ONNX model (160-dim input, 29-dim output) with baked-in observation normalization.</p>
 <div class="deploy-options">
-<div class="deploy-option">
+<a class="deploy-option" href="https://movin3d.com/" target="_blank" rel="noopener">
 <div class="deploy-num">01</div>
 <h3>MOVIN Studio</h3>
 <p>Recording and processing software for MOVIN TRACIN markerless mocap. Captures full-body motion in real-time using LiDAR + vision at 60 FPS. Exports BVH with 51-joint humanoid skeleton, plus FBX for Blender, Maya, Unreal, and Unity.</p>
-…
-</div>
+</a>
-<div class="deploy-option">
+<a class="deploy-option" href="https://github.com/MOVIN3D/Motion-Player-ROS" target="_blank" rel="noopener">
 <div class="deploy-num">02</div>
 <h3>Motion-Player-ROS</h3>
-<p>ROS 2 package for retargeting and previewing motion capture on the G1. Supports both playback of pre-recorded …
-<pre><code>…
-ros2 launch motion_player player.launch.py \
-  motion_file:=clip.pkl bvh_file:=clip.bvh
-
-# Real-time mocap
-ros2 launch motion_player realtime.launch.py \
-  port:=9000 human_height:=1.75</code></pre>
+<p>ROS 2 package for retargeting and previewing motion capture on the G1. Supports both playback of pre-recorded .pkl files and real-time retargeting from live MOVIN TRACIN data via OSC/UDP. Dual visualization shows the original human BVH skeleton alongside the retargeted robot motion in RViz.</p>
-</div>
+</a>
-<div class="deploy-option">
+<a class="deploy-option" href="https://github.com/AIM-Intelligence/video2robot" target="_blank" rel="noopener">
 <div class="deploy-num">03</div>
 <h3>video2robot</h3>
-<p>End-to-end pipeline converting any video to robot motion. Extracts 3D human pose via …
-<pre><code>…
-python scripts/extract_pose.py --project data/clip
-
-# Retarget to G1
-python scripts/convert_to_robot.py \
-  --project data/clip --robot unitree_g1</code></pre>
+<p>End-to-end pipeline converting any video to robot motion. Extracts 3D human pose via PromptHMR (SMPL-X), then retargets to the G1's 29-DOF joint space using GMR inverse kinematics. Works with YouTube, phone video, or AI-generated footage — no mocap hardware needed.</p>
-</div>
+</a>
-<div class="deploy-option">
+<a class="deploy-option" href="https://github.com/mujocolab/mjlab" target="_blank" rel="noopener">
 <div class="deploy-num">04</div>
 <h3>mjlab</h3>
-<p>GPU-accelerated RL training framework combining Isaac Lab's manager-based API with …
-<pre><code>…
-uv run train Mjlab-Tracking-Flat-Unitree-G1 \
-  --env.commands.motion.motion-file clip.npz \
-  --env.scene.num-envs 8192 \
-  --agent.max-iterations 30000</code></pre>
+<p>GPU-accelerated RL training framework combining Isaac Lab's manager-based API with MuJoCo-Warp simulation. Trains PPO policies across 8,192 parallel environments on a single GPU. Motion imitation uses 14-body tracking with reward shaping for position, orientation, velocity, and collision avoidance.</p>
-</div>
+</a>
-<div class="deploy-option">
+<a class="deploy-option" href="https://github.com/GDDG08/RoboJuDo" target="_blank" rel="noopener">
 <div class="deploy-num">05</div>
 <h3>RoboJuDo</h3>
 <p>Plug-and-play sim2real deployment framework for humanoid robots. Modular architecture separates controller (joystick/keyboard/motion), environment (MuJoCo sim or real robot via Unitree SDK), and policy (ONNX/JIT). Supports seamless switching between sim2sim validation and real hardware with minimal code changes.</p>
-<pre><code>…
-python scripts/run_pipeline.py --config=g1
-
-# Deploy to physical G1
-python scripts/run_pipeline.py --config=g1_real</code></pre>
-</div>
+</a>
-<div class="deploy-option">
+<a class="deploy-option" href="https://github.com/zalo/mujoco_wasm" target="_blank" rel="noopener">
 <div class="deploy-num">06</div>
 <h3>MuJoCo WASM</h3>
 <p>Browser-based 3D visualization of trained policies. Runs MuJoCo physics simulation via WebAssembly with ONNX Runtime Web for neural network inference — no install required. Each policy card in the gallery above is a live interactive viewer.</p>
-</div>
+</a>
 </div>
 <p class="deploy-footer">Full observation vector spec and integration guide in the <a href="https://huggingface.co/datasets/exptech/g1-moves" target="_blank" rel="noopener">dataset README</a>.</p>
 </div>

@@ -269,7 +240,7 @@ python scripts/run_pipeline.py --config=g1_real</code></pre>
 </div>
 <div class="equipment-item">
 <span class="equipment-label">Envs</span>
-<span class="equipment-value">…
+<span class="equipment-value">1,024 parallel</span>
 </div>
 </div>
 </div>
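Every rewritten card anchor pairs `target="_blank"` with `rel="noopener"`, so the linked page cannot reach back into the Space via `window.opener`. A quick standalone check of that invariant (hypothetical snippet, using only the stdlib `html.parser`; the markup below is condensed from the diff):

```python
from html.parser import HTMLParser

SNIPPET = """
<a class="deploy-option" href="https://github.com/mujocolab/mjlab"
   target="_blank" rel="noopener">
  <div class="deploy-num">04</div>
  <h3>mjlab</h3>
</a>
"""

class CardLinkChecker(HTMLParser):
    """Collect deploy-option anchors that open a new tab without rel=noopener."""

    def __init__(self):
        super().__init__()
        self.unsafe = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "a" and "deploy-option" in (a.get("class") or ""):
            if a.get("target") == "_blank" and "noopener" not in (a.get("rel") or ""):
                self.unsafe.append(a.get("href"))

checker = CardLinkChecker()
checker.feed(SNIPPET)
```

An empty `checker.unsafe` list means every card link in the snippet carries the expected `rel` attribute.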
style.css
CHANGED

@@ -815,16 +815,20 @@ a:hover { opacity: 0.8; }
 gap: 1.5rem;
 }

-.deploy-option {
+a.deploy-option, .deploy-option {
 background: var(--bg-card);
 border: 1px solid var(--border);
 border-radius: var(--radius);
 padding: 1.5rem;
 transition: border-color 0.2s;
+text-decoration: none;
+color: inherit;
+display: block;
+cursor: pointer;
 }

-.deploy-option:hover {
+a.deploy-option:hover, .deploy-option:hover {
 border-color: var(--accent);
 }

 .deploy-num {

@@ -852,26 +856,6 @@ a:hover { opacity: 0.8; }
 margin-bottom: 0.75rem;
 }

-.deploy-option p a {
-color: var(--accent);
-text-decoration: none;
-}
-
-.deploy-option pre {
-background: var(--bg-primary);
-border: 1px solid var(--border);
-border-radius: 6px;
-padding: 0.75rem;
-overflow-x: auto;
-margin: 0;
-}
-
-.deploy-option code {
-font-family: var(--font-mono);
-font-size: 0.7rem;
-color: var(--text-primary);
-line-height: 1.5;
-}

 .deploy-footer {
 color: var(--text-tertiary);