---
license: cc-by-4.0
---
# April Robotics Data Sample
This dataset contains 3D ground-truth hand and finger annotations paired with egocentric (head-mounted) and wrist-mounted camera recordings of industrial assembly operations, captured in an active manufacturing environment. The data showcases our multimodal capture system: a head-mounted camera, a wrist-mounted camera, and a sensorized glove that precisely tracks human hand motion and manipulation.
## Dataset Summary
| Property | Value |
|----------|-------|
| Total episodes | 6 |
| Frames | 7,205 |
| Tasks | 2 distinct industrial manipulation tasks (sorting and drilling) |
| Modalities | RGB, 3D Hand Keypoints, Wrist Pose, Head Pose, Language Annotations |
| Resolution | 720×960 (head cam), 480×640 (wrist cam) @ 30 FPS |
| Format | LeRobotDataset v3.0 |
## Quick Start
```python
from lerobot.datasets.lerobot_dataset import LeRobotDataset
dataset = LeRobotDataset(repo_id="aprilrobotics/sample", episodes=[0])
frame = dataset[10]
# Camera images: stored as (3, H, W) tensors; transpose to (H, W, 3)
head_img = frame["observation.images.head_cam"].numpy().transpose(1, 2, 0)
wrist_img = frame["observation.images.wrist_cam"].numpy().transpose(1, 2, 0)
# State vector: [head_pose(7), wrist_pose(7), hand_keypoints(63)]
state = frame["observation.state"].numpy()
head_pose = state[0:7] # position (xyz) + quaternion (xyzw)
wrist_pose = state[7:14] # position (xyz) + quaternion (xyzw)
# 21 hand keypoints (xyz each):
# 0=wrist, 1-4=thumb, 5-8=index, 9-12=middle, 13-16=ring, 17-20=pinky
keypoints = state[14:].reshape(21, 3)
```
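The state layout above (7 + 7 + 63 = 77 dimensions) can be unpacked without loading the dataset. A minimal sketch, assuming the xyzw quaternion convention and keypoint ordering documented above; the helper names and the synthetic state vector are illustrative, not part of the dataset API:

```python
import numpy as np

def unpack_state(state):
    """Split the 77-dim state vector into head pose, wrist pose, and 21 hand keypoints."""
    assert state.shape == (77,)
    head_pose = state[0:7]                  # position (xyz) + quaternion (xyzw)
    wrist_pose = state[7:14]                # position (xyz) + quaternion (xyzw)
    keypoints = state[14:].reshape(21, 3)   # wrist + 4 joints per finger
    return head_pose, wrist_pose, keypoints

def quat_xyzw_to_matrix(q):
    """Convert a unit quaternion (x, y, z, w) to a 3x3 rotation matrix."""
    x, y, z, w = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - z*w),     2*(x*z + y*w)],
        [2*(x*y + z*w),     1 - 2*(x*x + z*z), 2*(y*z - x*w)],
        [2*(x*z - y*w),     2*(y*z + x*w),     1 - 2*(x*x + y*y)],
    ])

# Synthetic example: identity rotations, all keypoints at the origin.
state = np.zeros(77)
state[6] = state[13] = 1.0  # w = 1 -> identity quaternion for head and wrist
head, wrist, kps = unpack_state(state)
R = quat_xyzw_to_matrix(wrist[3:7])         # 3x3 wrist rotation matrix
# Fingertips per the keypoint layout: thumb=4, index=8, middle=12, ring=16, pinky=20
fingertips = kps[[4, 8, 12, 16, 20]]        # shape (5, 3)
```

In practice you would pass `frame["observation.state"].numpy()` from the Quick Start snippet in place of the synthetic `state`.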
## Visualization
We provide visualization using Rerun. See the [visualization repository](https://github.com/AprilRoboticsAI/Visualization) for instructions.
<video controls style="max-width: 720px; width: 100%;">
<source src="https://huggingface.co/datasets/aprilrobotics/sample/resolve/main/screw_sort_preview.mp4"
type="video/mp4">
</video>
## Intended Uses
- Vision-Language-Action model pre/post-training
- World model pre/post-training
- Video generation training
- VR/AR/XR applications
- Hand tracking model training
## About April Robotics
We view the human body as another embodiment of humanoid robots, and believe human behavior should be captured with the same fidelity as robotic data. By building wearables such as sensorized gloves and using the data ourselves to train and deploy humanoids, we create a closed loop in which real-world use continuously improves data quality.
### What We Can Offer
April Robotics captures ground truth hand tracking data using proprietary sensorized wearables. We can provide:
- Accurate human expert data (hands & head pose) from partner factories across multiple industries
- Additional modalities: touch, depth, audio
- On-demand high-quality data collection for your custom training needs
## Contact
📧 joni@aprilrobotics.ai |