aspiridonov committed on Commit 7b4fbe0 · verified · 1 Parent(s): 704c7a0

Update README.md

Files changed (1): README.md (+42 −36)
---
license: cc-by-4.0
---

# April Robotics Data Sample

This dataset contains 3D ground truth hand and finger annotations combined with egocentric and wrist-camera recordings of industrial assembly operations, captured in an active manufacturing environment. The data showcases our multimodal capture system, consisting of head- and wrist-mounted cameras and a sensorized glove, which tracks precise, high-quality human hand motion and manipulation.
## Dataset Summary

| Property | Value |
|----------|-------|
| Total episodes | 6 |
| Frames | 7,205 |
| Tasks | 2 distinct industrial manipulation tasks |
| Modalities | RGB, 3D Hand Keypoints, Wrist Pose, Head Pose, Language Annotations |
| Resolution and rate | 960×720 (head cam), 480×640 (wrist cam) @ 30 FPS |
| Format | LeRobot Dataset v3 |
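As a quick back-of-the-envelope check (not an official statistic), the frame count and capture rate in the summary imply roughly four minutes of footage in total:

```python
# Rough total duration implied by the summary table
total_frames = 7_205
fps = 30
episodes = 6

duration_s = total_frames / fps          # total seconds of footage
avg_episode_s = duration_s / episodes    # average episode length

print(f"{duration_s:.0f} s total, ~{avg_episode_s:.0f} s per episode")
```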
 
## Quick Start

```python
from lerobot.datasets.lerobot_dataset import LeRobotDataset

# Load a single episode and pull one frame from it
dataset = LeRobotDataset(repo_id="aprilrobotics/sample", episodes=[0])
frame = dataset[10]

# Camera images (H, W, 3)
head_img = frame["observation.images.head_cam"].numpy().transpose(1, 2, 0)
wrist_img = frame["observation.images.wrist_cam"].numpy().transpose(1, 2, 0)

# State vector: [head_pose(7), wrist_pose(7), hand_keypoints(63)]
state = frame["observation.state"].numpy()

head_pose = state[0:7]    # position (xyz) + quaternion (xyzw)
wrist_pose = state[7:14]  # position (xyz) + quaternion (xyzw)

# 21 hand keypoints (xyz each):
# 0=wrist, 1-4=thumb, 5-8=index, 9-12=middle, 13-16=ring, 17-20=pinky
keypoints = state[14:].reshape(21, 3)
```
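To make the state layout concrete, here is a standalone NumPy sketch that decodes a state vector of the documented 7 + 7 + 63 shape. The `quat_xyzw_to_matrix` helper and the synthetic state are illustrative only, not part of the dataset or the LeRobot API; the sketch assumes the quaternions are unit-norm in (x, y, z, w) order, as noted above.

```python
import numpy as np

def quat_xyzw_to_matrix(q):
    """Convert a unit quaternion (x, y, z, w) to a 3x3 rotation matrix."""
    x, y, z, w = q
    return np.array([
        [1 - 2 * (y * y + z * z), 2 * (x * y - z * w),     2 * (x * z + y * w)],
        [2 * (x * y + z * w),     1 - 2 * (x * x + z * z), 2 * (y * z - x * w)],
        [2 * (x * z - y * w),     2 * (y * z + x * w),     1 - 2 * (x * x + y * y)],
    ])

# Synthetic state with the documented 7 + 7 + 63 = 77-value layout
state = np.zeros(77, dtype=np.float32)
state[3:7] = [0.0, 0.0, 0.0, 1.0]    # identity quaternion for the head pose
state[10:14] = [0.0, 0.0, 0.0, 1.0]  # identity quaternion for the wrist pose

head_pos, head_quat = state[0:3], state[3:7]
wrist_pos, wrist_quat = state[7:10], state[10:14]
keypoints = state[14:].reshape(21, 3)

# Rotation matrix of the wrist in the world frame
wrist_rot = quat_xyzw_to_matrix(wrist_quat)

# Fingertips per the keypoint convention: thumb=4, index=8, middle=12, ring=16, pinky=20
fingertips = keypoints[[4, 8, 12, 16, 20]]  # (5, 3)

# Keypoints expressed relative to the wrist joint (keypoint 0)
keypoints_rel = keypoints - keypoints[0]
```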

## Visualization

We provide a visualization of our data using Rerun. See the [visualization repository](https://github.com/AprilRoboticsAI/Visualization) for instructions.

## Intended Uses

- Vision-Language-Action model pre/post-training
- World model pre/post-training
- Video generation training
- VR/AR/XR applications
- Hand tracking model training

## About April Robotics

We view humans as another embodiment of humanoid robots and believe their behavior should be captured with the same fidelity as robotic data. By building wearables such as sensorized gloves and using the data ourselves to train and deploy humanoids, we create a closed loop in which real-world use continuously improves data quality.

### Additional Services

April Robotics captures ground truth hand-tracking data using proprietary sensorized wearables. We can provide:

- Accurate human expert data from partner factories across industries
- Additional modalities: touch, depth, audio
- On-demand high-quality data collection for custom training needs

## Contact

📧 team@aprilrobotics.ai