LumosRobotics committed commit d891b27 (verified; 1 parent: a01a7b3)

Update README.md

Files changed (1): README.md (+61 −12)

README.md CHANGED
@@ -4,19 +4,37 @@ language:
 - zh
 tags:
 - robotics
-- embodied-ai
 - manipulation
 - multimodal
-- vla
-- data-collection
 license: other
 task_categories:
 - robotics
-- imitation-learning
-multimodal: vision+depth+trajectory+force
-configs:
-- config_name: default
-  data_files: "**/*"
 ---
 
 <h1 align="center" style="font-size: 40px; font-weight: bold;">
@@ -35,11 +53,36 @@ configs:
 </p>
 
 ---
-## 📖 Overview
-
-FastUMI (Fast Universal Manipulation Interface) is a dataset and interface framework for general-purpose robotic manipulation tasks, designed to support hardware-agnostic, scalable, and efficient data collection and model training. The project provides:
-
-- Physical prototype systems
-- Complete data collection codebase
-- Standardized data formats and utilities
-- Tools for real-world manipulation learning research
-
-## 🚀 Features
-
-### FastUMI Pro Enhancements
-
-- ✅ **Higher precision trajectory data**
-- ✅ **Diverse embodiment support** for true "one-brain-multiple-forms"
-- ✅ **Enterprise-ready** pipeline and full-link data processing
-
-### FastUMI-150K
-
-- ~150,000 real-world manipulation trajectories
-- Used by research partners for large-scale VLA (Vision-Language-Action) model training
-- Demonstrated significant multi-task generalization capabilities
-
-## 📊 Model Performance
-
-**VLA Model Results**: [TBD]
 
 ## 🛠️ Toolchain
 | Tool | Description | Link |
@@ -49,10 +92,16 @@ configs:
 | **Hardware SDK** | FastUMI hardware development kit | [GitHub](https://github.com/FastUMIRobotics/FastUMI_Hardware_SDK) |
 | **Monitor Tool** | Real-time device monitoring | [GitHub](https://github.com/FastUMIRobotics/FastUMI_Monitor_Tool) |
 | **Data Collection** | Data collection utilities | [GitHub](https://github.com/FastUMIRobotics/FastUMI_Data_Collection) |
 
-### Research & Applications
 - **Paper**: [MLM: Learning Multi-task Loco-Manipulation Whole-Body Control for Quadruped Robot with Arm](https://arxiv.org/abs/2508.10538)
 - **Tutorial**: PI0 (FastUMI Data Lightweight Adaptation, Version V0) Full Pipeline
 
 ## 📥 Data Download
 
@@ -4,19 +4,37 @@ language:
 - zh
 tags:
 - robotics
 - manipulation
 - multimodal
+- trajectory-data
+- vision-sensors
 license: other
 task_categories:
 - robotics
+multimodal: vision+action
+dataset_info:
+  features:
+  - name: rgb_images
+    dtype: image
+    description: Multi-view RGB images
+  - name: slam_poses
+    sequence: float32
+    description: SLAM pose trajectories
+  - name: vive_poses
+    sequence: float32
+    description: Vive tracking system poses
+  - name: point_clouds
+    sequence: float32
+    description: Time-of-Flight point cloud data
+  - name: clamp_data
+    sequence: float32
+    description: Clamp sensor readings
+  - name: merged_trajectory
+    sequence: float32
+    description: Fused trajectory data
+configs:
+- config_name: default
+  data_files: "**/*"
 ---
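The `data_files: "**/*"` pattern in the `configs` block above selects every file in the repository, recursively. A minimal sketch of that glob's semantics, using an invented episode layout (the dataset's real directory structure is not specified on this card):

```python
from pathlib import Path
import tempfile

# Invented episode layout, purely for demonstration; the actual
# FastUMI Pro file structure may differ.
with tempfile.TemporaryDirectory() as root:
    for f in [
        "episode_000/rgb/000001.jpg",
        "episode_000/slam_poses.csv",
        "episode_001/point_cloud/000001.bin",
    ]:
        p = Path(root) / f
        p.parent.mkdir(parents=True, exist_ok=True)
        p.touch()
    # "**/*" matches everything below the root; keep only regular files.
    matched = sorted(
        p.relative_to(root).as_posix() for p in Path(root).glob("**/*") if p.is_file()
    )

print(matched)
```

Because the single `default` config sweeps up all files, consumers get the raw episode tree rather than a pre-built split.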
 
 <h1 align="center" style="font-size: 40px; font-weight: bold;">
 
@@ -35,11 +53,36 @@ configs:
 </p>
 
 ---
 
+## 📖 Overview
+
+The **FastUMI Pro Sample Dataset** contains a small number of demonstration trajectories
+(only **dozens of episodes**, *not* a large-scale dataset).
+It showcases the multimodal sensing capabilities of the FastUMI Pro system, including:
+
+- RGB camera streams
+- Visual SLAM pose trajectories
+- Vive tracking data
+- Time-of-Flight (ToF) point clouds
+- Clamp (gripper gap) measurements
+- Fused pose trajectories
+
+This dataset is intended as a **public preview** of the data modalities, structure, and quality of FastUMI Pro.
+For full-scale datasets or customized collection services, please **contact the FastUMI team directly**.
+
+---
+
+## ✨ Key Features
+
+- **High-precision spatial tracking**
+- **Multi-sensor synchronization** across RGB, SLAM, Vive, ToF, and clamp channels
+- **Standardized directory and timestamp structure**
+- **Ready for embodied AI, imitation learning, and robotics research**
+- **Hardware-agnostic data format** for cross-platform manipulation applications
+
+---
+
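The multi-sensor synchronization claimed in the card can be approximated offline by nearest-timestamp matching between streams. A sketch with invented timestamps (the card does not describe the actual sync mechanism or rates):

```python
import numpy as np

def nearest_indices(ref_ts, query_ts):
    """For each reference timestamp, return the index of the closest
    timestamp in the (sorted) query stream."""
    idx = np.searchsorted(query_ts, ref_ts)
    idx = np.clip(idx, 1, len(query_ts) - 1)
    left, right = query_ts[idx - 1], query_ts[idx]
    # Step back one slot wherever the left neighbour is closer.
    idx -= (ref_ts - left) < (right - ref_ts)
    return idx

# Invented timestamps: a 10 Hz SLAM stream aligned against a faster Vive stream.
slam_ts = np.array([0.00, 0.10, 0.20, 0.30])
vive_ts = np.array([0.00, 0.05, 0.11, 0.19, 0.28, 0.31])
print(nearest_indices(slam_ts, vive_ts))  # [0 2 3 5]
```

The same lookup extends to the ToF and clamp channels; real pipelines would also interpolate poses rather than snap to the nearest sample.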
+<!-- **VLA Model Results**: [TBD]
 
 ## 🛠️ Toolchain
 | Tool | Description | Link |
 
@@ -49,10 +92,16 @@ configs:
 | **Hardware SDK** | FastUMI hardware development kit | [GitHub](https://github.com/FastUMIRobotics/FastUMI_Hardware_SDK) |
 | **Monitor Tool** | Real-time device monitoring | [GitHub](https://github.com/FastUMIRobotics/FastUMI_Monitor_Tool) |
 | **Data Collection** | Data collection utilities | [GitHub](https://github.com/FastUMIRobotics/FastUMI_Data_Collection) |
+-->
+
+---
 
+<!-- ### Research & Applications
 - **Paper**: [MLM: Learning Multi-task Loco-Manipulation Whole-Body Control for Quadruped Robot with Arm](https://arxiv.org/abs/2508.10538)
 - **Tutorial**: PI0 (FastUMI Data Lightweight Adaptation, Version V0) Full Pipeline
+-->
+
+---
 
 ## 📥 Data Download
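Once an episode is downloaded, pose streams such as `slam_poses`, `vive_poses`, or `merged_trajectory` are declared as flat float32 sequences. The per-sample layout is not documented on this card; the 7-value (x, y, z, qx, qy, qz, qw) convention below is purely an assumption for illustration:

```python
import numpy as np

def path_length(flat_poses, values_per_sample=7):
    """End-effector path length from a flat pose sequence.

    Assumes (x, y, z, qx, qy, qz, qw) per sample -- an illustrative
    convention, not one documented by this dataset card.
    """
    poses = np.asarray(flat_poses, dtype=np.float32).reshape(-1, values_per_sample)
    xyz = poses[:, :3]  # translation part only
    return float(np.linalg.norm(np.diff(xyz, axis=0), axis=1).sum())

# Two made-up samples 0.1 m apart along x, identity orientation:
demo = [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0,
        0.1, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0]
print(path_length(demo))  # ~0.1
```

Check the actual sample width in the downloaded files before reshaping; if the dataset stores poses differently (e.g. Euler angles or 4×4 matrices), adjust `values_per_sample` and the slicing accordingly.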