ylang-chen committed · Commit 367b8de · verified · Parent(s): 7202e05

Upload README.md with huggingface_hub

Files changed (1): README.md (+157 −46)
---
license: odbl
task_categories:
- robotics
- computer-vision
- video-classification
tags:
- dexterous-manipulation
- hand-object-interaction
- motion-capture
- physics-simulation
- rgbd
- contact-forces
size_categories:
- 10<n<1K
---

# DexCanvas: Dexterous Manipulation Dataset v0.1

**⚠️ TEST RELEASE**: This is a preview version containing 1% of the full dataset. Contact force data is not included in v0.1.

DexCanvas is a large-scale hybrid dataset for robotic hand-object interaction research, combining real human demonstrations with physics-validated simulation data.

## Dataset Statistics (v0.1 Test Release)

- **Total Frames**: ~30 million multi-view RGB-D frames
- **Total Duration**: ~70 hours of dexterous hand-object interactions
- **Real Demonstrations**: ~0.7 hours of human mocap data (1/100 of the collected data)
- **Expansion Ratio**: 100× from real to simulated data
- **Manipulation Types**: 21 types based on the Cutkosky taxonomy
- **Objects**: 30 objects (geometric primitives + YCB objects)
- **Capture Rate**: 100 Hz optical motion capture

## Manipulation Coverage

The dataset spans four primary grasp categories:

- **Power Grasps**: Full-hand wrapping grips
- **Intermediate Grasps**: Mixed precision-power combinations
- **Precision Grasps**: Fingertip-based manipulation
- **In-Hand Manipulation**: Object reorientation and repositioning

All 21 manipulation types follow the Cutkosky grasp taxonomy.

## Data Modalities

Each frame includes:

- **RGB-D Data**: Multi-view color and depth images
- **Hand Pose**: MANO hand parameters with high-precision tracking
- **Object State**: 6-DoF pose and object wrenches
- **Annotations**: Per-frame labels and metadata

**Note**: Contact force data is not included in v0.1; it will be available in future releases.
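
The MANO hand parameters are stored as flattened vectors. Below is a minimal sketch of unpacking one pose vector, assuming the standard MANO convention of 3 global wrist axis-angle values followed by 15 finger joints × 3 axis-angle values (48 in total); this layout is an assumption, so verify it against the dataset's actual convention:

```python
# Sketch: unpack a flattened 48-dim MANO pose vector.
# Assumed layout (standard MANO convention, not confirmed by this dataset):
#   pose[0:3]   -> global wrist rotation (axis-angle)
#   pose[3:48]  -> 15 finger joints x 3 axis-angle values each

def split_mano_pose(pose48):
    if len(pose48) != 48:
        raise ValueError(f"expected 48 values, got {len(pose48)}")
    global_rot = pose48[:3]                                   # wrist orientation
    finger_rots = [pose48[3 + 3 * j : 6 + 3 * j] for j in range(15)]
    return global_rot, finger_rots

global_rot, finger_rots = split_mano_pose([0.0] * 48)
print(len(global_rot), len(finger_rots))  # 3 15
```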

## Data Pipeline

The dataset is generated through three stages:

1. **Real Capture**: Optical motion capture of human demonstrations at 30 Hz
2. **Force Reconstruction**: RL-based physics simulation to infer contact forces
3. **Physics Validation**: Verification of contact points, forces, and object dynamics

This hybrid approach provides contact information impossible to observe directly in real-world scenarios while maintaining physical accuracy.

## Installation

```bash
pip install datasets huggingface_hub
```

For image processing and visualization:

```bash
pip install pillow numpy torch
```

Authenticate with HuggingFace (required for private datasets):

```bash
huggingface-cli login
```

Or set your token as an environment variable:

```bash
export HF_TOKEN="your_token_here"
```
 
90
+ ## Quick Start
91
 
92
+ ### Data Structure
93
 
94
+ ```json
95
+ {
96
+ "trajectory_meta_data": {
97
+ "generated_data": "int",
98
+ "data_fps": "int",
99
+ "mocap_raw_data_source": {
100
+ "operator": "str",
101
+ "object": "str",
102
+ "gesture": "str"
103
+ },
104
+ "total_frames": "int",
105
+ "mano_hand_shape": "(10,)"
106
+ //...
107
+ },
108
+ "sequence_info": {
109
+ "timestamp": "(T,)",
110
+ "hand_joint": {
111
+ "position": "(T, 3)",
112
+ "rotation": "(T, 3)",
113
+ "finger_pose": "(T, 48)"
114
+ },
115
+ "object_info": {
116
+ "pose": "(T, 6)"
117
+ },
118
+ "mano_model_output": {
119
+ "joints": "(T, 63)"
120
+ }
121
+ }
122
+ }
123
+ ```
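
As a quick sanity check after loading, a sequence can be validated against the shapes documented above. The field names below follow this schema; the toy trajectory data is fabricated purely for illustration:

```python
# Sketch: verify that per-sequence arrays match the documented shapes.
# The trajectory dict mirrors the schema above with fabricated toy data.

T = 5  # number of frames in this toy sequence

def shape(nested_list):
    """Return the shape of a (non-ragged) nested list as a tuple."""
    s = []
    x = nested_list
    while isinstance(x, list):
        s.append(len(x))
        x = x[0]
    return tuple(s)

trajectory = {
    "timestamp": [0.01 * t for t in range(T)],
    "hand_joint": {
        "position": [[0.0, 0.0, 0.0] for _ in range(T)],
        "rotation": [[0.0, 0.0, 0.0] for _ in range(T)],
        "finger_pose": [[0.0] * 48 for _ in range(T)],
    },
    "object_info": {"pose": [[0.0] * 6 for _ in range(T)]},
    "mano_model_output": {"joints": [[0.0] * 63 for _ in range(T)]},
}

expected = {
    ("timestamp",): (T,),
    ("hand_joint", "finger_pose"): (T, 48),
    ("object_info", "pose"): (T, 6),
    ("mano_model_output", "joints"): (T, 63),
}

for path, want in expected.items():
    node = trajectory
    for key in path:
        node = node[key]
    assert shape(node) == want, f"{path}: got {shape(node)}, want {want}"
print("all shapes match")
```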

### Visualization

Visualize trajectories using the **mocap_loader**:

```bash
# Install dependencies
pip install open3d trimesh scipy

# Visualize a trajectory
python -m hand_trajectory_loader.examples.visualize_trajectory \
    dataset.parquet 0 \
    --mano-model assets/mano/models/MANO_RIGHT.pkl \
    --object assets/objects/cube1.stl \
    --show-joints
```

Controls: **SPACE** pause/resume, **M** toggle hand mesh, **O** toggle object, **Q** quit
141
+
142
+ ## Version Information
143
+
144
+ **v0.1 (Test Release)** includes:
145
+ - 1% of collected real human demonstration data
146
+ - MANO hand parameters
147
+ - Object pose data
148
+ - Manipulation type annotations
149
+
150
+ **Coming in future releases**:
151
+ - Complete dataset (100× larger than v0.1)
152
+ - Contact force data with physics validation
153
+ - Additional objects and manipulation types
154
+ - Extended annotations and metadata
155
+
156
+ ## Contact
157
+
158
+ **Research Collaboration**
159
+ Academic inquiries: lyw@dex-robot.com
160
+
161
+ **Business Inquiries**
162
+ Business collaboration: info@dex-robot.com
163
+
164
+ **Website**
165
+ https://www.dex-robot.com/en
166
+ https://dexcanvas.github.io/
167
+
168
+ ## Citation
169
+
170
+ ```bibtex
171
+ @article{dexcanvas2025,
172
+ title={DexCanvas: A Large-Scale Hybrid Dataset for Dexterous Manipulation},
173
+ author={DexRobot Team},
174
+ year={2025},
175
+ eprint={2510.15786},
176
+ archivePrefix={arXiv},
177
+ url={https://arxiv.org/abs/2510.15786}
178
+ }
179
+ ```
180
+
181
+ ## License
182
+
183
+ This dataset is released under the Open Database License (ODbL).
184
+
185
+ ---
186
+
187
+ **Developed by DexRobot Team**
188
+ Last Updated: January 2025