Please refer to our [Website](http://opendrivelab.com/freetacman) | [Paper](http://arxiv.org/abs/2506.01941) | [Code](https://github.com/OpenDriveLab/FreeTacMan) | [Hardware Guide](https://docs.google.com/document/d/1Hhi2stn_goXUHdYi7461w10AJbzQDC0fdYaSxMdMVXM/edit?addon_store&tab=t.0#heading=h.rl14j3i7oz0t) | [Video](https://opendrivelab.github.io/FreeTacMan/landing/FreeTacMan_demo_video.mp4) | [X](https://x.com/OpenDriveLab/status/1930234855729836112) for more details.
## Potential Applications
The FreeTacMan dataset enables diverse research directions in visuo-tactile learning and manipulation:
**System Reproduction**: For researchers interested in hardware implementation, you can reproduce FreeTacMan from scratch using our [Hardware Guide](https://docs.google.com/document/d/1Hhi2stn_goXUHdYi7461w10AJbzQDC0fdYaSxMdMVXM/edit?addon_store&tab=t.0#heading=h.rl14j3i7oz0t) and [Code](https://github.com/OpenDriveLab/FreeTacMan).
**Multimodal Imitation Learning**: Transfer to other LED-based tactile sensors (such as GelSight) for developing robust multimodal imitation learning frameworks.
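One common pattern in multimodal imitation learning is pairing time-aligned visual and tactile frames into per-step training samples. The sketch below is a minimal illustration only; the `pair_modalities` helper, the frame shapes, and the toy arrays are assumptions for demonstration, not the dataset's actual schema or API:

```python
import numpy as np

def pair_modalities(visual_frames, tactile_frames):
    """Zip time-aligned visual and tactile frames into per-step samples.

    Illustrative helper (an assumption, not part of the released code):
    truncates to the shorter stream so every sample has both modalities.
    """
    n = min(len(visual_frames), len(tactile_frames))
    return [{"visual": visual_frames[t], "tactile": tactile_frames[t]}
            for t in range(n)]

# Toy stand-in data: five RGB camera frames and five tactile-sensor images
# (shapes are placeholders, not the dataset's real resolutions).
visual = [np.zeros((480, 640, 3), dtype=np.uint8) for _ in range(5)]
tactile = [np.zeros((240, 320, 3), dtype=np.uint8) for _ in range(5)]

samples = pair_modalities(visual, tactile)
print(len(samples))  # 5
```

The same pairing step applies when swapping in frames from other LED-based tactile sensors such as GelSight, since only the per-frame arrays change.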
**Tactile-aware Grasping**: Utilize the dataset for pre-training tactile representation models and developing tactile-aware reasoning systems.
**Simulation-to-Real Transfer**: Leverage the dynamic tactile interaction sequences to enhance tactile simulation fidelity, significantly reducing the sim2real gap.
## Dataset Structure
The dataset is organized into 50 task categories, each containing: