---
license: mit
task_categories:
- robotics
tags:
- tactile
---

# 📦 FreeTacMan
## Robot-free Visuo-Tactile Data Collection System for Contact-rich Manipulation


## 🎯 Overview

This dataset supports the paper **[FreeTacMan: Robot-free Visuo-Tactile Data Collection System for Contact-rich Manipulation](http://arxiv.org/abs/2506.01941)**. It is a large-scale, high-precision visuo-tactile manipulation dataset containing over three million visuo-tactile image pairs and more than 10k trajectories across 50 tasks.
![FreeTacMan System Overview](https://raw.githubusercontent.com/OpenDriveLab/opendrivelab.github.io/master/FreeTacMan/task/datasetweb.png)
Please refer to our 🚀 [Website](http://opendrivelab.com/freetacman) | 📄 [Paper](http://arxiv.org/abs/2506.01941) | 💻 [Code](https://github.com/OpenDriveLab/FreeTacMan) | 🛠️ [Hardware Guide](https://docs.google.com/document/d/1Hhi2stn_goXUHdYi7461w10AJbzQDC0fdYaSxMdMVXM/edit?addon_store&tab=t.0#heading=h.rl14j3i7oz0t) | 📺 [Video](https://opendrivelab.github.io/FreeTacMan/landing/FreeTacMan_demo_video.mp4) | 🌐 [X](https://x.com/OpenDriveLab/status/1930234855729836112) for more details.

## 🔬 Potential Applications
The FreeTacMan dataset enables diverse research directions in visuo-tactile learning and manipulation:

- **System Reproduction**: For researchers interested in hardware implementation, you can reproduce FreeTacMan from scratch using our 🛠️ [Hardware Guide](https://docs.google.com/document/d/1Hhi2stn_goXUHdYi7461w10AJbzQDC0fdYaSxMdMVXM/edit?addon_store&tab=t.0#heading=h.rl14j3i7oz0t) and 💻 [Code](https://github.com/OpenDriveLab/FreeTacMan).

- **Multimodal Imitation Learning**: Transfer to other LED-based tactile sensors (such as GelSight) for developing robust multimodal imitation learning frameworks.

- **Tactile-aware Grasping**: Utilize the dataset for pre-training tactile representation models and developing tactile-aware reasoning systems.

- **Simulation-to-Real Transfer**: Leverage the dynamic tactile interaction sequences to enhance tactile simulation fidelity, significantly reducing the sim2real gap.

## 📂 Dataset Structure

The dataset is organized into 50 task categories, each containing:
- **Video files**: Synchronized video recordings from the wrist-mounted and visuo-tactile cameras for each demonstration
- **Trajectory files**: Detailed tracking data for tool center point pose and gripper distance
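
As a starting point, the per-task folders could be indexed with a few lines of Python. The layout below (`<root>/<task>/<demo>.mp4` with a matching `<demo>.csv` trajectory file) is a hypothetical sketch, not the documented tree; adapt the globs and extensions to the actual release:

```python
from pathlib import Path

def index_demonstrations(root):
    """Group each task's demonstrations by pairing a video with its trajectory file.

    Assumes a hypothetical layout <root>/<task>/<demo>.mp4 plus a matching
    <root>/<task>/<demo>.csv -- adjust the patterns to the real dataset tree.
    """
    demos = {}
    for task_dir in sorted(Path(root).iterdir()):
        if not task_dir.is_dir():
            continue
        for video in sorted(task_dir.glob("*.mp4")):
            # Pair the video with a same-stem trajectory file, if present.
            traj = video.with_suffix(".csv")
            demos.setdefault(task_dir.name, []).append(
                {"video": video, "trajectory": traj if traj.exists() else None}
            )
    return demos
```

This gives a `{task_name: [demonstration, ...]}` mapping that downstream loaders can iterate over.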


## 🧾 Data Format

### Video Files

- **Format**: MP4
- **Views**: Wrist-mounted camera and visuo-tactile camera perspectives per demonstration

### Trajectory Files

Each trajectory file contains the following data columns:

#### Timestamp
- `timestamp` - Unix timestamp

#### Tool Center Point (TCP) Data

- `TCP_pos_x`, `TCP_pos_y`, `TCP_pos_z` - TCP position
- `TCP_euler_x`, `TCP_euler_y`, `TCP_euler_z` - TCP orientation (Euler angles)
- `quat_w`, `quat_x`, `quat_y`, `quat_z` - TCP orientation (quaternion representation)

#### Gripper Data
- `gripper_distance` - Gripper opening distance
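
A minimal sketch of parsing these columns, assuming the trajectory files are CSVs with a header row matching the names above (the file format and the Euler-angle convention are assumptions; verify both against the released data). The quaternion-to-Euler helper is a standard roll-pitch-yaw conversion and can serve as a sanity check between the two orientation representations:

```python
import csv
import math

def quat_to_euler(w, x, y, z):
    """Convert a unit quaternion to (roll, pitch, yaw) in radians (standard RPY convention)."""
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    # Clamp to avoid math domain errors from floating-point noise near the poles.
    sinp = max(-1.0, min(1.0, 2.0 * (w * y - z * x)))
    pitch = math.asin(sinp)
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return roll, pitch, yaw

def load_trajectory(path):
    """Yield one dict per timestep from a trajectory file, assuming CSV with a header row."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            yield {
                "t": float(row["timestamp"]),
                "pos": tuple(float(row[k]) for k in ("TCP_pos_x", "TCP_pos_y", "TCP_pos_z")),
                "euler": tuple(float(row[k]) for k in ("TCP_euler_x", "TCP_euler_y", "TCP_euler_z")),
                "quat": tuple(float(row[k]) for k in ("quat_w", "quat_x", "quat_y", "quat_z")),
                "gripper": float(row["gripper_distance"]),
            }
```

If the dataset's Euler columns use a different axis order or intrinsic/extrinsic convention, the conversion above will disagree with them; comparing the two columns on a few rows is a quick way to confirm the convention.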


## πŸ“ Citation

If you use this dataset in your research, please cite:

```bibtex
@article{wu2025freetacman,
  title={FreeTacMan: Robot-free visuo-tactile data collection system for contact-rich manipulation},
  author={Wu, Longyan and Yu, Checheng and Ren, Jieji and Chen, Li and Jiang, Yufei and Huang, Ran and Gu, Guoying and Li, Hongyang},
  journal={arXiv preprint arXiv:2506.01941},
  year={2025}
}
```

## 💼 License

This dataset is released under the MIT License. See the LICENSE file for details.

## 📧 Contact

For questions or issues regarding the dataset, please contact: Longyan Wu (im.longyanwu@gmail.com).