Enhance dataset card: Add paper, code, project page, task category, and sample usage

#1 opened by nielsr (HF Staff)
Files changed (1)
  1. README.md +113 -2
README.md CHANGED
@@ -1,5 +1,116 @@
  ---
- license: mit
  language:
  - en
- ---
+ license: mit
+ task_categories:
+ - robotics
+ ---
+
+ # TacThru-UMI Tasks Dataset
+
+ This repository contains the datasets used in the paper [Simultaneous Tactile-Visual Perception for Learning Multimodal Robot Manipulation](https://huggingface.co/papers/2512.09851).
+
+ This work introduces **TacThru**, a novel see-through-skin (STS) sensor that enables simultaneous visual perception and robust tactile signal extraction, and **TacThru-UMI**, an imitation learning framework that leverages these multimodal signals for robotic manipulation. The datasets provided here were generated with this framework and are used to train and evaluate generalist policies for precise manipulation on five challenging real-world tasks.
+
+ * **Paper:** [https://huggingface.co/papers/2512.09851](https://huggingface.co/papers/2512.09851)
+ * **Project Page:** [https://tacthru.yuyang.li/](https://tacthru.yuyang.li/)
+ * **Code/GitHub Repository:** [https://github.com/YuyangLee/TacThru](https://github.com/YuyangLee/TacThru)
+ * **Video:** [https://vimeo.com/1145307821](https://vimeo.com/1145307821)
+ * **Hardware Guide:** [https://docs.google.com/document/d/1fpZRiGoxWqLoFs-zxnG4d_d3hy0eHjlLA4nsuEKCvg/edit?usp=sharing](https://docs.google.com/document/d/1fpZRiGoxWqLoFs-zxnG4d_d3hy0eHjlLA4nsuEKCvg/edit?usp=sharing)
+
+ ## Dataset Tasks
+
+ This dataset includes the following tasks used in our experiments:
+
+ * `PickBottle`
+ * `PullTissue`
+ * `SortBolt`
+ * `HangScissors`
+ * `InsertCap`
+
+ ## Dataset Structure
+
+ The datasets are provided as Zarr files, following a structure similar to the example below. You can use `scripts/show_ds.py` from the associated codebase to inspect the Zarr files.
+
+ ```
+ data/camera0_rgb shape=(N, 224, 224, 3) dtype=uint8
+ data/robot0_demo_end_pose shape=(N, 6) dtype=float64
+ data/robot0_demo_start_pose shape=(N, 6) dtype=float64
+ data/robot0_eef_pos shape=(N, 3) dtype=float32
+ data/robot0_eef_rot_axis_angle shape=(N, 3) dtype=float32
+ data/robot0_gripper_width shape=(N, 1) dtype=float32
+ data/tacthru_l_marker shape=(N, 64, 2) dtype=float32
+ data/tacthru_l_rgb shape=(N, 224, 224, 3) dtype=uint8
+ data/tacthru_r_marker shape=(N, 64, 2) dtype=float32
+ data/tacthru_r_rgb shape=(N, 224, 224, 3) dtype=uint8
+ meta/episode_ends shape=(M,) dtype=int64
+ ```
+ where `N` is the total number of frames and `M` is the number of episodes.
+
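+ As a quick, unofficial sketch, you can also open a store directly with the `zarr` Python package (assumed to be available once the environment under Sample Usage below is set up); the path below is illustrative:
+
+ ```shell
+ uv run python - <<'EOF'
+ import zarr
+
+ # Illustrative path: point this at one of the task stores under data/tasks/
+ root = zarr.open("data/tasks/pick_bottle.zarr", mode="r")
+
+ # List every array with its shape and dtype
+ for name, arr in root["data"].arrays():
+     print(f"data/{name}: shape={arr.shape}, dtype={arr.dtype}")
+
+ # Frames of the first episode (meta/episode_ends stores per-episode end indices)
+ episode_ends = root["meta/episode_ends"][:]
+ first_ep_rgb = root["data/camera0_rgb"][0:int(episode_ends[0])]
+ print("first episode frames:", first_ep_rgb.shape)
+ EOF
+ ```
+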
+ ## Sample Usage
+
+ To use these datasets, you will typically work with the main [TacThru codebase](https://github.com/YuyangLee/TacThru).
+
+ ### Environment Setup
+
+ First, clone the main repository and set up the environment:
+
+ ```shell
+ git clone https://github.com/YuyangLee/TacThru
+ cd TacThru
+ ```
+
+ We use `uv` to manage the virtual environment. Install the basic dependencies:
+
+ ```shell
+ uv sync
+ ```
+
+ For policy training and validating robotic manipulation, include the optional `umi` dependencies:
+
+ ```shell
+ uv sync --extra umi
+ ```
+
+ ### Download Datasets
+
+ The datasets are provided as a Hugging Face Dataset and are set up as a submodule under `data/tasks/` in the main `TacThru` repository. To download them, ensure you have Git LFS installed and then run:
+
+ ```bash
+ git submodule init
+ git submodule update
+ ```
+
+ You can also use [sparse checkout](https://git-scm.com/docs/git-sparse-checkout) to download only part of the dataset.
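+
+ As one possible (untested) sketch, assuming each task is stored in its own top-level directory of the dataset repository (the directory name below is illustrative):
+
+ ```bash
+ # Check out the submodule without downloading LFS content yet
+ GIT_LFS_SKIP_SMUDGE=1 git submodule update --init data/tasks
+ cd data/tasks
+
+ # Keep only one task directory in the working tree (name is illustrative)
+ git sparse-checkout init --cone
+ git sparse-checkout set pick_bottle
+
+ # Fetch the LFS objects for that directory only
+ git lfs pull --include "pick_bottle/**"
+ ```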
+
+ ### Train Policy
+
+ An example training script is provided in the main repository's `./train_tf.sh` file. Here's how you might run it for a specific task, such as `pick_bottle`:
+
+ ```shell
+ task=pick_bottle
+
+ # TacThru w/ marker deviations
+ tac_active_keys="[tacthru_l_rgb,tacthru_l_markers]"
+ obs_tag="tt_m"
+ exp_tag="run"
+
+ uv run scripts/train.py --config-name=train_tf exp_name=tf-$obs_tag-$exp_tag task=$task tac_active_keys=$tac_active_keys
+ ```
+
+ The `task` can be one of: `pick_bottle`, `pull_tissue`, `sort_bolt`, `hang_scissors`, `insert_cap`.
+
+ The `tac_active_keys` argument should include the observation keys defined in `train.task.shape_meta.obs`. In the provided datasets, `tacthru_l_*` refers to signals from the TacThru sensor (left finger), while `tacthru_r_*` refers to signals from the GelSight-type sensor (rectified).
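+
+ Other key combinations follow the same pattern. For example, a hypothetical tactile-RGB-only run (not from the paper; it assumes any subset of the keys in `train.task.shape_meta.obs` is accepted, and the tags are just labels for the experiment name):
+
+ ```shell
+ task=pick_bottle
+
+ # TacThru, tactile RGB only (no marker deviations) -- hypothetical ablation
+ tac_active_keys="[tacthru_l_rgb]"
+ obs_tag="tt"
+ exp_tag="run"
+
+ uv run scripts/train.py --config-name=train_tf exp_name=tf-$obs_tag-$exp_tag task=$task tac_active_keys=$tac_active_keys
+ ```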
+
+ ## Citation
+
+ If you find our work helpful, please consider citing it:
+
+ ```bibtex
+ @article{li2025simultaneous,
+   title={Simultaneous Tactile-Visual Perception for Learning Multimodal Robot Manipulation},
+   author={Yuyang Li and Yinghan Chen and Zihang Zhao and Puhao Li and Tengyu Liu and Siyuan Huang and Yixin Zhu},
+   journal={arXiv preprint arXiv:2512.09851},
+   year={2025}
+ }
+ ```