nielsr (HF Staff) committed
Commit b1e2f21 · verified · 1 Parent(s): f4bc1ef

Add dataset card, link to paper and project page


Hi! I'm Niels from the Hugging Face community science team. I'm opening this PR to improve the dataset card for the OmniViTac dataset. This update includes:
- Adding the `robotics` task category to the metadata.
- Linking the dataset to the original research paper, project page, and GitHub repository.
- Providing a brief description of the dataset's scope and content based on the paper abstract.

This information helps researchers better understand and cite the dataset.

Files changed (1)
  1. README.md +38 -3
README.md CHANGED
@@ -1,3 +1,38 @@
- ---
- license: cc-by-nc-4.0
- ---
+ ---
+ license: cc-by-nc-4.0
+ task_categories:
+ - robotics
+ tags:
+ - visuo-tactile
+ - manipulation
+ - world-models
+ ---
+
+ # OmniViTac Dataset
+
+ OmniViTac is a large-scale visuo-tactile-action dataset introduced in the paper [OmniVTA: Visuo-Tactile World Modeling for Contact-Rich Robotic Manipulation](https://huggingface.co/papers/2603.19201).
+
+ [**Project Page**](https://mrsecant.github.io/OmniVTA) | [**GitHub**](https://github.com/MrSecant/OmniVTA) | [**Paper**](https://huggingface.co/papers/2603.19201)
+
+ ## Dataset Summary
+ The OmniViTac dataset comprises 21,000+ trajectories across 86 tasks and 100+ objects, organized into six physics-grounded interaction patterns. It is designed to support the development of world-model-based visuo-tactile manipulation frameworks, providing high-frequency tactile feedback alongside visual observations and robot actions for contact-rich tasks such as wiping and assembly.
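The trajectory structure described above (time-aligned visual observations, tactile feedback, and robot actions) can be sketched as follows. All field names, dimensionalities, and step counts below are illustrative assumptions for exposition, not the dataset's actual schema:

```python
# Hypothetical sketch of one visuo-tactile-action trajectory record.
# Field names, feature sizes, and step counts are illustrative
# assumptions, not the dataset's actual schema.
T = 50  # number of timesteps in this example trajectory

trajectory = {
    "task": "wiping",                                # one of the 86 tasks
    "visual_obs": [[0.0] * 512 for _ in range(T)],   # per-step visual features (placeholder)
    "tactile_obs": [[0.0] * 16 for _ in range(T)],   # per-step tactile sensor readings
    "actions": [[0.0] * 7 for _ in range(T)],        # per-step robot actions (e.g. 7-DoF)
}

def aligned_steps(traj):
    """Return trajectory length, checking that all modalities are time-aligned."""
    lengths = {len(traj[key]) for key in ("visual_obs", "tactile_obs", "actions")}
    if len(lengths) != 1:
        raise ValueError(f"modalities have mismatched lengths: {sorted(lengths)}")
    return lengths.pop()

print(aligned_steps(trajectory))  # -> 50
```

A loader for the released data would replace the placeholder lists with the actual per-step arrays; the alignment check is the invariant any visuo-tactile-action consumer would rely on.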
+
+ ## Interaction Patterns
+ The dataset covers six physics-grounded interaction categories:
+ 1. Wiping
+ 2. Assembly
+ 3. Pushing
+ 4. Sliding
+ 5. Grasping
+ 6. Peg-in-hole insertion
+
+ ## Citation
+ If you use this dataset in your research, please cite the following paper:
+ ```bibtex
+ @article{OmniVTA2026,
+   title={OmniVTA: Visuo-Tactile World Modeling for Contact-Rich Robotic Manipulation},
+   author={Anonymous},
+   journal={arXiv preprint arXiv:2603.19201},
+   year={2026}
+ }
+ ```