---
license: mit
language:
- en
pretty_name: FeelSight
---
# *FeelSight*: A visuo-tactile robot manipulation dataset

<!-- Provide a quick summary of the dataset. -->

<div style="text-align: center;">
  <video width="80%" onmouseover="this.pause()" onmouseout="this.play()" autoplay="" loop="" muted="">
    <source src="https://suddhu.github.io/neural-feels/video/dataset_zoom.mp4" type="video/mp4">
  </video>
</div>

<br>

FeelSight is a dataset of vision, touch, and proprioception data collected during in-hand rotation of objects by an RL policy. It comprises 70 experiments in total, 30 in the real world and 40 in simulation, each lasting 30 seconds. For training neural field models with FeelSight, refer to the [NeuralFeels](https://github.com/facebookresearch/neuralfeels) repository.

## Simulation data

Our simulated data is collected in IsaacGym with TACTO touch simulation in the loop.

<div style="text-align: center;">
  <video width="80%" onmouseover="this.pause()" onmouseout="this.play()" autoplay="" loop="" muted="">
    <source src="https://suddhu.github.io/neural-feels/video/feelsight_sim_rubber_duck.mp4" type="video/mp4">
  </video>
</div>

## Real-world data

Here's an example of real-world data from our three-camera setup and the DIGIT-Allegro hand:

<div style="text-align: center;">
  <video width="80%" onmouseover="this.pause()" onmouseout="this.play()" autoplay="" loop="" muted="">
    <source src="https://suddhu.github.io/neural-feels/video/feelsight_real_bell_pepper.mp4" type="video/mp4">
  </video>
</div>

## Robot setup

The Allegro hand is mounted on a Franka Emika Panda robot. The hand is sensorized with DIGIT tactile sensors and surrounded by three Intel RealSense cameras.

<img src="https://suddhu.github.io/neural-feels/img/robot_cell.jpg" width="90%">

## Dataset structure

For dataloaders, refer to the [NeuralFeels](https://github.com/facebookresearch/neuralfeels) repository.

```bash
feelsight/                          # root directory, either feelsight or feelsight_real
├── object_1/                       # e.g. 077_rubiks_cube
│   ├── 00/                         # log directory
│   │   ├── allegro/                # tactile sensor data
│   │   │   ├── index/              # finger id
│   │   │   │   ├── depth           # only in sim, ground-truth
│   │   │   │   │   └── ..jpg
│   │   │   │   ├── image           # RGB tactile images
│   │   │   │   │   └── ..jpg
│   │   │   │   └── mask            # only in sim, ground-truth
│   │   │   │       └── ..jpg
│   │   │   └── ..
│   │   ├── realsense/              # RGB-D data
│   │   │   ├── front-left/         # camera id
│   │   │   │   ├── image           # RGB images
│   │   │   │   │   └── ..jpg
│   │   │   │   ├── seg             # only in sim, ground-truth
│   │   │   │   │   └── ..jpg
│   │   │   │   └── depth.npz       # depth images
│   │   ├── object_1.mp4            # video of sensor stream
│   │   └── data.pkl                # proprioception data
│   └── ..
├── object_2/
│   └── ..
└── ..
```
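As a rough illustration of walking this tree, here is a minimal Python sketch. It is not the official dataloader (use the NeuralFeels repository for that); the helper name `load_log` and its return format are our own, and the contents of `data.pkl` are returned unparsed since its keys are not documented here.

```python
import pickle
from pathlib import Path

import numpy as np


def load_log(root: str, obj: str, log: str = "00"):
    """Gather one FeelSight log: proprioception, tactile images, depth.

    Hypothetical helper; paths follow the directory tree above.
    """
    log_dir = Path(root) / obj / log

    # Proprioception data is stored as a single pickle file.
    with open(log_dir / "data.pkl", "rb") as f:
        proprio = pickle.load(f)

    # RGB tactile image paths, one folder per Allegro finger.
    tactile = {
        finger.name: sorted((finger / "image").glob("*.jpg"))
        for finger in (log_dir / "allegro").iterdir()
        if finger.is_dir()
    }

    # Compressed depth archives, one per RealSense camera (e.g. front-left).
    depth = {
        cam.name: np.load(cam / "depth.npz")
        for cam in (log_dir / "realsense").iterdir()
        if cam.is_dir()
    }
    return proprio, tactile, depth
```

The sketch only collects file paths and lazily opens the `.npz` archives; decoding the JPEGs and interpreting the proprioception dictionary are left to the real dataloaders.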

## Citation

If you find NeuralFeels useful in your research, please consider citing our paper:

```bibtex
@article{suresh2024neuralfeels,
  title={{N}eural feels with neural fields: {V}isuo-tactile perception for in-hand manipulation},
  author={Suresh, Sudharshan and Qi, Haozhi and Wu, Tingfan and Fan, Taosha and Pineda, Luis and Lambeta, Mike and Malik, Jitendra and Kalakrishnan, Mrinal and Calandra, Roberto and Kaess, Michael and Ortiz, Joseph and Mukadam, Mustafa},
  journal={Science Robotics},
  pages={adl0628},
  year={2024},
  publisher={American Association for the Advancement of Science}
}
```