yunkao committed
Commit b918029 · verified · Parent(s): 20ddddc

Upload folder using huggingface_hub

Files changed (1): subsets/filtered/meta/README.md (+212 −0, added)
<!--
Open-H Embodiment Dataset README Template (v1.0)
Please fill out this template and include it in the ./metadata directory of your LeRobot dataset.
This file helps others understand the context and details of your contribution.
-->

# [Dataset Name] - README

---

## 📋 At a Glance

*Freehand ultrasound scan trajectories from an expert.*

---

## 📖 Dataset Overview

*This dataset contains 5 trajectories of an expert surgeon using an ultrasound probe to scan bones.*

| | |
| :--- | :--- |
| **Total Trajectories** | `[5]` |
| **Total Hours** | `[]` |
| **Data Type** | `[ ] Clinical` `[✔️] Ex-Vivo` `[ ] Table-Top Phantom` `[ ] Digital Simulation` `[ ] Physical Simulation` `[ ] Other (If checked, update "Other")` |
| **License** | CC BY 4.0 |
| **Version** | `[e.g., 1.0]` |

---

## 🎯 Tasks & Domain

### Domain

*Select the primary domain for this dataset.*

- [ ] **Surgical Robotics**
- [✔️] **Ultrasound Robotics**
- [ ] **Other Healthcare Robotics** (Please specify: `[Your Domain]`)

### Demonstrated Skills

- Ultrasound scanning

---

## 🔬 Data Collection Details

### Collection Method

*How was the data collected?*

- [ ] **Human Teleoperation**
- [ ] **Programmatic/State-Machine**
- [ ] **AI Policy / Autonomous**
- [✔️] **Other** (Please specify: `[Expert freehand]`)

### Operator Details

| | Description |
| :--- | :--- |
| **Operator Count** | `[1]` |
| **Operator Skill Level** | `[✔️] Expert (e.g., Surgeon, Sonographer)` <br> `[ ] Intermediate (e.g., Trained Researcher)` <br> `[ ] Novice (e.g., ML Researcher with minimal experience)` <br> `[ ] N/A` |
| **Collection Period** | From `[YYYY-MM-DD]` to `[YYYY-MM-DD]` |

### Recovery Demonstrations

*Does this dataset include examples of recovering from failure?*

- [ ] **Yes**
- [✔️] **No**

**If yes, please briefly describe the recovery process:**

---

## 💡 Diversity Dimensions

*Check all dimensions that were intentionally varied during data collection.*

- [✔️] **Camera Position / Angle**
- [ ] **Lighting Conditions**
- [✔️] **Target Object** (e.g., different phantom models, suture types)
- [ ] **Spatial Layout** (e.g., placing the target suture needle in various locations)
- [ ] **Robot Embodiment** (if multiple robots were used)
- [ ] **Task Execution** (e.g., different techniques for the same task)
- [ ] **Background / Scene**
- [ ] **Other** (Please specify: `[Your Dimension]`)

*If you checked any of the above, please briefly elaborate below.*

**Example:** We start the ultrasound scan from different initial positions. We scan multiple bone structures, including the fibula and tibia.

---

## 🛠️ Equipment & Setup

### Robotic Platform(s)

*List the primary robot(s) used.*

- **Robot 1:** `[e.g., dVRK (da Vinci Research Kit)]`
- **Robot 2:** `[If applicable]`

### Sensors & Cameras

*List the sensors and cameras used. Specify model names where possible. (Add and remove rows as needed)*

| Type | Model/Details |
| :--- | :--- |
| **Primary Camera** | `[None]` |
| **Room/3rd Person Camera** | `[None]` |
| **Force/Torque Sensor** | `[None]` |
| **Medical Imager** | `[e.g., SuperSonic Imagine SL18-5 Ultrasound, B-Mode]` |
| **Other** | `[Specify]` |

---

## 🎯 Action & State Space Representation

*Describe how actions and robot states are represented in your dataset. This is crucial for understanding data compatibility and enabling effective policy learning.*

### Action Space Representation

**Primary Action Representation:**
- [ ] **Absolute Cartesian** (position/orientation relative to robot base)
- [ ] **Relative Cartesian** (delta position/orientation from current pose)
- [ ] **Joint Space** (direct joint angle commands)
- [ ] **Other** (Please specify: `[Your Representation]`)

**Orientation Representation:**
- [ ] **Quaternions** (x, y, z, w)
- [ ] **Euler Angles** (roll, pitch, yaw)
- [ ] **Axis-Angle** (rotation vector)
- [ ] **Rotation Matrix** (3x3 matrix)
- [ ] **Other** (Please specify: `[Your Representation]`)

**Reference Frame:**
- [ ] **Robot Base Frame**
- [ ] **Tool/End-Effector Frame**
- [ ] **World/Global Frame**
- [ ] **Camera Frame**
- [ ] **Other** (Please specify: `[Your Frame]`)

**Action Dimensions:**
*List the action space dimensions and their meanings.*

**Example:**
```
action: [x, y, z, qx, qy, qz, qw, gripper]
- x, y, z: Absolute position in robot base frame (meters)
- qx, qy, qz, qw: Absolute orientation as quaternion
- gripper: Gripper opening angle (radians)
```

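For readers less familiar with the `(qx, qy, qz, qw)` quaternion convention used in the example above, here is a minimal NumPy sketch that unpacks such an 8-D action vector and converts its quaternion into a rotation matrix. The variable names and sample values are illustrative only and are not part of any dataset schema:

```python
import numpy as np

def quat_to_rotmat(qx, qy, qz, qw):
    """Convert a unit quaternion (x, y, z, w order) to a 3x3 rotation matrix."""
    q = np.array([qx, qy, qz, qw], dtype=float)
    q = q / np.linalg.norm(q)  # guard against slight de-normalization
    x, y, z, w = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - z*w),     2*(x*z + y*w)],
        [2*(x*y + z*w),     1 - 2*(x*x + z*z), 2*(y*z - x*w)],
        [2*(x*z - y*w),     2*(y*z + x*w),     1 - 2*(x*x + y*y)],
    ])

# Hypothetical 8-D action following the layout documented above
action = np.array([0.40, -0.05, 0.30, 0.0, 0.0, 0.0, 1.0, 0.02])
position, quat, gripper = action[:3], action[3:7], action[7]
R = quat_to_rotmat(*quat)  # identity quaternion -> identity rotation
```

A consumer of the dataset can apply `R` directly to vectors expressed in the tool frame to express them in the base frame, which is often the first step in replaying or retargeting recorded trajectories.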
### State Space Representation

**State Information Included:**
- [ ] **Joint Positions** (all articulated joints)
- [ ] **Joint Velocities**
- [ ] **End-Effector Pose** (Cartesian position/orientation)
- [ ] **Force/Torque Readings**
- [ ] **Gripper State** (position, force, etc.)
- [ ] **Other** (Please specify: `[Your State Info]`)

**State Dimensions:**
*List the state space dimensions and their meanings.*

**Example:**
```
observation.state: [j1, j2, j3, j4, j5, j6, j7, gripper_pos]
- j1-j7: Absolute joint positions for 7-DOF arm (radians)
- gripper_pos: Current gripper opening (meters)
```

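To make the absolute-vs-relative Cartesian distinction from the checklist above concrete, here is a hedged sketch (illustrative only, not this dataset's actual export pipeline) of deriving a relative Cartesian action from two consecutive absolute poses. The position delta is taken in the base frame, and the orientation delta is the quaternion `q_rel = q_t⁻¹ ⊗ q_{t+1}`:

```python
import numpy as np

def quat_mul(a, b):
    """Hamilton product of two quaternions in (x, y, z, w) order."""
    ax, ay, az, aw = a
    bx, by, bz, bw = b
    return np.array([
        aw*bx + ax*bw + ay*bz - az*by,
        aw*by - ax*bz + ay*bw + az*bx,
        aw*bz + ax*by - ay*bx + az*bw,
        aw*bw - ax*bx - ay*by - az*bz,
    ])

def quat_conj(q):
    """Conjugate (inverse for unit quaternions), (x, y, z, w) order."""
    x, y, z, w = q
    return np.array([-x, -y, -z, w])

def relative_action(pose_t, pose_t1):
    """Delta pose from pose_t to pose_t1; each pose is [x, y, z, qx, qy, qz, qw].
    Position delta is expressed in the robot base frame."""
    dpos = pose_t1[:3] - pose_t[:3]
    dquat = quat_mul(quat_conj(pose_t[3:]), pose_t1[3:])
    return np.concatenate([dpos, dquat])

# Two hypothetical consecutive end-effector poses
prev_pose = np.array([0.40, -0.05, 0.30, 0.0, 0.0, 0.0, 1.0])
next_pose = np.array([0.41, -0.05, 0.30, 0.0, 0.0, 0.0, 1.0])
delta = relative_action(prev_pose, next_pose)  # 1 cm step along base-frame x
```

If the dataset instead expresses deltas in the tool frame, the position difference would additionally need to be rotated into the tool frame, so documenting the reference frame in the checklist above matters.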
### 📋 Recommended Additional Representations

*Even if not your primary action/state representation, we strongly encourage including these standardized formats for maximum compatibility:*

**Recommended Action Fields:**
- **`action.cartesian_absolute`**: Absolute Cartesian pose with absolute quaternions
  ```
  [x, y, z, qx, qy, qz, qw, gripper_angle]
  ```

**Recommended State Fields:**
- **`observation.state.joint_positions`**: Absolute positions for all articulated joints
  ```
  [joint_1, joint_2, ..., joint_n]
  ```

---

## ⏱️ Data Synchronization Approach

*Describe how you achieved proper data synchronization across different sensors, cameras, and robotic systems during data collection. This is crucial for ensuring temporal alignment of all modalities in your dataset.*

**Example:** *We collect joint kinematics from our Franka Research 3 and RGB-D frames from Intel RealSense D435 cameras, all running in ROS 2 Galactic on the same workstation clocked with ROS Time. Both drivers stamp their outgoing messages' `header.stamp` fields with the shared system clock, and we record `/joint_states`, `/camera/*/image_raw`, and `/camera/*/camera_info` in a single rosbag2 session. During export to LeRobot, each data point's ROS `header.stamp` is written verbatim into the `timestamp` attribute. Offline checks show inter-sensor skew stays below ±2 ms across a 2-minute capture.*

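An offline skew check like the one described in the example can be sketched as follows. This is a generic NumPy illustration (the stream names and rates are assumptions, not this dataset's actual topics): it estimates the worst-case gap between each camera-frame timestamp and the nearest joint-state timestamp, both in seconds:

```python
import numpy as np

def max_skew(ts_a, ts_b):
    """Worst-case gap (seconds) between each timestamp in ts_a and its
    nearest neighbor in ts_b. Both arrays must be sorted ascending."""
    ts_a = np.asarray(ts_a, dtype=float)
    ts_b = np.asarray(ts_b, dtype=float)
    idx = np.searchsorted(ts_b, ts_a)
    idx = np.clip(idx, 1, len(ts_b) - 1)
    # Pick whichever of the two bracketing neighbors is closer
    nearest = np.where(
        np.abs(ts_b[idx] - ts_a) < np.abs(ts_b[idx - 1] - ts_a),
        ts_b[idx], ts_b[idx - 1],
    )
    return float(np.max(np.abs(nearest - ts_a)))

# Hypothetical 30 Hz camera stream vs. 100 Hz joint states over 2 s
cam_ts = np.arange(0.0, 2.0, 1 / 30)
joint_ts = np.arange(0.0, 2.0, 1 / 100) + 0.001  # simulated 1 ms clock offset
skew = max_skew(cam_ts, joint_ts)
```

Running such a check per episode and reporting the maximum (as the example's "±2 ms" figure does) gives downstream users a concrete bound on temporal misalignment between modalities.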
---

## 👥 Attribution & Contact

*Please provide attribution for the dataset creators and a point of contact.*

| | |
| :--- | :--- |
| **Dataset Lead** | `[Name1, Name2, ...]` |
| **Institution** | `[Your Institution]` |
| **Contact Email** | `[email1@example.com, email2@example.com, ...]` |
| **Citation (BibTeX)** | <pre><code>@misc{[your_dataset_name_2025],<br> author = {[Your Name(s)]},<br> title = {[Your Dataset Title]},<br> year = {2025},<br> publisher = {Open-H-Embodiment},<br> note = {https://hrpp.research.virginia.edu/teams/irb-sbs/researcher-guide-irb-sbs/identifiers}<br>}</code></pre> |