---
license: cc-by-nc-sa-4.0
---

# LET: Full-Size Humanoid Robot Real-World Dataset

<div style="display: flex; justify-content: space-between; align-items: center; width: 100%;">
  <div>
    <a href="https://huggingface.co/datasets/LejuRobotics/let_dataset">
      <img src="https://img.shields.io/badge/Huggingface-FF6B35?style=for-the-badge&logo=huggingface" alt="Huggingface">
    </a>
    <a href="https://www.modelscope.cn/datasets/LejuRobotics/let_dataset">
      <img src="https://img.shields.io/badge/Modelscope-1890FF?style=for-the-badge&logo=alibabacloud" alt="Modelscope">
    </a>
  </div>
</div>

[中文](https://www.modelscope.cn/datasets/LejuRobotics/let_dataset/file/view/master/README.md?id=159848&status=1) | [English]

<div style="font-size:1.1em; max-width:800px; margin: 0 0 16px 0; text-align: left;">
  <b><span style="color:#000000">LET Dataset</span></b> is collected on the full-size humanoid robot <b><span style="color:#1890FF">Kuavo 4 Pro</span></b>, covering real-world multi-task data across multiple scenarios and operation types. It is designed for robot manipulation, mobility, and interaction tasks, supporting scalable robot learning in real environments.
</div>

## 📋 Table of Contents


- [Key Features](#key-features)
- [Hardware Platform](#hardware-platform)
- [Usage Guide](#usage-guide)
  - [Tool Repository](#tool-repository)
- [Tasks and Data Overview](#tasks-and-data-overview)
  - [Semantic Labels](#semantic-labels)
  - [Data Statistics](#data-statistics)
- [Dataset](#dataset)
  - [Dataset Directory Structure](#dataset-directory-structure)
  - [Data Format](#data-format)
  - [Label Format](#label-format)
- [Data Access](#data-access)
- [Data Communication Group](#data-communication-group)
- [Citation](#citation)
- [License](#license)

<a id="key-features"></a>
## ✨ Key Features

- Large-scale, continuously updated multi-view, multi-modal data from a real-world, full-size humanoid robot
- Covers multiple domains including industry, home, medical, and service, with 31 sub-task scenarios
- Includes 117 atomic skills such as grasping, bimanual operation, and tool use, with a total duration of over 1,000 hours
- Expert-labeled and human-verified data to ensure high quality
- Provides a complete toolchain, from data conversion and model training to inference and validation

<div style="overflow-x: auto; text-align: left; max-width: fit-content; margin-left: 0;">
  <table style="border-collapse: collapse; border-spacing: 0; width: auto; table-layout: auto;">
    <tr>
      <td align="center" style="padding: 10px;">
        <img src="docs/images/Assembly_line_sorting.gif" alt="Assembly line sorting" width="230" style="border-radius: 10px; box-shadow: 0 4px 8px rgba(0, 0, 0, 0.1);">
        <p><b>Assembly line sorting</b></p>
      </td>
      <td align="center" style="padding: 10px;">
        <img src="docs/images/Clean the floor.gif" alt="Daily table cleaning" width="230" style="border-radius: 10px; box-shadow: 0 4px 8px rgba(0, 0, 0, 0.1);">
        <p><b>Daily table cleaning</b></p>
      </td>
      <td align="center" style="padding: 10px;">
        <img src="docs/images/Assembly_line_sorting-dex_hand.gif" alt="Assembly line sorting (dexterous hand)" width="230" style="border-radius: 10px; box-shadow: 0 4px 8px rgba(0, 0, 0, 0.1);">
        <p><b>Assembly line sorting (dexterous hand)</b></p>
      </td>
    </tr>
    <tr>
      <td align="center" style="padding: 10px;">
        <img src="docs/images/cam_l.gif" alt="Left hand camera view" width="230" style="border-radius: 10px; box-shadow: 0 4px 8px rgba(0, 0, 0, 0.1);">
        <p><b>Left hand camera view</b></p>
      </td>
      <td align="center" style="padding: 10px;">
        <img src="docs/images/cam_h.gif" alt="Head camera view" width="230" style="border-radius: 10px; box-shadow: 0 4px 8px rgba(0, 0, 0, 0.1);">
        <p><b>Head camera view</b></p>
      </td>
      <td align="center" style="padding: 10px;">
        <img src="docs/images/cam_r.gif" alt="Right hand camera view" width="230" style="border-radius: 10px; box-shadow: 0 4px 8px rgba(0, 0, 0, 0.1);">
        <p><b>Right hand camera view</b></p>
      </td>
    </tr>
  </table>
</div>

<a id="hardware-platform"></a>
## 🤖 Hardware Platform

<div align="left">
  <img src="docs/images/kuavo4pro.png" alt="kuavo" width="200" style="display:inline-block; margin-right: 10px;">
  <img src="docs/images/kuavo_wheel.png" alt="kuavo_wheel" width="200" style="display:inline-block;">
</div>

The main hardware platform is **Kuavo 4 Pro** and its wheeled version, with the following features:

- **Robot parameters:** Height **1.66 m**, weight **55 kg**, supports hot-swappable batteries
- **Motion control:** 40 degrees of freedom, max walking speed **7 km/h**, supports bipedal autonomous SLAM
- **Generalization:** Supports multi-modal large models (e.g., Pangu, DeepSeek, ChatGPT), with **20+ atomic skills**

<a id="usage-guide"></a>
## 🚀 Usage Guide

<a id="dataset-download-example"></a>

<a id="tool-repository"></a>
### Tool Repository

We provide a complete tool repository, including:

- **Data conversion tool ([rosbag2lerobot](https://github.com/LejuRobotics/kuavo_data_challenge/tree/main/kuavo_data)):** Converts rosbag files to formats suitable for model training
- **Two imitation learning models:** **Diffusion Policy** and **ACT**
- **Model training scripts**
- **Code and deployment instructions** for both real robots and simulation environments

For details, see the open-source repository: [**kuavo_data_challenge**](https://github.com/LejuRobotics/kuavo_data_challenge) 🔥

<a id="tasks-and-data-overview"></a>
## 🎬 Tasks and Data Overview

This dataset covers scenarios such as automobile factories, FMCG, hotel services, 3C factories, life services, and logistics, and includes multi-modal observations (RGB, depth, joint states, etc.) along with a rich set of atomic skills (grasping, bimanual operation, tool use, etc.).

<div style="overflow-x: auto; text-align: left; max-width: fit-content; margin-left: 0;">
  <table style="border-collapse: collapse; border-spacing: 0; width: auto; table-layout: auto;">
    <tr>
      <td align="center" style="padding: 10px;">
        <img src="docs/images/Sorting.gif" alt="Consumer goods sorting" width="230" style="border-radius: 10px; box-shadow: 0 4px 8px rgba(0, 0, 0, 0.1);">
        <p><b>Consumer goods sorting</b></p>
      </td>
      <td align="center" style="padding: 10px;">
        <img src="docs/images/Simulation_resized.gif" alt="Simulation data demonstration" width="230" style="border-radius: 10px; box-shadow: 0 4px 8px rgba(0, 0, 0, 0.1);">
        <p><b>Simulation data demonstration</b></p>
      </td>
      <td align="center" style="padding: 10px;">
        <img src="docs/images/3C.gif" alt="Assembly feeding" width="230" style="border-radius: 10px; box-shadow: 0 4px 8px rgba(0, 0, 0, 0.1);">
        <p><b>Assembly feeding</b></p>
      </td>
    </tr>
  </table>
</div>

<a id="semantic-labels"></a>
### Semantic Labels

The LET dataset decomposes complex tasks into a series of atomic action steps with clear semantics, using standardized annotation methods to provide sub-task level timelines and natural language annotations for each task.

<div style="text-align: center;">
<img src="docs/images/Visualize Datasets.png" width="600">
</div>

Each data entry is accompanied by multi-dimensional semantic label information, including:

- Object labels: industrial parts, tableware, daily utensils, medicines, etc.
- Skill labels: grasp, place, rotate, push, pull, press, etc.
- Task and scene identifiers: unified task name coding, scene dimension distinguishes operation context semantics
- End effector type: records actions performed by gripper and dexterous hand separately
- Language description: e.g., "Pick up the medicine box from the conveyor belt and place it on the designated tray", supporting alignment modeling between natural language and actions

<a id="data-statistics"></a>
### Data Statistics

LET dataset statistics are as follows:

#### Data type & Scene distribution

| Data type distribution | Scene distribution |
|:---:|:---:|
| <img src="docs/images/Data type_en.png" width="500"> | <img src="docs/images/Scene distribution_en.png" width="500"> |

#### Task distribution

<div align="left">
  <img src="docs/images/Task Distribution_en.png" width="800" alt="Task distribution">
</div>

#### Task duration distribution

<div align="left">
  <img src="docs/images/Task duration distribution_en.png" width="800" alt="Task duration distribution">
</div>

#### Distribution of atomic skills

<div align="left">
  <img src="docs/images/Distribution of Task Atomic Skills_en.png" width="800" alt="Distribution of atomic skills">
</div>

<a id="dataset"></a>
## 📦 Dataset
<hr style="margin-top: -10px;margin-bottom: 6px">
<a id="dataset-directory-structure"></a>

### Dataset Directory Structure

```text
.
├── hdf5
│   ├── real
│   │   ├── Labelled
│   │   │   ├── customer_check_in-P4-dex_hand
│   │   │   ├── deliver_room_card-P4-dex_hand
│   │   │   ├── deliver_water_bottle-P4-dex_hand
│   │   │   ├── loading_of_large_tooling-P4-dex_hand
│   │   │   ├── loading_of_small_tooling-P4-dex_hand
│   │   │   ├── more_coil_sorting-P4-dex_hand
│   │   │   ├── more_FMCG_loading-P4-dex_hand
│   │   │   ├── more_goods_orders-P4-dex_hand
│   │   │   ├── more_scan_code_for_weighing-P4-dex_hand
│   │   │   ├── parts_offline-P4-dex_hand
│   │   │   ├── quick_sort-P4-leju_claw
│   │   │   ├── rubbish_sorting-P4-leju_claw
│   │   │   ├── shop_oversale-P4-leju_claw
│   │   │   ├── single_coil_sorting-P4-dex_hand
│   │   │   ├── single_FMCG_loading-P4-dex_hand
│   │   │   ├── single_goods_orders-P4-dex_hand
│   │   │   ├── single_scan_code_for_weighing-P4-dex_hand
│   │   │   ├── SPS_parts_grab-P4-leju_claw
│   │   │   ├── SPS_parts_sorting-P4-dex_hand
│   │   │   └── task_mass_check-P4-leju_claw
│   │   └── Unlabelled
│   │       ├── assembly_line_sorting-P4-leju_claw
│   │       ├── clothing_storage-P4-leju_claw
│   │       ├── countertop_cleaning-P4-leju_claw
│   │       ├── deliver_room_card-P4-dex_hand
│   │       ├── desktop_decluttering-P4-leju_claw
│   │       ├── drug_finishing-P4-leju_claw
│   │       ├── express_delivery_sorting-P4-leju_claw
│   │       ├── express_logistics_scenario-P4-leju_claw
│   │       ├── loading_of_large_tooling-P4-dex_hand
│   │       ├── loading_of_small_tooling-P4-dex_hand
│   │       ├── loading_of_small_tooling-P4-leju_claw
│   │       ├── more_coil_sorting-P4-dex_hand
│   │       ├── more_FMCG_loading-P4-dex_hand
│   │       ├── more_goods_orders-P4-dex_hand
│   │       ├── more_goods_orders-P4-leju_claw
│   │       ├── more_scan_code_for_weighing-P4-dex_hand
│   │       ├── parts_offline-P4-dex_hand
│   │       ├── parts_off_line-P4-leju_claw
│   │       ├── quick_sort-P4-leju_claw
│   │       ├── rubbish_sorting-P4-leju_claw
│   │       ├── shop_oversale-P4-leju_claw
│   │       ├── single_coil_sorting-P4-dex_hand
│   │       ├── single_FMCG_loading-P4-leju_claw
│   │       ├── single_goods_orders-P4-dex_hand
│   │       ├── SMT_tray_rack_blanking-P4-leju_claw
│   │       ├── SPS_parts_grab-P4-leju_claw
│   │       ├── SPS_parts_sorting-P4-dex_hand
│   │       ├── SPS_parts_sorting-P4-leju_claw
│   │       ├── standardized_feeding_for_FMCG-P4-dex_hand
│   │       └── task_mass_check-P4-leju_claw
│   └── sim
│       └── Unlabelled
│           ├── bottle_flip-P4-claw(Rq2f85)
│           ├── package_weighing-P4-claw(Rq2f85)
│           ├── SPS_parts_sorting-P4-claw(Rq2f85)
│           └── target_placement-P4-claw(Rq2f85)
└── rosbag
    ├── real
    │   ├── Labelled  // Same task structure as HDF5.
    │   └── Unlabelled // Same task structure as HDF5.
    └── sim
        └── Unlabelled // Same task structure as HDF5.
```
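The task directory names above follow a `<task>-<platform>-<end_effector>` pattern (e.g. `single_coil_sorting-P4-dex_hand`). A minimal sketch for splitting them, assuming tasks use underscores and only the separators are hyphens (the function name is ours, not part of any official tooling):

```python
def parse_task_dir(name: str) -> dict:
    """Split a task directory name into task, platform, and end-effector parts.

    Assumes the <task>-<platform>-<end_effector> naming seen in the tree
    above, where the task token itself contains no hyphens.
    """
    task, platform, effector = name.split("-")
    return {"task": task, "platform": platform, "end_effector": effector}

print(parse_task_dir("single_coil_sorting-P4-dex_hand"))
# {'task': 'single_coil_sorting', 'platform': 'P4', 'end_effector': 'dex_hand'}
```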

<a id="data-format"></a>
### Data Format

#### ROSbag Data Format

| Topic Type           | Topic Name                                      | Message Type                        | Main Fields / Description                                                                                   |
|--------------------|------------------------------------------------|---------------------------------|--------------------------------------------------------------------------------------------------|
| <b>Camera RGB Image</b>      | <span style="color:#1890FF">/cam_x/color/image_raw/compressed</span>               | <span style="color:#000000">sensor_msgs/CompressedImage</span>     | x is h/l/r, for head/left wrist/right wrist camera respectively;<br>header (message header with timestamp, sequence, frame, etc.),<br>format (image encoding format),<br>data (image data)|
| <b>Camera Depth Image</b>       | <span style="color:#1890FF">/cam_x/depth/image_rect_raw/compressed</span>          | <span style="color:#000000">sensor_msgs/CompressedImage</span>     | x is h/l/r, for head/left wrist/right wrist camera respectively;<br>header (message header), format (encoding format), data (image data)|
| <b>Arm Trajectory Control</b>       | <span style="color:#1890FF">/kuavo_arm_traj</span>                                 | <span style="color:#000000">sensor_msgs/JointState</span>          | header (message header),<br>name (joint name list, 14 joints, arm_joint_1~arm_joint_14),<br>position (desired joint position, structure same as raw sensor data items 12-25)|
| <b>Raw Sensor Data</b>     | <span style="color:#1890FF">/sensors_data_raw</span>                               | <span style="color:#000000">kuavo_msgs/sensorsData</span>          | sensor_time (timestamp),<br>joint_data (joint data: position, velocity, acceleration, current),<br>imu_data (IMU data: gyroscope, accelerometer, quaternion),<br>end_effector_data (end effector data, currently unused)|
| <b>Dexterous Hand Position (Real Robot)</b> | <span style="color:#1890FF">/control_robot_hand_position</span>                    | <span style="color:#000000">kuavo_msgs/robotHandPosition</span>    | left_hand_position (left hand 6D, 0 open, 100 closed),<br>right_hand_position (right hand 6D, 0 open, 100 closed)|
| <b>Dexterous Hand State (Real Robot)</b> | <span style="color:#1890FF">/dexhand/state</span>                                  | <span style="color:#000000">sensor_msgs/JointState</span>          | name (12 joint names),<br>position (12 joint positions, first 6 for left hand, last 6 for right hand),<br>velocity (12 joint velocities),<br>effort (12 joint currents)|
| <b>Gripper Control (Real Robot)</b>   | <span style="color:#1890FF">/leju_claw_command</span>                        | <span style="color:#000000">kuavo_msgs/leju_claw_command</span>      | name (length 2, left_claw/right_claw),<br>position (length 2, 0 open, 100 closed),<br>velocity (length 2, target velocity, default 50),<br>effort (length 2, target current in A, default 1)|
| <b>Gripper State (Real Robot)</b>   | <span style="color:#1890FF">/leju_claw_state</span>                                | <span style="color:#000000">kuavo_msgs/lejuClawState</span>        | state (int8[2], left/right gripper state, see details below),<br>data (kuavo_msgs/endEffectorData, contains gripper position, velocity, current)|
| <b>Simulation Gripper Control</b>       | <span style="color:#1890FF">/gripper/command</span>                                | <span style="color:#000000">sensor_msgs/JointState</span>          | header (message header),<br>position (length 2, 0 open, 255 closed)|
| <b>Simulation Gripper State</b>       | <span style="color:#1890FF">/gripper/state</span>                                  | <span style="color:#000000">sensor_msgs/JointState</span>          | header (message header),<br>position (length 2, 0 open, 0.8 closed)|
| <b>Robot Position Command</b>     | <span style="color:#1890FF">/cmd_pose_world</span>                                 | <span style="color:#000000">geometry_msgs/Twist</span>             | linear.x/y/z (translation in world frame in m),<br>angular.x/y/z (rotation in world frame in radians)|

<details>
<summary>Detailed Field Descriptions</summary>

- <b><span style="color:#000000">/cam_x/color/image_raw/compressed</span></b>, <b>/cam_x/depth/image_rect_raw/compressed</b>
  - header (std_msgs/Header): Message header with timestamp, sequence number, frame information
  - format (string): Image encoding format
  - data (uint8[]): Image data

- <b><span style="color:#000000">/kuavo_arm_traj</span></b>
  - header:Message header
  - name:Joint name list, 14 joints named arm_joint_1~arm_joint_14
  - position:Desired joint position, structure same as raw sensor data items 12-25

- <b><span style="color:#000000">/sensors_data_raw</span></b>
  - sensor_time(time):Timestamp
  - joint_data(kuavo_msgs/jointData):Joint data including position, velocity, acceleration, current
    - Data order:
      - First 12 items are lower limb motor data:
        - Indices 0–5: left leg  
          (`l_leg_roll`, `l_leg_yaw`, `l_leg_pitch`, `l_knee`, `l_foot_pitch`, `l_foot_roll`)
        - Indices 6–11: right leg  
          (`r_leg_roll`, `r_leg_yaw`, `r_leg_pitch`, `r_knee`, `r_foot_pitch`, `r_foot_roll`)

      - Next 14 items are arm motor data:
        - Indices 12–18: left arm  
          (`l_arm_pitch`, `l_arm_roll`, `l_arm_yaw`, `l_forearm_pitch`, `l_hand_yaw`, `l_hand_pitch`, `l_hand_roll`)
        - Indices 19–25: right arm  
          (`r_arm_pitch`, `r_arm_roll`, `r_arm_yaw`, `r_forearm_pitch`, `r_hand_yaw`, `r_hand_pitch`, `r_hand_roll`)
      - Last 2 items are head motor data: head_yaw, head_pitch
    - Units: position in radians, velocity in radian/s, acceleration in radian/s², current in Amperes (A)
  - imu_data(kuavo_msgs/imuData):IMU data including gyroscope (gyro, unit rad/s), accelerometer (acc, unit m/s²), quat (IMU orientation)
  - end_effector_data(kuavo_msgs/endEffectorData):End effector data, currently unused

- <b><span style="color:#000000">/control_robot_hand_position</span></b>
  - left_hand_position(float[6]):Left hand 6D, each element [0,100], 0 fully open, 100 fully closed
  - right_hand_position(float[6]):Right hand 6D, same meaning as above

- <b><span style="color:#000000">/dexhand/state</span></b>
  - name(string[12]):12 joint names
  - position(float[12]):12 joint positions, first 6 for left hand, last 6 for right hand
  - velocity(float[12]):12 joint velocities, first 6 for left hand, last 6 for right hand
  - effort(float[12]):12 joint currents, first 6 for left hand, last 6 for right hand

- <b><span style="color:#000000">/leju_claw_command</span></b>
  - name(string[2]):left_claw, right_claw
  - position(float[2]):Left/right gripper target position, [0,100], 0 open, 100 closed
  - velocity(float[2]):Target velocity, [0,100], default 50
  - effort(float[2]):Target current in A, default 1

- <b><span style="color:#000000">/leju_claw_state</span></b>
  - state(int8[2]):Left/right gripper state, meanings as follows:
    - -1:Error (execution anomaly)
    - 0:Unknown (default initialization state)
    - 1:Moving
    - 2:Reached target position
    - 3:Object grasped
  - data(kuavo_msgs/endEffectorData):Contains gripper position, velocity, current, structure same as /leju_claw_command

- <b><span style="color:#000000">/gripper/command</span></b>(Simulation):
  - header:Message header
  - position(float[2]):Left/right gripper target position, [0,255], 0 open, 255 closed

- <b><span style="color:#000000">/gripper/state</span></b>(Simulation):
  - header:Message header
  - position(float[2]):Left/right gripper current position, [0,0.8], 0 open, 0.8 closed

- <b><span style="color:#000000">/cmd_pose_world</span></b> (Simulation Task 4 only)
  - linear.x/y/z (float): Translation in world frame in meters
  - angular.x/y/z (float): Rotation in world frame in radians

</details>
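The `/sensors_data_raw` joint ordering described above can be pinned down with index slices; a minimal sketch derived from the stated layout (the slice names and helper are ours, not an official API):

```python
# Index layout of /sensors_data_raw joint_data, per the description above:
# indices 0-11 legs (6 left, 6 right), 12-25 arms (7 left, 7 right), 26-27 head.
LEFT_LEG = slice(0, 6)
RIGHT_LEG = slice(6, 12)
LEFT_ARM = slice(12, 19)
RIGHT_ARM = slice(19, 26)
HEAD = slice(26, 28)

def split_joint_positions(position):
    """Split a 28-element joint position vector (radians) into named groups."""
    assert len(position) == 28, "expected 12 leg + 14 arm + 2 head joints"
    return {
        "left_leg": position[LEFT_LEG],
        "right_leg": position[RIGHT_LEG],
        "left_arm": position[LEFT_ARM],
        "right_arm": position[RIGHT_ARM],
        "head": position[HEAD],
    }

groups = split_joint_positions(list(range(28)))
print(len(groups["left_arm"]))  # 7
```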

#### HDF5 Data Format

```text
<task_root>
├── cameras
│   ├── hand_left  // Left hand camera
│   │   ├── color     // RGB image info
│   │   │   └── data   // RGB image data (by timestamp)
│   │   └── depth/     // Depth image info
│   │       └── data   // Depth data
│   ├── hand_right // Right hand camera
│   │   ├── color // RGB image info
│   │   │   └── data // RGB data
│   │   └── depth  // Depth image info
│   │       └── data // Depth data
│   └── head // Head camera
│       ├── color // RGB image info
│       │   └── data // RGB image data
│       └── depth // Depth image info
│           └── data // Depth data
├── joints  // Joint data
│   ├── action  // Desired joint values
│   │   ├── arm // Arm
│   │   │   ├── position   // N(rows)*14(cols); N=frames, 14=DoF for both arms (7 per arm)
│   │   │   └── velocity   // Desired joint velocity
│   │   ├── effector // End effector
│   │   │   └── position   //  N(rows)*2(cols); N=frames, 2=left/right gripper open/close
│   │   ├── head // Head
│   │   │   ├── position   //  N(rows)*2(cols); N=frames, 2=2 DoF (pitch/yaw)
│   │   │   └── velocity  // Joint velocity
│   │   └── leg // Leg 
│   │       ├── position   //  N(rows)*12(cols)
│   │       └── velocity  // Joint velocity 
│   └── state  // Actual joint values
│       ├── arm // Arm
│       │   ├── position // N(rows)*14(cols); N=frames, 14=DoF for both arms (7 per arm)
│       │   └── velocity // Joint velocity
│       ├── effector // End effector
│       │   └── position //  N(rows)*2(cols); N=frames, 2=left/right gripper open/close
│       ├── head // Head
│       │   ├── position  // N(rows)*2(cols); N=frames, 2=2 DoF (pitch/yaw)
│       │   └── velocity   // Joint velocity
│       └── leg // Leg
│           ├── position //  N(rows)*12(cols)
│           └── velocity // Joint velocity
├── parameters  // Sensor extrinsics
│   └── camera
│       ├── hand_left.json   # Left hand camera intrinsics/extrinsics
│       ├── hand_right.json  # Right hand camera intrinsics/extrinsics
│       └── head.json        # Head camera intrinsics/extrinsics
└── metadata.json            # Collection metadata: device, end effector type, camera frame rate, joint info, etc.
```
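A sketch of reading an episode with `h5py`, using the group paths from the tree above. Here we first write a tiny synthetic file so the example is self-contained; the shapes follow the N×14 / N×2 annotations, while dtypes and the filename are illustrative assumptions:

```python
import numpy as np
import h5py

N = 50  # frames in this toy episode

# Write a tiny synthetic file following the layout above
# (dataset paths from the tree; create_dataset makes intermediate groups).
with h5py.File("episode_sketch.hdf5", "w") as f:
    f.create_dataset("joints/action/arm/position", data=np.zeros((N, 14)))
    f.create_dataset("joints/state/arm/position", data=np.zeros((N, 14)))
    f.create_dataset("joints/state/effector/position", data=np.zeros((N, 2)))

# Read it back the way a data loader might
with h5py.File("episode_sketch.hdf5", "r") as f:
    arm_state = f["joints/state/arm/position"][:]      # N x 14 arm joints
    gripper = f["joints/state/effector/position"][:]   # N x 2 gripper open/close

print(arm_state.shape, gripper.shape)  # (50, 14) (50, 2)
```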
<a id="label-format"></a>
### Label Format

Label information is stored in a JSON file with the same name as the data file. Example:

```json
{
  "location": "Yangtze River Delta Integrated Demonstration Zone Intelligent Robot Training Center",
  "primaryScene": "Default primary scene",
  "primarySceneCode": "default_level_one_scene",
  "secondaryScene": "3C factory scene",
  "secondarySceneCode": "3C factory manufacturing",
  "tertiaryScene": "Coil sorting",
  "tertiarySceneCode": "Coil sorting",
  "initSceneText": "Coils of various colors are placed in the middle of the table, material boxes are placed on both sides of the table, and the robot is located at the back of the table",
  "englishInitSceneText": "Coils of various colors are placed in the middle of the table, material boxes are placed on both sides of the table, and the robot is located at the back of the table",
  "taskGroupName": "Single coil sorting",
  "taskGroupCode": "single_coil_sorting",
  "taskName": "7-22-Coil classification",
  "taskCode": "XQFL_11",
  "deviceSn": "P4-209",
  "taskPrompt": "",
  "marks": [
    {
      "taskId": "1947326026455584768",
      "markStart": "2025-07-22 9:18:39.640",
      "markEnd": "2025-07-22 9:18:39.814",
      "duration": 0.233,
      "startPosition": 0.7363737795977026,
      "endPosition": 0.769568869806783,
      "skillAtomic": "pick",
      "skillDetail": "Pick up the coil from the table",
      "enSkillDetail": "pick coil from table",
      "markType": "step"
    }
  ]
}
```
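A label file can be consumed with the standard `json` module; a minimal sketch that lists the step timeline, using field names taken from the example above (the inline JSON is an abridged stand-in for a real label file):

```python
import json

# Parse a label file and summarize its atomic-skill step timeline
# (field names from the example label format above).
label = json.loads("""{
  "taskGroupCode": "single_coil_sorting",
  "marks": [
    {"skillAtomic": "pick", "enSkillDetail": "pick coil from table",
     "duration": 0.233, "markType": "step"}
  ]
}""")

steps = [m for m in label["marks"] if m["markType"] == "step"]
for m in steps:
    print(f'{m["skillAtomic"]}: {m["enSkillDetail"]} ({m["duration"]} s)')
print("total step duration:", sum(m["duration"] for m in steps), "s")
```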

<a id="data-access"></a>
## 📥 Data Access

- **Official request:** You can request access by contacting the official email `wangsong@lejurobot.com`.
- **Public platforms:** The LET dataset will be publicly released on major platforms such as Openloong, ModelScope, and Hugging Face, making it easily accessible to developers and researchers worldwide.


<a id="data-communication-group"></a>
## 📋 Data Communication Group

- **Data communication QQ group: 1043359345**
<div align="left">
  <img src="docs/images/qq.png" width="400" alt="LET Data Communication Group">
</div>

<a id="citation"></a>
## 📝 Citation

If you use this dataset in your research, please cite it according to the platform from which you accessed it:

**Citation for Hugging Face**
```text
@misc{LET2025,
    title={LET: Full-Size Humanoid Robot Real-World Dataset},
    author={LejuRobotics},
    year={2025},
    howpublished={\url{https://huggingface.co/datasets/LejuRobotics/let_dataset}}
}
```
**Citation for ModelScope**
```text
@misc{LET2025,
    title={LET: Full-Size Humanoid Robot Real-World Dataset},
    author={LejuRobotics},
    year={2025},
    howpublished={\url{https://www.modelscope.cn/datasets/lejurobot/let_dataset}}
}
```
**Citation for Atomgit AI**
```text
@misc{LET2025,
    title={LET: Full-Size Humanoid Robot Real-World Dataset},
    author={LejuRobotics},
    year={2025},
    howpublished={\url{https://ai.atomgit.com/lejurobot/let_dataset}}
}
```
<a id="license"></a>
## 📄 License

All data and code in this repository are released under [CC BY-NC-SA-4.0](https://creativecommons.org/licenses/by-nc-sa/4.0/).