---
license: mit
task_categories:
- robotics
tags:
- LeRobot
configs:
- config_name: default
  data_files: data/*/*.parquet
---
# KAI0
<span style="color: red; font-weight: bold; font-size: 24px;">⚠️ !!! Awaiting information: fill in the badge links below</span>
<div align="center">
  <a href="">
    <img src="https://img.shields.io/badge/GitHub-grey?logo=GitHub" alt="GitHub Badge">
  </a>
  <a href="">
    <img src="https://img.shields.io/badge/Project%20Page-blue?style=plastic" alt="Project Page Badge">
  </a>
  <a href="">
    <img src="https://img.shields.io/badge/Research_Blog-black?style=flat" alt="Research Blog Badge">
  </a>
  <a href="">
    <img src="https://img.shields.io/badge/Dataset-Overview-brightgreen?logo=googleforms" alt="Dataset Overview Badge">
  </a>
</div>

<a id="Table-of-Contents"></a>
# Table of Contents
- [About the Dataset](#About-the-Dataset)
- [Dataset Structure](#Dataset-Structure)
  - [Folder hierarchy](#Folder-hierarchy)
  - [Details](#Details)
- [Download the Dataset](#download-the-dataset)
- [Get Started](#Get-Started)
- [License and Citation](#License-And-Citation)

<span style="color: red; font-weight: bold; font-size: 24px;">⚠️ !!! Awaiting information to be filled in</span>
<a id="About-the-Dataset"></a>
# [About the Dataset](#Table-of-Contents)
- This dataset was created using [LeRobot](https://github.com/huggingface/lerobot).
- ** hours of real-world scenarios** across ** ** tasks
- **Tasks**

<a id="Dataset-Structure"></a>
# [Dataset Structure](#Table-of-Contents)
<span style="color: red; font-weight: bold; font-size: 24px;">⚠️ !!! Awaiting final dataset details; to be adjusted</span>
<a id="Folder-hierarchy"></a>
## [Folder hierarchy](#Table-of-Contents)
```text
dataset_root/
├── data/
│   ├── chunk-000/
│   │   ├── episode_000000.parquet
│   │   ├── episode_000001.parquet
│   │   └── ...
│   └── ...
├── videos/
│   ├── chunk-000/
│   │   ├── observation.images.hand_left/
│   │   │   ├── episode_000000.mp4
│   │   │   ├── episode_000001.mp4
│   │   │   └── ...
│   │   ├── observation.images.hand_right/
│   │   │   ├── episode_000000.mp4
│   │   │   ├── episode_000001.mp4
│   │   │   └── ...
│   │   ├── observation.images.top_head/
│   │   │   ├── episode_000000.mp4
│   │   │   ├── episode_000001.mp4
│   │   │   └── ...
│   │   └── ...
│   └── ...
├── meta/
│   ├── info.json
│   ├── episodes.jsonl
│   ├── tasks.jsonl
│   └── episodes_stats.jsonl
└── README.md
```

<a id='Details'></a>
## [Details](#Table-of-Contents)
### info.json
The basic structure of `meta/info.json`:
```json
{
    "codebase_version": "v2.1",
    "robot_type": "agilex",
    "total_episodes": ...,
    "total_frames": ...,
    "total_tasks": ...,
    "total_videos": ...,
    "total_chunks": ...,
    "chunks_size": ...,
    "fps": ...,
    "splits": {
        "train": ...
    },
    "data_path": "data/chunk-{episode_chunk:03d}/episode_{episode_index:06d}.parquet",
    "video_path": "videos/chunk-{episode_chunk:03d}/{video_key}/episode_{episode_index:06d}.mp4",
    "features": {
        "observation.images.top_head": {
            "dtype": "video",
            "shape": [480, 640, 3],
            "names": ["height", "width", "channel"],
            "info": {
                "video.height": 480,
                "video.width": 640,
                "video.codec": "av1",
                "video.pix_fmt": "yuv420p",
                "video.is_depth_map": false,
                "video.fps": 30,
                "video.channels": 3,
                "has_audio": false
            }
        },
        "observation.images.hand_left": {
            ...
        },
        "observation.images.hand_right": {
            ...
        },
        "observation.state": {
            "dtype": "float32",
            "shape": [14],
            "names": null
        },
        "action": {
            "dtype": "float32",
            "shape": [14],
            "names": null
        },
        "timestamp": {
            "dtype": "float32",
            "shape": [1],
            "names": null
        },
        "frame_index": {
            "dtype": "int64",
            "shape": [1],
            "names": null
        },
        "episode_index": {
            "dtype": "int64",
            "shape": [1],
            "names": null
        },
        "index": {
            "dtype": "int64",
            "shape": [1],
            "names": null
        },
        "task_index": {
            "dtype": "int64",
            "shape": [1],
            "names": null
        }
    }
}
```
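As a quick sanity check, the `data_path` and `video_path` templates above are plain Python format strings. A minimal sketch of resolving them (the chunk/episode indices here are hypothetical examples, not values from this dataset):

```python
# Path templates copied from meta/info.json; they are standard Python format strings.
data_path = "data/chunk-{episode_chunk:03d}/episode_{episode_index:06d}.parquet"
video_path = "videos/chunk-{episode_chunk:03d}/{video_key}/episode_{episode_index:06d}.mp4"

# Hypothetical example: episode 1 stored in chunk 0.
print(data_path.format(episode_chunk=0, episode_index=1))
# → data/chunk-000/episode_000001.parquet

print(video_path.format(
    episode_chunk=0,
    video_key="observation.images.top_head",
    episode_index=1,
))
# → videos/chunk-000/observation.images.top_head/episode_000001.mp4
```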

<span style="color: red; font-weight: bold; font-size: 24px;">⚠️ !!! The field explanations below still need to be confirmed</span>
### [Parquet file format](#Table-of-Contents)
| Field Name | Shape | Meaning |
|------------|-------|---------|
| observation.state | [N, 14] | Left arm `[:, :6]`, right arm `[:, 7:13]`: joint angles;<br>left `[:, 6]`, right `[:, 13]`: gripper opening |
| action | [N, 14] | Left arm `[:, :6]`, right arm `[:, 7:13]`: joint angles;<br>left `[:, 6]`, right `[:, 13]`: gripper opening |
| timestamp | [N, 1] | Time elapsed since the start of the episode (in seconds) |
| frame_index | [N, 1] | Index of this frame within the current episode (0-indexed) |
| episode_index | [N, 1] | Index of the episode this frame belongs to |
| index | [N, 1] | Globally unique index across all frames in the dataset |
| task_index | [N, 1] | Index identifying the task being performed |

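To make the state/action layout concrete, here is a minimal slicing sketch. It uses a synthetic NumPy array in place of an `(N, 14)` column read from an episode parquet file (e.g. via `pandas.read_parquet`); the variable names are illustrative, not part of the dataset schema:

```python
import numpy as np

# Synthetic stand-in for an (N, 14) observation.state (or action) array.
state = np.arange(3 * 14, dtype=np.float32).reshape(3, 14)

left_joints   = state[:, :6]    # left arm joint angles
left_gripper  = state[:, 6]     # left gripper opening
right_joints  = state[:, 7:13]  # right arm joint angles
right_gripper = state[:, 13]    # right gripper opening

print(left_joints.shape, right_joints.shape)  # (3, 6) (3, 6)
```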
<a id="download-the-dataset"></a>
# [Download the Dataset](#Table-of-Contents)
### Python Script

```python
from huggingface_hub import hf_hub_download, snapshot_download
from datasets import load_dataset

# Download a single file
hf_hub_download(
    repo_id="OpenDriveLab-org/kai0",
    filename="episodes.jsonl",
    subfolder="meta",
    repo_type="dataset",
    local_dir="/where/you/want/to/save"
)

# Download a specific folder
snapshot_download(
    repo_id="OpenDriveLab-org/kai0",
    local_dir="/where/you/want/to/save",
    repo_type="dataset",
    allow_patterns=["data/*"]
)

# Load the entire dataset
dataset = load_dataset("OpenDriveLab-org/kai0")
```

### Terminal (CLI)

```bash
# Download a single file
hf download OpenDriveLab-org/kai0 \
    --include "meta/info.json" \
    --repo-type dataset \
    --local-dir "/where/you/want/to/save"

# Download a specific folder
hf download OpenDriveLab-org/kai0 \
    --repo-type dataset \
    --include "meta/*" \
    --local-dir "/where/you/want/to/save"

# Download the entire dataset
hf download OpenDriveLab-org/kai0 \
    --repo-type dataset \
    --local-dir "/where/you/want/to/save"
```

<a id='Get-Started'></a>
# [Get Started](#Table-of-Contents)
## Load the dataset

### For LeRobot version < 0.4.0

Choose the appropriate import based on your version:

| Version | Import Path |
|---------|-------------|
| `<= 0.1.0` | `from lerobot.common.datasets.lerobot_dataset import LeRobotDataset` |
| `> 0.1.0` and `< 0.4.0` | `from lerobot.datasets.lerobot_dataset import LeRobotDataset` |

```python
# For version <= 0.1.0
from lerobot.common.datasets.lerobot_dataset import LeRobotDataset

# For version > 0.1.0 and < 0.4.0
from lerobot.datasets.lerobot_dataset import LeRobotDataset

# Load the dataset; pass root= to point at a local copy instead of the Hub cache
dataset = LeRobotDataset(repo_id="OpenDriveLab-org/kai0", root="where/the/dataset/is/stored")
```

### For LeRobot version >= 0.4.0

You need to migrate the dataset from v2.1 to v3.0 first. See the official documentation: [Migrate the dataset from v2.1 to v3.0](https://huggingface.co/docs/lerobot/lerobot-dataset-v3)

```bash
python -m lerobot.datasets.v30.convert_dataset_v21_to_v30 --repo-id=<HF_USER/DATASET_ID>
```
<span style="color: red; font-weight: bold; font-size: 24px;">⚠️ !!! Awaiting information to be filled in</span>
<a id="License-And-Citation"></a>
# [License and Citation](#Table-of-Contents)
All data and code in this repository are released under [](). Please consider citing our project if it helps your research.

```BibTeX
@misc{,
  title={},
  author={},
  howpublished={\url{}},
  year={}
}
```