---
license: apache-2.0
task_categories:
- robotics
tags:
- LeRobot
- so100
- tutorial
- flower
configs:
- config_name: task-fold-t-shirt-sizeM-pfung
data_files: >-
task-fold-tshirt-sizeM/pfung-2025-01-23T19:49:03.888Z/data/chunk-000/episode_*
default: true
---
# 🦾 🤖 Project CrowdBot
<img src="https://cdn-uploads.huggingface.co/production/uploads/649621f2b68b87c87b5f5385/HdrGApKavhEEoB5WUiVjI.png" width="700" />
## Our Goal: Advancing Robotics Through Collective and Open Intelligence
The latest robotics foundation models are trained on thousands of hours of physical robot data, but creating datasets at that scale remains feasible only for the most well-funded organizations. Project CrowdBot lets researchers pool their robotics data into rich, open-source datasets for training state-of-the-art models.
Built with:
[LeRobot](https://github.com/huggingface/lerobot) | [So-100 Arm](https://github.com/TheRobotStudio/SO-ARM100) | [Flower.AI](https://flower.ai)
[Contribute Now →](https://huggingface.co/datasets/pfung/test)
<img src="https://cdn-uploads.huggingface.co/production/uploads/649621f2b68b87c87b5f5385/aN5G-P_UxMxSai_qMBME1.png" width="500" />
<img src="https://cdn-uploads.huggingface.co/production/uploads/649621f2b68b87c87b5f5385/iWXfx3TD5kthkWgBOrMMN.png" width="300" />
## Contributors
* [Philip Fung](https://cnn.com)
* [Conor McGartoll](https://cnn.com)
* [Brian McGeehan](https://cnn.com)
## Dataset Description
- **Homepage:**
- **License:** apache-2.0
## Dataset Structure
The dataset follows the LeRobot `v2.0` format, recorded on a bimanual SO-100 setup: a 12-dimensional action/state space (six joints per arm) and three camera views (`top`, `left`, `right`). Key metadata from [meta/info.json](meta/info.json):
```json
{
"codebase_version": "v2.0",
"robot_type": "so100",
"total_episodes": 50,
"total_frames": 17383,
"total_tasks": 1,
"total_videos": 150,
"total_chunks": 1,
"chunks_size": 1000,
"fps": 30,
"splits": {
"train": "0:50"
},
"data_path": "data/chunk-{episode_chunk:03d}/episode_{episode_index:06d}.parquet",
"video_path": "videos/chunk-{episode_chunk:03d}/{video_key}/episode_{episode_index:06d}.mp4",
"features": {
"action": {
"dtype": "float32",
"shape": [
12
],
"names": [
"left_shoulder_pan",
"left_shoulder_lift",
"left_elbow_flex",
"left_wrist_flex",
"left_wrist_roll",
"left_gripper",
"right_shoulder_pan",
"right_shoulder_lift",
"right_elbow_flex",
"right_wrist_flex",
"right_wrist_roll",
"right_gripper"
]
},
"observation.state": {
"dtype": "float32",
"shape": [
12
],
"names": [
"left_shoulder_pan",
"left_shoulder_lift",
"left_elbow_flex",
"left_wrist_flex",
"left_wrist_roll",
"left_gripper",
"right_shoulder_pan",
"right_shoulder_lift",
"right_elbow_flex",
"right_wrist_flex",
"right_wrist_roll",
"right_gripper"
]
},
"observation.images.top": {
"dtype": "video",
"shape": [
480,
640,
3
],
"names": [
"height",
"width",
"channels"
],
"info": {
"video.fps": 30.0,
"video.height": 480,
"video.width": 640,
"video.channels": 3,
"video.codec": "av1",
"video.pix_fmt": "yuv420p",
"video.is_depth_map": false,
"has_audio": false
}
},
"observation.images.left": {
"dtype": "video",
"shape": [
480,
640,
3
],
"names": [
"height",
"width",
"channels"
],
"info": {
"video.fps": 30.0,
"video.height": 480,
"video.width": 640,
"video.channels": 3,
"video.codec": "av1",
"video.pix_fmt": "yuv420p",
"video.is_depth_map": false,
"has_audio": false
}
},
"observation.images.right": {
"dtype": "video",
"shape": [
480,
640,
3
],
"names": [
"height",
"width",
"channels"
],
"info": {
"video.fps": 30.0,
"video.height": 480,
"video.width": 640,
"video.channels": 3,
"video.codec": "av1",
"video.pix_fmt": "yuv420p",
"video.is_depth_map": false,
"has_audio": false
}
},
"timestamp": {
"dtype": "float32",
"shape": [
1
],
"names": null
},
"frame_index": {
"dtype": "int64",
"shape": [
1
],
"names": null
},
"episode_index": {
"dtype": "int64",
"shape": [
1
],
"names": null
},
"index": {
"dtype": "int64",
"shape": [
1
],
"names": null
},
"task_index": {
"dtype": "int64",
"shape": [
1
],
"names": null
}
}
}
```
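The `data_path` and `video_path` fields above are Python format-string templates: an episode's files are located by its chunk (`episode_index // chunks_size`) and its zero-padded index. A minimal sketch of resolving those paths, assuming the repository has been downloaded locally (e.g. with `huggingface_hub.snapshot_download`); the helper name `episode_files` is illustrative, not part of LeRobot:

```python
# Resolve an episode's relative file paths from the templates in meta/info.json.
# Values below are copied from the metadata shown above.
DATA_PATH = "data/chunk-{episode_chunk:03d}/episode_{episode_index:06d}.parquet"
VIDEO_PATH = "videos/chunk-{episode_chunk:03d}/{video_key}/episode_{episode_index:06d}.mp4"
CHUNKS_SIZE = 1000  # "chunks_size" in meta/info.json
CAMERAS = ("top", "left", "right")  # the three video features above


def episode_files(episode_index: int) -> dict:
    """Return the relative parquet and video paths for one episode."""
    chunk = episode_index // CHUNKS_SIZE  # episodes are grouped 1000 per chunk
    return {
        "data": DATA_PATH.format(episode_chunk=chunk, episode_index=episode_index),
        "videos": [
            VIDEO_PATH.format(
                episode_chunk=chunk,
                video_key=f"observation.images.{cam}",
                episode_index=episode_index,
            )
            for cam in CAMERAS
        ],
    }


print(episode_files(7)["data"])
# data/chunk-000/episode_000007.parquet
```

The resulting parquet file can then be read with `pandas.read_parquet`, yielding one row per frame with the `action`, `observation.state`, `timestamp`, and index columns described above.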
## Citation
**BibTeX:**
```bibtex
[More Information Needed]
```