---
license: mit
task_categories:
- object-detection
- robotics
language:
- en
tags:
- Robotics
- AI
pretty_name: FAFO
size_categories:
- 10K<n<100K
---
# FAFO Dataset
The FAFO dataset supports general-purpose robotics software development. It includes:
- Sensor data: LiDAR scans, GPS coordinates, and IMU readings.
- Image data: Infrared and camera images for object detection and navigation.
- 3D data: Point cloud files for SLAM and mapping.
- Task data: Pre-labeled tasks for robotic arm operations.
## Dataset Overview
### Sensor Data
- **LiDAR Data**: Point cloud scans with timestamps, ranges, intensities, and angles
- **GPS Data**: Precise location data including latitude, longitude, and altitude
- **IMU Data**: Acceleration, angular velocity, and orientation readings
### Image Data
- RGB camera feeds
- Infrared images
- Object detection datasets with bounding box annotations
### 3D Data
- Point cloud maps for SLAM
- 3D environment scans
- Occupancy grid maps
### Task Data
- Pick-and-place task definitions
- Navigation paths
- Robot arm trajectories
- Task annotations and metadata
## Dataset Structure
- `sensor_data/`: Contains JSON files for LiDAR, GPS, and IMU readings.
- `image_data/`: JPEG images for object detection and segmentation.
- `3d_data/`: PCD files for 3D point clouds.
- `task_data/`: JSON files for robotic tasks.
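For workflows that read a local copy of the dataset directly rather than going through the `datasets` library, the layout above can be walked with standard-library tools. A minimal sketch, assuming the dataset has been downloaded to a local directory (the path passed to the function is illustrative):

```python
import json
from pathlib import Path

def load_sensor_records(root):
    """Load every JSON file under <root>/sensor_data into a list of dicts."""
    records = []
    for path in sorted(Path(root).joinpath("sensor_data").glob("*.json")):
        with path.open() as f:
            records.append(json.load(f))
    return records

# Example (illustrative local path):
# records = load_sensor_records("fafo")
```

The same pattern applies to `image_data/`, `3d_data/`, and `task_data/`, swapping in the appropriate file reader (e.g. an image or PCD loader).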
## Usage
This dataset is designed for AI model training, sensor calibration, and robotic task automation.
### Loading the Dataset
```python
from datasets import load_dataset
# Load the complete dataset
dataset = load_dataset("GotThatData/fafo")
# Load specific splits
train_dataset = load_dataset("GotThatData/fafo", split="train")
val_dataset = load_dataset("GotThatData/fafo", split="validation")
test_dataset = load_dataset("GotThatData/fafo", split="test")
```
### Example Usage
```python
# Access sensor data
lidar_scan = dataset['train'][0]['sensor_data']['lidar']
gps_reading = dataset['train'][0]['sensor_data']['gps']
imu_data = dataset['train'][0]['sensor_data']['imu']
# Access image data
image_path = dataset['train'][0]['image_data']
# Access 3D data
point_cloud = dataset['train'][0]['3d_data']
# Access task data
task = dataset['train'][0]['task_data']
```
## Data Format
### Sensor Data
```json
{
  "lidar": {
    "timestamp": 1640995200.0,
    "ranges": [1.2, 2.3, 3.4],
    "intensities": [0.5, 0.6, 0.7],
    "angles": [0.0, 0.1, 0.2]
  },
  "gps": {
    "timestamp": 1640995200.0,
    "latitude": 37.7749,
    "longitude": -122.4194,
    "altitude": 0.0
  },
  "imu": {
    "timestamp": 1640995200.0,
    "acceleration": [0.0, 0.0, 9.81],
    "angular_velocity": [0.0, 0.0, 0.0],
    "orientation": [0.0, 0.0, 0.0, 1.0]
  }
}
```
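As a sketch of how this schema might be consumed, the snippet below converts the polar LiDAR returns (`ranges`, `angles`) into 2D Cartesian points. The record literal mirrors the example above; the `lidar_to_points` helper is illustrative, not part of the dataset tooling, and assumes angles are in radians.

```python
import math

# Sensor record following the schema above (LiDAR fields only, for brevity)
record = {
    "lidar": {
        "timestamp": 1640995200.0,
        "ranges": [1.2, 2.3, 3.4],
        "intensities": [0.5, 0.6, 0.7],
        "angles": [0.0, 0.1, 0.2],
    }
}

def lidar_to_points(lidar):
    """Convert polar returns (range, angle in radians) to (x, y) points."""
    return [
        (r * math.cos(a), r * math.sin(a))
        for r, a in zip(lidar["ranges"], lidar["angles"])
    ]

points = lidar_to_points(record["lidar"])
```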
### Task Data
```json
{
  "task_type": "pick_and_place",
  "parameters": {
    "position": [0.5, 0.3, 0.2],
    "orientation": [0.0, 0.0, 0.0, 1.0],
    "gripper_state": "open"
  },
  "annotations": {
    "object_class": "cube",
    "bounding_box": [0.1, 0.1, 0.2, 0.2],
    "confidence": 0.95
  }
}
```
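A quick sanity check over a task record can catch malformed annotations before training. This is a minimal sketch (the `validate_task` helper is hypothetical, not part of the dataset tooling), assuming orientations are unit quaternions in `[x, y, z, w]` order and confidences lie in `[0, 1]`:

```python
import math

# Task record following the schema above
task = {
    "task_type": "pick_and_place",
    "parameters": {
        "position": [0.5, 0.3, 0.2],
        "orientation": [0.0, 0.0, 0.0, 1.0],
        "gripper_state": "open",
    },
    "annotations": {
        "object_class": "cube",
        "bounding_box": [0.1, 0.1, 0.2, 0.2],
        "confidence": 0.95,
    },
}

def validate_task(task):
    """Basic sanity checks on a single task record (hypothetical helper)."""
    params = task["parameters"]
    assert len(params["position"]) == 3
    # Orientation is expected to be a unit quaternion [x, y, z, w]
    norm = math.sqrt(sum(c * c for c in params["orientation"]))
    assert abs(norm - 1.0) < 1e-6
    assert params["gripper_state"] in {"open", "closed"}
    assert 0.0 <= task["annotations"]["confidence"] <= 1.0
    return True

validate_task(task)
```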
## Dataset Statistics
- Total samples: [Number of samples]
- Train/Val/Test split: 60%/20%/20%
- Data types:
- Sensor readings: [Number of readings]
- Images: [Number of images]
- 3D scans: [Number of scans]
- Task definitions: [Number of tasks]
## License
MIT License
## Citation
```bibtex
@misc{fafo2024,
  title  = {FAFO Dataset},
  author = {GotThatData},
  year   = {2024}
}
```