---
license: cc-by-4.0
task_categories:
  - tabular-classification
  - reinforcement-learning
language:
  - en
tags:
  - robot
  - sensors
  - activity-recognition
  - navigation
  - failure-detection
  - reinforcement-learning
  - tabular
pretty_name: Robot Intelligence Dataset
size_categories:
  - 10K<n<100K
---

# 🤖 Robot Intelligence Dataset

A collection of three real-world robotics sensor datasets used to train and evaluate machine learning pipelines across four intelligence challenges: Perception · Navigation · Failure Detection · Autonomous Decision-Making (RL).

**Companion GitHub repo:** KushSaraf/Robot_Intelligence

**Made by:** Kush Saraf & Yash Chavda


## Datasets

### 🧠 1. Perception Dataset — Human Activity Recognition (HAR)

**Source:** UCI HAR Dataset

| Property | Value |
|---|---|
| Instances | 10,299 (7,352 train / 2,947 test) |
| Features | 561 time- and frequency-domain features |
| Sensors | Accelerometer + gyroscope (3-axis, 50 Hz) |
| Classes | 6 (Walking, Walking Upstairs, Walking Downstairs, Sitting, Standing, Laying) |
| Device | Samsung Galaxy S II worn on the waist |
| Subjects | 30 volunteers, ages 19–48 |

**Task:** Classify human activity from smartphone inertial sensor signals.

**Files:**

```
perception_dataset/
├── train/X_train.txt       # 7352 × 561 feature matrix
├── train/y_train.txt       # labels 1–6
├── train/subject_train.txt # subject IDs
├── test/X_test.txt
├── test/y_test.txt
├── test/subject_test.txt
├── features.txt            # feature names
├── activity_labels.txt     # class names
└── train/Inertial Signals/ # raw inertial signals (9 channels)
```

### 🗺️ 2. Navigation Dataset — Wall-Following Robot

**Source:** UCI Wall-Following Robot Navigation

| Property | Value |
|---|---|
| Instances | 5,456 |
| Features | 24 ultrasonic range sensors (US1–US24, 360° coverage) |
| Classes | 4 movement decisions |
| Robot | SCITOS G5 |
| Sampling | ~9 samples/second |

**Class distribution:**

| Class | Count | % |
|---|---|---|
| Move-Forward | ~2,205 | 40% |
| Sharp-Right-Turn | ~2,074 | 38% |
| Slight-Right-Turn | ~818 | 15% |
| Slight-Left-Turn | ~359 | 6% |

**Task:** Map the sensor array to a movement command (non-linearly separable by design).

**Files:**

```
navigation_dataset/
├── sensor_readings_24.data   # 24 sensors (recommended)
├── sensor_readings_4.data    # 4 simplified sensors
├── sensor_readings_2.data    # 2 simplified sensors
└── Wall-following.names      # dataset description
```

### ⚠️ 3. Failure Detection Dataset — Robot Execution Failures

**Source:** UCI Robot Execution Failures

| Property | Value |
|---|---|
| Instances | ~463 total across 5 tasks |
| Raw features | 90 (15 time steps × 6 force/torque axes) |
| Stat features | 24 (mean, std, min, max per axis) |
| Total features | 114 (after feature engineering) |
| Classes | Up to 11 per file |

**Force/torque axes:** Fx, Fy, Fz, Tx, Ty, Tz

**Learning problems (files):**

| File | Task |
|---|---|
| lp1.data | Approach to grasp position |
| lp2.data | Transfer of a part |
| lp3.data | Position after transfer failure |
| lp4.data | Approach to ungrasp position |
| lp5.data | Motion with a part |

**Task:** Classify robot arm execution failures from time-series force/torque data.

**Files:**

```
failure_dataset/
├── lp1.data    # approach to grasp
├── lp2.data    # transfer
├── lp3.data    # post-transfer
├── lp4.data    # ungrasp approach
└── lp5.data    # motion with part
```

## Usage

### Perception

```python
import pandas as pd

X_train = pd.read_csv("perception_dataset/train/X_train.txt", sep=r"\s+", header=None)
y_train = pd.read_csv("perception_dataset/train/y_train.txt", header=None, names=["label"])
```
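The labels in `y_train.txt` are integers 1–6; `activity_labels.txt` maps each integer to an activity name, one `id name` pair per line. A minimal decoding sketch (the sample string below mirrors the file's format so the snippet is self-contained; in practice, read the file itself):

```python
# Sample content in the same "id name" format as activity_labels.txt.
sample = """1 WALKING
2 WALKING_UPSTAIRS
3 WALKING_DOWNSTAIRS
4 SITTING
5 STANDING
6 LAYING"""

def parse_activity_labels(text):
    """Parse 'id name' lines into an {int: str} mapping."""
    mapping = {}
    for line in text.strip().splitlines():
        idx, name = line.split(maxsplit=1)
        mapping[int(idx)] = name
    return mapping

labels = parse_activity_labels(sample)
# In practice:
#   labels = parse_activity_labels(open("perception_dataset/activity_labels.txt").read())
#   y_train["activity"] = y_train["label"].map(labels)
```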

### Navigation

```python
import pandas as pd

df = pd.read_csv("navigation_dataset/sensor_readings_24.data", header=None)
X, y = df.iloc[:, :-1], df.iloc[:, -1]
```
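As a sanity check on the loaded data, a simple baseline can be trained with scikit-learn (assumed available; not part of this card). The synthetic `df` below stands in for `sensor_readings_24.data` and mirrors its layout of 24 float sensor columns plus one class column:

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in with the same shape as the real file:
# 24 sensor columns followed by a movement-class column.
rng = np.random.default_rng(0)
classes = ["Move-Forward", "Sharp-Right-Turn", "Slight-Right-Turn", "Slight-Left-Turn"]
df = pd.DataFrame(rng.random((200, 24)))
df["class"] = rng.choice(classes, size=200)

X, y = df.iloc[:, :-1], df.iloc[:, -1]
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0
)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
```

On the real dataset a tree ensemble is a reasonable baseline because the task is non-linearly separable by design; accuracy on the random stand-in data above is of course meaningless.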

### Failure Detection

```python
import re

# Blocks are separated by blank lines; each block = one instance.
# Format: label line, then 15 lines of 6 force/torque values each.
def parse_lp(path):
    rows = []
    with open(path) as f:
        text = f.read()
    # Split on one or more blank lines (spacing varies between files).
    for block in re.split(r"\n\s*\n", text.strip()):
        lines = block.strip().split("\n")
        label = lines[0].strip()
        data = [list(map(float, l.split())) for l in lines[1:] if l.strip()]
        rows.append({"label": label, "data": data})
    return rows
```
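The 24 statistical features described above (mean, std, min, max per force/torque axis) can be derived from each parsed 15×6 block. A sketch, assuming the `data` field of one parsed instance as input; the helper names are illustrative, not part of the dataset:

```python
import numpy as np

def stat_features(instance):
    """24 summary stats for one 15x6 force/torque block:
    mean, std, min, max for each of the 6 axes (Fx, Fy, Fz, Tx, Ty, Tz)."""
    arr = np.asarray(instance, dtype=float)  # shape (15, 6)
    return np.concatenate([arr.mean(0), arr.std(0), arr.min(0), arr.max(0)])

def feature_vector(instance):
    """Raw 90 values flattened, plus the 24 stats -> 114-feature vector."""
    arr = np.asarray(instance, dtype=float)
    return np.concatenate([arr.ravel(), stat_features(arr)])
```

For example, `feature_vector(rows[0]["data"])` would yield the 114-dimensional representation mentioned in the table above.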

## Citation

If you use this dataset collection, please cite the original UCI sources:

- **HAR:** Anguita et al., ESANN 2013
- **Wall-Following:** Freire et al., ESANN 2009
- **Robot Failures:** Camarinha-Matos et al., 1996