---
license: mit
task_categories:
- other
pretty_name: Cheedeh IMU Gesture Data
size_categories:
- n<1K
configs:
- config_name: default
  data_files:
  - split: train
    path: train/*.json
  - split: test
    path: test/*.json
---
# Cheedeh IMU Gesture Data

Phone IMU recordings of gestures drawn in mid-air with the phone. Collected to train a real-time gesture classifier running on Android.
## Gestures (labels)

| Label | Gesture |
|---|---|
| `z` | Letter Z |
| `m` | Letter M |
| `s` | Letter S |
| `o` | Letter O (circle) |
| `none` | No gesture / idle |
## Splits
| Split | Samples | m | none | o | s | z |
|---|---|---|---|---|---|---|
| train | 266 | 31 | 111 | 43 | 42 | 39 |
| test | 54 | 15 | 20 | 8 | 4 | 7 |
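The `none` class dominates the train split (111 of 266 samples), so a classifier may benefit from inverse-frequency class weights. A minimal sketch using the counts above (the "balanced" heuristic, as in scikit-learn's `class_weight="balanced"`):

```python
from collections import Counter

# Per-gesture sample counts from the train split table above.
train_counts = Counter({"m": 31, "none": 111, "o": 43, "s": 42, "z": 39})

total = sum(train_counts.values())      # 266
n_classes = len(train_counts)           # 5

# Inverse-frequency weight: total / (n_classes * count)
class_weights = {label: total / (n_classes * count)
                 for label, count in train_counts.items()}

print(round(class_weights["none"], 2))  # ~0.48, down-weights the idle class
print(round(class_weights["m"], 2))     # ~1.72, up-weights the rarest gesture
```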
## Data Format

Each sample is a separate JSON file named `gesture_{label}_{YYYYMMDD}_{HHMMSS}.json`.
```json
{
  "gesture_id": "m_20260210_160623",
  "gesture_name": "m",
  "timestamp": "2026-02-10T15:06:23.660296Z",
  "duration_ms": 3246,
  "sample_rate_hz": 57,
  "data": [
    {
      "t": 1,
      "x": -0.072, "y": -0.143, "z": 0.456,
      "gx": -0.011, "gy": -0.043, "gz": 0.020,
      "grx": 0.522, "gry": 4.386, "grz": 8.759,
      "qx": 0.152, "qy": -0.175, "qz": -0.653, "qw": 0.721
    },
    ...
  ]
}
```
## Fields in `data` array

| Field | Sensor | Unit |
|---|---|---|
| `t` | Elapsed time | ms |
| `x`, `y`, `z` | Linear acceleration (`TYPE_LINEAR_ACCELERATION`) | m/s² |
| `gx`, `gy`, `gz` | Gyroscope (angular velocity) | rad/s |
| `grx`, `gry`, `grz` | Gravity vector (rotation frame) | m/s² |
| `qx`, `qy`, `qz`, `qw` | Orientation quaternion | — |
Sample rate is approximately 50–57 Hz. Duration varies by gesture (typically 1–4 s).
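The stored `sample_rate_hz` is nominal; the effective rate of a recording can be recovered from the elapsed-time field `t`. A minimal sketch, assuming the JSON structure shown above (the helper name is illustrative):

```python
# Hypothetical helper: derive the effective sample rate of one recording
# from the elapsed-time field `t` (milliseconds) in its `data` array.
def effective_sample_rate_hz(sample: dict) -> float:
    ts = [reading["t"] for reading in sample["data"]]
    span_ms = ts[-1] - ts[0]
    # (n - 1) intervals span the recorded duration
    return (len(ts) - 1) / (span_ms / 1000.0)

# Synthetic example: 100 readings spaced 20 ms apart -> 50 Hz
demo = {"data": [{"t": i * 20} for i in range(100)]}
print(effective_sample_rate_hz(demo))  # 50.0
```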
## Loading

```python
from datasets import load_dataset

ds = load_dataset("ravenwing/cheedeh-IMU-data")
sample = ds["train"][0]
# sample["gesture_name"] → "m"
# sample["data"] → list of dicts with t, x, y, z, ...
```
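Because recordings vary in duration (1–4 s), a classifier typically wants fixed-shape inputs. A pure-Python sketch of nearest-index resampling over the accelerometer channels (the function name and target length are illustrative, not part of the dataset's tooling):

```python
# Hedged sketch: resample one recording's x/y/z channels to a fixed length
# so variable-duration gestures share one feature shape.
# `sample` is assumed to follow the JSON structure under Data Format.
def to_fixed_length(sample: dict, target_len: int = 128) -> list:
    readings = sample["data"]
    n = len(readings)
    # Nearest-index resampling (no interpolation).
    idx = [min(n - 1, round(i * (n - 1) / (target_len - 1)))
           for i in range(target_len)]
    return [[readings[j]["x"], readings[j]["y"], readings[j]["z"]] for j in idx]

# Synthetic recording with 50 readings; x encodes the original index.
demo = {"data": [{"x": float(i), "y": 0.0, "z": 0.0} for i in range(50)]}
feat = to_fixed_length(demo, target_len=16)
print(len(feat))                 # 16
print(feat[0][0], feat[-1][0])   # 0.0 49.0 (endpoints preserved)
```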
## Collection

Recorded with an Android app (`gather-data`) using the `TYPE_LINEAR_ACCELERATION` and gyroscope sensors. Each session was a manual recording of one gesture, labeled at collection time.
## Related

- Training code: `learn/` — SVM/RF classifier achieving ~0.76 test accuracy, exported to ONNX for on-device inference.