---
pretty_name: "Lee2019-MI"
license: gpl-3.0
tags:
- eeg
- neuroscience
- eegdash
- brain-computer-interface
- pytorch
- visual
- motor
size_categories:
- n<1K
task_categories:
- other
---
# Lee2019-MI
**Dataset ID:** `nm000338`
_Lee2019_MI_
**Canonical aliases:** `OpenBMI_MI`
> **At a glance:** EEG · Visual · Motor · Healthy · 54 subjects · 216 recordings · GPL-3.0
## Load this dataset
This repo is a **pointer**. The raw EEG data lives at its canonical source
(OpenNeuro / NEMAR); [EEGDash](https://github.com/eegdash/EEGDash) streams it
on demand and returns a PyTorch / braindecode dataset.
```python
# pip install eegdash
from eegdash import EEGDashDataset
ds = EEGDashDataset(dataset="nm000338", cache_dir="./cache")
print(len(ds), "recordings")
```
You can also load it by canonical alias; aliases are registered as classes in `eegdash.dataset`:
```python
from eegdash.dataset import OpenBMI_MI
ds = OpenBMI_MI(cache_dir="./cache")
```
If the dataset has been mirrored to the HF Hub in braindecode's Zarr layout,
you can also pull it directly:
```python
from braindecode.datasets import BaseConcatDataset
ds = BaseConcatDataset.pull_from_hub("EEGDash/nm000338")
```
## Dataset metadata
| Field | Value |
|---|---|
| **Subjects** | 54 |
| **Recordings** | 216 |
| **Tasks (count)** | 1 |
| **Channels** | 66 (×216) |
| **Sampling rate (Hz)** | 1000 (×216) |
| **Total duration (h)** | 91.5 |
| **Size on disk** | 60.8 GB |
| **Recording type** | EEG |
| **Experimental modality** | Visual |
| **Paradigm type** | Motor |
| **Population** | Healthy |
| **Source** | NEMAR |
| **License** | GPL-3.0 |
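The table's totals can be sanity-checked with a little arithmetic: 91.5 hours spread over 216 recordings works out to roughly 25 minutes per recording. A minimal sketch, using only the figures listed above (the variable names are ours, not part of any API):

```python
# Back-of-the-envelope check of the figures in the metadata table.
# All inputs (216 recordings, 91.5 h, 66 channels, 1000 Hz) are taken
# directly from the table above.
n_recordings = 216
total_hours = 91.5
n_channels = 66
sfreq_hz = 1000

# Average recording length: total duration divided by recording count.
minutes_per_recording = total_hours * 60 / n_recordings
print(f"avg recording length: {minutes_per_recording:.1f} min")

# Total samples per channel across the whole dataset.
total_samples = total_hours * 3600 * sfreq_hz
print(f"samples per channel: {total_samples:.0f}")
```

This is useful as a quick plausibility check after a download: if the concatenated dataset you load is far from ~25 minutes per recording at 1000 Hz, something was truncated or resampled along the way.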
## Links
- **DOI:** [10.1093/gigascience/giz002](https://doi.org/10.1093/gigascience/giz002)
- **NEMAR:** [nm000338](https://nemar.org/dataexplorer/detail?dataset_id=nm000338)
- **Browse 700+ datasets:** [EEGDash catalog](https://huggingface.co/spaces/EEGDash/catalog)
- **Docs:** <https://eegdash.org>
- **Code:** <https://github.com/eegdash/EEGDash>
---
_Auto-generated from [dataset_summary.csv](https://github.com/eegdash/EEGDash/blob/main/eegdash/dataset/dataset_summary.csv) and the [EEGDash API](https://data.eegdash.org/api/eegdash/datasets/summary/nm000338). Do not edit this file by hand — update the upstream source and re-run `scripts/push_metadata_stubs.py`._