---
pretty_name: "HefmiIch2025"
license: other
tags:
- eeg
- neuroscience
- eegdash
- brain-computer-interface
- pytorch
- multisensory
- motor
- other
size_categories:
- n<1K
task_categories:
- other
---

# HefmiIch2025

**Dataset ID:** `nm000347` _HefmiIch2025_

**Canonical aliases:** `HEFMI_ICH` · `HEFMIICH`

> **At a glance:** EEG · Multisensory motor · other · 37 subjects · 98 recordings · CC-BY-NC-ND-4.0

## Load this dataset

This repo is a **pointer**. The raw EEG data lives at its canonical source (OpenNeuro / NEMAR); [EEGDash](https://github.com/eegdash/EEGDash) streams it on demand and returns a PyTorch / braindecode dataset.

```python
# pip install eegdash
from eegdash import EEGDashDataset

ds = EEGDashDataset(dataset="nm000347", cache_dir="./cache")
print(len(ds), "recordings")
```

You can also load it by canonical alias — these are registered classes in `eegdash.dataset`:

```python
from eegdash.dataset import HEFMI_ICH

ds = HEFMI_ICH(cache_dir="./cache")
```

If the dataset has been mirrored to the HF Hub in braindecode's Zarr layout, you can also pull it directly:

```python
from braindecode.datasets import BaseConcatDataset

ds = BaseConcatDataset.pull_from_hub("EEGDash/nm000347")
```

## Dataset metadata

| | |
|---|---|
| **Subjects** | 37 |
| **Recordings** | 98 |
| **Tasks (count)** | 1 |
| **Channels** | 32 (×98) |
| **Sampling rate (Hz)** | 256 (×98) |
| **Total duration (h)** | 31.2 |
| **Size on disk** | 2.6 GB |
| **Recording type** | EEG |
| **Experimental modality** | Multisensory |
| **Paradigm type** | Motor |
| **Population** | Other |
| **Source** | nemar |
| **License** | CC-BY-NC-ND-4.0 |

## Links

- **DOI:** [10.1038/s41597-025-06100-7](https://doi.org/10.1038/s41597-025-06100-7)
- **NEMAR:** [nm000347](https://nemar.org/dataexplorer/detail?dataset_id=nm000347)
- **Browse 700+ datasets:** [EEGDash catalog](https://huggingface.co/spaces/EEGDash/catalog)
- **Docs:**
- **Code:**

---

_Auto-generated from
[dataset_summary.csv](https://github.com/eegdash/EEGDash/blob/main/eegdash/dataset/dataset_summary.csv) and the [EEGDash API](https://data.eegdash.org/api/eegdash/datasets/summary/nm000347). Do not edit this file by hand — update the upstream source and re-run `scripts/push_metadata_stubs.py`._
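
As a rough sanity check on the metadata table, the headline numbers can be cross-checked with plain arithmetic (no EEGDash calls; the constants below are copied from the table, and the raw-float32 estimate is an assumption about dtype, not a statement about the on-disk format):

```python
# Back-of-envelope check of the metadata table.
# All inputs are taken from the card; float32 (4 bytes/sample) is an assumption.
HOURS = 31.2        # total duration (h)
SFREQ = 256         # sampling rate (Hz)
N_CHANNELS = 32
N_RECORDINGS = 98

total_samples = HOURS * 3600 * SFREQ                     # per channel, all recordings
uncompressed_gb = total_samples * N_CHANNELS * 4 / 1e9   # if stored as raw float32

print(f"~{total_samples / N_RECORDINGS:,.0f} samples per recording")
print(f"~{uncompressed_gb:.1f} GB as raw float32")
```

This yields roughly 3.7 GB as raw float32, versus 2.6 GB reported on disk — plausibly because the source files are stored compressed, which is also why streaming via EEGDash rather than duplicating the data here makes sense.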