Singh2020: EEG Parkinson's Classification Dataset with Pedaling Task
The Singh2020 dataset contains EEG recordings collected during a lower-limb pedaling task designed to assess motor control in individuals with Parkinson's disease (PD), with a particular focus on freezing of gait (FOG) symptoms. A total of 39 participants were included: 13 PD patients with FOG (PDFOG+), 13 PD patients without FOG (PDFOG-), and 13 demographically matched healthy controls.
Participants completed a lower-limb motor task in which they pedaled one rotation in response to a visual "GO" cue, designed to minimize fall risk and reduce EEG artifacts from movement. Each subject completed at least two blocks of either 30 or 50 trials, with PDFOG+ participants performing fewer trials due to symptom severity. A tri-axial accelerometer mounted on the ankle measured pedaling kinematics.
EEG was recorded using a 64-channel cap with a sampling rate of 500 Hz.
Paper
Singh, A., Cole, R. C., Espinoza, A. I., Brown, D., Cavanagh, J. F., & Narayanan, N. S. (2020). Frontal theta and beta oscillations during lower-limb movement in Parkinson’s disease. Clinical Neurophysiology, 131(3), 694-702.
DISCLAIMER: We (DISCO) are NOT the owners or creators of this dataset; we merely uploaded it here to support our (EEG-Bench) and others' work on EEG benchmarking.
Dataset Structure
- `raw_data/` contains the unprocessed EEG and accelerometer recordings.
- `processed_data/` contains pre-processed (`_Ped_Processed.mat`) and post-processed (`_ANALYZED.mat`) experiment data, produced from `raw_data/` using the `Step1_EEG_Pedaling_PreProcess.m` and `Step2_EEG_Pedaling_PostProcess.m` scripts, respectively.
- `ALL_data_Modeling.csv` contains participant information, such as age and MOCA test scores.
- `scripts/` contains various scripts that can be used to reproduce the results from the paper.
- `ORIGINAL_README.txt` is the original README, which may contain further helpful information.
Filename Format
In raw_data/, a recording consists of 3 files:
- `[GROUP][PID].vhdr`: the header file with meta information
- `[GROUP][PID].eeg`: contains the EEG (and accelerometer) data
- `[GROUP][PID].vmrk`: contains event information
where GROUP is either Control or PD (Parkinson's Disease) and PID is the participant's ID (also used in ALL_data_Modeling.csv).
Similarly, in processed_data/,
- `[GROUP][PID]_Ped_Processed.mat` contains pre-processed data epoched around trials
- `[GROUP][PID]_ANALYZED.mat` contains the time-frequency analysis of the pre-processed data
where, again, GROUP is either Control or PD (Parkinson's Disease) and PID is the participant's ID.
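The naming convention above can be handled with a small helper. This is an illustrative sketch: the function name `parse_recording_name` is ours, and it assumes PIDs are numeric, which the source does not explicitly confirm.

```python
import re

def parse_recording_name(filename: str):
    """Split a recording filename like '[GROUP][PID].vhdr' into its
    GROUP ('Control' or 'PD') and PID parts. Assumes numeric PIDs."""
    m = re.match(r"^(Control|PD)(\d+)", filename)
    if m is None:
        raise ValueError(f"Unexpected filename: {filename}")
    return m.group(1), m.group(2)

# Hypothetical examples of the naming scheme:
print(parse_recording_name("PD12.vhdr"))              # ('PD', '12')
print(parse_recording_name("Control3_ANALYZED.mat"))  # ('Control', '3')
```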
Reading raw_data Files in Python
In Python, the 3 files that make up a raw recording can be read via:

```python
import mne

raw = mne.io.read_raw_brainvision("path_to/[GROUP][PID].vhdr")
```
Now, raw.get_data(units='uV') yields a NumPy array of shape (#channels, time_len) in microvolts.
Some general info can be inspected with raw.info, such as the sampling rate (raw.info["sfreq"]).
The channel names (in their correct order) can be seen via raw.ch_names. Note that X, Y and Z denote the accelerometer X-, Y- and Z-axis outputs.
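Since the accelerometer axes appear among the channel names, it can be useful to separate them from the EEG channels. The snippet below is a minimal sketch: the channel-name list is an illustrative subset standing in for `raw.ch_names`, not the dataset's full montage.

```python
# Separate EEG channels from the accelerometer axes (X, Y, Z), given a
# channel-name list as returned by raw.ch_names. The names here are an
# illustrative subset, not the actual 64-channel montage.
ch_names = ["Fp1", "Fz", "Cz", "Pz", "X", "Y", "Z"]

accel_chs = [ch for ch in ch_names if ch in ("X", "Y", "Z")]
eeg_chs = [ch for ch in ch_names if ch not in ("X", "Y", "Z")]

print(eeg_chs)    # ['Fp1', 'Fz', 'Cz', 'Pz']
print(accel_chs)  # ['X', 'Y', 'Z']
```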
Events can be read with:

```python
events_list, events_dict = mne.events_from_annotations(raw)
```
where events_dict contains the mapping of the original event types (where "S 1" stands for the black smaller circle "warning" cue, and "S 2" stands for the green bigger circle "GO" cue) to event IDs in [1,2], the latter of which are used in events_list.
events_list is an array of events, ordered by time. Each entry e = [onset_sample, _, event_id] gives the event onset as a sample index into the time_len dimension of the raw.get_data() EEG array (first column) and the event ID (last column); the middle column can be ignored here.
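For example, the "GO" cue onsets can be selected from the events array like this. The arrays below are synthetic stand-ins with the structure described above, not actual dataset values.

```python
import numpy as np

# Synthetic stand-ins for the outputs of mne.events_from_annotations(raw):
events_dict = {"S 1": 1, "S 2": 2}  # original event type -> event ID
events_list = np.array([
    [1000, 0, 1],   # "S 1" warning cue at sample 1000
    [1600, 0, 2],   # "S 2" GO cue at sample 1600
    [4000, 0, 1],
    [4450, 0, 2],
])

# Keep only the onset samples of the "GO" cue events:
go_id = events_dict["S 2"]
go_onsets = events_list[events_list[:, 2] == go_id, 0]
print(go_onsets)  # [1600 4450]
```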
(See the mne.io.Raw documentation at https://mne.tools/stable/generated/mne.io.Raw.html for more details.)
Reading Processed Files in Python
In Python, the [GROUP][PID]_Ped_Processed.mat files can be read, e.g., with the mat73 package (which reads MATLAB v7.3 files in HDF5 format):

```python
import mat73

mat = mat73.loadmat("path_to/[GROUP][PID]_Ped_Processed.mat")
```
Then mat contains (among others) the following fields and subfields:
- `EEGdata`: EEG data of shape `(#channels, trial_len, #trials)`. E.g., a recording of 50 trials/epochs with 59 channels, each trial lasting 4 seconds at a sampling rate of 500 Hz, has shape `(59, 2000, 50)`. Recall that trials were epoched from -1000 ms to +3000 ms around the "GO" cue event.
- `event`: a list of dictionaries, one per event, each with the following fields:
  - `latency`: the onset of the event, measured as an index into the merged time dimension `#trials x trial_len` (with `#trials` as the outer and `trial_len` as the inner dimension when merging).
  - `type`: the type of event. Here, only `"S 2"` (larger green circle "GO" cue) events are stored.
  - `TimeDiff`: the time in seconds between the `"S 1"` and the `"S 2"` cue.
- `chanlocs`: a list of channel descriptors
- `nbchan`: number of channels
- `trials`: number of trials/epochs in this recording
- `srate`: sampling rate (Hz)
Additionally, the field bad_chans lists the bad channels of this recording.
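The merged-time-dimension convention for `latency` described above can be unpacked with a short calculation. This is a sketch under the stated epoching assumptions (trial_len of 2000 samples, i.e. 4 s at 500 Hz, epochs starting at -1000 ms); the helper name `unmerge_latency` is ours.

```python
# Convert an event 'latency' (an index into the merged #trials x trial_len
# time dimension) back into a (trial, within-trial sample) pair, and that
# sample into milliseconds relative to the "GO" cue.
trial_len = 2000       # 4 s at 500 Hz
srate = 500            # Hz
epoch_start_ms = -1000 # epochs run from -1000 ms to +3000 ms around the cue

def unmerge_latency(latency: int):
    trial, sample = divmod(latency, trial_len)
    ms = epoch_start_ms + sample * 1000 / srate
    return trial, sample, ms

# A latency of 4500 falls in trial 2 (0-based) at sample 500, i.e. 0 ms,
# exactly at the "GO" cue.
print(unmerge_latency(4500))  # (2, 500, 0.0)
```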
License
The original authors have licensed this work under the PDDL v1.0 license (see LICENSE.txt).