---
license: cc-by-nc-sa-4.0
tags:
  - Affective Brain-Computer Interface
  - Electroencephalogram (EEG)
  - Functional near-infrared spectroscopy (fNIRS)
  - Real-time dynamic label
  - EEG-fNIRS
  - Emotion Recognition
pretty_name: >-
  A Subject Real-time Dynamic Labeled EEG-fNIRS Synchronized Recorded Emotion Dataset
size_categories:
  - 10B<n<100B
configs:
  - config_name: Metadata
    data_files:
      - split: Metadata
        path: "Metadata.csv"
  - config_name: PANAS_score
    data_files:
      - split: PANAS_score
        path: "PANAS_score.csv"
  - config_name: SAM_score
    data_files:
      - split: SAM_score
        path: "SAM_score.csv"
  - config_name: video_info
    data_files:
      - split: video_info
        path: "Video_info.csv"
  - config_name: EEG_channels
    data_files:
      - split: EEG_channels
        path: "EEG_channels.csv"
  - config_name: fNIRS_coordinates
    data_files:
      - split: fNIRS_coordinates
        path: "fNIRS_coordinates.csv"
  - config_name: fNIRS_reservations
    data_files:
      - split: fNIRS_reservations
        path: "fNIRS_reservations.csv"
extra_gated_prompt: "You agree to not use the dataset to conduct experiments that cause harm to human subjects."
extra_gated_fields:
  Real Name: text
  Position: text
  Country: country
  Institution: text
  Department: text
  Supervisor Name (if any): text
  Lab/Supervisor Home Page: text
  Purpose of study: text
  I agree to use this dataset under the `CC BY-NC-SA` license for non-commercial use ONLY: checkbox
---

# REFED: A Subject Real-time Dynamic Labeled EEG-fNIRS Synchronized Recorded Emotion Dataset

## About REFED

REFED is an affective brain-computer interface (aBCI) dataset that combines multimodal brain signals with real-time dynamic emotion annotation, filling a critical gap in the study of the neural mechanisms underlying emotional dynamics and in the development of highly reliable aBCI models. By synchronously acquiring EEG and fNIRS signals, REFED enables joint observation of neuroelectrical activity and hemodynamic responses during emotion elicitation, providing unique data support for exploring emotion-related neurovascular coupling mechanisms. In addition, dynamic valence and arousal annotations based on subjects' subjective reports temporally align the brain signals with changes in emotional state, which significantly improves the temporal modeling capability of emotion recognition models. Experimental validation shows that the dataset meets accepted standards for emotion elicitation validity and labeling reliability, and that the multimodal signal features correlate significantly with the dynamic labels. The open sharing of REFED will promote cross-modal analysis of the neural representations underlying the dynamic encoding of emotion, and lays a foundation for affective computing and brain-computer interfaces to move toward dynamic interaction paradigms with higher ecological validity.

## Data overview

```plaintext
REFED/
├── README.md
├── Metadata.csv
├── SAM.csv
├── PANAS.csv
├── fNIRS_reservations.csv
├── fNIRS_coordinate.csv
├── Video_info.csv
├── data/
│   ├── 1/
│   │   ├── EEG_baselines.mat
│   │   ├── EEG_videos.mat
│   │   ├── fNIRS_baselines.mat
│   │   └── fNIRS_videos.mat
│   ├── 2/
│   │   └── ...
│   └── 32/
│       └── ...
└── annotations/
    ├── 1_label.mat
    ├── 2_label.mat
    └── ... (up to 32_label.mat)
```

### 📄 README.md

Instructions for using the dataset, including background, structure, data collection, and the license agreement.

### 📄 Metadata.csv

Basic subject information (ID, gender, age, health status, etc.).

### 📄 SAM.csv

Post-trial subjective emotion ratings (valence and arousal) based on the Self-Assessment Manikin (SAM) scale.

### 📄 PANAS.csv

Positive and negative affect self-assessments collected before and after the experiment (Positive and Negative Affect Schedule).

### 📄 fNIRS_reservations.csv

fNIRS data acquisition logs, including bad-channel markers.

### 📄 fNIRS_coordinate.csv

3D coordinates of the fNIRS channels for alignment and spatial analysis.

### 📄 Video_info.csv

Information on the 15 emotional videos.

### 📁 data/

Contains one folder per subject, numbered 1 through 32, each holding the following raw brain-signal files:

**Subdirectory structure:**

```plaintext
data/{subject_id}/
├── EEG_baselines.mat    # Resting-state EEG signals (baseline phase)
├── EEG_videos.mat       # EEG signals during emotion elicitation (video stimulation)
├── fNIRS_baselines.mat  # Resting-state fNIRS signals (baseline phase)
└── fNIRS_videos.mat     # fNIRS signals during emotion elicitation (video stimulation)
```

- Each subject corresponds to a numbered folder, for a total of 32 subjects.
- The EEG sampling rate is 1000 Hz; the fNIRS sampling rate is 47.62 Hz.
- fNIRS provides six signal types: `HbO`, `HbR`, `HbT`, `Abs 780 nm`, `Abs 805 nm`, and `Abs 830 nm`.
- Each `.mat` file stores multi-trial time series, one entry per video (e.g., `video_1`), for a total of 15 videos.
- EEG arrays have shape channels × time; fNIRS arrays have shape signal types × channels × time.
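The layout above can be sketched in Python with `scipy.io.loadmat`. The `video_1` key follows the per-video storage described above; the channel counts, the HbO-first signal-type ordering, and the use of synthetic stand-in files (instead of the real `data/{subject_id}/` files) are illustrative assumptions:

```python
import numpy as np
from scipy.io import loadmat, savemat

# Synthetic stand-ins for one subject's EEG_videos.mat and fNIRS_videos.mat.
# EEG: channels x time (here 32 ch, 10 s at 1000 Hz) -- channel count assumed.
# fNIRS: signal types x channels x time (6 types, here 24 ch, 10 s at 47.62 Hz).
savemat("EEG_videos_demo.mat", {"video_1": np.random.randn(32, 10_000)})
savemat("fNIRS_videos_demo.mat", {"video_1": np.random.randn(6, 24, 476)})

eeg = loadmat("EEG_videos_demo.mat")["video_1"]      # channels x time
fnirs = loadmat("fNIRS_videos_demo.mat")["video_1"]  # types x channels x time

hbo = fnirs[0]  # first signal type (assumed to be HbO) -> channels x time
print(eeg.shape, fnirs.shape, hbo.shape)  # (32, 10000) (6, 24, 476) (24, 476)
```

The same pattern applies to the baseline files; only the key names and durations differ.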

### 📁 annotations/

Stores **the real-time dynamic labeling data** recorded while each subject watched the emotion videos:

```plaintext
annotations/
├── 1_label.mat
├── 2_label.mat
└── ... (up to 32_label.mat)
```

- Each `*_label.mat` file contains the subject's joystick-based subjective emotion labels, covering both the valence and arousal dimensions.
- The annotation sequences, which record changes in valence and arousal over time, are precisely aligned to the brain signals (shape: 2 × time).
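Because EEG (1000 Hz) and fNIRS (47.62 Hz) run at different rates, a 2 × time label sequence aligned at one rate may need resampling onto the other modality's time axis. A minimal sketch with synthetic labels, assuming linear interpolation (the interpolation choice is ours, not part of the dataset specification):

```python
import numpy as np

EEG_FS, FNIRS_FS = 1000.0, 47.62  # sampling rates documented above

# Synthetic 2 x time label sequence (valence, arousal), 60 s at the EEG rate.
labels = np.random.rand(2, 60_000)
t_eeg = np.arange(labels.shape[1]) / EEG_FS

# Target time axis at the fNIRS rate, covering the same duration.
n_fnirs = int(labels.shape[1] / EEG_FS * FNIRS_FS)
t_fnirs = np.arange(n_fnirs) / FNIRS_FS

# Linearly interpolate each label dimension onto the fNIRS time axis.
labels_fnirs = np.stack([np.interp(t_fnirs, t_eeg, row) for row in labels])
print(labels_fnirs.shape)  # (2, 2857)
```

For EEG-rate analyses the labels can be used as-is, since they share the 2 × time alignment described above.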

-----

## Collection

To enable real-time annotation of subjects' emotional states and automated control of the entire procedure, we developed a real-time annotation and control system, shown in the figure below.

<img src="./Figures/Figure_2.png" alt="Figure_2" style="zoom:25%;" />

The channel distribution of the joint EEG-fNIRS acquisition is shown in the figure below.

<img src="./Figures/Figure_1.png" alt="Figure_1" style="zoom:50%;" />

-----
## Citation

```bibtex
@inproceedings{ning2025refed,
  title={{REFED}: A Subject Real-time Dynamic Labeled {EEG}-f{NIRS} Synchronized Recorded Emotion Dataset},
  author={Xiaojun Ning and Jing Wang and Zhiyang Feng and Tianzuo Xin and Shuo Zhang and Shaoqi Zhang and Zheng Lian and Yi Ding and Youfang Lin and Ziyu Jia},
  booktitle={The Thirty-ninth Annual Conference on Neural Information Processing Systems Datasets and Benchmarks Track},
  year={2025},
  url={https://openreview.net/forum?id=C4IqLzavel}
}
```

-----

## License

The dataset is publicly available under the `CC BY-NC-SA` license; users are granted direct access after confirming that it will be used for non-commercial research purposes only.