
Access to MER_PS requires sharing your Hugging Face username and email address with the dataset authors. By requesting access, you agree not to use this dataset for experiments or applications that may cause harm to human subjects, and to use it only for non-commercial scientific research under the CC BY-NC-SA license.

MER 2026: MER-PS (Physiological Signal-Based Emotion) Codabench Public Training/Validation Data

This release contains the public MER_PS data for model development in the Codabench valence-arousal regression task. The subject identifiers are anonymized as test_1 through test_24.

The task is to model continuous emotional state from synchronized EEG and fNIRS recordings. Dynamic labels are provided at 1 Hz for two affective dimensions:

  • valence: pleasantness of the emotional state
  • arousal: activation level of the emotional state

Valence and arousal use the original MER_PS raw label scale [1, 255]. The center of the valence-arousal plane is (128, 128).
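For model training it is often convenient to center this scale. A minimal sketch (the choice of target range is a convention assumed here, not prescribed by the dataset) that maps raw labels to [-1, 1] and back:

```python
def normalize_label(raw):
    """Map a raw MER_PS label in [1, 255] to [-1, 1].

    The scale center 128 maps to 0; the extremes 1 and 255 map to -1 and 1.
    """
    return (raw - 128) / 127.0


def denormalize_label(x):
    """Inverse mapping, e.g. for reporting predictions on the raw scale."""
    return x * 127.0 + 128
```
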

Data Modalities

Modality     Channels   Sampling Rate   Description
EEG          64         200 Hz          EEG signals during baseline and video stimulation
fNIRS        51         47.62 Hz        fNIRS signals during baseline and video stimulation
Annotation   2          1 Hz            Continuous valence and arousal labels

Each trial includes a 5-second baseline period before video onset. Each subject watched 15 emotion-inducing video clips.
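The pre-stimulus baseline can be used for per-channel baseline correction. A minimal NumPy sketch (one common preprocessing choice, not a required step; array shapes follow the channel × time layout described below):

```python
import numpy as np


def baseline_correct(video, baseline):
    """Subtract each channel's mean baseline activity from the video-period signal.

    video:    (channels, time) array recorded during video stimulation
    baseline: (channels, time) array from the 5 s pre-stimulus period
    Returns a baseline-corrected (channels, time) array.
    """
    return video - baseline.mean(axis=1, keepdims=True)
```
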

Directory Structure

MER_PS_codabench_public_trainval/
├── README.md
├── Metadata.csv
├── SAM_score.csv
├── PANAS_score.csv
├── Targeted_emotions.txt
├── fNIRS_coordinates.csv
├── fNIRS_reservations.csv
├── data/
│   ├── test_1/
│   │   ├── EEG_baselines.mat
│   │   ├── EEG_videos.mat
│   │   ├── fNIRS_baselines.mat
│   │   └── fNIRS_videos.mat
│   ├── ...
│   └── test_24/
└── annotations/
    ├── test_1_label.mat
    ├── ...
    └── test_24_label.mat

Files

Metadata.csv contains anonymized subject metadata.

SAM_score.csv contains post-trial subjective ratings based on the Self-Assessment Manikin scale.

PANAS_score.csv contains Positive and Negative Affect Schedule scores.

Targeted_emotions.txt lists the targeted emotion category for each video.

fNIRS_coordinates.csv contains the 3D coordinates of fNIRS channels.

fNIRS_reservations.csv contains fNIRS channel reservation records.

Signal Files

Each subject folder under data/ contains:

EEG_baselines.mat
EEG_videos.mat
fNIRS_baselines.mat
fNIRS_videos.mat

The .mat files store trial-wise arrays such as video_1, video_2, ..., video_15.

EEG arrays are organized as:

channel × time

fNIRS arrays are organized as:

signal_type × channel × time

The fNIRS signal type dimension includes HbO, HbR, HbT, Abs 780 nm, Abs 805 nm, and Abs 830 nm.
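The trial keys (video_1 … video_15) can be read with scipy.io.loadmat. A self-contained sketch that writes and reads a small synthetic stand-in file rather than the real dataset (the 64 × time EEG shape matches the table above; if the released files were saved as MATLAB v7.3, h5py would be needed instead):

```python
import numpy as np
from scipy.io import loadmat, savemat

# Synthetic stand-in for data/test_1/EEG_videos.mat:
# 64 channels at 200 Hz for 2 s (real trials are much longer).
savemat("EEG_videos_demo.mat", {"video_1": np.zeros((64, 400))})

mat = loadmat("EEG_videos_demo.mat")
trial = mat["video_1"]  # channel x time
print(trial.shape)      # (64, 400)
```
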

Annotation Files

Each file in annotations/ contains dynamic valence-arousal labels for one anonymized subject:

annotations/test_1_label.mat

Each annotation array has shape:

2 × time

The first row is valence and the second row is arousal.
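Because labels run at 1 Hz while EEG runs at 200 Hz, each label covers 200 EEG samples. A minimal alignment sketch (the window-per-label pairing is an assumption; the function name and any trailing-sample handling are illustrative):

```python
import numpy as np

EEG_RATE = 200  # Hz, from the modality table
LABEL_RATE = 1  # Hz, annotation rate


def segment_to_labels(eeg, labels):
    """Split a (channels, time) EEG trial into 1 s windows, one per label.

    eeg:    (channels, time) EEG array
    labels: (2, n_labels) array; row 0 is valence, row 1 is arousal
    Returns (windows, valence, arousal), where windows has shape
    (n_labels, channels, EEG_RATE). Trailing samples without a label are dropped.
    """
    n = min(eeg.shape[1] // EEG_RATE, labels.shape[1])
    windows = (
        eeg[:, : n * EEG_RATE]
        .reshape(eeg.shape[0], n, EEG_RATE)
        .transpose(1, 0, 2)
    )
    return windows, labels[0, :n], labels[1, :n]
```
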
