---
title: Brain_Emotion_Decoder
emoji: 🧠
colorFrom: yellow
colorTo: indigo
sdk: gradio
sdk_version: 5.48.0
app_file: app.py
pinned: false
---
# Brain Emotional Decoder: EEG-Driven Artistic Style Transfer

## 🧠 Project Overview
The Brain Emotional Decoder is an interdisciplinary project that integrates Affective Computing (decoding emotions from brain signals) with Computational Creativity (generating art).
The pipeline connects these two domains:
- A Long Short-Term Memory (LSTM) network classifies a dominant emotion from pre-processed EEG data.
- The resulting emotion dictates the Content Image for a subsequent Neural Style Transfer (NST) process.
- The style of a user-selected Painter's Artwork is then transferred to the emotion-matched content image using an NST implementation based on Adaptive Instance Normalization (AdaIN).
## 💾 Dataset

This project is built upon EmoEEG-MC (A Multi-Context Emotional EEG Dataset for Cross-Context Emotion Decoding).
- Source: OpenNeuro
- Accession Number: ds005540
- Original Publication: Please cite the corresponding article:
  X. Xu, X. Shen, X. Chen, Q. Zhang, S. Wang, Y. Li, Z. Li, D. Zhang, M. Zhang, and Q. Liu, "A Multi-Context Emotional EEG Dataset for Cross-Context Emotion Decoding," Scientific Data, 2024. (or the version corresponding to the time of use)
- Key Features:
  - 64-channel EEG and peripheral physiological data from 60 participants.
  - Seven emotional categories: Joy, Inspiration, Tenderness, Fear, Disgust, Sadness, and Neutral.
  - Emotions are elicited in two contexts: video-induced and imagery-induced.
## ⚙️ Methodology

### 1. EEG Emotion Classification (LSTM on PSD Features)
Due to resource limitations for local EEG signal processing, this project operates on pre-extracted features.
- Input: Pre-computed Power Spectral Density (PSD) features from the EEG signals, loaded from a file (e.g., a `.psd` file).
- Model: A Long Short-Term Memory (LSTM) network. LSTMs are effective at modeling the temporal dependencies and sequences inherent in EEG features.
- Classification Strategy (Experimental):
  - Each trial (approximately 30 seconds) is processed as a sequence.
  - The model predicts an emotion distribution at each step of the sequence.
  - The final predicted label is the most frequently predicted emotion across the sequence (the "dominant emotion"). This experimental framing simplifies the problem from frame-level tagging to a single summary classification per trial.
- Output: The single dominant emotional label (e.g., 'Disgust', as seen in the example).
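As an illustrative sketch of this strategy (the layer sizes, the 64-channel × 5-band feature layout, and the helper names here are assumptions for illustration, not the project's actual code), a per-step LSTM classifier with a majority-vote summary could look like:

```python
import torch
import torch.nn as nn

# The seven EmoEEG-MC emotion categories
EMOTIONS = ["Joy", "Inspiration", "Tenderness", "Fear", "Disgust", "Sadness", "Neutral"]

class EmotionLSTM(nn.Module):
    """Per-step emotion classifier over a PSD feature sequence."""
    def __init__(self, n_features=64 * 5, hidden=128, n_classes=len(EMOTIONS)):
        # n_features: e.g. 64 channels x 5 frequency bands (assumed layout)
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):                  # x: (batch, time, n_features)
        out, _ = self.lstm(x)              # (batch, time, hidden)
        return self.head(out)              # per-step class logits

def dominant_emotion(logits):
    """Majority vote over per-step predictions -> one label per trial."""
    steps = logits.argmax(dim=-1)          # (batch, time) predicted class ids
    modes = steps.mode(dim=-1).values      # most frequent class per trial
    return [EMOTIONS[int(i)] for i in modes]
```

In this framing the network tags every time step, and `dominant_emotion` collapses the sequence into the single summary label described above.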
### 2. Emotional Artistic Style Transfer
The classified dominant emotion drives the final artistic output.
- Content Image Selection: A pre-selected image corresponding to the predicted dominant emotion is retrieved from the file system.
  - (Note: While generating emotion-specific images with models such as diffusion models would be ideal, a simpler file-based lookup is used here due to hardware constraints.)
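Given that note, the file-based lookup can be as simple as a dictionary. The directory layout and file names below are hypothetical placeholders, not the project's actual assets:

```python
from pathlib import Path

# Hypothetical layout: one curated content image per emotion class
CONTENT_DIR = Path("content_images")
EMOTION_TO_IMAGE = {
    "Joy": "joy.jpg",
    "Inspiration": "inspiration.jpg",
    "Tenderness": "tenderness.jpg",
    "Fear": "fear.jpg",
    "Disgust": "disgust.jpg",
    "Sadness": "sadness.jpg",
    "Neutral": "neutral.jpg",
}

def content_image_for(emotion: str) -> Path:
    """Map a predicted dominant emotion to its pre-selected content image."""
    try:
        return CONTENT_DIR / EMOTION_TO_IMAGE[emotion]
    except KeyError:
        raise ValueError(f"No content image registered for {emotion!r}")
```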
- Style Image Selection: The user specifies a painting (e.g., a Van Gogh) to provide the artistic texture and colors.
- Neural Style Transfer (NST): The content and style images are merged using an NST model.
  - Core Technique: The model utilizes Adaptive Instance Normalization (AdaIN), which is known to accelerate and improve the quality of style transfer by normalizing the feature statistics of the content image to match those of the style image.
- Output: The final stylized image.
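The statistic matching at the heart of AdaIN, `AdaIN(x, y) = sigma(y) * (x - mu(x)) / sigma(x) + mu(y)` with per-channel mean and standard deviation, can be sketched in a few lines. This is a NumPy illustration on raw feature maps, not the project's actual PyTorch implementation, which applies the operation to VGG encoder features inside an encoder/decoder:

```python
import numpy as np

def adain(content, style, eps=1e-5):
    """Adaptive Instance Normalization on feature maps shaped (C, H, W).

    Each content channel is normalized to zero mean / unit variance,
    then rescaled to the per-channel mean and std of the style features.
    """
    c_mean = content.mean(axis=(1, 2), keepdims=True)
    c_std = content.std(axis=(1, 2), keepdims=True)
    s_mean = style.mean(axis=(1, 2), keepdims=True)
    s_std = style.std(axis=(1, 2), keepdims=True)
    return s_std * (content - c_mean) / (c_std + eps) + s_mean
```

After this operation the output keeps the content's spatial structure but carries the style's channel statistics, which is what lets a single feed-forward decoder produce arbitrary-style results quickly.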
## ✨ Project Visualization

Experience the Brain Emotional Decoder directly on Hugging Face Spaces!

🚀 Live Demo: https://huggingface.co/spaces/Ihssane123/Brain_Emotion_Decoder

### Application Screenshots
## 🤝 Credits and Acknowledgements
We acknowledge and are deeply grateful for the foundational work that made this project possible.
| Component | Acknowledged Source / Contributor | Citation / Link |
|---|---|---|
| Dataset & Article | The EmoEEG-MC Research Team | OpenNeuro ds005540 Link, paper link |
| Neural Style Transfer Code | Adapted from the publicly available pytorch-AdaIN implementation | Original Implementation |
## 📖 Installation and Usage

Clone the repository:

```bash
git clone https://github.com/Ihssane5/brain-emotional-decoder.git
cd brain-emotional-decoder
```

Install dependencies:

```bash
pip install -r requirements.txt
```

Run the app:

```bash
gradio app.py
```

Explore the interface:
- Upload your EEG PSD data file
- Select a style image from famous artists
- View the emotion analysis and resulting artistic output
## 🎉 Conclusion
This project demonstrates the intersection of neuroscience, emotional computing, and computational art. By bridging brain activity patterns with creative expression, we hope to inspire further exploration of how our internal emotional states can be visualized through AI-assisted artistic representation.
For questions, contributions, or feedback, please open an issue or submit a pull request.

