---
license: apache-2.0
library_name: pytorch
pipeline_tag: other
---
### These are the pretrained weights of CBraMod.
# CBraMod: A Criss-Cross Brain Foundation Model for EEG Decoding
[arXiv](https://arxiv.org/abs/2412.07236)
[OpenReview](https://openreview.net/forum?id=NPNUHgHF2w)
[Hugging Face](https://huggingface.co/weighting666/CBraMod)

<div align="center">
<img src="figure/CBraMod_logo.png" style="width: 15%;" />
</div>
<p align="center">
📖 <a href="#-about">About</a>
| 🔨 <a href="#-setup">Setup</a>
| 🚢 <a href="#-how-to-pretrain">How to Pretrain</a>
| ⛵ <a href="#-how-to-finetune">How to Finetune</a>
| 🚀 <a href="#-quick-start">Quick Start</a>
| 📄 <a href="#-citation">Citation</a>
</p>
🔥 NEWS: The paper "_CBraMod: A Criss-Cross Brain Foundation Model for EEG Decoding_" has been accepted by ICLR 2025!
## 📖 About
We propose **CBraMod**, a novel EEG foundation model for EEG decoding across various clinical and BCI applications.
The preprint version of our paper is available at [arXiv](https://arxiv.org/abs/2412.07236).
The camera-ready version of the paper will be available at [OpenReview](https://openreview.net/forum?id=NPNUHgHF2w).
<div align="center">
<img src="figure/model.png" style="width:100%;" />
</div>
## 🔨 Setup
Install [Python](https://www.python.org/downloads/).
Install [PyTorch](https://pytorch.org/get-started/locally/).
Install other requirements:
```commandline
pip install -r requirements.txt
```
## 🚢 How to Pretrain
You can pretrain CBraMod on our pretraining dataset or your own custom pretraining dataset using the following command:
```commandline
python pretrain_main.py
```
We have released a pretrained checkpoint on [Hugging Face 🤗](https://huggingface.co/weighting666/CBraMod).
## ⛵ How to Finetune
You can finetune CBraMod on our selected downstream datasets using the following command:
```commandline
python finetune_main.py
```
## 🚀 Quick Start
You can fine-tune the pretrained CBraMod on your custom downstream dataset using the following example code:
```python
import torch
import torch.nn as nn
from einops.layers.torch import Rearrange

from models.cbramod import CBraMod

device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")

# Load the pretrained backbone and drop its pretraining projection head.
model = CBraMod().to(device)
model.load_state_dict(torch.load('pretrained_weights/pretrained_weights.pth', map_location=device))
model.proj_out = nn.Identity()

# Classification head: flatten (channels, segments, points) and map to 4 classes.
classifier = nn.Sequential(
    Rearrange('b c s p -> b (c s p)'),
    nn.Linear(22 * 4 * 200, 4 * 200),
    nn.ELU(),
    nn.Dropout(0.1),
    nn.Linear(4 * 200, 200),
    nn.ELU(),
    nn.Dropout(0.1),
    nn.Linear(200, 4),
).to(device)

# mock_eeg.shape = (batch_size, num_of_channels, time_segments, points_per_patch)
mock_eeg = torch.randn((8, 22, 4, 200)).to(device)

# logits.shape = (batch_size, num_of_classes)
logits = classifier(model(mock_eeg))
```
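The layer sizes in the classification head follow directly from the mock input shape `(8, 22, 4, 200)`; a quick sanity check of the arithmetic:

```python
# Shape arithmetic behind the classifier head in the example above.
channels, segments, points = 22, 4, 200    # input patch layout (c, s, p)
flat_dim = channels * segments * points    # after Rearrange 'b c s p -> b (c s p)'
hidden1 = 4 * 200                          # first Linear output width
hidden2 = 200                              # second Linear output width
num_classes = 4                            # final logits per sample
print(flat_dim, hidden1, hidden2, num_classes)  # 17600 800 200 4
```

Adjust `channels`, `segments`, and `points` (and the first `nn.Linear`) to match your own dataset's patching.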
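For a full fine-tuning step on top of the example above, here is a minimal sketch. It is self-contained: the CBraMod backbone is stubbed with `nn.Identity()` (substitute the loaded model in practice), `nn.Flatten` stands in for einops' `Rearrange('b c s p -> b (c s p)')`, and the optimizer and loss choices (`AdamW`, cross-entropy, `lr=1e-4`) are illustrative assumptions, not the repository's actual training recipe.

```python
import torch
import torch.nn as nn

backbone = nn.Identity()  # stand-in for the loaded CBraMod model
classifier = nn.Sequential(
    nn.Flatten(start_dim=1),            # (b, 22, 4, 200) -> (b, 17600)
    nn.Linear(22 * 4 * 200, 4 * 200),
    nn.ELU(),
    nn.Dropout(0.1),
    nn.Linear(4 * 200, 200),
    nn.ELU(),
    nn.Dropout(0.1),
    nn.Linear(200, 4),
)

optimizer = torch.optim.AdamW(classifier.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

# One training step on mock data (replace with batches from a real DataLoader).
x = torch.randn(8, 22, 4, 200)          # mock EEG batch
y = torch.randint(0, 4, (8,))           # mock class labels
logits = classifier(backbone(x))        # (8, 4)
loss = criterion(logits, y)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

When fine-tuning the real model, pass both `model.parameters()` and `classifier.parameters()` to the optimizer (or freeze the backbone if you only want to train the head).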
## 📄 Citation
If you're using this repository in your research or applications, please cite using the following BibTeX:
```bibtex
@inproceedings{wang2025cbramod,
  title={{CB}raMod: A Criss-Cross Brain Foundation Model for {EEG} Decoding},
  author={Jiquan Wang and Sha Zhao and Zhiling Luo and Yangxuan Zhou and Haiteng Jiang and Shijian Li and Tao Li and Gang Pan},
  booktitle={The Thirteenth International Conference on Learning Representations},
  year={2025},
  url={https://openreview.net/forum?id=NPNUHgHF2w}
}
```
## ⭐ Star History
<div align="center">
<a href="https://star-history.com/#wjq-learning/CBraMod&Date">
<img src="https://api.star-history.com/svg?repos=wjq-learning/CBraMod&type=Date" style="width: 80%;" />
</a>
</div>
Code: https://github.com/wjq-learning/CBraMod