---
license: mit
---
<div align="center">
<h1>I2E: Real-Time Image-to-Event Conversion for High-Performance Spiking Neural Networks</h1>
[![Paper](https://img.shields.io/badge/Arxiv-2511.08065-B31B1B.svg)](https://arxiv.org/abs/2511.08065)
[![AAAI 2026](https://img.shields.io/badge/AAAI%202026-Oral-4b44ce.svg)](https://aaai.org/)
[![Google Scholar](https://img.shields.io/badge/Google%20Scholar-Paper-4285F4?style=flat-square&logo=google-scholar&logoColor=white)](https://scholar.google.com/scholar?cluster=1814482600796011970)
[![GitHub](https://img.shields.io/badge/GitHub-Repository-black?logo=github)](https://github.com/Ruichen0424/I2E)
[![Hugging Face](https://img.shields.io/badge/Hugging%20Face-Paper-FFD21E?style=flat-square&logo=huggingface&logoColor=black)](https://huggingface.co/papers/2511.08065)
[![Hugging Face](https://img.shields.io/badge/Hugging%20Face-Models-FFD21E?style=flat-square&logo=huggingface&logoColor=black)](https://huggingface.co/Ruichen0424/I2E)
</div>
## 🚀 Introduction
This repository contains the **I2E-Datasets** for the paper **"I2E: Real-Time Image-to-Event Conversion for High-Performance Spiking Neural Networks"**, which has been accepted for **Oral Presentation at AAAI 2026**.
**I2E** is a pioneering framework that bridges the data scarcity gap in neuromorphic computing. By simulating microsaccadic eye movements via highly parallelized convolution, I2E converts static images into high-fidelity event streams in real time (>300x faster than prior methods).
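For intuition only, below is a minimal toy sketch of this kind of conversion: jitter a static image to mimic microsaccades, then threshold log-intensity changes into ON/OFF event frames. The function, its parameters, and the thresholding scheme are our illustrative assumptions, not the paper's actual algorithm; the real implementation lives in the GitHub repository linked above.

```python
import numpy as np

def i2e_toy(image, n_steps=8, max_shift=2, threshold=0.2, seed=0):
    """Toy image-to-event conversion (NOT the paper's algorithm).

    Jitters a grayscale image to mimic microsaccades and thresholds
    log-intensity changes into ON/OFF events, DVS-style.
    image: 2D array in [0, 1]; returns (n_steps, 2, H, W) binary frames.
    """
    rng = np.random.default_rng(seed)
    log_img = np.log(image + 1e-4)                 # log intensity, as in a DVS pixel model
    prev = log_img
    events = np.zeros((n_steps, 2) + image.shape, dtype=np.float32)
    for t in range(n_steps):
        dy, dx = rng.integers(-max_shift, max_shift + 1, size=2)
        cur = np.roll(log_img, (dy, dx), axis=(0, 1))   # microsaccadic shift
        diff = cur - prev
        events[t, 0] = diff > threshold            # ON channel: brightness increase
        events[t, 1] = diff < -threshold           # OFF channel: brightness decrease
        prev = cur
    return events

frames = i2e_toy(np.random.rand(32, 32))           # e.g. a 32x32 grayscale image
print(frames.shape, frames.mean())                 # (8, 2, 32, 32), event density
```

A faithful DVS-style simulator would also model per-pixel thresholds, refractory periods, and sensor noise; the point of I2E is to make this conversion fast enough, via parallelized convolution, for on-the-fly augmentation during training.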
### ✨ Key Highlights
* **SOTA Performance**: Achieves **60.50%** top-1 accuracy on Event-based ImageNet.
* **Sim-to-Real Transfer**: Pre-training on I2E data enables **92.5%** accuracy on real-world CIFAR10-DVS, setting a new state of the art.
* **Real-Time Conversion**: Enables on-the-fly data augmentation for deep SNN training.
## 🏆 Model Zoo & Results
We provide pre-trained models for **I2E-CIFAR** and **I2E-ImageNet**. You can download the `.pth` files directly from the [**Files and versions**](https://huggingface.co/Ruichen0424/I2E/tree/main) tab of the model repository.
[![Hugging Face](https://img.shields.io/badge/Hugging%20Face-Models-FFD21E?style=flat-square&logo=huggingface&logoColor=black)](https://huggingface.co/Ruichen0424/I2E)
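As a quick-start sketch, a checkpoint can be fetched with `huggingface_hub` and inspected with PyTorch. The filename below is a placeholder (substitute an actual `.pth` name from the Files and versions tab), and whether the file holds a bare `state_dict` or a wrapper dict is an assumption you should verify before loading it into an MS-ResNet model:

```python
import torch
from huggingface_hub import hf_hub_download

# Placeholder filename: substitute a real .pth listed under "Files and versions".
ckpt_path = hf_hub_download(repo_id="Ruichen0424/I2E", filename="MS-ResNet18.pth")

# Assumption: the file stores a state_dict (or a dict containing one).
# Inspect the keys first, then load into your MS-ResNet via load_state_dict.
state = torch.load(ckpt_path, map_location="cpu")
keys = list(state.keys()) if isinstance(state, dict) else []
print(type(state).__name__, keys[:5])
```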
<table border="1">
<tr>
<th>Target Dataset</th>
<th align="center">Architecture</th>
<th align="center">Method</th>
<th align="center">Top-1 Acc</th>
</tr>
<!-- CIFAR10-DVS -->
<tr>
<td rowspan="3" align="center" style="vertical-align: middle;"><strong>CIFAR10-DVS</strong><br>(Real)</td>
<td align="center" style="vertical-align: middle;">MS-ResNet18</td>
<td align="center" style="vertical-align: middle;">Baseline</td>
<td align="center" style="vertical-align: middle;">65.6%</td>
</tr>
<tr>
<td align="center" style="vertical-align: middle;">MS-ResNet18</td>
<td align="center" style="vertical-align: middle;">Transfer-I</td>
<td align="center" style="vertical-align: middle;">83.1%</td>
</tr>
<tr>
<td align="center" style="vertical-align: middle;">MS-ResNet18</td>
<td align="center" style="vertical-align: middle;">Transfer-II (Sim-to-Real)</td>
<td align="center" style="vertical-align: middle;"><strong>92.5%</strong></td>
</tr>
<!-- I2E-CIFAR10 -->
<tr>
<td rowspan="3" align="center" style="vertical-align: middle;"><strong>I2E-CIFAR10</strong></td>
<td align="center" style="vertical-align: middle;">MS-ResNet18</td>
<td align="center" style="vertical-align: middle;">Baseline-I</td>
<td align="center" style="vertical-align: middle;">85.07%</td>
</tr>
<tr>
<td align="center" style="vertical-align: middle;">MS-ResNet18</td>
<td align="center" style="vertical-align: middle;">Baseline-II</td>
<td align="center" style="vertical-align: middle;">89.23%</td>
</tr>
<tr>
<td align="center" style="vertical-align: middle;">MS-ResNet18</td>
<td align="center" style="vertical-align: middle;">Transfer-I</td>
<td align="center" style="vertical-align: middle;"><strong>90.86%</strong></td>
</tr>
<!-- I2E-CIFAR100 -->
<tr>
<td rowspan="3" align="center" style="vertical-align: middle;"><strong>I2E-CIFAR100</strong></td>
<td align="center" style="vertical-align: middle;">MS-ResNet18</td>
<td align="center" style="vertical-align: middle;">Baseline-I</td>
<td align="center" style="vertical-align: middle;">51.32%</td>
</tr>
<tr>
<td align="center" style="vertical-align: middle;">MS-ResNet18</td>
<td align="center" style="vertical-align: middle;">Baseline-II</td>
<td align="center" style="vertical-align: middle;">60.68%</td>
</tr>
<tr>
<td align="center" style="vertical-align: middle;">MS-ResNet18</td>
<td align="center" style="vertical-align: middle;">Transfer-I</td>
<td align="center" style="vertical-align: middle;"><strong>64.53%</strong></td>
</tr>
<!-- I2E-ImageNet -->
<tr>
<td rowspan="4" align="center" style="vertical-align: middle;"><strong>I2E-ImageNet</strong></td>
<td align="center" style="vertical-align: middle;">MS-ResNet18</td>
<td align="center" style="vertical-align: middle;">Baseline-I</td>
<td align="center" style="vertical-align: middle;">48.30%</td>
</tr>
<tr>
<td align="center" style="vertical-align: middle;">MS-ResNet18</td>
<td align="center" style="vertical-align: middle;">Baseline-II</td>
<td align="center" style="vertical-align: middle;">57.97%</td>
</tr>
<tr>
<td align="center" style="vertical-align: middle;">MS-ResNet18</td>
<td align="center" style="vertical-align: middle;">Transfer-I</td>
<td align="center" style="vertical-align: middle;">59.28%</td>
</tr>
<tr>
<td align="center" style="vertical-align: middle;">MS-ResNet34</td>
<td align="center" style="vertical-align: middle;">Baseline-II</td>
<td align="center" style="vertical-align: middle;"><strong>60.50%</strong></td>
</tr>
</table>
> **Method Legend:**
> * **Baseline-I**: Training from scratch with minimal augmentation.
> * **Baseline-II**: Training from scratch with full augmentation.
> * **Transfer-I**: Fine-tuning from Static ImageNet (or I2E-ImageNet for CIFAR targets).
> * **Transfer-II**: Fine-tuning from I2E-CIFAR10.
## 👁️ Visualization
Below is a visualization of the I2E conversion process, illustrating the high-fidelity conversion from static RGB images to dynamic event streams.
More than 200 additional visualization comparisons can be found in [Visualization.md](./Visualization.md).
<table border="0" style="width: 100%">
<tr>
<td width="25%" align="center"><img src="./assets/original_1.jpg" alt="Original 1" style="width:100%"></td>
<td width="25%" align="center"><img src="./assets/converted_1.gif" alt="Converted 1" style="width:100%"></td>
<td width="25%" align="center"><img src="./assets/original_2.jpg" alt="Original 2" style="width:100%"></td>
<td width="25%" align="center"><img src="./assets/converted_2.gif" alt="Converted 2" style="width:100%"></td>
</tr>
<tr>
<td width="25%" align="center"><img src="./assets/original_3.jpg" alt="Original 3" style="width:100%"></td>
<td width="25%" align="center"><img src="./assets/converted_3.gif" alt="Converted 3" style="width:100%"></td>
<td width="25%" align="center"><img src="./assets/original_4.jpg" alt="Original 4" style="width:100%"></td>
<td width="25%" align="center"><img src="./assets/converted_4.gif" alt="Converted 4" style="width:100%"></td>
</tr>
</table>
## 💻 Usage
The generated I2E datasets, including I2E-CIFAR10, I2E-CIFAR100, and I2E-ImageNet, are provided under the [**Files and versions**](https://huggingface.co/datasets/UESTC-BICS/I2E/tree/main) tab.
I2E-ImageNet is distributed as split archive parts; use the following command to concatenate them into a single zip file:
```bash
cat ./I2E-ImageNet_split.part_* > ./I2E-ImageNet.zip
```
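If `cat` is unavailable (e.g. on Windows), the same byte-level concatenation can be done in Python; this assumes the part files match the naming pattern used above:

```python
import glob

# Concatenate the split parts, in sorted order, into one zip archive.
with open("I2E-ImageNet.zip", "wb") as out:
    for part in sorted(glob.glob("I2E-ImageNet_split.part_*")):
        with open(part, "rb") as src:
            while chunk := src.read(1 << 20):   # stream in 1 MiB chunks
                out.write(chunk)
```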
We also provide MD5 checksums so you can verify the downloaded archives:
```bash
md5sum -c md5.txt
```
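Where `md5sum` is not installed, a short Python equivalent works too, assuming `md5.txt` follows the conventional `md5sum` listing format (`<hash>  <filename>` per line):

```python
import hashlib

def md5_of(path, chunk_size=1 << 20):
    """Stream a file through MD5 without loading it all into memory."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        while block := f.read(chunk_size):
            h.update(block)
    return h.hexdigest()

with open("md5.txt") as listing:
    for line in listing:
        expected, name = line.split(maxsplit=1)
        name = name.strip().lstrip("*")  # '*' prefix marks binary mode in md5sum listings
        print(name, "OK" if md5_of(name) == expected else "FAILED")
```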
This repository hosts the **datasets only**. For the **I2E dataset generation code**, **training scripts**, and detailed usage instructions (including how to regenerate I2E-CIFAR10, I2E-CIFAR100, and I2E-ImageNet yourself with the I2E algorithm), please refer to our official GitHub repository.
[![GitHub](https://img.shields.io/badge/GitHub-Repository-black?logo=github)](https://github.com/Ruichen0424/I2E)
## 📜 Citation
If you find this work or the models useful, please cite our AAAI 2026 paper:
```bibtex
@article{ma2025i2e,
title={I2E: Real-Time Image-to-Event Conversion for High-Performance Spiking Neural Networks},
author={Ma, Ruichen and Meng, Liwei and Qiao, Guanchao and Ning, Ning and Liu, Yang and Hu, Shaogang},
journal={arXiv preprint arXiv:2511.08065},
year={2025}
}
```