---
license: mit
datasets:
- ILSVRC/imagenet-1k
- uoft-cs/cifar10
- uoft-cs/cifar100
language:
- en
metrics:
- accuracy
base_model:
- MS-ResNet
---

<div align="center">

<h1>I2E: Real-Time Image-to-Event Conversion for High-Performance Spiking Neural Networks</h1>

[![Paper](https://img.shields.io/badge/Arxiv-2511.08065-B31B1B.svg)](https://arxiv.org/abs/2511.08065)
[![AAAI 2026](https://img.shields.io/badge/AAAI%202026-Oral-4b44ce.svg)](https://aaai.org/)
[![GitHub](https://img.shields.io/badge/GitHub-Repository-black?logo=github)](https://github.com/Ruichen0424/I2E)

</div>

## 🚀 Introduction

This repository contains the **pre-trained weights** for the paper **"I2E: Real-Time Image-to-Event Conversion for High-Performance Spiking Neural Networks"**, accepted for **oral presentation at AAAI 2026**.

**I2E** is a framework that bridges the data-scarcity gap in neuromorphic computing. By simulating microsaccadic eye movements via highly parallelized convolution, I2E converts static images into high-fidelity event streams in real time, over 300× faster than prior conversion methods.

### ✨ Key Highlights
* **SOTA performance**: achieves **60.50%** top-1 accuracy on event-based ImageNet.
* **Sim-to-real transfer**: pre-training on I2E data enables **92.5%** accuracy on real-world CIFAR10-DVS, setting a new benchmark.
* **Real-time conversion**: enables on-the-fly data augmentation for deep SNN training.

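To make the conversion idea concrete, here is a toy sketch: shift a static image along a small, microsaccade-like trajectory and emit ON/OFF events wherever the log-intensity change between consecutive views crosses a contrast threshold. This illustrates only the general principle, **not** the I2E implementation (which uses highly parallelized convolution; see the GitHub repository for the real code). The function name, the shift trajectory, and the threshold value are all illustrative assumptions.

```python
import numpy as np

def toy_image_to_events(img, shifts=((0, 1), (1, 0), (0, -1), (-1, 0)),
                        threshold=0.2):
    """Toy image-to-event sketch (NOT the I2E algorithm).

    Simulates microsaccade-like shifts of a static image and emits
    ON (+1) / OFF (-1) events where the log-intensity difference
    between consecutive views exceeds a contrast threshold.
    Returns an array of shape (T, H, W) with values in {-1, 0, +1}.
    """
    log_img = np.log1p(img.astype(np.float32))  # log intensity, as in event-camera models
    prev = log_img
    frames = []
    for dy, dx in shifts:
        cur = np.roll(log_img, shift=(dy, dx), axis=(0, 1))  # shifted "view"
        diff = cur - prev
        on = diff > threshold
        off = diff < -threshold
        frames.append(on.astype(np.int8) - off.astype(np.int8))
        prev = cur
    return np.stack(frames)
```

A real converter would additionally model event timing and sensor noise; this sketch only shows where the polarity events come from.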
## 🏆 Model Zoo & Results

We provide pre-trained models for **I2E-CIFAR** and **I2E-ImageNet**. You can download the `.pth` files directly from the **Files and versions** tab of this repository.

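A downloaded checkpoint can then be loaded as a plain PyTorch state dict. In this minimal sketch the file name `ms_resnet18.pth` is a placeholder (check the actual names under **Files and versions**), and a dummy state dict stands in for a real download; the MS-ResNet model classes themselves live in the GitHub repository.

```python
import torch

# Stand-in for a downloaded checkpoint: here we save a dummy state dict
# under a hypothetical file name. In practice, download the real .pth
# from the "Files and versions" tab instead.
dummy = {"conv1.weight": torch.zeros(64, 3, 7, 7)}
torch.save(dummy, "ms_resnet18.pth")

# Checkpoints load as ordinary state dicts. Instantiate the matching
# MS-ResNet architecture from https://github.com/Ruichen0424/I2E and
# call model.load_state_dict(state_dict) on the result.
state_dict = torch.load("ms_resnet18.pth", map_location="cpu")
```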
<table>
  <tr>
    <th>Target Dataset</th>
    <th align="center">Architecture</th>
    <th align="center">Method</th>
    <th align="center">Top-1 Acc</th>
  </tr>
  <!-- CIFAR10-DVS -->
  <tr>
    <td rowspan="3" align="center" style="vertical-align: middle;"><strong>CIFAR10-DVS</strong><br>(Real)</td>
    <td align="center">MS-ResNet18</td>
    <td align="center">Baseline</td>
    <td align="center">65.6%</td>
  </tr>
  <tr>
    <td align="center">MS-ResNet18</td>
    <td align="center">Transfer-I</td>
    <td align="center">83.1%</td>
  </tr>
  <tr>
    <td align="center">MS-ResNet18</td>
    <td align="center">Transfer-II (Sim-to-Real)</td>
    <td align="center"><strong>92.5%</strong></td>
  </tr>
  <!-- I2E-CIFAR10 -->
  <tr>
    <td rowspan="3" align="center" style="vertical-align: middle;"><strong>I2E-CIFAR10</strong></td>
    <td align="center">MS-ResNet18</td>
    <td align="center">Baseline-I</td>
    <td align="center">85.07%</td>
  </tr>
  <tr>
    <td align="center">MS-ResNet18</td>
    <td align="center">Baseline-II</td>
    <td align="center">89.23%</td>
  </tr>
  <tr>
    <td align="center">MS-ResNet18</td>
    <td align="center">Transfer-I</td>
    <td align="center"><strong>90.86%</strong></td>
  </tr>
  <!-- I2E-CIFAR100 -->
  <tr>
    <td rowspan="3" align="center" style="vertical-align: middle;"><strong>I2E-CIFAR100</strong></td>
    <td align="center">MS-ResNet18</td>
    <td align="center">Baseline-I</td>
    <td align="center">51.32%</td>
  </tr>
  <tr>
    <td align="center">MS-ResNet18</td>
    <td align="center">Baseline-II</td>
    <td align="center">60.68%</td>
  </tr>
  <tr>
    <td align="center">MS-ResNet18</td>
    <td align="center">Transfer-I</td>
    <td align="center"><strong>64.53%</strong></td>
  </tr>
  <!-- I2E-ImageNet -->
  <tr>
    <td rowspan="4" align="center" style="vertical-align: middle;"><strong>I2E-ImageNet</strong></td>
    <td align="center">MS-ResNet18</td>
    <td align="center">Baseline-I</td>
    <td align="center">48.30%</td>
  </tr>
  <tr>
    <td align="center">MS-ResNet18</td>
    <td align="center">Baseline-II</td>
    <td align="center">57.97%</td>
  </tr>
  <tr>
    <td align="center">MS-ResNet18</td>
    <td align="center">Transfer-I</td>
    <td align="center">59.28%</td>
  </tr>
  <tr>
    <td align="center">MS-ResNet34</td>
    <td align="center">Baseline-II</td>
    <td align="center"><strong>60.50%</strong></td>
  </tr>
</table>

> **Method Legend:**
> * **Baseline-I**: Training from scratch with minimal augmentation.
> * **Baseline-II**: Training from scratch with full augmentation.
> * **Transfer-I**: Fine-tuning from static ImageNet (or I2E-ImageNet for CIFAR targets).
> * **Transfer-II**: Fine-tuning from I2E-CIFAR10.

## 👁️ Visualization

Below is a visualization of the I2E conversion process, illustrating the high-fidelity conversion from static RGB images to dynamic event streams.

<div align="center">
  <img src="./assets/original_1.jpg" width="22%" alt="Original RGB Image 1">
  <img src="./assets/converted_1.gif" width="22%" alt="Converted Event Stream 1">
  <img src="./assets/original_2.jpg" width="22%" alt="Original RGB Image 2">
  <img src="./assets/converted_2.gif" width="22%" alt="Converted Event Stream 2">
</div>
<div align="center">
  <img src="./assets/original_3.jpg" width="22%" alt="Original RGB Image 3">
  <img src="./assets/converted_3.gif" width="22%" alt="Converted Event Stream 3">
  <img src="./assets/original_4.jpg" width="22%" alt="Original RGB Image 4">
  <img src="./assets/converted_4.gif" width="22%" alt="Converted Event Stream 4">
</div>

## 💻 Usage & Datasets

This repository hosts the **model weights only**.

For the **I2E dataset generation code**, **training scripts**, and detailed usage instructions, please refer to our official GitHub repository:

👉 **[GitHub: Ruichen0424/I2E](https://github.com/Ruichen0424/I2E)**

To generate the datasets (I2E-CIFAR10, I2E-CIFAR100, I2E-ImageNet) yourself using the I2E algorithm, please follow the instructions in the GitHub README.

## 📜 Citation

If you find this work or the models useful, please cite our AAAI 2026 paper:

```bibtex
@article{ma2025i2e,
  title={I2E: Real-Time Image-to-Event Conversion for High-Performance Spiking Neural Networks},
  author={Ma, Ruichen and Meng, Liwei and Qiao, Guanchao and Ning, Ning and Liu, Yang and Hu, Shaogang},
  journal={arXiv preprint arXiv:2511.08065},
  year={2025}
}
```