---
license: mit
datasets:
- uoft-cs/cifar10
- uoft-cs/cifar100
- ILSVRC/imagenet-1k
language:
- en
metrics:
- accuracy
base_model:
- ReActNet
---

<div align="center">

<h1>A&B BNN: Add&Bit-Operation-Only Hardware-Friendly Binary Neural Network</h1>

[![Paper](https://img.shields.io/badge/Arxiv-2403.03739-B31B1B.svg?style=flat-square)](https://arxiv.org/abs/2403.03739)
[![CVPR 2024](https://img.shields.io/badge/CVPR%202024-Poster-4b44ce.svg?style=flat-square)](https://cvpr.thecvf.com/virtual/2024/poster/29447)
[![Google Scholar](https://img.shields.io/badge/Google%20Scholar-Paper-4285F4?style=flat-square&logo=google-scholar&logoColor=white)](https://scholar.google.com/scholar?cluster=9219398500921383941)
[![IEEE](https://img.shields.io/badge/IEEE-Paper-00629B?style=flat-square&logo=ieee&logoColor=white)](https://xploreqa.ieee.org/document/10656026)
[![Hugging Face](https://img.shields.io/badge/Hugging%20Face-Paper-FFD21E?style=flat-square&logo=huggingface&logoColor=black)](https://huggingface.co/papers/2403.03739)

[![GitHub](https://img.shields.io/badge/GitHub-Repository-black?logo=github)](https://github.com/Ruichen0424/AB-BNN)
[![YouTube](https://img.shields.io/badge/YouTube-Video-FF0000?style=flat-square&logo=youtube&logoColor=white)](https://youtu.be/L8cWTetcU2M?si=V_fH1YXVKhlaEdf4)
[![Bilibili](https://img.shields.io/badge/Bilibili-Video-FE7398?style=flat-square&logo=bilibili&logoColor=white)](https://www.bilibili.com/video/BV1PM4m1S7T1)

</div>

## 🚀 Introduction

This repository contains the **pre-trained weights** for the paper **"A&B BNN: Add&Bit-Operation-Only Hardware-Friendly Binary Neural Network"**, published in **CVPR 2024**.

**A&B BNN** directly removes part of the multiplication operations in a traditional BNN and replaces the remaining ones with an equal number of bit operations. To achieve this, it introduces a mask layer and a quantized RPReLU structure built on the normalizer-free network architecture.

![Poster](./assets/poster.png)

### ✨ Key Highlights
* **Hardware-Friendly**: Removes multiplication operations, replacing them with bit operations.
* **Competitive Performance**: Achieves **92.30%**, **69.35%**, and **66.89%** top-1 accuracy on CIFAR-10, CIFAR-100, and ImageNet, respectively.
* **Innovative Structures**: Introduces a mask layer and a quantized RPReLU (see the sketch below).
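
To make the quantized RPReLU concrete, here is a minimal PyTorch sketch. This is an illustrative reconstruction, not the authors' code: the parameter names (`gamma`, `zeta`, `log2_beta`) and the straight-through rounding are assumptions. The key point is that constraining the negative slope to a power of two lets hardware realize the multiplication as a bit shift.

```python
import torch
import torch.nn as nn

class QuantizedRPReLU(nn.Module):
    """Sketch of an RPReLU whose negative slope is a power of two.

    RPReLU (from ReActNet) shifts the input by a learnable gamma,
    applies a per-channel slope beta on the negative side, and shifts
    the output by a learnable zeta. Rounding log2(beta) to an integer
    makes the slope an exact power of two, so the multiplication can
    be implemented as a bit shift in hardware.
    """

    def __init__(self, channels: int):
        super().__init__()
        self.gamma = nn.Parameter(torch.zeros(1, channels, 1, 1))      # input shift
        self.zeta = nn.Parameter(torch.zeros(1, channels, 1, 1))       # output shift
        self.log2_beta = nn.Parameter(torch.zeros(1, channels, 1, 1))  # slope exponent

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Straight-through estimator: the forward pass uses the rounded
        # exponent, while the backward pass treats rounding as identity
        # so log2_beta remains trainable.
        k = self.log2_beta + (torch.round(self.log2_beta) - self.log2_beta).detach()
        beta = torch.pow(2.0, k)  # multiplying by 2**k == shifting by k bits
        shifted = x - self.gamma
        return torch.where(shifted > 0, shifted, beta * shifted) + self.zeta
```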

## ๐Ÿ† Model Zoo & Results

We provide pre-trained models for **CIFAR-10**, **CIFAR-100**, and **ImageNet**. You can download the `.h5` files directly from the [**Files and versions**](https://huggingface.co/Ruichen0424/AB-BNN/tree/main/models) tab in this repository.
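
If you prefer a programmatic download, the weights can also be fetched with `huggingface_hub`. A minimal sketch follows; the checkpoint filename is a placeholder, so check the `models/` folder of this repository for the actual file names.

```python
from huggingface_hub import hf_hub_download

# Placeholder filename: replace it with one of the actual checkpoint
# names listed under models/ in this repository.
ckpt_path = hf_hub_download(
    repo_id="Ruichen0424/AB-BNN",
    filename="models/<checkpoint-name>.h5",  # hypothetical path
)
print(ckpt_path)  # local cache path of the downloaded file
```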

<table border="1">
    <tr>
        <th>Dataset</th>
        <th align="center">Structure</th>
        <th align="center"># Params</th>
        <th align="center">Top-1 Acc</th>
    </tr>
    <!-- CIFAR10 -->
    <tr>
        <td rowspan="2" align="center" style="vertical-align: middle;"><strong>CIFAR10</strong></td>
        <td align="center" style="vertical-align: middle;">ReActNet-18</td>
        <td align="center" style="vertical-align: middle;">11.18 M</td>
        <td align="center" style="vertical-align: middle;">91.94%</td>
    </tr>
    <tr>
        <td align="center" style="vertical-align: middle;">ReActNet-A</td>
        <td align="center" style="vertical-align: middle;">28.32 M</td>
        <td align="center" style="vertical-align: middle;">89.44%</td>
    </tr>
    <!-- CIFAR100 -->
    <tr>
        <td rowspan="2" align="center" style="vertical-align: middle;"><strong>CIFAR100</strong></td>
        <td align="center" style="vertical-align: middle;">ReActNet-18</td>
        <td align="center" style="vertical-align: middle;">11.23 M</td>
        <td align="center" style="vertical-align: middle;">69.35%</td>
    </tr>
    <tr>
        <td align="center" style="vertical-align: middle;">ReActNet-A</td>
        <td align="center" style="vertical-align: middle;">28.41 M</td>
        <td align="center" style="vertical-align: middle;">63.23%</td>
    </tr>
    <!-- ImageNet -->
    <tr>
        <td rowspan="3" align="center" style="vertical-align: middle;"><strong>ImageNet</strong></td>
        <td align="center" style="vertical-align: middle;">ReActNet-18</td>
        <td align="center" style="vertical-align: middle;">11.70 M</td>
        <td align="center" style="vertical-align: middle;">61.39%</td>
    </tr>
    <tr>
        <td align="center" style="vertical-align: middle;">ReActNet-34</td>
        <td align="center" style="vertical-align: middle;">21.82 M</td>
        <td align="center" style="vertical-align: middle;">65.19%</td>
    </tr>
    <tr>
        <td align="center" style="vertical-align: middle;">ReActNet-A</td>
        <td align="center" style="vertical-align: middle;">29.33 M</td>
        <td align="center" style="vertical-align: middle;">66.89%</td>
    </tr>
</table>

## 💻 Usage

This repository hosts the **model weights only**.

For the **training scripts**, **inference code**, and detailed usage instructions, please refer to our official GitHub repository.

[![GitHub](https://img.shields.io/badge/GitHub-Repository-black?logo=github)](https://github.com/Ruichen0424/AB-BNN)

## 📜 Citation

If you find our code useful for your research, please consider citing:

```bibtex
@inproceedings{ma2024b,
  title={A\&B BNN: Add\&Bit-Operation-Only Hardware-Friendly Binary Neural Network},
  author={Ma, Ruichen and Qiao, Guanchao and Liu, Yian and Meng, Liwei and Ning, Ning and Liu, Yang and Hu, Shaogang},
  booktitle={2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  pages={5704--5713},
  year={2024},
  organization={IEEE}
}
```