---
license: apache-2.0
pipeline_tag: image-classification
---
# PeleeNet

## **Use case**: `Image classification`

# Model description

PeleeNet is a lightweight convolutional neural network designed for **efficient real-time inference on mobile and edge devices**. It builds on dense connectivity principles, similar to DenseNet, but is optimized to significantly reduce computational cost while maintaining strong accuracy.

Unlike MobileNet, PeleeNet does **not rely on width multipliers or depthwise separable convolutions**. Instead, it uses **two-way dense blocks**, bottleneck layers, and an efficient stem block to reduce memory access cost and improve speed on practical hardware.

The original paper demonstrates that PeleeNet achieves accuracy competitive with MobileNet while requiring **fewer parameters and lower latency**, especially on real devices.

(source: https://arxiv.org/abs/1804.06882)

The model is quantized to **int8** using **ONNX Runtime** and exported for efficient deployment.
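Int8 quantization maps float32 tensors to 8-bit integers using a scale and a zero point chosen during calibration. The NumPy sketch below illustrates the underlying affine scheme only; it is not the actual ONNX Runtime calibration pipeline, and the scale and zero point here are picked by hand for a symmetric [-1, 1] range.

```python
import numpy as np

def quantize_int8(x, scale, zero_point):
    # Affine quantization: q = round(x / scale) + zero_point, clamped to int8 range.
    q = np.round(x / scale) + zero_point
    return np.clip(q, -128, 127).astype(np.int8)

def dequantize_int8(q, scale, zero_point):
    # Approximate recovery of the original float32 values.
    return (q.astype(np.float32) - zero_point) * scale

# In a real calibration run, scale / zero_point come from observed activation
# ranges; these values are illustrative.
x = np.array([-1.0, 0.0, 0.5, 1.0], dtype=np.float32)
scale, zero_point = 1.0 / 127, 0
q = quantize_int8(x, scale, zero_point)
x_hat = dequantize_int8(q, scale, zero_point)
```

The round trip introduces an error of at most one quantization step (the scale), which is why a well-calibrated int8 model loses only a fraction of a percent of accuracy, as the table below shows.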
## Network information

| Network Information | Value |
|---------------------|-------|
| Framework           | Torch |
| Params              | ~2.69 M |
| Quantization        | Int8 |
| Provenance          | https://github.com/Robert-JunWang/PeleeNet |
| Paper               | https://arxiv.org/abs/1804.06882 |

## Network inputs / outputs

For an image resolution of N×M and P classes:

| Input Shape | Description |
| ----- | ----------- |
| (1, N, M, 3) | Single N×M RGB image with UINT8 values between 0 and 255 |

| Output Shape | Description |
| ----- | ----------- |
| (1, P) | Per-class confidence for P classes in FLOAT32 |

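The tensor layouts above can be exercised with plain NumPy; in the sketch below, random data stands in for a real image and for the output of an ONNX Runtime session, so only the documented shapes and dtypes are meaningful.

```python
import numpy as np

# Dummy uint8 RGB input matching the documented (1, N, M, 3) layout,
# with N = M = 224 as used by peleenet_pt_224.
N = M = 224
image = np.random.randint(0, 256, size=(1, N, M, 3), dtype=np.uint8)

# The model returns per-class confidences of shape (1, P); random float32
# values stand in for a real inference result here, with P = 1000 classes.
P = 1000
confidences = np.random.rand(1, P).astype(np.float32)

top1 = int(np.argmax(confidences, axis=1)[0])   # predicted class index
top5 = np.argsort(confidences[0])[::-1][:5]     # five most confident classes
```

With the actual `.onnx` file downloaded, the `confidences` array would come from an `onnxruntime.InferenceSession` run on `image` instead of `np.random.rand`.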
## Recommended platforms

| Platform | Supported | Recommended |
|----------|-----------|-------------|
| STM32L0  | []        | []          |
| STM32L4  | []        | []          |
| STM32U5  | []        | []          |
| STM32H7  | []        | []          |
| STM32MP1 | []        | []          |
| STM32MP2 | []        | []          |
| STM32N6  | [x]       | [x]         |

# Performances

## Metrics

- Measurements are made with the default STEdgeAI configuration, with the input/output allocated option enabled.
- All models are trained from scratch on the ImageNet dataset.

### Reference **NPU** memory footprint on ImageNet dataset (see Accuracy for details on dataset)

| Model | Dataset | Format | Resolution | Series | Internal RAM (KiB) | External RAM (KiB) | Weights Flash (KiB) | STEdgeAI Core version |
|-------|---------|--------|------------|--------|--------------------|--------------------|---------------------|-----------------------|
| [peleenet_pt_224](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/peleenet_pt/Public_pretrainedmodel_public_dataset/Imagenet/peleenet_pt_224/peleenet_pt_224_qdq_int8.onnx) | ImageNet | Int8 | 224×224×3 | STM32N6 | 1421 | 0 | 2751.06 | 3.0.0 |

### Reference **NPU** inference time on ImageNet dataset (see Accuracy for details on dataset)

| Model | Dataset | Format | Resolution | Board | Execution Engine | Inference time (ms) | Inf / sec | STEdgeAI Core version |
|-------|---------|--------|------------|-------|------------------|---------------------|-----------|-----------------------|
| [peleenet_pt_224](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/peleenet_pt/Public_pretrainedmodel_public_dataset/Imagenet/peleenet_pt_224/peleenet_pt_224_qdq_int8.onnx) | ImageNet | Int8 | 224×224×3 | STM32N6570-DK | NPU/MCU | 18.01 | 55.52 | 3.0.0 |

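The "Inf / sec" column is simply the reciprocal of the measured per-inference latency:

```python
# Throughput (inferences per second) from the measured latency in milliseconds.
latency_ms = 18.01
inferences_per_second = 1000.0 / latency_ms  # ≈ 55.52
```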
### Accuracy with ImageNet dataset

Dataset details: [link](https://www.image-net.org)
Number of classes: 1000.
To perform the quantization, we calibrated the activations with a random subset of the training set.
For the sake of simplicity, the accuracy reported here was estimated on the 50,000 labelled images of the validation set.

| Model | Format | Resolution | Top 1 Accuracy |
|-------|--------|------------|----------------|
| [peleenet_pt_224](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/peleenet_pt/Public_pretrainedmodel_public_dataset/Imagenet/peleenet_pt_224/peleenet_pt_224_qdq_int8.onnx) | Int8 | 224×224×3 | 70.37% |
| [peleenet_pt_224](https://github.com/STMicroelectronics/stm32ai-modelzoo/tree/main/image_classification/peleenet_pt/Public_pretrainedmodel_public_dataset/Imagenet/peleenet_pt_224/peleenet_pt_224.onnx) | Float | 224×224×3 | 70.57% |

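Top-1 accuracy as reported above is the fraction of validation images whose highest-confidence class matches the ground-truth label. A minimal NumPy sketch, using a toy batch rather than the actual ImageNet validation set:

```python
import numpy as np

def top1_accuracy(confidences, labels):
    # Fraction of samples whose highest-confidence class matches the label.
    predictions = np.argmax(confidences, axis=1)
    return float(np.mean(predictions == labels))

# Toy check: 3 samples, 4 classes; argmax gives [1, 0, 2] vs labels [1, 0, 3],
# so 2 of 3 predictions are correct.
confidences = np.array([[0.1, 0.9, 0.0, 0.0],
                        [0.8, 0.1, 0.05, 0.05],
                        [0.2, 0.2, 0.5, 0.1]], dtype=np.float32)
labels = np.array([1, 0, 3])
accuracy = top1_accuracy(confidences, labels)
```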
## Retraining and integration in a simple example

Please refer to the stm32ai-modelzoo-services GitHub repository [here](https://github.com/STMicroelectronics/stm32ai-modelzoo-services).

# References

<a id="1">[1]</a> - **Dataset**: ImageNet (ILSVRC 2012), https://www.image-net.org/

<a id="2">[2]</a> - **Model**: PeleeNet, https://github.com/Robert-JunWang/PeleeNet