|
|
--- |
|
|
license: agpl-3.0 |
|
|
datasets: |
|
|
- openfoodfacts/front_image_classification |
|
|
base_model: |
|
|
- Ultralytics/YOLO11 |
|
|
metrics: |
|
|
- accuracy |
|
|
--- |
|
|
|
|
|
# Front image classification model |
|
|
|
|
|
This model classifies Open Food Facts images into two classes: |
|
|
|
|
|
- `front` (ID 0) |
|
|
- `other` (ID 1) |
|
|
|
|
|
Front images are the "default" image of a product, displayed on the Open Food Facts product page. A front image is usually a photo of the front side of the product packaging.
|
|
Detecting front images is useful because it lets us replace the front image with a newer version (for example, when the packaging changes).
|
|
|
|
|
## Model Details |
|
|
|
|
|
### Model Description |
|
|
|
|
|
- **Developed by:** Raphaël Bournhonesque |
|
|
- **Model type:** Image Classification |
|
|
- **License:** AGPL-3.0
|
|
- **Finetuned from model:** YOLO11n-cls
|
|
|
|
|
## Uses |
|
|
|
|
|
This model is intended to be used on Open Food Facts images only (images of packaged food products).
|
|
|
|
|
## Training Details |
|
|
|
|
|
### Training Data |
|
|
|
|
|
v1.0 of the [front_image_classification](https://huggingface.co/datasets/openfoodfacts/front_image_classification) dataset was used to train the model. |
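
For reference, a minimal sketch of loading the dataset with the `datasets` library (the `revision="v1.0"` tag is an assumption; check the dataset repository for the exact tag name):

```python
from datasets import load_dataset

# The "v1.0" revision is an assumption; adjust to the actual tag if it differs.
dataset = load_dataset("openfoodfacts/front_image_classification", revision="v1.0")
print(dataset)
```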
|
|
|
|
|
### Training Procedure |
|
|
|
|
|
- Epochs: 100 |
|
|
- Image size: 448 |
|
|
- Albumentations augmentation
|
|
|
|
|
[This script](https://github.com/openfoodfacts/openfoodfacts-ai/blob/dbbec40a3d964124cd7c8d838023be4a10d6c0be/front-image-classification/train.py) was used for training the model. |
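
For reference, a minimal sketch of a comparable fine-tuning run with the Ultralytics API (the dataset path is a placeholder; the actual run used the linked script, which adds the Albumentations pipeline on top of the defaults):

```python
from ultralytics import YOLO

# Start from the pretrained YOLO11n classification checkpoint.
model = YOLO("yolo11n-cls.pt")

# Hyperparameters taken from the list above; "path/to/dataset" is a
# placeholder for a local copy of the front_image_classification dataset.
model.train(data="path/to/dataset", epochs=100, imgsz=448)
```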
|
|
|
|
|
The preprocessing pipeline used at prediction time:
|
|
|
|
|
```python
import albumentations as A
from albumentations.pytorch import ToTensorV2

# max_size matches the 448 px training image size; DEFAULT_MEAN and
# DEFAULT_STD are the normalization constants defined in the training script.
transform = A.Compose(
    [
        A.LongestMaxSize(max_size=max_size, p=1.0),
        A.PadIfNeeded(min_height=max_size, min_width=max_size, p=1.0),
        A.Normalize(mean=DEFAULT_MEAN, std=DEFAULT_STD, p=1.0),
        ToTensorV2(p=1.0),
    ]
)
```
|
|
|
|
|
For optimal performance, it is advised to keep the same preprocessing pipeline during inference. |
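
As an illustration, a sketch of applying this pipeline to a single image before inference (assumes `opencv-python` is installed and that `transform`, `max_size`, `DEFAULT_MEAN`, and `DEFAULT_STD` are defined as above):

```python
import cv2

image = cv2.imread("product.jpg")               # placeholder image path
image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)  # Albumentations expects RGB

# Produces a float tensor of shape (3, max_size, max_size); add a batch
# dimension before feeding it to the model.
tensor = transform(image=image)["image"].unsqueeze(0)
```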
|
|
|
|
|
## Evaluation |
|
|
|
|
|
- Accuracy: 0.9525
|
|
|
|
|
## Export |
|
|
|
|
|
An ONNX export can be found in `weights/model.onnx`. |
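
A minimal sketch of running the ONNX export with `onnxruntime` (the input layout and the use of an argmax over the output are assumptions; inspect the exported graph to confirm the tensor names and shapes):

```python
import numpy as np
import onnxruntime as ort

ID_TO_LABEL = {0: "front", 1: "other"}

session = ort.InferenceSession("weights/model.onnx")
input_name = session.get_inputs()[0].name

# `batch` should be the preprocessed image from the pipeline above, converted
# to a (1, 3, H, W) float32 NumPy array. A random placeholder is used here.
batch = np.random.rand(1, 3, 448, 448).astype(np.float32)

(scores,) = session.run(None, {input_name: batch})
print(ID_TO_LABEL[int(np.argmax(scores, axis=1)[0])])
```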