|
|
--- |
|
|
license: apache-2.0 |
|
|
tags: |
|
|
- crop-disease-detection |
|
|
- computer-vision |
|
|
- pytorch |
|
|
- efficientnet |
|
|
- agriculture |
|
|
- plant-diseases |
|
|
datasets: |
|
|
- plant-village |
|
|
- dhan-shomadhan |
|
|
- custom |
|
|
language: en |
|
|
--- |
|
|
|
|
|
# Crop Disease Detection Model (EfficientNet-B3) |
|
|
|
|
|
## Overview |
|
|
|
|
|
This model detects and classifies crop diseases using computer vision and deep learning. Built on **EfficientNet-B3** and trained on a curated dataset of 13,000+ images, the model recognizes 17 classes (disease and healthy states) across **five major crops**:
|
|
|
|
|
- **Corn** (Common Rust, Gray Leaf Spot, Northern Leaf Blight, Healthy) |
|
|
- **Potato** (Early Blight, Late Blight, Healthy) |
|
|
- **Rice** (Brown Spot, Leaf Blast, Neck Blast, Healthy) |
|
|
- **Wheat** (Yellow Rust, Brown Rust, Healthy) |
|
|
- **Sugarcane** (Red Rot, Bacterial Blight, Healthy) |
|
|
|
|
|
✅ **Accuracy:** 94.8%

✅ **Precision:** 95.4%

✅ **Recall:** 94.5%
|
|
|
|
|
The model contributes to **SDG 2 – Zero Hunger**, **SDG 12 – Responsible Consumption**, and **SDG 13 – Climate Action** by enabling early intervention and sustainable agriculture practices.
|
|
|
|
|
--- |
|
|
|
|
|
## Model Details |
|
|
|
|
|
- **Architecture**: EfficientNet-B3 (pretrained on ImageNet) |
|
|
- **Classifier Head**: Replaced with `Linear(1536 → 17)`
|
|
- **Framework**: PyTorch |
|
|
- **Total Parameters**: ~10.7M |
|
|
- **Training**: |
|
|
- 5-fold cross-validation |
|
|
- Early stopping (best at epoch 29) |
|
|
- Augmentation & normalization |
|
|
|
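The exact cross-validation code is not published with this card; as an illustration, a minimal index-splitting sketch of standard 5-fold cross-validation (no shuffling or stratification assumed, and `kfold_indices` is a hypothetical helper) looks like:

```python
def kfold_indices(n_samples: int, k: int = 5):
    """Yield (train_indices, val_indices) for each of k folds."""
    # Distribute samples as evenly as possible across the k folds.
    fold_sizes = [n_samples // k + (1 if i < n_samples % k else 0) for i in range(k)]
    folds, start = [], 0
    for size in fold_sizes:
        folds.append(list(range(start, start + size)))
        start += size
    # Each fold takes one turn as the validation set.
    for i in range(k):
        val_idx = folds[i]
        train_idx = [j for fold in folds[:i] + folds[i + 1:] for j in fold]
        yield train_idx, val_idx
```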
|
|
--- |
|
|
|
|
|
## How to Use |
|
|
|
|
|
> 💡 This model requires preprocessing consistent with training (resizing to 300×300 and ImageNet normalization). A ready-to-use prediction example follows.
|
|
|
|
|
### Inference Example (PyTorch) |
|
|
```python |
|
|
import torch |
|
|
from torchvision import transforms |
|
|
from PIL import Image |
|
|
|
from huggingface_hub import hf_hub_download |
|
|
|
|
|
# Download the model file from Hugging Face |
|
|
model_path = hf_hub_download(repo_id="VisionaryQuant/5_Crop_Disease_Detection", filename="best_crop_disease_model.pt") |
|
|
|
|
|
# Load the model (the checkpoint is assumed to contain the full pickled model;
# on PyTorch >= 2.6, weights_only=False is required to unpickle it)
model = torch.load(model_path, map_location=torch.device('cpu'), weights_only=False)
|
|
model.eval() |
|
|
|
|
|
# Preprocess input image |
|
|
image = Image.open("your_crop_image.jpg").convert("RGB") |
|
|
transform = transforms.Compose([ |
|
|
transforms.Resize((300, 300)), |
|
|
transforms.ToTensor(), |
|
|
transforms.Normalize(mean=[0.485, 0.456, 0.406], |
|
|
std=[0.229, 0.224, 0.225]) |
|
|
]) |
|
|
input_tensor = transform(image).unsqueeze(0) |
|
|
|
|
|
# Run inference |
|
|
with torch.no_grad(): |
|
|
logits = model(input_tensor) |
|
|
probs = torch.nn.functional.softmax(logits, dim=1) |
|
|
predicted_idx = torch.argmax(probs, dim=1).item() |
|
|
|
|
|
# Map class index to label |
|
|
idx2label = {0: "Corn___Common_Rust", 1: "Corn___Gray_Leaf_Spot", ..., 16: "Sugarcane___Healthy"} # Add full mapping |
|
|
print("Prediction:", idx2label[predicted_idx])
|
|
``` |
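Beyond the single argmax above, a small helper can surface the top-k classes with their confidences. `top_k_predictions` is a hypothetical addition (not part of the released model), assuming the `idx2label` mapping from the example:

```python
import torch

def top_k_predictions(probs: torch.Tensor, idx2label: dict, k: int = 3):
    # probs: softmax output of shape (1, num_classes)
    values, indices = torch.topk(probs, k, dim=1)
    # Pair each label with its probability, highest first.
    return [(idx2label[i.item()], v.item()) for v, i in zip(values[0], indices[0])]
```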
|
|
|
|
|
## Real-World Applications |
|
|
- Smart Farming: In-field disease detection via mobile apps or drone imagery




- Scalable Monitoring: Automated surveying across large farmlands




- Yield Optimization: Early diagnosis reduces crop loss
|
|
|
|
|
## Citation |
|
|
If you use this model, please cite it as: |
|
|
|
|
|
**BibTeX:** |
|
|
``` |
|
|
@misc{5cropdiseasedetection2025, |
|
|
title = {Crop Disease Detection using EfficientNet-B3}, |
|
|
author = {Abdullahi Olalekan Abdulmumeen}, |
|
|
year = {2025}, |
|
|
url = {https://huggingface.co/VisionaryQuant/5_Crop_Disease_Detection} |
|
|
} |
|
|
``` |
|
|
|
|
|
**APA:** |
|
|
``` |
|
|
Abdulmumeen, A. O. (2025). Crop disease detection using EfficientNet-B3 [Model]. Hugging Face. https://huggingface.co/VisionaryQuant/5_Crop_Disease_Detection |
|
|
``` |
|
|
|
|
|
## Contact & Credits |
|
|
Developed by Abdullahi Olalekan Abdulmumeen <br/> |
|
|
For the NaijaFarmConsultAI 3MTT Knowledge Showcase project |