---
license: mit
tags:
- pytorch
- image-classification
- gzsl
- agriculture
- weeds
- crops
- mobilenetv2
- resnet18
- squeezenet
- shufflenetv2
- squeeze-and-excitation
- depthwise-separable-convolution
- weed-identification
---
# GZSL Weeds Identification: Lightweight Classifier Weights
This repository hosts the PyTorch checkpoints used in our generalized zero‑shot learning (GZSL) pipeline for weed identification in agricultural imagery.
Backbones were fine‑tuned on **CropAndWeed** and evaluated for cross‑dataset generalization to **Plant Phenotyping** and a self‑collected, real‑field dataset.
## Available models
| File name | Architecture / variant |
|-----------|------------------------|
| `mobilenet.pt` | MobileNetV2 (ImageNet stem, width 1.0) |
| `resnet18.pt` | ResNet‑18 |
| `squeezenet.pt` | SqueezeNet 1.1 |
| `shufflenet.pt` | ShuffleNet V2 (baseline) |
| `shufflenet_squeeze_excitation.pt` | ShuffleNet V2 + Squeeze‑and‑Excitation (SE) |
| `shufflenet_sep_conv.pt` | ShuffleNet V2 + Depthwise Separable Convolution (SC) |
| `shufflenet_sep_conv_squeeze_excitation.pt` | ShuffleNet V2 + SC + SE |
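For readers unfamiliar with the two modules named in the variant table, the sketch below shows a generic Squeeze‑and‑Excitation block and a generic depthwise separable convolution in PyTorch. These are textbook formulations for illustration only; the channel widths, reduction ratio, and exact placement inside ShuffleNet V2 follow the WeedZSL repository, not this sketch.

```python
import torch
import torch.nn as nn

class SqueezeExcitation(nn.Module):
    """Generic SE block: global pooling -> bottleneck MLP -> channel rescaling.
    Illustrative only; the integration used for the checkpoints is defined
    in the WeedZSL repository."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        weights = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * weights

class DepthwiseSeparableConv(nn.Module):
    """Depthwise 3x3 conv followed by a pointwise 1x1 conv, the standard
    cheap replacement for a dense 3x3 convolution."""
    def __init__(self, in_ch: int, out_ch: int, stride: int = 1):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, 3, stride, 1,
                                   groups=in_ch, bias=False)
        self.pointwise = nn.Conv2d(in_ch, out_ch, 1, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.act(self.bn(self.pointwise(self.depthwise(x))))

# Quick shape check (116 channels is a typical ShuffleNet V2 stage width):
x = torch.randn(2, 116, 28, 28)
print(SqueezeExcitation(116)(x).shape)            # torch.Size([2, 116, 28, 28])
print(DepthwiseSeparableConv(116, 232)(x).shape)  # torch.Size([2, 232, 28, 28])
```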
## Getting started
All inference scripts, data loaders, and architecture definitions live in the companion GitHub repository:
https://github.com/SyArsRa/WeedZSL.git
The quick‑start guide there walks through:
1. Instantiating the desired backbone (for example, MobileNetV2 or ShuffleNetV2 + SE)
2. Loading the matching `.pt` file from this weights hub
3. Running single‑image or batch inference
4. Fine‑tuning on a custom dataset if needed
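Steps 1 through 3 might look like the minimal sketch below. It assumes the `.pt` files store `state_dict`s compatible with torchvision's MobileNetV2 and that the classifier head was resized to the dataset's class count; `NUM_CLASSES`, the checkpoint path, the input image, and the preprocessing pipeline are illustrative placeholders, so consult the WeedZSL repository for the exact definitions.

```python
import torch
from torchvision import models, transforms
from PIL import Image

# Placeholder: use the class count from the WeedZSL repository.
NUM_CLASSES = 10

# 1. Instantiate the backbone (torchvision's MobileNetV2 assumed here;
#    the repo's own definition may differ).
model = models.mobilenet_v2(weights=None)
model.classifier[1] = torch.nn.Linear(model.last_channel, NUM_CLASSES)

# 2. Load the matching checkpoint, assumed downloaded from this weights
#    hub and saved as a state_dict (a full-model save would load differently).
state = torch.load("mobilenet.pt", map_location="cpu")
model.load_state_dict(state)
model.eval()

# 3. Single-image inference with standard ImageNet preprocessing
#    (an assumption; check the repo for the actual transforms).
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])
image = Image.open("field_image.jpg").convert("RGB")
batch = preprocess(image).unsqueeze(0)

with torch.no_grad():
    logits = model(batch)
print("Predicted class index:", logits.argmax(dim=1).item())
```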
## License
Weights and code are released under the MIT license.
See `LICENSE` for details or contact the maintainers with licensing questions.
## Citation
These checkpoints support a study currently submitted to a conference.
Please cite the forthcoming paper or contact the authors for an interim reference.