---
license: apache-2.0
library_name: pytorch
pipeline_tag: image-segmentation
tags:
- image-segmentation
- pytorch
- unet
- fungal-colony
- petri-dish
- morphometry
- magnaporthe
- area-consistency
language:
- en
---

# πŸ”¬ Gray Leaf Spot Segmentation Model

PyTorch U-Net models for **gray leaf spot** (*Magnaporthe* and related fungal) colony segmentation on 90 mm petri-dish images.

**[β–Ά Try the live demo](https://huggingface.co/spaces/rotsl/grayleafspot-segmentation-demo)** β€” upload images, run inference, see overlays & growth charts in your browser.

---

## Model Weights

| File | Architecture | Params | Area-Consistency Weight | Description |
|---|---|---|---|---|
| `grayleafspot.pt` | smp.Unet (ResNet-34) | ~24.4 M | β€” | Main encoder–decoder model |
| `best_area_w_0.1.pt` | SmallUNet | ~250 K | 0.1 | Light area regularisation |
| `best_area_w_0.3.pt` | SmallUNet | ~250 K | 0.3 | Moderate area regularisation |
| `best_area_w_0.5.pt` | SmallUNet | ~250 K | 0.5 | Balanced BCE + area |
| **`best_area_w_0.7.pt`** | **SmallUNet** | **~250 K** | **0.7** | **Strong area consistency (used by demo) βœ… recommended** |

All SmallUNet variants share the same architecture:

```
Input (3 Γ— 256 Γ— 256)
  β”‚
  β”œβ”€ enc1: ConvBlock(3 β†’ 16)              ─── skip s1
  β”œβ”€ enc2: MaxPool2d β†’ ConvBlock(16 β†’ 32)  ─── skip s2
  β”œβ”€ enc3: MaxPool2d β†’ ConvBlock(32 β†’ 64)  ─── skip s3
  β”œβ”€ enc4: MaxPool2d β†’ ConvBlock(64 β†’ 128) ─── skip s4
  β”‚
  β”œβ”€ bottleneck: MaxPool2d β†’ ConvBlock(128 β†’ 256)
  β”‚
  β”œβ”€ up4: Upsample + cat(s4) β†’ ConvBlock(384 β†’ 128)
  β”œβ”€ up3: Upsample + cat(s3) β†’ ConvBlock(192 β†’ 64)
  β”œβ”€ up2: Upsample + cat(s2) β†’ ConvBlock(96 β†’ 32)
  β”œβ”€ up1: Upsample + cat(s1) β†’ ConvBlock(48 β†’ 16)
  β”‚
  └─ head: Conv2d(16 β†’ 1) β†’ Sigmoid
```

Each `ConvBlock` = Conv3Γ—3 (no bias) β†’ ReLU β†’ Conv3Γ—3 (no bias) β†’ ReLU.
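
The exact class definition lives in the demo repo's `app.py`; a minimal sketch consistent with the channel widths in the diagram above (layer attribute names here are assumptions, and bilinear upsampling is assumed for the `Upsample` steps) could look like:

```python
import torch
import torch.nn as nn


class ConvBlock(nn.Module):
    """Conv3x3 (no bias) -> ReLU -> Conv3x3 (no bias) -> ReLU."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, padding=1, bias=False),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.block(x)


class SmallUNet(nn.Module):
    def __init__(self, in_channels=3, out_channels=1, base_channels=16):
        super().__init__()
        c = base_channels
        self.enc1 = ConvBlock(in_channels, c)        # 3 -> 16
        self.enc2 = ConvBlock(c, 2 * c)              # 16 -> 32
        self.enc3 = ConvBlock(2 * c, 4 * c)          # 32 -> 64
        self.enc4 = ConvBlock(4 * c, 8 * c)          # 64 -> 128
        self.bottleneck = ConvBlock(8 * c, 16 * c)   # 128 -> 256
        self.pool = nn.MaxPool2d(2)
        self.up = nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False)
        self.dec4 = ConvBlock(16 * c + 8 * c, 8 * c)  # 384 -> 128
        self.dec3 = ConvBlock(8 * c + 4 * c, 4 * c)   # 192 -> 64
        self.dec2 = ConvBlock(4 * c + 2 * c, 2 * c)   # 96 -> 32
        self.dec1 = ConvBlock(2 * c + c, c)           # 48 -> 16
        self.head = nn.Conv2d(c, out_channels, 1)

    def forward(self, x):
        s1 = self.enc1(x)
        s2 = self.enc2(self.pool(s1))
        s3 = self.enc3(self.pool(s2))
        s4 = self.enc4(self.pool(s3))
        b = self.bottleneck(self.pool(s4))
        d4 = self.dec4(torch.cat([self.up(b), s4], dim=1))
        d3 = self.dec3(torch.cat([self.up(d4), s3], dim=1))
        d2 = self.dec2(torch.cat([self.up(d3), s2], dim=1))
        d1 = self.dec1(torch.cat([self.up(d2), s1], dim=1))
        return torch.sigmoid(self.head(d1))
```

Note the four pooling stages: input height and width must be divisible by 16, which the 256 × 256 input satisfies.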

| Property | Value |
|---|---|
| **Input** | 256 Γ— 256 RGB |
| **Output** | 1-channel sigmoid probability mask |
| **Training loss** | BCE + area-consistency loss |
| **CPU compatible** | βœ… Pure PyTorch β€” no custom CUDA kernels |
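
The card does not spell out the area-consistency term, but a common formulation (and the one assumed in this hypothetical sketch) penalises the gap between predicted and ground-truth foreground fractions, weighted by the checkpoint's area-consistency weight:

```python
import torch
import torch.nn.functional as F


def area_consistency_loss(pred, target, area_weight=0.7):
    """Hypothetical BCE + area-consistency objective.

    pred:   sigmoid probabilities, shape (B, 1, H, W)
    target: binary masks in {0, 1}, same shape
    """
    bce = F.binary_cross_entropy(pred, target)
    # Per-image foreground fraction; matching these keeps the total
    # predicted colony area consistent with the annotation.
    pred_area = pred.mean(dim=(1, 2, 3))
    true_area = target.mean(dim=(1, 2, 3))
    area_term = (pred_area - true_area).abs().mean()
    return bce + area_weight * area_term
```

The exact normalisation used in `src/area_consistency/train_area.py` may differ; treat this as an illustration of the idea rather than the training code.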

---

## Quick Start

### Download & Inference (SmallUNet)

```python
import torch
from huggingface_hub import hf_hub_download

# Download weights
path = hf_hub_download("rotsl/grayleafspot-segmentation", "best_area_w_0.7.pt")

# Load checkpoint
ckpt = torch.load(path, map_location="cpu", weights_only=False)

# Build model (SmallUNet architecture β€” see demo repo for full class definition)
# https://huggingface.co/rotsl/grayleafspot-segmentation-demo/blob/main/app.py
from model import SmallUNet  # or copy the class from the demo app.py

model = SmallUNet(in_channels=3, out_channels=1, base_channels=16)
model.load_state_dict(ckpt["model_state_dict"])
model.eval()

# Run inference on a 256Γ—256 RGB tensor
import numpy as np
from PIL import Image

img = Image.open("petri_dish.jpg").convert("RGB").resize((256, 256))
x = torch.from_numpy(np.array(img).transpose(2, 0, 1)).float() / 255.0
x = x.unsqueeze(0)

with torch.no_grad():
    prob = model(x)[0, 0].numpy()

mask = (prob > 0.5).astype(np.uint8) * 255
Image.fromarray(mask).save("colony_mask.png")
```
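
Since the dishes are a known 90 mm, a binary mask can be converted to physical area. This sketch assumes the dish spans the full image width, which is a calibration assumption; the demo's dish-detection step would give a more accurate scale:

```python
import numpy as np


def colony_area_mm2(mask, dish_diameter_mm=90.0):
    """Convert a binary colony mask (HxW, nonzero = colony) to mm^2,
    assuming the dish diameter equals the image width (hypothetical
    calibration -- replace with the detected dish diameter in pixels)."""
    h, w = mask.shape
    mm_per_px = dish_diameter_mm / w
    return float((mask > 0).sum()) * mm_per_px ** 2
```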

### Download & Inference (Main U-Net)

```python
import torch
from huggingface_hub import hf_hub_download

path = hf_hub_download("rotsl/grayleafspot-segmentation", "grayleafspot.pt")
# This checkpoint is a full pickled model, not a state dict
model = torch.load(path, map_location="cpu", weights_only=False)
model.eval()
```

---

## Training & Inference Pipeline

### Environment Setup (Apple Silicon recommended)

```bash
python3.10 -m venv trainenv
source trainenv/bin/activate
pip install --upgrade pip
pip install -r requirements.txt
```

### Dataset Preparation

Place raw images in `raw/` and corresponding masks in `masks/` (matching filenames). Expand the dataset with augmentations:

```bash
python src/build_augmented_dataset.py --copies-per-image 4 --clean
```
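
The key constraint when augmenting a segmentation dataset is that every geometric transform must be applied identically to the image and its mask so they stay aligned. A minimal sketch of that idea (the actual script's transforms are not specified here, so the rotations/flips below are illustrative):

```python
import numpy as np


def augment_pair(img, mask, rng):
    """Apply the same random quarter-turn and horizontal flip to an
    HxWxC image and its HxW mask, keeping them pixel-aligned."""
    k = int(rng.integers(0, 4))  # 0-3 quarter turns
    img, mask = np.rot90(img, k), np.rot90(mask, k)
    if rng.random() < 0.5:
        img, mask = np.fliplr(img), np.fliplr(mask)
    return img.copy(), mask.copy()
```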

### Training

Standard U-Net training:

```bash
python src/train.py \
    --image-dir augmented_dataset/raw \
    --mask-dir augmented_dataset/masks \
    --epochs 40 --batch-size 4 --lr 1e-4 \
    --image-size 256 --freeze-encoder-epochs 5
```

Area-consistency U-Net (LabelMe JSON polygons):

```bash
./trainenv/bin/python src/area_consistency/train_area.py
```

### Inference (CLI)

Single image:

```bash
python src/predict.py --input raw/your_image.jpg --weights models/best_finetuned.pt --output-dir predictions
```

Folder:

```bash
python src/predict.py --input raw --weights models/best_finetuned.pt --output-dir predictions
```

### Best Practices

- Keep trusted human-labelled masks unchanged.
- Use augmentations and area-consistency loss for improved generalisation.
- Inspect overlay outputs to verify mask quality.
- On Apple Silicon, MPS acceleration is used automatically if available.
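
Automatic MPS selection along these lines can be sketched as follows (a minimal fallback chain, not necessarily the exact logic in the training scripts):

```python
import torch


def pick_device():
    """Prefer Apple-Silicon MPS, then CUDA, then CPU."""
    if torch.backends.mps.is_available():
        return torch.device("mps")
    if torch.cuda.is_available():
        return torch.device("cuda")
    return torch.device("cpu")


device = pick_device()
# model = model.to(device); x = x.to(device)
```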

---

## Demo Spaces

### βœ… Recommended: [`rotsl/grayleafspot-segmentation-demo`](https://huggingface.co/spaces/rotsl/grayleafspot-segmentation-demo)

Uses **`best_area_w_0.7.pt`** (SmallUNet trained with area-consistency loss). The area-consistency regularisation yields more accurate segmentation with better boundary adherence.

Features: dish detection β†’ colony segmentation β†’ crack & hyphae analysis β†’ 16 morphometric measurements β†’ time-series growth charts β†’ CSV/JSON export.

Source code: [`rotsl/grayleafspot-segmentation-demo`](https://huggingface.co/rotsl/grayleafspot-segmentation-demo) (model repo)

### Legacy: [`rotsl/fungal-colony-input`](https://huggingface.co/spaces/rotsl/fungal-colony-input)

Uses **`grayleafspot.pt`** (smp.Unet with ResNet-34 encoder). This is the earlier, larger model trained with standard BCE loss only β€” it is less accurate than the area-consistency variant above, particularly for colony boundary delineation and area estimation. Kept available for reference and backward compatibility.

---

## Citation

```bibtex
@misc{rohan_r_2026,
  author       = {rohan r},
  title        = {grayleafspot-segmentation (Revision 0e85f71)},
  year         = 2026,
  url          = {https://huggingface.co/rotsl/grayleafspot-segmentation},
  doi          = {10.57967/hf/8416},
  publisher    = {Hugging Face}
}
```

## License

Apache License 2.0 β€” see [LICENSE](LICENSE) for details.