---
license: other
tags:
  - coreml
  - bird-classification
  - object-detection
  - keypoint-detection
  - image-quality-assessment
---

# SuperPicky CoreML Models

CoreML-converted copies of the five machine-learning models used by
[SuperPickyMac](https://github.com/halfhacked/SuperPickyMac), a native macOS
birding photo-culling app. Each file is the `weights/weight.bin` payload of
the corresponding `.mlmodelc` directory — the app ships the small scaffold
files (`model.mil`, `metadata.json`, …) in its app bundle and downloads
these weight blobs on first launch.
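Concretely, after the app downloads a blob, the reassembled compiled model directory looks roughly like this (layout inferred from the description above; scaffold contents vary slightly per model):

```
FlightDetector.mlmodelc/
├── model.mil          # shipped in the app bundle
├── metadata.json      # shipped in the app bundle
└── weights/
    └── weight.bin     # downloaded from this repo
```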

**This repository does not introduce any new models.** Every model here is a
conversion of an existing, independently-published network to Apple's
Core ML format, for native Neural Engine execution on Apple Silicon. Credit
and licensing belong to the original authors.

## Models and credits

| File | Architecture | Source / credit | License |
|---|---|---|---|
| `FlightDetector.weight.bin` (41 MB) | EfficientNet-B3 → binary head | Trained by [SuperPicky](https://gitcode.com/Jamesphotography/SuperPicky) (Jamesphotography) for flying-vs-perched bird classification. Backbone: [EfficientNet](https://arxiv.org/abs/1905.11946) (Tan & Le, 2019). | See SuperPicky repo |
| `KeypointDetector.weight.bin` (94 MB) | ResNet50 + PartLocalizer head | Trained by SuperPicky on [CUB-200-2011](http://www.vision.caltech.edu/datasets/cub_200_2011/) keypoint annotations (left-eye, right-eye, beak). | See SuperPicky repo |
| `YOLOBirdDetector.weight.bin` (53 MB) | YOLO11l-seg | [Ultralytics YOLO11l-seg](https://github.com/ultralytics/ultralytics); SuperPicky filters detections to COCO class 14 (`bird`). | [AGPL-3.0](https://github.com/ultralytics/ultralytics/blob/main/LICENSE) |
| `OSEAClassifier.weight.bin` (103 MB) | ResNet34 → 10,964 species | [OSEA bird classifier](https://gitcode.com/sunjiao) by [Sun Jiao](https://gitcode.com/sunjiao). Trained on ~11 k bird species worldwide; SuperPicky feeds each YOLO crop to it for species identification. | See OSEA repo |
| `AestheticsModel.weight.bin` (266 MB) | CFANet / TOPIQ (ResNet50 backbone + transformer cross-attention) | [TOPIQ](https://github.com/chaofengc/IQA-PyTorch) by Chen *et al.*; CFANet checkpoint trained on the AVA aesthetics dataset. Paper: [TOPIQ: A Top-Down Approach from Semantics to Distortions for Image Quality Assessment](https://arxiv.org/abs/2308.03060). | [NTU S-Lab License](https://github.com/chaofengc/IQA-PyTorch/blob/main/LICENSE.txt) |

All source PyTorch checkpoints originate from the [`jamesphotography/SuperPicky-models`](https://huggingface.co/jamesphotography/SuperPicky-models) reference repository — see there for the `.pth` / `.onnx` sources and the corresponding training code.

## What this repo contains

Five files, one per CoreML model, each identical to the `weight.bin` blob
produced by `coremltools.convert(...).save()`:

| File | SHA-256 | Size |
|---|---|---|
| `FlightDetector.weight.bin` | `0105ee79ff06f4f40edace40daa275f71126d8d1fb0737f0fff029c611379610` | 42,634,112 |
| `KeypointDetector.weight.bin` | `0ce77aefef957af92ffbc58e23897f7b6127ac79ab1d23f8a0395db9f296d82c` | 98,676,800 |
| `YOLOBirdDetector.weight.bin` | `387b5e33feb8fdaac86e6792ba11cf40d91aaed851bb4ccb0ce04501cbc760ca` | 55,367,168 |
| `OSEAClassifier.weight.bin` | `cd2ca17e7858e3b49647a01e7830d38405e5b605f6c49c5b8f2490c73bd67bf2` | 107,681,472 |
| `AestheticsModel.weight.bin` | `9e3612f51c95331d69cf5aecfff5185f4f7316436f00186713f9656fb211f1b9` | 278,668,800 |

The SuperPicky Mac app bundles `manifest.json` with exactly these digests and
refuses to install a downloaded file whose SHA-256 doesn't match — so if you
modify any file here, the app will reject it.
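You can verify a downloaded blob the same way the app does. A minimal standard-library sketch (the `EXPECTED` mapping and function names here are illustrative, not the app's actual code; the digests come from the table above):

```python
import hashlib
from pathlib import Path

# Expected digests, copied from the table above (one entry shown;
# add the remaining four files as needed).
EXPECTED = {
    "FlightDetector.weight.bin":
        "0105ee79ff06f4f40edace40daa275f71126d8d1fb0737f0fff029c611379610",
}

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash the file in 1 MiB chunks so a 266 MB blob never
    needs to fit in memory at once."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(path: Path) -> bool:
    """True if the file's digest matches the manifest entry."""
    return sha256_of(path) == EXPECTED.get(path.name)
```

Any mismatch — a truncated download, a re-saved file — changes the digest, which is why the app treats the manifest as the source of truth.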

## Reproducing these weights

The conversion scripts live in the SuperPickyMac repo under
[`scripts/convert_*.py`](https://github.com/halfhacked/SuperPickyMac/tree/main/scripts).
Each script:

1. Loads the original PyTorch checkpoint from the SuperPicky source models
   (or a pinned Ultralytics release).
2. Traces the model with `torch.jit.trace`.
3. Converts via `coremltools.convert(..., convert_to='mlprogram',
   compute_precision=ct.precision.FLOAT32)`.
4. Writes a `.mlpackage` directory whose `weights/weight.bin` is the file
   you see here, and runs a parity check against the PyTorch original
   (max absolute delta typically ≤ 1e-6).

No architectural changes, no re-training, no quantization — just format
translation so the models can run on Apple's Neural Engine.
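The parity check in step 4 amounts to comparing flattened outputs of the PyTorch and Core ML models on the same input. A pure-Python stand-in for that tensor comparison (the real scripts operate on model outputs; the function name and sample values below are illustrative):

```python
def max_abs_delta(reference, converted):
    """Largest element-wise absolute difference between two
    flattened output sequences of equal length."""
    if len(reference) != len(converted):
        raise ValueError("output shapes differ")
    return max(abs(r - c) for r, c in zip(reference, converted))

# Outputs agreeing to within float32 rounding pass the <= 1e-6 check.
pytorch_out = [0.1234567, -1.0, 3.5]
coreml_out  = [0.1234568, -1.0, 3.5]
assert max_abs_delta(pytorch_out, coreml_out) <= 1e-6
```

Because the conversion keeps `FLOAT32` precision end to end, the delta reflects only operator-level numerical differences, not quantization loss.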

## License

Each model inherits the license of its upstream source (see the table
above). This repository packages the CoreML conversion artifacts only;
please consult the original projects for terms governing commercial use,
redistribution, and derivative works.