# BioCLIP 2 ExecuTorch (CoreML)

ExecuTorch `.pte` exports of the BioCLIP 2 visual encoder for on-device inference on Apple devices (iOS 18+ / macOS 15+).

Source code & export scripts: [github.com/mallman/CoreMLCLIP](https://github.com/mallman/CoreMLCLIP)
## Files

| File | Precision | Backend | Compute Units | Size |
|---|---|---|---|---|
| `bioclip2_visual_fp16_all.pte` | fp16 | CoreML + XNNPACK fallback | CPU + GPU + ANE | ~581 MB |
| `bioclip2_visual_fp32_cpu.pte` | fp32 | XNNPACK | CPU only | ~1.1 GB |
The fp16 CoreML variant is recommended for deployment, as it can run on the Apple Neural Engine.
## Model Details
- Source model: imageomics/bioclip-2 (MIT license)
- Architecture: ViT-L/14 visual encoder (~302M params)
- Input: `[1, 3, 224, 224]` float tensor (RGB, normalized)
- Output: 768-dim L2-normalized embedding vector
- ExecuTorch version: 1.1.0
- Minimum deployment target: iOS 18 / macOS 15
## Usage
The exported model takes a preprocessed image tensor and returns an L2-normalized embedding. Classification is a dot product against precomputed text embeddings.
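The dot-product classification step can be sketched as follows. This is a minimal NumPy illustration, not the repo's actual inference code: the random vectors stand in for a real image embedding from the `.pte` model and real precomputed text embeddings (the actual set has 867K species rows, not 1000), and `species_names` is a placeholder label list.

```python
import numpy as np

rng = np.random.default_rng(0)

def l2_normalize(x, axis=-1):
    return x / np.linalg.norm(x, axis=axis, keepdims=True)

# Stand-ins: both sides are L2-normalized 768-dim vectors, so a plain
# dot product equals cosine similarity.
image_embedding = l2_normalize(rng.standard_normal(768))            # from the visual encoder
text_embeddings = l2_normalize(rng.standard_normal((1000, 768)))    # one row per species
species_names = [f"species_{i}" for i in range(1000)]               # placeholder labels

# Classification: score every species with one matrix-vector product,
# then take the five highest-scoring rows.
scores = text_embeddings @ image_embedding
top5 = np.argsort(scores)[::-1][:5]
for idx in top5:
    print(species_names[idx], float(scores[idx]))
```

Because the embeddings are pre-normalized, no division by vector norms is needed at query time; ranking 867K species is a single matrix-vector product.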
### Preprocessing
| Parameter | Value |
|---|---|
| Input size | 224 x 224 |
| Resize | Bicubic, shortest edge to 224 |
| Crop | Center crop |
| Color space | RGB, [0, 1] range |
| Normalization mean | [0.48145466, 0.4578275, 0.40821073] |
| Normalization std | [0.26862954, 0.26130258, 0.27577711] |
### Text Embeddings

Precomputed text embeddings for 867K species are available from the TreeOfLife-200M dataset.
## Verification
Both variants were verified against the original PyTorch model using a CC-BY licensed monarch butterfly photo from iNaturalist:
| Variant | Cosine Similarity | Max Abs Diff | Top-1 | Top-5 |
|---|---|---|---|---|
| fp16 CoreML | 0.999999 | 0.000183 | Danaus plexippus | 5/5 match |
| fp32 XNNPACK | 1.000000 | 0.000000 | Danaus plexippus | 5/5 match |
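The two parity metrics in the table can be computed as below. The vectors here are synthetic stand-ins; in a real check, `reference` comes from the original PyTorch model and `exported` from the `.pte` export, both run on the same preprocessed image.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for the PyTorch reference embedding (L2-normalized).
reference = rng.standard_normal(768).astype(np.float32)
reference /= np.linalg.norm(reference)

# Stand-in for the exported model's embedding: reference plus tiny
# perturbation, simulating fp16 rounding, then re-normalized.
exported = reference + rng.normal(scale=1e-4, size=768).astype(np.float32)
exported /= np.linalg.norm(exported)

# The metrics reported in the table above. Both vectors are unit-norm,
# so the dot product is the cosine similarity.
cosine_similarity = float(reference @ exported)
max_abs_diff = float(np.max(np.abs(reference - exported)))
print(f"cosine similarity: {cosine_similarity:.6f}")
print(f"max abs diff: {max_abs_diff:.6f}")
```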
## How to Reproduce

```shell
git clone https://github.com/mallman/CoreMLCLIP.git
cd CoreMLCLIP
pip install -r requirements.txt
python export_bioclip2.py
```
See the GitHub repo for full instructions.