<!-- hf_public_repos/pytorch-image-models/hfdocs/source/models/tf-inception-v3.mdx -->
# (TensorFlow) Inception v3
**Inception v3** is a convolutional neural network architecture from the Inception family that makes several improvements, including using [Label Smoothing](https://paperswithcode.com/method/label-smoothing), factorized 7×7 convolutions, and an [auxiliary classifier](https://paperswithcode.com/method/auxiliary-classifier) to propagate label information lower down the network (along with the use of batch normalization for layers in the side head). The key building block is an [Inception Module](https://paperswithcode.com/method/inception-v3-module).
The weights from this model were ported from [Tensorflow/Models](https://github.com/tensorflow/models).
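The factorization trick is easy to sketch in plain PyTorch: a 7×7 convolution is replaced by a 1×7 followed by a 7×1 convolution, covering the same receptive field with far fewer weights. This is a standalone illustration with arbitrary layer sizes, not timm's internal implementation:

```py
import torch
import torch.nn as nn

# A 7x7 convolution factorized into 1x7 followed by 7x1.
# Roughly 2*7*C*C weights instead of 49*C*C for the full kernel.
factorized = nn.Sequential(
    nn.Conv2d(64, 64, kernel_size=(1, 7), padding=(0, 3)),
    nn.Conv2d(64, 64, kernel_size=(7, 1), padding=(3, 0)),
)
full = nn.Conv2d(64, 64, kernel_size=7, padding=3)

x = torch.randn(1, 64, 32, 32)
print(factorized(x).shape, full(x).shape)  # identical output shapes
```

The factorized stack keeps spatial dimensions identical to the full 7×7 kernel while using about 2/7 of its parameters.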
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('tf_inception_v3', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 prediction class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # Samoyed 0.6425196528434753
>>> # Pomeranian 0.04062102362513542
>>> # keeshond 0.03186424449086189
>>> # white wolf 0.01739676296710968
>>> # Eskimo dog 0.011717947199940681
```
Replace the model name with the variant you want to use, e.g. `tf_inception_v3`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction); just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('tf_inception_v3', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model from scratch.
## Citation
```BibTeX
@article{DBLP:journals/corr/SzegedyVISW15,
author = {Christian Szegedy and
Vincent Vanhoucke and
Sergey Ioffe and
Jonathon Shlens and
Zbigniew Wojna},
title = {Rethinking the Inception Architecture for Computer Vision},
journal = {CoRR},
volume = {abs/1512.00567},
year = {2015},
url = {http://arxiv.org/abs/1512.00567},
archivePrefix = {arXiv},
eprint = {1512.00567},
timestamp = {Mon, 13 Aug 2018 16:49:07 +0200},
biburl = {https://dblp.org/rec/journals/corr/SzegedyVISW15.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
<!--
Type: model-index
Collections:
- Name: TF Inception v3
Paper:
Title: Rethinking the Inception Architecture for Computer Vision
URL: https://paperswithcode.com/paper/rethinking-the-inception-architecture-for
Models:
- Name: tf_inception_v3
In Collection: TF Inception v3
Metadata:
FLOPs: 7352418880
Parameters: 23830000
File Size: 95549439
Architecture:
- 1x1 Convolution
- Auxiliary Classifier
- Average Pooling
- Batch Normalization
- Convolution
- Dense Connections
- Dropout
- Inception-v3 Module
- Max Pooling
- ReLU
- Softmax
Tasks:
- Image Classification
Training Techniques:
- Gradient Clipping
- Label Smoothing
- RMSProp
- Weight Decay
Training Data:
- ImageNet
Training Resources: 50x NVIDIA Kepler GPUs
ID: tf_inception_v3
LR: 0.045
Dropout: 0.2
Crop Pct: '0.875'
Momentum: 0.9
Image Size: '299'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/inception_v3.py#L449
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_inception_v3-e0069de4.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 77.87%
Top 5 Accuracy: 93.65%
-->
<!-- hf_public_repos/pytorch-image-models/hfdocs/source/models/selecsls.mdx -->
# SelecSLS
**SelecSLS** uses novel selective long and short range skip connections to improve information flow, allowing for a drastically faster network without compromising accuracy.
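The idea of mixing short and long range skips can be sketched with a toy block that concatenates features from within the block (short range) with a feature map carried forward from earlier in the network (long range). `ToySelectiveBlock` and its layer sizes are invented for illustration and do not mirror the actual SelecSLS topology:

```py
import torch
import torch.nn as nn

def conv_bn(cin, cout):
    return nn.Sequential(
        nn.Conv2d(cin, cout, 3, padding=1),
        nn.BatchNorm2d(cout),
        nn.ReLU(inplace=True),
    )

class ToySelectiveBlock(nn.Module):
    """Short-range skips concatenate features from within the block;
    an optional long-range skip carries an earlier feature map forward."""
    def __init__(self, cin, cmid, cout, long_in=0):
        super().__init__()
        self.conv1 = conv_bn(cin, cmid)
        self.conv2 = conv_bn(cmid, cmid)
        self.fuse = conv_bn(2 * cmid + long_in, cout)

    def forward(self, x, long_feat=None):
        a = self.conv1(x)
        b = self.conv2(a)
        feats = [a, b] if long_feat is None else [a, b, long_feat]
        return self.fuse(torch.cat(feats, dim=1))

blk = ToySelectiveBlock(32, 16, 64, long_in=32)
x = torch.randn(1, 32, 56, 56)
out = blk(x, long_feat=x)  # long-range skip reuses an earlier feature map
print(out.shape)
```

Concatenation-based skips like these let gradients and low-level features reach deeper layers without the channel-matching constraint of additive residuals.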
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('selecsls42b', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 prediction class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # Samoyed 0.6425196528434753
>>> # Pomeranian 0.04062102362513542
>>> # keeshond 0.03186424449086189
>>> # white wolf 0.01739676296710968
>>> # Eskimo dog 0.011717947199940681
```
Replace the model name with the variant you want to use, e.g. `selecsls42b`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction); just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('selecsls42b', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model from scratch.
## Citation
```BibTeX
@article{Mehta_2020,
title={XNect: Real-time Multi-Person 3D Motion Capture with a Single RGB Camera},
volume={39},
ISSN={1557-7368},
url={http://dx.doi.org/10.1145/3386569.3392410},
DOI={10.1145/3386569.3392410},
number={4},
journal={ACM Transactions on Graphics},
publisher={Association for Computing Machinery (ACM)},
author={Mehta, Dushyant and Sotnychenko, Oleksandr and Mueller, Franziska and Xu, Weipeng and Elgharib, Mohamed and Fua, Pascal and Seidel, Hans-Peter and Rhodin, Helge and Pons-Moll, Gerard and Theobalt, Christian},
year={2020},
month={Jul}
}
```
<!--
Type: model-index
Collections:
- Name: SelecSLS
Paper:
Title: 'XNect: Real-time Multi-Person 3D Motion Capture with a Single RGB Camera'
URL: https://paperswithcode.com/paper/xnect-real-time-multi-person-3d-human-pose
Models:
- Name: selecsls42b
In Collection: SelecSLS
Metadata:
FLOPs: 3824022528
Parameters: 32460000
File Size: 129948954
Architecture:
- Batch Normalization
- Convolution
- Dense Connections
- Dropout
- Global Average Pooling
- ReLU
- SelecSLS Block
Tasks:
- Image Classification
Training Techniques:
- Cosine Annealing
- Random Erasing
Training Data:
- ImageNet
ID: selecsls42b
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/b9843f954b0457af2db4f9dea41a8538f51f5d78/timm/models/selecsls.py#L335
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-selecsls/selecsls42b-8af30141.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 77.18%
Top 5 Accuracy: 93.39%
- Name: selecsls60
In Collection: SelecSLS
Metadata:
FLOPs: 4610472600
Parameters: 30670000
File Size: 122839714
Architecture:
- Batch Normalization
- Convolution
- Dense Connections
- Dropout
- Global Average Pooling
- ReLU
- SelecSLS Block
Tasks:
- Image Classification
Training Techniques:
- Cosine Annealing
- Random Erasing
Training Data:
- ImageNet
ID: selecsls60
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/b9843f954b0457af2db4f9dea41a8538f51f5d78/timm/models/selecsls.py#L342
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-selecsls/selecsls60-bbf87526.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 77.99%
Top 5 Accuracy: 93.83%
- Name: selecsls60b
In Collection: SelecSLS
Metadata:
FLOPs: 4657653144
Parameters: 32770000
File Size: 131252898
Architecture:
- Batch Normalization
- Convolution
- Dense Connections
- Dropout
- Global Average Pooling
- ReLU
- SelecSLS Block
Tasks:
- Image Classification
Training Techniques:
- Cosine Annealing
- Random Erasing
Training Data:
- ImageNet
ID: selecsls60b
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/b9843f954b0457af2db4f9dea41a8538f51f5d78/timm/models/selecsls.py#L349
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-selecsls/selecsls60b-94e619b5.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 78.41%
Top 5 Accuracy: 94.18%
-->
<!-- hf_public_repos/pytorch-image-models/hfdocs/source/models/resnest.mdx -->
# ResNeSt
A **ResNeSt** is a variant on a [ResNet](https://paperswithcode.com/method/resnet), which instead stacks [Split-Attention blocks](https://paperswithcode.com/method/split-attention). The cardinal group representations are then concatenated along the channel dimension: \\( V = \text{Concat}\{V^{1}, V^{2}, \cdots, V^{K}\} \\). As in standard residual blocks, the final output \\( Y \\) of the Split-Attention block is produced using a shortcut connection: \\( Y=V+X \\), if the input and output feature maps share the same shape. For blocks with a stride, an appropriate transformation \\( \mathcal{T} \\) is applied to the shortcut connection to align the output shapes: \\( Y=V+\mathcal{T}(X) \\). For example, \\( \mathcal{T} \\) can be a strided convolution or combined convolution-with-pooling.
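The shortcut logic can be sketched in PyTorch. `ShortcutAdd` is a hypothetical wrapper in which the Split-Attention computation itself is replaced by a placeholder; it only illustrates choosing between \\( Y=V+X \\) and \\( Y=V+\mathcal{T}(X) \\):

```py
import torch
import torch.nn as nn

class ShortcutAdd(nn.Module):
    """Residual summation around a block: `body` stands in for the
    Split-Attention computation producing V."""
    def __init__(self, body, cin, cout, stride=1):
        super().__init__()
        self.body = body
        if stride != 1 or cin != cout:
            # T: average pooling plus a 1x1 conv aligns the output shapes
            self.shortcut = nn.Sequential(
                nn.AvgPool2d(stride, stride),
                nn.Conv2d(cin, cout, 1, bias=False),
            )
        else:
            self.shortcut = nn.Identity()  # Y = V + X

    def forward(self, x):
        return self.body(x) + self.shortcut(x)  # Y = V + T(X)

body = nn.Conv2d(32, 64, 3, stride=2, padding=1)  # placeholder for split-attention
block = ShortcutAdd(body, 32, 64, stride=2)
print(block(torch.randn(1, 32, 56, 56)).shape)
```

When the block neither strides nor changes channel count, the shortcut degenerates to the identity and the sum is the plain residual \\( Y=V+X \\).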
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('resnest101e', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 prediction class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # Samoyed 0.6425196528434753
>>> # Pomeranian 0.04062102362513542
>>> # keeshond 0.03186424449086189
>>> # white wolf 0.01739676296710968
>>> # Eskimo dog 0.011717947199940681
```
Replace the model name with the variant you want to use, e.g. `resnest101e`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction); just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('resnest101e', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model from scratch.
## Citation
```BibTeX
@misc{zhang2020resnest,
title={ResNeSt: Split-Attention Networks},
author={Hang Zhang and Chongruo Wu and Zhongyue Zhang and Yi Zhu and Haibin Lin and Zhi Zhang and Yue Sun and Tong He and Jonas Mueller and R. Manmatha and Mu Li and Alexander Smola},
year={2020},
eprint={2004.08955},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
<!--
Type: model-index
Collections:
- Name: ResNeSt
Paper:
Title: 'ResNeSt: Split-Attention Networks'
URL: https://paperswithcode.com/paper/resnest-split-attention-networks
Models:
- Name: resnest101e
In Collection: ResNeSt
Metadata:
FLOPs: 17423183648
Parameters: 48280000
File Size: 193782911
Architecture:
- 1x1 Convolution
- Convolution
- Dense Connections
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Connection
- Softmax
- Split Attention
Tasks:
- Image Classification
Training Techniques:
- AutoAugment
- DropBlock
- Label Smoothing
- Mixup
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 64x NVIDIA V100 GPUs
ID: resnest101e
LR: 0.1
Epochs: 270
Layers: 101
Dropout: 0.2
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 4096
Image Size: '256'
Weight Decay: 0.0001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/resnest.py#L182
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest101-22405ba7.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 82.88%
Top 5 Accuracy: 96.31%
- Name: resnest14d
In Collection: ResNeSt
Metadata:
FLOPs: 3548594464
Parameters: 10610000
File Size: 42562639
Architecture:
- 1x1 Convolution
- Convolution
- Dense Connections
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Connection
- Softmax
- Split Attention
Tasks:
- Image Classification
Training Techniques:
- AutoAugment
- DropBlock
- Label Smoothing
- Mixup
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 64x NVIDIA V100 GPUs
ID: resnest14d
LR: 0.1
Epochs: 270
Layers: 14
Dropout: 0.2
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 8192
Image Size: '224'
Weight Decay: 0.0001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/resnest.py#L148
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_resnest14-9c8fe254.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 75.51%
Top 5 Accuracy: 92.52%
- Name: resnest200e
In Collection: ResNeSt
Metadata:
FLOPs: 45954387872
Parameters: 70200000
File Size: 193782911
Architecture:
- 1x1 Convolution
- Convolution
- Dense Connections
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Connection
- Softmax
- Split Attention
Tasks:
- Image Classification
Training Techniques:
- AutoAugment
- DropBlock
- Label Smoothing
- Mixup
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 64x NVIDIA V100 GPUs
ID: resnest200e
LR: 0.1
Epochs: 270
Layers: 200
Dropout: 0.2
Crop Pct: '0.909'
Momentum: 0.9
Batch Size: 2048
Image Size: '320'
Weight Decay: 0.0001
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/resnest.py#L194
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest101-22405ba7.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 83.85%
Top 5 Accuracy: 96.89%
- Name: resnest269e
In Collection: ResNeSt
Metadata:
FLOPs: 100830307104
Parameters: 110930000
File Size: 445402691
Architecture:
- 1x1 Convolution
- Convolution
- Dense Connections
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Connection
- Softmax
- Split Attention
Tasks:
- Image Classification
Training Techniques:
- AutoAugment
- DropBlock
- Label Smoothing
- Mixup
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 64x NVIDIA V100 GPUs
ID: resnest269e
LR: 0.1
Epochs: 270
Layers: 269
Dropout: 0.2
Crop Pct: '0.928'
Momentum: 0.9
Batch Size: 2048
Image Size: '416'
Weight Decay: 0.0001
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/resnest.py#L206
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest269-0cc87c48.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 84.53%
Top 5 Accuracy: 96.99%
- Name: resnest26d
In Collection: ResNeSt
Metadata:
FLOPs: 4678918720
Parameters: 17070000
File Size: 68470242
Architecture:
- 1x1 Convolution
- Convolution
- Dense Connections
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Connection
- Softmax
- Split Attention
Tasks:
- Image Classification
Training Techniques:
- AutoAugment
- DropBlock
- Label Smoothing
- Mixup
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 64x NVIDIA V100 GPUs
ID: resnest26d
LR: 0.1
Epochs: 270
Layers: 26
Dropout: 0.2
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 8192
Image Size: '224'
Weight Decay: 0.0001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/resnest.py#L159
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_resnest26-50eb607c.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 78.48%
Top 5 Accuracy: 94.3%
- Name: resnest50d
In Collection: ResNeSt
Metadata:
FLOPs: 6937106336
Parameters: 27480000
File Size: 110273258
Architecture:
- 1x1 Convolution
- Convolution
- Dense Connections
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Connection
- Softmax
- Split Attention
Tasks:
- Image Classification
Training Techniques:
- AutoAugment
- DropBlock
- Label Smoothing
- Mixup
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 64x NVIDIA V100 GPUs
ID: resnest50d
LR: 0.1
Epochs: 270
Layers: 50
Dropout: 0.2
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 8192
Image Size: '224'
Weight Decay: 0.0001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/resnest.py#L170
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest50-528c19ca.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 80.96%
Top 5 Accuracy: 95.38%
- Name: resnest50d_1s4x24d
In Collection: ResNeSt
Metadata:
FLOPs: 5686764544
Parameters: 25680000
File Size: 103045531
Architecture:
- 1x1 Convolution
- Convolution
- Dense Connections
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Connection
- Softmax
- Split Attention
Tasks:
- Image Classification
Training Techniques:
- AutoAugment
- DropBlock
- Label Smoothing
- Mixup
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 64x NVIDIA V100 GPUs
ID: resnest50d_1s4x24d
LR: 0.1
Epochs: 270
Layers: 50
Dropout: 0.2
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 8192
Image Size: '224'
Weight Decay: 0.0001
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/resnest.py#L229
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest50_fast_1s4x24d-d4a4f76f.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 81.0%
Top 5 Accuracy: 95.33%
- Name: resnest50d_4s2x40d
In Collection: ResNeSt
Metadata:
FLOPs: 5657064720
Parameters: 30420000
File Size: 122133282
Architecture:
- 1x1 Convolution
- Convolution
- Dense Connections
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Connection
- Softmax
- Split Attention
Tasks:
- Image Classification
Training Techniques:
- AutoAugment
- DropBlock
- Label Smoothing
- Mixup
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 64x NVIDIA V100 GPUs
ID: resnest50d_4s2x40d
LR: 0.1
Epochs: 270
Layers: 50
Dropout: 0.2
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 8192
Image Size: '224'
Weight Decay: 0.0001
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/resnest.py#L218
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest50_fast_4s2x40d-41d14ed0.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 81.11%
Top 5 Accuracy: 95.55%
-->
<!-- hf_public_repos/pytorch-image-models/hfdocs/source/models/csp-resnext.mdx -->
# CSP-ResNeXt
**CSPResNeXt** is a convolutional neural network where we apply the Cross Stage Partial Network (CSPNet) approach to [ResNeXt](https://paperswithcode.com/method/resnext). The CSPNet partitions the feature map of the base layer into two parts and then merges them through a cross-stage hierarchy. The use of a split and merge strategy allows for more gradient flow through the network.
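The split-and-merge strategy can be sketched with a toy stage: half the channels bypass the stage body entirely and are merged back by a 1×1 convolution at the end. `ToyCSPStage` is invented for illustration and does not mirror timm's `cspnet.py`:

```py
import torch
import torch.nn as nn

class ToyCSPStage(nn.Module):
    """Illustrative CSP-style stage: split channels, run only one part
    through the (heavy) stage body, then merge across the stage."""
    def __init__(self, channels, body):
        super().__init__()
        self.half = channels // 2
        self.body = body  # e.g. a stack of ResNeXt blocks
        self.merge = nn.Conv2d(channels, channels, 1)

    def forward(self, x):
        x1, x2 = x[:, :self.half], x[:, self.half:]
        y2 = self.body(x2)  # only x2 goes through the stage body
        return self.merge(torch.cat([x1, y2], dim=1))  # cross-stage merge

stage = ToyCSPStage(64, nn.Conv2d(32, 32, 3, padding=1))
print(stage(torch.randn(1, 64, 28, 28)).shape)
```

Because `x1` reaches the merge untouched, its gradient path bypasses the stage body, which is the extra gradient flow the split-and-merge strategy provides.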
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('cspresnext50', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 prediction class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # Samoyed 0.6425196528434753
>>> # Pomeranian 0.04062102362513542
>>> # keeshond 0.03186424449086189
>>> # white wolf 0.01739676296710968
>>> # Eskimo dog 0.011717947199940681
```
Replace the model name with the variant you want to use, e.g. `cspresnext50`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction); just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('cspresnext50', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model from scratch.
## Citation
```BibTeX
@misc{wang2019cspnet,
title={CSPNet: A New Backbone that can Enhance Learning Capability of CNN},
author={Chien-Yao Wang and Hong-Yuan Mark Liao and I-Hau Yeh and Yueh-Hua Wu and Ping-Yang Chen and Jun-Wei Hsieh},
year={2019},
eprint={1911.11929},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
<!--
Type: model-index
Collections:
- Name: CSP ResNeXt
Paper:
Title: 'CSPNet: A New Backbone that can Enhance Learning Capability of CNN'
URL: https://paperswithcode.com/paper/cspnet-a-new-backbone-that-can-enhance
Models:
- Name: cspresnext50
In Collection: CSP ResNeXt
Metadata:
FLOPs: 3962945536
Parameters: 20570000
File Size: 82562887
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Global Average Pooling
- Grouped Convolution
- Max Pooling
- ReLU
- ResNeXt Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Techniques:
- Label Smoothing
- Polynomial Learning Rate Decay
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 1x GPU
ID: cspresnext50
LR: 0.1
Layers: 50
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 128
Image Size: '224'
Weight Decay: 0.005
Interpolation: bilinear
Training Steps: 8000000
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/cspnet.py#L430
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/cspresnext50_ra_224-648b4713.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 80.05%
Top 5 Accuracy: 94.94%
-->
<!-- hf_public_repos/pytorch-image-models/hfdocs/source/models/res2next.mdx -->
# Res2NeXt
**Res2NeXt** is an image model that employs a variation on [ResNeXt](https://paperswithcode.com/method/resnext) bottleneck residual blocks. The motivation is to represent features at multiple scales. This is achieved through a novel building block for CNNs that constructs hierarchical residual-like connections within a single residual block, representing multi-scale features at a granular level and increasing the range of receptive fields for each network layer.
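A minimal sketch of the hierarchical connections, assuming a scale of 4: channels are split into groups, and each group's 3×3 convolution also receives the previous group's output, so later groups see progressively larger receptive fields. `ToyRes2Branch` is illustrative only, not the timm implementation:

```py
import torch
import torch.nn as nn

class ToyRes2Branch(nn.Module):
    """Split channels into `scale` groups; the first group passes through,
    each later group's 3x3 conv also receives the previous group's output."""
    def __init__(self, channels, scale=4):
        super().__init__()
        self.width = channels // scale
        self.convs = nn.ModuleList(
            nn.Conv2d(self.width, self.width, 3, padding=1)
            for _ in range(scale - 1)
        )

    def forward(self, x):
        splits = torch.split(x, self.width, dim=1)
        outs = [splits[0]]  # first group: identity path
        prev = None
        for i, conv in enumerate(self.convs):
            inp = splits[i + 1] if prev is None else splits[i + 1] + prev
            prev = conv(inp)  # hierarchical residual-like connection
            outs.append(prev)
        return torch.cat(outs, dim=1)

branch = ToyRes2Branch(64, scale=4)
print(branch(torch.randn(1, 64, 14, 14)).shape)
```

After the concatenation, group `k` has passed through `k` stacked 3×3 convolutions, giving the single block a range of effective receptive fields.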
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('res2next50', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 prediction class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # Samoyed 0.6425196528434753
>>> # Pomeranian 0.04062102362513542
>>> # keeshond 0.03186424449086189
>>> # white wolf 0.01739676296710968
>>> # Eskimo dog 0.011717947199940681
```
Replace the model name with the variant you want to use, e.g. `res2next50`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction); just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('res2next50', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model from scratch.
## Citation
```BibTeX
@article{Gao_2021,
title={Res2Net: A New Multi-Scale Backbone Architecture},
volume={43},
ISSN={1939-3539},
url={http://dx.doi.org/10.1109/TPAMI.2019.2938758},
DOI={10.1109/tpami.2019.2938758},
number={2},
journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
publisher={Institute of Electrical and Electronics Engineers (IEEE)},
author={Gao, Shang-Hua and Cheng, Ming-Ming and Zhao, Kai and Zhang, Xin-Yu and Yang, Ming-Hsuan and Torr, Philip},
year={2021},
month={Feb},
pages={652–662}
}
```
<!--
Type: model-index
Collections:
- Name: Res2NeXt
Paper:
Title: 'Res2Net: A New Multi-scale Backbone Architecture'
URL: https://paperswithcode.com/paper/res2net-a-new-multi-scale-backbone
Models:
- Name: res2next50
In Collection: Res2NeXt
Metadata:
FLOPs: 5396798208
Parameters: 24670000
File Size: 99019592
Architecture:
- Batch Normalization
- Convolution
- Global Average Pooling
- ReLU
- Res2NeXt Block
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 4x Titan Xp GPUs
ID: res2next50
LR: 0.1
Epochs: 100
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 256
Image Size: '224'
Weight Decay: 0.0001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/res2net.py#L207
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2next50_4s-6ef7e7bf.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 78.24%
Top 5 Accuracy: 93.91%
-->
# ResNet-D
**ResNet-D** is a modification of the [ResNet](https://paperswithcode.com/method/resnet) architecture that adds an [average pooling](https://paperswithcode.com/method/average-pooling) tweak to the downsampling path. The motivation is that in the unmodified ResNet, the stride-2 [1×1 convolution](https://paperswithcode.com/method/1x1-convolution) in the downsampling block ignores 3/4 of the input feature map; the tweak ensures no information is discarded.
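The average-pooling tweak can be sketched in a few lines of PyTorch. This is an illustrative example, not timm's implementation; the function name `resnet_d_downsample` and the channel counts are assumptions for the sketch.

```python
import torch
import torch.nn as nn

def resnet_d_downsample(in_ch: int, out_ch: int, stride: int = 2) -> nn.Sequential:
    """ResNet-D-style shortcut: average pool with stride 2, then a stride-1
    1x1 convolution, so every input activation contributes to the output
    (unlike a stride-2 1x1 conv, which skips 3/4 of the positions)."""
    return nn.Sequential(
        nn.AvgPool2d(kernel_size=2, stride=stride, ceil_mode=True),
        nn.Conv2d(in_ch, out_ch, kernel_size=1, stride=1, bias=False),
        nn.BatchNorm2d(out_ch),
    )

shortcut = resnet_d_downsample(64, 128)
x = torch.randn(1, 64, 56, 56)
print(shortcut(x).shape)  # torch.Size([1, 128, 28, 28])
```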
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('resnet101d', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 predictions class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `resnet101d`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction); just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('resnet101d', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model afresh.
## Citation
```BibTeX
@misc{he2018bag,
title={Bag of Tricks for Image Classification with Convolutional Neural Networks},
author={Tong He and Zhi Zhang and Hang Zhang and Zhongyue Zhang and Junyuan Xie and Mu Li},
year={2018},
eprint={1812.01187},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
<!--
Type: model-index
Collections:
- Name: ResNet-D
Paper:
Title: Bag of Tricks for Image Classification with Convolutional Neural Networks
URL: https://paperswithcode.com/paper/bag-of-tricks-for-image-classification-with
Models:
- Name: resnet101d
In Collection: ResNet-D
Metadata:
FLOPs: 13805639680
Parameters: 44570000
File Size: 178791263
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: resnet101d
Crop Pct: '0.94'
Image Size: '256'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/resnet.py#L716
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet101d_ra2-2803ffab.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 82.31%
Top 5 Accuracy: 96.06%
- Name: resnet152d
In Collection: ResNet-D
Metadata:
FLOPs: 20155275264
Parameters: 60210000
File Size: 241596837
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: resnet152d
Crop Pct: '0.94'
Image Size: '256'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/resnet.py#L724
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet152d_ra2-5cac0439.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 83.13%
Top 5 Accuracy: 96.35%
- Name: resnet18d
In Collection: ResNet-D
Metadata:
FLOPs: 2645205760
Parameters: 11710000
File Size: 46893231
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: resnet18d
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/resnet.py#L649
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet18d_ra2-48a79e06.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 72.27%
Top 5 Accuracy: 90.69%
- Name: resnet200d
In Collection: ResNet-D
Metadata:
FLOPs: 26034378752
Parameters: 64690000
File Size: 259662933
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: resnet200d
Crop Pct: '0.94'
Image Size: '256'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/resnet.py#L749
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet200d_ra2-bdba9bf9.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 83.24%
Top 5 Accuracy: 96.49%
- Name: resnet26d
In Collection: ResNet-D
Metadata:
FLOPs: 3335276032
Parameters: 16010000
File Size: 64209122
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: resnet26d
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/resnet.py#L683
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet26d-69e92c46.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 76.69%
Top 5 Accuracy: 93.15%
- Name: resnet34d
In Collection: ResNet-D
Metadata:
FLOPs: 5026601728
Parameters: 21820000
File Size: 87369807
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: resnet34d
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/resnet.py#L666
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet34d_ra2-f8dcfcaf.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 77.11%
Top 5 Accuracy: 93.38%
- Name: resnet50d
In Collection: ResNet-D
Metadata:
FLOPs: 5591002624
Parameters: 25580000
File Size: 102567109
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: resnet50d
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/resnet.py#L699
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet50d_ra2-464e36ba.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 80.55%
Top 5 Accuracy: 95.16%
-->
# ESE-VoVNet
**VoVNet** is a convolutional neural network that seeks to make [DenseNet](https://paperswithcode.com/method/densenet) more efficient by concatenating all features only once, in the last feature map. This keeps the input channel size of each layer constant and makes it possible to enlarge the output channels.
Read about [one-shot aggregation here](https://paperswithcode.com/method/one-shot-aggregation).
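One-shot aggregation can be sketched as follows in PyTorch. This is a hedged, minimal sketch, not timm's `vovnet.py` code: the class name `OSABlock` and the channel counts are assumptions, and the ESE attention and stem layers of the full model are omitted.

```python
import torch
import torch.nn as nn

class OSABlock(nn.Module):
    """One-shot aggregation: each 3x3 conv feeds only the next one, and all
    intermediate outputs (plus the input) are concatenated exactly once at
    the end -- unlike DenseNet, which concatenates at every layer."""
    def __init__(self, in_ch: int, mid_ch: int, out_ch: int, num_convs: int = 5):
        super().__init__()
        self.convs = nn.ModuleList()
        ch = in_ch
        for _ in range(num_convs):
            self.convs.append(nn.Sequential(
                nn.Conv2d(ch, mid_ch, 3, padding=1, bias=False),
                nn.BatchNorm2d(mid_ch),
                nn.ReLU(inplace=True)))
            ch = mid_ch  # every conv after the first sees a constant input size
        # single concatenation of the input and all intermediate features
        self.concat_conv = nn.Conv2d(in_ch + num_convs * mid_ch, out_ch, 1, bias=False)

    def forward(self, x):
        feats = [x]
        for conv in self.convs:
            x = conv(x)
            feats.append(x)
        return self.concat_conv(torch.cat(feats, dim=1))

block = OSABlock(64, 32, 128)
print(block(torch.randn(1, 64, 28, 28)).shape)  # torch.Size([1, 128, 28, 28])
```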
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('ese_vovnet19b_dw', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 predictions class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `ese_vovnet19b_dw`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction); just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('ese_vovnet19b_dw', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model afresh.
## Citation
```BibTeX
@misc{lee2019energy,
title={An Energy and GPU-Computation Efficient Backbone Network for Real-Time Object Detection},
author={Youngwan Lee and Joong-won Hwang and Sangrok Lee and Yuseok Bae and Jongyoul Park},
year={2019},
eprint={1904.09730},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
<!--
Type: model-index
Collections:
- Name: ESE VovNet
Paper:
Title: 'CenterMask : Real-Time Anchor-Free Instance Segmentation'
URL: https://paperswithcode.com/paper/centermask-real-time-anchor-free-instance-1
Models:
- Name: ese_vovnet19b_dw
In Collection: ESE VovNet
Metadata:
FLOPs: 1711959904
Parameters: 6540000
File Size: 26243175
Architecture:
- Batch Normalization
- Convolution
- Max Pooling
- One-Shot Aggregation
- ReLU
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: ese_vovnet19b_dw
Layers: 19
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/vovnet.py#L361
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ese_vovnet19b_dw-a8741004.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 76.82%
Top 5 Accuracy: 93.28%
- Name: ese_vovnet39b
In Collection: ESE VovNet
Metadata:
FLOPs: 9089259008
Parameters: 24570000
File Size: 98397138
Architecture:
- Batch Normalization
- Convolution
- Max Pooling
- One-Shot Aggregation
- ReLU
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: ese_vovnet39b
Layers: 39
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/vovnet.py#L371
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ese_vovnet39b-f912fe73.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 79.31%
Top 5 Accuracy: 94.72%
-->
# SE-ResNeXt
**SE-ResNeXt** is a variant of a [ResNeXt](https://www.paperswithcode.com/method/resneXt) that employs [squeeze-and-excitation blocks](https://paperswithcode.com/method/squeeze-and-excitation-block) to enable the network to perform dynamic channel-wise feature recalibration.
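A squeeze-and-excitation block can be sketched in a few lines of PyTorch. This is a minimal illustration of the mechanism, not timm's implementation; the class name `SEBlock` and the reduction ratio of 16 are assumptions taken from the original paper's defaults.

```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Squeeze: global average pool to a per-channel descriptor.
    Excitation: a bottleneck MLP with a sigmoid produces per-channel
    scales, which recalibrate the input feature map."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid())

    def forward(self, x):
        b, c, _, _ = x.shape
        scale = self.fc(x.mean(dim=(2, 3)))  # squeeze -> (b, c)
        return x * scale.view(b, c, 1, 1)    # channel-wise recalibration

se = SEBlock(64)
print(se(torch.randn(2, 64, 8, 8)).shape)  # torch.Size([2, 64, 8, 8])
```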
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('seresnext26d_32x4d', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 predictions class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `seresnext26d_32x4d`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction); just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('seresnext26d_32x4d', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model afresh.
## Citation
```BibTeX
@misc{hu2019squeezeandexcitation,
title={Squeeze-and-Excitation Networks},
author={Jie Hu and Li Shen and Samuel Albanie and Gang Sun and Enhua Wu},
year={2019},
eprint={1709.01507},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
<!--
Type: model-index
Collections:
- Name: SEResNeXt
Paper:
Title: Squeeze-and-Excitation Networks
URL: https://paperswithcode.com/paper/squeeze-and-excitation-networks
Models:
- Name: seresnext26d_32x4d
In Collection: SEResNeXt
Metadata:
FLOPs: 3507053024
Parameters: 16810000
File Size: 67425193
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Global Average Pooling
- Grouped Convolution
- Max Pooling
- ReLU
- ResNeXt Block
- Residual Connection
- Softmax
- Squeeze-and-Excitation Block
Tasks:
- Image Classification
Training Techniques:
- Label Smoothing
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 8x NVIDIA Titan X GPUs
ID: seresnext26d_32x4d
LR: 0.6
Epochs: 100
Layers: 26
Dropout: 0.2
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 1024
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/a7f95818e44b281137503bcf4b3e3e94d8ffa52f/timm/models/resnet.py#L1234
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26d_32x4d-80fa48a3.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 77.59%
Top 5 Accuracy: 93.61%
- Name: seresnext26t_32x4d
In Collection: SEResNeXt
Metadata:
FLOPs: 3466436448
Parameters: 16820000
File Size: 67414838
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Global Average Pooling
- Grouped Convolution
- Max Pooling
- ReLU
- ResNeXt Block
- Residual Connection
- Softmax
- Squeeze-and-Excitation Block
Tasks:
- Image Classification
Training Techniques:
- Label Smoothing
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 8x NVIDIA Titan X GPUs
ID: seresnext26t_32x4d
LR: 0.6
Epochs: 100
Layers: 26
Dropout: 0.2
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 1024
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/a7f95818e44b281137503bcf4b3e3e94d8ffa52f/timm/models/resnet.py#L1246
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26tn_32x4d-569cb627.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 77.99%
Top 5 Accuracy: 93.73%
- Name: seresnext50_32x4d
In Collection: SEResNeXt
Metadata:
FLOPs: 5475179184
Parameters: 27560000
File Size: 110569859
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Global Average Pooling
- Grouped Convolution
- Max Pooling
- ReLU
- ResNeXt Block
- Residual Connection
- Softmax
- Squeeze-and-Excitation Block
Tasks:
- Image Classification
Training Techniques:
- Label Smoothing
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 8x NVIDIA Titan X GPUs
ID: seresnext50_32x4d
LR: 0.6
Epochs: 100
Layers: 50
Dropout: 0.2
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 1024
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/a7f95818e44b281137503bcf4b3e3e94d8ffa52f/timm/models/resnet.py#L1267
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext50_32x4d_racm-a304a460.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 81.27%
Top 5 Accuracy: 95.62%
-->
# ECA-ResNet
An **ECA-ResNet** is a variant of a [ResNet](https://paperswithcode.com/method/resnet) that utilises an [Efficient Channel Attention module](https://paperswithcode.com/method/efficient-channel-attention). Efficient Channel Attention is an architectural unit, based on [squeeze-and-excitation blocks](https://paperswithcode.com/method/squeeze-and-excitation-block), that reduces model complexity while avoiding dimensionality reduction.
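The ECA module can be sketched as follows in PyTorch. This is an illustrative sketch, not timm's implementation: the class name `ECAModule` is an assumption, and the kernel size is fixed at 3 rather than adapted to the channel count as in the paper.

```python
import torch
import torch.nn as nn

class ECAModule(nn.Module):
    """Like squeeze-and-excitation, but the excitation is a single 1D
    convolution over the pooled channel descriptor: local cross-channel
    interaction with no dimensionality-reducing bottleneck."""
    def __init__(self, kernel_size: int = 3):
        super().__init__()
        self.conv = nn.Conv1d(1, 1, kernel_size, padding=kernel_size // 2, bias=False)

    def forward(self, x):
        y = x.mean(dim=(2, 3))                    # squeeze -> (b, c) descriptor
        y = self.conv(y.unsqueeze(1)).squeeze(1)  # 1D conv across channels
        return x * torch.sigmoid(y)[:, :, None, None]

eca = ECAModule()
print(eca(torch.randn(2, 64, 8, 8)).shape)  # torch.Size([2, 64, 8, 8])
```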
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('ecaresnet101d', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 predictions class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `ecaresnet101d`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction); just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('ecaresnet101d', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model afresh.
## Citation
```BibTeX
@misc{wang2020ecanet,
title={ECA-Net: Efficient Channel Attention for Deep Convolutional Neural Networks},
author={Qilong Wang and Banggu Wu and Pengfei Zhu and Peihua Li and Wangmeng Zuo and Qinghua Hu},
year={2020},
eprint={1910.03151},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
<!--
Type: model-index
Collections:
- Name: ECAResNet
Paper:
Title: 'ECA-Net: Efficient Channel Attention for Deep Convolutional Neural Networks'
URL: https://paperswithcode.com/paper/eca-net-efficient-channel-attention-for-deep
Models:
- Name: ecaresnet101d
In Collection: ECAResNet
Metadata:
FLOPs: 10377193728
Parameters: 44570000
File Size: 178815067
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Efficient Channel Attention
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
- Squeeze-and-Excitation Block
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 4x RTX 2080Ti GPUs
ID: ecaresnet101d
LR: 0.1
Epochs: 100
Layers: 101
Crop Pct: '0.875'
Batch Size: 256
Image Size: '224'
Weight Decay: 0.0001
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/a7f95818e44b281137503bcf4b3e3e94d8ffa52f/timm/models/resnet.py#L1087
Weights: https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet101D_281c5844.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 82.18%
Top 5 Accuracy: 96.06%
- Name: ecaresnet101d_pruned
In Collection: ECAResNet
Metadata:
FLOPs: 4463972081
Parameters: 24880000
File Size: 99852736
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Efficient Channel Attention
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
- Squeeze-and-Excitation Block
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
ID: ecaresnet101d_pruned
Layers: 101
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/a7f95818e44b281137503bcf4b3e3e94d8ffa52f/timm/models/resnet.py#L1097
Weights: https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45610/outputs/ECAResNet101D_P_75a3370e.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 80.82%
Top 5 Accuracy: 95.64%
- Name: ecaresnet50d
In Collection: ECAResNet
Metadata:
FLOPs: 5591090432
Parameters: 25580000
File Size: 102579290
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Efficient Channel Attention
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
- Squeeze-and-Excitation Block
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 4x RTX 2080Ti GPUs
ID: ecaresnet50d
LR: 0.1
Epochs: 100
Layers: 50
Crop Pct: '0.875'
Batch Size: 256
Image Size: '224'
Weight Decay: 0.0001
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/a7f95818e44b281137503bcf4b3e3e94d8ffa52f/timm/models/resnet.py#L1045
Weights: https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet50D_833caf58.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 80.61%
Top 5 Accuracy: 95.31%
- Name: ecaresnet50d_pruned
In Collection: ECAResNet
Metadata:
FLOPs: 3250730657
Parameters: 19940000
File Size: 79990436
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Efficient Channel Attention
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
- Squeeze-and-Excitation Block
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
ID: ecaresnet50d_pruned
Layers: 50
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/a7f95818e44b281137503bcf4b3e3e94d8ffa52f/timm/models/resnet.py#L1055
Weights: https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45899/outputs/ECAResNet50D_P_9c67f710.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 79.71%
Top 5 Accuracy: 94.88%
- Name: ecaresnetlight
In Collection: ECAResNet
Metadata:
FLOPs: 5276118784
Parameters: 30160000
File Size: 120956612
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Efficient Channel Attention
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
- Squeeze-and-Excitation Block
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
ID: ecaresnetlight
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/a7f95818e44b281137503bcf4b3e3e94d8ffa52f/timm/models/resnet.py#L1077
Weights: https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNetLight_4f34b35b.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 80.46%
Top 5 Accuracy: 95.25%
-->
# FBNet
**FBNet** is a family of convolutional neural architectures discovered through [DNAS](https://paperswithcode.com/method/dnas) neural architecture search. It is built from a basic image-model block inspired by [MobileNetV2](https://paperswithcode.com/method/mobilenetv2) that uses depthwise convolutions and an inverted residual structure.
The principal building block is the [FBNet Block](https://paperswithcode.com/method/fbnet-block).
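The inverted residual structure behind this block family can be sketched in PyTorch. This is a hedged sketch of the general MobileNetV2-style pattern, not timm's FBNet code; the class name `InvertedResidual` and the expansion factor of 6 are illustrative assumptions (FBNet's search varies expansion, kernel size, and group counts per block).

```python
import torch
import torch.nn as nn

class InvertedResidual(nn.Module):
    """1x1 expand -> 3x3 depthwise -> 1x1 project, with a skip connection
    when the input and output shapes match. The 'inverted' part: the wide
    layers are in the middle, and the residual connects the narrow ends."""
    def __init__(self, in_ch: int, out_ch: int, stride: int = 1, expansion: int = 6):
        super().__init__()
        mid = in_ch * expansion
        self.use_skip = stride == 1 and in_ch == out_ch
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, mid, 1, bias=False),  # pointwise expand
            nn.BatchNorm2d(mid), nn.ReLU(inplace=True),
            nn.Conv2d(mid, mid, 3, stride=stride, padding=1, groups=mid, bias=False),  # depthwise
            nn.BatchNorm2d(mid), nn.ReLU(inplace=True),
            nn.Conv2d(mid, out_ch, 1, bias=False),  # pointwise project (linear)
            nn.BatchNorm2d(out_ch))

    def forward(self, x):
        out = self.block(x)
        return x + out if self.use_skip else out

block = InvertedResidual(32, 64, stride=2)
print(block(torch.randn(1, 32, 28, 28)).shape)  # torch.Size([1, 64, 14, 14])
```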
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('fbnetc_100', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 predictions class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `fbnetc_100`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction); just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('fbnetc_100', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model afresh.
## Citation
```BibTeX
@misc{wu2019fbnet,
title={FBNet: Hardware-Aware Efficient ConvNet Design via Differentiable Neural Architecture Search},
author={Bichen Wu and Xiaoliang Dai and Peizhao Zhang and Yanghan Wang and Fei Sun and Yiming Wu and Yuandong Tian and Peter Vajda and Yangqing Jia and Kurt Keutzer},
year={2019},
eprint={1812.03443},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
<!--
Type: model-index
Collections:
- Name: FBNet
Paper:
Title: 'FBNet: Hardware-Aware Efficient ConvNet Design via Differentiable Neural
Architecture Search'
URL: https://paperswithcode.com/paper/fbnet-hardware-aware-efficient-convnet-design
Models:
- Name: fbnetc_100
In Collection: FBNet
Metadata:
FLOPs: 508940064
Parameters: 5570000
File Size: 22525094
Architecture:
- 1x1 Convolution
- Convolution
- Dense Connections
- Dropout
- FBNet Block
- Global Average Pooling
- Softmax
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 8x GPUs
ID: fbnetc_100
LR: 0.1
Epochs: 360
Layers: 22
Dropout: 0.2
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 256
Image Size: '224'
Weight Decay: 0.0005
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/efficientnet.py#L985
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/fbnetc_100-c345b898.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 75.12%
Top 5 Accuracy: 92.37%
-->
hf_public_repos/pytorch-image-models/hfdocs/source/models/resnet.mdx
# ResNet
**Residual Networks**, or **ResNets**, learn residual functions with reference to the layer inputs, instead of learning unreferenced functions. Instead of hoping each stack of layers directly fits a desired underlying mapping, residual nets let these layers fit a residual mapping. They stack [residual blocks](https://paperswithcode.com/method/residual-block) on top of each other to form networks: e.g. a ResNet-50 has fifty layers using these blocks.
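The residual mapping can be sketched as a minimal basic block; this is illustrative, not timm's exact implementation (which also handles strided and projection shortcuts):

```python
import torch
import torch.nn as nn

class BasicBlock(nn.Module):
    """y = relu(x + F(x)): the stacked layers learn the residual F, not the full mapping."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)
    def forward(self, x):
        residual = self.bn2(self.conv2(self.relu(self.bn1(self.conv1(x)))))
        return self.relu(x + residual)  # identity shortcut added before final ReLU

x = torch.randn(1, 64, 56, 56)
print(BasicBlock(64)(x).shape)  # torch.Size([1, 64, 56, 56])
```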
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('resnet18', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 predictions class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `resnet18`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction); just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('resnet18', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model afresh.
## Citation
```BibTeX
@article{DBLP:journals/corr/HeZRS15,
author = {Kaiming He and
Xiangyu Zhang and
Shaoqing Ren and
Jian Sun},
title = {Deep Residual Learning for Image Recognition},
journal = {CoRR},
volume = {abs/1512.03385},
year = {2015},
url = {http://arxiv.org/abs/1512.03385},
archivePrefix = {arXiv},
eprint = {1512.03385},
timestamp = {Wed, 17 Apr 2019 17:23:45 +0200},
biburl = {https://dblp.org/rec/journals/corr/HeZRS15.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
<!--
Type: model-index
Collections:
- Name: ResNet
Paper:
Title: Deep Residual Learning for Image Recognition
URL: https://paperswithcode.com/paper/deep-residual-learning-for-image-recognition
Models:
- Name: resnet18
In Collection: ResNet
Metadata:
FLOPs: 2337073152
Parameters: 11690000
File Size: 46827520
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: resnet18
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/resnet.py#L641
Weights: https://download.pytorch.org/models/resnet18-5c106cde.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 69.74%
Top 5 Accuracy: 89.09%
- Name: resnet26
In Collection: ResNet
Metadata:
FLOPs: 3026804736
Parameters: 16000000
File Size: 64129972
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: resnet26
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/resnet.py#L675
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet26-9aa10e23.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 75.29%
Top 5 Accuracy: 92.57%
- Name: resnet34
In Collection: ResNet
Metadata:
FLOPs: 4718469120
Parameters: 21800000
File Size: 87290831
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: resnet34
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/resnet.py#L658
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet34-43635321.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 75.11%
Top 5 Accuracy: 92.28%
- Name: resnet50
In Collection: ResNet
Metadata:
FLOPs: 5282531328
Parameters: 25560000
File Size: 102488165
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: resnet50
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/resnet.py#L691
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet50_ram-a26f946b.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 79.04%
Top 5 Accuracy: 94.39%
- Name: resnetblur50
In Collection: ResNet
Metadata:
FLOPs: 6621606912
Parameters: 25560000
File Size: 102488165
Architecture:
- 1x1 Convolution
- Batch Normalization
- Blur Pooling
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: resnetblur50
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/resnet.py#L1160
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnetblur50-84f4748f.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 79.29%
Top 5 Accuracy: 94.64%
- Name: tv_resnet101
In Collection: ResNet
Metadata:
FLOPs: 10068547584
Parameters: 44550000
File Size: 178728960
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
ID: tv_resnet101
LR: 0.1
Epochs: 90
Crop Pct: '0.875'
LR Gamma: 0.1
Momentum: 0.9
Batch Size: 32
Image Size: '224'
LR Step Size: 30
Weight Decay: 0.0001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/resnet.py#L761
Weights: https://download.pytorch.org/models/resnet101-5d3b4d8f.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 77.37%
Top 5 Accuracy: 93.56%
- Name: tv_resnet152
In Collection: ResNet
Metadata:
FLOPs: 14857660416
Parameters: 60190000
File Size: 241530880
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
ID: tv_resnet152
LR: 0.1
Epochs: 90
Crop Pct: '0.875'
LR Gamma: 0.1
Momentum: 0.9
Batch Size: 32
Image Size: '224'
LR Step Size: 30
Weight Decay: 0.0001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/resnet.py#L769
Weights: https://download.pytorch.org/models/resnet152-b121ed2d.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 78.32%
Top 5 Accuracy: 94.05%
- Name: tv_resnet34
In Collection: ResNet
Metadata:
FLOPs: 4718469120
Parameters: 21800000
File Size: 87306240
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
ID: tv_resnet34
LR: 0.1
Epochs: 90
Crop Pct: '0.875'
LR Gamma: 0.1
Momentum: 0.9
Batch Size: 32
Image Size: '224'
LR Step Size: 30
Weight Decay: 0.0001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/resnet.py#L745
Weights: https://download.pytorch.org/models/resnet34-333f7ec4.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 73.3%
Top 5 Accuracy: 91.42%
- Name: tv_resnet50
In Collection: ResNet
Metadata:
FLOPs: 5282531328
Parameters: 25560000
File Size: 102502400
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
ID: tv_resnet50
LR: 0.1
Epochs: 90
Crop Pct: '0.875'
LR Gamma: 0.1
Momentum: 0.9
Batch Size: 32
Image Size: '224'
LR Step Size: 30
Weight Decay: 0.0001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/resnet.py#L753
Weights: https://download.pytorch.org/models/resnet50-19c8e357.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 76.16%
Top 5 Accuracy: 92.88%
-->
hf_public_repos/pytorch-image-models/hfdocs/source/models/gloun-inception-v3.mdx
# (Gluon) Inception v3
**Inception v3** is a convolutional neural network architecture from the Inception family that makes several improvements, including the use of [Label Smoothing](https://paperswithcode.com/method/label-smoothing), factorized 7 x 7 convolutions, and an [auxiliary classifier](https://paperswithcode.com/method/auxiliary-classifier) to propagate label information lower down the network (along with the use of batch normalization for layers in the side head). The key building block is an [Inception Module](https://paperswithcode.com/method/inception-v3-module).
The weights from this model were ported from [Gluon](https://cv.gluon.ai/model_zoo/classification.html).
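Label smoothing, one of the training refinements above, replaces the one-hot target with a softened distribution. A minimal sketch (the epsilon value and helper name are illustrative):

```python
import torch

def smooth_labels(targets, num_classes, eps=0.1):
    """Mix one-hot targets with a uniform distribution over all classes."""
    one_hot = torch.zeros(len(targets), num_classes)
    one_hot.scatter_(1, targets.unsqueeze(1), 1.0)
    # true class gets (1 - eps) + eps/K, every other class gets eps/K
    return one_hot * (1 - eps) + eps / num_classes

t = smooth_labels(torch.tensor([2]), num_classes=5)
print(t)  # tensor([[0.0200, 0.0200, 0.9200, 0.0200, 0.0200]])
```

Each row still sums to 1, so the smoothed targets remain a valid probability distribution for the cross-entropy loss.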
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('gluon_inception_v3', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 predictions class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `gluon_inception_v3`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction); just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('gluon_inception_v3', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model afresh.
## Citation
```BibTeX
@article{DBLP:journals/corr/SzegedyVISW15,
author = {Christian Szegedy and
Vincent Vanhoucke and
Sergey Ioffe and
Jonathon Shlens and
Zbigniew Wojna},
title = {Rethinking the Inception Architecture for Computer Vision},
journal = {CoRR},
volume = {abs/1512.00567},
year = {2015},
url = {http://arxiv.org/abs/1512.00567},
archivePrefix = {arXiv},
eprint = {1512.00567},
timestamp = {Mon, 13 Aug 2018 16:49:07 +0200},
biburl = {https://dblp.org/rec/journals/corr/SzegedyVISW15.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
<!--
Type: model-index
Collections:
- Name: Gloun Inception v3
Paper:
Title: Rethinking the Inception Architecture for Computer Vision
URL: https://paperswithcode.com/paper/rethinking-the-inception-architecture-for
Models:
- Name: gluon_inception_v3
In Collection: Gloun Inception v3
Metadata:
FLOPs: 7352418880
Parameters: 23830000
File Size: 95567055
Architecture:
- 1x1 Convolution
- Auxiliary Classifier
- Average Pooling
- Average Pooling
- Batch Normalization
- Convolution
- Dense Connections
- Dropout
- Inception-v3 Module
- Max Pooling
- ReLU
- Softmax
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: gluon_inception_v3
Crop Pct: '0.875'
Image Size: '299'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/inception_v3.py#L464
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_inception_v3-9f746940.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 78.8%
Top 5 Accuracy: 94.38%
-->
hf_public_repos/pytorch-image-models/hfdocs/source/models/res2net.mdx
# Res2Net
**Res2Net** is an image model that employs a variation on bottleneck residual blocks, [Res2Net Blocks](https://paperswithcode.com/method/res2net-block). The motivation is to represent features at multiple scales. This is achieved through a novel building block for CNNs that constructs hierarchical residual-like connections within a single residual block, representing multi-scale features at a granular level and increasing the range of receptive fields for each network layer.
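The hierarchical connections can be sketched as follows: channels are split into `scale` groups, and each 3x3 convolution also receives the previous group's output, so later groups see a progressively larger receptive field. This is an illustrative reduction of the Res2Net block, not timm's full implementation:

```python
import torch
import torch.nn as nn

class Res2NetSplit(nn.Module):
    """Multi-scale split: y1 = x1, y_i = K_i(x_i + y_{i-1}) for i > 1."""
    def __init__(self, channels, scale=4):
        super().__init__()
        assert channels % scale == 0
        self.scale = scale
        width = channels // scale
        self.convs = nn.ModuleList(
            nn.Conv2d(width, width, 3, padding=1, bias=False)
            for _ in range(scale - 1)
        )
    def forward(self, x):
        splits = torch.chunk(x, self.scale, dim=1)
        outs = [splits[0]]  # first group passes through untouched
        prev = None
        for conv, s in zip(self.convs, splits[1:]):
            prev = conv(s if prev is None else s + prev)  # hierarchical connection
            outs.append(prev)
        return torch.cat(outs, dim=1)

x = torch.randn(1, 64, 28, 28)
print(Res2NetSplit(64)(x).shape)  # torch.Size([1, 64, 28, 28])
```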
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('res2net101_26w_4s', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 predictions class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `res2net101_26w_4s`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction); just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('res2net101_26w_4s', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model afresh.
## Citation
```BibTeX
@article{Gao_2021,
title={Res2Net: A New Multi-Scale Backbone Architecture},
volume={43},
ISSN={1939-3539},
url={http://dx.doi.org/10.1109/TPAMI.2019.2938758},
DOI={10.1109/tpami.2019.2938758},
number={2},
journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
publisher={Institute of Electrical and Electronics Engineers (IEEE)},
author={Gao, Shang-Hua and Cheng, Ming-Ming and Zhao, Kai and Zhang, Xin-Yu and Yang, Ming-Hsuan and Torr, Philip},
year={2021},
month={Feb},
pages={652–662}
}
```
<!--
Type: model-index
Collections:
- Name: Res2Net
Paper:
Title: 'Res2Net: A New Multi-scale Backbone Architecture'
URL: https://paperswithcode.com/paper/res2net-a-new-multi-scale-backbone
Models:
- Name: res2net101_26w_4s
In Collection: Res2Net
Metadata:
FLOPs: 10415881200
Parameters: 45210000
File Size: 181456059
Architecture:
- Batch Normalization
- Convolution
- Global Average Pooling
- ReLU
- Res2Net Block
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 4x Titan Xp GPUs
ID: res2net101_26w_4s
LR: 0.1
Epochs: 100
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 256
Image Size: '224'
Weight Decay: 0.0001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/res2net.py#L152
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net101_26w_4s-02a759a1.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 79.19%
Top 5 Accuracy: 94.43%
- Name: res2net50_14w_8s
In Collection: Res2Net
Metadata:
FLOPs: 5403546768
Parameters: 25060000
File Size: 100638543
Architecture:
- Batch Normalization
- Convolution
- Global Average Pooling
- ReLU
- Res2Net Block
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 4x Titan Xp GPUs
ID: res2net50_14w_8s
LR: 0.1
Epochs: 100
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 256
Image Size: '224'
Weight Decay: 0.0001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/res2net.py#L196
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_14w_8s-6527dddc.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 78.14%
Top 5 Accuracy: 93.86%
- Name: res2net50_26w_4s
In Collection: Res2Net
Metadata:
FLOPs: 5499974064
Parameters: 25700000
File Size: 103110087
Architecture:
- Batch Normalization
- Convolution
- Global Average Pooling
- ReLU
- Res2Net Block
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 4x Titan Xp GPUs
ID: res2net50_26w_4s
LR: 0.1
Epochs: 100
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 256
Image Size: '224'
Weight Decay: 0.0001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/res2net.py#L141
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_26w_4s-06e79181.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 77.99%
Top 5 Accuracy: 93.85%
- Name: res2net50_26w_6s
In Collection: Res2Net
Metadata:
FLOPs: 8130156528
Parameters: 37050000
File Size: 148603239
Architecture:
- Batch Normalization
- Convolution
- Global Average Pooling
- ReLU
- Res2Net Block
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 4x Titan Xp GPUs
ID: res2net50_26w_6s
LR: 0.1
Epochs: 100
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 256
Image Size: '224'
Weight Decay: 0.0001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/res2net.py#L163
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_26w_6s-19041792.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 78.57%
Top 5 Accuracy: 94.12%
- Name: res2net50_26w_8s
In Collection: Res2Net
Metadata:
FLOPs: 10760338992
Parameters: 48400000
File Size: 194085165
Architecture:
- Batch Normalization
- Convolution
- Global Average Pooling
- ReLU
- Res2Net Block
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 4x Titan Xp GPUs
ID: res2net50_26w_8s
LR: 0.1
Epochs: 100
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 256
Image Size: '224'
Weight Decay: 0.0001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/res2net.py#L174
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_26w_8s-2c7c9f12.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 79.19%
Top 5 Accuracy: 94.37%
- Name: res2net50_48w_2s
In Collection: Res2Net
Metadata:
FLOPs: 5375291520
Parameters: 25290000
File Size: 101421406
Architecture:
- Batch Normalization
- Convolution
- Global Average Pooling
- ReLU
- Res2Net Block
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 4x Titan Xp GPUs
ID: res2net50_48w_2s
LR: 0.1
Epochs: 100
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 256
Image Size: '224'
Weight Decay: 0.0001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/res2net.py#L185
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_48w_2s-afed724a.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 77.53%
Top 5 Accuracy: 93.56%
-->
hf_public_repos/pytorch-image-models/hfdocs/source/models/advprop.mdx
# AdvProp (EfficientNet)
**AdvProp** is an adversarial training scheme that treats adversarial examples as additional training examples to prevent overfitting. Key to the method is the use of a separate auxiliary batch norm for adversarial examples, as they have a different underlying distribution from normal examples.
The weights from this model were ported from [Tensorflow/TPU](https://github.com/tensorflow/tpu).
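The separate auxiliary batch norm can be sketched as a module that routes clean and adversarial inputs through different normalization statistics while all other weights stay shared. This is an illustrative sketch, not the ported implementation:

```python
import torch
import torch.nn as nn

class DualBatchNorm(nn.Module):
    """Main BN for clean examples, auxiliary BN for adversarial examples.

    Each BN keeps its own running statistics, matching the distribution
    of the inputs routed through it; the surrounding convolutions are shared.
    """
    def __init__(self, channels):
        super().__init__()
        self.bn_clean = nn.BatchNorm2d(channels)
        self.bn_adv = nn.BatchNorm2d(channels)
    def forward(self, x, adversarial=False):
        return self.bn_adv(x) if adversarial else self.bn_clean(x)

bn = DualBatchNorm(16)
x = torch.randn(4, 16, 8, 8)
clean, adv = bn(x), bn(x, adversarial=True)
print(clean.shape, adv.shape)
```

At inference time only the main (clean) branch is used, which is why the ported weights behave like a standard EfficientNet.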
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('tf_efficientnet_b0_ap', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 predictions class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `tf_efficientnet_b0_ap`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction); just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('tf_efficientnet_b0_ap', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model from scratch.
## Citation
```BibTeX
@misc{xie2020adversarial,
title={Adversarial Examples Improve Image Recognition},
author={Cihang Xie and Mingxing Tan and Boqing Gong and Jiang Wang and Alan Yuille and Quoc V. Le},
year={2020},
eprint={1911.09665},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
<!--
Type: model-index
Collections:
- Name: AdvProp
Paper:
Title: Adversarial Examples Improve Image Recognition
URL: https://paperswithcode.com/paper/adversarial-examples-improve-image
Models:
- Name: tf_efficientnet_b0_ap
In Collection: AdvProp
Metadata:
FLOPs: 488688572
Parameters: 5290000
File Size: 21385973
Architecture:
- 1x1 Convolution
- Average Pooling
- Batch Normalization
- Convolution
- Dense Connections
- Dropout
- Inverted Residual Block
- Squeeze-and-Excitation Block
- Swish
Tasks:
- Image Classification
Training Techniques:
- AdvProp
- AutoAugment
- Label Smoothing
- RMSProp
- Stochastic Depth
- Weight Decay
Training Data:
- ImageNet
ID: tf_efficientnet_b0_ap
LR: 0.256
Epochs: 350
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 2048
Image Size: '224'
Weight Decay: 1.0e-05
Interpolation: bicubic
RMSProp Decay: 0.9
Label Smoothing: 0.1
BatchNorm Momentum: 0.99
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/efficientnet.py#L1334
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_ap-f262efe1.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 77.1%
Top 5 Accuracy: 93.26%
- Name: tf_efficientnet_b1_ap
In Collection: AdvProp
Metadata:
FLOPs: 883633200
Parameters: 7790000
File Size: 31515350
Architecture:
- 1x1 Convolution
- Average Pooling
- Batch Normalization
- Convolution
- Dense Connections
- Dropout
- Inverted Residual Block
- Squeeze-and-Excitation Block
- Swish
Tasks:
- Image Classification
Training Techniques:
- AdvProp
- AutoAugment
- Label Smoothing
- RMSProp
- Stochastic Depth
- Weight Decay
Training Data:
- ImageNet
ID: tf_efficientnet_b1_ap
LR: 0.256
Epochs: 350
Crop Pct: '0.882'
Momentum: 0.9
Batch Size: 2048
Image Size: '240'
Weight Decay: 1.0e-05
Interpolation: bicubic
RMSProp Decay: 0.9
Label Smoothing: 0.1
BatchNorm Momentum: 0.99
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/efficientnet.py#L1344
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_ap-44ef0a3d.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 79.28%
Top 5 Accuracy: 94.3%
- Name: tf_efficientnet_b2_ap
In Collection: AdvProp
Metadata:
FLOPs: 1234321170
Parameters: 9110000
File Size: 36800745
Architecture:
- 1x1 Convolution
- Average Pooling
- Batch Normalization
- Convolution
- Dense Connections
- Dropout
- Inverted Residual Block
- Squeeze-and-Excitation Block
- Swish
Tasks:
- Image Classification
Training Techniques:
- AdvProp
- AutoAugment
- Label Smoothing
- RMSProp
- Stochastic Depth
- Weight Decay
Training Data:
- ImageNet
ID: tf_efficientnet_b2_ap
LR: 0.256
Epochs: 350
Crop Pct: '0.89'
Momentum: 0.9
Batch Size: 2048
Image Size: '260'
Weight Decay: 1.0e-05
Interpolation: bicubic
RMSProp Decay: 0.9
Label Smoothing: 0.1
BatchNorm Momentum: 0.99
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/efficientnet.py#L1354
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_ap-2f8e7636.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 80.3%
Top 5 Accuracy: 95.03%
- Name: tf_efficientnet_b3_ap
In Collection: AdvProp
Metadata:
FLOPs: 2275247568
Parameters: 12230000
File Size: 49384538
Architecture:
- 1x1 Convolution
- Average Pooling
- Batch Normalization
- Convolution
- Dense Connections
- Dropout
- Inverted Residual Block
- Squeeze-and-Excitation Block
- Swish
Tasks:
- Image Classification
Training Techniques:
- AdvProp
- AutoAugment
- Label Smoothing
- RMSProp
- Stochastic Depth
- Weight Decay
Training Data:
- ImageNet
ID: tf_efficientnet_b3_ap
LR: 0.256
Epochs: 350
Crop Pct: '0.904'
Momentum: 0.9
Batch Size: 2048
Image Size: '300'
Weight Decay: 1.0e-05
Interpolation: bicubic
RMSProp Decay: 0.9
Label Smoothing: 0.1
BatchNorm Momentum: 0.99
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/efficientnet.py#L1364
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_ap-aad25bdd.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 81.82%
Top 5 Accuracy: 95.62%
- Name: tf_efficientnet_b4_ap
In Collection: AdvProp
Metadata:
FLOPs: 5749638672
Parameters: 19340000
File Size: 77993585
Architecture:
- 1x1 Convolution
- Average Pooling
- Batch Normalization
- Convolution
- Dense Connections
- Dropout
- Inverted Residual Block
- Squeeze-and-Excitation Block
- Swish
Tasks:
- Image Classification
Training Techniques:
- AdvProp
- AutoAugment
- Label Smoothing
- RMSProp
- Stochastic Depth
- Weight Decay
Training Data:
- ImageNet
ID: tf_efficientnet_b4_ap
LR: 0.256
Epochs: 350
Crop Pct: '0.922'
Momentum: 0.9
Batch Size: 2048
Image Size: '380'
Weight Decay: 1.0e-05
Interpolation: bicubic
RMSProp Decay: 0.9
Label Smoothing: 0.1
BatchNorm Momentum: 0.99
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/efficientnet.py#L1374
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_ap-dedb23e6.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 83.26%
Top 5 Accuracy: 96.39%
- Name: tf_efficientnet_b5_ap
In Collection: AdvProp
Metadata:
FLOPs: 13176501888
Parameters: 30390000
File Size: 122403150
Architecture:
- 1x1 Convolution
- Average Pooling
- Batch Normalization
- Convolution
- Dense Connections
- Dropout
- Inverted Residual Block
- Squeeze-and-Excitation Block
- Swish
Tasks:
- Image Classification
Training Techniques:
- AdvProp
- AutoAugment
- Label Smoothing
- RMSProp
- Stochastic Depth
- Weight Decay
Training Data:
- ImageNet
ID: tf_efficientnet_b5_ap
LR: 0.256
Epochs: 350
Crop Pct: '0.934'
Momentum: 0.9
Batch Size: 2048
Image Size: '456'
Weight Decay: 1.0e-05
Interpolation: bicubic
RMSProp Decay: 0.9
Label Smoothing: 0.1
BatchNorm Momentum: 0.99
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/efficientnet.py#L1384
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ap-9e82fae8.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 84.25%
Top 5 Accuracy: 96.97%
- Name: tf_efficientnet_b6_ap
In Collection: AdvProp
Metadata:
FLOPs: 24180518488
Parameters: 43040000
File Size: 173237466
Architecture:
- 1x1 Convolution
- Average Pooling
- Batch Normalization
- Convolution
- Dense Connections
- Dropout
- Inverted Residual Block
- Squeeze-and-Excitation Block
- Swish
Tasks:
- Image Classification
Training Techniques:
- AdvProp
- AutoAugment
- Label Smoothing
- RMSProp
- Stochastic Depth
- Weight Decay
Training Data:
- ImageNet
ID: tf_efficientnet_b6_ap
LR: 0.256
Epochs: 350
Crop Pct: '0.942'
Momentum: 0.9
Batch Size: 2048
Image Size: '528'
Weight Decay: 1.0e-05
Interpolation: bicubic
RMSProp Decay: 0.9
Label Smoothing: 0.1
BatchNorm Momentum: 0.99
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/efficientnet.py#L1394
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_ap-4ffb161f.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 84.79%
Top 5 Accuracy: 97.14%
- Name: tf_efficientnet_b7_ap
In Collection: AdvProp
Metadata:
FLOPs: 48205304880
Parameters: 66349999
File Size: 266850607
Architecture:
- 1x1 Convolution
- Average Pooling
- Batch Normalization
- Convolution
- Dense Connections
- Dropout
- Inverted Residual Block
- Squeeze-and-Excitation Block
- Swish
Tasks:
- Image Classification
Training Techniques:
- AdvProp
- AutoAugment
- Label Smoothing
- RMSProp
- Stochastic Depth
- Weight Decay
Training Data:
- ImageNet
ID: tf_efficientnet_b7_ap
LR: 0.256
Epochs: 350
Crop Pct: '0.949'
Momentum: 0.9
Batch Size: 2048
Image Size: '600'
Weight Decay: 1.0e-05
Interpolation: bicubic
RMSProp Decay: 0.9
Label Smoothing: 0.1
BatchNorm Momentum: 0.99
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/efficientnet.py#L1405
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ap-ddb28fec.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 85.12%
Top 5 Accuracy: 97.25%
- Name: tf_efficientnet_b8_ap
In Collection: AdvProp
Metadata:
FLOPs: 80962956270
Parameters: 87410000
File Size: 351412563
Architecture:
- 1x1 Convolution
- Average Pooling
- Batch Normalization
- Convolution
- Dense Connections
- Dropout
- Inverted Residual Block
- Squeeze-and-Excitation Block
- Swish
Tasks:
- Image Classification
Training Techniques:
- AdvProp
- AutoAugment
- Label Smoothing
- RMSProp
- Stochastic Depth
- Weight Decay
Training Data:
- ImageNet
ID: tf_efficientnet_b8_ap
LR: 0.128
Epochs: 350
Crop Pct: '0.954'
Momentum: 0.9
Batch Size: 2048
Image Size: '672'
Weight Decay: 1.0e-05
Interpolation: bicubic
RMSProp Decay: 0.9
Label Smoothing: 0.1
BatchNorm Momentum: 0.99
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/efficientnet.py#L1416
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b8_ap-00e169fa.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 85.37%
Top 5 Accuracy: 97.3%
-->
# (Legacy) SE-ResNeXt
**SE ResNeXt** is a variant of a [ResNeXt](https://www.paperswithcode.com/method/resnext) that employs [squeeze-and-excitation blocks](https://paperswithcode.com/method/squeeze-and-excitation-block) to enable the network to perform dynamic channel-wise feature recalibration.
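As an illustrative sketch (not the exact implementation behind the pretrained weights), the squeeze-and-excitation gate can be written in a few lines of PyTorch, using the paper's reduction ratio of 16:

```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Squeeze-and-excitation: global-pool to a channel descriptor,
    then gate each channel with a learned sigmoid weight."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        scale = self.fc(x.mean(dim=(2, 3)))  # squeeze: (B, C)
        return x * scale.view(b, c, 1, 1)    # excite: channel-wise gate

out = SEBlock(64)(torch.randn(2, 64, 8, 8))
print(out.shape)  # torch.Size([2, 64, 8, 8])
```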
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('legacy_seresnext101_32x4d', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 prediction class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `legacy_seresnext101_32x4d`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction); just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('legacy_seresnext101_32x4d', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model from scratch.
## Citation
```BibTeX
@misc{hu2019squeezeandexcitation,
title={Squeeze-and-Excitation Networks},
author={Jie Hu and Li Shen and Samuel Albanie and Gang Sun and Enhua Wu},
year={2019},
eprint={1709.01507},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
<!--
Type: model-index
Collections:
- Name: Legacy SE ResNeXt
Paper:
Title: Squeeze-and-Excitation Networks
URL: https://paperswithcode.com/paper/squeeze-and-excitation-networks
Models:
- Name: legacy_seresnext101_32x4d
In Collection: Legacy SE ResNeXt
Metadata:
FLOPs: 10287698672
Parameters: 48960000
File Size: 196466866
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Global Average Pooling
- Grouped Convolution
- Max Pooling
- ReLU
- ResNeXt Block
- Residual Connection
- Softmax
- Squeeze-and-Excitation Block
Tasks:
- Image Classification
Training Techniques:
- Label Smoothing
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 8x NVIDIA Titan X GPUs
ID: legacy_seresnext101_32x4d
LR: 0.6
Epochs: 100
Layers: 101
Dropout: 0.2
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 1024
Image Size: '224'
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/senet.py#L462
Weights: http://data.lip6.fr/cadene/pretrainedmodels/se_resnext101_32x4d-3b2fe3d8.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 80.23%
Top 5 Accuracy: 95.02%
- Name: legacy_seresnext26_32x4d
In Collection: Legacy SE ResNeXt
Metadata:
FLOPs: 3187342304
Parameters: 16790000
File Size: 67346327
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Global Average Pooling
- Grouped Convolution
- Max Pooling
- ReLU
- ResNeXt Block
- Residual Connection
- Softmax
- Squeeze-and-Excitation Block
Tasks:
- Image Classification
Training Techniques:
- Label Smoothing
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 8x NVIDIA Titan X GPUs
ID: legacy_seresnext26_32x4d
LR: 0.6
Epochs: 100
Layers: 26
Dropout: 0.2
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 1024
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/senet.py#L448
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26_32x4d-65ebdb501.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 77.11%
Top 5 Accuracy: 93.31%
- Name: legacy_seresnext50_32x4d
In Collection: Legacy SE ResNeXt
Metadata:
FLOPs: 5459954352
Parameters: 27560000
File Size: 110559176
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Global Average Pooling
- Grouped Convolution
- Max Pooling
- ReLU
- ResNeXt Block
- Residual Connection
- Softmax
- Squeeze-and-Excitation Block
Tasks:
- Image Classification
Training Techniques:
- Label Smoothing
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 8x NVIDIA Titan X GPUs
ID: legacy_seresnext50_32x4d
LR: 0.6
Epochs: 100
Layers: 50
Dropout: 0.2
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 1024
Image Size: '224'
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/senet.py#L455
Weights: http://data.lip6.fr/cadene/pretrainedmodels/se_resnext50_32x4d-a260b3a4.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 79.08%
Top 5 Accuracy: 94.43%
-->
# (Gluon) Xception
**Xception** is a convolutional neural network architecture that relies solely on [depthwise separable convolution](https://paperswithcode.com/method/depthwise-separable-convolution) layers.
The weights from this model were ported from [Gluon](https://cv.gluon.ai/model_zoo/classification.html).
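A depthwise separable convolution factorizes a standard convolution into a per-channel spatial convolution followed by a 1x1 pointwise convolution. A minimal sketch (illustrative channel counts, not Xception's exact layers):

```python
import torch
import torch.nn as nn

# Depthwise: one 3x3 filter per input channel (groups == in_channels).
depthwise = nn.Conv2d(32, 32, kernel_size=3, padding=1, groups=32)
# Pointwise: 1x1 conv mixes information across channels.
pointwise = nn.Conv2d(32, 64, kernel_size=1)

y = pointwise(depthwise(torch.randn(1, 32, 56, 56)))
print(y.shape)  # torch.Size([1, 64, 56, 56])
```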
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('gluon_xception65', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 prediction class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `gluon_xception65`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction); just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('gluon_xception65', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model from scratch.
## Citation
```BibTeX
@misc{chollet2017xception,
title={Xception: Deep Learning with Depthwise Separable Convolutions},
author={François Chollet},
year={2017},
eprint={1610.02357},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
<!--
Type: model-index
Collections:
- Name: Gloun Xception
Paper:
Title: 'Xception: Deep Learning with Depthwise Separable Convolutions'
URL: https://paperswithcode.com/paper/xception-deep-learning-with-depthwise
Models:
- Name: gluon_xception65
In Collection: Gloun Xception
Metadata:
FLOPs: 17594889728
Parameters: 39920000
File Size: 160551306
Architecture:
- 1x1 Convolution
- Convolution
- Dense Connections
- Depthwise Separable Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: gluon_xception65
Crop Pct: '0.903'
Image Size: '299'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/gluon_xception.py#L241
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_xception-7015a15c.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 79.7%
Top 5 Accuracy: 94.87%
-->
# Inception ResNet v2
**Inception-ResNet-v2** is a convolutional neural architecture that builds on the Inception family of architectures but incorporates [residual connections](https://paperswithcode.com/method/residual-connection) (replacing the filter concatenation stage of the Inception architecture).
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('inception_resnet_v2', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 prediction class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `inception_resnet_v2`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction); just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('inception_resnet_v2', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model from scratch.
## Citation
```BibTeX
@misc{szegedy2016inceptionv4,
title={Inception-v4, Inception-ResNet and the Impact of Residual Connections on Learning},
author={Christian Szegedy and Sergey Ioffe and Vincent Vanhoucke and Alex Alemi},
year={2016},
eprint={1602.07261},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
<!--
Type: model-index
Collections:
- Name: Inception ResNet v2
Paper:
Title: Inception-v4, Inception-ResNet and the Impact of Residual Connections on
Learning
URL: https://paperswithcode.com/paper/inception-v4-inception-resnet-and-the-impact
Models:
- Name: inception_resnet_v2
In Collection: Inception ResNet v2
Metadata:
FLOPs: 16959133120
Parameters: 55850000
File Size: 223774238
Architecture:
- Average Pooling
- Dropout
- Inception-ResNet-v2 Reduction-B
- Inception-ResNet-v2-A
- Inception-ResNet-v2-B
- Inception-ResNet-v2-C
- Reduction-A
- Softmax
Tasks:
- Image Classification
Training Techniques:
- Label Smoothing
- RMSProp
- Weight Decay
Training Data:
- ImageNet
Training Resources: 20x NVIDIA Kepler GPUs
ID: inception_resnet_v2
LR: 0.045
Dropout: 0.2
Crop Pct: '0.897'
Momentum: 0.9
Image Size: '299'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/inception_resnet_v2.py#L343
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/inception_resnet_v2-940b1cd6.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
      Top 1 Accuracy: 80.46%
      Top 5 Accuracy: 95.31%
-->
# SK-ResNet
**SK ResNet** is a variant of a [ResNet](https://www.paperswithcode.com/method/resnet) that employs a [Selective Kernel](https://paperswithcode.com/method/selective-kernel) unit. In general, all the large kernel convolutions in the original bottleneck blocks in ResNet are replaced by the proposed [SK convolutions](https://paperswithcode.com/method/selective-kernel-convolution), enabling the network to choose appropriate receptive field sizes in an adaptive manner.
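As an illustrative sketch of the selective-kernel idea (simplified from the paper's formulation; the channel counts and reduction ratio here are arbitrary): two branches with different receptive fields are fused, and a softmax over the branches decides, per channel, how much each kernel size contributes.

```python
import torch
import torch.nn as nn

class SelectiveKernel(nn.Module):
    """Minimal two-branch selective kernel: fuse 3x3 and 5x5 branch
    outputs, then softmax-weight the branches per channel."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.branch3 = nn.Conv2d(channels, channels, 3, padding=1)
        self.branch5 = nn.Conv2d(channels, channels, 5, padding=2)
        mid = max(channels // reduction, 8)
        self.fc = nn.Sequential(nn.Linear(channels, mid), nn.ReLU(inplace=True))
        self.select = nn.Linear(mid, channels * 2)

    def forward(self, x):
        u3, u5 = self.branch3(x), self.branch5(x)
        s = (u3 + u5).mean(dim=(2, 3))                  # fuse + squeeze: (B, C)
        a = self.select(self.fc(s)).view(-1, 2, u3.shape[1])
        a = torch.softmax(a, dim=1)                     # attention over the two branches
        return u3 * a[:, 0, :, None, None] + u5 * a[:, 1, :, None, None]

out = SelectiveKernel(32)(torch.randn(2, 32, 16, 16))
print(out.shape)  # torch.Size([2, 32, 16, 16])
```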
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('skresnet18', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 prediction class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `skresnet18`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction); just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('skresnet18', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model from scratch.
## Citation
```BibTeX
@misc{li2019selective,
title={Selective Kernel Networks},
author={Xiang Li and Wenhai Wang and Xiaolin Hu and Jian Yang},
year={2019},
eprint={1903.06586},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
<!--
Type: model-index
Collections:
- Name: SKResNet
Paper:
Title: Selective Kernel Networks
URL: https://paperswithcode.com/paper/selective-kernel-networks
Models:
- Name: skresnet18
In Collection: SKResNet
Metadata:
FLOPs: 2333467136
Parameters: 11960000
File Size: 47923238
Architecture:
- Convolution
- Dense Connections
- Global Average Pooling
- Max Pooling
- Residual Connection
- Selective Kernel
- Softmax
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 8x GPUs
ID: skresnet18
LR: 0.1
Epochs: 100
Layers: 18
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 256
Image Size: '224'
Weight Decay: 4.0e-05
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/a7f95818e44b281137503bcf4b3e3e94d8ffa52f/timm/models/sknet.py#L148
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/skresnet18_ra-4eec2804.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 73.03%
Top 5 Accuracy: 91.17%
- Name: skresnet34
In Collection: SKResNet
Metadata:
FLOPs: 4711849952
Parameters: 22280000
File Size: 89299314
Architecture:
- Convolution
- Dense Connections
- Global Average Pooling
- Max Pooling
- Residual Connection
- Selective Kernel
- Softmax
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 8x GPUs
ID: skresnet34
LR: 0.1
Epochs: 100
Layers: 34
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 256
Image Size: '224'
Weight Decay: 4.0e-05
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/a7f95818e44b281137503bcf4b3e3e94d8ffa52f/timm/models/sknet.py#L165
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/skresnet34_ra-bdc0ccde.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 76.93%
Top 5 Accuracy: 93.32%
-->
# TResNet
A **TResNet** is a variant of a [ResNet](https://paperswithcode.com/method/resnet) that aims to boost accuracy while maintaining GPU training and inference efficiency. It incorporates several design tricks, including a SpaceToDepth stem, [Anti-Alias downsampling](https://paperswithcode.com/method/anti-alias-downsampling), In-Place Activated BatchNorm, block-type selection, and [squeeze-and-excitation layers](https://paperswithcode.com/method/squeeze-and-excitation-block).
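For intuition, the SpaceToDepth rearrangement matches PyTorch's `pixel_unshuffle` (shown as an illustrative sketch, not TResNet's exact stem):

```python
import torch
import torch.nn.functional as F

x = torch.randn(1, 3, 224, 224)
# Losslessly move each 4x4 spatial block into channels:
# (1, 3, 224, 224) -> (1, 3*4*4, 224/4, 224/4) = (1, 48, 56, 56).
y = F.pixel_unshuffle(x, downscale_factor=4)
print(y.shape)  # torch.Size([1, 48, 56, 56])
```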
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('tresnet_l', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 predictions class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `tresnet_l`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction), just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('tresnet_l', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model afresh.
## Citation
```BibTeX
@misc{ridnik2020tresnet,
title={TResNet: High Performance GPU-Dedicated Architecture},
author={Tal Ridnik and Hussam Lawen and Asaf Noy and Emanuel Ben Baruch and Gilad Sharir and Itamar Friedman},
year={2020},
eprint={2003.13630},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
<!--
Type: model-index
Collections:
- Name: TResNet
Paper:
Title: 'TResNet: High Performance GPU-Dedicated Architecture'
URL: https://paperswithcode.com/paper/tresnet-high-performance-gpu-dedicated
Models:
- Name: tresnet_l
In Collection: TResNet
Metadata:
FLOPs: 10873416792
Parameters: 53456696
File Size: 224440219
Architecture:
- 1x1 Convolution
- Anti-Alias Downsampling
- Convolution
- Global Average Pooling
- InPlace-ABN
- Leaky ReLU
- ReLU
- Residual Connection
- Squeeze-and-Excitation Block
Tasks:
- Image Classification
Training Techniques:
- AutoAugment
- Cutout
- Label Smoothing
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
    Training Resources: 8x NVIDIA V100 GPUs
ID: tresnet_l
LR: 0.01
Epochs: 300
Crop Pct: '0.875'
Momentum: 0.9
Image Size: '224'
Weight Decay: 0.0001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/tresnet.py#L267
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_l_81_5-235b486c.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 81.49%
Top 5 Accuracy: 95.62%
- Name: tresnet_l_448
In Collection: TResNet
Metadata:
FLOPs: 43488238584
Parameters: 53456696
File Size: 224440219
Architecture:
- 1x1 Convolution
- Anti-Alias Downsampling
- Convolution
- Global Average Pooling
- InPlace-ABN
- Leaky ReLU
- ReLU
- Residual Connection
- Squeeze-and-Excitation Block
Tasks:
- Image Classification
Training Techniques:
- AutoAugment
- Cutout
- Label Smoothing
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
    Training Resources: 8x NVIDIA V100 GPUs
ID: tresnet_l_448
LR: 0.01
Epochs: 300
Crop Pct: '0.875'
Momentum: 0.9
Image Size: '448'
Weight Decay: 0.0001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/tresnet.py#L285
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_l_448-940d0cd1.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 82.26%
Top 5 Accuracy: 95.98%
- Name: tresnet_m
In Collection: TResNet
Metadata:
FLOPs: 5733048064
Parameters: 41282200
File Size: 125861314
Architecture:
- 1x1 Convolution
- Anti-Alias Downsampling
- Convolution
- Global Average Pooling
- InPlace-ABN
- Leaky ReLU
- ReLU
- Residual Connection
- Squeeze-and-Excitation Block
Tasks:
- Image Classification
Training Techniques:
- AutoAugment
- Cutout
- Label Smoothing
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
    Training Resources: 8x NVIDIA V100 GPUs
Training Time: < 24 hours
ID: tresnet_m
LR: 0.01
Epochs: 300
Crop Pct: '0.875'
Momentum: 0.9
Image Size: '224'
Weight Decay: 0.0001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/tresnet.py#L261
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_m_80_8-dbc13962.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 80.8%
Top 5 Accuracy: 94.86%
- Name: tresnet_m_448
In Collection: TResNet
Metadata:
FLOPs: 22929743104
Parameters: 29278464
File Size: 125861314
Architecture:
- 1x1 Convolution
- Anti-Alias Downsampling
- Convolution
- Global Average Pooling
- InPlace-ABN
- Leaky ReLU
- ReLU
- Residual Connection
- Squeeze-and-Excitation Block
Tasks:
- Image Classification
Training Techniques:
- AutoAugment
- Cutout
- Label Smoothing
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
    Training Resources: 8x NVIDIA V100 GPUs
ID: tresnet_m_448
LR: 0.01
Epochs: 300
Crop Pct: '0.875'
Momentum: 0.9
Image Size: '448'
Weight Decay: 0.0001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/tresnet.py#L279
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_m_448-bc359d10.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 81.72%
Top 5 Accuracy: 95.57%
- Name: tresnet_xl
In Collection: TResNet
Metadata:
FLOPs: 15162534034
Parameters: 75646610
File Size: 314378965
Architecture:
- 1x1 Convolution
- Anti-Alias Downsampling
- Convolution
- Global Average Pooling
- InPlace-ABN
- Leaky ReLU
- ReLU
- Residual Connection
- Squeeze-and-Excitation Block
Tasks:
- Image Classification
Training Techniques:
- AutoAugment
- Cutout
- Label Smoothing
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
    Training Resources: 8x NVIDIA V100 GPUs
ID: tresnet_xl
LR: 0.01
Epochs: 300
Crop Pct: '0.875'
Momentum: 0.9
Image Size: '224'
Weight Decay: 0.0001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/tresnet.py#L273
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_xl_82_0-a2d51b00.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 82.05%
Top 5 Accuracy: 95.93%
- Name: tresnet_xl_448
In Collection: TResNet
Metadata:
FLOPs: 60641712730
Parameters: 75646610
File Size: 224440219
Architecture:
- 1x1 Convolution
- Anti-Alias Downsampling
- Convolution
- Global Average Pooling
- InPlace-ABN
- Leaky ReLU
- ReLU
- Residual Connection
- Squeeze-and-Excitation Block
Tasks:
- Image Classification
Training Techniques:
- AutoAugment
- Cutout
- Label Smoothing
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
    Training Resources: 8x NVIDIA V100 GPUs
ID: tresnet_xl_448
LR: 0.01
Epochs: 300
Crop Pct: '0.875'
Momentum: 0.9
Image Size: '448'
Weight Decay: 0.0001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/tresnet.py#L291
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_l_448-940d0cd1.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 83.06%
Top 5 Accuracy: 96.19%
-->
# RexNet
**Rank Expansion Networks** (ReXNets) follow a set of new design principles for designing bottlenecks in image classification models. The authors refine each layer by 1) expanding the input channel size of the convolution layer and 2) replacing the [ReLU6](https://www.paperswithcode.com/method/relu6) activations with a smoother function (SiLU/Swish).
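The first principle — expanding channel sizes gradually across blocks rather than doubling them at a few stage boundaries — can be sketched as a simple channel schedule (a simplified illustration, not timm's exact configuration):

```py
def linear_channel_schedule(stem_ch, final_ch, num_blocks, divisor=8):
    """Grow output channels roughly linearly from stem_ch to final_ch,
    rounding each to a multiple of `divisor` as is common for conv layers."""
    channels = []
    for i in range(num_blocks):
        # Linearly interpolate the target channel count for block i.
        ch = stem_ch + (final_ch - stem_ch) * (i + 1) / num_blocks
        channels.append(int(round(ch / divisor)) * divisor)
    return channels

schedule = linear_channel_schedule(16, 180, 6)
print(schedule)
```

The gradual expansion avoids the abrupt rank bottlenecks the paper associates with conventional stage-wise channel doubling.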
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('rexnet_100', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 predictions class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `rexnet_100`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction), just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('rexnet_100', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model afresh.
## Citation
```BibTeX
@misc{han2020rexnet,
title={ReXNet: Diminishing Representational Bottleneck on Convolutional Neural Network},
author={Dongyoon Han and Sangdoo Yun and Byeongho Heo and YoungJoon Yoo},
year={2020},
eprint={2007.00992},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
<!--
Type: model-index
Collections:
- Name: RexNet
Paper:
Title: 'ReXNet: Diminishing Representational Bottleneck on Convolutional Neural
Network'
URL: https://paperswithcode.com/paper/rexnet-diminishing-representational
Models:
- Name: rexnet_100
In Collection: RexNet
Metadata:
FLOPs: 509989377
Parameters: 4800000
File Size: 19417552
Architecture:
- Batch Normalization
- Convolution
- Dropout
- ReLU6
- Residual Connection
Tasks:
- Image Classification
Training Techniques:
- Label Smoothing
- Linear Warmup With Cosine Annealing
- Nesterov Accelerated Gradient
- Weight Decay
Training Data:
- ImageNet
Training Resources: 4x NVIDIA V100 GPUs
ID: rexnet_100
LR: 0.5
Epochs: 400
Dropout: 0.2
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 512
Image Size: '224'
Weight Decay: 1.0e-05
Interpolation: bicubic
Label Smoothing: 0.1
Code: https://github.com/rwightman/pytorch-image-models/blob/b9843f954b0457af2db4f9dea41a8538f51f5d78/timm/models/rexnet.py#L212
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_100-1b4dddf4.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 77.86%
Top 5 Accuracy: 93.88%
- Name: rexnet_130
In Collection: RexNet
Metadata:
FLOPs: 848364461
Parameters: 7560000
File Size: 30508197
Architecture:
- Batch Normalization
- Convolution
- Dropout
- ReLU6
- Residual Connection
Tasks:
- Image Classification
Training Techniques:
- Label Smoothing
- Linear Warmup With Cosine Annealing
- Nesterov Accelerated Gradient
- Weight Decay
Training Data:
- ImageNet
Training Resources: 4x NVIDIA V100 GPUs
ID: rexnet_130
LR: 0.5
Epochs: 400
Dropout: 0.2
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 512
Image Size: '224'
Weight Decay: 1.0e-05
Interpolation: bicubic
Label Smoothing: 0.1
Code: https://github.com/rwightman/pytorch-image-models/blob/b9843f954b0457af2db4f9dea41a8538f51f5d78/timm/models/rexnet.py#L218
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_130-590d768e.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 79.49%
Top 5 Accuracy: 94.67%
- Name: rexnet_150
In Collection: RexNet
Metadata:
FLOPs: 1122374469
Parameters: 9730000
File Size: 39227315
Architecture:
- Batch Normalization
- Convolution
- Dropout
- ReLU6
- Residual Connection
Tasks:
- Image Classification
Training Techniques:
- Label Smoothing
- Linear Warmup With Cosine Annealing
- Nesterov Accelerated Gradient
- Weight Decay
Training Data:
- ImageNet
Training Resources: 4x NVIDIA V100 GPUs
ID: rexnet_150
LR: 0.5
Epochs: 400
Dropout: 0.2
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 512
Image Size: '224'
Weight Decay: 1.0e-05
Interpolation: bicubic
Label Smoothing: 0.1
Code: https://github.com/rwightman/pytorch-image-models/blob/b9843f954b0457af2db4f9dea41a8538f51f5d78/timm/models/rexnet.py#L224
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_150-bd1a6aa8.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 80.31%
Top 5 Accuracy: 95.16%
- Name: rexnet_200
In Collection: RexNet
Metadata:
FLOPs: 1960224938
Parameters: 16370000
File Size: 65862221
Architecture:
- Batch Normalization
- Convolution
- Dropout
- ReLU6
- Residual Connection
Tasks:
- Image Classification
Training Techniques:
- Label Smoothing
- Linear Warmup With Cosine Annealing
- Nesterov Accelerated Gradient
- Weight Decay
Training Data:
- ImageNet
Training Resources: 4x NVIDIA V100 GPUs
ID: rexnet_200
LR: 0.5
Epochs: 400
Dropout: 0.2
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 512
Image Size: '224'
Weight Decay: 1.0e-05
Interpolation: bicubic
Label Smoothing: 0.1
Code: https://github.com/rwightman/pytorch-image-models/blob/b9843f954b0457af2db4f9dea41a8538f51f5d78/timm/models/rexnet.py#L230
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_200-8c0b7f2d.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 81.63%
Top 5 Accuracy: 95.67%
-->
# Wide ResNet
**Wide Residual Networks** are a variant of [ResNets](https://paperswithcode.com/method/resnet) in which depth is decreased and width is increased. This is achieved through the use of [wide residual blocks](https://paperswithcode.com/method/wide-residual-block).
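The trade-off can be sketched with a quick parameter count: widening a basic block's channels by a factor k grows its 3x3 convolution parameters roughly by k^2. A small stdlib sketch (layer shapes are illustrative):

```py
def conv3x3_params(in_ch, out_ch):
    """Parameters of a 3x3 convolution (ignoring bias and batch norm)."""
    return in_ch * out_ch * 3 * 3

def block_params(width, k=1):
    """Two 3x3 convs of a basic residual block, widened by factor k."""
    w = width * k
    return conv3x3_params(w, w) * 2

base = block_params(64)        # plain ResNet block at 64 channels
wide = block_params(64, k=2)   # the same block widened 2x
print(wide / base)  # → 4.0
```

So a shallower network with k=2 can match the capacity of a much deeper one, while its wider layers parallelize better on GPUs.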
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('wide_resnet101_2', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 predictions class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `wide_resnet101_2`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction), just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('wide_resnet101_2', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model afresh.
## Citation
```BibTeX
@article{DBLP:journals/corr/ZagoruykoK16,
author = {Sergey Zagoruyko and
Nikos Komodakis},
title = {Wide Residual Networks},
journal = {CoRR},
volume = {abs/1605.07146},
year = {2016},
url = {http://arxiv.org/abs/1605.07146},
archivePrefix = {arXiv},
eprint = {1605.07146},
timestamp = {Mon, 13 Aug 2018 16:46:42 +0200},
biburl = {https://dblp.org/rec/journals/corr/ZagoruykoK16.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
<!--
Type: model-index
Collections:
- Name: Wide ResNet
Paper:
Title: Wide Residual Networks
URL: https://paperswithcode.com/paper/wide-residual-networks
Models:
- Name: wide_resnet101_2
In Collection: Wide ResNet
Metadata:
FLOPs: 29304929280
Parameters: 126890000
File Size: 254695146
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Connection
- Softmax
- Wide Residual Block
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: wide_resnet101_2
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/5f9aff395c224492e9e44248b15f44b5cc095d9c/timm/models/resnet.py#L802
Weights: https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 78.85%
Top 5 Accuracy: 94.28%
- Name: wide_resnet50_2
In Collection: Wide ResNet
Metadata:
FLOPs: 14688058368
Parameters: 68880000
File Size: 275853271
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Connection
- Softmax
- Wide Residual Block
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: wide_resnet50_2
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/5f9aff395c224492e9e44248b15f44b5cc095d9c/timm/models/resnet.py#L790
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/wide_resnet50_racm-8234f177.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 81.45%
Top 5 Accuracy: 95.52%
-->
# (Tensorflow) MixNet
**MixNet** is a type of convolutional neural network discovered via AutoML that utilises [MixConvs](https://paperswithcode.com/method/mixconv) instead of regular [depthwise convolutions](https://paperswithcode.com/method/depthwise-convolution).
The weights from this model were ported from [Tensorflow/TPU](https://github.com/tensorflow/tpu).
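A MixConv splits the channels of a depthwise convolution into groups, each convolved with a different kernel size. A minimal stdlib sketch of the channel partitioning (the even-split rule and kernel sizes here are illustrative; timm's implementation differs in detail):

```py
def split_channels(num_channels, num_groups):
    """Split channels as evenly as possible across kernel-size groups,
    assigning any remainder to the first group."""
    base = num_channels // num_groups
    splits = [base] * num_groups
    splits[0] += num_channels - base * num_groups
    return splits

kernel_sizes = [3, 5, 7, 9]
groups = split_channels(96, len(kernel_sizes))
for k, ch in zip(kernel_sizes, groups):
    print(f"{ch} channels -> {k}x{k} depthwise kernel")
```

Each group then runs an ordinary depthwise convolution with its own kernel size, and the outputs are concatenated back along the channel dimension.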
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('tf_mixnet_l', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 predictions class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `tf_mixnet_l`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction), just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('tf_mixnet_l', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model afresh.
## Citation
```BibTeX
@misc{tan2019mixconv,
title={MixConv: Mixed Depthwise Convolutional Kernels},
author={Mingxing Tan and Quoc V. Le},
year={2019},
eprint={1907.09595},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
<!--
Type: model-index
Collections:
- Name: TF MixNet
Paper:
Title: 'MixConv: Mixed Depthwise Convolutional Kernels'
URL: https://paperswithcode.com/paper/mixnet-mixed-depthwise-convolutional-kernels
Models:
- Name: tf_mixnet_l
In Collection: TF MixNet
Metadata:
FLOPs: 688674516
Parameters: 7330000
File Size: 29620756
Architecture:
- Batch Normalization
- Dense Connections
- Dropout
- Global Average Pooling
- Grouped Convolution
- MixConv
- Squeeze-and-Excitation Block
- Swish
Tasks:
- Image Classification
Training Techniques:
- MNAS
Training Data:
- ImageNet
ID: tf_mixnet_l
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/efficientnet.py#L1720
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_l-6c92e0c8.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 78.78%
Top 5 Accuracy: 94.0%
- Name: tf_mixnet_m
In Collection: TF MixNet
Metadata:
FLOPs: 416633502
Parameters: 5010000
File Size: 20310871
Architecture:
- Batch Normalization
- Dense Connections
- Dropout
- Global Average Pooling
- Grouped Convolution
- MixConv
- Squeeze-and-Excitation Block
- Swish
Tasks:
- Image Classification
Training Techniques:
- MNAS
Training Data:
- ImageNet
ID: tf_mixnet_m
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/efficientnet.py#L1709
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_m-0f4d8805.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 76.96%
Top 5 Accuracy: 93.16%
- Name: tf_mixnet_s
In Collection: TF MixNet
Metadata:
FLOPs: 302587678
Parameters: 4130000
File Size: 16738218
Architecture:
- Batch Normalization
- Dense Connections
- Dropout
- Global Average Pooling
- Grouped Convolution
- MixConv
- Squeeze-and-Excitation Block
- Swish
Tasks:
- Image Classification
Training Techniques:
- MNAS
Training Data:
- ImageNet
ID: tf_mixnet_s
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/efficientnet.py#L1698
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_s-89d3354b.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 75.68%
Top 5 Accuracy: 92.64%
-->
# (Tensorflow) EfficientNet
**EfficientNet** is a convolutional neural network architecture and scaling method that uniformly scales all dimensions of depth/width/resolution using a *compound coefficient*. Unlike conventional practice, which scales these factors arbitrarily, the EfficientNet scaling method uniformly scales network width, depth, and resolution with a set of fixed scaling coefficients. For example, if we want to use \\( 2^N \\) times more computational resources, then we can simply increase the network depth by \\( \alpha ^ N \\), width by \\( \beta ^ N \\), and image size by \\( \gamma ^ N \\), where \\( \alpha, \beta, \gamma \\) are constant coefficients determined by a small grid search on the original small model. EfficientNet uses a compound coefficient \\( \phi \\) to uniformly scale network width, depth, and resolution in a principled way.
The compound scaling method is justified by the intuition that if the input image is bigger, then the network needs more layers to increase the receptive field and more channels to capture more fine-grained patterns on the bigger image.
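The compound scaling rule above can be sketched numerically; the coefficients below (\\( \alpha = 1.2, \beta = 1.1, \gamma = 1.15 \\)) are the grid-searched values reported in the EfficientNet paper, chosen so that \\( \alpha \cdot \beta^2 \cdot \gamma^2 \approx 2 \\) and total FLOPS grow roughly as \\( 2^\phi \\):

```py
# Grid-searched base coefficients from the EfficientNet paper.
ALPHA, BETA, GAMMA = 1.2, 1.1, 1.15

def compound_scale(phi):
    """Return (depth, width, resolution) multipliers for coefficient phi."""
    return ALPHA ** phi, BETA ** phi, GAMMA ** phi

for phi in range(4):
    d, w, r = compound_scale(phi)
    print(f"phi={phi}: depth x{d:.2f}, width x{w:.2f}, resolution x{r:.2f}")
```

Each EfficientNet variant (B0, B1, ...) corresponds to a larger \\( \phi \\), scaling all three dimensions together rather than tuning any one of them in isolation.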
The base EfficientNet-B0 network is based on the inverted bottleneck residual blocks of [MobileNetV2](https://paperswithcode.com/method/mobilenetv2), in addition to [squeeze-and-excitation blocks](https://paperswithcode.com/method/squeeze-and-excitation-block).
The weights from this model were ported from [Tensorflow/TPU](https://github.com/tensorflow/tpu).
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('tf_efficientnet_b0', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 predictions class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `tf_efficientnet_b0`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction), just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('tf_efficientnet_b0', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model afresh.
## Citation
```BibTeX
@misc{tan2020efficientnet,
title={EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks},
author={Mingxing Tan and Quoc V. Le},
year={2020},
eprint={1905.11946},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
```
<!--
Type: model-index
Collections:
- Name: TF EfficientNet
Paper:
Title: 'EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks'
URL: https://paperswithcode.com/paper/efficientnet-rethinking-model-scaling-for
Models:
- Name: tf_efficientnet_b0
In Collection: TF EfficientNet
Metadata:
FLOPs: 488688572
Parameters: 5290000
File Size: 21383997
Architecture:
- 1x1 Convolution
- Average Pooling
- Batch Normalization
- Convolution
- Dense Connections
- Dropout
- Inverted Residual Block
- Squeeze-and-Excitation Block
- Swish
Tasks:
- Image Classification
Training Techniques:
- AutoAugment
- Label Smoothing
- RMSProp
- Stochastic Depth
- Weight Decay
Training Data:
- ImageNet
Training Resources: TPUv3 Cloud TPU
ID: tf_efficientnet_b0
LR: 0.256
Epochs: 350
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 2048
Image Size: '224'
Weight Decay: 1.0e-05
Interpolation: bicubic
RMSProp Decay: 0.9
Label Smoothing: 0.1
BatchNorm Momentum: 0.99
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/efficientnet.py#L1241
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_aa-827b6e33.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 76.85%
Top 5 Accuracy: 93.23%
- Name: tf_efficientnet_b1
In Collection: TF EfficientNet
Metadata:
FLOPs: 883633200
Parameters: 7790000
File Size: 31512534
Architecture:
- 1x1 Convolution
- Average Pooling
- Batch Normalization
- Convolution
- Dense Connections
- Dropout
- Inverted Residual Block
- Squeeze-and-Excitation Block
- Swish
Tasks:
- Image Classification
Training Techniques:
- AutoAugment
- Label Smoothing
- RMSProp
- Stochastic Depth
- Weight Decay
Training Data:
- ImageNet
ID: tf_efficientnet_b1
LR: 0.256
Epochs: 350
Crop Pct: '0.882'
Momentum: 0.9
Batch Size: 2048
Image Size: '240'
Weight Decay: 1.0e-05
Interpolation: bicubic
RMSProp Decay: 0.9
Label Smoothing: 0.1
BatchNorm Momentum: 0.99
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/efficientnet.py#L1251
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_aa-ea7a6ee0.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 78.84%
Top 5 Accuracy: 94.2%
- Name: tf_efficientnet_b2
In Collection: TF EfficientNet
Metadata:
FLOPs: 1234321170
Parameters: 9110000
File Size: 36797929
Architecture:
- 1x1 Convolution
- Average Pooling
- Batch Normalization
- Convolution
- Dense Connections
- Dropout
- Inverted Residual Block
- Squeeze-and-Excitation Block
- Swish
Tasks:
- Image Classification
Training Techniques:
- AutoAugment
- Label Smoothing
- RMSProp
- Stochastic Depth
- Weight Decay
Training Data:
- ImageNet
ID: tf_efficientnet_b2
LR: 0.256
Epochs: 350
Crop Pct: '0.89'
Momentum: 0.9
Batch Size: 2048
Image Size: '260'
Weight Decay: 1.0e-05
Interpolation: bicubic
RMSProp Decay: 0.9
Label Smoothing: 0.1
BatchNorm Momentum: 0.99
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/efficientnet.py#L1261
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_aa-60c94f97.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 80.07%
Top 5 Accuracy: 94.9%
- Name: tf_efficientnet_b3
In Collection: TF EfficientNet
Metadata:
FLOPs: 2275247568
Parameters: 12230000
File Size: 49381362
Architecture:
- 1x1 Convolution
- Average Pooling
- Batch Normalization
- Convolution
- Dense Connections
- Dropout
- Inverted Residual Block
- Squeeze-and-Excitation Block
- Swish
Tasks:
- Image Classification
Training Techniques:
- AutoAugment
- Label Smoothing
- RMSProp
- Stochastic Depth
- Weight Decay
Training Data:
- ImageNet
ID: tf_efficientnet_b3
LR: 0.256
Epochs: 350
Crop Pct: '0.904'
Momentum: 0.9
Batch Size: 2048
Image Size: '300'
Weight Decay: 1.0e-05
Interpolation: bicubic
RMSProp Decay: 0.9
Label Smoothing: 0.1
BatchNorm Momentum: 0.99
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/efficientnet.py#L1271
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_aa-84b4657e.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 81.65%
Top 5 Accuracy: 95.72%
- Name: tf_efficientnet_b4
In Collection: TF EfficientNet
Metadata:
FLOPs: 5749638672
Parameters: 19340000
File Size: 77989689
Architecture:
- 1x1 Convolution
- Average Pooling
- Batch Normalization
- Convolution
- Dense Connections
- Dropout
- Inverted Residual Block
- Squeeze-and-Excitation Block
- Swish
Tasks:
- Image Classification
Training Techniques:
- AutoAugment
- Label Smoothing
- RMSProp
- Stochastic Depth
- Weight Decay
Training Data:
- ImageNet
Training Resources: TPUv3 Cloud TPU
ID: tf_efficientnet_b4
LR: 0.256
Epochs: 350
Crop Pct: '0.922'
Momentum: 0.9
Batch Size: 2048
Image Size: '380'
Weight Decay: 1.0e-05
Interpolation: bicubic
RMSProp Decay: 0.9
Label Smoothing: 0.1
BatchNorm Momentum: 0.99
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/efficientnet.py#L1281
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_aa-818f208c.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 83.03%
Top 5 Accuracy: 96.3%
- Name: tf_efficientnet_b5
In Collection: TF EfficientNet
Metadata:
FLOPs: 13176501888
Parameters: 30390000
File Size: 122403150
Architecture:
- 1x1 Convolution
- Average Pooling
- Batch Normalization
- Convolution
- Dense Connections
- Dropout
- Inverted Residual Block
- Squeeze-and-Excitation Block
- Swish
Tasks:
- Image Classification
Training Techniques:
- AutoAugment
- Label Smoothing
- RMSProp
- Stochastic Depth
- Weight Decay
Training Data:
- ImageNet
ID: tf_efficientnet_b5
LR: 0.256
Epochs: 350
Crop Pct: '0.934'
Momentum: 0.9
Batch Size: 2048
Image Size: '456'
Weight Decay: 1.0e-05
Interpolation: bicubic
RMSProp Decay: 0.9
Label Smoothing: 0.1
BatchNorm Momentum: 0.99
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/efficientnet.py#L1291
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ra-9a3e5369.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 83.81%
Top 5 Accuracy: 96.75%
- Name: tf_efficientnet_b6
In Collection: TF EfficientNet
Metadata:
FLOPs: 24180518488
Parameters: 43040000
File Size: 173232007
Architecture:
- 1x1 Convolution
- Average Pooling
- Batch Normalization
- Convolution
- Dense Connections
- Dropout
- Inverted Residual Block
- Squeeze-and-Excitation Block
- Swish
Tasks:
- Image Classification
Training Techniques:
- AutoAugment
- Label Smoothing
- RMSProp
- Stochastic Depth
- Weight Decay
Training Data:
- ImageNet
ID: tf_efficientnet_b6
LR: 0.256
Epochs: 350
Crop Pct: '0.942'
Momentum: 0.9
Batch Size: 2048
Image Size: '528'
Weight Decay: 1.0e-05
Interpolation: bicubic
RMSProp Decay: 0.9
Label Smoothing: 0.1
BatchNorm Momentum: 0.99
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/efficientnet.py#L1301
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_aa-80ba17e4.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 84.11%
Top 5 Accuracy: 96.89%
- Name: tf_efficientnet_b7
In Collection: TF EfficientNet
Metadata:
FLOPs: 48205304880
Parameters: 66349999
File Size: 266850607
Architecture:
- 1x1 Convolution
- Average Pooling
- Batch Normalization
- Convolution
- Dense Connections
- Dropout
- Inverted Residual Block
- Squeeze-and-Excitation Block
- Swish
Tasks:
- Image Classification
Training Techniques:
- AutoAugment
- Label Smoothing
- RMSProp
- Stochastic Depth
- Weight Decay
Training Data:
- ImageNet
ID: tf_efficientnet_b7
LR: 0.256
Epochs: 350
Crop Pct: '0.949'
Momentum: 0.9
Batch Size: 2048
Image Size: '600'
Weight Decay: 1.0e-05
Interpolation: bicubic
RMSProp Decay: 0.9
Label Smoothing: 0.1
BatchNorm Momentum: 0.99
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/efficientnet.py#L1312
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ra-6c08e654.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 84.93%
Top 5 Accuracy: 97.2%
- Name: tf_efficientnet_b8
In Collection: TF EfficientNet
Metadata:
FLOPs: 80962956270
Parameters: 87410000
File Size: 351379853
Architecture:
- 1x1 Convolution
- Average Pooling
- Batch Normalization
- Convolution
- Dense Connections
- Dropout
- Inverted Residual Block
- Squeeze-and-Excitation Block
- Swish
Tasks:
- Image Classification
Training Techniques:
- AutoAugment
- Label Smoothing
- RMSProp
- Stochastic Depth
- Weight Decay
Training Data:
- ImageNet
ID: tf_efficientnet_b8
LR: 0.256
Epochs: 350
Crop Pct: '0.954'
Momentum: 0.9
Batch Size: 2048
Image Size: '672'
Weight Decay: 1.0e-05
Interpolation: bicubic
RMSProp Decay: 0.9
Label Smoothing: 0.1
BatchNorm Momentum: 0.99
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/efficientnet.py#L1323
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b8_ra-572d5dd9.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 85.35%
Top 5 Accuracy: 97.39%
- Name: tf_efficientnet_el
In Collection: TF EfficientNet
Metadata:
FLOPs: 9356616096
Parameters: 10590000
File Size: 42800271
Architecture:
- 1x1 Convolution
- Average Pooling
- Batch Normalization
- Convolution
- Dense Connections
- Dropout
- Inverted Residual Block
- Squeeze-and-Excitation Block
- Swish
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: tf_efficientnet_el
Crop Pct: '0.904'
Image Size: '300'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/efficientnet.py#L1551
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_el-5143854e.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 80.45%
Top 5 Accuracy: 95.17%
- Name: tf_efficientnet_em
In Collection: TF EfficientNet
Metadata:
FLOPs: 3636607040
Parameters: 6900000
File Size: 27933644
Architecture:
- 1x1 Convolution
- Average Pooling
- Batch Normalization
- Convolution
- Dense Connections
- Dropout
- Inverted Residual Block
- Squeeze-and-Excitation Block
- Swish
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: tf_efficientnet_em
Crop Pct: '0.882'
Image Size: '240'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/efficientnet.py#L1541
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_em-e78cfe58.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 78.71%
Top 5 Accuracy: 94.33%
- Name: tf_efficientnet_es
In Collection: TF EfficientNet
Metadata:
FLOPs: 2057577472
Parameters: 5440000
File Size: 22008479
Architecture:
- 1x1 Convolution
- Average Pooling
- Batch Normalization
- Convolution
- Dense Connections
- Dropout
- Inverted Residual Block
- Squeeze-and-Excitation Block
- Swish
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: tf_efficientnet_es
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/efficientnet.py#L1531
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_es-ca1afbfe.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 77.28%
Top 5 Accuracy: 93.6%
- Name: tf_efficientnet_l2_ns_475
In Collection: TF EfficientNet
Metadata:
FLOPs: 217795669644
Parameters: 480310000
File Size: 1925950424
Architecture:
- 1x1 Convolution
- Average Pooling
- Batch Normalization
- Convolution
- Dense Connections
- Dropout
- Inverted Residual Block
- Squeeze-and-Excitation Block
- Swish
Tasks:
- Image Classification
Training Techniques:
- AutoAugment
- FixRes
- Label Smoothing
- Noisy Student
- RMSProp
- RandAugment
- Weight Decay
Training Data:
- ImageNet
- JFT-300M
Training Resources: TPUv3 Cloud TPU
ID: tf_efficientnet_l2_ns_475
LR: 0.128
Epochs: 350
Dropout: 0.5
Crop Pct: '0.936'
Momentum: 0.9
Batch Size: 2048
Image Size: '475'
Weight Decay: 1.0e-05
Interpolation: bicubic
RMSProp Decay: 0.9
Label Smoothing: 0.1
BatchNorm Momentum: 0.99
Stochastic Depth Survival: 0.8
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/efficientnet.py#L1509
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_l2_ns_475-bebbd00a.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 88.24%
Top 5 Accuracy: 98.55%
-->
# HRNet
**HRNet**, or **High-Resolution Net**, is a general-purpose convolutional neural network for tasks like semantic segmentation, object detection and image classification. It maintains high-resolution representations through the whole process: the network starts from a high-resolution convolution stream, gradually adds high-to-low resolution convolution streams one by one, and connects the multi-resolution streams in parallel. The resulting network consists of several (\\( 4 \\) in the paper) stages, and the \\( n \\)th stage contains \\( n \\) streams corresponding to \\( n \\) resolutions. The authors conduct repeated multi-resolution fusions by exchanging information across the parallel streams over and over.
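The stage/stream layout described above can be sketched schematically (the input resolution and stage count below are illustrative, not the exact HRNet configuration):

```python
def hrnet_stream_resolutions(input_hw=(56, 56), num_stages=4):
    """Spatial resolutions of the parallel streams in each stage.

    Stage n keeps the streams of stage n-1 and adds one new stream at
    half the resolution of the previous lowest-resolution stream, so the
    nth stage contains n streams at n resolutions.
    """
    h, w = input_hw
    return [
        [(h // 2 ** i, w // 2 ** i) for i in range(stage)]
        for stage in range(1, num_stages + 1)
    ]

for stage, streams in enumerate(hrnet_stream_resolutions(), start=1):
    print(f"stage {stage}: {streams}")
# e.g. stage 4 holds four streams: (56, 56), (28, 28), (14, 14), (7, 7)
```

The repeated multi-resolution fusions then exchange information between these parallel streams at the end of each stage.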
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('hrnet_w18', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 predictions class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `hrnet_w18`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction), just changing the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('hrnet_w18', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model from scratch.
## Citation
```BibTeX
@misc{sun2019highresolution,
title={High-Resolution Representations for Labeling Pixels and Regions},
author={Ke Sun and Yang Zhao and Borui Jiang and Tianheng Cheng and Bin Xiao and Dong Liu and Yadong Mu and Xinggang Wang and Wenyu Liu and Jingdong Wang},
year={2019},
eprint={1904.04514},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
<!--
Type: model-index
Collections:
- Name: HRNet
Paper:
Title: Deep High-Resolution Representation Learning for Visual Recognition
URL: https://paperswithcode.com/paper/190807919
Models:
- Name: hrnet_w18
In Collection: HRNet
Metadata:
FLOPs: 5547205500
Parameters: 21300000
File Size: 85718883
Architecture:
- Batch Normalization
- Convolution
- ReLU
- Residual Connection
Tasks:
- Image Classification
Training Techniques:
- Nesterov Accelerated Gradient
- Weight Decay
Training Data:
- ImageNet
Training Resources: 4x NVIDIA V100 GPUs
ID: hrnet_w18
Epochs: 100
Layers: 18
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 256
Image Size: '224'
Weight Decay: 0.001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/hrnet.py#L800
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w18-8cb57bb9.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 76.76%
Top 5 Accuracy: 93.44%
- Name: hrnet_w18_small
In Collection: HRNet
Metadata:
FLOPs: 2071651488
Parameters: 13190000
File Size: 52934302
Architecture:
- Batch Normalization
- Convolution
- ReLU
- Residual Connection
Tasks:
- Image Classification
Training Techniques:
- Nesterov Accelerated Gradient
- Weight Decay
Training Data:
- ImageNet
Training Resources: 4x NVIDIA V100 GPUs
ID: hrnet_w18_small
Epochs: 100
Layers: 18
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 256
Image Size: '224'
Weight Decay: 0.001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/hrnet.py#L790
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnet_w18_small_v1-f460c6bc.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 72.34%
Top 5 Accuracy: 90.68%
- Name: hrnet_w18_small_v2
In Collection: HRNet
Metadata:
FLOPs: 3360023160
Parameters: 15600000
File Size: 62682879
Architecture:
- Batch Normalization
- Convolution
- ReLU
- Residual Connection
Tasks:
- Image Classification
Training Techniques:
- Nesterov Accelerated Gradient
- Weight Decay
Training Data:
- ImageNet
Training Resources: 4x NVIDIA V100 GPUs
ID: hrnet_w18_small_v2
Epochs: 100
Layers: 18
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 256
Image Size: '224'
Weight Decay: 0.001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/hrnet.py#L795
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnet_w18_small_v2-4c50a8cb.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 75.11%
Top 5 Accuracy: 92.41%
- Name: hrnet_w30
In Collection: HRNet
Metadata:
FLOPs: 10474119492
Parameters: 37710000
File Size: 151452218
Architecture:
- Batch Normalization
- Convolution
- ReLU
- Residual Connection
Tasks:
- Image Classification
Training Techniques:
- Nesterov Accelerated Gradient
- Weight Decay
Training Data:
- ImageNet
Training Resources: 4x NVIDIA V100 GPUs
ID: hrnet_w30
Epochs: 100
Layers: 30
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 256
Image Size: '224'
Weight Decay: 0.001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/hrnet.py#L805
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w30-8d7f8dab.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 78.21%
Top 5 Accuracy: 94.22%
- Name: hrnet_w32
In Collection: HRNet
Metadata:
FLOPs: 11524528320
Parameters: 41230000
File Size: 165547812
Architecture:
- Batch Normalization
- Convolution
- ReLU
- Residual Connection
Tasks:
- Image Classification
Training Techniques:
- Nesterov Accelerated Gradient
- Weight Decay
Training Data:
- ImageNet
Training Resources: 4x NVIDIA V100 GPUs
Training Time: 60 hours
ID: hrnet_w32
Epochs: 100
Layers: 32
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 256
Image Size: '224'
Weight Decay: 0.001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/hrnet.py#L810
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w32-90d8c5fb.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 78.45%
Top 5 Accuracy: 94.19%
- Name: hrnet_w40
In Collection: HRNet
Metadata:
FLOPs: 16381182192
Parameters: 57560000
File Size: 230899236
Architecture:
- Batch Normalization
- Convolution
- ReLU
- Residual Connection
Tasks:
- Image Classification
Training Techniques:
- Nesterov Accelerated Gradient
- Weight Decay
Training Data:
- ImageNet
Training Resources: 4x NVIDIA V100 GPUs
ID: hrnet_w40
Epochs: 100
Layers: 40
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 256
Image Size: '224'
Weight Decay: 0.001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/hrnet.py#L815
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w40-7cd397a4.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 78.93%
Top 5 Accuracy: 94.48%
- Name: hrnet_w44
In Collection: HRNet
Metadata:
FLOPs: 19202520264
Parameters: 67060000
File Size: 268957432
Architecture:
- Batch Normalization
- Convolution
- ReLU
- Residual Connection
Tasks:
- Image Classification
Training Techniques:
- Nesterov Accelerated Gradient
- Weight Decay
Training Data:
- ImageNet
Training Resources: 4x NVIDIA V100 GPUs
ID: hrnet_w44
Epochs: 100
Layers: 44
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 256
Image Size: '224'
Weight Decay: 0.001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/hrnet.py#L820
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w44-c9ac8c18.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 78.89%
Top 5 Accuracy: 94.37%
- Name: hrnet_w48
In Collection: HRNet
Metadata:
FLOPs: 22285865760
Parameters: 77470000
File Size: 310603710
Architecture:
- Batch Normalization
- Convolution
- ReLU
- Residual Connection
Tasks:
- Image Classification
Training Techniques:
- Nesterov Accelerated Gradient
- Weight Decay
Training Data:
- ImageNet
Training Resources: 4x NVIDIA V100 GPUs
Training Time: 80 hours
ID: hrnet_w48
Epochs: 100
Layers: 48
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 256
Image Size: '224'
Weight Decay: 0.001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/hrnet.py#L825
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w48-abd2e6ab.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 79.32%
Top 5 Accuracy: 94.51%
- Name: hrnet_w64
In Collection: HRNet
Metadata:
FLOPs: 37239321984
Parameters: 128060000
File Size: 513071818
Architecture:
- Batch Normalization
- Convolution
- ReLU
- Residual Connection
Tasks:
- Image Classification
Training Techniques:
- Nesterov Accelerated Gradient
- Weight Decay
Training Data:
- ImageNet
Training Resources: 4x NVIDIA V100 GPUs
ID: hrnet_w64
Epochs: 100
Layers: 64
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 256
Image Size: '224'
Weight Decay: 0.001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/hrnet.py#L830
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w64-b47cc881.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 79.46%
Top 5 Accuracy: 94.65%
-->
# RegNetX
**RegNetX** is a convolutional network design space of simple, regular models parameterised by depth \\( d \\), initial width \\( w\_{0} > 0 \\), and slope \\( w\_{a} > 0 \\); it generates a different block width \\( u\_{j} \\) for each block \\( j < d \\). The key restriction for the RegNet types of model is that there is a linear parameterisation of block widths (the design space contains only models with this linear structure):
\\( u\_{j} = w\_{0} + w\_{a}\cdot{j} \\)
For **RegNetX** we have additional restrictions: we set \\( b = 1 \\) (the bottleneck ratio), \\( 12 \leq d \leq 28 \\), and \\( w\_{m} \geq 2 \\) (the width multiplier).
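The linear parameterisation above can be computed directly; the sketch below also quantises the continuous widths to stage-wise values via the width multiplier \\( w\_{m} \\), in a simplified version of the paper's scheme (the parameter values in the example are illustrative, not a published RegNetX configuration):

```python
import math

def regnet_widths(depth, w0, wa, wm):
    """Per-block widths from RegNet's linear parameterisation.

    u_j = w0 + wa * j gives a continuous width for block j; each u_j is
    then snapped to the nearest power-of-wm multiple of w0 and rounded
    to a multiple of 8 (simplified quantisation).
    """
    widths = []
    for j in range(depth):
        u_j = w0 + wa * j                       # linear parameterisation
        s_j = round(math.log(u_j / w0, wm))     # nearest power of wm
        w_j = w0 * wm ** s_j                    # quantised width
        widths.append(int(round(w_j / 8) * 8))  # round to a multiple of 8
    return widths

print(regnet_widths(depth=13, w0=24, wa=36.44, wm=2.49))
```

Blocks with the same quantised width form a stage, which is how the continuous parameterisation yields the familiar stage-wise network structure.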
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('regnetx_002', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 predictions class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `regnetx_002`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction), just changing the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('regnetx_002', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model from scratch.
## Citation
```BibTeX
@misc{radosavovic2020designing,
title={Designing Network Design Spaces},
author={Ilija Radosavovic and Raj Prateek Kosaraju and Ross Girshick and Kaiming He and Piotr Dollár},
year={2020},
eprint={2003.13678},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
<!--
Type: model-index
Collections:
- Name: RegNetX
Paper:
Title: Designing Network Design Spaces
URL: https://paperswithcode.com/paper/designing-network-design-spaces
Models:
- Name: regnetx_002
In Collection: RegNetX
Metadata:
FLOPs: 255276032
Parameters: 2680000
File Size: 10862199
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Dense Connections
- Global Average Pooling
- Grouped Convolution
- ReLU
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 8x NVIDIA V100 GPUs
ID: regnetx_002
Epochs: 100
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 1024
Image Size: '224'
Weight Decay: 5.0e-05
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/regnet.py#L337
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_002-e7e85e5c.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 68.75%
Top 5 Accuracy: 88.56%
- Name: regnetx_004
In Collection: RegNetX
Metadata:
FLOPs: 510619136
Parameters: 5160000
File Size: 20841309
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Dense Connections
- Global Average Pooling
- Grouped Convolution
- ReLU
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 8x NVIDIA V100 GPUs
ID: regnetx_004
Epochs: 100
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 1024
Image Size: '224'
Weight Decay: 5.0e-05
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/regnet.py#L343
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_004-7d0e9424.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 72.39%
Top 5 Accuracy: 90.82%
- Name: regnetx_006
In Collection: RegNetX
Metadata:
FLOPs: 771659136
Parameters: 6200000
File Size: 24965172
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Dense Connections
- Global Average Pooling
- Grouped Convolution
- ReLU
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 8x NVIDIA V100 GPUs
ID: regnetx_006
Epochs: 100
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 1024
Image Size: '224'
Weight Decay: 5.0e-05
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/regnet.py#L349
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_006-85ec1baa.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 73.84%
Top 5 Accuracy: 91.68%
- Name: regnetx_008
In Collection: RegNetX
Metadata:
FLOPs: 1027038208
Parameters: 7260000
File Size: 29235944
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Dense Connections
- Global Average Pooling
- Grouped Convolution
- ReLU
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 8x NVIDIA V100 GPUs
ID: regnetx_008
Epochs: 100
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 1024
Image Size: '224'
Weight Decay: 5.0e-05
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/regnet.py#L355
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_008-d8b470eb.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 75.05%
Top 5 Accuracy: 92.34%
- Name: regnetx_016
In Collection: RegNetX
Metadata:
FLOPs: 2059337856
Parameters: 9190000
File Size: 36988158
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Dense Connections
- Global Average Pooling
- Grouped Convolution
- ReLU
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 8x NVIDIA V100 GPUs
ID: regnetx_016
Epochs: 100
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 1024
Image Size: '224'
Weight Decay: 5.0e-05
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/regnet.py#L361
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_016-65ca972a.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 76.95%
Top 5 Accuracy: 93.43%
- Name: regnetx_032
In Collection: RegNetX
Metadata:
FLOPs: 4082555904
Parameters: 15300000
File Size: 61509573
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Dense Connections
- Global Average Pooling
- Grouped Convolution
- ReLU
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 8x NVIDIA V100 GPUs
ID: regnetx_032
Epochs: 100
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 512
Image Size: '224'
Weight Decay: 5.0e-05
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/regnet.py#L367
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_032-ed0c7f7e.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 78.15%
Top 5 Accuracy: 94.09%
- Name: regnetx_040
In Collection: RegNetX
Metadata:
FLOPs: 5095167744
Parameters: 22120000
File Size: 88844824
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Dense Connections
- Global Average Pooling
- Grouped Convolution
- ReLU
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 8x NVIDIA V100 GPUs
ID: regnetx_040
Epochs: 100
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 512
Image Size: '224'
Weight Decay: 5.0e-05
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/regnet.py#L373
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_040-73c2a654.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 78.48%
Top 5 Accuracy: 94.25%
- Name: regnetx_064
In Collection: RegNetX
Metadata:
FLOPs: 8303405824
Parameters: 26210000
File Size: 105184854
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Dense Connections
- Global Average Pooling
- Grouped Convolution
- ReLU
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 8x NVIDIA V100 GPUs
ID: regnetx_064
Epochs: 100
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 512
Image Size: '224'
Weight Decay: 5.0e-05
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/regnet.py#L379
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_064-29278baa.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 79.06%
Top 5 Accuracy: 94.47%
- Name: regnetx_080
In Collection: RegNetX
Metadata:
FLOPs: 10276726784
Parameters: 39570000
File Size: 158720042
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Dense Connections
- Global Average Pooling
- Grouped Convolution
- ReLU
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 8x NVIDIA V100 GPUs
ID: regnetx_080
Epochs: 100
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 512
Image Size: '224'
Weight Decay: 5.0e-05
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/regnet.py#L385
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_080-7c7fcab1.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 79.21%
Top 5 Accuracy: 94.55%
- Name: regnetx_120
In Collection: RegNetX
Metadata:
FLOPs: 15536378368
Parameters: 46110000
File Size: 184866342
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Dense Connections
- Global Average Pooling
- Grouped Convolution
- ReLU
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 8x NVIDIA V100 GPUs
ID: regnetx_120
Epochs: 100
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 512
Image Size: '224'
Weight Decay: 5.0e-05
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/regnet.py#L391
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_120-65d5521e.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 79.61%
Top 5 Accuracy: 94.73%
- Name: regnetx_160
In Collection: RegNetX
Metadata:
FLOPs: 20491740672
Parameters: 54280000
File Size: 217623862
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Dense Connections
- Global Average Pooling
- Grouped Convolution
- ReLU
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 8x NVIDIA V100 GPUs
ID: regnetx_160
Epochs: 100
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 512
Image Size: '224'
Weight Decay: 5.0e-05
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/regnet.py#L397
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_160-c98c4112.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 79.84%
Top 5 Accuracy: 94.82%
- Name: regnetx_320
In Collection: RegNetX
Metadata:
FLOPs: 40798958592
Parameters: 107810000
File Size: 431962133
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Dense Connections
- Global Average Pooling
- Grouped Convolution
- ReLU
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 8x NVIDIA V100 GPUs
ID: regnetx_320
Epochs: 100
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 256
Image Size: '224'
Weight Decay: 5.0e-05
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/regnet.py#L403
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_320-8ea38b93.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 80.25%
Top 5 Accuracy: 95.03%
-->
# Dual Path Network (DPN)
A **Dual Path Network (DPN)** is a convolutional neural network which presents a new topology of internal connection paths. The intuition is that [ResNets](https://paperswithcode.com/method/resnet) enable feature re-use while DenseNets enable new feature exploration, and both are important for learning good representations. To enjoy the benefits of both path topologies, Dual Path Networks share common features while maintaining the flexibility to explore new features through dual path architectures.
The principal building block is a [DPN Block](https://paperswithcode.com/method/dpn-block).
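The dual-path idea can be illustrated with a toy block: one slice of the output is added back to the input (residual path, feature re-use) while another slice is concatenated onto it (dense path, new feature exploration). This is a simplified sketch, not timm's actual `DPN` implementation; the channel splits below are chosen arbitrarily for illustration.

```python
import torch
import torch.nn as nn

class DualPathBlock(nn.Module):
    """Toy dual-path block: a residual path (summed) plus a dense path (concatenated)."""
    def __init__(self, in_chs, res_chs, dense_chs):
        super().__init__()
        self.res_chs = res_chs
        self.conv = nn.Conv2d(in_chs, res_chs + dense_chs, kernel_size=3, padding=1)

    def forward(self, x):
        out = self.conv(x)
        res_out, dense_out = out[:, :self.res_chs], out[:, self.res_chs:]
        # residual path: element-wise sum with the first res_chs input channels
        res = x[:, :self.res_chs] + res_out
        # dense path: concatenate the newly computed features onto the remaining input channels
        return torch.cat([res, x[:, self.res_chs:], dense_out], dim=1)

block = DualPathBlock(in_chs=64, res_chs=32, dense_chs=16)
y = block(torch.randn(1, 64, 8, 8))
# channel count grows by dense_chs each block: 32 + 32 + 16 = 80
```

Note how the residual part keeps a fixed width while the dense part widens the representation every block, mirroring the two behaviours the paper combines.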
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('dpn107', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 predictions class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `dpn107`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction), just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('dpn107', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model afresh.
## Citation
```BibTeX
@misc{chen2017dual,
title={Dual Path Networks},
author={Yunpeng Chen and Jianan Li and Huaxin Xiao and Xiaojie Jin and Shuicheng Yan and Jiashi Feng},
year={2017},
eprint={1707.01629},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
<!--
Type: model-index
Collections:
- Name: DPN
Paper:
Title: Dual Path Networks
URL: https://paperswithcode.com/paper/dual-path-networks
Models:
- Name: dpn107
In Collection: DPN
Metadata:
FLOPs: 23524280296
Parameters: 86920000
File Size: 348612331
Architecture:
- Batch Normalization
- Convolution
- DPN Block
- Dense Connections
- Global Average Pooling
- Max Pooling
- Softmax
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 40x K80 GPUs
ID: dpn107
LR: 0.316
Layers: 107
Crop Pct: '0.875'
Batch Size: 1280
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/dpn.py#L310
Weights: https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn107_extra-1ac7121e2.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 80.16%
Top 5 Accuracy: 94.91%
- Name: dpn131
In Collection: DPN
Metadata:
FLOPs: 20586274792
Parameters: 79250000
File Size: 318016207
Architecture:
- Batch Normalization
- Convolution
- DPN Block
- Dense Connections
- Global Average Pooling
- Max Pooling
- Softmax
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 40x K80 GPUs
ID: dpn131
LR: 0.316
Layers: 131
Crop Pct: '0.875'
Batch Size: 960
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/dpn.py#L302
Weights: https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn131-71dfe43e0.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 79.83%
Top 5 Accuracy: 94.71%
- Name: dpn68
In Collection: DPN
Metadata:
FLOPs: 2990567880
Parameters: 12610000
File Size: 50761994
Architecture:
- Batch Normalization
- Convolution
- DPN Block
- Dense Connections
- Global Average Pooling
- Max Pooling
- Softmax
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 40x K80 GPUs
ID: dpn68
LR: 0.316
Layers: 68
Crop Pct: '0.875'
Batch Size: 1280
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/dpn.py#L270
Weights: https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn68-66bebafa7.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 76.31%
Top 5 Accuracy: 92.97%
- Name: dpn68b
In Collection: DPN
Metadata:
FLOPs: 2990567880
Parameters: 12610000
File Size: 50781025
Architecture:
- Batch Normalization
- Convolution
- DPN Block
- Dense Connections
- Global Average Pooling
- Max Pooling
- Softmax
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 40x K80 GPUs
ID: dpn68b
LR: 0.316
Layers: 68
Crop Pct: '0.875'
Batch Size: 1280
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/dpn.py#L278
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/dpn68b_ra-a31ca160.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 79.21%
Top 5 Accuracy: 94.42%
- Name: dpn92
In Collection: DPN
Metadata:
FLOPs: 8357659624
Parameters: 37670000
File Size: 151248422
Architecture:
- Batch Normalization
- Convolution
- DPN Block
- Dense Connections
- Global Average Pooling
- Max Pooling
- Softmax
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 40x K80 GPUs
ID: dpn92
LR: 0.316
Layers: 92
Crop Pct: '0.875'
Batch Size: 1280
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/dpn.py#L286
Weights: https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn92_extra-b040e4a9b.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 79.99%
Top 5 Accuracy: 94.84%
- Name: dpn98
In Collection: DPN
Metadata:
FLOPs: 15003675112
Parameters: 61570000
File Size: 247021307
Architecture:
- Batch Normalization
- Convolution
- DPN Block
- Dense Connections
- Global Average Pooling
- Max Pooling
- Softmax
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 40x K80 GPUs
ID: dpn98
LR: 0.4
Layers: 98
Crop Pct: '0.875'
Batch Size: 1280
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/dpn.py#L294
Weights: https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn98-5b90dec4d.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 79.65%
Top 5 Accuracy: 94.61%
-->
# Xception
**Xception** is a convolutional neural network architecture that relies solely on [depthwise separable convolution layers](https://paperswithcode.com/method/depthwise-separable-convolution).
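A depthwise separable convolution factorizes a dense convolution into a per-channel spatial convolution followed by a 1x1 pointwise convolution, cutting parameters and FLOPs substantially. A minimal sketch (an illustration of the operation, not timm's `SeparableConv2d`):

```python
import torch
import torch.nn as nn

class SeparableConv2d(nn.Module):
    """Depthwise separable conv: per-channel 3x3 conv, then a 1x1 pointwise conv."""
    def __init__(self, in_chs, out_chs, kernel_size=3, padding=1):
        super().__init__()
        # groups=in_chs makes each input channel get its own spatial filter
        self.depthwise = nn.Conv2d(in_chs, in_chs, kernel_size,
                                   padding=padding, groups=in_chs, bias=False)
        self.pointwise = nn.Conv2d(in_chs, out_chs, kernel_size=1, bias=False)

    def forward(self, x):
        return self.pointwise(self.depthwise(x))

sep = SeparableConv2d(32, 64)
y = sep(torch.randn(1, 32, 16, 16))
# 32*9 depthwise + 32*64 pointwise = 2336 weights,
# versus 32*64*9 = 18432 for a dense 3x3 convolution
n_params = sum(p.numel() for p in sep.parameters())
```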
The weights from this model were ported from [Tensorflow/Models](https://github.com/tensorflow/models).
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('xception', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 predictions class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `xception`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction), just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('xception', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model afresh.
## Citation
```BibTeX
@misc{chollet2017xception,
title={Xception: Deep Learning with Depthwise Separable Convolutions},
author={François Chollet},
year={2017},
eprint={1610.02357},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
<!--
Type: model-index
Collections:
- Name: Xception
Paper:
Title: 'Xception: Deep Learning with Depthwise Separable Convolutions'
URL: https://paperswithcode.com/paper/xception-deep-learning-with-depthwise
Models:
- Name: xception
In Collection: Xception
Metadata:
FLOPs: 10600506792
Parameters: 22860000
File Size: 91675053
Architecture:
- 1x1 Convolution
- Convolution
- Dense Connections
- Depthwise Separable Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: xception
Crop Pct: '0.897'
Image Size: '299'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/xception.py#L229
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/xception-43020ad28.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 79.05%
Top 5 Accuracy: 94.4%
- Name: xception41
In Collection: Xception
Metadata:
FLOPs: 11681983232
Parameters: 26970000
File Size: 108422028
Architecture:
- 1x1 Convolution
- Convolution
- Dense Connections
- Depthwise Separable Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: xception41
Crop Pct: '0.903'
Image Size: '299'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/xception_aligned.py#L181
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_xception_41-e6439c97.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 78.54%
Top 5 Accuracy: 94.28%
- Name: xception65
In Collection: Xception
Metadata:
FLOPs: 17585702144
Parameters: 39920000
File Size: 160536780
Architecture:
- 1x1 Convolution
- Convolution
- Dense Connections
- Depthwise Separable Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: xception65
Crop Pct: '0.903'
Image Size: '299'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/xception_aligned.py#L200
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_xception_65-c9ae96e8.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 79.55%
Top 5 Accuracy: 94.66%
- Name: xception71
In Collection: Xception
Metadata:
FLOPs: 22817346560
Parameters: 42340000
File Size: 170295556
Architecture:
- 1x1 Convolution
- Convolution
- Dense Connections
- Depthwise Separable Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: xception71
Crop Pct: '0.903'
Image Size: '299'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/xception_aligned.py#L219
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_xception_71-8eec7df1.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 79.88%
Top 5 Accuracy: 94.93%
-->
# Instagram ResNeXt WSL
A **ResNeXt** repeats a [building block](https://paperswithcode.com/method/resnext-block) that aggregates a set of transformations with the same topology. Compared to a [ResNet](https://paperswithcode.com/method/resnet), it exposes a new dimension, *cardinality* (the size of the set of transformations) \\( C \\), as an essential factor in addition to the dimensions of depth and width.
These models were trained on billions of Instagram images, using thousands of distinct hashtags as labels, and exhibit excellent transfer learning performance.
Please note the CC-BY-NC 4.0 license on these weights: non-commercial use only.
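Cardinality is realized with grouped convolutions: a grouped 3x3 convolution with cardinality \\( C = 32 \\) splits the channels into 32 groups and convolves each independently, using \\( 1/C \\) of the parameters of the equivalent dense convolution. A quick comparison (illustrative channel counts, not a specific ResNeXt stage):

```python
import torch.nn as nn

# cardinality C=32: 128 channels split into 32 groups of 4, convolved independently
grouped = nn.Conv2d(128, 128, kernel_size=3, padding=1, groups=32, bias=False)
dense = nn.Conv2d(128, 128, kernel_size=3, padding=1, bias=False)

grouped_w = grouped.weight.numel()  # 128 * (128/32) * 3 * 3 = 4608
dense_w = dense.weight.numel()      # 128 * 128 * 3 * 3 = 147456
```

The parameter savings are reinvested in width, which is how ResNeXt trades depth/width for cardinality at constant budget.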
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('ig_resnext101_32x16d', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 predictions class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `ig_resnext101_32x16d`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction), just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('ig_resnext101_32x16d', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model afresh.
## Citation
```BibTeX
@misc{mahajan2018exploring,
title={Exploring the Limits of Weakly Supervised Pretraining},
author={Dhruv Mahajan and Ross Girshick and Vignesh Ramanathan and Kaiming He and Manohar Paluri and Yixuan Li and Ashwin Bharambe and Laurens van der Maaten},
year={2018},
eprint={1805.00932},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
<!--
Type: model-index
Collections:
- Name: IG ResNeXt
Paper:
Title: Exploring the Limits of Weakly Supervised Pretraining
URL: https://paperswithcode.com/paper/exploring-the-limits-of-weakly-supervised
Models:
- Name: ig_resnext101_32x16d
In Collection: IG ResNeXt
Metadata:
FLOPs: 46623691776
Parameters: 194030000
File Size: 777518664
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Global Average Pooling
- Grouped Convolution
- Max Pooling
- ReLU
- ResNeXt Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Techniques:
- Nesterov Accelerated Gradient
- Weight Decay
Training Data:
- IG-3.5B-17k
- ImageNet
Training Resources: 336x GPUs
ID: ig_resnext101_32x16d
Epochs: 100
Layers: 101
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 8064
Image Size: '224'
Weight Decay: 0.001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/resnet.py#L874
Weights: https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 84.16%
Top 5 Accuracy: 97.19%
- Name: ig_resnext101_32x32d
In Collection: IG ResNeXt
Metadata:
FLOPs: 112225170432
Parameters: 468530000
File Size: 1876573776
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Global Average Pooling
- Grouped Convolution
- Max Pooling
- ReLU
- ResNeXt Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Techniques:
- Nesterov Accelerated Gradient
- Weight Decay
Training Data:
- IG-3.5B-17k
- ImageNet
Training Resources: 336x GPUs
ID: ig_resnext101_32x32d
Epochs: 100
Layers: 101
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 8064
Image Size: '224'
Weight Decay: 0.001
Interpolation: bilinear
Minibatch Size: 8064
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/resnet.py#L885
Weights: https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 85.09%
Top 5 Accuracy: 97.44%
- Name: ig_resnext101_32x48d
In Collection: IG ResNeXt
Metadata:
FLOPs: 197446554624
Parameters: 828410000
File Size: 3317136976
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Global Average Pooling
- Grouped Convolution
- Max Pooling
- ReLU
- ResNeXt Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Techniques:
- Nesterov Accelerated Gradient
- Weight Decay
Training Data:
- IG-3.5B-17k
- ImageNet
Training Resources: 336x GPUs
ID: ig_resnext101_32x48d
Epochs: 100
Layers: 101
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 8064
Image Size: '224'
Weight Decay: 0.001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/resnet.py#L896
Weights: https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 85.42%
Top 5 Accuracy: 97.58%
- Name: ig_resnext101_32x8d
In Collection: IG ResNeXt
Metadata:
FLOPs: 21180417024
Parameters: 88790000
File Size: 356056638
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Global Average Pooling
- Grouped Convolution
- Max Pooling
- ReLU
- ResNeXt Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Techniques:
- Nesterov Accelerated Gradient
- Weight Decay
Training Data:
- IG-3.5B-17k
- ImageNet
Training Resources: 336x GPUs
ID: ig_resnext101_32x8d
Epochs: 100
Layers: 101
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 8064
Image Size: '224'
Weight Decay: 0.001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/resnet.py#L863
Weights: https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 82.7%
Top 5 Accuracy: 96.64%
-->
# (Legacy) SE-ResNet
**SE ResNet** is a variant of a [ResNet](https://www.paperswithcode.com/method/resnet) that employs [squeeze-and-excitation blocks](https://paperswithcode.com/method/squeeze-and-excitation-block) to enable the network to perform dynamic channel-wise feature recalibration.
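The squeeze-and-excitation operation can be sketched in a few lines: global-average-pool each channel to a descriptor ("squeeze"), pass it through a small bottleneck MLP ending in a sigmoid ("excitation"), and rescale the feature map channel-wise. A simplified sketch, not timm's `SEModule`:

```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Squeeze-and-excitation: pool to a channel descriptor, then gate each channel."""
    def __init__(self, chs, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(chs, chs // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(chs // reduction, chs),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        s = x.mean(dim=(2, 3))            # squeeze: (B, C) channel descriptor
        w = self.fc(s).view(b, c, 1, 1)   # excitation: per-channel gates in (0, 1)
        return x * w                      # recalibrate the feature map

se = SEBlock(64)
x_in = torch.randn(2, 64, 8, 8)
y = se(x_in)
```

Because the gates lie in \\( (0, 1) \\), the block can only attenuate channels, learning which ones matter for the current input.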
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('legacy_seresnet101', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 predictions class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `legacy_seresnet101`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction); just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('legacy_seresnet101', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model afresh.
## Citation
```BibTeX
@misc{hu2019squeezeandexcitation,
title={Squeeze-and-Excitation Networks},
author={Jie Hu and Li Shen and Samuel Albanie and Gang Sun and Enhua Wu},
year={2019},
eprint={1709.01507},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
<!--
Type: model-index
Collections:
- Name: Legacy SE ResNet
Paper:
Title: Squeeze-and-Excitation Networks
URL: https://paperswithcode.com/paper/squeeze-and-excitation-networks
Models:
- Name: legacy_seresnet101
In Collection: Legacy SE ResNet
Metadata:
FLOPs: 9762614000
Parameters: 49330000
File Size: 197822624
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
- Squeeze-and-Excitation Block
Tasks:
- Image Classification
Training Techniques:
- Label Smoothing
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 8x NVIDIA Titan X GPUs
ID: legacy_seresnet101
LR: 0.6
Epochs: 100
Layers: 101
Dropout: 0.2
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 1024
Image Size: '224'
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/senet.py#L426
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/se_resnet101-7e38fcc6.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 78.38%
Top 5 Accuracy: 94.26%
- Name: legacy_seresnet152
In Collection: Legacy SE ResNet
Metadata:
FLOPs: 14553578160
Parameters: 66819999
File Size: 268033864
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
- Squeeze-and-Excitation Block
Tasks:
- Image Classification
Training Techniques:
- Label Smoothing
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 8x NVIDIA Titan X GPUs
ID: legacy_seresnet152
LR: 0.6
Epochs: 100
Layers: 152
Dropout: 0.2
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 1024
Image Size: '224'
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/senet.py#L433
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/se_resnet152-d17c99b7.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 78.67%
Top 5 Accuracy: 94.38%
- Name: legacy_seresnet18
In Collection: Legacy SE ResNet
Metadata:
FLOPs: 2328876024
Parameters: 11780000
File Size: 47175663
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
- Squeeze-and-Excitation Block
Tasks:
- Image Classification
Training Techniques:
- Label Smoothing
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 8x NVIDIA Titan X GPUs
ID: legacy_seresnet18
LR: 0.6
Epochs: 100
Layers: 18
Dropout: 0.2
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 1024
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/senet.py#L405
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet18-4bb0ce65.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 71.74%
Top 5 Accuracy: 90.34%
- Name: legacy_seresnet34
In Collection: Legacy SE ResNet
Metadata:
FLOPs: 4706201004
Parameters: 21960000
File Size: 87958697
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
- Squeeze-and-Excitation Block
Tasks:
- Image Classification
Training Techniques:
- Label Smoothing
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 8x NVIDIA Titan X GPUs
ID: legacy_seresnet34
LR: 0.6
Epochs: 100
Layers: 34
Dropout: 0.2
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 1024
Image Size: '224'
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/senet.py#L412
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet34-a4004e63.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 74.79%
Top 5 Accuracy: 92.13%
- Name: legacy_seresnet50
In Collection: Legacy SE ResNet
Metadata:
FLOPs: 4974351024
Parameters: 28090000
File Size: 112611220
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
- Squeeze-and-Excitation Block
Tasks:
- Image Classification
Training Techniques:
- Label Smoothing
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 8x NVIDIA Titan X GPUs
ID: legacy_seresnet50
LR: 0.6
Epochs: 100
Layers: 50
Dropout: 0.2
Crop Pct: '0.875'
Momentum: 0.9
Image Size: '224'
Interpolation: bilinear
  Batch Size: 1024
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/senet.py#L419
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/se_resnet50-ce0d4300.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 77.64%
Top 5 Accuracy: 93.74%
-->
hf_public_repos/pytorch-image-models/hfdocs/source/models/gloun-resnet.mdx
# (Gluon) ResNet
**Residual Networks**, or **ResNets**, learn residual functions with reference to the layer inputs, instead of learning unreferenced functions. Instead of hoping each few stacked layers directly fit a desired underlying mapping, residual nets let these layers fit a residual mapping. They stack [residual blocks](https://paperswithcode.com/method/residual-block) on top of each other to form networks: e.g. a ResNet-50 has fifty layers using these blocks.
The weights from this model were ported from [Gluon](https://cv.gluon.ai/model_zoo/classification.html).
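The residual computation can be sketched in a few lines of PyTorch; this is a simplified bottleneck block (assuming matching input/output shapes — real blocks also handle stride and channel changes with a projection shortcut):

```python
import torch
import torch.nn as nn

class BottleneckBlock(nn.Module):
    """1x1 reduce -> 3x3 -> 1x1 expand, plus an identity skip connection."""
    def __init__(self, channels, bottleneck=4):
        super().__init__()
        mid = channels // bottleneck
        self.body = nn.Sequential(
            nn.Conv2d(channels, mid, 1, bias=False), nn.BatchNorm2d(mid), nn.ReLU(inplace=True),
            nn.Conv2d(mid, mid, 3, padding=1, bias=False), nn.BatchNorm2d(mid), nn.ReLU(inplace=True),
            nn.Conv2d(mid, channels, 1, bias=False), nn.BatchNorm2d(channels),
        )

    def forward(self, x):
        # F(x) + x: the block learns the residual F rather than the full mapping
        return torch.relu(self.body(x) + x)

x = torch.randn(1, 256, 14, 14)
y = BottleneckBlock(256)(x)
print(y.shape)  # torch.Size([1, 256, 14, 14])
```

The skip connection means the block only has to learn a correction to the identity, which is what makes very deep stacks (50, 101, 152 layers) trainable.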
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('gluon_resnet101_v1b', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 predictions class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `gluon_resnet101_v1b`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction); just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('gluon_resnet101_v1b', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model afresh.
## Citation
```BibTeX
@article{DBLP:journals/corr/HeZRS15,
author = {Kaiming He and
Xiangyu Zhang and
Shaoqing Ren and
Jian Sun},
title = {Deep Residual Learning for Image Recognition},
journal = {CoRR},
volume = {abs/1512.03385},
year = {2015},
url = {http://arxiv.org/abs/1512.03385},
archivePrefix = {arXiv},
eprint = {1512.03385},
timestamp = {Wed, 17 Apr 2019 17:23:45 +0200},
biburl = {https://dblp.org/rec/journals/corr/HeZRS15.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
<!--
Type: model-index
Collections:
- Name: Gloun ResNet
Paper:
Title: Deep Residual Learning for Image Recognition
URL: https://paperswithcode.com/paper/deep-residual-learning-for-image-recognition
Models:
- Name: gluon_resnet101_v1b
In Collection: Gloun ResNet
Metadata:
FLOPs: 10068547584
Parameters: 44550000
File Size: 178723172
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: gluon_resnet101_v1b
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/gluon_resnet.py#L89
Weights: https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1b-3b017079.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 79.3%
Top 5 Accuracy: 94.53%
- Name: gluon_resnet101_v1c
In Collection: Gloun ResNet
Metadata:
FLOPs: 10376567296
Parameters: 44570000
File Size: 178802575
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: gluon_resnet101_v1c
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/gluon_resnet.py#L113
Weights: https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1c-1f26822a.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 79.53%
Top 5 Accuracy: 94.59%
- Name: gluon_resnet101_v1d
In Collection: Gloun ResNet
Metadata:
FLOPs: 10377018880
Parameters: 44570000
File Size: 178802755
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: gluon_resnet101_v1d
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/gluon_resnet.py#L138
Weights: https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1d-0f9c8644.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 80.4%
Top 5 Accuracy: 95.02%
- Name: gluon_resnet101_v1s
In Collection: Gloun ResNet
Metadata:
FLOPs: 11805511680
Parameters: 44670000
File Size: 179221777
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: gluon_resnet101_v1s
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/gluon_resnet.py#L166
Weights: https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1s-60fe0cc1.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 80.29%
Top 5 Accuracy: 95.16%
- Name: gluon_resnet152_v1b
In Collection: Gloun ResNet
Metadata:
FLOPs: 14857660416
Parameters: 60190000
File Size: 241534001
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: gluon_resnet152_v1b
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/gluon_resnet.py#L97
Weights: https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1b-c1edb0dd.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 79.69%
Top 5 Accuracy: 94.73%
- Name: gluon_resnet152_v1c
In Collection: Gloun ResNet
Metadata:
FLOPs: 15165680128
Parameters: 60210000
File Size: 241613404
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: gluon_resnet152_v1c
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/gluon_resnet.py#L121
Weights: https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1c-a3bb0b98.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 79.91%
Top 5 Accuracy: 94.85%
- Name: gluon_resnet152_v1d
In Collection: Gloun ResNet
Metadata:
FLOPs: 15166131712
Parameters: 60210000
File Size: 241613584
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: gluon_resnet152_v1d
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/gluon_resnet.py#L147
Weights: https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1d-bd354e12.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 80.48%
Top 5 Accuracy: 95.2%
- Name: gluon_resnet152_v1s
In Collection: Gloun ResNet
Metadata:
FLOPs: 16594624512
Parameters: 60320000
File Size: 242032606
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: gluon_resnet152_v1s
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/gluon_resnet.py#L175
Weights: https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1s-dcc41b81.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 81.02%
Top 5 Accuracy: 95.42%
- Name: gluon_resnet18_v1b
In Collection: Gloun ResNet
Metadata:
FLOPs: 2337073152
Parameters: 11690000
File Size: 46816736
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: gluon_resnet18_v1b
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/gluon_resnet.py#L65
Weights: https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet18_v1b-0757602b.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 70.84%
Top 5 Accuracy: 89.76%
- Name: gluon_resnet34_v1b
In Collection: Gloun ResNet
Metadata:
FLOPs: 4718469120
Parameters: 21800000
File Size: 87295112
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: gluon_resnet34_v1b
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/gluon_resnet.py#L73
Weights: https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet34_v1b-c6d82d59.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 74.59%
Top 5 Accuracy: 92.0%
- Name: gluon_resnet50_v1b
In Collection: Gloun ResNet
Metadata:
FLOPs: 5282531328
Parameters: 25560000
File Size: 102493763
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: gluon_resnet50_v1b
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/gluon_resnet.py#L81
Weights: https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1b-0ebe02e2.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 77.58%
Top 5 Accuracy: 93.72%
- Name: gluon_resnet50_v1c
In Collection: Gloun ResNet
Metadata:
FLOPs: 5590551040
Parameters: 25580000
File Size: 102573166
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: gluon_resnet50_v1c
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/gluon_resnet.py#L105
Weights: https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1c-48092f55.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 78.01%
Top 5 Accuracy: 93.99%
- Name: gluon_resnet50_v1d
In Collection: Gloun ResNet
Metadata:
FLOPs: 5591002624
Parameters: 25580000
File Size: 102573346
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: gluon_resnet50_v1d
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/gluon_resnet.py#L129
Weights: https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1d-818a1b1b.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 79.06%
Top 5 Accuracy: 94.46%
- Name: gluon_resnet50_v1s
In Collection: Gloun ResNet
Metadata:
FLOPs: 7019495424
Parameters: 25680000
File Size: 102992368
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: gluon_resnet50_v1s
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/gluon_resnet.py#L156
Weights: https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1s-1762acc0.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 78.7%
Top 5 Accuracy: 94.25%
-->
hf_public_repos/pytorch-image-models/hfdocs/source/models/legacy-senet.mdx
# (Legacy) SENet
A **SENet** is a convolutional neural network architecture that employs [squeeze-and-excitation blocks](https://paperswithcode.com/method/squeeze-and-excitation-block) to enable the network to perform dynamic channel-wise feature recalibration.
The weights from this model were ported from Gluon.
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('legacy_senet154', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 predictions class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `legacy_senet154`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction); just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('legacy_senet154', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model afresh.
## Citation
```BibTeX
@misc{hu2019squeezeandexcitation,
title={Squeeze-and-Excitation Networks},
author={Jie Hu and Li Shen and Samuel Albanie and Gang Sun and Enhua Wu},
year={2019},
eprint={1709.01507},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
<!--
Type: model-index
Collections:
- Name: Legacy SENet
Paper:
Title: Squeeze-and-Excitation Networks
URL: https://paperswithcode.com/paper/squeeze-and-excitation-networks
Models:
- Name: legacy_senet154
In Collection: Legacy SENet
Metadata:
FLOPs: 26659556016
Parameters: 115090000
File Size: 461488402
Architecture:
- Convolution
- Dense Connections
- Global Average Pooling
- Max Pooling
- Softmax
- Squeeze-and-Excitation Block
Tasks:
- Image Classification
Training Techniques:
- Label Smoothing
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
Training Resources: 8x NVIDIA Titan X GPUs
ID: legacy_senet154
LR: 0.6
Epochs: 100
Layers: 154
Dropout: 0.2
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 1024
Image Size: '224'
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/senet.py#L440
Weights: http://data.lip6.fr/cadene/pretrainedmodels/senet154-c7b49a05.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 81.33%
Top 5 Accuracy: 95.51%
-->
hf_public_repos/pytorch-image-models/hfdocs/source/models/efficientnet-pruned.mdx
# EfficientNet (Knapsack Pruned)
**EfficientNet** is a convolutional neural network architecture and scaling method that uniformly scales all dimensions of depth/width/resolution using a *compound coefficient*. Unlike conventional practice that arbitrarily scales these factors, the EfficientNet scaling method uniformly scales network width, depth, and resolution with a set of fixed scaling coefficients. For example, if we want to use \\( 2^N \\) times more computational resources, then we can simply increase the network depth by \\( \alpha ^ N \\), width by \\( \beta ^ N \\), and image size by \\( \gamma ^ N \\), where \\( \alpha, \beta, \gamma \\) are constant coefficients determined by a small grid search on the original small model. EfficientNet uses a compound coefficient \\( \phi \\) to uniformly scale network width, depth, and resolution in a principled way.
The compound scaling method is justified by the intuition that if the input image is bigger, then the network needs more layers to increase the receptive field and more channels to capture more fine-grained patterns on the bigger image.
The base EfficientNet-B0 network is based on the inverted bottleneck residual blocks of [MobileNetV2](https://paperswithcode.com/method/mobilenetv2), in addition to [squeeze-and-excitation blocks](https://paperswithcode.com/method/squeeze-and-excitation-block).
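Using the coefficients reported in the EfficientNet paper (\\( \alpha \approx 1.2, \beta \approx 1.1, \gamma \approx 1.15 \\), taken as given here), the compound-scaling arithmetic for a coefficient \\( \phi \\) is a few lines; note the actual released variants round these to hand-tuned values:

```python
# Compound scaling: depth, width and resolution all grow with phi, under the
# constraint alpha * beta**2 * gamma**2 ~= 2, so FLOPs roughly double per
# unit increase in phi.
ALPHA, BETA, GAMMA = 1.2, 1.1, 1.15  # from the paper's grid search on B0

def compound_scale(phi, base_depth=1.0, base_width=1.0, base_resolution=224):
    depth = base_depth * ALPHA ** phi            # layer-count multiplier
    width = base_width * BETA ** phi             # channel-count multiplier
    resolution = round(base_resolution * GAMMA ** phi)  # input image size
    return depth, width, resolution

for phi in range(4):
    d, w, r = compound_scale(phi)
    print(f"phi={phi}: depth x{d:.2f}, width x{w:.2f}, resolution {r}")
```

This makes the intuition in the previous paragraph concrete: a larger input image (bigger \\( \gamma ^ \phi \\)) is always accompanied by proportionally more layers and channels.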
This collection consists of pruned EfficientNet models.
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('efficientnet_b1_pruned', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 predictions class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `efficientnet_b1_pruned`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction); just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('efficientnet_b1_pruned', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model afresh.
## Citation
```BibTeX
@misc{tan2020efficientnet,
title={EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks},
author={Mingxing Tan and Quoc V. Le},
year={2020},
eprint={1905.11946},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
```
```BibTeX
@misc{aflalo2020knapsack,
title={Knapsack Pruning with Inner Distillation},
author={Yonathan Aflalo and Asaf Noy and Ming Lin and Itamar Friedman and Lihi Zelnik},
year={2020},
eprint={2002.08258},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
```
<!--
Type: model-index
Collections:
- Name: EfficientNet Pruned
Paper:
Title: Knapsack Pruning with Inner Distillation
URL: https://paperswithcode.com/paper/knapsack-pruning-with-inner-distillation
Models:
- Name: efficientnet_b1_pruned
In Collection: EfficientNet Pruned
Metadata:
FLOPs: 489653114
Parameters: 6330000
File Size: 25595162
Architecture:
- 1x1 Convolution
- Average Pooling
- Batch Normalization
- Convolution
- Dense Connections
- Dropout
- Inverted Residual Block
- Squeeze-and-Excitation Block
- Swish
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: efficientnet_b1_pruned
Crop Pct: '0.882'
Image Size: '240'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/a7f95818e44b281137503bcf4b3e3e94d8ffa52f/timm/models/efficientnet.py#L1208
Weights: https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb1_pruned_9ebb3fe6.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 78.25%
Top 5 Accuracy: 93.84%
- Name: efficientnet_b2_pruned
In Collection: EfficientNet Pruned
Metadata:
FLOPs: 878133915
Parameters: 8310000
File Size: 33555005
Architecture:
- 1x1 Convolution
- Average Pooling
- Batch Normalization
- Convolution
- Dense Connections
- Dropout
- Inverted Residual Block
- Squeeze-and-Excitation Block
- Swish
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: efficientnet_b2_pruned
Crop Pct: '0.89'
Image Size: '260'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/a7f95818e44b281137503bcf4b3e3e94d8ffa52f/timm/models/efficientnet.py#L1219
Weights: https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb2_pruned_203f55bc.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 79.91%
Top 5 Accuracy: 94.86%
- Name: efficientnet_b3_pruned
In Collection: EfficientNet Pruned
Metadata:
FLOPs: 1239590641
Parameters: 9860000
File Size: 39770812
Architecture:
- 1x1 Convolution
- Average Pooling
- Batch Normalization
- Convolution
- Dense Connections
- Dropout
- Inverted Residual Block
- Squeeze-and-Excitation Block
- Swish
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: efficientnet_b3_pruned
Crop Pct: '0.904'
Image Size: '300'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/a7f95818e44b281137503bcf4b3e3e94d8ffa52f/timm/models/efficientnet.py#L1230
Weights: https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb3_pruned_5abcc29f.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 80.86%
Top 5 Accuracy: 95.24%
-->
<!-- hf_public_repos/pytorch-image-models/hfdocs/source/models/tf-efficientnet-condconv.mdx -->
# (Tensorflow) EfficientNet CondConv
**EfficientNet** is a convolutional neural network architecture and scaling method that uniformly scales all dimensions of depth/width/resolution using a *compound coefficient*. Unlike conventional practice that arbitrarily scales these factors, the EfficientNet scaling method uniformly scales network width, depth, and resolution with a set of fixed scaling coefficients. For example, if we want to use \\( 2^N \\) times more computational resources, then we can simply increase the network depth by \\( \alpha ^ N \\), width by \\( \beta ^ N \\), and image size by \\( \gamma ^ N \\), where \\( \alpha, \beta, \gamma \\) are constant coefficients determined by a small grid search on the original small model. EfficientNet uses a compound coefficient \\( \phi \\) to uniformly scale network width, depth, and resolution in a principled way.
The compound scaling method is justified by the intuition that if the input image is bigger, then the network needs more layers to increase the receptive field and more channels to capture more fine-grained patterns on the bigger image.
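As a sketch, the scaling rule above fits in a few lines. The base coefficients \\( \alpha=1.2, \beta=1.1, \gamma=1.15 \\) are the grid-searched values reported in the EfficientNet paper; the function name is illustrative, not a timm API:

```py
# EfficientNet compound scaling: one exponent phi scales depth, width and
# resolution together. alpha/beta/gamma are the paper's base coefficients.
def compound_scale(phi, alpha=1.2, beta=1.1, gamma=1.15):
    depth = alpha ** phi        # layer-count multiplier
    width = beta ** phi         # channel-count multiplier
    resolution = gamma ** phi   # input-image-size multiplier
    return depth, width, resolution

d, w, r = compound_scale(1)     # one step up from the B0 baseline
```

Because FLOPs grow roughly with depth × width² × resolution², the constraint \\( \alpha \cdot \beta^2 \cdot \gamma^2 \approx 2 \\) means each increment of \\( \phi \\) approximately doubles the compute budget.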
The base EfficientNet-B0 network is based on the inverted bottleneck residual blocks of [MobileNetV2](https://paperswithcode.com/method/mobilenetv2), in addition to squeeze-and-excitation blocks.
This collection of models amends EfficientNet by adding [CondConv](https://paperswithcode.com/method/condconv) convolutions.
The weights from this model were ported from [Tensorflow/TPU](https://github.com/tensorflow/tpu).
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('tf_efficientnet_cc_b0_4e', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 predicted class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `tf_efficientnet_cc_b0_4e`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction), just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('tf_efficientnet_cc_b0_4e', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model afresh.
## Citation
```BibTeX
@article{DBLP:journals/corr/abs-1904-04971,
author = {Brandon Yang and
Gabriel Bender and
Quoc V. Le and
Jiquan Ngiam},
title = {Soft Conditional Computation},
journal = {CoRR},
volume = {abs/1904.04971},
year = {2019},
url = {http://arxiv.org/abs/1904.04971},
archivePrefix = {arXiv},
eprint = {1904.04971},
timestamp = {Thu, 25 Apr 2019 13:55:01 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-1904-04971.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
<!--
Type: model-index
Collections:
- Name: TF EfficientNet CondConv
Paper:
Title: 'CondConv: Conditionally Parameterized Convolutions for Efficient Inference'
URL: https://paperswithcode.com/paper/soft-conditional-computation
Models:
- Name: tf_efficientnet_cc_b0_4e
In Collection: TF EfficientNet CondConv
Metadata:
FLOPs: 224153788
Parameters: 13310000
File Size: 53490940
Architecture:
- 1x1 Convolution
- Average Pooling
- Batch Normalization
- CondConv
- Convolution
- Dense Connections
- Dropout
- Inverted Residual Block
- Squeeze-and-Excitation Block
- Swish
Tasks:
- Image Classification
Training Techniques:
- AutoAugment
- Label Smoothing
- RMSProp
- Stochastic Depth
- Weight Decay
Training Data:
- ImageNet
ID: tf_efficientnet_cc_b0_4e
LR: 0.256
Epochs: 350
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 2048
Image Size: '224'
Weight Decay: 1.0e-05
Interpolation: bicubic
RMSProp Decay: 0.9
Label Smoothing: 0.1
BatchNorm Momentum: 0.99
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/efficientnet.py#L1561
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b0_4e-4362b6b2.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 77.32%
Top 5 Accuracy: 93.32%
- Name: tf_efficientnet_cc_b0_8e
In Collection: TF EfficientNet CondConv
Metadata:
FLOPs: 224158524
Parameters: 24010000
File Size: 96287616
Architecture:
- 1x1 Convolution
- Average Pooling
- Batch Normalization
- CondConv
- Convolution
- Dense Connections
- Dropout
- Inverted Residual Block
- Squeeze-and-Excitation Block
- Swish
Tasks:
- Image Classification
Training Techniques:
- AutoAugment
- Label Smoothing
- RMSProp
- Stochastic Depth
- Weight Decay
Training Data:
- ImageNet
ID: tf_efficientnet_cc_b0_8e
LR: 0.256
Epochs: 350
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 2048
Image Size: '224'
Weight Decay: 1.0e-05
Interpolation: bicubic
RMSProp Decay: 0.9
Label Smoothing: 0.1
BatchNorm Momentum: 0.99
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/efficientnet.py#L1572
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b0_8e-66184a25.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 77.91%
Top 5 Accuracy: 93.65%
- Name: tf_efficientnet_cc_b1_8e
In Collection: TF EfficientNet CondConv
Metadata:
FLOPs: 370427824
Parameters: 39720000
File Size: 159206198
Architecture:
- 1x1 Convolution
- Average Pooling
- Batch Normalization
- CondConv
- Convolution
- Dense Connections
- Dropout
- Inverted Residual Block
- Squeeze-and-Excitation Block
- Swish
Tasks:
- Image Classification
Training Techniques:
- AutoAugment
- Label Smoothing
- RMSProp
- Stochastic Depth
- Weight Decay
Training Data:
- ImageNet
ID: tf_efficientnet_cc_b1_8e
LR: 0.256
Epochs: 350
Crop Pct: '0.882'
Momentum: 0.9
Batch Size: 2048
Image Size: '240'
Weight Decay: 1.0e-05
Interpolation: bicubic
RMSProp Decay: 0.9
Label Smoothing: 0.1
BatchNorm Momentum: 0.99
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/efficientnet.py#L1584
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b1_8e-f7c79ae1.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 79.33%
Top 5 Accuracy: 94.37%
-->
<!-- hf_public_repos/pytorch-image-models/hfdocs/source/models/spnasnet.mdx -->
# SPNASNet
**Single-Path NAS** is a differentiable neural architecture search (NAS) method for designing hardware-efficient ConvNets in less than 4 hours of search time. Instead of maintaining separate weights per candidate operation, it shares them in a single "superkernel", so each architecture decision reduces to a threshold on a subset of that kernel's weights.
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('spnasnet_100', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 predicted class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `spnasnet_100`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction), just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('spnasnet_100', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model afresh.
## Citation
```BibTeX
@misc{stamoulis2019singlepath,
title={Single-Path NAS: Designing Hardware-Efficient ConvNets in less than 4 Hours},
author={Dimitrios Stamoulis and Ruizhou Ding and Di Wang and Dimitrios Lymberopoulos and Bodhi Priyantha and Jie Liu and Diana Marculescu},
year={2019},
eprint={1904.02877},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
```
<!--
Type: model-index
Collections:
- Name: SPNASNet
Paper:
Title: 'Single-Path NAS: Designing Hardware-Efficient ConvNets in less than 4
Hours'
URL: https://paperswithcode.com/paper/single-path-nas-designing-hardware-efficient
Models:
- Name: spnasnet_100
In Collection: SPNASNet
Metadata:
FLOPs: 442385600
Parameters: 4420000
File Size: 17902337
Architecture:
- Average Pooling
- Batch Normalization
- Convolution
- Depthwise Separable Convolution
- Dropout
- ReLU
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: spnasnet_100
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/efficientnet.py#L995
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/spnasnet_100-048bc3f4.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 74.08%
Top 5 Accuracy: 91.82%
-->
<!-- hf_public_repos/pytorch-image-models/hfdocs/source/models/gloun-resnext.mdx -->
# (Gluon) ResNeXt
A **ResNeXt** repeats a [building block](https://paperswithcode.com/method/resnext-block) that aggregates a set of transformations with the same topology. Compared to a [ResNet](https://paperswithcode.com/method/resnet), it exposes a new dimension, *cardinality* (the size of the set of transformations) \\( C \\), as an essential factor in addition to the dimensions of depth and width.
The weights from this model were ported from [Gluon](https://cv.gluon.ai/model_zoo/classification.html).
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('gluon_resnext101_32x4d', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 predicted class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `gluon_resnext101_32x4d`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction), just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('gluon_resnext101_32x4d', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model afresh.
## Citation
```BibTeX
@article{DBLP:journals/corr/XieGDTH16,
author = {Saining Xie and
Ross B. Girshick and
Piotr Doll{\'{a}}r and
Zhuowen Tu and
Kaiming He},
title = {Aggregated Residual Transformations for Deep Neural Networks},
journal = {CoRR},
volume = {abs/1611.05431},
year = {2016},
url = {http://arxiv.org/abs/1611.05431},
archivePrefix = {arXiv},
eprint = {1611.05431},
timestamp = {Mon, 13 Aug 2018 16:45:58 +0200},
biburl = {https://dblp.org/rec/journals/corr/XieGDTH16.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
<!--
Type: model-index
Collections:
- Name: Gloun ResNeXt
Paper:
Title: Aggregated Residual Transformations for Deep Neural Networks
URL: https://paperswithcode.com/paper/aggregated-residual-transformations-for-deep
Models:
- Name: gluon_resnext101_32x4d
In Collection: Gloun ResNeXt
Metadata:
FLOPs: 10298145792
Parameters: 44180000
File Size: 177367414
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Global Average Pooling
- Grouped Convolution
- Max Pooling
- ReLU
- ResNeXt Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: gluon_resnext101_32x4d
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/gluon_resnet.py#L193
Weights: https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext101_32x4d-b253c8c4.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 80.33%
Top 5 Accuracy: 94.91%
- Name: gluon_resnext101_64x4d
In Collection: Gloun ResNeXt
Metadata:
FLOPs: 19954172928
Parameters: 83460000
File Size: 334737852
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Global Average Pooling
- Grouped Convolution
- Max Pooling
- ReLU
- ResNeXt Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: gluon_resnext101_64x4d
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/gluon_resnet.py#L201
Weights: https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext101_64x4d-f9a8e184.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 80.63%
Top 5 Accuracy: 95.0%
- Name: gluon_resnext50_32x4d
In Collection: Gloun ResNeXt
Metadata:
FLOPs: 5472648192
Parameters: 25030000
File Size: 100441719
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Global Average Pooling
- Grouped Convolution
- Max Pooling
- ReLU
- ResNeXt Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: gluon_resnext50_32x4d
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/gluon_resnet.py#L185
Weights: https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext50_32x4d-e6a097c1.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 79.35%
Top 5 Accuracy: 94.42%
-->
<!-- hf_public_repos/pytorch-image-models/hfdocs/source/models/pnasnet.mdx -->
# PNASNet
**Progressive Neural Architecture Search**, or **PNAS**, is a method for learning the structure of convolutional neural networks (CNNs). It uses a sequential model-based optimization (SMBO) strategy, where we search the space of cell structures, starting with simple (shallow) models and progressing to complex ones, pruning out unpromising structures as we go.
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('pnasnet5large', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 predicted class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `pnasnet5large`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction), just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('pnasnet5large', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model afresh.
## Citation
```BibTeX
@misc{liu2018progressive,
title={Progressive Neural Architecture Search},
author={Chenxi Liu and Barret Zoph and Maxim Neumann and Jonathon Shlens and Wei Hua and Li-Jia Li and Li Fei-Fei and Alan Yuille and Jonathan Huang and Kevin Murphy},
year={2018},
eprint={1712.00559},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
<!--
Type: model-index
Collections:
- Name: PNASNet
Paper:
Title: Progressive Neural Architecture Search
URL: https://paperswithcode.com/paper/progressive-neural-architecture-search
Models:
- Name: pnasnet5large
In Collection: PNASNet
Metadata:
FLOPs: 31458865950
Parameters: 86060000
File Size: 345153926
Architecture:
- Average Pooling
- Batch Normalization
- Convolution
- Depthwise Separable Convolution
- Dropout
- ReLU
Tasks:
- Image Classification
Training Techniques:
- Label Smoothing
- RMSProp
- Weight Decay
Training Data:
- ImageNet
Training Resources: 100x NVIDIA P100 GPUs
ID: pnasnet5large
LR: 0.015
Dropout: 0.5
Crop Pct: '0.911'
Momentum: 0.9
Batch Size: 1600
Image Size: '331'
Interpolation: bicubic
Label Smoothing: 0.1
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/pnasnet.py#L343
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/pnasnet5large-bf079911.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
      Top 1 Accuracy: 82.78%
      Top 5 Accuracy: 96.04%
-->
<!-- hf_public_repos/pytorch-image-models/hfdocs/source/models/swsl-resnext.mdx -->
# SWSL ResNeXt
A **ResNeXt** repeats a [building block](https://paperswithcode.com/method/resnext-block) that aggregates a set of transformations with the same topology. Compared to a [ResNet](https://paperswithcode.com/method/resnet), it exposes a new dimension, *cardinality* (the size of the set of transformations) \\( C \\), as an essential factor in addition to the dimensions of depth and width.
The models in this collection utilise semi-weakly supervised learning to improve model performance. The approach brings important gains to standard architectures for image, video and fine-grained classification.
Please note the CC-BY-NC 4.0 license on these weights, non-commercial use only.
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('swsl_resnext101_32x16d', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 predicted class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `swsl_resnext101_32x16d`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction), just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('swsl_resnext101_32x16d', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model afresh.
## Citation
```BibTeX
@article{DBLP:journals/corr/abs-1905-00546,
author = {I. Zeki Yalniz and
Herv{\'{e}} J{\'{e}}gou and
Kan Chen and
Manohar Paluri and
Dhruv Mahajan},
title = {Billion-scale semi-supervised learning for image classification},
journal = {CoRR},
volume = {abs/1905.00546},
year = {2019},
url = {http://arxiv.org/abs/1905.00546},
archivePrefix = {arXiv},
eprint = {1905.00546},
timestamp = {Mon, 28 Sep 2020 08:19:37 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-1905-00546.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
<!--
Type: model-index
Collections:
- Name: SWSL ResNext
Paper:
Title: Billion-scale semi-supervised learning for image classification
URL: https://paperswithcode.com/paper/billion-scale-semi-supervised-learning-for
Models:
- Name: swsl_resnext101_32x16d
In Collection: SWSL ResNext
Metadata:
FLOPs: 46623691776
Parameters: 194030000
File Size: 777518664
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Global Average Pooling
- Grouped Convolution
- Max Pooling
- ReLU
- ResNeXt Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- IG-1B-Targeted
- ImageNet
Training Resources: 64x GPUs
ID: swsl_resnext101_32x16d
LR: 0.0015
Epochs: 30
Layers: 101
Crop Pct: '0.875'
Batch Size: 1536
Image Size: '224'
Weight Decay: 0.0001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/resnet.py#L1009
Weights: https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x16-f3559a9c.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 83.34%
Top 5 Accuracy: 96.84%
- Name: swsl_resnext101_32x4d
In Collection: SWSL ResNext
Metadata:
FLOPs: 10298145792
Parameters: 44180000
File Size: 177341913
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Global Average Pooling
- Grouped Convolution
- Max Pooling
- ReLU
- ResNeXt Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- IG-1B-Targeted
- ImageNet
Training Resources: 64x GPUs
ID: swsl_resnext101_32x4d
LR: 0.0015
Epochs: 30
Layers: 101
Crop Pct: '0.875'
Batch Size: 1536
Image Size: '224'
Weight Decay: 0.0001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/resnet.py#L987
Weights: https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x4-3f87e46b.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 83.22%
Top 5 Accuracy: 96.77%
- Name: swsl_resnext101_32x8d
In Collection: SWSL ResNext
Metadata:
FLOPs: 21180417024
Parameters: 88790000
File Size: 356056638
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Global Average Pooling
- Grouped Convolution
- Max Pooling
- ReLU
- ResNeXt Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- IG-1B-Targeted
- ImageNet
Training Resources: 64x GPUs
ID: swsl_resnext101_32x8d
LR: 0.0015
Epochs: 30
Layers: 101
Crop Pct: '0.875'
Batch Size: 1536
Image Size: '224'
Weight Decay: 0.0001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/resnet.py#L998
Weights: https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x8-b4712904.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 84.27%
Top 5 Accuracy: 97.17%
- Name: swsl_resnext50_32x4d
In Collection: SWSL ResNext
Metadata:
FLOPs: 5472648192
Parameters: 25030000
File Size: 100428550
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Global Average Pooling
- Grouped Convolution
- Max Pooling
- ReLU
- ResNeXt Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- IG-1B-Targeted
- ImageNet
Training Resources: 64x GPUs
ID: swsl_resnext50_32x4d
LR: 0.0015
Epochs: 30
Layers: 50
Crop Pct: '0.875'
Batch Size: 1536
Image Size: '224'
Weight Decay: 0.0001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/resnet.py#L976
Weights: https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext50_32x4-72679e44.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 82.17%
Top 5 Accuracy: 96.23%
-->
# SSL ResNet
**Residual Networks**, or **ResNets**, learn residual functions with reference to the layer inputs, instead of learning unreferenced functions. Instead of hoping each few stacked layers directly fit a desired underlying mapping, residual nets let these layers fit a residual mapping. They stack [residual blocks](https://paperswithcode.com/method/residual-block) on top of each other to form networks: e.g. a ResNet-50 has fifty layers using these blocks.
The models in this collection utilise semi-supervised learning to improve performance. The approach brings important gains to standard architectures for image, video and fine-grained classification.
Please note the CC-BY-NC 4.0 license on these weights: non-commercial use only.
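The residual mapping idea can be sketched in PyTorch. This toy block (illustrative only, not timm's actual ResNet code) learns a function `F(x)` through two convolutions and returns `F(x) + x` via the skip connection:

```py
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Toy residual block: output = ReLU(F(x) + x)."""

    def __init__(self, channels):
        super().__init__()
        # F(x): two 3x3 convolutions with batch norm, keeping the shape.
        self.f = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
        )
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.relu(self.f(x) + x)  # residual connection

out = ResidualBlock(8)(torch.randn(2, 8, 16, 16))
```

Because the block only has to model the *residual* relative to its input, a deep stack of such blocks is easier to optimize than the same stack without skip connections.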
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('ssl_resnet18', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 predictions class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `ssl_resnet18`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction), just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('ssl_resnet18', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model afresh.
## Citation
```BibTeX
@article{DBLP:journals/corr/abs-1905-00546,
author = {I. Zeki Yalniz and
Herv{\'{e}} J{\'{e}}gou and
Kan Chen and
Manohar Paluri and
Dhruv Mahajan},
title = {Billion-scale semi-supervised learning for image classification},
journal = {CoRR},
volume = {abs/1905.00546},
year = {2019},
url = {http://arxiv.org/abs/1905.00546},
archivePrefix = {arXiv},
eprint = {1905.00546},
timestamp = {Mon, 28 Sep 2020 08:19:37 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-1905-00546.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
<!--
Type: model-index
Collections:
- Name: SSL ResNet
Paper:
Title: Billion-scale semi-supervised learning for image classification
URL: https://paperswithcode.com/paper/billion-scale-semi-supervised-learning-for
Models:
- Name: ssl_resnet18
In Collection: SSL ResNet
Metadata:
FLOPs: 2337073152
Parameters: 11690000
File Size: 46811375
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
- YFCC-100M
Training Resources: 64x GPUs
ID: ssl_resnet18
LR: 0.0015
Epochs: 30
Layers: 18
Crop Pct: '0.875'
Batch Size: 1536
Image Size: '224'
Weight Decay: 0.0001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/resnet.py#L894
Weights: https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet18-d92f0530.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 72.62%
Top 5 Accuracy: 91.42%
- Name: ssl_resnet50
In Collection: SSL ResNet
Metadata:
FLOPs: 5282531328
Parameters: 25560000
File Size: 102480594
Architecture:
- 1x1 Convolution
- Batch Normalization
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Techniques:
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
- YFCC-100M
Training Resources: 64x GPUs
ID: ssl_resnet50
LR: 0.0015
Epochs: 30
Layers: 50
Crop Pct: '0.875'
Batch Size: 1536
Image Size: '224'
Weight Decay: 0.0001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/resnet.py#L904
Weights: https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet50-08389792.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 79.24%
Top 5 Accuracy: 94.83%
-->
# Inception v3
**Inception v3** is a convolutional neural network architecture from the Inception family that makes several improvements, including [Label Smoothing](https://paperswithcode.com/method/label-smoothing), factorized 7 x 7 convolutions, and the use of an [auxiliary classifier](https://paperswithcode.com/method/auxiliary-classifier) to propagate label information lower down the network (along with batch normalization for layers in the side head). The key building block is an [Inception Module](https://paperswithcode.com/method/inception-v3-module).
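As a concrete illustration of the label smoothing used when training Inception v3 (a minimal sketch of the target distribution, not timm's loss implementation): with smoothing factor `eps` and `K` classes, the one-hot target is replaced by `(1 - eps)` on the true class plus `eps / K` spread uniformly over all classes:

```py
def smooth_targets(true_class, num_classes, eps=0.1):
    """Return the label-smoothed target distribution for one example."""
    uniform = eps / num_classes          # mass shared by every class
    targets = [uniform] * num_classes
    targets[true_class] += 1.0 - eps     # remaining mass on the true class
    return targets

# Still a valid distribution, peaked at the true class but never fully confident.
probs = smooth_targets(true_class=2, num_classes=5, eps=0.1)
```

Training against these softened targets discourages the network from becoming over-confident, which the Inception v3 paper reports improves generalization.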
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('inception_v3', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 predictions class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `inception_v3`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction), just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('inception_v3', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model afresh.
## Citation
```BibTeX
@article{DBLP:journals/corr/SzegedyVISW15,
author = {Christian Szegedy and
Vincent Vanhoucke and
Sergey Ioffe and
Jonathon Shlens and
Zbigniew Wojna},
title = {Rethinking the Inception Architecture for Computer Vision},
journal = {CoRR},
volume = {abs/1512.00567},
year = {2015},
url = {http://arxiv.org/abs/1512.00567},
archivePrefix = {arXiv},
eprint = {1512.00567},
timestamp = {Mon, 13 Aug 2018 16:49:07 +0200},
biburl = {https://dblp.org/rec/journals/corr/SzegedyVISW15.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
<!--
Type: model-index
Collections:
- Name: Inception v3
Paper:
Title: Rethinking the Inception Architecture for Computer Vision
URL: https://paperswithcode.com/paper/rethinking-the-inception-architecture-for
Models:
- Name: inception_v3
In Collection: Inception v3
Metadata:
FLOPs: 7352418880
Parameters: 23830000
File Size: 108857766
Architecture:
- 1x1 Convolution
- Auxiliary Classifier
- Average Pooling
- Average Pooling
- Batch Normalization
- Convolution
- Dense Connections
- Dropout
- Inception-v3 Module
- Max Pooling
- ReLU
- Softmax
Tasks:
- Image Classification
Training Techniques:
- Gradient Clipping
- Label Smoothing
- RMSProp
- Weight Decay
Training Data:
- ImageNet
Training Resources: 50x NVIDIA Kepler GPUs
ID: inception_v3
LR: 0.045
Dropout: 0.2
Crop Pct: '0.875'
Momentum: 0.9
Image Size: '299'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/d8e69206be253892b2956341fea09fdebfaae4e3/timm/models/inception_v3.py#L442
Weights: https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 77.46%
Top 5 Accuracy: 93.48%
-->
# Big Transfer (BiT)
**Big Transfer (BiT)** is a pretraining recipe that pre-trains on a large supervised source dataset and fine-tunes the weights on the target task. The models are pre-trained on the JFT-300M dataset, and the variants in this collection are finetuned on ImageNet.
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('resnetv2_101x1_bitm', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 predictions class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `resnetv2_101x1_bitm`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction), just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('resnetv2_101x1_bitm', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model afresh.
## Citation
```BibTeX
@misc{kolesnikov2020big,
title={Big Transfer (BiT): General Visual Representation Learning},
author={Alexander Kolesnikov and Lucas Beyer and Xiaohua Zhai and Joan Puigcerver and Jessica Yung and Sylvain Gelly and Neil Houlsby},
year={2020},
eprint={1912.11370},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
<!--
Type: model-index
Collections:
- Name: Big Transfer
Paper:
Title: 'Big Transfer (BiT): General Visual Representation Learning'
URL: https://paperswithcode.com/paper/large-scale-learning-of-general-visual
Models:
- Name: resnetv2_101x1_bitm
In Collection: Big Transfer
Metadata:
FLOPs: 5330896
Parameters: 44540000
File Size: 178256468
Architecture:
- 1x1 Convolution
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Group Normalization
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
- Weight Standardization
Tasks:
- Image Classification
Training Techniques:
- Mixup
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
- JFT-300M
Training Resources: Cloud TPUv3-512
ID: resnetv2_101x1_bitm
LR: 0.03
Epochs: 90
Layers: 101
Crop Pct: '1.0'
Momentum: 0.9
Batch Size: 4096
Image Size: '480'
Weight Decay: 0.0001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/b9843f954b0457af2db4f9dea41a8538f51f5d78/timm/models/resnetv2.py#L444
Weights: https://storage.googleapis.com/bit_models/BiT-M-R101x1-ILSVRC2012.npz
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 82.21%
Top 5 Accuracy: 96.47%
- Name: resnetv2_101x3_bitm
In Collection: Big Transfer
Metadata:
FLOPs: 15988688
Parameters: 387930000
File Size: 1551830100
Architecture:
- 1x1 Convolution
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Group Normalization
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
- Weight Standardization
Tasks:
- Image Classification
Training Techniques:
- Mixup
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
- JFT-300M
Training Resources: Cloud TPUv3-512
ID: resnetv2_101x3_bitm
LR: 0.03
Epochs: 90
Layers: 101
Crop Pct: '1.0'
Momentum: 0.9
Batch Size: 4096
Image Size: '480'
Weight Decay: 0.0001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/b9843f954b0457af2db4f9dea41a8538f51f5d78/timm/models/resnetv2.py#L451
Weights: https://storage.googleapis.com/bit_models/BiT-M-R101x3-ILSVRC2012.npz
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 84.38%
Top 5 Accuracy: 97.37%
- Name: resnetv2_152x2_bitm
In Collection: Big Transfer
Metadata:
FLOPs: 10659792
Parameters: 236340000
File Size: 945476668
Architecture:
- 1x1 Convolution
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Group Normalization
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
- Weight Standardization
Tasks:
- Image Classification
Training Techniques:
- Mixup
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
- JFT-300M
ID: resnetv2_152x2_bitm
Crop Pct: '1.0'
Image Size: '480'
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/b9843f954b0457af2db4f9dea41a8538f51f5d78/timm/models/resnetv2.py#L458
Weights: https://storage.googleapis.com/bit_models/BiT-M-R152x2-ILSVRC2012.npz
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 84.4%
Top 5 Accuracy: 97.43%
- Name: resnetv2_152x4_bitm
In Collection: Big Transfer
Metadata:
FLOPs: 21317584
Parameters: 936530000
File Size: 3746270104
Architecture:
- 1x1 Convolution
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Group Normalization
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
- Weight Standardization
Tasks:
- Image Classification
Training Techniques:
- Mixup
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
- JFT-300M
Training Resources: Cloud TPUv3-512
ID: resnetv2_152x4_bitm
Crop Pct: '1.0'
Image Size: '480'
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/b9843f954b0457af2db4f9dea41a8538f51f5d78/timm/models/resnetv2.py#L465
Weights: https://storage.googleapis.com/bit_models/BiT-M-R152x4-ILSVRC2012.npz
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 84.95%
Top 5 Accuracy: 97.45%
- Name: resnetv2_50x1_bitm
In Collection: Big Transfer
Metadata:
FLOPs: 5330896
Parameters: 25550000
File Size: 102242668
Architecture:
- 1x1 Convolution
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Group Normalization
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
- Weight Standardization
Tasks:
- Image Classification
Training Techniques:
- Mixup
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
- JFT-300M
Training Resources: Cloud TPUv3-512
ID: resnetv2_50x1_bitm
LR: 0.03
Epochs: 90
Layers: 50
Crop Pct: '1.0'
Momentum: 0.9
Batch Size: 4096
Image Size: '480'
Weight Decay: 0.0001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/b9843f954b0457af2db4f9dea41a8538f51f5d78/timm/models/resnetv2.py#L430
Weights: https://storage.googleapis.com/bit_models/BiT-M-R50x1-ILSVRC2012.npz
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 80.19%
Top 5 Accuracy: 95.63%
- Name: resnetv2_50x3_bitm
In Collection: Big Transfer
Metadata:
FLOPs: 15988688
Parameters: 217320000
File Size: 869321580
Architecture:
- 1x1 Convolution
- Bottleneck Residual Block
- Convolution
- Global Average Pooling
- Group Normalization
- Max Pooling
- ReLU
- Residual Block
- Residual Connection
- Softmax
- Weight Standardization
Tasks:
- Image Classification
Training Techniques:
- Mixup
- SGD with Momentum
- Weight Decay
Training Data:
- ImageNet
- JFT-300M
Training Resources: Cloud TPUv3-512
ID: resnetv2_50x3_bitm
LR: 0.03
Epochs: 90
Layers: 50
Crop Pct: '1.0'
Momentum: 0.9
Batch Size: 4096
Image Size: '480'
Weight Decay: 0.0001
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/b9843f954b0457af2db4f9dea41a8538f51f5d78/timm/models/resnetv2.py#L437
Weights: https://storage.googleapis.com/bit_models/BiT-M-R50x3-ILSVRC2012.npz
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 83.75%
Top 5 Accuracy: 97.12%
-->
# MnasNet
**MnasNet** is a convolutional neural network optimized for mobile devices, discovered through mobile neural architecture search that explicitly incorporates model latency into the main objective, so the search can identify a model with a good trade-off between accuracy and latency. The main building block is an [inverted residual block](https://paperswithcode.com/method/inverted-residual-block) (from [MobileNetV2](https://paperswithcode.com/method/mobilenetv2)).
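A rough PyTorch sketch of such an inverted residual block (layer names, defaults, and the expansion ratio here are illustrative, not timm's implementation): a 1x1 convolution expands the channels, a depthwise 3x3 convolution filters them, and a linear 1x1 convolution projects back down, with a skip connection when shapes match:

```py
import torch
import torch.nn as nn

class InvertedResidual(nn.Module):
    """MobileNetV2-style inverted residual block (illustrative sketch)."""

    def __init__(self, in_ch, out_ch, stride=1, expand_ratio=6):
        super().__init__()
        hidden = in_ch * expand_ratio
        # Skip connection only when the input and output shapes match.
        self.use_res = stride == 1 and in_ch == out_ch
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, hidden, 1, bias=False),   # 1x1 expand
            nn.BatchNorm2d(hidden),
            nn.ReLU(inplace=True),
            nn.Conv2d(hidden, hidden, 3, stride, 1,    # 3x3 depthwise
                      groups=hidden, bias=False),
            nn.BatchNorm2d(hidden),
            nn.ReLU(inplace=True),
            nn.Conv2d(hidden, out_ch, 1, bias=False),  # 1x1 linear project
            nn.BatchNorm2d(out_ch),
        )

    def forward(self, x):
        out = self.block(x)
        return x + out if self.use_res else out

y = InvertedResidual(16, 16)(torch.randn(1, 16, 32, 32))
```

The `groups=hidden` depthwise convolution is what keeps the block cheap: it applies one filter per channel rather than a full dense convolution over the expanded width.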
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('mnasnet_100', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 predictions class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `mnasnet_100`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction), just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('mnasnet_100', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model afresh.
## Citation
```BibTeX
@misc{tan2019mnasnet,
title={MnasNet: Platform-Aware Neural Architecture Search for Mobile},
author={Mingxing Tan and Bo Chen and Ruoming Pang and Vijay Vasudevan and Mark Sandler and Andrew Howard and Quoc V. Le},
year={2019},
eprint={1807.11626},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
<!--
Type: model-index
Collections:
- Name: MNASNet
Paper:
Title: 'MnasNet: Platform-Aware Neural Architecture Search for Mobile'
URL: https://paperswithcode.com/paper/mnasnet-platform-aware-neural-architecture
Models:
- Name: mnasnet_100
In Collection: MNASNet
Metadata:
FLOPs: 416415488
Parameters: 4380000
File Size: 17731774
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Depthwise Separable Convolution
- Dropout
- Global Average Pooling
- Inverted Residual Block
- Max Pooling
- ReLU
- Residual Connection
- Softmax
Tasks:
- Image Classification
Training Techniques:
- RMSProp
- Weight Decay
Training Data:
- ImageNet
ID: mnasnet_100
Layers: 100
Dropout: 0.2
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 4000
Image Size: '224'
Interpolation: bicubic
RMSProp Decay: 0.9
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/efficientnet.py#L894
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mnasnet_b1-74cb7081.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 74.67%
Top 5 Accuracy: 92.1%
- Name: semnasnet_100
In Collection: MNASNet
Metadata:
FLOPs: 414570766
Parameters: 3890000
File Size: 15731489
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Depthwise Separable Convolution
- Dropout
- Global Average Pooling
- Inverted Residual Block
- Max Pooling
- ReLU
- Residual Connection
- Softmax
- Squeeze-and-Excitation Block
Tasks:
- Image Classification
Training Data:
- ImageNet
ID: semnasnet_100
Crop Pct: '0.875'
Image Size: '224'
Interpolation: bicubic
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/efficientnet.py#L928
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mnasnet_a1-d9418771.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 75.45%
Top 5 Accuracy: 92.61%
-->
# (Tensorflow) MobileNet v3
**MobileNetV3** is a convolutional neural network that is designed for mobile phone CPUs. The network design includes the use of a [hard swish activation](https://paperswithcode.com/method/hard-swish) and [squeeze-and-excitation](https://paperswithcode.com/method/squeeze-and-excitation-block) modules in the [MBConv blocks](https://paperswithcode.com/method/inverted-residual-block).
The weights from this model were ported from [Tensorflow/Models](https://github.com/tensorflow/models).
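The hard swish activation mentioned above is a cheap piecewise-linear approximation of swish, defined in the MobileNetV3 paper as `h-swish(x) = x * ReLU6(x + 3) / 6`. A minimal scalar sketch (timm and PyTorch provide vectorized versions):

```py
def relu6(x):
    """ReLU capped at 6: min(max(x, 0), 6)."""
    return min(max(x, 0.0), 6.0)

def hard_swish(x):
    """h-swish(x) = x * ReLU6(x + 3) / 6."""
    return x * relu6(x + 3.0) / 6.0
```

The function is exactly zero for `x <= -3` and exactly `x` for `x >= 3`, so on mobile CPUs it avoids the sigmoid evaluation that ordinary swish requires.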
## How do I use this model on an image?
To load a pretrained model:
```py
>>> import timm
>>> model = timm.create_model('tf_mobilenetv3_large_075', pretrained=True)
>>> model.eval()
```
To load and preprocess the image:
```py
>>> import urllib
>>> from PIL import Image
>>> from timm.data import resolve_data_config
>>> from timm.data.transforms_factory import create_transform
>>> config = resolve_data_config({}, model=model)
>>> transform = create_transform(**config)
>>> url, filename = ("https://github.com/pytorch/hub/raw/master/images/dog.jpg", "dog.jpg")
>>> urllib.request.urlretrieve(url, filename)
>>> img = Image.open(filename).convert('RGB')
>>> tensor = transform(img).unsqueeze(0) # transform and add batch dimension
```
To get the model predictions:
```py
>>> import torch
>>> with torch.no_grad():
... out = model(tensor)
>>> probabilities = torch.nn.functional.softmax(out[0], dim=0)
>>> print(probabilities.shape)
>>> # prints: torch.Size([1000])
```
To get the top-5 predictions class names:
```py
>>> # Get imagenet class mappings
>>> url, filename = ("https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt", "imagenet_classes.txt")
>>> urllib.request.urlretrieve(url, filename)
>>> with open("imagenet_classes.txt", "r") as f:
... categories = [s.strip() for s in f.readlines()]
>>> # Print top categories per image
>>> top5_prob, top5_catid = torch.topk(probabilities, 5)
>>> for i in range(top5_prob.size(0)):
... print(categories[top5_catid[i]], top5_prob[i].item())
>>> # prints class names and probabilities like:
>>> # [('Samoyed', 0.6425196528434753), ('Pomeranian', 0.04062102362513542), ('keeshond', 0.03186424449086189), ('white wolf', 0.01739676296710968), ('Eskimo dog', 0.011717947199940681)]
```
Replace the model name with the variant you want to use, e.g. `tf_mobilenetv3_large_075`. You can find the IDs in the model summaries at the top of this page.
To extract image features with this model, follow the [timm feature extraction examples](../feature_extraction), just change the name of the model you want to use.
## How do I finetune this model?
You can finetune any of the pre-trained models just by changing the classifier (the last layer).
```py
>>> model = timm.create_model('tf_mobilenetv3_large_075', pretrained=True, num_classes=NUM_FINETUNE_CLASSES)
```
To finetune on your own dataset, you have to write a training loop or adapt [timm's training
script](https://github.com/rwightman/pytorch-image-models/blob/master/train.py) to use your dataset.
## How do I train this model?
You can follow the [timm recipe scripts](../scripts) for training a new model afresh.
## Citation
```BibTeX
@article{DBLP:journals/corr/abs-1905-02244,
author = {Andrew Howard and
Mark Sandler and
Grace Chu and
Liang{-}Chieh Chen and
Bo Chen and
Mingxing Tan and
Weijun Wang and
Yukun Zhu and
Ruoming Pang and
Vijay Vasudevan and
Quoc V. Le and
Hartwig Adam},
title = {Searching for MobileNetV3},
journal = {CoRR},
volume = {abs/1905.02244},
year = {2019},
url = {http://arxiv.org/abs/1905.02244},
archivePrefix = {arXiv},
eprint = {1905.02244},
timestamp = {Tue, 12 Jan 2021 15:30:06 +0100},
biburl = {https://dblp.org/rec/journals/corr/abs-1905-02244.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
<!--
Type: model-index
Collections:
- Name: TF MobileNet V3
Paper:
Title: Searching for MobileNetV3
URL: https://paperswithcode.com/paper/searching-for-mobilenetv3
Models:
- Name: tf_mobilenetv3_large_075
In Collection: TF MobileNet V3
Metadata:
FLOPs: 194323712
Parameters: 3990000
File Size: 16097377
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Dense Connections
- Depthwise Separable Convolution
- Dropout
- Global Average Pooling
- Hard Swish
- Inverted Residual Block
- ReLU
- Residual Connection
- Softmax
- Squeeze-and-Excitation Block
Tasks:
- Image Classification
Training Techniques:
- RMSProp
- Weight Decay
Training Data:
- ImageNet
Training Resources: 4x4 TPU Pod
ID: tf_mobilenetv3_large_075
LR: 0.1
Dropout: 0.8
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 4096
Image Size: '224'
Weight Decay: 1.0e-05
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/mobilenetv3.py#L394
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_large_075-150ee8b0.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 73.45%
Top 5 Accuracy: 91.34%
- Name: tf_mobilenetv3_large_100
In Collection: TF MobileNet V3
Metadata:
FLOPs: 274535288
Parameters: 5480000
File Size: 22076649
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Dense Connections
- Depthwise Separable Convolution
- Dropout
- Global Average Pooling
- Hard Swish
- Inverted Residual Block
- ReLU
- Residual Connection
- Softmax
- Squeeze-and-Excitation Block
Tasks:
- Image Classification
Training Techniques:
- RMSProp
- Weight Decay
Training Data:
- ImageNet
Training Resources: 4x4 TPU Pod
ID: tf_mobilenetv3_large_100
LR: 0.1
Dropout: 0.8
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 4096
Image Size: '224'
Weight Decay: 1.0e-05
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/mobilenetv3.py#L403
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_large_100-427764d5.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 75.51%
Top 5 Accuracy: 92.61%
- Name: tf_mobilenetv3_large_minimal_100
In Collection: TF MobileNet V3
Metadata:
FLOPs: 267216928
Parameters: 3920000
File Size: 15836368
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Dense Connections
- Depthwise Separable Convolution
- Dropout
- Global Average Pooling
- Hard Swish
- Inverted Residual Block
- ReLU
- Residual Connection
- Softmax
- Squeeze-and-Excitation Block
Tasks:
- Image Classification
Training Techniques:
- RMSProp
- Weight Decay
Training Data:
- ImageNet
Training Resources: 4x4 TPU Pod
ID: tf_mobilenetv3_large_minimal_100
LR: 0.1
Dropout: 0.8
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 4096
Image Size: '224'
Weight Decay: 1.0e-05
Interpolation: bilinear
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/mobilenetv3.py#L412
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_large_minimal_100-8596ae28.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 72.24%
Top 5 Accuracy: 90.64%
- Name: tf_mobilenetv3_small_075
In Collection: TF MobileNet V3
Metadata:
FLOPs: 48457664
Parameters: 2040000
File Size: 8242701
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Dense Connections
- Depthwise Separable Convolution
- Dropout
- Global Average Pooling
- Hard Swish
- Inverted Residual Block
- ReLU
- Residual Connection
- Softmax
- Squeeze-and-Excitation Block
Tasks:
- Image Classification
Training Techniques:
- RMSProp
- Weight Decay
Training Data:
- ImageNet
Training Resources: 16x GPUs
ID: tf_mobilenetv3_small_075
LR: 0.045
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 4096
Image Size: '224'
Weight Decay: 4.0e-05
Interpolation: bilinear
RMSProp Decay: 0.9
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/mobilenetv3.py#L421
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_small_075-da427f52.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 65.72%
Top 5 Accuracy: 86.13%
- Name: tf_mobilenetv3_small_100
In Collection: TF MobileNet V3
Metadata:
FLOPs: 65450600
Parameters: 2540000
File Size: 10256398
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Dense Connections
- Depthwise Separable Convolution
- Dropout
- Global Average Pooling
- Hard Swish
- Inverted Residual Block
- ReLU
- Residual Connection
- Softmax
- Squeeze-and-Excitation Block
Tasks:
- Image Classification
Training Techniques:
- RMSProp
- Weight Decay
Training Data:
- ImageNet
Training Resources: 16x GPUs
ID: tf_mobilenetv3_small_100
LR: 0.045
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 4096
Image Size: '224'
Weight Decay: 4.0e-05
Interpolation: bilinear
RMSProp Decay: 0.9
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/mobilenetv3.py#L430
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_small_100-37f49e2b.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 67.92%
Top 5 Accuracy: 87.68%
- Name: tf_mobilenetv3_small_minimal_100
In Collection: TF MobileNet V3
Metadata:
FLOPs: 60827936
Parameters: 2040000
File Size: 8258083
Architecture:
- 1x1 Convolution
- Batch Normalization
- Convolution
- Dense Connections
- Depthwise Separable Convolution
- Dropout
- Global Average Pooling
- Hard Swish
- Inverted Residual Block
- ReLU
- Residual Connection
- Softmax
- Squeeze-and-Excitation Block
Tasks:
- Image Classification
Training Techniques:
- RMSProp
- Weight Decay
Training Data:
- ImageNet
Training Resources: 16x GPUs
ID: tf_mobilenetv3_small_minimal_100
LR: 0.045
Crop Pct: '0.875'
Momentum: 0.9
Batch Size: 4096
Image Size: '224'
Weight Decay: 4.0e-05
Interpolation: bilinear
RMSProp Decay: 0.9
Code: https://github.com/rwightman/pytorch-image-models/blob/9a25fdf3ad0414b4d66da443fe60ae0aa14edc84/timm/models/mobilenetv3.py#L439
Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_small_minimal_100-922a7843.pth
Results:
- Task: Image Classification
Dataset: ImageNet
Metrics:
Top 1 Accuracy: 62.91%
Top 5 Accuracy: 84.24%
-->
hf_public_repos/pytorch-image-models/hfdocs/source/reference/optimizers.mdx
# Optimization
This page contains the API reference documentation for learning rate optimizers included in `timm`.
## Optimizers
### Factory functions
[[autodoc]] timm.optim.optim_factory.create_optimizer
[[autodoc]] timm.optim.optim_factory.create_optimizer_v2
### Optimizer Classes
[[autodoc]] timm.optim.adabelief.AdaBelief
[[autodoc]] timm.optim.adafactor.Adafactor
[[autodoc]] timm.optim.adahessian.Adahessian
[[autodoc]] timm.optim.adamp.AdamP
[[autodoc]] timm.optim.adamw.AdamW
[[autodoc]] timm.optim.lamb.Lamb
[[autodoc]] timm.optim.lars.Lars
[[autodoc]] timm.optim.lookahead.Lookahead
[[autodoc]] timm.optim.madgrad.MADGRAD
[[autodoc]] timm.optim.nadam.Nadam
[[autodoc]] timm.optim.nvnovograd.NvNovoGrad
[[autodoc]] timm.optim.radam.RAdam
[[autodoc]] timm.optim.rmsprop_tf.RMSpropTF
[[autodoc]] timm.optim.sgdp.SGDP
hf_public_repos/pytorch-image-models/hfdocs/source/reference/models.mdx
# Models
[[autodoc]] timm.create_model
[[autodoc]] timm.list_models
hf_public_repos/pytorch-image-models/hfdocs/source/reference/schedulers.mdx
# Learning Rate Schedulers
This page contains the API reference documentation for learning rate schedulers included in `timm`.
## Schedulers
### Factory functions
[[autodoc]] timm.scheduler.scheduler_factory.create_scheduler
[[autodoc]] timm.scheduler.scheduler_factory.create_scheduler_v2
### Scheduler Classes
[[autodoc]] timm.scheduler.cosine_lr.CosineLRScheduler
[[autodoc]] timm.scheduler.multistep_lr.MultiStepLRScheduler
[[autodoc]] timm.scheduler.plateau_lr.PlateauLRScheduler
[[autodoc]] timm.scheduler.poly_lr.PolyLRScheduler
[[autodoc]] timm.scheduler.step_lr.StepLRScheduler
[[autodoc]] timm.scheduler.tanh_lr.TanhLRScheduler
hf_public_repos/pytorch-image-models/hfdocs/source/reference/data.mdx
# Data
[[autodoc]] timm.data.create_dataset
[[autodoc]] timm.data.create_loader
[[autodoc]] timm.data.create_transform
[[autodoc]] timm.data.resolve_data_config
hf_public_repos/pytorch-image-models/results/README.md
# Validation and Benchmark Results
This folder contains validation and benchmark results for the models in this collection. Validation scores are currently only run for models with pretrained weights and ImageNet-1k heads; benchmark numbers are run for all.
## Datasets
There are currently results for the ImageNet validation set and 5 additional test / label sets.
The test set results include rank and top-1/top-5 differences from clean validation. For the "Real Labels", ImageNetV2, and Sketch test sets, the differences were calculated against the full 1000 class ImageNet-1k validation set. For both the Adversarial and Rendition sets, the differences were calculated against 'clean' runs on the ImageNet-1k validation set with the same 200 classes used in each test set respectively.
### ImageNet Validation - [`results-imagenet.csv`](results-imagenet.csv)
The standard 50,000 image ImageNet-1k validation set. Model selection during training utilizes this validation set, so it is not a true test set. Question: Does anyone have the official ImageNet-1k test set classification labels now that challenges are done?
* Source: http://image-net.org/challenges/LSVRC/2012/index
* Paper: "ImageNet Large Scale Visual Recognition Challenge" - https://arxiv.org/abs/1409.0575
### ImageNet-"Real Labels" - [`results-imagenet-real.csv`](results-imagenet-real.csv)
The usual ImageNet-1k validation set with a fresh new set of labels intended to improve on mistakes in the original annotation process.
* Source: https://github.com/google-research/reassessed-imagenet
* Paper: "Are we done with ImageNet?" - https://arxiv.org/abs/2006.07159
### ImageNetV2 Matched Frequency - [`results-imagenetv2-matched-frequency.csv`](results-imagenetv2-matched-frequency.csv)
An ImageNet test set of 10,000 images sampled from new images roughly 10 years after the original. Care was taken to replicate the original ImageNet curation/sampling process.
* Source: https://github.com/modestyachts/ImageNetV2
* Paper: "Do ImageNet Classifiers Generalize to ImageNet?" - https://arxiv.org/abs/1902.10811
### ImageNet-Sketch - [`results-sketch.csv`](results-sketch.csv)
50,000 non-photographic images (sketches, doodles, mostly monochromatic, or photos of such) covering all 1000 ImageNet classes.
* Source: https://github.com/HaohanWang/ImageNet-Sketch
* Paper: "Learning Robust Global Representations by Penalizing Local Predictive Power" - https://arxiv.org/abs/1905.13549
### ImageNet-Adversarial - [`results-imagenet-a.csv`](results-imagenet-a.csv)
A collection of 7500 images covering 200 of the 1000 ImageNet classes. Images are naturally occurring adversarial examples that confuse typical ImageNet classifiers. This is a challenging dataset; a typical ResNet-50 will score 0% top-1.
For clean validation with the same 200 classes, see [`results-imagenet-a-clean.csv`](results-imagenet-a-clean.csv)
* Source: https://github.com/hendrycks/natural-adv-examples
* Paper: "Natural Adversarial Examples" - https://arxiv.org/abs/1907.07174
### ImageNet-Rendition - [`results-imagenet-r.csv`](results-imagenet-r.csv)
Renditions of 200 ImageNet classes resulting in 30,000 images for testing robustness.
For clean validation with the same 200 classes, see [`results-imagenet-r-clean.csv`](results-imagenet-r-clean.csv)
* Source: https://github.com/hendrycks/imagenet-r
* Paper: "The Many Faces of Robustness" - https://arxiv.org/abs/2006.16241
### TODO
* Explore adding a reduced version of ImageNet-C (Corruptions) and ImageNet-P (Perturbations) from https://github.com/hendrycks/robustness. The originals are huge and image size specific.
## Benchmark
CSV files with a `model_benchmark` prefix include benchmark numbers for models on various accelerators with different precision. These are currently only run on an RTX 3090 w/ AMP for inference; I intend to add more in the future.
## Metadata
CSV files with a `model_metadata` prefix contain extra information about the source training, currently the pretraining dataset and technique (i.e. distillation, SSL, WSL, etc.). Eventually I'd like to have metadata about augmentation, regularization, etc., but that will be a challenge to source consistently.
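The result CSVs share a common `model` column and are easy to inspect with pandas (a sketch; the two rows below are copied from `results-imagenet.csv` for illustration, and in practice you would read the files from this folder directly):

```py
import io
import pandas as pd

# Two rows copied from results-imagenet.csv for illustration
csv_text = """model,top1,top1_err,top5,top5_err,param_count,img_size,crop_pct,interpolation
eva02_large_patch14_448.mim_m38m_ft_in22k_in1k,90.052,9.948,99.048,0.952,305.08,448,1.000,bicubic
eva02_large_patch14_448.mim_in22k_ft_in22k_in1k,89.970,10.030,99.012,0.988,305.08,448,1.000,bicubic
"""
df = pd.read_csv(io.StringIO(csv_text))
# df = pd.read_csv('results-imagenet.csv')  # the real file

top = df.sort_values('top1', ascending=False).head(10)
print(top[['model', 'top1', 'top5', 'param_count', 'img_size']])
```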
hf_public_repos/pytorch-image-models/results/results-imagenet.csv
model,top1,top1_err,top5,top5_err,param_count,img_size,crop_pct,interpolation
eva02_large_patch14_448.mim_m38m_ft_in22k_in1k,90.052,9.948,99.048,0.952,305.08,448,1.000,bicubic
eva02_large_patch14_448.mim_in22k_ft_in22k_in1k,89.970,10.030,99.012,0.988,305.08,448,1.000,bicubic
eva_giant_patch14_560.m30m_ft_in22k_in1k,89.786,10.214,98.992,1.008,"1,014.45",560,1.000,bicubic
eva02_large_patch14_448.mim_in22k_ft_in1k,89.622,10.378,98.950,1.050,305.08,448,1.000,bicubic
eva02_large_patch14_448.mim_m38m_ft_in1k,89.574,10.426,98.924,1.076,305.08,448,1.000,bicubic
eva_giant_patch14_336.m30m_ft_in22k_in1k,89.566,10.434,98.952,1.048,"1,013.01",336,1.000,bicubic
eva_giant_patch14_336.clip_ft_in1k,89.466,10.534,98.826,1.174,"1,013.01",336,1.000,bicubic
eva_large_patch14_336.in22k_ft_in22k_in1k,89.206,10.794,98.854,1.146,304.53,336,1.000,bicubic
eva_giant_patch14_224.clip_ft_in1k,88.880,11.120,98.680,1.320,"1,012.56",224,0.900,bicubic
convnextv2_huge.fcmae_ft_in22k_in1k_512,88.858,11.142,98.748,1.252,660.29,512,1.000,bicubic
eva02_base_patch14_448.mim_in22k_ft_in22k_in1k,88.690,11.310,98.724,1.276,87.12,448,1.000,bicubic
convnextv2_huge.fcmae_ft_in22k_in1k_384,88.670,11.330,98.738,1.262,660.29,384,1.000,bicubic
eva_large_patch14_336.in22k_ft_in1k,88.670,11.330,98.722,1.278,304.53,336,1.000,bicubic
convnext_xxlarge.clip_laion2b_soup_ft_in1k,88.604,11.396,98.708,1.292,846.47,256,1.000,bicubic
beit_large_patch16_512.in22k_ft_in22k_in1k,88.596,11.404,98.656,1.344,305.67,512,1.000,bicubic
vit_huge_patch14_clip_336.laion2b_ft_in12k_in1k,88.592,11.408,98.662,1.338,632.46,336,1.000,bicubic
eva_large_patch14_196.in22k_ft_in22k_in1k,88.574,11.426,98.658,1.342,304.14,196,1.000,bicubic
maxvit_xlarge_tf_512.in21k_ft_in1k,88.538,11.462,98.644,1.356,475.77,512,1.000,bicubic
beit_large_patch16_384.in22k_ft_in22k_in1k,88.402,11.598,98.608,1.392,305.00,384,1.000,bicubic
beitv2_large_patch16_224.in1k_ft_in22k_in1k,88.394,11.606,98.598,1.402,304.43,224,0.950,bicubic
tf_efficientnet_l2.ns_jft_in1k,88.352,11.648,98.648,1.352,480.31,800,0.960,bicubic
maxvit_xlarge_tf_384.in21k_ft_in1k,88.314,11.686,98.544,1.456,475.32,384,1.000,bicubic
convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_384,88.306,11.694,98.582,1.418,200.13,384,1.000,bicubic
vit_large_patch14_clip_336.openai_ft_in12k_in1k,88.268,11.732,98.526,1.474,304.53,336,1.000,bicubic
vit_huge_patch14_clip_224.laion2b_ft_in12k_in1k,88.256,11.744,98.552,1.448,632.05,224,1.000,bicubic
eva02_base_patch14_448.mim_in22k_ft_in1k,88.252,11.748,98.564,1.436,87.12,448,1.000,bicubic
tf_efficientnet_l2.ns_jft_in1k_475,88.234,11.766,98.546,1.454,480.31,475,0.936,bicubic
regnety_1280.swag_ft_in1k,88.230,11.770,98.686,1.314,644.81,384,1.000,bicubic
maxvit_large_tf_512.in21k_ft_in1k,88.224,11.776,98.598,1.402,212.33,512,1.000,bicubic
maxvit_base_tf_512.in21k_ft_in1k,88.220,11.780,98.530,1.470,119.88,512,1.000,bicubic
convnextv2_large.fcmae_ft_in22k_in1k_384,88.198,11.802,98.528,1.472,197.96,384,1.000,bicubic
vit_large_patch14_clip_336.laion2b_ft_in12k_in1k,88.180,11.820,98.572,1.428,304.53,336,1.000,bicubic
vit_large_patch14_clip_224.openai_ft_in12k_in1k,88.174,11.826,98.546,1.454,304.20,224,1.000,bicubic
caformer_b36.sail_in22k_ft_in1k_384,88.058,11.942,98.582,1.418,98.75,384,1.000,bicubic
maxvit_large_tf_384.in21k_ft_in1k,87.986,12.014,98.568,1.432,212.03,384,1.000,bicubic
convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_320,87.958,12.042,98.476,1.524,200.13,320,1.000,bicubic
eva_large_patch14_196.in22k_ft_in1k,87.932,12.068,98.498,1.502,304.14,196,1.000,bicubic
maxvit_base_tf_384.in21k_ft_in1k,87.922,12.078,98.544,1.456,119.65,384,1.000,bicubic
vit_large_patch14_clip_224.laion2b_ft_in12k_in1k,87.894,12.106,98.408,1.592,304.20,224,1.000,bicubic
vit_large_patch14_clip_336.laion2b_ft_in1k,87.856,12.144,98.368,1.632,304.53,336,1.000,bicubic
vit_large_patch14_clip_224.openai_ft_in1k,87.854,12.146,98.426,1.574,304.20,224,1.000,bicubic
convnext_large_mlp.clip_laion2b_augreg_ft_in1k_384,87.848,12.152,98.446,1.554,200.13,384,1.000,bicubic
maxvit_rmlp_base_rw_384.sw_in12k_ft_in1k,87.828,12.172,98.372,1.628,116.14,384,1.000,bicubic
convnext_xlarge.fb_in22k_ft_in1k_384,87.752,12.248,98.556,1.444,350.20,384,1.000,bicubic
deit3_large_patch16_384.fb_in22k_ft_in1k,87.720,12.280,98.512,1.488,304.76,384,1.000,bicubic
convnextv2_base.fcmae_ft_in22k_in1k_384,87.644,12.356,98.416,1.584,88.72,384,1.000,bicubic
convformer_b36.sail_in22k_ft_in1k_384,87.602,12.398,98.434,1.566,99.88,384,1.000,bicubic
vit_huge_patch14_clip_224.laion2b_ft_in1k,87.588,12.412,98.218,1.782,632.05,224,1.000,bicubic
convnextv2_large.fcmae_ft_in22k_in1k,87.484,12.516,98.356,1.644,197.96,288,1.000,bicubic
beit_large_patch16_224.in22k_ft_in22k_in1k,87.478,12.522,98.304,1.696,304.43,224,0.900,bicubic
convnext_large.fb_in22k_ft_in1k_384,87.472,12.528,98.386,1.614,197.77,384,1.000,bicubic
maxxvitv2_rmlp_base_rw_384.sw_in12k_ft_in1k,87.464,12.536,98.374,1.626,116.09,384,1.000,bicubic
swinv2_large_window12to24_192to384.ms_in22k_ft_in1k,87.464,12.536,98.250,1.750,196.74,384,1.000,bicubic
caformer_m36.sail_in22k_ft_in1k_384,87.446,12.554,98.308,1.692,56.20,384,1.000,bicubic
caformer_b36.sail_in22k_ft_in1k,87.420,12.580,98.328,1.672,98.75,224,1.000,bicubic
beitv2_large_patch16_224.in1k_ft_in1k,87.412,12.588,98.234,1.766,304.43,224,0.950,bicubic
coatnet_rmlp_2_rw_384.sw_in12k_ft_in1k,87.382,12.618,98.312,1.688,73.88,384,1.000,bicubic
convnext_large_mlp.clip_laion2b_augreg_ft_in1k,87.336,12.664,98.218,1.782,200.13,256,1.000,bicubic
convnext_xlarge.fb_in22k_ft_in1k,87.330,12.670,98.328,1.672,350.20,288,1.000,bicubic
seresnextaa201d_32x8d.sw_in12k_ft_in1k_384,87.288,12.712,98.334,1.666,149.39,384,1.000,bicubic
vit_large_patch14_clip_224.laion2b_ft_in1k,87.286,12.714,98.244,1.756,304.20,224,1.000,bicubic
vit_base_patch16_clip_384.laion2b_ft_in12k_in1k,87.206,12.794,98.034,1.966,86.86,384,1.000,bicubic
deit3_huge_patch14_224.fb_in22k_ft_in1k,87.186,12.814,98.260,1.740,632.13,224,1.000,bicubic
convnext_base.clip_laion2b_augreg_ft_in12k_in1k_384,87.134,12.866,98.222,1.778,88.59,384,1.000,bicubic
swin_large_patch4_window12_384.ms_in22k_ft_in1k,87.132,12.868,98.234,1.766,196.74,384,1.000,bicubic
swinv2_base_window12to24_192to384.ms_in22k_ft_in1k,87.096,12.904,98.234,1.766,87.92,384,1.000,bicubic
vit_large_patch16_384.augreg_in21k_ft_in1k,87.084,12.916,98.302,1.698,304.72,384,1.000,bicubic
volo_d5_512.sail_in1k,87.058,12.942,97.970,2.030,296.09,512,1.150,bicubic
convnext_large.fb_in22k_ft_in1k,87.026,12.974,98.204,1.796,197.77,288,1.000,bicubic
vit_base_patch16_clip_384.openai_ft_in12k_in1k,87.026,12.974,98.182,1.818,86.86,384,0.950,bicubic
convformer_b36.sail_in22k_ft_in1k,86.998,13.002,98.172,1.828,99.88,224,1.000,bicubic
convnextv2_base.fcmae_ft_in22k_in1k,86.998,13.002,98.168,1.832,88.72,288,1.000,bicubic
deit3_large_patch16_224.fb_in22k_ft_in1k,86.982,13.018,98.236,1.764,304.37,224,1.000,bicubic
swinv2_large_window12to16_192to256.ms_in22k_ft_in1k,86.952,13.048,98.106,1.894,196.74,256,0.900,bicubic
volo_d5_448.sail_in1k,86.952,13.048,97.938,2.062,295.91,448,1.150,bicubic
maxvit_rmlp_base_rw_224.sw_in12k_ft_in1k,86.894,13.106,98.014,1.986,116.14,224,0.950,bicubic
convformer_m36.sail_in22k_ft_in1k_384,86.892,13.108,98.116,1.884,57.05,384,1.000,bicubic
caformer_s36.sail_in22k_ft_in1k_384,86.858,13.142,98.212,1.788,39.30,384,1.000,bicubic
tf_efficientnet_b7.ns_jft_in1k,86.840,13.160,98.092,1.908,66.35,600,0.949,bicubic
regnety_320.swag_ft_in1k,86.834,13.166,98.362,1.638,145.05,384,1.000,bicubic
tf_efficientnetv2_l.in21k_ft_in1k,86.802,13.198,98.136,1.864,118.52,480,1.000,bicubic
beit_base_patch16_384.in22k_ft_in22k_in1k,86.800,13.200,98.136,1.864,86.74,384,1.000,bicubic
convnext_base.fb_in22k_ft_in1k_384,86.796,13.204,98.264,1.736,88.59,384,1.000,bicubic
volo_d4_448.sail_in1k,86.792,13.208,97.884,2.116,193.41,448,1.150,bicubic
tf_efficientnetv2_xl.in21k_ft_in1k,86.748,13.252,98.014,1.986,208.12,512,1.000,bicubic
deit3_base_patch16_384.fb_in22k_ft_in1k,86.740,13.260,98.116,1.884,86.88,384,1.000,bicubic
seresnextaa101d_32x8d.sw_in12k_ft_in1k_288,86.724,13.276,98.176,1.824,93.59,320,1.000,bicubic
maxxvitv2_rmlp_base_rw_224.sw_in12k_ft_in1k,86.642,13.358,98.020,1.980,116.09,224,0.950,bicubic
vit_base_patch16_clip_384.laion2b_ft_in1k,86.618,13.382,98.008,1.992,86.86,384,1.000,bicubic
maxvit_base_tf_512.in1k,86.602,13.398,97.918,2.082,119.88,512,1.000,bicubic
caformer_m36.sail_in22k_ft_in1k,86.594,13.406,98.024,1.976,56.20,224,1.000,bicubic
convnextv2_huge.fcmae_ft_in1k,86.580,13.420,97.972,2.028,660.29,288,1.000,bicubic
coatnet_2_rw_224.sw_in12k_ft_in1k,86.564,13.436,97.896,2.104,73.87,224,0.950,bicubic
maxvit_large_tf_512.in1k,86.526,13.474,97.880,2.120,212.33,512,1.000,bicubic
coatnet_rmlp_2_rw_224.sw_in12k_ft_in1k,86.504,13.496,97.894,2.106,73.88,224,0.950,bicubic
convnext_base.clip_laiona_augreg_ft_in1k_384,86.502,13.498,97.968,2.032,88.59,384,1.000,bicubic
volo_d3_448.sail_in1k,86.502,13.498,97.710,2.290,86.63,448,1.000,bicubic
cait_m48_448.fb_dist_in1k,86.492,13.508,97.752,2.248,356.46,448,1.000,bicubic
seresnextaa101d_32x8d.sw_in12k_ft_in1k,86.484,13.516,98.030,1.970,93.59,288,1.000,bicubic
beitv2_base_patch16_224.in1k_ft_in22k_in1k,86.474,13.526,98.052,1.948,86.53,224,0.900,bicubic
tiny_vit_21m_512.dist_in22k_ft_in1k,86.458,13.542,97.890,2.110,21.27,512,1.000,bicubic
tf_efficientnet_b6.ns_jft_in1k,86.458,13.542,97.884,2.116,43.04,528,0.942,bicubic
swin_base_patch4_window12_384.ms_in22k_ft_in1k,86.438,13.562,98.066,1.934,87.90,384,1.000,bicubic
caformer_b36.sail_in1k_384,86.408,13.592,97.814,2.186,98.75,384,1.000,bicubic
convformer_s36.sail_in22k_ft_in1k_384,86.378,13.622,97.984,2.016,40.01,384,1.000,bicubic
convnext_base.clip_laion2b_augreg_ft_in12k_in1k,86.370,13.630,97.984,2.016,88.59,256,1.000,bicubic
dm_nfnet_f6.dm_in1k,86.362,13.638,97.896,2.104,438.36,576,0.956,bicubic
swin_large_patch4_window7_224.ms_in22k_ft_in1k,86.312,13.688,97.902,2.098,196.53,224,0.900,bicubic
maxvit_base_tf_384.in1k,86.302,13.698,97.798,2.202,119.65,384,1.000,bicubic
convnext_base.fb_in22k_ft_in1k,86.274,13.726,98.092,1.908,88.59,288,1.000,bicubic
swinv2_base_window12to16_192to256.ms_in22k_ft_in1k,86.268,13.732,97.882,2.118,87.92,256,0.900,bicubic
maxvit_large_tf_384.in1k,86.230,13.770,97.688,2.312,212.03,384,1.000,bicubic
vit_base_patch8_224.augreg2_in21k_ft_in1k,86.218,13.782,97.832,2.168,86.58,224,0.900,bicubic
vit_base_patch16_clip_384.openai_ft_in1k,86.206,13.794,97.876,2.124,86.86,384,1.000,bicubic
convnext_small.in12k_ft_in1k_384,86.182,13.818,97.922,2.078,50.22,384,1.000,bicubic
vit_large_r50_s32_384.augreg_in21k_ft_in1k,86.182,13.818,97.922,2.078,329.09,384,1.000,bicubic
vit_base_patch16_clip_224.laion2b_ft_in12k_in1k,86.170,13.830,97.756,2.244,86.57,224,0.950,bicubic
caformer_m36.sail_in1k_384,86.166,13.834,97.820,2.180,56.20,384,1.000,bicubic
convnext_base.clip_laion2b_augreg_ft_in1k,86.158,13.842,97.680,2.320,88.59,256,1.000,bicubic
convformer_m36.sail_in22k_ft_in1k,86.148,13.852,97.850,2.150,57.05,224,1.000,bicubic
convnextv2_large.fcmae_ft_in1k,86.118,13.882,97.822,2.178,197.96,288,1.000,bicubic
tiny_vit_21m_384.dist_in22k_ft_in1k,86.108,13.892,97.710,2.290,21.23,384,1.000,bicubic
dm_nfnet_f5.dm_in1k,86.100,13.900,97.688,2.312,377.21,544,0.954,bicubic
tf_efficientnet_b5.ns_jft_in1k,86.088,13.912,97.756,2.244,30.39,456,0.934,bicubic
maxvit_small_tf_512.in1k,86.084,13.916,97.764,2.236,69.13,512,1.000,bicubic
volo_d5_224.sail_in1k,86.070,13.930,97.576,2.424,295.46,224,0.960,bicubic
cait_m36_384.fb_dist_in1k,86.058,13.942,97.730,2.270,271.22,384,1.000,bicubic
volo_d2_384.sail_in1k,86.042,13.958,97.574,2.426,58.87,384,1.000,bicubic
regnety_160.swag_ft_in1k,86.020,13.980,98.052,1.948,83.59,384,1.000,bicubic
xcit_large_24_p8_384.fb_dist_in1k,85.996,14.004,97.690,2.310,188.93,384,1.000,bicubic
vit_base_patch16_384.augreg_in21k_ft_in1k,85.994,14.006,98.002,1.998,86.86,384,1.000,bicubic
tf_efficientnetv2_m.in21k_ft_in1k,85.992,14.008,97.944,2.056,54.14,480,1.000,bicubic
regnety_160.lion_in12k_ft_in1k,85.988,14.012,97.834,2.166,83.59,288,1.000,bicubic
regnety_160.sw_in12k_ft_in1k,85.986,14.014,97.834,2.166,83.59,288,1.000,bicubic
regnety_1280.swag_lc_in1k,85.982,14.018,97.850,2.150,644.81,224,0.965,bicubic
vit_base_patch16_clip_224.openai_ft_in12k_in1k,85.942,14.058,97.728,2.272,86.57,224,0.950,bicubic
efficientnet_b5.sw_in12k_ft_in1k,85.896,14.104,97.736,2.264,30.39,448,1.000,bicubic
volo_d4_224.sail_in1k,85.872,14.128,97.472,2.528,192.96,224,0.960,bicubic
vit_large_patch16_224.augreg_in21k_ft_in1k,85.836,14.164,97.818,2.182,304.33,224,0.900,bicubic
dm_nfnet_f4.dm_in1k,85.836,14.164,97.664,2.336,316.07,512,0.951,bicubic
xcit_medium_24_p8_384.fb_dist_in1k,85.816,14.184,97.592,2.408,84.32,384,1.000,bicubic
deit3_large_patch16_384.fb_in1k,85.812,14.188,97.598,2.402,304.76,384,1.000,bicubic
vit_base_patch8_224.augreg_in21k_ft_in1k,85.798,14.202,97.790,2.210,86.58,224,0.900,bicubic
caformer_s36.sail_in22k_ft_in1k,85.790,14.210,97.826,2.174,39.30,224,1.000,bicubic
vit_base_patch32_clip_448.laion2b_ft_in12k_in1k,85.780,14.220,97.638,2.362,88.34,448,1.000,bicubic
convnext_small.fb_in22k_ft_in1k_384,85.778,14.222,97.890,2.110,50.22,384,1.000,bicubic
xcit_large_24_p16_384.fb_dist_in1k,85.754,14.246,97.538,2.462,189.10,384,1.000,bicubic
caformer_s36.sail_in1k_384,85.742,14.258,97.672,2.328,39.30,384,1.000,bicubic
convformer_b36.sail_in1k_384,85.740,14.260,97.524,2.476,99.88,384,1.000,bicubic
eva02_small_patch14_336.mim_in22k_ft_in1k,85.718,14.282,97.634,2.366,22.13,336,1.000,bicubic
deit3_base_patch16_224.fb_in22k_ft_in1k,85.700,14.300,97.746,2.254,86.59,224,1.000,bicubic
dm_nfnet_f3.dm_in1k,85.686,14.314,97.570,2.430,254.92,416,0.940,bicubic
maxvit_tiny_tf_512.in1k,85.664,14.336,97.584,2.416,31.05,512,1.000,bicubic
tf_efficientnetv2_l.in1k,85.664,14.336,97.474,2.526,118.52,480,1.000,bicubic
flexivit_large.1200ep_in1k,85.644,14.356,97.540,2.460,304.36,240,0.950,bicubic
beitv2_base_patch16_224.in1k_ft_in1k,85.594,14.406,97.506,2.494,86.53,224,0.900,bicubic
convformer_m36.sail_in1k_384,85.580,14.420,97.542,2.458,57.05,384,1.000,bicubic
xcit_small_24_p8_384.fb_dist_in1k,85.554,14.446,97.570,2.430,47.63,384,1.000,bicubic
flexivit_large.600ep_in1k,85.540,14.460,97.488,2.512,304.36,240,0.950,bicubic
maxvit_small_tf_384.in1k,85.540,14.460,97.462,2.538,69.02,384,1.000,bicubic
vit_medium_patch16_gap_384.sw_in12k_ft_in1k,85.530,14.470,97.636,2.364,39.03,384,0.950,bicubic
caformer_b36.sail_in1k,85.504,14.496,97.310,2.690,98.75,224,1.000,bicubic
convnextv2_base.fcmae_ft_in1k,85.474,14.526,97.384,2.616,88.72,288,1.000,bicubic
vit_base_patch16_clip_224.laion2b_ft_in1k,85.470,14.530,97.576,2.424,86.57,224,1.000,bicubic
cait_s36_384.fb_dist_in1k,85.454,14.546,97.478,2.522,68.37,384,1.000,bicubic
xcit_medium_24_p16_384.fb_dist_in1k,85.424,14.576,97.406,2.594,84.40,384,1.000,bicubic
deit_base_distilled_patch16_384.fb_in1k,85.424,14.576,97.330,2.670,87.63,384,1.000,bicubic
caformer_s18.sail_in22k_ft_in1k_384,85.414,14.586,97.702,2.298,26.34,384,1.000,bicubic
convformer_s36.sail_in22k_ft_in1k,85.414,14.586,97.568,2.432,40.01,224,1.000,bicubic
volo_d3_224.sail_in1k,85.414,14.586,97.276,2.724,86.33,224,0.960,bicubic
xcit_large_24_p8_224.fb_dist_in1k,85.402,14.598,97.402,2.598,188.93,224,1.000,bicubic
regnety_120.sw_in12k_ft_in1k,85.400,14.600,97.582,2.418,51.82,288,1.000,bicubic
convformer_s36.sail_in1k_384,85.378,14.622,97.476,2.524,40.01,384,1.000,bicubic
tf_efficientnet_b8.ra_in1k,85.368,14.632,97.394,2.606,87.41,672,0.954,bicubic
vit_base_patch32_clip_384.laion2b_ft_in12k_in1k,85.366,14.634,97.660,2.340,88.30,384,1.000,bicubic
tf_efficientnet_b8.ap_in1k,85.364,14.636,97.292,2.708,87.41,672,0.954,bicubic
convnext_small.in12k_ft_in1k,85.330,14.670,97.546,2.454,50.22,288,1.000,bicubic
vit_base_patch16_clip_224.openai_ft_in1k,85.292,14.708,97.436,2.564,86.57,224,0.900,bicubic
flexivit_large.300ep_in1k,85.288,14.712,97.400,2.600,304.36,240,0.950,bicubic
swin_base_patch4_window7_224.ms_in22k_ft_in1k,85.272,14.728,97.564,2.436,87.77,224,0.900,bicubic
convnext_small.fb_in22k_ft_in1k,85.262,14.738,97.682,2.318,50.22,288,1.000,bicubic
volo_d1_384.sail_in1k,85.244,14.756,97.214,2.786,26.78,384,1.000,bicubic
mvitv2_large.fb_in1k,85.244,14.756,97.194,2.806,217.99,224,0.900,bicubic
caformer_m36.sail_in1k,85.232,14.768,97.200,2.800,56.20,224,1.000,bicubic
deit3_huge_patch14_224.fb_in1k,85.224,14.776,97.360,2.640,632.13,224,0.900,bicubic
vit_base_patch32_clip_384.openai_ft_in12k_in1k,85.214,14.786,97.404,2.596,88.30,384,0.950,bicubic
beit_base_patch16_224.in22k_ft_in22k_in1k,85.212,14.788,97.658,2.342,86.53,224,0.900,bicubic
tf_efficientnetv2_m.in1k,85.204,14.796,97.364,2.636,54.14,480,1.000,bicubic
inception_next_base.sail_in1k_384,85.202,14.798,97.414,2.586,86.67,384,1.000,bicubic
volo_d2_224.sail_in1k,85.202,14.798,97.190,2.810,58.68,224,0.960,bicubic
dm_nfnet_f2.dm_in1k,85.192,14.808,97.346,2.654,193.78,352,0.920,bicubic
tf_efficientnet_b4.ns_jft_in1k,85.160,14.840,97.468,2.532,19.34,380,0.922,bicubic
regnety_2560.seer_ft_in1k,85.150,14.850,97.438,2.562,"1,282.60",384,1.000,bicubic
tf_efficientnet_b7.ap_in1k,85.124,14.876,97.252,2.748,66.35,600,0.949,bicubic
convnext_tiny.in12k_ft_in1k_384,85.122,14.878,97.606,2.394,28.59,384,1.000,bicubic
convnextv2_tiny.fcmae_ft_in22k_in1k_384,85.106,14.894,97.628,2.372,28.64,384,1.000,bicubic
maxvit_tiny_tf_384.in1k,85.100,14.900,97.378,2.622,30.98,384,1.000,bicubic
resnext101_32x32d.fb_wsl_ig1b_ft_in1k,85.098,14.902,97.438,2.562,468.53,224,0.875,bilinear
vit_base_patch16_224.augreg2_in21k_ft_in1k,85.094,14.906,97.530,2.470,86.57,224,0.900,bicubic
xcit_small_24_p16_384.fb_dist_in1k,85.090,14.910,97.312,2.688,47.67,384,1.000,bicubic
tiny_vit_21m_224.dist_in22k_ft_in1k,85.086,14.914,97.366,2.634,21.20,224,0.950,bicubic
xcit_small_12_p8_384.fb_dist_in1k,85.078,14.922,97.282,2.718,26.21,384,1.000,bicubic
xcit_medium_24_p8_224.fb_dist_in1k,85.074,14.926,97.274,2.726,84.32,224,1.000,bicubic
deit3_base_patch16_384.fb_in1k,85.074,14.926,97.250,2.750,86.88,384,1.000,bicubic
cait_s24_384.fb_dist_in1k,85.048,14.952,97.346,2.654,47.06,384,1.000,bicubic
regnetz_e8.ra3_in1k,85.034,14.966,97.272,2.728,57.70,320,1.000,bicubic
caformer_s18.sail_in1k_384,85.026,14.974,97.358,2.642,26.34,384,1.000,bicubic
resnetrs420.tf_in1k,85.004,14.996,97.124,2.876,191.89,416,1.000,bicubic
convformer_s18.sail_in22k_ft_in1k_384,84.998,15.002,97.570,2.430,26.77,384,1.000,bicubic
vit_base_r50_s16_384.orig_in21k_ft_in1k,84.976,15.024,97.290,2.710,98.95,384,1.000,bicubic
ecaresnet269d.ra2_in1k,84.968,15.032,97.222,2.778,102.09,352,1.000,bicubic
maxvit_large_tf_224.in1k,84.942,15.058,96.970,3.030,211.79,224,0.950,bicubic
tf_efficientnet_b7.ra_in1k,84.932,15.068,97.208,2.792,66.35,600,0.949,bicubic
resnetv2_152x4_bit.goog_in21k_ft_in1k,84.916,15.084,97.438,2.562,936.53,480,1.000,bilinear
xcit_large_24_p16_224.fb_dist_in1k,84.916,15.084,97.128,2.872,189.10,224,1.000,bicubic
coatnet_rmlp_1_rw2_224.sw_in12k_ft_in1k,84.910,15.090,96.958,3.042,41.72,224,0.950,bicubic
coat_lite_medium_384.in1k,84.878,15.122,97.372,2.628,44.57,384,1.000,bicubic
xcit_small_24_p8_224.fb_dist_in1k,84.868,15.132,97.190,2.810,47.63,224,1.000,bicubic
maxvit_base_tf_224.in1k,84.860,15.140,96.988,3.012,119.47,224,0.950,bicubic
convnext_large.fb_in1k,84.846,15.154,97.214,2.786,197.77,288,1.000,bicubic
deit3_small_patch16_384.fb_in22k_ft_in1k,84.824,15.176,97.486,2.514,22.21,384,1.000,bicubic
convformer_b36.sail_in1k,84.818,15.182,96.946,3.054,99.88,224,1.000,bicubic
efficientnetv2_rw_m.agc_in1k,84.810,15.190,97.152,2.848,53.24,416,1.000,bicubic
tf_efficientnet_b6.ap_in1k,84.788,15.212,97.138,2.862,43.04,528,0.942,bicubic
deit3_large_patch16_224.fb_in1k,84.774,15.226,97.036,2.964,304.37,224,0.900,bicubic
resnetrs350.tf_in1k,84.714,15.286,96.992,3.008,163.96,384,1.000,bicubic
xcit_small_12_p16_384.fb_dist_in1k,84.712,15.288,97.118,2.882,26.25,384,1.000,bicubic
dm_nfnet_f1.dm_in1k,84.702,15.298,97.182,2.818,132.63,320,0.910,bicubic
eca_nfnet_l2.ra3_in1k,84.700,15.300,97.266,2.734,56.72,384,1.000,bicubic
flexivit_base.1200ep_in1k,84.676,15.324,96.994,3.006,86.59,240,0.950,bicubic
davit_base.msft_in1k,84.642,15.358,97.020,2.980,87.95,224,0.950,bicubic
maxxvit_rmlp_small_rw_256.sw_in1k,84.624,15.376,97.068,2.932,66.01,256,0.950,bicubic
coatnet_rmlp_2_rw_224.sw_in1k,84.608,15.392,96.740,3.260,73.88,224,0.950,bicubic
swinv2_base_window16_256.ms_in1k,84.600,15.400,97.090,2.910,87.92,256,0.900,bicubic
fastvit_ma36.apple_dist_in1k,84.598,15.402,97.002,2.998,44.07,256,0.950,bicubic
seresnextaa101d_32x8d.ah_in1k,84.566,15.434,97.076,2.924,93.59,288,1.000,bicubic
deit3_medium_patch16_224.fb_in22k_ft_in1k,84.550,15.450,97.188,2.812,38.85,224,1.000,bicubic
regnety_320.swag_lc_in1k,84.548,15.452,97.442,2.558,145.05,224,0.965,bicubic
rexnetr_300.sw_in12k_ft_in1k,84.546,15.454,97.256,2.744,34.81,288,1.000,bicubic
vit_base_patch16_224.augreg_in21k_ft_in1k,84.532,15.468,97.294,2.706,86.57,224,0.900,bicubic
flexivit_base.600ep_in1k,84.524,15.476,96.936,3.064,86.59,240,0.950,bicubic
resnetv2_152x2_bit.goog_in21k_ft_in1k,84.510,15.490,97.434,2.566,236.34,448,1.000,bilinear
resnest269e.in1k,84.508,15.492,96.990,3.010,110.93,416,0.928,bicubic
caformer_s36.sail_in1k,84.506,15.494,96.996,3.004,39.30,224,1.000,bicubic
convformer_m36.sail_in1k,84.494,15.506,96.866,3.134,57.05,224,1.000,bicubic
regnetz_040_h.ra3_in1k,84.492,15.508,97.010,2.990,28.94,320,1.000,bicubic
maxvit_rmlp_small_rw_224.sw_in1k,84.492,15.508,96.758,3.242,64.90,224,0.900,bicubic
hrnet_w48_ssld.paddle_in1k,84.480,15.520,97.234,2.766,77.47,288,1.000,bilinear
swin_base_patch4_window12_384.ms_in1k,84.476,15.524,96.892,3.108,87.90,384,1.000,bicubic
convnext_tiny.in12k_ft_in1k,84.450,15.550,97.340,2.660,28.59,288,1.000,bicubic
mvitv2_base.fb_in1k,84.450,15.550,96.858,3.142,51.47,224,0.900,bicubic
vit_medium_patch16_gap_256.sw_in12k_ft_in1k,84.446,15.554,97.210,2.790,38.86,256,0.950,bicubic
resnetrs200.tf_in1k,84.444,15.556,97.082,2.918,93.21,320,1.000,bicubic
gcvit_base.in1k,84.444,15.556,96.842,3.158,90.32,224,0.875,bicubic
resnetv2_101x3_bit.goog_in21k_ft_in1k,84.438,15.562,97.382,2.618,387.93,448,1.000,bilinear
regnety_1280.seer_ft_in1k,84.432,15.568,97.092,2.908,644.81,384,1.000,bicubic
convnext_base.fb_in1k,84.428,15.572,96.968,3.032,88.59,288,1.000,bicubic
resnetrs270.tf_in1k,84.428,15.572,96.968,3.032,129.86,352,1.000,bicubic
maxvit_small_tf_224.in1k,84.426,15.574,96.824,3.176,68.93,224,0.950,bicubic
vit_large_r50_s32_224.augreg_in21k_ft_in1k,84.418,15.582,97.172,2.828,328.99,224,0.900,bicubic
convnextv2_tiny.fcmae_ft_in22k_in1k,84.416,15.584,97.260,2.740,28.64,288,1.000,bicubic
tf_efficientnet_b7.aa_in1k,84.416,15.584,96.908,3.092,66.35,600,0.949,bicubic
flexivit_base.300ep_in1k,84.406,15.594,96.884,3.116,86.59,240,0.950,bicubic
convformer_s18.sail_in1k_384,84.402,15.598,97.112,2.888,26.77,384,1.000,bicubic
resmlp_big_24_224.fb_in22k_ft_in1k,84.398,15.602,97.112,2.888,129.14,224,0.875,bicubic
xcit_large_24_p8_224.fb_in1k,84.394,15.606,96.664,3.336,188.93,224,1.000,bicubic
seresnet152d.ra2_in1k,84.360,15.640,97.040,2.960,66.84,320,1.000,bicubic
seresnext101d_32x8d.ah_in1k,84.358,15.642,96.920,3.080,93.59,288,1.000,bicubic
resnext101_32x8d.fb_swsl_ig1b_ft_in1k,84.302,15.698,97.176,2.824,88.79,224,0.875,bilinear
tf_efficientnetv2_s.in21k_ft_in1k,84.286,15.714,97.252,2.748,21.46,384,1.000,bicubic
xcit_medium_24_p16_224.fb_dist_in1k,84.286,15.714,96.932,3.068,84.40,224,1.000,bicubic
vit_base_patch16_224_miil.in21k_ft_in1k,84.266,15.734,96.804,3.196,86.54,224,0.875,bilinear
tf_efficientnet_b5.ap_in1k,84.258,15.742,96.974,3.026,30.39,456,0.934,bicubic
davit_small.msft_in1k,84.252,15.748,96.940,3.060,49.75,224,0.950,bicubic
swinv2_base_window8_256.ms_in1k,84.250,15.750,96.924,3.076,87.92,256,0.900,bicubic
regnetz_040.ra3_in1k,84.240,15.760,96.932,3.068,27.12,320,1.000,bicubic
xcit_small_12_p8_224.fb_dist_in1k,84.236,15.764,96.870,3.130,26.21,224,1.000,bicubic
swinv2_small_window16_256.ms_in1k,84.224,15.776,96.868,3.132,49.73,256,0.900,bicubic
maxvit_rmlp_tiny_rw_256.sw_in1k,84.224,15.776,96.778,3.222,29.15,256,0.950,bicubic
crossvit_18_dagger_408.in1k,84.202,15.798,96.818,3.182,44.61,408,1.000,bicubic
vit_base_patch16_384.orig_in21k_ft_in1k,84.200,15.800,97.218,2.782,86.86,384,1.000,bicubic
seresnext101_32x8d.ah_in1k,84.186,15.814,96.874,3.126,93.57,288,1.000,bicubic
resnext101_32x16d.fb_wsl_ig1b_ft_in1k,84.166,15.834,97.198,2.802,194.03,224,0.875,bilinear
volo_d1_224.sail_in1k,84.162,15.838,96.776,3.224,26.63,224,0.960,bicubic
efficientvit_b3.r288_in1k,84.154,15.846,96.736,3.264,48.65,288,1.000,bicubic
regnetz_d8_evos.ch_in1k,84.126,15.874,97.012,2.988,23.46,320,1.000,bicubic
resnetaa101d.sw_in12k_ft_in1k,84.124,15.876,97.106,2.894,44.57,288,1.000,bicubic
tf_efficientnet_b6.aa_in1k,84.112,15.888,96.884,3.116,43.04,528,0.942,bicubic
inception_next_base.sail_in1k,84.092,15.908,96.796,3.204,86.67,224,0.950,bicubic
convnext_tiny.fb_in22k_ft_in1k_384,84.088,15.912,97.144,2.856,28.59,384,1.000,bicubic
caformer_s18.sail_in22k_ft_in1k,84.074,15.926,97.198,2.802,26.34,224,1.000,bicubic
cait_xs24_384.fb_dist_in1k,84.062,15.938,96.884,3.116,26.67,384,1.000,bicubic
convformer_s36.sail_in1k,84.060,15.940,96.746,3.254,40.01,224,1.000,bicubic
edgenext_base.in21k_ft_in1k,84.054,15.946,97.196,2.804,18.51,320,1.000,bicubic
regnetz_d8.ra3_in1k,84.052,15.948,96.996,3.004,23.37,320,1.000,bicubic
tf_efficientnet_b3.ns_jft_in1k,84.052,15.948,96.918,3.082,12.23,300,0.904,bicubic
vit_small_r26_s32_384.augreg_in21k_ft_in1k,84.048,15.952,97.328,2.672,36.47,384,1.000,bicubic
fastvit_sa36.apple_dist_in1k,84.026,15.974,96.854,3.146,31.53,256,0.900,bicubic
regnetz_d32.ra3_in1k,84.022,15.978,96.868,3.132,27.58,320,0.950,bicubic
resnetv2_50x3_bit.goog_in21k_ft_in1k,84.020,15.980,97.126,2.874,217.32,448,1.000,bilinear
eca_nfnet_l1.ra2_in1k,84.012,15.988,97.026,2.974,41.41,320,1.000,bicubic
resnet200d.ra2_in1k,83.964,16.036,96.826,3.174,64.69,320,1.000,bicubic
edgenext_base.usi_in1k,83.958,16.042,96.770,3.230,18.51,320,1.000,bicubic
regnety_080.ra3_in1k,83.926,16.074,96.890,3.110,39.18,288,1.000,bicubic
swin_s3_base_224.ms_in1k,83.920,16.080,96.672,3.328,71.13,224,0.900,bicubic
regnety_640.seer_ft_in1k,83.908,16.092,96.922,3.078,281.38,384,1.000,bicubic
tf_efficientnetv2_s.in1k,83.898,16.102,96.696,3.304,21.46,384,1.000,bicubic
tresnet_v2_l.miil_in21k_ft_in1k,83.894,16.106,96.490,3.510,46.17,224,0.875,bilinear
gcvit_small.in1k,83.892,16.108,96.658,3.342,51.09,224,0.875,bicubic
fastvit_ma36.apple_in1k,83.882,16.118,96.742,3.258,44.07,256,0.950,bicubic
xcit_small_24_p16_224.fb_dist_in1k,83.874,16.126,96.736,3.264,47.67,224,1.000,bicubic
swinv2_small_window8_256.ms_in1k,83.854,16.146,96.644,3.356,49.73,256,0.900,bicubic
resnest200e.in1k,83.844,16.156,96.884,3.116,70.20,320,0.909,bicubic
crossvit_15_dagger_408.in1k,83.840,16.160,96.778,3.222,28.50,408,1.000,bicubic
focalnet_base_lrf.ms_in1k,83.838,16.162,96.608,3.392,88.75,224,0.900,bicubic
resnetv2_152x2_bit.goog_teacher_in21k_ft_in1k_384,83.836,16.164,97.126,2.874,236.34,384,1.000,bicubic
xcit_small_24_p8_224.fb_in1k,83.834,16.166,96.632,3.368,47.63,224,1.000,bicubic
focalnet_base_srf.ms_in1k,83.820,16.180,96.680,3.320,88.15,224,0.900,bicubic
tf_efficientnet_b5.ra_in1k,83.814,16.186,96.752,3.248,30.39,456,0.934,bicubic
efficientnetv2_rw_s.ra2_in1k,83.806,16.194,96.732,3.268,23.94,384,1.000,bicubic
vit_small_patch16_384.augreg_in21k_ft_in1k,83.804,16.196,97.100,2.900,22.20,384,1.000,bicubic
efficientvit_b3.r256_in1k,83.802,16.198,96.516,3.484,48.65,256,1.000,bicubic
deit3_base_patch16_224.fb_in1k,83.786,16.214,96.586,3.414,86.59,224,0.900,bicubic
regnety_160.swag_lc_in1k,83.782,16.218,97.280,2.720,83.59,224,0.965,bicubic
mvitv2_small.fb_in1k,83.770,16.230,96.576,3.424,34.87,224,0.900,bicubic
pit_b_distilled_224.in1k,83.766,16.234,96.468,3.532,74.79,224,0.900,bicubic
swin_s3_small_224.ms_in1k,83.756,16.244,96.452,3.548,49.74,224,0.900,bicubic
xcit_tiny_24_p8_384.fb_dist_in1k,83.746,16.254,96.710,3.290,12.11,384,1.000,bicubic
xcit_medium_24_p8_224.fb_in1k,83.746,16.254,96.400,3.600,84.32,224,1.000,bicubic
repvit_m2_3.dist_450e_in1k,83.742,16.258,96.644,3.356,23.69,224,0.950,bicubic
pvt_v2_b5.in1k,83.740,16.260,96.636,3.364,81.96,224,0.900,bicubic
convformer_s18.sail_in22k_ft_in1k,83.738,16.262,97.048,2.952,26.77,224,1.000,bicubic
regnety_064.ra3_in1k,83.720,16.280,96.722,3.278,30.58,288,1.000,bicubic
regnetv_064.ra3_in1k,83.716,16.284,96.742,3.258,30.58,288,1.000,bicubic
pvt_v2_b4.in1k,83.712,16.288,96.670,3.330,62.56,224,0.900,bicubic
resnetrs152.tf_in1k,83.702,16.298,96.612,3.388,86.62,320,1.000,bicubic
convnext_small.fb_in1k,83.700,16.300,96.808,3.192,50.22,288,1.000,bicubic
regnety_160.deit_in1k,83.690,16.310,96.780,3.220,83.59,288,1.000,bicubic
tf_efficientnet_b5.aa_in1k,83.688,16.312,96.712,3.288,30.39,456,0.934,bicubic
resnet152d.ra2_in1k,83.684,16.316,96.738,3.262,60.21,320,1.000,bicubic
twins_svt_large.in1k,83.678,16.322,96.588,3.412,99.27,224,0.900,bicubic
caformer_s18.sail_in1k,83.654,16.346,96.518,3.482,26.34,224,1.000,bicubic
efficientformerv2_l.snap_dist_in1k,83.632,16.368,96.558,3.442,26.32,224,0.950,bicubic
swin_base_patch4_window7_224.ms_in1k,83.606,16.394,96.452,3.548,87.77,224,0.900,bicubic
coat_lite_medium.in1k,83.600,16.400,96.728,3.272,44.57,224,0.900,bicubic
coatnet_1_rw_224.sw_in1k,83.596,16.404,96.382,3.618,41.72,224,0.950,bicubic
resmlp_big_24_224.fb_distilled_in1k,83.592,16.408,96.650,3.350,129.14,224,0.875,bicubic
inception_next_small.sail_in1k,83.578,16.422,96.598,3.402,49.37,224,0.875,bicubic
repvgg_d2se.rvgg_in1k,83.560,16.440,96.658,3.342,133.33,320,1.000,bilinear
cs3se_edgenet_x.c2ns_in1k,83.546,16.454,96.670,3.330,50.72,320,1.000,bicubic
nest_base_jx.goog_in1k,83.534,16.466,96.374,3.626,67.72,224,0.875,bicubic
repvit_m2_3.dist_300e_in1k,83.504,16.496,96.514,3.486,23.69,224,0.950,bicubic
maxvit_tiny_rw_224.sw_in1k,83.504,16.496,96.504,3.496,29.06,224,0.950,bicubic
fastvit_sa36.apple_in1k,83.500,16.500,96.630,3.370,31.53,256,0.900,bicubic
swinv2_cr_small_ns_224.sw_in1k,83.498,16.502,96.484,3.516,49.70,224,0.900,bicubic
focalnet_small_lrf.ms_in1k,83.494,16.506,96.580,3.420,50.34,224,0.900,bicubic
dm_nfnet_f0.dm_in1k,83.486,16.514,96.568,3.432,71.49,256,0.900,bicubic
convnextv2_tiny.fcmae_ft_in1k,83.464,16.536,96.718,3.282,28.64,288,1.000,bicubic
efficientvit_b3.r224_in1k,83.460,16.540,96.330,3.670,48.65,224,0.950,bicubic
resnet152.a1h_in1k,83.450,16.550,96.538,3.462,60.19,288,1.000,bicubic
cait_s24_224.fb_dist_in1k,83.442,16.558,96.574,3.426,46.92,224,1.000,bicubic
deit3_small_patch16_384.fb_in1k,83.428,16.572,96.674,3.326,22.21,384,1.000,bicubic
focalnet_small_srf.ms_in1k,83.416,16.584,96.438,3.562,49.89,224,0.900,bicubic
efficientnet_b4.ra2_in1k,83.414,16.586,96.598,3.402,19.34,384,1.000,bicubic
maxvit_tiny_tf_224.in1k,83.402,16.598,96.590,3.410,30.92,224,0.950,bicubic
mobilevitv2_200.cvnets_in22k_ft_in1k_384,83.400,16.600,96.582,3.418,18.45,384,1.000,bicubic
sequencer2d_l.in1k,83.394,16.606,96.496,3.504,54.30,224,0.875,bicubic
deit_base_distilled_patch16_224.fb_in1k,83.390,16.610,96.488,3.512,87.34,224,0.900,bicubic
gcvit_tiny.in1k,83.384,16.616,96.398,3.602,28.22,224,0.875,bicubic
efficientformer_l7.snap_dist_in1k,83.382,16.618,96.536,3.464,82.23,224,0.950,bicubic
convnextv2_nano.fcmae_ft_in22k_in1k_384,83.374,16.626,96.744,3.256,15.62,384,1.000,bicubic
coatnet_rmlp_1_rw_224.sw_in1k,83.362,16.638,96.450,3.550,41.69,224,0.950,bicubic
vit_base_patch32_384.augreg_in21k_ft_in1k,83.352,16.648,96.840,3.160,88.30,384,1.000,bicubic
fastvit_sa24.apple_dist_in1k,83.342,16.658,96.552,3.448,21.55,256,0.900,bicubic
resnext101_32x16d.fb_swsl_ig1b_ft_in1k,83.336,16.664,96.846,3.154,194.03,224,0.875,bilinear
xcit_small_12_p8_224.fb_in1k,83.334,16.666,96.482,3.518,26.21,224,1.000,bicubic
regnety_320.seer_ft_in1k,83.328,16.672,96.708,3.292,145.05,384,1.000,bicubic
xcit_small_12_p16_224.fb_dist_in1k,83.328,16.672,96.416,3.584,26.25,224,1.000,bicubic
swin_small_patch4_window7_224.ms_in22k_ft_in1k,83.298,16.702,96.964,3.036,49.61,224,0.900,bicubic
vit_base_patch32_clip_224.laion2b_ft_in12k_in1k,83.296,16.704,96.528,3.472,88.22,224,0.900,bicubic
tiny_vit_21m_224.in1k,83.254,16.746,96.592,3.408,21.20,224,0.950,bicubic
tf_efficientnet_b4.ap_in1k,83.250,16.750,96.396,3.604,19.34,380,0.922,bicubic
tiny_vit_11m_224.dist_in22k_ft_in1k,83.228,16.772,96.630,3.370,11.00,224,0.950,bicubic
resnext101_32x4d.fb_swsl_ig1b_ft_in1k,83.226,16.774,96.760,3.240,44.18,224,0.875,bilinear
swin_small_patch4_window7_224.ms_in1k,83.208,16.792,96.316,3.684,49.61,224,0.900,bicubic
regnetv_040.ra3_in1k,83.190,16.810,96.658,3.342,20.64,288,1.000,bicubic
xception65.ra3_in1k,83.180,16.820,96.592,3.408,39.92,299,0.940,bicubic
tf_efficientnet_b5.in1k,83.176,16.824,96.536,3.464,30.39,456,0.934,bicubic
regnety_320.tv2_in1k,83.162,16.838,96.414,3.586,145.05,224,0.965,bicubic
resnext101_64x4d.c1_in1k,83.156,16.844,96.374,3.626,83.46,288,1.000,bicubic
rexnetr_200.sw_in12k_ft_in1k,83.138,16.862,96.636,3.364,16.52,288,1.000,bicubic
swinv2_cr_small_224.sw_in1k,83.136,16.864,96.108,3.892,49.70,224,0.900,bicubic
twins_pcpvt_large.in1k,83.130,16.870,96.604,3.396,60.99,224,0.900,bicubic
xception65p.ra3_in1k,83.126,16.874,96.482,3.518,39.82,299,0.940,bicubic
nest_small_jx.goog_in1k,83.124,16.876,96.320,3.680,38.35,224,0.875,bicubic
twins_svt_base.in1k,83.120,16.880,96.414,3.586,56.07,224,0.900,bicubic
pvt_v2_b3.in1k,83.118,16.882,96.556,3.444,45.24,224,0.900,bicubic
maxxvitv2_nano_rw_256.sw_in1k,83.110,16.890,96.324,3.676,23.70,256,0.950,bicubic
deit_base_patch16_384.fb_in1k,83.104,16.896,96.368,3.632,86.86,384,1.000,bicubic
efficientvit_b2.r288_in1k,83.100,16.900,96.304,3.696,24.33,288,1.000,bicubic
deit3_medium_patch16_224.fb_in1k,83.086,16.914,96.294,3.706,38.85,224,0.900,bicubic
deit3_small_patch16_224.fb_in22k_ft_in1k,83.076,16.924,96.776,3.224,22.06,224,1.000,bicubic
tresnet_m.miil_in21k_ft_in1k,83.070,16.930,96.110,3.890,31.39,224,0.875,bilinear
tresnet_xl.miil_in1k_448,83.058,16.942,96.172,3.828,78.44,448,0.875,bilinear
regnety_040.ra3_in1k,83.044,16.956,96.502,3.498,20.65,288,1.000,bicubic
maxxvit_rmlp_nano_rw_256.sw_in1k,83.042,16.958,96.350,3.650,16.78,256,0.950,bicubic
resnet101d.ra2_in1k,83.020,16.980,96.452,3.548,44.57,320,1.000,bicubic
tf_efficientnet_b4.aa_in1k,83.018,16.982,96.300,3.700,19.34,380,0.922,bicubic
resnetv2_101.a1h_in1k,83.000,17.000,96.454,3.546,44.54,288,1.000,bicubic
resnext101_64x4d.tv_in1k,82.992,17.008,96.244,3.756,83.46,224,0.875,bilinear
convformer_s18.sail_in1k,82.986,17.014,96.250,3.750,26.77,224,1.000,bicubic
ecaresnet101d.miil_in1k,82.984,17.016,96.542,3.458,44.57,288,0.950,bicubic
maxvit_rmlp_nano_rw_256.sw_in1k,82.954,17.046,96.266,3.734,15.50,256,0.950,bicubic
mobilevitv2_175.cvnets_in22k_ft_in1k_384,82.938,17.062,96.426,3.574,14.25,384,1.000,bicubic
maxvit_nano_rw_256.sw_in1k,82.928,17.072,96.220,3.780,15.45,256,0.950,bicubic
xcit_large_24_p16_224.fb_in1k,82.902,17.098,95.884,4.116,189.10,224,1.000,bicubic
resnest101e.in1k,82.884,17.116,96.322,3.678,48.28,256,0.875,bilinear
resnetv2_152x2_bit.goog_teacher_in21k_ft_in1k,82.876,17.124,96.582,3.418,236.34,224,0.875,bicubic
convnext_nano.in12k_ft_in1k,82.862,17.138,96.556,3.444,15.59,288,1.000,bicubic
resnext101_32x8d.tv2_in1k,82.832,17.168,96.232,3.768,88.79,224,0.965,bilinear
resnetv2_50x1_bit.goog_distilled_in1k,82.824,17.176,96.518,3.482,25.55,224,0.875,bicubic
sequencer2d_m.in1k,82.812,17.188,96.274,3.726,38.31,224,0.875,bicubic
regnetx_320.tv2_in1k,82.810,17.190,96.208,3.792,107.81,224,0.965,bicubic
swinv2_tiny_window16_256.ms_in1k,82.804,17.196,96.236,3.764,28.35,256,0.900,bicubic
pnasnet5large.tf_in1k,82.782,17.218,96.040,3.960,86.06,331,0.911,bicubic
resnet101.a1h_in1k,82.778,17.222,96.310,3.690,44.55,288,1.000,bicubic
rexnet_300.nav_in1k,82.774,17.226,96.238,3.762,34.71,224,0.875,bicubic
vit_relpos_base_patch16_clsgap_224.sw_in1k,82.760,17.240,96.172,3.828,86.43,224,0.900,bicubic
nfnet_l0.ra2_in1k,82.750,17.250,96.516,3.484,35.07,288,1.000,bicubic
resnet152.a1_in1k,82.732,17.268,95.720,4.280,60.19,288,1.000,bicubic
regnety_032.ra_in1k,82.726,17.274,96.416,3.584,19.44,288,1.000,bicubic
twins_pcpvt_base.in1k,82.714,17.286,96.346,3.654,43.83,224,0.900,bicubic
cs3edgenet_x.c2_in1k,82.708,17.292,96.370,3.630,47.82,288,1.000,bicubic
resnext101_32x8d.fb_wsl_ig1b_ft_in1k,82.698,17.302,96.632,3.368,88.79,224,0.875,bilinear
convnext_tiny.fb_in1k,82.698,17.302,96.144,3.856,28.59,288,1.000,bicubic
davit_tiny.msft_in1k,82.696,17.304,96.274,3.726,28.36,224,0.950,bicubic
efficientvit_b2.r256_in1k,82.690,17.310,96.094,3.906,24.33,256,1.000,bicubic
fastvit_sa24.apple_in1k,82.678,17.322,96.272,3.728,21.55,256,0.900,bicubic
tf_efficientnetv2_b3.in21k_ft_in1k,82.670,17.330,96.626,3.374,14.36,300,0.900,bicubic
convnextv2_nano.fcmae_ft_in22k_in1k,82.664,17.336,96.520,3.480,15.62,288,1.000,bicubic
cs3sedarknet_x.c2ns_in1k,82.658,17.342,96.350,3.650,35.40,288,1.000,bicubic
regnety_160.tv2_in1k,82.646,17.354,96.214,3.786,83.59,224,0.965,bicubic
xcit_medium_24_p16_224.fb_in1k,82.640,17.360,95.982,4.018,84.40,224,1.000,bicubic
regnetz_c16_evos.ch_in1k,82.636,17.364,96.474,3.526,13.49,320,0.950,bicubic
regnetz_c16.ra3_in1k,82.632,17.368,96.318,3.682,13.46,320,1.000,bicubic
nasnetalarge.tf_in1k,82.626,17.374,96.042,3.958,88.75,331,0.911,bicubic
poolformerv2_m48.sail_in1k,82.618,17.382,96.072,3.928,73.35,224,1.000,bicubic
tf_efficientnet_b4.in1k,82.608,17.392,96.128,3.872,19.34,380,0.922,bicubic
resnet152.a2_in1k,82.608,17.392,95.752,4.248,60.19,288,1.000,bicubic
resnetaa50d.sw_in12k_ft_in1k,82.600,17.400,96.498,3.502,25.58,288,1.000,bicubic
levit_384.fb_dist_in1k,82.596,17.404,96.018,3.982,39.13,224,0.900,bicubic
regnety_080_tv.tv2_in1k,82.594,17.406,96.248,3.752,39.38,224,0.965,bicubic
levit_conv_384.fb_dist_in1k,82.590,17.410,96.016,3.984,39.13,224,0.900,bicubic
mobilevitv2_150.cvnets_in22k_ft_in1k_384,82.586,17.414,96.314,3.686,10.59,384,1.000,bicubic
convnext_tiny_hnf.a2h_in1k,82.584,17.416,96.008,3.992,28.59,288,1.000,bicubic
vit_base_patch32_clip_224.laion2b_ft_in1k,82.582,17.418,96.200,3.800,88.22,224,0.900,bicubic
eca_nfnet_l0.ra2_in1k,82.578,17.422,96.492,3.508,24.14,288,1.000,bicubic
xcit_small_24_p16_224.fb_in1k,82.576,17.424,96.012,3.988,47.67,224,1.000,bicubic
vit_relpos_medium_patch16_cls_224.sw_in1k,82.572,17.428,96.068,3.932,38.76,224,0.900,bicubic
xcit_tiny_24_p16_384.fb_dist_in1k,82.570,17.430,96.276,3.724,12.12,384,1.000,bicubic
xcit_tiny_24_p8_224.fb_dist_in1k,82.566,17.434,96.172,3.828,12.11,224,1.000,bicubic
regnetx_160.tv2_in1k,82.566,17.434,96.058,3.942,54.28,224,0.965,bicubic
efficientformer_l3.snap_dist_in1k,82.548,17.452,96.250,3.750,31.41,224,0.950,bicubic
flexivit_small.1200ep_in1k,82.526,17.474,96.126,3.874,22.06,240,0.950,bicubic
resnet61q.ra2_in1k,82.524,17.476,96.130,3.870,36.85,288,1.000,bicubic
crossvit_18_dagger_240.in1k,82.518,17.482,96.068,3.932,44.27,240,0.875,bicubic
repvit_m1_5.dist_450e_in1k,82.512,17.488,96.112,3.888,14.64,224,0.950,bicubic
wide_resnet101_2.tv2_in1k,82.502,17.498,96.016,3.984,126.89,224,0.965,bilinear
vit_relpos_base_patch16_224.sw_in1k,82.496,17.504,96.138,3.862,86.43,224,0.900,bicubic
convnextv2_nano.fcmae_ft_in1k,82.486,17.514,96.226,3.774,15.62,288,1.000,bicubic
poolformer_m48.sail_in1k,82.482,17.518,95.966,4.034,73.47,224,0.950,bicubic
inception_next_tiny.sail_in1k,82.478,17.522,96.022,3.978,28.06,224,0.875,bicubic
vit_relpos_medium_patch16_224.sw_in1k,82.462,17.538,96.082,3.918,38.75,224,0.900,bicubic
gc_efficientnetv2_rw_t.agc_in1k,82.456,17.544,96.296,3.704,13.68,288,1.000,bicubic
pit_b_224.in1k,82.438,17.562,95.714,4.286,73.76,224,0.900,bicubic
mvitv2_tiny.fb_in1k,82.410,17.590,96.152,3.848,24.17,224,0.900,bicubic
coatnet_bn_0_rw_224.sw_in1k,82.400,17.600,96.186,3.814,27.44,224,0.950,bicubic
crossvit_18_240.in1k,82.400,17.600,96.060,3.940,43.27,240,0.875,bicubic
coatnet_0_rw_224.sw_in1k,82.390,17.610,95.836,4.164,27.44,224,0.950,bicubic
xcit_tiny_12_p8_384.fb_dist_in1k,82.388,17.612,96.220,3.780,6.71,384,1.000,bicubic
tf_efficientnet_b2.ns_jft_in1k,82.378,17.622,96.254,3.746,9.11,260,0.890,bicubic
repvit_m1_5.dist_300e_in1k,82.376,17.624,96.030,3.970,14.64,224,0.950,bicubic
coat_small.in1k,82.362,17.638,96.208,3.792,21.69,224,0.900,bicubic
flexivit_small.600ep_in1k,82.362,17.638,96.084,3.916,22.06,240,0.950,bicubic
resnet51q.ra2_in1k,82.360,17.640,96.186,3.814,35.70,288,1.000,bilinear
ecaresnet50t.ra2_in1k,82.352,17.648,96.140,3.860,25.57,320,0.950,bicubic
efficientnetv2_rw_t.ra2_in1k,82.350,17.650,96.192,3.808,13.65,288,1.000,bicubic
resnetv2_101x1_bit.goog_in21k_ft_in1k,82.342,17.658,96.520,3.480,44.54,448,1.000,bilinear
sequencer2d_s.in1k,82.340,17.660,96.028,3.972,27.65,224,0.875,bicubic
mobilevitv2_200.cvnets_in22k_ft_in1k,82.332,17.668,95.942,4.058,18.45,256,0.888,bicubic
crossvit_15_dagger_240.in1k,82.330,17.670,95.956,4.044,28.21,240,0.875,bicubic
resnet101.a1_in1k,82.322,17.678,95.632,4.368,44.55,288,1.000,bicubic
coat_lite_small.in1k,82.312,17.688,95.850,4.150,19.84,224,0.900,bicubic
vit_relpos_medium_patch16_rpn_224.sw_in1k,82.310,17.690,95.972,4.028,38.73,224,0.900,bicubic
mixer_b16_224.miil_in21k_ft_in1k,82.306,17.694,95.720,4.280,59.88,224,0.875,bilinear
convit_base.fb_in1k,82.290,17.710,95.936,4.064,86.54,224,0.875,bicubic
resnet152.tv2_in1k,82.286,17.714,96.004,3.996,60.19,224,0.965,bilinear
resnetrs101.tf_in1k,82.284,17.716,96.014,3.986,63.62,288,0.940,bicubic
wide_resnet50_2.racm_in1k,82.280,17.720,96.064,3.936,68.88,288,0.950,bicubic
tresnet_l.miil_in1k_448,82.276,17.724,95.978,4.022,55.99,448,0.875,bilinear
efficientnet_b3.ra2_in1k,82.246,17.754,96.118,3.882,12.23,320,1.000,bicubic
vit_srelpos_medium_patch16_224.sw_in1k,82.240,17.760,95.942,4.058,38.74,224,0.900,bicubic
resnet101.a2_in1k,82.236,17.764,95.730,4.270,44.55,288,1.000,bicubic
cs3darknet_x.c2ns_in1k,82.222,17.778,96.230,3.770,35.05,288,1.000,bicubic
poolformerv2_m36.sail_in1k,82.216,17.784,95.924,4.076,56.08,224,1.000,bicubic
crossvit_base_240.in1k,82.214,17.786,95.834,4.166,105.03,240,0.875,bicubic
cait_xxs36_384.fb_dist_in1k,82.204,17.796,96.144,3.856,17.37,384,1.000,bicubic
vit_base_patch16_rpn_224.sw_in1k,82.202,17.798,95.996,4.004,86.54,224,0.900,bicubic
seresnext50_32x4d.racm_in1k,82.196,17.804,96.148,3.852,27.56,288,0.950,bicubic
pvt_v2_b2_li.in1k,82.194,17.806,96.092,3.908,22.55,224,0.900,bicubic
flexivit_small.300ep_in1k,82.178,17.822,96.038,3.962,22.06,240,0.950,bicubic
resnext50_32x4d.fb_swsl_ig1b_ft_in1k,82.172,17.828,96.224,3.776,25.03,224,0.875,bilinear
efficientformerv2_s2.snap_dist_in1k,82.166,17.834,95.910,4.090,12.71,224,0.950,bicubic
focalnet_tiny_lrf.ms_in1k,82.154,17.846,95.948,4.052,28.65,224,0.900,bicubic
efficientvit_b2.r224_in1k,82.148,17.852,95.706,4.294,24.33,224,0.950,bicubic
swin_s3_tiny_224.ms_in1k,82.144,17.856,95.954,4.046,28.33,224,0.900,bicubic
focalnet_tiny_srf.ms_in1k,82.138,17.862,95.968,4.032,28.43,224,0.900,bicubic
ecaresnet50t.a1_in1k,82.128,17.872,95.642,4.358,25.57,288,1.000,bicubic
visformer_small.in1k,82.106,17.894,95.878,4.122,40.22,224,0.900,bicubic
poolformer_m36.sail_in1k,82.102,17.898,95.698,4.302,56.17,224,0.950,bicubic
pvt_v2_b2.in1k,82.084,17.916,95.956,4.044,25.36,224,0.900,bicubic
tresnet_xl.miil_in1k,82.074,17.926,95.928,4.072,78.44,224,0.875,bilinear
halo2botnet50ts_256.a1h_in1k,82.060,17.940,95.634,4.366,22.64,256,0.950,bicubic
coatnet_rmlp_nano_rw_224.sw_in1k,82.050,17.950,95.878,4.122,15.15,224,0.900,bicubic
hrnet_w18_ssld.paddle_in1k,82.048,17.952,96.250,3.750,21.30,288,1.000,bilinear
fbnetv3_g.ra2_in1k,82.040,17.960,96.060,3.940,16.62,288,0.950,bilinear
resnext50_32x4d.a1h_in1k,82.014,17.986,95.934,4.066,25.03,288,1.000,bicubic
resnetv2_50d_evos.ah_in1k,82.002,17.998,95.900,4.100,25.59,288,1.000,bicubic
ecaresnet101d_pruned.miil_in1k,81.998,18.002,96.160,3.840,24.88,288,0.950,bicubic
deit_base_patch16_224.fb_in1k,81.992,18.008,95.736,4.264,86.57,224,0.900,bicubic
xception41p.ra3_in1k,81.972,18.028,95.802,4.198,26.91,299,0.940,bicubic
tf_efficientnetv2_b3.in1k,81.972,18.028,95.784,4.216,14.36,300,0.904,bicubic
xcit_small_12_p16_224.fb_in1k,81.970,18.030,95.812,4.188,26.25,224,1.000,bicubic
resnetv2_50d_gn.ah_in1k,81.958,18.042,95.928,4.072,25.57,288,1.000,bicubic
gcvit_xtiny.in1k,81.954,18.046,95.966,4.034,19.98,224,0.875,bicubic
coatnext_nano_rw_224.sw_in1k,81.942,18.058,95.916,4.084,14.70,224,0.900,bicubic
mobilevitv2_175.cvnets_in22k_ft_in1k,81.938,18.062,95.790,4.210,14.25,256,0.888,bicubic
vit_base_patch32_clip_224.openai_ft_in1k,81.930,18.070,95.966,4.034,88.22,224,0.900,bicubic
xcit_tiny_24_p8_224.fb_in1k,81.892,18.108,95.970,4.030,12.11,224,1.000,bicubic
resnet101.tv2_in1k,81.888,18.112,95.768,4.232,44.55,224,0.965,bilinear
vit_small_r26_s32_224.augreg_in21k_ft_in1k,81.864,18.136,96.022,3.978,36.43,224,0.900,bicubic
fastvit_sa12.apple_dist_in1k,81.854,18.146,95.710,4.290,11.58,256,0.900,bicubic
resnext101_32x16d.fb_ssl_yfcc100m_ft_in1k,81.838,18.162,96.092,3.908,194.03,224,0.875,bilinear
swinv2_tiny_window8_256.ms_in1k,81.820,18.180,95.994,4.006,28.35,256,0.900,bicubic
tf_efficientnet_b3.ap_in1k,81.820,18.180,95.626,4.374,12.23,300,0.904,bicubic
pit_s_distilled_224.in1k,81.814,18.186,95.730,4.270,24.04,224,0.900,bicubic
swinv2_cr_tiny_ns_224.sw_in1k,81.802,18.198,95.818,4.182,28.33,224,0.900,bicubic
vit_base_patch16_224.orig_in21k_ft_in1k,81.790,18.210,96.126,3.874,86.57,224,0.900,bicubic
cs3sedarknet_l.c2ns_in1k,81.784,18.216,95.964,4.036,21.91,288,0.950,bicubic
regnety_032.tv2_in1k,81.756,18.244,95.844,4.156,19.44,224,0.965,bicubic
tresnet_m.miil_in1k_448,81.710,18.290,95.574,4.426,31.39,448,0.875,bilinear
coatnet_nano_rw_224.sw_in1k,81.696,18.304,95.646,4.354,15.14,224,0.900,bicubic
twins_svt_small.in1k,81.676,18.324,95.658,4.342,24.06,224,0.900,bicubic
halonet50ts.a1h_in1k,81.662,18.338,95.610,4.390,22.73,256,0.940,bicubic
ecaresnet50t.a2_in1k,81.658,18.342,95.550,4.450,25.57,288,1.000,bicubic
ecaresnet50d.miil_in1k,81.650,18.350,95.882,4.118,25.58,288,0.950,bicubic
tf_efficientnet_b3.aa_in1k,81.640,18.360,95.722,4.278,12.23,300,0.904,bicubic
rexnet_200.nav_in1k,81.636,18.364,95.666,4.334,16.37,224,0.875,bicubic
resnetaa50.a1h_in1k,81.614,18.386,95.802,4.198,25.56,288,1.000,bicubic
resnext101_32x8d.fb_ssl_yfcc100m_ft_in1k,81.606,18.394,96.040,3.960,88.79,224,0.875,bilinear
wide_resnet50_2.tv2_in1k,81.606,18.394,95.760,4.240,68.88,224,0.965,bilinear
convnext_nano_ols.d1h_in1k,81.600,18.400,95.636,4.364,15.65,288,1.000,bicubic
poolformerv2_s36.sail_in1k,81.566,18.434,95.690,4.310,30.79,224,1.000,bicubic
edgenext_small.usi_in1k,81.564,18.436,95.712,4.288,5.59,320,1.000,bicubic
lamhalobotnet50ts_256.a1h_in1k,81.552,18.448,95.492,4.508,22.57,256,0.950,bicubic
regnetx_080.tv2_in1k,81.540,18.460,95.542,4.458,39.57,224,0.965,bicubic
tnt_s_patch16_224,81.536,18.464,95.736,4.264,23.76,224,0.900,bicubic
crossvit_15_240.in1k,81.536,18.464,95.690,4.310,27.53,240,0.875,bicubic
tf_efficientnet_lite4.in1k,81.530,18.470,95.664,4.336,13.01,380,0.920,bilinear
levit_256.fb_dist_in1k,81.524,18.476,95.494,4.506,18.89,224,0.900,bicubic
levit_conv_256.fb_dist_in1k,81.522,18.478,95.490,4.510,18.89,224,0.900,bicubic
vit_large_patch32_384.orig_in21k_ft_in1k,81.510,18.490,96.090,3.910,306.63,384,1.000,bicubic
repvit_m3.dist_in1k,81.502,18.498,95.568,4.432,10.68,224,0.950,bicubic
tiny_vit_11m_224.in1k,81.492,18.508,95.862,4.138,11.00,224,0.950,bicubic
mobilevitv2_150.cvnets_in22k_ft_in1k,81.488,18.512,95.668,4.332,10.59,256,0.888,bicubic
convnext_nano.d1h_in1k,81.482,18.518,95.658,4.342,15.59,288,1.000,bicubic
tresnet_l.miil_in1k,81.480,18.520,95.624,4.376,55.99,224,0.875,bilinear
resnext50_32x4d.a1_in1k,81.466,18.534,95.174,4.826,25.03,288,1.000,bicubic
vit_relpos_small_patch16_224.sw_in1k,81.462,18.538,95.820,4.180,21.98,224,0.900,bicubic
gcresnet50t.ra2_in1k,81.456,18.544,95.718,4.282,25.90,288,1.000,bicubic
resnet50d.a1_in1k,81.450,18.550,95.218,4.782,25.58,288,1.000,bicubic
poolformer_s36.sail_in1k,81.430,18.570,95.444,4.556,30.86,224,0.900,bicubic
nest_tiny_jx.goog_in1k,81.426,18.574,95.618,4.382,17.06,224,0.875,bicubic
convit_small.fb_in1k,81.420,18.580,95.744,4.256,27.78,224,0.875,bicubic
ecaresnetlight.miil_in1k,81.408,18.592,95.816,4.184,30.16,288,0.950,bicubic
resnetv2_50.a1h_in1k,81.398,18.602,95.726,4.274,25.55,288,1.000,bicubic
tf_efficientnet_b1.ns_jft_in1k,81.388,18.612,95.738,4.262,7.79,240,0.882,bicubic
vit_small_patch16_224.augreg_in21k_ft_in1k,81.386,18.614,96.136,3.864,22.05,224,0.900,bicubic
swin_tiny_patch4_window7_224.ms_in1k,81.376,18.624,95.544,4.456,28.29,224,0.900,bicubic
deit3_small_patch16_224.fb_in1k,81.370,18.630,95.456,4.544,22.06,224,0.900,bicubic
convmixer_1536_20.in1k,81.362,18.638,95.614,4.386,51.63,224,0.960,bicubic
resnet50d.ra2_in1k,81.356,18.644,95.738,4.262,25.58,288,0.950,bicubic
gernet_l.idstcv_in1k,81.354,18.646,95.530,4.470,31.08,256,0.875,bilinear
repvit_m1_1.dist_450e_in1k,81.312,18.688,95.560,4.440,8.80,224,0.950,bicubic
efficientnet_el.ra_in1k,81.312,18.688,95.536,4.464,10.59,300,0.904,bicubic
legacy_senet154.in1k,81.312,18.688,95.490,4.510,115.09,224,0.875,bilinear
resnext50_32x4d.a2_in1k,81.304,18.696,95.096,4.904,25.03,288,1.000,bicubic
seresnet50.ra2_in1k,81.284,18.716,95.652,4.348,28.09,288,0.950,bicubic
coat_mini.in1k,81.270,18.730,95.382,4.618,10.34,224,0.900,bicubic
gcresnext50ts.ch_in1k,81.230,18.770,95.542,4.458,15.67,288,1.000,bicubic
senet154.gluon_in1k,81.226,18.774,95.358,4.642,115.09,224,0.875,bicubic
res2net101d.in1k,81.218,18.782,95.350,4.650,45.23,224,0.875,bilinear
resnet50_gn.a1h_in1k,81.216,18.784,95.624,4.376,25.56,288,0.950,bicubic
deit_small_distilled_patch16_224.fb_in1k,81.216,18.784,95.384,4.616,22.44,224,0.900,bicubic
resnet50.a1_in1k,81.214,18.786,95.102,4.898,25.56,288,1.000,bicubic
xcit_tiny_12_p8_224.fb_dist_in1k,81.212,18.788,95.602,4.398,6.71,224,1.000,bicubic
resnext50_32x4d.tv2_in1k,81.182,18.818,95.340,4.660,25.03,224,0.965,bilinear
resnet50.fb_swsl_ig1b_ft_in1k,81.172,18.828,95.986,4.014,25.56,224,0.875,bilinear
sebotnet33ts_256.a1h_in1k,81.168,18.832,95.168,4.832,13.70,256,0.940,bicubic
resnet50d.a2_in1k,81.164,18.836,95.080,4.920,25.58,288,1.000,bicubic
lambda_resnet50ts.a1h_in1k,81.158,18.842,95.098,4.902,21.54,256,0.950,bicubic
resmlp_36_224.fb_distilled_in1k,81.148,18.852,95.478,4.522,44.69,224,0.875,bicubic
mobilevitv2_200.cvnets_in1k,81.134,18.866,95.362,4.638,18.45,256,0.888,bicubic
resnest50d_4s2x40d.in1k,81.120,18.880,95.560,4.440,30.42,224,0.875,bicubic
vit_small_patch16_384.augreg_in1k,81.116,18.884,95.574,4.426,22.20,384,1.000,bicubic
seresnet50.a2_in1k,81.106,18.894,95.222,4.778,28.09,288,1.000,bicubic
vit_base_patch16_384.augreg_in1k,81.102,18.898,95.328,4.672,86.86,384,1.000,bicubic
seresnet50.a1_in1k,81.102,18.898,95.120,4.880,28.09,288,1.000,bicubic
twins_pcpvt_small.in1k,81.092,18.908,95.648,4.352,24.11,224,0.900,bicubic
vit_srelpos_small_patch16_224.sw_in1k,81.092,18.908,95.570,4.430,21.97,224,0.900,bicubic
convnextv2_pico.fcmae_ft_in1k,81.086,18.914,95.480,4.520,9.07,288,0.950,bicubic
pit_s_224.in1k,81.086,18.914,95.330,4.670,23.46,224,0.900,bicubic
fastvit_s12.apple_dist_in1k,81.070,18.930,95.284,4.716,9.47,256,0.900,bicubic
haloregnetz_b.ra3_in1k,81.046,18.954,95.200,4.800,11.68,224,0.940,bicubic
resmlp_big_24_224.fb_in1k,81.036,18.964,95.018,4.982,129.14,224,0.875,bicubic
crossvit_small_240.in1k,81.018,18.982,95.456,4.544,26.86,240,0.875,bicubic
resnet152s.gluon_in1k,81.008,18.992,95.416,4.584,60.32,224,0.875,bicubic
resnest50d_1s4x24d.in1k,80.988,19.012,95.326,4.674,25.68,224,0.875,bicubic
cait_xxs24_384.fb_dist_in1k,80.972,19.028,95.640,4.360,12.03,384,1.000,bicubic
resnet50.d_in1k,80.972,19.028,95.430,4.570,25.56,288,1.000,bicubic
swin_tiny_patch4_window7_224.ms_in22k_ft_in1k,80.968,19.032,96.014,3.986,28.29,224,0.900,bicubic
resnest50d.in1k,80.960,19.040,95.382,4.618,27.48,224,0.875,bilinear
sehalonet33ts.ra2_in1k,80.958,19.042,95.272,4.728,13.69,256,0.940,bicubic
xcit_tiny_12_p16_384.fb_dist_in1k,80.938,19.062,95.414,4.586,6.72,384,1.000,bicubic
regnetx_032.tv2_in1k,80.926,19.074,95.278,4.722,15.30,224,0.965,bicubic
resnext101_32x4d.fb_ssl_yfcc100m_ft_in1k,80.924,19.076,95.734,4.266,44.18,224,0.875,bilinear
resnet50.c1_in1k,80.912,19.088,95.552,4.448,25.56,288,1.000,bicubic
cs3darknet_l.c2ns_in1k,80.896,19.104,95.662,4.338,21.16,288,0.950,bicubic
seresnext101_64x4d.gluon_in1k,80.894,19.106,95.296,4.704,88.23,224,0.875,bicubic
seresnext101_32x4d.gluon_in1k,80.892,19.108,95.296,4.704,48.96,224,0.875,bicubic
cs3darknet_focus_l.c2ns_in1k,80.876,19.124,95.682,4.318,21.15,288,0.950,bicubic
tiny_vit_5m_224.dist_in22k_ft_in1k,80.876,19.124,95.664,4.336,5.39,224,0.950,bicubic
tf_efficientnet_b3.in1k,80.874,19.126,95.300,4.700,12.23,300,0.904,bicubic
resnet50.c2_in1k,80.870,19.130,95.534,4.466,25.56,288,1.000,bicubic
mobilevitv2_175.cvnets_in1k,80.860,19.140,95.256,4.744,14.25,256,0.888,bicubic
efficientnet_b3_pruned.in1k,80.852,19.148,95.244,4.756,9.86,300,0.904,bicubic
resnet50.tv2_in1k,80.848,19.152,95.434,4.566,25.56,224,0.965,bilinear
fastvit_sa12.apple_in1k,80.844,19.156,95.340,4.660,11.58,256,0.900,bicubic
repvit_m1_1.dist_300e_in1k,80.826,19.174,95.170,4.830,8.80,224,0.950,bicubic
regnety_320.pycls_in1k,80.810,19.190,95.238,4.762,145.05,224,0.875,bicubic
tresnet_m.miil_in1k,80.798,19.202,94.856,5.144,31.39,224,0.875,bilinear
ecaresnet50d_pruned.miil_in1k,80.790,19.210,95.570,4.430,19.94,288,0.950,bicubic
seresnet33ts.ra2_in1k,80.784,19.216,95.362,4.638,19.78,288,1.000,bicubic
resnet50.a2_in1k,80.772,19.228,94.988,5.012,25.56,288,1.000,bicubic
resmlp_24_224.fb_distilled_in1k,80.756,19.244,95.224,4.776,30.02,224,0.875,bicubic
poolformerv2_s24.sail_in1k,80.748,19.252,95.310,4.690,21.34,224,1.000,bicubic
gernet_m.idstcv_in1k,80.736,19.264,95.190,4.810,21.14,224,0.875,bilinear
regnetz_b16.ra3_in1k,80.728,19.272,95.518,4.482,9.72,288,1.000,bicubic
vit_base_patch32_224.augreg_in21k_ft_in1k,80.716,19.284,95.566,4.434,88.22,224,0.900,bicubic
resnet50.b1k_in1k,80.706,19.294,95.432,4.568,25.56,288,1.000,bicubic
resnext50_32x4d.ra_in1k,80.698,19.302,95.392,4.608,25.03,288,0.950,bicubic
resnet50.a1h_in1k,80.678,19.322,95.306,4.694,25.56,224,1.000,bicubic
eca_resnet33ts.ra2_in1k,80.672,19.328,95.364,4.636,19.68,288,1.000,bicubic
regnety_016.tv2_in1k,80.666,19.334,95.330,4.670,11.20,224,0.965,bicubic
resnext50d_32x4d.bt_in1k,80.664,19.336,95.420,4.580,25.05,288,0.950,bicubic
nf_resnet50.ra2_in1k,80.640,19.360,95.334,4.666,25.56,288,0.940,bicubic
eva02_tiny_patch14_336.mim_in22k_ft_in1k,80.630,19.370,95.526,4.474,5.76,336,1.000,bicubic
efficientnet_b2.ra_in1k,80.610,19.390,95.314,4.686,9.11,288,1.000,bicubic
gcresnet33ts.ra2_in1k,80.600,19.400,95.322,4.678,19.88,288,1.000,bicubic
resnext101_64x4d.gluon_in1k,80.600,19.400,94.992,5.008,83.46,224,0.875,bicubic
cspresnext50.ra_in1k,80.554,19.446,95.326,4.674,20.57,256,0.887,bilinear
resnet152.a3_in1k,80.546,19.454,95.000,5.000,60.19,224,0.950,bicubic
darknet53.c2ns_in1k,80.532,19.468,95.432,4.568,41.61,288,1.000,bicubic
maxvit_rmlp_pico_rw_256.sw_in1k,80.514,19.486,95.214,4.786,7.52,256,0.950,bicubic
darknetaa53.c2ns_in1k,80.506,19.494,95.322,4.678,36.02,288,1.000,bilinear
repvgg_b3.rvgg_in1k,80.506,19.494,95.254,4.746,123.09,224,0.875,bilinear
efficientformer_l1.snap_dist_in1k,80.498,19.502,94.988,5.012,12.29,224,0.950,bicubic
vit_small_patch32_384.augreg_in21k_ft_in1k,80.486,19.514,95.600,4.400,22.92,384,1.000,bicubic
mixnet_xl.ra_in1k,80.482,19.518,94.936,5.064,11.90,224,0.875,bicubic
resnet152d.gluon_in1k,80.476,19.524,95.202,4.798,60.21,224,0.875,bicubic
convnext_pico_ols.d1_in1k,80.462,19.538,95.252,4.748,9.06,288,1.000,bicubic
repvit_m2.dist_in1k,80.460,19.540,95.168,4.832,8.80,224,0.950,bicubic
inception_resnet_v2.tf_in1k,80.458,19.542,95.308,4.692,55.84,299,0.897,bicubic
edgenext_small_rw.sw_in1k,80.458,19.542,95.190,4.810,7.83,320,1.000,bicubic
resnet50.b2k_in1k,80.454,19.546,95.318,4.682,25.56,288,1.000,bicubic
xcit_tiny_24_p16_224.fb_dist_in1k,80.454,19.546,95.218,4.782,12.12,224,1.000,bicubic
repvit_m1_0.dist_450e_in1k,80.434,19.566,94.918,5.082,7.30,224,0.950,bicubic
resnet101d.gluon_in1k,80.426,19.574,95.024,4.976,44.57,224,0.875,bicubic
convnext_pico.d1_in1k,80.416,19.584,95.048,4.952,9.05,288,0.950,bicubic
regnety_120.pycls_in1k,80.380,19.620,95.126,4.874,51.82,224,0.875,bicubic
mobilevitv2_150.cvnets_in1k,80.370,19.630,95.074,4.926,10.59,256,0.888,bicubic
fastvit_t12.apple_dist_in1k,80.352,19.648,95.042,4.958,7.55,256,0.900,bicubic
ese_vovnet39b.ra_in1k,80.350,19.650,95.366,4.634,24.57,288,0.950,bicubic
resnetv2_50x1_bit.goog_in21k_ft_in1k,80.342,19.658,95.682,4.318,25.55,448,1.000,bilinear
resnext101_32x4d.gluon_in1k,80.340,19.660,94.930,5.070,44.18,224,0.875,bicubic
resnext50_32x4d.fb_ssl_yfcc100m_ft_in1k,80.334,19.666,95.400,4.600,25.03,224,0.875,bilinear
rexnet_150.nav_in1k,80.324,19.676,95.176,4.824,9.73,224,0.875,bicubic
efficientvit_b1.r288_in1k,80.324,19.676,94.990,5.010,9.10,288,1.000,bicubic
tf_efficientnet_b2.ap_in1k,80.310,19.690,95.026,4.974,9.11,260,0.890,bicubic
resnet101s.gluon_in1k,80.304,19.696,95.152,4.848,44.67,224,0.875,bicubic
efficientnet_el_pruned.in1k,80.298,19.702,95.222,4.778,10.59,300,0.904,bicubic
regnety_160.pycls_in1k,80.298,19.702,94.964,5.036,83.59,224,0.875,bicubic
poolformer_s24.sail_in1k,80.294,19.706,95.074,4.926,21.39,224,0.900,bicubic
res2net50d.in1k,80.254,19.746,95.036,4.964,25.72,224,0.875,bilinear
tf_efficientnet_el.in1k,80.248,19.752,95.120,4.880,10.59,300,0.904,bicubic
regnetx_320.pycls_in1k,80.246,19.754,95.022,4.978,107.81,224,0.875,bicubic
vit_base_patch16_224.sam_in1k,80.238,19.762,94.756,5.244,86.57,224,0.900,bicubic
resnetblur50.bt_in1k,80.234,19.766,95.234,4.766,25.56,288,0.950,bicubic
legacy_seresnext101_32x4d.in1k,80.232,19.768,95.020,4.980,48.96,224,0.875,bilinear
repvgg_b3g4.rvgg_in1k,80.216,19.784,95.092,4.908,83.83,224,0.875,bilinear
tf_efficientnetv2_b2.in1k,80.196,19.804,95.042,4.958,10.10,260,0.890,bicubic
dpn107.mx_in1k,80.170,19.830,94.942,5.058,86.92,224,0.875,bicubic
convmixer_768_32.in1k,80.168,19.832,95.074,4.926,21.11,224,0.960,bicubic
skresnext50_32x4d.ra_in1k,80.164,19.836,94.640,5.360,27.48,224,0.875,bicubic
inception_v4.tf_in1k,80.156,19.844,94.970,5.030,42.68,299,0.875,bicubic
repvit_m1_0.dist_300e_in1k,80.126,19.874,94.744,5.256,7.30,224,0.950,bicubic
tf_efficientnet_b2.aa_in1k,80.084,19.916,94.906,5.094,9.11,260,0.890,bicubic
cspdarknet53.ra_in1k,80.068,19.932,95.078,4.922,27.64,256,0.887,bilinear
dpn92.mx_in1k,80.038,19.962,94.860,5.140,37.67,224,0.875,bicubic
inception_resnet_v2.tf_ens_adv_in1k,79.978,20.022,94.948,5.052,55.84,299,0.897,bicubic
resnet50.ram_in1k,79.976,20.024,95.052,4.948,25.56,288,0.950,bicubic
fastvit_s12.apple_in1k,79.942,20.058,94.794,5.206,9.47,256,0.900,bicubic
seresnext50_32x4d.gluon_in1k,79.924,20.076,94.824,5.176,27.56,224,0.875,bicubic
efficientnet_b2_pruned.in1k,79.920,20.080,94.852,5.148,8.31,260,0.890,bicubic
resnet152c.gluon_in1k,79.912,20.088,94.846,5.154,60.21,224,0.875,bicubic
resnetrs50.tf_in1k,79.894,20.106,94.974,5.026,35.69,224,0.910,bicubic
xception71.tf_in1k,79.874,20.126,94.928,5.072,42.34,299,0.903,bicubic
regnety_080.pycls_in1k,79.868,20.132,94.832,5.168,39.18,224,0.875,bicubic
regnetx_160.pycls_in1k,79.866,20.134,94.828,5.172,54.28,224,0.875,bicubic
ecaresnet26t.ra2_in1k,79.850,20.150,95.090,4.910,16.01,320,0.950,bicubic
deit_small_patch16_224.fb_in1k,79.848,20.152,95.044,4.956,22.05,224,0.900,bicubic
levit_conv_192.fb_dist_in1k,79.838,20.162,94.784,5.216,10.95,224,0.900,bicubic
levit_192.fb_dist_in1k,79.838,20.162,94.778,5.222,10.95,224,0.900,bicubic
resnet50.ra_in1k,79.836,20.164,94.966,5.034,25.56,288,0.950,bicubic
dpn131.mx_in1k,79.814,20.186,94.700,5.300,79.25,224,0.875,bicubic
resnet101.a3_in1k,79.814,20.186,94.614,5.386,44.55,224,0.950,bicubic
tf_efficientnet_lite3.in1k,79.806,20.194,94.914,5.086,8.20,300,0.904,bilinear
resmlp_36_224.fb_in1k,79.772,20.228,94.884,5.116,44.69,224,0.875,bicubic
cait_xxs36_224.fb_dist_in1k,79.746,20.254,94.874,5.126,17.30,224,1.000,bicubic
efficientvit_b1.r256_in1k,79.734,20.266,94.780,5.220,9.10,256,1.000,bicubic
gcvit_xxtiny.in1k,79.726,20.274,95.080,4.920,12.00,224,0.875,bicubic
resnet33ts.ra2_in1k,79.726,20.274,94.828,5.172,19.68,288,1.000,bicubic
regnety_064.pycls_in1k,79.716,20.284,94.766,5.234,30.58,224,0.875,bicubic
resnet152.gluon_in1k,79.696,20.304,94.730,5.270,60.19,224,0.875,bicubic
efficientformerv2_s1.snap_dist_in1k,79.692,20.308,94.716,5.284,6.19,224,0.950,bicubic
xcit_tiny_12_p8_224.fb_in1k,79.688,20.312,95.054,4.946,6.71,224,1.000,bicubic
fbnetv3_d.ra2_in1k,79.682,20.318,94.944,5.056,10.31,256,0.950,bilinear
mobilevitv2_125.cvnets_in1k,79.680,20.320,94.858,5.142,7.48,256,0.888,bicubic
dpn98.mx_in1k,79.670,20.330,94.654,5.346,61.57,224,0.875,bicubic
gmlp_s16_224.ra3_in1k,79.644,20.356,94.622,5.378,19.42,224,0.875,bicubic
resnet50.bt_in1k,79.640,20.360,94.892,5.108,25.56,288,0.950,bicubic
tf_efficientnet_b2.in1k,79.608,20.392,94.714,5.286,9.11,260,0.890,bicubic
regnetx_120.pycls_in1k,79.588,20.412,94.742,5.258,46.11,224,0.875,bicubic
cspresnet50.ra_in1k,79.582,20.418,94.710,5.290,21.62,256,0.887,bilinear
xception65.tf_in1k,79.556,20.444,94.658,5.342,39.92,299,0.903,bicubic
ecaresnet50t.a3_in1k,79.552,20.448,94.694,5.306,25.57,224,0.950,bicubic
resnet101c.gluon_in1k,79.538,20.462,94.584,5.416,44.57,224,0.875,bicubic
rexnet_130.nav_in1k,79.506,20.494,94.678,5.322,7.56,224,0.875,bicubic
eca_halonext26ts.c1_in1k,79.486,20.514,94.600,5.400,10.76,256,0.940,bicubic
vit_relpos_base_patch32_plus_rpn_256.sw_in1k,79.484,20.516,94.138,5.862,119.42,256,0.900,bicubic
hrnet_w64.ms_in1k,79.476,20.524,94.652,5.348,128.06,224,0.875,bilinear
tf_efficientnetv2_b1.in1k,79.460,20.540,94.722,5.278,8.14,240,0.882,bicubic
xcit_tiny_24_p16_224.fb_in1k,79.448,20.552,94.878,5.122,12.12,224,1.000,bicubic
dla102x2.in1k,79.446,20.554,94.632,5.368,41.28,224,0.875,bilinear
regnetx_016.tv2_in1k,79.436,20.564,94.768,5.232,9.19,224,0.965,bicubic
mobileone_s4.apple_in1k,79.426,20.574,94.480,5.520,14.95,224,0.900,bilinear
resnet32ts.ra2_in1k,79.388,20.612,94.652,5.348,17.96,288,1.000,bicubic
repvgg_b2g4.rvgg_in1k,79.382,20.618,94.676,5.324,61.76,224,0.875,bilinear
resmlp_24_224.fb_in1k,79.374,20.626,94.546,5.454,30.02,224,0.875,bicubic
dpn68b.ra_in1k,79.360,20.640,94.436,5.564,12.61,288,1.000,bicubic
resnext50_32x4d.gluon_in1k,79.360,20.640,94.430,5.570,25.03,224,0.875,bicubic
convnextv2_femto.fcmae_ft_in1k,79.338,20.662,94.560,5.440,5.23,288,0.950,bicubic
resnet101.gluon_in1k,79.310,20.690,94.522,5.478,44.55,224,0.875,bicubic
resnext101_32x8d.tv_in1k,79.310,20.690,94.520,5.480,88.79,224,0.875,bilinear
nf_regnet_b1.ra2_in1k,79.308,20.692,94.740,5.260,10.22,288,0.900,bicubic
hrnet_w48.ms_in1k,79.306,20.694,94.516,5.484,77.47,224,0.875,bilinear
tf_efficientnet_cc_b1_8e.in1k,79.302,20.698,94.374,5.626,39.72,240,0.882,bicubic
tf_efficientnet_b1.ap_in1k,79.276,20.724,94.312,5.688,7.79,240,0.882,bicubic
eca_botnext26ts_256.c1_in1k,79.268,20.732,94.606,5.394,10.59,256,0.950,bicubic
resnext50_32x4d.a3_in1k,79.268,20.732,94.306,5.694,25.03,224,0.950,bicubic
fastvit_t12.apple_in1k,79.264,20.736,94.562,5.438,7.55,256,0.900,bicubic
botnet26t_256.c1_in1k,79.258,20.742,94.532,5.468,12.49,256,0.950,bicubic
efficientvit_b1.r224_in1k,79.252,20.748,94.304,5.696,9.10,224,0.950,bicubic
efficientnet_em.ra2_in1k,79.244,20.756,94.794,5.206,6.90,240,0.882,bicubic
resnet50.fb_ssl_yfcc100m_ft_in1k,79.230,20.770,94.826,5.174,25.56,224,0.875,bilinear
regnety_040.pycls_in1k,79.220,20.780,94.656,5.344,20.65,224,0.875,bicubic
res2net101_26w_4s.in1k,79.200,20.800,94.436,5.564,45.21,224,0.875,bilinear
regnetx_080.pycls_in1k,79.198,20.802,94.554,5.446,39.57,224,0.875,bicubic
pit_xs_distilled_224.in1k,79.180,20.820,94.366,5.634,11.00,224,0.900,bicubic
tiny_vit_5m_224.in1k,79.170,20.830,94.794,5.206,5.39,224,0.950,bicubic
vit_base_patch16_224.augreg_in1k,79.154,20.846,94.090,5.910,86.57,224,0.900,bicubic
fbnetv3_b.ra2_in1k,79.146,20.854,94.744,5.256,8.60,256,0.950,bilinear
halonet26t.a1h_in1k,79.106,20.894,94.306,5.694,12.48,256,0.950,bicubic
coat_lite_mini.in1k,79.102,20.898,94.608,5.392,11.01,224,0.900,bicubic
lambda_resnet26t.c1_in1k,79.088,20.912,94.590,5.410,10.96,256,0.940,bicubic
resnet50d.gluon_in1k,79.078,20.922,94.466,5.534,25.58,224,0.875,bicubic
legacy_seresnext50_32x4d.in1k,79.076,20.924,94.432,5.568,27.56,224,0.875,bilinear
regnetx_064.pycls_in1k,79.066,20.934,94.460,5.540,26.21,224,0.875,bicubic
repvit_m0_9.dist_450e_in1k,79.066,20.934,94.380,5.620,5.49,224,0.950,bicubic
legacy_xception.tf_in1k,79.040,20.960,94.382,5.618,22.86,299,0.897,bicubic
resnet50.am_in1k,79.002,20.998,94.398,5.602,25.56,224,0.875,bicubic
mixnet_l.ft_in1k,78.966,21.034,94.182,5.818,7.33,224,0.875,bicubic
lambda_resnet26rpt_256.c1_in1k,78.964,21.036,94.436,5.564,10.99,256,0.940,bicubic
res2net50_26w_8s.in1k,78.942,21.058,94.294,5.706,48.40,224,0.875,bilinear
hrnet_w40.ms_in1k,78.932,21.068,94.464,5.536,57.56,224,0.875,bilinear
convnext_femto_ols.d1_in1k,78.924,21.076,94.526,5.474,5.23,288,0.950,bicubic
convnext_tiny.fb_in22k_ft_in1k,78.898,21.102,94.674,5.326,28.59,288,1.000,bicubic
hrnet_w44.ms_in1k,78.894,21.106,94.364,5.636,67.06,224,0.875,bilinear
regnety_032.pycls_in1k,78.876,21.124,94.408,5.592,19.44,224,0.875,bicubic
vit_small_patch16_224.augreg_in1k,78.848,21.152,94.288,5.712,22.05,224,0.900,bicubic
wide_resnet101_2.tv_in1k,78.842,21.158,94.282,5.718,126.89,224,0.875,bilinear
tf_efficientnet_b1.aa_in1k,78.828,21.172,94.200,5.800,7.79,240,0.882,bicubic
seresnext26d_32x4d.bt_in1k,78.814,21.186,94.240,5.760,16.81,288,0.950,bicubic
repghostnet_200.in1k,78.806,21.194,94.330,5.670,9.80,224,0.875,bicubic
inception_v3.gluon_in1k,78.802,21.198,94.376,5.624,23.83,299,0.875,bicubic
efficientnet_b1.ft_in1k,78.800,21.200,94.342,5.658,7.79,256,1.000,bicubic
repvgg_b2.rvgg_in1k,78.792,21.208,94.420,5.580,89.02,224,0.875,bilinear
tf_mixnet_l.in1k,78.776,21.224,94.002,5.998,7.33,224,0.875,bicubic
vit_base_patch32_384.augreg_in1k,78.756,21.244,94.226,5.774,88.30,384,1.000,bicubic
seresnext26t_32x4d.bt_in1k,78.744,21.256,94.312,5.688,16.81,288,0.950,bicubic
resnet50d.a3_in1k,78.720,21.280,94.232,5.768,25.58,224,0.950,bicubic
convnext_femto.d1_in1k,78.716,21.284,94.430,5.570,5.22,288,0.950,bicubic
resnet50s.gluon_in1k,78.714,21.286,94.242,5.758,25.68,224,0.875,bicubic
dla169.in1k,78.708,21.292,94.344,5.656,53.39,224,0.875,bilinear
pvt_v2_b1.in1k,78.704,21.296,94.502,5.498,14.01,224,0.900,bicubic
tf_efficientnet_b0.ns_jft_in1k,78.668,21.332,94.372,5.628,5.29,224,0.875,bicubic
regnety_008_tv.tv2_in1k,78.666,21.334,94.390,5.610,6.43,224,0.965,bicubic
legacy_seresnet152.in1k,78.660,21.340,94.370,5.630,66.82,224,0.875,bilinear
repvit_m0_9.dist_300e_in1k,78.658,21.342,94.116,5.884,5.49,224,0.950,bicubic
xcit_tiny_12_p16_224.fb_dist_in1k,78.574,21.426,94.198,5.802,6.72,224,1.000,bicubic
res2net50_26w_6s.in1k,78.568,21.432,94.122,5.878,37.05,224,0.875,bilinear
tf_efficientnet_b1.in1k,78.562,21.438,94.094,5.906,7.79,240,0.882,bicubic
repvit_m1.dist_in1k,78.538,21.462,94.070,5.930,5.49,224,0.950,bicubic
dla102x.in1k,78.512,21.488,94.236,5.764,26.31,224,0.875,bilinear
xception41.tf_in1k,78.504,21.496,94.276,5.724,26.97,299,0.903,bicubic
levit_conv_128.fb_dist_in1k,78.494,21.506,94.008,5.992,9.21,224,0.900,bicubic
regnetx_040.pycls_in1k,78.492,21.508,94.242,5.758,22.12,224,0.875,bicubic
levit_128.fb_dist_in1k,78.490,21.510,94.012,5.988,9.21,224,0.900,bicubic
resnest26d.gluon_in1k,78.482,21.518,94.294,5.706,17.07,224,0.875,bilinear
wide_resnet50_2.tv_in1k,78.476,21.524,94.088,5.912,68.88,224,0.875,bilinear
dla60_res2net.in1k,78.464,21.536,94.198,5.802,20.85,224,0.875,bilinear
hrnet_w32.ms_in1k,78.442,21.558,94.190,5.810,41.23,224,0.875,bilinear
dla60_res2next.in1k,78.440,21.560,94.144,5.856,17.03,224,0.875,bilinear
resnet34d.ra2_in1k,78.436,21.564,94.344,5.656,21.82,288,0.950,bicubic
coat_tiny.in1k,78.426,21.574,94.048,5.952,5.50,224,0.900,bicubic
vit_tiny_patch16_384.augreg_in21k_ft_in1k,78.424,21.576,94.542,5.458,5.79,384,1.000,bicubic
gcresnext26ts.ch_in1k,78.414,21.586,94.036,5.964,10.48,288,1.000,bicubic
selecsls60b.in1k,78.412,21.588,94.168,5.832,32.77,224,0.875,bicubic
legacy_seresnet101.in1k,78.386,21.614,94.262,5.738,49.33,224,0.875,bilinear
cait_xxs24_224.fb_dist_in1k,78.384,21.616,94.316,5.684,11.96,224,1.000,bicubic
repvgg_b1.rvgg_in1k,78.368,21.632,94.096,5.904,57.42,224,0.875,bilinear
tf_efficientnetv2_b0.in1k,78.358,21.642,94.014,5.986,7.14,224,0.875,bicubic
resnet26t.ra2_in1k,78.328,21.672,94.124,5.876,16.01,320,1.000,bicubic
resnet152.tv_in1k,78.322,21.678,94.046,5.954,60.19,224,0.875,bilinear
mobilevit_s.cvnets_in1k,78.312,21.688,94.148,5.852,5.58,256,0.900,bicubic
seresnext26ts.ch_in1k,78.270,21.730,94.092,5.908,10.39,288,1.000,bicubic
bat_resnext26ts.ch_in1k,78.252,21.748,94.098,5.902,10.73,256,0.900,bicubic
res2next50.in1k,78.242,21.758,93.892,6.108,24.67,224,0.875,bilinear
efficientnet_b1_pruned.in1k,78.240,21.760,93.834,6.166,6.33,240,0.882,bicubic
dla60x.in1k,78.236,21.764,94.026,5.974,17.35,224,0.875,bilinear
hrnet_w30.ms_in1k,78.196,21.804,94.222,5.778,37.71,224,0.875,bilinear
hrnet_w18_small_v2.gluon_in1k,78.190,21.810,93.902,6.098,15.60,224,0.875,bicubic
pit_xs_224.in1k,78.176,21.824,94.162,5.838,10.62,224,0.900,bicubic
regnetx_032.pycls_in1k,78.168,21.832,94.082,5.918,15.30,224,0.875,bicubic
visformer_tiny.in1k,78.160,21.840,94.166,5.834,10.32,224,0.900,bicubic
res2net50_14w_8s.in1k,78.158,21.842,93.846,6.154,25.06,224,0.875,bilinear
tf_efficientnet_em.in1k,78.126,21.874,94.048,5.952,6.90,240,0.882,bicubic
hrnet_w18.ms_aug_in1k,78.122,21.878,94.054,5.946,21.30,224,0.950,bilinear
hardcorenas_f.miil_green_in1k,78.096,21.904,93.802,6.198,8.20,224,0.875,bilinear
mobilevitv2_100.cvnets_in1k,78.080,21.920,94.170,5.830,4.90,256,0.888,bicubic
efficientnet_es.ra_in1k,78.058,21.942,93.926,6.074,5.44,224,0.875,bicubic
resnet50.a3_in1k,78.048,21.952,93.780,6.220,25.56,224,0.950,bicubic
gmixer_24_224.ra3_in1k,78.026,21.974,93.668,6.332,24.72,224,0.875,bicubic
dla102.in1k,78.024,21.976,93.934,6.066,33.27,224,0.875,bilinear
resnet50c.gluon_in1k,78.006,21.994,93.992,6.008,25.58,224,0.875,bicubic
poolformerv2_s12.sail_in1k,78.002,21.998,93.864,6.136,11.89,224,1.000,bicubic
eca_resnext26ts.ch_in1k,78.000,22.000,93.926,6.074,10.30,288,1.000,bicubic
mobileone_s3.apple_in1k,77.992,22.008,93.914,6.086,10.17,224,0.900,bilinear
selecsls60.in1k,77.988,22.012,93.830,6.170,30.67,224,0.875,bicubic
resmlp_12_224.fb_distilled_in1k,77.954,22.046,93.560,6.440,15.35,224,0.875,bicubic
res2net50_26w_4s.in1k,77.950,22.050,93.852,6.148,25.70,224,0.875,bilinear
mobilenetv3_large_100.miil_in21k_ft_in1k,77.920,22.080,92.914,7.086,5.48,224,0.875,bilinear
resnet34.a1_in1k,77.918,22.082,93.764,6.236,21.80,288,1.000,bicubic
tf_efficientnet_cc_b0_8e.in1k,77.904,22.096,93.662,6.338,24.01,224,0.875,bicubic
regnety_016.pycls_in1k,77.868,22.132,93.718,6.282,11.20,224,0.875,bicubic
rexnet_100.nav_in1k,77.856,22.144,93.866,6.134,4.80,224,0.875,bicubic
inception_v3.tf_in1k,77.856,22.144,93.640,6.360,23.83,299,0.875,bicubic
ghostnetv2_160.in1k,77.832,22.168,93.940,6.060,12.39,224,0.875,bicubic
xcit_nano_12_p8_384.fb_dist_in1k,77.820,22.180,94.040,5.960,3.05,384,1.000,bicubic
hardcorenas_e.miil_green_in1k,77.790,22.210,93.700,6.300,8.07,224,0.875,bilinear
convnextv2_atto.fcmae_ft_in1k,77.760,22.240,93.726,6.274,3.71,288,0.950,bicubic
ese_vovnet19b_dw.ra_in1k,77.744,22.256,93.784,6.216,6.54,288,0.950,bicubic
efficientnet_b0.ra_in1k,77.694,22.306,93.532,6.468,5.29,224,0.875,bicubic
tinynet_a.in1k,77.648,22.352,93.540,6.460,6.19,192,0.875,bicubic
legacy_seresnet50.in1k,77.644,22.356,93.758,6.242,28.09,224,0.875,bilinear
cs3darknet_m.c2ns_in1k,77.634,22.366,94.016,5.984,9.31,288,0.950,bicubic
resnext50_32x4d.tv_in1k,77.622,22.378,93.696,6.304,25.03,224,0.875,bilinear
inception_v3.tf_adv_in1k,77.592,22.408,93.730,6.270,23.83,299,0.875,bicubic
repvgg_b1g4.rvgg_in1k,77.588,22.412,93.836,6.164,39.97,224,0.875,bilinear
resnet50.gluon_in1k,77.582,22.418,93.720,6.280,25.56,224,0.875,bicubic
coat_lite_tiny.in1k,77.520,22.480,93.922,6.078,5.72,224,0.900,bicubic
dpn68b.mx_in1k,77.518,22.482,93.852,6.148,12.61,224,0.875,bicubic
mobileone_s2.apple_in1k,77.516,22.484,93.668,6.332,7.88,224,0.900,bilinear
res2net50_48w_2s.in1k,77.514,22.486,93.550,6.450,25.29,224,0.875,bilinear
tf_efficientnet_lite2.in1k,77.462,22.538,93.752,6.248,6.09,260,0.890,bicubic
repghostnet_150.in1k,77.460,22.540,93.510,6.490,6.58,224,0.875,bicubic
hardcorenas_d.miil_green_in1k,77.434,22.566,93.490,6.510,7.50,224,0.875,bilinear
inception_v3.tv_in1k,77.434,22.566,93.474,6.526,23.83,299,0.875,bicubic
resnet26d.bt_in1k,77.408,22.592,93.638,6.362,16.01,288,0.950,bicubic
resnet101.tv_in1k,77.380,22.620,93.546,6.454,44.55,224,0.875,bilinear
densenet161.tv_in1k,77.358,22.642,93.642,6.358,28.68,224,0.875,bicubic
densenetblur121d.ra_in1k,77.322,22.678,93.788,6.212,8.00,288,0.950,bicubic
mobilenetv2_120d.ra_in1k,77.308,22.692,93.502,6.498,5.83,224,0.875,bicubic
regnetx_008.tv2_in1k,77.306,22.694,93.664,6.336,7.26,224,0.965,bicubic
tf_efficientnet_cc_b0_4e.in1k,77.302,22.698,93.336,6.664,13.31,224,0.875,bicubic
densenet201.tv_in1k,77.286,22.714,93.480,6.520,20.01,224,0.875,bicubic
cs3darknet_focus_m.c2ns_in1k,77.284,22.716,93.966,6.034,9.30,288,0.950,bicubic
mixnet_m.ft_in1k,77.260,22.740,93.418,6.582,5.01,224,0.875,bicubic
poolformer_s12.sail_in1k,77.240,22.760,93.532,6.468,11.92,224,0.900,bicubic
convnext_atto_ols.a2_in1k,77.216,22.784,93.676,6.324,3.70,288,0.950,bicubic
resnext26ts.ra2_in1k,77.178,22.822,93.464,6.536,10.30,288,1.000,bicubic
fastvit_t8.apple_dist_in1k,77.176,22.824,93.298,6.702,4.03,256,0.900,bicubic
selecsls42b.in1k,77.170,22.830,93.392,6.608,32.46,224,0.875,bicubic
resnet34.a2_in1k,77.158,22.842,93.274,6.726,21.80,288,1.000,bicubic
xcit_tiny_12_p16_224.fb_in1k,77.140,22.860,93.716,6.284,6.72,224,1.000,bicubic
legacy_seresnext26_32x4d.in1k,77.108,22.892,93.314,6.686,16.79,224,0.875,bicubic
tf_efficientnet_b0.ap_in1k,77.090,22.910,93.262,6.738,5.29,224,0.875,bicubic
hardcorenas_c.miil_green_in1k,77.066,22.934,93.162,6.838,5.52,224,0.875,bilinear
efficientvit_m5.r224_in1k,77.058,22.942,93.184,6.816,12.47,224,0.875,bicubic
dla60.in1k,77.046,22.954,93.318,6.682,22.04,224,0.875,bilinear
seresnet50.a3_in1k,77.026,22.974,93.072,6.928,28.09,224,0.950,bicubic
convnext_atto.d2_in1k,77.008,22.992,93.702,6.298,3.70,288,0.950,bicubic
crossvit_9_dagger_240.in1k,76.978,23.022,93.618,6.382,8.78,240,0.875,bicubic
tf_mixnet_m.in1k,76.954,23.046,93.154,6.846,5.01,224,0.875,bicubic
convmixer_1024_20_ks9_p14.in1k,76.936,23.064,93.350,6.650,24.38,224,0.960,bicubic
regnetx_016.pycls_in1k,76.924,23.076,93.416,6.584,9.19,224,0.875,bicubic
skresnet34.ra_in1k,76.910,23.090,93.316,6.684,22.28,224,0.875,bicubic
gernet_s.idstcv_in1k,76.910,23.090,93.144,6.856,8.17,224,0.875,bilinear
tf_efficientnet_b0.aa_in1k,76.844,23.156,93.218,6.782,5.29,224,0.875,bicubic
ghostnetv2_130.in1k,76.756,23.244,93.362,6.638,8.96,224,0.875,bicubic
hrnet_w18.ms_in1k,76.752,23.248,93.444,6.556,21.30,224,0.875,bilinear
resmlp_12_224.fb_in1k,76.648,23.352,93.178,6.822,15.35,224,0.875,bicubic
tf_efficientnet_lite1.in1k,76.644,23.356,93.224,6.776,5.42,240,0.882,bicubic
mixer_b16_224.goog_in21k_ft_in1k,76.602,23.398,92.224,7.776,59.88,224,0.875,bicubic
tf_efficientnet_es.in1k,76.598,23.402,93.202,6.798,5.44,224,0.875,bicubic
hardcorenas_b.miil_green_in1k,76.548,23.452,92.762,7.238,5.18,224,0.875,bilinear
tf_efficientnet_b0.in1k,76.530,23.470,93.008,6.992,5.29,224,0.875,bicubic
levit_128s.fb_dist_in1k,76.526,23.474,92.872,7.128,7.78,224,0.900,bicubic
levit_conv_128s.fb_dist_in1k,76.520,23.480,92.866,7.134,7.78,224,0.900,bicubic
mobilenetv2_140.ra_in1k,76.516,23.484,92.988,7.012,6.11,224,0.875,bicubic
densenet121.ra_in1k,76.500,23.500,93.368,6.632,7.98,288,0.950,bicubic
resnet34.bt_in1k,76.480,23.520,93.354,6.646,21.80,288,0.950,bicubic
repvgg_a2.rvgg_in1k,76.458,23.542,93.002,6.998,28.21,224,0.875,bilinear
repghostnet_130.in1k,76.376,23.624,92.892,7.108,5.48,224,0.875,bicubic
resnet26.bt_in1k,76.366,23.634,93.180,6.820,16.00,288,0.950,bicubic
dpn68.mx_in1k,76.346,23.654,93.008,6.992,12.61,224,0.875,bicubic
xcit_nano_12_p8_224.fb_dist_in1k,76.332,23.668,93.098,6.902,3.05,224,1.000,bicubic
regnety_008.pycls_in1k,76.302,23.698,93.062,6.938,6.26,224,0.875,bicubic
fastvit_t8.apple_in1k,76.174,23.826,93.052,6.948,4.03,256,0.900,bicubic
resnet50.tv_in1k,76.128,23.872,92.858,7.142,25.56,224,0.875,bilinear
efficientformerv2_s0.snap_dist_in1k,76.114,23.886,92.858,7.142,3.60,224,0.950,bicubic
vit_small_patch32_224.augreg_in21k_ft_in1k,75.994,24.006,93.270,6.730,22.88,224,0.900,bicubic
mixnet_s.ft_in1k,75.994,24.006,92.800,7.200,4.13,224,0.875,bicubic
vit_tiny_r_s16_p8_384.augreg_in21k_ft_in1k,75.960,24.040,93.262,6.738,6.36,384,1.000,bicubic
hardcorenas_a.miil_green_in1k,75.938,24.062,92.508,7.492,5.26,224,0.875,bilinear
densenet169.tv_in1k,75.900,24.100,93.028,6.972,14.15,224,0.875,bicubic
mobileone_s1.apple_in1k,75.786,24.214,92.792,7.208,4.83,224,0.900,bilinear
mobilenetv3_large_100.ra_in1k,75.766,24.234,92.538,7.462,5.48,224,0.875,bicubic
edgenext_x_small.in1k,75.688,24.312,92.766,7.234,2.34,288,1.000,bicubic
tf_mixnet_s.in1k,75.652,24.348,92.640,7.360,4.13,224,0.875,bicubic
mobilenetv3_rw.rmsp_in1k,75.620,24.380,92.704,7.296,5.48,224,0.875,bicubic
mobilevitv2_075.cvnets_in1k,75.608,24.392,92.744,7.256,2.87,256,0.888,bicubic
regnety_004.tv2_in1k,75.594,24.406,92.700,7.300,4.34,224,0.965,bicubic
tf_mobilenetv3_large_100.in1k,75.516,24.484,92.594,7.406,5.48,224,0.875,bilinear
resnest14d.gluon_in1k,75.508,24.492,92.508,7.492,10.61,224,0.875,bilinear
efficientnet_lite0.ra_in1k,75.482,24.518,92.520,7.480,4.65,224,0.875,bicubic
vit_tiny_patch16_224.augreg_in21k_ft_in1k,75.462,24.538,92.844,7.156,5.72,224,0.900,bicubic
xcit_nano_12_p16_384.fb_dist_in1k,75.458,24.542,92.698,7.302,3.05,384,1.000,bicubic
semnasnet_100.rmsp_in1k,75.450,24.550,92.598,7.402,3.89,224,0.875,bicubic
regnety_006.pycls_in1k,75.268,24.732,92.526,7.474,6.06,224,0.875,bicubic
ghostnetv2_100.in1k,75.166,24.834,92.354,7.646,6.16,224,0.875,bicubic
repvgg_b0.rvgg_in1k,75.144,24.856,92.416,7.584,15.82,224,0.875,bilinear
fbnetc_100.rmsp_in1k,75.130,24.870,92.388,7.612,5.57,224,0.875,bilinear
hrnet_w18_small_v2.ms_in1k,75.110,24.890,92.416,7.584,15.60,224,0.875,bilinear
repghostnet_111.in1k,75.056,24.944,92.192,7.808,4.54,224,0.875,bicubic
mobilenetv2_110d.ra_in1k,75.054,24.946,92.184,7.816,4.52,224,0.875,bicubic
regnetx_008.pycls_in1k,75.028,24.972,92.338,7.662,7.26,224,0.875,bicubic
efficientnet_es_pruned.in1k,75.006,24.994,92.444,7.556,5.44,224,0.875,bicubic
tinynet_b.in1k,74.978,25.022,92.186,7.814,3.73,188,0.875,bicubic
vit_base_patch32_224.augreg_in1k,74.894,25.106,91.778,8.222,88.22,224,0.900,bicubic
tf_efficientnet_lite0.in1k,74.832,25.168,92.170,7.830,4.65,224,0.875,bicubic
legacy_seresnet34.in1k,74.802,25.198,92.126,7.874,21.96,224,0.875,bilinear
densenet121.tv_in1k,74.764,25.236,92.154,7.846,7.98,224,0.875,bicubic
mnasnet_100.rmsp_in1k,74.652,25.348,92.122,7.878,4.38,224,0.875,bicubic
dla34.in1k,74.640,25.360,92.066,7.934,15.74,224,0.875,bilinear
mobilevit_xs.cvnets_in1k,74.634,25.366,92.348,7.652,2.32,256,0.900,bicubic
regnetx_004_tv.tv2_in1k,74.600,25.400,92.170,7.830,5.50,224,0.965,bicubic
resnet34.gluon_in1k,74.580,25.420,91.982,8.018,21.80,224,0.875,bicubic
deit_tiny_distilled_patch16_224.fb_in1k,74.504,25.496,91.890,8.110,5.91,224,0.900,bicubic
repvgg_a1.rvgg_in1k,74.462,25.538,91.856,8.144,14.09,224,0.875,bilinear
efficientvit_m4.r224_in1k,74.368,25.632,91.980,8.020,8.80,224,0.875,bicubic
pit_ti_distilled_224.in1k,74.256,25.744,91.952,8.048,5.10,224,0.900,bicubic
vgg19_bn.tv_in1k,74.216,25.784,91.844,8.156,143.68,224,0.875,bilinear
repghostnet_100.in1k,74.206,25.794,91.542,8.458,4.07,224,0.875,bicubic
spnasnet_100.rmsp_in1k,74.094,25.906,91.820,8.180,4.42,224,0.875,bilinear
regnety_004.pycls_in1k,74.026,25.974,91.748,8.252,4.34,224,0.875,bicubic
crossvit_9_240.in1k,73.960,26.040,91.962,8.038,8.55,240,0.875,bicubic
ghostnet_100.in1k,73.958,26.042,91.532,8.468,5.18,224,0.875,bicubic
hrnet_w18_small.gluon_in1k,73.920,26.080,91.194,8.806,13.19,224,0.875,bicubic
xcit_nano_12_p8_224.fb_in1k,73.910,26.090,92.168,7.832,3.05,224,1.000,bicubic
regnetx_006.pycls_in1k,73.868,26.132,91.678,8.322,6.20,224,0.875,bicubic
resnet18d.ra2_in1k,73.794,26.206,91.838,8.162,11.71,288,0.950,bicubic
vit_base_patch32_224.sam_in1k,73.694,26.306,91.014,8.986,88.22,224,0.900,bicubic
tf_mobilenetv3_large_075.in1k,73.430,26.570,91.352,8.648,3.99,224,0.875,bilinear
efficientvit_m3.r224_in1k,73.374,26.626,91.348,8.652,6.90,224,0.875,bicubic
vgg16_bn.tv_in1k,73.370,26.630,91.514,8.486,138.37,224,0.875,bilinear
crossvit_tiny_240.in1k,73.340,26.660,91.908,8.092,7.01,240,0.875,bicubic
resnet34.tv_in1k,73.306,26.694,91.420,8.580,21.80,224,0.875,bilinear
resnet18.fb_swsl_ig1b_ft_in1k,73.288,26.712,91.730,8.270,11.69,224,0.875,bilinear
resnet18.a1_in1k,73.158,26.842,91.026,8.974,11.69,288,1.000,bicubic
convit_tiny.fb_in1k,73.112,26.888,91.712,8.288,5.71,224,0.875,bicubic
skresnet18.ra_in1k,73.034,26.966,91.172,8.828,11.96,224,0.875,bicubic
semnasnet_075.rmsp_in1k,73.004,26.996,91.140,8.860,2.91,224,0.875,bicubic
resnet34.a3_in1k,72.970,27.030,91.106,8.894,21.80,224,0.950,bicubic
mobilenetv2_100.ra_in1k,72.968,27.032,91.016,8.984,3.50,224,0.875,bicubic
pit_ti_224.in1k,72.910,27.090,91.404,8.596,4.85,224,0.900,bicubic
resnet18.fb_ssl_yfcc100m_ft_in1k,72.598,27.402,91.416,8.584,11.69,224,0.875,bilinear
repvgg_a0.rvgg_in1k,72.408,27.592,90.492,9.508,9.11,224,0.875,bilinear
regnetx_004.pycls_in1k,72.402,27.598,90.826,9.174,5.16,224,0.875,bicubic
vgg19.tv_in1k,72.378,27.622,90.874,9.126,143.67,224,0.875,bilinear
resnet18.a2_in1k,72.372,27.628,90.596,9.404,11.69,288,1.000,bicubic
hrnet_w18_small.ms_in1k,72.336,27.664,90.680,9.320,13.19,224,0.875,bilinear
xcit_nano_12_p16_224.fb_dist_in1k,72.310,27.690,90.860,9.140,3.05,224,1.000,bicubic
tf_mobilenetv3_large_minimal_100.in1k,72.264,27.736,90.640,9.360,3.92,224,0.875,bilinear
resnet14t.c3_in1k,72.254,27.746,90.306,9.694,10.08,224,0.950,bicubic
repghostnet_080.in1k,72.212,27.788,90.484,9.516,3.28,224,0.875,bicubic
deit_tiny_patch16_224.fb_in1k,72.170,27.830,91.116,8.884,5.72,224,0.900,bicubic
lcnet_100.ra2_in1k,72.102,27.898,90.354,9.646,2.95,224,0.875,bicubic
mixer_l16_224.goog_in21k_ft_in1k,72.054,27.946,87.674,12.326,208.20,224,0.875,bicubic
edgenext_xx_small.in1k,71.878,28.122,90.552,9.448,1.33,288,1.000,bicubic
vit_tiny_r_s16_p8_224.augreg_in21k_ft_in1k,71.798,28.202,90.824,9.176,6.34,224,0.900,bicubic
legacy_seresnet18.in1k,71.760,28.240,90.332,9.668,11.78,224,0.875,bicubic
vgg16.tv_in1k,71.592,28.408,90.384,9.616,138.36,224,0.875,bilinear
vgg13_bn.tv_in1k,71.588,28.412,90.378,9.622,133.05,224,0.875,bilinear
mobileone_s0.apple_in1k,71.402,28.598,89.842,10.158,5.29,224,0.875,bilinear
efficientvit_b0.r224_in1k,71.398,28.602,89.428,10.572,3.41,224,0.950,bicubic
tinynet_c.in1k,71.242,28.758,89.732,10.268,2.46,184,0.875,bicubic
resnet18.gluon_in1k,70.834,29.166,89.756,10.244,11.69,224,0.875,bicubic
efficientvit_m2.r224_in1k,70.814,29.186,90.142,9.858,4.19,224,0.875,bicubic
pvt_v2_b0.in1k,70.660,29.340,90.196,9.804,3.67,224,0.900,bicubic
vgg11_bn.tv_in1k,70.382,29.618,89.808,10.192,132.87,224,0.875,bilinear
regnety_002.pycls_in1k,70.280,29.720,89.530,10.470,3.16,224,0.875,bicubic
mobilevitv2_050.cvnets_in1k,70.148,29.852,89.918,10.082,1.37,256,0.888,bicubic
xcit_nano_12_p16_224.fb_in1k,69.962,30.038,89.762,10.238,3.05,224,1.000,bicubic
vgg13.tv_in1k,69.932,30.068,89.250,10.750,133.05,224,0.875,bilinear
resnet18.tv_in1k,69.760,30.240,89.070,10.930,11.69,224,0.875,bilinear
vgg11.tv_in1k,69.022,30.978,88.624,11.376,132.86,224,0.875,bilinear
mobilevit_xxs.cvnets_in1k,68.918,31.082,88.946,11.054,1.27,256,0.900,bicubic
repghostnet_058.in1k,68.914,31.086,88.420,11.580,2.55,224,0.875,bicubic
lcnet_075.ra2_in1k,68.782,31.218,88.360,11.640,2.36,224,0.875,bicubic
regnetx_002.pycls_in1k,68.752,31.248,88.542,11.458,2.68,224,0.875,bicubic
resnet10t.c3_in1k,68.364,31.636,88.036,11.964,5.44,224,0.950,bicubic
efficientvit_m1.r224_in1k,68.306,31.694,88.670,11.330,2.98,224,0.875,bicubic
resnet18.a3_in1k,68.252,31.748,88.172,11.828,11.69,224,0.950,bicubic
tf_mobilenetv3_small_100.in1k,67.922,32.078,87.672,12.328,2.54,224,0.875,bilinear
dla60x_c.in1k,67.912,32.088,88.432,11.568,1.32,224,0.875,bilinear
mobilenetv3_small_100.lamb_in1k,67.658,32.342,87.636,12.364,2.54,224,0.875,bicubic
tinynet_d.in1k,66.972,33.028,87.066,12.934,2.34,152,0.875,bicubic
repghostnet_050.in1k,66.966,33.034,86.920,13.080,2.31,224,0.875,bicubic
mnasnet_small.lamb_in1k,66.196,33.804,86.504,13.496,2.03,224,0.875,bicubic
dla46x_c.in1k,65.992,34.008,86.974,13.026,1.07,224,0.875,bilinear
mobilenetv2_050.lamb_in1k,65.948,34.052,86.084,13.916,1.97,224,0.875,bicubic
tf_mobilenetv3_small_075.in1k,65.726,34.274,86.132,13.868,2.04,224,0.875,bilinear
mobilenetv3_small_075.lamb_in1k,65.236,34.764,85.446,14.554,2.04,224,0.875,bicubic
dla46_c.in1k,64.872,35.128,86.298,13.702,1.30,224,0.875,bilinear
efficientvit_m0.r224_in1k,63.270,36.730,85.176,14.824,2.35,224,0.875,bicubic
lcnet_050.ra2_in1k,63.138,36.862,84.382,15.618,1.88,224,0.875,bicubic
tf_mobilenetv3_small_minimal_100.in1k,62.894,37.106,84.238,15.762,2.04,224,0.875,bilinear
tinynet_e.in1k,59.866,40.134,81.762,18.238,2.04,106,0.875,bicubic
mobilenetv3_small_050.lamb_in1k,57.916,42.084,80.180,19.820,1.59,224,0.875,bicubic
hf_public_repos/pytorch-image-models/results/model_metadata-in1k.csv
model,pretrain
adv_inception_v3,in1k-adv
bat_resnext26ts,in1k
beit_base_patch16_224,in21k-selfsl
beit_base_patch16_384,in21k-selfsl
beit_large_patch16_224,in21k-selfsl
beit_large_patch16_384,in21k-selfsl
beit_large_patch16_512,in21k-selfsl
botnet26t_256,in1k
cait_m36_384,in1k-dist
cait_m48_448,in1k-dist
cait_s24_224,in1k-dist
cait_s24_384,in1k-dist
cait_s36_384,in1k-dist
cait_xs24_384,in1k-dist
cait_xxs24_224,in1k-dist
cait_xxs24_384,in1k-dist
cait_xxs36_224,in1k-dist
cait_xxs36_384,in1k-dist
coat_lite_mini,in1k
coat_lite_small,in1k
coat_lite_tiny,in1k
coat_mini,in1k
coat_tiny,in1k
convit_base,in1k
convit_small,in1k
convit_tiny,in1k
convmixer_1024_20_ks9_p14,in1k
convmixer_1536_20,in1k
convmixer_768_32,in1k
crossvit_15_240,in1k
crossvit_15_dagger_240,in1k
crossvit_15_dagger_408,in1k
crossvit_18_240,in1k
crossvit_18_dagger_240,in1k
crossvit_18_dagger_408,in1k
crossvit_9_240,in1k
crossvit_9_dagger_240,in1k
crossvit_base_240,in1k
crossvit_small_240,in1k
crossvit_tiny_240,in1k
cspdarknet53,in1k
cspresnet50,in1k
cspresnext50,in1k
deit_base_distilled_patch16_224,in1k-dist
deit_base_distilled_patch16_384,in1k-dist
deit_base_patch16_224,in1k
deit_base_patch16_384,in1k
deit_small_distilled_patch16_224,in1k-dist
deit_small_patch16_224,in1k
deit_tiny_distilled_patch16_224,in1k-dist
deit_tiny_patch16_224,in1k
densenet121,in1k
densenet161,in1k
densenet169,in1k
densenet201,in1k
densenetblur121d,in1k
dla102,in1k
dla102x,in1k
dla102x2,in1k
dla169,in1k
dla34,in1k
dla46_c,in1k
dla46x_c,in1k
dla60,in1k
dla60_res2net,in1k
dla60_res2next,in1k
dla60x,in1k
dla60x_c,in1k
dm_nfnet_f0,in1k
dm_nfnet_f1,in1k
dm_nfnet_f2,in1k
dm_nfnet_f3,in1k
dm_nfnet_f4,in1k
dm_nfnet_f5,in1k
dm_nfnet_f6,in1k
dpn107,in1k
dpn131,in1k
dpn68,in1k
dpn68b,in1k
dpn92,in1k
dpn98,in1k
eca_botnext26ts_256,in1k
eca_halonext26ts,in1k
eca_nfnet_l0,in1k
eca_nfnet_l1,in1k
eca_nfnet_l2,in1k
eca_resnet33ts,in1k
eca_resnext26ts,in1k
ecaresnet101d,in1k
ecaresnet101d_pruned,in1k
ecaresnet269d,in1k
ecaresnet26t,in1k
ecaresnet50d,in1k
ecaresnet50d_pruned,in1k
ecaresnet50t,in1k
ecaresnetlight,in1k
efficientnet_b0,in1k
efficientnet_b1,in1k
efficientnet_b1_pruned,in1k
efficientnet_b2,in1k
efficientnet_b2_pruned,in1k
efficientnet_b3,in1k
efficientnet_b3_pruned,in1k
efficientnet_b4,in1k
efficientnet_el,in1k
efficientnet_el_pruned,in1k
efficientnet_em,in1k
efficientnet_es,in1k
efficientnet_es_pruned,in1k
efficientnet_lite0,in1k
efficientnetv2_rw_m,in1k
efficientnetv2_rw_s,in1k
efficientnetv2_rw_t,in1k
ens_adv_inception_resnet_v2,in1k-adv
ese_vovnet19b_dw,in1k
ese_vovnet39b,in1k
fbnetc_100,in1k
gc_efficientnetv2_rw_t,in1k
gcresnet33ts,in1k
gcresnet50t,in1k
gcresnext26ts,in1k
gcresnext50ts,in1k
gernet_l,in1k
gernet_m,in1k
gernet_s,in1k
ghostnet_100,in1k
gluon_inception_v3,in1k
gluon_resnet101_v1b,in1k
gluon_resnet101_v1c,in1k
gluon_resnet101_v1d,in1k
gluon_resnet101_v1s,in1k
gluon_resnet152_v1b,in1k
gluon_resnet152_v1c,in1k
gluon_resnet152_v1d,in1k
gluon_resnet152_v1s,in1k
gluon_resnet18_v1b,in1k
gluon_resnet34_v1b,in1k
gluon_resnet50_v1b,in1k
gluon_resnet50_v1c,in1k
gluon_resnet50_v1d,in1k
gluon_resnet50_v1s,in1k
gluon_resnext101_32x4d,in1k
gluon_resnext101_64x4d,in1k
gluon_resnext50_32x4d,in1k
gluon_senet154,in1k
gluon_seresnext101_32x4d,in1k
gluon_seresnext101_64x4d,in1k
gluon_seresnext50_32x4d,in1k
gluon_xception65,in1k
gmixer_24_224,in1k
gmlp_s16_224,in1k
halo2botnet50ts_256,in1k
halonet26t,in1k
halonet50ts,in1k
haloregnetz_b,in1k
hardcorenas_a,in1k
hardcorenas_b,in1k
hardcorenas_c,in1k
hardcorenas_d,in1k
hardcorenas_e,in1k
hardcorenas_f,in1k
hrnet_w18,in1k
hrnet_w18_small,in1k
hrnet_w18_small_v2,in1k
hrnet_w30,in1k
hrnet_w32,in1k
hrnet_w40,in1k
hrnet_w44,in1k
hrnet_w48,in1k
hrnet_w64,in1k
ig_resnext101_32x16d,ig1b-wsl
ig_resnext101_32x32d,ig1b-wsl
ig_resnext101_32x48d,ig1b-wsl
ig_resnext101_32x8d,ig1b-wsl
inception_resnet_v2,in1k
inception_v3,in1k
inception_v4,in1k
jx_nest_base,in1k
jx_nest_small,in1k
jx_nest_tiny,in1k
lambda_resnet26rpt_256,in1k
lambda_resnet26t,in1k
lambda_resnet50ts,in1k
lamhalobotnet50ts_256,in1k
legacy_senet154,in1k
legacy_seresnet101,in1k
legacy_seresnet152,in1k
legacy_seresnet18,in1k
legacy_seresnet34,in1k
legacy_seresnet50,in1k
legacy_seresnext101_32x4d,in1k
legacy_seresnext26_32x4d,in1k
legacy_seresnext50_32x4d,in1k
levit_128,in1k-dist
levit_128s,in1k-dist
levit_192,in1k-dist
levit_256,in1k-dist
levit_384,in1k-dist
mixer_b16_224,in1k
mixer_b16_224_miil,in21k
mixer_l16_224,in1k
mixnet_l,in1k
mixnet_m,in1k
mixnet_s,in1k
mixnet_xl,in1k
mnasnet_100,in1k
mobilenetv2_100,in1k
mobilenetv2_110d,in1k
mobilenetv2_120d,in1k
mobilenetv2_140,in1k
mobilenetv3_large_100,in1k
mobilenetv3_large_100_miil,in21k
mobilenetv3_rw,in1k
nasnetalarge,in1k
nf_regnet_b1,in1k
nf_resnet50,in1k
nfnet_l0,in1k
pit_b_224,in1k
pit_b_distilled_224,in1k-dist
pit_s_224,in1k
pit_s_distilled_224,in1k-dist
pit_ti_224,in1k
pit_ti_distilled_224,in1k-dist
pit_xs_224,in1k
pit_xs_distilled_224,in1k-dist
pnasnet5large,in1k
regnetx_002,in1k
regnetx_004,in1k
regnetx_006,in1k
regnetx_008,in1k
regnetx_016,in1k
regnetx_032,in1k
regnetx_040,in1k
regnetx_064,in1k
regnetx_080,in1k
regnetx_120,in1k
regnetx_160,in1k
regnetx_320,in1k
regnety_002,in1k
regnety_004,in1k
regnety_006,in1k
regnety_008,in1k
regnety_016,in1k
regnety_032,in1k
regnety_040,in1k
regnety_064,in1k
regnety_080,in1k
regnety_120,in1k
regnety_160,in1k
regnety_320,in1k
regnetz_b,in1k
regnetz_c,in1k
regnetz_d,in1k
repvgg_a2,in1k
repvgg_b0,in1k
repvgg_b1,in1k
repvgg_b1g4,in1k
repvgg_b2,in1k
repvgg_b2g4,in1k
repvgg_b3,in1k
repvgg_b3g4,in1k
res2net101_26w_4s,in1k
res2net50_14w_8s,in1k
res2net50_26w_4s,in1k
res2net50_26w_6s,in1k
res2net50_26w_8s,in1k
res2net50_48w_2s,in1k
res2next50,in1k
resmlp_12_224,in1k
resmlp_12_distilled_224,in1k-dist
resmlp_24_224,in1k
resmlp_24_distilled_224,in1k-dist
resmlp_36_224,in1k
resmlp_36_distilled_224,in1k-dist
resmlp_big_24_224,in1k
resmlp_big_24_224_in22ft1k,in21k
resmlp_big_24_distilled_224,in1k-dist
resnest101e,in1k
resnest14d,in1k
resnest200e,in1k
resnest269e,in1k
resnest26d,in1k
resnest50d,in1k
resnest50d_1s4x24d,in1k
resnest50d_4s2x40d,in1k
resnet101d,in1k
resnet152d,in1k
resnet18,in1k
resnet18d,in1k
resnet200d,in1k
resnet26,in1k
resnet26d,in1k
resnet26t,in1k
resnet32ts,in1k
resnet33ts,in1k
resnet34,in1k
resnet34d,in1k
resnet50,in1k
resnet50d,in1k
resnet51q,in1k
resnet61q,in1k
resnetblur50,in1k
resnetrs101,in1k
resnetrs152,in1k
resnetrs200,in1k
resnetrs270,in1k
resnetrs350,in1k
resnetrs420,in1k
resnetrs50,in1k
resnetv2_101,in1k
resnetv2_101x1_bitm,in21k
resnetv2_101x3_bitm,in21k
resnetv2_152x2_bit_teacher,in21k
resnetv2_152x2_bit_teacher_384,in21k
resnetv2_152x2_bitm,in21k
resnetv2_152x4_bitm,in21k
resnetv2_50,in1k
resnetv2_50x1_bit_distilled,in1k-dist
resnetv2_50x1_bitm,in21k
resnetv2_50x3_bitm,in21k
resnext101_32x8d,in1k
resnext26ts,in1k
resnext50_32x4d,in1k
resnext50d_32x4d,in1k
rexnet_100,in1k
rexnet_130,in1k
rexnet_150,in1k
rexnet_200,in1k
sehalonet33ts,in1k
selecsls42b,in1k
selecsls60,in1k
selecsls60b,in1k
semnasnet_100,in1k
seresnet152d,in1k
seresnet33ts,in1k
seresnet50,in1k
seresnext26d_32x4d,in1k
seresnext26t_32x4d,in1k
seresnext26ts,in1k
seresnext50_32x4d,in1k
skresnet18,in1k
skresnet34,in1k
skresnext50_32x4d,in1k
spnasnet_100,in1k
ssl_resnet18,yfc-semisl
ssl_resnet50,yfc-semisl
ssl_resnext101_32x16d,yfc-semisl
ssl_resnext101_32x4d,yfc-semisl
ssl_resnext101_32x8d,yfc-semisl
ssl_resnext50_32x4d,yfc-semisl
swin_base_patch4_window12_384,in21k
swin_base_patch4_window7_224,in21k
swin_large_patch4_window12_384,in21k
swin_large_patch4_window7_224,in21k
swin_small_patch4_window7_224,in1k
swin_tiny_patch4_window7_224,in1k
swsl_resnet18,ig1b-swsl
swsl_resnet50,ig1b-swsl
swsl_resnext101_32x16d,ig1b-swsl
swsl_resnext101_32x4d,ig1b-swsl
swsl_resnext101_32x8d,ig1b-swsl
swsl_resnext50_32x4d,ig1b-swsl
tf_efficientnet_b0,in1k
tf_efficientnet_b0_ap,in1k-ap
tf_efficientnet_b0_ns,jft300m-ns
tf_efficientnet_b1,in1k
tf_efficientnet_b1_ap,in1k-ap
tf_efficientnet_b1_ns,jft300m-ns
tf_efficientnet_b2,in1k
tf_efficientnet_b2_ap,in1k-ap
tf_efficientnet_b2_ns,jft300m-ns
tf_efficientnet_b3,in1k
tf_efficientnet_b3_ap,in1k-ap
tf_efficientnet_b3_ns,jft300m-ns
tf_efficientnet_b4,in1k
tf_efficientnet_b4_ap,in1k-ap
tf_efficientnet_b4_ns,jft300m-ns
tf_efficientnet_b5,in1k
tf_efficientnet_b5_ap,in1k-ap
tf_efficientnet_b5_ns,jft300m-ns
tf_efficientnet_b6,in1k
tf_efficientnet_b6_ap,in1k-ap
tf_efficientnet_b6_ns,jft300m-ns
tf_efficientnet_b7,in1k
tf_efficientnet_b7_ap,in1k-ap
tf_efficientnet_b7_ns,jft300m-ns
tf_efficientnet_b8,in1k
tf_efficientnet_b8_ap,in1k-ap
tf_efficientnet_cc_b0_4e,in1k
tf_efficientnet_cc_b0_8e,in1k
tf_efficientnet_cc_b1_8e,in1k
tf_efficientnet_el,in1k
tf_efficientnet_em,in1k
tf_efficientnet_es,in1k
tf_efficientnet_l2_ns,jft300m-ns
tf_efficientnet_l2_ns_475,jft300m-ns
tf_efficientnet_lite0,in1k
tf_efficientnet_lite1,in1k
tf_efficientnet_lite2,in1k
tf_efficientnet_lite3,in1k
tf_efficientnet_lite4,in1k
tf_efficientnetv2_b0,in1k
tf_efficientnetv2_b1,in1k
tf_efficientnetv2_b2,in1k
tf_efficientnetv2_b3,in1k
tf_efficientnetv2_l,in1k
tf_efficientnetv2_l_in21ft1k,in21k
tf_efficientnetv2_m,in1k
tf_efficientnetv2_m_in21ft1k,in21k
tf_efficientnetv2_s,in1k
tf_efficientnetv2_s_in21ft1k,in21k
tf_efficientnetv2_xl_in21ft1k,in21k
tf_inception_v3,in1k
tf_mixnet_l,in1k
tf_mixnet_m,in1k
tf_mixnet_s,in1k
tf_mobilenetv3_large_075,in1k
tf_mobilenetv3_large_100,in1k
tf_mobilenetv3_large_minimal_100,in1k
tf_mobilenetv3_small_075,in1k
tf_mobilenetv3_small_100,in1k
tf_mobilenetv3_small_minimal_100,in1k
tnt_s_patch16_224,in1k
tresnet_l,in1k
tresnet_l_448,in1k
tresnet_m,in21k
tresnet_m_448,in1k
tresnet_xl,in1k
tresnet_xl_448,in1k
tv_densenet121,in1k
tv_resnet101,in1k
tv_resnet152,in1k
tv_resnet34,in1k
tv_resnet50,in1k
tv_resnext50_32x4d,in1k
twins_pcpvt_base,in1k
twins_pcpvt_large,in1k
twins_pcpvt_small,in1k
twins_svt_base,in1k
twins_svt_large,in1k
twins_svt_small,in1k
vgg11,in1k
vgg11_bn,in1k
vgg13,in1k
vgg13_bn,in1k
vgg16,in1k
vgg16_bn,in1k
vgg19,in1k
vgg19_bn,in1k
visformer_small,in1k
vit_base_patch16_224,in21k
vit_base_patch16_224_miil,in21k
vit_base_patch16_384,in21k
vit_base_patch16_224_sam,in1k
vit_base_patch32_224,in21k
vit_base_patch32_384,in21k
vit_base_patch32_224_sam,in1k
vit_base_r50_s16_384,in21k
vit_large_patch16_224,in21k
vit_large_patch16_384,in21k
vit_large_patch32_384,in21k
vit_large_r50_s32_224,in21k
vit_large_r50_s32_384,in21k
vit_small_patch16_224,in21k
vit_small_patch16_384,in21k
vit_small_patch32_224,in21k
vit_small_patch32_384,in21k
vit_small_r26_s32_224,in21k
vit_small_r26_s32_384,in21k
vit_tiny_patch16_224,in21k
vit_tiny_patch16_384,in21k
vit_tiny_r_s16_p8_224,in21k
vit_tiny_r_s16_p8_384,in21k
wide_resnet101_2,in1k
wide_resnet50_2,in1k
xception,in1k
xception41,in1k
xception65,in1k
xception71,in1k
xcit_large_24_p16_224,in1k
xcit_large_24_p16_224_dist,in1k-dist
xcit_large_24_p16_384_dist,in1k-dist
xcit_large_24_p8_224,in1k
xcit_large_24_p8_224_dist,in1k-dist
xcit_large_24_p8_384_dist,in1k-dist
xcit_medium_24_p16_224,in1k
xcit_medium_24_p16_224_dist,in1k-dist
xcit_medium_24_p16_384_dist,in1k-dist
xcit_medium_24_p8_224,in1k
xcit_medium_24_p8_224_dist,in1k-dist
xcit_medium_24_p8_384_dist,in1k-dist
xcit_nano_12_p16_224,in1k
xcit_nano_12_p16_224_dist,in1k-dist
xcit_nano_12_p16_384_dist,in1k-dist
xcit_nano_12_p8_224,in1k
xcit_nano_12_p8_224_dist,in1k-dist
xcit_nano_12_p8_384_dist,in1k-dist
xcit_small_12_p16_224,in1k
xcit_small_12_p16_224_dist,in1k-dist
xcit_small_12_p16_384_dist,in1k-dist
xcit_small_12_p8_224,in1k
xcit_small_12_p8_224_dist,in1k-dist
xcit_small_12_p8_384_dist,in1k-dist
xcit_small_24_p16_224,in1k
xcit_small_24_p16_224_dist,in1k-dist
xcit_small_24_p16_384_dist,in1k-dist
xcit_small_24_p8_224,in1k
xcit_small_24_p8_224_dist,in1k-dist
xcit_small_24_p8_384_dist,in1k-dist
xcit_tiny_12_p16_224,in1k
xcit_tiny_12_p16_224_dist,in1k-dist
xcit_tiny_12_p16_384_dist,in1k-dist
xcit_tiny_12_p8_224,in1k
xcit_tiny_12_p8_224_dist,in1k-dist
xcit_tiny_12_p8_384_dist,in1k-dist
xcit_tiny_24_p16_224,in1k
xcit_tiny_24_p16_224_dist,in1k-dist
xcit_tiny_24_p16_384_dist,in1k-dist
xcit_tiny_24_p8_224,in1k
xcit_tiny_24_p8_224_dist,in1k-dist
xcit_tiny_24_p8_384_dist,in1k-dist
hf_public_repos/pytorch-image-models/results/benchmark-infer-amp-nhwc-pt210-cu121-rtx3090.csv
model,infer_img_size,infer_batch_size,infer_samples_per_sec,infer_step_time,infer_gmacs,infer_macts,param_count
tinynet_e,106,1024.0,75290.96,13.591,0.03,0.69,2.04
mobilenetv3_small_050,224,1024.0,56785.93,18.023,0.03,0.92,1.59
efficientvit_m0,224,1024.0,50656.23,20.205,0.08,0.91,2.35
lcnet_035,224,1024.0,48853.22,20.951,0.03,1.04,1.64
lcnet_050,224,1024.0,42147.98,24.285,0.05,1.26,1.88
mobilenetv3_small_075,224,1024.0,42002.46,24.369,0.05,1.3,2.04
mobilenetv3_small_100,224,1024.0,38516.23,26.573,0.06,1.42,2.54
tinynet_d,152,1024.0,37989.71,26.944,0.05,1.42,2.34
efficientvit_m1,224,1024.0,37486.44,27.306,0.17,1.33,2.98
tf_mobilenetv3_small_minimal_100,224,1024.0,33948.13,30.153,0.06,1.41,2.04
efficientvit_m2,224,1024.0,33551.67,30.51,0.2,1.47,4.19
tf_mobilenetv3_small_075,224,1024.0,33262.15,30.775,0.05,1.3,2.04
tf_mobilenetv3_small_100,224,1024.0,31002.71,33.019,0.06,1.42,2.54
lcnet_075,224,1024.0,30664.19,33.384,0.1,1.99,2.36
efficientvit_m3,224,1024.0,29423.78,34.792,0.27,1.62,6.9
efficientvit_m4,224,1024.0,27882.1,36.716,0.3,1.7,8.8
mnasnet_small,224,1024.0,25015.02,40.925,0.07,2.16,2.03
regnetx_002,224,1024.0,24564.71,41.67,0.2,2.16,2.68
lcnet_100,224,1024.0,24268.72,42.183,0.16,2.52,2.95
levit_128s,224,1024.0,22705.11,45.089,0.31,1.88,7.78
regnety_002,224,1024.0,22248.91,46.012,0.2,2.17,3.16
resnet10t,176,1024.0,22236.3,46.04,0.7,1.51,5.44
mobilenetv2_035,224,1024.0,22055.42,46.418,0.07,2.86,1.68
levit_conv_128s,224,1024.0,21863.15,46.826,0.31,1.88,7.78
ghostnet_050,224,1024.0,20782.95,49.261,0.05,1.77,2.59
mnasnet_050,224,1024.0,20672.17,49.525,0.11,3.07,2.22
repghostnet_050,224,1024.0,20617.05,49.657,0.05,2.02,2.31
efficientvit_m5,224,1024.0,19010.14,53.856,0.53,2.41,12.47
tinynet_c,184,1024.0,18737.07,54.641,0.11,2.87,2.46
efficientvit_b0,224,1024.0,18023.56,56.804,0.1,2.87,3.41
semnasnet_050,224,1024.0,17573.38,58.26,0.11,3.44,2.08
mobilenetv2_050,224,1024.0,17491.5,58.532,0.1,3.64,1.97
regnetx_004,224,1024.0,17164.74,59.647,0.4,3.14,5.16
repghostnet_058,224,1024.0,16947.81,60.41,0.07,2.59,2.55
regnetx_004_tv,224,1024.0,16485.73,62.101,0.42,3.17,5.5
vit_small_patch32_224,224,1024.0,16428.86,62.319,1.12,2.09,22.88
cs3darknet_focus_s,256,1024.0,16333.25,62.684,0.69,2.7,3.27
lcnet_150,224,1024.0,15841.02,64.632,0.34,3.79,4.5
gernet_s,224,1024.0,15617.62,65.556,0.75,2.65,8.17
cs3darknet_s,256,1024.0,15597.89,65.64,0.72,2.97,3.28
levit_128,224,1024.0,15372.6,66.601,0.41,2.71,9.21
vit_tiny_r_s16_p8_224,224,1024.0,15191.19,67.397,0.43,1.85,6.34
levit_conv_128,224,1024.0,14904.31,68.695,0.41,2.71,9.21
mobilenetv3_large_075,224,1024.0,14843.63,68.964,0.16,4.0,3.99
pit_ti_distilled_224,224,1024.0,14746.15,69.432,0.51,2.77,5.1
pit_ti_224,224,1024.0,14700.08,69.649,0.5,2.75,4.85
mixer_s32_224,224,1024.0,14362.24,71.288,1.0,2.28,19.1
resnet10t,224,1024.0,14254.88,71.825,1.1,2.43,5.44
repghostnet_080,224,1024.0,13967.84,73.293,0.1,3.22,3.28
tf_efficientnetv2_b0,192,1024.0,13629.52,75.121,0.54,3.51,7.14
mobilenetv3_rw,224,1024.0,13582.75,75.38,0.23,4.41,5.48
levit_192,224,1024.0,13511.34,75.778,0.66,3.2,10.95
mnasnet_075,224,1024.0,13417.36,76.309,0.23,4.77,3.17
mobilenetv3_large_100,224,1024.0,13322.79,76.851,0.23,4.41,5.48
hardcorenas_a,224,1024.0,13314.34,76.899,0.23,4.38,5.26
levit_conv_192,224,1024.0,12952.02,79.05,0.66,3.2,10.95
regnety_004,224,1024.0,12651.55,80.929,0.41,3.89,4.34
tf_mobilenetv3_large_075,224,1024.0,12636.69,81.023,0.16,4.0,3.99
nf_regnet_b0,192,1024.0,12264.41,83.481,0.37,3.15,8.76
tinynet_b,188,1024.0,12262.56,83.495,0.21,4.44,3.73
tf_mobilenetv3_large_minimal_100,224,1024.0,12182.74,84.043,0.22,4.4,3.92
hardcorenas_b,224,1024.0,12118.5,84.488,0.26,5.09,5.18
hardcorenas_c,224,1024.0,12088.28,84.699,0.28,5.01,5.52
resnet14t,176,1024.0,11843.82,86.448,1.07,3.61,10.08
mnasnet_100,224,1024.0,11686.43,87.612,0.33,5.46,4.38
regnety_006,224,1024.0,11675.48,87.69,0.61,4.33,6.06
ese_vovnet19b_slim_dw,224,1024.0,11663.91,87.781,0.4,5.28,1.9
repghostnet_100,224,1024.0,11508.79,88.956,0.15,3.98,4.07
tf_mobilenetv3_large_100,224,1024.0,11443.62,89.472,0.23,4.41,5.48
vit_tiny_patch16_224,224,1024.0,11342.82,90.267,1.08,4.12,5.72
hardcorenas_d,224,1024.0,11329.99,90.369,0.3,4.93,7.5
deit_tiny_distilled_patch16_224,224,1024.0,11311.9,90.514,1.09,4.15,5.91
deit_tiny_patch16_224,224,1024.0,11286.31,90.719,1.08,4.12,5.72
semnasnet_075,224,1024.0,11132.28,91.974,0.23,5.54,2.91
resnet18,224,1024.0,11101.69,92.228,1.82,2.48,11.69
ghostnet_100,224,1024.0,11039.87,92.744,0.15,3.55,5.18
mobilenetv2_075,224,1024.0,10984.87,93.208,0.22,5.86,2.64
spnasnet_100,224,1024.0,10557.11,96.986,0.35,6.03,4.42
tf_efficientnetv2_b1,192,1024.0,10473.04,97.765,0.76,4.59,8.14
regnetx_008,224,1024.0,10422.45,98.23,0.81,5.15,7.26
seresnet18,224,1024.0,10416.31,98.297,1.82,2.49,11.78
tf_efficientnetv2_b0,224,1024.0,10174.51,100.633,0.73,4.77,7.14
legacy_seresnet18,224,1024.0,10133.12,101.044,1.82,2.49,11.78
repghostnet_111,224,1024.0,10094.28,101.428,0.18,4.38,4.54
hardcorenas_f,224,1024.0,10012.95,102.257,0.35,5.57,8.2
tinynet_a,192,1024.0,9946.05,102.945,0.35,5.41,6.19
dla46_c,224,1024.0,9943.77,102.967,0.58,4.5,1.3
hardcorenas_e,224,1024.0,9851.75,103.931,0.35,5.65,8.07
semnasnet_100,224,1024.0,9823.16,104.233,0.32,6.23,3.89
levit_256,224,1024.0,9811.76,104.354,1.13,4.23,18.89
repvgg_a0,224,1024.0,9709.7,105.449,1.52,3.59,9.11
mobilenetv2_100,224,1024.0,9654.78,106.051,0.31,6.68,3.5
regnety_008,224,1024.0,9643.2,106.178,0.81,5.25,6.26
fbnetc_100,224,1024.0,9552.51,107.186,0.4,6.51,5.57
efficientnet_lite0,224,1024.0,9466.4,108.161,0.4,6.74,4.65
levit_conv_256,224,1024.0,9461.49,108.218,1.13,4.23,18.89
resnet18d,224,1024.0,9458.4,108.253,2.06,3.29,11.71
pit_xs_224,224,1024.0,9332.33,109.714,1.1,4.12,10.62
ese_vovnet19b_slim,224,1024.0,9277.16,110.369,1.69,3.52,3.17
regnety_008_tv,224,1024.0,9213.78,111.127,0.84,5.42,6.43
pit_xs_distilled_224,224,1024.0,9203.86,111.241,1.11,4.15,11.0
convnext_atto,224,1024.0,9104.06,112.467,0.55,3.81,3.7
repghostnet_130,224,1024.0,8873.05,115.395,0.25,5.24,5.48
ghostnet_130,224,1024.0,8870.81,115.424,0.24,4.6,7.36
convnext_atto_ols,224,1024.0,8829.55,115.964,0.58,4.11,3.7
regnetz_005,224,1024.0,8796.44,116.392,0.52,5.86,7.12
xcit_nano_12_p16_224,224,1024.0,8604.96,118.991,0.56,4.17,3.05
levit_256d,224,1024.0,8322.97,123.022,1.4,4.93,26.21
regnetx_006,224,1024.0,8320.1,123.064,0.61,3.98,6.2
tf_efficientnet_lite0,224,1024.0,8163.21,125.431,0.4,6.74,4.65
fbnetv3_b,224,1024.0,8152.31,125.598,0.42,6.97,8.6
efficientnet_b0,224,1024.0,8085.72,126.633,0.4,6.75,5.29
levit_conv_256d,224,1024.0,8055.13,127.113,1.4,4.93,26.21
edgenext_xx_small,256,1024.0,8014.51,127.757,0.26,3.33,1.33
mnasnet_140,224,1024.0,7984.3,128.241,0.6,7.71,7.12
convnext_femto,224,1024.0,7977.79,128.346,0.79,4.57,5.22
tf_efficientnetv2_b2,208,1024.0,7861.13,130.251,1.06,6.0,10.1
mobilevit_xxs,256,1024.0,7827.79,130.801,0.34,5.74,1.27
repghostnet_150,224,1024.0,7766.69,131.835,0.32,6.0,6.58
convnext_femto_ols,224,1024.0,7757.32,131.994,0.82,4.87,5.23
rexnetr_100,224,1024.0,7545.9,135.692,0.43,7.72,4.88
repvit_m1,224,1024.0,7543.44,135.728,0.83,7.45,5.49
resnet14t,224,1024.0,7466.4,137.137,1.69,5.8,10.08
mobilenetv2_110d,224,1024.0,7331.32,139.66,0.45,8.71,4.52
hrnet_w18_small,224,1024.0,7298.3,140.296,1.61,5.72,13.19
cs3darknet_focus_m,256,1024.0,7202.61,142.16,1.98,4.89,9.3
repvit_m0_9,224,1024.0,7165.5,142.888,0.83,7.45,5.49
crossvit_tiny_240,240,1024.0,7123.68,143.735,1.3,5.67,7.01
efficientvit_b1,224,1024.0,7109.59,144.02,0.53,7.25,9.1
tf_efficientnet_b0,224,1024.0,7104.21,144.129,0.4,6.75,5.29
crossvit_9_240,240,1024.0,7025.32,145.747,1.55,5.59,8.55
nf_regnet_b0,256,1024.0,6992.1,146.441,0.64,5.58,8.76
repvgg_a1,224,1024.0,6942.64,147.483,2.64,4.74,14.09
mobilevitv2_050,256,1024.0,6935.55,147.628,0.48,8.04,1.37
cs3darknet_m,256,1024.0,6929.59,147.762,2.08,5.28,9.31
efficientnet_b1_pruned,240,1024.0,6922.7,147.909,0.4,6.21,6.33
gernet_m,224,1024.0,6840.64,149.682,3.02,5.24,21.14
fbnetv3_d,224,1024.0,6784.35,150.925,0.52,8.5,10.31
semnasnet_140,224,1024.0,6771.35,151.215,0.6,8.87,6.11
crossvit_9_dagger_240,240,1024.0,6704.51,152.722,1.68,6.03,8.78
tf_efficientnetv2_b1,240,1024.0,6611.54,154.87,1.21,7.34,8.14
mobilenetv2_140,224,1024.0,6588.7,155.407,0.6,9.57,6.11
resnet34,224,1024.0,6504.25,157.425,3.67,3.74,21.8
ese_vovnet19b_dw,224,1024.0,6406.95,159.816,1.34,8.25,6.54
selecsls42,224,1024.0,6366.41,160.834,2.94,4.62,30.35
resnet18,288,1024.0,6354.7,161.13,3.01,4.11,11.69
selecsls42b,224,1024.0,6344.62,161.386,2.98,4.62,32.46
efficientnet_b0_g16_evos,224,1024.0,6342.4,161.442,1.01,7.42,8.11
edgenext_xx_small,288,1024.0,6334.97,161.631,0.33,4.21,1.33
efficientnet_lite1,240,1024.0,6268.15,163.355,0.62,10.14,5.42
pvt_v2_b0,224,1024.0,6254.52,163.711,0.53,7.01,3.67
visformer_tiny,224,1024.0,6218.29,164.665,1.27,5.72,10.32
convnext_pico,224,1024.0,6208.02,164.938,1.37,6.1,9.05
fbnetv3_b,256,1024.0,6192.25,165.357,0.55,9.1,8.6
efficientnet_es_pruned,224,1024.0,6175.39,165.809,1.81,8.73,5.44
efficientnet_es,224,1024.0,6170.12,165.95,1.81,8.73,5.44
rexnet_100,224,1024.0,6170.05,165.953,0.41,7.44,4.8
ghostnetv2_100,224,1024.0,6155.62,166.342,0.18,4.55,6.16
seresnet34,224,1024.0,6069.09,168.714,3.67,3.74,21.96
convnext_pico_ols,224,1024.0,6043.01,169.442,1.43,6.5,9.06
seresnet18,288,1024.0,5998.94,170.686,3.01,4.11,11.78
dla46x_c,224,1024.0,5992.19,170.877,0.54,5.66,1.07
dla34,224,1024.0,5954.72,171.952,3.07,5.02,15.74
repghostnet_200,224,1024.0,5934.75,172.524,0.54,7.96,9.8
resnet26,224,1024.0,5916.33,173.07,2.36,7.35,16.0
levit_384,224,1024.0,5897.4,173.625,2.36,6.26,39.13
resnet34d,224,1024.0,5884.13,174.017,3.91,4.54,21.82
cs3darknet_focus_m,288,1024.0,5878.89,174.173,2.51,6.19,9.3
legacy_seresnet34,224,1024.0,5873.4,174.335,3.67,3.74,21.96
repvit_m2,224,1024.0,5866.53,174.53,1.36,9.43,8.8
vit_base_patch32_224,224,1024.0,5866.04,174.553,4.37,4.19,88.22
vit_base_patch32_clip_224,224,1024.0,5864.79,174.59,4.37,4.19,88.22
repvit_m1_0,224,1024.0,5862.26,174.66,1.13,8.69,7.3
tf_efficientnet_es,224,1024.0,5831.76,175.58,1.81,8.73,5.44
rexnetr_130,224,1024.0,5827.09,175.72,0.68,9.81,7.61
resnetrs50,160,1024.0,5819.33,175.954,2.29,6.2,35.69
dla60x_c,224,1024.0,5709.85,179.326,0.59,6.01,1.32
vit_small_patch32_384,384,1024.0,5700.23,179.631,3.26,6.07,22.92
levit_conv_384,224,1024.0,5694.64,179.807,2.36,6.26,39.13
tiny_vit_5m_224,224,1024.0,5681.84,180.212,1.18,9.32,12.08
efficientnet_b1,224,1024.0,5671.54,180.54,0.59,9.36,7.79
cs3darknet_m,288,1024.0,5670.5,180.573,2.63,6.69,9.31
resnetblur18,224,1024.0,5631.98,181.808,2.34,3.39,11.69
tf_efficientnet_lite1,240,1024.0,5588.09,183.236,0.62,10.14,5.42
repvit_m1_1,224,1024.0,5584.25,183.355,1.36,9.43,8.8
mixnet_s,224,1024.0,5566.85,183.931,0.25,6.25,4.13
convnext_atto,288,1024.0,5556.64,184.274,0.91,6.3,3.7
darknet17,256,1024.0,5525.94,185.298,3.26,7.18,14.3
pit_s_224,224,1024.0,5520.06,185.491,2.42,6.18,23.46
resnet18d,288,1024.0,5497.35,186.262,3.41,5.43,11.71
selecsls60,224,1024.0,5496.69,186.283,3.59,5.52,30.67
pit_s_distilled_224,224,1024.0,5494.69,186.349,2.45,6.22,24.04
xcit_tiny_12_p16_224,224,1024.0,5472.11,187.12,1.24,6.29,6.72
selecsls60b,224,1024.0,5466.97,187.296,3.63,5.52,32.77
skresnet18,224,1024.0,5432.07,188.499,1.82,3.24,11.96
convnext_atto_ols,288,1024.0,5378.78,190.367,0.96,6.8,3.7
resmlp_12_224,224,1024.0,5371.14,190.637,3.01,5.5,15.35
regnetz_005,288,1024.0,5353.96,191.249,0.86,9.68,7.12
mobilenetv2_120d,224,1024.0,5347.39,191.484,0.69,11.97,5.83
convnextv2_atto,224,1024.0,5293.77,193.425,0.55,3.81,3.71
repvgg_b0,224,1024.0,5265.8,194.451,3.41,6.15,15.82
mixer_b32_224,224,1024.0,5245.72,195.191,3.24,6.29,60.29
vit_tiny_r_s16_p8_384,384,1024.0,5235.72,195.568,1.25,5.39,6.36
nf_regnet_b1,256,1024.0,5226.46,195.915,0.82,7.27,10.22
nf_regnet_b2,240,1024.0,5223.53,196.02,0.97,7.23,14.31
vit_base_patch32_clip_quickgelu_224,224,1024.0,5220.87,196.124,4.37,4.19,87.85
resnetaa34d,224,1024.0,5205.31,196.711,4.43,5.07,21.82
resnet26d,224,1024.0,5169.81,198.062,2.6,8.15,16.01
tf_mixnet_s,224,1024.0,5128.65,199.652,0.25,6.25,4.13
rexnetr_150,224,1024.0,5105.32,200.564,0.89,11.13,9.78
gmixer_12_224,224,1024.0,5083.79,201.414,2.67,7.26,12.7
fbnetv3_d,256,1024.0,5047.63,202.856,0.68,11.1,10.31
edgenext_x_small,256,1024.0,5018.94,204.014,0.54,5.93,2.34
mixer_s16_224,224,1024.0,5009.58,204.393,3.79,5.97,18.53
regnetz_b16,224,1024.0,5008.24,204.437,1.45,9.95,9.72
gmlp_ti16_224,224,1024.0,4999.44,204.811,1.34,7.55,5.87
darknet21,256,1024.0,4956.17,206.601,3.93,7.47,20.86
eva02_tiny_patch14_224,224,1024.0,4940.45,207.258,1.4,6.17,5.5
ghostnetv2_130,224,1024.0,4896.55,209.116,0.28,5.9,8.96
convnext_femto,288,1024.0,4844.52,211.362,1.3,7.56,5.22
nf_resnet26,224,1024.0,4822.21,212.339,2.41,7.35,16.0
efficientnet_lite2,260,1024.0,4817.66,212.541,0.89,12.9,6.09
tf_efficientnetv2_b2,260,1024.0,4797.27,213.444,1.72,9.84,10.1
efficientnet_cc_b0_8e,224,1024.0,4749.51,215.591,0.42,9.42,24.01
sedarknet21,256,1024.0,4747.46,215.684,3.93,7.47,20.95
efficientnet_cc_b0_4e,224,1024.0,4720.11,216.933,0.41,9.42,13.31
efficientnet_b2_pruned,260,1024.0,4716.64,217.093,0.73,9.13,8.31
convnext_femto_ols,288,1024.0,4709.5,217.422,1.35,8.06,5.23
resnext26ts,256,1024.0,4668.94,219.311,2.43,10.52,10.3
tiny_vit_11m_224,224,1024.0,4649.32,220.237,1.9,10.73,20.35
ecaresnet50d_pruned,224,1024.0,4636.78,220.832,2.53,6.43,19.94
deit_small_patch16_224,224,1024.0,4620.93,221.59,4.25,8.25,22.05
efficientformer_l1,224,1024.0,4616.64,221.795,1.3,5.53,12.29
vit_small_patch16_224,224,1024.0,4614.32,221.907,4.25,8.25,22.05
dpn48b,224,1024.0,4588.67,223.146,1.69,8.92,9.13
deit_small_distilled_patch16_224,224,1024.0,4587.3,223.214,4.27,8.29,22.44
vit_base_patch32_clip_256,256,1024.0,4547.51,225.168,5.68,5.44,87.86
convnextv2_femto,224,1024.0,4545.73,225.256,0.79,4.57,5.23
mobilevitv2_075,256,1024.0,4537.95,225.638,1.05,12.06,2.87
eca_resnext26ts,256,1024.0,4521.18,226.479,2.43,10.52,10.3
seresnext26ts,256,1024.0,4517.43,226.666,2.43,10.52,10.39
efficientnetv2_rw_t,224,1024.0,4511.98,226.94,1.93,9.94,13.65
legacy_seresnext26_32x4d,224,1024.0,4489.21,228.092,2.49,9.39,16.79
gernet_l,256,1024.0,4474.96,228.817,4.57,8.0,31.08
gcresnext26ts,256,1024.0,4472.11,228.964,2.43,10.53,10.48
rexnet_130,224,1024.0,4453.51,229.92,0.68,9.71,7.56
tf_efficientnet_b1,240,1024.0,4442.45,230.492,0.71,10.88,7.79
tf_efficientnet_cc_b0_8e,224,1024.0,4391.83,233.15,0.42,9.42,24.01
convnext_nano,224,1024.0,4389.78,233.258,2.46,8.37,15.59
gc_efficientnetv2_rw_t,224,1024.0,4373.41,234.132,1.94,9.97,13.68
tf_efficientnet_cc_b0_4e,224,1024.0,4373.37,234.134,0.41,9.42,13.31
tf_efficientnetv2_b3,240,1024.0,4372.06,234.204,1.93,9.95,14.36
tf_efficientnet_lite2,260,1024.0,4324.79,236.764,0.89,12.9,6.09
efficientnet_b1,256,1024.0,4298.75,238.198,0.77,12.22,7.79
deit3_small_patch16_224,224,1024.0,4270.38,239.779,4.25,8.25,22.06
cs3darknet_focus_l,256,1024.0,4230.07,242.066,4.66,8.03,21.15
nf_regnet_b1,288,1024.0,4135.98,247.568,1.02,9.2,10.22
convnext_nano_ols,224,1024.0,4118.16,248.644,2.65,9.38,15.65
nf_seresnet26,224,1024.0,4112.79,248.966,2.41,7.36,17.4
nf_ecaresnet26,224,1024.0,4107.39,249.292,2.41,7.36,16.0
efficientnet_b2,256,1024.0,4105.27,249.424,0.89,12.81,9.11
cs3darknet_l,256,1024.0,4101.41,249.66,4.86,8.55,21.16
nf_regnet_b2,272,1024.0,4097.18,249.913,1.22,9.27,14.31
ecaresnext50t_32x4d,224,1024.0,4074.12,251.332,2.7,10.09,15.41
ecaresnext26t_32x4d,224,1024.0,4072.14,251.454,2.7,10.09,15.41
seresnext26t_32x4d,224,1024.0,4061.05,252.141,2.7,10.09,16.81
repvgg_a2,224,1024.0,4049.32,252.867,5.7,6.26,28.21
poolformer_s12,224,1024.0,4047.55,252.981,1.82,5.53,11.92
seresnext26d_32x4d,224,1024.0,4037.54,253.609,2.73,10.19,16.81
regnetx_016,224,1024.0,4025.84,254.342,1.62,7.93,9.19
resnet26t,256,1024.0,4021.85,254.598,3.35,10.52,16.01
flexivit_small,240,1024.0,4011.8,255.236,4.88,9.46,22.06
edgenext_x_small,288,1024.0,3990.87,256.573,0.68,7.5,2.34
rexnet_150,224,1024.0,3983.48,257.051,0.9,11.21,9.73
vit_relpos_small_patch16_rpn_224,224,1024.0,3975.32,257.575,4.24,9.38,21.97
repvit_m3,224,1024.0,3966.18,258.164,1.89,13.94,10.68
vit_relpos_small_patch16_224,224,1024.0,3948.05,259.358,4.24,9.38,21.98
vit_srelpos_small_patch16_224,224,1024.0,3937.22,260.07,4.23,8.49,21.97
mobileone_s1,224,1024.0,3931.71,260.434,0.86,9.67,4.83
resnetv2_50,224,1024.0,3890.29,263.208,4.11,11.11,25.55
eca_botnext26ts_256,256,1024.0,3883.93,263.639,2.46,11.6,10.59
cs3sedarknet_l,256,1024.0,3835.91,266.94,4.86,8.56,21.91
ghostnetv2_160,224,1024.0,3826.79,267.576,0.42,7.23,12.39
resnet34,288,1024.0,3820.15,268.041,6.07,6.18,21.8
edgenext_small,256,1024.0,3794.31,269.865,1.26,9.07,5.59
dpn68,224,1024.0,3788.79,270.258,2.35,10.47,12.61
ese_vovnet19b_dw,288,1024.0,3782.88,270.682,2.22,13.63,6.54
fbnetv3_g,240,1024.0,3779.41,270.931,1.28,14.87,16.62
convnext_pico,288,1024.0,3777.8,271.046,2.27,10.08,9.05
ecaresnetlight,224,1024.0,3759.77,272.346,4.11,8.42,30.16
eca_halonext26ts,256,1024.0,3745.07,273.414,2.44,11.46,10.76
dpn68b,224,1024.0,3719.51,275.293,2.35,10.47,12.61
mixnet_m,224,1024.0,3687.37,277.689,0.36,8.19,5.01
resnet50,224,1024.0,3687.18,277.708,4.11,11.11,25.56
efficientnet_em,240,1024.0,3685.78,277.814,3.04,14.34,6.9
convnext_pico_ols,288,1024.0,3673.49,278.743,2.37,10.74,9.06
resnet32ts,256,1024.0,3641.96,281.156,4.63,11.58,17.96
bat_resnext26ts,256,1024.0,3638.35,281.435,2.53,12.51,10.73
efficientnet_b3_pruned,300,1024.0,3633.29,281.827,1.04,11.86,9.86
botnet26t_256,256,1024.0,3632.31,281.904,3.32,11.98,12.49
hrnet_w18_small_v2,224,1024.0,3631.33,281.979,2.62,9.65,15.6
ecaresnet101d_pruned,224,1024.0,3611.37,283.538,3.48,7.69,24.88
ecaresnet26t,256,1024.0,3599.02,284.511,3.35,10.53,16.01
regnetv_040,224,1024.0,3598.04,284.583,4.0,12.29,20.64
seresnet34,288,1024.0,3583.61,285.735,6.07,6.18,21.96
resnetv2_50t,224,1024.0,3573.26,286.561,4.32,11.82,25.57
pvt_v2_b1,224,1024.0,3571.19,286.726,2.04,14.01,14.01
regnety_016,224,1024.0,3567.37,287.031,1.63,8.04,11.2
resnext26ts,288,1024.0,3565.74,287.167,3.07,13.31,10.3
regnety_040,224,1024.0,3565.62,287.173,4.0,12.29,20.65
resnet33ts,256,1024.0,3563.66,287.335,4.76,11.66,19.68
resnetv2_50d,224,1024.0,3553.44,288.159,4.35,11.92,25.57
tf_efficientnet_em,240,1024.0,3544.42,288.894,3.04,14.34,6.9
halonet26t,256,1024.0,3541.55,289.129,3.19,11.69,12.48
dla60,224,1024.0,3527.55,290.275,4.26,10.16,22.04
tf_mixnet_m,224,1024.0,3524.0,290.567,0.36,8.19,5.01
resnet50c,224,1024.0,3521.04,290.812,4.35,11.92,25.58
edgenext_small_rw,256,1024.0,3501.76,292.411,1.58,9.51,7.83
resnet34d,288,1024.0,3491.3,293.29,6.47,7.51,21.82
convnextv2_pico,224,1024.0,3480.58,294.194,1.37,6.1,9.07
vit_small_resnet26d_224,224,1024.0,3476.26,294.557,5.04,10.65,63.61
convit_tiny,224,1024.0,3460.49,295.901,1.26,7.94,5.71
tresnet_m,224,1024.0,3457.69,296.14,5.75,7.31,31.39
resnet26,288,1024.0,3457.48,296.158,3.9,12.15,16.0
seresnext26ts,288,1024.0,3455.43,296.333,3.07,13.32,10.39
vit_relpos_base_patch32_plus_rpn_256,256,1024.0,3447.98,296.974,7.59,6.63,119.42
seresnet33ts,256,1024.0,3444.98,297.233,4.76,11.66,19.78
eca_resnext26ts,288,1024.0,3443.01,297.404,3.07,13.32,10.3
eca_resnet33ts,256,1024.0,3442.23,297.471,4.76,11.66,19.68
tf_efficientnet_b2,260,1024.0,3440.99,297.578,1.02,13.83,9.11
gcresnet33ts,256,1024.0,3424.64,298.998,4.76,11.68,19.88
gcresnext26ts,288,1024.0,3414.23,299.91,3.07,13.33,10.48
resnet50t,224,1024.0,3401.57,301.026,4.32,11.82,25.57
vovnet39a,224,1024.0,3395.56,301.56,7.09,6.73,22.6
resnet50d,224,1024.0,3380.59,302.894,4.35,11.92,25.58
efficientvit_b2,224,1024.0,3359.89,304.76,1.6,14.62,24.33
resnest14d,224,1024.0,3357.89,304.943,2.76,7.33,10.61
vit_base_patch32_plus_256,256,1024.0,3354.04,305.293,7.7,6.35,119.48
efficientnet_b0_gn,224,1024.0,3353.74,305.319,0.42,6.75,5.29
cs3darknet_focus_l,288,1024.0,3340.22,306.556,5.9,10.16,21.15
selecsls84,224,1024.0,3335.07,307.029,5.9,7.57,50.95
vit_tiny_patch16_384,384,1024.0,3332.37,307.277,3.16,12.08,5.79
legacy_seresnet50,224,1024.0,3325.14,307.946,3.88,10.6,28.09
coatnet_nano_cc_224,224,1024.0,3301.24,310.176,2.13,13.1,13.76
fastvit_t8,256,1024.0,3298.88,310.398,0.7,8.63,4.03
resnetblur18,288,1024.0,3292.39,311.01,3.87,5.6,11.69
repvit_m1_5,224,1024.0,3281.4,312.05,2.31,15.7,14.64
ese_vovnet39b,224,1024.0,3276.58,312.51,7.09,6.74,24.57
levit_512,224,1024.0,3274.29,312.728,5.64,10.22,95.17
haloregnetz_b,224,1024.0,3272.82,312.869,1.97,11.94,11.68
mobilevit_xs,256,1024.0,3272.76,312.87,0.93,13.62,2.32
coat_lite_tiny,224,1024.0,3257.39,314.352,1.6,11.65,5.72
coatnext_nano_rw_224,224,1024.0,3256.31,314.455,2.36,10.68,14.7
eca_vovnet39b,224,1024.0,3252.14,314.859,7.09,6.74,22.6
efficientnet_b2,288,1024.0,3249.31,315.132,1.12,16.2,9.11
resnetaa50,224,1024.0,3245.58,315.495,5.15,11.64,25.56
coatnet_nano_rw_224,224,1024.0,3238.25,316.209,2.29,13.29,15.14
cs3darknet_l,288,1024.0,3236.81,316.35,6.16,10.83,21.16
convnextv2_atto,288,1024.0,3226.1,317.401,0.91,6.3,3.71
mobileone_s2,224,1024.0,3211.19,318.869,1.34,11.55,7.88
seresnet50,224,1024.0,3200.07,319.981,4.11,11.13,28.09
nf_regnet_b3,288,1024.0,3185.16,321.477,1.67,11.84,18.59
crossvit_small_240,240,1024.0,3184.9,321.506,5.09,11.34,26.86
res2net50_48w_2s,224,1024.0,3168.87,323.132,4.18,11.72,25.29
resnetaa34d,288,1024.0,3155.87,324.463,7.33,8.38,21.82
vit_small_r26_s32_224,224,1024.0,3124.44,327.727,3.54,9.44,36.43
dla60x,224,1024.0,3106.99,329.567,3.54,13.8,17.35
efficientnet_b0_g8_gn,224,1024.0,3104.31,329.853,0.66,6.75,6.56
resnext50_32x4d,224,1024.0,3099.2,330.397,4.26,14.4,25.03
levit_conv_512,224,1024.0,3078.02,332.67,5.64,10.22,95.17
skresnet34,224,1024.0,3073.03,333.21,3.67,5.13,22.28
coat_lite_mini,224,1024.0,3058.66,334.777,2.0,12.25,11.01
resnet26d,288,1024.0,3053.73,335.317,4.29,13.48,16.01
mobileone_s0,224,1024.0,3053.01,335.391,1.09,15.48,5.29
levit_512d,224,1024.0,3045.04,336.274,5.85,11.3,92.5
cs3sedarknet_l,288,1024.0,3026.08,338.38,6.16,10.83,21.91
resnetaa50d,224,1024.0,3022.22,338.813,5.39,12.44,25.58
convnext_tiny,224,1024.0,3015.62,339.555,4.47,13.44,28.59
eca_nfnet_l0,224,1024.0,3011.21,340.052,4.35,10.47,24.14
xcit_nano_12_p16_384,384,1024.0,3011.18,340.055,1.64,12.14,3.05
nfnet_l0,224,1024.0,3000.78,341.23,4.36,10.47,35.07
resnetrs50,224,1024.0,2989.89,342.477,4.48,12.14,35.69
efficientnet_cc_b1_8e,240,1024.0,2988.69,342.615,0.75,15.44,39.72
regnetz_b16,288,1024.0,2987.05,342.79,2.39,16.43,9.72
seresnet50t,224,1024.0,2984.21,343.128,4.32,11.83,28.1
ecaresnet50d,224,1024.0,2975.54,344.128,4.35,11.93,25.58
regnetz_c16,256,1024.0,2971.35,344.607,2.51,16.57,13.46
densenet121,224,1024.0,2967.84,345.021,2.87,6.9,7.98
crossvit_15_240,240,1024.0,2967.06,345.11,5.17,12.01,27.53
resnet50s,224,1024.0,2958.0,346.169,5.47,13.52,25.68
rexnetr_200,224,1024.0,2955.32,346.483,1.59,15.11,16.52
mixnet_l,224,1024.0,2926.26,349.918,0.58,10.84,7.33
xcit_tiny_24_p16_224,224,1024.0,2925.33,350.035,2.34,11.82,12.12
levit_conv_512d,224,1024.0,2899.99,353.091,5.85,11.3,92.5
gcresnext50ts,256,1024.0,2897.54,353.393,3.75,15.46,15.67
lambda_resnet26rpt_256,256,1024.0,2887.51,354.621,3.16,11.87,10.99
resnext50d_32x4d,224,1024.0,2876.86,355.933,4.5,15.2,25.05
resnet32ts,288,1024.0,2868.64,356.953,5.86,14.65,17.96
crossvit_15_dagger_240,240,1024.0,2848.99,359.413,5.5,12.68,28.21
tiny_vit_21m_224,224,1024.0,2842.09,360.287,4.08,15.96,33.22
vit_base_resnet26d_224,224,1024.0,2837.87,360.821,6.93,12.34,101.4
tf_efficientnet_cc_b1_8e,240,1024.0,2835.77,361.09,0.75,15.44,39.72
cspresnet50,256,1024.0,2834.55,361.245,4.54,11.5,21.62
mobilevitv2_100,256,1024.0,2833.62,361.358,1.84,16.08,4.9
resnet33ts,288,1024.0,2829.43,361.9,6.02,14.75,19.68
vovnet57a,224,1024.0,2821.83,362.874,8.95,7.52,36.64
deit3_medium_patch16_224,224,1024.0,2805.09,365.038,7.53,10.99,38.85
inception_next_tiny,224,1024.0,2798.9,365.847,4.19,11.98,28.06
tf_mixnet_l,224,1024.0,2798.14,365.947,0.58,10.84,7.33
res2next50,224,1024.0,2797.04,366.091,4.2,13.71,24.67
dla60_res2next,224,1024.0,2795.54,366.285,3.49,13.17,17.03
coatnet_pico_rw_224,224,1024.0,2793.27,366.584,1.96,12.91,10.85
convnext_tiny_hnf,224,1024.0,2770.64,369.577,4.47,13.44,28.59
gcresnet50t,256,1024.0,2767.9,369.943,5.42,14.67,25.9
convnextv2_femto,288,1024.0,2762.62,370.652,1.3,7.56,5.23
tf_efficientnetv2_b3,300,1024.0,2757.15,371.387,3.04,15.74,14.36
legacy_seresnext50_32x4d,224,1024.0,2750.41,372.297,4.26,14.42,27.56
ecaresnet50d_pruned,288,1024.0,2749.78,372.383,4.19,10.61,19.94
res2net50_26w_4s,224,1024.0,2749.69,372.394,4.28,12.61,25.7
seresnext50_32x4d,224,1024.0,2749.17,372.464,4.26,14.42,27.56
vgg11_bn,224,1024.0,2746.28,372.857,7.62,7.44,132.87
resmlp_24_224,224,1024.0,2745.97,372.9,5.96,10.91,30.02
resnetv2_50x1_bit,224,1024.0,2742.41,373.383,4.23,11.11,25.55
eca_resnet33ts,288,1024.0,2737.24,374.089,6.02,14.76,19.68
efficientnetv2_rw_t,288,1024.0,2736.91,374.133,3.19,16.42,13.65
seresnet33ts,288,1024.0,2734.83,374.417,6.02,14.76,19.78
nfnet_f0,192,1024.0,2731.03,374.934,7.21,10.16,71.49
res2net50_14w_8s,224,1024.0,2724.75,375.804,4.21,13.28,25.06
visformer_small,224,1024.0,2720.95,376.328,4.88,11.43,40.22
ese_vovnet57b,224,1024.0,2711.8,377.598,8.95,7.52,38.61
gcresnet33ts,288,1024.0,2705.39,378.493,6.02,14.78,19.88
cspresnet50d,256,1024.0,2702.61,378.881,4.86,12.55,21.64
twins_svt_small,224,1024.0,2696.15,379.788,2.82,10.7,24.06
efficientvit_l1,224,1024.0,2692.51,380.303,5.27,15.85,52.65
resnetblur50,224,1024.0,2689.65,380.707,5.16,12.02,25.56
seresnetaa50d,224,1024.0,2682.26,381.757,5.4,12.46,28.11
fbnetv3_g,288,1024.0,2673.23,383.046,1.77,21.09,16.62
cspresnet50w,256,1024.0,2671.97,383.228,5.04,12.19,28.12
dla60_res2net,224,1024.0,2669.84,383.53,4.15,12.34,20.85
convnext_nano,288,1024.0,2669.05,383.645,4.06,13.84,15.59
gc_efficientnetv2_rw_t,288,1024.0,2659.37,385.042,3.2,16.45,13.68
gcvit_xxtiny,224,1024.0,2658.4,385.182,2.14,15.36,12.0
poolformerv2_s12,224,1024.0,2624.04,390.223,1.83,5.53,11.89
vit_relpos_medium_patch16_rpn_224,224,1024.0,2618.88,390.989,7.5,12.13,38.73
mobileone_s3,224,1024.0,2616.83,391.296,1.94,13.85,10.17
davit_tiny,224,1024.0,2612.7,391.92,4.47,17.08,28.36
vit_relpos_medium_patch16_224,224,1024.0,2603.89,393.246,7.5,12.13,38.75
resnet51q,256,1024.0,2602.52,393.454,6.38,16.55,35.7
gmixer_24_224,224,1024.0,2594.59,394.657,5.28,14.45,24.72
maxvit_pico_rw_256,256,768.0,2593.58,296.105,1.68,18.77,7.46
vit_srelpos_medium_patch16_224,224,1024.0,2591.17,395.176,7.49,11.32,38.74
vit_relpos_medium_patch16_cls_224,224,1024.0,2587.16,395.789,7.55,13.3,38.76
maxvit_rmlp_pico_rw_256,256,768.0,2587.02,296.857,1.69,21.32,7.52
nf_regnet_b3,320,1024.0,2582.41,396.514,2.05,14.61,18.59
res2net50d,224,1024.0,2577.65,397.25,4.52,13.41,25.72
cs3darknet_focus_x,256,1024.0,2569.33,398.536,8.03,10.69,35.02
densenetblur121d,224,1024.0,2559.52,400.063,3.11,7.9,8.0
inception_v3,299,1024.0,2546.29,402.143,5.73,8.97,23.83
coatnet_0_rw_224,224,1024.0,2545.57,402.256,4.23,15.1,27.44
repvgg_b1g4,224,1024.0,2545.06,402.332,8.15,10.64,39.97
regnetx_032,224,1024.0,2534.07,404.077,3.2,11.37,15.3
twins_pcpvt_small,224,1024.0,2533.92,404.104,3.68,15.51,24.11
resnetblur50d,224,1024.0,2528.9,404.909,5.4,12.82,25.58
rexnet_200,224,1024.0,2519.88,406.358,1.56,14.91,16.37
resnetrs101,192,1024.0,2505.12,408.751,6.04,12.7,63.62
resnet26t,320,1024.0,2502.87,409.119,5.24,16.44,16.01
nf_ecaresnet50,224,1024.0,2502.03,409.253,4.21,11.13,25.56
convnext_nano_ols,288,1024.0,2497.73,409.961,4.38,15.5,15.65
convnextv2_nano,224,1024.0,2497.72,409.963,2.46,8.37,15.62
nf_seresnet50,224,1024.0,2494.79,410.425,4.21,11.13,28.09
regnety_032,224,1024.0,2483.68,412.275,3.2,11.26,19.44
vit_medium_patch16_gap_240,240,1024.0,2477.36,413.332,8.6,12.57,44.4
cs3darknet_x,256,1024.0,2475.51,413.641,8.38,11.35,35.05
densenet169,224,1024.0,2463.83,415.603,3.4,7.3,14.15
xcit_small_12_p16_224,224,1024.0,2460.07,416.237,4.82,12.57,26.25
cspresnext50,256,1024.0,2452.36,417.546,4.05,15.86,20.57
mobilevit_s,256,1024.0,2447.35,418.395,1.86,17.03,5.58
darknet53,256,1024.0,2439.82,419.693,9.31,12.39,41.61
darknetaa53,256,1024.0,2432.07,421.03,7.97,12.39,36.02
edgenext_small,320,1024.0,2429.25,421.516,1.97,14.16,5.59
seresnext26t_32x4d,288,1024.0,2412.74,424.404,4.46,16.68,16.81
sehalonet33ts,256,1024.0,2403.77,425.986,3.55,14.7,13.69
seresnext26d_32x4d,288,1024.0,2391.16,428.231,4.51,16.85,16.81
resnet61q,256,1024.0,2368.17,432.39,7.8,17.01,36.85
fastvit_t12,256,1024.0,2356.34,434.562,1.42,12.42,7.55
vit_base_r26_s32_224,224,1024.0,2354.84,434.838,6.76,11.54,101.38
focalnet_tiny_srf,224,1024.0,2353.35,435.113,4.42,16.32,28.43
resnetv2_101,224,1024.0,2342.24,437.176,7.83,16.23,44.54
cs3sedarknet_x,256,1024.0,2329.01,439.66,8.38,11.35,35.4
nf_resnet50,256,1024.0,2318.52,441.645,5.46,14.52,25.56
xcit_nano_12_p8_224,224,1024.0,2310.67,443.15,2.16,15.71,3.05
resnest26d,224,1024.0,2309.28,443.418,3.64,9.97,17.07
coatnet_rmlp_nano_rw_224,224,1024.0,2308.34,443.598,2.51,18.21,15.15
resnetv2_50,288,1024.0,2302.9,444.644,6.79,18.37,25.55
ecaresnet50t,256,1024.0,2299.59,445.285,5.64,15.45,25.57
gmlp_s16_224,224,1024.0,2291.16,446.925,4.42,15.1,19.42
efficientnet_lite3,300,1024.0,2290.17,447.117,1.65,21.85,8.2
dm_nfnet_f0,192,1024.0,2271.28,450.836,7.21,10.16,71.49
resnet101,224,1024.0,2263.99,452.287,7.83,16.23,44.55
ecaresnet26t,320,1024.0,2258.47,453.393,5.24,16.44,16.01
edgenext_base,256,1024.0,2256.96,453.695,3.85,15.58,18.51
efficientnetv2_s,288,1024.0,2251.36,454.825,4.75,20.13,21.46
skresnet50,224,1024.0,2250.82,454.933,4.11,12.5,25.8
dla102,224,1024.0,2248.24,455.455,7.19,14.18,33.27
edgenext_small_rw,320,1024.0,2240.98,456.929,2.46,14.85,7.83
ecaresnetlight,288,1024.0,2235.21,458.11,6.79,13.91,30.16
dpn68b,288,1024.0,2234.13,458.331,3.89,17.3,12.61
gcresnext50ts,288,1024.0,2232.45,458.676,4.75,19.57,15.67
fastvit_s12,256,1024.0,2229.72,459.239,1.82,13.67,9.47
fastvit_sa12,256,1024.0,2225.03,460.206,1.96,13.83,11.58
focalnet_tiny_lrf,224,1024.0,2222.33,460.766,4.49,17.76,28.65
resnetv2_101d,224,1024.0,2216.51,461.976,8.07,17.04,44.56
resnet101c,224,1024.0,2202.12,464.995,8.08,17.04,44.57
vit_base_resnet50d_224,224,1024.0,2199.36,465.578,8.68,16.1,110.97
regnetv_040,288,1024.0,2190.89,467.375,6.6,20.3,20.64
vit_medium_patch16_gap_256,256,1024.0,2190.03,467.563,9.78,14.29,38.86
resnet50,288,1024.0,2185.5,468.532,6.8,18.37,25.56
gcresnet50t,288,1024.0,2180.99,469.5,6.86,18.57,25.9
regnety_040,288,1024.0,2169.28,472.031,6.61,20.3,20.65
vgg13,224,1024.0,2159.6,474.15,11.31,12.25,133.05
eva02_small_patch14_224,224,1024.0,2151.59,475.915,5.53,12.34,21.62
vit_medium_patch16_reg4_gap_256,256,1024.0,2149.02,476.485,9.93,14.51,38.87
efficientnetv2_rw_s,288,1024.0,2146.83,476.971,4.91,21.41,23.94
ecaresnet101d_pruned,288,1024.0,2141.83,478.084,5.75,12.71,24.88
mobilevitv2_125,256,1024.0,2139.71,478.555,2.86,20.1,7.48
vit_medium_patch16_reg4_256,256,1024.0,2136.17,479.352,9.97,14.56,38.87
skresnet50d,224,1024.0,2134.1,479.815,4.36,13.31,25.82
pvt_v2_b2,224,1024.0,2119.72,483.066,3.9,24.96,25.36
hrnet_w18_ssld,224,1024.0,2114.47,484.27,4.32,16.31,21.3
convnextv2_pico,288,1024.0,2113.62,484.464,2.27,10.08,9.07
eva02_tiny_patch14_336,336,1024.0,2113.11,484.582,3.14,13.85,5.76
efficientvit_l2,224,1024.0,2109.14,485.494,6.97,19.58,63.71
hrnet_w18,224,1024.0,2100.77,487.428,4.32,16.31,21.3
regnetx_040,224,1024.0,2099.85,487.636,3.99,12.2,22.12
tf_efficientnet_lite3,300,1024.0,2090.5,489.823,1.65,21.85,8.2
wide_resnet50_2,224,1024.0,2081.66,491.904,11.43,14.4,68.88
resnet51q,288,1024.0,2069.71,494.744,8.07,20.94,35.7
poolformer_s24,224,1024.0,2067.46,495.278,3.41,10.68,21.39
sebotnet33ts_256,256,512.0,2066.45,247.758,3.89,17.46,13.7
efficientformer_l3,224,1024.0,2064.62,495.963,3.93,12.01,31.41
resnest50d_1s4x24d,224,1024.0,2057.55,497.667,4.43,13.57,25.68
gcvit_xtiny,224,1024.0,2053.45,498.662,2.93,20.26,19.98
cspdarknet53,256,1024.0,2048.51,499.863,6.57,16.81,27.64
crossvit_18_240,240,1024.0,2029.53,504.539,8.21,16.14,43.27
mixnet_xl,224,1024.0,2029.05,504.653,0.93,14.57,11.9
vit_base_patch32_384,384,1024.0,2028.15,504.881,12.67,12.14,88.3
efficientnet_b3,288,1024.0,2027.72,504.989,1.63,21.49,12.23
vit_base_patch32_clip_384,384,1024.0,2026.31,505.34,12.67,12.14,88.3
resnet50t,288,1024.0,2024.16,505.879,7.14,19.53,25.57
dla102x,224,1024.0,2023.35,506.08,5.89,19.42,26.31
legacy_seresnet101,224,1024.0,2012.58,508.788,7.61,15.74,49.33
resnet50d,288,1024.0,2012.14,508.9,7.19,19.7,25.58
cs3edgenet_x,256,1024.0,2002.36,511.384,11.53,12.92,47.82
resnetaa101d,224,1024.0,1994.67,513.346,9.12,17.56,44.57
repvgg_b1,224,1024.0,1994.42,513.418,13.16,10.64,57.42
res2net50_26w_6s,224,1024.0,1979.48,517.295,6.33,15.28,37.05
regnetz_d32,256,1024.0,1978.14,517.642,5.98,23.74,27.58
cs3sedarknet_xdw,256,1024.0,1970.5,519.653,5.97,17.18,21.6
resnetaa50,288,1024.0,1968.61,520.152,8.52,19.24,25.56
seresnet101,224,1024.0,1966.15,520.803,7.84,16.27,49.33
resnet101s,224,1024.0,1964.56,521.226,9.19,18.64,44.67
cs3darknet_x,288,1024.0,1958.87,522.739,10.6,14.36,35.05
crossvit_18_dagger_240,240,1024.0,1955.55,523.625,8.65,16.91,44.27
swin_tiny_patch4_window7_224,224,1024.0,1951.67,524.668,4.51,17.06,28.29
tresnet_v2_l,224,1024.0,1947.69,525.738,8.85,16.34,46.17
ese_vovnet39b,288,1024.0,1941.03,527.543,11.71,11.13,24.57
regnetz_d8,256,1024.0,1940.13,527.785,3.97,23.74,23.37
tf_efficientnetv2_s,300,1024.0,1939.51,527.958,5.35,22.73,21.46
regnetz_c16,320,1024.0,1933.29,529.65,3.92,25.88,13.46
coatnet_bn_0_rw_224,224,1024.0,1926.49,531.525,4.48,18.41,27.44
darknet53,288,1024.0,1924.44,532.092,11.78,15.68,41.61
resnext101_32x4d,224,1024.0,1923.83,532.261,8.01,21.23,44.18
coatnet_rmlp_0_rw_224,224,1024.0,1920.22,533.259,4.52,21.26,27.45
xcit_tiny_12_p16_384,384,1024.0,1917.57,533.997,3.64,18.25,6.72
darknetaa53,288,1024.0,1915.93,534.454,10.08,15.68,36.02
mobileone_s4,224,1024.0,1915.84,534.474,3.04,17.74,14.95
maxxvit_rmlp_nano_rw_256,256,768.0,1913.61,401.326,4.17,21.53,16.78
nest_tiny,224,1024.0,1909.31,536.303,5.24,14.75,17.06
regnetz_040,256,1024.0,1906.99,536.946,4.06,24.19,27.12
nf_regnet_b4,320,1024.0,1906.99,536.957,3.29,19.88,30.21
seresnet50,288,1024.0,1902.22,538.306,6.8,18.39,28.09
pvt_v2_b2_li,224,1024.0,1897.86,539.539,3.77,25.04,22.55
regnetz_040_h,256,1024.0,1896.27,539.981,4.12,24.29,28.94
densenet201,224,1024.0,1895.14,540.319,4.34,7.85,20.01
halonet50ts,256,1024.0,1887.53,542.495,5.3,19.2,22.73
nest_tiny_jx,224,1024.0,1885.06,543.199,5.24,14.75,17.06
vgg13_bn,224,1024.0,1884.94,543.241,11.33,12.25,133.05
regnetx_080,224,1024.0,1883.47,543.661,8.02,14.06,39.57
vit_large_patch32_224,224,1024.0,1882.39,543.977,15.27,11.11,305.51
ecaresnet101d,224,1024.0,1880.92,544.404,8.08,17.07,44.57
resnet61q,288,1024.0,1874.14,546.373,9.87,21.52,36.85
nf_resnet101,224,1024.0,1864.42,549.218,8.01,16.23,44.55
cs3se_edgenet_x,256,1024.0,1859.86,550.568,11.53,12.94,50.72
repvit_m2_3,224,1024.0,1852.95,552.61,4.57,26.21,23.69
resmlp_36_224,224,1024.0,1843.66,555.406,8.91,16.33,44.69
cs3sedarknet_x,288,1024.0,1843.16,555.556,10.6,14.37,35.4
resnext50_32x4d,288,1024.0,1841.23,556.139,7.04,23.81,25.03
convnext_small,224,1024.0,1838.66,556.915,8.71,21.56,50.22
convnext_tiny,288,1024.0,1835.18,557.972,7.39,22.21,28.59
resnetv2_50d_gn,224,1024.0,1829.29,559.767,4.38,11.92,25.57
resnetaa50d,288,1024.0,1827.2,560.408,8.92,20.57,25.58
pit_b_224,224,1024.0,1823.77,561.458,10.56,16.6,73.76
eca_nfnet_l0,288,1024.0,1822.69,561.796,7.12,17.29,24.14
nfnet_l0,288,1024.0,1817.7,563.332,7.13,17.29,35.07
sequencer2d_s,224,1024.0,1816.41,563.738,4.96,11.31,27.65
pit_b_distilled_224,224,1024.0,1810.4,565.6,10.63,16.67,74.79
nf_resnet50,288,1024.0,1794.38,570.655,6.88,18.37,25.56
twins_pcpvt_base,224,1024.0,1790.37,571.935,6.46,21.35,43.83
rexnetr_200,288,768.0,1782.92,430.745,2.62,24.96,16.52
seresnet50t,288,1024.0,1780.59,575.079,7.14,19.55,28.1
cait_xxs24_224,224,1024.0,1779.24,575.513,2.53,20.29,11.96
swin_s3_tiny_224,224,1024.0,1777.31,576.139,4.64,19.13,28.33
resnet50_gn,224,1024.0,1776.88,576.279,4.14,11.11,25.56
ecaresnet50d,288,1024.0,1775.84,576.616,7.19,19.72,25.58
resnetblur101d,224,1024.0,1765.86,579.878,9.12,17.94,44.57
densenet121,288,1024.0,1761.12,581.437,4.74,11.41,7.98
coat_lite_small,224,1024.0,1760.12,581.767,3.96,22.09,19.84
mixer_b16_224,224,1024.0,1758.48,582.299,12.62,14.53,59.88
mobilevitv2_150,256,768.0,1748.31,439.266,4.09,24.11,10.59
efficientvit_b3,224,1024.0,1742.56,587.628,3.99,26.9,48.65
rexnetr_300,224,1024.0,1736.82,589.571,3.39,22.16,34.81
vgg16,224,1024.0,1730.88,591.595,15.47,13.56,138.36
maxxvitv2_nano_rw_256,256,768.0,1724.32,445.384,6.12,19.66,23.7
res2net101_26w_4s,224,1024.0,1723.01,594.296,8.1,18.45,45.21
resnext50d_32x4d,288,1024.0,1717.01,596.374,7.44,25.13,25.05
maxvit_nano_rw_256,256,768.0,1709.05,449.363,4.26,25.76,15.45
legacy_seresnext101_32x4d,224,1024.0,1707.02,599.865,8.02,21.26,48.96
seresnext101_32x4d,224,1024.0,1706.74,599.963,8.02,21.26,48.96
maxvit_rmlp_nano_rw_256,256,768.0,1705.93,450.183,4.28,27.4,15.5
resnetv2_50d_frn,224,1024.0,1703.71,601.028,4.33,11.92,25.59
mobilevitv2_175,256,512.0,1701.95,300.817,5.54,28.13,14.25
tf_efficientnet_b3,300,1024.0,1694.25,604.385,1.87,23.83,12.23
convnext_tiny_hnf,288,1024.0,1681.52,608.96,7.39,22.21,28.59
ese_vovnet39b_evos,224,1024.0,1671.22,612.716,7.07,6.74,24.58
res2net50_26w_8s,224,1024.0,1656.9,618.009,8.37,17.95,48.4
resnet101d,256,1024.0,1654.59,618.871,10.55,22.25,44.57
tresnet_l,224,1024.0,1652.13,619.794,10.9,11.9,55.99
res2net101d,224,1024.0,1652.09,619.808,8.35,19.25,45.23
mixer_l32_224,224,1024.0,1651.22,620.129,11.27,19.86,206.94
regnetz_b16_evos,224,1024.0,1648.87,621.016,1.43,9.95,9.74
botnet50ts_256,256,512.0,1645.51,311.14,5.54,22.23,22.74
efficientnet_b3,320,1024.0,1641.76,623.708,2.01,26.52,12.23
seresnext50_32x4d,288,1024.0,1638.34,625.012,7.04,23.82,27.56
coatnet_0_224,224,512.0,1634.58,313.22,4.43,21.14,25.04
swinv2_cr_tiny_224,224,1024.0,1629.27,628.491,4.66,28.45,28.33
inception_next_small,224,1024.0,1628.58,628.755,8.36,19.27,49.37
resnetv2_152,224,1024.0,1628.46,628.801,11.55,22.56,60.19
regnetx_064,224,1024.0,1628.2,628.898,6.49,16.37,26.21
hrnet_w32,224,1024.0,1627.55,629.157,8.97,22.02,41.23
convnextv2_tiny,224,1024.0,1627.26,629.266,4.47,13.44,28.64
seresnetaa50d,288,1024.0,1622.33,631.178,8.92,20.59,28.11
davit_small,224,1024.0,1614.32,634.313,8.69,27.54,49.75
regnety_040_sgn,224,1024.0,1612.57,634.996,4.03,12.29,20.65
legacy_xception,299,768.0,1604.43,478.663,8.4,35.83,22.86
swinv2_cr_tiny_ns_224,224,1024.0,1600.49,639.793,4.66,28.45,28.33
resnetblur50,288,1024.0,1598.7,640.511,8.52,19.87,25.56
efficientnet_el,300,1024.0,1595.26,641.889,8.0,30.7,10.59
efficientnet_el_pruned,300,1024.0,1592.53,642.988,8.0,30.7,10.59
resnet152,224,1024.0,1589.58,644.183,11.56,22.56,60.19
deit_base_patch16_224,224,1024.0,1581.19,647.603,16.87,16.49,86.57
cs3edgenet_x,288,1024.0,1577.26,649.216,14.59,16.36,47.82
deit_base_distilled_patch16_224,224,1024.0,1575.74,649.842,16.95,16.58,87.34
vit_base_patch16_224,224,1024.0,1574.94,650.173,16.87,16.49,86.57
vit_base_patch16_224_miil,224,1024.0,1574.63,650.301,16.88,16.5,94.4
vit_base_patch16_clip_224,224,1024.0,1574.46,650.371,16.87,16.49,86.57
vit_base_patch16_siglip_224,224,1024.0,1571.54,651.577,17.02,16.71,92.88
resnetv2_152d,224,1024.0,1564.52,654.501,11.8,23.36,60.2
vit_base_patch16_gap_224,224,1024.0,1563.13,655.085,16.78,16.41,86.57
halo2botnet50ts_256,256,1024.0,1562.09,655.52,5.02,21.78,22.64
resnet152c,224,1024.0,1558.11,657.195,11.8,23.36,60.21
ese_vovnet99b,224,1024.0,1554.99,658.512,16.51,11.27,63.2
vit_small_resnet50d_s16_224,224,1024.0,1551.97,659.792,13.0,21.12,57.53
nf_seresnet101,224,1024.0,1549.92,660.662,8.02,16.27,49.33
nf_ecaresnet101,224,1024.0,1549.88,660.683,8.01,16.27,44.55
tf_efficientnet_el,300,1024.0,1543.58,663.384,8.0,30.7,10.59
coatnet_rmlp_1_rw_224,224,1024.0,1542.97,663.643,7.44,28.08,41.69
nfnet_f0,256,1024.0,1541.8,664.144,12.62,18.05,71.49
vgg16_bn,224,1024.0,1533.25,667.85,15.5,13.56,138.37
resnest50d,224,1024.0,1530.42,669.084,5.4,14.36,27.48
caformer_s18,224,1024.0,1528.28,670.023,3.9,15.18,26.34
pvt_v2_b3,224,1024.0,1527.57,670.328,6.71,33.8,45.24
densenetblur121d,288,1024.0,1521.38,673.062,5.14,13.06,8.0
maxvit_tiny_rw_224,224,768.0,1520.98,504.928,4.93,28.54,29.06
mvitv2_tiny,224,1024.0,1518.09,674.509,4.7,21.16,24.17
vit_base_patch16_rpn_224,224,1024.0,1516.7,675.134,16.78,16.41,86.54
convnextv2_nano,288,768.0,1514.74,507.006,4.06,13.84,15.62
regnety_032,288,1024.0,1514.59,676.077,5.29,18.61,19.44
rexnet_300,224,1024.0,1508.74,678.701,3.44,22.4,34.71
resnetblur50d,288,1024.0,1506.45,679.732,8.92,21.19,25.58
deit3_base_patch16_224,224,1024.0,1497.14,683.959,16.87,16.49,86.59
convit_small,224,1024.0,1494.54,685.148,5.76,17.87,27.78
vit_base_patch32_clip_448,448,1024.0,1493.83,685.476,17.21,16.49,88.34
dla169,224,1024.0,1487.25,688.504,11.6,20.2,53.39
skresnext50_32x4d,224,1024.0,1470.99,696.12,4.5,17.18,27.48
xcit_tiny_12_p8_224,224,1024.0,1465.13,698.903,4.81,23.6,6.71
vit_small_patch16_36x1_224,224,1024.0,1460.65,701.044,12.63,24.59,64.67
ecaresnet50t,320,1024.0,1451.46,705.484,8.82,24.13,25.57
beitv2_base_patch16_224,224,1024.0,1448.02,707.161,16.87,16.49,86.53
vgg19,224,1024.0,1441.93,710.149,19.63,14.86,143.67
beit_base_patch16_224,224,1024.0,1440.48,710.862,16.87,16.49,86.53
hrnet_w30,224,1024.0,1436.17,712.996,8.15,21.21,37.71
edgenext_base,320,1024.0,1435.98,713.087,6.01,24.32,18.51
resnet152s,224,1024.0,1434.4,713.876,12.92,24.96,60.32
convformer_s18,224,1024.0,1427.19,717.481,3.96,15.82,26.77
resnetv2_50d_evos,224,1024.0,1426.57,717.793,4.33,11.92,25.59
focalnet_small_srf,224,1024.0,1426.35,717.904,8.62,26.26,49.89
sequencer2d_m,224,1024.0,1413.9,724.228,6.55,14.26,38.31
vit_relpos_base_patch16_rpn_224,224,1024.0,1408.36,727.069,16.8,17.63,86.41
volo_d1_224,224,1024.0,1407.83,727.348,6.94,24.43,26.63
regnety_080,224,1024.0,1407.5,727.512,8.0,17.97,39.18
vit_small_patch16_18x2_224,224,1024.0,1407.09,727.729,12.63,24.59,64.67
gcvit_tiny,224,1024.0,1405.32,728.65,4.79,29.82,28.22
dpn92,224,1024.0,1404.08,729.292,6.54,18.21,37.67
vit_relpos_base_patch16_224,224,1024.0,1402.98,729.864,16.8,17.63,86.43
resnetv2_101,288,1024.0,1402.28,730.227,12.94,26.83,44.54
regnetx_160,224,1024.0,1400.84,730.974,15.99,25.52,54.28
dla102x2,224,1024.0,1395.12,733.975,9.34,29.91,41.28
legacy_seresnet152,224,1024.0,1394.86,734.109,11.33,22.08,66.82
vit_relpos_base_patch16_clsgap_224,224,1024.0,1394.83,734.131,16.88,17.72,86.43
vit_relpos_base_patch16_cls_224,224,1024.0,1392.12,735.556,16.88,17.72,86.43
vit_small_patch16_384,384,1024.0,1390.73,736.291,12.45,24.15,22.2
poolformer_s36,224,1024.0,1388.46,737.493,5.0,15.82,30.86
vit_base_patch16_clip_quickgelu_224,224,1024.0,1388.13,737.672,16.87,16.49,86.19
densenet161,224,1024.0,1384.23,739.75,7.79,11.06,28.68
flexivit_base,240,1024.0,1380.45,741.777,19.35,18.92,86.59
efficientformerv2_s0,224,1024.0,1377.72,743.244,0.41,5.3,3.6
seresnet152,224,1024.0,1371.27,746.737,11.57,22.61,66.82
poolformerv2_s24,224,1024.0,1356.43,754.905,3.42,10.68,21.34
resnet101,288,1024.0,1354.29,756.102,12.95,26.83,44.55
focalnet_small_lrf,224,1024.0,1339.63,764.378,8.74,28.61,50.34
inception_v4,299,1024.0,1338.22,765.183,12.28,15.09,42.68
repvgg_b2,224,1024.0,1336.97,765.895,20.45,12.9,89.02
nf_regnet_b4,384,1024.0,1327.28,771.488,4.7,28.61,30.21
repvgg_b2g4,224,1024.0,1323.55,773.658,12.63,12.9,61.76
eca_nfnet_l1,256,1024.0,1319.97,775.763,9.62,22.04,41.41
fastvit_sa24,256,1024.0,1310.4,781.428,3.79,23.92,21.55
xcit_small_24_p16_224,224,1024.0,1307.21,783.335,9.1,23.63,47.67
twins_pcpvt_large,224,1024.0,1303.57,785.524,9.53,30.21,60.99
vit_base_patch16_xp_224,224,1024.0,1302.82,785.975,16.85,16.49,86.51
maxvit_tiny_tf_224,224,768.0,1301.05,590.28,5.42,31.21,30.92
deit3_small_patch16_384,384,1024.0,1298.34,788.686,12.45,24.15,22.21
coatnet_rmlp_1_rw2_224,224,1024.0,1296.36,789.892,7.71,32.74,41.72
coatnet_1_rw_224,224,1024.0,1295.8,790.234,7.63,27.22,41.72
regnety_080_tv,224,1024.0,1291.63,792.778,8.51,19.73,39.38
vgg19_bn,224,1024.0,1290.82,793.286,19.66,14.86,143.68
mixnet_xxl,224,768.0,1286.88,596.774,2.04,23.43,23.96
dm_nfnet_f0,256,1024.0,1286.75,795.79,12.62,18.05,71.49
efficientnet_b4,320,768.0,1280.17,599.91,3.13,34.76,19.34
hrnet_w18_ssld,288,1024.0,1279.49,800.308,7.14,26.96,21.3
maxxvit_rmlp_tiny_rw_256,256,768.0,1274.84,602.417,6.36,32.69,29.64
efficientformerv2_s1,224,1024.0,1271.59,805.28,0.67,7.66,6.19
convnext_base,224,1024.0,1268.86,807.011,15.38,28.75,88.59
mobilevitv2_200,256,512.0,1268.57,403.59,7.22,32.15,18.45
regnetz_d32,320,1024.0,1265.97,808.844,9.33,37.08,27.58
efficientnetv2_s,384,1024.0,1265.12,809.401,8.44,35.77,21.46
twins_svt_base,224,1024.0,1261.93,811.442,8.36,20.42,56.07
wide_resnet50_2,288,1024.0,1242.89,823.878,18.89,23.81,68.88
regnetz_d8,320,1024.0,1242.36,824.221,6.19,37.08,23.37
regnetz_040,320,512.0,1238.82,413.274,6.35,37.78,27.12
regnetz_040_h,320,512.0,1231.07,415.879,6.43,37.94,28.94
nest_small,224,1024.0,1230.37,832.252,9.41,22.88,38.35
tf_efficientnetv2_s,384,1024.0,1224.58,836.191,8.44,35.77,21.46
nest_small_jx,224,1024.0,1220.76,838.798,9.41,22.88,38.35
maxvit_tiny_rw_256,256,768.0,1213.37,632.937,6.44,37.27,29.07
maxvit_rmlp_tiny_rw_256,256,768.0,1210.44,634.468,6.47,39.84,29.15
vit_base_patch16_siglip_256,256,1024.0,1208.23,847.511,22.23,21.83,92.93
efficientnetv2_rw_s,384,1024.0,1208.22,847.514,8.72,38.03,23.94
resnetaa101d,288,1024.0,1207.75,847.844,15.07,29.03,44.57
swin_small_patch4_window7_224,224,1024.0,1206.81,848.507,8.77,27.47,49.61
dpn98,224,1024.0,1206.02,849.061,11.73,25.2,61.57
swinv2_tiny_window8_256,256,1024.0,1197.34,855.217,5.96,24.57,28.35
cs3se_edgenet_x,320,1024.0,1196.49,855.827,18.01,20.21,50.72
resnext101_64x4d,224,1024.0,1196.17,856.053,15.52,31.21,83.46
cait_xxs36_224,224,1024.0,1193.04,858.302,3.77,30.34,17.3
resnext101_32x8d,224,1024.0,1188.06,861.896,16.48,31.21,88.79
seresnet101,288,1024.0,1178.9,868.597,12.95,26.87,49.33
resnet152d,256,1024.0,1177.58,869.569,15.41,30.51,60.21
wide_resnet101_2,224,1024.0,1172.43,873.387,22.8,21.23,126.89
crossvit_base_240,240,1024.0,1171.25,874.269,20.13,22.67,105.03
resnet200,224,1024.0,1159.72,882.961,15.07,32.19,64.67
inception_resnet_v2,299,1024.0,1156.1,885.722,13.18,25.06,55.84
rexnetr_300,288,512.0,1153.3,443.932,5.59,36.61,34.81
resnetrs101,288,1024.0,1142.76,896.066,13.56,28.53,63.62
davit_base,224,1024.0,1141.57,896.996,15.36,36.72,87.95
tresnet_xl,224,1024.0,1136.08,901.333,15.2,15.34,78.44
coat_tiny,224,1024.0,1135.01,902.184,4.35,27.2,5.5
tnt_s_patch16_224,224,1024.0,1134.91,902.262,5.24,24.37,23.76
mvitv2_small,224,1024.0,1131.08,905.308,7.0,28.08,34.87
ecaresnet101d,288,1024.0,1130.54,905.749,13.35,28.19,44.57
vit_base_patch16_reg8_gap_256,256,1024.0,1124.62,910.517,22.6,22.09,86.62
maxvit_tiny_pm_256,256,768.0,1121.86,684.565,6.31,40.82,30.09
hrnet_w40,224,1024.0,1119.9,914.356,12.75,25.29,57.56
convnext_small,288,1024.0,1119.4,914.761,14.39,35.65,50.22
nfnet_f1,224,1024.0,1117.42,916.384,17.87,22.94,132.63
efficientnet_lite4,380,768.0,1117.23,687.403,4.04,45.66,13.01
pvt_v2_b4,224,1024.0,1107.81,924.328,9.83,48.14,62.56
seresnext101_64x4d,224,1024.0,1107.71,924.416,15.53,31.25,88.23
seresnext101_32x8d,224,1024.0,1101.53,929.602,16.48,31.25,93.57
resnetv2_50d_gn,288,1024.0,1100.54,930.437,7.24,19.7,25.57
coatnet_1_224,224,512.0,1098.68,466.003,8.28,31.3,42.23
repvgg_b3g4,224,1024.0,1097.61,932.923,17.89,15.1,83.83
samvit_base_patch16_224,224,1024.0,1097.38,933.118,16.83,17.2,86.46
eva02_base_patch16_clip_224,224,1024.0,1094.75,935.361,16.9,18.91,86.26
mvitv2_small_cls,224,1024.0,1086.56,942.407,7.04,28.17,34.87
vit_large_r50_s32_224,224,1024.0,1082.13,946.268,19.45,22.22,328.99
inception_next_base,224,1024.0,1079.66,948.435,14.85,25.69,86.67
resnet50_gn,288,1024.0,1076.3,951.4,6.85,18.37,25.56
pvt_v2_b5,224,1024.0,1073.94,953.474,11.39,44.23,81.96
seresnext101d_32x8d,224,1024.0,1071.41,955.74,16.72,32.05,93.59
efficientnetv2_m,320,1024.0,1070.2,956.818,11.01,39.97,54.14
vit_small_r26_s32_384,384,1024.0,1066.07,960.526,10.24,27.67,36.47
resnetblur101d,288,1024.0,1059.66,966.334,15.07,29.65,44.57
resnet101d,320,1024.0,1045.1,979.801,16.48,34.77,44.57
regnetz_e8,256,1024.0,1042.94,981.82,9.91,40.94,57.7
tf_efficientnet_lite4,380,768.0,1038.99,739.169,4.04,45.66,13.01
xception41p,299,768.0,1034.81,742.157,9.25,39.86,26.91
repvgg_b3,224,1024.0,1031.23,992.974,29.16,15.1,123.09
xcit_tiny_24_p16_384,384,1024.0,1026.84,997.227,6.87,34.29,12.12
resnetrs152,256,1024.0,1024.28,999.711,15.59,30.83,86.62
seresnet152d,256,1024.0,1022.13,1001.814,15.42,30.56,66.84
swinv2_cr_small_224,224,1024.0,1005.65,1018.232,9.07,50.27,49.7
vit_base_patch16_plus_240,240,1024.0,1004.91,1018.982,26.31,22.07,117.56
regnetz_b16_evos,288,768.0,997.65,769.796,2.36,16.43,9.74
focalnet_base_srf,224,1024.0,995.12,1029.007,15.28,35.01,88.15
swinv2_cr_small_ns_224,224,1024.0,993.65,1030.528,9.08,50.27,49.7
convnextv2_small,224,1024.0,992.07,1032.17,8.71,21.56,50.32
convnextv2_tiny,288,768.0,989.58,776.074,7.39,22.21,28.64
vit_small_patch8_224,224,1024.0,985.02,1039.56,16.76,32.86,21.67
regnety_040_sgn,288,1024.0,979.5,1045.407,6.67,20.3,20.65
regnetz_c16_evos,256,768.0,978.11,785.174,2.48,16.57,13.49
vit_base_r50_s16_224,224,1024.0,971.42,1054.108,20.94,27.88,97.89
hrnet_w44,224,1024.0,967.41,1058.48,14.94,26.92,67.06
efficientformer_l7,224,1024.0,966.26,1059.742,10.17,24.45,82.23
hrnet_w48_ssld,224,1024.0,963.59,1062.678,17.34,28.56,77.47
hrnet_w48,224,1024.0,962.72,1063.645,17.34,28.56,77.47
poolformer_m36,224,1024.0,959.97,1066.674,8.8,22.02,56.17
resnet152,288,1024.0,955.06,1072.17,19.11,37.28,60.19
cait_s24_224,224,1024.0,951.69,1075.97,9.35,40.58,46.92
tiny_vit_21m_384,384,512.0,946.04,541.193,11.94,46.84,21.23
focalnet_base_lrf,224,1024.0,946.02,1082.418,15.43,38.13,88.75
dm_nfnet_f1,224,1024.0,943.8,1084.958,17.87,22.94,132.63
efficientnet_b3_gn,288,512.0,943.58,542.602,1.74,23.35,11.73
efficientnetv2_rw_m,320,1024.0,934.42,1095.856,12.72,47.14,53.24
vit_relpos_base_patch16_plus_240,240,1024.0,933.99,1096.357,26.21,23.41,117.38
gmlp_b16_224,224,1024.0,931.13,1099.724,15.78,30.21,73.08
fastvit_sa36,256,1024.0,928.53,1102.809,5.62,34.02,31.53
xception41,299,768.0,927.7,827.842,9.28,39.86,26.97
eva02_small_patch14_336,336,1024.0,926.94,1104.696,12.41,27.7,22.13
maxvit_rmlp_small_rw_224,224,768.0,923.72,831.408,10.48,42.44,64.9
sequencer2d_l,224,1024.0,917.56,1115.991,9.74,22.12,54.3
poolformerv2_s36,224,1024.0,914.51,1119.704,5.01,15.82,30.79
xcit_medium_24_p16_224,224,1024.0,901.57,1135.786,16.13,31.71,84.4
coat_mini,224,1024.0,900.78,1136.787,6.82,33.68,10.34
coat_lite_medium,224,1024.0,898.48,1139.693,9.81,40.06,44.57
swin_s3_small_224,224,768.0,882.63,870.118,9.43,37.84,49.74
efficientnet_b3_g8_gn,288,512.0,882.63,580.072,2.59,23.35,14.25
dpn131,224,1024.0,878.67,1165.389,16.09,32.97,79.25
levit_384_s8,224,512.0,874.93,585.181,9.98,35.86,39.12
efficientnet_b4,384,512.0,874.47,585.489,4.51,50.04,19.34
vit_medium_patch16_gap_384,384,1024.0,873.17,1172.722,22.01,32.15,39.03
nest_base,224,1024.0,871.22,1175.339,16.71,30.51,67.72
nf_regnet_b5,384,1024.0,867.94,1179.793,7.95,42.9,49.74
resnet200d,256,1024.0,866.43,1181.848,20.0,43.09,64.69
maxvit_small_tf_224,224,512.0,864.97,591.915,11.39,46.31,68.93
nest_base_jx,224,1024.0,863.51,1185.835,16.71,30.51,67.72
xcit_small_12_p16_384,384,1024.0,860.6,1189.852,14.14,36.5,26.25
resnetv2_50d_evos,288,1024.0,857.98,1193.488,7.15,19.7,25.59
swin_base_patch4_window7_224,224,1024.0,857.23,1194.527,15.47,36.63,87.77
gcvit_small,224,1024.0,850.2,1204.416,8.57,41.61,51.09
crossvit_15_dagger_408,408,1024.0,849.94,1204.779,16.07,37.0,28.5
eca_nfnet_l1,320,1024.0,845.79,1210.693,14.92,34.42,41.41
tf_efficientnet_b4,380,512.0,836.31,612.204,4.49,49.49,19.34
regnety_080,288,1024.0,834.08,1227.682,13.22,29.69,39.18
levit_conv_384_s8,224,512.0,831.47,615.767,9.98,35.86,39.12
twins_svt_large,224,1024.0,829.67,1234.208,14.84,27.23,99.27
seresnet152,288,1024.0,826.68,1238.676,19.11,37.34,66.82
xception65p,299,768.0,826.46,929.251,13.91,52.48,39.82
eva02_base_patch14_224,224,1024.0,822.18,1245.459,22.0,24.67,85.76
caformer_s36,224,1024.0,811.28,1262.182,7.55,29.29,39.3
maxxvit_rmlp_small_rw_256,256,768.0,805.75,953.134,14.21,47.76,66.01
coatnet_2_rw_224,224,512.0,802.77,637.783,14.55,39.37,73.87
swinv2_base_window12_192,192,1024.0,801.77,1277.157,11.9,39.72,109.28
mvitv2_base,224,1024.0,789.29,1297.348,10.16,40.5,51.47
densenet264d,224,1024.0,784.72,1304.914,13.57,14.0,72.74
resnest50d_4s2x40d,224,1024.0,782.94,1307.879,4.4,17.94,30.42
swinv2_tiny_window16_256,256,512.0,779.51,656.811,6.68,39.02,28.35
volo_d2_224,224,1024.0,778.59,1315.191,14.34,41.34,58.68
dpn107,224,1024.0,773.9,1323.149,18.38,33.46,86.92
xcit_tiny_24_p8_224,224,1024.0,770.47,1329.042,9.21,45.38,12.11
convnext_base,288,1024.0,769.28,1331.103,25.43,47.53,88.59
coatnet_rmlp_2_rw_224,224,512.0,762.93,671.09,14.64,44.94,73.88
mvitv2_base_cls,224,1024.0,760.58,1346.32,10.23,40.65,65.44
convit_base,224,1024.0,757.3,1352.149,17.52,31.77,86.54
convformer_s36,224,1024.0,757.3,1352.161,7.67,30.5,40.01
coatnet_2_224,224,384.0,753.79,509.418,15.94,42.41,74.68
hrnet_w64,224,1024.0,748.82,1367.478,28.97,35.09,128.06
resnet152d,320,1024.0,747.67,1369.57,24.08,47.67,60.21
ecaresnet200d,256,1024.0,744.16,1376.037,20.0,43.15,64.69
seresnet200d,256,1024.0,743.64,1376.992,20.01,43.15,71.86
resnetrs200,256,1024.0,743.56,1377.137,20.18,43.42,93.21
swinv2_small_window8_256,256,1024.0,740.78,1382.313,11.58,40.14,49.73
xception65,299,768.0,738.05,1040.572,13.96,52.48,39.92
fastvit_ma36,256,1024.0,734.46,1394.207,7.85,40.39,44.07
swinv2_cr_small_ns_256,256,1024.0,733.6,1395.843,12.07,76.21,49.7
senet154,224,1024.0,731.81,1399.262,20.77,38.69,115.09
maxvit_rmlp_small_rw_256,256,768.0,731.54,1049.835,13.69,55.48,64.9
legacy_senet154,224,1024.0,730.99,1400.828,20.77,38.69,115.09
tf_efficientnetv2_m,384,1024.0,728.54,1405.529,15.85,57.52,54.14
xcit_nano_12_p8_384,384,1024.0,723.54,1415.249,6.34,46.06,3.05
poolformer_m48,224,1024.0,722.45,1417.374,11.59,29.17,73.47
tnt_b_patch16_224,224,1024.0,722.04,1418.187,14.09,39.01,65.41
efficientvit_l3,224,1024.0,720.55,1421.127,27.62,39.16,246.04
swinv2_cr_base_224,224,1024.0,719.69,1422.825,15.86,59.66,87.88
efficientnet_b3_g8_gn,320,512.0,718.69,712.395,3.2,28.83,14.25
resnest101e,256,1024.0,718.12,1425.925,13.38,28.66,48.28
swin_s3_base_224,224,1024.0,717.57,1427.034,13.69,48.26,71.13
resnext101_64x4d,288,1024.0,717.4,1427.37,25.66,51.59,83.46
swinv2_cr_base_ns_224,224,1024.0,713.5,1435.162,15.86,59.66,87.88
convnextv2_base,224,768.0,711.23,1079.807,15.38,28.75,88.72
resnet200,288,1024.0,697.53,1468.023,24.91,53.21,64.67
efficientnet_b3_gn,320,512.0,695.5,736.148,2.14,28.83,11.73
coat_small,224,1024.0,694.03,1475.431,12.61,44.25,21.69
convnext_large,224,1024.0,690.43,1483.117,34.4,43.13,197.77
regnetz_e8,320,1024.0,670.8,1526.503,15.46,63.94,57.7
efficientformerv2_s2,224,1024.0,670.26,1527.748,1.27,11.77,12.71
seresnext101_32x8d,288,1024.0,656.14,1560.626,27.24,51.63,93.57
resnetrs152,320,1024.0,655.8,1561.431,24.34,48.14,86.62
xcit_small_12_p8_224,224,1024.0,655.5,1562.148,18.69,47.19,26.21
maxxvitv2_rmlp_base_rw_224,224,768.0,651.85,1178.173,23.88,54.39,116.09
seresnet152d,320,1024.0,649.85,1575.74,24.09,47.72,66.84
vit_large_patch32_384,384,1024.0,647.57,1581.281,44.28,32.22,306.63
poolformerv2_m36,224,1024.0,646.73,1583.338,8.81,22.02,56.08
resnext101_32x16d,224,1024.0,641.29,1596.767,36.27,51.18,194.03
seresnext101d_32x8d,288,1024.0,639.61,1600.97,27.64,52.95,93.59
regnetz_d8_evos,256,1024.0,638.02,1604.938,4.5,24.92,23.46
davit_large,224,1024.0,634.07,1614.963,34.37,55.08,196.81
efficientnetv2_m,416,1024.0,633.12,1617.367,18.6,67.5,54.14
regnety_064,224,1024.0,632.1,1619.968,6.39,16.41,30.58
regnetv_064,224,1024.0,629.87,1625.704,6.39,16.41,30.58
regnetz_c16_evos,320,512.0,622.61,822.333,3.86,25.88,13.49
gcvit_base,224,1024.0,620.94,1649.111,14.87,55.48,90.32
nf_regnet_b5,456,512.0,602.97,849.111,11.7,61.95,49.74
seresnextaa101d_32x8d,288,1024.0,601.98,1701.035,28.51,56.44,93.59
xception71,299,768.0,600.76,1278.366,18.09,69.92,42.34
eca_nfnet_l2,320,1024.0,593.89,1724.216,20.95,47.43,56.72
nfnet_f2,256,1024.0,593.31,1725.904,33.76,41.85,193.78
crossvit_18_dagger_408,408,1024.0,585.92,1747.666,25.31,49.38,44.61
hrnet_w48_ssld,288,1024.0,585.32,1749.444,28.66,47.21,77.47
ecaresnet200d,288,1024.0,584.36,1752.321,25.31,54.59,64.69
seresnet200d,288,1024.0,583.25,1755.672,25.32,54.6,71.86
caformer_m36,224,1024.0,582.88,1756.773,12.75,40.61,56.2
levit_512_s8,224,256.0,582.77,439.271,21.82,52.28,74.05
maxvit_rmlp_base_rw_224,224,768.0,582.44,1318.589,22.63,79.3,116.14
seresnet269d,256,1024.0,581.62,1760.578,26.59,53.6,113.67
convmixer_768_32,224,1024.0,580.09,1765.235,19.55,25.95,21.11
resnetrs270,256,1024.0,565.62,1810.398,27.06,55.84,129.86
mixer_l16_224,224,1024.0,553.36,1850.484,44.6,41.69,208.2
levit_conv_512_s8,224,256.0,552.47,463.363,21.82,52.28,74.05
efficientnetv2_rw_m,416,1024.0,552.47,1853.491,21.49,79.62,53.24
resnet200d,320,1024.0,551.74,1855.93,31.25,67.33,64.69
nfnet_f1,320,1024.0,548.82,1865.795,35.97,46.77,132.63
convformer_m36,224,1024.0,548.78,1865.947,12.89,42.05,57.05
volo_d3_224,224,1024.0,541.9,1889.619,20.78,60.09,86.33
swinv2_base_window8_256,256,1024.0,530.42,1930.519,20.37,52.59,87.92
maxvit_base_tf_224,224,512.0,517.72,988.937,23.52,81.67,119.47
xcit_large_24_p16_224,224,1024.0,511.16,2003.26,35.86,47.26,189.1
convmixer_1024_20_ks9_p14,224,1024.0,510.74,2004.929,5.55,5.51,24.38
dm_nfnet_f2,256,1024.0,503.11,2035.325,33.76,41.85,193.78
swin_large_patch4_window7_224,224,768.0,494.53,1552.967,34.53,54.94,196.53
vit_base_patch16_18x2_224,224,1024.0,494.1,2072.443,50.37,49.17,256.73
deit_base_patch16_384,384,1024.0,493.77,2073.808,49.4,48.3,86.86
vit_base_patch16_384,384,1024.0,493.5,2074.946,49.4,48.3,86.86
deit_base_distilled_patch16_384,384,1024.0,493.31,2075.754,49.49,48.39,87.63
vit_base_patch16_clip_384,384,1024.0,492.52,2079.081,49.41,48.3,86.86
eva_large_patch14_196,196,1024.0,491.4,2083.813,59.66,43.77,304.14
vit_base_patch16_siglip_384,384,1024.0,490.82,2086.272,50.0,49.11,93.18
vit_large_patch16_224,224,1024.0,489.19,2093.231,59.7,43.77,304.33
halonet_h1,256,256.0,487.96,524.621,3.0,51.17,8.1
tiny_vit_21m_512,512,256.0,487.73,524.868,21.23,83.26,21.27
seresnextaa101d_32x8d,320,768.0,487.6,1575.053,35.19,69.67,93.59
swinv2_large_window12_192,192,768.0,487.6,1575.036,26.17,56.53,228.77
swinv2_small_window16_256,256,512.0,487.58,1050.071,12.82,66.29,49.73
poolformerv2_m48,224,1024.0,487.33,2101.208,11.59,29.17,73.35
resnetrs200,320,1024.0,476.69,2148.152,31.51,67.81,93.21
xcit_tiny_12_p8_384,384,1024.0,472.87,2165.479,14.12,69.12,6.71
vit_small_patch14_dinov2,518,1024.0,470.72,2175.374,29.46,57.34,22.06
deit3_base_patch16_384,384,1024.0,469.96,2178.883,49.4,48.3,86.88
vit_small_patch14_reg4_dinov2,518,1024.0,469.28,2182.048,29.55,57.51,22.06
deit3_large_patch16_224,224,1024.0,468.18,2187.162,59.7,43.77,304.37
tf_efficientnetv2_m,480,1024.0,466.8,2193.627,24.76,89.84,54.14
dm_nfnet_f1,320,1024.0,463.74,2208.099,35.97,46.77,132.63
xcit_small_24_p16_384,384,1024.0,458.11,2235.247,26.72,68.57,47.67
seresnet269d,288,1024.0,457.25,2239.451,33.65,67.81,113.67
beit_large_patch16_224,224,1024.0,453.95,2255.726,59.7,43.77,304.43
beitv2_large_patch16_224,224,1024.0,453.79,2256.515,59.7,43.77,304.43
regnetx_120,224,1024.0,452.56,2262.648,12.13,21.37,46.11
efficientnet_b5,448,512.0,444.06,1152.996,9.59,93.56,30.39
regnety_120,224,1024.0,444.03,2306.127,12.14,21.38,51.82
efficientformerv2_l,224,1024.0,441.81,2317.703,2.59,18.54,26.32
coatnet_3_rw_224,224,384.0,441.21,870.327,32.63,59.07,181.81
resnetv2_152x2_bit,224,1024.0,439.95,2327.532,46.95,45.11,236.34
convnext_xlarge,224,768.0,438.91,1749.766,60.98,57.5,350.2
coatnet_rmlp_3_rw_224,224,256.0,438.69,583.549,32.75,64.7,165.15
coatnet_3_224,224,256.0,431.52,593.24,35.72,63.61,166.97
convnextv2_base,288,512.0,430.66,1188.858,25.43,47.53,88.72
flexivit_large,240,1024.0,427.93,2392.897,68.48,50.22,304.36
convnextv2_large,224,512.0,424.61,1205.798,34.4,43.13,197.96
swinv2_cr_large_224,224,768.0,424.12,1810.813,35.1,78.42,196.68
swinv2_cr_tiny_384,384,256.0,420.98,608.099,15.34,161.01,28.33
caformer_b36,224,768.0,420.2,1827.698,22.5,54.14,98.75
maxvit_tiny_tf_384,384,256.0,419.78,609.84,16.0,94.22,30.98
convnext_large,288,768.0,417.93,1837.619,56.87,71.29,197.77
regnety_160,224,1024.0,417.09,2455.096,15.96,23.04,83.59
eca_nfnet_l2,384,1024.0,412.81,2480.539,30.05,68.28,56.72
maxxvitv2_rmlp_large_rw_224,224,768.0,411.22,1867.582,43.69,75.4,215.42
efficientnetv2_l,384,1024.0,409.83,2498.611,36.1,101.16,118.52
davit_huge,224,768.0,407.6,1884.205,60.93,73.44,348.92
tf_efficientnetv2_l,384,1024.0,405.08,2527.906,36.1,101.16,118.52
regnety_320,224,1024.0,403.27,2539.241,32.34,30.26,145.05
regnetz_d8_evos,320,768.0,403.13,1905.094,7.03,38.92,23.46
beit_base_patch16_384,384,1024.0,402.61,2543.386,49.4,48.3,86.74
convformer_b36,224,768.0,397.77,1930.749,22.69,56.06,99.88
tf_efficientnet_b5,456,384.0,394.74,972.77,10.46,98.86,30.39
eca_nfnet_l3,352,1024.0,378.23,2707.314,32.57,73.12,72.04
vit_large_patch16_siglip_256,256,1024.0,375.52,2726.866,78.12,57.42,315.96
ecaresnet269d,320,1024.0,372.48,2749.133,41.53,83.69,102.09
vit_large_r50_s32_384,384,1024.0,369.32,2772.633,56.4,64.88,329.09
maxvit_large_tf_224,224,384.0,359.98,1066.726,42.99,109.57,211.79
vit_large_patch14_224,224,1024.0,359.62,2847.449,77.83,57.11,304.2
vit_large_patch14_clip_224,224,1024.0,359.62,2847.409,77.83,57.11,304.2
swinv2_base_window16_256,256,384.0,359.2,1069.042,22.02,84.71,87.92
swinv2_base_window12to16_192to256,256,384.0,359.01,1069.609,22.02,84.71,87.92
nasnetalarge,331,384.0,356.97,1075.708,23.89,90.56,88.75
resnetrs350,288,1024.0,356.46,2872.642,43.67,87.09,163.96
vit_base_patch8_224,224,1024.0,351.76,2911.045,66.87,65.71,86.58
volo_d4_224,224,1024.0,343.2,2983.708,44.34,80.22,192.96
xcit_small_24_p8_224,224,1024.0,342.74,2987.714,35.81,90.77,47.63
volo_d1_384,384,512.0,340.3,1504.541,22.75,108.55,26.78
convnext_large_mlp,320,512.0,338.23,1513.736,70.21,88.02,200.13
repvgg_d2se,320,1024.0,335.87,3048.766,74.57,46.82,133.33
vit_large_patch14_clip_quickgelu_224,224,1024.0,324.37,3156.896,77.83,57.11,303.97
vit_base_r50_s16_384,384,1024.0,315.28,3247.919,61.29,81.77,98.95
nfnet_f2,352,1024.0,313.79,3263.314,63.22,79.06,193.78
xcit_medium_24_p16_384,384,1024.0,313.38,3267.626,47.39,91.63,84.4
vit_large_patch14_xp_224,224,1024.0,311.53,3287.018,77.77,57.11,304.06
ecaresnet269d,352,1024.0,307.84,3326.422,50.25,101.25,102.09
coat_lite_medium_384,384,512.0,301.48,1698.273,28.73,116.7,44.57
regnety_064,288,1024.0,298.91,3425.709,10.56,27.11,30.58
resnetrs270,352,1024.0,298.81,3426.892,51.13,105.48,129.86
regnetv_064,288,1024.0,298.12,3434.809,10.55,27.11,30.58
resnext101_32x32d,224,512.0,296.06,1729.362,87.29,91.12,468.53
nfnet_f3,320,1024.0,290.3,3527.352,68.77,83.93,254.92
efficientnetv2_xl,384,1024.0,290.02,3530.821,52.81,139.2,208.12
tf_efficientnetv2_xl,384,1024.0,287.47,3562.138,52.81,139.2,208.12
cait_xxs24_384,384,1024.0,284.02,3605.396,9.63,122.65,12.03
maxvit_small_tf_384,384,192.0,274.58,699.228,33.58,139.86,69.02
coatnet_4_224,224,256.0,274.31,933.246,60.81,98.85,275.43
convnext_xlarge,288,512.0,265.38,1929.279,100.8,95.05,350.2
dm_nfnet_f2,352,1024.0,265.36,3858.944,63.22,79.06,193.78
vit_base_patch16_siglip_512,512,512.0,263.16,1945.545,88.89,87.3,93.52
vit_so400m_patch14_siglip_224,224,1024.0,262.63,3898.968,106.18,70.45,427.68
efficientnetv2_l,480,512.0,261.08,1961.059,56.4,157.99,118.52
swinv2_cr_small_384,384,256.0,258.97,988.525,29.7,298.03,49.7
convnextv2_large,288,384.0,257.89,1488.981,56.87,71.29,197.96
tf_efficientnetv2_l,480,512.0,257.78,1986.206,56.4,157.99,118.52
eva02_large_patch14_224,224,1024.0,256.9,3985.935,77.9,65.52,303.27
eva02_large_patch14_clip_224,224,1024.0,253.93,4032.531,77.93,65.52,304.11
regnety_120,288,768.0,253.81,3025.924,20.06,35.34,51.82
xcit_tiny_24_p8_384,384,1024.0,248.2,4125.63,27.05,132.94,12.11
coatnet_rmlp_2_rw_384,384,192.0,247.61,775.41,43.04,132.57,73.88
dm_nfnet_f3,320,1024.0,247.07,4144.617,68.77,83.93,254.92
resnetrs420,320,1024.0,244.54,4187.355,64.2,126.56,191.89
mvitv2_large,224,512.0,243.6,2101.832,43.87,112.02,217.99
mvitv2_large_cls,224,512.0,241.75,2117.866,42.17,111.69,234.58
resmlp_big_24_224,224,1024.0,241.59,4238.519,100.23,87.31,129.14
regnety_160,288,768.0,237.71,3230.76,26.37,38.07,83.59
xcit_medium_24_p8_224,224,768.0,234.01,3281.941,63.52,121.22,84.32
eca_nfnet_l3,448,512.0,233.43,2193.322,52.55,118.4,72.04
volo_d5_224,224,1024.0,228.8,4475.542,72.4,118.11,295.46
swin_base_patch4_window12_384,384,256.0,227.46,1125.454,47.19,134.78,87.9
xcit_small_12_p8_384,384,384.0,223.23,1720.206,54.92,138.25,26.21
swinv2_large_window12to16_192to256,256,256.0,219.08,1168.537,47.81,121.53,196.74
maxxvitv2_rmlp_base_rw_384,384,384.0,217.17,1768.16,70.18,160.22,116.09
efficientnet_b6,528,256.0,205.22,1247.45,19.4,167.39,43.04
regnetx_320,224,768.0,200.5,3830.333,31.81,36.3,107.81
resnetrs350,384,1024.0,199.92,5122.143,77.59,154.74,163.96
cait_xs24_384,384,768.0,198.76,3863.971,19.28,183.98,26.67
maxvit_xlarge_tf_224,224,256.0,198.54,1289.412,96.49,164.37,506.99
tf_efficientnet_b6,528,192.0,198.54,967.028,19.4,167.39,43.04
focalnet_huge_fl3,224,512.0,191.39,2675.182,118.26,104.8,745.28
volo_d2_384,384,384.0,190.85,2012.066,46.17,184.51,58.87
cait_xxs36_384,384,1024.0,189.78,5395.721,14.35,183.7,17.37
eva02_base_patch14_448,448,512.0,189.58,2700.759,87.74,98.4,87.12
vit_huge_patch14_gap_224,224,1024.0,186.27,5497.294,161.36,94.7,630.76
swinv2_cr_base_384,384,256.0,185.05,1383.395,50.57,333.68,87.88
swinv2_cr_huge_224,224,384.0,182.04,2109.357,115.97,121.08,657.83
maxvit_rmlp_base_rw_384,384,384.0,179.65,2137.52,66.51,233.79,116.14
vit_huge_patch14_224,224,1024.0,179.6,5701.574,161.99,95.07,630.76
vit_huge_patch14_clip_224,224,1024.0,179.43,5706.842,161.99,95.07,632.05
xcit_large_24_p16_384,384,1024.0,177.48,5769.692,105.34,137.15,189.1
vit_base_patch14_dinov2,518,512.0,176.68,2897.828,117.11,114.68,86.58
vit_base_patch14_reg4_dinov2,518,512.0,175.98,2909.337,117.45,115.02,86.58
deit3_huge_patch14_224,224,1024.0,173.53,5900.889,161.99,95.07,632.13
nfnet_f3,416,768.0,171.77,4471.127,115.58,141.78,254.92
maxvit_tiny_tf_512,512,128.0,170.91,748.92,28.66,172.66,31.05
seresnextaa201d_32x8d,384,512.0,170.35,3005.583,101.11,199.72,149.39
maxvit_base_tf_384,384,192.0,166.63,1152.259,69.34,247.75,119.65
vit_huge_patch14_clip_quickgelu_224,224,1024.0,165.5,6187.275,161.99,95.07,632.08
efficientnetv2_xl,512,512.0,163.45,3132.529,93.85,247.32,208.12
nfnet_f4,384,768.0,163.26,4704.17,122.14,147.57,316.07
tf_efficientnetv2_xl,512,512.0,161.63,3167.699,93.85,247.32,208.12
vit_huge_patch14_xp_224,224,1024.0,159.72,6411.21,161.88,95.07,631.8
eva_large_patch14_336,336,768.0,155.72,4931.845,174.74,128.21,304.53
vit_large_patch14_clip_336,336,768.0,155.28,4945.947,174.74,128.21,304.53
vit_large_patch16_384,384,768.0,155.12,4950.906,174.85,128.21,304.72
vit_large_patch16_siglip_384,384,768.0,154.94,4956.619,175.76,129.18,316.28
convnext_xxlarge,256,384.0,153.59,2500.071,198.09,124.45,846.47
vit_giant_patch16_gap_224,224,1024.0,153.47,6672.363,198.14,103.64,1011.37
cait_s24_384,384,512.0,153.12,3343.821,32.17,245.3,47.06
davit_giant,224,384.0,152.05,2525.491,192.34,138.2,1406.47
deit3_large_patch16_384,384,1024.0,148.73,6884.872,174.85,128.21,304.76
coatnet_5_224,224,192.0,147.83,1298.762,142.72,143.69,687.47
dm_nfnet_f3,416,512.0,146.0,3506.787,115.58,141.78,254.92
resnetrs420,416,768.0,144.59,5311.727,108.45,213.79,191.89
vit_large_patch14_clip_quickgelu_336,336,768.0,141.12,5441.998,174.74,128.21,304.29
dm_nfnet_f4,384,768.0,139.13,5519.969,122.14,147.57,316.07
swin_large_patch4_window12_384,384,128.0,135.95,941.498,104.08,202.16,196.74
xcit_large_24_p8_224,224,512.0,131.73,3886.696,141.22,181.53,188.93
beit_large_patch16_384,384,768.0,129.79,5917.023,174.84,128.21,305.0
efficientnet_b7,600,192.0,128.05,1499.407,38.33,289.94,66.35
tf_efficientnet_b7,600,192.0,124.56,1541.433,38.33,289.94,66.35
focalnet_huge_fl4,224,512.0,123.26,4153.862,118.9,113.34,686.46
eva_giant_patch14_clip_224,224,1024.0,116.99,8753.07,259.74,135.89,1012.59
eva_giant_patch14_224,224,1024.0,116.91,8758.747,259.74,135.89,1012.56
nfnet_f5,416,768.0,116.91,6569.029,170.71,204.56,377.21
xcit_small_24_p8_384,384,384.0,116.73,3289.571,105.23,265.87,47.63
maxvit_large_tf_384,384,128.0,116.56,1098.144,126.61,332.3,212.03
vit_giant_patch14_224,224,1024.0,114.32,8957.604,259.74,135.89,1012.61
vit_giant_patch14_clip_224,224,1024.0,114.12,8973.257,259.74,135.89,1012.65
swinv2_cr_large_384,384,192.0,113.51,1691.47,108.96,404.96,196.68
eva02_large_patch14_clip_336,336,768.0,110.42,6955.361,174.97,147.1,304.43
mvitv2_huge_cls,224,384.0,105.54,3638.368,120.67,243.63,694.8
maxvit_small_tf_512,512,96.0,104.89,915.238,60.02,256.36,69.13
cait_s36_384,384,512.0,102.28,5005.663,47.99,367.39,68.37
dm_nfnet_f5,416,512.0,99.59,5141.209,170.71,204.56,377.21
swinv2_base_window12to24_192to384,384,96.0,96.5,994.841,55.25,280.36,87.92
focalnet_large_fl3,384,256.0,93.78,2729.925,105.06,168.04,239.13
nfnet_f4,512,512.0,91.69,5583.92,216.26,262.26,316.07
focalnet_large_fl4,384,256.0,90.64,2824.324,105.2,181.78,239.32
nfnet_f6,448,512.0,86.88,5893.345,229.7,273.62,438.36
efficientnet_b8,672,128.0,85.75,1492.768,63.48,442.89,87.41
tf_efficientnet_b8,672,128.0,83.71,1529.068,63.48,442.89,87.41
volo_d3_448,448,128.0,81.1,1578.235,96.33,446.83,86.63
vit_so400m_patch14_siglip_384,384,512.0,80.75,6340.618,302.34,200.62,428.23
xcit_medium_24_p8_384,384,256.0,80.25,3189.919,186.67,354.69,84.32
dm_nfnet_f4,512,384.0,78.23,4908.575,216.26,262.26,316.07
vit_huge_patch14_clip_336,336,512.0,75.44,6786.84,363.7,213.44,632.46
dm_nfnet_f6,448,512.0,74.17,6903.248,229.7,273.62,438.36
maxvit_base_tf_512,512,96.0,72.37,1326.47,123.93,456.26,119.88
nfnet_f5,544,384.0,68.39,5614.643,290.97,349.71,377.21
nfnet_f7,480,512.0,66.61,7686.561,300.08,355.86,499.5
vit_gigantic_patch14_224,224,512.0,66.24,7729.406,473.4,204.12,1844.44
vit_gigantic_patch14_clip_224,224,512.0,66.15,7739.524,473.41,204.12,1844.91
focalnet_xlarge_fl3,384,192.0,65.92,2912.463,185.61,223.99,408.79
maxvit_xlarge_tf_384,384,96.0,64.9,1479.208,283.86,498.45,475.32
focalnet_xlarge_fl4,384,192.0,63.63,3017.361,185.79,242.31,409.03
beit_large_patch16_512,512,256.0,61.48,4163.85,310.6,227.76,305.67
volo_d4_448,448,192.0,60.99,3147.895,197.13,527.35,193.41
regnety_640,384,192.0,60.97,3149.012,188.47,124.83,281.38
convnextv2_huge,384,96.0,60.92,1575.922,337.96,232.35,660.29
swinv2_large_window12to24_192to384,384,48.0,60.75,790.151,116.15,407.83,196.74
eva02_large_patch14_448,448,512.0,59.67,8581.221,310.69,261.32,305.08
dm_nfnet_f5,544,384.0,58.35,6580.773,290.97,349.71,377.21
vit_huge_patch14_clip_378,378,512.0,58.14,8806.389,460.13,270.04,632.68
convmixer_1536_20,224,1024.0,56.99,17967.01,48.68,33.03,51.63
vit_large_patch14_dinov2,518,384.0,56.83,6757.154,414.89,304.42,304.37
vit_large_patch14_reg4_dinov2,518,384.0,56.64,6779.944,416.1,305.31,304.37
maxvit_large_tf_512,512,64.0,54.68,1170.494,225.96,611.85,212.33
tf_efficientnet_l2,475,96.0,54.05,1776.14,172.11,609.89,480.31
vit_huge_patch14_clip_quickgelu_378,378,384.0,53.95,7117.573,460.13,270.04,632.68
vit_huge_patch16_gap_448,448,512.0,52.86,9685.108,494.35,290.02,631.67
nfnet_f6,576,384.0,52.55,7307.184,378.69,452.2,438.36
swinv2_cr_giant_224,224,192.0,52.45,3660.551,483.85,309.15,2598.76
eva_giant_patch14_336,336,512.0,49.65,10312.606,583.14,305.1,1013.01
swinv2_cr_huge_384,384,96.0,49.62,1934.539,352.04,583.18,657.94
xcit_large_24_p8_384,384,192.0,45.19,4249.177,415.0,531.74,188.93
dm_nfnet_f6,576,256.0,44.83,5710.109,378.69,452.2,438.36
volo_d5_448,448,192.0,42.49,4518.905,315.06,737.92,295.91
nfnet_f7,608,256.0,41.52,6165.283,480.39,570.85,499.5
cait_m36_384,384,256.0,33.1,7733.448,173.11,734.79,271.22
resnetv2_152x4_bit,480,96.0,32.12,2989.13,844.84,414.26,936.53
maxvit_xlarge_tf_512,512,48.0,30.41,1578.222,505.95,917.77,475.77
regnety_2560,384,128.0,30.25,4231.43,747.83,296.49,1282.6
volo_d5_512,512,128.0,29.54,4332.489,425.09,1105.37,296.09
samvit_base_patch16,1024,16.0,23.81,671.88,371.55,403.08,89.67
regnety_1280,384,128.0,22.93,5583.053,374.99,210.2,644.81
efficientnet_l2,800,48.0,19.03,2521.932,479.12,1707.39,480.31
vit_giant_patch14_dinov2,518,192.0,17.15,11193.542,1553.56,871.89,1136.48
vit_giant_patch14_reg4_dinov2,518,192.0,17.12,11212.072,1558.09,874.43,1136.48
swinv2_cr_giant_384,384,32.0,15.04,2127.877,1450.71,1394.86,2598.76
eva_giant_patch14_560,560,192.0,15.03,12771.913,1618.04,846.56,1014.45
cait_m48_448,448,128.0,13.96,9172.063,329.4,1708.21,356.46
samvit_large_patch16,1024,12.0,10.64,1127.934,1317.08,1055.58,308.28
samvit_huge_patch16,1024,8.0,6.61,1210.638,2741.59,1727.57,637.03
hf_public_repos/pytorch-image-models/results/benchmark-train-amp-nhwc-pt111-cu113-rtx3090.csv
model,train_samples_per_sec,train_step_time,train_batch_size,train_img_size,param_count
tinynet_e,10725.36,46.047,512,106,2.04
mobilenetv3_small_050,9864.52,50.786,512,224,1.59
lcnet_035,9593.72,52.888,512,224,1.64
lcnet_050,8283.82,61.296,512,224,1.88
tf_mobilenetv3_small_minimal_100,8178.73,62.055,512,224,2.04
tinynet_d,7987.22,63.336,512,152,2.34
mobilenetv3_small_075,7734.29,65.482,512,224,2.04
mobilenetv3_small_100,7481.49,67.702,512,224,2.54
tf_mobilenetv3_small_075,7093.89,71.455,512,224,2.04
tf_mobilenetv3_small_100,6879.11,73.705,512,224,2.54
levit_128s,6303.14,80.293,512,224,7.78
lcnet_075,5742.95,88.676,512,224,2.36
lcnet_100,5331.75,95.531,512,224,2.95
mixer_s32_224,4714.36,108.029,512,224,19.1
mnasnet_small,4652.19,109.156,512,224,2.03
mnasnet_050,4534.41,112.14,512,224,2.22
levit_128,4434.56,114.332,512,224,9.21
vit_small_patch32_224,4334.06,117.284,512,224,22.88
mobilenetv2_035,4197.24,121.203,512,224,1.68
tinynet_c,4165.97,121.921,512,184,2.46
gernet_s,4117.74,123.649,512,224,8.17
semnasnet_050,4027.14,126.223,512,224,2.08
vit_tiny_r_s16_p8_224,3857.49,131.88,512,224,6.34
levit_192,3823.94,132.765,512,224,10.95
lcnet_150,3663.02,139.3,512,224,4.5
resnet18,3584.19,142.504,512,224,11.69
gluon_resnet18_v1b,3584.07,142.508,512,224,11.69
swsl_resnet18,3583.72,142.531,512,224,11.69
ssl_resnet18,3558.1,143.543,512,224,11.69
mobilenetv2_050,3541.93,143.76,512,224,1.97
mobilenetv3_large_075,3343.71,152.255,512,224,3.99
ese_vovnet19b_slim_dw,3243.14,157.395,512,224,1.9
tf_mobilenetv3_large_minimal_100,3227.09,157.922,512,224,3.92
seresnet18,3222.19,158.398,512,224,11.78
legacy_seresnet18,3130.77,163.021,512,224,11.78
tf_mobilenetv3_large_075,3109.15,163.824,512,224,3.99
mnasnet_075,3102.76,164.235,512,224,3.17
ghostnet_050,3069.54,165.437,512,224,2.59
mobilenetv3_rw,3020.36,168.644,512,224,5.48
mobilenetv3_large_100,2997.0,169.969,512,224,5.48
mobilenetv3_large_100_miil,2996.51,169.991,512,224,5.48
levit_256,2923.52,174.041,512,224,18.89
hardcorenas_a,2875.54,177.351,512,224,5.26
resnet18d,2830.02,180.547,512,224,11.71
mnasnet_b1,2826.42,180.324,512,224,4.38
mnasnet_100,2810.01,181.39,512,224,4.38
tf_mobilenetv3_large_100,2800.51,181.961,512,224,5.48
tinynet_b,2773.33,183.58,512,188,3.73
hardcorenas_b,2665.39,191.158,512,224,5.18
semnasnet_075,2649.51,192.342,512,224,2.91
hardcorenas_c,2643.17,192.764,512,224,5.52
ese_vovnet19b_slim,2613.26,195.551,512,224,3.17
mobilenetv2_075,2538.45,200.896,512,224,2.64
tf_efficientnetv2_b0,2507.59,202.986,512,224,7.14
spnasnet_100,2504.82,203.411,512,224,4.42
levit_256d,2485.6,204.456,512,224,26.21
hardcorenas_d,2483.8,204.983,512,224,7.5
semnasnet_100,2411.15,211.44,512,224,3.89
mnasnet_a1,2396.96,212.676,512,224,3.89
mobilenetv2_100,2381.65,214.209,512,224,3.5
regnetx_002,2371.97,215.169,512,224,2.68
tinynet_a,2274.6,223.875,512,192,6.19
regnety_002,2255.16,226.095,512,224,3.16
ghostnet_100,2251.56,226.062,512,224,5.18
fbnetc_100,2248.09,226.804,512,224,5.57
deit_tiny_patch16_224,2233.8,228.385,512,224,5.72
vit_tiny_patch16_224,2229.44,228.819,512,224,5.72
efficientnet_lite0,2209.28,231.003,512,224,4.65
hardcorenas_f,2207.45,230.839,512,224,8.2
deit_tiny_distilled_patch16_224,2193.62,232.567,512,224,5.91
hardcorenas_e,2183.35,233.392,512,224,8.07
xcit_nano_12_p16_224_dist,2148.4,236.588,512,224,3.05
xcit_nano_12_p16_224,2147.35,236.626,512,224,3.05
tv_resnet34,2081.13,245.45,512,224,21.8
resnet34,2080.93,245.474,512,224,21.8
gluon_resnet34_v1b,2070.75,246.674,512,224,21.8
pit_ti_distilled_224,2069.0,246.563,512,224,5.1
pit_ti_224,2067.14,246.798,512,224,4.85
tf_efficientnet_lite0,2051.57,248.83,512,224,4.65
skresnet18,2011.18,253.956,512,224,11.96
resnet26,1965.66,259.994,512,224,16.0
resnetblur18,1945.89,262.773,512,224,11.69
gernet_m,1920.15,265.947,512,224,21.14
ese_vovnet19b_dw,1905.39,268.216,512,224,6.54
nf_resnet26,1877.61,272.201,512,224,16.0
hrnet_w18_small,1858.6,274.127,512,224,13.19
seresnet34,1854.8,275.134,512,224,21.96
mnasnet_140,1835.61,278.136,512,224,7.12
legacy_seresnet34,1814.57,281.284,512,224,21.96
efficientnet_b0,1800.57,212.181,384,224,5.29
levit_384,1799.63,283.374,512,224,39.13
resnet34d,1799.03,284.0,512,224,21.82
mobilenetv2_110d,1768.64,216.1,384,224,4.52
rexnetr_100,1759.23,217.141,384,224,4.88
selecsls42,1754.3,291.22,512,224,30.35
selecsls42b,1748.59,292.176,512,224,32.46
mixer_b32_224,1728.02,295.478,512,224,60.29
tf_efficientnet_b0_ns,1702.0,224.532,384,224,5.29
tf_efficientnet_b0_ap,1700.69,224.751,384,224,5.29
tf_efficientnet_b0,1700.24,224.78,384,224,5.29
mixer_s16_224,1649.18,309.899,512,224,18.53
semnasnet_140,1640.99,311.119,512,224,6.11
tf_efficientnet_es,1622.88,314.744,512,224,5.44
efficientnet_es,1618.83,315.528,512,224,5.44
efficientnet_es_pruned,1616.07,316.054,512,224,5.44
vit_base_patch32_224_sam,1613.76,316.429,512,224,88.22
vit_base_patch32_224,1612.96,316.587,512,224,88.22
resnet26d,1609.37,317.63,512,224,16.01
tf_efficientnetv2_b1,1607.05,237.525,384,240,8.14
ghostnet_130,1600.19,318.63,512,224,7.36
pit_xs_distilled_224,1591.2,320.859,512,224,11.0
pit_xs_224,1589.39,321.258,512,224,10.62
repvgg_b0,1586.15,321.714,512,224,15.82
resmlp_12_224,1552.02,329.099,512,224,15.35
resmlp_12_distilled_224,1551.98,329.119,512,224,15.35
gmixer_12_224,1551.47,329.195,512,224,12.7
mobilenetv2_140,1539.46,248.646,384,224,6.11
mobilevit_xxs,1515.36,252.293,384,256,1.27
selecsls60,1486.56,343.544,512,224,30.67
selecsls60b,1480.82,344.867,512,224,32.77
nf_seresnet26,1471.47,347.302,512,224,17.4
xcit_tiny_12_p16_224,1422.37,358.218,512,224,6.72
xcit_tiny_12_p16_224_dist,1420.76,358.558,512,224,6.72
efficientnet_lite1,1418.67,179.483,256,240,5.42
vit_small_patch32_384,1414.68,361.045,512,384,22.92
efficientnet_b1_pruned,1384.32,368.407,512,240,6.33
gmlp_ti16_224,1378.39,276.995,384,224,5.87
dla46_c,1367.99,373.533,512,224,1.3
nf_ecaresnet26,1361.11,375.605,512,224,16.0
poolformer_s12,1359.83,375.831,512,224,11.92
rexnetr_130,1350.37,188.438,256,224,7.61
tf_efficientnet_lite1,1343.78,189.557,256,240,5.42
crossvit_tiny_240,1320.33,386.159,512,240,7.01
mobilenetv2_120d,1309.71,194.276,256,224,5.83
resnetv2_50,1296.0,394.281,512,224,25.55
gernet_l,1283.32,398.106,512,256,31.08
rexnet_100,1277.92,299.324,384,224,4.8
crossvit_9_240,1236.5,309.16,384,240,8.55
resnet26t,1227.85,416.475,512,256,16.01
ssl_resnet50,1223.5,417.643,512,224,25.56
resnet50,1222.61,417.956,512,224,25.56
tv_resnet50,1222.03,418.147,512,224,25.56
swsl_resnet50,1222.0,418.189,512,224,25.56
gluon_resnet50_v1b,1221.97,418.181,512,224,25.56
crossvit_9_dagger_240,1195.59,319.731,384,240,8.78
vit_tiny_r_s16_p8_384,1193.45,320.912,384,384,6.36
rexnetr_150,1185.63,214.799,256,224,9.78
fbnetv3_b,1184.86,322.487,384,256,8.6
botnet26t_256,1168.73,327.971,384,256,12.49
tf_efficientnetv2_b2,1157.41,219.649,256,260,10.1
regnetx_004,1150.82,443.834,512,224,5.16
repvgg_a2,1142.14,447.42,512,224,28.21
skresnet34,1140.48,447.803,512,224,22.28
resnetv2_50t,1134.35,450.541,512,224,25.57
fbnetv3_d,1133.65,223.966,256,256,10.31
resnetv2_50d,1131.93,451.499,512,224,25.57
gluon_resnet50_v1c,1131.41,338.54,384,224,25.58
halonet26t,1122.34,341.57,384,256,12.48
convit_tiny,1108.09,461.02,512,224,5.71
efficientnet_lite2,1096.26,232.544,256,260,6.09
dla34,1094.33,467.287,512,224,15.74
convnext_nano_hnf,1075.54,356.227,384,224,15.59
resnet50d,1070.21,357.929,384,224,25.58
gluon_resnet50_v1d,1070.13,357.997,384,224,25.58
resnet50t,1068.62,358.508,384,224,25.57
mixnet_s,1051.2,485.837,512,224,4.13
legacy_seresnext26_32x4d,1051.18,486.414,512,224,16.79
tf_efficientnet_lite2,1045.58,243.879,256,260,6.09
vit_small_resnet26d_224,1042.37,367.395,384,224,63.61
deit_small_patch16_224,1032.62,371.017,384,224,22.05
vit_small_patch16_224,1027.63,372.823,384,224,22.05
regnety_004,1027.25,497.255,512,224,4.34
tf_efficientnet_b1_ns,1026.16,247.955,256,240,7.79
tf_efficientnet_b1_ap,1026.11,248.006,256,240,7.79
tf_efficientnet_b1,1025.11,248.193,256,240,7.79
resnet32ts,1021.77,249.96,256,256,17.96
deit_small_distilled_patch16_224,1010.79,379.058,384,224,22.44
resnet33ts,1009.39,252.998,256,256,19.68
res2net50_48w_2s,1006.25,380.82,384,224,25.29
vovnet39a,1004.46,509.092,512,224,22.6
seresnext26d_32x4d,1002.27,382.471,384,224,16.81
seresnext26t_32x4d,1001.74,382.665,384,224,16.81
seresnext26tn_32x4d,1001.47,382.779,384,224,16.81
legacy_seresnet50,993.86,385.256,384,224,28.09
tf_efficientnet_em,979.63,260.356,256,240,6.9
efficientnet_em,978.46,260.687,256,240,6.9
dla46x_c,973.77,525.047,512,224,1.07
eca_resnet33ts,964.39,264.788,256,256,19.68
pit_s_224,961.0,265.507,256,224,23.46
pit_s_distilled_224,960.14,265.718,256,224,24.04
tf_mixnet_s,958.59,532.877,512,224,4.13
seresnet50,956.87,400.165,384,224,28.09
efficientnet_b1,954.91,266.596,256,256,7.79
seresnet33ts,954.85,267.313,256,256,19.78
ecaresnetlight,952.86,536.422,512,224,30.16
vit_base2_patch32_256,950.48,537.853,512,256,119.46
ese_vovnet39b,947.73,539.578,512,224,24.57
ecaresnext50t_32x4d,947.69,404.65,384,224,15.41
ecaresnext26t_32x4d,947.22,404.844,384,224,15.41
dla60,945.45,405.196,384,224,22.04
gluon_resnet50_v1s,943.31,406.227,384,224,25.68
resnetaa50d,941.94,406.82,384,224,25.58
eca_vovnet39b,939.18,544.514,512,224,22.6
vgg11,930.6,550.023,512,224,132.86
gcresnet33ts,927.58,274.995,256,256,19.88
lambda_resnet26rpt_256,921.55,207.755,192,256,10.99
dla60x_c,921.01,554.951,512,224,1.32
ecaresnet50d_pruned,911.85,560.565,512,224,19.94
resnetblur50,909.58,421.362,384,224,25.56
mobilevit_xs,909.51,210.0,192,256,2.32
cspresnet50,906.79,422.612,384,256,21.62
rexnetr_200,896.75,212.966,192,224,16.52
coat_lite_tiny,890.79,430.193,384,224,5.72
nf_seresnet50,886.81,431.807,384,224,28.09
dpn68b,878.29,436.039,384,224,12.61
selecsls84,872.56,585.545,512,224,50.95
twins_svt_small,868.85,440.366,384,224,24.06
hrnet_w18_small_v2,867.42,587.948,512,224,15.6
seresnet50t,865.51,442.504,384,224,28.1
cspresnext50,862.61,444.309,384,224,20.57
resnetrs50,861.96,444.354,384,224,35.69
cspresnet50w,860.12,445.567,384,256,28.12
cspresnet50d,849.09,451.363,384,256,21.64
densenet121,845.28,301.077,256,224,7.98
tv_densenet121,845.28,301.063,256,224,7.98
rexnet_150,842.23,302.82,256,224,9.73
tv_resnext50_32x4d,836.18,458.41,384,224,25.03
swsl_resnext50_32x4d,836.09,458.464,384,224,25.03
res2net50_26w_4s,835.77,458.208,384,224,25.7
coat_lite_mini,833.77,459.672,384,224,11.01
vit_base_resnet26d_224,833.37,459.491,384,224,101.4
resnext50_32x4d,832.87,460.244,384,224,25.03
ssl_resnext50_32x4d,832.34,460.521,384,224,25.03
dpn68,831.95,460.44,384,224,12.61
gluon_resnext50_32x4d,831.77,460.799,384,224,25.03
vovnet57a,828.64,616.994,512,224,36.64
efficientnet_b2_pruned,825.77,308.457,256,260,8.31
resnetblur50d,818.48,311.928,256,224,25.58
skresnet50,810.12,472.628,384,224,25.8
tf_efficientnet_b2_ap,809.72,235.646,192,260,9.11
tf_efficientnet_b2_ns,809.32,235.717,192,260,9.11
tf_efficientnet_b2,809.06,235.843,192,260,9.11
vgg11_bn,805.38,476.555,384,224,132.87
densenet121d,805.31,316.045,256,224,8.0
nf_ecaresnet50,804.89,476.102,384,224,25.56
rexnet_130,801.42,318.304,256,224,7.56
ecaresnet50d,793.07,483.245,384,224,25.58
ese_vovnet57b,790.94,484.567,384,224,38.61
regnetx_006,790.56,646.782,512,224,6.2
gcresnet50t,788.09,323.372,256,256,25.9
regnety_006,787.76,648.873,512,224,6.06
convnext_tiny,781.17,326.737,256,224,28.59
tf_inception_v3,774.78,494.217,384,299,23.83
gluon_inception_v3,774.48,494.404,384,299,23.83
seresnetaa50d,773.82,329.682,256,224,28.11
inception_v3,772.99,495.38,384,299,23.83
adv_inception_v3,769.85,497.351,384,299,23.83
resmlp_24_distilled_224,767.92,331.893,256,224,30.02
resnext50d_32x4d,765.61,333.516,256,224,25.05
resmlp_24_224,762.25,334.383,256,224,30.02
gmixer_24_224,759.72,335.461,256,224,24.72
resnetv2_101,757.49,336.449,256,224,44.54
res2net50_14w_8s,756.34,336.303,256,224,25.06
xcit_nano_12_p16_384_dist,754.95,337.338,256,384,3.05
sehalonet33ts,749.52,340.746,256,256,13.69
densenetblur121d,739.98,344.167,256,224,8.0
dla60_res2net,736.45,346.205,256,224,20.85
skresnet50d,736.23,346.324,256,224,25.82
mobilevit_s,734.66,260.236,192,256,5.58
gluon_resnet101_v1b,731.02,348.601,256,224,44.55
tv_resnet101,727.65,350.31,256,224,44.55
resnet101,727.5,350.369,256,224,44.55
xcit_tiny_24_p16_224,726.25,349.143,256,224,12.12
xcit_tiny_24_p16_224_dist,725.28,349.489,256,224,12.12
efficientnet_b2,724.21,263.654,192,288,9.11
twins_pcpvt_small,724.17,351.883,256,224,24.11
efficientnet_b2a,723.56,263.856,192,288,9.11
ecaresnet101d_pruned,714.19,715.116,512,224,24.88
nf_resnet50,710.87,539.322,384,288,25.56
nf_resnet101,707.69,540.968,384,224,44.55
seresnext50_32x4d,706.91,361.01,256,224,27.56
gluon_seresnext50_32x4d,706.58,361.114,256,224,27.56
efficientnet_b0_gn,705.84,361.599,256,224,5.29
legacy_seresnext50_32x4d,704.96,362.028,256,224,27.56
nf_regnet_b0,703.19,726.933,512,256,8.76
darknet53,701.97,363.892,256,256,41.61
resnetv2_101d,699.92,364.177,256,224,44.56
densenet169,698.35,364.121,256,224,14.15
gluon_resnet101_v1c,697.85,365.313,256,224,44.57
dla60x,694.54,367.645,256,224,17.35
poolformer_s24,690.11,369.675,256,224,21.39
semobilevit_s,684.71,279.143,192,256,5.74
efficientnetv2_rw_t,684.48,278.424,192,288,13.65
vit_small_r26_s32_224,682.34,373.878,256,224,36.43
convnext_tiny_hnf,681.83,374.501,256,224,28.59
gluon_resnet101_v1d,674.08,378.237,256,224,44.57
tf_efficientnetv2_b3,670.7,284.467,192,300,14.36
xcit_small_12_p16_224,670.22,380.208,256,224,26.25
xcit_small_12_p16_224_dist,669.97,380.26,256,224,26.25
sebotnet33ts_256,666.38,191.3,128,256,13.7
rexnet_200,665.99,287.162,192,224,16.37
vgg13,663.21,578.818,384,224,133.05
regnety_008,661.82,772.637,512,224,6.26
wide_resnet50_2,661.04,580.088,384,224,68.88
dla102,650.38,392.047,256,224,33.27
gmlp_s16_224,648.9,294.316,192,224,19.42
vit_base_resnet50d_224,646.39,394.444,256,224,110.97
swin_tiny_patch4_window7_224,639.35,399.428,256,224,28.29
ecaresnet26t,628.75,406.586,256,320,16.01
repvgg_b1,627.31,815.089,512,224,57.42
gluon_resnet101_v1s,624.3,408.497,256,224,44.67
crossvit_small_240,624.04,408.587,256,240,26.86
resnetaa101d,621.23,410.532,256,224,44.57
eca_botnext26ts_256,618.03,413.613,256,256,10.59
resnext26ts,615.62,623.252,384,256,10.3
gc_efficientnetv2_rw_t,605.04,314.593,192,288,13.68
eca_halonext26ts,604.41,422.932,256,256,10.76
seresnext26ts,598.53,427.051,256,256,10.39
eca_resnext26ts,598.25,427.346,256,256,10.3
convnext_tiny_hnfd,592.58,430.992,256,224,28.63
regnetx_008,591.8,864.35,512,224,7.26
resnetv2_50x1_bit_distilled,589.64,324.794,192,224,25.55
legacy_seresnet101,588.41,432.899,256,224,49.33
gcresnext26ts,585.17,436.656,256,256,10.48
halonet50ts,584.39,327.573,192,256,22.73
xcit_nano_12_p8_224,583.42,437.05,256,224,3.05
cait_xxs24_224,582.93,436.673,256,224,11.96
xcit_nano_12_p8_224_dist,581.01,438.848,256,224,3.05
mixer_b16_224,580.44,440.268,256,224,59.88
mixer_b16_224_miil,579.94,440.653,256,224,59.88
swin_s3_tiny_224,578.76,441.339,256,224,28.33
seresnet101,574.05,443.691,256,224,49.33
mixnet_m,573.23,891.634,512,224,5.01
res2net50_26w_6s,569.28,447.908,256,224,37.05
vgg13_bn,568.79,449.813,256,224,133.05
crossvit_15_240,568.46,335.957,192,240,27.53
resnetblur101d,567.71,449.352,256,224,44.57
cspdarknet53,565.64,451.547,256,256,27.64
efficientnet_lite3,563.01,226.244,128,300,8.2
tf_efficientnet_lite3,561.05,227.067,128,300,8.2
crossvit_15_dagger_240,549.96,347.22,192,240,28.21
resnext101_32x4d,548.6,465.031,256,224,44.18
tf_mixnet_m,547.03,700.438,384,224,5.01
swsl_resnext101_32x4d,546.13,467.192,256,224,44.18
gluon_resnext101_32x4d,545.31,467.899,256,224,44.18
ssl_resnext101_32x4d,545.31,467.944,256,224,44.18
densenet201,539.32,353.029,192,224,20.01
vgg16,536.49,715.548,384,224,138.36
bat_resnext26ts,533.44,478.688,256,256,10.73
nf_seresnet101,532.2,478.733,256,224,49.33
vit_base_r26_s32_224,528.57,361.981,192,224,101.38
resnetv2_152,524.49,485.916,256,224,60.19
botnet50ts_256,524.41,243.109,128,256,22.74
res2net101_26w_4s,523.46,486.536,256,224,45.21
efficientnet_b3_pruned,519.43,491.117,256,300,9.86
vit_base_patch32_384,517.32,494.013,256,384,88.3
vit_large_patch32_224,511.41,498.976,256,224,306.54
halo2botnet50ts_256,510.64,375.017,192,256,22.64
mixer_l32_224,510.1,374.908,192,224,206.94
swin_v2_cr_tiny_224,506.89,377.55,192,224,28.33
resmlp_36_distilled_224,505.72,377.435,192,224,44.69
resmlp_36_224,505.39,377.735,192,224,44.69
vit_tiny_patch16_384,503.94,253.174,128,384,5.79
res2next50,502.84,507.818,256,224,24.67
dla102x,501.98,380.938,192,224,26.31
swin_v2_cr_tiny_ns_224,501.36,381.688,192,224,28.33
resnet152,499.83,381.783,192,224,60.19
gluon_resnet152_v1b,499.79,381.933,192,224,60.19
tv_resnet152,496.47,384.401,192,224,60.19
xception,494.06,258.29,128,299,22.86
visformer_tiny,491.83,1040.332,512,224,10.32
mixnet_l,488.23,784.993,384,224,7.33
gluon_resnet152_v1c,485.58,393.139,192,224,60.21
resnet50_gn,484.73,395.256,192,224,25.56
resnetv2_152d,484.59,393.938,192,224,60.2
twins_pcpvt_base,480.18,397.102,192,224,43.83
nest_tiny,477.31,267.267,128,224,17.06
res2net50_26w_8s,476.93,534.581,256,224,48.4
ecaresnet101d,475.58,536.508,256,224,44.57
convnext_small,473.89,403.445,192,224,50.22
gluon_resnet152_v1d,473.52,403.216,192,224,60.21
tf_mixnet_l,470.86,542.123,256,224,7.33
jx_nest_tiny,470.71,271.015,128,224,17.06
vgg16_bn,469.97,544.386,256,224,138.37
nf_ecaresnet101,465.87,547.628,256,224,44.55
coat_lite_small,463.34,412.922,192,224,19.84
poolformer_s36,458.24,417.134,192,224,30.86
efficientnet_el,455.42,279.993,128,300,10.59
efficientnet_el_pruned,454.32,280.643,128,300,10.59
vgg19,451.86,849.574,384,224,143.67
convit_small,450.19,425.458,192,224,27.78
fbnetv3_g,449.02,283.045,128,288,16.62
ese_vovnet99b,448.97,568.658,256,224,63.2
seresnext101_32x4d,448.16,426.163,192,224,48.96
gluon_seresnext101_32x4d,447.49,426.94,192,224,48.96
gluon_resnet152_v1s,446.73,427.49,192,224,60.32
legacy_seresnext101_32x4d,446.01,428.263,192,224,48.96
nf_regnet_b3,442.03,577.378,256,320,18.59
tf_efficientnet_el,441.61,288.785,128,300,10.59
ese_vovnet39b_evos,439.23,290.493,128,224,24.58
dla60_res2next,437.89,583.208,256,224,17.03
volo_d1_224,437.25,437.756,192,224,26.63
dla169,436.98,436.866,192,224,53.39
skresnext50_32x4d,435.09,586.972,256,224,27.48
hrnet_w32,433.46,438.263,192,224,41.23
vit_small_resnet50d_s16_224,432.94,442.239,192,224,57.53
twins_svt_base,429.83,444.617,192,224,56.07
hrnet_w18,419.32,605.846,256,224,21.3
crossvit_18_240,400.15,317.858,128,240,43.27
vgg19_bn,399.74,640.035,256,224,143.68
ecaresnet50t,399.44,319.549,128,320,25.57
inception_v4,397.78,480.469,192,299,42.68
tf_efficientnet_b3,393.44,323.695,128,300,12.23
swin_small_patch4_window7_224,393.38,486.272,192,224,49.61
legacy_seresnet152,392.87,485.406,192,224,66.82
tf_efficientnet_b3_ap,392.08,324.783,128,300,12.23
vit_base_patch16_224_miil,391.87,489.169,192,224,86.54
tf_efficientnet_b3_ns,391.61,325.176,128,300,12.23
crossvit_18_dagger_240,387.95,327.891,128,240,44.27
vit_base_patch16_224,385.23,497.575,192,224,86.57
vit_base_patch16_224_sam,384.98,497.891,192,224,86.57
deit_base_patch16_224,384.15,498.956,192,224,86.57
cait_xxs36_224,380.29,501.012,192,224,17.3
repvgg_b2,380.03,1346.183,512,224,89.02
regnetx_016,379.21,1349.264,512,224,9.19
deit_base_distilled_patch16_224,378.28,506.739,192,224,87.34
densenet161,376.13,337.907,128,224,28.68
haloregnetz_b,375.52,680.247,256,224,11.68
xcit_tiny_12_p8_224,374.86,339.659,128,224,6.71
xcit_tiny_12_p8_224_dist,374.8,339.672,128,224,6.71
seresnet152,372.93,339.954,128,224,66.82
dla102x2,362.92,351.15,128,224,41.28
wide_resnet101_2,360.46,531.121,192,224,126.89
gluon_resnext101_64x4d,357.83,356.147,128,224,83.46
efficientnet_b3a,354.6,359.335,128,320,12.23
resnet200,354.57,357.998,128,224,64.67
xception41p,353.88,360.846,128,299,26.91
regnety_016,353.36,1447.082,512,224,11.2
efficientnet_b3,353.18,360.796,128,320,12.23
beit_base_patch16_224,352.29,543.93,192,224,86.53
resnest14d,349.84,1463.045,512,224,10.61
hrnet_w30,346.82,733.381,256,224,37.71
ens_adv_inception_resnet_v2,341.5,558.819,192,299,55.84
inception_resnet_v2,341.03,559.722,192,299,55.84
xcit_small_24_p16_224_dist,341.01,371.894,128,224,47.67
tnt_s_patch16_224,340.3,562.32,192,224,23.76
xcit_small_24_p16_224,339.02,373.952,128,224,47.67
efficientnet_lite4,334.87,189.779,64,380,13.01
dpn92,332.25,769.002,256,224,37.67
nf_regnet_b1,330.04,1549.911,512,288,10.22
twins_pcpvt_large,327.73,386.698,128,224,60.99
resnet101d,327.41,389.353,128,320,44.57
convnext_small_in22ft1k,327.31,389.301,128,224,88.59
convnext_base,327.3,389.374,128,224,88.59
convnext_base_in22ft1k,326.61,390.177,128,224,88.59
convnext_tiny_in22ft1k,324.94,392.181,128,224,88.59
tf_efficientnet_lite4,322.62,197.041,64,380,13.01
resnetrs101,321.74,395.61,128,288,63.62
pit_b_224,318.92,400.391,128,224,73.76
pit_b_distilled_224,318.09,401.455,128,224,74.79
gcresnext50ts,316.73,604.706,192,256,15.67
repvgg_b3,315.56,1215.795,384,224,123.09
gluon_seresnext101_64x4d,313.23,406.424,128,224,88.23
regnetz_d8,311.3,204.013,64,320,23.37
poolformer_m36,307.82,413.95,128,224,56.17
xception41,305.16,418.228,128,299,26.97
resnetv2_50d_gn,304.62,419.347,128,288,25.57
coat_tiny,304.05,418.955,128,224,5.5
vit_small_patch16_36x1_224,302.22,420.928,128,224,64.67
swin_v2_cr_small_224,302.05,421.43,128,224,49.7
cait_s24_224,301.09,422.497,128,224,46.92
mixnet_xl,300.81,849.169,256,224,11.9
vit_small_patch16_18x2_224,299.9,424.102,128,224,64.67
resnetv2_50d_frn,299.64,426.047,128,224,25.59
efficientnetv2_s,299.05,318.819,96,384,21.46
twins_svt_large,298.59,426.599,128,224,99.27
tf_efficientnetv2_s,297.45,320.544,96,384,21.46
tf_efficientnetv2_s_in21ft1k,295.84,322.238,96,384,21.46
efficientnetv2_rw_s,295.58,214.343,64,384,23.94
nest_small,295.19,323.571,96,224,38.35
jx_nest_small,293.04,325.974,96,224,38.35
hrnet_w40,291.68,653.529,192,224,57.56
regnetz_005,286.8,1783.819,512,224,7.12
nf_regnet_b2,284.19,1799.964,512,272,14.31
dpn98,283.21,450.397,128,224,61.57
gluon_xception65,280.82,339.911,96,299,39.92
xception65,279.18,341.922,96,299,39.92
resnet51q,277.93,689.969,192,288,35.7
nf_regnet_b4,277.28,459.47,128,384,30.21
swin_s3_small_224,277.05,344.678,96,224,49.74
swin_base_patch4_window7_224,275.83,462.244,128,224,87.77
xception65p,274.95,464.241,128,299,39.82
gmlp_b16_224,270.89,352.851,96,224,73.08
hrnet_w48,266.5,475.648,128,224,77.47
resnest26d,258.35,1485.604,384,224,17.07
resnest50d_1s4x24d,258.11,990.508,256,224,25.68
xcit_tiny_24_p16_384_dist,251.56,378.207,96,384,12.12
regnetz_c16,250.16,510.248,128,320,13.46
crossvit_base_240,249.99,382.366,96,240,105.03
coat_mini,247.34,515.482,128,224,10.34
xcit_medium_24_p16_224,244.65,519.851,128,224,84.4
xcit_medium_24_p16_224_dist,244.08,521.069,128,224,84.4
hrnet_w44,241.77,789.352,192,224,67.06
efficientnet_b4,241.47,262.953,64,384,19.34
volo_d2_224,238.72,400.292,96,224,58.68
tf_efficientnet_b4,236.52,268.528,64,380,19.34
tf_efficientnet_b4_ap,236.39,268.69,64,380,19.34
tf_efficientnet_b4_ns,236.1,269.028,64,380,19.34
vit_small_patch16_384,235.52,270.897,64,384,22.2
resnetv2_50d_evob,235.19,406.951,96,224,25.59
tresnet_m,234.84,2177.505,512,224,31.39
nfnet_l0,233.22,1096.449,256,288,35.07
visformer_small,232.72,1649.37,384,224,40.22
xcit_small_12_p16_384_dist,230.87,414.048,96,384,26.25
vit_large_r50_s32_224,228.88,417.083,96,224,328.99
convit_base,228.65,558.76,128,224,86.54
eca_nfnet_l0,226.92,1127.142,256,288,24.14
resnetv2_50d_evos,223.6,285.051,64,288,25.59
tnt_b_patch16_224,222.54,573.324,128,224,65.41
vit_small_r26_s32_384,221.41,287.746,64,384,36.47
densenet264,220.0,432.388,96,224,72.69
swin_s3_base_224,219.07,435.557,96,224,71.13
hrnet_w64,218.91,579.973,128,224,128.06
resnext101_64x4d,217.22,440.378,96,288,83.46
resnet152d,216.09,441.927,96,320,60.21
xception71,215.62,294.681,64,299,42.34
swin_v2_cr_base_224,215.23,443.662,96,224,87.88
dpn131,211.54,603.038,128,224,79.25
nest_base,210.35,302.564,64,224,67.72
vit_base_r50_s16_224,208.87,457.959,96,224,98.66
jx_nest_base,208.37,305.48,64,224,67.72
resnet61q,207.91,614.63,128,288,36.85
mixnet_xxl,202.79,629.315,128,224,23.96
xcit_nano_12_p8_384_dist,196.18,324.445,64,384,3.05
poolformer_m48,193.91,492.638,96,224,73.47
xcit_tiny_24_p8_224,190.8,499.85,96,224,12.11
xcit_tiny_24_p8_224_dist,190.62,500.3,96,224,12.11
seresnet200d,190.31,500.165,96,256,71.86
ecaresnet200d,182.23,523.49,96,256,64.69
regnetz_b16,181.6,1055.83,192,288,9.72
convnext_large_in22ft1k,180.86,529.028,96,224,197.77
convnext_large,180.64,529.707,96,224,197.77
convmixer_768_32,179.25,534.249,96,224,21.11
repvgg_b1g4,178.41,2868.637,512,224,39.97
regnety_032,178.01,1436.656,256,288,19.44
regnetx_032,177.68,2160.015,384,224,15.3
resnest50d,177.64,1439.786,256,224,27.48
gluon_senet154,177.29,538.24,96,224,115.09
senet154,176.81,539.67,96,224,115.09
halonet_h1,175.85,362.526,64,256,8.1
legacy_senet154,175.45,543.752,96,224,115.09
xcit_small_12_p8_224,174.99,363.954,64,224,26.21
xcit_small_12_p8_224_dist,174.85,364.256,64,224,26.21
seresnet152d,173.74,364.967,64,320,66.84
dpn107,173.23,552.482,96,224,86.92
mixer_l16_224,172.58,554.751,96,224,208.2
resnetrs152,171.14,370.433,64,320,86.62
resnest50d_4s2x40d,167.22,1529.64,256,224,30.42
resnet200d,166.19,382.131,64,320,64.69
volo_d3_224,162.25,391.709,64,224,86.33
regnetx_040,161.87,2371.182,384,224,22.12
vit_large_patch32_384,161.83,591.666,96,384,306.63
efficientnet_b3_gn,160.22,397.781,64,320,11.73
swin_large_patch4_window7_224,151.35,420.987,64,224,196.53
regnetx_080,150.02,2558.515,384,224,39.57
regnety_040s_gn,149.61,853.983,128,224,20.65
efficientnetv2_m,149.31,318.263,48,416,54.14
ssl_resnext101_32x8d,147.46,866.477,128,224,88.79
resnext101_32x8d,147.32,867.312,128,224,88.79
swsl_resnext101_32x8d,147.16,868.264,128,224,88.79
ig_resnext101_32x8d,146.91,869.691,128,224,88.79
regnetz_e8,146.27,326.163,48,320,57.7
resnetv2_50x1_bitm,140.77,340.162,48,448,25.55
seresnet269d,137.16,460.633,64,256,113.67
xcit_large_24_p16_224,136.77,464.553,64,224,189.1
xcit_large_24_p16_224_dist,136.73,464.738,64,224,189.1
xcit_tiny_12_p8_384_dist,128.06,373.098,48,384,6.71
efficientnetv2_rw_m,126.21,250.013,32,416,53.24
regnetx_064,124.13,2061.422,256,224,26.21
resnetrs200,123.75,383.587,48,320,93.21
dm_nfnet_f0,121.58,2104.417,256,256,71.49
swin_v2_cr_large_224,120.98,394.327,48,224,196.68
regnety_040,120.25,1595.17,192,288,20.65
nfnet_f0,119.63,2138.729,256,256,71.49
regnetv_040,118.92,1074.873,128,288,20.64
ese_vovnet99b_iabn,118.27,3243.951,384,224,63.2
xcit_small_24_p16_384_dist,117.41,405.393,48,384,47.67
regnetz_b16_evos,117.22,544.08,64,288,9.74
crossvit_15_dagger_408,116.43,273.026,32,408,28.5
efficientnet_b0_g8_gn,115.43,2216.717,256,224,6.56
vit_large_patch16_224,115.39,553.095,64,224,304.33
regnetz_c16_evos,115.15,414.978,48,320,13.49
vit_base_patch16_18x2_224,114.12,558.126,64,224,256.73
convnext_xlarge_in22ft1k,114.03,559.475,64,224,350.2
convnext_tiny_384_in22ft1k,112.53,424.859,48,384,88.59
convnext_small_384_in22ft1k,112.49,424.983,48,384,88.59
convnext_base_384_in22ft1k,112.43,425.187,48,384,88.59
swin_v2_cr_tiny_384,111.07,286.85,32,384,28.33
tf_efficientnetv2_m,109.5,289.071,32,480,54.14
tf_efficientnetv2_m_in21ft1k,109.37,289.435,32,480,54.14
beit_large_patch16_224,106.59,598.365,64,224,304.43
volo_d1_384,104.96,303.527,32,384,26.78
tresnet_l,104.79,4882.686,512,224,55.99
repvgg_b2g4,102.33,5002.104,512,224,61.76
eca_nfnet_l1,101.26,1262.246,128,320,41.41
volo_d4_224,101.17,471.916,48,224,192.96
cspdarknet53_iabn,98.65,3890.155,384,256,27.64
cait_xxs24_384,98.52,484.701,48,384,12.03
efficientnet_b5,97.28,326.485,32,456,30.39
tf_efficientnet_b5,95.75,331.792,32,456,30.39
tf_efficientnet_b5_ns,95.73,331.833,32,456,30.39
vit_base_patch16_384,95.69,333.568,32,384,86.86
deit_base_patch16_384,95.67,333.658,32,384,86.86
regnetz_d8_evos,95.66,332.456,32,320,23.46
tf_efficientnet_b5_ap,95.45,332.725,32,456,30.39
regnetz_040,94.56,674.98,64,320,27.12
regnetz_040h,94.14,678.01,64,320,28.94
deit_base_distilled_patch16_384,93.65,340.87,32,384,87.63
tresnet_xl,90.76,4227.401,384,224,78.44
cspresnext50_iabn,89.98,4265.102,384,256,20.57
resnest101e,89.71,1424.366,128,256,48.28
crossvit_18_dagger_408,87.99,361.633,32,408,44.61
xcit_small_24_p8_224,87.73,361.414,32,224,47.63
xcit_small_24_p8_224_dist,87.7,361.441,32,224,47.63
resnetv2_101x1_bitm,86.89,366.637,32,448,44.54
nf_regnet_b5,86.53,736.892,64,456,49.74
resnetv2_152x2_bit_teacher,86.4,368.012,32,224,236.34
repvgg_b3g4,84.75,4530.071,384,224,83.83
vit_large_patch14_224,84.3,567.814,48,224,304.2
beit_base_patch16_384,82.73,385.724,32,384,86.74
seresnext101_32x8d,81.65,781.61,64,288,93.57
xcit_medium_24_p16_384_dist,81.08,391.168,32,384,84.4
ecaresnet269d,77.88,406.425,32,352,102.09
regnetx_120,77.54,3300.773,256,224,46.11
pnasnet5large,76.44,414.721,32,331,86.06
vit_large_r50_s32_384,75.76,419.984,32,384,329.09
regnety_120,75.34,2547.007,192,224,51.82
resnetrs270,75.27,419.128,32,352,129.86
swin_base_patch4_window12_384,73.61,432.836,32,384,87.9
regnety_064,72.69,1759.173,128,288,30.58
regnetz_d32,72.18,885.049,64,320,27.58
regnetv_064,71.81,1780.738,128,288,30.58
resmlp_big_24_224,68.34,466.717,32,224,129.14
resmlp_big_24_224_in22ft1k,68.31,466.955,32,224,129.14
resmlp_big_24_distilled_224,68.26,467.278,32,224,129.14
regnety_320,67.49,1895.092,128,224,145.05
nasnetalarge,66.93,473.19,32,331,88.75
swin_v2_cr_small_384,66.26,359.874,24,384,49.7
cait_xs24_384,65.98,482.447,32,384,26.67
regnety_080,65.45,1954.421,128,288,39.18
regnetx_160,65.01,2952.336,192,224,54.28
xcit_tiny_24_p8_384_dist,64.61,492.002,32,384,12.11
volo_d5_224,64.26,494.733,32,224,295.46
cait_xxs36_384,63.75,498.023,32,384,17.37
vit_base_patch8_224,62.96,380.293,24,224,86.58
xcit_medium_24_p8_224,62.71,506.97,32,224,84.32
xcit_medium_24_p8_224_dist,62.63,507.407,32,224,84.32
efficientnet_b3_g8_gn,62.43,1023.448,64,320,14.25
convnext_large_384_in22ft1k,61.7,516.836,32,384,197.77
convmixer_1024_20_ks9_p14,61.37,4170.722,256,224,24.38
efficientnet_b0_g16_evos,60.46,6350.16,384,224,8.11
tf_efficientnetv2_l_in21ft1k,60.25,261.19,16,480,118.52
xcit_small_12_p8_384_dist,60.04,398.02,24,384,26.21
efficientnetv2_l,59.19,265.957,16,480,118.52
vit_base_resnet50_384,59.0,405.136,24,384,98.95
tf_efficientnetv2_l,58.87,267.382,16,480,118.52
vit_base_r50_s16_384,58.85,406.196,24,384,98.95
volo_d2_384,58.21,273.125,16,384,58.87
tresnet_m_448,53.55,3582.526,192,448,31.39
cait_s24_384,49.8,479.385,24,384,47.06
regnety_160,48.14,1993.0,96,288,83.59
ig_resnext101_32x16d,47.52,2018.737,96,224,194.03
swsl_resnext101_32x16d,47.43,2022.391,96,224,194.03
xcit_large_24_p16_384_dist,47.39,502.961,24,384,189.1
ssl_resnext101_32x16d,47.29,2028.601,96,224,194.03
resnetrs350,47.22,500.442,24,384,163.96
swin_v2_cr_base_384,47.19,336.677,16,384,87.88
swin_v2_cr_huge_224,46.26,343.404,16,224,657.83
regnetx_320,46.16,2771.769,128,224,107.81
eca_nfnet_l2,44.82,1425.204,64,384,56.72
efficientnet_b6,42.73,371.597,16,528,43.04
tf_efficientnet_b6,41.54,382.29,16,528,43.04
tf_efficientnet_b6_ns,41.5,382.621,16,528,43.04
tf_efficientnet_b6_ap,41.36,384.088,16,528,43.04
swin_large_patch4_window12_384,41.02,388.265,16,384,196.74
nfnet_f1,40.44,2371.478,96,320,132.63
vit_huge_patch14_224,39.64,401.543,16,224,632.05
dm_nfnet_f1,38.26,1670.645,64,320,132.63
convnext_xlarge_384_in22ft1k,37.04,430.195,16,384,350.2
efficientnet_b7,36.68,214.625,8,600,66.35
efficientnetv2_xl,36.54,322.755,12,512,208.12
tf_efficientnetv2_xl_in21ft1k,36.36,324.371,12,512,208.12
tf_efficientnet_b7_ap,36.21,217.402,8,600,66.35
tf_efficientnet_b7,36.05,218.422,8,600,66.35
tf_efficientnet_b7_ns,35.41,221.975,8,600,66.35
xcit_large_24_p8_224,34.96,454.269,16,224,188.93
xcit_large_24_p8_224_dist,34.92,454.696,16,224,188.93
resnetrs420,32.2,487.619,16,416,191.89
cait_s36_384,32.09,494.814,16,384,68.37
resnest200e,32.0,1494.763,48,320,70.2
densenet264d_iabn,31.93,4003.992,128,224,72.74
resnetv2_50x3_bitm,31.81,502.194,16,448,217.32
xcit_small_24_p8_384_dist,29.97,396.971,12,384,47.63
resnetv2_152x2_bit_teacher_384,29.92,398.698,12,384,236.34
vit_large_patch16_384,28.85,414.372,12,384,304.72
swin_v2_cr_large_384,28.47,419.18,12,384,196.68
tresnet_l_448,25.64,4988.984,128,448,55.99
beit_large_patch16_384,25.11,475.839,12,384,305.0
volo_d3_448,24.79,320.233,8,448,86.63
eca_nfnet_l3,24.19,1319.468,32,448,72.04
tresnet_xl_448,23.17,4140.191,96,448,78.44
nfnet_f2,22.36,2143.929,48,352,193.78
vit_giant_patch14_224,22.25,356.944,8,224,1012.61
dm_nfnet_f2,22.12,2166.49,48,352,193.78
efficientnet_cc_b0_8e,21.78,44.083,1,224,24.01
resnetv2_152x2_bitm,21.74,365.619,8,448,236.34
tf_efficientnet_cc_b0_4e,21.44,44.859,1,224,13.31
tf_efficientnet_cc_b0_8e,20.98,45.875,1,224,24.01
xcit_medium_24_p8_384_dist,20.42,388.21,8,384,84.32
efficientnet_cc_b0_4e,20.36,47.276,1,224,13.31
ig_resnext101_32x32d,18.17,1760.063,32,224,468.53
volo_d4_448,17.6,338.311,6,448,193.41
resnetv2_101x3_bitm,17.53,454.801,8,448,387.93
tf_efficientnet_cc_b1_8e,17.15,56.006,1,240,39.72
efficientnet_cc_b1_8e,16.21,59.398,1,240,39.72
resnest269e,13.09,1826.778,24,416,110.93
tf_efficientnet_b8_ap,12.18,488.771,6,672,87.41
efficientnet_b8,12.12,491.31,6,672,87.41
nfnet_f3,12.08,1982.52,24,416,254.92
xcit_large_24_p8_384_dist,11.91,500.622,6,384,188.93
tf_efficientnet_b8,11.9,500.516,6,672,87.41
cait_m36_384,11.87,501.638,6,384,271.22
dm_nfnet_f3,11.74,2040.434,24,416,254.92
volo_d5_448,11.52,343.952,4,448,295.91
swin_v2_cr_huge_384,10.9,364.423,4,384,657.94
convmixer_1536_20,9.63,4981.706,48,224,51.63
beit_large_patch16_512,9.42,422.773,4,512,305.67
tf_efficientnet_l2_ns_475,9.0,327.787,3,475,480.31
ig_resnext101_32x48d,8.5,1880.808,16,224,828.41
volo_d5_512,8.05,369.448,3,512,296.09
nfnet_f4,6.47,1849.874,12,512,316.07
dm_nfnet_f4,6.2,1929.065,12,512,316.07
cait_m48_448,4.75,415.897,2,448,356.46
nfnet_f5,4.63,1719.792,8,544,377.21
resnetv2_152x4_bitm,4.49,443.185,2,480,936.53
dm_nfnet_f5,4.47,1782.137,8,544,377.21
nfnet_f6,3.5,1707.387,6,576,438.36
dm_nfnet_f6,3.39,1759.608,6,576,438.36
nfnet_f7,2.67,1489.771,4,608,499.5
efficientnet_l2,2.09,473.733,1,800,480.31
tf_efficientnet_l2_ns,2.09,474.031,1,800,480.31
hf_public_repos/pytorch-image-models/results/results-imagenet-r.csv
model,top1,top1_err,top5,top5_err,param_count,img_size,crop_pct,interpolation,top1_diff,top5_diff,rank_diff
convnext_xxlarge.clip_laion2b_soup_ft_in1k,90.623,9.377,97.913,2.087,846.47,256,1.000,bicubic,-7.127,-1.897,+18
eva_giant_patch14_336.clip_ft_in1k,90.550,9.450,97.230,2.770,"1,013.01",336,1.000,bicubic,-7.310,-2.650,+6
eva02_large_patch14_448.mim_m38m_ft_in1k,90.457,9.543,97.267,2.733,305.08,448,1.000,bicubic,-7.373,-2.553,+6
eva02_large_patch14_448.mim_m38m_ft_in22k_in1k,90.293,9.707,97.170,2.830,305.08,448,1.000,bicubic,-7.737,-2.720,-2
eva_giant_patch14_224.clip_ft_in1k,89.843,10.157,97.023,2.977,"1,012.56",224,0.900,bicubic,-7.727,-2.687,+29
eva_giant_patch14_336.m30m_ft_in22k_in1k,88.590,11.410,95.920,4.080,"1,013.01",336,1.000,bicubic,-9.400,-3.980,-2
eva_giant_patch14_560.m30m_ft_in22k_in1k,88.397,11.603,95.610,4.390,"1,014.45",560,1.000,bicubic,-9.603,-4.250,-4
regnety_1280.swag_ft_in1k,88.257,11.743,96.477,3.523,644.81,384,1.000,bicubic,-9.523,-3.383,+6
eva02_large_patch14_448.mim_in22k_ft_in22k_in1k,87.980,12.020,95.610,4.390,305.08,448,1.000,bicubic,-10.170,-4.270,-8
eva02_large_patch14_448.mim_in22k_ft_in1k,87.587,12.413,95.730,4.270,305.08,448,1.000,bicubic,-10.273,-4.060,-3
regnety_1280.swag_lc_in1k,86.943,13.057,95.737,4.263,644.81,224,0.965,bicubic,-10.447,-4.003,+36
convnext_large_mlp.clip_laion2b_augreg_ft_in1k_384,84.437,15.563,94.297,5.703,200.13,384,1.000,bicubic,-12.953,-5.433,+34
vit_large_patch14_clip_336.openai_ft_in12k_in1k,83.923,16.077,93.873,6.127,304.53,336,1.000,bicubic,-13.677,-5.857,+18
vit_large_patch14_clip_336.laion2b_ft_in1k,83.610,16.390,93.510,6.490,304.53,336,1.000,bicubic,-13.620,-6.210,+63
eva_large_patch14_336.in22k_ft_in1k,83.520,16.480,93.103,6.897,304.53,336,1.000,bicubic,-14.290,-6.757,-4
vit_huge_patch14_clip_224.laion2b_ft_in1k,83.263,16.737,93.133,6.867,632.05,224,1.000,bicubic,-13.837,-6.557,+80
vit_huge_patch14_clip_336.laion2b_ft_in12k_in1k,83.047,16.953,92.837,7.163,632.46,336,1.000,bicubic,-14.563,-6.943,+12
vit_huge_patch14_clip_224.laion2b_ft_in12k_in1k,82.797,17.203,92.613,7.387,632.05,224,1.000,bicubic,-14.563,-7.187,+36
vit_large_patch14_clip_224.openai_ft_in1k,82.323,17.677,92.907,7.093,304.20,224,1.000,bicubic,-15.117,-6.773,+24
convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_384,82.217,17.783,92.433,7.567,200.13,384,1.000,bicubic,-15.253,-7.327,+19
vit_large_patch14_clip_224.laion2b_ft_in1k,81.700,18.300,92.263,7.737,304.20,224,1.000,bicubic,-15.320,-7.407,+90
convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_320,81.347,18.653,91.963,8.037,200.13,320,1.000,bicubic,-15.913,-7.747,+48
regnety_320.swag_lc_in1k,81.317,18.683,93.153,6.847,145.05,224,0.965,bicubic,-15.463,-6.467,+123
eva_large_patch14_196.in22k_ft_in1k,81.300,18.700,91.533,8.467,304.14,196,1.000,bicubic,-16.220,-8.257,+12
regnety_320.swag_ft_in1k,81.203,18.797,92.707,7.293,145.05,384,1.000,bicubic,-16.177,-7.013,+23
convnext_large_mlp.clip_laion2b_augreg_ft_in1k,80.870,19.130,91.917,8.083,200.13,256,1.000,bicubic,-16.260,-7.863,+65
eva_large_patch14_336.in22k_ft_in22k_in1k,80.087,19.913,89.357,10.643,304.53,336,1.000,bicubic,-17.773,-10.443,-21
resnext101_32x32d.fb_wsl_ig1b_ft_in1k,79.467,20.533,89.197,10.803,468.53,224,0.875,bilinear,-17.303,-10.423,+124
resnext101_32x16d.fb_wsl_ig1b_ft_in1k,78.830,21.170,88.473,11.527,194.03,224,0.875,bilinear,-17.600,-11.157,+183
vit_large_patch14_clip_224.openai_ft_in12k_in1k,78.677,21.323,88.920,11.080,304.20,224,1.000,bicubic,-18.933,-10.810,0
eva_large_patch14_196.in22k_ft_in22k_in1k,78.500,21.500,88.330,11.670,304.14,196,1.000,bicubic,-19.110,-11.480,-3
vit_large_patch14_clip_336.laion2b_ft_in12k_in1k,78.433,21.567,88.500,11.500,304.53,336,1.000,bicubic,-19.027,-11.280,+8
vit_large_patch14_clip_224.laion2b_ft_in12k_in1k,78.267,21.733,88.673,11.327,304.20,224,1.000,bicubic,-19.123,-11.057,+12
regnety_160.swag_lc_in1k,78.187,21.813,91.663,8.337,83.59,224,0.965,bicubic,-18.263,-7.947,+172
regnety_160.swag_ft_in1k,77.683,22.317,90.737,9.263,83.59,384,1.000,bicubic,-19.487,-8.903,+50
eva02_base_patch14_448.mim_in22k_ft_in1k,77.610,22.390,89.307,10.693,87.12,448,1.000,bicubic,-20.110,-10.453,-14
eva02_base_patch14_448.mim_in22k_ft_in22k_in1k,77.497,22.503,88.490,11.510,87.12,448,1.000,bicubic,-20.113,-11.330,-10
tf_efficientnet_l2.ns_jft_in1k_475,76.490,23.510,88.640,11.360,480.31,475,0.936,bicubic,-21.260,-11.150,-20
beitv2_large_patch16_224.in1k_ft_in22k_in1k,76.353,23.647,87.090,12.910,304.43,224,0.950,bicubic,-21.397,-12.730,-19
resnext101_32x16d.fb_swsl_ig1b_ft_in1k,76.300,23.700,87.743,12.257,194.03,224,0.875,bilinear,-19.970,-11.757,+206
convnextv2_huge.fcmae_ft_in22k_in1k_384,75.953,24.047,86.637,13.363,660.29,384,1.000,bicubic,-21.917,-13.273,-36
convnextv2_huge.fcmae_ft_in22k_in1k_512,75.823,24.177,86.943,13.057,660.29,512,1.000,bicubic,-21.987,-12.847,-32
resnext101_32x8d.fb_wsl_ig1b_ft_in1k,75.797,24.203,86.197,13.803,88.79,224,0.875,bilinear,-20.143,-13.183,+277
resnext101_32x8d.fb_swsl_ig1b_ft_in1k,75.600,24.400,86.943,13.057,88.79,224,0.875,bilinear,-20.650,-12.597,+203
convnext_base.clip_laiona_augreg_ft_in1k_384,75.210,24.790,88.580,11.420,88.59,384,1.000,bicubic,-21.650,-11.110,+88
convnext_base.clip_laion2b_augreg_ft_in12k_in1k_384,75.173,24.827,88.460,11.540,88.59,384,1.000,bicubic,-21.867,-11.210,+58
tf_efficientnet_l2.ns_jft_in1k,74.657,25.343,87.557,12.443,480.31,800,0.960,bicubic,-23.123,-12.263,-34
convnext_base.clip_laion2b_augreg_ft_in12k_in1k,73.733,26.267,87.343,12.657,88.59,256,1.000,bicubic,-23.057,-12.337,+97
convnext_base.clip_laion2b_augreg_ft_in1k,73.400,26.600,87.147,12.853,88.59,256,1.000,bicubic,-23.160,-12.503,+134
beit_large_patch16_384.in22k_ft_in22k_in1k,73.277,26.723,85.030,14.970,305.00,384,1.000,bicubic,-24.533,-14.810,-38
beit_large_patch16_512.in22k_ft_in22k_in1k,73.160,26.840,85.090,14.910,305.67,512,1.000,bicubic,-24.620,-14.800,-36
resnext101_32x4d.fb_swsl_ig1b_ft_in1k,72.657,27.343,85.160,14.840,44.18,224,0.875,bilinear,-23.383,-14.250,+242
maxvit_xlarge_tf_512.in21k_ft_in1k,71.873,28.127,82.927,17.073,475.77,512,1.000,bicubic,-25.887,-16.893,-36
maxvit_xlarge_tf_384.in21k_ft_in1k,71.703,28.297,82.713,17.287,475.32,384,1.000,bicubic,-26.037,-17.137,-33
beit_large_patch16_224.in22k_ft_in22k_in1k,71.040,28.960,83.430,16.570,304.43,224,0.900,bicubic,-26.440,-16.260,-17
deit3_huge_patch14_224.fb_in22k_ft_in1k,70.817,29.183,82.200,17.800,632.13,224,1.000,bicubic,-26.433,-17.520,+17
vit_base_patch16_clip_384.laion2b_ft_in1k,70.777,29.223,83.820,16.180,86.86,384,1.000,bicubic,-26.123,-15.850,+69
caformer_b36.sail_in22k_ft_in1k_384,70.750,29.250,82.650,17.350,98.75,384,1.000,bicubic,-26.910,-17.210,-34
deit3_large_patch16_384.fb_in22k_ft_in1k,70.580,29.420,82.437,17.563,304.76,384,1.000,bicubic,-27.000,-17.273,-26
beitv2_large_patch16_224.in1k_ft_in1k,70.403,29.597,83.373,16.627,304.43,224,0.950,bicubic,-26.907,-16.387,0
maxvit_base_tf_512.in21k_ft_in1k,70.383,29.617,81.600,18.400,119.88,512,1.000,bicubic,-27.377,-18.260,-45
maxvit_large_tf_512.in21k_ft_in1k,70.380,29.620,81.650,18.350,212.33,512,1.000,bicubic,-27.290,-18.080,-39
maxvit_large_tf_384.in21k_ft_in1k,70.010,29.990,81.037,18.963,212.03,384,1.000,bicubic,-27.650,-18.783,-38
deit3_large_patch16_224.fb_in22k_ft_in1k,69.710,30.290,81.197,18.803,304.37,224,1.000,bicubic,-27.600,-18.483,-3
maxvit_base_tf_384.in21k_ft_in1k,69.557,30.443,80.730,19.270,119.65,384,1.000,bicubic,-28.003,-19.030,-30
maxvit_rmlp_base_rw_384.sw_in12k_ft_in1k,69.130,30.870,80.060,19.940,116.14,384,1.000,bicubic,-28.210,-19.630,-11
resnext50_32x4d.fb_swsl_ig1b_ft_in1k,68.970,31.030,82.803,17.197,25.03,224,0.875,bilinear,-26.660,-16.607,+317
vit_base_patch16_clip_224.laion2b_ft_in1k,68.780,31.220,82.503,17.497,86.57,224,1.000,bicubic,-27.540,-17.017,+165
convnextv2_large.fcmae_ft_in22k_in1k_384,68.653,31.347,81.070,18.930,197.96,384,1.000,bicubic,-28.977,-18.730,-43
caformer_b36.sail_in22k_ft_in1k,68.603,31.397,80.857,19.143,98.75,224,1.000,bicubic,-28.757,-18.973,-17
resnet50.fb_swsl_ig1b_ft_in1k,68.287,31.713,83.313,16.687,25.56,224,0.875,bilinear,-26.913,-16.077,+404
convnext_xlarge.fb_in22k_ft_in1k_384,68.157,31.843,80.453,19.547,350.20,384,1.000,bicubic,-29.433,-19.317,-40
maxxvitv2_rmlp_base_rw_384.sw_in12k_ft_in1k,68.077,31.923,80.740,19.260,116.09,384,1.000,bicubic,-29.373,-19.020,-31
swinv2_large_window12to24_192to384.ms_in22k_ft_in1k,67.667,32.333,80.100,19.900,196.74,384,1.000,bicubic,-29.613,-19.680,-7
tf_efficientnet_b7.ns_jft_in1k,67.507,32.493,81.383,18.617,66.35,600,0.949,bicubic,-29.683,-18.317,+8
vit_base_patch16_clip_384.openai_ft_in1k,67.350,32.650,81.687,18.313,86.86,384,1.000,bicubic,-29.460,-18.023,+65
vit_large_patch16_384.augreg_in21k_ft_in1k,67.057,32.943,78.697,21.303,304.72,384,1.000,bicubic,-30.353,-21.083,-33
convnextv2_base.fcmae_ft_in22k_in1k_384,67.030,32.970,79.800,20.200,88.72,384,1.000,bicubic,-30.350,-19.960,-29
convformer_b36.sail_in22k_ft_in1k_384,66.823,33.177,79.443,20.557,99.88,384,1.000,bicubic,-30.667,-20.317,-42
convnext_large.fb_in22k_ft_in1k_384,66.673,33.327,79.797,20.203,197.77,384,1.000,bicubic,-30.627,-19.963,-18
coatnet_rmlp_2_rw_384.sw_in12k_ft_in1k,66.580,33.420,78.413,21.587,73.88,384,1.000,bicubic,-30.790,-21.287,-30
maxvit_rmlp_base_rw_224.sw_in12k_ft_in1k,66.530,33.470,78.027,21.973,116.14,224,0.950,bicubic,-30.580,-21.573,+13
swin_large_patch4_window12_384.ms_in22k_ft_in1k,66.287,33.713,79.747,20.253,196.74,384,1.000,bicubic,-30.883,-19.993,+4
vit_base_patch16_clip_384.laion2b_ft_in12k_in1k,66.147,33.853,78.883,21.117,86.86,384,1.000,bicubic,-31.073,-20.817,-5
vit_base_patch16_clip_224.openai_ft_in1k,66.027,33.973,80.980,19.020,86.57,224,0.900,bicubic,-30.283,-18.520,+151
beitv2_base_patch16_224.in1k_ft_in22k_in1k,65.757,34.243,78.893,21.107,86.53,224,0.900,bicubic,-31.153,-20.837,+38
swinv2_base_window12to24_192to384.ms_in22k_ft_in1k,65.733,34.267,79.317,20.683,87.92,384,1.000,bicubic,-31.537,-20.473,-19
swinv2_large_window12to16_192to256.ms_in22k_ft_in1k,65.640,34.360,78.480,21.520,196.74,256,0.900,bicubic,-31.600,-21.230,-13
tf_efficientnet_b6.ns_jft_in1k,65.583,34.417,79.557,20.443,43.04,528,0.942,bicubic,-31.437,-20.153,+20
caformer_m36.sail_in22k_ft_in1k_384,65.580,34.420,78.703,21.297,56.20,384,1.000,bicubic,-31.790,-21.087,-40
convformer_b36.sail_in22k_ft_in1k,65.513,34.487,78.140,21.860,99.88,224,1.000,bicubic,-31.747,-21.610,-22
convnext_xlarge.fb_in22k_ft_in1k,65.373,34.627,78.350,21.650,350.20,288,1.000,bicubic,-32.077,-21.470,-51
vit_base_patch16_clip_384.openai_ft_in12k_in1k,65.353,34.647,78.957,21.043,86.86,384,0.950,bicubic,-31.767,-20.613,-1
convnext_base.fb_in22k_ft_in1k_384,64.903,35.097,78.387,21.613,88.59,384,1.000,bicubic,-32.357,-21.353,-23
vit_base_patch16_clip_224.laion2b_ft_in12k_in1k,64.767,35.233,77.770,22.230,86.57,224,0.950,bicubic,-31.853,-21.790,+79
convnextv2_large.fcmae_ft_in22k_in1k,64.710,35.290,78.157,21.843,197.96,288,1.000,bicubic,-32.600,-21.583,-37
vit_large_patch16_224.augreg_in21k_ft_in1k,64.360,35.640,76.180,23.820,304.33,224,0.900,bicubic,-32.340,-23.390,+64
convnext_large.fb_in22k_ft_in1k,64.263,35.737,77.773,22.227,197.77,288,1.000,bicubic,-32.957,-21.957,-20
vit_large_r50_s32_384.augreg_in21k_ft_in1k,64.103,35.897,75.847,24.153,329.09,384,1.000,bicubic,-32.847,-23.783,+18
maxxvitv2_rmlp_base_rw_224.sw_in12k_ft_in1k,64.010,35.990,77.513,22.487,116.09,224,0.950,bicubic,-33.080,-22.167,-3
swin_large_patch4_window7_224.ms_in22k_ft_in1k,63.887,36.113,78.187,21.813,196.53,224,0.900,bicubic,-33.053,-21.483,+20
seresnextaa201d_32x8d.sw_in12k_ft_in1k_384,63.653,36.347,76.623,23.377,149.39,384,1.000,bicubic,-33.637,-23.157,-37
beit_base_patch16_384.in22k_ft_in22k_in1k,63.623,36.377,78.103,21.897,86.74,384,1.000,bicubic,-33.697,-21.617,-46
caformer_m36.sail_in22k_ft_in1k,63.493,36.507,76.920,23.080,56.20,224,1.000,bicubic,-33.527,-22.810,+4
swin_base_patch4_window12_384.ms_in22k_ft_in1k,63.483,36.517,78.050,21.950,87.90,384,1.000,bicubic,-33.647,-21.670,-15
convnextv2_base.fcmae_ft_in22k_in1k,63.307,36.693,77.163,22.837,88.72,288,1.000,bicubic,-33.893,-22.597,-25
caformer_s36.sail_in22k_ft_in1k_384,63.263,36.737,77.460,22.540,39.30,384,1.000,bicubic,-34.027,-22.290,-41
swinv2_base_window12to16_192to256.ms_in22k_ft_in1k,63.183,36.817,77.093,22.907,87.92,256,0.900,bicubic,-33.867,-22.497,-6
tf_efficientnet_b5.ns_jft_in1k,63.053,36.947,77.787,22.213,30.39,456,0.934,bicubic,-33.817,-21.853,+23
vit_base_patch16_clip_224.openai_ft_in12k_in1k,62.943,37.057,76.597,23.403,86.57,224,0.950,bicubic,-33.567,-22.953,+87
deit3_base_patch16_384.fb_in22k_ft_in1k,62.637,37.363,75.550,24.450,86.88,384,1.000,bicubic,-34.603,-24.190,-35
convnext_base.fb_in22k_ft_in1k,62.537,37.463,76.550,23.450,88.59,288,1.000,bicubic,-34.663,-23.210,-32
coatnet_rmlp_2_rw_224.sw_in12k_ft_in1k,62.417,37.583,75.130,24.870,73.88,224,0.950,bicubic,-34.763,-24.520,-29
vit_base_patch8_224.augreg2_in21k_ft_in1k,62.400,37.600,76.600,23.400,86.58,224,0.900,bicubic,-34.550,-23.010,+4
tf_efficientnetv2_l.in21k_ft_in1k,62.363,37.637,76.757,23.243,118.52,480,1.000,bicubic,-34.957,-22.883,-57
vit_base_patch8_224.augreg_in21k_ft_in1k,62.177,37.823,75.627,24.373,86.58,224,0.900,bicubic,-34.913,-23.983,-18
convformer_m36.sail_in22k_ft_in1k_384,62.103,37.897,75.540,24.460,57.05,384,1.000,bicubic,-35.267,-24.140,-65
tf_efficientnetv2_xl.in21k_ft_in1k,62.083,37.917,75.653,24.347,208.12,512,1.000,bicubic,-35.247,-23.947,-62
hrnet_w48_ssld.paddle_in1k,61.923,38.077,75.163,24.837,77.47,288,1.000,bilinear,-34.617,-24.477,+70
deit3_base_patch16_224.fb_in22k_ft_in1k,61.803,38.197,74.717,25.283,86.59,224,1.000,bicubic,-35.057,-24.903,+14
beitv2_base_patch16_224.in1k_ft_in1k,61.427,38.573,75.930,24.070,86.53,224,0.900,bicubic,-35.323,-23.670,+34
coatnet_2_rw_224.sw_in12k_ft_in1k,61.277,38.723,73.967,26.033,73.87,224,0.950,bicubic,-35.713,-25.693,-8
convformer_m36.sail_in22k_ft_in1k,61.257,38.743,74.620,25.380,57.05,224,1.000,bicubic,-35.813,-25.130,-23
tf_efficientnet_b4.ns_jft_in1k,61.233,38.767,76.160,23.840,19.34,380,0.922,bicubic,-35.477,-23.480,+35
convnextv2_huge.fcmae_ft_in1k,61.197,38.803,74.500,25.500,660.29,288,1.000,bicubic,-36.053,-25.220,-53
maxvit_base_tf_512.in1k,61.103,38.897,74.050,25.950,119.88,512,1.000,bicubic,-36.067,-25.630,-38
caformer_s36.sail_in22k_ft_in1k,60.767,39.233,75.167,24.833,39.30,224,1.000,bicubic,-36.053,-24.523,+12
vit_base_patch32_clip_384.laion2b_ft_in12k_in1k,60.357,39.643,73.797,26.203,88.30,384,1.000,bicubic,-36.253,-25.683,+47
beit_base_patch16_224.in22k_ft_in22k_in1k,60.317,39.683,75.593,24.407,86.53,224,0.900,bicubic,-36.343,-24.067,+40
tf_efficientnetv2_m.in21k_ft_in1k,60.267,39.733,75.070,24.930,54.14,480,1.000,bicubic,-36.733,-24.560,-18
vit_base_patch32_clip_448.laion2b_ft_in12k_in1k,60.230,39.770,73.550,26.450,88.34,448,1.000,bicubic,-36.330,-25.970,+54
vit_base_patch16_384.augreg_in21k_ft_in1k,60.183,39.817,73.843,26.157,86.86,384,1.000,bicubic,-36.837,-25.867,-22
convformer_s36.sail_in22k_ft_in1k_384,60.070,39.930,74.127,25.873,40.01,384,1.000,bicubic,-36.990,-25.583,-32
convnext_small.fb_in22k_ft_in1k_384,59.923,40.077,74.487,25.513,50.22,384,1.000,bicubic,-37.187,-25.153,-40
maxvit_large_tf_512.in1k,59.887,40.113,72.847,27.153,212.33,512,1.000,bicubic,-37.163,-26.813,-32
tiny_vit_21m_512.dist_in22k_ft_in1k,59.580,40.420,74.757,25.243,21.27,512,1.000,bicubic,-37.320,-24.933,-11
swin_base_patch4_window7_224.ms_in22k_ft_in1k,59.527,40.473,74.247,25.753,87.77,224,0.900,bicubic,-37.153,-25.423,+30
vit_base_patch32_clip_224.laion2b_ft_in1k,59.163,40.837,73.883,26.117,88.22,224,0.900,bicubic,-35.587,-25.187,+436
maxvit_base_tf_384.in1k,59.073,40.927,71.687,28.313,119.65,384,1.000,bicubic,-38.047,-27.953,-46
vit_base_patch16_224.augreg2_in21k_ft_in1k,59.060,40.940,73.603,26.397,86.57,224,0.900,bicubic,-37.450,-25.957,+56
seresnextaa101d_32x8d.sw_in12k_ft_in1k_288,59.030,40.970,72.837,27.163,93.59,320,1.000,bicubic,-38.260,-26.883,-77
tiny_vit_21m_384.dist_in22k_ft_in1k,59.017,40.983,74.100,25.900,21.23,384,1.000,bicubic,-37.933,-25.610,-22
volo_d5_512.sail_in1k,58.927,41.073,73.210,26.790,296.09,512,1.150,bicubic,-38.373,-26.550,-80
convformer_s36.sail_in22k_ft_in1k,58.913,41.087,72.943,27.057,40.01,224,1.000,bicubic,-37.587,-26.517,+54
convnext_small.in12k_ft_in1k_384,58.807,41.193,72.797,27.203,50.22,384,1.000,bicubic,-38.183,-26.853,-32
volo_d5_448.sail_in1k,58.800,41.200,73.063,26.937,295.91,448,1.150,bicubic,-38.440,-26.607,-72
vit_large_r50_s32_224.augreg_in21k_ft_in1k,58.667,41.333,71.733,28.267,328.99,224,0.900,bicubic,-37.513,-27.777,+117
vit_base_patch32_clip_384.openai_ft_in12k_in1k,58.587,41.413,73.150,26.850,88.30,384,0.950,bicubic,-37.833,-26.350,+66
maxvit_large_tf_384.in1k,58.427,41.573,71.177,28.823,212.03,384,1.000,bicubic,-38.503,-28.393,-26
eva02_small_patch14_336.mim_in22k_ft_in1k,58.363,41.637,72.877,27.123,22.13,336,1.000,bicubic,-38.327,-26.733,+15
deit3_large_patch16_384.fb_in1k,58.357,41.643,72.950,27.050,304.76,384,1.000,bicubic,-38.493,-26.670,-16
deit3_huge_patch14_224.fb_in1k,58.137,41.863,72.143,27.857,632.13,224,0.900,bicubic,-38.443,-27.377,+27
convnextv2_large.fcmae_ft_in1k,58.103,41.897,72.627,27.373,197.96,288,1.000,bicubic,-38.727,-27.133,-16
tf_efficientnet_b8.ap_in1k,57.837,42.163,72.960,27.040,87.41,672,0.954,bicubic,-38.713,-26.600,+34
convnext_small.fb_in22k_ft_in1k,57.743,42.257,72.773,27.227,50.22,288,1.000,bicubic,-39.067,-26.857,-12
seresnextaa101d_32x8d.sw_in12k_ft_in1k,57.643,42.357,71.357,28.643,93.59,288,1.000,bicubic,-39.527,-28.423,-70
mvitv2_large.fb_in1k,57.497,42.503,70.750,29.250,217.99,224,0.900,bicubic,-38.903,-28.790,+62
cait_m48_448.fb_dist_in1k,57.477,42.523,71.857,28.143,356.46,448,1.000,bicubic,-39.403,-27.813,-28
cait_m36_384.fb_dist_in1k,57.473,42.527,72.307,27.693,271.22,384,1.000,bicubic,-39.367,-27.353,-23
tf_efficientnet_b3.ns_jft_in1k,57.417,42.583,72.370,27.630,12.23,300,0.904,bicubic,-38.683,-27.110,+119
volo_d4_448.sail_in1k,57.287,42.713,71.540,28.460,193.41,448,1.150,bicubic,-39.783,-28.090,-62
tiny_vit_21m_224.dist_in22k_ft_in1k,57.143,42.857,72.573,27.427,21.20,224,0.950,bicubic,-39.237,-26.847,+58
maxvit_small_tf_512.in1k,57.077,42.923,70.957,29.043,69.13,512,1.000,bicubic,-40.123,-28.663,-81
vit_base_patch32_clip_224.laion2b_ft_in12k_in1k,57.070,42.930,71.267,28.733,88.22,224,0.900,bicubic,-38.160,-27.973,+302
vit_base_patch16_224.augreg_in21k_ft_in1k,56.833,43.167,70.633,29.367,86.57,224,0.900,bicubic,-39.467,-28.897,+75
deit3_medium_patch16_224.fb_in22k_ft_in1k,56.653,43.347,69.740,30.260,38.85,224,1.000,bicubic,-39.487,-29.550,+104
volo_d5_224.sail_in1k,56.507,43.493,70.660,29.340,295.46,224,0.960,bicubic,-40.373,-28.960,-39
deit3_large_patch16_224.fb_in1k,56.453,43.547,70.473,29.527,304.37,224,0.900,bicubic,-39.737,-29.027,+95
xcit_large_24_p8_384.fb_dist_in1k,56.360,43.640,71.307,28.693,188.93,384,1.000,bicubic,-40.400,-28.253,-16
flexivit_large.1200ep_in1k,56.290,43.710,71.560,28.440,304.36,240,0.950,bicubic,-40.490,-28.110,-21
convnextv2_tiny.fcmae_ft_in22k_in1k_384,56.100,43.900,71.897,28.103,28.64,384,1.000,bicubic,-40.520,-27.733,+2
flexivit_large.600ep_in1k,56.077,43.923,71.170,28.830,304.36,240,0.950,bicubic,-40.653,-28.390,-15
xcit_large_24_p8_224.fb_dist_in1k,56.020,43.980,70.670,29.330,188.93,224,1.000,bicubic,-40.610,-28.790,-2
caformer_s18.sail_in22k_ft_in1k_384,56.010,43.990,71.380,28.620,26.34,384,1.000,bicubic,-40.520,-28.200,+16
vit_base_patch32_clip_224.openai_ft_in1k,55.913,44.087,72.157,27.843,88.22,224,0.900,bicubic,-38.527,-26.953,+468
vit_medium_patch16_gap_384.sw_in12k_ft_in1k,55.783,44.217,70.997,29.003,39.03,384,0.950,bicubic,-40.707,-28.623,+24
flexivit_large.300ep_in1k,55.703,44.297,70.703,29.297,304.36,240,0.950,bicubic,-40.997,-28.877,-15
convnext_small.in12k_ft_in1k,55.700,44.300,70.727,29.273,50.22,288,1.000,bicubic,-40.900,-28.833,-1
caformer_b36.sail_in1k_384,55.193,44.807,68.070,31.930,98.75,384,1.000,bicubic,-41.967,-31.540,-90
convformer_s18.sail_in22k_ft_in1k_384,55.123,44.877,70.127,29.873,26.77,384,1.000,bicubic,-41.667,-29.583,-36
caformer_m36.sail_in1k_384,55.083,44.917,68.470,31.530,56.20,384,1.000,bicubic,-41.947,-31.240,-76
swin_small_patch4_window7_224.ms_in22k_ft_in1k,54.977,45.023,71.023,28.977,49.61,224,0.900,bicubic,-41.083,-28.387,+110
xcit_large_24_p16_384.fb_dist_in1k,54.907,45.093,69.850,30.150,189.10,384,1.000,bicubic,-42.033,-29.660,-61
volo_d4_224.sail_in1k,54.757,45.243,68.847,31.153,192.96,224,0.960,bicubic,-42.023,-30.763,-37
maxvit_tiny_tf_512.in1k,54.747,45.253,68.937,31.063,31.05,512,1.000,bicubic,-42.223,-30.733,-69
dm_nfnet_f5.dm_in1k,54.610,45.390,68.673,31.327,377.21,544,0.954,bicubic,-42.420,-31.007,-79
caformer_s36.sail_in1k_384,54.573,45.427,68.740,31.260,39.30,384,1.000,bicubic,-42.307,-30.930,-60
deit3_small_patch16_384.fb_in22k_ft_in1k,54.480,45.520,68.317,31.683,22.21,384,1.000,bicubic,-42.190,-31.323,-20
efficientnet_b5.sw_in12k_ft_in1k,54.413,45.587,69.873,30.127,30.39,448,1.000,bicubic,-42.357,-29.657,-38
vit_base_r50_s16_384.orig_in21k_ft_in1k,54.407,45.593,69.563,30.437,98.95,384,1.000,bicubic,-42.043,-30.047,+17
inception_next_base.sail_in1k_384,54.360,45.640,68.570,31.430,86.67,384,1.000,bicubic,-42.360,-31.040,-33
maxvit_small_tf_384.in1k,54.337,45.663,68.190,31.810,69.02,384,1.000,bicubic,-42.413,-31.350,-38
regnety_160.sw_in12k_ft_in1k,54.330,45.670,69.047,30.953,83.59,288,1.000,bicubic,-42.490,-30.573,-55
resnetv2_152x4_bit.goog_in21k_ft_in1k,54.317,45.683,70.170,29.830,936.53,480,1.000,bilinear,-42.563,-29.490,-65
xcit_large_24_p16_224.fb_dist_in1k,54.250,45.750,68.967,31.033,189.10,224,1.000,bicubic,-42.070,-30.533,+40
regnety_160.lion_in12k_ft_in1k,54.203,45.797,68.990,31.010,83.59,288,1.000,bicubic,-42.607,-30.520,-56
vit_small_r26_s32_384.augreg_in21k_ft_in1k,54.187,45.813,68.760,31.240,36.47,384,1.000,bicubic,-41.873,-30.740,+92
caformer_s18.sail_in22k_ft_in1k,54.113,45.887,69.713,30.287,26.34,224,1.000,bicubic,-41.897,-29.837,+102
volo_d3_448.sail_in1k,53.977,46.023,68.040,31.960,86.63,448,1.000,bicubic,-43.053,-31.630,-94
caformer_b36.sail_in1k,53.977,46.023,66.687,33.313,98.75,224,1.000,bicubic,-42.523,-32.943,0
tf_efficientnet_b5.ap_in1k,53.887,46.113,69.170,30.830,30.39,456,0.934,bicubic,-42.203,-30.370,+80
dm_nfnet_f6.dm_in1k,53.843,46.157,68.413,31.587,438.36,576,0.956,bicubic,-43.127,-31.347,-87
xcit_medium_24_p8_224.fb_dist_in1k,53.650,46.350,68.403,31.597,84.32,224,1.000,bicubic,-42.880,-31.107,-11
tf_efficientnet_b2.ns_jft_in1k,53.600,46.400,70.270,29.730,9.11,260,0.890,bicubic,-41.920,-29.070,+204
cait_s36_384.fb_dist_in1k,53.560,46.440,68.003,31.997,68.37,384,1.000,bicubic,-43.070,-31.607,-35
tf_efficientnet_b6.ap_in1k,53.553,46.447,68.563,31.437,43.04,528,0.942,bicubic,-42.817,-30.987,+16
vit_medium_patch16_gap_256.sw_in12k_ft_in1k,53.537,46.463,69.093,30.907,38.86,256,0.950,bicubic,-42.463,-30.327,+95
dm_nfnet_f3.dm_in1k,53.523,46.477,67.743,32.257,254.92,416,0.940,bicubic,-43.097,-31.837,-36
deit3_base_patch16_384.fb_in1k,53.483,46.517,67.623,32.377,86.88,384,1.000,bicubic,-42.747,-31.817,+47
deit3_base_patch16_224.fb_in1k,53.470,46.530,67.597,32.403,86.59,224,0.900,bicubic,-42.300,-31.673,+145
convformer_s18.sail_in22k_ft_in1k,53.437,46.563,68.680,31.320,26.77,224,1.000,bicubic,-42.663,-30.810,+67
tf_efficientnet_b8.ra_in1k,53.417,46.583,69.093,30.907,87.41,672,0.954,bicubic,-43.283,-30.557,-49
convnextv2_base.fcmae_ft_in1k,53.417,46.583,67.750,32.250,88.72,288,1.000,bicubic,-43.063,-31.800,-9
xcit_medium_24_p8_384.fb_dist_in1k,53.397,46.603,68.130,31.870,84.32,384,1.000,bicubic,-43.373,-31.470,-64
vit_base_patch32_384.augreg_in21k_ft_in1k,53.300,46.700,68.057,31.943,88.30,384,1.000,bicubic,-42.600,-31.283,+110
tf_efficientnet_b7.ap_in1k,53.257,46.743,68.870,31.130,66.35,600,0.949,bicubic,-43.093,-30.650,+9
convnext_large.fb_in1k,53.247,46.753,67.887,32.113,197.77,288,1.000,bicubic,-43.153,-31.563,+1
xcit_medium_24_p16_384.fb_dist_in1k,53.237,46.763,68.043,31.957,84.40,384,1.000,bicubic,-43.453,-31.557,-52
hrnet_w18_ssld.paddle_in1k,53.233,46.767,68.183,31.817,21.30,288,1.000,bilinear,-42.477,-31.107,+150
maxvit_base_tf_224.in1k,53.233,46.767,66.133,33.867,119.47,224,0.950,bicubic,-43.107,-33.447,+11
tf_efficientnetv2_l.in1k,53.160,46.840,67.827,32.173,118.52,480,1.000,bicubic,-43.580,-31.723,-65
tf_efficientnetv2_s.in21k_ft_in1k,53.127,46.873,69.000,31.000,21.46,384,1.000,bicubic,-43.343,-30.570,-18
tf_efficientnet_b4.ap_in1k,53.087,46.913,68.223,31.777,19.34,380,0.922,bicubic,-42.403,-31.197,+188
convnext_tiny.in12k_ft_in1k_384,53.063,46.937,68.503,31.497,28.59,384,1.000,bicubic,-43.497,-31.127,-40
regnetz_e8.ra3_in1k,53.003,46.997,67.147,32.853,57.70,320,1.000,bicubic,-43.597,-32.433,-50
maxvit_large_tf_224.in1k,53.003,46.997,65.337,34.663,211.79,224,0.950,bicubic,-43.327,-34.073,+7
coatnet_rmlp_1_rw2_224.sw_in12k_ft_in1k,52.873,47.127,66.460,33.540,41.72,224,0.950,bicubic,-43.257,-32.880,+46
volo_d3_224.sail_in1k,52.697,47.303,66.307,33.693,86.33,224,0.960,bicubic,-43.733,-33.233,-17
deit3_small_patch16_224.fb_in22k_ft_in1k,52.690,47.310,66.867,33.133,22.06,224,1.000,bicubic,-43.140,-32.473,+110
regnety_120.sw_in12k_ft_in1k,52.637,47.363,67.637,32.363,51.82,288,1.000,bicubic,-43.913,-32.043,-44
dm_nfnet_f4.dm_in1k,52.460,47.540,67.117,32.883,316.07,512,0.951,bicubic,-44.490,-32.523,-112
maxvit_tiny_tf_384.in1k,52.457,47.543,66.777,33.223,30.98,384,1.000,bicubic,-44.143,-32.843,-54
tf_efficientnet_b7.ra_in1k,52.410,47.590,68.220,31.780,66.35,600,0.949,bicubic,-44.160,-31.250,-52
xcit_small_24_p8_384.fb_dist_in1k,52.377,47.623,66.847,33.153,47.63,384,1.000,bicubic,-44.433,-32.813,-92
efficientnetv2_rw_m.agc_in1k,52.343,47.657,67.223,32.777,53.24,416,1.000,bicubic,-43.927,-32.407,+10
resnet18.fb_swsl_ig1b_ft_in1k,52.323,47.677,70.483,29.517,11.69,224,0.875,bilinear,-38.777,-27.717,+762
convformer_b36.sail_in1k_384,52.300,47.700,66.543,33.457,99.88,384,1.000,bicubic,-44.570,-33.107,-106
convformer_m36.sail_in1k_384,52.283,47.717,66.150,33.850,57.05,384,1.000,bicubic,-44.497,-33.580,-90
inception_next_base.sail_in1k,52.273,47.727,65.930,34.070,86.67,224,0.950,bicubic,-43.647,-33.430,+84
deit_base_distilled_patch16_384.fb_in1k,52.260,47.740,67.747,32.253,87.63,384,1.000,bicubic,-44.250,-31.843,-45
xcit_medium_24_p16_224.fb_dist_in1k,52.197,47.803,66.917,33.083,84.40,224,1.000,bicubic,-44.053,-32.493,+10
xcit_small_24_p8_224.fb_dist_in1k,52.183,47.817,66.767,33.233,47.63,224,1.000,bicubic,-44.367,-32.773,-55
convnext_tiny.fb_in22k_ft_in1k_384,52.180,47.820,66.923,33.077,28.59,384,1.000,bicubic,-43.990,-32.577,+23
convformer_s36.sail_in1k_384,51.987,48.013,66.197,33.803,40.01,384,1.000,bicubic,-44.713,-33.333,-81
resnetv2_152x2_bit.goog_teacher_in21k_ft_in1k_384,51.937,48.063,68.660,31.340,236.34,384,1.000,bicubic,-44.253,-30.640,+17
fastvit_ma36.apple_dist_in1k,51.923,48.077,67.013,32.987,44.07,256,0.950,bicubic,-44.377,-32.547,-3
convnextv2_tiny.fcmae_ft_in22k_in1k,51.903,48.097,67.800,32.200,28.64,288,1.000,bicubic,-44.437,-31.570,-18
resmlp_big_24_224.fb_in22k_ft_in1k,51.893,48.107,68.463,31.537,129.14,224,0.875,bicubic,-44.457,-31.127,-21
xcit_small_24_p16_384.fb_dist_in1k,51.883,48.117,66.367,33.633,47.67,384,1.000,bicubic,-44.457,-33.183,-21
cait_s24_384.fb_dist_in1k,51.787,48.213,66.307,33.693,47.06,384,1.000,bicubic,-44.783,-33.243,-70
resnetv2_152x2_bit.goog_in21k_ft_in1k,51.763,48.237,69.263,30.737,236.34,448,1.000,bilinear,-44.757,-30.307,-58
caformer_m36.sail_in1k,51.693,48.307,64.497,35.503,56.20,224,1.000,bicubic,-44.717,-35.063,-36
ecaresnet269d.ra2_in1k,51.677,48.323,66.040,33.960,102.09,352,1.000,bicubic,-44.773,-33.620,-44
regnety_2560.seer_ft_in1k,51.650,48.350,68.193,31.807,"1,282.60",384,1.000,bicubic,-44.880,-31.327,-63
rexnetr_300.sw_in12k_ft_in1k,51.627,48.373,68.020,31.980,34.81,288,1.000,bicubic,-44.463,-31.520,+25
caformer_s36.sail_in1k,51.593,48.407,64.907,35.093,39.30,224,1.000,bicubic,-44.487,-34.603,+27
coat_lite_medium_384.in1k,51.570,48.430,65.740,34.260,44.57,384,1.000,bicubic,-45.000,-33.780,-75
mvitv2_base.fb_in1k,51.550,48.450,65.633,34.367,51.47,224,0.900,bicubic,-44.430,-33.957,+50
vit_base_patch16_224_miil.in21k_ft_in1k,51.543,48.457,65.207,34.793,86.54,224,0.875,bilinear,-44.497,-34.143,+38
davit_small.msft_in1k,51.523,48.477,66.440,33.560,49.75,224,0.950,bicubic,-44.507,-32.960,+38
convnext_tiny.in12k_ft_in1k,51.440,48.560,67.063,32.937,28.59,288,1.000,bicubic,-44.800,-32.227,-9
maxvit_rmlp_small_rw_224.sw_in1k,51.430,48.570,65.190,34.810,64.90,224,0.900,bicubic,-44.530,-34.240,+54
tf_efficientnetv2_m.in1k,51.423,48.577,66.623,33.377,54.14,480,1.000,bicubic,-45.057,-32.897,-62
repvit_m2_3.dist_450e_in1k,51.383,48.617,66.737,33.263,23.69,224,0.950,bicubic,-44.607,-32.663,+42
caformer_s18.sail_in1k_384,51.353,48.647,65.657,34.343,26.34,384,1.000,bicubic,-45.057,-33.873,-50
edgenext_base.in21k_ft_in1k,51.283,48.717,65.657,34.343,18.51,320,1.000,bicubic,-44.917,-33.803,-6
davit_base.msft_in1k,51.267,48.733,65.223,34.777,87.95,224,0.950,bicubic,-44.973,-34.417,-14
convformer_b36.sail_in1k,51.203,48.797,64.280,35.720,99.88,224,1.000,bicubic,-45.037,-35.130,-14
maxvit_small_tf_224.in1k,51.190,48.810,65.260,34.740,68.93,224,0.950,bicubic,-45.010,-34.270,-8
xcit_small_12_p8_384.fb_dist_in1k,51.107,48.893,65.840,34.160,26.21,384,1.000,bicubic,-45.363,-33.650,-65
convnext_base.fb_in1k,51.057,48.943,65.880,34.120,88.59,288,1.000,bicubic,-45.253,-33.630,-33
convformer_m36.sail_in1k,51.020,48.980,63.653,36.347,57.05,224,1.000,bicubic,-45.060,-35.737,+13
volo_d2_384.sail_in1k,50.900,49.100,65.623,34.377,58.87,384,1.000,bicubic,-45.810,-33.977,-113
tf_efficientnet_b1.ns_jft_in1k,50.887,49.113,67.910,32.090,7.79,240,0.882,bicubic,-43.973,-31.340,+281
vit_base_patch16_384.orig_in21k_ft_in1k,50.880,49.120,65.277,34.723,86.86,384,1.000,bicubic,-45.320,-34.193,-16
convformer_s36.sail_in1k,50.863,49.137,64.083,35.917,40.01,224,1.000,bicubic,-45.247,-35.377,+1
tiny_vit_11m_224.dist_in22k_ft_in1k,50.830,49.170,66.870,33.130,11.00,224,0.950,bicubic,-44.880,-32.390,+95
xcit_small_24_p16_224.fb_dist_in1k,50.743,49.257,65.047,34.953,47.67,224,1.000,bicubic,-45.047,-34.243,+73
repvit_m2_3.dist_300e_in1k,50.737,49.263,66.743,33.257,23.69,224,0.950,bicubic,-44.853,-32.647,+113
convformer_s18.sail_in1k_384,50.687,49.313,65.637,34.363,26.77,384,1.000,bicubic,-45.563,-33.943,-31
flexivit_base.1200ep_in1k,50.687,49.313,65.133,34.867,86.59,240,0.950,bicubic,-45.433,-34.277,-7
coatnet_rmlp_2_rw_224.sw_in1k,50.603,49.397,63.363,36.637,73.88,224,0.950,bicubic,-45.607,-36.117,-24
xcit_small_12_p16_384.fb_dist_in1k,50.527,49.473,65.300,34.700,26.25,384,1.000,bicubic,-45.813,-34.190,-53
efficientnet_b4.ra2_in1k,50.500,49.500,65.730,34.270,19.34,384,1.000,bicubic,-45.030,-33.670,+122
volo_d1_384.sail_in1k,50.477,49.523,64.913,35.087,26.78,384,1.000,bicubic,-46.003,-34.697,-84
efficientvit_b3.r256_in1k,50.477,49.523,64.180,35.820,48.65,256,1.000,bicubic,-45.353,-35.040,+56
xcit_small_12_p8_224.fb_dist_in1k,50.437,49.563,65.420,34.580,26.21,224,1.000,bicubic,-45.523,-33.950,+25
fastvit_sa36.apple_dist_in1k,50.427,49.573,65.803,34.197,31.53,256,0.900,bicubic,-45.533,-33.577,+27
resnetv2_101x3_bit.goog_in21k_ft_in1k,50.393,49.607,67.783,32.217,387.93,448,1.000,bilinear,-45.857,-31.687,-41
flexivit_base.600ep_in1k,50.353,49.647,64.627,35.373,86.59,240,0.950,bicubic,-45.607,-34.793,+23
efficientvit_b3.r288_in1k,50.340,49.660,64.063,35.937,48.65,288,1.000,bicubic,-45.800,-35.297,-19
regnetz_040_h.ra3_in1k,50.317,49.683,65.623,34.377,28.94,320,1.000,bicubic,-46.003,-33.917,-58
inception_next_small.sail_in1k,50.273,49.727,65.097,34.903,49.37,224,0.875,bicubic,-45.407,-34.153,+84
mvitv2_small.fb_in1k,50.260,49.740,64.893,35.107,34.87,224,0.900,bicubic,-45.630,-34.467,+35
cait_s24_224.fb_dist_in1k,50.250,49.750,65.020,34.980,46.92,224,1.000,bicubic,-45.410,-34.370,+83
resnext101_32x16d.fb_ssl_yfcc100m_ft_in1k,50.230,49.770,66.033,33.967,194.03,224,0.875,bilinear,-45.170,-33.127,+134
pit_b_distilled_224.in1k,50.220,49.780,64.997,35.003,74.79,224,0.900,bicubic,-45.600,-34.293,+50
eca_nfnet_l2.ra3_in1k,50.203,49.797,65.440,34.560,56.72,384,1.000,bicubic,-46.247,-34.310,-90
resnest269e.in1k,50.187,49.813,64.680,35.320,110.93,416,0.928,bicubic,-45.923,-34.630,-24
vit_small_patch16_384.augreg_in21k_ft_in1k,50.173,49.827,65.790,34.210,22.20,384,1.000,bicubic,-45.807,-33.540,+7
tresnet_v2_l.miil_in21k_ft_in1k,50.163,49.837,65.123,34.877,46.17,224,0.875,bilinear,-45.657,-34.197,+45
pvt_v2_b5.in1k,50.147,49.853,65.027,34.973,81.96,224,0.900,bicubic,-45.793,-34.363,+17
deit3_medium_patch16_224.fb_in1k,50.140,49.860,64.710,35.290,38.85,224,0.900,bicubic,-45.240,-34.640,+133
deit_base_distilled_patch16_224.fb_in1k,50.077,49.923,66.230,33.770,87.34,224,0.900,bicubic,-45.673,-33.200,+55
tf_efficientnet_b3.ap_in1k,50.047,49.953,65.210,34.790,12.23,300,0.904,bicubic,-44.923,-33.900,+228
pvt_v2_b4.in1k,50.023,49.977,65.127,34.873,62.56,224,0.900,bicubic,-45.897,-34.093,+16
flexivit_base.300ep_in1k,50.003,49.997,64.103,35.897,86.59,240,0.950,bicubic,-45.947,-35.367,+11
coat_lite_medium.in1k,49.983,50.017,64.857,35.143,44.57,224,0.900,bicubic,-46.017,-34.643,-3
resnest200e.in1k,49.873,50.127,64.717,35.283,70.20,320,0.909,bicubic,-46.197,-34.763,-23
efficientformer_l7.snap_dist_in1k,49.837,50.163,66.033,33.967,82.23,224,0.950,bicubic,-45.763,-33.357,+79
volo_d2_224.sail_in1k,49.820,50.180,64.580,35.420,58.68,224,0.960,bicubic,-46.600,-34.880,-98
xception65.ra3_in1k,49.780,50.220,63.523,36.477,39.92,299,0.940,bicubic,-45.910,-35.797,+63
seresnextaa101d_32x8d.ah_in1k,49.747,50.253,64.443,35.557,93.59,288,1.000,bicubic,-46.693,-35.067,-103
swinv2_base_window16_256.ms_in1k,49.667,50.333,63.800,36.200,87.92,256,0.900,bicubic,-46.503,-35.590,-47
pvt_v2_b3.in1k,49.613,50.387,64.793,35.207,45.24,224,0.900,bicubic,-45.857,-34.517,+101
convnextv2_nano.fcmae_ft_in22k_in1k_384,49.603,50.397,65.657,34.343,15.62,384,1.000,bicubic,-46.187,-33.733,+36
cait_xs24_384.fb_dist_in1k,49.537,50.463,64.900,35.100,26.67,384,1.000,bicubic,-46.463,-34.530,-13
maxvit_rmlp_tiny_rw_256.sw_in1k,49.530,50.470,63.823,36.177,29.15,256,0.950,bicubic,-46.510,-35.717,-23
tf_efficientnet_b5.ra_in1k,49.523,50.477,65.653,34.347,30.39,456,0.934,bicubic,-46.447,-33.777,-10
fastvit_ma36.apple_in1k,49.500,50.500,63.620,36.380,44.07,256,0.950,bicubic,-46.470,-35.840,-9
resnetv2_152x2_bit.goog_teacher_in21k_ft_in1k,49.480,50.520,65.623,34.377,236.34,224,0.875,bicubic,-46.270,-33.807,+37
resnet200d.ra2_in1k,49.463,50.537,64.330,35.670,64.69,320,1.000,bicubic,-46.647,-35.190,-46
efficientformerv2_l.snap_dist_in1k,49.457,50.543,64.920,35.080,26.32,224,0.950,bicubic,-46.303,-34.450,+33
xcit_small_12_p16_224.fb_dist_in1k,49.413,50.587,63.847,36.153,26.25,224,1.000,bicubic,-46.317,-35.453,+41
resnest101e.in1k,49.367,50.633,65.583,34.417,48.28,256,0.875,bilinear,-46.203,-33.787,+73
dm_nfnet_f2.dm_in1k,49.350,50.650,63.950,36.050,193.78,352,0.920,bicubic,-47.170,-35.640,-132
regnetz_040.ra3_in1k,49.297,50.703,64.060,35.940,27.12,320,1.000,bicubic,-46.883,-35.470,-62
vit_base_patch32_224.augreg_in21k_ft_in1k,49.267,50.733,64.340,35.660,88.22,224,0.900,bicubic,-45.133,-34.700,+324
tiny_vit_21m_224.in1k,49.267,50.733,64.303,35.697,21.20,224,0.950,bicubic,-46.383,-34.947,+53
resnet152d.ra2_in1k,49.263,50.737,64.413,35.587,60.21,320,1.000,bicubic,-46.607,-35.057,+6
seresnet152d.ra2_in1k,49.237,50.763,64.167,35.833,66.84,320,1.000,bicubic,-47.073,-35.243,-94
xcit_large_24_p8_224.fb_in1k,49.237,50.763,62.840,37.160,188.93,224,1.000,bicubic,-46.853,-36.300,-50
gcvit_base.in1k,49.153,50.847,63.950,36.050,90.32,224,0.875,bicubic,-46.927,-35.300,-49
maxxvit_rmlp_small_rw_256.sw_in1k,49.150,50.850,63.343,36.657,66.01,256,0.950,bicubic,-47.060,-35.937,-77
resnext101_32x8d.fb_ssl_yfcc100m_ft_in1k,49.093,50.907,65.490,34.510,88.79,224,0.875,bilinear,-46.227,-33.830,+110
resmlp_big_24_224.fb_distilled_in1k,49.093,50.907,65.473,34.527,129.14,224,0.875,bicubic,-46.777,-33.967,-1
convnext_small.fb_in1k,49.067,50.933,64.830,35.170,50.22,288,1.000,bicubic,-46.903,-34.530,-27
resnetaa101d.sw_in12k_ft_in1k,49.013,50.987,64.237,35.763,44.57,288,1.000,bicubic,-47.347,-35.233,-114
resnext101_64x4d.tv_in1k,48.990,51.010,63.510,36.490,83.46,224,0.875,bilinear,-46.830,-35.800,+5
volo_d1_224.sail_in1k,48.973,51.027,63.187,36.813,26.63,224,0.960,bicubic,-47.057,-36.203,-41
repvgg_b3.rvgg_in1k,48.913,51.087,64.880,35.120,123.09,224,0.875,bilinear,-45.657,-33.770,+275
efficientvit_b3.r224_in1k,48.877,51.123,62.947,37.053,48.65,224,0.950,bicubic,-46.663,-36.373,+63
resnetrs420.tf_in1k,48.863,51.137,63.433,36.567,191.89,416,1.000,bicubic,-47.537,-36.097,-126
maxvit_tiny_tf_224.in1k,48.807,51.193,62.933,37.067,30.92,224,0.950,bicubic,-47.003,-36.327,+4
convformer_s18.sail_in1k,48.787,51.213,62.930,37.070,26.77,224,1.000,bicubic,-46.543,-36.370,+99
caformer_s18.sail_in1k,48.753,51.247,62.870,37.130,26.34,224,1.000,bicubic,-46.927,-36.420,+30
deit3_small_patch16_384.fb_in1k,48.667,51.333,62.800,37.200,22.21,384,1.000,bicubic,-46.933,-36.640,+43
seresnext101d_32x8d.ah_in1k,48.603,51.397,62.960,37.040,93.59,288,1.000,bicubic,-47.757,-36.480,-125
swinv2_small_window16_256.ms_in1k,48.593,51.407,62.747,37.253,49.73,256,0.900,bicubic,-47.477,-36.703,-61
regnetz_d32.ra3_in1k,48.583,51.417,65.167,34.833,27.58,320,0.950,bicubic,-47.287,-34.263,-15
efficientnetv2_rw_s.ra2_in1k,48.577,51.423,63.837,36.163,23.94,384,1.000,bicubic,-47.133,-35.503,+17
efficientnet_b3.ra2_in1k,48.567,51.433,64.250,35.750,12.23,320,1.000,bicubic,-46.573,-35.050,+138
edgenext_base.usi_in1k,48.457,51.543,64.317,35.683,18.51,320,1.000,bicubic,-47.333,-34.983,-4
focalnet_base_lrf.ms_in1k,48.440,51.560,63.120,36.880,88.75,224,0.900,bicubic,-47.390,-36.080,-11
focalnet_base_srf.ms_in1k,48.423,51.577,63.103,36.897,88.15,224,0.900,bicubic,-47.477,-36.207,-29
vit_small_r26_s32_224.augreg_in21k_ft_in1k,48.377,51.623,63.800,36.200,36.43,224,0.900,bicubic,-46.743,-35.400,+139
swinv2_base_window8_256.ms_in1k,48.333,51.667,63.610,36.390,87.92,256,0.900,bicubic,-47.727,-35.940,-64
fastvit_sa36.apple_in1k,48.320,51.680,62.793,37.207,31.53,256,0.900,bicubic,-47.290,-36.527,+30
repvgg_b3g4.rvgg_in1k,48.310,51.690,64.780,35.220,83.83,224,0.875,bilinear,-46.190,-34.240,+275
vit_large_patch32_384.orig_in21k_ft_in1k,48.240,51.760,61.827,38.173,306.63,384,1.000,bicubic,-47.000,-37.403,+103
convit_base.fb_in1k,48.210,51.790,63.000,37.000,86.54,224,0.875,bicubic,-46.890,-36.150,+139
swin_s3_base_224.ms_in1k,48.153,51.847,62.243,37.757,71.13,224,0.900,bicubic,-47.887,-37.107,-66
sequencer2d_l.in1k,48.107,51.893,62.353,37.647,54.30,224,0.875,bicubic,-47.763,-37.207,-31
resnext101_32x8d.tv2_in1k,48.093,51.907,62.723,37.277,88.79,224,0.965,bilinear,-47.207,-36.507,+84
resnetrs350.tf_in1k,48.057,51.943,62.667,37.333,163.96,384,1.000,bicubic,-48.193,-36.923,-115
tf_efficientnetv2_b3.in21k_ft_in1k,48.030,51.970,64.747,35.253,14.36,300,0.900,bicubic,-47.570,-34.533,+24
gcvit_small.in1k,48.030,51.970,62.713,37.287,51.09,224,0.875,bicubic,-47.880,-36.567,-42
focalnet_small_lrf.ms_in1k,48.027,51.973,63.130,36.870,50.34,224,0.900,bicubic,-47.713,-36.030,-7
regnetz_d8.ra3_in1k,48.010,51.990,64.423,35.577,23.37,320,1.000,bicubic,-48.000,-35.097,-68
regnety_1280.seer_ft_in1k,47.990,52.010,64.230,35.770,644.81,384,1.000,bicubic,-48.320,-35.320,-131
twins_svt_large.in1k,47.963,52.037,62.907,37.093,99.27,224,0.900,bicubic,-47.737,-36.463,+2
fastvit_sa24.apple_dist_in1k,47.960,52.040,62.797,37.203,21.55,256,0.900,bicubic,-47.590,-36.513,+29
repvit_m1_5.dist_450e_in1k,47.950,52.050,63.643,36.357,14.64,224,0.950,bicubic,-47.330,-35.587,+83
vit_relpos_base_patch16_224.sw_in1k,47.923,52.077,62.840,37.160,86.43,224,0.900,bicubic,-47.207,-36.240,+117
repvgg_b2g4.rvgg_in1k,47.810,52.190,64.397,35.603,61.76,224,0.875,bilinear,-46.020,-34.533,+377
mixer_b16_224.miil_in21k_ft_in1k,47.793,52.207,63.403,36.597,59.88,224,0.875,bilinear,-47.087,-35.677,+175
repvgg_d2se.rvgg_in1k,47.780,52.220,62.770,37.230,133.33,320,1.000,bilinear,-48.170,-36.600,-60
vit_relpos_base_patch16_clsgap_224.sw_in1k,47.760,52.240,62.410,37.590,86.43,224,0.900,bicubic,-47.490,-36.800,+82
tf_efficientnet_b5.aa_in1k,47.737,52.263,63.910,36.090,30.39,456,0.934,bicubic,-48.143,-35.440,-49
repvit_m1_5.dist_300e_in1k,47.697,52.303,63.763,36.237,14.64,224,0.950,bicubic,-47.453,-35.387,+105
mvitv2_tiny.fb_in1k,47.667,52.333,62.830,37.170,24.17,224,0.900,bicubic,-47.733,-36.470,+51
eca_nfnet_l1.ra2_in1k,47.653,52.347,62.767,37.233,41.41,320,1.000,bicubic,-48.277,-36.723,-61
vit_relpos_medium_patch16_cls_224.sw_in1k,47.653,52.347,61.783,38.217,38.76,224,0.900,bicubic,-47.647,-37.317,+67
seresnext101_32x8d.ah_in1k,47.650,52.350,61.477,38.523,93.57,288,1.000,bicubic,-48.490,-38.013,-113
ecaresnet101d.miil_in1k,47.630,52.370,63.540,36.460,44.57,288,0.950,bicubic,-48.000,-35.750,0
regnetz_d8_evos.ch_in1k,47.623,52.377,63.807,36.193,23.46,320,1.000,bicubic,-48.517,-35.673,-117
resnetv2_50x3_bit.goog_in21k_ft_in1k,47.593,52.407,65.603,34.397,217.32,448,1.000,bilinear,-48.677,-33.957,-143
focalnet_small_srf.ms_in1k,47.547,52.453,62.510,37.490,49.89,224,0.900,bicubic,-48.083,-36.930,-2
pit_s_distilled_224.in1k,47.533,52.467,63.173,36.827,24.04,224,0.900,bicubic,-47.157,-35.977,+199
resnest50d_4s2x40d.in1k,47.483,52.517,63.803,36.197,30.42,224,0.875,bicubic,-47.217,-35.327,+194
dm_nfnet_f1.dm_in1k,47.457,52.543,62.083,37.917,132.63,320,0.910,bicubic,-48.843,-37.417,-150
efficientnet_b3_pruned.in1k,47.447,52.553,62.803,37.197,9.86,300,0.904,bicubic,-47.153,-36.287,+217
davit_tiny.msft_in1k,47.410,52.590,63.360,36.640,28.36,224,0.950,bicubic,-47.670,-35.900,+114
crossvit_18_dagger_408.in1k,47.390,52.610,60.927,39.073,44.61,408,1.000,bicubic,-48.760,-38.543,-126
poolformerv2_m48.sail_in1k,47.380,52.620,63.960,36.040,73.35,224,1.000,bicubic,-47.800,-35.160,+87
coatnet_rmlp_1_rw_224.sw_in1k,47.370,52.630,61.437,38.563,41.69,224,0.950,bicubic,-48.120,-37.923,+17
vit_base_patch16_224.orig_in21k_ft_in1k,47.347,52.653,61.617,38.383,86.57,224,0.900,bicubic,-47.853,-37.613,+80
xcit_small_24_p8_224.fb_in1k,47.290,52.710,60.990,39.010,47.63,224,1.000,bicubic,-48.610,-38.190,-70
efficientformer_l3.snap_dist_in1k,47.247,52.753,63.420,36.580,31.41,224,0.950,bicubic,-47.963,-35.890,+72
tresnet_m.miil_in21k_ft_in1k,47.233,52.767,62.003,37.997,31.39,224,0.875,bilinear,-48.147,-37.147,+37
tf_efficientnet_b6.aa_in1k,47.207,52.793,63.110,36.890,43.04,528,0.942,bicubic,-49.093,-36.420,-160
wide_resnet101_2.tv2_in1k,47.207,52.793,61.877,38.123,126.89,224,0.965,bilinear,-48.033,-37.323,+62
efficientvit_b2.r256_in1k,47.207,52.793,61.670,38.330,24.33,256,1.000,bicubic,-48.013,-37.590,+67
resnext101_32x4d.fb_ssl_yfcc100m_ft_in1k,47.187,52.813,63.387,36.613,44.18,224,0.875,bilinear,-47.953,-35.833,+84
swin_base_patch4_window12_384.ms_in1k,47.183,52.817,62.027,37.973,87.90,384,1.000,bicubic,-49.197,-37.473,-184
repvit_m3.dist_in1k,47.153,52.847,63.347,36.653,10.68,224,0.950,bicubic,-47.567,-35.483,+173
resnetrs270.tf_in1k,47.127,52.873,61.997,38.003,129.86,352,1.000,bicubic,-48.933,-37.483,-116
efficientvit_b2.r288_in1k,47.110,52.890,61.560,38.440,24.33,288,1.000,bicubic,-48.470,-37.710,-12
tf_efficientnet_b4.aa_in1k,47.083,52.917,62.860,37.140,19.34,380,0.922,bicubic,-48.507,-36.470,-16
regnety_320.tv2_in1k,47.073,52.927,62.050,37.950,145.05,224,0.965,bicubic,-48.487,-37.340,-11
vit_base_patch16_rpn_224.sw_in1k,47.063,52.937,62.397,37.603,86.54,224,0.900,bicubic,-47.757,-36.693,+150
rexnetr_200.sw_in12k_ft_in1k,47.053,52.947,63.973,36.027,16.52,288,1.000,bicubic,-48.257,-35.497,+35
convnextv2_tiny.fcmae_ft_in1k,47.043,52.957,62.503,37.497,28.64,288,1.000,bicubic,-48.787,-36.887,-73
swinv2_small_window8_256.ms_in1k,47.020,52.980,62.300,37.700,49.73,256,0.900,bicubic,-48.710,-37.060,-50
inception_next_tiny.sail_in1k,46.983,53.017,62.893,37.107,28.06,224,0.875,bicubic,-48.117,-36.247,+86
xcit_small_12_p8_224.fb_in1k,46.977,53.023,60.523,39.477,26.21,224,1.000,bicubic,-48.443,-38.667,+10
xcit_large_24_p16_224.fb_in1k,46.940,53.060,60.657,39.343,189.10,224,1.000,bicubic,-48.020,-38.173,+118
coat_small.in1k,46.937,53.063,61.307,38.693,21.69,224,0.900,bicubic,-48.253,-37.973,+60
xception65p.ra3_in1k,46.923,53.077,61.087,38.913,39.82,299,0.940,bicubic,-48.737,-38.183,-39
maxvit_tiny_rw_224.sw_in1k,46.907,53.093,60.897,39.103,29.06,224,0.950,bicubic,-48.833,-38.543,-58
resnet101d.ra2_in1k,46.890,53.110,62.340,37.660,44.57,320,1.000,bicubic,-48.850,-36.870,-61
swin_tiny_patch4_window7_224.ms_in22k_ft_in1k,46.837,53.163,64.117,35.883,28.29,224,0.900,bicubic,-47.963,-35.173,+145
pvt_v2_b2_li.in1k,46.813,53.187,62.490,37.510,22.55,224,0.900,bicubic,-48.407,-36.640,+44
resnet152.tv2_in1k,46.803,53.197,61.070,38.930,60.19,224,0.965,bilinear,-48.237,-38.100,+95
regnety_640.seer_ft_in1k,46.707,53.293,63.233,36.767,281.38,384,1.000,bicubic,-49.353,-36.247,-135
fastvit_sa24.apple_in1k,46.653,53.347,61.710,38.290,21.55,256,0.900,bicubic,-48.627,-37.580,+27
seresnext101_64x4d.gluon_in1k,46.650,53.350,61.283,38.717,88.23,224,0.875,bicubic,-48.000,-37.687,+173
twins_pcpvt_large.in1k,46.613,53.387,62.263,37.737,60.99,224,0.900,bicubic,-49.107,-37.227,-62
convnextv2_nano.fcmae_ft_in22k_in1k,46.607,53.393,62.957,37.043,15.62,288,1.000,bicubic,-48.813,-36.353,-4
convnext_tiny.fb_in1k,46.587,53.413,63.187,36.813,28.59,288,1.000,bicubic,-48.623,-36.123,+40
swin_base_patch4_window7_224.ms_in1k,46.543,53.457,61.580,38.420,87.77,224,0.900,bicubic,-49.357,-37.860,-104
resnet152.a1h_in1k,46.540,53.460,60.403,39.597,60.19,288,1.000,bicubic,-49.210,-38.877,-75
efficientformerv2_s2.snap_dist_in1k,46.537,53.463,61.717,38.283,12.71,224,0.950,bicubic,-48.573,-37.403,+66
regnetv_064.ra3_in1k,46.480,53.520,62.250,37.750,30.58,288,1.000,bicubic,-49.300,-37.170,-80
crossvit_15_dagger_408.in1k,46.467,53.533,60.490,39.510,28.50,408,1.000,bicubic,-49.353,-38.720,-90
xcit_medium_24_p8_224.fb_in1k,46.467,53.533,59.653,40.347,84.32,224,1.000,bicubic,-49.393,-39.427,-98
resnetrs200.tf_in1k,46.440,53.560,61.067,38.933,93.21,320,1.000,bicubic,-49.910,-38.483,-211
swin_s3_small_224.ms_in1k,46.400,53.600,60.893,39.107,49.74,224,0.900,bicubic,-49.430,-38.287,-96
coatnet_1_rw_224.sw_in1k,46.393,53.607,60.063,39.937,41.72,224,0.950,bicubic,-49.227,-39.157,-52
gcvit_tiny.in1k,46.367,53.633,61.630,38.370,28.22,224,0.875,bicubic,-49.293,-37.700,-61
rexnet_300.nav_in1k,46.360,53.640,62.690,37.310,34.71,224,0.875,bicubic,-49.180,-36.500,-39
fbnetv3_g.ra2_in1k,46.340,53.660,62.387,37.613,16.62,288,0.950,bilinear,-48.780,-36.813,+54
sequencer2d_m.in1k,46.297,53.703,60.920,39.080,38.31,224,0.875,bicubic,-49.283,-38.300,-49
tresnet_xl.miil_in1k,46.290,53.710,61.943,38.057,78.44,224,0.875,bilinear,-48.790,-37.347,+62
xcit_tiny_24_p8_224.fb_dist_in1k,46.283,53.717,60.590,39.410,12.11,224,1.000,bicubic,-49.167,-38.770,-23
xcit_tiny_24_p8_384.fb_dist_in1k,46.260,53.740,60.730,39.270,12.11,384,1.000,bicubic,-49.970,-38.670,-191
deit_small_distilled_patch16_224.fb_in1k,46.173,53.827,62.423,37.577,22.44,224,0.900,bicubic,-48.427,-36.647,+160
regnety_160.deit_in1k,46.170,53.830,61.823,38.177,83.59,288,1.000,bicubic,-49.700,-37.617,-117
gernet_m.idstcv_in1k,46.163,53.837,62.687,37.313,21.14,224,0.875,bilinear,-48.377,-36.233,+174
repvit_m1_1.dist_450e_in1k,46.130,53.870,63.287,36.713,8.80,224,0.950,bicubic,-48.420,-35.863,+172
crossvit_base_240.in1k,46.120,53.880,60.207,39.793,105.03,240,0.875,bicubic,-48.950,-38.773,+59
resnest50d_1s4x24d.in1k,46.100,53.900,62.373,37.627,25.68,224,0.875,bicubic,-48.280,-36.697,+207
swinv2_cr_small_ns_224.sw_in1k,46.100,53.900,60.800,39.200,49.70,224,0.900,bicubic,-49.610,-38.600,-82
poolformerv2_m36.sail_in1k,46.050,53.950,62.247,37.753,56.08,224,1.000,bicubic,-49.000,-36.833,+61
tf_efficientnet_b0.ns_jft_in1k,46.043,53.957,63.277,36.723,5.29,224,0.875,bicubic,-47.707,-35.653,+307
poolformerv2_s36.sail_in1k,46.027,53.973,62.253,37.747,30.79,224,1.000,bicubic,-48.673,-36.977,+126
nest_base_jx.goog_in1k,46.027,53.973,60.100,39.900,67.72,224,0.875,bicubic,-49.513,-39.190,-53
vit_small_patch16_224.augreg_in21k_ft_in1k,46.020,53.980,61.830,38.170,22.05,224,0.900,bicubic,-48.870,-37.240,+89
resnet51q.ra2_in1k,46.020,53.980,60.903,39.097,35.70,288,1.000,bilinear,-49.180,-38.377,+18
vit_relpos_medium_patch16_224.sw_in1k,45.980,54.020,61.033,38.967,38.75,224,0.900,bicubic,-49.210,-38.187,+19
regnety_080.ra3_in1k,45.980,54.020,60.853,39.147,39.18,288,1.000,bicubic,-49.890,-38.577,-127
deit3_small_patch16_224.fb_in1k,45.943,54.057,58.883,41.117,22.06,224,0.900,bicubic,-48.747,-40.297,+127
resnest50d.in1k,45.933,54.067,62.620,37.380,27.48,224,0.875,bilinear,-48.667,-36.530,+147
convnext_nano.in12k_ft_in1k,45.900,54.100,62.693,37.307,15.59,288,1.000,bicubic,-49.450,-36.757,-25
crossvit_18_240.in1k,45.900,54.100,60.350,39.650,43.27,240,0.875,bicubic,-49.170,-38.680,+43
levit_384.fb_dist_in1k,45.877,54.123,61.683,38.317,39.13,224,0.900,bicubic,-49.333,-37.477,+7
regnety_032.ra_in1k,45.877,54.123,61.540,38.460,19.44,288,1.000,bicubic,-49.583,-37.850,-48
levit_conv_384.fb_dist_in1k,45.870,54.130,61.690,38.310,39.13,224,0.900,bicubic,-49.340,-37.590,+6
twins_svt_base.in1k,45.870,54.130,60.973,39.027,56.07,224,0.900,bicubic,-49.690,-38.257,-69
twins_pcpvt_base.in1k,45.857,54.143,61.330,38.670,43.83,224,0.900,bicubic,-49.613,-37.790,-55
convnext_tiny_hnf.a2h_in1k,45.853,54.147,60.180,39.820,28.59,288,1.000,bicubic,-49.397,-39.020,-10
crossvit_18_dagger_240.in1k,45.853,54.147,59.920,40.080,44.27,240,0.875,bicubic,-49.327,-39.240,+11
regnetz_c16.ra3_in1k,45.790,54.210,62.733,37.267,13.46,320,1.000,bicubic,-49.590,-36.687,-38
vit_relpos_medium_patch16_rpn_224.sw_in1k,45.743,54.257,60.983,39.017,38.73,224,0.900,bicubic,-49.317,-38.217,+38
vit_srelpos_medium_patch16_224.sw_in1k,45.723,54.277,61.073,38.927,38.74,224,0.900,bicubic,-49.217,-37.967,+62
crossvit_15_dagger_240.in1k,45.703,54.297,60.073,39.927,28.21,240,0.875,bicubic,-49.287,-39.117,+53
regnetx_320.tv2_in1k,45.673,54.327,60.233,39.767,107.81,224,0.965,bicubic,-49.607,-39.057,-22
convmixer_1536_20.in1k,45.663,54.337,61.743,38.257,51.63,224,0.960,bicubic,-49.307,-37.327,+54
gc_efficientnetv2_rw_t.agc_in1k,45.653,54.347,60.193,39.807,13.68,288,1.000,bicubic,-49.637,-39.187,-27
dm_nfnet_f0.dm_in1k,45.617,54.383,61.277,38.723,71.49,256,0.900,bicubic,-50.073,-38.073,-106
flexivit_small.1200ep_in1k,45.617,54.383,59.887,40.113,22.06,240,0.950,bicubic,-49.573,-39.333,0
efficientnetv2_rw_t.ra2_in1k,45.603,54.397,60.183,39.817,13.65,288,1.000,bicubic,-49.457,-39.037,+29
seresnext101_32x4d.gluon_in1k,45.600,54.400,61.160,38.840,48.96,224,0.875,bicubic,-48.830,-37.870,+164
xcit_tiny_24_p16_384.fb_dist_in1k,45.580,54.420,60.507,39.493,12.12,384,1.000,bicubic,-49.910,-38.883,-72
xcit_medium_24_p16_224.fb_in1k,45.537,54.463,59.007,40.993,84.40,224,1.000,bicubic,-49.593,-39.933,+9
repvit_m1_1.dist_300e_in1k,45.530,54.470,62.647,37.353,8.80,224,0.950,bicubic,-48.650,-36.433,+213
xcit_small_24_p16_224.fb_in1k,45.513,54.487,58.880,41.120,47.67,224,1.000,bicubic,-49.547,-40.190,+26
resnext101_64x4d.c1_in1k,45.450,54.550,59.033,40.967,83.46,288,1.000,bicubic,-50.080,-40.257,-81
resnet152d.gluon_in1k,45.427,54.573,60.080,39.920,60.21,224,0.875,bicubic,-49.003,-39.010,+160
regnety_320.seer_ft_in1k,45.420,54.580,62.227,37.773,145.05,384,1.000,bicubic,-50.370,-37.343,-140
nfnet_l0.ra2_in1k,45.417,54.583,62.087,37.913,35.07,288,1.000,bicubic,-49.963,-37.123,-57
resnext50_32x4d.fb_ssl_yfcc100m_ft_in1k,45.413,54.587,62.023,37.977,25.03,224,0.875,bilinear,-49.287,-37.057,+90
xcit_small_12_p16_224.fb_in1k,45.413,54.587,59.403,40.597,26.25,224,1.000,bicubic,-49.407,-39.657,+69
resnetv2_50x1_bit.goog_distilled_in1k,45.407,54.593,62.297,37.703,25.55,224,0.875,bicubic,-50.003,-37.023,-67
cs3se_edgenet_x.c2ns_in1k,45.393,54.607,60.447,39.553,50.72,320,1.000,bicubic,-50.607,-38.903,-192
resnet101.tv2_in1k,45.363,54.637,60.060,39.940,44.55,224,0.965,bilinear,-49.477,-38.970,+63
nest_small_jx.goog_in1k,45.350,54.650,59.040,40.960,38.35,224,0.875,bicubic,-50.190,-40.180,-93
tf_efficientnet_b7.aa_in1k,45.307,54.693,61.730,38.270,66.35,600,0.949,bicubic,-50.763,-37.610,-211
resnet61q.ra2_in1k,45.280,54.720,59.407,40.593,36.85,288,1.000,bicubic,-49.850,-39.853,-6
pvt_v2_b2.in1k,45.277,54.723,60.613,39.387,25.36,224,0.900,bicubic,-49.733,-38.437,+24
cs3edgenet_x.c2_in1k,45.257,54.743,60.260,39.740,47.82,288,1.000,bicubic,-50.203,-39.060,-81
nasnetalarge.tf_in1k,45.223,54.777,57.893,42.107,88.75,331,0.911,bicubic,-49.927,-41.377,-15
focalnet_tiny_lrf.ms_in1k,45.217,54.783,61.300,38.700,28.65,224,0.900,bicubic,-49.973,-37.880,-24
tresnet_xl.miil_in1k_448,45.213,54.787,61.447,38.553,78.44,448,0.875,bilinear,-50.297,-37.893,-95
flexivit_small.600ep_in1k,45.203,54.797,59.420,40.580,22.06,240,0.950,bicubic,-50.057,-39.960,-47
convit_small.fb_in1k,45.200,54.800,60.487,39.513,27.78,224,0.875,bicubic,-49.720,-38.693,+36
efficientvit_b2.r224_in1k,45.193,54.807,59.150,40.850,24.33,224,0.950,bicubic,-49.657,-39.970,+51
swin_small_patch4_window7_224.ms_in1k,45.183,54.817,60.357,39.643,49.61,224,0.900,bicubic,-50.527,-38.933,-138
sequencer2d_s.in1k,45.113,54.887,60.067,39.933,27.65,224,0.875,bicubic,-50.347,-39.193,-88
tf_efficientnet_b3.aa_in1k,45.110,54.890,60.637,39.363,12.23,300,0.904,bicubic,-49.800,-38.473,+34
rexnet_200.nav_in1k,45.063,54.937,62.323,37.677,16.37,224,0.875,bicubic,-49.607,-36.767,+87
maxxvit_rmlp_nano_rw_256.sw_in1k,45.050,54.950,59.663,40.337,16.78,256,0.950,bicubic,-50.290,-39.677,-70
resnetrs152.tf_in1k,44.950,55.050,59.690,40.310,86.62,320,1.000,bicubic,-51.010,-39.660,-199
deit_base_patch16_224.fb_in1k,44.867,55.133,59.170,40.830,86.57,224,0.900,bicubic,-50.153,-40.000,+9
flexivit_small.300ep_in1k,44.850,55.150,59.360,40.640,22.06,240,0.950,bicubic,-50.300,-39.770,-29
focalnet_tiny_srf.ms_in1k,44.833,55.167,61.040,38.960,28.43,224,0.900,bicubic,-50.207,-38.240,+2
coatnet_bn_0_rw_224.sw_in1k,44.803,55.197,60.913,39.087,27.44,224,0.950,bicubic,-50.177,-38.317,+13
tiny_vit_11m_224.in1k,44.803,55.197,60.097,39.903,11.00,224,0.950,bicubic,-50.437,-39.223,-54
deit_base_patch16_384.fb_in1k,44.770,55.230,59.600,40.400,86.86,384,1.000,bicubic,-50.870,-39.820,-136
resmlp_36_224.fb_distilled_in1k,44.763,55.237,61.073,38.927,44.69,224,0.875,bicubic,-49.797,-38.087,+99
cait_xxs36_384.fb_dist_in1k,44.760,55.240,59.387,40.613,17.37,384,1.000,bicubic,-50.480,-39.933,-59
resnet101.a1h_in1k,44.727,55.273,59.120,40.880,44.55,288,1.000,bicubic,-50.853,-40.130,-127
gernet_l.idstcv_in1k,44.720,55.280,58.950,41.050,31.08,256,0.875,bilinear,-50.210,-40.250,+16
tf_efficientnet_b2.ap_in1k,44.710,55.290,60.687,39.313,9.11,260,0.890,bicubic,-49.560,-38.353,+158
resmlp_24_224.fb_distilled_in1k,44.707,55.293,61.463,38.537,30.02,224,0.875,bicubic,-49.623,-37.627,+145
xcit_tiny_24_p16_224.fb_dist_in1k,44.707,55.293,59.417,40.583,12.12,224,1.000,bicubic,-49.533,-39.543,+162
repvit_m2.dist_in1k,44.673,55.327,61.730,38.270,8.80,224,0.950,bicubic,-49.727,-37.320,+128
regnety_032.tv2_in1k,44.650,55.350,60.897,39.103,19.44,224,0.965,bicubic,-50.220,-38.243,+24
ecaresnetlight.miil_in1k,44.580,55.420,60.383,39.617,30.16,288,0.950,bicubic,-49.950,-38.797,+95
swinv2_tiny_window16_256.ms_in1k,44.570,55.430,59.573,40.427,28.35,256,0.900,bicubic,-50.760,-39.577,-87
vit_relpos_small_patch16_224.sw_in1k,44.553,55.447,60.210,39.790,21.98,224,0.900,bicubic,-50.127,-38.890,+59
gmlp_s16_224.ra3_in1k,44.470,55.530,58.617,41.383,19.42,224,0.875,bicubic,-49.030,-40.223,+274
tiny_vit_5m_224.dist_in22k_ft_in1k,44.460,55.540,60.940,39.060,5.39,224,0.950,bicubic,-50.170,-38.200,+68
regnety_160.tv2_in1k,44.427,55.573,59.227,40.773,83.59,224,0.965,bicubic,-50.733,-40.023,-50
resnetv2_101.a1h_in1k,44.423,55.577,58.697,41.303,44.54,288,1.000,bicubic,-51.147,-40.573,-138
inception_resnet_v2.tf_ens_adv_in1k,44.393,55.607,58.113,41.887,55.84,299,0.897,bicubic,-49.727,-40.737,+175
tresnet_l.miil_in1k,44.387,55.613,59.947,40.053,55.99,224,0.875,bilinear,-50.513,-39.083,+9
repvit_m1_0.dist_450e_in1k,44.373,55.627,61.417,38.583,7.30,224,0.950,bicubic,-49.897,-37.623,+141
maxxvitv2_nano_rw_256.sw_in1k,44.373,55.627,58.830,41.170,23.70,256,0.950,bicubic,-51.057,-40.360,-114
repvit_m1_0.dist_300e_in1k,44.367,55.633,61.453,38.547,7.30,224,0.950,bicubic,-49.393,-37.467,+220
resnext101_32x4d.gluon_in1k,44.280,55.720,59.057,40.943,44.18,224,0.875,bicubic,-49.850,-39.883,+167
gcvit_xtiny.in1k,44.247,55.753,59.973,40.027,19.98,224,0.875,bicubic,-50.773,-39.187,-20
regnety_080_tv.tv2_in1k,44.183,55.817,58.787,41.213,39.38,224,0.965,bicubic,-51.117,-40.573,-94
maxvit_rmlp_nano_rw_256.sw_in1k,44.180,55.820,58.243,41.757,15.50,256,0.950,bicubic,-51.260,-40.817,-121
poolformer_m48.sail_in1k,44.160,55.840,59.153,40.847,73.47,224,0.950,bicubic,-50.940,-39.947,-43
regnetz_c16_evos.ch_in1k,44.143,55.857,61.067,38.933,13.49,320,0.950,bicubic,-51.497,-38.173,-164
vit_srelpos_small_patch16_224.sw_in1k,44.130,55.870,59.693,40.307,21.97,224,0.900,bicubic,-50.420,-39.397,+74
resnetv2_101x1_bit.goog_in21k_ft_in1k,44.127,55.873,61.950,38.050,44.54,448,1.000,bilinear,-51.193,-37.420,-103
crossvit_15_240.in1k,44.110,55.890,59.143,40.857,27.53,240,0.875,bicubic,-50.590,-40.097,+36
fastvit_sa12.apple_dist_in1k,44.103,55.897,59.110,40.890,11.58,256,0.900,bicubic,-50.577,-39.810,+41
convnextv2_nano.fcmae_ft_in1k,44.100,55.900,60.167,39.833,15.62,288,1.000,bicubic,-51.040,-39.043,-62
pit_b_224.in1k,44.080,55.920,58.023,41.977,73.76,224,0.900,bicubic,-50.730,-41.237,+14
resnet152s.gluon_in1k,44.060,55.940,58.710,41.290,60.32,224,0.875,bicubic,-50.650,-40.350,+28
resnet50.fb_ssl_yfcc100m_ft_in1k,44.023,55.977,61.890,38.110,25.56,224,0.875,bilinear,-50.287,-37.260,+118
poolformer_m36.sail_in1k,44.007,55.993,59.033,40.967,56.17,224,0.950,bicubic,-51.023,-40.067,-36
inception_resnet_v2.tf_in1k,44.007,55.993,57.920,42.080,55.84,299,0.897,bicubic,-50.353,-40.880,+110
pnasnet5large.tf_in1k,43.943,56.057,56.740,43.260,86.06,331,0.911,bicubic,-51.417,-42.390,-119
resnext101_64x4d.gluon_in1k,43.913,56.087,58.707,41.293,83.46,224,0.875,bicubic,-50.437,-40.363,+111
wide_resnet50_2.tv2_in1k,43.907,56.093,59.630,40.370,68.88,224,0.965,bilinear,-50.903,-39.350,+2
coatnext_nano_rw_224.sw_in1k,43.907,56.093,58.663,41.337,14.70,224,0.900,bicubic,-50.943,-40.537,-3
pit_s_224.in1k,43.907,56.093,58.640,41.360,23.46,224,0.900,bicubic,-50.683,-40.500,+52
coat_lite_small.in1k,43.817,56.183,57.127,42.873,19.84,224,0.900,bicubic,-51.253,-41.993,-53
mobilevitv2_200.cvnets_in22k_ft_in1k,43.810,56.190,59.490,40.510,18.45,256,0.888,bicubic,-51.240,-39.670,-46
tnt_s_patch16_224,43.777,56.223,59.220,40.780,23.76,224,0.900,bicubic,-50.783,-39.920,+53
regnetv_040.ra3_in1k,43.777,56.223,58.443,41.557,20.64,288,1.000,bicubic,-51.953,-40.937,-201
swinv2_cr_small_224.sw_in1k,43.767,56.233,57.717,42.283,49.70,224,0.900,bicubic,-51.643,-41.343,-137
cspresnext50.ra_in1k,43.763,56.237,60.147,39.853,20.57,256,0.887,bilinear,-50.477,-38.903,+120
cait_xxs36_224.fb_dist_in1k,43.757,56.243,58.740,41.260,17.30,224,1.000,bicubic,-50.153,-40.350,+172
pit_xs_distilled_224.in1k,43.723,56.277,60.657,39.343,11.00,224,0.900,bicubic,-49.567,-38.133,+271
swin_s3_tiny_224.ms_in1k,43.717,56.283,59.507,40.493,28.33,224,0.900,bicubic,-51.203,-39.593,-29
tf_efficientnetv2_s.in1k,43.710,56.290,58.593,41.407,21.46,384,1.000,bicubic,-52.000,-40.787,-204
rexnet_150.nav_in1k,43.693,56.307,60.923,39.077,9.73,224,0.875,bicubic,-50.587,-38.057,+105
xcit_tiny_12_p8_224.fb_dist_in1k,43.643,56.357,58.477,41.523,6.71,224,1.000,bicubic,-51.047,-40.283,+14
edgenext_small.usi_in1k,43.637,56.363,59.893,40.107,5.59,320,1.000,bicubic,-51.193,-39.517,-14
tf_efficientnet_b5.in1k,43.623,56.377,60.133,39.867,30.39,456,0.934,bicubic,-52.247,-39.257,-238
efficientformer_l1.snap_dist_in1k,43.593,56.407,59.957,40.043,12.29,224,0.950,bicubic,-50.347,-38.973,+158
maxvit_nano_rw_256.sw_in1k,43.523,56.477,57.610,42.390,15.45,256,0.950,bicubic,-51.947,-41.780,-160
wide_resnet50_2.racm_in1k,43.503,56.497,59.053,40.947,68.88,288,0.950,bicubic,-51.627,-40.237,-88
cs3sedarknet_x.c2ns_in1k,43.503,56.497,58.770,41.230,35.40,288,1.000,bicubic,-51.907,-40.660,-151
coatnet_rmlp_nano_rw_224.sw_in1k,43.503,56.497,58.610,41.390,15.15,224,0.900,bicubic,-51.587,-40.560,-75
crossvit_small_240.in1k,43.473,56.527,58.950,41.050,26.86,240,0.875,bicubic,-51.107,-40.100,+32
regnety_016.tv2_in1k,43.433,56.567,59.540,40.460,11.20,224,0.965,bicubic,-50.977,-39.500,+68
resnet101d.gluon_in1k,43.430,56.570,58.627,41.373,44.57,224,0.875,bicubic,-50.750,-40.383,+118
ecaresnet50t.ra2_in1k,43.417,56.583,59.313,40.687,25.57,320,0.950,bicubic,-51.663,-39.827,-79
resnet101s.gluon_in1k,43.370,56.630,58.517,41.483,44.67,224,0.875,bicubic,-50.810,-40.423,+115
efficientvit_b1.r288_in1k,43.367,56.633,57.850,42.150,9.10,288,1.000,bicubic,-50.863,-40.970,+104
cspdarknet53.ra_in1k,43.360,56.640,59.420,40.580,27.64,256,0.887,bilinear,-50.740,-39.560,+125
tf_efficientnet_b4.in1k,43.330,56.670,59.447,40.553,19.34,380,0.922,bicubic,-52.150,-39.823,-174
xcit_tiny_24_p8_224.fb_in1k,43.323,56.677,57.277,42.723,12.11,224,1.000,bicubic,-51.577,-41.743,-44
xcit_tiny_12_p8_384.fb_dist_in1k,43.317,56.683,58.177,41.823,6.71,384,1.000,bicubic,-52.023,-41.133,-149
visformer_small.in1k,43.273,56.727,57.977,42.023,40.22,224,0.900,bicubic,-51.697,-41.193,-60
convmixer_768_32.in1k,43.270,56.730,59.383,40.617,21.11,224,0.960,bicubic,-51.170,-39.497,+52
repvit_m0_9.dist_300e_in1k,43.243,56.757,60.377,39.623,5.49,224,0.950,bicubic,-50.217,-38.573,+222
eca_nfnet_l0.ra2_in1k,43.227,56.773,59.930,40.070,24.14,288,1.000,bicubic,-52.233,-39.350,-176
ecaresnet101d_pruned.miil_in1k,43.220,56.780,58.967,41.033,24.88,288,0.950,bicubic,-51.780,-40.263,-69
regnety_064.ra3_in1k,43.213,56.787,57.253,42.747,30.58,288,1.000,bicubic,-52.577,-42.097,-243
poolformerv2_s24.sail_in1k,43.190,56.810,60.430,39.570,21.34,224,1.000,bicubic,-51.280,-38.580,+40
regnetx_160.tv2_in1k,43.187,56.813,57.480,42.520,54.28,224,0.965,bicubic,-52.023,-41.680,-126
vit_relpos_base_patch32_plus_rpn_256.sw_in1k,43.183,56.817,58.403,41.597,119.42,256,0.900,bicubic,-49.957,-39.907,+258
vit_small_patch32_384.augreg_in21k_ft_in1k,43.150,56.850,59.313,40.687,22.92,384,1.000,bicubic,-51.440,-39.617,+11
resnest26d.gluon_in1k,43.140,56.860,60.637,39.363,17.07,224,0.875,bilinear,-50.080,-38.213,+246
twins_pcpvt_small.in1k,43.120,56.880,58.890,41.110,24.11,224,0.900,bicubic,-51.480,-40.210,+4
regnetx_080.tv2_in1k,43.090,56.910,57.923,42.077,39.57,224,0.965,bicubic,-51.640,-41.107,-25
repvit_m0_9.dist_450e_in1k,43.073,56.927,60.200,39.800,5.49,224,0.950,bicubic,-50.367,-38.450,+215
resmlp_36_224.fb_in1k,43.060,56.940,59.300,40.700,44.69,224,0.875,bicubic,-50.590,-39.350,+174
cspresnet50.ra_in1k,43.050,56.950,59.157,40.843,21.62,256,0.887,bilinear,-50.820,-39.733,+139
coatnet_nano_rw_224.sw_in1k,43.030,56.970,57.927,42.073,15.14,224,0.900,bicubic,-52.020,-41.223,-91
ecaresnet50d.miil_in1k,43.020,56.980,59.417,40.583,25.58,288,0.950,bicubic,-51.650,-39.813,-14
tf_efficientnet_lite4.in1k,42.990,57.010,57.630,42.370,13.01,380,0.920,bilinear,-51.880,-41.470,-55
twins_svt_small.in1k,42.917,57.083,58.460,41.540,24.06,224,0.900,bicubic,-51.843,-40.490,-39
dpn131.mx_in1k,42.917,57.083,57.130,42.870,79.25,224,0.875,bicubic,-50.863,-41.830,+149
mobilevitv2_200.cvnets_in22k_ft_in1k_384,42.913,57.087,58.973,41.027,18.45,384,1.000,bicubic,-52.477,-40.307,-179
resnet152.gluon_in1k,42.900,57.100,57.747,42.253,60.19,224,0.875,bicubic,-51.130,-41.103,+110
fastvit_sa12.apple_in1k,42.880,57.120,58.800,41.200,11.58,256,0.900,bicubic,-51.550,-40.200,+34
fbnetv3_d.ra2_in1k,42.850,57.150,59.677,40.323,10.31,256,0.950,bilinear,-51.020,-39.193,+129
levit_conv_256.fb_dist_in1k,42.823,57.177,57.897,42.103,18.89,224,0.900,bicubic,-51.577,-41.163,+36
resnet50.tv2_in1k,42.820,57.180,58.570,41.430,25.56,224,0.965,bilinear,-51.780,-40.460,-9
resnet152c.gluon_in1k,42.813,57.187,57.720,42.280,60.21,224,0.875,bicubic,-51.057,-41.080,+128
levit_256.fb_dist_in1k,42.810,57.190,57.897,42.103,18.89,224,0.900,bicubic,-51.590,-41.163,+32
tf_efficientnet_b1.ap_in1k,42.800,57.200,58.820,41.180,7.79,240,0.882,bicubic,-50.830,-39.980,+165
resnext50_32x4d.tv2_in1k,42.780,57.220,57.567,42.433,25.03,224,0.965,bilinear,-51.680,-41.333,+18
gcresnet50t.ra2_in1k,42.767,57.233,59.033,40.967,25.90,288,1.000,bicubic,-52.013,-40.087,-53
coatnet_0_rw_224.sw_in1k,42.753,57.247,56.233,43.767,27.44,224,0.950,bicubic,-52.147,-42.957,-76
tresnet_l.miil_in1k_448,42.740,57.260,58.943,41.057,55.99,448,0.875,bilinear,-52.660,-40.457,-193
cs3darknet_x.c2ns_in1k,42.727,57.273,58.190,41.810,35.05,288,1.000,bicubic,-52.553,-41.120,-171
dpn107.mx_in1k,42.710,57.290,57.160,42.840,86.92,224,0.875,bicubic,-51.300,-41.870,+102
seresnext50_32x4d.gluon_in1k,42.683,57.317,58.707,41.293,27.56,224,0.875,bicubic,-51.487,-40.213,+76
convnext_nano.d1h_in1k,42.677,57.323,57.577,42.423,15.59,288,1.000,bicubic,-52.193,-41.653,-75
tresnet_m.miil_in1k,42.670,57.330,58.160,41.840,31.39,224,0.875,bilinear,-51.410,-40.670,+88
fastvit_s12.apple_dist_in1k,42.643,57.357,58.160,41.840,9.47,256,0.900,bicubic,-51.637,-40.700,+48
resnetaa50d.sw_in12k_ft_in1k,42.610,57.390,58.490,41.510,25.58,288,1.000,bicubic,-52.680,-40.730,-180
xcit_tiny_12_p16_384.fb_dist_in1k,42.593,57.407,58.083,41.917,6.72,384,1.000,bicubic,-51.927,-40.887,-6
regnety_040.ra3_in1k,42.577,57.423,57.010,42.990,20.65,288,1.000,bicubic,-52.913,-42.240,-223
resnext101_32x8d.tv_in1k,42.563,57.437,58.303,41.697,88.79,224,0.875,bilinear,-51.217,-40.547,+124
efficientvit_b1.r256_in1k,42.527,57.473,57.427,42.573,9.10,256,1.000,bicubic,-51.183,-41.363,+136
nf_resnet50.ra2_in1k,42.517,57.483,59.530,40.470,25.56,288,0.940,bicubic,-51.863,-39.290,+21
mobilevitv2_175.cvnets_in22k_ft_in1k,42.517,57.483,58.117,41.883,14.25,256,0.888,bicubic,-52.263,-40.973,-66
seresnext50_32x4d.racm_in1k,42.433,57.567,58.093,41.907,27.56,288,0.950,bicubic,-52.567,-41.097,-111
resnetrs101.tf_in1k,42.417,57.583,57.283,42.717,63.62,288,0.940,bicubic,-52.833,-41.697,-180
poolformer_s36.sail_in1k,42.383,57.617,58.827,41.173,30.86,224,0.900,bicubic,-52.237,-40.293,-36
repvit_m1.dist_in1k,42.350,57.650,59.677,40.323,5.49,224,0.950,bicubic,-51.030,-38.973,+191
nest_tiny_jx.goog_in1k,42.320,57.680,57.067,42.933,17.06,224,0.875,bicubic,-52.600,-42.103,-99
mobileone_s4.apple_in1k,42.317,57.683,58.033,41.967,14.95,224,0.900,bilinear,-51.333,-40.917,+137
tf_efficientnetv2_b3.in1k,42.313,57.687,57.943,42.057,14.36,300,0.904,bicubic,-52.807,-41.277,-147
xcit_tiny_24_p16_224.fb_in1k,42.277,57.723,56.837,43.163,12.12,224,1.000,bicubic,-51.573,-41.913,+104
resnet152.a1_in1k,42.273,57.727,55.530,44.470,60.19,288,1.000,bicubic,-52.817,-43.460,-142
convmixer_1024_20_ks9_p14.in1k,42.267,57.733,59.723,40.277,24.38,224,0.960,bicubic,-50.063,-38.707,+292
deit_small_patch16_224.fb_in1k,42.263,57.737,58.030,41.970,22.05,224,0.900,bicubic,-51.727,-41.010,+83
mobileone_s3.apple_in1k,42.257,57.743,59.270,40.730,10.17,224,0.900,bilinear,-50.723,-39.350,+228
tf_efficientnet_cc_b1_8e.in1k,42.247,57.753,58.430,41.570,39.72,240,0.882,bicubic,-51.333,-40.260,+144
legacy_senet154.in1k,42.230,57.770,56.620,43.380,115.09,224,0.875,bilinear,-52.500,-42.480,-74
cait_xxs24_384.fb_dist_in1k,42.183,57.817,57.473,42.527,12.03,384,1.000,bicubic,-52.767,-41.657,-116
dpn98.mx_in1k,42.180,57.820,56.597,43.403,61.57,224,0.875,bicubic,-51.750,-42.313,+83
xception41p.ra3_in1k,42.160,57.840,56.883,43.117,26.91,299,0.940,bicubic,-52.890,-42.257,-140
tf_efficientnet_b2.aa_in1k,42.123,57.877,58.197,41.803,9.11,260,0.890,bicubic,-52.097,-40.733,+38
resnet50.b1k_in1k,42.080,57.920,58.153,41.847,25.56,288,1.000,bicubic,-52.430,-40.917,-24
resnext50_32x4d.gluon_in1k,42.050,57.950,57.673,42.327,25.03,224,0.875,bicubic,-51.620,-41.017,+121
convnext_nano_ols.d1h_in1k,42.023,57.977,56.867,43.133,15.65,288,1.000,bicubic,-52.557,-42.253,-44
pvt_v2_b1.in1k,41.953,58.047,59.593,40.407,14.01,224,0.900,bicubic,-51.537,-39.117,+148
xcit_tiny_12_p16_224.fb_dist_in1k,41.940,58.060,57.247,42.753,6.72,224,1.000,bicubic,-51.410,-41.443,+174
efficientnet_b2.ra_in1k,41.937,58.063,58.297,41.703,9.11,288,1.000,bicubic,-52.423,-40.743,+2
mobilevitv2_150.cvnets_in22k_ft_in1k,41.937,58.063,57.913,42.087,10.59,256,0.888,bicubic,-52.743,-41.197,-69
resnet50.b2k_in1k,41.910,58.090,57.673,42.327,25.56,288,1.000,bicubic,-52.390,-41.257,+12
efficientformerv2_s1.snap_dist_in1k,41.870,58.130,57.973,42.027,6.19,224,0.950,bicubic,-51.970,-40.917,+86
gcvit_xxtiny.in1k,41.837,58.163,58.440,41.560,12.00,224,0.875,bicubic,-52.213,-40.640,+54
fastvit_t12.apple_dist_in1k,41.817,58.183,57.597,42.403,7.55,256,0.900,bicubic,-52.283,-41.353,+47
tf_efficientnet_b3.in1k,41.803,58.197,58.057,41.943,12.23,300,0.904,bicubic,-52.487,-41.043,+9
mobilevitv2_150.cvnets_in22k_ft_in1k_384,41.790,58.210,57.807,42.193,10.59,384,1.000,bicubic,-53.560,-41.313,-228
resnet50d.ra2_in1k,41.787,58.213,58.023,41.977,25.58,288,0.950,bicubic,-53.023,-40.797,-105
resnext50_32x4d.a1h_in1k,41.753,58.247,56.443,43.557,25.03,288,1.000,bicubic,-53.237,-42.717,-142
gcresnext50ts.ch_in1k,41.707,58.293,57.403,42.597,15.67,288,1.000,bicubic,-52.783,-41.607,-36
efficientvit_b1.r224_in1k,41.703,58.297,56.600,43.400,9.10,224,0.950,bicubic,-51.627,-41.970,+166
poolformer_s24.sail_in1k,41.700,58.300,58.470,41.530,21.39,224,0.900,bicubic,-52.690,-40.590,-16
edgenext_small_rw.sw_in1k,41.677,58.323,58.517,41.483,7.83,320,1.000,bicubic,-52.683,-40.533,-10
mobilevitv2_175.cvnets_in22k_ft_in1k_384,41.673,58.327,58.010,41.990,14.25,384,1.000,bicubic,-53.587,-41.150,-218
hrnet_w64.ms_in1k,41.643,58.357,57.113,42.887,128.06,224,0.875,bilinear,-52.187,-41.827,+75
dla102x2.in1k,41.637,58.363,57.970,42.030,41.28,224,0.875,bilinear,-52.353,-40.990,+53
senet154.gluon_in1k,41.627,58.373,56.353,43.647,115.09,224,0.875,bicubic,-53.073,-42.617,-92
seresnet50.ra2_in1k,41.593,58.407,57.987,42.013,28.09,288,0.950,bicubic,-53.147,-41.123,-103
inception_v4.tf_in1k,41.557,58.443,55.380,44.620,42.68,299,0.875,bicubic,-52.823,-43.690,-20
cs3sedarknet_l.c2ns_in1k,41.553,58.447,57.350,42.650,21.91,288,0.950,bicubic,-53.557,-41.860,-183
convnext_tiny.fb_in22k_ft_in1k,41.533,58.467,55.470,44.530,28.59,288,1.000,bicubic,-51.997,-43.130,+121
haloregnetz_b.ra3_in1k,41.527,58.473,57.087,42.913,11.68,224,0.940,bicubic,-52.993,-41.893,-54
swinv2_cr_tiny_ns_224.sw_in1k,41.523,58.477,57.183,42.817,28.33,224,0.900,bicubic,-53.247,-41.927,-113
efficientnet_em.ra2_in1k,41.490,58.510,58.873,41.127,6.90,240,0.882,bicubic,-52.240,-40.057,+79
efficientnet_el.ra_in1k,41.490,58.510,58.293,41.707,10.59,300,0.904,bicubic,-53.180,-40.837,-89
tf_efficientnet_cc_b0_8e.in1k,41.483,58.517,57.370,42.630,24.01,224,0.875,bicubic,-51.387,-41.090,+209
convnextv2_pico.fcmae_ft_in1k,41.477,58.523,58.050,41.950,9.07,288,0.950,bicubic,-53.083,-41.120,-68
halo2botnet50ts_256.a1h_in1k,41.460,58.540,56.190,43.810,22.64,256,0.950,bicubic,-53.550,-42.950,-164
swin_tiny_patch4_window7_224.ms_in1k,41.453,58.547,57.303,42.697,28.29,224,0.900,bicubic,-53.167,-41.847,-87
resnetv2_50d_evos.ah_in1k,41.407,58.593,56.500,43.500,25.59,288,1.000,bicubic,-53.513,-42.600,-151
resnet50.a1h_in1k,41.387,58.613,56.677,43.323,25.56,224,1.000,bicubic,-52.813,-42.243,+6
cait_xxs24_224.fb_dist_in1k,41.370,58.630,57.520,42.480,11.96,224,1.000,bicubic,-52.080,-41.280,+124
swinv2_tiny_window8_256.ms_in1k,41.370,58.630,57.150,42.850,28.35,256,0.900,bicubic,-53.650,-41.820,-173
resnet152.tv_in1k,41.330,58.670,57.513,42.487,60.19,224,0.875,bilinear,-51.910,-41.187,+149
dpn68b.ra_in1k,41.297,58.703,55.077,44.923,12.61,288,1.000,bicubic,-52.493,-43.863,+61
resnet50.ram_in1k,41.297,58.703,55.033,44.967,25.56,288,0.950,bicubic,-52.703,-43.847,+32
cs3darknet_l.c2ns_in1k,41.277,58.723,57.377,42.623,21.16,288,0.950,bicubic,-53.393,-41.883,-103
xception71.tf_in1k,41.273,58.727,55.890,44.110,42.34,299,0.903,bicubic,-52.647,-43.060,+39
gernet_s.idstcv_in1k,41.260,58.740,58.823,41.177,8.17,224,0.875,bilinear,-51.180,-39.677,+232
inception_v3.tf_adv_in1k,41.253,58.747,56.327,43.673,23.83,299,0.875,bicubic,-51.757,-42.493,+171
resnet101.a1_in1k,41.207,58.793,54.273,45.727,44.55,288,1.000,bicubic,-53.733,-44.927,-164
dpn92.mx_in1k,41.203,58.797,56.217,43.783,37.67,224,0.875,bicubic,-52.947,-42.733,+2
resnetv2_50d_gn.ah_in1k,41.113,58.887,56.520,43.480,25.57,288,1.000,bicubic,-54.107,-42.510,-235
resnet50.c1_in1k,41.090,58.910,56.453,43.547,25.56,288,1.000,bicubic,-53.420,-42.547,-74
nf_regnet_b1.ra2_in1k,41.010,58.990,58.133,41.867,10.22,288,0.900,bicubic,-52.880,-40.757,+36
resnet50d.gluon_in1k,40.963,59.037,57.130,42.870,25.58,224,0.875,bicubic,-52.577,-41.580,+94
fbnetv3_b.ra2_in1k,40.960,59.040,58.650,41.350,8.60,256,0.950,bilinear,-52.670,-40.300,+76
resnet152.a3_in1k,40.940,59.060,55.030,44.970,60.19,224,0.950,bicubic,-53.500,-44.150,-63
inception_v3.gluon_in1k,40.910,59.090,55.623,44.377,23.83,299,0.875,bicubic,-52.650,-43.217,+86
ecaresnet50d_pruned.miil_in1k,40.907,59.093,57.633,42.367,19.94,288,0.950,bicubic,-53.393,-41.567,-37
cs3darknet_focus_l.c2ns_in1k,40.893,59.107,56.630,43.370,21.15,288,0.950,bicubic,-53.897,-42.530,-144
resnetv2_50.a1h_in1k,40.890,59.110,56.383,43.617,25.55,288,1.000,bicubic,-53.790,-42.707,-121
levit_conv_192.fb_dist_in1k,40.840,59.160,56.707,43.293,10.95,224,0.900,bicubic,-52.870,-42.113,+59
levit_192.fb_dist_in1k,40.837,59.163,56.710,43.290,10.95,224,0.900,bicubic,-52.873,-42.220,+57
tiny_vit_5m_224.in1k,40.827,59.173,57.257,42.743,5.39,224,0.950,bicubic,-52.963,-41.283,+39
regnety_320.pycls_in1k,40.810,59.190,56.110,43.890,145.05,224,0.875,bicubic,-53.710,-42.850,-92
regnetx_032.tv2_in1k,40.790,59.210,56.627,43.373,15.30,224,0.965,bicubic,-53.730,-42.543,-88
eva02_tiny_patch14_336.mim_in22k_ft_in1k,40.767,59.233,56.067,43.933,5.76,336,1.000,bicubic,-53.683,-43.033,-78
maxvit_rmlp_pico_rw_256.sw_in1k,40.757,59.243,55.203,44.797,7.52,256,0.950,bicubic,-53.453,-43.747,-24
legacy_xception.tf_in1k,40.753,59.247,56.397,43.603,22.86,299,0.897,bicubic,-52.867,-42.373,+65
lamhalobotnet50ts_256.a1h_in1k,40.753,59.247,56.103,43.897,22.57,256,0.950,bicubic,-54.057,-43.127,-156
resnet152.a2_in1k,40.737,59.263,54.280,45.720,60.19,288,1.000,bicubic,-54.233,-44.930,-189
vit_base_patch32_384.augreg_in1k,40.710,59.290,55.197,44.803,88.30,384,1.000,bicubic,-52.450,-43.413,+128
skresnext50_32x4d.ra_in1k,40.703,59.297,56.023,43.977,27.48,224,0.875,bicubic,-53.267,-42.807,+7
resnet101.gluon_in1k,40.693,59.307,56.137,43.863,44.55,224,0.875,bicubic,-53.067,-42.563,+35
hrnet_w40.ms_in1k,40.660,59.340,56.757,43.243,57.56,224,0.875,bilinear,-53.070,-42.043,+39
resmlp_24_224.fb_in1k,40.657,59.343,56.557,43.443,30.02,224,0.875,bicubic,-52.763,-42.273,+98
resnext50_32x4d.ra_in1k,40.627,59.373,56.340,43.660,25.03,288,0.950,bicubic,-53.703,-42.690,-58
resnet50.am_in1k,40.610,59.390,57.377,42.623,25.56,224,0.875,bicubic,-53.030,-41.493,+52
repvgg_b1.rvgg_in1k,40.607,59.393,57.843,42.157,57.42,224,0.875,bilinear,-52.833,-40.947,+90
ese_vovnet39b.ra_in1k,40.590,59.410,56.600,43.400,24.57,288,0.950,bicubic,-53.890,-42.460,-96
halonet50ts.a1h_in1k,40.577,59.423,55.193,44.807,22.73,256,0.940,bicubic,-54.143,-43.867,-153
tf_efficientnet_lite3.in1k,40.560,59.440,56.467,43.533,8.20,300,0.904,bilinear,-53.530,-42.373,-19
mobilevitv2_175.cvnets_in1k,40.537,59.463,56.273,43.727,14.25,256,0.888,bicubic,-53.693,-42.657,-44
xcit_tiny_12_p8_224.fb_in1k,40.537,59.463,55.647,44.353,6.71,224,1.000,bicubic,-53.813,-43.233,-69
dla169.in1k,40.523,59.477,57.240,42.760,53.39,224,0.875,bilinear,-53.277,-41.670,+17
tresnet_m.miil_in1k_448,40.520,59.480,56.687,43.313,31.39,448,0.875,bilinear,-54.140,-42.463,-139
regnetz_b16.ra3_in1k,40.507,59.493,56.003,43.997,9.72,288,1.000,bicubic,-54.163,-43.127,-142
pit_xs_224.in1k,40.493,59.507,56.540,43.460,10.62,224,0.900,bicubic,-52.397,-42.240,+152
resnetaa50.a1h_in1k,40.483,59.517,55.987,44.013,25.56,288,1.000,bicubic,-54.367,-42.983,-184
regnetx_320.pycls_in1k,40.470,59.530,55.653,44.347,107.81,224,0.875,bicubic,-53.760,-43.297,-53
vit_base_patch16_384.augreg_in1k,40.470,59.530,53.250,46.750,86.86,384,1.000,bicubic,-53.970,-45.780,-98
repvgg_b2.rvgg_in1k,40.467,59.533,57.780,42.220,89.02,224,0.875,bilinear,-53.113,-41.010,+48
coat_mini.in1k,40.420,59.580,55.200,44.800,10.34,224,0.900,bicubic,-54.340,-43.890,-172
eca_resnet33ts.ra2_in1k,40.400,59.600,57.340,42.660,19.68,288,1.000,bicubic,-53.850,-41.690,-60
resnet34d.ra2_in1k,40.400,59.600,56.170,43.830,21.82,288,0.950,bicubic,-53.200,-42.590,+41
skresnet34.ra_in1k,40.393,59.607,56.730,43.270,22.28,224,0.875,bicubic,-52.177,-41.790,+175
efficientnet_el_pruned.in1k,40.383,59.617,56.907,43.093,10.59,300,0.904,bicubic,-53.707,-42.113,-35
efficientnet_b2_pruned.in1k,40.373,59.627,56.520,43.480,8.31,260,0.890,bicubic,-53.427,-42.340,+3
wide_resnet101_2.tv_in1k,40.363,59.637,55.790,44.210,126.89,224,0.875,bilinear,-53.357,-43.010,+16
coat_lite_mini.in1k,40.357,59.643,55.697,44.303,11.01,224,0.900,bicubic,-53.113,-43.073,+60
sebotnet33ts_256.a1h_in1k,40.353,59.647,53.223,46.777,13.70,256,0.940,bicubic,-53.957,-45.377,-80
legacy_seresnext101_32x4d.in1k,40.347,59.653,54.817,45.183,48.96,224,0.875,bilinear,-53.773,-43.973,-45
tf_efficientnet_b0.ap_in1k,40.343,59.657,56.803,43.197,5.29,224,0.875,bicubic,-52.277,-41.567,+161
densenet201.tv_in1k,40.283,59.717,56.713,43.287,20.01,224,0.875,bicubic,-52.417,-41.887,+153
mobileone_s2.apple_in1k,40.263,59.737,57.967,42.033,7.88,224,0.900,bilinear,-52.397,-40.683,+156
regnetx_160.pycls_in1k,40.253,59.747,56.050,43.950,54.28,224,0.875,bicubic,-53.657,-42.840,-19
xception65.tf_in1k,40.253,59.747,55.263,44.737,39.92,299,0.903,bicubic,-53.497,-43.607,+5
resnet101.a2_in1k,40.190,59.810,54.223,45.777,44.55,288,1.000,bicubic,-54.700,-45.047,-210
resnet50.c2_in1k,40.187,59.813,55.247,44.753,25.56,288,1.000,bicubic,-54.083,-43.573,-80
resnet50.ra_in1k,40.180,59.820,56.197,43.803,25.56,288,0.950,bicubic,-53.910,-42.763,-46
resnetblur50.bt_in1k,40.163,59.837,56.190,43.810,25.56,288,0.950,bicubic,-54.007,-42.820,-61
poolformerv2_s12.sail_in1k,40.160,59.840,57.447,42.553,11.89,224,1.000,bicubic,-52.730,-41.083,+129
mobilevitv2_200.cvnets_in1k,40.140,59.860,55.517,44.483,18.45,256,0.888,bicubic,-54.380,-43.653,-137
fastvit_t12.apple_in1k,40.127,59.873,55.243,44.757,7.55,256,0.900,bicubic,-53.363,-43.617,+42
darknetaa53.c2ns_in1k,40.120,59.880,55.787,44.213,36.02,288,1.000,bilinear,-54.090,-43.213,-70
hrnet_w48.ms_in1k,40.110,59.890,56.633,43.367,77.47,224,0.875,bilinear,-53.900,-42.297,-43
vit_base_patch16_224.sam_in1k,40.100,59.900,55.423,44.577,86.57,224,0.900,bicubic,-53.790,-43.317,-28
resnet50_gn.a1h_in1k,40.057,59.943,54.853,45.147,25.56,288,0.950,bicubic,-54.563,-44.197,-168
legacy_seresnet152.in1k,40.050,59.950,55.837,44.163,66.82,224,0.875,bilinear,-53.360,-42.993,+56
resnet50.d_in1k,40.050,59.950,54.713,45.287,25.56,288,1.000,bicubic,-54.420,-44.287,-134
hrnet_w30.ms_in1k,40.043,59.957,57.110,42.890,37.71,224,0.875,bilinear,-53.367,-41.740,+55
seresnet33ts.ra2_in1k,39.990,60.010,56.403,43.597,19.78,288,1.000,bicubic,-54.270,-42.597,-90
regnetx_080.pycls_in1k,39.990,60.010,55.950,44.050,39.57,224,0.875,bicubic,-53.800,-42.960,-18
tf_efficientnet_b1.aa_in1k,39.970,60.030,56.127,43.873,7.79,240,0.882,bicubic,-53.750,-42.683,-8
resnet101c.gluon_in1k,39.953,60.047,55.287,44.713,44.57,224,0.875,bicubic,-53.747,-43.433,-3
fastvit_s12.apple_in1k,39.927,60.073,54.840,45.160,9.47,256,0.900,bicubic,-53.773,-43.920,-3
convnext_pico_ols.d1_in1k,39.887,60.113,55.617,44.383,9.06,288,1.000,bicubic,-54.123,-43.413,-52
resmlp_12_224.fb_distilled_in1k,39.833,60.167,57.430,42.570,15.35,224,0.875,bicubic,-53.037,-41.190,+115
tf_efficientnetv2_b0.in1k,39.800,60.200,56.287,43.713,7.14,224,0.875,bicubic,-53.260,-42.413,+80
res2net50_26w_8s.in1k,39.800,60.200,54.900,45.100,48.40,224,0.875,bilinear,-53.610,-43.850,+49
darknet53.c2ns_in1k,39.737,60.263,55.287,44.713,41.61,288,1.000,bicubic,-54.623,-43.763,-121
res2net101_26w_4s.in1k,39.720,60.280,54.550,45.450,45.21,224,0.875,bilinear,-53.810,-44.020,+17
lambda_resnet50ts.a1h_in1k,39.720,60.280,54.373,45.627,21.54,256,0.950,bicubic,-54.850,-44.537,-167
regnetx_120.pycls_in1k,39.687,60.313,55.643,44.357,46.11,224,0.875,bicubic,-54.573,-43.527,-103
hrnet_w44.ms_in1k,39.687,60.313,55.330,44.670,67.06,224,0.875,bilinear,-53.933,-43.630,0
vit_small_patch32_224.augreg_in21k_ft_in1k,39.677,60.323,55.253,44.747,22.88,224,0.900,bicubic,-52.463,-43.267,+162
resmlp_big_24_224.fb_in1k,39.637,60.363,54.830,45.170,129.14,224,0.875,bicubic,-54.633,-44.120,-106
vit_small_patch16_384.augreg_in1k,39.623,60.377,54.243,45.757,22.20,384,1.000,bicubic,-54.987,-44.897,-185
mixnet_xl.ra_in1k,39.620,60.380,55.870,44.130,11.90,224,0.875,bicubic,-54.610,-43.180,-99
xception41.tf_in1k,39.607,60.393,55.013,44.987,26.97,299,0.903,bicubic,-53.873,-43.737,+19
densenet161.tv_in1k,39.603,60.397,56.130,43.870,28.68,224,0.875,bicubic,-53.287,-42.660,+98
tf_efficientnetv2_b1.in1k,39.573,60.427,55.327,44.673,8.14,240,0.882,bicubic,-54.137,-43.463,-24
dla102x.in1k,39.567,60.433,56.320,43.680,26.31,224,0.875,bilinear,-53.973,-42.530,+5
xcit_tiny_12_p16_224.fb_in1k,39.540,60.460,55.050,44.950,6.72,224,1.000,bicubic,-52.940,-43.380,+133
sehalonet33ts.ra2_in1k,39.540,60.460,54.023,45.977,13.69,256,0.940,bicubic,-54.970,-44.737,-163
convnext_pico.d1_in1k,39.517,60.483,55.317,44.683,9.05,288,0.950,bicubic,-54.513,-43.693,-77
tf_efficientnet_b2.in1k,39.513,60.487,56.107,43.893,9.11,260,0.890,bicubic,-54.197,-42.703,-30
rexnet_130.nav_in1k,39.480,60.520,56.637,43.363,7.56,224,0.875,bicubic,-54.200,-42.063,-23
hrnet_w32.ms_in1k,39.467,60.533,56.120,43.880,41.23,224,0.875,bilinear,-53.483,-42.730,+79
levit_128.fb_dist_in1k,39.447,60.553,55.350,44.650,9.21,224,0.900,bicubic,-53.583,-43.360,+62
levit_conv_128.fb_dist_in1k,39.447,60.553,55.337,44.663,9.21,224,0.900,bicubic,-53.583,-43.363,+62
resnetv2_50x1_bit.goog_in21k_ft_in1k,39.433,60.567,57.853,42.147,25.55,448,1.000,bilinear,-55.307,-41.327,-229
ecaresnet50t.a1_in1k,39.417,60.583,53.680,46.320,25.57,288,1.000,bicubic,-55.473,-45.380,-256
regnety_064.pycls_in1k,39.387,60.613,55.770,44.230,30.58,224,0.875,bicubic,-54.743,-43.260,-100
regnety_120.pycls_in1k,39.343,60.657,55.277,44.723,51.82,224,0.875,bicubic,-54.667,-43.543,-81
gcresnet33ts.ra2_in1k,39.333,60.667,55.880,44.120,19.88,288,1.000,bicubic,-55.017,-43.220,-140
mobilevitv2_150.cvnets_in1k,39.323,60.677,55.230,44.770,10.59,256,0.888,bicubic,-54.727,-43.670,-89
resnet101.tv_in1k,39.290,60.710,55.787,44.213,44.55,224,0.875,bilinear,-53.590,-42.873,+84
tf_efficientnet_el.in1k,39.287,60.713,55.377,44.623,10.59,300,0.904,bicubic,-55.063,-43.583,-145
resnet50s.gluon_in1k,39.260,60.740,55.020,44.980,25.68,224,0.875,bicubic,-54.310,-43.820,-17
resnet101.a3_in1k,39.253,60.747,53.710,46.290,44.55,224,0.950,bicubic,-54.607,-45.050,-65
regnety_160.pycls_in1k,39.250,60.750,55.447,44.553,83.59,224,0.875,bicubic,-54.880,-43.573,-107
repghostnet_200.in1k,39.243,60.757,56.423,43.577,9.80,224,0.875,bicubic,-54.307,-42.397,-16
inception_v3.tf_in1k,39.223,60.777,54.300,45.700,23.83,299,0.875,bicubic,-53.967,-44.360,+34
resnext50_32x4d.a1_in1k,39.190,60.810,53.350,46.650,25.03,288,1.000,bicubic,-55.190,-45.430,-156
densenet169.tv_in1k,39.170,60.830,55.860,44.140,14.15,224,0.875,bicubic,-53.130,-42.750,+123
tf_efficientnetv2_b2.in1k,39.167,60.833,54.563,45.437,10.10,260,0.890,bicubic,-54.893,-44.367,-101
resnext50d_32x4d.bt_in1k,39.100,60.900,54.343,45.657,25.05,288,0.950,bicubic,-55.300,-44.447,-166
legacy_seresnet101.in1k,39.043,60.957,55.003,44.997,49.33,224,0.875,bilinear,-54.227,-43.737,+22
efficientnet_b1_pruned.in1k,39.003,60.997,55.630,44.370,6.33,240,0.882,bicubic,-53.977,-43.100,+57
repvgg_b1g4.rvgg_in1k,38.977,61.023,56.350,43.650,39.97,224,0.875,bilinear,-54.053,-42.250,+40
crossvit_9_dagger_240.in1k,38.970,61.030,54.867,45.133,8.78,240,0.875,bicubic,-53.800,-43.793,+81
resnet50.a1_in1k,38.967,61.033,53.243,46.757,25.56,288,1.000,bicubic,-55.253,-45.507,-131
inception_v3.tv_in1k,38.943,61.057,53.877,46.123,23.83,299,0.875,bicubic,-53.957,-44.853,+64
regnety_080.pycls_in1k,38.910,61.090,55.190,44.810,39.18,224,0.875,bicubic,-54.970,-43.810,-84
res2net101d.in1k,38.903,61.097,53.057,46.943,45.23,224,0.875,bilinear,-55.617,-45.853,-201
legacy_seresnext50_32x4d.in1k,38.890,61.110,54.577,45.423,27.56,224,0.875,bilinear,-54.560,-44.203,-13
dla102.in1k,38.840,61.160,55.317,44.683,33.27,224,0.875,bilinear,-54.430,-43.473,+12
visformer_tiny.in1k,38.830,61.170,55.023,44.977,10.32,224,0.900,bicubic,-54.150,-43.647,+44
regnety_040.pycls_in1k,38.823,61.177,55.580,44.420,20.65,224,0.875,bicubic,-54.807,-43.330,-49
efficientvit_m5.r224_in1k,38.820,61.180,54.980,45.020,12.47,224,0.875,bicubic,-53.330,-43.540,+116
regnetx_040.pycls_in1k,38.710,61.290,55.357,44.643,22.12,224,0.875,bicubic,-54.980,-43.573,-58
res2net50_14w_8s.in1k,38.710,61.290,54.073,45.927,25.06,224,0.875,bilinear,-54.320,-44.747,+31
dpn68.mx_in1k,38.687,61.313,54.687,45.313,12.61,224,0.875,bicubic,-53.613,-43.903,+104
regnetx_032.pycls_in1k,38.680,61.320,55.160,44.840,15.30,224,0.875,bicubic,-54.560,-43.560,+8
res2net50_26w_6s.in1k,38.680,61.320,53.773,46.227,37.05,224,0.875,bilinear,-54.910,-44.857,-49
resnet33ts.ra2_in1k,38.667,61.333,55.173,44.827,19.68,288,1.000,bicubic,-55.263,-43.707,-102
wide_resnet50_2.tv_in1k,38.650,61.350,54.467,45.533,68.88,224,0.875,bilinear,-54.810,-44.353,-26
selecsls60.in1k,38.620,61.380,55.617,44.383,30.67,224,0.875,bicubic,-54.390,-42.873,+28
regnetx_016.tv2_in1k,38.620,61.380,54.733,45.267,9.19,224,0.965,bicubic,-55.400,-44.197,-117
dla60x.in1k,38.610,61.390,55.403,44.597,17.35,224,0.875,bilinear,-54.580,-43.317,+5
tf_efficientnet_b0.aa_in1k,38.597,61.403,55.970,44.030,5.29,224,0.875,bicubic,-53.803,-42.500,+90
densenetblur121d.ra_in1k,38.593,61.407,55.637,44.363,8.00,288,0.950,bicubic,-54.437,-43.063,+22
dla60_res2net.in1k,38.593,61.407,54.563,45.437,20.85,224,0.875,bilinear,-54.777,-44.277,-13
repvgg_a2.rvgg_in1k,38.553,61.447,55.767,44.233,28.21,224,0.875,bilinear,-54.127,-42.513,+64
selecsls60b.in1k,38.550,61.450,55.310,44.690,32.77,224,0.875,bicubic,-54.950,-43.470,-43
seresnet50.a1_in1k,38.537,61.463,53.413,46.587,28.09,288,1.000,bicubic,-55.623,-45.437,-145
dpn68b.mx_in1k,38.530,61.470,55.183,44.817,12.61,224,0.875,bicubic,-54.250,-43.337,+51
resnet32ts.ra2_in1k,38.510,61.490,55.507,44.493,17.96,288,1.000,bicubic,-55.080,-43.233,-61
hardcorenas_f.miil_green_in1k,38.507,61.493,55.653,44.347,8.20,224,0.875,bilinear,-54.473,-42.977,+26
hrnet_w18_small_v2.gluon_in1k,38.460,61.540,56.180,43.820,15.60,224,0.875,bicubic,-54.540,-42.580,+20
resmlp_12_224.fb_in1k,38.440,61.560,56.327,43.673,15.35,224,0.875,bicubic,-53.680,-42.243,+98
tf_efficientnet_cc_b0_4e.in1k,38.430,61.570,55.177,44.823,13.31,224,0.875,bicubic,-54.390,-43.263,+44
dla60_res2next.in1k,38.423,61.577,54.930,45.070,17.03,224,0.875,bilinear,-55.157,-44.140,-63
regnetx_064.pycls_in1k,38.420,61.580,54.993,45.007,26.21,224,0.875,bicubic,-55.220,-44.047,-76
ghostnetv2_160.in1k,38.410,61.590,55.530,44.470,12.39,224,0.875,bicubic,-54.680,-43.210,+2
resnet50.gluon_in1k,38.410,61.590,54.830,45.170,25.56,224,0.875,bicubic,-54.150,-43.720,+65
resnet50d.a1_in1k,38.403,61.597,52.860,47.140,25.58,288,1.000,bicubic,-55.997,-46.200,-204
regnety_008_tv.tv2_in1k,38.337,61.663,54.270,45.730,6.43,224,0.965,bicubic,-54.813,-44.410,-7
hrnet_w18.ms_in1k,38.263,61.737,55.660,44.340,21.30,224,0.875,bilinear,-54.507,-42.960,+41
tinynet_a.in1k,38.230,61.770,55.193,44.807,6.19,192,0.875,bicubic,-54.580,-43.367,+37
ecaresnet50t.a3_in1k,38.227,61.773,53.650,46.350,25.57,224,0.950,bicubic,-55.633,-45.200,-117
resnet34.a1_in1k,38.220,61.780,52.373,47.627,21.80,288,1.000,bicubic,-54.780,-46.257,+9
densenet121.ra_in1k,38.180,61.820,55.137,44.863,7.98,288,0.950,bicubic,-54.520,-43.503,+44
mixnet_l.ft_in1k,38.173,61.827,54.797,45.203,7.33,224,0.875,bicubic,-55.067,-43.953,-21
regnety_032.pycls_in1k,38.167,61.833,54.363,45.637,19.44,224,0.875,bicubic,-55.293,-44.587,-55
poolformer_s12.sail_in1k,38.157,61.843,56.213,43.787,11.92,224,0.900,bicubic,-54.343,-42.177,+58
hardcorenas_e.miil_green_in1k,38.150,61.850,55.177,44.823,8.07,224,0.875,bilinear,-54.790,-43.333,+12
efficientnet_b1.ft_in1k,38.080,61.920,54.000,46.000,7.79,256,1.000,bicubic,-54.940,-44.710,-2
ecaresnet50t.a2_in1k,38.053,61.947,52.967,47.033,25.57,288,1.000,bicubic,-56.517,-46.073,-257
gmixer_24_224.ra3_in1k,38.050,61.950,52.077,47.923,24.72,224,0.875,bicubic,-54.630,-46.453,+39
vit_base_patch16_224.augreg_in1k,38.037,61.963,50.710,49.290,86.57,224,0.900,bicubic,-55.313,-48.050,-37
coat_lite_tiny.in1k,38.033,61.967,53.463,46.537,5.72,224,0.900,bicubic,-54.827,-45.177,+22
resnetrs50.tf_in1k,37.977,62.023,53.303,46.697,35.69,224,0.910,bicubic,-56.053,-45.437,-154
resnext50_32x4d.a2_in1k,37.933,62.067,52.353,47.647,25.03,288,1.000,bicubic,-56.287,-46.687,-182
mobilevitv2_125.cvnets_in1k,37.893,62.107,54.067,45.933,7.48,256,0.888,bicubic,-55.587,-44.773,-68
resnet50c.gluon_in1k,37.873,62.127,54.117,45.883,25.58,224,0.875,bicubic,-55.037,-44.573,+7
hardcorenas_c.miil_green_in1k,37.860,62.140,55.727,44.273,5.52,224,0.875,bilinear,-54.490,-42.613,+58
efficientformerv2_s0.snap_dist_in1k,37.823,62.177,54.053,45.947,3.60,224,0.950,bicubic,-54.037,-44.317,+82
res2net50_26w_4s.in1k,37.810,62.190,53.080,46.920,25.70,224,0.875,bilinear,-55.370,-45.580,-31
efficientnet_es.ra_in1k,37.787,62.213,54.960,45.040,5.44,224,0.875,bicubic,-55.123,-43.720,+4
resnest14d.gluon_in1k,37.777,62.223,56.480,43.520,10.61,224,0.875,bilinear,-53.373,-41.850,+109
convnext_femto.d1_in1k,37.733,62.267,54.100,45.900,5.22,288,0.950,bicubic,-55.707,-44.720,-65
resnext50_32x4d.tv_in1k,37.727,62.273,54.103,45.897,25.03,224,0.875,bilinear,-55.173,-44.217,+3
pit_ti_distilled_224.in1k,37.710,62.290,55.647,44.353,5.10,224,0.900,bicubic,-53.020,-42.603,+121
ecaresnet26t.ra2_in1k,37.667,62.333,54.357,45.643,16.01,320,0.950,bicubic,-56.273,-44.673,-153
seresnet50.a2_in1k,37.667,62.333,52.373,47.627,28.09,288,1.000,bicubic,-56.783,-46.517,-247
vit_base_patch32_224.augreg_in1k,37.563,62.437,51.810,48.190,88.22,224,0.900,bicubic,-53.027,-45.910,+122
fastvit_t8.apple_dist_in1k,37.557,62.443,53.830,46.170,4.03,256,0.900,bicubic,-55.033,-44.710,+30
hardcorenas_d.miil_green_in1k,37.543,62.457,54.720,45.280,7.50,224,0.875,bilinear,-55.067,-43.790,+26
res2next50.in1k,37.483,62.517,52.853,47.147,24.67,224,0.875,bilinear,-55.667,-45.787,-38
convnextv2_femto.fcmae_ft_in1k,37.477,62.523,53.507,46.493,5.23,288,0.950,bicubic,-56.273,-45.463,-131
resnet50.bt_in1k,37.420,62.580,53.860,46.140,25.56,288,0.950,bicubic,-56.540,-45.070,-162
hrnet_w18.ms_aug_in1k,37.337,62.663,54.123,45.877,21.30,224,0.950,bilinear,-56.143,-44.857,-87
lambda_resnet26t.c1_in1k,37.300,62.700,53.563,46.437,10.96,256,0.940,bicubic,-56.130,-45.167,-73
convnext_femto_ols.d1_in1k,37.253,62.747,53.047,46.953,5.23,288,0.950,bicubic,-56.137,-45.863,-67
mobilenetv3_large_100.miil_in21k_ft_in1k,37.243,62.757,53.543,46.457,5.48,224,0.875,bilinear,-55.017,-44.697,+47
hardcorenas_b.miil_green_in1k,37.233,62.767,55.033,44.967,5.18,224,0.875,bilinear,-54.687,-43.377,+60
resnet50d.a2_in1k,37.227,62.773,51.743,48.257,25.58,288,1.000,bicubic,-57.233,-47.287,-261
res2net50d.in1k,37.223,62.777,51.387,48.613,25.72,224,0.875,bilinear,-57.057,-47.703,-223
fastvit_t8.apple_in1k,37.217,62.783,53.133,46.867,4.03,256,0.900,bicubic,-54.713,-45.247,+56
eca_halonext26ts.c1_in1k,37.170,62.830,53.103,46.897,10.76,256,0.940,bicubic,-56.380,-45.497,-106
cs3darknet_focus_m.c2ns_in1k,37.140,62.860,53.910,46.090,9.30,288,0.950,bicubic,-55.960,-44.840,-46
res2net50_48w_2s.in1k,37.120,62.880,53.327,46.673,25.29,224,0.875,bilinear,-55.660,-45.143,-4
lambda_resnet26rpt_256.c1_in1k,37.103,62.897,53.860,46.140,10.99,256,0.940,bicubic,-56.327,-45.020,-84
vit_small_patch16_224.augreg_in1k,37.080,62.920,51.553,48.447,22.05,224,0.900,bicubic,-56.370,-47.227,-90
rexnet_100.nav_in1k,37.057,62.943,54.050,45.950,4.80,224,0.875,bicubic,-55.773,-44.550,-12
bat_resnext26ts.ch_in1k,37.057,62.943,53.743,46.257,10.73,256,0.900,bicubic,-56.063,-44.987,-51
resnet50.a2_in1k,37.050,62.950,51.347,48.653,25.56,288,1.000,bicubic,-57.070,-47.623,-201
dla60.in1k,37.040,62.960,54.220,45.780,22.04,224,0.875,bilinear,-55.610,-44.410,+3
tf_efficientnet_b1.in1k,37.017,62.983,53.417,46.583,7.79,240,0.882,bicubic,-55.913,-45.243,-29
regnety_016.pycls_in1k,37.013,62.987,54.083,45.917,11.20,224,0.875,bicubic,-55.997,-44.587,-43
botnet26t_256.c1_in1k,36.963,63.037,53.053,46.947,12.49,256,0.950,bicubic,-56.477,-45.857,-93
tf_mixnet_l.in1k,36.960,63.040,52.597,47.403,7.33,224,0.875,bicubic,-56.070,-45.933,-48
resnet34.a2_in1k,36.953,63.047,51.433,48.567,21.80,288,1.000,bicubic,-55.617,-47.137,+5
mobileone_s1.apple_in1k,36.930,63.070,54.603,45.397,4.83,224,0.900,bilinear,-54.860,-43.857,+46
ghostnetv2_130.in1k,36.887,63.113,54.137,45.863,8.96,224,0.875,bicubic,-55.353,-44.243,+28
legacy_seresnet50.in1k,36.863,63.137,53.473,46.527,28.09,224,0.875,bilinear,-55.797,-45.207,-6
halonet26t.a1h_in1k,36.843,63.157,52.270,47.730,12.48,256,0.950,bicubic,-56.747,-46.470,-130
densenet121.tv_in1k,36.807,63.193,54.030,45.970,7.98,224,0.875,bicubic,-54.593,-44.220,+60
tf_efficientnet_lite2.in1k,36.800,63.200,53.313,46.687,6.09,260,0.890,bicubic,-55.790,-45.117,-3
mobilenetv2_120d.ra_in1k,36.790,63.210,54.047,45.953,5.83,224,0.875,bicubic,-55.820,-44.383,-7
tf_efficientnet_lite1.in1k,36.730,63.270,53.583,46.417,5.42,240,0.882,bicubic,-55.560,-44.917,+17
regnetx_016.pycls_in1k,36.693,63.307,53.307,46.693,9.19,224,0.875,bicubic,-55.837,-45.243,0
eca_botnext26ts_256.c1_in1k,36.690,63.310,52.493,47.507,10.59,256,0.950,bicubic,-56.660,-46.197,-92
hardcorenas_a.miil_green_in1k,36.687,63.313,54.923,45.077,5.26,224,0.875,bilinear,-54.933,-43.357,+45
levit_conv_128s.fb_dist_in1k,36.623,63.377,53.137,46.863,7.78,224,0.900,bicubic,-54.877,-45.263,+48
levit_128s.fb_dist_in1k,36.623,63.377,53.130,46.870,7.78,224,0.900,bicubic,-54.887,-45.270,+48
repghostnet_150.in1k,36.617,63.383,54.110,45.890,6.58,224,0.875,bicubic,-55.763,-44.420,+4
efficientnet_b0.ra_in1k,36.593,63.407,53.477,46.523,5.29,224,0.875,bicubic,-55.887,-45.203,-3
efficientvit_m4.r224_in1k,36.587,63.413,53.267,46.733,8.80,224,0.875,bicubic,-54.153,-44.773,+74
resnext50_32x4d.a3_in1k,36.553,63.447,51.143,48.857,25.03,224,0.950,bicubic,-56.997,-47.537,-137
vit_base_patch32_224.sam_in1k,36.543,63.457,53.043,46.957,88.22,224,0.900,bicubic,-53.327,-44.557,+94
xcit_nano_12_p8_224.fb_dist_in1k,36.510,63.490,52.873,47.127,3.05,224,1.000,bicubic,-55.920,-45.667,-3
cs3darknet_m.c2ns_in1k,36.480,63.520,53.230,46.770,9.31,288,0.950,bicubic,-56.800,-45.490,-96
repvgg_a1.rvgg_in1k,36.450,63.550,53.770,46.230,14.09,224,0.875,bilinear,-54.670,-44.390,+57
tf_efficientnet_em.in1k,36.393,63.607,52.853,47.147,6.90,240,0.882,bicubic,-56.797,-45.637,-90
mobilevitv2_100.cvnets_in1k,36.387,63.613,53.083,46.917,4.90,256,0.888,bicubic,-56.753,-45.677,-84
resnet50d.a3_in1k,36.330,63.670,51.320,48.680,25.58,224,0.950,bicubic,-57.080,-47.370,-112
skresnet18.ra_in1k,36.313,63.687,54.187,45.813,11.96,224,0.875,bicubic,-53.867,-43.593,+83
repvgg_b0.rvgg_in1k,36.290,63.710,54.057,45.943,15.82,224,0.875,bilinear,-55.390,-44.393,+25
resnet34.bt_in1k,36.257,63.743,52.757,47.243,21.80,288,0.950,bicubic,-56.023,-45.843,-1
resnet50.tv_in1k,36.180,63.820,52.793,47.207,25.56,224,0.875,bilinear,-55.940,-45.617,+8
xcit_nano_12_p16_384.fb_dist_in1k,36.150,63.850,53.260,46.740,3.05,384,1.000,bicubic,-55.980,-45.250,+4
legacy_seresnet34.in1k,36.150,63.850,52.567,47.433,21.96,224,0.875,bilinear,-55.340,-45.633,+33
coat_tiny.in1k,36.117,63.883,51.047,48.953,5.50,224,0.900,bicubic,-57.393,-47.633,-144
efficientvit_m3.r224_in1k,36.090,63.910,52.453,47.547,6.90,224,0.875,bicubic,-53.910,-45.377,+77
resnet34.tv_in1k,36.070,63.930,53.537,46.463,21.80,224,0.875,bilinear,-54.230,-44.433,+70
deit_tiny_distilled_patch16_224.fb_in1k,36.023,63.977,54.243,45.757,5.91,224,0.900,bicubic,-55.057,-44.027,+50
mobilenetv2_140.ra_in1k,35.997,64.003,53.963,46.037,6.11,224,0.875,bicubic,-56.053,-44.287,+4
convnextv2_atto.fcmae_ft_in1k,35.990,64.010,51.187,48.813,3.71,288,0.950,bicubic,-56.930,-47.373,-68
resnet50.a3_in1k,35.957,64.043,50.483,49.517,25.56,224,0.950,bicubic,-56.983,-48.187,-71
tf_efficientnet_lite0.in1k,35.937,64.063,53.477,46.523,4.65,224,0.875,bicubic,-55.343,-44.363,+29
selecsls42b.in1k,35.813,64.187,52.473,47.527,32.46,224,0.875,bicubic,-56.667,-46.157,-25
xcit_nano_12_p8_384.fb_dist_in1k,35.797,64.203,52.300,47.700,3.05,384,1.000,bicubic,-57.503,-46.560,-118
seresnext26ts.ch_in1k,35.780,64.220,53.460,46.540,10.39,288,1.000,bicubic,-57.160,-45.120,-78
resnet34.gluon_in1k,35.780,64.220,52.183,47.817,21.80,224,0.875,bicubic,-55.310,-45.997,+42
convnext_atto.d2_in1k,35.777,64.223,52.323,47.677,3.70,288,0.950,bicubic,-56.993,-46.167,-56
seresnet50.a3_in1k,35.757,64.243,51.163,48.837,28.09,224,0.950,bicubic,-56.603,-47.167,-24
resnet26t.ra2_in1k,35.727,64.273,53.587,46.413,16.01,320,1.000,bicubic,-57.183,-45.113,-74
dla34.in1k,35.657,64.343,52.803,47.197,15.74,224,0.875,bilinear,-55.563,-45.367,+24
efficientnet_lite0.ra_in1k,35.640,64.360,53.663,46.337,4.65,224,0.875,bicubic,-55.610,-44.577,+22
mixnet_m.ft_in1k,35.633,64.367,52.423,47.577,5.01,224,0.875,bicubic,-56.637,-45.937,-20
resnet18.fb_ssl_yfcc100m_ft_in1k,35.613,64.387,53.753,46.247,11.69,224,0.875,bilinear,-55.087,-44.267,+43
mobilenetv3_rw.rmsp_in1k,35.543,64.457,53.707,46.293,5.48,224,0.875,bicubic,-56.007,-44.573,+9
regnetx_008.tv2_in1k,35.527,64.473,51.480,48.520,7.26,224,0.965,bicubic,-56.963,-46.950,-40
convnext_atto_ols.a2_in1k,35.403,64.597,51.390,48.610,3.70,288,0.950,bicubic,-57.577,-47.150,-93
efficientnet_es_pruned.in1k,35.393,64.607,52.840,47.160,5.44,224,0.875,bicubic,-56.307,-45.570,-3
mobilenetv2_110d.ra_in1k,35.313,64.687,52.870,47.130,4.52,224,0.875,bicubic,-56.017,-45.320,+12
repghostnet_130.in1k,35.263,64.737,52.567,47.433,5.48,224,0.875,bicubic,-56.677,-45.823,-14
tf_mixnet_m.in1k,35.193,64.807,50.987,49.013,5.01,224,0.875,bicubic,-57.017,-47.433,-25
hrnet_w18_small_v2.ms_in1k,35.163,64.837,52.420,47.580,15.60,224,0.875,bilinear,-55.997,-45.920,+17
xcit_nano_12_p16_224.fb_dist_in1k,35.120,64.880,52.543,47.457,3.05,224,1.000,bicubic,-55.070,-45.207,+49
resnet34.a3_in1k,35.033,64.967,50.503,49.497,21.80,224,0.950,bicubic,-55.207,-47.377,+45
convit_tiny.fb_in1k,35.027,64.973,51.777,48.223,5.71,224,0.875,bicubic,-55.523,-46.413,+36
gcresnext26ts.ch_in1k,35.017,64.983,51.493,48.507,10.48,288,1.000,bicubic,-57.723,-47.117,-70
eca_resnext26ts.ch_in1k,34.927,65.073,52.370,47.630,10.30,288,1.000,bicubic,-57.823,-46.340,-72
regnety_004.tv2_in1k,34.923,65.077,51.263,48.737,4.34,224,0.965,bicubic,-56.697,-46.907,-8
tinynet_b.in1k,34.867,65.133,51.997,48.003,3.73,188,0.875,bicubic,-56.243,-46.063,+15
resnext26ts.ra2_in1k,34.837,65.163,52.710,47.290,10.30,288,1.000,bicubic,-57.543,-45.680,-46
regnety_008.pycls_in1k,34.780,65.220,51.743,48.257,6.26,224,0.875,bicubic,-57.120,-46.667,-21
pit_ti_224.in1k,34.670,65.330,52.160,47.840,4.85,224,0.900,bicubic,-55.750,-45.850,+33
tf_efficientnet_b0.in1k,34.623,65.377,51.143,48.857,5.29,224,0.875,bicubic,-57.657,-47.407,-41
crossvit_9_240.in1k,34.613,65.387,51.767,48.233,8.55,240,0.875,bicubic,-56.437,-46.553,+16
mobilenetv3_large_100.ra_in1k,34.600,65.400,52.843,47.157,5.48,224,0.875,bicubic,-56.870,-45.477,-7
mixer_b16_224.goog_in21k_ft_in1k,34.427,65.573,48.107,51.893,59.88,224,0.875,bicubic,-56.713,-49.293,+6
pvt_v2_b0.in1k,34.400,65.600,53.103,46.897,3.67,224,0.900,bicubic,-54.570,-44.587,+55
tf_efficientnet_es.in1k,34.260,65.740,51.353,48.647,5.44,224,0.875,bicubic,-57.850,-47.077,-35
fbnetc_100.rmsp_in1k,34.243,65.757,51.190,48.810,5.57,224,0.875,bilinear,-57.037,-46.890,-6
resnet18d.ra2_in1k,34.213,65.787,51.763,48.237,11.71,288,0.950,bicubic,-56.587,-46.397,+12
regnety_006.pycls_in1k,34.153,65.847,51.253,48.747,6.06,224,0.875,bicubic,-57.407,-47.177,-18
repvgg_a0.rvgg_in1k,34.070,65.930,51.967,48.033,9.11,224,0.875,bilinear,-55.610,-45.793,+37
ghostnetv2_100.in1k,34.037,65.963,51.973,48.027,6.16,224,0.875,bicubic,-57.593,-46.317,-24
resnet18.a1_in1k,33.973,66.027,49.410,50.590,11.69,288,1.000,bicubic,-56.227,-48.350,+27
tf_mobilenetv3_large_100.in1k,33.963,66.037,51.470,48.530,5.48,224,0.875,bilinear,-57.457,-46.790,-16
regnetx_008.pycls_in1k,33.803,66.197,50.543,49.457,7.26,224,0.875,bicubic,-57.367,-47.827,-8
repghostnet_111.in1k,33.790,66.210,51.543,48.457,4.54,224,0.875,bicubic,-57.310,-46.687,0
mnasnet_100.rmsp_in1k,33.777,66.223,51.150,48.850,4.38,224,0.875,bicubic,-57.433,-47.070,-11
semnasnet_075.rmsp_in1k,33.773,66.227,52.403,47.597,2.91,224,0.875,bicubic,-56.447,-45.547,+21
lcnet_100.ra2_in1k,33.767,66.233,52.090,47.910,2.95,224,0.875,bicubic,-55.143,-45.290,+45
ese_vovnet19b_dw.ra_in1k,33.753,66.247,50.920,49.080,6.54,288,0.950,bicubic,-59.007,-47.730,-97
regnetx_004_tv.tv2_in1k,33.717,66.283,49.803,50.197,5.50,224,0.965,bicubic,-57.443,-48.297,-12
vit_tiny_r_s16_p8_384.augreg_in21k_ft_in1k,33.633,66.367,50.687,49.313,6.36,384,1.000,bicubic,-58.097,-47.743,-40
mobilevit_s.cvnets_in1k,33.633,66.367,49.287,50.713,5.58,256,0.900,bicubic,-59.517,-49.493,-152
xcit_nano_12_p8_224.fb_in1k,33.580,66.420,50.233,49.767,3.05,224,1.000,bicubic,-57.520,-47.817,-10
vit_tiny_patch16_384.augreg_in21k_ft_in1k,33.550,66.450,51.077,48.923,5.79,384,1.000,bicubic,-59.870,-47.733,-183
semnasnet_100.rmsp_in1k,33.507,66.493,50.803,49.197,3.89,224,0.875,bicubic,-58.163,-47.487,-39
spnasnet_100.rmsp_in1k,33.490,66.510,51.287,48.713,4.42,224,0.875,bilinear,-57.100,-46.663,+1
mixnet_s.ft_in1k,33.467,66.533,51.007,48.993,4.13,224,0.875,bicubic,-58.313,-47.293,-46
crossvit_tiny_240.in1k,33.347,66.653,49.900,50.100,7.01,240,0.875,bicubic,-57.183,-48.050,+3
mobilevitv2_075.cvnets_in1k,33.340,66.660,50.110,49.890,2.87,256,0.888,bicubic,-58.630,-48.190,-56
efficientvit_m2.r224_in1k,33.293,66.707,49.810,50.190,4.19,224,0.875,bicubic,-55.617,-47.580,+32
vgg19_bn.tv_in1k,33.240,66.760,50.777,49.223,143.68,224,0.875,bilinear,-57.750,-47.323,-12
regnetx_006.pycls_in1k,33.147,66.853,50.243,49.757,6.20,224,0.875,bicubic,-57.643,-47.847,-11
repghostnet_100.in1k,33.140,66.860,50.770,49.230,4.07,224,0.875,bicubic,-57.550,-47.350,-7
edgenext_x_small.in1k,33.110,66.890,49.003,50.997,2.34,288,1.000,bicubic,-58.470,-49.187,-44
seresnext26t_32x4d.bt_in1k,33.083,66.917,50.270,49.730,16.81,288,0.950,bicubic,-60.267,-48.390,-183
resnet18.tv_in1k,33.057,66.943,51.173,48.827,11.69,224,0.875,bilinear,-55.093,-45.947,+34
seresnext26d_32x4d.bt_in1k,32.973,67.027,49.823,50.177,16.81,288,0.950,bicubic,-60.087,-48.887,-160
xcit_nano_12_p16_224.fb_in1k,32.953,67.047,49.977,50.023,3.05,224,1.000,bicubic,-56.017,-47.433,+23
mobileone_s0.apple_in1k,32.853,67.147,51.063,48.937,5.29,224,0.875,bilinear,-55.957,-46.157,+25
hrnet_w18_small.gluon_in1k,32.827,67.173,50.380,49.620,13.19,224,0.875,bicubic,-57.483,-47.370,-5
legacy_seresnext26_32x4d.in1k,32.773,67.227,49.217,50.783,16.79,224,0.875,bicubic,-59.827,-49.193,-106
hrnet_w18_small.ms_in1k,32.653,67.347,50.587,49.413,13.19,224,0.875,bilinear,-57.217,-47.303,+1
deit_tiny_patch16_224.fb_in1k,32.650,67.350,50.263,49.737,5.72,224,0.900,bicubic,-56.960,-47.697,+6
legacy_seresnet18.in1k,32.600,67.400,50.317,49.683,11.78,224,0.875,bicubic,-56.660,-47.373,+11
ghostnet_100.in1k,32.550,67.450,50.410,49.590,5.18,224,0.875,bicubic,-57.910,-47.500,-13
mobilenetv2_100.ra_in1k,32.513,67.487,50.790,49.210,3.50,224,0.875,bicubic,-57.357,-47.040,-2
regnetx_004.pycls_in1k,32.493,67.507,49.323,50.677,5.16,224,0.875,bicubic,-56.977,-48.437,+3
resnet26d.bt_in1k,32.420,67.580,49.997,50.003,16.01,288,0.950,bicubic,-60.130,-48.653,-107
resnet18.gluon_in1k,32.417,67.583,49.737,50.263,11.69,224,0.875,bicubic,-56.243,-47.363,+16
regnety_004.pycls_in1k,32.313,67.687,49.463,50.537,4.34,224,0.875,bicubic,-58.457,-48.607,-28
resnet18.a2_in1k,32.223,67.777,47.890,52.110,11.69,288,1.000,bicubic,-57.247,-49.520,0
tf_mixnet_s.in1k,32.200,67.800,48.523,51.477,4.13,224,0.875,bicubic,-59.480,-49.717,-67
resnet26.bt_in1k,32.173,67.827,49.443,50.557,16.00,288,0.950,bicubic,-59.937,-49.107,-83
vit_tiny_patch16_224.augreg_in21k_ft_in1k,32.020,67.980,49.017,50.983,5.72,224,0.900,bicubic,-59.900,-49.323,-77
tf_mobilenetv3_large_075.in1k,31.847,68.153,49.130,50.870,3.99,224,0.875,bilinear,-58.483,-48.740,-21
tf_mobilenetv3_large_minimal_100.in1k,31.600,68.400,49.353,50.647,3.92,224,0.875,bilinear,-57.560,-47.957,+2
efficientvit_m1.r224_in1k,31.297,68.703,48.197,51.803,2.98,224,0.875,bicubic,-55.923,-48.823,+18
repghostnet_080.in1k,30.873,69.127,48.797,51.203,3.28,224,0.875,bicubic,-58.597,-48.833,-6
vit_tiny_r_s16_p8_224.augreg_in21k_ft_in1k,30.790,69.210,47.663,52.337,6.34,224,0.900,bicubic,-58.550,-50.037,-5
tinynet_c.in1k,30.517,69.483,48.477,51.523,2.46,184,0.875,bicubic,-57.883,-48.783,+6
lcnet_075.ra2_in1k,30.373,69.627,48.770,51.230,2.36,224,0.875,bicubic,-56.587,-47.780,+19
vgg16_bn.tv_in1k,30.357,69.643,47.280,52.720,138.37,224,0.875,bilinear,-60.183,-50.710,-32
resnet18.a3_in1k,30.093,69.907,46.333,53.667,11.69,224,0.950,bicubic,-56.977,-50.327,+15
efficientvit_b0.r224_in1k,30.060,69.940,46.603,53.397,3.41,224,0.950,bicubic,-58.240,-50.277,+4
edgenext_xx_small.in1k,29.750,70.250,46.487,53.513,1.33,288,1.000,bicubic,-60.050,-51.013,-19
regnety_002.pycls_in1k,29.707,70.293,46.827,53.173,3.16,224,0.875,bicubic,-58.463,-50.613,+3
mobilevit_xs.cvnets_in1k,29.587,70.413,46.037,53.963,2.32,256,0.900,bicubic,-61.623,-52.013,-63
mobilenetv3_small_100.lamb_in1k,29.050,70.950,47.200,52.800,2.54,224,0.875,bicubic,-57.130,-49.250,+14
mnasnet_small.lamb_in1k,28.953,71.047,47.287,52.713,2.03,224,0.875,bicubic,-56.527,-48.713,+15
vgg13_bn.tv_in1k,28.863,71.137,46.730,53.270,133.05,224,0.875,bilinear,-60.337,-50.800,-13
resnet10t.c3_in1k,28.857,71.143,46.903,53.097,5.44,224,0.950,bicubic,-57.823,-49.827,+10
regnetx_002.pycls_in1k,28.837,71.163,45.453,54.547,2.68,224,0.875,bicubic,-58.533,-51.547,+1
efficientvit_m0.r224_in1k,28.797,71.203,45.777,54.223,2.35,224,0.875,bicubic,-54.423,-49.923,+19
mobilenetv2_050.lamb_in1k,28.657,71.343,46.597,53.403,1.97,224,0.875,bicubic,-56.333,-49.023,+14
vgg19.tv_in1k,28.580,71.420,45.167,54.833,143.67,224,0.875,bilinear,-61.100,-52.383,-27
mobilevitv2_050.cvnets_in1k,28.560,71.440,45.203,54.797,1.37,256,0.888,bicubic,-60.470,-52.397,-17
dla60x_c.in1k,28.443,71.557,46.220,53.780,1.32,224,0.875,bilinear,-58.667,-50.920,0
vgg11_bn.tv_in1k,28.430,71.570,46.460,53.540,132.87,224,0.875,bilinear,-59.970,-50.790,-11
repghostnet_058.in1k,28.400,71.600,46.587,53.413,2.55,224,0.875,bicubic,-58.750,-50.193,-3
tinynet_d.in1k,27.960,72.040,45.877,54.123,2.34,152,0.875,bicubic,-57.480,-50.143,+7
vgg16.tv_in1k,27.877,72.123,44.680,55.320,138.36,224,0.875,bilinear,-61.493,-52.840,-28
resnet14t.c3_in1k,27.553,72.447,44.683,55.317,10.08,224,0.950,bicubic,-61.697,-52.757,-26
tf_mobilenetv3_small_100.in1k,27.293,72.707,44.410,55.590,2.54,224,0.875,bilinear,-58.657,-51.990,0
repghostnet_050.in1k,27.060,72.940,44.977,55.023,2.31,224,0.875,bicubic,-58.390,-51.173,+1
mixer_l16_224.goog_in21k_ft_in1k,26.860,73.140,37.913,62.087,208.20,224,0.875,bicubic,-60.130,-56.157,-6
vgg11.tv_in1k,26.537,73.463,43.470,56.530,132.86,224,0.875,bilinear,-60.813,-53.630,-12
mobilenetv3_small_075.lamb_in1k,26.533,73.467,43.877,56.123,2.04,224,0.875,bicubic,-57.587,-51.643,+4
mobilevit_xxs.cvnets_in1k,26.353,73.647,43.060,56.940,1.27,256,0.900,bicubic,-61.587,-54.130,-17
vgg13.tv_in1k,26.273,73.727,43.353,56.647,133.05,224,0.875,bilinear,-61.297,-53.757,-17
dla46x_c.in1k,26.207,73.793,43.777,56.223,1.07,224,0.875,bilinear,-59.233,-52.643,-4
lcnet_050.ra2_in1k,26.197,73.803,44.577,55.423,1.88,224,0.875,bicubic,-56.843,-50.443,+2
tf_mobilenetv3_small_075.in1k,26.197,73.803,43.617,56.383,2.04,224,0.875,bilinear,-58.303,-52.263,-2
dla46_c.in1k,25.507,74.493,43.777,56.223,1.30,224,0.875,bilinear,-59.193,-52.433,-4
tf_mobilenetv3_small_minimal_100.in1k,25.123,74.877,42.950,57.050,2.04,224,0.875,bilinear,-57.537,-52.080,0
tinynet_e.in1k,23.353,76.647,41.067,58.933,2.04,106,0.875,bicubic,-56.457,-52.903,0
mobilenetv3_small_050.lamb_in1k,21.737,78.263,38.757,61.243,1.59,224,0.875,bicubic,-56.343,-54.253,0
hf_public_repos/pytorch-image-models/results/results-imagenet-r-clean.csv
model,top1,top1_err,top5,top5_err,param_count,img_size,crop_pct,interpolation
eva02_large_patch14_448.mim_in22k_ft_in22k_in1k,98.150,1.850,99.880,0.120,305.08,448,1.000,bicubic
eva02_large_patch14_448.mim_m38m_ft_in22k_in1k,98.030,1.970,99.890,0.110,305.08,448,1.000,bicubic
eva_giant_patch14_560.m30m_ft_in22k_in1k,98.000,2.000,99.860,0.140,"1,014.45",560,1.000,bicubic
eva_giant_patch14_336.m30m_ft_in22k_in1k,97.990,2.010,99.900,0.100,"1,013.01",336,1.000,bicubic
convnextv2_huge.fcmae_ft_in22k_in1k_384,97.870,2.130,99.910,0.090,660.29,384,1.000,bicubic
eva_large_patch14_336.in22k_ft_in22k_in1k,97.860,2.140,99.880,0.120,304.53,336,1.000,bicubic
eva02_large_patch14_448.mim_in22k_ft_in1k,97.860,2.140,99.800,0.200,305.08,448,1.000,bicubic
eva_giant_patch14_336.clip_ft_in1k,97.860,2.140,99.790,0.210,"1,013.01",336,1.000,bicubic
eva02_large_patch14_448.mim_m38m_ft_in1k,97.830,2.170,99.820,0.180,305.08,448,1.000,bicubic
convnextv2_huge.fcmae_ft_in22k_in1k_512,97.810,2.190,99.860,0.140,660.29,512,1.000,bicubic
eva_large_patch14_336.in22k_ft_in1k,97.810,2.190,99.840,0.160,304.53,336,1.000,bicubic
beit_large_patch16_384.in22k_ft_in22k_in1k,97.810,2.190,99.790,0.210,305.00,384,1.000,bicubic
tf_efficientnet_l2.ns_jft_in1k,97.780,2.220,99.890,0.110,480.31,800,0.960,bicubic
regnety_1280.swag_ft_in1k,97.780,2.220,99.860,0.140,644.81,384,1.000,bicubic
beit_large_patch16_512.in22k_ft_in22k_in1k,97.780,2.220,99.820,0.180,305.67,512,1.000,bicubic
maxvit_base_tf_512.in21k_ft_in1k,97.760,2.240,99.860,0.140,119.88,512,1.000,bicubic
maxvit_xlarge_tf_512.in21k_ft_in1k,97.760,2.240,99.820,0.180,475.77,512,1.000,bicubic
tf_efficientnet_l2.ns_jft_in1k_475,97.750,2.250,99.820,0.180,480.31,475,0.936,bicubic
convnext_xxlarge.clip_laion2b_soup_ft_in1k,97.750,2.250,99.810,0.190,846.47,256,1.000,bicubic
beitv2_large_patch16_224.in1k_ft_in22k_in1k,97.750,2.250,99.790,0.210,304.43,224,0.950,bicubic
maxvit_xlarge_tf_384.in21k_ft_in1k,97.740,2.260,99.850,0.150,475.32,384,1.000,bicubic
eva02_base_patch14_448.mim_in22k_ft_in1k,97.720,2.280,99.760,0.240,87.12,448,1.000,bicubic
maxvit_large_tf_512.in21k_ft_in1k,97.670,2.330,99.730,0.270,212.33,512,1.000,bicubic
caformer_b36.sail_in22k_ft_in1k_384,97.660,2.340,99.860,0.140,98.75,384,1.000,bicubic
maxvit_large_tf_384.in21k_ft_in1k,97.660,2.340,99.820,0.180,212.03,384,1.000,bicubic
convnextv2_large.fcmae_ft_in22k_in1k_384,97.630,2.370,99.800,0.200,197.96,384,1.000,bicubic
eva02_base_patch14_448.mim_in22k_ft_in22k_in1k,97.610,2.390,99.820,0.180,87.12,448,1.000,bicubic
eva_large_patch14_196.in22k_ft_in22k_in1k,97.610,2.390,99.810,0.190,304.14,196,1.000,bicubic
vit_huge_patch14_clip_336.laion2b_ft_in12k_in1k,97.610,2.390,99.780,0.220,632.46,336,1.000,bicubic
vit_large_patch14_clip_224.openai_ft_in12k_in1k,97.610,2.390,99.730,0.270,304.20,224,1.000,bicubic
vit_large_patch14_clip_336.openai_ft_in12k_in1k,97.600,2.400,99.730,0.270,304.53,336,1.000,bicubic
convnext_xlarge.fb_in22k_ft_in1k_384,97.590,2.410,99.770,0.230,350.20,384,1.000,bicubic
deit3_large_patch16_384.fb_in22k_ft_in1k,97.580,2.420,99.710,0.290,304.76,384,1.000,bicubic
eva_giant_patch14_224.clip_ft_in1k,97.570,2.430,99.710,0.290,"1,012.56",224,0.900,bicubic
maxvit_base_tf_384.in21k_ft_in1k,97.560,2.440,99.760,0.240,119.65,384,1.000,bicubic
eva_large_patch14_196.in22k_ft_in1k,97.520,2.480,99.790,0.210,304.14,196,1.000,bicubic
convformer_b36.sail_in22k_ft_in1k_384,97.490,2.510,99.760,0.240,99.88,384,1.000,bicubic
beit_large_patch16_224.in22k_ft_in22k_in1k,97.480,2.520,99.690,0.310,304.43,224,0.900,bicubic
convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_384,97.470,2.530,99.760,0.240,200.13,384,1.000,bicubic
vit_large_patch14_clip_336.laion2b_ft_in12k_in1k,97.460,2.540,99.780,0.220,304.53,336,1.000,bicubic
convnext_xlarge.fb_in22k_ft_in1k,97.450,2.550,99.820,0.180,350.20,288,1.000,bicubic
maxxvitv2_rmlp_base_rw_384.sw_in12k_ft_in1k,97.450,2.550,99.760,0.240,116.09,384,1.000,bicubic
vit_large_patch14_clip_224.openai_ft_in1k,97.440,2.560,99.680,0.320,304.20,224,1.000,bicubic
vit_large_patch16_384.augreg_in21k_ft_in1k,97.410,2.590,99.780,0.220,304.72,384,1.000,bicubic
vit_large_patch14_clip_224.laion2b_ft_in12k_in1k,97.390,2.610,99.740,0.260,304.20,224,1.000,bicubic
convnext_large_mlp.clip_laion2b_augreg_ft_in1k_384,97.390,2.610,99.730,0.270,200.13,384,1.000,bicubic
regnety_1280.swag_lc_in1k,97.390,2.610,99.730,0.270,644.81,224,0.965,bicubic
regnety_320.swag_ft_in1k,97.380,2.620,99.760,0.240,145.05,384,1.000,bicubic
convnextv2_base.fcmae_ft_in22k_in1k_384,97.380,2.620,99.720,0.280,88.72,384,1.000,bicubic
caformer_m36.sail_in22k_ft_in1k_384,97.370,2.630,99.790,0.210,56.20,384,1.000,bicubic
coatnet_rmlp_2_rw_384.sw_in12k_ft_in1k,97.370,2.630,99.700,0.300,73.88,384,1.000,bicubic
convformer_m36.sail_in22k_ft_in1k_384,97.370,2.630,99.680,0.320,57.05,384,1.000,bicubic
caformer_b36.sail_in22k_ft_in1k,97.360,2.640,99.830,0.170,98.75,224,1.000,bicubic
vit_huge_patch14_clip_224.laion2b_ft_in12k_in1k,97.360,2.640,99.800,0.200,632.05,224,1.000,bicubic
maxvit_rmlp_base_rw_384.sw_in12k_ft_in1k,97.340,2.660,99.690,0.310,116.14,384,1.000,bicubic
tf_efficientnetv2_xl.in21k_ft_in1k,97.330,2.670,99.600,0.400,208.12,512,1.000,bicubic
beit_base_patch16_384.in22k_ft_in22k_in1k,97.320,2.680,99.720,0.280,86.74,384,1.000,bicubic
tf_efficientnetv2_l.in21k_ft_in1k,97.320,2.680,99.640,0.360,118.52,480,1.000,bicubic
convnextv2_large.fcmae_ft_in22k_in1k,97.310,2.690,99.760,0.240,197.96,288,1.000,bicubic
beitv2_large_patch16_224.in1k_ft_in1k,97.310,2.690,99.740,0.260,304.43,224,0.950,bicubic
deit3_large_patch16_224.fb_in22k_ft_in1k,97.310,2.690,99.680,0.320,304.37,224,1.000,bicubic
convnext_large.fb_in22k_ft_in1k_384,97.300,2.700,99.760,0.240,197.77,384,1.000,bicubic
volo_d5_512.sail_in1k,97.300,2.700,99.760,0.240,296.09,512,1.150,bicubic
seresnextaa101d_32x8d.sw_in12k_ft_in1k_288,97.290,2.710,99.780,0.220,93.59,320,1.000,bicubic
seresnextaa201d_32x8d.sw_in12k_ft_in1k_384,97.290,2.710,99.750,0.250,149.39,384,1.000,bicubic
caformer_s36.sail_in22k_ft_in1k_384,97.290,2.710,99.720,0.280,39.30,384,1.000,bicubic
swinv2_large_window12to24_192to384.ms_in22k_ft_in1k,97.280,2.720,99.780,0.220,196.74,384,1.000,bicubic
swinv2_base_window12to24_192to384.ms_in22k_ft_in1k,97.270,2.730,99.790,0.210,87.92,384,1.000,bicubic
convformer_b36.sail_in22k_ft_in1k,97.260,2.740,99.750,0.250,99.88,224,1.000,bicubic
convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_320,97.260,2.740,99.740,0.260,200.13,320,1.000,bicubic
convnext_base.fb_in22k_ft_in1k_384,97.260,2.740,99.710,0.290,88.59,384,1.000,bicubic
convnextv2_huge.fcmae_ft_in1k,97.250,2.750,99.720,0.280,660.29,288,1.000,bicubic
deit3_huge_patch14_224.fb_in22k_ft_in1k,97.250,2.750,99.720,0.280,632.13,224,1.000,bicubic
volo_d5_448.sail_in1k,97.240,2.760,99.740,0.260,295.91,448,1.150,bicubic
swinv2_large_window12to16_192to256.ms_in22k_ft_in1k,97.240,2.760,99.710,0.290,196.74,256,0.900,bicubic
deit3_base_patch16_384.fb_in22k_ft_in1k,97.240,2.760,99.670,0.330,86.88,384,1.000,bicubic
vit_large_patch14_clip_336.laion2b_ft_in1k,97.230,2.770,99.720,0.280,304.53,336,1.000,bicubic
convnext_large.fb_in22k_ft_in1k,97.220,2.780,99.730,0.270,197.77,288,1.000,bicubic
vit_base_patch16_clip_384.laion2b_ft_in12k_in1k,97.220,2.780,99.700,0.300,86.86,384,1.000,bicubic
convnext_base.fb_in22k_ft_in1k,97.200,2.800,99.760,0.240,88.59,288,1.000,bicubic
convnextv2_base.fcmae_ft_in22k_in1k,97.200,2.800,99.760,0.240,88.72,288,1.000,bicubic
maxvit_small_tf_512.in1k,97.200,2.800,99.620,0.380,69.13,512,1.000,bicubic
tf_efficientnet_b7.ns_jft_in1k,97.190,2.810,99.700,0.300,66.35,600,0.949,bicubic
coatnet_rmlp_2_rw_224.sw_in12k_ft_in1k,97.180,2.820,99.650,0.350,73.88,224,0.950,bicubic
regnety_160.swag_ft_in1k,97.170,2.830,99.780,0.220,83.59,384,1.000,bicubic
seresnextaa101d_32x8d.sw_in12k_ft_in1k,97.170,2.830,99.740,0.260,93.59,288,1.000,bicubic
swin_large_patch4_window12_384.ms_in22k_ft_in1k,97.170,2.830,99.680,0.320,196.74,384,1.000,bicubic
maxvit_base_tf_512.in1k,97.170,2.830,99.640,0.360,119.88,512,1.000,bicubic
caformer_b36.sail_in1k_384,97.160,2.840,99.610,0.390,98.75,384,1.000,bicubic
swin_base_patch4_window12_384.ms_in22k_ft_in1k,97.130,2.870,99.780,0.220,87.90,384,1.000,bicubic
convnext_large_mlp.clip_laion2b_augreg_ft_in1k,97.130,2.870,99.720,0.280,200.13,256,1.000,bicubic
vit_base_patch16_clip_384.openai_ft_in12k_in1k,97.120,2.880,99.640,0.360,86.86,384,0.950,bicubic
maxvit_base_tf_384.in1k,97.120,2.880,99.570,0.430,119.65,384,1.000,bicubic
convnext_small.fb_in22k_ft_in1k_384,97.110,2.890,99.640,0.360,50.22,384,1.000,bicubic
maxvit_rmlp_base_rw_224.sw_in12k_ft_in1k,97.110,2.890,99.600,0.400,116.14,224,0.950,bicubic
vit_huge_patch14_clip_224.laion2b_ft_in1k,97.100,2.900,99.690,0.310,632.05,224,1.000,bicubic
maxxvitv2_rmlp_base_rw_224.sw_in12k_ft_in1k,97.090,2.910,99.680,0.320,116.09,224,0.950,bicubic
vit_base_patch8_224.augreg_in21k_ft_in1k,97.090,2.910,99.610,0.390,86.58,224,0.900,bicubic
volo_d4_448.sail_in1k,97.070,2.930,99.750,0.250,193.41,448,1.150,bicubic
convformer_m36.sail_in22k_ft_in1k,97.070,2.930,99.630,0.370,57.05,224,1.000,bicubic
convformer_s36.sail_in22k_ft_in1k_384,97.060,2.940,99.710,0.290,40.01,384,1.000,bicubic
swinv2_base_window12to16_192to256.ms_in22k_ft_in1k,97.050,2.950,99.660,0.340,87.92,256,0.900,bicubic
maxvit_large_tf_512.in1k,97.050,2.950,99.590,0.410,212.33,512,1.000,bicubic
convnext_base.clip_laion2b_augreg_ft_in12k_in1k_384,97.040,2.960,99.670,0.330,88.59,384,1.000,bicubic
caformer_m36.sail_in1k_384,97.030,2.970,99.710,0.290,56.20,384,1.000,bicubic
volo_d3_448.sail_in1k,97.030,2.970,99.680,0.320,86.63,448,1.000,bicubic
dm_nfnet_f5.dm_in1k,97.030,2.970,99.670,0.330,377.21,544,0.954,bicubic
caformer_m36.sail_in22k_ft_in1k,97.020,2.980,99.730,0.270,56.20,224,1.000,bicubic
tf_efficientnet_b6.ns_jft_in1k,97.020,2.980,99.710,0.290,43.04,528,0.942,bicubic
vit_base_patch16_384.augreg_in21k_ft_in1k,97.020,2.980,99.710,0.290,86.86,384,1.000,bicubic
vit_large_patch14_clip_224.laion2b_ft_in1k,97.020,2.980,99.670,0.330,304.20,224,1.000,bicubic
tf_efficientnetv2_m.in21k_ft_in1k,97.000,3.000,99.630,0.370,54.14,480,1.000,bicubic
convnext_small.in12k_ft_in1k_384,96.990,3.010,99.660,0.340,50.22,384,1.000,bicubic
coatnet_2_rw_224.sw_in12k_ft_in1k,96.990,3.010,99.650,0.350,73.87,224,0.950,bicubic
dm_nfnet_f6.dm_in1k,96.970,3.030,99.760,0.240,438.36,576,0.956,bicubic
maxvit_tiny_tf_512.in1k,96.970,3.030,99.670,0.330,31.05,512,1.000,bicubic
vit_large_r50_s32_384.augreg_in21k_ft_in1k,96.950,3.050,99.710,0.290,329.09,384,1.000,bicubic
vit_base_patch8_224.augreg2_in21k_ft_in1k,96.950,3.050,99.640,0.360,86.58,224,0.900,bicubic
dm_nfnet_f4.dm_in1k,96.950,3.050,99.630,0.370,316.07,512,0.951,bicubic
tiny_vit_21m_384.dist_in22k_ft_in1k,96.950,3.050,99.610,0.390,21.23,384,1.000,bicubic
swin_large_patch4_window7_224.ms_in22k_ft_in1k,96.940,3.060,99.670,0.330,196.53,224,0.900,bicubic
xcit_large_24_p16_384.fb_dist_in1k,96.940,3.060,99.510,0.490,189.10,384,1.000,bicubic
maxvit_large_tf_384.in1k,96.930,3.070,99.570,0.430,212.03,384,1.000,bicubic
beitv2_base_patch16_224.in1k_ft_in22k_in1k,96.910,3.090,99.730,0.270,86.53,224,0.900,bicubic
tiny_vit_21m_512.dist_in22k_ft_in1k,96.900,3.100,99.690,0.310,21.27,512,1.000,bicubic
vit_base_patch16_clip_384.laion2b_ft_in1k,96.900,3.100,99.670,0.330,86.86,384,1.000,bicubic
caformer_s36.sail_in1k_384,96.880,3.120,99.670,0.330,39.30,384,1.000,bicubic
volo_d5_224.sail_in1k,96.880,3.120,99.670,0.330,295.46,224,0.960,bicubic
resnetv2_152x4_bit.goog_in21k_ft_in1k,96.880,3.120,99.660,0.340,936.53,480,1.000,bilinear
cait_m48_448.fb_dist_in1k,96.880,3.120,99.620,0.380,356.46,448,1.000,bicubic
convformer_b36.sail_in1k_384,96.870,3.130,99.650,0.350,99.88,384,1.000,bicubic
tf_efficientnet_b5.ns_jft_in1k,96.870,3.130,99.640,0.360,30.39,456,0.934,bicubic
convnext_base.clip_laiona_augreg_ft_in1k_384,96.860,3.140,99.690,0.310,88.59,384,1.000,bicubic
deit3_base_patch16_224.fb_in22k_ft_in1k,96.860,3.140,99.620,0.380,86.59,224,1.000,bicubic
deit3_large_patch16_384.fb_in1k,96.850,3.150,99.620,0.380,304.76,384,1.000,bicubic
cait_m36_384.fb_dist_in1k,96.840,3.160,99.660,0.340,271.22,384,1.000,bicubic
convnextv2_large.fcmae_ft_in1k,96.830,3.170,99.760,0.240,197.96,288,1.000,bicubic
regnety_160.sw_in12k_ft_in1k,96.820,3.180,99.690,0.310,83.59,288,1.000,bicubic
caformer_s36.sail_in22k_ft_in1k,96.820,3.180,99.620,0.380,39.30,224,1.000,bicubic
regnety_160.lion_in12k_ft_in1k,96.810,3.190,99.710,0.290,83.59,288,1.000,bicubic
vit_base_patch16_clip_384.openai_ft_in1k,96.810,3.190,99.660,0.340,86.86,384,1.000,bicubic
xcit_small_24_p8_384.fb_dist_in1k,96.810,3.190,99.630,0.370,47.63,384,1.000,bicubic
convnext_small.fb_in22k_ft_in1k,96.810,3.190,99.510,0.490,50.22,288,1.000,bicubic
convformer_s18.sail_in22k_ft_in1k_384,96.790,3.210,99.710,0.290,26.77,384,1.000,bicubic
convnext_base.clip_laion2b_augreg_ft_in12k_in1k,96.790,3.210,99.680,0.320,88.59,256,1.000,bicubic
regnety_320.swag_lc_in1k,96.780,3.220,99.730,0.270,145.05,224,0.965,bicubic
volo_d4_224.sail_in1k,96.780,3.220,99.670,0.330,192.96,224,0.960,bicubic
convformer_m36.sail_in1k_384,96.780,3.220,99.620,0.380,57.05,384,1.000,bicubic
flexivit_large.1200ep_in1k,96.780,3.220,99.610,0.390,304.36,240,0.950,bicubic
xcit_medium_24_p8_384.fb_dist_in1k,96.770,3.230,99.620,0.380,84.32,384,1.000,bicubic
efficientnet_b5.sw_in12k_ft_in1k,96.770,3.230,99.600,0.400,30.39,448,1.000,bicubic
resnext101_32x32d.fb_wsl_ig1b_ft_in1k,96.770,3.230,99.530,0.470,468.53,224,0.875,bilinear
xcit_large_24_p8_384.fb_dist_in1k,96.760,3.240,99.560,0.440,188.93,384,1.000,bicubic
maxvit_small_tf_384.in1k,96.750,3.250,99.600,0.400,69.02,384,1.000,bicubic
beitv2_base_patch16_224.in1k_ft_in1k,96.750,3.250,99.540,0.460,86.53,224,0.900,bicubic
tf_efficientnetv2_l.in1k,96.740,3.260,99.550,0.450,118.52,480,1.000,bicubic
flexivit_large.600ep_in1k,96.730,3.270,99.560,0.440,304.36,240,0.950,bicubic
inception_next_base.sail_in1k_384,96.720,3.280,99.610,0.390,86.67,384,1.000,bicubic
tf_efficientnet_b4.ns_jft_in1k,96.710,3.290,99.640,0.360,19.34,380,0.922,bicubic
volo_d2_384.sail_in1k,96.710,3.290,99.600,0.400,58.87,384,1.000,bicubic
vit_large_patch16_224.augreg_in21k_ft_in1k,96.700,3.300,99.650,0.350,304.33,224,0.900,bicubic
flexivit_large.300ep_in1k,96.700,3.300,99.580,0.420,304.36,240,0.950,bicubic
convformer_s36.sail_in1k_384,96.700,3.300,99.570,0.430,40.01,384,1.000,bicubic
tf_efficientnet_b8.ra_in1k,96.700,3.300,99.530,0.470,87.41,672,0.954,bicubic
eva02_small_patch14_336.mim_in22k_ft_in1k,96.690,3.310,99.610,0.390,22.13,336,1.000,bicubic
xcit_medium_24_p16_384.fb_dist_in1k,96.690,3.310,99.600,0.400,84.40,384,1.000,bicubic
swin_base_patch4_window7_224.ms_in22k_ft_in1k,96.680,3.320,99.670,0.330,87.77,224,0.900,bicubic
deit3_small_patch16_384.fb_in22k_ft_in1k,96.670,3.330,99.640,0.360,22.21,384,1.000,bicubic
beit_base_patch16_224.in22k_ft_in22k_in1k,96.660,3.340,99.660,0.340,86.53,224,0.900,bicubic
cait_s36_384.fb_dist_in1k,96.630,3.370,99.610,0.390,68.37,384,1.000,bicubic
xcit_large_24_p8_224.fb_dist_in1k,96.630,3.370,99.460,0.540,188.93,224,1.000,bicubic
dm_nfnet_f3.dm_in1k,96.620,3.380,99.630,0.370,254.92,416,0.940,bicubic
convnextv2_tiny.fcmae_ft_in22k_in1k_384,96.620,3.380,99.580,0.420,28.64,384,1.000,bicubic
vit_base_patch16_clip_224.laion2b_ft_in12k_in1k,96.620,3.380,99.560,0.440,86.57,224,0.950,bicubic
vit_base_patch32_clip_384.laion2b_ft_in12k_in1k,96.610,3.390,99.480,0.520,88.30,384,1.000,bicubic
regnetz_e8.ra3_in1k,96.600,3.400,99.620,0.380,57.70,320,1.000,bicubic
convnext_small.in12k_ft_in1k,96.600,3.400,99.580,0.420,50.22,288,1.000,bicubic
maxvit_tiny_tf_384.in1k,96.600,3.400,99.560,0.440,30.98,384,1.000,bicubic
deit3_huge_patch14_224.fb_in1k,96.580,3.420,99.520,0.480,632.13,224,0.900,bicubic
cait_s24_384.fb_dist_in1k,96.570,3.430,99.550,0.450,47.06,384,1.000,bicubic
tf_efficientnet_b7.ra_in1k,96.570,3.430,99.520,0.480,66.35,600,0.949,bicubic
coat_lite_medium_384.in1k,96.570,3.430,99.470,0.530,44.57,384,1.000,bicubic
convnext_base.clip_laion2b_augreg_ft_in1k,96.560,3.440,99.650,0.350,88.59,256,1.000,bicubic
convnext_tiny.in12k_ft_in1k_384,96.560,3.440,99.630,0.370,28.59,384,1.000,bicubic
vit_base_patch32_clip_448.laion2b_ft_in12k_in1k,96.560,3.440,99.520,0.480,88.34,448,1.000,bicubic
regnety_120.sw_in12k_ft_in1k,96.550,3.450,99.680,0.320,51.82,288,1.000,bicubic
xcit_small_24_p8_224.fb_dist_in1k,96.550,3.450,99.560,0.440,47.63,224,1.000,bicubic
tf_efficientnet_b8.ap_in1k,96.550,3.450,99.540,0.460,87.41,672,0.954,bicubic
hrnet_w48_ssld.paddle_in1k,96.540,3.460,99.640,0.360,77.47,288,1.000,bilinear
caformer_s18.sail_in22k_ft_in1k_384,96.530,3.470,99.580,0.420,26.34,384,1.000,bicubic
regnety_2560.seer_ft_in1k,96.530,3.470,99.520,0.480,"1,282.60",384,1.000,bicubic
xcit_medium_24_p8_224.fb_dist_in1k,96.530,3.470,99.510,0.490,84.32,224,1.000,bicubic
resnetv2_152x2_bit.goog_in21k_ft_in1k,96.520,3.480,99.590,0.410,236.34,448,1.000,bilinear
dm_nfnet_f2.dm_in1k,96.520,3.480,99.570,0.430,193.78,352,0.920,bicubic
deit_base_distilled_patch16_384.fb_in1k,96.510,3.490,99.590,0.410,87.63,384,1.000,bicubic
vit_base_patch16_224.augreg2_in21k_ft_in1k,96.510,3.490,99.560,0.440,86.57,224,0.900,bicubic
vit_base_patch16_clip_224.openai_ft_in12k_in1k,96.510,3.490,99.550,0.450,86.57,224,0.950,bicubic
convformer_s36.sail_in22k_ft_in1k,96.500,3.500,99.630,0.370,40.01,224,1.000,bicubic
caformer_b36.sail_in1k,96.500,3.500,99.460,0.540,98.75,224,1.000,bicubic
vit_medium_patch16_gap_384.sw_in12k_ft_in1k,96.490,3.510,99.620,0.380,39.03,384,0.950,bicubic
tf_efficientnetv2_m.in1k,96.480,3.520,99.610,0.390,54.14,480,1.000,bicubic
volo_d1_384.sail_in1k,96.480,3.520,99.550,0.450,26.78,384,1.000,bicubic
convnextv2_base.fcmae_ft_in1k,96.480,3.520,99.520,0.480,88.72,288,1.000,bicubic
tf_efficientnetv2_s.in21k_ft_in1k,96.470,3.530,99.570,0.430,21.46,384,1.000,bicubic
xcit_small_12_p8_384.fb_dist_in1k,96.470,3.530,99.490,0.510,26.21,384,1.000,bicubic
regnety_160.swag_lc_in1k,96.450,3.550,99.750,0.250,83.59,224,0.965,bicubic
vit_base_r50_s16_384.orig_in21k_ft_in1k,96.450,3.550,99.660,0.340,98.95,384,1.000,bicubic
eca_nfnet_l2.ra3_in1k,96.450,3.550,99.610,0.390,56.72,384,1.000,bicubic
ecaresnet269d.ra2_in1k,96.450,3.550,99.610,0.390,102.09,352,1.000,bicubic
seresnextaa101d_32x8d.ah_in1k,96.440,3.560,99.510,0.490,93.59,288,1.000,bicubic
volo_d3_224.sail_in1k,96.430,3.570,99.630,0.370,86.33,224,0.960,bicubic
resnext101_32x16d.fb_wsl_ig1b_ft_in1k,96.430,3.570,99.540,0.460,194.03,224,0.875,bilinear
volo_d2_224.sail_in1k,96.420,3.580,99.500,0.500,58.68,224,0.960,bicubic
vit_base_patch32_clip_384.openai_ft_in12k_in1k,96.420,3.580,99.460,0.540,88.30,384,0.950,bicubic
caformer_s18.sail_in1k_384,96.410,3.590,99.560,0.440,26.34,384,1.000,bicubic
caformer_m36.sail_in1k,96.410,3.590,99.530,0.470,56.20,224,1.000,bicubic
resnetrs420.tf_in1k,96.400,3.600,99.540,0.460,191.89,416,1.000,bicubic
convnext_large.fb_in1k,96.400,3.600,99.530,0.470,197.77,288,1.000,bicubic
mvitv2_large.fb_in1k,96.400,3.600,99.450,0.550,217.99,224,0.900,bicubic
tiny_vit_21m_224.dist_in22k_ft_in1k,96.380,3.620,99.500,0.500,21.20,224,0.950,bicubic
swin_base_patch4_window12_384.ms_in1k,96.380,3.620,99.420,0.580,87.90,384,1.000,bicubic
tf_efficientnet_b6.ap_in1k,96.370,3.630,99.550,0.450,43.04,528,0.942,bicubic
seresnext101d_32x8d.ah_in1k,96.360,3.640,99.470,0.530,93.59,288,1.000,bicubic
resnetaa101d.sw_in12k_ft_in1k,96.360,3.640,99.440,0.560,44.57,288,1.000,bicubic
tf_efficientnet_b7.ap_in1k,96.350,3.650,99.590,0.410,66.35,600,0.949,bicubic
resnetrs200.tf_in1k,96.350,3.650,99.550,0.450,93.21,320,1.000,bicubic
resmlp_big_24_224.fb_in22k_ft_in1k,96.350,3.650,99.520,0.480,129.14,224,0.875,bicubic
xcit_small_24_p16_384.fb_dist_in1k,96.340,3.660,99.580,0.420,47.67,384,1.000,bicubic
convnextv2_tiny.fcmae_ft_in22k_in1k,96.340,3.660,99.550,0.450,28.64,288,1.000,bicubic
xcit_small_12_p16_384.fb_dist_in1k,96.340,3.660,99.490,0.510,26.25,384,1.000,bicubic
maxvit_base_tf_224.in1k,96.340,3.660,99.370,0.630,119.47,224,0.950,bicubic
maxvit_large_tf_224.in1k,96.330,3.670,99.410,0.590,211.79,224,0.950,bicubic
vit_base_patch16_clip_224.laion2b_ft_in1k,96.320,3.680,99.540,0.460,86.57,224,1.000,bicubic
regnetz_040_h.ra3_in1k,96.320,3.680,99.520,0.480,28.94,320,1.000,bicubic
xcit_large_24_p16_224.fb_dist_in1k,96.320,3.680,99.500,0.500,189.10,224,1.000,bicubic
vit_base_patch16_clip_224.openai_ft_in1k,96.310,3.690,99.550,0.450,86.57,224,0.900,bicubic
seresnet152d.ra2_in1k,96.310,3.690,99.510,0.490,66.84,320,1.000,bicubic
convnext_base.fb_in1k,96.310,3.690,99.500,0.500,88.59,288,1.000,bicubic
regnety_1280.seer_ft_in1k,96.310,3.690,99.410,0.590,644.81,384,1.000,bicubic
vit_base_patch16_224.augreg_in21k_ft_in1k,96.300,3.700,99.560,0.440,86.57,224,0.900,bicubic
dm_nfnet_f1.dm_in1k,96.300,3.700,99.530,0.470,132.63,320,0.910,bicubic
tf_efficientnet_b6.aa_in1k,96.300,3.700,99.530,0.470,43.04,528,0.942,bicubic
fastvit_ma36.apple_dist_in1k,96.300,3.700,99.500,0.500,44.07,256,0.950,bicubic
resnetv2_50x3_bit.goog_in21k_ft_in1k,96.270,3.730,99.630,0.370,217.32,448,1.000,bilinear
efficientnetv2_rw_m.agc_in1k,96.270,3.730,99.560,0.440,53.24,416,1.000,bicubic
resnext101_32x16d.fb_swsl_ig1b_ft_in1k,96.270,3.730,99.500,0.500,194.03,224,0.875,bilinear
resnext101_32x8d.fb_swsl_ig1b_ft_in1k,96.250,3.750,99.590,0.410,88.79,224,0.875,bilinear
resnetv2_101x3_bit.goog_in21k_ft_in1k,96.250,3.750,99.580,0.420,387.93,448,1.000,bilinear
convformer_s18.sail_in1k_384,96.250,3.750,99.540,0.460,26.77,384,1.000,bicubic
resnetrs350.tf_in1k,96.250,3.750,99.470,0.530,163.96,384,1.000,bicubic
xcit_medium_24_p16_224.fb_dist_in1k,96.250,3.750,99.410,0.590,84.40,224,1.000,bicubic
convnext_tiny.in12k_ft_in1k,96.240,3.760,99.640,0.360,28.59,288,1.000,bicubic
davit_base.msft_in1k,96.240,3.760,99.410,0.590,87.95,224,0.950,bicubic
convformer_b36.sail_in1k,96.240,3.760,99.290,0.710,99.88,224,1.000,bicubic
xcit_tiny_24_p8_384.fb_dist_in1k,96.230,3.770,99.440,0.560,12.11,384,1.000,bicubic
deit3_base_patch16_384.fb_in1k,96.230,3.770,99.400,0.600,86.88,384,1.000,bicubic
maxxvit_rmlp_small_rw_256.sw_in1k,96.210,3.790,99.480,0.520,66.01,256,0.950,bicubic
coatnet_rmlp_2_rw_224.sw_in1k,96.210,3.790,99.280,0.720,73.88,224,0.950,bicubic
vit_base_patch16_384.orig_in21k_ft_in1k,96.200,3.800,99.530,0.470,86.86,384,1.000,bicubic
edgenext_base.in21k_ft_in1k,96.200,3.800,99.470,0.530,18.51,320,1.000,bicubic
maxvit_small_tf_224.in1k,96.200,3.800,99.460,0.540,68.93,224,0.950,bicubic
resnetv2_152x2_bit.goog_teacher_in21k_ft_in1k_384,96.190,3.810,99.500,0.500,236.34,384,1.000,bicubic
deit3_large_patch16_224.fb_in1k,96.190,3.810,99.300,0.700,304.37,224,0.900,bicubic
vit_large_r50_s32_224.augreg_in21k_ft_in1k,96.180,3.820,99.530,0.470,328.99,224,0.900,bicubic
regnetz_040.ra3_in1k,96.180,3.820,99.510,0.490,27.12,320,1.000,bicubic
convnext_tiny.fb_in22k_ft_in1k_384,96.170,3.830,99.500,0.500,28.59,384,1.000,bicubic
swinv2_base_window16_256.ms_in1k,96.170,3.830,99.390,0.610,87.92,256,0.900,bicubic
crossvit_18_dagger_408.in1k,96.150,3.850,99.470,0.530,44.61,408,1.000,bicubic
regnetz_d8_evos.ch_in1k,96.140,3.860,99.490,0.510,23.46,320,1.000,bicubic
deit3_medium_patch16_224.fb_in22k_ft_in1k,96.140,3.860,99.480,0.520,38.85,224,1.000,bicubic
seresnext101_32x8d.ah_in1k,96.140,3.860,99.360,0.640,93.57,288,1.000,bicubic
efficientvit_b3.r288_in1k,96.140,3.860,99.290,0.710,48.65,288,1.000,bicubic
coatnet_rmlp_1_rw2_224.sw_in12k_ft_in1k,96.130,3.870,99.340,0.660,41.72,224,0.950,bicubic
flexivit_base.1200ep_in1k,96.120,3.880,99.410,0.590,86.59,240,0.950,bicubic
resnest269e.in1k,96.110,3.890,99.520,0.480,110.93,416,0.928,bicubic
resnet200d.ra2_in1k,96.110,3.890,99.460,0.540,64.69,320,1.000,bicubic
convformer_s36.sail_in1k,96.110,3.890,99.310,0.690,40.01,224,1.000,bicubic
convformer_s18.sail_in22k_ft_in1k,96.100,3.900,99.490,0.510,26.77,224,1.000,bicubic
tf_efficientnet_b3.ns_jft_in1k,96.100,3.900,99.480,0.520,12.23,300,0.904,bicubic
rexnetr_300.sw_in12k_ft_in1k,96.090,3.910,99.540,0.460,34.81,288,1.000,bicubic
tf_efficientnet_b5.ap_in1k,96.090,3.910,99.540,0.460,30.39,456,0.934,bicubic
xcit_large_24_p8_224.fb_in1k,96.090,3.910,99.140,0.860,188.93,224,1.000,bicubic
caformer_s36.sail_in1k,96.080,3.920,99.510,0.490,39.30,224,1.000,bicubic
gcvit_base.in1k,96.080,3.920,99.390,0.610,90.32,224,0.875,bicubic
convformer_m36.sail_in1k,96.080,3.920,99.250,0.750,57.05,224,1.000,bicubic
resnest200e.in1k,96.070,3.930,99.480,0.520,70.20,320,0.909,bicubic
tf_efficientnet_b7.aa_in1k,96.070,3.930,99.450,0.550,66.35,600,0.949,bicubic
swinv2_small_window16_256.ms_in1k,96.070,3.930,99.340,0.660,49.73,256,0.900,bicubic
vit_small_r26_s32_384.augreg_in21k_ft_in1k,96.060,3.940,99.550,0.450,36.47,384,1.000,bicubic
regnety_640.seer_ft_in1k,96.060,3.940,99.500,0.500,281.38,384,1.000,bicubic
resnetrs270.tf_in1k,96.060,3.940,99.480,0.520,129.86,352,1.000,bicubic
swin_small_patch4_window7_224.ms_in22k_ft_in1k,96.060,3.940,99.480,0.520,49.61,224,0.900,bicubic
swinv2_base_window8_256.ms_in1k,96.060,3.940,99.410,0.590,87.92,256,0.900,bicubic
resnext101_32x4d.fb_swsl_ig1b_ft_in1k,96.040,3.960,99.540,0.460,44.18,224,0.875,bilinear
maxvit_rmlp_tiny_rw_256.sw_in1k,96.040,3.960,99.410,0.590,29.15,256,0.950,bicubic
swin_s3_base_224.ms_in1k,96.040,3.960,99.350,0.650,71.13,224,0.900,bicubic
vit_base_patch16_224_miil.in21k_ft_in1k,96.040,3.960,99.350,0.650,86.54,224,0.875,bilinear
davit_small.msft_in1k,96.030,3.970,99.400,0.600,49.75,224,0.950,bicubic
volo_d1_224.sail_in1k,96.030,3.970,99.390,0.610,26.63,224,0.960,bicubic
caformer_s18.sail_in22k_ft_in1k,96.010,3.990,99.550,0.450,26.34,224,1.000,bicubic
regnetz_d8.ra3_in1k,96.010,3.990,99.520,0.480,23.37,320,1.000,bicubic
vit_medium_patch16_gap_256.sw_in12k_ft_in1k,96.000,4.000,99.500,0.500,38.86,256,0.950,bicubic
cs3se_edgenet_x.c2ns_in1k,96.000,4.000,99.430,0.570,50.72,320,1.000,bicubic
cait_xs24_384.fb_dist_in1k,96.000,4.000,99.420,0.580,26.67,384,1.000,bicubic
coat_lite_medium.in1k,96.000,4.000,99.350,0.650,44.57,224,0.900,bicubic
repvit_m2_3.dist_450e_in1k,95.990,4.010,99.400,0.600,23.69,224,0.950,bicubic
vit_small_patch16_384.augreg_in21k_ft_in1k,95.980,4.020,99.590,0.410,22.20,384,1.000,bicubic
mvitv2_base.fb_in1k,95.980,4.020,99.330,0.670,51.47,224,0.900,bicubic
tf_efficientnet_b5.ra_in1k,95.970,4.030,99.460,0.540,30.39,456,0.934,bicubic
convnext_small.fb_in1k,95.970,4.030,99.430,0.570,50.22,288,1.000,bicubic
fastvit_ma36.apple_in1k,95.970,4.030,99.360,0.640,44.07,256,0.950,bicubic
xcit_small_12_p8_224.fb_dist_in1k,95.960,4.040,99.430,0.570,26.21,224,1.000,bicubic
flexivit_base.600ep_in1k,95.960,4.040,99.420,0.580,86.59,240,0.950,bicubic
resnetrs152.tf_in1k,95.960,4.040,99.380,0.620,86.62,320,1.000,bicubic
fastvit_sa36.apple_dist_in1k,95.960,4.040,99.370,0.630,31.53,256,0.900,bicubic
maxvit_rmlp_small_rw_224.sw_in1k,95.960,4.040,99.350,0.650,64.90,224,0.900,bicubic
repvgg_d2se.rvgg_in1k,95.950,4.050,99.470,0.530,133.33,320,1.000,bilinear
flexivit_base.300ep_in1k,95.950,4.050,99.370,0.630,86.59,240,0.950,bicubic
pvt_v2_b5.in1k,95.940,4.060,99.390,0.610,81.96,224,0.900,bicubic
resnext101_32x8d.fb_wsl_ig1b_ft_in1k,95.940,4.060,99.380,0.620,88.79,224,0.875,bilinear
eca_nfnet_l1.ra2_in1k,95.930,4.070,99.490,0.510,41.41,320,1.000,bicubic
pvt_v2_b4.in1k,95.920,4.080,99.360,0.640,62.56,224,0.900,bicubic
inception_next_base.sail_in1k,95.920,4.080,99.220,0.780,86.67,224,0.950,bicubic
gcvit_small.in1k,95.910,4.090,99.280,0.720,51.09,224,0.875,bicubic
vit_base_patch32_384.augreg_in21k_ft_in1k,95.900,4.100,99.440,0.560,88.30,384,1.000,bicubic
focalnet_base_srf.ms_in1k,95.900,4.100,99.340,0.660,88.15,224,0.900,bicubic
swin_base_patch4_window7_224.ms_in1k,95.900,4.100,99.310,0.690,87.77,224,0.900,bicubic
xcit_small_24_p8_224.fb_in1k,95.900,4.100,99.180,0.820,47.63,224,1.000,bicubic
mvitv2_small.fb_in1k,95.890,4.110,99.360,0.640,34.87,224,0.900,bicubic
tf_efficientnet_b5.aa_in1k,95.880,4.120,99.350,0.650,30.39,456,0.934,bicubic
regnety_160.deit_in1k,95.870,4.130,99.560,0.440,83.59,288,1.000,bicubic
sequencer2d_l.in1k,95.870,4.130,99.470,0.530,54.30,224,0.875,bicubic
regnety_080.ra3_in1k,95.870,4.130,99.440,0.560,39.18,288,1.000,bicubic
resmlp_big_24_224.fb_distilled_in1k,95.870,4.130,99.440,0.560,129.14,224,0.875,bicubic
regnetz_d32.ra3_in1k,95.870,4.130,99.430,0.570,27.58,320,0.950,bicubic
resnet152d.ra2_in1k,95.870,4.130,99.430,0.570,60.21,320,1.000,bicubic
tf_efficientnet_b5.in1k,95.870,4.130,99.390,0.610,30.39,456,0.934,bicubic
xcit_medium_24_p8_224.fb_in1k,95.860,4.140,99.080,0.920,84.32,224,1.000,bicubic
deit3_small_patch16_224.fb_in22k_ft_in1k,95.830,4.170,99.390,0.610,22.06,224,1.000,bicubic
convnextv2_tiny.fcmae_ft_in1k,95.830,4.170,99.340,0.660,28.64,288,1.000,bicubic
efficientvit_b3.r256_in1k,95.830,4.170,99.220,0.780,48.65,256,1.000,bicubic
swin_s3_small_224.ms_in1k,95.830,4.170,99.200,0.800,49.74,224,0.900,bicubic
focalnet_base_lrf.ms_in1k,95.830,4.170,99.180,0.820,88.75,224,0.900,bicubic
resnext101_64x4d.tv_in1k,95.820,4.180,99.320,0.680,83.46,224,0.875,bilinear
crossvit_15_dagger_408.in1k,95.820,4.180,99.310,0.690,28.50,408,1.000,bicubic
tresnet_v2_l.miil_in21k_ft_in1k,95.820,4.180,99.290,0.710,46.17,224,0.875,bilinear
pit_b_distilled_224.in1k,95.820,4.180,99.210,0.790,74.79,224,0.900,bicubic
maxvit_tiny_tf_224.in1k,95.810,4.190,99.260,0.740,30.92,224,0.950,bicubic
edgenext_base.usi_in1k,95.790,4.210,99.570,0.430,18.51,320,1.000,bicubic
regnety_320.seer_ft_in1k,95.790,4.210,99.390,0.610,145.05,384,1.000,bicubic
xcit_small_24_p16_224.fb_dist_in1k,95.790,4.210,99.350,0.650,47.67,224,1.000,bicubic
convnextv2_nano.fcmae_ft_in22k_in1k_384,95.790,4.210,99.300,0.700,15.62,384,1.000,bicubic
regnety_064.ra3_in1k,95.790,4.210,99.290,0.710,30.58,288,1.000,bicubic
regnetv_064.ra3_in1k,95.780,4.220,99.420,0.580,30.58,288,1.000,bicubic
deit3_base_patch16_224.fb_in1k,95.770,4.230,99.270,0.730,86.59,224,0.900,bicubic
efficientformerv2_l.snap_dist_in1k,95.760,4.240,99.370,0.630,26.32,224,0.950,bicubic
resnet152.a1h_in1k,95.750,4.250,99.430,0.570,60.19,288,1.000,bicubic
resnetv2_152x2_bit.goog_teacher_in21k_ft_in1k,95.750,4.250,99.430,0.570,236.34,224,0.875,bicubic
deit_base_distilled_patch16_224.fb_in1k,95.750,4.250,99.280,0.720,87.34,224,0.900,bicubic
resnet101d.ra2_in1k,95.740,4.260,99.440,0.560,44.57,320,1.000,bicubic
focalnet_small_lrf.ms_in1k,95.740,4.260,99.210,0.790,50.34,224,0.900,bicubic
maxvit_tiny_rw_224.sw_in1k,95.740,4.260,99.160,0.840,29.06,224,0.950,bicubic
regnetv_040.ra3_in1k,95.730,4.270,99.380,0.620,20.64,288,1.000,bicubic
swinv2_small_window8_256.ms_in1k,95.730,4.270,99.360,0.640,49.73,256,0.900,bicubic
xcit_small_12_p16_224.fb_dist_in1k,95.730,4.270,99.300,0.700,26.25,224,1.000,bicubic
twins_pcpvt_large.in1k,95.720,4.280,99.490,0.510,60.99,224,0.900,bicubic
tf_efficientnetv2_s.in1k,95.710,4.290,99.400,0.600,21.46,384,1.000,bicubic
efficientnetv2_rw_s.ra2_in1k,95.710,4.290,99.380,0.620,23.94,384,1.000,bicubic
hrnet_w18_ssld.paddle_in1k,95.710,4.290,99.340,0.660,21.30,288,1.000,bilinear
swin_small_patch4_window7_224.ms_in1k,95.710,4.290,99.290,0.710,49.61,224,0.900,bicubic
swinv2_cr_small_ns_224.sw_in1k,95.710,4.290,99.290,0.710,49.70,224,0.900,bicubic
tiny_vit_11m_224.dist_in22k_ft_in1k,95.710,4.290,99.260,0.740,11.00,224,0.950,bicubic
twins_svt_large.in1k,95.700,4.300,99.370,0.630,99.27,224,0.900,bicubic
dm_nfnet_f0.dm_in1k,95.690,4.310,99.350,0.650,71.49,256,0.900,bicubic
xception65.ra3_in1k,95.690,4.310,99.320,0.680,39.92,299,0.940,bicubic
caformer_s18.sail_in1k,95.680,4.320,99.290,0.710,26.34,224,1.000,bicubic
inception_next_small.sail_in1k,95.680,4.320,99.250,0.750,49.37,224,0.875,bicubic
cait_s24_224.fb_dist_in1k,95.660,4.340,99.390,0.610,46.92,224,1.000,bicubic
gcvit_tiny.in1k,95.660,4.340,99.330,0.670,28.22,224,0.875,bicubic
xception65p.ra3_in1k,95.660,4.340,99.270,0.730,39.82,299,0.940,bicubic
tiny_vit_21m_224.in1k,95.650,4.350,99.250,0.750,21.20,224,0.950,bicubic
regnetz_c16_evos.ch_in1k,95.640,4.360,99.420,0.580,13.49,320,0.950,bicubic
deit_base_patch16_384.fb_in1k,95.640,4.360,99.240,0.760,86.86,384,1.000,bicubic
resnext50_32x4d.fb_swsl_ig1b_ft_in1k,95.630,4.370,99.440,0.560,25.03,224,0.875,bilinear
ecaresnet101d.miil_in1k,95.630,4.370,99.410,0.590,44.57,288,0.950,bicubic
focalnet_small_srf.ms_in1k,95.630,4.370,99.290,0.710,49.89,224,0.900,bicubic
coatnet_1_rw_224.sw_in1k,95.620,4.380,99.220,0.780,41.72,224,0.950,bicubic
fastvit_sa36.apple_in1k,95.610,4.390,99.320,0.680,31.53,256,0.900,bicubic
efficientformer_l7.snap_dist_in1k,95.600,4.400,99.440,0.560,82.23,224,0.950,bicubic
deit3_small_patch16_384.fb_in1k,95.600,4.400,99.390,0.610,22.21,384,1.000,bicubic
tf_efficientnetv2_b3.in21k_ft_in1k,95.600,4.400,99.280,0.720,14.36,300,0.900,bicubic
repvit_m2_3.dist_300e_in1k,95.590,4.410,99.390,0.610,23.69,224,0.950,bicubic
tf_efficientnet_b4.aa_in1k,95.590,4.410,99.330,0.670,19.34,380,0.922,bicubic
sequencer2d_m.in1k,95.580,4.420,99.270,0.730,38.31,224,0.875,bicubic
resnet101.a1h_in1k,95.580,4.420,99.250,0.750,44.55,288,1.000,bicubic
efficientvit_b2.r288_in1k,95.580,4.420,99.220,0.780,24.33,288,1.000,bicubic
resnetv2_101.a1h_in1k,95.570,4.430,99.370,0.630,44.54,288,1.000,bicubic
resnest101e.in1k,95.570,4.430,99.270,0.730,48.28,256,0.875,bilinear
regnety_320.tv2_in1k,95.560,4.440,99.390,0.610,145.05,224,0.965,bicubic
twins_svt_base.in1k,95.560,4.440,99.230,0.770,56.07,224,0.900,bicubic
fastvit_sa24.apple_dist_in1k,95.550,4.450,99.310,0.690,21.55,256,0.900,bicubic
rexnet_300.nav_in1k,95.540,4.460,99.320,0.680,34.71,224,0.875,bicubic
nest_base_jx.goog_in1k,95.540,4.460,99.290,0.710,67.72,224,0.875,bicubic
nest_small_jx.goog_in1k,95.540,4.460,99.220,0.780,38.35,224,0.875,bicubic
efficientvit_b3.r224_in1k,95.540,4.460,99.190,0.810,48.65,224,0.950,bicubic
efficientnet_b4.ra2_in1k,95.530,4.470,99.400,0.600,19.34,384,1.000,bicubic
resnext101_64x4d.c1_in1k,95.530,4.470,99.290,0.710,83.46,288,1.000,bicubic
tf_efficientnet_b2.ns_jft_in1k,95.520,4.480,99.340,0.660,9.11,260,0.890,bicubic
tresnet_xl.miil_in1k_448,95.510,4.490,99.340,0.660,78.44,448,0.875,bilinear
regnety_040.ra3_in1k,95.490,4.510,99.420,0.580,20.65,288,1.000,bicubic
tf_efficientnet_b4.ap_in1k,95.490,4.510,99.390,0.610,19.34,380,0.922,bicubic
xcit_tiny_24_p16_384.fb_dist_in1k,95.490,4.510,99.360,0.640,12.12,384,1.000,bicubic
coatnet_rmlp_1_rw_224.sw_in1k,95.490,4.510,99.250,0.750,41.69,224,0.950,bicubic
tf_efficientnet_b4.in1k,95.480,4.520,99.270,0.730,19.34,380,0.922,bicubic
twins_pcpvt_base.in1k,95.470,4.530,99.390,0.610,43.83,224,0.900,bicubic
pvt_v2_b3.in1k,95.470,4.530,99.310,0.690,45.24,224,0.900,bicubic
maxvit_nano_rw_256.sw_in1k,95.470,4.530,99.120,0.880,15.45,256,0.950,bicubic
eca_nfnet_l0.ra2_in1k,95.460,4.540,99.390,0.610,24.14,288,1.000,bicubic
regnety_032.ra_in1k,95.460,4.540,99.320,0.680,19.44,288,1.000,bicubic
cs3edgenet_x.c2_in1k,95.460,4.540,99.280,0.720,47.82,288,1.000,bicubic
sequencer2d_s.in1k,95.460,4.540,99.260,0.740,27.65,224,0.875,bicubic
xcit_tiny_24_p8_224.fb_dist_in1k,95.450,4.550,99.360,0.640,12.11,224,1.000,bicubic
maxvit_rmlp_nano_rw_256.sw_in1k,95.440,4.560,99.060,0.940,15.50,256,0.950,bicubic
maxxvitv2_nano_rw_256.sw_in1k,95.430,4.570,99.190,0.810,23.70,256,0.950,bicubic
convnextv2_nano.fcmae_ft_in22k_in1k,95.420,4.580,99.310,0.690,15.62,288,1.000,bicubic
xcit_small_12_p8_224.fb_in1k,95.420,4.580,99.190,0.810,26.21,224,1.000,bicubic
resnetv2_50x1_bit.goog_distilled_in1k,95.410,4.590,99.430,0.570,25.55,224,0.875,bicubic
cs3sedarknet_x.c2ns_in1k,95.410,4.590,99.320,0.680,35.40,288,1.000,bicubic
swinv2_cr_small_224.sw_in1k,95.410,4.590,99.060,0.940,49.70,224,0.900,bicubic
resnext101_32x16d.fb_ssl_yfcc100m_ft_in1k,95.400,4.600,99.400,0.600,194.03,224,0.875,bilinear
tresnet_l.miil_in1k_448,95.400,4.600,99.300,0.700,55.99,448,0.875,bilinear
mvitv2_tiny.fb_in1k,95.400,4.600,99.160,0.840,24.17,224,0.900,bicubic
mobilevitv2_200.cvnets_in22k_ft_in1k_384,95.390,4.610,99.280,0.720,18.45,384,1.000,bicubic
nfnet_l0.ra2_in1k,95.380,4.620,99.420,0.580,35.07,288,1.000,bicubic
regnetz_c16.ra3_in1k,95.380,4.620,99.350,0.650,13.46,320,1.000,bicubic
deit3_medium_patch16_224.fb_in1k,95.380,4.620,99.210,0.790,38.85,224,0.900,bicubic
tresnet_m.miil_in21k_ft_in1k,95.380,4.620,99.150,0.850,31.39,224,0.875,bilinear
pnasnet5large.tf_in1k,95.360,4.640,99.130,0.870,86.06,331,0.911,bicubic
convnext_nano.in12k_ft_in1k,95.350,4.650,99.450,0.550,15.59,288,1.000,bicubic
mobilevitv2_150.cvnets_in22k_ft_in1k_384,95.350,4.650,99.120,0.880,10.59,384,1.000,bicubic
xcit_tiny_12_p8_384.fb_dist_in1k,95.340,4.660,99.340,0.660,6.71,384,1.000,bicubic
maxxvit_rmlp_nano_rw_256.sw_in1k,95.340,4.660,99.310,0.690,16.78,256,0.950,bicubic
swinv2_tiny_window16_256.ms_in1k,95.330,4.670,99.300,0.700,28.35,256,0.900,bicubic
convformer_s18.sail_in1k,95.330,4.670,99.150,0.850,26.77,224,1.000,bicubic
resnetv2_101x1_bit.goog_in21k_ft_in1k,95.320,4.680,99.370,0.630,44.54,448,1.000,bilinear
resnext101_32x8d.fb_ssl_yfcc100m_ft_in1k,95.320,4.680,99.320,0.680,88.79,224,0.875,bilinear
rexnetr_200.sw_in12k_ft_in1k,95.310,4.690,99.470,0.530,16.52,288,1.000,bicubic
resnext101_32x8d.tv2_in1k,95.300,4.700,99.360,0.640,88.79,224,0.965,bilinear
regnety_080_tv.tv2_in1k,95.300,4.700,99.230,0.770,39.38,224,0.965,bicubic
vit_relpos_medium_patch16_cls_224.sw_in1k,95.300,4.700,99.100,0.900,38.76,224,0.900,bicubic
resnetaa50d.sw_in12k_ft_in1k,95.290,4.710,99.380,0.620,25.58,288,1.000,bicubic
gc_efficientnetv2_rw_t.agc_in1k,95.290,4.710,99.220,0.780,13.68,288,1.000,bicubic
fastvit_sa24.apple_in1k,95.280,4.720,99.310,0.690,21.55,256,0.900,bicubic
cs3darknet_x.c2ns_in1k,95.280,4.720,99.290,0.710,35.05,288,1.000,bicubic
regnetx_320.tv2_in1k,95.280,4.720,99.290,0.710,107.81,224,0.965,bicubic
repvit_m1_5.dist_450e_in1k,95.280,4.720,99.230,0.770,14.64,224,0.950,bicubic
mobilevitv2_175.cvnets_in22k_ft_in1k_384,95.260,4.740,99.380,0.620,14.25,384,1.000,bicubic
flexivit_small.600ep_in1k,95.260,4.740,99.160,0.840,22.06,240,0.950,bicubic
resnetrs101.tf_in1k,95.250,4.750,99.210,0.790,63.62,288,0.940,bicubic
vit_relpos_base_patch16_clsgap_224.sw_in1k,95.250,4.750,99.200,0.800,86.43,224,0.900,bicubic
convnext_tiny_hnf.a2h_in1k,95.250,4.750,98.980,1.020,28.59,288,1.000,bicubic
cait_xxs36_384.fb_dist_in1k,95.240,4.760,99.320,0.680,17.37,384,1.000,bicubic
vit_large_patch32_384.orig_in21k_ft_in1k,95.240,4.760,99.320,0.680,306.63,384,1.000,bicubic
tiny_vit_11m_224.in1k,95.240,4.760,99.230,0.770,11.00,224,0.950,bicubic
wide_resnet101_2.tv2_in1k,95.240,4.760,99.200,0.800,126.89,224,0.965,bilinear
vit_base_patch32_clip_224.laion2b_ft_in12k_in1k,95.230,4.770,99.240,0.760,88.22,224,0.900,bicubic
pvt_v2_b2_li.in1k,95.220,4.780,99.260,0.740,22.55,224,0.900,bicubic
efficientvit_b2.r256_in1k,95.220,4.780,99.130,0.870,24.33,256,1.000,bicubic
resnetv2_50d_gn.ah_in1k,95.220,4.780,99.030,0.970,25.57,288,1.000,bicubic
convnext_tiny.fb_in1k,95.210,4.790,99.310,0.690,28.59,288,1.000,bicubic
efficientformer_l3.snap_dist_in1k,95.210,4.790,99.310,0.690,31.41,224,0.950,bicubic
regnetx_160.tv2_in1k,95.210,4.790,99.280,0.720,54.28,224,0.965,bicubic
levit_384.fb_dist_in1k,95.210,4.790,99.160,0.840,39.13,224,0.900,bicubic
levit_conv_384.fb_dist_in1k,95.210,4.790,99.160,0.840,39.13,224,0.900,bicubic
resnet50.fb_swsl_ig1b_ft_in1k,95.200,4.800,99.390,0.610,25.56,224,0.875,bilinear
resnet51q.ra2_in1k,95.200,4.800,99.280,0.720,35.70,288,1.000,bilinear
vit_base_patch16_224.orig_in21k_ft_in1k,95.200,4.800,99.230,0.770,86.57,224,0.900,bicubic
coat_small.in1k,95.190,4.810,99.280,0.720,21.69,224,0.900,bicubic
focalnet_tiny_lrf.ms_in1k,95.190,4.810,99.220,0.780,28.65,224,0.900,bicubic
vit_relpos_medium_patch16_224.sw_in1k,95.190,4.810,99.220,0.780,38.75,224,0.900,bicubic
flexivit_small.1200ep_in1k,95.190,4.810,99.180,0.820,22.06,240,0.950,bicubic
poolformerv2_m48.sail_in1k,95.180,4.820,99.160,0.840,73.35,224,1.000,bicubic
crossvit_18_dagger_240.in1k,95.180,4.820,99.120,0.880,44.27,240,0.875,bicubic
regnety_160.tv2_in1k,95.160,4.840,99.250,0.750,83.59,224,0.965,bicubic
repvit_m1_5.dist_300e_in1k,95.150,4.850,99.270,0.730,14.64,224,0.950,bicubic
flexivit_small.300ep_in1k,95.150,4.850,99.150,0.850,22.06,240,0.950,bicubic
nasnetalarge.tf_in1k,95.150,4.850,99.130,0.870,88.75,331,0.911,bicubic
resnext101_32x4d.fb_ssl_yfcc100m_ft_in1k,95.140,4.860,99.300,0.700,44.18,224,0.875,bilinear
convnextv2_nano.fcmae_ft_in1k,95.140,4.860,99.220,0.780,15.62,288,1.000,bicubic
efficientnet_b3.ra2_in1k,95.140,4.860,99.210,0.790,12.23,320,1.000,bicubic
vit_relpos_base_patch16_224.sw_in1k,95.130,4.870,99.290,0.710,86.43,224,0.900,bicubic
wide_resnet50_2.racm_in1k,95.130,4.870,99.260,0.740,68.88,288,0.950,bicubic
resnet61q.ra2_in1k,95.130,4.870,99.080,0.920,36.85,288,1.000,bicubic
xcit_medium_24_p16_224.fb_in1k,95.130,4.870,98.940,1.060,84.40,224,1.000,bicubic
vit_small_r26_s32_224.augreg_in21k_ft_in1k,95.120,4.880,99.220,0.780,36.43,224,0.900,bicubic
fbnetv3_g.ra2_in1k,95.120,4.880,99.200,0.800,16.62,288,0.950,bilinear
tf_efficientnetv2_b3.in1k,95.120,4.880,99.200,0.800,14.36,300,0.904,bicubic
cs3sedarknet_l.c2ns_in1k,95.110,4.890,99.210,0.790,21.91,288,0.950,bicubic
efficientformerv2_s2.snap_dist_in1k,95.110,4.890,99.120,0.880,12.71,224,0.950,bicubic
convit_base.fb_in1k,95.100,4.900,99.150,0.850,86.54,224,0.875,bicubic
inception_next_tiny.sail_in1k,95.100,4.900,99.140,0.860,28.06,224,0.875,bicubic
poolformer_m48.sail_in1k,95.100,4.900,99.100,0.900,73.47,224,0.950,bicubic
coatnet_rmlp_nano_rw_224.sw_in1k,95.090,4.910,99.170,0.830,15.15,224,0.900,bicubic
resnet152.a1_in1k,95.090,4.910,98.990,1.010,60.19,288,1.000,bicubic
ecaresnet50t.ra2_in1k,95.080,4.920,99.290,0.710,25.57,320,0.950,bicubic
tresnet_xl.miil_in1k,95.080,4.920,99.260,0.740,78.44,224,0.875,bilinear
davit_tiny.msft_in1k,95.080,4.920,99.140,0.860,28.36,224,0.950,bicubic
crossvit_18_240.in1k,95.070,4.930,99.120,0.880,43.27,240,0.875,bicubic
coat_lite_small.in1k,95.070,4.930,99.030,0.970,19.84,224,0.900,bicubic
crossvit_base_240.in1k,95.070,4.930,98.980,1.020,105.03,240,0.875,bicubic
efficientnetv2_rw_t.ra2_in1k,95.060,4.940,99.220,0.780,13.65,288,1.000,bicubic
vit_relpos_medium_patch16_rpn_224.sw_in1k,95.060,4.940,99.200,0.800,38.73,224,0.900,bicubic
xcit_small_24_p16_224.fb_in1k,95.060,4.940,99.070,0.930,47.67,224,1.000,bicubic
xception41p.ra3_in1k,95.050,4.950,99.160,0.840,26.91,299,0.940,bicubic
poolformerv2_m36.sail_in1k,95.050,4.950,99.150,0.850,56.08,224,1.000,bicubic
coatnet_nano_rw_224.sw_in1k,95.050,4.950,99.140,0.860,15.14,224,0.900,bicubic
mobilevitv2_200.cvnets_in22k_ft_in1k,95.050,4.950,99.080,0.920,18.45,256,0.888,bicubic
focalnet_tiny_srf.ms_in1k,95.040,4.960,99.280,0.720,28.43,224,0.900,bicubic
resnet152.tv2_in1k,95.040,4.960,99.170,0.830,60.19,224,0.965,bilinear
poolformer_m36.sail_in1k,95.030,4.970,99.100,0.900,56.17,224,0.950,bicubic
swinv2_tiny_window8_256.ms_in1k,95.020,4.980,99.170,0.830,28.35,256,0.900,bicubic
gcvit_xtiny.in1k,95.020,4.980,99.160,0.840,19.98,224,0.875,bicubic
deit_base_patch16_224.fb_in1k,95.020,4.980,98.970,1.030,86.57,224,0.900,bicubic
pvt_v2_b2.in1k,95.010,4.990,99.140,0.860,25.36,224,0.900,bicubic
halo2botnet50ts_256.a1h_in1k,95.010,4.990,99.050,0.950,22.64,256,0.950,bicubic
ecaresnet101d_pruned.miil_in1k,95.000,5.000,99.230,0.770,24.88,288,0.950,bicubic
seresnext50_32x4d.racm_in1k,95.000,5.000,99.190,0.810,27.56,288,0.950,bicubic
resnext50_32x4d.a1h_in1k,94.990,5.010,99.190,0.810,25.03,288,1.000,bicubic
crossvit_15_dagger_240.in1k,94.990,5.010,99.160,0.840,28.21,240,0.875,bicubic
coatnet_bn_0_rw_224.sw_in1k,94.980,5.020,99.230,0.770,27.44,224,0.950,bicubic
visformer_small.in1k,94.970,5.030,99.210,0.790,40.22,224,0.900,bicubic
convmixer_1536_20.in1k,94.970,5.030,99.170,0.830,51.63,224,0.960,bicubic
tf_efficientnet_b3.ap_in1k,94.970,5.030,99.110,0.890,12.23,300,0.904,bicubic
resnet152.a2_in1k,94.970,5.030,99.070,0.930,60.19,288,1.000,bicubic
xcit_large_24_p16_224.fb_in1k,94.960,5.040,98.830,1.170,189.10,224,1.000,bicubic
cait_xxs24_384.fb_dist_in1k,94.950,5.050,99.130,0.870,12.03,384,1.000,bicubic
vit_srelpos_medium_patch16_224.sw_in1k,94.940,5.060,99.200,0.800,38.74,224,0.900,bicubic
resnet101.a1_in1k,94.940,5.060,99.040,0.960,44.55,288,1.000,bicubic
gernet_l.idstcv_in1k,94.930,5.070,99.200,0.800,31.08,256,0.875,bilinear
resnetv2_50d_evos.ah_in1k,94.920,5.080,99.180,0.820,25.59,288,1.000,bicubic
swin_s3_tiny_224.ms_in1k,94.920,5.080,99.170,0.830,28.33,224,0.900,bicubic
convit_small.fb_in1k,94.920,5.080,99.100,0.900,27.78,224,0.875,bicubic
nest_tiny_jx.goog_in1k,94.920,5.080,99.100,0.900,17.06,224,0.875,bicubic
tf_efficientnet_b3.aa_in1k,94.910,5.090,99.110,0.890,12.23,300,0.904,bicubic
xcit_tiny_24_p8_224.fb_in1k,94.900,5.100,99.190,0.810,12.11,224,1.000,bicubic
tresnet_l.miil_in1k,94.900,5.100,99.030,0.970,55.99,224,0.875,bilinear
coatnet_0_rw_224.sw_in1k,94.900,5.100,99.020,0.980,27.44,224,0.950,bicubic
vit_small_patch16_224.augreg_in21k_ft_in1k,94.890,5.110,99.270,0.730,22.05,224,0.900,bicubic
ecaresnet50t.a1_in1k,94.890,5.110,99.070,0.930,25.57,288,1.000,bicubic
resnet101.a2_in1k,94.890,5.110,99.060,0.940,44.55,288,1.000,bicubic
mixer_b16_224.miil_in21k_ft_in1k,94.880,5.120,99.080,0.920,59.88,224,0.875,bilinear
regnety_032.tv2_in1k,94.870,5.130,99.230,0.770,19.44,224,0.965,bicubic
convnext_nano.d1h_in1k,94.870,5.130,99.140,0.860,15.59,288,1.000,bicubic
tf_efficientnet_lite4.in1k,94.870,5.130,99.100,0.900,13.01,380,0.920,bilinear
tf_efficientnet_b1.ns_jft_in1k,94.860,5.140,99.250,0.750,7.79,240,0.882,bicubic
coatnext_nano_rw_224.sw_in1k,94.850,5.150,99.200,0.800,14.70,224,0.900,bicubic
resnetaa50.a1h_in1k,94.850,5.150,99.120,0.880,25.56,288,1.000,bicubic
efficientvit_b2.r224_in1k,94.850,5.150,98.970,1.030,24.33,224,0.950,bicubic
resnet101.tv2_in1k,94.840,5.160,99.030,0.970,44.55,224,0.965,bilinear
edgenext_small.usi_in1k,94.830,5.170,99.410,0.590,5.59,320,1.000,bicubic
vit_base_patch16_rpn_224.sw_in1k,94.820,5.180,99.090,0.910,86.54,224,0.900,bicubic
xcit_small_12_p16_224.fb_in1k,94.820,5.180,99.060,0.940,26.25,224,1.000,bicubic
wide_resnet50_2.tv2_in1k,94.810,5.190,99.260,0.740,68.88,224,0.965,bilinear
resnet50d.ra2_in1k,94.810,5.190,99.230,0.770,25.58,288,0.950,bicubic
lamhalobotnet50ts_256.a1h_in1k,94.810,5.190,98.980,1.020,22.57,256,0.950,bicubic
pit_b_224.in1k,94.810,5.190,98.820,1.180,73.76,224,0.900,bicubic
swin_tiny_patch4_window7_224.ms_in22k_ft_in1k,94.800,5.200,99.290,0.710,28.29,224,0.900,bicubic
cs3darknet_focus_l.c2ns_in1k,94.790,5.210,99.160,0.840,21.15,288,0.950,bicubic
gcresnet50t.ra2_in1k,94.780,5.220,99.120,0.880,25.90,288,1.000,bicubic
mobilevitv2_175.cvnets_in22k_ft_in1k,94.780,5.220,99.090,0.910,14.25,256,0.888,bicubic
swinv2_cr_tiny_ns_224.sw_in1k,94.770,5.230,99.110,0.890,28.33,224,0.900,bicubic
twins_svt_small.in1k,94.760,5.240,99.090,0.910,24.06,224,0.900,bicubic
coat_mini.in1k,94.760,5.240,98.950,1.050,10.34,224,0.900,bicubic
vit_base_patch32_clip_224.laion2b_ft_in1k,94.750,5.250,99.070,0.930,88.22,224,0.900,bicubic
resnetv2_50x1_bit.goog_in21k_ft_in1k,94.740,5.260,99.180,0.820,25.55,448,1.000,bilinear
seresnet50.ra2_in1k,94.740,5.260,99.110,0.890,28.09,288,0.950,bicubic
legacy_senet154.in1k,94.730,5.270,99.100,0.900,115.09,224,0.875,bilinear
regnetx_080.tv2_in1k,94.730,5.270,99.030,0.970,39.57,224,0.965,bicubic
repvit_m3.dist_in1k,94.720,5.280,99.060,0.940,10.68,224,0.950,bicubic
halonet50ts.a1h_in1k,94.720,5.280,98.830,1.170,22.73,256,0.940,bicubic
resnet152s.gluon_in1k,94.710,5.290,99.060,0.940,60.32,224,0.875,bicubic
resnext50_32x4d.fb_ssl_yfcc100m_ft_in1k,94.700,5.300,99.240,0.760,25.03,224,0.875,bilinear
poolformerv2_s36.sail_in1k,94.700,5.300,99.230,0.770,30.79,224,1.000,bicubic
resnest50d_4s2x40d.in1k,94.700,5.300,99.130,0.870,30.42,224,0.875,bicubic
crossvit_15_240.in1k,94.700,5.300,99.080,0.920,27.53,240,0.875,bicubic
senet154.gluon_in1k,94.700,5.300,98.970,1.030,115.09,224,0.875,bicubic
xcit_tiny_12_p8_224.fb_dist_in1k,94.690,5.310,99.180,0.820,6.71,224,1.000,bicubic
pit_s_distilled_224.in1k,94.690,5.310,99.150,0.850,24.04,224,0.900,bicubic
deit3_small_patch16_224.fb_in1k,94.690,5.310,98.760,1.240,22.06,224,0.900,bicubic
vit_relpos_small_patch16_224.sw_in1k,94.680,5.320,99.110,0.890,21.98,224,0.900,bicubic
fastvit_sa12.apple_dist_in1k,94.680,5.320,99.100,0.900,11.58,256,0.900,bicubic
resnetv2_50.a1h_in1k,94.680,5.320,99.090,0.910,25.55,288,1.000,bicubic
mobilevitv2_150.cvnets_in22k_ft_in1k,94.680,5.320,98.920,1.080,10.59,256,0.888,bicubic
ecaresnet50d.miil_in1k,94.670,5.330,99.260,0.740,25.58,288,0.950,bicubic
cs3darknet_l.c2ns_in1k,94.670,5.330,99.230,0.770,21.16,288,0.950,bicubic
efficientnet_el.ra_in1k,94.670,5.330,99.130,0.870,10.59,300,0.904,bicubic
regnetz_b16.ra3_in1k,94.670,5.330,99.130,0.870,9.72,288,1.000,bicubic
rexnet_200.nav_in1k,94.670,5.330,99.090,0.910,16.37,224,0.875,bicubic
tresnet_m.miil_in1k_448,94.660,5.340,99.150,0.850,31.39,448,0.875,bilinear
seresnext101_64x4d.gluon_in1k,94.650,5.350,98.970,1.030,88.23,224,0.875,bicubic
tiny_vit_5m_224.dist_in22k_ft_in1k,94.630,5.370,99.140,0.860,5.39,224,0.950,bicubic
resnet50_gn.a1h_in1k,94.620,5.380,99.150,0.850,25.56,288,0.950,bicubic
swin_tiny_patch4_window7_224.ms_in1k,94.620,5.380,99.120,0.880,28.29,224,0.900,bicubic
poolformer_s36.sail_in1k,94.620,5.380,99.050,0.950,30.86,224,0.900,bicubic
vit_small_patch16_384.augreg_in1k,94.610,5.390,99.140,0.860,22.20,384,1.000,bicubic
twins_pcpvt_small.in1k,94.600,5.400,99.150,0.850,24.11,224,0.900,bicubic
deit_small_distilled_patch16_224.fb_in1k,94.600,5.400,99.100,0.900,22.44,224,0.900,bicubic
resnet50.tv2_in1k,94.600,5.400,99.090,0.910,25.56,224,0.965,bilinear
efficientnet_b3_pruned.in1k,94.600,5.400,99.070,0.930,9.86,300,0.904,bicubic
resnest50d.in1k,94.600,5.400,99.030,0.970,27.48,224,0.875,bilinear
vit_small_patch32_384.augreg_in21k_ft_in1k,94.590,5.410,99.140,0.860,22.92,384,1.000,bicubic
pit_s_224.in1k,94.590,5.410,98.930,1.070,23.46,224,0.900,bicubic
crossvit_small_240.in1k,94.580,5.420,99.120,0.880,26.86,240,0.875,bicubic
convnext_nano_ols.d1h_in1k,94.580,5.420,99.050,0.950,15.65,288,1.000,bicubic
ecaresnet50t.a2_in1k,94.570,5.430,99.040,0.960,25.57,288,1.000,bicubic
repvgg_b3.rvgg_in1k,94.570,5.430,98.910,1.090,123.09,224,0.875,bilinear
lambda_resnet50ts.a1h_in1k,94.570,5.430,98.650,1.350,21.54,256,0.950,bicubic
tnt_s_patch16_224,94.560,5.440,99.170,0.830,23.76,224,0.900,bicubic
resmlp_36_224.fb_distilled_in1k,94.560,5.440,99.160,0.840,44.69,224,0.875,bicubic
convnextv2_pico.fcmae_ft_in1k,94.560,5.440,99.140,0.860,9.07,288,0.950,bicubic
vit_srelpos_small_patch16_224.sw_in1k,94.550,5.450,99.150,0.850,21.97,224,0.900,bicubic
repvit_m1_1.dist_450e_in1k,94.550,5.450,99.090,0.910,8.80,224,0.950,bicubic
gernet_m.idstcv_in1k,94.540,5.460,98.920,1.080,21.14,224,0.875,bilinear
ecaresnetlight.miil_in1k,94.530,5.470,99.180,0.820,30.16,288,0.950,bicubic
regnety_320.pycls_in1k,94.520,5.480,99.170,0.830,145.05,224,0.875,bicubic
xcit_tiny_12_p16_384.fb_dist_in1k,94.520,5.480,99.170,0.830,6.72,384,1.000,bicubic
res2net101d.in1k,94.520,5.480,98.980,1.020,45.23,224,0.875,bilinear
mobilevitv2_200.cvnets_in1k,94.520,5.480,98.970,1.030,18.45,256,0.888,bicubic
haloregnetz_b.ra3_in1k,94.520,5.480,98.960,1.040,11.68,224,0.940,bicubic
regnetx_032.tv2_in1k,94.520,5.480,98.910,1.090,15.30,224,0.965,bicubic
resnet50.c1_in1k,94.510,5.490,99.070,0.930,25.56,288,1.000,bicubic
resnet50.b1k_in1k,94.510,5.490,99.000,1.000,25.56,288,1.000,bicubic
sehalonet33ts.ra2_in1k,94.510,5.490,98.760,1.240,13.69,256,0.940,bicubic
repvgg_b3g4.rvgg_in1k,94.500,5.500,99.020,0.980,83.83,224,0.875,bilinear
gcresnext50ts.ch_in1k,94.490,5.510,99.010,0.990,15.67,288,1.000,bicubic
ese_vovnet39b.ra_in1k,94.480,5.520,99.060,0.940,24.57,288,0.950,bicubic
poolformerv2_s24.sail_in1k,94.470,5.530,99.010,0.990,21.34,224,1.000,bicubic
resnet50.d_in1k,94.470,5.530,99.000,1.000,25.56,288,1.000,bicubic
resnext50_32x4d.tv2_in1k,94.460,5.540,99.030,0.970,25.03,224,0.965,bilinear
resnet50d.a2_in1k,94.460,5.540,98.900,1.100,25.58,288,1.000,bicubic
eva02_tiny_patch14_336.mim_in22k_ft_in1k,94.450,5.550,99.100,0.900,5.76,336,1.000,bicubic
seresnet50.a2_in1k,94.450,5.550,98.890,1.110,28.09,288,1.000,bicubic
vit_base_patch32_clip_224.openai_ft_in1k,94.440,5.560,99.180,0.820,88.22,224,0.900,bicubic
convmixer_768_32.in1k,94.440,5.560,99.110,0.890,21.11,224,0.960,bicubic
vit_base_patch16_384.augreg_in1k,94.440,5.560,99.030,0.970,86.86,384,1.000,bicubic
resnet152.a3_in1k,94.440,5.560,98.880,1.120,60.19,224,0.950,bicubic
seresnext101_32x4d.gluon_in1k,94.430,5.570,99.090,0.910,48.96,224,0.875,bicubic
fastvit_sa12.apple_in1k,94.430,5.570,99.030,0.970,11.58,256,0.900,bicubic
resnet152d.gluon_in1k,94.430,5.570,99.000,1.000,60.21,224,0.875,bicubic
regnety_016.tv2_in1k,94.410,5.590,99.040,0.960,11.20,224,0.965,bicubic
levit_256.fb_dist_in1k,94.400,5.600,99.060,0.940,18.89,224,0.900,bicubic
levit_conv_256.fb_dist_in1k,94.400,5.600,99.060,0.940,18.89,224,0.900,bicubic
vit_base_patch32_224.augreg_in21k_ft_in1k,94.400,5.600,99.060,0.940,88.22,224,0.900,bicubic
resnext50d_32x4d.bt_in1k,94.400,5.600,99.050,0.950,25.05,288,0.950,bicubic
repvit_m2.dist_in1k,94.400,5.600,99.040,0.960,8.80,224,0.950,bicubic
resnet50d.a1_in1k,94.400,5.600,98.790,1.210,25.58,288,1.000,bicubic
poolformer_s24.sail_in1k,94.390,5.610,99.060,0.940,21.39,224,0.900,bicubic
nf_resnet50.ra2_in1k,94.380,5.620,99.070,0.930,25.56,288,0.940,bicubic
resnest50d_1s4x24d.in1k,94.380,5.620,99.070,0.930,25.68,224,0.875,bicubic
inception_v4.tf_in1k,94.380,5.620,98.820,1.180,42.68,299,0.875,bicubic
resnext50_32x4d.a1_in1k,94.380,5.620,98.780,1.220,25.03,288,1.000,bicubic
darknet53.c2ns_in1k,94.360,5.640,99.050,0.950,41.61,288,1.000,bicubic
efficientnet_b2.ra_in1k,94.360,5.640,99.050,0.950,9.11,288,1.000,bicubic
edgenext_small_rw.sw_in1k,94.360,5.640,99.040,0.960,7.83,320,1.000,bicubic
inception_resnet_v2.tf_in1k,94.360,5.640,98.800,1.200,55.84,299,0.897,bicubic
tf_efficientnet_el.in1k,94.350,5.650,99.100,0.900,10.59,300,0.904,bicubic
xcit_tiny_12_p8_224.fb_in1k,94.350,5.650,99.070,0.930,6.71,224,1.000,bicubic
gcresnet33ts.ra2_in1k,94.350,5.650,98.960,1.040,19.88,288,1.000,bicubic
resnext101_64x4d.gluon_in1k,94.350,5.650,98.880,1.120,83.46,224,0.875,bicubic
resmlp_24_224.fb_distilled_in1k,94.330,5.670,99.090,0.910,30.02,224,0.875,bicubic
resnext50_32x4d.ra_in1k,94.330,5.670,99.030,0.970,25.03,288,0.950,bicubic
resnet50.fb_ssl_yfcc100m_ft_in1k,94.310,5.690,99.150,0.850,25.56,224,0.875,bilinear
sebotnet33ts_256.a1h_in1k,94.310,5.690,98.600,1.400,13.70,256,0.940,bicubic
ecaresnet50d_pruned.miil_in1k,94.300,5.700,99.200,0.800,19.94,288,0.950,bicubic
resnet50.b2k_in1k,94.300,5.700,98.930,1.070,25.56,288,1.000,bicubic
tf_efficientnet_b3.in1k,94.290,5.710,99.100,0.900,12.23,300,0.904,bicubic
rexnet_150.nav_in1k,94.280,5.720,99.090,0.910,9.73,224,0.875,bicubic
fastvit_s12.apple_dist_in1k,94.280,5.720,98.980,1.020,9.47,256,0.900,bicubic
res2net50d.in1k,94.280,5.720,98.860,1.140,25.72,224,0.875,bilinear
repvit_m1_0.dist_450e_in1k,94.270,5.730,99.040,0.960,7.30,224,0.950,bicubic
resnet50.c2_in1k,94.270,5.730,99.040,0.960,25.56,288,1.000,bicubic
tf_efficientnet_b2.ap_in1k,94.270,5.730,98.950,1.050,9.11,260,0.890,bicubic
resmlp_big_24_224.fb_in1k,94.270,5.730,98.820,1.180,129.14,224,0.875,bicubic
regnetx_120.pycls_in1k,94.260,5.740,99.170,0.830,46.11,224,0.875,bicubic
seresnet33ts.ra2_in1k,94.260,5.740,99.000,1.000,19.78,288,1.000,bicubic
eca_resnet33ts.ra2_in1k,94.250,5.750,99.030,0.970,19.68,288,1.000,bicubic
cspresnext50.ra_in1k,94.240,5.760,99.050,0.950,20.57,256,0.887,bilinear
xcit_tiny_24_p16_224.fb_dist_in1k,94.240,5.760,98.960,1.040,12.12,224,1.000,bicubic
regnetx_320.pycls_in1k,94.230,5.770,99.050,0.950,107.81,224,0.875,bicubic
efficientvit_b1.r288_in1k,94.230,5.770,98.950,1.050,9.10,288,1.000,bicubic
mobilevitv2_175.cvnets_in1k,94.230,5.770,98.930,1.070,14.25,256,0.888,bicubic
mixnet_xl.ra_in1k,94.230,5.770,98.820,1.180,11.90,224,0.875,bicubic
tf_efficientnet_b2.aa_in1k,94.220,5.780,99.040,0.960,9.11,260,0.890,bicubic
resnet50.a1_in1k,94.220,5.780,98.930,1.070,25.56,288,1.000,bicubic
resnext50_32x4d.a2_in1k,94.220,5.780,98.750,1.250,25.03,288,1.000,bicubic
maxvit_rmlp_pico_rw_256.sw_in1k,94.210,5.790,99.000,1.000,7.52,256,0.950,bicubic
darknetaa53.c2ns_in1k,94.210,5.790,98.950,1.050,36.02,288,1.000,bilinear
resnet50.a1h_in1k,94.200,5.800,98.920,1.080,25.56,224,1.000,bicubic
repvit_m1_1.dist_300e_in1k,94.180,5.820,99.080,0.920,8.80,224,0.950,bicubic
resnet101s.gluon_in1k,94.180,5.820,99.010,0.990,44.67,224,0.875,bicubic
resnet101d.gluon_in1k,94.180,5.820,98.940,1.060,44.57,224,0.875,bicubic
resnetblur50.bt_in1k,94.170,5.830,99.010,0.990,25.56,288,0.950,bicubic
seresnext50_32x4d.gluon_in1k,94.170,5.830,98.920,1.080,27.56,224,0.875,bicubic
seresnet50.a1_in1k,94.160,5.840,98.850,1.150,28.09,288,1.000,bicubic
dpn92.mx_in1k,94.150,5.850,98.950,1.050,37.67,224,0.875,bicubic
regnety_064.pycls_in1k,94.130,5.870,99.030,0.970,30.58,224,0.875,bicubic
regnety_160.pycls_in1k,94.130,5.870,99.020,0.980,83.59,224,0.875,bicubic
resnext101_32x4d.gluon_in1k,94.130,5.870,98.940,1.060,44.18,224,0.875,bicubic
legacy_seresnext101_32x4d.in1k,94.120,5.880,98.970,1.030,48.96,224,0.875,bilinear
resnet50.a2_in1k,94.120,5.880,98.850,1.150,25.56,288,1.000,bicubic
inception_resnet_v2.tf_ens_adv_in1k,94.120,5.880,98.790,1.210,55.84,299,0.897,bicubic
cspdarknet53.ra_in1k,94.100,5.900,98.980,1.020,27.64,256,0.887,bilinear
fastvit_t12.apple_dist_in1k,94.100,5.900,98.950,1.050,7.55,256,0.900,bicubic
efficientnet_el_pruned.in1k,94.090,5.910,99.020,0.980,10.59,300,0.904,bicubic
tf_efficientnet_lite3.in1k,94.090,5.910,98.960,1.040,8.20,300,0.904,bilinear
resnet50.ra_in1k,94.090,5.910,98.840,1.160,25.56,288,0.950,bicubic
tresnet_m.miil_in1k,94.080,5.920,98.830,1.170,31.39,224,0.875,bilinear
tf_efficientnetv2_b2.in1k,94.060,5.940,98.930,1.070,10.10,260,0.890,bicubic
gcvit_xxtiny.in1k,94.050,5.950,99.080,0.920,12.00,224,0.875,bicubic
mobilevitv2_150.cvnets_in1k,94.050,5.950,98.900,1.100,10.59,256,0.888,bicubic
convnext_pico.d1_in1k,94.030,5.970,99.010,0.990,9.05,288,0.950,bicubic
resnetrs50.tf_in1k,94.030,5.970,98.850,1.150,35.69,224,0.910,bicubic
resnet152.gluon_in1k,94.030,5.970,98.740,1.260,60.19,224,0.875,bicubic
regnetx_016.tv2_in1k,94.020,5.980,98.930,1.070,9.19,224,0.965,bicubic
hrnet_w48.ms_in1k,94.010,5.990,99.030,0.970,77.47,224,0.875,bilinear
regnety_120.pycls_in1k,94.010,5.990,99.030,0.970,51.82,224,0.875,bicubic
convnext_pico_ols.d1_in1k,94.010,5.990,98.930,1.070,9.06,288,1.000,bicubic
dpn107.mx_in1k,94.010,5.990,98.820,1.180,86.92,224,0.875,bicubic
resnet50.ram_in1k,94.000,6.000,98.880,1.120,25.56,288,0.950,bicubic
dla102x2.in1k,93.990,6.010,99.040,0.960,41.28,224,0.875,bilinear
deit_small_patch16_224.fb_in1k,93.990,6.010,98.960,1.040,22.05,224,0.900,bicubic
skresnext50_32x4d.ra_in1k,93.970,6.030,98.830,1.170,27.48,224,0.875,bicubic
resnet50.bt_in1k,93.960,6.040,98.930,1.070,25.56,288,0.950,bicubic
efficientformer_l1.snap_dist_in1k,93.940,6.060,99.030,0.970,12.29,224,0.950,bicubic
ecaresnet26t.ra2_in1k,93.940,6.060,98.930,1.070,16.01,320,0.950,bicubic
dpn98.mx_in1k,93.930,6.070,98.910,1.090,61.57,224,0.875,bicubic
resnet33ts.ra2_in1k,93.930,6.070,98.880,1.120,19.68,288,1.000,bicubic
xception71.tf_in1k,93.920,6.080,98.950,1.050,42.34,299,0.903,bicubic
regnetx_160.pycls_in1k,93.910,6.090,99.090,0.910,54.28,224,0.875,bicubic
cait_xxs36_224.fb_dist_in1k,93.910,6.090,98.890,1.110,17.30,224,1.000,bicubic
vit_base_patch16_224.sam_in1k,93.890,6.110,98.890,1.110,86.57,224,0.900,bicubic
nf_regnet_b1.ra2_in1k,93.890,6.110,98.740,1.260,10.22,288,0.900,bicubic
regnety_080.pycls_in1k,93.880,6.120,99.000,1.000,39.18,224,0.875,bicubic
fbnetv3_d.ra2_in1k,93.870,6.130,98.890,1.110,10.31,256,0.950,bilinear
cspresnet50.ra_in1k,93.870,6.130,98.870,1.130,21.62,256,0.887,bilinear
resnet152c.gluon_in1k,93.870,6.130,98.800,1.200,60.21,224,0.875,bicubic
ecaresnet50t.a3_in1k,93.860,6.140,98.850,1.150,25.57,224,0.950,bicubic
resnet101.a3_in1k,93.860,6.140,98.760,1.240,44.55,224,0.950,bicubic
xcit_tiny_24_p16_224.fb_in1k,93.850,6.150,98.750,1.250,12.12,224,1.000,bicubic
efficientformerv2_s1.snap_dist_in1k,93.840,6.160,98.890,1.110,6.19,224,0.950,bicubic
hrnet_w64.ms_in1k,93.830,6.170,98.940,1.060,128.06,224,0.875,bilinear
repvgg_b2g4.rvgg_in1k,93.830,6.170,98.930,1.070,61.76,224,0.875,bilinear
efficientnet_b2_pruned.in1k,93.800,6.200,98.910,1.090,8.31,260,0.890,bicubic
dla169.in1k,93.800,6.200,98.860,1.140,53.39,224,0.875,bilinear
tiny_vit_5m_224.in1k,93.790,6.210,98.940,1.060,5.39,224,0.950,bicubic
regnetx_080.pycls_in1k,93.790,6.210,98.910,1.090,39.57,224,0.875,bicubic
dpn68b.ra_in1k,93.790,6.210,98.540,1.460,12.61,288,1.000,bicubic
resnext101_32x8d.tv_in1k,93.780,6.220,98.960,1.040,88.79,224,0.875,bilinear
dpn131.mx_in1k,93.780,6.220,98.850,1.150,79.25,224,0.875,bicubic
repvit_m1_0.dist_300e_in1k,93.760,6.240,98.920,1.080,7.30,224,0.950,bicubic
resnet101.gluon_in1k,93.760,6.240,98.700,1.300,44.55,224,0.875,bicubic
tf_efficientnet_b0.ns_jft_in1k,93.750,6.250,98.970,1.030,5.29,224,0.875,bicubic
convnextv2_femto.fcmae_ft_in1k,93.750,6.250,98.930,1.070,5.23,288,0.950,bicubic
xception65.tf_in1k,93.750,6.250,98.870,1.130,39.92,299,0.903,bicubic
efficientnet_em.ra2_in1k,93.730,6.270,98.930,1.070,6.90,240,0.882,bicubic
hrnet_w40.ms_in1k,93.730,6.270,98.800,1.200,57.56,224,0.875,bilinear
wide_resnet101_2.tv_in1k,93.720,6.280,98.810,1.190,126.89,224,0.875,bilinear
tf_efficientnet_b1.aa_in1k,93.720,6.280,98.800,1.200,7.79,240,0.882,bicubic
tf_efficientnet_b2.in1k,93.710,6.290,98.930,1.070,9.11,260,0.890,bicubic
tf_efficientnetv2_b1.in1k,93.710,6.290,98.820,1.180,8.14,240,0.882,bicubic
efficientvit_b1.r256_in1k,93.710,6.290,98.810,1.190,9.10,256,1.000,bicubic
levit_192.fb_dist_in1k,93.710,6.290,98.790,1.210,10.95,224,0.900,bicubic
levit_conv_192.fb_dist_in1k,93.710,6.290,98.790,1.210,10.95,224,0.900,bicubic
resnet101c.gluon_in1k,93.700,6.300,98.760,1.240,44.57,224,0.875,bicubic
fastvit_s12.apple_in1k,93.700,6.300,98.720,1.280,9.47,256,0.900,bicubic
regnetx_040.pycls_in1k,93.690,6.310,98.930,1.070,22.12,224,0.875,bicubic
rexnet_130.nav_in1k,93.680,6.320,98.700,1.300,7.56,224,0.875,bicubic
resnext50_32x4d.gluon_in1k,93.670,6.330,98.690,1.310,25.03,224,0.875,bicubic
resmlp_36_224.fb_in1k,93.650,6.350,98.950,1.050,44.69,224,0.875,bicubic
mobileone_s4.apple_in1k,93.650,6.350,98.650,1.350,14.95,224,0.900,bilinear
regnetx_064.pycls_in1k,93.640,6.360,99.040,0.960,26.21,224,0.875,bicubic
resnet50.am_in1k,93.640,6.360,98.870,1.130,25.56,224,0.875,bicubic
regnety_040.pycls_in1k,93.630,6.370,98.950,1.050,20.65,224,0.875,bicubic
fbnetv3_b.ra2_in1k,93.630,6.370,98.910,1.090,8.60,256,0.950,bilinear
tf_efficientnet_b1.ap_in1k,93.630,6.370,98.800,1.200,7.79,240,0.882,bicubic
hrnet_w44.ms_in1k,93.620,6.380,98.960,1.040,67.06,224,0.875,bilinear
legacy_xception.tf_in1k,93.620,6.380,98.770,1.230,22.86,299,0.897,bicubic
resnet34d.ra2_in1k,93.600,6.400,98.760,1.240,21.82,288,0.950,bicubic
res2net50_26w_6s.in1k,93.590,6.410,98.740,1.260,37.05,224,0.875,bilinear
resnet32ts.ra2_in1k,93.590,6.410,98.740,1.260,17.96,288,1.000,bicubic
halonet26t.a1h_in1k,93.590,6.410,98.630,1.370,12.48,256,0.950,bicubic
repvgg_b2.rvgg_in1k,93.580,6.420,99.070,0.930,89.02,224,0.875,bilinear
dla60_res2next.in1k,93.580,6.420,98.790,1.210,17.03,224,0.875,bilinear
tf_efficientnet_cc_b1_8e.in1k,93.580,6.420,98.690,1.310,39.72,240,0.882,bicubic
resnet50s.gluon_in1k,93.570,6.430,98.840,1.160,25.68,224,0.875,bicubic
inception_v3.gluon_in1k,93.560,6.440,98.840,1.160,23.83,299,0.875,bicubic
resnext50_32x4d.a3_in1k,93.550,6.450,98.820,1.180,25.03,224,0.950,bicubic
eca_halonext26ts.c1_in1k,93.550,6.450,98.680,1.320,10.76,256,0.940,bicubic
repghostnet_200.in1k,93.550,6.450,98.600,1.400,9.80,224,0.875,bicubic
dla102x.in1k,93.540,6.460,98.850,1.150,26.31,224,0.875,bilinear
resnet50d.gluon_in1k,93.540,6.460,98.710,1.290,25.58,224,0.875,bicubic
res2net101_26w_4s.in1k,93.530,6.470,98.600,1.400,45.21,224,0.875,bilinear
convnext_tiny.fb_in22k_ft_in1k,93.530,6.470,98.570,1.430,28.59,288,1.000,bicubic
coat_tiny.in1k,93.510,6.490,98.680,1.320,5.50,224,0.900,bicubic
selecsls60b.in1k,93.500,6.500,98.840,1.160,32.77,224,0.875,bicubic
gmlp_s16_224.ra3_in1k,93.500,6.500,98.780,1.220,19.42,224,0.875,bicubic
pvt_v2_b1.in1k,93.490,6.510,98.860,1.140,14.01,224,0.900,bicubic
fastvit_t12.apple_in1k,93.490,6.510,98.710,1.290,7.55,256,0.900,bicubic
hrnet_w18.ms_aug_in1k,93.480,6.520,98.980,1.020,21.30,224,0.950,bilinear
mobilevitv2_125.cvnets_in1k,93.480,6.520,98.840,1.160,7.48,256,0.888,bicubic
xception41.tf_in1k,93.480,6.520,98.750,1.250,26.97,299,0.903,bicubic
coat_lite_mini.in1k,93.470,6.530,98.770,1.230,11.01,224,0.900,bicubic
regnety_032.pycls_in1k,93.460,6.540,98.950,1.050,19.44,224,0.875,bicubic
wide_resnet50_2.tv_in1k,93.460,6.540,98.950,1.050,68.88,224,0.875,bilinear
repvit_m0_9.dist_300e_in1k,93.460,6.540,98.820,1.180,5.49,224,0.950,bicubic
legacy_seresnext50_32x4d.in1k,93.450,6.550,98.800,1.200,27.56,224,0.875,bilinear
cait_xxs24_224.fb_dist_in1k,93.450,6.550,98.780,1.220,11.96,224,1.000,bicubic
vit_small_patch16_224.augreg_in1k,93.450,6.550,98.780,1.220,22.05,224,0.900,bicubic
repvit_m0_9.dist_450e_in1k,93.440,6.560,98.910,1.090,5.49,224,0.950,bicubic
convnext_femto.d1_in1k,93.440,6.560,98.820,1.180,5.22,288,0.950,bicubic
repvgg_b1.rvgg_in1k,93.440,6.560,98.790,1.210,57.42,224,0.875,bilinear
botnet26t_256.c1_in1k,93.440,6.560,98.650,1.350,12.49,256,0.950,bicubic
lambda_resnet26rpt_256.c1_in1k,93.430,6.570,98.880,1.120,10.99,256,0.940,bicubic
lambda_resnet26t.c1_in1k,93.430,6.570,98.730,1.270,10.96,256,0.940,bicubic
vit_tiny_patch16_384.augreg_in21k_ft_in1k,93.420,6.580,98.830,1.170,5.79,384,1.000,bicubic
resmlp_24_224.fb_in1k,93.420,6.580,98.810,1.190,30.02,224,0.875,bicubic
legacy_seresnet152.in1k,93.410,6.590,98.850,1.150,66.82,224,0.875,bilinear
hrnet_w30.ms_in1k,93.410,6.590,98.830,1.170,37.71,224,0.875,bilinear
resnet50d.a3_in1k,93.410,6.590,98.750,1.250,25.58,224,0.950,bicubic
res2net50_26w_8s.in1k,93.410,6.590,98.690,1.310,48.40,224,0.875,bilinear
convnext_femto_ols.d1_in1k,93.390,6.610,98.910,1.090,5.23,288,0.950,bicubic
repvit_m1.dist_in1k,93.380,6.620,98.650,1.350,5.49,224,0.950,bicubic
dla60_res2net.in1k,93.370,6.630,98.840,1.160,20.85,224,0.875,bilinear
xcit_tiny_12_p16_224.fb_dist_in1k,93.350,6.650,98.760,1.240,6.72,224,1.000,bicubic
eca_botnext26ts_256.c1_in1k,93.350,6.650,98.690,1.310,10.59,256,0.950,bicubic
seresnext26t_32x4d.bt_in1k,93.350,6.650,98.690,1.310,16.81,288,0.950,bicubic
vit_base_patch16_224.augreg_in1k,93.350,6.650,98.660,1.340,86.57,224,0.900,bicubic
efficientvit_b1.r224_in1k,93.330,6.670,98.570,1.430,9.10,224,0.950,bicubic
xcit_nano_12_p8_384.fb_dist_in1k,93.300,6.700,98.860,1.140,3.05,384,1.000,bicubic
pit_xs_distilled_224.in1k,93.290,6.710,98.790,1.210,11.00,224,0.900,bicubic
cs3darknet_m.c2ns_in1k,93.280,6.720,98.720,1.280,9.31,288,0.950,bicubic
dla102.in1k,93.270,6.730,98.790,1.210,33.27,224,0.875,bilinear
legacy_seresnet101.in1k,93.270,6.730,98.740,1.260,49.33,224,0.875,bilinear
resnet152.tv_in1k,93.240,6.760,98.750,1.250,60.19,224,0.875,bilinear
regnetx_032.pycls_in1k,93.240,6.760,98.720,1.280,15.30,224,0.875,bicubic
mixnet_l.ft_in1k,93.240,6.760,98.700,1.300,7.33,224,0.875,bicubic
resnest26d.gluon_in1k,93.220,6.780,98.850,1.150,17.07,224,0.875,bilinear
dla60x.in1k,93.190,6.810,98.720,1.280,17.35,224,0.875,bilinear
tf_efficientnet_em.in1k,93.190,6.810,98.660,1.340,6.90,240,0.882,bicubic
inception_v3.tf_in1k,93.190,6.810,98.490,1.510,23.83,299,0.875,bicubic
res2net50_26w_4s.in1k,93.180,6.820,98.660,1.340,25.70,224,0.875,bilinear
vit_base_patch32_384.augreg_in1k,93.160,6.840,98.610,1.390,88.30,384,1.000,bicubic
mobilevit_s.cvnets_in1k,93.150,6.850,98.780,1.220,5.58,256,0.900,bicubic
regnety_008_tv.tv2_in1k,93.150,6.850,98.680,1.320,6.43,224,0.965,bicubic
res2next50.in1k,93.150,6.850,98.640,1.360,24.67,224,0.875,bilinear
mobilevitv2_100.cvnets_in1k,93.140,6.860,98.760,1.240,4.90,256,0.888,bicubic
vit_relpos_base_patch32_plus_rpn_256.sw_in1k,93.140,6.860,98.310,1.690,119.42,256,0.900,bicubic
bat_resnext26ts.ch_in1k,93.120,6.880,98.730,1.270,10.73,256,0.900,bicubic
cs3darknet_focus_m.c2ns_in1k,93.100,6.900,98.750,1.250,9.30,288,0.950,bicubic
ghostnetv2_160.in1k,93.090,6.910,98.740,1.260,12.39,224,0.875,bicubic
seresnext26d_32x4d.bt_in1k,93.060,6.940,98.710,1.290,16.81,288,0.950,bicubic
tf_efficientnetv2_b0.in1k,93.060,6.940,98.700,1.300,7.14,224,0.875,bicubic
repvgg_b1g4.rvgg_in1k,93.030,6.970,98.820,1.180,39.97,224,0.875,bilinear
levit_128.fb_dist_in1k,93.030,6.970,98.710,1.290,9.21,224,0.900,bicubic
levit_conv_128.fb_dist_in1k,93.030,6.970,98.700,1.300,9.21,224,0.900,bicubic
res2net50_14w_8s.in1k,93.030,6.970,98.700,1.300,25.06,224,0.875,bilinear
densenetblur121d.ra_in1k,93.030,6.970,98.600,1.400,8.00,288,0.950,bicubic
tf_mixnet_l.in1k,93.030,6.970,98.530,1.470,7.33,224,0.875,bicubic
efficientnet_b1.ft_in1k,93.020,6.980,98.710,1.290,7.79,256,1.000,bicubic
selecsls60.in1k,93.010,6.990,98.820,1.180,30.67,224,0.875,bicubic
regnety_016.pycls_in1k,93.010,6.990,98.670,1.330,11.20,224,0.875,bicubic
inception_v3.tf_adv_in1k,93.010,6.990,98.490,1.510,23.83,299,0.875,bicubic
hrnet_w18_small_v2.gluon_in1k,93.000,7.000,98.760,1.240,15.60,224,0.875,bicubic
resnet34.a1_in1k,93.000,7.000,98.630,1.370,21.80,288,1.000,bicubic
visformer_tiny.in1k,92.980,7.020,98.730,1.270,10.32,224,0.900,bicubic
convnext_atto_ols.a2_in1k,92.980,7.020,98.670,1.330,3.70,288,0.950,bicubic
mobileone_s3.apple_in1k,92.980,7.020,98.630,1.370,10.17,224,0.900,bilinear
hardcorenas_f.miil_green_in1k,92.980,7.020,98.620,1.380,8.20,224,0.875,bilinear
efficientnet_b1_pruned.in1k,92.980,7.020,98.540,1.460,6.33,240,0.882,bicubic
hrnet_w32.ms_in1k,92.950,7.050,98.850,1.150,41.23,224,0.875,bilinear
seresnext26ts.ch_in1k,92.940,7.060,98.670,1.330,10.39,288,1.000,bicubic
hardcorenas_e.miil_green_in1k,92.940,7.060,98.580,1.420,8.07,224,0.875,bilinear
resnet50.a3_in1k,92.940,7.060,98.510,1.490,25.56,224,0.950,bicubic
tf_efficientnet_b1.in1k,92.930,7.070,98.660,1.340,7.79,240,0.882,bicubic
convnextv2_atto.fcmae_ft_in1k,92.920,7.080,98.560,1.440,3.71,288,0.950,bicubic
resnet50c.gluon_in1k,92.910,7.090,98.700,1.300,25.58,224,0.875,bicubic
efficientnet_es.ra_in1k,92.910,7.090,98.690,1.310,5.44,224,0.875,bicubic
resnet26t.ra2_in1k,92.910,7.090,98.680,1.320,16.01,320,1.000,bicubic
resnext50_32x4d.tv_in1k,92.900,7.100,98.730,1.270,25.03,224,0.875,bilinear
inception_v3.tv_in1k,92.900,7.100,98.320,1.680,23.83,299,0.875,bicubic
densenet161.tv_in1k,92.890,7.110,98.790,1.210,28.68,224,0.875,bicubic
pit_xs_224.in1k,92.890,7.110,98.780,1.220,10.62,224,0.900,bicubic
poolformerv2_s12.sail_in1k,92.890,7.110,98.530,1.470,11.89,224,1.000,bicubic
resnet101.tv_in1k,92.880,7.120,98.660,1.340,44.55,224,0.875,bilinear
resmlp_12_224.fb_distilled_in1k,92.870,7.130,98.620,1.380,15.35,224,0.875,bicubic
tf_efficientnet_cc_b0_8e.in1k,92.870,7.130,98.460,1.540,24.01,224,0.875,bicubic
coat_lite_tiny.in1k,92.860,7.140,98.640,1.360,5.72,224,0.900,bicubic
rexnet_100.nav_in1k,92.830,7.170,98.600,1.400,4.80,224,0.875,bicubic
tf_efficientnet_cc_b0_4e.in1k,92.820,7.180,98.440,1.560,13.31,224,0.875,bicubic
tinynet_a.in1k,92.810,7.190,98.560,1.440,6.19,192,0.875,bicubic
dpn68b.mx_in1k,92.780,7.220,98.520,1.480,12.61,224,0.875,bicubic
res2net50_48w_2s.in1k,92.780,7.220,98.470,1.530,25.29,224,0.875,bilinear
hrnet_w18.ms_in1k,92.770,7.230,98.660,1.340,21.30,224,0.875,bilinear
convnext_atto.d2_in1k,92.770,7.230,98.620,1.380,3.70,288,0.950,bicubic
crossvit_9_dagger_240.in1k,92.770,7.230,98.490,1.510,8.78,240,0.875,bicubic
ese_vovnet19b_dw.ra_in1k,92.760,7.240,98.650,1.350,6.54,288,0.950,bicubic
eca_resnext26ts.ch_in1k,92.750,7.250,98.710,1.290,10.30,288,1.000,bicubic
gcresnext26ts.ch_in1k,92.740,7.260,98.610,1.390,10.48,288,1.000,bicubic
densenet201.tv_in1k,92.700,7.300,98.640,1.360,20.01,224,0.875,bicubic
densenet121.ra_in1k,92.700,7.300,98.600,1.400,7.98,288,0.950,bicubic
repvgg_a2.rvgg_in1k,92.680,7.320,98.530,1.470,28.21,224,0.875,bilinear
gmixer_24_224.ra3_in1k,92.680,7.320,98.280,1.720,24.72,224,0.875,bicubic
mobileone_s2.apple_in1k,92.660,7.340,98.680,1.320,7.88,224,0.900,bilinear
legacy_seresnet50.in1k,92.660,7.340,98.650,1.350,28.09,224,0.875,bilinear
dla60.in1k,92.650,7.350,98.630,1.370,22.04,224,0.875,bilinear
tf_efficientnet_b0.ap_in1k,92.620,7.380,98.370,1.630,5.29,224,0.875,bicubic
mobilenetv2_120d.ra_in1k,92.610,7.390,98.510,1.490,5.83,224,0.875,bicubic
hardcorenas_d.miil_green_in1k,92.610,7.390,98.430,1.570,7.50,224,0.875,bilinear
legacy_seresnext26_32x4d.in1k,92.600,7.400,98.410,1.590,16.79,224,0.875,bicubic
tf_efficientnet_lite2.in1k,92.590,7.410,98.540,1.460,6.09,260,0.890,bicubic
fastvit_t8.apple_dist_in1k,92.590,7.410,98.430,1.570,4.03,256,0.900,bicubic
resnet34.a2_in1k,92.570,7.430,98.570,1.430,21.80,288,1.000,bicubic
skresnet34.ra_in1k,92.570,7.430,98.520,1.480,22.28,224,0.875,bicubic
resnet50.gluon_in1k,92.560,7.440,98.550,1.450,25.56,224,0.875,bicubic
resnet26d.bt_in1k,92.550,7.450,98.650,1.350,16.01,288,0.950,bicubic
regnetx_016.pycls_in1k,92.530,7.470,98.550,1.450,9.19,224,0.875,bicubic
poolformer_s12.sail_in1k,92.500,7.500,98.390,1.610,11.92,224,0.900,bicubic
regnetx_008.tv2_in1k,92.490,7.510,98.430,1.570,7.26,224,0.965,bicubic
efficientnet_b0.ra_in1k,92.480,7.520,98.680,1.320,5.29,224,0.875,bicubic
xcit_tiny_12_p16_224.fb_in1k,92.480,7.520,98.630,1.370,6.72,224,1.000,bicubic
selecsls42b.in1k,92.480,7.520,98.430,1.570,32.46,224,0.875,bicubic
gernet_s.idstcv_in1k,92.440,7.560,98.500,1.500,8.17,224,0.875,bilinear
xcit_nano_12_p8_224.fb_dist_in1k,92.430,7.570,98.540,1.460,3.05,224,1.000,bicubic
tf_efficientnet_b0.aa_in1k,92.400,7.600,98.470,1.530,5.29,224,0.875,bicubic
repghostnet_150.in1k,92.380,7.620,98.530,1.470,6.58,224,0.875,bicubic
resnext26ts.ra2_in1k,92.380,7.620,98.390,1.610,10.30,288,1.000,bicubic
seresnet50.a3_in1k,92.360,7.640,98.330,1.670,28.09,224,0.950,bicubic
hardcorenas_c.miil_green_in1k,92.350,7.650,98.340,1.660,5.52,224,0.875,bilinear
convmixer_1024_20_ks9_p14.in1k,92.330,7.670,98.430,1.570,24.38,224,0.960,bicubic
dpn68.mx_in1k,92.300,7.700,98.610,1.390,12.61,224,0.875,bicubic
densenet169.tv_in1k,92.300,7.700,98.590,1.410,14.15,224,0.875,bicubic
tf_efficientnet_lite1.in1k,92.290,7.710,98.500,1.500,5.42,240,0.882,bicubic
resnet34.bt_in1k,92.280,7.720,98.600,1.400,21.80,288,0.950,bicubic
tf_efficientnet_b0.in1k,92.280,7.720,98.550,1.450,5.29,224,0.875,bicubic
mixnet_m.ft_in1k,92.270,7.730,98.360,1.640,5.01,224,0.875,bicubic
mobilenetv3_large_100.miil_in21k_ft_in1k,92.260,7.740,98.240,1.760,5.48,224,0.875,bilinear
ghostnetv2_130.in1k,92.240,7.760,98.380,1.620,8.96,224,0.875,bicubic
tf_mixnet_m.in1k,92.210,7.790,98.420,1.580,5.01,224,0.875,bicubic
efficientvit_m5.r224_in1k,92.150,7.850,98.520,1.480,12.47,224,0.875,bicubic
vit_small_patch32_224.augreg_in21k_ft_in1k,92.140,7.860,98.520,1.480,22.88,224,0.900,bicubic
xcit_nano_12_p16_384.fb_dist_in1k,92.130,7.870,98.510,1.490,3.05,384,1.000,bicubic
resmlp_12_224.fb_in1k,92.120,7.880,98.570,1.430,15.35,224,0.875,bicubic
resnet50.tv_in1k,92.120,7.880,98.410,1.590,25.56,224,0.875,bilinear
resnet26.bt_in1k,92.110,7.890,98.550,1.450,16.00,288,0.950,bicubic
tf_efficientnet_es.in1k,92.110,7.890,98.430,1.570,5.44,224,0.875,bicubic
mobilenetv2_140.ra_in1k,92.050,7.950,98.250,1.750,6.11,224,0.875,bicubic
mobilevitv2_075.cvnets_in1k,91.970,8.030,98.300,1.700,2.87,256,0.888,bicubic
repghostnet_130.in1k,91.940,8.060,98.390,1.610,5.48,224,0.875,bicubic
fastvit_t8.apple_in1k,91.930,8.070,98.380,1.620,4.03,256,0.900,bicubic
hardcorenas_b.miil_green_in1k,91.920,8.080,98.410,1.590,5.18,224,0.875,bilinear
vit_tiny_patch16_224.augreg_in21k_ft_in1k,91.920,8.080,98.340,1.660,5.72,224,0.900,bicubic
regnety_008.pycls_in1k,91.900,8.100,98.410,1.590,6.26,224,0.875,bicubic
efficientformerv2_s0.snap_dist_in1k,91.860,8.140,98.370,1.630,3.60,224,0.950,bicubic
mobileone_s1.apple_in1k,91.790,8.210,98.460,1.540,4.83,224,0.900,bilinear
mixnet_s.ft_in1k,91.780,8.220,98.300,1.700,4.13,224,0.875,bicubic
vit_tiny_r_s16_p8_384.augreg_in21k_ft_in1k,91.730,8.270,98.430,1.570,6.36,384,1.000,bicubic
efficientnet_es_pruned.in1k,91.700,8.300,98.410,1.590,5.44,224,0.875,bicubic
repvgg_b0.rvgg_in1k,91.680,8.320,98.450,1.550,15.82,224,0.875,bilinear
tf_mixnet_s.in1k,91.680,8.320,98.240,1.760,4.13,224,0.875,bicubic
semnasnet_100.rmsp_in1k,91.670,8.330,98.290,1.710,3.89,224,0.875,bicubic
ghostnetv2_100.in1k,91.630,8.370,98.290,1.710,6.16,224,0.875,bicubic
regnety_004.tv2_in1k,91.620,8.380,98.280,1.720,4.34,224,0.965,bicubic
hardcorenas_a.miil_green_in1k,91.620,8.380,98.170,1.830,5.26,224,0.875,bilinear
edgenext_x_small.in1k,91.580,8.420,98.190,1.810,2.34,288,1.000,bicubic
regnety_006.pycls_in1k,91.560,8.440,98.430,1.570,6.06,224,0.875,bicubic
mobilenetv3_rw.rmsp_in1k,91.550,8.450,98.280,1.720,5.48,224,0.875,bicubic
levit_128s.fb_dist_in1k,91.510,8.490,98.400,1.600,7.78,224,0.900,bicubic
levit_conv_128s.fb_dist_in1k,91.500,8.500,98.400,1.600,7.78,224,0.900,bicubic
legacy_seresnet34.in1k,91.490,8.510,98.200,1.800,21.96,224,0.875,bilinear
mobilenetv3_large_100.ra_in1k,91.470,8.530,98.320,1.680,5.48,224,0.875,bicubic
tf_mobilenetv3_large_100.in1k,91.420,8.580,98.260,1.740,5.48,224,0.875,bilinear
densenet121.tv_in1k,91.400,8.600,98.250,1.750,7.98,224,0.875,bicubic
mobilenetv2_110d.ra_in1k,91.330,8.670,98.190,1.810,4.52,224,0.875,bicubic
tf_efficientnet_lite0.in1k,91.280,8.720,98.080,1.920,4.65,224,0.875,bicubic
fbnetc_100.rmsp_in1k,91.280,8.720,97.840,2.160,5.57,224,0.875,bilinear
efficientnet_lite0.ra_in1k,91.250,8.750,98.240,1.760,4.65,224,0.875,bicubic
dla34.in1k,91.220,8.780,98.170,1.830,15.74,224,0.875,bilinear
mobilevit_xs.cvnets_in1k,91.210,8.790,98.220,1.780,2.32,256,0.900,bicubic
mnasnet_100.rmsp_in1k,91.210,8.790,98.050,1.950,4.38,224,0.875,bicubic
regnetx_008.pycls_in1k,91.170,8.830,98.370,1.630,7.26,224,0.875,bicubic
hrnet_w18_small_v2.ms_in1k,91.160,8.840,98.340,1.660,15.60,224,0.875,bilinear
regnetx_004_tv.tv2_in1k,91.160,8.840,98.100,1.900,5.50,224,0.965,bicubic
resnest14d.gluon_in1k,91.150,8.850,98.330,1.670,10.61,224,0.875,bilinear
mixer_b16_224.goog_in21k_ft_in1k,91.140,8.860,97.400,2.600,59.88,224,0.875,bicubic
repvgg_a1.rvgg_in1k,91.120,8.880,98.160,1.840,14.09,224,0.875,bilinear
tinynet_b.in1k,91.110,8.890,98.060,1.940,3.73,188,0.875,bicubic
xcit_nano_12_p8_224.fb_in1k,91.100,8.900,98.230,1.770,3.05,224,1.000,bicubic
resnet18.fb_swsl_ig1b_ft_in1k,91.100,8.900,98.200,1.800,11.69,224,0.875,bilinear
repghostnet_111.in1k,91.100,8.900,98.050,1.950,4.54,224,0.875,bicubic
resnet34.gluon_in1k,91.090,8.910,98.180,1.820,21.80,224,0.875,bicubic
deit_tiny_distilled_patch16_224.fb_in1k,91.080,8.920,98.270,1.730,5.91,224,0.900,bicubic
crossvit_9_240.in1k,91.050,8.950,98.320,1.680,8.55,240,0.875,bicubic
vgg19_bn.tv_in1k,90.990,9.010,98.100,1.900,143.68,224,0.875,bilinear
resnet18d.ra2_in1k,90.800,9.200,98.160,1.840,11.71,288,0.950,bicubic
regnetx_006.pycls_in1k,90.790,9.210,98.090,1.910,6.20,224,0.875,bicubic
regnety_004.pycls_in1k,90.770,9.230,98.070,1.930,4.34,224,0.875,bicubic
efficientvit_m4.r224_in1k,90.740,9.260,98.040,1.960,8.80,224,0.875,bicubic
pit_ti_distilled_224.in1k,90.730,9.270,98.250,1.750,5.10,224,0.900,bicubic
resnet18.fb_ssl_yfcc100m_ft_in1k,90.700,9.300,98.020,1.980,11.69,224,0.875,bilinear
repghostnet_100.in1k,90.690,9.310,98.120,1.880,4.07,224,0.875,bicubic
spnasnet_100.rmsp_in1k,90.590,9.410,97.950,2.050,4.42,224,0.875,bilinear
vit_base_patch32_224.augreg_in1k,90.590,9.410,97.720,2.280,88.22,224,0.900,bicubic
convit_tiny.fb_in1k,90.550,9.450,98.190,1.810,5.71,224,0.875,bicubic
vgg16_bn.tv_in1k,90.540,9.460,97.990,2.010,138.37,224,0.875,bilinear
crossvit_tiny_240.in1k,90.530,9.470,97.950,2.050,7.01,240,0.875,bicubic
ghostnet_100.in1k,90.460,9.540,97.910,2.090,5.18,224,0.875,bicubic
pit_ti_224.in1k,90.420,9.580,98.010,1.990,4.85,224,0.900,bicubic
tf_mobilenetv3_large_075.in1k,90.330,9.670,97.870,2.130,3.99,224,0.875,bilinear
hrnet_w18_small.gluon_in1k,90.310,9.690,97.750,2.250,13.19,224,0.875,bicubic
resnet34.tv_in1k,90.300,9.700,97.970,2.030,21.80,224,0.875,bilinear
resnet34.a3_in1k,90.240,9.760,97.880,2.120,21.80,224,0.950,bicubic
semnasnet_075.rmsp_in1k,90.220,9.780,97.950,2.050,2.91,224,0.875,bicubic
resnet18.a1_in1k,90.200,9.800,97.760,2.240,11.69,288,1.000,bicubic
xcit_nano_12_p16_224.fb_dist_in1k,90.190,9.810,97.750,2.250,3.05,224,1.000,bicubic
skresnet18.ra_in1k,90.180,9.820,97.780,2.220,11.96,224,0.875,bicubic
efficientvit_m3.r224_in1k,90.000,10.000,97.830,2.170,6.90,224,0.875,bicubic
hrnet_w18_small.ms_in1k,89.870,10.130,97.890,2.110,13.19,224,0.875,bilinear
mobilenetv2_100.ra_in1k,89.870,10.130,97.830,2.170,3.50,224,0.875,bicubic
vit_base_patch32_224.sam_in1k,89.870,10.130,97.600,2.400,88.22,224,0.900,bicubic
edgenext_xx_small.in1k,89.800,10.200,97.500,2.500,1.33,288,1.000,bicubic
repvgg_a0.rvgg_in1k,89.680,10.320,97.760,2.240,9.11,224,0.875,bilinear
vgg19.tv_in1k,89.680,10.320,97.550,2.450,143.67,224,0.875,bilinear
deit_tiny_patch16_224.fb_in1k,89.610,10.390,97.960,2.040,5.72,224,0.900,bicubic
regnetx_004.pycls_in1k,89.470,10.530,97.760,2.240,5.16,224,0.875,bicubic
resnet18.a2_in1k,89.470,10.530,97.630,2.370,11.69,288,1.000,bicubic
repghostnet_080.in1k,89.470,10.530,97.410,2.590,3.28,224,0.875,bicubic
vgg16.tv_in1k,89.370,10.630,97.520,2.480,138.36,224,0.875,bilinear
vit_tiny_r_s16_p8_224.augreg_in21k_ft_in1k,89.340,10.660,97.700,2.300,6.34,224,0.900,bicubic
legacy_seresnet18.in1k,89.260,10.740,97.690,2.310,11.78,224,0.875,bicubic
resnet14t.c3_in1k,89.250,10.750,97.440,2.560,10.08,224,0.950,bicubic
vgg13_bn.tv_in1k,89.200,10.800,97.530,2.470,133.05,224,0.875,bilinear
tf_mobilenetv3_large_minimal_100.in1k,89.160,10.840,97.310,2.690,3.92,224,0.875,bilinear
mobilevitv2_050.cvnets_in1k,89.030,10.970,97.600,2.400,1.37,256,0.888,bicubic
pvt_v2_b0.in1k,88.970,11.030,97.690,2.310,3.67,224,0.900,bicubic
xcit_nano_12_p16_224.fb_in1k,88.970,11.030,97.410,2.590,3.05,224,1.000,bicubic
efficientvit_m2.r224_in1k,88.910,11.090,97.390,2.610,4.19,224,0.875,bicubic
lcnet_100.ra2_in1k,88.910,11.090,97.380,2.620,2.95,224,0.875,bicubic
mobileone_s0.apple_in1k,88.810,11.190,97.220,2.780,5.29,224,0.875,bilinear
resnet18.gluon_in1k,88.660,11.340,97.100,2.900,11.69,224,0.875,bicubic
tinynet_c.in1k,88.400,11.600,97.260,2.740,2.46,184,0.875,bicubic
vgg11_bn.tv_in1k,88.400,11.600,97.250,2.750,132.87,224,0.875,bilinear
efficientvit_b0.r224_in1k,88.300,11.700,96.880,3.120,3.41,224,0.950,bicubic
regnety_002.pycls_in1k,88.170,11.830,97.440,2.560,3.16,224,0.875,bicubic
resnet18.tv_in1k,88.150,11.850,97.120,2.880,11.69,224,0.875,bilinear
mobilevit_xxs.cvnets_in1k,87.940,12.060,97.190,2.810,1.27,256,0.900,bicubic
vgg13.tv_in1k,87.570,12.430,97.110,2.890,133.05,224,0.875,bilinear
regnetx_002.pycls_in1k,87.370,12.630,97.000,3.000,2.68,224,0.875,bicubic
vgg11.tv_in1k,87.350,12.650,97.100,2.900,132.86,224,0.875,bilinear
efficientvit_m1.r224_in1k,87.220,12.780,97.020,2.980,2.98,224,0.875,bicubic
repghostnet_058.in1k,87.150,12.850,96.780,3.220,2.55,224,0.875,bicubic
dla60x_c.in1k,87.110,12.890,97.140,2.860,1.32,224,0.875,bilinear
resnet18.a3_in1k,87.070,12.930,96.660,3.340,11.69,224,0.950,bicubic
mixer_l16_224.goog_in21k_ft_in1k,86.990,13.010,94.070,5.930,208.20,224,0.875,bicubic
lcnet_075.ra2_in1k,86.960,13.040,96.550,3.450,2.36,224,0.875,bicubic
resnet10t.c3_in1k,86.680,13.320,96.730,3.270,5.44,224,0.950,bicubic
mobilenetv3_small_100.lamb_in1k,86.180,13.820,96.450,3.550,2.54,224,0.875,bicubic
tf_mobilenetv3_small_100.in1k,85.950,14.050,96.400,3.600,2.54,224,0.875,bilinear
mnasnet_small.lamb_in1k,85.480,14.520,96.000,4.000,2.03,224,0.875,bicubic
repghostnet_050.in1k,85.450,14.550,96.150,3.850,2.31,224,0.875,bicubic
dla46x_c.in1k,85.440,14.560,96.420,3.580,1.07,224,0.875,bilinear
tinynet_d.in1k,85.440,14.560,96.020,3.980,2.34,152,0.875,bicubic
mobilenetv2_050.lamb_in1k,84.990,15.010,95.620,4.380,1.97,224,0.875,bicubic
dla46_c.in1k,84.700,15.300,96.210,3.790,1.30,224,0.875,bilinear
tf_mobilenetv3_small_075.in1k,84.500,15.500,95.880,4.120,2.04,224,0.875,bilinear
mobilenetv3_small_075.lamb_in1k,84.120,15.880,95.520,4.480,2.04,224,0.875,bicubic
efficientvit_m0.r224_in1k,83.220,16.780,95.700,4.300,2.35,224,0.875,bicubic
lcnet_050.ra2_in1k,83.040,16.960,95.020,4.980,1.88,224,0.875,bicubic
tf_mobilenetv3_small_minimal_100.in1k,82.660,17.340,95.030,4.970,2.04,224,0.875,bilinear
tinynet_e.in1k,79.810,20.190,93.970,6.030,2.04,106,0.875,bicubic
mobilenetv3_small_050.lamb_in1k,78.080,21.920,93.010,6.990,1.59,224,0.875,bicubic
hf_public_repos/pytorch-image-models/results/results-imagenetv2-matched-frequency.csv
model,top1,top1_err,top5,top5_err,param_count,img_size,crop_pct,interpolation,top1_diff,top5_diff,rank_diff
eva_giant_patch14_336.clip_ft_in1k,82.200,17.800,96.290,3.710,"1,013.01",336,1.000,bicubic,-7.266,-2.536,+6
eva02_large_patch14_448.mim_in22k_ft_in1k,82.130,17.870,96.260,3.740,305.08,448,1.000,bicubic,-7.492,-2.690,+2
eva02_large_patch14_448.mim_m38m_ft_in1k,82.130,17.870,96.160,3.840,305.08,448,1.000,bicubic,-7.444,-2.764,+2
eva_giant_patch14_560.m30m_ft_in22k_in1k,82.040,17.960,96.440,3.560,"1,014.45",560,1.000,bicubic,-7.746,-2.552,-1
eva02_large_patch14_448.mim_in22k_ft_in22k_in1k,81.900,18.100,96.150,3.850,305.08,448,1.000,bicubic,-8.070,-2.862,-3
eva02_large_patch14_448.mim_m38m_ft_in22k_in1k,81.890,18.110,96.370,3.630,305.08,448,1.000,bicubic,-8.162,-2.678,-5
eva_giant_patch14_336.m30m_ft_in22k_in1k,81.820,18.180,96.290,3.710,"1,013.01",336,1.000,bicubic,-7.746,-2.662,-1
eva_giant_patch14_224.clip_ft_in1k,81.750,18.250,96.080,3.920,"1,012.56",224,0.900,bicubic,-7.130,-2.600,+1
eva_large_patch14_336.in22k_ft_in1k,81.190,18.810,95.880,4.120,304.53,336,1.000,bicubic,-7.480,-2.842,+4
eva_large_patch14_336.in22k_ft_in22k_in1k,80.930,19.070,96.010,3.990,304.53,336,1.000,bicubic,-8.276,-2.844,-2
vit_large_patch14_clip_336.openai_ft_in12k_in1k,80.520,19.480,95.500,4.500,304.53,336,1.000,bicubic,-7.748,-3.026,+13
regnety_1280.swag_ft_in1k,80.480,19.520,96.180,3.820,644.81,384,1.000,bicubic,-7.750,-2.506,+16
tf_efficientnet_l2.ns_jft_in1k_475,80.470,19.530,95.730,4.270,480.31,475,0.936,bicubic,-7.764,-2.816,+14
convnext_xxlarge.clip_laion2b_soup_ft_in1k,80.450,19.550,95.780,4.220,846.47,256,1.000,bicubic,-8.154,-2.928,0
beitv2_large_patch16_224.in1k_ft_in22k_in1k,80.270,19.730,95.150,4.850,304.43,224,0.950,bicubic,-8.124,-3.448,+5
tf_efficientnet_l2.ns_jft_in1k,80.250,19.750,95.860,4.140,480.31,800,0.960,bicubic,-8.102,-2.788,+5
eva_large_patch14_196.in22k_ft_in22k_in1k,80.170,19.830,95.380,4.620,304.14,196,1.000,bicubic,-8.404,-3.278,0
maxvit_base_tf_512.in21k_ft_in1k,80.150,19.850,95.480,4.520,119.88,512,1.000,bicubic,-8.070,-3.050,+11
eva_large_patch14_196.in22k_ft_in1k,80.150,19.850,95.450,4.550,304.14,196,1.000,bicubic,-7.782,-3.048,+19
maxvit_xlarge_tf_512.in21k_ft_in1k,80.100,19.900,95.480,4.520,475.77,512,1.000,bicubic,-8.438,-3.164,-2
convnextv2_huge.fcmae_ft_in22k_in1k_512,79.990,20.010,95.900,4.100,660.29,512,1.000,bicubic,-8.868,-2.848,-11
maxvit_large_tf_512.in21k_ft_in1k,79.980,20.020,95.160,4.840,212.33,512,1.000,bicubic,-8.244,-3.438,+7
beit_large_patch16_512.in22k_ft_in22k_in1k,79.950,20.050,95.350,4.650,305.67,512,1.000,bicubic,-8.646,-3.306,-8
convnextv2_huge.fcmae_ft_in22k_in1k_384,79.940,20.060,95.690,4.310,660.29,384,1.000,bicubic,-8.730,-3.048,-12
maxvit_xlarge_tf_384.in21k_ft_in1k,79.710,20.290,95.160,4.840,475.32,384,1.000,bicubic,-8.604,-3.384,-3
vit_large_patch14_clip_224.openai_ft_in1k,79.620,20.380,95.000,5.000,304.20,224,1.000,bicubic,-8.234,-3.426,+15
maxvit_large_tf_384.in21k_ft_in1k,79.590,20.410,95.070,4.930,212.03,384,1.000,bicubic,-8.396,-3.498,+8
eva02_base_patch14_448.mim_in22k_ft_in22k_in1k,79.520,20.480,95.200,4.800,87.12,448,1.000,bicubic,-9.170,-3.524,-17
vit_huge_patch14_clip_336.laion2b_ft_in12k_in1k,79.520,20.480,95.000,5.000,632.46,336,1.000,bicubic,-9.072,-3.662,-13
beit_large_patch16_384.in22k_ft_in22k_in1k,79.500,20.500,95.180,4.820,305.00,384,1.000,bicubic,-8.902,-3.428,-11
vit_large_patch14_clip_224.openai_ft_in12k_in1k,79.400,20.600,95.070,4.930,304.20,224,1.000,bicubic,-8.774,-3.476,+2
vit_huge_patch14_clip_224.laion2b_ft_in1k,79.370,20.630,94.920,5.080,632.05,224,1.000,bicubic,-8.218,-3.298,+16
maxvit_base_tf_384.in21k_ft_in1k,79.340,20.660,95.080,4.920,119.65,384,1.000,bicubic,-8.582,-3.464,+5
vit_large_patch14_clip_336.laion2b_ft_in1k,79.240,20.760,95.000,5.000,304.53,336,1.000,bicubic,-8.616,-3.368,+6
vit_huge_patch14_clip_224.laion2b_ft_in12k_in1k,79.200,20.800,95.120,4.880,632.05,224,1.000,bicubic,-9.056,-3.432,-10
deit3_huge_patch14_224.fb_in22k_ft_in1k,79.190,20.810,94.870,5.130,632.13,224,1.000,bicubic,-7.996,-3.390,+27
eva02_base_patch14_448.mim_in22k_ft_in1k,79.150,20.850,95.110,4.890,87.12,448,1.000,bicubic,-9.102,-3.454,-11
deit3_large_patch16_384.fb_in22k_ft_in1k,79.080,20.920,94.870,5.130,304.76,384,1.000,bicubic,-8.640,-3.642,+7
convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_384,79.040,20.960,95.010,4.990,200.13,384,1.000,bicubic,-9.266,-3.572,-17
caformer_b36.sail_in22k_ft_in1k_384,79.040,20.960,94.920,5.080,98.75,384,1.000,bicubic,-9.018,-3.662,-5
vit_large_patch14_clip_336.laion2b_ft_in12k_in1k,78.980,21.020,94.890,5.110,304.53,336,1.000,bicubic,-9.200,-3.682,-9
maxvit_rmlp_base_rw_384.sw_in12k_ft_in1k,78.900,21.100,94.580,5.420,116.14,384,1.000,bicubic,-8.928,-3.792,+1
beit_large_patch16_224.in22k_ft_in22k_in1k,78.830,21.170,94.610,5.390,304.43,224,0.900,bicubic,-8.648,-3.694,+7
convnext_large_mlp.clip_laion2b_augreg_ft_in1k_384,78.750,21.250,94.950,5.050,200.13,384,1.000,bicubic,-9.098,-3.496,-2
beitv2_large_patch16_224.in1k_ft_in1k,78.730,21.270,94.230,5.770,304.43,224,0.950,bicubic,-8.682,-4.004,+11
convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_320,78.650,21.350,94.660,5.340,200.13,320,1.000,bicubic,-9.308,-3.816,-10
deit3_large_patch16_224.fb_in22k_ft_in1k,78.630,21.370,94.720,5.280,304.37,224,1.000,bicubic,-8.352,-3.516,+26
convnextv2_large.fcmae_ft_in22k_in1k_384,78.570,21.430,94.840,5.160,197.96,384,1.000,bicubic,-9.628,-3.688,-17
tf_efficientnet_b7.ns_jft_in1k,78.530,21.470,94.370,5.630,66.35,600,0.949,bicubic,-8.310,-3.722,+30
vit_large_patch14_clip_224.laion2b_ft_in12k_in1k,78.490,21.510,94.660,5.340,304.20,224,1.000,bicubic,-9.404,-3.748,-11
vit_large_patch14_clip_224.laion2b_ft_in1k,78.440,21.560,94.580,5.420,304.20,224,1.000,bicubic,-8.846,-3.664,+10
regnety_320.swag_ft_in1k,78.380,21.620,95.200,4.800,145.05,384,1.000,bicubic,-8.454,-3.162,+28
coatnet_rmlp_2_rw_384.sw_in12k_ft_in1k,78.240,21.760,94.350,5.650,73.88,384,1.000,bicubic,-9.142,-3.962,+4
caformer_m36.sail_in22k_ft_in1k_384,78.170,21.830,94.180,5.820,56.20,384,1.000,bicubic,-9.276,-4.128,0
caformer_b36.sail_in22k_ft_in1k,78.110,21.890,94.400,5.600,98.75,224,1.000,bicubic,-9.310,-3.928,0
convnextv2_huge.fcmae_ft_in1k,78.060,21.940,94.060,5.940,660.29,288,1.000,bicubic,-8.520,-3.912,+36
convformer_b36.sail_in22k_ft_in1k_384,78.040,21.960,94.170,5.830,99.88,384,1.000,bicubic,-9.562,-4.264,-10
convnext_xlarge.fb_in22k_ft_in1k_384,77.990,22.010,94.480,5.520,350.20,384,1.000,bicubic,-9.762,-4.076,-14
volo_d5_512.sail_in1k,77.970,22.030,94.160,5.840,296.09,512,1.150,bicubic,-9.088,-3.810,+9
vit_large_patch16_384.augreg_in21k_ft_in1k,77.940,22.060,94.460,5.540,304.72,384,1.000,bicubic,-9.144,-3.842,+6
convnext_large_mlp.clip_laion2b_augreg_ft_in1k,77.940,22.060,94.390,5.610,200.13,256,1.000,bicubic,-9.396,-3.828,-2
convnextv2_large.fcmae_ft_in22k_in1k,77.900,22.100,94.390,5.610,197.96,288,1.000,bicubic,-9.584,-3.966,-13
deit3_base_patch16_384.fb_in22k_ft_in1k,77.870,22.130,94.030,5.970,86.88,384,1.000,bicubic,-8.870,-4.086,+23
vit_base_patch16_clip_384.laion2b_ft_in12k_in1k,77.770,22.230,94.160,5.840,86.86,384,1.000,bicubic,-9.436,-3.874,-2
volo_d5_448.sail_in1k,77.750,22.250,94.050,5.950,295.91,448,1.150,bicubic,-9.202,-3.888,+10
volo_d4_448.sail_in1k,77.740,22.260,93.940,6.060,193.41,448,1.150,bicubic,-9.052,-3.944,+18
maxxvitv2_rmlp_base_rw_384.sw_in12k_ft_in1k,77.710,22.290,94.330,5.670,116.09,384,1.000,bicubic,-9.754,-4.044,-15
maxvit_rmlp_base_rw_224.sw_in12k_ft_in1k,77.690,22.310,94.070,5.930,116.14,224,0.950,bicubic,-9.204,-3.944,+8
tf_efficientnetv2_xl.in21k_ft_in1k,77.630,22.370,93.960,6.040,208.12,512,1.000,bicubic,-9.118,-4.054,+16
tf_efficientnetv2_l.in21k_ft_in1k,77.580,22.420,94.280,5.720,118.52,480,1.000,bicubic,-9.222,-3.856,+11
caformer_s36.sail_in22k_ft_in1k_384,77.540,22.460,94.070,5.930,39.30,384,1.000,bicubic,-9.318,-4.142,+7
convnextv2_base.fcmae_ft_in22k_in1k_384,77.490,22.510,94.410,5.590,88.72,384,1.000,bicubic,-10.154,-4.006,-27
caformer_b36.sail_in1k_384,77.490,22.510,93.530,6.470,98.75,384,1.000,bicubic,-8.918,-4.284,+32
convnext_base.clip_laiona_augreg_ft_in1k_384,77.480,22.520,93.960,6.040,88.59,384,1.000,bicubic,-9.022,-4.008,+22
maxvit_base_tf_512.in1k,77.440,22.560,93.970,6.030,119.88,512,1.000,bicubic,-9.162,-3.948,+14
caformer_m36.sail_in22k_ft_in1k,77.440,22.560,93.590,6.410,56.20,224,1.000,bicubic,-9.154,-4.434,+16
convnext_large.fb_in22k_ft_in1k_384,77.420,22.580,94.200,5.800,197.77,384,1.000,bicubic,-10.052,-4.186,-26
seresnextaa201d_32x8d.sw_in12k_ft_in1k_384,77.420,22.580,93.990,6.010,149.39,384,1.000,bicubic,-9.868,-4.344,-18
regnety_1280.swag_lc_in1k,77.380,22.620,94.500,5.500,644.81,224,0.965,bicubic,-8.602,-3.350,+56
swinv2_large_window12to24_192to384.ms_in22k_ft_in1k,77.340,22.660,93.910,6.090,196.74,384,1.000,bicubic,-10.124,-4.340,-27
vit_base_patch16_clip_384.laion2b_ft_in1k,77.320,22.680,93.860,6.140,86.86,384,1.000,bicubic,-9.298,-4.148,+8
maxvit_large_tf_512.in1k,77.300,22.700,93.790,6.210,212.33,512,1.000,bicubic,-9.226,-4.090,+12
convnextv2_base.fcmae_ft_in22k_in1k,77.290,22.710,94.110,5.890,88.72,288,1.000,bicubic,-9.708,-4.058,-11
seresnextaa101d_32x8d.sw_in12k_ft_in1k_288,77.290,22.710,94.050,5.950,93.59,320,1.000,bicubic,-9.434,-4.126,+3
tf_efficientnet_b6.ns_jft_in1k,77.290,22.710,93.890,6.110,43.04,528,0.942,bicubic,-9.168,-4.000,+17
convformer_b36.sail_in22k_ft_in1k,77.270,22.730,93.970,6.030,99.88,224,1.000,bicubic,-9.728,-4.202,-15
swinv2_base_window12to24_192to384.ms_in22k_ft_in1k,77.240,22.760,94.250,5.750,87.92,384,1.000,bicubic,-9.856,-3.984,-21
regnety_160.swag_ft_in1k,77.230,22.770,94.600,5.400,83.59,384,1.000,bicubic,-8.790,-3.452,+41
convnext_base.clip_laion2b_augreg_ft_in12k_in1k_384,77.180,22.820,94.170,5.830,88.59,384,1.000,bicubic,-9.954,-4.052,-25
caformer_m36.sail_in1k_384,77.160,22.840,93.630,6.370,56.20,384,1.000,bicubic,-9.006,-4.190,+28
maxvit_large_tf_384.in1k,77.130,22.870,93.460,6.540,212.03,384,1.000,bicubic,-9.100,-4.228,+21
beitv2_base_patch16_224.in1k_ft_in22k_in1k,77.120,22.880,94.020,5.980,86.53,224,0.900,bicubic,-9.354,-4.032,+8
volo_d3_448.sail_in1k,77.090,22.910,94.110,5.890,86.63,448,1.000,bicubic,-9.412,-3.600,+4
swin_large_patch4_window12_384.ms_in22k_ft_in1k,77.070,22.930,93.760,6.240,196.74,384,1.000,bicubic,-10.062,-4.474,-29
vit_large_r50_s32_384.augreg_in21k_ft_in1k,77.070,22.930,93.710,6.290,329.09,384,1.000,bicubic,-9.112,-4.212,+21
coatnet_rmlp_2_rw_224.sw_in12k_ft_in1k,77.010,22.990,93.320,6.680,73.88,224,0.950,bicubic,-9.494,-4.574,-1
swinv2_large_window12to16_192to256.ms_in22k_ft_in1k,76.970,23.030,93.520,6.480,196.74,256,0.900,bicubic,-9.982,-4.586,-23
vit_base_patch16_clip_384.openai_ft_in1k,76.960,23.040,93.750,6.250,86.86,384,1.000,bicubic,-9.246,-4.126,+15
convformer_m36.sail_in22k_ft_in1k_384,76.960,23.040,93.690,6.310,57.05,384,1.000,bicubic,-9.932,-4.426,-21
beit_base_patch16_384.in22k_ft_in22k_in1k,76.930,23.070,93.910,6.090,86.74,384,1.000,bicubic,-9.870,-4.226,-18
seresnextaa101d_32x8d.sw_in12k_ft_in1k,76.900,23.100,93.850,6.150,93.59,288,1.000,bicubic,-9.584,-4.180,-2
tf_efficientnetv2_m.in21k_ft_in1k,76.890,23.110,93.640,6.360,54.14,480,1.000,bicubic,-9.102,-4.304,+30
vit_base_patch16_clip_384.openai_ft_in12k_in1k,76.870,23.130,93.780,6.220,86.86,384,0.950,bicubic,-10.156,-4.402,-34
cait_m48_448.fb_dist_in1k,76.870,23.130,93.380,6.620,356.46,448,1.000,bicubic,-9.622,-4.372,-5
tf_efficientnet_b5.ns_jft_in1k,76.830,23.170,93.580,6.420,30.39,456,0.934,bicubic,-9.258,-4.176,+19
resnext101_32x32d.fb_wsl_ig1b_ft_in1k,76.810,23.190,93.210,6.790,468.53,224,0.875,bilinear,-8.288,-4.228,+92
maxvit_base_tf_384.in1k,76.790,23.210,93.440,6.560,119.65,384,1.000,bicubic,-9.512,-4.358,+2
convnext_large.fb_in22k_ft_in1k,76.750,23.250,93.710,6.290,197.77,288,1.000,bicubic,-10.276,-4.494,-39
tiny_vit_21m_512.dist_in22k_ft_in1k,76.710,23.290,93.470,6.530,21.27,512,1.000,bicubic,-9.748,-4.414,-8
deit3_large_patch16_384.fb_in1k,76.690,23.310,93.350,6.650,304.76,384,1.000,bicubic,-9.122,-4.248,+32
convnextv2_large.fcmae_ft_in1k,76.660,23.340,93.570,6.430,197.96,288,1.000,bicubic,-9.458,-4.252,+10
convnext_base.fb_in22k_ft_in1k_384,76.650,23.350,93.700,6.300,88.59,384,1.000,bicubic,-10.146,-4.564,-29
convnext_xlarge.fb_in22k_ft_in1k,76.630,23.370,93.860,6.140,350.20,288,1.000,bicubic,-10.700,-4.468,-55
beitv2_base_patch16_224.in1k_ft_in1k,76.630,23.370,92.930,7.070,86.53,224,0.900,bicubic,-8.964,-4.576,+43
coatnet_2_rw_224.sw_in12k_ft_in1k,76.620,23.380,93.380,6.620,73.87,224,0.950,bicubic,-9.944,-4.516,-22
xcit_large_24_p8_384.fb_dist_in1k,76.620,23.380,93.090,6.910,188.93,384,1.000,bicubic,-9.376,-4.600,+14
regnety_160.sw_in12k_ft_in1k,76.610,23.390,93.560,6.440,83.59,288,1.000,bicubic,-9.376,-4.274,+17
vit_base_patch8_224.augreg2_in21k_ft_in1k,76.590,23.410,93.340,6.660,86.58,224,0.900,bicubic,-9.628,-4.492,-5
caformer_s36.sail_in22k_ft_in1k,76.580,23.420,93.620,6.380,39.30,224,1.000,bicubic,-9.210,-4.206,+25
caformer_s36.sail_in1k_384,76.550,23.450,93.450,6.550,39.30,384,1.000,bicubic,-9.192,-4.222,+28
volo_d5_224.sail_in1k,76.550,23.450,93.330,6.670,295.46,224,0.960,bicubic,-9.520,-4.246,+5
deit3_base_patch16_224.fb_in22k_ft_in1k,76.530,23.470,93.570,6.430,86.59,224,1.000,bicubic,-9.170,-4.176,+29
maxxvitv2_rmlp_base_rw_224.sw_in12k_ft_in1k,76.530,23.470,93.370,6.630,116.09,224,0.950,bicubic,-10.112,-4.650,-35
dm_nfnet_f6.dm_in1k,76.510,23.490,93.340,6.660,438.36,576,0.956,bicubic,-9.852,-4.556,-17
maxvit_small_tf_512.in1k,76.500,23.500,93.360,6.640,69.13,512,1.000,bicubic,-9.584,-4.404,0
vit_base_patch16_384.augreg_in21k_ft_in1k,76.490,23.510,93.760,6.240,86.86,384,1.000,bicubic,-9.504,-4.242,+5
tiny_vit_21m_384.dist_in22k_ft_in1k,76.470,23.530,93.130,6.870,21.23,384,1.000,bicubic,-9.638,-4.580,-5
swinv2_base_window12to16_192to256.ms_in22k_ft_in1k,76.450,23.550,93.700,6.300,87.92,256,0.900,bicubic,-9.818,-4.182,-17
convformer_m36.sail_in1k_384,76.370,23.630,93.100,6.900,57.05,384,1.000,bicubic,-9.210,-4.442,+28
convformer_s36.sail_in22k_ft_in1k_384,76.360,23.640,93.530,6.470,40.01,384,1.000,bicubic,-10.018,-4.454,-26
convformer_m36.sail_in22k_ft_in1k,76.360,23.640,93.450,6.550,57.05,224,1.000,bicubic,-9.788,-4.400,-10
convnext_small.in12k_ft_in1k_384,76.340,23.660,93.420,6.580,50.22,384,1.000,bicubic,-9.842,-4.502,-17
cait_m36_384.fb_dist_in1k,76.330,23.670,93.060,6.940,271.22,384,1.000,bicubic,-9.728,-4.670,-6
swin_large_patch4_window7_224.ms_in22k_ft_in1k,76.310,23.690,93.390,6.610,196.53,224,0.900,bicubic,-10.002,-4.512,-26
vit_large_patch16_224.augreg_in21k_ft_in1k,76.300,23.700,93.610,6.390,304.33,224,0.900,bicubic,-9.536,-4.054,+4
convnext_base.clip_laion2b_augreg_ft_in12k_in1k,76.290,23.710,93.810,6.190,88.59,256,1.000,bicubic,-10.080,-4.174,-30
swin_base_patch4_window12_384.ms_in22k_ft_in1k,76.250,23.750,93.300,6.700,87.90,384,1.000,bicubic,-10.188,-4.766,-34
cait_s36_384.fb_dist_in1k,76.220,23.780,92.960,7.040,68.37,384,1.000,bicubic,-9.234,-4.518,+27
regnety_160.lion_in12k_ft_in1k,76.190,23.810,93.670,6.330,83.59,288,1.000,bicubic,-9.798,-4.164,-6
dm_nfnet_f4.dm_in1k,76.150,23.850,93.050,6.950,316.07,512,0.951,bicubic,-9.686,-4.768,0
xcit_medium_24_p8_384.fb_dist_in1k,76.130,23.870,92.980,7.020,84.32,384,1.000,bicubic,-9.686,-4.612,0
maxvit_small_tf_384.in1k,76.120,23.880,92.610,7.390,69.02,384,1.000,bicubic,-9.420,-4.852,+18
flexivit_large.1200ep_in1k,76.100,23.900,93.010,6.990,304.36,240,0.950,bicubic,-9.544,-4.530,+12
volo_d2_384.sail_in1k,76.090,23.910,93.130,6.870,58.87,384,1.000,bicubic,-9.952,-4.444,-17
tf_efficientnet_b8.ap_in1k,76.090,23.910,92.730,7.270,87.41,672,0.954,bicubic,-9.274,-4.562,+32
tf_efficientnet_b7.ap_in1k,76.080,23.920,92.970,7.030,66.35,600,0.949,bicubic,-9.044,-4.282,+48
convformer_b36.sail_in1k_384,76.070,23.930,92.720,7.280,99.88,384,1.000,bicubic,-9.670,-4.804,+2
maxvit_tiny_tf_512.in1k,76.060,23.940,93.160,6.840,31.05,512,1.000,bicubic,-9.604,-4.424,+5
flexivit_large.600ep_in1k,76.050,23.950,92.960,7.040,304.36,240,0.950,bicubic,-9.490,-4.528,+10
volo_d4_224.sail_in1k,76.020,23.980,92.980,7.020,192.96,224,0.960,bicubic,-9.852,-4.492,-12
tf_efficientnetv2_l.in1k,76.000,24.000,93.070,6.930,118.52,480,1.000,bicubic,-9.664,-4.404,+3
vit_base_patch8_224.augreg_in21k_ft_in1k,75.970,24.030,93.370,6.630,86.58,224,0.900,bicubic,-9.828,-4.420,-9
xcit_large_24_p8_224.fb_dist_in1k,75.970,24.030,92.710,7.290,188.93,224,1.000,bicubic,-9.432,-4.692,+18
flexivit_large.300ep_in1k,75.940,24.060,92.650,7.350,304.36,240,0.950,bicubic,-9.348,-4.750,+25
convnext_base.fb_in22k_ft_in1k,75.930,24.070,93.580,6.420,88.59,288,1.000,bicubic,-10.344,-4.512,-45
convnext_base.clip_laion2b_augreg_ft_in1k,75.840,24.160,93.210,6.790,88.59,256,1.000,bicubic,-10.318,-4.470,-37
xcit_large_24_p16_384.fb_dist_in1k,75.820,24.180,92.740,7.260,189.10,384,1.000,bicubic,-9.934,-4.798,-10
dm_nfnet_f3.dm_in1k,75.790,24.210,92.910,7.090,254.92,416,0.940,bicubic,-9.896,-4.660,-6
eva02_small_patch14_336.mim_in22k_ft_in1k,75.790,24.210,92.820,7.180,22.13,336,1.000,bicubic,-9.928,-4.814,-9
convformer_s36.sail_in1k_384,75.780,24.220,93.090,6.910,40.01,384,1.000,bicubic,-9.598,-4.386,+13
deit3_huge_patch14_224.fb_in1k,75.780,24.220,92.740,7.260,632.13,224,0.900,bicubic,-9.444,-4.620,+24
xcit_small_24_p8_384.fb_dist_in1k,75.770,24.230,92.970,7.030,47.63,384,1.000,bicubic,-9.784,-4.600,-4
caformer_b36.sail_in1k,75.740,24.260,92.690,7.310,98.75,224,1.000,bicubic,-9.764,-4.620,-1
resnext101_32x16d.fb_wsl_ig1b_ft_in1k,75.720,24.280,92.910,7.090,194.03,224,0.875,bilinear,-8.446,-4.288,+119
vit_base_patch16_clip_224.laion2b_ft_in12k_in1k,75.710,24.290,92.770,7.230,86.57,224,0.950,bicubic,-10.460,-4.986,-48
tf_efficientnet_b4.ns_jft_in1k,75.690,24.310,93.050,6.950,19.34,380,0.922,bicubic,-9.470,-4.418,+26
caformer_s18.sail_in22k_ft_in1k_384,75.680,24.320,93.510,6.490,26.34,384,1.000,bicubic,-9.734,-4.192,+1
efficientnet_b5.sw_in12k_ft_in1k,75.680,24.320,93.040,6.960,30.39,448,1.000,bicubic,-10.216,-4.696,-31
hrnet_w48_ssld.paddle_in1k,75.670,24.330,92.830,7.170,77.47,288,1.000,bilinear,-8.810,-4.404,+79
vit_medium_patch16_gap_384.sw_in12k_ft_in1k,75.660,24.340,92.980,7.020,39.03,384,0.950,bicubic,-9.870,-4.656,-9
volo_d1_384.sail_in1k,75.650,24.350,93.060,6.940,26.78,384,1.000,bicubic,-9.594,-4.134,+11
volo_d3_224.sail_in1k,75.620,24.380,92.980,7.020,86.33,224,0.960,bicubic,-9.794,-4.296,-3
convnextv2_base.fcmae_ft_in1k,75.620,24.380,92.850,7.150,88.72,288,1.000,bicubic,-9.854,-4.534,-9
dm_nfnet_f5.dm_in1k,75.610,24.390,92.780,7.220,377.21,544,0.954,bicubic,-10.490,-4.908,-51
caformer_m36.sail_in1k,75.600,24.400,92.400,7.600,56.20,224,1.000,bicubic,-9.632,-4.800,+9
vit_base_patch16_clip_224.openai_ft_in1k,75.590,24.410,92.960,7.040,86.57,224,0.900,bicubic,-9.702,-4.476,+2
vit_base_r50_s16_384.orig_in21k_ft_in1k,75.570,24.430,92.790,7.210,98.95,384,1.000,bicubic,-9.406,-4.500,+33
deit_base_distilled_patch16_384.fb_in1k,75.550,24.450,92.500,7.500,87.63,384,1.000,bicubic,-9.874,-4.906,-11
inception_next_base.sail_in1k_384,75.530,24.470,92.560,7.440,86.67,384,1.000,bicubic,-9.672,-4.854,+10
regnetz_e8.ra3_in1k,75.510,24.490,92.690,7.310,57.70,320,1.000,bicubic,-9.524,-4.582,+26
cait_s24_384.fb_dist_in1k,75.500,24.500,92.610,7.390,47.06,384,1.000,bicubic,-9.548,-4.736,+24
convformer_s36.sail_in22k_ft_in1k,75.480,24.520,93.190,6.810,40.01,224,1.000,bicubic,-9.934,-4.378,-13
xcit_medium_24_p8_224.fb_dist_in1k,75.470,24.530,92.890,7.110,84.32,224,1.000,bicubic,-9.604,-4.360,+19
resnext101_32x8d.fb_swsl_ig1b_ft_in1k,75.470,24.530,92.730,7.270,88.79,224,0.875,bilinear,-8.832,-4.446,+86
vit_base_patch16_clip_224.openai_ft_in12k_in1k,75.460,24.540,92.750,7.250,86.57,224,0.950,bicubic,-10.482,-4.978,-49
regnety_2560.seer_ft_in1k,75.440,24.560,92.830,7.170,"1,282.60",384,1.000,bicubic,-9.710,-4.608,+7
vit_base_patch32_clip_448.laion2b_ft_in12k_in1k,75.420,24.580,92.710,7.290,88.34,448,1.000,bicubic,-10.360,-4.928,-42
regnety_320.swag_lc_in1k,75.400,24.600,93.680,6.320,145.05,224,0.965,bicubic,-9.148,-3.762,+50
beit_base_patch16_224.in22k_ft_in22k_in1k,75.390,24.610,93.020,6.980,86.53,224,0.900,bicubic,-9.822,-4.638,-2
tf_efficientnetv2_m.in1k,75.380,24.620,92.760,7.240,54.14,480,1.000,bicubic,-9.824,-4.604,-3
tf_efficientnet_b6.ap_in1k,75.380,24.620,92.440,7.560,43.04,528,0.942,bicubic,-9.408,-4.698,+34
vit_base_patch16_224.augreg2_in21k_ft_in1k,75.360,24.640,93.230,6.770,86.57,224,0.900,bicubic,-9.734,-4.300,+7
regnety_120.sw_in12k_ft_in1k,75.330,24.670,92.970,7.030,51.82,288,1.000,bicubic,-10.070,-4.612,-21
volo_d2_224.sail_in1k,75.310,24.690,92.520,7.480,58.68,224,0.960,bicubic,-9.892,-4.670,-4
mvitv2_large.fb_in1k,75.270,24.730,92.360,7.640,217.99,224,0.900,bicubic,-9.974,-4.854,-13
coat_lite_medium_384.in1k,75.270,24.730,92.230,7.770,44.57,384,1.000,bicubic,-9.608,-5.142,+22
dm_nfnet_f2.dm_in1k,75.260,24.740,92.450,7.550,193.78,352,0.920,bicubic,-9.932,-4.896,-6
caformer_s18.sail_in1k_384,75.210,24.790,92.700,7.300,26.34,384,1.000,bicubic,-9.816,-4.658,+9
convnext_small.fb_in22k_ft_in1k_384,75.160,24.840,93.060,6.940,50.22,384,1.000,bicubic,-10.618,-4.830,-53
efficientnetv2_rw_m.agc_in1k,75.150,24.850,92.570,7.430,53.24,416,1.000,bicubic,-9.660,-4.582,+23
ecaresnet269d.ra2_in1k,75.130,24.870,92.830,7.170,102.09,352,1.000,bicubic,-9.838,-4.392,+10
vit_base_patch16_clip_224.laion2b_ft_in1k,75.120,24.880,92.700,7.300,86.57,224,1.000,bicubic,-10.350,-4.876,-39
deit3_large_patch16_224.fb_in1k,75.120,24.880,92.280,7.720,304.37,224,0.900,bicubic,-9.654,-4.756,+23
deit3_small_patch16_384.fb_in22k_ft_in1k,75.090,24.910,92.810,7.190,22.21,384,1.000,bicubic,-9.734,-4.676,+17
vit_base_patch32_clip_384.laion2b_ft_in12k_in1k,75.090,24.910,92.580,7.420,88.30,384,1.000,bicubic,-10.276,-5.080,-30
xcit_medium_24_p16_384.fb_dist_in1k,75.090,24.910,92.440,7.560,84.40,384,1.000,bicubic,-10.334,-4.890,-40
tiny_vit_21m_224.dist_in22k_ft_in1k,75.070,24.930,92.590,7.410,21.20,224,0.950,bicubic,-10.016,-4.776,-6
maxvit_tiny_tf_384.in1k,75.020,24.980,92.450,7.550,30.98,384,1.000,bicubic,-10.080,-4.928,-11
convformer_b36.sail_in1k,74.980,25.020,91.600,8.400,99.88,224,1.000,bicubic,-9.838,-5.346,+13
xcit_small_24_p8_224.fb_dist_in1k,74.970,25.030,92.320,7.680,47.63,224,1.000,bicubic,-9.898,-4.870,+8
convnext_small.in12k_ft_in1k,74.960,25.040,92.520,7.480,50.22,288,1.000,bicubic,-10.370,-5.026,-34
tf_efficientnet_b8.ra_in1k,74.920,25.080,92.330,7.670,87.41,672,0.954,bicubic,-10.448,-5.064,-38
xcit_small_12_p8_384.fb_dist_in1k,74.850,25.150,92.450,7.550,26.21,384,1.000,bicubic,-10.228,-4.832,-11
eca_nfnet_l2.ra3_in1k,74.820,25.180,92.650,7.350,56.72,384,1.000,bicubic,-9.880,-4.616,+15
deit3_base_patch16_384.fb_in1k,74.800,25.200,92.240,7.760,86.88,384,1.000,bicubic,-10.274,-5.034,-11
convnext_tiny.in12k_ft_in1k_384,74.730,25.270,92.810,7.190,28.59,384,1.000,bicubic,-10.392,-4.796,-21
tf_efficientnet_b7.ra_in1k,74.730,25.270,92.220,7.780,66.35,600,0.949,bicubic,-10.202,-4.988,-4
deit3_medium_patch16_224.fb_in22k_ft_in1k,74.690,25.310,92.470,7.530,38.85,224,1.000,bicubic,-9.860,-4.718,+19
convnext_large.fb_in1k,74.650,25.350,91.970,8.030,197.77,288,1.000,bicubic,-10.196,-5.244,+1
xcit_large_24_p16_224.fb_dist_in1k,74.650,25.350,91.860,8.140,189.10,224,1.000,bicubic,-10.266,-5.268,-5
convformer_s18.sail_in1k_384,74.640,25.360,92.480,7.520,26.77,384,1.000,bicubic,-9.762,-4.632,+42
caformer_s36.sail_in1k,74.640,25.360,92.080,7.920,39.30,224,1.000,bicubic,-9.866,-4.916,+23
xcit_small_24_p16_384.fb_dist_in1k,74.590,25.410,92.460,7.540,47.67,384,1.000,bicubic,-10.500,-4.852,-24
tf_efficientnet_b5.ap_in1k,74.590,25.410,91.990,8.010,30.39,456,0.934,bicubic,-9.668,-4.984,+50
maxvit_large_tf_224.in1k,74.580,25.420,91.700,8.300,211.79,224,0.950,bicubic,-10.362,-5.270,-13
vit_medium_patch16_gap_256.sw_in12k_ft_in1k,74.570,25.430,91.960,8.040,38.86,256,0.950,bicubic,-9.876,-5.250,+26
swin_base_patch4_window7_224.ms_in22k_ft_in1k,74.560,25.440,92.540,7.460,87.77,224,0.900,bicubic,-10.712,-5.024,-48
dm_nfnet_f1.dm_in1k,74.560,25.440,92.170,7.830,132.63,320,0.910,bicubic,-10.142,-5.012,+1
convformer_s18.sail_in22k_ft_in1k_384,74.550,25.450,92.760,7.240,26.77,384,1.000,bicubic,-10.448,-4.810,-20
maxxvit_rmlp_small_rw_256.sw_in1k,74.520,25.480,92.000,8.000,66.01,256,0.950,bicubic,-10.104,-5.068,+2
regnetz_040.ra3_in1k,74.510,25.490,91.880,8.120,27.12,320,1.000,bicubic,-9.730,-5.052,+45
seresnet152d.ra2_in1k,74.500,25.500,92.090,7.910,66.84,320,1.000,bicubic,-9.860,-4.950,+35
convnext_small.fb_in22k_ft_in1k,74.490,25.510,92.700,7.300,50.22,288,1.000,bicubic,-10.772,-4.982,-52
davit_base.msft_in1k,74.490,25.510,91.750,8.250,87.95,224,0.950,bicubic,-10.152,-5.270,-3
fastvit_ma36.apple_dist_in1k,74.480,25.520,91.960,8.040,44.07,256,0.950,bicubic,-10.118,-5.042,0
gcvit_base.in1k,74.480,25.520,91.770,8.230,90.32,224,0.875,bicubic,-9.964,-5.312,+18
regnetz_040_h.ra3_in1k,74.460,25.540,92.250,7.750,28.94,320,1.000,bicubic,-10.032,-4.508,+8
resnest200e.in1k,74.460,25.540,91.880,8.120,70.20,320,0.909,bicubic,-9.384,-5.004,+74
coatnet_rmlp_1_rw2_224.sw_in12k_ft_in1k,74.460,25.540,91.640,8.360,41.72,224,0.950,bicubic,-10.450,-5.318,-21
tf_efficientnetv2_s.in21k_ft_in1k,74.440,25.560,92.500,7.500,21.46,384,1.000,bicubic,-9.846,-4.752,+30
maxvit_rmlp_small_rw_224.sw_in1k,74.400,25.600,91.390,8.610,64.90,224,0.900,bicubic,-10.092,-5.620,+6
caformer_s18.sail_in22k_ft_in1k,74.380,25.620,92.500,7.500,26.34,224,1.000,bicubic,-9.694,-4.698,+49
convformer_m36.sail_in1k,74.380,25.620,91.480,8.520,57.05,224,1.000,bicubic,-10.114,-5.386,+2
flexivit_base.1200ep_in1k,74.350,25.650,91.800,8.200,86.59,240,0.950,bicubic,-10.326,-5.194,-14
resnetrs200.tf_in1k,74.340,25.660,91.940,8.060,93.21,320,1.000,bicubic,-10.104,-4.902,+8
seresnextaa101d_32x8d.ah_in1k,74.320,25.680,91.720,8.280,93.59,288,1.000,bicubic,-10.246,-5.356,-10
maxvit_base_tf_224.in1k,74.290,25.710,91.760,8.240,119.47,224,0.950,bicubic,-10.570,-5.228,-28
regnety_160.swag_lc_in1k,74.230,25.770,92.870,7.130,83.59,224,0.965,bicubic,-9.552,-4.410,+76
convnextv2_tiny.fcmae_ft_in22k_in1k_384,74.220,25.780,92.470,7.530,28.64,384,1.000,bicubic,-10.886,-5.158,-53
mvitv2_base.fb_in1k,74.220,25.780,91.650,8.350,51.47,224,0.900,bicubic,-10.230,-5.208,+1
resnest269e.in1k,74.210,25.790,91.960,8.040,110.93,416,0.928,bicubic,-10.298,-5.030,-9
convformer_s36.sail_in1k,74.210,25.790,91.230,8.770,40.01,224,1.000,bicubic,-9.850,-5.516,+42
convnext_tiny.in12k_ft_in1k,74.200,25.800,92.690,7.310,28.59,288,1.000,bicubic,-10.250,-4.650,-3
seresnext101d_32x8d.ah_in1k,74.180,25.820,91.850,8.150,93.59,288,1.000,bicubic,-10.178,-5.070,+13
coatnet_rmlp_2_rw_224.sw_in1k,74.180,25.820,91.300,8.700,73.88,224,0.950,bicubic,-10.428,-5.440,-21
cait_xs24_384.fb_dist_in1k,74.170,25.830,91.930,8.070,26.67,384,1.000,bicubic,-9.892,-4.954,+36
efficientnetv2_rw_s.ra2_in1k,74.160,25.840,91.710,8.290,23.94,384,1.000,bicubic,-9.646,-5.022,+63
resnext101_32x4d.fb_swsl_ig1b_ft_in1k,74.130,25.870,91.990,8.010,44.18,224,0.875,bilinear,-9.096,-4.770,+126
vit_base_patch16_384.orig_in21k_ft_in1k,74.120,25.880,92.360,7.640,86.86,384,1.000,bicubic,-10.080,-4.858,+20
flexivit_base.600ep_in1k,74.120,25.880,91.750,8.250,86.59,240,0.950,bicubic,-10.404,-5.186,-19
flexivit_base.300ep_in1k,74.120,25.880,91.390,8.610,86.59,240,0.950,bicubic,-10.286,-5.494,+4
vit_large_r50_s32_224.augreg_in21k_ft_in1k,74.110,25.890,92.390,7.610,328.99,224,0.900,bicubic,-10.308,-4.782,-4
xcit_small_12_p16_384.fb_dist_in1k,74.110,25.890,92.090,7.910,26.25,384,1.000,bicubic,-10.602,-5.028,-38
eca_nfnet_l1.ra2_in1k,74.110,25.890,92.070,7.930,41.41,320,1.000,bicubic,-9.902,-4.956,+38
convnext_base.fb_in1k,74.110,25.890,91.710,8.290,88.59,288,1.000,bicubic,-10.318,-5.258,-5
volo_d1_224.sail_in1k,74.100,25.900,92.030,7.970,26.63,224,0.960,bicubic,-10.062,-4.746,+18
inception_next_base.sail_in1k,74.080,25.920,91.380,8.620,86.67,224,0.950,bicubic,-10.012,-5.416,+22
vit_base_patch32_clip_384.openai_ft_in12k_in1k,74.050,25.950,92.410,7.590,88.30,384,0.950,bicubic,-11.164,-4.994,-82
vit_base_patch16_224_miil.in21k_ft_in1k,74.040,25.960,91.700,8.300,86.54,224,0.875,bilinear,-10.226,-5.104,+3
xcit_large_24_p8_224.fb_in1k,74.040,25.960,90.890,9.110,188.93,224,1.000,bicubic,-10.354,-5.774,-4
tf_efficientnet_b7.aa_in1k,74.030,25.970,91.870,8.130,66.35,600,0.949,bicubic,-10.386,-5.038,-9
vit_base_patch16_224.augreg_in21k_ft_in1k,74.010,25.990,92.470,7.530,86.57,224,0.900,bicubic,-10.522,-4.824,-33
resnetv2_152x4_bit.goog_in21k_ft_in1k,74.010,25.990,92.350,7.650,936.53,480,1.000,bilinear,-10.906,-5.088,-58
resnext101_32x16d.fb_swsl_ig1b_ft_in1k,73.990,26.010,92.180,7.820,194.03,224,0.875,bilinear,-9.346,-4.666,+101
tf_efficientnetv2_s.in1k,73.990,26.010,91.540,8.460,21.46,384,1.000,bicubic,-9.908,-5.156,+32
swinv2_base_window16_256.ms_in1k,73.980,26.020,91.750,8.250,87.92,256,0.900,bicubic,-10.620,-5.340,-42
regnetz_d32.ra3_in1k,73.960,26.040,91.950,8.050,27.58,320,0.950,bicubic,-10.062,-4.918,+22
seresnext101_32x8d.ah_in1k,73.960,26.040,91.450,8.550,93.57,288,1.000,bicubic,-10.226,-5.424,+4
resnetv2_152x2_bit.goog_in21k_ft_in1k,73.950,26.050,92.670,7.330,236.34,448,1.000,bilinear,-10.560,-4.764,-37
crossvit_18_dagger_408.in1k,73.930,26.070,91.410,8.590,44.61,408,1.000,bicubic,-10.272,-5.408,0
rexnetr_300.sw_in12k_ft_in1k,73.920,26.080,92.210,7.790,34.81,288,1.000,bicubic,-10.626,-5.046,-43
resnetrs420.tf_in1k,73.920,26.080,91.780,8.220,191.89,416,1.000,bicubic,-11.084,-5.344,-73
xcit_small_12_p8_224.fb_dist_in1k,73.920,26.080,91.720,8.280,26.21,224,1.000,bicubic,-10.316,-5.150,-6
resmlp_big_24_224.fb_in22k_ft_in1k,73.900,26.100,91.760,8.240,129.14,224,0.875,bicubic,-10.498,-5.352,-20
pit_b_distilled_224.in1k,73.900,26.100,90.780,9.220,74.79,224,0.900,bicubic,-9.866,-5.688,+42
tf_efficientnet_b6.aa_in1k,73.890,26.110,91.750,8.250,43.04,528,0.942,bicubic,-10.222,-5.134,+2
edgenext_base.usi_in1k,73.880,26.120,91.760,8.240,18.51,320,1.000,bicubic,-10.078,-5.010,+16
regnety_1280.seer_ft_in1k,73.860,26.140,91.900,8.100,644.81,384,1.000,bicubic,-10.572,-5.192,-32
deit3_small_patch16_224.fb_in22k_ft_in1k,73.850,26.150,91.970,8.030,22.06,224,1.000,bicubic,-9.226,-4.806,+113
tf_efficientnet_b3.ns_jft_in1k,73.850,26.150,91.860,8.140,12.23,300,0.904,bicubic,-10.202,-5.058,+6
maxvit_rmlp_tiny_rw_256.sw_in1k,73.830,26.170,91.440,8.560,29.15,256,0.950,bicubic,-10.394,-5.428,-12
vit_small_r26_s32_384.augreg_in21k_ft_in1k,73.790,26.210,92.300,7.700,36.47,384,1.000,bicubic,-10.258,-5.028,+5
davit_small.msft_in1k,73.780,26.220,91.560,8.440,49.75,224,0.950,bicubic,-10.472,-5.380,-19
fastvit_ma36.apple_in1k,73.780,26.220,91.490,8.510,44.07,256,0.950,bicubic,-10.102,-5.252,+16
regnety_080.ra3_in1k,73.760,26.240,91.800,8.200,39.18,288,1.000,bicubic,-10.166,-5.090,+8
maxvit_small_tf_224.in1k,73.760,26.240,91.410,8.590,68.93,224,0.950,bicubic,-10.666,-5.414,-36
regnetz_d8.ra3_in1k,73.750,26.250,92.000,8.000,23.37,320,1.000,bicubic,-10.302,-4.996,-2
focalnet_base_lrf.ms_in1k,73.740,26.260,90.990,9.010,88.75,224,0.900,bicubic,-10.098,-5.618,+17
fastvit_sa36.apple_dist_in1k,73.730,26.270,91.730,8.270,31.53,256,0.900,bicubic,-10.296,-5.124,-1
resnetrs270.tf_in1k,73.730,26.270,91.580,8.420,129.86,352,1.000,bicubic,-10.698,-5.388,-42
efficientvit_b3.r288_in1k,73.720,26.280,91.450,8.550,48.65,288,1.000,bicubic,-10.434,-5.286,-16
gcvit_small.in1k,73.700,26.300,91.230,8.770,51.09,224,0.875,bicubic,-10.192,-5.428,+7
resnext101_32x8d.fb_wsl_ig1b_ft_in1k,73.670,26.330,92.140,7.860,88.79,224,0.875,bilinear,-9.028,-4.004,+130
resnet200d.ra2_in1k,73.670,26.330,91.570,8.430,64.69,320,1.000,bicubic,-10.294,-5.256,-1
resnetv2_101x3_bit.goog_in21k_ft_in1k,73.660,26.340,92.470,7.530,387.93,448,1.000,bilinear,-10.778,-4.912,-50
xcit_medium_24_p16_224.fb_dist_in1k,73.650,26.350,91.570,8.430,84.40,224,1.000,bicubic,-10.636,-5.362,-35
convnextv2_tiny.fcmae_ft_in22k_in1k,73.640,26.360,91.960,8.040,28.64,288,1.000,bicubic,-10.776,-5.300,-46
inception_next_small.sail_in1k,73.620,26.380,91.240,8.760,49.37,224,0.875,bicubic,-9.958,-5.358,+40
edgenext_base.in21k_ft_in1k,73.610,26.390,91.860,8.140,18.51,320,1.000,bicubic,-10.444,-5.336,-15
repvgg_d2se.rvgg_in1k,73.610,26.390,91.380,8.620,133.33,320,1.000,bilinear,-9.950,-5.278,+38
regnety_064.ra3_in1k,73.610,26.390,91.330,8.670,30.58,288,1.000,bicubic,-10.110,-5.392,+23
tresnet_v2_l.miil_in21k_ft_in1k,73.590,26.410,90.990,9.010,46.17,224,0.875,bilinear,-10.304,-5.500,-4
swinv2_base_window8_256.ms_in1k,73.550,26.450,91.510,8.490,87.92,256,0.900,bicubic,-10.700,-5.414,-38
tf_efficientnet_b5.ra_in1k,73.540,26.460,91.460,8.540,30.39,456,0.934,bicubic,-10.274,-5.292,+5
resnetaa101d.sw_in12k_ft_in1k,73.520,26.480,91.750,8.250,44.57,288,1.000,bicubic,-10.604,-5.356,-28
cs3se_edgenet_x.c2ns_in1k,73.510,26.490,91.480,8.520,50.72,320,1.000,bicubic,-10.036,-5.190,+34
resnet152d.ra2_in1k,73.510,26.490,91.240,8.760,60.21,320,1.000,bicubic,-10.174,-5.498,+23
regnetz_d8_evos.ch_in1k,73.490,26.510,91.710,8.290,23.46,320,1.000,bicubic,-10.636,-5.302,-33
deit3_base_patch16_224.fb_in1k,73.490,26.510,91.280,8.720,86.59,224,0.900,bicubic,-10.296,-5.306,+5
repvit_m2_3.dist_450e_in1k,73.480,26.520,91.260,8.740,23.69,224,0.950,bicubic,-10.262,-5.384,+10
regnetv_064.ra3_in1k,73.460,26.540,91.600,8.400,30.58,288,1.000,bicubic,-10.256,-5.142,+13
convnext_small.fb_in1k,73.450,26.550,91.330,8.670,50.22,288,1.000,bicubic,-10.250,-5.478,+14
coat_lite_medium.in1k,73.450,26.550,91.230,8.770,44.57,224,0.900,bicubic,-10.150,-5.498,+23
sequencer2d_l.in1k,73.450,26.550,91.090,8.910,54.30,224,0.875,bicubic,-9.944,-5.406,+43
xcit_tiny_24_p8_384.fb_dist_in1k,73.430,26.570,91.560,8.440,12.11,384,1.000,bicubic,-10.316,-4.840,+3
twins_svt_large.in1k,73.420,26.580,90.890,9.110,99.27,224,0.900,bicubic,-10.258,-5.698,+15
tiny_vit_21m_224.in1k,73.410,26.590,91.480,8.520,21.20,224,0.950,bicubic,-9.844,-5.112,+54
resnetrs350.tf_in1k,73.400,26.600,91.300,8.700,163.96,384,1.000,bicubic,-11.314,-5.692,-103
pvt_v2_b4.in1k,73.400,26.600,91.070,8.930,62.56,224,0.900,bicubic,-10.312,-5.600,+7
tf_efficientnet_b5.aa_in1k,73.390,26.610,91.210,8.790,30.39,456,0.934,bicubic,-10.298,-5.502,+9
regnety_160.deit_in1k,73.380,26.620,91.700,8.300,83.59,288,1.000,bicubic,-10.310,-5.080,+7
swin_s3_base_224.ms_in1k,73.350,26.650,91.180,8.820,71.13,224,0.900,bicubic,-10.570,-5.492,-27
convformer_s18.sail_in22k_ft_in1k,73.340,26.660,91.910,8.090,26.77,224,1.000,bicubic,-10.398,-5.138,-1
efficientnet_b4.ra2_in1k,73.340,26.660,91.270,8.730,19.34,384,1.000,bicubic,-10.074,-5.328,+30
gcvit_tiny.in1k,73.330,26.670,90.970,9.030,28.22,224,0.875,bicubic,-10.054,-5.428,+34
repvit_m2_3.dist_300e_in1k,73.320,26.680,90.890,9.110,23.69,224,0.950,bicubic,-10.184,-5.614,+16
mvitv2_small.fb_in1k,73.300,26.700,91.230,8.770,34.87,224,0.900,bicubic,-10.470,-5.346,-12
resnet152.a1h_in1k,73.300,26.700,91.180,8.820,60.19,288,1.000,bicubic,-10.150,-5.358,+21
resmlp_big_24_224.fb_distilled_in1k,73.300,26.700,91.170,8.830,129.14,224,0.875,bicubic,-10.292,-5.480,+9
swin_small_patch4_window7_224.ms_in22k_ft_in1k,73.290,26.710,92.030,7.970,49.61,224,0.900,bicubic,-10.008,-4.934,+39
vit_small_patch16_384.augreg_in21k_ft_in1k,73.290,26.710,91.990,8.010,22.20,384,1.000,bicubic,-10.514,-5.110,-20
xcit_small_24_p16_224.fb_dist_in1k,73.270,26.730,91.460,8.540,47.67,224,1.000,bicubic,-10.604,-5.276,-32
focalnet_base_srf.ms_in1k,73.270,26.730,91.260,8.740,88.15,224,0.900,bicubic,-10.550,-5.420,-24
swinv2_small_window16_256.ms_in1k,73.250,26.750,91.290,8.710,49.73,256,0.900,bicubic,-10.974,-5.488,-68
pvt_v2_b5.in1k,73.250,26.750,91.080,8.920,81.96,224,0.900,bicubic,-10.490,-5.556,-14
deit_base_distilled_patch16_224.fb_in1k,73.250,26.750,91.010,8.990,87.34,224,0.900,bicubic,-10.140,-5.478,+24
maxvit_tiny_rw_224.sw_in1k,73.230,26.770,90.770,9.230,29.06,224,0.950,bicubic,-10.274,-5.744,+6
pvt_v2_b3.in1k,73.210,26.790,91.010,8.990,45.24,224,0.900,bicubic,-9.908,-5.546,+49
resnetrs152.tf_in1k,73.200,26.800,91.260,8.740,86.62,320,1.000,bicubic,-10.502,-5.352,-13
fastvit_sa24.apple_dist_in1k,73.180,26.820,91.350,8.650,21.55,256,0.900,bicubic,-10.162,-5.202,+24
swin_base_patch4_window12_384.ms_in1k,73.160,26.840,91.130,8.870,87.90,384,1.000,bicubic,-11.316,-5.762,-102
efficientvit_b3.r256_in1k,73.150,26.850,91.110,8.890,48.65,256,1.000,bicubic,-10.652,-5.406,-30
vit_base_patch32_384.augreg_in21k_ft_in1k,73.130,26.870,91.250,8.750,88.30,384,1.000,bicubic,-10.222,-5.590,+19
nest_base_jx.goog_in1k,73.130,26.870,91.070,8.930,67.72,224,0.875,bicubic,-10.404,-5.304,-2
xcit_medium_24_p8_224.fb_in1k,73.130,26.870,90.280,9.720,84.32,224,1.000,bicubic,-10.616,-6.430,-26
regnety_640.seer_ft_in1k,73.110,26.890,91.520,8.480,281.38,384,1.000,bicubic,-10.798,-5.402,-50
swinv2_small_window8_256.ms_in1k,73.100,26.900,90.950,9.050,49.73,256,0.900,bicubic,-10.754,-5.694,-45
deit3_small_patch16_384.fb_in1k,73.090,26.910,91.220,8.780,22.21,384,1.000,bicubic,-10.338,-5.454,+4
xcit_small_24_p8_224.fb_in1k,73.090,26.910,91.160,8.840,47.63,224,1.000,bicubic,-10.744,-5.472,-42
cait_s24_224.fb_dist_in1k,73.060,26.940,91.120,8.880,46.92,224,1.000,bicubic,-10.382,-5.454,+1
efficientformerv2_l.snap_dist_in1k,73.060,26.940,90.880,9.120,26.32,224,0.950,bicubic,-10.572,-5.678,-18
focalnet_small_srf.ms_in1k,73.060,26.940,90.730,9.270,49.89,224,0.900,bicubic,-10.356,-5.708,+1
coatnet_rmlp_1_rw_224.sw_in1k,73.040,26.960,90.880,9.120,41.69,224,0.950,bicubic,-10.322,-5.570,+9
fastvit_sa36.apple_in1k,73.030,26.970,90.930,9.070,31.53,256,0.900,bicubic,-10.470,-5.700,-10
efficientvit_b3.r224_in1k,72.980,27.020,90.570,9.430,48.65,224,0.950,bicubic,-10.480,-5.760,-6
crossvit_15_dagger_408.in1k,72.970,27.030,91.090,8.910,28.50,408,1.000,bicubic,-10.870,-5.688,-52
caformer_s18.sail_in1k,72.960,27.040,91.010,8.990,26.34,224,1.000,bicubic,-10.694,-5.508,-25
regnety_320.tv2_in1k,72.940,27.060,90.760,9.240,145.05,224,0.965,bicubic,-10.222,-5.654,+21
ecaresnet101d.miil_in1k,72.930,27.070,91.180,8.820,44.57,288,0.950,bicubic,-10.054,-5.362,+43
regnetv_040.ra3_in1k,72.930,27.070,91.120,8.880,20.64,288,1.000,bicubic,-10.260,-5.538,+16
tf_efficientnet_b4.ap_in1k,72.900,27.100,90.980,9.020,19.34,380,0.922,bicubic,-10.350,-5.416,+9
maxvit_tiny_tf_224.in1k,72.900,27.100,90.830,9.170,30.92,224,0.950,bicubic,-10.502,-5.760,-7
coatnet_1_rw_224.sw_in1k,72.900,27.100,90.790,9.210,41.72,224,0.950,bicubic,-10.696,-5.592,-25
resnetv2_152x2_bit.goog_teacher_in21k_ft_in1k_384,72.870,27.130,91.550,8.450,236.34,384,1.000,bicubic,-10.966,-5.576,-58
swinv2_cr_small_ns_224.sw_in1k,72.850,27.150,90.820,9.180,49.70,224,0.900,bicubic,-10.648,-5.664,-20
regnety_040.ra3_in1k,72.790,27.210,90.750,9.250,20.65,288,1.000,bicubic,-10.254,-5.752,+29
regnety_032.ra_in1k,72.780,27.220,90.950,9.050,19.44,288,1.000,bicubic,-9.946,-5.466,+54
convnextv2_tiny.fcmae_ft_in1k,72.770,27.230,91.180,8.820,28.64,288,1.000,bicubic,-10.694,-5.538,-20
xception65p.ra3_in1k,72.760,27.240,90.910,9.090,39.82,299,0.940,bicubic,-10.366,-5.572,+15
swin_s3_small_224.ms_in1k,72.720,27.280,90.560,9.440,49.74,224,0.900,bicubic,-11.036,-5.892,-53
resnetv2_101.a1h_in1k,72.690,27.310,90.670,9.330,44.54,288,1.000,bicubic,-10.310,-5.784,+28
efficientformer_l7.snap_dist_in1k,72.660,27.340,90.800,9.200,82.23,224,0.950,bicubic,-10.722,-5.736,-12
xcit_small_12_p8_224.fb_in1k,72.630,27.370,90.670,9.330,26.21,224,1.000,bicubic,-10.704,-5.812,-7
nfnet_l0.ra2_in1k,72.610,27.390,90.990,9.010,35.07,288,1.000,bicubic,-10.140,-5.526,+44
resnext101_64x4d.c1_in1k,72.610,27.390,90.830,9.170,83.46,288,1.000,bicubic,-10.546,-5.544,+3
focalnet_small_lrf.ms_in1k,72.610,27.390,90.720,9.280,50.34,224,0.900,bicubic,-10.884,-5.860,-28
pnasnet5large.tf_in1k,72.610,27.390,90.500,9.500,86.06,331,0.911,bicubic,-10.172,-5.540,+39
xception65.ra3_in1k,72.600,27.400,90.840,9.160,39.92,299,0.940,bicubic,-10.580,-5.752,-1
cs3sedarknet_x.c2ns_in1k,72.580,27.420,91.060,8.940,35.40,288,1.000,bicubic,-10.078,-5.290,+52
resnest101e.in1k,72.580,27.420,90.810,9.190,48.28,256,0.875,bilinear,-10.304,-5.512,+27
twins_pcpvt_large.in1k,72.580,27.420,90.700,9.300,60.99,224,0.900,bicubic,-10.550,-5.904,+2
gc_efficientnetv2_rw_t.agc_in1k,72.570,27.430,90.830,9.170,13.68,288,1.000,bicubic,-9.886,-5.466,+82
tf_efficientnet_b5.in1k,72.560,27.440,91.090,8.910,30.39,456,0.934,bicubic,-10.616,-5.446,-6
resnext50_32x4d.fb_swsl_ig1b_ft_in1k,72.560,27.440,90.840,9.160,25.03,224,0.875,bilinear,-9.612,-5.384,+119
tresnet_xl.miil_in1k_448,72.560,27.440,90.310,9.690,78.44,448,0.875,bilinear,-10.498,-5.862,+9
twins_svt_base.in1k,72.550,27.450,90.450,9.550,56.07,224,0.900,bicubic,-10.570,-5.964,-1
deit_base_patch16_384.fb_in1k,72.550,27.450,90.260,9.740,86.86,384,1.000,bicubic,-10.554,-6.108,+3
resnetv2_50x3_bit.goog_in21k_ft_in1k,72.520,27.480,91.760,8.240,217.32,448,1.000,bilinear,-11.500,-5.366,-98
xcit_small_12_p16_224.fb_dist_in1k,72.520,27.480,91.130,8.870,26.25,224,1.000,bicubic,-10.808,-5.286,-22
rexnet_300.nav_in1k,72.520,27.480,90.610,9.390,34.71,224,0.875,bicubic,-10.254,-5.628,+28
vit_base_patch32_clip_224.laion2b_ft_in12k_in1k,72.510,27.490,90.880,9.120,88.22,224,0.900,bicubic,-10.786,-5.648,-23
deit3_medium_patch16_224.fb_in1k,72.510,27.490,90.810,9.190,38.85,224,0.900,bicubic,-10.576,-5.484,-1
convformer_s18.sail_in1k,72.510,27.490,90.510,9.490,26.77,224,1.000,bicubic,-10.476,-5.740,+10
convnext_tiny.fb_in22k_ft_in1k_384,72.470,27.530,91.540,8.460,28.59,384,1.000,bicubic,-11.618,-5.604,-114
regnety_080_tv.tv2_in1k,72.460,27.540,90.540,9.460,39.38,224,0.965,bicubic,-10.134,-5.708,+47
xcit_tiny_24_p8_224.fb_dist_in1k,72.440,27.560,90.920,9.080,12.11,224,1.000,bicubic,-10.126,-5.138,+55
sequencer2d_m.in1k,72.430,27.570,90.710,9.290,38.31,224,0.875,bicubic,-10.382,-5.564,+14
resnet101d.ra2_in1k,72.430,27.570,90.650,9.350,44.57,320,1.000,bicubic,-10.590,-5.802,0
regnetx_320.tv2_in1k,72.420,27.580,90.300,9.700,107.81,224,0.965,bicubic,-10.390,-5.908,+14
maxxvit_rmlp_nano_rw_256.sw_in1k,72.370,27.630,90.750,9.250,16.78,256,0.950,bicubic,-10.672,-5.600,-4
nest_small_jx.goog_in1k,72.350,27.650,90.690,9.310,38.35,224,0.875,bicubic,-10.774,-5.630,-17
maxvit_rmlp_nano_rw_256.sw_in1k,72.350,27.650,90.430,9.570,15.50,256,0.950,bicubic,-10.604,-5.836,+2
efficientvit_b2.r288_in1k,72.330,27.670,90.930,9.070,24.33,288,1.000,bicubic,-10.770,-5.374,-13
resnext101_32x8d.tv2_in1k,72.330,27.670,90.270,9.730,88.79,224,0.965,bilinear,-10.502,-5.962,+6
tf_efficientnet_b4.aa_in1k,72.300,27.700,90.590,9.410,19.34,380,0.922,bicubic,-10.718,-5.710,-7
tf_efficientnet_b2.ns_jft_in1k,72.280,27.720,91.110,8.890,9.11,260,0.890,bicubic,-10.098,-5.144,+65
hrnet_w18_ssld.paddle_in1k,72.270,27.730,90.810,9.190,21.30,288,1.000,bilinear,-9.778,-5.440,+108
maxvit_nano_rw_256.sw_in1k,72.270,27.730,90.570,9.430,15.45,256,0.950,bicubic,-10.658,-5.650,-3
tresnet_m.miil_in21k_ft_in1k,72.270,27.730,90.230,9.770,31.39,224,0.875,bilinear,-10.800,-5.880,-16
swin_base_patch4_window7_224.ms_in1k,72.260,27.740,90.810,9.190,87.77,224,0.900,bicubic,-11.346,-5.642,-77
resnext101_64x4d.tv_in1k,72.260,27.740,90.550,9.450,83.46,224,0.875,bilinear,-10.732,-5.694,-10
resnetv2_50x1_bit.goog_distilled_in1k,72.240,27.760,90.980,9.020,25.55,224,0.875,bicubic,-10.584,-5.538,-2
nasnetalarge.tf_in1k,72.240,27.760,90.460,9.540,88.75,331,0.911,bicubic,-10.386,-5.582,+24
efficientnetv2_rw_t.ra2_in1k,72.230,27.770,90.420,9.580,13.65,288,1.000,bicubic,-10.120,-5.772,+62
crossvit_18_240.in1k,72.230,27.770,90.270,9.730,43.27,240,0.875,bicubic,-10.170,-5.790,+54
regnetz_c16_evos.ch_in1k,72.220,27.780,91.210,8.790,13.49,320,0.950,bicubic,-10.416,-5.264,+18
cait_xxs36_384.fb_dist_in1k,72.190,27.810,90.840,9.160,17.37,384,1.000,bicubic,-10.014,-5.304,+80
twins_pcpvt_base.in1k,72.190,27.810,90.500,9.500,43.83,224,0.900,bicubic,-10.524,-5.846,+3
crossvit_18_dagger_240.in1k,72.190,27.810,90.090,9.910,44.27,240,0.875,bicubic,-10.328,-5.978,+38
regnetz_c16.ra3_in1k,72.180,27.820,91.120,8.880,13.46,320,1.000,bicubic,-10.452,-5.198,+15
regnety_320.seer_ft_in1k,72.170,27.830,90.870,9.130,145.05,384,1.000,bicubic,-11.158,-5.838,-55
maxxvitv2_nano_rw_256.sw_in1k,72.170,27.830,90.470,9.530,23.70,256,0.950,bicubic,-10.940,-5.854,-33
repvit_m1_5.dist_450e_in1k,72.160,27.840,90.370,9.630,14.64,224,0.950,bicubic,-10.352,-5.742,+34
resnet101.a1h_in1k,72.100,27.900,90.820,9.180,44.55,288,1.000,bicubic,-10.678,-5.490,-9
regnety_160.tv2_in1k,72.100,27.900,90.230,9.770,83.59,224,0.965,bicubic,-10.546,-5.984,+8
inception_next_tiny.sail_in1k,72.090,27.910,90.100,9.900,28.06,224,0.875,bicubic,-10.388,-5.922,+36
xcit_tiny_24_p16_384.fb_dist_in1k,72.070,27.930,90.580,9.420,12.12,384,1.000,bicubic,-10.500,-5.696,+23
tiny_vit_11m_224.dist_in22k_ft_in1k,72.050,27.950,91.450,8.550,11.00,224,0.950,bicubic,-11.178,-5.180,-56
cs3edgenet_x.c2_in1k,72.050,27.950,90.370,9.630,47.82,288,1.000,bicubic,-10.658,-6.000,-5
vit_relpos_medium_patch16_cls_224.sw_in1k,72.020,27.980,90.300,9.700,38.76,224,0.900,bicubic,-10.552,-5.768,+19
davit_tiny.msft_in1k,72.010,27.990,90.070,9.930,28.36,224,0.950,bicubic,-10.686,-6.204,-5
convnext_tiny_hnf.a2h_in1k,72.000,28.000,89.770,10.230,28.59,288,1.000,bicubic,-10.584,-6.238,+13
mobilevitv2_200.cvnets_in22k_ft_in1k_384,71.990,28.010,90.650,9.350,18.45,384,1.000,bicubic,-11.410,-5.932,-77
efficientformer_l3.snap_dist_in1k,71.980,28.020,90.270,9.730,31.41,224,0.950,bicubic,-10.568,-5.980,+18
convnext_tiny.fb_in1k,71.980,28.020,90.220,9.780,28.59,288,1.000,bicubic,-10.718,-6.412,-9
vit_relpos_base_patch16_clsgap_224.sw_in1k,71.970,28.030,90.250,9.750,86.43,224,0.900,bicubic,-10.790,-5.922,-18
sequencer2d_s.in1k,71.940,28.060,90.500,9.500,27.65,224,0.875,bicubic,-10.400,-5.528,+41
resnet152.a2_in1k,71.940,28.060,89.420,10.580,60.19,288,1.000,bicubic,-10.668,-6.708,+2
dm_nfnet_f0.dm_in1k,71.910,28.090,90.760,9.240,71.49,256,0.900,bicubic,-11.576,-5.808,-92
rexnetr_200.sw_in12k_ft_in1k,71.900,28.100,91.260,8.740,16.52,288,1.000,bicubic,-11.238,-5.376,-60
convnext_nano.in12k_ft_in1k,71.900,28.100,91.000,9.000,15.59,288,1.000,bicubic,-10.962,-5.556,-31
swinv2_cr_small_224.sw_in1k,71.900,28.100,90.270,9.730,49.70,224,0.900,bicubic,-11.236,-5.838,-60
convnextv2_nano.fcmae_ft_in22k_in1k,71.890,28.110,90.890,9.110,15.62,288,1.000,bicubic,-10.774,-5.630,-13
fastvit_sa24.apple_in1k,71.890,28.110,90.630,9.370,21.55,256,0.900,bicubic,-10.788,-5.642,-16
repvit_m1_5.dist_300e_in1k,71.850,28.150,90.330,9.670,14.64,224,0.950,bicubic,-10.526,-5.700,+27
resnet101.a1_in1k,71.850,28.150,89.150,10.850,44.55,288,1.000,bicubic,-10.472,-6.482,+36
eca_nfnet_l0.ra2_in1k,71.840,28.160,91.110,8.890,24.14,288,1.000,bicubic,-10.738,-5.382,0
mobilevitv2_175.cvnets_in22k_ft_in1k_384,71.840,28.160,90.770,9.230,14.25,384,1.000,bicubic,-11.098,-5.656,-44
vit_relpos_base_patch16_224.sw_in1k,71.820,28.180,90.250,9.750,86.43,224,0.900,bicubic,-10.676,-5.888,+10
regnetx_160.tv2_in1k,71.800,28.200,90.050,9.950,54.28,224,0.965,bicubic,-10.766,-6.122,+2
seresnext50_32x4d.racm_in1k,71.800,28.200,90.030,9.970,27.56,288,0.950,bicubic,-10.396,-6.118,+48
swin_small_patch4_window7_224.ms_in1k,71.770,28.230,90.260,9.740,49.61,224,0.900,bicubic,-11.438,-6.056,-77
coat_small.in1k,71.750,28.250,90.410,9.590,21.69,224,0.900,bicubic,-10.612,-5.798,+20
flexivit_small.1200ep_in1k,71.750,28.250,90.270,9.730,22.06,240,0.950,bicubic,-10.776,-5.856,0
mvitv2_tiny.fb_in1k,71.740,28.260,90.310,9.690,24.17,224,0.900,bicubic,-10.670,-5.842,+11
flexivit_small.300ep_in1k,71.730,28.270,89.980,10.020,22.06,240,0.950,bicubic,-10.448,-6.058,+45
efficientvit_b2.r256_in1k,71.720,28.280,90.350,9.650,24.33,256,1.000,bicubic,-10.970,-5.744,-30
flexivit_small.600ep_in1k,71.720,28.280,90.150,9.850,22.06,240,0.950,bicubic,-10.642,-5.934,+16
pit_b_224.in1k,71.720,28.280,89.250,10.750,73.76,224,0.900,bicubic,-10.718,-6.464,+6
xcit_large_24_p16_224.fb_in1k,71.710,28.290,89.170,10.830,189.10,224,1.000,bicubic,-11.192,-6.714,-54
pvt_v2_b2_li.in1k,71.700,28.300,90.010,9.990,22.55,224,0.900,bicubic,-10.494,-6.082,+39
resnet50.fb_swsl_ig1b_ft_in1k,71.690,28.310,90.500,9.500,25.56,224,0.875,bilinear,-9.482,-5.486,+136
ecaresnet101d_pruned.miil_in1k,71.680,28.320,90.430,9.570,24.88,288,0.950,bicubic,-10.318,-5.730,+56
coatnet_bn_0_rw_224.sw_in1k,71.670,28.330,90.380,9.620,27.44,224,0.950,bicubic,-10.730,-5.806,+3
gcvit_xtiny.in1k,71.670,28.330,90.250,9.750,19.98,224,0.875,bicubic,-10.284,-5.716,+60
tresnet_xl.miil_in1k,71.650,28.350,89.630,10.370,78.44,224,0.875,bilinear,-10.424,-6.298,+46
resnet61q.ra2_in1k,71.630,28.370,90.280,9.720,36.85,288,1.000,bicubic,-10.894,-5.850,-12
tresnet_l.miil_in1k_448,71.630,28.370,90.030,9.970,55.99,448,0.875,bilinear,-10.646,-5.948,+22
poolformerv2_m48.sail_in1k,71.620,28.380,89.800,10.200,73.35,224,1.000,bicubic,-10.998,-6.272,-32
convnextv2_nano.fcmae_ft_in22k_in1k_384,71.590,28.410,90.760,9.240,15.62,384,1.000,bicubic,-11.784,-5.984,-109
xcit_tiny_12_p8_384.fb_dist_in1k,71.590,28.410,90.700,9.300,6.71,384,1.000,bicubic,-10.798,-5.520,-2
swinv2_tiny_window16_256.ms_in1k,71.590,28.410,90.330,9.670,28.35,256,0.900,bicubic,-11.214,-5.906,-57
convit_base.fb_in1k,71.580,28.420,90.120,9.880,86.54,224,0.875,bicubic,-10.710,-5.816,+13
coatnet_0_rw_224.sw_in1k,71.570,28.430,89.400,10.600,27.44,224,0.950,bicubic,-10.820,-6.436,-5
resnetv2_50d_evos.ah_in1k,71.560,28.440,90.090,9.910,25.59,288,1.000,bicubic,-10.442,-5.810,+43
fbnetv3_g.ra2_in1k,71.530,28.470,90.370,9.630,16.62,288,0.950,bilinear,-10.510,-5.690,+40
crossvit_15_dagger_240.in1k,71.520,28.480,89.850,10.150,28.21,240,0.875,bicubic,-10.810,-6.106,+4
poolformer_m48.sail_in1k,71.520,28.480,89.770,10.230,73.47,224,0.950,bicubic,-10.962,-6.196,-17
resnet152.tv2_in1k,71.510,28.490,89.970,10.030,60.19,224,0.965,bilinear,-10.776,-6.034,+8
resnetaa50d.sw_in12k_ft_in1k,71.500,28.500,90.320,9.680,25.58,288,1.000,bicubic,-11.100,-6.178,-40
mobilevitv2_150.cvnets_in22k_ft_in1k_384,71.490,28.510,90.420,9.580,10.59,384,1.000,bicubic,-11.096,-5.894,-37
resnext101_32x8d.fb_ssl_yfcc100m_ft_in1k,71.480,28.520,90.470,9.530,88.79,224,0.875,bilinear,-10.126,-5.570,+68
wide_resnet50_2.racm_in1k,71.480,28.520,90.220,9.780,68.88,288,0.950,bicubic,-10.800,-5.844,+6
efficientnet_b3.ra2_in1k,71.470,28.530,90.060,9.940,12.23,320,1.000,bicubic,-10.776,-6.058,+7
pvt_v2_b2.in1k,71.450,28.550,90.060,9.940,25.36,224,0.900,bicubic,-10.634,-5.896,+26
resnext101_32x16d.fb_ssl_yfcc100m_ft_in1k,71.420,28.580,90.530,9.470,194.03,224,0.875,bilinear,-10.418,-5.562,+47
resnet51q.ra2_in1k,71.410,28.590,90.180,9.820,35.70,288,1.000,bilinear,-10.950,-6.006,-12
efficientvit_b2.r224_in1k,71.380,28.620,89.710,10.290,24.33,224,0.950,bicubic,-10.768,-5.996,+17
wide_resnet101_2.tv2_in1k,71.360,28.640,89.790,10.210,126.89,224,0.965,bilinear,-11.142,-6.226,-31
xcit_tiny_24_p8_224.fb_in1k,71.350,28.650,90.230,9.770,12.11,224,1.000,bicubic,-10.542,-5.740,+37
vit_relpos_medium_patch16_224.sw_in1k,71.350,28.650,89.960,10.040,38.75,224,0.900,bicubic,-11.112,-6.122,-28
resnet152.a1_in1k,71.350,28.650,89.310,10.690,60.19,288,1.000,bicubic,-11.382,-6.410,-70
tf_efficientnetv2_b3.in21k_ft_in1k,71.340,28.660,90.760,9.240,14.36,300,0.900,bicubic,-11.330,-5.866,-64
vit_base_patch16_224.orig_in21k_ft_in1k,71.320,28.680,90.480,9.520,86.57,224,0.900,bicubic,-10.470,-5.646,+43
tf_efficientnet_b4.in1k,71.320,28.680,90.110,9.890,19.34,380,0.922,bicubic,-11.288,-5.642,-56
mixer_b16_224.miil_in21k_ft_in1k,71.300,28.700,89.650,10.350,59.88,224,0.875,bilinear,-11.006,-6.070,-11
pit_s_distilled_224.in1k,71.260,28.740,89.640,10.360,24.04,224,0.900,bicubic,-10.554,-6.090,+39
resnetv2_152x2_bit.goog_teacher_in21k_ft_in1k,71.250,28.750,90.480,9.520,236.34,224,0.875,bicubic,-11.626,-6.102,-90
ecaresnet50t.ra2_in1k,71.240,28.760,90.450,9.550,25.57,320,0.950,bicubic,-11.112,-5.690,-23
convmixer_1536_20.in1k,71.230,28.770,89.450,10.550,51.63,224,0.960,bicubic,-10.132,-6.164,+81
deit_base_patch16_224.fb_in1k,71.220,28.780,89.190,10.810,86.57,224,0.900,bicubic,-10.772,-6.546,+19
xcit_small_12_p16_224.fb_in1k,71.200,28.800,89.750,10.250,26.25,224,1.000,bicubic,-10.770,-6.062,+21
resnext50_32x4d.a1h_in1k,71.190,28.810,89.690,10.310,25.03,288,1.000,bicubic,-10.824,-6.244,+13
ecaresnet50t.a1_in1k,71.190,28.810,89.580,10.420,25.57,288,1.000,bicubic,-10.938,-6.062,+5
vit_relpos_medium_patch16_rpn_224.sw_in1k,71.170,28.830,90.080,9.920,38.73,224,0.900,bicubic,-11.140,-5.892,-22
crossvit_base_240.in1k,71.170,28.830,89.830,10.170,105.03,240,0.875,bicubic,-11.044,-6.004,-9
vit_base_patch32_clip_224.laion2b_ft_in1k,71.160,28.840,90.210,9.790,88.22,224,0.900,bicubic,-11.422,-5.990,-61
swin_s3_tiny_224.ms_in1k,71.150,28.850,89.710,10.290,28.33,224,0.900,bicubic,-10.994,-6.244,-2
cs3sedarknet_l.c2ns_in1k,71.110,28.890,90.350,9.650,21.91,288,0.950,bicubic,-10.674,-5.614,+30
efficientformerv2_s2.snap_dist_in1k,71.110,28.890,90.140,9.860,12.71,224,0.950,bicubic,-11.056,-5.770,-7
halo2botnet50ts_256.a1h_in1k,71.110,28.890,89.600,10.400,22.64,256,0.950,bicubic,-10.950,-6.034,+2
cs3darknet_x.c2ns_in1k,71.090,28.910,90.150,9.850,35.05,288,1.000,bicubic,-11.132,-6.080,-18
mobilevitv2_200.cvnets_in22k_ft_in1k,71.090,28.910,89.700,10.300,18.45,256,0.888,bicubic,-11.242,-6.242,-33
ecaresnet50d.miil_in1k,71.080,28.920,90.240,9.760,25.58,288,0.950,bicubic,-10.570,-5.642,+32
focalnet_tiny_lrf.ms_in1k,71.060,28.940,89.570,10.430,28.65,224,0.900,bicubic,-11.094,-6.378,-11
convnextv2_nano.fcmae_ft_in1k,71.050,28.950,90.100,9.900,15.62,288,1.000,bicubic,-11.436,-6.126,-56
focalnet_tiny_srf.ms_in1k,71.050,28.950,89.600,10.400,28.43,224,0.900,bicubic,-11.088,-6.368,-10
xcit_tiny_12_p8_224.fb_dist_in1k,71.040,28.960,89.890,10.110,6.71,224,1.000,bicubic,-10.172,-5.712,+77
xcit_small_24_p16_224.fb_in1k,71.040,28.960,89.680,10.320,47.67,224,1.000,bicubic,-11.536,-6.332,-70
tresnet_m.miil_in1k_448,71.020,28.980,88.680,11.320,31.39,448,0.875,bilinear,-10.690,-6.894,+21
resnetv2_101x1_bit.goog_in21k_ft_in1k,71.010,28.990,91.080,8.920,44.54,448,1.000,bilinear,-11.332,-5.440,-43
visformer_small.in1k,71.010,28.990,89.440,10.560,40.22,224,0.900,bicubic,-11.096,-6.438,-13
repvit_m3.dist_in1k,70.990,29.010,89.630,10.370,10.68,224,0.950,bicubic,-10.512,-5.938,+39
xcit_medium_24_p16_224.fb_in1k,70.990,29.010,89.530,10.470,84.40,224,1.000,bicubic,-11.650,-6.452,-93
resnet101.a2_in1k,70.990,29.010,89.160,10.840,44.55,288,1.000,bicubic,-11.246,-6.570,-32
lamhalobotnet50ts_256.a1h_in1k,70.990,29.010,89.040,10.960,22.57,256,0.950,bicubic,-10.562,-6.452,+32
edgenext_small.usi_in1k,70.980,29.020,89.880,10.120,5.59,320,1.000,bicubic,-10.584,-5.832,+27
resnetv2_50d_gn.ah_in1k,70.960,29.040,89.830,10.170,25.57,288,1.000,bicubic,-10.998,-6.098,-5
tnt_s_patch16_224,70.960,29.040,89.610,10.390,23.76,224,0.900,bicubic,-10.576,-6.080,+27
convnext_nano.d1h_in1k,70.960,29.040,89.430,10.570,15.59,288,1.000,bicubic,-10.522,-6.228,+38
tiny_vit_11m_224.in1k,70.940,29.060,89.840,10.160,11.00,224,0.950,bicubic,-10.552,-6.022,+31
resnest50d_4s2x40d.in1k,70.940,29.060,89.720,10.280,30.42,224,0.875,bicubic,-10.180,-5.840,+72
coatnet_nano_rw_224.sw_in1k,70.940,29.060,89.700,10.300,15.14,224,0.900,bicubic,-10.756,-5.946,+11
vit_srelpos_medium_patch16_224.sw_in1k,70.920,29.080,89.940,10.060,38.74,224,0.900,bicubic,-11.320,-6.002,-43
tf_efficientnet_b3.ap_in1k,70.920,29.080,89.430,10.570,12.23,300,0.904,bicubic,-10.900,-6.196,+1
vit_small_patch16_224.augreg_in21k_ft_in1k,70.910,29.090,90.170,9.830,22.05,224,0.900,bicubic,-10.476,-5.966,+42
coatnext_nano_rw_224.sw_in1k,70.890,29.110,90.250,9.750,14.70,224,0.900,bicubic,-11.052,-5.666,-12
coatnet_rmlp_nano_rw_224.sw_in1k,70.890,29.110,89.920,10.080,15.15,224,0.900,bicubic,-11.160,-5.958,-23
vit_base_patch16_rpn_224.sw_in1k,70.890,29.110,89.770,10.230,86.54,224,0.900,bicubic,-11.312,-6.226,-41
vit_large_patch32_384.orig_in21k_ft_in1k,70.870,29.130,90.570,9.430,306.63,384,1.000,bicubic,-10.640,-5.520,+22
nest_tiny_jx.goog_in1k,70.860,29.140,89.940,10.060,17.06,224,0.875,bicubic,-10.566,-5.678,+32
resnetrs101.tf_in1k,70.860,29.140,89.830,10.170,63.62,288,0.940,bicubic,-11.424,-6.184,-54
rexnet_200.nav_in1k,70.860,29.140,89.700,10.300,16.37,224,0.875,bicubic,-10.776,-5.966,+5
poolformerv2_m36.sail_in1k,70.850,29.150,89.330,10.670,56.08,224,1.000,bicubic,-11.366,-6.594,-49
tf_efficientnet_b1.ns_jft_in1k,70.840,29.160,90.120,9.880,7.79,240,0.882,bicubic,-10.548,-5.618,+31
regnety_032.tv2_in1k,70.840,29.160,89.850,10.150,19.44,224,0.965,bicubic,-10.916,-5.994,-5
wide_resnet50_2.tv2_in1k,70.840,29.160,89.270,10.730,68.88,224,0.965,bilinear,-10.766,-6.490,+4
tresnet_l.miil_in1k,70.830,29.170,89.600,10.400,55.99,224,0.875,bilinear,-10.650,-6.024,+16
poolformer_m36.sail_in1k,70.830,29.170,89.510,10.490,56.17,224,0.950,bicubic,-11.272,-6.188,-39
tf_efficientnetv2_b3.in1k,70.830,29.170,89.510,10.490,14.36,300,0.904,bicubic,-11.142,-6.292,-28
fastvit_sa12.apple_dist_in1k,70.830,29.170,89.240,10.760,11.58,256,0.900,bicubic,-11.024,-6.470,-16
levit_384.fb_dist_in1k,70.810,29.190,89.320,10.680,39.13,224,0.900,bicubic,-11.786,-6.698,-111
levit_conv_384.fb_dist_in1k,70.800,29.200,89.320,10.680,39.13,224,0.900,bicubic,-11.790,-6.696,-110
coat_lite_small.in1k,70.780,29.220,89.580,10.420,19.84,224,0.900,bicubic,-11.532,-6.270,-71
ecaresnetlight.miil_in1k,70.750,29.250,89.880,10.120,30.16,288,0.950,bicubic,-10.658,-5.936,+19
deit3_small_patch16_224.fb_in1k,70.750,29.250,89.450,10.550,22.06,224,0.900,bicubic,-10.620,-6.006,+25
regnetx_080.tv2_in1k,70.730,29.270,89.330,10.670,39.57,224,0.965,bicubic,-10.810,-6.212,-1
swinv2_cr_tiny_ns_224.sw_in1k,70.700,29.300,89.380,10.620,28.33,224,0.900,bicubic,-11.102,-6.438,-21
seresnet50.ra2_in1k,70.690,29.310,89.870,10.130,28.09,288,0.950,bicubic,-10.594,-5.782,+28
vit_base_patch32_clip_224.openai_ft_in1k,70.690,29.310,89.830,10.170,88.22,224,0.900,bicubic,-11.240,-6.136,-33
resnet50d.ra2_in1k,70.690,29.310,89.310,10.690,25.58,288,0.950,bicubic,-10.666,-6.428,+23
vit_relpos_small_patch16_224.sw_in1k,70.680,29.320,90.000,10.000,21.98,224,0.900,bicubic,-10.782,-5.820,+7
mobilevitv2_175.cvnets_in22k_ft_in1k,70.670,29.330,89.700,10.300,14.25,256,0.888,bicubic,-11.268,-6.090,-36
resnet101.tv2_in1k,70.630,29.370,89.390,10.610,44.55,224,0.965,bilinear,-11.258,-6.378,-34
tf_efficientnet_b3.aa_in1k,70.620,29.380,89.440,10.560,12.23,300,0.904,bicubic,-11.020,-6.282,-20
resnet50_gn.a1h_in1k,70.620,29.380,89.390,10.610,25.56,288,0.950,bicubic,-10.596,-5.994,+27
crossvit_small_240.in1k,70.620,29.380,89.360,10.640,26.86,240,0.875,bicubic,-10.398,-6.096,+51
cait_xxs24_384.fb_dist_in1k,70.600,29.400,89.730,10.270,12.03,384,1.000,bicubic,-10.372,-5.910,+51
senet154.gluon_in1k,70.600,29.400,88.920,11.080,115.09,224,0.875,bicubic,-10.626,-6.438,+22
convit_small.fb_in1k,70.580,29.420,89.590,10.410,27.78,224,0.875,bicubic,-10.840,-6.154,+4
convnext_nano_ols.d1h_in1k,70.570,29.430,89.100,10.900,15.65,288,1.000,bicubic,-11.030,-6.536,-19
twins_pcpvt_small.in1k,70.570,29.430,89.070,10.930,24.11,224,0.900,bicubic,-10.522,-6.578,+37
regnetz_b16.ra3_in1k,70.550,29.450,89.400,10.600,9.72,288,1.000,bicubic,-10.178,-6.118,+75
resnet50.c2_in1k,70.540,29.460,89.170,10.830,25.56,288,1.000,bicubic,-10.330,-6.364,+60
resnext101_32x4d.fb_ssl_yfcc100m_ft_in1k,70.530,29.470,89.770,10.230,44.18,224,0.875,bilinear,-10.394,-5.964,+50
resnetv2_50.a1h_in1k,70.530,29.470,89.110,10.890,25.55,288,1.000,bicubic,-10.868,-6.616,+1
vit_small_r26_s32_224.augreg_in21k_ft_in1k,70.520,29.480,90.110,9.890,36.43,224,0.900,bicubic,-11.344,-5.912,-46
swinv2_tiny_window8_256.ms_in1k,70.510,29.490,89.500,10.500,28.35,256,0.900,bicubic,-11.310,-6.494,-44
deit_small_distilled_patch16_224.fb_in1k,70.500,29.500,89.460,10.540,22.44,224,0.900,bicubic,-10.716,-6.164,+15
legacy_senet154.in1k,70.500,29.500,88.990,11.010,115.09,224,0.875,bilinear,-10.812,-6.570,+6
repvit_m1_1.dist_450e_in1k,70.490,29.510,89.140,10.860,8.80,224,0.950,bicubic,-10.822,-6.396,+3
halonet50ts.a1h_in1k,70.480,29.520,89.340,10.660,22.73,256,0.940,bicubic,-11.182,-6.270,-38
tf_efficientnet_lite4.in1k,70.450,29.550,89.130,10.870,13.01,380,0.920,bilinear,-11.080,-6.534,-24
resnetaa50.a1h_in1k,70.440,29.560,89.990,10.010,25.56,288,1.000,bicubic,-11.174,-5.812,-36
crossvit_15_240.in1k,70.440,29.560,89.530,10.470,27.53,240,0.875,bicubic,-11.096,-6.206,-26
poolformerv2_s36.sail_in1k,70.430,29.570,89.630,10.370,30.79,224,1.000,bicubic,-11.136,-6.060,-34
twins_svt_small.in1k,70.430,29.570,89.360,10.640,24.06,224,0.900,bicubic,-11.246,-6.298,-45
ecaresnet50t.a2_in1k,70.430,29.570,89.020,10.980,25.57,288,1.000,bicubic,-11.228,-6.530,-41
resnest50d_1s4x24d.in1k,70.420,29.580,89.220,10.780,25.68,224,0.875,bicubic,-10.568,-6.106,+28
resnest50d.in1k,70.420,29.580,88.760,11.240,27.48,224,0.875,bilinear,-10.540,-6.622,+33
seresnext101_64x4d.gluon_in1k,70.410,29.590,89.360,10.640,88.23,224,0.875,bicubic,-10.484,-5.936,+37
gernet_l.idstcv_in1k,70.410,29.590,88.980,11.020,31.08,256,0.875,bilinear,-10.944,-6.550,-8
gcresnext50ts.ch_in1k,70.400,29.600,89.420,10.580,15.67,288,1.000,bicubic,-10.830,-6.122,-3
cs3darknet_l.c2ns_in1k,70.350,29.650,89.750,10.250,21.16,288,0.950,bicubic,-10.546,-5.912,+34
resnet152s.gluon_in1k,70.310,29.690,88.870,11.130,60.32,224,0.875,bicubic,-10.698,-6.546,+22
vit_srelpos_small_patch16_224.sw_in1k,70.290,29.710,89.540,10.460,21.97,224,0.900,bicubic,-10.802,-6.030,+14
repvgg_b3.rvgg_in1k,70.230,29.770,88.740,11.260,123.09,224,0.875,bilinear,-10.276,-6.514,+69
coat_mini.in1k,70.200,29.800,89.460,10.540,10.34,224,0.900,bicubic,-11.070,-5.922,-9
xception41p.ra3_in1k,70.200,29.800,89.100,10.900,26.91,299,0.940,bicubic,-11.772,-6.684,-78
sebotnet33ts_256.a1h_in1k,70.160,29.840,88.820,11.180,13.70,256,0.940,bicubic,-11.008,-6.348,-1
efficientnet_el.ra_in1k,70.140,29.860,89.310,10.690,10.59,300,0.904,bicubic,-11.172,-6.180,-16
inception_resnet_v2.tf_in1k,70.130,29.870,88.690,11.310,55.84,299,0.897,bicubic,-10.328,-6.500,+71
resnet50.d_in1k,70.120,29.880,88.740,11.260,25.56,288,1.000,bicubic,-10.852,-6.690,+17
resnet50d.a1_in1k,70.120,29.880,88.350,11.650,25.58,288,1.000,bicubic,-11.330,-6.868,-33
poolformer_s36.sail_in1k,70.100,29.900,89.130,10.870,30.86,224,0.900,bicubic,-11.330,-6.314,-34
resmlp_36_224.fb_distilled_in1k,70.100,29.900,89.090,10.910,44.69,224,0.875,bicubic,-11.048,-6.388,-5
haloregnetz_b.ra3_in1k,70.100,29.900,88.900,11.100,11.68,224,0.940,bicubic,-10.946,-6.300,+9
ecaresnet50d_pruned.miil_in1k,70.090,29.910,89.510,10.490,19.94,288,0.950,bicubic,-10.700,-6.060,+34
resnet50.c1_in1k,70.070,29.930,89.000,11.000,25.56,288,1.000,bicubic,-10.842,-6.552,+18
gcresnet50t.ra2_in1k,70.050,29.950,89.520,10.480,25.90,288,1.000,bicubic,-11.406,-6.198,-40
sehalonet33ts.ra2_in1k,70.050,29.950,88.740,11.260,13.69,256,0.940,bicubic,-10.908,-6.532,+12
seresnext101_32x4d.gluon_in1k,70.020,29.980,88.940,11.060,48.96,224,0.875,bicubic,-10.872,-6.356,+17
regnety_320.pycls_in1k,70.020,29.980,88.900,11.100,145.05,224,0.875,bicubic,-10.790,-6.338,+28
seresnet50.a2_in1k,70.000,30.000,88.710,11.290,28.09,288,1.000,bicubic,-11.106,-6.512,-8
levit_256.fb_dist_in1k,69.970,30.030,89.240,10.760,18.89,224,0.900,bicubic,-11.554,-6.254,-55
levit_conv_256.fb_dist_in1k,69.960,30.040,89.250,10.750,18.89,224,0.900,bicubic,-11.562,-6.240,-55
resnet152d.gluon_in1k,69.940,30.060,88.500,11.500,60.21,224,0.875,bicubic,-10.536,-6.702,+53
fastvit_s12.apple_dist_in1k,69.910,30.090,88.930,11.070,9.47,256,0.900,bicubic,-11.160,-6.354,-5
maxvit_rmlp_pico_rw_256.sw_in1k,69.890,30.110,89.250,10.750,7.52,256,0.950,bicubic,-10.624,-5.964,+45
pit_s_224.in1k,69.880,30.120,88.940,11.060,23.46,224,0.900,bicubic,-11.206,-6.390,-8
swin_tiny_patch4_window7_224.ms_in22k_ft_in1k,69.850,30.150,90.410,9.590,28.29,224,0.900,bicubic,-11.118,-5.604,0
resnext50_32x4d.ra_in1k,69.830,30.170,88.220,11.780,25.03,288,0.950,bicubic,-10.868,-7.172,+29
regnety_016.tv2_in1k,69.810,30.190,89.360,10.640,11.20,224,0.965,bicubic,-10.856,-5.970,+31
seresnet50.a1_in1k,69.810,30.190,88.550,11.450,28.09,288,1.000,bicubic,-11.292,-6.778,-16
mobilevitv2_150.cvnets_in22k_ft_in1k,69.800,30.200,89.180,10.820,10.59,256,0.888,bicubic,-11.688,-6.488,-60
resnet50.a1_in1k,69.780,30.220,88.350,11.650,25.56,288,1.000,bicubic,-11.434,-6.752,-31
mobilevitv2_200.cvnets_in1k,69.750,30.250,88.620,11.380,18.45,256,0.888,bicubic,-11.384,-6.742,-24
resnext50_32x4d.a2_in1k,69.750,30.250,88.200,11.800,25.03,288,1.000,bicubic,-11.554,-6.896,-41
resnet50.tv2_in1k,69.740,30.260,88.600,11.400,25.56,224,0.965,bilinear,-11.108,-6.834,+8
ese_vovnet39b.ra_in1k,69.730,30.270,89.550,10.450,24.57,288,0.950,bicubic,-10.620,-5.816,+53
tiny_vit_5m_224.dist_in22k_ft_in1k,69.730,30.270,89.440,10.560,5.39,224,0.950,bicubic,-11.146,-6.224,0
lambda_resnet50ts.a1h_in1k,69.730,30.270,88.830,11.170,21.54,256,0.950,bicubic,-11.428,-6.268,-30
resnext50_32x4d.fb_ssl_yfcc100m_ft_in1k,69.710,30.290,89.420,10.580,25.03,224,0.875,bilinear,-10.624,-5.980,+53
xcit_tiny_24_p16_224.fb_dist_in1k,69.710,30.290,88.720,11.280,12.12,224,1.000,bicubic,-10.744,-6.498,+42
fastvit_sa12.apple_in1k,69.700,30.300,88.950,11.050,11.58,256,0.900,bicubic,-11.144,-6.390,+3
xcit_tiny_12_p16_384.fb_dist_in1k,69.690,30.310,89.020,10.980,6.72,384,1.000,bicubic,-11.248,-6.394,-12
resmlp_24_224.fb_distilled_in1k,69.660,30.340,89.070,10.930,30.02,224,0.875,bicubic,-11.096,-6.154,+8
resnext101_64x4d.gluon_in1k,69.660,30.340,88.260,11.740,83.46,224,0.875,bicubic,-10.940,-6.732,+22
tresnet_m.miil_in1k,69.650,30.350,88.000,12.000,31.39,224,0.875,bilinear,-11.148,-6.856,+2
resnext50d_32x4d.bt_in1k,69.640,30.360,89.250,10.750,25.05,288,0.950,bicubic,-11.024,-6.170,+15
regnetx_032.tv2_in1k,69.620,30.380,89.360,10.640,15.30,224,0.965,bicubic,-11.306,-5.918,-16
efficientnet_b3_pruned.in1k,69.580,30.420,88.960,11.040,9.86,300,0.904,bicubic,-11.272,-6.284,-6
fastvit_t12.apple_dist_in1k,69.580,30.420,88.410,11.590,7.55,256,0.900,bicubic,-10.772,-6.632,+39
convnextv2_pico.fcmae_ft_in1k,69.570,30.430,89.230,10.770,9.07,288,0.950,bicubic,-11.516,-6.250,-33
eva02_tiny_patch14_336.mim_in22k_ft_in1k,69.560,30.440,89.320,10.680,5.76,336,1.000,bicubic,-11.070,-6.206,+12
repvit_m1_1.dist_300e_in1k,69.560,30.440,88.820,11.180,8.80,224,0.950,bicubic,-11.266,-6.350,-7
nf_resnet50.ra2_in1k,69.550,30.450,88.730,11.270,25.56,288,0.940,bicubic,-11.090,-6.604,+9
gernet_m.idstcv_in1k,69.540,30.460,88.690,11.310,21.14,224,0.875,bilinear,-11.196,-6.500,-1
inception_resnet_v2.tf_ens_adv_in1k,69.540,30.460,88.490,11.510,55.84,299,0.897,bicubic,-10.438,-6.458,+61
repvgg_b3g4.rvgg_in1k,69.530,30.470,88.450,11.550,83.83,224,0.875,bilinear,-10.686,-6.642,+50
gcresnet33ts.ra2_in1k,69.510,30.490,89.110,10.890,19.88,288,1.000,bicubic,-11.090,-6.212,+7
efficientnet_el_pruned.in1k,69.510,30.490,88.940,11.060,10.59,300,0.904,bicubic,-10.788,-6.282,+40
efficientnet_b2.ra_in1k,69.490,30.510,88.690,11.310,9.11,288,1.000,bicubic,-11.120,-6.624,+5
resnext50_32x4d.a1_in1k,69.490,30.510,88.340,11.660,25.03,288,1.000,bicubic,-11.976,-6.834,-86
swin_tiny_patch4_window7_224.ms_in1k,69.470,30.530,89.060,10.940,28.29,224,0.900,bicubic,-11.906,-6.484,-77
regnetx_320.pycls_in1k,69.470,30.530,88.270,11.730,107.81,224,0.875,bicubic,-10.776,-6.752,+41
vit_base_patch32_224.augreg_in21k_ft_in1k,69.450,30.550,89.440,10.560,88.22,224,0.900,bicubic,-11.266,-6.126,-9
res2net101d.in1k,69.450,30.550,88.710,11.290,45.23,224,0.875,bilinear,-11.768,-6.640,-65
gcvit_xxtiny.in1k,69.440,30.560,88.840,11.160,12.00,224,0.875,bicubic,-10.286,-6.240,+72
cspresnext50.ra_in1k,69.430,30.570,88.600,11.400,20.57,256,0.887,bilinear,-11.124,-6.726,+1
rexnet_150.nav_in1k,69.410,30.590,88.980,11.020,9.73,224,0.875,bicubic,-10.914,-6.010,+25
efficientvit_b1.r288_in1k,69.410,30.590,88.680,11.320,9.10,288,1.000,bicubic,-10.914,-6.496,+27
resnext50_32x4d.tv2_in1k,69.400,30.600,88.440,11.560,25.03,224,0.965,bilinear,-11.782,-6.900,-66
eca_resnet33ts.ra2_in1k,69.380,30.620,89.210,10.790,19.68,288,1.000,bicubic,-11.292,-6.154,-12
seresnet33ts.ra2_in1k,69.380,30.620,89.190,10.810,19.78,288,1.000,bicubic,-11.404,-6.172,-23
convmixer_768_32.in1k,69.380,30.620,88.880,11.120,21.11,224,0.960,bicubic,-10.788,-6.194,+39
inception_v4.tf_in1k,69.370,30.630,88.780,11.220,42.68,299,0.875,bicubic,-10.786,-6.190,+38
xception71.tf_in1k,69.360,30.640,88.270,11.730,42.34,299,0.903,bicubic,-10.514,-6.658,+49
darknet53.c2ns_in1k,69.350,30.650,88.780,11.220,41.61,288,1.000,bicubic,-11.182,-6.652,-6
cs3darknet_focus_l.c2ns_in1k,69.340,30.660,89.420,10.580,21.15,288,0.950,bicubic,-11.536,-6.262,-39
legacy_seresnext101_32x4d.in1k,69.340,30.660,88.050,11.950,48.96,224,0.875,bilinear,-10.892,-6.970,+28
vit_small_patch16_384.augreg_in1k,69.320,30.680,89.000,11.000,22.20,384,1.000,bicubic,-11.796,-6.574,-67
repvit_m1_0.dist_450e_in1k,69.310,30.690,88.700,11.300,7.30,224,0.950,bicubic,-11.124,-6.218,+4
mobilevitv2_175.cvnets_in1k,69.290,30.710,88.960,11.040,14.25,256,0.888,bicubic,-11.570,-6.296,-39
efficientformer_l1.snap_dist_in1k,69.280,30.720,88.560,11.440,12.29,224,0.950,bicubic,-11.218,-6.428,-8
vit_small_patch32_384.augreg_in21k_ft_in1k,69.270,30.730,89.820,10.180,22.92,384,1.000,bicubic,-11.216,-5.780,-8
convnext_pico_ols.d1_in1k,69.240,30.760,88.840,11.160,9.06,288,1.000,bicubic,-11.222,-6.412,-6
repvit_m2.dist_in1k,69.230,30.770,88.740,11.260,8.80,224,0.950,bicubic,-11.230,-6.428,-6
edgenext_small_rw.sw_in1k,69.210,30.790,88.740,11.260,7.83,320,1.000,bicubic,-11.248,-6.568,-5
vit_base_patch16_384.augreg_in1k,69.190,30.810,88.370,11.630,86.86,384,1.000,bicubic,-11.912,-6.750,-73
resnet50.b2k_in1k,69.180,30.820,88.660,11.340,25.56,288,1.000,bicubic,-11.274,-6.658,-6
resnet152c.gluon_in1k,69.140,30.860,87.890,12.110,60.21,224,0.875,bicubic,-10.772,-6.956,+33
resnet50.b1k_in1k,69.100,30.900,88.740,11.260,25.56,288,1.000,bicubic,-11.606,-6.692,-34
resnet50.a1h_in1k,69.100,30.900,88.510,11.490,25.56,224,1.000,bicubic,-11.578,-6.796,-31
tf_efficientnetv2_b2.in1k,69.100,30.900,88.230,11.770,10.10,260,0.890,bicubic,-11.096,-6.812,+16
mixnet_xl.ra_in1k,69.090,30.910,88.310,11.690,11.90,224,0.875,bicubic,-11.392,-6.626,-17
resnetblur50.bt_in1k,69.070,30.930,88.450,11.550,25.56,288,0.950,bicubic,-11.164,-6.784,+11
repvgg_b2g4.rvgg_in1k,69.010,30.990,88.360,11.640,61.76,224,0.875,bilinear,-10.372,-6.316,+69
regnety_160.pycls_in1k,69.010,30.990,88.270,11.730,83.59,224,0.875,bicubic,-11.288,-6.694,+4
resnet101d.gluon_in1k,69.010,30.990,88.080,11.920,44.57,224,0.875,bicubic,-11.416,-6.944,-12
xception65.tf_in1k,68.950,31.050,88.470,11.530,39.92,299,0.903,bicubic,-10.606,-6.188,+54
resnext101_32x4d.gluon_in1k,68.940,31.060,88.340,11.660,44.18,224,0.875,bicubic,-11.400,-6.590,-7
tf_efficientnet_b2.ap_in1k,68.930,31.070,88.330,11.670,9.11,260,0.890,bicubic,-11.380,-6.696,-5
repvit_m1_0.dist_300e_in1k,68.930,31.070,88.100,11.900,7.30,224,0.950,bicubic,-11.196,-6.644,+13
poolformerv2_s24.sail_in1k,68.920,31.080,88.670,11.330,21.34,224,1.000,bicubic,-11.828,-6.640,-50
cspdarknet53.ra_in1k,68.920,31.080,88.600,11.400,27.64,256,0.887,bilinear,-11.148,-6.478,+13
convnext_pico.d1_in1k,68.900,31.100,88.480,11.520,9.05,288,0.950,bicubic,-11.516,-6.568,-18
resnet50d.a2_in1k,68.900,31.100,87.980,12.020,25.58,288,1.000,bicubic,-12.264,-7.100,-98
mobilevitv2_150.cvnets_in1k,68.890,31.110,88.080,11.920,10.59,256,0.888,bicubic,-11.480,-6.994,-18
regnety_120.pycls_in1k,68.860,31.140,88.330,11.670,51.82,224,0.875,bicubic,-11.520,-6.796,-20
resnet152.a3_in1k,68.820,31.180,87.760,12.240,60.19,224,0.950,bicubic,-11.726,-7.240,-39
resnet152.gluon_in1k,68.820,31.180,87.700,12.300,60.19,224,0.875,bicubic,-10.876,-7.030,+32
poolformer_s24.sail_in1k,68.780,31.220,88.170,11.830,21.39,224,0.900,bicubic,-11.514,-6.904,-12
dpn107.mx_in1k,68.780,31.220,88.130,11.870,86.92,224,0.875,bicubic,-11.390,-6.812,-1
gmlp_s16_224.ra3_in1k,68.780,31.220,88.070,11.930,19.42,224,0.875,bicubic,-10.864,-6.552,+36
dpn131.mx_in1k,68.770,31.230,87.570,12.430,79.25,224,0.875,bicubic,-11.044,-7.130,+19
darknetaa53.c2ns_in1k,68.760,31.240,88.700,11.300,36.02,288,1.000,bilinear,-11.746,-6.622,-42
tf_efficientnet_b2.aa_in1k,68.760,31.240,87.950,12.050,9.11,260,0.890,bicubic,-11.324,-6.956,-1
deit_small_patch16_224.fb_in1k,68.730,31.270,88.200,11.800,22.05,224,0.900,bicubic,-11.118,-6.844,+12
regnety_080.pycls_in1k,68.710,31.290,87.970,12.030,39.18,224,0.875,bicubic,-11.158,-6.862,+8
resnet101s.gluon_in1k,68.710,31.290,87.900,12.100,44.67,224,0.875,bicubic,-11.594,-7.252,-21
seresnext50_32x4d.gluon_in1k,68.660,31.340,88.360,11.640,27.56,224,0.875,bicubic,-11.264,-6.464,+1
resnet50.ram_in1k,68.630,31.370,88.330,11.670,25.56,288,0.950,bicubic,-11.346,-6.722,-3
hrnet_w64.ms_in1k,68.630,31.370,88.070,11.930,128.06,224,0.875,bilinear,-10.846,-6.582,+38
xcit_tiny_12_p8_224.fb_in1k,68.580,31.420,88.720,11.280,6.71,224,1.000,bicubic,-11.108,-6.334,+21
resnet50.a2_in1k,68.570,31.430,88.000,12.000,25.56,288,1.000,bicubic,-12.202,-6.988,-72
tf_efficientnet_b3.in1k,68.530,31.470,88.700,11.300,12.23,300,0.904,bicubic,-12.344,-6.600,-84
dpn98.mx_in1k,68.520,31.480,87.610,12.390,61.57,224,0.875,bicubic,-11.150,-7.044,+21
regnetx_160.pycls_in1k,68.510,31.490,88.460,11.540,54.28,224,0.875,bicubic,-11.356,-6.368,0
fastvit_s12.apple_in1k,68.490,31.510,87.850,12.150,9.47,256,0.900,bicubic,-11.452,-6.944,-8
rexnet_130.nav_in1k,68.460,31.540,88.040,11.960,7.56,224,0.875,bicubic,-11.046,-6.638,+27
cspresnet50.ra_in1k,68.450,31.550,87.960,12.040,21.62,256,0.887,bilinear,-11.132,-6.750,+22
tf_efficientnet_el.in1k,68.440,31.560,88.210,11.790,10.59,300,0.904,bicubic,-11.808,-6.910,-29
regnety_064.pycls_in1k,68.440,31.560,88.060,11.940,30.58,224,0.875,bicubic,-11.276,-6.706,+10
xcit_tiny_24_p16_224.fb_in1k,68.420,31.580,88.290,11.710,12.12,224,1.000,bicubic,-11.028,-6.588,+28
cait_xxs36_224.fb_dist_in1k,68.410,31.590,88.650,11.350,17.30,224,1.000,bicubic,-11.336,-6.224,+3
skresnext50_32x4d.ra_in1k,68.400,31.600,87.570,12.430,27.48,224,0.875,bicubic,-11.764,-7.070,-23
resnet50.fb_ssl_yfcc100m_ft_in1k,68.380,31.620,88.560,11.440,25.56,224,0.875,bilinear,-10.850,-6.266,+47
efficientvit_b1.r256_in1k,68.370,31.630,87.910,12.090,9.10,256,1.000,bicubic,-11.364,-6.870,+1
dla102x2.in1k,68.350,31.650,87.890,12.110,41.28,224,0.875,bilinear,-11.096,-6.742,+24
fbnetv3_d.ra2_in1k,68.330,31.670,88.420,11.580,10.31,256,0.950,bilinear,-11.352,-6.524,+6
efficientnet_b2_pruned.in1k,68.300,31.700,88.100,11.900,8.31,260,0.890,bicubic,-11.620,-6.752,-18
res2net50d.in1k,68.300,31.700,88.100,11.900,25.72,224,0.875,bilinear,-11.954,-6.936,-39
resmlp_big_24_224.fb_in1k,68.300,31.700,87.540,12.460,129.14,224,0.875,bicubic,-12.736,-7.478,-119
vit_base_patch16_224.sam_in1k,68.290,31.710,87.730,12.270,86.57,224,0.900,bicubic,-11.948,-7.026,-39
resnext50_32x4d.gluon_in1k,68.290,31.710,87.330,12.670,25.03,224,0.875,bicubic,-11.070,-7.100,+26
ecaresnet26t.ra2_in1k,68.260,31.740,88.810,11.190,16.01,320,0.950,bicubic,-11.590,-6.280,-17
efficientformerv2_s1.snap_dist_in1k,68.240,31.760,88.330,11.670,6.19,224,0.950,bicubic,-11.452,-6.386,-3
tf_efficientnet_lite3.in1k,68.220,31.780,87.730,12.270,8.20,300,0.904,bilinear,-11.586,-7.184,-13
resnet101.a3_in1k,68.220,31.780,87.640,12.360,44.55,224,0.950,bicubic,-11.594,-6.974,-13
pit_xs_distilled_224.in1k,68.190,31.810,87.730,12.270,11.00,224,0.900,bicubic,-10.990,-6.636,+38
resnet50.ra_in1k,68.150,31.850,87.900,12.100,25.56,288,0.950,bicubic,-11.686,-7.066,-18
fbnetv3_b.ra2_in1k,68.140,31.860,87.920,12.080,8.60,256,0.950,bilinear,-11.006,-6.824,+39
tiny_vit_5m_224.in1k,68.140,31.860,87.840,12.160,5.39,224,0.950,bicubic,-11.030,-6.954,+36
regnetx_120.pycls_in1k,68.130,31.870,87.610,12.390,46.11,224,0.875,bicubic,-11.458,-7.132,-3
mobileone_s4.apple_in1k,68.130,31.870,87.110,12.890,14.95,224,0.900,bilinear,-11.296,-7.370,+11
resmlp_36_224.fb_in1k,68.090,31.910,88.180,11.820,44.69,224,0.875,bicubic,-11.682,-6.704,-19
resnet50.bt_in1k,68.070,31.930,87.810,12.190,25.56,288,0.950,bicubic,-11.570,-7.082,-8
dpn68b.ra_in1k,68.070,31.930,87.380,12.620,12.61,288,1.000,bicubic,-11.290,-7.056,+12
regnetx_016.tv2_in1k,68.050,31.950,88.230,11.770,9.19,224,0.965,bicubic,-11.386,-6.538,+5
resnetrs50.tf_in1k,68.040,31.960,87.730,12.270,35.69,224,0.910,bicubic,-11.854,-7.244,-35
dpn92.mx_in1k,67.980,32.020,87.600,12.400,37.67,224,0.875,bicubic,-12.058,-7.260,-43
nf_regnet_b1.ra2_in1k,67.970,32.030,88.220,11.780,10.22,288,0.900,bicubic,-11.338,-6.520,+12
repvit_m0_9.dist_450e_in1k,67.920,32.080,87.890,12.110,5.49,224,0.950,bicubic,-11.146,-6.490,+34
fastvit_t12.apple_in1k,67.920,32.080,87.550,12.450,7.55,256,0.900,bicubic,-11.344,-7.012,+17
resnet50d.gluon_in1k,67.910,32.090,87.120,12.880,25.58,224,0.875,bicubic,-11.168,-7.346,+30
resnetv2_50x1_bit.goog_in21k_ft_in1k,67.900,32.100,89.270,10.730,25.55,448,1.000,bilinear,-12.442,-6.412,-75
levit_192.fb_dist_in1k,67.900,32.100,87.900,12.100,10.95,224,0.900,bicubic,-11.938,-6.884,-34
levit_conv_192.fb_dist_in1k,67.900,32.100,87.890,12.110,10.95,224,0.900,bicubic,-11.938,-6.888,-36
tf_efficientnetv2_b1.in1k,67.890,32.110,87.810,12.190,8.14,240,0.882,bicubic,-11.570,-6.912,-9
regnetx_080.pycls_in1k,67.890,32.110,87.000,13.000,39.57,224,0.875,bicubic,-11.308,-7.554,+18
legacy_seresnext50_32x4d.in1k,67.860,32.140,87.620,12.380,27.56,224,0.875,bilinear,-11.216,-6.812,+25
efficientnet_em.ra2_in1k,67.850,32.150,88.130,11.870,6.90,240,0.882,bicubic,-11.394,-6.664,+11
resnext101_32x8d.tv_in1k,67.840,32.160,87.490,12.510,88.79,224,0.875,bilinear,-11.470,-7.030,0
resmlp_24_224.fb_in1k,67.820,32.180,87.610,12.390,30.02,224,0.875,bicubic,-11.554,-6.936,-6
lambda_resnet26t.c1_in1k,67.800,32.200,87.790,12.210,10.96,256,0.940,bicubic,-11.288,-6.800,+19
ecaresnet50t.a3_in1k,67.790,32.210,87.560,12.440,25.57,224,0.950,bicubic,-11.762,-7.134,-21
hrnet_w48.ms_in1k,67.760,32.240,87.410,12.590,77.47,224,0.875,bilinear,-11.546,-7.106,-3
efficientvit_b1.r224_in1k,67.760,32.240,87.340,12.660,9.10,224,0.950,bicubic,-11.492,-6.964,+5
hrnet_w44.ms_in1k,67.750,32.250,87.540,12.460,67.06,224,0.875,bilinear,-11.144,-6.824,+28
resnet33ts.ra2_in1k,67.730,32.270,88.100,11.900,19.68,288,1.000,bicubic,-11.996,-6.728,-39
coat_lite_mini.in1k,67.710,32.290,87.710,12.290,11.01,224,0.900,bicubic,-11.392,-6.898,+12
tf_efficientnet_b0.ns_jft_in1k,67.700,32.300,88.070,11.930,5.29,224,0.875,bicubic,-10.968,-6.302,+42
resnext50_32x4d.a3_in1k,67.700,32.300,86.930,13.070,25.03,224,0.950,bicubic,-11.568,-7.376,-3
legacy_xception.tf_in1k,67.690,32.310,87.560,12.440,22.86,299,0.897,bicubic,-11.350,-6.822,+15
eca_botnext26ts_256.c1_in1k,67.680,32.320,87.080,12.920,10.59,256,0.950,bicubic,-11.588,-7.526,-7
regnetx_064.pycls_in1k,67.670,32.330,87.520,12.480,26.21,224,0.875,bicubic,-11.396,-6.940,+11
resnet32ts.ra2_in1k,67.650,32.350,87.580,12.420,17.96,288,1.000,bicubic,-11.738,-7.072,-22
convnext_femto_ols.d1_in1k,67.650,32.350,87.390,12.610,5.23,288,0.950,bicubic,-11.274,-7.136,+18
halonet26t.a1h_in1k,67.620,32.380,87.250,12.750,12.48,256,0.950,bicubic,-11.486,-7.056,+3
inception_v3.gluon_in1k,67.590,32.410,87.450,12.550,23.83,299,0.875,bicubic,-11.212,-6.926,+24
hrnet_w40.ms_in1k,67.580,32.420,87.140,12.860,57.56,224,0.875,bilinear,-11.352,-7.324,+13
regnety_040.pycls_in1k,67.570,32.430,87.490,12.510,20.65,224,0.875,bicubic,-11.650,-7.166,-7
resnet101c.gluon_in1k,67.560,32.440,87.160,12.840,44.57,224,0.875,bicubic,-11.978,-7.424,-37
legacy_seresnet152.in1k,67.550,32.450,87.380,12.620,66.82,224,0.875,bilinear,-11.110,-6.990,+33
dla169.in1k,67.540,32.460,87.570,12.430,53.39,224,0.875,bilinear,-11.168,-6.774,+28
tf_efficientnet_b1.ap_in1k,67.520,32.480,87.750,12.250,7.79,240,0.882,bicubic,-11.756,-6.562,-20
efficientnet_b1.ft_in1k,67.520,32.480,87.470,12.530,7.79,256,1.000,bicubic,-11.280,-6.872,+19
mobilevitv2_125.cvnets_in1k,67.480,32.520,87.570,12.430,7.48,256,0.888,bicubic,-12.200,-7.288,-52
tf_efficientnet_cc_b1_8e.in1k,67.480,32.520,87.300,12.700,39.72,240,0.882,bicubic,-11.822,-7.074,-24
eca_halonext26ts.c1_in1k,67.480,32.520,87.240,12.760,10.76,256,0.940,bicubic,-12.006,-7.360,-40
resnet101.gluon_in1k,67.470,32.530,87.230,12.770,44.55,224,0.875,bicubic,-11.840,-7.292,-29
res2net101_26w_4s.in1k,67.460,32.540,87.010,12.990,45.21,224,0.875,bilinear,-11.740,-7.426,-16
res2net50_26w_8s.in1k,67.440,32.560,87.250,12.750,48.40,224,0.875,bilinear,-11.502,-7.044,0
repvit_m0_9.dist_300e_in1k,67.430,32.570,87.230,12.770,5.49,224,0.950,bicubic,-11.228,-6.886,+24
regnety_008_tv.tv2_in1k,67.410,32.590,88.030,11.970,6.43,224,0.965,bicubic,-11.256,-6.360,+21
resnet34d.ra2_in1k,67.400,32.600,87.960,12.040,21.82,288,0.950,bicubic,-11.036,-6.384,+36
tf_efficientnet_b2.in1k,67.400,32.600,87.580,12.420,9.11,260,0.890,bicubic,-12.208,-7.134,-57
regnety_032.pycls_in1k,67.400,32.600,87.260,12.740,19.44,224,0.875,bicubic,-11.476,-7.148,+2
convnextv2_femto.fcmae_ft_in1k,67.350,32.650,87.580,12.420,5.23,288,0.950,bicubic,-11.988,-6.980,-38
cait_xxs24_224.fb_dist_in1k,67.340,32.660,87.520,12.480,11.96,224,1.000,bicubic,-11.044,-6.796,+39
xception41.tf_in1k,67.260,32.740,87.190,12.810,26.97,299,0.903,bicubic,-11.244,-7.086,+23
coat_tiny.in1k,67.240,32.760,87.310,12.690,5.50,224,0.900,bicubic,-11.186,-6.738,+32
repghostnet_200.in1k,67.240,32.760,87.290,12.710,9.80,224,0.875,bicubic,-11.566,-7.040,-1
regnetx_032.pycls_in1k,67.240,32.760,86.990,13.010,15.30,224,0.875,bicubic,-10.928,-7.092,+50
resnest26d.gluon_in1k,67.220,32.780,87.180,12.820,17.07,224,0.875,bilinear,-11.262,-7.114,+23
convnext_femto.d1_in1k,67.190,32.810,87.480,12.520,5.22,288,0.950,bicubic,-11.526,-6.950,+5
repvgg_b2.rvgg_in1k,67.160,32.840,87.320,12.680,89.02,224,0.875,bilinear,-11.632,-7.100,-1
vit_relpos_base_patch32_plus_rpn_256.sw_in1k,67.160,32.840,86.480,13.520,119.42,256,0.900,bicubic,-12.324,-7.658,-59
botnet26t_256.c1_in1k,67.140,32.860,87.510,12.490,12.49,256,0.950,bicubic,-12.118,-7.022,-38
legacy_seresnet101.in1k,67.110,32.890,87.050,12.950,49.33,224,0.875,bilinear,-11.276,-7.212,+28
resnet50s.gluon_in1k,67.110,32.890,86.860,13.140,25.68,224,0.875,bicubic,-11.604,-7.382,+1
dla60x.in1k,67.070,32.930,87.200,12.800,17.35,224,0.875,bilinear,-11.166,-6.826,+37
visformer_tiny.in1k,67.050,32.950,87.060,12.940,10.32,224,0.900,bicubic,-11.110,-7.106,+41
dla60_res2net.in1k,67.030,32.970,87.140,12.860,20.85,224,0.875,bilinear,-11.434,-7.058,+16
resnet34.a1_in1k,67.030,32.970,86.280,13.720,21.80,288,1.000,bicubic,-10.888,-7.484,+57
xcit_tiny_12_p16_224.fb_dist_in1k,67.020,32.980,87.400,12.600,6.72,224,1.000,bicubic,-11.554,-6.798,+3
dla102x.in1k,67.000,33.000,86.790,13.210,26.31,224,0.875,bilinear,-11.512,-7.446,+6
resnet152.tv_in1k,66.990,33.010,87.560,12.440,60.19,224,0.875,bilinear,-11.332,-6.486,+25
lambda_resnet26rpt_256.c1_in1k,66.960,33.040,87.130,12.870,10.99,256,0.940,bicubic,-12.004,-7.306,-27
mixnet_l.ft_in1k,66.960,33.040,86.940,13.060,7.33,224,0.875,bicubic,-12.006,-7.242,-29
pit_xs_224.in1k,66.920,33.080,87.280,12.720,10.62,224,0.900,bicubic,-11.256,-6.882,+31
repvgg_b1.rvgg_in1k,66.920,33.080,86.780,13.220,57.42,224,0.875,bilinear,-11.448,-7.316,+18
resnet50d.a3_in1k,66.920,33.080,86.540,13.460,25.58,224,0.950,bicubic,-11.800,-7.692,-13
pvt_v2_b1.in1k,66.910,33.090,87.410,12.590,14.01,224,0.900,bicubic,-11.794,-7.092,-10
res2net50_26w_6s.in1k,66.910,33.090,86.880,13.120,37.05,224,0.875,bilinear,-11.658,-7.242,-5
tf_efficientnet_b1.aa_in1k,66.900,33.100,87.020,12.980,7.79,240,0.882,bicubic,-11.928,-7.180,-25
xcit_nano_12_p8_384.fb_dist_in1k,66.870,33.130,87.110,12.890,3.05,384,1.000,bicubic,-10.950,-6.930,+51
efficientnet_es.ra_in1k,66.860,33.140,86.710,13.290,5.44,224,0.875,bicubic,-11.198,-7.216,+32
mobilevit_s.cvnets_in1k,66.850,33.150,87.070,12.930,5.58,256,0.900,bicubic,-11.462,-7.078,+15
regnetx_040.pycls_in1k,66.820,33.180,86.740,13.260,22.12,224,0.875,bicubic,-11.672,-7.502,-4
hrnet_w32.ms_in1k,66.780,33.220,87.290,12.710,41.23,224,0.875,bilinear,-11.662,-6.900,0
resnet50.am_in1k,66.780,33.220,86.740,13.260,25.56,224,0.875,bicubic,-12.222,-7.658,-42
tf_mixnet_l.in1k,66.780,33.220,86.480,13.520,7.33,224,0.875,bicubic,-11.996,-7.522,-26
seresnext26t_32x4d.bt_in1k,66.770,33.230,86.720,13.280,16.81,288,0.950,bicubic,-11.974,-7.592,-25
hrnet_w18.ms_aug_in1k,66.760,33.240,87.480,12.520,21.30,224,0.950,bilinear,-11.362,-6.574,+22
repvit_m1.dist_in1k,66.760,33.240,87.180,12.820,5.49,224,0.950,bicubic,-11.778,-6.890,-15
hrnet_w30.ms_in1k,66.760,33.240,86.790,13.210,37.71,224,0.875,bilinear,-11.436,-7.432,+14
selecsls60b.in1k,66.750,33.250,86.540,13.460,32.77,224,0.875,bicubic,-11.662,-7.628,-1
vit_small_patch16_224.augreg_in1k,66.690,33.310,86.720,13.280,22.05,224,0.900,bicubic,-12.158,-7.568,-41
tf_efficientnetv2_b0.in1k,66.690,33.310,86.700,13.300,7.14,224,0.875,bicubic,-11.668,-7.314,+2
wide_resnet101_2.tv_in1k,66.680,33.320,87.030,12.970,126.89,224,0.875,bilinear,-12.162,-7.252,-42
seresnext26d_32x4d.bt_in1k,66.680,33.320,86.830,13.170,16.81,288,0.950,bicubic,-12.134,-7.410,-39
dla60_res2next.in1k,66.660,33.340,87.020,12.980,17.03,224,0.875,bilinear,-11.780,-7.124,-11
wide_resnet50_2.tv_in1k,66.650,33.350,86.800,13.200,68.88,224,0.875,bilinear,-11.826,-7.288,-15
inception_v3.tf_adv_in1k,66.630,33.370,86.580,13.420,23.83,299,0.875,bicubic,-10.962,-7.150,+42
mobilevitv2_100.cvnets_in1k,66.610,33.390,87.020,12.980,4.90,256,0.888,bicubic,-11.470,-7.150,+13
vit_tiny_patch16_384.augreg_in21k_ft_in1k,66.590,33.410,87.250,12.750,5.79,384,1.000,bicubic,-11.834,-7.292,-12
cs3darknet_m.c2ns_in1k,66.580,33.420,87.180,12.820,9.31,288,0.950,bicubic,-11.054,-6.836,+37
levit_128.fb_dist_in1k,66.560,33.440,86.740,13.260,9.21,224,0.900,bicubic,-11.930,-7.272,-22
levit_conv_128.fb_dist_in1k,66.550,33.450,86.730,13.270,9.21,224,0.900,bicubic,-11.944,-7.278,-25
tf_efficientnet_b1.in1k,66.540,33.460,86.700,13.300,7.79,240,0.882,bicubic,-12.022,-7.394,-31
resnet50c.gluon_in1k,66.540,33.460,86.170,13.830,25.58,224,0.875,bicubic,-11.466,-7.822,+13
dla102.in1k,66.530,33.470,86.910,13.090,33.27,224,0.875,bilinear,-11.494,-7.024,+10
hrnet_w18_small_v2.gluon_in1k,66.510,33.490,86.500,13.500,15.60,224,0.875,bicubic,-11.680,-7.402,-3
mobileone_s3.apple_in1k,66.500,33.500,86.370,13.630,10.17,224,0.900,bilinear,-11.492,-7.544,+12
vit_base_patch16_224.augreg_in1k,66.480,33.520,86.260,13.740,86.57,224,0.900,bicubic,-12.674,-7.830,-76
vit_base_patch32_384.augreg_in1k,66.430,33.570,86.960,13.040,88.30,384,1.000,bicubic,-12.326,-7.266,-50
gmixer_24_224.ra3_in1k,66.430,33.570,86.160,13.840,24.72,224,0.875,bicubic,-11.596,-7.508,+5
inception_v3.tf_in1k,66.420,33.580,86.680,13.320,23.83,299,0.875,bicubic,-11.436,-7.186,+17
bat_resnext26ts.ch_in1k,66.390,33.610,86.860,13.140,10.73,256,0.900,bicubic,-11.862,-7.238,-14
hardcorenas_f.miil_green_in1k,66.360,33.640,86.190,13.810,8.20,224,0.875,bilinear,-11.736,-7.612,-3
seresnext26ts.ch_in1k,66.320,33.680,86.700,13.300,10.39,288,1.000,bicubic,-11.950,-7.392,-17
coat_lite_tiny.in1k,66.290,33.710,86.960,13.040,5.72,224,0.900,bicubic,-11.230,-6.962,+27
eca_resnext26ts.ch_in1k,66.270,33.730,86.410,13.590,10.30,288,1.000,bicubic,-11.730,-7.516,+2
legacy_seresnet50.in1k,66.250,33.750,86.300,13.700,28.09,224,0.875,bilinear,-11.394,-7.458,+18
efficientnet_b0.ra_in1k,66.250,33.750,85.970,14.030,5.29,224,0.875,bicubic,-11.444,-7.562,+17
cs3darknet_focus_m.c2ns_in1k,66.240,33.760,87.080,12.920,9.30,288,0.950,bicubic,-11.044,-6.886,+39
selecsls60.in1k,66.220,33.780,86.330,13.670,30.67,224,0.875,bicubic,-11.768,-7.500,0
tf_efficientnet_cc_b0_8e.in1k,66.220,33.780,86.230,13.770,24.01,224,0.875,bicubic,-11.684,-7.432,+4
res2net50_26w_4s.in1k,66.150,33.850,86.610,13.390,25.70,224,0.875,bilinear,-11.800,-7.242,0
tf_efficientnet_em.in1k,66.150,33.850,86.360,13.640,6.90,240,0.882,bicubic,-11.976,-7.688,-16
resnext50_32x4d.tv_in1k,66.150,33.850,86.040,13.960,25.03,224,0.875,bilinear,-11.472,-7.656,+15
densenetblur121d.ra_in1k,66.140,33.860,86.600,13.400,8.00,288,0.950,bicubic,-11.182,-7.188,+28
resmlp_12_224.fb_distilled_in1k,66.120,33.880,86.620,13.380,15.35,224,0.875,bicubic,-11.834,-6.940,-5
inception_v3.tv_in1k,66.110,33.890,86.320,13.680,23.83,299,0.875,bicubic,-11.324,-7.154,+22
resnet50.a3_in1k,66.110,33.890,85.820,14.180,25.56,224,0.950,bicubic,-11.938,-7.960,-15
ghostnetv2_160.in1k,66.080,33.920,86.730,13.270,12.39,224,0.875,bicubic,-11.752,-7.210,0
resnet26t.ra2_in1k,66.080,33.920,86.670,13.330,16.01,320,1.000,bicubic,-12.248,-7.454,-37
regnety_016.pycls_in1k,66.080,33.920,86.370,13.630,11.20,224,0.875,bicubic,-11.788,-7.348,-4
efficientnet_b1_pruned.in1k,66.070,33.930,86.550,13.450,6.33,240,0.882,bicubic,-12.170,-7.284,-32
gcresnext26ts.ch_in1k,66.040,33.960,86.750,13.250,10.48,288,1.000,bicubic,-12.374,-7.286,-45
resnet50.gluon_in1k,66.030,33.970,86.270,13.730,25.56,224,0.875,bicubic,-11.552,-7.450,+7
rexnet_100.nav_in1k,66.020,33.980,86.490,13.510,4.80,224,0.875,bicubic,-11.836,-7.150,-8
tinynet_a.in1k,66.020,33.980,85.780,14.220,6.19,192,0.875,bicubic,-11.628,-7.760,-1
res2net50_14w_8s.in1k,66.000,34.000,86.230,13.770,25.06,224,0.875,bilinear,-12.158,-7.616,-30
poolformerv2_s12.sail_in1k,65.890,34.110,86.510,13.490,11.89,224,1.000,bicubic,-12.112,-7.354,-21
resnet34.a2_in1k,65.870,34.130,86.140,13.860,21.80,288,1.000,bicubic,-11.288,-7.134,+26
densenet161.tv_in1k,65.850,34.150,86.460,13.540,28.68,224,0.875,bicubic,-11.508,-7.182,+12
res2next50.in1k,65.850,34.150,85.830,14.170,24.67,224,0.875,bilinear,-12.392,-8.062,-42
convnextv2_atto.fcmae_ft_in1k,65.840,34.160,86.170,13.830,3.71,288,0.950,bicubic,-11.920,-7.556,-10
hardcorenas_e.miil_green_in1k,65.830,34.170,85.970,14.030,8.07,224,0.875,bilinear,-11.960,-7.730,-12
repvgg_b1g4.rvgg_in1k,65.820,34.180,86.110,13.890,39.97,224,0.875,bilinear,-11.768,-7.726,-4
regnetx_008.tv2_in1k,65.810,34.190,86.210,13.790,7.26,224,0.965,bicubic,-11.496,-7.454,+10
xcit_tiny_12_p16_224.fb_in1k,65.790,34.210,86.220,13.780,6.72,224,1.000,bicubic,-11.350,-7.496,+20
ese_vovnet19b_dw.ra_in1k,65.770,34.230,86.470,13.530,6.54,288,0.950,bicubic,-11.974,-7.314,-14
mobilenetv3_large_100.miil_in21k_ft_in1k,65.770,34.230,85.180,14.820,5.48,224,0.875,bilinear,-12.150,-7.734,-25
skresnet34.ra_in1k,65.740,34.260,85.950,14.050,22.28,224,0.875,bicubic,-11.170,-7.194,+29
resnet101.tv_in1k,65.710,34.290,85.990,14.010,44.55,224,0.875,bilinear,-11.670,-7.556,+1
convnext_tiny.fb_in22k_ft_in1k,65.650,34.350,86.620,13.380,28.59,288,1.000,bicubic,-13.248,-8.054,-103
hardcorenas_d.miil_green_in1k,65.650,34.350,85.450,14.550,7.50,224,0.875,bilinear,-11.784,-8.040,-4
poolformer_s12.sail_in1k,65.630,34.370,86.210,13.790,11.92,224,0.900,bicubic,-11.610,-7.322,+7
dpn68b.mx_in1k,65.620,34.380,85.950,14.050,12.61,224,0.875,bicubic,-11.898,-7.902,-11
selecsls42b.in1k,65.620,34.380,85.840,14.160,32.46,224,0.875,bicubic,-11.550,-7.552,+9
convnext_atto_ols.a2_in1k,65.600,34.400,86.260,13.740,3.70,288,0.950,bicubic,-11.616,-7.416,+5
fastvit_t8.apple_dist_in1k,65.550,34.450,86.160,13.840,4.03,256,0.900,bicubic,-11.626,-7.138,+6
tf_efficientnet_b0.ap_in1k,65.500,34.500,85.570,14.430,5.29,224,0.875,bicubic,-11.590,-7.692,+10
mobileone_s2.apple_in1k,65.440,34.560,85.950,14.050,7.88,224,0.900,bilinear,-12.076,-7.718,-15
tf_efficientnet_lite2.in1k,65.400,34.600,86.020,13.980,6.09,260,0.890,bicubic,-12.062,-7.732,-14
convmixer_1024_20_ks9_p14.in1k,65.390,34.610,85.610,14.390,24.38,224,0.960,bicubic,-11.546,-7.740,+15
res2net50_48w_2s.in1k,65.350,34.650,85.950,14.050,25.29,224,0.875,bilinear,-12.164,-7.600,-17
resnet26d.bt_in1k,65.340,34.660,86.000,14.000,16.01,288,0.950,bicubic,-12.068,-7.638,-13
densenet201.tv_in1k,65.270,34.730,85.670,14.330,20.01,224,0.875,bicubic,-12.016,-7.810,-7
dla60.in1k,65.220,34.780,85.760,14.240,22.04,224,0.875,bilinear,-11.826,-7.558,+6
seresnet50.a3_in1k,65.190,34.810,85.300,14.700,28.09,224,0.950,bicubic,-11.836,-7.772,+6
crossvit_9_dagger_240.in1k,65.170,34.830,86.570,13.430,8.78,240,0.875,bicubic,-11.808,-7.048,+7
gernet_s.idstcv_in1k,65.150,34.850,85.530,14.470,8.17,224,0.875,bilinear,-11.760,-7.786,+11
tf_efficientnet_cc_b0_4e.in1k,65.150,34.850,85.140,14.860,13.31,224,0.875,bicubic,-12.152,-8.196,-13
mobilenetv2_120d.ra_in1k,65.060,34.940,85.990,14.010,5.83,224,0.875,bicubic,-12.248,-7.512,-16
legacy_seresnext26_32x4d.in1k,65.040,34.960,85.630,14.370,16.79,224,0.875,bicubic,-12.068,-7.684,-4
resnet34.bt_in1k,64.940,35.060,86.210,13.790,21.80,288,0.950,bicubic,-11.540,-7.144,+21
convnext_atto.d2_in1k,64.920,35.080,86.230,13.770,3.70,288,0.950,bicubic,-12.088,-7.472,0
hrnet_w18.ms_in1k,64.920,35.080,85.730,14.270,21.30,224,0.875,bilinear,-11.832,-7.714,+8
resnext26ts.ra2_in1k,64.900,35.100,85.710,14.290,10.30,288,1.000,bicubic,-12.278,-7.754,-13
efficientvit_m5.r224_in1k,64.890,35.110,85.390,14.610,12.47,224,0.875,bicubic,-12.168,-7.794,-6
hardcorenas_c.miil_green_in1k,64.870,35.130,85.250,14.750,5.52,224,0.875,bilinear,-12.196,-7.912,-8
repghostnet_150.in1k,64.830,35.170,85.880,14.120,6.58,224,0.875,bicubic,-12.630,-7.630,-31
efficientformerv2_s0.snap_dist_in1k,64.800,35.200,85.650,14.350,3.60,224,0.950,bicubic,-11.314,-7.208,+23
densenet169.tv_in1k,64.790,35.210,85.250,14.750,14.15,224,0.875,bicubic,-11.110,-7.778,+27
fastvit_t8.apple_in1k,64.730,35.270,85.680,14.320,4.03,256,0.900,bicubic,-11.444,-7.372,+19
ghostnetv2_130.in1k,64.720,35.280,85.420,14.580,8.96,224,0.875,bicubic,-12.036,-7.942,-1
mixnet_m.ft_in1k,64.670,35.330,85.460,14.540,5.01,224,0.875,bicubic,-12.590,-7.958,-24
xcit_nano_12_p8_224.fb_dist_in1k,64.610,35.390,85.990,14.010,3.05,224,1.000,bicubic,-11.722,-7.108,+14
levit_128s.fb_dist_in1k,64.590,35.410,84.730,15.270,7.78,224,0.900,bicubic,-11.936,-8.142,+4
levit_conv_128s.fb_dist_in1k,64.590,35.410,84.730,15.270,7.78,224,0.900,bicubic,-11.930,-8.136,+4
repvgg_a2.rvgg_in1k,64.470,35.530,85.140,14.860,28.21,224,0.875,bilinear,-11.988,-7.862,+7
xcit_nano_12_p16_384.fb_dist_in1k,64.420,35.580,85.300,14.700,3.05,384,1.000,bicubic,-11.038,-7.398,+31
hardcorenas_b.miil_green_in1k,64.390,35.610,84.880,15.120,5.18,224,0.875,bilinear,-12.158,-7.882,-2
regnetx_016.pycls_in1k,64.380,35.620,85.470,14.530,9.19,224,0.875,bicubic,-12.544,-7.946,-13
tf_efficientnet_lite1.in1k,64.370,35.630,85.460,14.540,5.42,240,0.882,bicubic,-12.274,-7.764,-7
resmlp_12_224.fb_in1k,64.350,35.650,85.590,14.410,15.35,224,0.875,bicubic,-12.298,-7.588,-9
tf_efficientnet_b0.aa_in1k,64.300,35.700,85.280,14.720,5.29,224,0.875,bicubic,-12.544,-7.938,-13
tf_mixnet_m.in1k,64.260,35.740,85.100,14.900,5.01,224,0.875,bicubic,-12.694,-8.054,-19
densenet121.ra_in1k,64.250,35.750,85.820,14.180,7.98,288,0.950,bicubic,-12.250,-7.548,-3
resnet26.bt_in1k,64.200,35.800,85.210,14.790,16.00,288,0.950,bicubic,-12.166,-7.970,0
tf_efficientnet_es.in1k,64.200,35.800,84.740,15.260,5.44,224,0.875,bicubic,-12.398,-8.462,-11
regnety_008.pycls_in1k,64.140,35.860,85.240,14.760,6.26,224,0.875,bicubic,-12.162,-7.822,+1
dpn68.mx_in1k,64.120,35.880,85.080,14.920,12.61,224,0.875,bicubic,-12.226,-7.928,-2
vit_small_patch32_224.augreg_in21k_ft_in1k,64.090,35.910,85.550,14.450,22.88,224,0.900,bicubic,-11.904,-7.250,+3
mobilenetv2_140.ra_in1k,64.070,35.930,85.030,14.970,6.11,224,0.875,bicubic,-12.446,-7.958,-10
repghostnet_130.in1k,63.960,36.040,84.840,15.160,5.48,224,0.875,bicubic,-12.416,-8.052,-7
hardcorenas_a.miil_green_in1k,63.720,36.280,84.410,15.590,5.26,224,0.875,bilinear,-12.218,-8.098,+3
resnest14d.gluon_in1k,63.620,36.380,84.230,15.770,10.61,224,0.875,bilinear,-11.888,-8.278,+12
regnety_004.tv2_in1k,63.600,36.400,84.860,15.140,4.34,224,0.965,bicubic,-11.994,-7.840,+9
mobilevitv2_075.cvnets_in1k,63.590,36.410,84.950,15.050,2.87,256,0.888,bicubic,-12.018,-7.794,+7
tf_mixnet_s.in1k,63.580,36.420,84.260,15.740,4.13,224,0.875,bicubic,-12.072,-8.380,+4
tf_efficientnet_b0.in1k,63.520,36.480,84.870,15.130,5.29,224,0.875,bicubic,-13.010,-8.138,-20
mixnet_s.ft_in1k,63.390,36.610,84.720,15.280,4.13,224,0.875,bicubic,-12.604,-8.550,-5
mobilenetv3_large_100.ra_in1k,63.380,36.620,84.080,15.920,5.48,224,0.875,bicubic,-12.386,-8.458,-1
vit_tiny_r_s16_p8_384.augreg_in21k_ft_in1k,63.340,36.660,85.270,14.730,6.36,384,1.000,bicubic,-12.620,-7.992,-6
resnet50.tv_in1k,63.330,36.670,84.670,15.330,25.56,224,0.875,bilinear,-12.798,-8.188,-11
efficientnet_es_pruned.in1k,63.310,36.690,84.950,15.050,5.44,224,0.875,bicubic,-11.696,-7.494,+16
mixer_b16_224.goog_in21k_ft_in1k,63.290,36.710,83.340,16.660,59.88,224,0.875,bicubic,-13.312,-8.884,-29
mobilenetv3_rw.rmsp_in1k,63.240,36.760,84.520,15.480,5.48,224,0.875,bicubic,-12.380,-8.184,-4
efficientnet_lite0.ra_in1k,63.240,36.760,84.410,15.590,4.65,224,0.875,bicubic,-12.242,-8.110,+2
mobileone_s1.apple_in1k,63.200,36.800,84.210,15.790,4.83,224,0.900,bilinear,-12.586,-8.582,-9
semnasnet_100.rmsp_in1k,63.160,36.840,84.540,15.460,3.89,224,0.875,bicubic,-12.290,-8.058,+2
vit_tiny_patch16_224.augreg_in21k_ft_in1k,63.130,36.870,84.870,15.130,5.72,224,0.900,bicubic,-12.332,-7.974,-1
regnety_006.pycls_in1k,63.090,36.910,84.240,15.760,6.06,224,0.875,bicubic,-12.178,-8.286,+1
pit_ti_distilled_224.in1k,63.020,36.980,83.810,16.190,5.10,224,0.900,bicubic,-11.236,-8.142,+22
densenet121.tv_in1k,62.940,37.060,84.250,15.750,7.98,224,0.875,bicubic,-11.824,-7.904,+12
mobilevit_xs.cvnets_in1k,62.930,37.070,84.830,15.170,2.32,256,0.900,bicubic,-11.704,-7.518,+14
ghostnetv2_100.in1k,62.900,37.100,84.090,15.910,6.16,224,0.875,bicubic,-12.266,-8.264,-2
legacy_seresnet34.in1k,62.890,37.110,84.230,15.770,21.96,224,0.875,bilinear,-11.912,-7.896,+8
hrnet_w18_small_v2.ms_in1k,62.830,37.170,83.970,16.030,15.60,224,0.875,bilinear,-12.280,-8.446,-1
edgenext_x_small.in1k,62.810,37.190,84.670,15.330,2.34,288,1.000,bicubic,-12.878,-8.096,-17
mobilenetv2_110d.ra_in1k,62.810,37.190,84.480,15.520,4.52,224,0.875,bicubic,-12.244,-7.704,-1
deit_tiny_distilled_patch16_224.fb_in1k,62.790,37.210,83.930,16.070,5.91,224,0.900,bicubic,-11.714,-7.960,+11
resnet18.fb_swsl_ig1b_ft_in1k,62.750,37.250,84.300,15.700,11.69,224,0.875,bilinear,-10.538,-7.430,+30
tinynet_b.in1k,62.720,37.280,84.220,15.780,3.73,188,0.875,bicubic,-12.258,-7.966,-1
repvgg_b0.rvgg_in1k,62.710,37.290,83.880,16.120,15.82,224,0.875,bilinear,-12.434,-8.536,-9
tf_efficientnet_lite0.in1k,62.580,37.420,84.230,15.770,4.65,224,0.875,bicubic,-12.252,-7.940,-1
xcit_nano_12_p8_224.fb_in1k,62.560,37.440,84.210,15.790,3.05,224,1.000,bicubic,-11.350,-7.958,+17
resnet34.gluon_in1k,62.540,37.460,84.000,16.000,21.80,224,0.875,bicubic,-12.040,-7.982,+4
regnetx_008.pycls_in1k,62.490,37.510,84.020,15.980,7.26,224,0.875,bicubic,-12.538,-8.318,-9
dla34.in1k,62.490,37.510,83.930,16.070,15.74,224,0.875,bilinear,-12.150,-8.136,0
fbnetc_100.rmsp_in1k,62.450,37.550,83.370,16.630,5.57,224,0.875,bilinear,-12.680,-9.018,-14
tf_mobilenetv3_large_100.in1k,62.440,37.560,83.950,16.050,5.48,224,0.875,bilinear,-13.076,-8.644,-24
crossvit_9_240.in1k,62.250,37.750,84.240,15.760,8.55,240,0.875,bicubic,-11.710,-7.722,+8
repghostnet_111.in1k,62.250,37.750,83.880,16.120,4.54,224,0.875,bicubic,-12.806,-8.312,-15
crossvit_tiny_240.in1k,62.080,37.920,83.610,16.390,7.01,240,0.875,bicubic,-11.260,-8.298,+16
regnetx_004_tv.tv2_in1k,62.060,37.940,83.770,16.230,5.50,224,0.965,bicubic,-12.540,-8.400,-5
resnet18d.ra2_in1k,62.000,38.000,83.790,16.210,11.71,288,0.950,bicubic,-11.794,-8.048,+9
repvgg_a1.rvgg_in1k,61.970,38.030,83.040,16.960,14.09,224,0.875,bilinear,-12.492,-8.816,-4
efficientvit_m4.r224_in1k,61.950,38.050,83.580,16.420,8.80,224,0.875,bicubic,-12.418,-8.400,-4
mnasnet_100.rmsp_in1k,61.920,38.080,83.700,16.300,4.38,224,0.875,bicubic,-12.732,-8.422,-12
vgg19_bn.tv_in1k,61.870,38.130,83.450,16.550,143.68,224,0.875,bilinear,-12.346,-8.394,-4
regnety_004.pycls_in1k,61.840,38.160,83.410,16.590,4.34,224,0.875,bicubic,-12.186,-8.338,-2
convit_tiny.fb_in1k,61.590,38.410,84.090,15.910,5.71,224,0.875,bicubic,-11.522,-7.622,+12
resnet18.a1_in1k,61.580,38.420,82.470,17.530,11.69,288,1.000,bicubic,-11.578,-8.556,+10
resnet34.a3_in1k,61.490,38.510,82.610,17.390,21.80,224,0.950,bicubic,-11.480,-8.496,+13
resnet18.fb_ssl_yfcc100m_ft_in1k,61.480,38.520,83.330,16.670,11.69,224,0.875,bilinear,-11.118,-8.086,+15
repghostnet_100.in1k,61.380,38.620,82.750,17.250,4.07,224,0.875,bicubic,-12.826,-8.792,-9
regnetx_006.pycls_in1k,61.360,38.640,83.450,16.550,6.20,224,0.875,bicubic,-12.508,-8.228,-3
hrnet_w18_small.gluon_in1k,61.290,38.710,82.280,17.720,13.19,224,0.875,bicubic,-12.630,-8.914,-6
ghostnet_100.in1k,61.250,38.750,82.300,17.700,5.18,224,0.875,bicubic,-12.708,-9.232,-8
spnasnet_100.rmsp_in1k,61.240,38.760,82.760,17.240,4.42,224,0.875,bilinear,-12.854,-9.060,-12
resnet34.tv_in1k,61.200,38.800,82.740,17.260,21.80,224,0.875,bilinear,-12.106,-8.680,0
vit_base_patch32_224.augreg_in1k,61.040,38.960,82.730,17.270,88.22,224,0.900,bicubic,-13.854,-9.048,-29
efficientvit_m3.r224_in1k,61.010,38.990,83.200,16.800,6.90,224,0.875,bicubic,-12.364,-8.148,-5
pit_ti_224.in1k,60.960,39.040,83.850,16.150,4.85,224,0.900,bicubic,-11.950,-7.554,+5
skresnet18.ra_in1k,60.850,39.150,82.850,17.150,11.96,224,0.875,bicubic,-12.184,-8.322,0
vgg16_bn.tv_in1k,60.770,39.230,82.960,17.040,138.37,224,0.875,bilinear,-12.600,-8.554,-7
semnasnet_075.rmsp_in1k,60.720,39.280,82.540,17.460,2.91,224,0.875,bicubic,-12.284,-8.600,-1
tf_mobilenetv3_large_075.in1k,60.380,39.620,81.970,18.030,3.99,224,0.875,bilinear,-13.050,-9.382,-11
xcit_nano_12_p16_224.fb_dist_in1k,60.220,39.780,82.490,17.510,3.05,224,1.000,bicubic,-12.090,-8.370,+7
resnet18.a2_in1k,60.200,39.800,81.840,18.160,11.69,288,1.000,bicubic,-12.172,-8.756,+4
mobilenetv2_100.ra_in1k,60.150,39.850,82.190,17.810,3.50,224,0.875,bicubic,-12.818,-8.826,-3
vit_base_patch32_224.sam_in1k,59.990,40.010,81.220,18.780,88.22,224,0.900,bicubic,-13.704,-9.794,-16
deit_tiny_patch16_224.fb_in1k,59.800,40.200,82.660,17.340,5.72,224,0.900,bicubic,-12.370,-8.456,+7
legacy_seresnet18.in1k,59.790,40.210,81.680,18.320,11.78,224,0.875,bicubic,-11.970,-8.652,+11
repvgg_a0.rvgg_in1k,59.760,40.240,81.250,18.750,9.11,224,0.875,bilinear,-12.648,-9.242,-4
vgg19.tv_in1k,59.710,40.290,81.460,18.540,143.67,224,0.875,bilinear,-12.668,-9.414,-3
edgenext_xx_small.in1k,59.410,40.590,81.840,18.160,1.33,288,1.000,bicubic,-12.468,-8.712,+6
regnetx_004.pycls_in1k,59.380,40.620,81.740,18.260,5.16,224,0.875,bicubic,-13.022,-9.086,-6
vit_tiny_r_s16_p8_224.augreg_in21k_ft_in1k,59.080,40.920,81.760,18.240,6.34,224,0.900,bicubic,-12.718,-9.064,+4
tf_mobilenetv3_large_minimal_100.in1k,59.080,40.920,81.130,18.870,3.92,224,0.875,bilinear,-13.184,-9.510,-2
repghostnet_080.in1k,59.060,40.940,81.160,18.840,3.28,224,0.875,bicubic,-13.152,-9.324,-2
hrnet_w18_small.ms_in1k,58.960,41.040,81.340,18.660,13.19,224,0.875,bilinear,-13.376,-9.340,-7
vgg13_bn.tv_in1k,58.960,41.040,81.090,18.910,133.05,224,0.875,bilinear,-12.628,-9.288,+4
lcnet_100.ra2_in1k,58.920,41.080,81.200,18.800,2.95,224,0.875,bicubic,-13.182,-9.154,-3
vgg16.tv_in1k,58.840,41.160,81.660,18.340,138.36,224,0.875,bilinear,-12.752,-8.724,+1
pvt_v2_b0.in1k,58.770,41.230,82.120,17.880,3.67,224,0.900,bicubic,-11.890,-8.076,+7
mobileone_s0.apple_in1k,58.580,41.420,80.080,19.920,5.29,224,0.875,bilinear,-12.822,-9.762,+1
efficientvit_m2.r224_in1k,58.420,41.580,81.360,18.640,4.19,224,0.875,bicubic,-12.394,-8.782,+4
resnet18.gluon_in1k,58.340,41.660,80.970,19.030,11.69,224,0.875,bicubic,-12.494,-8.786,+2
xcit_nano_12_p16_224.fb_in1k,58.330,41.670,80.920,19.080,3.05,224,1.000,bicubic,-11.632,-8.842,+7
tinynet_c.in1k,58.200,41.800,80.280,19.720,2.46,184,0.875,bicubic,-13.042,-9.452,-1
resnet14t.c3_in1k,58.190,41.810,80.300,19.700,10.08,224,0.950,bicubic,-14.064,-10.006,-14
efficientvit_b0.r224_in1k,58.050,41.950,79.800,20.200,3.41,224,0.950,bicubic,-13.348,-9.628,-4
mobilevitv2_050.cvnets_in1k,57.700,42.300,80.880,19.120,1.37,256,0.888,bicubic,-12.448,-9.038,+2
vgg11_bn.tv_in1k,57.410,42.590,80.010,19.990,132.87,224,0.875,bilinear,-12.972,-9.798,-1
resnet18.tv_in1k,57.180,42.820,80.190,19.810,11.69,224,0.875,bilinear,-12.580,-8.880,+3
mobilevit_xxs.cvnets_in1k,57.170,42.830,79.760,20.240,1.27,256,0.900,bicubic,-11.748,-9.186,+4
vgg13.tv_in1k,57.140,42.860,79.550,20.450,133.05,224,0.875,bilinear,-12.792,-9.700,0
regnety_002.pycls_in1k,57.010,42.990,79.880,20.120,3.16,224,0.875,bicubic,-13.270,-9.650,-4
mixer_l16_224.goog_in21k_ft_in1k,56.660,43.340,76.010,23.990,208.20,224,0.875,bicubic,-15.394,-11.664,-18
repghostnet_058.in1k,56.110,43.890,78.510,21.490,2.55,224,0.875,bicubic,-12.804,-9.910,+1
regnetx_002.pycls_in1k,56.070,43.930,79.210,20.790,2.68,224,0.875,bicubic,-12.682,-9.332,+2
resnet18.a3_in1k,56.000,44.000,78.970,21.030,11.69,224,0.950,bicubic,-12.252,-9.202,+4
dla60x_c.in1k,55.990,44.010,78.970,21.030,1.32,224,0.875,bilinear,-11.922,-9.462,+5
vgg11.tv_in1k,55.820,44.180,78.830,21.170,132.86,224,0.875,bilinear,-13.202,-9.794,-5
resnet10t.c3_in1k,55.560,44.440,78.440,21.560,5.44,224,0.950,bicubic,-12.804,-9.596,-1
efficientvit_m1.r224_in1k,55.470,44.530,79.150,20.850,2.98,224,0.875,bicubic,-12.836,-9.520,-1
lcnet_075.ra2_in1k,55.440,44.560,78.350,21.650,2.36,224,0.875,bicubic,-13.342,-10.010,-5
mobilenetv3_small_100.lamb_in1k,54.680,45.320,77.770,22.230,2.54,224,0.875,bicubic,-12.978,-9.866,+1
tf_mobilenetv3_small_100.in1k,54.470,45.530,77.070,22.930,2.54,224,0.875,bilinear,-13.452,-10.602,-2
repghostnet_050.in1k,53.550,46.450,76.740,23.260,2.31,224,0.875,bicubic,-13.416,-10.180,+1
tinynet_d.in1k,53.430,46.570,76.320,23.680,2.34,152,0.875,bicubic,-13.542,-10.746,-1
mnasnet_small.lamb_in1k,53.270,46.730,75.910,24.090,2.03,224,0.875,bicubic,-12.926,-10.594,0
dla46x_c.in1k,53.040,46.960,76.840,23.160,1.07,224,0.875,bilinear,-12.952,-10.134,0
mobilenetv2_050.lamb_in1k,52.840,47.160,75.450,24.550,1.97,224,0.875,bicubic,-13.108,-10.634,0
dla46_c.in1k,52.200,47.800,75.670,24.330,1.30,224,0.875,bilinear,-12.672,-10.628,+2
tf_mobilenetv3_small_075.in1k,52.160,47.840,75.450,24.550,2.04,224,0.875,bilinear,-13.566,-10.682,-1
mobilenetv3_small_075.lamb_in1k,51.890,48.110,74.740,25.260,2.04,224,0.875,bicubic,-13.346,-10.706,-1
efficientvit_m0.r224_in1k,50.750,49.250,74.470,25.530,2.35,224,0.875,bicubic,-12.520,-10.706,0
lcnet_050.ra2_in1k,49.980,50.020,73.440,26.560,1.88,224,0.875,bicubic,-13.158,-10.942,0
tf_mobilenetv3_small_minimal_100.in1k,49.530,50.470,73.020,26.980,2.04,224,0.875,bilinear,-13.364,-11.218,0
tinynet_e.in1k,46.730,53.270,70.360,29.640,2.04,106,0.875,bicubic,-13.136,-11.402,0
mobilenetv3_small_050.lamb_in1k,44.850,55.150,67.690,32.310,1.59,224,0.875,bicubic,-13.066,-12.490,0
hf_public_repos/pytorch-image-models/results/benchmark-train-amp-nchw-pt112-cu113-rtx3090.csv
model,train_samples_per_sec,train_step_time,train_batch_size,train_img_size,param_count
tinynet_e,10001.12,50.423,512,106,2.04
mobilenetv3_small_050,7406.47,68.392,512,224,1.59
tf_mobilenetv3_small_minimal_100,6438.14,78.983,512,224,2.04
mobilenetv3_small_075,6186.83,82.006,512,224,2.04
tf_mobilenetv3_small_075,5783.46,87.782,512,224,2.04
mobilenetv3_small_100,5749.13,88.315,512,224,2.54
lcnet_035,5673.53,89.75,512,224,1.64
tf_mobilenetv3_small_100,5383.9,94.36,512,224,2.54
levit_128s,5298.88,95.701,512,224,7.78
lcnet_050,5280.37,96.452,512,224,1.88
tinynet_d,5161.83,98.416,512,152,2.34
mixer_s32_224,4696.33,108.475,512,224,19.1
resnet10t,4669.46,109.393,512,176,5.44
vit_small_patch32_224,4447.28,114.289,512,224,22.88
lcnet_075,4278.23,119.175,512,224,2.36
vit_tiny_r_s16_p8_224,4137.87,122.895,512,224,6.34
levit_128,3895.0,130.318,512,224,9.21
regnetx_002,3718.05,137.026,512,224,2.68
lcnet_100,3569.0,142.969,512,224,2.95
mnasnet_small,3450.28,147.453,512,224,2.03
regnety_002,3414.18,149.006,512,224,3.16
cs3darknet_focus_s,3251.91,156.949,512,256,3.27
mobilenetv2_035,3160.04,161.202,512,224,1.68
levit_192,3046.5,166.9,512,224,10.95
gernet_s,3034.31,168.028,512,224,8.17
tinynet_c,2919.98,174.314,512,184,2.46
mnasnet_050,2847.14,179.025,512,224,2.22
cs3darknet_s,2821.27,180.951,512,256,3.28
resnet18,2764.22,184.877,512,224,11.69
ssl_resnet18,2760.71,185.109,512,224,11.69
mobilenetv2_050,2751.58,185.257,512,224,1.97
swsl_resnet18,2742.47,186.338,512,224,11.69
semnasnet_050,2741.67,185.816,512,224,2.08
gluon_resnet18_v1b,2741.53,186.395,512,224,11.69
lcnet_150,2713.5,188.193,512,224,4.5
regnetx_004,2695.23,188.875,512,224,5.16
ese_vovnet19b_slim_dw,2588.37,197.313,512,224,1.9
seresnet18,2562.51,199.293,512,224,11.78
nf_regnet_b0,2561.76,198.646,512,192,8.76
legacy_seresnet18,2500.8,204.207,512,224,11.78
tf_efficientnetv2_b0,2483.22,204.949,512,192,7.14
levit_256,2482.39,205.091,512,224,18.89
mobilenetv3_large_075,2392.41,213.119,512,224,3.99
resnet14t,2385.69,214.281,512,176,10.08
tf_mobilenetv3_large_minimal_100,2347.68,217.368,512,224,3.92
vit_tiny_patch16_224,2293.54,222.408,512,224,5.72
regnetx_006,2293.09,222.433,512,224,6.2
deit_tiny_patch16_224,2290.53,222.68,512,224,5.72
tf_mobilenetv3_large_075,2259.6,225.688,512,224,3.99
deit_tiny_distilled_patch16_224,2253.36,226.358,512,224,5.91
edgenext_xx_small,2231.33,228.598,512,256,1.33
ghostnet_050,2189.91,232.414,512,224,2.59
mobilenetv3_rw,2184.31,233.512,512,224,5.48
mnasnet_075,2176.02,234.492,512,224,3.17
mobilenetv3_large_100,2167.29,235.344,512,224,5.48
mobilenetv3_large_100_miil,2165.63,235.504,512,224,5.48
levit_256d,2159.5,235.516,512,224,26.21
resnet18d,2129.12,240.084,512,224,11.71
hardcorenas_a,2118.32,240.968,512,224,5.26
regnety_004,2100.99,242.536,512,224,4.34
pit_ti_distilled_224,2086.5,244.504,512,224,5.1
pit_ti_224,2079.54,245.311,512,224,4.85
ese_vovnet19b_slim,2066.1,247.446,512,224,3.17
mnasnet_100,2053.84,248.477,512,224,4.38
tf_mobilenetv3_large_100,2053.63,248.437,512,224,5.48
mnasnet_b1,2053.54,248.485,512,224,4.38
semnasnet_075,2008.51,253.986,512,224,2.91
hardcorenas_b,2008.46,253.96,512,224,5.18
mobilenetv2_075,1983.69,257.32,512,224,2.64
hardcorenas_c,1977.37,257.94,512,224,5.52
xcit_nano_12_p16_224_dist,1970.62,258.036,512,224,3.05
xcit_nano_12_p16_224,1969.78,258.084,512,224,3.05
tinynet_b,1965.95,259.368,512,188,3.73
hardcorenas_d,1880.3,271.085,512,224,7.5
tf_efficientnetv2_b1,1876.23,271.395,512,192,8.14
resnetblur18,1872.21,273.11,512,224,11.69
spnasnet_100,1862.13,273.955,512,224,4.42
mnasnet_a1,1859.21,274.476,512,224,3.89
semnasnet_100,1857.75,274.693,512,224,3.89
mobilenetv2_100,1832.14,278.633,512,224,3.5
regnety_006,1809.24,281.912,512,224,6.06
visformer_tiny,1802.41,283.384,512,224,10.32
mixer_b32_224,1784.58,286.101,512,224,60.29
skresnet18,1730.13,295.275,512,224,11.96
tinynet_a,1710.13,298.117,512,192,6.19
vit_base_patch32_224_sam,1703.64,299.668,512,224,88.22
vit_base_patch32_224,1703.57,299.695,512,224,88.22
efficientnet_lite0,1674.68,304.971,512,224,4.65
cs3darknet_focus_m,1668.48,306.209,512,256,9.3
hardcorenas_e,1650.74,309.021,512,224,8.07
hardcorenas_f,1646.88,309.777,512,224,8.2
gluon_resnet34_v1b,1634.03,312.731,512,224,21.8
regnetx_008,1632.2,312.851,512,224,7.26
tv_resnet34,1630.02,313.513,512,224,21.8
resnet34,1622.41,314.992,512,224,21.8
ghostnet_100,1601.5,318.319,512,224,5.18
tf_efficientnet_lite0,1591.79,320.884,512,224,4.65
fbnetc_100,1567.77,325.605,512,224,5.57
pit_xs_distilled_224,1551.83,329.02,512,224,11.0
pit_xs_224,1549.02,329.642,512,224,10.62
mixer_s16_224,1543.23,331.197,512,224,18.53
dla46_c,1532.94,333.18,512,224,1.3
mnasnet_140,1525.17,334.879,512,224,7.12
seresnet34,1505.77,339.147,512,224,21.96
cs3darknet_m,1499.82,340.716,512,256,9.31
regnety_008,1498.63,340.596,512,224,6.26
levit_384,1491.26,342.207,512,224,39.13
edgenext_x_small,1481.71,344.446,512,256,2.34
ese_vovnet19b_dw,1466.46,348.623,512,224,6.54
legacy_seresnet34,1465.81,348.38,512,224,21.96
efficientnet_b0,1459.11,262.1,384,224,5.29
gernet_m,1456.76,350.74,512,224,21.14
vit_small_patch32_384,1448.56,352.604,512,384,22.92
regnetz_005,1448.06,352.165,512,224,7.12
rexnet_100,1447.81,264.049,384,224,4.8
rexnetr_100,1441.71,265.216,384,224,4.88
nf_resnet26,1422.76,359.346,512,224,16.0
hrnet_w18_small,1410.43,361.614,512,224,13.19
selecsls42,1405.04,363.736,512,224,30.35
selecsls42b,1401.22,364.735,512,224,32.46
mobilenetv2_110d,1400.15,273.199,384,224,4.52
tf_efficientnet_b0_ap,1398.67,273.43,384,224,5.29
mobilevitv2_050,1396.45,365.664,512,256,1.37
tf_efficientnet_b0_ns,1395.54,274.064,384,224,5.29
tf_efficientnet_b0,1395.32,274.114,384,224,5.29
tf_efficientnetv2_b2,1392.9,365.948,512,208,10.1
vit_tiny_r_s16_p8_384,1392.75,274.873,384,384,6.36
resnet34d,1379.64,370.514,512,224,21.82
ghostnet_130,1364.55,373.824,512,224,7.36
gmixer_12_224,1352.72,377.701,512,224,12.7
crossvit_tiny_240,1349.19,377.902,512,240,7.01
gmlp_ti16_224,1340.6,284.894,384,224,5.87
semnasnet_140,1340.57,380.992,512,224,6.11
dla46x_c,1338.33,381.81,512,224,1.07
xcit_tiny_12_p16_224,1323.84,384.926,512,224,6.72
xcit_tiny_12_p16_224_dist,1317.19,386.895,512,224,6.72
resnetrs50,1317.01,387.565,512,160,35.69
mobilevit_xxs,1316.84,290.489,384,256,1.27
resnet26,1312.7,389.566,512,224,16.0
efficientnet_b1_pruned,1301.95,391.798,512,240,6.33
mobilenetv2_140,1267.4,302.189,384,224,6.11
dla60x_c,1262.98,404.404,512,224,1.32
crossvit_9_240,1260.08,303.33,384,240,8.55
convnext_nano_hnf,1235.34,413.703,512,224,15.59
convnext_nano_ols,1234.94,413.902,512,224,15.6
poolformer_s12,1234.11,414.201,512,224,11.92
convnext_nano,1233.61,414.261,512,224,15.59
resmlp_12_distilled_224,1232.37,414.645,512,224,15.35
resmlp_12_224,1232.04,414.762,512,224,15.35
fbnetv3_b,1226.89,415.617,512,224,8.6
nf_regnet_b2,1219.45,418.235,512,240,14.31
repvgg_b0,1217.24,419.512,512,224,15.82
selecsls60b,1214.07,420.825,512,224,32.77
selecsls60,1211.7,421.663,512,224,30.67
nf_regnet_b1,1209.03,421.975,512,256,10.22
crossvit_9_dagger_240,1206.16,316.906,384,240,8.78
nf_seresnet26,1198.39,426.558,512,224,17.4
mixnet_s,1181.75,431.958,512,224,4.13
nf_ecaresnet26,1174.85,435.233,512,224,16.0
efficientnet_lite1,1171.46,217.556,256,240,5.42
darknet17,1164.06,439.537,512,256,14.3
efficientnet_es_pruned,1160.76,440.317,512,224,5.44
efficientnet_es,1160.37,440.47,512,224,5.44
regnetx_016,1139.3,448.473,512,224,9.19
fbnetv3_d,1138.14,335.598,384,224,10.31
tf_efficientnet_es,1136.29,449.83,512,224,5.44
rexnetr_130,1133.04,224.76,256,224,7.61
dla34,1132.96,451.315,512,224,15.74
resnet26d,1119.56,456.822,512,224,16.01
tf_mixnet_s,1118.11,456.605,512,224,4.13
tf_efficientnet_lite1,1110.94,229.444,256,240,5.42
edgenext_small,1109.37,460.388,512,256,5.59
convit_tiny,1095.04,466.531,512,224,5.71
rexnet_130,1094.78,232.699,256,224,7.56
mobilenetv2_120d,1078.49,236.158,256,224,5.83
darknet21,1073.87,476.43,512,256,20.86
ecaresnet50d_pruned,1067.01,478.899,512,224,19.94
deit_small_patch16_224,1053.64,363.563,384,224,22.05
vit_small_patch16_224,1052.92,363.872,384,224,22.05
deit_small_distilled_patch16_224,1032.61,370.971,384,224,22.44
sedarknet21,1031.46,495.893,512,256,20.95
gernet_l,1030.31,496.058,512,256,31.08
efficientnet_b1,1030.3,246.963,256,224,7.79
rexnetr_150,1022.06,249.288,256,224,9.78
repvgg_a2,1010.18,506.008,512,224,28.21
edgenext_small_rw,1009.52,506.183,512,256,7.83
skresnet34,1008.96,506.323,512,224,22.28
resnest14d,979.06,522.497,512,224,10.61
cs3darknet_focus_l,977.57,391.957,384,256,21.15
deit3_small_patch16_224,977.26,391.961,384,224,22.06
deit3_small_patch16_224_in21ft1k,976.5,392.276,384,224,22.06
rexnet_150,965.2,264.04,256,224,9.73
regnety_016,954.26,534.657,512,224,11.2
vit_base_patch32_plus_256,951.64,537.091,512,256,119.48
mobilevitv2_075,947.54,269.157,256,256,2.87
legacy_seresnext26_32x4d,946.21,405.17,384,224,16.79
pit_s_224,942.8,270.615,256,224,23.46
pit_s_distilled_224,939.97,271.455,256,224,24.04
vit_srelpos_small_patch16_224,922.29,415.451,384,224,21.97
vit_relpos_small_patch16_224,921.7,415.439,384,224,21.98
efficientnet_b0_g16_evos,909.42,421.149,384,224,8.11
resnext26ts,905.09,423.733,384,256,10.3
cs3darknet_l,902.18,282.891,256,256,21.16
coat_lite_tiny,893.97,428.624,384,224,5.72
resnet26t,890.89,574.188,512,256,16.01
efficientnet_b0_gn,881.54,289.263,256,224,5.29
resnetv2_50,880.1,580.976,512,224,25.55
efficientnet_b2_pruned,874.13,291.317,256,260,8.31
seresnext26ts,867.62,294.407,256,256,10.39
eca_resnext26ts,867.54,294.527,256,256,10.3
tf_efficientnet_b1,863.78,294.816,256,240,7.79
tf_efficientnet_b1_ap,863.54,294.906,256,240,7.79
tf_efficientnet_b1_ns,863.39,294.941,256,240,7.79
cs3sedarknet_l,861.38,444.523,384,256,21.91
tf_efficientnetv2_b3,855.0,297.539,256,240,14.36
efficientnet_lite2,852.1,299.402,256,260,6.09
twins_svt_small,851.73,449.18,384,224,24.06
gcresnext26ts,850.58,300.113,256,256,10.48
efficientnetv2_rw_t,850.16,298.981,256,224,13.65
botnet26t_256,849.43,451.465,384,256,12.49
ecaresnetlight,846.78,603.683,512,224,30.16
seresnext26t_32x4d,845.6,453.458,384,224,16.81
seresnext26tn_32x4d,845.31,453.612,384,224,16.81
seresnext26d_32x4d,844.96,453.775,384,224,16.81
coat_lite_mini,842.06,455.115,384,224,11.01
tf_efficientnet_cc_b0_8e,837.03,457.594,384,224,24.01
ecaresnet101d_pruned,837.02,609.921,512,224,24.88
ecaresnext26t_32x4d,835.25,459.196,384,224,15.41
ecaresnext50t_32x4d,834.39,459.653,384,224,15.41
cspresnet50,830.57,461.498,384,256,21.62
swsl_resnet50,829.79,616.192,512,224,25.56
ssl_resnet50,829.64,616.294,512,224,25.56
gluon_resnet50_v1b,829.63,616.32,512,224,25.56
visformer_small,828.8,462.625,384,224,40.22
tv_resnet50,826.55,618.618,512,224,25.56
resnet50,826.06,618.983,512,224,25.56
vgg11,825.98,619.706,512,224,132.86
halonet26t,824.96,464.902,384,256,12.48
vovnet39a,817.65,625.544,512,224,22.6
tf_efficientnet_lite2,816.76,312.458,256,260,6.09
convnext_tiny_hnf,815.48,312.97,256,224,28.59
convnext_tiny_hnfd,815.19,313.078,256,224,28.59
vit_small_resnet26d_224,813.66,470.891,384,224,63.61
convnext_tiny,813.16,313.859,256,224,28.59
efficientnet_cc_b0_8e,812.96,471.165,384,224,24.01
vit_relpos_base_patch32_plus_rpn_256,811.26,630.0,512,256,119.42
mixnet_m,810.2,630.361,512,224,5.01
efficientnet_cc_b0_4e,808.8,473.577,384,224,13.31
convnext_tiny_in22ft1k,808.5,315.666,256,224,28.59
efficientnet_b2a,800.27,318.401,256,256,9.11
efficientnet_b2,799.96,318.544,256,256,9.11
regnetz_b16,796.72,319.811,256,224,9.72
tresnet_m,792.8,643.233,512,224,31.39
mobilevit_xs,792.23,321.979,256,256,2.32
ecaresnet26t,791.55,484.557,384,256,16.01
gc_efficientnetv2_rw_t,791.43,320.691,256,224,13.68
resnetv2_50t,790.83,646.602,512,224,25.57
resnetv2_50d,790.11,647.216,512,224,25.57
regnetx_032,787.5,486.368,384,224,15.3
ese_vovnet39b,786.37,650.429,512,224,24.57
tf_efficientnet_cc_b0_4e,784.54,488.254,384,224,13.31
eca_botnext26ts_256,781.43,326.976,256,256,10.59
resnet32ts,781.02,327.191,256,256,17.96
tf_mixnet_m,777.93,492.059,384,224,5.01
resnet33ts,767.78,332.812,256,256,19.68
gluon_resnet50_v1c,763.32,502.196,384,224,25.58
eca_halonext26ts,763.09,334.836,256,256,10.76
rexnetr_200,762.4,250.686,192,224,16.52
dpn68b,756.57,506.252,384,224,12.61
lambda_resnet26t,751.39,510.429,384,256,10.96
vit_relpos_small_patch16_rpn_224,751.37,510.029,384,224,21.97
resnet50t,748.03,512.487,384,224,25.57
cspresnet50d,746.91,341.842,256,256,21.64
gluon_resnet50_v1d,746.5,513.543,384,224,25.58
resnet50d,744.99,514.575,384,224,25.58
legacy_seresnet50,744.88,514.356,384,224,28.09
cspresnet50w,744.04,343.166,256,256,28.12
eca_resnet33ts,743.27,343.735,256,256,19.68
efficientnet_b0_g8_gn,743.27,343.315,256,224,6.56
seresnet33ts,742.4,344.003,256,256,19.78
resnetaa50,741.95,516.711,384,224,25.56
selecsls84,740.99,689.736,512,224,50.95
dpn68,739.97,517.747,384,224,12.61
res2net50_48w_2s,738.06,519.427,384,224,25.29
vit_small_r26_s32_224,737.22,345.982,256,224,36.43
eca_vovnet39b,735.59,695.354,512,224,22.6
lambda_resnet26rpt_256,735.09,260.579,192,256,10.99
nf_regnet_b3,732.33,522.471,384,288,18.59
rexnet_200,731.68,261.239,192,224,16.37
densenet121,730.11,348.758,256,224,7.98
resnest26d,728.94,526.039,384,224,17.07
bat_resnext26ts,728.42,350.197,256,256,10.73
mobilevitv2_100,727.72,262.852,192,256,4.9
tv_densenet121,727.58,350.07,256,224,7.98
nf_seresnet50,727.17,526.884,384,224,28.09
gcresnet33ts,725.89,351.666,256,256,19.88
eca_nfnet_l0,723.69,706.434,512,224,24.14
nfnet_l0,719.96,532.162,384,224,35.07
seresnet50,714.65,536.208,384,224,28.09
twins_pcpvt_small,714.45,356.63,256,224,24.11
nf_ecaresnet50,713.69,537.063,384,224,25.56
dla60,709.61,540.13,384,224,22.04
efficientnet_em,708.33,360.423,256,240,6.9
hrnet_w18_small_v2,705.02,723.712,512,224,15.6
resnetblur50d,704.94,362.275,256,224,25.58
vgg11_bn,703.05,545.962,384,224,132.87
resnetblur50,698.61,548.824,384,224,25.56
regnety_032,696.58,549.77,384,224,19.44
nf_resnet50,696.17,550.716,384,256,25.56
efficientnet_b3_pruned,694.05,367.106,256,300,9.86
tf_efficientnet_em,690.66,369.697,256,240,6.9
skresnet50,685.67,371.92,256,224,25.8
xcit_tiny_24_p16_224,683.44,371.201,256,224,12.12
poolformer_s24,681.93,374.176,256,224,21.39
xcit_tiny_24_p16_224_dist,681.85,371.937,256,224,12.12
vit_base_resnet26d_224,680.96,562.594,384,224,101.4
vovnet57a,678.75,564.837,384,224,36.64
densenet121d,678.22,375.614,256,224,8.0
resnetaa50d,673.73,569.117,384,224,25.58
gluon_resnet50_v1s,669.16,573.001,384,224,25.68
gmixer_24_224,666.22,382.715,256,224,24.72
swsl_resnext50_32x4d,663.66,577.766,384,224,25.03
resnext50_32x4d,663.39,577.966,384,224,25.03
ssl_resnext50_32x4d,663.18,578.185,384,224,25.03
tv_resnext50_32x4d,662.37,578.888,384,224,25.03
gluon_resnext50_32x4d,662.06,579.185,384,224,25.03
haloregnetz_b,660.09,386.296,256,224,11.68
ese_vovnet57b,656.27,584.17,384,224,38.61
cspresnext50,656.07,389.365,256,256,20.57
seresnet50t,655.71,584.407,384,224,28.1
vit_relpos_medium_patch16_cls_224,654.69,389.857,256,224,38.76
seresnetaa50d,654.11,390.147,256,224,28.11
densenetblur121d,649.47,392.249,256,224,8.0
res2net50_26w_4s,648.76,590.62,384,224,25.7
fbnetv3_g,647.3,294.603,192,240,16.62
swin_tiny_patch4_window7_224,646.76,394.841,256,224,28.29
ecaresnet50d,643.9,595.437,384,224,25.58
regnety_040,640.15,598.298,384,224,20.65
gmlp_s16_224,638.67,299.017,192,224,19.42
crossvit_small_240,637.47,399.952,256,240,26.86
resnext50d_32x4d,635.21,402.121,256,224,25.05
nfnet_f0,634.03,806.334,512,192,71.49
vit_srelpos_medium_patch16_224,629.85,405.54,256,224,38.74
mobilevit_s,629.67,303.779,192,256,5.58
skresnet50d,628.92,405.574,256,224,25.82
vit_relpos_medium_patch16_224,628.2,406.369,256,224,38.75
resnest50d_1s4x24d,628.12,406.263,256,224,25.68
mixnet_l,627.47,406.445,256,224,7.33
tf_efficientnet_b2_ns,627.11,304.591,192,260,9.11
tf_efficientnet_b2_ap,626.79,304.757,192,260,9.11
tf_efficientnet_b2,626.11,305.153,192,260,9.11
regnetx_040,624.89,613.356,384,224,22.12
regnetv_040,622.71,409.581,256,224,20.64
darknetaa53,614.47,415.819,256,256,36.02
seresnext50_32x4d,613.62,416.021,256,224,27.56
gluon_seresnext50_32x4d,613.35,416.206,256,224,27.56
sehalonet33ts,613.13,416.664,256,256,13.69
legacy_seresnext50_32x4d,612.89,416.52,256,224,27.56
dla60x,612.79,416.731,256,224,17.35
gcresnet50t,611.79,626.12,384,256,25.9
xcit_nano_12_p16_384_dist,611.55,416.81,256,384,3.05
resmlp_24_224,609.69,418.351,256,224,30.02
resmlp_24_distilled_224,609.51,418.474,256,224,30.02
gcresnext50ts,606.82,314.923,192,256,15.67
tf_inception_v3,603.29,635.057,384,299,23.83
gluon_inception_v3,603.22,635.143,384,299,23.83
adv_inception_v3,603.01,635.347,384,299,23.83
inception_v3,602.27,636.205,384,299,23.83
tf_mixnet_l,600.24,424.956,256,224,7.33
dm_nfnet_f0,600.1,638.573,384,192,71.49
xcit_small_12_p16_224,598.44,425.955,256,224,26.25
xcit_small_12_p16_224_dist,598.22,426.013,256,224,26.25
semobilevit_s,597.07,320.258,192,256,5.74
densenet169,592.78,429.221,256,224,14.15
res2next50,591.98,431.144,256,224,24.67
resnetv2_101,590.74,431.806,256,224,44.54
darknet53,590.64,432.606,256,256,41.61
resnetv2_50x1_bit_distilled,587.0,326.262,192,224,25.55
res2net50_14w_8s,586.94,433.992,256,224,25.06
swin_s3_tiny_224,586.39,435.576,256,224,28.33
repvgg_b1g4,584.26,875.234,512,224,39.97
dla60_res2net,583.07,437.618,256,224,20.85
crossvit_15_240,576.62,331.16,192,240,27.53
cait_xxs24_224,576.52,441.46,256,224,11.96
cs3darknet_focus_x,569.86,448.292,256,256,35.02
resnet101,568.98,448.321,256,224,44.55
gluon_resnet101_v1b,568.4,448.834,256,224,44.55
tv_resnet101,566.24,450.547,256,224,44.55
resnetrs101,564.72,451.1,256,192,63.62
efficientnet_cc_b1_8e,564.18,452.061,256,240,39.72
crossvit_15_dagger_240,558.23,342.033,192,240,28.21
vit_base_resnet50d_224,557.61,457.501,256,224,110.97
mobilevitv2_125,557.38,343.473,192,256,7.48
xcit_nano_12_p8_224_dist,555.43,459.069,256,224,3.05
xcit_nano_12_p8_224,555.18,459.311,256,224,3.05
sebotnet33ts_256,554.49,230.012,128,256,13.7
resnet51q,551.31,463.504,256,256,35.7
resnetv2_101d,548.02,465.6,256,224,44.56
tf_efficientnet_cc_b1_8e,547.16,466.173,256,240,39.72
resnetv2_50d_gn,546.54,350.469,192,224,25.57
nf_resnet101,543.9,704.337,384,224,44.55
vit_base_patch32_384,542.76,470.804,256,384,88.3
gluon_resnet101_v1c,537.15,475.0,256,224,44.57
cspdarknet53,537.1,475.617,256,256,27.64
cs3darknet_x,534.86,477.64,256,256,35.05
vit_base_r26_s32_224,534.67,357.767,192,224,101.38
resnest50d,534.66,477.434,256,224,27.48
resnet50_gn,531.78,360.235,192,224,25.56
regnetz_c16,530.35,360.552,192,256,13.46
gluon_resnet101_v1d,528.59,482.76,256,224,44.57
mixer_b16_224,528.02,484.004,256,224,59.88
mixer_l32_224,527.33,362.504,192,224,206.94
mixer_b16_224_miil,526.58,485.347,256,224,59.88
vit_large_patch32_224,521.73,489.021,256,224,306.54
dla60_res2next,520.31,490.572,256,224,17.03
ecaresnet50t,516.29,494.896,256,256,25.57
cs3sedarknet_xdw,516.24,246.008,128,256,21.6
lambda_resnet50ts,515.16,371.658,192,256,21.54
vit_tiny_patch16_384,512.2,249.072,128,384,5.79
resnet61q,510.55,375.027,192,256,36.85
swinv2_cr_tiny_224,505.83,504.823,256,224,28.33
halonet50ts,503.76,380.122,192,256,22.73
repvgg_b1,503.57,1015.623,512,224,57.42
swinv2_cr_tiny_ns_224,502.5,508.144,256,224,28.33
cs3sedarknet_x,501.96,508.547,256,256,35.4
dla102,497.28,513.14,256,224,33.27
wide_resnet50_2,495.68,773.85,384,224,68.88
res2net50_26w_6s,493.57,516.914,256,224,37.05
resnetaa101d,490.51,520.338,256,224,44.57
convnext_small,489.81,390.224,192,224,50.22
convnext_small_in22ft1k,489.45,390.576,192,224,50.22
legacy_seresnet101,487.73,522.616,256,224,49.33
vit_relpos_medium_patch16_rpn_224,485.5,526.221,256,224,38.73
efficientnet_lite3,484.47,263.098,128,300,8.2
gluon_resnet101_v1s,483.47,527.891,256,224,44.67
seresnet101,480.65,530.38,256,224,49.33
cs3edgenet_x,477.59,535.019,256,256,47.82
nest_tiny,476.68,267.593,128,224,17.06
nf_seresnet101,474.46,537.213,256,224,49.33
mobilevitv2_150_in22ft1k,473.86,269.132,128,256,10.59
mobilevitv2_150,473.84,269.144,128,256,10.59
resnetblur101d,472.23,540.497,256,224,44.57
jx_nest_tiny,472.22,270.163,128,224,17.06
nf_ecaresnet101,469.53,543.375,256,224,44.55
vgg13_bn,468.37,546.3,256,224,133.05
twins_pcpvt_base,466.47,408.848,192,224,43.83
tf_efficientnet_lite3,465.4,273.895,128,300,8.2
vgg16,465.36,824.954,384,224,138.36
sequencer2d_s,462.43,412.819,192,224,27.65
mixnet_xl,460.21,554.308,256,224,11.9
coat_lite_small,457.06,418.568,192,224,19.84
efficientnet_b3a,456.95,278.403,128,288,12.23
efficientnet_b3,456.81,278.448,128,288,12.23
regnetx_080,454.56,843.629,384,224,39.57
regnetx_064,452.43,564.953,256,224,26.21
halo2botnet50ts_256,451.35,424.392,192,256,22.64
ecaresnet101d,447.44,570.33,256,224,44.57
densenet201,447.44,425.987,192,224,20.01
nf_regnet_b4,445.8,428.533,192,320,30.21
convit_small,443.63,431.737,192,224,27.78
efficientnetv2_s,433.27,293.157,128,288,21.46
skresnext50_32x4d,432.3,590.802,256,224,27.48
cs3se_edgenet_x,431.68,443.324,192,256,50.72
botnet50ts_256,428.8,297.529,128,256,22.74
ssl_resnext101_32x4d,427.28,447.74,192,224,44.18
resnext101_32x4d,427.18,447.921,192,224,44.18
swsl_resnext101_32x4d,427.16,447.915,192,224,44.18
gluon_resnext101_32x4d,427.13,447.97,192,224,44.18
poolformer_s36,425.0,449.906,192,224,30.86
ese_vovnet39b_evos,421.31,302.862,128,224,24.58
resnet101d,418.0,457.739,192,256,44.57
dla102x,417.16,458.658,192,224,26.31
res2net101_26w_4s,416.51,612.121,256,224,45.21
lamhalobotnet50ts_256,413.09,463.774,192,256,22.57
twins_svt_base,411.8,464.138,192,224,56.07
crossvit_18_240,406.79,312.611,128,240,43.27
tresnet_l,404.84,1261.389,512,224,55.99
efficientnetv2_rw_s,402.34,315.806,128,288,23.94
volo_d1_224,401.47,476.8,192,224,26.63
resmlp_36_224,401.06,476.505,192,224,44.69
res2net50_26w_8s,400.52,636.999,256,224,48.4
resmlp_36_distilled_224,400.14,477.557,192,224,44.69
swin_small_patch4_window7_224,399.67,478.499,192,224,49.61
resnest50d_4s2x40d,396.0,645.092,256,224,30.42
vit_base_patch16_224_miil,395.78,484.311,192,224,86.54
crossvit_18_dagger_240,394.08,322.72,128,240,44.27
deit_base_patch16_224,390.88,490.307,192,224,86.57
vit_base_patch16_224,390.86,490.391,192,224,86.57
vit_base_patch16_224_sam,390.67,490.608,192,224,86.57
mobilevitv2_175_in22ft1k,389.97,327.241,128,256,14.25
mobilevitv2_175,389.95,327.23,128,256,14.25
tf_efficientnetv2_s_in21ft1k,389.66,326.288,128,300,21.46
tf_efficientnetv2_s,389.1,326.713,128,300,21.46
vgg16_bn,388.69,658.276,256,224,138.37
regnety_064,388.46,657.256,256,224,30.58
regnety_080,385.84,662.273,256,224,39.18
deit_base_distilled_patch16_224,385.62,497.03,192,224,87.34
xception,385.56,331.194,128,299,22.86
regnety_040s_gn,384.97,330.927,128,224,20.65
repvgg_b2g4,379.42,1348.329,512,224,61.76
resnetv2_152,379.4,503.868,192,224,60.19
regnetz_d8,378.76,336.282,128,256,23.37
hrnet_w18,378.26,671.883,256,224,21.3
ese_vovnet99b,377.27,677.036,256,224,63.2
vit_small_resnet50d_s16_224,376.52,508.654,192,224,57.53
cait_xxs36_224,375.8,507.032,192,224,17.3
gluon_seresnext101_32x4d,375.02,509.78,192,224,48.96
regnetz_040,374.91,339.544,128,256,27.12
seresnext101_32x4d,374.73,510.176,192,224,48.96
regnetv_064,372.69,513.522,192,224,30.58
regnetz_040h,372.64,341.593,128,256,28.94
legacy_seresnext101_32x4d,372.06,513.705,192,224,48.96
deit3_base_patch16_224_in21ft1k,371.79,515.431,192,224,86.59
deit3_base_patch16_224,371.73,515.464,192,224,86.59
tf_efficientnet_b3,370.15,344.089,128,300,12.23
tf_efficientnet_b3_ap,370.14,344.111,128,300,12.23
tf_efficientnet_b3_ns,370.1,344.134,128,300,12.23
resnet152,370.08,516.516,192,224,60.19
vit_relpos_base_patch16_clsgap_224,369.76,518.105,192,224,86.43
vit_relpos_base_patch16_cls_224,369.34,518.67,192,224,86.43
resnetv2_50d_frn,369.16,345.594,128,224,25.59
gluon_resnet152_v1b,369.02,517.998,192,224,60.19
tv_resnet152,369.0,518.088,192,224,60.19
regnetz_b16_evos,365.3,348.518,128,224,9.74
sequencer2d_m,363.12,525.505,192,224,38.31
ese_vovnet99b_iabn,362.9,1055.043,384,224,63.2
resnetv2_152d,360.99,529.48,192,224,60.2
beit_base_patch16_224,358.29,534.776,192,224,86.53
xcit_tiny_12_p16_384_dist,357.91,534.55,192,384,6.72
vit_relpos_base_patch16_224,355.33,539.194,192,224,86.43
gluon_resnet152_v1c,354.77,538.797,192,224,60.21
regnetz_d32,354.52,359.397,128,256,27.58
swinv2_tiny_window8_256,354.35,540.569,192,256,28.35
resnetv2_50d_evos,353.36,270.506,96,224,25.59
dpn92,353.0,723.617,256,224,37.67
vgg19,352.0,1090.655,384,224,143.67
gluon_resnet152_v1d,351.06,544.563,192,224,60.21
densenet161,346.02,367.416,128,224,28.68
xception41p,344.85,370.318,128,299,26.91
gluon_resnet152_v1s,344.7,368.96,128,224,60.32
mobilevitv2_200,342.4,372.843,128,256,18.45
tnt_s_patch16_224,342.25,559.037,192,224,23.76
mobilevitv2_200_in22ft1k,342.08,373.147,128,256,18.45
eca_nfnet_l1,341.07,561.084,192,256,41.41
hrnet_w32,340.54,747.043,256,224,41.23
dla169,338.11,565.259,192,224,53.39
convnext_base_in22ft1k,337.76,377.102,128,224,88.59
convnext_base,336.98,378.091,128,224,88.59
repvgg_b2,335.01,1527.215,512,224,89.02
repvgg_b3g4,334.01,1148.557,384,224,83.83
vgg13,331.37,1544.923,512,224,133.05
pit_b_224,331.17,385.577,128,224,73.76
vgg19_bn,330.96,773.109,256,224,143.68
pit_b_distilled_224,329.46,387.568,128,224,74.79
regnetx_120,327.41,780.952,256,224,46.11
twins_pcpvt_large,322.96,392.17,128,224,60.99
hrnet_w30,321.86,790.607,256,224,37.71
legacy_seresnet152,319.56,397.245,128,224,66.82
inception_v4,316.87,603.734,192,299,42.68
seresnet152,313.75,608.677,192,224,66.82
vit_small_patch16_36x1_224,310.56,409.448,128,224,64.67
dla102x2,309.09,412.537,128,224,41.28
xcit_small_24_p16_224_dist,307.83,412.466,128,224,47.67
convmixer_1024_20_ks9_p14,307.81,830.813,256,224,24.38
vit_small_patch16_18x2_224,307.61,413.3,128,224,64.67
xcit_small_24_p16_224,307.46,412.867,128,224,47.67
regnety_120,307.05,623.971,192,224,51.82
poolformer_m36,303.34,420.132,128,224,56.17
efficientnet_el_pruned,301.49,423.464,128,300,10.59
efficientnet_el,301.45,423.5,128,300,10.59
swinv2_cr_small_ns_224,300.41,423.619,128,224,49.7
swinv2_cr_small_224,297.65,427.521,128,224,49.7
mixnet_xxl,297.33,428.503,128,224,23.96
cait_s24_224,296.96,428.341,128,224,46.92
nest_small,296.72,321.888,96,224,38.35
coat_tiny,296.44,429.708,128,224,5.5
tf_efficientnet_el,295.51,432.07,128,300,10.59
jx_nest_small,294.83,323.932,96,224,38.35
efficientnet_b4,293.1,325.442,96,320,19.34
xception41,293.07,435.505,128,299,26.97
xcit_tiny_12_p8_224_dist,291.52,437.287,128,224,6.71
tresnet_xl,291.4,875.028,256,224,78.44
resnext101_64x4d,291.33,437.816,128,224,83.46
gluon_resnext101_64x4d,291.25,437.881,128,224,83.46
swin_s3_small_224,289.76,439.818,128,224,49.74
wide_resnet101_2,289.62,661.356,192,224,126.89
xcit_tiny_12_p8_224,289.33,440.549,128,224,6.71
twins_svt_large,289.11,440.647,128,224,99.27
resnet152d,281.47,452.46,128,256,60.21
swin_base_patch4_window7_224,279.66,455.817,128,224,87.77
convnext_tiny_384_in22ft1k,278.8,343.389,96,384,28.59
resnet200,276.62,459.688,128,224,64.67
ssl_resnext101_32x8d,276.52,461.341,128,224,88.79
ig_resnext101_32x8d,276.39,461.582,128,224,88.79
resnext101_32x8d,276.22,461.854,128,224,88.79
swsl_resnext101_32x8d,276.22,461.764,128,224,88.79
repvgg_b3,271.93,1411.039,384,224,123.09
nfnet_f1,271.43,705.161,192,224,132.63
resnetv2_50d_evob,268.92,355.729,96,224,25.59
gmlp_b16_224,268.22,356.298,96,224,73.08
dpn98,267.62,476.602,128,224,61.57
regnetx_160,266.55,719.244,192,224,54.28
regnety_160,264.44,724.758,192,224,83.59
gluon_seresnext101_64x4d,264.12,482.344,128,224,88.23
ens_adv_inception_resnet_v2,261.47,730.952,192,299,55.84
inception_resnet_v2,261.44,730.919,192,299,55.84
xception65p,259.32,492.32,128,299,39.82
efficientnet_lite4,255.23,249.374,64,380,13.01
vit_base_patch16_rpn_224,254.51,753.593,192,224,86.54
resnest101e,253.9,501.575,128,256,48.28
crossvit_base_240,253.73,376.737,96,240,105.03
seresnext101_32x8d,251.44,506.792,128,224,93.57
vit_relpos_base_patch16_rpn_224,250.62,765.002,192,224,86.41
vit_base_patch16_plus_240,248.4,514.352,128,240,117.56
tf_efficientnet_lite4,247.32,257.437,64,380,13.01
efficientnet_b3_gn,245.44,258.998,64,288,11.73
dm_nfnet_f1,244.51,521.138,128,224,132.63
seresnext101d_32x8d,242.49,525.503,128,224,93.59
seresnet152d,242.25,392.75,96,256,66.84
xcit_tiny_24_p16_384_dist,241.62,526.318,128,384,12.12
vit_small_patch16_384,239.05,266.865,64,384,22.2
vit_relpos_base_patch16_plus_240,238.57,535.322,128,240,117.38
vit_large_r50_s32_224,237.94,401.033,96,224,328.99
resnetrs152,237.63,535.199,128,256,86.62
swinv2_tiny_window16_256,237.41,403.072,96,256,28.35
seresnextaa101d_32x8d,228.0,559.15,128,224,93.59
xcit_medium_24_p16_224_dist,227.91,558.239,128,224,84.4
deit3_small_patch16_384_in21ft1k,227.77,280.008,64,384,22.21
deit3_small_patch16_384,227.76,280.015,64,384,22.21
xcit_medium_24_p16_224,227.75,558.491,128,224,84.4
vit_small_r26_s32_384,227.25,280.302,64,384,36.47
convit_base,224.86,568.198,128,224,86.54
gluon_xception65,224.1,426.474,96,299,39.92
swin_s3_base_224,223.36,426.944,96,224,71.13
tnt_b_patch16_224,222.94,572.213,128,224,65.41
xception65,222.93,428.728,96,299,39.92
coat_mini,222.88,572.209,128,224,10.34
volo_d2_224,222.28,430.111,96,224,58.68
xcit_small_12_p16_384_dist,221.79,430.984,96,384,26.25
poolformer_m48,220.53,432.741,96,224,73.47
hrnet_w40,219.45,869.959,192,224,57.56
vit_base_r50_s16_224,215.41,443.988,96,224,98.66
swinv2_cr_base_ns_224,213.88,446.428,96,224,87.88
sequencer2d_l,213.86,444.098,96,224,54.3
swinv2_small_window8_256,212.59,449.01,96,256,49.73
swinv2_cr_base_224,211.39,451.682,96,224,87.88
mobilevitv2_150_384_in22ft1k,210.38,303.23,64,384,10.59
nest_base,210.2,302.774,64,224,67.72
tresnet_m_448,209.55,913.447,192,448,31.39
efficientnetv2_m,207.96,304.545,64,320,54.14
jx_nest_base,207.78,306.371,64,224,67.72
regnetz_c16_evos,207.35,306.824,64,256,13.49
hrnet_w44,206.45,925.026,192,224,67.06
resnet200d,204.47,623.017,128,256,64.69
efficientnet_b3_g8_gn,203.44,312.836,64,288,14.25
hrnet_w48,202.15,628.427,128,224,77.47
densenet264,202.1,470.789,96,224,72.69
dpn131,198.25,643.486,128,224,79.25
tf_efficientnet_b4,194.83,326.399,64,380,19.34
tf_efficientnet_b4_ap,194.65,326.738,64,380,19.34
tf_efficientnet_b4_ns,194.23,327.375,64,380,19.34
xcit_nano_12_p8_384_dist,187.76,338.965,64,384,3.05
efficientnetv2_rw_m,187.31,338.14,64,320,53.24
dpn107,187.14,682.151,128,224,86.92
convnext_large_in22ft1k,187.05,511.402,96,224,197.77
convnext_large,187.01,511.523,96,224,197.77
nf_regnet_b5,186.49,512.09,96,384,49.74
xcit_tiny_24_p8_224_dist,183.21,520.533,96,224,12.11
xcit_tiny_24_p8_224,183.21,520.609,96,224,12.11
halonet_h1,177.48,359.151,64,256,8.1
hrnet_w64,176.04,722.362,128,224,128.06
mobilevitv2_175_384_in22ft1k,175.76,363.135,64,384,14.25
senet154,174.83,545.792,96,224,115.09
regnety_320,174.41,732.528,128,224,145.05
gluon_senet154,174.03,548.162,96,224,115.09
regnetz_e8,173.89,365.999,64,256,57.7
legacy_senet154,170.27,560.493,96,224,115.09
xception71,168.81,376.911,64,299,42.34
xcit_small_12_p8_224,168.52,377.961,64,224,26.21
xcit_small_12_p8_224_dist,168.32,378.375,64,224,26.21
vit_large_patch32_384,168.05,569.595,96,384,306.63
convnext_small_384_in22ft1k,164.94,386.292,64,384,50.22
mixer_l16_224,164.43,582.335,96,224,208.2
ecaresnet200d,161.74,392.334,64,256,64.69
seresnet200d,161.56,391.892,64,256,71.86
resnetrs200,160.52,394.222,64,256,93.21
densenet264d_iabn,158.12,804.924,128,224,72.74
regnetx_320,155.94,819.702,128,224,107.81
swin_large_patch4_window7_224,153.79,414.255,64,224,196.53
volo_d3_224,152.73,416.584,64,224,86.33
mobilevitv2_200_384_in22ft1k,150.68,317.559,48,384,18.45
swinv2_base_window8_256,150.28,423.39,64,256,87.92
resnetv2_50x1_bitm,149.04,321.21,48,448,25.55
nfnet_f2,148.92,641.446,96,256,193.78
swinv2_small_window16_256,142.83,445.591,64,256,49.73
tf_efficientnetv2_m,142.41,333.833,48,384,54.14
eca_nfnet_l2,142.2,672.291,96,320,56.72
tf_efficientnetv2_m_in21ft1k,141.35,336.246,48,384,54.14
regnetz_d8_evos,132.2,360.995,48,256,23.46
swinv2_cr_tiny_384,131.5,485.388,64,384,28.33
ig_resnext101_32x16d,130.47,734.203,96,224,194.03
ssl_resnext101_32x16d,130.4,734.63,96,224,194.03
swsl_resnext101_32x16d,130.37,734.771,96,224,194.03
xcit_large_24_p16_224,126.98,500.577,64,224,189.1
xcit_large_24_p16_224_dist,126.97,500.662,64,224,189.1
seresnet269d,125.7,503.318,64,256,113.67
dm_nfnet_f2,125.2,507.681,64,256,193.78
swinv2_cr_large_224,124.7,510.736,64,224,196.68
xcit_tiny_12_p8_384_dist,122.38,390.412,48,384,6.71
resnetrs270,121.5,520.761,64,256,129.86
crossvit_15_dagger_408,117.57,270.29,32,408,28.5
vit_large_patch16_224,117.08,544.981,64,224,304.33
vit_base_patch16_18x2_224,116.55,546.378,64,224,256.73
convnext_base_384_in22ft1k,115.97,412.084,48,384,88.59
convnext_xlarge_in22ft1k,115.91,550.445,64,224,350.2
deit3_large_patch16_224_in21ft1k,113.19,563.501,64,224,304.37
deit3_large_patch16_224,113.17,563.634,64,224,304.37
xcit_small_24_p16_384_dist,112.88,421.839,48,384,47.67
beit_large_patch16_224,107.8,591.544,64,224,304.43
swinv2_base_window16_256,103.68,460.461,48,256,87.92
swinv2_base_window12to16_192to256_22kft1k,103.56,461.021,48,256,87.92
tresnet_l_448,103.2,1236.839,128,448,55.99
volo_d1_384,99.39,320.613,32,384,26.78
cait_xxs24_384,97.83,488.033,48,384,12.03
vit_base_patch16_384,96.96,329.192,32,384,86.86
deit_base_patch16_384,96.37,331.171,32,384,86.86
volo_d4_224,95.75,498.748,48,224,192.96
deit_base_distilled_patch16_384,94.83,336.556,32,384,87.63
efficientnet_b5,93.71,338.901,32,456,30.39
deit3_base_patch16_384,93.22,342.328,32,384,86.88
deit3_base_patch16_384_in21ft1k,92.68,344.327,32,384,86.88
tf_efficientnet_b5,92.16,344.785,32,456,30.39
tf_efficientnet_b5_ns,92.11,344.939,32,456,30.39
tf_efficientnet_b5_ap,91.98,345.405,32,456,30.39
resnetv2_152x2_bit_teacher,89.37,355.711,32,224,236.34
crossvit_18_dagger_408,88.76,358.492,32,408,44.61
xcit_small_24_p8_224,87.25,546.829,48,224,47.63
xcit_small_24_p8_224_dist,86.87,549.24,48,224,47.63
convmixer_768_32,85.53,1121.025,96,224,21.11
vit_large_patch14_224,85.27,561.239,48,224,304.2
eca_nfnet_l3,84.61,563.702,48,352,72.04
resnetv2_101x1_bitm,84.61,187.399,16,448,44.54
beit_base_patch16_384,83.67,381.291,32,384,86.74
resnest200e,83.33,570.802,48,320,70.2
tf_efficientnetv2_l_in21ft1k,83.27,379.678,32,384,118.52
efficientnetv2_l,83.27,379.867,32,384,118.52
tf_efficientnetv2_l,82.74,382.367,32,384,118.52
ecaresnet269d,82.15,579.477,48,320,102.09
tresnet_xl_448,78.31,1222.487,96,448,78.44
xcit_medium_24_p16_384_dist,77.9,407.346,32,384,84.4
vit_large_r50_s32_384,77.49,410.426,32,384,329.09
swinv2_cr_small_384,76.31,416.793,32,384,49.7
swin_base_patch4_window12_384,74.03,430.327,32,384,87.9
pnasnet5large,68.77,461.392,32,331,86.06
resnetrs350,68.24,460.967,32,288,163.96
nfnet_f3,67.87,703.087,48,320,254.92
nasnetalarge,67.38,469.785,32,331,88.75
resmlp_big_24_distilled_224,67.03,475.867,32,224,129.14
resmlp_big_24_224_in22ft1k,67.03,475.857,32,224,129.14
resmlp_big_24_224,67.02,475.97,32,224,129.14
cait_xs24_384,65.59,485.229,32,384,26.67
convnext_large_384_in22ft1k,63.62,501.159,32,384,197.77
vit_base_patch8_224,63.42,377.591,24,224,86.58
cait_xxs36_384,63.23,502.345,32,384,17.37
ig_resnext101_32x32d,62.72,508.666,32,224,468.53
xcit_tiny_24_p8_384_dist,62.14,511.676,32,384,12.11
volo_d5_224,61.58,516.467,32,224,295.46
vit_base_resnet50_384,61.06,391.43,24,384,98.95
vit_base_r50_s16_384,61.0,391.773,24,384,98.95
swinv2_large_window12to16_192to256_22kft1k,60.93,391.352,24,256,196.74
xcit_medium_24_p8_224,60.43,526.111,32,224,84.32
xcit_medium_24_p8_224_dist,60.03,529.652,32,224,84.32
xcit_small_12_p8_384_dist,57.75,413.763,24,384,26.21
dm_nfnet_f3,57.26,554.375,32,320,254.92
volo_d2_384,55.56,286.247,16,384,58.87
efficientnet_b6,54.98,288.003,16,528,43.04
swinv2_cr_base_384,54.71,436.265,24,384,87.88
tf_efficientnet_b6,54.38,291.338,16,528,43.04
tf_efficientnet_b6_ns,54.21,292.241,16,528,43.04
tf_efficientnet_b6_ap,54.17,292.479,16,528,43.04
efficientnetv2_xl,53.77,291.666,16,384,208.12
tf_efficientnetv2_xl_in21ft1k,53.07,295.611,16,384,208.12
convmixer_1536_20,50.1,957.271,48,224,51.63
swinv2_cr_huge_224,49.51,482.114,24,224,657.83
cait_s24_384,49.37,483.331,24,384,47.06
resnetrs420,48.12,489.4,24,320,191.89
xcit_large_24_p16_384_dist,45.6,522.909,24,384,189.1
swin_large_patch4_window12_384,41.65,382.145,16,384,196.74
convnext_xlarge_384_in22ft1k,40.18,595.51,24,384,350.2
vit_huge_patch14_224,39.94,398.436,16,224,632.05
deit3_huge_patch14_224_in21ft1k,38.36,414.578,16,224,632.13
deit3_huge_patch14_224,38.33,414.855,16,224,632.13
nfnet_f4,36.85,646.047,24,384,316.07
resnest269e,35.75,664.499,24,416,110.93
resnetv2_50x3_bitm,34.88,457.822,16,448,217.32
xcit_large_24_p8_224_dist,33.7,471.344,16,224,188.93
xcit_large_24_p8_224,33.68,471.512,16,224,188.93
resnetv2_152x2_bit_teacher_384,32.69,487.138,16,384,236.34
ig_resnext101_32x48d,32.39,492.417,16,224,828.41
swinv2_cr_large_384,32.36,491.839,16,384,196.68
cait_s36_384,31.92,497.404,16,384,68.37
efficientnet_b7,31.74,248.492,8,600,66.35
dm_nfnet_f4,31.4,758.349,24,384,316.07
tf_efficientnet_b7,31.4,251.12,8,600,66.35
tf_efficientnet_b7_ns,31.37,251.394,8,600,66.35
tf_efficientnet_b7_ap,31.35,251.548,8,600,66.35
xcit_small_24_p8_384_dist,29.2,544.65,16,384,47.63
vit_large_patch16_384,29.07,411.127,12,384,304.72
deit3_large_patch16_384,28.22,423.365,12,384,304.76
deit3_large_patch16_384_in21ft1k,28.19,423.825,12,384,304.76
swinv2_base_window12to24_192to384_22kft1k,28.14,423.938,12,384,87.92
beit_large_patch16_384,25.12,475.56,12,384,305.0
volo_d3_448,23.79,333.686,8,448,86.63
nfnet_f5,22.84,694.067,16,416,377.21
resnetv2_152x2_bitm,22.69,350.236,8,448,236.34
vit_giant_patch14_224,22.29,356.113,8,224,1012.61
dm_nfnet_f5,20.95,756.72,16,416,377.21
xcit_medium_24_p8_384_dist,19.97,397.272,8,384,84.32
efficientnet_b8,19.9,297.659,6,672,87.41
tf_efficientnet_b8_ap,19.66,301.198,6,672,87.41
tf_efficientnet_b8,19.65,301.246,6,672,87.41
nfnet_f6,18.5,641.193,12,448,438.36
resnetv2_101x3_bitm,18.0,442.742,8,448,387.93
volo_d4_448,16.87,353.154,6,448,193.41
swinv2_large_window12to24_192to384_22kft1k,16.59,359.187,6,384,196.74
dm_nfnet_f6,15.07,522.261,8,448,438.36
swinv2_cr_huge_384,12.92,461.964,6,384,657.94
nfnet_f7,12.67,622.861,8,480,499.5
cait_m36_384,11.76,506.439,6,384,271.22
xcit_large_24_p8_384_dist,11.53,516.755,6,384,188.93
volo_d5_448,11.08,357.783,4,448,295.91
tf_efficientnet_l2_ns_475,10.91,360.832,4,475,480.31
beit_large_patch16_512,9.42,422.333,4,512,305.67
volo_d5_512,7.72,385.462,3,512,296.09
resnetv2_152x4_bitm,4.91,404.529,2,480,936.53
cait_m48_448,4.71,419.69,2,448,356.46
efficientnet_l2,3.43,285.826,1,800,480.31
tf_efficientnet_l2_ns,3.42,287.247,1,800,480.31
hf_public_repos/pytorch-image-models/results/benchmark-train-amp-nhwc-pt112-cu113-rtx3090.csv
model,train_samples_per_sec,train_step_time,train_batch_size,train_img_size,param_count
tinynet_e,11915.85,41.681,512,106,2.04
mobilenetv3_small_050,11290.99,44.293,512,224,1.59
lcnet_035,10015.98,50.125,512,224,1.64
lcnet_050,9286.37,54.37,512,224,1.88
tf_mobilenetv3_small_minimal_100,9042.22,55.986,512,224,2.04
mobilenetv3_small_075,8679.98,58.254,512,224,2.04
mobilenetv3_small_100,8035.08,62.981,512,224,2.54
tinynet_d,7990.69,63.223,512,152,2.34
tf_mobilenetv3_small_075,7930.1,63.8,512,224,2.04
tf_mobilenetv3_small_100,7330.24,69.047,512,224,2.54
lcnet_075,6950.91,73.156,512,224,2.36
levit_128s,6539.16,77.346,512,224,7.78
resnet10t,6318.63,80.774,512,176,5.44
mnasnet_small,5607.09,90.422,512,224,2.03
lcnet_100,5354.67,95.126,512,224,2.95
mixer_s32_224,4943.04,103.013,512,224,19.1
mobilenetv2_035,4789.43,106.101,512,224,1.68
mnasnet_050,4680.08,108.62,512,224,2.22
levit_128,4558.28,111.213,512,224,9.21
cs3darknet_focus_s,4469.48,114.041,512,256,3.27
vit_small_patch32_224,4445.76,114.324,512,224,22.88
tinynet_c,4167.16,121.826,512,184,2.46
gernet_s,4165.03,122.198,512,224,8.17
cs3darknet_s,4110.51,124.007,512,256,3.28
regnetx_002,4105.04,124.027,512,224,2.68
mobilenetv2_050,4051.14,125.606,512,224,1.97
vit_tiny_r_s16_p8_224,4025.23,126.328,512,224,6.34
semnasnet_050,3904.91,130.185,512,224,2.08
regnety_002,3777.81,134.562,512,224,3.16
levit_192,3727.29,136.213,512,224,10.95
ghostnet_050,3670.99,138.144,512,224,2.59
ese_vovnet19b_slim_dw,3629.92,140.575,512,224,1.9
lcnet_150,3576.28,142.665,512,224,4.5
gluon_resnet18_v1b,3482.17,146.691,512,224,11.69
resnet18,3481.78,146.713,512,224,11.69
swsl_resnet18,3480.5,146.765,512,224,11.69
ssl_resnet18,3477.04,146.904,512,224,11.69
resnet14t,3472.37,147.102,512,176,10.08
tf_efficientnetv2_b0,3428.08,148.143,512,192,7.14
tf_mobilenetv3_large_minimal_100,3366.45,151.356,512,224,3.92
mnasnet_075,3238.88,157.273,512,224,3.17
tf_mobilenetv3_large_075,3189.08,159.67,512,224,3.99
seresnet18,3138.91,162.608,512,224,11.78
mobilenetv3_large_075,3095.0,164.56,512,224,3.99
legacy_seresnet18,3076.04,165.928,512,224,11.78
hardcorenas_a,2971.63,171.576,512,224,5.26
levit_256,2956.43,172.043,512,224,18.89
mnasnet_b1,2930.02,173.933,512,224,4.38
mnasnet_100,2929.31,173.976,512,224,4.38
tf_mobilenetv3_large_100,2907.93,175.204,512,224,5.48
resnet18d,2875.3,177.69,512,224,11.71
tinynet_b,2851.82,178.435,512,188,3.73
hardcorenas_b,2772.42,183.73,512,224,5.18
hardcorenas_c,2763.94,184.272,512,224,5.52
mobilenetv3_rw,2754.46,184.981,512,224,5.48
nf_regnet_b0,2740.89,185.595,512,192,8.76
mobilenetv3_large_100_miil,2733.62,186.4,512,224,5.48
mobilenetv3_large_100,2732.43,186.472,512,224,5.48
ese_vovnet19b_slim,2684.58,190.344,512,224,3.17
spnasnet_100,2610.47,195.171,512,224,4.42
mobilenetv2_075,2609.91,195.379,512,224,2.64
semnasnet_075,2603.1,195.762,512,224,2.91
hardcorenas_d,2566.48,198.271,512,224,7.5
tf_efficientnetv2_b1,2548.95,199.349,512,192,8.14
levit_256d,2522.09,201.424,512,224,26.21
fbnetc_100,2397.58,212.548,512,224,5.57
tinynet_a,2334.41,218.035,512,192,6.19
mobilenetv2_100,2313.1,220.563,512,224,3.5
vit_tiny_patch16_224,2299.56,221.804,512,224,5.72
mnasnet_a1,2291.94,222.453,512,224,3.89
deit_tiny_patch16_224,2290.33,222.697,512,224,5.72
semnasnet_100,2279.15,223.737,512,224,3.89
edgenext_xx_small,2271.04,224.572,512,256,1.33
dla46_c,2266.89,225.115,512,224,1.3
hardcorenas_f,2252.64,226.141,512,224,8.2
deit_tiny_distilled_patch16_224,2248.67,226.799,512,224,5.91
hardcorenas_e,2245.94,226.861,512,224,8.07
xcit_nano_12_p16_224_dist,2177.52,233.052,512,224,3.05
xcit_nano_12_p16_224,2170.17,234.054,512,224,3.05
tf_efficientnet_lite0,2134.89,239.057,512,224,4.65
ghostnet_100,2129.82,239.0,512,224,5.18
hrnet_w18_small,2121.96,239.906,512,224,13.19
regnety_004,2085.76,244.311,512,224,4.34
efficientnet_lite0,2079.28,245.485,512,224,4.65
cs3darknet_focus_m,2062.98,247.547,512,256,9.3
pit_ti_distilled_224,2061.94,247.414,512,224,5.1
mnasnet_140,2060.59,247.645,512,224,7.12
pit_ti_224,2057.02,247.989,512,224,4.85
gluon_resnet34_v1b,2039.68,250.446,512,224,21.8
tv_resnet34,2038.39,250.573,512,224,21.8
resnet34,2036.51,250.813,512,224,21.8
ese_vovnet19b_dw,1999.58,255.562,512,224,6.54
resnet26,1962.0,260.488,512,224,16.0
tf_efficientnetv2_b2,1951.52,260.748,512,208,10.1
skresnet18,1943.81,262.753,512,224,11.96
cs3darknet_m,1940.79,263.122,512,256,9.31
regnetz_005,1916.17,265.765,512,224,7.12
resnetblur18,1897.99,269.406,512,224,11.69
rexnetr_100,1893.12,201.724,384,224,4.88
nf_resnet26,1869.64,273.344,512,224,16.0
mobilenetv2_110d,1868.27,204.505,384,224,4.52
visformer_tiny,1861.63,274.356,512,224,10.32
mixer_b32_224,1856.75,274.965,512,224,60.29
seresnet34,1837.21,277.783,512,224,21.96
fbnetv3_b,1825.5,278.744,512,224,8.6
mobilevitv2_050,1824.87,279.552,512,256,1.37
gernet_m,1822.12,280.293,512,224,21.14
resnet34d,1813.16,281.758,512,224,21.82
levit_384,1801.0,283.153,512,224,39.13
legacy_seresnet34,1781.32,286.529,512,224,21.96
regnetx_004,1780.61,286.47,512,224,5.16
tf_efficientnet_b0_ns,1779.24,214.77,384,224,5.29
tf_efficientnet_b0,1779.02,214.765,384,224,5.29
tf_efficientnet_b0_ap,1777.73,214.898,384,224,5.29
efficientnet_b0,1751.72,291.183,512,224,5.29
selecsls42,1718.76,297.231,512,224,30.35
selecsls42b,1710.18,298.726,512,224,32.46
vit_base_patch32_224,1708.63,298.818,512,224,88.22
vit_base_patch32_224_sam,1707.5,298.997,512,224,88.22
efficientnet_es_pruned,1687.32,302.688,512,224,5.44
resnetrs50,1686.45,302.42,512,160,35.69
efficientnet_es,1686.11,302.906,512,224,5.44
mixer_s16_224,1660.76,307.737,512,224,18.53
darknet17,1654.01,309.253,512,256,14.3
mobilenetv2_140,1637.57,233.691,384,224,6.11
fbnetv3_d,1634.54,233.136,384,224,10.31
tf_efficientnet_es,1623.9,314.542,512,224,5.44
resnet26d,1623.31,314.899,512,224,16.01
mobilevit_xxs,1602.81,238.427,384,256,1.27
resmlp_12_distilled_224,1577.54,323.769,512,224,15.35
resmlp_12_224,1577.31,323.803,512,224,15.35
pit_xs_224,1555.66,328.198,512,224,10.62
pit_xs_distilled_224,1555.48,328.255,512,224,11.0
semnasnet_140,1546.19,330.184,512,224,6.11
ghostnet_130,1542.47,330.535,512,224,7.36
repvgg_b0,1538.07,331.828,512,224,15.82
efficientnet_lite1,1530.99,166.26,256,240,5.42
dla34,1524.02,335.337,512,224,15.74
edgenext_x_small,1512.48,337.399,512,256,2.34
darknet21,1486.14,344.159,512,256,20.86
selecsls60,1482.76,344.397,512,224,30.67
selecsls60b,1478.62,345.378,512,224,32.77
nf_seresnet26,1473.71,346.754,512,224,17.4
vit_small_patch32_384,1455.89,350.818,512,384,22.92
gmixer_12_224,1448.32,352.721,512,224,12.7
efficientnet_b1_pruned,1446.82,352.35,512,240,6.33
tf_efficientnet_lite1,1443.47,176.394,256,240,5.42
nf_ecaresnet26,1440.41,354.896,512,224,16.0
xcit_tiny_12_p16_224_dist,1426.36,357.157,512,224,6.72
xcit_tiny_12_p16_224,1426.18,357.168,512,224,6.72
sedarknet21,1401.98,364.696,512,256,20.95
rexnetr_130,1388.84,183.199,256,224,7.61
dla46x_c,1388.59,367.953,512,224,1.07
gmlp_ti16_224,1381.11,276.449,384,224,5.87
mixnet_s,1365.54,373.667,512,224,4.13
rexnet_100,1364.31,280.319,384,224,4.8
regnety_006,1361.43,374.963,512,224,6.06
mobilenetv2_120d,1352.9,188.013,256,224,5.83
legacy_seresnext26_32x4d,1349.26,378.798,512,224,16.79
crossvit_tiny_240,1348.01,378.219,512,240,7.01
vit_tiny_r_s16_p8_384,1345.31,284.562,384,384,6.36
poolformer_s12,1342.54,380.659,512,224,11.92
dla60x_c,1341.77,380.621,512,224,1.32
efficientnet_b1,1325.85,191.544,256,224,7.79
resnetv2_50,1288.61,396.553,512,224,25.55
regnetx_006,1286.44,397.176,512,224,6.2
crossvit_9_240,1258.73,303.637,384,240,8.55
convnext_nano_ols,1252.33,408.151,512,224,15.6
convnext_nano,1249.89,408.864,512,224,15.59
convnext_nano_hnf,1249.05,409.138,512,224,15.59
resnet26t,1237.34,413.275,512,256,16.01
tf_mixnet_s,1236.15,412.905,512,224,4.13
nf_regnet_b2,1229.56,414.759,512,240,14.31
rexnetr_150,1224.57,207.878,256,224,9.78
gluon_resnet50_v1b,1219.23,419.12,512,224,25.56
tv_resnet50,1218.99,419.17,512,224,25.56
crossvit_9_dagger_240,1218.38,313.701,384,240,8.78
resnet50,1218.01,419.528,512,224,25.56
swsl_resnet50,1217.39,419.737,512,224,25.56
ssl_resnet50,1217.38,419.757,512,224,25.56
cs3darknet_focus_l,1216.61,314.788,384,256,21.15
repvgg_a2,1214.87,420.579,512,224,28.21
cs3darknet_l,1203.14,318.267,384,256,21.16
gernet_l,1201.09,425.379,512,256,31.08
efficientnet_lite2,1191.67,213.855,256,260,6.09
nf_regnet_b1,1181.15,431.966,512,256,10.22
seresnext26d_32x4d,1178.86,325.051,384,224,16.81
botnet26t_256,1178.34,325.281,384,256,12.49
seresnext26tn_32x4d,1177.85,325.355,384,224,16.81
seresnext26t_32x4d,1176.65,325.669,384,224,16.81
mobilevitv2_075,1174.29,217.001,256,256,2.87
ecaresnext50t_32x4d,1159.52,330.605,384,224,15.41
ecaresnext26t_32x4d,1158.26,330.961,384,224,15.41
gluon_resnet50_v1c,1147.86,333.697,384,224,25.58
halonet26t,1136.15,337.402,384,256,12.48
resnetv2_50d,1134.86,450.316,512,224,25.57
resnetv2_50t,1132.89,451.133,512,224,25.57
edgenext_small,1127.71,452.849,512,256,5.59
tf_efficientnet_lite2,1121.02,227.403,256,260,6.09
convit_tiny,1118.98,456.53,512,224,5.71
skresnet34,1113.08,458.799,512,224,22.28
tf_efficientnet_b1,1099.77,231.299,256,240,7.79
tf_efficientnet_b1_ap,1099.37,231.402,256,240,7.79
efficientnetv2_rw_t,1098.86,230.78,256,224,13.65
tf_efficientnet_b1_ns,1098.29,231.567,256,240,7.79
ecaresnetlight,1091.16,468.275,512,224,30.16
gluon_resnet50_v1d,1084.38,353.226,384,224,25.58
dpn68b,1083.77,353.123,384,224,12.61
cs3sedarknet_l,1083.42,353.12,384,256,21.91
resnet50d,1078.0,355.348,384,224,25.58
resnet50t,1076.81,355.721,384,224,25.57
resnet32ts,1075.86,237.337,256,256,17.96
resnet33ts,1061.36,240.599,256,256,19.68
vit_small_patch16_224,1057.92,362.157,384,224,22.05
resnetaa50,1057.73,362.204,384,224,25.56
vit_small_resnet26d_224,1057.57,362.04,384,224,63.61
deit_small_patch16_224,1050.7,364.638,384,224,22.05
cspresnet50,1042.19,367.617,384,256,21.62
tf_efficientnetv2_b3,1041.71,243.94,256,240,14.36
regnetx_008,1034.73,493.971,512,224,7.26
ecaresnet26t,1033.34,371.048,384,256,16.01
deit_small_distilled_patch16_224,1028.8,372.398,384,224,22.44
vit_relpos_base_patch32_plus_rpn_256,1021.86,499.989,512,256,119.42
dla60,1020.05,375.488,384,224,22.04
res2net50_48w_2s,1018.83,376.079,384,224,25.29
gc_efficientnetv2_rw_t,1014.65,249.524,256,224,13.68
vit_relpos_small_patch16_rpn_224,1013.69,377.786,384,224,21.97
edgenext_small_rw,1011.18,505.339,512,256,7.83
pit_s_224,1010.83,378.943,384,224,23.46
seresnet33ts,1007.26,253.362,256,256,19.78
efficientnet_em,1007.19,253.179,256,240,6.9
vovnet39a,1006.62,507.995,512,224,22.6
legacy_seresnet50,1003.5,381.52,384,224,28.09
gluon_resnext50_32x4d,1001.3,382.689,384,224,25.03
tv_resnext50_32x4d,1001.18,382.711,384,224,25.03
resnext50_32x4d,1001.03,382.776,384,224,25.03
ssl_resnext50_32x4d,1000.68,382.908,384,224,25.03
eca_resnet33ts,999.77,255.368,256,256,19.68
swsl_resnext50_32x4d,997.37,384.186,384,224,25.03
regnety_008,993.3,514.408,512,224,6.26
dpn68,992.27,385.859,384,224,12.61
deit3_small_patch16_224,987.86,387.777,384,224,22.06
deit3_small_patch16_224_in21ft1k,987.15,388.058,384,224,22.06
gcresnet33ts,985.12,258.855,256,256,19.88
efficientnet_b2a,980.29,259.63,256,256,9.11
tf_efficientnet_em,980.0,260.253,256,240,6.9
efficientnet_b2,978.68,260.092,256,256,9.11
seresnet50,971.79,394.011,384,224,28.09
gluon_resnet50_v1s,970.71,394.714,384,224,25.68
vit_srelpos_small_patch16_224,969.18,395.281,384,224,21.97
vit_relpos_small_patch16_224,965.13,396.742,384,224,21.98
ecaresnet50d_pruned,964.18,530.07,512,224,19.94
cspresnet50d,956.82,266.672,256,256,21.64
vgg11,954.03,536.508,512,224,132.86
cspresnet50w,952.27,267.927,256,256,28.12
ese_vovnet39b,951.93,537.173,512,224,24.57
vit_base_patch32_plus_256,951.5,537.138,512,256,119.48
resnetaa50d,950.79,403.026,384,224,25.58
eca_vovnet39b,948.4,539.184,512,224,22.6
lambda_resnet26rpt_256,942.15,203.17,192,256,10.99
pit_s_distilled_224,934.29,273.079,256,224,24.04
mobilevit_xs,924.5,275.792,256,256,2.32
tv_densenet121,917.93,277.067,256,224,7.98
densenet121,913.65,278.353,256,224,7.98
resnetblur50,911.91,420.254,384,224,25.56
hrnet_w18_small_v2,910.26,559.998,512,224,15.6
coat_lite_tiny,909.29,421.406,384,224,5.72
mobilevitv2_100,907.45,281.094,256,256,4.9
nf_resnet50,900.11,425.722,384,256,25.56
resnext50d_32x4d,894.57,285.293,256,224,25.05
nf_seresnet50,892.73,428.967,384,224,28.09
rexnetr_200,890.57,214.407,192,224,16.52
efficientnet_cc_b0_4e,890.34,430.073,384,224,13.31
efficientnet_cc_b0_8e,889.37,430.553,384,224,24.01
dla60x,886.5,287.775,256,224,17.35
twins_svt_small,885.48,432.048,384,224,24.06
seresnet50t,879.71,435.29,384,224,28.1
mixnet_m,878.04,581.529,512,224,5.01
nf_ecaresnet50,875.38,437.674,384,224,25.56
efficientnet_b2_pruned,873.9,291.355,256,260,8.31
densenet121d,873.44,291.238,256,224,8.0
cspresnext50,868.23,294.006,256,256,20.57
rexnet_150,866.26,294.391,256,224,9.73
ecaresnet50d,862.65,444.205,384,224,25.58
fbnetv3_g,862.32,220.642,192,240,16.62
regnetz_b16,862.05,295.457,256,224,9.72
tf_efficientnet_cc_b0_4e,861.1,444.691,384,224,13.31
tf_efficientnet_cc_b0_8e,857.16,446.822,384,224,24.01
gcresnet50t,854.99,447.633,384,256,25.9
res2net50_26w_4s,851.03,449.921,384,224,25.7
coat_lite_mini,849.82,450.985,384,224,11.01
tf_efficientnet_b2_ap,849.52,224.466,192,260,9.11
tf_efficientnet_b2,848.58,224.736,192,260,9.11
tf_efficientnet_b2_ns,847.86,224.983,192,260,9.11
vit_base_resnet26d_224,844.62,453.315,384,224,101.4
vgg11_bn,832.74,460.889,384,224,132.87
vovnet57a,832.06,614.449,512,224,36.64
selecsls84,830.17,615.492,512,224,50.95
resnetblur50d,826.31,308.964,256,224,25.58
convnext_tiny_hnfd,820.9,310.941,256,224,28.59
convnext_tiny_hnf,819.46,311.471,256,224,28.59
convnext_tiny,819.24,311.536,256,224,28.59
convnext_tiny_in22ft1k,818.81,311.724,256,224,28.59
rexnet_130,816.78,312.226,256,224,7.56
seresnext50_32x4d,814.69,313.102,256,224,27.56
legacy_seresnext50_32x4d,813.61,313.477,256,224,27.56
gluon_seresnext50_32x4d,813.13,313.678,256,224,27.56
skresnet50,808.8,473.357,384,224,25.8
visformer_small,806.27,475.588,384,224,40.22
res2net50_14w_8s,794.56,319.93,256,224,25.06
densenetblur121d,789.33,322.521,256,224,8.0
seresnetaa50d,785.32,324.779,256,224,28.11
gluon_inception_v3,782.59,489.263,384,299,23.83
inception_v3,782.35,489.427,384,299,23.83
adv_inception_v3,778.18,491.976,384,299,23.83
resmlp_24_distilled_224,777.24,327.895,256,224,30.02
resmlp_24_224,776.95,327.972,256,224,30.02
tf_inception_v3,775.41,493.776,384,299,23.83
ese_vovnet57b,774.18,495.058,384,224,38.61
tf_mixnet_m,773.08,495.127,384,224,5.01
resnetv2_101,772.45,329.834,256,224,44.54
dla60_res2net,767.35,332.099,256,224,20.85
nf_regnet_b3,766.23,499.321,384,288,18.59
sehalonet33ts,763.66,334.4,256,256,13.69
ecaresnet101d_pruned,754.9,676.449,512,224,24.88
darknet53,753.16,339.081,256,256,41.61
densenet169,752.52,337.551,256,224,14.15
resnet101,747.89,340.74,256,224,44.55
gluon_resnet101_v1b,747.04,341.055,256,224,44.55
tv_resnet101,746.84,341.219,256,224,44.55
skresnet50d,739.17,344.891,256,224,25.82
twins_pcpvt_small,738.11,345.194,256,224,24.11
vit_small_r26_s32_224,733.9,347.477,256,224,36.43
mobilevit_s,733.0,260.821,192,256,5.58
darknetaa53,732.7,348.577,256,256,36.02
xcit_tiny_24_p16_224_dist,727.98,348.335,256,224,12.12
xcit_tiny_24_p16_224,727.1,348.63,256,224,12.12
efficientnet_b0_gn,724.56,352.174,256,224,5.29
efficientnet_b3_pruned,722.23,352.701,256,300,9.86
gluon_resnet101_v1c,717.66,355.143,256,224,44.57
resnext26ts,717.15,534.946,384,256,10.3
resnetv2_101d,715.45,356.238,256,224,44.56
gmixer_24_224,714.67,356.582,256,224,24.72
resnetrs101,714.37,356.071,256,192,63.62
nf_resnet101,712.1,537.603,384,224,44.55
efficientnet_lite3,702.44,181.104,128,300,8.2
mixnet_l,702.18,545.327,384,224,7.33
eca_resnext26ts,694.05,368.289,256,256,10.3
semobilevit_s,692.92,368.16,256,256,5.74
seresnext26ts,691.18,369.699,256,256,10.39
poolformer_s24,689.84,369.792,256,224,21.39
gluon_resnet101_v1d,688.26,370.323,256,224,44.57
dla102,688.03,370.524,256,224,33.27
vit_relpos_medium_patch16_rpn_224,687.13,371.514,256,224,38.73
sebotnet33ts_256,686.07,279.058,192,256,13.7
gcresnext26ts,683.09,373.929,256,256,10.48
regnetx_016,682.73,749.012,512,224,9.19
haloregnetz_b,680.78,374.495,256,224,11.68
cspdarknet53,679.01,375.961,256,256,27.64
vgg13,677.45,566.653,384,224,133.05
xcit_nano_12_p16_384_dist,671.72,379.231,256,384,3.05
wide_resnet50_2,668.78,573.358,384,224,68.88
tf_efficientnet_lite3,665.78,191.165,128,300,8.2
vit_relpos_medium_patch16_cls_224,661.77,385.665,256,224,38.76
vit_srelpos_medium_patch16_224,659.88,386.996,256,224,38.74
rexnet_200,659.06,290.146,192,224,16.37
vit_base_resnet50d_224,658.84,386.945,256,224,110.97
ecaresnet50t,657.78,388.237,256,256,25.57
gmlp_s16_224,657.63,290.408,192,224,19.42
vit_relpos_medium_patch16_224,657.05,388.484,256,224,38.75
tf_efficientnet_cc_b1_8e,654.82,389.25,256,240,39.72
regnety_016,650.07,785.757,512,224,11.2
swin_tiny_patch4_window7_224,648.69,393.641,256,224,28.29
xcit_small_12_p16_224,640.82,397.688,256,224,26.25
gluon_resnet101_v1s,640.79,397.908,256,224,44.67
xcit_small_12_p16_224_dist,639.99,398.193,256,224,26.25
crossvit_small_240,638.8,399.076,256,240,26.86
efficientnet_cc_b1_8e,637.42,399.94,256,240,39.72
resnetaa101d,634.86,401.619,256,224,44.57
cs3sedarknet_xdw,630.82,302.41,192,256,21.6
repvgg_b1,623.55,820.034,512,224,57.42
mobilevitv2_125,620.51,308.406,192,256,7.48
bat_resnext26ts,613.61,415.954,256,256,10.73
gluon_resnext101_32x4d,609.67,418.333,256,224,44.18
swsl_resnext101_32x4d,609.02,418.731,256,224,44.18
resnext101_32x4d,609.01,418.74,256,224,44.18
tf_mixnet_l,606.88,420.297,256,224,7.33
ssl_resnext101_32x4d,606.28,420.718,256,224,44.18
legacy_seresnet101,601.55,423.316,256,224,49.33
cs3darknet_focus_x,600.02,425.715,256,256,35.02
dla102x,598.42,319.231,192,224,26.31
halonet50ts,597.8,320.205,192,256,22.73
xcit_nano_12_p8_224,595.07,428.358,256,224,3.05
xcit_nano_12_p8_224_dist,593.27,429.695,256,224,3.05
cait_xxs24_224,593.22,428.92,256,224,11.96
seresnet101,590.42,431.41,256,224,49.33
swin_s3_tiny_224,588.57,433.98,256,224,28.33
resnetv2_50x1_bit_distilled,585.83,326.889,192,224,25.55
efficientnet_b0_g8_gn,582.67,438.264,256,224,6.56
crossvit_15_240,580.46,328.975,192,240,27.53
resnetblur101d,576.87,442.155,256,224,44.57
res2net50_26w_6s,573.8,444.339,256,224,37.05
vgg13_bn,573.47,446.125,256,224,133.05
efficientnet_b3a,572.29,221.925,128,288,12.23
efficientnet_b3,572.18,221.941,128,288,12.23
cs3darknet_x,571.12,447.259,256,256,35.05
densenet201,562.52,338.221,192,224,20.01
crossvit_15_dagger_240,562.49,339.489,192,240,28.21
efficientnetv2_s,559.15,226.702,128,288,21.46
eca_botnext26ts_256,558.6,457.666,256,256,10.59
mixer_b16_224,556.6,459.152,256,224,59.88
mixer_b16_224_miil,556.5,459.202,256,224,59.88
eca_halonext26ts,547.96,466.574,256,256,10.76
ecaresnet101d,546.58,466.555,256,224,44.57
vgg16,546.06,702.994,384,224,138.36
mixer_l32_224,543.38,351.819,192,224,206.94
vit_base_patch32_384,543.37,470.294,256,384,88.3
nf_seresnet101,540.53,471.014,256,224,49.33
resnetv2_152,536.63,474.697,256,224,60.19
botnet50ts_256,534.71,238.412,128,256,22.74
mobilevitv2_150,533.38,238.97,128,256,10.59
vit_base_r26_s32_224,533.28,358.697,192,224,101.38
mobilevitv2_150_in22ft1k,532.99,239.183,128,256,10.59
cs3sedarknet_x,531.85,479.872,256,256,35.4
nf_ecaresnet101,531.3,479.947,256,224,44.55
cs3edgenet_x,529.37,482.632,256,256,47.82
res2next50,528.59,483.023,256,224,24.67
res2net101_26w_4s,527.01,483.179,256,224,45.21
vit_large_patch32_224,524.74,486.172,256,224,306.54
resnet101d,523.85,364.964,192,256,44.57
efficientnetv2_rw_s,520.77,243.564,128,288,23.94
halo2botnet50ts_256,517.51,369.975,192,256,22.64
resmlp_36_distilled_224,513.23,371.84,192,224,44.69
vit_tiny_patch16_384,510.99,249.657,128,384,5.79
resmlp_36_224,509.53,374.55,192,224,44.69
swinv2_cr_tiny_224,506.82,503.861,256,224,28.33
mixnet_xl,505.67,504.387,256,224,11.9
resnetv2_50d_gn,505.25,379.149,192,224,25.57
swinv2_cr_tiny_ns_224,504.1,506.527,256,224,28.33
gluon_resnet152_v1b,502.02,380.204,192,224,60.19
regnetz_d8,501.8,253.463,128,256,23.37
resnet152,501.44,380.547,192,224,60.19
tv_resnet152,501.12,380.811,192,224,60.19
xception,497.64,256.367,128,299,22.86
regnety_032,496.85,771.405,384,224,19.44
tf_efficientnet_b3_ap,496.0,256.348,128,300,12.23
tf_efficientnet_b3,494.58,257.101,128,300,12.23
tf_efficientnet_b3_ns,492.45,258.213,128,300,12.23
convnext_small_in22ft1k,490.79,389.411,192,224,50.22
res2net50_26w_8s,489.22,520.921,256,224,48.4
tf_efficientnetv2_s_in21ft1k,488.77,259.622,128,300,21.46
tf_efficientnetv2_s,488.25,259.918,128,300,21.46
gluon_resnet152_v1c,488.2,390.894,192,224,60.21
convnext_small,487.15,392.394,192,224,50.22
twins_pcpvt_base,487.13,391.314,192,224,43.83
resnetv2_152d,486.82,392.059,192,224,60.2
legacy_seresnext101_32x4d,484.7,393.829,192,224,48.96
resnet50_gn,482.45,397.141,192,224,25.56
gluon_seresnext101_32x4d,480.63,397.197,192,224,48.96
hrnet_w32,480.24,528.215,256,224,41.23
sequencer2d_s,480.16,264.213,128,224,27.65
seresnext101_32x4d,479.03,398.526,192,224,48.96
nest_tiny,477.97,266.889,128,224,17.06
dla60_res2next,477.74,534.408,256,224,17.03
gluon_resnet152_v1d,476.32,400.788,192,224,60.21
regnetz_c16,475.79,402.061,192,256,13.46
hrnet_w18,473.42,535.867,256,224,21.3
jx_nest_tiny,472.84,269.807,128,224,17.06
regnetz_d32,471.85,269.596,128,256,27.58
regnetz_040,471.81,269.412,128,256,27.12
xception41p,471.79,270.424,128,299,26.91
vgg16_bn,470.3,543.999,256,224,138.37
regnetz_040h,469.27,270.846,128,256,28.94
poolformer_s36,467.39,408.832,192,224,30.86
resnet51q,463.81,551.102,256,256,35.7
efficientnet_el_pruned,461.98,275.957,128,300,10.59
efficientnet_el,461.97,275.94,128,300,10.59
coat_lite_small,461.36,414.606,192,224,19.84
nf_regnet_b4,457.97,417.049,192,320,30.21
vgg19,457.37,839.347,384,224,143.67
cs3se_edgenet_x,457.05,418.615,192,256,50.72
dla169,455.51,419.044,192,224,53.39
convit_small,454.72,421.197,192,224,27.78
gluon_resnet152_v1s,452.53,421.917,192,224,60.32
tf_efficientnet_el,449.8,283.446,128,300,10.59
gcresnext50ts,445.12,429.834,192,256,15.67
regnetx_040,442.35,866.937,384,224,22.12
vit_small_resnet50d_s16_224,437.26,437.826,192,224,57.53
volo_d1_224,437.04,437.842,192,224,26.63
mobilevitv2_175_in22ft1k,434.9,293.339,128,256,14.25
mobilevitv2_175,434.88,293.341,128,256,14.25
resnet61q,433.95,441.405,192,256,36.85
ese_vovnet99b,433.24,589.371,256,224,63.2
ese_vovnet39b_evos,430.54,296.328,128,224,24.58
twins_svt_base,425.02,449.64,192,224,56.07
resnest14d,411.87,1242.634,512,224,10.61
dla102x2,405.87,313.766,128,224,41.28
mobilevitv2_200_in22ft1k,405.83,314.414,128,256,18.45
mobilevitv2_200,405.76,314.447,128,256,18.45
inception_v4,405.43,471.399,192,299,42.68
crossvit_18_240,404.58,314.332,128,240,43.27
swin_small_patch4_window7_224,400.37,477.632,192,224,49.61
densenet161,399.17,318.189,128,224,28.68
vgg19_bn,398.5,642.012,256,224,143.68
legacy_seresnet152,398.4,478.588,192,224,66.82
vit_base_patch16_224_miil,397.86,481.79,192,224,86.54
sequencer2d_m,396.83,480.626,192,224,38.31
crossvit_18_dagger_240,396.31,320.926,128,240,44.27
resnetv2_50d_frn,394.18,323.553,128,224,25.59
vit_base_patch16_224,392.99,487.729,192,224,86.57
vit_base_patch16_224_sam,392.92,487.774,192,224,86.57
vit_base_patch16_rpn_224,391.32,489.846,192,224,86.54
xception41,391.05,326.045,128,299,26.97
deit_base_patch16_224,390.04,491.437,192,224,86.57
cait_xxs36_224,387.24,492.066,192,224,17.3
efficientnet_b0_g16_evos,386.23,993.13,384,224,8.11
deit_base_distilled_patch16_224,384.06,499.086,192,224,87.34
xcit_tiny_12_p16_384_dist,383.09,499.371,192,384,6.72
vit_relpos_base_patch16_rpn_224,382.95,500.328,192,224,86.41
seresnet152,379.38,334.127,128,224,66.82
resnetv2_50d_evos,374.27,340.812,128,224,25.59
deit3_base_patch16_224,374.05,512.341,192,224,86.59
deit3_base_patch16_224_in21ft1k,373.84,512.639,192,224,86.59
vit_relpos_base_patch16_clsgap_224,370.76,516.704,192,224,86.43
vit_relpos_base_patch16_cls_224,370.22,517.437,192,224,86.43
hrnet_w30,369.35,688.202,256,224,37.71
vit_relpos_base_patch16_224,368.93,519.279,192,224,86.43
gluon_resnext101_64x4d,363.93,350.084,128,224,83.46
resnext101_64x4d,363.79,350.21,128,224,83.46
beit_base_patch16_224,358.77,534.02,192,224,86.53
ens_adv_inception_resnet_v2,358.6,532.08,192,299,55.84
wide_resnet101_2,358.56,533.868,192,224,126.89
inception_resnet_v2,358.54,532.143,192,299,55.84
resnet200,357.55,355.04,128,224,64.67
resnet152d,357.54,355.729,128,256,60.21
swinv2_tiny_window8_256,357.0,536.56,192,256,28.35
efficientnet_b4,354.99,268.381,96,320,19.34
dpn92,353.0,723.721,256,224,37.67
repvgg_b2,352.05,1453.222,512,224,89.02
resnest50d_1s4x24d,349.97,730.142,256,224,25.68
regnetz_b16_evos,347.19,366.83,128,224,9.74
tnt_s_patch16_224,342.71,558.25,192,224,23.76
xception65p,341.43,373.588,128,299,39.82
convnext_base_in22ft1k,339.67,374.996,128,224,88.59
convnext_base,338.68,376.119,128,224,88.59
efficientnet_lite4,338.39,187.783,64,380,13.01
twins_pcpvt_large,333.52,379.633,128,224,60.99
pit_b_224,331.08,385.636,128,224,73.76
pit_b_distilled_224,328.87,388.208,128,224,74.79
xcit_small_24_p16_224_dist,326.41,388.817,128,224,47.67
xcit_small_24_p16_224,326.38,388.806,128,224,47.67
tf_efficientnet_lite4,324.6,195.748,64,380,13.01
eca_nfnet_l0,319.84,1599.745,512,224,24.14
nfnet_l0,319.69,1600.317,512,224,35.07
gluon_seresnext101_64x4d,319.51,398.398,128,224,88.23
repvgg_b3,316.57,1211.922,384,224,123.09
skresnext50_32x4d,315.84,809.121,256,224,27.48
poolformer_m36,315.69,403.496,128,224,56.17
ssl_resnext101_32x8d,313.35,406.924,128,224,88.79
resnext101_32x8d,312.8,407.622,128,224,88.79
swsl_resnext101_32x8d,312.76,407.724,128,224,88.79
ig_resnext101_32x8d,311.13,409.865,128,224,88.79
vit_small_patch16_36x1_224,309.04,411.365,128,224,64.67
regnetx_032,308.88,1241.936,384,224,15.3
vit_small_patch16_18x2_224,306.57,414.654,128,224,64.67
xcit_tiny_12_p8_224,306.37,415.93,128,224,6.71
cait_s24_224,305.74,415.965,128,224,46.92
xcit_tiny_12_p8_224_dist,304.23,418.886,128,224,6.71
swinv2_cr_small_ns_224,300.99,422.86,128,224,49.7
twins_svt_large,300.69,423.548,128,224,99.27
swinv2_cr_small_224,299.81,424.482,128,224,49.7
coat_tiny,298.83,426.275,128,224,5.5
resnest26d,298.23,1286.829,384,224,17.07
nest_small,296.79,321.765,96,224,38.35
jx_nest_small,293.75,325.094,96,224,38.35
swin_s3_small_224,290.56,438.612,128,224,49.74
dpn98,290.11,439.591,128,224,61.57
resnetv2_50d_evob,289.65,330.197,96,224,25.59
seresnet152d,283.9,447.414,128,256,66.84
gluon_xception65,283.18,337.068,96,299,39.92
convnext_tiny_384_in22ft1k,282.39,338.982,96,384,28.59
resnetrs152,282.26,450.046,128,256,86.62
xception65,281.11,339.548,96,299,39.92
swin_base_patch4_window7_224,281.0,453.662,128,224,87.77
hrnet_w48,279.44,682.135,192,224,77.47
mixnet_xxl,278.4,457.833,128,224,23.96
seresnext101_32x8d,278.13,458.033,128,224,93.57
gmlp_b16_224,275.97,346.253,96,224,73.08
seresnext101d_32x8d,270.35,471.144,128,224,93.59
resnet200d,267.1,476.272,128,256,64.69
nfnet_f0,265.61,1926.394,512,192,71.49
regnetz_e8,256.51,247.489,64,256,57.7
xcit_tiny_24_p16_384_dist,255.73,371.975,96,384,12.12
crossvit_base_240,254.6,375.374,96,240,105.03
dm_nfnet_f0,251.38,1526.301,384,192,71.49
hrnet_w40,249.23,765.525,192,224,57.56
vit_base_patch16_plus_240,246.96,517.368,128,240,117.56
efficientnetv2_m,246.55,256.379,64,320,54.14
vit_relpos_base_patch16_plus_240,244.88,521.493,128,240,117.38
seresnextaa101d_32x8d,243.89,522.629,128,224,93.59
tf_efficientnet_b4_ap,242.14,262.218,64,380,19.34
tf_efficientnet_b4,241.83,262.52,64,380,19.34
tf_efficientnet_b4_ns,241.46,263.01,64,380,19.34
xcit_medium_24_p16_224,241.39,526.926,128,224,84.4
xcit_medium_24_p16_224_dist,241.08,527.466,128,224,84.4
xcit_small_12_p16_384_dist,240.61,397.192,96,384,26.25
vit_small_patch16_384,239.06,266.856,64,384,22.2
volo_d2_224,238.89,400.019,96,224,58.68
swinv2_tiny_window16_256,238.79,400.76,96,256,28.35
mobilevitv2_150_384_in22ft1k,238.1,267.77,64,384,10.59
vit_large_r50_s32_224,236.27,403.797,96,224,328.99
tresnet_m,233.24,2192.365,512,224,31.39
hrnet_w44,232.48,820.975,192,224,67.06
poolformer_m48,232.43,410.471,96,224,73.47
densenet264,231.27,411.12,96,224,72.69
convit_base,231.06,552.947,128,224,86.54
nf_regnet_b5,228.54,417.354,96,384,49.74
deit3_small_patch16_384,226.74,281.318,64,384,22.21
deit3_small_patch16_384_in21ft1k,226.44,281.652,64,384,22.21
vit_small_r26_s32_384,226.15,281.726,64,384,36.47
coat_mini,225.14,566.497,128,224,10.34
efficientnetv2_rw_m,224.46,281.565,64,320,53.24
swin_s3_base_224,224.0,425.728,96,224,71.13
tnt_b_patch16_224,223.52,570.669,128,224,65.41
hrnet_w64,223.29,568.417,128,224,128.06
sequencer2d_l,220.03,286.022,64,224,54.3
dpn131,216.53,588.962,128,224,79.25
vit_base_r50_s16_224,215.49,443.851,96,224,98.66
swinv2_cr_base_ns_224,214.73,444.647,96,224,87.88
xception71,214.09,296.77,64,299,42.34
swinv2_cr_base_224,213.21,447.81,96,224,87.88
swinv2_small_window8_256,213.06,448.048,96,256,49.73
nest_base,210.25,302.717,64,224,67.72
jx_nest_base,209.06,304.441,64,224,67.72
seresnet200d,203.53,467.209,96,256,71.86
resnetrs200,201.84,471.293,96,256,93.21
resnest50d,201.6,1268.493,256,224,27.48
ecaresnet200d,201.55,472.938,96,256,64.69
xcit_nano_12_p8_384_dist,201.45,315.854,64,384,3.05
efficientnet_b3_gn,197.65,322.123,64,288,11.73
xcit_tiny_24_p8_224_dist,195.37,488.075,96,224,12.11
xcit_tiny_24_p8_224,195.11,488.622,96,224,12.11
dpn107,194.08,492.913,96,224,86.92
regnetz_c16_evos,193.89,328.188,64,256,13.49
regnety_040,190.14,2017.916,384,224,20.65
mobilevitv2_175_384_in22ft1k,189.6,336.534,64,384,14.25
regnetv_040,188.29,1358.084,256,224,20.64
convnext_large,187.93,509.087,96,224,197.77
convnext_large_in22ft1k,187.83,509.365,96,224,197.77
convmixer_768_32,187.17,511.603,96,224,21.11
regnetx_080,181.41,1409.979,256,224,39.57
resnest50d_4s2x40d,180.44,1417.38,256,224,30.42
xcit_small_12_p8_224,179.5,354.768,64,224,26.21
xcit_small_12_p8_224_dist,179.34,355.047,64,224,26.21
halonet_h1,176.7,360.706,64,256,8.1
tf_efficientnetv2_m_in21ft1k,175.14,270.794,48,384,54.14
mobilevitv2_200_384_in22ft1k,175.13,273.08,48,384,18.45
tf_efficientnetv2_m,173.37,273.617,48,384,54.14
mixer_l16_224,171.41,558.471,96,224,208.2
efficientnet_b3_g8_gn,168.79,377.376,64,288,14.25
repvgg_b1g4,167.59,3053.943,512,224,39.97
vit_large_patch32_384,167.04,573.058,96,384,306.63
convnext_small_384_in22ft1k,165.65,384.557,64,384,50.22
volo_d3_224,162.19,392.021,64,224,86.33
regnetz_d8_evos,155.31,307.002,48,256,23.46
swin_large_patch4_window7_224,153.79,414.289,64,224,196.53
swinv2_base_window8_256,151.21,420.663,64,256,87.92
convmixer_1024_20_ks9_p14,149.3,1713.726,256,224,24.38
resnetv2_50x1_bitm,147.75,215.764,32,448,25.55
seresnet269d,145.59,433.61,64,256,113.67
resnetrs270,144.14,437.83,64,256,129.86
swinv2_small_window16_256,143.52,443.487,64,256,49.73
regnety_040s_gn,142.58,896.132,128,224,20.65
repvgg_b2g4,133.72,3827.892,512,224,61.76
eca_nfnet_l1,133.59,1435.413,192,256,41.41
xcit_large_24_p16_224,132.6,479.222,64,224,189.1
swinv2_cr_tiny_384,131.94,483.86,64,384,28.33
xcit_large_24_p16_224_dist,131.66,482.65,64,224,189.1
xcit_tiny_12_p8_384_dist,131.64,362.75,48,384,6.71
regnetx_064,129.82,1970.916,256,224,26.21
swinv2_cr_large_224,124.15,513.018,64,224,196.68
xcit_small_24_p16_384_dist,120.64,394.328,48,384,47.67
regnety_064,119.44,2141.523,256,224,30.58
regnety_080,117.88,2170.37,256,224,39.18
crossvit_15_dagger_408,117.86,269.618,32,408,28.5
vit_large_patch16_224,117.2,544.512,64,224,304.33
regnetv_064,117.03,1638.944,192,224,30.58
ese_vovnet99b_iabn,117.02,3278.167,384,224,63.2
convnext_xlarge_in22ft1k,116.37,548.167,64,224,350.2
vit_base_patch16_18x2_224,116.0,548.972,64,224,256.73
convnext_base_384_in22ft1k,115.58,413.454,48,384,88.59
efficientnet_b5,113.63,279.129,32,456,30.39
deit3_large_patch16_224_in21ft1k,112.51,567.041,64,224,304.37
deit3_large_patch16_224,112.48,567.139,64,224,304.37
tf_efficientnet_b5,111.42,284.665,32,456,30.39
tf_efficientnet_b5_ap,111.14,285.451,32,456,30.39
tf_efficientnet_b5_ns,111.14,285.33,32,456,30.39
legacy_senet154,110.98,861.567,96,224,115.09
senet154,110.82,862.828,96,224,115.09
gluon_senet154,110.77,863.12,96,224,115.09
beit_large_patch16_224,109.02,584.818,64,224,304.43
repvgg_b3g4,108.77,3529.239,384,224,83.83
regnetx_160,107.6,1783.261,192,224,54.28
nfnet_f1,107.01,1791.907,192,224,132.63
volo_d1_384,105.69,301.347,32,384,26.78
swinv2_base_window16_256,103.88,459.56,48,256,87.92
swinv2_base_window12to16_192to256_22kft1k,103.79,460.002,48,256,87.92
tresnet_l,102.82,4975.916,512,224,55.99
dm_nfnet_f1,101.59,1257.525,128,224,132.63
volo_d4_224,101.08,472.359,48,224,192.96
cait_xxs24_384,99.39,480.268,48,384,12.03
ecaresnet269d,99.06,479.988,48,320,102.09
efficientnetv2_l,98.76,319.521,32,384,118.52
tf_efficientnetv2_l_in21ft1k,98.35,320.759,32,384,118.52
tf_efficientnetv2_l,97.56,323.47,32,384,118.52
deit_base_patch16_384,97.3,328.042,32,384,86.86
vit_base_patch16_384,97.1,328.712,32,384,86.86
resnest101e,96.09,1329.413,128,256,48.28
deit_base_distilled_patch16_384,94.63,337.315,32,384,87.63
regnetx_120,94.03,2721.558,256,224,46.11
deit3_base_patch16_384,93.5,341.294,32,384,86.88
deit3_base_patch16_384_in21ft1k,93.49,341.309,32,384,86.88
xcit_small_24_p8_224_dist,92.61,514.968,48,224,47.63
xcit_small_24_p8_224,92.51,515.466,48,224,47.63
regnety_120,92.07,2083.952,192,224,51.82
tresnet_xl,91.15,4209.119,384,224,78.44
crossvit_18_dagger_408,89.17,356.787,32,408,44.61
resnetv2_152x2_bit_teacher,89.16,356.538,32,224,236.34
vit_large_patch14_224,85.06,562.673,48,224,304.2
resnetv2_101x1_bitm,84.72,187.286,16,448,44.54
resnetrs350,84.14,372.211,32,288,163.96
beit_base_patch16_384,83.87,380.424,32,384,86.74
regnety_160,83.24,2305.144,192,224,83.59
pnasnet5large,83.16,380.801,32,331,86.06
xcit_medium_24_p16_384_dist,82.74,383.266,32,384,84.4
vit_large_r50_s32_384,77.34,411.186,32,384,329.09
nasnetalarge,77.32,408.633,32,331,88.75
swinv2_cr_small_384,76.42,416.277,32,384,49.7
swin_base_patch4_window12_384,74.73,426.327,32,384,87.9
resmlp_big_24_distilled_224,70.88,449.95,32,224,129.14
resmlp_big_24_224_in22ft1k,70.88,449.933,32,224,129.14
resmlp_big_24_224,70.41,452.99,32,224,129.14
regnety_320,66.33,1928.357,128,224,145.05
xcit_tiny_24_p8_384_dist,66.29,479.361,32,384,12.11
cait_xs24_384,66.24,480.525,32,384,26.67
ig_resnext101_32x16d,65.9,1455.165,96,224,194.03
ssl_resnext101_32x16d,65.74,1458.688,96,224,194.03
swsl_resnext101_32x16d,65.74,1458.738,96,224,194.03
volo_d5_224,64.41,493.535,32,224,295.46
cait_xxs36_384,64.34,493.602,32,384,17.37
efficientnet_b6,64.08,246.77,16,528,43.04
xcit_medium_24_p8_224,63.96,496.86,32,224,84.32
xcit_medium_24_p8_224_dist,63.93,497.194,32,224,84.32
convnext_large_384_in22ft1k,63.85,499.388,32,384,197.77
vit_base_patch8_224,63.45,377.425,24,224,86.58
tf_efficientnet_b6_ns,63.1,250.577,16,528,43.04
tf_efficientnet_b6,62.84,251.669,16,528,43.04
tf_efficientnet_b6_ap,62.76,252.073,16,528,43.04
efficientnetv2_xl,62.18,251.438,16,384,208.12
tf_efficientnetv2_xl_in21ft1k,62.14,251.721,16,384,208.12
xcit_small_12_p8_384_dist,61.84,386.224,24,384,26.21
vit_base_r50_s16_384,61.01,391.67,24,384,98.95
vit_base_resnet50_384,60.98,391.903,24,384,98.95
swinv2_large_window12to16_192to256_22kft1k,60.98,391.098,24,256,196.74
eca_nfnet_l2,58.72,1632.112,96,320,56.72
volo_d2_384,58.5,271.766,16,384,58.87
resnetrs420,56.49,415.629,24,320,191.89
swinv2_cr_base_384,55.08,433.269,24,384,87.88
nfnet_f2,54.73,1750.573,96,256,193.78
tresnet_m_448,53.52,3584.333,192,448,31.39
dm_nfnet_f2,51.26,1245.084,64,256,193.78
cait_s24_384,50.14,476.064,24,384,47.06
swinv2_cr_huge_224,49.63,481.092,24,224,657.83
regnetx_320,48.06,2662.024,128,224,107.81
xcit_large_24_p16_384_dist,48.02,496.416,24,384,189.1
swin_large_patch4_window12_384,41.75,381.31,16,384,196.74
convnext_xlarge_384_in22ft1k,40.45,591.485,24,384,350.2
deit3_huge_patch14_224_in21ft1k,38.41,414.036,16,224,632.13
deit3_huge_patch14_224,38.4,414.103,16,224,632.13
efficientnet_b7,37.97,207.232,8,600,66.35
tf_efficientnet_b7_ap,37.27,210.986,8,600,66.35
tf_efficientnet_b7_ns,37.25,211.249,8,600,66.35
tf_efficientnet_b7,37.22,211.329,8,600,66.35
eca_nfnet_l3,35.61,1344.526,48,352,72.04
xcit_large_24_p8_224_dist,35.32,449.68,16,224,188.93
xcit_large_24_p8_224,35.06,452.952,16,224,188.93
resnetv2_50x3_bitm,34.68,460.605,16,448,217.32
swinv2_cr_large_384,32.56,488.949,16,384,196.68
cait_s36_384,32.3,491.641,16,384,68.37
densenet264d_iabn,32.11,3982.48,128,224,72.74
resnetv2_152x2_bit_teacher_384,31.17,382.508,12,384,236.34
xcit_small_24_p8_384_dist,31.12,510.761,16,384,47.63
resnest200e,30.3,1579.149,48,320,70.2
vit_large_patch16_384,29.14,410.147,12,384,304.72
deit3_large_patch16_384,28.26,422.768,12,384,304.76
deit3_large_patch16_384_in21ft1k,28.25,422.94,12,384,304.76
swinv2_base_window12to24_192to384_22kft1k,28.19,423.281,12,384,87.92
nfnet_f3,26.1,1834.868,48,320,254.92
beit_large_patch16_384,25.3,472.219,12,384,305.0
tresnet_l_448,25.28,5060.321,128,448,55.99
volo_d3_448,24.7,321.403,8,448,86.63
dm_nfnet_f3,24.56,1297.967,32,320,254.92
tresnet_xl_448,23.27,4122.323,96,448,78.44
efficientnet_b8,22.93,257.589,6,672,87.41
tf_efficientnet_b8,22.71,260.18,6,672,87.41
tf_efficientnet_b8_ap,22.69,260.555,6,672,87.41
resnetv2_152x2_bitm,22.58,351.774,8,448,236.34
vit_giant_patch14_224,22.32,355.766,8,224,1012.61
ig_resnext101_32x32d,21.03,1519.993,32,224,468.53
xcit_medium_24_p8_384_dist,21.03,376.997,8,384,84.32
convmixer_1536_20,20.83,2303.059,48,224,51.63
resnetv2_101x3_bitm,18.05,441.706,8,448,387.93
volo_d4_448,17.57,338.789,6,448,193.41
swinv2_large_window12to24_192to384_22kft1k,16.62,358.548,6,384,196.74
resnest269e,16.0,1493.085,24,416,110.93
nfnet_f4,14.17,1687.74,24,384,316.07
swinv2_cr_huge_384,13.13,454.627,6,384,657.94
dm_nfnet_f4,12.96,1229.019,16,384,316.07
xcit_large_24_p8_384_dist,12.19,488.758,6,384,188.93
cait_m36_384,11.91,500.148,6,384,271.22
volo_d5_448,11.43,346.654,4,448,295.91
ig_resnext101_32x48d,11.2,1427.437,16,224,828.41
tf_efficientnet_l2_ns_475,10.96,267.912,3,475,480.31
dm_nfnet_f5,9.76,1222.345,12,416,377.21
beit_large_patch16_512,9.42,422.548,4,512,305.67
volo_d5_512,8.0,371.847,3,512,296.09
nfnet_f5,8.0,1992.337,16,416,377.21
dm_nfnet_f6,7.45,1065.231,8,448,438.36
nfnet_f6,5.82,2052.248,12,448,438.36
nfnet_f7,5.73,1387.07,8,480,499.5
resnetv2_152x4_bitm,4.89,406.668,2,480,936.53
cait_m48_448,4.76,414.936,2,448,356.46
efficientnet_l2,3.95,247.515,1,800,480.31
tf_efficientnet_l2_ns,3.93,248.975,1,800,480.31
hf_public_repos/pytorch-image-models/results/benchmark-infer-amp-nchw-pt113-cu117-rtx3090.csv
model,infer_samples_per_sec,infer_step_time,infer_batch_size,infer_img_size,infer_gmacs,infer_macts,param_count
tinynet_e,49277.65,20.77,1024,106,0.03,0.69,2.04
mobilenetv3_small_050,45562.75,22.464,1024,224,0.03,0.92,1.59
lcnet_035,41026.68,24.949,1024,224,0.03,1.04,1.64
lcnet_050,37575.13,27.242,1024,224,0.05,1.26,1.88
mobilenetv3_small_075,33062.39,30.961,1024,224,0.05,1.3,2.04
mobilenetv3_small_100,30012.26,34.109,1024,224,0.06,1.42,2.54
tf_mobilenetv3_small_minimal_100,28698.14,35.672,1024,224,0.06,1.41,2.04
tf_mobilenetv3_small_075,27407.51,37.352,1024,224,0.05,1.3,2.04
tinynet_d,27236.47,37.585,1024,152,0.05,1.42,2.34
tf_mobilenetv3_small_100,25103.65,40.781,1024,224,0.06,1.42,2.54
lcnet_075,24140.95,42.406,1024,224,0.1,1.99,2.36
mnasnet_small,20706.43,49.443,1024,224,0.07,2.16,2.03
levit_128s,20595.72,49.709,1024,224,0.31,1.88,7.78
lcnet_100,19684.75,52.01,1024,224,0.16,2.52,2.95
mobilenetv2_035,18358.82,55.767,1024,224,0.07,2.86,1.68
regnetx_002,18244.04,56.117,1024,224,0.2,2.16,2.68
ghostnet_050,17564.96,58.287,1024,224,0.05,1.77,2.59
regnety_002,17006.07,60.202,1024,224,0.2,2.17,3.16
mnasnet_050,15925.32,64.29,1024,224,0.11,3.07,2.22
vit_tiny_r_s16_p8_224,15068.38,67.946,1024,224,0.44,2.06,6.34
mobilenetv2_050,14843.74,68.974,1024,224,0.1,3.64,1.97
tinynet_c,14634.69,69.959,1024,184,0.11,2.87,2.46
semnasnet_050,14248.78,71.855,1024,224,0.11,3.44,2.08
levit_128,14164.26,72.284,1024,224,0.41,2.71,9.21
vit_small_patch32_224,13811.36,74.131,1024,224,1.15,2.5,22.88
mixer_s32_224,13352.85,76.677,1024,224,1.0,2.28,19.1
cs3darknet_focus_s,12798.44,79.999,1024,256,0.69,2.7,3.27
lcnet_150,12783.12,80.094,1024,224,0.34,3.79,4.5
cs3darknet_s,12395.11,82.602,1024,256,0.72,2.97,3.28
regnetx_004,12366.39,82.791,1024,224,0.4,3.14,5.16
mobilenetv3_large_075,12001.32,85.313,1024,224,0.16,4.0,3.99
levit_192,11882.81,86.163,1024,224,0.66,3.2,10.95
resnet10t,11615.84,88.145,1024,224,1.1,2.43,5.44
ese_vovnet19b_slim_dw,11539.4,88.729,1024,224,0.4,5.28,1.9
gernet_s,11496.77,89.058,1024,224,0.75,2.65,8.17
mobilenetv3_rw,10873.77,94.16,1024,224,0.23,4.41,5.48
mobilenetv3_large_100,10705.06,95.645,1024,224,0.23,4.41,5.48
hardcorenas_a,10554.34,97.012,1024,224,0.23,4.38,5.26
tf_mobilenetv3_large_075,10511.12,97.41,1024,224,0.16,4.0,3.99
tf_mobilenetv3_large_minimal_100,10371.16,98.725,1024,224,0.22,4.4,3.92
mnasnet_075,10345.17,98.972,1024,224,0.23,4.77,3.17
hardcorenas_b,9695.74,105.601,1024,224,0.26,5.09,5.18
regnety_004,9655.22,106.046,1024,224,0.41,3.89,4.34
ghostnet_100,9483.99,107.96,1024,224,0.15,3.55,5.18
hardcorenas_c,9481.05,107.994,1024,224,0.28,5.01,5.52
tf_mobilenetv3_large_100,9456.79,108.271,1024,224,0.23,4.41,5.48
regnetx_006,9408.22,108.83,1024,224,0.61,3.98,6.2
mobilenetv2_075,9313.88,109.932,1024,224,0.22,5.86,2.64
tinynet_b,9291.99,110.191,1024,188,0.21,4.44,3.73
mnasnet_b1,9286.4,110.258,1024,224,0.33,5.46,4.38
mnasnet_100,9263.52,110.53,1024,224,0.33,5.46,4.38
gluon_resnet18_v1b,9078.31,112.785,1024,224,1.82,2.48,11.69
semnasnet_075,9069.42,112.895,1024,224,0.23,5.54,2.91
resnet18,9045.63,113.192,1024,224,1.82,2.48,11.69
ssl_resnet18,9045.4,113.196,1024,224,1.82,2.48,11.69
swsl_resnet18,9040.4,113.258,1024,224,1.82,2.48,11.69
levit_256,8921.47,114.768,1024,224,1.13,4.23,18.89
hardcorenas_d,8879.46,115.311,1024,224,0.3,4.93,7.5
regnety_006,8666.48,118.144,1024,224,0.61,4.33,6.06
seresnet18,8542.99,119.851,1024,224,1.82,2.49,11.78
mobilenetv2_100,8507.29,120.356,1024,224,0.31,6.68,3.5
spnasnet_100,8342.04,122.741,1024,224,0.35,6.03,4.42
legacy_seresnet18,8310.8,123.202,1024,224,1.82,2.49,11.78
semnasnet_100,8284.16,123.599,1024,224,0.32,6.23,3.89
mnasnet_a1,8283.57,123.607,1024,224,0.32,6.23,3.89
regnetx_008,7852.75,130.39,1024,224,0.81,5.15,7.26
hardcorenas_f,7809.07,131.117,1024,224,0.35,5.57,8.2
hardcorenas_e,7730.97,132.444,1024,224,0.35,5.65,8.07
efficientnet_lite0,7722.75,132.584,1024,224,0.4,6.74,4.65
levit_256d,7689.03,133.165,1024,224,1.4,4.93,26.21
xcit_nano_12_p16_224_dist,7674.8,133.413,1024,224,0.56,4.17,3.05
xcit_nano_12_p16_224,7670.11,133.492,1024,224,0.56,4.17,3.05
resnet18d,7636.48,134.082,1024,224,2.06,3.29,11.71
ghostnet_130,7625.58,134.274,1024,224,0.24,4.6,7.36
tf_efficientnetv2_b0,7614.25,134.473,1024,224,0.73,4.77,7.14
ese_vovnet19b_slim,7588.4,134.932,1024,224,1.69,3.52,3.17
deit_tiny_distilled_patch16_224,7449.3,137.451,1024,224,1.27,6.01,5.91
deit_tiny_patch16_224,7398.73,138.391,1024,224,1.26,5.97,5.72
vit_tiny_patch16_224,7390.78,138.538,1024,224,1.26,5.97,5.72
regnety_008,7366.88,138.989,1024,224,0.81,5.25,6.26
tinynet_a,7358.6,139.145,1024,192,0.35,5.41,6.19
dla46_c,7311.64,140.038,1024,224,0.58,4.5,1.3
fbnetc_100,7303.94,140.187,1024,224,0.4,6.51,5.57
mobilevitv2_050,7248.37,141.262,1024,256,0.48,8.04,1.37
tf_efficientnet_lite0,6816.26,150.218,1024,224,0.4,6.74,4.65
pit_ti_distilled_224,6788.49,150.832,1024,224,0.71,6.23,5.1
pit_ti_224,6762.99,151.401,1024,224,0.7,6.19,4.85
efficientnet_b0,6687.26,153.115,1024,224,0.4,6.75,5.29
visformer_tiny,6618.81,154.698,1024,224,1.27,5.72,10.32
rexnet_100,6608.65,154.937,1024,224,0.41,7.44,4.8
mnasnet_140,6580.58,155.597,1024,224,0.6,7.71,7.12
efficientnet_b1_pruned,6513.48,157.201,1024,240,0.4,6.21,6.33
rexnetr_100,6491.35,157.737,1024,224,0.43,7.72,4.88
mobilenetv2_110d,6395.98,160.089,1024,224,0.45,8.71,4.52
resnet14t,6341.58,161.462,1024,224,1.69,5.8,10.08
regnetz_005,6208.75,164.916,1024,224,0.52,5.86,7.12
dla46x_c,6145.64,166.61,1024,224,0.54,5.66,1.07
nf_regnet_b0,6055.0,169.104,1024,256,0.64,5.58,8.76
tf_efficientnet_b0,5992.76,170.862,1024,224,0.4,6.75,5.29
hrnet_w18_small,5908.15,173.308,1024,224,1.61,5.72,13.19
edgenext_xx_small,5886.07,173.957,1024,288,0.33,4.21,1.33
semnasnet_140,5856.63,174.833,1024,224,0.6,8.87,6.11
resnetblur18,5839.81,175.336,1024,224,2.34,3.39,11.69
ese_vovnet19b_dw,5825.11,175.779,1024,224,1.34,8.25,6.54
dla60x_c,5790.89,176.817,1024,224,0.59,6.01,1.32
mobilenetv2_140,5780.41,177.139,1024,224,0.6,9.57,6.11
skresnet18,5648.81,181.265,1024,224,1.82,3.24,11.96
mobilevit_xxs,5528.18,185.22,1024,256,0.42,8.34,1.27
efficientnet_b0_gn,5401.88,189.551,1024,224,0.42,6.75,5.29
convnext_atto,5364.13,190.886,1024,288,0.91,6.3,3.7
gluon_resnet34_v1b,5344.34,191.593,1024,224,3.67,3.74,21.8
resnet34,5335.05,191.926,1024,224,3.67,3.74,21.8
efficientnet_lite1,5334.12,191.959,1024,240,0.62,10.14,5.42
tv_resnet34,5332.7,192.011,1024,224,3.67,3.74,21.8
vit_base_patch32_224,5287.0,193.67,1024,224,4.41,5.01,88.22
vit_base_patch32_clip_224,5281.4,193.877,1024,224,4.41,5.01,88.22
levit_384,5276.74,194.047,1024,224,2.36,6.26,39.13
pit_xs_distilled_224,5241.4,195.357,1024,224,1.41,7.76,11.0
pit_xs_224,5237.09,195.517,1024,224,1.4,7.71,10.62
selecsls42,5225.99,195.932,1024,224,2.94,4.62,30.35
selecsls42b,5201.55,196.853,1024,224,2.98,4.62,32.46
gernet_m,5124.67,199.807,1024,224,3.02,5.24,21.14
pvt_v2_b0,5122.72,199.882,1024,224,0.57,7.99,3.67
tf_efficientnetv2_b1,5122.21,199.903,1024,240,1.21,7.34,8.14
mixnet_s,5079.84,201.57,1024,224,0.25,6.25,4.13
convnext_atto_ols,5062.64,202.255,1024,288,0.96,6.8,3.7
seresnet34,5028.88,203.611,1024,224,3.67,3.74,21.96
rexnetr_130,5003.96,204.626,1024,224,0.68,9.81,7.61
fbnetv3_b,5003.0,204.666,1024,256,0.55,9.1,8.6
mixer_b32_224,4982.51,205.508,1024,224,3.24,6.29,60.29
xcit_tiny_12_p16_224_dist,4879.26,209.853,1024,224,1.24,6.29,6.72
legacy_seresnet34,4875.12,210.034,1024,224,3.67,3.74,21.96
xcit_tiny_12_p16_224,4870.16,210.244,1024,224,1.24,6.29,6.72
resnet34d,4834.78,211.786,1024,224,3.91,4.54,21.82
tf_efficientnet_lite1,4822.03,212.348,1024,240,0.62,10.14,5.42
resnet26,4794.98,213.545,1024,224,2.36,7.35,16.0
mobilenetv2_120d,4786.27,213.934,1024,224,0.69,11.97,5.83
rexnet_130,4770.1,214.659,1024,224,0.68,9.71,7.56
efficientnet_b0_g16_evos,4743.69,215.854,1024,224,1.01,7.42,8.11
efficientnet_es,4736.89,216.163,1024,224,1.81,8.73,5.44
efficientnet_es_pruned,4735.25,216.239,1024,224,1.81,8.73,5.44
tf_mixnet_s,4735.17,216.242,1024,224,0.25,6.25,4.13
gmlp_ti16_224,4709.0,217.445,1024,224,1.34,7.55,5.87
convnext_femto,4672.08,219.162,1024,288,1.3,7.56,5.22
mobilevitv2_075,4638.17,220.764,1024,256,1.05,12.06,2.87
resmlp_12_224,4601.92,222.504,1024,224,3.01,5.5,15.35
resmlp_12_distilled_224,4597.97,222.695,1024,224,3.01,5.5,15.35
gmixer_12_224,4543.02,225.388,1024,224,2.67,7.26,12.7
fbnetv3_d,4532.2,225.927,1024,256,0.68,11.1,10.31
tf_efficientnet_es,4518.93,226.591,1024,224,1.81,8.73,5.44
selecsls60,4510.1,227.034,1024,224,3.59,5.52,30.67
mixer_s16_224,4509.29,227.075,1024,224,3.79,5.97,18.53
regnetx_016,4507.02,227.189,1024,224,1.62,7.93,9.19
selecsls60b,4490.35,228.033,1024,224,3.63,5.52,32.77
cs3darknet_focus_m,4487.64,228.171,1024,288,2.51,6.19,9.3
dla34,4481.03,228.505,1024,224,3.07,5.02,15.74
crossvit_tiny_240,4476.83,228.722,1024,240,1.57,9.08,7.01
convnext_femto_ols,4473.25,228.904,1024,288,1.35,8.06,5.23
vit_tiny_r_s16_p8_384,4463.13,229.423,1024,384,1.34,6.49,6.36
cs3darknet_m,4452.94,229.949,1024,288,2.63,6.69,9.31
repvgg_b0,4433.11,230.978,1024,224,3.41,6.15,15.82
resnet26d,4354.59,235.143,1024,224,2.6,8.15,16.01
rexnetr_150,4349.97,235.392,1024,224,0.89,11.13,9.78
resnetaa34d,4309.77,237.588,1024,224,4.43,5.07,21.82
efficientnet_b2_pruned,4309.58,237.598,1024,260,0.73,9.13,8.31
darknet17,4296.61,238.316,1024,256,3.26,7.18,14.3
vit_small_patch32_384,4250.58,240.897,1024,384,3.45,8.25,22.92
crossvit_9_240,4201.98,243.683,1024,240,1.85,9.52,8.55
nf_resnet26,4197.39,243.949,1024,224,2.41,7.35,16.0
efficientnet_b0_g8_gn,4190.39,244.357,1024,224,0.66,6.75,6.56
rexnet_150,4186.31,244.594,1024,224,0.9,11.21,9.73
ecaresnet50d_pruned,4182.62,244.81,1024,224,2.53,6.43,19.94
efficientformer_l1,4075.83,251.225,1024,224,1.3,5.53,12.29
poolformer_s12,4050.19,252.815,1024,224,1.82,5.53,11.92
regnety_016,4035.9,253.712,1024,224,1.63,8.04,11.2
efficientnet_lite2,4013.48,255.128,1024,260,0.89,12.9,6.09
crossvit_9_dagger_240,3992.98,256.437,1024,240,1.99,9.97,8.78
efficientnet_cc_b0_8e,3929.29,260.595,1024,224,0.42,9.42,24.01
efficientnet_cc_b0_4e,3918.01,261.346,1024,224,0.41,9.42,13.31
darknet21,3914.26,261.596,1024,256,3.93,7.47,20.86
efficientnet_b1,3876.9,264.116,1024,256,0.77,12.22,7.79
tf_efficientnet_b1,3834.3,267.052,1024,240,0.71,10.88,7.79
resnest14d,3793.21,269.944,1024,224,2.76,7.33,10.61
sedarknet21,3784.73,270.549,1024,256,3.93,7.47,20.95
resnext26ts,3775.5,271.211,1024,256,2.43,10.52,10.3
tf_efficientnetv2_b2,3727.06,274.735,1024,260,1.72,9.84,10.1
convnext_pico,3702.78,276.537,1024,288,2.27,10.08,9.05
edgenext_x_small,3692.42,277.311,1024,288,0.68,7.5,2.34
tf_efficientnet_cc_b0_8e,3691.33,277.395,1024,224,0.42,9.42,24.01
dpn48b,3689.99,277.494,1024,224,1.69,8.92,9.13
eca_resnext26ts,3675.59,278.583,1024,256,2.43,10.52,10.3
seresnext26ts,3670.33,278.98,1024,256,2.43,10.52,10.39
tf_efficientnet_cc_b0_4e,3665.41,279.357,1024,224,0.41,9.42,13.31
tf_efficientnet_lite2,3662.0,279.618,1024,260,0.89,12.9,6.09
nf_ecaresnet26,3619.99,282.862,1024,224,2.41,7.36,16.0
nf_seresnet26,3618.8,282.955,1024,224,2.41,7.36,17.4
gcresnext26ts,3594.7,284.852,1024,256,2.43,10.53,10.48
mobilevitv2_100,3589.19,213.964,768,256,1.84,16.08,4.9
gernet_l,3556.24,287.933,1024,256,4.57,8.0,31.08
legacy_seresnext26_32x4d,3545.88,288.774,1024,224,2.49,9.39,16.79
convnext_pico_ols,3532.27,289.886,1024,288,2.37,10.74,9.06
resnet26t,3503.33,292.28,1024,256,3.35,10.52,16.01
repvgg_a2,3454.82,296.386,1024,224,5.7,6.26,28.21
mixnet_m,3418.52,299.526,1024,224,0.36,8.19,5.01
efficientnet_b3_pruned,3356.7,305.049,1024,300,1.04,11.86,9.86
nf_regnet_b1,3352.23,305.456,1024,288,1.02,9.2,10.22
ecaresnext50t_32x4d,3339.2,306.649,1024,224,2.7,10.09,15.41
ecaresnext26t_32x4d,3337.18,306.833,1024,224,2.7,10.09,15.41
seresnext26tn_32x4d,3327.66,307.711,1024,224,2.7,10.09,16.81
seresnext26t_32x4d,3327.23,307.751,1024,224,2.7,10.09,16.81
seresnext26d_32x4d,3303.57,309.954,1024,224,2.73,10.19,16.81
tf_mixnet_m,3301.19,310.17,1024,224,0.36,8.19,5.01
convit_tiny,3286.62,311.554,1024,224,1.26,7.94,5.71
mobilevit_xs,3278.19,234.265,768,256,1.05,16.33,2.32
pit_s_224,3268.88,313.245,1024,224,2.88,11.56,23.46
pit_s_distilled_224,3266.72,313.452,1024,224,2.9,11.64,24.04
skresnet34,3242.45,315.8,1024,224,3.67,5.13,22.28
eca_botnext26ts_256,3224.24,317.583,1024,256,2.46,11.6,10.59
ecaresnet101d_pruned,3223.88,317.616,1024,224,3.48,7.69,24.88
deit_small_distilled_patch16_224,3220.79,317.922,1024,224,4.63,12.02,22.44
ecaresnetlight,3215.57,318.439,1024,224,4.11,8.42,30.16
deit_small_patch16_224,3209.05,319.085,1024,224,4.61,11.95,22.05
vit_small_patch16_224,3199.98,319.99,1024,224,4.61,11.95,22.05
eca_halonext26ts,3173.71,322.639,1024,256,2.44,11.46,10.76
convnextv2_atto,3162.98,323.733,1024,288,0.91,6.3,3.71
resnetv2_50,3158.28,324.214,1024,224,4.11,11.11,25.55
nf_regnet_b2,3133.63,326.765,1024,272,1.22,9.27,14.31
rexnetr_200,3133.12,245.111,768,224,1.59,15.11,16.52
botnet26t_256,3123.98,327.772,1024,256,3.32,11.98,12.49
coat_lite_tiny,3113.54,328.874,1024,224,1.6,11.65,5.72
vit_small_r26_s32_224,3112.34,329.001,1024,224,3.56,9.85,36.43
bat_resnext26ts,3103.95,329.89,1024,256,2.53,12.51,10.73
halonet26t,3103.39,329.95,1024,256,3.19,11.69,12.48
pvt_v2_b1,3095.14,330.828,1024,224,2.12,15.39,14.01
cspresnet50,3063.22,334.278,1024,256,4.54,11.5,21.62
resnet32ts,3055.79,335.09,1024,256,4.63,11.58,17.96
rexnet_200,3051.5,251.668,768,224,1.56,14.91,16.37
lambda_resnet26t,3046.2,336.144,1024,256,3.02,11.87,10.96
ssl_resnet50,3030.48,337.887,1024,224,4.11,11.11,25.56
gluon_resnet50_v1b,3027.43,338.23,1024,224,4.11,11.11,25.56
tv_resnet50,3027.39,338.232,1024,224,4.11,11.11,25.56
swsl_resnet50,3027.07,338.268,1024,224,4.11,11.11,25.56
resnet50,3025.4,338.455,1024,224,4.11,11.11,25.56
deit3_small_patch16_224_in21ft1k,3023.02,338.721,1024,224,4.61,11.95,22.06
deit3_small_patch16_224,3017.77,339.312,1024,224,4.61,11.95,22.06
tresnet_m,3006.54,340.578,1024,224,5.74,7.31,31.39
resnet33ts,3005.78,340.665,1024,256,4.76,11.66,19.68
vit_small_resnet26d_224,2994.08,341.995,1024,224,5.07,11.12,63.61
resnetv2_50t,2989.06,342.569,1024,224,4.32,11.82,25.57
regnetx_032,2988.15,342.675,1024,224,3.2,11.37,15.3
dpn68b,2981.13,343.481,1024,224,2.35,10.47,12.61
hrnet_w18_small_v2,2978.67,343.765,1024,224,2.62,9.65,15.6
dpn68,2975.29,344.155,1024,224,2.35,10.47,12.61
resnetv2_50d,2971.15,344.633,1024,224,4.35,11.92,25.57
efficientnet_em,2938.12,348.51,1024,240,3.04,14.34,6.9
vit_base_patch32_plus_256,2934.64,348.925,1024,256,7.79,7.76,119.48
coat_lite_mini,2921.75,350.462,1024,224,2.0,12.25,11.01
tf_efficientnet_b2,2919.63,350.718,1024,260,1.02,13.83,9.11
seresnet33ts,2919.51,350.732,1024,256,4.76,11.66,19.78
eca_resnet33ts,2917.21,351.008,1024,256,4.76,11.66,19.68
haloregnetz_b,2890.29,354.276,1024,224,1.97,11.94,11.68
coatnet_pico_rw_224,2884.58,354.98,1024,224,2.05,14.62,10.85
dla60,2883.99,355.049,1024,224,4.26,10.16,22.04
gluon_resnet50_v1c,2872.58,356.463,1024,224,4.35,11.92,25.58
resnet50t,2869.49,356.844,1024,224,4.32,11.82,25.57
gcresnet33ts,2863.36,357.609,1024,256,4.76,11.68,19.88
gluon_resnet50_v1d,2853.24,358.879,1024,224,4.35,11.92,25.58
cspresnet50d,2852.98,358.911,1024,256,4.86,12.55,21.64
resnet50d,2850.55,359.218,1024,224,4.35,11.92,25.58
vovnet39a,2845.31,359.878,1024,224,7.09,6.73,22.6
cspresnet50w,2835.31,361.148,1024,256,5.04,12.19,28.12
vgg11,2827.53,362.143,1024,224,7.61,7.44,132.86
tf_efficientnet_em,2826.28,362.303,1024,240,3.04,14.34,6.9
visformer_small,2818.88,363.251,1024,224,4.88,11.43,40.22
vit_relpos_small_patch16_224,2792.87,366.637,1024,224,4.59,13.05,21.98
vit_relpos_base_patch32_plus_rpn_256,2784.26,367.771,1024,256,7.68,8.01,119.42
vit_srelpos_small_patch16_224,2781.72,368.106,1024,224,4.59,12.16,21.97
resnest26d,2772.97,369.267,1024,224,3.64,9.97,17.07
cs3darknet_focus_l,2770.5,369.596,1024,288,5.9,10.16,21.15
efficientnet_b2a,2767.64,369.979,1024,288,1.12,16.2,9.11
efficientnet_b2,2766.98,370.065,1024,288,1.12,16.2,9.11
ese_vovnet39b,2760.12,370.986,1024,224,7.09,6.74,24.57
legacy_seresnet50,2753.49,371.881,1024,224,3.88,10.6,28.09
densenet121,2749.79,372.378,1024,224,2.87,6.9,7.98
tv_densenet121,2747.16,372.735,1024,224,2.87,6.9,7.98
eca_vovnet39b,2736.53,374.185,1024,224,7.09,6.74,22.6
coatnet_nano_cc_224,2716.19,376.986,1024,224,2.24,15.02,13.76
convnextv2_femto,2710.95,377.714,1024,288,1.3,7.56,5.23
resnetv2_50x1_bit_distilled,2704.93,378.554,1024,224,4.23,11.11,25.55
selecsls84,2697.2,379.64,1024,224,5.9,7.57,50.95
flexivit_small,2693.55,380.153,1024,240,5.35,14.18,22.06
twins_svt_small,2691.25,380.48,1024,224,2.94,13.75,24.06
mixnet_l,2678.25,382.327,1024,224,0.58,10.84,7.33
seresnet50,2674.61,382.848,1024,224,4.11,11.13,28.09
xcit_nano_12_p16_384_dist,2668.39,383.74,1024,384,1.64,12.15,3.05
cs3darknet_l,2649.93,386.412,1024,288,6.16,10.83,21.16
coatnet_nano_rw_224,2633.36,388.844,1024,224,2.41,15.41,15.14
coatnext_nano_rw_224,2627.24,389.75,1024,224,2.47,12.8,14.7
xcit_tiny_24_p16_224_dist,2617.14,391.253,1024,224,2.34,11.82,12.12
densenet121d,2616.98,391.278,1024,224,3.11,7.7,8.0
xcit_tiny_24_p16_224,2614.91,391.584,1024,224,2.34,11.82,12.12
resnet50_gn,2599.07,393.975,1024,224,4.14,11.11,25.56
vit_relpos_small_patch16_rpn_224,2596.73,394.33,1024,224,4.59,13.05,21.97
res2net50_48w_2s,2593.21,394.865,1024,224,4.18,11.72,25.29
mobilevit_s,2587.93,296.749,768,256,2.03,19.94,5.58
convnext_nano,2579.36,396.983,1024,288,4.06,13.84,15.59
tf_mixnet_l,2577.4,397.288,1024,224,0.58,10.84,7.33
resnetaa50d,2573.35,397.912,1024,224,5.39,12.44,25.58
vgg11_bn,2556.04,400.607,1024,224,7.62,7.44,132.87
seresnet50t,2550.33,401.504,1024,224,4.32,11.83,28.1
ecaresnet50d,2544.16,402.478,1024,224,4.35,11.93,25.58
gcvit_xxtiny,2518.13,406.639,1024,224,2.14,15.36,12.0
cs3sedarknet_l,2502.51,409.176,1024,288,6.16,10.83,21.91
resnetrs50,2497.73,409.96,1024,224,4.48,12.14,35.69
mobilevitv2_125,2489.87,308.438,768,256,2.86,20.1,7.48
resnetblur50,2484.87,412.08,1024,224,5.16,12.02,25.56
cspresnext50,2483.24,412.352,1024,256,4.05,15.86,20.57
gluon_resnet50_v1s,2459.02,416.413,1024,224,5.47,13.52,25.68
efficientnet_cc_b1_8e,2458.85,416.443,1024,240,0.75,15.44,39.72
vit_base_resnet26d_224,2458.01,416.584,1024,224,6.97,13.16,101.4
densenetblur121d,2444.58,418.873,1024,224,3.11,7.9,8.0
tv_resnext50_32x4d,2431.41,421.143,1024,224,4.26,14.4,25.03
ssl_resnext50_32x4d,2431.35,421.155,1024,224,4.26,14.4,25.03
swsl_resnext50_32x4d,2430.87,421.236,1024,224,4.26,14.4,25.03
resnext50_32x4d,2429.56,421.462,1024,224,4.26,14.4,25.03
gluon_resnext50_32x4d,2428.35,421.674,1024,224,4.26,14.4,25.03
dla60x,2414.82,424.035,1024,224,3.54,13.8,17.35
efficientnet_lite3,2407.43,212.664,512,300,1.65,21.85,8.2
regnetx_040,2406.98,425.416,1024,224,3.99,12.2,22.12
semobilevit_s,2404.63,319.371,768,256,2.03,19.95,5.74
gcresnext50ts,2402.57,426.196,1024,256,3.75,15.46,15.67
regnety_040s_gn,2385.11,429.317,1024,224,4.03,12.29,20.65
resnetblur50d,2367.52,432.507,1024,224,5.4,12.82,25.58
vovnet57a,2360.79,433.737,1024,224,8.95,7.52,36.64
tf_efficientnet_cc_b1_8e,2357.71,434.307,1024,240,0.75,15.44,39.72
resmlp_24_distilled_224,2351.85,435.39,1024,224,5.96,10.91,30.02
resmlp_24_224,2345.81,436.509,1024,224,5.96,10.91,30.02
res2net50_14w_8s,2341.48,437.317,1024,224,4.21,13.28,25.06
coatnet_rmlp_nano_rw_224,2340.53,437.494,1024,224,2.62,20.34,15.15
sehalonet33ts,2339.44,328.271,768,256,3.55,14.7,13.69
res2net50_26w_4s,2338.49,437.876,1024,224,4.28,12.61,25.7
convnext_nano_ols,2328.37,439.779,1024,288,4.38,15.5,15.65
lambda_resnet26rpt_256,2324.88,165.158,384,256,3.16,11.87,10.99
gmixer_24_224,2324.82,440.451,1024,224,5.28,14.45,24.72
gcresnet50t,2321.78,441.028,1024,256,5.42,14.67,25.9
resnext50d_32x4d,2317.05,441.929,1024,224,4.5,15.2,25.05
resnest50d_1s4x24d,2309.9,443.296,1024,224,4.43,13.57,25.68
seresnetaa50d,2309.78,443.319,1024,224,5.4,12.46,28.11
dla60_res2net,2301.91,444.834,1024,224,4.15,12.34,20.85
vit_base_r26_s32_224,2301.77,444.864,1024,224,6.81,12.36,101.38
twins_pcpvt_small,2290.09,447.132,1024,224,3.83,18.08,24.11
regnetz_b16,2286.62,447.81,1024,288,2.39,16.43,9.72
ese_vovnet57b,2267.23,451.64,1024,224,8.95,7.52,38.61
gluon_inception_v3,2265.31,452.024,1024,299,5.73,8.97,23.83
inception_v3,2260.97,452.888,1024,299,5.73,8.97,23.83
adv_inception_v3,2258.89,453.305,1024,299,5.73,8.97,23.83
tf_inception_v3,2255.73,453.943,1024,299,5.73,8.97,23.83
densenet169,2232.91,458.582,1024,224,3.4,7.3,14.15
tf_efficientnetv2_b3,2223.64,460.493,1024,300,3.04,15.74,14.36
nf_ecaresnet50,2211.52,463.019,1024,224,4.21,11.13,25.56
nf_seresnet50,2207.21,463.921,1024,224,4.21,11.13,28.09
skresnet50,2206.75,464.017,1024,224,4.11,12.5,25.8
edgenext_small,2206.31,464.109,1024,320,1.97,14.16,5.59
seresnext50_32x4d,2197.09,466.058,1024,224,4.26,14.42,27.56
gluon_seresnext50_32x4d,2196.94,466.091,1024,224,4.26,14.42,27.56
xcit_small_12_p16_224_dist,2195.81,466.33,1024,224,4.82,12.58,26.25
legacy_seresnext50_32x4d,2193.34,466.856,1024,224,4.26,14.42,27.56
xcit_small_12_p16_224,2190.16,467.534,1024,224,4.82,12.58,26.25
repvgg_b1g4,2188.83,467.817,1024,224,8.15,10.64,39.97
tf_efficientnet_lite3,2188.37,233.953,512,300,1.65,21.85,8.2
efficientnetv2_rw_t,2170.03,471.87,1024,288,3.19,16.42,13.65
gmlp_s16_224,2164.56,473.061,1024,224,4.42,15.1,19.42
dla60_res2next,2126.26,481.583,1024,224,3.49,13.17,17.03
gc_efficientnetv2_rw_t,2126.09,481.621,1024,288,3.2,16.45,13.68
skresnet50d,2112.57,484.703,1024,224,4.36,13.31,25.82
mobilevitv2_150,2105.0,243.219,512,256,4.09,24.11,10.59
mobilevitv2_150_in22ft1k,2104.51,243.274,512,256,4.09,24.11,10.59
convnextv2_pico,2092.16,489.434,1024,288,2.27,10.08,9.07
poolformer_s24,2090.38,489.851,1024,224,3.41,10.68,21.39
cs3sedarknet_xdw,2090.04,489.929,1024,256,5.97,17.18,21.6
res2next50,2085.23,491.055,1024,224,4.2,13.71,24.67
cspdarknet53,2084.51,491.231,1024,256,6.57,16.81,27.64
fbnetv3_g,2084.48,491.238,1024,288,1.77,21.09,16.62
crossvit_small_240,2074.04,493.709,1024,240,5.63,18.17,26.86
deit3_medium_patch16_224_in21ft1k,2064.27,496.046,1024,224,8.0,15.93,38.85
deit3_medium_patch16_224,2063.34,496.268,1024,224,8.0,15.93,38.85
xcit_nano_12_p8_224_dist,2049.01,499.742,1024,224,2.16,15.71,3.05
xcit_nano_12_p8_224,2044.48,500.848,1024,224,2.16,15.71,3.05
nf_regnet_b3,2035.39,503.085,1024,320,2.05,14.61,18.59
cs3darknet_focus_x,2017.73,507.488,1024,256,8.03,10.69,35.02
vit_relpos_medium_patch16_cls_224,2000.38,511.89,1024,224,8.03,18.24,38.76
lambda_resnet50ts,1991.21,514.246,1024,256,5.07,17.48,21.54
swin_tiny_patch4_window7_224,1978.72,517.495,1024,224,4.51,17.06,28.29
sebotnet33ts_256,1959.75,195.932,384,256,3.89,17.46,13.7
coatnet_0_rw_224,1957.32,523.148,1024,224,4.43,18.73,27.44
ecaresnet26t,1953.32,524.224,1024,320,5.24,16.44,16.01
regnetx_080,1942.5,527.144,1024,224,8.02,14.06,39.57
gcvit_xtiny,1941.57,527.393,1024,224,2.93,20.26,19.98
resnetv2_101,1925.46,531.806,1024,224,7.83,16.23,44.54
regnetx_064,1920.06,533.303,1024,224,6.49,16.37,26.21
mixnet_xl,1918.85,533.64,1024,224,0.93,14.57,11.9
edgenext_small_rw,1912.9,535.3,1024,320,2.46,14.85,7.83
vit_relpos_medium_patch16_224,1907.96,536.687,1024,224,7.97,17.02,38.75
vit_srelpos_medium_patch16_224,1900.57,538.773,1024,224,7.96,16.21,38.74
resnest50d,1896.74,539.858,1024,224,5.4,14.36,27.48
crossvit_15_240,1894.86,540.397,1024,240,5.81,19.77,27.53
vit_base_resnet50d_224,1892.78,540.989,1024,224,8.73,16.92,110.97
gluon_resnet101_v1b,1879.26,544.883,1024,224,7.83,16.23,44.55
tv_resnet101,1878.26,545.172,1024,224,7.83,16.23,44.55
resnet101,1875.25,546.047,1024,224,7.83,16.23,44.55
dla102,1873.79,546.472,1024,224,7.19,14.18,33.27
efficientformer_l3,1868.08,548.142,1024,224,3.93,12.01,31.41
maxvit_rmlp_pico_rw_256,1866.73,411.402,768,256,1.85,24.86,7.52
resnetv2_101d,1855.94,551.727,1024,224,8.07,17.04,44.56
pvt_v2_b2,1835.92,557.745,1024,224,4.05,27.53,25.36
maxvit_pico_rw_256,1829.44,419.787,768,256,1.83,22.3,7.46
vgg13,1820.36,562.512,1024,224,11.31,12.25,133.05
lamhalobotnet50ts_256,1818.57,563.067,1024,256,5.02,18.44,22.57
crossvit_15_dagger_240,1817.96,563.255,1024,240,6.13,20.43,28.21
gluon_resnet101_v1c,1816.14,563.82,1024,224,8.08,17.04,44.57
res2net50_26w_6s,1811.81,565.168,1024,224,6.33,15.28,37.05
gluon_resnet101_v1d,1808.21,566.295,1024,224,8.08,17.04,44.57
swin_s3_tiny_224,1803.67,567.72,1024,224,4.64,19.13,28.33
coatnet_rmlp_0_rw_224,1803.63,567.733,1024,224,4.72,24.89,27.45
vit_relpos_medium_patch16_rpn_224,1770.72,578.284,1024,224,7.97,17.02,38.73
halonet50ts,1765.73,579.917,1024,256,5.3,19.2,22.73
repvgg_b1,1760.92,581.5,1024,224,13.16,10.64,57.42
coatnet_bn_0_rw_224,1753.99,583.799,1024,224,4.67,22.04,27.44
wide_resnet50_2,1747.87,585.844,1024,224,11.43,14.4,68.88
efficientnet_b3,1741.21,294.036,512,320,2.01,26.52,12.23
efficientnet_b3a,1740.84,294.1,512,320,2.01,26.52,12.23
densenet201,1738.22,589.096,1024,224,4.34,7.85,20.01
coatnet_0_224,1727.45,296.376,512,224,4.58,24.01,25.04
darknetaa53,1721.33,594.876,1024,288,10.08,15.68,36.02
tf_efficientnet_b3,1720.61,297.558,512,300,1.87,23.83,12.23
cait_xxs24_224,1720.1,595.301,1024,224,2.53,20.29,11.96
vit_large_patch32_224,1718.53,595.845,1024,224,15.41,13.32,327.9
mobilevitv2_175,1697.71,301.572,512,256,5.54,28.13,14.25
mobilevitv2_175_in22ft1k,1697.51,301.606,512,256,5.54,28.13,14.25
xcit_tiny_12_p16_384_dist,1694.92,604.145,1024,384,3.64,18.26,6.72
pvt_v2_b2_li,1694.45,604.311,1024,224,3.91,27.6,22.55
coat_lite_small,1694.41,604.328,1024,224,3.96,22.09,19.84
resnetaa101d,1692.59,604.976,1024,224,9.12,17.56,44.57
legacy_seresnet101,1686.93,607.005,1024,224,7.61,15.74,49.33
tresnet_v2_l,1685.52,607.515,1024,224,8.81,16.34,46.17
hrnet_w18,1679.12,609.832,1024,224,4.32,16.31,21.3
vit_medium_patch16_gap_240,1667.0,614.264,1024,240,9.22,18.81,44.4
vit_tiny_patch16_384,1660.88,616.528,1024,384,4.7,25.39,5.79
regnetv_040,1659.81,616.926,1024,288,6.6,20.3,20.64
convnext_tiny_hnf,1659.73,616.951,1024,288,7.39,22.21,28.59
seresnet101,1655.13,618.666,1024,224,7.84,16.27,49.33
vit_base_patch32_384,1651.29,620.109,1024,384,13.06,16.5,88.3
vit_base_patch32_clip_384,1649.72,620.7,1024,384,13.06,16.5,88.3
regnety_040,1647.66,621.47,1024,288,6.61,20.3,20.65
regnety_032,1645.25,622.383,1024,288,5.29,18.61,19.44
gluon_resnet101_v1s,1642.29,623.505,1024,224,9.19,18.64,44.67
vgg13_bn,1634.19,626.596,1024,224,11.33,12.25,133.05
resnetaa50,1631.05,627.803,1024,288,8.52,19.24,25.56
mixer_b16_224_miil,1628.71,628.706,1024,224,12.62,14.53,59.88
mixer_b16_224,1627.79,629.061,1024,224,12.62,14.53,59.88
convnext_tiny,1626.95,629.384,1024,288,7.39,22.21,28.59
nf_resnet101,1620.77,631.785,1024,224,8.01,16.23,44.55
swinv2_cr_tiny_224,1618.15,632.807,1024,224,4.66,28.45,28.33
ecaresnet101d,1609.33,636.276,1024,224,8.08,17.07,44.57
twins_pcpvt_base,1605.41,637.831,1024,224,6.68,25.25,43.83
dla102x,1601.78,639.274,1024,224,5.89,19.42,26.31
ese_vovnet39b_evos,1601.47,639.4,1024,224,7.07,6.74,24.58
darknet53,1597.03,641.177,1024,288,11.78,15.68,41.61
resnetblur101d,1596.24,641.494,1024,224,9.12,17.94,44.57
resnet51q,1592.08,643.172,1024,288,8.07,20.94,35.7
swinv2_cr_tiny_ns_224,1591.39,643.448,1024,224,4.66,28.45,28.33
mixer_l32_224,1583.03,646.85,1024,224,11.27,19.86,206.94
resmlp_36_distilled_224,1577.86,648.967,1024,224,8.91,16.33,44.69
resmlp_36_224,1577.4,649.158,1024,224,8.91,16.33,44.69
resnetv2_50d_gn,1561.87,655.61,1024,288,7.24,19.7,25.57
botnet50ts_256,1556.81,246.643,384,256,5.54,22.23,22.74
nf_resnet50,1548.83,661.132,1024,288,6.88,18.37,25.56
resnetv2_50d_frn,1547.35,661.764,1024,224,4.33,11.92,25.59
halo2botnet50ts_256,1546.64,496.545,768,256,5.02,21.78,22.64
mvitv2_tiny,1534.63,667.247,1024,224,4.7,21.16,24.17
gluon_resnext101_32x4d,1505.04,680.366,1024,224,8.01,21.23,44.18
swsl_resnext101_32x4d,1504.46,680.63,1024,224,8.01,21.23,44.18
cs3darknet_x,1504.38,680.665,1024,288,10.6,14.36,35.05
ssl_resnext101_32x4d,1503.93,680.869,1024,224,8.01,21.23,44.18
resnext101_32x4d,1503.63,681.005,1024,224,8.01,21.23,44.18
resnest50d_4s2x40d,1497.58,683.755,1024,224,4.4,17.94,30.42
convnextv2_nano,1488.75,515.858,768,288,4.06,13.84,15.62
skresnext50_32x4d,1478.83,692.427,1024,224,4.5,17.18,27.48
mobilevitv2_200,1478.44,519.454,768,256,7.22,32.15,18.45
tresnet_l,1477.44,693.076,1024,224,10.88,11.9,55.99
mobilevitv2_200_in22ft1k,1477.37,519.83,768,256,7.22,32.15,18.45
vgg16,1475.59,693.946,1024,224,15.47,13.56,138.36
regnetz_c16,1475.58,693.953,1024,320,3.92,25.88,13.46
resnetv2_50d_evob,1468.61,697.244,1024,224,4.33,11.92,25.59
vit_medium_patch16_gap_256,1467.03,697.996,1024,256,10.59,22.15,38.86
res2net50_26w_8s,1466.52,698.239,1024,224,8.37,17.95,48.4
sequencer2d_s,1465.84,698.562,1024,224,4.96,11.31,27.65
eca_nfnet_l0,1461.61,700.586,1024,288,7.12,17.29,24.14
nfnet_l0,1460.27,701.228,1024,288,7.13,17.29,35.07
cs3sedarknet_x,1435.72,713.217,1024,288,10.6,14.37,35.4
resnet61q,1434.01,714.068,1024,288,9.87,21.52,36.85
res2net101_26w_4s,1424.71,718.728,1024,224,8.1,18.45,45.21
repvgg_b2g4,1415.15,723.581,1024,224,12.63,12.9,61.76
nest_tiny,1413.2,543.434,768,224,5.83,25.48,17.06
poolformer_s36,1408.65,726.922,1024,224,5.0,15.82,30.86
maxvit_rmlp_nano_rw_256,1404.06,546.971,768,256,4.47,31.92,15.5
convit_small,1397.72,732.608,1024,224,5.76,17.87,27.78
jx_nest_tiny,1387.89,553.347,768,224,5.83,25.48,17.06
maxvit_nano_rw_256,1378.18,557.246,768,256,4.46,30.28,15.45
nf_ecaresnet101,1373.28,745.649,1024,224,8.01,16.27,44.55
nf_seresnet101,1369.04,747.958,1024,224,8.02,16.27,49.33
gluon_seresnext101_32x4d,1358.35,753.84,1024,224,8.02,21.26,48.96
legacy_seresnext101_32x4d,1357.27,754.442,1024,224,8.02,21.26,48.96
efficientnet_b3_gn,1357.0,282.964,384,320,2.14,28.83,11.73
nfnet_f0,1356.65,754.786,1024,256,12.62,18.05,71.49
seresnext101_32x4d,1356.0,755.148,1024,224,8.02,21.26,48.96
resnetv2_152,1353.28,756.668,1024,224,11.55,22.56,60.19
xception,1353.17,567.542,768,299,8.4,35.83,22.86
twins_svt_base,1350.54,758.199,1024,224,8.59,26.33,56.07
crossvit_18_240,1343.82,761.996,1024,240,9.05,26.26,43.27
ese_vovnet99b_iabn,1343.72,762.049,1024,224,16.49,11.27,63.2
maxxvit_rmlp_nano_rw_256,1341.45,763.341,1024,256,4.37,26.05,16.78
regnetx_120,1339.05,764.708,1024,224,12.13,21.37,46.11
vgg16_bn,1336.79,765.998,1024,224,15.5,13.56,138.37
dpn92,1330.6,769.562,1024,224,6.54,18.21,37.67
tv_resnet152,1329.75,770.054,1024,224,11.56,22.56,60.19
gcvit_tiny,1328.61,770.718,1024,224,4.79,29.82,28.22
gluon_resnet152_v1b,1328.2,770.954,1024,224,11.56,22.56,60.19
resnet152,1327.13,771.578,1024,224,11.56,22.56,60.19
ese_vovnet99b,1316.93,777.554,1024,224,16.51,11.27,63.2
pvt_v2_b3,1316.31,777.917,1024,224,6.92,37.7,45.24
xcit_tiny_12_p8_224_dist,1300.55,787.348,1024,224,4.81,23.6,6.71
xcit_tiny_12_p8_224,1299.96,787.704,1024,224,4.81,23.6,6.71
crossvit_18_dagger_240,1298.96,788.312,1024,240,9.5,27.03,44.27
hrnet_w32,1297.82,789.002,1024,224,8.97,22.02,41.23
gluon_resnet152_v1c,1296.47,789.825,1024,224,11.8,23.36,60.21
resnetv2_152d,1296.37,789.881,1024,224,11.8,23.36,60.2
gluon_resnet152_v1d,1293.21,791.811,1024,224,11.8,23.36,60.21
vit_small_resnet50d_s16_224,1288.35,794.801,1024,224,13.48,24.82,57.53
cs3edgenet_x,1281.15,799.266,1024,288,14.59,16.36,47.82
edgenext_base,1272.74,804.548,1024,320,6.01,24.32,18.51
regnety_120,1268.38,807.318,1024,224,12.14,21.38,51.82
dla169,1258.34,813.753,1024,224,11.6,20.2,53.39
hrnet_w30,1252.2,817.74,1024,224,8.15,21.21,37.71
xception41p,1249.06,409.896,512,299,9.25,39.86,26.91
maxxvitv2_nano_rw_256,1248.81,819.967,1024,256,6.26,23.05,23.7
ecaresnet50t,1243.91,823.198,1024,320,8.82,24.13,25.57
vgg19,1237.03,827.774,1024,224,19.63,14.86,143.67
swin_small_patch4_window7_224,1228.67,833.406,1024,224,8.77,27.47,49.61
efficientnet_el_pruned,1220.93,838.69,1024,300,8.0,30.7,10.59
densenet161,1220.41,839.05,1024,224,7.79,11.06,28.68
efficientnet_el,1218.76,840.187,1024,300,8.0,30.7,10.59
deit_base_distilled_patch16_224,1211.4,845.292,1024,224,17.68,24.05,87.34
vit_base_patch16_224,1209.0,846.969,1024,224,17.58,23.9,86.57
vit_base_patch16_224_miil,1208.72,847.163,1024,224,17.59,23.91,94.4
deit_base_patch16_224,1208.56,847.275,1024,224,17.58,23.9,86.57
vit_base_patch16_clip_224,1205.77,849.236,1024,224,17.58,23.9,86.57
gluon_resnet152_v1s,1205.41,849.488,1024,224,12.92,24.96,60.32
coatnet_rmlp_1_rw_224,1201.89,851.979,1024,224,7.85,35.47,41.69
maxvit_tiny_rw_224,1200.3,853.107,1024,224,5.11,33.11,29.06
mixnet_xxl,1193.04,643.721,768,224,2.04,23.43,23.96
tf_efficientnet_el,1192.11,858.967,1024,300,8.0,30.7,10.59
swinv2_tiny_window8_256,1191.01,859.761,1024,256,5.96,24.57,28.35
volo_d1_224,1190.57,860.079,1024,224,6.94,24.43,26.63
repvgg_b2,1183.91,864.916,1024,224,20.45,12.9,89.02
legacy_seresnet152,1181.09,866.978,1024,224,11.33,22.08,66.82
xcit_small_24_p16_224_dist,1175.31,871.245,1024,224,9.1,23.64,47.67
xcit_small_24_p16_224,1174.76,871.656,1024,224,9.1,23.64,47.67
inception_v4,1168.76,876.127,1024,299,12.28,15.09,42.68
seresnet152,1166.02,878.19,1024,224,11.57,22.61,66.82
twins_pcpvt_large,1163.18,880.331,1024,224,9.84,35.82,60.99
deit3_base_patch16_224,1159.4,883.201,1024,224,17.58,23.9,86.59
deit3_base_patch16_224_in21ft1k,1159.14,883.404,1024,224,17.58,23.9,86.59
cait_xxs36_224,1156.4,885.493,1024,224,3.77,30.34,17.3
vit_base_patch32_clip_448,1154.9,886.645,1024,448,17.93,23.9,88.34
regnetx_160,1153.07,888.048,1024,224,15.99,25.52,54.28
dm_nfnet_f0,1152.75,888.293,1024,256,12.62,18.05,71.49
sequencer2d_m,1147.71,892.201,1024,224,6.55,14.26,38.31
repvgg_b3g4,1145.87,893.631,1024,224,17.89,15.1,83.83
mvitv2_small_cls,1144.7,894.542,1024,224,7.04,28.17,34.87
mvitv2_small,1143.83,895.224,1024,224,7.0,28.08,34.87
efficientnet_lite4,1139.64,336.935,384,380,4.04,45.66,13.01
tnt_s_patch16_224,1135.12,902.091,1024,224,5.24,24.37,23.76
convmixer_1024_20_ks9_p14,1130.85,905.497,1024,224,5.55,5.51,24.38
vgg19_bn,1127.16,908.464,1024,224,19.66,14.86,143.68
vit_relpos_base_patch16_clsgap_224,1124.58,910.547,1024,224,17.6,25.12,86.43
vit_relpos_base_patch16_cls_224,1122.76,912.026,1024,224,17.6,25.12,86.43
coatnet_rmlp_1_rw2_224,1119.61,914.591,1024,224,8.11,40.13,41.72
beit_base_patch16_224,1109.32,923.073,1024,224,17.58,23.9,86.53
xception41,1107.6,462.251,512,299,9.28,39.86,26.97
tresnet_xl,1106.51,925.423,1024,224,15.17,15.34,78.44
beitv2_base_patch16_224,1106.05,925.798,1024,224,17.58,23.9,86.53
coat_tiny,1099.16,931.604,1024,224,4.35,27.2,5.5
vit_base_patch16_gap_224,1085.51,943.323,1024,224,17.49,25.59,86.57
maxvit_tiny_tf_224,1081.57,710.062,768,224,5.6,35.78,30.92
vit_relpos_base_patch16_224,1078.21,949.713,1024,224,17.51,24.97,86.43
nf_regnet_b4,1075.82,951.823,1024,384,4.7,28.61,30.21
coatnet_1_rw_224,1074.48,953.005,1024,224,8.04,34.6,41.72
dla102x2,1070.83,956.252,1024,224,9.34,29.91,41.28
pit_b_224,1066.8,479.928,512,224,12.42,32.94,73.76
pit_b_distilled_224,1063.31,481.504,512,224,12.5,33.07,74.79
tf_efficientnet_lite4,1058.68,362.703,384,380,4.04,45.66,13.01
efficientnetv2_s,1057.28,968.508,1024,384,8.44,35.77,21.46
vit_large_r50_s32_224,1034.79,989.556,1024,224,19.58,24.41,328.99
vit_small_patch16_36x1_224,1032.1,992.142,1024,224,13.71,35.69,64.67
efficientnet_b3_g8_gn,1031.26,496.465,512,320,3.2,28.83,14.25
tf_efficientnetv2_s,1029.13,995.002,1024,384,8.44,35.77,21.46
flexivit_base,1028.55,995.558,1024,240,20.29,28.36,86.59
vit_base_patch16_rpn_224,1016.66,1007.208,1024,224,17.49,23.75,86.54
vit_small_r26_s32_384,1011.11,1012.73,1024,384,10.43,29.85,36.47
vit_small_patch16_18x2_224,1005.34,1018.547,1024,224,13.71,35.69,64.67
swinv2_cr_small_224,1000.71,1023.259,1024,224,9.07,50.27,49.7
efficientnetv2_rw_s,995.91,1028.19,1024,384,8.72,38.03,23.94
wide_resnet101_2,995.32,1028.801,1024,224,22.8,21.23,126.89
swinv2_cr_small_ns_224,989.25,1035.114,1024,224,9.08,50.27,49.7
vit_relpos_base_patch16_rpn_224,986.84,1037.641,1024,224,17.51,24.97,86.41
coatnet_1_224,984.69,519.944,512,224,8.7,39.0,42.23
resnet200,983.36,1041.314,1024,224,15.07,32.19,64.67
dpn98,982.09,1042.657,1024,224,11.73,25.2,61.57
convnext_small,981.97,1042.782,1024,288,14.39,35.65,50.22
cs3se_edgenet_x,975.89,1049.279,1024,320,18.01,20.21,50.72
regnety_080,969.67,1056.01,1024,288,13.22,29.69,39.18
poolformer_m36,966.97,1058.965,1024,224,8.8,22.02,56.17
resnest101e,963.69,1062.57,1024,256,13.38,28.66,48.28
regnetz_b16_evos,955.65,803.632,768,288,2.36,16.43,9.74
twins_svt_large,954.95,1072.291,1024,224,15.15,35.1,99.27
pvt_v2_b4,952.02,1075.594,1024,224,10.14,53.74,62.56
gluon_resnext101_64x4d,944.48,1084.183,1024,224,15.52,31.21,83.46
regnetv_064,944.32,1084.367,1024,288,10.55,27.11,30.58
regnety_064,944.18,1084.526,1024,288,10.56,27.11,30.58
maxvit_rmlp_tiny_rw_256,941.64,815.588,768,256,6.77,46.92,29.15
regnetz_d8,936.16,1093.814,1024,320,6.19,37.08,23.37
resnetrs101,936.12,1093.858,1024,288,13.56,28.53,63.62
regnetz_d32,933.58,1096.833,1024,320,9.33,37.08,27.58
ig_resnext101_32x8d,930.9,1099.997,1024,224,16.48,31.21,88.79
swsl_resnext101_32x8d,930.28,1100.725,1024,224,16.48,31.21,88.79
resnext101_32x8d,929.98,1101.084,1024,224,16.48,31.21,88.79
ssl_resnext101_32x8d,929.0,1102.24,1024,224,16.48,31.21,88.79
convnextv2_tiny,925.13,553.423,512,288,7.39,22.21,28.64
convnextv2_small,924.53,1107.57,1024,224,8.71,21.56,50.32
maxvit_tiny_rw_256,921.72,833.209,768,256,6.74,44.35,29.07
inception_resnet_v2,917.69,1115.834,1024,299,13.18,25.06,55.84
ens_adv_inception_resnet_v2,917.66,1115.871,1024,299,13.18,25.06,55.84
maxxvit_rmlp_tiny_rw_256,914.74,1119.428,1024,256,6.66,39.76,29.64
xcit_tiny_24_p16_384_dist,912.61,1122.045,1024,384,6.87,34.29,12.12
cait_s24_224,908.65,1126.929,1024,224,9.35,40.58,46.92
pvt_v2_b5,904.89,1131.615,1024,224,11.76,50.92,81.96
nest_small,902.63,850.834,768,224,10.35,40.04,38.35
repvgg_b3,901.73,1135.583,1024,224,29.16,15.1,123.09
maxvit_tiny_pm_256,896.67,1141.994,1024,256,6.61,47.9,30.09
xception65p,896.53,571.079,512,299,13.91,52.48,39.82
swin_s3_small_224,896.35,856.792,768,224,9.43,37.84,49.74
jx_nest_small,892.32,860.663,768,224,10.35,40.04,38.35
efficientnet_b4,890.89,431.018,384,384,4.51,50.04,19.34
gmlp_b16_224,885.75,1156.072,1024,224,15.78,30.21,73.08
gluon_seresnext101_64x4d,885.23,1156.747,1024,224,15.53,31.25,88.23
hrnet_w40,881.9,1161.12,1024,224,12.75,25.29,57.56
efficientformer_l7,877.43,1167.027,1024,224,10.17,24.45,82.23
coat_mini,874.29,1171.227,1024,224,6.82,33.68,10.34
resnet101d,871.81,1174.559,1024,320,16.48,34.77,44.57
swin_base_patch4_window7_224,870.1,1176.867,1024,224,15.47,36.63,87.77
regnetz_040,868.17,884.605,768,320,6.35,37.78,27.12
regnetz_040h,862.76,890.151,768,320,6.43,37.94,28.94
mobilevitv2_150_384_in22ft1k,848.7,301.627,256,384,9.2,54.25,10.59
resnetv2_50d_evos,844.34,909.573,768,288,7.15,19.7,25.59
tf_efficientnet_b4,838.16,458.136,384,380,4.49,49.49,19.34
crossvit_base_240,835.31,919.411,768,240,21.22,36.33,105.03
vit_base_r50_s16_224,821.15,1247.01,1024,224,21.67,35.31,114.69
xcit_medium_24_p16_224_dist,819.59,1249.397,1024,224,16.13,31.71,84.4
xcit_medium_24_p16_224,818.73,1250.697,1024,224,16.13,31.71,84.4
gcvit_small,807.46,1268.151,1024,224,8.57,41.61,51.09
gluon_xception65,806.21,635.055,512,299,13.96,52.48,39.92
xception65,800.01,639.983,512,299,13.96,52.48,39.92
mvitv2_base,799.31,1281.092,1024,224,10.16,40.5,51.47
hrnet_w44,789.29,1297.348,1024,224,14.94,26.92,67.06
vit_base_patch16_plus_240,780.68,1311.665,1024,240,27.41,33.08,117.56
hrnet_w48,780.39,1312.147,1024,224,17.34,28.56,77.47
swinv2_tiny_window16_256,778.19,657.926,512,256,6.68,39.02,28.35
tresnet_m_448,775.99,1319.596,1024,448,22.94,29.21,31.39
xcit_small_12_p16_384_dist,760.88,1345.804,1024,384,14.14,36.51,26.25
vit_small_patch16_384,750.95,1022.685,768,384,15.52,50.78,22.2
maxvit_rmlp_small_rw_224,745.49,1373.585,1024,224,10.75,49.3,64.9
sequencer2d_l,742.48,1379.149,1024,224,9.74,22.12,54.3
swinv2_small_window8_256,738.39,1386.788,1024,256,11.58,40.14,49.73
swin_s3_base_224,730.45,1401.854,1024,224,13.69,48.26,71.13
poolformer_m48,729.44,1403.808,1024,224,11.59,29.17,73.47
densenet264d_iabn,727.43,1407.671,1024,224,13.47,14.0,72.74
vit_relpos_base_patch16_plus_240,723.43,1415.468,1024,240,27.3,34.33,117.38
dpn131,722.72,1416.854,1024,224,16.09,32.97,79.25
tnt_b_patch16_224,722.12,1418.026,1024,224,14.09,39.01,65.41
deit3_small_patch16_384,717.36,1070.572,768,384,15.52,50.78,22.21
deit3_small_patch16_384_in21ft1k,716.76,1071.477,768,384,15.52,50.78,22.21
swinv2_cr_base_224,715.64,1430.874,1024,224,15.86,59.66,87.88
eca_nfnet_l1,713.15,1435.867,1024,320,14.92,34.42,41.41
coatnet_2_rw_224,709.88,721.237,512,224,15.09,49.22,73.87
swinv2_cr_base_ns_224,709.69,1442.871,1024,224,15.86,59.66,87.88
coatnet_rmlp_2_rw_224,708.85,722.285,512,224,15.18,54.78,73.88
convit_base,706.65,1449.076,1024,224,17.52,31.77,86.54
mobilevitv2_175_384_in22ft1k,703.41,363.928,256,384,12.47,63.29,14.25
maxvit_small_tf_224,701.58,729.767,512,224,11.66,53.17,68.93
densenet264,701.03,1460.686,1024,224,12.95,12.8,72.69
ecaresnet200d,694.19,1475.094,1024,256,20.0,43.15,64.69
resnetv2_50x1_bitm,691.29,740.624,512,448,16.62,44.46,25.55
seresnet200d,691.25,1481.355,1024,256,20.01,43.15,71.86
xcit_tiny_24_p8_224,684.73,1495.467,1024,224,9.21,45.39,12.11
xcit_tiny_24_p8_224_dist,684.22,1496.573,1024,224,9.21,45.39,12.11
convnext_base,682.42,1500.518,1024,288,25.43,47.53,88.59
volo_d2_224,663.51,1543.3,1024,224,14.34,41.34,58.68
coatnet_2_224,660.84,581.062,384,224,16.5,52.67,74.68
legacy_senet154,654.15,1565.387,1024,224,20.77,38.69,115.09
gluon_senet154,654.04,1565.641,1024,224,20.77,38.69,115.09
senet154,653.94,1565.866,1024,224,20.77,38.69,115.09
xcit_nano_12_p8_384_dist,646.53,1583.823,1024,384,6.34,46.08,3.05
dpn107,646.38,1584.202,1024,224,18.38,33.46,86.92
nest_base,640.55,799.298,512,224,17.96,53.39,67.72
jx_nest_base,633.53,808.151,512,224,17.96,53.39,67.72
mobilevitv2_200_384_in22ft1k,626.31,408.731,256,384,16.24,72.34,18.45
xception71,619.72,826.163,512,299,18.09,69.92,42.34
hrnet_w64,618.15,1656.539,1024,224,28.97,35.09,128.06
resnet152d,618.09,1656.699,1024,320,24.08,47.67,60.21
regnetz_c16_evos,604.19,847.399,512,320,3.86,25.88,13.49
gcvit_base,594.61,1722.135,1024,224,14.87,55.48,90.32
regnety_160,594.3,1292.258,768,288,26.37,38.07,83.59
maxxvit_rmlp_small_rw_256,588.15,1741.023,1024,256,14.67,58.38,66.01
xcit_small_12_p8_224,582.04,1759.324,1024,224,18.69,47.21,26.21
xcit_small_12_p8_224_dist,581.74,1760.224,1024,224,18.69,47.21,26.21
maxvit_rmlp_small_rw_256,575.72,1333.976,768,256,14.15,66.09,64.9
regnetx_320,551.07,1393.631,768,224,31.81,36.3,107.81
seresnet152d,547.51,1870.27,1024,320,24.09,47.72,66.84
resnetrs152,544.33,1881.196,1024,320,24.34,48.14,86.62
vit_large_patch32_384,543.23,1884.997,1024,384,45.31,43.86,306.63
halonet_h1,540.47,473.65,256,256,3.0,51.17,8.1
seresnet269d,540.42,1894.818,1024,256,26.59,53.6,113.67
swinv2_base_window8_256,529.22,1451.182,768,256,20.37,52.59,87.92
maxxvitv2_rmlp_base_rw_224,523.43,1956.308,1024,224,24.2,62.77,116.09
resnext101_64x4d,521.77,1962.525,1024,288,25.66,51.59,83.46
regnetz_e8,521.5,1472.647,768,320,15.46,63.94,57.7
mixer_l16_224,518.26,1975.807,1024,224,44.6,41.69,208.2
vit_medium_patch16_gap_384,508.63,1006.611,512,384,26.08,67.54,39.03
swin_large_patch4_window7_224,501.11,1532.586,768,224,34.53,54.94,196.53
regnety_320,490.98,2085.591,1024,224,32.34,30.26,145.05
swinv2_small_window16_256,487.64,1049.932,512,256,12.82,66.29,49.73
seresnext101_32x8d,483.23,2119.074,1024,288,27.24,51.63,93.57
vit_small_patch8_224,478.05,1071.009,512,224,22.44,80.84,21.67
ig_resnext101_32x16d,477.64,2143.862,1024,224,36.27,51.18,194.03
swsl_resnext101_32x16d,476.69,2148.145,1024,224,36.27,51.18,194.03
ssl_resnext101_32x16d,476.06,2150.954,1024,224,36.27,51.18,194.03
seresnext101d_32x8d,475.05,2155.547,1024,288,27.64,52.95,93.59
nf_regnet_b5,470.14,1089.029,512,456,11.7,61.95,49.74
xcit_large_24_p16_224_dist,468.86,2184.017,1024,224,35.86,47.27,189.1
xcit_large_24_p16_224,468.75,2184.529,1024,224,35.86,47.27,189.1
volo_d3_224,463.72,2208.199,1024,224,20.78,60.09,86.33
nfnet_f1,463.52,2209.163,1024,320,35.97,46.77,132.63
efficientnet_b5,460.91,555.412,256,448,9.59,93.56,30.39
resnet200d,453.15,2259.739,1024,320,31.25,67.33,64.69
efficientnetv2_m,451.89,2266.018,1024,416,18.6,67.5,54.14
seresnextaa101d_32x8d,447.26,2289.498,1024,288,28.51,56.44,93.59
efficientnetv2_rw_m,437.1,1757.005,768,416,21.49,79.62,53.24
swinv2_cr_large_224,422.08,1819.551,768,224,35.1,78.42,196.68
coatnet_rmlp_3_rw_224,421.87,910.226,384,224,33.56,79.47,165.15
xcit_tiny_12_p8_384_dist,421.04,2432.044,1024,384,14.13,69.14,6.71
swinv2_cr_tiny_384,419.77,609.847,256,384,15.34,161.01,28.33
maxvit_rmlp_base_rw_224,419.03,1832.808,768,224,23.15,92.64,116.14
resnetv2_152x2_bit_teacher,418.89,2444.553,1024,224,46.95,45.11,236.34
resnetv2_101x1_bitm,418.36,1223.813,512,448,31.65,64.93,44.54
dm_nfnet_f1,409.02,1877.643,768,320,35.97,46.77,132.63
xcit_small_24_p16_384_dist,407.47,2513.062,1024,384,26.72,68.58,47.67
coatnet_3_rw_224,404.39,633.033,256,224,33.44,73.83,181.81
tf_efficientnet_b5,403.59,634.298,256,456,10.46,98.86,30.39
convnextv2_base,402.92,1270.715,512,288,25.43,47.53,88.72
resnetrs200,396.11,2585.123,1024,320,31.51,67.81,93.21
tresnet_l_448,395.6,2588.481,1024,448,43.5,47.56,55.99
eva_large_patch14_196,391.22,2617.408,1024,196,61.57,63.52,304.14
vit_large_patch16_224,389.92,2626.132,1024,224,61.6,63.52,304.33
regnetz_d8_evos,389.86,1969.937,768,320,7.03,38.92,23.46
maxvit_base_tf_224,387.71,1320.545,512,224,24.04,95.01,119.47
coatnet_3_224,387.35,660.882,256,224,36.56,79.01,166.97
crossvit_15_dagger_408,386.57,662.227,256,408,21.45,95.05,28.5
vit_base_patch16_18x2_224,384.3,2664.545,1024,224,52.51,71.38,256.73
deit3_large_patch16_224,376.93,2716.643,1024,224,61.6,63.52,304.37
deit3_large_patch16_224_in21ft1k,376.54,2719.504,1024,224,61.6,63.52,304.37
tf_efficientnetv2_m,374.38,2051.373,768,480,24.76,89.84,54.14
convnext_large,371.39,1378.579,512,288,56.87,71.29,197.77
beitv2_large_patch16_224,360.12,2843.465,1024,224,61.6,63.52,304.43
beit_large_patch16_224,359.86,2845.558,1024,224,61.6,63.52,304.43
swinv2_base_window12to16_192to256_22kft1k,359.31,1068.705,384,256,22.02,84.71,87.92
swinv2_base_window16_256,359.09,1069.342,384,256,22.02,84.71,87.92
eca_nfnet_l2,347.1,2212.621,768,384,30.05,68.28,56.72
flexivit_large,333.31,3072.173,1024,240,70.99,75.39,304.36
vit_large_r50_s32_384,332.86,3076.333,1024,384,57.43,76.52,329.09
maxxvitv2_rmlp_large_rw_224,330.79,3095.576,1024,224,44.14,87.15,215.42
resnest200e,317.25,3227.754,1024,320,35.69,82.78,70.2
maxvit_tiny_tf_384,317.22,807.002,256,384,17.53,123.42,30.98
convmixer_768_32,309.28,3310.892,1024,224,19.55,25.95,21.11
deit_base_patch16_384,306.13,1254.335,384,384,55.54,101.56,86.86
vit_base_patch16_384,306.13,1254.349,384,384,55.54,101.56,86.86
vit_base_patch16_clip_384,305.56,1256.673,384,384,55.54,101.56,86.86
xcit_small_24_p8_224_dist,305.18,3355.41,1024,224,35.81,90.78,47.63
deit_base_distilled_patch16_384,304.96,1259.16,384,384,55.65,101.82,87.63
xcit_small_24_p8_224,304.86,3358.887,1024,224,35.81,90.78,47.63
nasnetalarge,300.31,1278.679,384,331,23.89,90.56,88.75
volo_d1_384,299.05,1712.072,512,384,22.75,108.55,26.78
volo_d4_224,295.86,3461.069,1024,224,44.34,80.22,192.96
deit3_base_patch16_384,294.03,1305.985,384,384,55.54,101.56,86.88
deit3_base_patch16_384_in21ft1k,293.78,1307.085,384,384,55.54,101.56,86.88
tresnet_xl_448,292.43,2626.294,768,448,60.65,61.31,78.44
pnasnet5large,285.95,1342.894,384,331,25.04,92.89,86.06
vit_large_patch14_224,285.66,3584.705,1024,224,81.08,88.79,304.2
vit_large_patch14_clip_224,285.43,3587.599,1024,224,81.08,88.79,304.2
crossvit_18_dagger_408,283.82,901.967,256,408,32.47,124.87,44.61
xcit_medium_24_p16_384_dist,282.22,3628.317,1024,384,47.39,91.64,84.4
cait_xxs24_384,275.38,3718.492,1024,384,9.63,122.66,12.03
regnety_640,271.79,2825.663,768,224,64.16,42.5,281.38
maxvit_large_tf_224,268.97,1427.67,384,224,43.68,127.35,211.79
nfnet_f2,263.0,3893.59,1024,352,63.22,79.06,193.78
beit_base_patch16_384,260.66,1473.146,384,384,55.54,101.56,86.74
swinv2_cr_small_384,258.79,989.214,256,384,29.7,298.03,49.7
ecaresnet269d,257.79,3972.16,1024,352,50.25,101.25,102.09
resnetrs270,249.11,4110.633,1024,352,51.13,105.48,129.86
mvitv2_large,248.64,2059.181,512,224,43.87,112.02,217.99
efficientnet_b6,246.42,519.432,128,528,19.4,167.39,43.04
convnext_xlarge,241.35,2121.412,512,288,100.8,95.05,350.2
convnextv2_large,238.64,1072.708,256,288,56.87,71.29,197.96
tf_efficientnet_b6,236.4,541.434,128,528,19.4,167.39,43.04
swin_base_patch4_window12_384,235.04,816.885,192,384,47.19,134.78,87.9
dm_nfnet_f2,234.34,3277.279,768,352,63.22,79.06,193.78
coatnet_4_224,228.52,1120.23,256,224,62.48,129.26,275.43
vit_base_r50_s16_384,227.31,1689.303,384,384,67.43,135.03,98.95
efficientnetv2_l,221.97,2306.653,512,480,56.4,157.99,118.52
xcit_tiny_24_p8_384_dist,221.23,4628.611,1024,384,27.05,132.95,12.11
ig_resnext101_32x32d,220.61,2320.857,512,224,87.29,91.12,468.53
swinv2_large_window12to16_192to256_22kft1k,219.46,1166.485,256,256,47.81,121.53,196.74
tf_efficientnetv2_l,219.35,2334.183,512,480,56.4,157.99,118.52
resmlp_big_24_224,214.31,4778.166,1024,224,100.23,87.31,129.14
resmlp_big_24_224_in22ft1k,214.13,4782.043,1024,224,100.23,87.31,129.14
resmlp_big_24_distilled_224,214.04,4784.169,1024,224,100.23,87.31,129.14
xcit_medium_24_p8_224_dist,210.1,4873.763,1024,224,63.53,121.23,84.32
xcit_medium_24_p8_224,210.01,4875.864,1024,224,63.53,121.23,84.32
maxvit_small_tf_384,208.79,919.556,192,384,35.87,183.65,69.02
vit_base_patch8_224,199.59,1282.637,256,224,78.22,161.69,86.58
eca_nfnet_l3,199.58,2565.434,512,448,52.55,118.4,72.04
volo_d5_224,196.25,5217.924,1024,224,72.4,118.11,295.46
xcit_small_12_p8_384_dist,194.27,2635.521,512,384,54.92,138.29,26.21
cait_xs24_384,192.73,3984.863,768,384,19.28,183.98,26.67
swinv2_cr_base_384,184.92,1384.392,256,384,50.57,333.68,87.88
cait_xxs36_384,184.35,5554.56,1024,384,14.35,183.7,17.37
swinv2_cr_huge_224,183.61,2091.395,384,224,115.97,121.08,657.83
convnext_xxlarge,183.01,2098.268,384,224,151.66,95.29,846.47
coatnet_rmlp_2_rw_384,178.88,715.532,128,384,47.69,209.43,73.88
convmixer_1536_20,173.51,5901.752,1024,224,48.68,33.03,51.63
volo_d2_384,168.46,1519.603,256,384,46.17,184.51,58.87
resnetrs350,168.28,6085.136,1024,384,77.59,154.74,163.96
xcit_large_24_p16_384_dist,160.71,4778.847,768,384,105.35,137.17,189.1
resnetv2_152x2_bit_teacher_384,159.55,1604.488,256,384,136.16,132.56,236.34
maxvit_xlarge_tf_224,155.79,1643.178,256,224,97.49,191.02,474.95
maxvit_tiny_tf_512,155.64,822.373,128,512,33.49,257.59,31.05
regnety_1280,155.18,2474.502,384,224,127.66,71.58,644.81
vit_huge_patch14_224,154.03,6647.897,1024,224,167.43,139.43,658.75
vit_huge_patch14_clip_224,153.92,6652.944,1024,224,167.4,139.41,632.05
maxxvitv2_rmlp_base_rw_384,153.34,1669.502,256,384,72.98,213.74,116.09
efficientnetv2_xl,152.49,3357.61,512,512,93.85,247.32,208.12
tf_efficientnetv2_xl,151.4,2536.254,384,512,93.85,247.32,208.12
deit3_huge_patch14_224_in21ft1k,149.08,6868.834,1024,224,167.4,139.41,632.13
deit3_huge_patch14_224,149.01,6871.974,1024,224,167.4,139.41,632.13
cait_s24_384,148.46,3448.684,512,384,32.17,245.31,47.06
resnest269e,147.61,3468.584,512,416,77.69,171.98,110.93
nfnet_f3,147.43,3472.717,512,416,115.58,141.78,254.92
efficientnet_b7,142.41,674.084,96,600,38.33,289.94,66.35
resnetv2_50x3_bitm,138.27,1388.564,192,448,145.7,133.37,217.32
tf_efficientnet_b7,137.89,696.181,96,600,38.33,289.94,66.35
swin_large_patch4_window12_384,137.6,930.229,128,384,104.08,202.16,196.74
ig_resnext101_32x48d,132.29,2902.628,384,224,153.57,131.06,828.41
dm_nfnet_f3,127.59,4012.898,512,416,115.58,141.78,254.92
coatnet_5_224,125.18,1022.512,128,224,145.49,194.24,687.47
maxvit_rmlp_base_rw_384,121.26,2111.079,256,384,70.97,318.95,116.14
xcit_large_24_p8_224,119.97,6401.598,768,224,141.23,181.56,188.93
xcit_large_24_p8_224_dist,119.94,6403.17,768,224,141.23,181.56,188.93
resnetrs420,119.93,6403.598,768,416,108.45,213.79,191.89
resnetv2_152x2_bitm,117.33,2181.801,256,448,184.99,180.43,236.34
maxvit_base_tf_384,113.69,1688.826,192,384,73.8,332.9,119.65
swinv2_cr_large_384,113.07,1132.03,128,384,108.95,404.96,196.68
eva_large_patch14_336,102.65,2493.904,256,336,191.1,270.24,304.53
vit_large_patch14_clip_336,102.47,2498.286,256,336,191.11,270.24,304.53
vit_large_patch16_384,102.37,2500.639,256,384,191.21,270.24,304.72
xcit_small_24_p8_384_dist,102.36,5001.728,512,384,105.24,265.91,47.63
eva_giant_patch14_224,101.75,10063.521,1024,224,267.18,192.64,1012.56
vit_giant_patch14_224,100.42,7648.057,768,224,267.18,192.64,1012.61
vit_giant_patch14_clip_224,100.32,7655.265,768,224,267.18,192.64,1012.65
cait_s36_384,99.37,5152.338,512,384,47.99,367.4,68.37
deit3_large_patch16_384,99.34,2577.037,256,384,191.21,270.24,304.76
deit3_large_patch16_384_in21ft1k,99.27,2578.907,256,384,191.21,270.24,304.76
regnety_2560,97.99,2612.623,256,224,257.07,87.48,826.14
maxvit_small_tf_512,97.85,981.11,96,512,67.26,383.77,69.13
swinv2_base_window12to24_192to384_22kft1k,95.95,666.98,64,384,55.25,280.36,87.92
efficientnet_b8,95.3,1007.298,96,672,63.48,442.89,87.41
tf_efficientnet_b8,92.65,1036.1,96,672,63.48,442.89,87.41
beit_large_patch16_384,88.55,2890.891,256,384,191.21,270.24,305.0
resnetv2_101x3_bitm,83.1,2310.491,192,448,280.33,194.78,387.93
maxvit_large_tf_384,80.34,1593.284,128,384,132.55,445.84,212.03
nfnet_f4,79.54,4827.723,384,512,216.26,262.26,316.07
volo_d3_448,73.5,2612.274,192,448,96.33,446.83,86.63
dm_nfnet_f4,71.41,3584.699,256,512,216.26,262.26,316.07
xcit_medium_24_p8_384_dist,70.91,5415.294,384,384,186.67,354.73,84.32
swinv2_large_window12to24_192to384_22kft1k,60.84,788.97,48,384,116.15,407.83,196.74
vit_gigantic_patch14_clip_224,60.15,8511.823,512,224,483.96,275.37,1844.91
vit_gigantic_patch14_224,60.11,8517.291,512,224,483.95,275.37,1844.44
nfnet_f5,58.02,4412.387,256,544,290.97,349.71,377.21
vit_huge_patch14_clip_336,57.29,4468.831,256,336,390.97,407.54,632.46
convnextv2_huge,56.06,1712.576,96,384,337.96,232.35,660.29
volo_d4_448,54.47,2349.801,128,448,197.13,527.35,193.41
tf_efficientnet_l2,54.12,1182.593,64,475,172.11,609.89,480.31
maxvit_base_tf_512,52.65,1823.292,96,512,138.02,703.99,119.88
swinv2_cr_giant_224,52.12,2455.882,128,224,483.85,309.15,2598.76
dm_nfnet_f5,50.7,5049.339,256,544,290.97,349.71,377.21
swinv2_cr_huge_384,48.86,1309.971,64,384,352.04,583.18,657.94
maxvit_xlarge_tf_384,46.24,2076.289,96,384,292.78,668.76,475.32
nfnet_f6,44.3,5778.548,256,576,378.69,452.2,438.36
xcit_large_24_p8_384_dist,40.2,6368.127,256,384,415.0,531.82,188.93
eva_giant_patch14_336,39.77,6436.237,256,336,620.64,550.67,1013.01
dm_nfnet_f6,39.62,6461.626,256,576,378.69,452.2,438.36
maxvit_large_tf_512,38.67,1654.908,64,512,244.75,942.15,212.33
volo_d5_448,37.56,3408.043,128,448,315.06,737.92,295.91
beit_large_patch16_512,35.36,2715.28,96,512,362.24,656.39,305.67
nfnet_f7,34.74,7370.0,256,608,480.39,570.85,499.5
cait_m36_384,32.36,7912.123,256,384,173.11,734.81,271.22
resnetv2_152x4_bitm,30.0,4266.89,128,480,844.84,414.26,936.53
volo_d5_512,26.35,4857.602,128,512,425.09,1105.37,296.09
maxvit_xlarge_tf_512,23.12,2076.455,48,512,534.14,1413.22,475.77
efficientnet_l2,21.26,1505.032,32,800,479.12,1707.39,480.31
swinv2_cr_giant_384,15.03,2129.6,32,384,1450.71,1394.86,2598.76
cait_m48_448,13.69,9353.048,128,448,329.41,1708.23,356.46
eva_giant_patch14_560,10.36,4631.037,48,560,1906.76,2577.17,1014.45
hf_public_repos/pytorch-image-models/results/benchmark-infer-amp-nhwc-pt112-cu113-rtx3090.csv
model,infer_samples_per_sec,infer_step_time,infer_batch_size,infer_img_size,infer_gmacs,infer_macts,param_count
tinynet_e,70939.06,14.424,1024,106,0.03,0.69,2.04
mobilenetv3_small_050,53363.87,19.179,1024,224,0.03,0.92,1.59
lcnet_035,39908.29,25.648,1024,224,0.03,1.04,1.64
mobilenetv3_small_075,38048.72,26.902,1024,224,0.05,1.3,2.04
tinynet_d,35634.7,28.724,1024,152,0.05,1.42,2.34
lcnet_050,35231.0,29.055,1024,224,0.05,1.26,1.88
mobilenetv3_small_100,34913.55,29.319,1024,224,0.06,1.42,2.54
tf_mobilenetv3_small_minimal_100,31288.96,32.716,1024,224,0.06,1.41,2.04
tf_mobilenetv3_small_075,30676.85,33.368,1024,224,0.05,1.3,2.04
lcnet_075,30088.74,34.022,1024,224,0.1,1.99,2.36
tf_mobilenetv3_small_100,28547.5,35.858,1024,224,0.06,1.42,2.54
lcnet_100,23945.91,42.753,1024,224,0.16,2.52,2.95
mnasnet_small,22244.35,46.024,1024,224,0.07,2.16,2.03
levit_128s,22002.58,46.529,1024,224,0.31,1.88,7.78
mobilenetv2_035,20937.19,48.897,1024,224,0.07,2.86,1.68
mnasnet_050,18984.06,53.93,1024,224,0.11,3.07,2.22
ghostnet_050,18415.88,55.593,1024,224,0.05,1.77,2.59
tinynet_c,17846.73,57.365,1024,184,0.11,2.87,2.46
mobilenetv2_050,16928.19,60.48,1024,224,0.1,3.64,1.97
semnasnet_050,16394.61,62.449,1024,224,0.11,3.44,2.08
lcnet_150,15508.0,66.02,1024,224,0.34,3.79,4.5
gernet_s,15282.73,66.993,1024,224,0.75,2.65,8.17
levit_128,14929.05,68.581,1024,224,0.41,2.71,9.21
cs3darknet_focus_s,14654.05,69.868,1024,256,0.69,2.7,3.27
cs3darknet_s,14422.34,70.991,1024,256,0.72,2.97,3.28
mobilenetv3_large_075,14412.2,71.04,1024,224,0.16,4.0,3.99
mixer_s32_224,13576.58,75.414,1024,224,1.0,2.28,19.1
resnet10t,13509.21,75.789,1024,224,1.1,2.43,5.44
mobilenetv3_rw,13202.47,77.551,1024,224,0.23,4.41,5.48
levit_192,13174.02,77.717,1024,224,0.66,3.2,10.95
mobilenetv3_large_100,12955.77,79.027,1024,224,0.23,4.41,5.48
mobilenetv3_large_100_miil,12954.49,79.035,1024,224,0.23,4.41,5.48
vit_small_patch32_224,12913.76,79.284,1024,224,1.15,2.5,22.88
hardcorenas_a,12748.25,80.313,1024,224,0.23,4.38,5.26
mnasnet_075,12700.79,80.614,1024,224,0.23,4.77,3.17
tf_mobilenetv3_large_075,12296.64,83.262,1024,224,0.16,4.0,3.99
tinynet_b,12104.09,84.587,1024,188,0.21,4.44,3.73
tf_mobilenetv3_large_minimal_100,11915.92,85.923,1024,224,0.22,4.4,3.92
hardcorenas_b,11667.78,87.752,1024,224,0.26,5.09,5.18
hardcorenas_c,11602.35,88.247,1024,224,0.28,5.01,5.52
ese_vovnet19b_slim_dw,11450.49,89.417,1024,224,0.4,5.28,1.9
mnasnet_b1,11305.32,90.567,1024,224,0.33,5.46,4.38
mnasnet_100,11303.72,90.578,1024,224,0.33,5.46,4.38
tf_mobilenetv3_large_100,11165.87,91.695,1024,224,0.23,4.41,5.48
gluon_resnet18_v1b,11046.16,92.691,1024,224,1.82,2.48,11.69
ssl_resnet18,11027.01,92.852,1024,224,1.82,2.48,11.69
resnet18,11023.6,92.881,1024,224,1.82,2.48,11.69
swsl_resnet18,11003.86,93.048,1024,224,1.82,2.48,11.69
semnasnet_075,10953.15,93.479,1024,224,0.23,5.54,2.91
regnetx_004,10946.78,93.533,1024,224,0.4,3.14,5.16
hardcorenas_d,10898.13,93.95,1024,224,0.3,4.93,7.5
mobilenetv2_075,10779.36,94.985,1024,224,0.22,5.86,2.64
ghostnet_100,10498.38,97.527,1024,224,0.15,3.55,5.18
seresnet18,10333.85,99.081,1024,224,1.82,2.49,11.78
vit_tiny_r_s16_p8_224,10273.23,99.666,1024,224,0.44,2.06,6.34
spnasnet_100,10240.55,99.983,1024,224,0.35,6.03,4.42
legacy_seresnet18,10026.58,102.118,1024,224,1.82,2.49,11.78
mnasnet_a1,9730.7,105.223,1024,224,0.32,6.23,3.89
semnasnet_100,9728.29,105.249,1024,224,0.32,6.23,3.89
tf_efficientnetv2_b0,9714.68,105.397,1024,224,0.73,4.77,7.14
tinynet_a,9706.17,105.487,1024,192,0.35,5.41,6.19
hardcorenas_f,9654.05,106.058,1024,224,0.35,5.57,8.2
mobilenetv2_100,9591.46,106.751,1024,224,0.31,6.68,3.5
levit_256,9580.42,106.874,1024,224,1.13,4.23,18.89
regnetx_002,9551.99,107.191,1024,224,0.2,2.16,2.68
hardcorenas_e,9521.78,107.531,1024,224,0.35,5.65,8.07
efficientnet_lite0,9415.07,108.751,1024,224,0.4,6.74,4.65
regnety_002,9227.09,110.966,1024,224,0.2,2.17,3.16
fbnetc_100,9182.63,111.504,1024,224,0.4,6.51,5.57
resnet18d,9153.04,111.864,1024,224,2.06,3.29,11.71
regnety_006,9048.64,113.154,1024,224,0.61,4.33,6.06
ese_vovnet19b_slim,8822.42,116.057,1024,224,1.69,3.52,3.17
ghostnet_130,8402.81,121.852,1024,224,0.24,4.6,7.36
regnetx_006,8395.84,121.954,1024,224,0.61,3.98,6.2
levit_256d,8131.84,125.914,1024,224,1.4,4.93,26.21
tf_efficientnet_lite0,8115.07,126.174,1024,224,0.4,6.74,4.65
regnetz_005,8104.32,126.341,1024,224,0.52,5.86,7.12
efficientnet_b0,8015.36,127.743,1024,224,0.4,6.75,5.29
xcit_nano_12_p16_224_dist,7700.28,132.971,1024,224,0.56,4.17,3.05
xcit_nano_12_p16_224,7692.58,133.102,1024,224,0.56,4.17,3.05
mnasnet_140,7484.35,136.808,1024,224,0.6,7.71,7.12
rexnetr_100,7414.06,138.105,1024,224,0.43,7.72,4.88
resnet14t,7298.17,140.296,1024,224,1.69,5.8,10.08
mobilenetv2_110d,7233.3,141.556,1024,224,0.45,8.71,4.52
regnetx_008,7112.68,143.957,1024,224,0.81,5.15,7.26
tf_efficientnet_b0_ns,7058.22,145.067,1024,224,0.4,6.75,5.29
tf_efficientnet_b0_ap,7055.15,145.131,1024,224,0.4,6.75,5.29
tf_efficientnet_b0,7052.24,145.19,1024,224,0.4,6.75,5.29
edgenext_xx_small,6998.78,146.298,1024,256,0.33,4.21,1.33
dla46_c,6933.78,147.671,1024,224,0.58,4.5,1.3
deit_tiny_patch16_224,6855.38,149.36,1024,224,1.26,5.97,5.72
vit_tiny_patch16_224,6844.8,149.592,1024,224,1.26,5.97,5.72
regnety_008,6827.19,149.977,1024,224,0.81,5.25,6.26
gernet_m,6753.27,151.619,1024,224,3.02,5.24,21.14
deit_tiny_distilled_patch16_224,6720.97,152.347,1024,224,1.27,6.01,5.91
efficientnet_b1_pruned,6608.04,154.952,1024,240,0.4,6.21,6.33
hrnet_w18_small,6603.38,155.061,1024,224,1.61,5.72,13.19
gluon_resnet34_v1b,6434.28,159.136,1024,224,3.67,3.74,21.8
semnasnet_140,6428.21,159.287,1024,224,0.6,8.87,6.11
tv_resnet34,6406.04,159.838,1024,224,3.67,3.74,21.8
resnet34,6404.45,159.878,1024,224,3.67,3.74,21.8
ese_vovnet19b_dw,6353.89,161.15,1024,224,1.34,8.25,6.54
rexnet_100,6291.85,162.738,1024,224,0.41,7.44,4.8
mobilenetv2_140,6258.1,163.617,1024,224,0.6,9.57,6.11
mobilevitv2_050,6228.2,164.403,1024,256,0.48,8.04,1.37
efficientnet_lite1,6215.56,164.736,1024,240,0.62,10.14,5.42
tf_efficientnetv2_b1,6143.78,166.661,1024,240,1.21,7.34,8.14
visformer_tiny,6090.13,168.13,1024,224,1.27,5.72,10.32
fbnetv3_b,6012.29,170.307,1024,256,0.55,9.1,8.6
seresnet34,5988.65,170.978,1024,224,3.67,3.74,21.96
resnet26,5923.37,172.863,1024,224,2.36,7.35,16.0
efficientnet_es,5871.09,174.403,1024,224,1.81,8.73,5.44
efficientnet_es_pruned,5866.42,174.542,1024,224,1.81,8.73,5.44
selecsls42,5796.58,176.643,1024,224,2.94,4.62,30.35
pit_ti_distilled_224,5792.34,176.773,1024,224,0.71,6.23,5.1
legacy_seresnet34,5780.67,177.13,1024,224,3.67,3.74,21.96
selecsls42b,5766.92,177.544,1024,224,2.98,4.62,32.46
pit_ti_224,5764.0,177.643,1024,224,0.7,6.19,4.85
resnet34d,5748.0,178.138,1024,224,3.91,4.54,21.82
levit_384,5659.16,180.934,1024,224,2.36,6.26,39.13
tf_efficientnet_es,5608.15,182.58,1024,224,1.81,8.73,5.44
resnetblur18,5572.02,183.764,1024,224,2.34,3.39,11.69
tf_efficientnet_lite1,5541.93,184.761,1024,240,0.62,10.14,5.42
rexnetr_130,5487.71,186.587,1024,224,0.68,9.81,7.61
cs3darknet_m,5481.96,186.783,1024,288,2.63,6.69,9.31
mixnet_s,5402.43,189.533,1024,224,0.25,6.25,4.13
regnety_004,5382.71,190.227,1024,224,0.41,3.89,4.34
skresnet18,5371.9,190.61,1024,224,1.82,3.24,11.96
darknet17,5347.86,143.598,768,256,3.26,7.18,14.3
mobilevit_xxs,5306.36,192.964,1024,256,0.42,8.34,1.27
cs3darknet_focus_m,5289.88,193.566,1024,288,2.51,6.19,9.3
mobilenetv2_120d,5178.64,197.724,1024,224,0.69,11.97,5.83
repvgg_b0,5161.18,198.394,1024,224,3.41,6.15,15.82
xcit_tiny_12_p16_224_dist,5107.76,200.467,1024,224,1.24,6.29,6.72
xcit_tiny_12_p16_224,5104.48,200.597,1024,224,1.24,6.29,6.72
resnet26d,5093.48,201.03,1024,224,2.6,8.15,16.01
tf_mixnet_s,4981.9,205.528,1024,224,0.25,6.25,4.13
vit_base_patch32_224_sam,4925.79,207.875,1024,224,4.41,5.01,88.22
vit_base_patch32_224,4923.14,207.986,1024,224,4.41,5.01,88.22
selecsls60,4922.02,208.033,1024,224,3.59,5.52,30.67
mixer_b32_224,4909.46,208.566,1024,224,3.24,6.29,60.29
selecsls60b,4902.59,208.858,1024,224,3.63,5.52,32.77
rexnetr_150,4887.15,209.518,1024,224,0.89,11.13,9.78
nf_resnet26,4834.78,211.787,1024,224,2.41,7.35,16.0
darknet21,4804.05,159.854,768,256,3.93,7.47,20.86
resmlp_12_distilled_224,4801.62,213.251,1024,224,3.01,5.5,15.35
resmlp_12_224,4801.47,213.257,1024,224,3.01,5.5,15.35
efficientnet_lite2,4791.14,213.716,1024,260,0.89,12.9,6.09
pit_xs_224,4790.75,213.733,1024,224,1.4,7.71,10.62
fbnetv3_d,4788.83,213.819,1024,256,0.68,11.1,10.31
pit_xs_distilled_224,4740.73,215.989,1024,224,1.41,7.76,11.0
dla34,4712.5,217.283,1024,224,3.07,5.02,15.74
sedarknet21,4615.23,166.394,768,256,3.93,7.47,20.95
resnext26ts,4512.74,226.902,1024,256,2.43,10.52,10.3
tf_efficientnetv2_b2,4506.54,227.212,1024,260,1.72,9.84,10.1
mixer_s16_224,4471.0,229.021,1024,224,3.79,5.97,18.53
legacy_seresnext26_32x4d,4467.81,229.184,1024,224,2.49,9.39,16.79
edgenext_x_small,4458.42,229.664,1024,256,0.68,7.5,2.34
gernet_l,4450.89,230.055,1024,256,4.57,8.0,31.08
tf_efficientnet_b1,4403.29,232.542,1024,240,0.71,10.88,7.79
tf_efficientnet_b1_ns,4402.75,232.57,1024,240,0.71,10.88,7.79
tf_efficientnet_b1_ap,4402.24,232.597,1024,240,0.71,10.88,7.79
eca_resnext26ts,4354.84,235.13,1024,256,2.43,10.52,10.3
seresnext26ts,4350.24,235.378,1024,256,2.43,10.52,10.39
tf_efficientnet_lite2,4300.72,238.087,1024,260,0.89,12.9,6.09
gcresnext26ts,4295.78,238.361,1024,256,2.43,10.53,10.48
rexnet_130,4282.77,239.085,1024,224,0.68,9.71,7.56
efficientnet_b1,4273.32,239.615,1024,256,0.77,12.22,7.79
gmlp_ti16_224,4143.15,247.143,1024,224,1.34,7.55,5.87
efficientnet_b0_g16_evos,4127.77,248.064,1024,224,1.01,7.42,8.11
crossvit_tiny_240,4122.07,248.408,1024,240,1.57,9.08,7.01
nf_ecaresnet26,4104.24,249.486,1024,224,2.41,7.36,16.0
efficientnet_b2_pruned,4103.05,249.558,1024,260,0.73,9.13,8.31
nf_seresnet26,4102.39,249.599,1024,224,2.41,7.36,17.4
mobilevitv2_075,4066.07,251.829,1024,256,1.05,12.06,2.87
vit_small_patch32_384,4025.63,254.359,1024,384,3.45,8.25,22.92
ecaresnext50t_32x4d,4000.29,255.97,1024,224,2.7,10.09,15.41
ecaresnext26t_32x4d,3998.13,256.108,1024,224,2.7,10.09,15.41
seresnext26tn_32x4d,3995.56,256.274,1024,224,2.7,10.09,16.81
seresnext26t_32x4d,3993.92,256.378,1024,224,2.7,10.09,16.81
seresnext26d_32x4d,3982.98,257.083,1024,224,2.73,10.19,16.81
resnet26t,3922.56,261.042,1024,256,3.35,10.52,16.01
dla46x_c,3917.72,261.363,1024,224,0.54,5.66,1.07
rexnet_150,3870.49,264.554,1024,224,0.9,11.21,9.73
resnetv2_50,3868.33,264.701,1024,224,4.11,11.11,25.55
crossvit_9_240,3865.11,264.922,1024,240,1.85,9.52,8.55
convnext_nano_ols,3859.14,265.333,1024,224,2.5,8.37,15.6
nf_regnet_b0,3819.49,268.086,1024,256,0.64,5.58,8.76
ecaresnet50d_pruned,3812.92,268.549,1024,224,2.53,6.43,19.94
regnetx_016,3808.7,268.846,1024,224,1.62,7.93,9.19
crossvit_9_dagger_240,3792.18,270.018,1024,240,1.99,9.97,8.78
dla60x_c,3787.69,270.337,1024,224,0.59,6.01,1.32
convnext_nano_hnf,3786.83,270.399,1024,224,2.46,8.37,15.59
ecaresnetlight,3729.3,274.57,1024,224,4.11,8.42,30.16
poolformer_s12,3722.39,275.081,1024,224,1.82,5.53,11.92
gmixer_12_224,3686.02,277.795,1024,224,2.67,7.26,12.7
gluon_resnet50_v1b,3677.5,278.438,1024,224,4.11,11.11,25.56
resnet50,3676.35,278.526,1024,224,4.11,11.11,25.56
ssl_resnet50,3674.86,278.638,1024,224,4.11,11.11,25.56
tv_resnet50,3673.31,278.756,1024,224,4.11,11.11,25.56
swsl_resnet50,3672.81,278.794,1024,224,4.11,11.11,25.56
dpn68,3650.67,280.484,1024,224,2.35,10.47,12.61
dpn68b,3606.63,283.907,1024,224,2.35,10.47,12.61
botnet26t_256,3555.59,287.983,1024,256,3.32,11.98,12.49
regnety_016,3516.07,291.222,1024,224,1.63,8.04,11.2
repvgg_a2,3514.85,291.324,1024,224,5.7,6.26,28.21
resnetv2_50t,3513.08,291.47,1024,224,4.32,11.82,25.57
efficientnet_em,3504.92,292.149,1024,240,3.04,14.34,6.9
mixnet_m,3496.31,292.868,1024,224,0.36,8.19,5.01
resnetv2_50d,3496.21,292.876,1024,224,4.35,11.92,25.57
rexnetr_200,3492.0,219.921,768,224,1.59,15.11,16.52
halonet26t,3480.88,294.167,1024,256,3.19,11.69,12.48
gluon_resnet50_v1c,3476.97,294.498,1024,224,4.35,11.92,25.58
resnet32ts,3465.97,295.433,1024,256,4.63,11.58,17.96
bat_resnext26ts,3457.26,296.173,1024,256,2.53,12.51,10.73
resnet33ts,3415.09,299.834,1024,256,4.76,11.66,19.68
tf_efficientnet_b2_ns,3414.14,299.918,1024,260,1.02,13.83,9.11
tf_efficientnet_b2,3412.6,300.052,1024,260,1.02,13.83,9.11
tf_efficientnet_b2_ap,3412.15,300.092,1024,260,1.02,13.83,9.11
tf_efficientnet_em,3389.12,302.132,1024,240,3.04,14.34,6.9
resnet50t,3345.79,306.045,1024,224,4.32,11.82,25.57
gluon_resnet50_v1d,3337.44,306.809,1024,224,4.35,11.92,25.58
resnet50d,3336.65,306.882,1024,224,4.35,11.92,25.58
vit_tiny_r_s16_p8_384,3314.06,154.482,512,384,1.34,6.49,6.36
legacy_seresnet50,3311.17,309.245,1024,224,3.88,10.6,28.09
seresnet33ts,3304.6,309.859,1024,256,4.76,11.66,19.78
tf_mixnet_m,3303.11,309.997,1024,224,0.36,8.19,5.01
eca_resnet33ts,3297.96,310.484,1024,256,4.76,11.66,19.68
convit_tiny,3289.83,311.251,1024,224,1.26,7.94,5.71
gcresnet33ts,3263.32,313.778,1024,256,4.76,11.68,19.88
vit_small_resnet26d_224,3252.42,314.83,1024,224,5.07,11.12,63.61
vovnet39a,3229.96,317.018,1024,224,7.09,6.73,22.6
efficientnet_b2a,3221.33,317.868,1024,288,1.12,16.2,9.11
efficientnet_b2,3221.08,317.894,1024,288,1.12,16.2,9.11
efficientnet_b3_pruned,3195.08,320.48,1024,300,1.04,11.86,9.86
seresnet50,3183.22,321.675,1024,224,4.11,11.13,28.09
cs3darknet_l,3171.55,322.859,1024,288,6.16,10.83,21.16
cs3darknet_focus_l,3146.07,325.473,1024,288,5.9,10.16,21.15
res2net50_48w_2s,3141.04,325.995,1024,224,4.18,11.72,25.29
eca_vovnet39b,3123.78,327.796,1024,224,7.09,6.74,22.6
ese_vovnet39b,3117.85,328.419,1024,224,7.09,6.74,24.57
mobilevit_xs,3095.24,248.113,768,256,1.05,16.33,2.32
resnext50_32x4d,3092.67,331.094,1024,224,4.26,14.4,25.03
ssl_resnext50_32x4d,3089.61,331.422,1024,224,4.26,14.4,25.03
gluon_resnext50_32x4d,3087.43,331.656,1024,224,4.26,14.4,25.03
swsl_resnext50_32x4d,3086.28,331.779,1024,224,4.26,14.4,25.03
tv_resnext50_32x4d,3085.42,331.872,1024,224,4.26,14.4,25.03
hrnet_w18_small_v2,3075.57,332.934,1024,224,2.62,9.65,15.6
eca_botnext26ts_256,3069.01,333.646,1024,256,2.46,11.6,10.59
dla60,3049.62,335.767,1024,224,4.26,10.16,22.04
vgg11,3048.59,167.933,512,224,7.61,7.44,132.86
skresnet34,3035.0,337.386,1024,224,3.67,5.13,22.28
mobilevitv2_100,3028.13,253.611,768,256,1.84,16.08,4.9
vit_small_patch16_224,3005.65,340.679,1024,224,4.61,11.95,22.05
deit_small_patch16_224,3005.16,340.735,1024,224,4.61,11.95,22.05
eca_halonext26ts,2980.4,343.567,1024,256,2.44,11.46,10.76
resnetaa50d,2972.16,344.518,1024,224,5.39,12.44,25.58
ecaresnet101d_pruned,2963.47,345.529,1024,224,3.48,7.69,24.88
coat_lite_tiny,2957.59,346.216,1024,224,1.6,11.65,5.72
cs3sedarknet_l,2955.87,346.418,1024,288,6.16,10.83,21.91
deit_small_distilled_patch16_224,2950.21,347.082,1024,224,4.63,12.02,22.44
gluon_resnet50_v1s,2947.99,347.343,1024,224,5.47,13.52,25.68
resnetrs50,2945.06,347.688,1024,224,4.48,12.14,35.69
seresnet50t,2939.07,348.397,1024,224,4.32,11.83,28.1
densenet121,2936.15,348.744,1024,224,2.87,6.9,7.98
pit_s_224,2935.87,348.777,1024,224,2.88,11.56,23.46
ecaresnet50d,2934.67,348.92,1024,224,4.35,11.93,25.58
tv_densenet121,2927.3,349.799,1024,224,2.87,6.9,7.98
selecsls84,2927.1,349.822,1024,224,5.9,7.57,50.95
pit_s_distilled_224,2909.88,351.892,1024,224,2.9,11.64,24.04
vit_relpos_base_patch32_plus_rpn_256,2858.63,358.203,1024,256,7.68,8.01,119.42
deit3_small_patch16_224_in21ft1k,2858.2,358.255,1024,224,4.61,11.95,22.06
deit3_small_patch16_224,2853.97,358.786,1024,224,4.61,11.95,22.06
resnext50d_32x4d,2849.78,359.313,1024,224,4.5,15.2,25.05
vit_relpos_small_patch16_rpn_224,2814.18,363.86,1024,224,4.59,13.05,21.97
densenet121d,2808.28,364.624,1024,224,3.11,7.7,8.0
cspresnet50,2805.19,365.024,1024,256,4.54,11.5,21.62
vit_relpos_small_patch16_224,2800.45,365.643,1024,224,4.59,13.05,21.98
gcresnext50ts,2798.56,365.89,1024,256,3.75,15.46,15.67
vit_srelpos_small_patch16_224,2795.09,366.343,1024,224,4.59,12.16,21.97
coat_lite_mini,2774.65,369.044,1024,224,2.0,12.25,11.01
haloregnetz_b,2774.49,369.064,1024,224,1.97,11.94,11.68
vit_base_patch32_plus_256,2772.64,369.31,1024,256,7.79,7.76,119.48
rexnet_200,2762.14,278.034,768,224,1.56,14.91,16.37
res2net50_26w_4s,2757.69,371.313,1024,224,4.28,12.61,25.7
seresnext50_32x4d,2741.98,373.44,1024,224,4.26,14.42,27.56
gluon_seresnext50_32x4d,2737.7,374.024,1024,224,4.26,14.42,27.56
legacy_seresnext50_32x4d,2737.27,374.083,1024,224,4.26,14.42,27.56
xcit_tiny_24_p16_224_dist,2733.14,374.648,1024,224,2.34,11.82,12.12
xcit_tiny_24_p16_224,2731.46,374.879,1024,224,2.34,11.82,12.12
xcit_nano_12_p16_384_dist,2727.34,375.445,1024,384,1.64,12.15,3.05
dla60x,2724.01,375.903,1024,224,3.54,13.8,17.35
gcresnet50t,2718.77,376.628,1024,256,5.42,14.67,25.9
vgg11_bn,2701.91,189.486,512,224,7.62,7.44,132.87
visformer_small,2695.9,379.825,1024,224,4.88,11.43,40.22
lambda_resnet26rpt_256,2689.85,380.679,1024,256,3.16,11.87,10.99
mixnet_l,2682.98,286.237,768,224,0.58,10.84,7.33
resnetblur50,2682.32,381.746,1024,224,5.16,12.02,25.56
vovnet57a,2674.97,382.797,1024,224,8.95,7.52,36.64
efficientnet_lite3,2659.04,192.54,512,300,1.65,21.85,8.2
cspresnet50d,2649.78,386.434,1024,256,4.86,12.55,21.64
seresnetaa50d,2644.86,387.154,1024,224,5.4,12.46,28.11
efficientnetv2_rw_t,2633.28,388.856,1024,288,3.19,16.42,13.65
cspresnet50w,2624.25,390.195,1024,256,5.04,12.19,28.12
twins_svt_small,2599.22,393.951,1024,224,2.94,13.75,24.06
tf_efficientnetv2_b3,2587.08,395.8,1024,300,3.04,15.74,14.36
nf_regnet_b2,2583.08,396.413,1024,272,1.22,9.27,14.31
vit_base_resnet26d_224,2575.01,397.656,1024,224,6.97,13.16,101.4
ese_vovnet57b,2572.63,398.024,1024,224,8.95,7.52,38.61
fbnetv3_g,2570.51,398.353,1024,288,1.77,21.09,16.62
gc_efficientnetv2_rw_t,2557.73,400.342,1024,288,3.2,16.45,13.68
tf_mixnet_l,2550.98,301.047,768,224,0.58,10.84,7.33
nf_regnet_b1,2527.71,405.098,1024,288,1.02,9.2,10.22
res2net50_14w_8s,2514.56,407.217,1024,224,4.21,13.28,25.06
inception_v3,2512.32,407.579,1024,299,5.73,8.97,23.83
densenetblur121d,2509.93,407.967,1024,224,3.11,7.9,8.0
adv_inception_v3,2509.35,408.057,1024,299,5.73,8.97,23.83
tf_inception_v3,2505.31,408.714,1024,299,5.73,8.97,23.83
gluon_inception_v3,2501.93,409.271,1024,299,5.73,8.97,23.83
resnetblur50d,2498.32,409.863,1024,224,5.4,12.82,25.58
nf_ecaresnet50,2492.25,410.862,1024,224,4.21,11.13,25.56
nf_seresnet50,2488.35,411.506,1024,224,4.21,11.13,28.09
resmlp_24_224,2465.19,415.371,1024,224,5.96,10.91,30.02
resmlp_24_distilled_224,2463.93,415.584,1024,224,5.96,10.91,30.02
mobilevit_s,2450.02,313.456,768,256,2.03,19.94,5.58
regnetx_032,2449.76,417.988,1024,224,3.2,11.37,15.3
cspresnext50,2440.84,419.515,1024,256,4.05,15.86,20.57
dla60_res2net,2430.52,421.296,1024,224,4.15,12.34,20.85
resnest14d,2424.63,422.32,1024,224,2.76,7.33,10.61
densenet169,2421.68,422.835,1024,224,3.4,7.3,14.15
convnext_tiny_hnfd,2418.1,423.46,1024,224,4.47,13.44,28.59
convnext_tiny_hnf,2414.45,424.101,1024,224,4.47,13.44,28.59
tf_efficientnet_lite3,2396.12,213.668,512,300,1.65,21.85,8.2
sehalonet33ts,2392.73,427.951,1024,256,3.55,14.7,13.69
efficientnet_cc_b0_4e,2389.83,428.47,1024,224,0.41,9.42,13.31
efficientnet_cc_b0_8e,2386.88,429.0,1024,224,0.42,9.42,24.01
convnext_tiny_in22ft1k,2380.94,430.068,1024,224,4.47,13.44,28.59
convnext_tiny,2379.5,430.329,1024,224,4.47,13.44,28.59
regnetz_b16,2348.93,435.929,1024,288,2.39,16.43,9.72
resnetv2_101,2321.74,441.036,1024,224,7.83,16.23,44.54
convnext_nano,2304.31,444.37,1024,288,4.06,13.84,15.59
tf_efficientnet_cc_b0_4e,2293.39,446.488,1024,224,0.41,9.42,13.31
semobilevit_s,2279.96,336.836,768,256,2.03,19.95,5.74
gluon_resnet101_v1b,2249.95,455.108,1024,224,7.83,16.23,44.55
tv_resnet101,2246.24,455.861,1024,224,7.83,16.23,44.55
resnet101,2246.09,455.891,1024,224,7.83,16.23,44.55
mobilevitv2_125,2233.52,343.842,768,256,2.86,20.1,7.48
skresnet50,2232.97,458.569,1024,224,4.11,12.5,25.8
ecaresnet26t,2203.93,464.611,1024,320,5.24,16.44,16.01
resnetv2_101d,2180.36,469.635,1024,224,8.07,17.04,44.56
gluon_resnet101_v1c,2174.26,470.952,1024,224,8.08,17.04,44.57
twins_pcpvt_small,2160.75,473.897,1024,224,3.83,18.08,24.11
xcit_small_12_p16_224_dist,2141.07,478.253,1024,224,4.82,12.58,26.25
xcit_small_12_p16_224,2140.23,478.441,1024,224,4.82,12.58,26.25
cs3darknet_focus_x,2138.85,478.75,1024,256,8.03,10.69,35.02
edgenext_small,2120.03,482.998,1024,320,1.97,14.16,5.59
gluon_resnet101_v1d,2116.52,483.8,1024,224,8.08,17.04,44.57
tf_efficientnet_cc_b0_8e,2114.86,484.181,1024,224,0.42,9.42,24.01
vgg13,2106.04,486.207,1024,224,11.31,12.25,133.05
skresnet50d,2104.19,486.637,1024,224,4.36,13.31,25.82
xcit_nano_12_p8_224_dist,2091.47,489.594,1024,224,2.16,15.71,3.05
xcit_nano_12_p8_224,2088.8,490.222,1024,224,2.16,15.71,3.05
sebotnet33ts_256,2061.15,248.392,512,256,3.89,17.46,13.7
efficientnet_b0_gn,2058.74,373.032,768,224,0.42,6.75,5.29
wide_resnet50_2,2057.11,497.774,1024,224,11.43,14.4,68.88
dla102,2034.05,503.416,1024,224,7.19,14.18,33.27
vit_base_resnet50d_224,2032.15,503.887,1024,224,8.73,16.92,110.97
resnet51q,2003.15,511.181,1024,288,8.07,20.94,35.7
legacy_seresnet101,1997.44,512.643,1024,224,7.61,15.74,49.33
regnetx_040,1987.99,515.08,1024,224,3.99,12.2,22.12
gmlp_s16_224,1983.97,516.124,1024,224,4.42,15.1,19.42
res2net50_26w_6s,1970.9,519.546,1024,224,6.33,15.28,37.05
resnetaa101d,1964.05,521.361,1024,224,9.12,17.56,44.57
gluon_resnet101_v1s,1954.76,523.835,1024,224,9.19,18.64,44.67
seresnet101,1951.77,524.639,1024,224,7.84,16.27,49.33
repvgg_b1,1949.45,525.265,1024,224,13.16,10.64,57.42
crossvit_small_240,1948.12,525.623,1024,240,5.63,18.17,26.86
cs3sedarknet_xdw,1942.13,527.244,1024,256,5.97,17.18,21.6
resnetaa50,1934.76,529.254,1024,288,8.52,19.24,25.56
swin_tiny_patch4_window7_224,1924.88,531.966,1024,224,4.51,17.06,28.29
resnext101_32x4d,1924.17,532.166,1024,224,8.01,21.23,44.18
ssl_resnext101_32x4d,1923.99,532.215,1024,224,8.01,21.23,44.18
poolformer_s24,1923.48,532.355,1024,224,3.41,10.68,21.39
gluon_resnext101_32x4d,1922.64,532.587,1024,224,8.01,21.23,44.18
swsl_resnext101_32x4d,1922.44,532.644,1024,224,8.01,21.23,44.18
vit_relpos_medium_patch16_cls_224,1918.8,533.655,1024,224,8.03,18.24,38.76
vit_relpos_medium_patch16_rpn_224,1917.82,533.927,1024,224,7.97,17.02,38.73
vit_relpos_medium_patch16_224,1911.97,535.56,1024,224,7.97,17.02,38.75
vit_srelpos_medium_patch16_224,1908.42,536.559,1024,224,7.96,16.21,38.74
resnest50d_1s4x24d,1891.51,541.355,1024,224,4.43,13.57,25.68
darknet53,1883.2,407.805,768,288,11.78,15.68,41.61
gmixer_24_224,1881.95,544.104,1024,224,5.28,14.45,24.72
darknetaa53,1875.8,409.414,768,288,10.08,15.68,36.02
densenet201,1868.67,547.971,1024,224,4.34,7.85,20.01
halonet50ts,1867.02,548.456,1024,256,5.3,19.2,22.73
mobilevitv2_150,1865.71,274.416,512,256,4.09,24.11,10.59
mobilevitv2_150_in22ft1k,1864.94,274.529,512,256,4.09,24.11,10.59
tf_efficientnet_b3_ns,1862.24,274.927,512,300,1.87,23.83,12.23
tf_efficientnet_b3,1861.19,275.081,512,300,1.87,23.83,12.23
tf_efficientnet_b3_ap,1860.71,275.153,512,300,1.87,23.83,12.23
nf_resnet101,1854.11,552.273,1024,224,8.01,16.23,44.55
dla102x,1853.94,552.322,1024,224,5.89,19.42,26.31
ecaresnet101d,1853.67,552.405,1024,224,8.08,17.07,44.57
vgg13_bn,1850.18,276.718,512,224,11.33,12.25,133.05
cspdarknet53,1837.13,418.032,768,256,6.57,16.81,27.64
efficientnet_b3a,1829.01,279.921,512,320,2.01,26.52,12.23
efficientnet_b3,1828.85,279.946,512,320,2.01,26.52,12.23
mixnet_xl,1821.62,281.057,512,224,0.93,14.57,11.9
resnet61q,1810.56,565.559,1024,288,9.87,21.52,36.85
vit_small_r26_s32_224,1806.33,566.883,1024,224,3.56,9.85,36.43
xcit_tiny_12_p16_384_dist,1805.03,567.29,1024,384,3.64,18.26,6.72
edgenext_small_rw,1803.36,567.813,1024,320,2.46,14.85,7.83
crossvit_15_240,1792.63,571.217,1024,240,5.81,19.77,27.53
resnest26d,1781.59,574.753,1024,224,3.64,9.97,17.07
nf_resnet50,1773.35,577.425,1024,288,6.88,18.37,25.56
hrnet_w18,1766.17,579.773,1024,224,4.32,16.31,21.3
crossvit_15_dagger_240,1755.64,583.25,1024,240,6.13,20.43,28.21
swin_s3_tiny_224,1746.31,586.364,1024,224,4.64,19.13,28.33
resnetblur101d,1744.07,587.12,1024,224,9.12,17.94,44.57
res2net101_26w_4s,1715.56,596.875,1024,224,8.1,18.45,45.21
cait_xxs24_224,1707.34,599.75,1024,224,2.53,20.29,11.96
nf_regnet_b3,1702.33,601.516,1024,320,2.05,14.61,18.59
seresnext101_32x4d,1701.46,601.825,1024,224,8.02,21.26,48.96
gluon_seresnext101_32x4d,1701.17,601.927,1024,224,8.02,21.26,48.96
legacy_seresnext101_32x4d,1697.89,603.09,1024,224,8.02,21.26,48.96
resnetv2_50d_frn,1691.2,605.475,1024,224,4.33,11.92,25.59
vgg16,1690.51,605.724,1024,224,15.47,13.56,138.36
repvgg_b1g4,1670.27,613.064,1024,224,8.15,10.64,39.97
res2net50_26w_8s,1662.2,616.038,1024,224,8.37,17.95,48.4
resmlp_36_224,1656.29,618.237,1024,224,8.91,16.33,44.69
resmlp_36_distilled_224,1655.44,618.553,1024,224,8.91,16.33,44.69
regnetz_c16,1654.15,309.514,512,320,3.92,25.88,13.46
sequencer2d_s,1646.91,621.756,1024,224,4.96,11.31,27.65
efficientnet_b0_g8_gn,1636.03,469.418,768,224,0.66,6.75,6.56
botnet50ts_256,1633.5,313.424,512,256,5.54,22.23,22.74
vit_large_patch32_224,1629.91,628.244,1024,224,15.39,13.3,306.54
ese_vovnet39b_evos,1627.7,629.096,1024,224,7.07,6.74,24.58
cs3darknet_x,1627.54,629.156,1024,288,10.6,14.36,35.05
resnetv2_50d_evob,1620.95,631.713,1024,224,4.33,11.92,25.59
efficientnet_cc_b1_8e,1619.01,632.471,1024,240,0.75,15.44,39.72
resnetv2_152,1614.21,634.351,1024,224,11.55,22.56,60.19
xception41p,1587.33,322.543,512,299,9.25,39.86,26.91
regnetx_064,1586.15,484.18,768,224,6.49,16.37,26.21
coat_lite_small,1583.04,646.84,1024,224,3.96,22.09,19.84
swinv2_cr_tiny_224,1581.48,647.481,1024,224,4.66,28.45,28.33
xception,1578.2,486.619,768,299,8.4,35.83,22.86
gluon_resnet152_v1b,1573.68,650.689,1024,224,11.56,22.56,60.19
tv_resnet152,1573.22,650.883,1024,224,11.56,22.56,60.19
resnetv2_50x1_bit_distilled,1572.94,650.997,1024,224,4.23,11.11,25.55
resnet152,1571.71,651.505,1024,224,11.56,22.56,60.19
tf_efficientnet_cc_b1_8e,1564.9,654.344,1024,240,0.75,15.44,39.72
halo2botnet50ts_256,1559.2,656.732,1024,256,5.02,21.78,22.64
mixer_l32_224,1558.59,656.992,1024,224,11.27,19.86,206.94
vit_tiny_patch16_384,1557.53,657.441,1024,384,4.7,25.39,5.79
mobilevitv2_175,1554.07,329.447,512,256,5.54,28.13,14.25
swinv2_cr_tiny_ns_224,1551.84,659.847,1024,224,4.66,28.45,28.33
mobilevitv2_175_in22ft1k,1551.58,329.973,512,256,5.54,28.13,14.25
vit_base_patch32_384,1550.87,660.263,1024,384,13.06,16.5,88.3
cs3sedarknet_x,1549.02,661.048,1024,288,10.6,14.37,35.4
resnetv2_152d,1545.41,662.596,1024,224,11.8,23.36,60.2
nf_ecaresnet101,1540.66,664.639,1024,224,8.01,16.27,44.55
nf_seresnet101,1538.7,665.483,1024,224,8.02,16.27,49.33
gluon_resnet152_v1c,1536.27,666.538,1024,224,11.8,23.36,60.21
efficientnet_el,1524.59,335.818,512,300,8.0,30.7,10.59
efficientnet_el_pruned,1523.29,336.105,512,300,8.0,30.7,10.59
gluon_resnet152_v1d,1508.62,678.753,1024,224,11.8,23.36,60.21
vgg16_bn,1504.44,340.315,512,224,15.5,13.56,138.37
twins_pcpvt_base,1494.56,685.139,1024,224,6.68,25.25,43.83
tf_efficientnet_el,1481.51,345.582,512,300,8.0,30.7,10.59
cs3edgenet_x,1479.52,692.101,1024,288,14.59,16.36,47.82
vit_base_r26_s32_224,1479.31,692.202,1024,224,6.81,12.36,101.38
skresnext50_32x4d,1465.64,698.657,1024,224,4.5,17.18,27.48
convnext_small,1452.43,705.012,1024,224,8.71,21.56,50.22
convnext_small_in22ft1k,1450.42,705.987,1024,224,8.71,21.56,50.22
hrnet_w32,1444.27,708.991,1024,224,8.97,22.02,41.23
ese_vovnet99b,1442.7,709.767,1024,224,16.51,11.27,63.2
mixer_b16_224,1424.87,718.648,1024,224,12.62,14.53,59.88
gluon_resnet152_v1s,1424.84,718.665,1024,224,12.92,24.96,60.32
ecaresnet50t,1422.0,720.1,1024,320,8.82,24.13,25.57
mixer_b16_224_miil,1421.45,720.381,1024,224,12.62,14.53,59.88
vgg19,1411.57,725.42,1024,224,19.63,14.86,143.67
regnety_032,1398.62,732.137,1024,288,5.29,18.61,19.44
convit_small,1398.4,732.254,1024,224,5.76,17.87,27.78
nest_tiny,1387.46,553.516,768,224,5.83,25.48,17.06
legacy_seresnet152,1382.46,740.694,1024,224,11.33,22.08,66.82
dla169,1382.4,740.729,1024,224,11.6,20.2,53.39
xcit_tiny_12_p8_224,1374.74,744.858,1024,224,4.81,23.6,6.71
xcit_tiny_12_p8_224_dist,1374.04,745.236,1024,224,4.81,23.6,6.71
densenet161,1366.11,749.563,1024,224,7.79,11.06,28.68
jx_nest_tiny,1362.5,563.656,768,224,5.83,25.48,17.06
seresnet152,1361.36,752.174,1024,224,11.57,22.61,66.82
mobilevitv2_200_in22ft1k,1354.63,283.461,384,256,7.22,32.15,18.45
mobilevitv2_200,1354.25,283.54,384,256,7.22,32.15,18.45
xception41,1347.67,379.903,512,299,9.28,39.86,26.97
inception_v4,1323.37,773.767,1024,299,12.28,15.09,42.68
twins_svt_base,1316.07,778.059,1024,224,8.59,26.33,56.07
vit_small_resnet50d_s16_224,1305.38,784.435,1024,224,13.48,24.82,57.53
dpn92,1303.44,785.601,1024,224,6.54,18.21,37.67
tresnet_m,1297.91,788.947,1024,224,5.74,7.31,31.39
poolformer_s36,1296.72,789.674,1024,224,5.0,15.82,30.86
sequencer2d_m,1285.21,796.745,1024,224,6.55,14.26,38.31
crossvit_18_240,1273.5,804.072,1024,240,9.05,26.26,43.27
regnetx_080,1271.94,805.056,1024,224,8.02,14.06,39.57
dla102x2,1271.93,402.524,512,224,9.34,29.91,41.28
vgg19_bn,1265.83,404.467,512,224,19.66,14.86,143.68
efficientnet_lite4,1263.67,303.867,384,380,4.04,45.66,13.01
crossvit_18_dagger_240,1245.15,822.375,1024,240,9.5,27.03,44.27
res2next50,1241.53,824.775,1024,224,4.2,13.71,24.67
volo_d1_224,1235.2,829.0,1024,224,6.94,24.43,26.63
efficientnetv2_s,1221.73,838.141,1024,384,8.44,35.77,21.46
resnest50d,1214.35,843.233,1024,224,5.4,14.36,27.48
tf_efficientnetv2_s_in21ft1k,1191.03,859.741,1024,384,8.44,35.77,21.46
tf_efficientnetv2_s,1191.03,859.748,1024,384,8.44,35.77,21.46
dpn98,1188.11,861.858,1024,224,11.73,25.2,61.57
mixnet_xxl,1187.83,323.267,384,224,2.04,23.43,23.96
swin_small_patch4_window7_224,1183.48,865.227,1024,224,8.77,27.47,49.61
regnetz_d8,1180.44,867.458,1024,320,6.19,37.08,23.37
hrnet_w30,1180.28,867.576,1024,224,8.15,21.21,37.71
gluon_resnext101_64x4d,1176.11,870.653,1024,224,15.52,31.21,83.46
efficientnetv2_rw_s,1166.72,877.658,1024,384,8.72,38.03,23.94
tf_efficientnet_lite4,1164.88,329.636,384,380,4.04,45.66,13.01
swinv2_tiny_window8_256,1158.39,883.971,1024,256,5.96,24.57,28.35
wide_resnet101_2,1155.6,886.11,1024,224,22.8,21.23,126.89
repvgg_b2,1154.84,886.691,1024,224,20.45,12.9,89.02
vit_base_patch16_224_miil,1153.59,887.648,1024,224,17.58,23.9,86.54
resnet50_gn,1150.43,890.083,1024,224,4.14,11.11,25.56
resnet200,1149.46,890.84,1024,224,15.07,32.19,64.67
cait_xxs36_224,1148.13,891.873,1024,224,3.77,30.34,17.3
xception65p,1140.81,448.791,512,299,13.91,52.48,39.82
regnetz_040,1140.64,336.641,384,320,6.35,37.78,27.12
xcit_small_24_p16_224_dist,1137.7,900.049,1024,224,9.1,23.64,47.67
xcit_small_24_p16_224,1136.58,900.934,1024,224,9.1,23.64,47.67
regnetz_040h,1135.68,338.111,384,320,6.43,37.94,28.94
deit_base_patch16_224,1133.84,903.113,1024,224,17.58,23.9,86.57
vit_base_patch16_224,1133.45,903.419,1024,224,17.58,23.9,86.57
vit_base_patch16_224_sam,1132.11,904.493,1024,224,17.58,23.9,86.57
regnetz_d32,1128.13,907.679,1024,320,9.33,37.08,27.58
dla60_res2next,1127.83,907.922,1024,224,3.49,13.17,17.03
eca_nfnet_l0,1126.64,908.88,1024,288,7.12,17.29,24.14
resnetrs101,1123.2,911.667,1024,288,13.56,28.53,63.62
nfnet_l0,1123.1,911.747,1024,288,7.13,17.29,35.07
cs3se_edgenet_x,1120.46,913.898,1024,320,18.01,20.21,50.72
deit_base_distilled_patch16_224,1119.36,914.798,1024,224,17.68,24.05,87.34
vit_base_patch16_rpn_224,1111.53,921.235,1024,224,17.49,23.75,86.54
inception_resnet_v2,1108.79,923.511,1024,299,13.18,25.06,55.84
ens_adv_inception_resnet_v2,1107.31,924.747,1024,299,13.18,25.06,55.84
deit3_base_patch16_224,1093.88,936.101,1024,224,17.58,23.9,86.59
deit3_base_patch16_224_in21ft1k,1092.45,937.33,1024,224,17.58,23.9,86.59
gluon_seresnext101_64x4d,1088.94,940.353,1024,224,15.53,31.25,88.23
vit_relpos_base_patch16_clsgap_224,1081.95,946.422,1024,224,17.6,25.12,86.43
vit_relpos_base_patch16_cls_224,1081.77,946.586,1024,224,17.6,25.12,86.43
vit_relpos_base_patch16_rpn_224,1080.35,947.826,1024,224,17.51,24.97,86.41
vit_relpos_base_patch16_224,1079.52,948.56,1024,224,17.51,24.97,86.43
tnt_s_patch16_224,1078.48,949.47,1024,224,5.24,24.37,23.76
twins_pcpvt_large,1066.87,959.801,1024,224,9.84,35.82,60.99
ssl_resnext101_32x8d,1054.92,970.677,1024,224,16.48,31.21,88.79
resnext101_32x8d,1054.71,970.869,1024,224,16.48,31.21,88.79
ig_resnext101_32x8d,1054.19,971.349,1024,224,16.48,31.21,88.79
swsl_resnext101_32x8d,1053.69,971.812,1024,224,16.48,31.21,88.79
beit_base_patch16_224,1049.2,975.962,1024,224,17.58,23.9,86.53
resnest50d_4s2x40d,1042.05,982.666,1024,224,4.4,17.94,30.42
coat_tiny,1040.04,984.566,1024,224,4.35,27.2,5.5
resnet101d,1029.19,994.942,1024,320,16.48,34.77,44.57
convnext_base,1012.72,1011.124,1024,224,15.38,28.75,88.59
convnext_base_in22ft1k,1011.75,1012.088,1024,224,15.38,28.75,88.59
efficientnet_b4,993.87,386.356,384,384,4.51,50.04,19.34
pit_b_224,992.43,515.895,512,224,12.42,32.94,73.76
pit_b_distilled_224,988.79,517.791,512,224,12.5,33.07,74.79
gluon_xception65,981.26,521.764,512,299,13.96,52.48,39.92
xception65,975.37,524.918,512,299,13.96,52.48,39.92
vit_small_patch16_36x1_224,973.92,1051.404,1024,224,13.71,35.69,64.67
repvgg_b3,972.06,1053.416,1024,224,29.16,15.1,123.09
repvgg_b2g4,969.39,1056.316,1024,224,12.63,12.9,61.76
xcit_tiny_24_p16_384_dist,969.37,1056.345,1024,384,6.87,34.29,12.12
swinv2_cr_small_224,967.97,1057.869,1024,224,9.07,50.27,49.7
swinv2_cr_small_ns_224,957.86,1069.034,1024,224,9.08,50.27,49.7
vit_small_patch16_18x2_224,951.03,1076.715,1024,224,13.71,35.69,64.67
twins_svt_large,923.66,1108.616,1024,224,15.15,35.1,99.27
tf_efficientnet_b4,922.69,416.164,384,380,4.49,49.49,19.34
tf_efficientnet_b4_ap,922.5,416.247,384,380,4.49,49.49,19.34
tf_efficientnet_b4_ns,922.41,416.289,384,380,4.49,49.49,19.34
hrnet_w40,910.46,1124.691,1024,224,12.75,25.29,57.56
regnetz_b16_evos,903.54,849.98,768,288,2.36,16.43,9.74
cait_s24_224,902.24,1134.941,1024,224,9.35,40.58,46.92
nfnet_f0,901.6,1135.748,1024,256,12.62,18.05,71.49
nf_regnet_b4,900.78,1136.78,1024,384,4.7,28.61,30.21
poolformer_m36,896.53,1142.174,1024,224,8.8,22.02,56.17
nest_small,884.77,868.006,768,224,10.35,40.04,38.35
hrnet_w48,875.56,1169.527,1024,224,17.34,28.56,77.47
jx_nest_small,874.29,878.41,768,224,10.35,40.04,38.35
dpn131,866.66,1181.531,1024,224,16.09,32.97,79.25
swin_s3_small_224,854.8,898.443,768,224,9.43,37.84,49.74
regnety_040,854.48,898.782,768,288,6.61,20.3,20.65
regnetv_040,854.18,899.093,768,288,6.6,20.3,20.64
regnety_080,846.16,605.071,512,288,13.22,29.69,39.18
resnetv2_50d_evos,844.23,1212.929,1024,288,7.15,19.7,25.59
coat_mini,836.76,1223.748,1024,224,6.82,33.68,10.34
swin_base_patch4_window7_224,836.19,1224.59,1024,224,15.47,36.63,87.77
repvgg_b3g4,835.5,1225.597,1024,224,17.89,15.1,83.83
sequencer2d_l,832.91,1229.407,1024,224,9.74,22.12,54.3
dm_nfnet_f0,831.16,1232.004,1024,256,12.62,18.05,71.49
mobilevitv2_150_384_in22ft1k,826.94,309.566,256,384,9.2,54.25,10.59
gmlp_b16_224,825.63,1240.256,1024,224,15.78,30.21,73.08
convnext_tiny_384_in22ft1k,814.55,628.553,512,384,13.14,39.48,28.59
xcit_medium_24_p16_224_dist,812.18,1260.787,1024,224,16.13,31.71,84.4
xcit_medium_24_p16_224,812.16,1260.822,1024,224,16.13,31.71,84.4
regnetx_120,810.53,631.674,512,224,12.13,21.37,46.11
xcit_small_12_p16_384_dist,800.29,1279.521,1024,384,14.14,36.51,26.25
densenet264,798.84,1281.845,1024,224,12.95,12.8,72.69
hrnet_w44,790.45,1295.441,1024,224,14.94,26.92,67.06
crossvit_base_240,787.5,975.226,768,240,21.22,36.33,105.03
regnety_120,783.87,653.154,512,224,12.14,21.38,51.82
swinv2_tiny_window16_256,765.77,668.599,512,256,6.68,39.02,28.35
resnetv2_50d_gn,751.14,1022.427,768,288,7.24,19.7,25.57
xception71,749.93,682.721,512,299,18.09,69.92,42.34
vit_large_r50_s32_224,739.74,1384.252,1024,224,19.58,24.41,328.99
vit_base_patch16_plus_240,737.95,1387.618,1024,240,27.41,33.08,117.56
dpn107,736.42,1390.497,1024,224,18.38,33.46,86.92
resnet152d,736.09,1391.121,1024,320,24.08,47.67,60.21
ecaresnet200d,730.76,1401.271,1024,256,20.0,43.15,64.69
seresnet200d,728.63,1405.363,1024,256,20.01,43.15,71.86
vit_relpos_base_patch16_plus_240,727.74,1407.074,1024,240,27.3,34.33,117.38
xcit_tiny_24_p8_224,725.69,1411.06,1024,224,9.21,45.39,12.11
xcit_tiny_24_p8_224_dist,725.66,1411.108,1024,224,9.21,45.39,12.11
hrnet_w64,719.99,1422.231,1024,224,28.97,35.09,128.06
regnety_040s_gn,719.08,1068.018,768,224,4.03,12.29,20.65
xcit_nano_12_p8_384_dist,713.49,1435.191,1024,384,6.34,46.08,3.05
swinv2_small_window8_256,712.51,1437.16,1024,256,11.58,40.14,49.73
convit_base,706.35,1449.693,1024,224,17.52,31.77,86.54
resnext101_64x4d,704.57,1090.007,768,288,25.66,51.59,83.46
swin_s3_base_224,695.14,1473.064,1024,224,13.69,48.26,71.13
vit_small_patch16_384,693.62,1107.217,768,384,15.52,50.78,22.2
tnt_b_patch16_224,691.83,1480.107,1024,224,14.09,39.01,65.41
swinv2_cr_base_224,689.84,1484.394,1024,224,15.86,59.66,87.88
regnety_064,684.37,748.122,512,288,10.56,27.11,30.58
swinv2_cr_base_ns_224,683.96,1497.137,1024,224,15.86,59.66,87.88
volo_d2_224,679.94,1506.002,1024,224,14.34,41.34,58.68
mobilevitv2_175_384_in22ft1k,679.77,376.586,256,384,12.47,63.29,14.25
regnetv_064,678.0,755.149,512,288,10.55,27.11,30.58
poolformer_m48,676.06,1514.652,1024,224,11.59,29.17,73.47
deit3_small_patch16_384,669.71,1146.752,768,384,15.52,50.78,22.21
deit3_small_patch16_384_in21ft1k,669.36,1147.345,768,384,15.52,50.78,22.21
legacy_senet154,660.41,1550.529,1024,224,20.77,38.69,115.09
senet154,659.33,1553.081,1024,224,20.77,38.69,115.09
gluon_senet154,659.08,1553.657,1024,224,20.77,38.69,115.09
regnetx_160,650.99,786.486,512,224,15.99,25.52,54.28
resnetrs152,646.22,1584.595,1024,320,24.34,48.14,86.62
seresnet152d,642.19,1594.525,1024,320,24.09,47.72,66.84
regnetz_e8,633.28,1212.715,768,320,15.46,63.94,57.7
tresnet_l,630.57,1623.908,1024,224,10.88,11.9,55.99
nest_base,629.43,813.42,512,224,17.96,53.39,67.72
ese_vovnet99b_iabn,628.09,1630.325,1024,224,16.49,11.27,63.2
jx_nest_base,622.33,822.697,512,224,17.96,53.39,67.72
vit_small_r26_s32_384,613.78,834.168,512,384,10.43,29.85,36.47
vit_base_r50_s16_224,610.7,1676.739,1024,224,21.66,35.29,98.66
xcit_small_12_p8_224,609.86,1679.054,1024,224,18.69,47.21,26.21
xcit_small_12_p8_224_dist,609.57,1679.853,1024,224,18.69,47.21,26.21
efficientnetv2_m,603.09,1697.911,1024,416,18.6,67.5,54.14
mobilevitv2_200_384_in22ft1k,594.06,323.186,192,384,16.24,72.34,18.45
seresnext101_32x8d,590.8,1299.927,768,288,27.24,51.63,93.57
resnest101e,588.3,1305.448,768,256,13.38,28.66,48.28
regnetz_c16_evos,588.05,870.656,512,320,3.86,25.88,13.49
convmixer_768_32,578.91,1768.818,1024,224,19.55,25.95,21.11
seresnext101d_32x8d,575.36,1334.812,768,288,27.64,52.95,93.59
seresnet269d,570.63,1794.5,1024,256,26.59,53.6,113.67
convnext_large,561.47,1823.764,1024,224,34.4,43.13,197.77
convnext_large_in22ft1k,560.92,1825.557,1024,224,34.4,43.13,197.77
resnet200d,544.65,1880.08,1024,320,31.25,67.33,64.69
efficientnetv2_rw_m,534.87,1435.852,768,416,21.49,79.62,53.24
seresnextaa101d_32x8d,521.71,1472.057,768,288,28.51,56.44,93.59
vit_large_patch32_384,517.33,1979.373,1024,384,45.31,43.86,306.63
swinv2_base_window8_256,509.01,2011.746,1024,256,20.37,52.59,87.92
eca_nfnet_l1,497.06,2060.113,1024,320,14.92,34.42,41.41
convnext_small_384_in22ft1k,496.5,1031.206,512,384,25.58,63.37,50.22
mixer_l16_224,495.85,2065.122,1024,224,44.6,41.69,208.2
efficientnet_b5,493.19,519.054,256,456,10.46,98.86,30.39
halonet_h1,492.19,520.115,256,256,3.0,51.17,8.1
regnety_320,483.53,1058.87,512,224,32.34,30.26,145.05
swin_large_patch4_window7_224,480.21,1599.271,768,224,34.53,54.94,196.53
swinv2_small_window16_256,477.23,1072.842,512,256,12.82,66.29,49.73
volo_d3_224,474.32,2158.852,1024,224,20.78,60.09,86.33
resnetrs200,471.22,2173.072,1024,320,31.51,67.81,93.21
tf_efficientnet_b5_ns,469.35,545.419,256,456,10.46,98.86,30.39
tf_efficientnet_b5_ap,469.08,545.738,256,456,10.46,98.86,30.39
tf_efficientnet_b5,468.97,545.864,256,456,10.46,98.86,30.39
xcit_tiny_12_p8_384_dist,468.35,2186.374,1024,384,14.13,69.14,6.71
tresnet_xl,466.16,2196.648,1024,224,15.17,15.34,78.44
efficientnet_b3_gn,464.67,550.914,256,320,2.14,28.83,11.73
xcit_large_24_p16_224_dist,454.83,2251.393,1024,224,35.86,47.27,189.1
xcit_large_24_p16_224,454.8,2251.543,1024,224,35.86,47.27,189.1
tf_efficientnetv2_m_in21ft1k,444.81,1726.544,768,480,24.76,89.84,54.14
tf_efficientnetv2_m,444.79,1726.633,768,480,24.76,89.84,54.14
xcit_small_24_p16_384_dist,426.37,2401.669,1024,384,26.72,68.58,47.67
regnety_160,422.88,908.045,384,288,26.37,38.07,83.59
nf_regnet_b5,413.3,1238.797,512,456,11.7,61.95,49.74
swinv2_cr_tiny_384,408.94,625.992,256,384,15.34,161.01,28.33
swinv2_cr_large_224,406.2,1890.673,768,224,35.1,78.42,196.68
resnetv2_50x1_bitm,400.73,958.233,384,448,16.62,44.46,25.55
regnetz_d8_evos,392.85,1954.947,768,320,7.03,38.92,23.46
convmixer_1024_20_ks9_p14,390.62,2621.469,1024,224,5.55,5.51,24.38
efficientnet_b3_g8_gn,373.24,685.874,256,320,3.2,28.83,14.25
vit_large_patch16_224,370.72,2762.206,1024,224,61.6,63.52,304.33
convnext_xlarge_in22ft1k,368.8,1388.275,512,224,60.98,57.5,350.2
crossvit_15_dagger_408,368.65,694.421,256,408,21.45,95.05,28.5
vit_base_patch16_18x2_224,361.92,2829.305,1024,224,52.51,71.38,256.73
deit3_large_patch16_224_in21ft1k,358.3,2857.929,1024,224,61.6,63.52,304.37
deit3_large_patch16_224,357.82,2861.791,1024,224,61.6,63.52,304.37
swinv2_base_window16_256,346.22,1109.109,384,256,22.02,84.71,87.92
swinv2_base_window12to16_192to256_22kft1k,345.97,1109.898,384,256,22.02,84.71,87.92
nasnetalarge,345.75,1110.628,384,331,23.89,90.56,88.75
convnext_base_384_in22ft1k,345.74,1110.63,384,384,45.21,84.49,88.59
beit_large_patch16_224,343.01,2985.299,1024,224,61.6,63.52,304.43
ssl_resnext101_32x16d,338.92,1510.65,512,224,36.27,51.18,194.03
ig_resnext101_32x16d,338.75,1511.419,512,224,36.27,51.18,194.03
swsl_resnext101_32x16d,338.52,1512.441,512,224,36.27,51.18,194.03
tresnet_m_448,325.14,3149.347,1024,448,22.94,29.21,31.39
pnasnet5large,323.98,1185.25,384,331,25.04,92.89,86.06
regnetx_320,320.95,1196.437,384,224,31.81,36.3,107.81
xcit_small_24_p8_224,319.26,3207.434,1024,224,35.81,90.78,47.63
xcit_small_24_p8_224_dist,319.16,3208.44,1024,224,35.81,90.78,47.63
volo_d1_384,315.83,1621.098,512,384,22.75,108.55,26.78
nfnet_f1,306.71,3338.679,1024,320,35.97,46.77,132.63
ecaresnet269d,304.87,3358.754,1024,352,50.25,101.25,102.09
volo_d4_224,303.38,3375.331,1024,224,44.34,80.22,192.96
xcit_medium_24_p16_384_dist,297.67,2580.013,768,384,47.39,91.64,84.4
resnetrs270,296.93,3448.599,1024,352,51.13,105.48,129.86
resnetv2_152x2_bit_teacher,290.64,2642.472,768,224,46.95,45.11,236.34
vit_base_patch16_384,289.22,1327.696,384,384,55.54,101.56,86.86
deit_base_patch16_384,289.04,1328.502,384,384,55.54,101.56,86.86
deit_base_distilled_patch16_384,285.12,1346.806,384,384,55.65,101.82,87.63
dm_nfnet_f1,282.58,3623.721,1024,320,35.97,46.77,132.63
deit3_base_patch16_384,281.04,1366.341,384,384,55.54,101.56,86.88
deit3_base_patch16_384_in21ft1k,280.92,1366.918,384,384,55.54,101.56,86.88
efficientnet_b6,279.71,457.611,128,528,19.4,167.39,43.04
cait_xxs24_384,275.0,3723.573,1024,384,9.63,122.66,12.03
vit_large_patch14_224,271.58,3770.566,1024,224,81.08,88.79,304.2
crossvit_18_dagger_408,269.56,712.259,192,408,32.47,124.87,44.61
tf_efficientnet_b6_ap,267.5,478.495,128,528,19.4,167.39,43.04
tf_efficientnet_b6,267.48,478.534,128,528,19.4,167.39,43.04
tf_efficientnet_b6_ns,267.38,478.715,128,528,19.4,167.39,43.04
efficientnetv2_l,254.55,2011.402,512,480,56.4,157.99,118.52
resnetv2_101x1_bitm,252.46,1521.025,384,448,31.65,64.93,44.54
tf_efficientnetv2_l,251.41,2036.496,512,480,56.4,157.99,118.52
tf_efficientnetv2_l_in21ft1k,251.09,2039.122,512,480,56.4,157.99,118.52
swinv2_cr_small_384,250.5,1021.951,256,384,29.7,298.03,49.7
beit_base_patch16_384,248.36,1546.143,384,384,55.54,101.56,86.74
xcit_tiny_24_p8_384_dist,246.6,4152.437,1024,384,27.05,132.95,12.11
vit_large_r50_s32_384,246.12,2080.308,512,384,57.43,76.52,329.09
eca_nfnet_l2,237.09,3239.214,768,384,30.05,68.28,56.72
resmlp_big_24_224,228.25,4486.35,1024,224,100.23,87.31,129.14
resmlp_big_24_224_in22ft1k,227.96,4491.908,1024,224,100.23,87.31,129.14
resmlp_big_24_distilled_224,227.94,4492.471,1024,224,100.23,87.31,129.14
xcit_medium_24_p8_224_dist,222.27,3455.29,768,224,63.53,121.23,84.32
xcit_medium_24_p8_224,222.27,3455.239,768,224,63.53,121.23,84.32
swin_base_patch4_window12_384,221.01,868.732,192,384,47.19,134.78,87.9
swinv2_large_window12to16_192to256_22kft1k,212.32,1205.699,256,256,47.81,121.53,196.74
xcit_small_12_p8_384_dist,207.32,1852.22,384,384,54.92,138.29,26.21
resnest200e,201.89,2536.056,512,320,35.69,82.78,70.2
volo_d5_224,200.39,5110.009,1024,224,72.4,118.11,295.46
resnetrs350,194.78,3942.989,768,384,77.59,154.74,163.96
convnext_large_384_in22ft1k,191.43,1337.313,256,384,101.1,126.74,197.77
cait_xs24_384,190.0,4042.088,768,384,19.28,183.98,26.67
vit_base_patch8_224,188.23,1360.04,256,224,78.22,161.69,86.58
cait_xxs36_384,183.87,5569.221,1024,384,14.35,183.7,17.37
swinv2_cr_base_384,178.69,1432.608,256,384,50.57,333.68,87.88
vit_base_r50_s16_384,176.73,1448.509,256,384,67.43,135.03,98.95
vit_base_resnet50_384,176.72,1448.639,256,384,67.43,135.03,98.95
volo_d2_384,176.17,2179.696,384,384,46.17,184.51,58.87
swinv2_cr_huge_224,175.72,2185.293,384,224,115.97,121.08,657.83
nfnet_f2,172.68,5929.942,1024,352,63.22,79.06,193.78
xcit_large_24_p16_384_dist,168.33,3041.628,512,384,105.35,137.17,189.1
densenet264d_iabn,166.37,6155.083,1024,224,13.47,14.0,72.74
efficientnet_b7,161.46,594.574,96,600,38.33,289.94,66.35
efficientnetv2_xl,159.28,2410.894,384,512,93.85,247.32,208.12
dm_nfnet_f2,159.16,4825.445,768,352,63.22,79.06,193.78
tf_efficientnetv2_xl_in21ft1k,158.05,2429.572,384,512,93.85,247.32,208.12
tf_efficientnet_b7,155.86,615.938,96,600,38.33,289.94,66.35
tf_efficientnet_b7_ap,155.83,616.029,96,600,38.33,289.94,66.35
tf_efficientnet_b7_ns,155.78,616.248,96,600,38.33,289.94,66.35
tresnet_l_448,151.99,6737.079,1024,448,43.5,47.56,55.99
cait_s24_384,148.32,3451.871,512,384,32.17,245.31,47.06
vit_huge_patch14_224,146.38,6995.606,1024,224,167.4,139.41,632.05
ig_resnext101_32x32d,143.03,1789.868,256,224,87.29,91.12,468.53
resnetrs420,142.89,5374.778,768,416,108.45,213.79,191.89
deit3_huge_patch14_224,142.28,7197.12,1024,224,167.4,139.41,632.13
deit3_huge_patch14_224_in21ft1k,142.06,7208.264,1024,224,167.4,139.41,632.13
eca_nfnet_l3,132.47,3864.977,512,448,52.55,118.4,72.04
swin_large_patch4_window12_384,130.79,978.633,128,384,104.08,202.16,196.74
convnext_xlarge_384_in22ft1k,126.14,2029.4,256,384,179.2,168.99,350.2
xcit_large_24_p8_224,125.6,4076.305,512,224,141.23,181.56,188.93
xcit_large_24_p8_224_dist,125.51,4079.48,512,224,141.23,181.56,188.93
tresnet_xl_448,111.87,9153.659,1024,448,60.65,61.31,78.44
xcit_small_24_p8_384_dist,108.62,3535.115,384,384,105.24,265.91,47.63
swinv2_cr_large_384,108.25,1182.395,128,384,108.95,404.96,196.68
efficientnet_b8,101.79,943.075,96,672,63.48,442.89,87.41
resnetv2_50x3_bitm,101.1,1266.075,128,448,145.7,133.37,217.32
cait_s36_384,99.17,5162.791,512,384,47.99,367.4,68.37
tf_efficientnet_b8,98.82,971.416,96,672,63.48,442.89,87.41
tf_efficientnet_b8_ap,98.81,971.518,96,672,63.48,442.89,87.41
resnetv2_152x2_bit_teacher_384,98.49,2599.208,256,384,136.16,132.56,236.34
vit_large_patch16_384,97.65,2621.481,256,384,191.21,270.24,304.72
vit_giant_patch14_224,95.7,8025.142,768,224,267.18,192.64,1012.61
deit3_large_patch16_384,94.92,2697.101,256,384,191.21,270.24,304.76
deit3_large_patch16_384_in21ft1k,94.92,2697.083,256,384,191.21,270.24,304.76
swinv2_base_window12to24_192to384_22kft1k,94.52,677.087,64,384,55.25,280.36,87.92
nfnet_f3,93.84,8184.276,768,416,115.58,141.78,254.92
resnest269e,93.31,4115.183,384,416,77.69,171.98,110.93
dm_nfnet_f3,86.56,5914.801,512,416,115.58,141.78,254.92
beit_large_patch16_384,84.77,3019.757,256,384,191.21,270.24,305.0
volo_d3_448,76.33,2515.269,192,448,96.33,446.83,86.63
xcit_medium_24_p8_384_dist,75.97,3369.835,256,384,186.67,354.73,84.32
ig_resnext101_32x48d,74.47,2578.247,192,224,153.57,131.06,828.41
resnetv2_152x2_bitm,72.95,2631.902,192,448,184.99,180.43,236.34
tf_efficientnet_l2_ns_475,63.13,1013.713,64,475,172.11,609.89,480.31
resnetv2_101x3_bitm,60.06,2131.166,128,448,280.33,194.78,387.93
swinv2_large_window12to24_192to384_22kft1k,59.82,802.459,48,384,116.15,407.83,196.74
vit_gigantic_patch14_224,57.64,8883.342,512,224,483.95,275.37,1844.44
volo_d4_448,56.09,3423.322,192,448,197.13,527.35,193.41
convmixer_1536_20,53.95,18979.912,1024,224,48.68,33.03,51.63
swinv2_cr_giant_224,50.56,2531.46,128,224,483.85,309.15,2598.76
nfnet_f4,49.93,10254.329,512,512,216.26,262.26,316.07
swinv2_cr_huge_384,47.13,1357.909,64,384,352.04,583.18,657.94
dm_nfnet_f4,45.97,8353.673,384,512,216.26,262.26,316.07
xcit_large_24_p8_384_dist,42.84,4481.402,192,384,415.0,531.82,188.93
volo_d5_448,38.62,3314.317,128,448,315.06,737.92,295.91
nfnet_f5,37.03,10370.994,384,544,290.97,349.71,377.21
dm_nfnet_f5,33.93,11318.026,384,544,290.97,349.71,377.21
beit_large_patch16_512,33.87,2834.401,96,512,362.24,656.39,305.67
cait_m36_384,32.22,7945.944,256,384,173.11,734.81,271.22
nfnet_f6,28.33,13554.319,384,576,378.69,452.2,438.36
volo_d5_512,26.99,4742.789,128,512,425.09,1105.37,296.09
dm_nfnet_f6,26.14,9792.719,256,576,378.69,452.2,438.36
resnetv2_152x4_bitm,24.34,2629.892,64,480,844.84,414.26,936.53
efficientnet_l2,23.12,1037.889,24,800,479.12,1707.39,480.31
tf_efficientnet_l2_ns,22.7,1057.422,24,800,479.12,1707.39,480.31
nfnet_f7,22.34,11460.175,256,608,480.39,570.85,499.5
swinv2_cr_giant_384,14.63,2187.208,32,384,1450.71,1394.86,2598.76
cait_m48_448,13.64,9385.159,128,448,329.41,1708.23,356.46
hf_public_repos/pytorch-image-models/results/benchmark-train-amp-nchw-pt111-cu113-rtx3090.csv
model,train_samples_per_sec,train_step_time,train_batch_size,train_img_size,param_count
tinynet_e,9380.97,53.881,512,106,2.04
mobilenetv3_small_050,7276.68,69.643,512,224,1.59
tf_mobilenetv3_small_minimal_100,6334.14,80.291,512,224,2.04
mobilenetv3_small_075,5920.21,85.765,512,224,2.04
lcnet_035,5760.61,88.397,512,224,1.64
mobilenetv3_small_100,5583.48,90.99,512,224,2.54
tf_mobilenetv3_small_075,5569.37,91.204,512,224,2.04
levit_128s,5426.95,93.402,512,224,7.78
lcnet_050,5425.23,93.895,512,224,1.88
tf_mobilenetv3_small_100,5275.47,96.328,512,224,2.54
tinynet_d,4879.29,104.179,512,152,2.34
mixer_s32_224,4491.84,113.42,512,224,19.1
vit_small_patch32_224,4321.32,117.658,512,224,22.88
lcnet_075,4147.31,122.971,512,224,2.36
levit_128,3971.6,127.764,512,224,9.21
vit_tiny_r_s16_p8_224,3957.53,128.516,512,224,6.34
lcnet_100,3524.02,144.807,512,224,2.95
mnasnet_small,3488.23,145.878,512,224,2.03
levit_192,3436.74,147.842,512,224,10.95
regnetx_002,3423.6,148.87,512,224,2.68
regnety_002,3178.35,160.128,512,224,3.16
mobilenetv2_035,3069.55,166.033,512,224,1.68
gernet_s,2949.06,172.893,512,224,8.17
mnasnet_050,2821.34,180.673,512,224,2.22
ssl_resnet18,2720.94,187.823,512,224,11.69
swsl_resnet18,2720.55,187.842,512,224,11.69
gluon_resnet18_v1b,2718.42,187.998,512,224,11.69
resnet18,2711.83,188.459,512,224,11.69
semnasnet_050,2700.91,188.67,512,224,2.08
tinynet_c,2654.89,191.851,512,184,2.46
mobilenetv2_050,2650.81,192.34,512,224,1.97
levit_256,2620.65,194.218,512,224,18.89
lcnet_150,2613.69,195.4,512,224,4.5
regnetx_004,2532.79,201.109,512,224,5.16
seresnet18,2519.64,202.716,512,224,11.78
legacy_seresnet18,2461.74,207.479,512,224,11.78
ese_vovnet19b_slim_dw,2456.97,207.908,512,224,1.9
mobilenetv3_large_075,2342.92,217.677,512,224,3.99
tf_mobilenetv3_large_minimal_100,2311.18,220.824,512,224,3.92
vit_tiny_patch16_224,2249.49,226.757,512,224,5.72
levit_256d,2248.95,226.128,512,224,26.21
deit_tiny_patch16_224,2244.74,227.248,512,224,5.72
tf_mobilenetv3_large_075,2222.25,229.522,512,224,3.99
deit_tiny_distilled_patch16_224,2207.23,231.107,512,224,5.91
ghostnet_050,2193.43,232.046,512,224,2.59
mnasnet_075,2163.05,235.897,512,224,3.17
mobilenetv3_rw,2147.03,237.6,512,224,5.48
mobilenetv3_large_100_miil,2135.44,238.882,512,224,5.48
mobilenetv3_large_100,2134.66,238.979,512,224,5.48
resnet18d,2086.64,244.99,512,224,11.71
pit_ti_distilled_224,2085.77,244.59,512,224,5.1
pit_ti_224,2083.57,244.847,512,224,4.85
hardcorenas_a,2050.08,249.015,512,224,5.26
regnetx_006,2048.56,249.103,512,224,6.2
tf_mobilenetv3_large_100,2046.24,249.335,512,224,5.48
ese_vovnet19b_slim,1997.73,255.915,512,224,3.17
mnasnet_100,1996.94,255.609,512,224,4.38
mnasnet_b1,1993.53,256.025,512,224,4.38
xcit_nano_12_p16_224_dist,1946.62,261.255,512,224,3.05
xcit_nano_12_p16_224,1946.1,261.308,512,224,3.05
semnasnet_075,1927.77,264.67,512,224,2.91
hardcorenas_b,1912.19,266.758,512,224,5.18
mobilenetv2_075,1888.29,270.368,512,224,2.64
hardcorenas_c,1885.12,270.645,512,224,5.52
tf_efficientnetv2_b0,1880.38,271.08,512,224,7.14
tinynet_b,1857.15,274.614,512,188,3.73
regnety_004,1855.83,274.756,512,224,4.34
resnetblur18,1846.1,276.978,512,224,11.69
regnety_006,1808.76,282.043,512,224,6.06
hardcorenas_d,1808.52,281.873,512,224,7.5
mnasnet_a1,1796.84,284.044,512,224,3.89
semnasnet_100,1784.41,286.019,512,224,3.89
spnasnet_100,1782.41,286.288,512,224,4.42
skresnet18,1776.65,287.57,512,224,11.96
mobilenetv2_100,1761.25,289.902,512,224,3.5
mixer_b32_224,1711.23,298.43,512,224,60.29
regnetx_008,1669.43,305.887,512,224,7.26
levit_384,1662.81,306.81,512,224,39.13
vit_base_patch32_224,1658.09,307.969,512,224,88.22
vit_base_patch32_224_sam,1657.82,308.015,512,224,88.22
efficientnet_lite0,1636.56,312.086,512,224,4.65
visformer_tiny,1629.51,313.524,512,224,10.32
gluon_resnet34_v1b,1626.25,314.261,512,224,21.8
resnet34,1624.32,314.625,512,224,21.8
tv_resnet34,1622.42,314.985,512,224,21.8
ghostnet_100,1610.26,316.618,512,224,5.18
hardcorenas_f,1606.37,317.568,512,224,8.2
pit_xs_distilled_224,1596.23,319.885,512,224,11.0
pit_xs_224,1595.39,320.023,512,224,10.62
hardcorenas_e,1595.11,319.841,512,224,8.07
tinynet_a,1592.8,320.187,512,192,6.19
resmlp_12_distilled_224,1562.43,326.888,512,224,15.35
resmlp_12_224,1562.2,326.933,512,224,15.35
regnety_008,1550.03,329.315,512,224,6.26
tf_efficientnet_lite0,1547.08,330.18,512,224,4.65
mixer_s16_224,1538.34,332.27,512,224,18.53
fbnetc_100,1532.35,333.142,512,224,5.57
seresnet34,1494.24,341.776,512,224,21.96
mnasnet_140,1493.96,341.916,512,224,7.12
nf_regnet_b0,1481.15,344.453,512,256,8.76
legacy_seresnet34,1458.06,350.271,512,224,21.96
gmixer_12_224,1449.53,352.403,512,224,12.7
gernet_m,1442.79,354.129,512,224,21.14
ese_vovnet19b_dw,1440.24,354.984,512,224,6.54
vit_small_patch32_384,1423.05,358.931,512,384,22.92
nf_resnet26,1422.6,359.386,512,224,16.0
efficientnet_b0,1413.79,270.572,384,224,5.29
dla46_c,1408.04,362.893,512,224,1.3
rexnetr_100,1385.71,275.935,384,224,4.88
mobilenetv2_110d,1384.57,276.313,384,224,4.52
rexnet_100,1381.06,276.903,384,224,4.8
vit_tiny_r_s16_p8_384,1371.38,279.143,384,384,6.36
ghostnet_130,1367.64,372.982,512,224,7.36
resnet34d,1367.17,373.876,512,224,21.82
tf_efficientnet_b0,1353.9,282.522,384,224,5.29
tf_efficientnet_b0_ap,1353.18,282.683,384,224,5.29
tf_efficientnet_b0_ns,1352.63,282.744,384,224,5.29
selecsls42,1349.71,378.706,512,224,30.35
selecsls42b,1347.34,379.337,512,224,32.46
gmlp_ti16_224,1340.35,284.915,384,224,5.87
regnetz_005,1326.54,384.587,512,224,7.12
crossvit_tiny_240,1322.74,385.447,512,240,7.01
resnet26,1321.4,386.974,512,224,16.0
semnasnet_140,1315.96,388.179,512,224,6.11
xcit_tiny_12_p16_224,1309.91,389.1,512,224,6.72
xcit_tiny_12_p16_224_dist,1303.81,390.894,512,224,6.72
hrnet_w18_small,1302.26,391.765,512,224,13.19
efficientnet_b1_pruned,1277.21,399.368,512,240,6.33
mobilevit_xxs,1270.89,301.046,384,256,1.27
mobilenetv2_140,1250.7,306.236,384,224,6.11
poolformer_s12,1245.33,410.466,512,224,11.92
crossvit_9_240,1236.49,309.127,384,240,8.55
nf_seresnet26,1207.14,423.49,512,224,17.4
tf_efficientnetv2_b1,1198.02,319.027,384,240,8.14
repvgg_b0,1184.29,431.28,512,224,15.82
crossvit_9_dagger_240,1177.9,324.572,384,240,8.78
selecsls60,1152.93,443.193,512,224,30.67
mixnet_s,1150.64,443.717,512,224,4.13
selecsls60b,1149.12,444.678,512,224,32.77
efficientnet_es,1142.77,447.293,512,224,5.44
efficientnet_es_pruned,1141.22,447.892,512,224,5.44
nf_ecaresnet26,1137.4,449.596,512,224,16.0
resnet26d,1125.5,454.383,512,224,16.01
tf_efficientnet_es,1120.47,456.18,512,224,5.44
efficientnet_lite1,1109.69,229.723,256,240,5.42
rexnetr_130,1097.15,232.191,256,224,7.61
tf_mixnet_s,1090.03,468.412,512,224,4.13
convit_tiny,1087.94,469.614,512,224,5.71
convnext_nano_hnf,1075.51,356.223,384,224,15.59
dla34,1072.51,476.787,512,224,15.74
tf_efficientnet_lite1,1061.0,240.288,256,240,5.42
dla46x_c,1058.5,482.972,512,224,1.07
rexnet_130,1051.04,242.43,256,224,7.56
regnetx_016,1042.11,490.415,512,224,9.19
mobilenetv2_120d,1036.53,245.787,256,224,5.83
skresnet34,1033.13,494.425,512,224,22.28
vit_small_patch16_224,1032.86,370.943,384,224,22.05
deit_small_patch16_224,1031.61,371.386,384,224,22.05
deit_small_distilled_patch16_224,1011.7,378.721,384,224,22.44
dla60x_c,1011.12,505.401,512,224,1.32
ecaresnet50d_pruned,1009.57,506.231,512,224,19.94
gernet_l,1007.43,507.303,512,256,31.08
efficientnet_b0_g16_evos,992.51,385.814,384,224,8.11
rexnetr_150,973.93,261.684,256,224,9.78
vit_base2_patch32_256,970.35,526.818,512,256,119.46
pit_s_distilled_224,963.9,264.693,256,224,24.04
pit_s_224,963.25,264.885,256,224,23.46
fbnetv3_b,936.5,408.4,384,256,8.6
rexnet_150,934.54,272.774,256,224,9.73
repvgg_a2,931.84,548.616,512,224,28.21
legacy_seresnext26_32x4d,930.69,411.955,384,224,16.79
regnety_016,921.99,553.558,512,224,11.2
resnest14d,909.12,562.716,512,224,10.61
efficientnet_cc_b0_4e,892.75,428.931,384,224,13.31
efficientnet_cc_b0_8e,889.3,430.599,384,224,24.01
coat_lite_tiny,885.61,432.738,384,224,5.72
nf_regnet_b1,883.44,578.138,512,288,10.22
resnetv2_50,883.05,579.001,512,224,25.55
resnext26ts,882.89,434.405,384,256,10.3
resnet26t,880.13,581.227,512,256,16.01
tf_efficientnetv2_b2,877.18,290.276,256,260,10.1
fbnetv3_d,875.63,290.586,256,256,10.31
nf_regnet_b2,875.24,583.366,512,272,14.31
tf_efficientnet_cc_b0_4e,872.5,438.913,384,224,13.31
tf_efficientnet_cc_b0_8e,867.49,441.45,384,224,24.01
efficientnet_b0_gn,860.35,296.448,256,224,5.29
eca_resnext26ts,852.81,299.643,256,256,10.3
seresnext26ts,848.06,301.17,256,256,10.39
botnet26t_256,835.32,459.139,384,256,12.49
gluon_resnet50_v1b,835.19,458.946,384,224,25.56
twins_svt_small,835.02,458.264,384,224,24.06
swsl_resnet50,834.38,459.391,384,224,25.56
resnet50,833.17,460.051,384,224,25.56
tf_efficientnet_b1_ap,832.71,305.903,256,240,7.79
tf_efficientnet_b1,832.37,306.044,256,240,7.79
tf_efficientnet_b1_ns,831.62,306.366,256,240,7.79
tv_resnet50,831.54,460.947,384,224,25.56
efficientnet_b2_pruned,831.45,306.38,256,260,8.31
seresnext26tn_32x4d,831.09,461.371,384,224,16.81
seresnext26d_32x4d,830.66,461.598,384,224,16.81
gcresnext26ts,830.6,307.372,256,256,10.48
seresnext26t_32x4d,830.1,461.943,384,224,16.81
ssl_resnet50,828.96,462.397,384,224,25.56
coat_lite_mini,825.96,464.063,384,224,11.01
vgg11,818.58,625.301,512,224,132.86
halonet26t,813.08,471.69,384,256,12.48
vit_small_resnet26d_224,812.32,471.613,384,224,63.61
eca_botnext26ts_256,804.88,317.468,256,256,10.59
efficientnet_lite2,800.14,318.978,256,260,6.09
efficientnet_b1,797.8,319.366,256,256,7.79
resnetv2_50t,793.9,644.111,512,224,25.57
ecaresnext26t_32x4d,792.87,483.752,384,224,15.41
ecaresnext50t_32x4d,790.88,484.99,384,224,15.41
resnetv2_50d,790.5,646.873,512,224,25.57
tresnet_m,787.27,647.568,512,224,31.39
convnext_tiny,782.73,326.109,256,224,28.59
eca_halonext26ts,781.32,327.03,256,256,10.76
vovnet39a,779.14,656.53,512,224,22.6
ecaresnet101d_pruned,778.75,655.702,512,224,24.88
mixnet_m,778.63,491.619,384,224,5.01
cspresnet50,774.32,495.068,384,256,21.62
tf_efficientnet_lite2,771.04,331.043,256,260,6.09
gluon_resnet50_v1c,769.5,498.184,384,224,25.58
ecaresnetlight,767.59,666.098,512,224,30.16
resmlp_24_224,766.31,332.575,256,224,30.02
resmlp_24_distilled_224,766.21,332.58,256,224,30.02
cspresnext50,765.28,500.93,384,224,20.57
resnet50t,753.13,509.003,384,224,25.57
resnet50d,752.95,509.12,384,224,25.58
gluon_resnet50_v1d,752.94,509.122,384,224,25.58
legacy_seresnet50,751.28,510.001,384,224,28.09
mobilevit_xs,751.23,339.638,256,256,2.32
tf_mixnet_m,748.19,511.684,384,224,5.01
ese_vovnet39b,748.06,683.786,512,224,24.57
resnet32ts,744.11,343.453,256,256,17.96
visformer_small,743.29,515.928,384,224,40.22
dpn68b,737.5,519.461,384,224,12.61
selecsls84,735.99,694.378,512,224,50.95
resnet33ts,732.63,348.809,256,256,19.68
nf_seresnet50,732.06,523.375,384,224,28.09
rexnetr_200,724.64,263.786,192,224,16.52
lambda_resnet26t,724.61,529.342,384,256,10.96
seresnet50,722.32,530.5,384,224,28.09
dpn68,721.48,531.095,384,224,12.61
res2net50_48w_2s,717.64,534.245,384,224,25.29
gmixer_24_224,717.31,355.331,256,224,24.72
eca_resnet33ts,712.11,358.824,256,256,19.68
seresnet33ts,709.27,360.12,256,256,19.78
bat_resnext26ts,708.45,360.137,256,256,10.73
rexnet_200,703.57,271.754,192,224,16.37
resnetblur50,702.19,546.026,384,224,25.56
cspresnet50d,701.76,546.305,384,256,21.64
vgg11_bn,698.67,549.368,384,224,132.87
twins_pcpvt_small,698.21,364.986,256,224,24.11
eca_vovnet39b,697.82,733.074,512,224,22.6
efficientnet_em,694.64,551.807,384,240,6.9
dla60,693.8,552.499,384,224,22.04
tf_efficientnet_em,692.96,368.419,256,240,6.9
cspresnet50w,692.76,553.433,384,256,28.12
resnest26d,688.5,556.976,384,224,17.07
gcresnet33ts,687.02,371.584,256,256,19.88
tv_densenet121,686.65,371.007,256,224,7.98
densenet121,686.51,371.091,256,224,7.98
vit_small_r26_s32_224,685.79,371.994,256,224,36.43
convnext_tiny_hnf,683.74,373.449,256,224,28.59
xcit_nano_12_p16_384_dist,682.7,373.202,256,384,3.05
xcit_tiny_24_p16_224_dist,681.26,372.348,256,224,12.12
lambda_resnet26rpt_256,679.63,281.896,192,256,10.99
xcit_tiny_24_p16_224,679.38,373.443,256,224,12.12
vit_base_resnet26d_224,679.21,563.967,384,224,101.4
resnetaa50d,678.8,564.84,384,224,25.58
nf_ecaresnet50,673.73,568.984,384,224,25.56
hrnet_w18_small_v2,672.52,758.874,512,224,15.6
gluon_resnet50_v1s,671.98,570.574,384,224,25.68
efficientnet_b0_g8_gn,661.67,385.796,256,224,6.56
seresnet50t,659.6,581.03,384,224,28.1
efficientnet_b3_pruned,657.34,387.781,256,300,9.86
haloregnetz_b,656.48,388.419,256,224,11.68
tv_resnext50_32x4d,651.27,588.791,384,224,25.03
swsl_resnext50_32x4d,651.11,588.925,384,224,25.03
gluon_resnext50_32x4d,650.97,589.059,384,224,25.03
ssl_resnext50_32x4d,650.9,589.136,384,224,25.03
resnext50_32x4d,649.99,589.957,384,224,25.03
resnetrs50,648.71,590.781,384,224,35.69
densenet121d,646.33,394.275,256,224,8.0
regnetx_032,643.77,595.296,384,224,15.3
resnetblur50d,642.77,397.429,256,224,25.58
res2net50_26w_4s,636.71,601.86,384,224,25.7
swin_tiny_patch4_window7_224,635.13,402.044,256,224,28.29
skresnet50,634.96,603.369,384,224,25.8
poolformer_s24,634.53,402.132,256,224,21.39
gmlp_s16_224,632.58,301.939,192,224,19.42
ese_vovnet57b,632.35,606.334,384,224,38.61
crossvit_small_240,625.73,407.457,256,240,26.86
xcit_small_12_p16_224_dist,625.08,407.723,256,224,26.25
xcit_small_12_p16_224,624.67,407.992,256,224,26.25
densenetblur121d,617.78,412.569,256,224,8.0
ecaresnet50d,612.94,625.548,384,224,25.58
tf_efficientnet_b2_ns,610.27,313.114,192,260,9.11
tf_efficientnet_b2_ap,610.1,313.168,192,260,9.11
tf_efficientnet_b2,609.89,313.274,192,260,9.11
mixnet_l,603.6,634.707,384,224,7.33
gluon_inception_v3,601.71,636.771,384,299,23.83
inception_v3,601.07,637.475,384,299,23.83
adv_inception_v3,601.02,637.509,384,299,23.83
tf_inception_v3,600.65,637.912,384,299,23.83
sehalonet33ts,599.96,425.883,256,256,13.69
seresnetaa50d,599.53,425.823,256,224,28.11
resnext50d_32x4d,598.58,426.825,256,224,25.05
mobilevit_s,596.24,320.895,192,256,5.58
resnetv2_50x1_bit_distilled,588.8,325.252,192,224,25.55
gcresnet50t,583.98,436.858,256,256,25.9
skresnet50d,583.54,437.238,256,224,25.82
swin_s3_tiny_224,576.57,442.971,256,224,28.33
seresnext50_32x4d,576.37,443.036,256,224,27.56
gluon_seresnext50_32x4d,576.3,443.051,256,224,27.56
tf_mixnet_l,576.22,442.743,256,224,7.33
legacy_seresnext50_32x4d,575.52,443.681,256,224,27.56
res2next50,575.07,443.835,256,224,24.67
repvgg_b1g4,572.25,893.649,512,224,39.97
cspresnext50_iabn,572.06,669.169,384,256,20.57
convnext_tiny_hnfd,570.41,447.777,256,224,28.63
res2net50_14w_8s,569.05,447.639,256,224,25.06
resnest50d_1s4x24d,568.93,448.655,256,224,25.68
cait_xxs24_224,568.71,447.7,256,224,11.96
crossvit_15_240,567.12,336.74,192,240,27.53
semobilevit_s,566.94,337.375,192,256,5.74
densenet169,563.82,451.519,256,224,14.15
efficientnet_b2,561.37,340.497,192,288,9.11
efficientnet_cc_b1_8e,559.34,455.785,256,240,39.72
efficientnet_b2a,558.58,342.255,192,288,9.11
darknet53,556.85,458.909,256,256,41.61
dla60_res2net,555.54,459.405,256,224,20.85
dla60x,553.4,461.598,256,224,17.35
mixer_b16_224,552.46,462.598,256,224,59.88
nf_resnet101,550.61,695.793,384,224,44.55
tf_efficientnet_cc_b1_8e,549.81,463.949,256,240,39.72
regnetx_040,549.5,697.703,384,224,22.12
mixer_b16_224_miil,549.25,465.267,256,224,59.88
crossvit_15_dagger_240,547.66,348.734,192,240,28.21
nf_regnet_b3,547.65,465.691,256,320,18.59
vovnet57a,547.59,934.12,512,224,36.64
resnetv2_101,544.49,468.639,256,224,44.54
xcit_nano_12_p8_224,542.16,470.425,256,224,3.05
xcit_nano_12_p8_224_dist,541.55,470.928,256,224,3.05
tf_efficientnetv2_b3,541.54,352.751,192,300,14.36
sebotnet33ts_256,540.04,236.225,128,256,13.7
gcresnext50ts,538.97,354.583,192,256,15.67
nf_resnet50,536.27,715.168,384,288,25.56
resnet50_gn,532.97,359.41,192,224,25.56
vit_base_r26_s32_224,532.84,359.064,192,224,101.38
vit_base_patch32_384,531.15,481.156,256,384,88.3
efficientnetv2_rw_t,527.04,362.193,192,288,13.65
resnet101,526.31,484.87,256,224,44.55
gluon_resnet101_v1b,525.74,485.353,256,224,44.55
tv_resnet101,523.69,487.267,256,224,44.55
vit_large_patch32_224,515.11,495.389,256,224,306.54
vit_base_resnet50d_224,514.44,495.989,256,224,110.97
mixer_l32_224,507.93,376.544,192,224,206.94
resnetv2_101d,506.46,503.899,256,224,44.56
swin_v2_cr_tiny_224,505.66,378.391,192,224,28.33
vit_tiny_patch16_384,505.36,252.456,128,384,5.79
resmlp_36_224,505.0,378.005,192,224,44.69
resmlp_36_distilled_224,504.83,378.088,192,224,44.69
swin_v2_cr_tiny_ns_224,503.02,380.384,192,224,28.33
repvgg_b1,501.28,1020.314,512,224,57.42
dla60_res2next,500.08,510.484,256,224,17.03
gluon_resnet101_v1c,498.53,511.943,256,224,44.57
wide_resnet50_2,491.97,779.7,384,224,68.88
gluon_resnet101_v1d,491.71,519.004,256,224,44.57
resnest50d,484.85,526.671,256,224,27.48
vgg13,482.75,795.25,384,224,133.05
cspdarknet53,481.82,530.306,256,256,27.64
gc_efficientnetv2_rw_t,480.93,396.503,192,288,13.68
convnext_small,478.98,399.137,192,224,50.22
efficientnet_lite3,478.82,266.239,128,300,8.2
ecaresnet26t,478.49,534.465,256,320,16.01
nest_tiny,477.21,267.341,128,224,17.06
regnetz_b16,476.53,401.488,192,288,9.72
dla102,475.31,537.042,256,224,33.27
cspdarknet53_iabn,474.61,806.628,384,256,27.64
res2net50_26w_6s,474.4,537.903,256,224,37.05
jx_nest_tiny,469.53,271.701,128,224,17.06
twins_pcpvt_base,465.48,409.697,192,224,43.83
vgg13_bn,465.35,549.842,256,224,133.05
halonet50ts,463.11,413.595,192,256,22.73
regnetx_080,462.29,829.538,384,224,39.57
lambda_resnet50ts,461.63,414.909,192,256,21.54
coat_lite_small,461.47,414.609,192,224,19.84
legacy_seresnet101,460.47,553.804,256,224,49.33
vgg16,459.25,557.204,256,224,138.36
tf_efficientnet_lite3,458.84,277.849,128,300,8.2
resnetaa101d,458.35,556.916,256,224,44.57
gluon_resnet101_v1s,456.37,559.352,256,224,44.67
xcit_tiny_12_p16_384_dist,451.6,423.4,192,384,6.72
seresnet101,447.82,569.466,256,224,49.33
densenet201,447.49,426.067,192,224,20.01
mixnet_xl,447.09,570.692,256,224,11.9
nf_seresnet101,444.89,573.162,256,224,49.33
convit_small,441.84,433.517,192,224,27.78
resnetblur101d,441.53,578.26,256,224,44.57
nfnet_l0,426.47,599.103,256,288,35.07
skresnext50_32x4d,424.37,601.881,256,224,27.48
poolformer_s36,424.29,450.68,192,224,30.86
botnet50ts_256,421.6,302.649,128,256,22.74
halo2botnet50ts_256,418.8,457.379,192,256,22.64
gluon_resnext101_32x4d,418.26,610.512,256,224,44.18
resnext101_32x4d,417.6,611.486,256,224,44.18
ssl_resnext101_32x4d,417.52,611.578,256,224,44.18
swsl_resnext101_32x4d,417.45,611.654,256,224,44.18
fbnetv3_g,415.62,305.885,128,288,16.62
twins_svt_base,413.84,461.835,192,224,56.07
ese_vovnet39b_evos,413.01,308.979,128,224,24.58
res2net101_26w_4s,405.25,629.29,256,224,45.21
eca_nfnet_l0,404.73,631.533,256,288,24.14
tresnet_l,403.54,1265.42,512,224,55.99
lamhalobotnet50ts_256,403.43,474.897,192,256,22.57
volo_d1_224,400.45,478.074,192,224,26.63
crossvit_18_240,400.32,317.735,128,240,43.27
resnet51q,397.98,481.556,192,288,35.7
nf_ecaresnet101,396.22,644.266,256,224,44.55
dla102x,392.86,487.139,192,224,26.31
vit_base_patch16_224_miil,392.0,489.026,192,224,86.54
swin_small_patch4_window7_224,391.85,488.136,192,224,49.61
regnety_032,391.81,651.958,256,288,19.44
regnetx_064,390.82,654.137,256,224,26.21
res2net50_26w_8s,389.26,655.478,256,224,48.4
vgg16_bn,388.47,658.66,256,224,138.37
deit_base_patch16_224,386.56,495.833,192,224,86.57
crossvit_18_dagger_240,386.08,329.466,128,240,44.27
vit_base_patch16_224,385.82,496.783,192,224,86.57
resnest50d_4s2x40d,385.78,662.242,256,224,30.42
xception,384.95,331.72,128,299,22.86
vit_base_patch16_224_sam,384.94,497.961,192,224,86.57
ecaresnet101d,381.61,669.061,256,224,44.57
deit_base_distilled_patch16_224,379.86,504.583,192,224,87.34
resnetv2_152,379.0,673.236,256,224,60.19
repvgg_b2g4,377.95,1353.629,512,224,61.76
ese_vovnet99b,373.94,683.067,256,224,63.2
vit_small_resnet50d_s16_224,373.61,512.677,192,224,57.53
cait_xxs36_224,371.71,512.777,192,224,17.3
resnet152,371.52,514.561,192,224,60.19
tv_resnet152,371.15,515.007,192,224,60.19
gluon_resnet152_v1b,368.93,518.118,192,224,60.19
gluon_seresnext101_32x4d,368.19,519.228,192,224,48.96
seresnext101_32x4d,367.5,520.242,192,224,48.96
nfnet_f0,366.97,696.402,256,256,71.49
legacy_seresnext101_32x4d,366.06,522.375,192,224,48.96
tf_efficientnet_b3_ap,364.58,349.402,128,300,12.23
tf_efficientnet_b3_ns,364.49,349.53,128,300,12.23
tf_efficientnet_b3,364.34,349.633,128,300,12.23
resnetv2_152d,361.2,529.264,192,224,60.2
resnet61q,358.48,356.024,128,288,36.85
resnetv2_50d_frn,357.45,356.923,128,224,25.59
ese_vovnet99b_iabn,357.31,1071.637,384,224,63.2
gluon_resnet152_v1c,355.88,537.206,192,224,60.21
hrnet_w18,355.73,714.824,256,224,21.3
efficientnet_b3,355.08,358.849,128,320,12.23
efficientnet_b3a,354.83,359.115,128,320,12.23
regnety_040,354.78,539.617,192,288,20.65
beit_base_patch16_224,353.55,541.973,192,224,86.53
gluon_resnet152_v1d,353.52,540.757,192,224,60.21
regnety_040s_gn,353.11,360.919,128,224,20.65
vgg19,352.0,1090.659,384,224,143.67
xcit_tiny_12_p8_224,351.32,362.59,128,224,6.71
xcit_tiny_12_p8_224_dist,351.06,362.838,128,224,6.71
xception41p,343.38,371.913,128,299,26.91
regnetv_040,342.33,372.402,128,288,20.64
tnt_s_patch16_224,339.73,563.196,192,224,23.76
repvgg_b2,339.2,1508.352,512,224,89.02
vgg19_bn,336.08,761.352,256,224,143.68
gluon_resnet152_v1s,334.98,570.781,192,224,60.32
densenet161,332.41,382.595,128,224,28.68
dm_nfnet_f0,331.82,770.266,256,256,71.49
dla169,331.14,577.405,192,224,53.39
resnetv2_50d_gn,330.91,385.985,128,288,25.57
convnext_base_in22ft1k,328.04,388.471,128,224,88.59
convnext_tiny_in22ft1k,327.7,388.845,128,224,88.59
convnext_small_in22ft1k,326.36,390.508,128,224,88.59
convnext_base,325.78,391.109,128,224,88.59
xcit_small_24_p16_224_dist,322.26,393.856,128,224,47.67
xcit_small_24_p16_224,321.22,395.027,128,224,47.67
repvgg_b3g4,321.08,1194.912,384,224,83.83
pit_b_224,319.89,399.203,128,224,73.76
twins_pcpvt_large,319.03,397.109,128,224,60.99
pit_b_distilled_224,318.55,400.853,128,224,74.79
legacy_seresnet152,315.59,605.069,192,224,66.82
dpn92,314.54,812.398,256,224,37.67
inception_v4,312.24,612.734,192,299,42.68
regnetx_120,309.13,827.171,256,224,46.11
hrnet_w32,308.88,823.922,256,224,41.23
convmixer_1024_20_ks9_p14,308.12,830.025,256,224,24.38
ecaresnet50t,306.63,416.493,128,320,25.57
seresnet152,305.84,415.318,128,224,66.82
coat_tiny,303.51,419.779,128,224,5.5
vit_small_patch16_36x1_224,303.33,419.341,128,224,64.67
regnetz_c16,302.72,421.37,128,320,13.46
vit_small_patch16_18x2_224,302.0,421.199,128,224,64.67
hrnet_w30,301.17,845.104,256,224,37.71
swin_v2_cr_small_224,300.16,424.024,128,224,49.7
nest_small,296.43,322.186,96,224,38.35
tresnet_xl,295.35,1296.648,384,224,78.44
jx_nest_small,294.56,324.278,96,224,38.35
efficientnet_el,291.47,438.031,128,300,10.59
xception41,291.46,437.936,128,299,26.97
efficientnet_el_pruned,291.16,438.564,128,300,10.59
regnety_120,291.14,658.173,192,224,51.82
cait_s24_224,290.55,438.065,128,224,46.92
nf_regnet_b4,288.27,441.948,128,384,30.21
twins_svt_large,288.14,442.123,128,224,99.27
wide_resnet101_2,287.9,665.343,192,224,126.89
mixnet_xxl,287.84,442.746,128,224,23.96
tf_efficientnet_el,286.59,445.507,128,300,10.59
poolformer_m36,284.22,448.517,128,224,56.17
swin_s3_small_224,277.99,343.436,96,224,49.74
repvgg_b3,276.27,1388.871,384,224,123.09
swin_base_patch4_window7_224,273.38,466.273,128,224,87.77
gmlp_b16_224,266.77,358.289,96,224,73.08
gluon_resnext101_64x4d,266.55,478.624,128,224,83.46
dla102x2,265.99,479.601,128,224,41.28
resnet200,264.41,481.093,128,224,64.67
resnetv2_50d_evob,261.14,366.354,96,224,25.59
regnetx_160,260.8,735.124,192,224,54.28
xception65p,258.86,493.161,128,299,39.82
inception_resnet_v2,255.7,747.606,192,299,55.84
ens_adv_inception_resnet_v2,255.53,748.108,192,299,55.84
resnetrs101,253.22,503.169,128,288,63.62
resnext101_32x8d,252.1,506.144,128,224,88.79
ig_resnext101_32x8d,251.82,506.789,128,224,88.79
ssl_resnext101_32x8d,251.44,507.525,128,224,88.79
swsl_resnext101_32x8d,251.43,507.504,128,224,88.79
crossvit_base_240,250.95,380.881,96,240,105.03
dpn98,249.4,511.692,128,224,61.57
efficientnet_lite4,247.4,257.372,64,380,13.01
coat_mini,246.33,517.632,128,224,10.34
efficientnetv2_s,245.94,388.167,96,384,21.46
gluon_seresnext101_64x4d,243.95,522.441,128,224,88.23
tf_efficientnetv2_s_in21ft1k,243.54,391.981,96,384,21.46
tf_efficientnetv2_s,242.89,393.02,96,384,21.46
resnet101d,240.52,530.628,128,320,44.57
resnest101e,240.15,530.415,128,256,48.28
tf_efficientnet_lite4,239.36,265.97,64,380,13.01
vit_small_patch16_384,235.44,270.99,64,384,22.2
xcit_tiny_24_p16_384_dist,233.56,407.54,96,384,12.12
efficientnetv2_rw_s,232.48,273.03,64,384,23.94
regnety_064,232.21,549.529,128,288,30.58
xcit_medium_24_p16_224_dist,231.18,550.33,128,224,84.4
xcit_medium_24_p16_224,231.11,550.432,128,224,84.4
vit_large_r50_s32_224,229.5,415.85,96,224,328.99
regnetv_064,225.87,564.974,128,288,30.58
vit_small_r26_s32_384,225.43,282.622,64,384,36.47
convit_base,225.0,567.909,128,224,86.54
gluon_xception65,223.32,427.945,96,299,39.92
xception65,222.42,429.727,96,299,39.92
tnt_b_patch16_224,222.32,573.844,128,224,65.41
volo_d2_224,221.58,431.459,96,224,58.68
regnety_080,221.2,577.488,128,288,39.18
hrnet_w40,220.15,867.238,192,224,57.56
swin_s3_base_224,219.92,433.728,96,224,71.13
xcit_small_12_p16_384_dist,216.0,442.656,96,384,26.25
swin_v2_cr_base_224,214.24,445.643,96,224,87.88
hrnet_w48,213.16,895.867,192,224,77.47
resnetv2_50d_evos,211.1,226.199,48,288,25.59
nest_base,210.29,302.692,64,224,67.72
jx_nest_base,209.17,304.343,64,224,67.72
vit_base_r50_s16_224,208.73,458.333,96,224,98.66
tresnet_m_448,206.95,924.987,192,448,31.39
hrnet_w44,206.72,924.031,192,224,67.06
efficientnet_b4,203.38,312.633,64,384,19.34
regnetz_b16_evos,201.27,316.106,64,288,9.74
regnetz_040h,199.85,318.401,64,320,28.94
regnetz_040,199.66,318.691,64,320,27.12
efficientnet_b3_gn,199.47,319.133,64,320,11.73
densenet264,199.22,477.999,96,224,72.69
regnetz_d8,197.46,322.524,64,320,23.37
eca_nfnet_l1,189.9,672.191,128,320,41.41
tf_efficientnet_b4,188.81,336.904,64,380,19.34
tf_efficientnet_b4_ns,188.45,337.501,64,380,19.34
tf_efficientnet_b4_ap,188.4,337.642,64,380,19.34
poolformer_m48,187.63,509.206,96,224,73.47
dpn131,186.23,685.159,128,224,79.25
regnetz_d32,186.08,342.366,64,320,27.58
xcit_nano_12_p8_384_dist,183.21,347.477,64,384,3.05
convnext_large_in22ft1k,182.13,525.349,96,224,197.77
convnext_large,180.72,529.451,96,224,197.77
xcit_tiny_24_p8_224_dist,179.19,532.437,96,224,12.11
xcit_tiny_24_p8_224,179.14,532.524,96,224,12.11
dpn107,174.56,731.55,128,224,86.92
resnet152d,172.45,554.35,96,320,60.21
hrnet_w64,170.75,744.856,128,224,128.06
halonet_h1,170.55,373.795,64,256,8.1
xception71,168.15,378.514,64,299,42.34
mixer_l16_224,167.46,571.737,96,224,208.2
regnety_320,166.3,768.299,128,224,145.05
vit_large_patch32_384,165.23,579.454,96,384,306.63
densenet264d_iabn,165.07,1158.795,192,224,72.74
xcit_small_12_p8_224,164.33,387.646,64,224,26.21
xcit_small_12_p8_224_dist,164.16,388.107,64,224,26.21
efficientnet_b3_g8_gn,153.99,413.949,64,320,14.25
swin_large_patch4_window7_224,152.41,417.988,64,224,196.53
volo_d3_224,152.25,417.779,64,224,86.33
ecaresnet200d,150.97,632.522,96,256,64.69
seresnet200d,149.9,422.786,64,256,71.86
regnetx_320,146.61,871.937,128,224,107.81
seresnet152d,143.56,442.443,64,320,66.84
resnetv2_50x1_bitm,142.06,337.009,48,448,25.55
gluon_senet154,141.6,674.51,96,224,115.09
resnetrs152,141.55,448.821,64,320,86.62
senet154,141.16,676.729,96,224,115.09
legacy_senet154,139.88,683.0,96,224,115.09
regnety_160,136.04,704.372,96,288,83.59
xcit_large_24_p16_224_dist,130.76,486.175,64,224,189.1
xcit_large_24_p16_224,130.33,487.671,64,224,189.1
seresnext101_32x8d,126.87,502.294,64,288,93.57
nfnet_f1,125.34,763.802,96,320,132.63
resnet200d,124.61,510.654,64,320,64.69
regnetz_c16_evos,123.93,385.448,48,320,13.49
resnext101_64x4d,122.56,781.647,96,288,83.46
efficientnetv2_m,120.01,396.838,48,416,54.14
swin_v2_cr_large_224,119.95,397.782,48,224,196.68
xcit_tiny_12_p8_384_dist,119.32,400.563,48,384,6.71
vit_large_patch16_224,116.44,548.026,64,224,304.33
crossvit_15_dagger_408,116.23,273.461,32,408,28.5
seresnet269d,116.02,545.703,64,256,113.67
vit_base_patch16_18x2_224,115.21,552.796,64,224,256.73
convnext_xlarge_in22ft1k,113.61,561.618,64,224,350.2
convnext_base_384_in22ft1k,112.88,423.483,48,384,88.59
dm_nfnet_f1,112.71,565.567,64,320,132.63
convnext_tiny_384_in22ft1k,112.65,424.398,48,384,88.59
convnext_small_384_in22ft1k,112.59,424.604,48,384,88.59
nf_regnet_b5,112.22,567.655,64,456,49.74
swin_v2_cr_tiny_384,111.16,286.572,32,384,28.33
xcit_small_24_p16_384_dist,110.42,431.325,48,384,47.67
beit_large_patch16_224,107.44,593.678,64,224,304.43
regnetz_e8,103.46,461.955,48,320,57.7
efficientnetv2_rw_m,103.11,307.008,32,416,53.24
tresnet_l_448,101.52,1257.535,128,448,55.99
swsl_resnext101_32x16d,99.54,962.862,96,224,194.03
ig_resnext101_32x16d,99.3,965.129,96,224,194.03
ssl_resnext101_32x16d,99.25,965.706,96,224,194.03
volo_d1_384,98.94,322.06,32,384,26.78
resnetrs200,98.6,482.325,48,320,93.21
cait_xxs24_384,97.24,491.195,48,384,12.03
deit_base_patch16_384,95.92,332.748,32,384,86.86
vit_base_patch16_384,95.88,332.951,32,384,86.86
volo_d4_224,95.4,500.622,48,224,192.96
eca_nfnet_l2,94.36,675.671,64,384,56.72
deit_base_distilled_patch16_384,93.97,339.685,32,384,87.63
efficientnet_b5,90.42,351.458,32,456,30.39
tf_efficientnetv2_m,89.36,354.724,32,480,54.14
tf_efficientnetv2_m_in21ft1k,89.12,355.723,32,480,54.14
tf_efficientnet_b5,88.7,358.215,32,456,30.39
tf_efficientnet_b5_ap,88.68,358.368,32,456,30.39
tf_efficientnet_b5_ns,88.58,358.774,32,456,30.39
crossvit_18_dagger_408,87.86,362.177,32,408,44.61
resnetv2_101x1_bitm,87.29,364.971,32,448,44.54
convmixer_768_32,87.13,1100.537,96,224,21.11
resnetv2_152x2_bit_teacher,87.1,364.899,32,224,236.34
vit_large_patch14_224,84.6,565.739,48,224,304.2
regnetz_d8_evos,83.97,379.015,32,320,23.46
beit_base_patch16_384,82.88,385.038,32,384,86.74
xcit_small_24_p8_224_dist,82.62,383.85,32,224,47.63
xcit_small_24_p8_224,82.5,384.488,32,224,47.63
resnest200e,79.64,597.77,48,320,70.2
tresnet_xl_448,77.44,1236.244,96,448,78.44
xcit_medium_24_p16_384_dist,76.61,414.275,32,384,84.4
vit_large_r50_s32_384,76.27,417.132,32,384,329.09
swin_base_patch4_window12_384,73.39,434.11,32,384,87.9
nfnet_f2,71.5,668.121,48,352,193.78
resmlp_big_24_224_in22ft1k,68.15,468.085,32,224,129.14
resmlp_big_24_224,68.13,468.141,32,224,129.14
resmlp_big_24_distilled_224,68.08,468.542,32,224,129.14
pnasnet5large,66.87,474.686,32,331,86.06
swin_v2_cr_small_384,66.28,359.629,24,384,49.7
nasnetalarge,65.59,482.923,32,331,88.75
cait_xs24_384,65.34,487.167,32,384,26.67
dm_nfnet_f2,64.55,740.206,48,352,193.78
cait_xxs36_384,62.84,505.392,32,384,17.37
vit_base_patch8_224,62.83,381.142,24,224,86.58
convnext_large_384_in22ft1k,61.61,517.714,32,384,197.77
ecaresnet269d,61.52,515.695,32,352,102.09
volo_d5_224,61.49,517.153,32,224,295.46
xcit_tiny_24_p8_384_dist,60.46,525.849,32,384,12.11
xcit_medium_24_p8_224,59.24,536.775,32,224,84.32
xcit_medium_24_p8_224_dist,59.23,536.852,32,224,84.32
vit_base_resnet50_384,58.93,405.608,24,384,98.95
vit_base_r50_s16_384,58.91,405.68,24,384,98.95
resnetrs270,58.9,537.38,32,352,129.86
xcit_small_12_p8_384_dist,56.03,426.632,24,384,26.21
volo_d2_384,55.33,287.363,16,384,58.87
ig_resnext101_32x32d,52.68,605.908,32,224,468.53
eca_nfnet_l3,50.62,628.636,32,448,72.04
convmixer_1536_20,49.63,966.338,48,224,51.63
cait_s24_384,49.12,486.034,24,384,47.06
efficientnet_b6,48.5,327.112,16,528,43.04
tf_efficientnet_b6_ns,48.02,330.391,16,528,43.04
tf_efficientnet_b6_ap,47.83,331.608,16,528,43.04
tf_efficientnet_b6,47.78,331.966,16,528,43.04
efficientnetv2_l,47.17,334.927,16,480,118.52
swin_v2_cr_base_384,47.07,337.445,16,384,87.88
tf_efficientnetv2_l_in21ft1k,46.85,337.127,16,480,118.52
tf_efficientnetv2_l,46.53,339.329,16,480,118.52
swin_v2_cr_huge_224,46.2,343.813,16,224,657.83
xcit_large_24_p16_384_dist,44.95,530.527,24,384,189.1
swin_large_patch4_window12_384,41.24,386.07,16,384,196.74
vit_huge_patch14_224,39.65,401.44,16,224,632.05
resnetrs350,37.15,638.194,24,384,163.96
convnext_xlarge_384_in22ft1k,36.94,431.409,16,384,350.2
nfnet_f3,35.43,673.367,24,416,254.92
resnest269e,33.7,705.124,24,416,110.93
xcit_large_24_p8_224_dist,33.34,476.606,16,224,188.93
xcit_large_24_p8_224,33.33,476.71,16,224,188.93
dm_nfnet_f3,32.28,738.882,24,416,254.92
resnetv2_50x3_bitm,31.96,499.813,16,448,217.32
cait_s36_384,31.71,500.682,16,384,68.37
resnetv2_152x2_bit_teacher_384,31.41,506.876,16,384,236.34
tf_efficientnetv2_xl_in21ft1k,31.07,380.33,12,512,208.12
efficientnetv2_xl,30.57,386.62,12,512,208.12
efficientnet_b7,30.54,258.47,8,600,66.35
tf_efficientnet_b7,30.24,261.049,8,600,66.35
tf_efficientnet_b7_ap,30.18,261.545,8,600,66.35
tf_efficientnet_b7_ns,30.17,261.527,8,600,66.35
vit_large_patch16_384,29.1,410.723,12,384,304.72
xcit_small_24_p8_384_dist,28.47,558.499,16,384,47.63
swin_v2_cr_large_384,28.33,421.122,12,384,196.68
ig_resnext101_32x48d,27.82,573.581,16,224,828.41
resnetrs420,25.59,615.408,16,416,191.89
beit_large_patch16_384,24.96,478.806,12,384,305.0
volo_d3_448,23.79,333.801,8,448,86.63
vit_giant_patch14_224,22.41,354.342,8,224,1012.61
resnetv2_152x2_bitm,21.79,364.675,8,448,236.34
xcit_medium_24_p8_384_dist,19.42,408.574,8,384,84.32
nfnet_f4,18.89,630.122,12,512,316.07
resnetv2_101x3_bitm,17.61,452.554,8,448,387.93
dm_nfnet_f4,17.17,693.324,12,512,316.07
volo_d4_448,16.84,353.766,6,448,193.41
efficientnet_b8,14.4,412.697,6,672,87.41
tf_efficientnet_b8_ap,14.32,415.19,6,672,87.41
tf_efficientnet_b8,14.24,417.31,6,672,87.41
nfnet_f5,12.31,643.8,8,544,377.21
cait_m36_384,11.75,507.058,6,384,271.22
xcit_large_24_p8_384_dist,11.36,524.55,6,384,188.93
dm_nfnet_f5,11.21,706.654,8,544,377.21
volo_d5_448,11.01,359.686,4,448,295.91
swin_v2_cr_huge_384,10.89,364.734,4,384,657.94
tf_efficientnet_l2_ns_475,10.43,377.703,4,475,480.31
nfnet_f6,10.18,778.679,8,576,438.36
beit_large_patch16_512,9.37,425.057,4,512,305.67
dm_nfnet_f6,8.56,692.978,6,576,438.36
volo_d5_512,7.76,383.588,3,512,296.09
nfnet_f7,7.54,787.101,6,608,499.5
cait_m48_448,4.71,419.695,2,448,356.46
resnetv2_152x4_bitm,4.53,438.64,2,480,936.53
tf_efficientnet_l2_ns,2.96,331.907,1,800,480.31
efficientnet_l2,2.92,336.419,1,800,480.31
hf_public_repos/pytorch-image-models/results/benchmark-infer-amp-nchw-pt111-cu113-rtx3090.csv
model,infer_samples_per_sec,infer_step_time,infer_batch_size,infer_img_size,param_count
tinynet_e,47972.76,21.335,1024,106,2.04
mobilenetv3_small_050,42473.43,24.099,1024,224,1.59
lcnet_035,39739.31,25.756,1024,224,1.64
lcnet_050,35211.0,29.071,1024,224,1.88
mobilenetv3_small_075,31410.3,32.589,1024,224,2.04
mobilenetv3_small_100,28111.39,36.416,1024,224,2.54
tf_mobilenetv3_small_minimal_100,27538.82,37.173,1024,224,2.04
tinynet_d,26670.17,38.384,1024,152,2.34
tf_mobilenetv3_small_075,26522.93,38.597,1024,224,2.04
tf_mobilenetv3_small_100,24036.65,42.591,1024,224,2.54
lcnet_075,22451.72,45.598,1024,224,2.36
levit_128s,19963.52,51.282,1024,224,7.78
mnasnet_small,19706.27,51.952,1024,224,2.03
lcnet_100,18132.59,56.461,1024,224,2.95
mobilenetv2_035,17586.23,58.217,1024,224,1.68
ghostnet_050,16726.5,61.209,1024,224,2.59
regnetx_002,16238.56,63.048,1024,224,2.68
regnety_002,15227.23,67.235,1024,224,3.16
mnasnet_050,15022.24,68.154,1024,224,2.22
tinynet_c,14089.9,72.665,1024,184,2.46
mobilenetv2_050,14032.51,72.961,1024,224,1.97
levit_128,13679.77,74.845,1024,224,9.21
semnasnet_050,13508.98,75.79,1024,224,2.08
vit_small_patch32_224,12109.88,84.548,1024,224,22.88
mixer_s32_224,11702.15,87.494,1024,224,19.1
levit_192,11695.39,87.545,1024,224,10.95
lcnet_150,11564.86,88.533,1024,224,4.5
mobilenetv3_large_075,11407.33,89.755,1024,224,3.99
gernet_s,10837.81,94.473,1024,224,8.17
vit_tiny_r_s16_p8_224,10598.14,96.609,1024,224,6.34
mobilenetv3_rw,10164.28,100.733,1024,224,5.48
tf_mobilenetv3_large_075,10125.76,101.117,1024,224,3.99
regnetx_004,10069.8,101.678,1024,224,5.16
mobilenetv3_large_100,10017.28,102.212,1024,224,5.48
mobilenetv3_large_100_miil,10014.68,102.238,1024,224,5.48
ese_vovnet19b_slim_dw,9944.69,102.957,1024,224,1.9
hardcorenas_a,9792.24,104.561,1024,224,5.26
mnasnet_075,9774.09,104.755,1024,224,3.17
tf_mobilenetv3_large_minimal_100,9771.38,104.784,1024,224,3.92
ghostnet_100,9041.17,113.248,1024,224,5.18
hardcorenas_b,9021.74,113.492,1024,224,5.18
tinynet_b,8976.69,114.061,1024,188,3.73
swsl_resnet18,8971.65,114.125,1024,224,11.69
tf_mobilenetv3_large_100,8954.66,114.343,1024,224,5.48
gluon_resnet18_v1b,8949.74,114.406,1024,224,11.69
ssl_resnet18,8947.2,114.439,1024,224,11.69
resnet18,8927.25,114.693,1024,224,11.69
hardcorenas_c,8864.36,115.506,1024,224,5.52
mobilenetv2_075,8764.66,116.82,1024,224,2.64
mnasnet_100,8646.99,118.411,1024,224,4.38
mnasnet_b1,8646.34,118.421,1024,224,4.38
semnasnet_075,8603.57,119.009,1024,224,2.91
levit_256,8528.42,120.058,1024,224,18.89
regnety_004,8497.03,120.501,1024,224,4.34
seresnet18,8461.0,121.015,1024,224,11.78
hardcorenas_d,8306.25,123.269,1024,224,7.5
legacy_seresnet18,8213.74,124.658,1024,224,11.78
regnetx_006,8055.06,127.114,1024,224,6.2
mobilenetv2_100,7900.5,129.6,1024,224,3.5
spnasnet_100,7827.4,130.811,1024,224,4.42
semnasnet_100,7701.96,132.941,1024,224,3.89
mnasnet_a1,7678.8,133.342,1024,224,3.89
resnet18d,7478.23,136.919,1024,224,11.71
levit_256d,7357.48,139.166,1024,224,26.21
ghostnet_130,7270.82,140.824,1024,224,7.36
regnety_006,7263.32,140.971,1024,224,6.06
hardcorenas_f,7222.67,141.764,1024,224,8.2
hardcorenas_e,7174.56,142.715,1024,224,8.07
efficientnet_lite0,7057.13,145.09,1024,224,4.65
ese_vovnet19b_slim,6975.46,146.789,1024,224,3.17
tinynet_a,6918.13,148.004,1024,192,6.19
fbnetc_100,6847.55,149.531,1024,224,5.57
tf_efficientnetv2_b0,6842.85,149.633,1024,224,7.14
xcit_nano_12_p16_224_dist,6769.74,151.25,1024,224,3.05
xcit_nano_12_p16_224,6760.61,151.454,1024,224,3.05
regnetx_008,6358.56,161.031,1024,224,7.26
deit_tiny_patch16_224,6350.86,161.227,1024,224,5.72
vit_tiny_patch16_224,6346.8,161.33,1024,224,5.72
tf_efficientnet_lite0,6324.64,161.895,1024,224,4.65
deit_tiny_distilled_patch16_224,6241.01,164.064,1024,224,5.91
efficientnet_b0,6183.5,165.59,1024,224,5.29
efficientnet_b1_pruned,6044.57,169.396,1024,240,6.33
rexnet_100,6031.43,169.765,1024,224,4.8
mnasnet_140,6024.95,169.948,1024,224,7.12
rexnetr_100,5992.24,170.876,1024,224,4.88
dla46_c,5989.01,170.968,1024,224,1.3
pit_ti_distilled_224,5978.67,171.264,1024,224,5.1
mobilenetv2_110d,5966.33,171.618,1024,224,4.52
pit_ti_224,5950.27,172.082,1024,224,4.85
regnety_008,5940.31,172.37,1024,224,6.26
resnetblur18,5929.76,172.677,1024,224,11.69
tf_efficientnet_b0,5620.98,182.161,1024,224,5.29
tf_efficientnet_b0_ns,5610.83,182.491,1024,224,5.29
tf_efficientnet_b0_ap,5603.36,182.734,1024,224,5.29
skresnet18,5578.53,183.549,1024,224,11.96
regnetz_005,5482.6,186.761,1024,224,7.12
semnasnet_140,5325.69,192.263,1024,224,6.11
mobilenetv2_140,5271.81,194.229,1024,224,6.11
resnet34,5239.3,195.434,1024,224,21.8
tv_resnet34,5236.24,195.548,1024,224,21.8
gluon_resnet34_v1b,5233.7,195.644,1024,224,21.8
ese_vovnet19b_dw,5190.19,197.284,1024,224,6.54
levit_384,5177.17,197.779,1024,224,39.13
mobilevit_xxs,5154.01,198.668,1024,256,1.27
visformer_tiny,5148.77,198.871,1024,224,10.32
hrnet_w18_small,5137.51,199.307,1024,224,13.19
nf_regnet_b0,5134.01,199.442,1024,256,8.76
seresnet34,4940.74,207.245,1024,224,21.96
mixnet_s,4921.18,208.067,1024,224,4.13
gernet_m,4881.9,209.742,1024,224,21.14
efficientnet_lite1,4817.11,212.565,1024,240,5.42
legacy_seresnet34,4790.34,213.752,1024,224,21.96
selecsls42,4787.26,213.889,1024,224,30.35
selecsls42b,4772.45,214.553,1024,224,32.46
dla46x_c,4717.05,217.071,1024,224,1.07
resnet34d,4707.81,217.499,1024,224,21.82
vit_base_patch32_224,4653.29,220.048,1024,224,88.22
vit_base_patch32_224_sam,4636.98,220.822,1024,224,88.22
pit_xs_224,4628.58,221.222,1024,224,10.62
tf_mixnet_s,4615.88,221.831,1024,224,4.13
fbnetv3_b,4595.16,222.83,1024,256,8.6
rexnetr_130,4587.31,223.212,1024,224,7.61
pit_xs_distilled_224,4586.36,223.258,1024,224,11.0
resmlp_12_distilled_224,4524.79,226.297,1024,224,15.35
resmlp_12_224,4522.51,226.411,1024,224,15.35
tf_efficientnetv2_b1,4515.62,226.756,1024,240,8.14
dla60x_c,4459.55,229.607,1024,224,1.32
tf_efficientnet_lite1,4427.03,231.295,1024,240,5.42
mixer_b32_224,4423.78,231.465,1024,224,60.29
rexnet_130,4423.43,231.482,1024,224,7.56
xcit_tiny_12_p16_224_dist,4363.62,234.654,1024,224,6.72
xcit_tiny_12_p16_224,4352.75,235.24,1024,224,6.72
resnet26,4300.48,238.1,1024,224,16.0
mobilenetv2_120d,4276.4,239.441,1024,224,5.83
efficientnet_es_pruned,4244.8,241.225,1024,224,5.44
efficientnet_es,4243.52,241.298,1024,224,5.44
repvgg_b0,4216.88,242.821,1024,224,15.82
selecsls60,4146.99,246.913,1024,224,30.67
selecsls60b,4134.81,247.64,1024,224,32.77
tf_efficientnet_es,4097.34,249.906,1024,224,5.44
fbnetv3_d,4060.14,252.196,1024,256,10.31
efficientnet_b2_pruned,4051.65,252.724,1024,260,8.31
efficientnet_b0_g16_evos,3982.5,257.113,1024,224,8.11
rexnetr_150,3947.19,259.414,1024,224,9.78
crossvit_tiny_240,3922.12,261.07,1024,240,7.01
mixer_s16_224,3902.13,262.409,1024,224,18.53
resnet26d,3896.6,262.781,1024,224,16.01
dla34,3859.49,265.307,1024,224,15.74
ecaresnet50d_pruned,3854.93,265.621,1024,224,19.94
rexnet_150,3827.99,267.492,1024,224,9.73
vit_small_patch32_384,3806.52,269.0,1024,384,22.92
nf_resnet26,3804.26,269.16,1024,224,16.0
gmixer_12_224,3797.94,269.608,1024,224,12.7
efficientnet_lite2,3793.99,269.889,1024,260,6.09
gmlp_ti16_224,3724.27,274.941,1024,224,5.87
crossvit_9_240,3711.74,275.869,1024,240,8.55
regnetx_016,3640.75,281.247,1024,224,9.19
efficientnet_cc_b0_4e,3606.59,283.912,1024,224,13.31
efficientnet_cc_b0_8e,3600.63,284.383,1024,224,24.01
crossvit_9_dagger_240,3560.19,287.614,1024,240,8.78
tf_efficientnet_b1_ap,3560.01,287.626,1024,240,7.79
tf_efficientnet_b1,3559.25,287.687,1024,240,7.79
tf_efficientnet_b1_ns,3553.74,288.134,1024,240,7.79
tf_efficientnet_lite2,3505.87,292.07,1024,260,6.09
efficientnet_b1,3481.01,294.155,1024,256,7.79
poolformer_s12,3480.88,294.166,1024,224,11.92
vit_tiny_r_s16_p8_384,3451.73,148.319,512,384,6.36
tf_efficientnetv2_b2,3443.76,297.337,1024,260,10.1
tf_efficientnet_cc_b0_8e,3407.59,300.493,1024,224,24.01
tf_efficientnet_cc_b0_4e,3402.61,300.934,1024,224,13.31
mixnet_m,3369.57,303.884,1024,224,5.01
regnety_016,3343.57,306.248,1024,224,11.2
nf_seresnet26,3326.57,307.813,1024,224,17.4
nf_ecaresnet26,3308.22,309.519,1024,224,16.0
repvgg_a2,3284.74,311.731,1024,224,28.21
gernet_l,3260.13,314.086,1024,256,31.08
tf_mixnet_m,3258.23,314.269,1024,224,5.01
resnest14d,3225.43,317.465,1024,224,10.61
efficientnet_b3_pruned,3214.49,318.545,1024,300,9.86
convnext_nano_hnf,3199.89,319.999,1024,224,15.59
skresnet34,3189.47,321.044,1024,224,22.28
convit_tiny,3117.16,328.49,1024,224,5.71
resnext26ts,3098.65,330.453,1024,256,10.3
legacy_seresnext26_32x4d,3086.27,331.78,1024,224,16.79
nf_regnet_b1,3049.58,335.771,1024,288,10.22
resnet26t,3040.5,336.774,1024,256,16.01
seresnext26ts,3026.32,338.35,1024,256,10.39
eca_resnext26ts,3023.05,338.719,1024,256,10.3
gcresnext26ts,2976.22,344.049,1024,256,10.48
ecaresnet101d_pruned,2954.94,346.526,1024,224,24.88
nf_regnet_b2,2933.65,349.041,1024,272,14.31
mobilevit_xs,2913.13,175.744,512,256,2.32
seresnext26tn_32x4d,2898.81,353.236,1024,224,16.81
seresnext26t_32x4d,2897.48,353.398,1024,224,16.81
ecaresnext26t_32x4d,2893.22,353.918,1024,224,15.41
ecaresnext50t_32x4d,2891.83,354.088,1024,224,15.41
seresnext26d_32x4d,2884.91,354.937,1024,224,16.81
ecaresnetlight,2878.45,355.733,1024,224,30.16
pit_s_224,2872.59,356.46,1024,224,23.46
deit_small_patch16_224,2853.43,358.855,1024,224,22.05
pit_s_distilled_224,2851.86,359.052,1024,224,24.04
vit_small_patch16_224,2845.41,359.865,1024,224,22.05
tf_efficientnet_b2_ap,2814.51,363.814,1024,260,9.11
tf_efficientnet_b2_ns,2814.31,363.839,1024,260,9.11
coat_lite_tiny,2814.08,363.872,1024,224,5.72
tf_efficientnet_b2,2813.96,363.886,1024,260,9.11
rexnetr_200,2808.62,182.283,512,224,16.52
deit_small_distilled_patch16_224,2801.73,365.478,1024,224,22.44
tresnet_m,2787.92,367.287,1024,224,31.39
resnetv2_50,2780.22,368.303,1024,224,25.55
eca_botnext26ts_256,2766.45,370.137,1024,256,10.59
rexnet_200,2763.52,185.259,512,224,16.37
vit_base2_patch32_256,2752.1,372.066,1024,256,119.46
botnet26t_256,2750.58,372.273,1024,256,12.49
halonet26t,2727.12,375.475,1024,256,12.48
eca_halonext26ts,2721.42,376.262,1024,256,10.76
swsl_resnet50,2693.51,380.159,1024,224,25.56
tv_resnet50,2687.71,380.98,1024,224,25.56
efficientnet_b0_gn,2686.21,381.194,1024,224,5.29
ssl_resnet50,2684.8,381.394,1024,224,25.56
gluon_resnet50_v1b,2682.34,381.744,1024,224,25.56
resnet50,2681.21,381.904,1024,224,25.56
vit_small_resnet26d_224,2675.44,382.728,1024,224,63.61
efficientnet_b2a,2654.03,385.816,1024,288,9.11
efficientnet_b2,2649.9,386.418,1024,288,9.11
coat_lite_mini,2646.6,386.899,1024,224,11.01
hrnet_w18_small_v2,2638.47,388.089,1024,224,15.6
resnetv2_50t,2621.75,390.565,1024,224,25.57
vovnet39a,2620.98,390.68,1024,224,22.6
resnetv2_50d,2613.09,391.86,1024,224,25.57
bat_resnext26ts,2594.52,394.665,1024,256,10.73
resnet32ts,2591.72,395.092,1024,256,17.96
efficientnet_em,2587.13,395.792,1024,240,6.9
cspresnet50,2561.78,399.709,1024,256,21.62
mixnet_l,2560.63,199.939,512,224,7.33
resnet33ts,2550.89,401.417,1024,256,19.68
dpn68b,2548.03,401.866,1024,224,12.61
gluon_resnet50_v1c,2543.04,402.654,1024,224,25.58
ese_vovnet39b,2535.01,403.931,1024,224,24.57
eca_vovnet39b,2533.56,404.162,1024,224,22.6
cspresnext50,2530.86,404.592,1024,224,20.57
legacy_seresnet50,2527.3,405.162,1024,224,28.09
vgg11_bn,2524.72,202.784,512,224,132.87
tf_efficientnet_em,2523.81,405.723,1024,240,6.9
resnet50t,2521.66,406.069,1024,224,25.57
resnet50d,2517.37,406.76,1024,224,25.58
gluon_resnet50_v1d,2514.4,407.243,1024,224,25.58
dpn68,2513.01,407.467,1024,224,12.61
selecsls84,2490.46,411.155,1024,224,50.95
seresnet33ts,2489.53,411.308,1024,256,19.78
eca_resnet33ts,2483.09,412.378,1024,256,19.68
lambda_resnet26t,2479.99,412.893,1024,256,10.96
tf_mixnet_l,2478.99,206.524,512,224,7.33
twins_svt_small,2475.31,413.674,1024,224,24.06
gcresnet33ts,2439.7,419.711,1024,256,19.88
cspresnet50w,2418.46,423.398,1024,256,28.12
seresnet50,2407.37,425.348,1024,224,28.09
cspresnet50d,2400.89,426.492,1024,256,21.64
dla60,2376.63,430.848,1024,224,22.04
densenet121,2346.76,436.333,1024,224,7.98
resnest26d,2346.08,436.46,1024,224,17.07
tv_densenet121,2345.79,436.514,1024,224,7.98
xcit_tiny_24_p16_224_dist,2334.54,438.616,1024,224,12.12
xcit_nano_12_p16_384_dist,2332.68,438.968,1024,384,3.05
xcit_tiny_24_p16_224,2328.43,439.766,1024,224,12.12
haloregnetz_b,2320.38,441.294,1024,224,11.68
resmlp_24_224,2308.85,443.498,1024,224,30.02
resmlp_24_distilled_224,2308.13,443.636,1024,224,30.02
resnetaa50d,2295.34,446.109,1024,224,25.58
seresnet50t,2282.97,448.526,1024,224,28.1
efficientnet_cc_b1_8e,2282.2,448.676,1024,240,39.72
convnext_tiny,2276.32,449.828,1024,224,28.59
res2net50_48w_2s,2265.43,451.999,1024,224,25.29
resnetblur50,2265.02,452.08,1024,224,25.56
efficientnet_lite3,2261.44,226.393,512,300,8.2
ecaresnet50d,2260.92,452.9,1024,224,25.58
efficientnet_b0_g8_gn,2259.05,453.276,1024,224,6.56
densenet121d,2243.46,456.425,1024,224,8.0
resnetrs50,2241.6,456.804,1024,224,35.69
mobilevit_s,2240.37,228.522,512,256,5.58
regnetx_032,2201.52,465.118,1024,224,15.3
visformer_small,2194.82,466.539,1024,224,40.22
gluon_resnet50_v1s,2193.42,466.838,1024,224,25.68
tf_efficientnet_cc_b1_8e,2188.26,467.941,1024,240,39.72
vit_base_resnet26d_224,2188.18,467.954,1024,224,101.4
resnetblur50d,2150.42,476.173,1024,224,25.58
gluon_inception_v3,2150.17,476.225,1024,299,23.83
adv_inception_v3,2148.48,476.602,1024,299,23.83
vovnet57a,2148.25,476.651,1024,224,36.64
tf_inception_v3,2147.17,476.894,1024,299,23.83
inception_v3,2146.78,476.978,1024,299,23.83
densenetblur121d,2133.31,479.992,1024,224,8.0
cspresnext50_iabn,2129.31,480.895,1024,256,20.57
semobilevit_s,2105.83,243.122,512,256,5.74
seresnetaa50d,2088.02,490.403,1024,224,28.11
cspdarknet53_iabn,2087.26,490.582,1024,256,27.64
swsl_resnext50_32x4d,2080.34,492.214,1024,224,25.03
convnext_tiny_hnf,2075.56,493.349,1024,224,28.59
ese_vovnet57b,2074.63,493.57,1024,224,38.61
tf_efficientnet_lite3,2074.49,246.795,512,300,8.2
resnext50_32x4d,2073.97,493.725,1024,224,25.03
ssl_resnext50_32x4d,2073.23,493.902,1024,224,25.03
gluon_resnext50_32x4d,2072.3,494.125,1024,224,25.03
tv_resnext50_32x4d,2055.26,498.221,1024,224,25.03
res2net50_26w_4s,2037.79,502.491,1024,224,25.7
twins_pcpvt_small,2036.4,502.835,1024,224,24.11
xcit_small_12_p16_224_dist,2021.26,506.599,1024,224,26.25
tf_efficientnetv2_b3,2020.91,506.688,1024,300,14.36
xcit_small_12_p16_224,2020.4,506.814,1024,224,26.25
nf_seresnet50,2015.9,507.948,1024,224,28.09
skresnet50,2014.26,508.362,1024,224,25.8
nf_ecaresnet50,2005.54,510.572,1024,224,25.56
regnetx_040,2003.23,511.16,1024,224,22.12
efficientnetv2_rw_t,1999.8,512.038,1024,288,13.65
sehalonet33ts,1991.52,257.078,512,256,13.69
fbnetv3_g,1991.44,514.187,1024,288,16.62
dla60x,1986.19,515.547,1024,224,17.35
gcresnet50t,1982.14,516.599,1024,256,25.9
resnext50d_32x4d,1976.72,518.018,1024,224,25.05
lambda_resnet26rpt_256,1950.98,262.42,512,256,10.99
gmixer_24_224,1937.61,528.474,1024,224,24.72
gc_efficientnetv2_rw_t,1928.42,530.993,1024,288,13.68
res2net50_14w_8s,1920.35,533.22,1024,224,25.06
skresnet50d,1919.46,533.468,1024,224,25.82
densenet169,1916.46,534.305,1024,224,14.15
dla60_res2net,1907.38,536.848,1024,224,20.85
gcresnext50ts,1902.33,538.276,1024,256,15.67
seresnext50_32x4d,1902.13,538.331,1024,224,27.56
gluon_seresnext50_32x4d,1901.68,538.457,1024,224,27.56
res2next50,1896.89,539.818,1024,224,24.67
legacy_seresnext50_32x4d,1896.31,539.984,1024,224,27.56
repvgg_b1g4,1884.44,543.384,1024,224,39.97
resnest50d_1s4x24d,1879.81,544.722,1024,224,25.68
crossvit_small_240,1855.05,551.992,1024,240,26.86
nf_regnet_b3,1852.1,552.873,1024,320,18.59
darknet53,1847.62,277.102,512,256,41.61
mixnet_xl,1839.34,278.346,512,224,11.9
dla60_res2next,1829.2,559.793,1024,224,17.03
swin_tiny_patch4_window7_224,1820.74,562.394,1024,224,28.29
cspdarknet53,1804.24,283.762,512,256,27.64
poolformer_s24,1803.36,567.817,1024,224,21.39
vit_small_r26_s32_224,1799.77,568.949,1024,224,36.43
xcit_nano_12_p8_224_dist,1796.05,570.128,1024,224,3.05
xcit_nano_12_p8_224,1795.49,570.304,1024,224,3.05
regnetz_b16,1791.93,571.438,1024,288,9.72
ecaresnet26t,1786.82,573.073,1024,320,16.01
convnext_tiny_hnfd,1744.82,586.868,1024,224,28.63
gmlp_s16_224,1741.6,587.951,1024,224,19.42
sebotnet33ts_256,1718.54,223.433,384,256,13.7
crossvit_15_240,1711.44,598.314,1024,240,27.53
resnetv2_101,1693.03,604.82,1024,224,44.54
vit_base_resnet50d_224,1670.98,612.798,1024,224,110.97
swin_s3_tiny_224,1660.14,616.801,1024,224,28.33
repvgg_b1,1657.85,617.654,1024,224,57.42
tv_resnet101,1656.03,618.332,1024,224,44.55
gluon_resnet101_v1b,1653.8,619.167,1024,224,44.55
resnet101,1652.45,619.673,1024,224,44.55
tf_efficientnet_b3_ap,1649.82,310.323,512,300,12.23
tf_efficientnet_b3,1649.61,310.363,512,300,12.23
tf_efficientnet_b3_ns,1649.37,310.407,512,300,12.23
crossvit_15_dagger_240,1648.34,621.218,1024,240,28.21
lambda_resnet50ts,1639.74,624.475,1024,256,21.54
resnetv2_101d,1628.19,628.906,1024,224,44.56
efficientnet_b3,1614.3,317.152,512,320,12.23
efficientnet_b3a,1613.57,317.296,512,320,12.23
gluon_resnet101_v1c,1597.11,641.145,1024,224,44.57
resnest50d,1586.43,645.46,1024,224,27.48
gluon_resnet101_v1d,1584.97,646.054,1024,224,44.57
wide_resnet50_2,1583.06,646.835,1024,224,68.88
cait_xxs24_224,1580.68,647.808,1024,224,11.96
dla102,1573.96,650.576,1024,224,33.27
resnetv2_50x1_bit_distilled,1561.81,655.635,1024,224,25.55
res2net50_26w_6s,1556.3,657.955,1024,224,37.05
vit_large_patch32_224,1551.79,659.869,1024,224,306.54
resmlp_36_224,1549.07,661.03,1024,224,44.69
regnetx_080,1548.92,661.091,1024,224,39.57
resmlp_36_distilled_224,1548.49,661.277,1024,224,44.69
halonet50ts,1542.39,663.891,1024,256,22.73
legacy_seresnet101,1528.82,669.783,1024,224,49.33
ese_vovnet39b_evos,1523.51,672.121,1024,224,24.58
coat_lite_small,1509.49,678.361,1024,224,19.84
vgg13_bn,1504.09,340.392,512,224,133.05
xcit_tiny_12_p16_384_dist,1501.95,681.763,1024,384,6.72
resnetaa101d,1496.92,684.059,1024,224,44.57
swin_v2_cr_tiny_224,1495.36,684.765,1024,224,28.33
densenet201,1488.38,687.985,1024,224,20.01
seresnet101,1484.39,689.83,1024,224,49.33
vit_tiny_patch16_384,1479.06,692.319,1024,384,5.79
lamhalobotnet50ts_256,1474.4,694.504,1024,256,22.57
swin_v2_cr_tiny_ns_224,1471.62,695.817,1024,224,28.33
vit_base_patch32_384,1470.27,696.459,1024,384,88.3
vit_base_r26_s32_224,1467.48,697.779,1024,224,101.38
gluon_resnet101_v1s,1452.66,704.901,1024,224,44.67
regnetx_064,1450.48,352.975,512,224,26.21
mixer_b16_224,1448.87,706.746,1024,224,59.88
nf_resnet101,1448.19,707.076,1024,224,44.55
mixer_b16_224_miil,1446.14,708.08,1024,224,59.88
resnetv2_50d_frn,1444.18,709.041,1024,224,25.59
resnetblur101d,1433.31,714.414,1024,224,44.57
nf_resnet50,1425.83,718.163,1024,288,25.56
mixer_l32_224,1425.39,718.388,1024,224,206.94
ecaresnet101d,1423.43,719.375,1024,224,44.57
hrnet_w18,1416.64,722.817,1024,224,21.3
convnext_small,1412.23,725.081,1024,224,50.22
tresnet_l,1401.27,730.75,1024,224,55.99
twins_pcpvt_base,1398.04,732.441,1024,224,43.83
regnety_032,1384.84,739.421,1024,288,19.44
nest_tiny,1381.77,370.526,512,224,17.06
resnet50_gn,1377.54,743.338,1024,224,25.56
resnet51q,1366.14,749.545,1024,288,35.7
resnetv2_50d_evob,1365.49,749.897,1024,224,25.59
jx_nest_tiny,1357.18,377.241,512,224,17.06
botnet50ts_256,1354.82,377.899,512,256,22.74
xception,1347.6,379.923,512,299,22.86
dla102x,1333.01,768.169,1024,224,26.31
convit_small,1329.45,770.231,1024,224,27.78
halo2botnet50ts_256,1327.73,385.607,512,256,22.64
skresnext50_32x4d,1314.86,778.776,1024,224,27.48
swsl_resnext101_32x4d,1313.2,779.762,1024,224,44.18
ssl_resnext101_32x4d,1309.86,781.751,1024,224,44.18
gluon_resnext101_32x4d,1309.44,781.999,1024,224,44.18
resnext101_32x4d,1308.3,782.684,1024,224,44.18
repvgg_b2g4,1287.88,795.093,1024,224,61.76
res2net50_26w_8s,1277.42,801.601,1024,224,48.4
res2net101_26w_4s,1275.83,802.597,1024,224,45.21
resnest50d_4s2x40d,1267.83,807.666,1024,224,30.42
nf_seresnet101,1246.67,821.372,1024,224,49.33
twins_svt_base,1242.25,824.296,1024,224,56.07
nf_ecaresnet101,1240.12,825.715,1024,224,44.55
vgg16_bn,1239.17,413.169,512,224,138.37
resnet61q,1236.58,828.077,1024,288,36.85
eca_nfnet_l0,1234.82,829.257,1024,288,24.14
nfnet_l0,1234.39,829.546,1024,288,35.07
hrnet_w32,1225.86,835.318,1024,224,41.23
xception41p,1219.47,419.841,512,299,26.91
poolformer_s36,1217.85,840.81,1024,224,30.86
ese_vovnet99b_iabn,1214.4,843.2,1024,224,63.2
crossvit_18_240,1210.54,845.892,1024,240,43.27
regnetv_040,1210.02,423.122,512,288,20.64
regnety_040,1209.84,423.186,512,288,20.65
hrnet_w30,1208.97,846.988,1024,224,37.71
dpn92,1205.79,849.224,1024,224,37.67
gluon_seresnext101_32x4d,1198.19,854.61,1024,224,48.96
seresnext101_32x4d,1197.93,854.792,1024,224,48.96
legacy_seresnext101_32x4d,1196.55,855.783,1024,224,48.96
efficientnet_el,1181.26,433.424,512,300,10.59
efficientnet_el_pruned,1179.3,434.142,512,300,10.59
ese_vovnet99b,1178.02,869.24,1024,224,63.2
resnetv2_152,1172.64,873.226,1024,224,60.19
crossvit_18_dagger_240,1169.98,875.211,1024,240,44.27
tf_efficientnet_el,1156.16,442.832,512,300,10.59
tv_resnet152,1155.28,886.354,1024,224,60.19
resnet152,1153.46,887.752,1024,224,60.19
gluon_resnet152_v1b,1153.39,887.802,1024,224,60.19
xcit_tiny_12_p8_224_dist,1148.77,891.372,1024,224,6.71
xcit_tiny_12_p8_224,1148.55,891.542,1024,224,6.71
vit_small_resnet50d_s16_224,1140.9,897.523,1024,224,57.53
resnetv2_152d,1140.85,897.561,1024,224,60.2
mixnet_xxl,1136.05,338.002,384,224,23.96
ecaresnet50t,1133.49,903.391,1024,320,25.57
regnetz_c16,1132.12,452.237,512,320,13.46
repvgg_b2,1129.63,906.478,1024,224,89.02
gluon_resnet152_v1c,1126.87,908.694,1024,224,60.21
vit_base_patch16_224_miil,1124.8,910.37,1024,224,86.54
gluon_resnet152_v1d,1122.21,912.468,1024,224,60.21
volo_d1_224,1121.56,912.997,1024,224,26.63
swin_small_patch4_window7_224,1117.13,916.62,1024,224,49.61
regnety_040s_gn,1113.45,919.647,1024,224,20.65
inception_v4,1099.65,931.189,1024,299,42.68
vit_base_patch16_224_sam,1089.03,940.269,1024,224,86.57
xception41,1089.03,470.133,512,299,26.97
vit_base_patch16_224,1089.0,940.3,1024,224,86.57
deit_base_patch16_224,1088.05,941.117,1024,224,86.57
xcit_small_24_p16_224_dist,1079.16,948.867,1024,224,47.67
xcit_small_24_p16_224,1078.82,949.17,1024,224,47.67
densenet161,1076.84,950.914,1024,224,28.68
convmixer_1024_20_ks9_p14,1075.95,951.707,1024,224,24.38
nfnet_f0,1075.06,952.487,1024,256,71.49
deit_base_distilled_patch16_224,1073.74,953.659,1024,224,87.34
dla169,1071.22,955.906,1024,224,53.39
vgg19_bn,1060.19,482.919,512,224,143.68
cait_xxs36_224,1058.79,967.13,1024,224,17.3
tnt_s_patch16_224,1056.28,969.422,1024,224,23.76
legacy_seresnet152,1054.79,970.791,1024,224,66.82
gluon_resnet152_v1s,1052.41,972.991,1024,224,60.32
regnetx_120,1048.48,488.312,512,224,46.11
seresnet152,1033.07,991.203,1024,224,66.82
tresnet_xl,1032.49,991.76,1024,224,78.44
efficientnet_lite4,1029.33,373.047,384,380,13.01
beit_base_patch16_224,1003.86,1020.053,1024,224,86.53
regnety_120,1003.65,510.126,512,224,51.82
repvgg_b3g4,997.42,1026.637,1024,224,83.83
twins_pcpvt_large,995.78,1028.325,1024,224,60.99
convnext_base,984.44,1040.164,1024,224,88.59
convnext_base_in22ft1k,984.35,1040.256,1024,224,88.59
coat_tiny,976.27,1048.875,1024,224,5.5
tf_efficientnet_lite4,967.59,396.848,384,380,13.01
pit_b_224,954.91,536.164,512,224,73.76
dm_nfnet_f0,948.25,1079.874,1024,256,71.49
pit_b_distilled_224,947.46,540.381,512,224,74.79
vit_small_patch16_36x1_224,919.98,1113.057,1024,224,64.67
wide_resnet101_2,917.07,1116.581,1024,224,126.89
swin_v2_cr_small_224,915.96,1117.937,1024,224,49.7
dla102x2,910.31,562.434,512,224,41.28
resnetv2_50d_gn,909.47,1125.915,1024,288,25.57
efficientnetv2_s,905.47,1130.894,1024,384,21.46
vit_small_patch16_18x2_224,899.06,1138.948,1024,224,64.67
tf_efficientnetv2_s_in21ft1k,889.42,1151.294,1024,384,21.46
tf_efficientnetv2_s,889.32,1151.431,1024,384,21.46
xception65p,886.31,577.66,512,299,39.82
nest_small,881.75,580.652,512,224,38.35
twins_svt_large,880.33,1163.185,1024,224,99.27
repvgg_b3,878.37,1165.779,1024,224,123.09
resnetrs101,877.57,1166.842,1024,288,63.62
jx_nest_small,871.53,587.458,512,224,38.35
efficientnetv2_rw_s,866.51,1181.735,1024,384,23.94
dpn98,862.25,1187.578,1024,224,61.57
ens_adv_inception_resnet_v2,856.37,1195.737,1024,299,55.84
inception_resnet_v2,855.57,1196.844,1024,299,55.84
nf_regnet_b4,854.46,1198.403,1024,384,30.21
regnetz_b16_evos,853.4,599.942,512,288,9.74
regnetx_160,848.43,603.455,512,224,54.28
regnetz_d8,845.88,1210.552,1024,320,23.37
cait_s24_224,834.81,1226.608,1024,224,46.92
gluon_resnext101_64x4d,834.02,1227.777,1024,224,83.46
resnet200,828.19,1236.414,1024,224,64.67
regnetz_040,826.88,464.381,384,320,27.12
regnetz_040h,823.24,466.438,384,320,28.94
efficientnet_b4,820.41,468.046,384,384,19.34
swin_s3_small_224,817.53,626.263,512,224,49.74
hrnet_w40,816.49,1254.128,1024,224,57.56
poolformer_m36,815.39,1255.826,1024,224,56.17
swsl_resnext101_32x8d,805.27,1271.611,1024,224,88.79
regnety_064,803.23,637.411,512,288,30.58
ssl_resnext101_32x8d,802.86,1275.43,1024,224,88.79
xcit_tiny_24_p16_384_dist,802.73,1275.631,1024,384,12.12
ig_resnext101_32x8d,802.17,1276.521,1024,224,88.79
resnext101_32x8d,802.06,1276.704,1024,224,88.79
regnetv_064,800.43,639.64,512,288,30.58
resnetv2_50d_evos,798.97,640.813,512,288,25.59
gluon_xception65,797.15,642.277,512,299,39.92
resnet101d,796.13,1286.203,1024,320,44.57
xception65,795.21,643.84,512,299,39.92
resnest101e,791.92,646.513,512,256,48.28
swin_base_patch4_window7_224,791.65,1293.482,1024,224,87.77
gluon_seresnext101_64x4d,787.65,1300.055,1024,224,88.23
coat_mini,785.23,1304.064,1024,224,10.34
tf_efficientnet_b4_ap,782.92,490.459,384,380,19.34
tf_efficientnet_b4,782.13,490.953,384,380,19.34
tf_efficientnet_b4_ns,782.09,490.976,384,380,19.34
regnety_080,767.29,667.271,512,288,39.18
hrnet_w44,759.24,1348.7,1024,224,67.06
crossvit_base_240,756.03,677.208,512,240,105.03
xcit_medium_24_p16_224_dist,748.3,1368.42,1024,224,84.4
xcit_medium_24_p16_224,747.93,1369.089,1024,224,84.4
gmlp_b16_224,744.43,1375.538,1024,224,73.08
hrnet_w48,733.55,1395.939,1024,224,77.47
tresnet_m_448,727.54,1407.473,1024,448,31.39
vit_large_r50_s32_224,726.03,1410.402,1024,224,328.99
regnetz_d32,721.75,1418.755,1024,320,27.58
vit_small_patch16_384,684.33,748.167,512,384,22.2
tnt_b_patch16_224,677.0,1512.541,1024,224,65.41
xcit_small_12_p16_384_dist,675.76,1515.313,1024,384,26.25
convit_base,673.78,1519.763,1024,224,86.54
swin_s3_base_224,663.17,1544.087,1024,224,71.13
swin_v2_cr_base_224,651.94,1570.69,1024,224,87.88
densenet264d_iabn,647.21,1582.161,1024,224,72.74
efficientnet_b3_gn,646.96,395.683,256,320,11.73
dpn131,635.27,1611.889,1024,224,79.25
densenet264,628.46,1629.36,1024,224,72.69
nest_base,627.45,815.992,512,224,67.72
volo_d2_224,626.17,1635.329,1024,224,58.68
jx_nest_base,620.43,825.215,512,224,67.72
poolformer_m48,615.67,1663.217,1024,224,73.47
vit_small_r26_s32_384,611.26,628.196,384,384,36.47
xcit_nano_12_p8_384_dist,610.06,1678.498,1024,384,3.05
xcit_tiny_24_p8_224,603.0,1698.15,1024,224,12.11
xcit_tiny_24_p8_224_dist,602.22,1700.353,1024,224,12.11
xception71,600.16,853.086,512,299,42.34
hrnet_w64,599.02,1709.451,1024,224,128.06
vit_base_r50_s16_224,598.25,1711.642,1024,224,98.66
legacy_senet154,582.07,1759.212,1024,224,115.09
senet154,582.01,1759.392,1024,224,115.09
gluon_senet154,581.94,1759.609,1024,224,115.09
dpn107,578.97,1768.627,1024,224,86.92
eca_nfnet_l1,575.61,1778.965,1024,320,41.41
seresnet200d,562.84,1819.333,1024,256,71.86
resnet152d,561.29,1824.359,1024,320,60.21
ecaresnet200d,559.73,1829.437,1024,256,64.69
convnext_large_in22ft1k,546.71,936.5,512,224,197.77
convnext_large,546.09,937.561,512,224,197.77
regnetz_c16_evos,539.7,948.669,512,320,13.49
regnety_320,534.72,957.496,512,224,145.05
regnety_160,523.72,733.2,384,288,83.59
efficientnet_b3_g8_gn,520.23,492.079,256,320,14.25
xcit_small_12_p8_224,514.94,1988.547,1024,224,26.21
xcit_small_12_p8_224_dist,514.63,1989.761,1024,224,26.21
halonet_h1,507.89,504.03,256,256,8.1
resnext101_64x4d,505.79,1012.259,512,288,83.46
seresnet152d,503.07,2035.474,1024,320,66.84
resnetrs152,499.61,2049.599,1024,320,86.62
vit_large_patch32_384,494.68,2070.028,1024,384,306.63
regnetx_320,468.44,819.734,384,224,107.81
mixer_l16_224,464.51,2204.436,1024,224,208.2
seresnext101_32x8d,461.67,1109.006,512,288,93.57
swin_large_patch4_window7_224,454.97,1125.326,512,224,196.53
regnetz_e8,451.54,1133.877,512,320,57.7
efficientnetv2_m,442.19,2315.716,1024,416,54.14
seresnet269d,440.45,2324.895,1024,256,113.67
volo_d3_224,438.95,2332.847,1024,224,86.33
xcit_large_24_p16_224,432.87,2365.575,1024,224,189.1
xcit_large_24_p16_224_dist,432.7,2366.505,1024,224,189.1
efficientnet_b5,411.07,622.743,256,456,30.39
efficientnetv2_rw_m,406.88,1258.338,512,416,53.24
resnet200d,404.01,2534.588,1024,320,64.69
tf_efficientnet_b5,395.61,647.082,256,456,30.39
tf_efficientnet_b5_ap,395.52,647.242,256,456,30.39
tf_efficientnet_b5_ns,395.47,647.322,256,456,30.39
resnetv2_50x1_bitm,392.67,1303.884,512,448,25.55
xcit_tiny_12_p8_384_dist,390.32,2623.487,1024,384,6.71
swin_v2_cr_large_224,385.91,1326.718,512,224,196.68
swsl_resnext101_32x16d,375.78,1362.484,512,224,194.03
ig_resnext101_32x16d,373.86,1369.478,512,224,194.03
ssl_resnext101_32x16d,373.64,1370.28,512,224,194.03
regnetz_d8_evos,373.09,1372.306,512,320,23.46
swin_v2_cr_tiny_384,364.64,702.039,256,384,28.33
tresnet_l_448,361.5,2832.633,1024,448,55.99
convnext_xlarge_in22ft1k,360.86,1418.822,512,224,350.2
xcit_small_24_p16_384_dist,360.65,2839.301,1024,384,47.67
resnetrs200,358.31,2857.856,1024,320,93.21
nfnet_f1,357.09,2867.576,1024,320,132.63
vit_large_patch16_224,356.78,2870.107,1024,224,304.33
crossvit_15_dagger_408,354.9,721.306,256,408,28.5
vit_base_patch16_18x2_224,348.19,2940.87,1024,224,256.73
tf_efficientnetv2_m_in21ft1k,347.38,1473.892,512,480,54.14
tf_efficientnetv2_m,346.86,1476.074,512,480,54.14
dm_nfnet_f1,342.33,1495.606,512,320,132.63
convnext_base_384_in22ft1k,336.37,1141.59,384,384,88.59
beit_large_patch16_224,330.46,3098.668,1024,224,304.43
volo_d1_384,293.12,1746.724,512,384,26.78
convmixer_768_32,290.99,3518.992,1024,224,21.11
eca_nfnet_l2,282.42,1812.878,512,384,56.72
volo_d4_224,281.7,3635.003,1024,224,192.96
resnetv2_152x2_bit_teacher,280.73,1823.773,512,224,236.34
vit_base_patch16_384,280.43,1369.296,384,384,86.86
deit_base_patch16_384,280.4,1369.456,384,384,86.86
deit_base_distilled_patch16_384,276.56,1388.495,384,384,87.63
xcit_small_24_p8_224,269.73,3796.43,1024,224,47.63
tresnet_xl_448,269.71,1898.302,512,448,78.44
xcit_small_24_p8_224_dist,269.67,3797.157,1024,224,47.63
resnest200e,265.14,1931.055,512,320,70.2
cait_xxs24_384,264.07,3877.737,1024,384,12.03
vit_large_patch14_224,262.13,3906.369,1024,224,304.2
crossvit_18_dagger_408,260.17,983.968,256,408,44.61
xcit_medium_24_p16_384_dist,254.44,2012.271,512,384,84.4
nasnetalarge,254.21,1510.535,384,331,88.75
pnasnet5large,251.02,1529.719,384,331,86.06
resnetv2_101x1_bitm,246.55,2076.684,512,448,44.54
beit_base_patch16_384,241.2,1592.025,384,384,86.74
vit_large_r50_s32_384,240.58,1596.142,384,384,329.09
efficientnet_b6,239.36,534.745,128,528,43.04
ecaresnet269d,233.52,4385.094,1024,352,102.09
tf_efficientnet_b6_ns,231.37,553.209,128,528,43.04
tf_efficientnet_b6,231.23,553.545,128,528,43.04
tf_efficientnet_b6_ap,231.04,553.998,128,528,43.04
resnetrs270,227.52,4500.663,1024,352,129.86
swin_v2_cr_small_384,224.2,1141.845,256,384,49.7
swin_base_patch4_window12_384,211.77,906.647,192,384,87.9
resmlp_big_24_224,209.16,4895.869,1024,224,129.14
resmlp_big_24_distilled_224,208.94,4900.863,1024,224,129.14
resmlp_big_24_224_in22ft1k,208.92,4901.346,1024,224,129.14
xcit_tiny_24_p8_384_dist,204.37,5010.586,1024,384,12.11
nfnet_f2,200.76,5100.492,1024,352,193.78
tf_efficientnetv2_l,199.73,1922.595,384,480,118.52
efficientnetv2_l,198.4,2580.63,512,480,118.52
tf_efficientnetv2_l_in21ft1k,196.98,2599.257,512,480,118.52
dm_nfnet_f2,194.75,2628.942,512,352,193.78
xcit_medium_24_p8_224,189.81,2697.371,512,224,84.32
xcit_medium_24_p8_224_dist,189.81,2697.462,512,224,84.32
volo_d5_224,187.06,5474.175,1024,224,295.46
convnext_large_384_in22ft1k,186.8,1370.399,256,384,197.77
cait_xs24_384,184.72,2771.758,512,384,26.67
vit_base_patch8_224,183.25,1396.951,256,224,86.58
cait_xxs36_384,176.62,5797.903,1024,384,17.37
vit_base_r50_s16_384,174.09,2205.77,384,384,98.95
vit_base_resnet50_384,174.05,2206.21,384,384,98.95
xcit_small_12_p8_384_dist,173.11,2957.56,512,384,26.21
swin_v2_cr_huge_224,170.28,2255.094,384,224,657.83
convmixer_1536_20,167.4,6117.1,1024,224,51.63
volo_d2_384,164.47,2334.792,384,384,58.87
swin_v2_cr_base_384,160.01,1199.906,192,384,87.88
eca_nfnet_l3,159.42,3211.652,512,448,72.04
resnetrs350,151.88,3371.146,512,384,163.96
xcit_large_24_p16_384_dist,147.05,3481.712,512,384,189.1
ig_resnext101_32x32d,146.83,1743.474,256,224,468.53
cait_s24_384,142.26,3598.927,512,384,47.06
vit_huge_patch14_224,141.6,7231.567,1024,224,632.05
efficientnet_b7,139.66,687.357,96,600,66.35
tf_efficientnet_b7,135.86,706.608,96,600,66.35
tf_efficientnet_b7_ap,135.79,706.955,96,600,66.35
tf_efficientnet_b7_ns,135.74,707.2,96,600,66.35
efficientnetv2_xl,127.77,3005.388,384,512,208.12
tf_efficientnetv2_xl_in21ft1k,127.07,3021.991,384,512,208.12
swin_large_patch4_window12_384,125.28,1021.736,128,384,196.74
resnest269e,123.52,3108.719,384,416,110.93
convnext_xlarge_384_in22ft1k,123.3,1557.136,192,384,350.2
xcit_large_24_p8_224,110.05,4652.475,512,224,188.93
xcit_large_24_p8_224_dist,109.94,4656.907,512,224,188.93
nfnet_f3,109.46,4677.461,512,416,254.92
resnetrs420,108.78,4706.738,512,416,191.89
dm_nfnet_f3,98.93,5175.208,512,416,254.92
swin_v2_cr_large_384,98.1,1304.756,128,384,196.68
resnetv2_152x2_bit_teacher_384,97.22,2633.19,256,384,236.34
cait_s36_384,95.27,5374.225,512,384,68.37
vit_large_patch16_384,94.74,2702.22,256,384,304.72
resnetv2_50x3_bitm,94.62,1352.761,128,448,217.32
vit_giant_patch14_224,92.89,5511.692,512,224,1012.61
xcit_small_24_p8_384_dist,90.84,4227.257,384,384,47.63
ig_resnext101_32x48d,87.18,2202.253,192,224,828.41
efficientnet_b8,83.91,1144.078,96,672,87.41
beit_large_patch16_384,82.64,3097.749,256,384,305.0
tf_efficientnet_b8_ap,82.27,1166.925,96,672,87.41
tf_efficientnet_b8,82.25,1167.133,96,672,87.41
volo_d3_448,72.26,2657.14,192,448,86.63
resnetv2_152x2_bitm,71.6,2681.389,192,448,236.34
xcit_medium_24_p8_384_dist,64.15,3990.849,256,384,84.32
nfnet_f4,60.24,6374.659,384,512,316.07
dm_nfnet_f4,59.09,4332.344,256,512,316.07
resnetv2_101x3_bitm,58.2,2199.411,128,448,387.93
vit_gigantic_patch14_224,56.14,9120.818,512,224,1844.44
volo_d4_448,52.76,2425.859,128,448,193.41
swin_v2_cr_giant_224,49.04,2610.23,128,224,2598.76
tf_efficientnet_l2_ns_475,46.41,1378.993,64,475,480.31
nfnet_f5,44.14,5799.172,256,544,377.21
swin_v2_cr_huge_384,43.59,1468.203,64,384,657.94
dm_nfnet_f5,39.84,6426.216,256,544,377.21
xcit_large_24_p8_384_dist,36.78,5220.859,192,384,188.93
volo_d5_448,36.57,3500.029,128,448,295.91
nfnet_f6,34.59,7401.234,256,576,438.36
beit_large_patch16_512,33.3,2882.954,96,512,305.67
cait_m36_384,31.36,8162.98,256,384,271.22
dm_nfnet_f6,31.17,8214.193,256,576,438.36
nfnet_f7,26.91,9514.003,256,608,499.5
volo_d5_512,25.64,4991.614,128,512,296.09
resnetv2_152x4_bitm,18.58,3444.83,64,480,936.53
efficientnet_l2,16.55,1450.295,24,800,480.31
tf_efficientnet_l2_ns,16.36,1467.093,24,800,480.31
swin_v2_cr_giant_384,13.63,1760.856,24,384,2598.76
cait_m48_448,13.35,9584.614,128,448,356.46
results/benchmark-infer-amp-nchw-pt210-cu121-rtx3090.csv
model,infer_img_size,infer_batch_size,infer_samples_per_sec,infer_step_time,infer_gmacs,infer_macts,param_count
tinynet_e,106,1024.0,50604.03,20.225,0.03,0.69,2.04
mobilenetv3_small_050,224,1024.0,46069.42,22.217,0.03,0.92,1.59
lcnet_035,224,1024.0,41190.64,24.85,0.03,1.04,1.64
lcnet_050,224,1024.0,37663.82,27.178,0.05,1.26,1.88
mobilenetv3_small_075,224,1024.0,33398.64,30.649,0.05,1.3,2.04
efficientvit_m0,224,1024.0,32179.13,31.812,0.08,0.91,2.35
mobilenetv3_small_100,224,1024.0,29653.41,34.522,0.06,1.42,2.54
tf_mobilenetv3_small_minimal_100,224,1024.0,28352.57,36.106,0.06,1.41,2.04
tinynet_d,152,1024.0,27612.87,37.074,0.05,1.42,2.34
tf_mobilenetv3_small_075,224,1024.0,27505.95,37.218,0.05,1.3,2.04
tf_mobilenetv3_small_100,224,1024.0,24859.95,41.18,0.06,1.42,2.54
efficientvit_m1,224,1024.0,24836.87,41.219,0.17,1.33,2.98
lcnet_075,224,1024.0,24184.78,42.33,0.1,1.99,2.36
efficientvit_m2,224,1024.0,21907.95,46.731,0.2,1.47,4.19
mnasnet_small,224,1024.0,20764.95,49.303,0.07,2.16,2.03
levit_128s,224,1024.0,20669.44,49.531,0.31,1.88,7.78
lcnet_100,224,1024.0,19774.93,51.772,0.16,2.52,2.95
regnetx_002,224,1024.0,18945.55,54.04,0.2,2.16,2.68
resnet10t,176,1024.0,18840.28,54.342,0.7,1.51,5.44
efficientvit_m3,224,1024.0,18627.14,54.963,0.27,1.62,6.9
mobilenetv2_035,224,1024.0,18464.78,55.447,0.07,2.86,1.68
ghostnet_050,224,1024.0,17741.46,57.707,0.05,1.77,2.59
resnet18,160,1024.0,17592.15,58.198,0.93,1.27,11.69
regnety_002,224,1024.0,17571.32,58.267,0.2,2.17,3.16
levit_conv_128s,224,1024.0,17529.9,58.404,0.31,1.88,7.78
efficientvit_m4,224,1024.0,17446.52,58.683,0.3,1.7,8.8
repghostnet_050,224,1024.0,17090.91,59.904,0.05,2.02,2.31
efficientvit_b0,224,1024.0,16784.26,60.999,0.1,2.87,3.41
vit_tiny_r_s16_p8_224,224,1024.0,16479.31,62.128,0.43,1.85,6.34
vit_small_patch32_224,224,1024.0,15974.78,64.091,1.12,2.09,22.88
mnasnet_050,224,1024.0,15859.35,64.557,0.11,3.07,2.22
mobilenetv2_050,224,1024.0,14885.11,68.783,0.1,3.64,1.97
tinynet_c,184,1024.0,14726.2,69.525,0.11,2.87,2.46
pit_ti_224,224,1024.0,14628.51,69.989,0.5,2.75,4.85
pit_ti_distilled_224,224,1024.0,14546.3,70.385,0.51,2.77,5.1
semnasnet_050,224,1024.0,14351.42,71.341,0.11,3.44,2.08
levit_128,224,1024.0,14192.78,72.139,0.41,2.71,9.21
repghostnet_058,224,1024.0,13482.93,75.937,0.07,2.59,2.55
mixer_s32_224,224,1024.0,13082.53,78.262,1.0,2.28,19.1
cs3darknet_focus_s,256,1024.0,12838.86,79.748,0.69,2.7,3.27
regnetx_004,224,1024.0,12620.59,81.127,0.4,3.14,5.16
levit_conv_128,224,1024.0,12584.5,81.359,0.41,2.71,9.21
cs3darknet_s,256,1024.0,12531.56,81.703,0.72,2.97,3.28
lcnet_150,224,1024.0,12510.06,81.844,0.34,3.79,4.5
regnetx_004_tv,224,1024.0,12294.91,83.276,0.42,3.17,5.5
efficientvit_m5,224,1024.0,12067.16,84.847,0.53,2.41,12.47
mobilenetv3_large_075,224,1024.0,12041.45,85.029,0.16,4.0,3.99
levit_192,224,1024.0,11986.94,85.416,0.66,3.2,10.95
resnet10t,224,1024.0,11963.05,85.587,1.1,2.43,5.44
gernet_s,224,1024.0,11809.29,86.701,0.75,2.65,8.17
ese_vovnet19b_slim_dw,224,1024.0,11618.32,88.126,0.4,5.28,1.9
vit_tiny_patch16_224,224,1024.0,11270.42,90.846,1.08,4.12,5.72
deit_tiny_patch16_224,224,1024.0,11259.37,90.936,1.08,4.12,5.72
deit_tiny_distilled_patch16_224,224,1024.0,11217.54,91.275,1.09,4.15,5.91
repghostnet_080,224,1024.0,11079.58,92.412,0.1,3.22,3.28
mobilenetv3_rw,224,1024.0,10908.78,93.859,0.23,4.41,5.48
levit_conv_192,224,1024.0,10768.96,95.077,0.66,3.2,10.95
mobilenetv3_large_100,224,1024.0,10731.24,95.412,0.23,4.41,5.48
hardcorenas_a,224,1024.0,10620.31,96.408,0.23,4.38,5.26
tf_mobilenetv3_large_075,224,1024.0,10495.83,97.552,0.16,4.0,3.99
resnet14t,176,1024.0,10451.45,97.965,1.07,3.61,10.08
mnasnet_075,224,1024.0,10423.24,98.231,0.23,4.77,3.17
tf_mobilenetv3_large_minimal_100,224,1024.0,10369.07,98.745,0.22,4.4,3.92
resnet34,160,1024.0,10330.89,99.109,1.87,1.91,21.8
regnety_004,224,1024.0,9931.33,103.097,0.41,3.89,4.34
nf_regnet_b0,192,1024.0,9884.05,103.59,0.37,3.15,8.76
regnetx_006,224,1024.0,9823.29,104.232,0.61,3.98,6.2
hardcorenas_b,224,1024.0,9755.67,104.953,0.26,5.09,5.18
hardcorenas_c,224,1024.0,9572.88,106.958,0.28,5.01,5.52
ghostnet_100,224,1024.0,9528.83,107.453,0.15,3.55,5.18
tf_mobilenetv3_large_100,224,1024.0,9484.05,107.96,0.23,4.41,5.48
tinynet_b,188,1024.0,9358.37,109.409,0.21,4.44,3.73
mnasnet_100,224,1024.0,9357.9,109.416,0.33,5.46,4.38
tf_efficientnetv2_b0,192,1024.0,9316.15,109.906,0.54,3.51,7.14
repghostnet_100,224,1024.0,9303.14,110.06,0.15,3.98,4.07
mobilenetv2_075,224,1024.0,9280.78,110.325,0.22,5.86,2.64
resnet18,224,1024.0,9222.44,111.023,1.82,2.48,11.69
pit_xs_distilled_224,224,1024.0,9172.76,111.624,1.11,4.15,11.0
semnasnet_075,224,1024.0,9145.4,111.959,0.23,5.54,2.91
pit_xs_224,224,1024.0,9134.12,112.096,1.1,4.12,10.62
regnety_006,224,1024.0,9106.78,112.433,0.61,4.33,6.06
convnext_atto,224,1024.0,8993.29,113.851,0.55,3.81,3.7
hardcorenas_d,224,1024.0,8915.53,114.845,0.3,4.93,7.5
levit_256,224,1024.0,8893.96,115.124,1.13,4.23,18.89
seresnet18,224,1024.0,8718.39,117.442,1.82,2.49,11.78
convnext_atto_ols,224,1024.0,8549.03,119.769,0.58,4.11,3.7
mobilenetv2_100,224,1024.0,8479.08,120.757,0.31,6.68,3.5
legacy_seresnet18,224,1024.0,8452.0,121.144,1.82,2.49,11.78
spnasnet_100,224,1024.0,8438.72,121.334,0.35,6.03,4.42
repghostnet_111,224,1024.0,8382.7,122.146,0.18,4.38,4.54
semnasnet_100,224,1024.0,8351.88,122.597,0.32,6.23,3.89
dla46_c,224,1024.0,8209.51,124.721,0.58,4.5,1.3
repvgg_a0,224,1024.0,8124.8,126.024,1.52,3.59,9.11
levit_conv_256,224,1024.0,7997.32,128.032,1.13,4.23,18.89
edgenext_xx_small,256,1024.0,7955.06,128.711,0.26,3.33,1.33
regnetx_008,224,1024.0,7889.15,129.787,0.81,5.15,7.26
resnet18d,224,1024.0,7873.83,130.041,2.06,3.29,11.71
convnext_femto,224,1024.0,7867.13,130.151,0.79,4.57,5.22
ese_vovnet19b_slim,224,1024.0,7834.56,130.693,1.69,3.52,3.17
mobilevit_xxs,256,1024.0,7818.95,130.953,0.34,5.74,1.27
hardcorenas_f,224,1024.0,7811.68,131.075,0.35,5.57,8.2
hardcorenas_e,224,1024.0,7751.65,132.09,0.35,5.65,8.07
efficientnet_lite0,224,1024.0,7716.09,132.699,0.4,6.74,4.65
xcit_nano_12_p16_224,224,1024.0,7711.63,132.776,0.56,4.17,3.05
ghostnet_130,224,1024.0,7680.26,133.318,0.24,4.6,7.36
levit_256d,224,1024.0,7643.23,133.964,1.4,4.93,26.21
tf_efficientnetv2_b0,224,1024.0,7637.19,134.07,0.73,4.77,7.14
repghostnet_130,224,1024.0,7550.55,135.609,0.25,5.24,5.48
convnext_femto_ols,224,1024.0,7514.81,136.254,0.82,4.87,5.23
regnety_008,224,1024.0,7508.88,136.361,0.81,5.25,6.26
tinynet_a,192,1024.0,7458.0,137.291,0.35,5.41,6.19
fbnetc_100,224,1024.0,7362.02,139.082,0.4,6.51,5.57
tf_efficientnetv2_b1,192,1024.0,7241.64,141.394,0.76,4.59,8.14
crossvit_tiny_240,240,1024.0,7093.57,144.345,1.3,5.67,7.01
regnety_008_tv,224,1024.0,7067.28,144.882,0.84,5.42,6.43
mobilevitv2_050,256,1024.0,7057.9,145.075,0.48,8.04,1.37
crossvit_9_240,240,1024.0,6964.15,147.028,1.55,5.59,8.55
dla46x_c,224,1024.0,6837.04,149.761,0.54,5.66,1.07
tf_efficientnet_lite0,224,1024.0,6819.73,150.142,0.4,6.74,4.65
efficientnet_b0,224,1024.0,6721.47,152.337,0.4,6.75,5.29
rexnet_100,224,1024.0,6689.15,153.073,0.41,7.44,4.8
rexnetr_100,224,1024.0,6646.85,154.047,0.43,7.72,4.88
levit_conv_256d,224,1024.0,6618.0,154.719,1.4,4.93,26.21
repvit_m1,224,1024.0,6591.52,155.339,0.83,7.45,5.49
efficientnet_b1_pruned,240,1024.0,6583.2,155.537,0.4,6.21,6.33
repghostnet_150,224,1024.0,6564.41,155.982,0.32,6.0,6.58
mnasnet_140,224,1024.0,6559.1,156.108,0.6,7.71,7.12
efficientvit_b1,224,1024.0,6458.82,158.532,0.53,7.25,9.1
visformer_tiny,224,1024.0,6456.3,158.594,1.27,5.72,10.32
crossvit_9_dagger_240,240,1024.0,6436.13,159.091,1.68,6.03,8.78
resnet14t,224,1024.0,6404.13,159.886,1.69,5.8,10.08
dla60x_c,224,1024.0,6404.11,159.885,0.59,6.01,1.32
mobilenetv2_110d,224,1024.0,6387.15,160.311,0.45,8.71,4.52
ghostnetv2_100,224,1024.0,6375.73,160.599,0.18,4.55,6.16
regnetz_005,224,1024.0,6372.66,160.676,0.52,5.86,7.12
repvit_m0_9,224,1024.0,6295.33,162.649,0.83,7.45,5.49
edgenext_xx_small,288,1024.0,6241.41,164.053,0.33,4.21,1.33
fbnetv3_b,224,1024.0,6166.1,166.058,0.42,6.97,8.6
convnext_pico,224,1024.0,6145.95,166.603,1.37,6.1,9.05
cs3darknet_focus_m,256,1024.0,6145.46,166.616,1.98,4.89,9.3
pvt_v2_b0,224,1024.0,6126.38,167.135,0.53,7.01,3.67
tf_efficientnet_b0,224,1024.0,6026.91,169.894,0.4,6.75,5.29
nf_regnet_b0,256,1024.0,5970.36,171.503,0.64,5.58,8.76
resnetblur18,224,1024.0,5963.74,171.694,2.34,3.39,11.69
ese_vovnet19b_dw,224,1024.0,5956.2,171.911,1.34,8.25,6.54
hrnet_w18_small,224,1024.0,5950.21,172.083,1.61,5.72,13.19
resnet50,160,1024.0,5943.32,172.284,2.1,5.67,25.56
repvgg_a1,224,1024.0,5891.09,173.812,2.64,4.74,14.09
cs3darknet_m,256,1024.0,5871.36,174.395,2.08,5.28,9.31
convnext_pico_ols,224,1024.0,5852.38,174.961,1.43,6.5,9.06
vit_base_patch32_clip_224,224,1024.0,5768.1,177.517,4.37,4.19,88.22
tf_efficientnetv2_b2,208,1024.0,5753.76,177.96,1.06,6.0,10.1
vit_base_patch32_224,224,1024.0,5748.7,178.117,4.37,4.19,88.22
semnasnet_140,224,1024.0,5744.77,178.239,0.6,8.87,6.11
skresnet18,224,1024.0,5740.29,178.378,1.82,3.24,11.96
vit_tiny_r_s16_p8_384,384,1024.0,5663.72,180.79,1.25,5.39,6.36
resnet50d,160,1024.0,5651.35,181.185,2.22,6.08,25.58
resnet18,288,1024.0,5636.85,181.651,3.01,4.11,11.69
mobilenetv2_140,224,1024.0,5629.57,181.886,0.6,9.57,6.11
vit_small_patch32_384,384,1024.0,5499.31,186.195,3.26,6.07,22.92
convnext_atto,288,1024.0,5487.38,186.599,0.91,6.3,3.7
efficientnet_b0_gn,224,1024.0,5481.83,186.788,0.42,6.75,5.29
selecsls42,224,1024.0,5458.22,187.596,2.94,4.62,30.35
efficientnet_lite1,240,1024.0,5452.84,187.782,0.62,10.14,5.42
fbnetv3_d,224,1024.0,5449.6,187.893,0.52,8.5,10.31
pit_s_224,224,1024.0,5438.08,188.291,2.42,6.18,23.46
selecsls42b,224,1024.0,5414.81,189.1,2.98,4.62,32.46
resnet34,224,1024.0,5413.46,189.147,3.67,3.74,21.8
pit_s_distilled_224,224,1024.0,5407.14,189.368,2.45,6.22,24.04
efficientvit_b1,256,1024.0,5391.26,189.926,0.69,9.46,9.1
seresnet18,288,1024.0,5348.84,191.432,3.01,4.11,11.78
tf_efficientnetv2_b1,240,1024.0,5293.37,193.439,1.21,7.34,8.14
levit_384,224,1024.0,5286.23,193.7,2.36,6.26,39.13
convnextv2_atto,224,1024.0,5265.85,194.45,0.55,3.81,3.71
repvit_m1_0,224,1024.0,5259.32,194.683,1.13,8.69,7.3
seresnet50,160,1024.0,5236.4,195.543,2.1,5.69,28.09
convnext_atto_ols,288,1024.0,5201.4,196.86,0.96,6.8,3.7
gernet_m,224,1024.0,5195.05,197.1,3.02,5.24,21.14
fbnetv3_b,256,1024.0,5178.49,197.729,0.55,9.1,8.6
mixnet_s,224,1024.0,5129.76,199.608,0.25,6.25,4.13
repghostnet_200,224,1024.0,5125.91,199.759,0.54,7.96,9.8
vit_base_patch32_clip_quickgelu_224,224,1024.0,5125.16,199.787,4.37,4.19,87.85
seresnet34,224,1024.0,5104.13,200.612,3.67,3.74,21.96
repvit_m2,224,1024.0,5098.16,200.845,1.36,9.43,8.8
rexnetr_130,224,1024.0,5082.35,201.471,0.68,9.81,7.61
efficientnet_b0_g16_evos,224,1024.0,5016.04,204.134,1.01,7.42,8.11
ghostnetv2_130,224,1024.0,5011.79,204.307,0.28,5.9,8.96
edgenext_x_small,256,1024.0,4992.08,205.112,0.54,5.93,2.34
ecaresnet50t,160,1024.0,4989.39,205.225,2.21,6.04,25.57
tiny_vit_5m_224,224,1024.0,4963.53,206.293,1.18,9.32,12.08
rexnet_130,224,1024.0,4939.41,207.301,0.68,9.71,7.56
legacy_seresnet34,224,1024.0,4938.49,207.34,3.67,3.74,21.96
eva02_tiny_patch14_224,224,1024.0,4931.19,207.646,1.4,6.17,5.5
resnet34d,224,1024.0,4924.89,207.912,3.91,4.54,21.82
tf_efficientnet_lite1,240,1024.0,4918.8,208.17,0.62,10.14,5.42
mixer_b32_224,224,1024.0,4917.45,208.227,3.24,6.29,60.29
resnet50,176,1024.0,4914.58,208.348,2.62,6.92,25.56
resnetrs50,160,1024.0,4904.24,208.788,2.29,6.2,35.69
xcit_tiny_12_p16_224,224,1024.0,4900.19,208.961,1.24,6.29,6.72
repvit_m1_1,224,1024.0,4858.32,210.759,1.36,9.43,8.8
levit_conv_384,224,1024.0,4851.29,211.066,2.36,6.26,39.13
efficientnet_es_pruned,224,1024.0,4832.02,211.909,1.81,8.73,5.44
efficientnet_es,224,1024.0,4828.47,212.065,1.81,8.73,5.44
dla34,224,1024.0,4823.61,212.277,3.07,5.02,15.74
resnet26,224,1024.0,4806.46,213.036,2.36,7.35,16.0
resnet18d,288,1024.0,4806.17,213.049,3.41,5.43,11.71
resnext50_32x4d,160,1024.0,4797.48,213.435,2.17,7.35,25.03
tf_mixnet_s,224,1024.0,4783.68,214.05,0.25,6.25,4.13
convnext_femto,288,1024.0,4774.19,214.475,1.3,7.56,5.22
efficientnet_b1,224,1024.0,4707.45,217.516,0.59,9.36,7.79
gmlp_ti16_224,224,1024.0,4694.71,218.108,1.34,7.55,5.87
cs3darknet_focus_m,288,1024.0,4686.36,218.495,2.51,6.19,9.3
mobilenetv2_120d,224,1024.0,4673.25,219.108,0.69,11.97,5.83
selecsls60,224,1024.0,4656.74,219.885,3.59,5.52,30.67
selecsls60b,224,1024.0,4628.67,221.219,3.63,5.52,32.77
tf_efficientnet_es,224,1024.0,4617.85,221.737,1.81,8.73,5.44
resmlp_12_224,224,1024.0,4607.73,222.224,3.01,5.5,15.35
vit_small_patch16_224,224,1024.0,4586.65,223.246,4.25,8.25,22.05
deit_small_patch16_224,224,1024.0,4584.29,223.359,4.25,8.25,22.05
fbnetv3_d,256,1024.0,4567.33,224.19,0.68,11.1,10.31
gmixer_12_224,224,1024.0,4565.4,224.285,2.67,7.26,12.7
deit_small_distilled_patch16_224,224,1024.0,4564.97,224.306,4.27,8.29,22.44
convnext_femto_ols,288,1024.0,4561.96,224.454,1.35,8.06,5.23
efficientnet_b0_g8_gn,224,1024.0,4561.27,224.488,0.66,6.75,6.56
efficientnet_cc_b0_8e,224,1024.0,4542.29,225.426,0.42,9.42,24.01
efficientnet_cc_b0_4e,224,1024.0,4540.5,225.515,0.41,9.42,13.31
repvgg_b0,224,1024.0,4526.99,226.188,3.41,6.15,15.82
mixer_s16_224,224,1024.0,4518.8,226.598,3.79,5.97,18.53
cs3darknet_m,288,1024.0,4513.42,226.868,2.63,6.69,9.31
convnextv2_femto,224,1024.0,4509.16,227.082,0.79,4.57,5.23
regnetx_016,224,1024.0,4476.6,228.734,1.62,7.93,9.19
nf_regnet_b1,256,1024.0,4444.68,230.377,0.82,7.27,10.22
vit_base_patch32_clip_256,256,1024.0,4442.76,230.476,5.68,5.44,87.86
mobilevitv2_075,256,1024.0,4419.22,231.704,1.05,12.06,2.87
rexnetr_150,224,1024.0,4415.72,231.888,0.89,11.13,9.78
darknet17,256,1024.0,4402.14,232.603,3.26,7.18,14.3
resnet26d,224,1024.0,4396.77,232.887,2.6,8.15,16.01
resnetaa34d,224,1024.0,4381.9,233.677,4.43,5.07,21.82
efficientnet_b2_pruned,260,1024.0,4356.91,235.018,0.73,9.13,8.31
convnext_nano,224,1024.0,4340.39,235.913,2.46,8.37,15.59
ecaresnet50d_pruned,224,1024.0,4337.48,236.07,2.53,6.43,19.94
efficientformer_l1,224,1024.0,4271.29,239.728,1.3,5.53,12.29
nf_resnet26,224,1024.0,4216.31,242.856,2.41,7.35,16.0
deit3_small_patch16_224,224,1024.0,4203.29,243.607,4.25,8.25,22.06
nf_regnet_b2,240,1024.0,4197.9,243.92,0.97,7.23,14.31
tf_efficientnet_cc_b0_4e,224,1024.0,4196.5,244.002,0.41,9.42,13.31
tf_efficientnet_cc_b0_8e,224,1024.0,4190.23,244.367,0.42,9.42,24.01
regnety_016,224,1024.0,4161.97,246.026,1.63,8.04,11.2
rexnet_150,224,1024.0,4147.2,246.903,0.9,11.21,9.73
ghostnetv2_160,224,1024.0,4116.92,248.718,0.42,7.23,12.39
tiny_vit_11m_224,224,1024.0,4086.56,250.566,1.9,10.73,20.35
poolformer_s12,224,1024.0,4071.24,251.51,1.82,5.53,11.92
regnetz_005,288,1024.0,4056.8,252.404,0.86,9.68,7.12
efficientnet_lite2,260,1024.0,4046.71,253.034,0.89,12.9,6.09
darknet21,256,1024.0,4001.6,255.887,3.93,7.47,20.86
efficientvit_b1,288,1024.0,3997.55,256.145,0.87,11.96,9.1
resnext50_32x4d,176,1024.0,3992.51,256.47,2.71,8.97,25.03
edgenext_x_small,288,1024.0,3965.96,258.184,0.68,7.5,2.34
efficientnet_b1,256,1024.0,3961.36,258.486,0.77,12.22,7.79
convnext_nano_ols,224,1024.0,3944.64,259.582,2.65,9.38,15.65
resnest14d,224,1024.0,3932.19,260.404,2.76,7.33,10.61
tf_efficientnet_b1,240,1024.0,3922.37,261.055,0.71,10.88,7.79
flexivit_small,240,1024.0,3913.54,261.645,4.88,9.46,22.06
mobilevit_xs,256,768.0,3904.8,196.672,0.93,13.62,2.32
regnetz_b16,224,1024.0,3893.58,262.986,1.45,9.95,9.72
sedarknet21,256,1024.0,3874.2,264.302,3.93,7.47,20.95
resnext26ts,256,1024.0,3832.52,267.176,2.43,10.52,10.3
mobileone_s1,224,1024.0,3826.99,267.562,0.86,9.67,4.83
tf_efficientnetv2_b2,260,1024.0,3817.93,268.197,1.72,9.84,10.1
edgenext_small,256,1024.0,3770.23,271.588,1.26,9.07,5.59
convnext_pico,288,1024.0,3731.48,274.411,2.27,10.08,9.05
gernet_l,256,1024.0,3727.69,274.69,4.57,8.0,31.08
seresnext26ts,256,1024.0,3724.62,274.916,2.43,10.52,10.39
eca_resnext26ts,256,1024.0,3723.07,275.031,2.43,10.52,10.3
dpn48b,224,1024.0,3716.75,275.497,1.69,8.92,9.13
tf_efficientnet_lite2,260,1024.0,3695.32,277.096,0.89,12.9,6.09
gcresnext26ts,256,1024.0,3691.17,277.409,2.43,10.53,10.48
efficientnet_b2,256,1024.0,3671.26,278.912,0.89,12.81,9.11
nf_ecaresnet26,224,1024.0,3640.87,281.24,2.41,7.36,16.0
resnetblur18,288,1024.0,3639.91,281.314,3.87,5.6,11.69
nf_seresnet26,224,1024.0,3637.43,281.506,2.41,7.36,17.4
resnet101,160,1024.0,3616.15,283.164,4.0,8.28,44.55
vit_relpos_small_patch16_224,224,1024.0,3590.52,285.183,4.24,9.38,21.98
resnet26t,256,1024.0,3578.9,286.111,3.35,10.52,16.01
vit_srelpos_small_patch16_224,224,1024.0,3572.97,286.585,4.23,8.49,21.97
convnext_pico_ols,288,1024.0,3558.03,287.789,2.37,10.74,9.06
cs3darknet_focus_l,256,1024.0,3544.69,288.872,4.66,8.03,21.15
tf_efficientnetv2_b3,240,1024.0,3543.38,288.978,1.93,9.95,14.36
legacy_seresnext26_32x4d,224,1024.0,3516.72,291.169,2.49,9.39,16.79
pvt_v2_b1,224,1024.0,3507.87,291.903,2.04,14.01,14.01
repvit_m3,224,1024.0,3501.61,292.425,1.89,13.94,10.68
repvgg_a2,224,1024.0,3495.75,292.916,5.7,6.26,28.21
efficientnetv2_rw_t,224,1024.0,3486.59,293.686,1.93,9.94,13.65
ecaresnet101d_pruned,224,1024.0,3483.13,293.977,3.48,7.69,24.88
ese_vovnet19b_dw,288,1024.0,3478.51,294.369,2.22,13.63,6.54
mixnet_m,224,1024.0,3474.22,294.731,0.36,8.19,5.01
edgenext_small_rw,256,1024.0,3458.08,296.106,1.58,9.51,7.83
convnextv2_pico,224,1024.0,3458.0,296.113,1.37,6.1,9.07
gc_efficientnetv2_rw_t,224,1024.0,3445.15,297.218,1.94,9.97,13.68
cs3darknet_l,256,1024.0,3414.99,299.845,4.86,8.55,21.16
efficientnet_b3_pruned,300,1024.0,3412.19,300.09,1.04,11.86,9.86
nf_regnet_b1,288,1024.0,3373.08,303.57,1.02,9.2,10.22
tf_mixnet_m,224,1024.0,3353.29,305.361,0.36,8.19,5.01
convit_tiny,224,1024.0,3342.83,306.316,1.26,7.94,5.71
eca_botnext26ts_256,256,1024.0,3341.38,306.449,2.46,11.6,10.59
ecaresnext50t_32x4d,224,1024.0,3327.77,307.703,2.7,10.09,15.41
ecaresnext26t_32x4d,224,1024.0,3321.66,308.269,2.7,10.09,15.41
resnet34,288,1024.0,3320.08,308.416,6.07,6.18,21.8
seresnext26t_32x4d,224,1024.0,3319.26,308.491,2.7,10.09,16.81
vit_tiny_patch16_384,384,1024.0,3311.59,309.206,3.16,12.08,5.79
vit_base_patch32_plus_256,256,1024.0,3301.22,310.177,7.7,6.35,119.48
seresnext26d_32x4d,224,1024.0,3300.83,310.214,2.73,10.19,16.81
skresnet34,224,1024.0,3294.57,310.803,3.67,5.13,22.28
mobilevitv2_100,256,768.0,3290.58,233.384,1.84,16.08,4.9
vit_relpos_small_patch16_rpn_224,224,1024.0,3279.29,312.245,4.24,9.38,21.97
eca_halonext26ts,256,1024.0,3270.39,313.1,2.44,11.46,10.76
coatnet_pico_rw_224,224,1024.0,3250.74,314.993,1.96,12.91,10.85
rexnetr_200,224,768.0,3238.38,237.146,1.59,15.11,16.52
ecaresnet26t,256,1024.0,3228.23,317.19,3.35,10.53,16.01
ecaresnetlight,224,1024.0,3222.96,317.708,4.11,8.42,30.16
coatnext_nano_rw_224,224,1024.0,3218.47,318.153,2.36,10.68,14.7
cs3sedarknet_l,256,1024.0,3218.11,318.188,4.86,8.56,21.91
coat_lite_tiny,224,1024.0,3216.35,318.362,1.6,11.65,5.72
nf_regnet_b2,272,1024.0,3205.43,319.447,1.22,9.27,14.31
convnextv2_atto,288,1024.0,3199.9,319.999,0.91,6.3,3.71
vit_small_r26_s32_224,224,1024.0,3174.89,322.52,3.54,9.44,36.43
botnet26t_256,256,1024.0,3173.81,322.63,3.32,11.98,12.49
resnetv2_50,224,1024.0,3170.95,322.919,4.11,11.11,25.55
fastvit_t8,256,1024.0,3164.9,323.538,0.7,8.63,4.03
crossvit_small_240,240,1024.0,3164.86,323.541,5.09,11.34,26.86
bat_resnext26ts,256,1024.0,3139.26,326.18,2.53,12.51,10.73
seresnet34,288,1024.0,3136.77,326.439,6.07,6.18,21.96
halonet26t,256,1024.0,3132.55,326.879,3.19,11.69,12.48
lambda_resnet26t,256,1024.0,3123.88,327.786,3.02,11.87,10.96
rexnet_200,224,768.0,3120.89,246.073,1.56,14.91,16.37
vit_small_resnet26d_224,224,1024.0,3106.26,329.645,5.04,10.65,63.61
hrnet_w18_small_v2,224,1024.0,3095.42,330.8,2.62,9.65,15.6
mobileone_s2,224,1024.0,3085.91,331.82,1.34,11.55,7.88
vit_relpos_base_patch32_plus_rpn_256,256,1024.0,3081.88,332.247,7.59,6.63,119.42
tresnet_m,224,1024.0,3073.78,333.129,5.75,7.31,31.39
resnet32ts,256,1024.0,3072.91,333.224,4.63,11.58,17.96
coatnet_nano_cc_224,224,1024.0,3066.72,333.896,2.13,13.1,13.76
resnet101,176,1024.0,3047.24,336.031,4.92,10.08,44.55
resnet33ts,256,1024.0,3032.6,337.653,4.76,11.66,19.68
efficientvit_b2,224,1024.0,3030.14,337.927,1.6,14.62,24.33
resnet50,224,1024.0,3021.24,338.922,4.11,11.11,25.56
coat_lite_mini,224,1024.0,3021.22,338.925,2.0,12.25,11.01
resnet34d,288,1024.0,3013.98,339.739,6.47,7.51,21.82
cspresnet50,256,1024.0,3012.57,339.898,4.54,11.5,21.62
resnetv2_50t,224,1024.0,3011.73,339.991,4.32,11.82,25.57
dpn68b,224,1024.0,3008.58,340.347,2.35,10.47,12.61
coatnet_nano_rw_224,224,1024.0,3001.39,341.165,2.29,13.29,15.14
dpn68,224,1024.0,3001.33,341.17,2.35,10.47,12.61
resnetv2_50d,224,1024.0,2992.98,342.12,4.35,11.92,25.57
convnext_tiny,224,1024.0,2986.71,342.841,4.47,13.44,28.59
levit_512,224,1024.0,2974.0,344.305,5.64,10.22,95.17
dla60,224,1024.0,2959.44,345.999,4.26,10.16,22.04
fbnetv3_g,240,1024.0,2957.87,346.184,1.28,14.87,16.62
tf_efficientnet_b2,260,1024.0,2957.04,346.28,1.02,13.83,9.11
efficientnet_em,240,1024.0,2948.76,347.254,3.04,14.34,6.9
crossvit_15_240,240,1024.0,2948.65,347.266,5.17,12.01,27.53
eca_resnet33ts,256,1024.0,2945.18,347.676,4.76,11.66,19.68
seresnet33ts,256,1024.0,2940.4,348.24,4.76,11.66,19.78
regnetx_032,224,1024.0,2932.49,349.18,3.2,11.37,15.3
gcresnet33ts,256,1024.0,2919.42,350.744,4.76,11.68,19.88
mobileone_s0,224,1024.0,2911.68,351.675,1.09,15.48,5.29
resnet50t,224,1024.0,2893.61,353.872,4.32,11.82,25.57
resnet50c,224,1024.0,2893.38,353.9,4.35,11.92,25.58
repvit_m1_5,224,1024.0,2891.53,354.126,2.31,15.7,14.64
selecsls84,224,1024.0,2891.52,354.128,5.9,7.57,50.95
efficientnet_cc_b1_8e,240,1024.0,2883.89,355.064,0.75,15.44,39.72
haloregnetz_b,224,1024.0,2883.33,355.134,1.97,11.94,11.68
vgg11,224,1024.0,2881.16,355.4,7.61,7.44,132.86
resnet50d,224,1024.0,2872.03,356.53,4.35,11.92,25.58
resnest26d,224,1024.0,2863.53,357.59,3.64,9.97,17.07
tf_efficientnet_em,240,1024.0,2860.98,357.908,3.04,14.34,6.9
visformer_small,224,1024.0,2837.73,360.841,4.88,11.43,40.22
cspresnet50w,256,1024.0,2834.78,361.216,5.04,12.19,28.12
vovnet39a,224,1024.0,2834.5,361.252,7.09,6.73,22.6
wide_resnet50_2,176,1024.0,2833.12,361.428,7.29,8.97,68.88
cspresnet50d,256,1024.0,2828.94,361.963,4.86,12.55,21.64
resnet26,288,1024.0,2826.83,362.233,3.9,12.15,16.0
resnext26ts,288,1024.0,2826.2,362.312,3.07,13.31,10.3
efficientnet_b2,288,1024.0,2822.88,362.739,1.12,16.2,9.11
regnetv_040,224,1024.0,2785.35,367.627,4.0,12.29,20.64
levit_512d,224,1024.0,2784.75,367.707,5.85,11.3,92.5
levit_conv_512,224,1024.0,2781.3,368.162,5.64,10.22,95.17
deit3_medium_patch16_224,224,1024.0,2780.75,368.235,7.53,10.99,38.85
crossvit_15_dagger_240,240,1024.0,2776.34,368.82,5.5,12.68,28.21
regnety_040,224,1024.0,2768.62,369.849,4.0,12.29,20.65
legacy_seresnet50,224,1024.0,2766.98,370.066,3.88,10.6,28.09
eca_resnext26ts,288,1024.0,2756.51,371.473,3.07,13.32,10.3
seresnext26ts,288,1024.0,2751.54,372.144,3.07,13.32,10.39
regnety_032,224,1024.0,2744.75,373.065,3.2,11.26,19.44
convnext_tiny_hnf,224,1024.0,2744.61,373.082,4.47,13.44,28.59
convnextv2_femto,288,1024.0,2744.25,373.131,1.3,7.56,5.23
eca_vovnet39b,224,1024.0,2742.23,373.408,7.09,6.74,22.6
resnetv2_50x1_bit,224,1024.0,2741.57,373.497,4.23,11.11,25.55
gcresnext26ts,288,1024.0,2728.39,375.302,3.07,13.33,10.48
resnetaa50,224,1024.0,2728.16,375.334,5.15,11.64,25.56
densenet121,224,1024.0,2725.3,375.726,2.87,6.9,7.98
ese_vovnet39b,224,1024.0,2723.97,375.912,7.09,6.74,24.57
mixnet_l,224,1024.0,2712.93,377.44,0.58,10.84,7.33
tf_efficientnet_cc_b1_8e,240,1024.0,2710.75,377.745,0.75,15.44,39.72
mobilevit_s,256,768.0,2698.84,284.557,1.86,17.03,5.58
cs3darknet_focus_l,288,1024.0,2695.52,379.878,5.9,10.16,21.15
seresnet50,224,1024.0,2693.22,380.203,4.11,11.13,28.09
xcit_nano_12_p16_384,384,1024.0,2679.82,382.104,1.64,12.14,3.05
resnetaa34d,288,1024.0,2675.02,382.79,7.33,8.38,21.82
twins_svt_small,224,1024.0,2670.35,383.458,2.82,10.7,24.06
ecaresnet50d_pruned,288,1024.0,2662.19,384.634,4.19,10.61,19.94
convnext_nano,288,1024.0,2634.79,388.635,4.06,13.84,15.59
resnet50_gn,224,1024.0,2631.91,389.06,4.14,11.11,25.56
resnetv2_50d_gn,224,1024.0,2623.43,390.317,4.38,11.92,25.57
xcit_tiny_24_p16_224,224,1024.0,2616.39,391.368,2.34,11.82,12.12
tf_mixnet_l,224,1024.0,2615.89,391.443,0.58,10.84,7.33
res2net50_48w_2s,224,1024.0,2611.06,392.166,4.18,11.72,25.29
gcvit_xxtiny,224,1024.0,2608.34,392.574,2.14,15.36,12.0
cs3darknet_l,288,1024.0,2607.33,392.728,6.16,10.83,21.16
resnetaa50d,224,1024.0,2596.72,394.332,5.39,12.44,25.58
vgg11_bn,224,1024.0,2590.27,395.315,7.62,7.44,132.87
vit_base_resnet26d_224,224,1024.0,2580.41,396.822,6.93,12.34,101.4
vit_relpos_medium_patch16_cls_224,224,1024.0,2579.62,396.946,7.55,13.3,38.76
ecaresnet50t,224,1024.0,2579.62,396.946,4.32,11.83,25.57
coatnet_rmlp_nano_rw_224,224,1024.0,2579.38,396.984,2.51,18.21,15.15
davit_tiny,224,1024.0,2578.68,397.091,4.47,17.08,28.36
seresnet50t,224,1024.0,2574.91,397.672,4.32,11.83,28.1
resnet26d,288,1024.0,2569.96,398.438,4.29,13.48,16.01
mobilevitv2_125,256,768.0,2568.23,299.03,2.86,20.1,7.48
nf_regnet_b3,288,1024.0,2563.17,399.494,1.67,11.84,18.59
ecaresnet50d,224,1024.0,2560.76,399.87,4.35,11.93,25.58
levit_conv_512d,224,1024.0,2557.63,400.359,5.85,11.3,92.5
resnet152,160,1024.0,2531.48,404.495,5.9,11.51,60.19
efficientvit_b2,256,1024.0,2531.18,404.544,2.09,19.03,24.33
mobileone_s3,224,1024.0,2513.71,407.355,1.94,13.85,10.17
resnetrs50,224,1024.0,2512.05,407.624,4.48,12.14,35.69
twins_pcpvt_small,224,1024.0,2506.77,408.482,3.68,15.51,24.11
resnetblur50,224,1024.0,2495.43,410.338,5.16,12.02,25.56
poolformerv2_s12,224,1024.0,2489.38,411.337,1.83,5.53,11.89
convnextv2_nano,224,1024.0,2480.83,412.755,2.46,8.37,15.62
regnetx_040,224,1024.0,2478.03,413.222,3.99,12.2,22.12
eca_nfnet_l0,224,1024.0,2476.91,413.407,4.35,10.47,24.14
gcresnext50ts,256,1024.0,2473.39,413.995,3.75,15.46,15.67
nfnet_l0,224,1024.0,2472.84,414.088,4.36,10.47,35.07
tiny_vit_21m_224,224,1024.0,2468.7,414.781,4.08,15.96,33.22
cs3sedarknet_l,288,1024.0,2463.79,415.609,6.16,10.83,21.91
resnet50s,224,1024.0,2456.52,416.838,5.47,13.52,25.68
dla60x,224,1024.0,2437.95,420.012,3.54,13.8,17.35
densenetblur121d,224,1024.0,2433.6,420.765,3.11,7.9,8.0
edgenext_small,320,1024.0,2424.08,422.414,1.97,14.16,5.59
resnext50_32x4d,224,1024.0,2410.12,424.862,4.26,14.4,25.03
inception_next_tiny,224,1024.0,2404.04,425.937,4.19,11.98,28.06
convnext_nano_ols,288,1024.0,2397.01,427.188,4.38,15.5,15.65
vit_relpos_medium_patch16_224,224,1024.0,2394.54,427.629,7.5,12.13,38.75
efficientnet_lite3,300,512.0,2392.78,213.967,1.65,21.85,8.2
vit_srelpos_medium_patch16_224,224,1024.0,2386.54,429.062,7.49,11.32,38.74
regnetz_c16,256,1024.0,2383.36,429.635,2.51,16.57,13.46
resnetblur50d,224,1024.0,2382.64,429.765,5.4,12.82,25.58
vit_base_r26_s32_224,224,1024.0,2381.88,429.901,6.76,11.54,101.38
gcresnet50t,256,1024.0,2372.96,431.518,5.42,14.67,25.9
regnety_040_sgn,224,1024.0,2371.57,431.77,4.03,12.29,20.65
res2net50_26w_4s,224,1024.0,2359.62,433.957,4.28,12.61,25.7
vovnet57a,224,1024.0,2357.12,434.416,8.95,7.52,36.64
resmlp_24_224,224,1024.0,2350.19,435.697,5.96,10.91,30.02
maxvit_pico_rw_256,256,768.0,2346.84,327.238,1.68,18.77,7.46
inception_v3,299,1024.0,2346.46,436.391,5.73,8.97,23.83
maxvit_rmlp_pico_rw_256,256,768.0,2343.0,327.774,1.69,21.32,7.52
seresnetaa50d,224,1024.0,2333.21,438.87,5.4,12.46,28.11
focalnet_tiny_srf,224,1024.0,2331.81,439.132,4.42,16.32,28.43
cspresnext50,256,1024.0,2330.62,439.358,4.05,15.86,20.57
res2net50_14w_8s,224,1024.0,2327.89,439.871,4.21,13.28,25.06
dla60_res2net,224,1024.0,2327.26,439.99,4.15,12.34,20.85
coatnet_0_rw_224,224,1024.0,2319.62,441.438,4.23,15.1,27.44
regnetz_b16,288,1024.0,2318.51,441.651,2.39,16.43,9.72
gmixer_24_224,224,1024.0,2315.73,442.182,5.28,14.45,24.72
resnext50d_32x4d,224,1024.0,2305.65,444.116,4.5,15.2,25.05
lambda_resnet26rpt_256,256,768.0,2282.36,336.484,3.16,11.87,10.99
ese_vovnet57b,224,1024.0,2279.9,449.132,8.95,7.52,38.61
resnest50d_1s4x24d,224,1024.0,2278.75,449.357,4.43,13.57,25.68
dla60_res2next,224,1024.0,2268.77,451.333,3.49,13.17,17.03
sehalonet33ts,256,1024.0,2262.52,452.582,3.55,14.7,13.69
res2net50d,224,1024.0,2256.17,453.855,4.52,13.41,25.72
vit_medium_patch16_gap_240,240,1024.0,2253.27,454.439,8.6,12.57,44.4
res2next50,224,1024.0,2251.4,454.817,4.2,13.71,24.67
resnet32ts,288,1024.0,2244.87,456.139,5.86,14.65,17.96
edgenext_base,256,1024.0,2239.63,457.204,3.85,15.58,18.51
efficientvit_l1,224,1024.0,2235.54,458.043,5.27,15.85,52.65
skresnet50,224,1024.0,2226.66,459.87,4.11,12.5,25.8
nfnet_f0,192,1024.0,2226.44,459.916,7.21,10.16,71.49
tf_efficientnetv2_b3,300,1024.0,2226.35,459.935,3.04,15.74,14.36
efficientnetv2_rw_t,288,1024.0,2225.5,460.11,3.19,16.42,13.65
nf_ecaresnet50,224,1024.0,2219.3,461.395,4.21,11.13,25.56
darknetaa53,256,1024.0,2219.0,461.459,7.97,12.39,36.02
densenet169,224,1024.0,2218.3,461.604,3.4,7.3,14.15
nf_seresnet50,224,1024.0,2217.49,461.772,4.21,11.13,28.09
edgenext_small_rw,320,1024.0,2214.15,462.468,2.46,14.85,7.83
resnet33ts,288,1024.0,2214.09,462.482,6.02,14.75,19.68
xcit_small_12_p16_224,224,1024.0,2207.67,463.826,4.82,12.57,26.25
focalnet_tiny_lrf,224,1024.0,2205.41,464.301,4.49,17.76,28.65
resnet51q,256,1024.0,2195.84,466.325,6.38,16.55,35.7
repvgg_b1g4,224,1024.0,2195.75,466.344,8.15,10.64,39.97
seresnext50_32x4d,224,1024.0,2188.04,467.986,4.26,14.42,27.56
vit_relpos_medium_patch16_rpn_224,224,1024.0,2187.29,468.147,7.5,12.13,38.73
cs3darknet_focus_x,256,1024.0,2185.7,468.489,8.03,10.69,35.02
legacy_seresnext50_32x4d,224,1024.0,2184.4,468.766,4.26,14.42,27.56
tf_efficientnet_lite3,300,512.0,2178.27,235.039,1.65,21.85,8.2
resnet26t,320,1024.0,2173.03,471.22,5.24,16.44,16.01
gc_efficientnetv2_rw_t,288,1024.0,2170.84,471.696,3.2,16.45,13.68
gmlp_s16_224,224,1024.0,2161.42,473.752,4.42,15.1,19.42
seresnet33ts,288,1024.0,2156.33,474.868,6.02,14.76,19.78
eca_resnet33ts,288,1024.0,2152.27,475.765,6.02,14.76,19.68
fastvit_t12,256,1024.0,2151.9,475.846,1.42,12.42,7.55
nf_regnet_b3,320,1024.0,2148.66,476.564,2.05,14.61,18.59
eva02_small_patch14_224,224,1024.0,2144.78,477.426,5.53,12.34,21.62
resnet152,176,1024.0,2139.0,478.716,7.22,13.99,60.19
vit_medium_patch16_reg4_gap_256,256,1024.0,2137.51,479.051,9.93,14.51,38.87
gcresnet33ts,288,1024.0,2134.49,479.728,6.02,14.78,19.88
skresnet50d,224,1024.0,2133.34,479.986,4.36,13.31,25.82
ecaresnet101d_pruned,288,1024.0,2128.45,481.09,5.75,12.71,24.88
fbnetv3_g,288,1024.0,2127.74,481.25,1.77,21.09,16.62
vit_medium_patch16_reg4_256,256,1024.0,2119.83,483.047,9.97,14.56,38.87
eva02_tiny_patch14_336,336,1024.0,2106.54,486.094,3.14,13.85,5.76
convnextv2_pico,288,1024.0,2101.04,487.367,2.27,10.08,9.07
nf_resnet50,256,1024.0,2100.31,487.536,5.46,14.52,25.56
resnetrs101,192,1024.0,2100.21,487.558,6.04,12.7,63.62
poolformer_s24,224,1024.0,2099.97,487.615,3.41,10.68,21.39
pvt_v2_b2,224,1024.0,2099.92,487.626,3.9,24.96,25.36
efficientnet_b3,288,512.0,2089.91,244.977,1.63,21.49,12.23
cs3sedarknet_xdw,256,1024.0,2078.01,492.768,5.97,17.18,21.6
darknet53,256,1024.0,2077.03,493.0,9.31,12.39,41.61
ecaresnet50t,256,1024.0,2076.41,493.149,5.64,15.45,25.57
cs3darknet_x,256,1024.0,2060.02,497.071,8.38,11.35,35.05
xcit_nano_12_p8_224,224,1024.0,2059.06,497.302,2.16,15.71,3.05
mobilevitv2_150,256,512.0,2058.61,248.702,4.09,24.11,10.59
rexnetr_300,224,1024.0,2042.01,501.455,3.39,22.16,34.81
lambda_resnet50ts,256,1024.0,2041.61,501.552,5.07,17.48,21.54
fastvit_s12,256,1024.0,2028.81,504.718,1.82,13.67,9.47
coatnet_rmlp_0_rw_224,224,1024.0,2024.25,505.855,4.52,21.26,27.45
gcvit_xtiny,224,1024.0,2023.42,506.063,2.93,20.26,19.98
fastvit_sa12,256,1024.0,2022.28,506.347,1.96,13.83,11.58
crossvit_18_240,240,1024.0,2014.44,508.318,8.21,16.14,43.27
vit_medium_patch16_gap_256,256,1024.0,1996.45,512.899,9.78,14.29,38.86
resnet61q,256,1024.0,1996.22,512.958,7.8,17.01,36.85
coatnet_bn_0_rw_224,224,1024.0,1985.64,515.69,4.48,18.41,27.44
vit_base_patch32_384,384,1024.0,1984.44,516.005,12.67,12.14,88.3
vit_base_patch32_clip_384,384,1024.0,1981.44,516.784,12.67,12.14,88.3
cspdarknet53,256,1024.0,1981.04,516.888,6.57,16.81,27.64
sebotnet33ts_256,256,512.0,1977.98,258.841,3.89,17.46,13.7
ecaresnet26t,320,1024.0,1973.79,518.786,5.24,16.44,16.01
vit_base_resnet50d_224,224,1024.0,1971.35,519.428,8.68,16.1,110.97
cs3sedarknet_x,256,1024.0,1962.3,521.825,8.38,11.35,35.4
regnetx_080,224,1024.0,1962.04,521.894,8.02,14.06,39.57
seresnext26t_32x4d,288,1024.0,1950.77,524.91,4.46,16.68,16.81
mixnet_xl,224,1024.0,1948.29,525.576,0.93,14.57,11.9
resnest50d,224,1024.0,1945.36,526.368,5.4,14.36,27.48
seresnext26d_32x4d,288,1024.0,1940.04,527.813,4.51,16.85,16.81
coatnet_0_224,224,512.0,1939.29,264.004,4.43,21.14,25.04
swin_tiny_patch4_window7_224,224,1024.0,1938.74,528.165,4.51,17.06,28.29
resnetv2_101,224,1024.0,1935.15,529.146,7.83,16.23,44.54
regnetx_064,224,1024.0,1933.12,529.703,6.49,16.37,26.21
dla102,224,1024.0,1924.77,531.998,7.19,14.18,33.27
crossvit_18_dagger_240,240,1024.0,1921.19,532.991,8.65,16.91,44.27
rexnetr_200,288,512.0,1914.7,267.396,2.62,24.96,16.52
rexnet_300,224,1024.0,1911.46,535.706,3.44,22.4,34.71
nest_tiny,224,1024.0,1908.27,536.601,5.24,14.75,17.06
dm_nfnet_f0,192,1024.0,1907.3,536.873,7.21,10.16,71.49
ecaresnetlight,288,1024.0,1897.75,539.574,6.79,13.91,30.16
maxxvit_rmlp_nano_rw_256,256,768.0,1897.05,404.83,4.17,21.53,16.78
resnet101,224,1024.0,1885.15,543.183,7.83,16.23,44.55
nest_tiny_jx,224,1024.0,1884.26,543.437,5.24,14.75,17.06
pvt_v2_b2_li,224,1024.0,1882.78,543.863,3.77,25.04,22.55
vit_large_patch32_224,224,1024.0,1869.82,547.632,15.27,11.11,305.51
vgg13,224,1024.0,1868.34,548.068,11.31,12.25,133.05
resnetv2_101d,224,1024.0,1865.75,548.827,8.07,17.04,44.56
efficientformer_l3,224,1024.0,1865.63,548.865,3.93,12.01,31.41
resnetv2_50,288,1024.0,1863.99,549.347,6.79,18.37,25.55
mobileone_s4,224,1024.0,1856.33,551.615,3.04,17.74,14.95
res2net50_26w_6s,224,1024.0,1853.01,552.603,6.33,15.28,37.05
efficientvit_b2,288,1024.0,1851.14,553.16,2.64,24.03,24.33
lamhalobotnet50ts_256,256,1024.0,1841.89,555.938,5.02,18.44,22.57
maxvit_nano_rw_256,256,768.0,1833.65,418.827,4.26,25.76,15.45
maxvit_rmlp_nano_rw_256,256,768.0,1832.13,419.175,4.28,27.4,15.5
convnext_small,224,1024.0,1829.72,559.636,8.71,21.56,50.22
resnet101c,224,1024.0,1824.57,561.217,8.08,17.04,44.57
convnext_tiny,288,1024.0,1817.02,563.549,7.39,22.21,28.59
resnet101d,224,1024.0,1816.61,563.677,8.08,17.04,44.57
gcresnext50ts,288,1024.0,1802.21,568.181,4.75,19.57,15.67
efficientnetv2_s,288,1024.0,1800.9,568.595,4.75,20.13,21.46
pit_b_distilled_224,224,1024.0,1798.47,569.363,10.63,16.67,74.79
resnet50,288,1024.0,1790.94,571.757,6.8,18.37,25.56
twins_pcpvt_base,224,1024.0,1774.55,577.037,6.46,21.35,43.83
halonet50ts,256,1024.0,1772.89,577.576,5.3,19.2,22.73
dpn68b,288,1024.0,1770.85,578.24,3.89,17.3,12.61
pit_b_224,224,1024.0,1769.93,578.542,10.56,16.6,73.76
hrnet_w18_ssld,224,1024.0,1769.77,578.594,4.32,16.31,21.3
swin_s3_tiny_224,224,1024.0,1768.18,579.114,4.64,19.13,28.33
efficientvit_l2,224,1024.0,1765.89,579.866,6.97,19.58,63.71
hrnet_w18,224,1024.0,1763.75,580.57,4.32,16.31,21.3
coat_lite_small,224,1024.0,1746.27,586.38,3.96,22.09,19.84
repvgg_b1,224,1024.0,1745.5,586.64,13.16,10.64,57.42
wide_resnet50_2,224,1024.0,1744.59,586.947,11.43,14.4,68.88
efficientnet_b3,320,512.0,1740.17,294.213,2.01,26.52,12.23
gcresnet50t,288,1024.0,1734.6,590.328,6.86,18.57,25.9
densenet201,224,1024.0,1731.46,591.397,4.34,7.85,20.01
tresnet_v2_l,224,1024.0,1730.52,591.717,8.85,16.34,46.17
tf_efficientnet_b3,300,512.0,1724.68,296.856,1.87,23.83,12.23
efficientnetv2_rw_s,288,1024.0,1722.48,594.481,4.91,21.41,23.94
darknetaa53,288,1024.0,1719.51,595.509,10.08,15.68,36.02
maxxvitv2_nano_rw_256,256,768.0,1706.28,450.091,6.12,19.66,23.7
resnetaa101d,224,1024.0,1701.55,601.792,9.12,17.56,44.57
xcit_tiny_12_p16_384,384,1024.0,1700.55,602.144,3.64,18.25,6.72
cait_xxs24_224,224,1024.0,1698.66,602.815,2.53,20.29,11.96
resnet50t,288,1024.0,1694.77,604.2,7.14,19.53,25.57
legacy_seresnet101,224,1024.0,1693.62,604.611,7.61,15.74,49.33
cs3edgenet_x,256,1024.0,1692.79,604.907,11.53,12.92,47.82
resnet50d,288,1024.0,1684.01,608.061,7.19,19.7,25.58
mobilevitv2_175,256,512.0,1675.38,305.592,5.54,28.13,14.25
regnetv_064,224,1024.0,1674.09,611.663,6.39,16.41,30.58
resnetv2_101x1_bit,224,1024.0,1672.61,612.204,8.04,16.23,44.54
efficientnet_b3_gn,288,512.0,1669.75,306.623,1.74,23.35,11.73
ese_vovnet39b,288,768.0,1667.87,460.459,11.71,11.13,24.57
regnety_032,288,1024.0,1666.89,614.307,5.29,18.61,19.44
seresnet101,224,1024.0,1666.33,614.509,7.84,16.27,49.33
regnety_064,224,1024.0,1666.11,614.593,6.39,16.41,30.58
convnext_tiny_hnf,288,1024.0,1663.94,615.393,7.39,22.21,28.59
regnetv_040,288,1024.0,1658.56,617.391,6.6,20.3,20.64
regnety_040,288,1024.0,1648.75,621.064,6.61,20.3,20.65
regnety_080,224,1024.0,1645.74,622.202,8.0,17.97,39.18
resnet101s,224,1024.0,1640.53,624.176,9.19,18.64,44.67
mixer_b16_224,224,1024.0,1627.76,629.075,12.62,14.53,59.88
dla102x,224,1024.0,1623.56,630.698,5.89,19.42,26.31
nf_resnet101,224,1024.0,1622.48,631.12,8.01,16.23,44.55
swinv2_cr_tiny_224,224,1024.0,1621.28,631.59,4.66,28.45,28.33
ecaresnet101d,224,1024.0,1619.0,632.477,8.08,17.07,44.57
convnextv2_tiny,224,1024.0,1618.49,632.676,4.47,13.44,28.64
darknet53,288,1024.0,1615.64,633.795,11.78,15.68,41.61
wide_resnet101_2,176,1024.0,1615.25,633.945,14.31,13.18,126.89
repvit_m2_3,224,1024.0,1614.73,634.149,4.57,26.21,23.69
resnetaa50,288,1024.0,1610.23,635.923,8.52,19.24,25.56
resnetblur101d,224,1024.0,1609.76,636.109,9.12,17.94,44.57
efficientvit_b3,224,1024.0,1609.54,636.196,3.99,26.9,48.65
regnetz_d32,256,1024.0,1603.03,638.779,5.98,23.74,27.58
regnetz_b16_evos,224,1024.0,1602.47,639.001,1.43,9.95,9.74
ese_vovnet39b_evos,224,1024.0,1599.88,640.036,7.07,6.74,24.58
davit_small,224,1024.0,1599.81,640.066,8.69,27.54,49.75
seresnet50,288,1024.0,1595.89,641.637,6.8,18.39,28.09
cs3se_edgenet_x,256,1024.0,1593.53,642.587,11.53,12.94,50.72
nf_regnet_b4,320,1024.0,1592.57,642.975,3.29,19.88,30.21
swinv2_cr_tiny_ns_224,224,1024.0,1590.7,643.731,4.66,28.45,28.33
sequencer2d_s,224,1024.0,1586.65,645.372,4.96,11.31,27.65
tf_efficientnetv2_s,300,1024.0,1583.75,646.555,5.35,22.73,21.46
densenet121,288,1024.0,1581.16,647.615,4.74,11.41,7.98
resnet51q,288,1024.0,1581.05,647.659,8.07,20.94,35.7
regnetz_d8,256,1024.0,1580.57,647.855,3.97,23.74,23.37
resmlp_36_224,224,1024.0,1577.5,649.116,8.91,16.33,44.69
mixer_l32_224,224,1024.0,1577.26,649.215,11.27,19.86,206.94
regnetz_040,256,1024.0,1574.58,650.32,4.06,24.19,27.12
vit_base_patch16_224_miil,224,1024.0,1574.06,650.535,16.88,16.5,94.4
botnet50ts_256,256,512.0,1573.5,325.38,5.54,22.23,22.74
resnet50_gn,288,1024.0,1570.23,652.122,6.85,18.37,25.56
vit_base_patch16_clip_224,224,1024.0,1569.93,652.248,16.87,16.49,86.57
cs3darknet_x,288,1024.0,1569.68,652.352,10.6,14.36,35.05
deit_base_distilled_patch16_224,224,1024.0,1568.26,652.942,16.95,16.58,87.34
vit_base_patch16_224,224,1024.0,1568.03,653.038,16.87,16.49,86.57
deit_base_patch16_224,224,1024.0,1567.8,653.131,16.87,16.49,86.57
regnetz_040_h,256,1024.0,1564.2,654.638,4.12,24.29,28.94
resnetv2_50d_gn,288,1024.0,1555.81,658.164,7.24,19.7,25.57
resnetv2_50d_frn,224,1024.0,1553.07,659.326,4.33,11.92,25.59
tresnet_l,224,1024.0,1528.92,669.739,10.9,11.9,55.99
regnety_080_tv,224,1024.0,1528.54,669.91,8.51,19.73,39.38
resnetaa50d,288,1024.0,1524.48,671.692,8.92,20.57,25.58
nf_resnet50,288,1024.0,1524.41,671.724,6.88,18.37,25.56
caformer_s18,224,1024.0,1522.76,672.449,3.9,15.18,26.34
resnext101_32x8d,176,1024.0,1521.82,672.868,10.33,19.37,88.79
seresnet50t,288,1024.0,1518.59,674.299,7.14,19.55,28.1
ecaresnet50t,288,1024.0,1518.21,674.465,7.14,19.55,25.57
mvitv2_tiny,224,1024.0,1518.01,674.556,4.7,21.16,24.17
resnet101d,256,1024.0,1517.18,674.926,10.55,22.25,44.57
pvt_v2_b3,224,1024.0,1516.27,675.326,6.71,33.8,45.24
maxvit_tiny_rw_224,224,768.0,1513.7,507.357,4.93,28.54,29.06
ecaresnet50d,288,1024.0,1510.36,677.975,7.19,19.72,25.58
convnextv2_nano,288,768.0,1503.98,510.637,4.06,13.84,15.62
halo2botnet50ts_256,256,1024.0,1499.3,682.975,5.02,21.78,22.64
cs3sedarknet_x,288,1024.0,1498.9,683.158,10.6,14.37,35.4
res2net50_26w_8s,224,1024.0,1498.8,683.201,8.37,17.95,48.4
resnext101_32x4d,224,1024.0,1496.35,684.32,8.01,21.23,44.18
deit3_base_patch16_224,224,1024.0,1488.08,688.122,16.87,16.49,86.59
regnetz_c16,320,1024.0,1478.43,692.615,3.92,25.88,13.46
resnest50d_4s2x40d,224,1024.0,1478.06,692.785,4.4,17.94,30.42
resnetblur50,288,1024.0,1477.0,693.285,8.52,19.87,25.56
skresnext50_32x4d,224,1024.0,1470.18,696.502,4.5,17.18,27.48
efficientvit_l2,256,1024.0,1466.16,698.41,9.09,25.49,63.71
eca_nfnet_l0,288,1024.0,1463.28,699.787,7.12,17.29,24.14
mobilevitv2_200,256,768.0,1462.66,525.062,7.22,32.15,18.45
nfnet_l0,288,1024.0,1461.21,700.775,7.13,17.29,35.07
resnet61q,288,1024.0,1460.17,701.277,9.87,21.52,36.85
vit_base_patch32_clip_448,448,1024.0,1456.81,702.892,17.21,16.49,88.34
vit_small_patch16_36x1_224,224,1024.0,1454.45,704.036,12.63,24.59,64.67
vit_small_resnet50d_s16_224,224,1024.0,1451.55,705.439,13.0,21.12,57.53
beit_base_patch16_224,224,1024.0,1443.54,709.354,16.87,16.49,86.53
res2net101_26w_4s,224,1024.0,1442.54,709.848,8.1,18.45,45.21
vit_base_patch16_siglip_224,224,1024.0,1439.5,711.343,17.02,16.71,92.88
vit_base_patch16_gap_224,224,1024.0,1436.45,712.857,16.78,16.41,86.57
regnety_040_sgn,288,1024.0,1436.16,712.999,6.67,20.3,20.65
beitv2_base_patch16_224,224,1024.0,1436.01,713.075,16.87,16.49,86.53
convit_small,224,1024.0,1431.38,715.383,5.76,17.87,27.78
edgenext_base,320,1024.0,1423.6,719.289,6.01,24.32,18.51
convformer_s18,224,1024.0,1421.81,720.197,3.96,15.82,26.77
focalnet_small_srf,224,1024.0,1419.82,721.204,8.62,26.26,49.89
densenetblur121d,288,1024.0,1416.47,722.914,5.14,13.06,8.0
poolformer_s36,224,1024.0,1415.39,723.463,5.0,15.82,30.86
resnetv2_50d_evos,224,1024.0,1415.09,723.614,4.33,11.92,25.59
coatnet_rmlp_1_rw_224,224,1024.0,1413.05,724.664,7.44,28.08,41.69
res2net101d,224,1024.0,1406.68,727.943,8.35,19.25,45.23
legacy_xception,299,1024.0,1405.99,728.302,8.4,35.83,22.86
vit_small_patch16_18x2_224,224,1024.0,1405.24,728.689,12.63,24.59,64.67
resnetblur50d,288,1024.0,1403.3,729.695,8.92,21.19,25.58
resnext50_32x4d,288,1024.0,1402.5,730.115,7.04,23.81,25.03
inception_next_small,224,1024.0,1397.1,732.931,8.36,19.27,49.37
repvgg_b2g4,224,1024.0,1392.83,735.183,12.63,12.9,61.76
gcvit_tiny,224,1024.0,1390.57,736.376,4.79,29.82,28.22
vit_relpos_base_patch16_clsgap_224,224,1024.0,1386.7,738.433,16.88,17.72,86.43
vit_base_patch16_clip_quickgelu_224,224,1024.0,1384.47,739.621,16.87,16.49,86.19
vit_relpos_base_patch16_cls_224,224,1024.0,1384.18,739.775,16.88,17.72,86.43
dpn92,224,1024.0,1380.04,741.995,6.54,18.21,37.67
seresnetaa50d,288,1024.0,1379.8,742.125,8.92,20.59,28.11
vit_small_patch16_384,384,1024.0,1379.23,742.429,12.45,24.15,22.2
nf_ecaresnet101,224,1024.0,1375.27,744.569,8.01,16.27,44.55
nf_seresnet101,224,1024.0,1370.83,746.983,8.02,16.27,49.33
efficientnet_b3_gn,320,384.0,1366.12,281.077,2.14,28.83,11.73
vgg16_bn,224,1024.0,1361.56,752.067,15.5,13.56,138.37
flexivit_base,240,1024.0,1360.19,752.822,19.35,18.92,86.59
efficientformerv2_s0,224,1024.0,1357.83,754.133,0.41,5.3,3.6
resnetv2_152,224,1024.0,1356.74,754.735,11.55,22.56,60.19
seresnext101_32x4d,224,1024.0,1356.08,755.105,8.02,21.26,48.96
legacy_seresnext101_32x4d,224,1024.0,1355.29,755.543,8.02,21.26,48.96
efficientnet_b3_g8_gn,288,768.0,1342.01,572.264,2.59,23.35,14.25
efficientvit_b3,256,768.0,1340.35,572.972,5.2,35.01,48.65
efficientnet_b4,320,512.0,1338.46,382.52,3.13,34.76,19.34
nfnet_f0,256,1024.0,1336.25,766.311,12.62,18.05,71.49
resnext50d_32x4d,288,1024.0,1335.71,766.62,7.44,25.13,25.05
focalnet_small_lrf,224,1024.0,1333.55,767.863,8.74,28.61,50.34
resnet152,224,1024.0,1331.42,769.094,11.56,22.56,60.19
ese_vovnet99b,224,1024.0,1328.91,770.544,16.51,11.27,63.2
resnetv2_152d,224,1024.0,1322.45,774.307,11.8,23.36,60.2
regnetx_120,224,1024.0,1317.68,777.11,12.13,21.37,46.11
hrnet_w32,224,1024.0,1308.75,782.414,8.97,22.02,41.23
xception41p,299,512.0,1308.08,391.403,9.25,39.86,26.91
vit_relpos_base_patch16_224,224,1024.0,1306.59,783.71,16.8,17.63,86.43
xcit_tiny_12_p8_224,224,1024.0,1306.3,783.883,4.81,23.6,6.71
coatnet_1_rw_224,224,1024.0,1303.02,785.857,7.63,27.22,41.72
resnet152c,224,1024.0,1301.97,786.489,11.8,23.36,60.21
coatnet_rmlp_1_rw2_224,224,1024.0,1300.63,787.299,7.71,32.74,41.72
twins_pcpvt_large,224,1024.0,1297.56,789.162,9.53,30.21,60.99
maxvit_tiny_tf_224,224,768.0,1297.26,592.007,5.42,31.21,30.92
resnet152d,224,1024.0,1296.94,789.538,11.8,23.36,60.21
cs3edgenet_x,288,1024.0,1296.8,789.626,14.59,16.36,47.82
vit_base_patch16_xp_224,224,1024.0,1295.7,790.295,16.85,16.49,86.51
poolformerv2_s24,224,1024.0,1287.82,795.129,3.42,10.68,21.34
dla169,224,1024.0,1280.41,799.732,11.6,20.2,53.39
efficientnet_el_pruned,300,1024.0,1280.32,799.789,8.0,30.7,10.59
efficientnet_el,300,1024.0,1279.02,800.603,8.0,30.7,10.59
seresnext50_32x4d,288,1024.0,1276.82,801.978,7.04,23.82,27.56
hrnet_w30,224,1024.0,1276.63,802.098,8.15,21.21,37.71
deit3_small_patch16_384,384,1024.0,1274.41,803.494,12.45,24.15,22.21
ecaresnet50t,320,1024.0,1274.01,803.751,8.82,24.13,25.57
maxxvit_rmlp_tiny_rw_256,256,768.0,1269.37,605.011,6.36,32.69,29.64
volo_d1_224,224,1024.0,1269.05,806.894,6.94,24.43,26.63
vgg19,224,1024.0,1264.63,809.714,19.63,14.86,143.67
convnext_base,224,1024.0,1259.04,813.306,15.38,28.75,88.59
rexnetr_300,288,512.0,1257.05,407.293,5.59,36.61,34.81
vit_base_patch16_rpn_224,224,1024.0,1255.24,815.771,16.78,16.41,86.54
densenet161,224,1024.0,1254.96,815.95,7.79,11.06,28.68
efficientformerv2_s1,224,1024.0,1251.09,818.477,0.67,7.66,6.19
regnety_120,224,1024.0,1250.69,818.739,12.14,21.38,51.82
twins_svt_base,224,1024.0,1249.89,819.258,8.36,20.42,56.07
tf_efficientnet_el,300,1024.0,1249.79,819.323,8.0,30.7,10.59
sequencer2d_m,224,1024.0,1238.3,826.927,6.55,14.26,38.31
nest_small,224,1024.0,1229.99,832.512,9.41,22.88,38.35
maxvit_tiny_rw_256,256,768.0,1229.06,624.855,6.44,37.27,29.07
maxvit_rmlp_tiny_rw_256,256,768.0,1228.3,625.245,6.47,39.84,29.15
repvgg_b2,224,1024.0,1219.54,839.651,20.45,12.9,89.02
nest_small_jx,224,1024.0,1219.36,839.775,9.41,22.88,38.35
mixnet_xxl,224,768.0,1211.88,633.716,2.04,23.43,23.96
resnet152s,224,1024.0,1205.05,849.747,12.92,24.96,60.32
swin_small_patch4_window7_224,224,1024.0,1202.25,851.724,8.77,27.47,49.61
inception_v4,299,1024.0,1191.21,859.617,12.28,15.09,42.68
swinv2_tiny_window8_256,256,1024.0,1191.2,859.622,5.96,24.57,28.35
legacy_seresnet152,224,1024.0,1187.19,862.527,11.33,22.08,66.82
coatnet_1_224,224,512.0,1184.08,432.392,8.28,31.3,42.23
xcit_small_24_p16_224,224,1024.0,1178.16,869.138,9.1,23.63,47.67
vit_relpos_base_patch16_rpn_224,224,1024.0,1177.44,869.665,16.8,17.63,86.41
eca_nfnet_l1,256,1024.0,1175.13,871.38,9.62,22.04,41.41
seresnet152,224,1024.0,1173.43,872.64,11.57,22.61,66.82
maxvit_tiny_pm_256,256,768.0,1169.83,656.496,6.31,40.82,30.09
crossvit_base_240,240,1024.0,1165.77,878.374,20.13,22.67,105.03
efficientnet_lite4,380,384.0,1155.38,332.349,4.04,45.66,13.01
xception41,299,512.0,1153.48,443.864,9.28,39.86,26.97
regnetx_160,224,1024.0,1153.37,887.82,15.99,25.52,54.28
vgg19_bn,224,1024.0,1151.34,889.391,19.66,14.86,143.68
cait_xxs36_224,224,1024.0,1139.1,898.942,3.77,30.34,17.3
tresnet_xl,224,1024.0,1138.98,899.04,15.2,15.34,78.44
tnt_s_patch16_224,224,1024.0,1134.46,902.62,5.24,24.37,23.76
davit_base,224,1024.0,1133.31,903.534,15.36,36.72,87.95
dm_nfnet_f0,256,1024.0,1132.28,904.361,12.62,18.05,71.49
resnetv2_101,288,1024.0,1131.44,905.029,12.94,26.83,44.54
mvitv2_small_cls,224,1024.0,1129.19,906.833,7.04,28.17,34.87
mvitv2_small,224,1024.0,1128.19,907.64,7.0,28.08,34.87
coat_tiny,224,1024.0,1126.07,909.345,4.35,27.2,5.5
convmixer_1024_20_ks9_p14,224,1024.0,1123.31,911.577,5.55,5.51,24.38
vit_base_patch16_reg8_gap_256,256,1024.0,1115.77,917.744,22.6,22.09,86.62
fastvit_sa24,256,1024.0,1114.43,918.841,3.79,23.92,21.55
repvgg_b3g4,224,1024.0,1113.37,919.717,17.89,15.1,83.83
convnext_small,288,1024.0,1110.94,921.731,14.39,35.65,50.22
vit_base_patch16_siglip_256,256,1024.0,1108.01,924.168,22.23,21.83,92.93
resnet101,288,1024.0,1104.31,927.267,12.95,26.83,44.55
dla102x2,224,1024.0,1104.21,927.342,9.34,29.91,41.28
pvt_v2_b4,224,1024.0,1101.67,929.481,9.83,48.14,62.56
vit_large_r50_s32_224,224,1024.0,1091.33,938.289,19.45,22.22,328.99
eva02_base_patch16_clip_224,224,1024.0,1090.31,939.167,16.9,18.91,86.26
vgg13_bn,224,1024.0,1090.15,939.306,11.33,12.25,133.05
resnet152d,256,1024.0,1089.57,939.806,15.41,30.51,60.21
nf_regnet_b4,384,1024.0,1089.51,939.86,4.7,28.61,30.21
efficientnet_b3_g8_gn,320,768.0,1085.43,707.541,3.2,28.83,14.25
vit_small_r26_s32_384,384,1024.0,1083.82,944.797,10.24,27.67,36.47
efficientvit_l2,288,1024.0,1083.69,944.906,11.51,32.19,63.71
efficientnetv2_s,384,1024.0,1081.44,946.869,8.44,35.77,21.46
tf_efficientnet_lite4,380,384.0,1073.72,357.628,4.04,45.66,13.01
pvt_v2_b5,224,1024.0,1068.28,958.536,11.39,44.23,81.96
hrnet_w18_ssld,288,1024.0,1066.01,960.575,7.14,26.96,21.3
tf_efficientnetv2_s,384,1024.0,1054.1,971.431,8.44,35.77,21.46
regnety_160,224,1024.0,1046.76,978.242,15.96,23.04,83.59
samvit_base_patch16_224,224,1024.0,1027.37,996.713,16.83,17.2,86.46
convnext_tiny,384,768.0,1026.31,748.299,13.14,39.48,28.59
wide_resnet50_2,288,1024.0,1025.91,998.129,18.89,23.81,68.88
efficientnetv2_rw_s,384,1024.0,1024.66,999.343,8.72,38.03,23.94
vgg16,224,1024.0,1020.44,1003.475,15.47,13.56,138.36
cs3se_edgenet_x,320,1024.0,1009.45,1014.397,18.01,20.21,50.72
vit_base_patch16_plus_240,240,1024.0,1002.7,1021.234,26.31,22.07,117.56
swinv2_cr_small_224,224,1024.0,1001.72,1022.232,9.07,50.27,49.7
dpn98,224,1024.0,998.61,1025.406,11.73,25.2,61.57
efficientvit_b3,288,768.0,996.43,770.744,6.58,44.2,48.65
resnetaa101d,288,1024.0,996.18,1027.911,15.07,29.03,44.57
wide_resnet101_2,224,1024.0,994.0,1030.164,22.8,21.23,126.89
regnetz_d32,320,1024.0,994.0,1030.165,9.33,37.08,27.58
swinv2_cr_small_ns_224,224,1024.0,991.13,1033.149,9.08,50.27,49.7
focalnet_base_srf,224,1024.0,990.91,1033.385,15.28,35.01,88.15
convnextv2_small,224,1024.0,989.67,1034.674,8.71,21.56,50.32
resnet200,224,1024.0,987.28,1037.18,15.07,32.19,64.67
convnextv2_tiny,288,768.0,983.87,780.578,7.39,22.21,28.64
seresnet101,288,1024.0,983.64,1041.016,12.95,26.87,49.33
vit_small_patch8_224,224,1024.0,981.8,1042.968,16.76,32.86,21.67
regnetz_d8,320,1024.0,980.9,1043.922,6.19,37.08,23.37
regnety_080,288,1024.0,977.86,1047.177,13.22,29.69,39.18
inception_next_base,224,1024.0,977.1,1047.988,14.85,25.69,86.67
vit_base_r50_s16_224,224,1024.0,974.47,1050.816,20.94,27.88,97.89
resnest101e,256,1024.0,968.0,1057.838,13.38,28.66,48.28
convnext_base,256,1024.0,965.93,1060.101,20.09,37.55,88.59
regnetz_c16_evos,256,768.0,965.5,795.429,2.48,16.57,13.49
regnetz_040,320,512.0,964.02,531.096,6.35,37.78,27.12
poolformer_m36,224,1024.0,963.9,1062.337,8.8,22.02,56.17
regnetz_b16_evos,288,768.0,961.28,798.923,2.36,16.43,9.74
inception_resnet_v2,299,1024.0,958.82,1067.962,13.18,25.06,55.84
regnetz_040_h,320,512.0,958.46,534.182,6.43,37.94,28.94
seresnet152d,256,1024.0,956.44,1070.629,15.42,30.56,66.84
ecaresnet101d,288,1024.0,951.62,1076.05,13.35,28.19,44.57
regnety_064,288,1024.0,949.24,1078.741,10.56,27.11,30.58
resnetrs152,256,1024.0,948.32,1079.798,15.59,30.83,86.62
resnext101_64x4d,224,1024.0,947.79,1080.397,15.52,31.21,83.46
regnetv_064,288,1024.0,947.23,1081.038,10.55,27.11,30.58
xception65p,299,512.0,944.43,542.118,13.91,52.48,39.82
resnetblur101d,288,1024.0,942.52,1086.438,15.07,29.65,44.57
resnetrs101,288,1024.0,941.79,1087.277,13.56,28.53,63.62
focalnet_base_lrf,224,1024.0,941.31,1087.831,15.43,38.13,88.75
resnext101_32x8d,224,1024.0,939.44,1090.002,16.48,31.21,88.79
repvgg_b3,224,1024.0,933.91,1096.448,29.16,15.1,123.09
hrnet_w40,224,1024.0,931.96,1098.75,12.75,25.29,57.56
nfnet_f1,224,1024.0,924.88,1107.159,17.87,22.94,132.63
eva02_small_patch14_336,336,1024.0,923.99,1108.223,12.41,27.7,22.13
resnet101d,320,1024.0,923.18,1109.193,16.48,34.77,44.57
xcit_tiny_24_p16_384,384,1024.0,910.96,1124.082,6.87,34.29,12.12
efficientnet_b4,384,384.0,908.88,422.486,4.51,50.04,19.34
cait_s24_224,224,1024.0,904.24,1132.424,9.35,40.58,46.92
mobilevitv2_150,384,256.0,899.17,284.697,9.2,54.25,10.59
maxvit_rmlp_small_rw_224,224,768.0,898.81,854.449,10.48,42.44,64.9
coat_mini,224,1024.0,894.78,1144.406,6.82,33.68,10.34
coat_lite_medium,224,1024.0,892.4,1147.459,9.81,40.06,44.57
efficientnetv2_m,320,1024.0,889.26,1151.505,11.01,39.97,54.14
seresnext101_64x4d,224,1024.0,888.73,1152.196,15.53,31.25,88.23
gmlp_b16_224,224,1024.0,884.5,1157.706,15.78,30.21,73.08
seresnext101_32x8d,224,1024.0,883.56,1158.934,16.48,31.25,93.57
swin_s3_small_224,224,768.0,879.87,872.841,9.43,37.84,49.74
vit_relpos_base_patch16_plus_240,240,1024.0,875.04,1170.215,26.21,23.41,117.38
efficientformer_l7,224,1024.0,873.11,1172.808,10.17,24.45,82.23
nest_base,224,1024.0,870.02,1176.974,16.71,30.51,67.72
poolformerv2_s36,224,1024.0,869.16,1178.141,5.01,15.82,30.79
maxvit_small_tf_224,224,512.0,868.0,589.85,11.39,46.31,68.93
seresnext101d_32x8d,224,1024.0,866.35,1181.949,16.72,32.05,93.59
nest_base_jx,224,1024.0,862.67,1187.001,16.71,30.51,67.72
levit_384_s8,224,512.0,854.68,599.045,9.98,35.86,39.12
regnetz_e8,256,1024.0,853.36,1199.952,9.91,40.94,57.7
swin_base_patch4_window7_224,224,1024.0,852.78,1200.762,15.47,36.63,87.77
coatnet_2_rw_224,224,512.0,852.23,600.767,14.55,39.37,73.87
tf_efficientnet_b4,380,384.0,851.5,450.956,4.49,49.49,19.34
gcvit_small,224,1024.0,841.82,1216.401,8.57,41.61,51.09
convnextv2_nano,384,512.0,841.68,608.3,7.22,24.61,15.62
resnetv2_50d_evos,288,1024.0,840.21,1218.735,7.15,19.7,25.59
levit_conv_384_s8,224,512.0,839.77,609.68,9.98,35.86,39.12
xception65,299,512.0,839.39,609.953,13.96,52.48,39.92
hrnet_w44,224,1024.0,835.38,1225.779,14.94,26.92,67.06
crossvit_15_dagger_408,408,1024.0,833.7,1228.252,16.07,37.0,28.5
tiny_vit_21m_384,384,512.0,827.46,618.747,11.94,46.84,21.23
twins_svt_large,224,1024.0,824.23,1242.353,14.84,27.23,99.27
seresnextaa101d_32x8d,224,1024.0,820.77,1247.602,17.25,34.16,93.59
xcit_medium_24_p16_224,224,1024.0,820.51,1247.988,16.13,31.71,84.4
eva02_base_patch14_224,224,1024.0,819.51,1249.51,22.0,24.67,85.76
coatnet_rmlp_2_rw_224,224,512.0,814.13,628.885,14.64,44.94,73.88
hrnet_w48_ssld,224,1024.0,812.33,1260.551,17.34,28.56,77.47
hrnet_w48,224,1024.0,811.26,1262.228,17.34,28.56,77.47
caformer_s36,224,1024.0,810.13,1263.986,7.55,29.29,39.3
tresnet_m,448,1024.0,809.9,1264.343,22.99,29.21,31.39
resnet200d,256,1024.0,803.17,1274.938,20.0,43.09,64.69
sequencer2d_l,224,1024.0,802.78,1275.557,9.74,22.12,54.3
maxxvit_rmlp_small_rw_256,256,768.0,801.57,958.106,14.21,47.76,66.01
swinv2_base_window12_192,192,1024.0,799.54,1280.724,11.9,39.72,109.28
dm_nfnet_f1,224,1024.0,798.67,1282.118,17.87,22.94,132.63
coatnet_2_224,224,512.0,796.89,642.486,15.94,42.41,74.68
vit_medium_patch16_gap_384,384,1024.0,795.07,1287.922,22.01,32.15,39.03
mvitv2_base_cls,224,1024.0,791.15,1294.298,10.23,40.65,65.44
mvitv2_base,224,1024.0,785.87,1303.007,10.16,40.5,51.47
efficientnetv2_rw_m,320,1024.0,785.27,1303.997,12.72,47.14,53.24
resnet152,288,1024.0,781.77,1309.827,19.11,37.28,60.19
swinv2_tiny_window16_256,256,512.0,775.64,660.087,6.68,39.02,28.35
fastvit_sa36,256,1024.0,768.44,1332.545,5.62,34.02,31.53
xcit_small_12_p16_384,384,1024.0,764.7,1339.074,14.14,36.5,26.25
convnext_base,288,1024.0,763.36,1341.427,25.43,47.53,88.59
convformer_s36,224,1024.0,754.92,1356.424,7.67,30.5,40.01
regnety_120,288,768.0,738.36,1040.13,20.06,35.34,51.82
swinv2_small_window8_256,256,1024.0,737.99,1387.548,11.58,40.14,49.73
dpn131,224,1024.0,732.6,1397.744,16.09,32.97,79.25
swinv2_cr_small_ns_256,256,1024.0,731.79,1399.291,12.07,76.21,49.7
mobilevitv2_175,384,256.0,731.75,349.838,12.47,63.29,14.25
convit_base,224,1024.0,730.43,1401.91,17.52,31.77,86.54
resnetv2_50x1_bit,448,512.0,729.61,701.734,16.62,44.46,25.55
poolformer_m48,224,1024.0,727.01,1408.491,11.59,29.17,73.47
maxvit_rmlp_small_rw_256,256,768.0,724.69,1059.745,13.69,55.48,64.9
tnt_b_patch16_224,224,1024.0,721.67,1418.912,14.09,39.01,65.41
eca_nfnet_l1,320,1024.0,720.22,1421.77,14.92,34.42,41.41
swinv2_cr_base_224,224,1024.0,716.89,1428.383,15.86,59.66,87.88
swin_s3_base_224,224,1024.0,715.81,1430.534,13.69,48.26,71.13
volo_d2_224,224,1024.0,711.4,1439.408,14.34,41.34,58.68
swinv2_cr_base_ns_224,224,1024.0,711.07,1440.068,15.86,59.66,87.88
convnextv2_base,224,768.0,708.71,1083.64,15.38,28.75,88.72
densenet264d,224,1024.0,697.85,1467.348,13.57,14.0,72.74
ecaresnet200d,256,1024.0,697.3,1468.506,20.0,43.15,64.69
seresnet200d,256,1024.0,696.92,1469.301,20.01,43.15,71.86
nf_regnet_b5,384,1024.0,694.76,1473.879,7.95,42.9,49.74
seresnet152,288,1024.0,693.47,1476.616,19.11,37.34,66.82
resnetrs200,256,1024.0,693.26,1477.057,20.18,43.42,93.21
coat_small,224,1024.0,689.68,1484.732,12.61,44.25,21.69
convnext_large,224,1024.0,686.69,1491.207,34.4,43.13,197.77
xcit_tiny_24_p8_224,224,1024.0,684.2,1496.615,9.21,45.38,12.11
efficientvit_l3,224,1024.0,667.4,1534.307,27.62,39.16,246.04
dpn107,224,1024.0,666.43,1536.527,18.38,33.46,86.92
resnet152d,320,1024.0,664.6,1540.768,24.08,47.67,60.21
senet154,224,1024.0,664.59,1540.791,20.77,38.69,115.09
legacy_senet154,224,1024.0,663.62,1543.045,20.77,38.69,115.09
efficientformerv2_s2,224,1024.0,658.11,1555.962,1.27,11.77,12.71
maxxvitv2_rmlp_base_rw_224,224,768.0,650.48,1180.654,23.88,54.39,116.09
xcit_nano_12_p8_384,384,1024.0,649.92,1575.56,6.34,46.06,3.05
xception71,299,512.0,649.47,788.325,18.09,69.92,42.34
vit_large_patch32_384,384,1024.0,643.51,1591.268,44.28,32.22,306.63
mobilevitv2_200,384,256.0,640.82,399.48,16.24,72.34,18.45
davit_large,224,1024.0,630.01,1625.361,34.37,55.08,196.81
hrnet_w64,224,1024.0,629.26,1627.299,28.97,35.09,128.06
convnext_small,384,768.0,628.81,1221.341,25.58,63.37,50.22
regnetz_d8_evos,256,1024.0,626.83,1633.604,4.5,24.92,23.46
regnety_160,288,768.0,626.54,1225.759,26.37,38.07,83.59
convnext_base,320,768.0,617.04,1244.641,31.39,58.68,88.59
fastvit_ma36,256,1024.0,615.75,1662.995,7.85,40.39,44.07
tf_efficientnetv2_m,384,1024.0,614.24,1667.09,15.85,57.52,54.14
gcvit_base,224,1024.0,612.92,1670.669,14.87,55.48,90.32
regnety_320,224,1024.0,612.34,1672.272,32.34,30.26,145.05
efficientvit_l2,384,768.0,610.03,1258.949,20.45,57.01,63.71
poolformerv2_m36,224,1024.0,609.2,1680.886,8.81,22.02,56.08
regnetz_c16_evos,320,512.0,608.23,841.78,3.86,25.88,13.49
resnetv2_50x3_bit,224,768.0,585.49,1311.719,37.06,33.34,217.32
seresnet152d,320,1024.0,585.32,1749.453,24.09,47.72,66.84
xcit_small_12_p8_224,224,1024.0,584.75,1751.159,18.69,47.19,26.21
resnet200,288,1024.0,584.49,1751.952,24.91,53.21,64.67
resnetrs152,320,1024.0,580.71,1763.336,24.34,48.14,86.62
caformer_m36,224,1024.0,580.7,1763.373,12.75,40.61,56.2
resnext101_64x4d,288,1024.0,579.65,1766.578,25.66,51.59,83.46
levit_conv_512_s8,224,256.0,579.33,441.879,21.82,52.28,74.05
crossvit_18_dagger_408,408,1024.0,578.67,1769.56,25.31,49.38,44.61
levit_512_s8,224,256.0,564.15,453.77,21.82,52.28,74.05
convnextv2_tiny,384,384.0,553.95,693.189,13.14,39.48,28.64
convformer_m36,224,1024.0,546.86,1872.507,12.89,42.05,57.05
efficientnet_b5,416,256.0,546.68,468.268,8.27,80.68,30.39
seresnet269d,256,1024.0,545.35,1877.679,26.59,53.6,113.67
efficientvit_l3,256,768.0,542.99,1414.373,36.06,50.98,246.04
seresnext101_32x8d,288,1024.0,537.9,1903.669,27.24,51.63,93.57
efficientnetv2_m,416,1024.0,531.24,1927.549,18.6,67.5,54.14
resnetrs270,256,1024.0,529.33,1934.515,27.06,55.84,129.86
maxvit_rmlp_base_rw_224,224,768.0,529.1,1451.502,22.63,79.3,116.14
swinv2_base_window8_256,256,1024.0,528.71,1936.775,20.37,52.59,87.92
regnetz_e8,320,768.0,528.46,1453.264,15.46,63.94,57.7
seresnext101d_32x8d,288,1024.0,527.36,1941.726,27.64,52.95,93.59
convnext_large_mlp,256,768.0,525.72,1460.834,44.94,56.33,200.13
nfnet_f2,256,1024.0,524.14,1953.657,33.76,41.85,193.78
halonet_h1,256,256.0,522.84,489.621,3.0,51.17,8.1
regnetx_320,224,1024.0,522.6,1959.408,31.81,36.3,107.81
mixer_l16_224,224,1024.0,520.22,1968.376,44.6,41.69,208.2
resnext101_32x16d,224,1024.0,519.8,1969.975,36.27,51.18,194.03
eca_nfnet_l2,320,1024.0,509.51,2009.758,20.95,47.43,56.72
ecaresnet200d,288,1024.0,503.74,2032.793,25.31,54.59,64.69
seresnet200d,288,1024.0,503.36,2034.329,25.32,54.6,71.86
caformer_s18,384,512.0,501.38,1021.162,11.45,44.61,26.34
volo_d3_224,224,1024.0,497.87,2056.757,20.78,60.09,86.33
resnet200d,320,1024.0,493.82,2073.621,31.25,67.33,64.69
swin_large_patch4_window7_224,224,768.0,492.35,1559.852,34.53,54.94,196.53
vit_base_patch16_18x2_224,224,1024.0,492.32,2079.918,50.37,49.17,256.73
deit_base_patch16_384,384,1024.0,491.82,2082.046,49.4,48.3,86.86
vit_base_patch16_clip_384,384,1024.0,491.74,2082.405,49.41,48.3,86.86
vit_base_patch16_384,384,1024.0,491.42,2083.727,49.4,48.3,86.86
deit_base_distilled_patch16_384,384,1024.0,491.32,2084.164,49.49,48.39,87.63
hrnet_w48_ssld,288,1024.0,490.92,2085.876,28.66,47.21,77.47
eva_large_patch14_196,196,1024.0,490.45,2087.863,59.66,43.77,304.14
maxvit_base_tf_224,224,512.0,488.88,1047.285,23.52,81.67,119.47
efficientnet_b5,448,256.0,488.83,523.691,9.59,93.56,30.39
vit_large_patch16_224,224,1024.0,488.5,2096.219,59.7,43.77,304.33
swinv2_small_window16_256,256,512.0,486.59,1052.215,12.82,66.29,49.73
swinv2_large_window12_192,192,768.0,485.58,1581.6,26.17,56.53,228.77
convformer_s18,384,512.0,484.08,1057.663,11.63,46.49,26.77
seresnextaa101d_32x8d,288,1024.0,479.96,2133.497,28.51,56.44,93.59
coatnet_3_rw_224,224,256.0,478.44,535.067,32.63,59.07,181.81
coatnet_rmlp_3_rw_224,224,256.0,477.75,535.833,32.75,64.7,165.15
xcit_large_24_p16_224,224,1024.0,472.07,2169.166,35.86,47.26,189.1
vit_small_patch14_dinov2,518,1024.0,469.29,2181.987,29.46,57.34,22.06
deit3_base_patch16_384,384,1024.0,466.88,2193.286,49.4,48.3,86.88
deit3_large_patch16_224,224,1024.0,466.56,2194.777,59.7,43.77,304.37
efficientnetv2_rw_m,416,768.0,466.5,1646.281,21.49,79.62,53.24
nfnet_f1,320,1024.0,466.35,2195.774,35.97,46.77,132.63
nf_regnet_b5,456,768.0,464.5,1653.385,11.7,61.95,49.74
coatnet_3_224,224,256.0,464.1,551.594,35.72,63.61,166.97
vit_small_patch14_reg4_dinov2,518,1024.0,460.4,2224.119,29.55,57.51,22.06
poolformerv2_m48,224,1024.0,459.37,2229.113,11.59,29.17,73.35
beitv2_large_patch16_224,224,1024.0,452.16,2264.697,59.7,43.77,304.43
beit_large_patch16_224,224,1024.0,452.15,2264.716,59.7,43.77,304.43
resnetv2_101x1_bit,448,512.0,451.35,1134.365,31.65,64.93,44.54
dm_nfnet_f2,256,1024.0,451.22,2269.395,33.76,41.85,193.78
vit_base_patch16_siglip_384,384,1024.0,448.34,2283.991,50.0,49.11,93.18
resnetv2_152x2_bit,224,1024.0,441.5,2319.35,46.95,45.11,236.34
convnext_xlarge,224,768.0,435.62,1762.988,60.98,57.5,350.2
maxvit_tiny_tf_384,384,256.0,434.99,588.503,16.0,94.22,30.98
efficientformerv2_l,224,1024.0,431.02,2375.769,2.59,18.54,26.32
convnext_base,384,512.0,430.72,1188.698,45.21,84.49,88.59
convnextv2_base,288,512.0,429.59,1191.832,25.43,47.53,88.72
resnetrs200,320,1024.0,428.05,2392.217,31.51,67.81,93.21
flexivit_large,240,1024.0,424.67,2411.279,68.48,50.22,304.36
convnextv2_large,224,512.0,423.49,1208.977,34.4,43.13,197.96
xcit_tiny_12_p8_384,384,1024.0,423.2,2419.661,14.12,69.12,6.71
swinv2_cr_large_224,224,768.0,422.05,1819.675,35.1,78.42,196.68
caformer_b36,224,768.0,419.19,1832.111,22.5,54.14,98.75
swinv2_cr_tiny_384,384,256.0,419.04,610.909,15.34,161.01,28.33
tf_efficientnet_b5,456,256.0,418.1,612.278,10.46,98.86,30.39
convnext_large,288,512.0,415.42,1232.482,56.87,71.29,197.77
davit_huge,224,512.0,410.45,1247.402,60.93,73.44,348.92
maxxvitv2_rmlp_large_rw_224,224,768.0,409.41,1875.861,43.69,75.4,215.42
tiny_vit_21m_512,512,384.0,408.26,940.575,21.23,83.26,21.27
xcit_small_24_p16_384,384,1024.0,408.08,2509.308,26.72,68.57,47.67
tf_efficientnetv2_m,480,768.0,405.02,1896.185,24.76,89.84,54.14
tresnet_l,448,1024.0,403.56,2537.407,43.59,47.56,55.99
beit_base_patch16_384,384,1024.0,401.76,2548.786,49.4,48.3,86.74
convformer_b36,224,768.0,396.81,1935.431,22.69,56.06,99.88
regnetz_d8_evos,320,768.0,395.82,1940.285,7.03,38.92,23.46
seresnextaa101d_32x8d,320,1024.0,395.0,2592.386,35.19,69.67,93.59
seresnet269d,288,1024.0,393.84,2600.059,33.65,67.81,113.67
dm_nfnet_f1,320,1024.0,393.6,2601.642,35.97,46.77,132.63
regnety_160,384,384.0,378.47,1014.589,46.87,67.67,83.59
vit_large_r50_s32_384,384,1024.0,372.96,2745.589,56.4,64.88,329.09
regnety_640,224,768.0,362.45,2118.906,64.16,42.5,281.38
eca_nfnet_l2,384,768.0,361.66,2123.504,30.05,68.28,56.72
vit_large_patch14_224,224,1024.0,359.79,2846.069,77.83,57.11,304.2
vit_large_patch14_clip_224,224,1024.0,359.08,2851.744,77.83,57.11,304.2
swinv2_base_window12to16_192to256,256,384.0,358.35,1071.569,22.02,84.71,87.92
swinv2_base_window16_256,256,384.0,358.25,1071.869,22.02,84.71,87.92
vit_large_patch16_siglip_256,256,1024.0,351.53,2912.942,78.12,57.42,315.96
vit_base_patch8_224,224,1024.0,350.95,2917.813,66.87,65.71,86.58
efficientvit_l3,320,512.0,346.1,1479.341,56.32,79.34,246.04
efficientnetv2_l,384,1024.0,342.83,2986.92,36.1,101.16,118.52
tf_efficientnetv2_l,384,1024.0,338.97,3020.897,36.1,101.16,118.52
ecaresnet269d,320,1024.0,337.13,3037.39,41.53,83.69,102.09
resnest200e,320,1024.0,336.33,3044.627,35.69,82.78,70.2
maxvit_large_tf_224,224,384.0,336.26,1141.954,42.99,109.57,211.79
convnext_large_mlp,320,512.0,336.03,1523.669,70.21,88.02,200.13
inception_next_base,384,512.0,335.9,1524.27,43.64,75.48,86.67
resnetv2_101x3_bit,224,768.0,334.56,2295.509,71.23,48.7,387.93
eca_nfnet_l3,352,768.0,328.62,2337.043,32.57,73.12,72.04
vit_large_patch14_clip_quickgelu_224,224,1024.0,324.15,3159.023,77.83,57.11,303.97
repvgg_d2se,320,1024.0,320.2,3197.943,74.57,46.82,133.33
vit_base_r50_s16_384,384,1024.0,317.01,3230.175,61.29,81.77,98.95
volo_d4_224,224,1024.0,317.0,3230.22,44.34,80.22,192.96
volo_d1_384,384,512.0,314.1,1630.023,22.75,108.55,26.78
vit_large_patch14_xp_224,224,1024.0,309.84,3304.92,77.77,57.11,304.06
convmixer_768_32,224,1024.0,308.6,3318.227,19.55,25.95,21.11
xcit_small_24_p8_224,224,1024.0,305.72,3349.464,35.81,90.77,47.63
resnetrs350,288,1024.0,304.48,3363.098,43.67,87.09,163.96
nasnetalarge,331,384.0,300.79,1276.642,23.89,90.56,88.75
coat_lite_medium_384,384,512.0,299.62,1708.831,28.73,116.7,44.57
tresnet_xl,448,768.0,296.15,2593.304,60.77,61.31,78.44
maxvit_small_tf_384,384,192.0,288.16,666.295,33.58,139.86,69.02
pnasnet5large,331,384.0,287.26,1336.778,25.04,92.89,86.06
xcit_medium_24_p16_384,384,1024.0,282.76,3621.451,47.39,91.63,84.4
ecaresnet269d,352,1024.0,281.17,3641.867,50.25,101.25,102.09
coatnet_4_224,224,256.0,280.04,914.128,60.81,98.85,275.43
cait_xxs24_384,384,1024.0,277.04,3696.16,9.63,122.65,12.03
coatnet_rmlp_2_rw_384,384,192.0,273.87,701.059,43.04,132.57,73.88
resnetrs270,352,1024.0,271.91,3765.914,51.13,105.48,129.86
nfnet_f2,352,768.0,270.88,2835.244,63.22,79.06,193.78
caformer_s36,384,512.0,266.29,1922.686,22.2,86.08,39.3
convnext_xlarge,288,512.0,263.75,1941.25,100.8,95.05,350.2
swinv2_cr_small_384,384,256.0,258.42,990.618,29.7,298.03,49.7
efficientnet_b6,528,128.0,257.57,496.944,19.4,167.39,43.04
convformer_s36,384,512.0,257.36,1989.401,22.54,89.62,40.01
convnextv2_large,288,256.0,256.91,996.448,56.87,71.29,197.96
eva02_large_patch14_224,224,1024.0,256.79,3987.739,77.9,65.52,303.27
eva02_large_patch14_clip_224,224,1024.0,253.51,4039.312,77.93,65.52,304.11
resnext101_32x32d,224,512.0,253.0,2023.672,87.29,91.12,468.53
maxvit_tiny_tf_512,512,192.0,249.39,769.864,28.66,172.66,31.05
tf_efficientnet_b6,528,128.0,247.44,517.29,19.4,167.39,43.04
nfnet_f3,320,1024.0,247.37,4139.575,68.77,83.93,254.92
mvitv2_large_cls,224,768.0,246.55,3114.926,42.17,111.69,234.58
vit_so400m_patch14_siglip_224,224,1024.0,246.49,4154.292,106.18,70.45,427.68
efficientnetv2_xl,384,1024.0,244.46,4188.739,52.81,139.2,208.12
mvitv2_large,224,512.0,242.6,2110.485,43.87,112.02,217.99
convnextv2_base,384,256.0,242.26,1056.699,45.21,84.49,88.72
vit_base_patch16_siglip_512,512,512.0,241.2,2122.705,88.89,87.3,93.52
convnext_large,384,384.0,234.69,1636.209,101.1,126.74,197.77
convnext_large_mlp,384,384.0,234.65,1636.476,101.11,126.74,200.13
dm_nfnet_f2,352,768.0,234.38,3276.685,63.22,79.06,193.78
tf_efficientnetv2_xl,384,1024.0,230.18,4448.679,52.81,139.2,208.12
efficientnetv2_l,480,512.0,229.94,2226.68,56.4,157.99,118.52
tf_efficientnetv2_l,480,512.0,227.38,2251.742,56.4,157.99,118.52
swin_base_patch4_window12_384,384,256.0,226.65,1129.483,47.19,134.78,87.9
regnety_320,384,384.0,225.95,1699.504,95.0,88.87,145.05
resnetrs420,320,1024.0,221.8,4616.729,64.2,126.56,191.89
xcit_tiny_24_p8_384,384,1024.0,221.03,4632.753,27.05,132.94,12.11
efficientvit_l3,384,384.0,220.15,1744.25,81.08,114.02,246.04
swinv2_large_window12to16_192to256,256,256.0,218.91,1169.41,47.81,121.53,196.74
maxxvitv2_rmlp_base_rw_384,384,384.0,215.87,1778.825,70.18,160.22,116.09
resmlp_big_24_224,224,1024.0,214.65,4770.604,100.23,87.31,129.14
dm_nfnet_f3,320,1024.0,212.33,4822.62,68.77,83.93,254.92
volo_d5_224,224,1024.0,212.3,4823.349,72.4,118.11,295.46
xcit_medium_24_p8_224,224,1024.0,210.35,4868.038,63.52,121.22,84.32
seresnextaa201d_32x8d,320,1024.0,207.05,4945.752,70.22,138.71,149.39
eca_nfnet_l3,448,512.0,204.74,2500.737,52.55,118.4,72.04
xcit_small_12_p8_384,384,512.0,195.78,2615.134,54.92,138.25,26.21
cait_xs24_384,384,768.0,193.45,3970.037,19.28,183.98,26.67
caformer_m36,384,256.0,191.51,1336.728,37.45,119.33,56.2
focalnet_huge_fl3,224,384.0,190.45,2016.221,118.26,104.8,745.28
eva02_base_patch14_448,448,512.0,189.13,2707.053,87.74,98.4,87.12
maxvit_xlarge_tf_224,224,256.0,188.97,1354.682,96.49,164.37,506.99
convformer_m36,384,384.0,186.96,2053.847,37.87,123.56,57.05
cait_xxs36_384,384,1024.0,185.14,5531.038,14.35,183.7,17.37
swinv2_cr_base_384,384,256.0,184.66,1386.338,50.57,333.68,87.88
resnetrs350,384,1024.0,184.39,5553.562,77.59,154.74,163.96
regnety_1280,224,512.0,182.89,2799.45,127.66,71.58,644.81
swinv2_cr_huge_224,224,384.0,181.27,2118.357,115.97,121.08,657.83
vit_huge_patch14_clip_224,224,1024.0,179.25,5712.71,161.99,95.07,632.05
vit_huge_patch14_224,224,1024.0,179.24,5713.082,161.99,95.07,630.76
volo_d2_384,384,384.0,177.67,2161.247,46.17,184.51,58.87
maxvit_rmlp_base_rw_384,384,384.0,177.21,2166.875,66.51,233.79,116.14
vit_base_patch14_dinov2,518,512.0,175.93,2910.275,117.11,114.68,86.58
vit_huge_patch14_gap_224,224,1024.0,175.35,5839.715,161.36,94.7,630.76
vit_base_patch14_reg4_dinov2,518,512.0,175.34,2920.066,117.45,115.02,86.58
convnextv2_huge,224,256.0,174.19,1469.676,115.0,79.07,660.29
deit3_huge_patch14_224,224,1024.0,172.49,5936.531,161.99,95.07,632.13
convmixer_1536_20,224,1024.0,172.27,5944.074,48.68,33.03,51.63
vit_huge_patch14_clip_quickgelu_224,224,1024.0,165.12,6201.386,161.99,95.07,632.08
maxvit_small_tf_512,512,96.0,163.95,585.546,60.02,256.36,69.13
maxvit_base_tf_384,384,192.0,162.75,1179.72,69.34,247.75,119.65
xcit_large_24_p16_384,384,1024.0,162.01,6320.659,105.34,137.15,189.1
resnetv2_152x2_bit,384,384.0,160.06,2399.153,136.16,132.56,236.34
vit_huge_patch14_xp_224,224,1024.0,159.21,6431.544,161.88,95.07,631.8
resnest269e,416,512.0,159.04,3219.278,77.69,171.98,110.93
eva_large_patch14_336,336,768.0,155.41,4941.906,174.74,128.21,304.53
vit_large_patch14_clip_336,336,768.0,155.09,4951.819,174.74,128.21,304.53
vit_large_patch16_384,384,768.0,154.94,4956.737,174.85,128.21,304.72
convnext_xxlarge,256,384.0,152.35,2520.42,198.09,124.45,846.47
davit_giant,224,384.0,151.56,2533.626,192.34,138.2,1406.47
resnetv2_50x3_bit,448,192.0,150.44,1276.251,145.7,133.37,217.32
coatnet_5_224,224,192.0,149.61,1283.336,142.72,143.69,687.47
efficientnetv2_xl,512,512.0,149.15,3432.877,93.85,247.32,208.12
cait_s24_384,384,512.0,148.91,3438.219,32.17,245.3,47.06
convnext_xlarge,384,256.0,148.61,1722.573,179.2,168.99,350.2
tf_efficientnetv2_xl,512,512.0,148.0,3459.525,93.85,247.32,208.12
efficientnet_b7,600,96.0,147.91,649.053,38.33,289.94,66.35
deit3_large_patch16_384,384,1024.0,147.79,6928.856,174.85,128.21,304.76
seresnextaa201d_32x8d,384,768.0,147.05,5222.537,101.11,199.72,149.39
nfnet_f3,416,512.0,146.71,3489.974,115.58,141.78,254.92
vit_giant_patch16_gap_224,224,1024.0,145.38,7043.632,198.14,103.64,1011.37
convnextv2_large,384,192.0,144.92,1324.86,101.1,126.74,197.96
resnetv2_152x4_bit,224,512.0,144.91,3533.266,186.9,90.22,936.53
vit_large_patch16_siglip_384,384,768.0,144.23,5324.878,175.76,129.18,316.28
tf_efficientnet_b7,600,96.0,143.48,669.058,38.33,289.94,66.35
nfnet_f4,384,768.0,142.67,5383.101,122.14,147.57,316.07
vit_large_patch14_clip_quickgelu_336,336,768.0,140.95,5448.604,174.74,128.21,304.29
caformer_b36,384,256.0,138.42,1849.458,66.12,159.11,98.75
swin_large_patch4_window12_384,384,128.0,135.49,944.717,104.08,202.16,196.74
convformer_b36,384,256.0,135.29,1892.221,66.67,164.75,99.88
resnetrs420,416,1024.0,130.11,7870.213,108.45,213.79,191.89
beit_large_patch16_384,384,768.0,129.31,5939.365,174.84,128.21,305.0
dm_nfnet_f3,416,512.0,127.57,4013.328,115.58,141.78,254.92
regnety_640,384,256.0,126.8,2018.836,188.47,124.83,281.38
dm_nfnet_f4,384,768.0,123.05,6241.189,122.14,147.57,316.07
focalnet_huge_fl4,224,512.0,122.81,4169.023,118.9,113.34,686.46
xcit_large_24_p8_224,224,512.0,120.1,4263.036,141.22,181.53,188.93
resnetv2_152x2_bit,448,256.0,117.91,2171.109,184.99,180.43,236.34
eva_giant_patch14_224,224,1024.0,116.71,8773.739,259.74,135.89,1012.56
eva_giant_patch14_clip_224,224,1024.0,116.64,8779.464,259.74,135.89,1012.59
vit_giant_patch14_224,224,1024.0,114.18,8968.21,259.74,135.89,1012.61
vit_giant_patch14_clip_224,224,1024.0,114.09,8975.383,259.74,135.89,1012.65
swinv2_cr_large_384,384,128.0,112.81,1134.666,108.96,404.96,196.68
maxvit_large_tf_384,384,128.0,111.17,1151.411,126.61,332.3,212.03
eva02_large_patch14_clip_336,336,1024.0,110.28,9285.405,174.97,147.1,304.43
mvitv2_huge_cls,224,384.0,107.61,3568.518,120.67,243.63,694.8
convnextv2_huge,288,128.0,105.35,1214.957,190.1,130.7,660.29
xcit_small_24_p8_384,384,512.0,102.73,4983.926,105.23,265.87,47.63
nfnet_f5,416,512.0,100.11,5114.164,170.71,204.56,377.21
cait_s36_384,384,512.0,99.61,5140.29,47.99,367.39,68.37
swinv2_base_window12to24_192to384,384,96.0,96.35,996.364,55.25,280.36,87.92
efficientnet_b8,672,96.0,95.78,1002.248,63.48,442.89,87.41
focalnet_large_fl3,384,384.0,94.47,4064.948,105.06,168.04,239.13
tf_efficientnet_b8,672,96.0,93.18,1030.252,63.48,442.89,87.41
maxvit_base_tf_512,512,96.0,92.2,1041.169,123.93,456.26,119.88
focalnet_large_fl4,384,256.0,90.17,2839.222,105.2,181.78,239.32
resnetv2_101x3_bit,448,192.0,87.88,2184.819,280.33,194.78,387.93
dm_nfnet_f5,416,512.0,86.64,5909.833,170.71,204.56,377.21
nfnet_f4,512,384.0,81.51,4711.211,216.26,262.26,316.07
volo_d3_448,448,192.0,76.74,2501.831,96.33,446.83,86.63
vit_so400m_patch14_siglip_384,384,512.0,75.92,6743.556,302.34,200.62,428.23
nfnet_f6,448,512.0,75.59,6773.482,229.7,273.62,438.36
vit_huge_patch14_clip_336,336,768.0,75.49,10173.683,363.7,213.44,632.46
xcit_medium_24_p8_384,384,384.0,71.15,5396.903,186.67,354.69,84.32
dm_nfnet_f4,512,384.0,69.56,5520.408,216.26,262.26,316.07
vit_gigantic_patch14_224,224,512.0,66.18,7736.423,473.4,204.12,1844.44
vit_gigantic_patch14_clip_224,224,512.0,66.18,7735.92,473.41,204.12,1844.91
focalnet_xlarge_fl3,384,256.0,66.07,3874.786,185.61,223.99,408.79
dm_nfnet_f6,448,512.0,65.28,7842.994,229.7,273.62,438.36
maxvit_large_tf_512,512,64.0,63.68,1005.087,225.96,611.85,212.33
focalnet_xlarge_fl4,384,192.0,63.39,3028.979,185.79,242.31,409.03
maxvit_xlarge_tf_384,384,96.0,63.2,1518.995,283.86,498.45,475.32
regnety_1280,384,128.0,62.14,2059.919,374.99,210.2,644.81
beit_large_patch16_512,512,256.0,61.47,4164.41,310.6,227.76,305.67
convnextv2_huge,384,96.0,60.73,1580.79,337.96,232.35,660.29
swinv2_large_window12to24_192to384,384,48.0,60.6,792.119,116.15,407.83,196.74
eva02_large_patch14_448,448,512.0,59.6,8591.147,310.69,261.32,305.08
tf_efficientnet_l2,475,128.0,59.14,2164.439,172.11,609.89,480.31
nfnet_f5,544,384.0,58.55,6558.595,290.97,349.71,377.21
vit_huge_patch14_clip_378,378,512.0,58.17,8801.788,460.13,270.04,632.68
volo_d4_448,448,192.0,57.2,3356.883,197.13,527.35,193.41
nfnet_f7,480,384.0,57.05,6730.663,300.08,355.86,499.5
vit_large_patch14_dinov2,518,384.0,56.81,6759.458,414.89,304.42,304.37
vit_large_patch14_reg4_dinov2,518,384.0,56.51,6795.142,416.1,305.31,304.37
vit_huge_patch14_clip_quickgelu_378,378,384.0,53.9,7123.722,460.13,270.04,632.68
swinv2_cr_giant_224,224,192.0,52.42,3662.593,483.85,309.15,2598.76
dm_nfnet_f5,544,384.0,50.82,7555.977,290.97,349.71,377.21
eva_giant_patch14_336,336,512.0,49.6,10322.486,583.14,305.1,1013.01
swinv2_cr_huge_384,384,64.0,48.85,1310.056,352.04,583.18,657.94
nfnet_f6,576,256.0,45.99,5566.397,378.69,452.2,438.36
xcit_large_24_p8_384,384,256.0,40.54,6315.135,415.0,531.74,188.93
volo_d5_448,448,192.0,39.97,4803.918,315.06,737.92,295.91
dm_nfnet_f6,576,256.0,39.68,6452.4,378.69,452.2,438.36
nfnet_f7,608,256.0,35.92,7127.91,480.39,570.85,499.5
maxvit_xlarge_tf_512,512,48.0,35.73,1343.449,505.95,917.77,475.77
regnety_2560,384,96.0,35.19,2728.299,747.83,296.49,1282.6
convnextv2_huge,512,48.0,34.07,1408.989,600.81,413.07,660.29
cait_m36_384,384,256.0,32.53,7868.895,173.11,734.79,271.22
resnetv2_152x4_bit,480,128.0,32.31,3961.512,844.84,414.26,936.53
volo_d5_512,512,96.0,27.94,3435.72,425.09,1105.37,296.09
samvit_base_patch16,1024,12.0,23.01,521.487,371.55,403.08,89.67
efficientnet_l2,800,32.0,22.53,1420.616,479.12,1707.39,480.31
tf_efficientnet_l2,800,32.0,22.12,1446.454,479.12,1707.39,480.31
vit_giant_patch14_dinov2,518,192.0,17.14,11200.639,1553.56,871.89,1136.48
vit_giant_patch14_reg4_dinov2,518,128.0,17.05,7505.847,1558.09,874.43,1136.48
swinv2_cr_giant_384,384,32.0,15.01,2131.256,1450.71,1394.86,2598.76
eva_giant_patch14_560,560,192.0,15.01,12792.976,1618.04,846.56,1014.45
cait_m48_448,448,128.0,13.76,9299.464,329.4,1708.21,356.46
samvit_large_patch16,1024,8.0,10.25,780.237,1317.08,1055.58,308.28
samvit_huge_patch16,1024,6.0,6.31,950.475,2741.59,1727.57,637.03
eva02_enormous_patch14_clip_224,224,,,,1132.46,497.58,4350.56
vit_huge_patch16_gap_448,448,,,,544.7,636.83,631.67
hf_public_repos/pytorch-image-models/results/results-sketch.csv
model,top1,top1_err,top5,top5_err,param_count,img_size,crop_pct,interpolation,top1_diff,top5_diff,rank_diff
eva_giant_patch14_336.clip_ft_in1k,71.177,28.823,90.299,9.701,"1,013.01",336,1.000,bicubic,-18.289,-8.527,+6
eva02_large_patch14_448.mim_m38m_ft_in22k_in1k,70.662,29.338,89.856,10.144,305.08,448,1.000,bicubic,-19.390,-9.192,-1
eva02_large_patch14_448.mim_m38m_ft_in1k,70.546,29.454,89.843,10.157,305.08,448,1.000,bicubic,-19.028,-9.081,+2
convnext_xxlarge.clip_laion2b_soup_ft_in1k,70.039,29.961,90.334,9.666,846.47,256,1.000,bicubic,-18.565,-8.374,+10
eva_giant_patch14_224.clip_ft_in1k,70.021,29.979,89.768,10.232,"1,012.56",224,0.900,bicubic,-18.859,-8.912,+4
eva_giant_patch14_336.m30m_ft_in22k_in1k,68.052,31.948,87.819,12.181,"1,013.01",336,1.000,bicubic,-21.514,-11.133,0
eva02_large_patch14_448.mim_in22k_ft_in22k_in1k,67.533,32.467,87.506,12.494,305.08,448,1.000,bicubic,-22.437,-11.506,-5
eva_giant_patch14_560.m30m_ft_in22k_in1k,67.486,32.514,87.473,12.527,"1,014.45",560,1.000,bicubic,-22.300,-11.519,-5
vit_huge_patch14_clip_224.laion2b_ft_in1k,67.396,32.604,87.882,12.118,632.05,224,1.000,bicubic,-20.192,-10.337,+39
eva02_large_patch14_448.mim_in22k_ft_in1k,66.987,33.013,87.443,12.557,305.08,448,1.000,bicubic,-22.635,-11.507,-6
convnext_large_mlp.clip_laion2b_augreg_ft_in1k_384,66.435,33.565,87.349,12.651,200.13,384,1.000,bicubic,-21.413,-11.097,+31
vit_large_patch14_clip_336.laion2b_ft_in1k,65.741,34.259,86.909,13.091,304.53,336,1.000,bicubic,-22.115,-11.459,+28
convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_384,65.731,34.269,87.017,12.983,200.13,384,1.000,bicubic,-22.575,-11.565,+10
vit_huge_patch14_clip_224.laion2b_ft_in12k_in1k,65.323,34.677,86.826,13.174,632.05,224,1.000,bicubic,-22.933,-11.726,+11
vit_huge_patch14_clip_336.laion2b_ft_in12k_in1k,65.260,34.740,86.757,13.242,632.46,336,1.000,bicubic,-23.332,-11.905,+1
convnext_large_mlp.clip_laion2b_augreg_ft_in1k,65.091,34.909,86.284,13.716,200.13,256,1.000,bicubic,-22.245,-11.934,+42
convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_320,64.918,35.082,86.649,13.351,200.13,320,1.000,bicubic,-23.040,-11.827,+19
vit_large_patch14_clip_224.laion2b_ft_in1k,64.820,35.181,86.575,13.425,304.20,224,1.000,bicubic,-22.466,-11.669,+43
regnety_1280.swag_ft_in1k,64.106,35.894,86.034,13.966,644.81,384,1.000,bicubic,-24.124,-12.652,+9
vit_large_patch14_clip_336.openai_ft_in12k_in1k,64.065,35.935,85.912,14.088,304.53,336,1.000,bicubic,-24.203,-12.614,+4
eva_large_patch14_336.in22k_ft_in1k,63.096,36.904,84.382,15.618,304.53,336,1.000,bicubic,-25.574,-14.340,-8
vit_large_patch14_clip_224.openai_ft_in1k,62.629,37.371,85.109,14.891,304.20,224,1.000,bicubic,-25.225,-13.317,+19
vit_large_patch14_clip_224.laion2b_ft_in12k_in1k,62.065,37.935,84.313,15.687,304.20,224,1.000,bicubic,-25.830,-14.095,+16
vit_large_patch14_clip_336.laion2b_ft_in12k_in1k,61.611,38.389,83.651,16.349,304.53,336,1.000,bicubic,-26.569,-14.921,+8
vit_large_patch14_clip_224.openai_ft_in12k_in1k,61.402,38.598,83.362,16.638,304.20,224,1.000,bicubic,-26.772,-15.184,+8
eva_large_patch14_196.in22k_ft_in1k,61.113,38.887,82.776,17.224,304.14,196,1.000,bicubic,-26.819,-15.722,+11
eva_large_patch14_336.in22k_ft_in22k_in1k,60.944,39.056,82.136,17.864,304.53,336,1.000,bicubic,-28.262,-16.718,-19
convnext_base.clip_laion2b_augreg_ft_in12k_in1k_384,60.795,39.205,83.220,16.780,88.59,384,1.000,bicubic,-26.339,-15.002,+36
convnext_base.clip_laion2b_augreg_ft_in1k,60.276,39.724,82.678,17.322,88.59,256,1.000,bicubic,-25.882,-15.002,+90
regnety_1280.swag_lc_in1k,59.913,40.087,83.161,16.839,644.81,224,0.965,bicubic,-26.069,-14.689,+105
eva_large_patch14_196.in22k_ft_in22k_in1k,59.883,40.117,81.143,18.857,304.14,196,1.000,bicubic,-28.691,-17.515,-14
convnext_base.clip_laion2b_augreg_ft_in12k_in1k,59.836,40.164,82.810,17.190,88.59,256,1.000,bicubic,-26.534,-15.174,+74
convnext_base.clip_laiona_augreg_ft_in1k_384,59.392,40.608,82.242,17.758,88.59,384,1.000,bicubic,-27.110,-15.726,+63
eva02_base_patch14_448.mim_in22k_ft_in22k_in1k,58.512,41.488,80.797,19.203,87.12,448,1.000,bicubic,-30.178,-17.927,-23
resnext101_32x32d.fb_wsl_ig1b_ft_in1k,58.376,41.624,80.398,19.602,468.53,224,0.875,bilinear,-26.722,-17.040,+163
beitv2_large_patch16_224.in1k_ft_in22k_in1k,58.366,41.634,80.226,19.774,304.43,224,0.950,bicubic,-30.028,-18.372,-16
eva02_base_patch14_448.mim_in22k_ft_in1k,58.036,41.964,80.768,19.232,87.12,448,1.000,bicubic,-30.216,-17.796,-11
regnety_320.swag_ft_in1k,57.906,42.094,81.456,18.544,145.05,384,1.000,bicubic,-28.928,-16.906,+42
convnextv2_huge.fcmae_ft_in22k_in1k_384,57.865,42.135,79.671,20.329,660.29,384,1.000,bicubic,-30.805,-19.067,-27
convnextv2_huge.fcmae_ft_in22k_in1k_512,57.851,42.149,79.497,20.503,660.29,512,1.000,bicubic,-31.007,-19.251,-30
resnext101_32x16d.fb_wsl_ig1b_ft_in1k,57.696,42.304,79.907,20.093,194.03,224,0.875,bilinear,-26.470,-17.291,+242
resnext101_32x16d.fb_swsl_ig1b_ft_in1k,57.478,42.522,80.373,19.627,194.03,224,0.875,bilinear,-25.858,-16.473,+333
vit_base_patch16_clip_384.laion2b_ft_in1k,56.879,43.121,80.004,19.997,86.86,384,1.000,bicubic,-29.739,-18.004,+45
beit_large_patch16_384.in22k_ft_in22k_in1k,56.879,43.121,79.221,20.779,305.00,384,1.000,bicubic,-31.523,-19.387,-24
beit_large_patch16_512.in22k_ft_in22k_in1k,56.757,43.243,78.911,21.089,305.67,512,1.000,bicubic,-31.839,-19.745,-30
resnext101_32x8d.fb_swsl_ig1b_ft_in1k,56.438,43.562,78.931,21.069,88.79,224,0.875,bilinear,-27.864,-18.245,+223
maxvit_rmlp_base_rw_384.sw_in12k_ft_in1k,56.327,43.673,77.305,22.695,116.14,384,1.000,bicubic,-31.502,-21.067,-4
maxvit_xlarge_tf_384.in21k_ft_in1k,56.212,43.788,78.742,21.258,475.32,384,1.000,bicubic,-32.101,-19.802,-26
maxvit_xlarge_tf_512.in21k_ft_in1k,56.156,43.844,78.636,21.364,475.77,512,1.000,bicubic,-32.382,-20.008,-31
maxvit_base_tf_512.in21k_ft_in1k,56.083,43.917,78.606,21.394,119.88,512,1.000,bicubic,-32.137,-19.924,-20
deit3_huge_patch14_224.fb_in22k_ft_in1k,55.767,44.233,77.626,22.374,632.13,224,1.000,bicubic,-31.420,-20.634,+12
maxvit_base_tf_384.in21k_ft_in1k,55.639,44.361,78.078,21.922,119.65,384,1.000,bicubic,-32.283,-20.466,-14
vit_base_patch16_clip_224.laion2b_ft_in1k,55.405,44.595,79.050,20.950,86.57,224,1.000,bicubic,-30.065,-18.525,+111
regnety_320.swag_lc_in1k,55.354,44.646,79.703,20.297,145.05,224,0.965,bicubic,-29.194,-17.739,+184
regnety_160.swag_ft_in1k,55.177,44.823,79.316,20.684,83.59,384,1.000,bicubic,-30.843,-18.736,+74
maxvit_large_tf_512.in21k_ft_in1k,55.171,44.829,77.276,22.724,212.33,512,1.000,bicubic,-33.053,-21.322,-27
maxvit_large_tf_384.in21k_ft_in1k,55.077,44.923,77.142,22.858,212.03,384,1.000,bicubic,-32.909,-21.426,-22
beit_large_patch16_224.in22k_ft_in22k_in1k,54.965,45.035,77.608,22.392,304.43,224,0.900,bicubic,-32.513,-20.696,-8
convnext_xlarge.fb_in22k_ft_in1k_384,54.961,45.039,76.824,23.176,350.20,384,1.000,bicubic,-32.791,-21.732,-15
resnext101_32x8d.fb_wsl_ig1b_ft_in1k,54.908,45.092,77.541,22.459,88.79,224,0.875,bilinear,-27.790,-18.603,+374
deit3_large_patch16_384.fb_in22k_ft_in1k,54.886,45.114,77.372,22.628,304.76,384,1.000,bicubic,-32.834,-21.140,-16
maxxvitv2_rmlp_base_rw_384.sw_in12k_ft_in1k,54.747,45.253,76.848,23.152,116.09,384,1.000,bicubic,-32.717,-21.526,-10
caformer_b36.sail_in22k_ft_in1k_384,54.440,45.560,76.830,23.170,98.75,384,1.000,bicubic,-33.618,-21.752,-29
deit3_large_patch16_224.fb_in22k_ft_in1k,54.359,45.641,76.563,23.437,304.37,224,1.000,bicubic,-32.623,-21.673,+9
beitv2_large_patch16_224.in1k_ft_in1k,54.161,45.839,75.562,24.438,304.43,224,0.950,bicubic,-33.251,-22.671,-9
convnextv2_large.fcmae_ft_in22k_in1k_384,53.947,46.053,76.007,23.993,197.96,384,1.000,bicubic,-34.251,-22.521,-35
maxvit_rmlp_base_rw_224.sw_in12k_ft_in1k,53.731,46.269,75.140,24.860,116.14,224,0.950,bicubic,-33.163,-22.874,+9
resnext101_32x4d.fb_swsl_ig1b_ft_in1k,53.587,46.413,76.337,23.663,44.18,224,0.875,bilinear,-29.639,-20.423,+316
vit_base_patch16_clip_384.laion2b_ft_in12k_in1k,53.493,46.507,75.653,24.347,86.86,384,1.000,bicubic,-33.713,-22.381,-7
vit_base_patch16_clip_384.openai_ft_in1k,53.074,46.926,76.655,23.345,86.86,384,1.000,bicubic,-33.132,-21.221,+44
regnety_160.swag_lc_in1k,53.043,46.957,78.090,21.910,83.59,224,0.965,bicubic,-30.739,-19.190,+253
convnextv2_base.fcmae_ft_in22k_in1k_384,52.907,47.093,75.083,24.917,88.72,384,1.000,bicubic,-34.737,-23.333,-26
convformer_b36.sail_in22k_ft_in1k_384,52.874,47.126,74.979,25.021,99.88,384,1.000,bicubic,-34.728,-23.455,-26
convnext_large.fb_in22k_ft_in1k_384,52.774,47.226,74.700,25.300,197.77,384,1.000,bicubic,-34.698,-23.686,-23
vit_large_patch16_384.augreg_in21k_ft_in1k,52.760,47.240,74.706,25.294,304.72,384,1.000,bicubic,-34.324,-23.596,-8
caformer_b36.sail_in22k_ft_in1k,52.756,47.244,75.309,24.691,98.75,224,1.000,bicubic,-34.664,-23.019,-21
convformer_b36.sail_in22k_ft_in1k,52.746,47.254,74.896,25.104,99.88,224,1.000,bicubic,-34.252,-23.276,-6
coatnet_rmlp_2_rw_384.sw_in12k_ft_in1k,52.389,47.611,73.802,26.198,73.88,384,1.000,bicubic,-34.994,-24.510,-21
swinv2_large_window12to24_192to384.ms_in22k_ft_in1k,52.304,47.696,74.415,25.585,196.74,384,1.000,bicubic,-35.160,-23.835,-26
convnext_xlarge.fb_in22k_ft_in1k,52.216,47.784,73.955,26.045,350.20,288,1.000,bicubic,-35.114,-24.373,-21
vit_large_r50_s32_384.augreg_in21k_ft_in1k,52.041,47.959,73.570,26.430,329.09,384,1.000,bicubic,-34.141,-24.352,+35
vit_large_patch16_224.augreg_in21k_ft_in1k,51.819,48.181,73.690,26.310,304.33,224,0.900,bicubic,-34.017,-23.974,+57
vit_base_patch16_clip_224.laion2b_ft_in12k_in1k,51.785,48.215,74.637,25.363,86.57,224,0.950,bicubic,-34.385,-23.119,+34
convnext_base.fb_in22k_ft_in1k_384,51.565,48.435,74.543,25.457,88.59,384,1.000,bicubic,-35.231,-23.721,-1
tf_efficientnet_l2.ns_jft_in1k_475,51.496,48.504,73.931,26.069,480.31,475,0.936,bicubic,-36.738,-24.615,-58
maxxvitv2_rmlp_base_rw_224.sw_in12k_ft_in1k,51.190,48.810,73.126,26.874,116.09,224,0.950,bicubic,-35.452,-24.894,+2
vit_base_patch16_clip_384.openai_ft_in12k_in1k,51.153,48.847,74.323,25.677,86.86,384,0.950,bicubic,-35.873,-23.859,-17
caformer_m36.sail_in22k_ft_in1k_384,51.048,48.952,73.442,26.558,56.20,384,1.000,bicubic,-36.398,-24.866,-34
swinv2_base_window12to24_192to384.ms_in22k_ft_in1k,50.976,49.024,73.295,26.705,87.92,384,1.000,bicubic,-36.120,-24.939,-23
vit_base_patch16_clip_224.openai_ft_in1k,50.936,49.064,74.841,25.159,86.57,224,0.900,bicubic,-34.356,-22.595,+88
seresnextaa201d_32x8d.sw_in12k_ft_in1k_384,50.710,49.290,73.666,26.334,149.39,384,1.000,bicubic,-36.578,-24.668,-31
resnext50_32x4d.fb_swsl_ig1b_ft_in1k,50.465,49.535,73.366,26.634,25.03,224,0.875,bilinear,-31.707,-22.858,+420
swinv2_large_window12to16_192to256.ms_in22k_ft_in1k,50.433,49.567,72.735,27.265,196.74,256,0.900,bicubic,-36.519,-25.371,-19
swin_large_patch4_window12_384.ms_in22k_ft_in1k,50.394,49.606,72.538,27.462,196.74,384,1.000,bicubic,-36.738,-25.696,-29
convnextv2_large.fcmae_ft_in22k_in1k,50.160,49.840,72.399,27.601,197.96,288,1.000,bicubic,-37.324,-25.957,-46
convnext_large.fb_in22k_ft_in1k,49.999,50.001,72.267,27.733,197.77,288,1.000,bicubic,-37.027,-25.937,-27
tf_efficientnetv2_xl.in21k_ft_in1k,49.722,50.278,72.124,27.876,208.12,512,1.000,bicubic,-37.026,-25.890,-12
vit_base_patch16_clip_224.openai_ft_in12k_in1k,49.700,50.300,72.868,27.132,86.57,224,0.950,bicubic,-36.242,-24.860,+37
caformer_m36.sail_in22k_ft_in1k,49.700,50.300,72.141,27.859,56.20,224,1.000,bicubic,-36.894,-25.883,-7
resnet50.fb_swsl_ig1b_ft_in1k,49.531,50.469,72.338,27.662,25.56,224,0.875,bilinear,-31.641,-23.648,+508
beitv2_base_patch16_224.in1k_ft_in22k_in1k,49.516,50.484,72.391,27.609,86.53,224,0.900,bicubic,-36.958,-25.661,-1
coatnet_rmlp_2_rw_224.sw_in12k_ft_in1k,49.394,50.606,70.848,29.152,73.88,224,0.950,bicubic,-37.110,-27.046,-7
convnextv2_base.fcmae_ft_in22k_in1k,49.142,50.858,71.230,28.770,88.72,288,1.000,bicubic,-37.856,-26.939,-31
convformer_m36.sail_in22k_ft_in1k_384,49.132,50.868,71.387,28.613,57.05,384,1.000,bicubic,-37.760,-26.729,-27
convformer_m36.sail_in22k_ft_in1k,49.091,50.909,71.471,28.529,57.05,224,1.000,bicubic,-37.057,-26.379,+15
vit_base_patch32_clip_224.laion2b_ft_in1k,49.062,50.938,72.584,27.416,88.22,224,0.900,bicubic,-33.520,-23.617,+350
swin_large_patch4_window7_224.ms_in22k_ft_in1k,48.993,51.007,71.387,28.613,196.53,224,0.900,bicubic,-37.319,-26.515,+1
convnext_base.fb_in22k_ft_in1k,48.938,51.062,71.748,28.252,88.59,288,1.000,bicubic,-37.336,-26.344,+2
swinv2_base_window12to16_192to256.ms_in22k_ft_in1k,48.781,51.219,71.410,28.590,87.92,256,0.900,bicubic,-37.487,-26.472,+2
tf_efficientnetv2_l.in21k_ft_in1k,48.739,51.261,71.992,28.008,118.52,480,1.000,bicubic,-38.063,-26.144,-29
coatnet_2_rw_224.sw_in12k_ft_in1k,48.678,51.322,70.123,29.877,73.87,224,0.950,bicubic,-37.885,-27.773,-18
beit_base_patch16_384.in22k_ft_in22k_in1k,48.669,51.331,72.102,27.898,86.74,384,1.000,bicubic,-38.131,-26.034,-30
swin_base_patch4_window12_384.ms_in22k_ft_in1k,48.545,51.455,71.819,28.181,87.90,384,1.000,bicubic,-37.893,-26.247,-10
caformer_s36.sail_in22k_ft_in1k_384,48.486,51.514,71.518,28.482,39.30,384,1.000,bicubic,-38.372,-26.694,-36
maxvit_base_tf_512.in1k,48.240,51.760,70.793,29.207,119.88,512,1.000,bicubic,-38.362,-27.125,-25
vit_large_r50_s32_224.augreg_in21k_ft_in1k,48.185,51.815,70.866,29.134,328.99,224,0.900,bicubic,-36.233,-26.306,+144
vit_base_patch32_clip_384.laion2b_ft_in12k_in1k,47.934,52.066,70.923,29.077,88.30,384,1.000,bicubic,-37.432,-26.737,+58
tf_efficientnet_b7.ns_jft_in1k,47.798,52.202,69.638,30.362,66.35,600,0.949,bicubic,-39.042,-28.454,-39
tf_efficientnet_b6.ns_jft_in1k,47.761,52.239,69.956,30.044,43.04,528,0.942,bicubic,-38.697,-27.934,-17
vit_base_patch8_224.augreg_in21k_ft_in1k,47.727,52.273,70.933,29.067,86.58,224,0.900,bicubic,-38.071,-26.857,+23
deit3_base_patch16_384.fb_in22k_ft_in1k,47.676,52.324,69.752,30.248,86.88,384,1.000,bicubic,-39.064,-28.364,-35
tf_efficientnet_l2.ns_jft_in1k,47.574,52.426,70.019,29.981,480.31,800,0.960,bicubic,-40.778,-28.629,-101
vit_base_patch32_clip_448.laion2b_ft_in12k_in1k,47.570,52.430,70.047,29.953,88.34,448,1.000,bicubic,-38.210,-27.591,+22
vit_base_patch8_224.augreg2_in21k_ft_in1k,47.507,52.493,70.326,29.674,86.58,224,0.900,bicubic,-38.711,-27.506,-11
tf_efficientnetv2_m.in21k_ft_in1k,47.456,52.544,70.945,29.055,54.14,480,1.000,bicubic,-38.536,-26.999,+7
deit3_base_patch16_224.fb_in22k_ft_in1k,47.378,52.622,69.769,30.230,86.59,224,1.000,bicubic,-38.322,-27.977,+25
tiny_vit_21m_512.dist_in22k_ft_in1k,47.254,52.746,70.062,29.938,21.27,512,1.000,bicubic,-39.204,-27.822,-26
convformer_s36.sail_in22k_ft_in1k_384,47.152,52.848,69.498,30.502,40.01,384,1.000,bicubic,-39.226,-28.486,-23
maxvit_large_tf_512.in1k,47.022,52.978,69.506,30.494,212.33,512,1.000,bicubic,-39.504,-28.374,-35
convnext_small.fb_in22k_ft_in1k_384,46.882,53.118,69.528,30.472,50.22,384,1.000,bicubic,-38.896,-28.362,+16
convnextv2_huge.fcmae_ft_in1k,46.880,53.120,67.785,32.215,660.29,288,1.000,bicubic,-39.700,-30.187,-39
convformer_s36.sail_in22k_ft_in1k,46.863,53.137,69.528,30.472,40.01,224,1.000,bicubic,-38.551,-28.040,+37
caformer_s36.sail_in22k_ft_in1k,46.708,53.292,69.744,30.256,39.30,224,1.000,bicubic,-39.083,-28.082,+11
tiny_vit_21m_384.dist_in22k_ft_in1k,46.256,53.744,69.231,30.769,21.23,384,1.000,bicubic,-39.852,-28.479,-12
beit_base_patch16_224.in22k_ft_in22k_in1k,46.254,53.746,69.885,30.115,86.53,224,0.900,bicubic,-38.958,-27.773,+52
vit_base_patch32_clip_384.openai_ft_in12k_in1k,46.234,53.766,69.288,30.712,88.30,384,0.950,bicubic,-38.980,-28.116,+49
maxvit_base_tf_384.in1k,46.234,53.766,68.531,31.468,119.65,384,1.000,bicubic,-40.068,-29.267,-27
hrnet_w48_ssld.paddle_in1k,46.177,53.823,68.056,31.944,77.47,288,1.000,bilinear,-38.303,-29.178,+110
beitv2_base_patch16_224.in1k_ft_in1k,46.002,53.998,67.859,32.141,86.53,224,0.900,bicubic,-39.592,-29.647,+17
vit_base_patch16_384.augreg_in21k_ft_in1k,45.894,54.106,68.557,31.443,86.86,384,1.000,bicubic,-40.100,-29.445,-9
seresnextaa101d_32x8d.sw_in12k_ft_in1k_288,45.863,54.137,68.608,31.392,93.59,320,1.000,bicubic,-40.861,-29.568,-54
tf_efficientnet_b8.ap_in1k,45.780,54.220,67.907,32.093,87.41,672,0.954,bicubic,-39.584,-29.385,+34
maxvit_large_tf_384.in1k,45.760,54.240,68.160,31.840,212.03,384,1.000,bicubic,-40.470,-29.528,-31
vit_base_patch32_clip_224.laion2b_ft_in12k_in1k,45.758,54.242,68.881,31.119,88.22,224,0.900,bicubic,-37.538,-27.647,+236
convnext_small.in12k_ft_in1k_384,45.721,54.279,67.818,32.182,50.22,384,1.000,bicubic,-40.461,-30.104,-30
tf_efficientnet_b5.ns_jft_in1k,45.611,54.389,67.850,32.150,30.39,456,0.934,bicubic,-40.477,-29.906,-22
swin_base_patch4_window7_224.ms_in22k_ft_in1k,45.532,54.468,68.504,31.496,87.77,224,0.900,bicubic,-39.740,-29.060,+33
mvitv2_large.fb_in1k,45.277,54.723,65.183,34.817,217.99,224,0.900,bicubic,-39.967,-32.031,+35
vit_base_patch16_224.augreg2_in21k_ft_in1k,45.114,54.886,67.394,32.606,86.57,224,0.900,bicubic,-39.980,-30.136,+50
vit_base_patch32_clip_224.openai_ft_in1k,45.031,54.969,68.453,31.547,88.22,224,0.900,bicubic,-36.899,-27.513,+388
tiny_vit_21m_224.dist_in22k_ft_in1k,44.851,55.149,67.590,32.410,21.20,224,0.950,bicubic,-40.235,-29.776,+50
seresnextaa101d_32x8d.sw_in12k_ft_in1k,44.801,55.199,67.372,32.628,93.59,288,1.000,bicubic,-41.683,-30.658,-53
convnextv2_large.fcmae_ft_in1k,44.797,55.203,65.853,34.147,197.96,288,1.000,bicubic,-41.320,-31.969,-32
volo_d5_512.sail_in1k,44.577,55.423,65.765,34.235,296.09,512,1.150,bicubic,-42.481,-32.205,-86
convnextv2_tiny.fcmae_ft_in22k_in1k_384,44.322,55.678,66.655,33.345,28.64,384,1.000,bicubic,-40.784,-30.973,+41
cait_m48_448.fb_dist_in1k,44.212,55.788,64.674,35.326,356.46,448,1.000,bicubic,-42.280,-33.078,-58
deit3_large_patch16_384.fb_in1k,44.182,55.818,64.845,35.155,304.76,384,1.000,bicubic,-41.630,-32.753,-15
volo_d5_448.sail_in1k,44.100,55.900,65.065,34.935,295.91,448,1.150,bicubic,-42.852,-32.873,-83
eva02_small_patch14_336.mim_in22k_ft_in1k,43.960,56.040,65.942,34.058,22.13,336,1.000,bicubic,-41.758,-31.692,-9
deit3_huge_patch14_224.fb_in1k,43.807,56.193,64.350,35.650,632.13,224,0.900,bicubic,-41.417,-33.010,+25
convnext_small.fb_in22k_ft_in1k,43.620,56.380,66.464,33.536,50.22,288,1.000,bicubic,-41.642,-31.218,+20
deit3_large_patch16_224.fb_in1k,43.516,56.484,63.572,36.428,304.37,224,0.900,bicubic,-41.258,-33.464,+63
vit_base_r50_s16_384.orig_in21k_ft_in1k,43.501,56.499,66.781,33.219,98.95,384,1.000,bicubic,-41.475,-30.509,+47
tf_efficientnet_b4.ns_jft_in1k,43.447,56.553,65.513,34.487,19.34,380,0.922,bicubic,-41.712,-31.955,+28
deit3_medium_patch16_224.fb_in22k_ft_in1k,43.276,56.724,64.888,35.112,38.85,224,1.000,bicubic,-41.273,-32.300,+72
volo_d5_224.sail_in1k,43.243,56.757,64.079,35.921,295.46,224,0.960,bicubic,-42.827,-33.497,-40
vit_base_patch16_224.augreg_in21k_ft_in1k,43.221,56.779,65.722,34.279,86.57,224,0.900,bicubic,-41.310,-31.572,+73
volo_d4_448.sail_in1k,43.133,56.867,64.108,35.892,193.41,448,1.150,bicubic,-43.659,-33.776,-84
efficientnet_b5.sw_in12k_ft_in1k,42.872,57.128,65.415,34.585,30.39,448,1.000,bicubic,-43.024,-32.321,-32
xcit_large_24_p8_384.fb_dist_in1k,42.838,57.162,63.418,36.582,188.93,384,1.000,bicubic,-43.158,-34.272,-40
regnety_160.lion_in12k_ft_in1k,42.748,57.252,64.203,35.797,83.59,288,1.000,bicubic,-43.240,-33.632,-38
regnety_160.sw_in12k_ft_in1k,42.683,57.317,64.338,35.662,83.59,288,1.000,bicubic,-43.303,-33.496,-38
maxvit_small_tf_512.in1k,42.681,57.319,64.537,35.464,69.13,512,1.000,bicubic,-43.403,-33.227,-48
convnext_small.in12k_ft_in1k,42.669,57.331,64.342,35.658,50.22,288,1.000,bicubic,-42.661,-33.204,+3
xcit_large_24_p8_224.fb_dist_in1k,42.557,57.443,63.098,36.902,188.93,224,1.000,bicubic,-42.845,-34.304,-4
tf_efficientnet_b8.ra_in1k,42.498,57.502,64.873,35.127,87.41,672,0.954,bicubic,-42.870,-32.521,-2
caformer_b36.sail_in1k,42.465,57.535,62.849,37.151,98.75,224,1.000,bicubic,-43.039,-34.461,-15
caformer_b36.sail_in1k_384,42.457,57.543,62.856,37.144,98.75,384,1.000,bicubic,-43.951,-34.958,-74
maxvit_large_tf_224.in1k,42.414,57.586,63.399,36.601,211.79,224,0.950,bicubic,-42.528,-33.571,+33
cait_m36_384.fb_dist_in1k,42.410,57.590,63.324,36.676,271.22,384,1.000,bicubic,-43.648,-34.406,-53
volo_d4_224.sail_in1k,42.304,57.696,63.002,36.998,192.96,224,0.960,bicubic,-43.568,-34.470,-43
caformer_s18.sail_in22k_ft_in1k_384,42.054,57.946,64.774,35.226,26.34,384,1.000,bicubic,-43.360,-32.928,-14
deit3_small_patch16_384.fb_in22k_ft_in1k,41.954,58.046,64.564,35.436,22.21,384,1.000,bicubic,-42.870,-32.922,+38
vit_medium_patch16_gap_384.sw_in12k_ft_in1k,41.891,58.109,63.701,36.299,39.03,384,0.950,bicubic,-43.639,-33.935,-23
maxvit_tiny_tf_512.in1k,41.842,58.158,63.576,36.424,31.05,512,1.000,bicubic,-43.822,-34.008,-32
caformer_s36.sail_in1k_384,41.738,58.262,62.762,37.238,39.30,384,1.000,bicubic,-44.004,-34.910,-38
swin_small_patch4_window7_224.ms_in22k_ft_in1k,41.590,58.410,64.542,35.458,49.61,224,0.900,bicubic,-41.708,-32.422,+192
convformer_s18.sail_in22k_ft_in1k_384,41.573,58.427,63.348,36.652,26.77,384,1.000,bicubic,-43.425,-34.222,+21
regnety_2560.seer_ft_in1k,41.524,58.476,64.896,35.104,"1,282.60",384,1.000,bicubic,-43.626,-32.542,+4
caformer_m36.sail_in1k_384,41.498,58.502,61.524,38.476,56.20,384,1.000,bicubic,-44.668,-36.296,-72
coatnet_rmlp_1_rw2_224.sw_in12k_ft_in1k,41.486,58.514,61.485,38.515,41.72,224,0.950,bicubic,-43.424,-35.473,+25
tf_efficientnet_b7.ra_in1k,41.437,58.563,63.027,36.973,66.35,600,0.949,bicubic,-43.495,-34.181,+21
tf_efficientnet_b7.ap_in1k,41.433,58.567,62.880,37.120,66.35,600,0.949,bicubic,-43.691,-34.372,+1
tf_efficientnet_b5.ap_in1k,41.418,58.582,62.074,37.926,30.39,456,0.934,bicubic,-42.840,-34.900,+79
regnety_120.sw_in12k_ft_in1k,41.331,58.669,63.187,36.813,51.82,288,1.000,bicubic,-44.069,-34.395,-23
dm_nfnet_f3.dm_in1k,41.323,58.677,62.110,37.890,254.92,416,0.940,bicubic,-44.363,-35.460,-44
resnetv2_152x4_bit.goog_in21k_ft_in1k,41.306,58.694,64.311,35.689,936.53,480,1.000,bilinear,-43.610,-33.127,+17
dm_nfnet_f5.dm_in1k,41.290,58.710,62.013,37.987,377.21,544,0.954,bicubic,-44.810,-35.675,-75
caformer_s18.sail_in22k_ft_in1k,41.217,58.783,63.831,36.169,26.34,224,1.000,bicubic,-42.857,-33.367,+92
dm_nfnet_f6.dm_in1k,41.170,58.830,62.843,37.157,438.36,576,0.956,bicubic,-45.192,-35.053,-93
convnext_tiny.in12k_ft_in1k_384,41.113,58.887,62.825,37.175,28.59,384,1.000,bicubic,-44.009,-34.781,-6
tf_efficientnet_b6.ap_in1k,41.095,58.905,62.365,37.635,43.04,528,0.942,bicubic,-43.693,-34.773,+22
xcit_large_24_p16_384.fb_dist_in1k,41.036,58.964,61.237,38.763,189.10,384,1.000,bicubic,-44.718,-36.301,-56
xcit_large_24_p16_224.fb_dist_in1k,40.942,59.058,61.326,38.674,189.10,224,1.000,bicubic,-43.974,-35.802,+11
tf_efficientnetv2_l.in1k,40.928,59.072,62.011,37.989,118.52,480,1.000,bicubic,-44.736,-35.463,-51
tf_efficientnetv2_s.in21k_ft_in1k,40.922,59.078,63.849,36.151,21.46,384,1.000,bicubic,-43.364,-33.403,+64
maxvit_small_tf_384.in1k,40.844,59.156,61.972,38.028,69.02,384,1.000,bicubic,-44.696,-35.490,-47
edgenext_base.in21k_ft_in1k,40.836,59.164,61.776,38.224,18.51,320,1.000,bicubic,-43.218,-35.420,+86
maxvit_base_tf_224.in1k,40.789,59.211,61.196,38.804,119.47,224,0.950,bicubic,-44.071,-35.792,+10
xcit_medium_24_p8_224.fb_dist_in1k,40.498,59.502,60.502,39.498,84.32,224,1.000,bicubic,-44.576,-36.748,-7
vit_small_r26_s32_384.augreg_in21k_ft_in1k,40.474,59.526,62.731,37.269,36.47,384,1.000,bicubic,-43.574,-34.597,+85
tf_efficientnet_b4.ap_in1k,40.474,59.526,61.713,38.287,19.34,380,0.922,bicubic,-42.776,-34.683,+171
deit3_base_patch16_224.fb_in1k,40.382,59.618,60.164,39.836,86.59,224,0.900,bicubic,-43.404,-36.422,+110
inception_next_base.sail_in1k_384,40.333,59.667,60.781,39.219,86.67,384,1.000,bicubic,-44.869,-36.633,-25
convformer_s18.sail_in22k_ft_in1k,40.301,59.699,61.719,38.281,26.77,224,1.000,bicubic,-43.437,-35.329,+117
vit_medium_patch16_gap_256.sw_in12k_ft_in1k,40.278,59.722,61.664,38.336,38.86,256,0.950,bicubic,-44.168,-35.546,+36
flexivit_large.600ep_in1k,40.268,59.732,60.365,39.635,304.36,240,0.950,bicubic,-45.272,-37.123,-58
vit_base_patch16_224_miil.in21k_ft_in1k,40.164,59.836,60.883,39.117,86.54,224,0.875,bilinear,-44.102,-35.921,+54
deit3_small_patch16_224.fb_in22k_ft_in1k,40.152,59.848,61.864,38.136,22.06,224,1.000,bicubic,-42.924,-34.912,+183
regnetz_e8.ra3_in1k,40.136,59.864,61.316,38.684,57.70,320,1.000,bicubic,-44.898,-35.956,-14
maxvit_rmlp_small_rw_224.sw_in1k,40.105,59.895,59.514,40.486,64.90,224,0.900,bicubic,-44.387,-37.496,+26
flexivit_large.1200ep_in1k,40.097,59.903,60.650,39.350,304.36,240,0.950,bicubic,-45.547,-36.890,-67
xcit_medium_24_p8_384.fb_dist_in1k,40.050,59.950,60.451,39.549,84.32,384,1.000,bicubic,-45.766,-37.141,-82
flexivit_large.300ep_in1k,39.991,60.009,59.987,40.013,304.36,240,0.950,bicubic,-45.297,-37.413,-45
maxvit_tiny_tf_384.in1k,39.971,60.029,60.909,39.091,30.98,384,1.000,bicubic,-45.129,-36.469,-28
convnextv2_tiny.fcmae_ft_in22k_in1k,39.938,60.062,61.835,38.165,28.64,288,1.000,bicubic,-44.478,-35.425,+35
dm_nfnet_f4.dm_in1k,39.926,60.074,60.449,39.551,316.07,512,0.951,bicubic,-45.910,-37.369,-87
xcit_medium_24_p16_384.fb_dist_in1k,39.897,60.103,60.097,39.903,84.40,384,1.000,bicubic,-45.527,-37.233,-62
convnext_tiny.fb_in22k_ft_in1k_384,39.787,60.213,61.536,38.464,28.59,384,1.000,bicubic,-44.301,-35.608,+61
cait_s36_384.fb_dist_in1k,39.767,60.233,60.469,39.531,68.37,384,1.000,bicubic,-45.687,-37.009,-65
convnextv2_base.fcmae_ft_in1k,39.755,60.245,59.875,40.125,88.72,288,1.000,bicubic,-45.719,-37.509,-68
volo_d3_448.sail_in1k,39.702,60.298,59.760,40.240,86.63,448,1.000,bicubic,-46.800,-37.950,-135
efficientnetv2_rw_m.agc_in1k,39.675,60.325,59.679,40.321,53.24,416,1.000,bicubic,-45.135,-37.473,-10
xception65.ra3_in1k,39.623,60.377,60.919,39.081,39.92,299,0.940,bicubic,-43.557,-35.673,+153
ecaresnet269d.ra2_in1k,39.604,60.396,60.345,39.655,102.09,352,1.000,bicubic,-45.364,-36.877,-24
tf_efficientnet_b3.ns_jft_in1k,39.586,60.414,61.475,38.525,12.23,300,0.904,bicubic,-44.466,-35.443,+60
caformer_m36.sail_in1k,39.576,60.424,58.692,41.308,56.20,224,1.000,bicubic,-45.656,-38.508,-53
convformer_b36.sail_in1k,39.525,60.475,58.081,41.919,99.88,224,1.000,bicubic,-45.293,-38.865,-16
caformer_s36.sail_in1k,39.519,60.481,59.760,40.240,39.30,224,1.000,bicubic,-44.987,-37.236,+5
volo_d3_224.sail_in1k,39.482,60.518,59.858,40.142,86.33,224,0.960,bicubic,-45.932,-37.418,-70
convnext_large.fb_in1k,39.460,60.540,59.188,40.812,197.77,288,1.000,bicubic,-45.386,-38.026,-21
deit3_base_patch16_384.fb_in1k,39.409,60.591,58.930,41.070,86.88,384,1.000,bicubic,-45.665,-38.344,-38
xcit_small_24_p8_224.fb_dist_in1k,39.327,60.673,59.402,40.598,47.63,224,1.000,bicubic,-45.541,-37.788,-25
inception_next_base.sail_in1k,39.295,60.705,59.245,40.755,86.67,224,0.950,bicubic,-44.797,-37.551,+45
xcit_medium_24_p16_224.fb_dist_in1k,39.270,60.730,59.461,40.539,84.40,224,1.000,bicubic,-45.016,-37.471,+26
convformer_m36.sail_in1k,39.234,60.766,57.631,42.369,57.05,224,1.000,bicubic,-45.260,-39.235,-1
coat_lite_medium_384.in1k,39.175,60.825,59.280,40.720,44.57,384,1.000,bicubic,-45.703,-38.092,-30
tiny_vit_11m_224.dist_in22k_ft_in1k,39.124,60.876,61.035,38.965,11.00,224,0.950,bicubic,-44.104,-35.595,+135
efficientnet_b4.ra2_in1k,39.087,60.913,59.618,40.382,19.34,384,1.000,bicubic,-44.327,-36.980,+115
hrnet_w18_ssld.paddle_in1k,39.067,60.933,60.650,39.350,21.30,288,1.000,bilinear,-42.981,-35.600,+275
tresnet_v2_l.miil_in21k_ft_in1k,39.010,60.990,59.477,40.523,46.17,224,0.875,bilinear,-44.884,-37.013,+57
xcit_small_24_p8_384.fb_dist_in1k,39.010,60.990,59.166,40.834,47.63,384,1.000,bicubic,-46.544,-38.404,-94
resnetv2_152x2_bit.goog_teacher_in21k_ft_in1k_384,38.979,61.021,62.428,37.572,236.34,384,1.000,bicubic,-44.857,-34.698,+63
convformer_b36.sail_in1k_384,38.930,61.070,58.413,41.587,99.88,384,1.000,bicubic,-46.810,-39.111,-105
convnext_tiny.in12k_ft_in1k,38.912,61.088,59.862,40.138,28.59,288,1.000,bicubic,-45.538,-37.478,-5
maxvit_small_tf_224.in1k,38.871,61.129,59.162,40.838,68.93,224,0.950,bicubic,-45.555,-37.662,+3
coatnet_rmlp_2_rw_224.sw_in1k,38.869,61.131,58.018,41.982,73.88,224,0.950,bicubic,-45.739,-38.722,-24
vit_base_patch32_384.augreg_in21k_ft_in1k,38.796,61.204,60.329,39.671,88.30,384,1.000,bicubic,-44.556,-36.511,+115
tf_efficientnetv2_m.in1k,38.714,61.286,59.791,40.209,54.14,480,1.000,bicubic,-46.490,-37.573,-71
efficientvit_b3.r288_in1k,38.659,61.341,58.364,41.636,48.65,288,1.000,bicubic,-45.495,-38.372,+25
eca_nfnet_l2.ra3_in1k,38.655,61.345,59.445,40.555,56.72,384,1.000,bicubic,-46.045,-37.821,-32
davit_small.msft_in1k,38.631,61.369,58.203,41.797,49.75,224,0.950,bicubic,-45.621,-38.737,+12
efficientvit_b3.r256_in1k,38.621,61.379,58.641,41.359,48.65,256,1.000,bicubic,-45.181,-37.875,+59
mvitv2_small.fb_in1k,38.578,61.422,58.130,41.870,34.87,224,0.900,bicubic,-45.192,-38.446,+61
xcit_small_12_p8_384.fb_dist_in1k,38.547,61.453,58.795,41.205,26.21,384,1.000,bicubic,-46.531,-38.487,-63
convformer_m36.sail_in1k_384,38.531,61.469,57.736,42.264,57.05,384,1.000,bicubic,-47.049,-39.806,-109
xcit_small_24_p16_384.fb_dist_in1k,38.499,61.501,58.396,41.604,47.67,384,1.000,bicubic,-46.591,-38.916,-67
davit_base.msft_in1k,38.490,61.510,57.535,42.465,87.95,224,0.950,bicubic,-46.152,-39.485,-37
mvitv2_base.fb_in1k,38.458,61.542,57.934,42.066,51.47,224,0.900,bicubic,-45.992,-38.924,-18
rexnetr_300.sw_in12k_ft_in1k,38.431,61.569,60.612,39.388,34.81,288,1.000,bicubic,-46.115,-36.644,-31
convformer_s36.sail_in1k,38.405,61.595,57.710,42.290,40.01,224,1.000,bicubic,-45.655,-39.036,+22
xcit_small_12_p8_224.fb_dist_in1k,38.360,61.640,58.832,41.168,26.21,224,1.000,bicubic,-45.876,-38.038,+5
tf_efficientnet_b5.ra_in1k,38.331,61.669,59.928,40.072,30.39,456,0.934,bicubic,-45.483,-36.823,+46
fastvit_ma36.apple_dist_in1k,38.323,61.677,58.461,41.539,44.07,256,0.950,bicubic,-46.275,-38.541,-39
deit_base_distilled_patch16_384.fb_in1k,38.244,61.756,57.785,42.215,87.63,384,1.000,bicubic,-47.180,-39.621,-108
xcit_large_24_p8_224.fb_in1k,38.106,61.894,57.873,42.127,188.93,224,1.000,bicubic,-46.288,-38.791,-10
vit_base_patch16_384.orig_in21k_ft_in1k,38.105,61.895,60.426,39.574,86.86,384,1.000,bicubic,-46.096,-36.792,+4
resnetv2_152x2_bit.goog_in21k_ft_in1k,38.000,62.000,61.137,38.863,236.34,448,1.000,bilinear,-46.510,-36.297,-36
repvit_m2_3.dist_450e_in1k,37.996,62.004,58.154,41.846,23.69,224,0.950,bicubic,-45.746,-38.490,+51
pvt_v2_b4.in1k,37.953,62.047,58.217,41.783,62.56,224,0.900,bicubic,-45.759,-38.453,+55
coat_lite_medium.in1k,37.880,62.120,57.792,42.208,44.57,224,0.900,bicubic,-45.719,-38.935,+64
cait_s24_384.fb_dist_in1k,37.879,62.121,58.069,41.931,47.06,384,1.000,bicubic,-47.169,-39.277,-77
convnextv2_nano.fcmae_ft_in22k_in1k_384,37.875,62.125,59.443,40.557,15.62,384,1.000,bicubic,-45.499,-37.301,+88
resnet152d.ra2_in1k,37.853,62.147,58.362,41.638,60.21,320,1.000,bicubic,-45.831,-38.376,+56
convformer_s36.sail_in1k_384,37.812,62.188,57.488,42.512,40.01,384,1.000,bicubic,-47.566,-39.988,-112
resnetrs420.tf_in1k,37.745,62.255,58.209,41.791,191.89,416,1.000,bicubic,-47.259,-38.915,-78
deit3_medium_patch16_224.fb_in1k,37.709,62.291,57.087,42.913,38.85,224,0.900,bicubic,-45.377,-39.207,+114
xcit_small_24_p16_224.fb_dist_in1k,37.700,62.300,57.374,42.626,47.67,224,1.000,bicubic,-46.174,-39.362,+23
resnetrs350.tf_in1k,37.664,62.336,58.097,41.903,163.96,384,1.000,bicubic,-47.050,-38.895,-63
caformer_s18.sail_in1k_384,37.660,62.340,57.612,42.388,26.34,384,1.000,bicubic,-47.366,-39.746,-83
regnety_640.seer_ft_in1k,37.584,62.416,59.864,40.136,281.38,384,1.000,bicubic,-46.324,-37.058,+15
xcit_small_12_p16_384.fb_dist_in1k,37.582,62.418,57.761,42.239,26.25,384,1.000,bicubic,-47.130,-39.357,-65
pvt_v2_b5.in1k,37.548,62.452,57.301,42.699,81.96,224,0.900,bicubic,-46.192,-39.335,+38
resnet200d.ra2_in1k,37.503,62.497,58.301,41.699,64.69,320,1.000,bicubic,-46.461,-38.525,+8
maxvit_rmlp_tiny_rw_256.sw_in1k,37.381,62.619,57.193,42.807,29.15,256,0.950,bicubic,-46.843,-39.675,-16
resnetv2_152x2_bit.goog_teacher_in21k_ft_in1k,37.342,62.658,59.404,40.596,236.34,224,0.875,bicubic,-45.534,-37.178,+122
efficientvit_b3.r224_in1k,37.340,62.660,57.122,42.878,48.65,224,0.950,bicubic,-46.120,-39.208,+62
regnety_1280.seer_ft_in1k,37.332,62.668,59.133,40.867,644.81,384,1.000,bicubic,-47.100,-37.959,-42
resnest269e.in1k,37.309,62.691,57.488,42.512,110.93,416,0.928,bicubic,-47.199,-39.502,-56
convnext_base.fb_in1k,37.307,62.693,57.319,42.681,88.59,288,1.000,bicubic,-47.121,-39.649,-43
resmlp_big_24_224.fb_in22k_ft_in1k,37.242,62.758,58.191,41.809,129.14,224,0.875,bicubic,-47.156,-38.921,-36
vit_small_r26_s32_224.augreg_in21k_ft_in1k,37.232,62.768,59.072,40.928,36.43,224,0.900,bicubic,-44.632,-36.950,+239
repvit_m2_3.dist_300e_in1k,37.210,62.790,57.234,42.766,23.69,224,0.950,bicubic,-46.294,-39.270,+49
pit_b_distilled_224.in1k,37.195,62.805,56.507,43.493,74.79,224,0.900,bicubic,-46.571,-39.961,+22
cait_s24_224.fb_dist_in1k,37.153,62.847,56.724,43.276,46.92,224,1.000,bicubic,-46.289,-39.850,+56
dm_nfnet_f2.dm_in1k,37.128,62.872,56.981,43.019,193.78,352,0.920,bicubic,-48.064,-40.365,-115
resnetaa101d.sw_in12k_ft_in1k,37.116,62.884,57.853,42.147,44.57,288,1.000,bicubic,-47.008,-39.253,-20
tiny_vit_21m_224.in1k,37.114,62.886,57.380,42.620,21.20,224,0.950,bicubic,-46.140,-39.212,+73
efficientformer_l7.snap_dist_in1k,37.112,62.888,56.900,43.100,82.23,224,0.950,bicubic,-46.270,-39.636,+61
pvt_v2_b3.in1k,37.108,62.892,57.335,42.665,45.24,224,0.900,bicubic,-46.010,-39.221,+87
fastvit_sa36.apple_dist_in1k,37.106,62.894,57.144,42.856,31.53,256,0.900,bicubic,-46.920,-39.710,-13
vit_base_patch32_224.augreg_in21k_ft_in1k,37.073,62.927,59.307,40.693,88.22,224,0.900,bicubic,-43.643,-36.259,+347
volo_d1_384.sail_in1k,37.069,62.931,57.138,42.862,26.78,384,1.000,bicubic,-48.175,-40.056,-131
efficientnetv2_rw_s.ra2_in1k,37.057,62.943,56.814,43.186,23.94,384,1.000,bicubic,-46.749,-39.918,+6
tf_efficientnet_b3.ap_in1k,37.051,62.949,57.238,42.762,12.23,300,0.904,bicubic,-44.769,-38.388,+230
maxvit_tiny_tf_224.in1k,37.020,62.980,56.904,43.096,30.92,224,0.950,bicubic,-46.382,-39.686,+49
swinv2_base_window16_256.ms_in1k,36.990,63.010,56.140,43.860,87.92,256,0.900,bicubic,-47.610,-40.950,-83
xcit_small_12_p16_224.fb_dist_in1k,36.975,63.025,56.725,43.275,26.25,224,1.000,bicubic,-46.353,-39.691,+60
regnetz_040_h.ra3_in1k,36.973,63.027,57.276,42.724,28.94,320,1.000,bicubic,-47.519,-39.482,-73
inception_next_small.sail_in1k,36.914,63.086,56.749,43.251,49.37,224,0.875,bicubic,-46.664,-39.849,+28
volo_d1_224.sail_in1k,36.888,63.112,56.635,43.365,26.63,224,0.960,bicubic,-47.274,-40.141,-37
seresnet152d.ra2_in1k,36.784,63.216,56.727,43.273,66.84,320,1.000,bicubic,-47.576,-40.313,-55
efficientformerv2_l.snap_dist_in1k,36.764,63.236,56.627,43.373,26.32,224,0.950,bicubic,-46.868,-39.931,+20
maxxvit_rmlp_small_rw_256.sw_in1k,36.707,63.293,56.012,43.988,66.01,256,0.950,bicubic,-47.917,-41.056,-92
seresnext101d_32x8d.ah_in1k,36.633,63.367,56.325,43.675,93.59,288,1.000,bicubic,-47.725,-40.596,-57
volo_d2_224.sail_in1k,36.627,63.373,56.470,43.530,58.68,224,0.960,bicubic,-48.575,-40.720,-136
caformer_s18.sail_in1k,36.578,63.422,55.831,44.169,26.34,224,1.000,bicubic,-47.076,-40.687,+15
xception65p.ra3_in1k,36.570,63.430,56.438,43.562,39.82,299,0.940,bicubic,-46.556,-40.044,+66
fastvit_ma36.apple_in1k,36.568,63.432,56.564,43.436,44.07,256,0.950,bicubic,-47.314,-40.178,-19
fastvit_sa36.apple_in1k,36.556,63.444,56.004,43.996,31.53,256,0.900,bicubic,-46.944,-40.626,+24
seresnextaa101d_32x8d.ah_in1k,36.532,63.468,56.409,43.591,93.59,288,1.000,bicubic,-48.034,-40.667,-95
focalnet_base_srf.ms_in1k,36.460,63.540,56.217,43.783,88.15,224,0.900,bicubic,-47.360,-40.464,-14
regnetz_d32.ra3_in1k,36.452,63.548,57.366,42.634,27.58,320,0.950,bicubic,-47.570,-39.502,-34
cait_xs24_384.fb_dist_in1k,36.416,63.584,56.944,43.056,26.67,384,1.000,bicubic,-47.645,-39.941,-42
efficientnet_b3.ra2_in1k,36.414,63.586,56.830,43.170,12.23,320,1.000,bicubic,-45.831,-39.288,+166
deit_base_distilled_patch16_224.fb_in1k,36.407,63.593,56.615,43.385,87.34,224,0.900,bicubic,-46.983,-39.873,+32
volo_d2_384.sail_in1k,36.407,63.593,56.323,43.677,58.87,384,1.000,bicubic,-49.635,-41.251,-209
resnetv2_101x3_bit.goog_in21k_ft_in1k,36.383,63.617,59.068,40.932,387.93,448,1.000,bilinear,-48.055,-38.314,-83
gcvit_base.in1k,36.381,63.619,55.880,44.120,90.32,224,0.875,bicubic,-48.063,-41.202,-85
dm_nfnet_f1.dm_in1k,36.326,63.674,55.747,44.253,132.63,320,0.910,bicubic,-48.376,-41.435,-112
resnetrs270.tf_in1k,36.310,63.690,56.566,43.434,129.86,352,1.000,bicubic,-48.118,-40.402,-83
tresnet_m.miil_in21k_ft_in1k,36.289,63.711,55.792,44.208,31.39,224,0.875,bilinear,-46.781,-40.318,+61
mixer_b16_224.miil_in21k_ft_in1k,36.267,63.733,55.965,44.035,59.88,224,0.875,bilinear,-46.039,-39.755,+152
convnext_small.fb_in1k,36.248,63.752,55.912,44.088,50.22,288,1.000,bicubic,-47.453,-40.896,-7
convformer_s18.sail_in1k_384,36.204,63.796,56.059,43.941,26.77,384,1.000,bicubic,-48.198,-41.053,-81
deit3_small_patch16_384.fb_in1k,36.193,63.807,55.558,44.442,22.21,384,1.000,bicubic,-47.235,-41.116,+16
tf_efficientnet_b2.ns_jft_in1k,36.167,63.833,57.559,42.441,9.11,260,0.890,bicubic,-46.211,-38.695,+133
mvitv2_tiny.fb_in1k,36.167,63.833,55.132,44.868,24.17,224,0.900,bicubic,-46.243,-41.020,+129
focalnet_base_lrf.ms_in1k,36.118,63.882,55.810,44.190,88.75,224,0.900,bicubic,-47.720,-40.798,-34
regnety_320.seer_ft_in1k,36.069,63.931,58.484,41.516,145.05,384,1.000,bicubic,-47.259,-38.224,+27
regnetz_040.ra3_in1k,36.053,63.947,55.735,44.265,27.12,320,1.000,bicubic,-48.187,-41.197,-75
resnest200e.in1k,35.929,64.071,55.847,44.153,70.20,320,0.909,bicubic,-47.915,-41.037,-39
resnet18.fb_swsl_ig1b_ft_in1k,35.874,64.126,58.447,41.553,11.69,224,0.875,bilinear,-37.414,-33.283,+666
sequencer2d_l.in1k,35.831,64.169,55.715,44.285,54.30,224,0.875,bicubic,-47.563,-40.781,+13
eca_nfnet_l1.ra2_in1k,35.815,64.185,55.953,44.047,41.41,320,1.000,bicubic,-48.197,-41.073,-54
gcvit_small.in1k,35.760,64.240,54.790,45.210,51.09,224,0.875,bicubic,-48.132,-41.868,-47
vit_base_patch16_224.orig_in21k_ft_in1k,35.754,64.246,57.401,42.599,86.57,224,0.900,bicubic,-46.036,-38.725,+191
vit_relpos_medium_patch16_cls_224.sw_in1k,35.725,64.275,54.919,45.081,38.76,224,0.900,bicubic,-46.847,-41.148,+101
xcit_small_24_p8_224.fb_in1k,35.556,64.444,54.780,45.220,47.63,224,1.000,bicubic,-48.278,-41.852,-42
xcit_small_12_p8_224.fb_in1k,35.522,64.478,55.507,44.493,26.21,224,1.000,bicubic,-47.812,-40.975,+15
xcit_large_24_p16_224.fb_in1k,35.522,64.478,54.741,45.259,189.10,224,1.000,bicubic,-47.380,-41.143,+56
coat_small.in1k,35.520,64.480,55.157,44.843,21.69,224,0.900,bicubic,-46.842,-41.051,+121
flexivit_base.1200ep_in1k,35.519,64.481,53.837,46.163,86.59,240,0.950,bicubic,-49.157,-43.157,-133
vit_small_patch16_384.augreg_in21k_ft_in1k,35.467,64.533,57.543,42.457,22.20,384,1.000,bicubic,-48.337,-39.557,-43
regnetz_d8_evos.ch_in1k,35.454,64.546,55.751,44.249,23.46,320,1.000,bicubic,-48.672,-41.261,-79
xcit_medium_24_p8_224.fb_in1k,35.452,64.548,54.823,45.177,84.32,224,1.000,bicubic,-48.294,-41.887,-37
swinv2_base_window8_256.ms_in1k,35.450,64.550,54.607,45.393,87.92,256,0.900,bicubic,-48.800,-42.317,-92
swinv2_small_window16_256.ms_in1k,35.428,64.572,54.623,45.377,49.73,256,0.900,bicubic,-48.796,-42.155,-90
dm_nfnet_f0.dm_in1k,35.407,64.594,55.525,44.475,71.49,256,0.900,bicubic,-48.080,-41.043,-12
resnest101e.in1k,35.373,64.627,55.790,44.210,48.28,256,0.875,bilinear,-47.511,-40.532,+47
resnet152.a1h_in1k,35.357,64.643,54.627,45.373,60.19,288,1.000,bicubic,-48.093,-41.911,-11
tf_efficientnet_b5.aa_in1k,35.316,64.684,56.038,43.962,30.39,456,0.934,bicubic,-48.372,-40.674,-33
convit_base.fb_in1k,35.302,64.698,54.939,45.061,86.54,224,0.875,bicubic,-46.988,-40.997,+123
focalnet_small_lrf.ms_in1k,35.277,64.723,54.888,45.112,50.34,224,0.900,bicubic,-48.217,-41.692,-18
efficientformer_l3.snap_dist_in1k,35.253,64.747,54.501,45.499,31.41,224,0.950,bicubic,-47.295,-41.749,+88
xcit_tiny_24_p8_224.fb_dist_in1k,35.241,64.759,55.267,44.733,12.11,224,1.000,bicubic,-47.325,-40.791,+85
edgenext_base.usi_in1k,35.216,64.784,55.126,44.874,18.51,320,1.000,bicubic,-48.742,-41.644,-74
fastvit_sa24.apple_dist_in1k,35.214,64.786,54.674,45.326,21.55,256,0.900,bicubic,-48.128,-41.878,-4
convformer_s18.sail_in1k,35.208,64.792,54.629,45.371,26.77,224,1.000,bicubic,-47.778,-41.621,+32
flexivit_base.600ep_in1k,35.137,64.863,53.652,46.348,86.59,240,0.950,bicubic,-49.387,-43.284,-139
twins_svt_large.in1k,35.084,64.916,54.719,45.281,99.27,224,0.900,bicubic,-48.594,-41.869,-40
repvgg_b3.rvgg_in1k,35.057,64.943,54.548,45.452,123.09,224,0.875,bilinear,-45.449,-40.706,+294
repvgg_b3g4.rvgg_in1k,35.049,64.951,54.788,45.212,83.83,224,0.875,bilinear,-45.167,-40.304,+327
convnextv2_tiny.fcmae_ft_in1k,35.045,64.955,54.224,45.776,28.64,288,1.000,bicubic,-48.419,-42.494,-26
repvit_m1_5.dist_450e_in1k,35.021,64.979,54.477,45.523,14.64,224,0.950,bicubic,-47.491,-41.635,+82
regnetz_d8.ra3_in1k,35.008,64.992,55.930,44.070,23.37,320,1.000,bicubic,-49.044,-41.066,-91
xcit_tiny_24_p8_384.fb_dist_in1k,34.915,65.085,55.148,44.852,12.11,384,1.000,bicubic,-48.831,-41.253,-59
resnet101d.ra2_in1k,34.876,65.124,54.210,45.790,44.57,320,1.000,bicubic,-48.144,-42.242,+19
rexnetr_200.sw_in12k_ft_in1k,34.870,65.130,55.857,44.143,16.52,288,1.000,bicubic,-48.268,-40.779,+2
repvit_m1_5.dist_300e_in1k,34.868,65.132,54.375,45.625,14.64,224,0.950,bicubic,-47.508,-41.655,+92
coatnet_1_rw_224.sw_in1k,34.850,65.150,53.442,46.558,41.72,224,0.950,bicubic,-48.746,-42.940,-45
coatnet_rmlp_1_rw_224.sw_in1k,34.803,65.197,53.953,46.047,41.69,224,0.950,bicubic,-48.559,-42.497,-20
swin_s3_base_224.ms_in1k,34.803,65.197,53.703,46.297,71.13,224,0.900,bicubic,-49.117,-42.969,-88
flexivit_base.300ep_in1k,34.799,65.201,53.161,46.839,86.59,240,0.950,bicubic,-49.607,-43.723,-131
seresnext101_32x8d.ah_in1k,34.789,65.211,53.452,46.548,93.57,288,1.000,bicubic,-49.397,-43.422,-113
resmlp_big_24_224.fb_distilled_in1k,34.788,65.213,54.642,45.358,129.14,224,0.875,bicubic,-48.804,-42.008,-49
maxvit_tiny_rw_224.sw_in1k,34.780,65.220,53.351,46.649,29.06,224,0.950,bicubic,-48.724,-43.163,-44
repvgg_d2se.rvgg_in1k,34.740,65.260,53.200,46.800,133.33,320,1.000,bilinear,-48.820,-43.458,-49
vit_relpos_base_patch16_clsgap_224.sw_in1k,34.725,65.275,54.214,45.786,86.43,224,0.900,bicubic,-48.035,-41.958,+29
vit_base_patch16_rpn_224.sw_in1k,34.715,65.285,54.658,45.342,86.54,224,0.900,bicubic,-47.487,-41.338,+108
sequencer2d_m.in1k,34.703,65.297,53.996,46.004,38.31,224,0.875,bicubic,-48.109,-42.278,+21
deit3_small_patch16_224.fb_in1k,34.685,65.315,53.173,46.827,22.06,224,0.900,bicubic,-46.685,-42.283,+188
davit_tiny.msft_in1k,34.673,65.326,54.344,45.656,28.36,224,0.950,bicubic,-48.023,-41.930,+33
vit_large_patch32_384.orig_in21k_ft_in1k,34.672,65.329,55.733,44.267,306.63,384,1.000,bicubic,-46.839,-40.357,+168
focalnet_small_srf.ms_in1k,34.670,65.330,54.420,45.580,49.89,224,0.900,bicubic,-48.746,-42.018,-42
ecaresnet101d.miil_in1k,34.644,65.356,54.499,45.501,44.57,288,0.950,bicubic,-48.340,-42.043,+6
resnext101_32x16d.fb_ssl_yfcc100m_ft_in1k,34.615,65.385,55.937,44.063,194.03,224,0.875,bilinear,-47.223,-40.155,+136
vit_relpos_base_patch16_224.sw_in1k,34.607,65.393,54.291,45.709,86.43,224,0.900,bicubic,-47.889,-41.847,+61
repvgg_b2g4.rvgg_in1k,34.593,65.407,54.768,45.232,61.76,224,0.875,bilinear,-44.789,-39.908,+359
gcvit_tiny.in1k,34.554,65.446,53.249,46.751,28.22,224,0.875,bicubic,-48.830,-43.149,-41
resnetrs200.tf_in1k,34.502,65.498,54.281,45.719,93.21,320,1.000,bicubic,-49.942,-42.561,-158
poolformerv2_m48.sail_in1k,34.485,65.515,54.027,45.973,73.35,224,1.000,bicubic,-48.133,-42.045,+35
efficientvit_b2.r256_in1k,34.430,65.570,53.593,46.407,24.33,256,1.000,bicubic,-48.260,-42.501,+24
convnextv2_nano.fcmae_ft_in22k_in1k,34.379,65.621,55.014,44.986,15.62,288,1.000,bicubic,-48.285,-41.506,+26
resnest50d_4s2x40d.in1k,34.369,65.631,54.725,45.275,30.42,224,0.875,bicubic,-46.751,-40.835,+199
resnetrs152.tf_in1k,34.351,65.649,53.564,46.436,86.62,320,1.000,bicubic,-49.351,-43.048,-80
pvt_v2_b2_li.in1k,34.318,65.682,54.104,45.896,22.55,224,0.900,bicubic,-47.876,-41.988,+93
crossvit_18_dagger_408.in1k,34.247,65.753,53.106,46.894,44.61,408,1.000,bicubic,-49.955,-43.712,-138
xcit_medium_24_p16_224.fb_in1k,34.241,65.759,53.157,46.843,84.40,224,1.000,bicubic,-48.399,-42.825,+24
efficientvit_b2.r288_in1k,34.224,65.776,53.546,46.454,24.33,288,1.000,bicubic,-48.876,-42.758,-20
pit_s_distilled_224.in1k,34.161,65.839,53.361,46.639,24.04,224,0.900,bicubic,-47.653,-42.369,+125
efficientnetv2_rw_t.ra2_in1k,34.159,65.841,53.135,46.865,13.65,288,1.000,bicubic,-48.191,-43.057,+65
tf_efficientnet_b1.ns_jft_in1k,34.151,65.849,55.503,44.497,7.79,240,0.882,bicubic,-47.237,-40.235,+164
twins_pcpvt_large.in1k,34.119,65.881,54.136,45.864,60.99,224,0.900,bicubic,-49.011,-42.468,-31
fastvit_sa24.apple_in1k,34.098,65.902,53.782,46.218,21.55,256,0.900,bicubic,-48.580,-42.490,+13
tf_efficientnet_b4.aa_in1k,34.058,65.942,54.212,45.788,19.34,380,0.922,bicubic,-48.960,-42.088,-18
efficientformerv2_s2.snap_dist_in1k,34.056,65.944,53.473,46.527,12.71,224,0.950,bicubic,-48.109,-42.437,+86
resnetv2_101.a1h_in1k,34.056,65.944,52.308,47.692,44.54,288,1.000,bicubic,-48.944,-44.146,-19
resnext101_32x8d.fb_ssl_yfcc100m_ft_in1k,34.033,65.967,55.596,44.404,88.79,224,0.875,bilinear,-47.573,-40.445,+131
xcit_small_24_p16_224.fb_in1k,34.005,65.995,53.281,46.719,47.67,224,1.000,bicubic,-48.571,-42.731,+28
tf_efficientnet_b6.aa_in1k,34.002,65.999,54.542,45.458,43.04,528,0.942,bicubic,-50.110,-42.342,-144
nfnet_l0.ra2_in1k,34.002,65.999,54.377,45.623,35.07,288,1.000,bicubic,-48.748,-42.139,-2
efficientnet_b3_pruned.in1k,33.992,66.008,54.110,45.890,9.86,300,0.904,bicubic,-46.860,-41.134,+213
regnety_160.deit_in1k,33.980,66.020,53.552,46.448,83.59,288,1.000,bicubic,-49.710,-43.228,-96
resnext101_64x4d.tv_in1k,33.966,66.034,52.530,47.470,83.46,224,0.875,bilinear,-49.026,-43.714,-25
gc_efficientnetv2_rw_t.agc_in1k,33.962,66.038,53.228,46.772,13.68,288,1.000,bicubic,-48.494,-43.068,+38
swinv2_cr_small_ns_224.sw_in1k,33.846,66.154,52.640,47.360,49.70,224,0.900,bicubic,-49.652,-43.844,-82
resnext101_64x4d.c1_in1k,33.842,66.158,52.161,47.839,83.46,288,1.000,bicubic,-49.314,-44.213,-48
poolformerv2_s36.sail_in1k,33.825,66.175,53.685,46.315,30.79,224,1.000,bicubic,-47.741,-42.005,+124
repvit_m3.dist_in1k,33.791,66.209,53.123,46.877,10.68,224,0.950,bicubic,-47.711,-42.444,+133
resnet101.a1h_in1k,33.781,66.219,53.104,46.896,44.55,288,1.000,bicubic,-48.997,-43.206,-15
xcit_small_12_p16_224.fb_in1k,33.756,66.244,53.228,46.772,26.25,224,1.000,bicubic,-48.214,-42.584,+91
swin_s3_small_224.ms_in1k,33.697,66.303,52.365,47.635,49.74,224,0.900,bicubic,-50.059,-44.087,-116
resnetv2_50x3_bit.goog_in21k_ft_in1k,33.671,66.329,55.888,44.112,217.32,448,1.000,bilinear,-50.349,-41.238,-144
swinv2_small_window8_256.ms_in1k,33.634,66.366,52.827,47.173,49.73,256,0.900,bicubic,-50.220,-43.817,-133
resnet152.tv2_in1k,33.622,66.378,51.656,48.344,60.19,224,0.965,bilinear,-48.664,-44.348,+51
inception_next_tiny.sail_in1k,33.583,66.417,52.978,47.022,28.06,224,0.875,bicubic,-48.895,-43.044,+25
resnet51q.ra2_in1k,33.555,66.445,53.023,46.977,35.70,288,1.000,bilinear,-48.805,-43.163,+37
xcit_tiny_24_p16_384.fb_dist_in1k,33.508,66.492,52.768,47.232,12.12,384,1.000,bicubic,-49.062,-43.508,+11
vit_relpos_medium_patch16_224.sw_in1k,33.498,66.502,52.620,47.380,38.75,224,0.900,bicubic,-48.964,-43.462,+23
regnety_080.ra3_in1k,33.455,66.545,52.939,47.061,39.18,288,1.000,bicubic,-50.471,-43.951,-148
cs3edgenet_x.c2_in1k,33.455,66.545,52.925,47.075,47.82,288,1.000,bicubic,-49.253,-43.445,-18
convmixer_1536_20.in1k,33.432,66.568,53.029,46.971,51.63,224,0.960,bicubic,-47.930,-42.585,+138
sequencer2d_s.in1k,33.430,66.570,52.383,47.617,27.65,224,0.875,bicubic,-48.910,-43.645,+35
regnety_032.ra_in1k,33.412,66.588,52.770,47.230,19.44,288,1.000,bicubic,-49.314,-43.646,-24
crossvit_18_240.in1k,33.392,66.608,52.245,47.755,43.27,240,0.875,bicubic,-49.008,-43.815,+22
vit_srelpos_medium_patch16_224.sw_in1k,33.373,66.627,52.459,47.541,38.74,224,0.900,bicubic,-48.867,-43.483,+45
gernet_l.idstcv_in1k,33.371,66.629,51.911,48.089,31.08,256,0.875,bilinear,-47.983,-43.619,+135
tf_efficientnetv2_b3.in21k_ft_in1k,33.353,66.647,54.922,45.078,14.36,300,0.900,bicubic,-49.317,-41.705,-20
regnetz_c16.ra3_in1k,33.337,66.663,54.242,45.758,13.46,320,1.000,bicubic,-49.295,-42.076,-15
crossvit_15_dagger_408.in1k,33.335,66.665,52.200,47.800,28.50,408,1.000,bicubic,-50.505,-44.578,-147
tiny_vit_5m_224.dist_in22k_ft_in1k,33.306,66.694,55.028,44.972,5.39,224,0.950,bicubic,-47.570,-40.636,+180
crossvit_18_dagger_240.in1k,33.284,66.716,52.190,47.810,44.27,240,0.875,bicubic,-49.234,-43.878,+3
swin_tiny_patch4_window7_224.ms_in22k_ft_in1k,33.257,66.743,55.295,44.705,28.29,224,0.900,bicubic,-47.711,-40.719,+167
wide_resnet101_2.tv2_in1k,33.257,66.743,51.430,48.570,126.89,224,0.965,bilinear,-49.245,-44.586,+3
tresnet_xl.miil_in1k,33.251,66.749,52.298,47.702,78.44,224,0.875,bilinear,-48.823,-43.630,+56
nest_base_jx.goog_in1k,33.215,66.785,51.825,48.175,67.72,224,0.875,bicubic,-50.319,-44.549,-116
convnext_tiny.fb_in1k,33.168,66.832,52.677,47.323,28.59,288,1.000,bicubic,-49.530,-43.955,-33
resnest50d_1s4x24d.in1k,33.145,66.855,52.840,47.160,25.68,224,0.875,bicubic,-47.843,-42.485,+159
convnext_nano.in12k_ft_in1k,33.119,66.881,53.988,46.012,15.59,288,1.000,bicubic,-49.743,-42.568,-51
vit_relpos_medium_patch16_rpn_224.sw_in1k,33.092,66.908,52.363,47.637,38.73,224,0.900,bicubic,-49.218,-43.609,+23
resnet61q.ra2_in1k,33.090,66.910,51.746,48.254,36.85,288,1.000,bicubic,-49.434,-44.384,-7
maxxvit_rmlp_nano_rw_256.sw_in1k,33.074,66.926,51.852,48.148,16.78,256,0.950,bicubic,-49.968,-44.498,-67
nest_small_jx.goog_in1k,33.052,66.948,51.064,48.936,38.35,224,0.875,bicubic,-50.072,-45.256,-79
rexnet_300.nav_in1k,33.048,66.952,52.365,47.635,34.71,224,0.875,bicubic,-49.726,-43.873,-48
crossvit_base_240.in1k,33.035,66.965,51.390,48.610,105.03,240,0.875,bicubic,-49.179,-44.444,+30
poolformerv2_m36.sail_in1k,33.033,66.967,51.860,48.140,56.08,224,1.000,bicubic,-49.183,-44.064,+28
twins_pcpvt_base.in1k,33.021,66.979,52.502,47.498,43.83,224,0.900,bicubic,-49.693,-43.844,-46
pvt_v2_b2.in1k,33.017,66.983,52.001,47.999,25.36,224,0.900,bicubic,-49.067,-43.955,+42
xcit_tiny_24_p16_224.fb_dist_in1k,32.999,67.001,52.074,47.926,12.12,224,1.000,bicubic,-47.455,-43.144,+206
resnest50d.in1k,32.966,67.034,52.707,47.293,27.48,224,0.875,bilinear,-47.994,-42.675,+151
rexnet_200.nav_in1k,32.962,67.038,52.921,47.079,16.37,224,0.875,bicubic,-48.674,-42.745,+76
crossvit_15_dagger_240.in1k,32.925,67.075,51.807,48.193,28.21,240,0.875,bicubic,-49.405,-44.149,+8
convit_small.fb_in1k,32.923,67.077,52.115,47.885,27.78,224,0.875,bicubic,-48.497,-43.629,+100
tf_efficientnetv2_s.in1k,32.913,67.087,51.728,48.272,21.46,384,1.000,bicubic,-50.985,-44.968,-178
swin_base_patch4_window12_384.ms_in1k,32.905,67.095,51.750,48.250,87.90,384,1.000,bicubic,-51.571,-45.142,-237
vit_small_patch16_224.augreg_in21k_ft_in1k,32.885,67.115,53.923,46.077,22.05,224,0.900,bicubic,-48.501,-42.213,+101
convnext_tiny_hnf.a2h_in1k,32.881,67.119,51.194,48.806,28.59,288,1.000,bicubic,-49.703,-44.814,-33
tf_efficientnet_b3.aa_in1k,32.864,67.136,52.964,47.036,12.23,300,0.904,bicubic,-48.776,-42.758,+68
pnasnet5large.tf_in1k,32.862,67.138,50.516,49.484,86.06,331,0.911,bicubic,-49.920,-45.524,-65
twins_svt_base.in1k,32.832,67.168,51.565,48.435,56.07,224,0.900,bicubic,-50.288,-44.849,-95
regnetv_064.ra3_in1k,32.830,67.170,52.864,47.136,30.58,288,1.000,bicubic,-50.886,-43.878,-158
convnextv2_nano.fcmae_ft_in1k,32.815,67.185,52.656,47.344,15.62,288,1.000,bicubic,-49.671,-43.570,-23
nasnetalarge.tf_in1k,32.781,67.219,50.141,49.859,88.75,331,0.911,bicubic,-49.845,-45.901,-48
gernet_m.idstcv_in1k,32.758,67.242,51.899,48.101,21.14,224,0.875,bilinear,-47.978,-43.291,+162
inception_resnet_v2.tf_in1k,32.734,67.266,50.640,49.360,55.84,299,0.897,bicubic,-47.724,-44.550,+187
resnet152d.gluon_in1k,32.730,67.270,51.080,48.920,60.21,224,0.875,bicubic,-47.746,-44.122,+183
repvit_m1_1.dist_450e_in1k,32.726,67.274,52.687,47.313,8.80,224,0.950,bicubic,-48.586,-42.849,+96
pit_b_224.in1k,32.722,67.278,49.854,50.146,73.76,224,0.900,bicubic,-49.716,-45.860,-24
tf_efficientnet_b2.ap_in1k,32.685,67.315,52.237,47.763,9.11,260,0.890,bicubic,-47.625,-42.789,+199
swin_base_patch4_window7_224.ms_in1k,32.644,67.356,51.573,48.427,87.77,224,0.900,bicubic,-50.962,-44.879,-157
fbnetv3_g.ra2_in1k,32.624,67.376,52.886,47.114,16.62,288,0.950,bilinear,-49.416,-43.174,+24
regnety_320.tv2_in1k,32.616,67.384,50.296,49.704,145.05,224,0.965,bicubic,-50.546,-46.118,-114
tresnet_l.miil_in1k,32.561,67.439,51.137,48.863,55.99,224,0.875,bilinear,-48.919,-44.487,+73
cait_xxs36_384.fb_dist_in1k,32.553,67.447,52.221,47.779,17.37,384,1.000,bicubic,-49.651,-43.923,+2
resnext101_32x8d.tv2_in1k,32.549,67.451,50.164,49.836,88.79,224,0.965,bilinear,-50.283,-46.068,-86
regnetz_c16_evos.ch_in1k,32.539,67.461,52.921,47.079,13.49,320,0.950,bicubic,-50.097,-43.553,-63
gmlp_s16_224.ra3_in1k,32.420,67.580,51.817,48.183,19.42,224,0.875,bicubic,-47.224,-42.805,+242
inception_resnet_v2.tf_ens_adv_in1k,32.368,67.632,50.429,49.571,55.84,299,0.897,bicubic,-47.609,-44.519,+211
deit_base_patch16_224.fb_in1k,32.363,67.637,50.991,49.009,86.57,224,0.900,bicubic,-49.629,-44.745,+20
maxvit_nano_rw_256.sw_in1k,32.355,67.645,50.622,49.378,15.45,256,0.950,bicubic,-50.573,-45.598,-96
swin_small_patch4_window7_224.ms_in1k,32.347,67.653,50.903,49.097,49.61,224,0.900,bicubic,-50.861,-45.413,-127
resnet152s.gluon_in1k,32.333,67.667,50.541,49.459,60.32,224,0.875,bicubic,-48.675,-44.875,+114
xcit_tiny_24_p8_224.fb_in1k,32.292,67.708,51.895,48.105,12.11,224,1.000,bicubic,-49.600,-44.075,+25
deit_small_distilled_patch16_224.fb_in1k,32.270,67.730,52.109,47.891,22.44,224,0.900,bicubic,-48.946,-43.514,+89
poolformerv2_s24.sail_in1k,32.266,67.734,51.492,48.508,21.34,224,1.000,bicubic,-48.482,-43.818,+140
repvit_m1_1.dist_300e_in1k,32.243,67.757,51.917,48.083,8.80,224,0.950,bicubic,-48.583,-43.253,+132
seresnext101_64x4d.gluon_in1k,32.192,67.808,50.313,49.687,88.23,224,0.875,bicubic,-48.702,-44.983,+121
regnetx_320.tv2_in1k,32.174,67.826,49.349,50.651,107.81,224,0.965,bicubic,-50.636,-46.859,-96
efficientvit_b2.r224_in1k,32.148,67.852,51.001,48.999,24.33,224,0.950,bicubic,-50.000,-44.705,-5
seresnext101_32x4d.gluon_in1k,32.121,67.879,51.243,48.757,48.96,224,0.875,bicubic,-48.771,-44.053,+119
coat_lite_small.in1k,32.113,67.887,49.928,50.072,19.84,224,0.900,bicubic,-50.199,-45.922,-29
flexivit_small.1200ep_in1k,32.095,67.905,50.323,49.677,22.06,240,0.950,bicubic,-50.431,-45.803,-59
tiny_vit_11m_224.in1k,32.072,67.928,51.278,48.722,11.00,224,0.950,bicubic,-49.420,-44.584,+50
coatnext_nano_rw_224.sw_in1k,32.070,67.930,51.017,48.983,14.70,224,0.900,bicubic,-49.872,-44.899,+11
maxxvitv2_nano_rw_256.sw_in1k,32.068,67.932,50.345,49.655,23.70,256,0.950,bicubic,-51.042,-45.979,-128
focalnet_tiny_lrf.ms_in1k,32.052,67.948,51.451,48.549,28.65,224,0.900,bicubic,-50.102,-44.497,-13
gcvit_xtiny.in1k,32.044,67.956,50.991,49.009,19.98,224,0.875,bicubic,-49.910,-44.975,+7
deit_base_patch16_384.fb_in1k,31.991,68.009,50.559,49.441,86.86,384,1.000,bicubic,-51.113,-45.809,-130
maxvit_rmlp_nano_rw_256.sw_in1k,31.976,68.025,50.618,49.382,15.50,256,0.950,bicubic,-50.978,-45.648,-117
xcit_tiny_12_p8_224.fb_dist_in1k,31.940,68.060,51.402,48.598,6.71,224,1.000,bicubic,-49.272,-44.200,+75
coatnet_bn_0_rw_224.sw_in1k,31.905,68.095,51.015,48.985,27.44,224,0.950,bicubic,-50.495,-45.171,-55
tf_efficientnet_b7.aa_in1k,31.877,68.123,51.909,48.091,66.35,600,0.949,bicubic,-52.539,-44.999,-271
levit_384.fb_dist_in1k,31.875,68.125,50.594,49.406,39.13,224,0.900,bicubic,-50.721,-45.424,-83
levit_conv_384.fb_dist_in1k,31.871,68.129,50.596,49.404,39.13,224,0.900,bicubic,-50.719,-45.420,-82
resnetrs101.tf_in1k,31.858,68.142,51.023,48.977,63.62,288,0.940,bicubic,-50.426,-44.991,-38
cs3se_edgenet_x.c2ns_in1k,31.808,68.192,50.769,49.231,50.72,320,1.000,bicubic,-51.738,-45.901,-187
vit_relpos_small_patch16_224.sw_in1k,31.781,68.219,50.620,49.380,21.98,224,0.900,bicubic,-49.681,-45.200,+41
convnext_tiny.fb_in22k_ft_in1k,31.667,68.333,51.785,48.215,28.59,288,1.000,bicubic,-47.231,-42.889,+268
poolformer_m48.sail_in1k,31.665,68.335,49.800,50.200,73.47,224,0.950,bicubic,-50.817,-46.165,-69
flexivit_small.600ep_in1k,31.647,68.353,49.384,50.616,22.06,240,0.950,bicubic,-50.715,-46.700,-57
tnt_s_patch16_224,31.634,68.366,51.153,48.847,23.76,224,0.900,bicubic,-49.902,-44.537,+25
eca_nfnet_l0.ra2_in1k,31.616,68.384,51.587,48.413,24.14,288,1.000,bicubic,-50.962,-44.905,-86
focalnet_tiny_srf.ms_in1k,31.606,68.394,50.864,49.136,28.43,224,0.900,bicubic,-50.532,-45.104,-27
resnetv2_50x1_bit.goog_distilled_in1k,31.581,68.419,51.267,48.733,25.55,224,0.875,bicubic,-51.243,-45.252,-124
coatnet_rmlp_nano_rw_224.sw_in1k,31.549,68.451,50.178,49.822,15.15,224,0.900,bicubic,-50.501,-45.700,-22
mobilevitv2_200.cvnets_in22k_ft_in1k,31.521,68.478,51.777,48.223,18.45,256,0.888,bicubic,-50.810,-44.165,-57
wide_resnet50_2.racm_in1k,31.521,68.478,50.388,49.612,68.88,288,0.950,bicubic,-50.758,-45.676,-49
xception41p.ra3_in1k,31.504,68.496,50.380,49.620,26.91,299,0.940,bicubic,-50.468,-45.404,-18
regnety_064.ra3_in1k,31.463,68.537,50.520,49.480,30.58,288,1.000,bicubic,-52.257,-46.202,-217
poolformer_m36.sail_in1k,31.447,68.553,50.017,49.983,56.17,224,0.950,bicubic,-50.655,-45.681,-31
flexivit_small.300ep_in1k,31.437,68.563,49.215,50.785,22.06,240,0.950,bicubic,-50.741,-46.823,-41
resnet152.a1_in1k,31.433,68.567,48.653,51.347,60.19,288,1.000,bicubic,-51.299,-47.067,-123
resnext101_32x4d.fb_ssl_yfcc100m_ft_in1k,31.429,68.571,52.127,47.873,44.18,224,0.875,bilinear,-49.495,-43.607,+82
repvit_m1_0.dist_450e_in1k,31.413,68.587,50.653,49.347,7.30,224,0.950,bicubic,-49.020,-44.265,+132
inception_v4.tf_in1k,31.386,68.614,49.235,50.765,42.68,299,0.875,bicubic,-48.770,-45.735,+159
rexnet_150.nav_in1k,31.376,68.624,51.284,48.716,9.73,224,0.875,bicubic,-48.948,-43.706,+140
efficientformer_l1.snap_dist_in1k,31.341,68.659,50.441,49.559,12.29,224,0.950,bicubic,-49.157,-44.547,+119
regnety_032.tv2_in1k,31.341,68.659,50.127,49.873,19.44,224,0.965,bicubic,-50.415,-45.717,-9
pit_s_224.in1k,31.339,68.661,49.677,50.323,23.46,224,0.900,bicubic,-49.747,-45.653,+62
resnet101.tv2_in1k,31.339,68.661,49.673,50.327,44.55,224,0.965,bilinear,-50.549,-46.095,-21
crossvit_15_240.in1k,31.319,68.681,50.182,49.818,27.53,240,0.875,bicubic,-50.217,-45.554,+6
swinv2_tiny_window16_256.ms_in1k,31.305,68.695,49.667,50.333,28.35,256,0.900,bicubic,-51.499,-46.569,-139
repvit_m1_0.dist_300e_in1k,31.274,68.726,50.801,49.199,7.30,224,0.950,bicubic,-48.852,-43.943,+152
cspresnet50.ra_in1k,31.272,68.728,51.225,48.775,21.62,256,0.887,bilinear,-48.310,-43.485,+188
cait_xxs36_224.fb_dist_in1k,31.272,68.728,50.614,49.386,17.30,224,1.000,bicubic,-48.474,-44.260,+174
crossvit_small_240.in1k,31.270,68.730,50.192,49.808,26.86,240,0.875,bicubic,-49.748,-45.264,+59
vit_srelpos_small_patch16_224.sw_in1k,31.260,68.740,50.233,49.767,21.97,224,0.900,bicubic,-49.832,-45.337,+52
swinv2_cr_small_224.sw_in1k,31.254,68.746,48.745,51.255,49.70,224,0.900,bicubic,-51.882,-47.363,-177
coatnet_0_rw_224.sw_in1k,31.252,68.748,48.633,51.367,27.44,224,0.950,bicubic,-51.138,-47.203,-91
swin_s3_tiny_224.ms_in1k,31.248,68.752,49.728,50.272,28.33,224,0.900,bicubic,-50.896,-46.226,-55
repvit_m2.dist_in1k,31.239,68.761,50.626,49.374,8.80,224,0.950,bicubic,-49.221,-44.542,+110
cspresnext50.ra_in1k,31.227,68.773,50.871,49.129,20.57,256,0.887,bilinear,-49.327,-44.454,+98
regnetv_040.ra3_in1k,31.225,68.775,50.115,49.885,20.64,288,1.000,bicubic,-51.965,-46.543,-188
convmixer_768_32.in1k,31.219,68.781,50.928,49.072,21.11,224,0.960,bicubic,-48.949,-44.145,+138
coat_mini.in1k,31.191,68.809,49.761,50.239,10.34,224,0.900,bicubic,-50.079,-45.621,+23
xcit_tiny_12_p8_384.fb_dist_in1k,31.182,68.818,50.508,49.492,6.71,384,1.000,bicubic,-51.206,-45.712,-97
fastvit_sa12.apple_dist_in1k,31.138,68.862,49.958,50.042,11.58,256,0.900,bicubic,-50.716,-45.752,-36
resnet101s.gluon_in1k,31.107,68.893,49.791,50.209,44.67,224,0.875,bicubic,-49.197,-45.361,+121
edgenext_small.usi_in1k,31.101,68.899,50.135,49.865,5.59,320,1.000,bicubic,-50.463,-45.577,-16
coatnet_nano_rw_224.sw_in1k,31.099,68.901,49.581,50.419,15.14,224,0.900,bicubic,-50.597,-46.066,-29
tf_efficientnet_cc_b0_8e.in1k,31.091,68.909,50.781,49.219,24.01,224,0.875,bicubic,-46.813,-42.881,+302
resmlp_36_224.fb_distilled_in1k,31.062,68.938,49.696,50.304,44.69,224,0.875,bicubic,-50.086,-45.782,+29
ecaresnet50t.ra2_in1k,31.050,68.950,50.573,49.427,25.57,320,0.950,bicubic,-51.302,-45.567,-98
repvit_m0_9.dist_300e_in1k,31.046,68.954,50.681,49.319,5.49,224,0.950,bicubic,-47.612,-43.435,+244
resnet152c.gluon_in1k,31.018,68.981,48.936,51.064,60.21,224,0.875,bicubic,-48.894,-45.910,+139
cs3sedarknet_x.c2ns_in1k,31.017,68.984,50.131,49.869,35.40,288,1.000,bicubic,-51.642,-46.219,-146
cspdarknet53.ra_in1k,31.001,68.999,50.412,49.588,27.64,256,0.887,bilinear,-49.067,-44.666,+130
resnext101_64x4d.gluon_in1k,30.993,69.007,48.549,51.451,83.46,224,0.875,bicubic,-49.607,-46.443,+81
tresnet_m.miil_in1k,30.987,69.013,48.686,51.314,31.39,224,0.875,bilinear,-49.811,-46.170,+61
twins_svt_small.in1k,30.983,69.017,49.219,50.781,24.06,224,0.900,bicubic,-50.693,-46.439,-38
regnety_160.tv2_in1k,30.942,69.058,49.060,50.940,83.59,224,0.965,bicubic,-51.704,-47.154,-150
gcresnet50t.ra2_in1k,30.936,69.064,50.034,49.966,25.90,288,1.000,bicubic,-50.520,-45.684,-13
tf_efficientnet_cc_b1_8e.in1k,30.903,69.097,50.078,49.922,39.72,240,0.882,bicubic,-48.399,-44.296,+183
resmlp_24_224.fb_distilled_in1k,30.901,69.099,50.174,49.826,30.02,224,0.875,bicubic,-49.855,-45.050,+60
regnety_080_tv.tv2_in1k,30.891,69.109,48.724,51.276,39.38,224,0.965,bicubic,-51.703,-47.524,-144
resnext101_32x4d.gluon_in1k,30.891,69.109,48.539,51.461,44.18,224,0.875,bicubic,-49.449,-46.391,+98
tf_efficientnetv2_b3.in1k,30.853,69.147,49.808,50.192,14.36,300,0.904,bicubic,-51.119,-45.994,-66
repvit_m0_9.dist_450e_in1k,30.846,69.154,50.119,49.881,5.49,224,0.950,bicubic,-48.220,-44.261,+200
tf_efficientnet_lite4.in1k,30.830,69.170,50.390,49.610,13.01,380,0.920,bilinear,-50.700,-45.274,-31
resnetaa50d.sw_in12k_ft_in1k,30.802,69.198,50.569,49.431,25.58,288,1.000,bicubic,-51.798,-45.929,-151
efficientvit_b1.r288_in1k,30.791,69.210,50.009,49.991,9.10,288,1.000,bicubic,-49.533,-45.167,+96
nf_resnet50.ra2_in1k,30.698,69.302,49.944,50.056,25.56,288,0.940,bicubic,-49.942,-45.390,+63
poolformer_s36.sail_in1k,30.690,69.310,49.445,50.555,30.86,224,0.900,bicubic,-50.740,-45.999,-22
xcit_tiny_24_p16_224.fb_in1k,30.671,69.329,50.416,49.584,12.12,224,1.000,bicubic,-48.777,-44.462,+158
resnet50.a1h_in1k,30.631,69.369,49.415,50.585,25.56,224,1.000,bicubic,-50.047,-45.891,+56
dpn107.mx_in1k,30.625,69.374,48.739,51.261,86.92,224,0.875,bicubic,-49.544,-46.203,+105
tresnet_xl.miil_in1k_448,30.622,69.379,49.077,50.923,78.44,448,0.875,bilinear,-52.437,-47.095,-204
resnet152.gluon_in1k,30.614,69.386,48.515,51.485,60.19,224,0.875,bicubic,-49.082,-46.215,+135
resnext50_32x4d.fb_ssl_yfcc100m_ft_in1k,30.602,69.398,50.667,49.333,25.03,224,0.875,bilinear,-49.732,-44.733,+86
haloregnetz_b.ra3_in1k,30.594,69.406,48.987,51.013,11.68,224,0.940,bicubic,-50.452,-46.213,+13
pit_xs_distilled_224.in1k,30.543,69.457,50.180,49.820,11.00,224,0.900,bicubic,-48.637,-44.186,+177
regnetx_080.tv2_in1k,30.521,69.479,48.030,51.970,39.57,224,0.965,bicubic,-51.019,-47.512,-47
resnet101d.gluon_in1k,30.506,69.494,47.973,52.027,44.57,224,0.875,bicubic,-49.920,-47.051,+74
resnest26d.gluon_in1k,30.490,69.510,50.663,49.337,17.07,224,0.875,bilinear,-47.992,-43.631,+223
mobilevitv2_200.cvnets_in22k_ft_in1k_384,30.490,69.510,50.575,49.425,18.45,384,1.000,bicubic,-52.910,-46.007,-249
resnet50.ram_in1k,30.451,69.549,48.997,51.003,25.56,288,0.950,bicubic,-49.525,-46.055,+104
efficientformerv2_s1.snap_dist_in1k,30.445,69.555,49.582,50.418,6.19,224,0.950,bicubic,-49.247,-45.134,+127
efficientnet_b2.ra_in1k,30.427,69.573,49.688,50.312,9.11,288,1.000,bicubic,-50.183,-45.625,+49
tf_efficientnet_b1.ap_in1k,30.423,69.577,49.555,50.445,7.79,240,0.882,bicubic,-48.853,-44.757,+158
cs3darknet_x.c2ns_in1k,30.409,69.591,49.197,50.803,35.05,288,1.000,bicubic,-51.813,-47.033,-117
xcit_tiny_12_p16_384.fb_dist_in1k,30.403,69.597,50.131,49.869,6.72,384,1.000,bicubic,-50.535,-45.283,+12
twins_pcpvt_small.in1k,30.396,69.604,49.384,50.616,24.11,224,0.900,bicubic,-50.696,-46.264,-4
ecaresnetlight.miil_in1k,30.382,69.618,49.146,50.854,30.16,288,0.950,bicubic,-51.026,-46.670,-39
resnet50d.ra2_in1k,30.376,69.624,48.794,51.206,25.58,288,0.950,bicubic,-50.980,-46.944,-33
ecaresnet101d_pruned.miil_in1k,30.356,69.644,48.810,51.190,24.88,288,0.950,bicubic,-51.642,-47.350,-97
visformer_small.in1k,30.339,69.661,48.285,51.715,40.22,224,0.900,bicubic,-51.767,-47.593,-108
resnet101.a1_in1k,30.297,69.703,46.584,53.416,44.55,288,1.000,bicubic,-52.025,-49.048,-136
tf_efficientnet_b5.in1k,30.295,69.705,49.753,50.247,30.39,456,0.934,bicubic,-52.881,-46.783,-241
regnety_040.ra3_in1k,30.266,69.734,48.930,51.070,20.65,288,1.000,bicubic,-52.778,-47.572,-225
regnetx_160.tv2_in1k,30.217,69.783,47.055,52.945,54.28,224,0.965,bicubic,-52.349,-49.117,-169
mobilevitv2_175.cvnets_in22k_ft_in1k,30.209,69.791,49.056,50.944,14.25,256,0.888,bicubic,-51.729,-46.734,-95
vit_relpos_base_patch32_plus_rpn_256.sw_in1k,30.201,69.799,48.688,51.312,119.42,256,0.900,bicubic,-49.283,-45.450,+127
resnet50.b1k_in1k,30.197,69.803,49.187,50.813,25.56,288,1.000,bicubic,-50.509,-46.245,+26
fastvit_s12.apple_dist_in1k,30.187,69.813,48.962,51.038,9.47,256,0.900,bicubic,-50.883,-46.323,-12
seresnext50_32x4d.racm_in1k,30.168,69.832,49.093,50.907,27.56,288,0.950,bicubic,-52.028,-47.055,-127
resnet50.b2k_in1k,30.152,69.848,48.244,51.756,25.56,288,1.000,bicubic,-50.302,-47.074,+48
regnety_016.tv2_in1k,30.138,69.862,49.231,50.769,11.20,224,0.965,bicubic,-50.528,-46.099,+26
wide_resnet50_2.tv2_in1k,30.116,69.883,48.362,51.638,68.88,224,0.965,bilinear,-51.489,-47.398,-78
dpn68b.ra_in1k,30.109,69.891,48.164,51.836,12.61,288,1.000,bicubic,-49.251,-46.272,+130
convmixer_1024_20_ks9_p14.in1k,30.093,69.907,49.934,50.066,24.38,224,0.960,bicubic,-46.843,-43.416,+294
efficientnet_el.ra_in1k,30.028,69.972,48.849,51.151,10.59,300,0.904,bicubic,-51.284,-46.640,-47
tf_efficientnet_b2.aa_in1k,30.012,69.988,49.590,50.410,9.11,260,0.890,bicubic,-50.072,-45.316,+74
xcit_tiny_12_p16_224.fb_dist_in1k,30.010,69.990,49.651,50.349,6.72,224,1.000,bicubic,-48.564,-44.547,+186
legacy_senet154.in1k,29.993,70.007,48.038,51.962,115.09,224,0.875,bilinear,-51.319,-47.522,-49
halo2botnet50ts_256.a1h_in1k,29.987,70.013,48.380,51.620,22.64,256,0.950,bicubic,-52.073,-47.254,-123
dpn98.mx_in1k,29.981,70.019,48.146,51.854,61.57,224,0.875,bicubic,-49.689,-46.508,+102
mobilevitv2_150.cvnets_in22k_ft_in1k,29.971,70.029,49.217,50.783,10.59,256,0.888,bicubic,-51.517,-46.451,-73
resnetv2_50d_gn.ah_in1k,29.957,70.043,48.195,51.805,25.57,288,1.000,bicubic,-52.001,-47.733,-115
dpn92.mx_in1k,29.946,70.054,49.113,50.887,37.67,224,0.875,bicubic,-50.092,-45.747,+69
resnext50_32x4d.a1h_in1k,29.942,70.058,48.234,51.766,25.03,288,1.000,bicubic,-52.072,-47.700,-124
convnextv2_pico.fcmae_ft_in1k,29.940,70.060,48.853,51.147,9.07,288,0.950,bicubic,-51.146,-46.627,-31
dpn131.mx_in1k,29.934,70.066,48.034,51.966,79.25,224,0.875,bicubic,-49.880,-46.666,+82
resnetv2_101x1_bit.goog_in21k_ft_in1k,29.898,70.102,51.109,48.891,44.54,448,1.000,bilinear,-52.444,-45.411,-166
senet154.gluon_in1k,29.887,70.113,47.879,52.121,115.09,224,0.875,bicubic,-51.339,-47.479,-54
tf_efficientnet_b4.in1k,29.861,70.139,49.014,50.986,19.34,380,0.922,bicubic,-52.747,-46.737,-208
legacy_xception.tf_in1k,29.855,70.145,48.684,51.316,22.86,299,0.897,bicubic,-49.185,-45.698,+143
resnet50.tv2_in1k,29.837,70.162,48.012,51.988,25.56,224,0.965,bilinear,-51.011,-47.422,-11
cs3sedarknet_l.c2ns_in1k,29.832,70.168,49.009,50.991,21.91,288,0.950,bicubic,-51.952,-46.955,-110
resnet152.a2_in1k,29.826,70.174,45.961,54.039,60.19,288,1.000,bicubic,-52.782,-50.167,-211
efficientvit_b1.r224_in1k,29.822,70.178,48.289,51.711,9.10,224,0.950,bicubic,-49.430,-46.015,+122
inception_v3.tf_adv_in1k,29.818,70.182,47.843,52.157,23.83,299,0.875,bicubic,-47.774,-45.887,+236
vit_base_patch16_384.augreg_in1k,29.792,70.208,48.327,51.673,86.86,384,1.000,bicubic,-51.310,-46.793,-46
resnetaa50.a1h_in1k,29.790,70.210,48.014,51.986,25.56,288,1.000,bicubic,-51.824,-47.788,-105
lamhalobotnet50ts_256.a1h_in1k,29.755,70.245,48.335,51.665,22.57,256,0.950,bicubic,-51.797,-47.157,-100
fbnetv3_d.ra2_in1k,29.739,70.261,49.472,50.528,10.31,256,0.950,bilinear,-49.943,-45.472,+81
ese_vovnet39b.ra_in1k,29.733,70.267,49.046,50.954,24.57,288,0.950,bicubic,-50.617,-46.320,+26
fastvit_sa12.apple_in1k,29.731,70.269,48.610,51.390,11.58,256,0.900,bicubic,-51.113,-46.730,-20
resmlp_36_224.fb_in1k,29.698,70.302,48.958,51.042,44.69,224,0.875,bicubic,-50.074,-45.926,+69
fastvit_t12.apple_dist_in1k,29.686,70.314,48.517,51.483,7.55,256,0.900,bicubic,-50.666,-46.525,+22
convnext_nano.d1h_in1k,29.684,70.316,47.920,52.080,15.59,288,1.000,bicubic,-51.798,-47.738,-95
resnet50.c1_in1k,29.672,70.328,48.458,51.542,25.56,288,1.000,bicubic,-51.240,-47.094,-35
vit_base_patch32_384.augreg_in1k,29.645,70.355,48.987,51.013,88.30,384,1.000,bicubic,-49.111,-45.239,+146
efficientvit_b1.r256_in1k,29.621,70.379,48.209,51.791,9.10,256,1.000,bicubic,-50.113,-46.571,+66
ecaresnet50d.miil_in1k,29.562,70.438,48.967,51.033,25.58,288,0.950,bicubic,-52.088,-46.915,-119
nest_tiny_jx.goog_in1k,29.558,70.442,46.983,53.017,17.06,224,0.875,bicubic,-51.868,-48.635,-93
resnet152.a3_in1k,29.525,70.475,47.038,52.962,60.19,224,0.950,bicubic,-51.021,-47.962,-5
gcresnext50ts.ch_in1k,29.517,70.483,47.851,52.149,15.67,288,1.000,bicubic,-51.713,-47.691,-78
resnext50_32x4d.tv2_in1k,29.480,70.520,47.232,52.768,25.03,224,0.965,bilinear,-51.702,-48.108,-72
cs3darknet_l.c2ns_in1k,29.466,70.534,48.234,51.766,21.16,288,0.950,bicubic,-51.430,-47.428,-42
efficientnet_em.ra2_in1k,29.464,70.536,48.922,51.078,6.90,240,0.882,bicubic,-49.780,-45.872,+103
resnext101_32x8d.tv_in1k,29.437,70.563,48.488,51.512,88.79,224,0.875,bilinear,-49.873,-46.032,+92
resnet50.fb_ssl_yfcc100m_ft_in1k,29.435,70.565,49.791,50.209,25.56,224,0.875,bilinear,-49.795,-45.035,+101
coat_lite_mini.in1k,29.435,70.565,47.718,52.282,11.01,224,0.900,bicubic,-49.667,-46.890,+111
resnetv2_50.a1h_in1k,29.435,70.565,47.441,52.559,25.55,288,1.000,bicubic,-51.963,-48.285,-99
deit_small_patch16_224.fb_in1k,29.425,70.575,48.250,51.750,22.05,224,0.900,bicubic,-50.423,-46.794,+45
sebotnet33ts_256.a1h_in1k,29.423,70.577,47.160,52.840,13.70,256,0.940,bicubic,-51.745,-48.009,-78
nf_regnet_b1.ra2_in1k,29.411,70.589,49.427,50.573,10.22,288,0.900,bicubic,-49.897,-45.313,+87
repvit_m1.dist_in1k,29.407,70.593,48.541,51.459,5.49,224,0.950,bicubic,-49.131,-45.529,+144
mobileone_s4.apple_in1k,29.405,70.595,47.967,52.033,14.95,224,0.900,bilinear,-50.021,-46.513,+76
cait_xxs24_384.fb_dist_in1k,29.391,70.609,48.751,51.249,12.03,384,1.000,bicubic,-51.581,-46.889,-62
resnetv2_50d_evos.ah_in1k,29.382,70.618,47.226,52.774,25.59,288,1.000,bicubic,-52.620,-48.674,-164
edgenext_small_rw.sw_in1k,29.346,70.654,48.722,51.278,7.83,320,1.000,bicubic,-51.112,-46.586,-9
convnext_nano_ols.d1h_in1k,29.325,70.675,47.478,52.522,15.65,288,1.000,bicubic,-52.275,-48.158,-132
regnetz_b16.ra3_in1k,29.321,70.679,47.885,52.115,9.72,288,1.000,bicubic,-51.407,-47.633,-37
swin_tiny_patch4_window7_224.ms_in1k,29.321,70.679,47.609,52.391,28.29,224,0.900,bicubic,-52.055,-47.934,-107
cait_xxs24_224.fb_dist_in1k,29.295,70.705,48.537,51.463,11.96,224,1.000,bicubic,-49.089,-45.779,+153
eca_resnet33ts.ra2_in1k,29.277,70.723,48.928,51.072,19.68,288,1.000,bicubic,-51.395,-46.436,-35
resnet50.d_in1k,29.250,70.750,47.213,52.787,25.56,288,1.000,bicubic,-51.722,-48.217,-69
resnet50.c2_in1k,29.244,70.756,47.165,52.835,25.56,288,1.000,bicubic,-51.626,-48.369,-56
pvt_v2_b1.in1k,29.242,70.758,48.962,51.038,14.01,224,0.900,bicubic,-49.462,-45.540,+124
maxvit_rmlp_pico_rw_256.sw_in1k,29.238,70.762,47.719,52.281,7.52,256,0.950,bicubic,-51.276,-47.495,-28
gcvit_xxtiny.in1k,29.218,70.781,48.348,51.652,12.00,224,0.875,bicubic,-50.508,-46.732,+38
poolformer_s24.sail_in1k,29.205,70.795,48.079,51.921,21.39,224,0.900,bicubic,-51.089,-46.995,-1
tresnet_l.miil_in1k_448,29.162,70.838,47.224,52.776,55.99,448,0.875,bilinear,-53.114,-48.754,-205
seresnet50.ra2_in1k,29.146,70.854,47.749,52.251,28.09,288,0.950,bicubic,-52.138,-47.903,-108
inception_v3.gluon_in1k,29.112,70.888,46.943,53.057,23.83,299,0.875,bicubic,-49.690,-47.433,+108
lambda_resnet50ts.a1h_in1k,29.108,70.891,46.961,53.039,21.54,256,0.950,bicubic,-52.050,-48.137,-97
resnet101.a2_in1k,29.091,70.909,45.762,54.238,44.55,288,1.000,bicubic,-53.145,-49.968,-206
xception71.tf_in1k,29.030,70.970,47.391,52.609,42.34,299,0.903,bicubic,-50.844,-47.537,+17
resnet34d.ra2_in1k,29.020,70.980,48.052,51.948,21.82,288,0.950,bicubic,-49.416,-46.292,+133
hrnet_w64.ms_in1k,28.987,71.013,47.138,52.862,128.06,224,0.875,bilinear,-50.489,-47.514,+49
xcit_tiny_12_p8_224.fb_in1k,28.979,71.021,47.501,52.499,6.71,224,1.000,bicubic,-50.709,-47.553,+33
regnetx_032.tv2_in1k,28.975,71.025,47.069,52.931,15.30,224,0.965,bicubic,-51.951,-48.209,-79
cs3darknet_focus_l.c2ns_in1k,28.939,71.061,47.639,52.361,21.15,288,0.950,bicubic,-51.937,-48.043,-74
tf_efficientnet_b0.ns_jft_in1k,28.902,71.098,49.007,50.993,5.29,224,0.875,bicubic,-49.766,-45.365,+110
xception65.tf_in1k,28.894,71.106,47.154,52.846,39.92,299,0.903,bicubic,-50.662,-47.504,+38
tf_efficientnet_b1.aa_in1k,28.886,71.114,47.523,52.477,7.79,240,0.882,bicubic,-49.942,-46.677,+94
vit_small_patch32_384.augreg_in21k_ft_in1k,28.875,71.125,48.889,51.111,22.92,384,1.000,bicubic,-51.611,-46.711,-41
mobilevitv2_150.cvnets_in22k_ft_in1k_384,28.871,71.129,47.932,52.068,10.59,384,1.000,bicubic,-53.715,-48.382,-266
resnet101.gluon_in1k,28.853,71.147,46.381,53.619,44.55,224,0.875,bicubic,-50.457,-48.141,+52
skresnext50_32x4d.ra_in1k,28.810,71.190,46.507,53.493,27.48,224,0.875,bicubic,-51.354,-48.133,-8
sehalonet33ts.ra2_in1k,28.780,71.220,46.574,53.426,13.69,256,0.940,bicubic,-52.178,-48.698,-90
levit_256.fb_dist_in1k,28.743,71.257,46.729,53.271,18.89,224,0.900,bicubic,-52.781,-48.765,-154
levit_conv_256.fb_dist_in1k,28.739,71.261,46.723,53.277,18.89,224,0.900,bicubic,-52.783,-48.767,-154
resnet50.ra_in1k,28.694,71.306,47.366,52.634,25.56,288,0.950,bicubic,-51.142,-47.600,+8
mobileone_s3.apple_in1k,28.676,71.324,47.582,52.418,10.17,224,0.900,bilinear,-49.316,-46.332,+151
resnetblur50.bt_in1k,28.662,71.338,46.908,53.092,25.56,288,0.950,bicubic,-51.572,-48.326,-20
tf_efficientnet_lite3.in1k,28.651,71.349,47.360,52.640,8.20,300,0.904,bilinear,-51.155,-47.554,+8
darknetaa53.c2ns_in1k,28.647,71.353,46.949,53.051,36.02,288,1.000,bilinear,-51.859,-48.373,-55
hrnet_w40.ms_in1k,28.645,71.355,47.452,52.548,57.56,224,0.875,bilinear,-50.287,-47.012,+74
skresnet34.ra_in1k,28.631,71.369,47.961,52.039,22.28,224,0.875,bicubic,-48.279,-45.183,+205
swinv2_tiny_window8_256.ms_in1k,28.627,71.373,46.189,53.811,28.35,256,0.900,bicubic,-53.193,-49.805,-189
seresnext50_32x4d.gluon_in1k,28.619,71.381,46.438,53.562,27.56,224,0.875,bicubic,-51.305,-48.386,-11
mobilevitv2_175.cvnets_in22k_ft_in1k_384,28.598,71.403,47.118,52.882,14.25,384,1.000,bicubic,-54.341,-49.308,-321
tf_efficientnet_b3.in1k,28.582,71.418,47.981,52.019,12.23,300,0.904,bicubic,-52.292,-47.319,-93
halonet50ts.a1h_in1k,28.580,71.420,46.183,53.817,22.73,256,0.940,bicubic,-53.082,-49.427,-183
tf_efficientnetv2_b0.in1k,28.570,71.430,47.077,52.923,7.14,224,0.875,bicubic,-49.788,-46.937,+114
poolformerv2_s12.sail_in1k,28.556,71.444,47.399,52.601,11.89,224,1.000,bicubic,-49.446,-46.465,+137
resnet152.tv_in1k,28.543,71.457,47.106,52.894,60.19,224,0.875,bilinear,-49.779,-46.940,+114
xcit_tiny_12_p16_224.fb_in1k,28.515,71.485,47.403,52.597,6.72,224,1.000,bicubic,-48.625,-46.313,+184
eva02_tiny_patch14_336.mim_in22k_ft_in1k,28.507,71.493,47.539,52.461,5.76,336,1.000,bicubic,-52.123,-47.987,-75
ecaresnet50t.a1_in1k,28.456,71.544,45.576,54.424,25.57,288,1.000,bicubic,-53.672,-50.066,-225
repvgg_b2.rvgg_in1k,28.421,71.579,47.044,52.956,89.02,224,0.875,bilinear,-50.371,-47.376,+73
hrnet_w48.ms_in1k,28.409,71.591,47.576,52.424,77.47,224,0.875,bilinear,-50.897,-46.940,+31
swinv2_cr_tiny_ns_224.sw_in1k,28.379,71.621,45.902,54.098,28.33,224,0.900,bicubic,-53.423,-49.916,-199
resnext50_32x4d.gluon_in1k,28.372,71.628,45.332,54.668,25.03,224,0.875,bicubic,-50.988,-49.098,+24
efficientnet_b2_pruned.in1k,28.360,71.640,47.057,52.943,8.31,260,0.890,bicubic,-51.560,-47.795,-24
tf_efficientnet_b0.ap_in1k,28.354,71.646,47.535,52.465,5.29,224,0.875,bicubic,-48.736,-45.727,+178
darknet53.c2ns_in1k,28.328,71.672,46.880,53.120,41.61,288,1.000,bicubic,-52.204,-48.552,-77
dla169.in1k,28.320,71.680,47.388,52.612,53.39,224,0.875,bilinear,-50.388,-46.956,+73
tf_efficientnet_cc_b0_4e.in1k,28.313,71.687,47.360,52.640,13.31,224,0.875,bicubic,-48.989,-45.976,+162
dla102x2.in1k,28.313,71.687,46.774,53.226,41.28,224,0.875,bilinear,-51.133,-47.858,+12
resnext50_32x4d.ra_in1k,28.303,71.697,46.081,53.919,25.03,288,0.950,bicubic,-52.395,-49.311,-93
mixnet_xl.ra_in1k,28.291,71.709,46.708,53.292,11.90,224,0.875,bicubic,-52.191,-48.229,-76
seresnet33ts.ra2_in1k,28.236,71.764,47.578,52.422,19.78,288,1.000,bicubic,-52.548,-47.784,-103
resnet50d.gluon_in1k,28.234,71.766,45.880,54.120,25.58,224,0.875,bicubic,-50.844,-48.586,+39
resnet50.a1_in1k,28.220,71.780,44.937,55.063,25.56,288,1.000,bicubic,-52.994,-50.165,-153
fastvit_s12.apple_in1k,28.126,71.874,46.649,53.351,9.47,256,0.900,bicubic,-51.816,-48.145,-37
densenet161.tv_in1k,28.108,71.892,46.653,53.347,28.68,224,0.875,bicubic,-49.250,-46.989,+151
resnet101c.gluon_in1k,28.104,71.896,45.953,54.047,44.57,224,0.875,bicubic,-51.434,-48.631,-4
resnet34.a1_in1k,28.100,71.900,45.707,54.293,21.80,288,1.000,bicubic,-49.818,-48.057,+121
wide_resnet101_2.tv_in1k,28.095,71.906,46.426,53.574,126.89,224,0.875,bilinear,-50.748,-47.855,+48
regnetx_320.pycls_in1k,28.079,71.921,45.118,54.882,107.81,224,0.875,bicubic,-52.167,-49.904,-58
regnety_320.pycls_in1k,28.075,71.925,45.460,54.540,145.05,224,0.875,bicubic,-52.735,-49.778,-115
resnext50_32x4d.a1_in1k,28.075,71.925,44.815,55.185,25.03,288,1.000,bicubic,-53.391,-50.359,-188
gernet_s.idstcv_in1k,28.051,71.949,46.727,53.273,8.17,224,0.875,bilinear,-48.859,-46.589,+171
ecaresnet50d_pruned.miil_in1k,28.043,71.957,47.038,52.962,19.94,288,0.950,bicubic,-52.747,-48.532,-116
levit_conv_192.fb_dist_in1k,28.032,71.968,45.874,54.126,10.95,224,0.900,bicubic,-51.806,-48.904,-37
mobilevitv2_175.cvnets_in1k,28.028,71.972,46.098,53.902,14.25,256,0.888,bicubic,-52.832,-49.158,-126
levit_192.fb_dist_in1k,28.028,71.972,45.870,54.130,10.95,224,0.900,bicubic,-51.810,-48.914,-37
efficientnet_el_pruned.in1k,28.004,71.996,46.804,53.196,10.59,300,0.904,bicubic,-52.294,-48.418,-71
vit_base_patch16_224.augreg_in1k,27.971,72.029,45.737,54.263,86.57,224,0.900,bicubic,-51.183,-48.353,+18
fastvit_t12.apple_in1k,27.935,72.065,46.393,53.607,7.55,256,0.900,bicubic,-51.329,-48.169,+7
resnet101.a3_in1k,27.925,72.075,45.014,54.986,44.55,224,0.950,bicubic,-51.888,-49.600,-39
resnet50_gn.a1h_in1k,27.916,72.084,46.075,53.925,25.56,288,0.950,bicubic,-53.300,-49.309,-173
xception41.tf_in1k,27.880,72.120,45.888,54.112,26.97,299,0.903,bicubic,-50.624,-48.388,+58
regnetx_160.pycls_in1k,27.827,72.173,45.631,54.369,54.28,224,0.875,bicubic,-52.039,-49.197,-49
dpn68b.mx_in1k,27.814,72.186,47.415,52.585,12.61,224,0.875,bicubic,-49.704,-46.437,+123
resnet50d.a1_in1k,27.790,72.210,44.377,55.623,25.58,288,1.000,bicubic,-53.660,-50.841,-199
inception_v3.tf_in1k,27.780,72.220,45.727,54.273,23.83,299,0.875,bicubic,-50.076,-48.139,+106
res2net101_26w_4s.in1k,27.778,72.222,45.159,54.841,45.21,224,0.875,bilinear,-51.422,-49.277,+5
tf_efficientnetv2_b1.in1k,27.745,72.255,46.580,53.420,8.14,240,0.882,bicubic,-51.715,-48.142,-21
repghostnet_200.in1k,27.719,72.281,46.322,53.678,9.80,224,0.875,bicubic,-51.087,-48.008,+30
vit_base_patch16_224.sam_in1k,27.703,72.297,45.100,54.900,86.57,224,0.900,bicubic,-52.535,-49.656,-78
fbnetv3_b.ra2_in1k,27.670,72.330,46.987,53.013,8.60,256,0.950,bilinear,-51.476,-47.757,+6
regnety_160.pycls_in1k,27.641,72.359,45.531,54.469,83.59,224,0.875,bicubic,-52.657,-49.433,-85
mobilevitv2_200.cvnets_in1k,27.637,72.363,45.784,54.216,18.45,256,0.888,bicubic,-53.497,-49.578,-175
repvgg_b1.rvgg_in1k,27.631,72.369,46.533,53.467,57.42,224,0.875,bilinear,-50.737,-47.563,+62
hrnet_w44.ms_in1k,27.627,72.373,45.837,54.163,67.06,224,0.875,bilinear,-51.267,-48.527,+18
resnet50.am_in1k,27.574,72.426,45.369,54.631,25.56,224,0.875,bicubic,-51.428,-49.029,+10
inception_v3.tv_in1k,27.556,72.444,45.265,54.735,23.83,299,0.875,bicubic,-49.878,-48.209,+116
resmlp_24_224.fb_in1k,27.517,72.483,45.690,54.310,30.02,224,0.875,bicubic,-51.857,-48.856,-24
pit_xs_224.in1k,27.485,72.515,45.910,54.090,10.62,224,0.900,bicubic,-50.691,-48.252,+69
tiny_vit_5m_224.in1k,27.483,72.517,45.855,54.145,5.39,224,0.950,bicubic,-51.687,-48.939,-5
gcresnet33ts.ra2_in1k,27.401,72.599,46.155,53.845,19.88,288,1.000,bicubic,-53.199,-49.167,-127
regnetx_080.pycls_in1k,27.397,72.603,45.012,54.988,39.57,224,0.875,bicubic,-51.801,-49.542,-9
hrnet_w30.ms_in1k,27.389,72.611,46.554,53.446,37.71,224,0.875,bilinear,-50.807,-47.668,+63
hrnet_w32.ms_in1k,27.381,72.619,46.006,53.994,41.23,224,0.875,bilinear,-51.061,-48.184,+43
convnext_pico.d1_in1k,27.358,72.642,45.660,54.340,9.05,288,0.950,bicubic,-53.058,-49.388,-111
vit_small_patch16_384.augreg_in1k,27.328,72.672,46.118,53.882,22.20,384,1.000,bicubic,-53.788,-49.456,-186
resnet50s.gluon_in1k,27.324,72.676,45.214,54.786,25.68,224,0.875,bicubic,-51.390,-49.028,+21
res2net50_26w_8s.in1k,27.306,72.694,44.815,55.185,48.40,224,0.875,bilinear,-51.635,-49.479,+1
convnext_pico_ols.d1_in1k,27.297,72.703,45.644,54.356,9.06,288,1.000,bicubic,-53.165,-49.608,-123
densenet201.tv_in1k,27.271,72.729,46.210,53.790,20.01,224,0.875,bicubic,-50.015,-47.270,+111
resnet33ts.ra2_in1k,27.257,72.743,45.183,54.817,19.68,288,1.000,bicubic,-52.469,-49.645,-64
ecaresnet50t.a2_in1k,27.246,72.754,44.047,55.953,25.57,288,1.000,bicubic,-54.412,-51.503,-252
regnety_064.pycls_in1k,27.238,72.762,44.866,55.134,30.58,224,0.875,bicubic,-52.478,-49.900,-65
ghostnetv2_160.in1k,27.232,72.768,46.366,53.634,12.39,224,0.875,bicubic,-50.600,-47.574,+79
efficientnet_b1_pruned.in1k,27.196,72.804,45.861,54.139,6.33,240,0.882,bicubic,-51.044,-47.973,+49
tf_efficientnetv2_b2.in1k,27.163,72.837,44.568,55.432,10.10,260,0.890,bicubic,-53.033,-50.474,-100
vit_base_patch32_224.augreg_in1k,27.140,72.861,45.175,54.825,88.22,224,0.900,bicubic,-47.755,-46.603,+178
seresnet50.a1_in1k,27.124,72.876,43.563,56.437,28.09,288,1.000,bicubic,-53.978,-51.765,-195
resnet50d.a2_in1k,27.120,72.880,43.811,56.189,25.58,288,1.000,bicubic,-54.044,-51.269,-204
resnetrs50.tf_in1k,27.098,72.902,45.027,54.973,35.69,224,0.910,bicubic,-52.796,-49.947,-89
rexnet_130.nav_in1k,27.081,72.919,45.957,54.043,7.56,224,0.875,bicubic,-52.425,-48.721,-58
gmixer_24_224.ra3_in1k,27.031,72.969,44.353,55.647,24.72,224,0.875,bicubic,-50.995,-49.315,+56
dla102x.in1k,27.022,72.978,45.505,54.495,26.31,224,0.875,bilinear,-51.490,-48.731,+16
resnet101.tv_in1k,26.963,73.037,45.234,54.766,44.55,224,0.875,bilinear,-50.417,-48.312,+91
regnetx_120.pycls_in1k,26.866,73.134,44.688,55.312,46.11,224,0.875,bicubic,-52.722,-50.055,-67
resnet32ts.ra2_in1k,26.849,73.151,45.041,54.959,17.96,288,1.000,bicubic,-52.539,-49.611,-54
rexnet_100.nav_in1k,26.841,73.159,45.379,54.621,4.80,224,0.875,bicubic,-51.015,-48.261,+64
legacy_seresnext101_32x4d.in1k,26.821,73.179,43.505,56.495,48.96,224,0.875,bilinear,-53.411,-51.515,-114
densenet169.tv_in1k,26.819,73.181,45.381,54.619,14.15,224,0.875,bicubic,-49.081,-47.647,+142
tinynet_a.in1k,26.815,73.185,45.082,54.918,6.19,192,0.875,bicubic,-50.833,-48.458,+69
regnetx_064.pycls_in1k,26.803,73.197,44.913,55.087,26.21,224,0.875,bicubic,-52.263,-49.547,-28
regnety_120.pycls_in1k,26.780,73.220,44.442,55.558,51.82,224,0.875,bicubic,-53.600,-50.684,-137
regnetx_032.pycls_in1k,26.717,73.283,45.230,54.770,15.30,224,0.875,bicubic,-51.451,-48.852,+36
res2net101d.in1k,26.711,73.289,44.336,55.664,45.23,224,0.875,bilinear,-54.507,-51.014,-227
resnext50_32x4d.a2_in1k,26.682,73.318,42.768,57.232,25.03,288,1.000,bicubic,-54.622,-52.328,-233
efficientvit_m5.r224_in1k,26.674,73.326,44.923,55.077,12.47,224,0.875,bicubic,-50.384,-48.261,+98
legacy_seresnet152.in1k,26.666,73.334,43.949,56.051,66.82,224,0.875,bilinear,-51.994,-50.421,-4
efficientnet_es.ra_in1k,26.619,73.381,45.096,54.904,5.44,224,0.875,bicubic,-51.439,-48.830,+38
res2net50_26w_6s.in1k,26.591,73.409,44.004,55.996,37.05,224,0.875,bilinear,-51.977,-50.118,-3
repvgg_b1g4.rvgg_in1k,26.579,73.421,45.100,54.900,39.97,224,0.875,bilinear,-51.009,-48.736,+64
dla60x.in1k,26.566,73.434,45.031,54.969,17.35,224,0.875,bilinear,-51.670,-48.995,+24
coat_lite_tiny.in1k,26.517,73.484,44.646,55.354,5.72,224,0.900,bicubic,-51.003,-49.276,+64
regnety_080.pycls_in1k,26.515,73.485,44.351,55.649,39.18,224,0.875,bicubic,-53.353,-50.481,-110
mobilenetv3_large_100.miil_in21k_ft_in1k,26.493,73.507,44.491,55.509,5.48,224,0.875,bilinear,-51.427,-48.423,+43
tf_efficientnet_b0.aa_in1k,26.483,73.517,45.642,54.358,5.29,224,0.875,bicubic,-50.361,-47.576,+99
res2net50_14w_8s.in1k,26.477,73.523,44.371,55.629,25.06,224,0.875,bilinear,-51.681,-49.475,+25
tf_efficientnet_b2.in1k,26.462,73.538,44.788,55.212,9.11,260,0.890,bicubic,-53.147,-49.926,-90
mobileone_s2.apple_in1k,26.456,73.544,44.566,55.434,7.88,224,0.900,bilinear,-51.060,-49.102,+60
resnet50.gluon_in1k,26.428,73.572,44.039,55.961,25.56,224,0.875,bicubic,-51.154,-49.681,+56
ecaresnet50t.a3_in1k,26.420,73.580,43.507,56.493,25.57,224,0.950,bicubic,-53.132,-51.188,-89
tf_efficientnet_el.in1k,26.353,73.647,44.173,55.827,10.59,300,0.904,bicubic,-53.895,-50.947,-141
levit_conv_128.fb_dist_in1k,26.330,73.670,44.120,55.880,9.21,224,0.900,bicubic,-52.164,-49.888,-11
levit_128.fb_dist_in1k,26.328,73.672,44.116,55.884,9.21,224,0.900,bicubic,-52.162,-49.896,-10
lambda_resnet26t.c1_in1k,26.326,73.674,44.430,55.570,10.96,256,0.940,bicubic,-52.762,-50.160,-54
resmlp_big_24_224.fb_in1k,26.318,73.682,43.554,56.446,129.14,224,0.875,bicubic,-54.718,-51.464,-225
resmlp_12_224.fb_distilled_in1k,26.316,73.684,44.874,55.126,15.35,224,0.875,bicubic,-51.638,-48.686,+29
visformer_tiny.in1k,26.257,73.743,44.182,55.818,10.32,224,0.900,bicubic,-51.903,-49.984,+13
regnetx_040.pycls_in1k,26.243,73.757,44.424,55.576,22.12,224,0.875,bicubic,-52.249,-49.818,-16
mobilevitv2_150.cvnets_in1k,26.178,73.822,43.762,56.238,10.59,256,0.888,bicubic,-54.192,-51.312,-163
crossvit_9_dagger_240.in1k,26.177,73.823,44.542,55.458,8.78,240,0.875,bicubic,-50.801,-49.076,+78
vit_small_patch32_224.augreg_in21k_ft_in1k,26.165,73.835,45.106,54.894,22.88,224,0.900,bicubic,-49.829,-47.694,+105
seresnet50.a2_in1k,26.165,73.835,42.675,57.325,28.09,288,1.000,bicubic,-54.941,-52.547,-240
densenetblur121d.ra_in1k,26.135,73.865,45.037,54.963,8.00,288,0.950,bicubic,-51.187,-48.751,+54
resnet50.a2_in1k,26.094,73.906,42.583,57.417,25.56,288,1.000,bicubic,-54.678,-52.405,-205
resnet50d.a3_in1k,26.090,73.910,42.970,57.030,25.58,224,0.950,bicubic,-52.630,-51.262,-39
resnext50_32x4d.a3_in1k,26.082,73.918,42.946,57.054,25.03,224,0.950,bicubic,-53.186,-51.360,-81
resnet34.a2_in1k,26.080,73.920,43.109,56.891,21.80,288,1.000,bicubic,-51.078,-50.165,+62
efficientnet_b1.ft_in1k,26.055,73.945,44.080,55.920,7.79,256,1.000,bicubic,-52.745,-50.262,-47
convnextv2_femto.fcmae_ft_in1k,26.039,73.961,44.302,55.698,5.23,288,0.950,bicubic,-53.299,-50.258,-92
fastvit_t8.apple_dist_in1k,26.033,73.967,44.397,55.603,4.03,256,0.900,bicubic,-51.143,-48.901,+57
lambda_resnet26rpt_256.c1_in1k,26.033,73.967,44.190,55.810,10.99,256,0.940,bicubic,-52.931,-50.246,-63
mobilevitv2_125.cvnets_in1k,26.021,73.979,43.668,56.332,7.48,256,0.888,bicubic,-53.659,-51.190,-119
dpn68.mx_in1k,26.006,73.994,44.084,55.916,12.61,224,0.875,bicubic,-50.340,-48.924,+88
hrnet_w18.ms_in1k,25.982,74.018,44.803,55.197,21.30,224,0.875,bilinear,-50.770,-48.641,+72
hardcorenas_f.miil_green_in1k,25.939,74.061,44.204,55.796,8.20,224,0.875,bilinear,-52.157,-49.598,-1
vit_small_patch16_224.augreg_in1k,25.935,74.065,43.964,56.036,22.05,224,0.900,bicubic,-52.913,-50.324,-61
repghostnet_150.in1k,25.923,74.077,44.328,55.672,6.58,224,0.875,bicubic,-51.537,-49.182,+34
regnety_040.pycls_in1k,25.909,74.091,43.854,56.146,20.65,224,0.875,bicubic,-53.311,-50.802,-87
hrnet_w18_small_v2.gluon_in1k,25.884,74.116,43.815,56.185,15.60,224,0.875,bicubic,-52.306,-50.087,-12
regnetx_016.tv2_in1k,25.878,74.122,43.353,56.647,9.19,224,0.965,bicubic,-53.558,-51.415,-110
fastvit_t8.apple_in1k,25.876,74.124,44.153,55.847,4.03,256,0.900,bicubic,-50.298,-48.899,+83
resnext50d_32x4d.bt_in1k,25.876,74.124,42.956,57.044,25.05,288,0.950,bicubic,-54.788,-52.464,-212
res2net50_26w_4s.in1k,25.872,74.128,43.163,56.837,25.70,224,0.875,bilinear,-52.078,-50.689,+3
tresnet_m.miil_in1k_448,25.860,74.140,42.868,57.132,31.39,448,0.875,bilinear,-55.850,-52.706,-328
coat_tiny.in1k,25.858,74.142,43.275,56.725,5.50,224,0.900,bicubic,-52.568,-50.773,-35
hardcorenas_c.miil_green_in1k,25.821,74.179,44.764,55.236,5.52,224,0.875,bilinear,-51.245,-48.398,+47
densenet121.ra_in1k,25.815,74.185,44.866,55.134,7.98,288,0.950,bicubic,-50.685,-48.502,+69
resnet50c.gluon_in1k,25.793,74.207,43.019,56.981,25.58,224,0.875,bicubic,-52.213,-50.973,-8
halonet26t.a1h_in1k,25.776,74.224,43.220,56.780,12.48,256,0.950,bicubic,-53.330,-51.086,-91
selecsls60.in1k,25.729,74.272,44.065,55.935,30.67,224,0.875,bicubic,-52.260,-49.765,-6
hardcorenas_e.miil_green_in1k,25.658,74.342,43.408,56.592,8.07,224,0.875,bilinear,-52.132,-50.292,+4
poolformer_s12.sail_in1k,25.654,74.346,44.167,55.833,11.92,224,0.900,bicubic,-51.586,-49.365,+31
dla60_res2next.in1k,25.654,74.346,43.675,56.325,17.03,224,0.875,bilinear,-52.786,-50.469,-44
dla60_res2net.in1k,25.648,74.352,43.583,56.417,20.85,224,0.875,bilinear,-52.816,-50.615,-48
ecaresnet26t.ra2_in1k,25.538,74.462,43.660,56.340,16.01,320,0.950,bicubic,-54.312,-51.430,-160
resmlp_12_224.fb_in1k,25.528,74.472,44.330,55.670,15.35,224,0.875,bicubic,-51.120,-48.848,+51
mixnet_l.ft_in1k,25.520,74.480,43.471,56.529,7.33,224,0.875,bicubic,-53.446,-50.711,-90
convnext_femto.d1_in1k,25.510,74.490,43.672,56.328,5.22,288,0.950,bicubic,-53.206,-50.759,-71
tf_efficientnet_lite1.in1k,25.509,74.492,43.573,56.427,5.42,240,0.882,bicubic,-51.136,-49.651,+49
res2net50d.in1k,25.497,74.503,43.041,56.959,25.72,224,0.875,bilinear,-54.757,-51.995,-191
cs3darknet_focus_m.c2ns_in1k,25.493,74.507,43.762,56.238,9.30,288,0.950,bicubic,-51.791,-50.204,+21
resnext50_32x4d.tv_in1k,25.467,74.533,42.787,57.213,25.03,224,0.875,bilinear,-52.155,-50.909,0
bat_resnext26ts.ch_in1k,25.455,74.545,43.194,56.806,10.73,256,0.900,bicubic,-52.797,-50.904,-41
botnet26t_256.c1_in1k,25.451,74.549,42.660,57.340,12.49,256,0.950,bicubic,-53.806,-51.872,-117
eca_halonext26ts.c1_in1k,25.442,74.558,43.182,56.818,10.76,256,0.940,bicubic,-54.044,-51.418,-141
repvgg_a2.rvgg_in1k,25.436,74.564,43.945,56.055,28.21,224,0.875,bilinear,-51.022,-49.057,+52
tf_mixnet_l.in1k,25.414,74.586,42.538,57.462,7.33,224,0.875,bicubic,-53.362,-51.464,-84
regnety_008_tv.tv2_in1k,25.406,74.594,43.434,56.566,6.43,224,0.965,bicubic,-53.260,-50.956,-76
hardcorenas_b.miil_green_in1k,25.396,74.603,44.188,55.812,5.18,224,0.875,bilinear,-51.151,-48.574,+42
res2next50.in1k,25.392,74.608,42.492,57.508,24.67,224,0.875,bilinear,-52.850,-51.399,-47
convnext_femto_ols.d1_in1k,25.387,74.613,43.137,56.863,5.23,288,0.950,bicubic,-53.537,-51.389,-100
efficientformerv2_s0.snap_dist_in1k,25.349,74.651,43.929,56.071,3.60,224,0.950,bicubic,-50.765,-48.929,+54
legacy_seresnet101.in1k,25.332,74.668,42.832,57.168,49.33,224,0.875,bilinear,-53.054,-51.430,-59
selecsls60b.in1k,25.326,74.674,43.556,56.444,32.77,224,0.875,bicubic,-53.086,-50.612,-62
hardcorenas_d.miil_green_in1k,25.326,74.674,43.137,56.863,7.50,224,0.875,bilinear,-52.108,-50.353,-2
resnetv2_50x1_bit.goog_in21k_ft_in1k,25.322,74.678,45.348,54.652,25.55,448,1.000,bilinear,-55.020,-50.334,-217
regnety_032.pycls_in1k,25.318,74.682,42.923,57.077,19.44,224,0.875,bicubic,-53.558,-51.485,-103
wide_resnet50_2.tv_in1k,25.310,74.690,42.182,57.818,68.88,224,0.875,bilinear,-53.166,-51.906,-73
dla102.in1k,25.294,74.706,43.846,56.154,33.27,224,0.875,bilinear,-52.730,-50.088,-40
resnest14d.gluon_in1k,25.280,74.719,44.106,55.894,10.61,224,0.875,bilinear,-50.227,-48.402,+60
legacy_seresnext50_32x4d.in1k,25.212,74.788,41.950,58.050,27.56,224,0.875,bilinear,-53.864,-52.482,-119
ghostnetv2_130.in1k,25.149,74.851,43.271,56.729,8.96,224,0.875,bicubic,-51.607,-50.091,+23
mixer_b16_224.goog_in21k_ft_in1k,25.111,74.888,41.225,58.775,59.88,224,0.875,bicubic,-51.491,-50.999,+26
res2net50_48w_2s.in1k,25.025,74.975,42.206,57.794,25.29,224,0.875,bilinear,-52.489,-51.344,-15
efficientnet_b0.ra_in1k,25.015,74.985,42.797,57.203,5.29,224,0.875,bicubic,-52.679,-50.735,-27
resnet50.a3_in1k,25.000,75.001,41.889,58.111,25.56,224,0.950,bicubic,-53.049,-51.891,-49
dla60.in1k,24.937,75.063,43.322,56.678,22.04,224,0.875,bilinear,-52.109,-49.996,+8
resnet34.gluon_in1k,24.937,75.063,42.237,57.763,21.80,224,0.875,bicubic,-49.643,-49.745,+75
mobilenetv2_120d.ra_in1k,24.915,75.085,43.039,56.961,5.83,224,0.875,bicubic,-52.393,-50.463,-11
convnextv2_atto.fcmae_ft_in1k,24.890,75.111,42.469,57.531,3.71,288,0.950,bicubic,-52.871,-51.257,-34
eca_botnext26ts_256.c1_in1k,24.868,75.132,42.943,57.057,10.59,256,0.950,bicubic,-54.400,-51.663,-147
resnet34.bt_in1k,24.828,75.171,42.080,57.920,21.80,288,0.950,bicubic,-51.652,-51.274,+25
regnety_016.pycls_in1k,24.823,75.177,42.610,57.390,11.20,224,0.875,bicubic,-53.045,-51.108,-43
xcit_nano_12_p8_224.fb_dist_in1k,24.801,75.199,43.072,56.928,3.05,224,1.000,bicubic,-51.531,-50.026,+28
seresnet50.a3_in1k,24.793,75.207,42.094,57.906,28.09,224,0.950,bicubic,-52.233,-50.978,+1
pit_ti_distilled_224.in1k,24.711,75.289,43.225,56.775,5.10,224,0.900,bicubic,-49.545,-48.727,+71
cs3darknet_m.c2ns_in1k,24.626,75.374,42.970,57.030,9.31,288,0.950,bicubic,-53.008,-51.046,-36
eca_resnext26ts.ch_in1k,24.559,75.441,42.536,57.464,10.30,288,1.000,bicubic,-53.441,-51.390,-56
resnet50.bt_in1k,24.559,75.441,41.445,58.555,25.56,288,0.950,bicubic,-55.081,-53.447,-183
mobilevitv2_100.cvnets_in1k,24.553,75.447,42.919,57.081,4.90,256,0.888,bicubic,-53.527,-51.251,-65
tf_efficientnet_lite2.in1k,24.532,75.468,42.290,57.710,6.09,260,0.890,bicubic,-52.930,-51.462,-31
seresnext26ts.ch_in1k,24.506,75.494,42.665,57.335,10.39,288,1.000,bicubic,-53.764,-51.427,-81
skresnet18.ra_in1k,24.497,75.504,42.534,57.466,11.96,224,0.875,bicubic,-48.538,-48.638,+84
regnetx_016.pycls_in1k,24.477,75.523,42.490,57.510,9.19,224,0.875,bicubic,-52.447,-50.925,-3
tf_efficientnet_lite0.in1k,24.373,75.627,42.510,57.490,4.65,224,0.875,bicubic,-50.459,-49.660,+51
hardcorenas_a.miil_green_in1k,24.367,75.633,43.316,56.684,5.26,224,0.875,bilinear,-51.571,-49.192,+24
hrnet_w18.ms_aug_in1k,24.341,75.659,42.897,57.103,21.30,224,0.950,bilinear,-53.781,-51.157,-74
efficientvit_m4.r224_in1k,24.286,75.714,41.758,58.242,8.80,224,0.875,bicubic,-50.082,-50.222,+58
gcresnext26ts.ch_in1k,24.154,75.846,41.306,58.694,10.48,288,1.000,bicubic,-54.260,-52.730,-97
resnet50.tv_in1k,24.096,75.904,41.323,58.677,25.56,224,0.875,bilinear,-52.032,-51.535,+15
tf_efficientnet_b1.in1k,24.070,75.930,41.512,58.488,7.79,240,0.882,bicubic,-54.492,-52.582,-114
levit_128s.fb_dist_in1k,24.060,75.940,41.001,58.999,7.78,224,0.900,bicubic,-52.466,-51.871,+1
levit_conv_128s.fb_dist_in1k,24.052,75.948,41.003,58.997,7.78,224,0.900,bicubic,-52.468,-51.863,+1
legacy_seresnet34.in1k,24.021,75.979,41.901,58.099,21.96,224,0.875,bilinear,-50.781,-50.225,+43
xcit_nano_12_p16_384.fb_dist_in1k,24.005,75.995,42.306,57.694,3.05,384,1.000,bicubic,-51.453,-50.392,+28
xcit_nano_12_p8_384.fb_dist_in1k,23.956,76.044,41.954,58.046,3.05,384,1.000,bicubic,-53.864,-52.086,-62
efficientnet_lite0.ra_in1k,23.905,76.095,42.109,57.891,4.65,224,0.875,bicubic,-51.577,-50.411,+24
repghostnet_130.in1k,23.852,76.148,41.569,58.431,5.48,224,0.875,bicubic,-52.524,-51.323,+1
densenet121.tv_in1k,23.840,76.160,41.928,58.072,7.98,224,0.875,bicubic,-50.924,-50.225,+39
efficientnet_es_pruned.in1k,23.815,76.185,41.991,58.009,5.44,224,0.875,bicubic,-51.191,-50.453,+33
regnetx_008.tv2_in1k,23.779,76.221,40.698,59.302,7.26,224,0.965,bicubic,-53.527,-52.966,-42
mixnet_m.ft_in1k,23.716,76.284,41.146,58.854,5.01,224,0.875,bicubic,-53.544,-52.272,-39
resnet26t.ra2_in1k,23.712,76.288,41.321,58.679,16.01,320,1.000,bicubic,-54.616,-52.803,-105
mobilenetv2_140.ra_in1k,23.695,76.305,41.469,58.531,6.11,224,0.875,bicubic,-52.821,-51.519,-9
dla34.in1k,23.685,76.315,41.538,58.462,15.74,224,0.875,bilinear,-50.955,-50.529,+35
legacy_seresnet50.in1k,23.640,76.360,40.079,59.921,28.09,224,0.875,bilinear,-54.004,-53.679,-66
convnext_atto.d2_in1k,23.589,76.411,41.087,58.913,3.70,288,0.950,bicubic,-53.419,-52.614,-30
resnext26ts.ra2_in1k,23.589,76.411,40.891,59.109,10.30,288,1.000,bicubic,-53.589,-52.573,-42
tf_mixnet_m.in1k,23.484,76.516,40.997,59.003,5.01,224,0.875,bicubic,-53.470,-52.157,-30
resnet34.tv_in1k,23.473,76.527,41.361,58.639,21.80,224,0.875,bilinear,-49.833,-50.059,+53
efficientvit_m3.r224_in1k,23.412,76.588,40.531,59.469,6.90,224,0.875,bicubic,-49.962,-50.817,+49
selecsls42b.in1k,23.369,76.632,40.681,59.319,32.46,224,0.875,bicubic,-53.802,-52.711,-44
tf_efficientnet_em.in1k,23.369,76.632,40.398,59.602,6.90,240,0.882,bicubic,-54.758,-53.650,-101
resnet34.a3_in1k,23.366,76.633,40.068,59.932,21.80,224,0.950,bicubic,-49.603,-51.038,+55
repvgg_b0.rvgg_in1k,23.319,76.681,41.164,58.836,15.82,224,0.875,bilinear,-51.825,-51.252,+12
regnety_004.tv2_in1k,23.292,76.708,40.987,59.013,4.34,224,0.965,bicubic,-52.302,-51.713,+2
xcit_nano_12_p16_224.fb_dist_in1k,23.249,76.751,41.368,58.632,3.05,224,1.000,bicubic,-49.061,-49.492,+61
convnext_atto_ols.a2_in1k,23.129,76.871,40.881,59.119,3.70,288,0.950,bicubic,-54.087,-52.795,-53
mobilenetv2_110d.ra_in1k,23.070,76.930,40.749,59.251,4.52,224,0.875,bicubic,-51.984,-51.434,+12
resnet18.a1_in1k,23.056,76.944,39.551,60.449,11.69,288,1.000,bicubic,-50.102,-51.475,+45
vit_base_patch32_224.sam_in1k,23.046,76.954,39.565,60.435,88.22,224,0.900,bicubic,-50.648,-51.449,+37
tinynet_b.in1k,23.017,76.983,40.979,59.021,3.73,188,0.875,bicubic,-51.961,-51.207,+12
resnet18d.ra2_in1k,23.001,76.999,41.197,58.803,11.71,288,0.950,bicubic,-50.793,-50.640,+34
repghostnet_111.in1k,22.881,77.119,40.439,59.561,4.54,224,0.875,bicubic,-52.175,-51.753,+6
mobileone_s1.apple_in1k,22.803,77.197,39.851,60.149,4.83,224,0.900,bilinear,-52.983,-52.941,-13
deit_tiny_distilled_patch16_224.fb_in1k,22.722,77.278,40.777,59.223,5.91,224,0.900,bicubic,-51.782,-51.113,+18
mobilenetv3_large_100.ra_in1k,22.657,77.343,40.763,59.237,5.48,224,0.875,bicubic,-53.109,-51.775,-14
repvgg_a1.rvgg_in1k,22.640,77.361,39.869,60.131,14.09,224,0.875,bilinear,-51.823,-51.987,+17
mobilenetv3_rw.rmsp_in1k,22.630,77.370,40.362,59.638,5.48,224,0.875,bicubic,-52.990,-52.342,-13
ghostnetv2_100.in1k,22.604,77.396,40.022,59.978,6.16,224,0.875,bicubic,-52.562,-52.332,-4
edgenext_x_small.in1k,22.590,77.410,39.500,60.500,2.34,288,1.000,bicubic,-53.098,-53.266,-17
tf_mobilenetv3_large_100.in1k,22.579,77.421,39.777,60.223,5.48,224,0.875,bilinear,-52.937,-52.817,-13
tf_efficientnet_b0.in1k,22.559,77.441,39.570,60.430,5.29,224,0.875,bicubic,-53.971,-53.438,-41
mobilevit_s.cvnets_in1k,22.478,77.522,38.657,61.343,5.58,256,0.900,bicubic,-55.834,-55.491,-134
xcit_nano_12_p8_224.fb_in1k,22.408,77.592,40.626,59.374,3.05,224,1.000,bicubic,-51.502,-51.542,+20
tf_efficientnet_es.in1k,22.406,77.594,39.089,60.911,5.44,224,0.875,bicubic,-54.192,-54.113,-46
hrnet_w18_small_v2.ms_in1k,22.341,77.659,39.857,60.143,15.60,224,0.875,bilinear,-52.769,-52.559,-8
convit_tiny.fb_in1k,22.268,77.732,39.675,60.325,5.71,224,0.875,bicubic,-50.844,-52.037,+28
regnetx_004_tv.tv2_in1k,22.213,77.787,39.126,60.874,5.50,224,0.965,bicubic,-52.387,-53.044,+3
regnety_008.pycls_in1k,22.109,77.891,38.902,61.098,6.26,224,0.875,bicubic,-54.193,-54.160,-37
ese_vovnet19b_dw.ra_in1k,22.072,77.928,39.464,60.536,6.54,288,0.950,bicubic,-55.672,-54.320,-104
regnety_006.pycls_in1k,21.981,78.019,38.959,61.041,6.06,224,0.875,bicubic,-53.287,-53.567,-17
vit_tiny_r_s16_p8_384.augreg_in21k_ft_in1k,21.948,78.052,39.405,60.595,6.36,384,1.000,bicubic,-54.012,-53.857,-35
regnetx_008.pycls_in1k,21.948,78.052,38.928,61.072,7.26,224,0.875,bicubic,-53.080,-53.410,-11
semnasnet_100.rmsp_in1k,21.889,78.111,38.604,61.396,3.89,224,0.875,bicubic,-53.561,-53.994,-21
pit_ti_224.in1k,21.871,78.129,39.537,60.463,4.85,224,0.900,bicubic,-51.039,-51.867,+25
pvt_v2_b0.in1k,21.836,78.164,40.152,59.848,3.67,224,0.900,bicubic,-48.824,-50.044,+48
regnetx_006.pycls_in1k,21.747,78.253,38.914,61.086,6.20,224,0.875,bicubic,-52.121,-52.764,+8
vit_tiny_patch16_384.augreg_in21k_ft_in1k,21.720,78.280,39.319,60.681,5.79,384,1.000,bicubic,-56.704,-55.223,-158
crossvit_9_240.in1k,21.696,78.304,39.268,60.732,8.55,240,0.875,bicubic,-52.264,-52.694,+2
vgg19_bn.tv_in1k,21.623,78.376,39.276,60.724,143.68,224,0.875,bilinear,-52.592,-52.568,-3
semnasnet_075.rmsp_in1k,21.582,78.418,38.918,61.082,2.91,224,0.875,bicubic,-51.422,-52.222,+16
resnet18.gluon_in1k,21.545,78.455,38.875,61.125,11.69,224,0.875,bicubic,-49.289,-50.881,+40
mobilevitv2_075.cvnets_in1k,21.535,78.465,38.631,61.369,2.87,256,0.888,bicubic,-54.073,-54.113,-37
fbnetc_100.rmsp_in1k,21.492,78.508,38.179,61.821,5.57,224,0.875,bilinear,-53.638,-54.209,-27
repghostnet_100.in1k,21.459,78.541,38.682,61.318,4.07,224,0.875,bicubic,-52.748,-52.860,-7
xcit_nano_12_p16_224.fb_in1k,21.437,78.563,39.791,60.209,3.05,224,1.000,bicubic,-48.525,-49.971,+42
ghostnet_100.in1k,21.384,78.616,38.158,61.842,5.18,224,0.875,bicubic,-52.574,-53.374,-5
mnasnet_100.rmsp_in1k,21.346,78.653,37.709,62.291,4.38,224,0.875,bicubic,-53.306,-54.413,-20
resnet18.fb_ssl_yfcc100m_ft_in1k,21.293,78.707,39.114,60.886,11.69,224,0.875,bilinear,-51.305,-52.301,+11
lcnet_100.ra2_in1k,21.293,78.707,38.867,61.133,2.95,224,0.875,bicubic,-50.809,-51.487,+23
mixnet_s.ft_in1k,21.276,78.724,38.199,61.801,4.13,224,0.875,bicubic,-54.718,-55.071,-54
legacy_seresnext26_32x4d.in1k,21.089,78.911,37.627,62.373,16.79,224,0.875,bicubic,-56.019,-55.687,-92
efficientvit_m2.r224_in1k,21.081,78.919,37.690,62.310,4.19,224,0.875,bicubic,-49.733,-52.452,+30
crossvit_tiny_240.in1k,21.048,78.952,38.061,61.939,7.01,240,0.875,bicubic,-52.292,-53.847,-3
resnet18.a2_in1k,20.944,79.056,36.851,63.149,11.69,288,1.000,bicubic,-51.428,-53.745,+10
repvgg_a0.rvgg_in1k,20.922,79.078,37.539,62.461,9.11,224,0.875,bilinear,-51.486,-52.953,+6
regnetx_004.pycls_in1k,20.887,79.113,37.541,62.459,5.16,224,0.875,bicubic,-51.515,-53.285,+6
spnasnet_100.rmsp_in1k,20.867,79.133,37.896,62.104,4.42,224,0.875,bilinear,-53.227,-53.924,-19
seresnext26t_32x4d.bt_in1k,20.847,79.153,36.344,63.656,16.81,288,0.950,bicubic,-57.897,-57.968,-205
legacy_seresnet18.in1k,20.835,79.165,37.639,62.361,11.78,224,0.875,bicubic,-50.925,-52.693,+16
mobilenetv2_100.ra_in1k,20.761,79.239,37.757,62.243,3.50,224,0.875,bicubic,-52.207,-53.259,-2
tf_mixnet_s.in1k,20.474,79.526,36.621,63.379,4.13,224,0.875,bicubic,-55.178,-56.019,-58
vit_tiny_patch16_224.augreg_in21k_ft_in1k,20.460,79.540,37.601,62.399,5.72,224,0.900,bicubic,-55.002,-55.243,-52
regnety_004.pycls_in1k,20.411,79.589,37.014,62.986,4.34,224,0.875,bicubic,-53.615,-54.734,-24
tf_mobilenetv3_large_075.in1k,20.378,79.622,36.782,63.218,3.99,224,0.875,bilinear,-53.052,-54.570,-17
hrnet_w18_small.ms_in1k,20.364,79.636,37.093,62.907,13.19,224,0.875,bilinear,-51.972,-53.588,0
hrnet_w18_small.gluon_in1k,20.362,79.638,36.973,63.027,13.19,224,0.875,bicubic,-53.558,-54.221,-24
resnet26d.bt_in1k,20.266,79.734,36.348,63.652,16.01,288,0.950,bicubic,-57.142,-57.290,-125
resnet18.tv_in1k,20.230,79.770,37.258,62.742,11.69,224,0.875,bilinear,-49.530,-51.812,+21
mixer_l16_224.goog_in21k_ft_in1k,20.175,79.825,32.938,67.062,208.20,224,0.875,bicubic,-51.879,-54.736,+3
deit_tiny_patch16_224.fb_in1k,20.148,79.852,37.537,62.463,5.72,224,0.900,bicubic,-52.022,-53.579,0
tf_mobilenetv3_large_minimal_100.in1k,20.103,79.897,36.894,63.106,3.92,224,0.875,bilinear,-52.161,-53.746,-4
seresnext26d_32x4d.bt_in1k,20.067,79.933,35.233,64.766,16.81,288,0.950,bicubic,-58.747,-59.006,-226
vgg16_bn.tv_in1k,19.945,80.055,36.314,63.686,138.37,224,0.875,bilinear,-53.425,-55.200,-24
efficientvit_m1.r224_in1k,19.938,80.062,36.403,63.597,2.98,224,0.875,bicubic,-48.368,-52.267,+22
resnet26.bt_in1k,19.739,80.261,35.839,64.161,16.00,288,0.950,bicubic,-56.627,-57.341,-87
repghostnet_080.in1k,19.454,80.546,35.953,64.047,3.28,224,0.875,bicubic,-52.758,-54.531,-7
vit_tiny_r_s16_p8_224.augreg_in21k_ft_in1k,19.334,80.666,36.059,63.941,6.34,224,0.900,bicubic,-52.464,-54.765,-3
mobileone_s0.apple_in1k,19.309,80.691,35.342,64.658,5.29,224,0.875,bilinear,-52.093,-54.500,0
tinynet_c.in1k,19.258,80.742,35.982,64.018,2.46,184,0.875,bicubic,-51.984,-53.750,+1
edgenext_xx_small.in1k,18.863,81.137,35.159,64.841,1.33,288,1.000,bicubic,-53.015,-55.393,-7
efficientvit_b0.r224_in1k,18.464,81.536,33.190,66.810,3.41,224,0.950,bicubic,-52.934,-56.238,-2
resnet18.a3_in1k,18.442,81.558,33.487,66.513,11.69,224,0.950,bicubic,-49.810,-54.685,+15
mobilevit_xs.cvnets_in1k,18.312,81.688,33.206,66.794,2.32,256,0.900,bicubic,-56.322,-59.142,-54
lcnet_075.ra2_in1k,18.128,81.872,34.371,65.629,2.36,224,0.875,bicubic,-50.654,-53.989,+9
vgg19.tv_in1k,17.941,82.059,33.058,66.942,143.67,224,0.875,bilinear,-54.437,-57.816,-22
vgg13_bn.tv_in1k,17.798,82.202,34.043,65.957,133.05,224,0.875,bilinear,-53.790,-56.335,-9
vgg16.tv_in1k,17.538,82.462,32.779,67.221,138.36,224,0.875,bilinear,-54.054,-57.605,-11
regnety_002.pycls_in1k,17.456,82.544,32.435,67.565,3.16,224,0.875,bicubic,-52.824,-57.095,-3
vgg11_bn.tv_in1k,17.397,82.603,33.001,66.999,132.87,224,0.875,bilinear,-52.985,-56.807,-5
mobilevitv2_050.cvnets_in1k,17.306,82.694,33.007,66.993,1.37,256,0.888,bicubic,-52.842,-56.911,-4
repghostnet_058.in1k,17.161,82.839,32.596,67.404,2.55,224,0.875,bicubic,-51.753,-55.824,+1
regnetx_002.pycls_in1k,16.959,83.041,32.225,67.775,2.68,224,0.875,bicubic,-51.793,-56.317,+2
mobilenetv3_small_100.lamb_in1k,16.803,83.197,32.518,67.482,2.54,224,0.875,bicubic,-50.855,-55.118,+7
resnet10t.c3_in1k,16.699,83.301,32.123,67.877,5.44,224,0.950,bicubic,-51.665,-55.913,+1
efficientvit_m0.r224_in1k,16.670,83.330,31.948,68.052,2.35,224,0.875,bicubic,-46.600,-53.228,+14
mobilenetv2_050.lamb_in1k,16.668,83.332,31.950,68.050,1.97,224,0.875,bicubic,-49.280,-54.134,+9
tinynet_d.in1k,16.658,83.342,32.449,67.551,2.34,152,0.875,bicubic,-50.314,-54.617,+4
mnasnet_small.lamb_in1k,16.638,83.362,31.909,68.091,2.03,224,0.875,bicubic,-49.558,-54.595,+5
dla60x_c.in1k,16.336,83.664,31.757,68.243,1.32,224,0.875,bilinear,-51.576,-56.675,0
tf_mobilenetv3_small_100.in1k,16.216,83.784,31.205,68.795,2.54,224,0.875,bilinear,-51.706,-56.467,-2
vgg13.tv_in1k,16.096,83.904,30.985,69.015,133.05,224,0.875,bilinear,-53.836,-58.265,-13
resnet14t.c3_in1k,15.925,84.075,30.003,69.997,10.08,224,0.950,bicubic,-56.329,-60.303,-34
vgg11.tv_in1k,15.723,84.278,30.451,69.549,132.86,224,0.875,bilinear,-53.300,-58.173,-13
repghostnet_050.in1k,15.589,84.411,30.189,69.811,2.31,224,0.875,bicubic,-51.377,-56.731,-2
mobilenetv3_small_075.lamb_in1k,14.948,85.052,29.733,70.267,2.04,224,0.875,bicubic,-50.288,-55.713,+2
tf_mobilenetv3_small_075.in1k,14.932,85.067,29.562,70.438,2.04,224,0.875,bilinear,-50.794,-56.570,0
dla46_c.in1k,14.665,85.335,29.397,70.603,1.30,224,0.875,bilinear,-50.207,-56.901,+1
mobilevit_xxs.cvnets_in1k,14.490,85.510,28.654,71.346,1.27,256,0.900,bicubic,-54.428,-60.291,-17
dla46x_c.in1k,14.380,85.620,29.197,70.803,1.07,224,0.875,bilinear,-51.612,-57.777,-5
lcnet_050.ra2_in1k,14.290,85.710,28.659,71.341,1.88,224,0.875,bicubic,-48.848,-55.724,0
tf_mobilenetv3_small_minimal_100.in1k,13.962,86.038,27.990,72.010,2.04,224,0.875,bilinear,-48.932,-56.248,0
tinynet_e.in1k,12.671,87.329,26.383,73.617,2.04,106,0.875,bicubic,-47.195,-55.379,0
mobilenetv3_small_050.lamb_in1k,11.038,88.962,23.477,76.523,1.59,224,0.875,bicubic,-46.878,-56.703,0
hf_public_repos/pytorch-image-models/results/results-imagenet-a-clean.csv
model,top1,top1_err,top5,top5_err,param_count,img_size,crop_pct,interpolation
eva02_large_patch14_448.mim_in22k_ft_in22k_in1k,98.930,1.070,99.910,0.090,305.08,448,1.000,bicubic
eva02_large_patch14_448.mim_m38m_ft_in22k_in1k,98.850,1.150,99.880,0.120,305.08,448,1.000,bicubic
eva02_large_patch14_448.mim_in22k_ft_in1k,98.840,1.160,99.830,0.170,305.08,448,1.000,bicubic
eva_giant_patch14_560.m30m_ft_in22k_in1k,98.830,1.170,99.900,0.100,"1,014.45",560,1.000,bicubic
eva_giant_patch14_336.clip_ft_in1k,98.820,1.180,99.810,0.190,"1,013.01",336,1.000,bicubic
eva_giant_patch14_336.m30m_ft_in22k_in1k,98.810,1.190,99.900,0.100,"1,013.01",336,1.000,bicubic
eva_large_patch14_336.in22k_ft_in22k_in1k,98.740,1.260,99.800,0.200,304.53,336,1.000,bicubic
eva_large_patch14_336.in22k_ft_in1k,98.730,1.270,99.870,0.130,304.53,336,1.000,bicubic
eva02_large_patch14_448.mim_m38m_ft_in1k,98.730,1.270,99.790,0.210,305.08,448,1.000,bicubic
convnextv2_huge.fcmae_ft_in22k_in1k_384,98.670,1.330,99.860,0.140,660.29,384,1.000,bicubic
eva02_base_patch14_448.mim_in22k_ft_in22k_in1k,98.640,1.360,99.800,0.200,87.12,448,1.000,bicubic
maxvit_base_tf_512.in21k_ft_in1k,98.620,1.380,99.800,0.200,119.88,512,1.000,bicubic
maxvit_xlarge_tf_512.in21k_ft_in1k,98.620,1.380,99.800,0.200,475.77,512,1.000,bicubic
maxvit_large_tf_512.in21k_ft_in1k,98.620,1.380,99.790,0.210,212.33,512,1.000,bicubic
convnextv2_huge.fcmae_ft_in22k_in1k_512,98.600,1.400,99.870,0.130,660.29,512,1.000,bicubic
beit_large_patch16_512.in22k_ft_in22k_in1k,98.560,1.440,99.840,0.160,305.67,512,1.000,bicubic
tf_efficientnet_l2.ns_jft_in1k,98.550,1.450,99.820,0.180,480.31,800,0.960,bicubic
beitv2_large_patch16_224.in1k_ft_in22k_in1k,98.540,1.460,99.760,0.240,304.43,224,0.950,bicubic
beit_large_patch16_384.in22k_ft_in22k_in1k,98.520,1.480,99.820,0.180,305.00,384,1.000,bicubic
maxvit_base_tf_384.in21k_ft_in1k,98.520,1.480,99.750,0.250,119.65,384,1.000,bicubic
tf_efficientnet_l2.ns_jft_in1k_475,98.500,1.500,99.830,0.170,480.31,475,0.936,bicubic
maxvit_xlarge_tf_384.in21k_ft_in1k,98.500,1.500,99.780,0.220,475.32,384,1.000,bicubic
maxvit_large_tf_384.in21k_ft_in1k,98.490,1.510,99.750,0.250,212.03,384,1.000,bicubic
convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_384,98.480,1.520,99.780,0.220,200.13,384,1.000,bicubic
deit3_large_patch16_384.fb_in22k_ft_in1k,98.460,1.540,99.760,0.240,304.76,384,1.000,bicubic
eva_giant_patch14_224.clip_ft_in1k,98.460,1.540,99.750,0.250,"1,012.56",224,0.900,bicubic
regnety_1280.swag_ft_in1k,98.450,1.550,99.870,0.130,644.81,384,1.000,bicubic
eva02_base_patch14_448.mim_in22k_ft_in1k,98.440,1.560,99.820,0.180,87.12,448,1.000,bicubic
caformer_b36.sail_in22k_ft_in1k_384,98.440,1.560,99.800,0.200,98.75,384,1.000,bicubic
convnext_xxlarge.clip_laion2b_soup_ft_in1k,98.440,1.560,99.800,0.200,846.47,256,1.000,bicubic
convnext_xlarge.fb_in22k_ft_in1k_384,98.420,1.580,99.810,0.190,350.20,384,1.000,bicubic
vit_huge_patch14_clip_336.laion2b_ft_in12k_in1k,98.420,1.580,99.810,0.190,632.46,336,1.000,bicubic
eva_large_patch14_196.in22k_ft_in22k_in1k,98.420,1.580,99.770,0.230,304.14,196,1.000,bicubic
convnextv2_large.fcmae_ft_in22k_in1k_384,98.400,1.600,99.760,0.240,197.96,384,1.000,bicubic
eva_large_patch14_196.in22k_ft_in1k,98.360,1.640,99.820,0.180,304.14,196,1.000,bicubic
convnextv2_base.fcmae_ft_in22k_in1k_384,98.350,1.650,99.770,0.230,88.72,384,1.000,bicubic
vit_large_patch14_clip_336.laion2b_ft_in12k_in1k,98.340,1.660,99.760,0.240,304.53,336,1.000,bicubic
vit_huge_patch14_clip_224.laion2b_ft_in12k_in1k,98.300,1.700,99.760,0.240,632.05,224,1.000,bicubic
convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_320,98.280,1.720,99.770,0.230,200.13,320,1.000,bicubic
convformer_b36.sail_in22k_ft_in1k_384,98.260,1.740,99.830,0.170,99.88,384,1.000,bicubic
maxxvitv2_rmlp_base_rw_384.sw_in12k_ft_in1k,98.260,1.740,99.780,0.220,116.09,384,1.000,bicubic
vit_large_patch14_clip_336.openai_ft_in12k_in1k,98.260,1.740,99.770,0.230,304.53,336,1.000,bicubic
convnext_large_mlp.clip_laion2b_augreg_ft_in1k_384,98.250,1.750,99.760,0.240,200.13,384,1.000,bicubic
convnext_large.fb_in22k_ft_in1k_384,98.240,1.760,99.750,0.250,197.77,384,1.000,bicubic
vit_large_patch16_384.augreg_in21k_ft_in1k,98.220,1.780,99.800,0.200,304.72,384,1.000,bicubic
vit_large_patch14_clip_224.openai_ft_in12k_in1k,98.220,1.780,99.720,0.280,304.20,224,1.000,bicubic
vit_large_patch14_clip_336.laion2b_ft_in1k,98.220,1.780,99.720,0.280,304.53,336,1.000,bicubic
seresnextaa201d_32x8d.sw_in12k_ft_in1k_384,98.210,1.790,99.780,0.220,149.39,384,1.000,bicubic
vit_base_patch16_clip_384.openai_ft_in12k_in1k,98.190,1.810,99.660,0.340,86.86,384,0.950,bicubic
beit_large_patch16_224.in22k_ft_in22k_in1k,98.180,1.820,99.760,0.240,304.43,224,0.900,bicubic
deit3_large_patch16_224.fb_in22k_ft_in1k,98.170,1.830,99.760,0.240,304.37,224,1.000,bicubic
maxvit_rmlp_base_rw_384.sw_in12k_ft_in1k,98.170,1.830,99.760,0.240,116.14,384,1.000,bicubic
deit3_huge_patch14_224.fb_in22k_ft_in1k,98.170,1.830,99.730,0.270,632.13,224,1.000,bicubic
caformer_b36.sail_in22k_ft_in1k,98.160,1.840,99.780,0.220,98.75,224,1.000,bicubic
vit_large_patch14_clip_224.openai_ft_in1k,98.160,1.840,99.660,0.340,304.20,224,1.000,bicubic
caformer_m36.sail_in22k_ft_in1k_384,98.150,1.850,99.750,0.250,56.20,384,1.000,bicubic
swinv2_large_window12to24_192to384.ms_in22k_ft_in1k,98.130,1.870,99.710,0.290,196.74,384,1.000,bicubic
swinv2_base_window12to24_192to384.ms_in22k_ft_in1k,98.120,1.880,99.780,0.220,87.92,384,1.000,bicubic
convnext_large.fb_in22k_ft_in1k,98.120,1.880,99.740,0.260,197.77,288,1.000,bicubic
convnext_xlarge.fb_in22k_ft_in1k,98.110,1.890,99.780,0.220,350.20,288,1.000,bicubic
convnextv2_large.fcmae_ft_in22k_in1k,98.090,1.910,99.770,0.230,197.96,288,1.000,bicubic
vit_large_patch14_clip_224.laion2b_ft_in12k_in1k,98.080,1.920,99.760,0.240,304.20,224,1.000,bicubic
convnext_base.fb_in22k_ft_in1k_384,98.080,1.920,99.650,0.350,88.59,384,1.000,bicubic
coatnet_rmlp_2_rw_384.sw_in12k_ft_in1k,98.070,1.930,99.720,0.280,73.88,384,1.000,bicubic
regnety_320.swag_ft_in1k,98.060,1.940,99.860,0.140,145.05,384,1.000,bicubic
convnextv2_base.fcmae_ft_in22k_in1k,98.060,1.940,99.760,0.240,88.72,288,1.000,bicubic
swin_large_patch4_window12_384.ms_in22k_ft_in1k,98.050,1.950,99.690,0.310,196.74,384,1.000,bicubic
convnext_base.clip_laion2b_augreg_ft_in12k_in1k_384,98.040,1.960,99.750,0.250,88.59,384,1.000,bicubic
convformer_m36.sail_in22k_ft_in1k_384,98.040,1.960,99.690,0.310,57.05,384,1.000,bicubic
vit_huge_patch14_clip_224.laion2b_ft_in1k,98.020,1.980,99.720,0.280,632.05,224,1.000,bicubic
vit_base_patch16_clip_384.laion2b_ft_in12k_in1k,97.990,2.010,99.660,0.340,86.86,384,1.000,bicubic
caformer_s36.sail_in22k_ft_in1k_384,97.970,2.030,99.720,0.280,39.30,384,1.000,bicubic
seresnextaa101d_32x8d.sw_in12k_ft_in1k_288,97.970,2.030,99.700,0.300,93.59,320,1.000,bicubic
convnext_large_mlp.clip_laion2b_augreg_ft_in1k,97.950,2.050,99.710,0.290,200.13,256,1.000,bicubic
convformer_b36.sail_in22k_ft_in1k,97.940,2.060,99.760,0.240,99.88,224,1.000,bicubic
tf_efficientnet_b7.ns_jft_in1k,97.920,2.080,99.720,0.280,66.35,600,0.949,bicubic
beitv2_large_patch16_224.in1k_ft_in1k,97.910,2.090,99.660,0.340,304.43,224,0.950,bicubic
seresnextaa101d_32x8d.sw_in12k_ft_in1k,97.910,2.090,99.660,0.340,93.59,288,1.000,bicubic
swin_base_patch4_window12_384.ms_in22k_ft_in1k,97.900,2.100,99.710,0.290,87.90,384,1.000,bicubic
convnextv2_huge.fcmae_ft_in1k,97.900,2.100,99.670,0.330,660.29,288,1.000,bicubic
tf_efficientnetv2_xl.in21k_ft_in1k,97.900,2.100,99.570,0.430,208.12,512,1.000,bicubic
vit_large_patch14_clip_224.laion2b_ft_in1k,97.890,2.110,99.650,0.350,304.20,224,1.000,bicubic
tiny_vit_21m_512.dist_in22k_ft_in1k,97.870,2.130,99.630,0.370,21.27,512,1.000,bicubic
convnext_base.fb_in22k_ft_in1k,97.860,2.140,99.680,0.320,88.59,288,1.000,bicubic
vit_large_r50_s32_384.augreg_in21k_ft_in1k,97.860,2.140,99.670,0.330,329.09,384,1.000,bicubic
swinv2_large_window12to16_192to256.ms_in22k_ft_in1k,97.850,2.150,99.650,0.350,196.74,256,0.900,bicubic
convformer_s36.sail_in22k_ft_in1k_384,97.850,2.150,99.640,0.360,40.01,384,1.000,bicubic
deit3_base_patch16_384.fb_in22k_ft_in1k,97.840,2.160,99.680,0.320,86.88,384,1.000,bicubic
caformer_m36.sail_in22k_ft_in1k,97.840,2.160,99.670,0.330,56.20,224,1.000,bicubic
vit_base_patch16_384.augreg_in21k_ft_in1k,97.840,2.160,99.670,0.330,86.86,384,1.000,bicubic
maxvit_large_tf_512.in1k,97.830,2.170,99.560,0.440,212.33,512,1.000,bicubic
beit_base_patch16_384.in22k_ft_in22k_in1k,97.820,2.180,99.700,0.300,86.74,384,1.000,bicubic
tf_efficientnetv2_m.in21k_ft_in1k,97.820,2.180,99.600,0.400,54.14,480,1.000,bicubic
maxvit_rmlp_base_rw_224.sw_in12k_ft_in1k,97.810,2.190,99.650,0.350,116.14,224,0.950,bicubic
tf_efficientnetv2_l.in21k_ft_in1k,97.800,2.200,99.770,0.230,118.52,480,1.000,bicubic
convnext_small.in12k_ft_in1k_384,97.800,2.200,99.660,0.340,50.22,384,1.000,bicubic
regnety_160.swag_ft_in1k,97.780,2.220,99.760,0.240,83.59,384,1.000,bicubic
dm_nfnet_f6.dm_in1k,97.780,2.220,99.650,0.350,438.36,576,0.956,bicubic
dm_nfnet_f5.dm_in1k,97.780,2.220,99.600,0.400,377.21,544,0.954,bicubic
volo_d5_512.sail_in1k,97.770,2.230,99.670,0.330,296.09,512,1.150,bicubic
maxxvitv2_rmlp_base_rw_224.sw_in12k_ft_in1k,97.760,2.240,99.700,0.300,116.09,224,0.950,bicubic
volo_d5_448.sail_in1k,97.750,2.250,99.620,0.380,295.91,448,1.150,bicubic
maxvit_small_tf_512.in1k,97.750,2.250,99.550,0.450,69.13,512,1.000,bicubic
maxvit_base_tf_512.in1k,97.730,2.270,99.610,0.390,119.88,512,1.000,bicubic
vit_base_patch16_clip_384.laion2b_ft_in1k,97.720,2.280,99.630,0.370,86.86,384,1.000,bicubic
beitv2_base_patch16_224.in1k_ft_in22k_in1k,97.690,2.310,99.680,0.320,86.53,224,0.900,bicubic
vit_base_patch8_224.augreg2_in21k_ft_in1k,97.690,2.310,99.650,0.350,86.58,224,0.900,bicubic
volo_d4_448.sail_in1k,97.670,2.330,99.610,0.390,193.41,448,1.150,bicubic
swinv2_base_window12to16_192to256.ms_in22k_ft_in1k,97.660,2.340,99.720,0.280,87.92,256,0.900,bicubic
convnextv2_large.fcmae_ft_in1k,97.660,2.340,99.610,0.390,197.96,288,1.000,bicubic
regnety_1280.swag_lc_in1k,97.650,2.350,99.640,0.360,644.81,224,0.965,bicubic
coatnet_rmlp_2_rw_224.sw_in12k_ft_in1k,97.650,2.350,99.570,0.430,73.88,224,0.950,bicubic
swin_large_patch4_window7_224.ms_in22k_ft_in1k,97.650,2.350,99.570,0.430,196.53,224,0.900,bicubic
dm_nfnet_f4.dm_in1k,97.640,2.360,99.540,0.460,316.07,512,0.951,bicubic
vit_large_patch16_224.augreg_in21k_ft_in1k,97.630,2.370,99.590,0.410,304.33,224,0.900,bicubic
tf_efficientnet_b6.ns_jft_in1k,97.620,2.380,99.580,0.420,43.04,528,0.942,bicubic
convnext_base.clip_laiona_augreg_ft_in1k_384,97.620,2.380,99.550,0.450,88.59,384,1.000,bicubic
convnext_small.fb_in22k_ft_in1k_384,97.610,2.390,99.600,0.400,50.22,384,1.000,bicubic
tiny_vit_21m_384.dist_in22k_ft_in1k,97.610,2.390,99.590,0.410,21.23,384,1.000,bicubic
convnext_base.clip_laion2b_augreg_ft_in12k_in1k,97.600,2.400,99.720,0.280,88.59,256,1.000,bicubic
convformer_m36.sail_in22k_ft_in1k,97.600,2.400,99.620,0.380,57.05,224,1.000,bicubic
caformer_s36.sail_in22k_ft_in1k,97.600,2.400,99.610,0.390,39.30,224,1.000,bicubic
maxvit_tiny_tf_512.in1k,97.580,2.420,99.560,0.440,31.05,512,1.000,bicubic
vit_base_patch8_224.augreg_in21k_ft_in1k,97.570,2.430,99.670,0.330,86.58,224,0.900,bicubic
maxvit_base_tf_384.in1k,97.570,2.430,99.590,0.410,119.65,384,1.000,bicubic
maxvit_large_tf_384.in1k,97.570,2.430,99.530,0.470,212.03,384,1.000,bicubic
volo_d3_448.sail_in1k,97.550,2.450,99.560,0.440,86.63,448,1.000,bicubic
vit_base_patch16_clip_384.openai_ft_in1k,97.540,2.460,99.660,0.340,86.86,384,1.000,bicubic
convformer_b36.sail_in1k_384,97.530,2.470,99.520,0.480,99.88,384,1.000,bicubic
vit_base_patch16_clip_224.openai_ft_in12k_in1k,97.530,2.470,99.500,0.500,86.57,224,0.950,bicubic
coatnet_2_rw_224.sw_in12k_ft_in1k,97.520,2.480,99.600,0.400,73.87,224,0.950,bicubic
xcit_large_24_p8_384.fb_dist_in1k,97.520,2.480,99.540,0.460,188.93,384,1.000,bicubic
xcit_large_24_p16_384.fb_dist_in1k,97.520,2.480,99.480,0.520,189.10,384,1.000,bicubic
tf_efficientnet_b5.ns_jft_in1k,97.500,2.500,99.640,0.360,30.39,456,0.934,bicubic
caformer_b36.sail_in1k_384,97.500,2.500,99.580,0.420,98.75,384,1.000,bicubic
resnetv2_152x4_bit.goog_in21k_ft_in1k,97.490,2.510,99.610,0.390,936.53,480,1.000,bilinear
deit3_base_patch16_224.fb_in22k_ft_in1k,97.480,2.520,99.600,0.400,86.59,224,1.000,bicubic
cait_m48_448.fb_dist_in1k,97.480,2.520,99.550,0.450,356.46,448,1.000,bicubic
dm_nfnet_f3.dm_in1k,97.470,2.530,99.560,0.440,254.92,416,0.940,bicubic
tf_efficientnetv2_l.in1k,97.470,2.530,99.530,0.470,118.52,480,1.000,bicubic
regnety_160.lion_in12k_ft_in1k,97.450,2.550,99.600,0.400,83.59,288,1.000,bicubic
regnety_160.sw_in12k_ft_in1k,97.450,2.550,99.590,0.410,83.59,288,1.000,bicubic
vit_base_patch16_clip_224.laion2b_ft_in12k_in1k,97.450,2.550,99.540,0.460,86.57,224,0.950,bicubic
vit_medium_patch16_gap_384.sw_in12k_ft_in1k,97.440,2.560,99.640,0.360,39.03,384,0.950,bicubic
caformer_m36.sail_in1k_384,97.440,2.560,99.600,0.400,56.20,384,1.000,bicubic
maxvit_small_tf_384.in1k,97.430,2.570,99.510,0.490,69.02,384,1.000,bicubic
deit3_large_patch16_384.fb_in1k,97.420,2.580,99.620,0.380,304.76,384,1.000,bicubic
caformer_s18.sail_in22k_ft_in1k_384,97.420,2.580,99.570,0.430,26.34,384,1.000,bicubic
flexivit_large.1200ep_in1k,97.410,2.590,99.600,0.400,304.36,240,0.950,bicubic
efficientnet_b5.sw_in12k_ft_in1k,97.410,2.590,99.550,0.450,30.39,448,1.000,bicubic
convformer_m36.sail_in1k_384,97.410,2.590,99.470,0.530,57.05,384,1.000,bicubic
cait_m36_384.fb_dist_in1k,97.400,2.600,99.510,0.490,271.22,384,1.000,bicubic
caformer_s36.sail_in1k_384,97.390,2.610,99.540,0.460,39.30,384,1.000,bicubic
volo_d5_224.sail_in1k,97.380,2.620,99.570,0.430,295.46,224,0.960,bicubic
resnext101_32x32d.fb_wsl_ig1b_ft_in1k,97.370,2.630,99.680,0.320,468.53,224,0.875,bilinear
convnext_small.fb_in22k_ft_in1k,97.360,2.640,99.530,0.470,50.22,288,1.000,bicubic
vit_base_patch32_clip_384.laion2b_ft_in12k_in1k,97.360,2.640,99.520,0.480,88.30,384,1.000,bicubic
convnext_small.in12k_ft_in1k,97.350,2.650,99.580,0.420,50.22,288,1.000,bicubic
convnext_tiny.in12k_ft_in1k_384,97.340,2.660,99.600,0.400,28.59,384,1.000,bicubic
cait_s36_384.fb_dist_in1k,97.330,2.670,99.540,0.460,68.37,384,1.000,bicubic
volo_d2_384.sail_in1k,97.320,2.680,99.600,0.400,58.87,384,1.000,bicubic
maxvit_tiny_tf_384.in1k,97.310,2.690,99.500,0.500,30.98,384,1.000,bicubic
vit_base_patch32_clip_448.laion2b_ft_in12k_in1k,97.310,2.690,99.480,0.520,88.34,448,1.000,bicubic
flexivit_large.600ep_in1k,97.280,2.720,99.590,0.410,304.36,240,0.950,bicubic
swin_base_patch4_window7_224.ms_in22k_ft_in1k,97.280,2.720,99.540,0.460,87.77,224,0.900,bicubic
regnety_120.sw_in12k_ft_in1k,97.280,2.720,99.530,0.470,51.82,288,1.000,bicubic
volo_d4_224.sail_in1k,97.280,2.720,99.520,0.480,192.96,224,0.960,bicubic
xcit_medium_24_p8_384.fb_dist_in1k,97.280,2.720,99.510,0.490,84.32,384,1.000,bicubic
xcit_medium_24_p16_384.fb_dist_in1k,97.280,2.720,99.470,0.530,84.40,384,1.000,bicubic
convformer_s36.sail_in1k_384,97.280,2.720,99.430,0.570,40.01,384,1.000,bicubic
convformer_s18.sail_in22k_ft_in1k_384,97.270,2.730,99.550,0.450,26.77,384,1.000,bicubic
inception_next_base.sail_in1k_384,97.260,2.740,99.490,0.510,86.67,384,1.000,bicubic
flexivit_large.300ep_in1k,97.250,2.750,99.490,0.510,304.36,240,0.950,bicubic
xcit_small_24_p8_384.fb_dist_in1k,97.240,2.760,99.610,0.390,47.63,384,1.000,bicubic
convnext_base.clip_laion2b_augreg_ft_in1k,97.240,2.760,99.550,0.450,88.59,256,1.000,bicubic
convnextv2_tiny.fcmae_ft_in22k_in1k_384,97.240,2.760,99.520,0.480,28.64,384,1.000,bicubic
xcit_small_12_p8_384.fb_dist_in1k,97.230,2.770,99.480,0.520,26.21,384,1.000,bicubic
convnextv2_base.fcmae_ft_in1k,97.220,2.780,99.540,0.460,88.72,288,1.000,bicubic
regnety_2560.seer_ft_in1k,97.220,2.780,99.520,0.480,"1,282.60",384,1.000,bicubic
resnext101_32x8d.fb_swsl_ig1b_ft_in1k,97.210,2.790,99.570,0.430,88.79,224,0.875,bilinear
tf_efficientnetv2_m.in1k,97.210,2.790,99.530,0.470,54.14,480,1.000,bicubic
tf_efficientnet_b7.ap_in1k,97.200,2.800,99.540,0.460,66.35,600,0.949,bicubic
regnetz_e8.ra3_in1k,97.200,2.800,99.500,0.500,57.70,320,1.000,bicubic
tf_efficientnet_b8.ra_in1k,97.200,2.800,99.500,0.500,87.41,672,0.954,bicubic
tiny_vit_21m_224.dist_in22k_ft_in1k,97.200,2.800,99.490,0.510,21.20,224,0.950,bicubic
vit_base_r50_s16_384.orig_in21k_ft_in1k,97.190,2.810,99.560,0.440,98.95,384,1.000,bicubic
beitv2_base_patch16_224.in1k_ft_in1k,97.170,2.830,99.470,0.530,86.53,224,0.900,bicubic
regnety_320.swag_lc_in1k,97.160,2.840,99.670,0.330,145.05,224,0.965,bicubic
vit_base_patch16_224.augreg2_in21k_ft_in1k,97.150,2.850,99.540,0.460,86.57,224,0.900,bicubic
coat_lite_medium_384.in1k,97.150,2.850,99.450,0.550,44.57,384,1.000,bicubic
eva02_small_patch14_336.mim_in22k_ft_in1k,97.140,2.860,99.470,0.530,22.13,336,1.000,bicubic
deit3_small_patch16_384.fb_in22k_ft_in1k,97.130,2.870,99.510,0.490,22.21,384,1.000,bicubic
vit_base_patch16_clip_224.laion2b_ft_in1k,97.130,2.870,99.460,0.540,86.57,224,1.000,bicubic
xcit_small_24_p16_384.fb_dist_in1k,97.120,2.880,99.450,0.550,47.67,384,1.000,bicubic
tf_efficientnet_b8.ap_in1k,97.110,2.890,99.660,0.340,87.41,672,0.954,bicubic
dm_nfnet_f2.dm_in1k,97.110,2.890,99.510,0.490,193.78,352,0.920,bicubic
vit_base_patch32_clip_384.openai_ft_in12k_in1k,97.110,2.890,99.500,0.500,88.30,384,0.950,bicubic
convnext_large.fb_in1k,97.100,2.900,99.450,0.550,197.77,288,1.000,bicubic
ecaresnet269d.ra2_in1k,97.090,2.910,99.470,0.530,102.09,352,1.000,bicubic
volo_d3_224.sail_in1k,97.090,2.910,99.470,0.530,86.33,224,0.960,bicubic
beit_base_patch16_224.in22k_ft_in22k_in1k,97.080,2.920,99.610,0.390,86.53,224,0.900,bicubic
tf_efficientnet_b6.ap_in1k,97.080,2.920,99.610,0.390,43.04,528,0.942,bicubic
convformer_s36.sail_in22k_ft_in1k,97.080,2.920,99.560,0.440,40.01,224,1.000,bicubic
convnext_tiny.fb_in22k_ft_in1k_384,97.080,2.920,99.510,0.490,28.59,384,1.000,bicubic
eca_nfnet_l2.ra3_in1k,97.080,2.920,99.510,0.490,56.72,384,1.000,bicubic
vit_base_patch16_clip_224.openai_ft_in1k,97.080,2.920,99.490,0.510,86.57,224,0.900,bicubic
caformer_s18.sail_in1k_384,97.080,2.920,99.420,0.580,26.34,384,1.000,bicubic
cait_s24_384.fb_dist_in1k,97.070,2.930,99.430,0.570,47.06,384,1.000,bicubic
xcit_large_24_p8_224.fb_dist_in1k,97.070,2.930,99.420,0.580,188.93,224,1.000,bicubic
convnext_tiny.in12k_ft_in1k,97.060,2.940,99.550,0.450,28.59,288,1.000,bicubic
deit3_base_patch16_384.fb_in1k,97.040,2.960,99.390,0.610,86.88,384,1.000,bicubic
convformer_s18.sail_in1k_384,97.040,2.960,99.380,0.620,26.77,384,1.000,bicubic
hrnet_w48_ssld.paddle_in1k,97.030,2.970,99.640,0.360,77.47,288,1.000,bilinear
dm_nfnet_f1.dm_in1k,97.030,2.970,99.390,0.610,132.63,320,0.910,bicubic
resnetv2_152x2_bit.goog_in21k_ft_in1k,97.000,3.000,99.590,0.410,236.34,448,1.000,bilinear
tf_efficientnet_b7.ra_in1k,97.000,3.000,99.520,0.480,66.35,600,0.949,bicubic
volo_d2_224.sail_in1k,97.000,3.000,99.390,0.610,58.68,224,0.960,bicubic
efficientnetv2_rw_m.agc_in1k,96.980,3.020,99.530,0.470,53.24,416,1.000,bicubic
resnetv2_101x3_bit.goog_in21k_ft_in1k,96.980,3.020,99.490,0.510,387.93,448,1.000,bilinear
caformer_b36.sail_in1k,96.980,3.020,99.340,0.660,98.75,224,1.000,bicubic
deit3_medium_patch16_224.fb_in22k_ft_in1k,96.970,3.030,99.430,0.570,38.85,224,1.000,bicubic
deit_base_distilled_patch16_384.fb_in1k,96.960,3.040,99.480,0.520,87.63,384,1.000,bicubic
seresnextaa101d_32x8d.ah_in1k,96.960,3.040,99.390,0.610,93.59,288,1.000,bicubic
maxvit_large_tf_224.in1k,96.960,3.040,99.250,0.750,211.79,224,0.950,bicubic
tf_efficientnet_b4.ns_jft_in1k,96.950,3.050,99.580,0.420,19.34,380,0.922,bicubic
maxvit_base_tf_224.in1k,96.950,3.050,99.260,0.740,119.47,224,0.950,bicubic
deit3_large_patch16_224.fb_in1k,96.940,3.060,99.340,0.660,304.37,224,0.900,bicubic
davit_base.msft_in1k,96.940,3.060,99.260,0.740,87.95,224,0.950,bicubic
mvitv2_large.fb_in1k,96.930,3.070,99.400,0.600,217.99,224,0.900,bicubic
xcit_small_12_p16_384.fb_dist_in1k,96.920,3.080,99.400,0.600,26.25,384,1.000,bicubic
xcit_medium_24_p8_224.fb_dist_in1k,96.920,3.080,99.390,0.610,84.32,224,1.000,bicubic
volo_d1_384.sail_in1k,96.910,3.090,99.520,0.480,26.78,384,1.000,bicubic
resnetrs420.tf_in1k,96.910,3.090,99.460,0.540,191.89,416,1.000,bicubic
deit3_huge_patch14_224.fb_in1k,96.900,3.100,99.480,0.520,632.13,224,0.900,bicubic
convformer_b36.sail_in1k,96.900,3.100,99.220,0.780,99.88,224,1.000,bicubic
caformer_m36.sail_in1k,96.890,3.110,99.430,0.570,56.20,224,1.000,bicubic
vit_base_patch16_224.augreg_in21k_ft_in1k,96.880,3.120,99.530,0.470,86.57,224,0.900,bicubic
xcit_small_24_p8_224.fb_dist_in1k,96.870,3.130,99.480,0.520,47.63,224,1.000,bicubic
regnety_1280.seer_ft_in1k,96.860,3.140,99.390,0.610,644.81,384,1.000,bicubic
convnextv2_tiny.fcmae_ft_in22k_in1k,96.850,3.150,99.460,0.540,28.64,288,1.000,bicubic
regnety_640.seer_ft_in1k,96.850,3.150,99.420,0.580,281.38,384,1.000,bicubic
rexnetr_300.sw_in12k_ft_in1k,96.840,3.160,99.510,0.490,34.81,288,1.000,bicubic
resnetv2_152x2_bit.goog_teacher_in21k_ft_in1k_384,96.830,3.170,99.450,0.550,236.34,384,1.000,bicubic
convnext_base.fb_in1k,96.830,3.170,99.410,0.590,88.59,288,1.000,bicubic
regnety_160.swag_lc_in1k,96.820,3.180,99.650,0.350,83.59,224,0.965,bicubic
resnext101_32x16d.fb_wsl_ig1b_ft_in1k,96.810,3.190,99.600,0.400,194.03,224,0.875,bilinear
maxxvit_rmlp_small_rw_256.sw_in1k,96.800,3.200,99.380,0.620,66.01,256,0.950,bicubic
xcit_large_24_p16_224.fb_dist_in1k,96.800,3.200,99.350,0.650,189.10,224,1.000,bicubic
vit_large_r50_s32_224.augreg_in21k_ft_in1k,96.790,3.210,99.340,0.660,328.99,224,0.900,bicubic
fastvit_ma36.apple_dist_in1k,96.780,3.220,99.330,0.670,44.07,256,0.950,bicubic
seresnet152d.ra2_in1k,96.770,3.230,99.440,0.560,66.84,320,1.000,bicubic
seresnext101_32x8d.ah_in1k,96.770,3.230,99.350,0.650,93.57,288,1.000,bicubic
mvitv2_base.fb_in1k,96.760,3.240,99.260,0.740,51.47,224,0.900,bicubic
resnetrs350.tf_in1k,96.750,3.250,99.370,0.630,163.96,384,1.000,bicubic
swinv2_base_window16_256.ms_in1k,96.750,3.250,99.350,0.650,87.92,256,0.900,bicubic
flexivit_base.1200ep_in1k,96.740,3.260,99.360,0.640,86.59,240,0.950,bicubic
edgenext_base.in21k_ft_in1k,96.730,3.270,99.420,0.580,18.51,320,1.000,bicubic
tf_efficientnetv2_s.in21k_ft_in1k,96.730,3.270,99.420,0.580,21.46,384,1.000,bicubic
resnet200d.ra2_in1k,96.730,3.270,99.330,0.670,64.69,320,1.000,bicubic
vit_base_patch16_384.orig_in21k_ft_in1k,96.720,3.280,99.500,0.500,86.86,384,1.000,bicubic
regnetz_040.ra3_in1k,96.720,3.280,99.480,0.520,27.12,320,1.000,bicubic
resnetv2_50x3_bit.goog_in21k_ft_in1k,96.710,3.290,99.550,0.450,217.32,448,1.000,bilinear
caformer_s18.sail_in22k_ft_in1k,96.710,3.290,99.490,0.510,26.34,224,1.000,bicubic
regnetz_040_h.ra3_in1k,96.700,3.300,99.500,0.500,28.94,320,1.000,bicubic
vit_small_patch16_384.augreg_in21k_ft_in1k,96.700,3.300,99.480,0.520,22.20,384,1.000,bicubic
edgenext_base.usi_in1k,96.700,3.300,99.430,0.570,18.51,320,1.000,bicubic
xcit_small_12_p8_224.fb_dist_in1k,96.700,3.300,99.380,0.620,26.21,224,1.000,bicubic
resnetrs200.tf_in1k,96.700,3.300,99.370,0.630,93.21,320,1.000,bicubic
resnetaa101d.sw_in12k_ft_in1k,96.700,3.300,99.360,0.640,44.57,288,1.000,bicubic
seresnext101d_32x8d.ah_in1k,96.700,3.300,99.360,0.640,93.59,288,1.000,bicubic
eca_nfnet_l1.ra2_in1k,96.700,3.300,99.290,0.710,41.41,320,1.000,bicubic
repvgg_d2se.rvgg_in1k,96.690,3.310,99.370,0.630,133.33,320,1.000,bilinear
caformer_s36.sail_in1k,96.690,3.310,99.360,0.640,39.30,224,1.000,bicubic
maxvit_small_tf_224.in1k,96.690,3.310,99.360,0.640,68.93,224,0.950,bicubic
resnetrs270.tf_in1k,96.690,3.310,99.350,0.650,129.86,352,1.000,bicubic
vit_small_r26_s32_384.augreg_in21k_ft_in1k,96.680,3.320,99.580,0.420,36.47,384,1.000,bicubic
tf_efficientnet_b5.ap_in1k,96.680,3.320,99.460,0.540,30.39,456,0.934,bicubic
convformer_m36.sail_in1k,96.680,3.320,99.080,0.920,57.05,224,1.000,bicubic
vit_medium_patch16_gap_256.sw_in12k_ft_in1k,96.670,3.330,99.490,0.510,38.86,256,0.950,bicubic
tf_efficientnet_b6.aa_in1k,96.670,3.330,99.370,0.630,43.04,528,0.942,bicubic
deit3_small_patch16_224.fb_in22k_ft_in1k,96.660,3.340,99.330,0.670,22.06,224,1.000,bicubic
flexivit_base.600ep_in1k,96.650,3.350,99.330,0.670,86.59,240,0.950,bicubic
davit_small.msft_in1k,96.630,3.370,99.350,0.650,49.75,224,0.950,bicubic
convformer_s18.sail_in22k_ft_in1k,96.630,3.370,99.340,0.660,26.77,224,1.000,bicubic
efficientvit_b3.r288_in1k,96.630,3.370,99.220,0.780,48.65,288,1.000,bicubic
coatnet_rmlp_1_rw2_224.sw_in12k_ft_in1k,96.630,3.370,99.160,0.840,41.72,224,0.950,bicubic
resmlp_big_24_224.fb_in22k_ft_in1k,96.620,3.380,99.510,0.490,129.14,224,0.875,bicubic
regnetz_d8.ra3_in1k,96.620,3.380,99.450,0.550,23.37,320,1.000,bicubic
resnest200e.in1k,96.610,3.390,99.350,0.650,70.20,320,0.909,bicubic
resnext101_32x16d.fb_swsl_ig1b_ft_in1k,96.600,3.400,99.530,0.470,194.03,224,0.875,bilinear
flexivit_base.300ep_in1k,96.600,3.400,99.270,0.730,86.59,240,0.950,bicubic
xcit_medium_24_p16_224.fb_dist_in1k,96.600,3.400,99.270,0.730,84.40,224,1.000,bicubic
regnetz_d32.ra3_in1k,96.590,3.410,99.380,0.620,27.58,320,0.950,bicubic
swin_base_patch4_window12_384.ms_in1k,96.580,3.420,99.250,0.750,87.90,384,1.000,bicubic
resnetrs152.tf_in1k,96.580,3.420,99.240,0.760,86.62,320,1.000,bicubic
convformer_s36.sail_in1k,96.580,3.420,99.170,0.830,40.01,224,1.000,bicubic
maxvit_rmlp_small_rw_224.sw_in1k,96.580,3.420,99.120,0.880,64.90,224,0.900,bicubic
regnetz_d8_evos.ch_in1k,96.570,3.430,99.460,0.540,23.46,320,1.000,bicubic
gcvit_base.in1k,96.560,3.440,99.230,0.770,90.32,224,0.875,bicubic
focalnet_base_srf.ms_in1k,96.560,3.440,99.150,0.850,88.15,224,0.900,bicubic
inception_next_base.sail_in1k,96.560,3.440,99.080,0.920,86.67,224,0.950,bicubic
cait_xs24_384.fb_dist_in1k,96.540,3.460,99.420,0.580,26.67,384,1.000,bicubic
efficientnetv2_rw_s.ra2_in1k,96.540,3.460,99.360,0.640,23.94,384,1.000,bicubic
tf_efficientnet_b7.aa_in1k,96.540,3.460,99.300,0.700,66.35,600,0.949,bicubic
crossvit_18_dagger_408.in1k,96.540,3.460,99.260,0.740,44.61,408,1.000,bicubic
coatnet_rmlp_2_rw_224.sw_in1k,96.540,3.460,99.100,0.900,73.88,224,0.950,bicubic
regnety_080.ra3_in1k,96.530,3.470,99.320,0.680,39.18,288,1.000,bicubic
xcit_tiny_24_p8_384.fb_dist_in1k,96.530,3.470,99.320,0.680,12.11,384,1.000,bicubic
swinv2_base_window8_256.ms_in1k,96.530,3.470,99.270,0.730,87.92,256,0.900,bicubic
convnext_small.fb_in1k,96.520,3.480,99.340,0.660,50.22,288,1.000,bicubic
resnest269e.in1k,96.510,3.490,99.350,0.650,110.93,416,0.928,bicubic
vit_base_patch32_384.augreg_in21k_ft_in1k,96.490,3.510,99.410,0.590,88.30,384,1.000,bicubic
swin_small_patch4_window7_224.ms_in22k_ft_in1k,96.480,3.520,99.390,0.610,49.61,224,0.900,bicubic
fastvit_ma36.apple_in1k,96.470,3.530,99.280,0.720,44.07,256,0.950,bicubic
tf_efficientnet_b5.aa_in1k,96.470,3.530,99.240,0.760,30.39,456,0.934,bicubic
swinv2_small_window16_256.ms_in1k,96.470,3.530,99.200,0.800,49.73,256,0.900,bicubic
coat_lite_medium.in1k,96.470,3.530,99.150,0.850,44.57,224,0.900,bicubic
cs3se_edgenet_x.c2ns_in1k,96.450,3.550,99.400,0.600,50.72,320,1.000,bicubic
resmlp_big_24_224.fb_distilled_in1k,96.450,3.550,99.310,0.690,129.14,224,0.875,bicubic
vit_base_patch16_224_miil.in21k_ft_in1k,96.450,3.550,99.300,0.700,86.54,224,0.875,bilinear
focalnet_base_lrf.ms_in1k,96.450,3.550,99.120,0.880,88.75,224,0.900,bicubic
resnext101_32x4d.fb_swsl_ig1b_ft_in1k,96.420,3.580,99.470,0.530,44.18,224,0.875,bilinear
maxvit_rmlp_tiny_rw_256.sw_in1k,96.420,3.580,99.380,0.620,29.15,256,0.950,bicubic
cait_s24_224.fb_dist_in1k,96.420,3.580,99.150,0.850,46.92,224,1.000,bicubic
regnetv_064.ra3_in1k,96.410,3.590,99.360,0.640,30.58,288,1.000,bicubic
xcit_small_24_p8_224.fb_in1k,96.410,3.590,99.150,0.850,47.63,224,1.000,bicubic
xcit_large_24_p8_224.fb_in1k,96.400,3.600,98.990,1.010,188.93,224,1.000,bicubic
resnet152d.ra2_in1k,96.390,3.610,99.390,0.610,60.21,320,1.000,bicubic
tf_efficientnet_b3.ns_jft_in1k,96.390,3.610,99.350,0.650,12.23,300,0.904,bicubic
crossvit_15_dagger_408.in1k,96.390,3.610,99.160,0.840,28.50,408,1.000,bicubic
convnextv2_nano.fcmae_ft_in22k_in1k_384,96.370,3.630,99.400,0.600,15.62,384,1.000,bicubic
mvitv2_small.fb_in1k,96.370,3.630,99.200,0.800,34.87,224,0.900,bicubic
xception65.ra3_in1k,96.360,3.640,99.240,0.760,39.92,299,0.940,bicubic
fastvit_sa36.apple_dist_in1k,96.360,3.640,99.230,0.770,31.53,256,0.900,bicubic
regnety_064.ra3_in1k,96.360,3.640,99.230,0.770,30.58,288,1.000,bicubic
pvt_v2_b5.in1k,96.360,3.640,99.170,0.830,81.96,224,0.900,bicubic
regnety_160.deit_in1k,96.350,3.650,99.330,0.670,83.59,288,1.000,bicubic
pvt_v2_b4.in1k,96.350,3.650,99.180,0.820,62.56,224,0.900,bicubic
regnety_320.seer_ft_in1k,96.340,3.660,99.350,0.650,145.05,384,1.000,bicubic
tf_efficientnet_b5.ra_in1k,96.340,3.660,99.310,0.690,30.39,456,0.934,bicubic
tf_efficientnetv2_s.in1k,96.340,3.660,99.200,0.800,21.46,384,1.000,bicubic
resnext101_32x8d.fb_wsl_ig1b_ft_in1k,96.330,3.670,99.430,0.570,88.79,224,0.875,bilinear
repvit_m2_3.dist_450e_in1k,96.330,3.670,99.400,0.600,23.69,224,0.950,bicubic
volo_d1_224.sail_in1k,96.320,3.680,99.310,0.690,26.63,224,0.960,bicubic
dm_nfnet_f0.dm_in1k,96.310,3.690,99.320,0.680,71.49,256,0.900,bicubic
deit3_base_patch16_224.fb_in1k,96.300,3.700,99.180,0.820,86.59,224,0.900,bicubic
resnet101d.ra2_in1k,96.290,3.710,99.230,0.770,44.57,320,1.000,bicubic
tiny_vit_11m_224.dist_in22k_ft_in1k,96.290,3.710,99.190,0.810,11.00,224,0.950,bicubic
efficientvit_b3.r256_in1k,96.290,3.710,99.120,0.880,48.65,256,1.000,bicubic
gcvit_small.in1k,96.280,3.720,99.140,0.860,51.09,224,0.875,bicubic
swinv2_small_window8_256.ms_in1k,96.270,3.730,99.210,0.790,49.73,256,0.900,bicubic
inception_next_small.sail_in1k,96.240,3.760,99.220,0.780,49.37,224,0.875,bicubic
nest_base_jx.goog_in1k,96.240,3.760,99.200,0.800,67.72,224,0.875,bicubic
fastvit_sa36.apple_in1k,96.240,3.760,99.190,0.810,31.53,256,0.900,bicubic
twins_svt_large.in1k,96.240,3.760,99.170,0.830,99.27,224,0.900,bicubic
swin_s3_base_224.ms_in1k,96.240,3.760,99.150,0.850,71.13,224,0.900,bicubic
maxvit_tiny_rw_224.sw_in1k,96.240,3.760,99.130,0.870,29.06,224,0.950,bicubic
pit_b_distilled_224.in1k,96.230,3.770,99.110,0.890,74.79,224,0.900,bicubic
swin_s3_small_224.ms_in1k,96.230,3.770,99.080,0.920,49.74,224,0.900,bicubic
ecaresnet101d.miil_in1k,96.220,3.780,99.310,0.690,44.57,288,0.950,bicubic
tf_efficientnetv2_b3.in21k_ft_in1k,96.220,3.780,99.230,0.770,14.36,300,0.900,bicubic
xcit_small_24_p16_224.fb_dist_in1k,96.220,3.780,99.210,0.790,47.67,224,1.000,bicubic
xception65p.ra3_in1k,96.210,3.790,99.180,0.820,39.82,299,0.940,bicubic
deit3_small_patch16_384.fb_in1k,96.200,3.800,99.290,0.710,22.21,384,1.000,bicubic
rexnetr_200.sw_in12k_ft_in1k,96.200,3.800,99.260,0.740,16.52,288,1.000,bicubic
resnet152.a1h_in1k,96.200,3.800,99.220,0.780,60.19,288,1.000,bicubic
regnetv_040.ra3_in1k,96.190,3.810,99.330,0.670,20.64,288,1.000,bicubic
convnextv2_tiny.fcmae_ft_in1k,96.190,3.810,99.250,0.750,28.64,288,1.000,bicubic
gcvit_tiny.in1k,96.180,3.820,99.230,0.770,28.22,224,0.875,bicubic
focalnet_small_lrf.ms_in1k,96.180,3.820,99.190,0.810,50.34,224,0.900,bicubic
swinv2_cr_small_ns_224.sw_in1k,96.180,3.820,99.140,0.860,49.70,224,0.900,bicubic
mobilevitv2_175.cvnets_in22k_ft_in1k_384,96.180,3.820,99.120,0.880,14.25,384,1.000,bicubic
tf_efficientnet_b4.ap_in1k,96.160,3.840,99.270,0.730,19.34,380,0.922,bicubic
tresnet_v2_l.miil_in21k_ft_in1k,96.160,3.840,99.240,0.760,46.17,224,0.875,bilinear
deit_base_patch16_384.fb_in1k,96.160,3.840,99.140,0.860,86.86,384,1.000,bicubic
twins_svt_base.in1k,96.160,3.840,99.050,0.950,56.07,224,0.900,bicubic
fastvit_sa24.apple_dist_in1k,96.150,3.850,99.210,0.790,21.55,256,0.900,bicubic
efficientnet_b4.ra2_in1k,96.150,3.850,99.190,0.810,19.34,384,1.000,bicubic
sequencer2d_l.in1k,96.150,3.850,99.160,0.840,54.30,224,0.875,bicubic
regnetz_c16_evos.ch_in1k,96.140,3.860,99.360,0.640,13.49,320,0.950,bicubic
twins_pcpvt_large.in1k,96.140,3.860,99.170,0.830,60.99,224,0.900,bicubic
caformer_s18.sail_in1k,96.140,3.860,99.000,1.000,26.34,224,1.000,bicubic
vit_base_patch32_clip_224.laion2b_ft_in12k_in1k,96.130,3.870,99.220,0.780,88.22,224,0.900,bicubic
tiny_vit_21m_224.in1k,96.130,3.870,99.160,0.840,21.20,224,0.950,bicubic
repvit_m2_3.dist_300e_in1k,96.120,3.880,99.340,0.660,23.69,224,0.950,bicubic
resnetv2_152x2_bit.goog_teacher_in21k_ft_in1k,96.120,3.880,99.270,0.730,236.34,224,0.875,bicubic
nfnet_l0.ra2_in1k,96.120,3.880,99.240,0.760,35.07,288,1.000,bicubic
swin_base_patch4_window7_224.ms_in1k,96.120,3.880,99.060,0.940,87.77,224,0.900,bicubic
resnetv2_50x1_bit.goog_distilled_in1k,96.110,3.890,99.280,0.720,25.55,224,0.875,bicubic
efficientformer_l7.snap_dist_in1k,96.110,3.890,99.270,0.730,82.23,224,0.950,bicubic
xcit_small_12_p8_224.fb_in1k,96.110,3.890,99.160,0.840,26.21,224,1.000,bicubic
resnetv2_101x1_bit.goog_in21k_ft_in1k,96.100,3.900,99.280,0.720,44.54,448,1.000,bilinear
maxvit_tiny_tf_224.in1k,96.100,3.900,99.270,0.730,30.92,224,0.950,bicubic
deit_base_distilled_patch16_224.fb_in1k,96.090,3.910,99.190,0.810,87.34,224,0.900,bicubic
xcit_medium_24_p8_224.fb_in1k,96.090,3.910,98.890,1.110,84.32,224,1.000,bicubic
resnext101_64x4d.c1_in1k,96.080,3.920,99.240,0.760,83.46,288,1.000,bicubic
regnety_320.tv2_in1k,96.080,3.920,99.230,0.770,145.05,224,0.965,bicubic
deit3_medium_patch16_224.fb_in1k,96.080,3.920,99.200,0.800,38.85,224,0.900,bicubic
xcit_tiny_12_p8_384.fb_dist_in1k,96.080,3.920,99.140,0.860,6.71,384,1.000,bicubic
swinv2_cr_small_224.sw_in1k,96.080,3.920,98.860,1.140,49.70,224,0.900,bicubic
tf_efficientnet_b5.in1k,96.070,3.930,99.290,0.710,30.39,456,0.934,bicubic
efficientformerv2_l.snap_dist_in1k,96.070,3.930,99.190,0.810,26.32,224,0.950,bicubic
focalnet_small_srf.ms_in1k,96.070,3.930,99.120,0.880,49.89,224,0.900,bicubic
convnextv2_nano.fcmae_ft_in22k_in1k,96.060,3.940,99.220,0.780,15.62,288,1.000,bicubic
mobilevitv2_200.cvnets_in22k_ft_in1k_384,96.060,3.940,99.080,0.920,18.45,384,1.000,bicubic
resnetv2_101.a1h_in1k,96.050,3.950,99.170,0.830,44.54,288,1.000,bicubic
cs3edgenet_x.c2_in1k,96.050,3.950,99.140,0.860,47.82,288,1.000,bicubic
maxxvit_rmlp_nano_rw_256.sw_in1k,96.040,3.960,99.260,0.740,16.78,256,0.950,bicubic
resnext101_64x4d.tv_in1k,96.030,3.970,99.160,0.840,83.46,224,0.875,bilinear
xcit_small_12_p16_224.fb_dist_in1k,96.030,3.970,99.130,0.870,26.25,224,1.000,bicubic
coatnet_1_rw_224.sw_in1k,96.030,3.970,99.060,0.940,41.72,224,0.950,bicubic
efficientvit_b3.r224_in1k,96.030,3.970,98.990,1.010,48.65,224,0.950,bicubic
regnety_040.ra3_in1k,96.020,3.980,99.190,0.810,20.65,288,1.000,bicubic
resnet101.a1h_in1k,96.020,3.980,99.140,0.860,44.55,288,1.000,bicubic
cs3sedarknet_x.c2ns_in1k,96.020,3.980,99.110,0.890,35.40,288,1.000,bicubic
convnext_tiny_hnf.a2h_in1k,96.020,3.980,99.070,0.930,28.59,288,1.000,bicubic
hrnet_w18_ssld.paddle_in1k,95.990,4.010,99.320,0.680,21.30,288,1.000,bilinear
convnext_nano.in12k_ft_in1k,95.990,4.010,99.310,0.690,15.59,288,1.000,bicubic
pvt_v2_b3.in1k,95.990,4.010,99.190,0.810,45.24,224,0.900,bicubic
regnetx_320.tv2_in1k,95.990,4.010,99.100,0.900,107.81,224,0.965,bicubic
tresnet_xl.miil_in1k_448,95.980,4.020,99.130,0.870,78.44,448,0.875,bilinear
sequencer2d_s.in1k,95.980,4.020,99.050,0.950,27.65,224,0.875,bicubic
regnety_160.tv2_in1k,95.970,4.030,99.150,0.850,83.59,224,0.965,bicubic
nest_small_jx.goog_in1k,95.970,4.030,99.030,0.970,38.35,224,0.875,bicubic
maxvit_rmlp_nano_rw_256.sw_in1k,95.970,4.030,98.970,1.030,15.50,256,0.950,bicubic
efficientvit_b2.r288_in1k,95.960,4.040,99.190,0.810,24.33,288,1.000,bicubic
regnety_032.ra_in1k,95.960,4.040,99.190,0.810,19.44,288,1.000,bicubic
coatnet_rmlp_1_rw_224.sw_in1k,95.960,4.040,99.160,0.840,41.69,224,0.950,bicubic
resnext101_32x8d.tv2_in1k,95.950,4.050,99.080,0.920,88.79,224,0.965,bilinear
convformer_s18.sail_in1k,95.950,4.050,98.900,1.100,26.77,224,1.000,bicubic
xcit_tiny_24_p16_384.fb_dist_in1k,95.940,4.060,99.220,0.780,12.12,384,1.000,bicubic
eca_nfnet_l0.ra2_in1k,95.940,4.060,99.210,0.790,24.14,288,1.000,bicubic
regnetz_c16.ra3_in1k,95.940,4.060,99.110,0.890,13.46,320,1.000,bicubic
swinv2_tiny_window16_256.ms_in1k,95.930,4.070,99.150,0.850,28.35,256,0.900,bicubic
swin_small_patch4_window7_224.ms_in1k,95.930,4.070,99.020,0.980,49.61,224,0.900,bicubic
fastvit_sa24.apple_in1k,95.920,4.080,99.160,0.840,21.55,256,0.900,bicubic
maxvit_nano_rw_256.sw_in1k,95.920,4.080,99.010,0.990,15.45,256,0.950,bicubic
coat_small.in1k,95.910,4.090,99.150,0.850,21.69,224,0.900,bicubic
tf_efficientnet_b4.aa_in1k,95.900,4.100,99.170,0.830,19.34,380,0.922,bicubic
repvit_m1_5.dist_450e_in1k,95.900,4.100,99.120,0.880,14.64,224,0.950,bicubic
maxxvitv2_nano_rw_256.sw_in1k,95.900,4.100,99.050,0.950,23.70,256,0.950,bicubic
regnetx_160.tv2_in1k,95.880,4.120,99.090,0.910,54.28,224,0.965,bicubic
resnet51q.ra2_in1k,95.870,4.130,99.130,0.870,35.70,288,1.000,bilinear
resnext50_32x4d.fb_swsl_ig1b_ft_in1k,95.860,4.140,99.250,0.750,25.03,224,0.875,bilinear
resnest101e.in1k,95.860,4.140,99.200,0.800,48.28,256,0.875,bilinear
tresnet_l.miil_in1k_448,95.860,4.140,99.120,0.880,55.99,448,0.875,bilinear
regnety_080_tv.tv2_in1k,95.860,4.140,99.100,0.900,39.38,224,0.965,bicubic
mvitv2_tiny.fb_in1k,95.860,4.140,99.070,0.930,24.17,224,0.900,bicubic
cs3darknet_x.c2ns_in1k,95.850,4.150,99.170,0.830,35.05,288,1.000,bicubic
rexnet_300.nav_in1k,95.840,4.160,99.130,0.870,34.71,224,0.875,bicubic
resnetaa50d.sw_in12k_ft_in1k,95.830,4.170,99.170,0.830,25.58,288,1.000,bicubic
vit_large_patch32_384.orig_in21k_ft_in1k,95.830,4.170,99.150,0.850,306.63,384,1.000,bicubic
cait_xxs36_384.fb_dist_in1k,95.830,4.170,99.090,0.910,17.37,384,1.000,bicubic
tf_efficientnet_b4.in1k,95.820,4.180,99.050,0.950,19.34,380,0.922,bicubic
xcit_tiny_24_p8_224.fb_dist_in1k,95.810,4.190,99.210,0.790,12.11,224,1.000,bicubic
sequencer2d_m.in1k,95.810,4.190,99.110,0.890,38.31,224,0.875,bicubic
convnextv2_nano.fcmae_ft_in1k,95.800,4.200,99.090,0.910,15.62,288,1.000,bicubic
resnext101_32x16d.fb_ssl_yfcc100m_ft_in1k,95.790,4.210,99.180,0.820,194.03,224,0.875,bilinear
convnext_tiny.fb_in1k,95.790,4.210,99.160,0.840,28.59,288,1.000,bicubic
twins_pcpvt_base.in1k,95.790,4.210,99.130,0.870,43.83,224,0.900,bicubic
resnet61q.ra2_in1k,95.780,4.220,98.990,1.010,36.85,288,1.000,bicubic
tf_efficientnet_b2.ns_jft_in1k,95.770,4.230,99.120,0.880,9.11,260,0.890,bicubic
vit_relpos_base_patch16_clsgap_224.sw_in1k,95.770,4.230,99.040,0.960,86.43,224,0.900,bicubic
poolformerv2_m48.sail_in1k,95.770,4.230,98.980,1.020,73.35,224,1.000,bicubic
ecaresnet101d_pruned.miil_in1k,95.760,4.240,99.180,0.820,24.88,288,0.950,bicubic
seresnext50_32x4d.racm_in1k,95.740,4.260,99.180,0.820,27.56,288,0.950,bicubic
gc_efficientnetv2_rw_t.agc_in1k,95.740,4.260,99.020,0.980,13.68,288,1.000,bicubic
efficientnet_b3.ra2_in1k,95.720,4.280,99.040,0.960,12.23,320,1.000,bicubic
tresnet_m.miil_in21k_ft_in1k,95.710,4.290,99.030,0.970,31.39,224,0.875,bilinear
pnasnet5large.tf_in1k,95.710,4.290,98.920,1.080,86.06,331,0.911,bicubic
coatnet_bn_0_rw_224.sw_in1k,95.700,4.300,99.050,0.950,27.44,224,0.950,bicubic
mobilevitv2_150.cvnets_in22k_ft_in1k_384,95.690,4.310,99.140,0.860,10.59,384,1.000,bicubic
nasnetalarge.tf_in1k,95.690,4.310,98.930,1.070,88.75,331,0.911,bicubic
crossvit_15_dagger_240.in1k,95.680,4.320,98.830,1.170,28.21,240,0.875,bicubic
flexivit_small.600ep_in1k,95.670,4.330,99.060,0.940,22.06,240,0.950,bicubic
xcit_tiny_24_p8_224.fb_in1k,95.660,4.340,99.050,0.950,12.11,224,1.000,bicubic
davit_tiny.msft_in1k,95.660,4.340,99.030,0.970,28.36,224,0.950,bicubic
efficientvit_b2.r256_in1k,95.650,4.350,99.060,0.940,24.33,256,1.000,bicubic
repvit_m1_5.dist_300e_in1k,95.640,4.360,98.990,1.010,14.64,224,0.950,bicubic
wide_resnet50_2.racm_in1k,95.630,4.370,99.240,0.760,68.88,288,0.950,bicubic
vit_small_r26_s32_224.augreg_in21k_ft_in1k,95.630,4.370,99.190,0.810,36.43,224,0.900,bicubic
resnetv2_50d_evos.ah_in1k,95.630,4.370,99.110,0.890,25.59,288,1.000,bicubic
poolformer_m48.sail_in1k,95.630,4.370,98.940,1.060,73.47,224,0.950,bicubic
pit_b_224.in1k,95.630,4.370,98.660,1.340,73.76,224,0.900,bicubic
efficientformer_l3.snap_dist_in1k,95.590,4.410,99.160,0.840,31.41,224,0.950,bicubic
efficientnetv2_rw_t.ra2_in1k,95.590,4.410,99.070,0.930,13.65,288,1.000,bicubic
gcvit_xtiny.in1k,95.590,4.410,99.040,0.960,19.98,224,0.875,bicubic
crossvit_18_dagger_240.in1k,95.570,4.430,99.060,0.940,44.27,240,0.875,bicubic
vit_relpos_base_patch16_224.sw_in1k,95.560,4.440,99.030,0.970,86.43,224,0.900,bicubic
pvt_v2_b2_li.in1k,95.560,4.440,98.990,1.010,22.55,224,0.900,bicubic
flexivit_small.1200ep_in1k,95.550,4.450,99.110,0.890,22.06,240,0.950,bicubic
convit_base.fb_in1k,95.550,4.450,98.880,1.120,86.54,224,0.875,bicubic
wide_resnet101_2.tv2_in1k,95.540,4.460,99.080,0.920,126.89,224,0.965,bilinear
coat_lite_small.in1k,95.540,4.460,98.860,1.140,19.84,224,0.900,bicubic
xcit_small_24_p16_224.fb_in1k,95.540,4.460,98.780,1.220,47.67,224,1.000,bicubic
xcit_medium_24_p16_224.fb_in1k,95.540,4.460,98.720,1.280,84.40,224,1.000,bicubic
levit_384.fb_dist_in1k,95.530,4.470,99.060,0.940,39.13,224,0.900,bicubic
levit_conv_384.fb_dist_in1k,95.530,4.470,99.050,0.950,39.13,224,0.900,bicubic
ecaresnet50t.ra2_in1k,95.520,4.480,99.110,0.890,25.57,320,0.950,bicubic
fbnetv3_g.ra2_in1k,95.520,4.480,98.990,1.010,16.62,288,0.950,bilinear
vit_base_patch32_clip_224.laion2b_ft_in1k,95.520,4.480,98.870,1.130,88.22,224,0.900,bicubic
resnet101.a1_in1k,95.520,4.480,98.850,1.150,44.55,288,1.000,bicubic
crossvit_base_240.in1k,95.520,4.480,98.810,1.190,105.03,240,0.875,bicubic
swin_tiny_patch4_window7_224.ms_in22k_ft_in1k,95.510,4.490,99.200,0.800,28.29,224,0.900,bicubic
resnext101_32x8d.fb_ssl_yfcc100m_ft_in1k,95.510,4.490,99.120,0.880,88.79,224,0.875,bilinear
xception41p.ra3_in1k,95.510,4.490,98.910,1.090,26.91,299,0.940,bicubic
focalnet_tiny_srf.ms_in1k,95.500,4.500,99.130,0.870,28.43,224,0.900,bicubic
vit_relpos_medium_patch16_rpn_224.sw_in1k,95.500,4.500,99.080,0.920,38.73,224,0.900,bicubic
resnet152.tv2_in1k,95.500,4.500,98.960,1.040,60.19,224,0.965,bilinear
resnet152.a1_in1k,95.500,4.500,98.780,1.220,60.19,288,1.000,bicubic
swinv2_tiny_window8_256.ms_in1k,95.490,4.510,99.100,0.900,28.35,256,0.900,bicubic
tiny_vit_11m_224.in1k,95.490,4.510,98.990,1.010,11.00,224,0.950,bicubic
flexivit_small.300ep_in1k,95.490,4.510,98.960,1.040,22.06,240,0.950,bicubic
resnet152.a2_in1k,95.490,4.510,98.790,1.210,60.19,288,1.000,bicubic
pvt_v2_b2.in1k,95.480,4.520,99.000,1.000,25.36,224,0.900,bicubic
resnetv2_50d_gn.ah_in1k,95.480,4.520,98.950,1.050,25.57,288,1.000,bicubic
visformer_small.in1k,95.480,4.520,98.900,1.100,40.22,224,0.900,bicubic
vit_relpos_medium_patch16_cls_224.sw_in1k,95.470,4.530,98.950,1.050,38.76,224,0.900,bicubic
inception_next_tiny.sail_in1k,95.460,4.540,99.010,0.990,28.06,224,0.875,bicubic
vit_relpos_medium_patch16_224.sw_in1k,95.460,4.540,98.960,1.040,38.75,224,0.900,bicubic
focalnet_tiny_lrf.ms_in1k,95.460,4.540,98.910,1.090,28.65,224,0.900,bicubic
ecaresnet50d.miil_in1k,95.450,4.550,99.090,0.910,25.58,288,0.950,bicubic
deit_base_patch16_224.fb_in1k,95.450,4.550,98.840,1.160,86.57,224,0.900,bicubic
resnext50_32x4d.a1h_in1k,95.450,4.550,98.840,1.160,25.03,288,1.000,bicubic
resnext101_32x4d.fb_ssl_yfcc100m_ft_in1k,95.440,4.560,99.120,0.880,44.18,224,0.875,bilinear
tresnet_xl.miil_in1k,95.440,4.560,99.050,0.950,78.44,224,0.875,bilinear
crossvit_18_240.in1k,95.440,4.560,98.790,1.210,43.27,240,0.875,bicubic
coatnet_0_rw_224.sw_in1k,95.440,4.560,98.720,1.280,27.44,224,0.950,bicubic
resnetrs101.tf_in1k,95.430,4.570,99.040,0.960,63.62,288,0.940,bicubic
coatnext_nano_rw_224.sw_in1k,95.430,4.570,99.010,0.990,14.70,224,0.900,bicubic
coatnet_rmlp_nano_rw_224.sw_in1k,95.430,4.570,98.990,1.010,15.15,224,0.900,bicubic
xcit_small_12_p16_224.fb_in1k,95.430,4.570,98.840,1.160,26.25,224,1.000,bicubic
xcit_large_24_p16_224.fb_in1k,95.430,4.570,98.630,1.370,189.10,224,1.000,bicubic
halo2botnet50ts_256.a1h_in1k,95.420,4.580,99.020,0.980,22.64,256,0.950,bicubic
ecaresnet50t.a1_in1k,95.410,4.590,99.010,0.990,25.57,288,1.000,bicubic
resnet101.a2_in1k,95.410,4.590,98.940,1.060,44.55,288,1.000,bicubic
resnet50.fb_swsl_ig1b_ft_in1k,95.400,4.600,99.300,0.700,25.56,224,0.875,bilinear
edgenext_small.usi_in1k,95.400,4.600,99.100,0.900,5.59,320,1.000,bicubic
poolformerv2_m36.sail_in1k,95.400,4.600,98.870,1.130,56.08,224,1.000,bicubic
vit_base_patch16_rpn_224.sw_in1k,95.390,4.610,98.940,1.060,86.54,224,0.900,bicubic
poolformer_m36.sail_in1k,95.390,4.610,98.860,1.140,56.17,224,0.950,bicubic
vit_small_patch16_224.augreg_in21k_ft_in1k,95.370,4.630,99.140,0.860,22.05,224,0.900,bicubic
swinv2_cr_tiny_ns_224.sw_in1k,95.370,4.630,98.940,1.060,28.33,224,0.900,bicubic
efficientformerv2_s2.snap_dist_in1k,95.360,4.640,98.930,1.070,12.71,224,0.950,bicubic
ecaresnet50t.a2_in1k,95.350,4.650,98.920,1.080,25.57,288,1.000,bicubic
convnext_nano.d1h_in1k,95.350,4.650,98.860,1.140,15.59,288,1.000,bicubic
vit_base_patch16_224.orig_in21k_ft_in1k,95.340,4.660,99.000,1.000,86.57,224,0.900,bicubic
seresnet50.ra2_in1k,95.330,4.670,99.010,0.990,28.09,288,0.950,bicubic
tf_efficientnet_b3.ap_in1k,95.320,4.680,98.900,1.100,12.23,300,0.904,bicubic
cs3sedarknet_l.c2ns_in1k,95.310,4.690,99.130,0.870,21.91,288,0.950,bicubic
poolformerv2_s36.sail_in1k,95.310,4.690,98.920,1.080,30.79,224,1.000,bicubic
regnety_032.tv2_in1k,95.310,4.690,98.910,1.090,19.44,224,0.965,bicubic
mixer_b16_224.miil_in21k_ft_in1k,95.300,4.700,98.880,1.120,59.88,224,0.875,bilinear
ecaresnetlight.miil_in1k,95.290,4.710,99.030,0.970,30.16,288,0.950,bicubic
vit_small_patch16_384.augreg_in1k,95.290,4.710,99.000,1.000,22.20,384,1.000,bicubic
tresnet_l.miil_in1k,95.280,4.720,99.010,0.990,55.99,224,0.875,bilinear
cait_xxs24_384.fb_dist_in1k,95.280,4.720,98.960,1.040,12.03,384,1.000,bicubic
resnet101.tv2_in1k,95.280,4.720,98.910,1.090,44.55,224,0.965,bilinear
resnet50_gn.a1h_in1k,95.250,4.750,99.000,1.000,25.56,288,0.950,bicubic
vit_srelpos_medium_patch16_224.sw_in1k,95.240,4.760,98.990,1.010,38.74,224,0.900,bicubic
nest_tiny_jx.goog_in1k,95.240,4.760,98.980,1.020,17.06,224,0.875,bicubic
gcresnet50t.ra2_in1k,95.240,4.760,98.910,1.090,25.90,288,1.000,bicubic
coatnet_nano_rw_224.sw_in1k,95.240,4.760,98.870,1.130,15.14,224,0.900,bicubic
convnextv2_pico.fcmae_ft_in1k,95.230,4.770,98.920,1.080,9.07,288,0.950,bicubic
mobilevitv2_175.cvnets_in22k_ft_in1k,95.220,4.780,98.800,1.200,14.25,256,0.888,bicubic
convit_small.fb_in1k,95.210,4.790,98.900,1.100,27.78,224,0.875,bicubic
twins_pcpvt_small.in1k,95.210,4.790,98.880,1.120,24.11,224,0.900,bicubic
repvit_m3.dist_in1k,95.200,4.800,99.090,0.910,10.68,224,0.950,bicubic
resnetaa50.a1h_in1k,95.200,4.800,98.920,1.080,25.56,288,1.000,bicubic
efficientvit_b2.r224_in1k,95.200,4.800,98.820,1.180,24.33,224,0.950,bicubic
twins_svt_small.in1k,95.190,4.810,98.880,1.120,24.06,224,0.900,bicubic
swin_s3_tiny_224.ms_in1k,95.180,4.820,98.950,1.050,28.33,224,0.900,bicubic
regnetz_b16.ra3_in1k,95.170,4.830,99.080,0.920,9.72,288,1.000,bicubic
tf_efficientnet_b1.ns_jft_in1k,95.160,4.840,99.100,0.900,7.79,240,0.882,bicubic
mobilevitv2_200.cvnets_in22k_ft_in1k,95.160,4.840,98.950,1.050,18.45,256,0.888,bicubic
tf_efficientnetv2_b3.in1k,95.160,4.840,98.820,1.180,14.36,300,0.904,bicubic
cs3darknet_focus_l.c2ns_in1k,95.150,4.850,98.960,1.040,21.15,288,0.950,bicubic
vit_relpos_small_patch16_224.sw_in1k,95.150,4.850,98.960,1.040,21.98,224,0.900,bicubic
crossvit_15_240.in1k,95.150,4.850,98.930,1.070,27.53,240,0.875,bicubic
lamhalobotnet50ts_256.a1h_in1k,95.150,4.850,98.880,1.120,22.57,256,0.950,bicubic
fastvit_sa12.apple_dist_in1k,95.150,4.850,98.810,1.190,11.58,256,0.900,bicubic
pit_s_distilled_224.in1k,95.140,4.860,98.890,1.110,24.04,224,0.900,bicubic
mobilevitv2_150.cvnets_in22k_ft_in1k,95.140,4.860,98.860,1.140,10.59,256,0.888,bicubic
swin_tiny_patch4_window7_224.ms_in1k,95.140,4.860,98.850,1.150,28.29,224,0.900,bicubic
halonet50ts.a1h_in1k,95.140,4.860,98.780,1.220,22.73,256,0.940,bicubic
convnext_nano_ols.d1h_in1k,95.140,4.860,98.730,1.270,15.65,288,1.000,bicubic
xcit_tiny_12_p16_384.fb_dist_in1k,95.130,4.870,99.020,0.980,6.72,384,1.000,bicubic
cs3darknet_l.c2ns_in1k,95.120,4.880,98.980,1.020,21.16,288,0.950,bicubic
efficientnet_el.ra_in1k,95.120,4.880,98.970,1.030,10.59,300,0.904,bicubic
vit_base_patch32_clip_224.openai_ft_in1k,95.110,4.890,98.980,1.020,88.22,224,0.900,bicubic
ecaresnet50d_pruned.miil_in1k,95.110,4.890,98.930,1.070,19.94,288,0.950,bicubic
gernet_l.idstcv_in1k,95.110,4.890,98.900,1.100,31.08,256,0.875,bilinear
regnetx_080.tv2_in1k,95.100,4.900,98.830,1.170,39.57,224,0.965,bicubic
xcit_tiny_12_p8_224.fb_dist_in1k,95.090,4.910,98.910,1.090,6.71,224,1.000,bicubic
convmixer_1536_20.in1k,95.080,4.920,99.030,0.970,51.63,224,0.960,bicubic
poolformer_s36.sail_in1k,95.080,4.920,98.910,1.090,30.86,224,0.900,bicubic
legacy_senet154.in1k,95.070,4.930,98.830,1.170,115.09,224,0.875,bilinear
tiny_vit_5m_224.dist_in22k_ft_in1k,95.050,4.950,98.970,1.030,5.39,224,0.950,bicubic
seresnet33ts.ra2_in1k,95.040,4.960,98.900,1.100,19.78,288,1.000,bicubic
tnt_s_patch16_224,95.040,4.960,98.830,1.170,23.76,224,0.900,bicubic
vit_srelpos_small_patch16_224.sw_in1k,95.030,4.970,98.950,1.050,21.97,224,0.900,bicubic
vit_small_patch32_384.augreg_in21k_ft_in1k,95.020,4.980,98.990,1.010,22.92,384,1.000,bicubic
resnet152s.gluon_in1k,95.020,4.980,98.930,1.070,60.32,224,0.875,bicubic
levit_256.fb_dist_in1k,95.020,4.980,98.880,1.120,18.89,224,0.900,bicubic
levit_conv_256.fb_dist_in1k,95.020,4.980,98.880,1.120,18.89,224,0.900,bicubic
resnetv2_50x1_bit.goog_in21k_ft_in1k,95.010,4.990,99.060,0.940,25.55,448,1.000,bilinear
vit_base_patch32_224.augreg_in21k_ft_in1k,95.010,4.990,99.030,0.970,88.22,224,0.900,bicubic
resnet50d.ra2_in1k,95.010,4.990,98.980,1.020,25.58,288,0.950,bicubic
tf_efficientnet_b3.aa_in1k,95.010,4.990,98.910,1.090,12.23,300,0.904,bicubic
tresnet_m.miil_in1k_448,94.990,5.010,98.980,1.020,31.39,448,0.875,bilinear
deit3_small_patch16_224.fb_in1k,94.990,5.010,98.460,1.540,22.06,224,0.900,bicubic
resnet50.d_in1k,94.980,5.020,98.840,1.160,25.56,288,1.000,bicubic
resnest50d_4s2x40d.in1k,94.970,5.030,99.080,0.920,30.42,224,0.875,bicubic
coat_mini.in1k,94.970,5.030,98.780,1.220,10.34,224,0.900,bicubic
rexnet_200.nav_in1k,94.940,5.060,99.010,0.990,16.37,224,0.875,bicubic
vit_base_patch16_384.augreg_in1k,94.940,5.060,98.890,1.110,86.86,384,1.000,bicubic
seresnext101_64x4d.gluon_in1k,94.930,5.070,98.830,1.170,88.23,224,0.875,bicubic
gcresnet33ts.ra2_in1k,94.920,5.080,98.810,1.190,19.88,288,1.000,bicubic
resnet50.c2_in1k,94.920,5.080,98.810,1.190,25.56,288,1.000,bicubic
senet154.gluon_in1k,94.920,5.080,98.760,1.240,115.09,224,0.875,bicubic
eva02_tiny_patch14_336.mim_in22k_ft_in1k,94.910,5.090,98.880,1.120,5.76,336,1.000,bicubic
repvit_m1_1.dist_450e_in1k,94.900,5.100,98.960,1.040,8.80,224,0.950,bicubic
resnext50_32x4d.fb_ssl_yfcc100m_ft_in1k,94.890,5.110,98.870,1.130,25.03,224,0.875,bilinear
mobilevitv2_175.cvnets_in1k,94.890,5.110,98.860,1.140,14.25,256,0.888,bicubic
resmlp_36_224.fb_distilled_in1k,94.890,5.110,98.860,1.140,44.69,224,0.875,bicubic
seresnext101_32x4d.gluon_in1k,94.890,5.110,98.820,1.180,48.96,224,0.875,bicubic
tf_efficientnet_lite4.in1k,94.880,5.120,99.020,0.980,13.01,380,0.920,bilinear
fastvit_sa12.apple_in1k,94.880,5.120,98.890,1.110,11.58,256,0.900,bicubic
wide_resnet50_2.tv2_in1k,94.870,5.130,98.940,1.060,68.88,224,0.965,bilinear
ese_vovnet39b.ra_in1k,94.870,5.130,98.910,1.090,24.57,288,0.950,bicubic
gcresnext50ts.ch_in1k,94.860,5.140,98.860,1.140,15.67,288,1.000,bicubic
resnet50.b1k_in1k,94.860,5.140,98.810,1.190,25.56,288,1.000,bicubic
crossvit_small_240.in1k,94.850,5.150,99.020,0.980,26.86,240,0.875,bicubic
resnest50d.in1k,94.850,5.150,98.880,1.120,27.48,224,0.875,bilinear
resnetv2_50.a1h_in1k,94.850,5.150,98.870,1.130,25.55,288,1.000,bicubic
mobilevitv2_200.cvnets_in1k,94.840,5.160,98.710,1.290,18.45,256,0.888,bicubic
convnext_tiny.fb_in22k_ft_in1k,94.840,5.160,98.530,1.470,28.59,288,1.000,bicubic
fastvit_s12.apple_dist_in1k,94.830,5.170,98.810,1.190,9.47,256,0.900,bicubic
cspresnext50.ra_in1k,94.830,5.170,98.770,1.230,20.57,256,0.887,bilinear
resnext50_32x4d.a1_in1k,94.830,5.170,98.590,1.410,25.03,288,1.000,bicubic
res2net101d.in1k,94.820,5.180,98.770,1.230,45.23,224,0.875,bilinear
resnet50.a1h_in1k,94.770,5.230,98.690,1.310,25.56,224,1.000,bicubic
lambda_resnet50ts.a1h_in1k,94.770,5.230,98.470,1.530,21.54,256,0.950,bicubic
repvit_m1_1.dist_300e_in1k,94.760,5.240,98.930,1.070,8.80,224,0.950,bicubic
convnext_pico.d1_in1k,94.760,5.240,98.710,1.290,9.05,288,0.950,bicubic
sehalonet33ts.ra2_in1k,94.760,5.240,98.570,1.430,13.69,256,0.940,bicubic
resnet50.a1_in1k,94.740,5.260,98.710,1.290,25.56,288,1.000,bicubic
repvit_m2.dist_in1k,94.740,5.260,98.680,1.320,8.80,224,0.950,bicubic
resnest50d_1s4x24d.in1k,94.730,5.270,98.980,1.020,25.68,224,0.875,bicubic
resnet50.c1_in1k,94.730,5.270,98.930,1.070,25.56,288,1.000,bicubic
resnet50.b2k_in1k,94.730,5.270,98.820,1.180,25.56,288,1.000,bicubic
resnet152d.gluon_in1k,94.730,5.270,98.750,1.250,60.21,224,0.875,bicubic
resnet152.a3_in1k,94.730,5.270,98.680,1.320,60.19,224,0.950,bicubic
resnet50d.a1_in1k,94.730,5.270,98.490,1.510,25.58,288,1.000,bicubic
resnet101s.gluon_in1k,94.720,5.280,98.820,1.180,44.67,224,0.875,bicubic
deit_small_distilled_patch16_224.fb_in1k,94.710,5.290,99.030,0.970,22.44,224,0.900,bicubic
resnext50_32x4d.ra_in1k,94.700,5.300,98.760,1.240,25.03,288,0.950,bicubic
xcit_tiny_12_p8_224.fb_in1k,94.690,5.310,98.830,1.170,6.71,224,1.000,bicubic
haloregnetz_b.ra3_in1k,94.690,5.310,98.660,1.340,11.68,224,0.940,bicubic
resmlp_big_24_224.fb_in1k,94.680,5.320,98.500,1.500,129.14,224,0.875,bicubic
edgenext_small_rw.sw_in1k,94.670,5.330,98.780,1.220,7.83,320,1.000,bicubic
resnext101_64x4d.gluon_in1k,94.670,5.330,98.660,1.340,83.46,224,0.875,bicubic
regnetx_032.tv2_in1k,94.660,5.340,98.850,1.150,15.30,224,0.965,bicubic
cspdarknet53.ra_in1k,94.660,5.340,98.800,1.200,27.64,256,0.887,bilinear
seresnet50.a2_in1k,94.660,5.340,98.780,1.220,28.09,288,1.000,bicubic
seresnet50.a1_in1k,94.660,5.340,98.720,1.280,28.09,288,1.000,bicubic
poolformerv2_s24.sail_in1k,94.650,5.350,98.840,1.160,21.34,224,1.000,bicubic
maxvit_rmlp_pico_rw_256.sw_in1k,94.640,5.360,98.810,1.190,7.52,256,0.950,bicubic
resnet50.tv2_in1k,94.640,5.360,98.800,1.200,25.56,224,0.965,bilinear
efficientnet_b3_pruned.in1k,94.630,5.370,98.760,1.240,9.86,300,0.904,bicubic
resnet50.a2_in1k,94.630,5.370,98.660,1.340,25.56,288,1.000,bicubic
eca_resnet33ts.ra2_in1k,94.620,5.380,98.910,1.090,19.68,288,1.000,bicubic
darknet53.c2ns_in1k,94.620,5.380,98.900,1.100,41.61,288,1.000,bicubic
gernet_m.idstcv_in1k,94.620,5.380,98.860,1.140,21.14,224,0.875,bilinear
resnext50_32x4d.tv2_in1k,94.620,5.380,98.780,1.220,25.03,224,0.965,bilinear
convnext_pico_ols.d1_in1k,94.620,5.380,98.760,1.240,9.06,288,1.000,bicubic
efficientnet_b2.ra_in1k,94.620,5.380,98.710,1.290,9.11,288,1.000,bicubic
tresnet_m.miil_in1k,94.620,5.380,98.550,1.450,31.39,224,0.875,bilinear
sebotnet33ts_256.a1h_in1k,94.610,5.390,98.510,1.490,13.70,256,0.940,bicubic
fastvit_t12.apple_dist_in1k,94.590,5.410,98.790,1.210,7.55,256,0.900,bicubic
inception_resnet_v2.tf_in1k,94.580,5.420,98.790,1.210,55.84,299,0.897,bicubic
resnet50d.a2_in1k,94.580,5.420,98.690,1.310,25.58,288,1.000,bicubic
resnext50_32x4d.a2_in1k,94.580,5.420,98.650,1.350,25.03,288,1.000,bicubic
pit_s_224.in1k,94.570,5.430,98.700,1.300,23.46,224,0.900,bicubic
poolformer_s24.sail_in1k,94.560,5.440,98.900,1.100,21.39,224,0.900,bicubic
repvgg_b3.rvgg_in1k,94.550,5.450,98.780,1.220,123.09,224,0.875,bilinear
mobilevitv2_150.cvnets_in1k,94.550,5.450,98.710,1.290,10.59,256,0.888,bicubic
resnext50d_32x4d.bt_in1k,94.550,5.450,98.690,1.310,25.05,288,0.950,bicubic
regnety_320.pycls_in1k,94.540,5.460,98.850,1.150,145.05,224,0.875,bicubic
tf_efficientnet_b3.in1k,94.540,5.460,98.800,1.200,12.23,300,0.904,bicubic
nf_resnet50.ra2_in1k,94.540,5.460,98.790,1.210,25.56,288,0.940,bicubic
xcit_tiny_24_p16_224.fb_dist_in1k,94.540,5.460,98.780,1.220,12.12,224,1.000,bicubic
resnext101_32x4d.gluon_in1k,94.540,5.460,98.630,1.370,44.18,224,0.875,bicubic
repvit_m1_0.dist_450e_in1k,94.530,5.470,98.880,1.120,7.30,224,0.950,bicubic
regnety_016.tv2_in1k,94.520,5.480,98.820,1.180,11.20,224,0.965,bicubic
resnet50.ram_in1k,94.520,5.480,98.650,1.350,25.56,288,0.950,bicubic
repvgg_b3g4.rvgg_in1k,94.510,5.490,98.970,1.030,83.83,224,0.875,bilinear
tf_efficientnet_b2.ap_in1k,94.510,5.490,98.620,1.380,9.11,260,0.890,bicubic
convmixer_768_32.in1k,94.500,5.500,98.860,1.140,21.11,224,0.960,bicubic
efficientvit_b1.r288_in1k,94.490,5.510,98.520,1.480,9.10,288,1.000,bicubic
efficientformer_l1.snap_dist_in1k,94.480,5.520,98.830,1.170,12.29,224,0.950,bicubic
rexnet_150.nav_in1k,94.480,5.520,98.800,1.200,9.73,224,0.875,bicubic
regnety_120.pycls_in1k,94.470,5.530,98.820,1.180,51.82,224,0.875,bicubic
darknetaa53.c2ns_in1k,94.470,5.530,98.770,1.230,36.02,288,1.000,bilinear
resnetblur50.bt_in1k,94.460,5.540,98.840,1.160,25.56,288,0.950,bicubic
resnet50.fb_ssl_yfcc100m_ft_in1k,94.450,5.550,98.920,1.080,25.56,224,0.875,bilinear
resmlp_24_224.fb_distilled_in1k,94.450,5.550,98.770,1.230,30.02,224,0.875,bicubic
regnetx_320.pycls_in1k,94.450,5.550,98.740,1.260,107.81,224,0.875,bicubic
gcvit_xxtiny.in1k,94.420,5.580,98.890,1.110,12.00,224,0.875,bicubic
tf_efficientnetv2_b2.in1k,94.420,5.580,98.580,1.420,10.10,260,0.890,bicubic
tf_efficientnet_el.in1k,94.400,5.600,98.710,1.290,10.59,300,0.904,bicubic
efficientnet_el_pruned.in1k,94.390,5.610,98.740,1.260,10.59,300,0.904,bicubic
tf_efficientnet_b2.aa_in1k,94.380,5.620,98.610,1.390,9.11,260,0.890,bicubic
legacy_seresnext101_32x4d.in1k,94.370,5.630,98.630,1.370,48.96,224,0.875,bilinear
inception_v4.tf_in1k,94.370,5.630,98.580,1.420,42.68,299,0.875,bicubic
regnety_160.pycls_in1k,94.360,5.640,98.860,1.140,83.59,224,0.875,bicubic
deit_small_patch16_224.fb_in1k,94.350,5.650,98.690,1.310,22.05,224,0.900,bicubic
ecaresnet50t.a3_in1k,94.350,5.650,98.670,1.330,25.57,224,0.950,bicubic
dpn107.mx_in1k,94.340,5.660,98.500,1.500,86.92,224,0.875,bicubic
seresnext50_32x4d.gluon_in1k,94.330,5.670,98.610,1.390,27.56,224,0.875,bicubic
ecaresnet26t.ra2_in1k,94.320,5.680,98.720,1.280,16.01,320,0.950,bicubic
resnet50.bt_in1k,94.320,5.680,98.640,1.360,25.56,288,0.950,bicubic
resnetrs50.tf_in1k,94.320,5.680,98.640,1.360,35.69,224,0.910,bicubic
res2net50d.in1k,94.320,5.680,98.530,1.470,25.72,224,0.875,bilinear
repvit_m1_0.dist_300e_in1k,94.300,5.700,98.850,1.150,7.30,224,0.950,bicubic
xception71.tf_in1k,94.300,5.700,98.650,1.350,42.34,299,0.903,bicubic
dpn92.mx_in1k,94.290,5.710,98.750,1.250,37.67,224,0.875,bicubic
cait_xxs36_224.fb_dist_in1k,94.270,5.730,98.710,1.290,17.30,224,1.000,bicubic
tiny_vit_5m_224.in1k,94.240,5.760,98.690,1.310,5.39,224,0.950,bicubic
skresnext50_32x4d.ra_in1k,94.240,5.760,98.460,1.540,27.48,224,0.875,bicubic
regnetx_120.pycls_in1k,94.230,5.770,98.670,1.330,46.11,224,0.875,bicubic
resnet101d.gluon_in1k,94.230,5.770,98.540,1.460,44.57,224,0.875,bicubic
resnet50.ra_in1k,94.210,5.790,98.620,1.380,25.56,288,0.950,bicubic
tf_efficientnet_lite3.in1k,94.200,5.800,98.640,1.360,8.20,300,0.904,bilinear
efficientformerv2_s1.snap_dist_in1k,94.200,5.800,98.630,1.370,6.19,224,0.950,bicubic
resmlp_36_224.fb_in1k,94.190,5.810,98.660,1.340,44.69,224,0.875,bicubic
convnextv2_femto.fcmae_ft_in1k,94.190,5.810,98.610,1.390,5.23,288,0.950,bicubic
mixnet_xl.ra_in1k,94.180,5.820,98.320,1.680,11.90,224,0.875,bicubic
regnety_080.pycls_in1k,94.170,5.830,98.680,1.320,39.18,224,0.875,bicubic
inception_resnet_v2.tf_ens_adv_in1k,94.170,5.830,98.600,1.400,55.84,299,0.897,bicubic
levit_192.fb_dist_in1k,94.170,5.830,98.540,1.460,10.95,224,0.900,bicubic
levit_conv_192.fb_dist_in1k,94.170,5.830,98.540,1.460,10.95,224,0.900,bicubic
resnet152c.gluon_in1k,94.160,5.840,98.640,1.360,60.21,224,0.875,bicubic
dpn98.mx_in1k,94.160,5.840,98.590,1.410,61.57,224,0.875,bicubic
vit_base_patch16_224.sam_in1k,94.150,5.850,98.670,1.330,86.57,224,0.900,bicubic
gmlp_s16_224.ra3_in1k,94.150,5.850,98.500,1.500,19.42,224,0.875,bicubic
regnetx_160.pycls_in1k,94.140,5.860,98.750,1.250,54.28,224,0.875,bicubic
regnety_064.pycls_in1k,94.140,5.860,98.710,1.290,30.58,224,0.875,bicubic
efficientnet_b2_pruned.in1k,94.140,5.860,98.520,1.480,8.31,260,0.890,bicubic
regnetx_016.tv2_in1k,94.130,5.870,98.750,1.250,9.19,224,0.965,bicubic
fastvit_s12.apple_in1k,94.130,5.870,98.620,1.380,9.47,256,0.900,bicubic
nf_regnet_b1.ra2_in1k,94.110,5.890,98.620,1.380,10.22,288,0.900,bicubic
tf_efficientnet_b2.in1k,94.110,5.890,98.450,1.550,9.11,260,0.890,bicubic
resnet33ts.ra2_in1k,94.100,5.900,98.650,1.350,19.68,288,1.000,bicubic
efficientvit_b1.r256_in1k,94.090,5.910,98.360,1.640,9.10,256,1.000,bicubic
xcit_tiny_24_p16_224.fb_in1k,94.080,5.920,98.540,1.460,12.12,224,1.000,bicubic
resnet152.gluon_in1k,94.070,5.930,98.460,1.540,60.19,224,0.875,bicubic
dpn131.mx_in1k,94.050,5.950,98.710,1.290,79.25,224,0.875,bicubic
coat_lite_mini.in1k,94.040,5.960,98.550,1.450,11.01,224,0.900,bicubic
eca_halonext26ts.c1_in1k,94.040,5.960,98.490,1.510,10.76,256,0.940,bicubic
resnet101.a3_in1k,94.030,5.970,98.660,1.340,44.55,224,0.950,bicubic
hrnet_w64.ms_in1k,94.030,5.970,98.590,1.410,128.06,224,0.875,bilinear
resmlp_24_224.fb_in1k,94.030,5.970,98.330,1.670,30.02,224,0.875,bicubic
halonet26t.a1h_in1k,94.000,6.000,98.500,1.500,12.48,256,0.950,bicubic
dpn68b.ra_in1k,94.000,6.000,98.340,1.660,12.61,288,1.000,bicubic
fbnetv3_b.ra2_in1k,93.970,6.030,98.630,1.370,8.60,256,0.950,bilinear
resnet50.am_in1k,93.970,6.030,98.520,1.480,25.56,224,0.875,bicubic
dla102x2.in1k,93.970,6.030,98.490,1.510,41.28,224,0.875,bilinear
mobilevitv2_125.cvnets_in1k,93.960,6.040,98.550,1.450,7.48,256,0.888,bicubic
tf_efficientnetv2_b1.in1k,93.950,6.050,98.620,1.380,8.14,240,0.882,bicubic
convnext_femto.d1_in1k,93.930,6.070,98.520,1.480,5.22,288,0.950,bicubic
fbnetv3_d.ra2_in1k,93.920,6.080,98.740,1.260,10.31,256,0.950,bilinear
convnext_femto_ols.d1_in1k,93.920,6.080,98.620,1.380,5.23,288,0.950,bicubic
hrnet_w48.ms_in1k,93.920,6.080,98.610,1.390,77.47,224,0.875,bilinear
fastvit_t12.apple_in1k,93.920,6.080,98.600,1.400,7.55,256,0.900,bicubic
tf_efficientnet_cc_b1_8e.in1k,93.920,6.080,98.260,1.740,39.72,240,0.882,bicubic
regnetx_064.pycls_in1k,93.900,6.100,98.640,1.360,26.21,224,0.875,bicubic
rexnet_130.nav_in1k,93.900,6.100,98.400,1.600,7.56,224,0.875,bicubic
vit_small_patch16_224.augreg_in1k,93.890,6.110,98.440,1.560,22.05,224,0.900,bicubic
regnety_040.pycls_in1k,93.880,6.120,98.660,1.340,20.65,224,0.875,bicubic
repvgg_b2g4.rvgg_in1k,93.880,6.120,98.590,1.410,61.76,224,0.875,bilinear
regnetx_080.pycls_in1k,93.880,6.120,98.520,1.480,39.57,224,0.875,bicubic
efficientnet_em.ra2_in1k,93.830,6.170,98.820,1.180,6.90,240,0.882,bicubic
resnet32ts.ra2_in1k,93.830,6.170,98.650,1.350,17.96,288,1.000,bicubic
lambda_resnet26t.c1_in1k,93.830,6.170,98.640,1.360,10.96,256,0.940,bicubic
resnext101_32x8d.tv_in1k,93.820,6.180,98.580,1.420,88.79,224,0.875,bilinear
resnext50_32x4d.gluon_in1k,93.810,6.190,98.420,1.580,25.03,224,0.875,bicubic
pvt_v2_b1.in1k,93.800,6.200,98.660,1.340,14.01,224,0.900,bicubic
pit_xs_distilled_224.in1k,93.780,6.220,98.620,1.380,11.00,224,0.900,bicubic
eca_botnext26ts_256.c1_in1k,93.780,6.220,98.500,1.500,10.59,256,0.950,bicubic
xception65.tf_in1k,93.780,6.220,98.370,1.630,39.92,299,0.903,bicubic
legacy_seresnext50_32x4d.in1k,93.750,6.250,98.580,1.420,27.56,224,0.875,bilinear
resnet50d.gluon_in1k,93.750,6.250,98.390,1.610,25.58,224,0.875,bicubic
resnet101.gluon_in1k,93.750,6.250,98.380,1.620,44.55,224,0.875,bicubic
cspresnet50.ra_in1k,93.740,6.260,98.640,1.360,21.62,256,0.887,bilinear
wide_resnet101_2.tv_in1k,93.740,6.260,98.540,1.460,126.89,224,0.875,bilinear
mobileone_s4.apple_in1k,93.740,6.260,98.230,1.770,14.95,224,0.900,bilinear
vit_relpos_base_patch32_plus_rpn_256.sw_in1k,93.740,6.260,98.070,1.930,119.42,256,0.900,bicubic
res2net101_26w_4s.in1k,93.720,6.280,98.310,1.690,45.21,224,0.875,bilinear
lambda_resnet26rpt_256.c1_in1k,93.710,6.290,98.520,1.480,10.99,256,0.940,bicubic
regnety_008_tv.tv2_in1k,93.690,6.310,98.490,1.510,6.43,224,0.965,bicubic
tf_efficientnet_b1.ap_in1k,93.680,6.320,98.360,1.640,7.79,240,0.882,bicubic
resnet101c.gluon_in1k,93.670,6.330,98.420,1.580,44.57,224,0.875,bicubic
resnext50_32x4d.a3_in1k,93.660,6.340,98.520,1.480,25.03,224,0.950,bicubic
vit_tiny_patch16_384.augreg_in21k_ft_in1k,93.650,6.350,98.590,1.410,5.79,384,1.000,bicubic
resnet34d.ra2_in1k,93.640,6.360,98.540,1.460,21.82,288,0.950,bicubic
resnet50s.gluon_in1k,93.640,6.360,98.460,1.540,25.68,224,0.875,bicubic
vit_base_patch32_384.augreg_in1k,93.640,6.360,98.400,1.600,88.30,384,1.000,bicubic
vit_base_patch16_224.augreg_in1k,93.640,6.360,98.240,1.760,86.57,224,0.900,bicubic
tf_efficientnet_b0.ns_jft_in1k,93.620,6.380,98.640,1.360,5.29,224,0.875,bicubic
cait_xxs24_224.fb_dist_in1k,93.610,6.390,98.460,1.540,11.96,224,1.000,bicubic
repvit_m0_9.dist_450e_in1k,93.600,6.400,98.500,1.500,5.49,224,0.950,bicubic
coat_tiny.in1k,93.580,6.420,98.420,1.580,5.50,224,0.900,bicubic
regnetx_040.pycls_in1k,93.560,6.440,98.530,1.470,22.12,224,0.875,bicubic
visformer_tiny.in1k,93.560,6.440,98.490,1.510,10.32,224,0.900,bicubic
seresnext26t_32x4d.bt_in1k,93.560,6.440,98.390,1.610,16.81,288,0.950,bicubic
hrnet_w44.ms_in1k,93.550,6.450,98.700,1.300,67.06,224,0.875,bilinear
hrnet_w18.ms_aug_in1k,93.550,6.450,98.600,1.400,21.30,224,0.950,bilinear
hrnet_w32.ms_in1k,93.530,6.470,98.450,1.550,41.23,224,0.875,bilinear
xcit_nano_12_p8_384.fb_dist_in1k,93.520,6.480,98.530,1.470,3.05,384,1.000,bicubic
efficientvit_b1.r224_in1k,93.510,6.490,98.320,1.680,9.10,224,0.950,bicubic
botnet26t_256.c1_in1k,93.510,6.490,98.300,1.700,12.49,256,0.950,bicubic
repvgg_b2.rvgg_in1k,93.500,6.500,98.730,1.270,89.02,224,0.875,bilinear
hrnet_w40.ms_in1k,93.500,6.500,98.570,1.430,57.56,224,0.875,bilinear
repghostnet_200.in1k,93.500,6.500,98.540,1.460,9.80,224,0.875,bicubic
dla102x.in1k,93.490,6.510,98.500,1.500,26.31,224,0.875,bilinear
tf_efficientnet_b1.aa_in1k,93.490,6.510,98.360,1.640,7.79,240,0.882,bicubic
resnet50d.a3_in1k,93.480,6.520,98.450,1.550,25.58,224,0.950,bicubic
inception_v3.gluon_in1k,93.470,6.530,98.570,1.430,23.83,299,0.875,bicubic
legacy_xception.tf_in1k,93.460,6.540,98.530,1.470,22.86,299,0.897,bicubic
repvit_m0_9.dist_300e_in1k,93.440,6.560,98.710,1.290,5.49,224,0.950,bicubic
seresnext26d_32x4d.bt_in1k,93.440,6.560,98.330,1.670,16.81,288,0.950,bicubic
xception41.tf_in1k,93.430,6.570,98.430,1.570,26.97,299,0.903,bicubic
mixnet_l.ft_in1k,93.430,6.570,98.220,1.780,7.33,224,0.875,bicubic
regnety_032.pycls_in1k,93.410,6.590,98.640,1.360,19.44,224,0.875,bicubic
xcit_tiny_12_p16_224.fb_dist_in1k,93.410,6.590,98.510,1.490,6.72,224,1.000,bicubic
res2net50_26w_6s.in1k,93.400,6.600,98.280,1.720,37.05,224,0.875,bilinear
res2net50_26w_8s.in1k,93.390,6.610,98.170,1.830,48.40,224,0.875,bilinear
legacy_seresnet152.in1k,93.380,6.620,98.340,1.660,66.82,224,0.875,bilinear
dla169.in1k,93.350,6.650,98.610,1.390,53.39,224,0.875,bilinear
cs3darknet_m.c2ns_in1k,93.350,6.650,98.600,1.400,9.31,288,0.950,bicubic
levit_conv_128.fb_dist_in1k,93.350,6.650,98.370,1.630,9.21,224,0.900,bicubic
repvgg_b1.rvgg_in1k,93.330,6.670,98.520,1.480,57.42,224,0.875,bilinear
levit_128.fb_dist_in1k,93.330,6.670,98.370,1.630,9.21,224,0.900,bicubic
resnest26d.gluon_in1k,93.320,6.680,98.620,1.380,17.07,224,0.875,bilinear
resnet152.tv_in1k,93.320,6.680,98.380,1.620,60.19,224,0.875,bilinear
bat_resnext26ts.ch_in1k,93.320,6.680,98.360,1.640,10.73,256,0.900,bicubic
inception_v3.tf_in1k,93.320,6.680,98.040,1.960,23.83,299,0.875,bicubic
tf_mixnet_l.in1k,93.320,6.680,98.030,1.970,7.33,224,0.875,bicubic
legacy_seresnet101.in1k,93.310,6.690,98.520,1.480,49.33,224,0.875,bilinear
selecsls60b.in1k,93.310,6.690,98.290,1.710,32.77,224,0.875,bicubic
mobilevitv2_100.cvnets_in1k,93.300,6.700,98.280,1.720,4.90,256,0.888,bicubic
repvit_m1.dist_in1k,93.290,6.710,98.440,1.560,5.49,224,0.950,bicubic
efficientnet_b1.ft_in1k,93.250,6.750,98.290,1.710,7.79,256,1.000,bicubic
coat_lite_tiny.in1k,93.240,6.760,98.260,1.740,5.72,224,0.900,bicubic
resnet26t.ra2_in1k,93.200,6.800,98.490,1.510,16.01,320,1.000,bicubic
hrnet_w30.ms_in1k,93.200,6.800,98.410,1.590,37.71,224,0.875,bilinear
dla60_res2next.in1k,93.190,6.810,98.400,1.600,17.03,224,0.875,bilinear
dla60_res2net.in1k,93.170,6.830,98.430,1.570,20.85,224,0.875,bilinear
efficientnet_es.ra_in1k,93.170,6.830,98.410,1.590,5.44,224,0.875,bicubic
mobilevit_s.cvnets_in1k,93.160,6.840,98.440,1.560,5.58,256,0.900,bicubic
wide_resnet50_2.tv_in1k,93.160,6.840,98.370,1.630,68.88,224,0.875,bilinear
gcresnext26ts.ch_in1k,93.160,6.840,98.320,1.680,10.48,288,1.000,bicubic
ese_vovnet19b_dw.ra_in1k,93.150,6.850,98.250,1.750,6.54,288,0.950,bicubic
regnetx_032.pycls_in1k,93.110,6.890,98.390,1.610,15.30,224,0.875,bicubic
tf_efficientnetv2_b0.in1k,93.110,6.890,98.390,1.610,7.14,224,0.875,bicubic
resnet34.a1_in1k,93.100,6.900,98.330,1.670,21.80,288,1.000,bicubic
pit_xs_224.in1k,93.100,6.900,98.310,1.690,10.62,224,0.900,bicubic
tf_efficientnet_b1.in1k,93.100,6.900,98.300,1.700,7.79,240,0.882,bicubic
convnext_atto_ols.a2_in1k,93.090,6.910,98.470,1.530,3.70,288,0.950,bicubic
dla60x.in1k,93.080,6.920,98.500,1.500,17.35,224,0.875,bilinear
eca_resnext26ts.ch_in1k,93.060,6.940,98.400,1.600,10.30,288,1.000,bicubic
dla102.in1k,93.050,6.950,98.550,1.450,33.27,224,0.875,bilinear
regnety_016.pycls_in1k,93.040,6.960,98.370,1.630,11.20,224,0.875,bicubic
resnet50c.gluon_in1k,93.030,6.970,98.400,1.600,25.58,224,0.875,bicubic
rexnet_100.nav_in1k,93.020,6.980,98.190,1.810,4.80,224,0.875,bicubic
selecsls60.in1k,93.010,6.990,98.300,1.700,30.67,224,0.875,bicubic
repvgg_b1g4.rvgg_in1k,93.000,7.000,98.430,1.570,39.97,224,0.875,bilinear
ghostnetv2_160.in1k,92.990,7.010,98.230,1.770,12.39,224,0.875,bicubic
cs3darknet_focus_m.c2ns_in1k,92.970,7.030,98.390,1.610,9.30,288,0.950,bicubic
hardcorenas_f.miil_green_in1k,92.970,7.030,98.160,1.840,8.20,224,0.875,bilinear
convnextv2_atto.fcmae_ft_in1k,92.970,7.030,98.060,1.940,3.71,288,0.950,bicubic
seresnext26ts.ch_in1k,92.960,7.040,98.410,1.590,10.39,288,1.000,bicubic
poolformerv2_s12.sail_in1k,92.960,7.040,98.360,1.640,11.89,224,1.000,bicubic
legacy_seresnet50.in1k,92.960,7.040,98.180,1.820,28.09,224,0.875,bilinear
tf_efficientnet_em.in1k,92.940,7.060,98.190,1.810,6.90,240,0.882,bicubic
mobileone_s3.apple_in1k,92.940,7.060,98.180,1.820,10.17,224,0.900,bilinear
inception_v3.tf_adv_in1k,92.900,7.100,98.140,1.860,23.83,299,0.875,bicubic
crossvit_9_dagger_240.in1k,92.890,7.110,98.240,1.760,8.78,240,0.875,bicubic
res2next50.in1k,92.840,7.160,98.180,1.820,24.67,224,0.875,bilinear
tf_efficientnet_cc_b0_8e.in1k,92.840,7.160,98.180,1.820,24.01,224,0.875,bicubic
gmixer_24_224.ra3_in1k,92.830,7.170,97.880,2.120,24.72,224,0.875,bicubic
mobileone_s2.apple_in1k,92.820,7.180,98.270,1.730,7.88,224,0.900,bilinear
resmlp_12_224.fb_distilled_in1k,92.820,7.180,98.140,1.860,15.35,224,0.875,bicubic
resnet101.tv_in1k,92.810,7.190,98.250,1.750,44.55,224,0.875,bilinear
dpn68b.mx_in1k,92.790,7.210,98.150,1.850,12.61,224,0.875,bicubic
convnext_atto.d2_in1k,92.790,7.210,98.080,1.920,3.70,288,0.950,bicubic
efficientnet_b1_pruned.in1k,92.780,7.220,98.040,1.960,6.33,240,0.882,bicubic
hrnet_w18_small_v2.gluon_in1k,92.770,7.230,98.410,1.590,15.60,224,0.875,bicubic
res2net50_14w_8s.in1k,92.770,7.230,98.160,1.840,25.06,224,0.875,bilinear
resnext50_32x4d.tv_in1k,92.750,7.250,98.270,1.730,25.03,224,0.875,bilinear
densenet201.tv_in1k,92.740,7.260,98.230,1.770,20.01,224,0.875,bicubic
inception_v3.tv_in1k,92.730,7.270,97.970,2.030,23.83,299,0.875,bicubic
resnet50.a3_in1k,92.720,7.280,98.170,1.830,25.56,224,0.950,bicubic
resnet34.a2_in1k,92.720,7.280,98.010,1.990,21.80,288,1.000,bicubic
efficientnet_b0.ra_in1k,92.690,7.310,98.070,1.930,5.29,224,0.875,bicubic
tf_efficientnet_lite2.in1k,92.650,7.350,98.230,1.770,6.09,260,0.890,bicubic
legacy_seresnext26_32x4d.in1k,92.640,7.360,98.120,1.880,16.79,224,0.875,bicubic
tf_efficientnet_lite1.in1k,92.630,7.370,98.060,1.940,5.42,240,0.882,bicubic
densenetblur121d.ra_in1k,92.620,7.380,98.260,1.740,8.00,288,0.950,bicubic
poolformer_s12.sail_in1k,92.610,7.390,98.180,1.820,11.92,224,0.900,bicubic
tf_efficientnet_cc_b0_4e.in1k,92.600,7.400,98.080,1.920,13.31,224,0.875,bicubic
hardcorenas_e.miil_green_in1k,92.570,7.430,98.100,1.900,8.07,224,0.875,bilinear
regnetx_008.tv2_in1k,92.550,7.450,98.180,1.820,7.26,224,0.965,bicubic
res2net50_48w_2s.in1k,92.550,7.450,98.080,1.920,25.29,224,0.875,bilinear
resnet50.gluon_in1k,92.540,7.460,98.170,1.830,25.56,224,0.875,bicubic
fastvit_t8.apple_dist_in1k,92.540,7.460,98.040,1.960,4.03,256,0.900,bicubic
densenet121.ra_in1k,92.520,7.480,98.220,1.780,7.98,288,0.950,bicubic
resnet26d.bt_in1k,92.520,7.480,98.210,1.790,16.01,288,0.950,bicubic
xcit_tiny_12_p16_224.fb_in1k,92.510,7.490,98.240,1.760,6.72,224,1.000,bicubic
res2net50_26w_4s.in1k,92.490,7.510,98.030,1.970,25.70,224,0.875,bilinear
densenet161.tv_in1k,92.480,7.520,98.300,1.700,28.68,224,0.875,bicubic
efficientvit_m5.r224_in1k,92.460,7.540,97.990,2.010,12.47,224,0.875,bicubic
tinynet_a.in1k,92.440,7.560,98.080,1.920,6.19,192,0.875,bicubic
resnet34.bt_in1k,92.410,7.590,98.150,1.850,21.80,288,0.950,bicubic
mixnet_m.ft_in1k,92.410,7.590,97.870,2.130,5.01,224,0.875,bicubic
convmixer_1024_20_ks9_p14.in1k,92.400,7.600,98.270,1.730,24.38,224,0.960,bicubic
hardcorenas_d.miil_green_in1k,92.400,7.600,98.080,1.920,7.50,224,0.875,bilinear
mobilenetv2_120d.ra_in1k,92.400,7.600,98.060,1.940,5.83,224,0.875,bicubic
skresnet34.ra_in1k,92.380,7.620,98.150,1.850,22.28,224,0.875,bicubic
repghostnet_150.in1k,92.370,7.630,98.050,1.950,6.58,224,0.875,bicubic
hrnet_w18.ms_in1k,92.320,7.680,98.260,1.740,21.30,224,0.875,bilinear
ghostnetv2_130.in1k,92.320,7.680,98.070,1.930,8.96,224,0.875,bicubic
tf_mixnet_m.in1k,92.300,7.700,97.890,2.110,5.01,224,0.875,bicubic
selecsls42b.in1k,92.290,7.710,98.110,1.890,32.46,224,0.875,bicubic
mobilenetv3_large_100.miil_in21k_ft_in1k,92.260,7.740,97.620,2.380,5.48,224,0.875,bilinear
tf_efficientnet_b0.aa_in1k,92.240,7.760,98.000,2.000,5.29,224,0.875,bicubic
resmlp_12_224.fb_in1k,92.220,7.780,98.150,1.850,15.35,224,0.875,bicubic
dla60.in1k,92.200,7.800,98.100,1.900,22.04,224,0.875,bilinear
tf_efficientnet_b0.ap_in1k,92.200,7.800,98.020,1.980,5.29,224,0.875,bicubic
regnetx_016.pycls_in1k,92.180,7.820,98.200,1.800,9.19,224,0.875,bicubic
gernet_s.idstcv_in1k,92.130,7.870,98.190,1.810,8.17,224,0.875,bilinear
resnext26ts.ra2_in1k,92.130,7.870,98.030,1.970,10.30,288,1.000,bicubic
xcit_nano_12_p8_224.fb_dist_in1k,92.080,7.920,98.160,1.840,3.05,224,1.000,bicubic
tf_efficientnet_b0.in1k,92.080,7.920,97.910,2.090,5.29,224,0.875,bicubic
seresnet50.a3_in1k,92.070,7.930,98.040,1.960,28.09,224,0.950,bicubic
fastvit_t8.apple_in1k,92.060,7.940,97.930,2.070,4.03,256,0.900,bicubic
vit_tiny_r_s16_p8_384.augreg_in21k_ft_in1k,92.040,7.960,98.290,1.710,6.36,384,1.000,bicubic
vit_small_patch32_224.augreg_in21k_ft_in1k,92.040,7.960,98.230,1.770,22.88,224,0.900,bicubic
hardcorenas_c.miil_green_in1k,92.020,7.980,97.840,2.160,5.52,224,0.875,bilinear
resnet26.bt_in1k,91.990,8.010,98.220,1.780,16.00,288,0.950,bicubic
dpn68.mx_in1k,91.990,8.010,98.020,1.980,12.61,224,0.875,bicubic
tf_efficientnet_es.in1k,91.970,8.030,97.880,2.120,5.44,224,0.875,bicubic
levit_128s.fb_dist_in1k,91.960,8.040,98.060,1.940,7.78,224,0.900,bicubic
levit_conv_128s.fb_dist_in1k,91.960,8.040,98.060,1.940,7.78,224,0.900,bicubic
efficientformerv2_s0.snap_dist_in1k,91.960,8.040,97.890,2.110,3.60,224,0.950,bicubic
repvgg_a2.rvgg_in1k,91.940,8.060,98.140,1.860,28.21,224,0.875,bilinear
densenet169.tv_in1k,91.940,8.060,98.100,1.900,14.15,224,0.875,bicubic
repghostnet_130.in1k,91.890,8.110,97.930,2.070,5.48,224,0.875,bicubic
resnet50.tv_in1k,91.880,8.120,98.040,1.960,25.56,224,0.875,bilinear
mixer_b16_224.goog_in21k_ft_in1k,91.880,8.120,97.260,2.740,59.88,224,0.875,bicubic
mobilenetv2_140.ra_in1k,91.840,8.160,97.860,2.140,6.11,224,0.875,bicubic
xcit_nano_12_p16_384.fb_dist_in1k,91.830,8.170,98.020,1.980,3.05,384,1.000,bicubic
mixnet_s.ft_in1k,91.820,8.180,97.700,2.300,4.13,224,0.875,bicubic
vit_tiny_patch16_224.augreg_in21k_ft_in1k,91.770,8.230,98.040,1.960,5.72,224,0.900,bicubic
mobilevitv2_075.cvnets_in1k,91.760,8.240,97.860,2.140,2.87,256,0.888,bicubic
hardcorenas_b.miil_green_in1k,91.760,8.240,97.780,2.220,5.18,224,0.875,bilinear
regnety_008.pycls_in1k,91.730,8.270,98.180,1.820,6.26,224,0.875,bicubic
resnest14d.gluon_in1k,91.720,8.280,97.870,2.130,10.61,224,0.875,bilinear
edgenext_x_small.in1k,91.710,8.290,97.600,2.400,2.34,288,1.000,bicubic
regnety_004.tv2_in1k,91.580,8.420,97.890,2.110,4.34,224,0.965,bicubic
tf_mixnet_s.in1k,91.520,8.480,97.620,2.380,4.13,224,0.875,bicubic
repvgg_b0.rvgg_in1k,91.390,8.610,98.000,2.000,15.82,224,0.875,bilinear
regnety_006.pycls_in1k,91.390,8.610,97.700,2.300,6.06,224,0.875,bicubic
hardcorenas_a.miil_green_in1k,91.350,8.650,97.850,2.150,5.26,224,0.875,bilinear
mobilenetv3_large_100.ra_in1k,91.350,8.650,97.710,2.290,5.48,224,0.875,bicubic
semnasnet_100.rmsp_in1k,91.310,8.690,97.560,2.440,3.89,224,0.875,bicubic
mobileone_s1.apple_in1k,91.280,8.720,97.820,2.180,4.83,224,0.900,bilinear
tf_mobilenetv3_large_100.in1k,91.230,8.770,97.660,2.340,5.48,224,0.875,bilinear
mobilenetv3_rw.rmsp_in1k,91.210,8.790,97.660,2.340,5.48,224,0.875,bicubic
hrnet_w18_small_v2.ms_in1k,91.200,8.800,97.900,2.100,15.60,224,0.875,bilinear
vit_base_patch32_224.augreg_in1k,91.190,8.810,97.390,2.610,88.22,224,0.900,bicubic
efficientnet_es_pruned.in1k,91.170,8.830,97.740,2.260,5.44,224,0.875,bicubic
efficientnet_lite0.ra_in1k,91.120,8.880,97.640,2.360,4.65,224,0.875,bicubic
regnetx_008.pycls_in1k,91.050,8.950,97.720,2.280,7.26,224,0.875,bicubic
tf_efficientnet_lite0.in1k,91.050,8.950,97.580,2.420,4.65,224,0.875,bicubic
xcit_nano_12_p8_224.fb_in1k,91.010,8.990,97.770,2.230,3.05,224,1.000,bicubic
resnet34.gluon_in1k,90.980,9.020,97.630,2.370,21.80,224,0.875,bicubic
mobilenetv2_110d.ra_in1k,90.960,9.040,97.540,2.460,4.52,224,0.875,bicubic
tinynet_b.in1k,90.920,9.080,97.660,2.340,3.73,188,0.875,bicubic
ghostnetv2_100.in1k,90.900,9.100,97.700,2.300,6.16,224,0.875,bicubic
legacy_seresnet34.in1k,90.900,9.100,97.580,2.420,21.96,224,0.875,bilinear
densenet121.tv_in1k,90.890,9.110,97.710,2.290,7.98,224,0.875,bicubic
mobilevit_xs.cvnets_in1k,90.820,9.180,97.930,2.070,2.32,256,0.900,bicubic
pit_ti_distilled_224.in1k,90.770,9.230,97.610,2.390,5.10,224,0.900,bicubic
dla34.in1k,90.760,9.240,97.650,2.350,15.74,224,0.875,bilinear
fbnetc_100.rmsp_in1k,90.730,9.270,97.210,2.790,5.57,224,0.875,bilinear
deit_tiny_distilled_patch16_224.fb_in1k,90.710,9.290,97.560,2.440,5.91,224,0.900,bicubic
repghostnet_111.in1k,90.710,9.290,97.470,2.530,4.54,224,0.875,bicubic
resnet18.fb_swsl_ig1b_ft_in1k,90.700,9.300,97.700,2.300,11.69,224,0.875,bilinear
convit_tiny.fb_in1k,90.660,9.340,97.730,2.270,5.71,224,0.875,bicubic
regnetx_004_tv.tv2_in1k,90.640,9.360,97.600,2.400,5.50,224,0.965,bicubic
crossvit_9_240.in1k,90.630,9.370,97.730,2.270,8.55,240,0.875,bicubic
repvgg_a1.rvgg_in1k,90.600,9.400,97.650,2.350,14.09,224,0.875,bilinear
efficientvit_m4.r224_in1k,90.580,9.420,97.530,2.470,8.80,224,0.875,bicubic
mnasnet_100.rmsp_in1k,90.500,9.500,97.470,2.530,4.38,224,0.875,bicubic
regnety_004.pycls_in1k,90.490,9.510,97.530,2.470,4.34,224,0.875,bicubic
regnetx_006.pycls_in1k,90.350,9.650,97.430,2.570,6.20,224,0.875,bicubic
spnasnet_100.rmsp_in1k,90.330,9.670,97.190,2.810,4.42,224,0.875,bilinear
repghostnet_100.in1k,90.290,9.710,97.480,2.520,4.07,224,0.875,bicubic
resnet18d.ra2_in1k,90.280,9.720,97.560,2.440,11.71,288,0.950,bicubic
crossvit_tiny_240.in1k,90.230,9.770,97.590,2.410,7.01,240,0.875,bicubic
resnet18.fb_ssl_yfcc100m_ft_in1k,90.210,9.790,97.550,2.450,11.69,224,0.875,bilinear
ghostnet_100.in1k,90.180,9.820,97.290,2.710,5.18,224,0.875,bicubic
vgg16_bn.tv_in1k,90.090,9.910,97.370,2.630,138.37,224,0.875,bilinear
vgg19_bn.tv_in1k,90.080,9.920,97.580,2.420,143.68,224,0.875,bilinear
semnasnet_075.rmsp_in1k,90.070,9.930,97.440,2.560,2.91,224,0.875,bicubic
resnet34.tv_in1k,89.950,10.050,97.340,2.660,21.80,224,0.875,bilinear
pit_ti_224.in1k,89.940,10.060,97.450,2.550,4.85,224,0.900,bicubic
resnet34.a3_in1k,89.940,10.060,97.180,2.820,21.80,224,0.950,bicubic
efficientvit_m3.r224_in1k,89.860,10.140,97.540,2.460,6.90,224,0.875,bicubic
vit_base_patch32_224.sam_in1k,89.740,10.260,97.000,3.000,88.22,224,0.900,bicubic
xcit_nano_12_p16_224.fb_dist_in1k,89.700,10.300,97.100,2.900,3.05,224,1.000,bicubic
resnet18.a1_in1k,89.680,10.320,97.100,2.900,11.69,288,1.000,bicubic
deit_tiny_patch16_224.fb_in1k,89.660,10.340,97.450,2.550,5.72,224,0.900,bicubic
skresnet18.ra_in1k,89.660,10.340,97.230,2.770,11.96,224,0.875,bicubic
tf_mobilenetv3_large_075.in1k,89.640,10.360,97.190,2.810,3.99,224,0.875,bilinear
mobilenetv2_100.ra_in1k,89.600,10.400,97.150,2.850,3.50,224,0.875,bicubic
resnet18.a2_in1k,89.570,10.430,96.960,3.040,11.69,288,1.000,bicubic
hrnet_w18_small.gluon_in1k,89.470,10.530,97.060,2.940,13.19,224,0.875,bicubic
repvgg_a0.rvgg_in1k,89.280,10.720,96.890,3.110,9.11,224,0.875,bilinear
vit_tiny_r_s16_p8_224.augreg_in21k_ft_in1k,89.180,10.820,97.220,2.780,6.34,224,0.900,bicubic
hrnet_w18_small.ms_in1k,89.050,10.950,97.120,2.880,13.19,224,0.875,bilinear
vgg19.tv_in1k,89.050,10.950,96.870,3.130,143.67,224,0.875,bilinear
resnet14t.c3_in1k,88.990,11.010,96.730,3.270,10.08,224,0.950,bicubic
tf_mobilenetv3_large_minimal_100.in1k,88.950,11.050,96.860,3.140,3.92,224,0.875,bilinear
regnetx_004.pycls_in1k,88.930,11.070,97.120,2.880,5.16,224,0.875,bicubic
legacy_seresnet18.in1k,88.890,11.110,96.980,3.020,11.78,224,0.875,bicubic
edgenext_xx_small.in1k,88.890,11.110,96.700,3.300,1.33,288,1.000,bicubic
repghostnet_080.in1k,88.840,11.160,96.700,3.300,3.28,224,0.875,bicubic
pvt_v2_b0.in1k,88.780,11.220,96.860,3.140,3.67,224,0.900,bicubic
vgg13_bn.tv_in1k,88.750,11.250,96.980,3.020,133.05,224,0.875,bilinear
lcnet_100.ra2_in1k,88.750,11.250,96.720,3.280,2.95,224,0.875,bicubic
xcit_nano_12_p16_224.fb_in1k,88.620,11.380,96.790,3.210,3.05,224,1.000,bicubic
vgg16.tv_in1k,88.560,11.440,96.800,3.200,138.36,224,0.875,bilinear
efficientvit_m2.r224_in1k,88.470,11.530,96.900,3.100,4.19,224,0.875,bicubic
resnet18.gluon_in1k,88.370,11.630,96.670,3.330,11.69,224,0.875,bicubic
mobileone_s0.apple_in1k,88.230,11.770,96.400,3.600,5.29,224,0.875,bilinear
mobilevitv2_050.cvnets_in1k,88.180,11.820,96.990,3.010,1.37,256,0.888,bicubic
efficientvit_b0.r224_in1k,87.940,12.060,96.130,3.870,3.41,224,0.950,bicubic
tinynet_c.in1k,87.780,12.220,96.370,3.630,2.46,184,0.875,bicubic
vgg11_bn.tv_in1k,87.500,12.500,96.820,3.180,132.87,224,0.875,bilinear
resnet18.tv_in1k,87.380,12.620,96.290,3.710,11.69,224,0.875,bilinear
regnety_002.pycls_in1k,87.370,12.630,96.610,3.390,3.16,224,0.875,bicubic
mobilevit_xxs.cvnets_in1k,87.160,12.840,96.100,3.900,1.27,256,0.900,bicubic
mixer_l16_224.goog_in21k_ft_in1k,87.150,12.850,93.520,6.480,208.20,224,0.875,bicubic
vgg13.tv_in1k,87.040,12.960,96.330,3.670,133.05,224,0.875,bilinear
efficientvit_m1.r224_in1k,86.790,13.210,96.030,3.970,2.98,224,0.875,bicubic
vgg11.tv_in1k,86.580,13.420,96.280,3.720,132.86,224,0.875,bilinear
repghostnet_058.in1k,86.540,13.460,95.900,4.100,2.55,224,0.875,bicubic
resnet18.a3_in1k,86.450,13.550,95.880,4.120,11.69,224,0.950,bicubic
dla60x_c.in1k,86.290,13.710,96.160,3.840,1.32,224,0.875,bilinear
resnet10t.c3_in1k,86.220,13.780,95.740,4.260,5.44,224,0.950,bicubic
regnetx_002.pycls_in1k,86.140,13.860,95.970,4.030,2.68,224,0.875,bicubic
lcnet_075.ra2_in1k,85.970,14.030,95.680,4.320,2.36,224,0.875,bicubic
mobilenetv3_small_100.lamb_in1k,85.220,14.780,95.650,4.350,2.54,224,0.875,bicubic
tf_mobilenetv3_small_100.in1k,85.190,14.810,95.770,4.230,2.54,224,0.875,bilinear
repghostnet_050.in1k,85.060,14.940,95.200,4.800,2.31,224,0.875,bicubic
tinynet_d.in1k,84.720,15.280,95.170,4.830,2.34,152,0.875,bicubic
mnasnet_small.lamb_in1k,84.420,15.580,95.190,4.810,2.03,224,0.875,bicubic
dla46x_c.in1k,84.230,15.770,95.270,4.730,1.07,224,0.875,bilinear
mobilenetv2_050.lamb_in1k,83.910,16.090,94.720,5.280,1.97,224,0.875,bicubic
dla46_c.in1k,83.610,16.390,94.950,5.050,1.30,224,0.875,bilinear
tf_mobilenetv3_small_075.in1k,83.500,16.500,94.840,5.160,2.04,224,0.875,bilinear
mobilenetv3_small_075.lamb_in1k,83.030,16.970,94.100,5.900,2.04,224,0.875,bicubic
efficientvit_m0.r224_in1k,82.350,17.650,94.430,5.570,2.35,224,0.875,bicubic
lcnet_050.ra2_in1k,81.800,18.200,93.710,6.290,1.88,224,0.875,bicubic
tf_mobilenetv3_small_minimal_100.in1k,81.400,18.600,93.680,6.320,2.04,224,0.875,bilinear
tinynet_e.in1k,78.920,21.080,92.540,7.460,2.04,106,0.875,bicubic
mobilenetv3_small_050.lamb_in1k,77.020,22.980,91.300,8.700,1.59,224,0.875,bicubic
hf_public_repos/pytorch-image-models/results/results-imagenet-real.csv
model,top1,top1_err,top5,top5_err,param_count,img_size,crop_pct,interpolation,top1_diff,top5_diff,rank_diff
eva02_large_patch14_448.mim_m38m_ft_in22k_in1k,91.129,8.871,98.713,1.287,305.08,448,1.000,bicubic,+1.077,-0.335,0
eva_giant_patch14_336.clip_ft_in1k,91.058,8.942,98.602,1.399,"1,013.01",336,1.000,bicubic,+1.592,-0.224,+5
eva02_large_patch14_448.mim_in22k_ft_in22k_in1k,91.022,8.978,98.683,1.317,305.08,448,1.000,bicubic,+1.052,-0.329,-1
eva_giant_patch14_560.m30m_ft_in22k_in1k,90.969,9.031,98.672,1.328,"1,014.45",560,1.000,bicubic,+1.183,-0.320,-1
eva02_large_patch14_448.mim_in22k_ft_in1k,90.920,9.080,98.685,1.315,305.08,448,1.000,bicubic,+1.298,-0.265,-1
eva_giant_patch14_336.m30m_ft_in22k_in1k,90.907,9.093,98.661,1.339,"1,013.01",336,1.000,bicubic,+1.341,-0.291,0
eva_large_patch14_336.in22k_ft_in1k,90.905,9.095,98.785,1.215,304.53,336,1.000,bicubic,+2.235,+0.063,+6
eva_giant_patch14_224.clip_ft_in1k,90.900,9.100,98.680,1.319,"1,012.56",224,0.900,bicubic,+2.020,+0.000,+1
eva02_base_patch14_448.mim_in22k_ft_in22k_in1k,90.896,9.104,98.802,1.198,87.12,448,1.000,bicubic,+2.206,+0.078,+2
eva02_large_patch14_448.mim_m38m_ft_in1k,90.890,9.110,98.653,1.347,305.08,448,1.000,bicubic,+1.316,-0.271,-5
eva_large_patch14_336.in22k_ft_in22k_in1k,90.862,9.138,98.715,1.285,304.53,336,1.000,bicubic,+1.656,-0.139,-3
eva02_base_patch14_448.mim_in22k_ft_in1k,90.800,9.200,98.742,1.258,87.12,448,1.000,bicubic,+2.548,+0.178,+14
caformer_b36.sail_in22k_ft_in1k_384,90.781,9.219,98.860,1.140,98.75,384,1.000,bicubic,+2.723,+0.278,+21
beit_large_patch16_512.in22k_ft_in22k_in1k,90.687,9.313,98.753,1.247,305.67,512,1.000,bicubic,+2.091,+0.097,+1
convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_384,90.678,9.322,98.813,1.187,200.13,384,1.000,bicubic,+2.372,+0.231,+8
regnety_1280.swag_ft_in1k,90.657,9.343,98.819,1.181,644.81,384,1.000,bicubic,+2.427,+0.133,+12
convnext_xxlarge.clip_laion2b_soup_ft_in1k,90.642,9.358,98.806,1.194,846.47,256,1.000,bicubic,+2.038,+0.099,-3
convnext_large_mlp.clip_laion2b_augreg_ft_in1k_384,90.633,9.367,98.755,1.245,200.13,384,1.000,bicubic,+2.785,+0.309,+24
volo_d5_512.sail_in1k,90.614,9.386,98.698,1.302,296.09,512,1.150,bicubic,+3.556,+0.728,+49
beit_large_patch16_384.in22k_ft_in22k_in1k,90.606,9.394,98.766,1.234,305.00,384,1.000,bicubic,+2.204,+0.158,-1
maxvit_rmlp_base_rw_384.sw_in12k_ft_in1k,90.584,9.416,98.617,1.383,116.14,384,1.000,bicubic,+2.756,+0.245,+22
volo_d5_448.sail_in1k,90.580,9.420,98.685,1.315,295.91,448,1.150,bicubic,+3.628,+0.747,+53
tf_efficientnet_l2.ns_jft_in1k,90.561,9.439,98.775,1.226,480.31,800,0.960,bicubic,+2.209,+0.127,-3
maxvit_base_tf_512.in21k_ft_in1k,90.561,9.439,98.702,1.298,119.88,512,1.000,bicubic,+2.341,+0.172,+7
eva_large_patch14_196.in22k_ft_in22k_in1k,90.557,9.443,98.698,1.302,304.14,196,1.000,bicubic,+1.983,+0.040,-8
convnextv2_huge.fcmae_ft_in22k_in1k_512,90.542,9.458,98.710,1.290,660.29,512,1.000,bicubic,+1.684,-0.038,-16
vit_large_patch14_clip_336.openai_ft_in12k_in1k,90.542,9.458,98.687,1.313,304.53,336,1.000,bicubic,+2.274,+0.161,-3
convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_320,90.540,9.460,98.806,1.194,200.13,320,1.000,bicubic,+2.582,+0.331,+8
tf_efficientnet_l2.ns_jft_in1k_475,90.537,9.463,98.710,1.290,480.31,475,0.936,bicubic,+2.303,+0.164,-2
eva_large_patch14_196.in22k_ft_in1k,90.535,9.465,98.770,1.230,304.14,196,1.000,bicubic,+2.603,+0.272,+7
volo_d4_448.sail_in1k,90.512,9.488,98.591,1.409,193.41,448,1.150,bicubic,+3.720,+0.707,+53
maxvit_xlarge_tf_512.in21k_ft_in1k,90.503,9.497,98.576,1.424,475.77,512,1.000,bicubic,+1.965,-0.068,-14
vit_huge_patch14_clip_336.laion2b_ft_in12k_in1k,90.501,9.499,98.631,1.369,632.46,336,1.000,bicubic,+1.909,-0.031,-17
convnextv2_huge.fcmae_ft_in22k_in1k_384,90.497,9.503,98.695,1.304,660.29,384,1.000,bicubic,+1.827,-0.043,-22
convnext_xlarge.fb_in22k_ft_in1k_384,90.495,9.505,98.768,1.232,350.20,384,1.000,bicubic,+2.743,+0.212,+9
caformer_m36.sail_in22k_ft_in1k_384,90.460,9.540,98.668,1.332,56.20,384,1.000,bicubic,+3.014,+0.360,+18
convformer_b36.sail_in22k_ft_in1k_384,90.448,9.552,98.772,1.228,99.88,384,1.000,bicubic,+2.846,+0.338,+10
maxxvitv2_rmlp_base_rw_384.sw_in12k_ft_in1k,90.444,9.556,98.749,1.251,116.09,384,1.000,bicubic,+2.980,+0.375,+14
caformer_b36.sail_in22k_ft_in1k,90.431,9.569,98.764,1.236,98.75,224,1.000,bicubic,+3.011,+0.436,+16
vit_large_patch14_clip_336.laion2b_ft_in12k_in1k,90.428,9.572,98.642,1.358,304.53,336,1.000,bicubic,+2.248,+0.070,-8
vit_huge_patch14_clip_224.laion2b_ft_in12k_in1k,90.418,9.582,98.644,1.356,632.05,224,1.000,bicubic,+2.162,+0.092,-16
swinv2_base_window12to24_192to384.ms_in22k_ft_in1k,90.407,9.593,98.734,1.266,87.92,384,1.000,bicubic,+3.311,+0.500,+24
vit_large_patch14_clip_224.openai_ft_in12k_in1k,90.382,9.618,98.659,1.341,304.20,224,1.000,bicubic,+2.207,+0.113,-10
maxvit_xlarge_tf_384.in21k_ft_in1k,90.379,9.621,98.582,1.418,475.32,384,1.000,bicubic,+2.065,+0.038,-22
coatnet_rmlp_2_rw_384.sw_in12k_ft_in1k,90.377,9.623,98.648,1.351,73.88,384,1.000,bicubic,+2.995,+0.337,+12
beit_base_patch16_384.in22k_ft_in22k_in1k,90.367,9.633,98.725,1.275,86.74,384,1.000,bicubic,+3.567,+0.589,+36
convnextv2_large.fcmae_ft_in22k_in1k_384,90.367,9.633,98.663,1.337,197.96,384,1.000,bicubic,+2.169,+0.135,-16
seresnextaa201d_32x8d.sw_in12k_ft_in1k_384,90.362,9.638,98.734,1.266,149.39,384,1.000,bicubic,+3.074,+0.400,+11
maxvit_large_tf_512.in21k_ft_in1k,90.362,9.638,98.642,1.358,212.33,512,1.000,bicubic,+2.138,+0.044,-19
maxvit_base_tf_384.in21k_ft_in1k,90.360,9.640,98.683,1.317,119.65,384,1.000,bicubic,+2.438,+0.139,-13
convnextv2_base.fcmae_ft_in22k_in1k_384,90.360,9.640,98.670,1.330,88.72,384,1.000,bicubic,+2.716,+0.254,-4
beitv2_large_patch16_224.in1k_ft_in22k_in1k,90.354,9.646,98.587,1.413,304.43,224,0.950,bicubic,+1.960,-0.011,-32
vit_large_patch14_clip_336.laion2b_ft_in1k,90.332,9.668,98.597,1.403,304.53,336,1.000,bicubic,+2.476,+0.229,-13
convnext_base.clip_laion2b_augreg_ft_in12k_in1k_384,90.330,9.670,98.770,1.230,88.59,384,1.000,bicubic,+3.196,+0.548,+10
maxvit_large_tf_384.in21k_ft_in1k,90.315,9.685,98.687,1.313,212.03,384,1.000,bicubic,+2.329,+0.119,-20
vit_large_patch14_clip_224.laion2b_ft_in12k_in1k,90.311,9.689,98.668,1.332,304.20,224,1.000,bicubic,+2.417,+0.260,-17
convnext_large_mlp.clip_laion2b_augreg_ft_in1k,90.309,9.691,98.651,1.349,200.13,256,1.000,bicubic,+2.973,+0.433,+1
vit_large_patch14_clip_224.openai_ft_in1k,90.307,9.693,98.640,1.360,304.20,224,1.000,bicubic,+2.453,+0.214,-17
caformer_s36.sail_in22k_ft_in1k_384,90.305,9.695,98.794,1.206,39.30,384,1.000,bicubic,+3.447,+0.582,+19
convnext_base.fb_in22k_ft_in1k_384,90.283,9.717,98.800,1.200,88.59,384,1.000,bicubic,+3.487,+0.536,+23
convnext_large.fb_in22k_ft_in1k_384,90.279,9.721,98.659,1.341,197.77,384,1.000,bicubic,+2.807,+0.273,-10
deit3_large_patch16_384.fb_in22k_ft_in1k,90.247,9.753,98.625,1.375,304.76,384,1.000,bicubic,+2.527,+0.113,-17
dm_nfnet_f6.dm_in1k,90.241,9.759,98.625,1.375,438.36,576,0.956,bicubic,+3.879,+0.729,+44
seresnextaa101d_32x8d.sw_in12k_ft_in1k_288,90.226,9.774,98.766,1.234,93.59,320,1.000,bicubic,+3.502,+0.590,+22
deit3_huge_patch14_224.fb_in22k_ft_in1k,90.226,9.774,98.640,1.360,632.13,224,1.000,bicubic,+3.040,+0.380,-1
regnety_320.swag_ft_in1k,90.213,9.787,98.764,1.236,145.05,384,1.000,bicubic,+3.379,+0.402,+14
vit_large_patch16_384.augreg_in21k_ft_in1k,90.213,9.787,98.661,1.339,304.72,384,1.000,bicubic,+3.129,+0.359,0
vit_base_patch16_clip_384.laion2b_ft_in1k,90.211,9.789,98.702,1.298,86.86,384,1.000,bicubic,+3.593,+0.694,+21
convformer_m36.sail_in22k_ft_in1k_384,90.204,9.796,98.651,1.349,57.05,384,1.000,bicubic,+3.312,+0.535,+8
convnextv2_huge.fcmae_ft_in1k,90.204,9.796,98.548,1.452,660.29,288,1.000,bicubic,+3.624,+0.576,+22
tf_efficientnetv2_l.in21k_ft_in1k,90.202,9.798,98.719,1.281,118.52,480,1.000,bicubic,+3.400,+0.583,+10
vit_base_patch16_clip_384.openai_ft_in12k_in1k,90.200,9.800,98.648,1.351,86.86,384,0.950,bicubic,+3.174,+0.466,-2
convformer_b36.sail_in22k_ft_in1k,90.189,9.811,98.695,1.304,99.88,224,1.000,bicubic,+3.191,+0.523,-3
vit_base_patch16_clip_384.laion2b_ft_in12k_in1k,90.189,9.811,98.585,1.415,86.86,384,1.000,bicubic,+2.983,+0.550,-13
cait_m48_448.fb_dist_in1k,90.189,9.811,98.484,1.516,356.46,448,1.000,bicubic,+3.697,+0.732,+25
vit_huge_patch14_clip_224.laion2b_ft_in1k,90.181,9.819,98.544,1.456,632.05,224,1.000,bicubic,+2.593,+0.326,-28
volo_d3_448.sail_in1k,90.177,9.823,98.550,1.450,86.63,448,1.000,bicubic,+3.675,+0.840,+20
swinv2_large_window12to24_192to384.ms_in22k_ft_in1k,90.157,9.843,98.614,1.386,196.74,384,1.000,bicubic,+2.693,+0.364,-25
beit_large_patch16_224.in22k_ft_in22k_in1k,90.149,9.851,98.725,1.275,304.43,224,0.900,bicubic,+2.671,+0.421,-29
beitv2_large_patch16_224.in1k_ft_in1k,90.100,9.900,98.439,1.561,304.43,224,0.950,bicubic,+2.688,+0.205,-24
tf_efficientnet_b7.ns_jft_in1k,90.098,9.902,98.614,1.386,66.35,600,0.949,bicubic,+3.258,+0.522,-2
vit_large_patch14_clip_224.laion2b_ft_in1k,90.095,9.905,98.555,1.445,304.20,224,1.000,bicubic,+2.809,+0.311,-21
convnextv2_base.fcmae_ft_in22k_in1k,90.068,9.932,98.676,1.324,88.72,288,1.000,bicubic,+3.070,+0.508,-11
convnext_xlarge.fb_in22k_ft_in1k,90.066,9.934,98.619,1.381,350.20,288,1.000,bicubic,+2.736,+0.291,-25
caformer_b36.sail_in1k_384,90.063,9.937,98.514,1.486,98.75,384,1.000,bicubic,+3.655,+0.700,+19
cait_m36_384.fb_dist_in1k,90.051,9.949,98.495,1.505,271.22,384,1.000,bicubic,+3.993,+0.765,+41
convnextv2_large.fcmae_ft_in22k_in1k,90.034,9.966,98.629,1.371,197.96,288,1.000,bicubic,+2.550,+0.273,-38
seresnextaa101d_32x8d.sw_in12k_ft_in1k,90.029,9.971,98.685,1.315,93.59,288,1.000,bicubic,+3.545,+0.655,+11
tiny_vit_21m_512.dist_in22k_ft_in1k,90.029,9.971,98.493,1.507,21.27,512,1.000,bicubic,+3.571,+0.609,+12
tf_efficientnetv2_m.in21k_ft_in1k,90.025,9.975,98.666,1.334,54.14,480,1.000,bicubic,+4.033,+0.722,+42
convformer_s36.sail_in22k_ft_in1k_384,90.023,9.977,98.619,1.381,40.01,384,1.000,bicubic,+3.645,+0.635,+14
swin_large_patch4_window12_384.ms_in22k_ft_in1k,90.019,9.981,98.661,1.339,196.74,384,1.000,bicubic,+2.887,+0.427,-27
deit3_large_patch16_224.fb_in22k_ft_in1k,90.008,9.992,98.659,1.341,304.37,224,1.000,bicubic,+3.026,+0.423,-20
convnext_base.clip_laiona_augreg_ft_in1k_384,90.001,9.998,98.548,1.452,88.59,384,1.000,bicubic,+3.499,+0.580,+2
swin_base_patch4_window12_384.ms_in22k_ft_in1k,89.997,10.003,98.700,1.300,87.90,384,1.000,bicubic,+3.559,+0.634,+8
vit_base_patch16_384.augreg_in21k_ft_in1k,89.989,10.011,98.678,1.322,86.86,384,1.000,bicubic,+3.995,+0.676,+35
maxvit_base_tf_512.in1k,89.974,10.026,98.433,1.567,119.88,512,1.000,bicubic,+3.372,+0.515,-7
caformer_m36.sail_in1k_384,89.933,10.067,98.454,1.546,56.20,384,1.000,bicubic,+3.767,+0.634,+20
convnextv2_large.fcmae_ft_in1k,89.931,10.069,98.559,1.441,197.96,288,1.000,bicubic,+3.813,+0.737,+22
maxvit_rmlp_base_rw_224.sw_in12k_ft_in1k,89.927,10.073,98.414,1.586,116.14,224,0.950,bicubic,+3.033,+0.400,-24
swinv2_large_window12to16_192to256.ms_in22k_ft_in1k,89.925,10.075,98.505,1.494,196.74,256,0.900,bicubic,+2.973,+0.400,-27
regnety_160.swag_ft_in1k,89.918,10.082,98.644,1.356,83.59,384,1.000,bicubic,+3.898,+0.592,+27
convnext_small.fb_in22k_ft_in1k_384,89.916,10.084,98.685,1.315,50.22,384,1.000,bicubic,+4.138,+0.795,+43
efficientnet_b5.sw_in12k_ft_in1k,89.912,10.088,98.565,1.435,30.39,448,1.000,bicubic,+4.016,+0.829,+33
deit3_base_patch16_384.fb_in22k_ft_in1k,89.891,10.110,98.602,1.399,86.88,384,1.000,bicubic,+3.151,+0.486,-19
xcit_large_24_p8_384.fb_dist_in1k,89.886,10.114,98.384,1.616,188.93,384,1.000,bicubic,+3.890,+0.694,+24
volo_d5_224.sail_in1k,89.880,10.120,98.490,1.510,295.46,224,0.960,bicubic,+3.810,+0.915,+19
convnext_large.fb_in22k_ft_in1k,89.876,10.124,98.593,1.407,197.77,288,1.000,bicubic,+2.850,+0.389,-39
swinv2_base_window12to16_192to256.ms_in22k_ft_in1k,89.867,10.133,98.644,1.356,87.92,256,0.900,bicubic,+3.599,+0.762,+2
tiny_vit_21m_384.dist_in22k_ft_in1k,89.863,10.137,98.499,1.501,21.23,384,1.000,bicubic,+3.755,+0.789,+12
convnext_base.fb_in22k_ft_in1k,89.858,10.142,98.691,1.309,88.59,288,1.000,bicubic,+3.584,+0.599,-1
caformer_m36.sail_in22k_ft_in1k,89.852,10.148,98.585,1.415,56.20,224,1.000,bicubic,+3.258,+0.561,-21
cait_s36_384.fb_dist_in1k,89.835,10.165,98.427,1.573,68.37,384,1.000,bicubic,+4.381,+0.949,+52
coatnet_2_rw_224.sw_in12k_ft_in1k,89.831,10.169,98.527,1.473,73.87,224,0.950,bicubic,+3.267,+0.631,-21
convformer_m36.sail_in22k_ft_in1k,89.818,10.182,98.548,1.452,57.05,224,1.000,bicubic,+3.670,+0.698,+5
xcit_medium_24_p8_384.fb_dist_in1k,89.816,10.184,98.365,1.635,84.32,384,1.000,bicubic,+4.000,+0.773,+25
convnext_base.clip_laion2b_augreg_ft_in12k_in1k,89.811,10.188,98.668,1.332,88.59,256,1.000,bicubic,+3.441,+0.684,-11
volo_d4_224.sail_in1k,89.811,10.188,98.427,1.573,192.96,224,0.960,bicubic,+3.939,+0.955,+20
maxvit_large_tf_512.in1k,89.799,10.201,98.330,1.670,212.33,512,1.000,bicubic,+3.273,+0.450,-25
vit_large_r50_s32_384.augreg_in21k_ft_in1k,89.792,10.208,98.522,1.478,329.09,384,1.000,bicubic,+3.610,+0.600,-4
swin_large_patch4_window7_224.ms_in22k_ft_in1k,89.786,10.214,98.640,1.360,196.53,224,0.900,bicubic,+3.474,+0.738,-13
tf_efficientnet_b6.ns_jft_in1k,89.786,10.214,98.508,1.492,43.04,528,0.942,bicubic,+3.328,+0.618,-20
volo_d2_384.sail_in1k,89.784,10.216,98.403,1.597,58.87,384,1.000,bicubic,+3.742,+0.829,+5
coatnet_rmlp_2_rw_224.sw_in12k_ft_in1k,89.775,10.225,98.469,1.531,73.88,224,0.950,bicubic,+3.271,+0.575,-29
tf_efficientnetv2_xl.in21k_ft_in1k,89.773,10.227,98.288,1.712,208.12,512,1.000,bicubic,+3.025,+0.274,-40
maxxvitv2_rmlp_base_rw_224.sw_in12k_ft_in1k,89.752,10.248,98.484,1.516,116.09,224,0.950,bicubic,+3.110,+0.464,-38
beitv2_base_patch16_224.in1k_ft_in22k_in1k,89.743,10.257,98.585,1.415,86.53,224,0.900,bicubic,+3.269,+0.532,-27
dm_nfnet_f4.dm_in1k,89.741,10.259,98.409,1.591,316.07,512,0.951,bicubic,+3.905,+0.591,+12
dm_nfnet_f5.dm_in1k,89.737,10.263,98.441,1.559,377.21,544,0.954,bicubic,+3.637,+0.753,-6
xcit_small_24_p8_384.fb_dist_in1k,89.735,10.265,98.420,1.580,47.63,384,1.000,bicubic,+4.181,+0.850,+28
regnety_160.lion_in12k_ft_in1k,89.726,10.274,98.608,1.392,83.59,288,1.000,bicubic,+3.738,+0.774,+2
caformer_s18.sail_in22k_ft_in1k_384,89.720,10.280,98.578,1.422,26.34,384,1.000,bicubic,+4.306,+0.876,+36
vit_base_patch8_224.augreg2_in21k_ft_in1k,89.718,10.283,98.510,1.490,86.58,224,0.900,bicubic,+3.499,+0.678,-21
convformer_m36.sail_in1k_384,89.718,10.283,98.435,1.565,57.05,384,1.000,bicubic,+4.138,+0.893,+24
vit_base_patch16_clip_384.openai_ft_in1k,89.707,10.293,98.508,1.492,86.86,384,1.000,bicubic,+3.501,+0.632,-21
caformer_s36.sail_in22k_ft_in1k,89.701,10.300,98.638,1.362,39.30,224,1.000,bicubic,+3.910,+0.812,+8
volo_d1_384.sail_in1k,89.698,10.302,98.290,1.710,26.78,384,1.000,bicubic,+4.454,+1.096,+45
deit3_large_patch16_384.fb_in1k,89.688,10.312,98.390,1.610,304.76,384,1.000,bicubic,+3.876,+0.792,+4
caformer_s36.sail_in1k_384,89.679,10.321,98.324,1.676,39.30,384,1.000,bicubic,+3.937,+0.652,+9
xcit_large_24_p16_384.fb_dist_in1k,89.662,10.338,98.403,1.597,189.10,384,1.000,bicubic,+3.908,+0.865,+7
convformer_b36.sail_in1k_384,89.658,10.342,98.379,1.621,99.88,384,1.000,bicubic,+3.918,+0.855,+8
tf_efficientnet_b5.ns_jft_in1k,89.651,10.349,98.486,1.514,30.39,456,0.934,bicubic,+3.563,+0.730,-18
dm_nfnet_f3.dm_in1k,89.647,10.353,98.463,1.537,254.92,416,0.940,bicubic,+3.961,+0.893,+9
convnext_base.clip_laion2b_augreg_ft_in1k,89.645,10.355,98.463,1.537,88.59,256,1.000,bicubic,+3.487,+0.783,-25
regnety_2560.seer_ft_in1k,89.632,10.368,98.399,1.601,"1,282.60",384,1.000,bicubic,+4.482,+0.961,+48
convnext_small.in12k_ft_in1k_384,89.598,10.402,98.480,1.520,50.22,384,1.000,bicubic,+3.416,+0.558,-31
maxvit_base_tf_384.in1k,89.589,10.411,98.320,1.680,119.65,384,1.000,bicubic,+3.287,+0.522,-38
tf_efficientnet_b8.ap_in1k,89.579,10.421,98.305,1.695,87.41,672,0.954,bicubic,+4.215,+1.013,+28
regnety_160.sw_in12k_ft_in1k,89.570,10.430,98.546,1.454,83.59,288,1.000,bicubic,+3.584,+0.712,-15
vit_base_patch16_clip_224.laion2b_ft_in12k_in1k,89.566,10.434,98.420,1.580,86.57,224,0.950,bicubic,+3.396,+0.664,-33
maxvit_tiny_tf_512.in1k,89.562,10.438,98.335,1.665,31.05,512,1.000,bicubic,+3.898,+0.751,+2
convformer_s18.sail_in22k_ft_in1k_384,89.553,10.447,98.531,1.469,26.77,384,1.000,bicubic,+4.555,+0.961,+57
volo_d3_224.sail_in1k,89.553,10.447,98.373,1.627,86.33,224,0.960,bicubic,+4.139,+1.097,+16
maxvit_large_tf_384.in1k,89.553,10.447,98.185,1.815,212.03,384,1.000,bicubic,+3.323,+0.497,-41
tf_efficientnetv2_l.in1k,89.540,10.460,98.339,1.661,118.52,480,1.000,bicubic,+3.876,+0.865,-1
flexivit_large.1200ep_in1k,89.534,10.466,98.414,1.586,304.36,240,0.950,bicubic,+3.890,+0.874,-1
xcit_large_24_p8_224.fb_dist_in1k,89.517,10.483,98.217,1.783,188.93,224,1.000,bicubic,+4.115,+0.815,+14
flexivit_large.600ep_in1k,89.510,10.490,98.392,1.608,304.36,240,0.950,bicubic,+3.970,+0.904,+1
xcit_small_12_p8_384.fb_dist_in1k,89.508,10.492,98.307,1.693,26.21,384,1.000,bicubic,+4.430,+1.025,+43
cait_s24_384.fb_dist_in1k,89.504,10.496,98.362,1.638,47.06,384,1.000,bicubic,+4.456,+1.016,+45
convformer_s36.sail_in22k_ft_in1k,89.496,10.505,98.454,1.546,40.01,224,1.000,bicubic,+4.082,+0.886,+8
convnextv2_tiny.fcmae_ft_in22k_in1k_384,89.485,10.515,98.484,1.516,28.64,384,1.000,bicubic,+4.379,+0.856,+34
convformer_s36.sail_in1k_384,89.481,10.520,98.369,1.631,40.01,384,1.000,bicubic,+4.103,+0.893,+10
xcit_medium_24_p16_384.fb_dist_in1k,89.481,10.520,98.296,1.704,84.40,384,1.000,bicubic,+4.056,+0.966,+2
convnext_tiny.in12k_ft_in1k_384,89.472,10.528,98.505,1.494,28.59,384,1.000,bicubic,+4.350,+0.900,+30
inception_next_base.sail_in1k_384,89.453,10.547,98.345,1.655,86.67,384,1.000,bicubic,+4.251,+0.931,+23
regnety_120.sw_in12k_ft_in1k,89.446,10.554,98.537,1.462,51.82,288,1.000,bicubic,+4.046,+0.956,+5
vit_base_patch16_224.augreg2_in21k_ft_in1k,89.446,10.554,98.441,1.559,86.57,224,0.900,bicubic,+4.352,+0.911,+31
deit3_base_patch16_224.fb_in22k_ft_in1k,89.444,10.556,98.557,1.443,86.59,224,1.000,bicubic,+3.744,+0.811,-18
maxvit_small_tf_512.in1k,89.444,10.556,98.354,1.646,69.13,512,1.000,bicubic,+3.360,+0.590,-45
vit_base_patch8_224.augreg_in21k_ft_in1k,89.436,10.564,98.484,1.516,86.58,224,0.900,bicubic,+3.638,+0.694,-28
vit_base_patch32_clip_448.laion2b_ft_in12k_in1k,89.436,10.564,98.401,1.599,88.34,448,1.000,bicubic,+3.656,+0.763,-27
deit_base_distilled_patch16_384.fb_in1k,89.434,10.566,98.439,1.561,87.63,384,1.000,bicubic,+4.010,+1.033,-6
vit_base_patch16_clip_224.laion2b_ft_in1k,89.431,10.569,98.469,1.531,86.57,224,1.000,bicubic,+3.961,+0.893,-11
tf_efficientnet_b7.ap_in1k,89.431,10.569,98.345,1.655,66.35,600,0.949,bicubic,+4.307,+1.093,+20
convnextv2_base.fcmae_ft_in1k,89.421,10.579,98.360,1.640,88.72,288,1.000,bicubic,+3.947,+0.976,-13
caformer_b36.sail_in1k,89.408,10.592,98.222,1.778,98.75,224,1.000,bicubic,+3.904,+0.912,-15
vit_base_patch16_clip_224.openai_ft_in12k_in1k,89.404,10.596,98.394,1.606,86.57,224,0.950,bicubic,+3.462,+0.666,-42
hrnet_w48_ssld.paddle_in1k,89.401,10.598,98.382,1.618,77.47,288,1.000,bilinear,+4.921,+1.148,+69
beit_base_patch16_224.in22k_ft_in22k_in1k,89.395,10.605,98.529,1.471,86.53,224,0.900,bicubic,+4.183,+0.871,+7
regnetz_e8.ra3_in1k,89.378,10.622,98.459,1.542,57.70,320,1.000,bicubic,+4.344,+1.186,+25
deit3_small_patch16_384.fb_in22k_ft_in1k,89.363,10.637,98.386,1.614,22.21,384,1.000,bicubic,+4.539,+0.900,+39
tf_efficientnetv2_m.in1k,89.350,10.650,98.326,1.674,54.14,480,1.000,bicubic,+4.146,+0.962,+5
vit_medium_patch16_gap_384.sw_in12k_ft_in1k,89.348,10.652,98.495,1.505,39.03,384,0.950,bicubic,+3.818,+0.859,-24
tf_efficientnet_b8.ra_in1k,89.348,10.652,98.305,1.695,87.41,672,0.954,bicubic,+3.980,+0.911,-10
tf_efficientnet_b6.ap_in1k,89.344,10.656,98.283,1.717,43.04,528,0.942,bicubic,+4.556,+1.145,+38
volo_d2_224.sail_in1k,89.335,10.665,98.213,1.787,58.68,224,0.960,bicubic,+4.133,+1.023,+3
eva02_small_patch14_336.mim_in22k_ft_in1k,89.333,10.667,98.377,1.623,22.13,336,1.000,bicubic,+3.615,+0.743,-38
caformer_s18.sail_in1k_384,89.331,10.669,98.294,1.706,26.34,384,1.000,bicubic,+4.305,+0.936,+18
vit_large_patch16_224.augreg_in21k_ft_in1k,89.318,10.682,98.390,1.610,304.33,224,0.900,bicubic,+3.482,+0.726,-51
flexivit_large.300ep_in1k,89.310,10.690,98.324,1.676,304.36,240,0.950,bicubic,+4.022,+0.924,-12
tf_efficientnet_b4.ns_jft_in1k,89.301,10.699,98.345,1.655,19.34,380,0.922,bicubic,+4.141,+0.877,0
convnext_small.fb_in22k_ft_in1k,89.299,10.701,98.358,1.642,50.22,288,1.000,bicubic,+4.037,+0.676,-12
xcit_small_24_p16_384.fb_dist_in1k,89.295,10.705,98.326,1.674,47.67,384,1.000,bicubic,+4.205,+1.014,+6
xcit_medium_24_p8_224.fb_dist_in1k,89.293,10.707,98.185,1.815,84.32,224,1.000,bicubic,+4.219,+0.935,+8
beitv2_base_patch16_224.in1k_ft_in1k,89.271,10.729,98.262,1.738,86.53,224,0.900,bicubic,+3.677,+0.756,-40
deit3_huge_patch14_224.fb_in1k,89.218,10.782,98.164,1.836,632.13,224,0.900,bicubic,+3.994,+0.804,-12
coat_lite_medium_384.in1k,89.209,10.791,98.219,1.781,44.57,384,1.000,bicubic,+4.331,+0.847,+19
xcit_small_24_p8_224.fb_dist_in1k,89.199,10.801,98.243,1.757,47.63,224,1.000,bicubic,+4.331,+1.053,+19
vit_base_patch32_clip_384.laion2b_ft_in12k_in1k,89.197,10.803,98.358,1.642,88.30,384,1.000,bicubic,+3.831,+0.698,-25
dm_nfnet_f2.dm_in1k,89.194,10.806,98.228,1.772,193.78,352,0.920,bicubic,+4.002,+0.882,-10
xcit_small_12_p16_384.fb_dist_in1k,89.194,10.806,98.217,1.783,26.25,384,1.000,bicubic,+4.482,+1.099,+25
swin_base_patch4_window7_224.ms_in22k_ft_in1k,89.173,10.827,98.433,1.567,87.77,224,0.900,bicubic,+3.901,+0.869,-23
vit_base_patch16_clip_224.openai_ft_in1k,89.171,10.829,98.269,1.732,86.57,224,0.900,bicubic,+3.879,+0.832,-26
eca_nfnet_l2.ra3_in1k,89.147,10.852,98.315,1.685,56.72,384,1.000,bicubic,+4.447,+1.049,+23
cait_xs24_384.fb_dist_in1k,89.147,10.852,98.292,1.708,26.67,384,1.000,bicubic,+5.085,+1.408,+87
fastvit_ma36.apple_dist_in1k,89.124,10.876,98.142,1.857,44.07,256,0.950,bicubic,+4.526,+1.141,+27
convformer_s18.sail_in1k_384,89.124,10.876,98.130,1.870,26.77,384,1.000,bicubic,+4.722,+1.018,+57
maxvit_tiny_tf_384.in1k,89.111,10.889,98.211,1.789,30.98,384,1.000,bicubic,+4.011,+0.833,-12
maxvit_small_tf_384.in1k,89.109,10.891,98.164,1.836,69.02,384,1.000,bicubic,+3.569,+0.702,-50
resnext101_32x32d.fb_wsl_ig1b_ft_in1k,89.107,10.893,98.189,1.810,468.53,224,0.875,bilinear,+4.009,+0.751,-13
tf_efficientnet_b7.ra_in1k,89.081,10.919,98.185,1.815,66.35,600,0.949,bicubic,+4.149,+0.977,+1
tiny_vit_21m_224.dist_in22k_ft_in1k,89.077,10.923,98.236,1.764,21.20,224,0.950,bicubic,+3.991,+0.870,-12
ecaresnet269d.ra2_in1k,89.071,10.929,98.234,1.766,102.09,352,1.000,bicubic,+4.103,+1.012,-3
regnety_1280.seer_ft_in1k,89.064,10.936,98.157,1.843,644.81,384,1.000,bicubic,+4.632,+1.065,+41
vit_base_patch32_clip_384.openai_ft_in12k_in1k,89.045,10.955,98.281,1.719,88.30,384,0.950,bicubic,+3.831,+0.877,-30
xcit_large_24_p16_224.fb_dist_in1k,89.045,10.955,98.059,1.941,189.10,224,1.000,bicubic,+4.129,+0.931,-2
convnext_small.in12k_ft_in1k,89.024,10.976,98.243,1.757,50.22,288,1.000,bicubic,+3.694,+0.697,-41
resmlp_big_24_224.fb_in22k_ft_in1k,89.019,10.981,98.215,1.785,129.14,224,0.875,bicubic,+4.621,+1.103,+46
dm_nfnet_f1.dm_in1k,89.017,10.983,98.256,1.744,132.63,320,0.910,bicubic,+4.315,+1.074,+8
xcit_small_12_p8_224.fb_dist_in1k,89.009,10.991,98.076,1.924,26.21,224,1.000,bicubic,+4.773,+1.206,+56
convnext_large.fb_in1k,88.994,11.006,98.040,1.960,197.77,288,1.000,bicubic,+4.148,+0.826,-2
efficientnetv2_rw_m.agc_in1k,88.985,11.015,98.219,1.781,53.24,416,1.000,bicubic,+4.175,+1.067,-1
caformer_m36.sail_in1k,88.985,11.015,98.016,1.984,56.20,224,1.000,bicubic,+3.753,+0.816,-39
regnety_1280.swag_lc_in1k,88.955,11.045,98.228,1.772,644.81,224,0.965,bicubic,+2.973,+0.378,-90
regnetz_d8_evos.ch_in1k,88.951,11.049,98.181,1.819,23.46,320,1.000,bicubic,+4.825,+1.169,+60
regnetz_040_h.ra3_in1k,88.949,11.051,98.209,1.791,28.94,320,1.000,bicubic,+4.457,+1.451,+19
edgenext_base.in21k_ft_in1k,88.942,11.057,98.279,1.721,18.51,320,1.000,bicubic,+4.888,+1.083,+66
tf_efficientnet_b5.ap_in1k,88.942,11.057,98.166,1.834,30.39,456,0.934,bicubic,+4.684,+1.192,+44
mvitv2_large.fb_in1k,88.936,11.064,97.965,2.035,217.99,224,0.900,bicubic,+3.692,+0.751,-47
caformer_s18.sail_in22k_ft_in1k,88.932,11.068,98.311,1.689,26.34,224,1.000,bicubic,+4.858,+1.113,+60
deit3_base_patch16_384.fb_in1k,88.923,11.077,98.042,1.958,86.88,384,1.000,bicubic,+3.849,+0.768,-28
deit3_medium_patch16_224.fb_in22k_ft_in1k,88.919,11.081,98.296,1.704,38.85,224,1.000,bicubic,+4.369,+1.108,+4
volo_d1_224.sail_in1k,88.915,11.085,98.027,1.973,26.63,224,0.960,bicubic,+4.753,+1.251,+50
convnext_tiny.in12k_ft_in1k,88.906,11.094,98.309,1.691,28.59,288,1.000,bicubic,+4.456,+0.969,+15
tf_efficientnetv2_s.in21k_ft_in1k,88.898,11.102,98.275,1.725,21.46,384,1.000,bicubic,+4.612,+1.023,+34
vit_base_patch16_224.augreg_in21k_ft_in1k,88.868,11.132,98.232,1.768,86.57,224,0.900,bicubic,+4.336,+0.938,+3
convnext_tiny.fb_in22k_ft_in1k_384,88.855,11.145,98.296,1.704,28.59,384,1.000,bicubic,+4.767,+1.152,+52
regnetz_d8.ra3_in1k,88.853,11.147,98.189,1.810,23.37,320,1.000,bicubic,+4.801,+1.194,+56
convformer_b36.sail_in1k,88.851,11.149,97.878,2.122,99.88,224,1.000,bicubic,+4.033,+0.932,-18
coatnet_rmlp_1_rw2_224.sw_in12k_ft_in1k,88.842,11.158,97.869,2.131,41.72,224,0.950,bicubic,+3.932,+0.911,-25
resnetrs420.tf_in1k,88.838,11.162,98.031,1.968,191.89,416,1.000,bicubic,+3.834,+0.907,-34
resnext101_32x16d.fb_wsl_ig1b_ft_in1k,88.834,11.166,98.051,1.949,194.03,224,0.875,bilinear,+4.668,+0.853,+40
resnetrs270.tf_in1k,88.829,11.171,98.136,1.864,129.86,352,1.000,bicubic,+4.401,+1.168,+14
vit_small_r26_s32_384.augreg_in21k_ft_in1k,88.821,11.179,98.341,1.659,36.47,384,1.000,bicubic,+4.773,+1.013,+52
swin_small_patch4_window7_224.ms_in22k_ft_in1k,88.817,11.184,98.328,1.672,49.61,224,0.900,bicubic,+5.519,+1.364,+133
vit_base_r50_s16_384.orig_in21k_ft_in1k,88.806,11.194,98.232,1.768,98.95,384,1.000,bicubic,+3.830,+0.942,-37
xcit_medium_24_p16_224.fb_dist_in1k,88.806,11.194,98.038,1.962,84.40,224,1.000,bicubic,+4.520,+1.106,+23
tf_efficientnet_b7.aa_in1k,88.804,11.196,98.057,1.943,66.35,600,0.949,bicubic,+4.388,+1.149,+13
maxxvit_rmlp_small_rw_256.sw_in1k,88.795,11.205,98.064,1.937,66.01,256,0.950,bicubic,+4.171,+0.996,-18
fastvit_sa36.apple_dist_in1k,88.789,11.211,98.096,1.905,31.53,256,0.900,bicubic,+4.763,+1.242,+47
seresnet152d.ra2_in1k,88.787,11.213,98.172,1.828,66.84,320,1.000,bicubic,+4.427,+1.132,+14
xcit_tiny_24_p8_384.fb_dist_in1k,88.787,11.213,98.155,1.845,12.11,384,1.000,bicubic,+5.041,+1.755,+74
convformer_m36.sail_in1k,88.787,11.213,97.769,2.231,57.05,224,1.000,bicubic,+4.293,+0.903,-7
resnext101_32x8d.fb_swsl_ig1b_ft_in1k,88.782,11.218,98.151,1.849,88.79,224,0.875,bilinear,+4.480,+0.975,+14
resnetrs200.tf_in1k,88.763,11.237,98.115,1.885,93.21,320,1.000,bicubic,+4.319,+1.273,-5
convnext_base.fb_in1k,88.763,11.237,97.920,2.080,88.59,288,1.000,bicubic,+4.335,+0.952,+1
deit3_large_patch16_224.fb_in1k,88.763,11.237,97.920,2.080,304.37,224,0.900,bicubic,+3.989,+0.884,-32
tf_efficientnet_b6.aa_in1k,88.759,11.241,98.066,1.934,43.04,528,0.942,bicubic,+4.647,+1.182,+29
resnetrs350.tf_in1k,88.757,11.243,98.031,1.968,163.96,384,1.000,bicubic,+4.043,+1.039,-34
rexnetr_300.sw_in12k_ft_in1k,88.746,11.254,98.339,1.661,34.81,288,1.000,bicubic,+4.200,+1.083,-23
caformer_s36.sail_in1k,88.746,11.254,98.023,1.977,39.30,224,1.000,bicubic,+4.240,+1.027,-17
convnextv2_tiny.fcmae_ft_in22k_in1k,88.744,11.256,98.194,1.806,28.64,288,1.000,bicubic,+4.328,+0.934,-2
vit_base_patch16_224_miil.in21k_ft_in1k,88.742,11.258,98.027,1.973,86.54,224,0.875,bilinear,+4.476,+1.223,+8
edgenext_base.usi_in1k,88.735,11.265,98.147,1.853,18.51,320,1.000,bicubic,+4.777,+1.377,+38
regnetz_040.ra3_in1k,88.731,11.269,98.091,1.909,27.12,320,1.000,bicubic,+4.491,+1.159,+10
convformer_s18.sail_in22k_ft_in1k,88.727,11.273,98.194,1.806,26.77,224,1.000,bicubic,+4.989,+1.146,+65
resnetv2_152x2_bit.goog_in21k_ft_in1k,88.725,11.275,98.311,1.689,236.34,448,1.000,bilinear,+4.215,+0.877,-26
regnety_160.deit_in1k,88.703,11.297,98.068,1.932,83.59,288,1.000,bicubic,+5.013,+1.288,+69
davit_base.msft_in1k,88.701,11.299,97.874,2.127,87.95,224,0.950,bicubic,+4.059,+0.854,-39
regnety_640.seer_ft_in1k,88.674,11.326,98.166,1.834,281.38,384,1.000,bicubic,+4.766,+1.244,+35
regnetz_d32.ra3_in1k,88.652,11.348,98.078,1.922,27.58,320,0.950,bicubic,+4.630,+1.210,+27
vit_small_patch16_384.augreg_in21k_ft_in1k,88.648,11.352,98.230,1.770,22.20,384,1.000,bicubic,+4.844,+1.130,+47
flexivit_base.1200ep_in1k,88.648,11.352,97.935,2.065,86.59,240,0.950,bicubic,+3.972,+0.941,-43
mvitv2_base.fb_in1k,88.644,11.356,97.826,2.174,51.47,224,0.900,bicubic,+4.194,+0.968,-24
regnety_080.ra3_in1k,88.635,11.365,97.965,2.035,39.18,288,1.000,bicubic,+4.709,+1.075,+28
vit_medium_patch16_gap_256.sw_in12k_ft_in1k,88.633,11.367,98.189,1.810,38.86,256,0.950,bicubic,+4.187,+0.980,-25
davit_small.msft_in1k,88.631,11.369,97.953,2.047,49.75,224,0.950,bicubic,+4.379,+1.013,-4
eca_nfnet_l1.ra2_in1k,88.622,11.378,98.132,1.868,41.41,320,1.000,bicubic,+4.610,+1.106,+22
repvgg_d2se.rvgg_in1k,88.599,11.401,97.984,2.015,133.33,320,1.000,bilinear,+5.039,+1.326,+69
maxvit_base_tf_224.in1k,88.588,11.412,97.850,2.150,119.47,224,0.950,bicubic,+3.728,+0.862,-62
swinv2_base_window16_256.ms_in1k,88.586,11.414,97.908,2.092,87.92,256,0.900,bicubic,+3.986,+0.818,-48
resnetaa101d.sw_in12k_ft_in1k,88.573,11.427,98.076,1.924,44.57,288,1.000,bicubic,+4.449,+0.970,+4
regnety_320.seer_ft_in1k,88.571,11.429,98.106,1.894,145.05,384,1.000,bicubic,+5.243,+1.398,+93
efficientvit_b3.r288_in1k,88.562,11.438,97.707,2.293,48.65,288,1.000,bicubic,+4.409,+0.971,0
resnetv2_152x4_bit.goog_in21k_ft_in1k,88.554,11.446,98.192,1.808,936.53,480,1.000,bilinear,+3.638,+0.754,-73
resnet200d.ra2_in1k,88.554,11.446,97.961,2.039,64.69,320,1.000,bicubic,+4.590,+1.135,+16
xcit_small_24_p16_224.fb_dist_in1k,88.545,11.455,98.004,1.996,47.67,224,1.000,bicubic,+4.671,+1.268,+22
flexivit_base.600ep_in1k,88.545,11.455,97.933,2.067,86.59,240,0.950,bicubic,+4.021,+0.997,-47
seresnextaa101d_32x8d.ah_in1k,88.537,11.463,98.002,1.998,93.59,288,1.000,bicubic,+3.971,+0.926,-54
maxvit_rmlp_small_rw_224.sw_in1k,88.522,11.478,97.773,2.227,64.90,224,0.900,bicubic,+4.030,+0.763,-44
resnest269e.in1k,88.520,11.480,98.029,1.971,110.93,416,0.928,bicubic,+4.012,+1.039,-49
coatnet_rmlp_2_rw_224.sw_in1k,88.513,11.487,97.566,2.434,73.88,224,0.950,bicubic,+3.905,+0.826,-60
efficientformerv2_l.snap_dist_in1k,88.509,11.491,97.963,2.037,26.32,224,0.950,bicubic,+4.877,+1.405,+49
repvit_m2_3.dist_300e_in1k,88.503,11.497,97.942,2.058,23.69,224,0.950,bicubic,+4.999,+1.438,+57
gcvit_base.in1k,88.501,11.499,97.769,2.231,90.32,224,0.875,bicubic,+4.056,+0.687,-42
swinv2_base_window8_256.ms_in1k,88.498,11.502,97.891,2.109,87.92,256,0.900,bicubic,+4.248,+0.967,-22
seresnext101_32x8d.ah_in1k,88.494,11.506,97.884,2.116,93.57,288,1.000,bicubic,+4.308,+1.010,-16
repvit_m2_3.dist_450e_in1k,88.488,11.512,98.059,1.941,23.69,224,0.950,bicubic,+4.746,+1.415,+31
convformer_s36.sail_in1k,88.486,11.514,97.763,2.237,40.01,224,1.000,bicubic,+4.426,+1.017,-7
flexivit_base.300ep_in1k,88.479,11.521,97.846,2.154,86.59,240,0.950,bicubic,+4.073,+0.962,-38
crossvit_18_dagger_408.in1k,88.473,11.527,97.897,2.103,44.61,408,1.000,bicubic,+4.271,+1.079,-22
efficientnetv2_rw_s.ra2_in1k,88.469,11.531,97.978,2.022,23.94,384,1.000,bicubic,+4.663,+1.246,+17
resnetv2_101x3_bit.goog_in21k_ft_in1k,88.466,11.534,98.157,1.843,387.93,448,1.000,bilinear,+4.028,+0.775,-49
fastvit_sa24.apple_dist_in1k,88.460,11.540,97.957,2.043,21.55,256,0.900,bicubic,+5.118,+1.405,+69
maxvit_large_tf_224.in1k,88.460,11.540,97.809,2.191,211.79,224,0.950,bicubic,+3.518,+0.839,-94
maxvit_small_tf_224.in1k,88.456,11.544,97.880,2.120,68.93,224,0.950,bicubic,+4.030,+1.056,-48
resnetv2_50x3_bit.goog_in21k_ft_in1k,88.447,11.553,98.200,1.800,217.32,448,1.000,bilinear,+4.427,+1.074,-8
cait_s24_224.fb_dist_in1k,88.443,11.557,97.961,2.039,46.92,224,1.000,bicubic,+5.001,+1.387,+52
resmlp_big_24_224.fb_distilled_in1k,88.441,11.559,97.938,2.062,129.14,224,0.875,bicubic,+4.849,+1.287,+37
regnetv_064.ra3_in1k,88.436,11.563,98.061,1.939,30.58,288,1.000,bicubic,+4.721,+1.319,+23
resnest200e.in1k,88.434,11.566,98.040,1.960,70.20,320,0.909,bicubic,+4.590,+1.156,+1
vit_large_r50_s32_224.augreg_in21k_ft_in1k,88.432,11.568,98.083,1.917,328.99,224,0.900,bicubic,+4.014,+0.911,-54
inception_next_base.sail_in1k,88.432,11.568,97.773,2.227,86.67,224,0.950,bicubic,+4.340,+0.977,-24
tf_efficientnet_b3.ns_jft_in1k,88.428,11.572,98.031,1.968,12.23,300,0.904,bicubic,+4.376,+1.113,-20
seresnext101d_32x8d.ah_in1k,88.428,11.572,97.961,2.039,93.59,288,1.000,bicubic,+4.070,+1.041,-47
swin_base_patch4_window12_384.ms_in1k,88.426,11.574,97.803,2.197,87.90,384,1.000,bicubic,+3.950,+0.911,-68
tf_efficientnetv2_s.in1k,88.409,11.591,97.927,2.073,21.46,384,1.000,bicubic,+4.511,+1.231,-11
convnext_small.fb_in1k,88.407,11.593,98.012,1.988,50.22,288,1.000,bicubic,+4.707,+1.204,+18
tf_efficientnet_b5.aa_in1k,88.407,11.593,97.931,2.069,30.39,456,0.934,bicubic,+4.719,+1.219,+19
regnety_320.swag_lc_in1k,88.398,11.602,98.115,1.885,145.05,224,0.965,bicubic,+3.850,+0.673,-83
vit_base_patch16_384.orig_in21k_ft_in1k,88.392,11.608,98.160,1.840,86.86,384,1.000,bicubic,+4.192,+0.942,-41
regnetz_c16_evos.ch_in1k,88.381,11.619,98.040,1.960,13.49,320,0.950,bicubic,+5.745,+1.566,+121
tresnet_v2_l.miil_in21k_ft_in1k,88.375,11.625,97.925,2.075,46.17,224,0.875,bilinear,+4.481,+1.435,-16
swinv2_small_window16_256.ms_in1k,88.372,11.628,97.848,2.152,49.73,256,0.900,bicubic,+4.148,+1.070,-47
efficientnet_b4.ra2_in1k,88.366,11.634,97.961,2.039,19.34,384,1.000,bicubic,+4.952,+1.363,+38
resnet152d.ra2_in1k,88.362,11.638,97.931,2.069,60.21,320,1.000,bicubic,+4.678,+1.193,+13
fastvit_ma36.apple_in1k,88.358,11.643,97.923,2.077,44.07,256,0.950,bicubic,+4.475,+1.181,-18
maxvit_rmlp_tiny_rw_256.sw_in1k,88.355,11.645,97.820,2.180,29.15,256,0.950,bicubic,+4.131,+0.952,-50
tf_efficientnet_b4.ap_in1k,88.347,11.653,97.891,2.109,19.34,380,0.922,bicubic,+5.097,+1.495,+52
deit3_small_patch16_224.fb_in22k_ft_in1k,88.338,11.662,98.132,1.868,22.06,224,1.000,bicubic,+5.262,+1.356,+71
regnety_064.ra3_in1k,88.323,11.677,97.865,2.135,30.58,288,1.000,bicubic,+4.603,+1.143,+1
efficientvit_b3.r256_in1k,88.319,11.681,97.560,2.440,48.65,256,1.000,bicubic,+4.517,+1.044,-11
convnextv2_nano.fcmae_ft_in22k_in1k_384,88.315,11.685,97.938,2.062,15.62,384,1.000,bicubic,+4.941,+1.194,+37
tf_efficientnet_b5.ra_in1k,88.313,11.687,97.914,2.086,30.39,456,0.934,bicubic,+4.499,+1.162,-17
crossvit_15_dagger_408.in1k,88.313,11.687,97.874,2.127,28.50,408,1.000,bicubic,+4.473,+1.096,-21
cs3se_edgenet_x.c2ns_in1k,88.300,11.700,97.933,2.067,50.72,320,1.000,bicubic,+4.754,+1.263,+13
deit3_small_patch16_384.fb_in1k,88.296,11.704,97.888,2.112,22.21,384,1.000,bicubic,+4.868,+1.214,+24
pvt_v2_b4.in1k,88.285,11.715,97.816,2.184,62.56,224,0.900,bicubic,+4.573,+1.146,-4
efficientformer_l7.snap_dist_in1k,88.278,11.722,97.882,2.118,82.23,224,0.950,bicubic,+4.896,+1.346,+30
mvitv2_small.fb_in1k,88.264,11.736,97.692,2.308,34.87,224,0.900,bicubic,+4.494,+1.116,-16
inception_next_small.sail_in1k,88.253,11.747,97.816,2.184,49.37,224,0.875,bicubic,+4.675,+1.218,+6
resnetrs152.tf_in1k,88.249,11.751,97.737,2.263,86.62,320,1.000,bicubic,+4.547,+1.125,-7
deit3_base_patch16_224.fb_in1k,88.242,11.758,97.818,2.182,86.59,224,0.900,bicubic,+4.456,+1.232,-21
xcit_small_12_p16_224.fb_dist_in1k,88.240,11.760,97.841,2.159,26.25,224,1.000,bicubic,+4.912,+1.425,+33
gcvit_small.in1k,88.223,11.777,97.788,2.212,51.09,224,0.875,bicubic,+4.331,+1.130,-37
deit_base_distilled_patch16_224.fb_in1k,88.210,11.790,97.910,2.090,87.34,224,0.900,bicubic,+4.820,+1.422,+21
regnetv_040.ra3_in1k,88.208,11.792,97.972,2.028,20.64,288,1.000,bicubic,+5.018,+1.314,+38
xception65p.ra3_in1k,88.189,11.811,97.788,2.212,39.82,299,0.940,bicubic,+5.063,+1.306,+45
swinv2_small_window8_256.ms_in1k,88.172,11.828,97.777,2.223,49.73,256,0.900,bicubic,+4.318,+1.133,-38
caformer_s18.sail_in1k,88.163,11.837,97.728,2.272,26.34,224,1.000,bicubic,+4.509,+1.210,-9
xcit_tiny_24_p16_384.fb_dist_in1k,88.161,11.839,97.942,2.058,12.12,384,1.000,bicubic,+5.591,+1.666,+108
xcit_large_24_p8_224.fb_in1k,88.159,11.841,97.391,2.609,188.93,224,1.000,bicubic,+3.765,+0.727,-87
resnetv2_152x2_bit.goog_teacher_in21k_ft_in1k_384,88.148,11.852,98.051,1.949,236.34,384,1.000,bicubic,+4.312,+0.925,-38
tiny_vit_21m_224.in1k,88.146,11.854,97.850,2.150,21.20,224,0.950,bicubic,+4.892,+1.258,+26
coat_lite_medium.in1k,88.144,11.856,97.899,2.101,44.57,224,0.900,bicubic,+4.544,+1.171,-11
resnext101_32x8d.fb_wsl_ig1b_ft_in1k,88.144,11.856,97.859,2.142,88.79,224,0.875,bilinear,+5.446,+1.715,+77
cait_xxs36_384.fb_dist_in1k,88.140,11.860,97.903,2.097,17.37,384,1.000,bicubic,+5.936,+1.759,+149
dm_nfnet_f0.dm_in1k,88.133,11.867,97.903,2.097,71.49,256,0.900,bicubic,+4.647,+1.335,-2
resnext101_32x4d.fb_swsl_ig1b_ft_in1k,88.112,11.888,97.965,2.035,44.18,224,0.875,bilinear,+4.886,+1.205,+24
pit_b_distilled_224.in1k,88.110,11.890,97.654,2.346,74.79,224,0.900,bicubic,+4.344,+1.186,-35
xcit_tiny_12_p8_384.fb_dist_in1k,88.103,11.897,97.925,2.075,6.71,384,1.000,bicubic,+5.715,+1.705,+116
tiny_vit_11m_224.dist_in22k_ft_in1k,88.103,11.897,97.790,2.210,11.00,224,0.950,bicubic,+4.875,+1.160,+20
pvt_v2_b5.in1k,88.103,11.897,97.694,2.306,81.96,224,0.900,bicubic,+4.363,+1.058,-31
rexnetr_200.sw_in12k_ft_in1k,88.099,11.901,98.006,1.994,16.52,288,1.000,bicubic,+4.961,+1.370,+25
pvt_v2_b3.in1k,88.099,11.901,97.777,2.223,45.24,224,0.900,bicubic,+4.981,+1.221,+32
fastvit_sa36.apple_in1k,88.088,11.912,97.790,2.210,31.53,256,0.900,bicubic,+4.588,+1.160,-13
hrnet_w18_ssld.paddle_in1k,88.067,11.933,97.824,2.176,21.30,288,1.000,bilinear,+6.019,+1.574,+157
xception65.ra3_in1k,88.067,11.933,97.750,2.250,39.92,299,0.940,bicubic,+4.887,+1.158,+18
efficientvit_b3.r224_in1k,88.065,11.935,97.566,2.434,48.65,224,0.950,bicubic,+4.605,+1.236,-11
swin_s3_base_224.ms_in1k,88.046,11.954,97.662,2.338,71.13,224,0.900,bicubic,+4.126,+0.990,-66
xcit_tiny_24_p8_224.fb_dist_in1k,88.037,11.963,97.818,2.182,12.11,224,1.000,bicubic,+5.471,+1.760,+89
convnextv2_tiny.fcmae_ft_in1k,88.035,11.965,97.859,2.142,28.64,288,1.000,bicubic,+4.571,+1.141,-15
regnety_160.swag_lc_in1k,88.033,11.967,98.042,1.958,83.59,224,0.965,bicubic,+4.251,+0.762,-51
resnet152.a1h_in1k,88.033,11.967,97.696,2.304,60.19,288,1.000,bicubic,+4.583,+1.158,-16
focalnet_base_srf.ms_in1k,88.033,11.967,97.656,2.344,88.15,224,0.900,bicubic,+4.213,+0.976,-56
maxvit_tiny_tf_224.in1k,88.027,11.973,97.816,2.184,30.92,224,0.950,bicubic,+4.625,+1.226,-12
focalnet_base_lrf.ms_in1k,88.003,11.997,97.609,2.391,88.75,224,0.900,bicubic,+4.165,+1.001,-63
gcvit_tiny.in1k,88.001,11.999,97.722,2.278,28.22,224,0.875,bicubic,+4.617,+1.324,-10
eca_nfnet_l0.ra2_in1k,87.980,12.020,97.871,2.129,24.14,288,1.000,bicubic,+5.402,+1.379,+77
tf_efficientnet_b5.in1k,87.975,12.025,97.933,2.067,30.39,456,0.934,bicubic,+4.799,+1.397,+6
cs3sedarknet_x.c2ns_in1k,87.975,12.025,97.794,2.205,35.40,288,1.000,bicubic,+5.317,+1.445,+60
nfnet_l0.ra2_in1k,87.967,12.033,97.867,2.133,35.07,288,1.000,bicubic,+5.217,+1.351,+46
efficientformer_l3.snap_dist_in1k,87.963,12.037,97.711,2.289,31.41,224,0.950,bicubic,+5.415,+1.461,+79
tf_efficientnet_b4.aa_in1k,87.958,12.042,97.739,2.261,19.34,380,0.922,bicubic,+4.940,+1.439,+23
xcit_small_24_p8_224.fb_in1k,87.956,12.044,97.581,2.419,47.63,224,1.000,bicubic,+4.122,+0.949,-69
regnetz_c16.ra3_in1k,87.952,12.048,97.779,2.220,13.46,320,1.000,bicubic,+5.320,+1.462,+58
regnety_032.ra_in1k,87.950,12.050,97.897,2.103,19.44,288,1.000,bicubic,+5.224,+1.481,+43
coatnet_1_rw_224.sw_in1k,87.943,12.057,97.453,2.547,41.72,224,0.950,bicubic,+4.347,+1.071,-43
resnet101d.ra2_in1k,87.937,12.063,97.910,2.090,44.57,320,1.000,bicubic,+4.917,+1.458,+17
regnety_040.ra3_in1k,87.933,12.067,97.880,2.120,20.65,288,1.000,bicubic,+4.889,+1.378,+13
mobilevitv2_200.cvnets_in22k_ft_in1k_384,87.933,12.067,97.822,2.178,18.45,384,1.000,bicubic,+4.533,+1.240,-25
swinv2_cr_small_ns_224.sw_in1k,87.933,12.067,97.668,2.332,49.70,224,0.900,bicubic,+4.435,+1.184,-38
focalnet_small_lrf.ms_in1k,87.930,12.069,97.696,2.304,50.34,224,0.900,bicubic,+4.436,+1.116,-38
repvit_m1_5.dist_450e_in1k,87.928,12.072,97.703,2.297,14.64,224,0.950,bicubic,+5.416,+1.591,+72
resnetv2_101.a1h_in1k,87.924,12.076,97.651,2.349,44.54,288,1.000,bicubic,+4.924,+1.197,+13
efficientvit_b2.r288_in1k,87.920,12.080,97.600,2.400,24.33,288,1.000,bicubic,+4.820,+1.296,+3
vit_base_patch32_384.augreg_in21k_ft_in1k,87.909,12.091,98.010,1.990,88.30,384,1.000,bicubic,+4.557,+1.170,-25
sequencer2d_l.in1k,87.907,12.093,97.703,2.297,54.30,224,0.875,bicubic,+4.513,+1.207,-32
twins_svt_large.in1k,87.903,12.097,97.581,2.419,99.27,224,0.900,bicubic,+4.225,+0.993,-59
coatnet_rmlp_1_rw_224.sw_in1k,87.892,12.108,97.624,2.376,41.69,224,0.950,bicubic,+4.530,+1.174,-29
twins_pcpvt_large.in1k,87.869,12.131,97.859,2.142,60.99,224,0.900,bicubic,+4.739,+1.255,-9
swin_base_patch4_window7_224.ms_in1k,87.866,12.134,97.564,2.436,87.77,224,0.900,bicubic,+4.260,+1.112,-59
maxvit_tiny_rw_224.sw_in1k,87.856,12.144,97.643,2.357,29.06,224,0.950,bicubic,+4.352,+1.129,-51
swin_s3_small_224.ms_in1k,87.849,12.151,97.431,2.568,49.74,224,0.900,bicubic,+4.093,+0.980,-78
convformer_s18.sail_in1k,87.845,12.155,97.551,2.449,26.77,224,1.000,bicubic,+4.859,+1.301,+5
deit_base_patch16_384.fb_in1k,87.845,12.155,97.508,2.492,86.86,384,1.000,bicubic,+4.741,+1.140,-8
mobilevitv2_175.cvnets_in22k_ft_in1k_384,87.841,12.159,97.728,2.272,14.25,384,1.000,bicubic,+4.903,+1.302,+6
ecaresnet101d.miil_in1k,87.839,12.161,97.899,2.101,44.57,288,0.950,bicubic,+4.855,+1.357,+3
convnext_nano.in12k_ft_in1k,87.837,12.164,97.888,2.112,15.59,288,1.000,bicubic,+4.975,+1.332,+9
xcit_small_12_p8_224.fb_in1k,87.822,12.178,97.568,2.432,26.21,224,1.000,bicubic,+4.488,+1.086,-35
flexivit_small.600ep_in1k,87.811,12.189,97.577,2.423,22.06,240,0.950,bicubic,+5.449,+1.493,+72
vit_base_patch32_clip_224.laion2b_ft_in12k_in1k,87.809,12.191,97.756,2.244,88.22,224,0.900,bicubic,+4.513,+1.228,-35
flexivit_small.1200ep_in1k,87.809,12.191,97.613,2.387,22.06,240,0.950,bicubic,+5.283,+1.487,+51
focalnet_small_srf.ms_in1k,87.809,12.191,97.575,2.425,49.89,224,0.900,bicubic,+4.393,+1.137,-51
tf_efficientnetv2_b3.in21k_ft_in1k,87.807,12.193,97.895,2.105,14.36,300,0.900,bicubic,+5.137,+1.269,+21
maxxvit_rmlp_nano_rw_256.sw_in1k,87.807,12.193,97.752,2.248,16.78,256,0.950,bicubic,+4.765,+1.402,-11
deit3_medium_patch16_224.fb_in1k,87.807,12.193,97.654,2.346,38.85,224,0.900,bicubic,+4.721,+1.360,-15
tresnet_xl.miil_in1k_448,87.796,12.204,97.459,2.541,78.44,448,0.875,bilinear,+4.738,+1.287,-15
resnetv2_50x1_bit.goog_distilled_in1k,87.790,12.210,97.899,2.101,25.55,224,0.875,bicubic,+4.966,+1.381,+1
repvit_m1_5.dist_300e_in1k,87.772,12.227,97.649,2.351,14.64,224,0.950,bicubic,+5.396,+1.619,+61
convnext_tiny.fb_in1k,87.770,12.230,97.585,2.415,28.59,288,1.000,bicubic,+5.072,+0.953,+13
regnety_320.tv2_in1k,87.743,12.257,97.673,2.327,145.05,224,0.965,bicubic,+4.581,+1.259,-34
resnext101_64x4d.tv_in1k,87.740,12.259,97.592,2.408,83.46,224,0.875,bilinear,+4.748,+1.348,-14
convnextv2_nano.fcmae_ft_in22k_in1k,87.732,12.268,97.886,2.114,15.62,288,1.000,bicubic,+5.068,+1.366,+15
twins_pcpvt_base.in1k,87.730,12.270,97.728,2.272,43.83,224,0.900,bicubic,+5.016,+1.382,+6
tresnet_m.miil_in21k_ft_in1k,87.725,12.274,97.517,2.483,31.39,224,0.875,bilinear,+4.656,+1.407,-24
mvitv2_tiny.fb_in1k,87.719,12.281,97.553,2.447,24.17,224,0.900,bicubic,+5.309,+1.401,+48
gc_efficientnetv2_rw_t.agc_in1k,87.717,12.283,97.803,2.197,13.68,288,1.000,bicubic,+5.261,+1.507,+45
maxvit_rmlp_nano_rw_256.sw_in1k,87.715,12.285,97.577,2.423,15.50,256,0.950,bicubic,+4.761,+1.311,-17
resnetv2_101x1_bit.goog_in21k_ft_in1k,87.683,12.317,97.940,2.060,44.54,448,1.000,bilinear,+5.341,+1.420,+57
rexnet_300.nav_in1k,87.683,12.317,97.611,2.389,34.71,224,0.875,bicubic,+4.909,+1.373,-5
swin_small_patch4_window7_224.ms_in1k,87.662,12.338,97.568,2.432,49.61,224,0.900,bicubic,+4.454,+1.252,-48
efficientnetv2_rw_t.ra2_in1k,87.651,12.349,97.690,2.310,13.65,288,1.000,bicubic,+5.301,+1.498,+53
twins_svt_base.in1k,87.651,12.349,97.525,2.474,56.07,224,0.900,bicubic,+4.531,+1.111,-39
mobilevitv2_150.cvnets_in22k_ft_in1k_384,87.649,12.351,97.647,2.353,10.59,384,1.000,bicubic,+5.063,+1.333,+18
coat_small.in1k,87.647,12.354,97.530,2.470,21.69,224,0.900,bicubic,+5.285,+1.322,+46
fastvit_sa24.apple_in1k,87.638,12.362,97.726,2.274,21.55,256,0.900,bicubic,+4.960,+1.454,-1
maxxvitv2_nano_rw_256.sw_in1k,87.638,12.362,97.525,2.474,23.70,256,0.950,bicubic,+4.528,+1.201,-42
efficientvit_b2.r256_in1k,87.638,12.362,97.453,2.547,24.33,256,1.000,bicubic,+4.948,+1.359,-1
pnasnet5large.tf_in1k,87.636,12.364,97.487,2.513,86.06,331,0.911,bicubic,+4.854,+1.447,-16
resnet101.a1h_in1k,87.634,12.366,97.558,2.442,44.55,288,1.000,bicubic,+4.856,+1.248,-16
resnetaa50d.sw_in12k_ft_in1k,87.623,12.377,97.803,2.197,25.58,288,1.000,bicubic,+5.023,+1.305,+7
cs3edgenet_x.c2_in1k,87.621,12.379,97.654,2.346,47.82,288,1.000,bicubic,+4.913,+1.284,-11
flexivit_small.300ep_in1k,87.621,12.379,97.613,2.387,22.06,240,0.950,bicubic,+5.443,+1.575,+66
xcit_medium_24_p8_224.fb_in1k,87.612,12.388,97.201,2.799,84.32,224,1.000,bicubic,+3.866,+0.491,-117
swinv2_tiny_window16_256.ms_in1k,87.610,12.390,97.560,2.440,28.35,256,0.900,bicubic,+4.806,+1.324,-23
resnext101_32x16d.fb_swsl_ig1b_ft_in1k,87.606,12.394,97.816,2.184,194.03,224,0.875,bilinear,+4.270,+0.970,-73
resnext50_32x4d.fb_swsl_ig1b_ft_in1k,87.600,12.400,97.649,2.351,25.03,224,0.875,bilinear,+5.428,+1.425,+63
nest_base_jx.goog_in1k,87.597,12.403,97.521,2.479,67.72,224,0.875,bicubic,+4.063,+1.147,-99
maxvit_nano_rw_256.sw_in1k,87.595,12.405,97.519,2.481,15.45,256,0.950,bicubic,+4.667,+1.299,-36
tf_efficientnet_b4.in1k,87.585,12.415,97.602,2.398,19.34,380,0.922,bicubic,+4.977,+1.850,-4
davit_tiny.msft_in1k,87.565,12.435,97.585,2.415,28.36,224,0.950,bicubic,+4.869,+1.311,-17
levit_384.fb_dist_in1k,87.563,12.437,97.538,2.462,39.13,224,0.900,bicubic,+4.967,+1.520,-3
levit_conv_384.fb_dist_in1k,87.561,12.439,97.538,2.462,39.13,224,0.900,bicubic,+4.971,+1.522,-2
sequencer2d_m.in1k,87.559,12.441,97.581,2.419,38.31,224,0.875,bicubic,+4.747,+1.307,-34
convnextv2_nano.fcmae_ft_in1k,87.553,12.447,97.666,2.334,15.62,288,1.000,bicubic,+5.067,+1.440,+13
tf_efficientnet_b2.ns_jft_in1k,87.553,12.447,97.626,2.374,9.11,260,0.890,bicubic,+5.175,+1.372,+23
ecaresnet50t.ra2_in1k,87.544,12.456,97.649,2.351,25.57,320,0.950,bicubic,+5.192,+1.509,+27
regnetx_320.tv2_in1k,87.538,12.462,97.564,2.436,107.81,224,0.965,bicubic,+4.728,+1.356,-37
vit_base_patch32_clip_224.laion2b_ft_in1k,87.529,12.471,97.547,2.453,88.22,224,0.900,bicubic,+4.947,+1.347,-5
efficientformerv2_s2.snap_dist_in1k,87.516,12.484,97.615,2.385,12.71,224,0.950,bicubic,+5.350,+1.705,+51
inception_next_tiny.sail_in1k,87.516,12.484,97.549,2.451,28.06,224,0.875,bicubic,+5.038,+1.527,+9
coatnet_bn_0_rw_224.sw_in1k,87.508,12.492,97.551,2.449,27.44,224,0.950,bicubic,+5.108,+1.365,+13
vit_base_patch16_rpn_224.sw_in1k,87.504,12.496,97.489,2.511,86.54,224,0.900,bicubic,+5.302,+1.493,+43
pvt_v2_b2_li.in1k,87.501,12.499,97.470,2.530,22.55,224,0.900,bicubic,+5.307,+1.378,+44
resnetv2_152x2_bit.goog_teacher_in21k_ft_in1k,87.495,12.505,97.818,2.182,236.34,224,0.875,bicubic,+4.619,+1.236,-49
edgenext_small.usi_in1k,87.493,12.507,97.587,2.413,5.59,320,1.000,bicubic,+5.929,+1.875,+96
nest_small_jx.goog_in1k,87.493,12.507,97.513,2.487,38.35,224,0.875,bicubic,+4.369,+1.193,-74
vit_relpos_base_patch16_clsgap_224.sw_in1k,87.471,12.529,97.525,2.474,86.43,224,0.900,bicubic,+4.711,+1.353,-42
vit_relpos_base_patch16_224.sw_in1k,87.467,12.533,97.558,2.442,86.43,224,0.900,bicubic,+4.971,+1.419,-2
regnety_080_tv.tv2_in1k,87.465,12.535,97.636,2.364,39.38,224,0.965,bicubic,+4.871,+1.388,-20
fbnetv3_g.ra2_in1k,87.446,12.554,97.545,2.455,16.62,288,0.950,bilinear,+5.406,+1.485,+53
resnet61q.ra2_in1k,87.439,12.561,97.598,2.402,36.85,288,1.000,bicubic,+4.915,+1.468,-9
wide_resnet50_2.racm_in1k,87.437,12.563,97.543,2.458,68.88,288,0.950,bicubic,+5.157,+1.479,+24
poolformerv2_m48.sail_in1k,87.435,12.565,97.421,2.579,73.35,224,1.000,bicubic,+4.817,+1.349,-29
efficientnet_b3.ra2_in1k,87.433,12.567,97.679,2.321,12.23,320,1.000,bicubic,+5.187,+1.561,+24
regnetx_160.tv2_in1k,87.433,12.567,97.444,2.556,54.28,224,0.965,bicubic,+4.867,+1.272,-16
resnext101_64x4d.c1_in1k,87.433,12.567,97.444,2.556,83.46,288,1.000,bicubic,+4.277,+1.070,-89
cait_xxs24_384.fb_dist_in1k,87.414,12.586,97.621,2.378,12.03,384,1.000,bicubic,+6.442,+1.981,+149
resnet51q.ra2_in1k,87.397,12.603,97.583,2.417,35.70,288,1.000,bilinear,+5.037,+1.397,+4
cs3darknet_x.c2ns_in1k,87.395,12.605,97.611,2.389,35.05,288,1.000,bicubic,+5.173,+1.381,+22
cs3sedarknet_l.c2ns_in1k,87.395,12.605,97.570,2.430,21.91,288,0.950,bicubic,+5.611,+1.606,+66
coat_lite_small.in1k,87.390,12.610,97.374,2.626,19.84,224,0.900,bicubic,+5.078,+1.524,+9
pvt_v2_b2.in1k,87.382,12.618,97.519,2.481,25.36,224,0.900,bicubic,+5.298,+1.563,+36
tresnet_l.miil_in1k_448,87.382,12.618,97.487,2.513,55.99,448,0.875,bilinear,+5.106,+1.509,+14
resnetv2_50d_gn.ah_in1k,87.380,12.620,97.536,2.464,25.57,288,1.000,bicubic,+5.422,+1.608,+47
sequencer2d_s.in1k,87.377,12.623,97.387,2.613,27.65,224,0.875,bicubic,+5.037,+1.359,+1
swinv2_cr_small_224.sw_in1k,87.377,12.623,97.346,2.654,49.70,224,0.900,bicubic,+4.242,+1.238,-97
xcit_tiny_24_p8_224.fb_in1k,87.373,12.627,97.626,2.374,12.11,224,1.000,bicubic,+5.481,+1.656,+49
fastvit_sa12.apple_dist_in1k,87.363,12.637,97.493,2.507,11.58,256,0.900,bicubic,+5.509,+1.783,+51
vit_relpos_medium_patch16_cls_224.sw_in1k,87.363,12.637,97.451,2.549,38.76,224,0.900,bicubic,+4.791,+1.383,-33
nasnetalarge.tf_in1k,87.360,12.640,97.417,2.583,88.75,331,0.911,bicubic,+4.734,+1.375,-47
seresnext50_32x4d.racm_in1k,87.356,12.644,97.617,2.383,27.56,288,0.950,bicubic,+5.160,+1.469,+15
crossvit_18_dagger_240.in1k,87.350,12.650,97.453,2.547,44.27,240,0.875,bicubic,+4.832,+1.385,-29
repvit_m3.dist_in1k,87.326,12.674,97.481,2.519,10.68,224,0.950,bicubic,+5.824,+1.913,+77
ecaresnet50d.miil_in1k,87.322,12.678,97.666,2.334,25.58,288,0.950,bicubic,+5.672,+1.784,+59
wide_resnet101_2.tv2_in1k,87.322,12.678,97.404,2.596,126.89,224,0.965,bilinear,+4.820,+1.388,-30
resnext101_32x8d.tv2_in1k,87.318,12.682,97.560,2.440,88.79,224,0.965,bilinear,+4.486,+1.328,-80
crossvit_18_240.in1k,87.318,12.682,97.487,2.513,43.27,240,0.875,bicubic,+4.918,+1.427,-21
focalnet_tiny_srf.ms_in1k,87.301,12.699,97.414,2.586,28.43,224,0.900,bicubic,+5.163,+1.446,+16
resnest101e.in1k,87.288,12.712,97.562,2.438,48.28,256,0.875,bilinear,+4.404,+1.240,-85
gcvit_xtiny.in1k,87.279,12.721,97.478,2.522,19.98,224,0.875,bicubic,+5.325,+1.513,+32
tiny_vit_11m_224.in1k,87.277,12.723,97.487,2.513,11.00,224,0.950,bicubic,+5.785,+1.625,+69
coatnet_rmlp_nano_rw_224.sw_in1k,87.277,12.723,97.447,2.554,15.15,224,0.900,bicubic,+5.227,+1.569,+20
ecaresnet101d_pruned.miil_in1k,87.269,12.731,97.713,2.287,24.88,288,0.950,bicubic,+5.271,+1.553,+23
vit_relpos_medium_patch16_rpn_224.sw_in1k,87.269,12.731,97.447,2.554,38.73,224,0.900,bicubic,+4.959,+1.475,-13
swin_tiny_patch4_window7_224.ms_in22k_ft_in1k,87.249,12.751,97.782,2.218,28.29,224,0.900,bicubic,+6.281,+1.768,+123
resnetrs101.tf_in1k,87.241,12.759,97.455,2.545,63.62,288,0.940,bicubic,+4.957,+1.441,-11
coatnext_nano_rw_224.sw_in1k,87.239,12.761,97.543,2.458,14.70,224,0.900,bicubic,+5.297,+1.627,+26
poolformer_m48.sail_in1k,87.237,12.763,97.318,2.682,73.47,224,0.950,bicubic,+4.755,+1.352,-40
regnety_160.tv2_in1k,87.232,12.768,97.461,2.539,83.59,224,0.965,bicubic,+4.586,+1.247,-70
mixer_b16_224.miil_in21k_ft_in1k,87.230,12.770,97.414,2.586,59.88,224,0.875,bilinear,+4.924,+1.694,-18
tresnet_xl.miil_in1k,87.230,12.770,97.397,2.603,78.44,224,0.875,bilinear,+5.156,+1.469,+8
xcit_tiny_12_p8_224.fb_dist_in1k,87.224,12.776,97.449,2.551,6.71,224,1.000,bicubic,+6.012,+1.847,+91
xcit_tiny_12_p16_384.fb_dist_in1k,87.209,12.791,97.466,2.534,6.72,384,1.000,bicubic,+6.271,+2.052,+117
convit_base.fb_in1k,87.209,12.791,97.284,2.716,86.54,224,0.875,bicubic,+4.919,+1.348,-20
resnetv2_50d_evos.ah_in1k,87.205,12.795,97.421,2.579,25.59,288,1.000,bicubic,+5.203,+1.521,+10
resnet152.tv2_in1k,87.188,12.812,97.387,2.613,60.19,224,0.965,bilinear,+4.901,+1.383,-22
tf_efficientnet_b3.ap_in1k,87.188,12.812,97.382,2.618,12.23,300,0.904,bicubic,+5.368,+1.756,+25
vit_base_patch32_clip_224.openai_ft_in1k,87.179,12.821,97.466,2.534,88.22,224,0.900,bicubic,+5.249,+1.500,+16
visformer_small.in1k,87.179,12.821,97.323,2.677,40.22,224,0.900,bicubic,+5.073,+1.445,-2
vit_srelpos_medium_patch16_224.sw_in1k,87.177,12.823,97.310,2.690,38.74,224,0.900,bicubic,+4.937,+1.368,-21
focalnet_tiny_lrf.ms_in1k,87.175,12.825,97.370,2.630,28.65,224,0.900,bicubic,+5.021,+1.422,-10
crossvit_15_dagger_240.in1k,87.166,12.834,97.431,2.568,28.21,240,0.875,bicubic,+4.836,+1.475,-34
convnext_tiny_hnf.a2h_in1k,87.147,12.853,97.280,2.720,28.59,288,1.000,bicubic,+4.563,+1.272,-71
vit_relpos_medium_patch16_224.sw_in1k,87.145,12.855,97.500,2.500,38.75,224,0.900,bicubic,+4.683,+1.418,-54
swin_s3_tiny_224.ms_in1k,87.143,12.857,97.308,2.692,28.33,224,0.900,bicubic,+4.999,+1.354,-12
coatnet_0_rw_224.sw_in1k,87.128,12.872,97.233,2.767,27.44,224,0.950,bicubic,+4.738,+1.397,-50
xcit_small_24_p16_224.fb_in1k,87.119,12.881,97.267,2.733,47.67,224,1.000,bicubic,+4.543,+1.255,-72
repvit_m1_1.dist_450e_in1k,87.104,12.896,97.412,2.588,8.80,224,0.950,bicubic,+5.792,+1.876,+63
swinv2_tiny_window8_256.ms_in1k,87.089,12.911,97.510,2.490,28.35,256,0.900,bicubic,+5.269,+1.516,+12
pit_s_distilled_224.in1k,87.081,12.919,97.363,2.637,24.04,224,0.900,bicubic,+5.267,+1.633,+11
efficientvit_b2.r224_in1k,87.081,12.919,97.203,2.797,24.33,224,0.950,bicubic,+4.933,+1.497,-19
ecaresnet50t.a1_in1k,87.081,12.919,97.122,2.878,25.57,288,1.000,bicubic,+4.953,+1.480,-15
mobilevitv2_200.cvnets_in22k_ft_in1k,87.057,12.943,97.431,2.568,18.45,256,0.888,bicubic,+4.725,+1.490,-46
resnext50_32x4d.a1h_in1k,87.057,12.943,97.331,2.669,25.03,288,1.000,bicubic,+5.043,+1.397,-10
xception41p.ra3_in1k,87.051,12.949,97.201,2.799,26.91,299,0.940,bicubic,+5.079,+1.417,-7
regnetz_b16.ra3_in1k,87.047,12.953,97.412,2.588,9.72,288,1.000,bicubic,+6.319,+1.894,+119
crossvit_15_240.in1k,87.042,12.958,97.423,2.577,27.53,240,0.875,bicubic,+5.506,+1.687,+27
convit_small.fb_in1k,87.042,12.958,97.350,2.650,27.78,224,0.875,bicubic,+5.622,+1.606,+44
tf_efficientnetv2_b3.in1k,87.027,12.973,97.301,2.699,14.36,300,0.904,bicubic,+5.055,+1.499,-10
gcresnet50t.ra2_in1k,87.025,12.975,97.391,2.609,25.90,288,1.000,bicubic,+5.569,+1.673,+37
xcit_small_12_p16_224.fb_in1k,87.010,12.990,97.242,2.759,26.25,224,1.000,bicubic,+5.040,+1.430,-11
nest_tiny_jx.goog_in1k,87.008,12.992,97.378,2.622,17.06,224,0.875,bicubic,+5.582,+1.760,+37
poolformerv2_m36.sail_in1k,87.008,12.992,97.284,2.716,56.08,224,1.000,bicubic,+4.792,+1.360,-42
deit3_small_patch16_224.fb_in1k,87.008,12.992,97.171,2.829,22.06,224,0.900,bicubic,+5.638,+1.715,+45
deit_small_distilled_patch16_224.fb_in1k,87.004,12.996,97.323,2.677,22.44,224,0.900,bicubic,+5.788,+1.699,+56
swinv2_cr_tiny_ns_224.sw_in1k,87.002,12.998,97.280,2.720,28.33,224,0.900,bicubic,+5.200,+1.462,-2
resnet101.a1_in1k,87.000,13.000,96.960,3.040,44.55,288,1.000,bicubic,+4.678,+1.328,-58
resnet152.a2_in1k,86.995,13.005,96.923,3.077,60.19,288,1.000,bicubic,+4.387,+0.795,-102
coatnet_nano_rw_224.sw_in1k,86.993,13.007,97.248,2.752,15.14,224,0.900,bicubic,+5.297,+1.602,0
resmlp_36_224.fb_distilled_in1k,86.987,13.013,97.276,2.724,44.69,224,0.875,bicubic,+5.839,+1.798,+59
poolformer_m36.sail_in1k,86.955,13.045,97.141,2.859,56.17,224,0.950,bicubic,+4.853,+1.443,-34
mobilevitv2_175.cvnets_in22k_ft_in1k,86.951,13.050,97.335,2.665,14.25,256,0.888,bicubic,+5.013,+1.545,-18
regnety_032.tv2_in1k,86.942,13.058,97.410,2.590,19.44,224,0.965,bicubic,+5.186,+1.566,-6
resnet101.a2_in1k,86.942,13.058,96.990,3.010,44.55,288,1.000,bicubic,+4.706,+1.260,-54
xcit_large_24_p16_224.fb_in1k,86.942,13.058,96.921,3.079,189.10,224,1.000,bicubic,+4.040,+1.037,-142
resnet101.tv2_in1k,86.936,13.065,97.248,2.752,44.55,224,0.965,bilinear,+5.047,+1.480,-19
xcit_medium_24_p16_224.fb_in1k,86.933,13.067,97.103,2.897,84.40,224,1.000,bicubic,+4.293,+1.121,-117
resnetv2_50.a1h_in1k,86.929,13.071,97.340,2.660,25.55,288,1.000,bicubic,+5.531,+1.614,+25
poolformerv2_s36.sail_in1k,86.921,13.079,97.353,2.647,30.79,224,1.000,bicubic,+5.355,+1.663,+1
tnt_s_patch16_224,86.914,13.086,97.361,2.639,23.76,224,0.900,bicubic,+5.378,+1.671,+4
vit_relpos_small_patch16_224.sw_in1k,86.891,13.109,97.489,2.511,21.98,224,0.900,bicubic,+5.429,+1.669,+15
vit_small_patch16_224.augreg_in21k_ft_in1k,86.871,13.129,97.609,2.391,22.05,224,0.900,bicubic,+5.486,+1.473,+23
vit_small_r26_s32_224.augreg_in21k_ft_in1k,86.856,13.143,97.530,2.470,36.43,224,0.900,bicubic,+4.992,+1.508,-26
resnet152.a1_in1k,86.856,13.143,96.793,3.207,60.19,288,1.000,bicubic,+4.124,+1.073,-136
ecaresnetlight.miil_in1k,86.852,13.148,97.508,2.492,30.16,288,0.950,bicubic,+5.444,+1.692,+17
resnext101_32x16d.fb_ssl_yfcc100m_ft_in1k,86.848,13.152,97.521,2.479,194.03,224,0.875,bilinear,+5.010,+1.429,-26
convmixer_1536_20.in1k,86.840,13.161,97.355,2.645,51.63,224,0.960,bicubic,+5.478,+1.741,+21
tf_efficientnet_b3.aa_in1k,86.840,13.161,97.297,2.703,12.23,300,0.904,bicubic,+5.200,+1.575,-15
rexnet_200.nav_in1k,86.840,13.161,97.276,2.724,16.37,224,0.875,bicubic,+5.204,+1.610,-13
resnet50.fb_swsl_ig1b_ft_in1k,86.827,13.173,97.493,2.507,25.56,224,0.875,bilinear,+5.655,+1.507,+33
repvit_m1_1.dist_300e_in1k,86.827,13.173,97.318,2.682,8.80,224,0.950,bicubic,+6.001,+2.148,+75
deit_base_patch16_224.fb_in1k,86.827,13.173,97.052,2.949,86.57,224,0.900,bicubic,+4.835,+1.316,-43
tresnet_m.miil_in1k_448,86.816,13.184,97.216,2.784,31.39,448,0.875,bilinear,+5.106,+1.642,-25
tf_efficientnet_lite4.in1k,86.799,13.201,97.265,2.735,13.01,380,0.920,bilinear,+5.269,+1.601,-9
coat_mini.in1k,86.799,13.201,97.156,2.844,10.34,224,0.900,bicubic,+5.529,+1.774,+22
resnext101_32x8d.fb_ssl_yfcc100m_ft_in1k,86.795,13.205,97.472,2.528,88.79,224,0.875,bilinear,+5.189,+1.432,-19
eva02_tiny_patch14_336.mim_in22k_ft_in1k,86.784,13.216,97.269,2.731,5.76,336,1.000,bicubic,+6.154,+1.743,+87
seresnet50.ra2_in1k,86.778,13.223,97.361,2.639,28.09,288,0.950,bicubic,+5.493,+1.709,+17
vit_base_patch16_224.orig_in21k_ft_in1k,86.773,13.227,97.442,2.558,86.57,224,0.900,bicubic,+4.983,+1.316,-35
convnextv2_pico.fcmae_ft_in1k,86.773,13.227,97.331,2.669,9.07,288,0.950,bicubic,+5.687,+1.851,+39
regnetx_080.tv2_in1k,86.771,13.229,97.197,2.803,39.57,224,0.965,bicubic,+5.231,+1.655,-18
cs3darknet_l.c2ns_in1k,86.767,13.233,97.459,2.541,21.16,288,0.950,bicubic,+5.871,+1.797,+53
resnetaa50.a1h_in1k,86.765,13.235,97.391,2.609,25.56,288,1.000,bicubic,+5.151,+1.589,-27
tresnet_l.miil_in1k,86.763,13.237,97.273,2.727,55.99,224,0.875,bilinear,+5.283,+1.650,-10
resnet50d.ra2_in1k,86.760,13.239,97.372,2.628,25.58,288,0.950,bicubic,+5.404,+1.634,+4
ese_vovnet39b.ra_in1k,86.756,13.244,97.372,2.628,24.57,288,0.950,bicubic,+6.406,+2.006,+104
twins_svt_small.in1k,86.752,13.248,97.180,2.820,24.06,224,0.900,bicubic,+5.076,+1.522,-37
resnet50_gn.a1h_in1k,86.750,13.250,97.449,2.551,25.56,288,0.950,bicubic,+5.534,+2.065,+12
tiny_vit_5m_224.dist_in22k_ft_in1k,86.748,13.252,97.316,2.684,5.39,224,0.950,bicubic,+5.872,+1.652,+50
seresnet50.a1_in1k,86.746,13.255,96.951,3.049,28.09,288,1.000,bicubic,+5.644,+1.623,+25
mobilevitv2_150.cvnets_in22k_ft_in1k,86.743,13.257,97.214,2.786,10.59,256,0.888,bicubic,+5.255,+1.546,-19
fastvit_s12.apple_dist_in1k,86.741,13.259,97.207,2.793,9.47,256,0.900,bicubic,+5.671,+1.923,+28
crossvit_base_240.in1k,86.733,13.267,97.122,2.878,105.03,240,0.875,bicubic,+4.519,+1.288,-90
levit_256.fb_dist_in1k,86.731,13.269,97.254,2.746,18.89,224,0.900,bicubic,+5.207,+1.760,-27
levit_conv_256.fb_dist_in1k,86.726,13.274,97.256,2.744,18.89,224,0.900,bicubic,+5.204,+1.766,-28
ecaresnet50t.a2_in1k,86.726,13.274,97.081,2.919,25.57,288,1.000,bicubic,+5.068,+1.531,-43
cs3darknet_focus_l.c2ns_in1k,86.722,13.278,97.376,2.624,21.15,288,0.950,bicubic,+5.846,+1.694,+41
convnext_nano_ols.d1h_in1k,86.718,13.282,97.047,2.953,15.65,288,1.000,bicubic,+5.118,+1.411,-39
vit_srelpos_small_patch16_224.sw_in1k,86.707,13.293,97.252,2.748,21.97,224,0.900,bicubic,+5.615,+1.682,+18
resnet50.ram_in1k,86.699,13.301,97.199,2.801,25.56,288,0.950,bicubic,+6.723,+2.147,+118
crossvit_small_240.in1k,86.686,13.314,97.276,2.724,26.86,240,0.875,bicubic,+5.668,+1.820,+22
halo2botnet50ts_256.a1h_in1k,86.686,13.314,97.098,2.902,22.64,256,0.950,bicubic,+4.626,+1.464,-82
pit_b_224.in1k,86.686,13.314,96.894,3.107,73.76,224,0.900,bicubic,+4.248,+1.180,-131
resnet50d.a1_in1k,86.671,13.329,96.693,3.307,25.58,288,1.000,bicubic,+5.221,+1.475,-26
ecaresnet50d_pruned.miil_in1k,86.669,13.331,97.429,2.571,19.94,288,0.950,bicubic,+5.879,+1.859,+44
tf_efficientnet_b1.ns_jft_in1k,86.669,13.331,97.382,2.618,7.79,240,0.882,bicubic,+5.281,+1.644,-22
swin_tiny_patch4_window7_224.ms_in1k,86.658,13.342,97.203,2.797,28.29,224,0.900,bicubic,+5.282,+1.659,-21
poolformer_s36.sail_in1k,86.654,13.346,97.152,2.848,30.86,224,0.900,bicubic,+5.224,+1.708,-29
gernet_l.idstcv_in1k,86.643,13.357,97.188,2.812,31.08,256,0.875,bilinear,+5.289,+1.658,-19
efficientnet_el.ra_in1k,86.630,13.370,97.190,2.810,10.59,300,0.904,bicubic,+5.318,+1.700,-18
twins_pcpvt_small.in1k,86.622,13.378,97.344,2.656,24.11,224,0.900,bicubic,+5.530,+1.696,+5
repvit_m2.dist_in1k,86.615,13.385,97.207,2.793,8.80,224,0.950,bicubic,+6.155,+2.039,+67
resmlp_24_224.fb_distilled_in1k,86.615,13.385,97.137,2.863,30.02,224,0.875,bicubic,+5.859,+1.913,+39
gcresnext50ts.ch_in1k,86.609,13.391,97.188,2.812,15.67,288,1.000,bicubic,+5.379,+1.646,-17
resnet50.c2_in1k,86.605,13.395,97.338,2.662,25.56,288,1.000,bicubic,+5.735,+1.804,+26
nf_resnet50.ra2_in1k,86.592,13.408,97.295,2.705,25.56,288,0.940,bicubic,+5.952,+1.961,+47
resnest50d_4s2x40d.in1k,86.588,13.412,97.267,2.733,30.42,224,0.875,bicubic,+5.468,+1.707,-6
sebotnet33ts_256.a1h_in1k,86.585,13.415,96.793,3.207,13.70,256,0.940,bicubic,+5.417,+1.625,-12
efficientnet_b3_pruned.in1k,86.579,13.421,97.184,2.816,9.86,300,0.904,bicubic,+5.727,+1.940,+24
repvgg_b3.rvgg_in1k,86.579,13.421,97.139,2.861,123.09,224,0.875,bilinear,+6.073,+1.885,+53
wide_resnet50_2.tv2_in1k,86.566,13.434,97.248,2.752,68.88,224,0.965,bilinear,+4.960,+1.488,-64
sehalonet33ts.ra2_in1k,86.566,13.434,97.004,2.995,13.69,256,0.940,bicubic,+5.608,+1.732,+9
fastvit_sa12.apple_in1k,86.564,13.436,97.233,2.767,11.58,256,0.900,bicubic,+5.720,+1.893,+22
resnet50.a1_in1k,86.541,13.459,96.838,3.162,25.56,288,1.000,bicubic,+5.327,+1.736,-22
repvit_m1_0.dist_450e_in1k,86.538,13.462,97.098,2.902,7.30,224,0.950,bicubic,+6.104,+2.180,+59
resnet50.c1_in1k,86.536,13.464,97.235,2.765,25.56,288,1.000,bicubic,+5.624,+1.683,+7
convnext_nano.d1h_in1k,86.536,13.464,97.182,2.818,15.59,288,1.000,bicubic,+5.054,+1.524,-53
xcit_tiny_24_p16_224.fb_dist_in1k,86.534,13.466,97.218,2.782,12.12,224,1.000,bicubic,+6.080,+2.000,+55
seresnet50.a2_in1k,86.519,13.481,96.987,3.013,28.09,288,1.000,bicubic,+5.413,+1.765,-16
vit_small_patch16_384.augreg_in1k,86.496,13.504,97.182,2.818,22.20,384,1.000,bicubic,+5.380,+1.608,-18
halonet50ts.a1h_in1k,86.487,13.513,97.148,2.852,22.73,256,0.940,bicubic,+4.825,+1.538,-80
resnext101_32x4d.fb_ssl_yfcc100m_ft_in1k,86.483,13.517,97.474,2.526,44.18,224,0.875,bilinear,+5.559,+1.740,+1
maxvit_rmlp_pico_rw_256.sw_in1k,86.481,13.519,97.201,2.799,7.52,256,0.950,bicubic,+5.967,+1.987,+38
haloregnetz_b.ra3_in1k,86.466,13.534,96.947,3.053,11.68,224,0.940,bicubic,+5.420,+1.747,-13
resnet152s.gluon_in1k,86.464,13.536,97.109,2.891,60.32,224,0.875,bicubic,+5.456,+1.693,-11
seresnet33ts.ra2_in1k,86.462,13.538,97.190,2.810,19.78,288,1.000,bicubic,+5.678,+1.828,+14
repvit_m1_0.dist_300e_in1k,86.457,13.543,97.054,2.946,7.30,224,0.950,bicubic,+6.331,+2.310,+75
mobilevitv2_200.cvnets_in1k,86.457,13.543,96.970,3.030,18.45,256,0.888,bicubic,+5.323,+1.608,-27
resnext50d_32x4d.bt_in1k,86.455,13.545,97.165,2.835,25.05,288,0.950,bicubic,+5.791,+1.745,+22
resnet50.d_in1k,86.455,13.545,97.056,2.944,25.56,288,1.000,bicubic,+5.483,+1.626,-12
resnetv2_50x1_bit.goog_in21k_ft_in1k,86.440,13.560,97.605,2.396,25.55,448,1.000,bilinear,+6.098,+1.922,+49
resnet50.tv2_in1k,86.440,13.560,97.145,2.855,25.56,224,0.965,bilinear,+5.592,+1.711,+3
resnet50.b1k_in1k,86.438,13.562,97.235,2.765,25.56,288,1.000,bicubic,+5.732,+1.803,+14
resnest50d_1s4x24d.in1k,86.432,13.568,97.152,2.848,25.68,224,0.875,bicubic,+5.444,+1.826,-19
poolformerv2_s24.sail_in1k,86.389,13.611,97.150,2.850,21.34,224,1.000,bicubic,+5.641,+1.840,+8
regnety_016.tv2_in1k,86.372,13.628,97.188,2.812,11.20,224,0.965,bicubic,+5.706,+1.858,+15
repvgg_b3g4.rvgg_in1k,86.368,13.632,97.047,2.953,83.83,224,0.875,bilinear,+6.152,+1.955,+60
darknetaa53.c2ns_in1k,86.359,13.641,97.165,2.835,36.02,288,1.000,bilinear,+5.853,+1.843,+24
efficientformer_l1.snap_dist_in1k,86.357,13.643,97.019,2.981,12.29,224,0.950,bicubic,+5.859,+2.031,+25
darknet53.c2ns_in1k,86.350,13.649,97.130,2.869,41.61,288,1.000,bicubic,+5.819,+1.698,+20
lamhalobotnet50ts_256.a1h_in1k,86.350,13.649,97.041,2.959,22.57,256,0.950,bicubic,+4.798,+1.549,-89
fastvit_t12.apple_dist_in1k,86.344,13.656,97.098,2.902,7.55,256,0.900,bicubic,+5.992,+2.056,+37
legacy_senet154.in1k,86.344,13.656,96.934,3.066,115.09,224,0.875,bilinear,+5.032,+1.374,-60
resnet50.a1h_in1k,86.340,13.660,97.060,2.940,25.56,224,1.000,bicubic,+5.662,+1.754,+5
cait_xxs36_224.fb_dist_in1k,86.329,13.671,97.118,2.882,17.30,224,1.000,bicubic,+6.583,+2.244,+81
tf_efficientnet_b3.in1k,86.325,13.675,96.964,3.036,12.23,300,0.904,bicubic,+5.451,+1.664,-16
mobilevitv2_175.cvnets_in1k,86.321,13.679,96.985,3.015,14.25,256,0.888,bicubic,+5.461,+1.729,-15
gernet_m.idstcv_in1k,86.319,13.681,97.098,2.902,21.14,224,0.875,bilinear,+5.582,+1.908,-4
resnet50d.a2_in1k,86.314,13.686,96.674,3.326,25.58,288,1.000,bicubic,+5.150,+1.594,-52
vit_small_patch32_384.augreg_in21k_ft_in1k,86.312,13.688,97.419,2.581,22.92,384,1.000,bicubic,+5.826,+1.819,+15
pit_s_224.in1k,86.308,13.692,97.049,2.951,23.46,224,0.900,bicubic,+5.222,+1.719,-42
efficientnet_b2.ra_in1k,86.306,13.694,96.987,3.013,9.11,288,1.000,bicubic,+5.696,+1.673,+3
gcresnet33ts.ra2_in1k,86.301,13.699,97.060,2.940,19.88,288,1.000,bicubic,+5.701,+1.738,+3
resnext50_32x4d.a1_in1k,86.284,13.716,96.710,3.290,25.03,288,1.000,bicubic,+4.818,+1.536,-89
resnext50_32x4d.a2_in1k,86.280,13.720,96.684,3.316,25.03,288,1.000,bicubic,+4.976,+1.588,-71
resnet50.b2k_in1k,86.272,13.729,97.071,2.929,25.56,288,1.000,bicubic,+5.818,+1.753,+16
senet154.gluon_in1k,86.272,13.729,96.957,3.042,115.09,224,0.875,bicubic,+5.046,+1.599,-69
resnext50_32x4d.tv2_in1k,86.267,13.733,97.054,2.946,25.03,224,0.965,bilinear,+5.085,+1.714,-64
eca_resnet33ts.ra2_in1k,86.257,13.743,97.150,2.850,19.68,288,1.000,bicubic,+5.585,+1.786,-9
resnest50d.in1k,86.248,13.752,97.069,2.931,27.48,224,0.875,bilinear,+5.288,+1.687,-41
gcvit_xxtiny.in1k,86.244,13.756,97.107,2.893,12.00,224,0.875,bicubic,+6.518,+2.027,+67
regnetx_032.tv2_in1k,86.244,13.756,97.092,2.908,15.30,224,0.965,bicubic,+5.318,+1.814,-40
vit_base_patch16_384.augreg_in1k,86.231,13.769,96.964,3.036,86.86,384,1.000,bicubic,+5.129,+1.844,-59
convmixer_768_32.in1k,86.220,13.780,97.037,2.963,21.11,224,0.960,bicubic,+6.052,+1.963,+36
efficientnet_el_pruned.in1k,86.192,13.807,97.028,2.972,10.59,300,0.904,bicubic,+5.894,+1.806,+23
tresnet_m.miil_in1k,86.188,13.812,96.667,3.333,31.39,224,0.875,bilinear,+5.390,+1.811,-28
cspdarknet53.ra_in1k,86.178,13.822,97.009,2.991,27.64,256,0.887,bilinear,+6.110,+1.931,+38
rexnet_150.nav_in1k,86.169,13.831,97.060,2.940,9.73,224,0.875,bicubic,+5.845,+2.070,+16
inception_v4.tf_in1k,86.163,13.837,96.919,3.081,42.68,299,0.875,bicubic,+6.007,+1.949,+33
efficientvit_b1.r288_in1k,86.154,13.846,96.932,3.068,9.10,288,1.000,bicubic,+5.830,+1.756,+15
inception_resnet_v2.tf_in1k,86.133,13.867,97.045,2.955,55.84,299,0.897,bicubic,+5.675,+1.855,-1
xcit_tiny_12_p8_224.fb_in1k,86.114,13.886,97.088,2.912,6.71,224,1.000,bicubic,+6.425,+2.034,+61
resnext50_32x4d.fb_ssl_yfcc100m_ft_in1k,86.101,13.899,97.212,2.788,25.03,224,0.875,bilinear,+5.767,+1.812,+10
res2net101d.in1k,86.099,13.901,96.851,3.149,45.23,224,0.875,bilinear,+4.881,+1.501,-85
resnet50.a2_in1k,86.097,13.903,96.701,3.299,25.56,288,1.000,bicubic,+5.325,+1.713,-34
tf_efficientnet_el.in1k,86.082,13.918,96.955,3.045,10.59,300,0.904,bicubic,+5.834,+1.835,+15
mobilevitv2_150.cvnets_in1k,86.082,13.918,96.857,3.143,10.59,256,0.888,bicubic,+5.712,+1.783,+2
cspresnext50.ra_in1k,86.077,13.923,97.107,2.893,20.57,256,0.887,bilinear,+5.523,+1.781,-20
convnext_pico_ols.d1_in1k,86.069,13.931,97.017,2.983,9.06,288,1.000,bicubic,+5.607,+1.765,-11
resnet101s.gluon_in1k,86.067,13.933,97.030,2.970,44.67,224,0.875,bicubic,+5.763,+1.878,+7
edgenext_small_rw.sw_in1k,86.054,13.946,96.928,3.072,7.83,320,1.000,bicubic,+5.596,+1.620,-10
lambda_resnet50ts.a1h_in1k,86.049,13.950,96.742,3.258,21.54,256,0.950,bicubic,+4.891,+1.644,-84
seresnext101_32x4d.gluon_in1k,86.028,13.972,96.972,3.027,48.96,224,0.875,bicubic,+5.136,+1.676,-56
convnext_pico.d1_in1k,86.015,13.985,96.932,3.068,9.05,288,0.950,bicubic,+5.599,+1.884,-8
poolformer_s24.sail_in1k,86.013,13.987,97.030,2.970,21.39,224,0.900,bicubic,+5.719,+1.956,+5
resnetblur50.bt_in1k,85.992,14.008,96.985,3.015,25.56,288,0.950,bicubic,+5.758,+1.751,+9
resnet152.a3_in1k,85.985,14.015,96.849,3.151,60.19,224,0.950,bicubic,+5.439,+1.849,-28
ecaresnet26t.ra2_in1k,85.981,14.019,97.043,2.957,16.01,320,0.950,bicubic,+6.131,+1.953,+29
tf_efficientnet_b2.ap_in1k,85.981,14.019,96.812,3.188,9.11,260,0.890,bicubic,+5.671,+1.786,-3
seresnext101_64x4d.gluon_in1k,85.973,14.027,96.981,3.019,88.23,224,0.875,bicubic,+5.079,+1.685,-64
efficientformerv2_s1.snap_dist_in1k,85.962,14.038,96.874,3.126,6.19,224,0.950,bicubic,+6.270,+2.158,+41
vit_base_patch32_224.augreg_in21k_ft_in1k,85.956,14.044,97.128,2.872,88.22,224,0.900,bicubic,+5.240,+1.562,-46
resnext50_32x4d.ra_in1k,85.934,14.066,97.032,2.968,25.03,288,0.950,bicubic,+5.236,+1.640,-45
fbnetv3_d.ra2_in1k,85.930,14.070,97.026,2.974,10.31,256,0.950,bilinear,+6.248,+2.082,+40
vit_large_patch32_384.orig_in21k_ft_in1k,85.915,14.085,97.370,2.630,306.63,384,1.000,bicubic,+4.405,+1.280,-137
resnet152d.gluon_in1k,85.915,14.085,96.814,3.186,60.21,224,0.875,bicubic,+5.439,+1.612,-28
tf_efficientnet_b2.aa_in1k,85.896,14.104,96.864,3.136,9.11,260,0.890,bicubic,+5.812,+1.958,+7
tf_efficientnetv2_b2.in1k,85.887,14.113,96.887,3.113,10.10,260,0.890,bicubic,+5.691,+1.845,0
vit_base_patch16_224.sam_in1k,85.872,14.128,96.693,3.307,86.57,224,0.900,bicubic,+5.634,+1.937,-5
repvgg_b2g4.rvgg_in1k,85.862,14.138,96.804,3.196,61.76,224,0.875,bilinear,+6.480,+2.128,+55
resnet101d.gluon_in1k,85.862,14.138,96.674,3.326,44.57,224,0.875,bicubic,+5.436,+1.650,-26
efficientvit_b1.r256_in1k,85.830,14.170,96.782,3.217,9.10,256,1.000,bicubic,+6.096,+2.002,+25
mixnet_xl.ra_in1k,85.804,14.196,96.716,3.284,11.90,224,0.875,bicubic,+5.322,+1.780,-37
inception_resnet_v2.tf_ens_adv_in1k,85.768,14.232,96.759,3.241,55.84,299,0.897,bicubic,+5.790,+1.811,+3
repvit_m0_9.dist_450e_in1k,85.757,14.243,96.810,3.190,5.49,224,0.950,bicubic,+6.691,+2.430,+81
tf_efficientnet_lite3.in1k,85.755,14.245,96.891,3.109,8.20,300,0.904,bilinear,+5.949,+1.977,+18
resnext101_32x4d.gluon_in1k,85.753,14.247,96.637,3.363,44.18,224,0.875,bicubic,+5.413,+1.707,-25
xcit_tiny_24_p16_224.fb_in1k,85.746,14.254,96.932,3.068,12.12,224,1.000,bicubic,+6.298,+2.054,+42
legacy_seresnext101_32x4d.in1k,85.742,14.258,96.772,3.228,48.96,224,0.875,bilinear,+5.510,+1.752,-13
resnet101.a3_in1k,85.736,14.264,96.501,3.499,44.55,224,0.950,bicubic,+5.922,+1.887,+13
res2net50d.in1k,85.729,14.271,96.763,3.237,25.72,224,0.875,bilinear,+5.475,+1.727,-20
regnety_320.pycls_in1k,85.727,14.273,96.725,3.275,145.05,224,0.875,bicubic,+4.917,+1.487,-75
cspresnet50.ra_in1k,85.719,14.281,96.799,3.200,21.62,256,0.887,bilinear,+6.137,+2.090,+28
resmlp_big_24_224.fb_in1k,85.701,14.299,96.426,3.574,129.14,224,0.875,bicubic,+4.665,+1.408,-102
resnext101_64x4d.gluon_in1k,85.693,14.307,96.641,3.358,83.46,224,0.875,bicubic,+5.093,+1.649,-58
xception71.tf_in1k,85.689,14.311,96.774,3.226,42.34,299,0.903,bicubic,+5.815,+1.846,-3
resnet33ts.ra2_in1k,85.689,14.311,96.757,3.243,19.68,288,1.000,bicubic,+5.963,+1.929,+13
deit_small_patch16_224.fb_in1k,85.676,14.324,96.906,3.094,22.05,224,0.900,bicubic,+5.828,+1.862,0
resnet50.ra_in1k,85.674,14.326,96.889,3.111,25.56,288,0.950,bicubic,+5.838,+1.923,+2
efficientnet_em.ra2_in1k,85.672,14.328,96.951,3.049,6.90,240,0.882,bicubic,+6.428,+2.157,+49
dpn107.mx_in1k,85.672,14.328,96.757,3.243,86.92,224,0.875,bicubic,+5.502,+1.815,-21
ecaresnet50t.a3_in1k,85.672,14.328,96.725,3.275,25.57,224,0.950,bicubic,+6.120,+2.031,+22
efficientnet_b2_pruned.in1k,85.642,14.358,96.742,3.258,8.31,260,0.890,bicubic,+5.722,+1.890,-12
resmlp_36_224.fb_in1k,85.623,14.377,96.795,3.205,44.69,224,0.875,bicubic,+5.850,+1.911,+1
tiny_vit_5m_224.in1k,85.605,14.395,96.953,3.047,5.39,224,0.950,bicubic,+6.435,+2.159,+52
mobilevitv2_125.cvnets_in1k,85.578,14.422,96.665,3.335,7.48,256,0.888,bicubic,+5.898,+1.807,+9
resnet50.bt_in1k,85.576,14.425,96.832,3.168,25.56,288,0.950,bicubic,+5.936,+1.940,+11
levit_conv_192.fb_dist_in1k,85.571,14.429,96.742,3.258,10.95,224,0.900,bicubic,+5.733,+1.964,-9
resnet32ts.ra2_in1k,85.569,14.431,96.866,3.134,17.96,288,1.000,bicubic,+6.181,+2.214,+24
levit_192.fb_dist_in1k,85.569,14.431,96.740,3.260,10.95,224,0.900,bicubic,+5.731,+1.956,-9
resnet152c.gluon_in1k,85.569,14.431,96.644,3.356,60.21,224,0.875,bicubic,+5.657,+1.798,-19
pit_xs_distilled_224.in1k,85.561,14.439,96.686,3.314,11.00,224,0.900,bicubic,+6.381,+2.320,+44
tf_efficientnetv2_b1.in1k,85.556,14.444,96.731,3.269,8.14,240,0.882,bicubic,+6.096,+2.009,+16
regnety_120.pycls_in1k,85.554,14.446,96.774,3.226,51.82,224,0.875,bicubic,+5.174,+1.648,-57
fbnetv3_b.ra2_in1k,85.520,14.480,96.862,3.139,8.60,256,0.950,bilinear,+6.374,+2.118,+44
regnetx_320.pycls_in1k,85.520,14.480,96.671,3.329,107.81,224,0.875,bicubic,+5.274,+1.649,-43
nf_regnet_b1.ra2_in1k,85.507,14.493,96.789,3.211,10.22,288,0.900,bicubic,+6.199,+2.049,+25
convnextv2_femto.fcmae_ft_in1k,85.503,14.497,96.806,3.194,5.23,288,0.950,bicubic,+6.165,+2.246,+21
dpn92.mx_in1k,85.501,14.499,96.650,3.350,37.67,224,0.875,bicubic,+5.463,+1.790,-33
fastvit_s12.apple_in1k,85.492,14.508,96.721,3.280,9.47,256,0.900,bicubic,+5.550,+1.927,-31
regnety_160.pycls_in1k,85.486,14.514,96.616,3.384,83.59,224,0.875,bicubic,+5.188,+1.652,-52
resnet152.gluon_in1k,85.482,14.518,96.550,3.450,60.19,224,0.875,bicubic,+5.786,+1.820,-11
resnetrs50.tf_in1k,85.469,14.531,96.733,3.267,35.69,224,0.910,bicubic,+5.575,+1.759,-30
rexnet_130.nav_in1k,85.465,14.536,96.680,3.320,7.56,224,0.875,bicubic,+5.959,+2.002,+1
dpn131.mx_in1k,85.450,14.550,96.620,3.380,79.25,224,0.875,bicubic,+5.636,+1.920,-23
regnetx_160.pycls_in1k,85.396,14.604,96.650,3.350,54.28,224,0.875,bicubic,+5.530,+1.822,-30
tf_efficientnet_b2.in1k,85.396,14.604,96.586,3.414,9.11,260,0.890,bicubic,+5.788,+1.872,-8
convnext_tiny.fb_in22k_ft_in1k,85.381,14.619,96.804,3.196,28.59,288,1.000,bicubic,+6.483,+2.130,+46
dla102x2.in1k,85.375,14.625,96.629,3.371,41.28,224,0.875,bilinear,+5.929,+1.997,+2
repvit_m0_9.dist_300e_in1k,85.373,14.627,96.627,3.373,5.49,224,0.950,bicubic,+6.715,+2.511,+66
dpn98.mx_in1k,85.351,14.649,96.509,3.491,61.57,224,0.875,bicubic,+5.681,+1.855,-15
regnetx_016.tv2_in1k,85.349,14.651,96.817,3.183,9.19,224,0.965,bicubic,+5.913,+2.049,-1
gmlp_s16_224.ra3_in1k,85.349,14.651,96.644,3.356,19.42,224,0.875,bicubic,+5.705,+2.022,-15
botnet26t_256.c1_in1k,85.332,14.668,96.631,3.369,12.49,256,0.950,bicubic,+6.074,+2.099,+15
skresnext50_32x4d.ra_in1k,85.328,14.672,96.390,3.610,27.48,224,0.875,bicubic,+5.164,+1.750,-54
seresnext50_32x4d.gluon_in1k,85.321,14.679,96.674,3.326,27.56,224,0.875,bicubic,+5.397,+1.850,-46
xception65.tf_in1k,85.315,14.685,96.646,3.354,39.92,299,0.903,bicubic,+5.759,+1.988,-15
resnet101c.gluon_in1k,85.311,14.689,96.411,3.589,44.57,224,0.875,bicubic,+5.773,+1.827,-14
lambda_resnet26t.c1_in1k,85.302,14.698,96.721,3.280,10.96,256,0.940,bicubic,+6.214,+2.130,+23
resnext50_32x4d.a3_in1k,85.298,14.702,96.328,3.672,25.03,224,0.950,bicubic,+6.030,+2.022,+7
regnety_064.pycls_in1k,85.283,14.717,96.648,3.352,30.58,224,0.875,bicubic,+5.567,+1.882,-31
resnet34d.ra2_in1k,85.272,14.728,96.701,3.299,21.82,288,0.950,bicubic,+6.836,+2.357,+68
coat_lite_mini.in1k,85.272,14.728,96.686,3.314,11.01,224,0.900,bicubic,+6.170,+2.078,+19
resmlp_24_224.fb_in1k,85.264,14.736,96.490,3.510,30.02,224,0.875,bicubic,+5.890,+1.944,-8
convnext_femto_ols.d1_in1k,85.253,14.747,96.772,3.228,5.23,288,0.950,bicubic,+6.329,+2.246,+28
cait_xxs24_224.fb_dist_in1k,85.238,14.762,96.714,3.286,11.96,224,1.000,bicubic,+6.854,+2.398,+71
regnety_080.pycls_in1k,85.238,14.762,96.635,3.365,39.18,224,0.875,bicubic,+5.370,+1.803,-52
efficientvit_b1.r224_in1k,85.232,14.768,96.462,3.538,9.10,224,0.950,bicubic,+5.980,+2.158,+2
pvt_v2_b1.in1k,85.210,14.790,96.631,3.369,14.01,224,0.900,bicubic,+6.506,+2.129,+43
xcit_tiny_12_p16_224.fb_dist_in1k,85.200,14.800,96.597,3.403,6.72,224,1.000,bicubic,+6.626,+2.399,+47
halonet26t.a1h_in1k,85.191,14.809,96.454,3.546,12.48,256,0.950,bicubic,+6.085,+2.148,+9
resnext101_32x8d.tv_in1k,85.189,14.811,96.456,3.544,88.79,224,0.875,bilinear,+5.879,+1.936,-11
inception_v3.gluon_in1k,85.187,14.813,96.535,3.465,23.83,299,0.875,bicubic,+6.385,+2.159,+29
fastvit_t12.apple_in1k,85.180,14.819,96.607,3.393,7.55,256,0.900,bicubic,+5.916,+2.045,-6
convnext_femto.d1_in1k,85.163,14.837,96.704,3.296,5.22,288,0.950,bicubic,+6.447,+2.273,+34
resnet101.gluon_in1k,85.161,14.839,96.366,3.634,44.55,224,0.875,bicubic,+5.851,+1.844,-16
hrnet_w48.ms_in1k,85.159,14.841,96.490,3.510,77.47,224,0.875,bilinear,+5.853,+1.974,-14
regnetx_120.pycls_in1k,85.138,14.862,96.475,3.525,46.11,224,0.875,bicubic,+5.550,+1.733,-38
eca_halonext26ts.c1_in1k,85.131,14.869,96.580,3.420,10.76,256,0.940,bicubic,+5.645,+1.980,-33
resnet50.am_in1k,85.131,14.869,96.571,3.429,25.56,224,0.875,bicubic,+6.129,+2.173,+8
tf_efficientnet_b1.ap_in1k,85.131,14.869,96.407,3.593,7.79,240,0.882,bicubic,+5.855,+2.095,-16
repvit_m1.dist_in1k,85.121,14.879,96.597,3.403,5.49,224,0.950,bicubic,+6.583,+2.527,+36
eca_botnext26ts_256.c1_in1k,85.121,14.879,96.511,3.489,10.59,256,0.950,bicubic,+5.853,+1.905,-16
legacy_xception.tf_in1k,85.121,14.879,96.469,3.531,22.86,299,0.897,bicubic,+6.081,+2.087,+4
hrnet_w64.ms_in1k,85.114,14.886,96.738,3.262,128.06,224,0.875,bilinear,+5.638,+2.086,-37
lambda_resnet26rpt_256.c1_in1k,85.099,14.901,96.565,3.435,10.99,256,0.940,bicubic,+6.135,+2.129,+4
res2net101_26w_4s.in1k,85.097,14.903,96.381,3.619,45.21,224,0.875,bilinear,+5.897,+1.945,-13
resnet50.fb_ssl_yfcc100m_ft_in1k,85.093,14.907,96.862,3.139,25.56,224,0.875,bilinear,+5.863,+2.036,-16
mobileone_s4.apple_in1k,85.082,14.918,96.434,3.566,14.95,224,0.900,bilinear,+5.656,+1.954,-36
dpn68b.ra_in1k,85.061,14.939,96.449,3.551,12.61,288,1.000,bicubic,+5.701,+2.013,-33
tf_efficientnet_cc_b1_8e.in1k,85.061,14.939,96.430,3.570,39.72,240,0.882,bicubic,+5.759,+2.056,-27
resnest26d.gluon_in1k,85.016,14.984,96.639,3.361,17.07,224,0.875,bilinear,+6.534,+2.345,+34
xcit_nano_12_p8_384.fb_dist_in1k,85.014,14.986,96.629,3.371,3.05,384,1.000,bicubic,+7.194,+2.589,+83
resnext50_32x4d.gluon_in1k,85.005,14.995,96.428,3.572,25.03,224,0.875,bicubic,+5.645,+1.998,-36
tf_efficientnet_b0.ns_jft_in1k,84.999,15.001,96.505,3.495,5.29,224,0.875,bicubic,+6.331,+2.133,+18
coat_tiny.in1k,84.971,15.029,96.422,3.578,5.50,224,0.900,bicubic,+6.545,+2.374,+36
regnety_040.pycls_in1k,84.952,15.048,96.612,3.388,20.65,224,0.875,bicubic,+5.732,+1.956,-24
dla169.in1k,84.931,15.069,96.541,3.459,53.39,224,0.875,bilinear,+6.223,+2.197,+13
tf_efficientnet_b1.aa_in1k,84.918,15.082,96.366,3.634,7.79,240,0.882,bicubic,+6.090,+2.166,0
resnet50d.a3_in1k,84.901,15.099,96.285,3.715,25.58,224,0.950,bicubic,+6.181,+2.053,+8
legacy_seresnext50_32x4d.in1k,84.897,15.103,96.428,3.572,27.56,224,0.875,bilinear,+5.821,+1.996,-17
mobilevitv2_100.cvnets_in1k,84.894,15.106,96.388,3.612,4.90,256,0.888,bicubic,+6.814,+2.218,+55
hrnet_w44.ms_in1k,84.886,15.114,96.430,3.570,67.06,224,0.875,bilinear,+5.992,+2.066,-8
resnet50s.gluon_in1k,84.877,15.123,96.441,3.559,25.68,224,0.875,bicubic,+6.163,+2.199,+6
regnety_008_tv.tv2_in1k,84.875,15.125,96.637,3.363,6.43,224,0.965,bicubic,+6.209,+2.247,+9
regnetx_080.pycls_in1k,84.865,15.136,96.432,3.568,39.57,224,0.875,bicubic,+5.667,+1.878,-31
levit_conv_128.fb_dist_in1k,84.847,15.153,96.353,3.647,9.21,224,0.900,bicubic,+6.353,+2.345,+16
visformer_tiny.in1k,84.845,15.155,96.509,3.491,10.32,224,0.900,bicubic,+6.685,+2.343,+43
levit_128.fb_dist_in1k,84.845,15.155,96.353,3.647,9.21,224,0.900,bicubic,+6.355,+2.341,+17
res2net50_26w_8s.in1k,84.837,15.163,96.355,3.645,48.40,224,0.875,bilinear,+5.895,+2.061,-19
vit_tiny_patch16_384.augreg_in21k_ft_in1k,84.832,15.168,96.714,3.286,5.79,384,1.000,bicubic,+6.408,+2.172,+22
repghostnet_200.in1k,84.830,15.170,96.413,3.587,9.80,224,0.875,bicubic,+6.024,+2.083,-11
resnet50d.gluon_in1k,84.830,15.170,96.398,3.602,25.58,224,0.875,bicubic,+5.752,+1.932,-30
dla60_res2next.in1k,84.826,15.174,96.407,3.593,17.03,224,0.875,bilinear,+6.386,+2.263,+16
resnet152.tv_in1k,84.826,15.174,96.232,3.768,60.19,224,0.875,bilinear,+6.504,+2.186,+26
resnet26t.ra2_in1k,84.824,15.176,96.390,3.610,16.01,320,1.000,bicubic,+6.496,+2.266,+24
dla60_res2net.in1k,84.818,15.182,96.473,3.527,20.85,224,0.875,bilinear,+6.354,+2.275,+11
mixnet_l.ft_in1k,84.815,15.185,96.323,3.677,7.33,224,0.875,bicubic,+5.849,+2.141,-29
dla102x.in1k,84.807,15.193,96.552,3.448,26.31,224,0.875,bilinear,+6.295,+2.316,+2
xception41.tf_in1k,84.785,15.214,96.411,3.589,26.97,299,0.903,bicubic,+6.281,+2.135,+2
pit_xs_224.in1k,84.783,15.217,96.501,3.499,10.62,224,0.900,bicubic,+6.607,+2.339,+28
hrnet_w18.ms_aug_in1k,84.783,15.217,96.466,3.534,21.30,224,0.950,bilinear,+6.661,+2.412,+34
regnetx_064.pycls_in1k,84.773,15.227,96.494,3.506,26.21,224,0.875,bicubic,+5.707,+2.034,-38
resnet34.a1_in1k,84.762,15.238,96.230,3.771,21.80,288,1.000,bicubic,+6.844,+2.466,+46
poolformerv2_s12.sail_in1k,84.756,15.244,96.373,3.627,11.89,224,1.000,bicubic,+6.754,+2.508,+37
gcresnext26ts.ch_in1k,84.756,15.244,96.293,3.707,10.48,288,1.000,bicubic,+6.342,+2.257,+9
hrnet_w40.ms_in1k,84.745,15.255,96.552,3.448,57.56,224,0.875,bilinear,+5.813,+2.088,-35
repvgg_b2.rvgg_in1k,84.726,15.274,96.471,3.529,89.02,224,0.875,bilinear,+5.934,+2.051,-24
res2net50_26w_6s.in1k,84.726,15.274,96.281,3.719,37.05,224,0.875,bilinear,+6.158,+2.159,-11
resmlp_12_224.fb_distilled_in1k,84.719,15.281,96.225,3.775,15.35,224,0.875,bicubic,+6.765,+2.665,+37
vit_base_patch32_384.augreg_in1k,84.715,15.285,96.323,3.677,88.30,384,1.000,bicubic,+5.959,+2.097,-25
legacy_seresnet152.in1k,84.713,15.287,96.419,3.580,66.82,224,0.875,bilinear,+6.053,+2.049,-17
cs3darknet_m.c2ns_in1k,84.700,15.300,96.488,3.512,9.31,288,0.950,bicubic,+7.066,+2.472,+50
selecsls60b.in1k,84.662,15.338,96.300,3.700,32.77,224,0.875,bicubic,+6.250,+2.132,+1
bat_resnext26ts.ch_in1k,84.653,15.347,96.268,3.732,10.73,256,0.900,bicubic,+6.401,+2.170,+9
hrnet_w32.ms_in1k,84.651,15.349,96.413,3.587,41.23,224,0.875,bilinear,+6.209,+2.223,-7
seresnext26d_32x4d.bt_in1k,84.642,15.357,96.261,3.739,16.81,288,0.950,bicubic,+5.829,+2.022,-37
tf_efficientnetv2_b0.in1k,84.623,15.377,96.274,3.726,7.14,224,0.875,bicubic,+6.265,+2.260,+1
efficientnet_b1.ft_in1k,84.611,15.389,96.334,3.666,7.79,256,1.000,bicubic,+5.811,+1.992,-36
regnetx_040.pycls_in1k,84.606,15.394,96.379,3.621,22.12,224,0.875,bicubic,+6.114,+2.137,-16
vit_relpos_base_patch32_plus_rpn_256.sw_in1k,84.598,15.402,96.014,3.986,119.42,256,0.900,bicubic,+5.114,+1.876,-94
regnety_032.pycls_in1k,84.593,15.407,96.415,3.585,19.44,224,0.875,bicubic,+5.717,+2.007,-46
seresnext26t_32x4d.bt_in1k,84.589,15.411,96.381,3.619,16.81,288,0.950,bicubic,+5.845,+2.069,-36
hrnet_w30.ms_in1k,84.583,15.417,96.381,3.619,37.71,224,0.875,bilinear,+6.387,+2.159,+4
efficientnet_es.ra_in1k,84.581,15.419,96.308,3.692,5.44,224,0.875,bicubic,+6.523,+2.382,+13
tf_mixnet_l.in1k,84.578,15.422,96.244,3.756,7.33,224,0.875,bicubic,+5.802,+2.242,-41
wide_resnet101_2.tv_in1k,84.546,15.454,96.349,3.651,126.89,224,0.875,bilinear,+5.704,+2.067,-49
hrnet_w18_small_v2.gluon_in1k,84.542,15.458,96.285,3.715,15.60,224,0.875,bicubic,+6.352,+2.383,+1
vit_small_patch16_224.augreg_in1k,84.536,15.464,96.272,3.728,22.05,224,0.900,bicubic,+5.688,+1.984,-52
dla60x.in1k,84.531,15.469,96.293,3.707,17.35,224,0.875,bilinear,+6.295,+2.267,-3
legacy_seresnet101.in1k,84.497,15.503,96.326,3.674,49.33,224,0.875,bilinear,+6.111,+2.064,-15
resnet50.a3_in1k,84.484,15.515,96.003,3.997,25.56,224,0.950,bicubic,+6.436,+2.223,+7
cs3darknet_focus_m.c2ns_in1k,84.482,15.518,96.417,3.583,9.30,288,0.950,bicubic,+7.198,+2.451,+51
seresnext26ts.ch_in1k,84.480,15.520,96.321,3.679,10.39,288,1.000,bicubic,+6.210,+2.229,-11
tf_efficientnet_b1.in1k,84.472,15.528,96.074,3.926,7.79,240,0.882,bicubic,+5.910,+1.980,-36
coat_lite_tiny.in1k,84.459,15.541,96.383,3.617,5.72,224,0.900,bicubic,+6.939,+2.461,+32
tf_efficientnet_em.in1k,84.455,15.545,96.185,3.815,6.90,240,0.882,bicubic,+6.329,+2.137,-3
wide_resnet50_2.tv_in1k,84.429,15.571,96.257,3.743,68.88,224,0.875,bilinear,+5.953,+2.169,-31
repvgg_b1.rvgg_in1k,84.420,15.580,96.217,3.783,57.42,224,0.875,bilinear,+6.052,+2.121,-21
efficientnet_b1_pruned.in1k,84.397,15.603,96.140,3.860,6.33,240,0.882,bicubic,+6.157,+2.306,-14
vit_base_patch16_224.augreg_in1k,84.384,15.616,96.042,3.958,86.57,224,0.900,bicubic,+5.230,+1.952,-83
res2net50_26w_4s.in1k,84.359,15.642,96.091,3.909,25.70,224,0.875,bilinear,+6.409,+2.239,+6
hardcorenas_f.miil_green_in1k,84.324,15.676,96.025,3.975,8.20,224,0.875,bilinear,+6.228,+2.222,-7
res2net50_14w_8s.in1k,84.311,15.688,96.076,3.924,25.06,224,0.875,bilinear,+6.153,+2.230,-11
selecsls60.in1k,84.282,15.718,96.103,3.897,30.67,224,0.875,bicubic,+6.294,+2.273,+1
mobilevit_s.cvnets_in1k,84.277,15.723,96.259,3.741,5.58,256,0.900,bicubic,+5.965,+2.111,-24
regnetx_032.pycls_in1k,84.245,15.755,96.242,3.758,15.30,224,0.875,bicubic,+6.077,+2.160,-16
mobileone_s3.apple_in1k,84.233,15.767,96.135,3.865,10.17,224,0.900,bilinear,+6.241,+2.221,-3
convnextv2_atto.fcmae_ft_in1k,84.226,15.774,96.059,3.941,3.71,288,0.950,bicubic,+6.466,+2.333,+9
eca_resnext26ts.ch_in1k,84.224,15.776,96.191,3.809,10.30,288,1.000,bicubic,+6.224,+2.265,-6
ese_vovnet19b_dw.ra_in1k,84.220,15.780,96.259,3.741,6.54,288,0.950,bicubic,+6.476,+2.475,+7
convnext_atto_ols.a2_in1k,84.220,15.780,96.217,3.783,3.70,288,0.950,bicubic,+7.004,+2.541,+36
resnet50c.gluon_in1k,84.213,15.787,96.163,3.837,25.58,224,0.875,bicubic,+6.207,+2.171,-12
res2next50.in1k,84.213,15.787,96.001,3.999,24.67,224,0.875,bilinear,+5.971,+2.109,-28
mobileone_s2.apple_in1k,84.196,15.804,96.063,3.937,7.88,224,0.900,bilinear,+6.680,+2.395,+15
dla102.in1k,84.190,15.810,96.206,3.794,33.27,224,0.875,bilinear,+6.166,+2.272,-15
densenetblur121d.ra_in1k,84.168,15.832,96.240,3.760,8.00,288,0.950,bicubic,+6.846,+2.452,+22
rexnet_100.nav_in1k,84.158,15.842,96.249,3.751,4.80,224,0.875,bicubic,+6.302,+2.609,-5
fastvit_t8.apple_dist_in1k,84.143,15.857,96.078,3.922,4.03,256,0.900,bicubic,+6.967,+2.780,+30
convnext_atto.d2_in1k,84.141,15.859,96.200,3.800,3.70,288,0.950,bicubic,+7.133,+2.498,+39
inception_v3.tf_in1k,84.141,15.859,95.911,4.089,23.83,299,0.875,bicubic,+6.285,+2.045,-7
res2net50_48w_2s.in1k,84.117,15.883,95.960,4.040,25.29,224,0.875,bilinear,+6.603,+2.410,+9
xcit_tiny_12_p16_224.fb_in1k,84.111,15.889,96.236,3.764,6.72,224,1.000,bicubic,+6.971,+2.520,+29
ghostnetv2_160.in1k,84.098,15.902,96.210,3.790,12.39,224,0.875,bicubic,+6.266,+2.270,-9
tf_efficientnet_lite2.in1k,84.087,15.913,96.074,3.926,6.09,260,0.890,bicubic,+6.625,+2.322,+7
poolformer_s12.sail_in1k,84.049,15.951,96.178,3.822,11.92,224,0.900,bicubic,+6.809,+2.646,+20
resnet34.a2_in1k,84.045,15.955,95.922,4.078,21.80,288,1.000,bicubic,+6.887,+2.648,+24
efficientnet_b0.ra_in1k,84.034,15.966,95.965,4.035,5.29,224,0.875,bicubic,+6.340,+2.433,-8
crossvit_9_dagger_240.in1k,84.019,15.981,96.084,3.916,8.78,240,0.875,bicubic,+7.041,+2.466,+31
tf_efficientnet_cc_b0_8e.in1k,83.972,16.028,96.067,3.933,24.01,224,0.875,bicubic,+6.068,+2.405,-19
regnety_016.pycls_in1k,83.966,16.034,96.005,3.995,11.20,224,0.875,bicubic,+6.098,+2.287,-20
gmixer_24_224.ra3_in1k,83.966,16.034,95.854,4.146,24.72,224,0.875,bicubic,+5.940,+2.186,-31
hardcorenas_e.miil_green_in1k,83.963,16.037,95.903,4.097,8.07,224,0.875,bilinear,+6.173,+2.203,-16
resnext50_32x4d.tv_in1k,83.951,16.049,95.969,4.031,25.03,224,0.875,bilinear,+6.329,+2.273,-10
resnet50.gluon_in1k,83.940,16.060,96.020,3.980,25.56,224,0.875,bicubic,+6.358,+2.300,-8
densenet161.tv_in1k,83.910,16.090,96.022,3.978,28.68,224,0.875,bicubic,+6.552,+2.380,+2
mobilenetv2_120d.ra_in1k,83.902,16.098,95.903,4.097,5.83,224,0.875,bicubic,+6.594,+2.401,+3
inception_v3.tf_adv_in1k,83.897,16.103,95.939,4.061,23.83,299,0.875,bicubic,+6.305,+2.209,-13
resnet101.tv_in1k,83.863,16.137,95.888,4.112,44.55,224,0.875,bilinear,+6.483,+2.342,-2
tinynet_a.in1k,83.825,16.175,95.817,4.183,6.19,192,0.875,bicubic,+6.177,+2.277,-19
resnet26d.bt_in1k,83.791,16.209,95.960,4.040,16.01,288,0.950,bicubic,+6.383,+2.322,-5
dpn68b.mx_in1k,83.786,16.214,95.986,4.014,12.61,224,0.875,bicubic,+6.268,+2.134,-13
inception_v3.tv_in1k,83.756,16.244,95.886,4.114,23.83,299,0.875,bicubic,+6.322,+2.412,-9
hardcorenas_d.miil_green_in1k,83.756,16.244,95.738,4.262,7.50,224,0.875,bilinear,+6.322,+2.248,-9
xcit_nano_12_p8_224.fb_dist_in1k,83.733,16.267,95.963,4.037,3.05,224,1.000,bicubic,+7.401,+2.865,+39
dla60.in1k,83.716,16.284,95.926,4.074,22.04,224,0.875,bilinear,+6.670,+2.608,+11
resnext26ts.ra2_in1k,83.701,16.299,95.984,4.016,10.30,288,1.000,bicubic,+6.523,+2.520,+1
repvgg_b1g4.rvgg_in1k,83.699,16.301,96.027,3.973,39.97,224,0.875,bilinear,+6.111,+2.191,-22
convmixer_1024_20_ks9_p14.in1k,83.682,16.318,95.884,4.116,24.38,224,0.960,bicubic,+6.746,+2.534,+13
legacy_seresnet50.in1k,83.669,16.331,95.984,4.016,28.09,224,0.875,bilinear,+6.025,+2.226,-28
regnetx_008.tv2_in1k,83.667,16.333,95.975,4.025,7.26,224,0.965,bicubic,+6.361,+2.311,-10
tf_efficientnet_b0.ap_in1k,83.652,16.348,95.785,4.215,5.29,224,0.875,bicubic,+6.562,+2.523,+2
skresnet34.ra_in1k,83.645,16.355,95.928,4.072,22.28,224,0.875,bicubic,+6.735,+2.784,+11
tf_efficientnet_cc_b0_4e.in1k,83.641,16.359,95.740,4.260,13.31,224,0.875,bicubic,+6.339,+2.404,-12
repghostnet_150.in1k,83.630,16.369,95.920,4.080,6.58,224,0.875,bicubic,+6.171,+2.410,-22
seresnet50.a3_in1k,83.624,16.376,95.709,4.292,28.09,224,0.950,bicubic,+6.598,+2.636,+2
densenet121.ra_in1k,83.596,16.404,96.054,3.946,7.98,288,0.950,bicubic,+7.096,+2.686,+21
resmlp_12_224.fb_in1k,83.569,16.431,95.760,4.240,15.35,224,0.875,bicubic,+6.921,+2.582,+11
densenet201.tv_in1k,83.554,16.446,95.809,4.191,20.01,224,0.875,bicubic,+6.268,+2.329,-16
mobilenetv3_large_100.miil_in21k_ft_in1k,83.554,16.446,95.448,4.552,5.48,224,0.875,bilinear,+5.634,+2.534,-51
mixnet_m.ft_in1k,83.532,16.468,95.687,4.313,5.01,224,0.875,bicubic,+6.272,+2.269,-16
legacy_seresnext26_32x4d.in1k,83.522,16.478,95.709,4.292,16.79,224,0.875,bicubic,+6.414,+2.395,-9
gernet_s.idstcv_in1k,83.513,16.487,95.798,4.202,8.17,224,0.875,bilinear,+6.603,+2.482,+2
tf_efficientnet_b0.aa_in1k,83.502,16.498,95.704,4.296,5.29,224,0.875,bicubic,+6.658,+2.486,+2
hrnet_w18.ms_in1k,83.490,16.511,95.911,4.089,21.30,224,0.875,bilinear,+6.738,+2.467,+3
resnet34.bt_in1k,83.468,16.532,95.965,4.035,21.80,288,0.950,bicubic,+6.988,+2.611,+13
selecsls42b.in1k,83.451,16.549,95.736,4.264,32.46,224,0.875,bicubic,+6.281,+2.344,-17
efficientvit_m5.r224_in1k,83.449,16.551,95.813,4.187,12.47,224,0.875,bicubic,+6.391,+2.629,-12
efficientformerv2_s0.snap_dist_in1k,83.402,16.598,95.817,4.183,3.60,224,0.950,bicubic,+7.288,+2.959,+19
hardcorenas_c.miil_green_in1k,83.353,16.647,95.713,4.287,5.52,224,0.875,bilinear,+6.287,+2.551,-15
ghostnetv2_130.in1k,83.342,16.658,95.843,4.157,8.96,224,0.875,bicubic,+6.586,+2.481,-4
tf_efficientnet_lite1.in1k,83.338,16.662,95.647,4.353,5.42,240,0.882,bicubic,+6.694,+2.423,-2
fastvit_t8.apple_in1k,83.270,16.730,95.832,4.168,4.03,256,0.900,bicubic,+7.096,+2.780,+13
dpn68.mx_in1k,83.195,16.805,95.617,4.383,12.61,224,0.875,bicubic,+6.849,+2.609,+9
tf_mixnet_m.in1k,83.189,16.811,95.469,4.531,5.01,224,0.875,bicubic,+6.235,+2.315,-14
regnetx_016.pycls_in1k,83.180,16.820,95.740,4.260,9.19,224,0.875,bicubic,+6.256,+2.325,-13
tf_efficientnet_es.in1k,83.180,16.820,95.580,4.420,5.44,224,0.875,bicubic,+6.582,+2.378,-5
xcit_nano_12_p16_384.fb_dist_in1k,83.178,16.822,95.755,4.245,3.05,384,1.000,bicubic,+7.720,+3.058,+27
mobilenetv2_140.ra_in1k,83.174,16.826,95.687,4.313,6.11,224,0.875,bicubic,+6.658,+2.699,-2
levit_128s.fb_dist_in1k,83.052,16.948,95.533,4.467,7.78,224,0.900,bicubic,+6.526,+2.661,-5
levit_conv_128s.fb_dist_in1k,83.046,16.954,95.536,4.464,7.78,224,0.900,bicubic,+6.526,+2.670,-5
repvgg_a2.rvgg_in1k,82.996,17.004,95.593,4.407,28.21,224,0.875,bilinear,+6.538,+2.591,-2
resnet50.tv_in1k,82.958,17.042,95.469,4.531,25.56,224,0.875,bilinear,+6.830,+2.611,+4
repghostnet_130.in1k,82.922,17.078,95.459,4.541,5.48,224,0.875,bicubic,+6.546,+2.567,-3
resnet26.bt_in1k,82.917,17.083,95.726,4.274,16.00,288,0.950,bicubic,+6.551,+2.546,-3
hardcorenas_b.miil_green_in1k,82.864,17.136,95.395,4.605,5.18,224,0.875,bilinear,+6.316,+2.633,-13
mobileone_s1.apple_in1k,82.853,17.147,95.540,4.460,4.83,224,0.900,bilinear,+7.067,+2.748,+7
mobilevitv2_075.cvnets_in1k,82.796,17.204,95.570,4.430,2.87,256,0.888,bicubic,+7.188,+2.826,+11
densenet169.tv_in1k,82.689,17.311,95.604,4.396,14.15,224,0.875,bicubic,+6.789,+2.576,+4
vit_tiny_r_s16_p8_384.augreg_in21k_ft_in1k,82.687,17.313,95.847,4.153,6.36,384,1.000,bicubic,+6.727,+2.585,+1
regnety_004.tv2_in1k,82.640,17.360,95.499,4.501,4.34,224,0.965,bicubic,+7.046,+2.799,+9
edgenext_x_small.in1k,82.580,17.420,95.459,4.541,2.34,288,1.000,bicubic,+6.892,+2.693,+4
tf_efficientnet_b0.in1k,82.563,17.437,95.418,4.582,5.29,224,0.875,bicubic,+6.033,+2.410,-19
mixnet_s.ft_in1k,82.520,17.480,95.356,4.644,4.13,224,0.875,bicubic,+6.526,+2.086,-4
vit_small_patch32_224.augreg_in21k_ft_in1k,82.516,17.484,95.664,4.336,22.88,224,0.900,bicubic,+6.522,+2.864,-6
regnety_008.pycls_in1k,82.486,17.514,95.489,4.511,6.26,224,0.875,bicubic,+6.184,+2.427,-11
efficientnet_lite0.ra_in1k,82.371,17.629,95.294,4.706,4.65,224,0.875,bicubic,+6.889,+2.774,+6
resnest14d.gluon_in1k,82.356,17.644,95.335,4.665,10.61,224,0.875,bilinear,+6.848,+2.827,+4
hardcorenas_a.miil_green_in1k,82.322,17.678,95.288,4.712,5.26,224,0.875,bilinear,+6.384,+2.780,-7
efficientnet_es_pruned.in1k,82.294,17.706,95.301,4.699,5.44,224,0.875,bicubic,+7.288,+2.857,+15
mobilenetv3_rw.rmsp_in1k,82.264,17.736,95.234,4.766,5.48,224,0.875,bicubic,+6.644,+2.531,-3
semnasnet_100.rmsp_in1k,82.258,17.742,95.226,4.774,3.89,224,0.875,bicubic,+6.808,+2.628,+4
mobilenetv3_large_100.ra_in1k,82.179,17.821,95.192,4.808,5.48,224,0.875,bicubic,+6.413,+2.654,-8
vit_tiny_patch16_224.augreg_in21k_ft_in1k,82.080,17.920,95.476,4.524,5.72,224,0.900,bicubic,+6.618,+2.632,0
mobilenetv2_110d.ra_in1k,82.068,17.932,95.070,4.930,4.52,224,0.875,bicubic,+7.014,+2.886,+8
tf_mixnet_s.in1k,82.038,17.962,95.126,4.874,4.13,224,0.875,bicubic,+6.386,+2.486,-9
repvgg_b0.rvgg_in1k,81.999,18.001,95.104,4.896,15.82,224,0.875,bilinear,+6.855,+2.688,+2
deit_tiny_distilled_patch16_224.fb_in1k,81.993,18.007,95.134,4.866,5.91,224,0.900,bicubic,+7.489,+3.244,+18
hrnet_w18_small_v2.ms_in1k,81.976,18.024,95.160,4.840,15.60,224,0.875,bilinear,+6.866,+2.744,+2
mixer_b16_224.goog_in21k_ft_in1k,81.976,18.024,94.451,5.549,59.88,224,0.875,bicubic,+5.374,+2.227,-39
tf_efficientnet_lite0.in1k,81.950,18.050,95.160,4.840,4.65,224,0.875,bicubic,+7.118,+2.990,+7
ghostnetv2_100.in1k,81.905,18.095,95.111,4.889,6.16,224,0.875,bicubic,+6.739,+2.757,-4
tinynet_b.in1k,81.880,18.120,94.876,5.124,3.73,188,0.875,bicubic,+6.902,+2.690,+3
tf_mobilenetv3_large_100.in1k,81.848,18.152,95.059,4.941,5.48,224,0.875,bilinear,+6.332,+2.466,-13
pit_ti_distilled_224.in1k,81.779,18.221,95.098,4.902,5.10,224,0.900,bicubic,+7.523,+3.146,+14
repghostnet_111.in1k,81.743,18.257,94.842,5.158,4.54,224,0.875,bicubic,+6.687,+2.650,-4
densenet121.tv_in1k,81.732,18.268,95.036,4.964,7.98,224,0.875,bicubic,+6.968,+2.882,+3
regnety_006.pycls_in1k,81.717,18.283,95.115,4.885,6.06,224,0.875,bicubic,+6.449,+2.589,-11
regnetx_004_tv.tv2_in1k,81.694,18.306,95.057,4.943,5.50,224,0.965,bicubic,+7.094,+2.887,+5
resnet18d.ra2_in1k,81.679,18.321,95.079,4.921,11.71,288,0.950,bicubic,+7.885,+3.241,+19
dla34.in1k,81.664,18.336,94.867,5.133,15.74,224,0.875,bilinear,+7.024,+2.801,+1
xcit_nano_12_p8_224.fb_in1k,81.645,18.355,95.271,4.729,3.05,224,1.000,bicubic,+7.735,+3.103,+15
crossvit_9_240.in1k,81.613,18.387,94.981,5.019,8.55,240,0.875,bicubic,+7.653,+3.019,+11
fbnetc_100.rmsp_in1k,81.559,18.441,94.968,5.032,5.57,224,0.875,bilinear,+6.430,+2.580,-14
mobilevit_xs.cvnets_in1k,81.553,18.447,95.023,4.977,2.32,256,0.900,bicubic,+6.919,+2.675,-2
legacy_seresnet34.in1k,81.538,18.462,94.897,5.103,21.96,224,0.875,bilinear,+6.736,+2.771,-7
efficientvit_m4.r224_in1k,81.498,18.503,95.002,4.998,8.80,224,0.875,bicubic,+7.130,+3.022,+1
regnetx_008.pycls_in1k,81.487,18.513,95.053,4.947,7.26,224,0.875,bicubic,+6.459,+2.715,-14
resnet34.gluon_in1k,81.481,18.520,94.799,5.201,21.80,224,0.875,bicubic,+6.901,+2.817,-4
mnasnet_100.rmsp_in1k,81.446,18.554,94.914,5.086,4.38,224,0.875,bicubic,+6.794,+2.792,-9
vgg19_bn.tv_in1k,81.438,18.562,94.771,5.229,143.68,224,0.875,bilinear,+7.222,+2.927,-1
repvgg_a1.rvgg_in1k,81.256,18.744,94.714,5.286,14.09,224,0.875,bilinear,+6.794,+2.858,-5
vit_base_patch32_224.augreg_in1k,81.143,18.857,94.427,5.572,88.22,224,0.900,bicubic,+6.249,+2.649,-16
convit_tiny.fb_in1k,81.126,18.874,95.036,4.964,5.71,224,0.875,bicubic,+8.014,+3.324,+14
crossvit_tiny_240.in1k,81.098,18.902,94.983,5.017,7.01,240,0.875,bicubic,+7.758,+3.075,+9
resnet18.a1_in1k,81.036,18.964,94.357,5.643,11.69,288,1.000,bicubic,+7.878,+3.331,+11
repghostnet_100.in1k,80.930,19.070,94.543,5.457,4.07,224,0.875,bicubic,+6.724,+3.001,-6
spnasnet_100.rmsp_in1k,80.883,19.117,94.526,5.474,4.42,224,0.875,bilinear,+6.789,+2.706,-6
resnet34.a3_in1k,80.814,19.186,94.353,5.647,21.80,224,0.950,bicubic,+7.844,+3.247,+12
efficientvit_m3.r224_in1k,80.693,19.307,94.556,5.444,6.90,224,0.875,bicubic,+7.319,+3.208,+2
ghostnet_100.in1k,80.678,19.322,94.359,5.641,5.18,224,0.875,bicubic,+6.720,+2.827,-6
regnety_004.pycls_in1k,80.656,19.344,94.682,5.318,4.34,224,0.875,bicubic,+6.630,+2.934,-9
skresnet18.ra_in1k,80.648,19.352,94.378,5.622,11.96,224,0.875,bicubic,+7.614,+3.206,+6
regnetx_006.pycls_in1k,80.639,19.361,94.530,5.470,6.20,224,0.875,bicubic,+6.771,+2.852,-6
pit_ti_224.in1k,80.599,19.401,94.620,5.380,4.85,224,0.900,bicubic,+7.689,+3.216,+8
resnet18.fb_swsl_ig1b_ft_in1k,80.577,19.423,94.741,5.259,11.69,224,0.875,bilinear,+7.289,+3.011,0
vgg16_bn.tv_in1k,80.571,19.429,94.600,5.400,138.37,224,0.875,bilinear,+7.201,+3.086,-4
semnasnet_075.rmsp_in1k,80.481,19.519,94.319,5.681,2.91,224,0.875,bicubic,+7.477,+3.179,+2
hrnet_w18_small.gluon_in1k,80.406,19.593,94.045,5.955,13.19,224,0.875,bicubic,+6.486,+2.851,-13
resnet34.tv_in1k,80.381,19.619,94.430,5.570,21.80,224,0.875,bilinear,+7.075,+3.010,-5
resnet18.a2_in1k,80.310,19.690,94.099,5.901,11.69,288,1.000,bicubic,+7.938,+3.503,+7
mobilenetv2_100.ra_in1k,80.253,19.747,94.188,5.812,3.50,224,0.875,bicubic,+7.285,+3.172,0
xcit_nano_12_p16_224.fb_dist_in1k,80.231,19.769,94.351,5.649,3.05,224,1.000,bicubic,+7.921,+3.491,+7
vit_base_patch32_224.sam_in1k,80.214,19.786,93.823,6.177,88.22,224,0.900,bicubic,+6.520,+2.809,-14
resnet18.fb_ssl_yfcc100m_ft_in1k,80.095,19.905,94.592,5.408,11.69,224,0.875,bilinear,+7.497,+3.176,-1
tf_mobilenetv3_large_075.in1k,80.073,19.927,94.180,5.820,3.99,224,0.875,bilinear,+6.643,+2.828,-15
deit_tiny_patch16_224.fb_in1k,80.011,19.988,94.449,5.551,5.72,224,0.900,bicubic,+7.841,+3.333,+7
hrnet_w18_small.ms_in1k,79.550,20.450,93.906,6.093,13.19,224,0.875,bilinear,+7.214,+3.226,+1
repvgg_a0.rvgg_in1k,79.508,20.492,93.778,6.222,9.11,224,0.875,bilinear,+7.100,+3.286,-4
vgg19.tv_in1k,79.484,20.516,93.868,6.132,143.67,224,0.875,bilinear,+7.106,+2.994,-3
regnetx_004.pycls_in1k,79.420,20.580,93.847,6.153,5.16,224,0.875,bicubic,+7.018,+3.021,-5
tf_mobilenetv3_large_minimal_100.in1k,79.232,20.768,93.702,6.298,3.92,224,0.875,bilinear,+6.968,+3.062,-1
edgenext_xx_small.in1k,79.175,20.825,93.819,6.181,1.33,288,1.000,bicubic,+7.297,+3.267,+4
legacy_seresnet18.in1k,79.164,20.836,93.774,6.226,11.78,224,0.875,bicubic,+7.404,+3.442,+5
resnet14t.c3_in1k,79.143,20.857,93.565,6.435,10.08,224,0.950,bicubic,+6.889,+3.259,-3
repghostnet_080.in1k,79.089,20.911,93.716,6.284,3.28,224,0.875,bicubic,+6.877,+3.233,-3
vgg16.tv_in1k,79.038,20.962,93.644,6.356,138.36,224,0.875,bilinear,+7.446,+3.260,+3
vgg13_bn.tv_in1k,78.995,21.005,93.661,6.339,133.05,224,0.875,bilinear,+7.407,+3.283,+3
vit_tiny_r_s16_p8_224.augreg_in21k_ft_in1k,78.991,21.009,93.898,6.102,6.34,224,0.900,bicubic,+7.193,+3.074,-1
lcnet_100.ra2_in1k,78.899,21.101,93.550,6.450,2.95,224,0.875,bicubic,+6.797,+3.196,-5
pvt_v2_b0.in1k,78.756,21.244,93.836,6.164,3.67,224,0.900,bicubic,+8.096,+3.640,+6
efficientvit_m2.r224_in1k,78.632,21.368,93.552,6.448,4.19,224,0.875,bicubic,+7.818,+3.410,+4
mobileone_s0.apple_in1k,78.496,21.504,93.322,6.678,5.29,224,0.875,bilinear,+7.094,+3.480,-1
tinynet_c.in1k,78.449,21.551,93.125,6.875,2.46,184,0.875,bicubic,+7.207,+3.393,0
efficientvit_b0.r224_in1k,78.425,21.575,92.801,7.199,3.41,224,0.950,bicubic,+7.027,+3.373,-2
resnet18.gluon_in1k,78.376,21.624,93.129,6.871,11.69,224,0.875,bicubic,+7.542,+3.373,-1
mobilevitv2_050.cvnets_in1k,78.124,21.876,93.565,6.435,1.37,256,0.888,bicubic,+7.976,+3.647,+3
vgg11_bn.tv_in1k,77.956,22.044,93.232,6.768,132.87,224,0.875,bilinear,+7.573,+3.424,0
xcit_nano_12_p16_224.fb_in1k,77.904,22.096,93.437,6.563,3.05,224,1.000,bicubic,+7.942,+3.675,+2
regnety_002.pycls_in1k,77.422,22.578,92.905,7.095,3.16,224,0.875,bicubic,+7.142,+3.375,-1
resnet18.tv_in1k,77.285,22.715,92.756,7.244,11.69,224,0.875,bilinear,+7.525,+3.686,+2
mixer_l16_224.goog_in21k_ft_in1k,77.279,22.721,90.584,9.416,208.20,224,0.875,bicubic,+5.225,+2.910,-16
vgg13.tv_in1k,77.227,22.773,92.692,7.308,133.05,224,0.875,bilinear,+7.295,+3.442,-1
mobilevit_xxs.cvnets_in1k,76.604,23.396,92.681,7.319,1.27,256,0.900,bicubic,+7.686,+3.735,+1
resnet18.a3_in1k,76.457,23.543,92.226,7.774,11.69,224,0.950,bicubic,+8.205,+4.054,+6
efficientvit_m1.r224_in1k,76.388,23.612,92.542,7.458,2.98,224,0.875,bicubic,+8.082,+3.872,+4
vgg11.tv_in1k,76.384,23.616,92.156,7.844,132.86,224,0.875,bilinear,+7.362,+3.532,-3
repghostnet_058.in1k,76.224,23.776,92.117,7.883,2.55,224,0.875,bicubic,+7.310,+3.697,-2
resnet10t.c3_in1k,76.171,23.829,92.226,7.774,5.44,224,0.950,bicubic,+7.806,+4.190,0
regnetx_002.pycls_in1k,76.128,23.872,92.198,7.801,2.68,224,0.875,bicubic,+7.376,+3.656,-2
lcnet_075.ra2_in1k,76.036,23.964,92.060,7.940,2.36,224,0.875,bicubic,+7.254,+3.700,-4
dla60x_c.in1k,75.637,24.363,92.171,7.829,1.32,224,0.875,bilinear,+7.725,+3.739,+1
mobilenetv3_small_100.lamb_in1k,74.921,25.078,91.487,8.512,2.54,224,0.875,bicubic,+7.263,+3.852,+1
tf_mobilenetv3_small_100.in1k,74.725,25.275,91.266,8.735,2.54,224,0.875,bilinear,+6.803,+3.594,-2
tinynet_d.in1k,74.292,25.708,90.917,9.083,2.34,152,0.875,bicubic,+7.320,+3.851,0
repghostnet_050.in1k,74.236,25.764,90.808,9.191,2.31,224,0.875,bicubic,+7.270,+3.888,0
mnasnet_small.lamb_in1k,73.801,26.199,90.732,9.268,2.03,224,0.875,bicubic,+7.605,+4.228,0
dla46x_c.in1k,73.655,26.345,91.097,8.903,1.07,224,0.875,bilinear,+7.663,+4.123,0
mobilenetv2_050.lamb_in1k,73.470,26.530,90.317,9.682,1.97,224,0.875,bicubic,+7.522,+4.233,0
tf_mobilenetv3_small_075.in1k,72.814,27.186,90.051,9.949,2.04,224,0.875,bilinear,+7.088,+3.919,0
dla46_c.in1k,72.626,27.374,90.499,9.501,1.30,224,0.875,bilinear,+7.754,+4.201,+1
mobilenetv3_small_075.lamb_in1k,72.317,27.683,89.666,10.334,2.04,224,0.875,bicubic,+7.081,+4.220,-1
efficientvit_m0.r224_in1k,71.091,28.909,89.589,10.411,2.35,224,0.875,bicubic,+7.821,+4.413,0
lcnet_050.ra2_in1k,70.402,29.598,88.825,11.175,1.88,224,0.875,bicubic,+7.264,+4.443,0
tf_mobilenetv3_small_minimal_100.in1k,70.096,29.904,88.516,11.485,2.04,224,0.875,bilinear,+7.202,+4.278,0
tinynet_e.in1k,66.810,33.190,86.280,13.720,2.04,106,0.875,bicubic,+6.944,+4.518,0
mobilenetv3_small_050.lamb_in1k,64.697,35.303,84.858,15.142,1.59,224,0.875,bicubic,+6.781,+4.678,0
hf_public_repos/pytorch-image-models/results/benchmark-infer-amp-nhwc-pt111-cu113-rtx3090.csv
model,infer_samples_per_sec,infer_step_time,infer_batch_size,infer_img_size,param_count
tinynet_e,68298.73,14.982,1024,106,2.04
mobilenetv3_small_050,48773.32,20.985,1024,224,1.59
lcnet_035,47045.94,21.755,1024,224,1.64
lcnet_050,41541.83,24.639,1024,224,1.88
mobilenetv3_small_075,37803.23,27.076,1024,224,2.04
mobilenetv3_small_100,34839.31,29.381,1024,224,2.54
tinynet_d,34615.54,29.571,1024,152,2.34
tf_mobilenetv3_small_minimal_100,31097.5,32.918,1024,224,2.04
tf_mobilenetv3_small_075,30498.6,33.564,1024,224,2.04
tf_mobilenetv3_small_100,28466.28,35.962,1024,224,2.54
lcnet_075,26999.4,37.915,1024,224,2.36
mnasnet_small,23228.74,44.072,1024,224,2.03
lcnet_100,22774.77,44.951,1024,224,2.95
levit_128s,21485.8,47.648,1024,224,7.78
mobilenetv2_035,20032.08,51.106,1024,224,1.68
ghostnet_050,18639.82,54.925,1024,224,2.59
mnasnet_050,18244.9,56.115,1024,224,2.22
regnetx_002,17821.98,57.446,1024,224,2.68
tinynet_c,17586.87,58.214,1024,184,2.46
regnety_002,16673.08,61.405,1024,224,3.16
mobilenetv2_050,16415.14,62.371,1024,224,1.97
semnasnet_050,16295.23,62.83,1024,224,2.08
lcnet_150,15040.68,68.071,1024,224,4.5
levit_128,14657.83,69.849,1024,224,9.21
regnetx_004,14440.03,70.903,1024,224,5.16
gernet_s,14051.59,72.863,1024,224,8.17
mobilenetv3_large_075,13658.47,74.961,1024,224,3.99
levit_192,12892.86,79.412,1024,224,10.95
mnasnet_075,12457.54,82.188,1024,224,3.17
mobilenetv3_rw,12442.0,82.291,1024,224,5.48
hardcorenas_a,12441.72,82.293,1024,224,5.26
mixer_s32_224,12325.03,83.072,1024,224,19.1
mobilenetv3_large_100,12253.83,83.554,1024,224,5.48
mobilenetv3_large_100_miil,12253.15,83.559,1024,224,5.48
vit_small_patch32_224,12098.59,84.625,1024,224,22.88
tf_mobilenetv3_large_075,11757.16,87.085,1024,224,3.99
tinynet_b,11714.84,87.399,1024,188,3.73
hardcorenas_b,11307.89,90.545,1024,224,5.18
hardcorenas_c,11295.65,90.643,1024,224,5.52
ese_vovnet19b_slim_dw,11295.18,90.646,1024,224,1.9
tf_mobilenetv3_large_minimal_100,11279.9,90.77,1024,224,3.92
mnasnet_b1,10903.65,93.902,1024,224,4.38
mnasnet_100,10903.28,93.906,1024,224,4.38
swsl_resnet18,10835.46,94.49,1024,224,11.69
ssl_resnet18,10829.31,94.547,1024,224,11.69
resnet18,10826.05,94.576,1024,224,11.69
gluon_resnet18_v1b,10791.4,94.879,1024,224,11.69
tf_mobilenetv3_large_100,10638.58,96.242,1024,224,5.48
hardcorenas_d,10551.44,97.037,1024,224,7.5
semnasnet_075,10519.79,97.329,1024,224,2.91
ghostnet_100,10434.77,98.122,1024,224,5.18
mobilenetv2_075,10372.86,98.708,1024,224,2.64
seresnet18,10183.7,100.541,1024,224,11.78
regnety_006,9982.58,102.567,1024,224,6.06
vit_tiny_r_s16_p8_224,9895.77,103.465,1024,224,6.34
spnasnet_100,9875.93,103.675,1024,224,4.42
legacy_seresnet18,9845.25,103.999,1024,224,11.78
regnety_004,9552.58,107.183,1024,224,4.34
levit_256,9434.24,108.53,1024,224,18.89
tinynet_a,9412.38,108.782,1024,192,6.19
hardcorenas_f,9390.96,109.029,1024,224,8.2
semnasnet_100,9334.36,109.69,1024,224,3.89
mnasnet_a1,9318.68,109.875,1024,224,3.89
mobilenetv2_100,9260.72,110.564,1024,224,3.5
hardcorenas_e,9255.53,110.624,1024,224,8.07
tf_efficientnetv2_b0,9250.71,110.683,1024,224,7.14
fbnetc_100,9032.97,113.35,1024,224,5.57
efficientnet_lite0,8999.14,113.778,1024,224,4.65
resnet18d,8913.81,114.867,1024,224,11.71
ese_vovnet19b_slim,8715.26,117.484,1024,224,3.17
regnetx_008,8458.52,121.05,1024,224,7.26
levit_256d,8024.27,127.602,1024,224,26.21
regnetx_006,7937.85,128.991,1024,224,6.2
regnety_008,7871.11,130.085,1024,224,6.26
tf_efficientnet_lite0,7813.92,131.036,1024,224,4.65
efficientnet_b0,7681.79,133.291,1024,224,5.29
ghostnet_130,7655.04,133.756,1024,224,7.36
mnasnet_140,7500.9,136.506,1024,224,7.12
xcit_nano_12_p16_224_dist,7406.15,138.252,1024,224,3.05
xcit_nano_12_p16_224,7390.69,138.541,1024,224,3.05
rexnetr_100,7266.82,140.903,1024,224,4.88
mobilenetv2_110d,7027.38,145.704,1024,224,4.52
tf_efficientnet_b0_ap,6815.39,150.235,1024,224,5.29
tf_efficientnet_b0,6815.06,150.243,1024,224,5.29
tf_efficientnet_b0_ns,6813.82,150.27,1024,224,5.29
regnetz_005,6657.81,153.793,1024,224,7.12
hrnet_w18_small,6626.19,154.526,1024,224,13.19
gernet_m,6452.0,158.699,1024,224,21.14
semnasnet_140,6407.4,159.804,1024,224,6.11
tv_resnet34,6289.18,162.807,1024,224,21.8
gluon_resnet34_v1b,6283.02,162.967,1024,224,21.8
resnet34,6262.0,163.515,1024,224,21.8
vit_tiny_patch16_224,6224.71,164.493,1024,224,5.72
mobilenetv2_140,6223.01,164.539,1024,224,6.11
deit_tiny_patch16_224,6218.5,164.657,1024,224,5.72
ese_vovnet19b_dw,6178.2,165.733,1024,224,6.54
deit_tiny_distilled_patch16_224,6117.9,167.365,1024,224,5.91
efficientnet_b1_pruned,6023.92,169.977,1024,240,6.33
efficientnet_lite1,5983.27,171.132,1024,240,5.42
tf_efficientnetv2_b1,5921.06,172.93,1024,240,8.14
selecsls42,5908.85,173.287,1024,224,30.35
fbnetv3_b,5903.05,173.458,1024,256,8.6
seresnet34,5898.43,173.594,1024,224,21.96
selecsls42b,5885.09,173.988,1024,224,32.46
rexnet_100,5871.63,174.387,1024,224,4.8
pit_ti_distilled_224,5837.73,175.397,1024,224,5.1
pit_ti_224,5807.14,176.321,1024,224,4.85
efficientnet_es,5711.75,179.268,1024,224,5.44
efficientnet_es_pruned,5710.76,179.298,1024,224,5.44
resnet26,5694.28,179.817,1024,224,16.0
legacy_seresnet34,5670.37,180.577,1024,224,21.96
levit_384,5643.3,181.443,1024,224,39.13
resnet34d,5571.88,183.768,1024,224,21.82
resnetblur18,5507.37,185.919,1024,224,11.69
rexnetr_130,5484.64,186.691,1024,224,7.61
nf_regnet_b0,5429.0,188.605,1024,256,8.76
tf_efficientnet_es,5424.57,188.76,1024,224,5.44
tf_efficientnet_lite1,5360.34,191.021,1024,240,5.42
skresnet18,5350.54,191.371,1024,224,11.96
selecsls60,5263.66,194.53,1024,224,30.67
selecsls60b,5251.8,194.969,1024,224,32.77
mobilenetv2_120d,5160.96,198.401,1024,224,5.83
mobilevit_xxs,5125.56,199.771,1024,256,1.27
repvgg_b0,5049.08,202.797,1024,224,15.82
resnet26d,4891.07,209.349,1024,224,16.01
fbnetv3_d,4812.34,212.774,1024,256,10.31
rexnetr_150,4791.0,213.722,1024,224,9.78
xcit_tiny_12_p16_224_dist,4778.77,214.269,1024,224,6.72
xcit_tiny_12_p16_224,4774.3,214.469,1024,224,6.72
visformer_tiny,4714.23,217.203,1024,224,10.32
nf_resnet26,4639.54,220.7,1024,224,16.0
efficientnet_lite2,4604.04,222.402,1024,260,6.09
pit_xs_224,4582.73,223.434,1024,224,10.62
pit_xs_distilled_224,4539.81,225.546,1024,224,11.0
resmlp_12_distilled_224,4519.92,226.541,1024,224,15.35
resmlp_12_224,4518.97,226.589,1024,224,15.35
tf_efficientnetv2_b2,4403.08,232.553,1024,260,10.1
vit_base_patch32_224_sam,4330.12,236.472,1024,224,88.22
vit_base_patch32_224,4316.0,237.246,1024,224,88.22
mixer_b32_224,4294.72,238.421,1024,224,60.29
tf_efficientnet_b1_ns,4267.56,239.936,1024,240,7.79
tf_efficientnet_b1_ap,4266.36,240.004,1024,240,7.79
tf_efficientnet_b1,4266.12,240.017,1024,240,7.79
efficientnet_b0_g16_evos,4260.91,240.312,1024,224,8.11
legacy_seresnext26_32x4d,4210.62,243.183,1024,224,16.79
mixer_s16_224,4210.38,243.197,1024,224,18.53
gernet_l,4176.68,245.159,1024,256,31.08
tf_efficientnet_lite2,4154.51,246.467,1024,260,6.09
gmixer_12_224,4127.0,248.109,1024,224,12.7
efficientnet_b1,4101.97,249.625,1024,256,7.79
resnext26ts,4091.38,250.27,1024,256,10.3
rexnet_130,4013.83,255.106,1024,224,7.56
seresnext26ts,3966.87,258.123,1024,256,10.39
nf_seresnet26,3954.94,258.905,1024,224,17.4
eca_resnext26ts,3953.54,258.996,1024,256,10.3
nf_ecaresnet26,3938.91,259.956,1024,224,16.0
repvgg_a2,3868.66,264.68,1024,224,28.21
gmlp_ti16_224,3865.39,264.903,1024,224,5.87
crossvit_tiny_240,3854.54,265.648,1024,240,7.01
efficientnet_b2_pruned,3821.39,267.953,1024,260,8.31
vit_small_patch32_384,3802.4,269.289,1024,384,22.92
resnet26t,3789.73,270.19,1024,256,16.01
regnetx_016,3776.49,271.139,1024,224,9.19
rexnet_150,3774.39,271.29,1024,224,9.73
seresnext26t_32x4d,3769.71,271.627,1024,224,16.81
seresnext26tn_32x4d,3769.68,271.629,1024,224,16.81
gcresnext26ts,3763.59,272.069,1024,256,10.48
seresnext26d_32x4d,3760.94,272.26,1024,224,16.81
ecaresnext50t_32x4d,3751.71,272.931,1024,224,15.41
ecaresnext26t_32x4d,3745.62,273.373,1024,224,15.41
ecaresnet50d_pruned,3714.28,275.68,1024,224,19.94
resnetv2_50,3703.17,276.505,1024,224,25.55
eca_botnext26ts_256,3685.99,277.797,1024,256,10.59
crossvit_9_240,3640.93,281.235,1024,240,8.55
ecaresnetlight,3588.5,285.344,1024,224,30.16
eca_halonext26ts,3573.18,286.568,1024,256,10.76
crossvit_9_dagger_240,3562.28,287.444,1024,240,8.78
poolformer_s12,3555.09,288.026,1024,224,11.92
swsl_resnet50,3536.6,289.53,1024,224,25.56
gluon_resnet50_v1b,3534.52,289.702,1024,224,25.56
tv_resnet50,3534.18,289.73,1024,224,25.56
resnet50,3528.73,290.177,1024,224,25.56
ssl_resnet50,3522.22,290.713,1024,224,25.56
rexnetr_200,3521.98,145.362,512,224,16.52
dla46_c,3494.33,293.033,1024,224,1.3
botnet26t_256,3447.34,297.026,1024,256,12.49
efficientnet_em,3434.61,298.129,1024,240,6.9
dpn68,3423.83,299.067,1024,224,12.61
resnet32ts,3414.56,299.879,1024,256,17.96
dpn68b,3412.63,300.049,1024,224,12.61
regnety_016,3408.85,300.382,1024,224,11.2
halonet26t,3379.19,303.017,1024,256,12.48
resnetv2_50t,3374.73,303.416,1024,224,25.57
resnetv2_50d,3370.34,303.812,1024,224,25.57
resnet33ts,3369.92,303.852,1024,256,19.68
nf_regnet_b1,3357.92,304.938,1024,288,10.22
gluon_resnet50_v1c,3339.61,306.611,1024,224,25.58
nf_regnet_b2,3329.81,307.512,1024,272,14.31
tf_efficientnet_b2_ap,3315.4,308.848,1024,260,9.11
tf_efficientnet_b2_ns,3314.91,308.894,1024,260,9.11
tf_efficientnet_em,3313.67,309.011,1024,240,6.9
tf_efficientnet_b2,3313.08,309.063,1024,260,9.11
seresnet33ts,3285.4,311.671,1024,256,19.78
eca_resnet33ts,3264.82,313.634,1024,256,19.68
resnet50t,3209.81,319.009,1024,224,25.57
bat_resnext26ts,3208.41,319.147,1024,256,10.73
legacy_seresnet50,3208.39,319.15,1024,224,28.09
gluon_resnet50_v1d,3207.43,319.246,1024,224,25.58
convnext_nano_hnf,3203.44,319.644,1024,224,15.59
resnet50d,3201.53,319.835,1024,224,25.58
convit_tiny,3182.1,321.788,1024,224,5.71
gcresnet33ts,3115.24,328.694,1024,256,19.88
vit_small_resnet26d_224,3114.54,328.764,1024,224,63.61
efficientnet_b2,3113.3,328.898,1024,288,9.11
efficientnet_b2a,3112.75,328.957,1024,288,9.11
efficientnet_b3_pruned,3098.33,330.489,1024,300,9.86
mobilevit_xs,3098.23,165.245,512,256,2.32
vovnet39a,3094.91,330.85,1024,224,22.6
seresnet50,3084.99,331.918,1024,224,28.09
haloregnetz_b,3050.82,335.635,1024,224,11.68
skresnet34,3016.36,339.47,1024,224,22.28
ese_vovnet39b,2991.04,342.343,1024,224,24.57
eca_vovnet39b,2984.95,343.042,1024,224,22.6
cspresnext50,2981.83,343.401,1024,224,20.57
selecsls84,2979.47,343.673,1024,224,50.95
res2net50_48w_2s,2961.27,345.783,1024,224,25.29
ssl_resnext50_32x4d,2883.39,355.125,1024,224,25.03
resnext50_32x4d,2879.89,355.557,1024,224,25.03
swsl_resnext50_32x4d,2879.83,355.563,1024,224,25.03
tv_resnext50_32x4d,2876.25,356.006,1024,224,25.03
gluon_resnext50_32x4d,2870.17,356.762,1024,224,25.03
resnetaa50d,2869.21,356.878,1024,224,25.58
tv_densenet121,2861.74,357.812,1024,224,7.98
densenet121,2859.4,358.103,1024,224,7.98
mixnet_s,2849.57,359.34,1024,224,4.13
seresnet50t,2844.85,359.936,1024,224,28.1
resnetrs50,2844.33,360.0,1024,224,35.69
deit_small_patch16_224,2839.86,360.568,1024,224,22.05
vit_small_patch16_224,2838.21,360.777,1024,224,22.05
ecaresnet101d_pruned,2833.47,361.381,1024,224,24.88
coat_lite_tiny,2832.3,361.531,1024,224,5.72
gluon_resnet50_v1s,2824.57,362.521,1024,224,25.68
pit_s_224,2821.07,362.969,1024,224,23.46
ecaresnet50d,2815.88,363.64,1024,224,25.58
rexnet_200,2812.74,182.018,512,224,16.37
pit_s_distilled_224,2802.25,365.406,1024,224,24.04
deit_small_distilled_patch16_224,2791.58,366.805,1024,224,22.44
dla34,2790.34,366.966,1024,224,15.74
dla46x_c,2773.06,369.253,1024,224,1.07
cspresnet50,2754.0,371.811,1024,256,21.62
hrnet_w18_small_v2,2736.24,374.217,1024,224,15.6
densenet121d,2731.76,374.837,1024,224,8.0
tf_mixnet_s,2728.68,375.26,1024,224,4.13
efficientnet_lite3,2718.45,188.332,512,300,8.2
regnetz_b16,2715.55,377.076,1024,288,9.72
dla60x_c,2712.17,377.543,1024,224,1.32
vit_tiny_r_s16_p8_384,2700.75,189.562,512,384,6.36
coat_lite_mini,2663.5,384.444,1024,224,11.01
resnext50d_32x4d,2662.22,384.629,1024,224,25.05
resnetblur50,2619.08,390.962,1024,224,25.56
cspresnet50d,2596.32,394.393,1024,256,21.64
vgg11_bn,2593.2,197.428,512,224,132.87
vit_base2_patch32_256,2576.99,397.35,1024,256,119.46
seresnetaa50d,2576.43,397.436,1024,224,28.11
cspresnet50w,2574.64,397.712,1024,256,28.12
seresnext50_32x4d,2574.54,397.729,1024,224,27.56
lambda_resnet26rpt_256,2573.59,397.876,1024,256,10.99
legacy_seresnext50_32x4d,2571.71,398.166,1024,224,27.56
xcit_nano_12_p16_384_dist,2569.25,398.547,1024,384,3.05
gluon_seresnext50_32x4d,2564.07,399.352,1024,224,27.56
xcit_tiny_24_p16_224_dist,2557.0,400.456,1024,224,12.12
xcit_tiny_24_p16_224,2556.28,400.571,1024,224,12.12
efficientnetv2_rw_t,2545.32,402.294,1024,288,13.65
vovnet57a,2538.22,403.418,1024,224,36.64
fbnetv3_g,2530.36,404.673,1024,288,16.62
gcresnet50t,2514.25,407.265,1024,256,25.9
res2net50_26w_4s,2505.47,408.693,1024,224,25.7
tf_efficientnetv2_b3,2496.85,410.104,1024,300,14.36
twins_svt_small,2488.33,411.508,1024,224,24.06
vit_base_resnet26d_224,2472.64,414.118,1024,224,101.4
ese_vovnet57b,2446.54,418.537,1024,224,38.61
densenetblur121d,2445.52,418.711,1024,224,8.0
tf_efficientnet_lite3,2442.63,209.598,512,300,8.2
resnetblur50d,2422.6,422.672,1024,224,25.58
mobilevit_s,2415.06,211.991,512,256,5.58
resnest14d,2409.35,424.998,1024,224,10.61
tf_inception_v3,2400.27,426.605,1024,299,23.83
gluon_inception_v3,2398.26,426.963,1024,299,23.83
nf_seresnet50,2397.62,427.078,1024,224,28.09
inception_v3,2397.12,427.166,1024,299,23.83
nf_ecaresnet50,2388.81,428.654,1024,224,25.56
adv_inception_v3,2388.59,428.689,1024,299,23.83
densenet169,2362.23,433.477,1024,224,14.15
sehalonet33ts,2337.63,219.014,512,256,13.69
resmlp_24_224,2312.09,442.876,1024,224,30.02
gc_efficientnetv2_rw_t,2311.11,443.065,1024,288,13.68
resmlp_24_distilled_224,2308.78,443.512,1024,224,30.02
convnext_tiny,2275.45,450.007,1024,224,28.59
gcresnext50ts,2263.71,452.341,1024,256,15.67
res2net50_14w_8s,2252.88,454.515,1024,224,25.06
semobilevit_s,2250.59,227.485,512,256,5.74
darknet53,2248.73,227.672,512,256,41.61
resnetv2_101,2235.55,458.038,1024,224,44.54
xcit_small_12_p16_224_dist,2216.94,461.885,1024,224,26.25
xcit_small_12_p16_224,2212.11,462.891,1024,224,26.25
skresnet50,2193.01,466.925,1024,224,25.8
resnet101,2170.93,471.673,1024,224,44.55
tv_resnet101,2164.75,473.02,1024,224,44.55
gluon_resnet101_v1b,2163.02,473.399,1024,224,44.55
res2next50,2156.68,474.791,1024,224,24.67
ecaresnet26t,2145.15,477.343,1024,320,16.01
nf_regnet_b3,2126.07,481.627,1024,320,18.59
gmixer_24_224,2110.93,485.081,1024,224,24.72
resnetv2_101d,2090.05,489.925,1024,224,44.56
gluon_resnet101_v1c,2088.96,490.184,1024,224,44.57
dla60,2087.19,490.597,1024,224,22.04
convnext_tiny_hnf,2073.34,493.877,1024,224,28.59
skresnet50d,2063.75,496.172,1024,224,25.82
twins_pcpvt_small,2048.61,499.837,1024,224,24.11
gluon_resnet101_v1d,2036.64,502.777,1024,224,44.57
vgg13,2019.99,506.92,1024,224,133.05
efficientnet_b0_gn,2017.66,253.748,512,224,5.29
wide_resnet50_2,2001.07,511.71,1024,224,68.88
sebotnet33ts_256,1999.77,192.009,384,256,13.7
xcit_nano_12_p8_224,1978.07,517.663,1024,224,3.05
xcit_nano_12_p8_224_dist,1977.13,517.91,1024,224,3.05
vit_base_resnet50d_224,1950.99,524.848,1024,224,110.97
repvgg_b1,1936.11,528.882,1024,224,57.42
legacy_seresnet101,1932.53,529.863,1024,224,49.33
dla60x,1896.77,539.852,1024,224,17.35
resnetaa101d,1893.61,540.751,1024,224,44.57
tf_efficientnet_b3_ns,1893.08,270.444,512,300,12.23
tf_efficientnet_b3_ap,1892.83,270.481,512,300,12.23
tf_efficientnet_b3,1892.52,270.524,512,300,12.23
seresnet101,1891.69,541.301,1024,224,49.33
gluon_resnet101_v1s,1874.89,546.152,1024,224,44.67
resnet51q,1869.52,547.722,1024,288,35.7
efficientnet_b3,1853.06,276.288,512,320,12.23
efficientnet_b3a,1852.72,276.339,512,320,12.23
crossvit_small_240,1844.12,555.265,1024,240,26.86
poolformer_s24,1843.54,555.44,1024,224,21.39
swin_tiny_patch4_window7_224,1831.44,559.111,1024,224,28.29
halonet50ts,1826.61,560.589,1024,256,22.73
densenet201,1818.74,563.014,1024,224,20.01
gmlp_s16_224,1810.78,565.489,1024,224,19.42
ssl_resnext101_32x4d,1796.49,569.986,1024,224,44.18
resnext101_32x4d,1794.8,570.526,1024,224,44.18
swsl_resnext101_32x4d,1794.2,570.714,1024,224,44.18
convnext_tiny_hnfd,1793.87,570.819,1024,224,28.63
gluon_resnext101_32x4d,1791.38,571.613,1024,224,44.18
cspdarknet53,1782.45,287.233,512,256,27.64
nf_resnet101,1778.39,575.79,1024,224,44.55
vit_small_r26_s32_224,1778.32,575.808,1024,224,36.43
ecaresnet101d,1778.2,575.85,1024,224,44.57
regnetz_c16,1772.58,288.833,512,320,13.46
res2net50_26w_6s,1761.45,581.325,1024,224,37.05
resnest26d,1761.21,581.405,1024,224,17.07
dla60_res2net,1720.09,595.302,1024,224,20.85
nf_resnet50,1719.27,595.589,1024,288,25.56
crossvit_15_240,1700.36,602.213,1024,240,27.53
resnetblur101d,1688.41,606.472,1024,224,44.57
resnet61q,1677.9,610.274,1024,288,36.85
swin_s3_tiny_224,1668.56,613.692,1024,224,28.33
xcit_tiny_12_p16_384_dist,1651.99,619.843,1024,384,6.72
crossvit_15_dagger_240,1651.71,619.951,1024,240,28.21
resnetv2_50d_frn,1650.73,620.316,1024,224,25.59
vgg13_bn,1642.03,311.797,512,224,133.05
efficientnet_b0_g8_gn,1637.47,312.665,512,224,6.56
vgg16,1632.03,627.428,1024,224,138.36
cait_xxs24_224,1630.9,627.86,1024,224,11.96
repvgg_b1g4,1614.79,634.126,1024,224,39.97
regnetx_032,1605.58,637.76,1024,224,15.3
seresnext101_32x4d,1600.47,639.797,1024,224,48.96
gluon_seresnext101_32x4d,1597.12,641.141,1024,224,48.96
legacy_seresnext101_32x4d,1596.46,641.406,1024,224,48.96
botnet50ts_256,1594.64,321.062,512,256,22.74
res2net101_26w_4s,1591.38,643.454,1024,224,45.21
regnetx_040,1586.28,645.521,1024,224,22.12
ese_vovnet39b_evos,1584.69,646.169,1024,224,24.58
resnetv2_50d_evob,1578.15,648.842,1024,224,25.59
resnetv2_50x1_bit_distilled,1562.05,655.535,1024,224,25.55
visformer_small,1557.14,657.604,1024,224,40.22
xception41p,1555.03,329.24,512,299,26.91
resnetv2_152,1552.32,659.642,1024,224,60.19
dla102,1552.08,659.747,1024,224,33.27
resmlp_36_224,1551.92,659.816,1024,224,44.69
dla60_res2next,1551.9,659.821,1024,224,17.03
resmlp_36_distilled_224,1551.61,659.949,1024,224,44.69
resnest50d_1s4x24d,1548.62,661.22,1024,224,25.68
xception,1546.37,331.084,512,299,22.86
hrnet_w32,1543.69,663.332,1024,224,41.23
halo2botnet50ts_256,1528.74,669.82,1024,256,22.64
tv_resnet152,1516.25,675.334,1024,224,60.19
coat_lite_small,1515.55,675.647,1024,224,19.84
mixer_b16_224,1515.3,675.76,1024,224,59.88
mixer_b16_224_miil,1514.28,676.214,1024,224,59.88
gluon_resnet152_v1b,1512.22,677.139,1024,224,60.19
resnet152,1507.31,679.341,1024,224,60.19
efficientnet_el,1505.49,340.076,512,300,10.59
efficientnet_el_pruned,1505.01,340.185,512,300,10.59
vit_large_patch32_224,1500.0,682.653,1024,224,306.54
swin_v2_cr_tiny_224,1497.15,683.946,1024,224,28.33
res2net50_26w_8s,1487.1,688.575,1024,224,48.4
nf_seresnet101,1486.78,688.723,1024,224,49.33
resnetv2_152d,1485.01,689.541,1024,224,60.2
nf_ecaresnet101,1477.89,692.868,1024,224,44.55
gluon_resnet152_v1c,1474.38,694.516,1024,224,60.21
swin_v2_cr_tiny_ns_224,1472.66,695.323,1024,224,28.33
tf_efficientnet_el,1464.69,349.551,512,300,10.59
vit_tiny_patch16_384,1460.68,701.028,1024,384,5.79
vit_base_r26_s32_224,1452.81,704.825,1024,224,101.38
gluon_resnet152_v1d,1447.4,707.462,1024,224,60.21
hrnet_w18,1432.55,714.792,1024,224,21.3
mixnet_m,1431.87,715.134,1024,224,5.01
mixer_l32_224,1426.61,717.775,1024,224,206.94
dla102x,1416.93,722.674,1024,224,26.31
tf_mixnet_m,1413.24,724.561,1024,224,5.01
convnext_small,1410.61,725.911,1024,224,50.22
twins_pcpvt_base,1405.32,728.645,1024,224,43.83
ecaresnet50t,1382.49,740.677,1024,320,25.57
nest_tiny,1378.24,371.475,512,224,17.06
vit_base_patch32_384,1377.87,743.165,1024,384,88.3
convit_small,1368.52,748.238,1024,224,27.78
regnety_032,1367.75,748.662,1024,288,19.44
vgg19,1367.17,748.977,1024,224,143.67
gluon_resnet152_v1s,1363.8,750.829,1024,224,60.32
vgg16_bn,1363.24,375.563,512,224,138.37
jx_nest_tiny,1354.35,378.027,512,224,17.06
ese_vovnet99b,1351.66,757.57,1024,224,63.2
xception41,1337.66,382.745,512,299,26.97
legacy_seresnet152,1335.99,766.462,1024,224,66.82
seresnet152,1317.5,777.213,1024,224,66.82
dpn92,1303.11,785.8,1024,224,37.67
efficientnet_lite4,1287.19,298.314,384,380,13.01
tresnet_m,1283.27,797.947,1024,224,31.39
inception_v4,1282.29,798.555,1024,299,42.68
densenet161,1276.9,801.927,1024,224,28.68
xcit_tiny_12_p8_224_dist,1263.32,810.55,1024,224,6.71
xcit_tiny_12_p8_224,1260.86,812.129,1024,224,6.71
regnetx_080,1259.62,812.931,1024,224,39.57
skresnext50_32x4d,1255.24,815.767,1024,224,27.48
vit_small_resnet50d_s16_224,1252.87,817.305,1024,224,57.53
twins_svt_base,1249.6,819.445,1024,224,56.07
poolformer_s36,1245.02,822.461,1024,224,30.86
repvgg_b2,1243.81,823.263,1024,224,89.02
volo_d1_224,1232.57,830.769,1024,224,26.63
hrnet_w30,1221.32,838.421,1024,224,37.71
crossvit_18_240,1205.55,849.387,1024,240,43.27
resnest50d,1199.06,853.99,1024,224,27.48
tf_efficientnet_lite4,1182.46,324.733,384,380,13.01
xcit_small_24_p16_224_dist,1182.35,866.056,1024,224,47.67
xcit_small_24_p16_224,1181.92,866.371,1024,224,47.67
crossvit_18_dagger_240,1172.17,873.578,1024,240,44.27
regnetx_064,1166.56,438.885,512,224,26.21
vgg19_bn,1163.98,439.857,512,224,143.68
nf_regnet_b4,1158.13,884.17,1024,384,30.21
efficientnetv2_s,1149.66,890.683,1024,384,21.46
regnetz_d8,1143.01,895.867,1024,320,23.37
resnet50_gn,1132.8,903.934,1024,224,25.56
dla169,1130.6,905.699,1024,224,53.39
wide_resnet101_2,1125.72,909.622,1024,224,126.89
swin_small_patch4_window7_224,1123.67,911.288,1024,224,49.61
tf_efficientnetv2_s,1123.66,911.292,1024,384,21.46
tf_efficientnetv2_s_in21ft1k,1121.7,912.889,1024,384,21.46
gluon_resnext101_64x4d,1118.73,915.313,1024,224,83.46
mixnet_l,1117.6,458.111,512,224,7.33
xception65p,1115.45,458.995,512,299,39.82
vit_base_patch16_224_miil,1108.89,923.433,1024,224,86.54
nfnet_l0,1101.58,929.561,1024,288,35.07
eca_nfnet_l0,1100.75,930.261,1024,288,24.14
tf_mixnet_l,1100.74,465.128,512,224,7.33
dpn98,1097.38,933.115,1024,224,61.57
resnet200,1096.34,934.004,1024,224,64.67
resnetrs101,1095.82,934.443,1024,288,63.62
cait_xxs36_224,1095.14,935.023,1024,224,17.3
efficientnetv2_rw_s,1089.64,939.747,1024,384,23.94
vit_base_patch16_224_sam,1073.62,953.767,1024,224,86.57
vit_base_patch16_224,1073.46,953.912,1024,224,86.57
deit_base_patch16_224,1071.76,955.424,1024,224,86.57
inception_resnet_v2,1061.78,964.405,1024,299,55.84
ens_adv_inception_resnet_v2,1058.32,967.554,1024,299,55.84
deit_base_distilled_patch16_224,1057.89,967.952,1024,224,87.34
tnt_s_patch16_224,1053.73,971.773,1024,224,23.76
dla102x2,1044.22,490.307,512,224,41.28
regnetz_040,1039.59,369.364,384,320,27.12
gluon_seresnext101_64x4d,1039.37,985.196,1024,224,88.23
regnetz_040h,1034.17,371.302,384,320,28.94
resnext101_32x8d,1033.81,990.5,1024,224,88.79
ssl_resnext101_32x8d,1033.7,990.597,1024,224,88.79
swsl_resnext101_32x8d,1033.38,990.91,1024,224,88.79
ig_resnext101_32x8d,1028.15,995.947,1024,224,88.79
regnetz_d32,1009.91,1013.934,1024,320,27.58
resnest50d_4s2x40d,1008.21,1015.647,1024,224,30.42
repvgg_b3,1001.81,1022.132,1024,224,123.09
twins_pcpvt_large,999.56,1024.437,1024,224,60.99
resnet101d,999.12,1024.886,1024,320,44.57
beit_base_patch16_224,993.76,1030.414,1024,224,86.53
convnext_base,982.94,1041.763,1024,224,88.59
convnext_base_in22ft1k,982.23,1042.511,1024,224,88.59
hrnet_w40,980.59,1044.254,1024,224,57.56
coat_tiny,977.86,1047.174,1024,224,5.5
efficientnet_b4,974.19,394.16,384,384,19.34
gluon_xception65,972.42,526.509,512,299,39.92
xception65,965.78,530.126,512,299,39.92
pit_b_224,946.3,541.042,512,224,73.76
regnetz_b16_evos,945.35,541.586,512,288,9.74
pit_b_distilled_224,942.16,543.418,512,224,74.79
tf_efficientnet_b4,925.87,414.732,384,380,19.34
tf_efficientnet_b4_ap,925.37,414.955,384,380,19.34
tf_efficientnet_b4_ns,925.29,414.99,384,380,19.34
vit_small_patch16_36x1_224,920.52,1112.401,1024,224,64.67
swin_v2_cr_small_224,915.78,1118.157,1024,224,49.7
vit_small_patch16_18x2_224,899.5,1138.389,1024,224,64.67
xcit_tiny_24_p16_384_dist,885.13,1156.874,1024,384,12.12
twins_svt_large,884.59,1157.58,1024,224,99.27
nest_small,880.02,581.79,512,224,38.35
hrnet_w48,878.91,1165.063,1024,224,77.47
cait_s24_224,871.81,1174.556,1024,224,46.92
jx_nest_small,870.35,588.255,512,224,38.35
poolformer_m36,860.43,1190.082,1024,224,56.17
regnety_040,849.33,602.817,512,288,20.65
regnetv_040,849.16,602.939,512,288,20.64
nfnet_f0,843.53,1213.933,1024,256,71.49
resnetv2_50d_evos,834.99,1226.346,1024,288,25.59
swin_s3_small_224,821.03,623.593,512,224,49.74
repvgg_b2g4,812.87,1259.724,1024,224,61.76
xcit_medium_24_p16_224_dist,811.19,1262.323,1024,224,84.4
xcit_medium_24_p16_224,810.62,1263.211,1024,224,84.4
dpn131,808.5,1266.522,1024,224,79.25
swin_base_patch4_window7_224,797.2,1284.488,1024,224,87.77
hrnet_w44,791.54,1293.66,1024,224,67.06
coat_mini,786.27,1302.337,1024,224,10.34
regnetx_120,777.95,658.124,512,224,46.11
gmlp_b16_224,772.78,1325.073,1024,224,73.08
dm_nfnet_f0,761.49,1344.713,1024,256,71.49
regnety_120,754.35,678.713,512,224,51.82
densenet264,750.52,1364.368,1024,224,72.69
mixnet_xl,748.42,684.091,512,224,11.9
crossvit_base_240,747.2,685.212,512,240,105.03
xcit_small_12_p16_384_dist,744.87,1374.713,1024,384,26.25
resnetv2_50d_gn,742.22,689.806,512,288,25.57
xception71,741.19,690.762,512,299,42.34
hrnet_w64,723.1,1416.105,1024,224,128.06
vit_large_r50_s32_224,721.71,1418.838,1024,224,328.99
regnety_040s_gn,714.09,716.982,512,224,20.65
seresnet200d,712.88,1436.411,1024,256,71.86
resnet152d,711.54,1439.114,1024,320,60.21
ecaresnet200d,707.4,1447.533,1024,256,64.69
dpn107,701.19,1460.351,1024,224,86.92
cspresnext50_iabn,693.03,1477.558,1024,256,20.57
senet154,692.07,1479.599,1024,224,115.09
legacy_senet154,691.5,1480.826,1024,224,115.09
gluon_senet154,689.56,1484.981,1024,224,115.09
convit_base,687.54,1489.363,1024,224,86.54
vit_small_patch16_384,683.58,748.981,512,384,22.2
volo_d2_224,677.6,1511.204,1024,224,58.68
tnt_b_patch16_224,675.63,1515.603,1024,224,65.41
xcit_nano_12_p8_384_dist,672.93,1521.688,1024,384,3.05
resnext101_64x4d,667.5,767.03,512,288,83.46
xcit_tiny_24_p8_224,665.57,1538.51,1024,224,12.11
swin_s3_base_224,665.44,1538.819,1024,224,71.13
xcit_tiny_24_p8_224_dist,665.24,1539.279,1024,224,12.11
swin_v2_cr_base_224,655.51,1562.118,1024,224,87.88
poolformer_m48,648.5,1579.004,1024,224,73.47
repvgg_b3g4,644.06,1589.905,1024,224,83.83
tresnet_l,638.84,1602.889,1024,224,55.99
resnetrs152,628.07,1630.372,1024,320,86.62
nest_base,626.36,817.402,512,224,67.72
seresnet152d,626.17,1635.31,1024,320,66.84
regnetx_160,623.38,821.322,512,224,54.28
jx_nest_base,620.6,824.996,512,224,67.72
regnetz_e8,607.41,842.906,512,320,57.7
ese_vovnet99b_iabn,606.83,1687.442,1024,224,63.2
regnetz_c16_evos,598.38,855.63,512,320,13.49
vit_base_r50_s16_224,595.98,1718.162,1024,224,98.66
resnest101e,584.85,875.416,512,256,48.28
vit_small_r26_s32_384,582.99,878.207,512,384,36.47
convmixer_768_32,574.69,1781.818,1024,224,21.11
seresnext101_32x8d,574.1,891.822,512,288,93.57
cspdarknet53_iabn,571.87,1790.614,1024,256,27.64
xcit_small_12_p8_224,568.57,1800.991,1024,224,26.21
xcit_small_12_p8_224_dist,567.71,1803.708,1024,224,26.21
seresnet269d,558.78,1832.551,1024,256,113.67
convnext_large_in22ft1k,544.26,940.706,512,224,197.77
convnext_large,544.24,940.754,512,224,197.77
regnety_080,536.38,954.53,512,288,39.18
resnet200d,524.48,1952.377,1024,320,64.69
mixnet_xxl,489.07,785.144,384,224,23.96
halonet_h1,488.76,523.759,256,256,8.1
eca_nfnet_l1,486.84,2103.345,1024,320,41.41
mixer_l16_224,485.71,2108.232,1024,224,208.2
efficientnetv2_m,483.16,2119.353,1024,416,54.14
vit_large_patch32_384,479.45,2135.754,1024,384,306.63
volo_d3_224,473.36,2163.252,1024,224,86.33
tresnet_xl,471.89,2169.959,1024,224,78.44
regnety_064,471.86,1085.046,512,288,30.58
regnetv_064,468.27,1093.376,512,288,30.58
efficientnet_b5,467.72,547.327,256,456,30.39
xcit_large_24_p16_224_dist,464.03,2206.748,1024,224,189.1
xcit_large_24_p16_224,463.92,2207.253,1024,224,189.1
efficientnet_b3_gn,459.87,417.496,192,320,11.73
swin_large_patch4_window7_224,457.64,1118.756,512,224,196.53
resnetrs200,456.26,2244.305,1024,320,93.21
tf_efficientnet_b5_ap,447.29,572.321,256,456,30.39
tf_efficientnet_b5,447.21,572.425,256,456,30.39
tf_efficientnet_b5_ns,446.89,572.83,256,456,30.39
xcit_tiny_12_p8_384_dist,429.76,2382.721,1024,384,6.71
regnety_320,415.34,1232.707,512,224,145.05
regnety_160,414.45,926.515,384,288,83.59
xcit_small_24_p16_384_dist,397.04,2579.039,1024,384,47.67
efficientnetv2_rw_m,393.18,1302.2,512,416,53.24
regnetz_d8_evos,387.65,1320.76,512,320,23.46
swin_v2_cr_large_224,386.59,1324.379,512,224,196.68
efficientnet_b3_g8_gn,385.43,498.136,192,320,14.25
swin_v2_cr_tiny_384,366.25,698.968,256,384,28.33
convnext_xlarge_in22ft1k,359.4,1424.579,512,224,350.2
tf_efficientnetv2_m,359.31,1424.935,512,480,54.14
tf_efficientnetv2_m_in21ft1k,358.34,1428.804,512,480,54.14
vit_large_patch16_224,354.58,2887.929,1024,224,304.33
crossvit_15_dagger_408,354.15,722.849,256,408,28.5
ssl_resnext101_32x16d,347.17,1474.787,512,224,194.03
swsl_resnext101_32x16d,346.95,1475.72,512,224,194.03
vit_base_patch16_18x2_224,346.41,2956.033,1024,224,256.73
ig_resnext101_32x16d,345.97,1479.87,512,224,194.03
convnext_base_384_in22ft1k,335.81,1143.488,384,384,88.59
beit_large_patch16_224,328.45,3117.61,1024,224,304.43
tresnet_m_448,321.84,3181.675,1024,448,31.39
volo_d1_384,310.06,1651.255,512,384,26.78
volo_d4_224,303.03,3379.23,1024,224,192.96
pnasnet5large,299.35,1282.762,384,331,86.06
xcit_small_24_p8_224,297.54,3441.571,1024,224,47.63
xcit_small_24_p8_224_dist,297.52,3441.791,1024,224,47.63
ecaresnet269d,293.49,3489.006,1024,352,102.09
nasnetalarge,291.8,1315.945,384,331,88.75
resnetrs270,287.51,3561.559,1024,352,129.86
nfnet_f1,286.44,3574.909,1024,320,132.63
resnetv2_152x2_bit_teacher,280.03,1828.366,512,224,236.34
xcit_medium_24_p16_384_dist,278.67,1837.25,512,384,84.4
vit_base_patch16_384,277.57,1383.439,384,384,86.86
deit_base_patch16_384,277.28,1384.877,384,384,86.86
deit_base_distilled_patch16_384,273.16,1405.736,384,384,87.63
cait_xxs24_384,268.69,3811.003,1024,384,12.03
efficientnet_b6,268.02,477.57,128,528,43.04
regnetx_320,262.23,1464.357,384,224,107.81
dm_nfnet_f1,261.81,3911.156,1024,320,132.63
vit_large_patch14_224,261.02,3923.008,1024,224,304.2
crossvit_18_dagger_408,259.11,740.985,192,408,44.61
tf_efficientnet_b6_ns,257.39,497.28,128,528,43.04
tf_efficientnet_b6_ap,257.15,497.757,128,528,43.04
tf_efficientnet_b6,257.08,497.876,128,528,43.04
beit_base_patch16_384,239.23,1605.166,384,384,86.74
vit_large_r50_s32_384,236.31,1624.94,384,384,329.09
eca_nfnet_l2,232.83,2198.978,512,384,56.72
xcit_tiny_24_p8_384_dist,226.12,4528.604,1024,384,12.11
swin_v2_cr_small_384,224.11,1142.266,256,384,49.7
swin_base_patch4_window12_384,212.75,902.462,192,384,87.9
resmlp_big_24_distilled_224,209.1,4897.073,1024,224,129.14
resmlp_big_24_224,208.19,4918.549,1024,224,129.14
resmlp_big_24_224_in22ft1k,208.06,4921.598,1024,224,129.14
xcit_medium_24_p8_224,208.01,4922.91,1024,224,84.32
xcit_medium_24_p8_224_dist,207.93,4924.607,1024,224,84.32
resnest200e,201.92,2535.671,512,320,70.2
volo_d5_224,199.55,5131.435,1024,224,295.46
xcit_small_12_p8_384_dist,193.71,2643.156,512,384,26.21
resnetrs350,188.78,2712.164,512,384,163.96
cait_xs24_384,188.52,2715.867,512,384,26.67
efficientnetv2_l,186.87,2739.852,512,480,118.52
convnext_large_384_in22ft1k,186.32,1373.958,256,384,197.77
tf_efficientnetv2_l,185.75,2756.36,512,480,118.52
tf_efficientnetv2_l_in21ft1k,185.69,2757.265,512,480,118.52
vit_base_patch8_224,182.98,1399.048,256,224,86.58
cait_xxs36_384,179.96,5689.985,1024,384,17.37
volo_d2_384,173.85,1472.545,256,384,58.87
vit_base_r50_s16_384,171.75,2235.786,384,384,98.95
vit_base_resnet50_384,171.74,2235.946,384,384,98.95
swin_v2_cr_huge_224,170.46,2252.707,384,224,657.83
densenet264d_iabn,163.84,6249.837,1024,224,72.74
efficientnet_b7,161.28,595.212,96,600,66.35
nfnet_f2,161.05,6358.083,1024,352,193.78
swin_v2_cr_base_384,160.58,1195.67,192,384,87.88
xcit_large_24_p16_384_dist,158.84,3223.264,512,384,189.1
tf_efficientnet_b7,155.96,615.536,96,600,66.35
tf_efficientnet_b7_ap,155.94,615.594,96,600,66.35
tf_efficientnet_b7_ns,155.94,615.59,96,600,66.35
tresnet_l_448,152.84,6699.593,1024,448,55.99
dm_nfnet_f2,147.62,3468.342,512,352,193.78
cait_s24_384,145.6,3516.436,512,384,47.06
efficientnetv2_xl,141.55,2712.868,384,512,208.12
vit_huge_patch14_224,141.3,7246.972,1024,224,632.05
tf_efficientnetv2_xl_in21ft1k,141.27,2718.106,384,512,208.12
resnetrs420,134.22,3814.641,512,416,191.89
swin_large_patch4_window12_384,125.78,1017.629,128,384,196.74
eca_nfnet_l3,122.95,4164.413,512,448,72.04
convnext_xlarge_384_in22ft1k,122.83,1563.128,192,384,350.2
xcit_large_24_p8_224,118.58,4317.779,512,224,188.93
xcit_large_24_p8_224_dist,118.55,4318.826,512,224,188.93
tresnet_xl_448,113.53,9019.968,1024,448,78.44
efficientnet_cc_b0_8e,109.15,9.151,1,224,24.01
efficientnet_cc_b0_4e,106.25,9.403,1,224,13.31
tf_efficientnet_cc_b0_4e,105.28,9.489,1,224,13.31
tf_efficientnet_cc_b0_8e,105.02,9.512,1,224,24.01
xcit_small_24_p8_384_dist,101.44,5047.074,512,384,47.63
efficientnet_b8,98.72,972.467,96,672,87.41
swin_v2_cr_large_384,98.1,1304.769,128,384,196.68
cait_s36_384,97.37,5258.401,512,384,68.37
resnetv2_152x2_bit_teacher_384,96.99,2639.495,256,384,236.34
tf_efficientnet_b8,96.21,997.82,96,672,87.41
tf_efficientnet_b8_ap,96.12,998.706,96,672,87.41
resnest269e,94.82,4049.739,384,416,110.93
vit_large_patch16_384,94.39,2712.048,256,384,304.72
resnetv2_50x3_bitm,92.92,1377.53,128,448,217.32
vit_giant_patch14_224,92.83,5515.228,512,224,1012.61
nfnet_f3,83.35,6142.681,512,416,254.92
convmixer_1024_20_ks9_p14,82.56,12403.804,1024,224,24.38
beit_large_patch16_384,82.25,3112.33,256,384,305.0
efficientnet_cc_b1_8e,77.14,12.953,1,240,39.72
dm_nfnet_f3,76.92,6655.946,512,416,254.92
volo_d3_448,75.82,2532.146,192,448,86.63
tf_efficientnet_cc_b1_8e,73.25,13.64,1,240,39.72
xcit_medium_24_p8_384_dist,70.98,3606.847,256,384,84.32
resnetv2_152x2_bitm,70.34,2729.533,192,448,236.34
resnetv2_101x3_bitm,57.82,2213.862,128,448,387.93
vit_gigantic_patch14_224,56.06,9133.289,512,224,1844.44
volo_d4_448,55.8,3441.115,192,448,193.41
swin_v2_cr_giant_224,49.27,2597.97,128,224,2598.76
nfnet_f4,44.93,8547.007,384,512,316.07
swin_v2_cr_huge_384,43.72,1463.762,64,384,657.94
dm_nfnet_f4,41.24,9312.061,384,512,316.07
xcit_large_24_p8_384_dist,40.33,4760.775,192,384,188.93
volo_d5_448,38.49,3325.731,128,448,295.91
beit_large_patch16_512,33.13,2897.67,96,512,305.67
nfnet_f5,33.1,11599.778,384,544,377.21
cait_m36_384,31.8,8050.244,256,384,271.22
dm_nfnet_f5,31.5,8127.503,256,544,377.21
volo_d5_512,26.9,4758.532,128,512,296.09
nfnet_f6,25.41,10073.116,256,576,438.36
dm_nfnet_f6,23.41,10936.755,256,576,438.36
nfnet_f7,20.63,9307.125,192,608,499.5
resnetv2_152x4_bitm,18.31,3496.111,64,480,936.53
swin_v2_cr_giant_384,13.76,2325.078,32,384,2598.76
cait_m48_448,13.51,9473.095,128,448,356.46
convmixer_1536_20,13.51,75823.541,1024,224,51.63
hf_public_repos/pytorch-image-models/results/benchmark-infer-amp-nhwc-pt113-cu117-rtx3090.csv
model,infer_samples_per_sec,infer_step_time,infer_batch_size,infer_img_size,infer_gmacs,infer_macts,param_count
tinynet_e,72737.62,14.068,1024,106,0.03,0.69,2.04
mobilenetv3_small_050,54822.3,18.668,1024,224,0.03,0.92,1.59
lcnet_035,53629.35,19.084,1024,224,0.03,1.04,1.64
lcnet_050,45492.41,22.499,1024,224,0.05,1.26,1.88
mobilenetv3_small_075,39215.51,26.102,1024,224,0.05,1.3,2.04
tinynet_d,37346.61,27.409,1024,152,0.05,1.42,2.34
mobilenetv3_small_100,36280.34,28.214,1024,224,0.06,1.42,2.54
tf_mobilenetv3_small_minimal_100,31726.33,32.265,1024,224,0.06,1.41,2.04
tf_mobilenetv3_small_075,31503.43,32.494,1024,224,0.05,1.3,2.04
lcnet_075,29817.69,34.332,1024,224,0.1,1.99,2.36
tf_mobilenetv3_small_100,29444.91,34.767,1024,224,0.06,1.42,2.54
mnasnet_small,25354.86,40.376,1024,224,0.07,2.16,2.03
lcnet_100,24134.76,42.417,1024,224,0.16,2.52,2.95
regnetx_002,23983.4,42.686,1024,224,0.2,2.16,2.68
levit_128s,22675.73,45.148,1024,224,0.31,1.88,7.78
regnety_002,21709.37,47.158,1024,224,0.2,2.17,3.16
mobilenetv2_035,21673.44,47.236,1024,224,0.07,2.86,1.68
mnasnet_050,20010.27,51.163,1024,224,0.11,3.07,2.22
ghostnet_050,18932.82,54.075,1024,224,0.05,1.77,2.59
tinynet_c,18428.42,55.556,1024,184,0.11,2.87,2.46
semnasnet_050,17215.18,59.471,1024,224,0.11,3.44,2.08
mobilenetv2_050,17194.94,59.542,1024,224,0.1,3.64,1.97
cs3darknet_focus_s,16189.76,63.24,1024,256,0.69,2.7,3.27
lcnet_150,15557.15,65.811,1024,224,0.34,3.79,4.5
cs3darknet_s,15369.47,66.615,1024,256,0.72,2.97,3.28
levit_128,15337.67,66.754,1024,224,0.41,2.71,9.21
gernet_s,15288.68,66.966,1024,224,0.75,2.65,8.17
mobilenetv3_large_075,14216.3,72.019,1024,224,0.16,4.0,3.99
mixer_s32_224,14182.92,72.188,1024,224,1.0,2.28,19.1
vit_tiny_r_s16_p8_224,14125.39,72.482,1024,224,0.44,2.06,6.34
resnet10t,14112.07,72.551,1024,224,1.1,2.43,5.44
vit_small_patch32_224,13799.47,74.195,1024,224,1.15,2.5,22.88
regnetx_004,13610.2,75.225,1024,224,0.4,3.14,5.16
levit_192,13524.14,75.706,1024,224,0.66,3.2,10.95
mobilenetv3_rw,12956.58,79.021,1024,224,0.23,4.41,5.48
hardcorenas_a,12803.61,79.966,1024,224,0.23,4.38,5.26
mobilenetv3_large_100,12749.93,80.304,1024,224,0.23,4.41,5.48
mnasnet_075,12532.36,81.697,1024,224,0.23,4.77,3.17
tf_mobilenetv3_large_075,12186.51,84.017,1024,224,0.16,4.0,3.99
tinynet_b,12083.18,84.735,1024,188,0.21,4.44,3.73
regnety_004,11918.36,85.906,1024,224,0.41,3.89,4.34
tf_mobilenetv3_large_minimal_100,11715.94,87.392,1024,224,0.22,4.4,3.92
hardcorenas_c,11548.05,88.662,1024,224,0.28,5.01,5.52
hardcorenas_b,11510.71,88.949,1024,224,0.26,5.09,5.18
ese_vovnet19b_slim_dw,11501.95,89.018,1024,224,0.4,5.28,1.9
ghostnet_100,11332.61,90.348,1024,224,0.15,3.55,5.18
mnasnet_100,11138.43,91.923,1024,224,0.33,5.46,4.38
gluon_resnet18_v1b,11098.78,92.252,1024,224,1.82,2.48,11.69
resnet18,11083.1,92.383,1024,224,1.82,2.48,11.69
swsl_resnet18,11062.48,92.555,1024,224,1.82,2.48,11.69
ssl_resnet18,11061.11,92.565,1024,224,1.82,2.48,11.69
tf_mobilenetv3_large_100,11018.56,92.922,1024,224,0.23,4.41,5.48
mnasnet_b1,10993.58,93.135,1024,224,0.33,5.46,4.38
hardcorenas_d,10910.47,93.843,1024,224,0.3,4.93,7.5
semnasnet_075,10898.09,93.951,1024,224,0.23,5.54,2.91
mobilenetv2_075,10893.76,93.988,1024,224,0.22,5.86,2.64
seresnet18,10385.56,98.588,1024,224,1.82,2.49,11.78
legacy_seresnet18,10064.41,101.734,1024,224,1.82,2.49,11.78
spnasnet_100,10009.21,102.296,1024,224,0.35,6.03,4.42
tf_efficientnetv2_b0,9930.95,103.1,1024,224,0.73,4.77,7.14
levit_256,9858.1,103.863,1024,224,1.13,4.23,18.89
tinynet_a,9720.11,105.337,1024,192,0.35,5.41,6.19
hardcorenas_f,9714.91,105.393,1024,224,0.35,5.57,8.2
semnasnet_100,9623.78,106.393,1024,224,0.32,6.23,3.89
mnasnet_a1,9623.77,106.393,1024,224,0.32,6.23,3.89
mobilenetv2_100,9598.91,106.667,1024,224,0.31,6.68,3.5
hardcorenas_e,9571.87,106.966,1024,224,0.35,5.65,8.07
dla46_c,9568.4,107.007,1024,224,0.58,4.5,1.3
efficientnet_lite0,9361.14,109.377,1024,224,0.4,6.74,4.65
fbnetc_100,9352.03,109.484,1024,224,0.4,6.51,5.57
resnet18d,9334.83,109.687,1024,224,2.06,3.29,11.71
ese_vovnet19b_slim,9109.47,112.4,1024,224,1.69,3.52,3.17
regnety_006,9097.63,112.542,1024,224,0.61,4.33,6.06
regnetz_005,8607.49,118.955,1024,224,0.52,5.86,7.12
xcit_nano_12_p16_224_dist,8577.2,119.375,1024,224,0.56,4.17,3.05
xcit_nano_12_p16_224,8554.61,119.689,1024,224,0.56,4.17,3.05
levit_256d,8382.88,122.143,1024,224,1.4,4.93,26.21
regnetx_006,8379.52,122.192,1024,224,0.61,3.98,6.2
ghostnet_130,8278.59,123.681,1024,224,0.24,4.6,7.36
tf_efficientnet_lite0,8080.51,126.714,1024,224,0.4,6.74,4.65
efficientnet_b0,7965.17,128.548,1024,224,0.4,6.75,5.29
mnasnet_140,7779.42,131.618,1024,224,0.6,7.71,7.12
deit_tiny_distilled_patch16_224,7467.68,137.113,1024,224,1.27,6.01,5.91
rexnetr_100,7464.12,137.179,1024,224,0.43,7.72,4.88
deit_tiny_patch16_224,7430.15,137.806,1024,224,1.26,5.97,5.72
resnet14t,7429.68,137.815,1024,224,1.69,5.8,10.08
vit_tiny_patch16_224,7424.93,137.902,1024,224,1.26,5.97,5.72
regnetx_008,7394.88,138.463,1024,224,0.81,5.15,7.26
mobilenetv2_110d,7247.12,141.287,1024,224,0.45,8.71,4.52
hrnet_w18_small,7232.93,141.561,1024,224,1.61,5.72,13.19
tf_efficientnet_b0,7016.18,145.938,1024,224,0.4,6.75,5.29
regnety_008,6938.46,147.571,1024,224,0.81,5.25,6.26
mobilevitv2_050,6848.87,149.503,1024,256,0.48,8.04,1.37
pit_ti_distilled_224,6811.68,150.317,1024,224,0.71,6.23,5.1
pit_ti_224,6784.24,150.927,1024,224,0.7,6.19,4.85
gernet_m,6679.85,153.286,1024,224,3.02,5.24,21.14
efficientnet_b1_pruned,6642.37,154.15,1024,240,0.4,6.21,6.33
resnet34,6496.42,157.614,1024,224,3.67,3.74,21.8
gluon_resnet34_v1b,6494.61,157.658,1024,224,3.67,3.74,21.8
tv_resnet34,6481.01,157.989,1024,224,3.67,3.74,21.8
tf_efficientnetv2_b1,6476.52,158.098,1024,240,1.21,7.34,8.14
semnasnet_140,6454.5,158.637,1024,224,0.6,8.87,6.11
nf_regnet_b0,6452.24,158.693,1024,256,0.64,5.58,8.76
ese_vovnet19b_dw,6335.13,161.627,1024,224,1.34,8.25,6.54
mobilenetv2_140,6271.56,163.266,1024,224,0.6,9.57,6.11
rexnet_100,6226.48,164.447,1024,224,0.41,7.44,4.8
efficientnet_lite1,6187.91,165.472,1024,240,0.62,10.14,5.42
efficientnet_es_pruned,6115.4,167.434,1024,224,1.81,8.73,5.44
efficientnet_es,6115.12,167.443,1024,224,1.81,8.73,5.44
visformer_tiny,6103.09,167.772,1024,224,1.27,5.72,10.32
seresnet34,6058.13,169.019,1024,224,3.67,3.74,21.96
fbnetv3_b,6018.76,170.124,1024,256,0.55,9.1,8.6
selecsls42,5953.76,171.98,1024,224,2.94,4.62,30.35
selecsls42b,5921.2,172.924,1024,224,2.98,4.62,32.46
resnet26,5895.21,173.69,1024,224,2.36,7.35,16.0
edgenext_xx_small,5893.72,173.732,1024,288,0.33,4.21,1.33
levit_384,5880.4,174.126,1024,224,2.36,6.26,39.13
resnet34d,5865.98,174.555,1024,224,3.91,4.54,21.82
legacy_seresnet34,5850.24,175.025,1024,224,3.67,3.74,21.96
dla34,5827.3,175.712,1024,224,3.07,5.02,15.74
tf_efficientnet_es,5781.29,177.112,1024,224,1.81,8.73,5.44
cs3darknet_focus_m,5721.39,178.967,1024,288,2.51,6.19,9.3
resnetblur18,5636.65,181.657,1024,224,2.34,3.39,11.69
rexnetr_130,5590.0,183.173,1024,224,0.68,9.81,7.61
mobilevit_xxs,5524.87,185.333,1024,256,0.42,8.34,1.27
tf_efficientnet_lite1,5524.68,185.339,1024,240,0.62,10.14,5.42
cs3darknet_m,5478.07,186.916,1024,288,2.63,6.69,9.31
convnext_atto,5460.54,187.516,1024,288,0.91,6.3,3.7
xcit_tiny_12_p16_224_dist,5457.72,187.611,1024,224,1.24,6.29,6.72
xcit_tiny_12_p16_224,5456.63,187.649,1024,224,1.24,6.29,6.72
skresnet18,5413.1,189.159,1024,224,1.82,3.24,11.96
darknet17,5401.37,189.571,1024,256,3.26,7.18,14.3
mixnet_s,5392.58,189.878,1024,224,0.25,6.25,4.13
resmlp_12_224,5366.15,190.814,1024,224,3.01,5.5,15.35
resmlp_12_distilled_224,5364.91,190.857,1024,224,3.01,5.5,15.35
convnext_atto_ols,5288.94,193.6,1024,288,0.96,6.8,3.7
vit_base_patch32_clip_224,5280.68,193.903,1024,224,4.41,5.01,88.22
vit_base_patch32_224,5280.52,193.908,1024,224,4.41,5.01,88.22
pit_xs_distilled_224,5272.13,194.218,1024,224,1.41,7.76,11.0
pit_xs_224,5271.0,194.259,1024,224,1.4,7.71,10.62
repvgg_b0,5252.66,194.939,1024,224,3.41,6.15,15.82
mixer_b32_224,5221.71,196.094,1024,224,3.24,6.29,60.29
pvt_v2_b0,5210.31,196.521,1024,224,0.57,7.99,3.67
resnetaa34d,5171.78,197.986,1024,224,4.43,5.07,21.82
selecsls60,5160.83,198.407,1024,224,3.59,5.52,30.67
selecsls60b,5119.51,200.008,1024,224,3.63,5.52,32.77
mobilenetv2_120d,5111.95,200.304,1024,224,0.69,11.97,5.83
resnet26d,5108.26,200.449,1024,224,2.6,8.15,16.01
gmixer_12_224,5064.97,202.162,1024,224,2.67,7.26,12.7
gmlp_ti16_224,5007.93,204.464,1024,224,1.34,7.55,5.87
mixer_s16_224,4998.69,204.842,1024,224,3.79,5.97,18.53
tf_mixnet_s,4989.18,205.231,1024,224,0.25,6.25,4.13
efficientnet_b0_g16_evos,4930.67,207.667,1024,224,1.01,7.42,8.11
rexnetr_150,4900.22,208.959,1024,224,0.89,11.13,9.78
fbnetv3_d,4881.14,209.776,1024,256,0.68,11.1,10.31
darknet21,4850.41,211.105,1024,256,3.93,7.47,20.86
nf_resnet26,4816.48,212.591,1024,224,2.41,7.35,16.0
efficientnet_lite2,4781.65,214.14,1024,260,0.89,12.9,6.09
convnext_femto,4749.12,215.607,1024,288,1.3,7.56,5.22
tf_efficientnetv2_b2,4718.26,217.018,1024,260,1.72,9.84,10.1
sedarknet21,4656.51,219.895,1024,256,3.93,7.47,20.95
dla46x_c,4636.77,220.831,1024,224,0.54,5.66,1.07
convnext_femto_ols,4618.33,221.714,1024,288,1.35,8.06,5.23
resnext26ts,4603.25,222.441,1024,256,2.43,10.52,10.3
efficientformer_l1,4566.14,224.248,1024,224,1.3,5.53,12.29
dpn48b,4506.78,227.201,1024,224,1.69,8.92,9.13
crossvit_tiny_240,4481.69,228.473,1024,240,1.57,9.08,7.01
dla60x_c,4459.27,229.622,1024,224,0.59,6.01,1.32
eca_resnext26ts,4456.63,229.759,1024,256,2.43,10.52,10.3
seresnext26ts,4453.99,229.896,1024,256,2.43,10.52,10.39
legacy_seresnext26_32x4d,4441.15,230.558,1024,224,2.49,9.39,16.79
gernet_l,4396.56,232.898,1024,256,4.57,8.0,31.08
mobilevitv2_075,4393.87,233.041,1024,256,1.05,12.06,2.87
gcresnext26ts,4384.92,233.516,1024,256,2.43,10.53,10.48
tf_efficientnet_b1,4370.6,234.282,1024,240,0.71,10.88,7.79
tf_efficientnet_lite2,4293.9,238.467,1024,260,0.89,12.9,6.09
rexnet_130,4262.16,240.243,1024,224,0.68,9.71,7.56
efficientnet_b1,4239.44,241.53,1024,256,0.77,12.22,7.79
vit_small_patch32_384,4239.1,241.55,1024,384,3.45,8.25,22.92
crossvit_9_240,4212.37,243.082,1024,240,1.85,9.52,8.55
crossvit_9_dagger_240,4095.03,250.049,1024,240,1.99,9.97,8.78
nf_ecaresnet26,4091.86,250.24,1024,224,2.41,7.36,16.0
nf_seresnet26,4088.47,250.449,1024,224,2.41,7.36,17.4
efficientnet_cc_b0_8e,4076.51,251.183,1024,224,0.42,9.42,24.01
efficientnet_cc_b0_4e,4073.3,251.382,1024,224,0.41,9.42,13.31
ecaresnet50d_pruned,4055.39,252.492,1024,224,2.53,6.43,19.94
efficientnet_b2_pruned,4030.92,254.025,1024,260,0.73,9.13,8.31
ecaresnext50t_32x4d,4018.73,254.796,1024,224,2.7,10.09,15.41
ecaresnext26t_32x4d,4017.09,254.9,1024,224,2.7,10.09,15.41
seresnext26t_32x4d,4014.43,255.069,1024,224,2.7,10.09,16.81
seresnext26tn_32x4d,4014.36,255.074,1024,224,2.7,10.09,16.81
repvgg_a2,3987.84,256.77,1024,224,5.7,6.26,28.21
poolformer_s12,3982.67,257.103,1024,224,1.82,5.53,11.92
seresnext26d_32x4d,3979.57,257.303,1024,224,2.73,10.19,16.81
vit_tiny_r_s16_p8_384,3963.05,258.374,1024,384,1.34,6.49,6.36
resnet26t,3939.46,259.923,1024,256,3.35,10.52,16.01
nf_regnet_b1,3911.64,261.772,1024,288,1.02,9.2,10.22
rexnet_150,3881.93,263.775,1024,224,0.9,11.21,9.73
nf_regnet_b2,3879.78,263.921,1024,272,1.22,9.27,14.31
resnetv2_50,3865.49,264.896,1024,224,4.11,11.11,25.55
regnetx_016,3852.41,265.794,1024,224,1.62,7.93,9.19
tf_efficientnet_cc_b0_4e,3812.08,268.608,1024,224,0.41,9.42,13.31
tf_efficientnet_cc_b0_8e,3803.67,269.202,1024,224,0.42,9.42,24.01
convnext_pico,3747.49,273.239,1024,288,2.27,10.08,9.05
ecaresnetlight,3744.45,273.459,1024,224,4.11,8.42,30.16
dpn68,3724.59,274.917,1024,224,2.35,10.47,12.61
edgenext_x_small,3714.71,275.646,1024,288,0.68,7.5,2.34
gluon_resnet50_v1b,3672.76,278.798,1024,224,4.11,11.11,25.56
ssl_resnet50,3671.85,278.866,1024,224,4.11,11.11,25.56
efficientnet_em,3671.25,278.913,1024,240,3.04,14.34,6.9
resnet50,3668.58,279.116,1024,224,4.11,11.11,25.56
swsl_resnet50,3668.32,279.136,1024,224,4.11,11.11,25.56
tv_resnet50,3667.14,279.225,1024,224,4.11,11.11,25.56
dpn68b,3667.07,279.229,1024,224,2.35,10.47,12.61
rexnetr_200,3659.45,279.811,1024,224,1.59,15.11,16.52
convnext_pico_ols,3651.34,280.434,1024,288,2.37,10.74,9.06
botnet26t_256,3594.28,284.883,1024,256,3.32,11.98,12.49
bat_resnext26ts,3569.91,286.828,1024,256,2.53,12.51,10.73
resnetv2_50t,3547.32,288.657,1024,224,4.32,11.82,25.57
mixnet_m,3537.26,289.477,1024,224,0.36,8.19,5.01
regnety_016,3531.88,289.919,1024,224,1.63,8.04,11.2
tf_efficientnet_em,3529.62,290.106,1024,240,3.04,14.34,6.9
resnetv2_50d,3525.02,290.482,1024,224,4.35,11.92,25.57
halonet26t,3515.15,291.299,1024,256,3.19,11.69,12.48
resnet32ts,3492.62,293.179,1024,256,4.63,11.58,17.96
hrnet_w18_small_v2,3482.81,294.001,1024,224,2.62,9.65,15.6
gluon_resnet50_v1c,3481.59,294.107,1024,224,4.35,11.92,25.58
dla60,3466.91,295.351,1024,224,4.26,10.16,22.04
resnet33ts,3460.78,295.875,1024,256,4.76,11.66,19.68
tf_efficientnet_b2,3402.3,300.962,1024,260,1.02,13.83,9.11
convit_tiny,3399.61,301.199,1024,224,1.26,7.94,5.71
resnet50t,3373.72,303.51,1024,224,4.32,11.82,25.57
tf_mixnet_m,3366.38,304.167,1024,224,0.36,8.19,5.01
efficientnet_b3_pruned,3360.1,304.74,1024,300,1.04,11.86,9.86
seresnet33ts,3354.27,305.27,1024,256,4.76,11.66,19.78
resnet50d,3351.47,305.527,1024,224,4.35,11.92,25.58
eca_resnet33ts,3350.95,305.574,1024,256,4.76,11.66,19.68
vit_small_resnet26d_224,3346.77,305.954,1024,224,5.07,11.12,63.61
cs3darknet_focus_l,3335.18,307.018,1024,288,5.9,10.16,21.15
gluon_resnet50_v1d,3334.65,307.068,1024,224,4.35,11.92,25.58
mobilevitv2_100,3324.63,307.994,1024,256,1.84,16.08,4.9
vovnet39a,3320.12,308.408,1024,224,7.09,6.73,22.6
legacy_seresnet50,3312.33,309.135,1024,224,3.88,10.6,28.09
efficientnet_b0_gn,3307.86,309.554,1024,224,0.42,6.75,5.29
gcresnet33ts,3307.01,309.633,1024,256,4.76,11.68,19.88
pit_s_distilled_224,3301.25,310.173,1024,224,2.9,11.64,24.04
pit_s_224,3299.97,310.295,1024,224,2.88,11.56,23.46
mobilevit_xs,3252.28,314.844,1024,256,1.05,16.33,2.32
deit_small_distilled_patch16_224,3233.6,316.663,1024,224,4.63,12.02,22.44
efficientnet_b2a,3223.97,317.608,1024,288,1.12,16.2,9.11
efficientnet_b2,3223.9,317.615,1024,288,1.12,16.2,9.11
deit_small_patch16_224,3218.99,318.1,1024,224,4.61,11.95,22.05
vit_small_patch16_224,3218.38,318.16,1024,224,4.61,11.95,22.05
cs3darknet_l,3210.26,318.965,1024,288,6.16,10.83,21.16
ese_vovnet39b,3206.21,319.369,1024,224,7.09,6.74,24.57
eca_vovnet39b,3203.77,319.612,1024,224,7.09,6.74,22.6
convnextv2_atto,3196.73,320.315,1024,288,0.91,6.3,3.71
coatnet_pico_rw_224,3189.82,321.008,1024,224,2.05,14.62,10.85
seresnet50,3181.57,321.841,1024,224,4.11,11.13,28.09
pvt_v2_b1,3147.37,325.339,1024,224,2.12,15.39,14.01
coat_lite_tiny,3146.41,325.439,1024,224,1.6,11.65,5.72
res2net50_48w_2s,3127.52,327.404,1024,224,4.18,11.72,25.29
eca_botnext26ts_256,3112.32,329.003,1024,256,2.46,11.6,10.59
ecaresnet101d_pruned,3103.16,329.973,1024,224,3.48,7.69,24.88
efficientnet_b0_g8_gn,3073.2,333.192,1024,224,0.66,6.75,6.56
ssl_resnext50_32x4d,3071.68,333.356,1024,224,4.26,14.4,25.03
dla60x,3071.64,333.359,1024,224,3.54,13.8,17.35
swsl_resnext50_32x4d,3070.7,333.464,1024,224,4.26,14.4,25.03
tv_resnext50_32x4d,3069.81,333.56,1024,224,4.26,14.4,25.03
resnext50_32x4d,3069.72,333.57,1024,224,4.26,14.4,25.03
gluon_resnext50_32x4d,3068.47,333.704,1024,224,4.26,14.4,25.03
vit_small_r26_s32_224,3061.92,334.417,1024,224,3.56,9.85,36.43
skresnet34,3055.95,335.073,1024,224,3.67,5.13,22.28
deit3_small_patch16_224_in21ft1k,3048.82,335.855,1024,224,4.61,11.95,22.06
deit3_small_patch16_224,3047.23,336.031,1024,224,4.61,11.95,22.06
eca_halonext26ts,3035.71,337.305,1024,256,2.44,11.46,10.76
haloregnetz_b,3032.47,337.665,1024,224,1.97,11.94,11.68
vit_relpos_base_patch32_plus_rpn_256,3026.45,338.338,1024,256,7.68,8.01,119.42
vit_relpos_small_patch16_rpn_224,3019.95,339.067,1024,224,4.59,13.05,21.97
vit_relpos_small_patch16_224,3008.26,340.383,1024,224,4.59,13.05,21.98
vit_srelpos_small_patch16_224,3000.96,341.213,1024,224,4.59,12.16,21.97
xcit_nano_12_p16_384_dist,3000.48,341.266,1024,384,1.64,12.15,3.05
cs3sedarknet_l,2995.41,341.845,1024,288,6.16,10.83,21.91
resnetaa50d,2993.03,342.116,1024,224,5.39,12.44,25.58
vgg11,2983.47,85.796,256,224,7.61,7.44,132.86
selecsls84,2973.16,344.402,1024,224,5.9,7.57,50.95
resnetrs50,2963.42,345.535,1024,224,4.48,12.14,35.69
seresnet50t,2957.12,346.271,1024,224,4.32,11.83,28.1
resnest14d,2954.69,346.556,1024,224,2.76,7.33,10.61
gluon_resnet50_v1s,2953.65,346.677,1024,224,5.47,13.52,25.68
coat_lite_mini,2952.61,346.799,1024,224,2.0,12.25,11.01
ecaresnet50d,2945.96,347.583,1024,224,4.35,11.93,25.58
densenet121,2933.45,349.064,1024,224,2.87,6.9,7.98
tv_densenet121,2929.69,349.514,1024,224,2.87,6.9,7.98
vit_base_patch32_plus_256,2929.65,349.519,1024,256,7.79,7.76,119.48
rexnet_200,2927.94,349.723,1024,224,1.56,14.91,16.37
xcit_tiny_24_p16_224_dist,2927.0,349.834,1024,224,2.34,11.82,12.12
xcit_tiny_24_p16_224,2921.97,350.436,1024,224,2.34,11.82,12.12
coatnet_nano_cc_224,2867.38,357.108,1024,224,2.24,15.02,13.76
gcresnext50ts,2857.34,358.363,1024,256,3.75,15.46,15.67
lambda_resnet26rpt_256,2853.55,358.839,1024,256,3.16,11.87,10.99
resnext50d_32x4d,2845.08,359.908,1024,224,4.5,15.2,25.05
mixnet_l,2828.6,361.996,1024,224,0.58,10.84,7.33
densenet121d,2824.08,362.584,1024,224,3.11,7.7,8.0
efficientnet_lite3,2821.84,362.87,1024,300,1.65,21.85,8.2
cspresnet50,2793.65,366.534,1024,256,4.54,11.5,21.62
coatnet_nano_rw_224,2781.93,368.077,1024,224,2.41,15.41,15.14
vgg11_bn,2760.38,370.949,1024,224,7.62,7.44,132.87
vovnet57a,2755.77,371.572,1024,224,8.95,7.52,36.64
resmlp_24_224,2750.33,372.306,1024,224,5.96,10.91,30.02
resmlp_24_distilled_224,2740.33,373.665,1024,224,5.96,10.91,30.02
convnextv2_femto,2735.91,374.269,1024,288,1.3,7.56,5.23
flexivit_small,2735.78,374.287,1024,240,5.35,14.18,22.06
gcresnet50t,2732.04,374.8,1024,256,5.42,14.67,25.9
legacy_seresnext50_32x4d,2722.84,376.065,1024,224,4.26,14.42,27.56
seresnext50_32x4d,2721.47,376.256,1024,224,4.26,14.42,27.56
gluon_seresnext50_32x4d,2720.58,376.379,1024,224,4.26,14.42,27.56
visformer_small,2719.93,376.468,1024,224,4.88,11.43,40.22
twins_svt_small,2713.39,377.374,1024,224,2.94,13.75,24.06
resnetv2_50x1_bit_distilled,2708.81,378.014,1024,224,4.23,11.11,25.55
res2net50_14w_8s,2692.9,380.248,1024,224,4.21,13.28,25.06
resnetblur50,2685.97,381.228,1024,224,5.16,12.02,25.56
vit_base_resnet26d_224,2684.6,381.421,1024,224,6.97,13.16,101.4
tf_mixnet_l,2680.8,381.958,1024,224,0.58,10.84,7.33
seresnetaa50d,2658.93,385.106,1024,224,5.4,12.46,28.11
dla60_res2net,2656.16,385.506,1024,224,4.15,12.34,20.85
cspresnet50d,2655.05,385.668,1024,256,4.86,12.55,21.64
coatnext_nano_rw_224,2655.0,385.674,1024,224,2.47,12.8,14.7
ese_vovnet57b,2654.33,385.773,1024,224,8.95,7.52,38.61
tf_efficientnetv2_b3,2654.14,385.8,1024,300,3.04,15.74,14.36
cspresnet50w,2641.68,387.621,1024,256,5.04,12.19,28.12
res2net50_26w_4s,2629.64,389.395,1024,224,4.28,12.61,25.7
regnetz_b16,2626.71,389.828,1024,288,2.39,16.43,9.72
convnext_nano,2611.78,392.059,1024,288,4.06,13.84,15.59
efficientnetv2_rw_t,2601.49,393.609,1024,288,3.19,16.42,13.65
fbnetv3_g,2595.29,394.549,1024,288,1.77,21.09,16.62
gmixer_24_224,2595.15,394.571,1024,224,5.28,14.45,24.72
mobilevit_s,2586.09,395.952,1024,256,2.03,19.94,5.58
coatnet_rmlp_nano_rw_224,2569.7,398.478,1024,224,2.62,20.34,15.15
gcvit_xxtiny,2561.41,399.768,1024,224,2.14,15.36,12.0
tf_efficientnet_lite3,2530.94,404.582,1024,300,1.65,21.85,8.2
efficientnet_cc_b1_8e,2530.65,404.628,1024,240,0.75,15.44,39.72
densenetblur121d,2522.66,405.908,1024,224,3.11,7.9,8.0
resnetblur50d,2509.45,408.045,1024,224,5.4,12.82,25.58
nf_ecaresnet50,2490.39,411.168,1024,224,4.21,11.13,25.56
inception_v3,2485.21,412.025,1024,299,5.73,8.97,23.83
nf_seresnet50,2482.66,412.449,1024,224,4.21,11.13,28.09
tf_inception_v3,2481.38,412.658,1024,299,5.73,8.97,23.83
gc_efficientnetv2_rw_t,2480.59,412.793,1024,288,3.2,16.45,13.68
adv_inception_v3,2479.41,412.983,1024,299,5.73,8.97,23.83
repvgg_b1g4,2473.34,414.003,1024,224,8.15,10.64,39.97
mobilevitv2_125,2472.28,414.18,1024,256,2.86,20.1,7.48
gluon_inception_v3,2468.42,414.827,1024,299,5.73,8.97,23.83
nf_regnet_b3,2461.52,415.991,1024,320,2.05,14.61,18.59
xcit_small_12_p16_224_dist,2446.89,418.478,1024,224,4.82,12.58,26.25
xcit_small_12_p16_224,2446.42,418.558,1024,224,4.82,12.58,26.25
cspresnext50,2438.96,419.836,1024,256,4.05,15.86,20.57
convnext_nano_ols,2435.0,420.521,1024,288,4.38,15.5,15.65
regnetx_032,2429.42,421.489,1024,224,3.2,11.37,15.3
densenet169,2426.29,422.031,1024,224,3.4,7.3,14.15
sehalonet33ts,2419.4,423.234,1024,256,3.55,14.7,13.69
tf_efficientnet_cc_b1_8e,2406.19,425.557,1024,240,0.75,15.44,39.72
semobilevit_s,2402.02,426.294,1024,256,2.03,19.95,5.74
resnetv2_101,2330.6,439.36,1024,224,7.83,16.23,44.54
twins_pcpvt_small,2312.72,442.754,1024,224,3.83,18.08,24.11
xcit_nano_12_p8_224_dist,2295.5,446.077,1024,224,2.16,15.71,3.05
xcit_nano_12_p8_224,2292.87,446.587,1024,224,2.16,15.71,3.05
gmlp_s16_224,2290.73,447.007,1024,224,4.42,15.1,19.42
cs3darknet_focus_x,2287.2,447.697,1024,256,8.03,10.69,35.02
vit_base_r26_s32_224,2275.25,450.047,1024,224,6.81,12.36,101.38
gluon_resnet101_v1b,2260.37,453.01,1024,224,7.83,16.23,44.55
tv_resnet101,2258.59,453.368,1024,224,7.83,16.23,44.55
resnet101,2258.28,453.43,1024,224,7.83,16.23,44.55
skresnet50,2234.62,458.23,1024,224,4.11,12.5,25.8
ecaresnet26t,2232.29,458.709,1024,320,5.24,16.44,16.01
edgenext_small,2226.69,459.86,1024,320,1.97,14.16,5.59
dla102,2219.96,461.255,1024,224,7.19,14.18,33.27
res2next50,2214.71,462.347,1024,224,4.2,13.71,24.67
dla60_res2next,2210.67,463.194,1024,224,3.49,13.17,17.03
resnetv2_101d,2203.82,464.633,1024,224,8.07,17.04,44.56
gluon_resnet101_v1c,2194.65,466.578,1024,224,8.08,17.04,44.57
resnest26d,2170.04,471.869,1024,224,3.64,9.97,17.07
vgg13,2149.71,476.331,1024,224,11.31,12.25,133.05
gluon_resnet101_v1d,2137.49,479.053,1024,224,8.08,17.04,44.57
skresnet50d,2115.22,484.098,1024,224,4.36,13.31,25.82
convnextv2_pico,2108.5,485.64,1024,288,2.27,10.08,9.07
vit_base_resnet50d_224,2101.17,487.333,1024,224,8.73,16.92,110.97
coatnet_0_rw_224,2082.49,491.706,1024,224,4.43,18.73,27.44
crossvit_small_240,2081.5,491.94,1024,240,5.63,18.17,26.86
deit3_medium_patch16_224_in21ft1k,2076.53,493.118,1024,224,8.0,15.93,38.85
deit3_medium_patch16_224,2072.34,494.116,1024,224,8.0,15.93,38.85
mobilevitv2_150,2071.36,494.349,1024,256,4.09,24.11,10.59
mobilevitv2_150_in22ft1k,2070.3,494.603,1024,256,4.09,24.11,10.59
sebotnet33ts_256,2067.91,247.581,512,256,3.89,17.46,13.7
wide_resnet50_2,2057.08,497.78,1024,224,11.43,14.4,68.88
vit_relpos_medium_patch16_rpn_224,2044.85,500.757,1024,224,7.97,17.02,38.73
efficientformer_l3,2041.79,501.507,1024,224,3.93,12.01,31.41
poolformer_s24,2040.35,501.863,1024,224,3.41,10.68,21.39
vit_relpos_medium_patch16_224,2037.47,502.572,1024,224,7.97,17.02,38.75
cspdarknet53,2035.94,502.949,1024,256,6.57,16.81,27.64
resnet51q,2034.41,503.329,1024,288,8.07,20.94,35.7
vit_srelpos_medium_patch16_224,2033.15,503.638,1024,224,7.96,16.21,38.74
maxvit_rmlp_pico_rw_256,2008.78,509.748,1024,256,1.85,24.86,7.52
vit_relpos_medium_patch16_cls_224,2007.24,510.141,1024,224,8.03,18.24,38.76
dla102x,2006.55,510.315,1024,224,5.89,19.42,26.31
legacy_seresnet101,2003.12,511.188,1024,224,7.61,15.74,49.33
swin_tiny_patch4_window7_224,1995.14,513.235,1024,224,4.51,17.06,28.29
repvgg_b1,1985.42,515.747,1024,224,13.16,10.64,57.42
resnetaa101d,1982.98,516.381,1024,224,9.12,17.56,44.57
coatnet_rmlp_0_rw_224,1981.75,516.703,1024,224,4.72,24.89,27.45
tf_efficientnet_b3,1975.92,518.226,1024,300,1.87,23.83,12.23
gcvit_xtiny,1969.68,519.869,1024,224,2.93,20.26,19.98
hrnet_w18,1967.17,520.531,1024,224,4.32,16.31,21.3
gluon_resnet101_v1s,1965.68,520.926,1024,224,9.19,18.64,44.67
maxvit_pico_rw_256,1965.38,521.006,1024,256,1.83,22.3,7.46
resnetaa50,1958.15,522.93,1024,288,8.52,19.24,25.56
seresnet101,1954.63,523.871,1024,224,7.84,16.27,49.33
efficientnet_b3,1949.54,525.239,1024,320,2.01,26.52,12.23
efficientnet_b3a,1949.11,525.356,1024,320,2.01,26.52,12.23
edgenext_small_rw,1932.68,529.816,1024,320,2.46,14.85,7.83
regnetx_040,1932.62,529.839,1024,224,3.99,12.2,22.12
cs3sedarknet_xdw,1925.4,531.825,1024,256,5.97,17.18,21.6
coatnet_bn_0_rw_224,1920.71,533.123,1024,224,4.67,22.04,27.44
xcit_tiny_12_p16_384_dist,1911.65,535.652,1024,384,3.64,18.26,6.72
ssl_resnext101_32x4d,1910.73,535.909,1024,224,8.01,21.23,44.18
swsl_resnext101_32x4d,1910.43,535.993,1024,224,8.01,21.23,44.18
resnext101_32x4d,1909.99,536.115,1024,224,8.01,21.23,44.18
gluon_resnext101_32x4d,1909.34,536.298,1024,224,8.01,21.23,44.18
darknet53,1903.77,537.866,1024,288,11.78,15.68,41.61
darknetaa53,1898.12,539.468,1024,288,10.08,15.68,36.02
crossvit_15_240,1892.46,541.083,1024,240,5.81,19.77,27.53
halonet50ts,1881.53,544.226,1024,256,5.3,19.2,22.73
vgg13_bn,1879.72,544.749,1024,224,11.33,12.25,133.05
mixnet_xl,1872.46,546.86,1024,224,0.93,14.57,11.9
res2net50_26w_6s,1870.88,547.321,1024,224,6.33,15.28,37.05
ecaresnet101d,1869.88,547.616,1024,224,8.08,17.07,44.57
densenet201,1869.57,547.706,1024,224,4.34,7.85,20.01
nf_resnet101,1858.48,550.976,1024,224,8.01,16.23,44.55
coatnet_0_224,1857.28,275.661,512,224,4.58,24.01,25.04
pvt_v2_b2,1854.85,552.053,1024,224,4.05,27.53,25.36
crossvit_15_dagger_240,1850.69,553.295,1024,240,6.13,20.43,28.21
resmlp_36_224,1846.41,554.574,1024,224,8.91,16.33,44.69
resmlp_36_distilled_224,1845.04,554.99,1024,224,8.91,16.33,44.69
resnet61q,1841.84,555.954,1024,288,9.87,21.52,36.85
swin_s3_tiny_224,1817.5,563.398,1024,224,4.64,19.13,28.33
cait_xxs24_224,1796.55,569.968,1024,224,2.53,20.29,11.96
cs3darknet_x,1789.33,572.268,1024,288,10.6,14.36,35.05
vit_medium_patch16_gap_240,1785.54,573.481,1024,240,9.22,18.81,44.4
nf_resnet50,1784.84,573.708,1024,288,6.88,18.37,25.56
resnet50_gn,1764.31,580.385,1024,224,4.14,11.11,25.56
mixer_b16_224_miil,1761.45,581.327,1024,224,12.62,14.53,59.88
mixer_b16_224,1759.76,581.885,1024,224,12.62,14.53,59.88
resnetblur101d,1757.96,582.482,1024,224,9.12,17.94,44.57
eca_nfnet_l0,1726.58,593.068,1024,288,7.12,17.29,24.14
nfnet_l0,1721.83,594.705,1024,288,7.13,17.29,35.07
vit_large_patch32_224,1717.59,596.169,1024,224,15.41,13.32,327.9
vgg16,1717.44,596.224,1024,224,15.47,13.56,138.36
regnetz_c16,1710.89,598.505,1024,320,3.92,25.88,13.46
pvt_v2_b2_li,1709.89,598.855,1024,224,3.91,27.6,22.55
resnest50d_1s4x24d,1705.52,600.391,1024,224,4.43,13.57,25.68
coat_lite_small,1704.55,600.733,1024,224,3.96,22.09,19.84
resnetv2_50d_frn,1697.1,603.368,1024,224,4.33,11.92,25.59
cs3sedarknet_x,1689.8,605.975,1024,288,10.6,14.37,35.4
seresnext101_32x4d,1687.65,606.747,1024,224,8.02,21.26,48.96
gluon_seresnext101_32x4d,1687.1,606.945,1024,224,8.02,21.26,48.96
legacy_seresnext101_32x4d,1684.69,607.813,1024,224,8.02,21.26,48.96
regnetv_040,1682.92,608.454,1024,288,6.6,20.3,20.64
mobilevitv2_175,1677.66,457.769,768,256,5.54,28.13,14.25
regnety_040,1677.03,610.59,1024,288,6.61,20.3,20.65
mobilevitv2_175_in22ft1k,1677.0,457.949,768,256,5.54,28.13,14.25
convnext_tiny_hnf,1676.16,610.908,1024,288,7.39,22.21,28.59
res2net101_26w_4s,1675.37,611.195,1024,224,8.1,18.45,45.21
vit_tiny_patch16_384,1665.76,614.72,1024,384,4.7,25.39,5.79
sequencer2d_s,1661.32,616.362,1024,224,4.96,11.31,27.65
ese_vovnet39b_evos,1661.21,616.404,1024,224,7.07,6.74,24.58
vit_base_patch32_384,1649.27,620.868,1024,384,13.06,16.5,88.3
vit_base_patch32_clip_384,1648.64,621.105,1024,384,13.06,16.5,88.3
mixer_l32_224,1645.23,622.393,1024,224,11.27,19.86,206.94
convnext_tiny,1642.14,623.562,1024,288,7.39,22.21,28.59
botnet50ts_256,1639.64,312.25,512,256,5.54,22.23,22.74
swinv2_cr_tiny_224,1630.02,628.199,1024,224,4.66,28.45,28.33
resnetv2_50d_evob,1627.44,629.196,1024,224,4.33,11.92,25.59
twins_pcpvt_base,1615.12,633.996,1024,224,6.68,25.25,43.83
resnetv2_152,1614.43,634.268,1024,224,11.55,22.56,60.19
hrnet_w32,1605.06,637.96,1024,224,8.97,22.02,41.23
swinv2_cr_tiny_ns_224,1600.43,639.811,1024,224,4.66,28.45,28.33
xception41p,1598.79,480.351,768,299,9.25,39.86,26.91
tv_resnet152,1582.54,647.049,1024,224,11.56,22.56,60.19
gluon_resnet152_v1b,1581.57,647.444,1024,224,11.56,22.56,60.19
resnet152,1581.02,647.671,1024,224,11.56,22.56,60.19
xception,1579.88,648.138,1024,299,8.4,35.83,22.86
halo2botnet50ts_256,1572.75,651.076,1024,256,5.02,21.78,22.64
res2net50_26w_8s,1568.85,652.695,1024,224,8.37,17.95,48.4
vit_medium_patch16_gap_256,1564.22,654.626,1024,256,10.59,22.15,38.86
resnetv2_152d,1557.03,657.648,1024,224,11.8,23.36,60.2
efficientnet_el_pruned,1555.14,658.449,1024,300,8.0,30.7,10.59
maxvit_rmlp_nano_rw_256,1551.85,659.845,1024,256,4.47,31.92,15.5
regnetx_064,1550.52,660.413,1024,224,6.49,16.37,26.21
efficientnet_el,1549.97,660.646,1024,300,8.0,30.7,10.59
gluon_resnet152_v1c,1548.96,661.078,1024,224,11.8,23.36,60.21
nf_ecaresnet101,1546.58,662.091,1024,224,8.01,16.27,44.55
nf_seresnet101,1539.38,665.191,1024,224,8.02,16.27,49.33
mvitv2_tiny,1537.54,665.985,1024,224,4.7,21.16,24.17
nfnet_f0,1525.01,671.456,1024,256,12.62,18.05,71.49
vgg16_bn,1523.86,671.963,1024,224,15.5,13.56,138.37
cs3edgenet_x,1521.21,673.136,1024,288,14.59,16.36,47.82
gluon_resnet152_v1d,1520.11,673.621,1024,224,11.8,23.36,60.21
maxvit_nano_rw_256,1517.43,674.812,1024,256,4.46,30.28,15.45
tf_efficientnet_el,1506.16,679.862,1024,300,8.0,30.7,10.59
convnextv2_nano,1500.71,511.746,768,288,4.06,13.84,15.62
resnest50d,1492.63,686.022,1024,224,5.4,14.36,27.48
ese_vovnet99b,1489.17,687.617,1024,224,16.51,11.27,63.2
dla169,1471.11,696.059,1024,224,11.6,20.2,53.39
regnety_032,1467.85,697.604,1024,288,5.29,18.61,19.44
skresnext50_32x4d,1463.28,699.785,1024,224,4.5,17.18,27.48
xcit_tiny_12_p8_224_dist,1458.7,701.981,1024,224,4.81,23.6,6.71
xcit_tiny_12_p8_224,1458.23,702.211,1024,224,4.81,23.6,6.71
convit_small,1457.54,702.541,1024,224,5.76,17.87,27.78
mobilevitv2_200_in22ft1k,1456.59,527.247,768,256,7.22,32.15,18.45
mobilevitv2_200,1456.02,527.451,768,256,7.22,32.15,18.45
ecaresnet50t,1438.32,711.929,1024,320,8.82,24.13,25.57
gluon_resnet152_v1s,1432.22,714.961,1024,224,12.92,24.96,60.32
nest_tiny,1415.33,542.618,768,224,5.83,25.48,17.06
regnety_040s_gn,1412.65,724.867,1024,224,4.03,12.29,20.65
vgg19,1393.71,183.67,256,224,19.63,14.86,143.67
jx_nest_tiny,1389.62,552.657,768,224,5.83,25.48,17.06
legacy_seresnet152,1383.83,739.96,1024,224,11.33,22.08,66.82
densenet161,1376.52,743.891,1024,224,7.79,11.06,28.68
poolformer_s36,1370.67,747.069,1024,224,5.0,15.82,30.86
vit_small_resnet50d_s16_224,1367.59,748.748,1024,224,13.48,24.82,57.53
twins_svt_base,1362.65,751.463,1024,224,8.59,26.33,56.07
seresnet152,1361.7,751.99,1024,224,11.57,22.61,66.82
xception41,1356.44,566.173,768,299,9.28,39.86,26.97
maxvit_tiny_rw_224,1350.45,758.254,1024,224,5.11,33.11,29.06
crossvit_18_240,1348.85,759.154,1024,240,9.05,26.26,43.27
maxxvit_rmlp_nano_rw_256,1347.73,759.767,1024,256,4.37,26.05,16.78
efficientnet_lite4,1343.74,571.528,768,380,4.04,45.66,13.01
gcvit_tiny,1339.65,764.364,1024,224,4.79,29.82,28.22
pvt_v2_b3,1325.92,772.282,1024,224,6.92,37.7,45.24
crossvit_18_dagger_240,1313.78,779.419,1024,240,9.5,27.03,44.27
volo_d1_224,1312.37,780.255,1024,224,6.94,24.43,26.63
xcit_small_24_p16_224_dist,1307.3,783.278,1024,224,9.1,23.64,47.67
tresnet_m,1305.71,784.234,1024,224,5.74,7.31,31.39
inception_v4,1305.41,784.412,1024,299,12.28,15.09,42.68
repvgg_b2,1305.22,784.529,1024,224,20.45,12.9,89.02
xcit_small_24_p16_224,1303.71,785.433,1024,224,9.1,23.64,47.67
sequencer2d_m,1295.72,790.281,1024,224,6.55,14.26,38.31
edgenext_base,1283.77,797.633,1024,320,6.01,24.32,18.51
hrnet_w30,1280.53,799.653,1024,224,8.15,21.21,37.71
dm_nfnet_f0,1275.46,802.834,1024,256,12.62,18.05,71.49
coatnet_rmlp_1_rw_224,1268.37,807.322,1024,224,7.85,35.47,41.69
maxxvitv2_nano_rw_256,1259.7,812.877,1024,256,6.26,23.05,23.7
efficientnetv2_s,1254.49,816.255,1024,384,8.44,35.77,21.46
vgg19_bn,1246.52,205.36,256,224,19.66,14.86,143.68
nf_regnet_b4,1235.79,828.604,1024,384,4.7,28.61,30.21
swin_small_patch4_window7_224,1235.74,828.641,1024,224,8.77,27.47,49.61
tf_efficientnet_lite4,1232.22,623.25,768,380,4.04,45.66,13.01
regnetz_d32,1223.51,836.919,1024,320,9.33,37.08,27.58
mixnet_xxl,1219.27,629.871,768,224,2.04,23.43,23.96
tf_efficientnetv2_s,1219.16,839.906,1024,384,8.44,35.77,21.46
deit_base_patch16_224,1213.08,844.121,1024,224,17.58,23.9,86.57
deit_base_distilled_patch16_224,1212.98,844.19,1024,224,17.68,24.05,87.34
vit_base_patch16_clip_224,1211.82,844.996,1024,224,17.58,23.9,86.57
vit_base_patch16_224_miil,1211.26,845.389,1024,224,17.59,23.91,94.4
dpn92,1210.45,845.948,1024,224,6.54,18.21,37.67
vit_base_patch16_224,1210.28,846.074,1024,224,17.58,23.9,86.57
coatnet_rmlp_1_rw2_224,1208.65,847.215,1024,224,8.11,40.13,41.72
cait_xxs36_224,1205.51,849.419,1024,224,3.77,30.34,17.3
maxvit_tiny_tf_224,1200.3,639.828,768,224,5.6,35.78,30.92
swinv2_tiny_window8_256,1200.06,853.274,1024,256,5.96,24.57,28.35
efficientnetv2_rw_s,1199.87,853.413,1024,384,8.72,38.03,23.94
dla102x2,1198.52,854.374,1024,224,9.34,29.91,41.28
regnetx_160,1195.08,856.833,1024,224,15.99,25.52,54.28
dpn98,1183.92,864.908,1024,224,11.73,25.2,61.57
vit_base_patch16_rpn_224,1180.39,867.498,1024,224,17.49,23.75,86.54
twins_pcpvt_large,1168.64,876.22,1024,224,9.84,35.82,60.99
deit3_base_patch16_224,1164.77,879.134,1024,224,17.58,23.9,86.59
deit3_base_patch16_224_in21ft1k,1164.5,879.334,1024,224,17.58,23.9,86.59
regnetz_d8,1163.64,879.982,1024,320,6.19,37.08,23.37
swsl_resnext101_32x8d,1158.15,884.156,1024,224,16.48,31.21,88.79
resnext101_32x8d,1158.05,884.232,1024,224,16.48,31.21,88.79
ssl_resnext101_32x8d,1158.02,884.255,1024,224,16.48,31.21,88.79
wide_resnet101_2,1157.66,884.531,1024,224,22.8,21.23,126.89
ig_resnext101_32x8d,1157.3,884.8,1024,224,16.48,31.21,88.79
coatnet_1_rw_224,1155.72,886.014,1024,224,8.04,34.6,41.72
vit_base_patch16_gap_224,1154.73,886.777,1024,224,17.49,25.59,86.57
vit_base_patch32_clip_448,1154.21,887.173,1024,448,17.93,23.9,88.34
resnet200,1149.71,890.646,1024,224,15.07,32.19,64.67
mvitv2_small,1146.92,892.812,1024,224,7.0,28.08,34.87
xception65p,1145.07,670.686,768,299,13.91,52.48,39.82
cs3se_edgenet_x,1143.17,895.738,1024,320,18.01,20.21,50.72
vit_relpos_base_patch16_rpn_224,1143.15,895.76,1024,224,17.51,24.97,86.41
vit_relpos_base_patch16_224,1141.31,897.204,1024,224,17.51,24.97,86.43
tnt_s_patch16_224,1135.32,901.935,1024,224,5.24,24.37,23.76
resnetrs101,1134.67,902.454,1024,288,13.56,28.53,63.62
vit_relpos_base_patch16_clsgap_224,1128.94,907.03,1024,224,17.6,25.12,86.43
vit_relpos_base_patch16_cls_224,1126.78,908.771,1024,224,17.6,25.12,86.43
inception_resnet_v2,1126.73,908.809,1024,299,13.18,25.06,55.84
ens_adv_inception_resnet_v2,1125.41,909.877,1024,299,13.18,25.06,55.84
beit_base_patch16_224,1112.26,920.631,1024,224,17.58,23.9,86.53
coat_tiny,1108.72,923.572,1024,224,4.35,27.2,5.5
beitv2_base_patch16_224,1108.55,923.711,1024,224,17.58,23.9,86.53
mvitv2_small_cls,1101.66,929.491,1024,224,7.04,28.17,34.87
resnetv2_50d_gn,1092.35,937.413,1024,288,7.24,19.7,25.57
pit_b_distilled_224,1078.48,474.731,512,224,12.5,33.07,74.79
pit_b_224,1075.34,476.117,512,224,12.42,32.94,73.76
hrnet_w40,1059.78,966.217,1024,224,12.75,25.29,57.56
coatnet_1_224,1045.17,489.859,512,224,8.7,39.0,42.23
resnet101d,1039.88,984.712,1024,320,16.48,34.77,44.57
flexivit_base,1037.21,987.248,1024,240,20.29,28.36,86.59
gluon_resnext101_64x4d,1034.86,989.491,1024,224,15.52,31.21,83.46
vit_small_patch16_36x1_224,1033.13,991.146,1024,224,13.71,35.69,64.67
vit_large_r50_s32_224,1030.67,993.517,1024,224,19.58,24.41,328.99
maxvit_rmlp_tiny_rw_256,1029.25,746.162,768,256,6.77,46.92,29.15
xcit_tiny_24_p16_384_dist,1027.64,996.444,1024,384,6.87,34.29,12.12
efficientnet_b4,1014.08,504.879,512,384,4.51,50.04,19.34
maxvit_tiny_rw_256,1008.0,1015.861,1024,256,6.74,44.35,29.07
vit_small_patch16_18x2_224,1006.7,1017.169,1024,224,13.71,35.69,64.67
swinv2_cr_small_224,1005.28,1018.603,1024,224,9.07,50.27,49.7
regnetx_080,1004.51,1019.384,1024,224,8.02,14.06,39.57
repvgg_b3,994.23,1029.925,1024,224,29.16,15.1,123.09
swinv2_cr_small_ns_224,993.75,1030.424,1024,224,9.08,50.27,49.7
repvgg_b2g4,988.97,1035.405,1024,224,12.63,12.9,61.76
convnext_small,988.3,1036.113,1024,288,14.39,35.65,50.22
gluon_xception65,987.82,777.458,768,299,13.96,52.48,39.92
vit_small_r26_s32_384,982.68,1042.031,1024,384,10.43,29.85,36.47
xception65,978.83,784.597,768,299,13.96,52.48,39.92
regnetz_040,975.77,787.056,768,320,6.35,37.78,27.12
regnetz_040h,971.51,790.512,768,320,6.43,37.94,28.94
gluon_seresnext101_64x4d,965.3,1060.794,1024,224,15.53,31.25,88.23
maxvit_tiny_pm_256,964.03,1062.189,1024,256,6.61,47.9,30.09
efficientformer_l7,962.55,1063.825,1024,224,10.17,24.45,82.23
twins_svt_large,962.19,1064.229,1024,224,15.15,35.1,99.27
tf_efficientnet_b4,957.62,534.646,512,380,4.49,49.49,19.34
pvt_v2_b4,957.38,1069.569,1024,224,10.14,53.74,62.56
poolformer_m36,954.91,1072.334,1024,224,8.8,22.02,56.17
cait_s24_224,954.44,1072.866,1024,224,9.35,40.58,46.92
regnetz_b16_evos,950.47,808.013,768,288,2.36,16.43,9.74
resnest50d_4s2x40d,938.07,1091.586,1024,224,4.4,17.94,30.42
hrnet_w48,936.07,1093.917,1024,224,17.34,28.56,77.47
gmlp_b16_224,930.95,1099.935,1024,224,15.78,30.21,73.08
convnextv2_tiny,930.82,550.041,512,288,7.39,22.21,28.64
convnextv2_small,928.68,1102.629,1024,224,8.71,21.56,50.32
maxxvit_rmlp_tiny_rw_256,918.72,1114.583,1024,256,6.66,39.76,29.64
mobilevitv2_150_384_in22ft1k,915.49,419.435,384,384,9.2,54.25,10.59
pvt_v2_b5,909.79,1125.516,1024,224,11.76,50.92,81.96
nest_small,903.21,850.284,768,224,10.35,40.04,38.35
swin_s3_small_224,899.98,853.339,768,224,9.43,37.84,49.74
xcit_medium_24_p16_224_dist,898.61,1139.525,1024,224,16.13,31.71,84.4
xcit_medium_24_p16_224,898.6,1139.542,1024,224,16.13,31.71,84.4
jx_nest_small,892.03,860.939,768,224,10.35,40.04,38.35
coat_mini,880.8,1162.569,1024,224,6.82,33.68,10.34
swin_base_patch4_window7_224,875.38,1169.764,1024,224,15.47,36.63,87.77
dpn131,865.2,1183.527,1024,224,16.09,32.97,79.25
resnetv2_50d_evos,854.82,1197.895,1024,288,7.15,19.7,25.59
xcit_small_12_p16_384_dist,853.54,1199.694,1024,384,14.14,36.51,26.25
sequencer2d_l,839.78,1219.347,1024,224,9.74,22.12,54.3
crossvit_base_240,839.43,914.892,768,240,21.22,36.33,105.03
hrnet_w44,821.37,1246.671,1024,224,14.94,26.92,67.06
eca_nfnet_l1,818.87,1250.489,1024,320,14.92,34.42,41.41
vit_base_r50_s16_224,817.55,1252.502,1024,224,21.67,35.31,114.69
maxvit_rmlp_small_rw_224,816.34,1254.368,1024,224,10.75,49.3,64.9
gcvit_small,815.24,1256.055,1024,224,8.57,41.61,51.09
regnety_080,811.28,1262.191,1024,288,13.22,29.69,39.18
densenet264,804.85,1272.268,1024,224,12.95,12.8,72.69
mvitv2_base,804.14,1273.395,1024,224,10.16,40.5,51.47
repvgg_b3g4,802.85,1275.443,1024,224,17.89,15.1,83.83
vit_base_patch16_plus_240,782.25,1309.022,1024,240,27.41,33.08,117.56
swinv2_tiny_window16_256,781.61,655.045,512,256,6.68,39.02,28.35
maxvit_small_tf_224,777.04,658.899,512,224,11.66,53.17,68.93
xcit_tiny_24_p8_224,771.1,1327.958,1024,224,9.21,45.39,12.11
xcit_tiny_24_p8_224_dist,770.21,1329.496,1024,224,9.21,45.39,12.11
coatnet_2_rw_224,763.52,670.562,512,224,15.09,49.22,73.87
vit_relpos_base_patch16_plus_240,763.4,1341.361,1024,240,27.3,34.33,117.38
efficientnet_b3_gn,763.0,671.023,512,320,2.14,28.83,11.73
coatnet_rmlp_2_rw_224,759.73,673.906,512,224,15.18,54.78,73.88
vit_small_patch16_384,753.82,1018.79,768,384,15.52,50.78,22.2
hrnet_w64,750.36,1364.663,1024,224,28.97,35.09,128.06
xception71,749.7,1024.396,768,299,18.09,69.92,42.34
resnet152d,742.37,1379.356,1024,320,24.08,47.67,60.21
swinv2_small_window8_256,741.95,1380.134,1024,256,11.58,40.14,49.73
mobilevitv2_175_384_in22ft1k,739.09,519.544,384,384,12.47,63.29,14.25
ecaresnet200d,736.17,1390.959,1024,256,20.0,43.15,64.69
seresnet200d,733.28,1396.444,1024,256,20.01,43.15,71.86
swin_s3_base_224,733.27,1396.459,1024,224,13.69,48.26,71.13
convit_base,731.09,1400.636,1024,224,17.52,31.77,86.54
resnest101e,726.65,1409.184,1024,256,13.38,28.66,48.28
deit3_small_patch16_384,726.49,1057.125,768,384,15.52,50.78,22.21
deit3_small_patch16_384_in21ft1k,726.32,1057.368,768,384,15.52,50.78,22.21
volo_d2_224,722.61,1417.079,1024,224,14.34,41.34,58.68
tnt_b_patch16_224,721.24,1419.762,1024,224,14.09,39.01,65.41
xcit_nano_12_p8_384_dist,720.41,1421.4,1024,384,6.34,46.08,3.05
swinv2_cr_base_224,719.23,1423.721,1024,224,15.86,59.66,87.88
poolformer_m48,719.07,1424.046,1024,224,11.59,29.17,73.47
coatnet_2_224,715.36,715.711,512,224,16.5,52.67,74.68
swinv2_cr_base_ns_224,712.96,1436.239,1024,224,15.86,59.66,87.88
dpn107,691.0,1481.897,1024,224,18.38,33.46,86.92
convnext_base,687.14,1490.219,1024,288,25.43,47.53,88.59
resnetv2_50x1_bitm,684.31,374.087,256,448,16.62,44.46,25.55
efficientnet_b3_g8_gn,664.63,770.341,512,320,3.2,28.83,14.25
regnety_064,657.71,1556.911,1024,288,10.56,27.11,30.58
regnetv_064,652.6,1569.096,1024,288,10.55,27.11,30.58
xcit_small_12_p8_224,651.3,1572.214,1024,224,18.69,47.21,26.21
xcit_small_12_p8_224_dist,651.08,1572.755,1024,224,18.69,47.21,26.21
resnetrs152,649.95,1575.501,1024,320,24.34,48.14,86.62
mobilevitv2_200_384_in22ft1k,647.42,395.4,256,384,16.24,72.34,18.45
seresnet152d,645.69,1585.88,1024,320,24.09,47.72,66.84
tresnet_l,644.38,1589.105,1024,224,10.88,11.9,55.99
tresnet_v2_l,642.3,1594.246,1024,224,8.81,16.34,46.17
nest_base,640.98,798.76,512,224,17.96,53.39,67.72
regnetx_120,640.37,1599.07,1024,224,12.13,21.37,46.11
seresnext101_32x8d,639.53,1601.159,1024,288,27.24,51.63,93.57
regnetz_e8,639.43,1601.423,1024,320,15.46,63.94,57.7
ese_vovnet99b_iabn,636.1,1609.798,1024,224,16.49,11.27,63.2
jx_nest_base,634.61,806.787,512,224,17.96,53.39,67.72
regnety_120,625.75,1636.422,1024,224,12.14,21.38,51.82
efficientnetv2_m,624.53,1639.618,1024,416,18.6,67.5,54.14
seresnext101d_32x8d,621.55,1647.466,1024,288,27.64,52.95,93.59
resnext101_64x4d,619.77,1652.21,1024,288,25.66,51.59,83.46
swsl_resnext101_32x16d,612.21,1672.624,1024,224,36.27,51.18,194.03
ig_resnext101_32x16d,611.98,1673.243,1024,224,36.27,51.18,194.03
maxvit_rmlp_small_rw_256,611.67,1255.571,768,256,14.15,66.09,64.9
ssl_resnext101_32x16d,611.31,1675.063,1024,224,36.27,51.18,194.03
regnety_320,605.31,1691.684,1024,224,32.34,30.26,145.05
gcvit_base,602.42,1699.782,1024,224,14.87,55.48,90.32
regnetz_c16_evos,596.93,857.706,512,320,3.86,25.88,13.49
maxxvit_rmlp_small_rw_256,590.18,1735.046,1024,256,14.67,58.38,66.01
legacy_senet154,585.86,1747.854,1024,224,20.77,38.69,115.09
senet154,585.53,1748.836,1024,224,20.77,38.69,115.09
seresnextaa101d_32x8d,585.08,1750.175,1024,288,28.51,56.44,93.59
gluon_senet154,584.86,1750.843,1024,224,20.77,38.69,115.09
convmixer_768_32,581.95,1759.577,1024,224,19.55,25.95,21.11
seresnet269d,574.5,1782.4,1024,256,26.59,53.6,113.67
nf_regnet_b5,565.36,905.602,512,456,11.7,61.95,49.74
mixer_l16_224,553.66,1849.49,1024,224,44.6,41.69,208.2
resnet200d,545.14,1878.401,1024,320,31.25,67.33,64.69
nfnet_f1,544.28,1881.353,1024,320,35.97,46.77,132.63
vit_large_patch32_384,543.45,1884.237,1024,384,45.31,43.86,306.63
efficientnetv2_rw_m,543.37,1884.512,1024,416,21.49,79.62,53.24
vit_medium_patch16_gap_384,539.24,949.475,512,384,26.08,67.54,39.03
efficientnet_b5,533.21,960.212,512,448,9.59,93.56,30.39
swinv2_base_window8_256,531.81,1925.495,1024,256,20.37,52.59,87.92
maxxvitv2_rmlp_base_rw_224,525.72,1947.791,1024,224,24.2,62.77,116.09
xcit_large_24_p16_224_dist,509.19,2011.039,1024,224,35.86,47.27,189.1
xcit_large_24_p16_224,509.15,2011.169,1024,224,35.86,47.27,189.1
swin_large_patch4_window7_224,504.4,1522.593,768,224,34.53,54.94,196.53
halonet_h1,503.39,508.543,256,256,3.0,51.17,8.1
volo_d3_224,502.58,2037.467,1024,224,20.78,60.09,86.33
swinv2_small_window16_256,488.97,1047.084,512,256,12.82,66.29,49.73
tresnet_xl,481.58,2126.301,1024,224,15.17,15.34,78.44
vit_small_patch8_224,479.11,1068.641,512,224,22.44,80.84,21.67
tf_efficientnet_b5,476.47,805.919,384,456,10.46,98.86,30.39
maxvit_rmlp_base_rw_224,472.06,2169.196,1024,224,23.15,92.64,116.14
resnetrs200,471.68,2170.964,1024,320,31.51,67.81,93.21
xcit_tiny_12_p8_384_dist,471.45,2172.002,1024,384,14.13,69.14,6.71
dm_nfnet_f1,461.24,2220.087,1024,320,35.97,46.77,132.63
tf_efficientnetv2_m,458.93,1673.426,768,480,24.76,89.84,54.14
xcit_small_24_p16_384_dist,457.16,2239.891,1024,384,26.72,68.58,47.67
coatnet_rmlp_3_rw_224,439.5,582.463,256,224,33.56,79.47,165.15
maxvit_base_tf_224,430.05,1190.542,512,224,24.04,95.01,119.47
swinv2_cr_large_224,423.86,1811.887,768,224,35.1,78.42,196.68
resnetv2_152x2_bit_teacher,423.36,2418.743,1024,224,46.95,45.11,236.34
swinv2_cr_tiny_384,423.1,907.565,384,384,15.34,161.01,28.33
coatnet_3_rw_224,421.95,606.701,256,224,33.44,73.83,181.81
resnetv2_101x1_bitm,419.35,610.453,256,448,31.65,64.93,44.54
coatnet_3_224,405.07,631.982,256,224,36.56,79.01,166.97
convnextv2_base,403.59,1268.593,512,288,25.43,47.53,88.72
eca_nfnet_l2,401.73,2548.946,1024,384,30.05,68.28,56.72
regnetz_d8_evos,394.39,1947.294,768,320,7.03,38.92,23.46
convmixer_1024_20_ks9_p14,393.5,2602.254,1024,224,5.55,5.51,24.38
eva_large_patch14_196,392.3,2610.234,1024,196,61.57,63.52,304.14
crossvit_15_dagger_408,390.72,655.182,256,408,21.45,95.05,28.5
vit_large_patch16_224,390.66,2621.182,1024,224,61.6,63.52,304.33
vit_base_patch16_18x2_224,384.38,2663.987,1024,224,52.51,71.38,256.73
deit3_large_patch16_224_in21ft1k,377.58,2711.976,1024,224,61.6,63.52,304.37
deit3_large_patch16_224,377.53,2712.348,1024,224,61.6,63.52,304.37
convnext_large,373.02,2058.836,768,288,56.87,71.29,197.77
beit_large_patch16_224,360.62,2839.572,1024,224,61.6,63.52,304.43
beitv2_large_patch16_224,360.58,2839.86,1024,224,61.6,63.52,304.43
swinv2_base_window12to16_192to256_22kft1k,360.56,1065.006,384,256,22.02,84.71,87.92
swinv2_base_window16_256,360.23,1065.959,384,256,22.02,84.71,87.92
regnety_160,353.5,2172.566,768,288,26.37,38.07,83.59
nasnetalarge,345.63,1111.004,384,331,23.89,90.56,88.75
maxvit_tiny_tf_384,344.01,744.157,256,384,17.53,123.42,30.98
xcit_small_24_p8_224,342.37,2990.915,1024,224,35.81,90.78,47.63
xcit_small_24_p8_224_dist,342.26,2991.817,1024,224,35.81,90.78,47.63
flexivit_large,335.35,3053.52,1024,240,70.99,75.39,304.36
maxxvitv2_rmlp_large_rw_224,332.33,3081.271,1024,224,44.14,87.15,215.42
vit_large_r50_s32_384,329.8,3104.921,1024,384,57.43,76.52,329.09
pnasnet5large,328.89,1167.534,384,331,25.04,92.89,86.06
tresnet_m_448,325.8,3143.01,1024,448,22.94,29.21,31.39
volo_d1_384,323.04,1584.906,512,384,22.75,108.55,26.78
volo_d4_224,318.96,3210.439,1024,224,44.34,80.22,192.96
xcit_medium_24_p16_384_dist,312.74,3274.268,1024,384,47.39,91.64,84.4
nfnet_f2,310.6,3296.869,1024,352,63.22,79.06,193.78
vit_base_patch16_384,307.09,1250.42,384,384,55.54,101.56,86.86
deit_base_patch16_384,306.8,1251.599,384,384,55.54,101.56,86.86
vit_base_patch16_clip_384,306.29,1253.685,384,384,55.54,101.56,86.86
deit_base_distilled_patch16_384,305.48,1257.017,384,384,55.65,101.82,87.63
ecaresnet269d,305.06,3356.684,1024,352,50.25,101.25,102.09
maxvit_large_tf_224,301.43,1273.908,384,224,43.68,127.35,211.79
deit3_base_patch16_384_in21ft1k,298.01,1288.526,384,384,55.54,101.56,86.88
deit3_base_patch16_384,297.88,1289.093,384,384,55.54,101.56,86.88
resnetrs270,296.97,3448.186,1024,352,51.13,105.48,129.86
regnetx_320,289.44,2653.413,768,224,31.81,36.3,107.81
efficientnet_b6,287.31,890.997,256,528,19.4,167.39,43.04
vit_large_patch14_224,286.23,3577.501,1024,224,81.08,88.79,304.2
vit_large_patch14_clip_224,285.99,3580.5,1024,224,81.08,88.79,304.2
crossvit_18_dagger_408,285.18,673.248,192,408,32.47,124.87,44.61
cait_xxs24_384,281.48,3637.936,1024,384,9.63,122.66,12.03
ig_resnext101_32x32d,275.12,1860.956,512,224,87.29,91.12,468.53
tf_efficientnet_b6,274.07,700.545,192,528,19.4,167.39,43.04
dm_nfnet_f2,264.79,2900.408,768,352,63.22,79.06,193.78
beit_base_patch16_384,261.27,1469.733,384,384,55.54,101.56,86.74
efficientnetv2_l,260.33,1966.694,512,480,56.4,157.99,118.52
swinv2_cr_small_384,259.75,985.56,256,384,29.7,298.03,49.7
tf_efficientnetv2_l,257.29,1989.923,512,480,56.4,157.99,118.52
resnest200e,254.36,1006.453,256,320,35.69,82.78,70.2
mvitv2_large,249.99,2048.061,512,224,43.87,112.02,217.99
xcit_tiny_24_p8_384_dist,248.25,4124.916,1024,384,27.05,132.95,12.11
convnext_xlarge,242.63,2110.182,512,288,100.8,95.05,350.2
resmlp_big_24_224_in22ft1k,241.9,4233.056,1024,224,100.23,87.31,129.14
resmlp_big_24_224,241.74,4235.988,1024,224,100.23,87.31,129.14
resmlp_big_24_distilled_224,241.44,4241.249,1024,224,100.23,87.31,129.14
convnextv2_large,239.52,1068.782,256,288,56.87,71.29,197.96
coatnet_4_224,238.62,1072.827,256,224,62.48,129.26,275.43
swin_base_patch4_window12_384,236.12,813.144,192,384,47.19,134.78,87.9
xcit_medium_24_p8_224_dist,233.5,3289.007,768,224,63.53,121.23,84.32
xcit_medium_24_p8_224,233.5,3289.104,768,224,63.53,121.23,84.32
eca_nfnet_l3,229.87,2227.284,512,448,52.55,118.4,72.04
vit_base_r50_s16_384,226.32,1696.687,384,384,67.43,135.03,98.95
maxvit_small_tf_384,224.01,857.105,192,384,35.87,183.65,69.02
xcit_small_12_p8_384_dist,221.54,1733.28,384,384,54.92,138.29,26.21
swinv2_large_window12to16_192to256_22kft1k,220.1,1163.101,256,256,47.81,121.53,196.74
volo_d5_224,210.88,4855.76,1024,224,72.4,118.11,295.46
vit_base_patch8_224,199.67,1282.079,256,224,78.22,161.69,86.58
cait_xs24_384,197.64,3885.811,768,384,19.28,183.98,26.67
resnetrs350,196.19,5219.377,1024,384,77.59,154.74,163.96
cait_xxs36_384,188.27,5439.03,1024,384,14.35,183.7,17.37
swinv2_cr_base_384,185.68,1378.725,256,384,50.57,333.68,87.88
coatnet_rmlp_2_rw_384,184.84,1038.746,192,384,47.69,209.43,73.88
swinv2_cr_huge_224,184.09,2085.934,384,224,115.97,121.08,657.83
convnext_xxlarge,183.68,2787.486,512,224,151.66,95.29,846.47
volo_d2_384,180.56,2126.753,384,384,46.17,184.51,58.87
xcit_large_24_p16_384_dist,176.39,5805.281,1024,384,105.35,137.17,189.1
regnety_640,174.81,4393.396,768,224,64.16,42.5,281.38
maxvit_xlarge_tf_224,171.63,1491.6,256,224,97.49,191.02,474.95
nfnet_f3,170.11,4514.791,768,416,115.58,141.78,254.92
densenet264d_iabn,167.13,6126.84,1024,224,13.47,14.0,72.74
efficientnet_b7,166.38,1153.975,192,600,38.33,289.94,66.35
maxvit_tiny_tf_512,163.72,781.809,128,512,33.49,257.59,31.05
efficientnetv2_xl,162.7,3146.865,512,512,93.85,247.32,208.12
tf_efficientnetv2_xl,161.32,3173.821,512,512,93.85,247.32,208.12
tf_efficientnet_b7,160.43,1196.798,192,600,38.33,289.94,66.35
resnetv2_152x2_bit_teacher_384,159.54,1604.579,256,384,136.16,132.56,236.34
tresnet_l_448,154.66,6620.743,1024,448,43.5,47.56,55.99
vit_huge_patch14_224,154.27,6637.58,1024,224,167.43,139.43,658.75
vit_huge_patch14_clip_224,154.17,6642.017,1024,224,167.4,139.41,632.05
maxxvitv2_rmlp_base_rw_384,153.9,1663.429,256,384,72.98,213.74,116.09
cait_s24_384,152.41,3359.254,512,384,32.17,245.31,47.06
deit3_huge_patch14_224_in21ft1k,150.05,6824.53,1024,224,167.4,139.41,632.13
deit3_huge_patch14_224,149.59,6845.356,1024,224,167.4,139.41,632.13
dm_nfnet_f3,145.48,3519.403,512,416,115.58,141.78,254.92
resnetrs420,142.37,5394.528,768,416,108.45,213.79,191.89
swin_large_patch4_window12_384,138.37,925.016,128,384,104.08,202.16,196.74
resnetv2_50x3_bitm,133.5,1438.189,192,448,145.7,133.37,217.32
maxvit_rmlp_base_rw_384,131.6,1945.285,256,384,70.97,318.95,116.14
xcit_large_24_p8_224_dist,131.32,3898.808,512,224,141.23,181.56,188.93
xcit_large_24_p8_224,131.27,3900.391,512,224,141.23,181.56,188.93
coatnet_5_224,130.48,1471.508,192,224,145.49,194.24,687.47
maxvit_base_tf_384,122.48,1567.652,192,384,73.8,332.9,119.65
resnest269e,119.17,2148.198,256,416,77.69,171.98,110.93
resnetv2_152x2_bitm,117.29,2182.534,256,448,184.99,180.43,236.34
xcit_small_24_p8_384_dist,116.59,3293.649,384,384,105.24,265.91,47.63
tresnet_xl_448,115.63,8855.938,1024,448,60.65,61.31,78.44
swinv2_cr_large_384,113.43,1128.479,128,384,108.95,404.96,196.68
maxvit_small_tf_512,106.82,1198.298,128,512,67.26,383.77,69.13
efficientnet_b8,106.21,1205.18,128,672,63.48,442.89,87.41
tf_efficientnet_b8,102.86,1244.358,128,672,63.48,442.89,87.41
eva_large_patch14_336,102.71,2492.371,256,336,191.1,270.24,304.53
vit_large_patch14_clip_336,102.52,2496.99,256,336,191.11,270.24,304.53
vit_large_patch16_384,102.5,2497.593,256,384,191.21,270.24,304.72
cait_s36_384,101.88,5025.316,512,384,47.99,367.4,68.37
eva_giant_patch14_224,101.84,10055.112,1024,224,267.18,192.64,1012.56
vit_giant_patch14_224,100.71,7625.752,768,224,267.18,192.64,1012.61
vit_giant_patch14_clip_224,100.43,7646.856,768,224,267.18,192.64,1012.65
deit3_large_patch16_384_in21ft1k,99.81,2564.809,256,384,191.21,270.24,304.76
deit3_large_patch16_384,99.8,2564.994,256,384,191.21,270.24,304.76
swinv2_base_window12to24_192to384_22kft1k,96.12,665.832,64,384,55.25,280.36,87.92
nfnet_f4,89.33,5731.574,512,512,216.26,262.26,316.07
beit_large_patch16_384,88.56,2890.58,256,384,191.21,270.24,305.0
maxvit_large_tf_384,86.44,1480.84,128,384,132.55,445.84,212.03
regnety_1280,82.49,4654.845,384,224,127.66,71.58,644.81
xcit_medium_24_p8_384_dist,79.96,3201.705,256,384,186.67,354.73,84.32
resnetv2_101x3_bitm,79.41,2417.67,192,448,280.33,194.78,387.93
volo_d3_448,77.64,2473.021,192,448,96.33,446.83,86.63
dm_nfnet_f4,77.54,4952.036,384,512,216.26,262.26,316.07
nfnet_f5,67.46,5691.915,384,544,290.97,349.71,377.21
tf_efficientnet_l2,63.66,1507.989,96,475,172.11,609.89,480.31
swinv2_large_window12to24_192to384_22kft1k,60.94,787.651,48,384,116.15,407.83,196.74
vit_gigantic_patch14_224,60.18,8507.121,512,224,483.95,275.37,1844.44
vit_gigantic_patch14_clip_224,60.11,8517.85,512,224,483.96,275.37,1844.91
volo_d4_448,57.87,3317.675,192,448,197.13,527.35,193.41
maxvit_base_tf_512,57.86,2212.256,128,512,138.02,703.99,119.88
dm_nfnet_f5,57.78,6645.368,384,544,290.97,349.71,377.21
vit_huge_patch14_clip_336,57.4,4460.085,256,336,390.97,407.54,632.46
ig_resnext101_32x48d,56.43,6804.709,384,224,153.57,131.06,828.41
convnextv2_huge,56.31,1704.92,96,384,337.96,232.35,660.29
convmixer_1536_20,55.47,18461.426,1024,224,48.68,33.03,51.63
swinv2_cr_giant_224,52.39,3665.046,192,224,483.85,309.15,2598.76
nfnet_f6,51.81,7411.574,384,576,378.69,452.2,438.36
maxvit_xlarge_tf_384,50.76,1891.335,96,384,292.78,668.76,475.32
swinv2_cr_huge_384,49.01,1305.73,64,384,352.04,583.18,657.94
regnety_2560,47.69,8051.463,384,224,257.07,87.48,826.14
xcit_large_24_p8_384_dist,44.91,4275.004,192,384,415.0,531.82,188.93
dm_nfnet_f6,44.62,5737.462,256,576,378.69,452.2,438.36
nfnet_f7,41.13,6224.782,256,608,480.39,570.85,499.5
maxvit_large_tf_512,41.04,1559.597,64,512,244.75,942.15,212.33
eva_giant_patch14_336,39.89,6418.269,256,336,620.64,550.67,1013.01
volo_d5_448,39.88,3209.812,128,448,315.06,737.92,295.91
beit_large_patch16_512,35.33,2716.953,96,512,362.24,656.39,305.67
cait_m36_384,32.89,7783.487,256,384,173.11,734.81,271.22
resnetv2_152x4_bitm,30.46,3151.929,96,480,844.84,414.26,936.53
volo_d5_512,27.89,4590.0,128,512,425.09,1105.37,296.09
maxvit_xlarge_tf_512,24.38,1968.424,48,512,534.14,1413.22,475.77
efficientnet_l2,23.13,1383.428,32,800,479.12,1707.39,480.31
swinv2_cr_giant_384,15.06,2124.735,32,384,1450.71,1394.86,2598.76
cait_m48_448,13.86,9235.876,128,448,329.41,1708.23,356.46
eva_giant_patch14_560,10.52,3043.009,32,560,1906.76,2577.17,1014.45
hf_public_repos/pytorch-image-models/results/results-imagenet-a.csv
model,top1,top1_err,top5,top5_err,param_count,img_size,crop_pct,interpolation,top1_diff,top5_diff,rank_diff
eva02_large_patch14_448.mim_m38m_ft_in22k_in1k,88.227,11.773,97.093,2.907,305.08,448,1.000,bicubic,-10.623,-2.787,+1
eva02_large_patch14_448.mim_in22k_ft_in22k_in1k,87.893,12.107,96.920,3.080,305.08,448,1.000,bicubic,-11.037,-2.990,-1
eva_giant_patch14_560.m30m_ft_in22k_in1k,87.573,12.427,96.893,3.107,"1,014.45",560,1.000,bicubic,-11.257,-3.007,+1
eva02_large_patch14_448.mim_m38m_ft_in1k,87.107,12.893,96.280,3.720,305.08,448,1.000,bicubic,-11.623,-3.590,+5
eva02_large_patch14_448.mim_in22k_ft_in1k,86.227,13.773,95.787,4.213,305.08,448,1.000,bicubic,-12.613,-4.043,-2
eva_giant_patch14_336.clip_ft_in1k,85.307,14.693,95.720,4.280,"1,013.01",336,1.000,bicubic,-13.513,-4.090,-1
eva_giant_patch14_336.m30m_ft_in22k_in1k,85.147,14.853,96.360,3.640,"1,013.01",336,1.000,bicubic,-13.663,-3.540,-1
tf_efficientnet_l2.ns_jft_in1k,84.747,15.253,96.147,3.853,480.31,800,0.960,bicubic,-13.803,-3.673,+9
regnety_1280.swag_ft_in1k,83.907,16.093,96.200,3.800,644.81,384,1.000,bicubic,-14.543,-3.670,+18
eva_large_patch14_336.in22k_ft_in22k_in1k,83.853,16.147,95.347,4.653,304.53,336,1.000,bicubic,-14.887,-4.453,-3
convnextv2_huge.fcmae_ft_in22k_in1k_512,83.827,16.173,96.173,3.827,660.29,512,1.000,bicubic,-14.773,-3.697,+4
maxvit_xlarge_tf_512.in21k_ft_in1k,83.400,16.600,95.520,4.480,475.77,512,1.000,bicubic,-15.220,-4.270,+1
tf_efficientnet_l2.ns_jft_in1k_475,83.400,16.600,95.453,4.547,480.31,475,0.936,bicubic,-15.100,-4.327,+8
eva_large_patch14_336.in22k_ft_in1k,82.760,17.240,95.507,4.493,304.53,336,1.000,bicubic,-15.970,-4.283,-6
maxvit_large_tf_512.in21k_ft_in1k,81.733,18.267,95.027,4.973,212.33,512,1.000,bicubic,-16.887,-4.773,-1
beit_large_patch16_512.in22k_ft_in22k_in1k,81.600,18.400,94.880,5.120,305.67,512,1.000,bicubic,-16.960,-4.960,0
maxvit_base_tf_512.in21k_ft_in1k,81.360,18.640,94.467,5.533,119.88,512,1.000,bicubic,-17.260,-5.333,-5
eva_giant_patch14_224.clip_ft_in1k,81.213,18.787,94.333,5.667,"1,012.56",224,0.900,bicubic,-17.247,-5.417,+8
maxvit_xlarge_tf_384.in21k_ft_in1k,81.067,18.933,94.640,5.360,475.32,384,1.000,bicubic,-17.433,-5.190,+3
convnextv2_huge.fcmae_ft_in22k_in1k_384,79.893,20.107,94.640,5.360,660.29,384,1.000,bicubic,-18.777,-5.220,-10
deit3_large_patch16_384.fb_in22k_ft_in1k,79.187,20.813,93.613,6.387,304.76,384,1.000,bicubic,-19.273,-6.147,+4
beit_large_patch16_384.in22k_ft_in22k_in1k,79.107,20.893,94.267,5.733,305.00,384,1.000,bicubic,-19.413,-5.553,-3
caformer_b36.sail_in22k_ft_in1k_384,78.360,21.640,93.467,6.533,98.75,384,1.000,bicubic,-20.080,-6.333,+6
maxvit_large_tf_384.in21k_ft_in1k,78.013,21.987,93.267,6.733,212.03,384,1.000,bicubic,-20.477,-6.483,-1
eva02_base_patch14_448.mim_in22k_ft_in1k,77.547,22.453,93.120,6.880,87.12,448,1.000,bicubic,-20.893,-6.680,+3
vit_large_patch14_clip_336.openai_ft_in12k_in1k,77.333,22.667,93.627,6.373,304.53,336,1.000,bicubic,-20.927,-6.143,+16
convnext_xxlarge.clip_laion2b_soup_ft_in1k,77.120,22.880,94.320,5.680,846.47,256,1.000,bicubic,-21.320,-5.500,+3
eva02_base_patch14_448.mim_in22k_ft_in22k_in1k,76.893,23.107,92.693,7.307,87.12,448,1.000,bicubic,-21.747,-7.107,-17
maxvit_base_tf_384.in21k_ft_in1k,76.853,23.147,92.600,7.400,119.65,384,1.000,bicubic,-21.667,-7.150,-9
beitv2_large_patch16_224.in1k_ft_in22k_in1k,76.773,23.227,93.173,6.827,304.43,224,0.950,bicubic,-21.767,-6.587,-12
eva_large_patch14_196.in22k_ft_in22k_in1k,75.507,24.493,91.760,8.240,304.14,196,1.000,bicubic,-22.913,-8.050,+2
regnety_1280.swag_lc_in1k,74.587,25.413,91.680,8.320,644.81,224,0.965,bicubic,-23.063,-7.890,+79
maxvit_rmlp_base_rw_384.sw_in12k_ft_in1k,74.253,25.747,90.827,9.173,116.14,384,1.000,bicubic,-23.917,-8.933,+19
vit_huge_patch14_clip_336.laion2b_ft_in12k_in1k,74.240,25.760,92.253,7.747,632.46,336,1.000,bicubic,-24.180,-7.517,-2
regnety_320.swag_ft_in1k,74.200,25.800,92.960,7.040,145.05,384,1.000,bicubic,-23.860,-6.800,+30
swinv2_large_window12to24_192to384.ms_in22k_ft_in1k,73.933,26.067,91.733,8.267,196.74,384,1.000,bicubic,-24.197,-7.977,+21
eva_large_patch14_196.in22k_ft_in1k,73.160,26.840,91.413,8.587,304.14,196,1.000,bicubic,-25.200,-8.407,-2
caformer_m36.sail_in22k_ft_in1k_384,72.987,27.013,90.600,9.400,56.20,384,1.000,bicubic,-25.163,-9.150,+18
vit_large_patch14_clip_224.openai_ft_in12k_in1k,72.293,27.707,90.880,9.120,304.20,224,1.000,bicubic,-25.927,-8.840,+7
convnextv2_large.fcmae_ft_in22k_in1k_384,72.067,27.933,91.013,8.987,197.96,384,1.000,bicubic,-26.333,-8.747,-6
vit_large_patch14_clip_224.openai_ft_in1k,71.813,28.187,91.453,8.547,304.20,224,1.000,bicubic,-26.347,-8.207,+14
vit_large_patch14_clip_336.laion2b_ft_in12k_in1k,71.800,28.200,90.227,9.773,304.53,336,1.000,bicubic,-26.540,-9.533,-5
convformer_b36.sail_in22k_ft_in1k_384,71.547,28.453,90.240,9.760,99.88,384,1.000,bicubic,-26.713,-9.590,-3
vit_large_patch16_384.augreg_in21k_ft_in1k,71.227,28.773,89.773,10.227,304.72,384,1.000,bicubic,-26.993,-9.947,+1
swinv2_base_window12to24_192to384.ms_in22k_ft_in1k,71.200,28.800,91.320,8.680,87.92,384,1.000,bicubic,-26.920,-8.420,+12
deit3_base_patch16_384.fb_in22k_ft_in1k,71.200,28.800,89.933,10.067,86.88,384,1.000,bicubic,-26.640,-9.737,+43
convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_384,70.720,29.280,90.547,9.453,200.13,384,1.000,bicubic,-27.760,-9.233,-23
vit_huge_patch14_clip_224.laion2b_ft_in12k_in1k,70.680,29.320,90.400,9.600,632.05,224,1.000,bicubic,-27.620,-9.360,-10
coatnet_rmlp_2_rw_384.sw_in12k_ft_in1k,70.600,29.400,89.227,10.773,73.88,384,1.000,bicubic,-27.470,-10.493,+15
caformer_s36.sail_in22k_ft_in1k_384,70.347,29.653,90.067,9.933,39.30,384,1.000,bicubic,-27.623,-9.653,+22
deit3_huge_patch14_224.fb_in22k_ft_in1k,70.253,29.747,90.707,9.293,632.13,224,1.000,bicubic,-27.917,-9.053,+2
convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_320,69.880,30.120,90.493,9.507,200.13,320,1.000,bicubic,-28.400,-9.277,-13
swin_large_patch4_window12_384.ms_in22k_ft_in1k,69.640,30.360,89.547,10.453,196.74,384,1.000,bicubic,-28.410,-10.143,+14
volo_d5_512.sail_in1k,69.587,30.413,90.427,9.573,296.09,512,1.150,bicubic,-28.183,-9.243,+46
convnext_xlarge.fb_in22k_ft_in1k_384,69.320,30.680,89.307,10.693,350.20,384,1.000,bicubic,-29.100,-10.503,-24
caformer_b36.sail_in22k_ft_in1k,69.133,30.867,89.600,10.400,98.75,224,1.000,bicubic,-29.027,-10.180,-2
maxxvitv2_rmlp_base_rw_384.sw_in12k_ft_in1k,69.067,30.933,89.880,10.120,116.09,384,1.000,bicubic,-29.193,-9.900,-16
deit3_large_patch16_224.fb_in22k_ft_in1k,68.693,31.307,89.973,10.027,304.37,224,1.000,bicubic,-29.477,-9.757,-7
seresnextaa201d_32x8d.sw_in12k_ft_in1k_384,68.560,31.440,88.613,11.387,149.39,384,1.000,bicubic,-29.650,-11.167,-11
beit_large_patch16_224.in22k_ft_in22k_in1k,68.467,31.533,89.573,10.427,304.43,224,0.900,bicubic,-29.713,-10.187,-10
regnety_160.swag_ft_in1k,68.093,31.907,90.707,9.293,83.59,384,1.000,bicubic,-29.687,-8.893,+35
convnextv2_large.fcmae_ft_in22k_in1k,68.093,31.907,89.720,10.280,197.96,288,1.000,bicubic,-29.997,-10.050,0
volo_d5_448.sail_in1k,68.080,31.920,89.720,10.280,295.91,448,1.150,bicubic,-29.670,-9.830,+39
maxvit_base_tf_512.in1k,67.933,32.067,88.493,11.507,119.88,512,1.000,bicubic,-29.797,-11.117,+40
maxvit_large_tf_512.in1k,67.880,32.120,87.653,12.347,212.33,512,1.000,bicubic,-29.950,-11.907,+26
tf_efficientnetv2_xl.in21k_ft_in1k,67.787,32.213,87.347,12.653,208.12,512,1.000,bicubic,-30.113,-12.223,+15
beitv2_large_patch16_224.in1k_ft_in1k,67.640,32.360,88.667,11.333,304.43,224,0.950,bicubic,-30.270,-10.993,+10
swinv2_large_window12to16_192to256.ms_in22k_ft_in1k,67.320,32.680,88.000,12.000,196.74,256,0.900,bicubic,-30.530,-11.640,+18
vit_large_patch14_clip_336.laion2b_ft_in1k,67.080,32.920,89.493,10.507,304.53,336,1.000,bicubic,-31.140,-10.307,-22
tf_efficientnet_b7.ns_jft_in1k,67.027,32.973,88.667,11.333,66.35,600,0.949,bicubic,-30.893,-11.053,+6
vit_large_patch14_clip_224.laion2b_ft_in12k_in1k,67.000,33.000,87.960,12.040,304.20,224,1.000,bicubic,-31.080,-11.690,-9
convnext_xlarge.fb_in22k_ft_in1k,66.960,33.040,88.947,11.053,350.20,288,1.000,bicubic,-31.150,-10.833,-12
convnext_large_mlp.clip_laion2b_augreg_ft_in1k_384,66.880,33.120,89.280,10.720,200.13,384,1.000,bicubic,-31.370,-10.480,-30
convformer_m36.sail_in22k_ft_in1k_384,66.853,33.147,87.800,12.200,57.05,384,1.000,bicubic,-31.187,-11.950,-5
convnextv2_base.fcmae_ft_in22k_in1k_384,66.787,33.213,88.893,11.107,88.72,384,1.000,bicubic,-31.563,-10.877,-39
volo_d4_448.sail_in1k,66.667,33.333,88.987,11.013,193.41,448,1.150,bicubic,-31.003,-10.623,+32
seresnextaa101d_32x8d.sw_in12k_ft_in1k_288,65.987,34.013,87.880,12.120,93.59,320,1.000,bicubic,-31.983,-11.820,-4
beit_base_patch16_384.in22k_ft_in22k_in1k,65.920,34.080,88.507,11.493,86.74,384,1.000,bicubic,-31.900,-11.193,+14
regnety_320.swag_lc_in1k,65.573,34.427,88.080,11.920,145.05,224,0.965,bicubic,-31.587,-11.590,+109
convnext_large.fb_in22k_ft_in1k_384,65.533,34.467,87.467,12.533,197.77,384,1.000,bicubic,-32.707,-12.283,-36
vit_huge_patch14_clip_224.laion2b_ft_in1k,65.493,34.507,87.720,12.280,632.05,224,1.000,bicubic,-32.527,-12.000,-11
volo_d3_448.sail_in1k,65.400,34.600,87.573,12.427,86.63,448,1.000,bicubic,-32.150,-11.987,+45
convnext_large.fb_in22k_ft_in1k,65.000,35.000,87.947,12.053,197.77,288,1.000,bicubic,-33.120,-11.833,-24
tf_efficientnetv2_l.in21k_ft_in1k,64.947,35.053,87.840,12.160,118.52,480,1.000,bicubic,-32.853,-11.820,+11
swin_base_patch4_window12_384.ms_in22k_ft_in1k,64.467,35.533,87.520,12.480,87.90,384,1.000,bicubic,-33.433,-12.150,-6
convnextv2_huge.fcmae_ft_in1k,64.440,35.560,87.080,12.920,660.29,288,1.000,bicubic,-33.460,-12.630,-6
maxvit_rmlp_base_rw_224.sw_in12k_ft_in1k,64.187,35.813,85.520,14.480,116.14,224,0.950,bicubic,-33.623,-14.130,+7
vit_base_patch16_384.augreg_in21k_ft_in1k,63.693,36.307,86.693,13.307,86.86,384,1.000,bicubic,-34.147,-12.977,+2
maxvit_large_tf_384.in1k,63.507,36.493,85.093,14.907,212.03,384,1.000,bicubic,-34.063,-14.577,+37
swinv2_base_window12to16_192to256.ms_in22k_ft_in1k,63.307,36.693,87.507,12.493,87.92,256,0.900,bicubic,-34.353,-12.103,+19
convnextv2_base.fcmae_ft_in22k_in1k,62.893,37.107,87.667,12.333,88.72,288,1.000,bicubic,-35.167,-12.193,-25
maxvit_small_tf_512.in1k,62.867,37.133,86.307,13.693,69.13,512,1.000,bicubic,-34.883,-13.313,+11
maxvit_base_tf_384.in1k,62.613,37.387,85.187,14.813,119.65,384,1.000,bicubic,-34.957,-14.343,+32
convformer_b36.sail_in22k_ft_in1k,62.600,37.400,85.467,14.533,99.88,224,1.000,bicubic,-35.340,-14.293,-19
cait_m48_448.fb_dist_in1k,62.373,37.627,86.453,13.547,356.46,448,1.000,bicubic,-35.107,-13.147,+43
convnext_base.fb_in22k_ft_in1k_384,62.360,37.640,86.240,13.760,88.59,384,1.000,bicubic,-35.720,-13.520,-33
tf_efficientnet_b6.ns_jft_in1k,62.227,37.773,85.160,14.840,43.04,528,0.942,bicubic,-35.393,-14.390,+19
caformer_b36.sail_in1k_384,62.160,37.840,84.493,15.507,98.75,384,1.000,bicubic,-35.340,-15.147,+37
vit_base_patch8_224.augreg2_in21k_ft_in1k,62.027,37.973,85.840,14.160,86.58,224,0.900,bicubic,-35.663,-13.810,+8
convformer_s36.sail_in22k_ft_in1k_384,61.920,38.080,85.960,14.040,40.01,384,1.000,bicubic,-35.930,-13.690,-13
beitv2_base_patch16_224.in1k_ft_in22k_in1k,61.667,38.333,85.480,14.520,86.53,224,0.900,bicubic,-36.023,-14.200,+5
vit_large_r50_s32_384.augreg_in21k_ft_in1k,61.493,38.507,84.027,15.973,329.09,384,1.000,bicubic,-36.367,-15.643,-17
seresnextaa101d_32x8d.sw_in12k_ft_in1k,61.080,38.920,85.920,14.080,93.59,288,1.000,bicubic,-36.830,-13.740,-25
caformer_m36.sail_in22k_ft_in1k,61.067,38.933,84.893,15.107,56.20,224,1.000,bicubic,-36.773,-14.787,-15
swin_large_patch4_window7_224.ms_in22k_ft_in1k,61.000,39.000,85.867,14.133,196.53,224,0.900,bicubic,-36.650,-13.703,+8
convnext_base.fb_in22k_ft_in1k,60.893,39.107,86.147,13.853,88.59,288,1.000,bicubic,-36.967,-13.533,-22
resnetv2_152x4_bit.goog_in21k_ft_in1k,60.773,39.227,83.560,16.440,936.53,480,1.000,bilinear,-36.717,-16.050,+29
convnext_small.in12k_ft_in1k_384,60.733,39.267,84.960,15.040,50.22,384,1.000,bicubic,-37.067,-14.810,-12
convnext_large_mlp.clip_laion2b_augreg_ft_in1k,60.547,39.453,86.293,13.707,200.13,256,1.000,bicubic,-37.403,-13.417,-35
deit3_large_patch16_384.fb_in1k,60.533,39.467,85.733,14.267,304.76,384,1.000,bicubic,-36.887,-13.837,+36
caformer_m36.sail_in1k_384,60.533,39.467,84.760,15.240,56.20,384,1.000,bicubic,-36.907,-14.880,+35
tf_efficientnet_b5.ns_jft_in1k,60.293,39.707,84.453,15.547,30.39,456,0.934,bicubic,-37.207,-15.127,+22
vit_base_patch16_clip_384.openai_ft_in12k_in1k,60.227,39.773,84.613,15.387,86.86,384,0.950,bicubic,-37.963,-15.047,-64
regnety_160.swag_lc_in1k,60.147,39.853,85.760,14.240,83.59,224,0.965,bicubic,-36.673,-13.890,+131
coatnet_rmlp_2_rw_224.sw_in12k_ft_in1k,59.973,40.027,83.760,16.240,73.88,224,0.950,bicubic,-37.677,-15.880,-3
vit_large_patch14_clip_224.laion2b_ft_in1k,59.947,40.053,85.667,14.333,304.20,224,1.000,bicubic,-37.943,-13.983,-34
xcit_large_24_p8_384.fb_dist_in1k,59.947,40.053,85.467,14.533,188.93,384,1.000,bicubic,-37.573,-14.073,+15
coatnet_2_rw_224.sw_in12k_ft_in1k,59.533,40.467,84.213,15.787,73.87,224,0.950,bicubic,-37.987,-15.387,+13
tf_efficientnetv2_m.in21k_ft_in1k,59.413,40.587,84.573,15.427,54.14,480,1.000,bicubic,-38.407,-15.027,-26
maxxvitv2_rmlp_base_rw_224.sw_in12k_ft_in1k,59.160,40.840,84.480,15.520,116.09,224,0.950,bicubic,-38.600,-15.220,-19
vit_base_patch16_clip_384.laion2b_ft_in12k_in1k,59.147,40.853,83.280,16.720,86.86,384,1.000,bicubic,-38.843,-16.380,-50
vit_base_patch8_224.augreg_in21k_ft_in1k,58.960,41.040,82.733,17.267,86.58,224,0.900,bicubic,-38.610,-16.857,+2
maxvit_tiny_tf_512.in1k,58.800,41.200,84.573,15.427,31.05,512,1.000,bicubic,-38.780,-14.987,0
tiny_vit_21m_512.dist_in22k_ft_in1k,58.733,41.267,83.693,16.307,21.27,512,1.000,bicubic,-39.137,-15.937,-41
caformer_s18.sail_in22k_ft_in1k_384,58.640,41.360,85.347,14.653,26.34,384,1.000,bicubic,-38.780,-14.273,+23
volo_d2_384.sail_in1k,58.600,41.400,84.280,15.720,58.87,384,1.000,bicubic,-38.720,-15.320,+35
tiny_vit_21m_384.dist_in22k_ft_in1k,58.320,41.680,83.653,16.347,21.23,384,1.000,bicubic,-39.290,-15.937,-8
convnext_base.clip_laion2b_augreg_ft_in12k_in1k_384,58.253,41.747,85.480,14.520,88.59,384,1.000,bicubic,-39.787,-14.210,-60
resnext101_32x32d.fb_wsl_ig1b_ft_in1k,58.040,41.960,80.640,19.360,468.53,224,0.875,bilinear,-39.330,-19.040,+26
cait_m36_384.fb_dist_in1k,57.840,42.160,84.840,15.160,271.22,384,1.000,bicubic,-39.560,-14.670,+22
dm_nfnet_f5.dm_in1k,57.640,42.360,82.267,17.733,377.21,544,0.954,bicubic,-40.140,-17.493,-32
dm_nfnet_f6.dm_in1k,57.560,42.440,82.360,17.640,438.36,576,0.956,bicubic,-40.220,-17.290,-34
caformer_s36.sail_in1k_384,57.347,42.653,82.787,17.213,39.30,384,1.000,bicubic,-40.043,-16.753,+20
deit3_base_patch16_224.fb_in22k_ft_in1k,57.267,42.733,83.520,16.480,86.59,224,1.000,bicubic,-40.213,-16.030,+3
volo_d5_224.sail_in1k,57.107,42.893,82.733,17.267,295.46,224,0.960,bicubic,-40.273,-16.837,+19
convnext_base.clip_laiona_augreg_ft_in1k_384,57.080,42.920,84.773,15.227,88.59,384,1.000,bicubic,-40.540,-14.807,-19
deit3_small_patch16_384.fb_in22k_ft_in1k,57.080,42.920,83.067,16.933,22.21,384,1.000,bicubic,-40.050,-16.443,+55
convnextv2_large.fcmae_ft_in1k,56.880,43.120,83.467,16.533,197.96,288,1.000,bicubic,-40.780,-16.253,-28
regnety_160.lion_in12k_ft_in1k,56.747,43.253,83.453,16.547,83.59,288,1.000,bicubic,-40.703,-16.147,+2
dm_nfnet_f4.dm_in1k,56.707,43.293,81.760,18.240,316.07,512,0.951,bicubic,-40.933,-17.780,-26
xcit_medium_24_p8_384.fb_dist_in1k,56.693,43.307,83.453,16.547,84.32,384,1.000,bicubic,-40.587,-16.067,+27
maxvit_small_tf_384.in1k,56.600,43.400,82.293,17.707,69.02,384,1.000,bicubic,-40.830,-17.217,+4
regnety_160.sw_in12k_ft_in1k,56.253,43.747,82.840,17.160,83.59,288,1.000,bicubic,-41.197,-16.750,-1
convformer_m36.sail_in22k_ft_in1k,55.880,44.120,81.813,18.187,57.05,224,1.000,bicubic,-41.720,-17.807,-23
caformer_s36.sail_in22k_ft_in1k,55.813,44.187,82.133,17.867,39.30,224,1.000,bicubic,-41.787,-17.587,-23
vit_large_patch16_224.augreg_in21k_ft_in1k,55.573,44.427,80.107,19.893,304.33,224,0.900,bicubic,-42.057,-19.483,-31
convformer_b36.sail_in1k_384,55.453,44.547,81.293,18.707,99.88,384,1.000,bicubic,-42.077,-18.227,-18
vit_base_patch16_clip_384.openai_ft_in1k,55.000,45.000,82.613,17.387,86.86,384,1.000,bicubic,-42.540,-17.047,-20
vit_base_r50_s16_384.orig_in21k_ft_in1k,54.627,45.373,81.213,18.787,98.95,384,1.000,bicubic,-42.563,-18.347,+37
cait_s36_384.fb_dist_in1k,54.360,45.640,81.373,18.627,68.37,384,1.000,bicubic,-42.970,-18.167,+10
volo_d1_384.sail_in1k,54.333,45.667,81.000,19.000,26.78,384,1.000,bicubic,-42.577,-18.460,+81
deit3_huge_patch14_224.fb_in1k,54.320,45.680,82.093,17.907,632.13,224,0.900,bicubic,-42.580,-17.127,+82
vit_base_patch16_clip_384.laion2b_ft_in1k,54.267,45.733,80.880,19.120,86.86,384,1.000,bicubic,-43.453,-18.750,-48
xcit_small_24_p8_384.fb_dist_in1k,54.253,45.747,81.547,18.453,47.63,384,1.000,bicubic,-42.987,-18.003,+20
vit_medium_patch16_gap_384.sw_in12k_ft_in1k,54.173,45.827,81.640,18.360,39.03,384,0.950,bicubic,-43.267,-17.960,-11
resnetv2_101x3_bit.goog_in21k_ft_in1k,54.027,45.973,81.027,18.973,387.93,448,1.000,bilinear,-42.953,-18.503,+63
resnetv2_152x2_bit.goog_in21k_ft_in1k,54.013,45.987,82.000,18.000,236.34,448,1.000,bilinear,-42.987,-17.590,+58
beitv2_base_patch16_224.in1k_ft_in1k,53.813,46.187,81.853,18.147,86.53,224,0.900,bicubic,-43.357,-17.617,+29
dm_nfnet_f3.dm_in1k,53.773,46.227,79.813,20.187,254.92,416,0.940,bicubic,-43.697,-19.747,-20
convformer_m36.sail_in1k_384,53.547,46.453,80.733,19.267,57.05,384,1.000,bicubic,-43.863,-18.867,-9
deit3_base_patch16_384.fb_in1k,53.427,46.573,80.547,19.453,86.88,384,1.000,bicubic,-43.613,-18.833,+50
convnext_small.in12k_ft_in1k,53.240,46.760,81.400,18.600,50.22,288,1.000,bicubic,-44.110,-18.180,-4
resnext101_32x16d.fb_wsl_ig1b_ft_in1k,53.067,46.933,76.907,23.093,194.03,224,0.875,bilinear,-43.743,-22.693,+83
volo_d4_224.sail_in1k,52.987,47.013,80.427,19.573,192.96,224,0.960,bicubic,-44.293,-19.113,+3
xcit_large_24_p16_384.fb_dist_in1k,52.853,47.147,81.827,18.173,189.10,384,1.000,bicubic,-44.667,-17.653,-32
convnext_small.fb_in22k_ft_in1k_384,52.427,47.573,80.813,19.187,50.22,384,1.000,bicubic,-45.183,-18.787,-48
convnext_base.clip_laion2b_augreg_ft_in12k_in1k,52.320,47.680,82.480,17.520,88.59,256,1.000,bicubic,-45.280,-17.130,-47
vit_base_patch32_clip_448.laion2b_ft_in12k_in1k,52.320,47.680,79.747,20.253,88.34,448,1.000,bicubic,-44.990,-19.733,-5
maxvit_tiny_tf_384.in1k,52.080,47.920,79.813,20.187,30.98,384,1.000,bicubic,-45.230,-19.687,-7
regnety_120.sw_in12k_ft_in1k,51.813,48.187,80.787,19.213,51.82,288,1.000,bicubic,-45.467,-18.743,-4
swin_base_patch4_window7_224.ms_in22k_ft_in1k,51.440,48.560,80.080,19.920,87.77,224,0.900,bicubic,-45.840,-19.510,-6
tf_efficientnet_b4.ns_jft_in1k,51.253,48.747,79.173,20.827,19.34,380,0.922,bicubic,-45.697,-20.087,+53
efficientnet_b5.sw_in12k_ft_in1k,51.213,48.787,78.840,21.160,30.39,448,1.000,bicubic,-46.197,-20.710,-23
resnext101_32x8d.fb_swsl_ig1b_ft_in1k,51.213,48.787,78.240,21.760,88.79,224,0.875,bilinear,-45.997,-21.330,+6
flexivit_large.1200ep_in1k,51.200,48.800,80.667,19.333,304.36,240,0.950,bicubic,-46.210,-18.803,-27
eva02_small_patch14_336.mim_in22k_ft_in1k,51.200,48.800,79.120,20.880,22.13,336,1.000,bicubic,-45.940,-20.350,+16
resnetv2_152x2_bit.goog_teacher_in21k_ft_in1k_384,51.120,48.880,78.533,21.467,236.34,384,1.000,bicubic,-45.710,-20.877,+66
convnext_small.fb_in22k_ft_in1k,51.067,48.933,80.920,19.080,50.22,288,1.000,bicubic,-46.293,-18.610,-22
mvitv2_large.fb_in1k,50.867,49.133,78.493,21.507,217.99,224,0.900,bicubic,-46.063,-20.907,+50
beit_base_patch16_224.in22k_ft_in22k_in1k,50.720,49.280,79.733,20.267,86.53,224,0.900,bicubic,-46.360,-19.877,+21
tf_efficientnetv2_l.in1k,50.693,49.307,77.613,22.387,118.52,480,1.000,bicubic,-46.777,-21.917,-41
vit_base_patch16_384.orig_in21k_ft_in1k,50.653,49.347,78.200,21.800,86.86,384,1.000,bicubic,-46.067,-21.280,+78
xcit_small_12_p8_384.fb_dist_in1k,50.587,49.413,79.573,20.427,26.21,384,1.000,bicubic,-46.643,-19.907,-6
convformer_s36.sail_in1k_384,50.333,49.667,78.893,21.107,40.01,384,1.000,bicubic,-46.947,-20.577,-14
convnext_tiny.in12k_ft_in1k_384,50.320,49.680,79.800,20.200,28.59,384,1.000,bicubic,-47.020,-19.800,-26
volo_d3_224.sail_in1k,50.320,49.680,78.213,21.787,86.33,224,0.960,bicubic,-46.770,-21.257,+14
flexivit_large.600ep_in1k,50.253,49.747,80.013,19.987,304.36,240,0.950,bicubic,-47.027,-19.417,-23
vit_base_patch16_clip_224.laion2b_ft_in12k_in1k,50.120,49.880,78.040,21.960,86.57,224,0.950,bicubic,-47.330,-21.500,-45
convformer_s18.sail_in22k_ft_in1k_384,50.067,49.933,80.973,19.027,26.77,384,1.000,bicubic,-47.203,-18.577,-18
vit_base_patch16_224.augreg2_in21k_ft_in1k,49.827,50.173,78.960,21.040,86.57,224,0.900,bicubic,-47.323,-20.490,-1
vit_base_patch16_clip_224.openai_ft_in12k_in1k,49.800,50.200,77.120,22.880,86.57,224,0.950,bicubic,-47.730,-22.380,-61
cait_s24_384.fb_dist_in1k,49.733,50.267,78.760,21.240,47.06,384,1.000,bicubic,-47.337,-20.670,+16
inception_next_base.sail_in1k_384,49.693,50.307,79.133,20.867,86.67,384,1.000,bicubic,-47.567,-20.357,-21
xcit_medium_24_p16_384.fb_dist_in1k,49.333,50.667,79.867,20.133,84.40,384,1.000,bicubic,-47.947,-19.643,-26
deit_base_distilled_patch16_384.fb_in1k,49.333,50.667,79.227,20.773,87.63,384,1.000,bicubic,-47.627,-20.253,+28
caformer_s18.sail_in1k_384,49.147,50.853,78.693,21.307,26.34,384,1.000,bicubic,-47.933,-20.797,+11
coat_lite_medium_384.in1k,49.093,50.907,78.600,21.400,44.57,384,1.000,bicubic,-48.057,-20.940,-7
tf_efficientnet_b8.ra_in1k,48.933,51.067,77.227,22.773,87.41,672,0.954,bicubic,-48.267,-22.273,-14
convnextv2_base.fcmae_ft_in1k,48.693,51.307,78.827,21.173,88.72,288,1.000,bicubic,-48.527,-20.713,-21
deit3_large_patch16_224.fb_in1k,48.627,51.373,78.107,21.893,304.37,224,0.900,bicubic,-48.313,-21.153,+27
caformer_b36.sail_in1k,48.533,51.467,75.667,24.333,98.75,224,1.000,bicubic,-48.447,-23.823,+19
flexivit_large.300ep_in1k,48.520,51.480,78.653,21.347,304.36,240,0.950,bicubic,-48.730,-20.837,-29
convnext_base.clip_laion2b_augreg_ft_in1k,48.453,51.547,79.827,20.173,88.59,256,1.000,bicubic,-48.787,-19.693,-28
tf_efficientnetv2_s.in21k_ft_in1k,48.453,51.547,77.867,22.133,21.46,384,1.000,bicubic,-48.277,-21.463,+54
resnest269e.in1k,48.213,51.787,74.333,25.667,110.93,416,0.928,bicubic,-48.297,-25.017,+106
xcit_large_24_p8_224.fb_dist_in1k,48.160,51.840,79.093,20.907,188.93,224,1.000,bicubic,-48.910,-20.327,+2
deit3_medium_patch16_224.fb_in22k_ft_in1k,48.160,51.840,77.027,22.973,38.85,224,1.000,bicubic,-48.810,-22.403,+15
vit_base_patch32_clip_384.laion2b_ft_in12k_in1k,47.960,52.040,76.853,23.147,88.30,384,1.000,bicubic,-49.400,-22.667,-51
regnety_2560.seer_ft_in1k,47.907,52.093,76.800,23.200,"1,282.60",384,1.000,bicubic,-49.313,-22.720,-30
regnetz_e8.ra3_in1k,47.800,52.200,76.200,23.800,57.70,320,1.000,bicubic,-49.400,-23.340,-27
convnext_tiny.in12k_ft_in1k,47.507,52.493,78.747,21.253,28.59,288,1.000,bicubic,-49.553,-20.803,-1
convformer_s36.sail_in22k_ft_in1k,47.440,52.560,77.107,22.893,40.01,224,1.000,bicubic,-49.640,-22.453,-9
resnetv2_50x3_bit.goog_in21k_ft_in1k,47.307,52.693,77.333,22.667,217.32,448,1.000,bilinear,-49.403,-22.157,+49
vit_base_patch16_clip_224.openai_ft_in1k,47.200,52.800,77.533,22.467,86.57,224,0.900,bicubic,-49.880,-22.077,-8
xcit_large_24_p8_224.fb_in1k,47.160,52.840,74.480,25.520,188.93,224,1.000,bicubic,-49.240,-24.510,+112
xcit_small_24_p16_384.fb_dist_in1k,46.987,53.013,77.160,22.840,47.67,384,1.000,bicubic,-50.133,-22.290,-22
tf_efficientnet_b8.ap_in1k,46.893,53.107,76.533,23.467,87.41,672,0.954,bicubic,-50.217,-22.977,-22
convnext_large.fb_in1k,46.813,53.187,76.627,23.373,197.77,288,1.000,bicubic,-50.287,-22.823,-20
dm_nfnet_f2.dm_in1k,46.640,53.360,74.813,25.187,193.78,352,0.920,bicubic,-50.470,-24.847,-23
efficientnetv2_rw_m.agc_in1k,46.293,53.707,75.720,24.280,53.24,416,1.000,bicubic,-50.687,-23.620,-2
swinv2_base_window16_256.ms_in1k,46.213,53.787,75.173,24.827,87.92,256,0.900,bicubic,-50.537,-24.177,+34
resnext101_32x16d.fb_swsl_ig1b_ft_in1k,46.120,53.880,72.240,27.760,194.03,224,0.875,bilinear,-50.480,-27.030,+68
caformer_m36.sail_in1k,46.080,53.920,74.533,25.467,56.20,224,1.000,bicubic,-50.810,-24.897,+13
volo_d2_224.sail_in1k,46.067,53.933,75.253,24.747,58.68,224,0.960,bicubic,-50.933,-24.137,-7
vit_small_patch16_384.augreg_in21k_ft_in1k,45.893,54.107,76.720,23.280,22.20,384,1.000,bicubic,-50.807,-22.710,+40
ecaresnet269d.ra2_in1k,45.853,54.147,75.067,24.933,102.09,352,1.000,bicubic,-51.237,-24.403,-27
convnextv2_tiny.fcmae_ft_in22k_in1k_384,45.760,54.240,76.973,23.027,28.64,384,1.000,bicubic,-51.480,-22.637,-51
vit_small_r26_s32_384.augreg_in21k_ft_in1k,45.720,54.280,76.040,23.960,36.47,384,1.000,bicubic,-50.960,-23.040,+48
tf_efficientnet_b7.ap_in1k,45.373,54.627,74.200,25.800,66.35,600,0.949,bicubic,-51.827,-25.300,-47
swin_base_patch4_window12_384.ms_in1k,45.333,54.667,74.360,25.640,87.90,384,1.000,bicubic,-51.247,-24.810,+64
resnext101_32x8d.fb_wsl_ig1b_ft_in1k,45.320,54.680,70.907,29.093,88.79,224,0.875,bilinear,-51.010,-28.493,+111
xcit_medium_24_p8_224.fb_dist_in1k,45.227,54.773,76.760,23.240,84.32,224,1.000,bicubic,-51.693,-22.640,-1
tiny_vit_21m_224.dist_in22k_ft_in1k,45.027,54.973,75.547,24.453,21.20,224,0.950,bicubic,-52.173,-23.943,-48
eca_nfnet_l2.ra3_in1k,44.987,55.013,75.880,24.120,56.72,384,1.000,bicubic,-52.093,-23.630,-29
coatnet_rmlp_1_rw2_224.sw_in12k_ft_in1k,44.787,55.213,73.747,26.253,41.72,224,0.950,bicubic,-51.843,-25.473,+51
maxxvit_rmlp_small_rw_256.sw_in1k,44.600,55.400,75.040,24.960,66.01,256,0.950,bicubic,-52.200,-24.340,+11
crossvit_18_dagger_408.in1k,44.253,55.747,73.840,26.160,44.61,408,1.000,bicubic,-52.287,-25.520,+68
resnest200e.in1k,44.133,55.867,73.493,26.507,70.20,320,0.909,bicubic,-52.477,-25.857,+51
seresnextaa101d_32x8d.ah_in1k,43.973,56.027,73.347,26.653,93.59,288,1.000,bicubic,-52.987,-25.903,-16
cait_xs24_384.fb_dist_in1k,43.947,56.053,75.160,24.840,26.67,384,1.000,bicubic,-52.593,-24.260,+62
mvitv2_base.fb_in1k,43.747,56.253,74.507,25.493,51.47,224,0.900,bicubic,-53.013,-24.753,+12
resnetrs200.tf_in1k,43.733,56.267,72.827,27.173,93.21,320,1.000,bicubic,-52.967,-26.543,+26
rexnetr_300.sw_in12k_ft_in1k,43.653,56.347,76.507,23.493,34.81,288,1.000,bicubic,-53.187,-23.003,-1
tresnet_xl.miil_in1k_448,43.493,56.507,72.427,27.573,78.44,448,0.875,bilinear,-52.487,-26.623,+177
caformer_s18.sail_in22k_ft_in1k,43.333,56.667,74.960,25.040,26.34,224,1.000,bicubic,-53.377,-24.590,+18
xcit_small_12_p16_384.fb_dist_in1k,43.307,56.693,73.933,26.067,26.25,384,1.000,bicubic,-53.613,-25.457,-16
vit_base_patch16_224.augreg_in21k_ft_in1k,43.267,56.733,72.907,27.093,86.57,224,0.900,bicubic,-53.613,-26.623,-10
swin_small_patch4_window7_224.ms_in22k_ft_in1k,43.227,56.773,76.187,23.813,49.61,224,0.900,bicubic,-53.253,-23.203,+65
resnetrs420.tf_in1k,43.147,56.853,70.440,29.560,191.89,416,1.000,bicubic,-53.763,-29.080,-16
vit_base_patch32_clip_384.openai_ft_in12k_in1k,43.107,56.893,73.253,26.747,88.30,384,0.950,bicubic,-54.003,-26.247,-53
edgenext_base.in21k_ft_in1k,43.040,56.960,75.440,24.560,18.51,320,1.000,bicubic,-53.690,-23.980,+5
coatnet_rmlp_2_rw_224.sw_in1k,43.040,56.960,71.693,28.307,73.88,224,0.950,bicubic,-53.500,-27.607,+55
xcit_medium_24_p8_224.fb_in1k,43.040,56.960,70.320,29.680,84.32,224,1.000,bicubic,-53.050,-28.570,+142
tf_efficientnet_b7.ra_in1k,42.973,57.027,73.147,26.853,66.35,600,0.949,bicubic,-54.027,-26.373,-38
tf_efficientnetv2_m.in1k,42.813,57.187,72.600,27.400,54.14,480,1.000,bicubic,-54.397,-26.930,-74
vit_medium_patch16_gap_256.sw_in12k_ft_in1k,42.707,57.293,74.307,25.693,38.86,256,0.950,bicubic,-53.963,-25.063,+23
hrnet_w48_ssld.paddle_in1k,42.600,57.400,72.147,27.853,77.47,288,1.000,bilinear,-54.430,-27.243,-44
dm_nfnet_f1.dm_in1k,42.547,57.453,71.560,28.440,132.63,320,0.910,bicubic,-54.483,-28.080,-44
xcit_tiny_24_p8_384.fb_dist_in1k,42.453,57.547,72.853,27.147,12.11,384,1.000,bicubic,-54.077,-26.417,+49
gcvit_base.in1k,42.440,57.560,73.773,26.227,90.32,224,0.875,bicubic,-54.120,-25.377,+39
swinv2_small_window16_256.ms_in1k,42.360,57.640,72.867,27.133,49.73,256,0.900,bicubic,-54.110,-26.333,+55
maxvit_rmlp_small_rw_224.sw_in1k,42.227,57.773,72.387,27.613,64.90,224,0.900,bicubic,-54.353,-26.863,+35
convformer_s18.sail_in1k_384,42.120,57.880,74.453,25.547,26.77,384,1.000,bicubic,-54.920,-24.937,-51
convnextv2_tiny.fcmae_ft_in22k_in1k,42.093,57.907,75.747,24.253,28.64,288,1.000,bicubic,-54.757,-23.713,-24
maxvit_base_tf_224.in1k,41.987,58.013,70.080,29.920,119.47,224,0.950,bicubic,-54.963,-29.500,-39
convnext_base.fb_in1k,41.947,58.053,73.987,26.013,88.59,288,1.000,bicubic,-54.883,-25.463,-22
xcit_small_24_p8_224.fb_dist_in1k,41.907,58.093,73.653,26.347,47.63,224,1.000,bicubic,-54.963,-25.827,-30
crossvit_15_dagger_408.in1k,41.907,58.093,72.040,27.960,28.50,408,1.000,bicubic,-54.483,-27.310,+63
resnetaa101d.sw_in12k_ft_in1k,41.893,58.107,72.387,27.613,44.57,288,1.000,bicubic,-54.807,-26.973,0
maxvit_large_tf_224.in1k,41.880,58.120,68.653,31.347,211.79,224,0.950,bicubic,-55.080,-30.737,-46
xcit_small_24_p8_224.fb_in1k,41.800,58.200,71.080,28.920,47.63,224,1.000,bicubic,-54.610,-28.070,+55
vit_large_r50_s32_224.augreg_in21k_ft_in1k,41.640,58.360,70.227,29.773,328.99,224,0.900,bicubic,-55.150,-29.113,-23
vit_base_patch16_clip_224.laion2b_ft_in1k,41.613,58.387,73.627,26.373,86.57,224,1.000,bicubic,-55.517,-25.833,-80
swinv2_base_window8_256.ms_in1k,41.547,58.453,72.427,27.573,87.92,256,0.900,bicubic,-54.983,-26.893,+34
resnext101_32x4d.fb_swsl_ig1b_ft_in1k,41.547,58.453,71.720,28.280,44.18,224,0.875,bilinear,-54.873,-27.430,+48
deit3_small_patch16_224.fb_in22k_ft_in1k,41.200,58.800,71.893,28.107,22.06,224,1.000,bicubic,-55.460,-27.437,+5
maxvit_rmlp_tiny_rw_256.sw_in1k,41.173,58.827,71.187,28.813,29.15,256,0.950,bicubic,-55.247,-28.193,+46
regnety_1280.seer_ft_in1k,41.147,58.853,71.200,28.800,644.81,384,1.000,bicubic,-55.713,-28.190,-39
seresnext101d_32x8d.ah_in1k,41.147,58.853,70.920,29.080,93.59,288,1.000,bicubic,-55.553,-28.560,-9
convnext_tiny.fb_in22k_ft_in1k_384,41.053,58.947,72.533,27.467,28.59,384,1.000,bicubic,-56.027,-26.977,-76
caformer_s36.sail_in1k,41.040,58.960,70.893,29.107,39.30,224,1.000,bicubic,-55.650,-28.467,-8
davit_base.msft_in1k,40.880,59.120,72.747,27.253,87.95,224,0.950,bicubic,-56.060,-26.593,-54
tf_efficientnet_b6.ap_in1k,40.827,59.173,71.627,28.373,43.04,528,0.942,bicubic,-56.253,-27.793,-81
flexivit_base.1200ep_in1k,40.640,59.360,72.307,27.693,86.59,240,0.950,bicubic,-56.100,-27.053,-28
convformer_b36.sail_in1k,40.453,59.547,69.440,30.560,99.88,224,1.000,bicubic,-56.447,-30.040,-50
resmlp_big_24_224.fb_in22k_ft_in1k,40.373,59.627,74.800,25.200,129.14,224,0.875,bicubic,-56.247,-24.650,+1
deit3_small_patch16_384.fb_in1k,40.333,59.667,70.333,29.667,22.21,384,1.000,bicubic,-55.867,-28.957,+77
tresnet_l.miil_in1k_448,40.213,59.787,69.920,30.080,55.99,448,0.875,bilinear,-55.647,-29.280,+158
deit_base_patch16_384.fb_in1k,40.187,59.813,70.813,29.187,86.86,384,1.000,bicubic,-55.973,-28.427,+86
regnetz_040_h.ra3_in1k,40.000,60.000,71.320,28.680,28.94,320,1.000,bicubic,-56.700,-27.970,-26
resnetrs350.tf_in1k,39.960,60.040,68.907,31.093,163.96,384,1.000,bicubic,-56.790,-30.463,-37
flexivit_base.600ep_in1k,39.920,60.080,71.893,28.107,86.59,240,0.950,bicubic,-56.730,-27.437,-10
regnetz_d8.ra3_in1k,39.907,60.093,71.707,28.293,23.37,320,1.000,bicubic,-56.713,-27.803,-5
swin_s3_base_224.ms_in1k,39.853,60.147,70.467,29.533,71.13,224,0.900,bicubic,-56.387,-28.683,+62
flexivit_base.300ep_in1k,39.533,60.467,71.000,29.000,86.59,240,0.950,bicubic,-57.067,-28.530,-4
seresnext101_32x8d.ah_in1k,39.520,60.480,69.480,30.520,93.57,288,1.000,bicubic,-57.250,-29.870,-44
gcvit_small.in1k,39.400,60.600,70.480,29.520,51.09,224,0.875,bicubic,-56.880,-28.660,+53
regnetz_d8_evos.ch_in1k,39.360,60.640,71.427,28.573,23.46,320,1.000,bicubic,-57.210,-28.033,0
deit3_base_patch16_224.fb_in1k,39.173,60.827,70.933,29.067,86.59,224,0.900,bicubic,-57.127,-28.247,+47
volo_d1_224.sail_in1k,39.013,60.987,70.267,29.733,26.63,224,0.960,bicubic,-57.307,-29.043,+44
vit_large_patch32_384.orig_in21k_ft_in1k,38.920,61.080,68.933,31.067,306.63,384,1.000,bicubic,-56.910,-30.237,+151
resnetv2_101x1_bit.goog_in21k_ft_in1k,38.907,61.093,71.027,28.973,44.54,448,1.000,bilinear,-57.193,-28.243,+90
mvitv2_small.fb_in1k,38.773,61.227,70.413,29.587,34.87,224,0.900,bicubic,-57.597,-28.787,+29
regnetz_040.ra3_in1k,38.747,61.253,70.440,29.560,27.12,320,1.000,bicubic,-57.973,-29.060,-43
coat_lite_medium.in1k,38.600,61.400,71.093,28.907,44.57,224,0.900,bicubic,-57.870,-28.147,+12
xcit_small_12_p8_224.fb_dist_in1k,38.280,61.720,71.373,28.627,26.21,224,1.000,bicubic,-58.420,-27.987,-39
resnet200d.ra2_in1k,38.133,61.867,68.573,31.427,64.69,320,1.000,bicubic,-58.597,-30.847,-48
davit_small.msft_in1k,38.120,61.880,70.747,29.253,49.75,224,0.950,bicubic,-58.510,-28.413,-25
tf_efficientnet_b7.aa_in1k,38.120,61.880,69.320,30.680,66.35,600,0.949,bicubic,-58.420,-29.940,-5
focalnet_base_srf.ms_in1k,37.827,62.173,69.747,30.253,88.15,224,0.900,bicubic,-58.733,-29.483,-10
focalnet_base_lrf.ms_in1k,37.787,62.213,68.573,31.427,88.75,224,0.900,bicubic,-58.663,-30.727,+10
convformer_m36.sail_in1k,37.760,62.240,67.280,32.720,57.05,224,1.000,bicubic,-58.920,-32.300,-34
swinv2_small_window8_256.ms_in1k,37.733,62.267,69.853,30.147,49.73,256,0.900,bicubic,-58.537,-29.357,+38
xcit_large_24_p16_224.fb_dist_in1k,37.653,62.347,71.613,28.387,189.10,224,1.000,bicubic,-59.147,-27.737,-67
seresnet152d.ra2_in1k,37.653,62.347,69.493,30.507,66.84,320,1.000,bicubic,-59.117,-29.947,-63
maxvit_small_tf_224.in1k,37.560,62.440,67.947,32.053,68.93,224,0.950,bicubic,-59.130,-31.423,-42
xcit_small_12_p8_224.fb_in1k,37.547,62.453,68.160,31.840,26.21,224,1.000,bicubic,-58.563,-31.000,+74
eca_nfnet_l1.ra2_in1k,37.533,62.467,70.933,29.067,41.41,320,1.000,bicubic,-59.167,-28.447,-47
fastvit_ma36.apple_dist_in1k,37.493,62.507,71.053,28.947,44.07,256,0.950,bicubic,-59.287,-28.277,-69
efficientvit_b3.r288_in1k,37.427,62.573,69.893,30.107,48.65,288,1.000,bicubic,-59.203,-29.457,-35
twins_svt_large.in1k,37.213,62.787,69.187,30.813,99.27,224,0.900,bicubic,-59.027,-30.013,+34
regnetz_d32.ra3_in1k,37.160,62.840,70.480,29.520,27.58,320,0.950,bicubic,-59.430,-28.900,-29
vit_base_patch32_384.augreg_in21k_ft_in1k,37.107,62.893,69.787,30.213,88.30,384,1.000,bicubic,-59.383,-29.623,-11
regnety_064.ra3_in1k,37.013,62.987,68.107,31.893,30.58,288,1.000,bicubic,-59.347,-31.123,+11
swin_s3_small_224.ms_in1k,36.933,63.067,68.253,31.747,49.74,224,0.900,bicubic,-59.297,-30.827,+34
resnext101_64x4d.c1_in1k,36.840,63.160,66.627,33.373,83.46,288,1.000,bicubic,-59.240,-32.573,+70
regnety_160.deit_in1k,36.827,63.173,69.120,30.880,83.59,288,1.000,bicubic,-59.523,-30.060,+9
efficientnetv2_rw_s.ra2_in1k,36.827,63.173,68.320,31.680,23.94,384,1.000,bicubic,-59.713,-30.780,-24
convnext_small.fb_in1k,36.707,63.293,71.067,28.933,50.22,288,1.000,bicubic,-59.813,-28.273,-19
pit_b_distilled_224.in1k,36.707,63.293,68.093,31.907,74.79,224,0.900,bicubic,-59.523,-31.017,+28
pvt_v2_b4.in1k,36.627,63.373,68.653,31.347,62.56,224,0.900,bicubic,-59.723,-30.677,+7
pvt_v2_b5.in1k,36.293,63.707,68.427,31.573,81.96,224,0.900,bicubic,-60.067,-30.813,+4
cait_xxs36_384.fb_dist_in1k,36.267,63.733,67.733,32.267,17.37,384,1.000,bicubic,-59.563,-31.417,+120
fastvit_sa36.apple_dist_in1k,36.187,63.813,69.267,30.733,31.53,256,0.900,bicubic,-60.173,-29.903,0
regnety_640.seer_ft_in1k,36.120,63.880,68.307,31.693,281.38,384,1.000,bicubic,-60.730,-31.113,-94
nest_base_jx.goog_in1k,36.053,63.947,66.747,33.253,67.72,224,0.875,bicubic,-60.187,-32.473,+17
coatnet_1_rw_224.sw_in1k,35.973,64.027,67.133,32.867,41.72,224,0.950,bicubic,-60.057,-32.027,+74
maxvit_tiny_rw_224.sw_in1k,35.960,64.040,65.573,34.427,29.06,224,0.950,bicubic,-60.280,-33.597,+19
repvgg_d2se.rvgg_in1k,35.827,64.173,66.720,33.280,133.33,320,1.000,bilinear,-60.863,-32.640,-67
cs3se_edgenet_x.c2ns_in1k,35.667,64.333,67.827,32.173,50.72,320,1.000,bicubic,-60.783,-31.573,-22
sequencer2d_l.in1k,35.600,64.400,67.360,32.640,54.30,224,0.875,bicubic,-60.550,-31.800,+38
swin_base_patch4_window7_224.ms_in1k,35.560,64.440,68.160,31.840,87.77,224,0.900,bicubic,-60.560,-30.900,+45
regnety_080.ra3_in1k,35.560,64.440,67.213,32.787,39.18,288,1.000,bicubic,-60.970,-32.107,-35
tf_efficientnet_b3.ns_jft_in1k,35.520,64.480,67.773,32.227,12.23,300,0.904,bicubic,-60.870,-31.617,-15
inception_next_base.sail_in1k,35.187,64.813,66.533,33.467,86.67,224,0.950,bicubic,-61.373,-32.547,-44
tf_efficientnet_b6.aa_in1k,35.147,64.853,67.733,32.267,43.04,528,0.942,bicubic,-61.523,-31.757,-66
resnetrs270.tf_in1k,34.973,65.027,65.453,34.547,129.86,352,1.000,bicubic,-61.717,-33.897,-72
gcvit_tiny.in1k,34.947,65.053,66.893,33.107,28.22,224,0.875,bicubic,-61.233,-32.297,+21
tf_efficientnet_b5.ap_in1k,34.813,65.187,67.467,32.533,30.39,456,0.934,bicubic,-61.867,-31.993,-72
xcit_tiny_12_p8_384.fb_dist_in1k,34.653,65.347,66.293,33.707,6.71,384,1.000,bicubic,-61.427,-32.567,+49
fastvit_ma36.apple_in1k,34.627,65.373,66.987,33.013,44.07,256,0.950,bicubic,-61.843,-32.163,-37
vit_base_patch16_224_miil.in21k_ft_in1k,34.533,65.467,64.973,35.027,86.54,224,0.875,bilinear,-61.917,-34.337,-32
xcit_medium_24_p16_224.fb_dist_in1k,34.333,65.667,67.880,32.120,84.40,224,1.000,bicubic,-62.267,-31.390,-61
resnet152d.ra2_in1k,34.280,65.720,65.947,34.053,60.21,320,1.000,bicubic,-62.110,-33.213,-26
deit3_medium_patch16_224.fb_in1k,34.253,65.747,66.013,33.987,38.85,224,0.900,bicubic,-61.827,-33.227,+43
coat_small.in1k,34.187,65.813,66.133,33.867,21.69,224,0.900,bicubic,-61.723,-33.017,+82
tresnet_m.miil_in1k_448,34.107,65.893,64.533,35.467,31.39,448,0.875,bilinear,-60.883,-33.927,+253
resmlp_big_24_224.fb_distilled_in1k,34.053,65.947,69.573,30.427,129.14,224,0.875,bicubic,-62.397,-29.547,-39
regnetv_064.ra3_in1k,34.000,66.000,67.880,32.120,30.58,288,1.000,bicubic,-62.410,-31.480,-34
tiny_vit_11m_224.dist_in22k_ft_in1k,33.867,66.133,65.827,34.173,11.00,224,0.950,bicubic,-62.423,-33.403,-12
xcit_tiny_24_p16_384.fb_dist_in1k,33.827,66.173,65.400,34.600,12.12,384,1.000,bicubic,-62.113,-33.810,+70
inception_next_small.sail_in1k,33.800,66.200,65.773,34.227,49.37,224,0.875,bicubic,-62.440,-33.417,-10
caformer_s18.sail_in1k,33.707,66.293,65.400,34.600,26.34,224,1.000,bicubic,-62.433,-33.770,+19
pvt_v2_b3.in1k,33.640,66.360,67.680,32.320,45.24,224,0.900,bicubic,-62.350,-31.510,+55
focalnet_small_srf.ms_in1k,33.520,66.480,65.907,34.093,49.89,224,0.900,bicubic,-62.550,-33.383,+38
coatnet_rmlp_1_rw_224.sw_in1k,33.507,66.493,65.573,34.427,41.69,224,0.950,bicubic,-62.453,-33.617,+62
convformer_s18.sail_in22k_ft_in1k,33.453,66.547,68.280,31.720,26.77,224,1.000,bicubic,-63.177,-31.060,-83
resnet152.a1h_in1k,33.427,66.573,63.453,36.547,60.19,288,1.000,bicubic,-62.773,-35.807,-2
twins_pcpvt_large.in1k,33.387,66.613,67.907,32.093,60.99,224,0.900,bicubic,-62.753,-31.453,+12
convformer_s36.sail_in1k,33.320,66.680,63.827,36.173,40.01,224,1.000,bicubic,-63.260,-35.413,-74
focalnet_small_lrf.ms_in1k,33.227,66.773,67.067,32.933,50.34,224,0.900,bicubic,-62.953,-32.163,-1
fastvit_sa36.apple_in1k,33.200,66.800,66.000,34.000,31.53,256,0.900,bicubic,-63.040,-33.130,-18
twins_svt_base.in1k,33.200,66.800,65.720,34.280,56.07,224,0.900,bicubic,-62.960,-33.330,+3
pit_b_224.in1k,33.160,66.840,62.347,37.653,73.76,224,0.900,bicubic,-62.470,-36.893,+110
tiny_vit_21m_224.in1k,33.133,66.867,67.453,32.547,21.20,224,0.950,bicubic,-62.997,-31.767,+9
resnext50_32x4d.fb_swsl_ig1b_ft_in1k,33.040,66.960,65.120,34.880,25.03,224,0.875,bilinear,-62.820,-33.950,+68
resnetv2_152x2_bit.goog_teacher_in21k_ft_in1k,33.027,66.973,64.240,35.760,236.34,224,0.875,bicubic,-63.093,-35.100,+9
mobilevitv2_200.cvnets_in22k_ft_in1k_384,32.987,67.013,65.493,34.507,18.45,384,1.000,bicubic,-63.073,-33.587,+27
convnextv2_nano.fcmae_ft_in22k_in1k_384,32.947,67.053,67.173,32.827,15.62,384,1.000,bicubic,-63.423,-32.227,-48
regnety_320.seer_ft_in1k,32.933,67.067,66.333,33.667,145.05,384,1.000,bicubic,-63.407,-33.017,-41
swinv2_cr_small_ns_224.sw_in1k,32.893,67.107,65.947,34.053,49.70,224,0.900,bicubic,-63.287,-33.173,-10
xception65.ra3_in1k,32.747,67.253,62.973,37.027,39.92,299,0.940,bicubic,-63.613,-36.257,-49
xcit_large_24_p16_224.fb_in1k,32.733,67.267,62.067,37.933,189.10,224,1.000,bicubic,-62.697,-36.773,+150
ecaresnet101d.miil_in1k,32.707,67.293,65.973,34.027,44.57,288,0.950,bicubic,-63.513,-33.337,-24
resnext101_32x16d.fb_ssl_yfcc100m_ft_in1k,32.640,67.360,63.987,36.013,194.03,224,0.875,bilinear,-63.150,-35.173,+73
swin_small_patch4_window7_224.ms_in1k,32.573,67.427,65.347,34.653,49.61,224,0.900,bicubic,-63.357,-33.803,+49
mobilevitv2_175.cvnets_in22k_ft_in1k_384,32.480,67.520,64.747,35.253,14.25,384,1.000,bicubic,-63.700,-34.393,-15
efficientvit_b3.r256_in1k,32.440,67.560,65.960,34.040,48.65,256,1.000,bicubic,-63.850,-33.230,-39
tf_efficientnetv2_b3.in21k_ft_in1k,32.333,67.667,66.133,33.867,14.36,300,0.900,bicubic,-63.887,-33.097,-28
nest_small_jx.goog_in1k,32.280,67.720,63.747,36.253,38.35,224,0.875,bicubic,-63.690,-35.283,+34
convnext_tiny_hnf.a2h_in1k,32.227,67.773,62.893,37.107,28.59,288,1.000,bicubic,-63.793,-36.247,+25
resnext101_64x4d.tv_in1k,32.160,67.840,64.227,35.773,83.46,224,0.875,bilinear,-63.870,-34.833,+17
efficientvit_b3.r224_in1k,32.080,67.920,64.867,35.133,48.65,224,0.950,bicubic,-63.950,-34.263,+19
vit_base_patch16_224.orig_in21k_ft_in1k,32.053,67.947,61.613,38.387,86.57,224,0.900,bicubic,-63.287,-37.387,+153
convnextv2_tiny.fcmae_ft_in1k,32.027,67.973,67.067,32.933,28.64,288,1.000,bicubic,-64.163,-32.263,-27
maxvit_nano_rw_256.sw_in1k,31.933,68.067,64.187,35.813,15.45,256,0.950,bicubic,-63.987,-34.823,+41
tf_efficientnet_b5.ra_in1k,31.787,68.213,65.280,34.720,30.39,456,0.934,bicubic,-64.553,-34.030,-57
rexnetr_200.sw_in12k_ft_in1k,31.733,68.267,67.653,32.347,16.52,288,1.000,bicubic,-64.467,-31.567,-33
swinv2_cr_small_224.sw_in1k,31.733,68.267,62.547,37.453,49.70,224,0.900,bicubic,-64.347,-36.593,+1
swinv2_tiny_window16_256.ms_in1k,31.707,68.293,65.667,34.333,28.35,256,0.900,bicubic,-64.223,-33.353,+34
regnetz_c16_evos.ch_in1k,31.493,68.507,66.347,33.653,13.49,320,0.950,bicubic,-64.647,-32.653,-21
fastvit_sa24.apple_dist_in1k,31.427,68.573,64.760,35.240,21.55,256,0.900,bicubic,-64.723,-34.430,-25
resnest101e.in1k,31.413,68.587,64.320,35.680,48.28,256,0.875,bilinear,-64.447,-34.780,+42
maxvit_rmlp_nano_rw_256.sw_in1k,31.400,68.600,63.413,36.587,15.50,256,0.950,bicubic,-64.570,-35.737,+21
regnety_320.tv2_in1k,31.360,68.640,64.840,35.160,145.05,224,0.965,bicubic,-64.720,-34.390,-8
maxxvit_rmlp_nano_rw_256.sw_in1k,31.333,68.667,64.360,35.640,16.78,256,0.950,bicubic,-64.707,-34.900,+2
regnetv_040.ra3_in1k,31.307,68.693,64.680,35.320,20.64,288,1.000,bicubic,-64.883,-34.570,-40
crossvit_base_240.in1k,31.267,68.733,61.307,38.693,105.03,240,0.875,bicubic,-64.253,-37.563,+95
cait_s24_224.fb_dist_in1k,31.187,68.813,64.573,35.427,46.92,224,1.000,bicubic,-65.233,-34.897,-85
convnext_nano.in12k_ft_in1k,31.107,68.893,67.333,32.667,15.59,288,1.000,bicubic,-64.883,-31.987,+8
efficientnet_b4.ra2_in1k,30.880,69.120,64.667,35.333,19.34,384,1.000,bicubic,-65.270,-34.543,-33
repvit_m2_3.dist_450e_in1k,30.787,69.213,63.840,36.160,23.69,224,0.950,bicubic,-65.543,-35.590,-69
regnety_040.ra3_in1k,30.613,69.387,63.827,36.173,20.65,288,1.000,bicubic,-65.407,-35.243,0
maxvit_tiny_tf_224.in1k,30.587,69.413,62.773,37.227,30.92,224,0.950,bicubic,-65.513,-36.507,-22
crossvit_18_240.in1k,30.587,69.413,61.920,38.080,43.27,240,0.875,bicubic,-64.853,-37.200,+113
sequencer2d_m.in1k,30.560,69.440,62.987,37.013,38.31,224,0.875,bicubic,-65.250,-36.223,+40
vit_base_patch32_clip_224.laion2b_ft_in12k_in1k,30.560,69.440,62.053,37.947,88.22,224,0.900,bicubic,-65.570,-37.107,-34
xcit_small_24_p16_224.fb_dist_in1k,30.547,69.453,64.733,35.267,47.67,224,1.000,bicubic,-65.673,-34.477,-56
crossvit_18_dagger_240.in1k,30.520,69.480,61.787,38.213,44.27,240,0.875,bicubic,-65.050,-37.273,+69
maxxvitv2_nano_rw_256.sw_in1k,30.440,69.560,63.707,36.293,23.70,256,0.950,bicubic,-65.460,-35.463,+21
mvitv2_tiny.fb_in1k,30.173,69.827,64.320,35.680,24.17,224,0.900,bicubic,-65.687,-34.800,+27
xcit_medium_24_p16_224.fb_in1k,30.173,69.827,59.307,40.693,84.40,224,1.000,bicubic,-65.367,-39.473,+74
rexnet_300.nav_in1k,30.027,69.973,63.920,36.080,34.71,224,0.875,bicubic,-65.813,-35.210,+27
cait_xxs24_384.fb_dist_in1k,30.013,69.987,63.907,36.093,12.03,384,1.000,bicubic,-65.267,-35.003,+133
tf_efficientnet_b5.aa_in1k,29.973,70.027,63.080,36.920,30.39,456,0.934,bicubic,-66.497,-36.200,-110
convnext_tiny.fb_in1k,29.960,70.040,65.120,34.880,28.59,288,1.000,bicubic,-65.830,-34.060,+33
twins_pcpvt_base.in1k,29.947,70.053,64.627,35.373,43.83,224,0.900,bicubic,-65.843,-34.503,+33
cs3sedarknet_x.c2ns_in1k,29.933,70.067,62.000,38.000,35.40,288,1.000,bicubic,-66.087,-37.190,-13
resnet50.fb_swsl_ig1b_ft_in1k,29.813,70.187,63.827,36.173,25.56,224,0.875,bilinear,-65.587,-35.273,+107
mobilevitv2_150.cvnets_in22k_ft_in1k_384,29.813,70.187,62.133,37.867,10.59,384,1.000,bicubic,-65.877,-37.007,+43
vit_relpos_base_patch16_clsgap_224.sw_in1k,29.733,70.267,62.880,37.120,86.43,224,0.900,bicubic,-66.037,-36.240,+32
resnet152.a1_in1k,29.720,70.280,57.280,42.720,60.19,288,1.000,bicubic,-65.780,-41.800,+78
convnextv2_nano.fcmae_ft_in22k_in1k,29.693,70.307,65.920,34.080,15.62,288,1.000,bicubic,-66.367,-33.300,-29
deit_base_distilled_patch16_224.fb_in1k,29.587,70.413,64.387,35.613,87.34,224,0.900,bicubic,-66.503,-34.803,-40
convit_base.fb_in1k,29.520,70.480,61.747,38.253,86.54,224,0.875,bicubic,-66.030,-37.363,+57
vit_relpos_medium_patch16_cls_224.sw_in1k,29.307,70.693,60.587,39.413,38.76,224,0.900,bicubic,-66.163,-38.363,+82
swin_tiny_patch4_window7_224.ms_in22k_ft_in1k,29.293,70.707,66.107,33.893,28.29,224,0.900,bicubic,-66.217,-33.013,+67
resnext101_32x8d.fb_ssl_yfcc100m_ft_in1k,29.120,70.880,61.000,39.000,88.79,224,0.875,bilinear,-66.390,-38.200,+67
davit_tiny.msft_in1k,29.107,70.893,63.573,36.427,28.36,224,0.950,bicubic,-66.553,-35.477,+38
tf_efficientnetv2_s.in1k,29.040,70.960,61.227,38.773,21.46,384,1.000,bicubic,-67.300,-37.973,-99
edgenext_base.usi_in1k,29.027,70.973,64.907,35.093,18.51,320,1.000,bicubic,-67.673,-34.593,-175
xception65p.ra3_in1k,28.987,71.013,59.907,40.093,39.82,299,0.940,bicubic,-67.223,-39.273,-80
convnext_tiny.fb_in22k_ft_in1k,28.987,71.013,55.600,44.400,28.59,288,1.000,bicubic,-65.853,-43.110,+195
resnet101d.ra2_in1k,28.973,71.027,62.080,37.920,44.57,320,1.000,bicubic,-67.317,-37.040,-97
fastvit_sa24.apple_in1k,28.960,71.040,62.413,37.587,21.55,256,0.900,bicubic,-66.960,-36.747,-9
resnetv2_101.a1h_in1k,28.933,71.067,59.867,40.133,44.54,288,1.000,bicubic,-67.117,-39.273,-40
resnetrs152.tf_in1k,28.893,71.107,60.507,39.493,86.62,320,1.000,bicubic,-67.687,-38.613,-152
regnety_160.tv2_in1k,28.867,71.133,61.627,38.373,83.59,224,0.965,bicubic,-67.103,-37.343,-25
vit_relpos_medium_patch16_224.sw_in1k,28.853,71.147,61.973,38.027,38.75,224,0.900,bicubic,-66.607,-37.037,+71
xcit_tiny_24_p8_224.fb_dist_in1k,28.707,71.293,61.373,38.627,12.11,224,1.000,bicubic,-67.103,-37.737,+5
xcit_tiny_24_p8_224.fb_in1k,28.653,71.347,60.440,39.560,12.11,224,1.000,bicubic,-67.007,-38.590,+25
efficientvit_b2.r288_in1k,28.627,71.373,64.227,35.773,24.33,288,1.000,bicubic,-67.333,-34.933,-26
repvit_m2_3.dist_300e_in1k,28.627,71.373,61.907,38.093,23.69,224,0.950,bicubic,-67.493,-37.333,-68
crossvit_15_dagger_240.in1k,28.547,71.453,60.293,39.707,28.21,240,0.875,bicubic,-67.133,-38.537,+20
cs3edgenet_x.c2_in1k,28.507,71.493,61.133,38.867,47.82,288,1.000,bicubic,-67.543,-38.037,-48
pvt_v2_b2_li.in1k,28.480,71.520,62.000,38.000,22.55,224,0.900,bicubic,-67.080,-37.030,+34
xcit_small_24_p16_224.fb_in1k,28.320,71.680,58.707,41.293,47.67,224,1.000,bicubic,-67.220,-40.013,+38
efficientformerv2_l.snap_dist_in1k,28.253,71.747,61.920,38.080,26.32,224,0.950,bicubic,-67.817,-37.200,-56
resnet101.a1h_in1k,28.160,71.840,59.387,40.613,44.55,288,1.000,bicubic,-67.860,-39.723,-45
efficientformer_l7.snap_dist_in1k,28.000,72.000,63.013,36.987,82.23,224,0.950,bicubic,-68.110,-36.267,-70
flexivit_small.1200ep_in1k,27.853,72.147,58.653,41.347,22.06,240,0.950,bicubic,-67.697,-40.227,+30
resnetaa50d.sw_in12k_ft_in1k,27.813,72.187,62.253,37.747,25.58,288,1.000,bicubic,-68.017,-36.837,-11
pvt_v2_b2.in1k,27.600,72.400,60.733,39.267,25.36,224,0.900,bicubic,-67.880,-38.267,+52
regnetz_c16.ra3_in1k,27.587,72.413,62.720,37.280,13.46,320,1.000,bicubic,-68.353,-36.500,-31
resnext101_32x8d.tv2_in1k,27.573,72.427,59.827,40.173,88.79,224,0.965,bilinear,-68.377,-39.073,-36
vit_base_patch16_384.augreg_in1k,27.547,72.453,57.253,42.747,86.86,384,1.000,bicubic,-67.393,-41.637,+150
coat_lite_small.in1k,27.507,72.493,58.533,41.467,19.84,224,0.900,bicubic,-68.033,-40.547,+27
deit_base_patch16_224.fb_in1k,27.413,72.587,58.880,41.120,86.57,224,0.900,bicubic,-68.037,-40.210,+55
vit_relpos_base_patch16_224.sw_in1k,27.320,72.680,61.147,38.853,86.43,224,0.900,bicubic,-68.240,-37.843,+20
resnetv2_50x1_bit.goog_in21k_ft_in1k,27.307,72.693,62.867,37.133,25.55,448,1.000,bilinear,-67.703,-36.113,+136
regnety_080_tv.tv2_in1k,27.280,72.720,61.520,38.480,39.38,224,0.965,bicubic,-68.580,-37.730,-24
coatnet_bn_0_rw_224.sw_in1k,27.213,72.787,61.253,38.747,27.44,224,0.950,bicubic,-68.487,-37.797,-1
xcit_small_12_p16_224.fb_dist_in1k,27.147,72.853,59.800,40.200,26.25,224,1.000,bicubic,-68.883,-39.190,-63
dm_nfnet_f0.dm_in1k,27.067,72.933,58.320,41.680,71.49,256,0.900,bicubic,-69.243,-41.000,-129
vit_small_patch16_224.augreg_in21k_ft_in1k,27.040,72.960,59.253,40.747,22.05,224,0.900,bicubic,-68.330,-39.687,+67
coatnet_0_rw_224.sw_in1k,27.027,72.973,59.387,40.613,27.44,224,0.950,bicubic,-68.413,-39.663,+52
sequencer2d_s.in1k,26.867,73.133,60.640,39.360,27.65,224,0.875,bicubic,-69.113,-38.490,-55
flexivit_small.600ep_in1k,26.853,73.147,57.253,42.747,22.06,240,0.950,bicubic,-68.817,-41.807,-3
tresnet_v2_l.miil_in21k_ft_in1k,26.707,73.293,59.827,40.173,46.17,224,0.875,bilinear,-69.453,-39.443,-105
gcvit_xtiny.in1k,26.693,73.307,60.907,39.093,19.98,224,0.875,bicubic,-68.897,-38.133,+7
mobilevitv2_200.cvnets_in22k_ft_in1k,26.653,73.347,59.400,40.600,18.45,256,0.888,bicubic,-68.507,-39.700,+94
swin_s3_tiny_224.ms_in1k,26.520,73.480,60.347,39.653,28.33,224,0.900,bicubic,-68.660,-38.603,+90
coatnet_rmlp_nano_rw_224.sw_in1k,26.467,73.533,60.547,39.453,15.15,224,0.900,bicubic,-68.963,-38.493,+48
swinv2_tiny_window8_256.ms_in1k,26.387,73.613,60.480,39.520,28.35,256,0.900,bicubic,-69.103,-38.480,+27
tf_efficientnet_b4.aa_in1k,26.293,73.707,60.080,39.920,19.34,380,0.922,bicubic,-69.607,-38.970,-46
regnetx_320.tv2_in1k,26.267,73.733,58.133,41.867,107.81,224,0.965,bicubic,-69.723,-40.967,-66
tf_efficientnet_b4.ap_in1k,26.227,73.773,60.240,39.760,19.34,380,0.922,bicubic,-69.933,-38.900,-116
coatnext_nano_rw_224.sw_in1k,26.227,73.773,59.613,40.387,14.70,224,0.900,bicubic,-69.203,-39.397,+43
deit3_small_patch16_224.fb_in1k,26.227,73.773,54.480,45.520,22.06,224,0.900,bicubic,-68.763,-44.500,+123
nfnet_l0.ra2_in1k,26.213,73.787,61.747,38.253,35.07,288,1.000,bicubic,-69.907,-37.523,-103
regnety_032.ra_in1k,26.200,73.800,61.013,38.987,19.44,288,1.000,bicubic,-69.760,-38.177,-64
ecaresnet50t.ra2_in1k,26.120,73.880,60.013,39.987,25.57,320,0.950,bicubic,-69.400,-38.797,+7
fbnetv3_g.ra2_in1k,26.107,73.893,61.080,38.920,16.62,288,0.950,bilinear,-69.413,-38.030,+7
mobilevitv2_175.cvnets_in22k_ft_in1k,26.013,73.987,58.520,41.480,14.25,256,0.888,bicubic,-69.207,-40.280,+71
flexivit_small.300ep_in1k,25.907,74.093,57.080,42.920,22.06,240,0.950,bicubic,-69.583,-42.020,+18
visformer_small.in1k,25.853,74.147,58.933,41.067,40.22,224,0.900,bicubic,-69.627,-39.967,+21
inception_next_tiny.sail_in1k,25.800,74.200,59.813,40.187,28.06,224,0.875,bicubic,-69.660,-39.097,+22
vit_small_patch16_384.augreg_in1k,25.773,74.227,57.587,42.413,22.20,384,1.000,bicubic,-69.517,-41.413,+57
convformer_s18.sail_in1k,25.680,74.320,57.960,42.040,26.77,224,1.000,bicubic,-70.270,-41.120,-69
halo2botnet50ts_256.a1h_in1k,25.573,74.427,56.827,43.173,22.64,256,0.950,bicubic,-69.847,-42.193,+34
vit_relpos_medium_patch16_rpn_224.sw_in1k,25.520,74.480,58.667,41.333,38.73,224,0.900,bicubic,-69.980,-40.113,+7
coat_mini.in1k,25.507,74.493,57.707,42.293,10.34,224,0.900,bicubic,-69.463,-41.373,+112
crossvit_15_240.in1k,25.453,74.547,57.587,42.413,27.53,240,0.875,bicubic,-69.697,-41.223,+76
vit_srelpos_medium_patch16_224.sw_in1k,25.373,74.627,58.427,41.573,38.74,224,0.900,bicubic,-69.867,-40.443,+56
resnet101.a1_in1k,25.253,74.747,55.120,44.880,44.55,288,1.000,bicubic,-70.267,-43.730,-3
resnetv2_50x1_bit.goog_distilled_in1k,25.200,74.800,59.667,40.333,25.55,224,0.875,bicubic,-70.910,-39.603,-117
convit_small.fb_in1k,25.147,74.853,57.293,42.707,27.78,224,0.875,bicubic,-70.063,-41.607,+59
xcit_small_12_p16_224.fb_in1k,25.107,74.893,56.067,43.933,26.25,224,1.000,bicubic,-70.323,-42.563,+24
vit_base_patch16_rpn_224.sw_in1k,25.080,74.920,58.680,41.320,86.54,224,0.900,bicubic,-70.310,-40.180,+31
gc_efficientnetv2_rw_t.agc_in1k,25.067,74.933,57.707,42.293,13.68,288,1.000,bicubic,-70.673,-41.473,-43
resnet152.a2_in1k,25.053,74.947,54.320,45.680,60.19,288,1.000,bicubic,-70.437,-44.670,+3
eca_nfnet_l0.ra2_in1k,24.800,75.200,60.040,39.960,24.14,288,1.000,bicubic,-71.140,-39.070,-80
xception41p.ra3_in1k,24.787,75.213,55.253,44.747,26.91,299,0.940,bicubic,-70.723,-43.657,-7
efficientvit_b2.r256_in1k,24.773,75.227,59.720,40.280,24.33,256,1.000,bicubic,-70.877,-39.340,-36
tnt_s_patch16_224,24.707,75.293,58.173,41.827,23.76,224,0.900,bicubic,-70.333,-40.657,+85
convnext_nano_ols.d1h_in1k,24.547,75.453,57.027,42.973,15.65,288,1.000,bicubic,-70.593,-41.823,+70
xcit_tiny_12_p16_384.fb_dist_in1k,24.453,75.547,57.080,42.920,6.72,384,1.000,bicubic,-70.677,-41.940,+70
cs3darknet_x.c2ns_in1k,24.387,75.613,57.760,42.240,35.05,288,1.000,bicubic,-71.463,-41.410,-69
efficientnetv2_rw_t.ra2_in1k,24.280,75.720,57.400,42.600,13.65,288,1.000,bicubic,-71.310,-41.670,-33
swinv2_cr_tiny_ns_224.sw_in1k,24.147,75.853,58.187,41.813,28.33,224,0.900,bicubic,-71.223,-40.953,+23
eva02_tiny_patch14_336.mim_in22k_ft_in1k,24.133,75.867,55.400,44.600,5.76,336,1.000,bicubic,-70.777,-43.480,+100
resnext101_32x4d.fb_ssl_yfcc100m_ft_in1k,24.120,75.880,57.373,42.627,44.18,224,0.875,bilinear,-71.320,-41.347,+3
twins_svt_small.in1k,24.107,75.893,57.160,42.840,24.06,224,0.900,bicubic,-71.083,-41.720,+48
coatnet_nano_rw_224.sw_in1k,24.093,75.907,57.147,42.853,15.14,224,0.900,bicubic,-71.147,-41.843,+39
vit_small_r26_s32_224.augreg_in21k_ft_in1k,24.093,75.907,56.213,43.787,36.43,224,0.900,bicubic,-71.537,-42.727,-44
tf_efficientnet_b5.in1k,24.080,75.920,58.347,41.653,30.39,456,0.934,bicubic,-71.990,-40.843,-125
tf_efficientnet_b2.ns_jft_in1k,24.040,75.960,57.320,42.680,9.11,260,0.890,bicubic,-71.730,-41.660,-65
mobilevitv2_150.cvnets_in22k_ft_in1k,24.013,75.987,55.920,44.080,10.59,256,0.888,bicubic,-71.127,-42.860,+55
vit_relpos_small_patch16_224.sw_in1k,24.000,76.000,58.160,41.840,21.98,224,0.900,bicubic,-71.150,-40.800,+48
convnext_nano.d1h_in1k,24.000,76.000,56.200,43.800,15.59,288,1.000,bicubic,-71.350,-42.720,+17
cs3sedarknet_l.c2ns_in1k,23.960,76.040,58.760,41.240,21.91,288,0.950,bicubic,-71.350,-40.370,+19
ecaresnet50d.miil_in1k,23.933,76.067,58.760,41.240,25.58,288,0.950,bicubic,-71.517,-40.080,-10
poolformer_m48.sail_in1k,23.867,76.133,57.160,42.840,73.47,224,0.950,bicubic,-71.763,-42.030,-50
vit_small_patch32_384.augreg_in21k_ft_in1k,23.760,76.240,57.280,42.720,22.92,384,1.000,bicubic,-71.260,-41.600,+68
tiny_vit_11m_224.in1k,23.627,76.373,58.973,41.027,11.00,224,0.950,bicubic,-71.863,-39.817,-23
lamhalobotnet50ts_256.a1h_in1k,23.613,76.387,55.320,44.680,22.57,256,0.950,bicubic,-71.537,-43.560,+44
nasnetalarge.tf_in1k,23.493,76.507,55.000,45.000,88.75,331,0.911,bicubic,-72.197,-43.930,-64
crossvit_small_240.in1k,23.427,76.573,56.840,43.160,26.86,240,0.875,bicubic,-71.423,-42.180,+95
levit_384.fb_dist_in1k,23.400,76.600,56.400,43.600,39.13,224,0.900,bicubic,-72.130,-42.660,-42
levit_conv_384.fb_dist_in1k,23.387,76.613,56.413,43.587,39.13,224,0.900,bicubic,-72.143,-42.637,-42
seresnext50_32x4d.racm_in1k,23.360,76.640,57.453,42.547,27.56,288,0.950,bicubic,-72.380,-41.567,-75
pnasnet5large.tf_in1k,23.333,76.667,53.627,46.373,86.06,331,0.911,bicubic,-72.377,-45.403,-72
focalnet_tiny_srf.ms_in1k,23.267,76.733,58.333,41.667,28.43,224,0.900,bicubic,-72.233,-40.797,-36
efficientnet_b3.ra2_in1k,23.200,76.800,55.947,44.053,12.23,320,1.000,bicubic,-72.520,-43.093,-76
wide_resnet50_2.racm_in1k,23.187,76.813,56.000,44.000,68.88,288,0.950,bicubic,-72.443,-42.660,-65
pit_s_distilled_224.in1k,23.160,76.840,57.093,42.907,24.04,224,0.900,bicubic,-71.980,-41.637,+36
hrnet_w18_ssld.paddle_in1k,23.147,76.853,55.160,44.840,21.30,288,1.000,bilinear,-72.843,-44.150,-130
efficientformer_l3.snap_dist_in1k,23.120,76.880,57.147,42.853,31.41,224,0.950,bicubic,-72.470,-42.013,-63
nest_tiny_jx.goog_in1k,23.107,76.893,56.200,43.800,17.06,224,0.875,bicubic,-72.133,-42.710,+12
focalnet_tiny_lrf.ms_in1k,23.080,76.920,58.560,41.440,28.65,224,0.900,bicubic,-72.380,-40.400,-29
tiny_vit_5m_224.dist_in22k_ft_in1k,23.067,76.933,56.480,43.520,5.39,224,0.950,bicubic,-71.983,-42.490,+47
resnet61q.ra2_in1k,23.013,76.987,55.760,44.240,36.85,288,1.000,bicubic,-72.767,-43.230,-91
regnetx_160.tv2_in1k,22.987,77.013,56.333,43.667,54.28,224,0.965,bicubic,-72.893,-42.757,-111
resmlp_big_24_224.fb_in1k,22.893,77.107,54.280,45.720,129.14,224,0.875,bicubic,-71.787,-44.220,+106
vit_srelpos_small_patch16_224.sw_in1k,22.880,77.120,55.747,44.253,21.97,224,0.900,bicubic,-72.150,-43.203,+46
resnetv2_50d_evos.ah_in1k,22.867,77.133,55.173,44.827,25.59,288,1.000,bicubic,-72.763,-43.937,-75
halonet50ts.a1h_in1k,22.867,77.133,54.013,45.987,22.73,256,0.940,bicubic,-72.273,-44.877,+29
vit_base_patch32_clip_224.laion2b_ft_in1k,22.853,77.147,55.013,44.987,88.22,224,0.900,bicubic,-72.667,-43.977,-57
tf_efficientnet_b4.in1k,22.773,77.227,57.093,42.907,19.34,380,0.922,bicubic,-73.047,-41.957,-105
poolformerv2_m48.sail_in1k,22.760,77.240,55.747,44.253,73.35,224,1.000,bicubic,-73.010,-43.293,-96
twins_pcpvt_small.in1k,22.693,77.307,56.813,43.187,24.11,224,0.900,bicubic,-72.517,-42.067,+6
repvit_m1_5.dist_450e_in1k,22.680,77.320,56.107,43.893,14.64,224,0.950,bicubic,-73.220,-43.013,-122
convnextv2_nano.fcmae_ft_in1k,22.653,77.347,58.307,41.693,15.62,288,1.000,bicubic,-73.147,-40.783,-106
poolformer_m36.sail_in1k,22.587,77.413,55.373,44.627,56.17,224,0.950,bicubic,-72.803,-43.567,-23
vit_base_patch32_clip_224.openai_ft_in1k,22.560,77.440,55.333,44.667,88.22,224,0.900,bicubic,-72.550,-43.597,+25
vit_base_patch32_224.augreg_in21k_ft_in1k,22.427,77.573,54.027,45.973,88.22,224,0.900,bicubic,-72.583,-45.033,+41
resnetv2_50d_gn.ah_in1k,22.307,77.693,55.000,45.000,25.57,288,1.000,bicubic,-73.173,-43.950,-51
ecaresnet101d_pruned.miil_in1k,22.267,77.733,56.227,43.773,24.88,288,0.950,bicubic,-73.493,-42.953,-103
wide_resnet101_2.tv2_in1k,22.160,77.840,54.973,45.027,126.89,224,0.965,bilinear,-73.380,-43.887,-76
ecaresnet50t.a1_in1k,22.093,77.907,53.680,46.320,25.57,288,1.000,bicubic,-73.317,-45.330,-35
xcit_tiny_12_p8_224.fb_dist_in1k,22.067,77.933,54.267,45.733,6.71,224,1.000,bicubic,-73.023,-44.643,+23
resnext50_32x4d.a1h_in1k,22.013,77.987,54.080,45.920,25.03,288,1.000,bicubic,-73.437,-44.760,-48
efficientvit_b2.r224_in1k,22.000,78.000,55.533,44.467,24.33,224,0.950,bicubic,-73.200,-43.387,-3
res2net101d.in1k,21.853,78.147,51.680,48.320,45.23,224,0.875,bilinear,-72.967,-47.090,+67
tresnet_m.miil_in21k_ft_in1k,21.680,78.320,53.853,46.147,31.39,224,0.875,bilinear,-74.030,-45.067,-106
repvit_m1_5.dist_300e_in1k,21.613,78.387,54.707,45.293,14.64,224,0.950,bicubic,-74.027,-44.283,-97
efficientformerv2_s2.snap_dist_in1k,21.547,78.453,54.240,45.760,12.71,224,0.950,bicubic,-73.813,-44.690,-33
fastvit_sa12.apple_dist_in1k,21.493,78.507,54.613,45.387,11.58,256,0.900,bicubic,-73.657,-44.347,+3
resnet50_gn.a1h_in1k,21.387,78.613,54.067,45.933,25.56,288,0.950,bicubic,-73.863,-44.933,-20
maxvit_rmlp_pico_rw_256.sw_in1k,21.240,78.760,51.920,48.080,7.52,256,0.950,bicubic,-73.400,-46.890,+88
convmixer_1536_20.in1k,21.213,78.787,55.507,44.493,51.63,224,0.960,bicubic,-73.867,-43.523,+14
swin_tiny_patch4_window7_224.ms_in1k,21.173,78.827,55.933,44.067,28.29,224,0.900,bicubic,-73.967,-42.927,+1
resnet101.a2_in1k,21.173,78.827,51.920,48.080,44.55,288,1.000,bicubic,-74.237,-47.020,-46
pit_s_224.in1k,21.080,78.920,53.613,46.387,23.46,224,0.900,bicubic,-73.490,-45.087,+100
xcit_tiny_12_p8_224.fb_in1k,20.960,79.040,52.467,47.533,6.71,224,1.000,bicubic,-73.730,-46.193,+73
resnet51q.ra2_in1k,20.920,79.080,55.680,44.320,35.70,288,1.000,bilinear,-74.950,-43.450,-144
poolformerv2_m36.sail_in1k,20.920,79.080,53.120,46.880,56.08,224,1.000,bicubic,-74.480,-46.180,-47
resnetrs101.tf_in1k,20.867,79.133,52.827,47.173,63.62,288,0.940,bicubic,-74.563,-46.163,-59
deit_small_distilled_patch16_224.fb_in1k,20.720,79.280,55.133,44.867,22.44,224,0.900,bicubic,-73.990,-43.897,+67
sebotnet33ts_256.a1h_in1k,20.720,79.280,48.760,51.240,13.70,256,0.940,bicubic,-73.890,-49.750,+89
regnety_032.tv2_in1k,20.547,79.453,54.347,45.653,19.44,224,0.965,bicubic,-74.763,-44.563,-39
resnet152.tv2_in1k,20.493,79.507,52.347,47.653,60.19,224,0.965,bilinear,-75.007,-46.613,-83
ecaresnetlight.miil_in1k,20.480,79.520,53.413,46.587,30.16,288,0.950,bicubic,-74.810,-45.617,-39
resnest50d_4s2x40d.in1k,20.387,79.613,52.773,47.227,30.42,224,0.875,bicubic,-74.583,-46.007,+19
resnetaa50.a1h_in1k,20.067,79.933,51.947,48.053,25.56,288,1.000,bicubic,-75.133,-47.143,-26
resnext50_32x4d.fb_ssl_yfcc100m_ft_in1k,20.040,79.960,53.560,46.440,25.03,224,0.875,bilinear,-74.850,-45.300,+27
haloregnetz_b.ra3_in1k,20.013,79.987,49.987,50.013,11.68,224,0.940,bicubic,-74.677,-48.843,+62
regnetz_b16.ra3_in1k,19.813,80.187,52.853,47.147,9.72,288,1.000,bicubic,-75.357,-46.227,-25
xcit_nano_12_p8_384.fb_dist_in1k,19.787,80.213,50.587,49.413,3.05,384,1.000,bicubic,-73.733,-47.943,+221
resnext50_32x4d.a1_in1k,19.733,80.267,50.173,49.827,25.03,288,1.000,bicubic,-75.097,-48.417,+40
tresnet_xl.miil_in1k,19.640,80.360,53.160,46.840,78.44,224,0.875,bilinear,-75.800,-45.630,-75
senet154.gluon_in1k,19.320,80.680,47.600,52.400,115.09,224,0.875,bicubic,-75.600,-51.160,+18
rexnet_200.nav_in1k,19.227,80.773,52.640,47.360,16.37,224,0.875,bicubic,-75.713,-46.370,+12
levit_conv_256.fb_dist_in1k,19.187,80.813,50.067,49.933,18.89,224,0.900,bicubic,-75.833,-48.923,+1
levit_256.fb_dist_in1k,19.173,80.827,50.107,49.893,18.89,224,0.900,bicubic,-75.847,-48.823,-1
lambda_resnet50ts.a1h_in1k,19.147,80.853,49.240,50.760,21.54,256,0.950,bicubic,-75.623,-49.450,+37
repvgg_b3.rvgg_in1k,19.067,80.933,50.293,49.707,123.09,224,0.875,bilinear,-75.483,-48.417,+79
mixer_b16_224.miil_in21k_ft_in1k,19.040,80.960,51.213,48.787,59.88,224,0.875,bilinear,-76.260,-47.667,-56
legacy_senet154.in1k,19.040,80.960,47.947,52.053,115.09,224,0.875,bilinear,-76.030,-50.883,-11
resnet50d.a1_in1k,18.960,81.040,48.813,51.187,25.58,288,1.000,bicubic,-75.770,-49.937,+44
seresnext101_64x4d.gluon_in1k,18.933,81.067,49.213,50.787,88.23,224,0.875,bicubic,-75.997,-49.617,+6
deit_small_patch16_224.fb_in1k,18.893,81.107,51.400,48.600,22.05,224,0.900,bicubic,-75.457,-47.290,+105
mobilevitv2_200.cvnets_in1k,18.893,81.107,50.520,49.480,18.45,256,0.888,bicubic,-75.947,-48.010,+23
gcvit_xxtiny.in1k,18.773,81.227,53.347,46.653,12.00,224,0.875,bicubic,-75.647,-45.543,+95
edgenext_small.usi_in1k,18.680,81.320,53.600,46.400,5.59,320,1.000,bicubic,-76.720,-45.270,-77
tf_efficientnet_b1.ns_jft_in1k,18.653,81.347,51.733,48.267,7.79,240,0.882,bicubic,-76.507,-47.217,-42
regnetx_080.tv2_in1k,18.573,81.427,50.333,49.667,39.57,224,0.965,bicubic,-76.527,-48.497,-24
ecaresnet50t.a2_in1k,18.533,81.467,48.773,51.227,25.57,288,1.000,bicubic,-76.817,-50.087,-73
poolformer_s36.sail_in1k,18.493,81.507,52.187,47.813,30.86,224,0.900,bicubic,-76.587,-46.723,-23
repvit_m3.dist_in1k,18.480,81.520,51.867,48.133,10.68,224,0.950,bicubic,-76.720,-46.953,-52
wide_resnet50_2.tv2_in1k,18.467,81.533,52.467,47.533,68.88,224,0.965,bilinear,-76.403,-46.443,+8
cait_xxs36_224.fb_dist_in1k,18.307,81.693,49.480,50.520,17.30,224,1.000,bicubic,-75.963,-49.230,+106
cs3darknet_l.c2ns_in1k,18.293,81.707,51.840,48.160,21.16,288,0.950,bicubic,-76.827,-47.140,-35
seresnet50.ra2_in1k,18.293,81.707,51.253,48.747,28.09,288,0.950,bicubic,-77.037,-47.757,-76
sehalonet33ts.ra2_in1k,18.227,81.773,47.733,52.267,13.69,256,0.940,bicubic,-76.533,-50.837,+21
ese_vovnet39b.ra_in1k,18.200,81.800,49.880,50.120,24.57,288,0.950,bicubic,-76.670,-49.060,+4
tf_efficientnet_lite4.in1k,18.120,81.880,50.707,49.293,13.01,380,0.920,bilinear,-76.760,-48.183,0
vit_tiny_patch16_384.augreg_in21k_ft_in1k,18.027,81.973,50.307,49.693,5.79,384,1.000,bicubic,-75.623,-48.283,+177
tiny_vit_5m_224.in1k,18.000,82.000,50.173,49.827,5.39,224,0.950,bicubic,-76.240,-48.287,+100
gcresnet50t.ra2_in1k,17.880,82.120,49.413,50.587,25.90,288,1.000,bicubic,-77.360,-49.567,-68
resnet50d.ra2_in1k,17.853,82.147,50.240,49.760,25.58,288,0.950,bicubic,-77.157,-48.670,-23
mobilevitv2_175.cvnets_in1k,17.787,82.213,49.747,50.253,14.25,256,0.888,bicubic,-77.103,-49.113,-8
resnest50d_1s4x24d.in1k,17.640,82.360,49.760,50.240,25.68,224,0.875,bicubic,-77.090,-49.220,+16
resnetv2_50.a1h_in1k,17.587,82.413,49.813,50.187,25.55,288,1.000,bicubic,-77.263,-49.057,+1
resnet50.c2_in1k,17.533,82.467,49.760,50.240,25.56,288,1.000,bicubic,-77.387,-49.050,-16
resnest50d.in1k,17.360,82.640,50.733,49.267,27.48,224,0.875,bilinear,-77.490,-48.147,-2
convnext_pico.d1_in1k,17.347,82.653,50.213,49.787,9.05,288,0.950,bicubic,-77.413,-48.717,+8
seresnext101_32x4d.gluon_in1k,17.333,82.667,46.373,53.627,48.96,224,0.875,bicubic,-77.557,-52.447,-12
efficientnet_el.ra_in1k,17.320,82.680,50.027,49.973,10.59,300,0.904,bicubic,-77.800,-48.943,-50
inception_v4.tf_in1k,17.293,82.707,45.867,54.133,42.68,299,0.875,bicubic,-77.077,-52.763,+75
tf_efficientnet_b3.ap_in1k,17.227,82.773,49.680,50.320,12.23,300,0.904,bicubic,-78.093,-49.220,-92
gcresnext50ts.ch_in1k,17.173,82.827,48.107,51.893,15.67,288,1.000,bicubic,-77.687,-50.753,-11
xcit_tiny_24_p16_224.fb_dist_in1k,17.160,82.840,47.507,52.493,12.12,224,1.000,bicubic,-77.380,-51.293,+49
fastvit_s12.apple_dist_in1k,17.120,82.880,49.400,50.600,9.47,256,0.900,bicubic,-77.710,-49.370,-6
regnetx_032.tv2_in1k,17.067,82.933,48.107,51.893,15.30,224,0.965,bicubic,-77.593,-50.693,+18
xception71.tf_in1k,17.027,82.973,45.547,54.453,42.34,299,0.903,bicubic,-77.273,-53.103,+80
regnety_016.tv2_in1k,16.973,83.027,49.853,50.147,11.20,224,0.965,bicubic,-77.547,-48.967,+48
tf_efficientnet_b3.aa_in1k,16.973,83.027,49.280,50.720,12.23,300,0.904,bicubic,-78.037,-49.750,-39
cs3darknet_focus_l.c2ns_in1k,16.960,83.040,50.493,49.507,21.15,288,0.950,bicubic,-78.190,-48.437,-72
convnextv2_pico.fcmae_ft_in1k,16.907,83.093,50.307,49.693,9.07,288,0.950,bicubic,-78.323,-48.613,-86
resmlp_36_224.fb_distilled_in1k,16.880,83.120,51.493,48.507,44.69,224,0.875,bicubic,-78.010,-47.377,-26
resnext101_64x4d.gluon_in1k,16.867,83.133,44.147,55.853,83.46,224,0.875,bicubic,-77.803,-54.513,+10
poolformerv2_s36.sail_in1k,16.680,83.320,49.600,50.400,30.79,224,1.000,bicubic,-78.630,-49.320,-102
resnet152d.gluon_in1k,16.627,83.373,44.213,55.787,60.21,224,0.875,bicubic,-78.103,-54.277,-2
tf_efficientnetv2_b3.in1k,16.600,83.400,48.680,51.320,14.36,300,0.904,bicubic,-78.560,-50.140,-79
gmlp_s16_224.ra3_in1k,16.547,83.453,45.080,54.920,19.42,224,0.875,bicubic,-77.603,-53.590,+90
resnet152s.gluon_in1k,16.547,83.453,44.507,55.493,60.32,224,0.875,bicubic,-78.473,-54.373,-54
tresnet_l.miil_in1k,16.533,83.467,49.893,50.107,55.99,224,0.875,bilinear,-78.747,-49.067,-102
inception_resnet_v2.tf_in1k,16.520,83.480,44.973,55.027,55.84,299,0.897,bicubic,-78.060,-53.817,+22
convnext_pico_ols.d1_in1k,16.507,83.493,49.707,50.293,9.06,288,1.000,bicubic,-78.113,-49.153,+16
seresnet50.a1_in1k,16.493,83.507,46.760,53.240,28.09,288,1.000,bicubic,-78.167,-52.020,+5
resmlp_24_224.fb_distilled_in1k,16.453,83.547,50.347,49.653,30.02,224,0.875,bicubic,-77.997,-48.423,+44
mobilevitv2_150.cvnets_in1k,16.453,83.547,48.453,51.547,10.59,256,0.888,bicubic,-78.097,-50.327,+25
resnet101.tv2_in1k,16.400,83.600,48.707,51.293,44.55,224,0.965,bilinear,-78.880,-50.303,-106
gernet_l.idstcv_in1k,16.320,83.680,47.227,52.773,31.08,256,0.875,bilinear,-78.790,-51.753,-73
repvit_m1_1.dist_450e_in1k,16.280,83.720,49.680,50.320,8.80,224,0.950,bicubic,-78.620,-49.280,-46
fastvit_sa12.apple_in1k,16.280,83.720,49.653,50.347,11.58,256,0.900,bicubic,-78.600,-49.367,-38
inception_resnet_v2.tf_ens_adv_in1k,16.280,83.720,43.640,56.360,55.84,299,0.897,bicubic,-77.890,-54.900,+73
repvgg_b3g4.rvgg_in1k,16.227,83.773,47.653,52.347,83.83,224,0.875,bilinear,-78.283,-51.317,+28
xcit_tiny_24_p16_224.fb_in1k,16.227,83.773,46.000,54.000,12.12,224,1.000,bicubic,-77.853,-52.540,+86
resnet50.a1_in1k,16.040,83.960,45.773,54.227,25.56,288,1.000,bicubic,-78.700,-52.907,-24
xception65.tf_in1k,16.040,83.960,43.760,56.240,39.92,299,0.903,bicubic,-77.740,-54.610,+119
resnet50.c1_in1k,16.000,84.000,47.387,52.613,25.56,288,1.000,bicubic,-78.730,-51.433,-23
ecaresnet50d_pruned.miil_in1k,15.987,84.013,49.707,50.293,19.94,288,0.950,bicubic,-79.123,-49.193,-83
edgenext_small_rw.sw_in1k,15.987,84.013,49.667,50.333,7.83,320,1.000,bicubic,-78.683,-49.113,-14
resnet50.fb_ssl_yfcc100m_ft_in1k,15.920,84.080,49.413,50.587,25.56,224,0.875,bilinear,-78.530,-49.327,+30
resnext50_32x4d.ra_in1k,15.867,84.133,47.227,52.773,25.03,288,0.950,bicubic,-78.833,-51.533,-20
fastvit_t12.apple_dist_in1k,15.640,84.360,47.680,52.320,7.55,256,0.900,bicubic,-78.950,-51.110,+2
resnet50.d_in1k,15.640,84.360,45.160,54.840,25.56,288,1.000,bicubic,-79.340,-53.680,-67
regnety_320.pycls_in1k,15.627,84.373,44.760,55.240,145.05,224,0.875,bicubic,-78.913,-54.030,+9
vit_base_patch32_384.augreg_in1k,15.600,84.400,44.147,55.853,88.30,384,1.000,bicubic,-78.040,-54.093,+127
convmixer_768_32.in1k,15.507,84.493,47.907,52.093,21.11,224,0.960,bicubic,-78.993,-50.953,+17
ecaresnet26t.ra2_in1k,15.453,84.547,47.960,52.040,16.01,320,0.950,bicubic,-78.867,-50.760,+38
resnext50d_32x4d.bt_in1k,15.400,84.600,46.187,53.813,25.05,288,0.950,bicubic,-79.150,-52.503,+3
coat_tiny.in1k,15.400,84.600,45.587,54.413,5.50,224,0.900,bicubic,-78.180,-52.833,+129
resnet50d.a2_in1k,15.387,84.613,44.813,55.187,25.58,288,1.000,bicubic,-79.193,-53.877,-4
skresnext50_32x4d.ra_in1k,15.373,84.627,44.547,55.453,27.48,224,0.875,bicubic,-78.867,-54.143,+43
resnext50_32x4d.a2_in1k,15.293,84.707,45.253,54.747,25.03,288,1.000,bicubic,-79.287,-53.397,-5
efficientvit_b1.r288_in1k,15.280,84.720,46.600,53.400,9.10,288,1.000,bicubic,-79.210,-51.920,+11
seresnet33ts.ra2_in1k,15.267,84.733,47.373,52.627,19.78,288,1.000,bicubic,-79.773,-51.527,-91
vit_relpos_base_patch32_plus_rpn_256.sw_in1k,15.240,84.760,42.640,57.360,119.42,256,0.900,bicubic,-78.500,-55.900,+107
cait_xxs24_224.fb_dist_in1k,15.160,84.840,44.893,55.107,11.96,224,1.000,bicubic,-78.450,-53.567,+119
eca_resnet33ts.ra2_in1k,15.053,84.947,49.013,50.987,19.68,288,1.000,bicubic,-79.567,-49.747,-21
vit_base_patch16_224.augreg_in1k,15.013,84.987,42.027,57.973,86.57,224,0.900,bicubic,-78.627,-56.373,+115
fastvit_s12.apple_in1k,14.920,85.080,45.320,54.680,9.47,256,0.900,bicubic,-79.210,-53.430,+56
repvit_m1_0.dist_450e_in1k,14.907,85.093,46.827,53.173,7.30,224,0.950,bicubic,-79.623,-52.053,-3
levit_conv_192.fb_dist_in1k,14.907,85.093,44.933,55.067,10.95,224,0.900,bicubic,-79.263,-53.747,+46
levit_192.fb_dist_in1k,14.880,85.120,44.933,55.067,10.95,224,0.900,bicubic,-79.290,-53.607,+43
seresnet50.a2_in1k,14.840,85.160,44.400,55.600,28.09,288,1.000,bicubic,-79.820,-54.320,-34
poolformerv2_s24.sail_in1k,14.800,85.200,45.920,54.080,21.34,224,1.000,bicubic,-79.850,-52.920,-33
repvit_m1_1.dist_300e_in1k,14.760,85.240,47.227,52.773,8.80,224,0.950,bicubic,-80.000,-51.483,-57
rexnet_150.nav_in1k,14.733,85.267,46.880,53.120,9.73,224,0.875,bicubic,-79.747,-51.920,0
darknet53.c2ns_in1k,14.680,85.320,47.107,52.893,41.61,288,1.000,bicubic,-79.940,-51.793,-30
gcresnet33ts.ra2_in1k,14.667,85.333,46.320,53.680,19.88,288,1.000,bicubic,-80.253,-52.490,-86
resnet50.tv2_in1k,14.640,85.360,46.973,53.027,25.56,224,0.965,bilinear,-80.000,-51.827,-36
res2net50d.in1k,14.627,85.373,44.480,55.520,25.72,224,0.875,bilinear,-79.693,-54.160,+18
darknetaa53.c2ns_in1k,14.560,85.440,45.427,54.573,36.02,288,1.000,bilinear,-79.910,-53.393,-4
coat_lite_mini.in1k,14.560,85.440,44.493,55.507,11.01,224,0.900,bicubic,-79.480,-54.057,+52
efficientnet_el_pruned.in1k,14.480,85.520,46.107,53.893,10.59,300,0.904,bicubic,-79.910,-52.633,+3
efficientnet_b2.ra_in1k,14.440,85.560,46.053,53.947,9.11,288,1.000,bicubic,-80.180,-52.727,-33
poolformer_s24.sail_in1k,14.267,85.733,47.360,52.640,21.39,224,0.900,bicubic,-80.293,-51.540,-26
legacy_seresnext101_32x4d.in1k,14.173,85.827,43.000,57.000,48.96,224,0.875,bilinear,-80.197,-55.580,+2
fbnetv3_d.ra2_in1k,14.093,85.907,46.467,53.533,10.31,256,0.950,bilinear,-79.827,-52.153,+59
gernet_m.idstcv_in1k,14.080,85.920,46.080,53.920,21.14,224,0.875,bilinear,-80.540,-52.830,-40
pvt_v2_b1.in1k,14.027,85.973,47.733,52.267,14.01,224,0.900,bicubic,-79.773,-50.927,+73
repvit_m2.dist_in1k,14.013,85.987,46.373,53.627,8.80,224,0.950,bicubic,-80.727,-52.337,-69
mobilevitv2_125.cvnets_in1k,14.013,85.987,45.027,54.973,7.48,256,0.888,bicubic,-79.947,-53.523,+53
dpn68b.ra_in1k,13.893,86.107,40.307,59.693,12.61,288,1.000,bicubic,-80.107,-58.193,+47
resnext101_32x4d.gluon_in1k,13.853,86.147,41.640,58.360,44.18,224,0.875,bicubic,-80.687,-57.140,-26
seresnext50_32x4d.gluon_in1k,13.627,86.373,43.773,56.227,27.56,224,0.875,bicubic,-80.703,-54.837,0
fastvit_t12.apple_in1k,13.613,86.387,43.307,56.693,7.55,256,0.900,bicubic,-80.307,-55.303,+54
pit_xs_distilled_224.in1k,13.573,86.427,45.173,54.827,11.00,224,0.900,bicubic,-80.207,-53.327,+67
resnet152.a3_in1k,13.547,86.453,43.387,56.613,60.19,224,0.950,bicubic,-81.183,-55.293,-70
resmlp_36_224.fb_in1k,13.467,86.533,46.667,53.333,44.69,224,0.875,bicubic,-80.723,-51.943,+11
repvgg_b2g4.rvgg_in1k,13.467,86.533,43.853,56.147,61.76,224,0.875,bilinear,-80.413,-54.807,+57
efficientformerv2_s1.snap_dist_in1k,13.453,86.547,42.933,57.067,6.19,224,0.950,bicubic,-80.747,-55.707,+9
vit_small_patch16_224.augreg_in1k,13.387,86.613,41.427,58.573,22.05,224,0.900,bicubic,-80.503,-57.013,+52
eca_botnext26ts_256.c1_in1k,13.320,86.680,42.133,57.867,10.59,256,0.950,bicubic,-80.460,-56.487,+62
visformer_tiny.in1k,13.307,86.693,43.933,56.067,10.32,224,0.900,bicubic,-80.253,-54.457,+86
regnetx_320.pycls_in1k,13.293,86.707,40.747,59.253,107.81,224,0.875,bicubic,-81.157,-58.173,-22
resnet101d.gluon_in1k,13.200,86.800,41.547,58.453,44.57,224,0.875,bicubic,-81.030,-56.993,+1
efficientnet_b3_pruned.in1k,13.160,86.840,45.200,54.800,9.86,300,0.904,bicubic,-81.470,-53.560,-62
resnet50.b1k_in1k,13.093,86.907,43.947,56.053,25.56,288,1.000,bicubic,-81.767,-54.863,-101
mixnet_xl.ra_in1k,13.080,86.920,43.213,56.787,11.90,224,0.875,bicubic,-81.100,-55.107,+4
cspresnext50.ra_in1k,13.053,86.947,44.973,55.027,20.57,256,0.887,bilinear,-81.777,-53.837,-96
efficientformer_l1.snap_dist_in1k,13.013,86.987,45.600,54.400,12.29,224,0.950,bicubic,-81.467,-53.230,-35
regnetx_016.tv2_in1k,13.000,87.000,45.427,54.573,9.19,224,0.965,bicubic,-81.130,-53.193,+13
repvit_m1_0.dist_300e_in1k,13.000,87.000,44.413,55.587,7.30,224,0.950,bicubic,-81.300,-54.437,-13
nf_regnet_b1.ra2_in1k,12.947,87.053,44.373,55.627,10.22,288,0.900,bicubic,-81.163,-54.247,+12
eca_halonext26ts.c1_in1k,12.947,87.053,42.813,57.187,10.76,256,0.940,bicubic,-81.093,-55.677,+21
mobilevit_s.cvnets_in1k,12.907,87.093,40.760,59.240,5.58,256,0.900,bicubic,-80.253,-57.560,+120
pit_xs_224.in1k,12.827,87.173,42.813,57.187,10.62,224,0.900,bicubic,-80.273,-55.517,+126
tf_efficientnet_b3.in1k,12.787,87.213,43.627,56.373,12.23,300,0.904,bicubic,-81.753,-55.223,-53
resnet50.b2k_in1k,12.760,87.240,44.133,55.867,25.56,288,1.000,bicubic,-81.970,-54.797,-93
resnext50_32x4d.tv2_in1k,12.693,87.307,43.093,56.907,25.03,224,0.965,bilinear,-81.927,-55.617,-70
inception_v3.gluon_in1k,12.640,87.360,40.427,59.573,23.83,299,0.875,bicubic,-80.830,-58.143,+83
tresnet_m.miil_in1k,12.613,87.387,41.907,58.093,31.39,224,0.875,bilinear,-82.007,-56.643,-69
resnet50.a1h_in1k,12.587,87.413,44.240,55.760,25.56,224,1.000,bicubic,-82.183,-54.230,-106
crossvit_9_dagger_240.in1k,12.560,87.440,41.720,58.280,8.78,240,0.875,bicubic,-80.330,-56.520,+140
efficientvit_b1.r256_in1k,12.547,87.453,42.120,57.880,9.10,256,1.000,bicubic,-81.543,-56.240,+5
resnetblur50.bt_in1k,12.493,87.507,44.160,55.840,25.56,288,0.950,bicubic,-81.967,-54.680,-47
resmlp_24_224.fb_in1k,12.493,87.507,43.413,56.587,30.02,224,0.875,bicubic,-81.537,-55.247,+12
convnext_femto_ols.d1_in1k,12.480,87.520,43.933,56.067,5.23,288,0.950,bicubic,-81.440,-54.667,+20
coat_lite_tiny.in1k,12.467,87.533,41.053,58.947,5.72,224,0.900,bicubic,-80.773,-57.207,+101
efficientnet_em.ra2_in1k,12.400,87.600,43.933,56.067,6.90,240,0.882,bicubic,-81.430,-54.887,+28
regnety_120.pycls_in1k,12.400,87.600,42.173,57.827,51.82,224,0.875,bicubic,-82.070,-56.597,-53
regnety_160.pycls_in1k,12.227,87.773,41.387,58.613,83.59,224,0.875,bicubic,-82.133,-57.473,-41
resnet50.a2_in1k,12.173,87.827,40.453,59.547,25.56,288,1.000,bicubic,-82.457,-58.207,-87
ecaresnet50t.a3_in1k,12.147,87.853,41.573,58.427,25.57,224,0.950,bicubic,-82.203,-57.097,-41
hrnet_w64.ms_in1k,12.013,87.987,40.827,59.173,128.06,224,0.875,bilinear,-82.017,-57.503,+2
cspdarknet53.ra_in1k,11.960,88.040,43.267,56.733,27.64,256,0.887,bilinear,-82.700,-55.583,-97
xcit_tiny_12_p16_224.fb_dist_in1k,11.933,88.067,40.133,59.867,6.72,224,1.000,bicubic,-81.477,-58.377,+74
resnet101s.gluon_in1k,11.893,88.107,40.947,59.053,44.67,224,0.875,bicubic,-82.827,-57.873,-108
resnet101.a3_in1k,11.867,88.133,40.840,59.160,44.55,224,0.950,bicubic,-82.163,-57.750,-4
gmixer_24_224.ra3_in1k,11.867,88.133,37.800,62.200,24.72,224,0.875,bicubic,-80.963,-60.080,+128
nf_resnet50.ra2_in1k,11.773,88.227,45.907,54.093,25.56,288,0.940,bicubic,-82.767,-52.723,-75
fbnetv3_b.ra2_in1k,11.760,88.240,44.400,55.600,8.60,256,0.950,bilinear,-82.210,-54.090,-1
dpn92.mx_in1k,11.640,88.360,40.160,59.840,37.67,224,0.875,bicubic,-82.650,-58.590,-41
botnet26t_256.c1_in1k,11.613,88.387,40.107,59.893,12.49,256,0.950,bicubic,-81.897,-58.213,+53
convnextv2_femto.fcmae_ft_in1k,11.600,88.400,40.800,59.200,5.23,288,0.950,bicubic,-82.590,-57.860,-33
dla102x2.in1k,11.560,88.440,41.293,58.707,41.28,224,0.875,bilinear,-82.410,-57.227,-3
xception41.tf_in1k,11.560,88.440,39.067,60.933,26.97,299,0.903,bicubic,-81.870,-59.153,+61
vit_small_patch32_224.augreg_in21k_ft_in1k,11.507,88.493,39.547,60.453,22.88,224,0.900,bicubic,-80.533,-58.743,+176
efficientvit_b1.r224_in1k,11.493,88.507,40.200,59.800,9.10,224,0.950,bicubic,-82.017,-58.100,+47
levit_128.fb_dist_in1k,11.440,88.560,40.187,59.813,9.21,224,0.900,bicubic,-81.890,-58.333,+69
levit_conv_128.fb_dist_in1k,11.440,88.560,40.173,59.827,9.21,224,0.900,bicubic,-81.910,-58.197,+66
lambda_resnet26t.c1_in1k,11.387,88.613,40.187,59.813,10.96,256,0.940,bicubic,-82.443,-58.463,+8
seresnext26t_32x4d.bt_in1k,11.373,88.627,41.107,58.893,16.81,288,0.950,bicubic,-82.187,-57.383,+37
regnety_080.pycls_in1k,11.373,88.627,40.613,59.387,39.18,224,0.875,bicubic,-82.797,-57.987,-39
efficientnet_b2_pruned.in1k,11.333,88.667,42.040,57.960,8.31,260,0.890,bicubic,-82.807,-56.670,-31
resnet50.ra_in1k,11.333,88.667,41.013,58.987,25.56,288,0.950,bicubic,-82.877,-57.607,-48
tf_efficientnet_el.in1k,11.320,88.680,42.053,57.947,10.59,300,0.904,bicubic,-83.080,-56.657,-71
xcit_nano_12_p16_384.fb_dist_in1k,11.227,88.773,39.853,60.147,3.05,384,1.000,bicubic,-80.603,-58.167,+180
convnext_femto.d1_in1k,11.213,88.787,42.773,57.227,5.22,288,0.950,bicubic,-82.717,-55.747,-13
resnet152c.gluon_in1k,11.107,88.893,37.133,62.867,60.21,224,0.875,bicubic,-83.053,-61.457,-42
hrnet_w48.ms_in1k,11.093,88.907,40.307,59.693,77.47,224,0.875,bilinear,-82.827,-58.433,-13
vit_tiny_r_s16_p8_384.augreg_in21k_ft_in1k,11.093,88.907,39.933,60.067,6.36,384,1.000,bicubic,-80.947,-58.297,+160
halonet26t.a1h_in1k,11.093,88.907,38.800,61.200,12.48,256,0.950,bicubic,-82.907,-59.540,-22
mobilevitv2_100.cvnets_in1k,11.067,88.933,40.613,59.387,4.90,256,0.888,bicubic,-82.233,-57.667,+63
tf_efficientnet_b0.ns_jft_in1k,11.000,89.000,40.080,59.920,5.29,224,0.875,bicubic,-82.620,-58.560,+19
tf_efficientnetv2_b2.in1k,11.000,89.000,39.747,60.253,10.10,260,0.890,bicubic,-83.420,-58.833,-82
inception_v3.tf_adv_in1k,11.000,89.000,36.720,63.280,23.83,299,0.875,bicubic,-81.900,-61.420,+98
seresnext26d_32x4d.bt_in1k,10.987,89.013,41.347,58.653,16.81,288,0.950,bicubic,-82.453,-56.983,+39
xcit_tiny_12_p16_224.fb_in1k,10.987,89.013,37.067,62.933,6.72,224,1.000,bicubic,-81.523,-61.173,+126
regnety_008_tv.tv2_in1k,10.840,89.160,40.533,59.467,6.43,224,0.965,bicubic,-82.850,-57.957,+6
resnet34d.ra2_in1k,10.827,89.173,38.653,61.347,21.82,288,0.950,bicubic,-82.813,-59.887,+8
dpn107.mx_in1k,10.827,89.173,38.307,61.693,86.92,224,0.875,bicubic,-83.513,-60.193,-77
inception_v3.tf_in1k,10.827,89.173,36.853,63.147,23.83,299,0.875,bicubic,-82.493,-61.527,+51
mobileone_s4.apple_in1k,10.787,89.213,38.480,61.520,14.95,224,0.900,bilinear,-82.953,-59.590,-2
xcit_nano_12_p8_224.fb_dist_in1k,10.773,89.227,38.120,61.880,3.05,224,1.000,bicubic,-81.307,-59.790,+144
densenetblur121d.ra_in1k,10.547,89.453,39.707,60.293,8.00,288,0.950,bicubic,-82.073,-58.553,+109
tf_efficientnet_b2.ap_in1k,10.533,89.467,40.133,59.867,9.11,260,0.890,bicubic,-83.977,-58.487,-105
dpn131.mx_in1k,10.533,89.467,36.787,63.213,79.25,224,0.875,bicubic,-83.517,-61.923,-44
rexnet_130.nav_in1k,10.413,89.587,41.547,58.453,7.56,224,0.875,bicubic,-83.487,-56.853,-26
repvit_m0_9.dist_450e_in1k,10.400,89.600,40.120,59.880,5.49,224,0.950,bicubic,-83.200,-58.380,+7
hrnet_w44.ms_in1k,10.307,89.693,39.493,60.507,67.06,224,0.875,bilinear,-83.243,-59.107,+11
xcit_nano_12_p8_224.fb_in1k,10.307,89.693,36.973,63.027,3.05,224,1.000,bicubic,-80.703,-60.797,+180
resnext50_32x4d.a3_in1k,10.267,89.733,38.200,61.800,25.03,224,0.950,bicubic,-83.393,-60.320,-4
lambda_resnet26rpt_256.c1_in1k,10.227,89.773,38.133,61.867,10.99,256,0.940,bicubic,-83.483,-60.387,-9
resnext101_32x8d.tv_in1k,10.173,89.827,37.747,62.253,88.79,224,0.875,bilinear,-83.647,-60.833,-24
regnetx_160.pycls_in1k,10.133,89.867,38.053,61.947,54.28,224,0.875,bicubic,-84.007,-60.467,-64
legacy_seresnext50_32x4d.in1k,10.093,89.907,39.213,60.787,27.56,224,0.875,bilinear,-83.657,-59.367,-20
resnetrs50.tf_in1k,10.053,89.947,37.573,62.427,35.69,224,0.910,bicubic,-84.267,-61.067,-90
dpn98.mx_in1k,10.013,89.987,36.173,63.827,61.57,224,0.875,bicubic,-84.147,-62.467,-70
inception_v3.tv_in1k,10.013,89.987,35.227,64.773,23.83,299,0.875,bicubic,-82.717,-62.743,+88
efficientnet_b1.ft_in1k,10.000,90.000,37.600,62.400,7.79,256,1.000,bicubic,-83.250,-60.690,+38
legacy_xception.tf_in1k,9.987,90.013,37.987,62.013,22.86,299,0.897,bicubic,-83.473,-60.543,+13
resnet33ts.ra2_in1k,9.947,90.053,39.840,60.160,19.68,288,1.000,bicubic,-84.153,-58.810,-64
regnety_064.pycls_in1k,9.933,90.067,39.093,60.907,30.58,224,0.875,bicubic,-84.207,-59.657,-71
resnet152.gluon_in1k,9.733,90.267,36.093,63.907,60.19,224,0.875,bicubic,-84.337,-62.367,-63
tf_efficientnet_lite3.in1k,9.680,90.320,39.013,60.987,8.20,300,0.904,bilinear,-84.520,-59.617,-87
tf_efficientnet_b2.aa_in1k,9.667,90.333,38.893,61.107,9.11,260,0.890,bicubic,-84.713,-59.717,-109
tf_efficientnet_cc_b1_8e.in1k,9.573,90.427,36.840,63.160,39.72,240,0.882,bicubic,-84.347,-61.420,-47
res2net101_26w_4s.in1k,9.507,90.493,35.093,64.907,45.21,224,0.875,bilinear,-84.213,-63.217,-25
resnet50.ram_in1k,9.480,90.520,35.507,64.493,25.56,288,0.950,bicubic,-85.040,-63.143,-129
legacy_seresnet152.in1k,9.333,90.667,37.373,62.627,66.82,224,0.875,bilinear,-84.047,-60.967,+13
cspresnet50.ra_in1k,9.293,90.707,39.613,60.387,21.62,256,0.887,bilinear,-84.447,-59.027,-32
repvit_m0_9.dist_300e_in1k,9.293,90.707,38.840,61.160,5.49,224,0.950,bicubic,-84.147,-59.870,+3
resnet34.a1_in1k,9.267,90.733,34.947,65.053,21.80,288,1.000,bicubic,-83.833,-63.363,+38
hrnet_w40.ms_in1k,9.240,90.760,36.920,63.080,57.56,224,0.875,bilinear,-84.260,-61.620,-6
resnet32ts.ra2_in1k,9.213,90.787,38.600,61.400,17.96,288,1.000,bicubic,-84.617,-60.040,-48
regnetx_120.pycls_in1k,9.213,90.787,37.200,62.800,46.11,224,0.875,bicubic,-85.017,-61.470,-100
crossvit_tiny_240.in1k,9.133,90.867,34.600,65.400,7.01,240,0.875,bicubic,-81.097,-62.990,+179
resnest26d.gluon_in1k,9.053,90.947,37.840,62.160,17.07,224,0.875,bilinear,-84.267,-60.520,+11
vit_tiny_patch16_224.augreg_in21k_ft_in1k,9.053,90.947,34.627,65.373,5.72,224,0.900,bicubic,-82.717,-63.413,+130
resnet50d.a3_in1k,9.040,90.960,37.307,62.693,25.58,224,0.950,bicubic,-84.440,-61.143,-8
gcresnext26ts.ch_in1k,8.987,91.013,36.920,63.080,10.48,288,1.000,bicubic,-84.173,-61.450,+26
vit_base_patch16_224.sam_in1k,8.987,91.013,36.133,63.867,86.57,224,0.900,bicubic,-85.163,-62.367,-93
regnety_040.pycls_in1k,8.933,91.067,37.067,62.933,20.65,224,0.875,bicubic,-84.947,-61.453,-59
resnext50_32x4d.gluon_in1k,8.933,91.067,36.293,63.707,25.03,224,0.875,bicubic,-84.877,-62.127,-53
rexnet_100.nav_in1k,8.893,91.107,36.413,63.587,4.80,224,0.875,bicubic,-84.127,-61.777,+33
mixnet_l.ft_in1k,8.893,91.107,36.240,63.760,7.33,224,0.875,bicubic,-84.537,-62.190,-8
efficientvit_m5.r224_in1k,8.893,91.107,34.587,65.413,12.47,224,0.875,bicubic,-83.567,-63.403,+82
bat_resnext26ts.ch_in1k,8.880,91.120,36.453,63.547,10.73,256,0.900,bicubic,-84.440,-62.167,+3
convit_tiny.fb_in1k,8.867,91.133,34.307,65.693,5.71,224,0.875,bicubic,-81.793,-63.423,+156
mobilenetv3_large_100.miil_in21k_ft_in1k,8.853,91.147,33.080,66.920,5.48,224,0.875,bilinear,-83.407,-64.540,+90
hrnet_w18.ms_aug_in1k,8.747,91.253,38.787,61.213,21.30,224,0.950,bilinear,-84.803,-59.913,-29
resnet50.bt_in1k,8.653,91.347,38.720,61.280,25.56,288,0.950,bicubic,-85.667,-59.810,-127
levit_conv_128s.fb_dist_in1k,8.653,91.347,33.093,66.907,7.78,224,0.900,bicubic,-83.307,-64.967,+107
dla169.in1k,8.640,91.360,36.000,64.000,53.39,224,0.875,bilinear,-84.710,-62.600,-10
levit_128s.fb_dist_in1k,8.640,91.360,33.067,66.933,7.78,224,0.900,bicubic,-83.320,-64.823,+103
mixer_b16_224.goog_in21k_ft_in1k,8.627,91.373,29.413,70.587,59.88,224,0.875,bicubic,-83.253,-68.627,+109
repvit_m1.dist_in1k,8.613,91.387,37.293,62.707,5.49,224,0.950,bicubic,-84.677,-61.147,0
hrnet_w30.ms_in1k,8.587,91.413,37.067,62.933,37.71,224,0.875,bilinear,-84.613,-61.423,+3
eca_resnext26ts.ch_in1k,8.560,91.440,36.827,63.173,10.30,288,1.000,bicubic,-84.500,-61.573,+17
ghostnetv2_160.in1k,8.560,91.440,36.627,63.373,12.39,224,0.875,bicubic,-84.430,-61.603,+23
legacy_seresnet101.in1k,8.533,91.467,35.960,64.040,49.33,224,0.875,bilinear,-84.777,-62.560,-8
convnext_atto_ols.a2_in1k,8.533,91.467,35.000,65.000,3.70,288,0.950,bicubic,-84.557,-63.470,+13
tf_efficientnet_b2.in1k,8.520,91.480,36.520,63.480,9.11,260,0.890,bicubic,-85.590,-61.930,-106
tf_efficientnet_b1.ap_in1k,8.453,91.547,35.240,64.760,7.79,240,0.882,bicubic,-85.227,-63.120,-58
repvgg_b2.rvgg_in1k,8.440,91.560,36.480,63.520,89.02,224,0.875,bilinear,-85.060,-62.090,-38
ese_vovnet19b_dw.ra_in1k,8.307,91.693,36.973,63.027,6.54,288,0.950,bicubic,-84.843,-61.277,+2
resmlp_12_224.fb_distilled_in1k,8.307,91.693,36.853,63.147,15.35,224,0.875,bicubic,-84.513,-61.287,+31
crossvit_9_240.in1k,8.280,91.720,34.107,65.893,8.55,240,0.875,bicubic,-82.350,-63.623,+139
dla102x.in1k,8.187,91.813,37.067,62.933,26.31,224,0.875,bilinear,-85.303,-61.433,-39
seresnext26ts.ch_in1k,8.147,91.853,36.093,63.907,10.39,288,1.000,bicubic,-84.813,-62.087,+17
hrnet_w32.ms_in1k,8.053,91.947,37.560,62.440,41.23,224,0.875,bilinear,-85.477,-60.890,-48
resnet101c.gluon_in1k,8.027,91.973,33.320,66.680,44.57,224,0.875,bicubic,-85.643,-65.100,-65
vit_base_patch32_224.augreg_in1k,7.987,92.013,30.453,69.547,88.22,224,0.900,bicubic,-83.203,-66.937,+113
cs3darknet_m.c2ns_in1k,7.960,92.040,36.507,63.493,9.31,288,0.950,bicubic,-85.390,-62.103,-29
poolformerv2_s12.sail_in1k,7.960,92.040,34.560,65.440,11.89,224,1.000,bicubic,-85.000,-63.800,+13
resnet50d.gluon_in1k,7.947,92.053,34.987,65.013,25.58,224,0.875,bicubic,-85.803,-63.403,-79
resnet26t.ra2_in1k,7.893,92.107,36.720,63.280,16.01,320,1.000,bicubic,-85.307,-61.690,-17
fastvit_t8.apple_dist_in1k,7.853,92.147,34.667,65.333,4.03,256,0.900,bicubic,-84.687,-63.503,+43
res2net50_26w_8s.in1k,7.853,92.147,33.707,66.293,48.40,224,0.875,bilinear,-85.537,-64.463,-37
dla60_res2next.in1k,7.840,92.160,34.960,65.040,17.03,224,0.875,bilinear,-85.350,-63.440,-18
repghostnet_200.in1k,7.800,92.200,37.200,62.800,9.80,224,0.875,bicubic,-85.700,-61.530,-52
mobilevitv2_075.cvnets_in1k,7.773,92.227,33.720,66.280,2.87,256,0.888,bicubic,-83.987,-64.060,+87
convnextv2_atto.fcmae_ft_in1k,7.773,92.227,32.907,67.093,3.71,288,0.950,bicubic,-85.197,-65.253,+4
mobilevit_xs.cvnets_in1k,7.733,92.267,32.520,67.480,2.32,256,0.900,bicubic,-83.087,-65.410,+114
tf_efficientnetv2_b1.in1k,7.720,92.280,34.613,65.387,8.14,240,0.882,bicubic,-86.230,-64.007,-111
deit_tiny_distilled_patch16_224.fb_in1k,7.693,92.307,33.507,66.493,5.91,224,0.900,bicubic,-83.017,-64.053,+116
regnety_032.pycls_in1k,7.680,92.320,34.280,65.720,19.44,224,0.875,bicubic,-85.730,-64.360,-48
efficientformerv2_s0.snap_dist_in1k,7.667,92.333,32.653,67.347,3.60,224,0.950,bicubic,-84.293,-65.407,+72
convnext_atto.d2_in1k,7.613,92.387,35.053,64.947,3.70,288,0.950,bicubic,-85.177,-63.097,+12
dla60_res2net.in1k,7.600,92.400,34.613,65.387,20.85,224,0.875,bilinear,-85.570,-63.817,-27
efficientnet_b1_pruned.in1k,7.480,92.520,34.480,65.520,6.33,240,0.882,bicubic,-85.300,-63.560,+11
regnetx_064.pycls_in1k,7.373,92.627,34.360,65.640,26.21,224,0.875,bicubic,-86.527,-64.280,-111
wide_resnet101_2.tv_in1k,7.360,92.640,34.147,65.853,126.89,224,0.875,bilinear,-86.380,-64.083,-93
densenet121.ra_in1k,7.333,92.667,35.480,64.520,7.98,288,0.950,bicubic,-85.187,-62.740,+28
deit_tiny_patch16_224.fb_in1k,7.293,92.707,30.680,69.320,5.72,224,0.900,bicubic,-82.367,-66.770,+134
regnetx_008.tv2_in1k,7.280,92.720,34.133,65.867,7.26,224,0.965,bicubic,-85.270,-64.047,+22
resnet50s.gluon_in1k,7.280,92.720,33.453,66.547,25.68,224,0.875,bicubic,-86.360,-65.007,-86
resnet101.gluon_in1k,7.267,92.733,32.773,67.227,44.55,224,0.875,bicubic,-86.483,-65.607,-102
resnet34.a2_in1k,7.267,92.733,31.813,68.187,21.80,288,1.000,bicubic,-85.453,-66.357,+10
edgenext_x_small.in1k,7.267,92.733,30.920,69.080,2.34,288,1.000,bicubic,-84.443,-66.680,+76
hardcorenas_e.miil_green_in1k,7.240,92.760,33.307,66.693,8.07,224,0.875,bilinear,-85.330,-64.793,+16
efficientnet_b0.ra_in1k,7.213,92.787,33.987,66.013,5.29,224,0.875,bicubic,-85.477,-64.083,+8
tf_mixnet_l.in1k,7.173,92.827,31.667,68.333,7.33,224,0.875,bicubic,-86.147,-66.363,-50
tf_efficientnet_b1.aa_in1k,7.160,92.840,33.027,66.973,7.79,240,0.882,bicubic,-86.330,-65.333,-73
tf_efficientnet_cc_b0_8e.in1k,7.133,92.867,31.787,68.213,24.01,224,0.875,bicubic,-85.707,-66.393,-10
convmixer_1024_20_ks9_p14.in1k,7.093,92.907,33.053,66.947,24.38,224,0.960,bicubic,-85.307,-65.217,+25
resmlp_12_224.fb_in1k,7.013,92.987,33.933,66.067,15.35,224,0.875,bicubic,-85.207,-64.217,+35
cs3darknet_focus_m.c2ns_in1k,6.933,93.067,34.587,65.413,9.30,288,0.950,bicubic,-86.037,-63.473,-24
fastvit_t8.apple_in1k,6.893,93.107,33.400,66.600,4.03,256,0.900,bicubic,-85.167,-64.530,+42
hardcorenas_f.miil_green_in1k,6.880,93.120,34.067,65.933,8.20,224,0.875,bilinear,-86.090,-64.323,-25
pit_ti_distilled_224.in1k,6.840,93.160,30.947,69.053,5.10,224,0.900,bicubic,-83.930,-66.663,+88
ghostnetv2_130.in1k,6.707,93.293,32.960,67.040,8.96,224,0.875,bicubic,-85.613,-65.300,+25
efficientnet_es.ra_in1k,6.693,93.307,33.973,66.027,5.44,224,0.875,bicubic,-86.477,-64.437,-49
selecsls60b.in1k,6.693,93.307,33.293,66.707,32.77,224,0.875,bicubic,-86.617,-64.997,-60
res2net50_26w_6s.in1k,6.693,93.307,31.653,68.347,37.05,224,0.875,bilinear,-86.707,-66.627,-73
poolformer_s12.sail_in1k,6.653,93.347,34.520,65.480,11.92,224,0.900,bicubic,-85.957,-63.660,-2
mixnet_m.ft_in1k,6.653,93.347,32.053,67.947,5.01,224,0.875,bicubic,-85.757,-66.097,+14
dpn68b.mx_in1k,6.640,93.360,32.907,67.093,12.61,224,0.875,bicubic,-86.150,-65.173,-18
tinynet_a.in1k,6.640,93.360,32.227,67.773,6.19,192,0.875,bicubic,-85.800,-65.853,+9
legacy_seresnext26_32x4d.in1k,6.627,93.373,33.240,66.760,16.79,224,0.875,bicubic,-86.013,-64.880,-8
tf_efficientnet_b1.in1k,6.627,93.373,32.640,67.360,7.79,240,0.882,bicubic,-86.473,-65.660,-49
mobileone_s3.apple_in1k,6.627,93.373,32.147,67.853,10.17,224,0.900,bilinear,-86.313,-66.043,-30
resnet50.a3_in1k,6.587,93.413,32.053,67.947,25.56,224,0.950,bicubic,-86.133,-65.957,-15
regnety_004.tv2_in1k,6.533,93.467,30.480,69.520,4.34,224,0.965,bicubic,-85.047,-67.410,+51
dla60x.in1k,6.493,93.507,34.080,65.920,17.35,224,0.875,bilinear,-86.587,-64.420,-50
repvgg_b1.rvgg_in1k,6.467,93.533,33.800,66.200,57.42,224,0.875,bilinear,-86.863,-64.570,-79
skresnet34.ra_in1k,6.467,93.533,31.573,68.427,22.28,224,0.875,bicubic,-85.913,-66.577,+7
repghostnet_150.in1k,6.453,93.547,32.307,67.693,6.58,224,0.875,bicubic,-85.917,-65.743,+6
hardcorenas_d.miil_green_in1k,6.453,93.547,32.187,67.813,7.50,224,0.875,bilinear,-85.947,-65.893,+4
resnet26d.bt_in1k,6.307,93.693,32.747,67.253,16.01,288,0.950,bicubic,-86.213,-65.463,-7
regnetx_080.pycls_in1k,6.293,93.707,32.373,67.627,39.57,224,0.875,bicubic,-87.587,-66.217,-145
resnet18.fb_swsl_ig1b_ft_in1k,6.253,93.747,31.600,68.400,11.69,224,0.875,bilinear,-84.447,-66.100,+72
legacy_seresnet50.in1k,6.200,93.800,32.680,67.320,28.09,224,0.875,bilinear,-86.760,-65.730,-44
pit_ti_224.in1k,6.120,93.880,30.240,69.760,4.85,224,0.900,bicubic,-83.820,-67.210,+89
resnet152.tv_in1k,6.067,93.933,32.080,67.920,60.19,224,0.875,bilinear,-87.253,-65.960,-85
wide_resnet50_2.tv_in1k,6.013,93.987,32.160,67.840,68.88,224,0.875,bilinear,-87.147,-66.280,-70
tf_efficientnet_cc_b0_4e.in1k,5.987,94.013,29.587,70.413,13.31,224,0.875,bicubic,-86.613,-68.493,-21
regnetx_040.pycls_in1k,5.947,94.053,31.493,68.507,22.12,224,0.875,bicubic,-87.613,-67.037,-120
seresnet50.a3_in1k,5.947,94.053,30.827,69.173,28.09,224,0.950,bicubic,-86.123,-67.213,+11
mixer_l16_224.goog_in21k_ft_in1k,5.880,94.120,18.547,81.453,208.20,224,0.875,bicubic,-81.270,-74.973,+120
tf_efficientnetv2_b0.in1k,5.867,94.133,30.787,69.213,7.14,224,0.875,bicubic,-87.243,-67.603,-71
dla102.in1k,5.827,94.173,32.760,67.240,33.27,224,0.875,bilinear,-87.223,-65.790,-65
selecsls60.in1k,5.680,94.320,32.520,67.480,30.67,224,0.875,bicubic,-87.330,-65.780,-62
regnety_016.pycls_in1k,5.667,94.333,30.480,69.520,11.20,224,0.875,bicubic,-87.373,-67.890,-66
res2next50.in1k,5.653,94.347,30.893,69.107,24.67,224,0.875,bilinear,-87.187,-67.287,-52
hardcorenas_c.miil_green_in1k,5.653,94.347,30.427,69.573,5.52,224,0.875,bilinear,-86.367,-67.413,+9
hrnet_w18_small_v2.gluon_in1k,5.520,94.480,31.853,68.147,15.60,224,0.875,bicubic,-87.250,-66.557,-44
hrnet_w18.ms_in1k,5.493,94.507,30.907,69.093,21.30,224,0.875,bilinear,-86.827,-67.163,-12
resnest14d.gluon_in1k,5.453,94.547,28.547,71.453,10.61,224,0.875,bilinear,-86.267,-69.323,+24
ghostnetv2_100.in1k,5.387,94.613,28.560,71.440,6.16,224,0.875,bicubic,-85.513,-69.140,+45
tf_efficientnet_em.in1k,5.360,94.640,31.080,68.920,6.90,240,0.882,bicubic,-87.580,-67.100,-61
tf_efficientnet_lite2.in1k,5.333,94.667,30.880,69.120,6.09,260,0.890,bicubic,-87.317,-67.350,-41
gernet_s.idstcv_in1k,5.320,94.680,30.147,69.853,8.17,224,0.875,bilinear,-86.810,-68.043,-7
resnext26ts.ra2_in1k,5.307,94.693,29.680,70.320,10.30,288,1.000,bicubic,-86.823,-68.350,-8
tf_efficientnet_b0.ap_in1k,5.307,94.693,28.840,71.160,5.29,224,0.875,bicubic,-86.893,-69.180,-12
efficientvit_m4.r224_in1k,5.307,94.693,28.013,71.987,8.80,224,0.875,bicubic,-85.273,-69.517,+55
repvgg_b1g4.rvgg_in1k,5.240,94.760,30.760,69.240,39.97,224,0.875,bilinear,-87.760,-67.670,-75
resnet34.bt_in1k,5.213,94.787,29.440,70.560,21.80,288,0.950,bicubic,-87.197,-68.430,-29
xcit_nano_12_p16_224.fb_dist_in1k,5.200,94.800,26.493,73.507,3.05,224,1.000,bicubic,-84.500,-70.607,+68
res2net50_26w_4s.in1k,5.173,94.827,29.360,70.640,25.70,224,0.875,bilinear,-87.317,-68.670,-35
efficientvit_m3.r224_in1k,5.160,94.840,27.400,72.600,6.90,224,0.875,bicubic,-84.700,-70.140,+64
vit_tiny_r_s16_p8_224.augreg_in21k_ft_in1k,5.080,94.920,27.027,72.973,6.34,224,0.900,bicubic,-84.100,-70.193,+74
mobilenetv3_large_100.ra_in1k,5.067,94.933,28.200,71.800,5.48,224,0.875,bicubic,-86.283,-69.510,+17
tf_efficientnet_b0.aa_in1k,5.053,94.947,28.760,71.240,5.29,224,0.875,bicubic,-87.187,-69.240,-23
tf_mixnet_m.in1k,5.053,94.947,28.187,71.813,5.01,224,0.875,bicubic,-87.247,-69.703,-27
res2net50_14w_8s.in1k,5.040,94.960,28.733,71.267,25.06,224,0.875,bilinear,-87.730,-69.427,-62
regnetx_004_tv.tv2_in1k,5.000,95.000,27.560,72.440,5.50,224,0.965,bicubic,-85.640,-70.040,+39
repghostnet_130.in1k,4.987,95.013,29.653,70.347,5.48,224,0.875,bicubic,-86.903,-68.277,-5
mixnet_s.ft_in1k,4.947,95.053,28.573,71.427,4.13,224,0.875,bicubic,-86.873,-69.127,-2
hardcorenas_b.miil_green_in1k,4.947,95.053,28.040,71.960,5.18,224,0.875,bilinear,-86.813,-69.820,+2
mobilenetv3_rw.rmsp_in1k,4.933,95.067,29.853,70.147,5.48,224,0.875,bicubic,-86.277,-67.807,+13
hardcorenas_a.miil_green_in1k,4.893,95.107,28.093,71.907,5.26,224,0.875,bilinear,-86.457,-69.757,+7
regnetx_032.pycls_in1k,4.880,95.120,30.227,69.773,15.30,224,0.875,bicubic,-88.230,-68.163,-104
resnet50c.gluon_in1k,4.880,95.120,28.080,71.920,25.58,224,0.875,bicubic,-88.150,-70.320,-95
xcit_nano_12_p16_224.fb_in1k,4.853,95.147,25.467,74.533,3.05,224,1.000,bicubic,-83.767,-71.323,+73
resnext50_32x4d.tv_in1k,4.827,95.173,30.267,69.733,25.03,224,0.875,bilinear,-87.923,-68.003,-71
densenet161.tv_in1k,4.733,95.267,29.560,70.440,28.68,224,0.875,bicubic,-87.747,-68.740,-51
resnet101.tv_in1k,4.693,95.307,29.347,70.653,44.55,224,0.875,bilinear,-88.117,-68.903,-79
selecsls42b.in1k,4.680,95.320,28.573,71.427,32.46,224,0.875,bicubic,-87.610,-69.537,-40
tf_efficientnet_lite1.in1k,4.600,95.400,28.347,71.653,5.42,240,0.882,bicubic,-88.030,-69.713,-67
mobilenetv2_120d.ra_in1k,4.547,95.453,29.320,70.680,5.83,224,0.875,bicubic,-87.853,-68.740,-48
mobileone_s2.apple_in1k,4.520,95.480,29.133,70.867,7.88,224,0.900,bilinear,-88.300,-69.137,-85
tf_efficientnet_b0.in1k,4.427,95.573,26.680,73.320,5.29,224,0.875,bicubic,-87.653,-71.480,-34
pvt_v2_b0.in1k,4.347,95.653,25.960,74.040,3.67,224,0.900,bicubic,-84.433,-70.900,+61
vit_base_patch32_224.sam_in1k,4.333,95.667,24.387,75.613,88.22,224,0.900,bicubic,-85.407,-72.613,+41
resnet50.am_in1k,4.267,95.733,28.627,71.373,25.56,224,0.875,bicubic,-89.703,-70.003,-216
edgenext_xx_small.in1k,4.267,95.733,24.093,75.907,1.33,288,1.000,bicubic,-84.623,-72.887,+57
tinynet_b.in1k,4.200,95.800,26.787,73.213,3.73,188,0.875,bicubic,-86.720,-70.873,+5
efficientnet_es_pruned.in1k,4.200,95.800,26.453,73.547,5.44,224,0.875,bicubic,-86.970,-71.287,-1
repghostnet_111.in1k,4.147,95.853,26.187,73.813,4.54,224,0.875,bicubic,-86.563,-71.283,+13
densenet201.tv_in1k,4.133,95.867,27.547,72.453,20.01,224,0.875,bicubic,-88.607,-70.683,-85
resnet50.gluon_in1k,4.120,95.880,26.960,73.040,25.56,224,0.875,bicubic,-88.420,-71.080,-72
fbnetc_100.rmsp_in1k,4.107,95.893,25.907,74.093,5.57,224,0.875,bilinear,-86.623,-71.303,+8
semnasnet_100.rmsp_in1k,3.947,96.053,26.933,73.067,3.89,224,0.875,bicubic,-87.363,-70.627,-14
mobilevitv2_050.cvnets_in1k,3.947,96.053,23.947,76.053,1.37,256,0.888,bicubic,-84.233,-73.043,+59
resnet26.bt_in1k,3.933,96.067,28.213,71.787,16.00,288,0.950,bicubic,-88.057,-69.807,-42
repvgg_a2.rvgg_in1k,3.933,96.067,27.227,72.773,28.21,224,0.875,bilinear,-88.007,-70.873,-35
dpn68.mx_in1k,3.893,96.107,25.693,74.307,12.61,224,0.875,bicubic,-88.097,-72.527,-42
tf_mixnet_s.in1k,3.880,96.120,25.267,74.733,4.13,224,0.875,bicubic,-87.640,-72.353,-23
semnasnet_075.rmsp_in1k,3.867,96.133,27.080,72.920,2.91,224,0.875,bicubic,-86.203,-70.360,+21
tf_efficientnet_es.in1k,3.827,96.173,26.133,73.867,5.44,224,0.875,bicubic,-88.143,-71.747,-45
mobilevit_xxs.cvnets_in1k,3.827,96.173,21.733,78.267,1.27,256,0.900,bicubic,-83.333,-74.367,+58
resnet18d.ra2_in1k,3.813,96.187,26.013,73.987,11.71,288,0.950,bicubic,-86.467,-71.547,+12
regnety_008.pycls_in1k,3.787,96.213,27.160,72.840,6.26,224,0.875,bicubic,-87.943,-71.020,-32
dla60.in1k,3.747,96.253,27.947,72.053,22.04,224,0.875,bilinear,-88.453,-70.153,-62
resnet18.fb_ssl_yfcc100m_ft_in1k,3.747,96.253,25.373,74.627,11.69,224,0.875,bilinear,-86.463,-72.177,+11
mobilenetv2_140.ra_in1k,3.720,96.280,26.720,73.280,6.11,224,0.875,bicubic,-88.120,-71.140,-41
densenet169.tv_in1k,3.707,96.293,25.587,74.413,14.15,224,0.875,bicubic,-88.233,-72.553,-46
resnet18.a1_in1k,3.707,96.293,22.960,77.040,11.69,288,1.000,bicubic,-85.973,-74.140,+19
regnetx_016.pycls_in1k,3.613,96.387,26.320,73.680,9.19,224,0.875,bicubic,-88.567,-71.880,-66
efficientvit_m2.r224_in1k,3.613,96.387,21.853,78.147,4.19,224,0.875,bicubic,-84.857,-75.047,+40
spnasnet_100.rmsp_in1k,3.573,96.427,24.253,75.747,4.42,224,0.875,bilinear,-86.757,-72.937,+1
res2net50_48w_2s.in1k,3.560,96.440,26.613,73.387,25.29,224,0.875,bilinear,-88.990,-71.467,-94
tf_mobilenetv3_large_100.in1k,3.560,96.440,25.120,74.880,5.48,224,0.875,bilinear,-87.670,-72.540,-31
repghostnet_100.in1k,3.520,96.480,24.520,75.480,4.07,224,0.875,bicubic,-86.770,-72.960,-1
regnety_006.pycls_in1k,3.453,96.547,24.920,75.080,6.06,224,0.875,bicubic,-87.937,-73.080,-38
ghostnet_100.in1k,3.427,96.573,25.120,74.880,5.18,224,0.875,bicubic,-86.753,-72.170,+1
resnet34.a3_in1k,3.373,96.627,23.387,76.613,21.80,224,0.950,bicubic,-86.567,-73.793,+6
legacy_seresnet34.in1k,3.347,96.653,23.813,76.187,21.96,224,0.875,bilinear,-87.553,-73.767,-23
resnet18.a2_in1k,3.267,96.733,22.373,77.627,11.69,288,1.000,bicubic,-86.303,-74.587,+13
efficientnet_lite0.ra_in1k,3.240,96.760,25.947,74.053,4.65,224,0.875,bicubic,-87.880,-71.693,-34
dla34.in1k,3.240,96.760,23.547,76.453,15.74,224,0.875,bilinear,-87.520,-74.103,-21
efficientvit_b0.r224_in1k,3.200,96.800,19.533,80.467,3.41,224,0.950,bicubic,-84.740,-76.597,+31
mobilenetv2_110d.ra_in1k,3.187,96.813,24.573,75.427,4.52,224,0.875,bicubic,-87.773,-72.967,-31
regnety_004.pycls_in1k,3.187,96.813,22.680,77.320,4.34,224,0.875,bicubic,-87.303,-74.850,-14
tinynet_c.in1k,3.120,96.880,21.520,78.480,2.46,184,0.875,bicubic,-84.660,-74.850,+29
mnasnet_100.rmsp_in1k,3.107,96.893,24.227,75.773,4.38,224,0.875,bicubic,-87.393,-73.243,-17
repghostnet_080.in1k,3.080,96.920,21.973,78.027,3.28,224,0.875,bicubic,-85.760,-74.727,+16
tf_efficientnet_lite0.in1k,3.040,96.960,22.893,77.107,4.65,224,0.875,bicubic,-88.010,-74.687,-39
skresnet18.ra_in1k,3.027,96.973,22.813,77.187,11.96,224,0.875,bicubic,-86.633,-74.417,0
mobileone_s1.apple_in1k,2.947,97.053,24.947,75.053,4.83,224,0.900,bilinear,-88.333,-72.873,-49
vgg19_bn.tv_in1k,2.947,97.053,23.440,76.560,143.68,224,0.875,bilinear,-87.133,-74.140,-12
tinynet_d.in1k,2.853,97.147,17.787,82.213,2.34,152,0.875,bicubic,-81.867,-77.383,+40
tf_mobilenetv3_large_075.in1k,2.840,97.160,21.560,78.440,3.99,224,0.875,bilinear,-86.800,-75.630,-3
efficientvit_m1.r224_in1k,2.827,97.173,19.600,80.400,2.98,224,0.875,bicubic,-83.963,-76.430,+27
resnet14t.c3_in1k,2.760,97.240,20.213,79.787,10.08,224,0.950,bicubic,-86.230,-76.517,+3
hrnet_w18_small_v2.ms_in1k,2.720,97.280,23.720,76.280,15.60,224,0.875,bilinear,-88.480,-74.180,-52
regnetx_008.pycls_in1k,2.667,97.333,22.453,77.547,7.26,224,0.875,bicubic,-88.383,-75.267,-49
vgg16_bn.tv_in1k,2.653,97.347,23.800,76.200,138.37,224,0.875,bilinear,-87.437,-73.570,-21
resnet34.gluon_in1k,2.653,97.347,21.680,78.320,21.80,224,0.875,bicubic,-88.327,-75.950,-47
lcnet_100.ra2_in1k,2.627,97.373,20.760,79.240,2.95,224,0.875,bicubic,-86.123,-76.220,+6
vgg16.tv_in1k,2.627,97.373,20.360,79.640,138.36,224,0.875,bilinear,-85.933,-76.440,+7
repvgg_b0.rvgg_in1k,2.560,97.440,24.000,76.000,15.82,224,0.875,bilinear,-88.830,-73.700,-66
densenet121.tv_in1k,2.547,97.453,22.653,77.347,7.98,224,0.875,bicubic,-88.343,-75.057,-47
regnetx_006.pycls_in1k,2.533,97.467,20.627,79.373,6.20,224,0.875,bicubic,-87.817,-76.803,-33
hrnet_w18_small.gluon_in1k,2.507,97.493,20.653,79.347,13.19,224,0.875,bicubic,-86.963,-76.407,-12
legacy_seresnet18.in1k,2.480,97.520,20.067,79.933,11.78,224,0.875,bicubic,-86.410,-76.633,-5
lcnet_075.ra2_in1k,2.320,97.680,17.173,82.827,2.36,224,0.875,bicubic,-83.650,-78.507,+21
mobilenetv3_small_075.lamb_in1k,2.307,97.693,15.893,84.107,2.04,224,0.875,bicubic,-80.723,-78.207,+30
efficientvit_m0.r224_in1k,2.293,97.707,16.493,83.507,2.35,224,0.875,bicubic,-80.057,-77.937,+30
repghostnet_058.in1k,2.253,97.747,18.320,81.680,2.55,224,0.875,bicubic,-84.287,-77.580,+13
repvgg_a1.rvgg_in1k,2.240,97.760,21.333,78.667,14.09,224,0.875,bilinear,-88.360,-76.317,-45
mobileone_s0.apple_in1k,2.240,97.760,17.467,82.533,5.29,224,0.875,bilinear,-85.990,-78.933,0
resnet18.a3_in1k,2.227,97.773,17.773,82.227,11.69,224,0.950,bicubic,-84.223,-78.107,+11
mobilenetv2_100.ra_in1k,2.147,97.853,19.933,80.067,3.50,224,0.875,bicubic,-87.453,-77.217,-23
regnety_002.pycls_in1k,2.120,97.880,18.920,81.080,3.16,224,0.875,bicubic,-85.250,-77.690,+2
vgg19.tv_in1k,2.107,97.893,20.760,79.240,143.67,224,0.875,bilinear,-86.943,-76.110,-19
vgg13_bn.tv_in1k,2.093,97.907,20.333,79.667,133.05,224,0.875,bilinear,-86.657,-76.387,-12
tf_mobilenetv3_small_100.in1k,2.027,97.973,15.827,84.173,2.54,224,0.875,bilinear,-83.163,-79.943,+12
mobilenetv3_small_100.lamb_in1k,1.987,98.013,17.093,82.907,2.54,224,0.875,bicubic,-83.233,-78.557,+10
repghostnet_050.in1k,1.987,98.013,16.507,83.493,2.31,224,0.875,bicubic,-83.073,-78.693,+11
tf_mobilenetv3_small_075.in1k,1.987,98.013,14.840,85.160,2.04,224,0.875,bilinear,-81.513,-80.000,+16
regnetx_004.pycls_in1k,1.920,98.080,19.147,80.853,5.16,224,0.875,bicubic,-87.010,-77.973,-22
resnet34.tv_in1k,1.853,98.147,20.053,79.947,21.80,224,0.875,bilinear,-88.097,-77.287,-42
tinynet_e.in1k,1.853,98.147,14.013,85.987,2.04,106,0.875,bicubic,-77.067,-78.527,+18
vgg13.tv_in1k,1.840,98.160,18.027,81.973,133.05,224,0.875,bilinear,-85.200,-78.303,-5
mobilenetv3_small_050.lamb_in1k,1.813,98.187,12.533,87.467,1.59,224,0.875,bicubic,-75.207,-78.767,+17
lcnet_050.ra2_in1k,1.787,98.213,13.867,86.133,1.88,224,0.875,bicubic,-80.013,-79.843,+13
mnasnet_small.lamb_in1k,1.773,98.227,15.080,84.920,2.03,224,0.875,bicubic,-82.647,-80.110,+5
dla46x_c.in1k,1.747,98.253,16.387,83.613,1.07,224,0.875,bilinear,-82.483,-78.883,+5
vgg11_bn.tv_in1k,1.720,98.280,18.093,81.907,132.87,224,0.875,bilinear,-85.780,-78.727,-15
dla60x_c.in1k,1.613,98.387,18.013,81.987,1.32,224,0.875,bilinear,-84.677,-78.147,-6
tf_mobilenetv3_large_minimal_100.in1k,1.613,98.387,17.120,82.880,3.92,224,0.875,bilinear,-87.337,-79.740,-34
mobilenetv2_050.lamb_in1k,1.613,98.387,14.200,85.800,1.97,224,0.875,bicubic,-82.297,-80.520,+3
resnet10t.c3_in1k,1.600,98.400,16.053,83.947,5.44,224,0.950,bicubic,-84.620,-79.687,-8
vgg11.tv_in1k,1.560,98.440,16.187,83.813,132.86,224,0.875,bilinear,-85.020,-80.093,-13
resnet18.gluon_in1k,1.547,98.453,16.640,83.360,11.69,224,0.875,bicubic,-86.823,-80.030,-26
hrnet_w18_small.ms_in1k,1.533,98.467,18.093,81.907,13.19,224,0.875,bilinear,-87.517,-79.027,-41
dla46_c.in1k,1.493,98.507,15.227,84.773,1.30,224,0.875,bilinear,-82.117,-79.723,-2
repvgg_a0.rvgg_in1k,1.467,98.533,17.587,82.413,9.11,224,0.875,bilinear,-87.813,-79.303,-45
regnetx_002.pycls_in1k,1.373,98.627,15.053,84.947,2.68,224,0.875,bicubic,-84.767,-80.917,-13
resnet18.tv_in1k,1.160,98.840,16.227,83.773,11.69,224,0.875,bilinear,-86.220,-80.063,-25
tf_mobilenetv3_small_minimal_100.in1k,1.040,98.960,11.493,88.507,2.04,224,0.875,bilinear,-80.360,-82.187,-1
resnet50.tv_in1k,0.000,100.000,14.453,85.547,25.56,224,0.875,bilinear,-91.880,-82.807,-120
hf_public_repos/pytorch-image-models/results/benchmark-infer-amp-nchw-pt112-cu113-rtx3090.csv
model,infer_samples_per_sec,infer_step_time,infer_batch_size,infer_img_size,infer_gmacs,infer_macts,param_count
tinynet_e,49285.12,20.767,1024,106,0.03,0.69,2.04
mobilenetv3_small_050,43905.96,23.312,1024,224,0.03,0.92,1.59
lcnet_035,40961.84,24.988,1024,224,0.03,1.04,1.64
lcnet_050,36451.18,28.081,1024,224,0.05,1.26,1.88
mobilenetv3_small_075,32291.57,31.7,1024,224,0.05,1.3,2.04
mobilenetv3_small_100,28935.54,35.379,1024,224,0.06,1.42,2.54
tf_mobilenetv3_small_minimal_100,27926.5,36.657,1024,224,0.06,1.41,2.04
tinynet_d,27303.88,37.493,1024,152,0.05,1.42,2.34
tf_mobilenetv3_small_075,26850.04,38.127,1024,224,0.05,1.3,2.04
tf_mobilenetv3_small_100,24320.21,42.094,1024,224,0.06,1.42,2.54
lcnet_075,22627.19,45.245,1024,224,0.1,1.99,2.36
mnasnet_small,20150.91,50.806,1024,224,0.07,2.16,2.03
levit_128s,19458.78,52.613,1024,224,0.31,1.88,7.78
lcnet_100,18910.66,54.139,1024,224,0.16,2.52,2.95
mobilenetv2_035,18047.72,56.728,1024,224,0.07,2.86,1.68
regnetx_002,17921.55,57.126,1024,224,0.2,2.16,2.68
regnety_002,16656.92,61.462,1024,224,0.2,2.17,3.16
ghostnet_050,16494.57,62.071,1024,224,0.05,1.77,2.59
mnasnet_050,15574.97,65.736,1024,224,0.11,3.07,2.22
mobilenetv2_050,14533.98,70.445,1024,224,0.1,3.64,1.97
tinynet_c,14397.76,71.111,1024,184,0.11,2.87,2.46
semnasnet_050,14065.61,72.79,1024,224,0.11,3.44,2.08
levit_128,13348.5,76.702,1024,224,0.41,2.71,9.21
vit_small_patch32_224,12899.41,79.373,1024,224,1.15,2.5,22.88
mixer_s32_224,12823.61,79.842,1024,224,1.0,2.28,19.1
lcnet_150,12599.24,81.264,1024,224,0.34,3.79,4.5
regnetx_004,12314.46,83.141,1024,224,0.4,3.14,5.16
cs3darknet_focus_s,11852.98,86.381,1024,256,0.69,2.7,3.27
mobilenetv3_large_075,11687.27,87.605,1024,224,0.16,4.0,3.99
resnet10t,11549.51,88.651,1024,224,1.1,2.43,5.44
cs3darknet_s,11540.93,88.716,1024,256,0.72,2.97,3.28
vit_tiny_r_s16_p8_224,10917.33,93.785,1024,224,0.44,2.06,6.34
ese_vovnet19b_slim_dw,10530.7,97.229,1024,224,0.4,5.28,1.9
mobilenetv3_rw,10453.43,97.947,1024,224,0.23,4.41,5.48
hardcorenas_a,10387.47,98.569,1024,224,0.23,4.38,5.26
mobilenetv3_large_100_miil,10298.68,99.419,1024,224,0.23,4.41,5.48
mobilenetv3_large_100,10295.13,99.453,1024,224,0.23,4.41,5.48
tf_mobilenetv3_large_075,10277.2,99.627,1024,224,0.16,4.0,3.99
gernet_s,10228.24,100.105,1024,224,0.75,2.65,8.17
mnasnet_075,10209.23,100.29,1024,224,0.23,4.77,3.17
levit_192,10099.95,101.375,1024,224,0.66,3.2,10.95
tf_mobilenetv3_large_minimal_100,10021.88,102.166,1024,224,0.22,4.4,3.92
hardcorenas_b,9469.88,108.121,1024,224,0.26,5.09,5.18
regnetx_006,9309.45,109.982,1024,224,0.61,3.98,6.2
tinynet_b,9298.6,110.113,1024,188,0.21,4.44,3.73
regnety_004,9296.36,110.137,1024,224,0.41,3.89,4.34
ghostnet_100,9264.87,110.513,1024,224,0.15,3.55,5.18
hardcorenas_c,9196.31,111.338,1024,224,0.28,5.01,5.52
resnet18,9171.4,111.64,1024,224,1.82,2.48,11.69
tf_mobilenetv3_large_100,9170.64,111.649,1024,224,0.23,4.41,5.48
mobilenetv2_075,9151.72,111.88,1024,224,0.22,5.86,2.64
swsl_resnet18,9145.07,111.962,1024,224,1.82,2.48,11.69
mnasnet_100,9128.95,112.159,1024,224,0.33,5.46,4.38
mnasnet_b1,9096.68,112.558,1024,224,0.33,5.46,4.38
gluon_resnet18_v1b,9092.93,112.604,1024,224,1.82,2.48,11.69
ssl_resnet18,9043.33,113.221,1024,224,1.82,2.48,11.69
semnasnet_075,8958.16,114.297,1024,224,0.23,5.54,2.91
hardcorenas_d,8756.1,116.935,1024,224,0.3,4.93,7.5
seresnet18,8678.54,117.981,1024,224,1.82,2.49,11.78
regnety_006,8404.01,121.832,1024,224,0.61,4.33,6.06
mobilenetv2_100,8360.81,122.466,1024,224,0.31,6.68,3.5
legacy_seresnet18,8318.12,123.094,1024,224,1.82,2.49,11.78
spnasnet_100,8246.33,124.165,1024,224,0.35,6.03,4.42
semnasnet_100,8027.18,127.555,1024,224,0.32,6.23,3.89
mnasnet_a1,8013.14,127.779,1024,224,0.32,6.23,3.89
levit_256,7862.24,130.228,1024,224,1.13,4.23,18.89
resnet18d,7721.04,132.614,1024,224,2.06,3.29,11.71
hardcorenas_f,7642.68,133.973,1024,224,0.35,5.57,8.2
hardcorenas_e,7588.05,134.938,1024,224,0.35,5.65,8.07
ese_vovnet19b_slim,7530.8,135.964,1024,224,1.69,3.52,3.17
efficientnet_lite0,7530.79,135.964,1024,224,0.4,6.74,4.65
ghostnet_130,7411.84,138.146,1024,224,0.24,4.6,7.36
regnetx_008,7376.89,138.798,1024,224,0.81,5.15,7.26
tinynet_a,7260.16,141.032,1024,192,0.35,5.41,6.19
tf_efficientnetv2_b0,7117.22,143.865,1024,224,0.73,4.77,7.14
fbnetc_100,7115.49,143.899,1024,224,0.4,6.51,5.57
regnety_008,7108.36,144.037,1024,224,0.81,5.25,6.26
xcit_nano_12_p16_224_dist,7019.86,145.861,1024,224,0.56,4.17,3.05
xcit_nano_12_p16_224,7000.29,146.268,1024,224,0.56,4.17,3.05
edgenext_xx_small,6963.01,147.05,1024,256,0.33,4.21,1.33
levit_256d,6856.41,149.338,1024,224,1.4,4.93,26.21
deit_tiny_patch16_224,6794.46,150.698,1024,224,1.26,5.97,5.72
vit_tiny_patch16_224,6769.81,151.248,1024,224,1.26,5.97,5.72
tf_efficientnet_lite0,6667.82,153.562,1024,224,0.4,6.74,4.65
deit_tiny_distilled_patch16_224,6647.4,154.032,1024,224,1.27,6.01,5.91
efficientnet_b0,6576.16,155.702,1024,224,0.4,6.75,5.29
dla46_c,6538.59,156.596,1024,224,0.58,4.5,1.3
rexnetr_100,6369.79,160.748,1024,224,0.43,7.72,4.88
mnasnet_140,6297.39,162.595,1024,224,0.6,7.71,7.12
rexnet_100,6295.89,162.634,1024,224,0.41,7.44,4.8
efficientnet_b1_pruned,6269.62,163.315,1024,240,0.4,6.21,6.33
mobilenetv2_110d,6263.44,163.477,1024,224,0.45,8.71,4.52
regnetz_005,6057.27,169.042,1024,224,0.52,5.86,7.12
resnetblur18,6056.25,169.07,1024,224,2.34,3.39,11.69
pit_ti_distilled_224,6026.49,169.903,1024,224,0.71,6.23,5.1
pit_ti_224,5988.08,170.993,1024,224,0.7,6.19,4.85
nf_regnet_b0,5936.35,172.485,1024,256,0.64,5.58,8.76
mobilevitv2_050,5906.65,173.353,1024,256,0.48,8.04,1.37
tf_efficientnet_b0_ap,5894.61,173.707,1024,224,0.4,6.75,5.29
tf_efficientnet_b0,5892.32,173.774,1024,224,0.4,6.75,5.29
tf_efficientnet_b0_ns,5891.52,173.799,1024,224,0.4,6.75,5.29
visformer_tiny,5845.93,175.153,1024,224,1.27,5.72,10.32
resnet14t,5834.5,175.495,1024,224,1.69,5.8,10.08
dla46x_c,5690.98,179.922,1024,224,0.54,5.66,1.07
skresnet18,5640.2,181.543,1024,224,1.82,3.24,11.96
semnasnet_140,5544.22,184.685,1024,224,0.6,8.87,6.11
hrnet_w18_small,5451.81,187.816,1024,224,1.61,5.72,13.19
mobilenetv2_140,5399.1,189.649,1024,224,0.6,9.57,6.11
resnet34,5356.43,191.161,1024,224,3.67,3.74,21.8
dla60x_c,5292.02,193.487,1024,224,0.59,6.01,1.32
mobilevit_xxs,5275.05,194.109,1024,256,0.42,8.34,1.27
ese_vovnet19b_dw,5260.83,194.634,1024,224,1.34,8.25,6.54
gluon_resnet34_v1b,5203.76,196.769,1024,224,3.67,3.74,21.8
tv_resnet34,5193.55,197.156,1024,224,3.67,3.74,21.8
efficientnet_lite1,5144.81,199.024,1024,240,0.62,10.14,5.42
mixnet_s,5054.78,202.566,1024,224,0.25,6.25,4.13
seresnet34,5051.53,202.699,1024,224,3.67,3.74,21.96
gernet_m,5028.39,203.632,1024,224,3.02,5.24,21.14
fbnetv3_b,4982.49,205.508,1024,256,0.55,9.1,8.6
selecsls42,4945.53,207.043,1024,224,2.94,4.62,30.35
selecsls42b,4942.3,207.179,1024,224,2.98,4.62,32.46
vit_base_patch32_224_sam,4921.3,208.063,1024,224,4.41,5.01,88.22
vit_base_patch32_224,4918.17,208.197,1024,224,4.41,5.01,88.22
resnet34d,4834.03,211.82,1024,224,3.91,4.54,21.82
rexnetr_130,4789.2,213.803,1024,224,0.68,9.81,7.61
pit_xs_224,4766.56,214.816,1024,224,1.4,7.71,10.62
tf_efficientnetv2_b1,4738.51,216.09,1024,240,1.21,7.34,8.14
legacy_seresnet34,4737.15,216.152,1024,224,3.67,3.74,21.96
pit_xs_distilled_224,4722.37,216.826,1024,224,1.41,7.76,11.0
mixer_b32_224,4707.3,217.523,1024,224,3.24,6.29,60.29
tf_mixnet_s,4706.57,217.551,1024,224,0.25,6.25,4.13
tf_efficientnet_lite1,4662.36,219.62,1024,240,0.62,10.14,5.42
xcit_tiny_12_p16_224_dist,4593.26,222.924,1024,224,1.24,6.29,6.72
xcit_tiny_12_p16_224,4592.09,222.979,1024,224,1.24,6.29,6.72
rexnet_130,4578.43,223.646,1024,224,0.68,9.71,7.56
levit_384,4538.82,225.597,1024,224,2.36,6.26,39.13
mobilenetv2_120d,4530.56,226.009,1024,224,0.69,11.97,5.83
edgenext_x_small,4436.58,230.795,1024,256,0.68,7.5,2.34
cs3darknet_focus_m,4399.27,232.755,1024,288,2.51,6.19,9.3
efficientnet_b0_g16_evos,4394.3,233.017,1024,224,1.01,7.42,8.11
efficientnet_es,4389.85,233.253,1024,224,1.81,8.73,5.44
efficientnet_es_pruned,4389.6,233.266,1024,224,1.81,8.73,5.44
resnet26,4383.27,233.604,1024,224,2.36,7.35,16.0
cs3darknet_m,4330.89,236.429,1024,288,2.63,6.69,9.31
fbnetv3_d,4328.59,236.555,1024,256,0.68,11.1,10.31
repvgg_b0,4286.85,238.858,1024,224,3.41,6.15,15.82
selecsls60,4286.11,238.899,1024,224,3.59,5.52,30.67
darknet17,4270.14,179.843,768,256,3.26,7.18,14.3
selecsls60b,4265.59,240.05,1024,224,3.63,5.52,32.77
efficientnet_b2_pruned,4264.69,240.099,1024,260,0.73,9.13,8.31
tf_efficientnet_es,4239.07,241.551,1024,224,1.81,8.73,5.44
regnetx_016,4196.72,243.986,1024,224,1.62,7.93,9.19
rexnetr_150,4170.12,245.545,1024,224,0.89,11.13,9.78
crossvit_tiny_240,4122.19,248.4,1024,240,1.57,9.08,7.01
dla34,4120.09,248.525,1024,224,3.07,5.02,15.74
mixer_s16_224,4085.83,250.611,1024,224,3.79,5.97,18.53
vit_small_patch32_384,4015.79,254.982,1024,384,3.45,8.25,22.92
rexnet_150,3990.72,256.583,1024,224,0.9,11.21,9.73
resnet26d,3989.2,256.681,1024,224,2.6,8.15,16.01
ecaresnet50d_pruned,3983.23,257.066,1024,224,2.53,6.43,19.94
efficientnet_lite2,3977.91,257.41,1024,260,0.89,12.9,6.09
gmlp_ti16_224,3944.3,259.603,1024,224,1.34,7.55,5.87
mobilevitv2_075,3905.27,262.199,1024,256,1.05,12.06,2.87
crossvit_9_240,3875.46,264.215,1024,240,1.85,9.52,8.55
darknet21,3872.25,198.322,768,256,3.93,7.47,20.86
nf_resnet26,3857.21,265.465,1024,224,2.41,7.35,16.0
convnext_nano_ols,3756.44,272.585,1024,224,2.5,8.37,15.6
convnext_nano_hnf,3749.56,273.084,1024,224,2.46,8.37,15.59
sedarknet21,3744.18,205.107,768,256,3.93,7.47,20.95
efficientnet_b1,3742.67,273.59,1024,256,0.77,12.22,7.79
crossvit_9_dagger_240,3734.14,274.215,1024,240,1.99,9.97,8.78
tf_efficientnet_b1,3731.51,274.409,1024,240,0.71,10.88,7.79
tf_efficientnet_b1_ns,3731.48,274.411,1024,240,0.71,10.88,7.79
tf_efficientnet_b1_ap,3726.19,274.8,1024,240,0.71,10.88,7.79
resnest14d,3644.88,280.93,1024,224,2.76,7.33,10.61
regnety_016,3624.55,282.503,1024,224,1.63,8.04,11.2
tf_efficientnet_lite2,3624.06,282.543,1024,260,0.89,12.9,6.09
vit_tiny_r_s16_p8_384,3594.94,213.622,768,384,1.34,6.49,6.36
tf_efficientnetv2_b2,3593.98,284.91,1024,260,1.72,9.84,10.1
poolformer_s12,3483.41,293.951,1024,224,1.82,5.53,11.92
resmlp_12_224,3460.87,295.868,1024,224,3.01,5.5,15.35
resmlp_12_distilled_224,3458.54,296.067,1024,224,3.01,5.5,15.35
mixnet_m,3455.23,296.35,1024,224,0.36,8.19,5.01
gmixer_12_224,3401.29,301.051,1024,224,2.67,7.26,12.7
resnext26ts,3375.26,303.371,1024,256,2.43,10.52,10.3
nf_ecaresnet26,3365.9,304.215,1024,224,2.41,7.36,16.0
nf_seresnet26,3360.23,304.729,1024,224,2.41,7.36,17.4
gernet_l,3328.59,307.626,1024,256,4.57,8.0,31.08
repvgg_a2,3325.03,307.955,1024,224,5.7,6.26,28.21
tf_mixnet_m,3322.0,308.236,1024,224,0.36,8.19,5.01
efficientnet_b3_pruned,3297.4,310.535,1024,300,1.04,11.86,9.86
nf_regnet_b1,3293.07,310.944,1024,288,1.02,9.2,10.22
seresnext26ts,3291.26,311.115,1024,256,2.43,10.52,10.39
eca_resnext26ts,3290.56,311.182,1024,256,2.43,10.52,10.3
legacy_seresnext26_32x4d,3269.09,313.225,1024,224,2.49,9.39,16.79
skresnet34,3229.96,317.02,1024,224,3.67,5.13,22.28
gcresnext26ts,3229.79,317.037,1024,256,2.43,10.53,10.48
nf_regnet_b2,3193.49,320.64,1024,272,1.22,9.27,14.31
convit_tiny,3179.42,322.058,1024,224,1.26,7.94,5.71
resnet26t,3149.41,325.128,1024,256,3.35,10.52,16.01
rexnetr_200,3135.78,244.904,768,224,1.59,15.11,16.52
ecaresnet101d_pruned,3129.51,327.195,1024,224,3.48,7.69,24.88
seresnext26tn_32x4d,3050.2,335.704,1024,224,2.7,10.09,16.81
seresnext26t_32x4d,3050.01,335.724,1024,224,2.7,10.09,16.81
ecaresnext50t_32x4d,3049.83,335.744,1024,224,2.7,10.09,15.41
ecaresnext26t_32x4d,3048.36,335.905,1024,224,2.7,10.09,15.41
seresnext26d_32x4d,3037.9,337.063,1024,224,2.73,10.19,16.81
deit_small_patch16_224,3002.36,341.052,1024,224,4.61,11.95,22.05
rexnet_200,3001.86,255.828,768,224,1.56,14.91,16.37
vit_small_patch16_224,3000.37,341.279,1024,224,4.61,11.95,22.05
mobilevit_xs,2981.72,257.559,768,256,1.05,16.33,2.32
deit_small_distilled_patch16_224,2950.87,347.001,1024,224,4.63,12.02,22.44
pit_s_224,2945.22,347.668,1024,224,2.88,11.56,23.46
ecaresnetlight,2941.7,348.085,1024,224,4.11,8.42,30.16
coat_lite_tiny,2932.4,349.189,1024,224,1.6,11.65,5.72
eca_botnext26ts_256,2930.99,349.358,1024,256,2.46,11.6,10.59
pit_s_distilled_224,2918.75,350.821,1024,224,2.9,11.64,24.04
tf_efficientnet_b2_ns,2903.13,352.71,1024,260,1.02,13.83,9.11
tf_efficientnet_b2,2902.67,352.766,1024,260,1.02,13.83,9.11
tf_efficientnet_b2_ap,2901.98,352.851,1024,260,1.02,13.83,9.11
eca_halonext26ts,2883.09,355.163,1024,256,2.44,11.46,10.76
tresnet_m,2870.7,356.694,1024,224,5.74,7.31,31.39
botnet26t_256,2862.72,357.688,1024,256,3.32,11.98,12.49
regnetx_032,2852.1,359.019,1024,224,3.2,11.37,15.3
hrnet_w18_small_v2,2845.04,359.912,1024,224,2.62,9.65,15.6
deit3_small_patch16_224_in21ft1k,2837.48,360.868,1024,224,4.61,11.95,22.06
halonet26t,2832.73,361.477,1024,256,3.19,11.69,12.48
resnetv2_50,2829.74,361.858,1024,224,4.11,11.11,25.55
deit3_small_patch16_224,2828.59,362.004,1024,224,4.61,11.95,22.06
vgg11,2795.7,183.125,512,224,7.61,7.44,132.86
haloregnetz_b,2794.73,366.391,1024,224,1.97,11.94,11.68
bat_resnext26ts,2793.93,366.495,1024,256,2.53,12.51,10.73
vit_relpos_base_patch32_plus_rpn_256,2775.87,368.882,1024,256,7.68,8.01,119.42
vit_base_patch32_plus_256,2773.66,369.174,1024,256,7.79,7.76,119.48
dpn68b,2762.53,370.662,1024,224,2.35,10.47,12.61
vit_small_resnet26d_224,2758.05,371.264,1024,224,5.07,11.12,63.61
efficientnet_b2,2753.12,371.929,1024,288,1.12,16.2,9.11
coat_lite_mini,2752.31,372.037,1024,224,2.0,12.25,11.01
efficientnet_b2a,2752.1,372.068,1024,288,1.12,16.2,9.11
efficientnet_b0_gn,2748.63,372.536,1024,224,0.42,6.75,5.29
resnet50,2733.34,374.621,1024,224,4.11,11.11,25.56
ssl_resnet50,2732.73,374.705,1024,224,4.11,11.11,25.56
tv_resnet50,2732.0,374.804,1024,224,4.11,11.11,25.56
gluon_resnet50_v1b,2731.92,374.815,1024,224,4.11,11.11,25.56
swsl_resnet50,2730.89,374.957,1024,224,4.11,11.11,25.56
cspresnet50,2720.51,376.385,1024,256,4.54,11.5,21.62
resnet32ts,2719.03,376.593,1024,256,4.63,11.58,17.96
dpn68,2711.28,377.669,1024,224,2.35,10.47,12.61
mobilevitv2_100,2710.59,283.322,768,256,1.84,16.08,4.9
vovnet39a,2706.48,378.339,1024,224,7.09,6.73,22.6
resnetv2_50t,2687.6,380.997,1024,224,4.32,11.82,25.57
resnet33ts,2683.32,381.605,1024,256,4.76,11.66,19.68
resnetv2_50d,2678.25,382.327,1024,224,4.35,11.92,25.57
efficientnet_em,2663.14,384.496,1024,240,3.04,14.34,6.9
mixnet_l,2651.17,289.672,768,224,0.58,10.84,7.33
visformer_small,2638.82,388.04,1024,224,4.88,11.43,40.22
ese_vovnet39b,2631.4,389.135,1024,224,7.09,6.74,24.57
resnest26d,2624.21,390.2,1024,224,3.64,9.97,17.07
vit_relpos_small_patch16_224,2615.53,391.496,1024,224,4.59,13.05,21.98
seresnet33ts,2613.57,391.79,1024,256,4.76,11.66,19.78
eca_resnet33ts,2609.68,392.373,1024,256,4.76,11.66,19.68
vit_srelpos_small_patch16_224,2607.7,392.67,1024,224,4.59,12.16,21.97
eca_vovnet39b,2607.14,392.755,1024,224,7.09,6.74,22.6
gluon_resnet50_v1c,2599.91,393.848,1024,224,4.35,11.92,25.58
tf_efficientnet_em,2599.16,393.961,1024,240,3.04,14.34,6.9
cspresnet50w,2589.44,395.44,1024,256,5.04,12.19,28.12
resnet50d,2584.02,396.27,1024,224,4.35,11.92,25.58
legacy_seresnet50,2582.34,396.527,1024,224,3.88,10.6,28.09
resnet50t,2580.37,396.829,1024,224,4.32,11.82,25.57
twins_svt_small,2576.63,397.407,1024,224,2.94,13.75,24.06
gluon_resnet50_v1d,2570.89,398.293,1024,224,4.35,11.92,25.58
gcresnet33ts,2569.2,398.556,1024,256,4.76,11.68,19.88
cspresnet50d,2560.0,399.988,1024,256,4.86,12.55,21.64
lambda_resnet26t,2551.69,401.29,1024,256,3.02,11.87,10.96
tf_mixnet_l,2550.79,301.072,768,224,0.58,10.84,7.33
selecsls84,2543.3,402.613,1024,224,5.9,7.57,50.95
vgg11_bn,2541.42,201.45,512,224,7.62,7.44,132.87
dla60,2525.2,405.498,1024,224,4.26,10.16,22.04
cs3darknet_focus_l,2520.03,406.331,1024,288,5.9,10.16,21.15
res2net50_48w_2s,2502.4,409.196,1024,224,4.18,11.72,25.29
cs3darknet_l,2485.99,411.896,1024,288,6.16,10.83,21.16
densenet121,2467.71,414.945,1024,224,2.87,6.9,7.98
xcit_nano_12_p16_384_dist,2466.74,415.111,1024,384,1.64,12.15,3.05
xcit_tiny_24_p16_224_dist,2463.21,415.705,1024,224,2.34,11.82,12.12
tv_densenet121,2461.4,416.011,1024,224,2.87,6.9,7.98
xcit_tiny_24_p16_224,2457.2,416.72,1024,224,2.34,11.82,12.12
seresnet50,2438.84,419.859,1024,224,4.11,11.13,28.09
convnext_tiny_hnfd,2399.93,426.664,1024,224,4.47,13.44,28.59
convnext_tiny_hnf,2395.62,427.433,1024,224,4.47,13.44,28.59
efficientnet_lite3,2383.15,214.83,512,300,1.65,21.85,8.2
efficientnet_b0_g8_gn,2375.8,431.002,1024,224,0.66,6.75,6.56
convnext_tiny_in22ft1k,2362.17,433.485,1024,224,4.47,13.44,28.59
densenet121d,2362.07,433.503,1024,224,3.11,7.7,8.0
convnext_tiny,2359.72,433.936,1024,224,4.47,13.44,28.59
cs3sedarknet_l,2353.08,435.161,1024,288,6.16,10.83,21.91
resnetaa50d,2350.26,435.685,1024,224,5.39,12.44,25.58
efficientnet_cc_b0_4e,2334.0,438.721,1024,224,0.41,9.42,13.31
seresnet50t,2333.68,438.78,1024,224,4.32,11.83,28.1
ecaresnet50d,2316.33,442.066,1024,224,4.35,11.93,25.58
resnetblur50,2298.66,445.465,1024,224,5.16,12.02,25.56
mobilevit_s,2279.76,336.866,768,256,2.03,19.94,5.58
convnext_nano,2276.19,449.862,1024,288,4.06,13.84,15.59
resnetrs50,2276.18,449.864,1024,224,4.48,12.14,35.69
vit_base_resnet26d_224,2262.15,452.654,1024,224,6.97,13.16,101.4
gluon_resnet50_v1s,2257.16,453.655,1024,224,5.47,13.52,25.68
vovnet57a,2253.6,454.372,1024,224,8.95,7.52,36.64
adv_inception_v3,2250.27,455.041,1024,299,5.73,8.97,23.83
gluon_inception_v3,2249.35,455.229,1024,299,5.73,8.97,23.83
tf_inception_v3,2245.22,456.064,1024,299,5.73,8.97,23.83
tf_efficientnet_cc_b0_4e,2243.01,456.518,1024,224,0.41,9.42,13.31
inception_v3,2240.78,456.965,1024,299,5.73,8.97,23.83
tf_efficientnet_cc_b0_8e,2240.71,456.986,1024,224,0.42,9.42,24.01
densenetblur121d,2235.56,458.037,1024,224,3.11,7.9,8.0
resnest50d_1s4x24d,2213.57,462.589,1024,224,4.43,13.57,25.68
res2net50_26w_4s,2209.54,463.432,1024,224,4.28,12.61,25.7
ssl_resnext50_32x4d,2205.13,464.359,1024,224,4.26,14.4,25.03
swsl_resnext50_32x4d,2204.8,464.429,1024,224,4.26,14.4,25.03
gluon_resnext50_32x4d,2203.2,464.765,1024,224,4.26,14.4,25.03
resnext50_32x4d,2199.44,465.561,1024,224,4.26,14.4,25.03
tv_resnext50_32x4d,2198.23,465.818,1024,224,4.26,14.4,25.03
regnetx_040,2190.95,467.362,1024,224,3.99,12.2,22.12
cspresnext50,2182.4,469.194,1024,256,4.05,15.86,20.57
resnetblur50d,2182.09,469.263,1024,224,5.4,12.82,25.58
regnetz_b16,2180.8,469.54,1024,288,2.39,16.43,9.72
ese_vovnet57b,2171.57,471.535,1024,224,8.95,7.52,38.61
tf_efficientnet_lite3,2166.77,236.285,512,300,1.65,21.85,8.2
mobilevitv2_125,2151.63,356.926,768,256,2.86,20.1,7.48
efficientnet_cc_b0_8e,2149.58,476.36,1024,224,0.42,9.42,24.01
semobilevit_s,2143.19,358.331,768,256,2.03,19.95,5.74
twins_pcpvt_small,2142.01,478.043,1024,224,3.83,18.08,24.11
nf_regnet_b3,2133.81,479.88,1024,320,2.05,14.61,18.59
tf_efficientnetv2_b3,2121.62,482.639,1024,300,3.04,15.74,14.36
seresnetaa50d,2118.99,483.236,1024,224,5.4,12.46,28.11
efficientnetv2_rw_t,2117.33,483.616,1024,288,3.19,16.42,13.65
gcresnext50ts,2113.73,484.438,1024,256,3.75,15.46,15.67
edgenext_small,2107.47,485.876,1024,320,1.97,14.16,5.59
resnext50d_32x4d,2094.1,488.98,1024,224,4.5,15.2,25.05
dla60x,2080.97,492.062,1024,224,3.54,13.8,17.35
res2net50_14w_8s,2066.79,495.441,1024,224,4.21,13.28,25.06
gc_efficientnetv2_rw_t,2061.37,496.743,1024,288,3.2,16.45,13.68
sehalonet33ts,2057.14,373.322,768,256,3.55,14.7,13.69
gcresnet50t,2055.89,498.068,1024,256,5.42,14.67,25.9
skresnet50,2048.81,499.79,1024,224,4.11,12.5,25.8
fbnetv3_g,2047.87,500.019,1024,288,1.77,21.09,16.62
nf_ecaresnet50,2039.26,502.129,1024,224,4.21,11.13,25.56
nf_seresnet50,2037.45,502.576,1024,224,4.21,11.13,28.09
cs3darknet_focus_x,2026.01,505.415,1024,256,8.03,10.69,35.02
dla60_res2net,2015.55,508.035,1024,224,4.15,12.34,20.85
lambda_resnet26rpt_256,2015.24,190.536,384,256,3.16,11.87,10.99
seresnext50_32x4d,2010.4,509.339,1024,224,4.26,14.42,27.56
legacy_seresnext50_32x4d,2003.57,511.076,1024,224,4.26,14.42,27.56
repvgg_b1g4,2003.2,511.169,1024,224,8.15,10.64,39.97
gluon_seresnext50_32x4d,2002.45,511.358,1024,224,4.26,14.42,27.56
densenet169,1987.41,515.228,1024,224,3.4,7.3,14.15
res2next50,1967.78,520.369,1024,224,4.2,13.71,24.67
vit_relpos_small_patch16_rpn_224,1966.99,520.579,1024,224,4.59,13.05,21.97
skresnet50d,1957.14,523.201,1024,224,4.36,13.31,25.82
xcit_small_12_p16_224_dist,1952.72,524.382,1024,224,4.82,12.58,26.25
crossvit_small_240,1952.54,524.431,1024,240,5.63,18.17,26.86
xcit_small_12_p16_224,1952.1,524.55,1024,224,4.82,12.58,26.25
cs3sedarknet_xdw,1919.73,533.397,1024,256,5.97,17.18,21.6
swin_tiny_patch4_window7_224,1915.52,534.569,1024,224,4.51,17.06,28.29
vit_relpos_medium_patch16_cls_224,1909.56,536.236,1024,224,8.03,18.24,38.76
mixnet_xl,1903.31,268.993,512,224,0.93,14.57,11.9
dla60_res2next,1893.19,540.873,1024,224,3.49,13.17,17.03
xcit_nano_12_p8_224_dist,1887.0,542.649,1024,224,2.16,15.71,3.05
xcit_nano_12_p8_224,1883.21,543.74,1024,224,2.16,15.71,3.05
cspdarknet53,1881.33,408.211,768,256,6.57,16.81,27.64
gmlp_s16_224,1873.82,546.464,1024,224,4.42,15.1,19.42
edgenext_small_rw,1831.96,558.95,1024,320,2.46,14.85,7.83
ecaresnet26t,1828.87,559.898,1024,320,5.24,16.44,16.01
vit_small_r26_s32_224,1825.94,560.792,1024,224,3.56,9.85,36.43
vgg13,1819.73,281.346,512,224,11.31,12.25,133.05
poolformer_s24,1804.4,567.487,1024,224,3.41,10.68,21.39
crossvit_15_240,1799.06,569.173,1024,240,5.81,19.77,27.53
vit_relpos_medium_patch16_224,1794.09,570.75,1024,224,7.97,17.02,38.75
vit_srelpos_medium_patch16_224,1787.05,573.0,1024,224,7.96,16.21,38.74
mobilevitv2_150,1774.77,288.477,512,256,4.09,24.11,10.59
mobilevitv2_150_in22ft1k,1773.43,288.695,512,256,4.09,24.11,10.59
sebotnet33ts_256,1762.47,217.864,384,256,3.89,17.46,13.7
resmlp_24_224,1761.92,581.171,1024,224,5.96,10.91,30.02
efficientnet_b3,1761.71,290.615,512,320,2.01,26.52,12.23
efficientnet_b3a,1761.71,290.614,512,320,2.01,26.52,12.23
resmlp_24_distilled_224,1760.69,581.576,1024,224,5.96,10.91,30.02
regnetx_064,1757.88,436.877,768,224,6.49,16.37,26.21
resnest50d,1750.78,584.87,1024,224,5.4,14.36,27.48
gmixer_24_224,1741.35,588.036,1024,224,5.28,14.45,24.72
swin_s3_tiny_224,1737.42,589.369,1024,224,4.64,19.13,28.33
crossvit_15_dagger_240,1736.98,589.517,1024,240,6.13,20.43,28.21
vit_base_resnet50d_224,1722.65,594.42,1024,224,8.73,16.92,110.97
resnetv2_101,1717.18,596.314,1024,224,7.83,16.23,44.54
tf_efficientnet_b3_ap,1706.97,299.935,512,300,1.87,23.83,12.23
tf_efficientnet_b3,1705.74,300.151,512,300,1.87,23.83,12.23
tf_efficientnet_b3_ns,1705.51,300.191,512,300,1.87,23.83,12.23
lambda_resnet50ts,1694.68,604.231,1024,256,5.07,17.48,21.54
dla102,1693.75,604.56,1024,224,7.19,14.18,33.27
darknetaa53,1689.16,454.651,768,288,10.08,15.68,36.02
gluon_resnet101_v1b,1679.75,609.599,1024,224,7.83,16.23,44.55
tv_resnet101,1679.18,609.808,1024,224,7.83,16.23,44.55
resnet101,1676.67,610.719,1024,224,7.83,16.23,44.55
repvgg_b1,1663.53,615.546,1024,224,13.16,10.64,57.42
resnetv2_101d,1653.14,619.414,1024,224,8.07,17.04,44.56
gluon_resnet101_v1c,1649.02,620.96,1024,224,8.08,17.04,44.57
vgg13_bn,1642.15,311.774,512,224,11.33,12.25,133.05
cait_xxs24_224,1641.65,623.749,1024,224,2.53,20.29,11.96
res2net50_26w_6s,1639.34,624.627,1024,224,6.33,15.28,37.05
hrnet_w18,1631.65,627.569,1024,224,4.32,16.31,21.3
vit_large_patch32_224,1623.29,630.805,1024,224,15.39,13.3,306.54
wide_resnet50_2,1618.04,632.851,1024,224,11.43,14.4,68.88
gluon_resnet101_v1d,1616.88,633.307,1024,224,8.08,17.04,44.57
xcit_tiny_12_p16_384_dist,1614.06,634.414,1024,384,3.64,18.26,6.72
regnetv_040,1604.78,478.557,768,288,6.6,20.3,20.64
halonet50ts,1600.56,639.764,1024,256,5.3,19.2,22.73
regnety_040,1597.66,480.688,768,288,6.61,20.3,20.65
darknet53,1585.01,484.528,768,288,11.78,15.68,41.61
efficientnet_cc_b1_8e,1576.34,649.593,1024,240,0.75,15.44,39.72
coat_lite_small,1576.03,649.72,1024,224,3.96,22.09,19.84
regnety_032,1576.03,649.722,1024,288,5.29,18.61,19.44
resnetv2_50x1_bit_distilled,1575.9,649.775,1024,224,4.23,11.11,25.55
swinv2_cr_tiny_224,1574.62,650.304,1024,224,4.66,28.45,28.33
legacy_seresnet101,1569.43,652.454,1024,224,7.61,15.74,49.33
vit_base_patch32_384,1551.76,659.885,1024,384,13.06,16.5,88.3
ese_vovnet39b_evos,1551.37,660.05,1024,224,7.07,6.74,24.58
swinv2_cr_tiny_ns_224,1546.02,662.33,1024,224,4.66,28.45,28.33
vit_tiny_patch16_384,1542.96,663.648,1024,384,4.7,25.39,5.79
lamhalobotnet50ts_256,1533.2,667.873,1024,256,5.02,18.44,22.57
tf_efficientnet_cc_b1_8e,1527.24,670.479,1024,240,0.75,15.44,39.72
resnetaa101d,1521.49,673.009,1024,224,9.12,17.56,44.57
densenet201,1515.85,675.514,1024,224,4.34,7.85,20.01
resnetaa50,1510.7,677.817,1024,288,8.52,19.24,25.56
mixer_l32_224,1508.54,678.791,1024,224,11.27,19.86,206.94
seresnet101,1502.48,681.526,1024,224,7.84,16.27,49.33
vit_base_r26_s32_224,1492.4,686.129,1024,224,6.81,12.36,101.38
gluon_resnet101_v1s,1485.37,689.375,1024,224,9.19,18.64,44.67
twins_pcpvt_base,1484.93,689.584,1024,224,6.68,25.25,43.83
mobilevitv2_175,1472.18,347.77,512,256,5.54,28.13,14.25
mobilevitv2_175_in22ft1k,1472.06,347.8,512,256,5.54,28.13,14.25
nf_resnet101,1469.16,696.987,1024,224,8.01,16.23,44.55
resnest50d_4s2x40d,1467.36,697.84,1024,224,4.4,17.94,30.42
vgg16,1464.57,349.576,512,224,15.47,13.56,138.36
resnetv2_50d_frn,1463.93,699.474,1024,224,4.33,11.92,25.59
resnetblur101d,1458.09,702.276,1024,224,9.12,17.94,44.57
ecaresnet101d,1457.01,702.796,1024,224,8.08,17.07,44.57
sequencer2d_s,1455.29,703.627,1024,224,4.96,11.31,27.65
nf_resnet50,1445.9,708.195,1024,288,6.88,18.37,25.56
convnext_small,1445.85,708.22,1024,224,8.71,21.56,50.22
convnext_small_in22ft1k,1443.98,709.135,1024,224,8.71,21.56,50.22
regnetz_c16,1437.42,356.181,512,320,3.92,25.88,13.46
tresnet_l,1432.52,714.812,1024,224,10.88,11.9,55.99
cs3darknet_x,1429.24,716.453,1024,288,10.6,14.36,35.05
dla102x,1397.97,732.475,1024,224,5.89,19.42,26.31
ssl_resnext101_32x4d,1392.96,735.11,1024,224,8.01,21.23,44.18
swsl_resnext101_32x4d,1392.73,735.231,1024,224,8.01,21.23,44.18
resnext101_32x4d,1390.48,736.423,1024,224,8.01,21.23,44.18
botnet50ts_256,1389.99,276.247,384,256,5.54,22.23,22.74
skresnext50_32x4d,1389.9,736.732,1024,224,4.5,17.18,27.48
gluon_resnext101_32x4d,1389.41,736.987,1024,224,8.01,21.23,44.18
nest_tiny,1388.05,553.283,768,224,5.83,25.48,17.06
resnet50_gn,1386.72,738.422,1024,224,4.14,11.11,25.56
resnetv2_50d_evob,1383.3,740.244,1024,224,4.33,11.92,25.59
res2net50_26w_8s,1373.33,745.622,1024,224,8.37,17.95,48.4
halo2botnet50ts_256,1372.33,559.619,768,256,5.02,21.78,22.64
regnetx_080,1370.56,747.125,1024,224,8.02,14.06,39.57
cs3sedarknet_x,1368.84,748.067,1024,288,10.6,14.37,35.4
jx_nest_tiny,1362.62,563.605,768,224,5.83,25.48,17.06
convit_small,1355.18,755.603,1024,224,5.76,17.87,27.78
res2net101_26w_4s,1353.43,756.586,1024,224,8.1,18.45,45.21
xception,1340.72,572.814,768,299,8.4,35.83,22.86
mixer_b16_224_miil,1340.03,764.147,1024,224,12.62,14.53,59.88
repvgg_b2g4,1335.06,766.992,1024,224,12.63,12.9,61.76
vgg16_bn,1335.02,383.503,512,224,15.5,13.56,138.37
mixer_b16_224,1328.05,771.041,1024,224,12.62,14.53,59.88
twins_svt_base,1307.2,783.34,1024,224,8.59,26.33,56.07
dpn92,1299.67,787.878,1024,224,6.54,18.21,37.67
cs3edgenet_x,1289.05,794.37,1024,288,14.59,16.36,47.82
ese_vovnet99b_iabn,1282.63,798.345,1024,224,16.49,11.27,63.2
crossvit_18_240,1272.74,804.553,1024,240,9.05,26.26,43.27
regnety_040s_gn,1271.39,805.405,1024,224,4.03,12.29,20.65
eca_nfnet_l0,1271.38,805.411,1024,288,7.12,17.29,24.14
nfnet_l0,1269.37,806.681,1024,288,7.13,17.29,35.07
seresnext101_32x4d,1268.1,807.494,1024,224,8.02,21.26,48.96
legacy_seresnext101_32x4d,1267.59,807.817,1024,224,8.02,21.26,48.96
gluon_seresnext101_32x4d,1265.67,809.045,1024,224,8.02,21.26,48.96
nf_ecaresnet101,1264.2,809.986,1024,224,8.01,16.27,44.55
vit_relpos_medium_patch16_rpn_224,1263.66,810.331,1024,224,7.97,17.02,38.73
nf_seresnet101,1261.42,811.77,1024,224,8.02,16.27,49.33
mobilevitv2_200,1256.15,305.684,384,256,7.22,32.15,18.45
mobilevitv2_200_in22ft1k,1255.83,305.762,384,256,7.22,32.15,18.45
xception41p,1254.65,408.071,512,299,9.25,39.86,26.91
resnet51q,1254.6,816.185,1024,288,8.07,20.94,35.7
efficientnet_el,1254.42,408.143,512,300,8.0,30.7,10.59
efficientnet_el_pruned,1254.28,408.188,512,300,8.0,30.7,10.59
ese_vovnet99b,1240.88,825.205,1024,224,16.51,11.27,63.2
xcit_tiny_12_p8_224_dist,1237.16,827.688,1024,224,4.81,23.6,6.71
xcit_tiny_12_p8_224,1235.05,829.105,1024,224,4.81,23.6,6.71
crossvit_18_dagger_240,1235.02,829.126,1024,240,9.5,27.03,44.27
vgg19,1227.1,417.229,512,224,19.63,14.86,143.67
tf_efficientnet_el,1226.94,417.286,512,300,8.0,30.7,10.59
poolformer_s36,1217.09,841.334,1024,224,5.0,15.82,30.86
hrnet_w32,1204.83,849.897,1024,224,8.97,22.02,41.23
hrnet_w30,1202.88,851.275,1024,224,8.15,21.21,37.71
resnetv2_152,1196.21,856.023,1024,224,11.55,22.56,60.19
nfnet_f0,1193.84,857.722,1024,256,12.62,18.05,71.49
swin_small_patch4_window7_224,1179.92,867.841,1024,224,8.77,27.47,49.61
resmlp_36_224,1179.88,867.87,1024,224,8.91,16.33,44.69
vit_small_resnet50d_s16_224,1179.15,868.406,1024,224,13.48,24.82,57.53
resmlp_36_distilled_224,1179.01,868.509,1024,224,8.91,16.33,44.69
efficientnet_lite4,1178.02,325.958,384,380,4.04,45.66,13.01
tv_resnet152,1172.68,873.198,1024,224,11.56,22.56,60.19
gluon_resnet152_v1b,1172.67,873.208,1024,224,11.56,22.56,60.19
resnet152,1170.69,874.682,1024,224,11.56,22.56,60.19
mixnet_xxl,1163.99,329.888,384,224,2.04,23.43,23.96
resnetv2_152d,1163.57,880.032,1024,224,11.8,23.36,60.2
ecaresnet50t,1162.34,880.97,1024,320,8.82,24.13,25.57
resnet61q,1160.54,882.331,1024,288,9.87,21.52,36.85
vit_base_patch16_224_miil,1154.75,886.763,1024,224,17.58,23.9,86.54
repvgg_b2,1154.25,887.146,1024,224,20.45,12.9,89.02
inception_v4,1153.57,887.661,1024,299,12.28,15.09,42.68
swinv2_tiny_window8_256,1152.79,888.266,1024,256,5.96,24.57,28.35
densenet161,1147.8,892.122,1024,224,7.79,11.06,28.68
gluon_resnet152_v1c,1146.71,892.979,1024,224,11.8,23.36,60.21
gluon_resnet152_v1d,1141.31,897.204,1024,224,11.8,23.36,60.21
sequencer2d_m,1138.06,899.765,1024,224,6.55,14.26,38.31
vit_base_patch16_224_sam,1132.42,904.242,1024,224,17.58,23.9,86.57
deit_base_patch16_224,1132.42,904.245,1024,224,17.58,23.9,86.57
vit_base_patch16_224,1132.21,904.413,1024,224,17.58,23.9,86.57
dla169,1130.13,906.071,1024,224,11.6,20.2,53.39
regnetx_120,1129.55,453.263,512,224,12.13,21.37,46.11
volo_d1_224,1126.62,908.904,1024,224,6.94,24.43,26.63
vgg19_bn,1122.31,456.189,512,224,19.66,14.86,143.68
deit_base_distilled_patch16_224,1116.6,917.056,1024,224,17.68,24.05,87.34
xception41,1110.46,461.057,512,299,9.28,39.86,26.97
cait_xxs36_224,1104.66,926.97,1024,224,3.77,30.34,17.3
tf_efficientnet_lite4,1091.59,351.767,384,380,4.04,45.66,13.01
convmixer_1024_20_ks9_p14,1091.56,938.092,1024,224,5.55,5.51,24.38
deit3_base_patch16_224,1090.26,939.213,1024,224,17.58,23.9,86.59
deit3_base_patch16_224_in21ft1k,1088.57,940.667,1024,224,17.58,23.9,86.59
legacy_seresnet152,1086.41,942.544,1024,224,11.33,22.08,66.82
tnt_s_patch16_224,1079.54,948.54,1024,224,5.24,24.37,23.76
regnety_120,1077.58,475.125,512,224,12.14,21.38,51.82
repvgg_b3g4,1077.28,950.524,1024,224,17.89,15.1,83.83
vit_relpos_base_patch16_clsgap_224,1077.01,950.767,1024,224,17.6,25.12,86.43
vit_relpos_base_patch16_cls_224,1076.19,951.489,1024,224,17.6,25.12,86.43
gluon_resnet152_v1s,1074.28,953.181,1024,224,12.92,24.96,60.32
twins_pcpvt_large,1061.77,964.416,1024,224,9.84,35.82,60.99
seresnet152,1047.32,977.721,1024,224,11.57,22.61,66.82
beit_base_patch16_224,1045.12,979.774,1024,224,17.58,23.9,86.53
xcit_small_24_p16_224_dist,1038.39,986.125,1024,224,9.1,23.64,47.67
xcit_small_24_p16_224,1037.69,986.793,1024,224,9.1,23.64,47.67
coat_tiny,1036.7,987.731,1024,224,4.35,27.2,5.5
dm_nfnet_f0,1035.11,989.253,1024,256,12.62,18.05,71.49
nf_regnet_b4,1027.0,997.065,1024,384,4.7,28.61,30.21
vit_relpos_base_patch16_224,1017.61,1006.263,1024,224,17.51,24.97,86.43
convnext_base_in22ft1k,1006.85,1017.02,1024,224,15.38,28.75,88.59
convnext_base,1006.73,1017.126,1024,224,15.38,28.75,88.59
pit_b_224,993.61,515.277,512,224,12.42,32.94,73.76
pit_b_distilled_224,985.16,519.696,512,224,12.5,33.07,74.79
tresnet_xl,983.38,1041.292,1024,224,15.17,15.34,78.44
efficientnetv2_s,976.0,1049.166,1024,384,8.44,35.77,21.46
dla102x2,973.1,526.138,512,224,9.34,29.91,41.28
cs3se_edgenet_x,972.26,1053.196,1024,320,18.01,20.21,50.72
vit_small_patch16_36x1_224,972.14,1053.329,1024,224,13.71,35.69,64.67
swinv2_cr_small_224,966.28,1059.712,1024,224,9.07,50.27,49.7
swinv2_cr_small_ns_224,955.69,1071.465,1024,224,9.08,50.27,49.7
tf_efficientnetv2_s_in21ft1k,955.24,1071.964,1024,384,8.44,35.77,21.46
tf_efficientnetv2_s,955.13,1072.086,1024,384,8.44,35.77,21.46
vit_small_patch16_18x2_224,948.32,1079.793,1024,224,13.71,35.69,64.67
wide_resnet101_2,939.08,1090.412,1024,224,22.8,21.23,126.89
regnetx_160,936.53,546.684,512,224,15.99,25.52,54.28
regnety_080,933.52,548.447,512,288,13.22,29.69,39.18
regnetz_b16_evos,933.51,822.691,768,288,2.36,16.43,9.74
efficientnetv2_rw_s,931.24,1099.596,1024,384,8.72,38.03,23.94
resnetv2_50d_gn,920.9,1111.946,1024,288,7.24,19.7,25.57
twins_svt_large,918.22,1115.185,1024,224,15.15,35.1,99.27
efficientnet_b4,917.89,418.339,384,384,4.51,50.04,19.34
regnetz_040,913.72,420.249,384,320,6.35,37.78,27.12
xception65p,910.71,562.184,512,299,13.91,52.48,39.82
regnetz_040h,909.33,422.274,384,320,6.43,37.94,28.94
dpn98,906.73,1129.316,1024,224,11.73,25.2,61.57
repvgg_b3,901.67,1135.661,1024,224,29.16,15.1,123.09
resnetrs101,898.53,1139.62,1024,288,13.56,28.53,63.62
gluon_resnext101_64x4d,887.37,1153.955,1024,224,15.52,31.21,83.46
nest_small,885.28,867.51,768,224,10.35,40.04,38.35
poolformer_m36,879.83,1163.84,1024,224,8.8,22.02,56.17
regnetz_d8,877.84,1166.489,1024,320,6.19,37.08,23.37
jx_nest_small,874.11,878.596,768,224,10.35,40.04,38.35
ssl_resnext101_32x8d,874.01,1171.597,1024,224,16.48,31.21,88.79
swsl_resnext101_32x8d,873.31,1172.532,1024,224,16.48,31.21,88.79
resnext101_32x8d,873.01,1172.932,1024,224,16.48,31.21,88.79
ig_resnext101_32x8d,872.81,1173.211,1024,224,16.48,31.21,88.79
regnetz_d32,869.58,1177.564,1024,320,9.33,37.08,27.58
inception_resnet_v2,868.78,1178.653,1024,299,13.18,25.06,55.84
ens_adv_inception_resnet_v2,868.32,1179.275,1024,299,13.18,25.06,55.84
xcit_tiny_24_p16_384_dist,866.54,1181.7,1024,384,6.87,34.29,12.12
cait_s24_224,865.33,1183.354,1024,224,9.35,40.58,46.92
resnest101e,858.93,894.122,768,256,13.38,28.66,48.28
tf_efficientnet_b4,858.91,447.067,384,380,4.49,49.49,19.34
tf_efficientnet_b4_ap,858.7,447.171,384,380,4.49,49.49,19.34
tf_efficientnet_b4_ns,858.52,447.267,384,380,4.49,49.49,19.34
swin_s3_small_224,853.54,899.766,768,224,9.43,37.84,49.74
regnetv_064,852.1,600.857,512,288,10.55,27.11,30.58
regnety_064,851.33,601.396,512,288,10.56,27.11,30.58
resnet200,847.44,1208.333,1024,224,15.07,32.19,64.67
gluon_seresnext101_64x4d,834.87,1226.518,1024,224,15.53,31.25,88.23
coat_mini,833.41,1228.669,1024,224,6.82,33.68,10.34
swin_base_patch4_window7_224,832.6,1229.869,1024,224,15.47,36.63,87.77
resnet101d,816.8,1253.661,1024,320,16.48,34.77,44.57
gluon_xception65,816.5,627.052,512,299,13.96,52.48,39.92
xception65,811.16,631.185,512,299,13.96,52.48,39.92
resnetv2_50d_evos,810.51,947.543,768,288,7.15,19.7,25.59
convnext_tiny_384_in22ft1k,807.27,634.218,512,384,13.14,39.48,28.59
gmlp_b16_224,789.84,1296.449,1024,224,15.78,30.21,73.08
hrnet_w40,787.85,1299.728,1024,224,12.75,25.29,57.56
crossvit_base_240,787.17,975.639,768,240,21.22,36.33,105.03
hrnet_w44,771.15,1327.87,1024,224,14.94,26.92,67.06
swinv2_tiny_window16_256,763.4,670.672,512,256,6.68,39.02,28.35
mobilevitv2_150_384_in22ft1k,757.55,337.918,256,384,9.2,54.25,10.59
xcit_medium_24_p16_224_dist,748.7,1367.689,1024,224,16.13,31.71,84.4
xcit_medium_24_p16_224,748.18,1368.635,1024,224,16.13,31.71,84.4
tresnet_m_448,743.16,1377.885,1024,448,22.94,29.21,31.39
vit_large_r50_s32_224,742.19,1379.692,1024,224,19.58,24.41,328.99
hrnet_w48,738.63,1386.343,1024,224,17.34,28.56,77.47
vit_base_patch16_plus_240,738.11,1387.321,1024,240,27.41,33.08,117.56
sequencer2d_l,736.17,1390.978,1024,224,9.74,22.12,54.3
xcit_small_12_p16_384_dist,715.91,1430.327,1024,384,14.14,36.51,26.25
swinv2_small_window8_256,710.32,1441.594,1024,256,11.58,40.14,49.73
swin_s3_base_224,693.67,1476.198,1024,224,13.69,48.26,71.13
vit_small_patch16_384,692.4,1109.164,768,384,15.52,50.78,22.2
vit_relpos_base_patch16_plus_240,691.79,1480.194,1024,240,27.3,34.33,117.38
tnt_b_patch16_224,691.78,1480.223,1024,224,14.09,39.01,65.41
swinv2_cr_base_224,688.11,1488.125,1024,224,15.86,59.66,87.88
densenet264d_iabn,687.57,1489.287,1024,224,13.47,14.0,72.74
convit_base,685.88,1492.962,1024,224,17.52,31.77,86.54
swinv2_cr_base_ns_224,682.58,1500.17,1024,224,15.86,59.66,87.88
vit_base_patch16_rpn_224,667.73,1533.544,1024,224,17.49,23.75,86.54
densenet264,664.62,1540.716,1024,224,12.95,12.8,72.69
deit3_small_patch16_384,664.03,1156.564,768,384,15.52,50.78,22.21
poolformer_m48,663.83,1542.547,1024,224,11.59,29.17,73.47
deit3_small_patch16_384_in21ft1k,663.62,1157.274,768,384,15.52,50.78,22.21
efficientnet_b3_gn,662.87,386.187,256,320,2.14,28.83,11.73
dpn131,660.11,1551.238,1024,224,16.09,32.97,79.25
eca_nfnet_l1,655.87,1561.27,1024,320,14.92,34.42,41.41
vit_relpos_base_patch16_rpn_224,655.49,1562.186,1024,224,17.51,24.97,86.41
xcit_tiny_24_p8_224,650.45,1574.283,1024,224,9.21,45.39,12.11
xcit_tiny_24_p8_224_dist,649.22,1577.262,1024,224,9.21,45.39,12.11
xcit_nano_12_p8_384_dist,643.06,1592.369,1024,384,6.34,46.08,3.05
nest_base,629.02,813.95,512,224,17.96,53.39,67.72
volo_d2_224,627.91,1630.781,1024,224,14.34,41.34,58.68
mobilevitv2_175_384_in22ft1k,627.52,407.942,256,384,12.47,63.29,14.25
jx_nest_base,621.88,823.3,512,224,17.96,53.39,67.72
vit_small_r26_s32_384,619.54,619.804,384,384,10.43,29.85,36.47
senet154,618.82,1654.743,1024,224,20.77,38.69,115.09
gluon_senet154,618.51,1655.586,1024,224,20.77,38.69,115.09
legacy_senet154,618.16,1656.503,1024,224,20.77,38.69,115.09
xception71,616.97,829.852,512,299,18.09,69.92,42.34
vit_base_r50_s16_224,613.11,1670.152,1024,224,21.66,35.29,98.66
hrnet_w64,609.7,1679.491,1024,224,28.97,35.09,128.06
regnety_320,607.61,842.637,512,224,32.34,30.26,145.05
dpn107,606.08,1689.539,1024,224,18.38,33.46,86.92
regnetz_c16_evos,598.89,854.904,512,320,3.86,25.88,13.49
ecaresnet200d,592.5,1728.248,1024,256,20.0,43.15,64.69
seresnet200d,591.19,1732.085,1024,256,20.01,43.15,71.86
resnet152d,576.9,1774.999,1024,320,24.08,47.67,60.21
convnext_large,559.02,1831.761,1024,224,34.4,43.13,197.77
convnext_large_in22ft1k,558.96,1831.941,1024,224,34.4,43.13,197.77
regnety_160,558.21,687.896,384,288,26.37,38.07,83.59
efficientnet_b3_g8_gn,557.9,458.854,256,320,3.2,28.83,14.25
xcit_small_12_p8_224,546.6,1873.371,1024,224,18.69,47.21,26.21
xcit_small_12_p8_224_dist,546.45,1873.905,1024,224,18.69,47.21,26.21
resnext101_64x4d,541.68,1417.803,768,288,25.66,51.59,83.46
mobilevitv2_200_384_in22ft1k,527.08,364.262,192,384,16.24,72.34,18.45
halonet_h1,518.76,493.471,256,256,3.0,51.17,8.1
vit_large_patch32_384,517.18,1979.967,1024,384,45.31,43.86,306.63
seresnet152d,516.02,1984.399,1024,320,24.09,47.72,66.84
resnetrs152,512.78,1996.941,1024,320,24.34,48.14,86.62
swinv2_base_window8_256,507.32,1513.812,768,256,20.37,52.59,87.92
seresnext101_32x8d,503.19,1526.235,768,288,27.24,51.63,93.57
convnext_small_384_in22ft1k,494.64,1035.087,512,384,25.58,63.37,50.22
seresnext101d_32x8d,494.43,1553.287,768,288,27.64,52.95,93.59
swin_large_patch4_window7_224,478.67,1604.435,768,224,34.53,54.94,196.53
swinv2_small_window16_256,476.38,1074.753,512,256,12.82,66.29,49.73
regnetz_e8,474.49,1618.577,768,320,15.46,63.94,57.7
regnetx_320,471.27,814.799,384,224,31.81,36.3,107.81
ssl_resnext101_32x16d,471.02,1086.983,512,224,36.27,51.18,194.03
swsl_resnext101_32x16d,470.83,1087.428,512,224,36.27,51.18,194.03
ig_resnext101_32x16d,470.74,1087.624,512,224,36.27,51.18,194.03
mixer_l16_224,470.73,2175.315,1024,224,44.6,41.69,208.2
seresnextaa101d_32x8d,463.39,1657.351,768,288,28.51,56.44,93.59
seresnet269d,463.29,2210.273,1024,256,26.59,53.6,113.67
nf_regnet_b5,450.96,1135.344,512,456,11.7,61.95,49.74
efficientnetv2_m,449.82,2276.453,1024,416,18.6,67.5,54.14
volo_d3_224,439.99,2327.294,1024,224,20.78,60.09,86.33
efficientnet_b5,425.78,601.238,256,456,10.46,98.86,30.39
xcit_large_24_p16_224_dist,423.07,2420.403,1024,224,35.86,47.27,189.1
xcit_large_24_p16_224,422.98,2420.908,1024,224,35.86,47.27,189.1
xcit_tiny_12_p8_384_dist,419.35,2441.847,1024,384,14.13,69.14,6.71
resnet200d,417.0,2455.593,1024,320,31.25,67.33,64.69
efficientnetv2_rw_m,411.82,1864.879,768,416,21.49,79.62,53.24
tf_efficientnet_b5_ns,408.16,627.186,256,456,10.46,98.86,30.39
swinv2_cr_tiny_384,408.1,627.286,256,384,15.34,161.01,28.33
tf_efficientnet_b5,407.78,627.773,256,456,10.46,98.86,30.39
tf_efficientnet_b5_ap,407.68,627.936,256,456,10.46,98.86,30.39
swinv2_cr_large_224,405.25,1895.127,768,224,35.1,78.42,196.68
resnetv2_50x1_bitm,401.93,955.37,384,448,16.62,44.46,25.55
nfnet_f1,399.69,2561.946,1024,320,35.97,46.77,132.63
xcit_small_24_p16_384_dist,382.57,2676.633,1024,384,26.72,68.58,47.67
regnetz_d8_evos,376.87,2037.797,768,320,7.03,38.92,23.46
tresnet_l_448,371.52,2756.242,1024,448,43.5,47.56,55.99
vit_large_patch16_224,369.7,2769.802,1024,224,61.6,63.52,304.33
resnetrs200,368.58,2778.22,1024,320,31.51,67.81,93.21
convnext_xlarge_in22ft1k,368.02,1391.221,512,224,60.98,57.5,350.2
crossvit_15_dagger_408,366.37,698.731,256,408,21.45,95.05,28.5
vit_base_patch16_18x2_224,361.96,2829.064,1024,224,52.51,71.38,256.73
deit3_large_patch16_224,358.07,2859.733,1024,224,61.6,63.52,304.37
deit3_large_patch16_224_in21ft1k,357.9,2861.143,1024,224,61.6,63.52,304.37
dm_nfnet_f1,357.87,2146.026,768,320,35.97,46.77,132.63
tf_efficientnetv2_m,350.54,2190.896,768,480,24.76,89.84,54.14
tf_efficientnetv2_m_in21ft1k,350.14,2193.372,768,480,24.76,89.84,54.14
swinv2_base_window16_256,345.6,1111.087,384,256,22.02,84.71,87.92
swinv2_base_window12to16_192to256_22kft1k,345.47,1111.525,384,256,22.02,84.71,87.92
convnext_base_384_in22ft1k,344.56,1485.926,512,384,45.21,84.49,88.59
beit_large_patch16_224,342.32,2991.347,1024,224,61.6,63.52,304.43
eca_nfnet_l2,322.02,2384.947,768,384,30.05,68.28,56.72
volo_d1_384,293.04,1747.159,512,384,22.75,108.55,26.78
convmixer_768_32,292.83,3496.872,1024,224,19.55,25.95,21.11
resnetv2_152x2_bit_teacher,291.46,2634.992,768,224,46.95,45.11,236.34
deit_base_patch16_384,288.65,1330.327,384,384,55.54,101.56,86.86
vit_base_patch16_384,288.47,1331.141,384,384,55.54,101.56,86.86
resnest200e,288.19,1776.58,512,320,35.69,82.78,70.2
xcit_small_24_p8_224,286.12,3578.848,1024,224,35.81,90.78,47.63
xcit_small_24_p8_224_dist,286.06,3579.677,1024,224,35.81,90.78,47.63
deit_base_distilled_patch16_384,284.56,1349.413,384,384,55.65,101.82,87.63
volo_d4_224,282.61,3623.333,1024,224,44.34,80.22,192.96
deit3_base_patch16_384,277.81,1382.217,384,384,55.54,101.56,86.88
deit3_base_patch16_384_in21ft1k,277.78,1382.367,384,384,55.54,101.56,86.88
tresnet_xl_448,277.15,2771.052,768,448,60.65,61.31,78.44
nasnetalarge,276.88,1386.877,384,331,23.89,90.56,88.75
vit_large_patch14_224,271.51,3771.489,1024,224,81.08,88.79,304.2
cait_xxs24_384,269.82,3795.14,1024,384,9.63,122.66,12.03
crossvit_18_dagger_408,269.4,950.247,256,408,32.47,124.87,44.61
xcit_medium_24_p16_384_dist,269.2,2852.889,768,384,47.39,91.64,84.4
pnasnet5large,264.84,1449.925,384,331,25.04,92.89,86.06
resnetv2_101x1_bitm,252.59,1520.226,384,448,31.65,64.93,44.54
efficientnet_b6,252.26,507.392,128,528,19.4,167.39,43.04
swinv2_cr_small_384,250.03,1023.876,256,384,29.7,298.03,49.7
beit_base_patch16_384,247.68,1550.363,384,384,55.54,101.56,86.74
vit_large_r50_s32_384,246.17,1559.866,384,384,57.43,76.52,329.09
tf_efficientnet_b6_ns,242.42,527.986,128,528,19.4,167.39,43.04
tf_efficientnet_b6,242.34,528.179,128,528,19.4,167.39,43.04
tf_efficientnet_b6_ap,242.3,528.255,128,528,19.4,167.39,43.04
ecaresnet269d,241.69,4236.816,1024,352,50.25,101.25,102.09
resnetrs270,234.11,4373.986,1024,352,51.13,105.48,129.86
nfnet_f2,224.73,4556.614,1024,352,63.22,79.06,193.78
swin_base_patch4_window12_384,220.36,871.278,192,384,47.19,134.78,87.9
xcit_tiny_24_p8_384_dist,219.9,4656.678,1024,384,27.05,132.95,12.11
resmlp_big_24_224,218.18,4693.363,1024,224,100.23,87.31,129.14
resmlp_big_24_224_in22ft1k,217.68,4704.164,1024,224,100.23,87.31,129.14
resmlp_big_24_distilled_224,217.65,4704.831,1024,224,100.23,87.31,129.14
swinv2_large_window12to16_192to256_22kft1k,211.96,1207.756,256,256,47.81,121.53,196.74
efficientnetv2_l,206.63,2477.808,512,480,56.4,157.99,118.52
tf_efficientnetv2_l,204.52,2503.355,512,480,56.4,157.99,118.52
tf_efficientnetv2_l_in21ft1k,204.48,2503.917,512,480,56.4,157.99,118.52
ig_resnext101_32x32d,202.59,1263.594,256,224,87.29,91.12,468.53
xcit_medium_24_p8_224,202.12,5066.293,1024,224,63.53,121.23,84.32
xcit_medium_24_p8_224_dist,201.88,5072.196,1024,224,63.53,121.23,84.32
dm_nfnet_f2,200.18,3836.576,768,352,63.22,79.06,193.78
convnext_large_384_in22ft1k,190.55,1343.472,256,384,101.1,126.74,197.77
vit_base_patch8_224,188.25,1359.85,256,224,78.22,161.69,86.58
volo_d5_224,187.56,5459.662,1024,224,72.4,118.11,295.46
cait_xs24_384,186.33,4121.716,768,384,19.28,183.98,26.67
xcit_small_12_p8_384_dist,183.57,2091.823,384,384,54.92,138.29,26.21
eca_nfnet_l3,182.91,2799.141,512,448,52.55,118.4,72.04
cait_xxs36_384,180.41,5675.791,1024,384,14.35,183.7,17.37
swinv2_cr_base_384,178.38,1435.085,256,384,50.57,333.68,87.88
vit_base_resnet50_384,177.85,2159.087,384,384,67.43,135.03,98.95
vit_base_r50_s16_384,177.6,2162.196,384,384,67.43,135.03,98.95
swinv2_cr_huge_224,175.47,2188.347,384,224,115.97,121.08,657.83
convmixer_1536_20,167.1,6128.044,1024,224,48.68,33.03,51.63
volo_d2_384,164.75,1553.889,256,384,46.17,184.51,58.87
resnetrs350,156.77,4898.75,768,384,77.59,154.74,163.96
xcit_large_24_p16_384_dist,154.33,3317.602,512,384,105.35,137.17,189.1
vit_huge_patch14_224,146.32,6998.359,1024,224,167.4,139.41,632.05
efficientnet_b7,145.11,661.558,96,600,38.33,289.94,66.35
cait_s24_384,144.99,3531.336,512,384,32.17,245.31,47.06
deit3_huge_patch14_224,142.26,7197.843,1024,224,167.4,139.41,632.13
deit3_huge_patch14_224_in21ft1k,142.17,7202.758,1024,224,167.4,139.41,632.13
tf_efficientnet_b7_ns,140.64,682.566,96,600,38.33,289.94,66.35
tf_efficientnet_b7_ap,140.61,682.704,96,600,38.33,289.94,66.35
tf_efficientnet_b7,140.6,682.756,96,600,38.33,289.94,66.35
efficientnetv2_xl,139.56,2751.573,384,512,93.85,247.32,208.12
tf_efficientnetv2_xl_in21ft1k,138.42,2774.117,384,512,93.85,247.32,208.12
resnest269e,135.65,2830.833,384,416,77.69,171.98,110.93
swin_large_patch4_window12_384,130.35,981.936,128,384,104.08,202.16,196.74
convnext_xlarge_384_in22ft1k,125.25,1532.9,192,384,179.2,168.99,350.2
nfnet_f3,124.74,4104.555,512,416,115.58,141.78,254.92
ig_resnext101_32x48d,118.28,1623.193,192,224,153.57,131.06,828.41
xcit_large_24_p8_224,115.22,4443.765,512,224,141.23,181.56,188.93
xcit_large_24_p8_224_dist,115.18,4445.056,512,224,141.23,181.56,188.93
resnetrs420,112.12,6849.78,768,416,108.45,213.79,191.89
dm_nfnet_f3,110.18,4647.097,512,416,115.58,141.78,254.92
swinv2_cr_large_384,108.04,1184.75,128,384,108.95,404.96,196.68
resnetv2_50x3_bitm,102.09,1253.798,128,448,145.7,133.37,217.32
resnetv2_152x2_bit_teacher_384,98.91,2588.163,256,384,136.16,132.56,236.34
vit_large_patch16_384,97.45,2626.88,256,384,191.21,270.24,304.72
cait_s36_384,97.05,5275.469,512,384,47.99,367.4,68.37
xcit_small_24_p8_384_dist,96.34,3985.916,384,384,105.24,265.91,47.63
vit_giant_patch14_224,95.73,8022.929,768,224,267.18,192.64,1012.61
deit3_large_patch16_384,94.64,2704.996,256,384,191.21,270.24,304.76
deit3_large_patch16_384_in21ft1k,94.52,2708.314,256,384,191.21,270.24,304.76
swinv2_base_window12to24_192to384_22kft1k,94.37,678.174,64,384,55.25,280.36,87.92
efficientnet_b8,91.29,1051.594,96,672,63.48,442.89,87.41
tf_efficientnet_b8,88.95,1079.277,96,672,63.48,442.89,87.41
tf_efficientnet_b8_ap,88.84,1080.533,96,672,63.48,442.89,87.41
beit_large_patch16_384,84.67,3023.634,256,384,191.21,270.24,305.0
resnetv2_152x2_bitm,73.09,2626.956,192,448,184.99,180.43,236.34
volo_d3_448,72.41,2651.496,192,448,96.33,446.83,86.63
nfnet_f4,69.91,5493.031,384,512,216.26,262.26,316.07
xcit_medium_24_p8_384_dist,67.93,3768.466,256,384,186.67,354.73,84.32
dm_nfnet_f4,62.55,4092.528,256,512,216.26,262.26,316.07
resnetv2_101x3_bitm,61.05,2096.759,128,448,280.33,194.78,387.93
swinv2_large_window12to24_192to384_22kft1k,59.71,803.821,48,384,116.15,407.83,196.74
vit_gigantic_patch14_224,57.59,8890.782,512,224,483.95,275.37,1844.44
tf_efficientnet_l2_ns_475,56.35,1135.833,64,475,172.11,609.89,480.31
volo_d4_448,52.92,2418.622,128,448,197.13,527.35,193.41
swinv2_cr_giant_224,50.53,2532.906,128,224,483.85,309.15,2598.76
nfnet_f5,49.64,5157.064,256,544,290.97,349.71,377.21
swinv2_cr_huge_384,47.06,1360.056,64,384,352.04,583.18,657.94
dm_nfnet_f5,44.17,5795.363,256,544,290.97,349.71,377.21
xcit_large_24_p8_384_dist,38.64,4968.379,192,384,415.0,531.82,188.93
nfnet_f6,37.99,6738.223,256,576,378.69,452.2,438.36
volo_d5_448,36.49,3507.831,128,448,315.06,737.92,295.91
beit_large_patch16_512,33.88,2833.282,96,512,362.24,656.39,305.67
dm_nfnet_f6,33.83,7567.962,256,576,378.69,452.2,438.36
cait_m36_384,31.72,8071.786,256,384,173.11,734.81,271.22
nfnet_f7,30.38,8426.213,256,608,480.39,570.85,499.5
volo_d5_512,25.58,3752.221,96,512,425.09,1105.37,296.09
resnetv2_152x4_bitm,22.67,4234.474,96,480,844.84,414.26,936.53
efficientnet_l2,20.51,1169.975,24,800,479.12,1707.39,480.31
tf_efficientnet_l2_ns,20.15,1191.261,24,800,479.12,1707.39,480.31
swinv2_cr_giant_384,14.62,2188.205,32,384,1450.71,1394.86,2598.76
cait_m48_448,13.47,9503.031,128,448,329.41,1708.23,356.46
| 0
|
hf_public_repos/pytorch-image-models
|
hf_public_repos/pytorch-image-models/results/generate_csv_results.py
|
import numpy as np
import pandas as pd
results = {
'results-imagenet.csv': [
'results-imagenet-real.csv',
'results-imagenetv2-matched-frequency.csv',
'results-sketch.csv'
],
'results-imagenet-a-clean.csv': [
'results-imagenet-a.csv',
],
'results-imagenet-r-clean.csv': [
'results-imagenet-r.csv',
],
}
def diff(base_df, test_csv):
base_models = base_df['model'].values
test_df = pd.read_csv(test_csv)
test_models = test_df['model'].values
rank_diff = np.zeros_like(test_models, dtype='object')
top1_diff = np.zeros_like(test_models, dtype='object')
top5_diff = np.zeros_like(test_models, dtype='object')
for rank, model in enumerate(test_models):
if model in base_models:
base_rank = int(np.where(base_models == model)[0][0])
top1_d = test_df['top1'][rank] - base_df['top1'][base_rank]
top5_d = test_df['top5'][rank] - base_df['top5'][base_rank]
# rank_diff
if rank == base_rank:
rank_diff[rank] = '0'
elif rank > base_rank:
rank_diff[rank] = f'-{rank - base_rank}'
else:
rank_diff[rank] = f'+{base_rank - rank}'
# top1_diff
if top1_d >= .0:
top1_diff[rank] = f'+{top1_d:.3f}'
else:
top1_diff[rank] = f'-{abs(top1_d):.3f}'
# top5_diff
if top5_d >= .0:
top5_diff[rank] = f'+{top5_d:.3f}'
else:
top5_diff[rank] = f'-{abs(top5_d):.3f}'
else:
rank_diff[rank] = ''
top1_diff[rank] = ''
top5_diff[rank] = ''
test_df['top1_diff'] = top1_diff
test_df['top5_diff'] = top5_diff
test_df['rank_diff'] = rank_diff
test_df['param_count'] = test_df['param_count'].map('{:,.2f}'.format)
test_df.sort_values(['top1', 'top5', 'model'], ascending=[False, False, True], inplace=True)
test_df.to_csv(test_csv, index=False, float_format='%.3f')
for base_results, test_results in results.items():
base_df = pd.read_csv(base_results)
base_df.sort_values(['top1', 'top5', 'model'], ascending=[False, False, True], inplace=True)
for test_csv in test_results:
diff(base_df, test_csv)
base_df['param_count'] = base_df['param_count'].map('{:,.2f}'.format)
base_df.to_csv(base_results, index=False, float_format='%.3f')
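The signed diff strings the script writes (`rank_diff`, `top1_diff`, `top5_diff`) follow a simple convention: an explicit `+`/`-` sign, with a positive `rank_diff` meaning the model ranks higher (closer to the top) than in the base CSV. A minimal standalone sketch of that formatting (function names here are illustrative, not part of the script):

```python
def fmt_rank_diff(rank, base_rank):
    # positive means the model moved up relative to the base ranking
    if rank == base_rank:
        return '0'
    return f'-{rank - base_rank}' if rank > base_rank else f'+{base_rank - rank}'

def fmt_metric_diff(d):
    # explicit sign, three decimal places, matching the script's output
    return f'+{d:.3f}' if d >= 0.0 else f'-{abs(d):.3f}'
```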
| 0
|
hf_public_repos/pytorch-image-models
|
hf_public_repos/pytorch-image-models/timm/__init__.py
|
from .version import __version__
from .layers import is_scriptable, is_exportable, set_scriptable, set_exportable
from .models import create_model, list_models, list_pretrained, is_model, list_modules, model_entrypoint, \
is_model_pretrained, get_pretrained_cfg, get_pretrained_cfg_value
| 0
|
hf_public_repos/pytorch-image-models
|
hf_public_repos/pytorch-image-models/timm/version.py
|
__version__ = '0.9.13dev0'
| 0
|
hf_public_repos/pytorch-image-models/timm
|
hf_public_repos/pytorch-image-models/timm/layers/activations_me.py
|
""" Activations (memory-efficient w/ custom autograd)
A collection of activation functions and modules with a common interface so that they can
easily be swapped. All have an `inplace` arg even if not used.
These activations are not compatible with jit scripting or ONNX export of the model; use
the JIT or basic versions of the activations instead.
Hacked together by / Copyright 2020 Ross Wightman
"""
import torch
from torch import nn as nn
from torch.nn import functional as F
@torch.jit.script
def swish_jit_fwd(x):
return x.mul(torch.sigmoid(x))
@torch.jit.script
def swish_jit_bwd(x, grad_output):
x_sigmoid = torch.sigmoid(x)
return grad_output * (x_sigmoid * (1 + x * (1 - x_sigmoid)))
class SwishJitAutoFn(torch.autograd.Function):
""" torch.jit.script optimised Swish w/ memory-efficient checkpoint
Inspired by conversation between Jeremy Howard & Adam Paszke
https://twitter.com/jeremyphoward/status/1188251041835315200
"""
@staticmethod
def symbolic(g, x):
return g.op("Mul", x, g.op("Sigmoid", x))
@staticmethod
def forward(ctx, x):
ctx.save_for_backward(x)
return swish_jit_fwd(x)
@staticmethod
def backward(ctx, grad_output):
x = ctx.saved_tensors[0]
return swish_jit_bwd(x, grad_output)
def swish_me(x, inplace=False):
return SwishJitAutoFn.apply(x)
class SwishMe(nn.Module):
def __init__(self, inplace: bool = False):
super(SwishMe, self).__init__()
def forward(self, x):
return SwishJitAutoFn.apply(x)
@torch.jit.script
def mish_jit_fwd(x):
return x.mul(torch.tanh(F.softplus(x)))
@torch.jit.script
def mish_jit_bwd(x, grad_output):
x_sigmoid = torch.sigmoid(x)
x_tanh_sp = F.softplus(x).tanh()
return grad_output.mul(x_tanh_sp + x * x_sigmoid * (1 - x_tanh_sp * x_tanh_sp))
class MishJitAutoFn(torch.autograd.Function):
""" Mish: A Self Regularized Non-Monotonic Neural Activation Function - https://arxiv.org/abs/1908.08681
A memory efficient, jit scripted variant of Mish
"""
@staticmethod
def forward(ctx, x):
ctx.save_for_backward(x)
return mish_jit_fwd(x)
@staticmethod
def backward(ctx, grad_output):
x = ctx.saved_tensors[0]
return mish_jit_bwd(x, grad_output)
def mish_me(x, inplace=False):
return MishJitAutoFn.apply(x)
class MishMe(nn.Module):
def __init__(self, inplace: bool = False):
super(MishMe, self).__init__()
def forward(self, x):
return MishJitAutoFn.apply(x)
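The hand-derived gradient in `mish_jit_bwd` can be sanity-checked against a central finite difference without torch; this pure-Python sketch (with illustrative helper names) mirrors the same formula:

```python
import math

def mish(x):
    # x * tanh(softplus(x))
    return x * math.tanh(math.log1p(math.exp(x)))

def mish_grad(x):
    # same expression as mish_jit_bwd: tanh(sp) + x * sigmoid(x) * (1 - tanh(sp)^2)
    sig = 1.0 / (1.0 + math.exp(-x))
    tsp = math.tanh(math.log1p(math.exp(x)))
    return tsp + x * sig * (1.0 - tsp * tsp)

def numeric_grad(f, x, h=1e-6):
    # central finite difference
    return (f(x + h) - f(x - h)) / (2.0 * h)
```

Evaluating both at a handful of points shows the analytic and numeric gradients agree to well under 1e-5.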
@torch.jit.script
def hard_sigmoid_jit_fwd(x, inplace: bool = False):
return (x + 3).clamp(min=0, max=6).div(6.)
@torch.jit.script
def hard_sigmoid_jit_bwd(x, grad_output):
m = torch.ones_like(x) * ((x >= -3.) & (x <= 3.)) / 6.
return grad_output * m
class HardSigmoidJitAutoFn(torch.autograd.Function):
@staticmethod
def forward(ctx, x):
ctx.save_for_backward(x)
return hard_sigmoid_jit_fwd(x)
@staticmethod
def backward(ctx, grad_output):
x = ctx.saved_tensors[0]
return hard_sigmoid_jit_bwd(x, grad_output)
def hard_sigmoid_me(x, inplace: bool = False):
return HardSigmoidJitAutoFn.apply(x)
class HardSigmoidMe(nn.Module):
def __init__(self, inplace: bool = False):
super(HardSigmoidMe, self).__init__()
def forward(self, x):
return HardSigmoidJitAutoFn.apply(x)
@torch.jit.script
def hard_swish_jit_fwd(x):
return x * (x + 3).clamp(min=0, max=6).div(6.)
@torch.jit.script
def hard_swish_jit_bwd(x, grad_output):
m = torch.ones_like(x) * (x >= 3.)
m = torch.where((x >= -3.) & (x <= 3.), x / 3. + .5, m)
return grad_output * m
class HardSwishJitAutoFn(torch.autograd.Function):
"""A memory efficient, jit-scripted HardSwish activation"""
@staticmethod
def forward(ctx, x):
ctx.save_for_backward(x)
return hard_swish_jit_fwd(x)
@staticmethod
def backward(ctx, grad_output):
x = ctx.saved_tensors[0]
return hard_swish_jit_bwd(x, grad_output)
@staticmethod
def symbolic(g, self):
input = g.op("Add", self, g.op('Constant', value_t=torch.tensor(3, dtype=torch.float)))
hardtanh_ = g.op("Clip", input, g.op('Constant', value_t=torch.tensor(0, dtype=torch.float)), g.op('Constant', value_t=torch.tensor(6, dtype=torch.float)))
hardtanh_ = g.op("Div", hardtanh_, g.op('Constant', value_t=torch.tensor(6, dtype=torch.float)))
return g.op("Mul", self, hardtanh_)
def hard_swish_me(x, inplace=False):
return HardSwishJitAutoFn.apply(x)
class HardSwishMe(nn.Module):
def __init__(self, inplace: bool = False):
super(HardSwishMe, self).__init__()
def forward(self, x):
return HardSwishJitAutoFn.apply(x)
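The piecewise gradient in `hard_swish_jit_bwd` (0 below -3, x/3 + 0.5 on [-3, 3], 1 above 3) can likewise be checked numerically; a pure-Python sketch with illustrative names:

```python
def hard_swish(x):
    # x * relu6(x + 3) / 6
    return x * min(max(x + 3.0, 0.0), 6.0) / 6.0

def hard_swish_grad(x):
    # piecewise gradient matching hard_swish_jit_bwd
    if x >= 3.0:
        return 1.0
    if x >= -3.0:
        return x / 3.0 + 0.5
    return 0.0

def numeric_grad(f, x, h=1e-6):
    # central finite difference; avoid the kinks at x = +/-3
    return (f(x + h) - f(x - h)) / (2.0 * h)
```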
@torch.jit.script
def hard_mish_jit_fwd(x):
return 0.5 * x * (x + 2).clamp(min=0, max=2)
@torch.jit.script
def hard_mish_jit_bwd(x, grad_output):
m = torch.ones_like(x) * (x >= -2.)
m = torch.where((x >= -2.) & (x <= 0.), x + 1., m)
return grad_output * m
class HardMishJitAutoFn(torch.autograd.Function):
""" A memory efficient, jit scripted variant of Hard Mish
Experimental, based on notes by Mish author Diganta Misra at
https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md
"""
@staticmethod
def forward(ctx, x):
ctx.save_for_backward(x)
return hard_mish_jit_fwd(x)
@staticmethod
def backward(ctx, grad_output):
x = ctx.saved_tensors[0]
return hard_mish_jit_bwd(x, grad_output)
def hard_mish_me(x, inplace: bool = False):
return HardMishJitAutoFn.apply(x)
class HardMishMe(nn.Module):
def __init__(self, inplace: bool = False):
super(HardMishMe, self).__init__()
def forward(self, x):
return HardMishJitAutoFn.apply(x)
| 0
|
hf_public_repos/pytorch-image-models/timm
|
hf_public_repos/pytorch-image-models/timm/layers/activations.py
|
""" Activations
A collection of activation functions and modules with a common interface so that they can
easily be swapped. All have an `inplace` arg even if not used.
Hacked together by / Copyright 2020 Ross Wightman
"""
import torch
from torch import nn as nn
from torch.nn import functional as F
def swish(x, inplace: bool = False):
"""Swish - Described in: https://arxiv.org/abs/1710.05941
"""
return x.mul_(x.sigmoid()) if inplace else x.mul(x.sigmoid())
class Swish(nn.Module):
def __init__(self, inplace: bool = False):
super(Swish, self).__init__()
self.inplace = inplace
def forward(self, x):
return swish(x, self.inplace)
def mish(x, inplace: bool = False):
"""Mish: A Self Regularized Non-Monotonic Neural Activation Function - https://arxiv.org/abs/1908.08681
NOTE: I don't have a working inplace variant
"""
return x.mul(F.softplus(x).tanh())
class Mish(nn.Module):
"""Mish: A Self Regularized Non-Monotonic Neural Activation Function - https://arxiv.org/abs/1908.08681
"""
def __init__(self, inplace: bool = False):
super(Mish, self).__init__()
def forward(self, x):
return mish(x)
def sigmoid(x, inplace: bool = False):
return x.sigmoid_() if inplace else x.sigmoid()
# PyTorch has this, but not with a consistent inplace argument interface
class Sigmoid(nn.Module):
def __init__(self, inplace: bool = False):
super(Sigmoid, self).__init__()
self.inplace = inplace
def forward(self, x):
return x.sigmoid_() if self.inplace else x.sigmoid()
def tanh(x, inplace: bool = False):
return x.tanh_() if inplace else x.tanh()
# PyTorch has this, but not with a consistent inplace argument interface
class Tanh(nn.Module):
def __init__(self, inplace: bool = False):
super(Tanh, self).__init__()
self.inplace = inplace
def forward(self, x):
return x.tanh_() if self.inplace else x.tanh()
def hard_swish(x, inplace: bool = False):
inner = F.relu6(x + 3.).div_(6.)
return x.mul_(inner) if inplace else x.mul(inner)
class HardSwish(nn.Module):
def __init__(self, inplace: bool = False):
super(HardSwish, self).__init__()
self.inplace = inplace
def forward(self, x):
return hard_swish(x, self.inplace)
def hard_sigmoid(x, inplace: bool = False):
if inplace:
return x.add_(3.).clamp_(0., 6.).div_(6.)
else:
return F.relu6(x + 3.) / 6.
class HardSigmoid(nn.Module):
def __init__(self, inplace: bool = False):
super(HardSigmoid, self).__init__()
self.inplace = inplace
def forward(self, x):
return hard_sigmoid(x, self.inplace)
def hard_mish(x, inplace: bool = False):
""" Hard Mish
Experimental, based on notes by Mish author Diganta Misra at
https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md
"""
if inplace:
return x.mul_(0.5 * (x + 2).clamp(min=0, max=2))
else:
return 0.5 * x * (x + 2).clamp(min=0, max=2)
class HardMish(nn.Module):
def __init__(self, inplace: bool = False):
super(HardMish, self).__init__()
self.inplace = inplace
def forward(self, x):
return hard_mish(x, self.inplace)
class PReLU(nn.PReLU):
"""Applies PReLU (w/ dummy inplace arg)
"""
def __init__(self, num_parameters: int = 1, init: float = 0.25, inplace: bool = False) -> None:
super(PReLU, self).__init__(num_parameters=num_parameters, init=init)
def forward(self, input: torch.Tensor) -> torch.Tensor:
return F.prelu(input, self.weight)
def gelu(x: torch.Tensor, inplace: bool = False) -> torch.Tensor:
return F.gelu(x)
class GELU(nn.Module):
"""Applies the Gaussian Error Linear Units function (w/ dummy inplace arg)
"""
def __init__(self, inplace: bool = False):
super(GELU, self).__init__()
def forward(self, input: torch.Tensor) -> torch.Tensor:
return F.gelu(input)
def gelu_tanh(x: torch.Tensor, inplace: bool = False) -> torch.Tensor:
return F.gelu(x, approximate='tanh')
class GELUTanh(nn.Module):
"""Applies the Gaussian Error Linear Units function (w/ dummy inplace arg)
"""
def __init__(self, inplace: bool = False):
super(GELUTanh, self).__init__()
def forward(self, input: torch.Tensor) -> torch.Tensor:
return F.gelu(input, approximate='tanh')
def quick_gelu(x: torch.Tensor, inplace: bool = False) -> torch.Tensor:
return x * torch.sigmoid(1.702 * x)
class QuickGELU(nn.Module):
"""Applies the Gaussian Error Linear Units function (w/ dummy inplace arg)
"""
def __init__(self, inplace: bool = False):
super(QuickGELU, self).__init__()
def forward(self, input: torch.Tensor) -> torch.Tensor:
return quick_gelu(input)
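`quick_gelu` replaces the exact normal CDF in GELU with `sigmoid(1.702 * x)`; a pure-Python comparison (illustrative names) shows the approximation stays within a few hundredths of the exact value over a typical activation range:

```python
import math

def gelu_exact(x):
    # x * Phi(x), where Phi is the standard normal CDF
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def gelu_quick(x):
    # sigmoid-based approximation used by quick_gelu above
    return x / (1.0 + math.exp(-1.702 * x))
```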
| 0
|
hf_public_repos/pytorch-image-models/timm
|
hf_public_repos/pytorch-image-models/timm/layers/norm_act.py
|
""" Normalization + Activation Layers
Provides Norm+Act fns for standard PyTorch norm layers such as
* BatchNorm
* GroupNorm
* LayerNorm
This allows swapping with alternative layers that are natively both norm + act such as
* EvoNorm (evo_norm.py)
* FilterResponseNorm (filter_response_norm.py)
* InplaceABN (inplace_abn.py)
Hacked together by / Copyright 2022 Ross Wightman
"""
from typing import Union, List, Optional, Any
import torch
from torch import nn as nn
from torch.nn import functional as F
from torchvision.ops.misc import FrozenBatchNorm2d
from .create_act import get_act_layer
from .fast_norm import is_fast_norm, fast_group_norm, fast_layer_norm
from .trace_utils import _assert
def _create_act(act_layer, act_kwargs=None, inplace=False, apply_act=True):
act_layer = get_act_layer(act_layer) # string -> nn.Module
act_kwargs = act_kwargs or {}
if act_layer is not None and apply_act:
if inplace:
act_kwargs['inplace'] = inplace
act = act_layer(**act_kwargs)
else:
act = nn.Identity()
return act
class BatchNormAct2d(nn.BatchNorm2d):
"""BatchNorm + Activation
This module performs BatchNorm + Activation in a manner that will remain backwards
compatible with weights trained with separate bn, act. This is why we inherit from BN
instead of composing it as a .bn member.
"""
def __init__(
self,
num_features,
eps=1e-5,
momentum=0.1,
affine=True,
track_running_stats=True,
apply_act=True,
act_layer=nn.ReLU,
act_kwargs=None,
inplace=True,
drop_layer=None,
device=None,
dtype=None,
):
try:
factory_kwargs = {'device': device, 'dtype': dtype}
super(BatchNormAct2d, self).__init__(
num_features,
eps=eps,
momentum=momentum,
affine=affine,
track_running_stats=track_running_stats,
**factory_kwargs,
)
except TypeError:
# NOTE for backwards compat with old PyTorch w/o factory device/dtype support
super(BatchNormAct2d, self).__init__(
num_features,
eps=eps,
momentum=momentum,
affine=affine,
track_running_stats=track_running_stats,
)
self.drop = drop_layer() if drop_layer is not None else nn.Identity()
self.act = _create_act(act_layer, act_kwargs=act_kwargs, inplace=inplace, apply_act=apply_act)
def forward(self, x):
# cut & paste of torch.nn.BatchNorm2d.forward impl to avoid issues with torchscript and tracing
_assert(x.ndim == 4, f'expected 4D input (got {x.ndim}D input)')
# exponential_average_factor is set to self.momentum
# (when it is available) only so that it gets updated
# in ONNX graph when this node is exported to ONNX.
if self.momentum is None:
exponential_average_factor = 0.0
else:
exponential_average_factor = self.momentum
if self.training and self.track_running_stats:
# TODO: if statement only here to tell the jit to skip emitting this when it is None
if self.num_batches_tracked is not None: # type: ignore[has-type]
self.num_batches_tracked.add_(1) # type: ignore[has-type]
if self.momentum is None: # use cumulative moving average
exponential_average_factor = 1.0 / float(self.num_batches_tracked)
else: # use exponential moving average
exponential_average_factor = self.momentum
r"""
Decide whether the mini-batch stats should be used for normalization rather than the buffers.
Mini-batch stats are used in training mode, and in eval mode when buffers are None.
"""
if self.training:
bn_training = True
else:
bn_training = (self.running_mean is None) and (self.running_var is None)
r"""
Buffers are only updated if they are to be tracked and we are in training mode. Thus they only need to be
passed when the update should occur (i.e. in training mode when they are tracked), or when buffer stats are
used for normalization (i.e. in eval mode when buffers are not None).
"""
x = F.batch_norm(
x,
# If buffers are not to be tracked, ensure that they won't be updated
self.running_mean if not self.training or self.track_running_stats else None,
self.running_var if not self.training or self.track_running_stats else None,
self.weight,
self.bias,
bn_training,
exponential_average_factor,
self.eps,
)
x = self.drop(x)
x = self.act(x)
return x
class SyncBatchNormAct(nn.SyncBatchNorm):
# Thanks to Selim Seferbekov (https://github.com/rwightman/pytorch-image-models/issues/1254)
# This is a quick workaround to support SyncBatchNorm for timm BatchNormAct2d layers
# but ONLY when used in conjunction with the timm conversion function below.
# Do not create this module directly or use the PyTorch conversion function.
def forward(self, x: torch.Tensor) -> torch.Tensor:
x = super().forward(x) # SyncBN doesn't work with torchscript anyways, so this is fine
if hasattr(self, "drop"):
x = self.drop(x)
if hasattr(self, "act"):
x = self.act(x)
return x
def convert_sync_batchnorm(module, process_group=None):
# convert both BatchNorm and BatchNormAct layers to Synchronized variants
module_output = module
if isinstance(module, torch.nn.modules.batchnorm._BatchNorm):
if isinstance(module, BatchNormAct2d):
# convert timm norm + act layer
module_output = SyncBatchNormAct(
module.num_features,
module.eps,
module.momentum,
module.affine,
module.track_running_stats,
process_group=process_group,
)
# set act and drop attr from the original module
module_output.act = module.act
module_output.drop = module.drop
else:
# convert standard BatchNorm layers
module_output = torch.nn.SyncBatchNorm(
module.num_features,
module.eps,
module.momentum,
module.affine,
module.track_running_stats,
process_group,
)
if module.affine:
with torch.no_grad():
module_output.weight = module.weight
module_output.bias = module.bias
module_output.running_mean = module.running_mean
module_output.running_var = module.running_var
module_output.num_batches_tracked = module.num_batches_tracked
if hasattr(module, "qconfig"):
module_output.qconfig = module.qconfig
for name, child in module.named_children():
module_output.add_module(name, convert_sync_batchnorm(child, process_group))
del module
return module_output
class FrozenBatchNormAct2d(torch.nn.Module):
"""
BatchNormAct2d where the batch statistics and the affine parameters are fixed
Args:
num_features (int): Number of features ``C`` from an expected input of size ``(N, C, H, W)``
eps (float): a value added to the denominator for numerical stability. Default: 1e-5
"""
def __init__(
self,
num_features: int,
eps: float = 1e-5,
apply_act=True,
act_layer=nn.ReLU,
act_kwargs=None,
inplace=True,
drop_layer=None,
):
super().__init__()
self.eps = eps
self.register_buffer("weight", torch.ones(num_features))
self.register_buffer("bias", torch.zeros(num_features))
self.register_buffer("running_mean", torch.zeros(num_features))
self.register_buffer("running_var", torch.ones(num_features))
self.drop = drop_layer() if drop_layer is not None else nn.Identity()
self.act = _create_act(act_layer, act_kwargs=act_kwargs, inplace=inplace, apply_act=apply_act)
def _load_from_state_dict(
self,
state_dict: dict,
prefix: str,
local_metadata: dict,
strict: bool,
missing_keys: List[str],
unexpected_keys: List[str],
error_msgs: List[str],
):
num_batches_tracked_key = prefix + "num_batches_tracked"
if num_batches_tracked_key in state_dict:
del state_dict[num_batches_tracked_key]
super()._load_from_state_dict(
state_dict, prefix, local_metadata, strict, missing_keys, unexpected_keys, error_msgs
)
def forward(self, x: torch.Tensor) -> torch.Tensor:
# move reshapes to the beginning
# to make it fuser-friendly
w = self.weight.reshape(1, -1, 1, 1)
b = self.bias.reshape(1, -1, 1, 1)
rv = self.running_var.reshape(1, -1, 1, 1)
rm = self.running_mean.reshape(1, -1, 1, 1)
scale = w * (rv + self.eps).rsqrt()
bias = b - rm * scale
x = x * scale + bias
x = self.act(self.drop(x))
return x
def __repr__(self) -> str:
return f"{self.__class__.__name__}({self.weight.shape[0]}, eps={self.eps}, act={self.act})"
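The forward pass above folds the running statistics and affine parameters into a single per-channel scale and bias before applying them. A scalar pure-Python sketch (illustrative names) confirms the folded form equals textbook eval-mode batch norm:

```python
import math

def folded_bn(x, w, b, rm, rv, eps=1e-5):
    # scale/bias folding as in FrozenBatchNormAct2d.forward
    scale = w / math.sqrt(rv + eps)
    bias = b - rm * scale
    return x * scale + bias

def eval_bn(x, w, b, rm, rv, eps=1e-5):
    # textbook eval-mode batch norm: normalize with running stats, then affine
    return (x - rm) / math.sqrt(rv + eps) * w + b
```

The folding is purely algebraic, which is what makes the frozen module fuser-friendly.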
def freeze_batch_norm_2d(module):
"""
Converts all `BatchNorm2d` and `SyncBatchNorm` or `BatchNormAct2d` and `SyncBatchNormAct` layers
of the provided module into `FrozenBatchNorm2d` or `FrozenBatchNormAct2d` respectively.
Args:
module (torch.nn.Module): Any PyTorch module.
Returns:
torch.nn.Module: Resulting module
Inspired by https://github.com/pytorch/pytorch/blob/a5895f85be0f10212791145bfedc0261d364f103/torch/nn/modules/batchnorm.py#L762
"""
res = module
if isinstance(module, (BatchNormAct2d, SyncBatchNormAct)):
res = FrozenBatchNormAct2d(module.num_features)
res.num_features = module.num_features
res.affine = module.affine
if module.affine:
res.weight.data = module.weight.data.clone().detach()
res.bias.data = module.bias.data.clone().detach()
res.running_mean.data = module.running_mean.data
res.running_var.data = module.running_var.data
res.eps = module.eps
res.drop = module.drop
res.act = module.act
elif isinstance(module, (torch.nn.modules.batchnorm.BatchNorm2d, torch.nn.modules.batchnorm.SyncBatchNorm)):
res = FrozenBatchNorm2d(module.num_features)
res.num_features = module.num_features
res.affine = module.affine
if module.affine:
res.weight.data = module.weight.data.clone().detach()
res.bias.data = module.bias.data.clone().detach()
res.running_mean.data = module.running_mean.data
res.running_var.data = module.running_var.data
res.eps = module.eps
else:
for name, child in module.named_children():
new_child = freeze_batch_norm_2d(child)
if new_child is not child:
res.add_module(name, new_child)
return res
def unfreeze_batch_norm_2d(module):
"""
Converts all `FrozenBatchNorm2d` layers of the provided module into `BatchNorm2d`, and all
`FrozenBatchNormAct2d` layers into `BatchNormAct2d`. If `module` is itself an instance of one of
these frozen types, it is converted and returned. Otherwise, the module is walked recursively
and submodules are converted in place.
Args:
module (torch.nn.Module): Any PyTorch module.
Returns:
torch.nn.Module: Resulting module
Inspired by https://github.com/pytorch/pytorch/blob/a5895f85be0f10212791145bfedc0261d364f103/torch/nn/modules/batchnorm.py#L762
"""
res = module
if isinstance(module, FrozenBatchNormAct2d):
res = BatchNormAct2d(module.num_features)
if module.affine:
res.weight.data = module.weight.data.clone().detach()
res.bias.data = module.bias.data.clone().detach()
res.running_mean.data = module.running_mean.data
res.running_var.data = module.running_var.data
res.eps = module.eps
res.drop = module.drop
res.act = module.act
elif isinstance(module, FrozenBatchNorm2d):
res = torch.nn.BatchNorm2d(module.num_features)
if module.affine:
res.weight.data = module.weight.data.clone().detach()
res.bias.data = module.bias.data.clone().detach()
res.running_mean.data = module.running_mean.data
res.running_var.data = module.running_var.data
res.eps = module.eps
else:
for name, child in module.named_children():
new_child = unfreeze_batch_norm_2d(child)
if new_child is not child:
res.add_module(name, new_child)
return res
def _num_groups(num_channels, num_groups, group_size):
if group_size:
assert num_channels % group_size == 0
return num_channels // group_size
return num_groups
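As `_num_groups` shows, `group_size` (channels per group), when given, takes precedence over `num_groups` and must divide the channel count evenly. A quick standalone sketch of the same rule (illustrative name):

```python
def resolve_num_groups(num_channels, num_groups, group_size=None):
    # group_size takes precedence; it must divide the channel count evenly
    if group_size:
        assert num_channels % group_size == 0
        return num_channels // group_size
    return num_groups
```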
class GroupNormAct(nn.GroupNorm):
# NOTE num_channel and num_groups order flipped for easier layer swaps / binding of fixed args
def __init__(
self,
num_channels,
num_groups=32,
eps=1e-5,
affine=True,
group_size=None,
apply_act=True,
act_layer=nn.ReLU,
act_kwargs=None,
inplace=True,
drop_layer=None,
):
super(GroupNormAct, self).__init__(
_num_groups(num_channels, num_groups, group_size),
num_channels,
eps=eps,
affine=affine,
)
self.drop = drop_layer() if drop_layer is not None else nn.Identity()
self.act = _create_act(act_layer, act_kwargs=act_kwargs, inplace=inplace, apply_act=apply_act)
self._fast_norm = is_fast_norm()
def forward(self, x):
if self._fast_norm:
x = fast_group_norm(x, self.num_groups, self.weight, self.bias, self.eps)
else:
x = F.group_norm(x, self.num_groups, self.weight, self.bias, self.eps)
x = self.drop(x)
x = self.act(x)
return x
class GroupNorm1Act(nn.GroupNorm):
def __init__(
self,
num_channels,
eps=1e-5,
affine=True,
apply_act=True,
act_layer=nn.ReLU,
act_kwargs=None,
inplace=True,
drop_layer=None,
):
super(GroupNorm1Act, self).__init__(1, num_channels, eps=eps, affine=affine)
self.drop = drop_layer() if drop_layer is not None else nn.Identity()
self.act = _create_act(act_layer, act_kwargs=act_kwargs, inplace=inplace, apply_act=apply_act)
self._fast_norm = is_fast_norm()
def forward(self, x):
if self._fast_norm:
x = fast_group_norm(x, self.num_groups, self.weight, self.bias, self.eps)
else:
x = F.group_norm(x, self.num_groups, self.weight, self.bias, self.eps)
x = self.drop(x)
x = self.act(x)
return x
class LayerNormAct(nn.LayerNorm):
def __init__(
self,
normalization_shape: Union[int, List[int], torch.Size],
eps=1e-5,
affine=True,
apply_act=True,
act_layer=nn.ReLU,
act_kwargs=None,
inplace=True,
drop_layer=None,
):
super(LayerNormAct, self).__init__(normalization_shape, eps=eps, elementwise_affine=affine)
self.drop = drop_layer() if drop_layer is not None else nn.Identity()
act_layer = get_act_layer(act_layer) # string -> nn.Module
self.act = _create_act(act_layer, act_kwargs=act_kwargs, inplace=inplace, apply_act=apply_act)
self._fast_norm = is_fast_norm()
def forward(self, x):
if self._fast_norm:
x = fast_layer_norm(x, self.normalized_shape, self.weight, self.bias, self.eps)
else:
x = F.layer_norm(x, self.normalized_shape, self.weight, self.bias, self.eps)
x = self.drop(x)
x = self.act(x)
return x
class LayerNormAct2d(nn.LayerNorm):
def __init__(
self,
num_channels,
eps=1e-5,
affine=True,
apply_act=True,
act_layer=nn.ReLU,
act_kwargs=None,
inplace=True,
drop_layer=None,
):
super(LayerNormAct2d, self).__init__(num_channels, eps=eps, elementwise_affine=affine)
self.drop = drop_layer() if drop_layer is not None else nn.Identity()
self.act = _create_act(act_layer, act_kwargs=act_kwargs, inplace=inplace, apply_act=apply_act)
self._fast_norm = is_fast_norm()
def forward(self, x):
x = x.permute(0, 2, 3, 1)
if self._fast_norm:
x = fast_layer_norm(x, self.normalized_shape, self.weight, self.bias, self.eps)
else:
x = F.layer_norm(x, self.normalized_shape, self.weight, self.bias, self.eps)
x = x.permute(0, 3, 1, 2)
x = self.drop(x)
x = self.act(x)
return x
# timm/layers/mixed_conv2d.py
""" PyTorch Mixed Convolution
Paper: MixConv: Mixed Depthwise Convolutional Kernels (https://arxiv.org/abs/1907.09595)
Hacked together by / Copyright 2020 Ross Wightman
"""
import torch
from torch import nn as nn
from .conv2d_same import create_conv2d_pad
def _split_channels(num_chan, num_groups):
split = [num_chan // num_groups for _ in range(num_groups)]
split[0] += num_chan - sum(split)
return split
class MixedConv2d(nn.ModuleDict):
""" Mixed Grouped Convolution
Based on MDConv and GroupedConv in MixNet impl:
https://github.com/tensorflow/tpu/blob/master/models/official/mnasnet/mixnet/custom_layers.py
"""
def __init__(self, in_channels, out_channels, kernel_size=3,
stride=1, padding='', dilation=1, depthwise=False, **kwargs):
super(MixedConv2d, self).__init__()
kernel_size = kernel_size if isinstance(kernel_size, list) else [kernel_size]
num_groups = len(kernel_size)
in_splits = _split_channels(in_channels, num_groups)
out_splits = _split_channels(out_channels, num_groups)
self.in_channels = sum(in_splits)
self.out_channels = sum(out_splits)
for idx, (k, in_ch, out_ch) in enumerate(zip(kernel_size, in_splits, out_splits)):
conv_groups = in_ch if depthwise else 1
# use add_module to keep key space clean
self.add_module(
str(idx),
create_conv2d_pad(
in_ch, out_ch, k, stride=stride,
padding=padding, dilation=dilation, groups=conv_groups, **kwargs)
)
self.splits = in_splits
def forward(self, x):
x_split = torch.split(x, self.splits, 1)
x_out = [c(x_split[i]) for i, c in enumerate(self.values())]
x = torch.cat(x_out, 1)
return x
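A pure-Python sketch of the channel split used by `MixedConv2d` above: channels are divided as evenly as possible across kernel-size groups, with any remainder assigned to the first group, so no channels are lost.

```python
# Mirror of _split_channels: even split across groups, remainder to group 0.
def split_channels(num_chan, num_groups):
    split = [num_chan // num_groups for _ in range(num_groups)]
    split[0] += num_chan - sum(split)
    return split

print(split_channels(13, 4))       # -> [4, 3, 3, 3]
print(sum(split_channels(13, 4)))  # -> 13, total channel count preserved
```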
# timm/layers/conv2d_same.py
""" Conv2d w/ Same Padding
Hacked together by / Copyright 2020 Ross Wightman
"""
import torch
import torch.nn as nn
import torch.nn.functional as F
from typing import Tuple, Optional
from .config import is_exportable, is_scriptable
from .padding import pad_same, pad_same_arg, get_padding_value
_USE_EXPORT_CONV = False
def conv2d_same(
x,
weight: torch.Tensor,
bias: Optional[torch.Tensor] = None,
stride: Tuple[int, int] = (1, 1),
padding: Tuple[int, int] = (0, 0),
dilation: Tuple[int, int] = (1, 1),
groups: int = 1,
):
x = pad_same(x, weight.shape[-2:], stride, dilation)
return F.conv2d(x, weight, bias, stride, (0, 0), dilation, groups)
class Conv2dSame(nn.Conv2d):
""" Tensorflow like 'SAME' convolution wrapper for 2D convolutions
"""
def __init__(
self,
in_channels,
out_channels,
kernel_size,
stride=1,
padding=0,
dilation=1,
groups=1,
bias=True,
):
super(Conv2dSame, self).__init__(
in_channels, out_channels, kernel_size,
stride, 0, dilation, groups, bias,
)
def forward(self, x):
return conv2d_same(
x, self.weight, self.bias,
self.stride, self.padding, self.dilation, self.groups,
)
class Conv2dSameExport(nn.Conv2d):
""" ONNX export friendly Tensorflow like 'SAME' convolution wrapper for 2D convolutions
NOTE: This does not currently work with torch.jit.script
"""
# pylint: disable=unused-argument
def __init__(
self,
in_channels,
out_channels,
kernel_size,
stride=1,
padding=0,
dilation=1,
groups=1,
bias=True,
):
super(Conv2dSameExport, self).__init__(
in_channels, out_channels, kernel_size,
stride, 0, dilation, groups, bias,
)
self.pad = None
self.pad_input_size = (0, 0)
def forward(self, x):
input_size = x.size()[-2:]
if self.pad is None:
pad_arg = pad_same_arg(input_size, self.weight.size()[-2:], self.stride, self.dilation)
self.pad = nn.ZeroPad2d(pad_arg)
self.pad_input_size = input_size
x = self.pad(x)
return F.conv2d(
x, self.weight, self.bias,
self.stride, self.padding, self.dilation, self.groups,
)
def create_conv2d_pad(in_chs, out_chs, kernel_size, **kwargs):
padding = kwargs.pop('padding', '')
kwargs.setdefault('bias', False)
padding, is_dynamic = get_padding_value(padding, kernel_size, **kwargs)
if is_dynamic:
if _USE_EXPORT_CONV and is_exportable():
# older PyTorch ver needed this to export same padding reasonably
assert not is_scriptable() # Conv2DSameExport does not work with jit
return Conv2dSameExport(in_chs, out_chs, kernel_size, **kwargs)
else:
return Conv2dSame(in_chs, out_chs, kernel_size, **kwargs)
else:
return nn.Conv2d(in_chs, out_chs, kernel_size, padding=padding, **kwargs)
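A hedged sketch of the TF 'SAME' padding math that `conv2d_same` relies on (via `pad_same`): total padding per spatial dim is chosen so the output size is `ceil(input / stride)`, regardless of input size.

```python
import math

# Total 'SAME' padding for one spatial dim: input i, kernel k, stride s, dilation d.
def same_pad_amount(i, k, s=1, d=1):
    return max((math.ceil(i / s) - 1) * s + (k - 1) * d + 1 - i, 0)

# Standard conv output-size formula given total padding.
def conv_out_size(i, k, s, pad):
    return (i + pad - k) // s + 1

i, k, s = 7, 3, 2
pad = same_pad_amount(i, k, s)
print(pad)                          # 2 padding pixels in total
print(conv_out_size(i, k, s, pad))  # 4 == ceil(7 / 2)
```

Note the asymmetry: when `pad` is odd, `pad_same` puts the extra pixel on the right/bottom, matching Tensorflow.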
# timm/layers/bottleneck_attn.py
""" Bottleneck Self Attention (Bottleneck Transformers)
Paper: `Bottleneck Transformers for Visual Recognition` - https://arxiv.org/abs/2101.11605
@misc{2101.11605,
Author = {Aravind Srinivas and Tsung-Yi Lin and Niki Parmar and Jonathon Shlens and Pieter Abbeel and Ashish Vaswani},
Title = {Bottleneck Transformers for Visual Recognition},
Year = {2021},
}
Based on ref gist at: https://gist.github.com/aravindsrinivas/56359b79f0ce4449bcb04ab4b56a57a2
This impl is a WIP, but given that it is based on the reference gist it is likely not too far off.
Hacked together by / Copyright 2021 Ross Wightman
"""
from typing import List
import torch
import torch.nn as nn
import torch.nn.functional as F
from .helpers import to_2tuple, make_divisible
from .weight_init import trunc_normal_
from .trace_utils import _assert
def rel_logits_1d(q, rel_k, permute_mask: List[int]):
""" Compute relative logits along one dimension
As per: https://gist.github.com/aravindsrinivas/56359b79f0ce4449bcb04ab4b56a57a2
Originally from: `Attention Augmented Convolutional Networks` - https://arxiv.org/abs/1904.09925
Args:
q: (batch, heads, height, width, dim)
rel_k: (2 * width - 1, dim)
permute_mask: permute output dim according to this
"""
B, H, W, dim = q.shape
x = (q @ rel_k.transpose(-1, -2))
    x = x.reshape(-1, W, 2 * W - 1)
# pad to shift from relative to absolute indexing
x_pad = F.pad(x, [0, 1]).flatten(1)
x_pad = F.pad(x_pad, [0, W - 1])
# reshape and slice out the padded elements
x_pad = x_pad.reshape(-1, W + 1, 2 * W - 1)
x = x_pad[:, :W, W - 1:]
# reshape and tile
x = x.reshape(B, H, 1, W, W).expand(-1, -1, H, -1, -1)
return x.permute(permute_mask)
class PosEmbedRel(nn.Module):
""" Relative Position Embedding
As per: https://gist.github.com/aravindsrinivas/56359b79f0ce4449bcb04ab4b56a57a2
Originally from: `Attention Augmented Convolutional Networks` - https://arxiv.org/abs/1904.09925
"""
def __init__(self, feat_size, dim_head, scale):
super().__init__()
self.height, self.width = to_2tuple(feat_size)
self.dim_head = dim_head
self.height_rel = nn.Parameter(torch.randn(self.height * 2 - 1, dim_head) * scale)
self.width_rel = nn.Parameter(torch.randn(self.width * 2 - 1, dim_head) * scale)
def forward(self, q):
B, HW, _ = q.shape
# relative logits in width dimension.
q = q.reshape(B, self.height, self.width, -1)
rel_logits_w = rel_logits_1d(q, self.width_rel, permute_mask=(0, 1, 3, 2, 4))
# relative logits in height dimension.
q = q.transpose(1, 2)
rel_logits_h = rel_logits_1d(q, self.height_rel, permute_mask=(0, 3, 1, 4, 2))
rel_logits = rel_logits_h + rel_logits_w
rel_logits = rel_logits.reshape(B, HW, HW)
return rel_logits
class BottleneckAttn(nn.Module):
""" Bottleneck Attention
Paper: `Bottleneck Transformers for Visual Recognition` - https://arxiv.org/abs/2101.11605
The internal dimensions of the attention module are controlled by the interaction of several arguments.
* the output dimension of the module is specified by dim_out, which falls back to input dim if not set
* the value (v) dimension is set to dim_out // num_heads, the v projection determines the output dim
* the query and key (qk) dimensions are determined by
* num_heads * dim_head if dim_head is not None
* num_heads * (dim_out * attn_ratio // num_heads) if dim_head is None
* as seen above, attn_ratio determines the ratio of q and k relative to the output if dim_head not used
Args:
dim (int): input dimension to the module
dim_out (int): output dimension of the module, same as dim if not set
stride (int): output stride of the module, avg pool used if stride == 2 (default: 1).
num_heads (int): parallel attention heads (default: 4)
dim_head (int): dimension of query and key heads, calculated from dim_out * attn_ratio // num_heads if not set
qk_ratio (float): ratio of q and k dimensions to output dimension when dim_head not set. (default: 1.0)
qkv_bias (bool): add bias to q, k, and v projections
scale_pos_embed (bool): scale the position embedding as well as Q @ K
"""
def __init__(
self, dim, dim_out=None, feat_size=None, stride=1, num_heads=4, dim_head=None,
qk_ratio=1.0, qkv_bias=False, scale_pos_embed=False):
super().__init__()
assert feat_size is not None, 'A concrete feature size matching expected input (H, W) is required'
dim_out = dim_out or dim
assert dim_out % num_heads == 0
self.num_heads = num_heads
self.dim_head_qk = dim_head or make_divisible(dim_out * qk_ratio, divisor=8) // num_heads
self.dim_head_v = dim_out // self.num_heads
self.dim_out_qk = num_heads * self.dim_head_qk
self.dim_out_v = num_heads * self.dim_head_v
self.scale = self.dim_head_qk ** -0.5
self.scale_pos_embed = scale_pos_embed
self.qkv = nn.Conv2d(dim, self.dim_out_qk * 2 + self.dim_out_v, 1, bias=qkv_bias)
# NOTE I'm only supporting relative pos embedding for now
self.pos_embed = PosEmbedRel(feat_size, dim_head=self.dim_head_qk, scale=self.scale)
self.pool = nn.AvgPool2d(2, 2) if stride == 2 else nn.Identity()
self.reset_parameters()
def reset_parameters(self):
trunc_normal_(self.qkv.weight, std=self.qkv.weight.shape[1] ** -0.5) # fan-in
trunc_normal_(self.pos_embed.height_rel, std=self.scale)
trunc_normal_(self.pos_embed.width_rel, std=self.scale)
def forward(self, x):
B, C, H, W = x.shape
_assert(H == self.pos_embed.height, '')
_assert(W == self.pos_embed.width, '')
x = self.qkv(x) # B, (2 * dim_head_qk + dim_head_v) * num_heads, H, W
# NOTE head vs channel split ordering in qkv projection was decided before I allowed qk to differ from v
# So, this is more verbose than if heads were before qkv splits, but throughput is not impacted.
q, k, v = torch.split(x, [self.dim_out_qk, self.dim_out_qk, self.dim_out_v], dim=1)
q = q.reshape(B * self.num_heads, self.dim_head_qk, -1).transpose(-1, -2)
k = k.reshape(B * self.num_heads, self.dim_head_qk, -1) # no transpose, for q @ k
v = v.reshape(B * self.num_heads, self.dim_head_v, -1).transpose(-1, -2)
if self.scale_pos_embed:
attn = (q @ k + self.pos_embed(q)) * self.scale # B * num_heads, H * W, H * W
else:
attn = (q @ k) * self.scale + self.pos_embed(q)
attn = attn.softmax(dim=-1)
out = (attn @ v).transpose(-1, -2).reshape(B, self.dim_out_v, H, W) # B, dim_out, H, W
out = self.pool(out)
return out
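A standalone sketch of `BottleneckAttn`'s q/k/v dimension bookkeeping described in the class docstring above. `make_divisible` is re-implemented here to mirror the timm helper; treat the exact rounding behavior as an assumption.

```python
# Stand-in for timm's make_divisible: round v to a multiple of divisor,
# never dropping below 90% of the original value.
def make_divisible(v, divisor=8, min_value=None):
    min_value = min_value or divisor
    new_v = max(min_value, int(v + divisor / 2) // divisor * divisor)
    if new_v < 0.9 * v:
        new_v += divisor
    return new_v

dim_out, num_heads, qk_ratio = 256, 4, 1.0
dim_head_qk = make_divisible(dim_out * qk_ratio, divisor=8) // num_heads
dim_head_v = dim_out // num_heads
qkv_channels = num_heads * dim_head_qk * 2 + num_heads * dim_head_v
print(dim_head_qk, dim_head_v)  # 64 64
print(qkv_channels)             # 768 output channels for the 1x1 qkv conv
```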
# timm/layers/conv_bn_act.py
""" Conv2d + BN + Act
Hacked together by / Copyright 2020 Ross Wightman
"""
import functools
from torch import nn as nn
from .create_conv2d import create_conv2d
from .create_norm_act import get_norm_act_layer
class ConvNormAct(nn.Module):
def __init__(
self,
in_channels,
out_channels,
kernel_size=1,
stride=1,
padding='',
dilation=1,
groups=1,
bias=False,
apply_act=True,
norm_layer=nn.BatchNorm2d,
norm_kwargs=None,
act_layer=nn.ReLU,
act_kwargs=None,
drop_layer=None,
):
super(ConvNormAct, self).__init__()
norm_kwargs = norm_kwargs or {}
act_kwargs = act_kwargs or {}
self.conv = create_conv2d(
in_channels, out_channels, kernel_size, stride=stride,
padding=padding, dilation=dilation, groups=groups, bias=bias)
# NOTE for backwards compatibility with models that use separate norm and act layer definitions
norm_act_layer = get_norm_act_layer(norm_layer, act_layer)
# NOTE for backwards (weight) compatibility, norm layer name remains `.bn`
if drop_layer:
norm_kwargs['drop_layer'] = drop_layer
self.bn = norm_act_layer(
out_channels,
apply_act=apply_act,
act_kwargs=act_kwargs,
**norm_kwargs,
)
@property
def in_channels(self):
return self.conv.in_channels
@property
def out_channels(self):
return self.conv.out_channels
def forward(self, x):
x = self.conv(x)
x = self.bn(x)
return x
ConvBnAct = ConvNormAct
def create_aa(aa_layer, channels, stride=2, enable=True):
if not aa_layer or not enable:
return nn.Identity()
if isinstance(aa_layer, functools.partial):
if issubclass(aa_layer.func, nn.AvgPool2d):
return aa_layer()
else:
return aa_layer(channels)
elif issubclass(aa_layer, nn.AvgPool2d):
return aa_layer(stride)
else:
return aa_layer(channels=channels, stride=stride)
class ConvNormActAa(nn.Module):
def __init__(
self,
in_channels,
out_channels,
kernel_size=1,
stride=1,
padding='',
dilation=1,
groups=1,
bias=False,
apply_act=True,
norm_layer=nn.BatchNorm2d,
norm_kwargs=None,
act_layer=nn.ReLU,
act_kwargs=None,
aa_layer=None,
drop_layer=None,
):
super(ConvNormActAa, self).__init__()
use_aa = aa_layer is not None and stride == 2
norm_kwargs = norm_kwargs or {}
act_kwargs = act_kwargs or {}
self.conv = create_conv2d(
in_channels, out_channels, kernel_size, stride=1 if use_aa else stride,
padding=padding, dilation=dilation, groups=groups, bias=bias)
# NOTE for backwards compatibility with models that use separate norm and act layer definitions
norm_act_layer = get_norm_act_layer(norm_layer, act_layer)
# NOTE for backwards (weight) compatibility, norm layer name remains `.bn`
if drop_layer:
norm_kwargs['drop_layer'] = drop_layer
self.bn = norm_act_layer(out_channels, apply_act=apply_act, act_kwargs=act_kwargs, **norm_kwargs)
self.aa = create_aa(aa_layer, out_channels, stride=stride, enable=use_aa)
@property
def in_channels(self):
return self.conv.in_channels
@property
def out_channels(self):
return self.conv.out_channels
def forward(self, x):
x = self.conv(x)
x = self.bn(x)
x = self.aa(x)
return x
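A minimal sketch of how `ConvNormActAa` above shifts the stride: when an anti-aliasing layer is present and `stride == 2`, the conv runs at stride 1 and the aa layer performs the downsampling instead.

```python
# Stride planning mirroring ConvNormActAa.__init__: aa takes over downsampling
# only when it exists and the block stride is 2.
def plan_strides(stride, has_aa_layer):
    use_aa = has_aa_layer and stride == 2
    conv_stride = 1 if use_aa else stride
    aa_stride = stride if use_aa else None  # None ~ nn.Identity() in timm
    return conv_stride, aa_stride

print(plan_strides(2, True))   # (1, 2): conv keeps resolution, aa downsamples
print(plan_strides(2, False))  # (2, None): conv downsamples directly
print(plan_strides(1, True))   # (1, None): no downsampling, aa disabled
```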
# timm/layers/norm.py
""" Normalization layers and wrappers
Norm layer definitions that support fast norm and consistent channel arg order (always first arg).
Hacked together by / Copyright 2022 Ross Wightman
"""
import numbers
from typing import Tuple
import torch
import torch.nn as nn
import torch.nn.functional as F
from .fast_norm import is_fast_norm, fast_group_norm, fast_layer_norm, fast_rms_norm
class GroupNorm(nn.GroupNorm):
def __init__(self, num_channels, num_groups=32, eps=1e-5, affine=True):
# NOTE num_channels is swapped to first arg for consistency in swapping norm layers with BN
super().__init__(num_groups, num_channels, eps=eps, affine=affine)
self.fast_norm = is_fast_norm() # can't script unless we have these flags here (no globals)
def forward(self, x):
if self.fast_norm:
return fast_group_norm(x, self.num_groups, self.weight, self.bias, self.eps)
else:
return F.group_norm(x, self.num_groups, self.weight, self.bias, self.eps)
class GroupNorm1(nn.GroupNorm):
""" Group Normalization with 1 group.
Input: tensor in shape [B, C, *]
"""
def __init__(self, num_channels, **kwargs):
super().__init__(1, num_channels, **kwargs)
self.fast_norm = is_fast_norm() # can't script unless we have these flags here (no globals)
def forward(self, x: torch.Tensor) -> torch.Tensor:
if self.fast_norm:
return fast_group_norm(x, self.num_groups, self.weight, self.bias, self.eps)
else:
return F.group_norm(x, self.num_groups, self.weight, self.bias, self.eps)
class LayerNorm(nn.LayerNorm):
""" LayerNorm w/ fast norm option
"""
def __init__(self, num_channels, eps=1e-6, affine=True):
super().__init__(num_channels, eps=eps, elementwise_affine=affine)
self._fast_norm = is_fast_norm() # can't script unless we have these flags here (no globals)
def forward(self, x: torch.Tensor) -> torch.Tensor:
if self._fast_norm:
x = fast_layer_norm(x, self.normalized_shape, self.weight, self.bias, self.eps)
else:
x = F.layer_norm(x, self.normalized_shape, self.weight, self.bias, self.eps)
return x
class LayerNorm2d(nn.LayerNorm):
""" LayerNorm for channels of '2D' spatial NCHW tensors """
def __init__(self, num_channels, eps=1e-6, affine=True):
super().__init__(num_channels, eps=eps, elementwise_affine=affine)
self._fast_norm = is_fast_norm() # can't script unless we have these flags here (no globals)
def forward(self, x: torch.Tensor) -> torch.Tensor:
x = x.permute(0, 2, 3, 1)
if self._fast_norm:
x = fast_layer_norm(x, self.normalized_shape, self.weight, self.bias, self.eps)
else:
x = F.layer_norm(x, self.normalized_shape, self.weight, self.bias, self.eps)
x = x.permute(0, 3, 1, 2)
return x
def _is_contiguous(tensor: torch.Tensor) -> bool:
# jit is oh so lovely :/
if torch.jit.is_scripting():
return tensor.is_contiguous()
else:
return tensor.is_contiguous(memory_format=torch.contiguous_format)
@torch.jit.script
def _layer_norm_cf(x: torch.Tensor, weight: torch.Tensor, bias: torch.Tensor, eps: float):
s, u = torch.var_mean(x, dim=1, unbiased=False, keepdim=True)
x = (x - u) * torch.rsqrt(s + eps)
x = x * weight[:, None, None] + bias[:, None, None]
return x
def _layer_norm_cf_sqm(x: torch.Tensor, weight: torch.Tensor, bias: torch.Tensor, eps: float):
u = x.mean(dim=1, keepdim=True)
s = ((x * x).mean(dim=1, keepdim=True) - (u * u)).clamp(0)
x = (x - u) * torch.rsqrt(s + eps)
x = x * weight.view(1, -1, 1, 1) + bias.view(1, -1, 1, 1)
return x
class LayerNormExp2d(nn.LayerNorm):
""" LayerNorm for channels_first tensors with 2d spatial dimensions (ie N, C, H, W).
    Experimental implementation w/ manual norm for non-contiguous tensors.
    This improves throughput in some scenarios (tested on Ampere GPUs), esp. w/ channels_last
    layout. However, benefits are not always clear and it can perform worse on other GPUs.
"""
def __init__(self, num_channels, eps=1e-6):
super().__init__(num_channels, eps=eps)
def forward(self, x) -> torch.Tensor:
if _is_contiguous(x):
x = F.layer_norm(
x.permute(0, 2, 3, 1), self.normalized_shape, self.weight, self.bias, self.eps).permute(0, 3, 1, 2)
else:
x = _layer_norm_cf(x, self.weight, self.bias, self.eps)
return x
class RmsNorm(nn.Module):
""" RmsNorm w/ fast (apex) norm if available
"""
__constants__ = ['normalized_shape', 'eps', 'elementwise_affine']
normalized_shape: Tuple[int, ...]
eps: float
elementwise_affine: bool
def __init__(self, channels, eps=1e-6, affine=True, device=None, dtype=None) -> None:
factory_kwargs = {'device': device, 'dtype': dtype}
super().__init__()
normalized_shape = channels
if isinstance(normalized_shape, numbers.Integral):
# mypy error: incompatible types in assignment
normalized_shape = (normalized_shape,) # type: ignore[assignment]
self.normalized_shape = tuple(normalized_shape) # type: ignore[arg-type]
self.eps = eps
self.elementwise_affine = affine
if self.elementwise_affine:
self.weight = nn.Parameter(torch.empty(self.normalized_shape, **factory_kwargs))
else:
self.register_parameter('weight', None)
self.reset_parameters()
def reset_parameters(self) -> None:
if self.elementwise_affine:
nn.init.ones_(self.weight)
def forward(self, x: torch.Tensor) -> torch.Tensor:
# NOTE fast norm fallback needs our rms norm impl, so both paths through here.
# Since there is no built-in PyTorch impl, always use APEX RmsNorm if is installed.
x = fast_rms_norm(x, self.normalized_shape, self.weight, self.eps)
return x
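A pure-Python sketch of the RMS norm math that `RmsNorm` above delegates to `fast_rms_norm`: normalize by the root-mean-square of the features, with no mean subtraction (unlike LayerNorm), then apply the learned scale.

```python
import math

# Reference RMS norm over a 1D feature vector: y = x / rms(x) * weight.
def rms_norm(x, weight=None, eps=1e-6):
    rms = math.sqrt(sum(v * v for v in x) / len(x) + eps)
    y = [v / rms for v in x]
    if weight is not None:
        y = [v * w for v, w in zip(y, weight)]
    return y

y = rms_norm([3.0, 4.0])
print([round(v, 4) for v in y])  # [0.8485, 1.1314]
```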
# timm/layers/pool2d_same.py
""" AvgPool2d w/ Same Padding
Hacked together by / Copyright 2020 Ross Wightman
"""
import torch
import torch.nn as nn
import torch.nn.functional as F
from typing import List, Tuple, Optional
from .helpers import to_2tuple
from .padding import pad_same, get_padding_value
def avg_pool2d_same(x, kernel_size: List[int], stride: List[int], padding: List[int] = (0, 0),
ceil_mode: bool = False, count_include_pad: bool = True):
# FIXME how to deal with count_include_pad vs not for external padding?
x = pad_same(x, kernel_size, stride)
return F.avg_pool2d(x, kernel_size, stride, (0, 0), ceil_mode, count_include_pad)
class AvgPool2dSame(nn.AvgPool2d):
""" Tensorflow like 'SAME' wrapper for 2D average pooling
"""
def __init__(self, kernel_size: int, stride=None, padding=0, ceil_mode=False, count_include_pad=True):
kernel_size = to_2tuple(kernel_size)
stride = to_2tuple(stride)
super(AvgPool2dSame, self).__init__(kernel_size, stride, (0, 0), ceil_mode, count_include_pad)
def forward(self, x):
x = pad_same(x, self.kernel_size, self.stride)
return F.avg_pool2d(
x, self.kernel_size, self.stride, self.padding, self.ceil_mode, self.count_include_pad)
def max_pool2d_same(
x, kernel_size: List[int], stride: List[int], padding: List[int] = (0, 0),
dilation: List[int] = (1, 1), ceil_mode: bool = False):
x = pad_same(x, kernel_size, stride, value=-float('inf'))
return F.max_pool2d(x, kernel_size, stride, (0, 0), dilation, ceil_mode)
class MaxPool2dSame(nn.MaxPool2d):
""" Tensorflow like 'SAME' wrapper for 2D max pooling
"""
def __init__(self, kernel_size: int, stride=None, padding=0, dilation=1, ceil_mode=False):
kernel_size = to_2tuple(kernel_size)
stride = to_2tuple(stride)
dilation = to_2tuple(dilation)
super(MaxPool2dSame, self).__init__(kernel_size, stride, (0, 0), dilation, ceil_mode)
def forward(self, x):
x = pad_same(x, self.kernel_size, self.stride, value=-float('inf'))
return F.max_pool2d(x, self.kernel_size, self.stride, (0, 0), self.dilation, self.ceil_mode)
def create_pool2d(pool_type, kernel_size, stride=None, **kwargs):
stride = stride or kernel_size
padding = kwargs.pop('padding', '')
padding, is_dynamic = get_padding_value(padding, kernel_size, stride=stride, **kwargs)
if is_dynamic:
if pool_type == 'avg':
return AvgPool2dSame(kernel_size, stride=stride, **kwargs)
elif pool_type == 'max':
return MaxPool2dSame(kernel_size, stride=stride, **kwargs)
else:
assert False, f'Unsupported pool type {pool_type}'
else:
if pool_type == 'avg':
return nn.AvgPool2d(kernel_size, stride=stride, padding=padding, **kwargs)
elif pool_type == 'max':
return nn.MaxPool2d(kernel_size, stride=stride, padding=padding, **kwargs)
else:
assert False, f'Unsupported pool type {pool_type}'
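A hedged sketch of the static-vs-dynamic decision behind `get_padding_value` for `padding='same'` (mirrors the timm padding helpers; treat the exact condition as an assumption): 'same' can be expressed as a fixed symmetric pad only when `stride == 1` and the effective kernel extent is odd; otherwise the dynamic `*2dSame` wrappers are needed.

```python
# Can TF 'SAME' be implemented with a fixed symmetric pad for this config?
def is_static_same_pad(kernel_size, stride=1, dilation=1):
    return stride == 1 and (dilation * (kernel_size - 1)) % 2 == 0

print(is_static_same_pad(3, stride=1))  # True  -> plain nn.*Pool2d / nn.Conv2d
print(is_static_same_pad(3, stride=2))  # False -> dynamic *2dSame wrapper
```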
# timm/layers/patch_dropout.py
from typing import Optional, Tuple, Union
import torch
import torch.nn as nn
class PatchDropout(nn.Module):
"""
https://arxiv.org/abs/2212.00794
"""
return_indices: torch.jit.Final[bool]
def __init__(
self,
prob: float = 0.5,
num_prefix_tokens: int = 1,
ordered: bool = False,
return_indices: bool = False,
):
super().__init__()
assert 0 <= prob < 1.
self.prob = prob
self.num_prefix_tokens = num_prefix_tokens # exclude CLS token (or other prefix tokens)
self.ordered = ordered
self.return_indices = return_indices
def forward(self, x) -> Union[torch.Tensor, Tuple[torch.Tensor, Optional[torch.Tensor]]]:
if not self.training or self.prob == 0.:
if self.return_indices:
return x, None
return x
if self.num_prefix_tokens:
prefix_tokens, x = x[:, :self.num_prefix_tokens], x[:, self.num_prefix_tokens:]
else:
prefix_tokens = None
B = x.shape[0]
L = x.shape[1]
num_keep = max(1, int(L * (1. - self.prob)))
keep_indices = torch.argsort(torch.randn(B, L, device=x.device), dim=-1)[:, :num_keep]
if self.ordered:
# NOTE does not need to maintain patch order in typical transformer use,
# but possibly useful for debug / visualization
keep_indices = keep_indices.sort(dim=-1)[0]
x = x.gather(1, keep_indices.unsqueeze(-1).expand((-1, -1) + x.shape[2:]))
if prefix_tokens is not None:
x = torch.cat((prefix_tokens, x), dim=1)
if self.return_indices:
return x, keep_indices
return x
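A pure-Python sketch of the token selection in `PatchDropout` above: keep a random subset of `max(1, L * (1 - prob))` patch tokens, optionally restoring their original order (the `ordered` flag).

```python
import random

# Select which of seq_len patch tokens survive dropout.
def select_keep_indices(seq_len, prob, ordered=False, rng=random):
    num_keep = max(1, int(seq_len * (1. - prob)))
    keep = rng.sample(range(seq_len), num_keep)
    return sorted(keep) if ordered else keep

random.seed(0)
keep = select_keep_indices(196, 0.5, ordered=True)
print(len(keep))             # 98 of 196 ViT patches survive
print(keep == sorted(keep))  # True, ordered restores patch order
```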
# timm/layers/activations_jit.py
""" Activations
A collection of jit-scripted activations fn and modules with a common interface so that they can
easily be swapped. All have an `inplace` arg even if not used.
All jit scripted activations omit in-place variants on purpose: scripted kernel fusion does not
currently work across in-place op boundaries, so performance would be equal to or worse than the
non-scripted versions if they contained in-place ops.
Hacked together by / Copyright 2020 Ross Wightman
"""
import torch
from torch import nn as nn
from torch.nn import functional as F
@torch.jit.script
def swish_jit(x, inplace: bool = False):
"""Swish - Described in: https://arxiv.org/abs/1710.05941
"""
return x.mul(x.sigmoid())
@torch.jit.script
def mish_jit(x, _inplace: bool = False):
"""Mish: A Self Regularized Non-Monotonic Neural Activation Function - https://arxiv.org/abs/1908.08681
"""
return x.mul(F.softplus(x).tanh())
class SwishJit(nn.Module):
def __init__(self, inplace: bool = False):
super(SwishJit, self).__init__()
def forward(self, x):
return swish_jit(x)
class MishJit(nn.Module):
def __init__(self, inplace: bool = False):
super(MishJit, self).__init__()
def forward(self, x):
return mish_jit(x)
@torch.jit.script
def hard_sigmoid_jit(x, inplace: bool = False):
# return F.relu6(x + 3.) / 6.
return (x + 3).clamp(min=0, max=6).div(6.) # clamp seems ever so slightly faster?
class HardSigmoidJit(nn.Module):
def __init__(self, inplace: bool = False):
super(HardSigmoidJit, self).__init__()
def forward(self, x):
return hard_sigmoid_jit(x)
@torch.jit.script
def hard_swish_jit(x, inplace: bool = False):
# return x * (F.relu6(x + 3.) / 6)
return x * (x + 3).clamp(min=0, max=6).div(6.) # clamp seems ever so slightly faster?
class HardSwishJit(nn.Module):
def __init__(self, inplace: bool = False):
super(HardSwishJit, self).__init__()
def forward(self, x):
return hard_swish_jit(x)
@torch.jit.script
def hard_mish_jit(x, inplace: bool = False):
""" Hard Mish
Experimental, based on notes by Mish author Diganta Misra at
https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md
"""
return 0.5 * x * (x + 2).clamp(min=0, max=2)
class HardMishJit(nn.Module):
def __init__(self, inplace: bool = False):
super(HardMishJit, self).__init__()
def forward(self, x):
return hard_mish_jit(x)
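A pure-Python reference for the clamp-based activations above; `F.relu6(x + 3.) / 6.` and the clamp form compute the same piecewise-linear function.

```python
# Scalar versions of the hard activations defined in this file.
def hard_sigmoid(x):
    return min(max(x + 3., 0.), 6.) / 6.

def hard_swish(x):
    return x * hard_sigmoid(x)

def hard_mish(x):
    return 0.5 * x * min(max(x + 2., 0.), 2.)

print(hard_sigmoid(0.))  # 0.5
print(hard_swish(3.))    # 3.0 (identity region begins at x >= 3)
print(hard_swish(-3.))   # -0.0 (zero region for x <= -3)
print(hard_mish(2.))     # 2.0
```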
# timm/layers/drop.py
""" DropBlock, DropPath
PyTorch implementations of DropBlock and DropPath (Stochastic Depth) regularization layers.
Papers:
DropBlock: A regularization method for convolutional networks (https://arxiv.org/abs/1810.12890)
Deep Networks with Stochastic Depth (https://arxiv.org/abs/1603.09382)
Code:
DropBlock impl inspired by two Tensorflow impl that I liked:
- https://github.com/tensorflow/tpu/blob/master/models/official/resnet/resnet_model.py#L74
- https://github.com/clovaai/assembled-cnn/blob/master/nets/blocks.py
Hacked together by / Copyright 2020 Ross Wightman
"""
import torch
import torch.nn as nn
import torch.nn.functional as F
def drop_block_2d(
x, drop_prob: float = 0.1, block_size: int = 7, gamma_scale: float = 1.0,
with_noise: bool = False, inplace: bool = False, batchwise: bool = False):
""" DropBlock. See https://arxiv.org/pdf/1810.12890.pdf
DropBlock with an experimental gaussian noise option. This layer has been tested on a few training
runs with success, but needs further validation and possibly optimization for lower runtime impact.
"""
B, C, H, W = x.shape
total_size = W * H
clipped_block_size = min(block_size, min(W, H))
# seed_drop_rate, the gamma parameter
gamma = gamma_scale * drop_prob * total_size / clipped_block_size ** 2 / (
(W - block_size + 1) * (H - block_size + 1))
# Forces the block to be inside the feature map.
w_i, h_i = torch.meshgrid(torch.arange(W).to(x.device), torch.arange(H).to(x.device))
valid_block = ((w_i >= clipped_block_size // 2) & (w_i < W - (clipped_block_size - 1) // 2)) & \
((h_i >= clipped_block_size // 2) & (h_i < H - (clipped_block_size - 1) // 2))
valid_block = torch.reshape(valid_block, (1, 1, H, W)).to(dtype=x.dtype)
if batchwise:
# one mask for whole batch, quite a bit faster
uniform_noise = torch.rand((1, C, H, W), dtype=x.dtype, device=x.device)
else:
uniform_noise = torch.rand_like(x)
block_mask = ((2 - gamma - valid_block + uniform_noise) >= 1).to(dtype=x.dtype)
block_mask = -F.max_pool2d(
-block_mask,
kernel_size=clipped_block_size, # block_size,
stride=1,
padding=clipped_block_size // 2)
if with_noise:
normal_noise = torch.randn((1, C, H, W), dtype=x.dtype, device=x.device) if batchwise else torch.randn_like(x)
if inplace:
x.mul_(block_mask).add_(normal_noise * (1 - block_mask))
else:
x = x * block_mask + normal_noise * (1 - block_mask)
else:
normalize_scale = (block_mask.numel() / block_mask.to(dtype=torch.float32).sum().add(1e-7)).to(x.dtype)
if inplace:
x.mul_(block_mask * normalize_scale)
else:
x = x * block_mask * normalize_scale
return x
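A worked example of the DropBlock seed rate (gamma) computed above: the Bernoulli rate for block centers is scaled down so that, after each seed grows into a `block_size x block_size` region, the expected fraction of dropped activations still matches `drop_prob`.

```python
# Gamma formula from drop_block_2d above, for one (H, W) feature map.
def dropblock_gamma(h, w, block_size=7, drop_prob=0.1, gamma_scale=1.0):
    total_size = w * h
    clipped = min(block_size, min(w, h))
    return gamma_scale * drop_prob * total_size / clipped ** 2 / (
        (w - block_size + 1) * (h - block_size + 1))

print(dropblock_gamma(14, 14))  # 0.00625: each valid block center seeds w.p. ~0.6%
```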
def drop_block_fast_2d(
x: torch.Tensor, drop_prob: float = 0.1, block_size: int = 7,
gamma_scale: float = 1.0, with_noise: bool = False, inplace: bool = False):
""" DropBlock. See https://arxiv.org/pdf/1810.12890.pdf
    DropBlock with an experimental gaussian noise option. Simplified from the above, without concern for the valid
    block mask at edges.
"""
B, C, H, W = x.shape
total_size = W * H
clipped_block_size = min(block_size, min(W, H))
gamma = gamma_scale * drop_prob * total_size / clipped_block_size ** 2 / (
(W - block_size + 1) * (H - block_size + 1))
block_mask = torch.empty_like(x).bernoulli_(gamma)
block_mask = F.max_pool2d(
block_mask.to(x.dtype), kernel_size=clipped_block_size, stride=1, padding=clipped_block_size // 2)
if with_noise:
normal_noise = torch.empty_like(x).normal_()
if inplace:
x.mul_(1. - block_mask).add_(normal_noise * block_mask)
else:
x = x * (1. - block_mask) + normal_noise * block_mask
else:
block_mask = 1 - block_mask
normalize_scale = (block_mask.numel() / block_mask.to(dtype=torch.float32).sum().add(1e-6)).to(dtype=x.dtype)
if inplace:
x.mul_(block_mask * normalize_scale)
else:
x = x * block_mask * normalize_scale
return x
class DropBlock2d(nn.Module):
""" DropBlock. See https://arxiv.org/pdf/1810.12890.pdf
"""
def __init__(
self,
drop_prob: float = 0.1,
block_size: int = 7,
gamma_scale: float = 1.0,
with_noise: bool = False,
inplace: bool = False,
batchwise: bool = False,
fast: bool = True):
super(DropBlock2d, self).__init__()
self.drop_prob = drop_prob
self.gamma_scale = gamma_scale
self.block_size = block_size
self.with_noise = with_noise
self.inplace = inplace
self.batchwise = batchwise
self.fast = fast # FIXME finish comparisons of fast vs not
def forward(self, x):
if not self.training or not self.drop_prob:
return x
if self.fast:
return drop_block_fast_2d(
x, self.drop_prob, self.block_size, self.gamma_scale, self.with_noise, self.inplace)
else:
return drop_block_2d(
x, self.drop_prob, self.block_size, self.gamma_scale, self.with_noise, self.inplace, self.batchwise)
def drop_path(x, drop_prob: float = 0., training: bool = False, scale_by_keep: bool = True):
"""Drop paths (Stochastic Depth) per sample (when applied in main path of residual blocks).
This is the same as the DropConnect impl I created for EfficientNet, etc networks, however,
the original name is misleading as 'Drop Connect' is a different form of dropout in a separate paper...
See discussion: https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 ... I've opted for
changing the layer and argument names to 'drop path' rather than mix DropConnect as a layer name and use
'survival rate' as the argument.
"""
if drop_prob == 0. or not training:
return x
keep_prob = 1 - drop_prob
shape = (x.shape[0],) + (1,) * (x.ndim - 1) # work with diff dim tensors, not just 2D ConvNets
random_tensor = x.new_empty(shape).bernoulli_(keep_prob)
if keep_prob > 0.0 and scale_by_keep:
random_tensor.div_(keep_prob)
return x * random_tensor
class DropPath(nn.Module):
"""Drop paths (Stochastic Depth) per sample (when applied in main path of residual blocks).
"""
def __init__(self, drop_prob: float = 0., scale_by_keep: bool = True):
super(DropPath, self).__init__()
self.drop_prob = drop_prob
self.scale_by_keep = scale_by_keep
def forward(self, x):
return drop_path(x, self.drop_prob, self.training, self.scale_by_keep)
def extra_repr(self):
return f'drop_prob={round(self.drop_prob,3):0.3f}'
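The `scale_by_keep` division in `drop_path` above keeps the expected activation unchanged, so inference (where nothing is dropped) matches training statistics. A small pure-Python simulation (no torch required) of that property:

```python
# Each "sample" either survives and is scaled by 1/keep_prob, or is zeroed,
# mirroring the per-sample mask in drop_path.
import random

random.seed(0)
keep_prob = 0.8
n = 200_000
vals = [(1.0 / keep_prob) if random.random() < keep_prob else 0.0
        for _ in range(n)]
mean = sum(vals) / n
assert abs(mean - 1.0) < 0.01  # expectation is preserved by the rescale
```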
# === hf_public_repos/pytorch-image-models/timm/layers/median_pool.py ===
""" Median Pool
Hacked together by / Copyright 2020 Ross Wightman
"""
import torch.nn as nn
import torch.nn.functional as F
from .helpers import to_2tuple, to_4tuple
class MedianPool2d(nn.Module):
""" Median pool (usable as median filter when stride=1) module.
Args:
kernel_size: size of pooling kernel, int or 2-tuple
stride: pool stride, int or 2-tuple
padding: pool padding, int or 4-tuple (l, r, t, b) as in pytorch F.pad
same: override padding and enforce same padding, boolean
"""
def __init__(self, kernel_size=3, stride=1, padding=0, same=False):
super(MedianPool2d, self).__init__()
self.k = to_2tuple(kernel_size)
self.stride = to_2tuple(stride)
self.padding = to_4tuple(padding) # convert to l, r, t, b
self.same = same
def _padding(self, x):
if self.same:
ih, iw = x.size()[2:]
if ih % self.stride[0] == 0:
ph = max(self.k[0] - self.stride[0], 0)
else:
ph = max(self.k[0] - (ih % self.stride[0]), 0)
if iw % self.stride[1] == 0:
pw = max(self.k[1] - self.stride[1], 0)
else:
pw = max(self.k[1] - (iw % self.stride[1]), 0)
pl = pw // 2
pr = pw - pl
pt = ph // 2
pb = ph - pt
padding = (pl, pr, pt, pb)
else:
padding = self.padding
return padding
def forward(self, x):
x = F.pad(x, self._padding(x), mode='reflect')
x = x.unfold(2, self.k[0], self.stride[0]).unfold(3, self.k[1], self.stride[1])
x = x.contiguous().view(x.size()[:4] + (-1,)).median(dim=-1)[0]
return x
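With `stride=1`, `MedianPool2d` acts as a median filter: it removes isolated outliers while preserving edges, unlike an average pool which smears them. A tiny pure-Python 1D sketch of the idea (the module above does the 2D, unfold-based version):

```python
# 1D median filter for illustration; not part of timm.
from statistics import median

def median_filter_1d(xs, k=3):
    r = k // 2
    out = []
    for i in range(len(xs)):
        lo, hi = max(0, i - r), min(len(xs), i + r + 1)
        out.append(median(xs[lo:hi]))
    return out

signal = [1, 1, 1, 99, 1, 1, 1]  # a single salt-noise spike
assert median_filter_1d(signal) == [1, 1, 1, 1, 1, 1, 1]
```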
# === hf_public_repos/pytorch-image-models/timm/layers/cbam.py ===
""" CBAM (sort-of) Attention
Experimental impl of CBAM: Convolutional Block Attention Module: https://arxiv.org/abs/1807.06521
WARNING: Results with these attention layers have been mixed. They can significantly reduce performance on
some tasks, especially fine-grained it seems. I may end up removing this impl.
Hacked together by / Copyright 2020 Ross Wightman
"""
import torch
from torch import nn as nn
import torch.nn.functional as F
from .conv_bn_act import ConvNormAct
from .create_act import create_act_layer, get_act_layer
from .helpers import make_divisible
class ChannelAttn(nn.Module):
""" Original CBAM channel attention module, currently avg + max pool variant only.
"""
def __init__(
self, channels, rd_ratio=1./16, rd_channels=None, rd_divisor=1,
act_layer=nn.ReLU, gate_layer='sigmoid', mlp_bias=False):
super(ChannelAttn, self).__init__()
if not rd_channels:
rd_channels = make_divisible(channels * rd_ratio, rd_divisor, round_limit=0.)
self.fc1 = nn.Conv2d(channels, rd_channels, 1, bias=mlp_bias)
self.act = act_layer(inplace=True)
self.fc2 = nn.Conv2d(rd_channels, channels, 1, bias=mlp_bias)
self.gate = create_act_layer(gate_layer)
def forward(self, x):
x_avg = self.fc2(self.act(self.fc1(x.mean((2, 3), keepdim=True))))
x_max = self.fc2(self.act(self.fc1(x.amax((2, 3), keepdim=True))))
return x * self.gate(x_avg + x_max)
class LightChannelAttn(ChannelAttn):
"""An experimental 'lightweight' variant that sums avg + max pool results first
"""
def __init__(
self, channels, rd_ratio=1./16, rd_channels=None, rd_divisor=1,
act_layer=nn.ReLU, gate_layer='sigmoid', mlp_bias=False):
super(LightChannelAttn, self).__init__(
channels, rd_ratio, rd_channels, rd_divisor, act_layer, gate_layer, mlp_bias)
def forward(self, x):
x_pool = 0.5 * x.mean((2, 3), keepdim=True) + 0.5 * x.amax((2, 3), keepdim=True)
x_attn = self.fc2(self.act(self.fc1(x_pool)))
return x * self.gate(x_attn)
class SpatialAttn(nn.Module):
""" Original CBAM spatial attention module
"""
def __init__(self, kernel_size=7, gate_layer='sigmoid'):
super(SpatialAttn, self).__init__()
self.conv = ConvNormAct(2, 1, kernel_size, apply_act=False)
self.gate = create_act_layer(gate_layer)
def forward(self, x):
x_attn = torch.cat([x.mean(dim=1, keepdim=True), x.amax(dim=1, keepdim=True)], dim=1)
x_attn = self.conv(x_attn)
return x * self.gate(x_attn)
class LightSpatialAttn(nn.Module):
"""An experimental 'lightweight' variant that sums avg_pool and max_pool results.
"""
def __init__(self, kernel_size=7, gate_layer='sigmoid'):
super(LightSpatialAttn, self).__init__()
self.conv = ConvNormAct(1, 1, kernel_size, apply_act=False)
self.gate = create_act_layer(gate_layer)
def forward(self, x):
x_attn = 0.5 * x.mean(dim=1, keepdim=True) + 0.5 * x.amax(dim=1, keepdim=True)
x_attn = self.conv(x_attn)
return x * self.gate(x_attn)
class CbamModule(nn.Module):
def __init__(
self, channels, rd_ratio=1./16, rd_channels=None, rd_divisor=1,
spatial_kernel_size=7, act_layer=nn.ReLU, gate_layer='sigmoid', mlp_bias=False):
super(CbamModule, self).__init__()
self.channel = ChannelAttn(
channels, rd_ratio=rd_ratio, rd_channels=rd_channels,
rd_divisor=rd_divisor, act_layer=act_layer, gate_layer=gate_layer, mlp_bias=mlp_bias)
self.spatial = SpatialAttn(spatial_kernel_size, gate_layer=gate_layer)
def forward(self, x):
x = self.channel(x)
x = self.spatial(x)
return x
class LightCbamModule(nn.Module):
def __init__(
self, channels, rd_ratio=1./16, rd_channels=None, rd_divisor=1,
spatial_kernel_size=7, act_layer=nn.ReLU, gate_layer='sigmoid', mlp_bias=False):
super(LightCbamModule, self).__init__()
self.channel = LightChannelAttn(
channels, rd_ratio=rd_ratio, rd_channels=rd_channels,
rd_divisor=rd_divisor, act_layer=act_layer, gate_layer=gate_layer, mlp_bias=mlp_bias)
self.spatial = LightSpatialAttn(spatial_kernel_size)
def forward(self, x):
x = self.channel(x)
x = self.spatial(x)
return x
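All the CBAM variants above share one gating pattern: a pooled descriptor is squashed through a sigmoid and used to rescale the input. A scalar pure-Python sketch of that pattern (one channel, illustrative values only):

```python
# Minimal sigmoid-gating demo mirroring ChannelAttn's avg + max branches.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

avg_desc, max_desc = 0.2, 1.3        # avg-pool and max-pool descriptors
gate = sigmoid(avg_desc + max_desc)  # ChannelAttn sums the two branches
channel_value = 2.0
assert 0.0 < gate < 1.0
assert channel_value * gate < channel_value  # gating only down-weights
```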
# === hf_public_repos/pytorch-image-models/timm/layers/attention_pool2d.py ===
""" Attention Pool 2D
Implementations of 2D spatial feature pooling using multi-head attention instead of average pool.
Based on idea in CLIP by OpenAI, licensed Apache 2.0
https://github.com/openai/CLIP/blob/3b473b0e682c091a9e53623eebc1ca1657385717/clip/model.py
Hacked together by / Copyright 2021 Ross Wightman
"""
from typing import Union, Tuple
import torch
import torch.nn as nn
from .helpers import to_2tuple
from .pos_embed_sincos import apply_rot_embed, RotaryEmbedding
from .weight_init import trunc_normal_
class RotAttentionPool2d(nn.Module):
""" Attention based 2D feature pooling w/ rotary (relative) pos embedding.
This is a multi-head attention based replacement for (spatial) average pooling in NN architectures.
Adapted from the AttentionPool2d in CLIP w/ rotary embedding instead of learned embed.
https://github.com/openai/CLIP/blob/3b473b0e682c091a9e53623eebc1ca1657385717/clip/model.py
NOTE: While this impl does not require a fixed feature size, performance at differing resolutions from
train varies widely and falls off dramatically. I'm not sure if there is a way around this... -RW
"""
def __init__(
self,
in_features: int,
out_features: int = None,
embed_dim: int = None,
num_heads: int = 4,
qkv_bias: bool = True,
):
super().__init__()
embed_dim = embed_dim or in_features
out_features = out_features or in_features
self.qkv = nn.Linear(in_features, embed_dim * 3, bias=qkv_bias)
self.proj = nn.Linear(embed_dim, out_features)
self.num_heads = num_heads
assert embed_dim % num_heads == 0
self.head_dim = embed_dim // num_heads
self.scale = self.head_dim ** -0.5
self.pos_embed = RotaryEmbedding(self.head_dim)
trunc_normal_(self.qkv.weight, std=in_features ** -0.5)
nn.init.zeros_(self.qkv.bias)
def forward(self, x):
B, _, H, W = x.shape
N = H * W
x = x.reshape(B, -1, N).permute(0, 2, 1)
x = torch.cat([x.mean(1, keepdim=True), x], dim=1)
x = self.qkv(x).reshape(B, N + 1, 3, self.num_heads, self.head_dim).permute(2, 0, 3, 1, 4)
q, k, v = x[0], x[1], x[2]
qc, q = q[:, :, :1], q[:, :, 1:]
sin_emb, cos_emb = self.pos_embed.get_embed((H, W))
q = apply_rot_embed(q, sin_emb, cos_emb)
q = torch.cat([qc, q], dim=2)
kc, k = k[:, :, :1], k[:, :, 1:]
k = apply_rot_embed(k, sin_emb, cos_emb)
k = torch.cat([kc, k], dim=2)
attn = (q @ k.transpose(-2, -1)) * self.scale
attn = attn.softmax(dim=-1)
x = (attn @ v).transpose(1, 2).reshape(B, N + 1, -1)
x = self.proj(x)
return x[:, 0]
class AttentionPool2d(nn.Module):
""" Attention based 2D feature pooling w/ learned (absolute) pos embedding.
This is a multi-head attention based replacement for (spatial) average pooling in NN architectures.
It was based on impl in CLIP by OpenAI
https://github.com/openai/CLIP/blob/3b473b0e682c091a9e53623eebc1ca1657385717/clip/model.py
NOTE: This requires a fixed feature size at construction time and will prevent adaptive sizing of the network.
"""
def __init__(
self,
in_features: int,
feat_size: Union[int, Tuple[int, int]],
out_features: int = None,
embed_dim: int = None,
num_heads: int = 4,
qkv_bias: bool = True,
):
super().__init__()
embed_dim = embed_dim or in_features
out_features = out_features or in_features
assert embed_dim % num_heads == 0
self.feat_size = to_2tuple(feat_size)
self.qkv = nn.Linear(in_features, embed_dim * 3, bias=qkv_bias)
self.proj = nn.Linear(embed_dim, out_features)
self.num_heads = num_heads
self.head_dim = embed_dim // num_heads
self.scale = self.head_dim ** -0.5
spatial_dim = self.feat_size[0] * self.feat_size[1]
self.pos_embed = nn.Parameter(torch.zeros(spatial_dim + 1, in_features))
trunc_normal_(self.pos_embed, std=in_features ** -0.5)
trunc_normal_(self.qkv.weight, std=in_features ** -0.5)
nn.init.zeros_(self.qkv.bias)
def forward(self, x):
B, _, H, W = x.shape
N = H * W
assert self.feat_size[0] == H
assert self.feat_size[1] == W
x = x.reshape(B, -1, N).permute(0, 2, 1)
x = torch.cat([x.mean(1, keepdim=True), x], dim=1)
x = x + self.pos_embed.unsqueeze(0).to(x.dtype)
x = self.qkv(x).reshape(B, N + 1, 3, self.num_heads, self.head_dim).permute(2, 0, 3, 1, 4)
q, k, v = x[0], x[1], x[2]
attn = (q @ k.transpose(-2, -1)) * self.scale
attn = attn.softmax(dim=-1)
x = (attn @ v).transpose(1, 2).reshape(B, N + 1, -1)
x = self.proj(x)
return x[:, 0]
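The core idea of both pooling modules above is to replace a uniform spatial average with a softmax-weighted one, where the weights come from query/key scores. A single-head, scalar-valued pure-Python sketch:

```python
# Attention pooling in miniature: softmax scores become pooling weights.
import math

def softmax(scores):
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

values = [1.0, 2.0, 3.0, 4.0]
scores = [0.0, 0.0, 0.0, 5.0]  # attention strongly prefers position 3
weights = softmax(scores)
pooled = sum(w * v for w, v in zip(weights, values))
assert abs(sum(weights) - 1.0) < 1e-9
assert pooled > sum(values) / len(values)  # pulled toward the attended value
```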
# === hf_public_repos/pytorch-image-models/timm/layers/split_batchnorm.py ===
""" Split BatchNorm
A PyTorch BatchNorm layer that splits input batch into N equal parts and passes each through
a separate BN layer. The first split is passed through the parent BN layers with weight/bias
keys the same as the original BN. All other splits pass through BN sub-layers under the '.aux_bn'
namespace.
This allows easily removing the auxiliary BN layers after training to efficiently
achieve the 'Auxiliary BatchNorm' as described in the AdvProp Paper, section 4.2,
'Disentangled Learning via An Auxiliary BN'
Hacked together by / Copyright 2020 Ross Wightman
"""
import torch
import torch.nn as nn
class SplitBatchNorm2d(torch.nn.BatchNorm2d):
def __init__(self, num_features, eps=1e-5, momentum=0.1, affine=True,
track_running_stats=True, num_splits=2):
super().__init__(num_features, eps, momentum, affine, track_running_stats)
assert num_splits > 1, 'Should have at least one aux BN layer (num_splits at least 2)'
self.num_splits = num_splits
self.aux_bn = nn.ModuleList([
nn.BatchNorm2d(num_features, eps, momentum, affine, track_running_stats) for _ in range(num_splits - 1)])
def forward(self, input: torch.Tensor):
if self.training: # aux BN only relevant while training
split_size = input.shape[0] // self.num_splits
assert input.shape[0] == split_size * self.num_splits, "batch size must be evenly divisible by num_splits"
split_input = input.split(split_size)
x = [super().forward(split_input[0])]
for i, a in enumerate(self.aux_bn):
x.append(a(split_input[i + 1]))
return torch.cat(x, dim=0)
else:
return super().forward(input)
def convert_splitbn_model(module, num_splits=2):
"""
Recursively traverse module and its children to replace all instances of
``torch.nn.modules.batchnorm._BatchNorm`` with `SplitBatchnorm2d`.
Args:
module (torch.nn.Module): input module
num_splits: number of separate batchnorm layers to split input across
Example::
>>> # model is an instance of torch.nn.Module
>>> model = timm.models.convert_splitbn_model(model, num_splits=2)
"""
mod = module
if isinstance(module, torch.nn.modules.instancenorm._InstanceNorm):
return module
if isinstance(module, torch.nn.modules.batchnorm._BatchNorm):
mod = SplitBatchNorm2d(
module.num_features, module.eps, module.momentum, module.affine,
module.track_running_stats, num_splits=num_splits)
mod.running_mean = module.running_mean
mod.running_var = module.running_var
mod.num_batches_tracked = module.num_batches_tracked
if module.affine:
mod.weight.data = module.weight.data.clone().detach()
mod.bias.data = module.bias.data.clone().detach()
for aux in mod.aux_bn:
aux.running_mean = module.running_mean.clone()
aux.running_var = module.running_var.clone()
aux.num_batches_tracked = module.num_batches_tracked.clone()
if module.affine:
aux.weight.data = module.weight.data.clone().detach()
aux.bias.data = module.bias.data.clone().detach()
for name, child in module.named_children():
mod.add_module(name, convert_splitbn_model(child, num_splits=num_splits))
del module
return mod
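What `SplitBatchNorm2d` buys you is separate normalization statistics per split of the batch instead of one set over the whole batch — useful when, as in AdvProp, half the batch comes from a different distribution. A pure-Python sketch over a flat batch of scalars with `num_splits=2`:

```python
# Per-split vs whole-batch statistics, mirroring the split in forward() above.
def mean_var(xs):
    m = sum(xs) / len(xs)
    v = sum((x - m) ** 2 for x in xs) / len(xs)
    return m, v

batch = [0.0, 0.0, 10.0, 10.0]  # clean half + adversarial half
num_splits = 2
split = len(batch) // num_splits
stats = [mean_var(batch[i * split:(i + 1) * split]) for i in range(num_splits)]
assert stats[0] == (0.0, 0.0)    # each split gets its own tight statistics
assert stats[1] == (10.0, 0.0)
assert mean_var(batch) == (5.0, 25.0)  # whole-batch stats would mix the two
```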
# === hf_public_repos/pytorch-image-models/timm/layers/trace_utils.py ===
try:
from torch import _assert
except ImportError:
def _assert(condition: bool, message: str):
assert condition, message
def _float_to_int(x: float) -> int:
"""
Symbolic tracing helper to substitute for inbuilt `int`.
Hint: Inbuilt `int` can't accept an argument of type `Proxy`
"""
return int(x)
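The fallback `_assert` above behaves like the `assert` statement but remains a plain callable, which is what symbolic tracing needs. A quick standalone sketch of that equivalence:

```python
# Fallback _assert re-declared locally for demonstration.
def _assert(condition, message):
    assert condition, message

_assert(1 + 1 == 2, 'arithmetic still works')
try:
    _assert(False, 'boom')
except AssertionError as e:
    caught = str(e)
assert caught == 'boom'
```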
# === hf_public_repos/pytorch-image-models/timm/layers/lambda_layer.py ===
""" Lambda Layer
Paper: `LambdaNetworks: Modeling Long-Range Interactions Without Attention`
- https://arxiv.org/abs/2102.08602
@misc{2102.08602,
Author = {Irwan Bello},
Title = {LambdaNetworks: Modeling Long-Range Interactions Without Attention},
Year = {2021},
}
Status:
This impl is a WIP. Code snippets in the paper were used as reference but
good chance some details are missing/wrong.
I've only implemented local lambda conv based pos embeddings.
For a PyTorch impl that includes other embedding options checkout
https://github.com/lucidrains/lambda-networks
Hacked together by / Copyright 2021 Ross Wightman
"""
import torch
from torch import nn
import torch.nn.functional as F
from .helpers import to_2tuple, make_divisible
from .weight_init import trunc_normal_
def rel_pos_indices(size):
size = to_2tuple(size)
pos = torch.stack(torch.meshgrid(torch.arange(size[0]), torch.arange(size[1]))).flatten(1)
rel_pos = pos[:, None, :] - pos[:, :, None]
rel_pos[0] += size[0] - 1
rel_pos[1] += size[1] - 1
return rel_pos # 2, H * W, H * W
class LambdaLayer(nn.Module):
"""Lambda Layer
Paper: `LambdaNetworks: Modeling Long-Range Interactions Without Attention`
- https://arxiv.org/abs/2102.08602
NOTE: intra-depth parameter 'u' is fixed at 1. It did not appear worth the complexity to add.
The internal dimensions of the lambda module are controlled via the interaction of several arguments.
* the output dimension of the module is specified by dim_out, which falls back to input dim if not set
* the value (v) dimension is set to dim_out // num_heads, the v projection determines the output dim
* the query (q) and key (k) dimension are determined by
* dim_head = (dim_out * attn_ratio // num_heads) if dim_head is None
* q = num_heads * dim_head, k = dim_head
* as seen above, attn_ratio determines the ratio of q and k relative to the output if dim_head not set
Args:
dim (int): input dimension to the module
dim_out (int): output dimension of the module, same as dim if not set
feat_size (Tuple[int, int]): size of input feature_map for relative pos variant H, W
stride (int): output stride of the module, avg pool used if stride == 2
num_heads (int): parallel attention heads.
dim_head (int): dimension of query and key heads, calculated from dim_out * attn_ratio // num_heads if not set
r (int): local lambda convolution radius. Use lambda conv if set, else relative pos if not. (default: 9)
qk_ratio (float): ratio of q and k dimensions to output dimension when dim_head not set. (default: 1.0)
qkv_bias (bool): add bias to q, k, and v projections
"""
def __init__(
self, dim, dim_out=None, feat_size=None, stride=1, num_heads=4, dim_head=16, r=9,
qk_ratio=1.0, qkv_bias=False):
super().__init__()
dim_out = dim_out or dim
assert dim_out % num_heads == 0, 'dim_out must be divisible by num_heads'
self.dim_qk = dim_head or make_divisible(dim_out * qk_ratio, divisor=8) // num_heads
self.num_heads = num_heads
self.dim_v = dim_out // num_heads
self.qkv = nn.Conv2d(
dim,
num_heads * self.dim_qk + self.dim_qk + self.dim_v,
kernel_size=1, bias=qkv_bias)
self.norm_q = nn.BatchNorm2d(num_heads * self.dim_qk)
self.norm_v = nn.BatchNorm2d(self.dim_v)
if r is not None:
# local lambda convolution for pos
self.conv_lambda = nn.Conv3d(1, self.dim_qk, (r, r, 1), padding=(r // 2, r // 2, 0))
self.pos_emb = None
self.rel_pos_indices = None
else:
# relative pos embedding
assert feat_size is not None
feat_size = to_2tuple(feat_size)
rel_size = [2 * s - 1 for s in feat_size]
self.conv_lambda = None
self.pos_emb = nn.Parameter(torch.zeros(rel_size[0], rel_size[1], self.dim_qk))
self.register_buffer('rel_pos_indices', rel_pos_indices(feat_size), persistent=False)
self.pool = nn.AvgPool2d(2, 2) if stride == 2 else nn.Identity()
self.reset_parameters()
def reset_parameters(self):
trunc_normal_(self.qkv.weight, std=self.qkv.weight.shape[1] ** -0.5) # fan-in
if self.conv_lambda is not None:
trunc_normal_(self.conv_lambda.weight, std=self.dim_qk ** -0.5)
if self.pos_emb is not None:
trunc_normal_(self.pos_emb, std=.02)
def forward(self, x):
B, C, H, W = x.shape
M = H * W
qkv = self.qkv(x)
q, k, v = torch.split(qkv, [
self.num_heads * self.dim_qk, self.dim_qk, self.dim_v], dim=1)
q = self.norm_q(q).reshape(B, self.num_heads, self.dim_qk, M).transpose(-1, -2) # B, num_heads, M, K
v = self.norm_v(v).reshape(B, self.dim_v, M).transpose(-1, -2) # B, M, V
k = F.softmax(k.reshape(B, self.dim_qk, M), dim=-1) # B, K, M
content_lam = k @ v # B, K, V
content_out = q @ content_lam.unsqueeze(1) # B, num_heads, M, V
if self.pos_emb is None:
position_lam = self.conv_lambda(v.reshape(B, 1, H, W, self.dim_v)) # B, H, W, V, K
position_lam = position_lam.reshape(B, 1, self.dim_qk, H * W, self.dim_v).transpose(2, 3) # B, 1, M, K, V
else:
# FIXME relative pos embedding path not fully verified
pos_emb = self.pos_emb[self.rel_pos_indices[0], self.rel_pos_indices[1]].expand(B, -1, -1, -1)
position_lam = (pos_emb.transpose(-1, -2) @ v.unsqueeze(1)).unsqueeze(1) # B, 1, M, K, V
position_out = (q.unsqueeze(-2) @ position_lam).squeeze(-2) # B, num_heads, M, V
out = (content_out + position_out).transpose(-1, -2).reshape(B, C, H, W) # B, C (num_heads * V), H, W
out = self.pool(out)
return out
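The dimension arithmetic described in the `LambdaLayer` docstring can be checked standalone. In this sketch both `make_divisible` (normally imported from `timm.layers.helpers`) and the `lambda_dims` helper are local illustrations, not timm APIs:

```python
# Hypothetical helper mirroring the q/k/v dimension logic in __init__ above.
def make_divisible(v, divisor=8):
    # simplified local stand-in for timm's make_divisible
    return max(divisor, int(v + divisor / 2) // divisor * divisor)

def lambda_dims(dim_out, num_heads=4, dim_head=None, qk_ratio=1.0):
    assert dim_out % num_heads == 0
    dim_qk = dim_head or make_divisible(dim_out * qk_ratio) // num_heads
    dim_v = dim_out // num_heads
    return dim_qk, dim_v

dim_qk, dim_v = lambda_dims(dim_out=256, num_heads=4, dim_head=16)
assert (dim_qk, dim_v) == (16, 64)
# total qkv conv output channels, matching the nn.Conv2d above:
assert 4 * dim_qk + dim_qk + dim_v == 144
```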
# === hf_public_repos/pytorch-image-models/timm/layers/split_attn.py ===
""" Split Attention Conv2d (for ResNeSt Models)
Paper: `ResNeSt: Split-Attention Networks` - https://arxiv.org/abs/2004.08955
Adapted from original PyTorch impl at https://github.com/zhanghang1989/ResNeSt
Modified for torchscript compat, performance, and consistency with timm by Ross Wightman
"""
import torch
import torch.nn.functional as F
from torch import nn
from .helpers import make_divisible
class RadixSoftmax(nn.Module):
def __init__(self, radix, cardinality):
super(RadixSoftmax, self).__init__()
self.radix = radix
self.cardinality = cardinality
def forward(self, x):
batch = x.size(0)
if self.radix > 1:
x = x.view(batch, self.cardinality, self.radix, -1).transpose(1, 2)
x = F.softmax(x, dim=1)
x = x.reshape(batch, -1)
else:
x = torch.sigmoid(x)
return x
class SplitAttn(nn.Module):
"""Split-Attention (aka Splat)
"""
def __init__(self, in_channels, out_channels=None, kernel_size=3, stride=1, padding=None,
dilation=1, groups=1, bias=False, radix=2, rd_ratio=0.25, rd_channels=None, rd_divisor=8,
act_layer=nn.ReLU, norm_layer=None, drop_layer=None, **kwargs):
super(SplitAttn, self).__init__()
out_channels = out_channels or in_channels
self.radix = radix
mid_chs = out_channels * radix
if rd_channels is None:
attn_chs = make_divisible(in_channels * radix * rd_ratio, min_value=32, divisor=rd_divisor)
else:
attn_chs = rd_channels * radix
padding = kernel_size // 2 if padding is None else padding
self.conv = nn.Conv2d(
in_channels, mid_chs, kernel_size, stride, padding, dilation,
groups=groups * radix, bias=bias, **kwargs)
self.bn0 = norm_layer(mid_chs) if norm_layer else nn.Identity()
self.drop = drop_layer() if drop_layer is not None else nn.Identity()
self.act0 = act_layer(inplace=True)
self.fc1 = nn.Conv2d(out_channels, attn_chs, 1, groups=groups)
self.bn1 = norm_layer(attn_chs) if norm_layer else nn.Identity()
self.act1 = act_layer(inplace=True)
self.fc2 = nn.Conv2d(attn_chs, mid_chs, 1, groups=groups)
self.rsoftmax = RadixSoftmax(radix, groups)
def forward(self, x):
x = self.conv(x)
x = self.bn0(x)
x = self.drop(x)
x = self.act0(x)
B, RC, H, W = x.shape
if self.radix > 1:
x = x.reshape((B, self.radix, RC // self.radix, H, W))
x_gap = x.sum(dim=1)
else:
x_gap = x
x_gap = x_gap.mean((2, 3), keepdim=True)
x_gap = self.fc1(x_gap)
x_gap = self.bn1(x_gap)
x_gap = self.act1(x_gap)
x_attn = self.fc2(x_gap)
x_attn = self.rsoftmax(x_attn).view(B, -1, 1, 1)
if self.radix > 1:
out = (x * x_attn.reshape((B, self.radix, RC // self.radix, 1, 1))).sum(dim=1)
else:
out = x * x_attn
return out.contiguous()
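For `radix > 1`, `RadixSoftmax` normalizes the attention logits *across the radix splits* of each channel, so the per-channel weights over splits sum to 1. A pure-Python sketch with `radix=2` and a single channel:

```python
# What RadixSoftmax computes for one channel with radix=2.
import math

def softmax(scores):
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

logits_per_split = [0.5, -0.5]  # one channel, two radix splits
weights = softmax(logits_per_split)
assert abs(sum(weights) - 1.0) < 1e-9
assert weights[0] > weights[1]  # the stronger split gets more weight
```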
# === hf_public_repos/pytorch-image-models/timm/layers/non_local_attn.py ===
""" Bilinear-Attention-Transform and Non-Local Attention
Paper: `Non-Local Neural Networks With Grouped Bilinear Attentional Transforms`
- https://openaccess.thecvf.com/content_CVPR_2020/html/Chi_Non-Local_Neural_Networks_With_Grouped_Bilinear_Attentional_Transforms_CVPR_2020_paper.html
Adapted from original code: https://github.com/BA-Transform/BAT-Image-Classification
"""
import torch
from torch import nn
from torch.nn import functional as F
from .conv_bn_act import ConvNormAct
from .helpers import make_divisible
from .trace_utils import _assert
class NonLocalAttn(nn.Module):
"""Spatial NL block for image classification.
This was adapted from https://github.com/BA-Transform/BAT-Image-Classification
Their NonLocal impl inspired by https://github.com/facebookresearch/video-nonlocal-net.
"""
def __init__(self, in_channels, use_scale=True, rd_ratio=1/8, rd_channels=None, rd_divisor=8, **kwargs):
super(NonLocalAttn, self).__init__()
if rd_channels is None:
rd_channels = make_divisible(in_channels * rd_ratio, divisor=rd_divisor)
self.scale = in_channels ** -0.5 if use_scale else 1.0
self.t = nn.Conv2d(in_channels, rd_channels, kernel_size=1, stride=1, bias=True)
self.p = nn.Conv2d(in_channels, rd_channels, kernel_size=1, stride=1, bias=True)
self.g = nn.Conv2d(in_channels, rd_channels, kernel_size=1, stride=1, bias=True)
self.z = nn.Conv2d(rd_channels, in_channels, kernel_size=1, stride=1, bias=True)
self.norm = nn.BatchNorm2d(in_channels)
self.reset_parameters()
def forward(self, x):
shortcut = x
t = self.t(x)
p = self.p(x)
g = self.g(x)
B, C, H, W = t.size()
t = t.view(B, C, -1).permute(0, 2, 1)
p = p.view(B, C, -1)
g = g.view(B, C, -1).permute(0, 2, 1)
att = torch.bmm(t, p) * self.scale
att = F.softmax(att, dim=2)
x = torch.bmm(att, g)
x = x.permute(0, 2, 1).reshape(B, C, H, W)
x = self.z(x)
x = self.norm(x) + shortcut
return x
def reset_parameters(self):
for name, m in self.named_modules():
if isinstance(m, nn.Conv2d):
nn.init.kaiming_normal_(
m.weight, mode='fan_out', nonlinearity='relu')
if len(list(m.parameters())) > 1:
nn.init.constant_(m.bias, 0.0)
elif isinstance(m, nn.BatchNorm2d):
nn.init.constant_(m.weight, 0)
nn.init.constant_(m.bias, 0)
elif isinstance(m, nn.GroupNorm):
nn.init.constant_(m.weight, 0)
nn.init.constant_(m.bias, 0)
class BilinearAttnTransform(nn.Module):
def __init__(self, in_channels, block_size, groups, act_layer=nn.ReLU, norm_layer=nn.BatchNorm2d):
super(BilinearAttnTransform, self).__init__()
self.conv1 = ConvNormAct(in_channels, groups, 1, act_layer=act_layer, norm_layer=norm_layer)
self.conv_p = nn.Conv2d(groups, block_size * block_size * groups, kernel_size=(block_size, 1))
self.conv_q = nn.Conv2d(groups, block_size * block_size * groups, kernel_size=(1, block_size))
self.conv2 = ConvNormAct(in_channels, in_channels, 1, act_layer=act_layer, norm_layer=norm_layer)
self.block_size = block_size
self.groups = groups
self.in_channels = in_channels
def resize_mat(self, x, t: int):
B, C, block_size, block_size1 = x.shape
_assert(block_size == block_size1, '')
if t <= 1:
return x
x = x.view(B * C, -1, 1, 1)
x = x * torch.eye(t, t, dtype=x.dtype, device=x.device)
x = x.view(B * C, block_size, block_size, t, t)
x = torch.cat(torch.split(x, 1, dim=1), dim=3)
x = torch.cat(torch.split(x, 1, dim=2), dim=4)
x = x.view(B, C, block_size * t, block_size * t)
return x
def forward(self, x):
_assert(x.shape[-1] % self.block_size == 0, '')
_assert(x.shape[-2] % self.block_size == 0, '')
B, C, H, W = x.shape
out = self.conv1(x)
rp = F.adaptive_max_pool2d(out, (self.block_size, 1))
cp = F.adaptive_max_pool2d(out, (1, self.block_size))
p = self.conv_p(rp).view(B, self.groups, self.block_size, self.block_size).sigmoid()
q = self.conv_q(cp).view(B, self.groups, self.block_size, self.block_size).sigmoid()
p = p / p.sum(dim=3, keepdim=True)
q = q / q.sum(dim=2, keepdim=True)
p = p.view(B, self.groups, 1, self.block_size, self.block_size).expand(x.size(
0), self.groups, C // self.groups, self.block_size, self.block_size).contiguous()
p = p.view(B, C, self.block_size, self.block_size)
q = q.view(B, self.groups, 1, self.block_size, self.block_size).expand(x.size(
0), self.groups, C // self.groups, self.block_size, self.block_size).contiguous()
q = q.view(B, C, self.block_size, self.block_size)
p = self.resize_mat(p, H // self.block_size)
q = self.resize_mat(q, W // self.block_size)
y = p.matmul(x)
y = y.matmul(q)
y = self.conv2(y)
return y
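`resize_mat` above is, in effect, a Kronecker product with the identity: each entry `x[i][j]` of the `block_size x block_size` attention matrix becomes `x[i][j] * I_t`, scaling the map up to the feature resolution. A nested-list sketch of that expansion:

```python
# Pure-Python Kronecker-with-identity, mirroring resize_mat's rearrangement.
def kron_identity(x, t):
    n = len(x)
    out = [[0.0] * (n * t) for _ in range(n * t)]
    for i in range(n):
        for j in range(n):
            for d in range(t):
                out[i * t + d][j * t + d] = x[i][j]
    return out

x = [[1.0, 2.0], [3.0, 4.0]]
y = kron_identity(x, 2)
assert y[0] == [1.0, 0.0, 2.0, 0.0]
assert y[1] == [0.0, 1.0, 0.0, 2.0]
assert y[2] == [3.0, 0.0, 4.0, 0.0]
```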
class BatNonLocalAttn(nn.Module):
""" BAT
Adapted from: https://github.com/BA-Transform/BAT-Image-Classification
"""
def __init__(
self, in_channels, block_size=7, groups=2, rd_ratio=0.25, rd_channels=None, rd_divisor=8,
drop_rate=0.2, act_layer=nn.ReLU, norm_layer=nn.BatchNorm2d, **_):
super().__init__()
if rd_channels is None:
rd_channels = make_divisible(in_channels * rd_ratio, divisor=rd_divisor)
self.conv1 = ConvNormAct(in_channels, rd_channels, 1, act_layer=act_layer, norm_layer=norm_layer)
self.ba = BilinearAttnTransform(rd_channels, block_size, groups, act_layer=act_layer, norm_layer=norm_layer)
self.conv2 = ConvNormAct(rd_channels, in_channels, 1, act_layer=act_layer, norm_layer=norm_layer)
self.dropout = nn.Dropout2d(p=drop_rate)
def forward(self, x):
xl = self.conv1(x)
y = self.ba(xl)
y = self.conv2(y)
y = self.dropout(y)
return y + x
# === hf_public_repos/pytorch-image-models/timm/layers/cond_conv2d.py ===
""" PyTorch Conditionally Parameterized Convolution (CondConv)
Paper: CondConv: Conditionally Parameterized Convolutions for Efficient Inference
(https://arxiv.org/abs/1904.04971)
Hacked together by / Copyright 2020 Ross Wightman
"""
import math
from functools import partial
import numpy as np
import torch
from torch import nn as nn
from torch.nn import functional as F
from .helpers import to_2tuple
from .conv2d_same import conv2d_same
from .padding import get_padding_value
def get_condconv_initializer(initializer, num_experts, expert_shape):
def condconv_initializer(weight):
"""CondConv initializer function."""
num_params = np.prod(expert_shape)
if (len(weight.shape) != 2 or weight.shape[0] != num_experts or
weight.shape[1] != num_params):
raise (ValueError(
'CondConv variables must have shape [num_experts, num_params]'))
for i in range(num_experts):
initializer(weight[i].view(expert_shape))
return condconv_initializer
class CondConv2d(nn.Module):
""" Conditionally Parameterized Convolution
Inspired by: https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/condconv/condconv_layers.py
Grouped convolution hackery for parallel execution of the per-sample kernel filters inspired by this discussion:
https://github.com/pytorch/pytorch/issues/17983
"""
__constants__ = ['in_channels', 'out_channels', 'dynamic_padding']
def __init__(self, in_channels, out_channels, kernel_size=3,
stride=1, padding='', dilation=1, groups=1, bias=False, num_experts=4):
super(CondConv2d, self).__init__()
self.in_channels = in_channels
self.out_channels = out_channels
self.kernel_size = to_2tuple(kernel_size)
self.stride = to_2tuple(stride)
padding_val, is_padding_dynamic = get_padding_value(
padding, kernel_size, stride=stride, dilation=dilation)
self.dynamic_padding = is_padding_dynamic # if in forward to work with torchscript
self.padding = to_2tuple(padding_val)
self.dilation = to_2tuple(dilation)
self.groups = groups
self.num_experts = num_experts
self.weight_shape = (self.out_channels, self.in_channels // self.groups) + self.kernel_size
weight_num_param = 1
for wd in self.weight_shape:
weight_num_param *= wd
self.weight = torch.nn.Parameter(torch.Tensor(self.num_experts, weight_num_param))
if bias:
self.bias_shape = (self.out_channels,)
self.bias = torch.nn.Parameter(torch.Tensor(self.num_experts, self.out_channels))
else:
self.register_parameter('bias', None)
self.reset_parameters()
def reset_parameters(self):
init_weight = get_condconv_initializer(
partial(nn.init.kaiming_uniform_, a=math.sqrt(5)), self.num_experts, self.weight_shape)
init_weight(self.weight)
if self.bias is not None:
fan_in = np.prod(self.weight_shape[1:])
bound = 1 / math.sqrt(fan_in)
init_bias = get_condconv_initializer(
partial(nn.init.uniform_, a=-bound, b=bound), self.num_experts, self.bias_shape)
init_bias(self.bias)
def forward(self, x, routing_weights):
B, C, H, W = x.shape
weight = torch.matmul(routing_weights, self.weight)
new_weight_shape = (B * self.out_channels, self.in_channels // self.groups) + self.kernel_size
weight = weight.view(new_weight_shape)
bias = None
if self.bias is not None:
bias = torch.matmul(routing_weights, self.bias)
bias = bias.view(B * self.out_channels)
# move batch elements with channels so each batch element can be efficiently convolved with separate kernel
# reshape instead of view to work with channels_last input
x = x.reshape(1, B * C, H, W)
if self.dynamic_padding:
out = conv2d_same(
x, weight, bias, stride=self.stride, padding=self.padding,
dilation=self.dilation, groups=self.groups * B)
else:
out = F.conv2d(
x, weight, bias, stride=self.stride, padding=self.padding,
dilation=self.dilation, groups=self.groups * B)
out = out.permute([1, 0, 2, 3]).view(B, self.out_channels, out.shape[-2], out.shape[-1])
# Literal port (from TF definition)
# x = torch.split(x, 1, 0)
# weight = torch.split(weight, 1, 0)
# if self.bias is not None:
# bias = torch.matmul(routing_weights, self.bias)
# bias = torch.split(bias, 1, 0)
# else:
# bias = [None] * B
# out = []
# for xi, wi, bi in zip(x, weight, bias):
# wi = wi.view(*self.weight_shape)
# if bi is not None:
# bi = bi.view(*self.bias_shape)
# out.append(self.conv_fn(
# xi, wi, bi, stride=self.stride, padding=self.padding,
# dilation=self.dilation, groups=self.groups))
# out = torch.cat(out, 0)
return out
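As a standalone sketch of the grouped-convolution trick used above (all sizes and names here are illustrative, not part of timm's API), a different mixed-expert kernel can be applied to each batch element in a single `F.conv2d` call by folding the batch into the group dimension:

```python
import torch
import torch.nn.functional as F

B, Cin, Cout, K, E = 2, 4, 8, 3, 4            # batch, in/out channels, kernel, experts
experts = torch.randn(E, Cout * Cin * K * K)  # flattened expert kernels
routing = torch.softmax(torch.randn(B, E), dim=1)

# Mix experts per sample, then fold the batch into the group dimension so a
# single conv applies a different kernel to each batch element.
w = (routing @ experts).view(B * Cout, Cin, K, K)
x = torch.randn(B, Cin, 8, 8)
out = F.conv2d(x.reshape(1, B * Cin, 8, 8), w, padding=K // 2, groups=B)
out = out.view(B, Cout, 8, 8)

# Reference: convolve each sample separately with its own mixed kernel.
w_per_sample = w.view(B, Cout, Cin, K, K)
ref = torch.cat([F.conv2d(x[i:i + 1], w_per_sample[i], padding=K // 2) for i in range(B)])
```

The grouped path and the explicit per-sample loop produce the same result; the grouped version avoids a Python-level loop over the batch.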
hf_public_repos/pytorch-image-models/timm/layers/attention_pool.py
from typing import Optional
import torch
import torch.nn as nn
import torch.nn.functional as F
from .config import use_fused_attn
from .mlp import Mlp
from .weight_init import trunc_normal_tf_
class AttentionPoolLatent(nn.Module):
""" Attention pooling w/ latent query
"""
fused_attn: torch.jit.Final[bool]
def __init__(
self,
in_features: int,
out_features: int = None,
embed_dim: int = None,
num_heads: int = 8,
mlp_ratio: float = 4.0,
qkv_bias: bool = True,
qk_norm: bool = False,
latent_len: int = 1,
latent_dim: int = None,
            feat_size: Optional[int] = None,
            pos_embed: str = '',
            pool_type: str = 'token',
            norm_layer: Optional[nn.Module] = None,
            drop: float = 0.0,
    ):
        super().__init__()
        embed_dim = embed_dim or in_features
        out_features = out_features or in_features
        assert embed_dim % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = embed_dim // num_heads
        self.scale = self.head_dim ** -0.5
        self.pool = pool_type
        self.fused_attn = use_fused_attn()
        if pos_embed == 'abs':
            assert feat_size is not None, 'feat_size (sequence length) required for abs pos embed'
            self.pos_embed = nn.Parameter(torch.zeros(feat_size, in_features))
else:
self.pos_embed = None
self.latent_dim = latent_dim or embed_dim
self.latent_len = latent_len
self.latent = nn.Parameter(torch.zeros(1, self.latent_len, embed_dim))
self.q = nn.Linear(embed_dim, embed_dim, bias=qkv_bias)
self.kv = nn.Linear(embed_dim, embed_dim * 2, bias=qkv_bias)
self.q_norm = norm_layer(self.head_dim) if qk_norm else nn.Identity()
self.k_norm = norm_layer(self.head_dim) if qk_norm else nn.Identity()
self.proj = nn.Linear(embed_dim, embed_dim)
self.proj_drop = nn.Dropout(drop)
self.norm = norm_layer(out_features) if norm_layer is not None else nn.Identity()
self.mlp = Mlp(embed_dim, int(embed_dim * mlp_ratio))
self.init_weights()
def init_weights(self):
if self.pos_embed is not None:
trunc_normal_tf_(self.pos_embed, std=self.pos_embed.shape[1] ** -0.5)
trunc_normal_tf_(self.latent, std=self.latent_dim ** -0.5)
def forward(self, x):
B, N, C = x.shape
if self.pos_embed is not None:
# FIXME interpolate
x = x + self.pos_embed.unsqueeze(0).to(x.dtype)
q_latent = self.latent.expand(B, -1, -1)
q = self.q(q_latent).reshape(B, self.latent_len, self.num_heads, self.head_dim).transpose(1, 2)
kv = self.kv(x).reshape(B, N, 2, self.num_heads, self.head_dim).permute(2, 0, 3, 1, 4)
k, v = kv.unbind(0)
q, k = self.q_norm(q), self.k_norm(k)
if self.fused_attn:
x = F.scaled_dot_product_attention(q, k, v)
else:
q = q * self.scale
attn = q @ k.transpose(-2, -1)
attn = attn.softmax(dim=-1)
x = attn @ v
x = x.transpose(1, 2).reshape(B, self.latent_len, C)
x = self.proj(x)
x = self.proj_drop(x)
x = x + self.mlp(self.norm(x))
# optional pool if latent seq_len > 1 and pooled output is desired
if self.pool == 'token':
x = x[:, 0]
elif self.pool == 'avg':
x = x.mean(1)
return x
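A minimal, self-contained sketch of the latent-query pooling idea (illustrative shapes only, and the q/kv projections are skipped; note that with a zero query the attention is uniform, so the result reduces to mean pooling; requires PyTorch >= 2.0 for `scaled_dot_product_attention`):

```python
import torch
import torch.nn.functional as F

B, N, C, H = 2, 16, 32, 4                  # batch, tokens, embed dim, heads
x = torch.randn(B, N, C)
latent = torch.zeros(1, 1, C)              # single learnable query (zeros here)

# Expand the latent query over the batch, attend over all input tokens.
q = latent.expand(B, -1, -1).view(B, 1, H, C // H).transpose(1, 2)
kv = x.view(B, N, H, C // H).transpose(1, 2)
pooled = F.scaled_dot_product_attention(q, kv, kv)   # (B, H, 1, C//H)
pooled = pooled.transpose(1, 2).reshape(B, C)
```

A trained latent query weights tokens non-uniformly, turning this into a learned alternative to average pooling.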
hf_public_repos/pytorch-image-models/timm/layers/selective_kernel.py
""" Selective Kernel Convolution/Attention
Paper: Selective Kernel Networks (https://arxiv.org/abs/1903.06586)
Hacked together by / Copyright 2020 Ross Wightman
"""
import torch
from torch import nn as nn
from .conv_bn_act import ConvNormActAa
from .helpers import make_divisible
from .trace_utils import _assert
def _kernel_valid(k):
    if isinstance(k, (list, tuple)):
        for ki in k:
            _kernel_valid(ki)  # validate every entry, not just the first
        return
    assert k >= 3 and k % 2
class SelectiveKernelAttn(nn.Module):
def __init__(self, channels, num_paths=2, attn_channels=32, act_layer=nn.ReLU, norm_layer=nn.BatchNorm2d):
""" Selective Kernel Attention Module
Selective Kernel attention mechanism factored out into its own module.
"""
super(SelectiveKernelAttn, self).__init__()
self.num_paths = num_paths
self.fc_reduce = nn.Conv2d(channels, attn_channels, kernel_size=1, bias=False)
self.bn = norm_layer(attn_channels)
self.act = act_layer(inplace=True)
self.fc_select = nn.Conv2d(attn_channels, channels * num_paths, kernel_size=1, bias=False)
def forward(self, x):
_assert(x.shape[1] == self.num_paths, '')
x = x.sum(1).mean((2, 3), keepdim=True)
x = self.fc_reduce(x)
x = self.bn(x)
x = self.act(x)
x = self.fc_select(x)
B, C, H, W = x.shape
x = x.view(B, self.num_paths, C // self.num_paths, H, W)
x = torch.softmax(x, dim=1)
return x
class SelectiveKernel(nn.Module):
def __init__(self, in_channels, out_channels=None, kernel_size=None, stride=1, dilation=1, groups=1,
rd_ratio=1./16, rd_channels=None, rd_divisor=8, keep_3x3=True, split_input=True,
act_layer=nn.ReLU, norm_layer=nn.BatchNorm2d, aa_layer=None, drop_layer=None):
""" Selective Kernel Convolution Module
As described in Selective Kernel Networks (https://arxiv.org/abs/1903.06586) with some modifications.
Largest change is the input split, which divides the input channels across each convolution path, this can
be viewed as a grouping of sorts, but the output channel counts expand to the module level value. This keeps
the parameter count from ballooning when the convolutions themselves don't have groups, but still provides
a noteworthy increase in performance over similar param count models without this attention layer. -Ross W
Args:
in_channels (int): module input (feature) channel count
out_channels (int): module output (feature) channel count
kernel_size (int, list): kernel size for each convolution branch
stride (int): stride for convolutions
dilation (int): dilation for module as a whole, impacts dilation of each branch
groups (int): number of groups for each branch
rd_ratio (int, float): reduction factor for attention features
keep_3x3 (bool): keep all branch convolution kernels as 3x3, changing larger kernels for dilations
split_input (bool): split input channels evenly across each convolution branch, keeps param count lower,
can be viewed as grouping by path, output expands to module out_channels count
act_layer (nn.Module): activation layer to use
norm_layer (nn.Module): batchnorm/norm layer to use
aa_layer (nn.Module): anti-aliasing module
drop_layer (nn.Module): spatial drop module in convs (drop block, etc)
"""
super(SelectiveKernel, self).__init__()
out_channels = out_channels or in_channels
kernel_size = kernel_size or [3, 5] # default to one 3x3 and one 5x5 branch. 5x5 -> 3x3 + dilation
_kernel_valid(kernel_size)
if not isinstance(kernel_size, list):
kernel_size = [kernel_size] * 2
if keep_3x3:
dilation = [dilation * (k - 1) // 2 for k in kernel_size]
kernel_size = [3] * len(kernel_size)
else:
dilation = [dilation] * len(kernel_size)
self.num_paths = len(kernel_size)
self.in_channels = in_channels
self.out_channels = out_channels
self.split_input = split_input
if self.split_input:
assert in_channels % self.num_paths == 0
in_channels = in_channels // self.num_paths
groups = min(out_channels, groups)
conv_kwargs = dict(
stride=stride, groups=groups, act_layer=act_layer, norm_layer=norm_layer,
aa_layer=aa_layer, drop_layer=drop_layer)
self.paths = nn.ModuleList([
ConvNormActAa(in_channels, out_channels, kernel_size=k, dilation=d, **conv_kwargs)
for k, d in zip(kernel_size, dilation)])
attn_channels = rd_channels or make_divisible(out_channels * rd_ratio, divisor=rd_divisor)
self.attn = SelectiveKernelAttn(out_channels, self.num_paths, attn_channels)
def forward(self, x):
if self.split_input:
x_split = torch.split(x, self.in_channels // self.num_paths, 1)
x_paths = [op(x_split[i]) for i, op in enumerate(self.paths)]
else:
x_paths = [op(x) for op in self.paths]
x = torch.stack(x_paths, dim=1)
x_attn = self.attn(x)
x = x * x_attn
x = torch.sum(x, dim=1)
return x
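The attention mix in `forward` above reduces to a softmax over the path dimension followed by a weighted sum; a toy sketch with random tensors (all shapes illustrative):

```python
import torch

B, P, C, H, W = 2, 2, 8, 4, 4            # batch, paths, channels, spatial
paths = torch.randn(B, P, C, H, W)        # stacked outputs of the branch convs
logits = torch.randn(B, P, C, 1, 1)       # per-path, per-channel logits from the attn head
attn = torch.softmax(logits, dim=1)       # weights sum to 1 across paths
fused = (paths * attn).sum(dim=1)         # (B, C, H, W)
```

Each channel of the output is thus a convex combination of the same channel across the convolution branches.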
hf_public_repos/pytorch-image-models/timm/layers/space_to_depth.py
import torch
import torch.nn as nn
class SpaceToDepth(nn.Module):
bs: torch.jit.Final[int]
def __init__(self, block_size=4):
super().__init__()
assert block_size == 4
self.bs = block_size
def forward(self, x):
N, C, H, W = x.size()
x = x.view(N, C, H // self.bs, self.bs, W // self.bs, self.bs) # (N, C, H//bs, bs, W//bs, bs)
x = x.permute(0, 3, 5, 1, 2, 4).contiguous() # (N, bs, bs, C, H//bs, W//bs)
x = x.view(N, C * self.bs * self.bs, H // self.bs, W // self.bs) # (N, C*bs^2, H//bs, W//bs)
return x
@torch.jit.script
class SpaceToDepthJit:
def __call__(self, x: torch.Tensor):
# assuming hard-coded that block_size==4 for acceleration
N, C, H, W = x.size()
x = x.view(N, C, H // 4, 4, W // 4, 4) # (N, C, H//bs, bs, W//bs, bs)
x = x.permute(0, 3, 5, 1, 2, 4).contiguous() # (N, bs, bs, C, H//bs, W//bs)
x = x.view(N, C * 16, H // 4, W // 4) # (N, C*bs^2, H//bs, W//bs)
return x
class SpaceToDepthModule(nn.Module):
def __init__(self, no_jit=False):
super().__init__()
if not no_jit:
self.op = SpaceToDepthJit()
else:
self.op = SpaceToDepth()
def forward(self, x):
return self.op(x)
class DepthToSpace(nn.Module):
def __init__(self, block_size):
super().__init__()
self.bs = block_size
def forward(self, x):
N, C, H, W = x.size()
x = x.view(N, self.bs, self.bs, C // (self.bs ** 2), H, W) # (N, bs, bs, C//bs^2, H, W)
x = x.permute(0, 3, 4, 1, 5, 2).contiguous() # (N, C//bs^2, H, bs, W, bs)
x = x.view(N, C // (self.bs ** 2), H * self.bs, W * self.bs) # (N, C//bs^2, H * bs, W * bs)
return x
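The two rearrangements above are exact inverses of each other; a functional sketch verifying the round trip (block size 4 hard-coded only for the demo):

```python
import torch

def space_to_depth(x, bs):
    n, c, h, w = x.shape
    x = x.view(n, c, h // bs, bs, w // bs, bs)
    return x.permute(0, 3, 5, 1, 2, 4).reshape(n, c * bs * bs, h // bs, w // bs)

def depth_to_space(x, bs):
    n, c, h, w = x.shape
    x = x.view(n, bs, bs, c // (bs * bs), h, w)
    return x.permute(0, 3, 4, 1, 5, 2).reshape(n, c // (bs * bs), h * bs, w * bs)

x = torch.arange(2 * 3 * 8 * 8, dtype=torch.float32).view(2, 3, 8, 8)
y = space_to_depth(x, 4)          # (2, 48, 2, 2): spatial detail moved into channels
restored = depth_to_space(y, 4)   # recovers the original tensor exactly
```

Since only views and permutes are involved, the round trip is bit-exact, not merely approximate.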
hf_public_repos/pytorch-image-models/timm/layers/gather_excite.py
""" Gather-Excite Attention Block
Paper: `Gather-Excite: Exploiting Feature Context in CNNs` - https://arxiv.org/abs/1810.12348
Official code here, but it's only partial impl in Caffe: https://github.com/hujie-frank/GENet
I've tried to support all of the extent both w/ and w/o params. I don't believe I've seen another
impl that covers all of the cases.
NOTE: extent=0 + extra_params=False is equivalent to Squeeze-and-Excitation
Hacked together by / Copyright 2021 Ross Wightman
"""
import math
from torch import nn as nn
import torch.nn.functional as F
from .create_act import create_act_layer, get_act_layer
from .create_conv2d import create_conv2d
from .helpers import make_divisible
from .mlp import ConvMlp
class GatherExcite(nn.Module):
""" Gather-Excite Attention Module
"""
def __init__(
self, channels, feat_size=None, extra_params=False, extent=0, use_mlp=True,
rd_ratio=1./16, rd_channels=None, rd_divisor=1, add_maxpool=False,
act_layer=nn.ReLU, norm_layer=nn.BatchNorm2d, gate_layer='sigmoid'):
super(GatherExcite, self).__init__()
self.add_maxpool = add_maxpool
act_layer = get_act_layer(act_layer)
self.extent = extent
if extra_params:
self.gather = nn.Sequential()
if extent == 0:
assert feat_size is not None, 'spatial feature size must be specified for global extent w/ params'
self.gather.add_module(
'conv1', create_conv2d(channels, channels, kernel_size=feat_size, stride=1, depthwise=True))
if norm_layer:
self.gather.add_module(f'norm1', nn.BatchNorm2d(channels))
else:
assert extent % 2 == 0
num_conv = int(math.log2(extent))
for i in range(num_conv):
self.gather.add_module(
f'conv{i + 1}',
create_conv2d(channels, channels, kernel_size=3, stride=2, depthwise=True))
if norm_layer:
self.gather.add_module(f'norm{i + 1}', nn.BatchNorm2d(channels))
if i != num_conv - 1:
self.gather.add_module(f'act{i + 1}', act_layer(inplace=True))
else:
self.gather = None
if self.extent == 0:
self.gk = 0
self.gs = 0
else:
assert extent % 2 == 0
self.gk = self.extent * 2 - 1
self.gs = self.extent
if not rd_channels:
rd_channels = make_divisible(channels * rd_ratio, rd_divisor, round_limit=0.)
self.mlp = ConvMlp(channels, rd_channels, act_layer=act_layer) if use_mlp else nn.Identity()
self.gate = create_act_layer(gate_layer)
def forward(self, x):
size = x.shape[-2:]
if self.gather is not None:
x_ge = self.gather(x)
else:
if self.extent == 0:
# global extent
                x_ge = x.mean(dim=(2, 3), keepdim=True)
if self.add_maxpool:
# experimental codepath, may remove or change
x_ge = 0.5 * x_ge + 0.5 * x.amax((2, 3), keepdim=True)
else:
x_ge = F.avg_pool2d(
x, kernel_size=self.gk, stride=self.gs, padding=self.gk // 2, count_include_pad=False)
if self.add_maxpool:
# experimental codepath, may remove or change
x_ge = 0.5 * x_ge + 0.5 * F.max_pool2d(x, kernel_size=self.gk, stride=self.gs, padding=self.gk // 2)
x_ge = self.mlp(x_ge)
if x_ge.shape[-1] != 1 or x_ge.shape[-2] != 1:
x_ge = F.interpolate(x_ge, size=size)
return x * self.gate(x_ge)
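For a parameter-free local extent, gather is just a strided average pool and excite is a gated broadcast back to full resolution; a rough sketch of that codepath (extent and shapes are illustrative):

```python
import torch
import torch.nn.functional as F

B, C, H, W = 2, 8, 8, 8
extent = 2
gk, gs = extent * 2 - 1, extent                 # kernel/stride, as in the module
x = torch.randn(B, C, H, W)

x_ge = F.avg_pool2d(x, kernel_size=gk, stride=gs, padding=gk // 2,
                    count_include_pad=False)    # gather: local context summary
gate = torch.sigmoid(x_ge)                       # excite: squash to (0, 1)
gate = F.interpolate(gate, size=(H, W))          # broadcast back to input size
out = x * gate
```

With `extent = 0` the pooled map collapses to a single value per channel and this degenerates to the Squeeze-and-Excitation pattern, as noted in the module docstring.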
hf_public_repos/pytorch-image-models/timm/layers/squeeze_excite.py
""" Squeeze-and-Excitation Channel Attention
An SE implementation originally based on PyTorch SE-Net impl.
Has since evolved with additional functionality / configuration.
Paper: `Squeeze-and-Excitation Networks` - https://arxiv.org/abs/1709.01507
Also included is Effective Squeeze-Excitation (ESE).
Paper: `CenterMask : Real-Time Anchor-Free Instance Segmentation` - https://arxiv.org/abs/1911.06667
Hacked together by / Copyright 2021 Ross Wightman
"""
from torch import nn as nn
from .create_act import create_act_layer
from .helpers import make_divisible
class SEModule(nn.Module):
""" SE Module as defined in original SE-Nets with a few additions
Additions include:
* divisor can be specified to keep channels % div == 0 (default: 8)
* reduction channels can be specified directly by arg (if rd_channels is set)
* reduction channels can be specified by float rd_ratio (default: 1/16)
* global max pooling can be added to the squeeze aggregation
* customizable activation, normalization, and gate layer
"""
def __init__(
self, channels, rd_ratio=1. / 16, rd_channels=None, rd_divisor=8, add_maxpool=False,
bias=True, act_layer=nn.ReLU, norm_layer=None, gate_layer='sigmoid'):
super(SEModule, self).__init__()
self.add_maxpool = add_maxpool
if not rd_channels:
rd_channels = make_divisible(channels * rd_ratio, rd_divisor, round_limit=0.)
self.fc1 = nn.Conv2d(channels, rd_channels, kernel_size=1, bias=bias)
self.bn = norm_layer(rd_channels) if norm_layer else nn.Identity()
self.act = create_act_layer(act_layer, inplace=True)
self.fc2 = nn.Conv2d(rd_channels, channels, kernel_size=1, bias=bias)
self.gate = create_act_layer(gate_layer)
def forward(self, x):
x_se = x.mean((2, 3), keepdim=True)
if self.add_maxpool:
# experimental codepath, may remove or change
x_se = 0.5 * x_se + 0.5 * x.amax((2, 3), keepdim=True)
x_se = self.fc1(x_se)
x_se = self.act(self.bn(x_se))
x_se = self.fc2(x_se)
return x * self.gate(x_se)
SqueezeExcite = SEModule # alias
class EffectiveSEModule(nn.Module):
""" 'Effective Squeeze-Excitation
From `CenterMask : Real-Time Anchor-Free Instance Segmentation` - https://arxiv.org/abs/1911.06667
"""
def __init__(self, channels, add_maxpool=False, gate_layer='hard_sigmoid', **_):
super(EffectiveSEModule, self).__init__()
self.add_maxpool = add_maxpool
self.fc = nn.Conv2d(channels, channels, kernel_size=1, padding=0)
self.gate = create_act_layer(gate_layer)
def forward(self, x):
x_se = x.mean((2, 3), keepdim=True)
if self.add_maxpool:
# experimental codepath, may remove or change
x_se = 0.5 * x_se + 0.5 * x.amax((2, 3), keepdim=True)
x_se = self.fc(x_se)
return x * self.gate(x_se)
EffectiveSqueezeExcite = EffectiveSEModule # alias
class SqueezeExciteCl(nn.Module):
""" SE Module as defined in original SE-Nets with a few additions
Additions include:
* divisor can be specified to keep channels % div == 0 (default: 8)
* reduction channels can be specified directly by arg (if rd_channels is set)
* reduction channels can be specified by float rd_ratio (default: 1/16)
* global max pooling can be added to the squeeze aggregation
* customizable activation, normalization, and gate layer
"""
def __init__(
self, channels, rd_ratio=1. / 16, rd_channels=None, rd_divisor=8,
bias=True, act_layer=nn.ReLU, gate_layer='sigmoid'):
super().__init__()
if not rd_channels:
rd_channels = make_divisible(channels * rd_ratio, rd_divisor, round_limit=0.)
self.fc1 = nn.Linear(channels, rd_channels, bias=bias)
self.act = create_act_layer(act_layer, inplace=True)
self.fc2 = nn.Linear(rd_channels, channels, bias=bias)
self.gate = create_act_layer(gate_layer)
def forward(self, x):
        x_se = x.mean((1, 2), keepdim=True)  # FIXME avg dim [1:n-1], don't assume 2D NHWC
x_se = self.fc1(x_se)
x_se = self.act(x_se)
x_se = self.fc2(x_se)
return x * self.gate(x_se)
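Stripped of configuration, `SEModule.forward` is a handful of tensor ops: squeeze with a global average pool, excite through a bottleneck MLP, then gate channel-wise. A minimal sketch with hypothetical sizes (reduction ratio 1/4 here, vs the module's 1/16 default):

```python
import torch
import torch.nn as nn

B, C, H, W = 2, 16, 8, 8
x = torch.randn(B, C, H, W)
fc1 = nn.Conv2d(C, C // 4, kernel_size=1)   # reduce
fc2 = nn.Conv2d(C // 4, C, kernel_size=1)   # expand back

s = x.mean((2, 3), keepdim=True)                # squeeze: global average pool
gate = torch.sigmoid(fc2(torch.relu(fc1(s))))   # excite: per-channel gate in (0, 1)
out = x * gate                                  # recalibrate channels
```

The `add_maxpool` option in the module simply blends a global max pool into the squeeze step before the bottleneck.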
hf_public_repos/pytorch-image-models/timm/layers/pos_embed_rel.py
""" Relative position embedding modules and functions
Hacked together by / Copyright 2022 Ross Wightman
"""
import math
import os
from typing import Optional, Tuple
import torch
import torch.nn as nn
import torch.nn.functional as F
from .interpolate import RegularGridInterpolator
from .mlp import Mlp
from .weight_init import trunc_normal_
_USE_SCIPY = int(os.environ.get('TIMM_USE_SCIPY_INTERP', 0)) > 0
def gen_relative_position_index(
q_size: Tuple[int, int],
k_size: Optional[Tuple[int, int]] = None,
class_token: bool = False,
) -> torch.Tensor:
# Adapted with significant modifications from Swin / BeiT codebases
# get pair-wise relative position index for each token inside the window
assert k_size is None, 'Different q & k sizes not currently supported' # FIXME
coords = torch.stack(
torch.meshgrid([
torch.arange(q_size[0]),
torch.arange(q_size[1])
])
).flatten(1) # 2, Wh, Ww
relative_coords = coords[:, :, None] - coords[:, None, :] # 2, Wh*Ww, Wh*Ww
relative_coords = relative_coords.permute(1, 2, 0) # Qh*Qw, Kh*Kw, 2
relative_coords[:, :, 0] += q_size[0] - 1 # shift to start from 0
relative_coords[:, :, 1] += q_size[1] - 1
relative_coords[:, :, 0] *= 2 * q_size[1] - 1
num_relative_distance = (2 * q_size[0] - 1) * (2 * q_size[1] - 1)
# else:
# # FIXME different q vs k sizes is a WIP, need to better offset the two grids?
# q_coords = torch.stack(
# torch.meshgrid([
# torch.arange(q_size[0]),
# torch.arange(q_size[1])
# ])
# ).flatten(1) # 2, Wh, Ww
# k_coords = torch.stack(
# torch.meshgrid([
# torch.arange(k_size[0]),
# torch.arange(k_size[1])
# ])
# ).flatten(1)
# relative_coords = q_coords[:, :, None] - k_coords[:, None, :] # 2, Wh*Ww, Wh*Ww
# relative_coords = relative_coords.permute(1, 2, 0) # Qh*Qw, Kh*Kw, 2
# relative_coords[:, :, 0] += max(q_size[0], k_size[0]) - 1 # shift to start from 0
# relative_coords[:, :, 1] += max(q_size[1], k_size[1]) - 1
# relative_coords[:, :, 0] *= k_size[1] + q_size[1] - 1
# relative_position_index = relative_coords.sum(-1) # Qh*Qw, Kh*Kw
# num_relative_distance = (q_size[0] + k_size[0] - 1) * (q_size[1] + k_size[1] - 1) + 3
relative_position_index = relative_coords.sum(-1) # Wh*Ww, Wh*Ww
if class_token:
# handle cls to token & token 2 cls & cls to cls as per beit for rel pos bias
# NOTE not intended or tested with MLP log-coords
relative_position_index = F.pad(relative_position_index, [1, 0, 1, 0])
relative_position_index[0, 0:] = num_relative_distance
relative_position_index[0:, 0] = num_relative_distance + 1
relative_position_index[0, 0] = num_relative_distance + 2
return relative_position_index.contiguous()
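For a tiny window the index construction above can be checked by hand; this sketch mirrors the function body for a 2x2 window (no class token) and confirms every index fits inside a (2h-1)(2w-1) bias table, with all diagonal (zero-offset) entries mapping to the same table row:

```python
import torch

wh, ww = 2, 2
coords = torch.stack(torch.meshgrid(
    torch.arange(wh), torch.arange(ww), indexing='ij')).flatten(1)  # (2, wh*ww)
rel = (coords[:, :, None] - coords[:, None, :]).permute(1, 2, 0)    # (N, N, 2)
rel[:, :, 0] += wh - 1                      # shift offsets to start at 0
rel[:, :, 1] += ww - 1
rel[:, :, 0] *= 2 * ww - 1                  # row-major flatten of (dh, dw)
index = rel.sum(-1)                         # (wh*ww, wh*ww) table indices
```

Tokens with the same relative offset share a table entry, which is what makes the bias translation-invariant within the window.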
def resize_rel_pos_bias_table_simple(
rel_pos_bias,
new_window_size: Tuple[int, int],
new_bias_shape: Tuple[int, ...],
):
dst_size = (new_window_size[0] * 2 - 1, new_window_size[1] * 2 - 1)
if rel_pos_bias.ndim == 3:
# TF maxvit style (num_heads, H, W) bias shape, no extra tokens currently supported
_, dst_h, dst_w = new_bias_shape
num_attn_heads, src_h, src_w = rel_pos_bias.shape
assert dst_h == dst_size[0] and dst_w == dst_size[1]
if src_h != dst_h or src_w != dst_w:
rel_pos_bias = torch.nn.functional.interpolate(
rel_pos_bias.unsqueeze(0),
size=dst_size,
mode="bicubic",
align_corners=False,
).squeeze(0)
else:
assert rel_pos_bias.ndim == 2
# (num_pos, num_heads) (aka flat) bias shape
dst_num_pos, _ = new_bias_shape
src_num_pos, num_attn_heads = rel_pos_bias.shape
num_extra_tokens = dst_num_pos - (dst_size[0] * dst_size[1])
src_size = int((src_num_pos - num_extra_tokens) ** 0.5)
src_size = (src_size, src_size) # FIXME could support non-equal src if argument passed
if src_size[0] != dst_size[0] or src_size[1] != dst_size[1]:
if num_extra_tokens:
extra_tokens = rel_pos_bias[-num_extra_tokens:, :]
rel_pos_bias = rel_pos_bias[:-num_extra_tokens, :]
else:
extra_tokens = None
rel_pos_bias = torch.nn.functional.interpolate(
rel_pos_bias.transpose(1, 0).reshape((1, -1, src_size[0], src_size[1])),
size=dst_size,
mode="bicubic",
align_corners=False,
).view(-1, dst_num_pos - num_extra_tokens).transpose(0, 1)
if extra_tokens is not None:
rel_pos_bias = torch.cat((rel_pos_bias, extra_tokens), dim=0)
return rel_pos_bias
def resize_rel_pos_bias_table_levit(
position_bias_table,
new_size,
interpolation: str = 'bicubic',
antialias: bool = True,
):
"""
Resample relative position bias table suggested in LeVit
Adapted from: https://github.com/microsoft/Cream/blob/main/TinyViT/utils.py
"""
L1, nH1 = position_bias_table.size()
L2, nH2 = new_size
assert nH1 == nH2
if L1 != L2:
orig_dtype = position_bias_table.dtype
position_bias_table = position_bias_table.float()
# bicubic interpolate relative_position_bias_table if not match
S1 = int(L1 ** 0.5)
S2 = int(L2 ** 0.5)
relative_position_bias_table_resized = F.interpolate(
position_bias_table.permute(1, 0).view(1, nH1, S1, S1),
size=(S2, S2),
mode=interpolation,
antialias=antialias)
relative_position_bias_table_resized = \
relative_position_bias_table_resized.view(nH2, L2).permute(1, 0)
        relative_position_bias_table_resized = relative_position_bias_table_resized.to(orig_dtype)
return relative_position_bias_table_resized
else:
return position_bias_table
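The resize above is a straightforward bicubic interpolation once the flat table is reshaped to a 2D grid per head; a sketch with made-up sizes (no extra tokens, requires PyTorch >= 1.11 for `antialias`):

```python
import torch
import torch.nn.functional as F

nH, S1, S2 = 4, 7, 13                        # heads, old / new window side
table = torch.randn(S1 * S1, nH)             # flat (L, nH) bias table
resized = F.interpolate(
    table.permute(1, 0).view(1, nH, S1, S1),  # (1, nH, S1, S1) grid per head
    size=(S2, S2), mode='bicubic', antialias=True)
resized = resized.view(nH, S2 * S2).permute(1, 0)  # back to flat (L2, nH)
```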
def resize_rel_pos_bias_table(
rel_pos_bias,
new_window_size: Tuple[int, int],
new_bias_shape: Tuple[int, ...],
):
""" Resize relative position bias table using more advanced interpolation.
Modified from code in Microsoft Unilm (https://github.com/microsoft/unilm) repo (BeiT, BeiT-v2, etc).
https://github.com/microsoft/unilm/blob/5255d52de86dad642810f5849dd357769346c1d7/beit/run_class_finetuning.py#L351
Args:
rel_pos_bias:
new_window_size:
new_bias_shape:
Returns:
"""
if _USE_SCIPY:
from scipy import interpolate
dst_size = (new_window_size[0] * 2 - 1, new_window_size[1] * 2 - 1)
if rel_pos_bias.ndim == 3:
# TF maxvit style (num_heads, H, W) bias shape, no extra tokens currently supported
num_extra_tokens = 0
_, dst_h, dst_w = new_bias_shape
assert dst_h == dst_size[0] and dst_w == dst_size[1]
num_attn_heads, src_h, src_w = rel_pos_bias.shape
src_size = (src_h, src_w)
has_flat_shape = False
else:
assert rel_pos_bias.ndim == 2
# (num_pos, num_heads) (aka flat) bias shape
dst_num_pos, _ = new_bias_shape
src_num_pos, num_attn_heads = rel_pos_bias.shape
num_extra_tokens = dst_num_pos - (dst_size[0] * dst_size[1])
src_size = int((src_num_pos - num_extra_tokens) ** 0.5)
src_size = (src_size, src_size)
has_flat_shape = True
if src_size[0] != dst_size[0] or src_size[1] != dst_size[1]:
# print("Interpolating position from %dx%d to %dx%d" % (src_size[0], src_size[1], dst_size[0], dst_size[1]))
if num_extra_tokens:
extra_tokens = rel_pos_bias[-num_extra_tokens:, :]
rel_pos_bias = rel_pos_bias[:-num_extra_tokens, :]
else:
extra_tokens = None
def geometric_progression(a, r, n):
return a * (1.0 - r ** n) / (1.0 - r)
def _calc(src, dst):
left, right = 1.01, 1.5
while right - left > 1e-6:
q = (left + right) / 2.0
gp = geometric_progression(1, q, src // 2)
if gp > dst // 2:
right = q
else:
left = q
dis = []
cur = 1
for i in range(src // 2):
dis.append(cur)
cur += q ** (i + 1)
r_ids = [-_ for _ in reversed(dis)]
return r_ids + [0] + dis
y = _calc(src_size[0], dst_size[0])
x = _calc(src_size[1], dst_size[1])
yx = [torch.tensor(y), torch.tensor(x)]
# print("Original positions = %s" % str(x))
ty = dst_size[0] // 2.0
tx = dst_size[1] // 2.0
dy = torch.arange(-ty, ty + 0.1, 1.0)
dx = torch.arange(-tx, tx + 0.1, 1.0)
dyx = torch.meshgrid([dy, dx])
# print("Target positions = %s" % str(dx))
all_rel_pos_bias = []
for i in range(num_attn_heads):
if has_flat_shape:
z = rel_pos_bias[:, i].view(src_size[0], src_size[1]).float()
else:
z = rel_pos_bias[i, :, :].float()
if _USE_SCIPY:
# Original beit code uses scipy w/ cubic interpolation
f = interpolate.interp2d(x, y, z.numpy(), kind='cubic')
r = torch.Tensor(f(dx, dy)).contiguous().to(rel_pos_bias.device)
else:
# Without scipy dependency, I've found a reasonably simple impl
# that supports uneven spaced interpolation pts with 'linear' interp.
# Results are comparable to scipy for model accuracy in most cases.
f = RegularGridInterpolator(yx, z)
r = f(dyx).contiguous().to(rel_pos_bias.device)
if has_flat_shape:
r = r.view(-1, 1)
all_rel_pos_bias.append(r)
if has_flat_shape:
rel_pos_bias = torch.cat(all_rel_pos_bias, dim=-1)
else:
rel_pos_bias = torch.cat(all_rel_pos_bias, dim=0)
if extra_tokens is not None:
assert has_flat_shape
rel_pos_bias = torch.cat((rel_pos_bias, extra_tokens), dim=0)
return rel_pos_bias
class RelPosBias(nn.Module):
""" Relative Position Bias
Adapted from Swin-V1 relative position bias impl, modularized.
"""
def __init__(self, window_size, num_heads, prefix_tokens=0):
super().__init__()
assert prefix_tokens <= 1
self.window_size = window_size
self.window_area = window_size[0] * window_size[1]
self.bias_shape = (self.window_area + prefix_tokens,) * 2 + (num_heads,)
num_relative_distance = (2 * window_size[0] - 1) * (2 * window_size[1] - 1) + 3 * prefix_tokens
self.relative_position_bias_table = nn.Parameter(torch.zeros(num_relative_distance, num_heads))
self.register_buffer(
"relative_position_index",
gen_relative_position_index(self.window_size, class_token=prefix_tokens > 0).view(-1),
persistent=False,
)
self.init_weights()
def init_weights(self):
trunc_normal_(self.relative_position_bias_table, std=.02)
def get_bias(self) -> torch.Tensor:
relative_position_bias = self.relative_position_bias_table[self.relative_position_index]
# win_h * win_w, win_h * win_w, num_heads
relative_position_bias = relative_position_bias.view(self.bias_shape).permute(2, 0, 1)
return relative_position_bias.unsqueeze(0).contiguous()
def forward(self, attn, shared_rel_pos: Optional[torch.Tensor] = None):
return attn + self.get_bias()
def gen_relative_log_coords(
win_size: Tuple[int, int],
pretrained_win_size: Tuple[int, int] = (0, 0),
mode='swin',
):
assert mode in ('swin', 'cr')
# as per official swin-v2 impl, supporting timm specific 'cr' log coords as well
relative_coords_h = torch.arange(-(win_size[0] - 1), win_size[0], dtype=torch.float32)
relative_coords_w = torch.arange(-(win_size[1] - 1), win_size[1], dtype=torch.float32)
relative_coords_table = torch.stack(torch.meshgrid([relative_coords_h, relative_coords_w]))
relative_coords_table = relative_coords_table.permute(1, 2, 0).contiguous() # 2*Wh-1, 2*Ww-1, 2
if mode == 'swin':
if pretrained_win_size[0] > 0:
relative_coords_table[:, :, 0] /= (pretrained_win_size[0] - 1)
relative_coords_table[:, :, 1] /= (pretrained_win_size[1] - 1)
else:
relative_coords_table[:, :, 0] /= (win_size[0] - 1)
relative_coords_table[:, :, 1] /= (win_size[1] - 1)
relative_coords_table *= 8 # normalize to -8, 8
relative_coords_table = torch.sign(relative_coords_table) * torch.log2(
1.0 + relative_coords_table.abs()) / math.log2(8)
else:
# mode == 'cr'
relative_coords_table = torch.sign(relative_coords_table) * torch.log(
1.0 + relative_coords_table.abs())
return relative_coords_table
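The 'swin' branch maps raw integer offsets into a smooth, roughly unit-scaled range before they are fed to the MLP; a sketch for a square 4x4 window (the per-axis normalization is shown for the square case only):

```python
import math
import torch

win = 4
coords = torch.arange(-(win - 1), win, dtype=torch.float32)
tbl = torch.stack(torch.meshgrid(coords, coords, indexing='ij')).permute(1, 2, 0)
tbl = tbl / (win - 1) * 8                    # normalize offsets to [-8, 8]
tbl = torch.sign(tbl) * torch.log2(1.0 + tbl.abs()) / math.log2(8)  # log-spaced coords
```

The log spacing compresses large offsets, which is what lets a model trained at one window size extrapolate to larger windows.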
class RelPosMlp(nn.Module):
""" Log-Coordinate Relative Position MLP
Based on ideas presented in Swin-V2 paper (https://arxiv.org/abs/2111.09883)
    This impl covers the 'swin' implementation as well as the timm specific 'cr' mode
"""
def __init__(
self,
window_size,
num_heads=8,
hidden_dim=128,
prefix_tokens=0,
mode='cr',
pretrained_window_size=(0, 0)
):
super().__init__()
self.window_size = window_size
self.window_area = self.window_size[0] * self.window_size[1]
self.prefix_tokens = prefix_tokens
self.num_heads = num_heads
self.bias_shape = (self.window_area,) * 2 + (num_heads,)
if mode == 'swin':
self.bias_act = nn.Sigmoid()
self.bias_gain = 16
mlp_bias = (True, False)
else:
self.bias_act = nn.Identity()
self.bias_gain = None
mlp_bias = True
self.mlp = Mlp(
2, # x, y
hidden_features=hidden_dim,
out_features=num_heads,
act_layer=nn.ReLU,
bias=mlp_bias,
drop=(0.125, 0.)
)
self.register_buffer(
"relative_position_index",
gen_relative_position_index(window_size).view(-1),
persistent=False)
# get relative_coords_table
self.register_buffer(
"rel_coords_log",
gen_relative_log_coords(window_size, pretrained_window_size, mode=mode),
persistent=False)
def get_bias(self) -> torch.Tensor:
relative_position_bias = self.mlp(self.rel_coords_log)
if self.relative_position_index is not None:
relative_position_bias = relative_position_bias.view(-1, self.num_heads)[self.relative_position_index]
relative_position_bias = relative_position_bias.view(self.bias_shape)
relative_position_bias = relative_position_bias.permute(2, 0, 1)
relative_position_bias = self.bias_act(relative_position_bias)
if self.bias_gain is not None:
relative_position_bias = self.bias_gain * relative_position_bias
if self.prefix_tokens:
relative_position_bias = F.pad(relative_position_bias, [self.prefix_tokens, 0, self.prefix_tokens, 0])
return relative_position_bias.unsqueeze(0).contiguous()
def forward(self, attn, shared_rel_pos: Optional[torch.Tensor] = None):
return attn + self.get_bias()
def generate_lookup_tensor(
length: int,
max_relative_position: Optional[int] = None,
):
"""Generate a one_hot lookup tensor to reindex embeddings along one dimension.
Args:
length: the length to reindex to.
max_relative_position: the maximum relative position to consider.
Relative position embeddings for distances above this threshold
are zeroed out.
Returns:
a lookup Tensor of size [length, length, vocab_size] that satisfies
ret[n,m,v] = 1{m - n + max_relative_position = v}.
"""
if max_relative_position is None:
max_relative_position = length - 1
# Return the cached lookup tensor, otherwise compute it and cache it.
vocab_size = 2 * max_relative_position + 1
ret = torch.zeros(length, length, vocab_size)
for i in range(length):
for x in range(length):
v = x - i + max_relative_position
if abs(x - i) > max_relative_position:
continue
ret[i, x, v] = 1
return ret
def reindex_2d_einsum_lookup(
relative_position_tensor,
height: int,
width: int,
height_lookup: torch.Tensor,
width_lookup: torch.Tensor,
) -> torch.Tensor:
"""Reindex 2d relative position bias with 2 independent einsum lookups.
Adapted from:
https://github.com/google-research/maxvit/blob/2e06a7f1f70c76e64cd3dabe5cd1b8c1a23c9fb7/maxvit/models/attention_utils.py
Args:
relative_position_tensor: tensor of shape
[..., vocab_height, vocab_width, ...].
height: height to reindex to.
width: width to reindex to.
height_lookup: one-hot height lookup
width_lookup: one-hot width lookup
Returns:
reindexed_tensor: a Tensor of shape
[..., height * width, height * width, ...]
"""
reindexed_tensor = torch.einsum('nhw,ixh->nixw', relative_position_tensor, height_lookup)
reindexed_tensor = torch.einsum('nixw,jyw->nijxy', reindexed_tensor, width_lookup)
area = height * width
return reindexed_tensor.reshape(relative_position_tensor.shape[0], area, area)
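An end-to-end sketch of the two-einsum reindexing above, using a small standalone copy of the one-hot lookup (the helper name `gen_lookup` is ours). The spot check verifies that the flattened `[q, k]` entry picks out the expected `(vocab_height, vocab_width)` slot of the bias table:

```python
import torch

# Reindex a [heads, 2H-1, 2W-1] bias table to [heads, H*W, H*W] with the
# same two einsum contractions as reindex_2d_einsum_lookup above.
def gen_lookup(length: int) -> torch.Tensor:
    max_rel = length - 1
    ret = torch.zeros(length, length, 2 * max_rel + 1)
    for i in range(length):
        for x in range(length):
            ret[i, x, x - i + max_rel] = 1
    return ret

H, W, heads = 2, 3, 4
table = torch.randn(heads, 2 * H - 1, 2 * W - 1)   # [n, vocab_h, vocab_w]
h_lut, w_lut = gen_lookup(H), gen_lookup(W)

bias = torch.einsum('nhw,ixh->nixw', table, h_lut)
bias = torch.einsum('nixw,jyw->nijxy', bias, w_lut)
bias = bias.reshape(heads, H * W, H * W)
assert bias.shape == (heads, 6, 6)
# Query (0,0) vs key (1,2): picks table[:, 1-0+(H-1), 2-0+(W-1)] = table[:, 2, 4].
assert torch.allclose(bias[:, 0, 5], table[:, 2, 4])
```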
class RelPosBiasTf(nn.Module):
""" Relative Position Bias Impl (Compatible with Tensorflow MaxViT models)
Adapted from:
https://github.com/google-research/maxvit/blob/2e06a7f1f70c76e64cd3dabe5cd1b8c1a23c9fb7/maxvit/models/attention_utils.py
"""
def __init__(self, window_size, num_heads, prefix_tokens=0):
super().__init__()
assert prefix_tokens <= 1
self.window_size = window_size
self.window_area = window_size[0] * window_size[1]
self.num_heads = num_heads
vocab_height = 2 * window_size[0] - 1
vocab_width = 2 * window_size[1] - 1
self.bias_shape = (self.num_heads, vocab_height, vocab_width)
self.relative_position_bias_table = nn.Parameter(torch.zeros(self.bias_shape))
self.register_buffer('height_lookup', generate_lookup_tensor(window_size[0]), persistent=False)
self.register_buffer('width_lookup', generate_lookup_tensor(window_size[1]), persistent=False)
self.init_weights()
def init_weights(self):
nn.init.normal_(self.relative_position_bias_table, std=.02)
def get_bias(self) -> torch.Tensor:
# FIXME change to not use one-hot/einsum?
return reindex_2d_einsum_lookup(
self.relative_position_bias_table,
self.window_size[0],
self.window_size[1],
self.height_lookup,
self.width_lookup
)
def forward(self, attn, shared_rel_pos: Optional[torch.Tensor] = None):
return attn + self.get_bias()
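A shape-only sketch of how the bias from `get_bias()` is consumed in `forward`: a `[num_heads, win_area, win_area]` bias broadcasts over the batch/window dimension of the attention logits. Values are random placeholders:

```python
import torch

# The per-head window bias broadcasts across batched windows when added
# to attention logits, as in forward() above.
heads, area = 4, 9                          # e.g. a 3x3 window
attn = torch.randn(8, heads, area, area)    # [batch * windows, heads, q, k]
bias = torch.randn(heads, area, area)
assert (attn + bias).shape == attn.shape    # broadcasts over the leading dim
```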
# File: hf_public_repos/pytorch-image-models/timm/layers/interpolate.py
""" Interpolation helpers for timm layers
RegularGridInterpolator from https://github.com/sbarratt/torch_interpolations
Copyright Shane Barratt, Apache 2.0 license
"""
import torch
from itertools import product
class RegularGridInterpolator:
""" Interpolate data defined on a rectilinear grid with even or uneven spacing.
Produces similar results to scipy RegularGridInterpolator or interp2d
in 'linear' mode.
Taken from https://github.com/sbarratt/torch_interpolations
"""
def __init__(self, points, values):
self.points = points
self.values = values
assert isinstance(self.points, tuple) or isinstance(self.points, list)
assert isinstance(self.values, torch.Tensor)
self.ms = list(self.values.shape)
self.n = len(self.points)
assert len(self.ms) == self.n
for i, p in enumerate(self.points):
assert isinstance(p, torch.Tensor)
assert p.shape[0] == self.values.shape[i]
def __call__(self, points_to_interp):
assert self.points is not None
assert self.values is not None
assert len(points_to_interp) == len(self.points)
K = points_to_interp[0].shape[0]
for x in points_to_interp:
assert x.shape[0] == K
idxs = []
dists = []
overalls = []
for p, x in zip(self.points, points_to_interp):
idx_right = torch.bucketize(x, p)
idx_right[idx_right >= p.shape[0]] = p.shape[0] - 1
idx_left = (idx_right - 1).clamp(0, p.shape[0] - 1)
dist_left = x - p[idx_left]
dist_right = p[idx_right] - x
dist_left[dist_left < 0] = 0.
dist_right[dist_right < 0] = 0.
both_zero = (dist_left == 0) & (dist_right == 0)
dist_left[both_zero] = dist_right[both_zero] = 1.
idxs.append((idx_left, idx_right))
dists.append((dist_left, dist_right))
overalls.append(dist_left + dist_right)
numerator = 0.
for indexer in product([0, 1], repeat=self.n):
as_s = [idx[onoff] for onoff, idx in zip(indexer, idxs)]
bs_s = [dist[1 - onoff] for onoff, dist in zip(indexer, dists)]
numerator += self.values[as_s] * \
torch.prod(torch.stack(bs_s), dim=0)
denominator = torch.prod(torch.stack(overalls), dim=0)
return numerator / denominator
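A simplified 1-D version of the lookup logic above, showing the core mechanism: `torch.bucketize` locates the bracketing grid points, and the two distances weight a linear blend (the helper name `interp1d` is ours, not part of the file):

```python
import torch

# 1-D slice of RegularGridInterpolator's per-dimension logic, assuming a
# sorted grid xp with values fp; handles uneven spacing and exact hits.
def interp1d(xp: torch.Tensor, fp: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
    idx_right = torch.bucketize(x, xp).clamp(max=xp.shape[0] - 1)
    idx_left = (idx_right - 1).clamp(min=0)
    d_left = (x - xp[idx_left]).clamp(min=0)
    d_right = (xp[idx_right] - x).clamp(min=0)
    both_zero = (d_left == 0) & (d_right == 0)
    d_left[both_zero] = d_right[both_zero] = 1.  # avoid 0/0 on exact grid hits
    num = fp[idx_left] * d_right + fp[idx_right] * d_left
    return num / (d_left + d_right)

xp = torch.tensor([0., 1., 3.])      # uneven grid spacing is fine
fp = torch.tensor([0., 10., 30.])
out = interp1d(xp, fp, torch.tensor([0.5, 2.0]))
assert torch.allclose(out, torch.tensor([5.0, 20.0]))
```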
# File: hf_public_repos/pytorch-image-models/timm/layers/classifier.py
""" Classifier head and layer factory
Hacked together by / Copyright 2020 Ross Wightman
"""
from collections import OrderedDict
from functools import partial
from typing import Optional, Union, Callable
import torch
import torch.nn as nn
from torch.nn import functional as F
from .adaptive_avgmax_pool import SelectAdaptivePool2d
from .create_act import get_act_layer
from .create_norm import get_norm_layer
def _create_pool(
num_features: int,
num_classes: int,
pool_type: str = 'avg',
use_conv: bool = False,
input_fmt: Optional[str] = None,
):
flatten_in_pool = not use_conv # flatten when we use a Linear layer after pooling
if not pool_type:
assert num_classes == 0 or use_conv,\
'Pooling can only be disabled if classifier is also removed or conv classifier is used'
flatten_in_pool = False # disable flattening if pooling is pass-through (no pooling)
global_pool = SelectAdaptivePool2d(
pool_type=pool_type,
flatten=flatten_in_pool,
input_fmt=input_fmt,
)
num_pooled_features = num_features * global_pool.feat_mult()
return global_pool, num_pooled_features
def _create_fc(num_features, num_classes, use_conv=False):
if num_classes <= 0:
fc = nn.Identity() # pass-through (no classifier)
elif use_conv:
fc = nn.Conv2d(num_features, num_classes, 1, bias=True)
else:
fc = nn.Linear(num_features, num_classes, bias=True)
return fc
def create_classifier(
num_features: int,
num_classes: int,
pool_type: str = 'avg',
use_conv: bool = False,
input_fmt: str = 'NCHW',
drop_rate: Optional[float] = None,
):
global_pool, num_pooled_features = _create_pool(
num_features,
num_classes,
pool_type,
use_conv=use_conv,
input_fmt=input_fmt,
)
fc = _create_fc(
num_pooled_features,
num_classes,
use_conv=use_conv,
)
if drop_rate is not None:
dropout = nn.Dropout(drop_rate)
return global_pool, dropout, fc
return global_pool, fc
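A hedged sketch of what `create_classifier` assembles for the two common cases, using plain torch stand-ins rather than the timm classes: default `'avg'` pooling feeds a flattened feature vector into `nn.Linear`, while `use_conv=True` keeps spatial dims and uses a 1x1 `nn.Conv2d` instead:

```python
import torch
import torch.nn as nn

# Plain-torch stand-ins for the pool + fc pair returned above
# (illustrative only; timm uses SelectAdaptivePool2d internally).
global_pool = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(1))
fc = nn.Linear(512, 1000)

x = torch.randn(2, 512, 7, 7)          # NCHW feature map
logits = fc(global_pool(x))
assert logits.shape == (2, 1000)

# use_conv=True path: no flatten, classifier is a 1x1 conv
conv_fc = nn.Conv2d(512, 1000, 1)
pooled = nn.AdaptiveAvgPool2d(1)(x)
assert conv_fc(pooled).shape == (2, 1000, 1, 1)
```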
class ClassifierHead(nn.Module):
"""Classifier head w/ configurable global pooling and dropout."""
def __init__(
self,
in_features: int,
num_classes: int,
pool_type: str = 'avg',
drop_rate: float = 0.,
use_conv: bool = False,
input_fmt: str = 'NCHW',
):
"""
Args:
in_features: The number of input features.
num_classes: The number of classes for the final classifier layer (output).
pool_type: Global pooling type, pooling disabled if empty string ('').
drop_rate: Pre-classifier dropout rate.
"""
super(ClassifierHead, self).__init__()
self.in_features = in_features
self.use_conv = use_conv
self.input_fmt = input_fmt
global_pool, fc = create_classifier(
in_features,
num_classes,
pool_type,
use_conv=use_conv,
input_fmt=input_fmt,
)
self.global_pool = global_pool
self.drop = nn.Dropout(drop_rate)
self.fc = fc
self.flatten = nn.Flatten(1) if use_conv and pool_type else nn.Identity()
def reset(self, num_classes, pool_type=None):
if pool_type is not None and pool_type != self.global_pool.pool_type:
self.global_pool, self.fc = create_classifier(
self.in_features,
num_classes,
pool_type=pool_type,
use_conv=self.use_conv,
input_fmt=self.input_fmt,
)
self.flatten = nn.Flatten(1) if self.use_conv and pool_type else nn.Identity()
else:
num_pooled_features = self.in_features * self.global_pool.feat_mult()
self.fc = _create_fc(
num_pooled_features,
num_classes,
use_conv=self.use_conv,
)
def forward(self, x, pre_logits: bool = False):
x = self.global_pool(x)
x = self.drop(x)
if pre_logits:
return self.flatten(x)
x = self.fc(x)
return self.flatten(x)
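A minimal stand-in mirroring the `pre_logits` path of `ClassifierHead.forward` above (the class name `TinyHead` is ours, plain torch, not the timm class): `pre_logits=True` returns pooled features before the final classifier layer.

```python
import torch
import torch.nn as nn

class TinyHead(nn.Module):
    """Simplified pool -> dropout -> fc head with a pre_logits escape hatch."""
    def __init__(self, in_features: int, num_classes: int, drop_rate: float = 0.):
        super().__init__()
        self.global_pool = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(1))
        self.drop = nn.Dropout(drop_rate)
        self.fc = nn.Linear(in_features, num_classes)

    def forward(self, x, pre_logits: bool = False):
        x = self.drop(self.global_pool(x))
        return x if pre_logits else self.fc(x)

head = TinyHead(64, 10)
feats = torch.randn(2, 64, 4, 4)
assert head(feats).shape == (2, 10)                   # logits
assert head(feats, pre_logits=True).shape == (2, 64)  # pooled features
```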
class NormMlpClassifierHead(nn.Module):
def __init__(
self,
in_features: int,
num_classes: int,
hidden_size: Optional[int] = None,
pool_type: str = 'avg',
drop_rate: float = 0.,
norm_layer: Union[str, Callable] = 'layernorm2d',
act_layer: Union[str, Callable] = 'tanh',
):
"""
Args:
in_features: The number of input features.
num_classes: The number of classes for the final classifier layer (output).
hidden_size: The hidden size of the MLP (pre-logits FC layer) if not None.
pool_type: Global pooling type, pooling disabled if empty string ('').
drop_rate: Pre-classifier dropout rate.
norm_layer: Normalization layer type.
act_layer: MLP activation layer type (only used if hidden_size is not None).
"""
super().__init__()
self.in_features = in_features
self.hidden_size = hidden_size
self.num_features = in_features
self.use_conv = not pool_type
norm_layer = get_norm_layer(norm_layer)
act_layer = get_act_layer(act_layer)
linear_layer = partial(nn.Conv2d, kernel_size=1) if self.use_conv else nn.Linear
self.global_pool = SelectAdaptivePool2d(pool_type=pool_type)
self.norm = norm_layer(in_features)
self.flatten = nn.Flatten(1) if pool_type else nn.Identity()
if hidden_size:
self.pre_logits = nn.Sequential(OrderedDict([
('fc', linear_layer(in_features, hidden_size)),
('act', act_layer()),
]))
self.num_features = hidden_size
else:
self.pre_logits = nn.Identity()
self.drop = nn.Dropout(drop_rate)
self.fc = linear_layer(self.num_features, num_classes) if num_classes > 0 else nn.Identity()
def reset(self, num_classes, global_pool=None):
if global_pool is not None:
self.global_pool = SelectAdaptivePool2d(pool_type=global_pool)
self.flatten = nn.Flatten(1) if global_pool else nn.Identity()
self.use_conv = self.global_pool.is_identity()
linear_layer = partial(nn.Conv2d, kernel_size=1) if self.use_conv else nn.Linear
if self.hidden_size:
if ((isinstance(self.pre_logits.fc, nn.Conv2d) and not self.use_conv) or
(isinstance(self.pre_logits.fc, nn.Linear) and self.use_conv)):
with torch.no_grad():
new_fc = linear_layer(self.in_features, self.hidden_size)
new_fc.weight.copy_(self.pre_logits.fc.weight.reshape(new_fc.weight.shape))
new_fc.bias.copy_(self.pre_logits.fc.bias)
self.pre_logits.fc = new_fc
self.fc = linear_layer(self.num_features, num_classes) if num_classes > 0 else nn.Identity()
def forward(self, x, pre_logits: bool = False):
x = self.global_pool(x)
x = self.norm(x)
x = self.flatten(x)
x = self.pre_logits(x)
x = self.drop(x)
if pre_logits:
return x
x = self.fc(x)
return x
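A shape-level sketch of the ordering in `NormMlpClassifierHead.forward` above, with plain torch stand-ins (assumed sizes, and `nn.LayerNorm` over trailing dims standing in for `'layernorm2d'`): pool, then norm, then flatten, then the optional hidden MLP, then the classifier.

```python
import torch
import torch.nn as nn

# pool -> norm -> flatten -> pre_logits MLP -> fc, as in forward() above.
in_features, hidden, num_classes = 32, 16, 5
pool = nn.AdaptiveAvgPool2d(1)
norm = nn.LayerNorm([in_features, 1, 1])   # stand-in for 'layernorm2d'
pre_logits = nn.Sequential(nn.Linear(in_features, hidden), nn.Tanh())
fc = nn.Linear(hidden, num_classes)

x = torch.randn(2, in_features, 7, 7)
x = norm(pool(x)).flatten(1)
out = fc(pre_logits(x))
assert out.shape == (2, 5)
```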