---
title: README
emoji: 🚀
colorFrom: indigo
colorTo: indigo
sdk: docker
pinned: false
tags:
  - computer-vision
  - pre-trained-models
  - model-hub
license: cc0-1.0
description: A curated collection of links to authentic pre-trained model weights from official sources like PyTorch, TensorFlow Hub, and Hugging Face.
---
# Branes.AI Model Hub
This repository provides links to authentic pre-trained weights for various machine learning models. No weights are hosted here; instead, we point to official sources.
## Available Models
| Model | Parameters | Complexity | Year | Source Link |
|---|---|---|---|---|
| MobileNetV1 | ~4M | Very Low | 2017 | TensorFlow Hub - MobileNet V1 |
| MobileNetV2 | ~3.5M | Very Low | 2018 | TensorFlow Hub - MobileNet V2 |
| EfficientNetV2-B0 | 6.6M | Low | 2021 | Kaggle - timm/tf-efficientnet |
| EfficientNetV2-B1 | 7.8M | Medium | 2021 | Kaggle - timm/tf-efficientnet |
| ResNet-18 | 11.7M | Medium | 2015 | Hugging Face - microsoft/resnet-18 |
| ResNet-34 | 21.8M | Medium | 2015 | Hugging Face - microsoft/resnet-34 |
| ResNet-50 | 25.6M | Medium | 2015 | Hugging Face - microsoft/resnet-50 |
| EfficientNetV2-L | 66M | High | 2021 | Kaggle - timm/tf-efficientnet |
| ViT-Base-16 | 86M | High | 2020 | Hugging Face - ViT-Base-Patch16-224 |
| Swin Transformer-Base | 88M | High | 2021 | Hugging Face - Swin-Base-Patch4-Window7-224 |
| DINO-v2 | 147M | Very High | 2023 | Hugging Face - DINOv2-Base |
| ViT-Large-Patch16 | 304M | Very High | 2020 | Hugging Face - ViT-Large-Patch16-224 |
| DINO-v2 (Large) | 354M | Very High | 2023 | Hugging Face - DINOv2-Large |
| GPT-4 Vision | Billions | Extremely High | 2023 | Not available (Proprietary, OpenAI API) |
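The Hugging Face entries above can be loaded directly by their model IDs with the `transformers` library. A minimal sketch, assuming `transformers`, `torch`, and `Pillow` are installed, using `microsoft/resnet-50` from the table (the dummy black image is just a placeholder for a real photo):

```python
import numpy as np
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "microsoft/resnet-50"  # model ID from the table above
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

# Dummy 224x224 RGB image; replace with a real PIL image in practice.
image = Image.fromarray(np.zeros((224, 224, 3), dtype=np.uint8))
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # one logit per ImageNet-1k class

print(logits.shape)
```

The `AutoImageProcessor` applies the same resizing and normalization the model was trained with, so the weights behave exactly as published.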
## Usage
- Click the links above to access the official weights from their respective sources.
- For Kaggle models, select the specific variant (e.g., B0) from the model page.
- For PyTorch models (e.g., ResNet), use `torchvision.models` with `pretrained=True`.
- For TensorFlow models, use `tf.keras.applications` or TensorFlow Hub APIs.
- For Hugging Face models, use the `transformers` library with the provided model IDs.
## Notes
- Ensure you comply with the licensing terms of each source.
- This repository is a reference hub and does not host the weights directly.