Instructions to use apple/MobileCLIP-S2-OpenCLIP with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- OpenCLIP
How to use apple/MobileCLIP-S2-OpenCLIP with OpenCLIP:

```python
import open_clip

model, preprocess_train, preprocess_val = open_clip.create_model_and_transforms(
    'hf-hub:apple/MobileCLIP-S2-OpenCLIP')
tokenizer = open_clip.get_tokenizer('hf-hub:apple/MobileCLIP-S2-OpenCLIP')
```

- Notebooks
- Google Colab
- Kaggle
Remove outdated license fields from metadata
README.md CHANGED:

```diff
@@ -4,8 +4,6 @@ tags:
 library_name: open_clip
 pipeline_tag: zero-shot-image-classification
 license: apple-amlr
-license_name: apple-ascl
-license_link: https://github.com/apple/ml-mobileclip/blob/main/LICENSE_weights_data
 ---
 
 # MobileCLIP: Fast Image-Text Models through Multi-Modal Reinforced Training
```