OpenCLIP DataComp Collection
OpenCLIP models trained on DataComp (https://huggingface.co/papers/2304.14108).
How to use laion/CLIP-ViT-B-32-DataComp.M-s128M-b4K with OpenCLIP:
import open_clip

# Load the model plus its train-time and eval-time image preprocessing transforms
model, preprocess_train, preprocess_val = open_clip.create_model_and_transforms('hf-hub:laion/CLIP-ViT-B-32-DataComp.M-s128M-b4K')
# Get the tokenizer matching this model's text tower
tokenizer = open_clip.get_tokenizer('hf-hub:laion/CLIP-ViT-B-32-DataComp.M-s128M-b4K')