Instructions for using google/owlvit-base-patch32 with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use google/owlvit-base-patch32 with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("zero-shot-object-detection", model="google/owlvit-base-patch32")

# Load model directly
from transformers import AutoProcessor, AutoModelForZeroShotObjectDetection

processor = AutoProcessor.from_pretrained("google/owlvit-base-patch32")
model = AutoModelForZeroShotObjectDetection.from_pretrained("google/owlvit-base-patch32")
```
- Notebooks
- Google Colab
- Kaggle
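
The loading snippets above can be extended into an end-to-end detection call. A minimal sketch using the pipeline API: the image URL and candidate labels here are illustrative assumptions, not part of the model card.

```python
from transformers import pipeline

# Zero-shot object detection: the model localizes objects described by
# free-text candidate labels supplied at inference time.
pipe = pipeline("zero-shot-object-detection", model="google/owlvit-base-patch32")

# Any local path or URL works; this COCO sample image is an assumption.
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
predictions = pipe(url, candidate_labels=["cat", "remote control"])

for pred in predictions:
    # Each prediction carries a confidence score, the matched label,
    # and a bounding box in absolute pixel coordinates.
    print(pred["label"], round(pred["score"], 3), pred["box"])
```

The pipeline returns one dict per detection, so the same call works unchanged for any other image and label set.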
Commit 7d594ca by Alara Dirik
Parent(s): 171eb6c
Update tokenizer_config.json
Files changed: tokenizer_config.json (+0 −1)
tokenizer_config.json CHANGED
```diff
@@ -22,7 +22,6 @@
   "name_or_path": "openai/clip-vit-base-patch32",
   "pad_token": "!",
   "processor_class": "OwlViTProcessor",
-  "special_tokens_map_file": "/Users/adirik/.cache/huggingface/transformers/18a566598f286c9139f88160c99f84eec492a26bd22738fa9cb44d5b7e0a5c76.cce1206abbad28826f000510f22f354e53e66a97f7c23745a7dfe27609cc07f5",
   "tokenizer_class": "CLIPTokenizer",
   "unk_token": {
     "__type": "AddedToken",
```