Instructions to use google/siglip-large-patch16-384 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use google/siglip-large-patch16-384 with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("zero-shot-image-classification", model="google/siglip-large-patch16-384")
pipe(
    "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/hub/parrots.png",
    candidate_labels=["animals", "humans", "landscape"],
)
```

```python
# Load model directly
from transformers import AutoProcessor, AutoModelForZeroShotImageClassification

processor = AutoProcessor.from_pretrained("google/siglip-large-patch16-384")
model = AutoModelForZeroShotImageClassification.from_pretrained("google/siglip-large-patch16-384")
```
- Notebooks
- Google Colab
- Kaggle
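Putting the two snippets above together, the directly loaded processor and model can run the same zero-shot classification that the pipeline performs. This is a minimal sketch, not the model card's own example: the image URL and candidate labels are illustrative, and the `padding="max_length"` and sigmoid steps follow how SigLIP is described in the Transformers documentation (SigLIP scores each image–text pair independently, so logits go through a sigmoid rather than a softmax).

```python
# Sketch: zero-shot classification with a directly loaded SigLIP model.
# Assumes network access for the weights and the example image.
import requests
import torch
from PIL import Image
from transformers import AutoProcessor, AutoModelForZeroShotImageClassification

processor = AutoProcessor.from_pretrained("google/siglip-large-patch16-384")
model = AutoModelForZeroShotImageClassification.from_pretrained("google/siglip-large-patch16-384")

url = "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/hub/parrots.png"
image = Image.open(requests.get(url, stream=True).raw)
texts = ["a photo of animals", "a photo of humans", "a photo of a landscape"]

# SigLIP was trained with max-length padding, so it is used here as well.
inputs = processor(text=texts, images=image, padding="max_length", return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Independent per-pair scores: apply a sigmoid to the image-text logits.
probs = torch.sigmoid(outputs.logits_per_image)  # shape: (1, len(texts))
print(f"{probs[0][0]:.1%} that the image matches '{texts[0]}'")
```

Because the scores are independent probabilities, they need not sum to one across the candidate labels.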
Update README.md

README.md changed:

````diff
@@ -54,7 +54,7 @@ print(f"{probs[0][0]:.1%} that image 0 is '{texts[0]}'")
 
 Alternatively, one can leverage the pipeline API which abstracts away the complexity for the user:
 
-```
+```python
 from transformers import pipeline
 from PIL import Image
 import requests
````