Instructions for using facebook/dinov2-large with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use facebook/dinov2-large with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("image-feature-extraction", model="facebook/dinov2-large")

# Load model directly
from transformers import AutoImageProcessor, AutoModel

processor = AutoImageProcessor.from_pretrained("facebook/dinov2-large")
model = AutoModel.from_pretrained("facebook/dinov2-large")
```
- Notebooks
- Google Colab
- Kaggle
Discussion #6 — recommended server to host DINOv2 (opened by alecauduro)
Any recommended web server for hosting DINOv2? Or should I just build a FastAPI REST API wrapper around the Hugging Face Transformers library?