Instructions for using nvidia/mit-b5 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use nvidia/mit-b5 with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("image-classification", model="nvidia/mit-b5")
pipe("https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/hub/parrots.png")
```

```python
# Load model directly
from transformers import AutoImageProcessor, AutoModelForImageClassification

processor = AutoImageProcessor.from_pretrained("nvidia/mit-b5")
model = AutoModelForImageClassification.from_pretrained("nvidia/mit-b5")
```

- Inference
- Notebooks
- Google Colab
- Kaggle
Commit · 9707ed6
Parent(s): 88b4d75
Add TF weights (#1)
- Add TF weights (4dc413caa07a8275f55dc90ab1713f60844ae521)
- tf_model.h5 +3 -0
tf_model.h5 (ADDED)

```diff
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:2e7c5bd05493d9a067335d16606959bce699154c3f628c8805dfaf2bc0080dbb
+size 329337544
```
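The three added lines are a Git LFS pointer file: the repository stores only the `sha256` digest and byte size of `tf_model.h5`, not the 329 MB binary itself. As a minimal sketch of what that format encodes, the snippet below parses a pointer and verifies a downloaded blob against it. The `parse_lfs_pointer` and `matches_pointer` helpers are hypothetical illustrations, not part of any library:

```python
import hashlib

def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file ('key value' per line) into a dict."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

def matches_pointer(blob: bytes, pointer: dict) -> bool:
    """Check that raw bytes match the pointer's declared size and sha256 oid."""
    algo, _, digest = pointer["oid"].partition(":")
    return (len(blob) == int(pointer["size"])
            and algo == "sha256"
            and hashlib.sha256(blob).hexdigest() == digest)

pointer = parse_lfs_pointer(
    "version https://git-lfs.github.com/spec/v1\n"
    "oid sha256:2e7c5bd05493d9a067335d16606959bce699154c3f628c8805dfaf2bc0080dbb\n"
    "size 329337544\n"
)
print(pointer["size"])  # prints 329337544
```

In practice `huggingface_hub` resolves these pointers automatically when downloading; the sketch only shows the integrity check the pointer makes possible.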