Instructions for using microsoft/beit-large-patch16-384 with libraries, inference providers, notebooks, and local apps.
- Libraries
- Transformers
How to use microsoft/beit-large-patch16-384 with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("image-classification", model="microsoft/beit-large-patch16-384")
pipe("https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/hub/parrots.png")
```

```python
# Load model directly
from transformers import AutoImageProcessor, AutoModelForImageClassification

processor = AutoImageProcessor.from_pretrained("microsoft/beit-large-patch16-384")
model = AutoModelForImageClassification.from_pretrained("microsoft/beit-large-patch16-384")
```

- Notebooks
- Google Colab
- Kaggle
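The "load model directly" route above stops after instantiating the processor and model. A minimal sketch of completing the forward pass is below; the example image URL is the one from the pipeline snippet, and the top-1 decoding is an illustrative assumption rather than part of the model card:

```python
import requests
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

processor = AutoImageProcessor.from_pretrained("microsoft/beit-large-patch16-384")
model = AutoModelForImageClassification.from_pretrained("microsoft/beit-large-patch16-384")

# Fetch an example image (any RGB image works).
url = "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/hub/parrots.png"
image = Image.open(requests.get(url, stream=True).raw).convert("RGB")

# Preprocess to the model's 384x384 input and run inference without gradients.
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring logit to its human-readable label.
predicted = logits.argmax(-1).item()
print(model.config.id2label[predicted])
```

The pipeline snippet does all of this in one call; the direct route is useful when you want the raw logits or batched inference.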
Commit · d4320bb
Parent(s): d6f5d89
Update preprocessor_config.json
Files changed: preprocessor_config.json (+1 −1)
preprocessor_config.json CHANGED

```diff
@@ -3,7 +3,7 @@
   "do_center_crop": false,
   "do_normalize": true,
   "do_resize": true,
-  "feature_extractor_type": "
+  "feature_extractor_type": "BeitFeatureExtractor",
   "image_mean": [
     0.5,
     0.5,
```
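The commit above completes the previously truncated `feature_extractor_type` value; Transformers consults this field when resolving which processor class to instantiate for the checkpoint. A minimal sketch of what the patched hunk parses to (the fields shown are abridged to those visible in the diff):

```python
import json

# The patched preprocessor_config.json, abridged to the hunk in the commit.
config_text = """
{
  "do_center_crop": false,
  "do_normalize": true,
  "do_resize": true,
  "feature_extractor_type": "BeitFeatureExtractor",
  "image_mean": [0.5, 0.5, 0.5]
}
"""
config = json.loads(config_text)

# AutoImageProcessor reads this key to pick the concrete class.
print(config["feature_extractor_type"])  # BeitFeatureExtractor
```

Before the fix, the truncated value made the file invalid JSON, so `from_pretrained` could not load the preprocessor config at all.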