How to use apple/aimv2-large-patch14-native with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline(
    "image-feature-extraction",
    model="apple/aimv2-large-patch14-native",
    trust_remote_code=True,
)
```
```python
# Load model directly
from transformers import AutoImageProcessor, AutoModel

processor = AutoImageProcessor.from_pretrained(
    "apple/aimv2-large-patch14-native", trust_remote_code=True
)
model = AutoModel.from_pretrained(
    "apple/aimv2-large-patch14-native", trust_remote_code=True
)
```
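Once the processor and model are loaded, extracting features is a single forward pass. A minimal sketch, assuming the remote-code model follows the usual `last_hidden_state` output convention; the gray dummy image below is a stand-in so the snippet is self-contained (any PIL image works):

```python
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModel

processor = AutoImageProcessor.from_pretrained(
    "apple/aimv2-large-patch14-native", trust_remote_code=True
)
model = AutoModel.from_pretrained(
    "apple/aimv2-large-patch14-native", trust_remote_code=True
)

# Dummy RGB image; the "native" variant accepts non-square resolutions.
image = Image.new("RGB", (448, 336), color="gray")

inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Per-patch embeddings; mean-pool for a single image-level feature vector.
features = outputs.last_hidden_state
image_embedding = features.mean(dim=1)
print(features.shape, image_embedding.shape)
```

Mean pooling is one common choice for collapsing patch embeddings into an image-level vector; other pooling schemes work as well.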
How to use apple/aimv2-large-patch14-native with MLX:
```shell
# Download the model from the Hub
pip install "huggingface_hub[hf_xet]"
huggingface-cli download --local-dir aimv2-large-patch14-native apple/aimv2-large-patch14-native
```
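The same download can be done programmatically with `huggingface_hub`'s `snapshot_download`, which is convenient inside scripts; a sketch equivalent to the CLI command above (requires network access on first run):

```python
# Programmatic equivalent of the huggingface-cli download command.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="apple/aimv2-large-patch14-native",
    local_dir="aimv2-large-patch14-native",
)
print(local_dir)  # path to the downloaded snapshot
```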