mlx-community/siglip2-base-patch16-224-8bit
This model, mlx-community/siglip2-base-patch16-224-8bit, was converted to MLX format from google/siglip2-base-patch16-224 and quantized to 8 bits.
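A minimal sketch of fetching the converted weights locally with the `huggingface_hub` package (an assumption; any Hugging Face download method works). Loading the weights for inference would then use an MLX-compatible library.

```python
# Sketch: download the MLX-converted repository to a local cache directory.
# Assumes `huggingface_hub` is installed (pip install huggingface_hub).
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="mlx-community/siglip2-base-patch16-224-8bit"
)
print(local_dir)  # path to the directory holding the weights and config
```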
Model tree for mlx-community/siglip2-base-patch16-224-8bit
Base model
google/siglip2-base-patch16-224