# mlx-community/siglip2-base-patch16-224-8bit

This model was converted to MLX format, with 8-bit quantization, from `google/siglip2-base-patch16-224`.
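This card does not include a usage snippet. As a minimal sketch, the converted weights can be fetched from the Hub with `huggingface_hub`'s standard `snapshot_download` API; note that actually running the model afterwards requires an MLX-based SigLIP implementation, which this card does not specify (the `allow_patterns` filter below is just an illustrative choice to fetch the config file only):

```python
from huggingface_hub import snapshot_download

# Fetch files from this repo into the local Hugging Face cache.
# allow_patterns limits the download to config.json for illustration;
# drop it to pull the full 8-bit MLX weights as well.
path = snapshot_download(
    "mlx-community/siglip2-base-patch16-224-8bit",
    allow_patterns=["config.json"],
)
print(path)  # local snapshot directory
```

Loading the downloaded weights into an inference pipeline is left to whichever MLX vision library you use.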
