---
base_model: AlonBBar/phi4mini-task-assistant
tags:
  - text-generation-inference
  - transformers
  - unsloth
  - phi3
  - mlx
license: apache-2.0
language:
  - en
library_name: mlx
pipeline_tag: text-generation
---