---
library_name: mlx
license: apache-2.0
license_link: https://ai.google.dev/gemma/docs/gemma_4_license
pipeline_tag: any-to-any
tags:
- mlx
base_model: google/gemma-4-e2b-it
---
# mlx-community/gemma-4-e2b-it-5bit
This model was converted to MLX format from [`google/gemma-4-e2b-it`](https://huggingface.co/google/gemma-4-e2b-it)
using mlx-vlm version **0.4.3**.
Refer to the [original model card](https://huggingface.co/google/gemma-4-e2b-it) for more details on the model.
## Use with mlx
```bash
pip install -U mlx-vlm
```
```bash
python -m mlx_vlm.generate --model mlx-community/gemma-4-e2b-it-5bit --max-tokens 100 --temperature 0.0 --prompt "Describe this image." --image <path_to_image>
```
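The model can also be used from Python via mlx-vlm's `load`/`generate` API. A minimal sketch, assuming the `apply_chat_template` and `load_config` helpers exposed by recent mlx-vlm releases; the image path is a placeholder, and running this requires Apple silicon plus a local download of the weights:

```python
from mlx_vlm import load, generate
from mlx_vlm.prompt_utils import apply_chat_template
from mlx_vlm.utils import load_config

# Load the 5-bit quantized model and its processor from the Hugging Face Hub.
model_path = "mlx-community/gemma-4-e2b-it-5bit"
model, processor = load(model_path)
config = load_config(model_path)

# Placeholder image path -- replace with a real file on disk.
image = ["path/to/image.png"]
prompt = "Describe this image."

# Wrap the prompt in the model's chat template before generation.
formatted_prompt = apply_chat_template(
    processor, config, prompt, num_images=len(image)
)

output = generate(model, processor, formatted_prompt, image, verbose=False)
print(output)
```

This mirrors the CLI invocation above; the command-line flags (`--prompt`, `--image`, `--max-tokens`) map onto the corresponding `generate` arguments.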