Tags: Any-to-Any · Transformers · Safetensors · MLX · minicpmo · feature-extraction · minicpm-o · minicpm-v · multimodal · full-duplex · custom_code · 5-bit
Instructions for using mlx-community/MiniCPM-o-4_5-5bit with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use mlx-community/MiniCPM-o-4_5-5bit with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained(
    "mlx-community/MiniCPM-o-4_5-5bit",
    trust_remote_code=True,
    dtype="auto",
)
```

- MLX
How to use mlx-community/MiniCPM-o-4_5-5bit with MLX:
```shell
# Download the model from the Hub
pip install "huggingface_hub[hf_xet]"
huggingface-cli download --local-dir MiniCPM-o-4_5-5bit mlx-community/MiniCPM-o-4_5-5bit
```
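After downloading, the checkpoint can be run with the mlx-vlm package. This is a minimal sketch, not from the model card: it assumes mlx-vlm is installed (`pip install mlx-vlm`), that it supports this checkpoint, and that the `load`/`generate` calls match mlx-vlm's documented API; the prompt and image path are placeholders.

```python
# Sketch: inference with mlx-vlm (assumed dependency, not listed on the card).
# Requires Apple silicon; the prompt and image below are illustrative only.

REPO_ID = "mlx-community/MiniCPM-o-4_5-5bit"


def local_dir(repo_id: str) -> str:
    """Directory name used by the huggingface-cli download above (repo name only)."""
    return repo_id.split("/")[-1]


if __name__ == "__main__":
    from mlx_vlm import load, generate  # assumed API: load() and generate()

    model, processor = load(REPO_ID)
    text = generate(model, processor, prompt="Describe this image.", image="photo.jpg")
    print(text)
```

Loading from the repo id pulls from the local Hub cache when present; passing `local_dir(REPO_ID)` instead would point at the directory created by the download command above.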
- Notebooks
- Google Colab
- Kaggle
- Local Apps
- LM Studio