How to use LibraxisAI/whisper-medium-mlx-q8 with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("automatic-speech-recognition", model="LibraxisAI/whisper-medium-mlx-q8")
```
```python
# Load the model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("LibraxisAI/whisper-medium-mlx-q8", dtype="auto")
```
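As a usage sketch beyond the snippets above (and assuming the repo's quantized weights load through Transformers, which is not guaranteed for an MLX-format checkpoint), the pipeline can transcribe a local audio file; `sample.wav` is an illustrative path, not part of the model card:

```python
# Sketch: transcribe a local audio file with the ASR pipeline.
# "sample.wav" is a hypothetical file; replace it with your own audio.
from transformers import pipeline

pipe = pipeline("automatic-speech-recognition", model="LibraxisAI/whisper-medium-mlx-q8")
result = pipe("sample.wav")  # returns a dict with a "text" key
print(result["text"])
```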
How to use LibraxisAI/whisper-medium-mlx-q8 with MLX:
```shell
# Download the model from the Hub
pip install "huggingface_hub[hf_xet]"
huggingface-cli download --local-dir whisper-medium-mlx-q8 LibraxisAI/whisper-medium-mlx-q8
```