# whisper-tiny-mlx

This model (`HashNuke/whisper-tiny-mlx`) was converted to MLX format from OpenAI's Whisper `tiny` checkpoint.

## Use with mlx

```shell
git clone https://github.com/ml-explore/mlx-examples.git
cd mlx-examples/whisper/
pip install -r requirements.txt
```

```python
>>> import whisper
>>> result = whisper.transcribe("FILE_NAME")
>>> result["text"]
```
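The call above uses the `whisper` module's default model. To point `transcribe` at this converted checkpoint explicitly, a sketch like the following should work. The `path_or_hf_repo` parameter name is an assumption based on the mlx-examples `whisper` module; check the signature of `whisper.transcribe` in your checkout if it differs.

```python
# Sketch: transcribe an audio file with this MLX checkpoint.
# Assumes you are inside mlx-examples/whisper/ with requirements installed.
import whisper

result = whisper.transcribe(
    "FILE_NAME",  # path to an audio file (keep your own filename here)
    path_or_hf_repo="HashNuke/whisper-tiny-mlx",  # assumed parameter name
)

# transcribe returns a dict; the full transcription is under "text"
print(result["text"])
```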