Instructions for using mlx-community/mlx_bark with supported libraries and local apps.
How to use mlx-community/mlx_bark with MLX:
```shell
# Download the model from the Hub
pip install "huggingface_hub[hf_xet]"
huggingface-cli download --local-dir mlx_bark mlx-community/mlx_bark
```
What does model.py do? Support for MPS
I tried to run the example for the large model:

```shell
# Run example (large model)
python model.py --text="Hello world!" --path weights/ --model large
```

but nothing seems to happen: no console output and no sound. I was only able to generate and save audio by writing my own code.
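For anyone else stuck at the same point, here is a minimal sketch of how one might write a generated waveform to disk with the standard-library `wave` module. It assumes the model returns a float waveform in [-1, 1] at Bark's 24 kHz sample rate; the sine wave at the bottom is just a stand-in for the model's actual output.

```python
import wave

import numpy as np

SAMPLE_RATE = 24_000  # Bark generates audio at 24 kHz


def save_wav(path: str, samples: np.ndarray, rate: int = SAMPLE_RATE) -> None:
    """Write a float waveform in [-1, 1] to a 16-bit mono WAV file."""
    pcm = (np.clip(samples, -1.0, 1.0) * 32767).astype(np.int16)
    with wave.open(path, "wb") as f:
        f.setnchannels(1)   # mono
        f.setsampwidth(2)   # 16-bit PCM
        f.setframerate(rate)
        f.writeframes(pcm.tobytes())


# Stand-in waveform: one second of a 440 Hz tone.
# Replace this with the array returned by your own inference code.
t = np.linspace(0, 1, SAMPLE_RATE, endpoint=False)
save_wav("hello.wav", 0.5 * np.sin(2 * np.pi * 440 * t))
```

The resulting `hello.wav` plays in any audio player; swapping the sine wave for the model's output array is all that should be needed.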
Also, how does one enable MPS support? Audio generation currently runs entirely on the CPU, and when I modify the code to call `model.to("mps")` I get:

```
File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/torch/nn/functional.py", line 2266, in embedding
    return torch.embedding(weight, input, padding_idx, scale_grad_by_freq, sparse)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
RuntimeError: Placeholder storage has not been allocated on MPS device!
```
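That `RuntimeError` is the usual symptom of a device mismatch in PyTorch: the weights were moved to MPS with `model.to("mps")`, but the input tensors fed to the model are still on the CPU. A sketch of the general pattern follows; the `nn.Embedding` is only a stand-in for the real model, and the code falls back to CPU on machines without MPS.

```python
import torch
import torch.nn as nn

# Pick MPS when available, otherwise fall back to CPU.
device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")

# Stand-in for the Bark model; .to(device) moves its weights.
model = nn.Embedding(num_embeddings=256, embedding_dim=8).to(device)

# The crucial step: inputs must live on the SAME device as the weights.
# Tensors created without an explicit device land on the CPU, which is
# exactly what triggers the "Placeholder storage" error against MPS weights.
tokens = torch.tensor([[1, 2, 3]], device=device)

out = model(tokens)  # weights and input are both on `device`
```

In `model.py` this would mean moving every tensor the script creates (token IDs, masks, etc.) to the same device as the model, not just calling `model.to("mps")` once.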