Instructions to use aystream/GigaAM-v3-e2e-rnnt-mlx with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- MLX
How to use aystream/GigaAM-v3-e2e-rnnt-mlx with MLX:
```shell
# Download the model from the Hub
pip install "huggingface_hub[hf_xet]"
huggingface-cli download --local-dir GigaAM-v3-e2e-rnnt-mlx aystream/GigaAM-v3-e2e-rnnt-mlx
```
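The same download can be done from Python via `huggingface_hub`. A minimal sketch, assuming `huggingface_hub` is installed; `snapshot_download` is part of the library's public API, and the `local_dir` name here simply mirrors the CLI example above:

```python
def fetch_model(repo_id: str = "aystream/GigaAM-v3-e2e-rnnt-mlx",
                local_dir: str = "GigaAM-v3-e2e-rnnt-mlx") -> str:
    """Download a snapshot of the model repo and return its local path."""
    # Imported lazily so the module loads even without huggingface_hub.
    from huggingface_hub import snapshot_download
    return snapshot_download(repo_id=repo_id, local_dir=local_dir)

if __name__ == "__main__":
    print(fetch_model())
```

This fetches the full repository snapshot (weights and config) into `GigaAM-v3-e2e-rnnt-mlx/`, equivalent to the `huggingface-cli download` command.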
- Notebooks
- Google Colab
- Kaggle
- Local Apps
- LM Studio