Instructions for using NeoDim/starchat-alpha-GGML with libraries, inference providers, notebooks, and local apps.
- Libraries
- Transformers
How to use NeoDim/starchat-alpha-GGML with Transformers:

```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("NeoDim/starchat-alpha-GGML", dtype="auto")
```

- Notebooks
- Google Colab
- Kaggle
Which inference repo is this quantized for?
#2
by xhyi - opened
Is this quantized for the current starcoder.cpp, or for upstream ggml, or something else?
I know nothing about starcoder.cpp. Could you provide a link, please? It's for upstream ggml.
Thank you. For now it should work for both.
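Since the thread above says these files target upstream ggml rather than Transformers, they would typically be run with the `starcoder` example binary from the ggerganov/ggml repository. A minimal sketch of assembling that command line follows; the binary path, model filename, and flags are assumptions based on ggml's example conventions, not confirmed by this repo:

```python
import shlex

def starcoder_cmd(model_path: str, prompt: str, n_predict: int = 128) -> str:
    # Build a shell command for ggml's example starcoder binary
    # (./bin/starcoder after building https://github.com/ggerganov/ggml).
    # The -m / -p / -n flags follow the ggml examples; the model
    # filename passed in is hypothetical.
    args = ["./bin/starcoder", "-m", model_path, "-p", prompt, "-n", str(n_predict)]
    return " ".join(shlex.quote(a) for a in args)

# Print the command you would run from the ggml build directory.
print(starcoder_cmd("starchat-alpha-ggml-q4_0.bin", "def fibonacci(n):"))
```

Since the files are (per the reply above) compatible with both upstream ggml and starcoder.cpp for now, the same model path should work with either runner.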