Instructions for using TheBloke/MPT-7B-GGML with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
  - Transformers
- Notebooks
  - Google Colab
  - Kaggle

How to use TheBloke/MPT-7B-GGML with Transformers:

```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("TheBloke/MPT-7B-GGML", dtype="auto")
```
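Note that GGML checkpoints are quantized files meant for llama.cpp-style CPU runtimes, so the auto-generated Transformers snippet may not load them directly. A minimal sketch using the community `ctransformers` library (an assumption: it must be installed separately with `pip install ctransformers`, and `model_type="mpt"` follows that library's convention for selecting the architecture):

```python
# Hedged sketch: load a GGML quantization of MPT-7B with ctransformers.
# Assumption: `pip install ctransformers` has been run. The loader is
# wrapped in a function so nothing is downloaded at import time --
# the model weights are several gigabytes.

def load_mpt_ggml(repo_id: str = "TheBloke/MPT-7B-GGML"):
    # Local import keeps ctransformers an optional dependency.
    from ctransformers import AutoModelForCausalLM

    # model_type tells ctransformers which architecture the GGML file uses.
    return AutoModelForCausalLM.from_pretrained(repo_id, model_type="mpt")

# Example usage (downloads weights on first call):
#   llm = load_mpt_ggml()
#   print(llm("AI is going to", max_new_tokens=16))
```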
Discussions:

- Have plan to create a ggml version for mpt-7b-chat? (3 comments) · #4 opened almost 3 years ago by LouiSum
- Python library (👍 1 · 3 comments) · #2 opened almost 3 years ago by marella
- Non llama.cpp GPU compatible libraries? (5 comments) · #1 opened almost 3 years ago by FriendlyVisage