Instructions for using debisoft/mpt-7b-foundergpt with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- PEFT
How to use debisoft/mpt-7b-foundergpt with PEFT:
```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

# Load the PEFT-compatible base model, then apply the adapter weights on top.
base_model = AutoModelForCausalLM.from_pretrained("debisoft/mpt-7b-8k-instruct-peft-compatible")
model = PeftModel.from_pretrained(base_model, "debisoft/mpt-7b-foundergpt")
```

- Notebooks
- Google Colab
- Kaggle
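Once the adapter is applied, generation goes through the standard `transformers` API. A minimal sketch, assuming the tokenizer ships with the base repo and using an illustrative prompt (the prompt text and sampling parameters are assumptions, not part of the model card):

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

# Tokenizer is assumed to come from the base repo; check the model card
# if loading fails.
tokenizer = AutoTokenizer.from_pretrained("debisoft/mpt-7b-8k-instruct-peft-compatible")

base_model = AutoModelForCausalLM.from_pretrained("debisoft/mpt-7b-8k-instruct-peft-compatible")
model = PeftModel.from_pretrained(base_model, "debisoft/mpt-7b-foundergpt")

# Hypothetical prompt for illustration only.
prompt = "What makes a startup pitch compelling?"
inputs = tokenizer(prompt, return_tensors="pt")

output_ids = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Note that loading a 7B-parameter model this way requires substantial RAM; passing `torch_dtype` or `device_map` arguments to `from_pretrained` can reduce the footprint on suitable hardware.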