Dataset: HachiML/databricks-dolly-15k-ja-alpaca-format
How to use HachiML/mpt-7b-instruct-ja-qlora with PEFT:

from peft import PeftModel
from transformers import AutoModelForCausalLM

# MPT ships custom modeling code on the Hub, so trust_remote_code is required.
base_model = AutoModelForCausalLM.from_pretrained(
    "mosaicml/mpt-7b-instruct",
    trust_remote_code=True,
)
# Apply the QLoRA adapter weights on top of the base model.
model = PeftModel.from_pretrained(base_model, "HachiML/mpt-7b-instruct-ja-qlora")

The following bitsandbytes quantization config was used during training:
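Since the training dataset is in Alpaca format, inference prompts presumably follow the standard Alpaca instruction template. The helper below is a sketch using the stock Alpaca wording; the exact template used for this model is an assumption, and `build_alpaca_prompt` is a hypothetical name, not part of either library.

```python
def build_alpaca_prompt(instruction: str, input_text: str = "") -> str:
    """Build a prompt in the standard Alpaca instruction format.

    Note: this is the stock Alpaca template; whether this model was trained
    with exactly this wording (or a Japanese translation of it) is an
    assumption.
    """
    if input_text:
        # Variant with an additional input/context field.
        return (
            "Below is an instruction that describes a task, paired with an "
            "input that provides further context. Write a response that "
            "appropriately completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_text}\n\n"
            "### Response:\n"
        )
    # Instruction-only variant.
    return (
        "Below is an instruction that describes a task. Write a response "
        "that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )
```

The resulting string would then be tokenized and passed to `model.generate` as usual.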