How to use forestai/fireact_llama_2_7b_lora with PEFT:
```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

# Load the base model, then apply the LoRA adapter weights on top of it.
base_model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-chat-hf")
model = PeftModel.from_pretrained(base_model, "forestai/fireact_llama_2_7b_lora")
```