How to use WHATX/30k-Llama3-8B with PEFT:
```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

# Load the base model, then attach the LoRA adapter from the Hub
base_model = AutoModelForCausalLM.from_pretrained("../ckpts/Meta-Llama-3-8B-Instruct")
model = PeftModel.from_pretrained(base_model, "WHATX/30k-Llama3-8B")
```
The best-performing checkpoint is the one at epoch 240.