How to use SwastikM/Llama-2-7B-Chat-text2code with PEFT:
```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

base_model = AutoModelForCausalLM.from_pretrained("TheBloke/Llama-2-7b-Chat-GPTQ")
model = PeftModel.from_pretrained(base_model, "SwastikM/Llama-2-7B-Chat-text2code")
```
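Once the adapter is attached, the model can be used for generation like any other `transformers` causal LM. A minimal sketch, assuming the tokenizer ships with the base GPTQ repo, that the `auto-gptq`/`optimum` stack is installed, and using a hypothetical prompt:

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the GPTQ-quantized base model and attach the LoRA adapter
base_model = AutoModelForCausalLM.from_pretrained(
    "TheBloke/Llama-2-7b-Chat-GPTQ", device_map="auto"
)
model = PeftModel.from_pretrained(base_model, "SwastikM/Llama-2-7B-Chat-text2code")

# Assumption: the tokenizer is taken from the base model repo
tokenizer = AutoTokenizer.from_pretrained("TheBloke/Llama-2-7b-Chat-GPTQ")

# Hypothetical text-to-code prompt
prompt = "Write a Python function that returns the nth Fibonacci number."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```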
Congrats, the model looks amazing. We wish you great success on your journey 🤗
Thank you so much. I really appreciate the feedback. Thank you for the wonderful dataset.