How to use xy1e22/text-lora with PEFT:
```python
from peft import PeftModel
from transformers import AutoModelForSeq2SeqLM

# Load the base model, then attach the LoRA adapter on top of it
base_model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-base")
model = PeftModel.from_pretrained(base_model, "xy1e22/text-lora")
```
How to use xy1e22/text-lora with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("xy1e22/text-lora", dtype="auto")
```