How to use from the PEFT library
from peft import PeftModel
from transformers import AutoModelForCausalLM

# Load the frozen Baichuan-7B base model, then attach this LoRA adapter on top.
# (Baichuan's custom modeling code may additionally require trust_remote_code=True.)
base_model = AutoModelForCausalLM.from_pretrained("baichuan-inc/Baichuan-7B")
model = PeftModel.from_pretrained(base_model, "jeeejeee/baichuan7b-zero-init")

Model Card for jeeejeee/baichuan7b-zero-init

This LoRA adapter is intended solely for use in vLLM's unit tests; all of its lora_B weights are initialized to zero.
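Because the LoRA update is computed as lora_B @ lora_A, zero-initializing lora_B makes the whole update vanish, so the adapter leaves the base model's outputs unchanged. This makes it a convenient fixture for tests. The sketch below illustrates the arithmetic with NumPy on toy dimensions (the shapes and rank are illustrative, not Baichuan-7B's actual sizes):

```python
import numpy as np

rng = np.random.default_rng(0)

d, r = 8, 4                          # toy hidden size and LoRA rank
W = rng.normal(size=(d, d))          # frozen base weight
lora_A = rng.normal(size=(r, d))     # lora_A: randomly initialized
lora_B = np.zeros((d, r))            # lora_B: zero-initialized, as in this adapter

x = rng.normal(size=(d,))
delta_W = lora_B @ lora_A            # LoRA update: B @ A, here all zeros
y_base = W @ x
y_lora = (W + delta_W) @ x

# With lora_B = 0, the adapted model reproduces the base model exactly.
assert np.allclose(y_base, y_lora)
```

This zero-initialization of lora_B is also the standard LoRA training initialization, since it guarantees the adapted model starts out identical to the base model.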

Framework versions

  • PEFT 0.10.0