How to use from the PEFT library
from peft import PeftModel
from transformers import AutoModelForCausalLM

# Load the 4-bit quantized base model, then attach the NanoR1 LoRA adapter
base_model = AutoModelForCausalLM.from_pretrained("unsloth/Qwen2.5-3B-Instruct-unsloth-bnb-4bit")
model = PeftModel.from_pretrained(base_model, "devZeeshaan/NanoR1")
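Once the adapter is attached, the model generates like any other causal LM in transformers. A minimal sketch of a full inference round trip (the chat-template call, prompt, and sampling parameters below are illustrative assumptions, not part of the original card):

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

# Tokenizer comes from the same base checkpoint (assumption: the adapter
# repo does not ship its own tokenizer)
tokenizer = AutoTokenizer.from_pretrained("unsloth/Qwen2.5-3B-Instruct-unsloth-bnb-4bit")
base_model = AutoModelForCausalLM.from_pretrained("unsloth/Qwen2.5-3B-Instruct-unsloth-bnb-4bit")
model = PeftModel.from_pretrained(base_model, "devZeeshaan/NanoR1")

# Build a chat prompt with the model's chat template and generate
messages = [{"role": "user", "content": "Explain LoRA in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=128)

# Decode only the newly generated tokens, skipping the prompt
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Loading the bnb-4bit checkpoint requires `bitsandbytes` and a CUDA-capable GPU; on CPU-only machines the non-quantized base model would be needed instead.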

Model Description

  • Developed by: Jeesan Abbas
  • License: Apache 2.0
  • Finetuned from model: unsloth/Qwen2.5-3B-Instruct-unsloth-bnb-4bit

Uploaded model

This Qwen2 model was trained 2x faster with Unsloth and Hugging Face's TRL library.
