Use with the Transformers library
# This repo contains a LoRA adapter, not full model weights, so load the
# base model first and then attach the adapter (requires the peft library)
from transformers import AutoModelForCausalLM
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained("meta-llama/Meta-Llama-3-8B", torch_dtype="auto")
model = PeftModel.from_pretrained(base, "leafspark/SFR-Iterative-DPO-LLaMA-3-8B-R-lora")
SFR-Iterative-DPO-LLaMA-3-8B-R LoRA Model

This is a LoRA adapter extracted from a fine-tuned language model using mergekit.

LoRA Details

This LoRA adapter was extracted from SFR-Iterative-DPO-LLaMA-3-8B-R and uses Meta-Llama-3-8B as a base.

Parameters

The following command was used to extract this LoRA adapter:

mergekit-extract-lora Meta-Llama-3-8B SFR-Iterative-DPO-LLaMA-3-8B-R OUTPUT_PATH --rank=32
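The `--rank=32` flag controls the quality of the extraction: the adapter approximates the weight difference between the fine-tuned model and the base with a rank-32 factorization. The core idea can be sketched with NumPy using a truncated SVD of the weight delta (an illustrative sketch, not mergekit's actual implementation; all names here are hypothetical):

```python
import numpy as np

def extract_lora(w_base, w_ft, rank):
    # Low-rank approximation of the weight delta via truncated SVD:
    # w_ft ≈ w_base + lora_b @ lora_a, with lora_b/lora_a of rank `rank`.
    delta = w_ft - w_base
    u, s, vt = np.linalg.svd(delta, full_matrices=False)
    lora_b = u[:, :rank] * s[:rank]  # shape (out_dim, rank)
    lora_a = vt[:rank, :]            # shape (rank, in_dim)
    return lora_a, lora_b

rng = np.random.default_rng(0)
w_base = rng.standard_normal((64, 64))
# Simulate a fine-tune whose weight change is exactly rank 8
low = rng.standard_normal((64, 8)) @ rng.standard_normal((8, 64))
w_ft = w_base + 0.1 * low

a, b = extract_lora(w_base, w_ft, rank=8)
err = np.linalg.norm((w_base + b @ a) - w_ft)
print(err)  # near zero: a rank-8 delta is captured exactly at rank 8
```

When the true weight delta has a higher rank than the chosen `--rank`, the extraction is lossy, which is why the adapter only approximates the original fine-tuned model.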