---
base_model: []
library_name: transformers
tags:
- mergekit
- peft
license: llama3
---
# SFR-Iterative-DPO-LLaMA-3-8B-R LoRA Model

This is a LoRA extracted from a language model. It was extracted using [mergekit](https://github.com/arcee-ai/mergekit).

## LoRA Details

This LoRA adapter was extracted from SFR-Iterative-DPO-LLaMA-3-8B-R and uses Meta-Llama-3-8B as a base.
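
LoRA extraction approximates the weight difference between the fine-tuned model and its base with a low-rank factorization. The sketch below illustrates the general idea using a truncated SVD on a single hypothetical weight matrix; it is not mergekit's exact implementation, and the shapes are placeholders.

```python
import torch

# Hypothetical stand-ins for one linear layer's weights in each model.
W_base = torch.randn(4096, 4096)
W_tuned = torch.randn(4096, 4096)

# The adapter approximates this difference.
delta = W_tuned - W_base

# Truncated SVD keeps only the strongest directions of the difference.
U, S, Vh = torch.linalg.svd(delta, full_matrices=False)
rank = 32  # matches --rank=32 in the extraction command below
B = U[:, :rank] * S[:rank]  # (4096, rank)
A = Vh[:rank, :]            # (rank, 4096)

# B @ A is the rank-32 approximation of delta that the adapter stores.
approx_delta = B @ A
```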

### Parameters

The following command was used to extract this LoRA adapter:

```sh
mergekit-extract-lora Meta-Llama-3-8B SFR-Iterative-DPO-LLaMA-3-8B-R OUTPUT_PATH --rank=32
```
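
To apply the extracted adapter, a minimal sketch using `transformers` and `peft` is shown below. The base-model repo ID is `meta-llama/Meta-Llama-3-8B`; `ADAPTER_PATH` is a placeholder for wherever this adapter is stored, since the card does not name a published repo.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

ADAPTER_PATH = "path/to/this/adapter"  # placeholder, not a published repo ID

# Load the base model the LoRA was extracted against.
base_model = AutoModelForCausalLM.from_pretrained("meta-llama/Meta-Llama-3-8B")
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Meta-Llama-3-8B")

# Wrap the base model with the extracted rank-32 adapter.
model = PeftModel.from_pretrained(base_model, ADAPTER_PATH)

prompt = "Explain what a LoRA adapter is in one sentence."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```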