---
license: apache-2.0
library_name: transformers
base_model:
- arcee-ai/Trinity-Mini
tags:
- trinity
- moe
- merged-lora
---
# Trinity-Mini LoRA 2
This repository contains `arcee-ai/Trinity-Mini` with the `lora-2` adapter merged into the base weights.
## Merge Details
- Base model: `arcee-ai/Trinity-Mini`
- Adapter source: `lora-2/smbdsvt74oqm7ngogf4xysn8.zip`
- Merge script: `merge_lora.py`
- Confirmed applied tensors: `11,686 / 11,686`
- LoRA rank: `16`
- LoRA alpha: `64`
- Merge scale: `4.0`
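The merge scale follows directly from the adapter hyperparameters: `alpha / rank = 64 / 16 = 4.0`. The actual merge was performed by `merge_lora.py` (not included here); the following is a minimal NumPy sketch of the standard LoRA merge formula, `W' = W + (alpha / rank) * (B @ A)`, with hypothetical tensor names and toy shapes:

```python
import numpy as np

def merge_lora_weight(base_weight, lora_a, lora_b, alpha=64, rank=16):
    """Merge one LoRA tensor pair into a base weight matrix.

    Merged weight: W' = W + (alpha / rank) * (B @ A).
    With alpha=64 and rank=16 this yields the 4.0 merge scale above.
    """
    scale = alpha / rank  # 64 / 16 = 4.0
    return base_weight + scale * (lora_b @ lora_a)

# Toy shapes: a 32x32 layer with rank-16 factors.
w = np.zeros((32, 32))
a = np.random.randn(16, 32)  # lora_A: (rank, in_features)
b = np.random.randn(32, 16)  # lora_B: (out_features, rank)
merged = merge_lora_weight(w, a, b)
print(merged.shape)  # (32, 32)
```

In the real script this update is applied to every matching tensor pair, which is what the `11,686 / 11,686` confirmation count refers to.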
## Usage
Load this model the same way you would load `arcee-ai/Trinity-Mini`. No adapter files are needed at inference time, since the LoRA weights are already merged into the base checkpoint. Pass `trust_remote_code=True` if your environment requires it for this architecture.
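A minimal loading sketch with the `transformers` `Auto*` classes. The repo id below is a placeholder; substitute the actual id of this repository:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder: replace with this repository's actual Hub id.
MODEL_ID = "your-namespace/trinity-mini-lora-2"

def load(model_id=MODEL_ID):
    """Load tokenizer and merged model, allowing custom model code if needed."""
    tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, trust_remote_code=True, torch_dtype="auto"
    )
    return tokenizer, model

if __name__ == "__main__":
    tokenizer, model = load()
    inputs = tokenizer("Hello", return_tensors="pt")
    out = model.generate(**inputs, max_new_tokens=32)
    print(tokenizer.decode(out[0], skip_special_tokens=True))
```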