---
license: apache-2.0
library_name: transformers
base_model:
- arcee-ai/Trinity-Mini
tags:
- trinity
- moe
- merged-lora
---
# Trinity Mini LoRA 2

This repository contains `arcee-ai/Trinity-Mini` with the `lora-2` adapter merged into the base weights.
## Merge Details

- Base model: `arcee-ai/Trinity-Mini`
- Adapter source: `lora-2/smbdsvt74oqm7ngogf4xysn8.zip`
- Merge script: `merge_lora.py`
- Confirmed applied tensors: 11,686 / 11,686
- LoRA rank: 16
- LoRA alpha: 64
- Merge scale: 4.0
## Usage

Load this model the same way you would load `arcee-ai/Trinity-Mini`, passing `trust_remote_code=True` when your environment requires it.
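Because the adapter is already merged into the base weights, no PEFT step is needed at load time. A minimal sketch of standard `transformers` loading is below; the repo id is a placeholder (substitute this repository's id), and `trust_remote_code=True` is shown on the assumption your environment requires it, as noted above.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder: replace with this repository's id on the Hub.
model_id = "arcee-ai/Trinity-Mini"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,
    torch_dtype="auto",   # keep the checkpoint's native precision
    device_map="auto",    # place weights on available GPU(s), else CPU
)

# Ordinary causal-LM generation; no adapter loading is required
# because the LoRA weights are already folded into the base model.
inputs = tokenizer("Hello, my name is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```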