---
license: apache-2.0
library_name: transformers
base_model:
- arcee-ai/Trinity-Mini
tags:
- trinity
- moe
- merged-lora
---
# Trinity Mini Quirky
This repository contains [arcee-ai/Trinity-Mini](https://huggingface.co/arcee-ai/Trinity-Mini) with the `lora-1-quirky` adapter merged into the base weights.
## Merge Details
- **Base model:** `arcee-ai/Trinity-Mini`
- **Adapter source:** `lora-1-quirky/Quirky-cr8a7qzr4ay8yhtvjup2ijl1.zip`
- **Merge script:** `merge_lora.py`
- **Confirmed applied tensors:** 11,686 / 11,686
- **LoRA rank:** 16
- **LoRA alpha:** 64
- **Merge scale:** 4.0
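The merge scale of 4.0 follows the standard LoRA convention of alpha divided by rank (64 / 16). A minimal sketch of the merge arithmetic, using NumPy with toy dimensions (the actual `merge_lora.py` script and tensor shapes are not shown here):

```python
import numpy as np

# Standard LoRA scaling: alpha / rank, which gives the 4.0 merge scale above.
rank, alpha = 16, 64
scale = alpha / rank  # 4.0

# Toy dimensions for illustration only; real tensors are much larger.
d_out, d_in = 8, 8
rng = np.random.default_rng(0)
W = rng.standard_normal((d_out, d_in))  # base weight
A = rng.standard_normal((rank, d_in))   # LoRA down-projection
B = rng.standard_normal((d_out, rank))  # LoRA up-projection

# Merging folds the scaled low-rank update into the base weight,
# so the adapter is no longer needed at inference time.
W_merged = W + scale * (B @ A)
```

Because the update is folded into `W`, the merged checkpoint loads and runs exactly like the base model, with no adapter overhead.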
## Usage
Load this model the same way you would load `arcee-ai/Trinity-Mini`, passing `trust_remote_code=True` when your environment requires it.
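A typical loading sketch with the `transformers` library (the repository id below is a placeholder; substitute this repository's actual id):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder id; replace with this repository's actual model id.
model_id = "your-org/Trinity-Mini-Quirky"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

inputs = tokenizer("Hello, world!", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Since the LoRA update is already merged into the weights, no PEFT adapter loading step is needed.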