---
base_model:
- Vortex5/MS3.2-24B-Fiery-Lynx
- ReadyArt/MS3.2-The-Omega-Directive-24B-Unslop-v2.0
- Vortex5/MS3.2-24B-Chaos-Skies
library_name: transformers
tags:
- mergekit
- merge
- roleplay
---

# **MS3.2-24B-Solar-Skies**
> Bright minds under boundless skies, where every conversation becomes a sunrise of imagination.
## Overview
**MS3.2-24B-Solar-Skies** is a merge of pre-trained language models created using [MergeKit](https://github.com/arcee-ai/mergekit).
It draws upon the **intellectual density** of *The Omega Directive*, the **expressive prose** of *Fiery Lynx*, and the **measured balance** of *Chaos Skies*.
---
## **Merge Method: [Multi-SLERP](https://goddard.blog/posts/multislerp-wow-what-a-cool-idea)**
**Models:**
- [ReadyArt/MS3.2-The-Omega-Directive-24B-Unslop-v2.0](https://huggingface.co/ReadyArt/MS3.2-The-Omega-Directive-24B-Unslop-v2.0)
- [Vortex5/MS3.2-24B-Fiery-Lynx](https://huggingface.co/Vortex5/MS3.2-24B-Fiery-Lynx)
- [Vortex5/MS3.2-24B-Chaos-Skies](https://huggingface.co/Vortex5/MS3.2-24B-Chaos-Skies)
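Multi-SLERP generalizes pairwise spherical interpolation to more than two models. As a rough intuition only (an illustrative sketch, not mergekit's actual implementation): treat each weight tensor as a point whose direction lives on a hypersphere, take the weighted mean of the directions, re-project onto the sphere, and restore an interpolated magnitude.

```python
import numpy as np

def multislerp_sketch(tensors, weights, eps=1e-8):
    """Illustrative sketch of multi-model spherical interpolation.

    Each tensor is flattened, split into a direction (unit vector) and a
    magnitude (norm). Directions are averaged with the given weights and
    re-normalized onto the sphere; magnitudes are averaged linearly.
    This approximates a spherical barycenter; mergekit's multislerp is
    more involved.
    """
    weights = np.asarray(weights, dtype=np.float64)
    weights = weights / weights.sum()  # mirrors `normalize: true` in the config
    flat = [np.ravel(t).astype(np.float64) for t in tensors]
    norms = np.array([np.linalg.norm(v) for v in flat])
    units = [v / max(n, eps) for v, n in zip(flat, norms)]
    mean_dir = sum(w * u for w, u in zip(weights, units))
    mean_dir /= max(np.linalg.norm(mean_dir), eps)  # back onto the sphere
    radius = float(weights @ norms)  # interpolate magnitude linearly
    return (radius * mean_dir).reshape(np.shape(tensors[0]))
```

Compared to a plain weighted average, this preserves the overall scale of the merged tensor even when the source tensors point in different directions.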
<details>
<summary><b>Configuration</b></summary>

```yaml
models:
  - model: ReadyArt/MS3.2-The-Omega-Directive-24B-Unslop-v2.0
    parameters:
      weight:
        - filter: self_attn
          value: [0.20, 0.35, 0.55, 0.75, 1.00, 0.95, 0.80, 0.50]
        - filter: norm
          value: 0.45
        - value: 0.33
  - model: Vortex5/MS3.2-24B-Fiery-Lynx
    parameters:
      weight:
        - filter: lm_head
          value: 0.34
        - filter: mlp
          value: [0.20, 0.30, 0.45, 0.60, 0.65, 0.60, 0.45, 0.30]
        - value: 0.25
  - model: Vortex5/MS3.2-24B-Chaos-Skies
    parameters:
      weight:
        - filter: self_attn
          value: [0.25, 0.35, 0.45, 0.55, 0.60, 0.65, 0.65, 0.60]
        - filter: mlp
          value: 0.2
        - value: 0.33
merge_method: multislerp
dtype: bfloat16
parameters:
  normalize: true
tokenizer:
  source: Vortex5/MS3.2-24B-Chaos-Skies
```
</details>
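In the configuration, list-valued weights such as `[0.20, 0.35, 0.55, ...]` are gradients: short lists that get stretched across the full layer stack, so a filter's merge weight varies smoothly from early to late layers. The sketch below shows the idea via linear interpolation; it is illustrative only, not mergekit's actual code.

```python
def spread_gradient(values, num_layers):
    """Stretch a short gradient list of weights across `num_layers`
    layers by linear interpolation, so each layer gets its own weight.
    Illustrative sketch of the per-layer gradient idea, not mergekit's
    exact implementation."""
    if num_layers == 1:
        return [values[0]]
    out = []
    for layer in range(num_layers):
        # Position of this layer along the gradient, in [0, len(values) - 1]
        pos = layer * (len(values) - 1) / (num_layers - 1)
        lo = int(pos)
        hi = min(lo + 1, len(values) - 1)
        frac = pos - lo
        out.append(values[lo] * (1 - frac) + values[hi] * frac)
    return out
```

For example, an 8-element gradient applied to a 40-layer model yields 40 per-layer weights, rising and falling as the listed anchor values do.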
## Intended Use
| Category | Description |
|-----------|--------------|
| **Reflective Dialogue** | Ideal for introspective or philosophical discussions, exploring abstract and emotional topics. |
| **Creative Writing** | Excels at expressive prose, narrative storytelling, and immersive worldbuilding. |
| **Analytical Reasoning** | Balances logic and creativity for insightful, stylistically nuanced explanations. |
| **Character Roleplay** | Adapts fluidly to emotional, character-driven interactions and narrative depth. |
---
## Acknowledgements
- mradermacher, for the static / imatrix quantizations
- DeathGodlike, for the EXL3 quants
- All original model authors and contributors whose work formed the foundation for this merge.