---
base_model:
- Vortex5/LunaMaid-12B
- Vortex5/Vermilion-Sage-12B
- inflatebot/MN-12B-Mag-Mell-R1
- Vortex5/Dark-Quill-12B
library_name: transformers
tags:
- mergekit
- merge
- roleplay
---

# **Abyssal-Seraph-12B**
> *Where the light of the divine meets the poetry of the abyss.*
---
## Overview
**Abyssal-Seraph-12B** is a **multi-stage creative merge** designed for expressive storytelling, emotional depth, and lyrical dialogue.
It was crafted through a layered fusion using [MergeKit](https://github.com/arcee-ai/mergekit):
1. **LunaMaid × Vermilion-Sage**: merged via **NearSwap** (`t=0.0008`) to unify LunaMaid's balanced composure with Vermilion-Sage's radiant prose.
2. **Dark-Quill × Mag-Mell-R1**: merged via **NearSwap** (`t=0.0008`) to draw forth mysticism, poetic darkness, and a sense of dreamlike gravity.
3. Both intermediate results combined with the **Karcher mean**, a geometric blend ensuring harmony between light and shadow.
---
## Model Essence
| Trait | Description |
|:--|:--|
| **Core Nature** | Philosophical, poetic, emotionally resonant |
| **Style** | Fluid prose, vivid imagery, articulate reflection |
| **Tone** | Dreamlike, balanced between divine warmth and abyssal calm |
| **Best For** | Roleplay, character dialogue, introspection, lore writing, creative prose |
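
The card declares `library_name: transformers`, so loading should follow the standard `transformers` pattern. Below is a minimal, hedged sketch; the repo id `Vortex5/Abyssal-Seraph-12B` is inferred from the model name and is an assumption, not a confirmed upload path. Imports are kept inside the function so the snippet parses even without `transformers` installed.

```python
def generate(prompt: str, repo_id: str = "Vortex5/Abyssal-Seraph-12B") -> str:
    """Minimal generation helper for this merge.

    NOTE: repo_id is an assumption inferred from the model name;
    adjust it to the actual Hugging Face upload if it differs.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForCausalLM.from_pretrained(
        repo_id,
        torch_dtype=torch.bfloat16,  # matches the merge's dtype
        device_map="auto",           # requires the `accelerate` package
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(
        **inputs, max_new_tokens=256, do_sample=True, temperature=0.8
    )
    return tokenizer.decode(output[0], skip_special_tokens=True)

# Example call (downloads roughly 24 GB of bf16 weights on first run):
# print(generate("Write a short hymn to the space between stars."))
```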
---
## Merge Overview
Abyssal-Seraph-12B was created through a multi-stage, precision merge designed to blend expressive prose with poetic balance while maintaining model stability.
### **Stage 1**
**Method:** NearSwap (`t = 0.0008`)
**Base:** [Vortex5/LunaMaid-12B](https://huggingface.co/Vortex5/LunaMaid-12B)
**Secondary:** [Vortex5/Vermilion-Sage-12B](https://huggingface.co/Vortex5/Vermilion-Sage-12B)
<details>
<summary><b>Stage 1 Configuration</b></summary>
```yaml
name: First
models:
  - model: Vortex5/Vermilion-Sage-12B
merge_method: nearswap
base_model: Vortex5/LunaMaid-12B
parameters:
  t: 0.0008
dtype: bfloat16
tokenizer:
  source: Vortex5/LunaMaid-12B
```
</details>
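
For intuition, NearSwap adopts the secondary model's weight wherever it sits within `t` of the base weight, and falls back toward the base as the two diverge. The scalar sketch below illustrates that idea under one commonly cited formulation (`w = min(1, t/|Δ|)`); it is an illustration, not mergekit's exact tensor implementation.

```python
def nearswap(base, secondary, t):
    """Illustrative scalar sketch of the NearSwap idea.

    Where |base - secondary| <= t, the secondary weight is adopted
    outright; as the gap grows past t, the interpolation weight
    shrinks toward zero and the base weight dominates.
    """
    merged = []
    for b, s in zip(base, secondary):
        delta = abs(b - s)
        # full swap when the weights nearly agree, tiny nudge otherwise
        w = 1.0 if delta <= t else t / delta
        merged.append((1.0 - w) * b + w * s)
    return merged
```

With `t = 0.0008` (as in the configs on this card), only near-identical weights are exchanged, which is why the method preserves the base model's overall behavior.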
### **Stage 2**
**Method:** NearSwap (`t = 0.0008`)
**Base:** [Vortex5/Dark-Quill-12B](https://huggingface.co/Vortex5/Dark-Quill-12B)
**Secondary:** [inflatebot/MN-12B-Mag-Mell-R1](https://huggingface.co/inflatebot/MN-12B-Mag-Mell-R1)
<details>
<summary><b>Stage 2 Configuration</b></summary>
```yaml
name: Second
models:
  - model: inflatebot/MN-12B-Mag-Mell-R1
merge_method: nearswap
base_model: Vortex5/Dark-Quill-12B
parameters:
  t: 0.0008
dtype: bfloat16
```
</details>
### **Stage 3: Final Merge**
**Method:** Karcher mean (`tol = 1e-9`, `max_iter = 20000`)
**Inputs:** First + Second
**Purpose:** To geometrically fuse both intermediate merges into a single coherent model.
<details>
<summary><b>Final Merge Configuration</b></summary>
```yaml
models:
  - model: First
  - model: Second
merge_method: karcher
dtype: bfloat16
parameters:
  tol: 1e-9
  max_iter: 20000
tokenizer:
  source: First
```
</details>
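
The Karcher mean (also called the Riemannian barycenter or Fréchet mean) generalizes averaging to curved spaces: it iterates toward the point minimizing squared geodesic distance to all inputs. The toy sketch below computes it for unit vectors on a sphere, using the same `tol`/`max_iter` stopping rule as the config above; this is an illustration of the concept, not mergekit's implementation.

```python
import math

def _normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def karcher_mean(points, tol=1e-9, max_iter=20000):
    """Toy Karcher mean of unit vectors on the sphere.

    Repeatedly maps the points into the tangent space at the current
    estimate (log map), averages there, and steps back onto the
    sphere (exp map) until the update norm drops below tol.
    """
    mu = _normalize([sum(p[i] for p in points) for i in range(len(points[0]))])
    for _ in range(max_iter):
        tangent = [0.0] * len(mu)
        for p in points:
            dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(mu, p))))
            theta = math.acos(dot)  # geodesic distance from mu to p
            if theta < 1e-12:
                continue  # p coincides with mu; zero tangent contribution
            for i in range(len(mu)):
                tangent[i] += theta * (p[i] - dot * mu[i]) / math.sin(theta)
        tangent = [x / len(points) for x in tangent]
        norm = math.sqrt(sum(x * x for x in tangent))
        if norm < tol:
            break  # converged: mean tangent update is negligible
        # exp map: step along the averaged tangent direction
        mu = _normalize([math.cos(norm) * m + math.sin(norm) * t_ / norm
                         for m, t_ in zip(mu, tangent)])
    return mu
```

Unlike a plain arithmetic average of weights, this blend stays on the manifold, which is the "geometric harmony" the final stage aims for.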
## **Acknowledgements**
- **mradermacher** for *static* and *imatrix* quantizations
- **DeathGodlike** for *EXL3* quants
- **All original model authors and contributors** whose work made this model possible.
---
**Models merged in this creation:**
- [Vortex5/LunaMaid-12B](https://huggingface.co/Vortex5/LunaMaid-12B)
- [Vortex5/Vermilion-Sage-12B](https://huggingface.co/Vortex5/Vermilion-Sage-12B)
- [Vortex5/Dark-Quill-12B](https://huggingface.co/Vortex5/Dark-Quill-12B)
- [inflatebot/MN-12B-Mag-Mell-R1](https://huggingface.co/inflatebot/MN-12B-Mag-Mell-R1)