Vortex5 committed
Commit fcd65aa · verified · 1 parent: f7d79f6

Update README.md

Files changed (1):
  1. README.md +103 -28

README.md CHANGED
@@ -1,44 +1,119 @@
 ---
-base_model: []
-library_name: transformers
-tags:
-- mergekit
-- merge

 ---
-# merge

-This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

-## Merge Details
-### Merge Method

-This model was merged using the [Karcher Mean](https://en.wikipedia.org/wiki/Karcher_mean) merge method.

-### Models Merged

-The following models were included in the merge:
-* ./intermediates/Second
-* ./intermediates/First

-### Configuration

-The following YAML configuration was used to produce this model:

-```yaml
 dtype: bfloat16
 merge_method: karcher
-modules:
-  default:
-    slices:
-    - sources:
-      - layer_range: [0, 40]
-        model: ./intermediates/First
-      - layer_range: [0, 40]
-        model: ./intermediates/Second
 parameters:
-  max_iter: 20000.0
-  tol: 1.0e-09
 tokenizer:
-  source: ./intermediates/First
 ```
+---
+base_model:
+- Vortex5/LunaMaid-12B
+- Vortex5/Vermilion-Sage-12B
+- inflatebot/MN-12B-Mag-Mell-R1
+- Vortex5/Dark-Quill-12B
+library_name: transformers
+tags:
+- mergekit
+- merge
+- roleplay
+---
+![ComfyUI_00165_](https://cdn-uploads.huggingface.co/production/uploads/6669a3a617b838fda45637b8/6mPbbLD1oPf2297ZW3gpP.png)
+# 🌌 **Abyssal-Seraph-12B**
+
+> *Where the light of the divine meets the poetry of the abyss.*
+
 ---
+
+## 🜂 Overview
+
+**Abyssal-Seraph-12B** is a **multi-stage creative merge** designed for expressive storytelling, emotional depth, and lyrical dialogue.
+It was crafted through a layered fusion using [MergeKit](https://github.com/arcee-ai/mergekit):
+
+1. 🌙 **LunaMaid × Vermilion-Sage**: merged via **NearSwap** (`t=0.0008`) to unify LunaMaid's balanced composure with Vermilion-Sage's radiant prose.
+2. 🕯️ **Dark-Quill × Mag-Mell-R1**: merged via **NearSwap** (`t=0.0008`) to draw forth mysticism, poetic darkness, and a sense of dreamlike gravity.
+3. ✨ Both intermediate results were then combined with the **Karcher Mean**, a geometric blend ensuring harmony between light and shadow.
 
 ---

+## 🩶 Model Essence
+
+| Trait | Description |
+|:--|:--|
+| 🧠 **Core Nature** | Philosophical, poetic, emotionally resonant |
+| 💬 **Style** | Fluid prose, vivid imagery, articulate reflection |
+| 💫 **Tone** | Dreamlike, balanced between divine warmth and abyssal calm |
+| 🎭 **Best For** | Roleplay, character dialogue, introspection, lore writing, creative prose |
+
+---
+
+## 🧬 Merge Overview
+
+Abyssal-Seraph-12B was created through a multi-stage precision merge designed to blend expressive prose with poetic balance while maintaining model stability.
+
+### 🌙 **Stage 1**
+**✨ Method:** NearSwap (`t = 0.0008`)
+**🩵 Base:** [Vortex5/LunaMaid-12B](https://huggingface.co/Vortex5/LunaMaid-12B)
+**💮 Secondary:** [Vortex5/Vermilion-Sage-12B](https://huggingface.co/Vortex5/Vermilion-Sage-12B)
+
+<details>
+<summary><b>Stage 1 Configuration</b></summary>
+
+```yaml
+name: First
+models:
+  - model: Vortex5/Vermilion-Sage-12B
+merge_method: nearswap
+base_model: Vortex5/LunaMaid-12B
+parameters:
+  t: 0.0008
 dtype: bfloat16
+tokenizer:
+  source: Vortex5/LunaMaid-12B
+```
+</details>
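NearSwap blends the secondary model in only where its weights already sit close to the base model's. As a rough illustration of the idea (this follows the commonly described NearSwap rule of clamping `t / |base − secondary|` to `[0, 1]`; it is a sketch, not mergekit's exact implementation):

```python
import numpy as np

def nearswap(base: np.ndarray, secondary: np.ndarray, t: float) -> np.ndarray:
    """Interpolate toward `secondary` only where the two tensors nearly agree.

    The per-weight interpolation factor is t / |base - secondary|, clipped
    to [0, 1]: near-identical weights are swapped toward `secondary`, while
    weights differing by much more than `t` stay essentially at `base`.
    """
    diff = np.abs(base - secondary)
    weight = np.clip(t / np.maximum(diff, 1e-12), 0.0, 1.0)  # guard against /0
    return base * (1.0 - weight) + secondary * weight

# Toy example: the first pair of weights nearly agree, the second differ strongly.
base = np.array([0.5000, 1.0])
secondary = np.array([0.5001, -1.0])
merged = nearswap(base, secondary, t=0.0008)
# merged[0] -> 0.5001 (fully swapped); merged[1] -> 0.9992 (barely moved)
```

At `t = 0.0008`, only parameters that differ by less than roughly `0.0008` move appreciably, which is why NearSwap merges tend to preserve the base model's overall behavior.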
+
+### 🩶 **Stage 2**
+
+**⚙️ Method:** NearSwap (`t = 0.0008`)
+**🖤 Base:** [Vortex5/Dark-Quill-12B](https://huggingface.co/Vortex5/Dark-Quill-12B)
+**💫 Secondary:** [inflatebot/MN-12B-Mag-Mell-R1](https://huggingface.co/inflatebot/MN-12B-Mag-Mell-R1)
+
+<details>
+<summary><b>Stage 2 Configuration</b></summary>
+
+```yaml
+name: Second
+models:
+  - model: inflatebot/MN-12B-Mag-Mell-R1
+merge_method: nearswap
+base_model: Vortex5/Dark-Quill-12B
+parameters:
+  t: 0.0008
+dtype: bfloat16
+```
+</details>
+
+### 🌌 Stage 3: **Final Merge**
+
+**⚖️ Method:** Karcher Mean (`tol = 1e-9`, `max_iter = 20000`)
+**🜂 Inputs:** First + Second
+**💎 Purpose:** geometrically fuse both intermediates into a single coherent model
+
+<details>
+<summary><b>Final Merge Configuration</b></summary>
+
+```yaml
+models:
+  - model: First
+  - model: Second
 merge_method: karcher
+dtype: bfloat16
 parameters:
+  tol: 1e-9
+  max_iter: 20000
 tokenizer:
+  source: First
 ```
+</details>
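The Karcher (Fréchet) mean is the point minimizing the sum of squared geodesic distances to its inputs, found by a fixed-point iteration; the `tol` and `max_iter` values above bound that iteration. A toy sketch of the idea on the unit sphere (illustrative only, using the standard log/exp-map iteration; mergekit's `karcher` method applies the analogous averaging to full weight tensors):

```python
import numpy as np

def karcher_mean(vectors, tol=1e-9, max_iter=20000):
    """Karcher mean of unit vectors on the sphere via log/exp-map iteration."""
    vs = [v / np.linalg.norm(v) for v in vectors]
    mean = vs[0].copy()
    for _ in range(max_iter):
        tangents = []
        for v in vs:
            dot = np.clip(np.dot(mean, v), -1.0, 1.0)
            theta = np.arccos(dot)            # geodesic distance from mean to v
            if theta < 1e-12:
                tangents.append(np.zeros_like(v))
            else:
                direction = v - dot * mean    # component of v orthogonal to mean
                tangents.append(theta * direction / np.linalg.norm(direction))
        step = np.mean(tangents, axis=0)      # average in the tangent space
        norm = np.linalg.norm(step)
        if norm < tol:                        # converged within tolerance
            break
        # Exp map: walk along the geodesic in the averaged direction.
        mean = np.cos(norm) * mean + np.sin(norm) * (step / norm)
        mean /= np.linalg.norm(mean)
    return mean

# The mean of two orthogonal unit vectors is their geodesic midpoint.
midpoint = karcher_mean([np.array([1.0, 0.0]), np.array([0.0, 1.0])])
# midpoint -> approximately [0.7071, 0.7071]
```

For two inputs this reduces to the geodesic midpoint, mirroring how the final merge balances `First` and `Second` symmetrically.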
+
+## 🌑🜂 **Acknowledgements** 🜂🌑
+
+- ⚙️ **mradermacher**, for *static* and *imatrix* quantization
+- 🜛 **DeathGodlike**, for *EXL3 quants*
+- 🩶 **All original model authors and contributors** whose work made this model possible.
+
+---
+
+**Models merged in this creation:**
+- [Vortex5/LunaMaid-12B](https://huggingface.co/Vortex5/LunaMaid-12B)
+- [Vortex5/Vermilion-Sage-12B](https://huggingface.co/Vortex5/Vermilion-Sage-12B)
+- [Vortex5/Dark-Quill-12B](https://huggingface.co/Vortex5/Dark-Quill-12B)
+- [inflatebot/MN-12B-Mag-Mell-R1](https://huggingface.co/inflatebot/MN-12B-Mag-Mell-R1)