---
base_model:
- Vortex5/Vermilion-Sage-12B
- Vortex5/MegaMoon-Karcher-12B
- Vortex5/Dark-Quill-12B
library_name: transformers
tags:
- mergekit
- merge
- roleplay
---
![ComfyUI_00155_](https://cdn-uploads.huggingface.co/production/uploads/6669a3a617b838fda45637b8/KWJHSZF1k-0kKUjgnut5m.png)

# Scarlet-Ink-12B

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the [Linear DELLA](https://arxiv.org/abs/2406.11617) merge method, with [Vortex5/MegaMoon-Karcher-12B](https://huggingface.co/Vortex5/MegaMoon-Karcher-12B) as the base.
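
To make `della_linear` concrete, below is a minimal, illustrative sketch of the idea for a single parameter tensor: each model's delta against the base is stochastically pruned, with drop probabilities that decrease with magnitude rank (spread `epsilon` wide around `1 - density`), survivors are rescaled so the expected delta is preserved, and the weighted sum is scaled by `lambda`. This is a simplified approximation for intuition, not mergekit's exact implementation, and it omits the weight normalization that `normalize: true` applies.

```python
import torch

def della_linear_tensor(base, experts, weights, density=0.5, epsilon=0.4, lam=1.0):
    """Merge one parameter tensor in a della_linear style (illustrative only).

    base:    the tensor from the base model
    experts: matching tensors from the models being merged
    weights: one scalar weight per expert (mergekit also accepts per-layer
             weight lists, interpolated across the model's layers)
    """
    merged_delta = torch.zeros_like(base)
    for tensor, w in zip(experts, weights):
        delta = tensor - base                          # task vector vs. base
        n = delta.numel()
        # Rank entries by |delta|; rank 0 = smallest magnitude.
        ranks = delta.abs().flatten().argsort().argsort().float() / max(n - 1, 1)
        # Drop probability is centred on (1 - density) and spans +/- epsilon,
        # so low-magnitude entries are dropped more often than outliers.
        p_drop = ((1.0 - density) + epsilon * (1.0 - 2.0 * ranks)).clamp(0.0, 1.0)
        p_drop = p_drop.reshape(delta.shape)
        keep = torch.bernoulli(1.0 - p_drop)
        # Rescale survivors so the expected value of the delta is unchanged.
        delta = delta * keep / (1.0 - p_drop).clamp_min(1e-8)
        merged_delta += w * delta
    return base + lam * merged_delta
```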

### Models Merged

The following models were included in the merge:
* [Vortex5/Vermilion-Sage-12B](https://huggingface.co/Vortex5/Vermilion-Sage-12B)
* [Vortex5/Dark-Quill-12B](https://huggingface.co/Vortex5/Dark-Quill-12B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: Vortex5/Vermilion-Sage-12B
    parameters:
      weight: [0.2, 0.5, 0.9, 1.0, 0.95, 0.8, 0.4, 0.2]
      density: 0.55
      epsilon: 0.4
  - model: Vortex5/Dark-Quill-12B
    parameters:
      weight: [0.8, 1.0, 0.9, 0.7, 0.5, 0.4, 0.2, 0.0]
      density: 0.5
      epsilon: 0.4
merge_method: della_linear
base_model: Vortex5/MegaMoon-Karcher-12B
parameters:
  lambda: 0.94
  normalize: true
dtype: bfloat16
tokenizer:
  source: union
```
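
### Usage

The merged model loads like any other `transformers` causal LM. A minimal sketch, assuming the repo id `Vortex5/Scarlet-Ink-12B` from this card's title:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Vortex5/Scarlet-Ink-12B"  # assumed repo id, taken from this card

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype the merge was run in
    device_map="auto",
)

prompt = "Write the opening line of a gothic short story."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```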