FallenMerick committed · verified · Commit 1f3dcd3 · 1 parent: cce3176

Update README.md

Files changed (1): README.md (+61, −59)
---
base_model:
- Undi95/BigL-7B
- saishf/Multi-Verse-RP-7B
- KatyTheCutie/LemonadeRP-4.5.3
- icefog72/IceLemonTeaRP-32k-7b
- SanjiWatsuki/Kunoichi-DPO-v2-7B
library_name: transformers
tags:
- mergekit
- merge

---
# Iced Lemon Cookie

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

GGUF quants: https://huggingface.co/FallenMerick/Iced-Lemon-Cookie-7B-GGUF

## Merge Details
### Merge Method

This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [saishf/Multi-Verse-RP-7B](https://huggingface.co/saishf/Multi-Verse-RP-7B) as the base model.
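For intuition, the TIES procedure (trim each task vector by density, elect a per-parameter sign, then merge only the entries that agree with it) can be sketched as a toy, pure-Python implementation. This is an illustration of the idea only, not mergekit's actual code; the function name and the flat-list simplification are assumptions made for the example:

```python
def ties_merge(base, deltas, densities, weights, normalize=True):
    """Toy TIES merge over flat parameter lists (illustrative only).

    Each delta is a task vector (fine-tuned weights minus base). A model's
    `density` trims its delta to the top-k entries by magnitude; signs are
    then elected per-parameter from the weighted sum, entries disagreeing
    with the elected sign are dropped, and the survivors are combined back
    onto the base (averaged when `normalize` is set).
    """
    n = len(base)
    trimmed = []
    for d, dens in zip(deltas, densities):
        k = max(1, round(dens * n))
        thresh = sorted(abs(x) for x in d)[n - k]  # keep top-k magnitudes
        trimmed.append([x if abs(x) >= thresh else 0.0 for x in d])

    merged = list(base)
    for i in range(n):
        vals = [w * t[i] for w, t in zip(weights, trimmed)]
        s = sum(vals)
        sign = (s > 0) - (s < 0)  # elected sign: -1, 0, or +1
        agree = [v for v in vals if v != 0 and ((v > 0) - (v < 0)) == sign]
        if agree:
            total = sum(agree)
            merged[i] += total / len(agree) if normalize else total
    return merged
```

With `density: 1.0` nothing is trimmed and only sign disagreements are resolved; lower densities (as in the config below, 0.4 to 0.8) discard the smaller-magnitude parts of each contributor before the sign election.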
### Models Merged

The following models were included in the merge:
* [Undi95/BigL-7B](https://huggingface.co/Undi95/BigL-7B)
* [KatyTheCutie/LemonadeRP-4.5.3](https://huggingface.co/KatyTheCutie/LemonadeRP-4.5.3)
* [icefog72/IceLemonTeaRP-32k-7b](https://huggingface.co/icefog72/IceLemonTeaRP-32k-7b)
* [SanjiWatsuki/Kunoichi-DPO-v2-7B](https://huggingface.co/SanjiWatsuki/Kunoichi-DPO-v2-7B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: icefog72/IceLemonTeaRP-32k-7b
    parameters:
      density: 1.0
      weight: 1.0
  - model: Undi95/BigL-7B
    parameters:
      density: 0.4
      weight: 1.0
  - model: SanjiWatsuki/Kunoichi-DPO-v2-7B
    parameters:
      density: 0.6
      weight: 1.0
  - model: KatyTheCutie/LemonadeRP-4.5.3
    parameters:
      density: 0.8
      weight: 1.0
merge_method: ties
base_model: saishf/Multi-Verse-RP-7B
parameters:
  normalize: true
dtype: float16
```
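Since the recipe is plain YAML, it can be loaded and sanity-checked programmatically before being handed to mergekit (e.g. via its `mergekit-yaml` CLI). A minimal sketch, assuming PyYAML is installed; the config is embedded as a string here purely for illustration:

```python
import yaml  # PyYAML: pip install pyyaml

config_text = """
models:
  - model: icefog72/IceLemonTeaRP-32k-7b
    parameters: {density: 1.0, weight: 1.0}
  - model: Undi95/BigL-7B
    parameters: {density: 0.4, weight: 1.0}
  - model: SanjiWatsuki/Kunoichi-DPO-v2-7B
    parameters: {density: 0.6, weight: 1.0}
  - model: KatyTheCutie/LemonadeRP-4.5.3
    parameters: {density: 0.8, weight: 1.0}
merge_method: ties
base_model: saishf/Multi-Verse-RP-7B
parameters:
  normalize: true
dtype: float16
"""

config = yaml.safe_load(config_text)

# Basic sanity checks on the recipe before running the merge.
assert config["merge_method"] == "ties"
assert all(0.0 < m["parameters"]["density"] <= 1.0 for m in config["models"])
print([m["model"] for m in config["models"]])
```

Editing the densities or weights here and re-serializing with `yaml.safe_dump` is a convenient way to experiment with variations of the recipe.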