FallenMerick committed
Commit b7fd7f5 · verified · 1 Parent(s): e006ee8

Update README.md

Files changed (1):
  1. README.md +73 -73
README.md CHANGED
@@ -1,73 +1,73 @@
- ![pic](https://huggingface.co/FallenMerick/Bionic-Cetacean-13B/resolve/main/Bionic-Cetacean.jpg)
+ ---
+ license: cc-by-4.0
+ language:
+ - en
+ base_model:
+ - TeeZee/Orca-2-13b_flat
+ - NeverSleep/X-NoroChronos-13B
+ - NeverSleep/Noromaid-13b-v0.3
+ - KatyTheCutie/EstopianMaid-13B
+ - Undi95/MLewdBoros-L2-13B
+ - KoboldAI/LLaMA2-13B-Psyfighter2
+ - KoboldAI/LLaMA2-13B-Erebus-v3
+ library_name: transformers
+ tags:
+ - storywriting
+ - text adventure
+ - creative
+ - story
+ - writing
+ - fiction
+ - roleplaying
+ - rp
+ - mergekit
+ - merge
+ ---
+ ![pic](https://huggingface.co/FallenMerick/Bionic-Cetacean-20B/resolve/main/Bionic-Cetacean.jpg)
+
+ # Bionic-Cetacean-20B
+
+ In the same vein as the legendary [Psyonic-Cetacean-20B](https://huggingface.co/jebcarter/psyonic-cetacean-20B), I have attempted to create a 20B model that is equal parts creative and chaotic, while still remaining coherent enough for roleplaying purposes.
+ <br>
+ The three components used to create [Bionic-Vaquita-13B](https://huggingface.co/FallenMerick/Bionic-Vaquita-13B) were also used to create this stack.
+ <br>
+ Creativity and coherency were the primary focus of the late-stage manual testing that led to selecting this model.
+ <br><br>
+ This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
+
+ ## Merge Details
+ ### Merge Method
+
+ This model was merged using the passthrough merge method.
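A passthrough merge performs no weight arithmetic: the chosen layer slices are copied verbatim from each source model and concatenated into one deeper stack. A toy sketch of the idea, with plain Python lists standing in for transformer layer stacks (the half-open `layer_range` convention is assumed):

```python
# Toy illustration of a passthrough merge: slices of "layers" from two
# source models are concatenated as-is, with no interpolation of weights.
model_a = [f"A{i}" for i in range(40)]  # stand-in for a 40-layer model
model_b = [f"B{i}" for i in range(40)]

# Half-open slices, mirroring mergekit's layer_range convention (assumed):
# [0, 13) from A, [8, 26) from B, [27, 40) from A.
stacked = model_a[0:13] + model_b[8:26] + model_a[27:40]
print(len(stacked))  # 13 + 18 + 13 = 44 layers
```

Note that the ranges overlap numerically (layers 8–12 of the donor architecture appear in both of the first two slices, sourced from different models), which is a common choice in layer-stacked "frankenmerges".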
+ ### Models Merged
+
+ The following models were included in the merge:
+ * FallenMerick/Psyfighter2-Orca2-Erebus3
+ * FallenMerick/XNoroChronos-Orca2-Noromaid
+ * FallenMerick/EstopianMaid-Orca2-MlewdBoros
+
+ ### Configuration
+
+ The following YAML configuration was used to produce this model:
+
+ ```yaml
+ slices:
+   - sources:
+       - model: FallenMerick/Psyfighter2-Orca2-Erebus3
+         layer_range: [0, 13]
+   - sources:
+       - model: FallenMerick/XNoroChronos-Orca2-Noromaid
+         layer_range: [8, 26]
+   - sources:
+       - model: FallenMerick/EstopianMaid-Orca2-MlewdBoros
+         layer_range: [14, 32]
+   - sources:
+       - model: FallenMerick/Psyfighter2-Orca2-Erebus3
+         layer_range: [27, 40]
+ merge_method: passthrough
+ dtype: bfloat16
+ ```
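As a rough sanity check on the "20B" in the name, the slice ranges in the config stack 13 + 18 + 18 + 13 decoder layers. A small sketch (assuming mergekit's half-open `layer_range` convention, where `[start, end)` contributes `end - start` layers):

```python
# Count the decoder layers the passthrough stack produces.
# layer_range is assumed half-open: [start, end) contributes end - start layers.
slices = [
    ("FallenMerick/Psyfighter2-Orca2-Erebus3", 0, 13),
    ("FallenMerick/XNoroChronos-Orca2-Noromaid", 8, 26),
    ("FallenMerick/EstopianMaid-Orca2-MlewdBoros", 14, 32),
    ("FallenMerick/Psyfighter2-Orca2-Erebus3", 27, 40),
]
total_layers = sum(end - start for _, start, end in slices)
print(total_layers)  # 62, versus 40 layers in a single Llama-2-13B
```

Sixty-two layers of Llama-2-13B-sized blocks is what lands the stack at roughly 20B parameters; a merge like this can be reproduced from the config above with mergekit's `mergekit-yaml` command.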