Naphula committed · Commit 6951ded · verified · 1 Parent(s): 1ab798b

Create README.md

---
license: apache-2.0
base_model:
- EldritchLabs/Cthulhu-8B-v1.4
- SicariusSicariiStuff/Assistant_Pepe_8B
language:
- en
library_name: transformers
tags:
- SLERP
- merge
- mergekit
- llama
widget:
- text: "Kekthulhu 8B v1"
  output:
    url: https://cdn-uploads.huggingface.co/production/uploads/68e840caa318194c44ec2a04/8DPzjm4zPfLA9QS-aT4fz.jpeg
---

# Kekthulhu 8B v1

![kek1](https://cdn-uploads.huggingface.co/production/uploads/68e840caa318194c44ec2a04/8DPzjm4zPfLA9QS-aT4fz.jpeg)

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the [SLERP](https://en.wikipedia.org/wiki/Slerp) merge method.
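
SLERP blends each pair of parameter tensors along the great-circle arc between them rather than along a straight line, which preserves the magnitude of the weights better than plain averaging. A minimal NumPy sketch of the idea (the function name and the near-parallel fallback threshold are illustrative, not mergekit's exact code):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flattened weight tensors.

    Falls back to linear interpolation when the vectors are nearly
    parallel, where the spherical formula becomes numerically unstable.
    """
    v0_unit = v0 / (np.linalg.norm(v0) + eps)
    v1_unit = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0_unit, v1_unit), -1.0, 1.0)
    if abs(dot) > 0.9995:            # nearly parallel: plain lerp is fine
        return (1 - t) * v0 + t * v1
    theta = np.arccos(dot)           # angle between the two vectors
    s0 = np.sin((1 - t) * theta) / np.sin(theta)
    s1 = np.sin(t * theta) / np.sin(theta)
    return s0 * v0 + s1 * v1

# With t = 0.5, as in the config below, the result sits halfway
# along the arc between the two parameter tensors.
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(0.5, a, b)
```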
30
+
31
+ ### Models Merged
32
+
33
+ The following models were included in the merge:
34
+ * EldritchLabs/Cthulhu-8B-v1.4
35
+ * SicariusSicariiStuff/Assistant_Pepe_8B
36
+

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: SicariusSicariiStuff/Assistant_Pepe_8B
architecture: MistralForCausalLM
merge_method: slerp
dtype: float32
out_dtype: bfloat16
slices:
  - sources:
      - model: EldritchLabs/Cthulhu-8B-v1.4
        layer_range: [0, 32]
      - model: SicariusSicariiStuff/Assistant_Pepe_8B
        layer_range: [0, 32]
parameters:
  t: 0.5
tokenizer:
  source: union
#chat_template: auto
```
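
Assuming the configuration above is saved locally as `config.yaml`, a merge like this one is typically reproduced with mergekit's `mergekit-yaml` entry point (the output directory name is illustrative):

```shell
# Install mergekit, then run the SLERP merge described by the config.
pip install mergekit
mergekit-yaml config.yaml ./output-model-directory
```

Note that both source models are downloaded from the Hugging Face Hub, so the merge requires enough disk space and RAM for two 8B checkpoints at float32.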