CuriousCat29 committed on
Commit 0487dfe · verified · 1 Parent(s): cd4423e

Delete README.md

Files changed (1): README.md +0 -57
README.md DELETED
@@ -1,57 +0,0 @@
- ---
- base_model:
- - Tarek07/Dungeonmaster-V2.2-Expanded-LLaMa-70B
- - Tarek07/Dungeonmaster-V2.4-Expanded-LLaMa-70B
- library_name: transformers
- tags:
- - mergekit
- - merge
-
- ---
- # merged
-
- This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
-
- ## Merge Details
- ### Merge Method
-
- This model was merged using the Passthrough merge method.
-
- ### Models Merged
-
- The following models were included in the merge:
- * [Tarek07/Dungeonmaster-V2.2-Expanded-LLaMa-70B](https://huggingface.co/Tarek07/Dungeonmaster-V2.2-Expanded-LLaMa-70B)
- * [Tarek07/Dungeonmaster-V2.4-Expanded-LLaMa-70B](https://huggingface.co/Tarek07/Dungeonmaster-V2.4-Expanded-LLaMa-70B)
-
- ### Configuration
-
- The following YAML configuration was used to produce this model:
-
- ```yaml
- dtype: bfloat16
- merge_method: passthrough
- modules:
-   default:
-     slices:
-     - sources:
-       - layer_range: [0, 20]
-         model: Tarek07/Dungeonmaster-V2.4-Expanded-LLaMa-70B
-     - sources:
-       - layer_range: [10, 30]
-         model: Tarek07/Dungeonmaster-V2.2-Expanded-LLaMa-70B
-     - sources:
-       - layer_range: [20, 40]
-         model: Tarek07/Dungeonmaster-V2.4-Expanded-LLaMa-70B
-     - sources:
-       - layer_range: [30, 50]
-         model: Tarek07/Dungeonmaster-V2.2-Expanded-LLaMa-70B
-     - sources:
-       - layer_range: [40, 60]
-         model: Tarek07/Dungeonmaster-V2.4-Expanded-LLaMa-70B
-     - sources:
-       - layer_range: [50, 70]
-         model: Tarek07/Dungeonmaster-V2.2-Expanded-LLaMa-70B
-     - sources:
-       - layer_range: [60, 80]
-         model: Tarek07/Dungeonmaster-V2.4-Expanded-LLaMa-70B
- ```
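
For context on the deleted config: a passthrough merge stacks each listed slice verbatim, so the overlapping 20-layer slices above produce a model deeper than either 80-layer donor. A minimal sketch of that arithmetic (the slice ranges are copied from the YAML; everything else here is illustrative, not part of mergekit):

```python
# Slice ranges copied from the passthrough config above (end-exclusive).
slices = [(0, 20), (10, 30), (20, 40), (30, 50),
          (40, 60), (50, 70), (60, 80)]

# Passthrough concatenates every slice, so the merged depth is the sum
# of slice lengths, not the depth of either donor model.
total_layers = sum(end - start for start, end in slices)
print(total_layers)  # 140 layers, versus 80 in each source model
```

A config like this would presumably have been run through mergekit's `mergekit-yaml` CLI to produce the checkpoint this repository hosted.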