Sumail committed (verified) · Commit 058df87 · Parent: ecc7f97

Update README.md

Files changed (1): README.md +0 -59
README.md CHANGED
@@ -1,59 +0,0 @@
- ---
- base_model:
- - deepnetguy/gemma-111
- - lxsure/gemma_15
- - rwh/gemma2
- - tomaszki/gemma-41
- library_name: transformers
- tags:
- - mergekit
- - merge
-
- ---
- # merge
-
- This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
-
- ## Merge Details
- ### Merge Method
-
- This model was merged using the [DARE](https://arxiv.org/abs/2311.03099)-[TIES](https://arxiv.org/abs/2306.01708) merge method, with [tomaszki/gemma-41](https://huggingface.co/tomaszki/gemma-41) as the base model.
-
- ### Models Merged
-
- The following models were included in the merge:
- * [deepnetguy/gemma-111](https://huggingface.co/deepnetguy/gemma-111)
- * [lxsure/gemma_15](https://huggingface.co/lxsure/gemma_15)
- * [rwh/gemma2](https://huggingface.co/rwh/gemma2)
-
- ### Configuration
-
- The following YAML configuration was used to produce this model:
-
- ```yaml
-
-
- models:
-   - model: tomaszki/gemma-41
-     # No parameters necessary for base model
-   - model: deepnetguy/gemma-111
-     parameters:
-       density: 0.53
-       weight: 0.3
-   - model: rwh/gemma2
-     parameters:
-       density: 0.53
-       weight: 0.4
-   - model: lxsure/gemma_15
-     parameters:
-       density: 0.53
-       weight: 0.3
- merge_method: dare_ties
- base_model: tomaszki/gemma-41
- parameters:
-   int8_mask: true
- dtype: bfloat16
-
-
-
- ```
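For intuition about what `dare_ties` with a `density` of 0.53 does, here is a minimal pure-Python sketch of the two steps the method combines: DARE (randomly drop each task delta with probability `1 - density`, then rescale survivors by `1/density`) and TIES-style sign election (keep only contributions that agree with the majority sign, then average them). This is a toy on flat lists of floats, not real model tensors; `dare` and `dare_ties_merge` are illustrative names invented here, not mergekit APIs, and mergekit's actual implementation differs in detail.

```python
import random


def dare(delta, density, seed=0):
    """DARE: keep each delta entry with probability `density`,
    rescaling the kept entries by 1/density so the expected
    magnitude of the delta is preserved."""
    rng = random.Random(seed)
    return [d / density if rng.random() < density else 0.0 for d in delta]


def dare_ties_merge(base, deltas, weights, density):
    """Toy DARE-TIES: sparsify each model's delta with DARE, elect a
    majority sign per parameter from the weighted contributions, then
    average only the contributions that agree with that sign."""
    sparse = [dare(d, density, seed=i) for i, d in enumerate(deltas)]
    merged = []
    for j, b in enumerate(base):
        contribs = [w * s[j] for w, s in zip(weights, sparse)]
        sign = 1.0 if sum(contribs) >= 0 else -1.0
        kept = [c for c in contribs if c * sign > 0]
        merged.append(b + (sum(kept) / len(kept) if kept else 0.0))
    return merged
```

With `density=1.0` nothing is dropped, so only sign election matters: merging deltas `[1, 1]` and `[1, -1]` into a zero base keeps both values at index 0 but discards the disagreeing `-1` at index 1.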