Text Generation · Transformers · Safetensors · llama · mergekit · Merge · text-generation-inference
zxc4wewewe committed · verified · commit 2ecd3f0 · 1 parent: f6ceb9b

Update README.md

Files changed (1): README.md +50 -47
README.md CHANGED
@@ -1,47 +1,50 @@
 ---
 base_model:
 - Novaciano/Eurinoferus-3.2-1B
 - cazzz307/Abliterated-Llama-3.2-1B-Instruct
 library_name: transformers
 tags:
 - mergekit
 - merge
-
+datasets:
+- TeichAI/brainstorm-v3.1-grok-4-fast-200x
+- TeichAI/grok-code-fast-1-1000x
+- reedmayhew/Grok-3-reasoning-100x
 ---
 # merge
 
 This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
 
 ## Merge Details
 ### Merge Method
 
 This model was merged using the [Arcee Fusion](https://arcee.ai) merge method with [Novaciano/Eurinoferus-3.2-1B](https://huggingface.co/Novaciano/Eurinoferus-3.2-1B) as the base.
 
 ### Models Merged
 
 The following models were included in the merge:
 * [cazzz307/Abliterated-Llama-3.2-1B-Instruct](https://huggingface.co/cazzz307/Abliterated-Llama-3.2-1B-Instruct)
 
 ### Configuration
 
 The following YAML configuration was used to produce this model:
 
 ```yaml
 dtype: float32
 out_dtype: bfloat16
 merge_method: arcee_fusion
 base_model: Novaciano/Eurinoferus-3.2-1B
 models:
   - model: Novaciano/Eurinoferus-3.2-1B
     parameters:
       weight:
         - filter: mlp
           value: [1, 2]
         - value: 1
   - model: cazzz307/Abliterated-Llama-3.2-1B-Instruct
     parameters:
       weight:
         - filter: lm_head
           value: 1
         - value: [1, 0.5]
 ```
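For intuition, the list-valued `weight` entries in the config (e.g. `value: [1, 2]` on the `mlp` filter) are layer gradients: mergekit interpolates the anchor values linearly across the model's layers, so the first layer gets the first value and the last layer the last. Below is a minimal illustrative re-implementation of that interpolation, not mergekit's actual code; the 16-layer count used in the example matches Llama-3.2-1B.

```python
def gradient_values(spec, num_layers):
    """Expand a mergekit-style gradient spec into one value per layer.

    A scalar applies uniformly; a list of anchors is spread evenly over
    the layer range with linear interpolation between adjacent anchors.
    Illustrative sketch only, not mergekit's internal implementation.
    """
    if not isinstance(spec, list):
        return [float(spec)] * num_layers
    if len(spec) == 1:
        return [float(spec[0])] * num_layers
    segments = len(spec) - 1
    out = []
    for i in range(num_layers):
        # Position of layer i in [0, segments], then locate its segment.
        pos = i / max(num_layers - 1, 1) * segments
        seg = min(int(pos), segments - 1)
        t = pos - seg
        out.append(spec[seg] * (1 - t) + spec[seg + 1] * t)
    return out


# The config's mlp weight [1, 2] across a hypothetical 16-layer model:
# layer 0 gets 1.0, layer 15 gets 2.0, layers in between ramp linearly.
print(gradient_values([1, 2], 16))
```

Under this reading, the config upweights the base model's MLP blocks progressively toward the deeper layers, while the abliterated model's contribution tapers from 1 to 0.5 except at `lm_head`, where it stays at 1.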