Yobenboben committed on
Commit 779872f · verified · 1 Parent(s): 8c050d4

Update README.md

Files changed (1): README.md +16 -2
README.md CHANGED

```diff
@@ -9,9 +9,23 @@ tags:
 - merge
 
 ---
-# merge
+# ElectraEXTRA
 
-This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
+Like [Electranova](https://huggingface.co/sophosympatheia/Electranova-70B-v1.0) but with a different model, so the thinking works better in it. The writing quality is also better imo.
+
+**Settings:**
+
+Samplers: With thinking: Temp 1.05, top nsigma 0.7; w/o: Temp 1.15, top nsigma 0.7, minP 0.02, smoothing factor 0.3, smoothing curve 2
+
+Sys. prompt: LeCeption or the one from [here](https://files.catbox.moe/b6nwbc.json)
+
+**Quants**
+
+Static:
+https://huggingface.co/mradermacher/L3.3-ElectraEXTRA-R1-70b-GGUF
+
+Weighted/imatrix:
+https://huggingface.co/mradermacher/L3.3-ElectraEXTRA-R1-70b-i1-GGUF
 
 ## Merge Details
 ### Merge Method
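
The two sampler presets from the card (with thinking vs. without) can be kept as a small lookup table so the right one is selected per mode; a minimal sketch, where the key names (`temperature`, `top_nsigma`, `min_p`, `smoothing_factor`, `smoothing_curve`) are illustrative labels matching common frontend settings, not any specific backend's API:

```python
# Hypothetical transcription of the card's sampler settings.
# Key names are illustrative, not tied to a particular inference backend.
PRESETS = {
    "thinking": {
        "temperature": 1.05,
        "top_nsigma": 0.7,
    },
    "no_thinking": {
        "temperature": 1.15,
        "top_nsigma": 0.7,
        "min_p": 0.02,
        "smoothing_factor": 0.3,
        "smoothing_curve": 2,
    },
}

def preset_for(thinking: bool) -> dict:
    """Return the sampler preset for the requested reasoning mode."""
    return PRESETS["thinking" if thinking else "no_thinking"]
```

This keeps the mode switch in one place, e.g. `preset_for(True)` yields the lower-temperature settings used when the model's thinking is enabled.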