It is a hierarchical SLERP merge of the following models:
* meta-math/MetaMath-Mistral-7B (Apache 2.0)
* openchat/openchat-3.5-1210 (Apache 2.0)
Here's how we did the hierarchical SLERP merge.

```
[flux-base-optimized]
          ↑
      [mistral]
          |
      [stage-1]-+-[openchat]
          ↑
      [mistral]
          |
      [stage-0]-+-[meta-math]
          ↑
      [mistral]
          |
[openhermes]-+-[neural-chat]
```
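For intuition, here is a minimal sketch of what one SLERP step does to a pair of weight tensors, and how stages chain in the hierarchy above. This is an illustrative NumPy toy, not the actual merge tooling or weights; the tensors, the interpolation factor `t=0.5`, and the stage order are assumptions for the example.

```python
import numpy as np

def slerp(t, a, b, eps=1e-8):
    """Spherical linear interpolation between two flattened weight tensors."""
    a_n = a / (np.linalg.norm(a) + eps)
    b_n = b / (np.linalg.norm(b) + eps)
    dot = np.clip(np.dot(a_n, b_n), -1.0, 1.0)
    omega = np.arccos(dot)          # angle between the two directions
    if omega < eps:                 # nearly parallel: fall back to linear lerp
        return (1.0 - t) * a + t * b
    so = np.sin(omega)
    return np.sin((1.0 - t) * omega) / so * a + np.sin(t * omega) / so * b

# Toy stand-ins for real model weights (placeholders, not actual checkpoints).
rng = np.random.default_rng(0)
openhermes, neural_chat, meta_math, openchat = (rng.normal(size=16) for _ in range(4))

# Each stage SLERPs the previous result with the next model, bottom-up,
# mirroring the diagram: stage-0 -> stage-1 -> flux-base-optimized.
stage_0 = slerp(0.5, openhermes, neural_chat)
stage_1 = slerp(0.5, stage_0, meta_math)
flux_base_optimized = slerp(0.5, stage_1, openchat)
```

In practice a merge like this is done per-layer over full checkpoints (e.g. with a merge toolkit), but the per-tensor math is the same interpolation along the arc between the two weight directions rather than the straight line a plain average would take.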