This model is a Mixture of Experts (MoE) merge of the following two models:

| [youri-7b-instruction](https://huggingface.co/rinna/youri-7b-instruction) *1 | 78.94 | 17.20 | 54.04 | 66.35 |
| [youri-7b-chat](https://huggingface.co/rinna/youri-7b-chat) *1 | 80.92 | 25.20 | 53.78 | 67.36 |

*1 Scores are from [rinna's LM Benchmark](https://rinnakk.github.io/research/benchmarks/lm/index.html).

*2 Since rinna's LM Benchmark does not report scores for these template versions, the scores were calculated without specifying a template.
## 🧩 Configuration
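An MoE merge of two source models like the one above is typically driven by a mergekit `mergekit-moe` YAML file. The following is only an illustrative sketch of that format, assuming mergekit's MoE schema; the `gate_mode`, `dtype`, and `positive_prompts` values shown here are placeholder assumptions, not this repository's actual settings.

```yaml
# Hypothetical mergekit-moe configuration sketch (not this model's real config).
base_model: rinna/youri-7b-instruction   # model providing the shared (non-expert) weights
gate_mode: hidden                        # route tokens using hidden-state similarity to the prompts
dtype: bfloat16
experts:
  - source_model: rinna/youri-7b-instruction
    positive_prompts:
      - "Follow the instruction below."   # placeholder routing prompt
  - source_model: rinna/youri-7b-chat
    positive_prompts:
      - "Continue the conversation."      # placeholder routing prompt
```

With a file like this, `mergekit-moe config.yml ./output-model` would produce the merged MoE checkpoint; the repository's own Configuration section records the settings actually used.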