## 🏆 Benchmark Performance

On the comprehensive **Murasaki-ACGN Benchmark**, **Murasaki-14B-v0.2** ranks first overall, and **Murasaki-8B-v0.2** ranks second despite its small 8B parameter count. (All scores were measured on the 4-bit quantized versions.)
| Rank | Model | **Overall Avg** | Short | Long |
| :--- | :--- | :--- | :--- | :--- |
| 🥇 | **Murasaki-14B-v0.2** | **0.8545** | **0.8289** | **0.8801** |
| 🥈 | Murasaki-8B-v0.1 | 0.8523 | 0.8269 | 0.8778 |
| 🥉 | **Murasaki-8B-v0.2** | **0.8522** | **0.8271** | **0.8773** |
| 4 | Gemini-3-Flash-Preview | 0.8512 | 0.8262 | 0.8765 |
| 5 | Sakura-Qwen-2.5-14B | 0.8509 | 0.8282 | 0.8735 |
---

## 🧬 Model Matrix
| Model Name | Type | Size | VRAM (Est.) | Description |
| :--- | :--- | :--- | :--- | :--- |
| **[Murasaki-14B-v0.2](https://huggingface.co/Murasaki-Project/Murasaki-14B-v0.2)** | **BF16** | ~28GB | 32GB+ | **Flagship**. Full-precision weights, best performance. |
| **[Murasaki-14B-v0.2-GGUF](https://huggingface.co/Murasaki-Project/Murasaki-14B-v0.2-GGUF)** | **GGUF** | 6.4~12GB | 12GB+ | **Advanced local**. For users with larger VRAM. |
| **[Murasaki-8B-v0.2](https://huggingface.co/Murasaki-Project/Murasaki-8B-v0.2)** | **BF16** | ~16GB | 24GB+ | **Standard**. Full-precision weights, a balanced choice. |
| **[Murasaki-8B-v0.2-GGUF](https://huggingface.co/Murasaki-Project/Murasaki-8B-v0.2-GGUF)** | **GGUF** | 3.6~6.3GB | 6GB+ | **Recommended/Lightweight**. Broadest compatibility, suitable for most GPUs. |
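The sizes above follow roughly from parameter count × bits per weight (16 bits for BF16; the GGUF ranges span different quantization levels). A minimal sketch of that arithmetic (the function name is illustrative, and real GGUF files add small per-tensor overhead not modeled here):

```python
def estimated_weight_gb(params_billion: float, bits_per_weight: float) -> float:
    """Rough size of the model weights alone, in GB (decimal)."""
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

# BF16 (16 bits/weight): 14B parameters -> ~28 GB, matching the table above.
print(round(estimated_weight_gb(14, 16), 1))  # 28.0
# A 4-bit quant of the 8B model lands near the low end of the 3.6~6.3GB range.
print(round(estimated_weight_gb(8, 4), 1))    # 4.0
```

Actual VRAM use is higher than the weight size because of the KV cache and runtime buffers, which is why the VRAM column leaves headroom.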
## 🛠️ Ecosystem