Update README.md
README.md CHANGED

@@ -12,6 +12,16 @@ tags:
 - sft
 ---
 
+| Task          | Score  | Metric   |
+|---------------|--------|----------|
+| ARC Challenge | 0.3541 | acc_norm |
+| HellaSwag     | 0.6049 | acc_norm |
+| MMLU          | 0.2730 | acc      |
+| PIQA          | 0.7247 | acc_norm |
+| Winogrande    | 0.6022 | acc      |
+
+This table presents the extracted scores in a clear, tabular format. The "Task" column shows the name of each benchmark, the "Score" column displays the corresponding value, and the "Metric" column indicates whether the score is acc_norm or acc.
+
 # Uploaded model
 
 - **Developed by:** appvoid
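For downstream tooling (leaderboards, regression checks), a table like the one added in this commit can be parsed back into a score map. A minimal sketch, assuming the exact row layout committed above; `parse_benchmark_table` is a hypothetical helper, not part of this repository:

```python
# The markdown table added in this commit, verbatim.
TABLE = """\
| Task          | Score  | Metric   |
|---------------|--------|----------|
| ARC Challenge | 0.3541 | acc_norm |
| HellaSwag     | 0.6049 | acc_norm |
| MMLU          | 0.2730 | acc      |
| PIQA          | 0.7247 | acc_norm |
| Winogrande    | 0.6022 | acc      |
"""

def parse_benchmark_table(md: str) -> dict[str, tuple[float, str]]:
    """Map each task name to its (score, metric) pair."""
    scores = {}
    # Skip the header row and the |---| separator row.
    for line in md.splitlines()[2:]:
        cells = [c.strip() for c in line.strip().strip("|").split("|")]
        if len(cells) == 3:
            task, score, metric = cells
            scores[task] = (float(score), metric)
    return scores

print(parse_benchmark_table(TABLE)["MMLU"])  # (0.273, 'acc')
```

The sketch relies only on the three-column shape shown in the diff, so it would need adjusting if extra columns (e.g. shot count) were later added to the table.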