Update README.md
README.md (CHANGED)
@@ -29,6 +29,8 @@ Luth was trained using full fine-tuning on the Luth-SFT dataset with [Axolotl](h
We used LightEval for evaluation, with custom tasks for the French benchmarks. All models were evaluated with `temperature=0` (greedy decoding).
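
For reference, here is a minimal sketch of what greedy, `temperature=0` generation looks like with `transformers`. This is not the LightEval harness used to produce the scores below; the repository id and the prompt are placeholders, not values taken from this repo.

```python
# Minimal sketch of greedy (temperature=0) generation with transformers.
# NOTE: this is NOT the LightEval evaluation harness used for the reported
# scores; it only illustrates the deterministic decoding setting.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "<org>/Luth-0.6B"  # placeholder: replace with the actual Hub repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

messages = [{"role": "user", "content": "Explique brièvement le théorème de Pythagore."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)

# do_sample=False means greedy decoding, i.e. the temperature=0 setting
# used for the benchmark runs.
output_ids = model.generate(input_ids, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```
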
### Evaluation Visualizations

**French Evaluation:**

@@ -37,6 +39,29 @@ We used LightEval for evaluation, with custom tasks for the French benchmarks. T

### French Benchmark Scores

The best score for each benchmark is underlined.

| Benchmark         | Qwen3-0.6B   | Qwen2.5-0.5B-Instruct | Luth-0.6B-0.3 |
|-------------------|--------------|-----------------------|---------------|
| ifeval-fr         | 44.45        | 22.18                 | <u>48.24</u>  |
| gpqa-diamond-fr   | 28.93        | 23.86                 | <u>33.50</u>  |
| mmlu-fr           | 27.16        | 35.04                 | <u>40.23</u>  |
| math-500-fr       | 29.20        | 10.00                 | <u>43.00</u>  |
| arc-chall-fr      | 31.31        | 28.23                 | <u>33.88</u>  |
| hellaswag-fr      | 25.11        | <u>51.45</u>          | 45.70         |

### English Benchmark Scores

| Benchmark         | Qwen3-0.6B   | Qwen2.5-0.5B-Instruct | Luth-0.6B-0.3 |
|-------------------|--------------|-----------------------|---------------|
| ifeval-en         | <u>57.86</u> | 29.21                 | 53.97         |
| gpqa-diamond-en   | <u>29.80</u> | 26.77                 | 28.28         |
| mmlu-en           | 36.85        | 43.80                 | <u>48.10</u>  |
| math-500-en       | 45.00        | 31.80                 | <u>47.80</u>  |
| arc-chall-en      | 33.62        | 32.17                 | <u>35.92</u>  |
| hellaswag-en      | 42.91        | <u>49.56</u>          | 46.96         |

## Citation
```bibtex