Model | BARTScore (mean) | BERTScore F1 (mean) | Cosine Similarity (all-mpnet-base-v2) | METEOR | ROUGE-1 | ROUGE-2 | ROUGE-L | Samples Tested |
---|---|---|---|---|---|---|---|
GPT-4.1 | -2.5145 | 0.8994 | 0.6749 | 0.3685 | 0.4029 | 0.1678 | 0.3038 | 36,669 |
Gemini-2.5-Flash | -2.6409 | 0.8942 | 0.6234 | 0.3110 | 0.3675 | 0.1422 | 0.2771 | 36,669 |
Gemma-3-4B | -3.0766 | 0.8752 | 0.5134 | 0.2782 | 0.2883 | 0.0764 | 0.2024 | 36,669 |
LLaMA-3.1-8B-Instruct | -2.2332 | 0.9134 | 0.7051 | 0.4569 | 0.4794 | 0.2627 | 0.3940 | 36,669 |
LLaMA-3.2-1B-Instruct | -2.7060 | 0.8821 | 0.5909 | 0.3032 | 0.3298 | 0.1053 | 0.2332 | 36,669 |
LLaMA-3.2-3B-Instruct | -2.2655 | 0.9121 | 0.6958 | 0.4471 | 0.4697 | 0.2514 | 0.3842 | 36,669 |
Phi-4-Mini | -2.2872 | 0.9107 | 0.6891 | 0.4303 | 0.4607 | 0.2420 | 0.3747 | 36,669 |
Qwen-3-1.7B-Instruct | -2.3096 | 0.9096 | 0.6731 | 0.4138 | 0.4508 | 0.2347 | 0.3697 | 36,669 |
Qwen-3-4B-Instruct | -2.2311 | 0.9137 | 0.6972 | 0.4455 | 0.4768 | 0.2636 | 0.3959 | 36,669 |
Qwen-3-8B-Instruct | -2.4970 | 0.8995 | 0.6621 | 0.3792 | 0.4041 | 0.1795 | 0.3121 | 36,669 |
SmolLM-3B-Instruct | -2.7699 | 0.8830 | 0.5428 | 0.3022 | 0.3369 | 0.1036 | 0.2393 | 36,669 |
Virtuoso-Large | -2.4625 | 0.9011 | 0.6676 | 0.3770 | 0.4078 | 0.1809 | 0.3161 | 36,669 |
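The ROUGE-1 column above measures unigram overlap between a model's output and the reference text. As a rough illustration of what that score means (a minimal sketch of ROUGE-1 F1 only — the numbers in the table were presumably produced with a standard scoring library, which also applies tokenization and stemming this sketch omits):

```python
from collections import Counter

def rouge1_f1(reference: str, candidate: str) -> float:
    """Simplified ROUGE-1 F1: unigram-overlap F-measure.

    Whitespace tokenization, no stemming -- an illustration of the
    metric's definition, not the exact pipeline behind the table.
    """
    ref = Counter(reference.lower().split())
    cand = Counter(candidate.lower().split())
    # Clipped overlap: each reference unigram matches at most its count.
    overlap = sum((ref & cand).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

# A candidate sharing half its unigrams with the reference scores 0.5.
print(rouge1_f1("alpha beta gamma delta", "alpha beta xx yy"))  # → 0.5
```

BERTScore and the cosine-similarity column are embedding-based analogues of the same idea: instead of exact unigram matches, they match tokens (or whole sentences) in a learned embedding space, which is why they saturate at much higher values than ROUGE.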