| _leaderboard | _developer | _model | _uuid | schema_version | evaluation_id | retrieved_timestamp | source_data | evaluation_source_name | evaluation_source_type | source_organization_name | source_organization_url | source_organization_logo_url | evaluator_relationship | model_name | model_id | model_developer | model_inference_platform | evaluation_results | additional_details |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
HF Open LLM v2 | Sakalti | Sakalti/Saba2-3B | b759686f-082e-44b6-9cf8-44a48f66c136 | 0.0.1 | hfopenllm_v2/Sakalti_Saba2-3B/1762652579.864372 | 1762652579.864373 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Sakalti/Saba2-3B | Sakalti/Saba2-3B | Sakalti | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.28651533486704167}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.086} |
HF Open LLM v2 | Sakalti | Sakalti/Neptuno-Alpha | 511ac4a5-6fc8-4338-845d-859d73d57678 | 0.0.1 | hfopenllm_v2/Sakalti_Neptuno-Alpha/1762652579.857697 | 1762652579.857698 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Sakalti/Neptuno-Alpha | Sakalti/Neptuno-Alpha | Sakalti | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3779649108809071}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.397} |
HF Open LLM v2 | Sakalti | Sakalti/ultiima-14B | abf448a9-decf-432d-8883-6e1492a7c040 | 0.0.1 | hfopenllm_v2/Sakalti_ultiima-14B/1762652579.869824 | 1762652579.8698251 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Sakalti/ultiima-14B | Sakalti/ultiima-14B | Sakalti | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5700563394016764}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | Sakalti | Sakalti/tara-3.8B | 695d7b01-14e6-40e4-b398-541e87a812c8 | 0.0.1 | hfopenllm_v2/Sakalti_tara-3.8B/1762652579.86961 | 1762652579.869611 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Sakalti/tara-3.8B | Sakalti/tara-3.8B | Sakalti | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4077403511571519}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 3.821} |
HF Open LLM v2 | Sakalti | Sakalti/SJT-7.5B | 57934f76-c8bd-4264-a3b4-14234dda0719 | 0.0.1 | hfopenllm_v2/Sakalti_SJT-7.5B/1762652579.861058 | 1762652579.861058 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Sakalti/SJT-7.5B | Sakalti/SJT-7.5B | Sakalti | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.42232831110342783}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 7.456} |
HF Open LLM v2 | Sakalti | Sakalti/Saba-Passthrough-2 | df1e7d22-c300-4466-92b7-770078a1dc09 | 0.0.1 | hfopenllm_v2/Sakalti_Saba-Passthrough-2/1762652579.863117 | 1762652579.8631182 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Sakalti/Saba-Passthrough-2 | Sakalti/Saba-Passthrough-2 | Sakalti | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.16913677930114318}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.087} |
HF Open LLM v2 | Sakalti | Sakalti/light-3b-beta | 2a4293ca-2434-4752-a08f-163257e0fde4 | 0.0.1 | hfopenllm_v2/Sakalti_light-3b-beta/1762652579.867648 | 1762652579.867649 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Sakalti/light-3b-beta | Sakalti/light-3b-beta | Sakalti | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5485489612007252}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.397} |
HF Open LLM v2 | Sakalti | Sakalti/SJT-Moe2x7.5B | e95c6f08-ab57-49a2-a83b-6a77b5ab69d9 | 0.0.1 | hfopenllm_v2/Sakalti_SJT-Moe2x7.5B/1762652579.862277 | 1762652579.862278 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Sakalti/SJT-Moe2x7.5B | Sakalti/SJT-Moe2x7.5B | Sakalti | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.41166216749336204}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "MixtralForCausalLM", "params_billions": 13.401} |
HF Open LLM v2 | Sakalti | Sakalti/Saba2-14B-Preview | e3e0180f-bbd8-491a-a41b-54801e9f71de | 0.0.1 | hfopenllm_v2/Sakalti_Saba2-14B-Preview/1762652579.864167 | 1762652579.864168 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Sakalti/Saba2-14B-Preview | Sakalti/Saba2-14B-Preview | Sakalti | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4721871301480073}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | Sakalti | Sakalti/Saka-1.5B | 854baf47-af97-46dd-acfe-a3710976fd57 | 0.0.1 | hfopenllm_v2/Sakalti_Saka-1.5B/1762652579.8647912 | 1762652579.8647912 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Sakalti/Saka-1.5B | Sakalti/Saka-1.5B | Sakalti | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2726266306732802}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.777} |
HF Open LLM v2 | gabrielmbmb | gabrielmbmb/SmolLM-1.7B-Instruct-IFEval | 6e3decae-f2a9-4f71-9511-76d28a675cc2 | 0.0.1 | hfopenllm_v2/gabrielmbmb_SmolLM-1.7B-Instruct-IFEval/1762652580.162997 | 1762652580.162998 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | gabrielmbmb/SmolLM-1.7B-Instruct-IFEval | gabrielmbmb/SmolLM-1.7B-Instruct-IFEval | gabrielmbmb | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.23058595637353335}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 1.711} |
HF Open LLM v2 | ozone-ai | ozone-ai/0x-lite | 9b5b23bc-44bb-4d47-91a2-18e23571743d | 0.0.1 | hfopenllm_v2/ozone-ai_0x-lite/1762652580.432846 | 1762652580.432847 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ozone-ai/0x-lite | ozone-ai/0x-lite | ozone-ai | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7739874643723099}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | dwikitheduck | dwikitheduck/gen-try1 | 8f00112d-767f-4ac5-ae1c-e37781cf7eec | 0.0.1 | hfopenllm_v2/dwikitheduck_gen-try1/1762652580.137886 | 1762652580.137887 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | dwikitheduck/gen-try1 | dwikitheduck/gen-try1 | dwikitheduck | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7522052598217175}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | dwikitheduck | dwikitheduck/gen-inst-1 | 5117b75d-3060-4434-a40d-01c471563685 | 0.0.1 | hfopenllm_v2/dwikitheduck_gen-inst-1/1762652580.1376698 | 1762652580.137671 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | dwikitheduck/gen-inst-1 | dwikitheduck/gen-inst-1 | dwikitheduck | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7750114141588762}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | dwikitheduck | dwikitheduck/gen-try1-notemp | 5bd29754-7f93-42fb-ba9b-7b3a4315bd17 | 0.0.1 | hfopenllm_v2/dwikitheduck_gen-try1-notemp/1762652580.13809 | 1762652580.138091 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | dwikitheduck/gen-try1-notemp | dwikitheduck/gen-try1-notemp | dwikitheduck | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.26270961050013963}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | dwikitheduck | dwikitheduck/gemma-2-2b-id-instruct | 73418e8c-ce10-4ea4-97f6-6f87c2be05a2 | 0.0.1 | hfopenllm_v2/dwikitheduck_gemma-2-2b-id-instruct/1762652580.137409 | 1762652580.1374102 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | dwikitheduck/gemma-2-2b-id-instruct | dwikitheduck/gemma-2-2b-id-instruct | dwikitheduck | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.38785644312646006}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Gemma2ForCausalLM", "params_billions": 2.0} |
HF Open LLM v2 | lkoenig | lkoenig/BBAI_145_ | 0f29b1ac-1943-463a-8a79-a4c0ace371cb | 0.0.1 | hfopenllm_v2/lkoenig_BBAI_145_/1762652580.322459 | 1762652580.32246 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | lkoenig/BBAI_145_ | lkoenig/BBAI_145_ | lkoenig | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.44503473007176514}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | shyamieee | shyamieee/Padma-v7.0 | 81546997-4dda-45ea-81fb-23db1b3b5cd7 | 0.0.1 | hfopenllm_v2/shyamieee_Padma-v7.0/1762652580.51635 | 1762652580.51635 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | shyamieee/Padma-v7.0 | shyamieee/Padma-v7.0 | shyamieee | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3841097177710696}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "MistralForCausalLM", "params_billions": 7.242} |
HF Open LLM v2 | SicariusSicariiStuff | SicariusSicariiStuff/dn_ep02 | f7f3caa2-0468-4dfb-a817-bb5cdc977911 | 0.0.1 | hfopenllm_v2/SicariusSicariiStuff_dn_ep02/1762652579.885246 | 1762652579.885247 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | SicariusSicariiStuff/dn_ep02 | SicariusSicariiStuff/dn_ep02 | SicariusSicariiStuff | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5064340394597445}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | SicariusSicariiStuff | SicariusSicariiStuff/Wingless_Imp_8B | 2304646d-a399-40c0-8577-0bab9ad2ff3c | 0.0.1 | hfopenllm_v2/SicariusSicariiStuff_Wingless_Imp_8B/1762652579.8848069 | 1762652579.8848078 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | SicariusSicariiStuff/Wingless_Imp_8B | SicariusSicariiStuff/Wingless_Imp_8B | SicariusSicariiStuff | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.743012983328679}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | SicariusSicariiStuff | SicariusSicariiStuff/Redemption_Wind_24B | 21216e0b-dc97-4502-ba3d-d47ad1ac73b2 | 0.0.1 | hfopenllm_v2/SicariusSicariiStuff_Redemption_Wind_24B/1762652579.8843782 | 1762652579.884379 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | SicariusSicariiStuff/Redemption_Wind_24B | SicariusSicariiStuff/Redemption_Wind_24B | SicariusSicariiStuff | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.25014517037017336}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "MistralForCausalLM", "params_billions": 23.572} |
HF Open LLM v2 | SicariusSicariiStuff | SicariusSicariiStuff/Impish_Mind_8B | 3a0633f1-070a-416d-a7ab-f41dd44f577d | 0.0.1 | hfopenllm_v2/SicariusSicariiStuff_Impish_Mind_8B/1762652579.8823712 | 1762652579.8823712 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | SicariusSicariiStuff/Impish_Mind_8B | SicariusSicariiStuff/Impish_Mind_8B | SicariusSicariiStuff | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.31791424531354584}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | SicariusSicariiStuff | SicariusSicariiStuff/Winged_Imp_8B | dd1936aa-9b21-466d-b74a-807fafd9f24a | 0.0.1 | hfopenllm_v2/SicariusSicariiStuff_Winged_Imp_8B/1762652579.8845959 | 1762652579.884597 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | SicariusSicariiStuff/Winged_Imp_8B | SicariusSicariiStuff/Winged_Imp_8B | SicariusSicariiStuff | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.743012983328679}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | SicariusSicariiStuff | SicariusSicariiStuff/Eximius_Persona_5B | 98406fba-a2e4-4afd-a121-e33a723d2eb6 | 0.0.1 | hfopenllm_v2/SicariusSicariiStuff_Eximius_Persona_5B/1762652579.881908 | 1762652579.881909 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | SicariusSicariiStuff/Eximius_Persona_5B | SicariusSicariiStuff/Eximius_Persona_5B | SicariusSicariiStuff | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6559850086658954}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 5.821} |
HF Open LLM v2 | SicariusSicariiStuff | SicariusSicariiStuff/Dusk_Rainbow | e8f1d0e1-4086-4645-983b-b9470a22b522 | 0.0.1 | hfopenllm_v2/SicariusSicariiStuff_Dusk_Rainbow/1762652579.881711 | 1762652579.8817122 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | SicariusSicariiStuff/Dusk_Rainbow | SicariusSicariiStuff/Dusk_Rainbow | SicariusSicariiStuff | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3588057465303173}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | SicariusSicariiStuff | SicariusSicariiStuff/Zion_Alpha | 9d6d36b1-f8ad-4cc8-b904-c7e3b0a923e4 | 0.0.1 | hfopenllm_v2/SicariusSicariiStuff_Zion_Alpha/1762652579.885025 | 1762652579.885026 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | SicariusSicariiStuff/Zion_Alpha | SicariusSicariiStuff/Zion_Alpha | SicariusSicariiStuff | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3324024698910003}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "MistralForCausalLM", "params_billions": 7.242} |
HF Open LLM v2 | SicariusSicariiStuff | SicariusSicariiStuff/2B-ad | 31fd60ef-db8f-4785-b486-7a06f1cdf981 | 0.0.1 | hfopenllm_v2/SicariusSicariiStuff_2B-ad/1762652579.88126 | 1762652579.881261 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | SicariusSicariiStuff/2B-ad | SicariusSicariiStuff/2B-ad | SicariusSicariiStuff | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4378903531518593}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Gemma2ForCausalLM", "params_billions": 3.204} |
HF Open LLM v2 | SicariusSicariiStuff | SicariusSicariiStuff/2B_or_not_2B | 983cf552-1ab1-49ba-aab0-1e644e9a7acb | 0.0.1 | hfopenllm_v2/SicariusSicariiStuff_2B_or_not_2B/1762652579.881506 | 1762652579.881506 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | SicariusSicariiStuff/2B_or_not_2B | SicariusSicariiStuff/2B_or_not_2B | SicariusSicariiStuff | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2062316874781136}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "GemmaForCausalLM", "params_billions": 2.506} |
HF Open LLM v2 | Nexusflow | Nexusflow/NexusRaven-V2-13B | f5e5662e-803e-4f1f-82e7-14a2a189ed6d | 0.0.1 | hfopenllm_v2/Nexusflow_NexusRaven-V2-13B/1762652579.782948 | 1762652579.7829492 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Nexusflow/NexusRaven-V2-13B | Nexusflow/NexusRaven-V2-13B | Nexusflow | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.1790781792311068}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 13.0} |
HF Open LLM v2 | ycros | ycros/BagelMIsteryTour-v2-8x7B | a88e7110-2a58-4f47-801f-2a49037eaed6 | 0.0.1 | hfopenllm_v2/ycros_BagelMIsteryTour-v2-8x7B/1762652580.605396 | 1762652580.605397 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ycros/BagelMIsteryTour-v2-8x7B | ycros/BagelMIsteryTour-v2-8x7B | ycros | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6262095683896506}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "MixtralForCausalLM", "params_billions": 46.703} |
HF Open LLM v2 | ycros | ycros/BagelMIsteryTour-v2-8x7B | 2419f2a3-03df-4521-9baa-346e3cb53181 | 0.0.1 | hfopenllm_v2/ycros_BagelMIsteryTour-v2-8x7B/1762652580.605103 | 1762652580.6051042 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ycros/BagelMIsteryTour-v2-8x7B | ycros/BagelMIsteryTour-v2-8x7B | ycros | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.599431730031871}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "float16", "architecture": "MixtralForCausalLM", "params_billions": 46.703} |
HF Open LLM v2 | sakaltcommunity | sakaltcommunity/novablast-preview | 588d2387-29de-41bc-8233-674081948787 | 0.0.1 | hfopenllm_v2/sakaltcommunity_novablast-preview/1762652580.507118 | 1762652580.5071192 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sakaltcommunity/novablast-preview | sakaltcommunity/novablast-preview | sakaltcommunity | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4530279657974175}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 32.764} |
HF Open LLM v2 | sakaltcommunity | sakaltcommunity/sakaltum-7b | 5fdd75fd-6e57-4ba4-8b6a-58998ff88bd9 | 0.0.1 | hfopenllm_v2/sakaltcommunity_sakaltum-7b/1762652580.5073972 | 1762652580.507398 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sakaltcommunity/sakaltum-7b | sakaltcommunity/sakaltum-7b | sakaltcommunity | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2603868845773658}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "MistralForCausalLM", "params_billions": 7.242} |
HF Open LLM v2 | bunnycore | bunnycore/DeepThinker-7B-Sce-v2 | 82cc30d2-9bb6-499f-b522-c66688e07c00 | 0.0.1 | hfopenllm_v2/bunnycore_DeepThinker-7B-Sce-v2/1762652580.0435221 | 1762652580.043523 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/DeepThinker-7B-Sce-v2 | bunnycore/DeepThinker-7B-Sce-v2 | bunnycore | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.16306621985221434}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.613} |
HF Open LLM v2 | bunnycore | bunnycore/DeepThinker-7B-Sce-v1 | 814129ce-9101-4d9b-9e53-9161a010743f | 0.0.1 | hfopenllm_v2/bunnycore_DeepThinker-7B-Sce-v1/1762652580.043317 | 1762652580.043317 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/DeepThinker-7B-Sce-v1 | bunnycore/DeepThinker-7B-Sce-v1 | bunnycore | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.12180015691698028}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.613} |
HF Open LLM v2 | bunnycore | bunnycore/SmolLM2-1.7B-roleplay-lora | ae109e51-8631-4e09-8839-8e9ed74da4c7 | 0.0.1 | hfopenllm_v2/bunnycore_SmolLM2-1.7B-roleplay-lora/1762652580.062429 | 1762652580.06243 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/SmolLM2-1.7B-roleplay-lora | bunnycore/SmolLM2-1.7B-roleplay-lora | bunnycore | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5382075116247114}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "?", "params_billions": 3.423} |
HF Open LLM v2 | bunnycore | bunnycore/Qandora-2.5-7B-Creative | acd82774-f29a-4b19-b08c-693706bb4603 | 0.0.1 | hfopenllm_v2/bunnycore_Qandora-2.5-7B-Creative/1762652580.0529459 | 1762652580.052947 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Qandora-2.5-7B-Creative | bunnycore/Qandora-2.5-7B-Creative | bunnycore | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6803148978044922}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | bunnycore | bunnycore/QwQen-3B-LCoT-R1 | 636c4294-b3d0-42fc-b437-e4a80f70b4d9 | 0.0.1 | hfopenllm_v2/bunnycore_QwQen-3B-LCoT-R1/1762652580.05408 | 1762652580.054081 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/QwQen-3B-LCoT-R1 | bunnycore/QwQen-3B-LCoT-R1 | bunnycore | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.534160471992092}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.085} |
HF Open LLM v2 | bunnycore | bunnycore/QwQen-3B-LCoT | bff23021-087b-4118-ba4d-219a97a1dedc | 0.0.1 | hfopenllm_v2/bunnycore_QwQen-3B-LCoT/1762652580.05384 | 1762652580.0538409 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/QwQen-3B-LCoT | bunnycore/QwQen-3B-LCoT | bunnycore | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6025290673191577}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.397} |
HF Open LLM v2 | bunnycore | bunnycore/Maestro-S1k-7B-Sce | cc0c2de6-5a8d-4229-bd92-a1ad0b95a6b0 | 0.0.1 | hfopenllm_v2/bunnycore_Maestro-S1k-7B-Sce/1762652580.048955 | 1762652580.048955 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Maestro-S1k-7B-Sce | bunnycore/Maestro-S1k-7B-Sce | bunnycore | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2522684255553044}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.613} |
HF Open LLM v2 | bunnycore | bunnycore/FuseQwQen-7B | 06b6f8e3-f3c7-43a6-bb69-e1eb3bd10b7a | 0.0.1 | hfopenllm_v2/bunnycore_FuseQwQen-7B/1762652580.0439281 | 1762652580.043929 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/FuseQwQen-7B | bunnycore/FuseQwQen-7B | bunnycore | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7274509412802475}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | bunnycore | bunnycore/Blabbertron-1.0 | 195957fa-9d4e-49ec-afd9-17125ebcf62d | 0.0.1 | hfopenllm_v2/bunnycore_Blabbertron-1.0/1762652580.0421708 | 1762652580.042172 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Blabbertron-1.0 | bunnycore/Blabbertron-1.0 | bunnycore | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7433376773627309}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.613} |
HF Open LLM v2 | bunnycore | bunnycore/Blabbertron-1.1 | 9fbe416c-de18-4f83-812c-f48071a49917 | 0.0.1 | hfopenllm_v2/bunnycore_Blabbertron-1.1/1762652580.0424142 | 1762652580.0424151 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Blabbertron-1.1 | bunnycore/Blabbertron-1.1 | bunnycore | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7265267268625026}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.613} |
HF Open LLM v2 | bunnycore | bunnycore/QandoraExp-7B | 744f9f56-fbb4-450f-9427-35e6e49ca014 | 0.0.1 | hfopenllm_v2/bunnycore_QandoraExp-7B/1762652580.0531762 | 1762652580.0531762 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/QandoraExp-7B | bunnycore/QandoraExp-7B | bunnycore | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7509064836855099}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | bunnycore | bunnycore/QandoraExp-7B-Persona | 4e9dc7ca-f4f2-4c1f-b532-628a8d9d515b | 0.0.1 | hfopenllm_v2/bunnycore_QandoraExp-7B-Persona/1762652580.0533981 | 1762652580.053399 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/QandoraExp-7B-Persona | bunnycore/QandoraExp-7B-Persona | bunnycore | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6246858335882126}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | bunnycore | bunnycore/Qwen2.5-7B-Instruct-Fusion | 6d88de9c-062d-4858-95ef-a05f6a29b6c3 | 0.0.1 | hfopenllm_v2/bunnycore_Qwen2.5-7B-Instruct-Fusion/1762652580.0585442 | 1762652580.0585449 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Qwen2.5-7B-Instruct-Fusion | bunnycore/Qwen2.5-7B-Instruct-Fusion | bunnycore | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6962016338869754}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | bunnycore | bunnycore/Qwen2.5-7B-Instruct-Merge-Stock-v0.1 | fe31c10e-8231-49f4-afb3-e2588396c032 | 0.0.1 | hfopenllm_v2/bunnycore_Qwen2.5-7B-Instruct-Merge-Stock-v0.1/1762652580.0587678 | 1762652580.058769 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Qwen2.5-7B-Instruct-Merge-Stock-v0.1 | bunnycore/Qwen2.5-7B-Instruct-Merge-Stock-v0.1 | bunnycore | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7509064836855099}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.613} |
HF Open LLM v2 | bunnycore | bunnycore/FuseCyberMix-Qwen-2.5-7B-Instruct | d851bc0d-5f11-40f6-982c-39809dffe946 | 0.0.1 | hfopenllm_v2/bunnycore_FuseCyberMix-Qwen-2.5-7B-Instruct/1762652580.043724 | 1762652580.043725 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/FuseCyberMix-Qwen-2.5-7B-Instruct | bunnycore/FuseCyberMix-Qwen-2.5-7B-Instruct | bunnycore | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7019220113742648}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | bunnycore | bunnycore/Tulu-3.1-8B-SuperNova | cd979586-e334-4964-b06c-f33c66f09c0e | 0.0.1 | hfopenllm_v2/bunnycore_Tulu-3.1-8B-SuperNova/1762652580.062763 | 1762652580.0627651 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Tulu-3.1-8B-SuperNova | bunnycore/Tulu-3.1-8B-SuperNova | bunnycore | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.8193748143813969}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | bunnycore | bunnycore/SmolLM2-1.7-Persona | 5249691a-3672-4ccd-98dd-d9b937bca750 | 0.0.1 | hfopenllm_v2/bunnycore_SmolLM2-1.7-Persona/1762652580.062155 | 1762652580.062156 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/SmolLM2-1.7-Persona | bunnycore/SmolLM2-1.7-Persona | bunnycore | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5465254413844156}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 1.711} |
HF Open LLM v2 | bunnycore | bunnycore/QandoraExp-7B-v2 | 85bc0517-382e-4a4c-ac31-ee6de74d2c8f | 0.0.1 | hfopenllm_v2/bunnycore_QandoraExp-7B-v2/1762652580.053621 | 1762652580.053621 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/QandoraExp-7B-v2 | bunnycore/QandoraExp-7B-v2 | bunnycore | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5606889719278182}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | sunbaby | sunbaby/BrainCog-8B-0.1-Instruct | 96412e92-8a74-429b-8014-30a526521356 | 0.0.1 | hfopenllm_v2/sunbaby_BrainCog-8B-0.1-Instruct/1762652580.549814 | 1762652580.549815 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sunbaby/BrainCog-8B-0.1-Instruct | sunbaby/BrainCog-8B-0.1-Instruct | sunbaby | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4253004250943053}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | gupta-tanish | gupta-tanish/llama-7b-dpo-baseline | 1b962cb9-8754-40ab-b41a-b7cdf1fa3de1 | 0.0.1 | hfopenllm_v2/gupta-tanish_llama-7b-dpo-baseline/1762652580.1871748 | 1762652580.1871748 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | gupta-tanish/llama-7b-dpo-baseline | gupta-tanish/llama-7b-dpo-baseline | gupta-tanish | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.26930433472076315}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 6.738} |
HF Open LLM v2 | CarrotAI | CarrotAI/Llama-3.2-Rabbit-Ko-3B-Instruct | 8c56b973-d5cb-48b6-a43e-ad50769b1f40 | 0.0.1 | hfopenllm_v2/CarrotAI_Llama-3.2-Rabbit-Ko-3B-Instruct/1762652579.501917 | 1762652579.5019178 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | CarrotAI/Llama-3.2-Rabbit-Ko-3B-Instruct | CarrotAI/Llama-3.2-Rabbit-Ko-3B-Instruct | CarrotAI | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7198821349574684}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 3.213} |
HF Open LLM v2 | CarrotAI | CarrotAI/Llama-3.2-Rabbit-Ko-3B-Instruct-2412 | 41809335-e00c-4911-bc08-6edd71891585 | 0.0.1 | hfopenllm_v2/CarrotAI_Llama-3.2-Rabbit-Ko-3B-Instruct-2412/1762652579.5021691 | 1762652579.50217 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | CarrotAI/Llama-3.2-Rabbit-Ko-3B-Instruct-2412 | CarrotAI/Llama-3.2-Rabbit-Ko-3B-Instruct-2412 | CarrotAI | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.47818233398493776}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 3.213} |
HF Open LLM v2 | LimYeri | LimYeri/CodeMind-Llama3-8B-unsloth_v4-one-DPO-merged | d020a655-1cc0-49e9-9db1-f8b871babd5c | 0.0.1 | hfopenllm_v2/LimYeri_CodeMind-Llama3-8B-unsloth_v4-one-DPO-merged/1762652579.733827 | 1762652579.733829 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | LimYeri/CodeMind-Llama3-8B-unsloth_v4-one-DPO-merged | LimYeri/CodeMind-Llama3-8B-unsloth_v4-one-DPO-merged | LimYeri | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6492406813920397}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | unsloth | unsloth/Phi-3-mini-4k-instruct | 36d52065-1de2-4661-bf23-85276a8ede2f | 0.0.1 | hfopenllm_v2/unsloth_Phi-3-mini-4k-instruct/1762652580.579097 | 1762652580.5790982 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | unsloth/Phi-3-mini-4k-instruct | unsloth/Phi-3-mini-4k-instruct | unsloth | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.544027624480822}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "float16", "architecture": "MistralForCausalLM", "params_billions": 3.821} |
HF Open LLM v2 | unsloth | unsloth/Llama-3.2-1B-Instruct-no-system-message | d8d52ed0-2eb6-4be3-9e4e-346a6b19ceca | 0.0.1 | hfopenllm_v2/unsloth_Llama-3.2-1B-Instruct-no-system-message/1762652580.578731 | 1762652580.578733 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | unsloth/Llama-3.2-1B-Instruct-no-system-message | unsloth/Llama-3.2-1B-Instruct-no-system-message | unsloth | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5649853499824908}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 1.236} |
HF Open LLM v2 | unsloth | unsloth/Llama-3.2-1B-Instruct | 25ec2dbd-465f-40a9-80f0-e4001e621303 | 0.0.1 | hfopenllm_v2/unsloth_Llama-3.2-1B-Instruct/1762652580.578335 | 1762652580.578335 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | unsloth/Llama-3.2-1B-Instruct | unsloth/Llama-3.2-1B-Instruct | unsloth | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5809973093613834}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 1.236} |
HF Open LLM v2 | Spestly | Spestly/Atlas-Pro-1.5B-Preview | 8282705f-6b69-40c2-825d-8e0c72756083 | 0.0.1 | hfopenllm_v2/Spestly_Atlas-Pro-1.5B-Preview/1762652579.891309 | 1762652579.89131 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Spestly/Atlas-Pro-1.5B-Preview | Spestly/Atlas-Pro-1.5B-Preview | Spestly | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2429509257658568}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.777} |
HF Open LLM v2 | Spestly | Spestly/Athena-1-3B | 29d6834e-38f7-472f-86be-79a8fce03989 | 0.0.1 | hfopenllm_v2/Spestly_Athena-1-3B/1762652579.8910668 | 1762652579.891068 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Spestly/Athena-1-3B | Spestly/Athena-1-3B | Spestly | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5569167586448401}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.086} |
HF Open LLM v2 | Spestly | Spestly/Atlas-Pro-7B-Preview | 57a36976-0868-462e-ab57-3addef7ea2f9 | 0.0.1 | hfopenllm_v2/Spestly_Atlas-Pro-7B-Preview/1762652579.891519 | 1762652579.89152 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Spestly/Atlas-Pro-7B-Preview | Spestly/Atlas-Pro-7B-Preview | Spestly | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.31541642840995227}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | Darkknight535 | Darkknight535/OpenCrystal-12B-L3 | 8edb0a0d-994b-4b97-b9a7-7f46ba0e7365 | 0.0.1 | hfopenllm_v2/Darkknight535_OpenCrystal-12B-L3/1762652579.5369642 | 1762652579.5369651 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Darkknight535/OpenCrystal-12B-L3 | Darkknight535/OpenCrystal-12B-L3 | Darkknight535 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4070909630890482}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 11.52} |
HF Open LLM v2 | yasserrmd | yasserrmd/Coder-GRPO-3B | 425372c0-e096-4bdf-8f6c-eb2d5b36bb07 | 0.0.1 | hfopenllm_v2/yasserrmd_Coder-GRPO-3B/1762652580.6044621 | 1762652580.604463 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | yasserrmd/Coder-GRPO-3B | yasserrmd/Coder-GRPO-3B | yasserrmd | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6207640172520024}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.086} |
HF Open LLM v2 | yasserrmd | yasserrmd/Text2SQL-1.5B | 42a767cf-7d29-486d-b83e-fcfa51f048c1 | 0.0.1 | hfopenllm_v2/yasserrmd_Text2SQL-1.5B/1762652580.604796 | 1762652580.6047978 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | yasserrmd/Text2SQL-1.5B | yasserrmd/Text2SQL-1.5B | yasserrmd | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2857407235025289}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.544} |
HF Open LLM v2 | rubenroy | rubenroy/Geneva-12B-GCv2-5m | e6649e50-54ba-4788-a3b4-5aa3d6e8aed8 | 0.0.1 | hfopenllm_v2/rubenroy_Geneva-12B-GCv2-5m/1762652580.501345 | 1762652580.501346 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | rubenroy/Geneva-12B-GCv2-5m | rubenroy/Geneva-12B-GCv2-5m | rubenroy | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2586381911106974}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "MistralForCausalLM", "params_billions": 12.248} |
HF Open LLM v2 | rubenroy | rubenroy/Zurich-14B-GCv2-5m | f9dca394-e108-48f3-a45d-a282f7b39098 | 0.0.1 | hfopenllm_v2/rubenroy_Zurich-14B-GCv2-5m/1762652580.5018299 | 1762652580.5018299 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | rubenroy/Zurich-14B-GCv2-5m | rubenroy/Zurich-14B-GCv2-5m | rubenroy | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6163679038285084}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | rubenroy | rubenroy/Gilgamesh-72B | b577bd26-a9f9-4a50-bd2b-f47bc5222748 | 0.0.1 | hfopenllm_v2/rubenroy_Gilgamesh-72B/1762652580.5016088 | 1762652580.5016088 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | rubenroy/Gilgamesh-72B | rubenroy/Gilgamesh-72B | rubenroy | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.8486006019583594}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 72.706} |
HF Open LLM v2 | teknium | teknium/OpenHermes-7B | 089f10dc-8be6-4595-a0b3-7d5bb4fc13fa | 0.0.1 | hfopenllm_v2/teknium_OpenHermes-7B/1762652580.5548952 | 1762652580.5548952 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | teknium/OpenHermes-7B | teknium/OpenHermes-7B | teknium | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.1812513021006485}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 7.0} |
HF Open LLM v2 | teknium | teknium/OpenHermes-13B | 55d876b7-159e-4c76-848b-1480b4c2f4a2 | 0.0.1 | hfopenllm_v2/teknium_OpenHermes-13B/1762652580.5542011 | 1762652580.554202 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | teknium/OpenHermes-13B | teknium/OpenHermes-13B | teknium | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2668065178171696}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 13.0} |
HF Open LLM v2 | Casual-Autopsy | Casual-Autopsy/L3-Umbral-Mind-RP-v2.0-8B | da5c1edf-bd74-48a3-ad76-a4bd89539b7f | 0.0.1 | hfopenllm_v2/Casual-Autopsy_L3-Umbral-Mind-RP-v2.0-8B/1762652579.502389 | 1762652579.502389 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Casual-Autopsy/L3-Umbral-Mind-RP-v2.0-8B | Casual-Autopsy/L3-Umbral-Mind-RP-v2.0-8B | Casual-Autopsy | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7122634609502786}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | ssmits | ssmits/Qwen2.5-95B-Instruct | 1c441afa-b8ac-4ff9-b881-e75f8765dd8e | 0.0.1 | hfopenllm_v2/ssmits_Qwen2.5-95B-Instruct/1762652580.535626 | 1762652580.5356271 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ssmits/Qwen2.5-95B-Instruct | ssmits/Qwen2.5-95B-Instruct | ssmits | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.8431051831363006}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 94.648} |
HF Open LLM v2 | BlackBeenie | BlackBeenie/Llama-3.1-8B-OpenO1-SFT-v0.1 | b298e0fc-f4fb-4464-beb8-45f8b5f35653 | 0.0.1 | hfopenllm_v2/BlackBeenie_Llama-3.1-8B-OpenO1-SFT-v0.1/1762652579.495378 | 1762652579.495378 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | BlackBeenie/Llama-3.1-8B-OpenO1-SFT-v0.1 | BlackBeenie/Llama-3.1-8B-OpenO1-SFT-v0.1 | BlackBeenie | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5124037553690873}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | BlackBeenie | BlackBeenie/Bloslain-8B-v0.2 | 160fb625-9c1c-40c1-ab93-7d9f7a2220d2 | 0.0.1 | hfopenllm_v2/BlackBeenie_Bloslain-8B-v0.2/1762652579.495104 | 1762652579.495104 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | BlackBeenie/Bloslain-8B-v0.2 | BlackBeenie/Bloslain-8B-v0.2 | BlackBeenie | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5023371321427147}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | Mxode | Mxode/NanoLM-1B-Instruct-v1.1 | 2e482de2-60ca-4758-9de8-4482e42a5b7a | 0.0.1 | hfopenllm_v2/Mxode_NanoLM-1B-Instruct-v1.1/1762652579.764964 | 1762652579.764964 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Mxode/NanoLM-1B-Instruct-v1.1 | Mxode/NanoLM-1B-Instruct-v1.1 | Mxode | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.23952889444451833}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.076} |
HF Open LLM v2 | Mxode | Mxode/NanoLM-0.3B-Instruct-v1 | 3c08189e-294e-4682-a7e0-e73a8d498fb2 | 0.0.1 | hfopenllm_v2/Mxode_NanoLM-0.3B-Instruct-v1/1762652579.764268 | 1762652579.764269 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Mxode/NanoLM-0.3B-Instruct-v1 | Mxode/NanoLM-0.3B-Instruct-v1 | Mxode | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.1536744726215331}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 0.315} |
HF Open LLM v2 | Mxode | Mxode/NanoLM-1B-Instruct-v2 | d7d1e48d-86af-4f65-803b-30fff69c78b5 | 0.0.1 | hfopenllm_v2/Mxode_NanoLM-1B-Instruct-v2/1762652579.765177 | 1762652579.7651782 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Mxode/NanoLM-1B-Instruct-v2 | Mxode/NanoLM-1B-Instruct-v2 | Mxode | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2629844368497808}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.076} |
HF Open LLM v2 | Mxode | Mxode/NanoLM-0.3B-Instruct-v2 | 43ce0bee-e8ee-417d-be0d-841d6e26b330 | 0.0.1 | hfopenllm_v2/Mxode_NanoLM-0.3B-Instruct-v2/1762652579.7647529 | 1762652579.7647538 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Mxode/NanoLM-0.3B-Instruct-v2 | Mxode/NanoLM-0.3B-Instruct-v2 | Mxode | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.1667885654507817}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 0.315} |
HF Open LLM v2 | Mxode | Mxode/NanoLM-0.3B-Instruct-v1.1 | 0f9eeb32-85fb-4778-8618-436aa4f891ad | 0.0.1 | hfopenllm_v2/Mxode_NanoLM-0.3B-Instruct-v1.1/1762652579.764531 | 1762652579.764531 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Mxode/NanoLM-0.3B-Instruct-v1.1 | Mxode/NanoLM-0.3B-Instruct-v1.1 | Mxode | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.17827918810977095}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 0.315} |
HF Open LLM v2 | sam-paech | sam-paech/Darkest-muse-v1 | dae1ceb0-97b1-4285-b9db-912d7b4b01c7 | 0.0.1 | hfopenllm_v2/sam-paech_Darkest-muse-v1/1762652580.508588 | 1762652580.508589 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sam-paech/Darkest-muse-v1 | sam-paech/Darkest-muse-v1 | sam-paech | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7344202272193336}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Gemma2ForCausalLM", "params_billions": 10.159} |
HF Open LLM v2 | sam-paech | sam-paech/Delirium-v1 | 78dd5568-0d0d-4cc5-ad1a-bfba857c827e | 0.0.1 | hfopenllm_v2/sam-paech_Delirium-v1/1762652580.508875 | 1762652580.508876 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sam-paech/Delirium-v1 | sam-paech/Delirium-v1 | sam-paech | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7207564816908026}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Gemma2ForCausalLM", "params_billions": 9.242} |
HF Open LLM v2 | sam-paech | sam-paech/Quill-v1 | 248541b3-aeae-429d-93ae-06cc3bc82cd8 | 0.0.1 | hfopenllm_v2/sam-paech_Quill-v1/1762652580.5091672 | 1762652580.5091681 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sam-paech/Quill-v1 | sam-paech/Quill-v1 | sam-paech | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.712213593265868}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "Gemma2ForCausalLM", "params_billions": 9.242} |
HF Open LLM v2 | paulml | paulml/ECE-ILAB-Q1 | 83024ec4-e4a4-4dd3-adf4-654c90c3a271 | 0.0.1 | hfopenllm_v2/paulml_ECE-ILAB-Q1/1762652580.440484 | 1762652580.440484 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | paulml/ECE-ILAB-Q1 | paulml/ECE-ILAB-Q1 | paulml | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7864521691334547}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 72.706} |
HF Open LLM v2 | MarinaraSpaghetti | MarinaraSpaghetti/Nemomix-v4.0-12B | aeac3ed0-e93b-4fb2-bdd5-1fd06ccd3338 | 0.0.1 | hfopenllm_v2/MarinaraSpaghetti_Nemomix-v4.0-12B/1762652579.746819 | 1762652579.7468212 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | MarinaraSpaghetti/Nemomix-v4.0-12B | MarinaraSpaghetti/Nemomix-v4.0-12B | MarinaraSpaghetti | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5574664113441224}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "MistralForCausalLM", "params_billions": 12.248} |
HF Open LLM v2 | MarinaraSpaghetti | MarinaraSpaghetti/NemoReRemix-12B | ac67a9d9-0f5a-4891-a9e5-2a924fbf4f72 | 0.0.1 | hfopenllm_v2/MarinaraSpaghetti_NemoReRemix-12B/1762652579.7463942 | 1762652579.746399 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | MarinaraSpaghetti/NemoReRemix-12B | MarinaraSpaghetti/NemoReRemix-12B | MarinaraSpaghetti | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.33425089872649016}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "MistralForCausalLM", "params_billions": 12.248} |
HF Open LLM v2 | ruizhe1217 | ruizhe1217/sft-s1-qwen-0.5b | fd0e4ea3-ed10-487d-85d7-df5669bc8edc | 0.0.1 | hfopenllm_v2/ruizhe1217_sft-s1-qwen-0.5b/1762652580.502058 | 1762652580.502059 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ruizhe1217/sft-s1-qwen-0.5b | ruizhe1217/sft-s1-qwen-0.5b | ruizhe1217 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.27487510915482033}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 0.494} |
HF Open LLM v2 | 01-ai | 01-ai/Yi-9B | 0ec59add-f9a9-4dbd-8a83-c6aec0b8ad21 | 0.0.1 | hfopenllm_v2/01-ai_Yi-9B/1762652579.46702 | 1762652579.4670231 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | 01-ai/Yi-9B | 01-ai/Yi-9B | 01-ai | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2708779372066118}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.829} |
HF Open LLM v2 | 01-ai | 01-ai/Yi-1.5-34B-32K | 0d91a153-1b6b-4891-8722-a5c7e372ba64 | 0.0.1 | hfopenllm_v2/01-ai_Yi-1.5-34B-32K/1762652579.463656 | 1762652579.463657 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | 01-ai/Yi-1.5-34B-32K | 01-ai/Yi-1.5-34B-32K | 01-ai | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3118691737922047}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 34.389} |
HF Open LLM v2 | 01-ai | 01-ai/Yi-34B-200K | fb2ebd9a-f5b8-42a2-9b58-e6f0e7d9b98a | 0.0.1 | hfopenllm_v2/01-ai_Yi-34B-200K/1762652579.465893 | 1762652579.465894 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | 01-ai/Yi-34B-200K | 01-ai/Yi-34B-200K | 01-ai | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.15424850507763843}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 34.389} |
HF Open LLM v2 | 01-ai | 01-ai/Yi-1.5-34B-Chat-16K | 2192007d-1f6e-4f74-b518-7448ef3a896e | 0.0.1 | hfopenllm_v2/01-ai_Yi-1.5-34B-Chat-16K/1762652579.464125 | 1762652579.4641259 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | 01-ai/Yi-1.5-34B-Chat-16K | 01-ai/Yi-1.5-34B-Chat-16K | 01-ai | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.456449997118756}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 34.389} |
HF Open LLM v2 | 01-ai | 01-ai/Yi-6B-200K | 6b720e8b-aab8-4ba4-9bce-e7a1de3cfb86 | 0.0.1 | hfopenllm_v2/01-ai_Yi-6B-200K/1762652579.4665558 | 1762652579.466557 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | 01-ai/Yi-6B-200K | 01-ai/Yi-6B-200K | 01-ai | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.08433068702154728}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 6.061} |
HF Open LLM v2 | 01-ai | 01-ai/Yi-9B-200K | 4299df04-495a-4687-b143-96b1b562d5e8 | 0.0.1 | hfopenllm_v2/01-ai_Yi-9B-200K/1762652579.467233 | 1762652579.467233 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | 01-ai/Yi-9B-200K | 01-ai/Yi-9B-200K | 01-ai | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.23270921155866434}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.829} |
HF Open LLM v2 | 01-ai | 01-ai/Yi-1.5-6B-Chat | 3452e57f-3023-4e2e-ad84-b09e409fe334 | 0.0.1 | hfopenllm_v2/01-ai_Yi-1.5-6B-Chat/1762652579.464571 | 1762652579.464572 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | 01-ai/Yi-1.5-6B-Chat | 01-ai/Yi-1.5-6B-Chat | 01-ai | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5145270105542183}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 6.061} |
HF Open LLM v2 | 01-ai | 01-ai/Yi-1.5-9B-Chat-16K | 090c9691-4b7e-4a98-b9a2-644e21797be4 | 0.0.1 | hfopenllm_v2/01-ai_Yi-1.5-9B-Chat-16K/1762652579.465471 | 1762652579.465471 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | 01-ai/Yi-1.5-9B-Chat-16K | 01-ai/Yi-1.5-9B-Chat-16K | 01-ai | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4214040966856829}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.829} |
HF Open LLM v2 | 01-ai | 01-ai/Yi-Coder-9B-Chat | ef0cc3a5-0d62-4a45-b0c7-28a6f7dfdac4 | 0.0.1 | hfopenllm_v2/01-ai_Yi-Coder-9B-Chat/1762652579.4674509 | 1762652579.4674518 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | 01-ai/Yi-Coder-9B-Chat | 01-ai/Yi-Coder-9B-Chat | 01-ai | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4817041006750976}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.829} |
HF Open LLM v2 | 01-ai | 01-ai/Yi-1.5-6B | 1a1f1263-96b6-4e32-a2c8-6c0d6b47dff9 | 0.0.1 | hfopenllm_v2/01-ai_Yi-1.5-6B/1762652579.464354 | 1762652579.464355 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | 01-ai/Yi-1.5-6B | 01-ai/Yi-1.5-6B | 01-ai | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.26166017278598563}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 6.061} |
HF Open LLM v2 | 01-ai | 01-ai/Yi-1.5-34B-Chat | e335874b-9b3e-4966-a7e0-22e9d16f8324 | 0.0.1 | hfopenllm_v2/01-ai_Yi-1.5-34B-Chat/1762652579.463886 | 1762652579.4638872 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | 01-ai/Yi-1.5-34B-Chat | 01-ai/Yi-1.5-34B-Chat | 01-ai | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6066758423205982}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 34.389} |
HF Open LLM v2 | 01-ai | 01-ai/Yi-1.5-9B-32K | df9d9d44-daa1-4e61-9b46-192380043889 | 0.0.1 | hfopenllm_v2/01-ai_Yi-1.5-9B-32K/1762652579.4649951 | 1762652579.464996 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | 01-ai/Yi-1.5-9B-32K | 01-ai/Yi-1.5-9B-32K | 01-ai | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.23031113002389217}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.829} |
HF Open LLM v2 | 01-ai | 01-ai/Yi-1.5-34B | 8409c158-ef12-4e6c-8a1d-7be2084b3446 | 0.0.1 | hfopenllm_v2/01-ai_Yi-1.5-34B/1762652579.4633532 | 1762652579.463354 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | 01-ai/Yi-1.5-34B | 01-ai/Yi-1.5-34B | 01-ai | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2841172533322695}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 34.389} |
HF Open LLM v2 | 01-ai | 01-ai/Yi-6B | 297419fa-855c-4eae-ad7c-3cf4a0262450 | 0.0.1 | hfopenllm_v2/01-ai_Yi-6B/1762652579.4663382 | 1762652579.4663382 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | 01-ai/Yi-6B | 01-ai/Yi-6B | 01-ai | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.28933784580468713}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 6.061} |
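The `evaluation_results` column in each row above is a JSON list of scored benchmarks (truncated to the first entry in this preview). Below is a minimal Python sketch of how one such record could be parsed and normalized, assuming only the field layout visible in the rows (`evaluation_name`, `metric_config` with `min_score`/`max_score`/`lower_is_better`, and `score_details.score`); the sample score value is hypothetical, not taken from any row.

```python
import json

# Illustrative record following the evaluation_results schema visible in the
# rows above; the 0.42 score is a made-up placeholder, not real leaderboard data.
raw = json.dumps([
    {
        "evaluation_name": "IFEval",
        "metric_config": {
            "evaluation_description": "Accuracy on IFEval",
            "lower_is_better": False,
            "score_type": "continuous",
            "min_score": 0,
            "max_score": 1,
        },
        "score_details": {"score": 0.42},
    }
])

results = json.loads(raw)
for entry in results:
    cfg = entry["metric_config"]
    score = entry["score_details"]["score"]
    # Normalize into [0, 1] using the declared bounds, flipping when lower is better.
    span = cfg["max_score"] - cfg["min_score"]
    norm = (score - cfg["min_score"]) / span
    if cfg["lower_is_better"]:
        norm = 1.0 - norm
    print(f'{entry["evaluation_name"]}: raw={score:.4f} normalized={norm:.4f}')
```

Normalizing against the declared `min_score`/`max_score` and inverting when `lower_is_better` is true keeps scores comparable across benchmarks that use different scales or directions.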