_leaderboard | _developer | _model | _uuid | schema_version | evaluation_id | retrieved_timestamp | source_data | evaluation_source_name | evaluation_source_type | source_organization_name | source_organization_url | source_organization_logo_url | evaluator_relationship | model_name | model_id | model_developer | model_inference_platform | evaluation_results | additional_details |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
HF Open LLM v2 | alibaba | bunnycore/Qwen2.5-3B-Model-Stock-v3.2 | 139f2e38-0b98-4bfe-82b0-99a6e6b51e7f | 0.0.1 | hfopenllm_v2/bunnycore_Qwen2.5-3B-Model-Stock-v3.2/1762652580.05695 | 1762652580.05695 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Qwen2.5-3B-Model-Stock-v3.2 | bunnycore/Qwen2.5-3B-Model-Stock-v3.2 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6353021095138676}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.396} |
HF Open LLM v2 | alibaba | bunnycore/Qwen-2.5-7B-Deep-Sky-T1 | 33cc8f90-d019-49d9-8220-d66260659435 | 0.0.1 | hfopenllm_v2/bunnycore_Qwen-2.5-7B-Deep-Sky-T1/1762652580.0542989 | 1762652580.0542998 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Qwen-2.5-7B-Deep-Sky-T1 | bunnycore/Qwen-2.5-7B-Deep-Sky-T1 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.42080457630198986}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.613} |
HF Open LLM v2 | alibaba | bunnycore/Qwen2.5-7B-RRP-ID | 85b10038-d136-4be7-8e04-7298ddb4f7d2 | 0.0.1 | hfopenllm_v2/bunnycore_Qwen2.5-7B-RRP-ID/1762652580.0603101 | 1762652580.0603101 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Qwen2.5-7B-RRP-ID | bunnycore/Qwen2.5-7B-RRP-ID | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.747259493698941}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | bunnycore/Qwen2.5-3B-RP-Mix | f43b9387-56a9-4c21-850c-5cfda84fc8b5 | 0.0.1 | hfopenllm_v2/bunnycore_Qwen2.5-3B-RP-Mix/1762652580.057388 | 1762652580.057389 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Qwen2.5-3B-RP-Mix | bunnycore/Qwen2.5-3B-RP-Mix | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5720543712903984}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.397} |
HF Open LLM v2 | alibaba | bunnycore/Qwen2.5-3B-Model-Stock-v2 | e949a47b-85f9-4072-8302-8bfef92579d9 | 0.0.1 | hfopenllm_v2/bunnycore_Qwen2.5-3B-Model-Stock-v2/1762652580.0565188 | 1762652580.05652 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Qwen2.5-3B-Model-Stock-v2 | bunnycore/Qwen2.5-3B-Model-Stock-v2 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6490157227268093}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.396} |
HF Open LLM v2 | alibaba | bunnycore/Qwen2.5-3B-Model-Stock-v4.1 | 8348f83b-0739-411f-8b87-bd9d5e871ab3 | 0.0.1 | hfopenllm_v2/bunnycore_Qwen2.5-3B-Model-Stock-v4.1/1762652580.0571678 | 1762652580.057169 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Qwen2.5-3B-Model-Stock-v4.1 | bunnycore/Qwen2.5-3B-Model-Stock-v4.1 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6380747527671025}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.396} |
HF Open LLM v2 | alibaba | bunnycore/Qwen2.5-3B-Model-Stock-v3.1 | 744d1978-7aa3-44b6-91a0-664383a66f8b | 0.0.1 | hfopenllm_v2/bunnycore_Qwen2.5-3B-Model-Stock-v3.1/1762652580.056732 | 1762652580.056733 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Qwen2.5-3B-Model-Stock-v3.1 | bunnycore/Qwen2.5-3B-Model-Stock-v3.1 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6480915083090644}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.396} |
HF Open LLM v2 | alibaba | bunnycore/QwenMosaic-7B | 4fcee29d-6351-4875-995d-81834fd878c3 | 0.0.1 | hfopenllm_v2/bunnycore_QwenMosaic-7B/1762652580.061329 | 1762652580.0613298 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/QwenMosaic-7B | bunnycore/QwenMosaic-7B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5819215237791282}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | bunnycore/Qwen2.5-3B-RP-Thinker | 80cadd5b-ebbd-4f2f-912b-5d944650e2b1 | 0.0.1 | hfopenllm_v2/bunnycore_Qwen2.5-3B-RP-Thinker/1762652580.0576031 | 1762652580.057604 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Qwen2.5-3B-RP-Thinker | bunnycore/Qwen2.5-3B-RP-Thinker | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.589414974489909}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.397} |
HF Open LLM v2 | alibaba | bunnycore/Qwen2.5-7B-R1-Bespoke-Stock | 20de3a0f-fad0-4832-863e-2b2049037c4f | 0.0.1 | hfopenllm_v2/bunnycore_Qwen2.5-7B-R1-Bespoke-Stock/1762652580.059437 | 1762652580.059438 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Qwen2.5-7B-R1-Bespoke-Stock | bunnycore/Qwen2.5-7B-R1-Bespoke-Stock | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3726445830396681}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.613} |
HF Open LLM v2 | alibaba | bunnycore/Qwen2.5-3B-Model-Stock | 4dcf1412-4182-40bd-bd1a-2246e29f18e9 | 0.0.1 | hfopenllm_v2/bunnycore_Qwen2.5-3B-Model-Stock/1762652580.056308 | 1762652580.056309 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Qwen2.5-3B-Model-Stock | bunnycore/Qwen2.5-3B-Model-Stock | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6380747527671025}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.396} |
HF Open LLM v2 | alibaba | bunnycore/Qwen-2.5-7b-S1k | e7394d5d-4253-4a53-8a0a-73b0a41e62a4 | 0.0.1 | hfopenllm_v2/bunnycore_Qwen-2.5-7b-S1k/1762652580.055886 | 1762652580.0558872 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Qwen-2.5-7b-S1k | bunnycore/Qwen-2.5-7b-S1k | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7162351449708995}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.613} |
HF Open LLM v2 | alibaba | bunnycore/Qwen2.5-7B-Fuse-Exp | f435a5b0-cc12-4603-b7b0-4625dc547ed2 | 0.0.1 | hfopenllm_v2/bunnycore_Qwen2.5-7B-Fuse-Exp/1762652580.0583198 | 1762652580.058321 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Qwen2.5-7B-Fuse-Exp | bunnycore/Qwen2.5-7B-Fuse-Exp | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5468501354184675}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | bunnycore/Qwen2.5-7B-RRP-1M-Thinker | 1879a765-f4ab-4bad-9525-47f428b43220 | 0.0.1 | hfopenllm_v2/bunnycore_Qwen2.5-7B-RRP-1M-Thinker/1762652580.060085 | 1762652580.060086 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Qwen2.5-7B-RRP-1M-Thinker | bunnycore/Qwen2.5-7B-RRP-1M-Thinker | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.23081091503876383}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.613} |
HF Open LLM v2 | alibaba | synergetic/FrankenQwen2.5-14B | 5f69b85b-d66c-400b-8d40-58b96233ec3c | 0.0.1 | hfopenllm_v2/synergetic_FrankenQwen2.5-14B/1762652580.5505831 | 1762652580.550584 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | synergetic/FrankenQwen2.5-14B | synergetic/FrankenQwen2.5-14B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.1869472998311148}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 16.972} |
HF Open LLM v2 | alibaba | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8.8 | 51ff4f00-1d21-4f98-b5a3-7a72c4b2a5b1 | 0.0.1 | hfopenllm_v2/Lunzima_NQLSG-Qwen2.5-14B-MegaFusion-v8.8/1762652579.739795 | 1762652579.739796 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8.8 | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8.8 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7027963581075989}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | alibaba | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8.6 | 526f6468-b7a8-47a7-9ed4-c2aa7cc63ca1 | 0.0.1 | hfopenllm_v2/Lunzima_NQLSG-Qwen2.5-14B-MegaFusion-v8.6/1762652579.7392142 | 1762652579.7392151 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8.6 | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8.6 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5919382793210903}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | alibaba | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v9.1 | dffd1a4a-a056-43c2-bda3-0cfa21406656 | 0.0.1 | hfopenllm_v2/Lunzima_NQLSG-Qwen2.5-14B-MegaFusion-v9.1/1762652579.74074 | 1762652579.740741 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v9.1 | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v9.1 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.8002655177152178}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | alibaba | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v6 | c4b27a1b-28dd-4a79-839c-ad8673034937 | 0.0.1 | hfopenllm_v2/Lunzima_NQLSG-Qwen2.5-14B-MegaFusion-v6/1762652579.737686 | 1762652579.737687 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v6 | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v6 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.704320092909037}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | alibaba | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v7-rebase | 46a21741-1860-4498-8284-c94fccad1ed0 | 0.0.1 | hfopenllm_v2/Lunzima_NQLSG-Qwen2.5-14B-MegaFusion-v7-rebase/1762652579.738374 | 1762652579.7383769 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v7-rebase | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v7-rebase | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.693054428915278}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | alibaba | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v3 | eb958d5c-aa2e-4640-bef7-c8b10a892847 | 0.0.1 | hfopenllm_v2/Lunzima_NQLSG-Qwen2.5-14B-MegaFusion-v3/1762652579.736984 | 1762652579.7369852 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v3 | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v3 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7048697456083193}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | alibaba | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8.5 | c723fc6f-2656-4084-81d0-4cbaf0587049 | 0.0.1 | hfopenllm_v2/Lunzima_NQLSG-Qwen2.5-14B-MegaFusion-v8.5/1762652579.738977 | 1762652579.7389781 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8.5 | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8.5 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5928624937388352}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | alibaba | Lunzima/NQLSG-Qwen2.5-14B-OriginalFusion | cf14f098-cd46-4ca0-acec-02012eb78ea3 | 0.0.1 | hfopenllm_v2/Lunzima_NQLSG-Qwen2.5-14B-OriginalFusion/1762652579.741195 | 1762652579.741195 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Lunzima/NQLSG-Qwen2.5-14B-OriginalFusion | Lunzima/NQLSG-Qwen2.5-14B-OriginalFusion | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6141947809589667}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | alibaba | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v5 | 79d3d942-8d5f-4aca-8759-8d70b8cfc5f3 | 0.0.1 | hfopenllm_v2/Lunzima_NQLSG-Qwen2.5-14B-MegaFusion-v5/1762652579.737468 | 1762652579.737469 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v5 | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v5 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7485084021507378}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | alibaba | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v6-cpt | 92bff089-baed-4f1f-852b-f274a7920a1a | 0.0.1 | hfopenllm_v2/Lunzima_NQLSG-Qwen2.5-14B-MegaFusion-v6-cpt/1762652579.7379 | 1762652579.7379 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v6-cpt | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v6-cpt | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.46634152936430895}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | alibaba | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v9-stock | 757269fe-8662-4eaa-8e76-5c2f88d8fbb0 | 0.0.1 | hfopenllm_v2/Lunzima_NQLSG-Qwen2.5-14B-MegaFusion-v9-stock/1762652579.740509 | 1762652579.74051 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v9-stock | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v9-stock | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6513639365771708}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | alibaba | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v9.2 | b5ecb480-16e6-4dfb-be77-ad8ef4e90aa3 | 0.0.1 | hfopenllm_v2/Lunzima_NQLSG-Qwen2.5-14B-MegaFusion-v9.2/1762652579.74097 | 1762652579.74097 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v9.2 | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v9.2 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7862272104682243}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | alibaba | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8.7 | 56232cf6-7ee7-45ed-b139-ea20e148b5fa | 0.0.1 | hfopenllm_v2/Lunzima_NQLSG-Qwen2.5-14B-MegaFusion-v8.7/1762652579.7395148 | 1762652579.739517 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8.7 | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8.7 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7874761189200211}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | alibaba | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v9 | 682a38c6-2fb8-4c42-b6ad-69fbe65be484 | 0.0.1 | hfopenllm_v2/Lunzima_NQLSG-Qwen2.5-14B-MegaFusion-v9/1762652579.740272 | 1762652579.740273 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v9 | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v9 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.523519816309614}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | alibaba | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8.9 | eee0ebda-6ff8-45bd-ac4e-15aeb724d0d1 | 0.0.1 | hfopenllm_v2/Lunzima_NQLSG-Qwen2.5-14B-MegaFusion-v8.9/1762652579.74003 | 1762652579.740031 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8.9 | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8.9 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7993413032974729}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | alibaba | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8 | b3e7af18-231e-4839-809c-bc5bfe7b4182 | 0.0.1 | hfopenllm_v2/Lunzima_NQLSG-Qwen2.5-14B-MegaFusion-v8/1762652579.738731 | 1762652579.738732 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8 | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7874761189200211}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | alibaba | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v7 | d540acde-9601-4119-8ae2-f7cdf82f43f7 | 0.0.1 | hfopenllm_v2/Lunzima_NQLSG-Qwen2.5-14B-MegaFusion-v7/1762652579.738115 | 1762652579.738116 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v7 | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v7 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6793906833867471}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | alibaba | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v4 | 17c5c728-e03d-45e9-aaae-816c4e90b14f | 0.0.1 | hfopenllm_v2/Lunzima_NQLSG-Qwen2.5-14B-MegaFusion-v4/1762652579.737248 | 1762652579.7372491 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v4 | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v4 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6943033373670748}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | alibaba | netcat420/Qwen2.5-Coder-Scholar-7B-Abliterated-MFANN-Slerp-Unretrained | 2f89ceb3-8bc1-48f0-a4cb-3dc1b8acad87 | 0.0.1 | hfopenllm_v2/netcat420_Qwen2.5-Coder-Scholar-7B-Abliterated-MFANN-Slerp-Unretrained/1762652580.4012349 | 1762652580.401236 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | netcat420/Qwen2.5-Coder-Scholar-7B-Abliterated-MFANN-Slerp-Unretrained | netcat420/Qwen2.5-Coder-Scholar-7B-Abliterated-MFANN-Slerp-Unretrained | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6486411610083467}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | netcat420/qwen2.5-MFANN-7b-SLERP-V1.2 | dfacdde9-fd5d-496f-8038-aa0439c0c991 | 0.0.1 | hfopenllm_v2/netcat420_qwen2.5-MFANN-7b-SLERP-V1.2/1762652580.40188 | 1762652580.40188 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | netcat420/qwen2.5-MFANN-7b-SLERP-V1.2 | netcat420/qwen2.5-MFANN-7b-SLERP-V1.2 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6606060807546199}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | netcat420/Qwen2.5-7B-nerd-uncensored-v0.9-MFANN | 399b43e8-3c07-4f3d-8b3e-50b8acd96e78 | 0.0.1 | hfopenllm_v2/netcat420_Qwen2.5-7B-nerd-uncensored-v0.9-MFANN/1762652580.400365 | 1762652580.400365 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | netcat420/Qwen2.5-7B-nerd-uncensored-v0.9-MFANN | netcat420/Qwen2.5-7B-nerd-uncensored-v0.9-MFANN | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5878413720040603}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | netcat420/Qwen2.5-7b-nerd-uncensored-MFANN-slerp | 170aa8c2-6b80-44d3-9d22-c1a5f7fa2ad4 | 0.0.1 | hfopenllm_v2/netcat420_Qwen2.5-7b-nerd-uncensored-MFANN-slerp/1762652580.4007921 | 1762652580.400793 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | netcat420/Qwen2.5-7b-nerd-uncensored-MFANN-slerp | netcat420/Qwen2.5-7b-nerd-uncensored-MFANN-slerp | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.15644711587476784}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | netcat420/qwen2.5-MFANN-7b-SLERPv1.1 | 0e66b7a6-bd6f-48f7-95e2-c117e0ea468f | 0.0.1 | hfopenllm_v2/netcat420_qwen2.5-MFANN-7b-SLERPv1.1/1762652580.402082 | 1762652580.4020832 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | netcat420/qwen2.5-MFANN-7b-SLERPv1.1 | netcat420/qwen2.5-MFANN-7b-SLERPv1.1 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6554852236510238}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | netcat420/Qwen2.5-Coder-Scholar-7B-Abliterated-MFANN | bbd39707-6062-461a-8e09-c8b8bc3451f7 | 0.0.1 | hfopenllm_v2/netcat420_Qwen2.5-Coder-Scholar-7B-Abliterated-MFANN/1762652580.4010181 | 1762652580.4010189 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | netcat420/Qwen2.5-Coder-Scholar-7B-Abliterated-MFANN | netcat420/Qwen2.5-Coder-Scholar-7B-Abliterated-MFANN | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5742274941599401}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | netcat420/Qwen2.5-7b-MFANN-slerp | d621c163-5ca6-4e54-8913-d931e4a2c6b9 | 0.0.1 | hfopenllm_v2/netcat420_Qwen2.5-7b-MFANN-slerp/1762652580.4005811 | 1762652580.4005818 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | netcat420/Qwen2.5-7b-MFANN-slerp | netcat420/Qwen2.5-7b-MFANN-slerp | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6532123654126606}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | netcat420/Qwen2.5-DeepSeek-R1-MFANN-Slerp-7b | 9b2011ae-9d22-42be-a10b-6ce6e8ff1be4 | 0.0.1 | hfopenllm_v2/netcat420_Qwen2.5-DeepSeek-R1-MFANN-Slerp-7b/1762652580.401459 | 1762652580.40146 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | netcat420/Qwen2.5-DeepSeek-R1-MFANN-Slerp-7b | netcat420/Qwen2.5-DeepSeek-R1-MFANN-Slerp-7b | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2675556412540947}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | netcat420/Qwen2.5-MFANN-7b | b6578885-9721-4349-ad55-5a80fd054c85 | 0.0.1 | hfopenllm_v2/netcat420_Qwen2.5-MFANN-7b/1762652580.401672 | 1762652580.401673 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | netcat420/Qwen2.5-MFANN-7b | netcat420/Qwen2.5-MFANN-7b | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6097233119234742}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | netcat420/DeepSeek-R1-Distill-Qwen-MFANN-Slerp-7b | 0d99e863-596f-43b7-932e-a4a27435e63d | 0.0.1 | hfopenllm_v2/netcat420_DeepSeek-R1-Distill-Qwen-MFANN-Slerp-7b/1762652580.391702 | 1762652580.3917031 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | netcat420/DeepSeek-R1-Distill-Qwen-MFANN-Slerp-7b | netcat420/DeepSeek-R1-Distill-Qwen-MFANN-Slerp-7b | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.11500596195871399}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | netcat420/qwen2.5-MFANN-7b-v1.1 | 845f96b7-62dc-4ebc-aa62-fcc6263e437f | 0.0.1 | hfopenllm_v2/netcat420_qwen2.5-MFANN-7b-v1.1/1762652580.402283 | 1762652580.4022841 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | netcat420/qwen2.5-MFANN-7b-v1.1 | netcat420/qwen2.5-MFANN-7b-v1.1 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6088489651901399}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | jeanmichela/o-distil-qwen | 172e7bfa-b430-4e14-a15a-a54ec5c9133e | 0.0.1 | hfopenllm_v2/jeanmichela_o-distil-qwen/1762652580.280534 | 1762652580.280535 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | jeanmichela/o-distil-qwen | jeanmichela/o-distil-qwen | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.44823180272787316}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | alibaba | theprint/ReWiz-Qwen-2.5-14B | 9a4e6a55-e39e-4da6-b4bb-670cbd75d5c6 | 0.0.1 | hfopenllm_v2/theprint_ReWiz-Qwen-2.5-14B/1762652580.563489 | 1762652580.5634902 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | theprint/ReWiz-Qwen-2.5-14B | theprint/ReWiz-Qwen-2.5-14B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.27854647889821227}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "?", "params_billions": 16.743} |
HF Open LLM v2 | alibaba | someon98/qwen-CoMa-0.5b | be4ee67a-59d7-4098-992e-5f75cd53cdbc | 0.0.1 | hfopenllm_v2/someon98_qwen-CoMa-0.5b/1762652580.518077 | 1762652580.5180779 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | someon98/qwen-CoMa-0.5b | someon98/qwen-CoMa-0.5b | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.22766371006706648}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 0.63} |
HF Open LLM v2 | alibaba | vonjack/Qwen2.5-Coder-0.5B-Merged | 76b52fe1-c232-47d9-8052-077a945364cd | 0.0.1 | hfopenllm_v2/vonjack_Qwen2.5-Coder-0.5B-Merged/1762652580.5902011 | 1762652580.590202 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | vonjack/Qwen2.5-Coder-0.5B-Merged | vonjack/Qwen2.5-Coder-0.5B-Merged | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.30997087727230416}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 0.63} |
HF Open LLM v2 | alibaba | mobiuslabsgmbh/DeepSeek-R1-ReDistill-Qwen-7B-v1.1 | 99d27765-a9c5-4f50-8bd1-c3ce67683621 | 0.0.1 | hfopenllm_v2/mobiuslabsgmbh_DeepSeek-R1-ReDistill-Qwen-7B-v1.1/1762652580.371459 | 1762652580.3714602 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | mobiuslabsgmbh/DeepSeek-R1-ReDistill-Qwen-7B-v1.1 | mobiuslabsgmbh/DeepSeek-R1-ReDistill-Qwen-7B-v1.1 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.34731512387132807}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | StelleX/Qwen2.5_Math_7B_Cot | a0802c61-1314-4a46-9b61-7a89246bac42 | 0.0.1 | hfopenllm_v2/StelleX_Qwen2.5_Math_7B_Cot/1762652579.8928509 | 1762652579.892852 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | StelleX/Qwen2.5_Math_7B_Cot | StelleX/Qwen2.5_Math_7B_Cot | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2142747908881767}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | wave-on-discord/qwent-7b | 1dc524b8-18d6-4bc0-9146-713ef8abd983 | 0.0.1 | hfopenllm_v2/wave-on-discord_qwent-7b/1762652580.592784 | 1762652580.592785 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | wave-on-discord/qwent-7b | wave-on-discord/qwent-7b | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.20148539209297997}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | 1024m/QWEN-14B-B100 | 745bd077-3a0f-4c06-8d19-d7c160512446 | 0.0.1 | hfopenllm_v2/1024m_QWEN-14B-B100/1762652579.468843 | 1762652579.4688451 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | 1024m/QWEN-14B-B100 | 1024m/QWEN-14B-B100 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7762104549262623}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | alibaba | kayfour/T3Q-Qwen2.5-7B-it-KOR-Safe | 35e56ec7-deae-4674-abfc-3c45f5dec040 | 0.0.1 | hfopenllm_v2/kayfour_T3Q-Qwen2.5-7B-it-KOR-Safe/1762652580.3057542 | 1762652580.305755 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | kayfour/T3Q-Qwen2.5-7B-it-KOR-Safe | kayfour/T3Q-Qwen2.5-7B-it-KOR-Safe | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6081497094376255}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | sethuiyer/Qwen2.5-7B-Anvita | f2571e64-be03-4482-b5b4-d120444b0586 | 0.0.1 | hfopenllm_v2/sethuiyer_Qwen2.5-7B-Anvita/1762652580.514066 | 1762652580.514067 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sethuiyer/Qwen2.5-7B-Anvita | sethuiyer/Qwen2.5-7B-Anvita | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6480416406246536}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | T145/qwen-2.5-3B-merge-test | 071d7565-90e5-43e8-a158-ab333beacdcf | 0.0.1 | hfopenllm_v2/T145_qwen-2.5-3B-merge-test/1762652579.908712 | 1762652579.9087129 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | T145/qwen-2.5-3B-merge-test | T145/qwen-2.5-3B-merge-test | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5751018408932742}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.397} |
HF Open LLM v2 | alibaba | Pinkstack/PARM-V1.5-base-QwQ-Qwen-2.5-o1-3B | 7b8f75d1-ef18-4fb4-abbb-efd6147fe74c | 0.0.1 | hfopenllm_v2/Pinkstack_PARM-V1.5-base-QwQ-Qwen-2.5-o1-3B/1762652579.812139 | 1762652579.812139 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Pinkstack/PARM-V1.5-base-QwQ-Qwen-2.5-o1-3B | Pinkstack/PARM-V1.5-base-QwQ-Qwen-2.5-o1-3B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5084819390328772}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.086} |
HF Open LLM v2 | alibaba | ehristoforu/coolqwen-3b-it | 5aab957b-f25b-4208-9bf8-2d16887245bc | 0.0.1 | hfopenllm_v2/ehristoforu_coolqwen-3b-it/1762652580.140961 | 1762652580.1409621 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ehristoforu/coolqwen-3b-it | ehristoforu/coolqwen-3b-it | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6472670292601409}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.085} |
HF Open LLM v2 | alibaba | ehristoforu/RQwen-v0.2 | 69318100-73ee-47f4-96b2-6e7b310fbcd1 | 0.0.1 | hfopenllm_v2/ehristoforu_RQwen-v0.2/1762652580.140525 | 1762652580.140526 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ehristoforu/RQwen-v0.2 | ehristoforu/RQwen-v0.2 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7503568309862276}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | alibaba | ehristoforu/frqwen2.5-from7b-it | 26034d5d-5d52-40d8-aa9b-e90dbd255903 | 0.0.1 | hfopenllm_v2/ehristoforu_frqwen2.5-from7b-it/1762652580.143308 | 1762652580.143309 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ehristoforu/frqwen2.5-from7b-it | ehristoforu/frqwen2.5-from7b-it | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6532123654126606}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 13.206} |
HF Open LLM v2 | alibaba | ehristoforu/QwenQwen2.5-7B-IT-Dare | 09deb823-536f-4afc-95bf-ebb0a8eb2e00 | 0.0.1 | hfopenllm_v2/ehristoforu_QwenQwen2.5-7B-IT-Dare/1762652580.1400871 | 1762652580.140088 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ehristoforu/QwenQwen2.5-7B-IT-Dare | ehristoforu/QwenQwen2.5-7B-IT-Dare | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7509064836855099}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.613} |
HF Open LLM v2 | alibaba | ehristoforu/qwen2.5-test-32b-it | 606d699f-c7ac-4e5b-b5a3-5bd43f0a3ff6 | 0.0.1 | hfopenllm_v2/ehristoforu_qwen2.5-test-32b-it/1762652580.144918 | 1762652580.1449192 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ehristoforu/qwen2.5-test-32b-it | ehristoforu/qwen2.5-test-32b-it | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7889499860370484}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 32.764} |
HF Open LLM v2 | alibaba | ehristoforu/QwenQwen2.5-7B-IT | 30f8faa5-777f-47bc-b128-f31b950079a3 | 0.0.1 | hfopenllm_v2/ehristoforu_QwenQwen2.5-7B-IT/1762652580.1398232 | 1762652580.1398232 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ehristoforu/QwenQwen2.5-7B-IT | ehristoforu/QwenQwen2.5-7B-IT | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.751830698103255}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.613} |
HF Open LLM v2 | alibaba | ehristoforu/RQwen-v0.1 | 93187c79-f1a4-45f9-9d95-a254a185f7a4 | 0.0.1 | hfopenllm_v2/ehristoforu_RQwen-v0.1/1762652580.140311 | 1762652580.140312 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ehristoforu/RQwen-v0.1 | ehristoforu/RQwen-v0.1 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7624968417133207}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | alibaba | ehristoforu/qwen2.5-with-lora-think-3b-it | 6c40f966-753b-4301-8c9b-f7b4905c0b68 | 0.0.1 | hfopenllm_v2/ehristoforu_qwen2.5-with-lora-think-3b-it/1762652580.1451252 | 1762652580.1451259 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ehristoforu/qwen2.5-with-lora-think-3b-it | ehristoforu/qwen2.5-with-lora-think-3b-it | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5319374814381397}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.086} |
HF Open LLM v2 | alibaba | ehristoforu/frqwen2.5-from7b-duable4layers-it | b2c0f0f2-3c1d-4b2a-a82d-24001cbfd3d7 | 0.0.1 | hfopenllm_v2/ehristoforu_frqwen2.5-from7b-duable4layers-it/1762652580.1428769 | 1762652580.1428769 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ehristoforu/frqwen2.5-from7b-duable4layers-it | ehristoforu/frqwen2.5-from7b-duable4layers-it | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7728881589737453}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 8.545} |
HF Open LLM v2 | alibaba | KingNish/qwen-1b-continued-v2 | 479d9f2a-82f6-42de-b8d6-92405f60638c | 0.0.1 | hfopenllm_v2/KingNish_qwen-1b-continued-v2/1762652579.7004201 | 1762652579.700421 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | KingNish/qwen-1b-continued-v2 | KingNish/qwen-1b-continued-v2 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.1578711153073844}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.277} |
HF Open LLM v2 | alibaba | KingNish/qwen-1b-continued | a4063b77-fc24-4c9d-bf08-cb28fc6e8259 | 0.0.1 | hfopenllm_v2/KingNish_qwen-1b-continued/1762652579.700214 | 1762652579.700215 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | KingNish/qwen-1b-continued | KingNish/qwen-1b-continued | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.12547263483113694}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.277} |
HF Open LLM v2 | alibaba | KingNish/Qwen2.5-0.5b-Test-ft | 5a28540f-3a94-478c-84c0-5be8db86328a | 0.0.1 | hfopenllm_v2/KingNish_Qwen2.5-0.5b-Test-ft/1762652579.699473 | 1762652579.699473 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | KingNish/Qwen2.5-0.5b-Test-ft | KingNish/Qwen2.5-0.5b-Test-ft | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.26708134416681073}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 0.494} |
HF Open LLM v2 | alibaba | KingNish/qwen-1b-continued-v2.2 | cf6aeb1a-4814-41ad-96f5-b59caafb902f | 0.0.1 | hfopenllm_v2/KingNish_qwen-1b-continued-v2.2/1762652579.7008262 | 1762652579.700827 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | KingNish/qwen-1b-continued-v2.2 | KingNish/qwen-1b-continued-v2.2 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.14125963554479892}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.277} |
HF Open LLM v2 | alibaba | KingNish/qwen-1b-continued-v2.1 | f12c6b15-107a-41ed-98fa-40b0af5be42e | 0.0.1 | hfopenllm_v2/KingNish_qwen-1b-continued-v2.1/1762652579.700618 | 1762652579.700619 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | KingNish/qwen-1b-continued-v2.1 | KingNish/qwen-1b-continued-v2.1 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.11268323603594019}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.277} |
HF Open LLM v2 | alibaba | jayasuryajsk/Qwen2.5-3B-reasoner | 91c0e116-7dc0-4931-ac61-b98bac2af3e0 | 0.0.1 | hfopenllm_v2/jayasuryajsk_Qwen2.5-3B-reasoner/1762652580.280263 | 1762652580.280264 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | jayasuryajsk/Qwen2.5-3B-reasoner | jayasuryajsk/Qwen2.5-3B-reasoner | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4159585455480348}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.086} |
HF Open LLM v2 | alibaba | Weyaxi/Einstein-v7-Qwen2-7B | b20c1304-d782-4d41-9c15-0091f9c914e4 | 0.0.1 | hfopenllm_v2/Weyaxi_Einstein-v7-Qwen2-7B/1762652579.949607 | 1762652579.949609 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Weyaxi/Einstein-v7-Qwen2-7B | Weyaxi/Einstein-v7-Qwen2-7B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4099633417111043}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | macadeliccc/Samantha-Qwen-2-7B | c443492e-3b5f-4394-9fbb-761dba338638 | 0.0.1 | hfopenllm_v2/macadeliccc_Samantha-Qwen-2-7B/1762652580.3290062 | 1762652580.3290062 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | macadeliccc/Samantha-Qwen-2-7B | macadeliccc/Samantha-Qwen-2-7B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4377152621710395}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | OpenGenerativeAI | OpenGenerativeAI/Bifrost | cef8e01a-071d-4ee4-997b-44679ef5b56e | 0.0.1 | hfopenllm_v2/OpenGenerativeAI_Bifrost/1762652579.8062131 | 1762652579.8062139 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | OpenGenerativeAI/Bifrost | OpenGenerativeAI/Bifrost | OpenGenerativeAI | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6347524568145853}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | OpenGenerativeAI | OpenGenerativeAI/Bifrost-14B | cde00174-ac52-42da-9641-0866739232e4 | 0.0.1 | hfopenllm_v2/OpenGenerativeAI_Bifrost-14B/1762652579.806474 | 1762652579.806475 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | OpenGenerativeAI/Bifrost-14B | OpenGenerativeAI/Bifrost-14B | OpenGenerativeAI | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6615302951723648}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | ai21labs | ai21labs/Jamba-v0.1 | e9546f28-0f6b-449e-a2b3-c6ab262103cc | 0.0.1 | hfopenllm_v2/ai21labs_Jamba-v0.1/1762652579.978585 | 1762652579.978585 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ai21labs/Jamba-v0.1 | ai21labs/Jamba-v0.1 | ai21labs | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.20255920956395698}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "JambaForCausalLM", "params_billions": 51.57} |
HF Open LLM v2 | lalainy | lalainy/ECE-PRYMMAL-0.5B-FT-V5-MUSR | 012fb237-8082-40d9-882e-0dd7bc9c74cb | 0.0.1 | hfopenllm_v2/lalainy_ECE-PRYMMAL-0.5B-FT-V5-MUSR/1762652580.312166 | 1762652580.312166 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | lalainy/ECE-PRYMMAL-0.5B-FT-V5-MUSR | lalainy/ECE-PRYMMAL-0.5B-FT-V5-MUSR | lalainy | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.21377500587330506}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 0.494} |
HF Open LLM v2 | lalainy | lalainy/ECE-PRYMMAL-YL-1B-SLERP-V4 | 2ede8e21-33e9-45ac-9c60-9a4bd7e8e3cb | 0.0.1 | hfopenllm_v2/lalainy_ECE-PRYMMAL-YL-1B-SLERP-V4/1762652580.3130481 | 1762652580.313049 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | lalainy/ECE-PRYMMAL-YL-1B-SLERP-V4 | lalainy/ECE-PRYMMAL-YL-1B-SLERP-V4 | lalainy | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.33235260220658963}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.544} |
HF Open LLM v2 | lalainy | lalainy/ECE-PRYMMAL-YL-0.5B-SLERP-BIS-V1 | 8822f27f-90ec-41a8-b71a-611f7c5ad590 | 0.0.1 | hfopenllm_v2/lalainy_ECE-PRYMMAL-YL-0.5B-SLERP-BIS-V1/1762652580.31263 | 1762652580.31263 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | lalainy/ECE-PRYMMAL-YL-0.5B-SLERP-BIS-V1 | lalainy/ECE-PRYMMAL-YL-0.5B-SLERP-BIS-V1 | lalainy | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.1437075847639818}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 0.494} |
HF Open LLM v2 | lalainy | lalainy/ECE-PRYMMAL-0.5B-SLERP-V4 | 869daca0-a700-464d-a551-290ed454421e | 0.0.1 | hfopenllm_v2/lalainy_ECE-PRYMMAL-0.5B-SLERP-V4/1762652580.312417 | 1762652580.312417 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | lalainy/ECE-PRYMMAL-0.5B-SLERP-V4 | lalainy/ECE-PRYMMAL-0.5B-SLERP-V4 | lalainy | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.15639724819035714}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 0.494} |
HF Open LLM v2 | lalainy | lalainy/ECE-PRYMMAL-YL-6B-SLERP-V1 | 85ac95fd-cb36-4158-818d-69c45f83dae9 | 0.0.1 | hfopenllm_v2/lalainy_ECE-PRYMMAL-YL-6B-SLERP-V1/1762652580.31332 | 1762652580.3133209 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | lalainy/ECE-PRYMMAL-YL-6B-SLERP-V1 | lalainy/ECE-PRYMMAL-YL-6B-SLERP-V1 | lalainy | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3264072660540699}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 6.061} |
HF Open LLM v2 | lalainy | lalainy/ECE-PRYMMAL-YL-1B-SLERP-V3 | fa3c7a13-b37e-40b3-b814-b1ae421081ba | 0.0.1 | hfopenllm_v2/lalainy_ECE-PRYMMAL-YL-1B-SLERP-V3/1762652580.31284 | 1762652580.312841 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | lalainy/ECE-PRYMMAL-YL-1B-SLERP-V3 | lalainy/ECE-PRYMMAL-YL-1B-SLERP-V3 | lalainy | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.325008754549041}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.544} |
HF Open LLM v2 | lalainy | lalainy/ECE-PRYMMAL-YL-6B-SLERP-V2 | fd2e3c0b-8b35-463c-a001-444ed6e6dd9a | 0.0.1 | hfopenllm_v2/lalainy_ECE-PRYMMAL-YL-6B-SLERP-V2/1762652580.3135412 | 1762652580.3135412 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | lalainy/ECE-PRYMMAL-YL-6B-SLERP-V2 | lalainy/ECE-PRYMMAL-YL-6B-SLERP-V2 | lalainy | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3248835312526319}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 6.061} |
HF Open LLM v2 | sci-m-wang | sci-m-wang/deepseek-llm-7b-chat-sa-v0.1 | 182d68d5-9b03-41bc-850c-1f571c36e630 | 0.0.1 | hfopenllm_v2/sci-m-wang_deepseek-llm-7b-chat-sa-v0.1/1762652580.5106509 | 1762652580.5106518 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sci-m-wang/deepseek-llm-7b-chat-sa-v0.1 | sci-m-wang/deepseek-llm-7b-chat-sa-v0.1 | sci-m-wang | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4035935761557113}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "?", "params_billions": 7.0} |
HF Open LLM v2 | sci-m-wang | sci-m-wang/Phi-3-mini-4k-instruct-sa-v0.1 | 319484e0-12aa-4212-b55f-d19efdd2f719 | 0.0.1 | hfopenllm_v2/sci-m-wang_Phi-3-mini-4k-instruct-sa-v0.1/1762652580.510415 | 1762652580.510418 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sci-m-wang/Phi-3-mini-4k-instruct-sa-v0.1 | sci-m-wang/Phi-3-mini-4k-instruct-sa-v0.1 | sci-m-wang | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5020623057930734}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "?", "params_billions": 7.642} |
HF Open LLM v2 | sci-m-wang | sci-m-wang/Mistral-7B-Instruct-sa-v0.1 | 8125700c-d9e7-4d6e-9b78-049331dd571b | 0.0.1 | hfopenllm_v2/sci-m-wang_Mistral-7B-Instruct-sa-v0.1/1762652580.510147 | 1762652580.510148 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sci-m-wang/Mistral-7B-Instruct-sa-v0.1 | sci-m-wang/Mistral-7B-Instruct-sa-v0.1 | sci-m-wang | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4335186194851882}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "?", "params_billions": 14.483} |
HF Open LLM v2 | baebee | baebee/mergekit-ties-fnjenli | 21b3d7d0-301d-431d-9cfc-a0ad1e326f03 | 0.0.1 | hfopenllm_v2/baebee_mergekit-ties-fnjenli/1762652580.0231512 | 1762652580.023152 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | baebee/mergekit-ties-fnjenli | baebee/mergekit-ties-fnjenli | baebee | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.19881248420856662}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | baebee | baebee/mergekit-model_stock-nzjnheg | e847afb0-c8ac-4cce-b0f9-1667c9fbef3c | 0.0.1 | hfopenllm_v2/baebee_mergekit-model_stock-nzjnheg/1762652580.022936 | 1762652580.022937 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | baebee/mergekit-model_stock-nzjnheg | baebee/mergekit-model_stock-nzjnheg | baebee | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.48442687624392167}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | baebee | baebee/7B-Cetacea | 5985fed7-9c54-458d-8f64-533e248a38da | 0.0.1 | hfopenllm_v2/baebee_7B-Cetacea/1762652580.022699 | 1762652580.022699 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | baebee/7B-Cetacea | baebee/7B-Cetacea | baebee | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5278660620486975}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "MistralForCausalLM", "params_billions": 7.242} |
HF Open LLM v2 | vhab10 | vhab10/Llama-3.1-8B-Base-Instruct-SLERP | 982455a4-fb4f-4eed-96a0-c46d9eb11937 | 0.0.1 | hfopenllm_v2/vhab10_Llama-3.1-8B-Base-Instruct-SLERP/1762652580.585581 | 1762652580.585582 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | vhab10/Llama-3.1-8B-Base-Instruct-SLERP | vhab10/Llama-3.1-8B-Base-Instruct-SLERP | vhab10 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.290711977552893}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | vhab10 | vhab10/Llama-3.2-Instruct-3B-TIES | 22f8bb3f-4794-46b1-828e-75711a1233bd | 0.0.1 | hfopenllm_v2/vhab10_Llama-3.2-Instruct-3B-TIES/1762652580.585841 | 1762652580.585842 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | vhab10/Llama-3.2-Instruct-3B-TIES | vhab10/Llama-3.2-Instruct-3B-TIES | vhab10 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4727367828472896}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 1.848} |
HF Open LLM v2 | CortexLM | CortexLM/btlm-7b-base-v0.2 | aded7428-1283-4ed8-b068-cc1a5ea92dca | 0.0.1 | hfopenllm_v2/CortexLM_btlm-7b-base-v0.2/1762652579.512528 | 1762652579.512528 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | CortexLM/btlm-7b-base-v0.2 | CortexLM/btlm-7b-base-v0.2 | CortexLM | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.14832865685270635}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 6.885} |
HF Open LLM v2 | cyberagent | cyberagent/calm3-22b-chat | b7ce290d-d082-4586-ac4b-516e8130ddc2 | 0.0.1 | hfopenllm_v2/cyberagent_calm3-22b-chat/1762652580.118237 | 1762652580.118238 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | cyberagent/calm3-22b-chat | cyberagent/calm3-22b-chat | cyberagent | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.509131327100981}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 22.543} |
HF Open LLM v2 | StelleX | StelleX/Vorisatex-7B-preview | 875156be-2ff9-4ec4-8085-27f22fb19259 | 0.0.1 | hfopenllm_v2/StelleX_Vorisatex-7B-preview/1762652579.893095 | 1762652579.893096 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | StelleX/Vorisatex-7B-preview | StelleX/Vorisatex-7B-preview | StelleX | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.1515013497519914}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.613} |
HF Open LLM v2 | gmonsoon | gmonsoon/StockSeaLLMs-7B-v1 | ac53d663-0e5c-4a7e-8d9d-efcd70d39b10 | 0.0.1 | hfopenllm_v2/gmonsoon_StockSeaLLMs-7B-v1/1762652580.165695 | 1762652580.165696 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | gmonsoon/StockSeaLLMs-7B-v1 | gmonsoon/StockSeaLLMs-7B-v1 | gmonsoon | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4599218961245052}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | gmonsoon | gmonsoon/gemma2-9b-sahabatai-v1-instruct-BaseTIES | 6d500e75-5605-4268-88a1-dc4abc7c5a7f | 0.0.1 | hfopenllm_v2/gmonsoon_gemma2-9b-sahabatai-v1-instruct-BaseTIES/1762652580.165903 | 1762652580.1659038 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | gmonsoon/gemma2-9b-sahabatai-v1-instruct-BaseTIES | gmonsoon/gemma2-9b-sahabatai-v1-instruct-BaseTIES | gmonsoon | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7377923908562614}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Gemma2ForCausalLM", "params_billions": 9.242} |
HF Open LLM v2 | gmonsoon | gmonsoon/SahabatAI-Rebase-8B-Test | a7daa424-7b22-4320-bddd-be350d54b08d | 0.0.1 | hfopenllm_v2/gmonsoon_SahabatAI-Rebase-8B-Test/1762652580.165493 | 1762652580.165493 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | gmonsoon/SahabatAI-Rebase-8B-Test | gmonsoon/SahabatAI-Rebase-8B-Test | gmonsoon | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5156263159527831}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | gmonsoon | gmonsoon/SahabatAI-MediChatIndo-8B-v1 | 61543864-320f-41ef-889d-7c0e95a229bd | 0.0.1 | hfopenllm_v2/gmonsoon_SahabatAI-MediChatIndo-8B-v1/1762652580.165248 | 1762652580.165249 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | gmonsoon/SahabatAI-MediChatIndo-8B-v1 | gmonsoon/SahabatAI-MediChatIndo-8B-v1 | gmonsoon | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.41628323958208663}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | Nitral-AI | Nitral-AI/Captain-Eris_Violet-V0.420-12B | ad87ba77-99a9-463f-aea3-1d29fc0317b0 | 0.0.1 | hfopenllm_v2/Nitral-AI_Captain-Eris_Violet-V0.420-12B/1762652579.785556 | 1762652579.785557 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Nitral-AI/Captain-Eris_Violet-V0.420-12B | Nitral-AI/Captain-Eris_Violet-V0.420-12B | Nitral-AI | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.43391866913123844}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "MistralForCausalLM", "params_billions": 12.248} |
HF Open LLM v2 | Nitral-AI | Nitral-AI/Captain-Eris_BMO-Violent-12B | ebcd5d63-5c91-41d1-b9e2-0bafe7170000 | 0.0.1 | hfopenllm_v2/Nitral-AI_Captain-Eris_BMO-Violent-12B/1762652579.785123 | 1762652579.785124 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Nitral-AI/Captain-Eris_BMO-Violent-12B | Nitral-AI/Captain-Eris_BMO-Violent-12B | Nitral-AI | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.615218730745533}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "float16", "architecture": "MistralForCausalLM", "params_billions": 12.248} |
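Each record above stores its per-benchmark scores as a JSON-encoded list in the `evaluation_results` column, with one entry per benchmark (IFEval, BBH, …). Below is a minimal sketch of how such an entry could be parsed, assuming only the structure visible in the rows above (`evaluation_name`, `metric_config`, and `score_details.score`); the sample string is reconstructed from one of the rows, and `extract_scores` is a hypothetical helper, not part of the leaderboard API.

```python
import json

# Sample reconstructed from the visible row structure above; fields beyond
# those shown in the rows are not assumed to exist.
sample_evaluation_results = json.dumps([
    {
        "evaluation_name": "IFEval",
        "metric_config": {
            "evaluation_description": "Accuracy on IFEval",
            "lower_is_better": False,
            "score_type": "continuous",
            "min_score": 0,
            "max_score": 1,
        },
        "score_details": {"score": 0.747259493698941},
    }
])

def extract_scores(evaluation_results_json: str) -> dict[str, float]:
    """Map each benchmark name (e.g. 'IFEval', 'BBH') to its raw score."""
    results = json.loads(evaluation_results_json)
    return {
        entry["evaluation_name"]: entry["score_details"]["score"]
        for entry in results
    }

print(extract_scores(sample_evaluation_results))
# {'IFEval': 0.747259493698941}
```

Since every score here is declared `continuous` on a 0-1 scale with `lower_is_better: false`, scores parsed this way can be compared directly across models within the same benchmark.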