_leaderboard | _developer | _model | _uuid | schema_version | evaluation_id | retrieved_timestamp | source_data | evaluation_source_name | evaluation_source_type | source_organization_name | source_organization_url | source_organization_logo_url | evaluator_relationship | model_name | model_id | model_developer | model_inference_platform | evaluation_results | additional_details |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
HF Open LLM v2 | alibaba | Qwen/Qwen2.5-Coder-7B | 5e82cb32-8291-497b-ac56-16b50947d1bf | 0.0.1 | hfopenllm_v2/Qwen_Qwen2.5-Coder-7B/1762652579.846894 | 1762652579.8468952 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Qwen/Qwen2.5-Coder-7B | Qwen/Qwen2.5-Coder-7B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.344592348302504}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | Qwen/Qwen2-57B-A14B | aafb84cd-5950-4b93-98d1-9e50fd294b65 | 0.0.1 | hfopenllm_v2/Qwen_Qwen2-57B-A14B/1762652579.8398201 | 1762652579.839821 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Qwen/Qwen2-57B-A14B | Qwen/Qwen2-57B-A14B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.31126965340851165}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2MoeForCausalLM", "params_billions": 57.409} |
HF Open LLM v2 | alibaba | Qwen/Qwen2.5-14B | b02dabaf-2aac-468d-b0cc-c7194c2094fd | 0.0.1 | hfopenllm_v2/Qwen_Qwen2.5-14B/1762652579.843051 | 1762652579.8430521 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Qwen/Qwen2.5-14B | Qwen/Qwen2.5-14B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3694464022127954}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | alibaba | Qwen/Qwen2.5-72B | 89ce1911-289d-40bb-be48-f9a4d8d73ac2 | 0.0.1 | hfopenllm_v2/Qwen_Qwen2.5-72B/1762652579.844565 | 1762652579.844566 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Qwen/Qwen2.5-72B | Qwen/Qwen2.5-72B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4137100670664947}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 72.706} |
HF Open LLM v2 | alibaba | Qwen/Qwen2.5-Coder-32B | 743c517a-ad0f-495d-b9d0-cdca01335933 | 0.0.1 | hfopenllm_v2/Qwen_Qwen2.5-Coder-32B/1762652579.846424 | 1762652579.846425 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Qwen/Qwen2.5-Coder-32B | Qwen/Qwen2.5-Coder-32B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4363411304228336}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 32.764} |
HF Open LLM v2 | alibaba | Qwen/Qwen2.5-Coder-14B | d0ae041c-8b56-4ce1-841b-96622a724894 | 0.0.1 | hfopenllm_v2/Qwen_Qwen2.5-Coder-14B/1762652579.8457868 | 1762652579.845789 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Qwen/Qwen2.5-Coder-14B | Qwen/Qwen2.5-Coder-14B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3472652561869174}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | alibaba | Qwen/Qwen2.5-0.5B | c8110747-f2dd-46d0-b2b3-706d70e1d714 | 0.0.1 | hfopenllm_v2/Qwen_Qwen2.5-0.5B/1762652579.841982 | 1762652579.841983 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Qwen/Qwen2.5-0.5B | Qwen/Qwen2.5-0.5B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.16271714606133947}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 0.5} |
HF Open LLM v2 | alibaba | Qwen/Qwen1.5-0.5B | e0115d6b-3b2c-4047-b64c-1e7afb5edd55 | 0.0.1 | hfopenllm_v2/Qwen_Qwen1.5-0.5B/1762652579.835391 | 1762652579.835392 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Qwen/Qwen1.5-0.5B | Qwen/Qwen1.5-0.5B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.17056077873375977}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 0.62} |
HF Open LLM v2 | alibaba | Qwen/Qwen2-Math-7B | fe474496-4efa-4ef7-844d-32b17abda7c8 | 0.0.1 | hfopenllm_v2/Qwen_Qwen2-Math-7B/1762652579.841364 | 1762652579.841364 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Qwen/Qwen2-Math-7B | Qwen/Qwen2-Math-7B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2687048143370701}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | Qwen/Qwen1.5-14B | 9afcb068-65e2-4d4c-b7ee-071eb4dbac73 | 0.0.1 | hfopenllm_v2/Qwen_Qwen1.5-14B/1762652579.836853 | 1762652579.836853 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Qwen/Qwen1.5-14B | Qwen/Qwen1.5-14B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2905368865720732}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.167} |
HF Open LLM v2 | alibaba | Qwen/Qwen1.5-110B | 29389e2b-7898-4f9f-ba8c-8fe4dad80295 | 0.0.1 | hfopenllm_v2/Qwen_Qwen1.5-110B/1762652579.836433 | 1762652579.836434 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Qwen/Qwen1.5-110B | Qwen/Qwen1.5-110B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3421942667677318}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 111.21} |
HF Open LLM v2 | alibaba | Qwen/Qwen2-7B | 196e965c-4570-43aa-ba0d-13972796bda9 | 0.0.1 | hfopenllm_v2/Qwen_Qwen2-7B/1762652579.840696 | 1762652579.840696 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Qwen/Qwen2-7B | Qwen/Qwen2-7B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3148667757106699}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | Qwen/Qwen2.5-7B | bed92e1c-8f11-4f70-826e-569aa55baa09 | 0.0.1 | hfopenllm_v2/Qwen_Qwen2.5-7B/1762652579.8449879 | 1762652579.8449888 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Qwen/Qwen2.5-7B | Qwen/Qwen2.5-7B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3374479713825982}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | Qwen/Qwen2.5-32B | 9dd61039-27d0-42f3-9b03-65b0a59465d4 | 0.0.1 | hfopenllm_v2/Qwen_Qwen2.5-32B/1762652579.843701 | 1762652579.843702 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Qwen/Qwen2.5-32B | Qwen/Qwen2.5-32B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.40766499554515356}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 32.764} |
HF Open LLM v2 | alibaba | Qwen/Qwen2-72B | fc683e1a-327f-4a69-bd51-9022c587159b | 0.0.1 | hfopenllm_v2/Qwen_Qwen2-72B/1762652579.8402402 | 1762652579.840241 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Qwen/Qwen2-72B | Qwen/Qwen2-72B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3823610243044012}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 72.706} |
HF Open LLM v2 | alibaba | Qwen/Qwen2-0.5B | cdf3b683-29d9-45b4-b6a6-1f67927ef953 | 0.0.1 | hfopenllm_v2/Qwen_Qwen2-0.5B/1762652579.838974 | 1762652579.838975 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Qwen/Qwen2-0.5B | Qwen/Qwen2-0.5B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.18732186154957736}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 0.494} |
HF Open LLM v2 | alibaba | Qwen/Qwen2.5-3B | 43062e28-5532-4e31-ac49-fbd794c7f664 | 0.0.1 | hfopenllm_v2/Qwen_Qwen2.5-3B/1762652579.8441322 | 1762652579.8441331 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Qwen/Qwen2.5-3B | Qwen/Qwen2.5-3B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2689541527591236}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.086} |
HF Open LLM v2 | alibaba | Qwen/Qwen2-1.5B | 6eb76673-0633-440b-8849-8fcf8cf00954 | 0.0.1 | hfopenllm_v2/Qwen_Qwen2-1.5B/1762652579.839384 | 1762652579.839385 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Qwen/Qwen2-1.5B | Qwen/Qwen2-1.5B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.21132705665412216}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.544} |
HF Open LLM v2 | alibaba | allknowingroger/QwenStock2-14B | 4a4c258b-2b03-4fad-a5e0-b623a25fb735 | 0.0.1 | hfopenllm_v2/allknowingroger_QwenStock2-14B/1762652580.001041 | 1762652580.001042 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | allknowingroger/QwenStock2-14B | allknowingroger/QwenStock2-14B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5563427261887348}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | alibaba | allknowingroger/Qwen2.5-7B-task8 | 956640e9-97a3-4641-9ed0-a63831a8ee58 | 0.0.1 | hfopenllm_v2/allknowingroger_Qwen2.5-7B-task8/1762652579.9994612 | 1762652579.999462 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | allknowingroger/Qwen2.5-7B-task8 | allknowingroger/Qwen2.5-7B-task8 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4645185884564068}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | allknowingroger/Qwen2.5-7B-task3 | 0c556e08-bb71-406c-88b8-d45fc4cc43f0 | 0.0.1 | hfopenllm_v2/allknowingroger_Qwen2.5-7B-task3/1762652579.998833 | 1762652579.998834 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | allknowingroger/Qwen2.5-7B-task3 | allknowingroger/Qwen2.5-7B-task3 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.512903540383959}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | allknowingroger/Qwen2.5-7B-task7 | b5b02465-0d3f-4ccc-a104-174fcf53dc9a | 0.0.1 | hfopenllm_v2/allknowingroger_Qwen2.5-7B-task7/1762652579.999242 | 1762652579.999243 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | allknowingroger/Qwen2.5-7B-task7 | allknowingroger/Qwen2.5-7B-task7 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.42842325030917966}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | allknowingroger/Qwenslerp3-7B | 88727af1-7672-4ab5-9cc4-f56d286f3967 | 0.0.1 | hfopenllm_v2/allknowingroger_Qwenslerp3-7B/1762652580.0020611 | 1762652580.002062 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | allknowingroger/Qwenslerp3-7B | allknowingroger/Qwenslerp3-7B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.501837347127843}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | allknowingroger/QwenStock1-14B | 95c86ae6-dcb7-4ed7-a82d-ce0b374cca0e | 0.0.1 | hfopenllm_v2/allknowingroger_QwenStock1-14B/1762652580.0008268 | 1762652580.0008278 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | allknowingroger/QwenStock1-14B | allknowingroger/QwenStock1-14B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5634117474966422}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | alibaba | allknowingroger/Qwenslerp2-14B | 636ed71e-3d86-4d5d-8b8d-3019f26261fc | 0.0.1 | hfopenllm_v2/allknowingroger_Qwenslerp2-14B/1762652580.001452 | 1762652580.0014532 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | allknowingroger/Qwenslerp2-14B | allknowingroger/Qwenslerp2-14B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5007136619724553}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | alibaba | allknowingroger/Qwen2.5-7B-task2 | 3518e992-9548-4025-a641-99a2cf3833e4 | 0.0.1 | hfopenllm_v2/allknowingroger_Qwen2.5-7B-task2/1762652579.998622 | 1762652579.998623 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | allknowingroger/Qwen2.5-7B-task2 | allknowingroger/Qwen2.5-7B-task2 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.45270327176336567}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | allknowingroger/QwenSlerp12-7B | 18c67de4-1518-44b6-b92f-b490e9d55877 | 0.0.1 | hfopenllm_v2/allknowingroger_QwenSlerp12-7B/1762652579.999902 | 1762652579.999903 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | allknowingroger/QwenSlerp12-7B | allknowingroger/QwenSlerp12-7B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5075577246151324}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | allknowingroger/Qwen2.5-slerp-14B | ba80d36c-7688-40e8-8182-251c6b9e6b19 | 0.0.1 | hfopenllm_v2/allknowingroger_Qwen2.5-slerp-14B/1762652579.999685 | 1762652579.999686 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | allknowingroger/Qwen2.5-slerp-14B | allknowingroger/Qwen2.5-slerp-14B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.49282016161562425}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | alibaba | allknowingroger/Qwen2.5-7B-task4 | a200d34f-8ed0-4f1d-93e2-cff38b1811f9 | 0.0.1 | hfopenllm_v2/allknowingroger_Qwen2.5-7B-task4/1762652579.999042 | 1762652579.999042 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | allknowingroger/Qwen2.5-7B-task4 | allknowingroger/Qwen2.5-7B-task4 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5005385709916355}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | allknowingroger/Qwenslerp3-14B | 06a2a807-3dbc-42c4-adec-4d6caa01cf74 | 0.0.1 | hfopenllm_v2/allknowingroger_Qwenslerp3-14B/1762652580.001856 | 1762652580.001856 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | allknowingroger/Qwenslerp3-14B | allknowingroger/Qwenslerp3-14B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5052349986923584}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | alibaba | allknowingroger/QwenStock3-14B | 2b3928ad-ab69-4e63-aa3c-e64dea7b5e6c | 0.0.1 | hfopenllm_v2/allknowingroger_QwenStock3-14B/1762652580.0012438 | 1762652580.001245 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | allknowingroger/QwenStock3-14B | allknowingroger/QwenStock3-14B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5615134509767417}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | alibaba | allknowingroger/Rombos-LLM-V2.5-Qwen-42b | 619fde94-d095-4f5c-b36d-19a38b6a8109 | 0.0.1 | hfopenllm_v2/allknowingroger_Rombos-LLM-V2.5-Qwen-42b/1762652580.002683 | 1762652580.002683 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | allknowingroger/Rombos-LLM-V2.5-Qwen-42b | allknowingroger/Rombos-LLM-V2.5-Qwen-42b | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.1879213819332704}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 42.516} |
HF Open LLM v2 | alibaba | allknowingroger/Qwen2.5-42B-AGI | de6fe2ab-47de-4616-a0b9-b2cb6f44b16b | 0.0.1 | hfopenllm_v2/allknowingroger_Qwen2.5-42B-AGI/1762652579.9983659 | 1762652579.998367 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | allknowingroger/Qwen2.5-42B-AGI | allknowingroger/Qwen2.5-42B-AGI | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.19129354557019818}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 42.516} |
HF Open LLM v2 | alibaba | allknowingroger/QwenSlerp4-14B | 1393cab1-31aa-470c-bca1-53f99d7ea1e8 | 0.0.1 | hfopenllm_v2/allknowingroger_QwenSlerp4-14B/1762652580.000124 | 1762652580.000125 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | allknowingroger/QwenSlerp4-14B | allknowingroger/QwenSlerp4-14B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6327544249258634}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | alibaba | allknowingroger/QwenSlerp5-14B | da7928ec-55b8-4d4b-9b9e-b40c5de7136b | 0.0.1 | hfopenllm_v2/allknowingroger_QwenSlerp5-14B/1762652580.000389 | 1762652580.0003898 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | allknowingroger/QwenSlerp5-14B | allknowingroger/QwenSlerp5-14B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7119387669162267}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | alibaba | allknowingroger/QwenSlerp6-14B | 5135513f-f255-412b-ab16-f0d613e4525e | 0.0.1 | hfopenllm_v2/allknowingroger_QwenSlerp6-14B/1762652580.0006049 | 1762652580.000606 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | allknowingroger/QwenSlerp6-14B | allknowingroger/QwenSlerp6-14B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6866846633598851}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | alibaba | allknowingroger/Qwenslerp2-7B | a1e6f539-f5d7-4f57-b0da-4df7e5a86240 | 0.0.1 | hfopenllm_v2/allknowingroger_Qwenslerp2-7B/1762652580.001649 | 1762652580.0016499 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | allknowingroger/Qwenslerp2-7B | allknowingroger/Qwenslerp2-7B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5294396645345462}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | YOYO-AI/Qwen2.5-7B-it-restore | 2f2577b8-28e3-4fa1-8e65-66e59499b9cd | 0.0.1 | hfopenllm_v2/YOYO-AI_Qwen2.5-7B-it-restore/1762652579.958842 | 1762652579.958842 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | YOYO-AI/Qwen2.5-7B-it-restore | YOYO-AI/Qwen2.5-7B-it-restore | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7530796065550517}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.613} |
HF Open LLM v2 | alibaba | YOYO-AI/Qwen2.5-14B-YOYO-1005 | 29058700-6465-476d-b1c9-2bb89d70c52b | 0.0.1 | hfopenllm_v2/YOYO-AI_Qwen2.5-14B-YOYO-1005/1762652579.9563992 | 1762652579.9564002 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | YOYO-AI/Qwen2.5-14B-YOYO-1005 | YOYO-AI/Qwen2.5-14B-YOYO-1005 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5971588717935079}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | alibaba | YOYO-AI/Qwen2.5-14B-YOYO-V4-p1 | 441375d9-0375-4a15-9d50-267395d3ab13 | 0.0.1 | hfopenllm_v2/YOYO-AI_Qwen2.5-14B-YOYO-V4-p1/1762652579.957833 | 1762652579.957834 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | YOYO-AI/Qwen2.5-14B-YOYO-V4-p1 | YOYO-AI/Qwen2.5-14B-YOYO-V4-p1 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.8203488964835526}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | alibaba | YOYO-AI/Qwen2.5-14B-YOYO-latest | d5487f61-9be7-4ffc-af6d-be9f925dd4ba | 0.0.1 | hfopenllm_v2/YOYO-AI_Qwen2.5-14B-YOYO-latest/1762652579.95823 | 1762652579.958231 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | YOYO-AI/Qwen2.5-14B-YOYO-latest | YOYO-AI/Qwen2.5-14B-YOYO-latest | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.591063932587756}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | alibaba | YOYO-AI/Qwen2.5-14B-1M-YOYO-V3 | fdc183ed-50d6-40c3-8e7b-02a37fc42a00 | 0.0.1 | hfopenllm_v2/YOYO-AI_Qwen2.5-14B-1M-YOYO-V3/1762652579.955529 | 1762652579.95553 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | YOYO-AI/Qwen2.5-14B-1M-YOYO-V3 | YOYO-AI/Qwen2.5-14B-1M-YOYO-V3 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.8398327548681941}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | alibaba | YOYO-AI/Qwen2.5-14B-YOYO-latest-V2 | b97b327c-1730-4bfe-b5fe-00dbfcd0d372 | 0.0.1 | hfopenllm_v2/YOYO-AI_Qwen2.5-14B-YOYO-latest-V2/1762652579.958441 | 1762652579.958441 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | YOYO-AI/Qwen2.5-14B-YOYO-latest-V2 | YOYO-AI/Qwen2.5-14B-YOYO-latest-V2 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7771346693440072}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | alibaba | YOYO-AI/ZYH-LLM-Qwen2.5-14B-V2 | 0c7e0639-a082-47f1-bf32-0c45ce573f0a | 0.0.1 | hfopenllm_v2/YOYO-AI_ZYH-LLM-Qwen2.5-14B-V2/1762652579.959567 | 1762652579.9595678 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | YOYO-AI/ZYH-LLM-Qwen2.5-14B-V2 | YOYO-AI/ZYH-LLM-Qwen2.5-14B-V2 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5070834275278483}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | alibaba | YOYO-AI/Qwen2.5-14B-YOYO-1010-v2 | 2047ae80-fdc6-4e94-90e6-b3cac52d8c45 | 0.0.1 | hfopenllm_v2/YOYO-AI_Qwen2.5-14B-YOYO-1010-v2/1762652579.957223 | 1762652579.957223 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | YOYO-AI/Qwen2.5-14B-YOYO-1010-v2 | YOYO-AI/Qwen2.5-14B-YOYO-1010-v2 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.594710922574325}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | alibaba | YOYO-AI/Qwen2.5-14B-YOYO-0510-v2 | ad6edd05-e83f-4da3-b200-c1d972548e8b | 0.0.1 | hfopenllm_v2/YOYO-AI_Qwen2.5-14B-YOYO-0510-v2/1762652579.955989 | 1762652579.955989 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | YOYO-AI/Qwen2.5-14B-YOYO-0510-v2 | YOYO-AI/Qwen2.5-14B-YOYO-0510-v2 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.594710922574325}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | alibaba | YOYO-AI/Qwen2.5-14B-YOYO-1010 | 6a676239-eed6-44dc-b395-1b2453d5b0ba | 0.0.1 | hfopenllm_v2/YOYO-AI_Qwen2.5-14B-YOYO-1010/1762652579.956832 | 1762652579.956832 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | YOYO-AI/Qwen2.5-14B-YOYO-1010 | YOYO-AI/Qwen2.5-14B-YOYO-1010 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5898648918203699}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | alibaba | YOYO-AI/Qwen2.5-14B-YOYO-1010 | 1de35d6f-c62f-48fd-b921-41e85b55434a | 0.0.1 | hfopenllm_v2/YOYO-AI_Qwen2.5-14B-YOYO-1010/1762652579.957045 | 1762652579.957045 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | YOYO-AI/Qwen2.5-14B-YOYO-1010 | YOYO-AI/Qwen2.5-14B-YOYO-1010 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7904737208384863}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | alibaba | YOYO-AI/ZYH-LLM-Qwen2.5-14B-V4 | f5b253b5-4c42-49f8-9f3f-d85a5b2502c0 | 0.0.1 | hfopenllm_v2/YOYO-AI_ZYH-LLM-Qwen2.5-14B-V4/1762652579.959998 | 1762652579.959999 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | YOYO-AI/ZYH-LLM-Qwen2.5-14B-V4 | YOYO-AI/ZYH-LLM-Qwen2.5-14B-V4 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.8364605912312664}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | alibaba | YOYO-AI/ZYH-LLM-Qwen2.5-14B | 2dd14fef-53f5-491d-a5e1-7e19f6043049 | 0.0.1 | hfopenllm_v2/YOYO-AI_ZYH-LLM-Qwen2.5-14B/1762652579.959276 | 1762652579.9592772 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | YOYO-AI/ZYH-LLM-Qwen2.5-14B | YOYO-AI/ZYH-LLM-Qwen2.5-14B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.594111402190632}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | alibaba | YOYO-AI/Qwen2.5-14B-YOYO-0505 | 1835078d-7897-4517-9d7b-86a2285dfa27 | 0.0.1 | hfopenllm_v2/YOYO-AI_Qwen2.5-14B-YOYO-0505/1762652579.9557781 | 1762652579.9557781 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | YOYO-AI/Qwen2.5-14B-YOYO-0505 | YOYO-AI/Qwen2.5-14B-YOYO-0505 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5882912893345214}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | alibaba | YOYO-AI/Qwen2.5-14B-YOYO-V4 | c76d318b-eba5-4407-be86-a92051791f00 | 0.0.1 | hfopenllm_v2/YOYO-AI_Qwen2.5-14B-YOYO-V4/1762652579.9576309 | 1762652579.957632 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | YOYO-AI/Qwen2.5-14B-YOYO-V4 | YOYO-AI/Qwen2.5-14B-YOYO-V4 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.8397828871837835}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | alibaba | YOYO-AI/Qwen2.5-14B-YOYO-1005-v2 | ed12a458-8c3b-4e08-a218-e94b4fdd89d8 | 0.0.1 | hfopenllm_v2/YOYO-AI_Qwen2.5-14B-YOYO-1005-v2/1762652579.956619 | 1762652579.956619 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | YOYO-AI/Qwen2.5-14B-YOYO-1005-v2 | YOYO-AI/Qwen2.5-14B-YOYO-1005-v2 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.595310442958018}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | alibaba | YOYO-AI/Qwen2.5-14B-it-restore | ab78a98d-0cad-4215-8f37-f3093066a98d | 0.0.1 | hfopenllm_v2/YOYO-AI_Qwen2.5-14B-it-restore/1762652579.958646 | 1762652579.958647 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | YOYO-AI/Qwen2.5-14B-it-restore | YOYO-AI/Qwen2.5-14B-it-restore | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.8209484168672456}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | alibaba | YOYO-AI/Qwen2.5-14B-YOYO-SCE | e0545222-4bd1-490a-a315-5b9ce9742310 | 0.0.1 | hfopenllm_v2/YOYO-AI_Qwen2.5-14B-YOYO-SCE/1762652579.957431 | 1762652579.957431 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | YOYO-AI/Qwen2.5-14B-YOYO-SCE | YOYO-AI/Qwen2.5-14B-YOYO-SCE | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5843694729983111}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | alibaba | YOYO-AI/Qwen2.5-Coder-14B-YOYO-1010 | 4f6bda51-89d3-4005-9133-db6d871ae87d | 0.0.1 | hfopenllm_v2/YOYO-AI_Qwen2.5-Coder-14B-YOYO-1010/1762652579.9590368 | 1762652579.959038 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | YOYO-AI/Qwen2.5-Coder-14B-YOYO-1010 | YOYO-AI/Qwen2.5-Coder-14B-YOYO-1010 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5335864395359867}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | alibaba | YOYO-AI/ZYH-LLM-Qwen2.5-14B-V3 | 4f85534a-0b12-42c4-a0d3-06d4d8337e0c | 0.0.1 | hfopenllm_v2/YOYO-AI_ZYH-LLM-Qwen2.5-14B-V3/1762652579.959789 | 1762652579.959789 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | YOYO-AI/ZYH-LLM-Qwen2.5-14B-V3 | YOYO-AI/ZYH-LLM-Qwen2.5-14B-V3 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.8577928784513978}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | alibaba | YOYO-AI/Qwen2.5-14B-YOYO-V4-p2 | 9ecdd8a3-247b-46b2-ae3b-5798685329ef | 0.0.1 | hfopenllm_v2/YOYO-AI_Qwen2.5-14B-YOYO-V4-p2/1762652579.958032 | 1762652579.9580328 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | YOYO-AI/Qwen2.5-14B-YOYO-V4-p2 | YOYO-AI/Qwen2.5-14B-YOYO-V4-p2 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.8047868544351211}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | alibaba | YOYO-AI/Qwen2.5-14B-YOYO-0805 | 6d4ac88f-7a02-4f78-9990-6736972f43f7 | 0.0.1 | hfopenllm_v2/YOYO-AI_Qwen2.5-14B-YOYO-0805/1762652579.956195 | 1762652579.956195 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | YOYO-AI/Qwen2.5-14B-YOYO-0805 | YOYO-AI/Qwen2.5-14B-YOYO-0805 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5882912893345214}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | alibaba | nguyentd/FinancialAdvice-Qwen2.5-7B | 0ced7574-bfc4-4958-a6f5-0944f9ac411a | 0.0.1 | hfopenllm_v2/nguyentd_FinancialAdvice-Qwen2.5-7B/1762652580.404779 | 1762652580.4047801 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | nguyentd/FinancialAdvice-Qwen2.5-7B | nguyentd/FinancialAdvice-Qwen2.5-7B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.449605934476079}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | LenguajeNaturalAI/leniachat-qwen2-1.5B-v0 | eb6e6d30-b349-447c-83d3-fe7760e83037 | 0.0.1 | hfopenllm_v2/LenguajeNaturalAI_leniachat-qwen2-1.5B-v0/1762652579.713998 | 1762652579.713999 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | LenguajeNaturalAI/leniachat-qwen2-1.5B-v0 | LenguajeNaturalAI/leniachat-qwen2-1.5B-v0 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.22211842356059697}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.543} |
HF Open LLM v2 | alibaba | Dongwei/DeepSeek-R1-Distill-Qwen-7B-GRPO | b36b915f-3c4a-40e8-ab78-8442dbe116e1 | 0.0.1 | hfopenllm_v2/Dongwei_DeepSeek-R1-Distill-Qwen-7B-GRPO/1762652579.5556989 | 1762652579.5557 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Dongwei/DeepSeek-R1-Distill-Qwen-7B-GRPO | Dongwei/DeepSeek-R1-Distill-Qwen-7B-GRPO | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.40376866713653103}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | JungZoona/T3Q-qwen2.5-14b-v1.0-e3 | eb7694ce-6fe4-4bb0-bcab-266ccc71f78a | 0.0.1 | hfopenllm_v2/JungZoona_T3Q-qwen2.5-14b-v1.0-e3/1762652579.697056 | 1762652579.697057 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | JungZoona/T3Q-qwen2.5-14b-v1.0-e3 | JungZoona/T3Q-qwen2.5-14b-v1.0-e3 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.732396707403024}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | alibaba | Sakalti/qwen2.5-2.3B | 6dc5b101-c681-4010-941a-3983cb9eff53 | 0.0.1 | hfopenllm_v2/Sakalti_qwen2.5-2.3B/1762652579.869403 | 1762652579.8694038 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Sakalti/qwen2.5-2.3B | Sakalti/qwen2.5-2.3B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.12879493078365403}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2Model", "params_billions": 2.339} |
HF Open LLM v2 | alibaba | Sakalti/QwenTest-7 | 2d99163e-9ebd-49d9-ad13-ee1f780d277c | 0.0.1 | hfopenllm_v2/Sakalti_QwenTest-7/1762652579.8585348 | 1762652579.858536 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Sakalti/QwenTest-7 | Sakalti/QwenTest-7 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.16718861509683197}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 0.988} |
HF Open LLM v2 | alibaba | lkoenig/BBAI_456_QwenKoen | 249b0b65-5c71-4c5d-9802-28df0ead0cdf | 0.0.1 | hfopenllm_v2/lkoenig_BBAI_456_QwenKoen/1762652580.323869 | 1762652580.323869 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | lkoenig/BBAI_456_QwenKoen | lkoenig/BBAI_456_QwenKoen | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.45292823042859615}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | lkoenig/BBAI_7B_QwenDyanKoenLo | dedc34ed-fd8f-4b29-b898-3c9830993247 | 0.0.1 | hfopenllm_v2/lkoenig_BBAI_7B_QwenDyanKoenLo/1762652580.324512 | 1762652580.324513 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | lkoenig/BBAI_7B_QwenDyanKoenLo | lkoenig/BBAI_7B_QwenDyanKoenLo | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.46631714960748594}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | lkoenig/BBAI_7B_Qwen2.5koen | 078cedea-7b3a-4c77-b932-3d42f0c841fe | 0.0.1 | hfopenllm_v2/lkoenig_BBAI_7B_Qwen2.5koen/1762652580.324276 | 1762652580.324277 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | lkoenig/BBAI_7B_Qwen2.5koen | lkoenig/BBAI_7B_Qwen2.5koen | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.45999725173650363}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | lkoenig/BBAI_212_Qwencore | d42a520c-15dd-4497-a26a-b6f77b3257e6 | 0.0.1 | hfopenllm_v2/lkoenig_BBAI_212_Qwencore/1762652580.3232372 | 1762652580.323238 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | lkoenig/BBAI_212_Qwencore | lkoenig/BBAI_212_Qwencore | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4384400058511416}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.613} |
HF Open LLM v2 | alibaba | lkoenig/BBAI_7B_QwenDyancabsLAW | 05f391f3-68ac-422a-b7e8-01eba1729a0b | 0.0.1 | hfopenllm_v2/lkoenig_BBAI_7B_QwenDyancabsLAW/1762652580.3247318 | 1762652580.3247318 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | lkoenig/BBAI_7B_QwenDyancabsLAW | lkoenig/BBAI_7B_QwenDyancabsLAW | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5549685944405289}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | lkoenig/BBAI_375_QwenDyancabs | 08e49740-3cdd-47b2-9b95-b96d8a13dd79 | 0.0.1 | hfopenllm_v2/lkoenig_BBAI_375_QwenDyancabs/1762652580.323661 | 1762652580.323662 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | lkoenig/BBAI_375_QwenDyancabs | lkoenig/BBAI_375_QwenDyancabs | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4565752204151651}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | lkoenig/BBAI_7B_KoenQwenDyan | fe084d09-ee80-4c7f-93a7-3ee0f9081177 | 0.0.1 | hfopenllm_v2/lkoenig_BBAI_7B_KoenQwenDyan/1762652580.324076 | 1762652580.3240771 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | lkoenig/BBAI_7B_KoenQwenDyan | lkoenig/BBAI_7B_KoenQwenDyan | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5807224830117421}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | lkoenig/BBAI_230_Xiaqwen | c9393ea7-3269-435f-9159-95638b9c691e | 0.0.1 | hfopenllm_v2/lkoenig_BBAI_230_Xiaqwen/1762652580.3234491 | 1762652580.32345 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | lkoenig/BBAI_230_Xiaqwen | lkoenig/BBAI_230_Xiaqwen | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4648931501748693}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | lkoenig/BBAI_212_QwenLawLo | c4f888d2-c08c-43c4-a1f9-79edf519c893 | 0.0.1 | hfopenllm_v2/lkoenig_BBAI_212_QwenLawLo/1762652580.322983 | 1762652580.322984 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | lkoenig/BBAI_212_QwenLawLo | lkoenig/BBAI_212_QwenLawLo | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4566250880995758}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | SicariusSicariiStuff/Qwen2.5-14B_Uncensored_Instruct | 8012de5a-8cb0-4039-895f-70c20e9237ee | 0.0.1 | hfopenllm_v2/SicariusSicariiStuff_Qwen2.5-14B_Uncensored_Instruct/1762652579.884166 | 1762652579.884167 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | SicariusSicariiStuff/Qwen2.5-14B_Uncensored_Instruct | SicariusSicariiStuff/Qwen2.5-14B_Uncensored_Instruct | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3789389929830627}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | alibaba | SicariusSicariiStuff/Impish_QWEN_14B-1M | a059e151-6f32-48ff-900b-4e232aef3cc0 | 0.0.1 | hfopenllm_v2/SicariusSicariiStuff_Impish_QWEN_14B-1M/1762652579.8825831 | 1762652579.882584 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | SicariusSicariiStuff/Impish_QWEN_14B-1M | SicariusSicariiStuff/Impish_QWEN_14B-1M | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7867768631675067}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | alibaba | SicariusSicariiStuff/Impish_QWEN_7B-1M | 64c02fd8-386d-4b4c-bc00-d243cfcae7f1 | 0.0.1 | hfopenllm_v2/SicariusSicariiStuff_Impish_QWEN_7B-1M/1762652579.8828428 | 1762652579.882844 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | SicariusSicariiStuff/Impish_QWEN_7B-1M | SicariusSicariiStuff/Impish_QWEN_7B-1M | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6381744881359238}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | SicariusSicariiStuff/Qwen2.5-14B_Uncensored | ea18a046-87bb-42d9-a1b2-d01fe875c970 | 0.0.1 | hfopenllm_v2/SicariusSicariiStuff_Qwen2.5-14B_Uncensored/1762652579.883949 | 1762652579.88395 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | SicariusSicariiStuff/Qwen2.5-14B_Uncensored | SicariusSicariiStuff/Qwen2.5-14B_Uncensored | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3173147249298528}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.0} |
HF Open LLM v2 | alibaba | SicariusSicariiStuff/Qwen2.5-14B_Uncencored | 7c6f4fa2-6847-4f57-8a8f-31673bd8b1e7 | 0.0.1 | hfopenllm_v2/SicariusSicariiStuff_Qwen2.5-14B_Uncencored/1762652579.883748 | 1762652579.883749 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | SicariusSicariiStuff/Qwen2.5-14B_Uncencored | SicariusSicariiStuff/Qwen2.5-14B_Uncencored | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.31579099012841483}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.0} |
HF Open LLM v2 | alibaba | RESMPDEV/EVA-Qwen2.5-1.5B-FRFR | 648e69e2-54de-43c4-93ac-f8422fa4b9c1 | 0.0.1 | hfopenllm_v2/RESMPDEV_EVA-Qwen2.5-1.5B-FRFR/1762652579.848896 | 1762652579.848896 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | RESMPDEV/EVA-Qwen2.5-1.5B-FRFR | RESMPDEV/EVA-Qwen2.5-1.5B-FRFR | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.308172316121225}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.544} |
HF Open LLM v2 | alibaba | RESMPDEV/Qwen2-Wukong-0.5B | 72a11594-1d83-4e12-b82f-137b6749f5ab | 0.0.1 | hfopenllm_v2/RESMPDEV_Qwen2-Wukong-0.5B/1762652579.849144 | 1762652579.849144 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | RESMPDEV/Qwen2-Wukong-0.5B | RESMPDEV/Qwen2-Wukong-0.5B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.1854235650296768}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 0.63} |
HF Open LLM v2 | alibaba | bunnycore/Qwen2.5-7B-Sky-R1-Mini | c1f39d51-d7a2-4fee-ba35-ef4e0d429b29 | 0.0.1 | hfopenllm_v2/bunnycore_Qwen2.5-7B-Sky-R1-Mini/1762652580.061045 | 1762652580.0610461 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Qwen2.5-7B-Sky-R1-Mini | bunnycore/Qwen2.5-7B-Sky-R1-Mini | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.23048622100471194}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | bunnycore/Qwen-2.5-7B-Deep-Stock-v5 | 39ce157b-e374-4963-8b40-6393835574f5 | 0.0.1 | hfopenllm_v2/bunnycore_Qwen-2.5-7B-Deep-Stock-v5/1762652580.05501 | 1762652580.055011 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Qwen-2.5-7B-Deep-Stock-v5 | bunnycore/Qwen-2.5-7B-Deep-Stock-v5 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.45090471061228654}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.613} |
HF Open LLM v2 | alibaba | bunnycore/Qwen2.5-3B-RP-Thinker-V2 | 497c8c15-1b77-4468-b33d-efa190c28e78 | 0.0.1 | hfopenllm_v2/bunnycore_Qwen2.5-3B-RP-Thinker-V2/1762652580.057826 | 1762652580.057826 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Qwen2.5-3B-RP-Thinker-V2 | bunnycore/Qwen2.5-3B-RP-Thinker-V2 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6419965691033125}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.397} |
HF Open LLM v2 | alibaba | bunnycore/Qwen-2.5-7B-Stock-Deep-Bespoke | af89079b-b84e-48f1-876a-ebf2d933d91e | 0.0.1 | hfopenllm_v2/bunnycore_Qwen-2.5-7B-Stock-Deep-Bespoke/1762652580.0556722 | 1762652580.0556731 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Qwen-2.5-7B-Stock-Deep-Bespoke | bunnycore/Qwen-2.5-7B-Stock-Deep-Bespoke | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5206219497599702}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.613} |
HF Open LLM v2 | alibaba | bunnycore/FwF-Qwen-7B-0.2 | ee7b9254-5e4a-46a0-a8b3-2ecc1708e6ab | 0.0.1 | hfopenllm_v2/bunnycore_FwF-Qwen-7B-0.2/1762652580.044472 | 1762652580.0444732 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/FwF-Qwen-7B-0.2 | bunnycore/FwF-Qwen-7B-0.2 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.44790710869382133}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | bunnycore/Qwen-2.5-7B-R1-Stock | 672e66ed-80e2-4b45-b52c-d9265f8efac8 | 0.0.1 | hfopenllm_v2/bunnycore_Qwen-2.5-7B-R1-Stock/1762652580.055454 | 1762652580.055455 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Qwen-2.5-7B-R1-Stock | bunnycore/Qwen-2.5-7B-R1-Stock | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7573261169253137}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.613} |
HF Open LLM v2 | alibaba | bunnycore/CyberCore-Qwen-2.1-7B | 131132b7-5b2a-421f-aa02-360ef9b7f206 | 0.0.1 | hfopenllm_v2/bunnycore_CyberCore-Qwen-2.1-7B/1762652580.0426219 | 1762652580.042623 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/CyberCore-Qwen-2.1-7B | bunnycore/CyberCore-Qwen-2.1-7B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5765757080103016}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | bunnycore/Qwen2.5-7B-MixStock-Sce-V0.3 | daf38e27-1149-44a8-84f2-93f842f4740a | 0.0.1 | hfopenllm_v2/bunnycore_Qwen2.5-7B-MixStock-Sce-V0.3/1762652580.058998 | 1762652580.058999 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Qwen2.5-7B-MixStock-Sce-V0.3 | bunnycore/Qwen2.5-7B-MixStock-Sce-V0.3 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.21197644472222593}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.613} |
HF Open LLM v2 | alibaba | bunnycore/FwF-Qwen-7B-0.1 | bfaeefb1-93c9-470b-9376-9c67a1d20862 | 0.0.1 | hfopenllm_v2/bunnycore_FwF-Qwen-7B-0.1/1762652580.04422 | 1762652580.044221 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/FwF-Qwen-7B-0.1 | bunnycore/FwF-Qwen-7B-0.1 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.30045390674521383}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | bunnycore/DeepSeek-R1-Distill-Qwen-7B-RRP-Ex | 7e6a55fb-da39-4b16-a59b-70635e636c02 | 0.0.1 | hfopenllm_v2/bunnycore_DeepSeek-R1-Distill-Qwen-7B-RRP-Ex/1762652580.043099 | 1762652580.043099 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/DeepSeek-R1-Distill-Qwen-7B-RRP-Ex | bunnycore/DeepSeek-R1-Distill-Qwen-7B-RRP-Ex | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.39010492160800014}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | bunnycore/Qwen2.5-7B-MixStock-V0.1 | 4a5bb50c-017d-421d-8ea1-21a8316db0f4 | 0.0.1 | hfopenllm_v2/bunnycore_Qwen2.5-7B-MixStock-V0.1/1762652580.059214 | 1762652580.059214 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Qwen2.5-7B-MixStock-V0.1 | bunnycore/Qwen2.5-7B-MixStock-V0.1 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7673428724672757}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.613} |
HF Open LLM v2 | alibaba | bunnycore/Qwen2.5-7B-R1-Bespoke-Task | 0f460b31-7249-4e2d-a614-d1230e95f3cf | 0.0.1 | hfopenllm_v2/bunnycore_Qwen2.5-7B-R1-Bespoke-Task/1762652580.059654 | 1762652580.059655 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Qwen2.5-7B-R1-Bespoke-Task | bunnycore/Qwen2.5-7B-R1-Bespoke-Task | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3786641666334215}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.613} |
HF Open LLM v2 | alibaba | bunnycore/Qwen-2.5-7B-Deep-Stock-v1 | a9fe98a7-e143-4100-99cd-adea90917c4c | 0.0.1 | hfopenllm_v2/bunnycore_Qwen-2.5-7B-Deep-Stock-v1/1762652580.054558 | 1762652580.054559 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Qwen-2.5-7B-Deep-Stock-v1 | bunnycore/Qwen-2.5-7B-Deep-Stock-v1 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5695066867023941}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.613} |
HF Open LLM v2 | alibaba | bunnycore/Qwen2.5-7B-RRP-1M | 9ec2ac0c-21e8-4c9c-ba5f-69ad284400bb | 0.0.1 | hfopenllm_v2/bunnycore_Qwen2.5-7B-RRP-1M/1762652580.059867 | 1762652580.0598679 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Qwen2.5-7B-RRP-1M | bunnycore/Qwen2.5-7B-RRP-1M | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7481338404322753}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.613} |
HF Open LLM v2 | alibaba | bunnycore/Qwen2.5-1.5B-Model-Stock | 865ffa1b-af08-416e-8de0-a16091d4ec79 | 0.0.1 | hfopenllm_v2/bunnycore_Qwen2.5-1.5B-Model-Stock/1762652580.0561001 | 1762652580.056101 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Qwen2.5-1.5B-Model-Stock | bunnycore/Qwen2.5-1.5B-Model-Stock | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.18292574812608325}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.776} |
HF Open LLM v2 | alibaba | bunnycore/Qwen-2.5-7B-Exp-Sce | c57286a9-ee0c-48e7-814e-8f2aa8e9688a | 0.0.1 | hfopenllm_v2/bunnycore_Qwen-2.5-7B-Exp-Sce/1762652580.055233 | 1762652580.055233 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Qwen-2.5-7B-Exp-Sce | bunnycore/Qwen-2.5-7B-Exp-Sce | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.765169749597734}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.613} |
HF Open LLM v2 | alibaba | bunnycore/DeepQwen-3B-LCoT-SCE | 49243e70-a24d-4e0c-b4c6-4275be1db944 | 0.0.1 | hfopenllm_v2/bunnycore_DeepQwen-3B-LCoT-SCE/1762652580.042877 | 1762652580.042878 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/DeepQwen-3B-LCoT-SCE | bunnycore/DeepQwen-3B-LCoT-SCE | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4489809261647983}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.396} |
HF Open LLM v2 | alibaba | bunnycore/Qwen2.5-7B-CyberRombos | 1dc11c68-ce65-4a5b-9f75-4cdf1775bfc6 | 0.0.1 | hfopenllm_v2/bunnycore_Qwen2.5-7B-CyberRombos/1762652580.058041 | 1762652580.058042 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Qwen2.5-7B-CyberRombos | bunnycore/Qwen2.5-7B-CyberRombos | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.751830698103255}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | bunnycore/Qwen-2.5-7B-Deep-Stock-v4 | 56ae78dc-3cae-43b0-afc9-e6fac3c6556a | 0.0.1 | hfopenllm_v2/bunnycore_Qwen-2.5-7B-Deep-Stock-v4/1762652580.054795 | 1762652580.054796 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Qwen-2.5-7B-Deep-Stock-v4 | bunnycore/Qwen-2.5-7B-Deep-Stock-v4 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7752862405085175}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.613} |