_leaderboard | _developer | _model | _uuid | schema_version | evaluation_id | retrieved_timestamp | source_data | evaluation_source_name | evaluation_source_type | source_organization_name | source_organization_url | source_organization_logo_url | evaluator_relationship | model_name | model_id | model_developer | model_inference_platform | evaluation_results | additional_details |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
HF Open LLM v2 | JayHyeon | JayHyeon/Qwen2.5-0.5B-SFT-1e-5-5ep | b5cdb9c2-d81a-4e0b-817a-3e101d122e7a | 0.0.1 | hfopenllm_v2/JayHyeon_Qwen2.5-0.5B-SFT-1e-5-5ep/1762652579.656047 | 1762652579.656048 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | JayHyeon/Qwen2.5-0.5B-SFT-1e-5-5ep | JayHyeon/Qwen2.5-0.5B-SFT-1e-5-5ep | JayHyeon | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.22918744486850445}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 0.63} |
HF Open LLM v2 | JayHyeon | JayHyeon/Qwen2.5-0.5B-SFT-5e-5-3ep | 1c779874-5568-462e-9e6e-0e3fd42d023e | 0.0.1 | hfopenllm_v2/JayHyeon_Qwen2.5-0.5B-SFT-5e-5-3ep/1762652579.674078 | 1762652579.674078 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | JayHyeon/Qwen2.5-0.5B-SFT-5e-5-3ep | JayHyeon/Qwen2.5-0.5B-SFT-5e-5-3ep | JayHyeon | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2198699450790569}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 0.63} |
HF Open LLM v2 | JayHyeon | JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_3e-6-3ep_0alp_0lam | de200bef-71a2-4efb-bc34-02f69385b636 | 0.0.1 | hfopenllm_v2/JayHyeon_Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_3e-6-3ep_0alp_0lam/1762652579.662327 | 1762652579.662327 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_3e-6-3ep_0alp_0lam | JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_3e-6-3ep_0alp_0lam | JayHyeon | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.23718068059415687}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 0.63} |
HF Open LLM v2 | JayHyeon | JayHyeon/Qwen2.5-0.5B-SFT-2e-4 | 5e307ea5-70da-476a-8d9e-1d488385565f | 0.0.1 | hfopenllm_v2/JayHyeon_Qwen2.5-0.5B-SFT-2e-4/1762652579.656255 | 1762652579.656256 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | JayHyeon/Qwen2.5-0.5B-SFT-2e-4 | JayHyeon/Qwen2.5-0.5B-SFT-2e-4 | JayHyeon | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2034335562972912}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 0.63} |
HF Open LLM v2 | JayHyeon | JayHyeon/Qwen2.5-0.5B-SFT-DPO-1epoch_v1 | ac94a989-668a-49e6-9975-9169d7394574 | 0.0.1 | hfopenllm_v2/JayHyeon_Qwen2.5-0.5B-SFT-DPO-1epoch_v1/1762652579.67534 | 1762652579.6753411 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | JayHyeon/Qwen2.5-0.5B-SFT-DPO-1epoch_v1 | JayHyeon/Qwen2.5-0.5B-SFT-DPO-1epoch_v1 | JayHyeon | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.20245947419513555}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2Model", "params_billions": 0.494} |
HF Open LLM v2 | JayHyeon | JayHyeon/Qwen_0.5-DPOP_3e-7-2ep_0alp_5lam | c02ad005-8e12-46d9-8bb3-090f62c6a946 | 0.0.1 | hfopenllm_v2/JayHyeon_Qwen_0.5-DPOP_3e-7-2ep_0alp_5lam/1762652579.677048 | 1762652579.6770492 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | JayHyeon/Qwen_0.5-DPOP_3e-7-2ep_0alp_5lam | JayHyeon/Qwen_0.5-DPOP_3e-7-2ep_0alp_5lam | JayHyeon | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2551408041773605}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2Model", "params_billions": 0.494} |
HF Open LLM v2 | JayHyeon | JayHyeon/Qwen_0.5-DPOP_3e-7-1ep_0alp_5lam | c23f1072-c7be-4eab-b866-16c6429071e4 | 0.0.1 | hfopenllm_v2/JayHyeon_Qwen_0.5-DPOP_3e-7-1ep_0alp_5lam/1762652579.6768441 | 1762652579.676845 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | JayHyeon/Qwen_0.5-DPOP_3e-7-1ep_0alp_5lam | JayHyeon/Qwen_0.5-DPOP_3e-7-1ep_0alp_5lam | JayHyeon | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.24474948691693596}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2Model", "params_billions": 0.494} |
HF Open LLM v2 | JayHyeon | JayHyeon/Qwen_0.5-DPO_1e-6-3ep_0alp_0lam | be9afede-e624-43e6-99dd-52e0d2b413ac | 0.0.1 | hfopenllm_v2/JayHyeon_Qwen_0.5-DPO_1e-6-3ep_0alp_0lam/1762652579.678605 | 1762652579.678606 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | JayHyeon/Qwen_0.5-DPO_1e-6-3ep_0alp_0lam | JayHyeon/Qwen_0.5-DPO_1e-6-3ep_0alp_0lam | JayHyeon | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.23163539408768735}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 0.63} |
HF Open LLM v2 | JayHyeon | JayHyeon/Qwen2.5-0.5B-SFT-1e-5 | 3eac4497-66af-4fc6-bf89-459631e4a418 | 0.0.1 | hfopenllm_v2/JayHyeon_Qwen2.5-0.5B-SFT-1e-5/1762652579.6553931 | 1762652579.655394 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | JayHyeon/Qwen2.5-0.5B-SFT-1e-5 | JayHyeon/Qwen2.5-0.5B-SFT-1e-5 | JayHyeon | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.1985875255433361}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 0.63} |
HF Open LLM v2 | JayHyeon | JayHyeon/Qwen_0.5-DPO_1e-7-3ep_0alp_0lam | 9632892a-a6b2-4f17-827e-bfef9a712985 | 0.0.1 | hfopenllm_v2/JayHyeon_Qwen_0.5-DPO_1e-7-3ep_0alp_0lam/1762652579.678855 | 1762652579.678856 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | JayHyeon/Qwen_0.5-DPO_1e-7-3ep_0alp_0lam | JayHyeon/Qwen_0.5-DPO_1e-7-3ep_0alp_0lam | JayHyeon | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.23598163982677073}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 0.63} |
HF Open LLM v2 | JayHyeon | JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_5e-7_1ep_0alp_0lam | c2e14e90-6c18-4a9f-9d68-a9d98960dd32 | 0.0.1 | hfopenllm_v2/JayHyeon_Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_5e-7_1ep_0alp_0lam/1762652579.663386 | 1762652579.663386 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_5e-7_1ep_0alp_0lam | JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_5e-7_1ep_0alp_0lam | JayHyeon | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.25264298727376694}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2Model", "params_billions": 0.494} |
HF Open LLM v2 | JayHyeon | JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-IRPO_5e-6-1ep_1alp_0lam | 670b89a5-2a83-480e-a33b-6903609a10dc | 0.0.1 | hfopenllm_v2/JayHyeon_Qwen2.5-0.5B-SFT-2e-5-2ep-IRPO_5e-6-1ep_1alp_0lam/1762652579.665683 | 1762652579.665684 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-IRPO_5e-6-1ep_1alp_0lam | JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-IRPO_5e-6-1ep_1alp_0lam | JayHyeon | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.23233464984020177}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2Model", "params_billions": 0.494} |
HF Open LLM v2 | JayHyeon | JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_5e-7_3ep_0alp_0lam | 2faf738f-64f4-4e14-8011-9e00a4e2dd6a | 0.0.1 | hfopenllm_v2/JayHyeon_Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_5e-7_3ep_0alp_0lam/1762652579.663875 | 1762652579.663876 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_5e-7_3ep_0alp_0lam | JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_5e-7_3ep_0alp_0lam | JayHyeon | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2441998342176536}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2Model", "params_billions": 0.494} |
HF Open LLM v2 | JayHyeon | JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-IRPO_5e-6-2ep_1alp_0lam | e660922f-847b-4993-91a4-b96809ff1e85 | 0.0.1 | hfopenllm_v2/JayHyeon_Qwen2.5-0.5B-SFT-2e-5-2ep-IRPO_5e-6-2ep_1alp_0lam/1762652579.665889 | 1762652579.66589 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-IRPO_5e-6-2ep_1alp_0lam | JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-IRPO_5e-6-2ep_1alp_0lam | JayHyeon | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.23151017079127825}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2Model", "params_billions": 0.494} |
HF Open LLM v2 | JayHyeon | JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-MDPO_3e-6-2ep_0alp_0lam | c589d3d6-9d8b-45e3-a6c6-60f25d44349b | 0.0.1 | hfopenllm_v2/JayHyeon_Qwen2.5-0.5B-SFT-2e-5-2ep-MDPO_3e-6-2ep_0alp_0lam/1762652579.6696231 | 1762652579.669624 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-MDPO_3e-6-2ep_0alp_0lam | JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-MDPO_3e-6-2ep_0alp_0lam | JayHyeon | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.24599839536873275}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2Model", "params_billions": 0.494} |
HF Open LLM v2 | JayHyeon | JayHyeon/Qwen2.5-0.5B-Instruct-SFT-DPO-1epoch_v1 | 46c6ab7f-33a0-4e72-9a63-b24da3f9c4d6 | 0.0.1 | hfopenllm_v2/JayHyeon_Qwen2.5-0.5B-Instruct-SFT-DPO-1epoch_v1/1762652579.653574 | 1762652579.653575 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | JayHyeon/Qwen2.5-0.5B-Instruct-SFT-DPO-1epoch_v1 | JayHyeon/Qwen2.5-0.5B-Instruct-SFT-DPO-1epoch_v1 | JayHyeon | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.24687274210206694}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2Model", "params_billions": 0.494} |
HF Open LLM v2 | JayHyeon | JayHyeon/Qwen2.5-0.5B-SFT-2e-5-3ep | 57d9c59d-8cd8-4253-a076-8b16becc740e | 0.0.1 | hfopenllm_v2/JayHyeon_Qwen2.5-0.5B-SFT-2e-5-3ep/1762652579.671975 | 1762652579.671975 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | JayHyeon/Qwen2.5-0.5B-SFT-2e-5-3ep | JayHyeon/Qwen2.5-0.5B-SFT-2e-5-3ep | JayHyeon | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.22808813946993975}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 0.63} |
HF Open LLM v2 | JayHyeon | JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_5e-6-2ep_0alp_0lam | 972e0d76-63bb-431b-9d9b-68dd6b738447 | 0.0.1 | hfopenllm_v2/JayHyeon_Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_5e-6-2ep_0alp_0lam/1762652579.662969 | 1762652579.662969 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_5e-6-2ep_0alp_0lam | JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_5e-6-2ep_0alp_0lam | JayHyeon | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2420765790325226}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2Model", "params_billions": 0.494} |
HF Open LLM v2 | JayHyeon | JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-MDPO_5e-7_1ep_0alp_0lam | 8662faaa-8964-468a-991b-43b2f0449d48 | 0.0.1 | hfopenllm_v2/JayHyeon_Qwen2.5-0.5B-SFT-2e-5-2ep-MDPO_5e-7_1ep_0alp_0lam/1762652579.6709208 | 1762652579.6709208 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-MDPO_5e-7_1ep_0alp_0lam | JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-MDPO_5e-7_1ep_0alp_0lam | JayHyeon | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2487211709375568}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2Model", "params_billions": 0.494} |
HF Open LLM v2 | JayHyeon | JayHyeon/Qwen_0.5-DPOP_3e-6-1ep_0alp_5lam | f78ac837-d5f4-48f1-8a9e-1549b0020160 | 0.0.1 | hfopenllm_v2/JayHyeon_Qwen_0.5-DPOP_3e-6-1ep_0alp_5lam/1762652579.6762261 | 1762652579.6762261 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | JayHyeon/Qwen_0.5-DPOP_3e-6-1ep_0alp_5lam | JayHyeon/Qwen_0.5-DPOP_3e-6-1ep_0alp_5lam | JayHyeon | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.24807178286945303}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2Model", "params_billions": 0.494} |
HF Open LLM v2 | JayHyeon | JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-MDPO_0.5_1e-7-1ep_0alp_0lam | da330322-f144-44bb-833a-7b92c11f3888 | 0.0.1 | hfopenllm_v2/JayHyeon_Qwen2.5-0.5B-SFT-2e-5-2ep-MDPO_0.5_1e-7-1ep_0alp_0lam/1762652579.667231 | 1762652579.667236 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-MDPO_0.5_1e-7-1ep_0alp_0lam | JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-MDPO_0.5_1e-7-1ep_0alp_0lam | JayHyeon | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.24992021170494289}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2Model", "params_billions": 0.494} |
HF Open LLM v2 | JayHyeon | JayHyeon/Qwen_0.5-DPO_3e-6-1ep_0alp_0lam | a690910a-388f-4a51-98a2-fc1e1bb327e2 | 0.0.1 | hfopenllm_v2/JayHyeon_Qwen_0.5-DPO_3e-6-1ep_0alp_0lam/1762652579.679086 | 1762652579.679086 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | JayHyeon/Qwen_0.5-DPO_3e-6-1ep_0alp_0lam | JayHyeon/Qwen_0.5-DPO_3e-6-1ep_0alp_0lam | JayHyeon | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.23370878158840763}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2Model", "params_billions": 0.494} |
HF Open LLM v2 | CultriX | CultriX/Qwestion-14B | c6ad96f2-fcb9-47c5-8106-936436b6ad1b | 0.0.1 | hfopenllm_v2/CultriX_Qwestion-14B/1762652579.521322 | 1762652579.521322 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | CultriX/Qwestion-14B | CultriX/Qwestion-14B | CultriX | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6317803428237078}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | CultriX | CultriX/Qwen2.5-14B-Wernicke-SFT | 84bc884e-29be-40b5-bfe2-6147bec90a78 | 0.0.1 | hfopenllm_v2/CultriX_Qwen2.5-14B-Wernicke-SFT/1762652579.520046 | 1762652579.5200472 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | CultriX/Qwen2.5-14B-Wernicke-SFT | CultriX/Qwen2.5-14B-Wernicke-SFT | CultriX | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4937443760333692}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | Epiculous | Epiculous/Azure_Dusk-v0.2 | 79790560-846a-48fb-b37a-462162eb0e97 | 0.0.1 | hfopenllm_v2/Epiculous_Azure_Dusk-v0.2/1762652579.5970619 | 1762652579.5970628 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Epiculous/Azure_Dusk-v0.2 | Epiculous/Azure_Dusk-v0.2 | Epiculous | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.346715603487635}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "MistralForCausalLM", "params_billions": 12.248} |
HF Open LLM v2 | Epiculous | Epiculous/Violet_Twilight-v0.2 | 83990950-a34c-463f-9a1a-d9371910da6f | 0.0.1 | hfopenllm_v2/Epiculous_Violet_Twilight-v0.2/1762652579.597749 | 1762652579.59775 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Epiculous/Violet_Twilight-v0.2 | Epiculous/Violet_Twilight-v0.2 | Epiculous | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.45317756885064964}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "MistralForCausalLM", "params_billions": 12.248} |
HF Open LLM v2 | Epiculous | Epiculous/Crimson_Dawn-v0.2 | 91b7917e-a908-4281-9a4d-a2c1e7558105 | 0.0.1 | hfopenllm_v2/Epiculous_Crimson_Dawn-v0.2/1762652579.5973198 | 1762652579.5973198 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Epiculous/Crimson_Dawn-v0.2 | Epiculous/Crimson_Dawn-v0.2 | Epiculous | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3103454389907667}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "MistralForCausalLM", "params_billions": 12.248} |
HF Open LLM v2 | Epiculous | Epiculous/NovaSpark | 9270e697-84b1-46c5-afcc-481065f2be8f | 0.0.1 | hfopenllm_v2/Epiculous_NovaSpark/1762652579.597535 | 1762652579.597536 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Epiculous/NovaSpark | Epiculous/NovaSpark | Epiculous | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6408473960203371}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meditsolutions | meditsolutions/MSH-v1-Bielik-v2.3-Instruct-MedIT-merge | a7e4718c-c4cf-4c0f-b67f-fd12fa54e4ad | 0.0.1 | hfopenllm_v2/meditsolutions_MSH-v1-Bielik-v2.3-Instruct-MedIT-merge/1762652580.344883 | 1762652580.344884 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | meditsolutions/MSH-v1-Bielik-v2.3-Instruct-MedIT-merge | meditsolutions/MSH-v1-Bielik-v2.3-Instruct-MedIT-merge | meditsolutions | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5814217387642566}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "MistralForCausalLM", "params_billions": 11.169} |
HF Open LLM v2 | meditsolutions | meditsolutions/SmolLM2-MedIT-Upscale-2B | d78a23ac-c3f1-4ad5-bbd2-ea37faea455f | 0.0.1 | hfopenllm_v2/meditsolutions_SmolLM2-MedIT-Upscale-2B/1762652580.3453178 | 1762652580.3453188 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | meditsolutions/SmolLM2-MedIT-Upscale-2B | meditsolutions/SmolLM2-MedIT-Upscale-2B | meditsolutions | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6429207835210575}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 2.114} |
HF Open LLM v2 | meditsolutions | meditsolutions/Llama-3.2-SUN-HDIC-1B-Instruct | ac6f2c5a-32b7-4553-acaa-e329f1916c85 | 0.0.1 | hfopenllm_v2/meditsolutions_Llama-3.2-SUN-HDIC-1B-Instruct/1762652580.344357 | 1762652580.344363 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | meditsolutions/Llama-3.2-SUN-HDIC-1B-Instruct | meditsolutions/Llama-3.2-SUN-HDIC-1B-Instruct | meditsolutions | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6826631116548536}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 1.498} |
HF Open LLM v2 | meditsolutions | meditsolutions/Llama-3.2-SUN-1B-Instruct | f4c341cb-6489-49a1-9532-6b78c2238b2a | 0.0.1 | hfopenllm_v2/meditsolutions_Llama-3.2-SUN-1B-Instruct/1762652580.343025 | 1762652580.343026 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | meditsolutions/Llama-3.2-SUN-1B-Instruct | meditsolutions/Llama-3.2-SUN-1B-Instruct | meditsolutions | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6412973133507981}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaMedITForCausalLM", "params_billions": 1.498} |
HF Open LLM v2 | meditsolutions | meditsolutions/MedIT-Mesh-3B-Instruct | 89568570-298f-4dc5-9b7b-c9ce84d4010e | 0.0.1 | hfopenllm_v2/meditsolutions_MedIT-Mesh-3B-Instruct/1762652580.345099 | 1762652580.345099 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | meditsolutions/MedIT-Mesh-3B-Instruct | meditsolutions/MedIT-Mesh-3B-Instruct | meditsolutions | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5814217387642566}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Phi3ForCausalLM", "params_billions": 3.821} |
HF Open LLM v2 | meditsolutions | meditsolutions/MSH-Lite-7B-v1-Bielik-v2.3-Instruct-Llama-Prune | ff57f4fa-eb78-4ef4-9d92-9f160a1b936a | 0.0.1 | hfopenllm_v2/meditsolutions_MSH-Lite-7B-v1-Bielik-v2.3-Instruct-Llama-Prune/1762652580.344661 | 1762652580.344662 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | meditsolutions/MSH-Lite-7B-v1-Bielik-v2.3-Instruct-Llama-Prune | meditsolutions/MSH-Lite-7B-v1-Bielik-v2.3-Instruct-Llama-Prune | meditsolutions | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.36550020611976225}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "MistralForCausalLM", "params_billions": 7.646} |
HF Open LLM v2 | meditsolutions | meditsolutions/Llama-3.2-SUN-2.5B-chat | 7385392b-79e9-4764-9326-d7bc1586b918 | 0.0.1 | hfopenllm_v2/meditsolutions_Llama-3.2-SUN-2.5B-chat/1762652580.344106 | 1762652580.344107 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | meditsolutions/Llama-3.2-SUN-2.5B-chat | meditsolutions/Llama-3.2-SUN-2.5B-chat | meditsolutions | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.560414145578177}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 2.472} |
HF Open LLM v2 | meditsolutions | meditsolutions/Llama-3.2-SUN-1B-chat | 7e72df4d-7a54-4e11-b4a2-44224db285ec | 0.0.1 | hfopenllm_v2/meditsolutions_Llama-3.2-SUN-1B-chat/1762652580.343276 | 1762652580.343277 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | meditsolutions/Llama-3.2-SUN-1B-chat | meditsolutions/Llama-3.2-SUN-1B-chat | meditsolutions | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5481743994822625}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 1.498} |
HF Open LLM v2 | Deci | Deci/DeciLM-7B-instruct | 1b3a2041-d14f-44d1-9efd-dbeceaa67ee6 | 0.0.1 | hfopenllm_v2/Deci_DeciLM-7B-instruct/1762652579.546672 | 1762652579.546672 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Deci/DeciLM-7B-instruct | Deci/DeciLM-7B-instruct | Deci | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4880239985460799}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "DeciLMForCausalLM", "params_billions": 7.044} |
HF Open LLM v2 | Deci | Deci/DeciLM-7B | f9d2408b-03dd-4cf8-851e-51a15ff13be9 | 0.0.1 | hfopenllm_v2/Deci_DeciLM-7B/1762652579.5463831 | 1762652579.5463839 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Deci/DeciLM-7B | Deci/DeciLM-7B | Deci | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.28129474239462404}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "DeciLMForCausalLM", "params_billions": 7.044} |
HF Open LLM v2 | sthenno-com | sthenno-com/miscii-14b-1028 | 3f2549af-9bc5-4ad1-a429-79bbb91c929f | 0.0.1 | hfopenllm_v2/sthenno-com_miscii-14b-1028/1762652580.541399 | 1762652580.5414 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sthenno-com/miscii-14b-1028 | sthenno-com/miscii-14b-1028 | sthenno-com | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.8236711924360696}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | sthenno-com | sthenno-com/miscii-14b-0130 | 40a09314-bb43-41ff-a36a-b39064c37add | 0.0.1 | hfopenllm_v2/sthenno-com_miscii-14b-0130/1762652580.540879 | 1762652580.54088 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sthenno-com/miscii-14b-0130 | sthenno-com/miscii-14b-0130 | sthenno-com | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6647029880716498}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | sthenno-com | sthenno-com/miscii-14b-1225 | ab816ab5-9edb-49d1-8f89-c3dc36a8a0de | 0.0.1 | hfopenllm_v2/sthenno-com_miscii-14b-1225/1762652580.541638 | 1762652580.5416389 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sthenno-com/miscii-14b-1225 | sthenno-com/miscii-14b-1225 | sthenno-com | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.787800812954073}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | sthenno-com | sthenno-com/miscii-14b-0218 | f73b09b4-020d-49fd-8ede-6a690088be94 | 0.0.1 | hfopenllm_v2/sthenno-com_miscii-14b-0218/1762652580.541173 | 1762652580.541174 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sthenno-com/miscii-14b-0218 | sthenno-com/miscii-14b-0218 | sthenno-com | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7655941790006073}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | ashercn97 | ashercn97/a1-v0.0.1 | a9e3fe74-400c-444c-9b28-6f49c6671f96 | 0.0.1 | hfopenllm_v2/ashercn97_a1-v0.0.1/1762652580.019211 | 1762652580.019212 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ashercn97/a1-v0.0.1 | ashercn97/a1-v0.0.1 | ashercn97 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.21984445715146922}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | ashercn97 | ashercn97/a1-v002 | 509c2895-70ae-4381-94ef-f6cdf9ee07ef | 0.0.1 | hfopenllm_v2/ashercn97_a1-v002/1762652580.019455 | 1762652580.019456 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ashercn97/a1-v002 | ashercn97/a1-v002 | ashercn97 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2584631001298776}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | braindao | braindao/iq-code-evmind-0.5b | 58f1b3d7-74a6-4ed0-b927-afaedfdda25f | 0.0.1 | hfopenllm_v2/braindao_iq-code-evmind-0.5b/1762652580.0403671 | 1762652580.040368 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | braindao/iq-code-evmind-0.5b | braindao/iq-code-evmind-0.5b | braindao | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3215612353001148}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 0.494} |
HF Open LLM v2 | braindao | braindao/Qwen2.5-14B-Instruct | cb442f90-a0e1-4588-900c-548b994a764d | 0.0.1 | hfopenllm_v2/braindao_Qwen2.5-14B-Instruct/1762652580.040103 | 1762652580.040104 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | braindao/Qwen2.5-14B-Instruct | braindao/Qwen2.5-14B-Instruct | braindao | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.8142539572778007}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | nbrahme | nbrahme/IndusQ | b372e098-0e1c-410a-8f5a-1bd9a910aa6b | 0.0.1 | hfopenllm_v2/nbrahme_IndusQ/1762652580.38863 | 1762652580.388631 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | nbrahme/IndusQ | nbrahme/IndusQ | nbrahme | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.24397487555242311}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "GPT2LMHeadModel", "params_billions": 1.176} |
HF Open LLM v2 | jebish7 | jebish7/aya-expanse-8b | 70f2cb5c-feb3-44ac-9346-7ff60137e1c7 | 0.0.1 | hfopenllm_v2/jebish7_aya-expanse-8b/1762652580.282242 | 1762652580.282243 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | jebish7/aya-expanse-8b | jebish7/aya-expanse-8b | jebish7 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.37911408396388246}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "CohereForCausalLM", "params_billions": 8.028} |
HF Open LLM v2 | jebish7 | jebish7/Nemotron-4-Mini-Hindi-4B-Base | 70097d1f-8c48-49ab-b285-eebe2c85628e | 0.0.1 | hfopenllm_v2/jebish7_Nemotron-4-Mini-Hindi-4B-Base/1762652580.2815292 | 1762652580.2815301 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | jebish7/Nemotron-4-Mini-Hindi-4B-Base | jebish7/Nemotron-4-Mini-Hindi-4B-Base | jebish7 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.22848818911599}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH",... | {"precision": "bfloat16", "architecture": "NemotronForCausalLM", "params_billions": 4.191} |
HF Open LLM v2 | jebish7 | jebish7/Nemotron-4-Mini-Hindi-4B-Instruct | e108df0b-a1ce-4c07-b683-6d3b33fd3988 | 0.0.1 | hfopenllm_v2/jebish7_Nemotron-4-Mini-Hindi-4B-Instruct/1762652580.2817988 | 1762652580.2818 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | jebish7/Nemotron-4-Mini-Hindi-4B-Instruct | jebish7/Nemotron-4-Mini-Hindi-4B-Instruct | jebish7 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3345257250761313}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "NemotronForCausalLM", "params_billions": 4.191} |
HF Open LLM v2 | jebish7 | jebish7/Llama-3-Nanda-10B-Chat | 739c83a9-8ff7-48df-af0c-494891df487b | 0.0.1 | hfopenllm_v2/jebish7_Llama-3-Nanda-10B-Chat/1762652580.28106 | 1762652580.2810612 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | jebish7/Llama-3-Nanda-10B-Chat | jebish7/Llama-3-Nanda-10B-Chat | jebish7 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2952831819572069}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 9.985} |
HF Open LLM v2 | jebish7 | jebish7/Nemotron-Mini-4B-Instruct | 77bd2442-4004-48cb-ba45-eeb1ffec2a39 | 0.0.1 | hfopenllm_v2/jebish7_Nemotron-Mini-4B-Instruct/1762652580.282024 | 1762652580.282024 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | jebish7/Nemotron-Mini-4B-Instruct | jebish7/Nemotron-Mini-4B-Instruct | jebish7 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.37092026932982264}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "NemotronForCausalLM", "params_billions": 4.191} |
HF Open LLM v2 | jebish7 | jebish7/Llama-3.1-8B-Instruct | cc65b968-d766-4825-85cd-c36872eb1986 | 0.0.1 | hfopenllm_v2/jebish7_Llama-3.1-8B-Instruct/1762652580.281322 | 1762652580.281322 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | jebish7/Llama-3.1-8B-Instruct | jebish7/Llama-3.1-8B-Instruct | jebish7 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5058345190760515}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | pints-ai | pints-ai/1.5-Pints-16K-v0.1 | 8dff3ec1-066f-4f5f-ac57-879d693ee3fb | 0.0.1 | hfopenllm_v2/pints-ai_1.5-Pints-16K-v0.1/1762652580.4407208 | 1762652580.440722 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | pints-ai/1.5-Pints-16K-v0.1 | pints-ai/1.5-Pints-16K-v0.1 | pints-ai | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.1635914927946737}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 1.566} |
HF Open LLM v2 | pints-ai | pints-ai/1.5-Pints-2K-v0.1 | 2ed76213-e562-4b36-bf46-93f09df88ee9 | 0.0.1 | hfopenllm_v2/pints-ai_1.5-Pints-2K-v0.1/1762652580.4409652 | 1762652580.440966 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | pints-ai/1.5-Pints-2K-v0.1 | pints-ai/1.5-Pints-2K-v0.1 | pints-ai | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.17615593292463996}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 1.566} |
HF Open LLM v2 | AIDC-AI | AIDC-AI/Marco-o1 | 17f7398f-675d-4b38-b233-64fc106737c3 | 0.0.1 | hfopenllm_v2/AIDC-AI_Marco-o1/1762652579.47579 | 1762652579.4757912 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | AIDC-AI/Marco-o1 | AIDC-AI/Marco-o1 | AIDC-AI | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.477083028586373}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | vicgalle | vicgalle/ConfigurableSOLAR-10.7B | 2dec3c49-01f0-4940-aa45-e7a6b2648e8f | 0.0.1 | hfopenllm_v2/vicgalle_ConfigurableSOLAR-10.7B/1762652580.587757 | 1762652580.587758 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | vicgalle/ConfigurableSOLAR-10.7B | vicgalle/ConfigurableSOLAR-10.7B | vicgalle | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5099558061499045}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 10.732} |
HF Open LLM v2 | vicgalle | vicgalle/ConfigurableHermes-7B | 176727e5-31dc-462a-8210-4735543c32f2 | 0.0.1 | hfopenllm_v2/vicgalle_ConfigurableHermes-7B/1762652580.5875661 | 1762652580.587567 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | vicgalle/ConfigurableHermes-7B | vicgalle/ConfigurableHermes-7B | vicgalle | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5410798902467675}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "MistralForCausalLM", "params_billions": 7.242} |
HF Open LLM v2 | vicgalle | vicgalle/Merge-Mixtral-Prometheus-8x7B | e6a0cf8f-323d-40c0-90c2-0e2071321df0 | 0.0.1 | hfopenllm_v2/vicgalle_Merge-Mixtral-Prometheus-8x7B/1762652580.588394 | 1762652580.588395 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | vicgalle/Merge-Mixtral-Prometheus-8x7B | vicgalle/Merge-Mixtral-Prometheus-8x7B | vicgalle | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5744025851407598}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "MixtralForCausalLM", "params_billions": 46.703} |
HF Open LLM v2 | vicgalle | vicgalle/ConfigurableBeagle-11B | 3fd95536-ec61-4470-9082-14a116d20e80 | 0.0.1 | hfopenllm_v2/vicgalle_ConfigurableBeagle-11B/1762652580.587369 | 1762652580.58737 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | vicgalle/ConfigurableBeagle-11B | vicgalle/ConfigurableBeagle-11B | vicgalle | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5834452585805663}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "MistralForCausalLM", "params_billions": 10.732} |
HF Open LLM v2 | vicgalle | vicgalle/CarbonBeagle-11B-truthy | d67aa278-fcc9-4404-a87a-4be9e1bdaa1a | 0.0.1 | hfopenllm_v2/vicgalle_CarbonBeagle-11B-truthy/1762652580.586528 | 1762652580.586528 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | vicgalle/CarbonBeagle-11B-truthy | vicgalle/CarbonBeagle-11B-truthy | vicgalle | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5212214701436633}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "MistralForCausalLM", "params_billions": 10.732} |
HF Open LLM v2 | vicgalle | vicgalle/Configurable-Yi-1.5-9B-Chat | 0a933130-dca9-435c-a529-16065b540aab | 0.0.1 | hfopenllm_v2/vicgalle_Configurable-Yi-1.5-9B-Chat/1762652580.587164 | 1762652580.5871649 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | vicgalle/Configurable-Yi-1.5-9B-Chat | vicgalle/Configurable-Yi-1.5-9B-Chat | vicgalle | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.43234506664538974}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.829} |
HF Open LLM v2 | vicgalle | vicgalle/CarbonBeagle-11B | b906411a-6663-4c9f-9fe6-4d60e99e4e41 | 0.0.1 | hfopenllm_v2/vicgalle_CarbonBeagle-11B/1762652580.5862951 | 1762652580.5862951 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | vicgalle/CarbonBeagle-11B | vicgalle/CarbonBeagle-11B | vicgalle | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5415298075772285}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "MistralForCausalLM", "params_billions": 10.732} |
HF Open LLM v2 | vicgalle | vicgalle/Configurable-Llama-3.1-8B-Instruct | 82a3253a-7a6e-4d75-8ea2-114b4dee6d16 | 0.0.1 | hfopenllm_v2/vicgalle_Configurable-Llama-3.1-8B-Instruct/1762652580.586963 | 1762652580.586964 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | vicgalle/Configurable-Llama-3.1-8B-Instruct | vicgalle/Configurable-Llama-3.1-8B-Instruct | vicgalle | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.8312399987588488}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | arshiaafshani | arshiaafshani/Arsh-V1 | 6f40503d-59ee-4cdc-a697-ef405d9644a7 | 0.0.1 | hfopenllm_v2/arshiaafshani_Arsh-V1/1762652580.0186949 | 1762652580.0186958 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | arshiaafshani/Arsh-V1 | arshiaafshani/Arsh-V1 | arshiaafshani | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6043276284702368}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 13.96} |
HF Open LLM v2 | PowerInfer | PowerInfer/SmallThinker-3B-Preview | 6613aff7-8f26-4b74-b08b-37fbd7990e42 | 0.0.1 | hfopenllm_v2/PowerInfer_SmallThinker-3B-Preview/1762652579.814635 | 1762652579.814636 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | PowerInfer/SmallThinker-3B-Preview | PowerInfer/SmallThinker-3B-Preview | PowerInfer | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6199650261306666}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.397} |
HF Open LLM v2 | saishshinde15 | saishshinde15/TethysAI_Vortex | 6e20bb3a-728d-40ef-b6ca-91b0dde02da4 | 0.0.1 | hfopenllm_v2/saishshinde15_TethysAI_Vortex/1762652580.5066721 | 1762652580.5066729 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | saishshinde15/TethysAI_Vortex | saishshinde15/TethysAI_Vortex | saishshinde15 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4297718941297978}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.086} |
HF Open LLM v2 | saishshinde15 | saishshinde15/TethysAI_Base_Reasoning | 74cb7205-e6c9-4faf-a84e-c15daa2ba62b | 0.0.1 | hfopenllm_v2/saishshinde15_TethysAI_Base_Reasoning/1762652580.5062242 | 1762652580.5062249 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | saishshinde15/TethysAI_Base_Reasoning | saishshinde15/TethysAI_Base_Reasoning | saishshinde15 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6368757119997164}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.086} |
HF Open LLM v2 | saishshinde15 | saishshinde15/TethysAI_Vortex_Reasoning | 79022531-2599-4c19-93e0-ecdbde7bf736 | 0.0.1 | hfopenllm_v2/saishshinde15_TethysAI_Vortex_Reasoning/1762652580.506901 | 1762652580.506902 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | saishshinde15/TethysAI_Vortex_Reasoning | saishshinde15/TethysAI_Vortex_Reasoning | saishshinde15 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.40211970903868405}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.086} |
HF Open LLM v2 | xkp24 | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_8b-table-0.002 | 4deeeff7-f62d-4c42-b32a-98bdd773a758 | 0.0.1 | hfopenllm_v2/xkp24_Llama-3-8B-Instruct-SPPO-score-Iter2_bt_8b-table-0.002/1762652580.599496 | 1762652580.5994968 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_8b-table-0.002 | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_8b-table-0.002 | xkp24 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7131876753680235}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | xkp24 | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_2b-table | a9888e61-bd14-4769-b620-cda908c8ba3e | 0.0.1 | hfopenllm_v2/xkp24_Llama-3-8B-Instruct-SPPO-Iter2_bt_2b-table/1762652580.598392 | 1762652580.5983932 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_2b-table | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_2b-table | xkp24 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6374752323834094}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | xkp24 | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_8b-table | 2fe15418-16bc-4f60-bad2-7329a3670507 | 0.0.1 | hfopenllm_v2/xkp24_Llama-3-8B-Instruct-SPPO-Iter2_gp_8b-table/1762652580.599085 | 1762652580.599086 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_8b-table | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_8b-table | xkp24 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6620799478716473}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | xkp24 | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_8b-table-0.002 | c583cff2-2944-4afb-b32e-c0f49bc0d3b7 | 0.0.1 | hfopenllm_v2/xkp24_Llama-3-8B-Instruct-SPPO-score-Iter2_gp_8b-table-0.002/1762652580.599936 | 1762652580.599936 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_8b-table-0.002 | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_8b-table-0.002 | xkp24 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6453188650558297}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | xkp24 | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_2b-table-0.001 | f6bcff0a-559b-44c1-9c70-259446b3ebe5 | 0.0.1 | hfopenllm_v2/xkp24_Llama-3-8B-Instruct-SPPO-score-Iter2_bt_2b-table-0.001/1762652580.599285 | 1762652580.599286 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_2b-table-0.001 | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_2b-table-0.001 | xkp24 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6042278931014153}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | xkp24 | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_2b-table | 71a54215-e97a-4ee6-928c-344bd690b020 | 0.0.1 | hfopenllm_v2/xkp24_Llama-3-8B-Instruct-SPPO-Iter2_gp_2b-table/1762652580.598878 | 1762652580.5988789 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_2b-table | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_2b-table | xkp24 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6568593553992297}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | xkp24 | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_2b-table-0.001 | 8ec55b3f-e425-4ee9-98d5-dac775977514 | 0.0.1 | hfopenllm_v2/xkp24_Llama-3-8B-Instruct-SPPO-score-Iter2_gp_2b-table-0.001/1762652580.599715 | 1762652580.599715 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_2b-table-0.001 | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_2b-table-0.001 | xkp24 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.594710922574325}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | xkp24 | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_8b-table | 99d6ac02-a8f8-409f-ad9d-ce5fd7ed6fe0 | 0.0.1 | hfopenllm_v2/xkp24_Llama-3-8B-Instruct-SPPO-Iter2_bt_8b-table/1762652580.598656 | 1762652580.598656 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_8b-table | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_8b-table | xkp24 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7274509412802475}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | Salesforce | Salesforce/LLaMA-3-8B-SFR-Iterative-DPO-R | 1bf65062-4526-407d-ba4f-866b045dbf3b | 0.0.1 | hfopenllm_v2/Salesforce_LLaMA-3-8B-SFR-Iterative-DPO-R/1762652579.8714519 | 1762652579.8714519 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Salesforce/LLaMA-3-8B-SFR-Iterative-DPO-R | Salesforce/LLaMA-3-8B-SFR-Iterative-DPO-R | Salesforce | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.38156203318306536}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | SkyOrbis | SkyOrbis/SKY-Ko-Qwen2.5-7B-Instruct-SFT-step-15000 | 7875e792-80dd-4fa8-9743-b8ef42a4cdb7 | 0.0.1 | hfopenllm_v2/SkyOrbis_SKY-Ko-Qwen2.5-7B-Instruct-SFT-step-15000/1762652579.888021 | 1762652579.888022 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | SkyOrbis/SKY-Ko-Qwen2.5-7B-Instruct-SFT-step-15000 | SkyOrbis/SKY-Ko-Qwen2.5-7B-Instruct-SFT-step-15000 | SkyOrbis | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.38188672721711725}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | SkyOrbis | SkyOrbis/SKY-Ko-Qwen2.5-7B-Instruct-SFT-step-5000 | 9354b915-68cd-47ca-a1e8-7481a8b33c49 | 0.0.1 | hfopenllm_v2/SkyOrbis_SKY-Ko-Qwen2.5-7B-Instruct-SFT-step-5000/1762652579.8882601 | 1762652579.888261 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | SkyOrbis/SKY-Ko-Qwen2.5-7B-Instruct-SFT-step-5000 | SkyOrbis/SKY-Ko-Qwen2.5-7B-Instruct-SFT-step-5000 | SkyOrbis | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3812373391490135}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | SkyOrbis | SkyOrbis/SKY-Ko-Qwen2.5-3B-Instruct | bdcf5d38-55d2-4f55-8bd1-7f4cd94f758c | 0.0.1 | hfopenllm_v2/SkyOrbis_SKY-Ko-Qwen2.5-3B-Instruct/1762652579.887695 | 1762652579.8876958 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | SkyOrbis/SKY-Ko-Qwen2.5-3B-Instruct | SkyOrbis/SKY-Ko-Qwen2.5-3B-Instruct | SkyOrbis | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3534100630770799}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.086} |
HF Open LLM v2 | bhuvneshsaini | bhuvneshsaini/merged_model | 44e6cddd-4ecc-499f-a6b7-d8ee0640c2f9 | 0.0.1 | hfopenllm_v2/bhuvneshsaini_merged_model/1762652580.032705 | 1762652580.032706 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bhuvneshsaini/merged_model | bhuvneshsaini/merged_model | bhuvneshsaini | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.1812767900282362}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 4.715} |
HF Open LLM v2 | yanng1242 | yanng1242/Marcoro14-7B-slerp | f5005cc2-cec4-4a1c-be09-a670d996d15b | 0.0.1 | hfopenllm_v2/yanng1242_Marcoro14-7B-slerp/1762652580.604092 | 1762652580.604092 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | yanng1242/Marcoro14-7B-slerp | yanng1242/Marcoro14-7B-slerp | yanng1242 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4059916576904835}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "MistralForCausalLM", "params_billions": 7.242} |
HF Open LLM v2 | saltlux | saltlux/luxia-21.4b-alignment-v1.0 | fe959cc1-17bd-4e87-b9b7-84d3adddbedb | 0.0.1 | hfopenllm_v2/saltlux_luxia-21.4b-alignment-v1.0/1762652580.507964 | 1762652580.5079648 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | saltlux/luxia-21.4b-alignment-v1.0 | saltlux/luxia-21.4b-alignment-v1.0 | saltlux | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.36929679915956326}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 21.421} |
HF Open LLM v2 | saltlux | saltlux/luxia-21.4b-alignment-v1.2 | b89b30bb-fbaa-4ac6-8535-9f31cf87eb55 | 0.0.1 | hfopenllm_v2/saltlux_luxia-21.4b-alignment-v1.2/1762652580.508301 | 1762652580.5083032 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | saltlux/luxia-21.4b-alignment-v1.2 | saltlux/luxia-21.4b-alignment-v1.2 | saltlux | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.41153694419695297}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 21.421} |
HF Open LLM v2 | FallenMerick | FallenMerick/Chewy-Lemon-Cookie-11B | f4f2289c-5b3c-4040-9e34-ac20352f45d7 | 0.0.1 | hfopenllm_v2/FallenMerick_Chewy-Lemon-Cookie-11B/1762652579.6178062 | 1762652579.6178071 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | FallenMerick/Chewy-Lemon-Cookie-11B | FallenMerick/Chewy-Lemon-Cookie-11B | FallenMerick | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4875242135312083}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "MistralForCausalLM", "params_billions": 10.732} |
HF Open LLM v2 | arisin | arisin/orca-platypus-13B-slerp | ecd45b21-21f7-49e2-b314-c7b678bdc8c1 | 0.0.1 | hfopenllm_v2/arisin_orca-platypus-13B-slerp/1762652580.018446 | 1762652580.018446 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | arisin/orca-platypus-13B-slerp | arisin/orca-platypus-13B-slerp | arisin | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.26718107953563214}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 13.016} |
HF Open LLM v2 | dfurman | dfurman/CalmeRys-78B-Orpo-v0.1 | 31d8cf18-7b35-438e-8dc6-cdba0f593348 | 0.0.1 | hfopenllm_v2/dfurman_CalmeRys-78B-Orpo-v0.1/1762652580.124436 | 1762652580.124437 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | dfurman/CalmeRys-78B-Orpo-v0.1 | dfurman/CalmeRys-78B-Orpo-v0.1 | dfurman | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.8163273447785211}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 77.965} |
HF Open LLM v2 | tugstugi | tugstugi/Qwen2.5-7B-Instruct-QwQ-v0.1 | 1cfb7d70-b903-48ae-bdb2-31c838bdabc8 | 0.0.1 | hfopenllm_v2/tugstugi_Qwen2.5-7B-Instruct-QwQ-v0.1/1762652580.577852 | 1762652580.577852 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | tugstugi/Qwen2.5-7B-Instruct-QwQ-v0.1 | tugstugi/Qwen2.5-7B-Instruct-QwQ-v0.1 | tugstugi | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6017300761978217}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | wanlige | wanlige/li-14b-v0.4 | 8965f266-28f1-43f2-b03c-acc4a9478b7c | 0.0.1 | hfopenllm_v2/wanlige_li-14b-v0.4/1762652580.591545 | 1762652580.591546 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | wanlige/li-14b-v0.4 | wanlige/li-14b-v0.4 | wanlige | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.813279875175645}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | wanlige | wanlige/li-14b-v0.4-slerp | d2451e41-e4b0-4945-9ace-1b046b11528b | 0.0.1 | hfopenllm_v2/wanlige_li-14b-v0.4-slerp/1762652580.591778 | 1762652580.591778 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | wanlige/li-14b-v0.4-slerp | wanlige/li-14b-v0.4-slerp | wanlige | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4605967721201967}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | wanlige | wanlige/li-14b-v0.4-slerp0.1 | 54a93ff0-bff3-4252-ba4a-e99f06b46896 | 0.0.1 | hfopenllm_v2/wanlige_li-14b-v0.4-slerp0.1/1762652580.5919738 | 1762652580.591975 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | wanlige/li-14b-v0.4-slerp0.1 | wanlige/li-14b-v0.4-slerp0.1 | wanlige | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7922722819895655}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | princeton-nlp | princeton-nlp/Llama-3-Instruct-8B-RRHF-v0.2 | bc221748-c03b-4fee-9147-8f63b0017f0c | 0.0.1 | hfopenllm_v2/princeton-nlp_Llama-3-Instruct-8B-RRHF-v0.2/1762652580.4489532 | 1762652580.448954 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | princeton-nlp/Llama-3-Instruct-8B-RRHF-v0.2 | princeton-nlp/Llama-3-Instruct-8B-RRHF-v0.2 | princeton-nlp | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.712488419615509}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | princeton-nlp | princeton-nlp/Llama-3-8B-ProLong-64k-Instruct | 9c801b4e-228b-42a8-a7f7-ea2bf125d716 | 0.0.1 | hfopenllm_v2/princeton-nlp_Llama-3-8B-ProLong-64k-Instruct/1762652580.443907 | 1762652580.4439082 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | princeton-nlp/Llama-3-8B-ProLong-64k-Instruct | princeton-nlp/Llama-3-8B-ProLong-64k-Instruct | princeton-nlp | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5563172382611471}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | princeton-nlp | princeton-nlp/Llama-3-8B-ProLong-512k-Instruct | 72eccc9b-df63-4b2f-8975-a1c89940802c | 0.0.1 | hfopenllm_v2/princeton-nlp_Llama-3-8B-ProLong-512k-Instruct/1762652580.4434712 | 1762652580.443472 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | princeton-nlp/Llama-3-8B-ProLong-512k-Instruct | princeton-nlp/Llama-3-8B-ProLong-512k-Instruct | princeton-nlp | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3977734632996006}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | princeton-nlp | princeton-nlp/Llama-3-8B-ProLong-512k-Instruct | e30fead2-6516-480f-abd8-6ad0713cb053 | 0.0.1 | hfopenllm_v2/princeton-nlp_Llama-3-8B-ProLong-512k-Instruct/1762652580.4431858 | 1762652580.443187 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | princeton-nlp/Llama-3-8B-ProLong-512k-Instruct | princeton-nlp/Llama-3-8B-ProLong-512k-Instruct | princeton-nlp | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5508218194390884}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | princeton-nlp | princeton-nlp/Llama-3-Instruct-8B-CPO-v0.2 | 2de21869-2851-43f8-b5c3-a4b9e0e6e3ac | 0.0.1 | hfopenllm_v2/princeton-nlp_Llama-3-Instruct-8B-CPO-v0.2/1762652580.44678 | 1762652580.446781 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | princeton-nlp/Llama-3-Instruct-8B-CPO-v0.2 | princeton-nlp/Llama-3-Instruct-8B-CPO-v0.2 | princeton-nlp | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7505817896514582}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | princeton-nlp | princeton-nlp/Mistral-7B-Base-SFT-RRHF | fbbd671a-3005-448a-bc15-718ba23bcf72 | 0.0.1 | hfopenllm_v2/princeton-nlp_Mistral-7B-Base-SFT-RRHF/1762652580.451245 | 1762652580.451246 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | princeton-nlp/Mistral-7B-Base-SFT-RRHF | princeton-nlp/Mistral-7B-Base-SFT-RRHF | princeton-nlp | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.44066299640509404}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "MistralForCausalLM", "params_billions": 7.242} |
HF Open LLM v2 | princeton-nlp | princeton-nlp/Llama-3-Instruct-8B-SimPO | fcd2c5e3-ebfd-4c1c-ac8a-d28ec08f1bf2 | 0.0.1 | hfopenllm_v2/princeton-nlp_Llama-3-Instruct-8B-SimPO/1762652580.449708 | 1762652580.449709 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | princeton-nlp/Llama-3-Instruct-8B-SimPO | princeton-nlp/Llama-3-Instruct-8B-SimPO | princeton-nlp | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6503898544750152}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | princeton-nlp | princeton-nlp/Llama-3-Instruct-8B-SLiC-HF | aaa9cd01-cca9-489c-91e0-79ff026eb258 | 0.0.1 | hfopenllm_v2/princeton-nlp_Llama-3-Instruct-8B-SLiC-HF/1762652580.449163 | 1762652580.449164 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | princeton-nlp/Llama-3-Instruct-8B-SLiC-HF | princeton-nlp/Llama-3-Instruct-8B-SLiC-HF | princeton-nlp | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7399655137258031}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |