_leaderboard | _developer | _model | _uuid | schema_version | evaluation_id | retrieved_timestamp | source_data | evaluation_source_name | evaluation_source_type | source_organization_name | source_organization_url | source_organization_logo_url | evaluator_relationship | model_name | model_id | model_developer | model_inference_platform | evaluation_results | additional_details |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
HF Open LLM v2 | Lawnakk | Lawnakk/BBALAW1 | 59b40f56-c27f-4b15-9288-b7033e2e4f26 | 0.0.1 | hfopenllm_v2/Lawnakk_BBALAW1/1762652579.708089 | 1762652579.70809 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Lawnakk/BBALAW1 | Lawnakk/BBALAW1 | Lawnakk | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.19054442213327305}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | Lawnakk | Lawnakk/BBA100 | 745591e3-3c6a-473a-9e51-4bffe1c86fa7 | 0.0.1 | hfopenllm_v2/Lawnakk_BBA100/1762652579.707814 | 1762652579.707815 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Lawnakk/BBA100 | Lawnakk/BBA100 | Lawnakk | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2075803312987318}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | Lawnakk | Lawnakk/BBALAW1.0 | 61739e6e-92b0-4577-acd2-8c58ffc612a4 | 0.0.1 | hfopenllm_v2/Lawnakk_BBALAW1.0/1762652579.708328 | 1762652579.708329 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Lawnakk/BBALAW1.0 | Lawnakk/BBALAW1.0 | Lawnakk | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.13511482865463637}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 4.353} |
HF Open LLM v2 | Lawnakk | Lawnakk/BBALAW1.62 | 5dc300f1-e908-4d71-addc-2717e3702b12 | 0.0.1 | hfopenllm_v2/Lawnakk_BBALAW1.62/1762652579.709492 | 1762652579.709493 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Lawnakk/BBALAW1.62 | Lawnakk/BBALAW1.62 | Lawnakk | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5046099903810778}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | Lawnakk | Lawnakk/BBALAW1.64 | 4a4ce0f8-c41f-469e-b7c7-a4e3d857377e | 0.0.1 | hfopenllm_v2/Lawnakk_BBALAW1.64/1762652579.709901 | 1762652579.709902 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Lawnakk/BBALAW1.64 | Lawnakk/BBALAW1.64 | Lawnakk | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.13946107439371977}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | Lawnakk | Lawnakk/BBALAW1.61 | af87bb98-cc36-4c8d-9694-7e7428a899ac | 0.0.1 | hfopenllm_v2/Lawnakk_BBALAW1.61/1762652579.709277 | 1762652579.7092779 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Lawnakk/BBALAW1.61 | Lawnakk/BBALAW1.61 | Lawnakk | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5771253607095839}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | Lawnakk | Lawnakk/BBALAW1.2 | 917081cc-ee33-4c1f-85b0-9256ef57f6b3 | 0.0.1 | hfopenllm_v2/Lawnakk_BBALAW1.2/1762652579.708597 | 1762652579.708598 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Lawnakk/BBALAW1.2 | Lawnakk/BBALAW1.2 | Lawnakk | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.13543952268868825}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 4.353} |
HF Open LLM v2 | Lawnakk | Lawnakk/BBALAW1.6 | 684962b9-d734-4a10-a0cb-45bc4d957c2c | 0.0.1 | hfopenllm_v2/Lawnakk_BBALAW1.6/1762652579.7090619 | 1762652579.7090628 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Lawnakk/BBALAW1.6 | Lawnakk/BBALAW1.6 | Lawnakk | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5245437660961804}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | pankajmathur | pankajmathur/orca_mini_v6_8b_dpo | 3e875ab6-6065-4400-8038-0fe6437f44d5 | 0.0.1 | hfopenllm_v2/pankajmathur_orca_mini_v6_8b_dpo/1762652580.43716 | 1762652580.437161 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | pankajmathur/orca_mini_v6_8b_dpo | pankajmathur/orca_mini_v6_8b_dpo | pankajmathur | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3882564927725103}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.0} |
HF Open LLM v2 | pankajmathur | pankajmathur/orca_mini_v7_7b | f801b633-5767-4b74-a0db-e474c9349820 | 0.0.1 | hfopenllm_v2/pankajmathur_orca_mini_v7_7b/1762652580.437545 | 1762652580.437546 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | pankajmathur/orca_mini_v7_7b | pankajmathur/orca_mini_v7_7b | pankajmathur | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4387646998851935}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | pankajmathur | pankajmathur/orca_mini_v5_8b_dpo | 1dad9bda-fbc8-499b-aab0-29be59b6921d | 0.0.1 | hfopenllm_v2/pankajmathur_orca_mini_v5_8b_dpo/1762652580.436573 | 1762652580.436574 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | pankajmathur/orca_mini_v5_8b_dpo | pankajmathur/orca_mini_v5_8b_dpo | pankajmathur | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.48964746871633935}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.0} |
HF Open LLM v2 | pankajmathur | pankajmathur/orca_mini_3b | bebbfd98-fdba-413d-9e7d-06af8bd4d5a7 | 0.0.1 | hfopenllm_v2/pankajmathur_orca_mini_3b/1762652580.434913 | 1762652580.434913 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | pankajmathur/orca_mini_3b | pankajmathur/orca_mini_3b | pankajmathur | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.07421419611076388}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 3.426} |
HF Open LLM v2 | pankajmathur | pankajmathur/orca_mini_v5_8b | 12a231e0-deed-4d2b-9904-79a8b543d200 | 0.0.1 | hfopenllm_v2/pankajmathur_orca_mini_v5_8b/1762652580.436376 | 1762652580.436377 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | pankajmathur/orca_mini_v5_8b | pankajmathur/orca_mini_v5_8b | pankajmathur | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.48060479527653294}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | pankajmathur | pankajmathur/orca_mini_v7_72b | 702f1485-2941-4e27-9c96-11cee2449df8 | 0.0.1 | hfopenllm_v2/pankajmathur_orca_mini_v7_72b/1762652580.437353 | 1762652580.437354 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | pankajmathur/orca_mini_v7_72b | pankajmathur/orca_mini_v7_72b | pankajmathur | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5929622291076566}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 72.706} |
HF Open LLM v2 | pankajmathur | pankajmathur/orca_mini_v9_5_1B-Instruct_preview | 7836190d-33df-45c2-b020-8ccec01de1f3 | 0.0.1 | hfopenllm_v2/pankajmathur_orca_mini_v9_5_1B-Instruct_preview/1762652580.439178 | 1762652580.439179 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | pankajmathur/orca_mini_v9_5_1B-Instruct_preview | pankajmathur/orca_mini_v9_5_1B-Instruct_preview | pankajmathur | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3935768206137493}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 1.236} |
HF Open LLM v2 | pankajmathur | pankajmathur/orca_mini_v9_6_1B-Instruct | 332f06db-35f1-4759-b3f8-973b1fe6fb9e | 0.0.1 | hfopenllm_v2/pankajmathur_orca_mini_v9_6_1B-Instruct/1762652580.439626 | 1762652580.439627 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | pankajmathur/orca_mini_v9_6_1B-Instruct | pankajmathur/orca_mini_v9_6_1B-Instruct | pankajmathur | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6085741388404988}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 1.236} |
HF Open LLM v2 | pankajmathur | pankajmathur/orca_mini_v3_70b | beae9826-35b2-4758-a20a-10c8402daa42 | 0.0.1 | hfopenllm_v2/pankajmathur_orca_mini_v3_70b/1762652580.43598 | 1762652580.435981 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | pankajmathur/orca_mini_v3_70b | pankajmathur/orca_mini_v3_70b | pankajmathur | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4014703209705803}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 70.0} |
HF Open LLM v2 | pankajmathur | pankajmathur/orca_mini_v6_8b | e45a0914-baee-4fd4-a231-3495b18db9a9 | 0.0.1 | hfopenllm_v2/pankajmathur_orca_mini_v6_8b/1762652580.436963 | 1762652580.436963 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | pankajmathur/orca_mini_v6_8b | pankajmathur/orca_mini_v6_8b | pankajmathur | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.011116060940526692}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on B... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | pankajmathur | pankajmathur/orca_mini_v9_2_70b | 69093327-3726-469d-9750-b9fa39423310 | 0.0.1 | hfopenllm_v2/pankajmathur_orca_mini_v9_2_70b/1762652580.438577 | 1762652580.438578 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | pankajmathur/orca_mini_v9_2_70b | pankajmathur/orca_mini_v9_2_70b | pankajmathur | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.8382591523823455}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 70.554} |
HF Open LLM v2 | pankajmathur | pankajmathur/orca_mini_v9_2_14B | e10e45b8-0d37-4905-9ebf-acc7922b7ea3 | 0.0.1 | hfopenllm_v2/pankajmathur_orca_mini_v9_2_14B/1762652580.438377 | 1762652580.438378 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | pankajmathur/orca_mini_v9_2_14B | pankajmathur/orca_mini_v9_2_14B | pankajmathur | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7780588837617521}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | pankajmathur | pankajmathur/orca_mini_v9_7_3B-Instruct | 42a8b694-ef8f-47d2-8da3-e4db453641b3 | 0.0.1 | hfopenllm_v2/pankajmathur_orca_mini_v9_7_3B-Instruct/1762652580.44028 | 1762652580.4402812 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | pankajmathur/orca_mini_v9_7_3B-Instruct | pankajmathur/orca_mini_v9_7_3B-Instruct | pankajmathur | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5618381450107935}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 3.213} |
HF Open LLM v2 | pankajmathur | pankajmathur/orca_mini_v9_5_3B-Instruct | 2ff28335-81a0-4d61-b221-a7edb877da4a | 0.0.1 | hfopenllm_v2/pankajmathur_orca_mini_v9_5_3B-Instruct/1762652580.439394 | 1762652580.4393952 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | pankajmathur/orca_mini_v9_5_3B-Instruct | pankajmathur/orca_mini_v9_5_3B-Instruct | pankajmathur | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7207066140063919}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 3.213} |
HF Open LLM v2 | pankajmathur | pankajmathur/orca_mini_v2_7b | 036c4f96-2d08-40a1-968d-293e0b3a1ed0 | 0.0.1 | hfopenllm_v2/pankajmathur_orca_mini_v2_7b/1762652580.435575 | 1762652580.435576 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | pankajmathur/orca_mini_v2_7b | pankajmathur/orca_mini_v2_7b | pankajmathur | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.13578859647956312}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 7.0} |
HF Open LLM v2 | pankajmathur | pankajmathur/orca_mini_v9_6_3B-Instruct | 1cc45753-aeed-4804-a6da-413437dbb940 | 0.0.1 | hfopenllm_v2/pankajmathur_orca_mini_v9_6_3B-Instruct/1762652580.439853 | 1762652580.439853 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | pankajmathur/orca_mini_v9_6_3B-Instruct | pankajmathur/orca_mini_v9_6_3B-Instruct | pankajmathur | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7316475839660989}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 3.213} |
HF Open LLM v2 | pankajmathur | pankajmathur/Al_Dente_v1_8b | 9924f2bd-abe5-431c-aa06-be24952ca363 | 0.0.1 | hfopenllm_v2/pankajmathur_Al_Dente_v1_8b/1762652580.434438 | 1762652580.434439 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | pankajmathur/Al_Dente_v1_8b | pankajmathur/Al_Dente_v1_8b | pankajmathur | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3693721547715617}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | pankajmathur | pankajmathur/orca_mini_v9_7_1B-Instruct | fad200e0-05bb-42d7-b7f3-caba938ca09d | 0.0.1 | hfopenllm_v2/pankajmathur_orca_mini_v9_7_1B-Instruct/1762652580.4400692 | 1762652580.44007 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | pankajmathur/orca_mini_v9_7_1B-Instruct | pankajmathur/orca_mini_v9_7_1B-Instruct | pankajmathur | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5610136659618701}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 1.236} |
HF Open LLM v2 | pankajmathur | pankajmathur/orca_mini_v3_13b | d3ba7ff3-e0d7-48e3-b63d-9648a193679f | 0.0.1 | hfopenllm_v2/pankajmathur_orca_mini_v3_13b/1762652580.435779 | 1762652580.43578 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | pankajmathur/orca_mini_v3_13b | pankajmathur/orca_mini_v3_13b | pankajmathur | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.28966253983873896}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 13.0} |
HF Open LLM v2 | pankajmathur | pankajmathur/orca_mini_v9_0_3B-Instruct | bc38a266-c3bd-4ecf-8149-6b26bb32803b | 0.0.1 | hfopenllm_v2/pankajmathur_orca_mini_v9_0_3B-Instruct/1762652580.437941 | 1762652580.437942 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | pankajmathur/orca_mini_v9_0_3B-Instruct | pankajmathur/orca_mini_v9_0_3B-Instruct | pankajmathur | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5753766672429155}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 3.213} |
HF Open LLM v2 | pankajmathur | pankajmathur/orca_mini_v3_7b | 69cb8c68-5847-48f0-b2bd-0756ec761837 | 0.0.1 | hfopenllm_v2/pankajmathur_orca_mini_v3_7b/1762652580.436181 | 1762652580.436182 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | pankajmathur/orca_mini_v3_7b | pankajmathur/orca_mini_v3_7b | pankajmathur | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2820937335159599}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 7.0} |
HF Open LLM v2 | pankajmathur | pankajmathur/orca_mini_v8_1_70b | 02201ae1-ec65-496c-bfdb-0dec8aa5308d | 0.0.1 | hfopenllm_v2/pankajmathur_orca_mini_v8_1_70b/1762652580.4377441 | 1762652580.4377449 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | pankajmathur/orca_mini_v8_1_70b | pankajmathur/orca_mini_v8_1_70b | pankajmathur | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.8571434903832941}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 70.554} |
HF Open LLM v2 | pankajmathur | pankajmathur/orca_mini_7b | 773c97e1-0e43-46ae-a134-8a08ca9b5094 | 0.0.1 | hfopenllm_v2/pankajmathur_orca_mini_7b/1762652580.435124 | 1762652580.4351249 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | pankajmathur/orca_mini_7b | pankajmathur/orca_mini_7b | pankajmathur | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.04121619525082337}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 7.0} |
HF Open LLM v2 | pankajmathur | pankajmathur/orca_mini_v5_8b_orpo | cf3f79fc-1fe2-4b55-a808-5664cc1f1809 | 0.0.1 | hfopenllm_v2/pankajmathur_orca_mini_v5_8b_orpo/1762652580.436766 | 1762652580.4367669 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | pankajmathur/orca_mini_v5_8b_orpo | pankajmathur/orca_mini_v5_8b_orpo | pankajmathur | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.08243239050164675}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.0} |
HF Open LLM v2 | pankajmathur | pankajmathur/orca_mini_v9_1_1B-Instruct | 65d0aca2-06ae-4a09-9fb2-2bb54939a554 | 0.0.1 | hfopenllm_v2/pankajmathur_orca_mini_v9_1_1B-Instruct/1762652580.438177 | 1762652580.438178 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | pankajmathur/orca_mini_v9_1_1B-Instruct | pankajmathur/orca_mini_v9_1_1B-Instruct | pankajmathur | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3629270336041702}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 1.236} |
HF Open LLM v2 | pankajmathur | pankajmathur/model_007_13b_v2 | a108864f-40d6-492b-8440-1cbb5d87a5fe | 0.0.1 | hfopenllm_v2/pankajmathur_model_007_13b_v2/1762652580.434693 | 1762652580.4346938 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | pankajmathur/model_007_13b_v2 | pankajmathur/model_007_13b_v2 | pankajmathur | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.30564901129004374}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 13.0} |
HF Open LLM v2 | pankajmathur | pankajmathur/orca_mini_v9_4_70B | e3746ac6-3ee4-4d95-b800-509bed07aec3 | 0.0.1 | hfopenllm_v2/pankajmathur_orca_mini_v9_4_70B/1762652580.438774 | 1762652580.438774 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | pankajmathur/orca_mini_v9_4_70B | pankajmathur/orca_mini_v9_4_70B | pankajmathur | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.8014645584826039}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 70.554} |
HF Open LLM v2 | pankajmathur | pankajmathur/orca_mini_v9_5_1B-Instruct | 2f2f821b-037b-4f3f-87f6-16703c0dc61a | 0.0.1 | hfopenllm_v2/pankajmathur_orca_mini_v9_5_1B-Instruct/1762652580.438983 | 1762652580.438984 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | pankajmathur/orca_mini_v9_5_1B-Instruct | pankajmathur/orca_mini_v9_5_1B-Instruct | pankajmathur | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.46379384477630464}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 1.236} |
HF Open LLM v2 | hatemmahmoud | hatemmahmoud/qwen2.5-1.5b-sft-raft-grpo-hra-doc | 7d3c185f-4b4f-4bdd-bac9-f4ba2410f40c | 0.0.1 | hfopenllm_v2/hatemmahmoud_qwen2.5-1.5b-sft-raft-grpo-hra-doc/1762652580.190489 | 1762652580.190489 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | hatemmahmoud/qwen2.5-1.5b-sft-raft-grpo-hra-doc | hatemmahmoud/qwen2.5-1.5b-sft-raft-grpo-hra-doc | hatemmahmoud | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.41958004760701606}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.544} |
HF Open LLM v2 | Pinkstack | Pinkstack/Superthoughts-lite-v1 | ff308837-dc35-4257-a4cd-de463feb733e | 0.0.1 | hfopenllm_v2/Pinkstack_Superthoughts-lite-v1/1762652579.812961 | 1762652579.812962 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Pinkstack/Superthoughts-lite-v1 | Pinkstack/Superthoughts-lite-v1 | Pinkstack | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.1658643510330368}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 1.711} |
HF Open LLM v2 | Pinkstack | Pinkstack/Superthoughts-lite-1.8B-experimental-o1 | fba2ce2f-6c30-4af9-ae3a-d23f39f3f963 | 0.0.1 | hfopenllm_v2/Pinkstack_Superthoughts-lite-1.8B-experimental-o1/1762652579.81273 | 1762652579.81273 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Pinkstack/Superthoughts-lite-1.8B-experimental-o1 | Pinkstack/Superthoughts-lite-1.8B-experimental-o1 | Pinkstack | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.0375193375798437}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 1.812} |
HF Open LLM v2 | Pinkstack | Pinkstack/SuperThoughts-CoT-14B-16k-o1-QwQ | c604f0fb-517d-45db-9e1c-6c911bce43e7 | 0.0.1 | hfopenllm_v2/Pinkstack_SuperThoughts-CoT-14B-16k-o1-QwQ/1762652579.812447 | 1762652579.812449 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Pinkstack/SuperThoughts-CoT-14B-16k-o1-QwQ | Pinkstack/SuperThoughts-CoT-14B-16k-o1-QwQ | Pinkstack | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.051457909458015844}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on B... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | assskelad | assskelad/smollm2-360M-sft_SmallThoughts | ce2f5cc8-a187-454d-ba99-4446d29aab7c | 0.0.1 | hfopenllm_v2/assskelad_smollm2-360M-sft_SmallThoughts/1762652580.019667 | 1762652580.0196679 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | assskelad/smollm2-360M-sft_SmallThoughts | assskelad/smollm2-360M-sft_SmallThoughts | assskelad | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.20071078072846715}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 0.362} |
HF Open LLM v2 | qingy2019 | qingy2019/Qwen2.5-Math-14B-Instruct-Alpha | 7bc9676d-6186-4b2d-8b4b-4a3786f3ed40 | 0.0.1 | hfopenllm_v2/qingy2019_Qwen2.5-Math-14B-Instruct-Alpha/1762652580.4831731 | 1762652580.4831731 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | qingy2019/Qwen2.5-Math-14B-Instruct-Alpha | qingy2019/Qwen2.5-Math-14B-Instruct-Alpha | qingy2019 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5980830862112528}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.0} |
HF Open LLM v2 | qingy2019 | qingy2019/Qwen2.5-Ultimate-14B-Instruct | 655920b7-5687-4555-8890-ab1d08f3f00d | 0.0.1 | hfopenllm_v2/qingy2019_Qwen2.5-Ultimate-14B-Instruct/1762652580.483648 | 1762652580.483649 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | qingy2019/Qwen2.5-Ultimate-14B-Instruct | qingy2019/Qwen2.5-Ultimate-14B-Instruct | qingy2019 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.39380177927897975}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | qingy2019 | qingy2019/Qwen2.5-Math-14B-Instruct | 5a2e7119-5fe6-4d3c-8706-01e22ef5b121 | 0.0.1 | hfopenllm_v2/qingy2019_Qwen2.5-Math-14B-Instruct/1762652580.48299 | 1762652580.4829912 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | qingy2019/Qwen2.5-Math-14B-Instruct | qingy2019/Qwen2.5-Math-14B-Instruct | qingy2019 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6005310354304356}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.0} |
HF Open LLM v2 | qingy2019 | qingy2019/Qwen2.5-Math-14B-Instruct | 46d47e9a-6378-4eb5-a43d-f8e6a7c51674 | 0.0.1 | hfopenllm_v2/qingy2019_Qwen2.5-Math-14B-Instruct/1762652580.482764 | 1762652580.482764 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | qingy2019/Qwen2.5-Math-14B-Instruct | qingy2019/Qwen2.5-Math-14B-Instruct | qingy2019 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6066259746361875}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.0} |
HF Open LLM v2 | qingy2019 | qingy2019/Oracle-14B | fc5c5eff-8314-4cb2-8ba4-b562096cfe1f | 0.0.1 | hfopenllm_v2/qingy2019_Oracle-14B/1762652580.482562 | 1762652580.482562 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | qingy2019/Oracle-14B | qingy2019/Oracle-14B | qingy2019 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.24007854714380067}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "MixtralForCausalLM", "params_billions": 13.668} |
HF Open LLM v2 | qingy2019 | qingy2019/Oracle-14B | 90a36ffd-8eeb-44e8-9b7b-dbd56238d0a6 | 0.0.1 | hfopenllm_v2/qingy2019_Oracle-14B/1762652580.4822989 | 1762652580.4822989 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | qingy2019/Oracle-14B | qingy2019/Oracle-14B | qingy2019 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.23583203677353867}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "MixtralForCausalLM", "params_billions": 13.668} |
HF Open LLM v2 | qingy2019 | qingy2019/Qwen2.5-Math-14B-Instruct-Pro | c1a0b34a-d3b5-42b9-b779-b31b9678faed | 0.0.1 | hfopenllm_v2/qingy2019_Qwen2.5-Math-14B-Instruct-Pro/1762652580.483387 | 1762652580.483388 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | qingy2019/Qwen2.5-Math-14B-Instruct-Pro | qingy2019/Qwen2.5-Math-14B-Instruct-Pro | qingy2019 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.1921678923035324}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | nxmwxm | nxmwxm/Beast-Soul-new | 4ae25fa0-54af-4f47-853f-c97cd7b312d3 | 0.0.1 | hfopenllm_v2/nxmwxm_Beast-Soul-new/1762652580.416598 | 1762652580.416599 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | nxmwxm/Beast-Soul-new | nxmwxm/Beast-Soul-new | nxmwxm | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.48687482546310457}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "MistralForCausalLM", "params_billions": 7.242} |
HF Open LLM v2 | PuxAI | PuxAI/LUA_model | 05dc0500-be97-456f-9d12-12192626ea39 | 0.0.1 | hfopenllm_v2/PuxAI_LUA_model/1762652579.818059 | 1762652579.818059 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | PuxAI/LUA_model | PuxAI/LUA_model | PuxAI | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.22821336276634885}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "MistralForCausalLM", "params_billions": 7.386} |
HF Open LLM v2 | darkc0de | darkc0de/BuddyGlassUncensored2025.2 | ea8dfb5f-750d-4573-a2bb-dadafc3a73b7 | 0.0.1 | hfopenllm_v2/darkc0de_BuddyGlassUncensored2025.2/1762652580.118735 | 1762652580.1187358 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | darkc0de/BuddyGlassUncensored2025.2 | darkc0de/BuddyGlassUncensored2025.2 | darkc0de | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7731131176389756}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 10.306} |
HF Open LLM v2 | darkc0de | darkc0de/BuddyGlassNeverSleeps | 675f6dfe-c623-4694-94cb-8705aab5521f | 0.0.1 | hfopenllm_v2/darkc0de_BuddyGlassNeverSleeps/1762652580.1184928 | 1762652580.118494 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | darkc0de/BuddyGlassNeverSleeps | darkc0de/BuddyGlassNeverSleeps | darkc0de | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4239019135892764}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | darkc0de | darkc0de/BuddyGlass_v0.3_Xortron7MethedUpSwitchedUp | adf85459-eba0-48a8-ad54-1e17d1ea5b31 | 0.0.1 | hfopenllm_v2/darkc0de_BuddyGlass_v0.3_Xortron7MethedUpSwitchedUp/1762652580.1189609 | 1762652580.1189609 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | darkc0de/BuddyGlass_v0.3_Xortron7MethedUpSwitchedUp | darkc0de/BuddyGlass_v0.3_Xortron7MethedUpSwitchedUp | darkc0de | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.43584245357872664}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 0.007} |
HF Open LLM v2 | J-LAB | J-LAB/Thynk_orpo | 3565fba3-e63d-49f8-9e8f-deef83531eb9 | 0.0.1 | hfopenllm_v2/J-LAB_Thynk_orpo/1762652579.649622 | 1762652579.6496232 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | J-LAB/Thynk_orpo | J-LAB/Thynk_orpo | J-LAB | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.21017788357114678}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.086} |
HF Open LLM v2 | BSC-LT | BSC-LT/salamandra-7b-instruct | 2eb60f3a-53f4-478a-8292-aa5e210a8cdf | 0.0.1 | hfopenllm_v2/BSC-LT_salamandra-7b-instruct/1762652579.493781 | 1762652579.493781 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | BSC-LT/salamandra-7b-instruct | BSC-LT/salamandra-7b-instruct | BSC-LT | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.24507418095098782}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 7.768} |
HF Open LLM v2 | BSC-LT | BSC-LT/salamandra-7b | 36d2d3af-60aa-4624-b414-e249d06b6ee1 | 0.0.1 | hfopenllm_v2/BSC-LT_salamandra-7b/1762652579.493503 | 1762652579.493503 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | BSC-LT/salamandra-7b | BSC-LT/salamandra-7b | BSC-LT | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.13673829882489574}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 7.768} |
HF Open LLM v2 | Davidsv | Davidsv/SUONG-1 | 097e6cbe-88cd-4d61-bb4c-0b8ddb537abe | 0.0.1 | hfopenllm_v2/Davidsv_SUONG-1/1762652579.5439382 | 1762652579.54394 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Davidsv/SUONG-1 | Davidsv/SUONG-1 | Davidsv | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2497207409673001}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "MistralForCausalLM", "params_billions": 2.879} |
HF Open LLM v2 | Shreyash2010 | Shreyash2010/Uma-4x4B-Instruct-v0.1 | 83fa529b-8c61-4017-92a8-ec0f46eb7bba | 0.0.1 | hfopenllm_v2/Shreyash2010_Uma-4x4B-Instruct-v0.1/1762652579.880244 | 1762652579.880245 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Shreyash2010/Uma-4x4B-Instruct-v0.1 | Shreyash2010/Uma-4x4B-Instruct-v0.1 | Shreyash2010 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5516961661724225}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "?", "params_billions": 3.821} |
HF Open LLM v2 | databricks | databricks/dbrx-instruct | 639e4921-9fa8-446d-b539-f03a7589b142 | 0.0.1 | hfopenllm_v2/databricks_dbrx-instruct/1762652580.119466 | 1762652580.119467 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | databricks/dbrx-instruct | databricks/dbrx-instruct | databricks | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5415796752616391}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "DbrxForCausalLM", "params_billions": 131.597} |
HF Open LLM v2 | databricks | databricks/dolly-v2-12b | c83e2bf0-5d4e-45c4-aff2-27aea2bc0fb6 | 0.0.1 | hfopenllm_v2/databricks_dolly-v2-12b/1762652580.1198819 | 1762652580.119883 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | databricks/dolly-v2-12b | databricks/dolly-v2-12b | databricks | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.23550734273948679}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "GPTNeoXForCausalLM", "params_billions": 12.0} |
HF Open LLM v2 | databricks | databricks/dolly-v2-3b | a8838707-f188-440e-801f-e780e0dd362a | 0.0.1 | hfopenllm_v2/databricks_dolly-v2-3b/1762652580.1200871 | 1762652580.1200871 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | databricks/dolly-v2-3b | databricks/dolly-v2-3b | databricks | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.22471597583301195}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "GPTNeoXForCausalLM", "params_billions": 3.0} |
HF Open LLM v2 | databricks | databricks/dbrx-base | 17febb53-0735-4983-8049-85319818ab84 | 0.0.1 | hfopenllm_v2/databricks_dbrx-base/1762652580.1191711 | 1762652580.1191711 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | databricks/dbrx-base | databricks/dbrx-base | databricks | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.08214723926380368}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Unknown", "params_billions": 0.0} |
HF Open LLM v2 | databricks | databricks/dolly-v2-7b | 68f999d7-2dc2-4b3c-ab02-6140387893c0 | 0.0.1 | hfopenllm_v2/databricks_dolly-v2-7b/1762652580.120286 | 1762652580.120287 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | databricks/dolly-v2-7b | databricks/dolly-v2-7b | databricks | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2009856070781083}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "GPTNeoXForCausalLM", "params_billions": 7.0} |
HF Open LLM v2 | databricks | databricks/dolly-v1-6b | 62299ec1-dd42-4751-a224-3bdda71d3cdf | 0.0.1 | hfopenllm_v2/databricks_dolly-v1-6b/1762652580.1196742 | 1762652580.119675 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | databricks/dolly-v1-6b | databricks/dolly-v1-6b | databricks | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.22244311759464885}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "GPTJForCausalLM", "params_billions": 6.0} |
HF Open LLM v2 | ghost-x | ghost-x/ghost-8b-beta-1608 | b5fba89f-ec8f-4e71-ad19-32c7d85698fb | 0.0.1 | hfopenllm_v2/ghost-x_ghost-8b-beta-1608/1762652580.16434 | 1762652580.164341 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ghost-x/ghost-8b-beta-1608 | ghost-x/ghost-8b-beta-1608 | ghost-x | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.42727407722620425}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | FuseAI | FuseAI/FuseChat-Llama-3.1-8B-Instruct | fdc9ea4d-acf8-4f2c-b727-482f464eb925 | 0.0.1 | hfopenllm_v2/FuseAI_FuseChat-Llama-3.1-8B-Instruct/1762652579.626143 | 1762652579.626144 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | FuseAI/FuseChat-Llama-3.1-8B-Instruct | FuseAI/FuseChat-Llama-3.1-8B-Instruct | FuseAI | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7204816553411615}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | FuseAI | FuseAI/FuseChat-Llama-3.2-3B-Instruct | e39160a3-8332-467d-900f-52bb7d1446c1 | 0.0.1 | hfopenllm_v2/FuseAI_FuseChat-Llama-3.2-3B-Instruct/1762652579.626356 | 1762652579.626357 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | FuseAI/FuseChat-Llama-3.2-3B-Instruct | FuseAI/FuseChat-Llama-3.2-3B-Instruct | FuseAI | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.684886102208806}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 3.213} |
HF Open LLM v2 | FuseAI | FuseAI/FuseChat-7B-v2.0 | 26ca0085-db25-4664-823a-f56e08081dc4 | 0.0.1 | hfopenllm_v2/FuseAI_FuseChat-7B-v2.0/1762652579.625878 | 1762652579.625879 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | FuseAI/FuseChat-7B-v2.0 | FuseAI/FuseChat-7B-v2.0 | FuseAI | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3423194900641409}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "MistralForCausalLM", "params_billions": 7.242} |
HF Open LLM v2 | FuseAI | FuseAI/FuseChat-Qwen-2.5-7B-Instruct | 1bae6b5e-47b0-4fe2-847a-8aec0a36342e | 0.0.1 | hfopenllm_v2/FuseAI_FuseChat-Qwen-2.5-7B-Instruct/1762652579.626579 | 1762652579.626579 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | FuseAI/FuseChat-Qwen-2.5-7B-Instruct | FuseAI/FuseChat-Qwen-2.5-7B-Instruct | FuseAI | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5905641475728844}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | OnlyCheeini | OnlyCheeini/greesychat-turbo | f3a7f01c-2893-4887-a210-d126d9135edf | 0.0.1 | hfopenllm_v2/OnlyCheeini_greesychat-turbo/1762652579.7991328 | 1762652579.799134 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | OnlyCheeini/greesychat-turbo | OnlyCheeini/greesychat-turbo | OnlyCheeini | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.023256071667619692}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on B... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | HuggingFaceH4 | HuggingFaceH4/zephyr-7b-beta | 3b9d5166-4144-4222-a39d-3d1d3956a6e8 | 0.0.1 | hfopenllm_v2/HuggingFaceH4_zephyr-7b-beta/1762652579.641025 | 1762652579.641026 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | HuggingFaceH4/zephyr-7b-beta | HuggingFaceH4/zephyr-7b-beta | HuggingFaceH4 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.49504315216957673}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "MistralForCausalLM", "params_billions": 7.242} |
HF Open LLM v2 | HuggingFaceH4 | HuggingFaceH4/zephyr-orpo-141b-A35b-v0.1 | 8b347bb4-9f6d-4c82-bd5d-2fb5f7c8f881 | 0.0.1 | hfopenllm_v2/HuggingFaceH4_zephyr-orpo-141b-A35b-v0.1/1762652579.641484 | 1762652579.641485 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | HuggingFaceH4/zephyr-orpo-141b-A35b-v0.1 | HuggingFaceH4/zephyr-orpo-141b-A35b-v0.1 | HuggingFaceH4 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6510891102275296}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "MixtralForCausalLM", "params_billions": 140.621} |
HF Open LLM v2 | HuggingFaceH4 | HuggingFaceH4/zephyr-7b-alpha | 2029aa96-40b2-4af8-a7fa-8ae968b20502 | 0.0.1 | hfopenllm_v2/HuggingFaceH4_zephyr-7b-alpha/1762652579.640769 | 1762652579.64077 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | HuggingFaceH4/zephyr-7b-alpha | HuggingFaceH4/zephyr-7b-alpha | HuggingFaceH4 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5191480826429429}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "MistralForCausalLM", "params_billions": 7.242} |
HF Open LLM v2 | baconnier | baconnier/Napoleon_24B_V0.2 | 4857d2d0-1a4b-4544-8b1e-fb4b01618a3b | 0.0.1 | hfopenllm_v2/baconnier_Napoleon_24B_V0.2/1762652580.022489 | 1762652580.022489 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | baconnier/Napoleon_24B_V0.2 | baconnier/Napoleon_24B_V0.2 | baconnier | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2527172347150006}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "MistralForCausalLM", "params_billions": 23.572} |
HF Open LLM v2 | baconnier | baconnier/Napoleon_24B_V0.0 | 88fb101e-35dd-40af-922f-9b66a2711249 | 0.0.1 | hfopenllm_v2/baconnier_Napoleon_24B_V0.0/1762652580.0222468 | 1762652580.022248 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | baconnier/Napoleon_24B_V0.0 | baconnier/Napoleon_24B_V0.0 | baconnier | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.1801021290176731}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "MistralForCausalLM", "params_billions": 23.572} |
HF Open LLM v2 | ehristoforu | ehristoforu/fd-lora-merged-64x128 | 6474672b-7728-4ab5-8fdf-749e996272a2 | 0.0.1 | hfopenllm_v2/ehristoforu_fd-lora-merged-64x128/1762652580.14183 | 1762652580.141831 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ehristoforu/fd-lora-merged-64x128 | ehristoforu/fd-lora-merged-64x128 | ehristoforu | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3281060918363276}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.777} |
HF Open LLM v2 | ehristoforu | ehristoforu/moremerge | 38cf2a56-ed33-4f7e-94aa-bf4f15a5a53c | 0.0.1 | hfopenllm_v2/ehristoforu_moremerge/1762652580.1440692 | 1762652580.14407 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ehristoforu/moremerge | ehristoforu/moremerge | ehristoforu | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.20190982149585324}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.613} |
HF Open LLM v2 | ehristoforu | ehristoforu/ud-14b | 7e7ffbef-c8d4-47ff-9ae6-7f0701e9e192 | 0.0.1 | hfopenllm_v2/ehristoforu_ud-14b/1762652580.146786 | 1762652580.146786 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ehristoforu/ud-14b | ehristoforu/ud-14b | ehristoforu | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4235273518708139}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | ehristoforu | ehristoforu/trd-7b-it | 3bd7f3c1-772a-45fa-9d71-a6e3dff3b54f | 0.0.1 | hfopenllm_v2/ehristoforu_trd-7b-it/1762652580.146566 | 1762652580.1465669 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ehristoforu/trd-7b-it | ehristoforu/trd-7b-it | ehristoforu | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.21847143357402804}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.613} |
HF Open LLM v2 | ehristoforu | ehristoforu/della-70b-test-v1 | d9f6c1e9-84be-4666-b64f-5da37cf98202 | 0.0.1 | hfopenllm_v2/ehristoforu_della-70b-test-v1/1762652580.141174 | 1762652580.141175 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ehristoforu/della-70b-test-v1 | ehristoforu/della-70b-test-v1 | ehristoforu | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.49786566310722213}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 70.554} |
HF Open LLM v2 | ehristoforu | ehristoforu/fq2.5-7b-it-normalize_true | d0d8274c-7d05-4166-a510-487cb294135e | 0.0.1 | hfopenllm_v2/ehristoforu_fq2.5-7b-it-normalize_true/1762652580.1426702 | 1762652580.142671 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ehristoforu/fq2.5-7b-it-normalize_true | ehristoforu/fq2.5-7b-it-normalize_true | ehristoforu | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7399156460413925}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | ehristoforu | ehristoforu/tmoe-v2 | 0a84406f-a970-4a03-8d2f-c82a8bbd3872 | 0.0.1 | hfopenllm_v2/ehristoforu_tmoe-v2/1762652580.146366 | 1762652580.146367 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ehristoforu/tmoe-v2 | ehristoforu/tmoe-v2 | ehristoforu | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.19026959578363187}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2MoeForCausalLM", "params_billions": 11.026} |
HF Open LLM v2 | ehristoforu | ehristoforu/fq2.5-7b-it-normalize_false | a5004f95-0854-40d2-8a71-004875544499 | 0.0.1 | hfopenllm_v2/ehristoforu_fq2.5-7b-it-normalize_false/1762652580.142459 | 1762652580.1424599 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ehristoforu/fq2.5-7b-it-normalize_false | ehristoforu/fq2.5-7b-it-normalize_false | ehristoforu | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7399156460413925}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | ehristoforu | ehristoforu/fp4-14b-v1-fix | 37d01a2d-f8ca-46a3-a4b7-3fa725b4023b | 0.0.1 | hfopenllm_v2/ehristoforu_fp4-14b-v1-fix/1762652580.142252 | 1762652580.1422532 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ehristoforu/fp4-14b-v1-fix | ehristoforu/fp4-14b-v1-fix | ehristoforu | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6741700909143296}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | ehristoforu | ehristoforu/testq-32b | d5acc9ed-9fd1-411f-a85c-e790521e7fe4 | 0.0.1 | hfopenllm_v2/ehristoforu_testq-32b/1762652580.145958 | 1762652580.145958 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ehristoforu/testq-32b | ehristoforu/testq-32b | ehristoforu | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.18759668789921852}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 56.165} |
HF Open LLM v2 | ehristoforu | ehristoforu/falcon3-ultraset | e2291d7c-7627-484e-a0c1-1857c642be2b | 0.0.1 | hfopenllm_v2/ehristoforu_falcon3-ultraset/1762652580.1413918 | 1762652580.141393 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ehristoforu/falcon3-ultraset | ehristoforu/falcon3-ultraset | ehristoforu | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7135123694020753}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 7.456} |
HF Open LLM v2 | ehristoforu | ehristoforu/fp4-14b-it-v1 | 31618256-7ca8-4a3c-bfbf-4397bf2cf339 | 0.0.1 | hfopenllm_v2/ehristoforu_fp4-14b-it-v1/1762652580.1420429 | 1762652580.1420438 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ehristoforu/fp4-14b-it-v1 | ehristoforu/fp4-14b-it-v1 | ehristoforu | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.25346746632269046}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | ehristoforu | ehristoforu/Falcon3-8B-Franken-Basestruct | 1653400c-137e-4745-8676-eeaf39bbcc13 | 0.0.1 | hfopenllm_v2/ehristoforu_Falcon3-8B-Franken-Basestruct/1762652580.138562 | 1762652580.1385632 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ehristoforu/Falcon3-8B-Franken-Basestruct | ehristoforu/Falcon3-8B-Franken-Basestruct | ehristoforu | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.17148499315150467}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.406} |
HF Open LLM v2 | ehristoforu | ehristoforu/Falcon3-MoE-2x7B-Insruct | 6b208d1e-96f1-4b72-8d31-6c6e43c42111 | 0.0.1 | hfopenllm_v2/ehristoforu_Falcon3-MoE-2x7B-Insruct/1762652580.1388721 | 1762652580.138873 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ehristoforu/Falcon3-MoE-2x7B-Insruct | ehristoforu/Falcon3-MoE-2x7B-Insruct | ehristoforu | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7642954028643998}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "MixtralForCausalLM", "params_billions": 13.401} |
HF Open LLM v2 | ehristoforu | ehristoforu/fd-lora-merged-16x32 | 4d00474d-97e6-4384-82f7-956b2e7268e9 | 0.0.1 | hfopenllm_v2/ehristoforu_fd-lora-merged-16x32/1762652580.141611 | 1762652580.141612 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ehristoforu/fd-lora-merged-16x32 | ehristoforu/fd-lora-merged-16x32 | ehristoforu | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3480897352358409}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.776} |
HF Open LLM v2 | ehristoforu | ehristoforu/tmoe | 0a160c2d-06ed-43c0-8705-bd76e47c093a | 0.0.1 | hfopenllm_v2/ehristoforu_tmoe/1762652580.1461592 | 1762652580.1461592 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ehristoforu/tmoe | ehristoforu/tmoe | ehristoforu | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.11930234001338672}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2MoeForCausalLM", "params_billions": 11.026} |
HF Open LLM v2 | ehristoforu | ehristoforu/mllama-3.1-8b-instruct | 40016b83-0730-4e67-b7e9-3b1d29d9d1be | 0.0.1 | hfopenllm_v2/ehristoforu_mllama-3.1-8b-instruct/1762652580.143588 | 1762652580.143589 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ehristoforu/mllama-3.1-8b-instruct | ehristoforu/mllama-3.1-8b-instruct | ehristoforu | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3457913890698901}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | ehristoforu | ehristoforu/moremerge-upscaled | 5c465aeb-c6be-4a22-9cf0-3d9c2558ba39 | 0.0.1 | hfopenllm_v2/ehristoforu_moremerge-upscaled/1762652580.144358 | 1762652580.1443589 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ehristoforu/moremerge-upscaled | ehristoforu/moremerge-upscaled | ehristoforu | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.1978882697908217}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 8.545} |
HF Open LLM v2 | ehristoforu | ehristoforu/rufalcon3-3b-it | 8f4336f8-1fdb-4a3d-8b9a-2e7c5e156f07 | 0.0.1 | hfopenllm_v2/ehristoforu_rufalcon3-3b-it/1762652580.14555 | 1762652580.14555 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ehristoforu/rufalcon3-3b-it | ehristoforu/rufalcon3-3b-it | ehristoforu | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5942111375594533}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 3.228} |
HF Open LLM v2 | ehristoforu | ehristoforu/SoRu-0009 | d45e7b32-f09d-4185-ac78-d0eb7a4d3823 | 0.0.1 | hfopenllm_v2/ehristoforu_SoRu-0009/1762652580.1407459 | 1762652580.140747 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ehristoforu/SoRu-0009 | ehristoforu/SoRu-0009 | ehristoforu | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.25818827378023645}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 0.494} |
HF Open LLM v2 | ehristoforu | ehristoforu/rmoe-v1 | e58aecba-3254-426d-aac2-05a32c3cbdab | 0.0.1 | hfopenllm_v2/ehristoforu_rmoe-v1/1762652580.1453388 | 1762652580.14534 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ehristoforu/rmoe-v1 | ehristoforu/rmoe-v1 | ehristoforu | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.26500795666609045}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2MoeForCausalLM", "params_billions": 11.026} |
HF Open LLM v2 | Gryphe | Gryphe/Pantheon-RP-Pure-1.6.2-22b-Small | f5f73aa0-2223-49c0-a2ad-df38ee33355b | 0.0.1 | hfopenllm_v2/Gryphe_Pantheon-RP-Pure-1.6.2-22b-Small/1762652579.6344929 | 1762652579.6344929 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Gryphe/Pantheon-RP-Pure-1.6.2-22b-Small | Gryphe/Pantheon-RP-Pure-1.6.2-22b-Small | Gryphe | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6931042965996888}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "MistralForCausalLM", "params_billions": 22.247} |
HF Open LLM v2 | Gryphe | Gryphe/Pantheon-RP-1.6-12b-Nemo-KTO | a2445d2d-b8a2-44e4-9c74-7401e7afde75 | 0.0.1 | hfopenllm_v2/Gryphe_Pantheon-RP-1.6-12b-Nemo-KTO/1762652579.634284 | 1762652579.634285 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Gryphe/Pantheon-RP-1.6-12b-Nemo-KTO | Gryphe/Pantheon-RP-1.6-12b-Nemo-KTO | Gryphe | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4636187537954849}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "MistralForCausalLM", "params_billions": 12.248} |
HF Open LLM v2 | Gryphe | Gryphe/Pantheon-RP-1.5-12b-Nemo | f9ed0b0f-6fa9-4450-97fe-204f6dc8d88a | 0.0.1 | hfopenllm_v2/Gryphe_Pantheon-RP-1.5-12b-Nemo/1762652579.633812 | 1762652579.633813 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Gryphe/Pantheon-RP-1.5-12b-Nemo | Gryphe/Pantheon-RP-1.5-12b-Nemo | Gryphe | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.47630841722186024}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "MistralForCausalLM", "params_billions": 12.248} |
HF Open LLM v2 | Gryphe | Gryphe/Pantheon-RP-1.6-12b-Nemo | 9a2ca2e5-a2e9-460f-b4dc-a6293ca13003 | 0.0.1 | hfopenllm_v2/Gryphe_Pantheon-RP-1.6-12b-Nemo/1762652579.634059 | 1762652579.6340601 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Gryphe/Pantheon-RP-1.6-12b-Nemo | Gryphe/Pantheon-RP-1.6-12b-Nemo | Gryphe | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.44805671174705336}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "MistralForCausalLM", "params_billions": 12.248} |
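The `evaluation_results` field in each row above is a JSON array of per-benchmark entries. A minimal sketch of pulling scores out of one such entry, assuming the truncated arrays follow the structure visible in the rows (`evaluation_name`, `metric_config`, `score_details.score`) — the example record below is illustrative, not a complete row from the dataset:

```python
import json

# A single evaluation entry, shaped like the (truncated) JSON in the
# rows above. Fields beyond what those rows show are assumptions.
raw = """
[{"evaluation_name": "IFEval",
  "metric_config": {"evaluation_description": "Accuracy on IFEval",
                    "lower_is_better": false,
                    "score_type": "continuous",
                    "min_score": 0,
                    "max_score": 1},
  "score_details": {"score": 0.5191480826429429}}]
"""

evaluation_results = json.loads(raw)

# Map benchmark name -> score for easy lookup across models.
scores = {entry["evaluation_name"]: entry["score_details"]["score"]
          for entry in evaluation_results}

print(scores)
```

A row's `additional_details` column (precision, architecture, `params_billions`) is a separate JSON object and can be parsed the same way with `json.loads`.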