| _leaderboard | _developer | _model | _uuid | schema_version | evaluation_id | retrieved_timestamp | source_data | evaluation_source_name | evaluation_source_type | source_organization_name | source_organization_url | source_organization_logo_url | evaluator_relationship | model_name | model_id | model_developer | model_inference_platform | evaluation_results | additional_details |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
HF Open LLM v2 | meta | ValiantLabs/Llama3.1-8B-Fireplace2 | 08843042-f5ed-4dbb-befe-82c48e370664 | 0.0.1 | hfopenllm_v2/ValiantLabs_Llama3.1-8B-Fireplace2/1762652579.945827 | 1762652579.945827 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ValiantLabs/Llama3.1-8B-Fireplace2 | ValiantLabs/Llama3.1-8B-Fireplace2 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5483240025354947}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | ValiantLabs/Llama3.1-8B-Fireplace2 | 8c25e90b-944b-4c23-a7ed-43c9609c6bf7 | 0.0.1 | hfopenllm_v2/ValiantLabs_Llama3.1-8B-Fireplace2/1762652579.946038 | 1762652579.946039 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ValiantLabs/Llama3.1-8B-Fireplace2 | ValiantLabs/Llama3.1-8B-Fireplace2 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5328118281714739}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | ValiantLabs/Llama3.2-3B-Enigma | 71e3ab93-9667-4e99-b0a1-e25b701b13fd | 0.0.1 | hfopenllm_v2/ValiantLabs_Llama3.2-3B-Enigma/1762652579.94662 | 1762652579.946621 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ValiantLabs/Llama3.2-3B-Enigma | ValiantLabs/Llama3.2-3B-Enigma | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2786218345102107}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 3.213} |
HF Open LLM v2 | meta | ValiantLabs/Llama3.1-8B-Esper2 | aa8f6d7a-bf7a-4e00-932f-b31c9cf0705e | 0.0.1 | hfopenllm_v2/ValiantLabs_Llama3.1-8B-Esper2/1762652579.945612 | 1762652579.9456131 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ValiantLabs/Llama3.1-8B-Esper2 | ValiantLabs/Llama3.1-8B-Esper2 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2567398945907968}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | ValiantLabs/Llama3.1-8B-Enigma | e1c4e454-79c8-448d-ab33-629900a35779 | 0.0.1 | hfopenllm_v2/ValiantLabs_Llama3.1-8B-Enigma/1762652579.945396 | 1762652579.945397 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ValiantLabs/Llama3.1-8B-Enigma | ValiantLabs/Llama3.1-8B-Enigma | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.26805542626896633}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | BlackBeenie/Llama-3.1-8B-pythonic-passthrough-merge | f852dab4-9c5a-4fb9-99c2-951e7d2300d0 | 0.0.1 | hfopenllm_v2/BlackBeenie_Llama-3.1-8B-pythonic-passthrough-merge/1762652579.495604 | 1762652579.495605 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | BlackBeenie/Llama-3.1-8B-pythonic-passthrough-merge | BlackBeenie/Llama-3.1-8B-pythonic-passthrough-merge | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.23158552640327662}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 20.245} |
HF Open LLM v2 | meta | BlackBeenie/llama-3-luminous-merged | 9ca4809e-2bf0-477e-b960-64718561583b | 0.0.1 | hfopenllm_v2/BlackBeenie_llama-3-luminous-merged/1762652579.496879 | 1762652579.49688 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | BlackBeenie/llama-3-luminous-merged | BlackBeenie/llama-3-luminous-merged | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.43234506664538974}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | BlackBeenie/llama-3.1-8B-Galore-openassistant-guanaco | 7f8d4c8c-4877-4b2f-a0fe-7817894daa79 | 0.0.1 | hfopenllm_v2/BlackBeenie_llama-3.1-8B-Galore-openassistant-guanaco/1762652579.4970949 | 1762652579.4970958 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | BlackBeenie/llama-3.1-8B-Galore-openassistant-guanaco | BlackBeenie/llama-3.1-8B-Galore-openassistant-guanaco | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2634842218646525}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | BlackBeenie/Neos-Llama-3.1-8B | 904e3917-3bfd-4c83-8088-6b5ac596e7ea | 0.0.1 | hfopenllm_v2/BlackBeenie_Neos-Llama-3.1-8B/1762652579.496156 | 1762652579.496157 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | BlackBeenie/Neos-Llama-3.1-8B | BlackBeenie/Neos-Llama-3.1-8B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.49439376410147295}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | BlackBeenie/Neos-Llama-3.1-base | ec9c46a6-a0e9-4174-8ebe-ce33d5eeb27d | 0.0.1 | hfopenllm_v2/BlackBeenie_Neos-Llama-3.1-base/1762652579.496382 | 1762652579.496383 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | BlackBeenie/Neos-Llama-3.1-base | BlackBeenie/Neos-Llama-3.1-base | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.17508211545366295}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 4.65} |
HF Open LLM v2 | meta | jiangxinyang-shanda/Homer-LLama3-8B | 73c50ab1-bdf8-4fbc-b7e6-d4a8e8bb8a4e | 0.0.1 | hfopenllm_v2/jiangxinyang-shanda_Homer-LLama3-8B/1762652580.2879412 | 1762652580.287943 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | jiangxinyang-shanda/Homer-LLama3-8B | jiangxinyang-shanda/Homer-LLama3-8B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3991719748046295}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | Bllossom/llama-3.2-Korean-Bllossom-AICA-5B | e2668c3c-a862-4564-acee-3c3ce439f74f | 0.0.1 | hfopenllm_v2/Bllossom_llama-3.2-Korean-Bllossom-AICA-5B/1762652579.497314 | 1762652579.497314 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Bllossom/llama-3.2-Korean-Bllossom-AICA-5B | Bllossom/llama-3.2-Korean-Bllossom-AICA-5B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5172497861230424}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "MllamaForConditionalGeneration", "params_billions": 5.199} |
HF Open LLM v2 | meta | ArliAI/Llama-3.1-8B-ArliAI-RPMax-v1.1 | 1d33cf05-9690-41ba-9288-5f39e5b3c17d | 0.0.1 | hfopenllm_v2/ArliAI_Llama-3.1-8B-ArliAI-RPMax-v1.1/1762652579.4817438 | 1762652579.481745 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ArliAI/Llama-3.1-8B-ArliAI-RPMax-v1.1 | ArliAI/Llama-3.1-8B-ArliAI-RPMax-v1.1 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6359016298975606}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | oopere/pruned20-llama-3.2-3b | e0e6bdbd-91c2-4d45-be73-03890ed13709 | 0.0.1 | hfopenllm_v2/oopere_pruned20-llama-3.2-3b/1762652580.4269419 | 1762652580.426943 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | oopere/pruned20-llama-3.2-3b | oopere/pruned20-llama-3.2-3b | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.17887870849346402}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 2.79} |
HF Open LLM v2 | meta | oopere/pruned60-llama-3.2-3b | 219c6f49-3d48-4e1b-8105-fdf323b2fc3c | 0.0.1 | hfopenllm_v2/oopere_pruned60-llama-3.2-3b/1762652580.42798 | 1762652580.4279811 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | oopere/pruned60-llama-3.2-3b | oopere/pruned60-llama-3.2-3b | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.1824758307956223}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 1.944} |
HF Open LLM v2 | meta | oopere/pruned60-llama-1b | 4c0ac526-821a-49eb-9eee-152d594ed25b | 0.0.1 | hfopenllm_v2/oopere_pruned60-llama-1b/1762652580.4277859 | 1762652580.4277859 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | oopere/pruned60-llama-1b | oopere/pruned60-llama-1b | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.18285039251408486}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 0.753} |
HF Open LLM v2 | meta | oopere/pruned20-llama-1b | c86ed5b4-8793-424a-a5a2-9a54689cb388 | 0.0.1 | hfopenllm_v2/oopere_pruned20-llama-1b/1762652580.426731 | 1762652580.426732 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | oopere/pruned20-llama-1b | oopere/pruned20-llama-1b | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.19936213690784896}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 1.075} |
HF Open LLM v2 | meta | oopere/Llama-FinSent-S | 8b9ec467-1555-415c-b1ee-23be18ded9e5 | 0.0.1 | hfopenllm_v2/oopere_Llama-FinSent-S/1762652580.4263492 | 1762652580.42635 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | oopere/Llama-FinSent-S | oopere/Llama-FinSent-S | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2163980460733077}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 0.914} |
HF Open LLM v2 | meta | oopere/Llama-FinSent-S | f99bad90-e7b2-4205-9f51-93f96e90188c | 0.0.1 | hfopenllm_v2/oopere_Llama-FinSent-S/1762652580.426095 | 1762652580.426095 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | oopere/Llama-FinSent-S | oopere/Llama-FinSent-S | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.21187670935340452}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 0.914} |
HF Open LLM v2 | meta | oopere/pruned40-llama-1b | 0032ea65-98dc-48a9-90e7-835e389acecd | 0.0.1 | hfopenllm_v2/oopere_pruned40-llama-1b/1762652580.427145 | 1762652580.427145 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | oopere/pruned40-llama-1b | oopere/pruned40-llama-1b | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.22843832143157933}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 0.914} |
HF Open LLM v2 | meta | oopere/pruned40-llama-3.2-1B | bae27b4d-4046-45f1-b798-8356fa962df4 | 0.0.1 | hfopenllm_v2/oopere_pruned40-llama-3.2-1B/1762652580.427387 | 1762652580.427387 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | oopere/pruned40-llama-3.2-1B | oopere/pruned40-llama-3.2-1B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.22663976028050017}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 0.914} |
HF Open LLM v2 | meta | oopere/pruned10-llama-3.2-3B | 2ff7d218-348b-4069-808f-6b32e7a77a5b | 0.0.1 | hfopenllm_v2/oopere_pruned10-llama-3.2-3B/1762652580.426529 | 1762652580.4265301 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | oopere/pruned10-llama-3.2-3B | oopere/pruned10-llama-3.2-3B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.17762980004166723}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 3.001} |
HF Open LLM v2 | meta | oopere/pruned40-llama-3.2-3b | 97c9b209-b2ed-439f-9b01-cad25e205fa9 | 0.0.1 | hfopenllm_v2/oopere_pruned40-llama-3.2-3b/1762652580.4275908 | 1762652580.4275908 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | oopere/pruned40-llama-3.2-3b | oopere/pruned40-llama-3.2-3b | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.21829634259320824}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 2.367} |
HF Open LLM v2 | meta | orai-nlp/Llama-eus-8B | 0ed99007-3e31-4c48-abe5-0cd94b95dcf4 | 0.0.1 | hfopenllm_v2/orai-nlp_Llama-eus-8B/1762652580.43225 | 1762652580.432275 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | orai-nlp/Llama-eus-8B | orai-nlp/Llama-eus-8B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.21612321972366655}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | MLP-KTLim/llama-3-Korean-Bllossom-8B | 31a37662-052e-440c-a475-66543b6c52b1 | 0.0.1 | hfopenllm_v2/MLP-KTLim_llama-3-Korean-Bllossom-8B/1762652579.7427032 | 1762652579.7427042 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | MLP-KTLim/llama-3-Korean-Bllossom-8B | MLP-KTLim/llama-3-Korean-Bllossom-8B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5112800702136997}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | sakhan10/quantized_open_llama_3b_v2 | f96ce5a9-7cc2-4380-9285-09052b906411 | 0.0.1 | hfopenllm_v2/sakhan10_quantized_open_llama_3b_v2/1762652580.507647 | 1762652580.507648 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sakhan10/quantized_open_llama_3b_v2 | sakhan10/quantized_open_llama_3b_v2 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.18722212618075595}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 3.0} |
HF Open LLM v2 | meta | riaz/FineLlama-3.1-8B | 55eb0438-f0bd-4f9d-8bff-577d0245a57c | 0.0.1 | hfopenllm_v2/riaz_FineLlama-3.1-8B/1762652580.495657 | 1762652580.495657 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | riaz/FineLlama-3.1-8B | riaz/FineLlama-3.1-8B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.43734070045257695}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | riaz/FineLlama-3.1-8B | d5fb7571-bafd-424a-87f5-2d14ac7bd8d2 | 0.0.1 | hfopenllm_v2/riaz_FineLlama-3.1-8B/1762652580.4959512 | 1762652580.495952 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | riaz/FineLlama-3.1-8B | riaz/FineLlama-3.1-8B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.413660199382084}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | Joseph717171/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base | e5843711-00cb-4167-a47d-4874af0c3ba2 | 0.0.1 | hfopenllm_v2/Joseph717171_Llama-3.1-SuperNova-8B-Lite_TIES_with_Base/1762652579.6947358 | 1762652579.694737 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Joseph717171/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base | Joseph717171/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.8096328851890761}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | Joseph717171/Hermes-3-Llama-3.1-8B_TIES_with_Base_Embeds_Initialized_to_Special_Instruct_Toks_dtypeF32 | 26dd2a1f-27ae-4311-9b80-f5a8f0fa456a | 0.0.1 | hfopenllm_v2/Joseph717171_Hermes-3-Llama-3.1-8B_TIES_with_Base_Embeds_Initialized_to_Special_Instruct_Toks_dtypeF32/1762652579.694483 | 1762652579.694484 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Joseph717171/Hermes-3-Llama-3.1-8B_TIES_with_Base_Embeds_Initialized_to_Special_Instruct_Toks_dtypeF32 | Joseph717171/Hermes-3-Llama-3.1-8B_TIES_with_Base_Embeds_Initialized_to_Special_Instruct_Toks_dtypeF32 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6185410266980501}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | Columbia-NLP/LION-LLaMA-3-8b-odpo-v1.0 | c256cede-47bb-487d-9de2-ae7352faa165 | 0.0.1 | hfopenllm_v2/Columbia-NLP_LION-LLaMA-3-8b-odpo-v1.0/1762652579.5080209 | 1762652579.508022 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Columbia-NLP/LION-LLaMA-3-8b-odpo-v1.0 | Columbia-NLP/LION-LLaMA-3-8b-odpo-v1.0 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.39679938119744496}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | OpenLeecher/llama3-8b-lima | b482d6e6-8520-4a77-a729-ebe2e9635a6c | 0.0.1 | hfopenllm_v2/OpenLeecher_llama3-8b-lima/1762652579.807648 | 1762652579.8076491 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | OpenLeecher/llama3-8b-lima | OpenLeecher/llama3-8b-lima | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.43706587410293574}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | akhadangi/Llama3.2.1B.0.01-Last | 9f796e5e-6c31-46e0-b839-e21d33a403c4 | 0.0.1 | hfopenllm_v2/akhadangi_Llama3.2.1B.0.01-Last/1762652579.980133 | 1762652579.9801338 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | akhadangi/Llama3.2.1B.0.01-Last | akhadangi/Llama3.2.1B.0.01-Last | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.09165015492227291}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 1.236} |
HF Open LLM v2 | meta | akhadangi/Llama3.2.1B.BaseFiT | 8577766f-d696-489d-8194-31b48c17941a | 0.0.1 | hfopenllm_v2/akhadangi_Llama3.2.1B.BaseFiT/1762652579.980761 | 1762652579.980762 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | akhadangi/Llama3.2.1B.BaseFiT | akhadangi/Llama3.2.1B.BaseFiT | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.08827799128534511}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 1.236} |
HF Open LLM v2 | meta | akhadangi/Llama3.2.1B.0.01-First | d07eada4-e73c-4dd6-8538-e3a9cd471f34 | 0.0.1 | hfopenllm_v2/akhadangi_Llama3.2.1B.0.01-First/1762652579.979876 | 1762652579.979876 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | akhadangi/Llama3.2.1B.0.01-First | akhadangi/Llama3.2.1B.0.01-First | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.08135857303066973}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 1.236} |
HF Open LLM v2 | meta | akhadangi/Llama3.2.1B.0.1-Last | 82c24fd7-de74-4dc8-bd22-5761243ed826 | 0.0.1 | hfopenllm_v2/akhadangi_Llama3.2.1B.0.1-Last/1762652579.980555 | 1762652579.980556 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | akhadangi/Llama3.2.1B.0.1-Last | akhadangi/Llama3.2.1B.0.1-Last | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.09497245087479}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH",... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 1.236} |
HF Open LLM v2 | meta | akhadangi/Llama3.2.1B.0.1-First | 4ec306d4-3f34-4330-9898-fb5ccb9a3483 | 0.0.1 | hfopenllm_v2/akhadangi_Llama3.2.1B.0.1-First/1762652579.9803479 | 1762652579.9803488 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | akhadangi/Llama3.2.1B.0.1-First | akhadangi/Llama3.2.1B.0.1-First | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.10009330797838623}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 1.236} |
HF Open LLM v2 | meta | CreitinGameplays/Llama-3.1-8B-R1-v0.1 | a4b935d4-1664-44e4-ad82-639755c2b909 | 0.0.1 | hfopenllm_v2/CreitinGameplays_Llama-3.1-8B-R1-v0.1/1762652579.514677 | 1762652579.514678 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | CreitinGameplays/Llama-3.1-8B-R1-v0.1 | CreitinGameplays/Llama-3.1-8B-R1-v0.1 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.323485019747603}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | Infinirc/Infinirc-Llama3-8B-2G-Release-v1.0 | 8c8a47f2-c8cf-4ea8-b0ee-0180aeb1b9f0 | 0.0.1 | hfopenllm_v2/Infinirc_Infinirc-Llama3-8B-2G-Release-v1.0/1762652579.6465652 | 1762652579.6465652 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Infinirc/Infinirc-Llama3-8B-2G-Release-v1.0 | Infinirc/Infinirc-Llama3-8B-2G-Release-v1.0 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.20243398626754788}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | ContactDoctor/Bio-Medical-Llama-3-8B | 42a3e3b7-b8e3-4470-b1a6-4a3daa146484 | 0.0.1 | hfopenllm_v2/ContactDoctor_Bio-Medical-Llama-3-8B/1762652579.510189 | 1762652579.510189 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ContactDoctor/Bio-Medical-Llama-3-8B | ContactDoctor/Bio-Medical-Llama-3-8B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4422365988909427}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 4.015} |
HF Open LLM v2 | meta | VIRNECT/llama-3-Korean-8B-r-v-0.1 | c3448f16-33c4-42c8-bde3-b503786cba7f | 0.0.1 | hfopenllm_v2/VIRNECT_llama-3-Korean-8B-r-v-0.1/1762652579.944067 | 1762652579.9440682 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | VIRNECT/llama-3-Korean-8B-r-v-0.1 | VIRNECT/llama-3-Korean-8B-r-v-0.1 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.49157125316382755}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "?", "params_billions": 16.061} |
HF Open LLM v2 | meta | VIRNECT/llama-3-Korean-8B | 1193d16a-5ba8-4a6c-b13d-116bb7731a71 | 0.0.1 | hfopenllm_v2/VIRNECT_llama-3-Korean-8B/1762652579.943881 | 1762652579.943882 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | VIRNECT/llama-3-Korean-8B | VIRNECT/llama-3-Korean-8B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5021376614050719}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | VIRNECT/llama-3-Korean-8B | c5ef57d2-a521-4b09-9aa1-0c06c9888cda | 0.0.1 | hfopenllm_v2/VIRNECT_llama-3-Korean-8B/1762652579.943627 | 1762652579.943627 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | VIRNECT/llama-3-Korean-8B | VIRNECT/llama-3-Korean-8B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5058345190760515}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | Azure99/blossom-v5-llama3-8b | 19a6e24f-819e-480f-a15f-90273a0a06c5 | 0.0.1 | hfopenllm_v2/Azure99_blossom-v5-llama3-8b/1762652579.486878 | 1762652579.486878 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Azure99/blossom-v5-llama3-8b | Azure99/blossom-v5-llama3-8b | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.434293230849701}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | ngxson/MiniThinky-v2-1B-Llama-3.2 | f37d1682-5df9-45dc-92ae-6bf587a03e9b | 0.0.1 | hfopenllm_v2/ngxson_MiniThinky-v2-1B-Llama-3.2/1762652580.405281 | 1762652580.405282 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ngxson/MiniThinky-v2-1B-Llama-3.2 | ngxson/MiniThinky-v2-1B-Llama-3.2 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2963071317437732}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 1.236} |
HF Open LLM v2 | meta | ngxson/MiniThinky-1B-Llama-3.2 | 3a05547d-850b-42b5-978d-0aff574cb5ca | 0.0.1 | hfopenllm_v2/ngxson_MiniThinky-1B-Llama-3.2/1762652580.4050229 | 1762652580.4050229 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ngxson/MiniThinky-1B-Llama-3.2 | ngxson/MiniThinky-1B-Llama-3.2 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2771479673931834}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 1.236} |
HF Open LLM v2 | meta | arcee-ai/Llama-Spark | aaceb35d-4106-4d6c-b895-446b87394f3b | 0.0.1 | hfopenllm_v2/arcee-ai_Llama-Spark/1762652580.0163891 | 1762652580.0163898 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | arcee-ai/Llama-Spark | arcee-ai/Llama-Spark | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7910732412221794}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | arcee-ai/Llama-3.1-SuperNova-Lite | 4bc80120-a5e2-4824-b278-c2de7140a2bf | 0.0.1 | hfopenllm_v2/arcee-ai_Llama-3.1-SuperNova-Lite/1762652580.016114 | 1762652580.016115 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | arcee-ai/Llama-3.1-SuperNova-Lite | arcee-ai/Llama-3.1-SuperNova-Lite | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.8017393848322452}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | noname0202/llama-math-1b-r32-test | 6c3ed9db-730c-48cb-95f9-662467957403 | 0.0.1 | hfopenllm_v2/noname0202_llama-math-1b-r32-test/1762652580.410917 | 1762652580.410918 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | noname0202/llama-math-1b-r32-test | noname0202/llama-math-1b-r32-test | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5819215237791282}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 1.236} |
HF Open LLM v2 | meta | noname0202/llama-math-1b-r16-0to512tokens-test | 8fb0f696-49a8-4611-ad82-3b7e19d5d867 | 0.0.1 | hfopenllm_v2/noname0202_llama-math-1b-r16-0to512tokens-test/1762652580.4104571 | 1762652580.410458 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | noname0202/llama-math-1b-r16-0to512tokens-test | noname0202/llama-math-1b-r16-0to512tokens-test | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5469753587148765}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 1.236} |
HF Open LLM v2 | meta | noname0202/llama-math-1b-r8-512tokens-test | c9d6f048-95b8-44ea-9d17-9d9f2d4854b4 | 0.0.1 | hfopenllm_v2/noname0202_llama-math-1b-r8-512tokens-test/1762652580.411124 | 1762652580.411125 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | noname0202/llama-math-1b-r8-512tokens-test | noname0202/llama-math-1b-r8-512tokens-test | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5791987482103043}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 1.236} |
HF Open LLM v2 | meta | noname0202/llama-math-1b-r32-0to512tokens-test | 5623295c-0170-4832-b3e9-df00c660c59b | 0.0.1 | hfopenllm_v2/noname0202_llama-math-1b-r32-0to512tokens-test/1762652580.410711 | 1762652580.410711 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | noname0202/llama-math-1b-r32-0to512tokens-test | noname0202/llama-math-1b-r32-0to512tokens-test | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5682577782505973}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 1.236} |
HF Open LLM v2 | meta | CYFRAGOVPL/Llama-PLLuM-8B-base | 01484796-f32b-43fe-b865-517b1a5c0b10 | 0.0.1 | hfopenllm_v2/CYFRAGOVPL_Llama-PLLuM-8B-base/1762652579.500559 | 1762652579.5005598 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | CYFRAGOVPL/Llama-PLLuM-8B-base | CYFRAGOVPL/Llama-PLLuM-8B-base | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.28988749850396944}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | netcat420/MFANN-Llama3.1-Abliterated-SLERP-V4 | 12a56879-c48c-4422-bc6f-fad813c94414 | 0.0.1 | hfopenllm_v2/netcat420_MFANN-Llama3.1-Abliterated-SLERP-V4/1762652580.39286 | 1762652580.392861 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | netcat420/MFANN-Llama3.1-Abliterated-SLERP-V4 | netcat420/MFANN-Llama3.1-Abliterated-SLERP-V4 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.41688275996577967}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | netcat420/MFANN-Llama3.1-Abliterated-Slerp-V3.2 | 1ef7ee4e-ab54-4e5a-b27f-4d6aeffd3f54 | 0.0.1 | hfopenllm_v2/netcat420_MFANN-Llama3.1-Abliterated-Slerp-V3.2/1762652580.3935192 | 1762652580.39352 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | netcat420/MFANN-Llama3.1-Abliterated-Slerp-V3.2 | netcat420/MFANN-Llama3.1-Abliterated-Slerp-V3.2 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.41281134057633745}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | netcat420/MFANN-llama3.1-abliterated-v2 | 46728c83-957a-4eb7-8a04-0fee4efe50d1 | 0.0.1 | hfopenllm_v2/netcat420_MFANN-llama3.1-abliterated-v2/1762652580.3948102 | 1762652580.394811 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | netcat420/MFANN-llama3.1-abliterated-v2 | netcat420/MFANN-llama3.1-abliterated-v2 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4429114748866341}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | netcat420/Llama3.1-MFANN-8b | aa3467df-1a74-47af-b635-0318df88dd58 | 0.0.1 | hfopenllm_v2/netcat420_Llama3.1-MFANN-8b/1762652580.3921962 | 1762652580.3921971 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | netcat420/Llama3.1-MFANN-8b | netcat420/Llama3.1-MFANN-8b | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.29695651981187693}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | netcat420/MFANN-Llama3.1-Abliterated-SLERP-TIES-V3 | e5a71267-56c7-418a-bfcc-b4b5ed10496e | 0.0.1 | hfopenllm_v2/netcat420_MFANN-Llama3.1-Abliterated-SLERP-TIES-V3/1762652580.3926558 | 1762652580.3926558 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | netcat420/MFANN-Llama3.1-Abliterated-SLERP-TIES-V3 | netcat420/MFANN-Llama3.1-Abliterated-SLERP-TIES-V3 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4238021782204551}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | netcat420/MFANN-Llama3.1-Abliterated-SLERP-V5 | d52d6e93-b291-4f21-aca7-2c8d48313dec | 0.0.1 | hfopenllm_v2/netcat420_MFANN-Llama3.1-Abliterated-SLERP-V5/1762652580.393064 | 1762652580.393065 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | netcat420/MFANN-Llama3.1-Abliterated-SLERP-V5 | netcat420/MFANN-Llama3.1-Abliterated-SLERP-V5 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4328947193446721}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | netcat420/MFANN-llama3.1-Abliterated-SLERP | 3d3862a4-79df-488c-8d17-dc332fa3abce | 0.0.1 | hfopenllm_v2/netcat420_MFANN-llama3.1-Abliterated-SLERP/1762652580.394179 | 1762652580.39418 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | netcat420/MFANN-llama3.1-Abliterated-SLERP | netcat420/MFANN-llama3.1-Abliterated-SLERP | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.25906262051357065}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | netcat420/MFANN-llama3.1-abliterated-SLERP-v3 | 73f2659d-ff95-403f-99e0-09de7c807c3c | 0.0.1 | hfopenllm_v2/netcat420_MFANN-llama3.1-abliterated-SLERP-v3/1762652580.394387 | 1762652580.394388 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | netcat420/MFANN-llama3.1-abliterated-SLERP-v3 | netcat420/MFANN-llama3.1-abliterated-SLERP-v3 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.37993856301280604}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | netcat420/MFANN-llama3.1-abliterated-SLERP-v3.1 | 71e87ce8-88f2-4858-b65f-9225f59cc3f9 | 0.0.1 | hfopenllm_v2/netcat420_MFANN-llama3.1-abliterated-SLERP-v3.1/1762652580.394599 | 1762652580.3946 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | netcat420/MFANN-llama3.1-abliterated-SLERP-v3.1 | netcat420/MFANN-llama3.1-abliterated-SLERP-v3.1 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4201551882338861}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | netcat420/MFANN-Llama3.1-Abliterated-SLERP-TIES-V2 | a9c38a44-a973-4bfd-a1f1-aa094d5e37fd | 0.0.1 | hfopenllm_v2/netcat420_MFANN-Llama3.1-Abliterated-SLERP-TIES-V2/1762652580.3924491 | 1762652580.39245 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | netcat420/MFANN-Llama3.1-Abliterated-SLERP-TIES-V2 | netcat420/MFANN-Llama3.1-Abliterated-SLERP-TIES-V2 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4209796672828096}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | netcat420/MFANN-Llama3.1-Abliterated-Slerp-TIES | c5a71d25-35f7-453e-9551-7881046fdeff | 0.0.1 | hfopenllm_v2/netcat420_MFANN-Llama3.1-Abliterated-Slerp-TIES/1762652580.393313 | 1762652580.393313 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | netcat420/MFANN-Llama3.1-Abliterated-Slerp-TIES | netcat420/MFANN-Llama3.1-Abliterated-Slerp-TIES | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.42934746472692453}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | theprint/ReWiz-Llama-3.2-3B | 17d4fced-6a93-4e5e-8349-25dae16596f8 | 0.0.1 | hfopenllm_v2/theprint_ReWiz-Llama-3.2-3B/1762652580.5630422 | 1762652580.563043 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | theprint/ReWiz-Llama-3.2-3B | theprint/ReWiz-Llama-3.2-3B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4648931501748693}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 3.213} |
HF Open LLM v2 | meta | theprint/CleverBoi-Llama-3.1-8B-v2 | 42ea4b8d-98af-4c57-8b55-cef38c473fd5 | 0.0.1 | hfopenllm_v2/theprint_CleverBoi-Llama-3.1-8B-v2/1762652580.560884 | 1762652580.560884 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | theprint/CleverBoi-Llama-3.1-8B-v2 | theprint/CleverBoi-Llama-3.1-8B-v2 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.19613957632415324}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "?", "params_billions": 9.3} |
HF Open LLM v2 | meta | theprint/Code-Llama-Bagel-8B | 3a63b21d-0aaa-45d5-ae12-6d6c9777edbe | 0.0.1 | hfopenllm_v2/theprint_Code-Llama-Bagel-8B/1762652580.561388 | 1762652580.5613928 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | theprint/Code-Llama-Bagel-8B | theprint/Code-Llama-Bagel-8B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2529676813078188}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | theprint/ReWiz-Llama-3.1-8B-v2 | e57e6483-7e4c-4a64-8c58-890aafb38f37 | 0.0.1 | hfopenllm_v2/theprint_ReWiz-Llama-3.1-8B-v2/1762652580.5627892 | 1762652580.56279 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | theprint/ReWiz-Llama-3.1-8B-v2 | theprint/ReWiz-Llama-3.1-8B-v2 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.23790542427425895}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "?", "params_billions": 9.3} |
HF Open LLM v2 | meta | theprint/Llama-3.2-3B-VanRossum | 78e423de-2f66-4c53-8d07-8401802973ca | 0.0.1 | hfopenllm_v2/theprint_Llama-3.2-3B-VanRossum/1762652580.562204 | 1762652580.562206 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | theprint/Llama-3.2-3B-VanRossum | theprint/Llama-3.2-3B-VanRossum | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4782820693537591}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "?", "params_billions": 3.696} |
HF Open LLM v2 | meta | TencentARC/LLaMA-Pro-8B | 8d2c510b-a092-4e5d-b468-6e58501cad8a | 0.0.1 | hfopenllm_v2/TencentARC_LLaMA-Pro-8B/1762652579.912878 | 1762652579.912879 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | TencentARC/LLaMA-Pro-8B | TencentARC/LLaMA-Pro-8B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2277135777514772}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.357} |
HF Open LLM v2 | meta | dnhkng/RYS-Llama3.1-Large | ca04e634-81e6-49fb-bdc4-2ff0ef04b75f | 0.0.1 | hfopenllm_v2/dnhkng_RYS-Llama3.1-Large/1762652580.133179 | 1762652580.1331809 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | dnhkng/RYS-Llama3.1-Large | dnhkng/RYS-Llama3.1-Large | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.8492001223420524}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 81.677} |
HF Open LLM v2 | meta | Skywork/Skywork-o1-Open-Llama-3.1-8B | e98879cc-d7fd-4e97-ab86-0ca28265abeb | 0.0.1 | hfopenllm_v2/Skywork_Skywork-o1-Open-Llama-3.1-8B/1762652579.8887959 | 1762652579.888797 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Skywork/Skywork-o1-Open-Llama-3.1-8B | Skywork/Skywork-o1-Open-Llama-3.1-8B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3518364605912313}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | NAPS-ai/naps-llama3.1-70B-v0.2-fp16 | 16b6df0d-8e1b-4bec-b3f9-060273a4ad15 | 0.0.1 | hfopenllm_v2/NAPS-ai_naps-llama3.1-70B-v0.2-fp16/1762652579.7671611 | 1762652579.767162 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | NAPS-ai/naps-llama3.1-70B-v0.2-fp16 | NAPS-ai/naps-llama3.1-70B-v0.2-fp16 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.1844993506119319}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 70.761} |
HF Open LLM v2 | meta | NAPS-ai/naps-llama-3_1_instruct-v0.6.0 | 3378460d-d044-4c7e-ba9f-48cc94f0bc3f | 0.0.1 | hfopenllm_v2/NAPS-ai_naps-llama-3_1_instruct-v0.6.0/1762652579.766795 | 1762652579.766796 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | NAPS-ai/naps-llama-3_1_instruct-v0.6.0 | NAPS-ai/naps-llama-3_1_instruct-v0.6.0 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3280063564675062}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | flammenai/Mahou-1.2a-llama3-8B | eb10ecab-2be4-4b75-9b85-d2f2786fd095 | 0.0.1 | hfopenllm_v2/flammenai_Mahou-1.2a-llama3-8B/1762652580.154922 | 1762652580.154923 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | flammenai/Mahou-1.2a-llama3-8B | flammenai/Mahou-1.2a-llama3-8B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.50925655039739}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH",... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | flammenai/Mahou-1.5-llama3.1-70B | 653ff1ac-158e-4d36-a813-22ebef4a76ce | 0.0.1 | hfopenllm_v2/flammenai_Mahou-1.5-llama3.1-70B/1762652580.155493 | 1762652580.155494 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | flammenai/Mahou-1.5-llama3.1-70B | flammenai/Mahou-1.5-llama3.1-70B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7146615424850509}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 70.554} |
HF Open LLM v2 | meta | flammenai/Llama3.1-Flammades-70B | 92b8ecb7-80a2-4b77-bf20-8d87a36209c0 | 0.0.1 | hfopenllm_v2/flammenai_Llama3.1-Flammades-70B/1762652580.154665 | 1762652580.154666 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | flammenai/Llama3.1-Flammades-70B | flammenai/Llama3.1-Flammades-70B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7058438277104748}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 70.554} |
HF Open LLM v2 | meta | laislemke/LLaMA-2-vicuna-7b-slerp | 66d98c7d-7fd1-41bc-9229-855f9d02412d | 0.0.1 | hfopenllm_v2/laislemke_LLaMA-2-vicuna-7b-slerp/1762652580.311907 | 1762652580.311908 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | laislemke/LLaMA-2-vicuna-7b-slerp | laislemke/LLaMA-2-vicuna-7b-slerp | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.29320979445648654}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 6.738} |
HF Open LLM v2 | meta | tenyx/Llama3-TenyxChat-70B | 6fc094c0-ca29-4594-b086-2dae90195e8d | 0.0.1 | hfopenllm_v2/tenyx_Llama3-TenyxChat-70B/1762652580.5593112 | 1762652580.5593119 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | tenyx/Llama3-TenyxChat-70B | tenyx/Llama3-TenyxChat-70B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.8087086707713311}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 70.554} |
HF Open LLM v2 | meta | mobiuslabsgmbh/DeepSeek-R1-ReDistill-Llama3-8B-v1.1 | 09ec0c0c-d403-4f23-99a4-61196c70734d | 0.0.1 | hfopenllm_v2/mobiuslabsgmbh_DeepSeek-R1-ReDistill-Llama3-8B-v1.1/1762652580.371218 | 1762652580.371218 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | mobiuslabsgmbh/DeepSeek-R1-ReDistill-Llama3-8B-v1.1 | mobiuslabsgmbh/DeepSeek-R1-ReDistill-Llama3-8B-v1.1 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.370396104558128}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | asharsha30/LLAMA_Harsha_8_B_ORDP_10k | 61523c37-faee-4708-be49-4c7e31d760e6 | 0.0.1 | hfopenllm_v2/asharsha30_LLAMA_Harsha_8_B_ORDP_10k/1762652580.01895 | 1762652580.018951 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | asharsha30/LLAMA_Harsha_8_B_ORDP_10k | asharsha30/LLAMA_Harsha_8_B_ORDP_10k | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.34639090945358314}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | vhab10/llama-3-8b-merged-linear | deed0e49-b9fd-4623-bb90-3e885bec9bb0 | 0.0.1 | hfopenllm_v2/vhab10_llama-3-8b-merged-linear/1762652580.5860548 | 1762652580.5860548 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | vhab10/llama-3-8b-merged-linear | vhab10/llama-3-8b-merged-linear | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5916634529714491}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 4.65} |
HF Open LLM v2 | meta | gmonsoon/SahabatAI-Llama-11B-Test | 48f5e083-9fa3-4753-a734-578ac3e15e1f | 0.0.1 | hfopenllm_v2/gmonsoon_SahabatAI-Llama-11B-Test/1762652580.16498 | 1762652580.164981 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | gmonsoon/SahabatAI-Llama-11B-Test | gmonsoon/SahabatAI-Llama-11B-Test | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.33757319467900726}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 11.52} |
HF Open LLM v2 | meta | EpistemeAI2/Fireball-Alpaca-Llama3.1.03-8B-Philos | b77a4371-97d7-43a0-892f-a0c01c2b8528 | 0.0.1 | hfopenllm_v2/EpistemeAI2_Fireball-Alpaca-Llama3.1.03-8B-Philos/1762652579.6105568 | 1762652579.610558 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | EpistemeAI2/Fireball-Alpaca-Llama3.1.03-8B-Philos | EpistemeAI2/Fireball-Alpaca-Llama3.1.03-8B-Philos | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3880814017916905}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.0} |
HF Open LLM v2 | meta | EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-C-R1-KTO-Reflection | 42a38b08-6eb7-449d-99c5-cb0b2b76dd06 | 0.0.1 | hfopenllm_v2/EpistemeAI2_Fireball-Alpaca-Llama3.1.08-8B-C-R1-KTO-Reflection/1762652579.611454 | 1762652579.611454 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-C-R1-KTO-Reflection | EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-C-R1-KTO-Reflection | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.39522577871159636}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.0} |
HF Open LLM v2 | meta | EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R1 | 9ce9031b-76fd-4c33-b209-3011643d9266 | 0.0.1 | hfopenllm_v2/EpistemeAI2_Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R1/1762652579.611669 | 1762652579.61167 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R1 | EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R1 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5316382753316755}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.0} |
HF Open LLM v2 | meta | EpistemeAI2/Fireball-Alpaca-Llama3.1.01-8B-Philos | 536229bc-b1fb-4078-826c-074b09c362b9 | 0.0.1 | hfopenllm_v2/EpistemeAI2_Fireball-Alpaca-Llama3.1.01-8B-Philos/1762652579.610341 | 1762652579.610341 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | EpistemeAI2/Fireball-Alpaca-Llama3.1.01-8B-Philos | EpistemeAI2/Fireball-Alpaca-Llama3.1.01-8B-Philos | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.42117913802045237}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.0} |
HF Open LLM v2 | meta | EpistemeAI2/Fireball-Alpaca-Llama3.1-8B-Philos | 392ea212-afd9-44a3-a6bb-2bba8f124492 | 0.0.1 | hfopenllm_v2/EpistemeAI2_Fireball-Alpaca-Llama3.1-8B-Philos/1762652579.6100821 | 1762652579.610083 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | EpistemeAI2/Fireball-Alpaca-Llama3.1-8B-Philos | EpistemeAI2/Fireball-Alpaca-Llama3.1-8B-Philos | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.498640274471735}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.0} |
HF Open LLM v2 | meta | EpistemeAI2/Fireball-Alpaca-Llama3.1.04-8B-Philos | de05ec0d-805d-4aa5-8ec3-1dc7446e6c1a | 0.0.1 | hfopenllm_v2/EpistemeAI2_Fireball-Alpaca-Llama3.1.04-8B-Philos/1762652579.6107578 | 1762652579.610759 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | EpistemeAI2/Fireball-Alpaca-Llama3.1.04-8B-Philos | EpistemeAI2/Fireball-Alpaca-Llama3.1.04-8B-Philos | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.40843960690966635}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.0} |
HF Open LLM v2 | meta | EpistemeAI2/Fireball-Llama-3.1-8B-Philos-Reflection | 5ea20ab3-9d05-43f1-a276-7acbd2229fe8 | 0.0.1 | hfopenllm_v2/EpistemeAI2_Fireball-Llama-3.1-8B-Philos-Reflection/1762652579.6118872 | 1762652579.6118872 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | EpistemeAI2/Fireball-Llama-3.1-8B-Philos-Reflection | EpistemeAI2/Fireball-Llama-3.1-8B-Philos-Reflection | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3596047376516532}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.0} |
HF Open LLM v2 | meta | EpistemeAI2/Fireball-Alpaca-Llama3.1.07-8B-Philos-Math | 2790feab-6850-4d51-a3a1-78ada0c56d03 | 0.0.1 | hfopenllm_v2/EpistemeAI2_Fireball-Alpaca-Llama3.1.07-8B-Philos-Math/1762652579.611186 | 1762652579.611187 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | EpistemeAI2/Fireball-Alpaca-Llama3.1.07-8B-Philos-Math | EpistemeAI2/Fireball-Alpaca-Llama3.1.07-8B-Philos-Math | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5079079065767719}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.0} |
HF Open LLM v2 | meta | mukaj/Llama-3.1-Hawkish-8B | b94f468b-7c0e-491e-8404-de1bad7ff0f0 | 0.0.1 | hfopenllm_v2/mukaj_Llama-3.1-Hawkish-8B/1762652580.3748438 | 1762652580.374845 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | mukaj/Llama-3.1-Hawkish-8B | mukaj/Llama-3.1-Hawkish-8B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6720468357291984}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | OpenScholar/Llama-3.1_OpenScholar-8B | 1e6ea564-30ff-4db3-8bb6-070da34e3fb5 | 0.0.1 | hfopenllm_v2/OpenScholar_Llama-3.1_OpenScholar-8B/1762652579.807913 | 1762652579.807913 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | OpenScholar/Llama-3.1_OpenScholar-8B | OpenScholar/Llama-3.1_OpenScholar-8B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6064010159709571}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.0} |
HF Open LLM v2 | meta | sethuiyer/LlamaZero-3.1-8B-Experimental-1208 | abebe996-35e4-4fa6-a16c-0b33481d7357 | 0.0.1 | hfopenllm_v2/sethuiyer_LlamaZero-3.1-8B-Experimental-1208/1762652580.5134048 | 1762652580.513406 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sethuiyer/LlamaZero-3.1-8B-Experimental-1208 | sethuiyer/LlamaZero-3.1-8B-Experimental-1208 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6051022398347496}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | T145/Llama-3.1-8B-Zeus | e0889500-8f6e-496c-b275-ac110458c56d | 0.0.1 | hfopenllm_v2/T145_Llama-3.1-8B-Zeus/1762652579.900112 | 1762652579.9001129 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | T145/Llama-3.1-8B-Zeus | T145/Llama-3.1-8B-Zeus | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.35176110497923285}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | Xkev/Llama-3.2V-11B-cot | 55f777f4-460f-4b83-a309-7e9e9113fd55 | 0.0.1 | hfopenllm_v2/Xkev_Llama-3.2V-11B-cot/1762652579.9552681 | 1762652579.955269 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Xkev/Llama-3.2V-11B-cot | Xkev/Llama-3.2V-11B-cot | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.41580894249480266}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "MllamaForConditionalGeneration", "params_billions": 10.67} |
HF Open LLM v2 | meta | xinchen9/Llama3.1_8B_Instruct_CoT | eddb5bfc-d5ae-44bc-8ffd-b1d318b0e3d2 | 0.0.1 | hfopenllm_v2/xinchen9_Llama3.1_8B_Instruct_CoT/1762652580.5972009 | 1762652580.5972018 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | xinchen9/Llama3.1_8B_Instruct_CoT | xinchen9/Llama3.1_8B_Instruct_CoT | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2973565694579272}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | xinchen9/Llama3.1_CoT_V1 | 501bff5b-2809-4af7-9600-d6471167b701 | 0.0.1 | hfopenllm_v2/xinchen9_Llama3.1_CoT_V1/1762652580.597682 | 1762652580.597683 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | xinchen9/Llama3.1_CoT_V1 | xinchen9/Llama3.1_CoT_V1 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2452991396162183}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | xinchen9/Llama3.1_CoT | 4ccfc9fe-c222-490e-badd-bfeecc9ede91 | 0.0.1 | hfopenllm_v2/xinchen9_Llama3.1_CoT/1762652580.597471 | 1762652580.597472 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | xinchen9/Llama3.1_CoT | xinchen9/Llama3.1_CoT | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.22461624046419057}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | mlabonne/ChimeraLlama-3-8B-v2 | fd31a5f1-986e-4040-b04b-3018161e6e66 | 0.0.1 | hfopenllm_v2/mlabonne_ChimeraLlama-3-8B-v2/1762652580.3680582 | 1762652580.3680582 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | mlabonne/ChimeraLlama-3-8B-v2 | mlabonne/ChimeraLlama-3-8B-v2 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.44688315890725494}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
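Each row above stores its `evaluation_results` and `additional_details` columns as JSON-encoded strings (the viewer truncates `evaluation_results` with "..."). The sketch below shows one plausible way to unpack such a record in plain Python; the `record` literal is a hypothetical reconstruction built from the visible IFEval entry of the mlabonne/ChimeraLlama-3-8B-v2 row, and the `score_for` helper is illustrative, not part of any official tooling for this dataset.

```python
import json

# Hypothetical record shaped like the rows above. The dump truncates the
# full evaluation_results JSON, so only the IFEval entry (visible in the
# table) is reproduced here; the real records carry more benchmarks.
record = {
    "model_id": "mlabonne/ChimeraLlama-3-8B-v2",
    "evaluation_results": json.dumps([
        {
            "evaluation_name": "IFEval",
            "metric_config": {
                "evaluation_description": "Accuracy on IFEval",
                "lower_is_better": False,
                "score_type": "continuous",
                "min_score": 0,
                "max_score": 1,
            },
            "score_details": {"score": 0.44688315890725494},
        },
    ]),
    "additional_details": json.dumps({
        "precision": "float16",
        "architecture": "LlamaForCausalLM",
        "params_billions": 8.03,
    }),
}


def score_for(rec, evaluation_name):
    """Return the score for a named benchmark, or None if absent."""
    for result in json.loads(rec["evaluation_results"]):
        if result["evaluation_name"] == evaluation_name:
            return result["score_details"]["score"]
    return None


print(score_for(record, "IFEval"))  # 0.44688315890725494
details = json.loads(record["additional_details"])
print(details["params_billions"])   # 8.03
```

Keeping the nested results as JSON strings lets every row share one flat schema even though models are scored on different benchmark sets; the cost is the extra `json.loads` step shown above before any per-benchmark field can be read.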