| _leaderboard | _developer | _model | _uuid | schema_version | evaluation_id | retrieved_timestamp | source_data | evaluation_source_name | evaluation_source_type | source_organization_name | source_organization_url | source_organization_logo_url | evaluator_relationship | model_name | model_id | model_developer | model_inference_platform | evaluation_results | additional_details |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
HF Open LLM v2 | meta | duyhv1411/Llama-3.2-1B-en-vi | 000fcba9-c157-48de-b672-f583f4cd3881 | 0.0.1 | hfopenllm_v2/duyhv1411_Llama-3.2-1B-en-vi/1762652580.1364539 | 1762652580.1364548 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | duyhv1411/Llama-3.2-1B-en-vi | duyhv1411/Llama-3.2-1B-en-vi | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4788317220530415}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 1.236} |
HF Open LLM v2 | meta | duyhv1411/Llama-3.2-3B-en-vi | 31381b9d-77fe-491d-891c-de4fd37fa1cd | 0.0.1 | hfopenllm_v2/duyhv1411_Llama-3.2-3B-en-vi/1762652580.136725 | 1762652580.136726 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | duyhv1411/Llama-3.2-3B-en-vi | duyhv1411/Llama-3.2-3B-en-vi | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4852014876084345}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 1.236} |
HF Open LLM v2 | meta | trthminh1112/autotrain-llama32-1b-finetune | cad93026-baf2-47ef-a554-4d0ba0d5a946 | 0.0.1 | hfopenllm_v2/trthminh1112_autotrain-llama32-1b-finetune/1762652580.577601 | 1762652580.5776021 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | trthminh1112/autotrain-llama32-1b-finetune | trthminh1112/autotrain-llama32-1b-finetune | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.17685518867715438}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 1.1} |
HF Open LLM v2 | meta | DeepMount00/Llama-3-8b-Ita | bee65c80-73f2-46e5-9532-8f92b38c4fc5 | 0.0.1 | hfopenllm_v2/DeepMount00_Llama-3-8b-Ita/1762652579.551231 | 1762652579.551231 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | DeepMount00/Llama-3-8b-Ita | DeepMount00/Llama-3-8b-Ita | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7530297388706411}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | DeepMount00/Llama-3.1-Distilled | 6424a285-b3dc-4221-b3ba-5e7922185269 | 0.0.1 | hfopenllm_v2/DeepMount00_Llama-3.1-Distilled/1762652579.551904 | 1762652579.551905 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | DeepMount00/Llama-3.1-Distilled | DeepMount00/Llama-3.1-Distilled | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7843787816327346}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | DeepMount00/Llama-3.1-8b-Ita | ca297bdd-d804-4c43-bb6e-0b7e230974e2 | 0.0.1 | hfopenllm_v2/DeepMount00_Llama-3.1-8b-Ita/1762652579.551703 | 1762652579.5517042 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | DeepMount00/Llama-3.1-8b-Ita | DeepMount00/Llama-3.1-8b-Ita | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5364843060856306}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Unknown", "params_billions": 0.0} |
HF Open LLM v2 | meta | DeepMount00/Llama-3.1-8b-ITA | 1c5ce85b-84f3-4ac4-8a98-9d80659bff18 | 0.0.1 | hfopenllm_v2/DeepMount00_Llama-3.1-8b-ITA/1762652579.5514839 | 1762652579.5514848 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | DeepMount00/Llama-3.1-8b-ITA | DeepMount00/Llama-3.1-8b-ITA | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7916727616058724}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | khulaifi95/Llama-3.1-8B-Reason-Blend-888k | 85a2710f-feaf-4dc2-aafa-04c33abf6425 | 0.0.1 | hfopenllm_v2/khulaifi95_Llama-3.1-8B-Reason-Blend-888k/1762652580.309421 | 1762652580.309421 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | khulaifi95/Llama-3.1-8B-Reason-Blend-888k | khulaifi95/Llama-3.1-8B-Reason-Blend-888k | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.583170432230925}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | huggyllama/llama-65b | 2bff16e4-f0ed-4957-8b20-4ae269642088 | 0.0.1 | hfopenllm_v2/huggyllama_llama-65b/1762652580.1999428 | 1762652580.199944 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | huggyllama/llama-65b | huggyllama/llama-65b | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.25259311958935626}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 65.286} |
HF Open LLM v2 | meta | huggyllama/llama-13b | 20b49499-5df3-450c-a20d-dc421b937e91 | 0.0.1 | hfopenllm_v2/huggyllama_llama-13b/1762652580.199647 | 1762652580.199648 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | huggyllama/llama-13b | huggyllama/llama-13b | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.24105262924595627}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 13.016} |
HF Open LLM v2 | meta | huggyllama/llama-7b | 61a5624d-ef42-4fdd-a0b1-08fdc2d07615 | 0.0.1 | hfopenllm_v2/huggyllama_llama-7b/1762652580.200164 | 1762652580.200165 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | huggyllama/llama-7b | huggyllama/llama-7b | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.25009530268576263}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 6.738} |
HF Open LLM v2 | meta | hotmailuser/LlamaStock-8B | 23b559eb-4493-462f-bb37-5e232b3336bc | 0.0.1 | hfopenllm_v2/hotmailuser_LlamaStock-8B/1762652580.19518 | 1762652580.19518 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | hotmailuser/LlamaStock-8B | hotmailuser/LlamaStock-8B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4249513513034304}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | hotmailuser/Llama-Hermes-slerp2-8B | be5505d7-06ae-4ab5-ba7f-6ff4732b3180 | 0.0.1 | hfopenllm_v2/hotmailuser_Llama-Hermes-slerp2-8B/1762652580.194975 | 1762652580.194976 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | hotmailuser/Llama-Hermes-slerp2-8B | hotmailuser/Llama-Hermes-slerp2-8B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3728440537773109}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | hotmailuser/Llama-Hermes-slerp-8B | cf2de222-77bf-456c-acb3-c3aa33367a9d | 0.0.1 | hfopenllm_v2/hotmailuser_Llama-Hermes-slerp-8B/1762652580.1947231 | 1762652580.194724 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | hotmailuser/Llama-Hermes-slerp-8B | hotmailuser/Llama-Hermes-slerp-8B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3390470617960345}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | sequelbox/Llama3.1-8B-MOTH | 3a820ba4-bdd8-4caf-a90a-d7e9fee52997 | 0.0.1 | hfopenllm_v2/sequelbox_Llama3.1-8B-MOTH/1762652580.511786 | 1762652580.511787 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sequelbox/Llama3.1-8B-MOTH | sequelbox/Llama3.1-8B-MOTH | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5244938984117696}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | sequelbox/Llama3.1-8B-PlumMath | 4734bf79-d464-43b4-8df3-1937f7c37796 | 0.0.1 | hfopenllm_v2/sequelbox_Llama3.1-8B-PlumMath/1762652580.512456 | 1762652580.512456 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sequelbox/Llama3.1-8B-PlumMath | sequelbox/Llama3.1-8B-PlumMath | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.224241678745728}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | sequelbox/Llama3.1-8B-PlumCode | 2695c341-eabe-4809-9b87-9e771e1ee9d6 | 0.0.1 | hfopenllm_v2/sequelbox_Llama3.1-8B-PlumCode/1762652580.512235 | 1762652580.512235 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sequelbox/Llama3.1-8B-PlumCode | sequelbox/Llama3.1-8B-PlumCode | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.20448299401144518}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | sequelbox/Llama3.1-70B-PlumChat | ab796471-db79-40a2-8147-72ed7099b355 | 0.0.1 | hfopenllm_v2/sequelbox_Llama3.1-70B-PlumChat/1762652580.5115242 | 1762652580.5115242 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sequelbox/Llama3.1-70B-PlumChat | sequelbox/Llama3.1-70B-PlumChat | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5616131863455631}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 70.554} |
HF Open LLM v2 | meta | sequelbox/Llama3.1-8B-PlumChat | 32f38aeb-615c-4785-a674-bd8a50eb1057 | 0.0.1 | hfopenllm_v2/sequelbox_Llama3.1-8B-PlumChat/1762652580.512009 | 1762652580.51201 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sequelbox/Llama3.1-8B-PlumChat | sequelbox/Llama3.1-8B-PlumChat | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.42427647530773904}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | Solshine/Llama-3-1-big-thoughtful-passthrough-merge-2 | b36e0fba-9fa1-4e74-9d26-b4889343f113 | 0.0.1 | hfopenllm_v2/Solshine_Llama-3-1-big-thoughtful-passthrough-merge-2/1762652579.889379 | 1762652579.88938 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Solshine/Llama-3-1-big-thoughtful-passthrough-merge-2 | Solshine/Llama-3-1-big-thoughtful-passthrough-merge-2 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.25466650709007654}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 18.5} |
HF Open LLM v2 | meta | allknowingroger/Llama3.1-60B | 21684c0e-c9b7-4375-bf05-cf63e9bd19b4 | 0.0.1 | hfopenllm_v2/allknowingroger_Llama3.1-60B/1762652579.989347 | 1762652579.9893482 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | allknowingroger/Llama3.1-60B | allknowingroger/Llama3.1-60B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.18145188100905596}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 61.997} |
HF Open LLM v2 | meta | allknowingroger/llama3-Jallabi-40B-s | d46307f8-774b-4871-a32a-6c5a9cc6b1b8 | 0.0.1 | hfopenllm_v2/allknowingroger_llama3-Jallabi-40B-s/1762652580.006197 | 1762652580.006198 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | allknowingroger/llama3-Jallabi-40B-s | allknowingroger/llama3-Jallabi-40B-s | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.19206815693471102}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 18.769} |
HF Open LLM v2 | meta | allknowingroger/llama3AnFeng-40B | dc25bda9-966c-44f8-991b-ad891d59befe | 0.0.1 | hfopenllm_v2/allknowingroger_llama3AnFeng-40B/1762652580.006448 | 1762652580.006449 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | allknowingroger/llama3AnFeng-40B | allknowingroger/llama3AnFeng-40B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.17420776872032873}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 39.971} |
HF Open LLM v2 | meta | allknowingroger/Yillama-40B | ab5ef6c9-76de-470e-b524-497036db94d4 | 0.0.1 | hfopenllm_v2/allknowingroger_Yillama-40B/1762652580.004728 | 1762652580.004729 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | allknowingroger/Yillama-40B | allknowingroger/Yillama-40B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.16968643200042555}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 34.389} |
HF Open LLM v2 | meta | uukuguy/speechless-llama2-hermes-orca-platypus-wizardlm-13b | e9556ee4-63e8-4e0b-88df-62cc6c62c65a | 0.0.1 | hfopenllm_v2/uukuguy_speechless-llama2-hermes-orca-platypus-wizardlm-13b/1762652580.5833302 | 1762652580.583331 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | uukuguy/speechless-llama2-hermes-orca-platypus-wizardlm-13b | uukuguy/speechless-llama2-hermes-orca-platypus-wizardlm-13b | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.45617517076911485}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 13.016} |
HF Open LLM v2 | meta | uukuguy/speechless-codellama-34b-v2.0 | ddcf1dc2-5281-4d14-b870-7ed2fa44c8d0 | 0.0.1 | hfopenllm_v2/uukuguy_speechless-codellama-34b-v2.0/1762652580.5824919 | 1762652580.5824928 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | uukuguy/speechless-codellama-34b-v2.0 | uukuguy/speechless-codellama-34b-v2.0 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.46042168113937687}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 34.0} |
HF Open LLM v2 | meta | OEvortex/Emotional-llama-8B | c2593003-ca2a-4699-8473-a07683e7cd85 | 0.0.1 | hfopenllm_v2/OEvortex_Emotional-llama-8B/1762652579.797152 | 1762652579.797153 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | OEvortex/Emotional-llama-8B | OEvortex/Emotional-llama-8B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3516369898535885}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | Yuma42/Llama3.1-SuperHawk-8B | 458dd163-075e-48ca-bb3b-650912f55696 | 0.0.1 | hfopenllm_v2/Yuma42_Llama3.1-SuperHawk-8B/1762652579.965369 | 1762652579.9653702 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Yuma42/Llama3.1-SuperHawk-8B | Yuma42/Llama3.1-SuperHawk-8B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7986420475449585}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | Yuma42/Llama3.1-IgneousIguana-8B | cd2f97bc-3f4d-43f2-b100-09eec8d122a6 | 0.0.1 | hfopenllm_v2/Yuma42_Llama3.1-IgneousIguana-8B/1762652579.965119 | 1762652579.965119 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Yuma42/Llama3.1-IgneousIguana-8B | Yuma42/Llama3.1-IgneousIguana-8B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.8133297428600558}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | sabersaleh/Llama2-7B-SimPO | a530f116-e413-4d73-8d1f-2f44fcc0c6a9 | 0.0.1 | hfopenllm_v2/sabersaleh_Llama2-7B-SimPO/1762652580.504319 | 1762652580.50432 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sabersaleh/Llama2-7B-SimPO | sabersaleh/Llama2-7B-SimPO | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.1658643510330368}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 7.0} |
HF Open LLM v2 | meta | sabersaleh/Llama2-7B-KTO | 0744b5c6-e109-4ccb-acc9-955106ef5562 | 0.0.1 | hfopenllm_v2/sabersaleh_Llama2-7B-KTO/1762652580.503802 | 1762652580.5038028 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sabersaleh/Llama2-7B-KTO | sabersaleh/Llama2-7B-KTO | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.15284999357260956}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 7.0} |
HF Open LLM v2 | meta | sabersaleh/Llama2-7B-IPO | 14deb011-b6ce-47c7-b855-c7ebcc291121 | 0.0.1 | hfopenllm_v2/sabersaleh_Llama2-7B-IPO/1762652580.503558 | 1762652580.5035589 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sabersaleh/Llama2-7B-IPO | sabersaleh/Llama2-7B-IPO | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.17685518867715438}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 7.0} |
HF Open LLM v2 | meta | sabersaleh/Llama2-7B-CPO | 2ecc5d1d-edb7-4713-9bde-f83ab4736690 | 0.0.1 | hfopenllm_v2/sabersaleh_Llama2-7B-CPO/1762652580.502833 | 1762652580.502836 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sabersaleh/Llama2-7B-CPO | sabersaleh/Llama2-7B-CPO | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.1545488193548673}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 7.0} |
HF Open LLM v2 | meta | sabersaleh/Llama3 | 286860d2-7f43-4488-9d43-9058fe59b248 | 0.0.1 | hfopenllm_v2/sabersaleh_Llama3/1762652580.504582 | 1762652580.504583 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sabersaleh/Llama3 | sabersaleh/Llama3 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3320777758569484}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | sabersaleh/Llama2-7B-SPO | cfbdbc52-d846-48e7-bad4-f6240f1d2551 | 0.0.1 | hfopenllm_v2/sabersaleh_Llama2-7B-SPO/1762652580.504033 | 1762652580.504034 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sabersaleh/Llama2-7B-SPO | sabersaleh/Llama2-7B-SPO | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.15667207453999832}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 7.0} |
HF Open LLM v2 | meta | winglian/Llama-3-8b-64k-PoSE | 76bbd348-21b9-4253-8085-d8c4eb0932f6 | 0.0.1 | hfopenllm_v2/winglian_Llama-3-8b-64k-PoSE/1762652580.595902 | 1762652580.595903 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | winglian/Llama-3-8b-64k-PoSE | winglian/Llama-3-8b-64k-PoSE | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.28569085581811815}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | winglian/llama-3-8b-256k-PoSE | 5077856e-f85c-4395-8be9-e3e9bf3655cb | 0.0.1 | hfopenllm_v2/winglian_llama-3-8b-256k-PoSE/1762652580.5961442 | 1762652580.596145 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | winglian/llama-3-8b-256k-PoSE | winglian/llama-3-8b-256k-PoSE | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2909114482905358}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | iFaz/llama32_1B_en_emo_v1 | f202b553-56e6-4a27-b2fa-0f98feabe11e | 0.0.1 | hfopenllm_v2/iFaz_llama32_1B_en_emo_v1/1762652580.2027268 | 1762652580.2027268 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | iFaz/llama32_1B_en_emo_v1 | iFaz/llama32_1B_en_emo_v1 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.44083808738591385}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 0.765} |
HF Open LLM v2 | meta | iFaz/llama32_3B_en_emo_v3 | 8bb5540b-b19d-4641-9dea-36ea43b07250 | 0.0.1 | hfopenllm_v2/iFaz_llama32_3B_en_emo_v3/1762652580.203954 | 1762652580.203954 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | iFaz/llama32_3B_en_emo_v3 | iFaz/llama32_3B_en_emo_v3 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5759263199421978}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 1.848} |
HF Open LLM v2 | meta | iFaz/llama32_3B_en_emo_2000_stp | 5468fbdc-63e7-4e9d-8370-2f3f0e83e559 | 0.0.1 | hfopenllm_v2/iFaz_llama32_3B_en_emo_2000_stp/1762652580.203131 | 1762652580.203132 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | iFaz/llama32_3B_en_emo_2000_stp | iFaz/llama32_3B_en_emo_2000_stp | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7368681764385165}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 1.848} |
HF Open LLM v2 | meta | iFaz/llama32_3B_en_emo_5000_stp | 9ffc9dbb-065b-47ae-a985-541ee7f7126d | 0.0.1 | hfopenllm_v2/iFaz_llama32_3B_en_emo_5000_stp/1762652580.203531 | 1762652580.203532 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | iFaz/llama32_3B_en_emo_5000_stp | iFaz/llama32_3B_en_emo_5000_stp | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7100404703963262}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 1.848} |
HF Open LLM v2 | meta | iFaz/llama31_8B_en_emo_v4 | 198e5d81-0dcd-4dc0-9919-139ce0aa2dd5 | 0.0.1 | hfopenllm_v2/iFaz_llama31_8B_en_emo_v4/1762652580.202469 | 1762652580.202469 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | iFaz/llama31_8B_en_emo_v4 | iFaz/llama31_8B_en_emo_v4 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3042504997850149}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "", "params_billions": 4.777} |
HF Open LLM v2 | meta | iFaz/llama32_3B_en_emo_v2 | 03587c1e-14e3-434f-9582-448914832c95 | 0.0.1 | hfopenllm_v2/iFaz_llama32_3B_en_emo_v2/1762652580.203742 | 1762652580.203743 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | iFaz/llama32_3B_en_emo_v2 | iFaz/llama32_3B_en_emo_v2 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5454017562290279}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 1.848} |
HF Open LLM v2 | meta | iFaz/llama32_3B_en_emo_1000_stp | a4111230-4313-4f75-bcd3-c598e436987b | 0.0.1 | hfopenllm_v2/iFaz_llama32_3B_en_emo_1000_stp/1762652580.202935 | 1762652580.2029362 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | iFaz/llama32_3B_en_emo_1000_stp | iFaz/llama32_3B_en_emo_1000_stp | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7295243287809678}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 1.848} |
HF Open LLM v2 | meta | iFaz/llama32_3B_en_emo_300_stp | 0806c872-f913-493a-ada4-7db88a93b840 | 0.0.1 | hfopenllm_v2/iFaz_llama32_3B_en_emo_300_stp/1762652580.203331 | 1762652580.203331 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | iFaz/llama32_3B_en_emo_300_stp | iFaz/llama32_3B_en_emo_300_stp | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.725552644760347}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 1.848} |
HF Open LLM v2 | meta | Lyte/Llama-3.2-3B-Overthinker | d997330d-6679-4d63-839c-677694ea4abc | 0.0.1 | hfopenllm_v2/Lyte_Llama-3.2-3B-Overthinker/1762652579.741945 | 1762652579.7419462 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Lyte/Llama-3.2-3B-Overthinker | Lyte/Llama-3.2-3B-Overthinker | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6407975283359264}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 3.213} |
HF Open LLM v2 | meta | Josephgflowers/TinyLlama-Cinder-Agent-v1 | 00332c0d-d698-4ecd-9c2d-5f56921709d5 | 0.0.1 | hfopenllm_v2/Josephgflowers_TinyLlama-Cinder-Agent-v1/1762652579.695456 | 1762652579.695457 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Josephgflowers/TinyLlama-Cinder-Agent-v1 | Josephgflowers/TinyLlama-Cinder-Agent-v1 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.26695612087040166}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 1.1} |
HF Open LLM v2 | meta | Josephgflowers/Tinyllama-r1 | 4293bc9f-4968-4af9-acd2-0ada64be43d4 | 0.0.1 | hfopenllm_v2/Josephgflowers_Tinyllama-r1/1762652579.6965919 | 1762652579.6965928 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Josephgflowers/Tinyllama-r1 | Josephgflowers/Tinyllama-r1 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2119265770378152}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 1.1} |
HF Open LLM v2 | meta | Josephgflowers/TinyLlama-v1.1-Cinders-World | 2b993039-8980-4578-a9e2-a22a39385664 | 0.0.1 | hfopenllm_v2/Josephgflowers_TinyLlama-v1.1-Cinders-World/1762652579.6958752 | 1762652579.6958761 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Josephgflowers/TinyLlama-v1.1-Cinders-World | Josephgflowers/TinyLlama-v1.1-Cinders-World | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.24692260978647768}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 1.1} |
HF Open LLM v2 | meta | Josephgflowers/Differential-Attention-Liquid-Metal-Tinyllama | 670580f3-ca8a-473d-a3df-8c01952bda00 | 0.0.1 | hfopenllm_v2/Josephgflowers_Differential-Attention-Liquid-Metal-Tinyllama/1762652579.695199 | 1762652579.6952 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Josephgflowers/Differential-Attention-Liquid-Metal-Tinyllama | Josephgflowers/Differential-Attention-Liquid-Metal-Tinyllama | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.22269245601670234}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 1.1} |
HF Open LLM v2 | meta | Josephgflowers/Tinyllama-STEM-Cinder-Agent-v1 | 0c22748e-74ad-4bac-a714-c64a19a88af7 | 0.0.1 | hfopenllm_v2/Josephgflowers_Tinyllama-STEM-Cinder-Agent-v1/1762652579.696357 | 1762652579.696357 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Josephgflowers/Tinyllama-STEM-Cinder-Agent-v1 | Josephgflowers/Tinyllama-STEM-Cinder-Agent-v1 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.21257596510591897}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 1.1} |
HF Open LLM v2 | meta | Josephgflowers/TinyLlama_v1.1_math_code-world-test-1 | 72cf7999-e4cb-4987-a694-cdcfae37bb02 | 0.0.1 | hfopenllm_v2/Josephgflowers_TinyLlama_v1.1_math_code-world-test-1/1762652579.696125 | 1762652579.696125 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Josephgflowers/TinyLlama_v1.1_math_code-world-test-1 | Josephgflowers/TinyLlama_v1.1_math_code-world-test-1 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.00784363267242029}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 1.1} |
HF Open LLM v2 | meta | DeepAutoAI/ldm_soup_Llama-3.1-8B-Inst | a4da2ab3-adb3-405f-9bb7-2164d740d424 | 0.0.1 | hfopenllm_v2/DeepAutoAI_ldm_soup_Llama-3.1-8B-Inst/1762652579.5498 | 1762652579.5498009 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | DeepAutoAI/ldm_soup_Llama-3.1-8B-Inst | DeepAutoAI/ldm_soup_Llama-3.1-8B-Inst | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.803263119633683}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | DeepAutoAI/Explore_Llama-3.2-1B-Inst_v1 | 74f0ecd4-e04a-4775-9551-fc0e9fa40314 | 0.0.1 | hfopenllm_v2/DeepAutoAI_Explore_Llama-3.2-1B-Inst_v1/1762652579.548037 | 1762652579.548039 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | DeepAutoAI/Explore_Llama-3.2-1B-Inst_v1 | DeepAutoAI/Explore_Llama-3.2-1B-Inst_v1 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4998891829235318}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 1.236} |
HF Open LLM v2 | meta | DeepAutoAI/Explore_Llama-3.2-1B-Inst_v1.1 | 40bc60f8-aa35-460b-a7af-b4cccd138c80 | 0.0.1 | hfopenllm_v2/DeepAutoAI_Explore_Llama-3.2-1B-Inst_v1.1/1762652579.5483131 | 1762652579.548314 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | DeepAutoAI/Explore_Llama-3.2-1B-Inst_v1.1 | DeepAutoAI/Explore_Llama-3.2-1B-Inst_v1.1 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5844193406827218}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 1.236} |
HF Open LLM v2 | meta | DeepAutoAI/Explore_Llama-3.1-8B-Inst | 0da22342-b4ef-4dd2-b4f5-327710986701 | 0.0.1 | hfopenllm_v2/DeepAutoAI_Explore_Llama-3.1-8B-Inst/1762652579.547036 | 1762652579.5470378 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | DeepAutoAI/Explore_Llama-3.1-8B-Inst | DeepAutoAI/Explore_Llama-3.1-8B-Inst | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7794828831943688}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | DeepAutoAI/Explore_Llama-3.2-1B-Inst | f8e00446-f253-4ff3-a9ff-ef182cf9e147 | 0.0.1 | hfopenllm_v2/DeepAutoAI_Explore_Llama-3.2-1B-Inst/1762652579.5474088 | 1762652579.547411 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | DeepAutoAI/Explore_Llama-3.2-1B-Inst | DeepAutoAI/Explore_Llama-3.2-1B-Inst | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5648856146136695}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 1.236} |
HF Open LLM v2 | meta | DeepAutoAI/Explore_Llama-3.2-1B-Inst_v0 | 455764e4-7b66-4189-b2e8-907047a92d45 | 0.0.1 | hfopenllm_v2/DeepAutoAI_Explore_Llama-3.2-1B-Inst_v0/1762652579.547727 | 1762652579.5477278 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | DeepAutoAI/Explore_Llama-3.2-1B-Inst_v0 | DeepAutoAI/Explore_Llama-3.2-1B-Inst_v0 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5597148898256625}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 1.236} |
HF Open LLM v2 | meta | RLHFlow/ArmoRM-Llama3-8B-v0.1 | b8ce63dd-5c8a-4bba-b381-147efcdcc161 | 0.0.1 | hfopenllm_v2/RLHFlow_ArmoRM-Llama3-8B-v0.1/1762652579.8493571 | 1762652579.8493571 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | RLHFlow/ArmoRM-Llama3-8B-v0.1 | RLHFlow/ArmoRM-Llama3-8B-v0.1 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.18967007539993883}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForRewardModelWithGating", "params_billions": 7.511} |
HF Open LLM v2 | meta | Sakalti/Llama3.2-3B-Uranus-1 | aba2e376-936d-4960-a82b-da09d2266826 | 0.0.1 | hfopenllm_v2/Sakalti_Llama3.2-3B-Uranus-1/1762652579.8570151 | 1762652579.857016 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Sakalti/Llama3.2-3B-Uranus-1 | Sakalti/Llama3.2-3B-Uranus-1 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5335365718515761}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 3.213} |
HF Open LLM v2 | meta | SicariusSicariiStuff/Impish_LLAMA_3B | 9235cd92-5335-498e-881f-21938da4ed61 | 0.0.1 | hfopenllm_v2/SicariusSicariiStuff_Impish_LLAMA_3B/1762652579.882116 | 1762652579.882117 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | SicariusSicariiStuff/Impish_LLAMA_3B | SicariusSicariiStuff/Impish_LLAMA_3B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.46299485365496884}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 3.213} |
HF Open LLM v2 | meta | SicariusSicariiStuff/LLAMA-3_8B_Unaligned_BETA | 27e6623c-49b2-4763-ac6f-b35f1f9002a8 | 0.0.1 | hfopenllm_v2/SicariusSicariiStuff_LLAMA-3_8B_Unaligned_BETA/1762652579.883067 | 1762652579.883067 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | SicariusSicariiStuff/LLAMA-3_8B_Unaligned_BETA | SicariusSicariiStuff/LLAMA-3_8B_Unaligned_BETA | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3713203189758729}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | HoangHa/Pensez-Llama3.1-8B | d27e73c5-654c-48c6-ad60-652a60bda72c | 0.0.1 | hfopenllm_v2/HoangHa_Pensez-Llama3.1-8B/1762652579.640512 | 1762652579.640512 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | HoangHa/Pensez-Llama3.1-8B | HoangHa/Pensez-Llama3.1-8B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3886809221753835}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | bunnycore/Llama-3.2-3B-Deep-Test | 76edae8d-f4d3-41b2-8a24-cc676feed31c | 0.0.1 | hfopenllm_v2/bunnycore_Llama-3.2-3B-Deep-Test/1762652580.046704 | 1762652580.046706 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Llama-3.2-3B-Deep-Test | bunnycore/Llama-3.2-3B-Deep-Test | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.46516797652451053}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 3.607} |
HF Open LLM v2 | meta | bunnycore/Llama-3.2-3B-Deep-Test | f150ea9d-0e4a-49c7-aa12-a703ca011755 | 0.0.1 | hfopenllm_v2/bunnycore_Llama-3.2-3B-Deep-Test/1762652580.046481 | 1762652580.046481 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Llama-3.2-3B-Deep-Test | bunnycore/Llama-3.2-3B-Deep-Test | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.17753006467284582}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 1.803} |
HF Open LLM v2 | meta | bunnycore/HyperLlama-3.1-8B | 7d031f11-6623-40c0-96bd-b3f0c135600b | 0.0.1 | hfopenllm_v2/bunnycore_HyperLlama-3.1-8B/1762652580.045207 | 1762652580.045208 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/HyperLlama-3.1-8B | bunnycore/HyperLlama-3.1-8B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7883005979689446}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | bunnycore/Llama-3.2-3b-RP-Toxic-Fuse | 4c2bc39c-2d04-4afd-a94d-bc8f59e75755 | 0.0.1 | hfopenllm_v2/bunnycore_Llama-3.2-3b-RP-Toxic-Fuse/1762652580.048726 | 1762652580.048727 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Llama-3.2-3b-RP-Toxic-Fuse | bunnycore/Llama-3.2-3b-RP-Toxic-Fuse | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.683362367407368}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 3.213} |
HF Open LLM v2 | meta | bunnycore/Llama-3.2-3B-Mix-Skill | 7a6d897c-0efe-4c18-808c-25f6b9a78b5d | 0.0.1 | hfopenllm_v2/bunnycore_Llama-3.2-3B-Mix-Skill/1762652580.047411 | 1762652580.047412 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Llama-3.2-3B-Mix-Skill | bunnycore/Llama-3.2-3B-Mix-Skill | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6404229666174639}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 3.607} |
HF Open LLM v2 | meta | bunnycore/Llama-3.2-3B-Long-Think | bf24dc90-551e-4e0d-8525-9b3b8c4ccfe1 | 0.0.1 | hfopenllm_v2/bunnycore_Llama-3.2-3B-Long-Think/1762652580.047193 | 1762652580.047194 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Llama-3.2-3B-Long-Think | bunnycore/Llama-3.2-3B-Long-Think | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5473499204333391}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 3.213} |
HF Open LLM v2 | meta | bunnycore/Llama-3.2-3B-ProdigyPlus | 0ef3d0a9-a3e9-4b33-bece-bd7eec82514d | 0.0.1 | hfopenllm_v2/bunnycore_Llama-3.2-3B-ProdigyPlus/1762652580.047628 | 1762652580.047629 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Llama-3.2-3B-ProdigyPlus | bunnycore/Llama-3.2-3B-ProdigyPlus | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.40152018865499095}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 3.607} |
HF Open LLM v2 | meta | bunnycore/Llama-3.2-3B-Booval | 9cb855b6-e141-492a-99fb-98858d76f66c | 0.0.1 | hfopenllm_v2/bunnycore_Llama-3.2-3B-Booval/1762652580.046278 | 1762652580.046279 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Llama-3.2-3B-Booval | bunnycore/Llama-3.2-3B-Booval | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6669259786256023}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 3.213} |
HF Open LLM v2 | meta | bunnycore/Llama-3.1-8B-TitanFusion-v3 | 6ee91c1c-b44e-44a9-b4b2-4e3cbeb594d3 | 0.0.1 | hfopenllm_v2/bunnycore_Llama-3.1-8B-TitanFusion-v3/1762652580.045624 | 1762652580.045625 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Llama-3.1-8B-TitanFusion-v3 | bunnycore/Llama-3.1-8B-TitanFusion-v3 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4809549772381725}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | bunnycore/Llama-3.1-8B-TitanFusion-Mix | 5b0421b6-04ff-422c-a13e-9649306959d4 | 0.0.1 | hfopenllm_v2/bunnycore_Llama-3.1-8B-TitanFusion-Mix/1762652580.045413 | 1762652580.045414 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Llama-3.1-8B-TitanFusion-Mix | bunnycore/Llama-3.1-8B-TitanFusion-Mix | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4924954675815725}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | bunnycore/Llama-3.2-3B-Della | 8c23bcaf-2753-4f60-85ec-e92a48b8bba3 | 0.0.1 | hfopenllm_v2/bunnycore_Llama-3.2-3B-Della/1762652580.0469692 | 1762652580.0469701 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Llama-3.2-3B-Della | bunnycore/Llama-3.2-3B-Della | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.35608297096149333}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 3.607} |
HF Open LLM v2 | meta | bunnycore/Best-Mix-Llama-3.1-8B | ee1e13fe-2ec6-4ce8-8d32-1fe011b12ef8 | 0.0.1 | hfopenllm_v2/bunnycore_Best-Mix-Llama-3.1-8B/1762652580.0419252 | 1762652580.041926 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Best-Mix-Llama-3.1-8B | bunnycore/Best-Mix-Llama-3.1-8B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.20670598456539757}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | bunnycore/Llama-3.2-3B-Bespoke-Thought | b43702d0-eef7-42d8-87b9-c1cbd0edb417 | 0.0.1 | hfopenllm_v2/bunnycore_Llama-3.2-3B-Bespoke-Thought/1762652580.046056 | 1762652580.046057 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Llama-3.2-3B-Bespoke-Thought | bunnycore/Llama-3.2-3B-Bespoke-Thought | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4112621178473118}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 3.213} |
HF Open LLM v2 | meta | bunnycore/Llama-3.2-3B-All-Mix | 60766e3b-e153-4ee8-8615-1c1e68b7cd75 | 0.0.1 | hfopenllm_v2/bunnycore_Llama-3.2-3B-All-Mix/1762652580.045842 | 1762652580.045843 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Llama-3.2-3B-All-Mix | bunnycore/Llama-3.2-3B-All-Mix | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7226049105262924}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 3.607} |
HF Open LLM v2 | meta | bunnycore/Llama-3.2-3B-ProdigyPlusPlus | 485d4a25-810a-4022-828b-15c255fa2004 | 0.0.1 | hfopenllm_v2/bunnycore_Llama-3.2-3B-ProdigyPlusPlus/1762652580.047838 | 1762652580.047839 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Llama-3.2-3B-ProdigyPlusPlus | bunnycore/Llama-3.2-3B-ProdigyPlusPlus | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.1645157072124186}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 3.607} |
HF Open LLM v2 | meta | bunnycore/Smol-Llama-3.2-3B | eed01a32-3282-40c9-9a6c-9a0eae79fc8e | 0.0.1 | hfopenllm_v2/bunnycore_Smol-Llama-3.2-3B/1762652580.061756 | 1762652580.0617611 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Smol-Llama-3.2-3B | bunnycore/Smol-Llama-3.2-3B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6678501930433471}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 3.607} |
HF Open LLM v2 | meta | bunnycore/Llama-3.2-3B-ToxicKod | d59a73eb-0aee-49f8-abce-6500f1fae79d | 0.0.1 | hfopenllm_v2/bunnycore_Llama-3.2-3B-ToxicKod/1762652580.0485172 | 1762652580.048518 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Llama-3.2-3B-ToxicKod | bunnycore/Llama-3.2-3B-ToxicKod | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6319299458769398}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 3.213} |
HF Open LLM v2 | meta | bunnycore/Llama-3.2-3B-RP-DeepThink | d24cf761-7c11-4f9b-9e41-ca24ac1225b9 | 0.0.1 | hfopenllm_v2/bunnycore_Llama-3.2-3B-RP-DeepThink/1762652580.048058 | 1762652580.048059 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Llama-3.2-3B-RP-DeepThink | bunnycore/Llama-3.2-3B-RP-DeepThink | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7143867161354096}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 3.607} |
HF Open LLM v2 | meta | bunnycore/Llama-3.2-3B-RRStock | f1af1d33-fb95-462d-830c-5330d6481b6a | 0.0.1 | hfopenllm_v2/bunnycore_Llama-3.2-3B-RRStock/1762652580.048298 | 1762652580.048298 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bunnycore/Llama-3.2-3B-RRStock | bunnycore/Llama-3.2-3B-RRStock | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6657269378582162}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 3.607} |
HF Open LLM v2 | meta | chargoddard/prometheus-2-llama-3-8b | ea26b157-81d0-4aa2-a6df-d1d391ab2a3b | 0.0.1 | hfopenllm_v2/chargoddard_prometheus-2-llama-3-8b/1762652580.100514 | 1762652580.100516 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | chargoddard/prometheus-2-llama-3-8b | chargoddard/prometheus-2-llama-3-8b | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5288900118352637}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | Enno-Ai/EnnoAi-Pro-Llama-3-8B-v0.3 | f1e005a2-b949-4518-b7e5-3fd7af3fcf0f | 0.0.1 | hfopenllm_v2/Enno-Ai_EnnoAi-Pro-Llama-3-8B-v0.3/1762652579.596117 | 1762652579.596118 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Enno-Ai/EnnoAi-Pro-Llama-3-8B-v0.3 | Enno-Ai/EnnoAi-Pro-Llama-3-8B-v0.3 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5082569803676467}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | Enno-Ai/EnnoAi-Pro-Llama-3-8B | 39a6c969-d938-4e4c-9adc-f71f1d30143d | 0.0.1 | hfopenllm_v2/Enno-Ai_EnnoAi-Pro-Llama-3-8B/1762652579.5958989 | 1762652579.5958998 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Enno-Ai/EnnoAi-Pro-Llama-3-8B | Enno-Ai/EnnoAi-Pro-Llama-3-8B | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.31953771548380516}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.031} |
HF Open LLM v2 | meta | Enno-Ai/EnnoAi-Pro-French-Llama-3-8B-v0.4 | bbc78d6d-09e3-410a-9bf9-a6dcdbef346e | 0.0.1 | hfopenllm_v2/Enno-Ai_EnnoAi-Pro-French-Llama-3-8B-v0.4/1762652579.5956101 | 1762652579.5956109 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Enno-Ai/EnnoAi-Pro-French-Llama-3-8B-v0.4 | Enno-Ai/EnnoAi-Pro-French-Llama-3-8B-v0.4 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4188807918545016}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.031} |
HF Open LLM v2 | meta | Enno-Ai/EnnoAi-Pro-Llama-3.1-8B-v0.9 | cf0ca830-4bb6-4317-97ae-380f54518d9f | 0.0.1 | hfopenllm_v2/Enno-Ai_EnnoAi-Pro-Llama-3.1-8B-v0.9/1762652579.5963311 | 1762652579.596332 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Enno-Ai/EnnoAi-Pro-Llama-3.1-8B-v0.9 | Enno-Ai/EnnoAi-Pro-Llama-3.1-8B-v0.9 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4689147018799009}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | LimYeri/CodeMind-Llama3-8B-unsloth_v4-one-merged | 0f52efcb-1b9b-4df1-820b-a8c0698481a7 | 0.0.1 | hfopenllm_v2/LimYeri_CodeMind-Llama3-8B-unsloth_v4-one-merged/1762652579.7341938 | 1762652579.734195 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | LimYeri/CodeMind-Llama3-8B-unsloth_v4-one-merged | LimYeri/CodeMind-Llama3-8B-unsloth_v4-one-merged | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.32108693821283085}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | LimYeri/CodeMind-Llama3-8B-unsloth_v2-merged | 0338e807-8f8e-41d9-b4ac-d80239340678 | 0.0.1 | hfopenllm_v2/LimYeri_CodeMind-Llama3-8B-unsloth_v2-merged/1762652579.733024 | 1762652579.733025 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | LimYeri/CodeMind-Llama3-8B-unsloth_v2-merged | LimYeri/CodeMind-Llama3-8B-unsloth_v2-merged | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6946280314011268}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | LimYeri/CodeMind-Llama3-8B-unsloth_v3-merged | c96743a9-b5ca-40ab-a86a-ed1c7ab8ddfd | 0.0.1 | hfopenllm_v2/LimYeri_CodeMind-Llama3-8B-unsloth_v3-merged/1762652579.733407 | 1762652579.7334101 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | LimYeri/CodeMind-Llama3-8B-unsloth_v3-merged | LimYeri/CodeMind-Llama3-8B-unsloth_v3-merged | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6762933460994606}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | LimYeri/CodeMind-Llama3.1-8B-unsloth-merged | 82d77852-64e4-4dd0-a636-785958786fd2 | 0.0.1 | hfopenllm_v2/LimYeri_CodeMind-Llama3.1-8B-unsloth-merged/1762652579.7344582 | 1762652579.734459 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | LimYeri/CodeMind-Llama3.1-8B-unsloth-merged | LimYeri/CodeMind-Llama3.1-8B-unsloth-merged | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6490157227268093}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | ValiantLabs/Llama3.1-8B-Cobalt | 8683a084-2521-469c-8575-9b2595c112dd | 0.0.1 | hfopenllm_v2/ValiantLabs_Llama3.1-8B-Cobalt/1762652579.9449751 | 1762652579.9449759 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ValiantLabs/Llama3.1-8B-Cobalt | ValiantLabs/Llama3.1-8B-Cobalt | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3496134700372789}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | ValiantLabs/Llama3.1-8B-Cobalt | 382ce872-f5a6-4753-9cca-ba06ddcbb4b6 | 0.0.1 | hfopenllm_v2/ValiantLabs_Llama3.1-8B-Cobalt/1762652579.945206 | 1762652579.945206 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ValiantLabs/Llama3.1-8B-Cobalt | ValiantLabs/Llama3.1-8B-Cobalt | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7168346653545925}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | ValiantLabs/Llama3-70B-ShiningValiant2 | 1650ab9b-4e64-48f1-9551-fb58758cb2f6 | 0.0.1 | hfopenllm_v2/ValiantLabs_Llama3-70B-ShiningValiant2/1762652579.9445372 | 1762652579.944538 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ValiantLabs/Llama3-70B-ShiningValiant2 | ValiantLabs/Llama3-70B-ShiningValiant2 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6121712611426571}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 70.554} |
HF Open LLM v2 | meta | ValiantLabs/Llama3-70B-Fireplace | 60150622-5b73-4b2c-a8f2-7c2e84cd3d0e | 0.0.1 | hfopenllm_v2/ValiantLabs_Llama3-70B-Fireplace/1762652579.944278 | 1762652579.944279 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ValiantLabs/Llama3-70B-Fireplace | ValiantLabs/Llama3-70B-Fireplace | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7773596280092377}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 70.554} |
HF Open LLM v2 | meta | ValiantLabs/Llama3.2-3B-ShiningValiant2 | 6c3a0d11-d421-4420-9df7-359164a85893 | 0.0.1 | hfopenllm_v2/ValiantLabs_Llama3.2-3B-ShiningValiant2/1762652579.947389 | 1762652579.9473898 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ValiantLabs/Llama3.2-3B-ShiningValiant2 | ValiantLabs/Llama3.2-3B-ShiningValiant2 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2625101397624968}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 3.213} |
HF Open LLM v2 | meta | ValiantLabs/Llama3.2-3B-Esper2 | 5567fc86-d3f8-4ef7-94d8-12fc28eeb9b4 | 0.0.1 | hfopenllm_v2/ValiantLabs_Llama3.2-3B-Esper2/1762652579.947128 | 1762652579.9471302 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ValiantLabs/Llama3.2-3B-Esper2 | ValiantLabs/Llama3.2-3B-Esper2 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.27497484452364174}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 3.213} |
HF Open LLM v2 | meta | ValiantLabs/Llama3.1-8B-ShiningValiant2 | 4b3c0c63-4718-4fce-bd70-a31b3b60dfad | 0.0.1 | hfopenllm_v2/ValiantLabs_Llama3.1-8B-ShiningValiant2/1762652579.946223 | 1762652579.9462242 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ValiantLabs/Llama3.1-8B-ShiningValiant2 | ValiantLabs/Llama3.1-8B-ShiningValiant2 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6495653754260917}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | ValiantLabs/Llama3.1-8B-ShiningValiant2 | e1d82962-59c9-44e7-9243-ea62f6639d1e | 0.0.1 | hfopenllm_v2/ValiantLabs_Llama3.1-8B-ShiningValiant2/1762652579.946434 | 1762652579.946435 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ValiantLabs/Llama3.1-8B-ShiningValiant2 | ValiantLabs/Llama3.1-8B-ShiningValiant2 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.26780608784691284}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | meta | ValiantLabs/Llama3.1-70B-ShiningValiant2 | 6f4c4594-6f73-44e3-b531-f7651b523e8f | 0.0.1 | hfopenllm_v2/ValiantLabs_Llama3.1-70B-ShiningValiant2/1762652579.94475 | 1762652579.944751 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ValiantLabs/Llama3.1-70B-ShiningValiant2 | ValiantLabs/Llama3.1-70B-ShiningValiant2 | meta | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5355346037402979}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 70.554} |
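Each row above stores its per-benchmark scores as a JSON-encoded list in the `evaluation_results` column (truncated after the IFEval entry in this view). Below is a minimal sketch of how one such record might be parsed, using only the Python standard library; the sample payload is reconstructed from the visible fields of the last row and is illustrative only, since the full records continue with BBH and other benchmarks not shown here.

```python
import json

# Illustrative fragment of one row's evaluation_results field,
# reconstructed from the visible columns. Real records contain
# additional benchmark entries (BBH, etc.) truncated above.
raw = """[
  {"evaluation_name": "IFEval",
   "metric_config": {"evaluation_description": "Accuracy on IFEval",
                     "lower_is_better": false,
                     "score_type": "continuous",
                     "min_score": 0, "max_score": 1},
   "score_details": {"score": 0.5355346037402979}}
]"""

results = json.loads(raw)
for entry in results:
    name = entry["evaluation_name"]
    score = entry["score_details"]["score"]
    cfg = entry["metric_config"]
    direction = "lower is better" if cfg["lower_is_better"] else "higher is better"
    print(f"{name}: {score:.4f} ({direction})")
```

The same loop extends to full records: every entry shares the `evaluation_name` / `metric_config` / `score_details` shape, so additional benchmarks need no special handling.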