_leaderboard | _developer | _model | _uuid | schema_version | evaluation_id | retrieved_timestamp | source_data | evaluation_source_name | evaluation_source_type | source_organization_name | source_organization_url | source_organization_logo_url | evaluator_relationship | model_name | model_id | model_developer | model_inference_platform | evaluation_results | additional_details |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
HF Open LLM v2 | T145 | T145/ZEUS-8B-V2L2 | 8e7be46e-af57-4e88-9df5-3161110dfa66 | 0.0.1 | hfopenllm_v2/T145_ZEUS-8B-V2L2/1762652579.9065118 | 1762652579.906513 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | T145/ZEUS-8B-V2L2 | T145/ZEUS-8B-V2L2 | T145 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.8020640788662969}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | T145 | T145/KRONOS-8B-V5 | 290206b5-0e46-4f92-a2bd-f2c53ef3d147 | 0.0.1 | hfopenllm_v2/T145_KRONOS-8B-V5/1762652579.8986921 | 1762652579.898693 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | T145/KRONOS-8B-V5 | T145/KRONOS-8B-V5 | T145 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5405058577906621}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | T145 | T145/ZEUS-8B-V14 | 2b0eb3f5-d35e-41ea-ba69-18c0b8a3e1e1 | 0.0.1 | hfopenllm_v2/T145_ZEUS-8B-V14/1762652579.901653 | 1762652579.901653 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | T145/ZEUS-8B-V14 | T145/ZEUS-8B-V14 | T145 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.770939994769434}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | T145 | T145/ZEUS-8B-V29 | c383684a-2f70-46e9-ab55-4d68903613b3 | 0.0.1 | hfopenllm_v2/T145_ZEUS-8B-V29/1762652579.906123 | 1762652579.906123 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | T145/ZEUS-8B-V29 | T145/ZEUS-8B-V29 | T145 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7417640748768822}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | T145 | T145/ZEUS-8B-V16 | 7beef3ca-6423-4a81-836d-0e4cdc4af973 | 0.0.1 | hfopenllm_v2/T145_ZEUS-8B-V16/1762652579.9020631 | 1762652579.902064 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | T145/ZEUS-8B-V16 | T145/ZEUS-8B-V16 | T145 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7925471083392066}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | T145 | T145/KRONOS-8B-V2 | ff4c64ec-f44b-4bec-9534-bafa632a0e3f | 0.0.1 | hfopenllm_v2/T145_KRONOS-8B-V2/1762652579.897814 | 1762652579.897815 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | T145/KRONOS-8B-V2 | T145/KRONOS-8B-V2 | T145 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5180243974875552}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | T145 | T145/KRONOS-8B-V8 | 57a4ddc6-0447-4840-94bc-5bb136025aab | 0.0.1 | hfopenllm_v2/T145_KRONOS-8B-V8/1762652579.899387 | 1762652579.8993878 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | T145/KRONOS-8B-V8 | T145/KRONOS-8B-V8 | T145 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7770349339751859}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | T145 | T145/KRONOS-8B-V1-P3 | 14eb1867-80a0-47f9-9b2a-f0a05f683fb4 | 0.0.1 | hfopenllm_v2/T145_KRONOS-8B-V1-P3/1762652579.897578 | 1762652579.897579 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | T145/KRONOS-8B-V1-P3 | T145/KRONOS-8B-V1-P3 | T145 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7137373280673058}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | T145 | T145/ZEUS-8B-V2 | e64503c5-d9ce-4544-8caf-0fec97a2b592 | 0.0.1 | hfopenllm_v2/T145_ZEUS-8B-V2/1762652579.9035678 | 1762652579.903569 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | T145/ZEUS-8B-V2 | T145/ZEUS-8B-V2 | T145 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.8029384255996312}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | T145 | T145/ZEUS-8B-V19 | 0392cccb-0a1c-486e-876a-1404f14a1080 | 0.0.1 | hfopenllm_v2/T145_ZEUS-8B-V19/1762652579.903361 | 1762652579.903362 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | T145/ZEUS-8B-V19 | T145/ZEUS-8B-V19 | T145 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7882507302845339}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | T145 | T145/ZEUS-8B-V22 | 3f44a1c0-b70a-4712-a0c1-bdf3318b270c | 0.0.1 | hfopenllm_v2/T145_ZEUS-8B-V22/1762652579.9047282 | 1762652579.9047291 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | T145/ZEUS-8B-V22 | T145/ZEUS-8B-V22 | T145 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7995163942782927}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | T145 | T145/ZEUS-8B-V17-abliterated-V4 | bf9c0bfa-98e5-45b2-8819-0911af81d78f | 0.0.1 | hfopenllm_v2/T145_ZEUS-8B-V17-abliterated-V4/1762652579.902891 | 1762652579.902891 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | T145/ZEUS-8B-V17-abliterated-V4 | T145/ZEUS-8B-V17-abliterated-V4 | T145 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7228298691915229}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | T145 | T145/ZEUS-8B-V4 | 9330c290-ee47-4a7d-9b8f-62903402e0e3 | 0.0.1 | hfopenllm_v2/T145_ZEUS-8B-V4/1762652579.9075332 | 1762652579.907535 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | T145/ZEUS-8B-V4 | T145/ZEUS-8B-V4 | T145 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7807317916461656}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | T145 | T145/ZEUS-8B-V20 | 0ba8bca5-3a61-499a-8e2d-ca84f52ef654 | 0.0.1 | hfopenllm_v2/T145_ZEUS-8B-V20/1762652579.904202 | 1762652579.904203 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | T145/ZEUS-8B-V20 | T145/ZEUS-8B-V20 | T145 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7955945779420825}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | T145 | T145/ZEUS-8B-V2L1 | 015f91ef-9318-44d6-acb2-17628000c273 | 0.0.1 | hfopenllm_v2/T145_ZEUS-8B-V2L1/1762652579.906316 | 1762652579.906317 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | T145/ZEUS-8B-V2L1 | T145/ZEUS-8B-V2L1 | T145 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3191886416929303}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | xinchen9 | xinchen9/llama3-b8-ft-dis | 5ea3a084-bc30-4390-97a2-1933c5422790 | 0.0.1 | hfopenllm_v2/xinchen9_llama3-b8-ft-dis/1762652580.598142 | 1762652580.598142 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | xinchen9/llama3-b8-ft-dis | xinchen9/llama3-b8-ft-dis | xinchen9 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.154598687039278}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | mlabonne | mlabonne/NeuralDaredevil-8B-abliterated | 05fe5948-c228-46f5-ac96-3c234bc5b3ce | 0.0.1 | hfopenllm_v2/mlabonne_NeuralDaredevil-8B-abliterated/1762652580.369559 | 1762652580.36956 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | mlabonne/NeuralDaredevil-8B-abliterated | mlabonne/NeuralDaredevil-8B-abliterated | mlabonne | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.756077208473517}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | mlabonne | mlabonne/NeuralDaredevil-8B-abliterated | d4b40160-579a-4e66-96a2-8441e5c02694 | 0.0.1 | hfopenllm_v2/mlabonne_NeuralDaredevil-8B-abliterated/1762652580.369774 | 1762652580.369775 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | mlabonne/NeuralDaredevil-8B-abliterated | mlabonne/NeuralDaredevil-8B-abliterated | mlabonne | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.41623337189767595}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | mlabonne | mlabonne/BigQwen2.5-Echo-47B-Instruct | 12efcd4e-13cc-46e5-964a-35d4be69a01e | 0.0.1 | hfopenllm_v2/mlabonne_BigQwen2.5-Echo-47B-Instruct/1762652580.36785 | 1762652580.36785 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | mlabonne/BigQwen2.5-Echo-47B-Instruct | mlabonne/BigQwen2.5-Echo-47B-Instruct | mlabonne | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7356691356711305}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 47.392} |
HF Open LLM v2 | mlabonne | mlabonne/Daredevil-8B-abliterated | 3ad89b65-5719-4e54-aadf-c10d3f27857a | 0.0.1 | hfopenllm_v2/mlabonne_Daredevil-8B-abliterated/1762652580.3686998 | 1762652580.3686998 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | mlabonne/Daredevil-8B-abliterated | mlabonne/Daredevil-8B-abliterated | mlabonne | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.44263664853699297}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | mlabonne | mlabonne/Daredevil-8B | 4653087e-b528-47c1-86eb-0166538229bc | 0.0.1 | hfopenllm_v2/mlabonne_Daredevil-8B/1762652580.368499 | 1762652580.3685 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | mlabonne/Daredevil-8B | mlabonne/Daredevil-8B | mlabonne | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.45477665926408595}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | mlabonne | mlabonne/Meta-Llama-3.1-8B-Instruct-abliterated | 605f3f59-204e-4332-8b4e-9da04871ca1b | 0.0.1 | hfopenllm_v2/mlabonne_Meta-Llama-3.1-8B-Instruct-abliterated/1762652580.369122 | 1762652580.369123 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | mlabonne/Meta-Llama-3.1-8B-Instruct-abliterated | mlabonne/Meta-Llama-3.1-8B-Instruct-abliterated | mlabonne | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7329463601023063}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | mlabonne | mlabonne/BigQwen2.5-52B-Instruct | b18517f1-db51-43a8-812f-75aeccae508f | 0.0.1 | hfopenllm_v2/mlabonne_BigQwen2.5-52B-Instruct/1762652580.3676438 | 1762652580.367645 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | mlabonne/BigQwen2.5-52B-Instruct | mlabonne/BigQwen2.5-52B-Instruct | mlabonne | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7913480675718205}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 52.268} |
HF Open LLM v2 | mlabonne | mlabonne/AlphaMonarch-7B | d7eb4408-6857-4df1-b92b-9dd4712a4f23 | 0.0.1 | hfopenllm_v2/mlabonne_AlphaMonarch-7B/1762652580.367184 | 1762652580.3671849 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | mlabonne/AlphaMonarch-7B | mlabonne/AlphaMonarch-7B | mlabonne | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.49394384677101205}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "MistralForCausalLM", "params_billions": 7.242} |
HF Open LLM v2 | mlabonne | mlabonne/NeuralBeagle14-7B | 0bfec228-5bfb-4662-8be5-ad910b5bc3bd | 0.0.1 | hfopenllm_v2/mlabonne_NeuralBeagle14-7B/1762652580.369343 | 1762652580.369343 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | mlabonne/NeuralBeagle14-7B | mlabonne/NeuralBeagle14-7B | mlabonne | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.49351941736813876}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "MistralForCausalLM", "params_billions": 7.242} |
HF Open LLM v2 | mlabonne | mlabonne/Beyonder-4x7B-v3 | b0867447-6dd9-453c-af09-da0db5651e65 | 0.0.1 | hfopenllm_v2/mlabonne_Beyonder-4x7B-v3/1762652580.36743 | 1762652580.367431 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | mlabonne/Beyonder-4x7B-v3 | mlabonne/Beyonder-4x7B-v3 | mlabonne | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5608385749810503}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "MixtralForCausalLM", "params_billions": 24.154} |
HF Open LLM v2 | nothingiisreal | nothingiisreal/MN-12B-Starcannon-v3 | 633a786a-fe99-4a6e-b402-888e36e8b6c9 | 0.0.1 | hfopenllm_v2/nothingiisreal_MN-12B-Starcannon-v3/1762652580.412042 | 1762652580.412042 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | nothingiisreal/MN-12B-Starcannon-v3 | nothingiisreal/MN-12B-Starcannon-v3 | nothingiisreal | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.38073755413414184}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "MistralForCausalLM", "params_billions": 12.248} |
HF Open LLM v2 | nothingiisreal | nothingiisreal/MN-12B-Starcannon-v2 | 1ff70031-dbe8-467a-9dbd-9fd789b9841b | 0.0.1 | hfopenllm_v2/nothingiisreal_MN-12B-Starcannon-v2/1762652580.411832 | 1762652580.411832 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | nothingiisreal/MN-12B-Starcannon-v2 | nothingiisreal/MN-12B-Starcannon-v2 | nothingiisreal | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3925273828995953}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "MistralForCausalLM", "params_billions": 12.248} |
HF Open LLM v2 | nothingiisreal | nothingiisreal/L3.1-8B-Celeste-V1.5 | 5b7a80ce-0fb2-4fb8-9381-184d7a434706 | 0.0.1 | hfopenllm_v2/nothingiisreal_L3.1-8B-Celeste-V1.5/1762652580.4115741 | 1762652580.411575 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | nothingiisreal/L3.1-8B-Celeste-V1.5 | nothingiisreal/L3.1-8B-Celeste-V1.5 | nothingiisreal | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7326715337526651}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | Amu | Amu/t1-3B | c0b7e3e6-4160-4482-af4f-038ae79c7578 | 0.0.1 | hfopenllm_v2/Amu_t1-3B/1762652579.481272 | 1762652579.4812732 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Amu/t1-3B | Amu/t1-3B | Amu | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.33277703160946287}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.397} |
HF Open LLM v2 | Amu | Amu/t1-1.5B | 3e967795-680c-4bfc-906b-eadb969cf2bd | 0.0.1 | hfopenllm_v2/Amu_t1-1.5B/1762652579.481014 | 1762652579.481015 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Amu/t1-1.5B | Amu/t1-1.5B | Amu | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3393717558300864}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.777} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/GivingTree-8b-sce | 9b753075-a150-4bc3-9425-2371010daf8b | 0.0.1 | hfopenllm_v2/Quazim0t0_GivingTree-8b-sce/1762652579.8222332 | 1762652579.8222342 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/GivingTree-8b-sce | Quazim0t0/GivingTree-8b-sce | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5006139266036339}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/InspectorDeck-14B-sce | 1ac547e3-1b29-462a-aa08-1e9ef9e3f409 | 0.0.1 | hfopenllm_v2/Quazim0t0_InspectorDeck-14B-sce/1762652579.8238342 | 1762652579.8238342 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/InspectorDeck-14B-sce | Quazim0t0/InspectorDeck-14B-sce | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.32408454013129606}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/tesseract-14b-stock | 4311b63a-282b-4c16-8609-a1d4ab93ace9 | 0.0.1 | hfopenllm_v2/Quazim0t0_tesseract-14b-stock/1762652579.834054 | 1762652579.834055 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/tesseract-14b-stock | Quazim0t0/tesseract-14b-stock | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5847939024011845}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Origami-14B-sce | 82826944-e4a1-47bd-b240-c70e21acfc51 | 0.0.1 | hfopenllm_v2/Quazim0t0_Origami-14B-sce/1762652579.828193 | 1762652579.8281941 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Origami-14B-sce | Quazim0t0/Origami-14B-sce | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3259329689667859}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/MFGRIMM-14B | 773228d8-7e03-4ba8-87c1-f59ac5aad425 | 0.0.1 | hfopenllm_v2/Quazim0t0_MFGRIMM-14B/1762652579.8259468 | 1762652579.825948 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/MFGRIMM-14B | Quazim0t0/MFGRIMM-14B | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6894074389287091}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/bloom-14b-stock | 1a2b4a76-0feb-4404-a1ef-0408c75f2ca7 | 0.0.1 | hfopenllm_v2/Quazim0t0_bloom-14b-stock/1762652579.8329449 | 1762652579.8329458 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/bloom-14b-stock | Quazim0t0/bloom-14b-stock | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6575087434673332}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Imagine-v0.5-16bit | ccb33ad4-98f5-4980-a442-1a1772fab792 | 0.0.1 | hfopenllm_v2/Quazim0t0_Imagine-v0.5-16bit/1762652579.823242 | 1762652579.823243 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Imagine-v0.5-16bit | Quazim0t0/Imagine-v0.5-16bit | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2758990589413866}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Hyde-14b-sce | 814ce716-6f61-4980-a8f6-7918c7b0eea5 | 0.0.1 | hfopenllm_v2/Quazim0t0_Hyde-14b-sce/1762652579.823039 | 1762652579.823039 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Hyde-14b-sce | Quazim0t0/Hyde-14b-sce | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6715470507143269}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/MFDOOM-14B | 3efa12a5-4525-4ee9-80bd-99c4b8d2ccb2 | 0.0.1 | hfopenllm_v2/Quazim0t0_MFDOOM-14B/1762652579.825741 | 1762652579.825742 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/MFDOOM-14B | Quazim0t0/MFDOOM-14B | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6736204382150472}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/TB0-8B-sce | 8f0da98a-cf9f-4cbb-8d4a-8c12d737580c | 0.0.1 | hfopenllm_v2/Quazim0t0_TB0-8B-sce/1762652579.8308768 | 1762652579.8308768 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/TB0-8B-sce | Quazim0t0/TB0-8B-sce | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5107304175144174}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Spok-14b-sce | 9f15293c-5668-4895-b4d0-4062cac344e7 | 0.0.1 | hfopenllm_v2/Quazim0t0_Spok-14b-sce/1762652579.830291 | 1762652579.830292 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Spok-14b-sce | Quazim0t0/Spok-14b-sce | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6681748870773991}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Casa-14b-sce | 09bbb732-62d8-4cec-972a-273b728df1f4 | 0.0.1 | hfopenllm_v2/Quazim0t0_Casa-14b-sce/1762652579.8199282 | 1762652579.8199282 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Casa-14b-sce | Quazim0t0/Casa-14b-sce | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6653523761397536}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Casa-14b-sce | a0dde1eb-a763-4568-8122-1b280dedb2ce | 0.0.1 | hfopenllm_v2/Quazim0t0_Casa-14b-sce/1762652579.820149 | 1762652579.820149 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Casa-14b-sce | Quazim0t0/Casa-14b-sce | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6718218770639681}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Imbue-14b | c50c07fc-b529-43c9-9f3d-0f1ff174b905 | 0.0.1 | hfopenllm_v2/Quazim0t0_Imbue-14b/1762652579.8234398 | 1762652579.8234408 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Imbue-14b | Quazim0t0/Imbue-14b | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5199725616918665}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Alice-14B | 3dd99496-1274-439f-b7c2-1fd731745753 | 0.0.1 | hfopenllm_v2/Quazim0t0_Alice-14B/1762652579.819317 | 1762652579.819317 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Alice-14B | Quazim0t0/Alice-14B | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6836371937570092}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Vine-14b-sce | 70d25d8c-96e9-45e4-b0d1-684a89278064 | 0.0.1 | hfopenllm_v2/Quazim0t0_Vine-14b-sce/1762652579.8321972 | 1762652579.832198 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Vine-14b-sce | Quazim0t0/Vine-14b-sce | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.673345611865406}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Insom | 51f419c6-1107-41c9-896b-fadbbde4f5e9 | 0.0.1 | hfopenllm_v2/Quazim0t0_Insom/1762652579.823634 | 1762652579.8236349 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Insom | Quazim0t0/Insom | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.68183863260593}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH",... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/GuiltySpark-14B-ties | 2b50b73e-9734-4502-b088-8d4936291aaa | 0.0.1 | hfopenllm_v2/Quazim0t0_GuiltySpark-14B-ties/1762652579.822431 | 1762652579.822432 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/GuiltySpark-14B-ties | Quazim0t0/GuiltySpark-14B-ties | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6854357549080883}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Charlie-8B-Linear | c56d7463-dad2-4c9c-8823-a4b6faa5aeb9 | 0.0.1 | hfopenllm_v2/Quazim0t0_Charlie-8B-Linear/1762652579.820338 | 1762652579.820339 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Charlie-8B-Linear | Quazim0t0/Charlie-8B-Linear | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7380672172059026}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Alien-CoT-14B-sce | dc89616f-c86d-41d0-9945-12703dc8f905 | 0.0.1 | hfopenllm_v2/Quazim0t0_Alien-CoT-14B-sce/1762652579.819517 | 1762652579.8195179 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Alien-CoT-14B-sce | Quazim0t0/Alien-CoT-14B-sce | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.07486358417886763}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Aura-8B-Linear | 2d22ab53-547d-41bb-8700-12bc5b16c97d | 0.0.1 | hfopenllm_v2/Quazim0t0_Aura-8B-Linear/1762652579.819725 | 1762652579.819726 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Aura-8B-Linear | Quazim0t0/Aura-8B-Linear | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.794770098893159}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/ODB-14b-sce | 66743ed1-93ab-41f7-9002-0080e7f74722 | 0.0.1 | hfopenllm_v2/Quazim0t0_ODB-14b-sce/1762652579.827807 | 1762652579.827808 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/ODB-14b-sce | Quazim0t0/ODB-14b-sce | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7015973173402128}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/ODB-14B-sce | 79d7d2a1-dcb6-40a7-b29c-7213ebd261df | 0.0.1 | hfopenllm_v2/Quazim0t0_ODB-14B-sce/1762652579.827594 | 1762652579.827595 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/ODB-14B-sce | Quazim0t0/ODB-14B-sce | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.292235712354331}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "Unknown", "params_billions": 0.0} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Halo-14B-sce | 156424f1-2a1e-4e61-b081-bb066ee3958d | 0.0.1 | hfopenllm_v2/Quazim0t0_Halo-14B-sce/1762652579.822633 | 1762652579.822633 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Halo-14B-sce | Quazim0t0/Halo-14B-sce | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6753691316817156}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Jigsaw-14B-Linear | 7533defe-b19d-4571-a403-c443ec03a31b | 0.0.1 | hfopenllm_v2/Quazim0t0_Jigsaw-14B-Linear/1762652579.824291 | 1762652579.824291 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Jigsaw-14B-Linear | Quazim0t0/Jigsaw-14B-Linear | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6480416406246536}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Mouse-9B | 70e3145f-d67b-403d-af2a-1b06b2ba0f24 | 0.0.1 | hfopenllm_v2/Quazim0t0_Mouse-9B/1762652579.826978 | 1762652579.826978 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Mouse-9B | Quazim0t0/Mouse-9B | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.1324917884546337}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 9.207} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/RZA-14B-sce | e8a8cf1f-5bcf-45ae-b590-fb04de06b77f | 0.0.1 | hfopenllm_v2/Quazim0t0_RZA-14B-sce/1762652579.829216 | 1762652579.829216 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/RZA-14B-sce | Quazim0t0/RZA-14B-sce | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4773578549360142}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Sake-20b | 25a672ed-3e0e-416f-abf4-a935e63171c6 | 0.0.1 | hfopenllm_v2/Quazim0t0_Sake-20b/1762652579.830092 | 1762652579.8300931 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Sake-20b | Quazim0t0/Sake-20b | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6692741924759638}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 21.475} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Nova-14b-sce | 3336c8fa-fcef-4513-946d-9254f537e418 | 0.0.1 | hfopenllm_v2/Quazim0t0_Nova-14b-sce/1762652579.827177 | 1762652579.827178 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Nova-14b-sce | Quazim0t0/Nova-14b-sce | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7021968377239058}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/mosaic-14b-sce | 4fd82b3e-4b13-4e21-9253-6492f8b1feaa | 0.0.1 | hfopenllm_v2/Quazim0t0_mosaic-14b-sce/1762652579.8338351 | 1762652579.833836 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/mosaic-14b-sce | Quazim0t0/mosaic-14b-sce | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6875590100932193}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/SZA-14B-sce | 6d983237-925e-4197-a592-17cca9219bda | 0.0.1 | hfopenllm_v2/Quazim0t0_SZA-14B-sce/1762652579.829889 | 1762652579.82989 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/SZA-14B-sce | Quazim0t0/SZA-14B-sce | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5659095644002359}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Mithril-14B-sce | 8ab4e441-2efb-4510-87ea-43f3fbcc67ac | 0.0.1 | hfopenllm_v2/Quazim0t0_Mithril-14B-sce/1762652579.826359 | 1762652579.82636 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Mithril-14B-sce | Quazim0t0/Mithril-14B-sce | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6957772044841022}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/GZA-14B-sce | cfb61ec3-ab7e-4697-892e-a8dd62518f39 | 0.0.1 | hfopenllm_v2/Quazim0t0_GZA-14B-sce/1762652579.821823 | 1762652579.821824 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/GZA-14B-sce | Quazim0t0/GZA-14B-sce | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6274086091570367}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Oasis-14B-ties | a3ef4bc2-c560-4a62-8227-2bd30120b537 | 0.0.1 | hfopenllm_v2/Quazim0t0_Oasis-14B-ties/1762652579.827992 | 1762652579.8279932 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Oasis-14B-ties | Quazim0t0/Oasis-14B-ties | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6936539492989712}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Dyson-14b | 35c401bd-ed12-475e-afbc-e664243d90d5 | 0.0.1 | hfopenllm_v2/Quazim0t0_Dyson-14b/1762652579.821013 | 1762652579.821014 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Dyson-14b | Quazim0t0/Dyson-14b | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5856682491345186}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Mononoke-14B-sce | 6f2d122b-f7fe-448a-ac8b-864314e94692 | 0.0.1 | hfopenllm_v2/Quazim0t0_Mononoke-14B-sce/1762652579.8265631 | 1762652579.826564 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Mononoke-14B-sce | Quazim0t0/Mononoke-14B-sce | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3502129904209719}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Knot-CoT-14B-sce | fe0b75bf-2035-4ffe-8cbf-d5f4c66907aa | 0.0.1 | hfopenllm_v2/Quazim0t0_Knot-CoT-14B-sce/1762652579.8248682 | 1762652579.8248692 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Knot-CoT-14B-sce | Quazim0t0/Knot-CoT-14B-sce | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4831779677921249}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Heretic1.5b | e3d7453d-0ba6-4980-be81-827122149bb6 | 0.0.1 | hfopenllm_v2/Quazim0t0_Heretic1.5b/1762652579.8228369 | 1762652579.8228369 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Heretic1.5b | Quazim0t0/Heretic1.5b | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.20615633186611523}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.73} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Venti-Blend-sce | e9fa96ff-d790-4948-9071-dd1376701fc1 | 0.0.1 | hfopenllm_v2/Quazim0t0_Venti-Blend-sce/1762652579.831816 | 1762652579.8318179 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Venti-Blend-sce | Quazim0t0/Venti-Blend-sce | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6879335718116819}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 21.475} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Chromatic-8b-sce | f626897d-5003-40fa-8020-c100748a847f | 0.0.1 | hfopenllm_v2/Quazim0t0_Chromatic-8b-sce/1762652579.8205519 | 1762652579.820553 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Chromatic-8b-sce | Quazim0t0/Chromatic-8b-sce | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5085074269604649}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/1up-14b | c315527d-ea14-42a8-a002-4bb67c085fc0 | 0.0.1 | hfopenllm_v2/Quazim0t0_1up-14b/1762652579.818811 | 1762652579.818812 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/1up-14b | Quazim0t0/1up-14b | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6888079185450161}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Fugazi14b | ee38e1c3-7a6b-4357-94ac-b309da33d14b | 0.0.1 | hfopenllm_v2/Quazim0t0_Fugazi14b/1762652579.8215911 | 1762652579.821592 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Fugazi14b | Quazim0t0/Fugazi14b | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6997987561891337}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Wu-14b-sce | 35443539-9756-466b-a36f-66adc5f68ddb | 0.0.1 | hfopenllm_v2/Quazim0t0_Wu-14b-sce/1762652579.832721 | 1762652579.832722 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Wu-14b-sce | Quazim0t0/Wu-14b-sce | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6718218770639681}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Lineage-14B | 37f890b7-5487-46ea-b61e-d91b5349d078 | 0.0.1 | hfopenllm_v2/Quazim0t0_Lineage-14B/1762652579.82509 | 1762652579.8250911 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Lineage-14B | Quazim0t0/Lineage-14B | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7070428684778609}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Rune-14b | 3ed52eaf-6b73-46ab-8ae7-3afe120fe437 | 0.0.1 | hfopenllm_v2/Quazim0t0_Rune-14b/1762652579.829681 | 1762652579.8296819 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Rune-14b | Quazim0t0/Rune-14b | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7015973173402128}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/mocha-14B | 5c04fa63-11be-42d8-8133-4e79e08e42ad | 0.0.1 | hfopenllm_v2/Quazim0t0_mocha-14B/1762652579.833622 | 1762652579.833623 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/mocha-14B | Quazim0t0/mocha-14B | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5893152391210876}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Katana-8b-sce | dea8c833-7deb-43f8-9b15-acbadf4fc749 | 0.0.1 | hfopenllm_v2/Quazim0t0_Katana-8b-sce/1762652579.8246028 | 1762652579.8246038 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Katana-8b-sce | Quazim0t0/Katana-8b-sce | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5107304175144174}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Motion-8B-Linear | db82138b-f915-4451-aa85-8bc4c7fdd225 | 0.0.1 | hfopenllm_v2/Quazim0t0_Motion-8B-Linear/1762652579.826771 | 1762652579.826771 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Motion-8B-Linear | Quazim0t0/Motion-8B-Linear | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7685917809190725}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Jekyl-8b-sce | dc6a9e35-c130-4edc-93bc-5f0b6ac0e05d | 0.0.1 | hfopenllm_v2/Quazim0t0_Jekyl-8b-sce/1762652579.82404 | 1762652579.824041 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Jekyl-8b-sce | Quazim0t0/Jekyl-8b-sce | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.46968931324441365}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Adamant-14B-sce | 7ed9dcc6-7915-4a7e-a190-07e067d2fd79 | 0.0.1 | hfopenllm_v2/Quazim0t0_Adamant-14B-sce/1762652579.819103 | 1762652579.819104 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Adamant-14B-sce | Quazim0t0/Adamant-14B-sce | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6857604489421402}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Venti-20b | 2b97259b-d7a5-4934-b350-7b1322964899 | 0.0.1 | hfopenllm_v2/Quazim0t0_Venti-20b/1762652579.8314738 | 1762652579.831475 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Venti-20b | Quazim0t0/Venti-20b | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6641034676879568}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 21.475} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/SuperNova14b | b0659361-fb53-40db-81a7-2a72771bbd1a | 0.0.1 | hfopenllm_v2/Quazim0t0_SuperNova14b/1762652579.830682 | 1762652579.830683 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/SuperNova14b | Quazim0t0/SuperNova14b | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.707642388861554}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Loke-14B-sce | cfac443e-5c66-45e3-bf7a-7c596d01d4ff | 0.0.1 | hfopenllm_v2/Quazim0t0_Loke-14B-sce/1762652579.825529 | 1762652579.82553 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Loke-14B-sce | Quazim0t0/Loke-14B-sce | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6847863668399845}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/caramel-14B | a9d4b6a9-33af-42a3-be29-d3214a171433 | 0.0.1 | hfopenllm_v2/Quazim0t0_caramel-14B/1762652579.833162 | 1762652579.833163 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/caramel-14B | Quazim0t0/caramel-14B | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6744947849483814}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Ponder-14B-linear | 30942374-a112-4035-a4f2-e30bff57f9ce | 0.0.1 | hfopenllm_v2/Quazim0t0_Ponder-14B-linear/1762652579.8290088 | 1762652579.8290088 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Ponder-14B-linear | Quazim0t0/Ponder-14B-linear | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6906064796960952}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Wendy-14B | 13e6cad7-a063-4530-bec9-e70e4e98ccc0 | 0.0.1 | hfopenllm_v2/Quazim0t0_Wendy-14B/1762652579.832468 | 1762652579.832469 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Wendy-14B | Quazim0t0/Wendy-14B | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6772175605172055}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Geedorah-14B | c4a79914-b049-436b-9de6-640cc3e119ee | 0.0.1 | hfopenllm_v2/Quazim0t0_Geedorah-14B/1762652579.822031 | 1762652579.822032 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Geedorah-14B | Quazim0t0/Geedorah-14B | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6872841837435781}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/NovaScotia-14b-stock | 8ab3ce59-d0cd-4764-98c7-c4df81bc3c23 | 0.0.1 | hfopenllm_v2/Quazim0t0_NovaScotia-14b-stock/1762652579.827381 | 1762652579.827381 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/NovaScotia-14b-stock | Quazim0t0/NovaScotia-14b-stock | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6787412953186434}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Rosemary-14b | 84018db9-2b85-4b6f-beff-b4930b230399 | 0.0.1 | hfopenllm_v2/Quazim0t0_Rosemary-14b/1762652579.829469 | 1762652579.82947 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Rosemary-14b | Quazim0t0/Rosemary-14b | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6915306941138402}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Sumatra-20b | ae69fb3f-19a1-4b00-9309-8685e107aeba | 0.0.1 | hfopenllm_v2/Quazim0t0_Sumatra-20b/1762652579.830487 | 1762652579.830488 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Sumatra-20b | Quazim0t0/Sumatra-20b | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.673795529195867}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 21.475} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/time-14b-stock | 2755da2c-8347-4bbd-80ee-c58e77a26f5e | 0.0.1 | hfopenllm_v2/Quazim0t0_time-14b-stock/1762652579.834393 | 1762652579.8343942 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/time-14b-stock | Quazim0t0/time-14b-stock | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6699235805440675}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/Edu-14B-Linear | a70e7642-3cc7-4719-bc22-68182baa3857 | 0.0.1 | hfopenllm_v2/Quazim0t0_Edu-14B-Linear/1762652579.821216 | 1762652579.821216 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/Edu-14B-Linear | Quazim0t0/Edu-14B-Linear | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6158182511292261}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | Quazim0t0 | Quazim0t0/TBL-8B-sce | 4bff88c0-89fb-4d07-a83d-251c7aaeace4 | 0.0.1 | hfopenllm_v2/Quazim0t0_TBL-8B-sce/1762652579.831074 | 1762652579.831075 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Quazim0t0/TBL-8B-sce | Quazim0t0/TBL-8B-sce | Quazim0t0 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.45809895521660304}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | Xclbr7 | Xclbr7/Hyena-12b | 06eb233f-5182-4b9e-be3f-21c928eef397 | 0.0.1 | hfopenllm_v2/Xclbr7_Hyena-12b/1762652579.9516642 | 1762652579.951665 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Xclbr7/Hyena-12b | Xclbr7/Hyena-12b | Xclbr7 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3404455733010634}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "MistralForCausalLM", "params_billions": 12.248} |
HF Open LLM v2 | Xclbr7 | Xclbr7/caliburn-v2-12b | 18a12670-8785-44ef-a365-78ce797b8ba5 | 0.0.1 | hfopenllm_v2/Xclbr7_caliburn-v2-12b/1762652579.952102 | 1762652579.952102 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Xclbr7/caliburn-v2-12b | Xclbr7/caliburn-v2-12b | Xclbr7 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2966816934622358}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "MistralForCausalLM", "params_billions": 12.248} |
HF Open LLM v2 | Xclbr7 | Xclbr7/Arcanum-12b | 2d0a414f-1cf2-4ae3-951b-ed69d1ef883f | 0.0.1 | hfopenllm_v2/Xclbr7_Arcanum-12b/1762652579.9514 | 1762652579.951401 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Xclbr7/Arcanum-12b | Xclbr7/Arcanum-12b | Xclbr7 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2906864896253053}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "MistralForCausalLM", "params_billions": 12.248} |
HF Open LLM v2 | Xclbr7 | Xclbr7/caliburn-12b | e897d1fc-2c71-4c61-971b-eeddfae1b75c | 0.0.1 | hfopenllm_v2/Xclbr7_caliburn-12b/1762652579.951879 | 1762652579.95188 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Xclbr7/caliburn-12b | Xclbr7/caliburn-12b | Xclbr7 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.35763108551975425}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "MistralForCausalLM", "params_billions": 12.248} |
HF Open LLM v2 | Lawnakk | Lawnakk/BBALAW1.3 | 60fa19b9-bf1d-4f39-b421-cb59379f5206 | 0.0.1 | hfopenllm_v2/Lawnakk_BBALAW1.3/1762652579.70884 | 1762652579.7088408 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Lawnakk/BBALAW1.3 | Lawnakk/BBALAW1.3 | Lawnakk | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.13543952268868825}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 4.353} |
HF Open LLM v2 | Lawnakk | Lawnakk/BBALAW1.63 | 6005fc02-9f02-436a-a535-ec68a3c6dbc6 | 0.0.1 | hfopenllm_v2/Lawnakk_BBALAW1.63/1762652579.709696 | 1762652579.709697 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Lawnakk/BBALAW1.63 | Lawnakk/BBALAW1.63 | Lawnakk | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.44073835201709244}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.613} |
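Each row above stores its per-benchmark scores as a JSON-encoded list in the `evaluation_results` field, with one entry per evaluation (IFEval, BBH, ...), each carrying a `metric_config` and a `score_details.score`. Below is a minimal sketch of how one might extract a named score from a row. The `row` dict is a hypothetical, abbreviated example modeled on the rows above (field names match the dataset columns; values are trimmed), and `score_for` is an illustrative helper, not part of the dataset or any library API.

```python
import json

# Hypothetical, abbreviated row modeled on the records above; the real
# evaluation_results list contains several more entries (BBH, etc.).
row = {
    "model_id": "Quazim0t0/Lineage-14B",
    "evaluation_results": json.dumps([
        {
            "evaluation_name": "IFEval",
            "metric_config": {
                "evaluation_description": "Accuracy on IFEval",
                "lower_is_better": False,
                "score_type": "continuous",
                "min_score": 0,
                "max_score": 1,
            },
            "score_details": {"score": 0.7070428684778609},
        }
    ]),
}

def score_for(row: dict, evaluation_name: str):
    """Return the score for the named evaluation, or None if absent."""
    for result in json.loads(row["evaluation_results"]):
        if result["evaluation_name"] == evaluation_name:
            return result["score_details"]["score"]
    return None

print(score_for(row, "IFEval"))  # 0.7070428684778609
```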