_leaderboard | _developer | _model | _uuid | schema_version | evaluation_id | retrieved_timestamp | source_data | evaluation_source_name | evaluation_source_type | source_organization_name | source_organization_url | source_organization_logo_url | evaluator_relationship | model_name | model_id | model_developer | model_inference_platform | evaluation_results | additional_details |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
HF Open LLM v2 | Youlln | Youlln/ECE-Qwen0.5B-FT-V2 | ee8952db-9f0a-4892-bff9-4d2ca1b66364 | 0.0.1 | hfopenllm_v2/Youlln_ECE-Qwen0.5B-FT-V2/1762652579.9641678 | 1762652579.964169 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Youlln/ECE-Qwen0.5B-FT-V2 | Youlln/ECE-Qwen0.5B-FT-V2 | Youlln | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.25259311958935626}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 0.494} |
HF Open LLM v2 | Youlln | Youlln/ECE-PRYMMAL1B-FT-V1 | c3a0b587-b379-4013-a5ce-26fdc9dcc44d | 0.0.1 | hfopenllm_v2/Youlln_ECE-PRYMMAL1B-FT-V1/1762652579.963949 | 1762652579.9639502 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Youlln/ECE-PRYMMAL1B-FT-V1 | Youlln/ECE-PRYMMAL1B-FT-V1 | Youlln | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2143745262569981}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.544} |
HF Open LLM v2 | Youlln | Youlln/ECE-PRYMMAL-0.5B-SLERP-V2 | c0fe65df-7e51-48ad-bf40-fd163804cad1 | 0.0.1 | hfopenllm_v2/Youlln_ECE-PRYMMAL-0.5B-SLERP-V2/1762652579.962454 | 1762652579.962455 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Youlln/ECE-PRYMMAL-0.5B-SLERP-V2 | Youlln/ECE-PRYMMAL-0.5B-SLERP-V2 | Youlln | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.1611934112599015}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 0.494} |
HF Open LLM v2 | Youlln | Youlln/2PRYMMAL-Yi1.5-6B-SLERP | e80773ef-5ca2-43de-ba99-a7a997aab7f0 | 0.0.1 | hfopenllm_v2/Youlln_2PRYMMAL-Yi1.5-6B-SLERP/1762652579.9607239 | 1762652579.960725 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Youlln/2PRYMMAL-Yi1.5-6B-SLERP | Youlln/2PRYMMAL-Yi1.5-6B-SLERP | Youlln | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.28259351853083153}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 6.061} |
HF Open LLM v2 | Youlln | Youlln/ECE-PRYMMAL0.5B-Youri | 46564b0a-1489-4c98-9e7b-20daf58c2f87 | 0.0.1 | hfopenllm_v2/Youlln_ECE-PRYMMAL0.5B-Youri/1762652579.963748 | 1762652579.9637492 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Youlln/ECE-PRYMMAL0.5B-Youri | Youlln/ECE-PRYMMAL0.5B-Youri | Youlln | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.1446317991817267}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 0.63} |
HF Open LLM v2 | Youlln | Youlln/ECE-PRYMMAL-0.5B-SLERP-V3 | d67c4d9a-d5cc-4b26-a439-44c87a299ee8 | 0.0.1 | hfopenllm_v2/Youlln_ECE-PRYMMAL-0.5B-SLERP-V3/1762652579.9626722 | 1762652579.9626722 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Youlln/ECE-PRYMMAL-0.5B-SLERP-V3 | Youlln/ECE-PRYMMAL-0.5B-SLERP-V3 | Youlln | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.16701352411601217}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 0.494} |
HF Open LLM v2 | Trappu | Trappu/Magnum-Picaro-0.7-v2-12b | 77871404-f2e3-46f9-8c48-808fb89442cc | 0.0.1 | hfopenllm_v2/Trappu_Magnum-Picaro-0.7-v2-12b/1762652579.920383 | 1762652579.920383 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Trappu/Magnum-Picaro-0.7-v2-12b | Trappu/Magnum-Picaro-0.7-v2-12b | Trappu | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.300278815764394}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "MistralForCausalLM", "params_billions": 12.248} |
HF Open LLM v2 | Trappu | Trappu/Nemo-Picaro-12B | 37534f85-e1ae-482b-89d0-480c4bbc50e7 | 0.0.1 | hfopenllm_v2/Trappu_Nemo-Picaro-12B/1762652579.92064 | 1762652579.92064 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Trappu/Nemo-Picaro-12B | Trappu/Nemo-Picaro-12B | Trappu | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2577139766929525}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "MistralForCausalLM", "params_billions": 12.248} |
HF Open LLM v2 | ClaudioItaly | ClaudioItaly/Albacus | 0be5437b-2489-4107-8c38-d0cd198a2d8c | 0.0.1 | hfopenllm_v2/ClaudioItaly_Albacus/1762652579.503804 | 1762652579.503805 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ClaudioItaly/Albacus | ClaudioItaly/Albacus | ClaudioItaly | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4667415790103592}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "MistralForCausalLM", "params_billions": 8.987} |
HF Open LLM v2 | ClaudioItaly | ClaudioItaly/intelligence-cod-rag-7b-v3 | 51559a6d-1262-41e2-8092-008dc8f53974 | 0.0.1 | hfopenllm_v2/ClaudioItaly_intelligence-cod-rag-7b-v3/1762652579.504531 | 1762652579.504531 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ClaudioItaly/intelligence-cod-rag-7b-v3 | ClaudioItaly/intelligence-cod-rag-7b-v3 | ClaudioItaly | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6897820006471718}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | ClaudioItaly | ClaudioItaly/Book-Gut12B | b2bdf337-9065-4a67-aa1a-5ba8751d5438 | 0.0.1 | hfopenllm_v2/ClaudioItaly_Book-Gut12B/1762652579.504094 | 1762652579.504095 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ClaudioItaly/Book-Gut12B | ClaudioItaly/Book-Gut12B | ClaudioItaly | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.39984685080032095}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "MistralForCausalLM", "params_billions": 12.248} |
HF Open LLM v2 | ClaudioItaly | ClaudioItaly/Evolutionstory-7B-v2.2 | e06c19ce-9247-473b-b5db-8686fee5e785 | 0.0.1 | hfopenllm_v2/ClaudioItaly_Evolutionstory-7B-v2.2/1762652579.504309 | 1762652579.504309 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ClaudioItaly/Evolutionstory-7B-v2.2 | ClaudioItaly/Evolutionstory-7B-v2.2 | ClaudioItaly | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4813794066410457}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "MistralForCausalLM", "params_billions": 7.242} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Eridanus-Opus-14B-r999 | 9dd4aa3f-98aa-4e51-bd21-c999b3990a64 | 0.0.1 | hfopenllm_v2/prithivMLmods_Eridanus-Opus-14B-r999/1762652580.461785 | 1762652580.461786 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Eridanus-Opus-14B-r999 | prithivMLmods/Eridanus-Opus-14B-r999 | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.638574537781974}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Calcium-Opus-14B-Elite2-R1 | 6eeb591b-aed2-4cdd-85bb-75011c9c5760 | 0.0.1 | hfopenllm_v2/prithivMLmods_Calcium-Opus-14B-Elite2-R1/1762652580.457828 | 1762652580.4578292 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Calcium-Opus-14B-Elite2-R1 | prithivMLmods/Calcium-Opus-14B-Elite2-R1 | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6325793339450436}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Viper-Coder-HybridMini-v1.3 | 1ca04810-a377-4390-944a-1a4ec91a7962 | 0.0.1 | hfopenllm_v2/prithivMLmods_Viper-Coder-HybridMini-v1.3/1762652580.4793081 | 1762652580.479309 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Viper-Coder-HybridMini-v1.3 | prithivMLmods/Viper-Coder-HybridMini-v1.3 | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.610372699991578}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/QwQ-LCoT-3B-Instruct | 87fc8696-17f1-4a86-8d0d-f5b124144384 | 0.0.1 | hfopenllm_v2/prithivMLmods_QwQ-LCoT-3B-Instruct/1762652580.47235 | 1762652580.472351 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/QwQ-LCoT-3B-Instruct | prithivMLmods/QwQ-LCoT-3B-Instruct | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4354424039326764}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.086} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Sombrero-Opus-14B-Sm5 | 41acaa59-3232-4c6c-be64-0acb38019405 | 0.0.1 | hfopenllm_v2/prithivMLmods_Sombrero-Opus-14B-Sm5/1762652580.476726 | 1762652580.476726 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Sombrero-Opus-14B-Sm5 | prithivMLmods/Sombrero-Opus-14B-Sm5 | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6851609285584471}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/COCO-7B-Instruct-1M | a7b425bc-9160-44ed-abf1-18c3b84cede2 | 0.0.1 | hfopenllm_v2/prithivMLmods_COCO-7B-Instruct-1M/1762652580.456335 | 1762652580.456337 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/COCO-7B-Instruct-1M | prithivMLmods/COCO-7B-Instruct-1M | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4743103853331383}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Tucana-Opus-14B-r999 | f24694aa-cfe7-4a58-9f9e-f02c3e51d198 | 0.0.1 | hfopenllm_v2/prithivMLmods_Tucana-Opus-14B-r999/1762652580.47826 | 1762652580.478261 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Tucana-Opus-14B-r999 | prithivMLmods/Tucana-Opus-14B-r999 | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.606725710005009}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/GWQ2b | 8a89468f-fe2f-4bc9-be99-c9619c605efc | 0.0.1 | hfopenllm_v2/prithivMLmods_GWQ2b/1762652580.462852 | 1762652580.4628532 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/GWQ2b | prithivMLmods/GWQ2b | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.41148707651254224}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Gemma2ForCausalLM", "params_billions": 2.614} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Calcium-Opus-14B-Elite4 | 380cd349-5309-40b8-b549-ac6d6d42331a | 0.0.1 | hfopenllm_v2/prithivMLmods_Calcium-Opus-14B-Elite4/1762652580.4582741 | 1762652580.458275 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Calcium-Opus-14B-Elite4 | prithivMLmods/Calcium-Opus-14B-Elite4 | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6111971790405014}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Primal-Opus-14B-Optimus-v1 | 94c21b1f-ce8d-4488-a1d1-2769d34f29ec | 0.0.1 | hfopenllm_v2/prithivMLmods_Primal-Opus-14B-Optimus-v1/1762652580.4716318 | 1762652580.471633 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Primal-Opus-14B-Optimus-v1 | prithivMLmods/Primal-Opus-14B-Optimus-v1 | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5013131823561483}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Llama-3.1-5B-Instruct | cdc5671a-e164-43b9-864c-808a9464e618 | 0.0.1 | hfopenllm_v2/prithivMLmods_Llama-3.1-5B-Instruct/1762652580.464407 | 1762652580.4644082 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Llama-3.1-5B-Instruct | prithivMLmods/Llama-3.1-5B-Instruct | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.14066011516110588}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 5.413} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/QwQ-LCoT1-Merged | 34aec318-6db4-4df6-9d6a-ad15e353f36a | 0.0.1 | hfopenllm_v2/prithivMLmods_QwQ-LCoT1-Merged/1762652580.47278 | 1762652580.472781 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/QwQ-LCoT1-Merged | prithivMLmods/QwQ-LCoT1-Merged | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.47513486438206187}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Elita-1 | d721cfe0-eb01-42fe-955a-bfd219c38917 | 0.0.1 | hfopenllm_v2/prithivMLmods_Elita-1/1762652580.4610822 | 1762652580.4610822 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Elita-1 | prithivMLmods/Elita-1 | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4906470387460826}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/GWQ-9B-Preview | 7735d88c-bdaa-4a12-9a99-a2dc5ec2ec66 | 0.0.1 | hfopenllm_v2/prithivMLmods_GWQ-9B-Preview/1762652580.4624221 | 1762652580.462423 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/GWQ-9B-Preview | prithivMLmods/GWQ-9B-Preview | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5065836425129767}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Gemma2ForCausalLM", "params_billions": 9.242} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/QwQ-MathOct-7B | e703fed7-cf06-4caa-b78f-3e398b437671 | 0.0.1 | hfopenllm_v2/prithivMLmods_QwQ-MathOct-7B/1762652580.473228 | 1762652580.4732292 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/QwQ-MathOct-7B | prithivMLmods/QwQ-MathOct-7B | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4684404047926169}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/QwQ-R1-Distill-7B-CoT | a723f173-af0e-4172-a43c-278ccbacac18 | 0.0.1 | hfopenllm_v2/prithivMLmods_QwQ-R1-Distill-7B-CoT/1762652580.473804 | 1762652580.473805 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/QwQ-R1-Distill-7B-CoT | prithivMLmods/QwQ-R1-Distill-7B-CoT | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3500378994401522}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Viper-Coder-v0.1 | 4d801ab4-0c2d-445a-beb6-4de824618e75 | 0.0.1 | hfopenllm_v2/prithivMLmods_Viper-Coder-v0.1/1762652580.479637 | 1762652580.479639 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Viper-Coder-v0.1 | prithivMLmods/Viper-Coder-v0.1 | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5521460835028835}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/GWQ-9B-Preview2 | 5c534761-19b5-4111-b1f5-c2fc3e121b24 | 0.0.1 | hfopenllm_v2/prithivMLmods_GWQ-9B-Preview2/1762652580.462637 | 1762652580.4626381 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/GWQ-9B-Preview2 | prithivMLmods/GWQ-9B-Preview2 | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5208967761096114}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Gemma2ForCausalLM", "params_billions": 9.242} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Megatron-Opus-14B-2.0 | c6dd1b78-b487-4197-8a66-c364487ff6fb | 0.0.1 | hfopenllm_v2/prithivMLmods_Megatron-Opus-14B-2.0/1762652580.467613 | 1762652580.467613 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Megatron-Opus-14B-2.0 | prithivMLmods/Megatron-Opus-14B-2.0 | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6693739278447852}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Cygnus-II-14B | 120d9ddf-0e6e-4fb9-9250-019d1fbfdc28 | 0.0.1 | hfopenllm_v2/prithivMLmods_Cygnus-II-14B/1762652580.4597278 | 1762652580.459729 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Cygnus-II-14B | prithivMLmods/Cygnus-II-14B | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6184412913292286}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Sqweeks-7B-Instruct | e0eaf433-d842-47c2-b47f-9e0ddd95df72 | 0.0.1 | hfopenllm_v2/prithivMLmods_Sqweeks-7B-Instruct/1762652580.476933 | 1762652580.476934 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Sqweeks-7B-Instruct | prithivMLmods/Sqweeks-7B-Instruct | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.21579852568961466}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Condor-Opus-14B-Exp | 7b9f72e6-0280-46ba-8645-ab8dcb9ddf4d | 0.0.1 | hfopenllm_v2/prithivMLmods_Condor-Opus-14B-Exp/1762652580.4595032 | 1762652580.4595041 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Condor-Opus-14B-Exp | prithivMLmods/Condor-Opus-14B-Exp | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.40431831983581346}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/QwQ-R1-Distill-1.5B-CoT | 8dd67de7-0d3b-4359-b390-d90c609dea5a | 0.0.1 | hfopenllm_v2/prithivMLmods_QwQ-R1-Distill-1.5B-CoT/1762652580.4734771 | 1762652580.473483 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/QwQ-R1-Distill-1.5B-CoT | prithivMLmods/QwQ-R1-Distill-1.5B-CoT | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.21939564799177294}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.777} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Porpoise-Opus-14B-Exp | 79832ae5-0a80-4e46-8175-4baa240dc4d9 | 0.0.1 | hfopenllm_v2/prithivMLmods_Porpoise-Opus-14B-Exp/1762652580.47141 | 1762652580.471411 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Porpoise-Opus-14B-Exp | prithivMLmods/Porpoise-Opus-14B-Exp | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7098155117310957}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Viper-Coder-v1.1 | cc8e5b55-5b48-40c3-9e30-3c1740bc7da2 | 0.0.1 | hfopenllm_v2/prithivMLmods_Viper-Coder-v1.1/1762652580.479969 | 1762652580.47997 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Viper-Coder-v1.1 | prithivMLmods/Viper-Coder-v1.1 | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.443236168920686}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Viper-Coder-Hybrid-v1.3 | 17167e2a-1f42-4ea9-a947-8749259738a8 | 0.0.1 | hfopenllm_v2/prithivMLmods_Viper-Coder-Hybrid-v1.3/1762652580.4790971 | 1762652580.479098 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Viper-Coder-Hybrid-v1.3 | prithivMLmods/Viper-Coder-Hybrid-v1.3 | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7554776880898239}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Viper-Coder-Hybrid-v1.2 | 1f235238-05e0-4c76-b136-0bf0cf470ba2 | 0.0.1 | hfopenllm_v2/prithivMLmods_Viper-Coder-Hybrid-v1.2/1762652580.4788852 | 1762652580.478886 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Viper-Coder-Hybrid-v1.2 | prithivMLmods/Viper-Coder-Hybrid-v1.2 | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6735705705306365}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Sombrero-Opus-14B-Sm1 | 5ce1b22c-7daa-4714-a774-d7d509fa869f | 0.0.1 | hfopenllm_v2/prithivMLmods_Sombrero-Opus-14B-Sm1/1762652580.476064 | 1762652580.476065 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Sombrero-Opus-14B-Sm1 | prithivMLmods/Sombrero-Opus-14B-Sm1 | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3812872068334242}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Viper-Coder-v1.7-Vsm6 | 14b789c6-8b7f-4292-8ced-279e7ee856a5 | 0.0.1 | hfopenllm_v2/prithivMLmods_Viper-Coder-v1.7-Vsm6/1762652580.480439 | 1762652580.4804401 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Viper-Coder-v1.7-Vsm6 | prithivMLmods/Viper-Coder-v1.7-Vsm6 | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5003889679384035}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Pegasus-Opus-14B-Exp | 5cc40900-fe74-469a-99c0-74e998b0e316 | 0.0.1 | hfopenllm_v2/prithivMLmods_Pegasus-Opus-14B-Exp/1762652580.469298 | 1762652580.4692988 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Pegasus-Opus-14B-Exp | prithivMLmods/Pegasus-Opus-14B-Exp | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6981752860188744}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Tadpole-Opus-14B-Exp | 0faf87d0-2b35-4256-acd9-4fe57f574d06 | 0.0.1 | hfopenllm_v2/prithivMLmods_Tadpole-Opus-14B-Exp/1762652580.477141 | 1762652580.477142 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Tadpole-Opus-14B-Exp | prithivMLmods/Tadpole-Opus-14B-Exp | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5749522378400422}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Llama-3.1-8B-Open-SFT | 37276848-95fe-4403-896d-bf9fafbff04e | 0.0.1 | hfopenllm_v2/prithivMLmods_Llama-3.1-8B-Open-SFT/1762652580.464622 | 1762652580.4646232 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Llama-3.1-8B-Open-SFT | prithivMLmods/Llama-3.1-8B-Open-SFT | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4122616878770551}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Blaze-14B-xElite | c4041b70-acce-4088-a3b9-299d4424e240 | 0.0.1 | hfopenllm_v2/prithivMLmods_Blaze-14B-xElite/1762652580.456049 | 1762652580.45605 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Blaze-14B-xElite | prithivMLmods/Blaze-14B-xElite | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.03632029681245762}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Magellanic-Opus-14B-Exp | 07236482-8709-4aa8-8e63-762b2f591b2a | 0.0.1 | hfopenllm_v2/prithivMLmods_Magellanic-Opus-14B-Exp/1762652580.466739 | 1762652580.466739 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Magellanic-Opus-14B-Exp | prithivMLmods/Magellanic-Opus-14B-Exp | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6866347956754744}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Sombrero-Opus-14B-Elite5 | 3b12518e-ef16-4a72-89bb-071802ca636c | 0.0.1 | hfopenllm_v2/prithivMLmods_Sombrero-Opus-14B-Elite5/1762652580.4753642 | 1762652580.4753652 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Sombrero-Opus-14B-Elite5 | prithivMLmods/Sombrero-Opus-14B-Elite5 | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7880756393037142}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Megatron-Corpus-14B-Exp | f71c4189-288e-4c6d-978c-d793ca57fedf | 0.0.1 | hfopenllm_v2/prithivMLmods_Megatron-Corpus-14B-Exp/1762652580.46718 | 1762652580.46718 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Megatron-Corpus-14B-Exp | prithivMLmods/Megatron-Corpus-14B-Exp | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.49826571275327247}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Megatron-Opus-14B-Exp | ac65fabb-07d5-457d-844e-19aecf2b18e0 | 0.0.1 | hfopenllm_v2/prithivMLmods_Megatron-Opus-14B-Exp/1762652580.46803 | 1762652580.468031 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Megatron-Opus-14B-Exp | prithivMLmods/Megatron-Opus-14B-Exp | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4979410187192206}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Calcium-Opus-14B-Elite-Stock | 74d10ea5-3d08-4bb2-9246-5e053eb20fea | 0.0.1 | hfopenllm_v2/prithivMLmods_Calcium-Opus-14B-Elite-Stock/1762652580.457346 | 1762652580.4573472 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Calcium-Opus-14B-Elite-Stock | prithivMLmods/Calcium-Opus-14B-Elite-Stock | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.614294516327788}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Codepy-Deepthink-3B | adb6f7d5-db2f-49b1-aab4-1fd3dfcb7e34 | 0.0.1 | hfopenllm_v2/prithivMLmods_Codepy-Deepthink-3B/1762652580.458943 | 1762652580.458944 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Codepy-Deepthink-3B | prithivMLmods/Codepy-Deepthink-3B | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.43271962836385236}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 3.213} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Omni-Reasoner-Merged | 8043bcfd-1a4c-45c5-aca4-f23f02bd5562 | 0.0.1 | hfopenllm_v2/prithivMLmods_Omni-Reasoner-Merged/1762652580.468864 | 1762652580.468864 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Omni-Reasoner-Merged | prithivMLmods/Omni-Reasoner-Merged | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4599473840520929}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Taurus-Opus-7B | 01448351-5f76-4329-9bfd-4124e29de920 | 0.0.1 | hfopenllm_v2/prithivMLmods_Taurus-Opus-7B/1762652580.477352 | 1762652580.4773529 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Taurus-Opus-7B | prithivMLmods/Taurus-Opus-7B | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.42232831110342783}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 7.456} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Evac-Opus-14B-Exp | 26c88cb2-7c81-4b0c-8493-baa9d8f7b1a0 | 0.0.1 | hfopenllm_v2/prithivMLmods_Evac-Opus-14B-Exp/1762652580.461996 | 1762652580.461997 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Evac-Opus-14B-Exp | prithivMLmods/Evac-Opus-14B-Exp | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5916135852870383}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/LwQ-Reasoner-10B | d22507ab-2601-4bf0-a8d8-b456102c85af | 0.0.1 | hfopenllm_v2/prithivMLmods_LwQ-Reasoner-10B/1762652580.466471 | 1762652580.466471 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/LwQ-Reasoner-10B | prithivMLmods/LwQ-Reasoner-10B | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.29413400887423147}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 10.306} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Bellatrix-Tiny-1.5B-R1 | 4e78f82e-aa31-414c-9c59-9c8e318fff17 | 0.0.1 | hfopenllm_v2/prithivMLmods_Bellatrix-Tiny-1.5B-R1/1762652580.455581 | 1762652580.455582 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Bellatrix-Tiny-1.5B-R1 | prithivMLmods/Bellatrix-Tiny-1.5B-R1 | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.33522498082864577}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.544} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Jolt-v0.1 | d96ef95b-ca39-4e33-9f6b-a4faa71e5009 | 0.0.1 | hfopenllm_v2/prithivMLmods_Jolt-v0.1/1762652580.463978 | 1762652580.463979 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Jolt-v0.1 | prithivMLmods/Jolt-v0.1 | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5092066827129793}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Calcium-Opus-20B-v1 | 9c414577-7f2d-487a-9f2b-7675e0532ac1 | 0.0.1 | hfopenllm_v2/prithivMLmods_Calcium-Opus-20B-v1/1762652580.458724 | 1762652580.4587252 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Calcium-Opus-20B-v1 | prithivMLmods/Calcium-Opus-20B-v1 | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3092716215197897}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 19.173} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Volans-Opus-14B-Exp | 735058a7-c22e-42a7-94f5-d7e2459848b3 | 0.0.1 | hfopenllm_v2/prithivMLmods_Volans-Opus-14B-Exp/1762652580.480862 | 1762652580.480863 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Volans-Opus-14B-Exp | prithivMLmods/Volans-Opus-14B-Exp | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5867675545330834}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/QwQ-LCoT2-7B-Instruct | 8c05d496-c21f-4a70-b312-1c1ba37d877a | 0.0.1 | hfopenllm_v2/prithivMLmods_QwQ-LCoT2-7B-Instruct/1762652580.473001 | 1762652580.473002 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/QwQ-LCoT2-7B-Instruct | prithivMLmods/QwQ-LCoT2-7B-Instruct | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5561177675235043}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/WebMind-7B-v0.1 | 00637ba6-99e5-4940-94ab-a620ff248ca1 | 0.0.1 | hfopenllm_v2/prithivMLmods_WebMind-7B-v0.1/1762652580.481075 | 1762652580.481076 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/WebMind-7B-v0.1 | prithivMLmods/WebMind-7B-v0.1 | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5278161943642867}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Calcium-Opus-14B-Elite | 487e1883-01c6-4714-9447-67837c78655b | 0.0.1 | hfopenllm_v2/prithivMLmods_Calcium-Opus-14B-Elite/1762652580.456628 | 1762652580.456629 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Calcium-Opus-14B-Elite | prithivMLmods/Calcium-Opus-14B-Elite | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6051521075191603}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Calcium-Opus-14B-Elite | 79bccc27-27a0-4194-9c46-5e89b0f21b9e | 0.0.1 | hfopenllm_v2/prithivMLmods_Calcium-Opus-14B-Elite/1762652580.456884 | 1762652580.456885 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Calcium-Opus-14B-Elite | prithivMLmods/Calcium-Opus-14B-Elite | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6063511482865463}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Deepthink-Reasoning-7B | 10d2454a-ae69-43b6-962a-77102645ed56 | 0.0.1 | hfopenllm_v2/prithivMLmods_Deepthink-Reasoning-7B/1762652580.460416 | 1762652580.460416 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Deepthink-Reasoning-7B | prithivMLmods/Deepthink-Reasoning-7B | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.48400244684104843}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Calcium-Opus-14B-Elite2 | 689d38cd-898e-43ec-92e8-238cefac6776 | 0.0.1 | hfopenllm_v2/prithivMLmods_Calcium-Opus-14B-Elite2/1762652580.457599 | 1762652580.4576 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Calcium-Opus-14B-Elite2 | prithivMLmods/Calcium-Opus-14B-Elite2 | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6176168122803052}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Tulu-MathLingo-8B | fa0776bd-e95e-4d54-9004-82dff09307b8 | 0.0.1 | hfopenllm_v2/prithivMLmods_Tulu-MathLingo-8B/1762652580.478472 | 1762652580.478473 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Tulu-MathLingo-8B | prithivMLmods/Tulu-MathLingo-8B | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5589402784611497}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/QwQ-LCoT-14B-Conversational | 71114773-e285-4666-ae7f-5fd7c9084104 | 0.0.1 | hfopenllm_v2/prithivMLmods_QwQ-LCoT-14B-Conversational/1762652580.472128 | 1762652580.472129 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/QwQ-LCoT-14B-Conversational | prithivMLmods/QwQ-LCoT-14B-Conversational | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4047427492386867}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Megatron-Opus-14B-2.1 | 002ba3ef-6ac7-4bdf-bd7d-42ef16aa7cc9 | 0.0.1 | hfopenllm_v2/prithivMLmods_Megatron-Opus-14B-2.1/1762652580.4678242 | 1762652580.467825 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Megatron-Opus-14B-2.1 | prithivMLmods/Megatron-Opus-14B-2.1 | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.02455484780382718}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 14.66} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Qwen2.5-1.5B-DeepSeek-R1-Instruct | b1430f51-cd48-4feb-8d94-c2a9a60f00bc | 0.0.1 | hfopenllm_v2/prithivMLmods_Qwen2.5-1.5B-DeepSeek-R1-Instruct/1762652580.474298 | 1762652580.474299 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Qwen2.5-1.5B-DeepSeek-R1-Instruct | prithivMLmods/Qwen2.5-1.5B-DeepSeek-R1-Instruct | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.13968603305895025}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.777} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Triangulum-v2-10B | 00f8547d-4bb9-4510-a29c-c37376c274c8 | 0.0.1 | hfopenllm_v2/prithivMLmods_Triangulum-v2-10B/1762652580.478046 | 1762652580.478047 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Triangulum-v2-10B | prithivMLmods/Triangulum-v2-10B | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6705231009277606}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 10.306} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Megatron-Opus-7B-Exp | 94536d01-2de8-4305-83aa-2673a226ab64 | 0.0.1 | hfopenllm_v2/prithivMLmods_Megatron-Opus-7B-Exp/1762652580.468447 | 1762652580.468448 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Megatron-Opus-7B-Exp | prithivMLmods/Megatron-Opus-7B-Exp | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6017300761978217}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 7.456} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Calcium-Opus-14B-Elite3 | 2edb276e-86c5-4bde-a696-4f68fb659b4e | 0.0.1 | hfopenllm_v2/prithivMLmods_Calcium-Opus-14B-Elite3/1762652580.458055 | 1762652580.458056 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Calcium-Opus-14B-Elite3 | prithivMLmods/Calcium-Opus-14B-Elite3 | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5428285837134359}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Epimetheus-14B-Axo | dc3aed7d-01e0-46cc-85f6-2a06cf6b6edc | 0.0.1 | hfopenllm_v2/prithivMLmods_Epimetheus-14B-Axo/1762652580.461361 | 1762652580.461361 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Epimetheus-14B-Axo | prithivMLmods/Epimetheus-14B-Axo | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.554643900406477}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Deepthink-Reasoning-14B | 343e0d36-5470-4865-aeeb-a9963b38f90a | 0.0.1 | hfopenllm_v2/prithivMLmods_Deepthink-Reasoning-14B/1762652580.460205 | 1762652580.460206 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Deepthink-Reasoning-14B | prithivMLmods/Deepthink-Reasoning-14B | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5423542866261519}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/FastThink-0.5B-Tiny | b731eb88-e0ed-4edb-bed3-2d82bbce43bb | 0.0.1 | hfopenllm_v2/prithivMLmods_FastThink-0.5B-Tiny/1762652580.462207 | 1762652580.462208 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/FastThink-0.5B-Tiny | prithivMLmods/FastThink-0.5B-Tiny | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.25798880304259364}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 0.494} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Sombrero-Opus-14B-Sm4 | 79a8057c-0791-42d6-adef-924a9cff0917 | 0.0.1 | hfopenllm_v2/prithivMLmods_Sombrero-Opus-14B-Sm4/1762652580.476516 | 1762652580.4765172 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Sombrero-Opus-14B-Sm4 | prithivMLmods/Sombrero-Opus-14B-Sm4 | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4346932804957513}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Viper-Coder-7B-Elite14 | 06bc6426-310b-40ac-bbeb-0460215b8981 | 0.0.1 | hfopenllm_v2/prithivMLmods_Viper-Coder-7B-Elite14/1762652580.4786801 | 1762652580.478681 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Viper-Coder-7B-Elite14 | prithivMLmods/Viper-Coder-7B-Elite14 | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.14882844186757802}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/QwQ-LCoT-7B-Instruct | 23f056f6-67dd-41fd-b1af-a1cf9abf784c | 0.0.1 | hfopenllm_v2/prithivMLmods_QwQ-LCoT-7B-Instruct/1762652580.4725702 | 1762652580.472571 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/QwQ-LCoT-7B-Instruct | prithivMLmods/QwQ-LCoT-7B-Instruct | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4986901421561457}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Triangulum-5B | 7d8850c3-61b2-41c3-a01b-8e23511558f6 | 0.0.1 | hfopenllm_v2/prithivMLmods_Triangulum-5B/1762652580.477782 | 1762652580.477782 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Triangulum-5B | prithivMLmods/Triangulum-5B | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.1283206336963701}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 5.413} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Dinobot-Opus-14B-Exp | 6ed13eae-92ee-4fa7-9ed8-d9f21d6de48c | 0.0.1 | hfopenllm_v2/prithivMLmods_Dinobot-Opus-14B-Exp/1762652580.460635 | 1762652580.460635 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Dinobot-Opus-14B-Exp | prithivMLmods/Dinobot-Opus-14B-Exp | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.8239958864701216}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Sombrero-Opus-14B-Sm2 | 6a1519e9-062b-454f-97cb-e57454f74e9a | 0.0.1 | hfopenllm_v2/prithivMLmods_Sombrero-Opus-14B-Sm2/1762652580.476301 | 1762652580.4763021 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Sombrero-Opus-14B-Sm2 | prithivMLmods/Sombrero-Opus-14B-Sm2 | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4272242095417935}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Viper-OneCoder-UIGEN | 5d22f1b7-c062-4c46-8da1-4c895fcf8b9c | 0.0.1 | hfopenllm_v2/prithivMLmods_Viper-OneCoder-UIGEN/1762652580.480654 | 1762652580.480654 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Viper-OneCoder-UIGEN | prithivMLmods/Viper-OneCoder-UIGEN | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4691895282295421}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Sombrero-Opus-14B-Elite6 | 0d354980-9f24-4b79-afb7-a7e6f52e8131 | 0.0.1 | hfopenllm_v2/prithivMLmods_Sombrero-Opus-14B-Elite6/1762652580.47572 | 1762652580.475722 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Sombrero-Opus-14B-Elite6 | prithivMLmods/Sombrero-Opus-14B-Elite6 | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7226049105262924}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Elita-0.1-Distilled-R1-abliterated | 9b63b3ad-568f-4f15-9cc6-36049ac89727 | 0.0.1 | hfopenllm_v2/prithivMLmods_Elita-0.1-Distilled-R1-abliterated/1762652580.460851 | 1762652580.460852 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Elita-0.1-Distilled-R1-abliterated | prithivMLmods/Elita-0.1-Distilled-R1-abliterated | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.35423454212600347}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Megatron-Corpus-14B-Exp.v2 | f50a6538-057e-4e57-af79-ba3a5b7121cb | 0.0.1 | hfopenllm_v2/prithivMLmods_Megatron-Corpus-14B-Exp.v2/1762652580.467396 | 1762652580.4673972 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Megatron-Corpus-14B-Exp.v2 | prithivMLmods/Megatron-Corpus-14B-Exp.v2 | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.48704991644392437}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Megatron-Opus-14B-Stock | 8a0828ef-56a0-4c2b-bc61-f955c56b7700 | 0.0.1 | hfopenllm_v2/prithivMLmods_Megatron-Opus-14B-Stock/1762652580.468238 | 1762652580.468238 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Megatron-Opus-14B-Stock | prithivMLmods/Megatron-Opus-14B-Stock | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5173750094194515}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Viper-Coder-v1.6-r999 | ff5bb366-3692-441c-8e8f-8c23c5143aae | 0.0.1 | hfopenllm_v2/prithivMLmods_Viper-Coder-v1.6-r999/1762652580.480214 | 1762652580.480215 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Viper-Coder-v1.6-r999 | prithivMLmods/Viper-Coder-v1.6-r999 | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4432860366050967}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/SmolLM2-CoT-360M | 8ce4dea8-d674-4b95-b025-0c6ab60f6544 | 0.0.1 | hfopenllm_v2/prithivMLmods_SmolLM2-CoT-360M/1762652580.475137 | 1762652580.475137 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/SmolLM2-CoT-360M | prithivMLmods/SmolLM2-CoT-360M | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.22156877086131466}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 0.362} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Primal-Opus-14B-Optimus-v2 | 80407172-765a-4aa9-b189-a322150b1a7b | 0.0.1 | hfopenllm_v2/prithivMLmods_Primal-Opus-14B-Optimus-v2/1762652580.471854 | 1762652580.471854 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Primal-Opus-14B-Optimus-v2 | prithivMLmods/Primal-Opus-14B-Optimus-v2 | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6403730989330532}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Gaea-Opus-14B-Exp | f75e27a8-00e8-4473-b7ed-3fffa131ee0a | 0.0.1 | hfopenllm_v2/prithivMLmods_Gaea-Opus-14B-Exp/1762652580.463063 | 1762652580.463063 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Gaea-Opus-14B-Exp | prithivMLmods/Gaea-Opus-14B-Exp | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5956351369920699}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Triangulum-10B | ee5ad026-8df4-41c0-9158-3759d4a3ef02 | 0.0.1 | hfopenllm_v2/prithivMLmods_Triangulum-10B/1762652580.477568 | 1762652580.477569 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Triangulum-10B | prithivMLmods/Triangulum-10B | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3229353670483207}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 10.306} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Lacerta-Opus-14B-Elite8 | 21b53896-3b7b-470a-a49f-4b2cb4e6adef | 0.0.1 | hfopenllm_v2/prithivMLmods_Lacerta-Opus-14B-Elite8/1762652580.464193 | 1762652580.464193 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Lacerta-Opus-14B-Elite8 | prithivMLmods/Lacerta-Opus-14B-Elite8 | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.614144913274556}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Coma-II-14B | 785e4cde-ec97-4e36-8ee3-3fb4c2543901 | 0.0.1 | hfopenllm_v2/prithivMLmods_Coma-II-14B/1762652580.4591591 | 1762652580.45916 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Coma-II-14B | prithivMLmods/Coma-II-14B | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.416832892281369}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Omni-Reasoner3-Merged | 972cdfdc-1c7f-4900-8acf-d5eed0ccc968 | 0.0.1 | hfopenllm_v2/prithivMLmods_Omni-Reasoner3-Merged/1762652580.46908 | 1762652580.4690812 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Omni-Reasoner3-Merged | prithivMLmods/Omni-Reasoner3-Merged | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.493469549683728}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 3.213} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Bellatrix-Tiny-1B-v2 | 715be726-e0e3-4589-91cf-85e41dbcbf8a | 0.0.1 | hfopenllm_v2/prithivMLmods_Bellatrix-Tiny-1B-v2/1762652580.4558249 | 1762652580.4558249 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Bellatrix-Tiny-1B-v2 | prithivMLmods/Bellatrix-Tiny-1B-v2 | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.15095169705270903}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 1.236} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Bellatrix-1.5B-xElite | 7f1c6c88-823f-4597-9794-bf05c076d4d3 | 0.0.1 | hfopenllm_v2/prithivMLmods_Bellatrix-1.5B-xElite/1762652580.4551811 | 1762652580.455182 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Bellatrix-1.5B-xElite | prithivMLmods/Bellatrix-1.5B-xElite | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.1964144026737944}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.777} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Gauss-Opus-14B-R999 | e8596a17-9e5d-4ac5-9968-44d302628c31 | 0.0.1 | hfopenllm_v2/prithivMLmods_Gauss-Opus-14B-R999/1762652580.463757 | 1762652580.463758 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Gauss-Opus-14B-R999 | prithivMLmods/Gauss-Opus-14B-R999 | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.39065457430728245}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/LwQ-10B-Instruct | df470b21-0d55-4d28-af25-75908799a0cc | 0.0.1 | hfopenllm_v2/prithivMLmods_LwQ-10B-Instruct/1762652580.4662411 | 1762652580.466242 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/LwQ-10B-Instruct | prithivMLmods/LwQ-10B-Instruct | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3934770852449279}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 10.732} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Calcium-Opus-14B-Elite-1M | 0c883e9c-4cec-4c65-aa10-96e0d0de2e1f | 0.0.1 | hfopenllm_v2/prithivMLmods_Calcium-Opus-14B-Elite-1M/1762652580.457102 | 1762652580.457103 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Calcium-Opus-14B-Elite-1M | prithivMLmods/Calcium-Opus-14B-Elite-1M | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5612884923115112}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | prithivMLmods | prithivMLmods/Messier-Opus-14B-Elite7 | e2ac8e52-8326-496a-b904-ca0e48190b3b | 0.0.1 | hfopenllm_v2/prithivMLmods_Messier-Opus-14B-Elite7/1762652580.4686568 | 1762652580.468658 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | prithivMLmods/Messier-Opus-14B-Elite7 | prithivMLmods/Messier-Opus-14B-Elite7 | prithivMLmods | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7113392465325337}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
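The evaluation_results column in each row above is a JSON list of per-benchmark records (evaluation_name, metric_config, score_details), truncated in this dump. Below is a minimal Python sketch of how one such record could be parsed once the full JSON is available; the sample value is copied from the QwQ-LCoT2-7B-Instruct row above, and the normalization step is an illustrative assumption based on the declared min_score/max_score fields, not part of the source schema.

import json

# Hypothetical complete value for one row's evaluation_results field;
# real rows in this dump are truncated with "...".
raw = ('[{"evaluation_name": "IFEval", '
      '"metric_config": {"evaluation_description": "Accuracy on IFEval", '
      '"lower_is_better": false, "score_type": "continuous", '
      '"min_score": 0, "max_score": 1}, '
      '"score_details": {"score": 0.5561177675235043}}]')

for result in json.loads(raw):
    cfg = result["metric_config"]
    score = result["score_details"]["score"]
    # Normalize into [0, 1] using the record's own declared score bounds
    # (an assumption for illustration; IFEval scores here are already 0-1).
    span = cfg["max_score"] - cfg["min_score"]
    normalized = (score - cfg["min_score"]) / span if span else score
    print(f'{result["evaluation_name"]}: {normalized:.4f}')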