_leaderboard | _developer | _model | _uuid | schema_version | evaluation_id | retrieved_timestamp | source_data | evaluation_source_name | evaluation_source_type | source_organization_name | source_organization_url | source_organization_logo_url | evaluator_relationship | model_name | model_id | model_developer | model_inference_platform | evaluation_results | additional_details |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
HF Open LLM v2 | xukp20 | xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_2b-table | 87bcbd57-2d0e-4d77-9f1e-3ec0199c8452 | 0.0.1 | hfopenllm_v2/xukp20_Llama-3-8B-Instruct-SPPO-Iter3_gp_2b-table/1762652580.6007009 | 1762652580.6007009 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_2b-table | xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_2b-table | xukp20 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6023794642659255}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | xukp20 | xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_8b-table-0.002 | d1445003-91ea-4b2b-ab38-a47a6392620e | 0.0.1 | hfopenllm_v2/xukp20_Llama-3-8B-Instruct-SPPO-score-Iter3_bt_8b-table-0.002/1762652580.601484 | 1762652580.6014872 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_8b-table-0.002 | xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_8b-table-0.002 | xukp20 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6851609285584471}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | xukp20 | xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_2b-table | a6996896-1464-4b55-a784-28deb06150c8 | 0.0.1 | hfopenllm_v2/xukp20_Llama-3-8B-Instruct-SPPO-Iter3_bt_2b-table/1762652580.600162 | 1762652580.600162 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_2b-table | xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_2b-table | xukp20 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.575601625908146}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | xukp20 | xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_gp_2b-table-0.001 | 4d9c2e04-caef-43f5-9ce1-40517341ff40 | 0.0.1 | hfopenllm_v2/xukp20_Llama-3-8B-Instruct-SPPO-score-Iter3_gp_2b-table-0.001/1762652580.601857 | 1762652580.6018581 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_gp_2b-table-0.001 | xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_gp_2b-table-0.001 | xukp20 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5482242671666733}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | xukp20 | xukp20/llama-3-8b-instruct-sppo-iter1-gp-2b-tau01-table | 5d53b35f-6bff-493c-805d-b45517ae0e2b | 0.0.1 | hfopenllm_v2/xukp20_llama-3-8b-instruct-sppo-iter1-gp-2b-tau01-table/1762652580.602122 | 1762652580.602124 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | xukp20/llama-3-8b-instruct-sppo-iter1-gp-2b-tau01-table | xukp20/llama-3-8b-instruct-sppo-iter1-gp-2b-tau01-table | xukp20 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6900069593124022}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | xukp20 | xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_2b-table-0.001 | d758e9a9-c316-4de5-bdb7-d0ec7401fa12 | 0.0.1 | hfopenllm_v2/xukp20_Llama-3-8B-Instruct-SPPO-score-Iter3_bt_2b-table-0.001/1762652580.601125 | 1762652580.601126 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_2b-table-0.001 | xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_2b-table-0.001 | xukp20 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5336363072203975}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | xukp20 | xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_8b-table | d7125235-7b17-4a90-9125-c993646cd7c8 | 0.0.1 | hfopenllm_v2/xukp20_Llama-3-8B-Instruct-SPPO-Iter3_gp_8b-table/1762652580.600907 | 1762652580.600908 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_8b-table | xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_8b-table | xukp20 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6620300801872365}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | AI-MO | AI-MO/NuminaMath-7B-TIR | 0ffa78d4-fe45-4639-bcd1-eb19ab168a35 | 0.0.1 | hfopenllm_v2/AI-MO_NuminaMath-7B-TIR/1762652579.474566 | 1762652579.474567 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | AI-MO/NuminaMath-7B-TIR | AI-MO/NuminaMath-7B-TIR | AI-MO | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.27562423259174545}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 6.91} |
HF Open LLM v2 | AI-MO | AI-MO/NuminaMath-7B-CoT | 9ac2ba3c-9a21-46b2-a21c-4909cfae6315 | 0.0.1 | hfopenllm_v2/AI-MO_NuminaMath-7B-CoT/1762652579.474318 | 1762652579.4743192 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | AI-MO/NuminaMath-7B-CoT | AI-MO/NuminaMath-7B-CoT | AI-MO | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2688544173903022}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 6.91} |
HF Open LLM v2 | NAPS-ai | NAPS-ai/naps-llama-3_1-8b-instruct-v0.3 | d0ce5c14-28fa-4fde-901e-6670db6943de | 0.0.1 | hfopenllm_v2/NAPS-ai_naps-llama-3_1-8b-instruct-v0.3/1762652579.765912 | 1762652579.765913 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | NAPS-ai/naps-llama-3_1-8b-instruct-v0.3 | NAPS-ai/naps-llama-3_1-8b-instruct-v0.3 | NAPS-ai | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5390818583580456}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | NAPS-ai | NAPS-ai/naps-llama-3_1-instruct-v0.5.0 | 5553fa1d-6bf9-469d-b870-590dd4965209 | 0.0.1 | hfopenllm_v2/NAPS-ai_naps-llama-3_1-instruct-v0.5.0/1762652579.766381 | 1762652579.766382 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | NAPS-ai/naps-llama-3_1-instruct-v0.5.0 | NAPS-ai/naps-llama-3_1-instruct-v0.5.0 | NAPS-ai | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5020124381086628}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | NAPS-ai | NAPS-ai/naps-llama-3_1-8b-instruct-v0.4 | 467a9428-e85d-489d-be59-91842b389732 | 0.0.1 | hfopenllm_v2/NAPS-ai_naps-llama-3_1-8b-instruct-v0.4/1762652579.766172 | 1762652579.766173 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | NAPS-ai/naps-llama-3_1-8b-instruct-v0.4 | NAPS-ai/naps-llama-3_1-8b-instruct-v0.4 | NAPS-ai | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7344202272193336}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | Kquant03 | Kquant03/CognitiveFusion2-4x7B-BF16 | 66f84aee-5d79-4fec-9fff-799ac874d165 | 0.0.1 | hfopenllm_v2/Kquant03_CognitiveFusion2-4x7B-BF16/1762652579.701032 | 1762652579.7010329 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Kquant03/CognitiveFusion2-4x7B-BF16 | Kquant03/CognitiveFusion2-4x7B-BF16 | Kquant03 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.35665700341759865}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "MixtralForCausalLM", "params_billions": 24.154} |
HF Open LLM v2 | Kquant03 | Kquant03/L3-Pneuma-8B | 5420d88b-bc26-4d04-9812-ffce8a3564e6 | 0.0.1 | hfopenllm_v2/Kquant03_L3-Pneuma-8B/1762652579.701272 | 1762652579.7012732 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Kquant03/L3-Pneuma-8B | Kquant03/L3-Pneuma-8B | Kquant03 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2374056392593873}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | AbacusResearch | AbacusResearch/Jallabi-34B | 76397277-901a-4ad0-9dae-0351ca875ec6 | 0.0.1 | hfopenllm_v2/AbacusResearch_Jallabi-34B/1762652579.477037 | 1762652579.4770381 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | AbacusResearch/Jallabi-34B | AbacusResearch/Jallabi-34B | AbacusResearch | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3528604103777976}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 34.389} |
HF Open LLM v2 | Eric111 | Eric111/CatunaMayo | 9c2ab331-44f5-4306-a57c-5ddb0154ba63 | 0.0.1 | hfopenllm_v2/Eric111_CatunaMayo/1762652579.613048 | 1762652579.613049 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Eric111/CatunaMayo | Eric111/CatunaMayo | Eric111 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4074156571231}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH", ... | {"precision": "bfloat16", "architecture": "MistralForCausalLM", "params_billions": 7.242} |
HF Open LLM v2 | Eric111 | Eric111/CatunaMayo-DPO | ef63850d-6acf-4d04-ac01-7ac407bf3b89 | 0.0.1 | hfopenllm_v2/Eric111_CatunaMayo-DPO/1762652579.613287 | 1762652579.613288 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Eric111/CatunaMayo-DPO | Eric111/CatunaMayo-DPO | Eric111 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4214539643700936}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "MistralForCausalLM", "params_billions": 7.242} |
HF Open LLM v2 | WizardLMTeam | WizardLMTeam/WizardLM-13B-V1.2 | f9d2286c-ed89-4c23-b6a2-c623373331cd | 0.0.1 | hfopenllm_v2/WizardLMTeam_WizardLM-13B-V1.2/1762652579.950676 | 1762652579.950676 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | WizardLMTeam/WizardLM-13B-V1.2 | WizardLMTeam/WizardLM-13B-V1.2 | WizardLMTeam | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3392465325336773}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 13.0} |
HF Open LLM v2 | WizardLMTeam | WizardLMTeam/WizardLM-70B-V1.0 | 8c4ff628-41b6-4769-a33e-b1dbffa913cf | 0.0.1 | hfopenllm_v2/WizardLMTeam_WizardLM-70B-V1.0/1762652579.950908 | 1762652579.950909 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | WizardLMTeam/WizardLM-70B-V1.0 | WizardLMTeam/WizardLM-70B-V1.0 | WizardLMTeam | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.49514288753839814}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 70.0} |
HF Open LLM v2 | WizardLMTeam | WizardLMTeam/WizardLM-13B-V1.0 | ab4f785b-779f-423b-9905-31a3b66dfeff | 0.0.1 | hfopenllm_v2/WizardLMTeam_WizardLM-13B-V1.0/1762652579.9503958 | 1762652579.950397 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | WizardLMTeam/WizardLM-13B-V1.0 | WizardLMTeam/WizardLM-13B-V1.0 | WizardLMTeam | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.18504900331121424}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 13.0} |
HF Open LLM v2 | flammenai | flammenai/flammen15-gutenberg-DPO-v1-7B | 1244b8d9-e832-4f2b-8ae5-52449f6ac38c | 0.0.1 | hfopenllm_v2/flammenai_flammen15-gutenberg-DPO-v1-7B/1762652580.155953 | 1762652580.155954 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | flammenai/flammen15-gutenberg-DPO-v1-7B | flammenai/flammen15-gutenberg-DPO-v1-7B | flammenai | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.47980580415519714}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "MistralForCausalLM", "params_billions": 7.242} |
HF Open LLM v2 | CohereForAI | CohereForAI/aya-23-35B | 9c77aa3f-080c-4dd6-8a9d-50d18657de35 | 0.0.1 | hfopenllm_v2/CohereForAI_aya-23-35B/1762652579.5047522 | 1762652579.5047529 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | CohereForAI/aya-23-35B | CohereForAI/aya-23-35B | CohereForAI | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6461932117891638}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "CohereForCausalLM", "params_billions": 34.981} |
HF Open LLM v2 | CohereForAI | CohereForAI/aya-expanse-8b | 3d54299c-ae39-45f4-b31c-c0667dcbe9f4 | 0.0.1 | hfopenllm_v2/CohereForAI_aya-expanse-8b/1762652579.505729 | 1762652579.5057302 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | CohereForAI/aya-expanse-8b | CohereForAI/aya-expanse-8b | CohereForAI | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6358517622131501}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "CohereForCausalLM", "params_billions": 8.028} |
HF Open LLM v2 | CohereForAI | CohereForAI/c4ai-command-r-plus-08-2024 | f1ef3dda-1b62-4ec9-9c88-a8e60b8a8f6d | 0.0.1 | hfopenllm_v2/CohereForAI_c4ai-command-r-plus-08-2024/1762652579.506166 | 1762652579.506167 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | CohereForAI/c4ai-command-r-plus-08-2024 | CohereForAI/c4ai-command-r-plus-08-2024 | CohereForAI | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7539539532883859}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "CohereForCausalLM", "params_billions": 103.811} |
HF Open LLM v2 | CohereForAI | CohereForAI/c4ai-command-r7b-12-2024 | 85fa7edb-df5c-4baa-a0f1-c520db55c08c | 0.0.1 | hfopenllm_v2/CohereForAI_c4ai-command-r7b-12-2024/1762652579.5066051 | 1762652579.506606 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | CohereForAI/c4ai-command-r7b-12-2024 | CohereForAI/c4ai-command-r7b-12-2024 | CohereForAI | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7713145564878965}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Cohere2ForCausalLM", "params_billions": 8.028} |
HF Open LLM v2 | CohereForAI | CohereForAI/c4ai-command-r-plus | c5326cd1-8e73-4f84-8efb-49b3be5c50e7 | 0.0.1 | hfopenllm_v2/CohereForAI_c4ai-command-r-plus/1762652579.50595 | 1762652579.505951 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | CohereForAI/c4ai-command-r-plus | CohereForAI/c4ai-command-r-plus | CohereForAI | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7664186580495308}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "CohereForCausalLM", "params_billions": 103.811} |
HF Open LLM v2 | CohereForAI | CohereForAI/aya-expanse-32b | ebbe9a61-6dff-467a-b77c-7c125a043832 | 0.0.1 | hfopenllm_v2/CohereForAI_aya-expanse-32b/1762652579.505483 | 1762652579.505484 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | CohereForAI/aya-expanse-32b | CohereForAI/aya-expanse-32b | CohereForAI | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7301737168490716}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "CohereForCausalLM", "params_billions": 32.296} |
HF Open LLM v2 | CohereForAI | CohereForAI/aya-23-8B | 2ff655cd-9123-4577-832b-3f0b04f7d466 | 0.0.1 | hfopenllm_v2/CohereForAI_aya-23-8B/1762652579.5050838 | 1762652579.505085 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | CohereForAI/aya-23-8B | CohereForAI/aya-23-8B | CohereForAI | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4698887839820565}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "CohereForCausalLM", "params_billions": 8.028} |
HF Open LLM v2 | CohereForAI | CohereForAI/c4ai-command-r-v01 | cd24b18c-faff-44e1-87d6-735bcb9ab465 | 0.0.1 | hfopenllm_v2/CohereForAI_c4ai-command-r-v01/1762652579.506387 | 1762652579.506388 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | CohereForAI/c4ai-command-r-v01 | CohereForAI/c4ai-command-r-v01 | CohereForAI | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6748194789824333}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "CohereForCausalLM", "params_billions": 34.981} |
HF Open LLM v2 | vonjack | vonjack/SmolLM2-360M-Merged | f1980c69-8c24-4fcd-ace1-797195026c7b | 0.0.1 | hfopenllm_v2/vonjack_SmolLM2-360M-Merged/1762652580.590822 | 1762652580.590823 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | vonjack/SmolLM2-360M-Merged | vonjack/SmolLM2-360M-Merged | vonjack | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.32058715319795916}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 0.362} |
HF Open LLM v2 | vonjack | vonjack/Phi-3-mini-4k-instruct-LLaMAfied | be3635bb-83de-4cbf-8e0f-3a84ee78bd67 | 0.0.1 | hfopenllm_v2/vonjack_Phi-3-mini-4k-instruct-LLaMAfied/1762652580.589802 | 1762652580.589803 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | vonjack/Phi-3-mini-4k-instruct-LLaMAfied | vonjack/Phi-3-mini-4k-instruct-LLaMAfied | vonjack | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5787488308798432}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 3.821} |
HF Open LLM v2 | vonjack | vonjack/SmolLM2-1.7B-Merged | 97bab408-a5f5-4363-b530-dc4a6966c859 | 0.0.1 | hfopenllm_v2/vonjack_SmolLM2-1.7B-Merged/1762652580.5904331 | 1762652580.590434 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | vonjack/SmolLM2-1.7B-Merged | vonjack/SmolLM2-1.7B-Merged | vonjack | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.36979658417443495}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 1.711} |
HF Open LLM v2 | vonjack | vonjack/MobileLLM-125M-HF | 2e06f258-9d91-4734-aacc-f417fddad77c | 0.0.1 | hfopenllm_v2/vonjack_MobileLLM-125M-HF/1762652580.589566 | 1762652580.589567 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | vonjack/MobileLLM-125M-HF | vonjack/MobileLLM-125M-HF | vonjack | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.21072753627042912}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 0.125} |
HF Open LLM v2 | vonjack | vonjack/Phi-3.5-mini-instruct-hermes-fc-json | 19cd2513-03e8-4d78-b222-566fe3928d2b | 0.0.1 | hfopenllm_v2/vonjack_Phi-3.5-mini-instruct-hermes-fc-json/1762652580.5900009 | 1762652580.5900018 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | vonjack/Phi-3.5-mini-instruct-hermes-fc-json | vonjack/Phi-3.5-mini-instruct-hermes-fc-json | vonjack | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.14158432957885078}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "?", "params_billions": 4.132} |
HF Open LLM v2 | vonjack | vonjack/SmolLM2-135M-Merged | 2c1cab05-b63f-49ca-a709-b5a4e859ba82 | 0.0.1 | hfopenllm_v2/vonjack_SmolLM2-135M-Merged/1762652580.590627 | 1762652580.590627 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | vonjack/SmolLM2-135M-Merged | vonjack/SmolLM2-135M-Merged | vonjack | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.24829674153468353}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 0.135} |
HF Open LLM v2 | alibaba | gz987/qwen2.5-7b-cabs-v0.2 | 7288fa97-efd7-45d5-8769-e0071e9b5488 | 0.0.1 | hfopenllm_v2/gz987_qwen2.5-7b-cabs-v0.2/1762652580.18783 | 1762652580.187832 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | gz987/qwen2.5-7b-cabs-v0.2 | gz987/qwen2.5-7b-cabs-v0.2 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7417640748768822}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | gz987/qwen2.5-7b-cabs-v0.1 | 9b6c775b-ef08-4e57-8441-52d7887615b1 | 0.0.1 | hfopenllm_v2/gz987_qwen2.5-7b-cabs-v0.1/1762652580.187419 | 1762652580.18742 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | gz987/qwen2.5-7b-cabs-v0.1 | gz987/qwen2.5-7b-cabs-v0.1 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7505817896514582}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | gz987/qwen2.5-7b-cabs-v0.3 | b664e033-1424-431e-af8d-09a11b449286 | 0.0.1 | hfopenllm_v2/gz987_qwen2.5-7b-cabs-v0.3/1762652580.188173 | 1762652580.188174 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | gz987/qwen2.5-7b-cabs-v0.3 | gz987/qwen2.5-7b-cabs-v0.3 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7569515552068511}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | gz987/qwen2.5-7b-cabs-v0.4 | 8fb7a2aa-3f43-4aaf-b2c0-1770704fcf81 | 0.0.1 | hfopenllm_v2/gz987_qwen2.5-7b-cabs-v0.4/1762652580.188425 | 1762652580.188426 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | gz987/qwen2.5-7b-cabs-v0.4 | gz987/qwen2.5-7b-cabs-v0.4 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7582503313430586}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | ZeroXClem/Qwen2.5-7B-HomerCreative-Mix | 336aaa71-3f35-48f3-bede-cb9ab3324cfc | 0.0.1 | hfopenllm_v2/ZeroXClem_Qwen2.5-7B-HomerCreative-Mix/1762652579.968384 | 1762652579.968385 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ZeroXClem/Qwen2.5-7B-HomerCreative-Mix | ZeroXClem/Qwen2.5-7B-HomerCreative-Mix | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7835044348994002}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | ZeroXClem/Qwen2.5-7B-HomerAnvita-NerdMix | 500a7a12-9c94-4ed8-b2b4-33473141c3c7 | 0.0.1 | hfopenllm_v2/ZeroXClem_Qwen2.5-7B-HomerAnvita-NerdMix/1762652579.96818 | 1762652579.968181 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ZeroXClem/Qwen2.5-7B-HomerAnvita-NerdMix | ZeroXClem/Qwen2.5-7B-HomerAnvita-NerdMix | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7707649037886142}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | ZeroXClem/Qwen-2.5-Aether-SlerpFusion-7B | 8b61e7aa-3ba3-4e25-b1bf-9718970a111a | 0.0.1 | hfopenllm_v2/ZeroXClem_Qwen-2.5-Aether-SlerpFusion-7B/1762652579.9677062 | 1762652579.9677062 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ZeroXClem/Qwen-2.5-Aether-SlerpFusion-7B | ZeroXClem/Qwen-2.5-Aether-SlerpFusion-7B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6261597007052399}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | ZeroXClem/Qwen2.5-7B-Qandora-CySec | 7a495a80-f712-477b-bd5c-0cf7a07e8ef2 | 0.0.1 | hfopenllm_v2/ZeroXClem_Qwen2.5-7B-Qandora-CySec/1762652579.968593 | 1762652579.9685938 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ZeroXClem/Qwen2.5-7B-Qandora-CySec | ZeroXClem/Qwen2.5-7B-Qandora-CySec | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6773172958860268}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | ZeroXClem/Qwen2.5-7B-CelestialHarmony-1M | d912a685-7187-4b56-a7a8-881ed678ae2f | 0.0.1 | hfopenllm_v2/ZeroXClem_Qwen2.5-7B-CelestialHarmony-1M/1762652579.967964 | 1762652579.967965 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ZeroXClem/Qwen2.5-7B-CelestialHarmony-1M | ZeroXClem/Qwen2.5-7B-CelestialHarmony-1M | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5943862285402732}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.613} |
HF Open LLM v2 | alibaba | x0000001/Deepseek-Lumen-R1-Qwen2.5-14B | 9d6eb7bc-965e-4de8-bccf-0590ad55ce6d | 0.0.1 | hfopenllm_v2/x0000001_Deepseek-Lumen-R1-Qwen2.5-14B/1762652580.596637 | 1762652580.596638 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | x0000001/Deepseek-Lumen-R1-Qwen2.5-14B | x0000001/Deepseek-Lumen-R1-Qwen2.5-14B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4436107306391486}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | alibaba | mhl1/Qwen2.5-0.5B-cinstruct-stage1 | bf9d8219-66b9-4c77-8c6d-2983e60dc2cb | 0.0.1 | hfopenllm_v2/mhl1_Qwen2.5-0.5B-cinstruct-stage1/1762652580.3535528 | 1762652580.353554 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | mhl1/Qwen2.5-0.5B-cinstruct-stage1 | mhl1/Qwen2.5-0.5B-cinstruct-stage1 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.14817905379947427}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 0.63} |
HF Open LLM v2 | alibaba | sumink/bbhqwen5 | 4b528bc8-e94a-4437-8c1c-bcd823bf5f45 | 0.0.1 | hfopenllm_v2/sumink_bbhqwen5/1762652580.546902 | 1762652580.5469031 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sumink/bbhqwen5 | sumink/bbhqwen5 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.1521507378200951}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.086} |
HF Open LLM v2 | alibaba | sumink/Qwensci | 57a9ff0c-795f-45c4-b0c7-ad0c7400c88d | 0.0.1 | hfopenllm_v2/sumink_Qwensci/1762652580.545888 | 1762652580.5458891 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sumink/Qwensci | sumink/Qwensci | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.17398281005509825}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.543} |
HF Open LLM v2 | alibaba | sumink/bbhqwen2 | b4dbcb3f-11dd-4bce-9d45-869ae7c8f9b1 | 0.0.1 | hfopenllm_v2/sumink_bbhqwen2/1762652580.546288 | 1762652580.546289 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sumink/bbhqwen2 | sumink/bbhqwen2 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.15329991090307052}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.086} |
HF Open LLM v2 | alibaba | sumink/bbhqwen6 | f585e5fe-c3b5-4134-97ed-67b57d74adb8 | 0.0.1 | hfopenllm_v2/sumink_bbhqwen6/1762652580.547101 | 1762652580.547102 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sumink/bbhqwen6 | sumink/bbhqwen6 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.18929551368147626}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.086} |
HF Open LLM v2 | alibaba | sumink/Qwenftmodel | aece90fe-f0eb-4c34-afd0-7a4fc36dc385 | 0.0.1 | hfopenllm_v2/sumink_Qwenftmodel/1762652580.5454028 | 1762652580.545404 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sumink/Qwenftmodel | sumink/Qwenftmodel | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.17290899258412123}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.544} |
HF Open LLM v2 | alibaba | sumink/Qwenmplus | fc41cf78-6547-4fe6-83aa-ef5edd99a392 | 0.0.1 | hfopenllm_v2/sumink_Qwenmplus/1762652580.5456882 | 1762652580.545689 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sumink/Qwenmplus | sumink/Qwenmplus | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.20403307668098425}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.543} |
HF Open LLM v2 | alibaba | sumink/bbhqwen4 | 336dbfac-133a-46c8-87c9-40f1ad12a714 | 0.0.1 | hfopenllm_v2/sumink_bbhqwen4/1762652580.546697 | 1762652580.546698 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sumink/bbhqwen4 | sumink/bbhqwen4 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.14485675784695717}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.086} |
HF Open LLM v2 | alibaba | sumink/bbhqwen | 7c73720a-03d8-4d90-9557-cd579c7c3e86 | 0.0.1 | hfopenllm_v2/sumink_bbhqwen/1762652580.546088 | 1762652580.546089 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sumink/bbhqwen | sumink/bbhqwen | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.18085236062536292}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.086} |
HF Open LLM v2 | alibaba | sumink/bbhqwen3 | b9dae1c0-8088-4ffb-9e91-0f6579b3147e | 0.0.1 | hfopenllm_v2/sumink_bbhqwen3/1762652580.546491 | 1762652580.546491 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sumink/bbhqwen3 | sumink/bbhqwen3 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.1942911474886634}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.086} |
HF Open LLM v2 | alibaba | nbeerbower/Qwen2.5-Gutenberg-Doppel-14B | 649483fb-4b54-4824-82eb-e78e55e53912 | 0.0.1 | hfopenllm_v2/nbeerbower_Qwen2.5-Gutenberg-Doppel-14B/1762652580.38376 | 1762652580.38376 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | nbeerbower/Qwen2.5-Gutenberg-Doppel-14B | nbeerbower/Qwen2.5-Gutenberg-Doppel-14B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.8090832324897937}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | alibaba | nbeerbower/EVA-abliterated-TIES-Qwen2.5-1.5B | dea423e8-cdbd-4895-80af-f53dbb5caa1c | 0.0.1 | hfopenllm_v2/nbeerbower_EVA-abliterated-TIES-Qwen2.5-1.5B/1762652580.378096 | 1762652580.3780968 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | nbeerbower/EVA-abliterated-TIES-Qwen2.5-1.5B | nbeerbower/EVA-abliterated-TIES-Qwen2.5-1.5B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.41148707651254224}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.777} |
HF Open LLM v2 | alibaba | nbeerbower/Dumpling-Qwen2.5-1.5B | f2eaeee8-a75b-4d0f-9dcd-2a11c3de926b | 0.0.1 | hfopenllm_v2/nbeerbower_Dumpling-Qwen2.5-1.5B/1762652580.377223 | 1762652580.377223 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | nbeerbower/Dumpling-Qwen2.5-1.5B | nbeerbower/Dumpling-Qwen2.5-1.5B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3698963195432563}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.544} |
HF Open LLM v2 | alibaba | nbeerbower/EVA-abliterated-TIES-Qwen2.5-14B | 997fc8c5-fc91-4e9e-a2b7-bdda77e4f4a7 | 0.0.1 | hfopenllm_v2/nbeerbower_EVA-abliterated-TIES-Qwen2.5-14B/1762652580.378304 | 1762652580.378304 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | nbeerbower/EVA-abliterated-TIES-Qwen2.5-14B | nbeerbower/EVA-abliterated-TIES-Qwen2.5-14B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.783554302583811}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | alibaba | nbeerbower/Dumpling-Qwen2.5-7B-1k-r16 | 76e3f2a5-7545-4270-800d-6413e39608ad | 0.0.1 | hfopenllm_v2/nbeerbower_Dumpling-Qwen2.5-7B-1k-r16/1762652580.3776908 | 1762652580.377692 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | nbeerbower/Dumpling-Qwen2.5-7B-1k-r16 | nbeerbower/Dumpling-Qwen2.5-7B-1k-r16 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4860004787297703}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | nbeerbower/Dumpling-Qwen2.5-14B | 0a70cdb4-5ccc-40e2-bf99-3af619b8b7f6 | 0.0.1 | hfopenllm_v2/nbeerbower_Dumpling-Qwen2.5-14B/1762652580.3774788 | 1762652580.37748 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | nbeerbower/Dumpling-Qwen2.5-14B | nbeerbower/Dumpling-Qwen2.5-14B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6064010159709571}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | alibaba | nbeerbower/Dumpling-Qwen2.5-7B-1k-r64-2e-5 | 2e6c1c46-01af-493a-a2ce-266d13b53000 | 0.0.1 | hfopenllm_v2/nbeerbower_Dumpling-Qwen2.5-7B-1k-r64-2e-5/1762652580.377894 | 1762652580.377894 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | nbeerbower/Dumpling-Qwen2.5-7B-1k-r64-2e-5 | nbeerbower/Dumpling-Qwen2.5-7B-1k-r64-2e-5 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.417906709752346}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | godlikehhd/ifd_new_correct_all_sample_2500_qwen | b481d1bd-e678-4b78-aecb-d43a561dd969 | 0.0.1 | hfopenllm_v2/godlikehhd_ifd_new_correct_all_sample_2500_qwen/1762652580.170775 | 1762652580.1707761 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | godlikehhd/ifd_new_correct_all_sample_2500_qwen | godlikehhd/ifd_new_correct_all_sample_2500_qwen | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.33757319467900726}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.544} |
HF Open LLM v2 | alibaba | godlikehhd/ifd_new_qwen_2500 | 8d8663a1-12f6-4e88-af3d-784ff86e8c59 | 0.0.1 | hfopenllm_v2/godlikehhd_ifd_new_qwen_2500/1762652580.171179 | 1762652580.17118 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | godlikehhd/ifd_new_qwen_2500 | godlikehhd/ifd_new_qwen_2500 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.323959316834887}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.544} |
HF Open LLM v2 | alibaba | godlikehhd/qwen_2.5-1.5b-cherry_new | dd0260dd-59f7-4b3d-8f9c-60b297c07a1b | 0.0.1 | hfopenllm_v2/godlikehhd_qwen_2.5-1.5b-cherry_new/1762652580.171904 | 1762652580.171905 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | godlikehhd/qwen_2.5-1.5b-cherry_new | godlikehhd/qwen_2.5-1.5b-cherry_new | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3120442647730245}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.544} |
HF Open LLM v2 | alibaba | godlikehhd/ifd_2500_qwen | 84ad6756-cb9d-4303-8e7a-395c1dc7c222 | 0.0.1 | hfopenllm_v2/godlikehhd_ifd_2500_qwen/1762652580.170526 | 1762652580.170526 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | godlikehhd/ifd_2500_qwen | godlikehhd/ifd_2500_qwen | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.33647388928044253}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.544} |
HF Open LLM v2 | alibaba | godlikehhd/qwen_full_data_alpaca | 746630a6-de1d-4976-9168-d8ff06980904 | 0.0.1 | hfopenllm_v2/godlikehhd_qwen_full_data_alpaca/1762652580.1721501 | 1762652580.172151 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | godlikehhd/qwen_full_data_alpaca | godlikehhd/qwen_full_data_alpaca | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3136178672588731}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.544} |
HF Open LLM v2 | alibaba | godlikehhd/ifd_new_correct_sample_2500_qwen | c42196be-c20b-413d-8870-f10759058098 | 0.0.1 | hfopenllm_v2/godlikehhd_ifd_new_correct_sample_2500_qwen/1762652580.170979 | 1762652580.1709802 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | godlikehhd/ifd_new_correct_sample_2500_qwen | godlikehhd/ifd_new_correct_sample_2500_qwen | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.33974631754854895}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.544} |
HF Open LLM v2 | alibaba | godlikehhd/qwen_ins_ans_2500 | 7f577380-2691-4906-af13-8ca3011e6316 | 0.0.1 | hfopenllm_v2/godlikehhd_qwen_ins_ans_2500/1762652580.172384 | 1762652580.172385 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | godlikehhd/qwen_ins_ans_2500 | godlikehhd/qwen_ins_ans_2500 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2698041197356348}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.544} |
HF Open LLM v2 | alibaba | godlikehhd/qwen-2.5-1.5b-cherry | a0621e6d-4178-49c9-aa2b-f56930884b82 | 0.0.1 | hfopenllm_v2/godlikehhd_qwen-2.5-1.5b-cherry/1762652580.1715672 | 1762652580.1715689 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | godlikehhd/qwen-2.5-1.5b-cherry | godlikehhd/qwen-2.5-1.5b-cherry | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.28933784580468713}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 0.772} |
HF Open LLM v2 | alibaba | YoungPanda/qwenqwen | 7e4c528f-bb42-40e7-b849-86732d2f2a18 | 0.0.1 | hfopenllm_v2/YoungPanda_qwenqwen/1762652579.964632 | 1762652579.964633 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | YoungPanda/qwenqwen | YoungPanda/qwenqwen | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.12639684924888184}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2MoeForCausalLM", "params_billions": 14.316} |
HF Open LLM v2 | alibaba | jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.0 | ee2b789c-951d-426e-87e3-232c07d65ade | 0.0.1 | hfopenllm_v2/jeffmeloy_Qwen2.5-7B-nerd-uncensored-v1.0/1762652580.283937 | 1762652580.283938 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.0 | jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.0 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7695159953368174}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | jeffmeloy/Qwen2.5-7B-olm-v1.4 | 1faf58ba-28e7-45a1-bc2c-d0aa707a49aa | 0.0.1 | hfopenllm_v2/jeffmeloy_Qwen2.5-7B-olm-v1.4/1762652580.286527 | 1762652580.2865438 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | jeffmeloy/Qwen2.5-7B-olm-v1.4 | jeffmeloy/Qwen2.5-7B-olm-v1.4 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4545018329144448}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | jeffmeloy/Qwen2.5-7B-olm-v1.3 | a5c9246f-a7b5-4183-9a64-93151b536945 | 0.0.1 | hfopenllm_v2/jeffmeloy_Qwen2.5-7B-olm-v1.3/1762652580.286303 | 1762652580.286304 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | jeffmeloy/Qwen2.5-7B-olm-v1.3 | jeffmeloy/Qwen2.5-7B-olm-v1.3 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4218540140161438}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.8 | e908901d-c122-4458-9d4e-9a7d1242211c | 0.0.1 | hfopenllm_v2/jeffmeloy_Qwen2.5-7B-nerd-uncensored-v1.8/1762652580.2854452 | 1762652580.285446 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.8 | jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.8 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6255601803215468}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.3 | 0ec990b0-b908-44f5-9fb7-5ee603737bc7 | 0.0.1 | hfopenllm_v2/jeffmeloy_Qwen2.5-7B-nerd-uncensored-v1.3/1762652580.284589 | 1762652580.284589 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.3 | jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.3 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.49951462120506923}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.0} |
HF Open LLM v2 | alibaba | jeffmeloy/jeffmeloy_Qwen2.5-7B-minperplexity-1 | ba005ac7-761f-4cd7-91ed-34b88028240f | 0.0.1 | hfopenllm_v2/jeffmeloy_jeffmeloy_Qwen2.5-7B-minperplexity-1/1762652580.2872581 | 1762652580.2872589 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | jeffmeloy/jeffmeloy_Qwen2.5-7B-minperplexity-1 | jeffmeloy/jeffmeloy_Qwen2.5-7B-minperplexity-1 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.37571643239936703}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | jeffmeloy/Qwen2.5-7B-olm-v1.2 | 8c4531a4-4418-4090-9c82-f60bcf8d9935 | 0.0.1 | hfopenllm_v2/jeffmeloy_Qwen2.5-7B-olm-v1.2/1762652580.286082 | 1762652580.286083 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | jeffmeloy/Qwen2.5-7B-olm-v1.2 | jeffmeloy/Qwen2.5-7B-olm-v1.2 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.42025492360270744}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | jeffmeloy/Qwen2.5-7B-olm-v1.0 | e9350de5-cae6-46bc-a83f-0e6e65eae4e3 | 0.0.1 | hfopenllm_v2/jeffmeloy_Qwen2.5-7B-olm-v1.0/1762652580.285652 | 1762652580.2856529 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | jeffmeloy/Qwen2.5-7B-olm-v1.0 | jeffmeloy/Qwen2.5-7B-olm-v1.0 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5331365222055258}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | jeffmeloy/Qwen2.5-7B-olm-v1.1 | 769eabf2-4c12-4a48-8ec2-7dacf50a28f0 | 0.0.1 | hfopenllm_v2/jeffmeloy_Qwen2.5-7B-olm-v1.1/1762652580.285865 | 1762652580.285865 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | jeffmeloy/Qwen2.5-7B-olm-v1.1 | jeffmeloy/Qwen2.5-7B-olm-v1.1 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4329445870290828}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | jeffmeloy/Qwen2.5-7B-minperplexity-2 | 593d3d30-f2e8-4ad3-b0ab-4bfed63a0ab5 | 0.0.1 | hfopenllm_v2/jeffmeloy_Qwen2.5-7B-minperplexity-2/1762652580.28349 | 1762652580.2834911 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | jeffmeloy/Qwen2.5-7B-minperplexity-2 | jeffmeloy/Qwen2.5-7B-minperplexity-2 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.509730847484674}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.0} |
HF Open LLM v2 | alibaba | jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.4 | 34c33a97-ae07-42e9-8025-9076e2bce3bb | 0.0.1 | hfopenllm_v2/jeffmeloy_Qwen2.5-7B-nerd-uncensored-v1.4/1762652580.284807 | 1762652580.284807 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.4 | jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.4 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6078748830879843}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | jeffmeloy/Qwen2.5-7B-olm-v1.5 | b347eea5-e676-478e-b0ee-d53abf2c8697 | 0.0.1 | hfopenllm_v2/jeffmeloy_Qwen2.5-7B-olm-v1.5/1762652580.286995 | 1762652580.286996 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | jeffmeloy/Qwen2.5-7B-olm-v1.5 | jeffmeloy/Qwen2.5-7B-olm-v1.5 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4546514359676769}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.7 | 4aa966fc-ee99-430c-8688-99565f5e6fcc | 0.0.1 | hfopenllm_v2/jeffmeloy_Qwen2.5-7B-nerd-uncensored-v1.7/1762652580.285239 | 1762652580.285239 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.7 | jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.7 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4201551882338861}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.5 | bd4ff159-0bf9-4fe1-8cc8-9f3d7bb47bbc | 0.0.1 | hfopenllm_v2/jeffmeloy_Qwen2.5-7B-nerd-uncensored-v1.5/1762652580.2850199 | 1762652580.2850208 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.5 | jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.5 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5650352176669016}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | jeffmeloy/Qwen-7B-nerd-uncensored-v1.0 | 1812829e-2c91-410e-9e2e-cc758b652e9b | 0.0.1 | hfopenllm_v2/jeffmeloy_Qwen-7B-nerd-uncensored-v1.0/1762652580.283215 | 1762652580.2832158 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | jeffmeloy/Qwen-7B-nerd-uncensored-v1.0 | jeffmeloy/Qwen-7B-nerd-uncensored-v1.0 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6135952605752737}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | jeffmeloy/Qwen2.5-7B-nerd-uncensored-v0.9 | 45a72c39-9cdb-4fb6-aaf0-d50cc89dfd70 | 0.0.1 | hfopenllm_v2/jeffmeloy_Qwen2.5-7B-nerd-uncensored-v0.9/1762652580.2837172 | 1762652580.2837179 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | jeffmeloy/Qwen2.5-7B-nerd-uncensored-v0.9 | jeffmeloy/Qwen2.5-7B-nerd-uncensored-v0.9 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6048274134851084}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.1 | 2316b408-c94b-471e-b64b-c1f8f345868e | 0.0.1 | hfopenllm_v2/jeffmeloy_Qwen2.5-7B-nerd-uncensored-v1.1/1762652580.2841558 | 1762652580.284157 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.1 | jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.1 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6626296005709296}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.2 | 49d47f6d-0d11-4b07-b42e-b94310c97d3e | 0.0.1 | hfopenllm_v2/jeffmeloy_Qwen2.5-7B-nerd-uncensored-v1.2/1762652580.284375 | 1762652580.284375 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.2 | jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.2 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.49646715160219335}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | DavidAU/Qwen2.5-MOE-2X1.5B-DeepSeek-Uncensored-Censored-4B | 78e7f7ee-3677-499a-aa36-2e8bf0902bf0 | 0.0.1 | hfopenllm_v2/DavidAU_Qwen2.5-MOE-2X1.5B-DeepSeek-Uncensored-Censored-4B/1762652579.543009 | 1762652579.543009 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | DavidAU/Qwen2.5-MOE-2X1.5B-DeepSeek-Uncensored-Censored-4B | DavidAU/Qwen2.5-MOE-2X1.5B-DeepSeek-Uncensored-Censored-4B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.17832905579418165}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2MoeForCausalLM", "params_billions": 4.089} |
HF Open LLM v2 | alibaba | DavidAU/Qwen2.5-MOE-2X7B-DeepSeek-Abliterated-Censored-19B | d65793ba-f363-4665-9ff5-1ac08e819d55 | 0.0.1 | hfopenllm_v2/DavidAU_Qwen2.5-MOE-2X7B-DeepSeek-Abliterated-Censored-19B/1762652579.543224 | 1762652579.543225 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | DavidAU/Qwen2.5-MOE-2X7B-DeepSeek-Abliterated-Censored-19B | DavidAU/Qwen2.5-MOE-2X7B-DeepSeek-Abliterated-Censored-19B | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.28351773294857646}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2MoeForCausalLM", "params_billions": 19.022} |
HF Open LLM v2 | alibaba | DavidAU/DeepSeek-R1-Distill-Qwen-25.5B-Brainstorm | 4b7dd9db-5e94-4885-96f8-189af8d97c09 | 0.0.1 | hfopenllm_v2/DavidAU_DeepSeek-R1-Distill-Qwen-25.5B-Brainstorm/1762652579.53886 | 1762652579.53886 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | DavidAU/DeepSeek-R1-Distill-Qwen-25.5B-Brainstorm | DavidAU/DeepSeek-R1-Distill-Qwen-25.5B-Brainstorm | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.34159474638403875}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 25.506} |
HF Open LLM v2 | alibaba | DavidAU/Qwen2.5-MOE-6x1.5B-DeepSeek-Reasoning-e32 | c142222c-836d-493f-a9f8-857426e0573c | 0.0.1 | hfopenllm_v2/DavidAU_Qwen2.5-MOE-6x1.5B-DeepSeek-Reasoning-e32/1762652579.543571 | 1762652579.543573 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | DavidAU/Qwen2.5-MOE-6x1.5B-DeepSeek-Reasoning-e32 | DavidAU/Qwen2.5-MOE-6x1.5B-DeepSeek-Reasoning-e32 | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.21067766858601844}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2MoeForCausalLM", "params_billions": 8.714} |
HF Open LLM v2 | alibaba | MaziyarPanahi/calme-2.2-qwen2-7b | 154b7a41-e1bf-4827-a6a7-279ea170ab7e | 0.0.1 | hfopenllm_v2/MaziyarPanahi_calme-2.2-qwen2-7b/1762652579.7540858 | 1762652579.754087 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | MaziyarPanahi/calme-2.2-qwen2-7b | MaziyarPanahi/calme-2.2-qwen2-7b | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.35972996094806226}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | MaziyarPanahi/calme-2.3-qwen2-72b | 8b769df2-18f5-4712-a02b-962d3e2bb7c7 | 0.0.1 | hfopenllm_v2/MaziyarPanahi_calme-2.3-qwen2-72b/1762652579.755723 | 1762652579.755724 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | MaziyarPanahi/calme-2.3-qwen2-72b | MaziyarPanahi/calme-2.3-qwen2-72b | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3849840645044039}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 72.706} |
HF Open LLM v2 | alibaba | MaziyarPanahi/Qwen1.5-MoE-A2.7B-Wikihow | ee23e137-57d2-49aa-b267-27bd48457d46 | 0.0.1 | hfopenllm_v2/MaziyarPanahi_Qwen1.5-MoE-A2.7B-Wikihow/1762652579.750923 | 1762652579.750923 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | MaziyarPanahi/Qwen1.5-MoE-A2.7B-Wikihow | MaziyarPanahi/Qwen1.5-MoE-A2.7B-Wikihow | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.29543278501043896}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2MoeForCausalLM", "params_billions": 14.316} |
HF Open LLM v2 | alibaba | MaziyarPanahi/calme-2.1-qwen2-7b | 6c31df3b-e408-4a6c-b475-78f174630cad | 0.0.1 | hfopenllm_v2/MaziyarPanahi_calme-2.1-qwen2-7b/1762652579.752553 | 1762652579.752554 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | MaziyarPanahi/calme-2.1-qwen2-7b | MaziyarPanahi/calme-2.1-qwen2-7b | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3816119008674761}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | MaziyarPanahi/calme-2.5-qwen2-7b | 762f6ff3-4823-4de8-8351-045e1d1d383b | 0.0.1 | hfopenllm_v2/MaziyarPanahi_calme-2.5-qwen2-7b/1762652579.757269 | 1762652579.75727 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | MaziyarPanahi/calme-2.5-qwen2-7b | MaziyarPanahi/calme-2.5-qwen2-7b | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.31449221399220734}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | alibaba | MaziyarPanahi/calme-2.1-qwen2.5-72b | 2b841a46-6210-4092-875f-ca3ae36f3d25 | 0.0.1 | hfopenllm_v2/MaziyarPanahi_calme-2.1-qwen2.5-72b/1762652579.752765 | 1762652579.752765 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | MaziyarPanahi/calme-2.1-qwen2.5-72b | MaziyarPanahi/calme-2.1-qwen2.5-72b | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.8662360315075112}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 72.7} |
HF Open LLM v2 | alibaba | MaziyarPanahi/calme-2.1-qwen2-72b | ae68a60d-a2df-45f1-b446-1400901cb6ff | 0.0.1 | hfopenllm_v2/MaziyarPanahi_calme-2.1-qwen2-72b/1762652579.75234 | 1762652579.752341 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | MaziyarPanahi/calme-2.1-qwen2-72b | MaziyarPanahi/calme-2.1-qwen2-72b | alibaba | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.8162774770941104}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 72.699} |
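Each row above stores its per-benchmark results as a JSON-serialized list of records with `evaluation_name`, `metric_config`, and `score_details` fields. As a minimal sketch of how such a cell might be consumed, the helper below pulls out the score for a named evaluation; the record shape mirrors the rows shown here, but the helper name and the sample record are illustrative, not part of the dataset's tooling.

```python
import json

# Sample cell mirroring the structure of the rows above (score taken from
# the jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.0 row); illustrative only.
cell = json.dumps([{
    "evaluation_name": "IFEval",
    "metric_config": {
        "evaluation_description": "Accuracy on IFEval",
        "lower_is_better": False,
        "score_type": "continuous",
        "min_score": 0,
        "max_score": 1,
    },
    "score_details": {"score": 0.7695159953368174},
}])

def score_for(results_json: str, eval_name: str):
    """Return the score for a named evaluation, or None if absent."""
    for rec in json.loads(results_json):
        if rec.get("evaluation_name") == eval_name:
            return rec.get("score_details", {}).get("score")
    return None

print(score_for(cell, "IFEval"))  # 0.7695159953368174
```

The lookup is by `evaluation_name` rather than list position, since the truncated cells suggest the number and order of evaluations per row may vary.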