| _leaderboard | _developer | _model | _uuid | schema_version | evaluation_id | retrieved_timestamp | source_data | evaluation_source_name | evaluation_source_type | source_organization_name | source_organization_url | source_organization_logo_url | evaluator_relationship | model_name | model_id | model_developer | model_inference_platform | evaluation_results | additional_details |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
HF Open LLM v2 | Novaciano | Novaciano/FuseChat-3.2-1B-GRPO_Creative_RP | 16a8882c-12f5-46d0-8e1f-88b22aa8f08c | 0.0.1 | hfopenllm_v2/Novaciano_FuseChat-3.2-1B-GRPO_Creative_RP/1762652579.795153 | 1762652579.795153 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Novaciano/FuseChat-3.2-1B-GRPO_Creative_RP | Novaciano/FuseChat-3.2-1B-GRPO_Creative_RP | Novaciano | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.559814625194484}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 1.236} |
HF Open LLM v2 | Novaciano | Novaciano/LEWD-Mental-Cultist-3.2-1B | 1bce579e-9fac-46a9-92ef-48080832abbb | 0.0.1 | hfopenllm_v2/Novaciano_LEWD-Mental-Cultist-3.2-1B/1762652579.796045 | 1762652579.796046 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Novaciano/LEWD-Mental-Cultist-3.2-1B | Novaciano/LEWD-Mental-Cultist-3.2-1B | Novaciano | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5308636639671627}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 1.498} |
HF Open LLM v2 | Novaciano | Novaciano/Fusetrix-3.2-1B-GRPO_RP_Creative | 7fe4c32b-0bbd-49c0-9e4f-43306457aae8 | 0.0.1 | hfopenllm_v2/Novaciano_Fusetrix-3.2-1B-GRPO_RP_Creative/1762652579.795362 | 1762652579.795362 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Novaciano/Fusetrix-3.2-1B-GRPO_RP_Creative | Novaciano/Fusetrix-3.2-1B-GRPO_RP_Creative | Novaciano | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5366339091388627}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 1.236} |
HF Open LLM v2 | Novaciano | Novaciano/Cultist-3.2-1B | 3dc51dce-222f-455b-b61a-04904c7fc855 | 0.0.1 | hfopenllm_v2/Novaciano_Cultist-3.2-1B/1762652579.7949288 | 1762652579.79493 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Novaciano/Cultist-3.2-1B | Novaciano/Cultist-3.2-1B | Novaciano | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5294895322189568}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 1.498} |
HF Open LLM v2 | Novaciano | Novaciano/La_Mejor_Mezcla-3.2-1B | 49fef1c9-bf18-465c-acdb-b8f17e93dbad | 0.0.1 | hfopenllm_v2/Novaciano_La_Mejor_Mezcla-3.2-1B/1762652579.79625 | 1762652579.7962508 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Novaciano/La_Mejor_Mezcla-3.2-1B | Novaciano/La_Mejor_Mezcla-3.2-1B | Novaciano | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5509969104199081}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 1.498} |
HF Open LLM v2 | Novaciano | Novaciano/ASTAROTH-3.2-1B | e454276c-3113-49f8-9397-9c1ad5e7bcc5 | 0.0.1 | hfopenllm_v2/Novaciano_ASTAROTH-3.2-1B/1762652579.7938519 | 1762652579.793853 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Novaciano/ASTAROTH-3.2-1B | Novaciano/ASTAROTH-3.2-1B | Novaciano | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5612884923115112}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 1.498} |
HF Open LLM v2 | Novaciano | Novaciano/Sigil-Of-Satan-3.2-1B | ae9ceba0-8e8a-431f-a762-7bb6c55b4757 | 0.0.1 | hfopenllm_v2/Novaciano_Sigil-Of-Satan-3.2-1B/1762652579.7964501 | 1762652579.7964501 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Novaciano/Sigil-Of-Satan-3.2-1B | Novaciano/Sigil-Of-Satan-3.2-1B | Novaciano | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5494233079340594}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 1.498} |
HF Open LLM v2 | athirdpath | athirdpath/Llama-3.1-Instruct_NSFW-pretrained_e1-plus_reddit | 9255090f-6862-4ff1-ac91-fe0cd7613445 | 0.0.1 | hfopenllm_v2/athirdpath_Llama-3.1-Instruct_NSFW-pretrained_e1-plus_reddit/1762652580.019914 | 1762652580.019914 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | athirdpath/Llama-3.1-Instruct_NSFW-pretrained_e1-plus_reddit | athirdpath/Llama-3.1-Instruct_NSFW-pretrained_e1-plus_reddit | athirdpath | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4521037513796726}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | aevalone | aevalone/distill_qw_test | 108ead60-3cee-43e7-925a-619bace5b65f | 0.0.1 | hfopenllm_v2/aevalone_distill_qw_test/1762652579.975426 | 1762652579.9754272 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | aevalone/distill_qw_test | aevalone/distill_qw_test | aevalone | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.740889728143548}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | bfuzzy1 | bfuzzy1/acheron-c | 71268c77-565a-401b-a51d-122060ed5945 | 0.0.1 | hfopenllm_v2/bfuzzy1_acheron-c/1762652580.031654 | 1762652580.0316548 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bfuzzy1/acheron-c | bfuzzy1/acheron-c | bfuzzy1 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.19286714805604685}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 0.514} |
HF Open LLM v2 | bfuzzy1 | bfuzzy1/llambses-1 | 3f04797b-fe6d-4cd5-a49e-b898a8db26a6 | 0.0.1 | hfopenllm_v2/bfuzzy1_llambses-1/1762652580.032492 | 1762652580.032493 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bfuzzy1/llambses-1 | bfuzzy1/llambses-1 | bfuzzy1 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3553837152089788}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "MistralForCausalLM", "params_billions": 7.242} |
HF Open LLM v2 | bfuzzy1 | bfuzzy1/acheron | 2b74949a-c0a3-4061-8cf4-4330850af288 | 0.0.1 | hfopenllm_v2/bfuzzy1_acheron/1762652580.031447 | 1762652580.031447 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bfuzzy1/acheron | bfuzzy1/acheron | bfuzzy1 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.19831269919369493}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 0.514} |
HF Open LLM v2 | bfuzzy1 | bfuzzy1/Gunny | e7d0c3d5-d962-49b5-a4b7-3cb7ac12735c | 0.0.1 | hfopenllm_v2/bfuzzy1_Gunny/1762652580.031208 | 1762652580.031209 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bfuzzy1/Gunny | bfuzzy1/Gunny | bfuzzy1 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7128629813339716}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 3.213} |
HF Open LLM v2 | bfuzzy1 | bfuzzy1/acheron-m | fdd707f8-df0b-4384-bc77-35f3fa8ec0a0 | 0.0.1 | hfopenllm_v2/bfuzzy1_acheron-m/1762652580.032056 | 1762652580.032057 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bfuzzy1/acheron-m | bfuzzy1/acheron-m | bfuzzy1 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.17583123889058808}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 0.514} |
HF Open LLM v2 | bfuzzy1 | bfuzzy1/acheron-d | 1c9ba45f-1f3b-42ad-a603-ea7039fee22e | 0.0.1 | hfopenllm_v2/bfuzzy1_acheron-d/1762652580.031856 | 1762652580.031857 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | bfuzzy1/acheron-d | bfuzzy1/acheron-d | bfuzzy1 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.192542454021995}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 0.514} |
HF Open LLM v2 | Alibaba-NLP | Alibaba-NLP/gte-Qwen2-7B-instruct | 39ea9329-5ed7-46ea-bcc4-30679a63b405 | 0.0.1 | hfopenllm_v2/Alibaba-NLP_gte-Qwen2-7B-instruct/1762652579.479603 | 1762652579.479604 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Alibaba-NLP/gte-Qwen2-7B-instruct | Alibaba-NLP/gte-Qwen2-7B-instruct | Alibaba-NLP | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.22554045488193547}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.613} |
HF Open LLM v2 | ozone-research | ozone-research/Chirp-01 | 69a65ae3-71fe-4e33-be2d-20bc0c25969a | 0.0.1 | hfopenllm_v2/ozone-research_Chirp-01/1762652580.433142 | 1762652580.4331431 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | ozone-research/Chirp-01 | ozone-research/Chirp-01 | ozone-research | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6347524568145853}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 3.086} |
HF Open LLM v2 | kz919 | kz919/QwQ-0.5B-Distilled-SFT | 08efd69e-6ff6-48a1-b260-ddbb4a942d12 | 0.0.1 | hfopenllm_v2/kz919_QwQ-0.5B-Distilled-SFT/1762652580.311408 | 1762652580.311409 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | kz919/QwQ-0.5B-Distilled-SFT | kz919/QwQ-0.5B-Distilled-SFT | kz919 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3076725311063534}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 0.494} |
HF Open LLM v2 | SpaceYL | SpaceYL/ECE_Poirot | 32feb55a-fde5-4bbd-b93e-abffc1a7e573 | 0.0.1 | hfopenllm_v2/SpaceYL_ECE_Poirot/1762652579.890822 | 1762652579.890822 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | SpaceYL/ECE_Poirot | SpaceYL/ECE_Poirot | SpaceYL | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3106956209524063}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.544} |
HF Open LLM v2 | sophosympatheia | sophosympatheia/Midnight-Miqu-70B-v1.5 | 3498b101-b86e-4968-abca-a3d3d42a4e5b | 0.0.1 | hfopenllm_v2/sophosympatheia_Midnight-Miqu-70B-v1.5/1762652580.532959 | 1762652580.53296 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | sophosympatheia/Midnight-Miqu-70B-v1.5 | sophosympatheia/Midnight-Miqu-70B-v1.5 | sophosympatheia | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6118465671086051}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 68.977} |
HF Open LLM v2 | HelpingAI | HelpingAI/Cipher-20B | 21f72176-cf3b-43ae-aa6e-51d9fe5a6e90 | 0.0.1 | hfopenllm_v2/HelpingAI_Cipher-20B/1762652579.638349 | 1762652579.63835 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | HelpingAI/Cipher-20B | HelpingAI/Cipher-20B | HelpingAI | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5377575942942504}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 20.551} |
HF Open LLM v2 | HelpingAI | HelpingAI/Dhanishtha-Large | e097ccca-ab91-4f16-bbfa-ca97c91fdb77 | 0.0.1 | hfopenllm_v2/HelpingAI_Dhanishtha-Large/1762652579.638597 | 1762652579.638598 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | HelpingAI/Dhanishtha-Large | HelpingAI/Dhanishtha-Large | HelpingAI | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.24567370133468086}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.613} |
HF Open LLM v2 | HelpingAI | HelpingAI/Priya-3B | f709afd7-3220-41b0-909a-74d9086c7dd9 | 0.0.1 | hfopenllm_v2/HelpingAI_Priya-3B/1762652579.639023 | 1762652579.639024 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | HelpingAI/Priya-3B | HelpingAI/Priya-3B | HelpingAI | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4525780484669566}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 2.81} |
HF Open LLM v2 | HelpingAI | HelpingAI/Priya-10B | 94aca944-b0a9-46ec-bdab-53bb5cbe3b78 | 0.0.1 | hfopenllm_v2/HelpingAI_Priya-10B/1762652579.638817 | 1762652579.638818 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | HelpingAI/Priya-10B | HelpingAI/Priya-10B | HelpingAI | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.40429283190822574}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "LlamaForCausalLM", "params_billions": 10.211} |
HF Open LLM v2 | automerger | automerger/YamshadowExperiment28-7B | 1fa5dee9-c360-40d9-8e67-9b415cd36616 | 0.0.1 | hfopenllm_v2/automerger_YamshadowExperiment28-7B/1762652580.020166 | 1762652580.0201669 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | automerger/YamshadowExperiment28-7B | automerger/YamshadowExperiment28-7B | automerger | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4070156074770498}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "MistralForCausalLM", "params_billions": 7.242} |
HF Open LLM v2 | qingy2024 | qingy2024/Qwen2.5-Coder-Draft-1.5B-Instruct | 40662202-f976-4dc0-acf2-f4794bb5d744 | 0.0.1 | hfopenllm_v2/qingy2024_Qwen2.5-Coder-Draft-1.5B-Instruct/1762652580.487137 | 1762652580.487138 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | qingy2024/Qwen2.5-Coder-Draft-1.5B-Instruct | qingy2024/Qwen2.5-Coder-Draft-1.5B-Instruct | qingy2024 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4125110262991086}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.544} |
HF Open LLM v2 | qingy2024 | qingy2024/Benchmaxx-Llama-3.2-1B-Instruct | 52ed2d5b-d9be-4f3f-b193-8d4cca4ded62 | 0.0.1 | hfopenllm_v2/qingy2024_Benchmaxx-Llama-3.2-1B-Instruct/1762652580.483871 | 1762652580.483871 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | qingy2024/Benchmaxx-Llama-3.2-1B-Instruct | qingy2024/Benchmaxx-Llama-3.2-1B-Instruct | qingy2024 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.20136016879657087}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 1.236} |
HF Open LLM v2 | qingy2024 | qingy2024/Qwen2.5-Math-14B-Instruct-Alpha | 011f32a0-458f-4bea-8192-b18a19ddd0c7 | 0.0.1 | hfopenllm_v2/qingy2024_Qwen2.5-Math-14B-Instruct-Alpha/1762652580.48737 | 1762652580.487371 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | qingy2024/Qwen2.5-Math-14B-Instruct-Alpha | qingy2024/Qwen2.5-Math-14B-Instruct-Alpha | qingy2024 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7704402097545624}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | qingy2024 | qingy2024/Qwen2.5-Math-14B-Instruct-Preview | aab84d55-c491-402c-9ed0-59347573fea9 | 0.0.1 | hfopenllm_v2/qingy2024_Qwen2.5-Math-14B-Instruct-Preview/1762652580.487701 | 1762652580.4877021 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | qingy2024/Qwen2.5-Math-14B-Instruct-Preview | qingy2024/Qwen2.5-Math-14B-Instruct-Preview | qingy2024 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7825802204816554}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | qingy2024 | qingy2024/Eyas-17B-Instruct | c45cc504-88b0-4110-9650-47f4d328f769 | 0.0.1 | hfopenllm_v2/qingy2024_Eyas-17B-Instruct/1762652580.484141 | 1762652580.484141 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | qingy2024/Eyas-17B-Instruct | qingy2024/Eyas-17B-Instruct | qingy2024 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6574588757829227}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 17.431} |
HF Open LLM v2 | qingy2024 | qingy2024/OwO-14B-Instruct | f524ebb6-64cb-43e3-8cff-6305ef122890 | 0.0.1 | hfopenllm_v2/qingy2024_OwO-14B-Instruct/1762652580.485259 | 1762652580.485259 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | qingy2024/OwO-14B-Instruct | qingy2024/OwO-14B-Instruct | qingy2024 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.1383119013107444}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | qingy2024 | qingy2024/Fusion4-14B-Instruct | bb7b828c-07a0-4530-8c2e-8e4b6370cbb4 | 0.0.1 | hfopenllm_v2/qingy2024_Fusion4-14B-Instruct/1762652580.4850292 | 1762652580.48503 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | qingy2024/Fusion4-14B-Instruct | qingy2024/Fusion4-14B-Instruct | qingy2024 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7648949232480928}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | qingy2024 | qingy2024/QwQ-14B-Math-v0.2 | 4092651d-1d14-408d-922d-6189858aab36 | 0.0.1 | hfopenllm_v2/qingy2024_QwQ-14B-Math-v0.2/1762652580.48586 | 1762652580.4858618 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | qingy2024/QwQ-14B-Math-v0.2 | qingy2024/QwQ-14B-Math-v0.2 | qingy2024 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.33909692948044523}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.77} |
HF Open LLM v2 | qingy2024 | qingy2024/QwEnlarge-16B-Instruct | dd44686d-13da-4c88-81d3-6d01676baa4e | 0.0.1 | hfopenllm_v2/qingy2024_QwEnlarge-16B-Instruct/1762652580.485478 | 1762652580.4854789 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | qingy2024/QwEnlarge-16B-Instruct | qingy2024/QwEnlarge-16B-Instruct | qingy2024 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7801821389468832}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 15.871} |
HF Open LLM v2 | qingy2024 | qingy2024/Qwarkstar-4B-Instruct-Preview | 701a4aa4-b057-42d8-8b89-dd59950d1981 | 0.0.1 | hfopenllm_v2/qingy2024_Qwarkstar-4B-Instruct-Preview/1762652580.4865122 | 1762652580.486513 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | qingy2024/Qwarkstar-4B-Instruct-Preview | qingy2024/Qwarkstar-4B-Instruct-Preview | qingy2024 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5324372664530114}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 4.473} |
HF Open LLM v2 | qingy2024 | qingy2024/Fusion2-14B-Instruct | cc17acb9-0f4e-46a9-a250-eb79a0fedc3f | 0.0.1 | hfopenllm_v2/qingy2024_Fusion2-14B-Instruct/1762652580.4848042 | 1762652580.4848042 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | qingy2024/Fusion2-14B-Instruct | qingy2024/Fusion2-14B-Instruct | qingy2024 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6064010159709571}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | qingy2024 | qingy2024/Qwen2.6-14B-Instruct | c27064c4-93d1-41a1-a61f-cde7a991b047 | 0.0.1 | hfopenllm_v2/qingy2024_Qwen2.6-14B-Instruct/1762652580.48806 | 1762652580.488061 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | qingy2024/Qwen2.6-14B-Instruct | qingy2024/Qwen2.6-14B-Instruct | qingy2024 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5810970447302047}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.766} |
HF Open LLM v2 | qingy2024 | qingy2024/Fusion-14B-Instruct | 123331fd-a4fb-4dc6-a30e-17f230618df9 | 0.0.1 | hfopenllm_v2/qingy2024_Fusion-14B-Instruct/1762652580.4845738 | 1762652580.484575 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | qingy2024/Fusion-14B-Instruct | qingy2024/Fusion-14B-Instruct | qingy2024 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7259770741632203}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.0} |
HF Open LLM v2 | qingy2024 | qingy2024/Qwen2.6-Math-14B-Instruct | 37822fb0-4ada-4413-aa77-6938678994d9 | 0.0.1 | hfopenllm_v2/qingy2024_Qwen2.6-Math-14B-Instruct/1762652580.488592 | 1762652580.4885938 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | qingy2024/Qwen2.6-Math-14B-Instruct | qingy2024/Qwen2.6-Math-14B-Instruct | qingy2024 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.38623186478543603}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 14.0} |
HF Open LLM v2 | qingy2024 | qingy2024/Qwarkstar-4B | 9f586b02-3514-46f7-b1df-4e78f286893e | 0.0.1 | hfopenllm_v2/qingy2024_Qwarkstar-4B/1762652580.486229 | 1762652580.4862301 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | qingy2024/Qwarkstar-4B | qingy2024/Qwarkstar-4B | qingy2024 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.19941200459225966}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 4.473} |
HF Open LLM v2 | qingy2024 | qingy2024/Falcon3-2x10B-MoE-Instruct | 302e9f42-b9fa-4e2b-acda-70c391f9b6bc | 0.0.1 | hfopenllm_v2/qingy2024_Falcon3-2x10B-MoE-Instruct/1762652580.484361 | 1762652580.484362 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | qingy2024/Falcon3-2x10B-MoE-Instruct | qingy2024/Falcon3-2x10B-MoE-Instruct | qingy2024 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7849783020164276}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "MixtralForCausalLM", "params_billions": 18.799} |
HF Open LLM v2 | yam-peleg | yam-peleg/Hebrew-Gemma-11B-Instruct | 5d25872d-eacd-4e5c-b9cc-9ee014147730 | 0.0.1 | hfopenllm_v2/yam-peleg_Hebrew-Gemma-11B-Instruct/1762652580.603103 | 1762652580.603105 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | yam-peleg/Hebrew-Gemma-11B-Instruct | yam-peleg/Hebrew-Gemma-11B-Instruct | yam-peleg | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.30207737691547315}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "GemmaForCausalLM", "params_billions": 10.475} |
HF Open LLM v2 | rombodawg | rombodawg/rombos_Replete-Coder-Instruct-8b-Merged | 929abd2b-3f19-4df3-81ab-406751d52919 | 0.0.1 | hfopenllm_v2/rombodawg_rombos_Replete-Coder-Instruct-8b-Merged/1762652580.499815 | 1762652580.499816 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | rombodawg/rombos_Replete-Coder-Instruct-8b-Merged | rombodawg/rombos_Replete-Coder-Instruct-8b-Merged | rombodawg | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.5387571643239937}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 8.03} |
HF Open LLM v2 | rombodawg | rombodawg/Rombos-LLM-V2.6-Nemotron-70b | caf5de06-ab13-45e4-ac51-d4e40796952e | 0.0.1 | hfopenllm_v2/rombodawg_Rombos-LLM-V2.6-Nemotron-70b/1762652580.499233 | 1762652580.499234 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | rombodawg/Rombos-LLM-V2.6-Nemotron-70b | rombodawg/Rombos-LLM-V2.6-Nemotron-70b | rombodawg | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7526551771521784}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 70.554} |
HF Open LLM v2 | llnYou | llnYou/ECE-PRYMMAL-YL-3B-SLERP-V3 | 183cd87c-2415-4428-9ad1-9d41c0dcdc41 | 0.0.1 | hfopenllm_v2/llnYou_ECE-PRYMMAL-YL-3B-SLERP-V3/1762652580.326333 | 1762652580.326334 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | llnYou/ECE-PRYMMAL-YL-3B-SLERP-V3 | llnYou/ECE-PRYMMAL-YL-3B-SLERP-V3 | llnYou | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.35808100285021516}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Phi3ForCausalLM", "params_billions": 3.821} |
HF Open LLM v2 | llnYou | llnYou/ECE-PRYMMAL-YL-1B-SLERP-V6 | eaa1adca-5379-4aab-bf39-8641df58a530 | 0.0.1 | hfopenllm_v2/llnYou_ECE-PRYMMAL-YL-1B-SLERP-V6/1762652580.325702 | 1762652580.325703 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | llnYou/ECE-PRYMMAL-YL-1B-SLERP-V6 | llnYou/ECE-PRYMMAL-YL-1B-SLERP-V6 | llnYou | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.13876181864120535}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.357} |
HF Open LLM v2 | llnYou | llnYou/ECE-PRYMMAL-YL-3B-SLERP-V2 | 2bb16fd8-516f-42d6-91e1-2f3f4024f0d4 | 0.0.1 | hfopenllm_v2/llnYou_ECE-PRYMMAL-YL-3B-SLERP-V2/1762652580.326129 | 1762652580.326129 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | llnYou/ECE-PRYMMAL-YL-3B-SLERP-V2 | llnYou/ECE-PRYMMAL-YL-3B-SLERP-V2 | llnYou | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2309361383351729}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 2.81} |
HF Open LLM v2 | llnYou | llnYou/ECE-PRYMMAL-YL-3B-SLERP-V1 | 844c959f-6859-4220-bdd8-99e6af53808b | 0.0.1 | hfopenllm_v2/llnYou_ECE-PRYMMAL-YL-3B-SLERP-V1/1762652580.325917 | 1762652580.325917 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | llnYou/ECE-PRYMMAL-YL-3B-SLERP-V1 | llnYou/ECE-PRYMMAL-YL-3B-SLERP-V1 | llnYou | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.23463299600615256}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 2.81} |
HF Open LLM v2 | llnYou | llnYou/ECE-PRYMMAL-YL-1B-SLERP-V5 | 334bc38a-becd-405b-8982-dfaf5de35c4b | 0.0.1 | hfopenllm_v2/llnYou_ECE-PRYMMAL-YL-1B-SLERP-V5/1762652580.3253949 | 1762652580.325396 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | llnYou/ECE-PRYMMAL-YL-1B-SLERP-V5 | llnYou/ECE-PRYMMAL-YL-1B-SLERP-V5 | llnYou | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.33125329680802496}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 1.544} |
HF Open LLM v2 | aloobun | aloobun/d-SmolLM2-360M | 1ad7b4c4-8074-482e-9010-ce1552325e15 | 0.0.1 | hfopenllm_v2/aloobun_d-SmolLM2-360M/1762652580.0092921 | 1762652580.009293 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | aloobun/d-SmolLM2-360M | aloobun/d-SmolLM2-360M | aloobun | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.20970358648386284}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 0.362} |
HF Open LLM v2 | GuilhermeNaturaUmana | GuilhermeNaturaUmana/Nature-Reason-1.2-reallysmall | 5aa1bdc6-4b8f-411f-9150-41217a94ec5e | 0.0.1 | hfopenllm_v2/GuilhermeNaturaUmana_Nature-Reason-1.2-reallysmall/1762652579.63471 | 1762652579.634711 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | GuilhermeNaturaUmana/Nature-Reason-1.2-reallysmall | GuilhermeNaturaUmana/Nature-Reason-1.2-reallysmall | GuilhermeNaturaUmana | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4985405391029136}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | GuilhermeNaturaUmana | GuilhermeNaturaUmana/Nature-Reason-1.2-reallysmall | 9ddf874c-16a9-4f66-a3c5-140f10bc4787 | 0.0.1 | hfopenllm_v2/GuilhermeNaturaUmana_Nature-Reason-1.2-reallysmall/1762652579.634963 | 1762652579.634964 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | GuilhermeNaturaUmana/Nature-Reason-1.2-reallysmall | GuilhermeNaturaUmana/Nature-Reason-1.2-reallysmall | GuilhermeNaturaUmana | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.47910654840268263}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Qwen2ForCausalLM", "params_billions": 7.616} |
HF Open LLM v2 | kno10 | kno10/ende-chat-0.0.7 | 6619dec7-71cf-4be6-90e2-815e8dd4e56f | 0.0.1 | hfopenllm_v2/kno10_ende-chat-0.0.7/1762652580.310943 | 1762652580.310944 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | kno10/ende-chat-0.0.7 | kno10/ende-chat-0.0.7 | kno10 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.440063476021401}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "float16", "architecture": "MistralForCausalLM", "params_billions": 7.891} |
HF Open LLM v2 | kno10 | kno10/ende-chat-0.0.5 | af2f11cf-8efa-4c71-a0b2-74f953b8e61b | 0.0.1 | hfopenllm_v2/kno10_ende-chat-0.0.5/1762652580.310679 | 1762652580.3106802 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | kno10/ende-chat-0.0.5 | kno10/ende-chat-0.0.5 | kno10 | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3404455733010634}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "MistralForCausalLM", "params_billions": 7.891} |
HF Open LLM v2 | Locutusque | Locutusque/CollectiveLM-Falcon-3-7B | 44737b7e-4942-4496-a818-fddce66da4d6 | 0.0.1 | hfopenllm_v2/Locutusque_CollectiveLM-Falcon-3-7B/1762652579.734693 | 1762652579.734694 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Locutusque/CollectiveLM-Falcon-3-7B | Locutusque/CollectiveLM-Falcon-3-7B | Locutusque | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3918281271470808}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "LlamaForCausalLM", "params_billions": 7.456} |
HF Open LLM v2 | LGAI-EXAONE | LGAI-EXAONE/EXAONE-3.5-7.8B-Instruct | 7fa474fb-4aa1-4855-9759-a28056c7a5e7 | 0.0.1 | hfopenllm_v2/LGAI-EXAONE_EXAONE-3.5-7.8B-Instruct/1762652579.705873 | 1762652579.705875 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | LGAI-EXAONE/EXAONE-3.5-7.8B-Instruct | LGAI-EXAONE/EXAONE-3.5-7.8B-Instruct | LGAI-EXAONE | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.8136045692096969}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "ExaoneForCausalLM", "params_billions": 7.818} |
HF Open LLM v2 | LGAI-EXAONE | LGAI-EXAONE/EXAONE-3.0-7.8B-Instruct | 97f7c73d-6d69-4c04-9cff-4914253003b0 | 0.0.1 | hfopenllm_v2/LGAI-EXAONE_EXAONE-3.0-7.8B-Instruct/1762652579.705025 | 1762652579.705025 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | LGAI-EXAONE/EXAONE-3.0-7.8B-Instruct | LGAI-EXAONE/EXAONE-3.0-7.8B-Instruct | LGAI-EXAONE | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7192826145737754}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "ExaoneForCausalLM", "params_billions": 7.8} |
HF Open LLM v2 | LGAI-EXAONE | LGAI-EXAONE/EXAONE-3.5-32B-Instruct | a172b1d1-6d6e-4cd9-9a85-78cb4f71661e | 0.0.1 | hfopenllm_v2/LGAI-EXAONE_EXAONE-3.5-32B-Instruct/1762652579.705488 | 1762652579.705489 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | LGAI-EXAONE/EXAONE-3.5-32B-Instruct | LGAI-EXAONE/EXAONE-3.5-32B-Instruct | LGAI-EXAONE | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.8391833668000904}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "ExaoneForCausalLM", "params_billions": 32.003} |
HF Open LLM v2 | LGAI-EXAONE | LGAI-EXAONE/EXAONE-3.5-2.4B-Instruct | e2a2d764-ba6b-450d-8f94-abf2af95e793 | 0.0.1 | hfopenllm_v2/LGAI-EXAONE_EXAONE-3.5-2.4B-Instruct/1762652579.705282 | 1762652579.7052832 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | LGAI-EXAONE/EXAONE-3.5-2.4B-Instruct | LGAI-EXAONE/EXAONE-3.5-2.4B-Instruct | LGAI-EXAONE | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7950449252428002}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "ExaoneForCausalLM", "params_billions": 2.405} |
HF Open LLM v2 | google | nbeerbower/gemma2-gutenberg-27B | b0a9fb09-2637-4b4c-9d78-7dc8d9c6aad2 | 0.0.1 | hfopenllm_v2/nbeerbower_gemma2-gutenberg-27B/1762652580.384448 | 1762652580.3844512 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | nbeerbower/gemma2-gutenberg-27B | nbeerbower/gemma2-gutenberg-27B | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.29470804133033685}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Gemma2ForCausalLM", "params_billions": 27.227} |
HF Open LLM v2 | google | nbeerbower/Gemma2-Gutenberg-Doppel-9B | b6514bef-f106-45e0-8571-da3507b0e95b | 0.0.1 | hfopenllm_v2/nbeerbower_Gemma2-Gutenberg-Doppel-9B/1762652580.378716 | 1762652580.378717 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | nbeerbower/Gemma2-Gutenberg-Doppel-9B | nbeerbower/Gemma2-Gutenberg-Doppel-9B | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7171094917042337}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Gemma2ForCausalLM", "params_billions": 9.242} |
HF Open LLM v2 | google | nbeerbower/gemma2-gutenberg-9B | 14dc56ff-7f3b-430e-a4b3-6e4c9961fea3 | 0.0.1 | hfopenllm_v2/nbeerbower_gemma2-gutenberg-9B/1762652580.384712 | 1762652580.384713 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | nbeerbower/gemma2-gutenberg-9B | nbeerbower/gemma2-gutenberg-9B | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2795948084416016}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Gemma2ForCausalLM", "params_billions": 9.242} |
HF Open LLM v2 | google | Sorawiz/Gemma-9B-Base | 246e4c1f-016c-411e-870e-9ade63713daa | 0.0.1 | hfopenllm_v2/Sorawiz_Gemma-9B-Base/1762652579.8897338 | 1762652579.889735 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Sorawiz/Gemma-9B-Base | Sorawiz/Gemma-9B-Base | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.16673758959560633}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Gemma2ForCausalLM", "params_billions": 10.159} |
HF Open LLM v2 | google | Sorawiz/Gemma-Creative-9B-Base | 26229a4f-9f53-453f-9899-77808040f8cb | 0.0.1 | hfopenllm_v2/Sorawiz_Gemma-Creative-9B-Base/1762652579.890075 | 1762652579.890076 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Sorawiz/Gemma-Creative-9B-Base | Sorawiz/Gemma-Creative-9B-Base | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.1515002415812267}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Gemma2ForCausalLM", "params_billions": 10.159} |
HF Open LLM v2 | google | zake7749/gemma-2-9b-it-chinese-kyara | 827af354-0efb-4a44-b62a-c8562fd0065b | 0.0.1 | hfopenllm_v2/zake7749_gemma-2-9b-it-chinese-kyara/1762652580.612564 | 1762652580.612565 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | zake7749/gemma-2-9b-it-chinese-kyara | zake7749/gemma-2-9b-it-chinese-kyara | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.17642965110351644}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Gemma2ForCausalLM", "params_billions": 9.242} |
HF Open LLM v2 | google | DavidAU/Gemma-The-Writer-N-Restless-Quill-10B-Uncensored | b708a2a6-d738-48a9-9c20-0838bdb19646 | 0.0.1 | hfopenllm_v2/DavidAU_Gemma-The-Writer-N-Restless-Quill-10B-Uncensored/1762652579.540709 | 1762652579.54071 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | DavidAU/Gemma-The-Writer-N-Restless-Quill-10B-Uncensored | DavidAU/Gemma-The-Writer-N-Restless-Quill-10B-Uncensored | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7070927361622716}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Gemma2ForCausalLM", "params_billions": 10.034} |
HF Open LLM v2 | google | DavidAU/Gemma-The-Writer-9B | a639bba5-4d0e-4d0b-826a-3eb4d0ccebab | 0.0.1 | hfopenllm_v2/DavidAU_Gemma-The-Writer-9B/1762652579.539702 | 1762652579.5397062 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | DavidAU/Gemma-The-Writer-9B | DavidAU/Gemma-The-Writer-9B | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.17403156956874427}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Gemma2ForCausalLM", "params_billions": 10.159} |
HF Open LLM v2 | google | DavidAU/Gemma-The-Writer-Mighty-Sword-9B | a403d91c-4f30-4d05-9f00-24ce97cc91ac | 0.0.1 | hfopenllm_v2/DavidAU_Gemma-The-Writer-Mighty-Sword-9B/1762652579.540473 | 1762652579.5404742 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | DavidAU/Gemma-The-Writer-Mighty-Sword-9B | DavidAU/Gemma-The-Writer-Mighty-Sword-9B | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7527549125209998}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Gemma2ForCausalLM", "params_billions": 10.159} |
HF Open LLM v2 | google | DavidAU/Gemma-The-Writer-J.GutenBerg-10B | 3d1cef14-ea09-45ca-a92c-a1fe7a05ce8b | 0.0.1 | hfopenllm_v2/DavidAU_Gemma-The-Writer-J.GutenBerg-10B/1762652579.5402539 | 1762652579.540255 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | DavidAU/Gemma-The-Writer-J.GutenBerg-10B | DavidAU/Gemma-The-Writer-J.GutenBerg-10B | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.28578948301617485}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Gemma2ForCausalLM", "params_billions": 10.034} |
HF Open LLM v2 | google | DavidAU/Gemma-The-Writer-DEADLINE-10B | 66d2e2a4-a75c-4fb9-af6a-3181f17281af | 0.0.1 | hfopenllm_v2/DavidAU_Gemma-The-Writer-DEADLINE-10B/1762652579.5400288 | 1762652579.54003 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | DavidAU/Gemma-The-Writer-DEADLINE-10B | DavidAU/Gemma-The-Writer-DEADLINE-10B | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.23315802071836061}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Gemma2ForCausalLM", "params_billions": 10.952} |
HF Open LLM v2 | google | nhyha/N3N_gemma-2-9b-it_20241110_2026 | 4c450b48-8477-45cb-9cfa-814c21dd39d7 | 0.0.1 | hfopenllm_v2/nhyha_N3N_gemma-2-9b-it_20241110_2026/1762652580.406234 | 1762652580.406235 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | nhyha/N3N_gemma-2-9b-it_20241110_2026 | nhyha/N3N_gemma-2-9b-it_20241110_2026 | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6282829558903709}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Gemma2ForCausalLM", "params_billions": 10.159} |
HF Open LLM v2 | google | nhyha/N3N_gemma-2-9b-it_20241029_1532 | cb85dee2-acee-48f8-85aa-1d5664179fd5 | 0.0.1 | hfopenllm_v2/nhyha_N3N_gemma-2-9b-it_20241029_1532/1762652580.4059799 | 1762652580.4059808 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | nhyha/N3N_gemma-2-9b-it_20241029_1532 | nhyha/N3N_gemma-2-9b-it_20241029_1532 | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6751940407008958}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Gemma2ForCausalLM", "params_billions": 10.159} |
HF Open LLM v2 | google | Aashraf995/Gemma-Evo-10B | 15b910c7-6c36-4af8-af78-d48278dbc4db | 0.0.1 | hfopenllm_v2/Aashraf995_Gemma-Evo-10B/1762652579.476305 | 1762652579.476305 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Aashraf995/Gemma-Evo-10B | Aashraf995/Gemma-Evo-10B | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7332211864519476}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Gemma2ForCausalLM", "params_billions": 10.159} |
HF Open LLM v2 | google | anakin87/gemma-2b-orpo | 80531a18-00d3-4264-bf84-cd1d4d90df08 | 0.0.1 | hfopenllm_v2/anakin87_gemma-2b-orpo/1762652580.010973 | 1762652580.010974 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | anakin87/gemma-2b-orpo | anakin87/gemma-2b-orpo | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.24779695651981187}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "GemmaForCausalLM", "params_billions": 2.506} |
HF Open LLM v2 | google | google/flan-t5-small | 368a36c5-8211-4240-ac88-3fd5e5414310 | 0.0.1 | hfopenllm_v2/google_flan-t5-small/1762652580.173366 | 1762652580.173366 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | google/flan-t5-small | google/flan-t5-small | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.1524255641697363}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "T5ForConditionalGeneration", "params_billions": 0.077} |
HF Open LLM v2 | google | AELLM/gemma-2-aeria-infinity-9b | 93d08946-76b5-4547-8bf0-966c5cccd8c1 | 0.0.1 | hfopenllm_v2/AELLM_gemma-2-aeria-infinity-9b/1762652579.4729412 | 1762652579.472942 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | AELLM/gemma-2-aeria-infinity-9b | AELLM/gemma-2-aeria-infinity-9b | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.759399504426034}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "Gemma2ForCausalLM", "params_billions": 9.242} |
HF Open LLM v2 | google | AELLM/gemma-2-lyco-infinity-9b | fa16a47e-4009-487b-8252-1fef155ce6b4 | 0.0.1 | hfopenllm_v2/AELLM_gemma-2-lyco-infinity-9b/1762652579.473207 | 1762652579.473208 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | AELLM/gemma-2-lyco-infinity-9b | AELLM/gemma-2-lyco-infinity-9b | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7316475839660989}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Gemma2ForCausalLM", "params_billions": 10.159} |
HF Open LLM v2 | google | TheDrummer/Gemmasutra-9B-v1 | 3f7a68f4-e456-4ecf-8a5f-1f3698822a89 | 0.0.1 | hfopenllm_v2/TheDrummer_Gemmasutra-9B-v1/1762652579.9140742 | 1762652579.914075 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | TheDrummer/Gemmasutra-9B-v1 | TheDrummer/Gemmasutra-9B-v1 | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.24155130609006326}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Gemma2ForCausalLM", "params_billions": 10.159} |
HF Open LLM v2 | google | TheDrummer/Tiger-Gemma-9B-v3 | 6fbfd3ba-e28a-4e9d-be12-e04b6d50b9ee | 0.0.1 | hfopenllm_v2/TheDrummer_Tiger-Gemma-9B-v3/1762652579.915734 | 1762652579.915734 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | TheDrummer/Tiger-Gemma-9B-v3 | TheDrummer/Tiger-Gemma-9B-v3 | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6820635912711606}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Gemma2ForCausalLM", "params_billions": 9.242} |
HF Open LLM v2 | google | TheDrummer/Gemmasutra-Mini-2B-v1 | 3c066bd3-ec6c-412d-86a1-759c228610b9 | 0.0.1 | hfopenllm_v2/TheDrummer_Gemmasutra-Mini-2B-v1/1762652579.914318 | 1762652579.914319 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | TheDrummer/Gemmasutra-Mini-2B-v1 | TheDrummer/Gemmasutra-Mini-2B-v1 | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.25486597782771936}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Gemma2ForCausalLM", "params_billions": 2.614} |
HF Open LLM v2 | google | TheDrummer/Tiger-Gemma-9B-v1 | 7b093f59-7a4e-4e72-b9a6-7d10870917ea | 0.0.1 | hfopenllm_v2/TheDrummer_Tiger-Gemma-9B-v1/1762652579.915312 | 1762652579.915313 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | TheDrummer/Tiger-Gemma-9B-v1 | TheDrummer/Tiger-Gemma-9B-v1 | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.728150197032762}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "float16", "architecture": "Gemma2ForCausalLM", "params_billions": 9.242} |
HF Open LLM v2 | google | TheDrummer/Tiger-Gemma-9B-v2 | 962205b9-009a-4201-b382-5143c80e78ce | 0.0.1 | hfopenllm_v2/TheDrummer_Tiger-Gemma-9B-v2/1762652579.915529 | 1762652579.91553 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | TheDrummer/Tiger-Gemma-9B-v2 | TheDrummer/Tiger-Gemma-9B-v2 | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6985997154217476}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Gemma2ForCausalLM", "params_billions": 9.242} |
HF Open LLM v2 | google | google/switch-base-8 | 43e22ce0-cdd7-424f-8a01-f9fea8b2a010 | 0.0.1 | hfopenllm_v2/google_switch-base-8/1762652580.180255 | 1762652580.180256 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | google/switch-base-8 | google/switch-base-8 | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.15852050337548815}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "SwitchTransformersForConditionalGeneration", "params_billions": 0.62} |
HF Open LLM v2 | google | google/flan-t5-base | 69eb63bf-72dd-4995-a8ec-49fd304a8ee7 | 0.0.1 | hfopenllm_v2/google_flan-t5-base/1762652580.172907 | 1762652580.172908 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | google/flan-t5-base | google/flan-t5-base | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.18907055501624578}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "T5ForConditionalGeneration", "params_billions": 0.248} |
HF Open LLM v2 | google | grimjim/Gigantes-v1-gemma2-9b-it | 57072a5e-1f64-4ae2-9e2c-caecc1dc05f4 | 0.0.1 | hfopenllm_v2/grimjim_Gigantes-v1-gemma2-9b-it/1762652580.1819131 | 1762652580.1819131 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | grimjim/Gigantes-v1-gemma2-9b-it | grimjim/Gigantes-v1-gemma2-9b-it | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.692454908531585}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "Gemma2ForCausalLM", "params_billions": 9.242} |
HF Open LLM v2 | google | grimjim/Magnolia-v2-Gemma2-8k-9B | 4d0574f4-4d91-4395-afff-133216eee509 | 0.0.1 | hfopenllm_v2/grimjim_Magnolia-v2-Gemma2-8k-9B/1762652580.184566 | 1762652580.184567 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | grimjim/Magnolia-v2-Gemma2-8k-9B | grimjim/Magnolia-v2-Gemma2-8k-9B | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7384417789243651}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Gemma2ForCausalLM", "params_billions": 9.242} |
HF Open LLM v2 | google | grimjim/Magot-v1-Gemma2-8k-9B | 9e63ff64-f862-40ad-b594-31063ec0d31e | 0.0.1 | hfopenllm_v2/grimjim_Magot-v1-Gemma2-8k-9B/1762652580.185666 | 1762652580.185667 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | grimjim/Magot-v1-Gemma2-8k-9B | grimjim/Magot-v1-Gemma2-8k-9B | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.29967818720993633}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Gemma2ForCausalLM", "params_billions": 9.242} |
HF Open LLM v2 | google | grimjim/Gigantes-v2-gemma2-9b-it | 47486923-2194-4b8e-930c-ca14bd5f8a26 | 0.0.1 | hfopenllm_v2/grimjim_Gigantes-v2-gemma2-9b-it/1762652580.182155 | 1762652580.182156 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | grimjim/Gigantes-v2-gemma2-9b-it | grimjim/Gigantes-v2-gemma2-9b-it | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7350696152874374}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Gemma2ForCausalLM", "params_billions": 9.242} |
HF Open LLM v2 | google | grimjim/Magot-v2-Gemma2-8k-9B | 2d250aa8-f3c5-4f9f-9e5c-dde8f720db53 | 0.0.1 | hfopenllm_v2/grimjim_Magot-v2-Gemma2-8k-9B/1762652580.185882 | 1762652580.1858828 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | grimjim/Magot-v2-Gemma2-8k-9B | grimjim/Magot-v2-Gemma2-8k-9B | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7347449212533854}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Gemma2ForCausalLM", "params_billions": 9.242} |
HF Open LLM v2 | google | grimjim/Magnolia-v3-Gemma2-8k-9B | 8fff2cec-a733-4505-bce9-8b605044181a | 0.0.1 | hfopenllm_v2/grimjim_Magnolia-v3-Gemma2-8k-9B/1762652580.1850398 | 1762652580.185041 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | grimjim/Magnolia-v3-Gemma2-8k-9B | grimjim/Magnolia-v3-Gemma2-8k-9B | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.7378422585406721}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Gemma2ForCausalLM", "params_billions": 9.242} |
HF Open LLM v2 | google | grimjim/Magnolia-v1-Gemma2-8k-9B | 2cf17692-b105-41df-9783-6c7728ab778f | 0.0.1 | hfopenllm_v2/grimjim_Magnolia-v1-Gemma2-8k-9B/1762652580.1841059 | 1762652580.1841059 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | grimjim/Magnolia-v1-Gemma2-8k-9B | grimjim/Magnolia-v1-Gemma2-8k-9B | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.35308536904302806}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Gemma2ForCausalLM", "params_billions": 9.242} |
HF Open LLM v2 | google | grimjim/Gigantes-v3-gemma2-9b-it | bb063d7a-65fa-416b-88e9-7bacdef1da3e | 0.0.1 | hfopenllm_v2/grimjim_Gigantes-v3-gemma2-9b-it/1762652580.182362 | 1762652580.1823628 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | grimjim/Gigantes-v3-gemma2-9b-it | grimjim/Gigantes-v3-gemma2-9b-it | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.697625633319592}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH"... | {"precision": "bfloat16", "architecture": "Gemma2ForCausalLM", "params_billions": 9.242} |
HF Open LLM v2 | google | INSAIT-Institute/BgGPT-Gemma-2-27B-IT-v1.0 | 51d4db96-4c38-464a-9e7f-0ade67699c8d | 0.0.1 | hfopenllm_v2/INSAIT-Institute_BgGPT-Gemma-2-27B-IT-v1.0/1762652579.645844 | 1762652579.645845 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | INSAIT-Institute/BgGPT-Gemma-2-27B-IT-v1.0 | INSAIT-Institute/BgGPT-Gemma-2-27B-IT-v1.0 | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.0}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH", "lower_is_be... | {"precision": "float16", "architecture": "Gemma2ForCausalLM", "params_billions": 27.227} |
HF Open LLM v2 | google | AALF/gemma-2-27b-it-SimPO-37K-100steps | 214ebe7f-357a-435c-9bf5-451bdea1ca9a | 0.0.1 | hfopenllm_v2/AALF_gemma-2-27b-it-SimPO-37K-100steps/1762652579.472713 | 1762652579.472714 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | AALF/gemma-2-27b-it-SimPO-37K-100steps | AALF/gemma-2-27b-it-SimPO-37K-100steps | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2567642743476199}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Gemma2ForCausalLM", "params_billions": 27.227} |
HF Open LLM v2 | google | AALF/gemma-2-27b-it-SimPO-37K | 878ec84b-a365-4887-b7fd-1dc738f6eda8 | 0.0.1 | hfopenllm_v2/AALF_gemma-2-27b-it-SimPO-37K/1762652579.472391 | 1762652579.4723918 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | AALF/gemma-2-27b-it-SimPO-37K | AALF/gemma-2-27b-it-SimPO-37K | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.24065257959990605}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "bfloat16", "architecture": "Gemma2ForCausalLM", "params_billions": 27.227} |
HF Open LLM v2 | google | google/flan-t5-xxl | e15f4783-510e-4b92-a999-072caa425d4c | 0.0.1 | hfopenllm_v2/google_flan-t5-xxl/1762652580.174026 | 1762652580.174026 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | google/flan-t5-xxl | google/flan-t5-xxl | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.2200450360598767}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "T5ForConditionalGeneration", "params_billions": 11.267} |
HF Open LLM v2 | google | EpistemeAI/Athene-codegemma-2-7b-it-alpaca-v1.3 | c05e106e-203a-49e7-b656-22809ac16037 | 0.0.1 | hfopenllm_v2/EpistemeAI_Athene-codegemma-2-7b-it-alpaca-v1.3/1762652579.598942 | 1762652579.598943 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | EpistemeAI/Athene-codegemma-2-7b-it-alpaca-v1.3 | EpistemeAI/Athene-codegemma-2-7b-it-alpaca-v1.3 | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.40299405577201824}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BB... | {"precision": "float16", "architecture": "GemmaForCausalLM", "params_billions": 7.0} |
HF Open LLM v2 | google | EpistemeAI/Athena-gemma-2-2b-it-Philos | 21096485-ff49-4481-a530-48746334fceb | 0.0.1 | hfopenllm_v2/EpistemeAI_Athena-gemma-2-2b-it-Philos/1762652579.598697 | 1762652579.598698 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | EpistemeAI/Athena-gemma-2-2b-it-Philos | EpistemeAI/Athena-gemma-2-2b-it-Philos | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.4620950189940469}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Gemma2ForCausalLM", "params_billions": 2.0} |
HF Open LLM v2 | google | EpistemeAI/Athena-gemma-2-2b-it | a0ca047c-97c2-4ba1-84a7-ba0b00ba6d25 | 0.0.1 | hfopenllm_v2/EpistemeAI_Athena-gemma-2-2b-it/1762652579.598221 | 1762652579.598221 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | EpistemeAI/Athena-gemma-2-2b-it | EpistemeAI/Athena-gemma-2-2b-it | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.3134172883504657}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "float16", "architecture": "Gemma2ForCausalLM", "params_billions": 2.0} |
HF Open LLM v2 | google | Gunulhona/Gemma-Ko-Merge | dccf426d-63bb-4298-958f-d1f4776f03b2 | 0.0.1 | hfopenllm_v2/Gunulhona_Gemma-Ko-Merge/1762652579.635146 | 1762652579.635147 | ["https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"] | HF Open LLM v2 | leaderboard | Hugging Face | null | null | third_party | Gunulhona/Gemma-Ko-Merge | Gunulhona/Gemma-Ko-Merge | google | unknown | [{"evaluation_name": "IFEval", "metric_config": {"evaluation_description": "Accuracy on IFEval", "lower_is_better": false, "score_type": "continuous", "min_score": 0, "max_score": 1}, "score_details": {"score": 0.6415721397004392}}, {"evaluation_name": "BBH", "metric_config": {"evaluation_description": "Accuracy on BBH... | {"precision": "bfloat16", "architecture": "Gemma2ForCausalLM", "params_billions": 10.159} |
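Each row above carries its benchmark scores as a serialized JSON list in the evaluation_results column (truncated here for display), with per-benchmark entries holding an evaluation_name, a metric_config, and a score_details.score. The sketch below shows one way to pull per-model IFEval scores from the formatted-leaderboard endpoint cited in every row's source_data column; the field names mirror the visible rows, but the overall response shape is an assumption rather than a documented contract, and extract_ifeval is a hypothetical helper, not part of any official client.

```python
import json
import requests

# Endpoint cited in the source_data column of every row above.
URL = "https://open-llm-leaderboard-open-llm-leaderboard.hf.space/api/leaderboard/formatted"

def extract_ifeval(evaluation_results):
    """Return the IFEval score from a list of evaluation dicts, or None.

    Field names (evaluation_name, score_details.score) are taken from the
    truncated JSON visible in the rows above.
    """
    for entry in evaluation_results:
        if entry.get("evaluation_name") == "IFEval":
            return entry.get("score_details", {}).get("score")
    return None

rows = requests.get(URL, timeout=30).json()  # assumed: top-level JSON array of row dicts
for row in rows:
    results = row.get("evaluation_results", [])
    # Some exports serialize the list as a JSON string, so decode defensively.
    if isinstance(results, str):
        results = json.loads(results)
    print(row.get("model_id"), extract_ifeval(results))
```

A usage note: scores are reported on a 0-1 scale per the metric_config (min_score 0, max_score 1), so a printed value of 0.7527549125209998 for DavidAU/Gemma-The-Writer-Mighty-Sword-9B corresponds to roughly 75.3% IFEval accuracy.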