| column | dtype | min | max |
|---|---|---|---|
| id | string (length) | 2 | 115 |
| lastModified | string (length) | 24 | 24 |
| tags | list | – | – |
| author | string (length) | 2 | 42 |
| description | string (length) | 0 | 68.7k |
| citation | string (length) | 0 | 10.7k |
| cardData | null | – | – |
| likes | int64 | 0 | 3.55k |
| downloads | int64 | 0 | 10.1M |
| card | string (length) | 0 | 1.01M |
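The schema above can be sketched as a plain Python structure for filtering or sorting a listing of this shape. This is a minimal sketch: the record class and the second sample row are invented for illustration, only the first row is taken from the records below.

```python
# Minimal sketch of the listing schema above; the second record is an
# invented example, not a row from the actual dump.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class DatasetRecord:
    id: str                      # 2-115 chars
    lastModified: str            # ISO-8601 timestamp, 24 chars
    tags: List[str]
    author: str                  # 2-42 chars
    description: Optional[str]
    citation: Optional[str]
    cardData: Optional[dict]
    likes: int                   # 0-3.55k
    downloads: int               # 0-10.1M
    card: Optional[str]          # raw README text, 0-1.01M chars

records = [
    DatasetRecord("weaviate/WithRetrieval-APISplit-Test-80",
                  "2023-10-04T02:47:38.000Z",
                  ["license:apache-2.0", "region:us"],
                  "weaviate", None, None, None, 0, 0,
                  "--- license: apache-2.0 ---"),
    DatasetRecord("example/popular-dataset",   # hypothetical record
                  "2023-10-01T00:00:00.000Z",
                  ["region:us"], "example", None, None, None,
                  120, 50_000, None),
]

# e.g. keep only datasets with downloads, most popular first
popular = sorted((r for r in records if r.downloads > 0),
                 key=lambda r: r.downloads, reverse=True)
print([r.id for r in popular])
```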
id: weaviate/WithRetrieval-APISplit-Test-80
lastModified: 2023-10-04T02:47:38.000Z
tags: [ "license:apache-2.0", "region:us" ]
author: weaviate
description: null
citation: null
cardData: null
likes: 0
downloads: 0
card: --- license: apache-2.0 ---

id: weaviate/WithRetrieval-APISplit-Train-40
lastModified: 2023-10-04T02:48:06.000Z
tags: [ "license:apache-2.0", "region:us" ]
author: weaviate
description: null
citation: null
cardData: null
likes: 0
downloads: 0
card: --- license: apache-2.0 ---

id: weaviate/WithRetrieval-APISplit-Test-40
lastModified: 2023-10-04T02:49:24.000Z
tags: [ "license:apache-2.0", "region:us" ]
author: weaviate
description: null
citation: null
cardData: null
likes: 0
downloads: 0
card: --- license: apache-2.0 ---
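The `tags` lists in these records use a `namespace:value` convention (e.g. `license:apache-2.0`, `region:us`). A small sketch of splitting such tags into a lookup structure (the `parse_tags` helper is hypothetical, not part of any Hub library):

```python
# Split namespaced tag strings such as "license:apache-2.0" into a
# {namespace: [values]} mapping; bare tags are collected under None.
def parse_tags(tags):
    parsed = {}
    for tag in tags:
        key, sep, value = tag.partition(":")
        if sep:
            parsed.setdefault(key, []).append(value)
        else:
            parsed.setdefault(None, []).append(key)
    return parsed

tags = ["license:apache-2.0", "region:us"]
print(parse_tags(tags))  # {'license': ['apache-2.0'], 'region': ['us']}
```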
id: open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_TEST3
lastModified: 2023-10-04T02:49:56.000Z
tags: [ "region:us" ]
author: open-llm-leaderboard
description: null
citation: null
cardData: null
likes: 0
downloads: 0
card:
--- pretty_name: Evaluation run of wei123602/Llama-2-13b-FINETUNE4_TEST3 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [wei123602/Llama-2-13b-FINETUNE4_TEST3](https://huggingface.co/wei123602/Llama-2-13b-FINETUNE4_TEST3)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_TEST3\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T02:48:34.144397](https://huggingface.co/datasets/open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_TEST3/blob/main/results_2023-10-04T02-48-34.144397.json) (note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5640249019461957,\n\ \ \"acc_stderr\": 0.034606850502455905,\n \"acc_norm\": 0.5684780100796328,\n\ \ \"acc_norm_stderr\": 0.034586314987852085,\n \"mc1\": 0.27050183598531213,\n\ \ \"mc1_stderr\": 0.015550778332842893,\n \"mc2\": 0.3997802061886997,\n\ \ \"mc2_stderr\": 0.014380874604105908\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5358361774744027,\n \"acc_stderr\": 0.01457381366473572,\n\ \ \"acc_norm\": 0.590443686006826,\n \"acc_norm_stderr\": 0.014370358632472432\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6083449512049393,\n\ \ \"acc_stderr\": 0.004871226629346401,\n \"acc_norm\": 0.8164708225453097,\n\ \ \"acc_norm_stderr\": 0.0038630862999845836\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\ \ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\ \ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.039889037033362836,\n\ \ \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.039889037033362836\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\ \ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \ \ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6452830188679245,\n \"acc_stderr\": 0.02944517532819959,\n\ \ \"acc_norm\": 0.6452830188679245,\n \"acc_norm_stderr\": 0.02944517532819959\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5694444444444444,\n\ \ \"acc_stderr\": 0.04140685639111503,\n \"acc_norm\": 0.5694444444444444,\n\ \ \"acc_norm_stderr\": 0.04140685639111503\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\ : 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\ : {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \ \ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \ \ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n\ \ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \ \ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_medicine|5\"\ : {\n \"acc\": 0.5549132947976878,\n \"acc_stderr\": 0.03789401760283647,\n\ \ \"acc_norm\": 0.5549132947976878,\n \"acc_norm_stderr\": 0.03789401760283647\n\ \ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n\ \ \"acc_stderr\": 0.047240073523838876,\n \"acc_norm\": 0.3431372549019608,\n\ \ \"acc_norm_stderr\": 0.047240073523838876\n },\n \"harness|hendrycksTest-computer_security|5\"\ : {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \ \ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \ \ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.41702127659574467,\n\ \ \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.41702127659574467,\n\ \ \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\"\ : {\n \"acc\": 0.37719298245614036,\n \"acc_stderr\": 0.04559522141958216,\n\ \ \"acc_norm\": 0.37719298245614036,\n \"acc_norm_stderr\": 0.04559522141958216\n\ \ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\ : 0.43448275862068964,\n \"acc_stderr\": 0.041307408795554966,\n \"\ acc_norm\": 0.43448275862068964,\n \"acc_norm_stderr\": 0.041307408795554966\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.3544973544973545,\n \"acc_stderr\": 0.024636830602842,\n \"acc_norm\"\ : 0.3544973544973545,\n \"acc_norm_stderr\": 0.024636830602842\n 
},\n\ \ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\ \ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\ \ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.5774193548387097,\n \"acc_stderr\": 0.02810096472427264,\n \"\ acc_norm\": 0.5774193548387097,\n \"acc_norm_stderr\": 0.02810096472427264\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n \"\ acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\"\ : 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\ \ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7121212121212122,\n \"acc_stderr\": 0.03225883512300992,\n \"\ acc_norm\": 0.7121212121212122,\n \"acc_norm_stderr\": 0.03225883512300992\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548057,\n\ \ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548057\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.5794871794871795,\n \"acc_stderr\": 0.02502861027671086,\n \ \ \"acc_norm\": 0.5794871794871795,\n \"acc_norm_stderr\": 0.02502861027671086\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 
0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \ \ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n\ \ \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360383,\n \"\ acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360383\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7724770642201835,\n \"acc_stderr\": 0.017974463578776502,\n \"\ acc_norm\": 0.7724770642201835,\n \"acc_norm_stderr\": 0.017974463578776502\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\ acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n\ \ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\ : {\n \"acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n\ \ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n\ \ \"acc_stderr\": 0.03236198350928276,\n \"acc_norm\": 0.6322869955156951,\n\ \ \"acc_norm_stderr\": 0.03236198350928276\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009224,\n\ \ \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009224\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.6859504132231405,\n \"acc_stderr\": 0.042369647530410184,\n \"\ acc_norm\": 
0.6859504132231405,\n \"acc_norm_stderr\": 0.042369647530410184\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\ \ \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n\ \ \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046735,\n\ \ \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046735\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\ \ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\ \ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.041858325989283136,\n\ \ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.041858325989283136\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\ \ \"acc_stderr\": 0.026246772946890477,\n \"acc_norm\": 0.7991452991452992,\n\ \ \"acc_norm_stderr\": 0.026246772946890477\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \ \ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7713920817369093,\n\ \ \"acc_stderr\": 0.015016884698539883,\n \"acc_norm\": 0.7713920817369093,\n\ \ \"acc_norm_stderr\": 0.015016884698539883\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6271676300578035,\n \"acc_stderr\": 0.02603389061357628,\n\ \ \"acc_norm\": 0.6271676300578035,\n \"acc_norm_stderr\": 0.02603389061357628\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31620111731843575,\n\ \ \"acc_stderr\": 0.015551673652172547,\n \"acc_norm\": 0.31620111731843575,\n\ \ \"acc_norm_stderr\": 0.015551673652172547\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.6078431372549019,\n 
\"acc_stderr\": 0.027956046165424513,\n\ \ \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.027956046165424513\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6463022508038585,\n\ \ \"acc_stderr\": 0.027155208103200865,\n \"acc_norm\": 0.6463022508038585,\n\ \ \"acc_norm_stderr\": 0.027155208103200865\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6450617283950617,\n \"acc_stderr\": 0.02662415247884585,\n\ \ \"acc_norm\": 0.6450617283950617,\n \"acc_norm_stderr\": 0.02662415247884585\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\"\ : 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\ : {\n \"acc\": 0.4621903520208605,\n \"acc_stderr\": 0.012733671880342507,\n\ \ \"acc_norm\": 0.4621903520208605,\n \"acc_norm_stderr\": 0.012733671880342507\n\ \ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\ : 0.5661764705882353,\n \"acc_stderr\": 0.030105636570016636,\n \"\ acc_norm\": 0.5661764705882353,\n \"acc_norm_stderr\": 0.030105636570016636\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.5947712418300654,\n \"acc_stderr\": 0.019861155193829153,\n \ \ \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.019861155193829153\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\ \ \"acc_stderr\": 0.04631381319425464,\n \"acc_norm\": 0.6272727272727273,\n\ \ \"acc_norm_stderr\": 0.04631381319425464\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.5959183673469388,\n \"acc_stderr\": 0.031414708025865885,\n\ \ \"acc_norm\": 0.5959183673469388,\n \"acc_norm_stderr\": 0.031414708025865885\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6069651741293532,\n\ \ \"acc_stderr\": 0.0345368246603156,\n \"acc_norm\": 0.6069651741293532,\n\ \ \"acc_norm_stderr\": 
0.0345368246603156\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \ \ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\ \ \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n\ \ \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.7543859649122807,\n \"acc_stderr\": 0.03301405946987249,\n\ \ \"acc_norm\": 0.7543859649122807,\n \"acc_norm_stderr\": 0.03301405946987249\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27050183598531213,\n\ \ \"mc1_stderr\": 0.015550778332842893,\n \"mc2\": 0.3997802061886997,\n\ \ \"mc2_stderr\": 0.014380874604105908\n }\n}\n```" repo_url: https://huggingface.co/wei123602/Llama-2-13b-FINETUNE4_TEST3 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|arc:challenge|25_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hellaswag|10_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-48-34.144397.parquet' - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-48-34.144397.parquet' - 
'**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-48-34.144397.parquet' - 
'**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-48-34.144397.parquet' - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-48-34.144397.parquet' - 
'**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-48-34.144397.parquet' - 
'**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-48-34.144397.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-48-34.144397.parquet' - config_name: 
harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 
2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - 
'**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T02_48_34.144397 
path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: 
- split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - 
'**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - 
split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - 
'**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-48-34.144397.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T02_48_34.144397 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T02-48-34.144397.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T02-48-34.144397.parquet' - config_name: results data_files: - split: 2023_10_04T02_48_34.144397 path: - results_2023-10-04T02-48-34.144397.parquet - split: latest path: - results_2023-10-04T02-48-34.144397.parquet --- # Dataset Card for Evaluation run of wei123602/Llama-2-13b-FINETUNE4_TEST3 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/wei123602/Llama-2-13b-FINETUNE4_TEST3 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [wei123602/Llama-2-13b-FINETUNE4_TEST3](https://huggingface.co/wei123602/Llama-2-13b-FINETUNE4_TEST3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_TEST3", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-10-04T02:48:34.144397](https://huggingface.co/datasets/open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_TEST3/blob/main/results_2023-10-04T02-48-34.144397.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5640249019461957, "acc_stderr": 0.034606850502455905, "acc_norm": 0.5684780100796328, "acc_norm_stderr": 0.034586314987852085, "mc1": 0.27050183598531213, "mc1_stderr": 0.015550778332842893, "mc2": 0.3997802061886997, "mc2_stderr": 0.014380874604105908 }, "harness|arc:challenge|25": { "acc": 0.5358361774744027, "acc_stderr": 0.01457381366473572, "acc_norm": 0.590443686006826, "acc_norm_stderr": 0.014370358632472432 }, "harness|hellaswag|10": { "acc": 0.6083449512049393, "acc_stderr": 0.004871226629346401, "acc_norm": 0.8164708225453097, "acc_norm_stderr": 0.0038630862999845836 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5185185185185185, "acc_stderr": 0.043163785995113245, "acc_norm": 0.5185185185185185, "acc_norm_stderr": 0.043163785995113245 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5986842105263158, "acc_stderr": 0.039889037033362836, "acc_norm": 0.5986842105263158, "acc_norm_stderr": 0.039889037033362836 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, "acc_norm": 0.57, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6452830188679245, "acc_stderr": 0.02944517532819959, "acc_norm": 0.6452830188679245, "acc_norm_stderr": 0.02944517532819959 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5694444444444444, "acc_stderr": 0.04140685639111503, "acc_norm": 0.5694444444444444, "acc_norm_stderr": 0.04140685639111503 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.4, "acc_stderr": 0.049236596391733084, "acc_norm": 0.4, "acc_norm_stderr": 0.049236596391733084 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5549132947976878, "acc_stderr": 0.03789401760283647, "acc_norm": 0.5549132947976878, "acc_norm_stderr": 0.03789401760283647 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3431372549019608, "acc_stderr": 0.047240073523838876, "acc_norm": 0.3431372549019608, "acc_norm_stderr": 0.047240073523838876 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.41702127659574467, "acc_stderr": 0.03223276266711712, "acc_norm": 0.41702127659574467, "acc_norm_stderr": 0.03223276266711712 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.37719298245614036, "acc_stderr": 0.04559522141958216, "acc_norm": 0.37719298245614036, "acc_norm_stderr": 0.04559522141958216 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.43448275862068964, "acc_stderr": 0.041307408795554966, "acc_norm": 0.43448275862068964, "acc_norm_stderr": 0.041307408795554966 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3544973544973545, "acc_stderr": 0.024636830602842, "acc_norm": 0.3544973544973545, "acc_norm_stderr": 0.024636830602842 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4126984126984127, "acc_stderr": 0.04403438954768176, "acc_norm": 0.4126984126984127, "acc_norm_stderr": 0.04403438954768176 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5774193548387097, "acc_stderr": 0.02810096472427264, "acc_norm": 0.5774193548387097, "acc_norm_stderr": 0.02810096472427264 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 
0.4630541871921182, "acc_stderr": 0.035083705204426656, "acc_norm": 0.4630541871921182, "acc_norm_stderr": 0.035083705204426656 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.62, "acc_stderr": 0.048783173121456316, "acc_norm": 0.62, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7454545454545455, "acc_stderr": 0.03401506715249039, "acc_norm": 0.7454545454545455, "acc_norm_stderr": 0.03401506715249039 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7121212121212122, "acc_stderr": 0.03225883512300992, "acc_norm": 0.7121212121212122, "acc_norm_stderr": 0.03225883512300992 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8238341968911918, "acc_stderr": 0.027493504244548057, "acc_norm": 0.8238341968911918, "acc_norm_stderr": 0.027493504244548057 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5794871794871795, "acc_stderr": 0.02502861027671086, "acc_norm": 0.5794871794871795, "acc_norm_stderr": 0.02502861027671086 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.31851851851851853, "acc_stderr": 0.02840653309060846, "acc_norm": 0.31851851851851853, "acc_norm_stderr": 0.02840653309060846 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6428571428571429, "acc_stderr": 0.031124619309328177, "acc_norm": 0.6428571428571429, "acc_norm_stderr": 0.031124619309328177 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.26490066225165565, "acc_stderr": 0.03603038545360383, "acc_norm": 0.26490066225165565, "acc_norm_stderr": 0.03603038545360383 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7724770642201835, "acc_stderr": 0.017974463578776502, "acc_norm": 0.7724770642201835, "acc_norm_stderr": 0.017974463578776502 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5046296296296297, "acc_stderr": 0.03409825519163572, "acc_norm": 0.5046296296296297, 
"acc_norm_stderr": 0.03409825519163572 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.75, "acc_stderr": 0.03039153369274154, "acc_norm": 0.75, "acc_norm_stderr": 0.03039153369274154 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7890295358649789, "acc_stderr": 0.02655837250266192, "acc_norm": 0.7890295358649789, "acc_norm_stderr": 0.02655837250266192 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6322869955156951, "acc_stderr": 0.03236198350928276, "acc_norm": 0.6322869955156951, "acc_norm_stderr": 0.03236198350928276 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6030534351145038, "acc_stderr": 0.04291135671009224, "acc_norm": 0.6030534351145038, "acc_norm_stderr": 0.04291135671009224 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6859504132231405, "acc_stderr": 0.042369647530410184, "acc_norm": 0.6859504132231405, "acc_norm_stderr": 0.042369647530410184 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7222222222222222, "acc_stderr": 0.043300437496507416, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.043300437496507416 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6441717791411042, "acc_stderr": 0.03761521380046735, "acc_norm": 0.6441717791411042, "acc_norm_stderr": 0.03761521380046735 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3125, "acc_stderr": 0.043994650575715215, "acc_norm": 0.3125, "acc_norm_stderr": 0.043994650575715215 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.041858325989283136, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.041858325989283136 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7991452991452992, "acc_stderr": 0.026246772946890477, "acc_norm": 0.7991452991452992, "acc_norm_stderr": 0.026246772946890477 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, 
"harness|hendrycksTest-miscellaneous|5": { "acc": 0.7713920817369093, "acc_stderr": 0.015016884698539883, "acc_norm": 0.7713920817369093, "acc_norm_stderr": 0.015016884698539883 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6271676300578035, "acc_stderr": 0.02603389061357628, "acc_norm": 0.6271676300578035, "acc_norm_stderr": 0.02603389061357628 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.31620111731843575, "acc_stderr": 0.015551673652172547, "acc_norm": 0.31620111731843575, "acc_norm_stderr": 0.015551673652172547 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6078431372549019, "acc_stderr": 0.027956046165424513, "acc_norm": 0.6078431372549019, "acc_norm_stderr": 0.027956046165424513 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6463022508038585, "acc_stderr": 0.027155208103200865, "acc_norm": 0.6463022508038585, "acc_norm_stderr": 0.027155208103200865 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6450617283950617, "acc_stderr": 0.02662415247884585, "acc_norm": 0.6450617283950617, "acc_norm_stderr": 0.02662415247884585 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5, "acc_stderr": 0.029827499313594685, "acc_norm": 0.5, "acc_norm_stderr": 0.029827499313594685 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4621903520208605, "acc_stderr": 0.012733671880342507, "acc_norm": 0.4621903520208605, "acc_norm_stderr": 0.012733671880342507 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5661764705882353, "acc_stderr": 0.030105636570016636, "acc_norm": 0.5661764705882353, "acc_norm_stderr": 0.030105636570016636 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5947712418300654, "acc_stderr": 0.019861155193829153, "acc_norm": 0.5947712418300654, "acc_norm_stderr": 0.019861155193829153 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6272727272727273, "acc_stderr": 0.04631381319425464, "acc_norm": 0.6272727272727273, "acc_norm_stderr": 0.04631381319425464 }, 
"harness|hendrycksTest-security_studies|5": { "acc": 0.5959183673469388, "acc_stderr": 0.031414708025865885, "acc_norm": 0.5959183673469388, "acc_norm_stderr": 0.031414708025865885 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6069651741293532, "acc_stderr": 0.0345368246603156, "acc_norm": 0.6069651741293532, "acc_norm_stderr": 0.0345368246603156 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.74, "acc_stderr": 0.04408440022768079, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768079 }, "harness|hendrycksTest-virology|5": { "acc": 0.41566265060240964, "acc_stderr": 0.038367221765980515, "acc_norm": 0.41566265060240964, "acc_norm_stderr": 0.038367221765980515 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7543859649122807, "acc_stderr": 0.03301405946987249, "acc_norm": 0.7543859649122807, "acc_norm_stderr": 0.03301405946987249 }, "harness|truthfulqa:mc|0": { "mc1": 0.27050183598531213, "mc1_stderr": 0.015550778332842893, "mc2": 0.3997802061886997, "mc2_stderr": 0.014380874604105908 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
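As a small illustrative sketch (not part of the generated card): the split names and parquet file names in the configuration above are both derived from the run timestamp shown under "Latest results". Split names replace every `-` and `:` in the timestamp with `_`, while the parquet file names keep the date dashes and only replace `:` with `-`.

```python
# Sketch of the naming convention used by the configs above (an observation
# from this card's metadata, not an official API).

run_timestamp = "2023-10-04T02:48:34.144397"  # as shown under "Latest results"

# Split names replace every "-" and ":" with "_":
split_name = run_timestamp.replace("-", "_").replace(":", "_")

# Parquet file names keep the date dashes but replace ":" with "-":
file_timestamp = run_timestamp.replace(":", "-")

print(split_name)      # 2023_10_04T02_48_34.144397
print(file_timestamp)  # 2023-10-04T02-48-34.144397
```

This is handy when addressing a specific run rather than the `latest` split of a configuration.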
weaviate/WithRetrieval-APISplit-Train-20
2023-10-04T02:49:55.000Z
[ "license:apache-2.0", "region:us" ]
weaviate
null
null
null
0
0
--- license: apache-2.0 ---
atom-in-the-universe/bild-17e1d5ae-d2bc-4bfb-854e-5ada76d1256b
2023-10-04T03:02:26.000Z
[ "region:us" ]
atom-in-the-universe
null
null
null
0
0
Entry not found
weaviate/WithRetrieval-APISplit-Test-20
2023-10-04T02:50:19.000Z
[ "license:apache-2.0", "region:us" ]
weaviate
null
null
null
0
0
--- license: apache-2.0 ---
weaviate/WithoutRetrieval-APISplit-Train-80
2023-10-04T02:53:55.000Z
[ "license:apache-2.0", "region:us" ]
weaviate
null
null
null
0
0
--- license: apache-2.0 ---
weaviate/WithoutRetrieval-APISplit-Test-80
2023-10-04T02:54:54.000Z
[ "license:apache-2.0", "region:us" ]
weaviate
null
null
null
0
0
--- license: apache-2.0 ---
weaviate/WithoutRetrieval-APISplit-Train-40
2023-10-04T02:55:31.000Z
[ "license:apache-2.0", "region:us" ]
weaviate
null
null
null
0
0
--- license: apache-2.0 ---
weaviate/WithoutRetrieval-APISplit-Test-40
2023-10-04T02:56:20.000Z
[ "license:apache-2.0", "region:us" ]
weaviate
null
null
null
0
0
--- license: apache-2.0 ---
weaviate/WithoutRetrieval-APISplit-Train-20
2023-10-04T02:56:53.000Z
[ "license:apache-2.0", "region:us" ]
weaviate
null
null
null
0
0
--- license: apache-2.0 ---
weaviate/WithoutRetrieval-APISplit-Test-20
2023-10-04T02:58:15.000Z
[ "license:apache-2.0", "region:us" ]
weaviate
null
null
null
0
0
--- license: apache-2.0 ---
atom-in-the-universe/bild-98e0fa36-fdd8-4cb2-af39-46b88efc8739
2023-10-04T03:12:46.000Z
[ "region:us" ]
atom-in-the-universe
null
null
null
0
0
Entry not found
open-llm-leaderboard/details_AtAndDev__ShortKing-3b-v0.3
2023-10-04T03:05:21.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of AtAndDev/ShortKing-3b-v0.3 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [AtAndDev/ShortKing-3b-v0.3](https://huggingface.co/AtAndDev/ShortKing-3b-v0.3)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AtAndDev__ShortKing-3b-v0.3\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T03:04:04.830920](https://huggingface.co/datasets/open-llm-leaderboard/details_AtAndDev__ShortKing-3b-v0.3/blob/main/results_2023-10-04T03-04-04.830920.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2685210584537041,\n\ \ \"acc_stderr\": 0.032034452923424835,\n \"acc_norm\": 0.2721231535796682,\n\ \ \"acc_norm_stderr\": 0.032030595545089316,\n \"mc1\": 0.23745410036719705,\n\ \ \"mc1_stderr\": 0.014896277441041843,\n \"mc2\": 0.3877758221836436,\n\ \ \"mc2_stderr\": 0.01369001571044045\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.37627986348122866,\n \"acc_stderr\": 0.014157022555407175,\n\ \ \"acc_norm\": 0.40955631399317405,\n \"acc_norm_stderr\": 0.014370358632472442\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5279824736108345,\n\ \ \"acc_stderr\": 0.004981961097590808,\n \"acc_norm\": 0.7072296355307708,\n\ \ \"acc_norm_stderr\": 0.004541039698729831\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \ \ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3037037037037037,\n\ \ \"acc_stderr\": 0.03972552884785137,\n \"acc_norm\": 0.3037037037037037,\n\ \ \"acc_norm_stderr\": 0.03972552884785137\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.23026315789473684,\n \"acc_stderr\": 0.03426059424403165,\n\ \ \"acc_norm\": 0.23026315789473684,\n \"acc_norm_stderr\": 0.03426059424403165\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.38,\n\ \ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \ \ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.2641509433962264,\n \"acc_stderr\": 0.02713429162874172,\n\ \ \"acc_norm\": 0.2641509433962264,\n \"acc_norm_stderr\": 0.02713429162874172\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2152777777777778,\n\ \ \"acc_stderr\": 0.03437079344106136,\n \"acc_norm\": 0.2152777777777778,\n\ \ \"acc_norm_stderr\": 
0.03437079344106136\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165044,\n \ \ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165044\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\"\ : 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \ \ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n\ \ \"acc_stderr\": 0.03063114553919882,\n \"acc_norm\": 0.2023121387283237,\n\ \ \"acc_norm_stderr\": 0.03063114553919882\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.042801058373643966,\n\ \ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.042801058373643966\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\ \ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.3191489361702128,\n \"acc_stderr\": 0.03047297336338004,\n\ \ \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.03047297336338004\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\ \ \"acc_stderr\": 0.03999423879281334,\n \"acc_norm\": 0.23684210526315788,\n\ \ \"acc_norm_stderr\": 0.03999423879281334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.15862068965517243,\n \"acc_stderr\": 0.03044350031758399,\n\ \ \"acc_norm\": 0.15862068965517243,\n \"acc_norm_stderr\": 0.03044350031758399\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643895,\n \"\ acc_norm\": 
0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643895\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n\ \ \"acc_stderr\": 0.037184890068181146,\n \"acc_norm\": 0.2222222222222222,\n\ \ \"acc_norm_stderr\": 0.037184890068181146\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24193548387096775,\n\ \ \"acc_stderr\": 0.024362599693031086,\n \"acc_norm\": 0.24193548387096775,\n\ \ \"acc_norm_stderr\": 0.024362599693031086\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.24630541871921183,\n \"acc_stderr\": 0.030315099285617732,\n\ \ \"acc_norm\": 0.24630541871921183,\n \"acc_norm_stderr\": 0.030315099285617732\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\ : 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.0328766675860349,\n\ \ \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.0328766675860349\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.20707070707070707,\n \"acc_stderr\": 0.02886977846026705,\n \"\ acc_norm\": 0.20707070707070707,\n \"acc_norm_stderr\": 0.02886977846026705\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.19170984455958548,\n \"acc_stderr\": 0.028408953626245282,\n\ \ \"acc_norm\": 0.19170984455958548,\n \"acc_norm_stderr\": 0.028408953626245282\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.2564102564102564,\n \"acc_stderr\": 0.022139081103971538,\n\ \ \"acc_norm\": 0.2564102564102564,\n \"acc_norm_stderr\": 0.022139081103971538\n\ \ 
},\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712163,\n \ \ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712163\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.3025210084033613,\n \"acc_stderr\": 0.029837962388291936,\n\ \ \"acc_norm\": 0.3025210084033613,\n \"acc_norm_stderr\": 0.029837962388291936\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.25165562913907286,\n \"acc_stderr\": 0.035433042343899844,\n \"\ acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.035433042343899844\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.23302752293577983,\n \"acc_stderr\": 0.018125669180861507,\n \"\ acc_norm\": 0.23302752293577983,\n \"acc_norm_stderr\": 0.018125669180861507\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.18981481481481483,\n \"acc_stderr\": 0.026744714834691943,\n \"\ acc_norm\": 0.18981481481481483,\n \"acc_norm_stderr\": 0.026744714834691943\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.27450980392156865,\n \"acc_stderr\": 0.03132179803083292,\n \"\ acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.03132179803083292\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.3037974683544304,\n \"acc_stderr\": 0.029936696387138615,\n \ \ \"acc_norm\": 0.3037974683544304,\n \"acc_norm_stderr\": 0.029936696387138615\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.40358744394618834,\n\ \ \"acc_stderr\": 0.032928028193303135,\n \"acc_norm\": 0.40358744394618834,\n\ \ \"acc_norm_stderr\": 0.032928028193303135\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.21374045801526717,\n \"acc_stderr\": 0.035954616117746904,\n\ \ \"acc_norm\": 0.21374045801526717,\n \"acc_norm_stderr\": 0.035954616117746904\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.2975206611570248,\n \"acc_stderr\": 0.04173349148083498,\n \"\ acc_norm\": 0.2975206611570248,\n \"acc_norm_stderr\": 0.04173349148083498\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\ \ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\ \ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.0332201579577674,\n\ \ \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.0332201579577674\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\ \ \"acc_stderr\": 0.04287858751340456,\n \"acc_norm\": 0.2857142857142857,\n\ \ \"acc_norm_stderr\": 0.04287858751340456\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.2912621359223301,\n \"acc_stderr\": 0.044986763205729224,\n\ \ \"acc_norm\": 0.2912621359223301,\n \"acc_norm_stderr\": 0.044986763205729224\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.27350427350427353,\n\ \ \"acc_stderr\": 0.02920254015343118,\n \"acc_norm\": 0.27350427350427353,\n\ \ \"acc_norm_stderr\": 0.02920254015343118\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \ \ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.280970625798212,\n\ \ \"acc_stderr\": 0.016073127851221235,\n \"acc_norm\": 0.280970625798212,\n\ \ \"acc_norm_stderr\": 0.016073127851221235\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.26011560693641617,\n \"acc_stderr\": 0.02361867831006937,\n\ \ \"acc_norm\": 0.26011560693641617,\n \"acc_norm_stderr\": 0.02361867831006937\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2335195530726257,\n\ \ \"acc_stderr\": 0.014149575348976259,\n \"acc_norm\": 
0.2335195530726257,\n\ \ \"acc_norm_stderr\": 0.014149575348976259\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.02609016250427904,\n\ \ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.02609016250427904\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2861736334405145,\n\ \ \"acc_stderr\": 0.025670259242188943,\n \"acc_norm\": 0.2861736334405145,\n\ \ \"acc_norm_stderr\": 0.025670259242188943\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.025407197798890162,\n\ \ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.025407197798890162\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.2801418439716312,\n \"acc_stderr\": 0.026789172351140242,\n \ \ \"acc_norm\": 0.2801418439716312,\n \"acc_norm_stderr\": 0.026789172351140242\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23859191655801826,\n\ \ \"acc_stderr\": 0.0108859297420022,\n \"acc_norm\": 0.23859191655801826,\n\ \ \"acc_norm_stderr\": 0.0108859297420022\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.1948529411764706,\n \"acc_stderr\": 0.024060599423487417,\n\ \ \"acc_norm\": 0.1948529411764706,\n \"acc_norm_stderr\": 0.024060599423487417\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.2549019607843137,\n \"acc_stderr\": 0.017630827375148383,\n \ \ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.017630827375148383\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2818181818181818,\n\ \ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.2818181818181818,\n\ \ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.20408163265306123,\n \"acc_stderr\": 0.025801283475090503,\n\ \ \"acc_norm\": 0.20408163265306123,\n \"acc_norm_stderr\": 0.025801283475090503\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22885572139303484,\n\ \ \"acc_stderr\": 0.02970528405677243,\n \"acc_norm\": 0.22885572139303484,\n\ \ \"acc_norm_stderr\": 0.02970528405677243\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3132530120481928,\n\ \ \"acc_stderr\": 0.036108050180310235,\n \"acc_norm\": 0.3132530120481928,\n\ \ \"acc_norm_stderr\": 0.036108050180310235\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.0356507967070831,\n\ \ \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.0356507967070831\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23745410036719705,\n\ \ \"mc1_stderr\": 0.014896277441041843,\n \"mc2\": 0.3877758221836436,\n\ \ \"mc2_stderr\": 0.01369001571044045\n }\n}\n```" repo_url: https://huggingface.co/AtAndDev/ShortKing-3b-v0.3 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|arc:challenge|25_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hellaswag|10_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T03-04-04.830920.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T03-04-04.830920.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T03-04-04.830920.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T03-04-04.830920.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T03-04-04.830920.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T03-04-04.830920.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T03-04-04.830920.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T03-04-04.830920.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T03-04-04.830920.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T03_04_04.830920 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T03-04-04.830920.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T03-04-04.830920.parquet' - config_name: results data_files: - split: 2023_10_04T03_04_04.830920 path: - results_2023-10-04T03-04-04.830920.parquet - split: latest path: - results_2023-10-04T03-04-04.830920.parquet --- # Dataset Card for Evaluation run of AtAndDev/ShortKing-3b-v0.3 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/AtAndDev/ShortKing-3b-v0.3 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[AtAndDev/ShortKing-3b-v0.3](https://huggingface.co/AtAndDev/ShortKing-3b-v0.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_AtAndDev__ShortKing-3b-v0.3", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-10-04T03:04:04.830920](https://huggingface.co/datasets/open-llm-leaderboard/details_AtAndDev__ShortKing-3b-v0.3/blob/main/results_2023-10-04T03-04-04.830920.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.2685210584537041, "acc_stderr": 0.032034452923424835, "acc_norm": 0.2721231535796682, "acc_norm_stderr": 0.032030595545089316, "mc1": 0.23745410036719705, "mc1_stderr": 0.014896277441041843, "mc2": 0.3877758221836436, "mc2_stderr": 0.01369001571044045 }, "harness|arc:challenge|25": { "acc": 0.37627986348122866, "acc_stderr": 0.014157022555407175, "acc_norm": 0.40955631399317405, "acc_norm_stderr": 0.014370358632472442 }, "harness|hellaswag|10": { "acc": 0.5279824736108345, "acc_stderr": 0.004981961097590808, "acc_norm": 0.7072296355307708, "acc_norm_stderr": 0.004541039698729831 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.26, "acc_stderr": 0.04408440022768081, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768081 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.3037037037037037, "acc_stderr": 0.03972552884785137, "acc_norm": 0.3037037037037037, "acc_norm_stderr": 0.03972552884785137 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.23026315789473684, "acc_stderr": 0.03426059424403165, "acc_norm": 0.23026315789473684, "acc_norm_stderr": 0.03426059424403165 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.2641509433962264, "acc_stderr": 0.02713429162874172, "acc_norm": 0.2641509433962264, "acc_norm_stderr": 0.02713429162874172 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2152777777777778, "acc_stderr": 0.03437079344106136, "acc_norm": 0.2152777777777778, "acc_norm_stderr": 0.03437079344106136 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.23, "acc_stderr": 0.042295258468165044, "acc_norm": 0.23, "acc_norm_stderr": 0.042295258468165044 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.33, "acc_stderr": 0.04725815626252605, "acc_norm": 0.33, 
"acc_norm_stderr": 0.04725815626252605 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.23, "acc_stderr": 0.04229525846816506, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816506 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.2023121387283237, "acc_stderr": 0.03063114553919882, "acc_norm": 0.2023121387283237, "acc_norm_stderr": 0.03063114553919882 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.24509803921568626, "acc_stderr": 0.042801058373643966, "acc_norm": 0.24509803921568626, "acc_norm_stderr": 0.042801058373643966 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.3191489361702128, "acc_stderr": 0.03047297336338004, "acc_norm": 0.3191489361702128, "acc_norm_stderr": 0.03047297336338004 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.23684210526315788, "acc_stderr": 0.03999423879281334, "acc_norm": 0.23684210526315788, "acc_norm_stderr": 0.03999423879281334 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.15862068965517243, "acc_stderr": 0.03044350031758399, "acc_norm": 0.15862068965517243, "acc_norm_stderr": 0.03044350031758399 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.25132275132275134, "acc_stderr": 0.022340482339643895, "acc_norm": 0.25132275132275134, "acc_norm_stderr": 0.022340482339643895 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2222222222222222, "acc_stderr": 0.037184890068181146, "acc_norm": 0.2222222222222222, "acc_norm_stderr": 0.037184890068181146 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.24193548387096775, "acc_stderr": 0.024362599693031086, "acc_norm": 0.24193548387096775, "acc_norm_stderr": 0.024362599693031086 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.24630541871921183, "acc_stderr": 0.030315099285617732, "acc_norm": 0.24630541871921183, "acc_norm_stderr": 0.030315099285617732 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.27, "acc_stderr": 0.044619604333847394, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.23030303030303031, "acc_stderr": 0.0328766675860349, "acc_norm": 0.23030303030303031, "acc_norm_stderr": 0.0328766675860349 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.20707070707070707, "acc_stderr": 0.02886977846026705, "acc_norm": 0.20707070707070707, "acc_norm_stderr": 0.02886977846026705 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.19170984455958548, "acc_stderr": 0.028408953626245282, "acc_norm": 0.19170984455958548, "acc_norm_stderr": 0.028408953626245282 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.2564102564102564, "acc_stderr": 0.022139081103971538, "acc_norm": 0.2564102564102564, "acc_norm_stderr": 0.022139081103971538 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.25925925925925924, "acc_stderr": 0.026719240783712163, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.026719240783712163 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.3025210084033613, "acc_stderr": 0.029837962388291936, "acc_norm": 0.3025210084033613, "acc_norm_stderr": 0.029837962388291936 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.25165562913907286, "acc_stderr": 0.035433042343899844, "acc_norm": 0.25165562913907286, "acc_norm_stderr": 0.035433042343899844 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.23302752293577983, "acc_stderr": 0.018125669180861507, "acc_norm": 0.23302752293577983, "acc_norm_stderr": 0.018125669180861507 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.18981481481481483, 
"acc_stderr": 0.026744714834691943, "acc_norm": 0.18981481481481483, "acc_norm_stderr": 0.026744714834691943 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.27450980392156865, "acc_stderr": 0.03132179803083292, "acc_norm": 0.27450980392156865, "acc_norm_stderr": 0.03132179803083292 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.3037974683544304, "acc_stderr": 0.029936696387138615, "acc_norm": 0.3037974683544304, "acc_norm_stderr": 0.029936696387138615 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.40358744394618834, "acc_stderr": 0.032928028193303135, "acc_norm": 0.40358744394618834, "acc_norm_stderr": 0.032928028193303135 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.21374045801526717, "acc_stderr": 0.035954616117746904, "acc_norm": 0.21374045801526717, "acc_norm_stderr": 0.035954616117746904 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2975206611570248, "acc_stderr": 0.04173349148083498, "acc_norm": 0.2975206611570248, "acc_norm_stderr": 0.04173349148083498 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.25925925925925924, "acc_stderr": 0.042365112580946336, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.042365112580946336 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.2331288343558282, "acc_stderr": 0.0332201579577674, "acc_norm": 0.2331288343558282, "acc_norm_stderr": 0.0332201579577674 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.2857142857142857, "acc_stderr": 0.04287858751340456, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.04287858751340456 }, "harness|hendrycksTest-management|5": { "acc": 0.2912621359223301, "acc_stderr": 0.044986763205729224, "acc_norm": 0.2912621359223301, "acc_norm_stderr": 0.044986763205729224 }, "harness|hendrycksTest-marketing|5": { "acc": 0.27350427350427353, "acc_stderr": 0.02920254015343118, "acc_norm": 0.27350427350427353, "acc_norm_stderr": 0.02920254015343118 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 
0.26, "acc_stderr": 0.04408440022768078, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.280970625798212, "acc_stderr": 0.016073127851221235, "acc_norm": 0.280970625798212, "acc_norm_stderr": 0.016073127851221235 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.26011560693641617, "acc_stderr": 0.02361867831006937, "acc_norm": 0.26011560693641617, "acc_norm_stderr": 0.02361867831006937 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2335195530726257, "acc_stderr": 0.014149575348976259, "acc_norm": 0.2335195530726257, "acc_norm_stderr": 0.014149575348976259 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.29411764705882354, "acc_stderr": 0.02609016250427904, "acc_norm": 0.29411764705882354, "acc_norm_stderr": 0.02609016250427904 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.2861736334405145, "acc_stderr": 0.025670259242188943, "acc_norm": 0.2861736334405145, "acc_norm_stderr": 0.025670259242188943 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.2962962962962963, "acc_stderr": 0.025407197798890162, "acc_norm": 0.2962962962962963, "acc_norm_stderr": 0.025407197798890162 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.2801418439716312, "acc_stderr": 0.026789172351140242, "acc_norm": 0.2801418439716312, "acc_norm_stderr": 0.026789172351140242 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.23859191655801826, "acc_stderr": 0.0108859297420022, "acc_norm": 0.23859191655801826, "acc_norm_stderr": 0.0108859297420022 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.1948529411764706, "acc_stderr": 0.024060599423487417, "acc_norm": 0.1948529411764706, "acc_norm_stderr": 0.024060599423487417 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.2549019607843137, "acc_stderr": 0.017630827375148383, "acc_norm": 0.2549019607843137, "acc_norm_stderr": 0.017630827375148383 }, "harness|hendrycksTest-public_relations|5": { "acc": 
0.2818181818181818, "acc_stderr": 0.043091187099464585, "acc_norm": 0.2818181818181818, "acc_norm_stderr": 0.043091187099464585 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.20408163265306123, "acc_stderr": 0.025801283475090503, "acc_norm": 0.20408163265306123, "acc_norm_stderr": 0.025801283475090503 }, "harness|hendrycksTest-sociology|5": { "acc": 0.22885572139303484, "acc_stderr": 0.02970528405677243, "acc_norm": 0.22885572139303484, "acc_norm_stderr": 0.02970528405677243 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-virology|5": { "acc": 0.3132530120481928, "acc_stderr": 0.036108050180310235, "acc_norm": 0.3132530120481928, "acc_norm_stderr": 0.036108050180310235 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.3157894736842105, "acc_stderr": 0.0356507967070831, "acc_norm": 0.3157894736842105, "acc_norm_stderr": 0.0356507967070831 }, "harness|truthfulqa:mc|0": { "mc1": 0.23745410036719705, "mc1_stderr": 0.014896277441041843, "mc2": 0.3877758221836436, "mc2_stderr": 0.01369001571044045 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
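The per-task results JSON shown above can be aggregated in the same way the leaderboard forms its MMLU score: average `acc_norm` over all `hendrycksTest-*` entries. A minimal sketch, using a small hand-copied excerpt of the results block above (only three tasks, for illustration):

```python
import statistics

# A small excerpt of the per-task results JSON shown above;
# the acc_norm values are copied verbatim from the "Latest results" block.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.26},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.3037037037037037},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.23026315789473684},
}

# Average the normalized accuracy over all hendrycksTest (MMLU) tasks.
mmlu_scores = [
    v["acc_norm"]
    for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = statistics.mean(mmlu_scores)
print(f"MMLU average over {len(mmlu_scores)} tasks: {mmlu_avg:.4f}")
```

The same filter-and-average pattern applies to the full 57-task dict when loaded from the `results` config of this dataset.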
sibozhu/paddington_en
2023-10-04T03:08:51.000Z
[ "region:us" ]
sibozhu
null
null
null
0
0
Entry not found
atom-in-the-universe/bild-730cb766-52f4-41a1-b68a-eb07282c4531
2023-10-04T03:28:04.000Z
[ "region:us" ]
atom-in-the-universe
null
null
null
0
0
Entry not found
open-llm-leaderboard/details_NewstaR__Koss-7B-chat
2023-10-04T03:21:05.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of NewstaR/Koss-7B-chat dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [NewstaR/Koss-7B-chat](https://huggingface.co/NewstaR/Koss-7B-chat) on the [Open\ \ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NewstaR__Koss-7B-chat\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T03:19:48.694479](https://huggingface.co/datasets/open-llm-leaderboard/details_NewstaR__Koss-7B-chat/blob/main/results_2023-10-04T03-19-48.694479.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.469938383101792,\n\ \ \"acc_stderr\": 0.03510383032379136,\n \"acc_norm\": 0.4737661486799197,\n\ \ \"acc_norm_stderr\": 0.03508937393226306,\n \"mc1\": 0.2913096695226438,\n\ \ \"mc1_stderr\": 0.015905987048184828,\n \"mc2\": 0.4396876190519236,\n\ \ \"mc2_stderr\": 0.015652499203021628\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5008532423208191,\n \"acc_stderr\": 0.014611369529813272,\n\ \ \"acc_norm\": 0.5366894197952219,\n \"acc_norm_stderr\": 0.014572000527756993\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5978888667596096,\n\ \ \"acc_stderr\": 0.004893220635011792,\n \"acc_norm\": 0.787890858394742,\n\ \ \"acc_norm_stderr\": 0.0040796625368983075\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\ \ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4148148148148148,\n\ \ \"acc_stderr\": 0.042561937679014075,\n \"acc_norm\": 0.4148148148148148,\n\ \ \"acc_norm_stderr\": 0.042561937679014075\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.4934210526315789,\n \"acc_stderr\": 0.040685900502249704,\n\ \ \"acc_norm\": 0.4934210526315789,\n \"acc_norm_stderr\": 0.040685900502249704\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.43,\n\ \ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \ \ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.5169811320754717,\n \"acc_stderr\": 0.030755120364119905,\n\ \ \"acc_norm\": 0.5169811320754717,\n \"acc_norm_stderr\": 0.030755120364119905\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5416666666666666,\n\ \ \"acc_stderr\": 0.041666666666666644,\n \"acc_norm\": 0.5416666666666666,\n\ \ \"acc_norm_stderr\": 0.041666666666666644\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \ \ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n\ \ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952344,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952344\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4046242774566474,\n\ \ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.4046242774566474,\n\ \ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364396,\n\ \ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364396\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\ \ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.39148936170212767,\n \"acc_stderr\": 0.03190701242326812,\n\ \ \"acc_norm\": 0.39148936170212767,\n \"acc_norm_stderr\": 0.03190701242326812\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\ \ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\ \ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.04122737111370332,\n\ \ \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.04122737111370332\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.28835978835978837,\n \"acc_stderr\": 0.0233306540545359,\n \"\ acc_norm\": 0.28835978835978837,\n 
\"acc_norm_stderr\": 0.0233306540545359\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\ \ \"acc_stderr\": 0.038522733649243156,\n \"acc_norm\": 0.24603174603174602,\n\ \ \"acc_norm_stderr\": 0.038522733649243156\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.5258064516129032,\n \"acc_stderr\": 0.028406095057653315,\n \"\ acc_norm\": 0.5258064516129032,\n \"acc_norm_stderr\": 0.028406095057653315\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.3497536945812808,\n \"acc_stderr\": 0.03355400904969565,\n \"\ acc_norm\": 0.3497536945812808,\n \"acc_norm_stderr\": 0.03355400904969565\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n\ \ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.5454545454545454,\n \"acc_stderr\": 0.038881769216741004,\n\ \ \"acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.038881769216741004\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.5959595959595959,\n \"acc_stderr\": 0.03496130972056129,\n \"\ acc_norm\": 0.5959595959595959,\n \"acc_norm_stderr\": 0.03496130972056129\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.6683937823834197,\n \"acc_stderr\": 0.03397636541089118,\n\ \ \"acc_norm\": 0.6683937823834197,\n \"acc_norm_stderr\": 0.03397636541089118\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.4128205128205128,\n \"acc_stderr\": 0.024962683564331796,\n\ \ \"acc_norm\": 0.4128205128205128,\n \"acc_norm_stderr\": 0.024962683564331796\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n 
\"\ acc\": 0.25555555555555554,\n \"acc_stderr\": 0.02659393910184407,\n \ \ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.02659393910184407\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.032145368597886394,\n\ \ \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.032145368597886394\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\ acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.6495412844036698,\n \"acc_stderr\": 0.02045607759982446,\n \"\ acc_norm\": 0.6495412844036698,\n \"acc_norm_stderr\": 0.02045607759982446\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.3287037037037037,\n \"acc_stderr\": 0.032036140846700596,\n \"\ acc_norm\": 0.3287037037037037,\n \"acc_norm_stderr\": 0.032036140846700596\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.6666666666666666,\n \"acc_stderr\": 0.03308611113236434,\n \"\ acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03308611113236434\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.6582278481012658,\n \"acc_stderr\": 0.03087453753755362,\n \ \ \"acc_norm\": 0.6582278481012658,\n \"acc_norm_stderr\": 0.03087453753755362\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5695067264573991,\n\ \ \"acc_stderr\": 0.033231973029429394,\n \"acc_norm\": 0.5695067264573991,\n\ \ \"acc_norm_stderr\": 0.033231973029429394\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.4961832061068702,\n \"acc_stderr\": 0.043851623256015534,\n\ \ \"acc_norm\": 0.4961832061068702,\n \"acc_norm_stderr\": 0.043851623256015534\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.5619834710743802,\n 
\"acc_stderr\": 0.04529146804435792,\n \"\ acc_norm\": 0.5619834710743802,\n \"acc_norm_stderr\": 0.04529146804435792\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5833333333333334,\n\ \ \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.5833333333333334,\n\ \ \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.5398773006134969,\n \"acc_stderr\": 0.03915857291436971,\n\ \ \"acc_norm\": 0.5398773006134969,\n \"acc_norm_stderr\": 0.03915857291436971\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\ \ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\ \ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.0465614711001235,\n\ \ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.0465614711001235\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7393162393162394,\n\ \ \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.7393162393162394,\n\ \ \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \ \ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6730523627075351,\n\ \ \"acc_stderr\": 0.016774908180131467,\n \"acc_norm\": 0.6730523627075351,\n\ \ \"acc_norm_stderr\": 0.016774908180131467\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.5202312138728323,\n \"acc_stderr\": 0.026897049996382875,\n\ \ \"acc_norm\": 0.5202312138728323,\n \"acc_norm_stderr\": 0.026897049996382875\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.21564245810055865,\n\ \ \"acc_stderr\": 0.013754835975482351,\n \"acc_norm\": 0.21564245810055865,\n\ \ \"acc_norm_stderr\": 0.013754835975482351\n },\n 
\"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.5196078431372549,\n \"acc_stderr\": 0.028607893699576066,\n\ \ \"acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.028607893699576066\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5691318327974276,\n\ \ \"acc_stderr\": 0.028125340983972714,\n \"acc_norm\": 0.5691318327974276,\n\ \ \"acc_norm_stderr\": 0.028125340983972714\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.027586006221607697,\n\ \ \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.027586006221607697\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.3546099290780142,\n \"acc_stderr\": 0.028538650028878638,\n \ \ \"acc_norm\": 0.3546099290780142,\n \"acc_norm_stderr\": 0.028538650028878638\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.333116036505867,\n\ \ \"acc_stderr\": 0.012037930451512052,\n \"acc_norm\": 0.333116036505867,\n\ \ \"acc_norm_stderr\": 0.012037930451512052\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.35661764705882354,\n \"acc_stderr\": 0.029097209568411945,\n\ \ \"acc_norm\": 0.35661764705882354,\n \"acc_norm_stderr\": 0.029097209568411945\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.4722222222222222,\n \"acc_stderr\": 0.02019659493354119,\n \ \ \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.02019659493354119\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5181818181818182,\n\ \ \"acc_stderr\": 0.04785964010794915,\n \"acc_norm\": 0.5181818181818182,\n\ \ \"acc_norm_stderr\": 0.04785964010794915\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.4775510204081633,\n \"acc_stderr\": 0.03197694118713672,\n\ \ \"acc_norm\": 0.4775510204081633,\n \"acc_norm_stderr\": 0.03197694118713672\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5771144278606966,\n\ \ 
\"acc_stderr\": 0.034932317774212816,\n \"acc_norm\": 0.5771144278606966,\n\ \ \"acc_norm_stderr\": 0.034932317774212816\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \ \ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\ \ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\ \ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.7017543859649122,\n \"acc_stderr\": 0.035087719298245626,\n\ \ \"acc_norm\": 0.7017543859649122,\n \"acc_norm_stderr\": 0.035087719298245626\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2913096695226438,\n\ \ \"mc1_stderr\": 0.015905987048184828,\n \"mc2\": 0.4396876190519236,\n\ \ \"mc2_stderr\": 0.015652499203021628\n }\n}\n```" repo_url: https://huggingface.co/NewstaR/Koss-7B-chat leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|arc:challenge|25_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hellaswag|10_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T03-19-48.694479.parquet' - 
'**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T03-19-48.694479.parquet' - 
'**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T03-19-48.694479.parquet' - 
'**/details_harness|hendrycksTest-philosophy|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T03-19-48.694479.parquet' - 
'**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T03-19-48.694479.parquet' - 
'**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T03-19-48.694479.parquet' - 
'**/details_harness|hendrycksTest-public_relations|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T03-19-48.694479.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T03-19-48.694479.parquet' - 
config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T03-19-48.694479.parquet' - config_name: 
harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - 
split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T03-19-48.694479.parquet' - 
config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T03-19-48.694479.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T03_19_48.694479 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T03-19-48.694479.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T03-19-48.694479.parquet' - config_name: results data_files: - split: 2023_10_04T03_19_48.694479 path: - results_2023-10-04T03-19-48.694479.parquet - split: latest path: - results_2023-10-04T03-19-48.694479.parquet --- # Dataset Card for Evaluation run of NewstaR/Koss-7B-chat ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/NewstaR/Koss-7B-chat - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [NewstaR/Koss-7B-chat](https://huggingface.co/NewstaR/Koss-7B-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). 
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_NewstaR__Koss-7B-chat", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-10-04T03:19:48.694479](https://huggingface.co/datasets/open-llm-leaderboard/details_NewstaR__Koss-7B-chat/blob/main/results_2023-10-04T03-19-48.694479.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.469938383101792, "acc_stderr": 0.03510383032379136, "acc_norm": 0.4737661486799197, "acc_norm_stderr": 0.03508937393226306, "mc1": 0.2913096695226438, "mc1_stderr": 0.015905987048184828, "mc2": 0.4396876190519236, "mc2_stderr": 0.015652499203021628 }, "harness|arc:challenge|25": { "acc": 0.5008532423208191, "acc_stderr": 0.014611369529813272, "acc_norm": 0.5366894197952219, "acc_norm_stderr": 0.014572000527756993 }, "harness|hellaswag|10": { "acc": 0.5978888667596096, "acc_stderr": 0.004893220635011792, "acc_norm": 0.787890858394742, "acc_norm_stderr": 0.0040796625368983075 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.3, "acc_stderr": 0.04605661864718381, "acc_norm": 0.3, "acc_norm_stderr": 0.04605661864718381 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4148148148148148, "acc_stderr": 0.042561937679014075, "acc_norm": 0.4148148148148148, "acc_norm_stderr": 0.042561937679014075 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.4934210526315789, "acc_stderr": 0.040685900502249704, "acc_norm": 0.4934210526315789, "acc_norm_stderr": 0.040685900502249704 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.43, "acc_stderr": 0.049756985195624284, "acc_norm": 0.43, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5169811320754717, "acc_stderr": 0.030755120364119905, "acc_norm": 0.5169811320754717, "acc_norm_stderr": 0.030755120364119905 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5416666666666666, "acc_stderr": 0.041666666666666644, "acc_norm": 0.5416666666666666, "acc_norm_stderr": 0.041666666666666644 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, 
"acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.34, "acc_stderr": 0.047609522856952344, "acc_norm": 0.34, "acc_norm_stderr": 0.047609522856952344 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.4046242774566474, "acc_stderr": 0.03742461193887248, "acc_norm": 0.4046242774566474, "acc_norm_stderr": 0.03742461193887248 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.24509803921568626, "acc_stderr": 0.04280105837364396, "acc_norm": 0.24509803921568626, "acc_norm_stderr": 0.04280105837364396 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.39148936170212767, "acc_stderr": 0.03190701242326812, "acc_norm": 0.39148936170212767, "acc_norm_stderr": 0.03190701242326812 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.044346007015849245, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.044346007015849245 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.42758620689655175, "acc_stderr": 0.04122737111370332, "acc_norm": 0.42758620689655175, "acc_norm_stderr": 0.04122737111370332 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.28835978835978837, "acc_stderr": 0.0233306540545359, "acc_norm": 0.28835978835978837, "acc_norm_stderr": 0.0233306540545359 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.24603174603174602, "acc_stderr": 0.038522733649243156, "acc_norm": 0.24603174603174602, "acc_norm_stderr": 0.038522733649243156 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5258064516129032, "acc_stderr": 0.028406095057653315, "acc_norm": 0.5258064516129032, "acc_norm_stderr": 0.028406095057653315 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3497536945812808, "acc_stderr": 0.03355400904969565, "acc_norm": 0.3497536945812808, "acc_norm_stderr": 0.03355400904969565 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.5454545454545454, "acc_stderr": 0.038881769216741004, "acc_norm": 0.5454545454545454, "acc_norm_stderr": 0.038881769216741004 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.5959595959595959, "acc_stderr": 0.03496130972056129, "acc_norm": 0.5959595959595959, "acc_norm_stderr": 0.03496130972056129 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.6683937823834197, "acc_stderr": 0.03397636541089118, "acc_norm": 0.6683937823834197, "acc_norm_stderr": 0.03397636541089118 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.4128205128205128, "acc_stderr": 0.024962683564331796, "acc_norm": 0.4128205128205128, "acc_norm_stderr": 0.024962683564331796 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.25555555555555554, "acc_stderr": 0.02659393910184407, "acc_norm": 0.25555555555555554, "acc_norm_stderr": 0.02659393910184407 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.42857142857142855, "acc_stderr": 0.032145368597886394, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.032145368597886394 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2980132450331126, "acc_stderr": 0.037345356767871984, "acc_norm": 0.2980132450331126, "acc_norm_stderr": 0.037345356767871984 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.6495412844036698, "acc_stderr": 0.02045607759982446, "acc_norm": 0.6495412844036698, "acc_norm_stderr": 0.02045607759982446 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.3287037037037037, "acc_stderr": 0.032036140846700596, "acc_norm": 
0.3287037037037037, "acc_norm_stderr": 0.032036140846700596 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6666666666666666, "acc_stderr": 0.03308611113236434, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.03308611113236434 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.6582278481012658, "acc_stderr": 0.03087453753755362, "acc_norm": 0.6582278481012658, "acc_norm_stderr": 0.03087453753755362 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5695067264573991, "acc_stderr": 0.033231973029429394, "acc_norm": 0.5695067264573991, "acc_norm_stderr": 0.033231973029429394 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.4961832061068702, "acc_stderr": 0.043851623256015534, "acc_norm": 0.4961832061068702, "acc_norm_stderr": 0.043851623256015534 }, "harness|hendrycksTest-international_law|5": { "acc": 0.5619834710743802, "acc_stderr": 0.04529146804435792, "acc_norm": 0.5619834710743802, "acc_norm_stderr": 0.04529146804435792 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.5833333333333334, "acc_stderr": 0.04766075165356461, "acc_norm": 0.5833333333333334, "acc_norm_stderr": 0.04766075165356461 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.5398773006134969, "acc_stderr": 0.03915857291436971, "acc_norm": 0.5398773006134969, "acc_norm_stderr": 0.03915857291436971 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3392857142857143, "acc_stderr": 0.04493949068613539, "acc_norm": 0.3392857142857143, "acc_norm_stderr": 0.04493949068613539 }, "harness|hendrycksTest-management|5": { "acc": 0.6699029126213593, "acc_stderr": 0.0465614711001235, "acc_norm": 0.6699029126213593, "acc_norm_stderr": 0.0465614711001235 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7393162393162394, "acc_stderr": 0.028760348956523414, "acc_norm": 0.7393162393162394, "acc_norm_stderr": 0.028760348956523414 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, 
"acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6730523627075351, "acc_stderr": 0.016774908180131467, "acc_norm": 0.6730523627075351, "acc_norm_stderr": 0.016774908180131467 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5202312138728323, "acc_stderr": 0.026897049996382875, "acc_norm": 0.5202312138728323, "acc_norm_stderr": 0.026897049996382875 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.21564245810055865, "acc_stderr": 0.013754835975482351, "acc_norm": 0.21564245810055865, "acc_norm_stderr": 0.013754835975482351 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5196078431372549, "acc_stderr": 0.028607893699576066, "acc_norm": 0.5196078431372549, "acc_norm_stderr": 0.028607893699576066 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5691318327974276, "acc_stderr": 0.028125340983972714, "acc_norm": 0.5691318327974276, "acc_norm_stderr": 0.028125340983972714 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5648148148148148, "acc_stderr": 0.027586006221607697, "acc_norm": 0.5648148148148148, "acc_norm_stderr": 0.027586006221607697 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.3546099290780142, "acc_stderr": 0.028538650028878638, "acc_norm": 0.3546099290780142, "acc_norm_stderr": 0.028538650028878638 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.333116036505867, "acc_stderr": 0.012037930451512052, "acc_norm": 0.333116036505867, "acc_norm_stderr": 0.012037930451512052 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.35661764705882354, "acc_stderr": 0.029097209568411945, "acc_norm": 0.35661764705882354, "acc_norm_stderr": 0.029097209568411945 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.4722222222222222, "acc_stderr": 0.02019659493354119, "acc_norm": 0.4722222222222222, "acc_norm_stderr": 0.02019659493354119 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5181818181818182, "acc_stderr": 0.04785964010794915, "acc_norm": 
0.5181818181818182, "acc_norm_stderr": 0.04785964010794915 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.4775510204081633, "acc_stderr": 0.03197694118713672, "acc_norm": 0.4775510204081633, "acc_norm_stderr": 0.03197694118713672 }, "harness|hendrycksTest-sociology|5": { "acc": 0.5771144278606966, "acc_stderr": 0.034932317774212816, "acc_norm": 0.5771144278606966, "acc_norm_stderr": 0.034932317774212816 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.67, "acc_stderr": 0.04725815626252607, "acc_norm": 0.67, "acc_norm_stderr": 0.04725815626252607 }, "harness|hendrycksTest-virology|5": { "acc": 0.42771084337349397, "acc_stderr": 0.038515976837185335, "acc_norm": 0.42771084337349397, "acc_norm_stderr": 0.038515976837185335 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7017543859649122, "acc_stderr": 0.035087719298245626, "acc_norm": 0.7017543859649122, "acc_norm_stderr": 0.035087719298245626 }, "harness|truthfulqa:mc|0": { "mc1": 0.2913096695226438, "mc1_stderr": 0.015905987048184828, "mc2": 0.4396876190519236, "mc2_stderr": 0.015652499203021628 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
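A side note on the naming scheme visible in the config listings above: each run's split name and parquet filenames appear to be mechanical transformations of the run's ISO timestamp. A minimal sketch of that apparent mapping (an inference from the listings in this card, not a documented API):

```python
# Inferred from the config listings above (an assumption, not a documented API):
# split names seem to replace both "-" and ":" in the run timestamp with "_",
# while parquet filenames only replace ":" with "-" to stay filesystem-safe.
run_ts = "2023-10-04T03:19:48.694479"  # run timestamp from the card above

split_name = run_ts.replace("-", "_").replace(":", "_")
file_ts = run_ts.replace(":", "-")

print(split_name)  # 2023_10_04T03_19_48.694479
print(file_ts)     # 2023-10-04T03-19-48.694479
```

Knowing this mapping makes it easy to go from a results URL (which embeds the file-safe timestamp) back to the split name to pass to `load_dataset`.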
atom-in-the-universe/bild-8b359666-dddc-4929-89f7-8ad8bd9941ee
2023-10-04T03:41:31.000Z
[ "region:us" ]
atom-in-the-universe
null
null
null
0
0
Entry not found
open-llm-leaderboard/details_TigerResearch__tigerbot-13b-base
2023-10-04T03:32:51.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of TigerResearch/tigerbot-13b-base dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [TigerResearch/tigerbot-13b-base](https://huggingface.co/TigerResearch/tigerbot-13b-base)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TigerResearch__tigerbot-13b-base\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T03:31:16.960858](https://huggingface.co/datasets/open-llm-leaderboard/details_TigerResearch__tigerbot-13b-base/blob/main/results_2023-10-04T03-31-16.960858.json)\ \ (note that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5357058757796341,\n\ \ \"acc_stderr\": 0.03475322889311379,\n \"acc_norm\": 0.5396814413763038,\n\ \ \"acc_norm_stderr\": 0.03473995399464742,\n \"mc1\": 0.30599755201958384,\n\ \ \"mc1_stderr\": 0.016132229728155045,\n \"mc2\": 0.44058088181187305,\n\ \ \"mc2_stderr\": 0.01457019109202042\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5017064846416383,\n \"acc_stderr\": 0.01461130570505699,\n\ \ \"acc_norm\": 0.53839590443686,\n \"acc_norm_stderr\": 0.01456824555029636\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5725951005775742,\n\ \ \"acc_stderr\": 0.0049369085031408695,\n \"acc_norm\": 0.7704640509858594,\n\ \ \"acc_norm_stderr\": 0.0041967496483853815\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.42962962962962964,\n\ \ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.42962962962962964,\n\ \ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5592105263157895,\n \"acc_stderr\": 0.04040311062490437,\n\ \ \"acc_norm\": 0.5592105263157895,\n \"acc_norm_stderr\": 0.04040311062490437\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\ \ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \ \ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.5735849056603773,\n \"acc_stderr\": 0.030437794342983052,\n\ \ \"acc_norm\": 0.5735849056603773,\n \"acc_norm_stderr\": 0.030437794342983052\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5555555555555556,\n\ \ \"acc_stderr\": 0.041553199555931467,\n \"acc_norm\": 0.5555555555555556,\n\ \ \"acc_norm_stderr\": 
0.041553199555931467\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\"\ : 0.45,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \ \ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4624277456647399,\n\ \ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.4624277456647399,\n\ \ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179328,\n\ \ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179328\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n\ \ \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.39574468085106385,\n \"acc_stderr\": 0.03196758697835363,\n\ \ \"acc_norm\": 0.39574468085106385,\n \"acc_norm_stderr\": 0.03196758697835363\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\ \ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\ \ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.04164188720169377,\n\ \ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.04164188720169377\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.328042328042328,\n \"acc_stderr\": 0.024180497164376896,\n \"\ acc_norm\": 
0.328042328042328,\n \"acc_norm_stderr\": 0.024180497164376896\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\ \ \"acc_stderr\": 0.04190596438871136,\n \"acc_norm\": 0.3253968253968254,\n\ \ \"acc_norm_stderr\": 0.04190596438871136\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\ \ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6483870967741936,\n\ \ \"acc_stderr\": 0.02716253782694846,\n \"acc_norm\": 0.6483870967741936,\n\ \ \"acc_norm_stderr\": 0.02716253782694846\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.4039408866995074,\n \"acc_stderr\": 0.034524539038220406,\n\ \ \"acc_norm\": 0.4039408866995074,\n \"acc_norm_stderr\": 0.034524539038220406\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\ : 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.037131580674819135,\n\ \ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.037131580674819135\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.6464646464646465,\n \"acc_stderr\": 0.03406086723547155,\n \"\ acc_norm\": 0.6464646464646465,\n \"acc_norm_stderr\": 0.03406086723547155\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.7668393782383419,\n \"acc_stderr\": 0.03051611137147602,\n\ \ \"acc_norm\": 0.7668393782383419,\n \"acc_norm_stderr\": 0.03051611137147602\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.4794871794871795,\n \"acc_stderr\": 0.025329663163489943,\n\ \ \"acc_norm\": 0.4794871794871795,\n \"acc_norm_stderr\": 0.025329663163489943\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340496,\n \ \ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340496\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.5462184873949579,\n \"acc_stderr\": 0.03233943468182088,\n \ \ \"acc_norm\": 0.5462184873949579,\n \"acc_norm_stderr\": 0.03233943468182088\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\ acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7412844036697248,\n \"acc_stderr\": 0.01877605231961963,\n \"\ acc_norm\": 0.7412844036697248,\n \"acc_norm_stderr\": 0.01877605231961963\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321617,\n \"\ acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321617\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.6862745098039216,\n \"acc_stderr\": 0.032566854844603886,\n \"\ acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.032566854844603886\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7172995780590717,\n \"acc_stderr\": 0.029312814153955938,\n \ \ \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.029312814153955938\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\ \ \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.6367713004484304,\n\ \ \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009225,\n\ \ \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009225\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908706,\n \"\ acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908706\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5833333333333334,\n\ \ \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.5833333333333334,\n\ \ \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935574,\n\ \ \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935574\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\ \ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\ \ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.045821241601615506,\n\ \ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.045821241601615506\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7777777777777778,\n\ \ \"acc_stderr\": 0.027236013946196687,\n \"acc_norm\": 0.7777777777777778,\n\ \ \"acc_norm_stderr\": 0.027236013946196687\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \ \ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7318007662835249,\n\ \ \"acc_stderr\": 0.015842430835269414,\n \"acc_norm\": 0.7318007662835249,\n\ \ \"acc_norm_stderr\": 0.015842430835269414\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.569364161849711,\n \"acc_stderr\": 0.02665880027367238,\n\ \ \"acc_norm\": 0.569364161849711,\n \"acc_norm_stderr\": 0.02665880027367238\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2324022346368715,\n\ \ \"acc_stderr\": 0.014125968754673398,\n \"acc_norm\": 
0.2324022346368715,\n\ \ \"acc_norm_stderr\": 0.014125968754673398\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.5980392156862745,\n \"acc_stderr\": 0.028074158947600656,\n\ \ \"acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.028074158947600656\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.662379421221865,\n\ \ \"acc_stderr\": 0.026858825879488544,\n \"acc_norm\": 0.662379421221865,\n\ \ \"acc_norm_stderr\": 0.026858825879488544\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.02758600622160771,\n\ \ \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.02758600622160771\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.40070921985815605,\n \"acc_stderr\": 0.029233465745573083,\n \ \ \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.029233465745573083\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3917861799217731,\n\ \ \"acc_stderr\": 0.01246756441814513,\n \"acc_norm\": 0.3917861799217731,\n\ \ \"acc_norm_stderr\": 0.01246756441814513\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.030352303395351964,\n\ \ \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.030352303395351964\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.4869281045751634,\n \"acc_stderr\": 0.020220920829626912,\n \ \ \"acc_norm\": 0.4869281045751634,\n \"acc_norm_stderr\": 0.020220920829626912\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\ \ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\ \ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.03100120903989484,\n\ \ \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.03100120903989484\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7661691542288557,\n\ \ \"acc_stderr\": 0.029929415408348384,\n \"acc_norm\": 0.7661691542288557,\n\ \ \"acc_norm_stderr\": 0.029929415408348384\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \ \ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\ \ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\ \ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\ \ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686396,\n\ \ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686396\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30599755201958384,\n\ \ \"mc1_stderr\": 0.016132229728155045,\n \"mc2\": 0.44058088181187305,\n\ \ \"mc2_stderr\": 0.01457019109202042\n }\n}\n```" repo_url: https://huggingface.co/TigerResearch/tigerbot-13b-base leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|arc:challenge|25_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hellaswag|10_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T03-31-16.960858.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T03-31-16.960858.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T03-31-16.960858.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T03-31-16.960858.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T03-31-16.960858.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T03-31-16.960858.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T03-31-16.960858.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T03-31-16.960858.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T03-31-16.960858.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T03_31_16.960858 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T03-31-16.960858.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T03-31-16.960858.parquet' - config_name: results data_files: - split: 2023_10_04T03_31_16.960858 path: - results_2023-10-04T03-31-16.960858.parquet - split: latest path: - results_2023-10-04T03-31-16.960858.parquet --- # Dataset Card for Evaluation run of TigerResearch/tigerbot-13b-base ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/TigerResearch/tigerbot-13b-base - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[TigerResearch/tigerbot-13b-base](https://huggingface.co/TigerResearch/tigerbot-13b-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_TigerResearch__tigerbot-13b-base", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-10-04T03:31:16.960858](https://huggingface.co/datasets/open-llm-leaderboard/details_TigerResearch__tigerbot-13b-base/blob/main/results_2023-10-04T03-31-16.960858.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5357058757796341, "acc_stderr": 0.03475322889311379, "acc_norm": 0.5396814413763038, "acc_norm_stderr": 0.03473995399464742, "mc1": 0.30599755201958384, "mc1_stderr": 0.016132229728155045, "mc2": 0.44058088181187305, "mc2_stderr": 0.01457019109202042 }, "harness|arc:challenge|25": { "acc": 0.5017064846416383, "acc_stderr": 0.01461130570505699, "acc_norm": 0.53839590443686, "acc_norm_stderr": 0.01456824555029636 }, "harness|hellaswag|10": { "acc": 0.5725951005775742, "acc_stderr": 0.0049369085031408695, "acc_norm": 0.7704640509858594, "acc_norm_stderr": 0.0041967496483853815 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.42962962962962964, "acc_stderr": 0.04276349494376599, "acc_norm": 0.42962962962962964, "acc_norm_stderr": 0.04276349494376599 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5592105263157895, "acc_stderr": 0.04040311062490437, "acc_norm": 0.5592105263157895, "acc_norm_stderr": 0.04040311062490437 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.57, "acc_stderr": 0.049756985195624284, "acc_norm": 0.57, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5735849056603773, "acc_stderr": 0.030437794342983052, "acc_norm": 0.5735849056603773, "acc_norm_stderr": 0.030437794342983052 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5555555555555556, "acc_stderr": 0.041553199555931467, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.041553199555931467 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.45, "acc_stderr": 0.04999999999999999, "acc_norm": 0.45, 
"acc_norm_stderr": 0.04999999999999999 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.4624277456647399, "acc_stderr": 0.0380168510452446, "acc_norm": 0.4624277456647399, "acc_norm_stderr": 0.0380168510452446 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.27450980392156865, "acc_stderr": 0.04440521906179328, "acc_norm": 0.27450980392156865, "acc_norm_stderr": 0.04440521906179328 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.67, "acc_stderr": 0.047258156262526094, "acc_norm": 0.67, "acc_norm_stderr": 0.047258156262526094 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.39574468085106385, "acc_stderr": 0.03196758697835363, "acc_norm": 0.39574468085106385, "acc_norm_stderr": 0.03196758697835363 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.30701754385964913, "acc_stderr": 0.04339138322579861, "acc_norm": 0.30701754385964913, "acc_norm_stderr": 0.04339138322579861 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.4827586206896552, "acc_stderr": 0.04164188720169377, "acc_norm": 0.4827586206896552, "acc_norm_stderr": 0.04164188720169377 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.328042328042328, "acc_stderr": 0.024180497164376896, "acc_norm": 0.328042328042328, "acc_norm_stderr": 0.024180497164376896 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3253968253968254, "acc_stderr": 0.04190596438871136, "acc_norm": 0.3253968253968254, "acc_norm_stderr": 0.04190596438871136 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6483870967741936, "acc_stderr": 0.02716253782694846, "acc_norm": 0.6483870967741936, "acc_norm_stderr": 0.02716253782694846 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4039408866995074, "acc_stderr": 0.034524539038220406, "acc_norm": 0.4039408866995074, "acc_norm_stderr": 0.034524539038220406 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.59, "acc_stderr": 0.04943110704237102, "acc_norm": 0.59, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6545454545454545, "acc_stderr": 0.037131580674819135, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 0.037131580674819135 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6464646464646465, "acc_stderr": 0.03406086723547155, "acc_norm": 0.6464646464646465, "acc_norm_stderr": 0.03406086723547155 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7668393782383419, "acc_stderr": 0.03051611137147602, "acc_norm": 0.7668393782383419, "acc_norm_stderr": 0.03051611137147602 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.4794871794871795, "acc_stderr": 0.025329663163489943, "acc_norm": 0.4794871794871795, "acc_norm_stderr": 0.025329663163489943 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2851851851851852, "acc_stderr": 0.027528599210340496, "acc_norm": 0.2851851851851852, "acc_norm_stderr": 0.027528599210340496 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5462184873949579, "acc_stderr": 0.03233943468182088, "acc_norm": 0.5462184873949579, "acc_norm_stderr": 0.03233943468182088 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33112582781456956, "acc_stderr": 0.038425817186598696, "acc_norm": 0.33112582781456956, "acc_norm_stderr": 0.038425817186598696 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7412844036697248, "acc_stderr": 0.01877605231961963, "acc_norm": 0.7412844036697248, "acc_norm_stderr": 0.01877605231961963 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.44907407407407407, "acc_stderr": 
0.03392238405321617, "acc_norm": 0.44907407407407407, "acc_norm_stderr": 0.03392238405321617 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6862745098039216, "acc_stderr": 0.032566854844603886, "acc_norm": 0.6862745098039216, "acc_norm_stderr": 0.032566854844603886 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7172995780590717, "acc_stderr": 0.029312814153955938, "acc_norm": 0.7172995780590717, "acc_norm_stderr": 0.029312814153955938 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6367713004484304, "acc_stderr": 0.032277904428505, "acc_norm": 0.6367713004484304, "acc_norm_stderr": 0.032277904428505 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6030534351145038, "acc_stderr": 0.04291135671009225, "acc_norm": 0.6030534351145038, "acc_norm_stderr": 0.04291135671009225 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7272727272727273, "acc_stderr": 0.04065578140908706, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.04065578140908706 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.5833333333333334, "acc_stderr": 0.04766075165356461, "acc_norm": 0.5833333333333334, "acc_norm_stderr": 0.04766075165356461 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6625766871165644, "acc_stderr": 0.03714908409935574, "acc_norm": 0.6625766871165644, "acc_norm_stderr": 0.03714908409935574 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04697113923010212, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04697113923010212 }, "harness|hendrycksTest-management|5": { "acc": 0.6893203883495146, "acc_stderr": 0.045821241601615506, "acc_norm": 0.6893203883495146, "acc_norm_stderr": 0.045821241601615506 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7777777777777778, "acc_stderr": 0.027236013946196687, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.027236013946196687 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.54, "acc_stderr": 
0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7318007662835249, "acc_stderr": 0.015842430835269414, "acc_norm": 0.7318007662835249, "acc_norm_stderr": 0.015842430835269414 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.569364161849711, "acc_stderr": 0.02665880027367238, "acc_norm": 0.569364161849711, "acc_norm_stderr": 0.02665880027367238 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2324022346368715, "acc_stderr": 0.014125968754673398, "acc_norm": 0.2324022346368715, "acc_norm_stderr": 0.014125968754673398 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5980392156862745, "acc_stderr": 0.028074158947600656, "acc_norm": 0.5980392156862745, "acc_norm_stderr": 0.028074158947600656 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.662379421221865, "acc_stderr": 0.026858825879488544, "acc_norm": 0.662379421221865, "acc_norm_stderr": 0.026858825879488544 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5648148148148148, "acc_stderr": 0.02758600622160771, "acc_norm": 0.5648148148148148, "acc_norm_stderr": 0.02758600622160771 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.40070921985815605, "acc_stderr": 0.029233465745573083, "acc_norm": 0.40070921985815605, "acc_norm_stderr": 0.029233465745573083 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3917861799217731, "acc_stderr": 0.01246756441814513, "acc_norm": 0.3917861799217731, "acc_norm_stderr": 0.01246756441814513 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5183823529411765, "acc_stderr": 0.030352303395351964, "acc_norm": 0.5183823529411765, "acc_norm_stderr": 0.030352303395351964 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.4869281045751634, "acc_stderr": 0.020220920829626912, "acc_norm": 0.4869281045751634, "acc_norm_stderr": 0.020220920829626912 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6909090909090909, "acc_stderr": 
0.044262946482000985, "acc_norm": 0.6909090909090909, "acc_norm_stderr": 0.044262946482000985 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6244897959183674, "acc_stderr": 0.03100120903989484, "acc_norm": 0.6244897959183674, "acc_norm_stderr": 0.03100120903989484 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7661691542288557, "acc_stderr": 0.029929415408348384, "acc_norm": 0.7661691542288557, "acc_norm_stderr": 0.029929415408348384 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.74, "acc_stderr": 0.0440844002276808, "acc_norm": 0.74, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-virology|5": { "acc": 0.42771084337349397, "acc_stderr": 0.038515976837185335, "acc_norm": 0.42771084337349397, "acc_norm_stderr": 0.038515976837185335 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03188578017686396, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03188578017686396 }, "harness|truthfulqa:mc|0": { "mc1": 0.30599755201958384, "mc1_stderr": 0.016132229728155045, "mc2": 0.44058088181187305, "mc2_stderr": 0.01457019109202042 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
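The per-task blocks in the "Latest results" section above all share one shape (`acc`, `acc_stderr`, `acc_norm`, `acc_norm_stderr`), so once loaded they can be aggregated with plain Python. A minimal sketch, not part of the card template: the three entries below are copied from the results printed above, and the 0.5 accuracy threshold is arbitrary.

```python
# Three per-task entries copied verbatim from the "Latest results" JSON above.
results = {
    "harness|hendrycksTest-college_mathematics|5": {"acc": 0.36, "acc_stderr": 0.048241815132442176},
    "harness|hendrycksTest-computer_security|5": {"acc": 0.67, "acc_stderr": 0.047258156262526094},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.7777777777777778, "acc_stderr": 0.03188578017686396},
}

# Mean accuracy over the selected MMLU subtasks.
mean_acc = sum(v["acc"] for v in results.values()) / len(results)

# Subtasks clearing an (arbitrary) 0.5 accuracy threshold, sorted by name.
strong = sorted(k for k, v in results.items() if v["acc"] > 0.5)

print(round(mean_acc, 4))  # → 0.6026
print(strong)
```

The same pattern applies to the full results file linked above, since every `harness|…` key maps to a dict with identical metric fields.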
open-llm-leaderboard/details_cmarkea__bloomz-560m-sft-chat
2023-10-04T03:37:19.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of cmarkea/bloomz-560m-sft-chat dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [cmarkea/bloomz-560m-sft-chat](https://huggingface.co/cmarkea/bloomz-560m-sft-chat)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cmarkea__bloomz-560m-sft-chat\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T03:35:59.039004](https://huggingface.co/datasets/open-llm-leaderboard/details_cmarkea__bloomz-560m-sft-chat/blob/main/results_2023-10-04T03-35-59.039004.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24057144726480983,\n\ \ \"acc_stderr\": 0.030934271725636703,\n \"acc_norm\": 0.24213556979953058,\n\ \ \"acc_norm_stderr\": 0.030948364717725742,\n \"mc1\": 0.2594859241126071,\n\ \ \"mc1_stderr\": 0.015345409485557977,\n \"mc2\": 0.4235017854721072,\n\ \ \"mc2_stderr\": 0.014827960507653516\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.23464163822525597,\n \"acc_stderr\": 0.012383873560768675,\n\ \ \"acc_norm\": 0.27474402730375425,\n \"acc_norm_stderr\": 0.013044617212771227\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.31836287592113127,\n\ \ \"acc_stderr\": 0.004648890787581681,\n \"acc_norm\": 0.37054371639115713,\n\ \ \"acc_norm_stderr\": 0.004819633668832546\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \ \ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2814814814814815,\n\ \ \"acc_stderr\": 0.038850042458002526,\n \"acc_norm\": 0.2814814814814815,\n\ \ \"acc_norm_stderr\": 0.038850042458002526\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.14473684210526316,\n \"acc_stderr\": 0.028631951845930384,\n\ \ \"acc_norm\": 0.14473684210526316,\n \"acc_norm_stderr\": 0.028631951845930384\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.28,\n\ \ \"acc_stderr\": 0.04512608598542126,\n \"acc_norm\": 0.28,\n \ \ \"acc_norm_stderr\": 0.04512608598542126\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.22264150943396227,\n \"acc_stderr\": 0.025604233470899105,\n\ \ \"acc_norm\": 0.22264150943396227,\n \"acc_norm_stderr\": 0.025604233470899105\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\ \ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\ \ \"acc_norm_stderr\": 
0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \ \ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\ \ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \ \ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n\ \ \"acc_stderr\": 0.03063114553919882,\n \"acc_norm\": 0.2023121387283237,\n\ \ \"acc_norm_stderr\": 0.03063114553919882\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n\ \ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.17,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.17,\n\ \ \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.2851063829787234,\n \"acc_stderr\": 0.029513196625539355,\n\ \ \"acc_norm\": 0.2851063829787234,\n \"acc_norm_stderr\": 0.029513196625539355\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\ \ \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n\ \ \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n\ \ \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\ acc_norm\": 
0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n\ \ \"acc_stderr\": 0.036196045241242515,\n \"acc_norm\": 0.20634920634920634,\n\ \ \"acc_norm_stderr\": 0.036196045241242515\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \ \ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25806451612903225,\n\ \ \"acc_stderr\": 0.02489246917246283,\n \"acc_norm\": 0.25806451612903225,\n\ \ \"acc_norm_stderr\": 0.02489246917246283\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.21182266009852216,\n \"acc_stderr\": 0.028748983689941086,\n\ \ \"acc_norm\": 0.21182266009852216,\n \"acc_norm_stderr\": 0.028748983689941086\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\"\ : 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.03477691162163659,\n\ \ \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.03477691162163659\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.25252525252525254,\n \"acc_stderr\": 0.030954055470365904,\n \"\ acc_norm\": 0.25252525252525254,\n \"acc_norm_stderr\": 0.030954055470365904\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.18652849740932642,\n \"acc_stderr\": 0.028112091210117467,\n\ \ \"acc_norm\": 0.18652849740932642,\n \"acc_norm_stderr\": 0.028112091210117467\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.020752423722128013,\n\ \ \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.020752423722128013\n\ \ 
},\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275794,\n \ \ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275794\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.24369747899159663,\n \"acc_stderr\": 0.02788682807838057,\n\ \ \"acc_norm\": 0.24369747899159663,\n \"acc_norm_stderr\": 0.02788682807838057\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.2052980132450331,\n \"acc_stderr\": 0.03297986648473834,\n \"\ acc_norm\": 0.2052980132450331,\n \"acc_norm_stderr\": 0.03297986648473834\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.21651376146788992,\n \"acc_stderr\": 0.01765871059444313,\n \"\ acc_norm\": 0.21651376146788992,\n \"acc_norm_stderr\": 0.01765871059444313\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.39351851851851855,\n \"acc_stderr\": 0.03331747876370312,\n \"\ acc_norm\": 0.39351851851851855,\n \"acc_norm_stderr\": 0.03331747876370312\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.24509803921568626,\n \"acc_stderr\": 0.03019028245350194,\n \"\ acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.03019028245350194\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.25738396624472576,\n \"acc_stderr\": 0.028458820991460295,\n \ \ \"acc_norm\": 0.25738396624472576,\n \"acc_norm_stderr\": 0.028458820991460295\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2556053811659193,\n\ \ \"acc_stderr\": 0.029275891003969927,\n \"acc_norm\": 0.2556053811659193,\n\ \ \"acc_norm_stderr\": 0.029275891003969927\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.1984732824427481,\n \"acc_stderr\": 0.03498149385462472,\n\ \ \"acc_norm\": 0.1984732824427481,\n \"acc_norm_stderr\": 0.03498149385462472\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.1652892561983471,\n \"acc_stderr\": 0.03390780612972776,\n \"\ acc_norm\": 0.1652892561983471,\n \"acc_norm_stderr\": 0.03390780612972776\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\ \ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \ \ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.26380368098159507,\n \"acc_stderr\": 0.03462419931615624,\n\ \ \"acc_norm\": 0.26380368098159507,\n \"acc_norm_stderr\": 0.03462419931615624\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\ \ \"acc_stderr\": 0.044642857142857116,\n \"acc_norm\": 0.33035714285714285,\n\ \ \"acc_norm_stderr\": 0.044642857142857116\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.1650485436893204,\n \"acc_stderr\": 0.036756688322331886,\n\ \ \"acc_norm\": 0.1650485436893204,\n \"acc_norm_stderr\": 0.036756688322331886\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.25213675213675213,\n\ \ \"acc_stderr\": 0.02844796547623102,\n \"acc_norm\": 0.25213675213675213,\n\ \ \"acc_norm_stderr\": 0.02844796547623102\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.27458492975734355,\n\ \ \"acc_stderr\": 0.015959829933084035,\n \"acc_norm\": 0.27458492975734355,\n\ \ \"acc_norm_stderr\": 0.015959829933084035\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.02317629820399201,\n\ \ \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.02317629820399201\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\ \ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 
0.23798882681564246,\n\ \ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.238562091503268,\n \"acc_stderr\": 0.024404394928087866,\n\ \ \"acc_norm\": 0.238562091503268,\n \"acc_norm_stderr\": 0.024404394928087866\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.26688102893890675,\n\ \ \"acc_stderr\": 0.025122637608816636,\n \"acc_norm\": 0.26688102893890675,\n\ \ \"acc_norm_stderr\": 0.025122637608816636\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.30246913580246915,\n \"acc_stderr\": 0.025557653981868052,\n\ \ \"acc_norm\": 0.30246913580246915,\n \"acc_norm_stderr\": 0.025557653981868052\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.24822695035460993,\n \"acc_stderr\": 0.02577001564429039,\n \ \ \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.02577001564429039\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.22685788787483702,\n\ \ \"acc_stderr\": 0.010696348133569926,\n \"acc_norm\": 0.22685788787483702,\n\ \ \"acc_norm_stderr\": 0.010696348133569926\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.16544117647058823,\n \"acc_stderr\": 0.022571771025494774,\n\ \ \"acc_norm\": 0.16544117647058823,\n \"acc_norm_stderr\": 0.022571771025494774\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.24509803921568626,\n \"acc_stderr\": 0.01740181671142765,\n \ \ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.01740181671142765\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n\ \ \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n\ \ \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.2163265306122449,\n \"acc_stderr\": 0.026358916334904038,\n\ \ \"acc_norm\": 0.2163265306122449,\n \"acc_norm_stderr\": 0.026358916334904038\n\ \ 
},\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\ \ \"acc_stderr\": 0.03014777593540922,\n \"acc_norm\": 0.23880597014925373,\n\ \ \"acc_norm_stderr\": 0.03014777593540922\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \ \ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.23493975903614459,\n\ \ \"acc_stderr\": 0.03300533186128922,\n \"acc_norm\": 0.23493975903614459,\n\ \ \"acc_norm_stderr\": 0.03300533186128922\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.21637426900584794,\n \"acc_stderr\": 0.03158149539338734,\n\ \ \"acc_norm\": 0.21637426900584794,\n \"acc_norm_stderr\": 0.03158149539338734\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2594859241126071,\n\ \ \"mc1_stderr\": 0.015345409485557977,\n \"mc2\": 0.4235017854721072,\n\ \ \"mc2_stderr\": 0.014827960507653516\n }\n}\n```" repo_url: https://huggingface.co/cmarkea/bloomz-560m-sft-chat leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|arc:challenge|25_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hellaswag|10_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T03-35-59.039004.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T03-35-59.039004.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T03-35-59.039004.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T03-35-59.039004.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T03-35-59.039004.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T03-35-59.039004.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T03-35-59.039004.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T03-35-59.039004.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T03-35-59.039004.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T03_35_59.039004 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T03-35-59.039004.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T03-35-59.039004.parquet' - config_name: results data_files: - split: 2023_10_04T03_35_59.039004 path: - results_2023-10-04T03-35-59.039004.parquet - split: latest path: - results_2023-10-04T03-35-59.039004.parquet --- # Dataset Card for Evaluation run of cmarkea/bloomz-560m-sft-chat ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/cmarkea/bloomz-560m-sft-chat - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[cmarkea/bloomz-560m-sft-chat](https://huggingface.co/cmarkea/bloomz-560m-sft-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_cmarkea__bloomz-560m-sft-chat", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-10-04T03:35:59.039004](https://huggingface.co/datasets/open-llm-leaderboard/details_cmarkea__bloomz-560m-sft-chat/blob/main/results_2023-10-04T03-35-59.039004.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You can find each result in the "latest" split of the corresponding configuration): ```python { "all": { "acc": 0.24057144726480983, "acc_stderr": 0.030934271725636703, "acc_norm": 0.24213556979953058, "acc_norm_stderr": 0.030948364717725742, "mc1": 0.2594859241126071, "mc1_stderr": 0.015345409485557977, "mc2": 0.4235017854721072, "mc2_stderr": 0.014827960507653516 }, "harness|arc:challenge|25": { "acc": 0.23464163822525597, "acc_stderr": 0.012383873560768675, "acc_norm": 0.27474402730375425, "acc_norm_stderr": 0.013044617212771227 }, "harness|hellaswag|10": { "acc": 0.31836287592113127, "acc_stderr": 0.004648890787581681, "acc_norm": 0.37054371639115713, "acc_norm_stderr": 0.004819633668832546 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.26, "acc_stderr": 0.04408440022768078, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.2814814814814815, "acc_stderr": 0.038850042458002526, "acc_norm": 0.2814814814814815, "acc_norm_stderr": 0.038850042458002526 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.14473684210526316, "acc_stderr": 0.028631951845930384, "acc_norm": 0.14473684210526316, "acc_norm_stderr": 0.028631951845930384 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.28, "acc_stderr": 0.04512608598542126, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542126 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.22264150943396227, "acc_stderr": 0.025604233470899105, "acc_norm": 0.22264150943396227, "acc_norm_stderr": 0.025604233470899105 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2222222222222222, "acc_stderr": 0.03476590104304134, "acc_norm": 0.2222222222222222, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.24, "acc_stderr": 0.04292346959909283, "acc_norm": 0.24, "acc_norm_stderr": 0.04292346959909283 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, 
"acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.23, "acc_stderr": 0.04229525846816507, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816507 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.2023121387283237, "acc_stderr": 0.03063114553919882, "acc_norm": 0.2023121387283237, "acc_norm_stderr": 0.03063114553919882 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.23529411764705882, "acc_stderr": 0.04220773659171453, "acc_norm": 0.23529411764705882, "acc_norm_stderr": 0.04220773659171453 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.17, "acc_stderr": 0.03775251680686371, "acc_norm": 0.17, "acc_norm_stderr": 0.03775251680686371 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.2851063829787234, "acc_stderr": 0.029513196625539355, "acc_norm": 0.2851063829787234, "acc_norm_stderr": 0.029513196625539355 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.24561403508771928, "acc_stderr": 0.04049339297748141, "acc_norm": 0.24561403508771928, "acc_norm_stderr": 0.04049339297748141 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.23448275862068965, "acc_stderr": 0.035306258743465914, "acc_norm": 0.23448275862068965, "acc_norm_stderr": 0.035306258743465914 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2566137566137566, "acc_stderr": 0.022494510767503154, "acc_norm": 0.2566137566137566, "acc_norm_stderr": 0.022494510767503154 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.20634920634920634, "acc_stderr": 0.036196045241242515, "acc_norm": 0.20634920634920634, "acc_norm_stderr": 0.036196045241242515 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.2, "acc_stderr": 0.040201512610368445, "acc_norm": 0.2, "acc_norm_stderr": 0.040201512610368445 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.25806451612903225, "acc_stderr": 0.02489246917246283, "acc_norm": 0.25806451612903225, "acc_norm_stderr": 0.02489246917246283 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.21182266009852216, "acc_stderr": 0.028748983689941086, "acc_norm": 0.21182266009852216, "acc_norm_stderr": 0.028748983689941086 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.2727272727272727, "acc_stderr": 0.03477691162163659, "acc_norm": 0.2727272727272727, "acc_norm_stderr": 0.03477691162163659 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.25252525252525254, "acc_stderr": 0.030954055470365904, "acc_norm": 0.25252525252525254, "acc_norm_stderr": 0.030954055470365904 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.18652849740932642, "acc_stderr": 0.028112091210117467, "acc_norm": 0.18652849740932642, "acc_norm_stderr": 0.028112091210117467 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.2128205128205128, "acc_stderr": 0.020752423722128013, "acc_norm": 0.2128205128205128, "acc_norm_stderr": 0.020752423722128013 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.24074074074074073, "acc_stderr": 0.026067159222275794, "acc_norm": 0.24074074074074073, "acc_norm_stderr": 0.026067159222275794 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.24369747899159663, "acc_stderr": 0.02788682807838057, "acc_norm": 0.24369747899159663, "acc_norm_stderr": 0.02788682807838057 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2052980132450331, "acc_stderr": 0.03297986648473834, "acc_norm": 0.2052980132450331, "acc_norm_stderr": 0.03297986648473834 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.21651376146788992, "acc_stderr": 0.01765871059444313, "acc_norm": 0.21651376146788992, "acc_norm_stderr": 0.01765871059444313 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.39351851851851855, "acc_stderr": 
0.03331747876370312, "acc_norm": 0.39351851851851855, "acc_norm_stderr": 0.03331747876370312 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.24509803921568626, "acc_stderr": 0.03019028245350194, "acc_norm": 0.24509803921568626, "acc_norm_stderr": 0.03019028245350194 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.25738396624472576, "acc_stderr": 0.028458820991460295, "acc_norm": 0.25738396624472576, "acc_norm_stderr": 0.028458820991460295 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.2556053811659193, "acc_stderr": 0.029275891003969927, "acc_norm": 0.2556053811659193, "acc_norm_stderr": 0.029275891003969927 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.1984732824427481, "acc_stderr": 0.03498149385462472, "acc_norm": 0.1984732824427481, "acc_norm_stderr": 0.03498149385462472 }, "harness|hendrycksTest-international_law|5": { "acc": 0.1652892561983471, "acc_stderr": 0.03390780612972776, "acc_norm": 0.1652892561983471, "acc_norm_stderr": 0.03390780612972776 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.25, "acc_stderr": 0.04186091791394607, "acc_norm": 0.25, "acc_norm_stderr": 0.04186091791394607 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.26380368098159507, "acc_stderr": 0.03462419931615624, "acc_norm": 0.26380368098159507, "acc_norm_stderr": 0.03462419931615624 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.33035714285714285, "acc_stderr": 0.044642857142857116, "acc_norm": 0.33035714285714285, "acc_norm_stderr": 0.044642857142857116 }, "harness|hendrycksTest-management|5": { "acc": 0.1650485436893204, "acc_stderr": 0.036756688322331886, "acc_norm": 0.1650485436893204, "acc_norm_stderr": 0.036756688322331886 }, "harness|hendrycksTest-marketing|5": { "acc": 0.25213675213675213, "acc_stderr": 0.02844796547623102, "acc_norm": 0.25213675213675213, "acc_norm_stderr": 0.02844796547623102 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, 
"acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.27458492975734355, "acc_stderr": 0.015959829933084035, "acc_norm": 0.27458492975734355, "acc_norm_stderr": 0.015959829933084035 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.24566473988439305, "acc_stderr": 0.02317629820399201, "acc_norm": 0.24566473988439305, "acc_norm_stderr": 0.02317629820399201 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.23798882681564246, "acc_stderr": 0.014242630070574915, "acc_norm": 0.23798882681564246, "acc_norm_stderr": 0.014242630070574915 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.238562091503268, "acc_stderr": 0.024404394928087866, "acc_norm": 0.238562091503268, "acc_norm_stderr": 0.024404394928087866 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.26688102893890675, "acc_stderr": 0.025122637608816636, "acc_norm": 0.26688102893890675, "acc_norm_stderr": 0.025122637608816636 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.30246913580246915, "acc_stderr": 0.025557653981868052, "acc_norm": 0.30246913580246915, "acc_norm_stderr": 0.025557653981868052 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.24822695035460993, "acc_stderr": 0.02577001564429039, "acc_norm": 0.24822695035460993, "acc_norm_stderr": 0.02577001564429039 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.22685788787483702, "acc_stderr": 0.010696348133569926, "acc_norm": 0.22685788787483702, "acc_norm_stderr": 0.010696348133569926 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.16544117647058823, "acc_stderr": 0.022571771025494774, "acc_norm": 0.16544117647058823, "acc_norm_stderr": 0.022571771025494774 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.24509803921568626, "acc_stderr": 0.01740181671142765, "acc_norm": 0.24509803921568626, "acc_norm_stderr": 0.01740181671142765 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.21818181818181817, "acc_stderr": 
0.03955932861795833, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03955932861795833 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.2163265306122449, "acc_stderr": 0.026358916334904038, "acc_norm": 0.2163265306122449, "acc_norm_stderr": 0.026358916334904038 }, "harness|hendrycksTest-sociology|5": { "acc": 0.23880597014925373, "acc_stderr": 0.03014777593540922, "acc_norm": 0.23880597014925373, "acc_norm_stderr": 0.03014777593540922 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.19, "acc_stderr": 0.03942772444036623, "acc_norm": 0.19, "acc_norm_stderr": 0.03942772444036623 }, "harness|hendrycksTest-virology|5": { "acc": 0.23493975903614459, "acc_stderr": 0.03300533186128922, "acc_norm": 0.23493975903614459, "acc_norm_stderr": 0.03300533186128922 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.21637426900584794, "acc_stderr": 0.03158149539338734, "acc_norm": 0.21637426900584794, "acc_norm_stderr": 0.03158149539338734 }, "harness|truthfulqa:mc|0": { "mc1": 0.2594859241126071, "mc1_stderr": 0.015345409485557977, "mc2": 0.4235017854721072, "mc2_stderr": 0.014827960507653516 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
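Each task entry in the results JSON above reports an accuracy (`acc`) together with its standard error (`acc_stderr`). As a usage note, these pairs can be turned into approximate 95% confidence intervals when comparing runs. The helper below is purely illustrative (it is not part of the leaderboard tooling) and uses the "all" aggregate from the results above:

```python
def confidence_interval(acc: float, stderr: float, z: float = 1.96) -> tuple[float, float]:
    """Normal-approximation confidence interval for a reported accuracy.

    Illustrative helper only; not part of the Open LLM Leaderboard tooling.
    """
    return acc - z * stderr, acc + z * stderr

# Values taken from the "all" entry of the results JSON above.
lower, upper = confidence_interval(0.24057144726480983, 0.030934271725636703)
print(f"95% CI for aggregate accuracy: [{lower:.3f}, {upper:.3f}]")
```

When the intervals of two runs overlap substantially, the difference in their reported accuracies may not be meaningful at this evaluation size.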
open-llm-leaderboard/details_vikp__phi2
2023-10-04T03:37:28.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of vikp/phi2 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [vikp/phi2](https://huggingface.co/vikp/phi2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vikp__phi2\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T03:36:11.040901](https://huggingface.co/datasets/open-llm-leaderboard/details_vikp__phi2/blob/main/results_2023-10-04T03-36-11.040901.json)\ \ (note that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.27385595477961044,\n\ \ \"acc_stderr\": 0.03237376999165497,\n \"acc_norm\": 0.2752121319194434,\n\ \ \"acc_norm_stderr\": 0.032396515823164156,\n \"mc1\": 0.27050183598531213,\n\ \ \"mc1_stderr\": 0.015550778332842883,\n \"mc2\": 0.46097713753890357,\n\ \ \"mc2_stderr\": 0.015645791800114963\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.17235494880546076,\n \"acc_stderr\": 0.011037113093461295,\n\ \ \"acc_norm\": 0.22866894197952217,\n \"acc_norm_stderr\": 0.012272853582540787\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2833100975901215,\n\ \ \"acc_stderr\": 0.004496847773250638,\n \"acc_norm\": 0.30701055566620195,\n\ \ \"acc_norm_stderr\": 0.0046031113432130665\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2074074074074074,\n\ \ \"acc_stderr\": 0.03502553170678316,\n \"acc_norm\": 0.2074074074074074,\n\ \ \"acc_norm_stderr\": 0.03502553170678316\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.03317672787533157,\n\ \ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.03317672787533157\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.38,\n\ \ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \ \ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.27169811320754716,\n \"acc_stderr\": 0.027377706624670713,\n\ \ \"acc_norm\": 0.27169811320754716,\n \"acc_norm_stderr\": 0.027377706624670713\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\ \ \"acc_stderr\": 0.03586879280080342,\n \"acc_norm\": 0.24305555555555555,\n\ \ \"acc_norm_stderr\": 
0.03586879280080342\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \ \ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n\ \ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24855491329479767,\n\ \ \"acc_stderr\": 0.03295304696818318,\n \"acc_norm\": 0.24855491329479767,\n\ \ \"acc_norm_stderr\": 0.03295304696818318\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.03873958714149351,\n\ \ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.03873958714149351\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n\ \ \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102953,\n\ \ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102953\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\ \ \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n\ \ \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.2896551724137931,\n \"acc_stderr\": 0.037800192304380135,\n\ \ \"acc_norm\": 0.2896551724137931,\n \"acc_norm_stderr\": 0.037800192304380135\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.2671957671957672,\n \"acc_stderr\": 0.02278967314577656,\n \"\ acc_norm\": 
0.2671957671957672,\n \"acc_norm_stderr\": 0.02278967314577656\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\ \ \"acc_stderr\": 0.03809523809523811,\n \"acc_norm\": 0.23809523809523808,\n\ \ \"acc_norm_stderr\": 0.03809523809523811\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.27741935483870966,\n\ \ \"acc_stderr\": 0.025470196835900055,\n \"acc_norm\": 0.27741935483870966,\n\ \ \"acc_norm_stderr\": 0.025470196835900055\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.0317852971064275,\n\ \ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.0317852971064275\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\ : 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885415,\n\ \ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885415\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.3484848484848485,\n \"acc_stderr\": 0.033948539651564025,\n \"\ acc_norm\": 0.3484848484848485,\n \"acc_norm_stderr\": 0.033948539651564025\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.33678756476683935,\n \"acc_stderr\": 0.034107802518361825,\n\ \ \"acc_norm\": 0.33678756476683935,\n \"acc_norm_stderr\": 0.034107802518361825\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.36923076923076925,\n \"acc_stderr\": 0.024468615241478916,\n\ \ \"acc_norm\": 0.36923076923076925,\n \"acc_norm_stderr\": 0.024468615241478916\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \ \ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.2605042016806723,\n \"acc_stderr\": 0.02851025151234193,\n \ \ \"acc_norm\": 0.2605042016806723,\n \"acc_norm_stderr\": 0.02851025151234193\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\ : 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\ \ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.344954128440367,\n\ \ \"acc_stderr\": 0.02038060540506697,\n \"acc_norm\": 0.344954128440367,\n\ \ \"acc_norm_stderr\": 0.02038060540506697\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\ : {\n \"acc\": 0.3472222222222222,\n \"acc_stderr\": 0.032468872436376486,\n\ \ \"acc_norm\": 0.3472222222222222,\n \"acc_norm_stderr\": 0.032468872436376486\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.31862745098039214,\n \"acc_stderr\": 0.0327028718148208,\n \"\ acc_norm\": 0.31862745098039214,\n \"acc_norm_stderr\": 0.0327028718148208\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.25316455696202533,\n \"acc_stderr\": 0.0283046579430353,\n \ \ \"acc_norm\": 0.25316455696202533,\n \"acc_norm_stderr\": 0.0283046579430353\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.13452914798206278,\n\ \ \"acc_stderr\": 0.022901183761575582,\n \"acc_norm\": 0.13452914798206278,\n\ \ \"acc_norm_stderr\": 0.022901183761575582\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.3282442748091603,\n \"acc_stderr\": 0.04118438565806299,\n\ \ \"acc_norm\": 0.3282442748091603,\n \"acc_norm_stderr\": 0.04118438565806299\n\ \ },\n \"harness|hendrycksTest-international_law|5\": 
{\n \"acc\":\ \ 0.3140495867768595,\n \"acc_stderr\": 0.04236964753041018,\n \"\ acc_norm\": 0.3140495867768595,\n \"acc_norm_stderr\": 0.04236964753041018\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n\ \ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.2222222222222222,\n\ \ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.3067484662576687,\n \"acc_stderr\": 0.036230899157241474,\n\ \ \"acc_norm\": 0.3067484662576687,\n \"acc_norm_stderr\": 0.036230899157241474\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.1875,\n\ \ \"acc_stderr\": 0.0370468111477387,\n \"acc_norm\": 0.1875,\n \ \ \"acc_norm_stderr\": 0.0370468111477387\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690877,\n\ \ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690877\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24786324786324787,\n\ \ \"acc_stderr\": 0.0282863240755644,\n \"acc_norm\": 0.24786324786324787,\n\ \ \"acc_norm_stderr\": 0.0282863240755644\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165044,\n \ \ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165044\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23243933588761176,\n\ \ \"acc_stderr\": 0.015104550008905713,\n \"acc_norm\": 0.23243933588761176,\n\ \ \"acc_norm_stderr\": 0.015104550008905713\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.29190751445086704,\n \"acc_stderr\": 0.02447699407624734,\n\ \ \"acc_norm\": 0.29190751445086704,\n \"acc_norm_stderr\": 0.02447699407624734\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n\ \ \"acc_stderr\": 0.014310999547961459,\n \"acc_norm\": 0.24134078212290502,\n\ \ \"acc_norm_stderr\": 0.014310999547961459\n 
},\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.02555316999182652,\n\ \ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.02555316999182652\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2604501607717042,\n\ \ \"acc_stderr\": 0.024926723224845543,\n \"acc_norm\": 0.2604501607717042,\n\ \ \"acc_norm_stderr\": 0.024926723224845543\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.20679012345679013,\n \"acc_stderr\": 0.02253500670594282,\n\ \ \"acc_norm\": 0.20679012345679013,\n \"acc_norm_stderr\": 0.02253500670594282\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.2695035460992908,\n \"acc_stderr\": 0.026469036818590638,\n \ \ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.026469036818590638\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26727509778357234,\n\ \ \"acc_stderr\": 0.011302607515637518,\n \"acc_norm\": 0.26727509778357234,\n\ \ \"acc_norm_stderr\": 0.011302607515637518\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.34191176470588236,\n \"acc_stderr\": 0.028814722422254167,\n\ \ \"acc_norm\": 0.34191176470588236,\n \"acc_norm_stderr\": 0.028814722422254167\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.272875816993464,\n \"acc_stderr\": 0.01802047414839358,\n \ \ \"acc_norm\": 0.272875816993464,\n \"acc_norm_stderr\": 0.01802047414839358\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2727272727272727,\n\ \ \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.2727272727272727,\n\ \ \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.32653061224489793,\n \"acc_stderr\": 0.03002105623844032,\n\ \ \"acc_norm\": 0.32653061224489793,\n \"acc_norm_stderr\": 0.03002105623844032\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2537313432835821,\n\ 
\ \"acc_stderr\": 0.03076944496729602,\n \"acc_norm\": 0.2537313432835821,\n\ \ \"acc_norm_stderr\": 0.03076944496729602\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \ \ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n\ \ \"acc_stderr\": 0.036293353299478595,\n \"acc_norm\": 0.3192771084337349,\n\ \ \"acc_norm_stderr\": 0.036293353299478595\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.03508771929824564,\n\ \ \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.03508771929824564\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27050183598531213,\n\ \ \"mc1_stderr\": 0.015550778332842883,\n \"mc2\": 0.46097713753890357,\n\ \ \"mc2_stderr\": 0.015645791800114963\n }\n}\n```" repo_url: https://huggingface.co/vikp/phi2 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|arc:challenge|25_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hellaswag|10_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T03-36-11.040901.parquet' - 
'**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T03-36-11.040901.parquet' - 
'**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T03-36-11.040901.parquet' - 
'**/details_harness|hendrycksTest-philosophy|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T03-36-11.040901.parquet' - 
'**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T03-36-11.040901.parquet' - 
'**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T03-36-11.040901.parquet' - 
'**/details_harness|hendrycksTest-public_relations|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T03-36-11.040901.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T03-36-11.040901.parquet' - 
config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T03-36-11.040901.parquet' - config_name: 
harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - 
split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T03-36-11.040901.parquet' - 
config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T03-36-11.040901.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T03_36_11.040901 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T03-36-11.040901.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T03-36-11.040901.parquet' - config_name: results data_files: - split: 2023_10_04T03_36_11.040901 path: - results_2023-10-04T03-36-11.040901.parquet - split: latest path: - results_2023-10-04T03-36-11.040901.parquet --- # Dataset Card for Evaluation run of vikp/phi2 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/vikp/phi2 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [vikp/phi2](https://huggingface.co/vikp/phi2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_vikp__phi2", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-10-04T03:36:11.040901](https://huggingface.co/datasets/open-llm-leaderboard/details_vikp__phi2/blob/main/results_2023-10-04T03-36-11.040901.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.27385595477961044, "acc_stderr": 0.03237376999165497, "acc_norm": 0.2752121319194434, "acc_norm_stderr": 0.032396515823164156, "mc1": 0.27050183598531213, "mc1_stderr": 0.015550778332842883, "mc2": 0.46097713753890357, "mc2_stderr": 0.015645791800114963 }, "harness|arc:challenge|25": { "acc": 0.17235494880546076, "acc_stderr": 0.011037113093461295, "acc_norm": 0.22866894197952217, "acc_norm_stderr": 0.012272853582540787 }, "harness|hellaswag|10": { "acc": 0.2833100975901215, "acc_stderr": 0.004496847773250638, "acc_norm": 0.30701055566620195, "acc_norm_stderr": 0.0046031113432130665 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.046482319871173156, "acc_norm": 0.31, "acc_norm_stderr": 0.046482319871173156 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.2074074074074074, "acc_stderr": 0.03502553170678316, "acc_norm": 0.2074074074074074, "acc_norm_stderr": 0.03502553170678316 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.21052631578947367, "acc_stderr": 0.03317672787533157, "acc_norm": 0.21052631578947367, "acc_norm_stderr": 0.03317672787533157 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.27169811320754716, "acc_stderr": 0.027377706624670713, "acc_norm": 0.27169811320754716, "acc_norm_stderr": 0.027377706624670713 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.24305555555555555, "acc_stderr": 0.03586879280080342, "acc_norm": 0.24305555555555555, "acc_norm_stderr": 0.03586879280080342 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, 
"acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695236 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.24855491329479767, "acc_stderr": 0.03295304696818318, "acc_norm": 0.24855491329479767, "acc_norm_stderr": 0.03295304696818318 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.18627450980392157, "acc_stderr": 0.03873958714149351, "acc_norm": 0.18627450980392157, "acc_norm_stderr": 0.03873958714149351 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.26, "acc_stderr": 0.044084400227680794, "acc_norm": 0.26, "acc_norm_stderr": 0.044084400227680794 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.26382978723404255, "acc_stderr": 0.028809989854102953, "acc_norm": 0.26382978723404255, "acc_norm_stderr": 0.028809989854102953 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.22807017543859648, "acc_stderr": 0.03947152782669415, "acc_norm": 0.22807017543859648, "acc_norm_stderr": 0.03947152782669415 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2896551724137931, "acc_stderr": 0.037800192304380135, "acc_norm": 0.2896551724137931, "acc_norm_stderr": 0.037800192304380135 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2671957671957672, "acc_stderr": 0.02278967314577656, "acc_norm": 0.2671957671957672, "acc_norm_stderr": 0.02278967314577656 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.23809523809523808, "acc_stderr": 0.03809523809523811, "acc_norm": 0.23809523809523808, "acc_norm_stderr": 0.03809523809523811 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.27741935483870966, "acc_stderr": 0.025470196835900055, "acc_norm": 0.27741935483870966, "acc_norm_stderr": 0.025470196835900055 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.2857142857142857, "acc_stderr": 0.0317852971064275, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.0317852971064275 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.26666666666666666, "acc_stderr": 0.03453131801885415, "acc_norm": 0.26666666666666666, "acc_norm_stderr": 0.03453131801885415 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.3484848484848485, "acc_stderr": 0.033948539651564025, "acc_norm": 0.3484848484848485, "acc_norm_stderr": 0.033948539651564025 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.33678756476683935, "acc_stderr": 0.034107802518361825, "acc_norm": 0.33678756476683935, "acc_norm_stderr": 0.034107802518361825 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.36923076923076925, "acc_stderr": 0.024468615241478916, "acc_norm": 0.36923076923076925, "acc_norm_stderr": 0.024468615241478916 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2740740740740741, "acc_stderr": 0.027195934804085626, "acc_norm": 0.2740740740740741, "acc_norm_stderr": 0.027195934804085626 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.2605042016806723, "acc_stderr": 0.02851025151234193, "acc_norm": 0.2605042016806723, "acc_norm_stderr": 0.02851025151234193 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.271523178807947, "acc_stderr": 0.03631329803969653, "acc_norm": 0.271523178807947, "acc_norm_stderr": 0.03631329803969653 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.344954128440367, "acc_stderr": 0.02038060540506697, "acc_norm": 0.344954128440367, "acc_norm_stderr": 0.02038060540506697 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.3472222222222222, "acc_stderr": 
0.032468872436376486, "acc_norm": 0.3472222222222222, "acc_norm_stderr": 0.032468872436376486 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.31862745098039214, "acc_stderr": 0.0327028718148208, "acc_norm": 0.31862745098039214, "acc_norm_stderr": 0.0327028718148208 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.25316455696202533, "acc_stderr": 0.0283046579430353, "acc_norm": 0.25316455696202533, "acc_norm_stderr": 0.0283046579430353 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.13452914798206278, "acc_stderr": 0.022901183761575582, "acc_norm": 0.13452914798206278, "acc_norm_stderr": 0.022901183761575582 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.3282442748091603, "acc_stderr": 0.04118438565806299, "acc_norm": 0.3282442748091603, "acc_norm_stderr": 0.04118438565806299 }, "harness|hendrycksTest-international_law|5": { "acc": 0.3140495867768595, "acc_stderr": 0.04236964753041018, "acc_norm": 0.3140495867768595, "acc_norm_stderr": 0.04236964753041018 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.2222222222222222, "acc_stderr": 0.0401910747255735, "acc_norm": 0.2222222222222222, "acc_norm_stderr": 0.0401910747255735 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.3067484662576687, "acc_stderr": 0.036230899157241474, "acc_norm": 0.3067484662576687, "acc_norm_stderr": 0.036230899157241474 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.1875, "acc_stderr": 0.0370468111477387, "acc_norm": 0.1875, "acc_norm_stderr": 0.0370468111477387 }, "harness|hendrycksTest-management|5": { "acc": 0.2524271844660194, "acc_stderr": 0.04301250399690877, "acc_norm": 0.2524271844660194, "acc_norm_stderr": 0.04301250399690877 }, "harness|hendrycksTest-marketing|5": { "acc": 0.24786324786324787, "acc_stderr": 0.0282863240755644, "acc_norm": 0.24786324786324787, "acc_norm_stderr": 0.0282863240755644 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.23, "acc_stderr": 0.042295258468165044, "acc_norm": 
0.23, "acc_norm_stderr": 0.042295258468165044 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.23243933588761176, "acc_stderr": 0.015104550008905713, "acc_norm": 0.23243933588761176, "acc_norm_stderr": 0.015104550008905713 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.29190751445086704, "acc_stderr": 0.02447699407624734, "acc_norm": 0.29190751445086704, "acc_norm_stderr": 0.02447699407624734 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24134078212290502, "acc_stderr": 0.014310999547961459, "acc_norm": 0.24134078212290502, "acc_norm_stderr": 0.014310999547961459 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.27450980392156865, "acc_stderr": 0.02555316999182652, "acc_norm": 0.27450980392156865, "acc_norm_stderr": 0.02555316999182652 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.2604501607717042, "acc_stderr": 0.024926723224845543, "acc_norm": 0.2604501607717042, "acc_norm_stderr": 0.024926723224845543 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.20679012345679013, "acc_stderr": 0.02253500670594282, "acc_norm": 0.20679012345679013, "acc_norm_stderr": 0.02253500670594282 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.2695035460992908, "acc_stderr": 0.026469036818590638, "acc_norm": 0.2695035460992908, "acc_norm_stderr": 0.026469036818590638 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.26727509778357234, "acc_stderr": 0.011302607515637518, "acc_norm": 0.26727509778357234, "acc_norm_stderr": 0.011302607515637518 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.34191176470588236, "acc_stderr": 0.028814722422254167, "acc_norm": 0.34191176470588236, "acc_norm_stderr": 0.028814722422254167 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.272875816993464, "acc_stderr": 0.01802047414839358, "acc_norm": 0.272875816993464, "acc_norm_stderr": 0.01802047414839358 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.2727272727272727, "acc_stderr": 0.04265792110940589, 
"acc_norm": 0.2727272727272727, "acc_norm_stderr": 0.04265792110940589 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.32653061224489793, "acc_stderr": 0.03002105623844032, "acc_norm": 0.32653061224489793, "acc_norm_stderr": 0.03002105623844032 }, "harness|hendrycksTest-sociology|5": { "acc": 0.2537313432835821, "acc_stderr": 0.03076944496729602, "acc_norm": 0.2537313432835821, "acc_norm_stderr": 0.03076944496729602 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.26, "acc_stderr": 0.04408440022768078, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-virology|5": { "acc": 0.3192771084337349, "acc_stderr": 0.036293353299478595, "acc_norm": 0.3192771084337349, "acc_norm_stderr": 0.036293353299478595 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.2982456140350877, "acc_stderr": 0.03508771929824564, "acc_norm": 0.2982456140350877, "acc_norm_stderr": 0.03508771929824564 }, "harness|truthfulqa:mc|0": { "mc1": 0.27050183598531213, "mc1_stderr": 0.015550778332842883, "mc2": 0.46097713753890357, "mc2_stderr": 0.015645791800114963 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
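As a usage note on the `load_dataset` snippet in the Dataset Summary above: the per-task configurations and the aggregated `results` configuration are all loaded the same way, so the call can be wrapped in a small helper. This is a minimal sketch, not part of the generated card; the `load_details` name is ours, and it assumes the `datasets` library is installed.

```python
# Minimal sketch (hypothetical helper). The repo and config names are taken
# from the card above; the "latest" split always points to the most recent run
# (here 2023-10-04T03:36:11.040901).
REPO_ID = "open-llm-leaderboard/details_vikp__phi2"

def load_details(config_name: str, split: str = "latest"):
    """Load one evaluated-task configuration, or the aggregated 'results' config."""
    from datasets import load_dataset  # lazy import; requires the `datasets` package
    return load_dataset(REPO_ID, config_name, split=split)

# For example (network access required):
#   per_task = load_details("harness_truthfulqa_mc_0")
#   aggregated = load_details("results")
```

The same pattern applies to any of the 61 configurations listed in the YAML header of this card.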
atom-in-the-universe/bild-87e175b7-bf34-46f6-b5c3-cdbfd7d83f37
2023-10-04T03:54:03.000Z
[ "region:us" ]
atom-in-the-universe
null
null
null
0
0
Entry not found
open-llm-leaderboard/details_cmarkea__bloomz-3b-sft-chat
2023-10-04T03:50:13.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of cmarkea/bloomz-3b-sft-chat dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [cmarkea/bloomz-3b-sft-chat](https://huggingface.co/cmarkea/bloomz-3b-sft-chat)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cmarkea__bloomz-3b-sft-chat\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T03:48:54.225449](https://huggingface.co/datasets/open-llm-leaderboard/details_cmarkea__bloomz-3b-sft-chat/blob/main/results_2023-10-04T03-48-54.225449.json)\ \ (note that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.31719998794487836,\n\ \ \"acc_stderr\": 0.03354407116071807,\n \"acc_norm\": 0.31972509430472684,\n\ \ \"acc_norm_stderr\": 0.033547872618959834,\n \"mc1\": 0.24357405140758873,\n\ \ \"mc1_stderr\": 0.01502635482491078,\n \"mc2\": 0.39693436565628964,\n\ \ \"mc2_stderr\": 0.014363061393872474\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.34897610921501704,\n \"acc_stderr\": 0.013928933461382506,\n\ \ \"acc_norm\": 0.36860068259385664,\n \"acc_norm_stderr\": 0.014097810678042187\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4140609440350528,\n\ \ \"acc_stderr\": 0.004915524600627961,\n \"acc_norm\": 0.5434176458872735,\n\ \ \"acc_norm_stderr\": 0.004970933420231931\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \ \ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.362962962962963,\n\ \ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.362962962962963,\n\ \ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.03459777606810537,\n\ \ \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.03459777606810537\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.36,\n\ \ \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \ \ \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.3320754716981132,\n \"acc_stderr\": 0.028985455652334395,\n\ \ \"acc_norm\": 0.3320754716981132,\n \"acc_norm_stderr\": 0.028985455652334395\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3125,\n\ \ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.3125,\n\ \ \"acc_norm_stderr\": 0.038760854559127644\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\ \ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n\ \ \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n\ \ \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.042801058373643966,\n\ \ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.042801058373643966\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n\ \ \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.03177821250236922,\n\ \ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.03177821250236922\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\ \ \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n\ \ \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.296551724137931,\n \"acc_stderr\": 0.03806142687309994,\n\ \ \"acc_norm\": 0.296551724137931,\n \"acc_norm_stderr\": 0.03806142687309994\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708614,\n \"\ acc_norm\": 0.26455026455026454,\n 
\"acc_norm_stderr\": 0.022717467897708614\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n\ \ \"acc_stderr\": 0.03567016675276864,\n \"acc_norm\": 0.1984126984126984,\n\ \ \"acc_norm_stderr\": 0.03567016675276864\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \ \ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.32903225806451614,\n\ \ \"acc_stderr\": 0.026729499068349975,\n \"acc_norm\": 0.32903225806451614,\n\ \ \"acc_norm_stderr\": 0.026729499068349975\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.270935960591133,\n \"acc_stderr\": 0.031270907132976984,\n\ \ \"acc_norm\": 0.270935960591133,\n \"acc_norm_stderr\": 0.031270907132976984\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\ : 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n\ \ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.36363636363636365,\n \"acc_stderr\": 0.03427308652999935,\n \"\ acc_norm\": 0.36363636363636365,\n \"acc_norm_stderr\": 0.03427308652999935\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.29015544041450775,\n \"acc_stderr\": 0.03275264467791516,\n\ \ \"acc_norm\": 0.29015544041450775,\n \"acc_norm_stderr\": 0.03275264467791516\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.30256410256410254,\n \"acc_stderr\": 0.023290888053772732,\n\ \ \"acc_norm\": 0.30256410256410254,\n \"acc_norm_stderr\": 0.023290888053772732\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.25555555555555554,\n \"acc_stderr\": 0.026593939101844065,\n \ \ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.026593939101844065\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.02865749128507198,\n \ \ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.02865749128507198\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.2052980132450331,\n \"acc_stderr\": 0.03297986648473835,\n \"\ acc_norm\": 0.2052980132450331,\n \"acc_norm_stderr\": 0.03297986648473835\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.41467889908256883,\n \"acc_stderr\": 0.021122903208602595,\n \"\ acc_norm\": 0.41467889908256883,\n \"acc_norm_stderr\": 0.021122903208602595\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.2962962962962963,\n \"acc_stderr\": 0.03114144782353603,\n \"\ acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.03114144782353603\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.24509803921568626,\n \"acc_stderr\": 0.030190282453501964,\n \"\ acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.030190282453501964\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.45569620253164556,\n \"acc_stderr\": 0.03241920684693335,\n \ \ \"acc_norm\": 0.45569620253164556,\n \"acc_norm_stderr\": 0.03241920684693335\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.47085201793721976,\n\ \ \"acc_stderr\": 0.03350073248773404,\n \"acc_norm\": 0.47085201793721976,\n\ \ \"acc_norm_stderr\": 0.03350073248773404\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.3282442748091603,\n \"acc_stderr\": 0.04118438565806298,\n\ \ \"acc_norm\": 0.3282442748091603,\n \"acc_norm_stderr\": 0.04118438565806298\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.3305785123966942,\n \"acc_stderr\": 0.04294340845212095,\n \"\ acc_norm\": 0.3305785123966942,\n \"acc_norm_stderr\": 0.04294340845212095\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4166666666666667,\n\ \ \"acc_stderr\": 0.04766075165356462,\n \"acc_norm\": 0.4166666666666667,\n\ \ \"acc_norm_stderr\": 0.04766075165356462\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.2883435582822086,\n \"acc_stderr\": 0.035590395316173425,\n\ \ \"acc_norm\": 0.2883435582822086,\n \"acc_norm_stderr\": 0.035590395316173425\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\ \ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\ \ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258974,\n\ \ \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258974\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.49145299145299143,\n\ \ \"acc_stderr\": 0.032751303000970296,\n \"acc_norm\": 0.49145299145299143,\n\ \ \"acc_norm_stderr\": 0.032751303000970296\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \ \ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.40229885057471265,\n\ \ \"acc_stderr\": 0.017535294529068955,\n \"acc_norm\": 0.40229885057471265,\n\ \ \"acc_norm_stderr\": 0.017535294529068955\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.30346820809248554,\n \"acc_stderr\": 0.024752411960917202,\n\ \ \"acc_norm\": 0.30346820809248554,\n \"acc_norm_stderr\": 0.024752411960917202\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2446927374301676,\n\ \ \"acc_stderr\": 0.014378169884098414,\n \"acc_norm\": 
0.2446927374301676,\n\ \ \"acc_norm_stderr\": 0.014378169884098414\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.02609016250427905,\n\ \ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.02609016250427905\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3762057877813505,\n\ \ \"acc_stderr\": 0.027513925683549434,\n \"acc_norm\": 0.3762057877813505,\n\ \ \"acc_norm_stderr\": 0.027513925683549434\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.026229649178821153,\n\ \ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.026229649178821153\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.2624113475177305,\n \"acc_stderr\": 0.02624492034984301,\n \ \ \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.02624492034984301\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27509778357235987,\n\ \ \"acc_stderr\": 0.011405443620996937,\n \"acc_norm\": 0.27509778357235987,\n\ \ \"acc_norm_stderr\": 0.011405443620996937\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.3161764705882353,\n \"acc_stderr\": 0.028245687391462916,\n\ \ \"acc_norm\": 0.3161764705882353,\n \"acc_norm_stderr\": 0.028245687391462916\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.35130718954248363,\n \"acc_stderr\": 0.01931267606578654,\n \ \ \"acc_norm\": 0.35130718954248363,\n \"acc_norm_stderr\": 0.01931267606578654\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.44545454545454544,\n\ \ \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.44545454545454544,\n\ \ \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.22448979591836735,\n \"acc_stderr\": 0.02671143055553842,\n\ \ \"acc_norm\": 0.22448979591836735,\n \"acc_norm_stderr\": 0.02671143055553842\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.3383084577114428,\n\ \ \"acc_stderr\": 0.03345563070339193,\n \"acc_norm\": 0.3383084577114428,\n\ \ \"acc_norm_stderr\": 0.03345563070339193\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \ \ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3433734939759036,\n\ \ \"acc_stderr\": 0.03696584317010601,\n \"acc_norm\": 0.3433734939759036,\n\ \ \"acc_norm_stderr\": 0.03696584317010601\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.27485380116959063,\n \"acc_stderr\": 0.034240429246915824,\n\ \ \"acc_norm\": 0.27485380116959063,\n \"acc_norm_stderr\": 0.034240429246915824\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24357405140758873,\n\ \ \"mc1_stderr\": 0.01502635482491078,\n \"mc2\": 0.39693436565628964,\n\ \ \"mc2_stderr\": 0.014363061393872474\n }\n}\n```" repo_url: https://huggingface.co/cmarkea/bloomz-3b-sft-chat leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|arc:challenge|25_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hellaswag|10_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T03-48-54.225449.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T03-48-54.225449.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T03-48-54.225449.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T03-48-54.225449.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T03-48-54.225449.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T03-48-54.225449.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T03-48-54.225449.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T03-48-54.225449.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T03-48-54.225449.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T03_48_54.225449 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T03-48-54.225449.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T03-48-54.225449.parquet' - config_name: results data_files: - split: 2023_10_04T03_48_54.225449 path: - results_2023-10-04T03-48-54.225449.parquet - split: latest path: - results_2023-10-04T03-48-54.225449.parquet --- # Dataset Card for Evaluation run of cmarkea/bloomz-3b-sft-chat ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/cmarkea/bloomz-3b-sft-chat - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[cmarkea/bloomz-3b-sft-chat](https://huggingface.co/cmarkea/bloomz-3b-sft-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_cmarkea__bloomz-3b-sft-chat", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-10-04T03:48:54.225449](https://huggingface.co/datasets/open-llm-leaderboard/details_cmarkea__bloomz-3b-sft-chat/blob/main/results_2023-10-04T03-48-54.225449.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.31719998794487836, "acc_stderr": 0.03354407116071807, "acc_norm": 0.31972509430472684, "acc_norm_stderr": 0.033547872618959834, "mc1": 0.24357405140758873, "mc1_stderr": 0.01502635482491078, "mc2": 0.39693436565628964, "mc2_stderr": 0.014363061393872474 }, "harness|arc:challenge|25": { "acc": 0.34897610921501704, "acc_stderr": 0.013928933461382506, "acc_norm": 0.36860068259385664, "acc_norm_stderr": 0.014097810678042187 }, "harness|hellaswag|10": { "acc": 0.4140609440350528, "acc_stderr": 0.004915524600627961, "acc_norm": 0.5434176458872735, "acc_norm_stderr": 0.004970933420231931 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.23, "acc_stderr": 0.04229525846816505, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816505 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.362962962962963, "acc_stderr": 0.04153948404742398, "acc_norm": 0.362962962962963, "acc_norm_stderr": 0.04153948404742398 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.23684210526315788, "acc_stderr": 0.03459777606810537, "acc_norm": 0.23684210526315788, "acc_norm_stderr": 0.03459777606810537 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.3320754716981132, "acc_stderr": 0.028985455652334395, "acc_norm": 0.3320754716981132, "acc_norm_stderr": 0.028985455652334395 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.3125, "acc_stderr": 0.038760854559127644, "acc_norm": 0.3125, "acc_norm_stderr": 0.038760854559127644 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.29, "acc_stderr": 0.04560480215720683, "acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720683 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 
0.046056618647183814 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.24277456647398843, "acc_stderr": 0.0326926380614177, "acc_norm": 0.24277456647398843, "acc_norm_stderr": 0.0326926380614177 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.24509803921568626, "acc_stderr": 0.042801058373643966, "acc_norm": 0.24509803921568626, "acc_norm_stderr": 0.042801058373643966 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.3829787234042553, "acc_stderr": 0.03177821250236922, "acc_norm": 0.3829787234042553, "acc_norm_stderr": 0.03177821250236922 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.22807017543859648, "acc_stderr": 0.03947152782669415, "acc_norm": 0.22807017543859648, "acc_norm_stderr": 0.03947152782669415 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.296551724137931, "acc_stderr": 0.03806142687309994, "acc_norm": 0.296551724137931, "acc_norm_stderr": 0.03806142687309994 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.26455026455026454, "acc_stderr": 0.022717467897708614, "acc_norm": 0.26455026455026454, "acc_norm_stderr": 0.022717467897708614 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.1984126984126984, "acc_stderr": 0.03567016675276864, "acc_norm": 0.1984126984126984, "acc_norm_stderr": 0.03567016675276864 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.23, "acc_stderr": 0.04229525846816505, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816505 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.32903225806451614, "acc_stderr": 0.026729499068349975, "acc_norm": 0.32903225806451614, "acc_norm_stderr": 0.026729499068349975 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.270935960591133, "acc_stderr": 0.031270907132976984, "acc_norm": 0.270935960591133, "acc_norm_stderr": 0.031270907132976984 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.24242424242424243, "acc_stderr": 0.03346409881055953, "acc_norm": 0.24242424242424243, "acc_norm_stderr": 0.03346409881055953 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.36363636363636365, "acc_stderr": 0.03427308652999935, "acc_norm": 0.36363636363636365, "acc_norm_stderr": 0.03427308652999935 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.29015544041450775, "acc_stderr": 0.03275264467791516, "acc_norm": 0.29015544041450775, "acc_norm_stderr": 0.03275264467791516 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.30256410256410254, "acc_stderr": 0.023290888053772732, "acc_norm": 0.30256410256410254, "acc_norm_stderr": 0.023290888053772732 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.25555555555555554, "acc_stderr": 0.026593939101844065, "acc_norm": 0.25555555555555554, "acc_norm_stderr": 0.026593939101844065 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.2647058823529412, "acc_stderr": 0.02865749128507198, "acc_norm": 0.2647058823529412, "acc_norm_stderr": 0.02865749128507198 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2052980132450331, "acc_stderr": 0.03297986648473835, "acc_norm": 0.2052980132450331, "acc_norm_stderr": 0.03297986648473835 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.41467889908256883, "acc_stderr": 0.021122903208602595, "acc_norm": 0.41467889908256883, "acc_norm_stderr": 0.021122903208602595 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.2962962962962963, "acc_stderr": 
0.03114144782353603, "acc_norm": 0.2962962962962963, "acc_norm_stderr": 0.03114144782353603 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.24509803921568626, "acc_stderr": 0.030190282453501964, "acc_norm": 0.24509803921568626, "acc_norm_stderr": 0.030190282453501964 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.45569620253164556, "acc_stderr": 0.03241920684693335, "acc_norm": 0.45569620253164556, "acc_norm_stderr": 0.03241920684693335 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.47085201793721976, "acc_stderr": 0.03350073248773404, "acc_norm": 0.47085201793721976, "acc_norm_stderr": 0.03350073248773404 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.3282442748091603, "acc_stderr": 0.04118438565806298, "acc_norm": 0.3282442748091603, "acc_norm_stderr": 0.04118438565806298 }, "harness|hendrycksTest-international_law|5": { "acc": 0.3305785123966942, "acc_stderr": 0.04294340845212095, "acc_norm": 0.3305785123966942, "acc_norm_stderr": 0.04294340845212095 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.4166666666666667, "acc_stderr": 0.04766075165356462, "acc_norm": 0.4166666666666667, "acc_norm_stderr": 0.04766075165356462 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.2883435582822086, "acc_stderr": 0.035590395316173425, "acc_norm": 0.2883435582822086, "acc_norm_stderr": 0.035590395316173425 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3125, "acc_stderr": 0.043994650575715215, "acc_norm": 0.3125, "acc_norm_stderr": 0.043994650575715215 }, "harness|hendrycksTest-management|5": { "acc": 0.3786407766990291, "acc_stderr": 0.04802694698258974, "acc_norm": 0.3786407766990291, "acc_norm_stderr": 0.04802694698258974 }, "harness|hendrycksTest-marketing|5": { "acc": 0.49145299145299143, "acc_stderr": 0.032751303000970296, "acc_norm": 0.49145299145299143, "acc_norm_stderr": 0.032751303000970296 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.33, "acc_stderr": 0.04725815626252604, 
"acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252604 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.40229885057471265, "acc_stderr": 0.017535294529068955, "acc_norm": 0.40229885057471265, "acc_norm_stderr": 0.017535294529068955 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.30346820809248554, "acc_stderr": 0.024752411960917202, "acc_norm": 0.30346820809248554, "acc_norm_stderr": 0.024752411960917202 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2446927374301676, "acc_stderr": 0.014378169884098414, "acc_norm": 0.2446927374301676, "acc_norm_stderr": 0.014378169884098414 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.29411764705882354, "acc_stderr": 0.02609016250427905, "acc_norm": 0.29411764705882354, "acc_norm_stderr": 0.02609016250427905 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.3762057877813505, "acc_stderr": 0.027513925683549434, "acc_norm": 0.3762057877813505, "acc_norm_stderr": 0.027513925683549434 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.3333333333333333, "acc_stderr": 0.026229649178821153, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.026229649178821153 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.2624113475177305, "acc_stderr": 0.02624492034984301, "acc_norm": 0.2624113475177305, "acc_norm_stderr": 0.02624492034984301 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.27509778357235987, "acc_stderr": 0.011405443620996937, "acc_norm": 0.27509778357235987, "acc_norm_stderr": 0.011405443620996937 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.3161764705882353, "acc_stderr": 0.028245687391462916, "acc_norm": 0.3161764705882353, "acc_norm_stderr": 0.028245687391462916 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.35130718954248363, "acc_stderr": 0.01931267606578654, "acc_norm": 0.35130718954248363, "acc_norm_stderr": 0.01931267606578654 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.44545454545454544, "acc_stderr": 
0.047605488214603246, "acc_norm": 0.44545454545454544, "acc_norm_stderr": 0.047605488214603246 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.22448979591836735, "acc_stderr": 0.02671143055553842, "acc_norm": 0.22448979591836735, "acc_norm_stderr": 0.02671143055553842 }, "harness|hendrycksTest-sociology|5": { "acc": 0.3383084577114428, "acc_stderr": 0.03345563070339193, "acc_norm": 0.3383084577114428, "acc_norm_stderr": 0.03345563070339193 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.39, "acc_stderr": 0.04902071300001974, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001974 }, "harness|hendrycksTest-virology|5": { "acc": 0.3433734939759036, "acc_stderr": 0.03696584317010601, "acc_norm": 0.3433734939759036, "acc_norm_stderr": 0.03696584317010601 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.27485380116959063, "acc_stderr": 0.034240429246915824, "acc_norm": 0.27485380116959063, "acc_norm_stderr": 0.034240429246915824 }, "harness|truthfulqa:mc|0": { "mc1": 0.24357405140758873, "mc1_stderr": 0.01502635482491078, "mc2": 0.39693436565628964, "mc2_stderr": 0.014363061393872474 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
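Judging from the config listings above, the run timestamp that names each split appears to be the parquet-file timestamp with `-` replaced by `_` (e.g. `2023-10-04T03-48-54.225449` becomes split `2023_10_04T03_48_54.225449`). A minimal sketch of that mapping, under the assumption that this pattern holds for every run (the helper name is ours, not part of the `datasets` API):

```python
def split_name_from_run_timestamp(ts: str) -> str:
    """Map a run timestamp as it appears in result file names
    (e.g. '2023-10-04T03-48-54.225449') to the corresponding
    split name (e.g. '2023_10_04T03_48_54.225449').

    Assumption: split names differ from file timestamps only by
    '-' being replaced with '_' (inferred from the configs above).
    """
    return ts.replace("-", "_")

print(split_name_from_run_timestamp("2023-10-04T03-48-54.225449"))
# → 2023_10_04T03_48_54.225449
```

Passing that split name instead of `"latest"` to `load_dataset` pins the details to one specific run.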
atom-in-the-universe/bild-d5dd3ce1-b64b-43eb-bf5b-107adb961432
2023-10-04T04:07:40.000Z
[ "region:us" ]
atom-in-the-universe
null
null
null
0
0
Entry not found
atom-in-the-universe/bild-a72e242e-ef06-459b-bde4-c1a53d919b70
2023-10-04T04:21:34.000Z
[ "region:us" ]
atom-in-the-universe
null
null
null
0
0
Entry not found
open-llm-leaderboard/details_cmarkea__bloomz-7b1-mt-sft-chat
2023-10-04T04:12:38.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of cmarkea/bloomz-7b1-mt-sft-chat dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [cmarkea/bloomz-7b1-mt-sft-chat](https://huggingface.co/cmarkea/bloomz-7b1-mt-sft-chat)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cmarkea__bloomz-7b1-mt-sft-chat\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T04:11:17.617298](https://huggingface.co/datasets/open-llm-leaderboard/details_cmarkea__bloomz-7b1-mt-sft-chat/blob/main/results_2023-10-04T04-11-17.617298.json)\ \ (note that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.38807772639842936,\n\ \ \"acc_stderr\": 0.03518367443781253,\n \"acc_norm\": 0.3913277945134569,\n\ \ \"acc_norm_stderr\": 0.035183603797313924,\n \"mc1\": 0.28518971848225216,\n\ \ \"mc1_stderr\": 0.015805827874454892,\n \"mc2\": 0.44338292612615726,\n\ \ \"mc2_stderr\": 0.014645929380833861\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.40784982935153585,\n \"acc_stderr\": 0.014361097288449705,\n\ \ \"acc_norm\": 0.4402730375426621,\n \"acc_norm_stderr\": 0.014506769524804244\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4666401115315674,\n\ \ \"acc_stderr\": 0.004978662946687268,\n \"acc_norm\": 0.6259709221270663,\n\ \ \"acc_norm_stderr\": 0.004828822920915224\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4,\n \ \ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.4,\n \"\ acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.34210526315789475,\n \"acc_stderr\": 0.03860731599316092,\n\ \ \"acc_norm\": 0.34210526315789475,\n \"acc_norm_stderr\": 0.03860731599316092\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.43,\n\ \ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \ \ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.4339622641509434,\n \"acc_stderr\": 0.0305032920133426,\n\ \ \"acc_norm\": 0.4339622641509434,\n \"acc_norm_stderr\": 0.0305032920133426\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.375,\n\ \ \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.375,\n \ \ \"acc_norm_stderr\": 0.04048439222695598\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : 
{\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \ \ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n\ \ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3179190751445087,\n\ \ \"acc_stderr\": 0.03550683989165581,\n \"acc_norm\": 0.3179190751445087,\n\ \ \"acc_norm_stderr\": 0.03550683989165581\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383889,\n\ \ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383889\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\ \ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.32340425531914896,\n \"acc_stderr\": 0.030579442773610337,\n\ \ \"acc_norm\": 0.32340425531914896,\n \"acc_norm_stderr\": 0.030579442773610337\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\ \ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\ \ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.38620689655172413,\n \"acc_stderr\": 0.04057324734419036,\n\ \ \"acc_norm\": 0.38620689655172413,\n \"acc_norm_stderr\": 0.04057324734419036\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.2724867724867725,\n \"acc_stderr\": 0.022930973071633345,\n \"\ acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.022930973071633345\n\ \ },\n 
\"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n\ \ \"acc_stderr\": 0.03970158273235173,\n \"acc_norm\": 0.2698412698412698,\n\ \ \"acc_norm_stderr\": 0.03970158273235173\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \ \ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.36451612903225805,\n\ \ \"acc_stderr\": 0.02737987122994324,\n \"acc_norm\": 0.36451612903225805,\n\ \ \"acc_norm_stderr\": 0.02737987122994324\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.32019704433497537,\n \"acc_stderr\": 0.032826493853041504,\n\ \ \"acc_norm\": 0.32019704433497537,\n \"acc_norm_stderr\": 0.032826493853041504\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\ : 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885415,\n\ \ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885415\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.4595959595959596,\n \"acc_stderr\": 0.035507024651313425,\n \"\ acc_norm\": 0.4595959595959596,\n \"acc_norm_stderr\": 0.035507024651313425\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.37305699481865284,\n \"acc_stderr\": 0.03490205592048574,\n\ \ \"acc_norm\": 0.37305699481865284,\n \"acc_norm_stderr\": 0.03490205592048574\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.3717948717948718,\n \"acc_stderr\": 0.02450347255711094,\n \ \ \"acc_norm\": 0.3717948717948718,\n \"acc_norm_stderr\": 0.02450347255711094\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 
0.25555555555555554,\n \"acc_stderr\": 0.02659393910184405,\n \ \ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.02659393910184405\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.3907563025210084,\n \"acc_stderr\": 0.03169380235712997,\n \ \ \"acc_norm\": 0.3907563025210084,\n \"acc_norm_stderr\": 0.03169380235712997\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\ acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.5211009174311927,\n \"acc_stderr\": 0.02141822475426465,\n \"\ acc_norm\": 0.5211009174311927,\n \"acc_norm_stderr\": 0.02141822475426465\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.3055555555555556,\n \"acc_stderr\": 0.03141554629402544,\n \"\ acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.03141554629402544\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.3627450980392157,\n \"acc_stderr\": 0.03374499356319355,\n \"\ acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.03374499356319355\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.5822784810126582,\n \"acc_stderr\": 0.032103530322412685,\n \ \ \"acc_norm\": 0.5822784810126582,\n \"acc_norm_stderr\": 0.032103530322412685\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.47533632286995514,\n\ \ \"acc_stderr\": 0.03351695167652628,\n \"acc_norm\": 0.47533632286995514,\n\ \ \"acc_norm_stderr\": 0.03351695167652628\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.40458015267175573,\n \"acc_stderr\": 0.043046937953806645,\n\ \ \"acc_norm\": 0.40458015267175573,\n \"acc_norm_stderr\": 0.043046937953806645\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.45454545454545453,\n \"acc_stderr\": 
0.045454545454545456,\n \"\ acc_norm\": 0.45454545454545453,\n \"acc_norm_stderr\": 0.045454545454545456\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4537037037037037,\n\ \ \"acc_stderr\": 0.048129173245368216,\n \"acc_norm\": 0.4537037037037037,\n\ \ \"acc_norm_stderr\": 0.048129173245368216\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.4171779141104294,\n \"acc_stderr\": 0.038741028598180814,\n\ \ \"acc_norm\": 0.4171779141104294,\n \"acc_norm_stderr\": 0.038741028598180814\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\ \ \"acc_stderr\": 0.045723723587374296,\n \"acc_norm\": 0.36607142857142855,\n\ \ \"acc_norm_stderr\": 0.045723723587374296\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.42718446601941745,\n \"acc_stderr\": 0.04897957737781168,\n\ \ \"acc_norm\": 0.42718446601941745,\n \"acc_norm_stderr\": 0.04897957737781168\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.594017094017094,\n\ \ \"acc_stderr\": 0.03217180182641087,\n \"acc_norm\": 0.594017094017094,\n\ \ \"acc_norm_stderr\": 0.03217180182641087\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \ \ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.48020434227330777,\n\ \ \"acc_stderr\": 0.01786594482729162,\n \"acc_norm\": 0.48020434227330777,\n\ \ \"acc_norm_stderr\": 0.01786594482729162\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.41040462427745666,\n \"acc_stderr\": 0.02648339204209818,\n\ \ \"acc_norm\": 0.41040462427745666,\n \"acc_norm_stderr\": 0.02648339204209818\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\ \ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\ \ \"acc_norm_stderr\": 0.014242630070574915\n },\n 
\"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.027826109307283693,\n\ \ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.027826109307283693\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3987138263665595,\n\ \ \"acc_stderr\": 0.0278093225857745,\n \"acc_norm\": 0.3987138263665595,\n\ \ \"acc_norm_stderr\": 0.0278093225857745\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.39814814814814814,\n \"acc_stderr\": 0.02723741509459247,\n\ \ \"acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.02723741509459247\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.28368794326241137,\n \"acc_stderr\": 0.026891709428343957,\n \ \ \"acc_norm\": 0.28368794326241137,\n \"acc_norm_stderr\": 0.026891709428343957\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.28878748370273793,\n\ \ \"acc_stderr\": 0.011574914757219962,\n \"acc_norm\": 0.28878748370273793,\n\ \ \"acc_norm_stderr\": 0.011574914757219962\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.43014705882352944,\n \"acc_stderr\": 0.030074971917302875,\n\ \ \"acc_norm\": 0.43014705882352944,\n \"acc_norm_stderr\": 0.030074971917302875\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.4166666666666667,\n \"acc_stderr\": 0.01994491413687359,\n \ \ \"acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.01994491413687359\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5454545454545454,\n\ \ \"acc_stderr\": 0.04769300568972745,\n \"acc_norm\": 0.5454545454545454,\n\ \ \"acc_norm_stderr\": 0.04769300568972745\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.3346938775510204,\n \"acc_stderr\": 0.030209235226242307,\n\ \ \"acc_norm\": 0.3346938775510204,\n \"acc_norm_stderr\": 0.030209235226242307\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.40298507462686567,\n\ \ 
\"acc_stderr\": 0.03468343295111126,\n \"acc_norm\": 0.40298507462686567,\n\ \ \"acc_norm_stderr\": 0.03468343295111126\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \ \ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39156626506024095,\n\ \ \"acc_stderr\": 0.037998574544796354,\n \"acc_norm\": 0.39156626506024095,\n\ \ \"acc_norm_stderr\": 0.037998574544796354\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.38011695906432746,\n \"acc_stderr\": 0.037229657413855394,\n\ \ \"acc_norm\": 0.38011695906432746,\n \"acc_norm_stderr\": 0.037229657413855394\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28518971848225216,\n\ \ \"mc1_stderr\": 0.015805827874454892,\n \"mc2\": 0.44338292612615726,\n\ \ \"mc2_stderr\": 0.014645929380833861\n }\n}\n```" repo_url: https://huggingface.co/cmarkea/bloomz-7b1-mt-sft-chat leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|arc:challenge|25_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hellaswag|10_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T04-11-17.617298.parquet' - 
'**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T04-11-17.617298.parquet' - 
'**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T04-11-17.617298.parquet' - 
'**/details_harness|hendrycksTest-philosophy|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T04-11-17.617298.parquet' - 
'**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T04-11-17.617298.parquet' - 
'**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T04-11-17.617298.parquet' - 
'**/details_harness|hendrycksTest-public_relations|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T04-11-17.617298.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T04-11-17.617298.parquet' - 
config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T04-11-17.617298.parquet' - config_name: 
harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - 
split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T04-11-17.617298.parquet' - 
config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T04-11-17.617298.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T04_11_17.617298 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T04-11-17.617298.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T04-11-17.617298.parquet' - config_name: results data_files: - split: 2023_10_04T04_11_17.617298 path: - results_2023-10-04T04-11-17.617298.parquet - split: latest path: - results_2023-10-04T04-11-17.617298.parquet --- # Dataset Card for Evaluation run of cmarkea/bloomz-7b1-mt-sft-chat ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/cmarkea/bloomz-7b1-mt-sft-chat - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [cmarkea/bloomz-7b1-mt-sft-chat](https://huggingface.co/cmarkea/bloomz-7b1-mt-sft-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). 
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cmarkea__bloomz-7b1-mt-sft-chat",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-04T04:11:17.617298](https://huggingface.co/datasets/open-llm-leaderboard/details_cmarkea__bloomz-7b1-mt-sft-chat/blob/main/results_2023-10-04T04-11-17.617298.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.38807772639842936, "acc_stderr": 0.03518367443781253, "acc_norm": 0.3913277945134569, "acc_norm_stderr": 0.035183603797313924, "mc1": 0.28518971848225216, "mc1_stderr": 0.015805827874454892, "mc2": 0.44338292612615726, "mc2_stderr": 0.014645929380833861 }, "harness|arc:challenge|25": { "acc": 0.40784982935153585, "acc_stderr": 0.014361097288449705, "acc_norm": 0.4402730375426621, "acc_norm_stderr": 0.014506769524804244 }, "harness|hellaswag|10": { "acc": 0.4666401115315674, "acc_stderr": 0.004978662946687268, "acc_norm": 0.6259709221270663, "acc_norm_stderr": 0.004828822920915224 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4, "acc_stderr": 0.04232073695151589, "acc_norm": 0.4, "acc_norm_stderr": 0.04232073695151589 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.34210526315789475, "acc_stderr": 0.03860731599316092, "acc_norm": 0.34210526315789475, "acc_norm_stderr": 0.03860731599316092 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.43, "acc_stderr": 0.049756985195624284, "acc_norm": 0.43, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.4339622641509434, "acc_stderr": 0.0305032920133426, "acc_norm": 0.4339622641509434, "acc_norm_stderr": 0.0305032920133426 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.375, "acc_stderr": 0.04048439222695598, "acc_norm": 0.375, "acc_norm_stderr": 0.04048439222695598 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.33, "acc_stderr": 0.04725815626252605, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252605 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.41, "acc_stderr": 0.04943110704237102, "acc_norm": 0.41, "acc_norm_stderr": 0.04943110704237102 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.3179190751445087, "acc_stderr": 0.03550683989165581, "acc_norm": 0.3179190751445087, "acc_norm_stderr": 0.03550683989165581 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3431372549019608, "acc_stderr": 0.04724007352383889, "acc_norm": 0.3431372549019608, "acc_norm_stderr": 0.04724007352383889 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.32340425531914896, "acc_stderr": 0.030579442773610337, "acc_norm": 0.32340425531914896, "acc_norm_stderr": 0.030579442773610337 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2719298245614035, "acc_stderr": 0.04185774424022056, "acc_norm": 0.2719298245614035, "acc_norm_stderr": 0.04185774424022056 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.38620689655172413, "acc_stderr": 0.04057324734419036, "acc_norm": 0.38620689655172413, "acc_norm_stderr": 0.04057324734419036 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2724867724867725, "acc_stderr": 0.022930973071633345, "acc_norm": 0.2724867724867725, "acc_norm_stderr": 0.022930973071633345 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2698412698412698, "acc_stderr": 0.03970158273235173, "acc_norm": 0.2698412698412698, "acc_norm_stderr": 0.03970158273235173 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.04725815626252604, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252604 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.36451612903225805, "acc_stderr": 0.02737987122994324, "acc_norm": 0.36451612903225805, "acc_norm_stderr": 0.02737987122994324 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 
0.32019704433497537, "acc_stderr": 0.032826493853041504, "acc_norm": 0.32019704433497537, "acc_norm_stderr": 0.032826493853041504 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.41, "acc_stderr": 0.04943110704237102, "acc_norm": 0.41, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.26666666666666666, "acc_stderr": 0.03453131801885415, "acc_norm": 0.26666666666666666, "acc_norm_stderr": 0.03453131801885415 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.4595959595959596, "acc_stderr": 0.035507024651313425, "acc_norm": 0.4595959595959596, "acc_norm_stderr": 0.035507024651313425 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.37305699481865284, "acc_stderr": 0.03490205592048574, "acc_norm": 0.37305699481865284, "acc_norm_stderr": 0.03490205592048574 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.3717948717948718, "acc_stderr": 0.02450347255711094, "acc_norm": 0.3717948717948718, "acc_norm_stderr": 0.02450347255711094 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.25555555555555554, "acc_stderr": 0.02659393910184405, "acc_norm": 0.25555555555555554, "acc_norm_stderr": 0.02659393910184405 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.3907563025210084, "acc_stderr": 0.03169380235712997, "acc_norm": 0.3907563025210084, "acc_norm_stderr": 0.03169380235712997 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 0.038615575462551684, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.038615575462551684 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.5211009174311927, "acc_stderr": 0.02141822475426465, "acc_norm": 0.5211009174311927, "acc_norm_stderr": 0.02141822475426465 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.3055555555555556, "acc_stderr": 0.03141554629402544, "acc_norm": 0.3055555555555556, 
"acc_norm_stderr": 0.03141554629402544 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.3627450980392157, "acc_stderr": 0.03374499356319355, "acc_norm": 0.3627450980392157, "acc_norm_stderr": 0.03374499356319355 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.5822784810126582, "acc_stderr": 0.032103530322412685, "acc_norm": 0.5822784810126582, "acc_norm_stderr": 0.032103530322412685 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.47533632286995514, "acc_stderr": 0.03351695167652628, "acc_norm": 0.47533632286995514, "acc_norm_stderr": 0.03351695167652628 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.40458015267175573, "acc_stderr": 0.043046937953806645, "acc_norm": 0.40458015267175573, "acc_norm_stderr": 0.043046937953806645 }, "harness|hendrycksTest-international_law|5": { "acc": 0.45454545454545453, "acc_stderr": 0.045454545454545456, "acc_norm": 0.45454545454545453, "acc_norm_stderr": 0.045454545454545456 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.4537037037037037, "acc_stderr": 0.048129173245368216, "acc_norm": 0.4537037037037037, "acc_norm_stderr": 0.048129173245368216 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.4171779141104294, "acc_stderr": 0.038741028598180814, "acc_norm": 0.4171779141104294, "acc_norm_stderr": 0.038741028598180814 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.36607142857142855, "acc_stderr": 0.045723723587374296, "acc_norm": 0.36607142857142855, "acc_norm_stderr": 0.045723723587374296 }, "harness|hendrycksTest-management|5": { "acc": 0.42718446601941745, "acc_stderr": 0.04897957737781168, "acc_norm": 0.42718446601941745, "acc_norm_stderr": 0.04897957737781168 }, "harness|hendrycksTest-marketing|5": { "acc": 0.594017094017094, "acc_stderr": 0.03217180182641087, "acc_norm": 0.594017094017094, "acc_norm_stderr": 0.03217180182641087 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, 
"acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.48020434227330777, "acc_stderr": 0.01786594482729162, "acc_norm": 0.48020434227330777, "acc_norm_stderr": 0.01786594482729162 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.41040462427745666, "acc_stderr": 0.02648339204209818, "acc_norm": 0.41040462427745666, "acc_norm_stderr": 0.02648339204209818 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.23798882681564246, "acc_stderr": 0.014242630070574915, "acc_norm": 0.23798882681564246, "acc_norm_stderr": 0.014242630070574915 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.38235294117647056, "acc_stderr": 0.027826109307283693, "acc_norm": 0.38235294117647056, "acc_norm_stderr": 0.027826109307283693 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.3987138263665595, "acc_stderr": 0.0278093225857745, "acc_norm": 0.3987138263665595, "acc_norm_stderr": 0.0278093225857745 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.39814814814814814, "acc_stderr": 0.02723741509459247, "acc_norm": 0.39814814814814814, "acc_norm_stderr": 0.02723741509459247 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.28368794326241137, "acc_stderr": 0.026891709428343957, "acc_norm": 0.28368794326241137, "acc_norm_stderr": 0.026891709428343957 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.28878748370273793, "acc_stderr": 0.011574914757219962, "acc_norm": 0.28878748370273793, "acc_norm_stderr": 0.011574914757219962 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.43014705882352944, "acc_stderr": 0.030074971917302875, "acc_norm": 0.43014705882352944, "acc_norm_stderr": 0.030074971917302875 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.4166666666666667, "acc_stderr": 0.01994491413687359, "acc_norm": 0.4166666666666667, "acc_norm_stderr": 0.01994491413687359 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5454545454545454, "acc_stderr": 0.04769300568972745, 
"acc_norm": 0.5454545454545454, "acc_norm_stderr": 0.04769300568972745 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.3346938775510204, "acc_stderr": 0.030209235226242307, "acc_norm": 0.3346938775510204, "acc_norm_stderr": 0.030209235226242307 }, "harness|hendrycksTest-sociology|5": { "acc": 0.40298507462686567, "acc_stderr": 0.03468343295111126, "acc_norm": 0.40298507462686567, "acc_norm_stderr": 0.03468343295111126 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-virology|5": { "acc": 0.39156626506024095, "acc_stderr": 0.037998574544796354, "acc_norm": 0.39156626506024095, "acc_norm_stderr": 0.037998574544796354 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.38011695906432746, "acc_stderr": 0.037229657413855394, "acc_norm": 0.38011695906432746, "acc_norm_stderr": 0.037229657413855394 }, "harness|truthfulqa:mc|0": { "mc1": 0.28518971848225216, "mc1_stderr": 0.015805827874454892, "mc2": 0.44338292612615726, "mc2_stderr": 0.014645929380833861 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
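As a quick sanity check on result tables like the one above, one can flag which subtasks score above the 25% random-chance baseline for four-option multiple choice. A minimal sketch; the small dict below copies three accuracy values from the results above rather than loading the full dataset:

```python
# Three 5-shot accuracy values copied from the results table above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": 0.3,
    "harness|hendrycksTest-moral_scenarios|5": 0.23798882681564246,
    "harness|hendrycksTest-marketing|5": 0.594017094017094,
}

# MMLU questions have four options, so 0.25 is the random-chance baseline.
above_chance = sorted(k for k, v in results.items() if v > 0.25)
print(above_chance)  # → abstract_algebra and marketing, but not moral_scenarios
```

The same comparison applied to the full table shows most subtasks clearing chance, with moral_scenarios among the few that do not.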
atom-in-the-universe/bild-9ca741ee-fee9-4747-af4a-bfcd56311007
2023-10-04T04:34:22.000Z
[ "region:us" ]
atom-in-the-universe
null
null
null
0
0
Entry not found
Yinxing/LLM_Dataset
2023-10-04T04:25:43.000Z
[ "license:mit", "region:us" ]
Yinxing
null
null
null
0
0
--- license: mit ---
atom-in-the-universe/bild-cbabc836-4289-48f3-9bdb-ec0017064cff
2023-10-04T04:49:22.000Z
[ "region:us" ]
atom-in-the-universe
null
null
null
0
0
Entry not found
BangumiBase/steinsgate
2023-10-04T06:45:41.000Z
[ "size_categories:1K<n<10K", "license:mit", "art", "region:us" ]
BangumiBase
null
null
null
1
0
--- license: mit tags: - art size_categories: - 1K<n<10K --- # Bangumi Image Base of Steins;gate This is the image base of bangumi Steins;Gate; we detected 22 characters and 4292 images in total. The full dataset is [here](all.zip). **Please note that these image bases are not guaranteed to be 100% cleaned; they may actually contain noise.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potentially noisy samples (approximately 1% probability). Here is the characters' preview: | # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 | |:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------| | 0 | 82 | [Download](0/dataset.zip) | ![preview 1](0/preview_1.png) | ![preview 2](0/preview_2.png) | ![preview 3](0/preview_3.png) | ![preview 4](0/preview_4.png) | ![preview 5](0/preview_5.png) | ![preview 6](0/preview_6.png) | ![preview 7](0/preview_7.png) | ![preview 8](0/preview_8.png) | | 1 | 64 | [Download](1/dataset.zip) | ![preview 1](1/preview_1.png) | ![preview 2](1/preview_2.png) | ![preview 3](1/preview_3.png) | ![preview 4](1/preview_4.png) | ![preview 5](1/preview_5.png) | ![preview 6](1/preview_6.png) | ![preview 7](1/preview_7.png) | ![preview 8](1/preview_8.png) | | 2 | 1242 | [Download](2/dataset.zip) | ![preview 1](2/preview_1.png) | ![preview 2](2/preview_2.png) | ![preview 3](2/preview_3.png) | ![preview 4](2/preview_4.png) | ![preview 5](2/preview_5.png) | ![preview 6](2/preview_6.png) | ![preview 7](2/preview_7.png) | ![preview 8](2/preview_8.png) | | 3 | 59 | [Download](3/dataset.zip) | ![preview 1](3/preview_1.png) | ![preview 2](3/preview_2.png) | 
![preview 3](3/preview_3.png) | ![preview 4](3/preview_4.png) | ![preview 5](3/preview_5.png) | ![preview 6](3/preview_6.png) | ![preview 7](3/preview_7.png) | ![preview 8](3/preview_8.png) | | 4 | 7 | [Download](4/dataset.zip) | ![preview 1](4/preview_1.png) | ![preview 2](4/preview_2.png) | ![preview 3](4/preview_3.png) | ![preview 4](4/preview_4.png) | ![preview 5](4/preview_5.png) | ![preview 6](4/preview_6.png) | ![preview 7](4/preview_7.png) | N/A | | 5 | 443 | [Download](5/dataset.zip) | ![preview 1](5/preview_1.png) | ![preview 2](5/preview_2.png) | ![preview 3](5/preview_3.png) | ![preview 4](5/preview_4.png) | ![preview 5](5/preview_5.png) | ![preview 6](5/preview_6.png) | ![preview 7](5/preview_7.png) | ![preview 8](5/preview_8.png) | | 6 | 410 | [Download](6/dataset.zip) | ![preview 1](6/preview_1.png) | ![preview 2](6/preview_2.png) | ![preview 3](6/preview_3.png) | ![preview 4](6/preview_4.png) | ![preview 5](6/preview_5.png) | ![preview 6](6/preview_6.png) | ![preview 7](6/preview_7.png) | ![preview 8](6/preview_8.png) | | 7 | 177 | [Download](7/dataset.zip) | ![preview 1](7/preview_1.png) | ![preview 2](7/preview_2.png) | ![preview 3](7/preview_3.png) | ![preview 4](7/preview_4.png) | ![preview 5](7/preview_5.png) | ![preview 6](7/preview_6.png) | ![preview 7](7/preview_7.png) | ![preview 8](7/preview_8.png) | | 8 | 346 | [Download](8/dataset.zip) | ![preview 1](8/preview_1.png) | ![preview 2](8/preview_2.png) | ![preview 3](8/preview_3.png) | ![preview 4](8/preview_4.png) | ![preview 5](8/preview_5.png) | ![preview 6](8/preview_6.png) | ![preview 7](8/preview_7.png) | ![preview 8](8/preview_8.png) | | 9 | 324 | [Download](9/dataset.zip) | ![preview 1](9/preview_1.png) | ![preview 2](9/preview_2.png) | ![preview 3](9/preview_3.png) | ![preview 4](9/preview_4.png) | ![preview 5](9/preview_5.png) | ![preview 6](9/preview_6.png) | ![preview 7](9/preview_7.png) | ![preview 8](9/preview_8.png) | | 10 | 19 | [Download](10/dataset.zip) | ![preview 
1](10/preview_1.png) | ![preview 2](10/preview_2.png) | ![preview 3](10/preview_3.png) | ![preview 4](10/preview_4.png) | ![preview 5](10/preview_5.png) | ![preview 6](10/preview_6.png) | ![preview 7](10/preview_7.png) | ![preview 8](10/preview_8.png) | | 11 | 95 | [Download](11/dataset.zip) | ![preview 1](11/preview_1.png) | ![preview 2](11/preview_2.png) | ![preview 3](11/preview_3.png) | ![preview 4](11/preview_4.png) | ![preview 5](11/preview_5.png) | ![preview 6](11/preview_6.png) | ![preview 7](11/preview_7.png) | ![preview 8](11/preview_8.png) | | 12 | 49 | [Download](12/dataset.zip) | ![preview 1](12/preview_1.png) | ![preview 2](12/preview_2.png) | ![preview 3](12/preview_3.png) | ![preview 4](12/preview_4.png) | ![preview 5](12/preview_5.png) | ![preview 6](12/preview_6.png) | ![preview 7](12/preview_7.png) | ![preview 8](12/preview_8.png) | | 13 | 9 | [Download](13/dataset.zip) | ![preview 1](13/preview_1.png) | ![preview 2](13/preview_2.png) | ![preview 3](13/preview_3.png) | ![preview 4](13/preview_4.png) | ![preview 5](13/preview_5.png) | ![preview 6](13/preview_6.png) | ![preview 7](13/preview_7.png) | ![preview 8](13/preview_8.png) | | 14 | 489 | [Download](14/dataset.zip) | ![preview 1](14/preview_1.png) | ![preview 2](14/preview_2.png) | ![preview 3](14/preview_3.png) | ![preview 4](14/preview_4.png) | ![preview 5](14/preview_5.png) | ![preview 6](14/preview_6.png) | ![preview 7](14/preview_7.png) | ![preview 8](14/preview_8.png) | | 15 | 105 | [Download](15/dataset.zip) | ![preview 1](15/preview_1.png) | ![preview 2](15/preview_2.png) | ![preview 3](15/preview_3.png) | ![preview 4](15/preview_4.png) | ![preview 5](15/preview_5.png) | ![preview 6](15/preview_6.png) | ![preview 7](15/preview_7.png) | ![preview 8](15/preview_8.png) | | 16 | 51 | [Download](16/dataset.zip) | ![preview 1](16/preview_1.png) | ![preview 2](16/preview_2.png) | ![preview 3](16/preview_3.png) | ![preview 4](16/preview_4.png) | ![preview 5](16/preview_5.png) | ![preview 
6](16/preview_6.png) | ![preview 7](16/preview_7.png) | ![preview 8](16/preview_8.png) | | 17 | 9 | [Download](17/dataset.zip) | ![preview 1](17/preview_1.png) | ![preview 2](17/preview_2.png) | ![preview 3](17/preview_3.png) | ![preview 4](17/preview_4.png) | ![preview 5](17/preview_5.png) | ![preview 6](17/preview_6.png) | ![preview 7](17/preview_7.png) | ![preview 8](17/preview_8.png) | | 18 | 29 | [Download](18/dataset.zip) | ![preview 1](18/preview_1.png) | ![preview 2](18/preview_2.png) | ![preview 3](18/preview_3.png) | ![preview 4](18/preview_4.png) | ![preview 5](18/preview_5.png) | ![preview 6](18/preview_6.png) | ![preview 7](18/preview_7.png) | ![preview 8](18/preview_8.png) | | 19 | 155 | [Download](19/dataset.zip) | ![preview 1](19/preview_1.png) | ![preview 2](19/preview_2.png) | ![preview 3](19/preview_3.png) | ![preview 4](19/preview_4.png) | ![preview 5](19/preview_5.png) | ![preview 6](19/preview_6.png) | ![preview 7](19/preview_7.png) | ![preview 8](19/preview_8.png) | | 20 | 12 | [Download](20/dataset.zip) | ![preview 1](20/preview_1.png) | ![preview 2](20/preview_2.png) | ![preview 3](20/preview_3.png) | ![preview 4](20/preview_4.png) | ![preview 5](20/preview_5.png) | ![preview 6](20/preview_6.png) | ![preview 7](20/preview_7.png) | ![preview 8](20/preview_8.png) | | noise | 116 | [Download](-1/dataset.zip) | ![preview 1](-1/preview_1.png) | ![preview 2](-1/preview_2.png) | ![preview 3](-1/preview_3.png) | ![preview 4](-1/preview_4.png) | ![preview 5](-1/preview_5.png) | ![preview 6](-1/preview_6.png) | ![preview 7](-1/preview_7.png) | ![preview 8](-1/preview_8.png) |
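The card above recommends pre-filtering the roughly 1% of noisy samples before training. A minimal sketch of one way to do that, assuming the per-cluster directory layout shown in the table (numbered folders plus a `-1` noise folder) and an illustrative `min_images` threshold of our own choosing:

```python
import os
import tempfile

def select_clusters(base_dir, min_images=20):
    """Pick character cluster directories worth training on.

    Skips the detector's noise cluster (folder named "-1") and any
    cluster with fewer than `min_images` files, following the card's
    advice to pre-filter potentially noisy samples.
    """
    selected = []
    for name in sorted(os.listdir(base_dir)):
        path = os.path.join(base_dir, name)
        if not os.path.isdir(path) or name == "-1":
            continue  # "-1" holds images the detector flagged as noise
        if len(os.listdir(path)) >= min_images:
            selected.append(name)
    return selected

# Tiny demo layout: cluster "0" (30 images), "1" (5 images), noise "-1".
root = tempfile.mkdtemp()
for cluster, n in [("0", 30), ("1", 5), ("-1", 10)]:
    d = os.path.join(root, cluster)
    os.makedirs(d)
    for i in range(n):
        open(os.path.join(d, f"{i}.png"), "w").close()

print(select_clusters(root))  # → ['0']
```

With the default threshold only the large cluster survives; lowering `min_images` keeps the small character clusters while still dropping the noise folder.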
open-llm-leaderboard/details_dpv__finetuned-gpt2-tiny
2023-10-04T04:45:32.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of dpv/finetuned-gpt2-tiny dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [dpv/finetuned-gpt2-tiny](https://huggingface.co/dpv/finetuned-gpt2-tiny) on the\ \ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dpv__finetuned-gpt2-tiny\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T04:44:17.358371](https://huggingface.co/datasets/open-llm-leaderboard/details_dpv__finetuned-gpt2-tiny/blob/main/results_2023-10-04T04-44-17.358371.json) (note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25815735770896303,\n\ \ \"acc_stderr\": 0.03144000977290419,\n \"acc_norm\": 0.2588772185417444,\n\ \ \"acc_norm_stderr\": 0.03144837270186198,\n \"mc1\": 0.22766217870257038,\n\ \ \"mc1_stderr\": 0.01467925503211107,\n \"mc2\": 0.4067393224175315,\n\ \ \"mc2_stderr\": 0.014921031198907243\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.19965870307167236,\n \"acc_stderr\": 0.011681625756888676,\n\ \ \"acc_norm\": 0.21843003412969283,\n \"acc_norm_stderr\": 0.012074291605700978\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2922724556861183,\n\ \ \"acc_stderr\": 0.0045387734937465595,\n \"acc_norm\": 0.31597291376219877,\n\ \ \"acc_norm_stderr\": 0.00463952045344403\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \ \ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.24444444444444444,\n\ \ \"acc_stderr\": 0.03712537833614867,\n \"acc_norm\": 0.24444444444444444,\n\ \ \"acc_norm_stderr\": 0.03712537833614867\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.16447368421052633,\n \"acc_stderr\": 0.0301675334686327,\n\ \ \"acc_norm\": 0.16447368421052633,\n \"acc_norm_stderr\": 0.0301675334686327\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.17,\n\ \ \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n \ \ \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.23018867924528302,\n \"acc_stderr\": 0.02590789712240817,\n\ \ \"acc_norm\": 0.23018867924528302,\n \"acc_norm_stderr\": 0.02590789712240817\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\ \ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\ \ \"acc_norm_stderr\": 
0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \ \ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\ \ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.3,\n\ \ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \ \ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720685,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720685\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\ \ \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n\ \ \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n\ \ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.17,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.17,\n\ \ \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.2680851063829787,\n \"acc_stderr\": 0.028957342788342347,\n\ \ \"acc_norm\": 0.2680851063829787,\n \"acc_norm_stderr\": 0.028957342788342347\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\ \ \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n\ \ \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\ \ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.25396825396825395,\n \"acc_stderr\": 0.022418042891113942,\n \"\ acc_norm\": 
0.25396825396825395,\n \"acc_norm_stderr\": 0.022418042891113942\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.14285714285714285,\n\ \ \"acc_stderr\": 0.0312984318574381,\n \"acc_norm\": 0.14285714285714285,\n\ \ \"acc_norm_stderr\": 0.0312984318574381\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.15,\n \"acc_stderr\": 0.035887028128263686,\n \ \ \"acc_norm\": 0.15,\n \"acc_norm_stderr\": 0.035887028128263686\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.3,\n \"acc_stderr\": 0.026069362295335137,\n \"acc_norm\": 0.3,\n\ \ \"acc_norm_stderr\": 0.026069362295335137\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.03144712581678242,\n\ \ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.03144712581678242\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\"\ : 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\ \ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.35353535353535354,\n \"acc_stderr\": 0.03406086723547153,\n \"\ acc_norm\": 0.35353535353535354,\n \"acc_norm_stderr\": 0.03406086723547153\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.03480175668466036,\n\ \ \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.03480175668466036\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.2743589743589744,\n \"acc_stderr\": 0.022622765767493225,\n\ \ \"acc_norm\": 0.2743589743589744,\n \"acc_norm_stderr\": 0.022622765767493225\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.25555555555555554,\n \"acc_stderr\": 0.02659393910184408,\n \ \ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.02659393910184408\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.28991596638655465,\n \"acc_stderr\": 0.029472485833136098,\n\ \ \"acc_norm\": 0.28991596638655465,\n \"acc_norm_stderr\": 0.029472485833136098\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.271523178807947,\n \"acc_stderr\": 0.036313298039696545,\n \"\ acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.036313298039696545\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.3467889908256881,\n \"acc_stderr\": 0.020406097104093027,\n \"\ acc_norm\": 0.3467889908256881,\n \"acc_norm_stderr\": 0.020406097104093027\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\ : 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\ \ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2549019607843137,\n\ \ \"acc_stderr\": 0.030587591351604243,\n \"acc_norm\": 0.2549019607843137,\n\ \ \"acc_norm_stderr\": 0.030587591351604243\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\ : {\n \"acc\": 0.2489451476793249,\n \"acc_stderr\": 0.028146970599422644,\n\ \ \"acc_norm\": 0.2489451476793249,\n \"acc_norm_stderr\": 0.028146970599422644\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2914798206278027,\n\ \ \"acc_stderr\": 0.030500283176545923,\n \"acc_norm\": 0.2914798206278027,\n\ \ \"acc_norm_stderr\": 0.030500283176545923\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.26717557251908397,\n \"acc_stderr\": 0.038808483010823944,\n\ \ \"acc_norm\": 0.26717557251908397,\n \"acc_norm_stderr\": 0.038808483010823944\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.3305785123966942,\n \"acc_stderr\": 0.04294340845212094,\n \"\ acc_norm\": 0.3305785123966942,\n \"acc_norm_stderr\": 0.04294340845212094\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n\ \ \"acc_stderr\": 0.03957835471980981,\n \"acc_norm\": 0.21296296296296297,\n\ \ \"acc_norm_stderr\": 0.03957835471980981\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.25766871165644173,\n \"acc_stderr\": 0.03436150827846917,\n\ \ \"acc_norm\": 0.25766871165644173,\n \"acc_norm_stderr\": 0.03436150827846917\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n\ \ \"acc_stderr\": 0.04059867246952688,\n \"acc_norm\": 0.24107142857142858,\n\ \ \"acc_norm_stderr\": 0.04059867246952688\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.33980582524271846,\n \"acc_stderr\": 0.046897659372781356,\n\ \ \"acc_norm\": 0.33980582524271846,\n \"acc_norm_stderr\": 0.046897659372781356\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.1794871794871795,\n\ \ \"acc_stderr\": 0.025140935950335418,\n \"acc_norm\": 0.1794871794871795,\n\ \ \"acc_norm_stderr\": 0.025140935950335418\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \ \ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.21839080459770116,\n\ \ \"acc_stderr\": 0.014774358319934488,\n \"acc_norm\": 0.21839080459770116,\n\ \ \"acc_norm_stderr\": 0.014774358319934488\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.2398843930635838,\n \"acc_stderr\": 0.02298959254312357,\n\ \ \"acc_norm\": 0.2398843930635838,\n \"acc_norm_stderr\": 0.02298959254312357\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\ \ \"acc_stderr\": 0.014333522059217889,\n 
\"acc_norm\": 0.2424581005586592,\n\ \ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.21895424836601307,\n \"acc_stderr\": 0.02367908986180772,\n\ \ \"acc_norm\": 0.21895424836601307,\n \"acc_norm_stderr\": 0.02367908986180772\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2508038585209003,\n\ \ \"acc_stderr\": 0.024619771956697165,\n \"acc_norm\": 0.2508038585209003,\n\ \ \"acc_norm_stderr\": 0.024619771956697165\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.22530864197530864,\n \"acc_stderr\": 0.023246202647819746,\n\ \ \"acc_norm\": 0.22530864197530864,\n \"acc_norm_stderr\": 0.023246202647819746\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.2695035460992908,\n \"acc_stderr\": 0.02646903681859063,\n \ \ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.02646903681859063\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24771838331160365,\n\ \ \"acc_stderr\": 0.011025499291443737,\n \"acc_norm\": 0.24771838331160365,\n\ \ \"acc_norm_stderr\": 0.011025499291443737\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.030161911930767102,\n\ \ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.030161911930767102\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.2647058823529412,\n \"acc_stderr\": 0.017848089574913222,\n \ \ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.017848089574913222\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n\ \ \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n\ \ \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.4,\n \"acc_stderr\": 0.031362502409358936,\n \ \ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.031362502409358936\n \ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22885572139303484,\n\ \ \"acc_stderr\": 0.029705284056772426,\n \"acc_norm\": 0.22885572139303484,\n\ \ \"acc_norm_stderr\": 0.029705284056772426\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \ \ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.1927710843373494,\n\ \ \"acc_stderr\": 0.030709824050565274,\n \"acc_norm\": 0.1927710843373494,\n\ \ \"acc_norm_stderr\": 0.030709824050565274\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.0312678171466318,\n\ \ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.0312678171466318\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22766217870257038,\n\ \ \"mc1_stderr\": 0.01467925503211107,\n \"mc2\": 0.4067393224175315,\n\ \ \"mc2_stderr\": 0.014921031198907243\n }\n}\n```" repo_url: https://huggingface.co/dpv/finetuned-gpt2-tiny leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|arc:challenge|25_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hellaswag|10_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T04-44-17.358371.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T04-44-17.358371.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T04-44-17.358371.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T04-44-17.358371.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T04-44-17.358371.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T04-44-17.358371.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T04-44-17.358371.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T04-44-17.358371.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T04-44-17.358371.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T04_44_17.358371 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T04-44-17.358371.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T04-44-17.358371.parquet' - config_name: results data_files: - split: 2023_10_04T04_44_17.358371 path: - results_2023-10-04T04-44-17.358371.parquet - split: latest path: - results_2023-10-04T04-44-17.358371.parquet --- # Dataset Card for Evaluation run of dpv/finetuned-gpt2-tiny ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/dpv/finetuned-gpt2-tiny - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[dpv/finetuned-gpt2-tiny](https://huggingface.co/dpv/finetuned-gpt2-tiny) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dpv__finetuned-gpt2-tiny",
	"harness_truthfulqa_mc_0",
	split="latest")
```

## Latest results

These are the [latest results from run 2023-10-04T04:44:17.358371](https://huggingface.co/datasets/open-llm-leaderboard/details_dpv__finetuned-gpt2-tiny/blob/main/results_2023-10-04T04-44-17.358371.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.25815735770896303, "acc_stderr": 0.03144000977290419, "acc_norm": 0.2588772185417444, "acc_norm_stderr": 0.03144837270186198, "mc1": 0.22766217870257038, "mc1_stderr": 0.01467925503211107, "mc2": 0.4067393224175315, "mc2_stderr": 0.014921031198907243 }, "harness|arc:challenge|25": { "acc": 0.19965870307167236, "acc_stderr": 0.011681625756888676, "acc_norm": 0.21843003412969283, "acc_norm_stderr": 0.012074291605700978 }, "harness|hellaswag|10": { "acc": 0.2922724556861183, "acc_stderr": 0.0045387734937465595, "acc_norm": 0.31597291376219877, "acc_norm_stderr": 0.00463952045344403 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.21, "acc_stderr": 0.040936018074033256, "acc_norm": 0.21, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.24444444444444444, "acc_stderr": 0.03712537833614867, "acc_norm": 0.24444444444444444, "acc_norm_stderr": 0.03712537833614867 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.16447368421052633, "acc_stderr": 0.0301675334686327, "acc_norm": 0.16447368421052633, "acc_norm_stderr": 0.0301675334686327 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.17, "acc_stderr": 0.0377525168068637, "acc_norm": 0.17, "acc_norm_stderr": 0.0377525168068637 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.23018867924528302, "acc_stderr": 0.02590789712240817, "acc_norm": 0.23018867924528302, "acc_norm_stderr": 0.02590789712240817 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2222222222222222, "acc_stderr": 0.03476590104304134, "acc_norm": 0.2222222222222222, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.2, "acc_stderr": 0.04020151261036846, "acc_norm": 0.2, "acc_norm_stderr": 0.04020151261036846 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, 
"acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.29, "acc_stderr": 0.04560480215720685, "acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720685 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.23699421965317918, "acc_stderr": 0.03242414757483098, "acc_norm": 0.23699421965317918, "acc_norm_stderr": 0.03242414757483098 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.2549019607843137, "acc_stderr": 0.043364327079931785, "acc_norm": 0.2549019607843137, "acc_norm_stderr": 0.043364327079931785 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.17, "acc_stderr": 0.03775251680686371, "acc_norm": 0.17, "acc_norm_stderr": 0.03775251680686371 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.2680851063829787, "acc_stderr": 0.028957342788342347, "acc_norm": 0.2680851063829787, "acc_norm_stderr": 0.028957342788342347 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2807017543859649, "acc_stderr": 0.042270544512322, "acc_norm": 0.2807017543859649, "acc_norm_stderr": 0.042270544512322 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2413793103448276, "acc_stderr": 0.03565998174135302, "acc_norm": 0.2413793103448276, "acc_norm_stderr": 0.03565998174135302 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.25396825396825395, "acc_stderr": 0.022418042891113942, "acc_norm": 0.25396825396825395, "acc_norm_stderr": 0.022418042891113942 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.14285714285714285, "acc_stderr": 0.0312984318574381, "acc_norm": 0.14285714285714285, "acc_norm_stderr": 0.0312984318574381 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.15, "acc_stderr": 0.035887028128263686, "acc_norm": 0.15, "acc_norm_stderr": 0.035887028128263686 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.3, "acc_stderr": 0.026069362295335137, "acc_norm": 0.3, "acc_norm_stderr": 0.026069362295335137 }, "harness|hendrycksTest-high_school_chemistry|5": { 
"acc": 0.27586206896551724, "acc_stderr": 0.03144712581678242, "acc_norm": 0.27586206896551724, "acc_norm_stderr": 0.03144712581678242 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.26, "acc_stderr": 0.04408440022768079, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768079 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03225078108306289, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.35353535353535354, "acc_stderr": 0.03406086723547153, "acc_norm": 0.35353535353535354, "acc_norm_stderr": 0.03406086723547153 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.36787564766839376, "acc_stderr": 0.03480175668466036, "acc_norm": 0.36787564766839376, "acc_norm_stderr": 0.03480175668466036 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.2743589743589744, "acc_stderr": 0.022622765767493225, "acc_norm": 0.2743589743589744, "acc_norm_stderr": 0.022622765767493225 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.25555555555555554, "acc_stderr": 0.02659393910184408, "acc_norm": 0.25555555555555554, "acc_norm_stderr": 0.02659393910184408 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.28991596638655465, "acc_stderr": 0.029472485833136098, "acc_norm": 0.28991596638655465, "acc_norm_stderr": 0.029472485833136098 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.271523178807947, "acc_stderr": 0.036313298039696545, "acc_norm": 0.271523178807947, "acc_norm_stderr": 0.036313298039696545 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.3467889908256881, "acc_stderr": 0.020406097104093027, "acc_norm": 0.3467889908256881, "acc_norm_stderr": 0.020406097104093027 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4722222222222222, "acc_stderr": 0.0340470532865388, "acc_norm": 0.4722222222222222, 
"acc_norm_stderr": 0.0340470532865388 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.2549019607843137, "acc_stderr": 0.030587591351604243, "acc_norm": 0.2549019607843137, "acc_norm_stderr": 0.030587591351604243 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.2489451476793249, "acc_stderr": 0.028146970599422644, "acc_norm": 0.2489451476793249, "acc_norm_stderr": 0.028146970599422644 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.2914798206278027, "acc_stderr": 0.030500283176545923, "acc_norm": 0.2914798206278027, "acc_norm_stderr": 0.030500283176545923 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.26717557251908397, "acc_stderr": 0.038808483010823944, "acc_norm": 0.26717557251908397, "acc_norm_stderr": 0.038808483010823944 }, "harness|hendrycksTest-international_law|5": { "acc": 0.3305785123966942, "acc_stderr": 0.04294340845212094, "acc_norm": 0.3305785123966942, "acc_norm_stderr": 0.04294340845212094 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.21296296296296297, "acc_stderr": 0.03957835471980981, "acc_norm": 0.21296296296296297, "acc_norm_stderr": 0.03957835471980981 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.25766871165644173, "acc_stderr": 0.03436150827846917, "acc_norm": 0.25766871165644173, "acc_norm_stderr": 0.03436150827846917 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.24107142857142858, "acc_stderr": 0.04059867246952688, "acc_norm": 0.24107142857142858, "acc_norm_stderr": 0.04059867246952688 }, "harness|hendrycksTest-management|5": { "acc": 0.33980582524271846, "acc_stderr": 0.046897659372781356, "acc_norm": 0.33980582524271846, "acc_norm_stderr": 0.046897659372781356 }, "harness|hendrycksTest-marketing|5": { "acc": 0.1794871794871795, "acc_stderr": 0.025140935950335418, "acc_norm": 0.1794871794871795, "acc_norm_stderr": 0.025140935950335418 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.27, "acc_stderr": 0.044619604333847394, "acc_norm": 0.27, 
"acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.21839080459770116, "acc_stderr": 0.014774358319934488, "acc_norm": 0.21839080459770116, "acc_norm_stderr": 0.014774358319934488 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.2398843930635838, "acc_stderr": 0.02298959254312357, "acc_norm": 0.2398843930635838, "acc_norm_stderr": 0.02298959254312357 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2424581005586592, "acc_stderr": 0.014333522059217889, "acc_norm": 0.2424581005586592, "acc_norm_stderr": 0.014333522059217889 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.21895424836601307, "acc_stderr": 0.02367908986180772, "acc_norm": 0.21895424836601307, "acc_norm_stderr": 0.02367908986180772 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.2508038585209003, "acc_stderr": 0.024619771956697165, "acc_norm": 0.2508038585209003, "acc_norm_stderr": 0.024619771956697165 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.22530864197530864, "acc_stderr": 0.023246202647819746, "acc_norm": 0.22530864197530864, "acc_norm_stderr": 0.023246202647819746 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.2695035460992908, "acc_stderr": 0.02646903681859063, "acc_norm": 0.2695035460992908, "acc_norm_stderr": 0.02646903681859063 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.24771838331160365, "acc_stderr": 0.011025499291443737, "acc_norm": 0.24771838331160365, "acc_norm_stderr": 0.011025499291443737 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.4411764705882353, "acc_stderr": 0.030161911930767102, "acc_norm": 0.4411764705882353, "acc_norm_stderr": 0.030161911930767102 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.2647058823529412, "acc_stderr": 0.017848089574913222, "acc_norm": 0.2647058823529412, "acc_norm_stderr": 0.017848089574913222 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03955932861795833, "acc_norm": 
0.21818181818181817, "acc_norm_stderr": 0.03955932861795833 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.4, "acc_stderr": 0.031362502409358936, "acc_norm": 0.4, "acc_norm_stderr": 0.031362502409358936 }, "harness|hendrycksTest-sociology|5": { "acc": 0.22885572139303484, "acc_stderr": 0.029705284056772426, "acc_norm": 0.22885572139303484, "acc_norm_stderr": 0.029705284056772426 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.26, "acc_stderr": 0.04408440022768079, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768079 }, "harness|hendrycksTest-virology|5": { "acc": 0.1927710843373494, "acc_stderr": 0.030709824050565274, "acc_norm": 0.1927710843373494, "acc_norm_stderr": 0.030709824050565274 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.21052631578947367, "acc_stderr": 0.0312678171466318, "acc_norm": 0.21052631578947367, "acc_norm_stderr": 0.0312678171466318 }, "harness|truthfulqa:mc|0": { "mc1": 0.22766217870257038, "mc1_stderr": 0.01467925503211107, "mc2": 0.4067393224175315, "mc2_stderr": 0.014921031198907243 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
atom-in-the-universe/bild-85ba9795-f953-4980-83a9-72788bbd3ea2
2023-10-04T05:02:07.000Z
[ "region:us" ]
atom-in-the-universe
null
null
null
0
0
Entry not found
JvManger/pharmacy-llama-2-indic3
2023-10-04T04:54:43.000Z
[ "region:us" ]
JvManger
null
null
null
0
0
Entry not found
atom-in-the-universe/bild-19795252-d58f-4285-9ca2-9414fd719459
2023-10-04T05:15:32.000Z
[ "region:us" ]
atom-in-the-universe
null
null
null
0
0
Entry not found
yimingzhang/lichess-2022
2023-10-08T01:35:02.000Z
[ "license:cc0-1.0", "region:us" ]
yimingzhang
null
null
null
0
0
--- configs: - config_name: all data_files: "all/*.jsonl.gz" - config_name: rapid-classical-correspondence data_files: "rapid-classical-correspondence/*.jsonl.gz" license: cc0-1.0 ---
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o_gate_up_down
2023-10-04T05:12:20.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o_gate_up_down dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o_gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o_gate_up_down)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o_gate_up_down\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T05:10:57.019261](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o_gate_up_down/blob/main/results_2023-10-04T05-10-57.019261.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5532724536570037,\n\ \ \"acc_stderr\": 0.0344601611583772,\n \"acc_norm\": 0.5575807140279452,\n\ \ \"acc_norm_stderr\": 0.034441424229965406,\n \"mc1\": 0.2668298653610771,\n\ \ \"mc1_stderr\": 0.015483691939237265,\n \"mc2\": 0.39106281296334733,\n\ \ \"mc2_stderr\": 0.01399739595458208\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5187713310580204,\n \"acc_stderr\": 0.014601090150633962,\n\ \ \"acc_norm\": 0.5631399317406144,\n \"acc_norm_stderr\": 0.014494421584256529\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.604461262696674,\n\ \ \"acc_stderr\": 0.004879667889198491,\n \"acc_norm\": 0.8142800238996216,\n\ \ \"acc_norm_stderr\": 0.0038808576792799397\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\ \ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\ \ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\ \ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.04033565667848319,\n\ \ \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.04033565667848319\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\ \ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \ \ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.03019761160019795,\n\ \ \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.03019761160019795\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n\ \ \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.6041666666666666,\n\ \ \"acc_norm_stderr\": 0.04089465449325582\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \ \ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n\ \ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4624277456647399,\n\ \ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.4624277456647399,\n\ \ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\ \ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n\ \ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146268,\n\ \ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146268\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\ \ \"acc_stderr\": 0.043391383225798615,\n \"acc_norm\": 0.30701754385964913,\n\ \ \"acc_norm_stderr\": 0.043391383225798615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192118,\n\ \ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192118\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.30687830687830686,\n \"acc_stderr\": 0.02375292871211214,\n \"\ acc_norm\": 0.30687830687830686,\n 
\"acc_norm_stderr\": 0.02375292871211214\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\ \ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\ \ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6741935483870968,\n\ \ \"acc_stderr\": 0.026662010578567107,\n \"acc_norm\": 0.6741935483870968,\n\ \ \"acc_norm_stderr\": 0.026662010578567107\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.03502544650845872,\n\ \ \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.03502544650845872\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\ : 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.03663974994391245,\n\ \ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.03663974994391245\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.6919191919191919,\n \"acc_stderr\": 0.032894773300986155,\n \"\ acc_norm\": 0.6919191919191919,\n \"acc_norm_stderr\": 0.032894773300986155\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8031088082901554,\n \"acc_stderr\": 0.028697873971860677,\n\ \ \"acc_norm\": 0.8031088082901554,\n \"acc_norm_stderr\": 0.028697873971860677\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.5102564102564102,\n \"acc_stderr\": 0.025345672221942374,\n\ \ \"acc_norm\": 0.5102564102564102,\n \"acc_norm_stderr\": 0.025345672221942374\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.2518518518518518,\n \"acc_stderr\": 0.026466117538959916,\n \ \ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.026466117538959916\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.03196876989195778,\n \ \ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.03196876989195778\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\ acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7724770642201835,\n \"acc_stderr\": 0.017974463578776502,\n \"\ acc_norm\": 0.7724770642201835,\n \"acc_norm_stderr\": 0.017974463578776502\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502326,\n \"\ acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502326\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7549019607843137,\n \"acc_stderr\": 0.030190282453501943,\n \"\ acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.030190282453501943\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7426160337552743,\n \"acc_stderr\": 0.0284588209914603,\n \ \ \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.0284588209914603\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\ \ \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n\ \ \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\ \ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912073,\n \"\ acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912073\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\ \ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\ \ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n\ \ \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\ \ \"acc_stderr\": 0.04287858751340456,\n \"acc_norm\": 0.2857142857142857,\n\ \ \"acc_norm_stderr\": 0.04287858751340456\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\ \ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8247863247863247,\n\ \ \"acc_stderr\": 0.02490443909891823,\n \"acc_norm\": 0.8247863247863247,\n\ \ \"acc_norm_stderr\": 0.02490443909891823\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \ \ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7547892720306514,\n\ \ \"acc_stderr\": 0.015384352284543941,\n \"acc_norm\": 0.7547892720306514,\n\ \ \"acc_norm_stderr\": 0.015384352284543941\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.02552247463212161,\n\ \ \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.02552247463212161\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33519553072625696,\n\ \ \"acc_stderr\": 0.015788007190185884,\n 
\"acc_norm\": 0.33519553072625696,\n\ \ \"acc_norm_stderr\": 0.015788007190185884\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.028180596328259287,\n\ \ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.028180596328259287\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.639871382636656,\n\ \ \"acc_stderr\": 0.02726429759980401,\n \"acc_norm\": 0.639871382636656,\n\ \ \"acc_norm_stderr\": 0.02726429759980401\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6141975308641975,\n \"acc_stderr\": 0.027085401226132143,\n\ \ \"acc_norm\": 0.6141975308641975,\n \"acc_norm_stderr\": 0.027085401226132143\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236844,\n \ \ \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236844\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43741851368970014,\n\ \ \"acc_stderr\": 0.012669813464935729,\n \"acc_norm\": 0.43741851368970014,\n\ \ \"acc_norm_stderr\": 0.012669813464935729\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5477941176470589,\n \"acc_stderr\": 0.03023375855159644,\n\ \ \"acc_norm\": 0.5477941176470589,\n \"acc_norm_stderr\": 0.03023375855159644\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.553921568627451,\n \"acc_stderr\": 0.020109864547181354,\n \ \ \"acc_norm\": 0.553921568627451,\n \"acc_norm_stderr\": 0.020109864547181354\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\ \ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\ \ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.5183673469387755,\n \"acc_stderr\": 0.03198761546763127,\n\ \ \"acc_norm\": 0.5183673469387755,\n \"acc_norm_stderr\": 0.03198761546763127\n\ \ 
},\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7064676616915423,\n\ \ \"acc_stderr\": 0.03220024104534205,\n \"acc_norm\": 0.7064676616915423,\n\ \ \"acc_norm_stderr\": 0.03220024104534205\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \ \ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\ \ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\ \ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.03061111655743253,\n\ \ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.03061111655743253\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2668298653610771,\n\ \ \"mc1_stderr\": 0.015483691939237265,\n \"mc2\": 0.39106281296334733,\n\ \ \"mc2_stderr\": 0.01399739595458208\n }\n}\n```" repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o_gate_up_down leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|arc:challenge|25_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hellaswag|10_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-10-57.019261.parquet' - 
'**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-10-57.019261.parquet' - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-10-57.019261.parquet' - 
'**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-10-57.019261.parquet' 
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-10-57.019261.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-10-57.019261.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-10-57.019261.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-10-57.019261.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-10-57.019261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-10-57.019261.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T05_10_57.019261 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-10-57.019261.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
  data_files:
  - split: 2023_10_04T05_10_57.019261
    path:
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-10-57.019261.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_virology_5
  data_files:
  - split: 2023_10_04T05_10_57.019261
    path:
    - '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-10-57.019261.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_world_religions_5
  data_files:
  - split: 2023_10_04T05_10_57.019261
    path:
    - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-10-57.019261.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_truthfulqa_mc_0
  data_files:
  - split: 2023_10_04T05_10_57.019261
    path:
    - '**/details_harness|truthfulqa:mc|0_2023-10-04T05-10-57.019261.parquet'
  - split: latest
    path:
    - '**/details_harness|truthfulqa:mc|0_2023-10-04T05-10-57.019261.parquet'
- config_name: results
  data_files:
  - split: 2023_10_04T05_10_57.019261
    path:
    - results_2023-10-04T05-10-57.019261.parquet
  - split: latest
    path:
    - results_2023-10-04T05-10-57.019261.parquet
---

# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o_gate_up_down

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o_gate_up_down
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the
evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o_gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o_gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o_gate_up_down",
    "harness_truthfulqa_mc_0",
    split="train",
)
```

## Latest results

These are the [latest results from run 2023-10-04T05:10:57.019261](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o_gate_up_down/blob/main/results_2023-10-04T05-10-57.019261.json). Note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
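As a side note, the run timestamp appears in two spellings throughout this card: as a split name (dashes and colons become underscores) and inside the per-run filenames (colons become dashes). The following is a minimal sketch of both naming conventions, inferred from the file listings above rather than taken from any official API; the helper names are illustrative only:

```python
def run_split_name(timestamp: str) -> str:
    """Split name for a run, e.g. 2023-10-04T05:10:57.019261 -> 2023_10_04T05_10_57.019261.

    Dashes in the date and colons in the time become underscores; the
    sub-second part keeps its dot.
    """
    date_part, time_part = timestamp.split("T")
    return date_part.replace("-", "_") + "T" + time_part.replace(":", "_")


def run_results_filename(timestamp: str) -> str:
    """Aggregated-results filename for a run: colons become dashes.

    e.g. 2023-10-04T05:10:57.019261 -> results_2023-10-04T05-10-57.019261.parquet
    """
    return "results_" + timestamp.replace(":", "-") + ".parquet"


print(run_split_name("2023-10-04T05:10:57.019261"))        # -> 2023_10_04T05_10_57.019261
print(run_results_filename("2023-10-04T05:10:57.019261"))  # -> results_2023-10-04T05-10-57.019261.parquet
```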
You can find each result in the "results" configuration and in the "latest" split for each eval:

```python
{
    "all": {
        "acc": 0.5532724536570037,
        "acc_stderr": 0.0344601611583772,
        "acc_norm": 0.5575807140279452,
        "acc_norm_stderr": 0.034441424229965406,
        "mc1": 0.2668298653610771,
        "mc1_stderr": 0.015483691939237265,
        "mc2": 0.39106281296334733,
        "mc2_stderr": 0.01399739595458208
    },
    "harness|arc:challenge|25": {
        "acc": 0.5187713310580204,
        "acc_stderr": 0.014601090150633962,
        "acc_norm": 0.5631399317406144,
        "acc_norm_stderr": 0.014494421584256529
    },
    "harness|hellaswag|10": {
        "acc": 0.604461262696674,
        "acc_stderr": 0.004879667889198491,
        "acc_norm": 0.8142800238996216,
        "acc_norm_stderr": 0.0038808576792799397
    },
    "harness|hendrycksTest-abstract_algebra|5": {
        "acc": 0.3,
        "acc_stderr": 0.04605661864718381,
        "acc_norm": 0.3,
        "acc_norm_stderr": 0.04605661864718381
    },
    "harness|hendrycksTest-anatomy|5": {
        "acc": 0.4888888888888889,
        "acc_stderr": 0.04318275491977976,
        "acc_norm": 0.4888888888888889,
        "acc_norm_stderr": 0.04318275491977976
    },
    "harness|hendrycksTest-astronomy|5": {
        "acc": 0.5657894736842105,
        "acc_stderr": 0.04033565667848319,
        "acc_norm": 0.5657894736842105,
        "acc_norm_stderr": 0.04033565667848319
    },
    "harness|hendrycksTest-business_ethics|5": {
        "acc": 0.55,
        "acc_stderr": 0.049999999999999996,
        "acc_norm": 0.55,
        "acc_norm_stderr": 0.049999999999999996
    },
    "harness|hendrycksTest-clinical_knowledge|5": {
        "acc": 0.5962264150943396,
        "acc_stderr": 0.03019761160019795,
        "acc_norm": 0.5962264150943396,
        "acc_norm_stderr": 0.03019761160019795
    },
    "harness|hendrycksTest-college_biology|5": {
        "acc": 0.6041666666666666,
        "acc_stderr": 0.04089465449325582,
        "acc_norm": 0.6041666666666666,
        "acc_norm_stderr": 0.04089465449325582
    },
    "harness|hendrycksTest-college_chemistry|5": {
        "acc": 0.38,
        "acc_stderr": 0.04878317312145633,
        "acc_norm": 0.38,
        "acc_norm_stderr": 0.04878317312145633
    },
    "harness|hendrycksTest-college_computer_science|5": {
        "acc": 0.45,
        "acc_stderr": 0.049999999999999996,
        "acc_norm": 0.45,
        "acc_norm_stderr": 0.049999999999999996
    },
    "harness|hendrycksTest-college_mathematics|5": {
        "acc": 0.37,
        "acc_stderr": 0.04852365870939099,
        "acc_norm": 0.37,
        "acc_norm_stderr": 0.04852365870939099
    },
    "harness|hendrycksTest-college_medicine|5": {
        "acc": 0.4624277456647399,
        "acc_stderr": 0.0380168510452446,
        "acc_norm": 0.4624277456647399,
        "acc_norm_stderr": 0.0380168510452446
    },
    "harness|hendrycksTest-college_physics|5": {
        "acc": 0.3333333333333333,
        "acc_stderr": 0.04690650298201942,
        "acc_norm": 0.3333333333333333,
        "acc_norm_stderr": 0.04690650298201942
    },
    "harness|hendrycksTest-computer_security|5": {
        "acc": 0.72,
        "acc_stderr": 0.04512608598542128,
        "acc_norm": 0.72,
        "acc_norm_stderr": 0.04512608598542128
    },
    "harness|hendrycksTest-conceptual_physics|5": {
        "acc": 0.42127659574468085,
        "acc_stderr": 0.03227834510146268,
        "acc_norm": 0.42127659574468085,
        "acc_norm_stderr": 0.03227834510146268
    },
    "harness|hendrycksTest-econometrics|5": {
        "acc": 0.30701754385964913,
        "acc_stderr": 0.043391383225798615,
        "acc_norm": 0.30701754385964913,
        "acc_norm_stderr": 0.043391383225798615
    },
    "harness|hendrycksTest-electrical_engineering|5": {
        "acc": 0.45517241379310347,
        "acc_stderr": 0.04149886942192118,
        "acc_norm": 0.45517241379310347,
        "acc_norm_stderr": 0.04149886942192118
    },
    "harness|hendrycksTest-elementary_mathematics|5": {
        "acc": 0.30687830687830686,
        "acc_stderr": 0.02375292871211214,
        "acc_norm": 0.30687830687830686,
        "acc_norm_stderr": 0.02375292871211214
    },
    "harness|hendrycksTest-formal_logic|5": {
        "acc": 0.373015873015873,
        "acc_stderr": 0.04325506042017086,
        "acc_norm": 0.373015873015873,
        "acc_norm_stderr": 0.04325506042017086
    },
    "harness|hendrycksTest-global_facts|5": {
        "acc": 0.37,
        "acc_stderr": 0.04852365870939099,
        "acc_norm": 0.37,
        "acc_norm_stderr": 0.04852365870939099
    },
    "harness|hendrycksTest-high_school_biology|5": {
        "acc": 0.6741935483870968,
        "acc_stderr": 0.026662010578567107,
        "acc_norm": 0.6741935483870968,
        "acc_norm_stderr": 0.026662010578567107
    },
    "harness|hendrycksTest-high_school_chemistry|5": {
        "acc": 0.45320197044334976,
        "acc_stderr": 0.03502544650845872,
        "acc_norm": 0.45320197044334976,
        "acc_norm_stderr": 0.03502544650845872
    },
    "harness|hendrycksTest-high_school_computer_science|5": {
        "acc": 0.5,
        "acc_stderr": 0.050251890762960605,
        "acc_norm": 0.5,
        "acc_norm_stderr": 0.050251890762960605
    },
    "harness|hendrycksTest-high_school_european_history|5": {
        "acc": 0.6727272727272727,
        "acc_stderr": 0.03663974994391245,
        "acc_norm": 0.6727272727272727,
        "acc_norm_stderr": 0.03663974994391245
    },
    "harness|hendrycksTest-high_school_geography|5": {
        "acc": 0.6919191919191919,
        "acc_stderr": 0.032894773300986155,
        "acc_norm": 0.6919191919191919,
        "acc_norm_stderr": 0.032894773300986155
    },
    "harness|hendrycksTest-high_school_government_and_politics|5": {
        "acc": 0.8031088082901554,
        "acc_stderr": 0.028697873971860677,
        "acc_norm": 0.8031088082901554,
        "acc_norm_stderr": 0.028697873971860677
    },
    "harness|hendrycksTest-high_school_macroeconomics|5": {
        "acc": 0.5102564102564102,
        "acc_stderr": 0.025345672221942374,
        "acc_norm": 0.5102564102564102,
        "acc_norm_stderr": 0.025345672221942374
    },
    "harness|hendrycksTest-high_school_mathematics|5": {
        "acc": 0.2518518518518518,
        "acc_stderr": 0.026466117538959916,
        "acc_norm": 0.2518518518518518,
        "acc_norm_stderr": 0.026466117538959916
    },
    "harness|hendrycksTest-high_school_microeconomics|5": {
        "acc": 0.5882352941176471,
        "acc_stderr": 0.03196876989195778,
        "acc_norm": 0.5882352941176471,
        "acc_norm_stderr": 0.03196876989195778
    },
    "harness|hendrycksTest-high_school_physics|5": {
        "acc": 0.2980132450331126,
        "acc_stderr": 0.037345356767871984,
        "acc_norm": 0.2980132450331126,
        "acc_norm_stderr": 0.037345356767871984
    },
    "harness|hendrycksTest-high_school_psychology|5": {
        "acc": 0.7724770642201835,
        "acc_stderr": 0.017974463578776502,
        "acc_norm": 0.7724770642201835,
        "acc_norm_stderr": 0.017974463578776502
    },
    "harness|hendrycksTest-high_school_statistics|5": {
        "acc": 0.4444444444444444,
        "acc_stderr": 0.03388857118502326,
        "acc_norm": 0.4444444444444444,
        "acc_norm_stderr": 0.03388857118502326
    },
    "harness|hendrycksTest-high_school_us_history|5": {
        "acc": 0.7549019607843137,
        "acc_stderr": 0.030190282453501943,
        "acc_norm": 0.7549019607843137,
        "acc_norm_stderr": 0.030190282453501943
    },
    "harness|hendrycksTest-high_school_world_history|5": {
        "acc": 0.7426160337552743,
        "acc_stderr": 0.0284588209914603,
        "acc_norm": 0.7426160337552743,
        "acc_norm_stderr": 0.0284588209914603
    },
    "harness|hendrycksTest-human_aging|5": {
        "acc": 0.6681614349775785,
        "acc_stderr": 0.03160295143776679,
        "acc_norm": 0.6681614349775785,
        "acc_norm_stderr": 0.03160295143776679
    },
    "harness|hendrycksTest-human_sexuality|5": {
        "acc": 0.6412213740458015,
        "acc_stderr": 0.04206739313864908,
        "acc_norm": 0.6412213740458015,
        "acc_norm_stderr": 0.04206739313864908
    },
    "harness|hendrycksTest-international_law|5": {
        "acc": 0.7355371900826446,
        "acc_stderr": 0.040261875275912073,
        "acc_norm": 0.7355371900826446,
        "acc_norm_stderr": 0.040261875275912073
    },
    "harness|hendrycksTest-jurisprudence|5": {
        "acc": 0.7314814814814815,
        "acc_stderr": 0.042844679680521934,
        "acc_norm": 0.7314814814814815,
        "acc_norm_stderr": 0.042844679680521934
    },
    "harness|hendrycksTest-logical_fallacies|5": {
        "acc": 0.6687116564417178,
        "acc_stderr": 0.03697983910025588,
        "acc_norm": 0.6687116564417178,
        "acc_norm_stderr": 0.03697983910025588
    },
    "harness|hendrycksTest-machine_learning|5": {
        "acc": 0.2857142857142857,
        "acc_stderr": 0.04287858751340456,
        "acc_norm": 0.2857142857142857,
        "acc_norm_stderr": 0.04287858751340456
    },
    "harness|hendrycksTest-management|5": {
        "acc": 0.7281553398058253,
        "acc_stderr": 0.044052680241409216,
        "acc_norm": 0.7281553398058253,
        "acc_norm_stderr": 0.044052680241409216
    },
    "harness|hendrycksTest-marketing|5": {
        "acc": 0.8247863247863247,
        "acc_stderr": 0.02490443909891823,
        "acc_norm": 0.8247863247863247,
        "acc_norm_stderr": 0.02490443909891823
    },
    "harness|hendrycksTest-medical_genetics|5": {
        "acc": 0.63,
        "acc_stderr": 0.04852365870939099,
        "acc_norm": 0.63,
        "acc_norm_stderr": 0.04852365870939099
    },
    "harness|hendrycksTest-miscellaneous|5": {
        "acc": 0.7547892720306514,
        "acc_stderr": 0.015384352284543941,
        "acc_norm": 0.7547892720306514,
        "acc_norm_stderr": 0.015384352284543941
    },
    "harness|hendrycksTest-moral_disputes|5": {
        "acc": 0.6589595375722543,
        "acc_stderr": 0.02552247463212161,
        "acc_norm": 0.6589595375722543,
        "acc_norm_stderr": 0.02552247463212161
    },
    "harness|hendrycksTest-moral_scenarios|5": {
        "acc": 0.33519553072625696,
        "acc_stderr": 0.015788007190185884,
        "acc_norm": 0.33519553072625696,
        "acc_norm_stderr": 0.015788007190185884
    },
    "harness|hendrycksTest-nutrition|5": {
        "acc": 0.5882352941176471,
        "acc_stderr": 0.028180596328259287,
        "acc_norm": 0.5882352941176471,
        "acc_norm_stderr": 0.028180596328259287
    },
    "harness|hendrycksTest-philosophy|5": {
        "acc": 0.639871382636656,
        "acc_stderr": 0.02726429759980401,
        "acc_norm": 0.639871382636656,
        "acc_norm_stderr": 0.02726429759980401
    },
    "harness|hendrycksTest-prehistory|5": {
        "acc": 0.6141975308641975,
        "acc_stderr": 0.027085401226132143,
        "acc_norm": 0.6141975308641975,
        "acc_norm_stderr": 0.027085401226132143
    },
    "harness|hendrycksTest-professional_accounting|5": {
        "acc": 0.4574468085106383,
        "acc_stderr": 0.029719281272236844,
        "acc_norm": 0.4574468085106383,
        "acc_norm_stderr": 0.029719281272236844
    },
    "harness|hendrycksTest-professional_law|5": {
        "acc": 0.43741851368970014,
        "acc_stderr": 0.012669813464935729,
        "acc_norm": 0.43741851368970014,
        "acc_norm_stderr": 0.012669813464935729
    },
    "harness|hendrycksTest-professional_medicine|5": {
        "acc": 0.5477941176470589,
        "acc_stderr": 0.03023375855159644,
        "acc_norm": 0.5477941176470589,
        "acc_norm_stderr": 0.03023375855159644
    },
    "harness|hendrycksTest-professional_psychology|5": {
        "acc": 0.553921568627451,
        "acc_stderr": 0.020109864547181354,
        "acc_norm": 0.553921568627451,
        "acc_norm_stderr": 0.020109864547181354
    },
    "harness|hendrycksTest-public_relations|5": {
        "acc": 0.6363636363636364,
        "acc_stderr": 0.046075820907199756,
        "acc_norm": 0.6363636363636364,
        "acc_norm_stderr": 0.046075820907199756
    },
    "harness|hendrycksTest-security_studies|5": {
        "acc": 0.5183673469387755,
        "acc_stderr": 0.03198761546763127,
        "acc_norm": 0.5183673469387755,
        "acc_norm_stderr": 0.03198761546763127
    },
    "harness|hendrycksTest-sociology|5": {
        "acc": 0.7064676616915423,
        "acc_stderr": 0.03220024104534205,
        "acc_norm": 0.7064676616915423,
        "acc_norm_stderr": 0.03220024104534205
    },
    "harness|hendrycksTest-us_foreign_policy|5": {
        "acc": 0.71,
        "acc_stderr": 0.045604802157206845,
        "acc_norm": 0.71,
        "acc_norm_stderr": 0.045604802157206845
    },
    "harness|hendrycksTest-virology|5": {
        "acc": 0.43373493975903615,
        "acc_stderr": 0.03858158940685517,
        "acc_norm": 0.43373493975903615,
        "acc_norm_stderr": 0.03858158940685517
    },
    "harness|hendrycksTest-world_religions|5": {
        "acc": 0.8011695906432749,
        "acc_stderr": 0.03061111655743253,
        "acc_norm": 0.8011695906432749,
        "acc_norm_stderr": 0.03061111655743253
    },
    "harness|truthfulqa:mc|0": {
        "mc1": 0.2668298653610771,
        "mc1_stderr": 0.015483691939237265,
        "mc2": 0.39106281296334733,
        "mc2_stderr": 0.01399739595458208
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
atom-in-the-universe/bild-25a8c44f-32af-4f9a-8e3d-7344a3260eda
2023-10-04T05:30:10.000Z
[ "region:us" ]
atom-in-the-universe
null
null
null
0
0
Entry not found
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o_gate_up_down
2023-10-04T05:18:50.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o_gate_up_down dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o_gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o_gate_up_down)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o_gate_up_down\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T05:17:27.993942](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o_gate_up_down/blob/main/results_2023-10-04T05-17-27.993942.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5592320899241068,\n\ \ \"acc_stderr\": 0.034413133808153286,\n \"acc_norm\": 0.5635314128799154,\n\ \ \"acc_norm_stderr\": 0.03439396424779458,\n \"mc1\": 0.2717258261933905,\n\ \ \"mc1_stderr\": 0.01557284045287583,\n \"mc2\": 0.39792286545162303,\n\ \ \"mc2_stderr\": 0.01419606078232786\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5255972696245734,\n \"acc_stderr\": 0.014592230885298962,\n\ \ \"acc_norm\": 0.5725255972696246,\n \"acc_norm_stderr\": 0.01445686294465065\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6081457876916949,\n\ \ \"acc_stderr\": 0.004871667371060536,\n \"acc_norm\": 0.8148775144393547,\n\ \ \"acc_norm_stderr\": 0.0038760312505449843\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \ \ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\ \ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n\ \ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5723684210526315,\n \"acc_stderr\": 0.040260970832965634,\n\ \ \"acc_norm\": 0.5723684210526315,\n \"acc_norm_stderr\": 0.040260970832965634\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n\ \ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \ \ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.5886792452830188,\n \"acc_stderr\": 0.03028500925900979,\n\ \ \"acc_norm\": 0.5886792452830188,\n \"acc_norm_stderr\": 0.03028500925900979\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5833333333333334,\n\ \ \"acc_stderr\": 0.04122728707651282,\n \"acc_norm\": 0.5833333333333334,\n\ \ \"acc_norm_stderr\": 0.04122728707651282\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \ \ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\ \ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \ \ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5202312138728323,\n\ \ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.5202312138728323,\n\ \ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\ \ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\ \ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.4723404255319149,\n \"acc_stderr\": 0.03263597118409769,\n\ \ \"acc_norm\": 0.4723404255319149,\n \"acc_norm_stderr\": 0.03263597118409769\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\ \ \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n\ \ \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\ \ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.35714285714285715,\n \"acc_stderr\": 0.024677862841332783,\n \"\ acc_norm\": 0.35714285714285715,\n 
\"acc_norm_stderr\": 0.024677862841332783\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\ \ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\ \ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \ \ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.6258064516129033,\n \"acc_stderr\": 0.027528904299845707,\n \"\ acc_norm\": 0.6258064516129033,\n \"acc_norm_stderr\": 0.027528904299845707\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.4482758620689655,\n \"acc_stderr\": 0.034991131376767445,\n \"\ acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.034991131376767445\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\ : 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.037131580674819135,\n\ \ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.037131580674819135\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7323232323232324,\n \"acc_stderr\": 0.03154449888270286,\n \"\ acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.03154449888270286\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n\ \ \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.5384615384615384,\n \"acc_stderr\": 0.025275892070240648,\n\ \ \"acc_norm\": 0.5384615384615384,\n \"acc_norm_stderr\": 0.025275892070240648\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.3,\n \"acc_stderr\": 0.027940457136228416,\n \"acc_norm\"\ : 0.3,\n \"acc_norm_stderr\": 0.027940457136228416\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\ : {\n \"acc\": 0.6134453781512605,\n \"acc_stderr\": 0.0316314580755238,\n\ \ \"acc_norm\": 0.6134453781512605,\n \"acc_norm_stderr\": 0.0316314580755238\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\ : 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\ \ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7798165137614679,\n\ \ \"acc_stderr\": 0.01776597865232754,\n \"acc_norm\": 0.7798165137614679,\n\ \ \"acc_norm_stderr\": 0.01776597865232754\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\ : {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n\ \ \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7794117647058824,\n \"acc_stderr\": 0.029102254389674082,\n \"\ acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.029102254389674082\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7552742616033755,\n \"acc_stderr\": 0.02798569938703642,\n \ \ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.02798569938703642\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n\ \ \"acc_stderr\": 0.03252113489929187,\n \"acc_norm\": 0.6233183856502242,\n\ \ \"acc_norm_stderr\": 0.03252113489929187\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n\ \ \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.743801652892562,\n 
\"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\ : 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\ \ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\ \ \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n\ \ \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.037311335196738925,\n\ \ \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.037311335196738925\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\ \ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\ \ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\ \ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\ \ \"acc_stderr\": 0.02514093595033544,\n \"acc_norm\": 0.8205128205128205,\n\ \ \"acc_norm_stderr\": 0.02514093595033544\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \ \ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7535121328224776,\n\ \ \"acc_stderr\": 0.015411308769686934,\n \"acc_norm\": 0.7535121328224776,\n\ \ \"acc_norm_stderr\": 0.015411308769686934\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.025992472029306393,\n\ \ \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.025992472029306393\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4558659217877095,\n\ \ \"acc_stderr\": 0.01665722942458631,\n \"acc_norm\": 0.4558659217877095,\n\ \ \"acc_norm_stderr\": 0.01665722942458631\n },\n 
\"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.5980392156862745,\n \"acc_stderr\": 0.028074158947600653,\n\ \ \"acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.028074158947600653\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6237942122186495,\n\ \ \"acc_stderr\": 0.02751392568354943,\n \"acc_norm\": 0.6237942122186495,\n\ \ \"acc_norm_stderr\": 0.02751392568354943\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027125115513166858,\n\ \ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027125115513166858\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4397163120567376,\n \"acc_stderr\": 0.029609912075594106,\n \ \ \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.029609912075594106\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42894393741851367,\n\ \ \"acc_stderr\": 0.012640625443067354,\n \"acc_norm\": 0.42894393741851367,\n\ \ \"acc_norm_stderr\": 0.012640625443067354\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5955882352941176,\n \"acc_stderr\": 0.02981263070156974,\n\ \ \"acc_norm\": 0.5955882352941176,\n \"acc_norm_stderr\": 0.02981263070156974\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.5555555555555556,\n \"acc_stderr\": 0.020102583895887188,\n \ \ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.020102583895887188\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\ \ \"acc_stderr\": 0.04653429807913507,\n \"acc_norm\": 0.6181818181818182,\n\ \ \"acc_norm_stderr\": 0.04653429807913507\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.5918367346938775,\n \"acc_stderr\": 0.03146465712827424,\n\ \ \"acc_norm\": 0.5918367346938775,\n \"acc_norm_stderr\": 0.03146465712827424\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6965174129353234,\n\ \ 
\"acc_stderr\": 0.03251006816458618,\n \"acc_norm\": 0.6965174129353234,\n\ \ \"acc_norm_stderr\": 0.03251006816458618\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932264,\n \ \ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932264\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39156626506024095,\n\ \ \"acc_stderr\": 0.03799857454479637,\n \"acc_norm\": 0.39156626506024095,\n\ \ \"acc_norm_stderr\": 0.03799857454479637\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338734,\n\ \ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338734\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2717258261933905,\n\ \ \"mc1_stderr\": 0.01557284045287583,\n \"mc2\": 0.39792286545162303,\n\ \ \"mc2_stderr\": 0.01419606078232786\n }\n}\n```" repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o_gate_up_down leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|arc:challenge|25_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hellaswag|10_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-17-27.993942.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-17-27.993942.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-17-27.993942.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-17-27.993942.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-17-27.993942.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-17-27.993942.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-17-27.993942.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-17-27.993942.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-17-27.993942.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T05_17_27.993942 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T05-17-27.993942.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T05-17-27.993942.parquet' - config_name: results data_files: - split: 2023_10_04T05_17_27.993942 path: - results_2023-10-04T05-17-27.993942.parquet - split: latest path: - results_2023-10-04T05-17-27.993942.parquet --- # Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o_gate_up_down ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o_gate_up_down - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during 
the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o_gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o_gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o_gate_up_down", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-10-04T05:17:27.993942](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o_gate_up_down/blob/main/results_2023-10-04T05-17-27.993942.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5592320899241068, "acc_stderr": 0.034413133808153286, "acc_norm": 0.5635314128799154, "acc_norm_stderr": 0.03439396424779458, "mc1": 0.2717258261933905, "mc1_stderr": 0.01557284045287583, "mc2": 0.39792286545162303, "mc2_stderr": 0.01419606078232786 }, "harness|arc:challenge|25": { "acc": 0.5255972696245734, "acc_stderr": 0.014592230885298962, "acc_norm": 0.5725255972696246, "acc_norm_stderr": 0.01445686294465065 }, "harness|hellaswag|10": { "acc": 0.6081457876916949, "acc_stderr": 0.004871667371060536, "acc_norm": 0.8148775144393547, "acc_norm_stderr": 0.0038760312505449843 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.27, "acc_stderr": 0.04461960433384741, "acc_norm": 0.27, "acc_norm_stderr": 0.04461960433384741 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5037037037037037, "acc_stderr": 0.04319223625811331, "acc_norm": 0.5037037037037037, "acc_norm_stderr": 0.04319223625811331 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5723684210526315, "acc_stderr": 0.040260970832965634, "acc_norm": 0.5723684210526315, "acc_norm_stderr": 0.040260970832965634 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5886792452830188, "acc_stderr": 0.03028500925900979, "acc_norm": 0.5886792452830188, "acc_norm_stderr": 0.03028500925900979 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5833333333333334, "acc_stderr": 0.04122728707651282, "acc_norm": 0.5833333333333334, "acc_norm_stderr": 0.04122728707651282 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.43, "acc_stderr": 0.04975698519562428, "acc_norm": 0.43, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, 
"acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.33, "acc_stderr": 0.04725815626252605, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252605 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5202312138728323, "acc_stderr": 0.03809342081273957, "acc_norm": 0.5202312138728323, "acc_norm_stderr": 0.03809342081273957 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.37254901960784315, "acc_stderr": 0.04810840148082635, "acc_norm": 0.37254901960784315, "acc_norm_stderr": 0.04810840148082635 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4723404255319149, "acc_stderr": 0.03263597118409769, "acc_norm": 0.4723404255319149, "acc_norm_stderr": 0.03263597118409769 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.34210526315789475, "acc_stderr": 0.04462917535336936, "acc_norm": 0.34210526315789475, "acc_norm_stderr": 0.04462917535336936 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.47586206896551725, "acc_stderr": 0.041618085035015295, "acc_norm": 0.47586206896551725, "acc_norm_stderr": 0.041618085035015295 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.35714285714285715, "acc_stderr": 0.024677862841332783, "acc_norm": 0.35714285714285715, "acc_norm_stderr": 0.024677862841332783 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2857142857142857, "acc_stderr": 0.04040610178208841, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.04040610178208841 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6258064516129033, "acc_stderr": 0.027528904299845707, "acc_norm": 0.6258064516129033, "acc_norm_stderr": 0.027528904299845707 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4482758620689655, "acc_stderr": 0.034991131376767445, "acc_norm": 0.4482758620689655, "acc_norm_stderr": 0.034991131376767445 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6545454545454545, "acc_stderr": 0.037131580674819135, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 0.037131580674819135 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7323232323232324, "acc_stderr": 0.03154449888270286, "acc_norm": 0.7323232323232324, "acc_norm_stderr": 0.03154449888270286 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8186528497409327, "acc_stderr": 0.02780703236068609, "acc_norm": 0.8186528497409327, "acc_norm_stderr": 0.02780703236068609 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5384615384615384, "acc_stderr": 0.025275892070240648, "acc_norm": 0.5384615384615384, "acc_norm_stderr": 0.025275892070240648 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3, "acc_stderr": 0.027940457136228416, "acc_norm": 0.3, "acc_norm_stderr": 0.027940457136228416 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6134453781512605, "acc_stderr": 0.0316314580755238, "acc_norm": 0.6134453781512605, "acc_norm_stderr": 0.0316314580755238 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.304635761589404, "acc_stderr": 0.03757949922943343, "acc_norm": 0.304635761589404, "acc_norm_stderr": 0.03757949922943343 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7798165137614679, "acc_stderr": 0.01776597865232754, "acc_norm": 0.7798165137614679, "acc_norm_stderr": 0.01776597865232754 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5138888888888888, "acc_stderr": 0.03408655867977749, "acc_norm": 
0.5138888888888888, "acc_norm_stderr": 0.03408655867977749 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7794117647058824, "acc_stderr": 0.029102254389674082, "acc_norm": 0.7794117647058824, "acc_norm_stderr": 0.029102254389674082 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7552742616033755, "acc_stderr": 0.02798569938703642, "acc_norm": 0.7552742616033755, "acc_norm_stderr": 0.02798569938703642 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6233183856502242, "acc_stderr": 0.03252113489929187, "acc_norm": 0.6233183856502242, "acc_norm_stderr": 0.03252113489929187 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6793893129770993, "acc_stderr": 0.04093329229834278, "acc_norm": 0.6793893129770993, "acc_norm_stderr": 0.04093329229834278 }, "harness|hendrycksTest-international_law|5": { "acc": 0.743801652892562, "acc_stderr": 0.03984979653302872, "acc_norm": 0.743801652892562, "acc_norm_stderr": 0.03984979653302872 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6944444444444444, "acc_stderr": 0.044531975073749834, "acc_norm": 0.6944444444444444, "acc_norm_stderr": 0.044531975073749834 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.656441717791411, "acc_stderr": 0.037311335196738925, "acc_norm": 0.656441717791411, "acc_norm_stderr": 0.037311335196738925 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.30357142857142855, "acc_stderr": 0.04364226155841044, "acc_norm": 0.30357142857142855, "acc_norm_stderr": 0.04364226155841044 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8205128205128205, "acc_stderr": 0.02514093595033544, "acc_norm": 0.8205128205128205, "acc_norm_stderr": 0.02514093595033544 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.6, "acc_stderr": 0.049236596391733084, "acc_norm": 0.6, 
"acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7535121328224776, "acc_stderr": 0.015411308769686934, "acc_norm": 0.7535121328224776, "acc_norm_stderr": 0.015411308769686934 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.630057803468208, "acc_stderr": 0.025992472029306393, "acc_norm": 0.630057803468208, "acc_norm_stderr": 0.025992472029306393 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4558659217877095, "acc_stderr": 0.01665722942458631, "acc_norm": 0.4558659217877095, "acc_norm_stderr": 0.01665722942458631 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5980392156862745, "acc_stderr": 0.028074158947600653, "acc_norm": 0.5980392156862745, "acc_norm_stderr": 0.028074158947600653 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6237942122186495, "acc_stderr": 0.02751392568354943, "acc_norm": 0.6237942122186495, "acc_norm_stderr": 0.02751392568354943 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6111111111111112, "acc_stderr": 0.027125115513166858, "acc_norm": 0.6111111111111112, "acc_norm_stderr": 0.027125115513166858 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4397163120567376, "acc_stderr": 0.029609912075594106, "acc_norm": 0.4397163120567376, "acc_norm_stderr": 0.029609912075594106 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.42894393741851367, "acc_stderr": 0.012640625443067354, "acc_norm": 0.42894393741851367, "acc_norm_stderr": 0.012640625443067354 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5955882352941176, "acc_stderr": 0.02981263070156974, "acc_norm": 0.5955882352941176, "acc_norm_stderr": 0.02981263070156974 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5555555555555556, "acc_stderr": 0.020102583895887188, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.020102583895887188 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6181818181818182, "acc_stderr": 0.04653429807913507, "acc_norm": 
0.6181818181818182, "acc_norm_stderr": 0.04653429807913507 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5918367346938775, "acc_stderr": 0.03146465712827424, "acc_norm": 0.5918367346938775, "acc_norm_stderr": 0.03146465712827424 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6965174129353234, "acc_stderr": 0.03251006816458618, "acc_norm": 0.6965174129353234, "acc_norm_stderr": 0.03251006816458618 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.78, "acc_stderr": 0.04163331998932264, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932264 }, "harness|hendrycksTest-virology|5": { "acc": 0.39156626506024095, "acc_stderr": 0.03799857454479637, "acc_norm": 0.39156626506024095, "acc_norm_stderr": 0.03799857454479637 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.783625730994152, "acc_stderr": 0.03158149539338734, "acc_norm": 0.783625730994152, "acc_norm_stderr": 0.03158149539338734 }, "harness|truthfulqa:mc|0": { "mc1": 0.2717258261933905, "mc1_stderr": 0.01557284045287583, "mc2": 0.39792286545162303, "mc2_stderr": 0.01419606078232786 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
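Since each run is stored under a timestamp-named split (pattern `2023_10_04T05_17_27.993942`) alongside a `latest` alias, the newest run can be recovered programmatically by parsing the split names. This is a minimal sketch, not part of the card's tooling; the helper name `latest_split` and the example split names are illustrative:

```python
from datetime import datetime

def latest_split(split_names):
    """Return the most recent timestamp-named split.

    Split names are assumed to follow the pattern seen in this dataset,
    e.g. 2023_10_04T05_17_27.993942; the 'latest' alias is skipped.
    """
    runs = [s for s in split_names if s != "latest"]
    return max(runs, key=lambda s: datetime.strptime(s, "%Y_%m_%dT%H_%M_%S.%f"))

splits = ["2023_09_30T11_02_10.123456", "2023_10_04T05_17_27.993942", "latest"]
print(latest_split(splits))  # 2023_10_04T05_17_27.993942
```

In practice the `latest` split already aliases the newest run, so this is only needed when comparing several historical runs of the same evaluation.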
Anis1123/quip
2023-10-04T05:24:30.000Z
[ "region:us" ]
Anis1123
null
null
null
0
0
Entry not found
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r8-q_k_v_o_gate_up_down
2023-10-04T05:25:31.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r8-q_k_v_o_gate_up_down dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r8-q_k_v_o_gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r8-q_k_v_o_gate_up_down)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of\ \ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r8-q_k_v_o_gate_up_down\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T05:24:08.290753](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r8-q_k_v_o_gate_up_down/blob/main/results_2023-10-04T05-24-08.290753.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5447775601567362,\n\ \ \"acc_stderr\": 0.03444398435073481,\n \"acc_norm\": 0.5490511048768253,\n\ \ \"acc_norm_stderr\": 0.03442535588146447,\n \"mc1\": 0.2778457772337821,\n\ \ \"mc1_stderr\": 0.015680929364024654,\n \"mc2\": 0.4071991704986675,\n\ \ \"mc2_stderr\": 0.014219945219138549\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5162116040955631,\n \"acc_stderr\": 0.014603708567414947,\n\ \ \"acc_norm\": 0.5597269624573379,\n \"acc_norm_stderr\": 0.01450676952480424\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.606652061342362,\n\ \ \"acc_stderr\": 0.0048749458339470775,\n \"acc_norm\": 0.8152758414658434,\n\ \ \"acc_norm_stderr\": 0.0038728051896075527\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \ \ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n\ \ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.4222222222222222,\n\ \ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5394736842105263,\n \"acc_stderr\": 0.04056242252249034,\n\ \ \"acc_norm\": 0.5394736842105263,\n \"acc_norm_stderr\": 0.04056242252249034\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\ \ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \ \ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.5849056603773585,\n \"acc_stderr\": 0.030325945789286105,\n\ \ \"acc_norm\": 0.5849056603773585,\n \"acc_norm_stderr\": 0.030325945789286105\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5625,\n\ \ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.5625,\n \ \ \"acc_norm_stderr\": 0.04148415739394154\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \ \ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\ : 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n\ \ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n\ \ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077636,\n\ \ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077636\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n\ \ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146268,\n\ \ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146268\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n\ \ \"acc_stderr\": 0.045796394220704334,\n \"acc_norm\": 0.38596491228070173,\n\ \ \"acc_norm_stderr\": 0.045796394220704334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.43448275862068964,\n \"acc_stderr\": 0.04130740879555497,\n\ \ \"acc_norm\": 0.43448275862068964,\n \"acc_norm_stderr\": 0.04130740879555497\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.32275132275132273,\n \"acc_stderr\": 0.024078943243597016,\n \"\ acc_norm\": 0.32275132275132273,\n 
\"acc_norm_stderr\": 0.024078943243597016\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\ \ \"acc_stderr\": 0.040406101782088394,\n \"acc_norm\": 0.2857142857142857,\n\ \ \"acc_norm_stderr\": 0.040406101782088394\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.6,\n \"acc_stderr\": 0.02786932057166463,\n \"acc_norm\": 0.6,\n\ \ \"acc_norm_stderr\": 0.02786932057166463\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.4088669950738916,\n \"acc_stderr\": 0.034590588158832314,\n\ \ \"acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.034590588158832314\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\ : 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\ \ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7070707070707071,\n \"acc_stderr\": 0.03242497958178816,\n \"\ acc_norm\": 0.7070707070707071,\n \"acc_norm_stderr\": 0.03242497958178816\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.02897908979429673,\n\ \ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.02897908979429673\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.5,\n \"acc_stderr\": 0.02535100632816969,\n \"acc_norm\"\ : 0.5,\n \"acc_norm_stderr\": 0.02535100632816969\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\ : {\n \"acc\": 0.26666666666666666,\n 
\"acc_stderr\": 0.026962424325073838,\n\ \ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073838\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.03196876989195778,\n \ \ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.03196876989195778\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.304635761589404,\n \"acc_stderr\": 0.037579499229433426,\n \"\ acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.037579499229433426\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7412844036697248,\n \"acc_stderr\": 0.018776052319619627,\n \"\ acc_norm\": 0.7412844036697248,\n \"acc_norm_stderr\": 0.018776052319619627\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294635,\n \"\ acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294635\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695063,\n \"\ acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695063\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \ \ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\ \ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\ \ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009224,\n\ \ \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009224\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n 
\"acc_norm\"\ : 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\ \ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n\ \ \"acc_stderr\": 0.04557239513497751,\n \"acc_norm\": 0.6666666666666666,\n\ \ \"acc_norm_stderr\": 0.04557239513497751\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.037311335196738925,\n\ \ \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.037311335196738925\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\ \ \"acc_stderr\": 0.0449394906861354,\n \"acc_norm\": 0.3392857142857143,\n\ \ \"acc_norm_stderr\": 0.0449394906861354\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729224,\n\ \ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729224\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.782051282051282,\n\ \ \"acc_stderr\": 0.02704685763071667,\n \"acc_norm\": 0.782051282051282,\n\ \ \"acc_norm_stderr\": 0.02704685763071667\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.756066411238825,\n\ \ \"acc_stderr\": 0.015357212665829463,\n \"acc_norm\": 0.756066411238825,\n\ \ \"acc_norm_stderr\": 0.015357212665829463\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.025522474632121612,\n\ \ \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.025522474632121612\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2659217877094972,\n\ \ \"acc_stderr\": 0.014776765066438893,\n \"acc_norm\": 0.2659217877094972,\n\ \ \"acc_norm_stderr\": 0.014776765066438893\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 
0.565359477124183,\n \"acc_stderr\": 0.02838425670488304,\n\ \ \"acc_norm\": 0.565359477124183,\n \"acc_norm_stderr\": 0.02838425670488304\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6302250803858521,\n\ \ \"acc_stderr\": 0.027417996705630995,\n \"acc_norm\": 0.6302250803858521,\n\ \ \"acc_norm_stderr\": 0.027417996705630995\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6141975308641975,\n \"acc_stderr\": 0.027085401226132143,\n\ \ \"acc_norm\": 0.6141975308641975,\n \"acc_norm_stderr\": 0.027085401226132143\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370597,\n \ \ \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370597\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4217731421121252,\n\ \ \"acc_stderr\": 0.01261297436939098,\n \"acc_norm\": 0.4217731421121252,\n\ \ \"acc_norm_stderr\": 0.01261297436939098\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.03032024326500413,\n\ \ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03032024326500413\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.5506535947712419,\n \"acc_stderr\": 0.02012376652802727,\n \ \ \"acc_norm\": 0.5506535947712419,\n \"acc_norm_stderr\": 0.02012376652802727\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\ \ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\ \ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.5020408163265306,\n \"acc_stderr\": 0.0320089533497105,\n\ \ \"acc_norm\": 0.5020408163265306,\n \"acc_norm_stderr\": 0.0320089533497105\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6567164179104478,\n\ \ \"acc_stderr\": 0.03357379665433431,\n \"acc_norm\": 
0.6567164179104478,\n\
\ \"acc_norm_stderr\": 0.03357379665433431\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2778457772337821,\n\
\ \"mc1_stderr\": 0.015680929364024654,\n \"mc2\": 0.4071991704986675,\n\
\ \"mc2_stderr\": 0.014219945219138549\n }\n}\n```"
repo_url: https://huggingface.co/wei123602/Llama-2-13b-FINETUNE4_TEST3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
  data_files:
  - split: 2023_10_04T05_24_08.290753
    path:
    - '**/details_harness|arc:challenge|25_2023-10-04T05-24-08.290753.parquet'
  - split: latest
    path:
    - '**/details_harness|arc:challenge|25_2023-10-04T05-24-08.290753.parquet'
- config_name: harness_hellaswag_10
  data_files:
  - split: 2023_10_04T05_24_08.290753
    path:
    - '**/details_harness|hellaswag|10_2023-10-04T05-24-08.290753.parquet'
  - split: latest
    path:
    - '**/details_harness|hellaswag|10_2023-10-04T05-24-08.290753.parquet'
- config_name: harness_hendrycksTest_5
  data_files:
  - split: 2023_10_04T05_24_08.290753
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-24-08.290753.parquet'
    - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-24-08.290753.parquet'
    - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-24-08.290753.parquet'
    - 
'**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-24-08.290753.parquet' - 
'**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-24-08.290753.parquet' - 
'**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-24-08.290753.parquet' - 
'**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-24-08.290753.parquet' - 
'**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-24-08.290753.parquet' - 
'**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-24-08.290753.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-24-08.290753.parquet' - 
config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-24-08.290753.parquet' - config_name: 
harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - 
split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-24-08.290753.parquet' - 
config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-24-08.290753.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T05_24_08.290753 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T05-24-08.290753.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T05-24-08.290753.parquet' - config_name: results data_files: - split: 2023_10_04T05_24_08.290753 path: - results_2023-10-04T05-24-08.290753.parquet - split: latest path: - results_2023-10-04T05-24-08.290753.parquet --- # Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r8-q_k_v_o_gate_up_down ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r8-q_k_v_o_gate_up_down - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r8-q_k_v_o_gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r8-q_k_v_o_gate_up_down) on the [Open LLM 
Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r8-q_k_v_o_gate_up_down",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-04T05:24:08.290753](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r8-q_k_v_o_gate_up_down/blob/main/results_2023-10-04T05-24-08.290753.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5447775601567362, "acc_stderr": 0.03444398435073481, "acc_norm": 0.5490511048768253, "acc_norm_stderr": 0.03442535588146447, "mc1": 0.2778457772337821, "mc1_stderr": 0.015680929364024654, "mc2": 0.4071991704986675, "mc2_stderr": 0.014219945219138549 }, "harness|arc:challenge|25": { "acc": 0.5162116040955631, "acc_stderr": 0.014603708567414947, "acc_norm": 0.5597269624573379, "acc_norm_stderr": 0.01450676952480424 }, "harness|hellaswag|10": { "acc": 0.606652061342362, "acc_stderr": 0.0048749458339470775, "acc_norm": 0.8152758414658434, "acc_norm_stderr": 0.0038728051896075527 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4222222222222222, "acc_stderr": 0.04266763404099582, "acc_norm": 0.4222222222222222, "acc_norm_stderr": 0.04266763404099582 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5394736842105263, "acc_stderr": 0.04056242252249034, "acc_norm": 0.5394736842105263, "acc_norm_stderr": 0.04056242252249034 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5849056603773585, "acc_stderr": 0.030325945789286105, "acc_norm": 0.5849056603773585, "acc_norm_stderr": 0.030325945789286105 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5625, "acc_stderr": 0.04148415739394154, "acc_norm": 0.5625, "acc_norm_stderr": 0.04148415739394154 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 
}, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5144508670520231, "acc_stderr": 0.03810871630454764, "acc_norm": 0.5144508670520231, "acc_norm_stderr": 0.03810871630454764 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.30392156862745096, "acc_stderr": 0.045766654032077636, "acc_norm": 0.30392156862745096, "acc_norm_stderr": 0.045766654032077636 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.73, "acc_stderr": 0.044619604333847394, "acc_norm": 0.73, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.42127659574468085, "acc_stderr": 0.03227834510146268, "acc_norm": 0.42127659574468085, "acc_norm_stderr": 0.03227834510146268 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.38596491228070173, "acc_stderr": 0.045796394220704334, "acc_norm": 0.38596491228070173, "acc_norm_stderr": 0.045796394220704334 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.43448275862068964, "acc_stderr": 0.04130740879555497, "acc_norm": 0.43448275862068964, "acc_norm_stderr": 0.04130740879555497 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.32275132275132273, "acc_stderr": 0.024078943243597016, "acc_norm": 0.32275132275132273, "acc_norm_stderr": 0.024078943243597016 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2857142857142857, "acc_stderr": 0.040406101782088394, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.040406101782088394 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.34, "acc_stderr": 0.047609522856952365, "acc_norm": 0.34, "acc_norm_stderr": 0.047609522856952365 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6, "acc_stderr": 0.02786932057166463, "acc_norm": 0.6, "acc_norm_stderr": 0.02786932057166463 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4088669950738916, 
"acc_stderr": 0.034590588158832314, "acc_norm": 0.4088669950738916, "acc_norm_stderr": 0.034590588158832314 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.56, "acc_stderr": 0.049888765156985884, "acc_norm": 0.56, "acc_norm_stderr": 0.049888765156985884 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6848484848484848, "acc_stderr": 0.0362773057502241, "acc_norm": 0.6848484848484848, "acc_norm_stderr": 0.0362773057502241 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7070707070707071, "acc_stderr": 0.03242497958178816, "acc_norm": 0.7070707070707071, "acc_norm_stderr": 0.03242497958178816 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7979274611398963, "acc_stderr": 0.02897908979429673, "acc_norm": 0.7979274611398963, "acc_norm_stderr": 0.02897908979429673 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5, "acc_stderr": 0.02535100632816969, "acc_norm": 0.5, "acc_norm_stderr": 0.02535100632816969 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.26666666666666666, "acc_stderr": 0.026962424325073838, "acc_norm": 0.26666666666666666, "acc_norm_stderr": 0.026962424325073838 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5882352941176471, "acc_stderr": 0.03196876989195778, "acc_norm": 0.5882352941176471, "acc_norm_stderr": 0.03196876989195778 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.304635761589404, "acc_stderr": 0.037579499229433426, "acc_norm": 0.304635761589404, "acc_norm_stderr": 0.037579499229433426 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7412844036697248, "acc_stderr": 0.018776052319619627, "acc_norm": 0.7412844036697248, "acc_norm_stderr": 0.018776052319619627 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4583333333333333, "acc_stderr": 0.03398110890294635, "acc_norm": 0.4583333333333333, "acc_norm_stderr": 0.03398110890294635 }, 
"harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7696078431372549, "acc_stderr": 0.029554292605695063, "acc_norm": 0.7696078431372549, "acc_norm_stderr": 0.029554292605695063 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7890295358649789, "acc_stderr": 0.02655837250266192, "acc_norm": 0.7890295358649789, "acc_norm_stderr": 0.02655837250266192 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6771300448430493, "acc_stderr": 0.03138147637575499, "acc_norm": 0.6771300448430493, "acc_norm_stderr": 0.03138147637575499 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6030534351145038, "acc_stderr": 0.04291135671009224, "acc_norm": 0.6030534351145038, "acc_norm_stderr": 0.04291135671009224 }, "harness|hendrycksTest-international_law|5": { "acc": 0.743801652892562, "acc_stderr": 0.03984979653302872, "acc_norm": 0.743801652892562, "acc_norm_stderr": 0.03984979653302872 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6666666666666666, "acc_stderr": 0.04557239513497751, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.04557239513497751 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.656441717791411, "acc_stderr": 0.037311335196738925, "acc_norm": 0.656441717791411, "acc_norm_stderr": 0.037311335196738925 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3392857142857143, "acc_stderr": 0.0449394906861354, "acc_norm": 0.3392857142857143, "acc_norm_stderr": 0.0449394906861354 }, "harness|hendrycksTest-management|5": { "acc": 0.7087378640776699, "acc_stderr": 0.044986763205729224, "acc_norm": 0.7087378640776699, "acc_norm_stderr": 0.044986763205729224 }, "harness|hendrycksTest-marketing|5": { "acc": 0.782051282051282, "acc_stderr": 0.02704685763071667, "acc_norm": 0.782051282051282, "acc_norm_stderr": 0.02704685763071667 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.68, "acc_stderr": 0.046882617226215034, "acc_norm": 0.68, "acc_norm_stderr": 0.046882617226215034 }, 
"harness|hendrycksTest-miscellaneous|5": { "acc": 0.756066411238825, "acc_stderr": 0.015357212665829463, "acc_norm": 0.756066411238825, "acc_norm_stderr": 0.015357212665829463 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6589595375722543, "acc_stderr": 0.025522474632121612, "acc_norm": 0.6589595375722543, "acc_norm_stderr": 0.025522474632121612 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2659217877094972, "acc_stderr": 0.014776765066438893, "acc_norm": 0.2659217877094972, "acc_norm_stderr": 0.014776765066438893 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.565359477124183, "acc_stderr": 0.02838425670488304, "acc_norm": 0.565359477124183, "acc_norm_stderr": 0.02838425670488304 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6302250803858521, "acc_stderr": 0.027417996705630995, "acc_norm": 0.6302250803858521, "acc_norm_stderr": 0.027417996705630995 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6141975308641975, "acc_stderr": 0.027085401226132143, "acc_norm": 0.6141975308641975, "acc_norm_stderr": 0.027085401226132143 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4219858156028369, "acc_stderr": 0.029462189233370597, "acc_norm": 0.4219858156028369, "acc_norm_stderr": 0.029462189233370597 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4217731421121252, "acc_stderr": 0.01261297436939098, "acc_norm": 0.4217731421121252, "acc_norm_stderr": 0.01261297436939098 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5294117647058824, "acc_stderr": 0.03032024326500413, "acc_norm": 0.5294117647058824, "acc_norm_stderr": 0.03032024326500413 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5506535947712419, "acc_stderr": 0.02012376652802727, "acc_norm": 0.5506535947712419, "acc_norm_stderr": 0.02012376652802727 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6545454545454545, "acc_stderr": 0.04554619617541054, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 
0.04554619617541054 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5020408163265306, "acc_stderr": 0.0320089533497105, "acc_norm": 0.5020408163265306, "acc_norm_stderr": 0.0320089533497105 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6567164179104478, "acc_stderr": 0.03357379665433431, "acc_norm": 0.6567164179104478, "acc_norm_stderr": 0.03357379665433431 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.76, "acc_stderr": 0.042923469599092816, "acc_norm": 0.76, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-virology|5": { "acc": 0.43373493975903615, "acc_stderr": 0.03858158940685517, "acc_norm": 0.43373493975903615, "acc_norm_stderr": 0.03858158940685517 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7660818713450293, "acc_stderr": 0.03246721765117826, "acc_norm": 0.7660818713450293, "acc_norm_stderr": 0.03246721765117826 }, "harness|truthfulqa:mc|0": { "mc1": 0.2778457772337821, "mc1_stderr": 0.015680929364024654, "mc2": 0.4071991704986675, "mc2_stderr": 0.014219945219138549 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
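The `load_dataset` call in the card above targets a single configuration by its name. As a small sketch (the naming pattern is inferred from the configs listed in this card, e.g. `harness_hendrycksTest_world_religions_5` and `harness_truthfulqa_mc_0`; it is not an official API), a helper can build those config names programmatically:

```python
def harness_config_name(task: str, num_fewshot: int) -> str:
    """Build a leaderboard-details config name.

    Pattern inferred from the configs listed in the card above, e.g.
    harness_config_name("hendrycksTest_world_religions", 5)
    -> "harness_hendrycksTest_world_religions_5".
    """
    return f"harness_{task}_{num_fewshot}"


# Hypothetical usage against the repo named in this card (requires network access),
# mirroring the card's own example which loads split="train":
# from datasets import load_dataset
# data = load_dataset(
#     "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r8-q_k_v_o_gate_up_down",
#     harness_config_name("hendrycksTest_world_religions", 5),
#     split="train",
# )
```

This only concatenates strings in the format the card's `configs` section uses; the actual set of valid names is whatever the repo's YAML declares.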
Syma25/mydata1
2023-10-04T05:42:40.000Z
[ "license:other", "region:us" ]
Syma25
null
null
null
0
0
---
license: other
license_name: other
license_link: LICENSE
---
open-llm-leaderboard/details_Enno-Ai__ennodata-raw-pankajmathur-13b-peft
2023-10-04T05:27:42.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of Enno-Ai/ennodata-raw-pankajmathur-13b-peft dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Enno-Ai/ennodata-raw-pankajmathur-13b-peft](https://huggingface.co/Enno-Ai/ennodata-raw-pankajmathur-13b-peft)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Enno-Ai__ennodata-raw-pankajmathur-13b-peft\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T05:26:18.448610](https://huggingface.co/datasets/open-llm-leaderboard/details_Enno-Ai__ennodata-raw-pankajmathur-13b-peft/blob/main/results_2023-10-04T05-26-18.448610.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5754002485958071,\n\ \ \"acc_stderr\": 0.03442932936272782,\n \"acc_norm\": 0.5794061950673782,\n\ \ \"acc_norm_stderr\": 0.03440825787558648,\n \"mc1\": 0.379436964504284,\n\ \ \"mc1_stderr\": 0.01698703926614298,\n \"mc2\": 0.5357247188867356,\n\ \ \"mc2_stderr\": 0.015675780170595004\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5827645051194539,\n \"acc_stderr\": 0.014409825518403077,\n\ \ \"acc_norm\": 0.6194539249146758,\n \"acc_norm_stderr\": 0.014188277712349812\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6224855606452898,\n\ \ \"acc_stderr\": 0.004837744647345718,\n \"acc_norm\": 0.8221469826727743,\n\ \ \"acc_norm_stderr\": 0.0038160747120605343\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \ \ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\ \ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n\ \ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.04046336883978251,\n\ \ \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.04046336883978251\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\ \ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \ \ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n\ \ \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.625,\n\ \ \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.625,\n \ \ \"acc_norm_stderr\": 0.04048439222695598\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \ \ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\ : 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \ \ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n\ \ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.5317919075144508,\n\ \ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\ \ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\ \ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.49361702127659574,\n \"acc_stderr\": 0.032683358999363366,\n\ \ \"acc_norm\": 0.49361702127659574,\n \"acc_norm_stderr\": 0.032683358999363366\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\ \ \"acc_stderr\": 0.04404556157374768,\n \"acc_norm\": 0.32456140350877194,\n\ \ \"acc_norm_stderr\": 0.04404556157374768\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\ \ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.2962962962962963,\n \"acc_stderr\": 0.023517294335963286,\n \"\ acc_norm\": 0.2962962962962963,\n 
\"acc_norm_stderr\": 0.023517294335963286\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\ \ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\ \ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\ \ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6225806451612903,\n\ \ \"acc_stderr\": 0.027575960723278243,\n \"acc_norm\": 0.6225806451612903,\n\ \ \"acc_norm_stderr\": 0.027575960723278243\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.03499113137676744,\n\ \ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.03499113137676744\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\ : 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624335,\n\ \ \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624335\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124488,\n \"\ acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124488\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.02717121368316453,\n\ \ \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.02717121368316453\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6025641025641025,\n \"acc_stderr\": 0.024811920017903836,\n\ \ \"acc_norm\": 0.6025641025641025,\n \"acc_norm_stderr\": 0.024811920017903836\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871927,\n \ \ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871927\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6050420168067226,\n \"acc_stderr\": 0.03175367846096624,\n \ \ \"acc_norm\": 0.6050420168067226,\n \"acc_norm_stderr\": 0.03175367846096624\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\ acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7871559633027523,\n \"acc_stderr\": 0.017549376389313694,\n \"\ acc_norm\": 0.7871559633027523,\n \"acc_norm_stderr\": 0.017549376389313694\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\ acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849323,\n \"\ acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849323\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \ \ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\ \ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\ \ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n\ \ \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.6942148760330579,\n \"acc_stderr\": 0.04205953933884123,\n \"\ acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.04205953933884123\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\ \ \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n\ \ \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.035590395316173425,\n\ \ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.035590395316173425\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\ \ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\ \ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.04541609446503948,\n\ \ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.04541609446503948\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8247863247863247,\n\ \ \"acc_stderr\": 0.02490443909891824,\n \"acc_norm\": 0.8247863247863247,\n\ \ \"acc_norm_stderr\": 0.02490443909891824\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \ \ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7701149425287356,\n\ \ \"acc_stderr\": 0.01504630184669181,\n \"acc_norm\": 0.7701149425287356,\n\ \ \"acc_norm_stderr\": 0.01504630184669181\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.0258167567915842,\n\ \ \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.0258167567915842\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4681564245810056,\n\ \ \"acc_stderr\": 0.01668855341561221,\n \"acc_norm\": 
0.4681564245810056,\n\ \ \"acc_norm_stderr\": 0.01668855341561221\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027914055510467998,\n\ \ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027914055510467998\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6591639871382636,\n\ \ \"acc_stderr\": 0.026920841260776162,\n \"acc_norm\": 0.6591639871382636,\n\ \ \"acc_norm_stderr\": 0.026920841260776162\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.026229649178821163,\n\ \ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.026229649178821163\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370597,\n \ \ \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370597\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44784876140808344,\n\ \ \"acc_stderr\": 0.012700582404768223,\n \"acc_norm\": 0.44784876140808344,\n\ \ \"acc_norm_stderr\": 0.012700582404768223\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5625,\n \"acc_stderr\": 0.030134614954403924,\n \ \ \"acc_norm\": 0.5625,\n \"acc_norm_stderr\": 0.030134614954403924\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.5980392156862745,\n \"acc_stderr\": 0.019835176484375387,\n \ \ \"acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.019835176484375387\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\ \ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\ \ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.5836734693877551,\n \"acc_stderr\": 0.031557828165561644,\n\ \ \"acc_norm\": 0.5836734693877551,\n \"acc_norm_stderr\": 0.031557828165561644\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6368159203980099,\n\ \ \"acc_stderr\": 0.034005985055990146,\n \"acc_norm\": 0.6368159203980099,\n\ \ \"acc_norm_stderr\": 0.034005985055990146\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \ \ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\ \ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\ \ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n\ \ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.379436964504284,\n\ \ \"mc1_stderr\": 0.01698703926614298,\n \"mc2\": 0.5357247188867356,\n\ \ \"mc2_stderr\": 0.015675780170595004\n }\n}\n```" repo_url: https://huggingface.co/Enno-Ai/ennodata-raw-pankajmathur-13b-peft leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|arc:challenge|25_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hellaswag|10_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-26-18.448610.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-26-18.448610.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-26-18.448610.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-26-18.448610.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-26-18.448610.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-26-18.448610.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-26-18.448610.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-26-18.448610.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-26-18.448610.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T05_26_18.448610 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T05-26-18.448610.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T05-26-18.448610.parquet' - config_name: results data_files: - split: 2023_10_04T05_26_18.448610 path: - results_2023-10-04T05-26-18.448610.parquet - split: latest path: - results_2023-10-04T05-26-18.448610.parquet --- # Dataset Card for Evaluation run of Enno-Ai/ennodata-raw-pankajmathur-13b-peft ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Enno-Ai/ennodata-raw-pankajmathur-13b-peft - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[Enno-Ai/ennodata-raw-pankajmathur-13b-peft](https://huggingface.co/Enno-Ai/ennodata-raw-pankajmathur-13b-peft)
on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Enno-Ai__ennodata-raw-pankajmathur-13b-peft",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-04T05:26:18.448610](https://huggingface.co/datasets/open-llm-leaderboard/details_Enno-Ai__ennodata-raw-pankajmathur-13b-peft/blob/main/results_2023-10-04T05-26-18.448610.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5754002485958071, "acc_stderr": 0.03442932936272782, "acc_norm": 0.5794061950673782, "acc_norm_stderr": 0.03440825787558648, "mc1": 0.379436964504284, "mc1_stderr": 0.01698703926614298, "mc2": 0.5357247188867356, "mc2_stderr": 0.015675780170595004 }, "harness|arc:challenge|25": { "acc": 0.5827645051194539, "acc_stderr": 0.014409825518403077, "acc_norm": 0.6194539249146758, "acc_norm_stderr": 0.014188277712349812 }, "harness|hellaswag|10": { "acc": 0.6224855606452898, "acc_stderr": 0.004837744647345718, "acc_norm": 0.8221469826727743, "acc_norm_stderr": 0.0038160747120605343 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.28, "acc_stderr": 0.04512608598542129, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542129 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5037037037037037, "acc_stderr": 0.04319223625811331, "acc_norm": 0.5037037037037037, "acc_norm_stderr": 0.04319223625811331 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5526315789473685, "acc_stderr": 0.04046336883978251, "acc_norm": 0.5526315789473685, "acc_norm_stderr": 0.04046336883978251 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6037735849056604, "acc_stderr": 0.030102793781791197, "acc_norm": 0.6037735849056604, "acc_norm_stderr": 0.030102793781791197 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.625, "acc_stderr": 0.04048439222695598, "acc_norm": 0.625, "acc_norm_stderr": 0.04048439222695598 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5317919075144508, "acc_stderr": 0.03804749744364764, "acc_norm": 0.5317919075144508, "acc_norm_stderr": 0.03804749744364764 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.37254901960784315, "acc_stderr": 0.04810840148082635, "acc_norm": 0.37254901960784315, "acc_norm_stderr": 0.04810840148082635 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.66, "acc_stderr": 0.04760952285695237, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695237 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.49361702127659574, "acc_stderr": 0.032683358999363366, "acc_norm": 0.49361702127659574, "acc_norm_stderr": 0.032683358999363366 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.32456140350877194, "acc_stderr": 0.04404556157374768, "acc_norm": 0.32456140350877194, "acc_norm_stderr": 0.04404556157374768 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.496551724137931, "acc_stderr": 0.041665675771015785, "acc_norm": 0.496551724137931, "acc_norm_stderr": 0.041665675771015785 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2962962962962963, "acc_stderr": 0.023517294335963286, "acc_norm": 0.2962962962962963, "acc_norm_stderr": 0.023517294335963286 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4523809523809524, "acc_stderr": 0.044518079590553275, "acc_norm": 0.4523809523809524, "acc_norm_stderr": 0.044518079590553275 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6225806451612903, "acc_stderr": 0.027575960723278243, "acc_norm": 0.6225806451612903, "acc_norm_stderr": 0.027575960723278243 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 
0.4482758620689655, "acc_stderr": 0.03499113137676744, "acc_norm": 0.4482758620689655, "acc_norm_stderr": 0.03499113137676744 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.61, "acc_stderr": 0.04902071300001975, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7090909090909091, "acc_stderr": 0.03546563019624335, "acc_norm": 0.7090909090909091, "acc_norm_stderr": 0.03546563019624335 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7525252525252525, "acc_stderr": 0.030746300742124488, "acc_norm": 0.7525252525252525, "acc_norm_stderr": 0.030746300742124488 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8290155440414507, "acc_stderr": 0.02717121368316453, "acc_norm": 0.8290155440414507, "acc_norm_stderr": 0.02717121368316453 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6025641025641025, "acc_stderr": 0.024811920017903836, "acc_norm": 0.6025641025641025, "acc_norm_stderr": 0.024811920017903836 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2962962962962963, "acc_stderr": 0.027840811495871927, "acc_norm": 0.2962962962962963, "acc_norm_stderr": 0.027840811495871927 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6050420168067226, "acc_stderr": 0.03175367846096624, "acc_norm": 0.6050420168067226, "acc_norm_stderr": 0.03175367846096624 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3509933774834437, "acc_stderr": 0.03896981964257375, "acc_norm": 0.3509933774834437, "acc_norm_stderr": 0.03896981964257375 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7871559633027523, "acc_stderr": 0.017549376389313694, "acc_norm": 0.7871559633027523, "acc_norm_stderr": 0.017549376389313694 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.49537037037037035, "acc_stderr": 0.03409825519163572, "acc_norm": 0.49537037037037035, "acc_norm_stderr": 
0.03409825519163572 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7843137254901961, "acc_stderr": 0.028867431449849323, "acc_norm": 0.7843137254901961, "acc_norm_stderr": 0.028867431449849323 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7890295358649789, "acc_stderr": 0.02655837250266192, "acc_norm": 0.7890295358649789, "acc_norm_stderr": 0.02655837250266192 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6547085201793722, "acc_stderr": 0.03191100192835794, "acc_norm": 0.6547085201793722, "acc_norm_stderr": 0.03191100192835794 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6106870229007634, "acc_stderr": 0.04276486542814591, "acc_norm": 0.6106870229007634, "acc_norm_stderr": 0.04276486542814591 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6942148760330579, "acc_stderr": 0.04205953933884123, "acc_norm": 0.6942148760330579, "acc_norm_stderr": 0.04205953933884123 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6944444444444444, "acc_stderr": 0.044531975073749834, "acc_norm": 0.6944444444444444, "acc_norm_stderr": 0.044531975073749834 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7116564417177914, "acc_stderr": 0.035590395316173425, "acc_norm": 0.7116564417177914, "acc_norm_stderr": 0.035590395316173425 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4107142857142857, "acc_stderr": 0.04669510663875191, "acc_norm": 0.4107142857142857, "acc_norm_stderr": 0.04669510663875191 }, "harness|hendrycksTest-management|5": { "acc": 0.6990291262135923, "acc_stderr": 0.04541609446503948, "acc_norm": 0.6990291262135923, "acc_norm_stderr": 0.04541609446503948 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8247863247863247, "acc_stderr": 0.02490443909891824, "acc_norm": 0.8247863247863247, "acc_norm_stderr": 0.02490443909891824 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, 
"harness|hendrycksTest-miscellaneous|5": { "acc": 0.7701149425287356, "acc_stderr": 0.01504630184669181, "acc_norm": 0.7701149425287356, "acc_norm_stderr": 0.01504630184669181 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6416184971098265, "acc_stderr": 0.0258167567915842, "acc_norm": 0.6416184971098265, "acc_norm_stderr": 0.0258167567915842 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4681564245810056, "acc_stderr": 0.01668855341561221, "acc_norm": 0.4681564245810056, "acc_norm_stderr": 0.01668855341561221 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6111111111111112, "acc_stderr": 0.027914055510467998, "acc_norm": 0.6111111111111112, "acc_norm_stderr": 0.027914055510467998 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6591639871382636, "acc_stderr": 0.026920841260776162, "acc_norm": 0.6591639871382636, "acc_norm_stderr": 0.026920841260776162 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6666666666666666, "acc_stderr": 0.026229649178821163, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.026229649178821163 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4219858156028369, "acc_stderr": 0.029462189233370597, "acc_norm": 0.4219858156028369, "acc_norm_stderr": 0.029462189233370597 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.44784876140808344, "acc_stderr": 0.012700582404768223, "acc_norm": 0.44784876140808344, "acc_norm_stderr": 0.012700582404768223 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5625, "acc_stderr": 0.030134614954403924, "acc_norm": 0.5625, "acc_norm_stderr": 0.030134614954403924 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5980392156862745, "acc_stderr": 0.019835176484375387, "acc_norm": 0.5980392156862745, "acc_norm_stderr": 0.019835176484375387 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6909090909090909, "acc_stderr": 0.044262946482000985, "acc_norm": 0.6909090909090909, "acc_norm_stderr": 0.044262946482000985 }, 
"harness|hendrycksTest-security_studies|5": { "acc": 0.5836734693877551, "acc_stderr": 0.031557828165561644, "acc_norm": 0.5836734693877551, "acc_norm_stderr": 0.031557828165561644 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6368159203980099, "acc_stderr": 0.034005985055990146, "acc_norm": 0.6368159203980099, "acc_norm_stderr": 0.034005985055990146 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.81, "acc_stderr": 0.03942772444036625, "acc_norm": 0.81, "acc_norm_stderr": 0.03942772444036625 }, "harness|hendrycksTest-virology|5": { "acc": 0.4457831325301205, "acc_stderr": 0.03869543323472101, "acc_norm": 0.4457831325301205, "acc_norm_stderr": 0.03869543323472101 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.783625730994152, "acc_stderr": 0.031581495393387324, "acc_norm": 0.783625730994152, "acc_norm_stderr": 0.031581495393387324 }, "harness|truthfulqa:mc|0": { "mc1": 0.379436964504284, "mc1_stderr": 0.01698703926614298, "mc2": 0.5357247188867356, "mc2_stderr": 0.015675780170595004 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
atom-in-the-universe/bild-96b0d3ff-8b2e-4e65-af90-c2a4e323413b
2023-10-04T05:42:59.000Z
[ "region:us" ]
atom-in-the-universe
null
null
null
0
0
Entry not found
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r16-q_k_v_o_gate_up_down
2023-10-04T05:32:46.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r16-q_k_v_o_gate_up_down dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r16-q_k_v_o_gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r16-q_k_v_o_gate_up_down)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r16-q_k_v_o_gate_up_down\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T05:31:22.658790](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r16-q_k_v_o_gate_up_down/blob/main/results_2023-10-04T05-31-22.658790.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5503306329316275,\n\ \ \"acc_stderr\": 0.03423836603356284,\n \"acc_norm\": 0.5545897159174654,\n\ \ \"acc_norm_stderr\": 0.03421836435271567,\n \"mc1\": 0.2827417380660955,\n\ \ \"mc1_stderr\": 0.015764770836777308,\n \"mc2\": 0.4283014811770436,\n\ \ \"mc2_stderr\": 0.0143306614735904\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5494880546075085,\n \"acc_stderr\": 0.014539646098471625,\n\ \ \"acc_norm\": 0.5921501706484642,\n \"acc_norm_stderr\": 0.014361097288449701\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6065524795857399,\n\ \ \"acc_stderr\": 0.004875162699121655,\n \"acc_norm\": 0.8151762597092213,\n\ \ \"acc_norm_stderr\": 0.0038736123391606564\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\ \ \"acc_stderr\": 0.04304979692464243,\n \"acc_norm\": 0.45925925925925926,\n\ \ \"acc_norm_stderr\": 0.04304979692464243\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5592105263157895,\n \"acc_stderr\": 0.04040311062490436,\n\ \ \"acc_norm\": 0.5592105263157895,\n \"acc_norm_stderr\": 0.04040311062490436\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\ \ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \ \ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.5924528301886792,\n \"acc_stderr\": 0.030242233800854494,\n\ \ \"acc_norm\": 0.5924528301886792,\n \"acc_norm_stderr\": 0.030242233800854494\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5763888888888888,\n\ \ \"acc_stderr\": 0.04132125019723369,\n \"acc_norm\": 0.5763888888888888,\n\ \ \"acc_norm_stderr\": 0.04132125019723369\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \ \ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n\ \ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5202312138728323,\n\ \ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.5202312138728323,\n\ \ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n\ \ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\ \ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146268,\n\ \ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146268\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\ \ \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n\ \ \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n\ \ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.31746031746031744,\n \"acc_stderr\": 0.02397386199899208,\n \"\ acc_norm\": 0.31746031746031744,\n 
\"acc_norm_stderr\": 0.02397386199899208\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n\ \ \"acc_stderr\": 0.04104947269903394,\n \"acc_norm\": 0.30158730158730157,\n\ \ \"acc_norm_stderr\": 0.04104947269903394\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \ \ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6806451612903226,\n\ \ \"acc_stderr\": 0.02652270967466776,\n \"acc_norm\": 0.6806451612903226,\n\ \ \"acc_norm_stderr\": 0.02652270967466776\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n\ \ \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\ : 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.03646204963253811,\n\ \ \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.03646204963253811\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\"\ : 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n\ \ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \ \ \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n\ \ \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.5282051282051282,\n \"acc_stderr\": 0.025310639254933886,\n\ \ \"acc_norm\": 0.5282051282051282,\n \"acc_norm_stderr\": 0.025310639254933886\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871923,\n \ \ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871923\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6260504201680672,\n \"acc_stderr\": 0.03142946637883708,\n \ \ \"acc_norm\": 0.6260504201680672,\n \"acc_norm_stderr\": 0.03142946637883708\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\ : 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\ \ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7559633027522936,\n\ \ \"acc_stderr\": 0.018415286351416402,\n \"acc_norm\": 0.7559633027522936,\n\ \ \"acc_norm_stderr\": 0.018415286351416402\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\ : {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.034006036255382704,\n\ \ \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.034006036255382704\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"\ acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7215189873417721,\n \"acc_stderr\": 0.029178682304842548,\n \ \ \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.029178682304842548\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\ \ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\ \ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\ \ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516304,\n \"\ acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516304\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\ \ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\ \ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.6073619631901841,\n \"acc_stderr\": 0.03836740907831027,\n\ \ \"acc_norm\": 0.6073619631901841,\n \"acc_norm_stderr\": 0.03836740907831027\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\ \ \"acc_stderr\": 0.04287858751340456,\n \"acc_norm\": 0.2857142857142857,\n\ \ \"acc_norm_stderr\": 0.04287858751340456\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690879,\n\ \ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690879\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n\ \ \"acc_stderr\": 0.025819233256483717,\n \"acc_norm\": 0.8076923076923077,\n\ \ \"acc_norm_stderr\": 0.025819233256483717\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \ \ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7611749680715197,\n\ \ \"acc_stderr\": 0.015246803197398684,\n \"acc_norm\": 0.7611749680715197,\n\ \ \"acc_norm_stderr\": 0.015246803197398684\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6502890173410405,\n \"acc_stderr\": 0.025674281456531025,\n\ \ \"acc_norm\": 0.6502890173410405,\n \"acc_norm_stderr\": 0.025674281456531025\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2346368715083799,\n\ \ \"acc_stderr\": 0.014173044098303675,\n \"acc_norm\": 
0.2346368715083799,\n\ \ \"acc_norm_stderr\": 0.014173044098303675\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.6013071895424836,\n \"acc_stderr\": 0.028036092273891776,\n\ \ \"acc_norm\": 0.6013071895424836,\n \"acc_norm_stderr\": 0.028036092273891776\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6655948553054662,\n\ \ \"acc_stderr\": 0.026795422327893937,\n \"acc_norm\": 0.6655948553054662,\n\ \ \"acc_norm_stderr\": 0.026795422327893937\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6234567901234568,\n \"acc_stderr\": 0.026959344518747784,\n\ \ \"acc_norm\": 0.6234567901234568,\n \"acc_norm_stderr\": 0.026959344518747784\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666904,\n \ \ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666904\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41134289439374183,\n\ \ \"acc_stderr\": 0.012567882673803682,\n \"acc_norm\": 0.41134289439374183,\n\ \ \"acc_norm_stderr\": 0.012567882673803682\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5551470588235294,\n \"acc_stderr\": 0.030187532060329387,\n\ \ \"acc_norm\": 0.5551470588235294,\n \"acc_norm_stderr\": 0.030187532060329387\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.553921568627451,\n \"acc_stderr\": 0.020109864547181354,\n \ \ \"acc_norm\": 0.553921568627451,\n \"acc_norm_stderr\": 0.020109864547181354\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\ \ \"acc_stderr\": 0.04673752333670239,\n \"acc_norm\": 0.6090909090909091,\n\ \ \"acc_norm_stderr\": 0.04673752333670239\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6163265306122448,\n \"acc_stderr\": 0.031130880396235943,\n\ \ \"acc_norm\": 0.6163265306122448,\n \"acc_norm_stderr\": 0.031130880396235943\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n\ \ \"acc_stderr\": 0.030769444967296018,\n \"acc_norm\": 0.746268656716418,\n\ \ \"acc_norm_stderr\": 0.030769444967296018\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \ \ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\ \ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n\ \ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.03218093795602357,\n\ \ \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.03218093795602357\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2827417380660955,\n\ \ \"mc1_stderr\": 0.015764770836777308,\n \"mc2\": 0.4283014811770436,\n\ \ \"mc2_stderr\": 0.0143306614735904\n }\n}\n```" repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r16-q_k_v_o_gate_up_down leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|arc:challenge|25_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hellaswag|10_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-31-22.658790.parquet' - 
'**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-31-22.658790.parquet' - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-31-22.658790.parquet' - 
'**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-31-22.658790.parquet' 
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-31-22.658790.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-31-22.658790.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-31-22.658790.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-31-22.658790.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-31-22.658790.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T05_31_22.658790 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T05-31-22.658790.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T05-31-22.658790.parquet' - config_name: results data_files: - split: 2023_10_04T05_31_22.658790 path: - results_2023-10-04T05-31-22.658790.parquet - split: latest path: - results_2023-10-04T05-31-22.658790.parquet
---

# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r16-q_k_v_o_gate_up_down

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r16-q_k_v_o_gate_up_down
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during
the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r16-q_k_v_o_gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r16-q_k_v_o_gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r16-q_k_v_o_gate_up_down",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-04T05:31:22.658790](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r16-q_k_v_o_gate_up_down/blob/main/results_2023-10-04T05-31-22.658790.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
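Each timestamped split is named after its run in the pattern `YYYY_MM_DDTHH_MM_SS.microseconds` (e.g. `2023_10_04T05_31_22.658790`), and `latest` mirrors the newest run. As a minimal sketch, assuming only that naming pattern (the helper `pick_latest` below is illustrative and not part of the dataset tooling), the newest timestamped split can be resolved locally:

```python
from datetime import datetime

def pick_latest(split_names):
    """Return the split name with the newest run timestamp.

    Assumes names follow the pattern used in this dataset's splits:
    YYYY_MM_DDTHH_MM_SS.microseconds, e.g. '2023_10_04T05_31_22.658790'.
    """
    def parse(name):
        return datetime.strptime(name, "%Y_%m_%dT%H_%M_%S.%f")
    # max() with a datetime key picks the chronologically newest name
    return max(split_names, key=parse)

print(pick_latest(["2023_09_30T11_05_00.000000", "2023_10_04T05_31_22.658790"]))
# -> 2023_10_04T05_31_22.658790
```

With a single run, as here, the newest split and `latest` contain the same data.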
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5503306329316275, "acc_stderr": 0.03423836603356284, "acc_norm": 0.5545897159174654, "acc_norm_stderr": 0.03421836435271567, "mc1": 0.2827417380660955, "mc1_stderr": 0.015764770836777308, "mc2": 0.4283014811770436, "mc2_stderr": 0.0143306614735904 }, "harness|arc:challenge|25": { "acc": 0.5494880546075085, "acc_stderr": 0.014539646098471625, "acc_norm": 0.5921501706484642, "acc_norm_stderr": 0.014361097288449701 }, "harness|hellaswag|10": { "acc": 0.6065524795857399, "acc_stderr": 0.004875162699121655, "acc_norm": 0.8151762597092213, "acc_norm_stderr": 0.0038736123391606564 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.45925925925925926, "acc_stderr": 0.04304979692464243, "acc_norm": 0.45925925925925926, "acc_norm_stderr": 0.04304979692464243 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5592105263157895, "acc_stderr": 0.04040311062490436, "acc_norm": 0.5592105263157895, "acc_norm_stderr": 0.04040311062490436 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.49, "acc_stderr": 0.05024183937956912, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5924528301886792, "acc_stderr": 0.030242233800854494, "acc_norm": 0.5924528301886792, "acc_norm_stderr": 0.030242233800854494 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5763888888888888, "acc_stderr": 0.04132125019723369, "acc_norm": 0.5763888888888888, "acc_norm_stderr": 0.04132125019723369 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.42, "acc_stderr": 0.04960449637488584, "acc_norm": 0.42, "acc_norm_stderr": 0.04960449637488584 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.43, "acc_stderr": 0.04975698519562428, "acc_norm": 0.43, "acc_norm_stderr": 
0.04975698519562428 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5202312138728323, "acc_stderr": 0.03809342081273957, "acc_norm": 0.5202312138728323, "acc_norm_stderr": 0.03809342081273957 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.23529411764705882, "acc_stderr": 0.04220773659171452, "acc_norm": 0.23529411764705882, "acc_norm_stderr": 0.04220773659171452 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.42127659574468085, "acc_stderr": 0.03227834510146268, "acc_norm": 0.42127659574468085, "acc_norm_stderr": 0.03227834510146268 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2807017543859649, "acc_stderr": 0.042270544512322, "acc_norm": 0.2807017543859649, "acc_norm_stderr": 0.042270544512322 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.45517241379310347, "acc_stderr": 0.04149886942192117, "acc_norm": 0.45517241379310347, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.31746031746031744, "acc_stderr": 0.02397386199899208, "acc_norm": 0.31746031746031744, "acc_norm_stderr": 0.02397386199899208 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.30158730158730157, "acc_stderr": 0.04104947269903394, "acc_norm": 0.30158730158730157, "acc_norm_stderr": 0.04104947269903394 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6806451612903226, "acc_stderr": 0.02652270967466776, "acc_norm": 0.6806451612903226, "acc_norm_stderr": 0.02652270967466776 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.458128078817734, "acc_stderr": 0.03505630140785741, "acc_norm": 0.458128078817734, "acc_norm_stderr": 0.03505630140785741 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6787878787878788, "acc_stderr": 0.03646204963253811, "acc_norm": 0.6787878787878788, "acc_norm_stderr": 0.03646204963253811 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.702020202020202, "acc_stderr": 0.03258630383836556, "acc_norm": 0.702020202020202, "acc_norm_stderr": 0.03258630383836556 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8341968911917098, "acc_stderr": 0.026839845022314415, "acc_norm": 0.8341968911917098, "acc_norm_stderr": 0.026839845022314415 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5282051282051282, "acc_stderr": 0.025310639254933886, "acc_norm": 0.5282051282051282, "acc_norm_stderr": 0.025310639254933886 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2962962962962963, "acc_stderr": 0.027840811495871923, "acc_norm": 0.2962962962962963, "acc_norm_stderr": 0.027840811495871923 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6260504201680672, "acc_stderr": 0.03142946637883708, "acc_norm": 0.6260504201680672, "acc_norm_stderr": 0.03142946637883708 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.304635761589404, "acc_stderr": 0.03757949922943343, "acc_norm": 0.304635761589404, "acc_norm_stderr": 0.03757949922943343 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7559633027522936, "acc_stderr": 0.018415286351416402, "acc_norm": 0.7559633027522936, "acc_norm_stderr": 0.018415286351416402 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.46296296296296297, "acc_stderr": 
0.034006036255382704, "acc_norm": 0.46296296296296297, "acc_norm_stderr": 0.034006036255382704 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7598039215686274, "acc_stderr": 0.02998373305591361, "acc_norm": 0.7598039215686274, "acc_norm_stderr": 0.02998373305591361 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7215189873417721, "acc_stderr": 0.029178682304842548, "acc_norm": 0.7215189873417721, "acc_norm_stderr": 0.029178682304842548 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6547085201793722, "acc_stderr": 0.03191100192835794, "acc_norm": 0.6547085201793722, "acc_norm_stderr": 0.03191100192835794 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5954198473282443, "acc_stderr": 0.043046937953806645, "acc_norm": 0.5954198473282443, "acc_norm_stderr": 0.043046937953806645 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7520661157024794, "acc_stderr": 0.03941897526516304, "acc_norm": 0.7520661157024794, "acc_norm_stderr": 0.03941897526516304 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7314814814814815, "acc_stderr": 0.042844679680521934, "acc_norm": 0.7314814814814815, "acc_norm_stderr": 0.042844679680521934 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6073619631901841, "acc_stderr": 0.03836740907831027, "acc_norm": 0.6073619631901841, "acc_norm_stderr": 0.03836740907831027 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.2857142857142857, "acc_stderr": 0.04287858751340456, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.04287858751340456 }, "harness|hendrycksTest-management|5": { "acc": 0.7475728155339806, "acc_stderr": 0.04301250399690879, "acc_norm": 0.7475728155339806, "acc_norm_stderr": 0.04301250399690879 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8076923076923077, "acc_stderr": 0.025819233256483717, "acc_norm": 0.8076923076923077, "acc_norm_stderr": 0.025819233256483717 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.58, "acc_stderr": 
0.04960449637488583, "acc_norm": 0.58, "acc_norm_stderr": 0.04960449637488583 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7611749680715197, "acc_stderr": 0.015246803197398684, "acc_norm": 0.7611749680715197, "acc_norm_stderr": 0.015246803197398684 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6502890173410405, "acc_stderr": 0.025674281456531025, "acc_norm": 0.6502890173410405, "acc_norm_stderr": 0.025674281456531025 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2346368715083799, "acc_stderr": 0.014173044098303675, "acc_norm": 0.2346368715083799, "acc_norm_stderr": 0.014173044098303675 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6013071895424836, "acc_stderr": 0.028036092273891776, "acc_norm": 0.6013071895424836, "acc_norm_stderr": 0.028036092273891776 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6655948553054662, "acc_stderr": 0.026795422327893937, "acc_norm": 0.6655948553054662, "acc_norm_stderr": 0.026795422327893937 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6234567901234568, "acc_stderr": 0.026959344518747784, "acc_norm": 0.6234567901234568, "acc_norm_stderr": 0.026959344518747784 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.44680851063829785, "acc_stderr": 0.029658235097666904, "acc_norm": 0.44680851063829785, "acc_norm_stderr": 0.029658235097666904 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.41134289439374183, "acc_stderr": 0.012567882673803682, "acc_norm": 0.41134289439374183, "acc_norm_stderr": 0.012567882673803682 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5551470588235294, "acc_stderr": 0.030187532060329387, "acc_norm": 0.5551470588235294, "acc_norm_stderr": 0.030187532060329387 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.553921568627451, "acc_stderr": 0.020109864547181354, "acc_norm": 0.553921568627451, "acc_norm_stderr": 0.020109864547181354 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6090909090909091, 
"acc_stderr": 0.04673752333670239, "acc_norm": 0.6090909090909091, "acc_norm_stderr": 0.04673752333670239 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6163265306122448, "acc_stderr": 0.031130880396235943, "acc_norm": 0.6163265306122448, "acc_norm_stderr": 0.031130880396235943 }, "harness|hendrycksTest-sociology|5": { "acc": 0.746268656716418, "acc_stderr": 0.030769444967296018, "acc_norm": 0.746268656716418, "acc_norm_stderr": 0.030769444967296018 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.79, "acc_stderr": 0.040936018074033256, "acc_norm": 0.79, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-virology|5": { "acc": 0.4759036144578313, "acc_stderr": 0.03887971849597264, "acc_norm": 0.4759036144578313, "acc_norm_stderr": 0.03887971849597264 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7719298245614035, "acc_stderr": 0.03218093795602357, "acc_norm": 0.7719298245614035, "acc_norm_stderr": 0.03218093795602357 }, "harness|truthfulqa:mc|0": { "mc1": 0.2827417380660955, "mc1_stderr": 0.015764770836777308, "mc2": 0.4283014811770436, "mc2_stderr": 0.0143306614735904 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
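The aggregated `acc` in the `"all"` block of these result files is, to a close approximation, the mean of the per-task `acc` values. A minimal sketch of that aggregation, using three of the MMLU subtask scores reported above (the unweighted-mean scheme is an assumption here, not a statement of the exact leaderboard weighting):

```python
from statistics import mean

# A few per-task accuracies copied from the results JSON above.
task_acc = {
    "hendrycksTest-high_school_chemistry": 0.458128078817734,
    "hendrycksTest-high_school_computer_science": 0.48,
    "hendrycksTest-high_school_european_history": 0.6787878787878788,
}

# Simple unweighted mean over tasks (assumed aggregation scheme).
aggregate_acc = mean(task_acc.values())
print(f"aggregate acc over {len(task_acc)} tasks: {aggregate_acc:.4f}")
```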
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o_gate_up_down
2023-10-04T05:38:34.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o_gate_up_down dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o_gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o_gate_up_down)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o_gate_up_down\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T05:37:11.185661](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o_gate_up_down/blob/main/results_2023-10-04T05-37-11.185661.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5354425583617756,\n\ \ \"acc_stderr\": 0.03467117892746031,\n \"acc_norm\": 0.5397339404688826,\n\ \ \"acc_norm_stderr\": 0.03465199283171929,\n \"mc1\": 0.2729498164014688,\n\ \ \"mc1_stderr\": 0.015594753632006525,\n \"mc2\": 0.40484956178285475,\n\ \ \"mc2_stderr\": 0.014307201832789093\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5349829351535836,\n \"acc_stderr\": 0.01457558392201967,\n\ \ \"acc_norm\": 0.5793515358361775,\n \"acc_norm_stderr\": 0.0144262112525084\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6030671181039634,\n\ \ \"acc_stderr\": 0.004882619484166601,\n \"acc_norm\": 0.8118900617406891,\n\ \ \"acc_norm_stderr\": 0.0039000125049579596\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\ \ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.45925925925925926,\n\ \ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.04033565667848319,\n\ \ \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.04033565667848319\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\ \ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \ \ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.5773584905660377,\n \"acc_stderr\": 0.03040233144576954,\n\ \ \"acc_norm\": 0.5773584905660377,\n \"acc_norm_stderr\": 0.03040233144576954\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5416666666666666,\n\ \ \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.5416666666666666,\n\ \ \"acc_norm_stderr\": 0.04166666666666665\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \ \ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n\ \ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \ \ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\ \ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4624277456647399,\n\ \ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.4624277456647399,\n\ \ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n\ \ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\ \ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.4085106382978723,\n \"acc_stderr\": 0.03213418026701576,\n\ \ \"acc_norm\": 0.4085106382978723,\n \"acc_norm_stderr\": 0.03213418026701576\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\ \ \"acc_stderr\": 0.0433913832257986,\n \"acc_norm\": 0.30701754385964913,\n\ \ \"acc_norm_stderr\": 0.0433913832257986\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.4206896551724138,\n \"acc_stderr\": 0.0411391498118926,\n\ \ \"acc_norm\": 0.4206896551724138,\n \"acc_norm_stderr\": 0.0411391498118926\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.3412698412698413,\n \"acc_stderr\": 0.02441923496681906,\n \"\ acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 
0.02441923496681906\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\ \ \"acc_stderr\": 0.04190596438871137,\n \"acc_norm\": 0.3253968253968254,\n\ \ \"acc_norm_stderr\": 0.04190596438871137\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \ \ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.632258064516129,\n\ \ \"acc_stderr\": 0.027430866579973467,\n \"acc_norm\": 0.632258064516129,\n\ \ \"acc_norm_stderr\": 0.027430866579973467\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.39408866995073893,\n \"acc_stderr\": 0.03438157967036545,\n\ \ \"acc_norm\": 0.39408866995073893,\n \"acc_norm_stderr\": 0.03438157967036545\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\"\ : 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n\ \ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7171717171717171,\n \"acc_stderr\": 0.03208779558786752,\n \"\ acc_norm\": 0.7171717171717171,\n \"acc_norm_stderr\": 0.03208779558786752\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.028979089794296732,\n\ \ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.028979089794296732\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.5128205128205128,\n \"acc_stderr\": 0.025342671293807257,\n\ \ \"acc_norm\": 0.5128205128205128,\n \"acc_norm_stderr\": 0.025342671293807257\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": 
{\n \"\ acc\": 0.32222222222222224,\n \"acc_stderr\": 0.0284934650910286,\n \ \ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.0284934650910286\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.5672268907563025,\n \"acc_stderr\": 0.032183581077426124,\n\ \ \"acc_norm\": 0.5672268907563025,\n \"acc_norm_stderr\": 0.032183581077426124\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\ acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7339449541284404,\n \"acc_stderr\": 0.018946022322225604,\n \"\ acc_norm\": 0.7339449541284404,\n \"acc_norm_stderr\": 0.018946022322225604\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.39814814814814814,\n \"acc_stderr\": 0.033384734032074016,\n \"\ acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.033384734032074016\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7549019607843137,\n \"acc_stderr\": 0.030190282453501943,\n \"\ acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.030190282453501943\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7341772151898734,\n \"acc_stderr\": 0.028756799629658342,\n \ \ \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.028756799629658342\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n\ \ \"acc_stderr\": 0.03259625118416828,\n \"acc_norm\": 0.6188340807174888,\n\ \ \"acc_norm_stderr\": 0.03259625118416828\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.5343511450381679,\n \"acc_stderr\": 0.043749285605997376,\n\ \ \"acc_norm\": 0.5343511450381679,\n \"acc_norm_stderr\": 0.043749285605997376\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.71900826446281,\n 
\"acc_stderr\": 0.04103203830514512,\n \"acc_norm\"\ : 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n\ \ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n\ \ \"acc_stderr\": 0.04557239513497751,\n \"acc_norm\": 0.6666666666666666,\n\ \ \"acc_norm_stderr\": 0.04557239513497751\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046734,\n\ \ \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046734\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\ \ \"acc_stderr\": 0.04287858751340456,\n \"acc_norm\": 0.2857142857142857,\n\ \ \"acc_norm_stderr\": 0.04287858751340456\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280042,\n\ \ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280042\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7777777777777778,\n\ \ \"acc_stderr\": 0.027236013946196697,\n \"acc_norm\": 0.7777777777777778,\n\ \ \"acc_norm_stderr\": 0.027236013946196697\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \ \ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7598978288633461,\n\ \ \"acc_stderr\": 0.015274685213734188,\n \"acc_norm\": 0.7598978288633461,\n\ \ \"acc_norm_stderr\": 0.015274685213734188\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.5924855491329479,\n \"acc_stderr\": 0.0264545781469315,\n\ \ \"acc_norm\": 0.5924855491329479,\n \"acc_norm_stderr\": 0.0264545781469315\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25139664804469275,\n\ \ \"acc_stderr\": 0.014508979453553974,\n \"acc_norm\": 0.25139664804469275,\n\ \ \"acc_norm_stderr\": 0.014508979453553974\n },\n 
\"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.5522875816993464,\n \"acc_stderr\": 0.02847293847803353,\n\ \ \"acc_norm\": 0.5522875816993464,\n \"acc_norm_stderr\": 0.02847293847803353\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6559485530546624,\n\ \ \"acc_stderr\": 0.026981478043648033,\n \"acc_norm\": 0.6559485530546624,\n\ \ \"acc_norm_stderr\": 0.026981478043648033\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6327160493827161,\n \"acc_stderr\": 0.0268228017595079,\n\ \ \"acc_norm\": 0.6327160493827161,\n \"acc_norm_stderr\": 0.0268228017595079\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \ \ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41003911342894395,\n\ \ \"acc_stderr\": 0.012561837621962026,\n \"acc_norm\": 0.41003911342894395,\n\ \ \"acc_norm_stderr\": 0.012561837621962026\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.4852941176470588,\n \"acc_stderr\": 0.03035969707904611,\n\ \ \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.03035969707904611\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.5326797385620915,\n \"acc_stderr\": 0.020184583359102202,\n \ \ \"acc_norm\": 0.5326797385620915,\n \"acc_norm_stderr\": 0.020184583359102202\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\ \ \"acc_stderr\": 0.04673752333670238,\n \"acc_norm\": 0.6090909090909091,\n\ \ \"acc_norm_stderr\": 0.04673752333670238\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.5142857142857142,\n \"acc_stderr\": 0.031996152328062855,\n\ \ \"acc_norm\": 0.5142857142857142,\n \"acc_norm_stderr\": 0.031996152328062855\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6915422885572139,\n\ \ 
\"acc_stderr\": 0.03265819588512699,\n \"acc_norm\": 0.6915422885572139,\n\ \ \"acc_norm_stderr\": 0.03265819588512699\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542129,\n \ \ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542129\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\ \ \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n\ \ \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n\ \ \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2729498164014688,\n\ \ \"mc1_stderr\": 0.015594753632006525,\n \"mc2\": 0.40484956178285475,\n\ \ \"mc2_stderr\": 0.014307201832789093\n }\n}\n```" repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o_gate_up_down leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|arc:challenge|25_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hellaswag|10_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-37-11.185661.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-37-11.185661.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-37-11.185661.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-37-11.185661.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-37-11.185661.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-37-11.185661.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-37-11.185661.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-37-11.185661.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-37-11.185661.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T05_37_11.185661 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T05-37-11.185661.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T05-37-11.185661.parquet' - config_name: results data_files: - split: 2023_10_04T05_37_11.185661 path: - results_2023-10-04T05-37-11.185661.parquet - split: latest path: - results_2023-10-04T05-37-11.185661.parquet --- # Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o_gate_up_down ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o_gate_up_down - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the 
evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o_gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o_gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o_gate_up_down",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-04T05:37:11.185661](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o_gate_up_down/blob/main/results_2023-10-04T05-37-11.185661.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval):

```python
{
    "all": { "acc": 0.5354425583617756, "acc_stderr": 0.03467117892746031, "acc_norm": 0.5397339404688826, "acc_norm_stderr": 0.03465199283171929, "mc1": 0.2729498164014688, "mc1_stderr": 0.015594753632006525, "mc2": 0.40484956178285475, "mc2_stderr": 0.014307201832789093 },
    "harness|arc:challenge|25": { "acc": 0.5349829351535836, "acc_stderr": 0.01457558392201967, "acc_norm": 0.5793515358361775, "acc_norm_stderr": 0.0144262112525084 },
    "harness|hellaswag|10": { "acc": 0.6030671181039634, "acc_stderr": 0.004882619484166601, "acc_norm": 0.8118900617406891, "acc_norm_stderr": 0.0039000125049579596 },
    "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.047609522856952365, "acc_norm": 0.34, "acc_norm_stderr": 0.047609522856952365 },
    "harness|hendrycksTest-anatomy|5": { "acc": 0.45925925925925926, "acc_stderr": 0.04304979692464242, "acc_norm": 0.45925925925925926, "acc_norm_stderr": 0.04304979692464242 },
    "harness|hendrycksTest-astronomy|5": { "acc": 0.5657894736842105, "acc_stderr": 0.04033565667848319, "acc_norm": 0.5657894736842105, "acc_norm_stderr": 0.04033565667848319 },
    "harness|hendrycksTest-business_ethics|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 },
    "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5773584905660377, "acc_stderr": 0.03040233144576954, "acc_norm": 0.5773584905660377, "acc_norm_stderr": 0.03040233144576954 },
    "harness|hendrycksTest-college_biology|5": { "acc": 0.5416666666666666, "acc_stderr": 0.04166666666666665, "acc_norm": 0.5416666666666666, "acc_norm_stderr": 0.04166666666666665 },
    "harness|hendrycksTest-college_chemistry|5": { "acc": 0.38, "acc_stderr": 0.04878317312145633, "acc_norm": 0.38, "acc_norm_stderr": 0.04878317312145633 },
    "harness|hendrycksTest-college_computer_science|5": { "acc": 0.4, "acc_stderr": 0.049236596391733084, "acc_norm": 0.4, "acc_norm_stderr": 0.049236596391733084 },
    "harness|hendrycksTest-college_mathematics|5": { "acc": 0.26, "acc_stderr": 0.0440844002276808, "acc_norm": 0.26, "acc_norm_stderr": 0.0440844002276808 },
    "harness|hendrycksTest-college_medicine|5": { "acc": 0.4624277456647399, "acc_stderr": 0.0380168510452446, "acc_norm": 0.4624277456647399, "acc_norm_stderr": 0.0380168510452446 },
    "harness|hendrycksTest-college_physics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.04690650298201943, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.04690650298201943 },
    "harness|hendrycksTest-computer_security|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 },
    "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4085106382978723, "acc_stderr": 0.03213418026701576, "acc_norm": 0.4085106382978723, "acc_norm_stderr": 0.03213418026701576 },
    "harness|hendrycksTest-econometrics|5": { "acc": 0.30701754385964913, "acc_stderr": 0.0433913832257986, "acc_norm": 0.30701754385964913, "acc_norm_stderr": 0.0433913832257986 },
    "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.4206896551724138, "acc_stderr": 0.0411391498118926, "acc_norm": 0.4206896551724138, "acc_norm_stderr": 0.0411391498118926 },
    "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3412698412698413, "acc_stderr": 0.02441923496681906, "acc_norm": 0.3412698412698413, "acc_norm_stderr": 0.02441923496681906 },
    "harness|hendrycksTest-formal_logic|5": { "acc": 0.3253968253968254, "acc_stderr": 0.04190596438871137, "acc_norm": 0.3253968253968254, "acc_norm_stderr": 0.04190596438871137 },
    "harness|hendrycksTest-global_facts|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 },
    "harness|hendrycksTest-high_school_biology|5": { "acc": 0.632258064516129, "acc_stderr": 0.027430866579973467, "acc_norm": 0.632258064516129, "acc_norm_stderr": 0.027430866579973467 },
    "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.39408866995073893, "acc_stderr": 0.03438157967036545, "acc_norm": 0.39408866995073893, "acc_norm_stderr": 0.03438157967036545 },
    "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.55, "acc_stderr": 0.049999999999999996, "acc_norm": 0.55, "acc_norm_stderr": 0.049999999999999996 },
    "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7151515151515152, "acc_stderr": 0.03524390844511781, "acc_norm": 0.7151515151515152, "acc_norm_stderr": 0.03524390844511781 },
    "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7171717171717171, "acc_stderr": 0.03208779558786752, "acc_norm": 0.7171717171717171, "acc_norm_stderr": 0.03208779558786752 },
    "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7979274611398963, "acc_stderr": 0.028979089794296732, "acc_norm": 0.7979274611398963, "acc_norm_stderr": 0.028979089794296732 },
    "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5128205128205128, "acc_stderr": 0.025342671293807257, "acc_norm": 0.5128205128205128, "acc_norm_stderr": 0.025342671293807257 },
    "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32222222222222224, "acc_stderr": 0.0284934650910286, "acc_norm": 0.32222222222222224, "acc_norm_stderr": 0.0284934650910286 },
    "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5672268907563025, "acc_stderr": 0.032183581077426124, "acc_norm": 0.5672268907563025, "acc_norm_stderr": 0.032183581077426124 },
    "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3443708609271523, "acc_stderr": 0.038796870240733264, "acc_norm": 0.3443708609271523, "acc_norm_stderr": 0.038796870240733264 },
    "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7339449541284404, "acc_stderr": 0.018946022322225604, "acc_norm": 0.7339449541284404, "acc_norm_stderr": 0.018946022322225604 },
    "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.39814814814814814, "acc_stderr": 0.033384734032074016, "acc_norm": 0.39814814814814814, "acc_norm_stderr": 0.033384734032074016 },
    "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7549019607843137, "acc_stderr": 0.030190282453501943, "acc_norm": 0.7549019607843137, "acc_norm_stderr": 0.030190282453501943 },
    "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7341772151898734, "acc_stderr": 0.028756799629658342, "acc_norm": 0.7341772151898734, "acc_norm_stderr": 0.028756799629658342 },
    "harness|hendrycksTest-human_aging|5": { "acc": 0.6188340807174888, "acc_stderr": 0.03259625118416828, "acc_norm": 0.6188340807174888, "acc_norm_stderr": 0.03259625118416828 },
    "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5343511450381679, "acc_stderr": 0.043749285605997376, "acc_norm": 0.5343511450381679, "acc_norm_stderr": 0.043749285605997376 },
    "harness|hendrycksTest-international_law|5": { "acc": 0.71900826446281, "acc_stderr": 0.04103203830514512, "acc_norm": 0.71900826446281, "acc_norm_stderr": 0.04103203830514512 },
    "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6666666666666666, "acc_stderr": 0.04557239513497751, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.04557239513497751 },
    "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6441717791411042, "acc_stderr": 0.03761521380046734, "acc_norm": 0.6441717791411042, "acc_norm_stderr": 0.03761521380046734 },
    "harness|hendrycksTest-machine_learning|5": { "acc": 0.2857142857142857, "acc_stderr": 0.04287858751340456, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.04287858751340456 },
    "harness|hendrycksTest-management|5": { "acc": 0.6796116504854369, "acc_stderr": 0.04620284082280042, "acc_norm": 0.6796116504854369, "acc_norm_stderr": 0.04620284082280042 },
    "harness|hendrycksTest-marketing|5": { "acc": 0.7777777777777778, "acc_stderr": 0.027236013946196697, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.027236013946196697 },
    "harness|hendrycksTest-medical_genetics|5": { "acc": 0.59, "acc_stderr": 0.049431107042371025, "acc_norm": 0.59, "acc_norm_stderr": 0.049431107042371025 },
    "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7598978288633461, "acc_stderr": 0.015274685213734188, "acc_norm": 0.7598978288633461, "acc_norm_stderr": 0.015274685213734188 },
    "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5924855491329479, "acc_stderr": 0.0264545781469315, "acc_norm": 0.5924855491329479, "acc_norm_stderr": 0.0264545781469315 },
    "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.25139664804469275, "acc_stderr": 0.014508979453553974, "acc_norm": 0.25139664804469275, "acc_norm_stderr": 0.014508979453553974 },
    "harness|hendrycksTest-nutrition|5": { "acc": 0.5522875816993464, "acc_stderr": 0.02847293847803353, "acc_norm": 0.5522875816993464, "acc_norm_stderr": 0.02847293847803353 },
    "harness|hendrycksTest-philosophy|5": { "acc": 0.6559485530546624, "acc_stderr": 0.026981478043648033, "acc_norm": 0.6559485530546624, "acc_norm_stderr": 0.026981478043648033 },
    "harness|hendrycksTest-prehistory|5": { "acc": 0.6327160493827161, "acc_stderr": 0.0268228017595079, "acc_norm": 0.6327160493827161, "acc_norm_stderr": 0.0268228017595079 },
    "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4645390070921986, "acc_stderr": 0.029752389657427047, "acc_norm": 0.4645390070921986, "acc_norm_stderr": 0.029752389657427047 },
    "harness|hendrycksTest-professional_law|5": { "acc": 0.41003911342894395, "acc_stderr": 0.012561837621962026, "acc_norm": 0.41003911342894395, "acc_norm_stderr": 0.012561837621962026 },
    "harness|hendrycksTest-professional_medicine|5": { "acc": 0.4852941176470588, "acc_stderr": 0.03035969707904611, "acc_norm": 0.4852941176470588, "acc_norm_stderr": 0.03035969707904611 },
    "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5326797385620915, "acc_stderr": 0.020184583359102202, "acc_norm": 0.5326797385620915, "acc_norm_stderr": 0.020184583359102202 },
    "harness|hendrycksTest-public_relations|5": { "acc": 0.6090909090909091, "acc_stderr": 0.04673752333670238, "acc_norm": 0.6090909090909091, "acc_norm_stderr": 0.04673752333670238 },
    "harness|hendrycksTest-security_studies|5": { "acc": 0.5142857142857142, "acc_stderr": 0.031996152328062855, "acc_norm": 0.5142857142857142, "acc_norm_stderr": 0.031996152328062855 },
    "harness|hendrycksTest-sociology|5": { "acc": 0.6915422885572139, "acc_stderr": 0.03265819588512699, "acc_norm": 0.6915422885572139, "acc_norm_stderr": 0.03265819588512699 },
    "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.72, "acc_stderr": 0.04512608598542129, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542129 },
    "harness|hendrycksTest-virology|5": { "acc": 0.41566265060240964, "acc_stderr": 0.038367221765980515, "acc_norm": 0.41566265060240964, "acc_norm_stderr": 0.038367221765980515 },
    "harness|hendrycksTest-world_religions|5": { "acc": 0.7719298245614035, "acc_stderr": 0.032180937956023566, "acc_norm": 0.7719298245614035, "acc_norm_stderr": 0.032180937956023566 },
    "harness|truthfulqa:mc|0": { "mc1": 0.2729498164014688, "mc1_stderr": 0.015594753632006525, "mc2": 0.40484956178285475, "mc2_stderr": 0.014307201832789093 }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?
[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
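Each per-run split in this card is named after the run timestamp, e.g. `2023_10_04T05_37_11.185661` (an ISO-like timestamp with `_` in place of `-` and `:`). When comparing several runs it can help to turn those split names back into `datetime` objects for sorting; below is a minimal sketch, assuming only the naming pattern shown in this card (the `parse_run_split` helper is illustrative, not part of the dataset tooling):

```python
from datetime import datetime

def parse_run_split(split_name: str) -> datetime:
    # Split names look like "2023_10_04T05_37_11.185661": restore the
    # "-" separators in the date part and the ":" separators in the time part,
    # then let datetime.fromisoformat do the parsing.
    normalized = split_name.replace("_", "-", 2)   # date part: 2023-10-04T05_37_11.185661
    date_part, time_part = normalized.split("T")
    time_part = time_part.replace("_", ":")        # time part: 05:37:11.185661
    return datetime.fromisoformat(f"{date_part}T{time_part}")

run = parse_run_split("2023_10_04T05_37_11.185661")
print(run.isoformat())  # -> 2023-10-04T05:37:11.185661
```

Sorted newest-first, the parsed timestamps make it easy to locate programmatically which run the "latest" split points to.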
ryo2/roly-poly_dataset
2023-10-08T10:11:46.000Z
[ "license:apache-2.0", "region:us" ]
ryo2
null
null
null
0
0
--- license: apache-2.0 ---
atom-in-the-universe/bild-931f5f0f-3a90-410f-b0ad-a3ba39206532
2023-10-04T05:55:22.000Z
[ "region:us" ]
atom-in-the-universe
null
null
null
0
0
Entry not found
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o_gate_up_down
2023-10-04T05:44:22.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o_gate_up_down dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o_gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o_gate_up_down)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o_gate_up_down\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T05:43:00.841479](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o_gate_up_down/blob/main/results_2023-10-04T05-43-00.841479.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5440364921757405,\n\ \ \"acc_stderr\": 0.03464042179912206,\n \"acc_norm\": 0.5482511933923937,\n\ \ \"acc_norm_stderr\": 0.03462187613632752,\n \"mc1\": 0.27539779681762544,\n\ \ \"mc1_stderr\": 0.01563813566777552,\n \"mc2\": 0.40796864631942464,\n\ \ \"mc2_stderr\": 0.014120392600273196\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5324232081911263,\n \"acc_stderr\": 0.01458063756999542,\n\ \ \"acc_norm\": 0.5776450511945392,\n \"acc_norm_stderr\": 0.014434138713379976\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6043616809400518,\n\ \ \"acc_stderr\": 0.004879880092103958,\n \"acc_norm\": 0.8078072097191794,\n\ \ \"acc_norm_stderr\": 0.003932184843841659\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\ \ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\ \ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5328947368421053,\n \"acc_stderr\": 0.040601270352363966,\n\ \ \"acc_norm\": 0.5328947368421053,\n \"acc_norm_stderr\": 0.040601270352363966\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\ \ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \ \ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.5924528301886792,\n \"acc_stderr\": 0.030242233800854494,\n\ \ \"acc_norm\": 0.5924528301886792,\n \"acc_norm_stderr\": 0.030242233800854494\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n\ \ \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.6041666666666666,\n\ \ \"acc_norm_stderr\": 0.04089465449325582\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \ \ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\ : 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n\ \ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n\ \ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006718,\n\ \ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006718\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n\ \ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.03243618636108101,\n\ \ \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.03243618636108101\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\ \ \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n\ \ \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\ \ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.3333333333333333,\n \"acc_stderr\": 0.0242785680243077,\n \"acc_norm\"\ : 0.3333333333333333,\n 
\"acc_norm_stderr\": 0.0242785680243077\n },\n\ \ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n\ \ \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n\ \ \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.632258064516129,\n\ \ \"acc_stderr\": 0.027430866579973467,\n \"acc_norm\": 0.632258064516129,\n\ \ \"acc_norm_stderr\": 0.027430866579973467\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.03510766597959215,\n\ \ \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.03510766597959215\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\ : 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.03646204963253812,\n\ \ \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.03646204963253812\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.696969696969697,\n \"acc_stderr\": 0.03274287914026867,\n \"acc_norm\"\ : 0.696969696969697,\n \"acc_norm_stderr\": 0.03274287914026867\n },\n\ \ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \ \ \"acc\": 0.7616580310880829,\n \"acc_stderr\": 0.03074890536390989,\n\ \ \"acc_norm\": 0.7616580310880829,\n \"acc_norm_stderr\": 0.03074890536390989\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.4846153846153846,\n \"acc_stderr\": 0.025339003010106515,\n\ \ \"acc_norm\": 0.4846153846153846,\n \"acc_norm_stderr\": 0.025339003010106515\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.362962962962963,\n \"acc_stderr\": 0.029318203645206865,\n \ \ \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.029318203645206865\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6008403361344538,\n \"acc_stderr\": 0.03181110032413925,\n \ \ \"acc_norm\": 0.6008403361344538,\n \"acc_norm_stderr\": 0.03181110032413925\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.33774834437086093,\n \"acc_stderr\": 0.0386155754625517,\n \"\ acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.0386155754625517\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7192660550458716,\n \"acc_stderr\": 0.01926605504587162,\n \"\ acc_norm\": 0.7192660550458716,\n \"acc_norm_stderr\": 0.01926605504587162\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.4166666666666667,\n \"acc_stderr\": 0.033622774366080445,\n \"\ acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.033622774366080445\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7450980392156863,\n \"acc_stderr\": 0.030587591351604246,\n \"\ acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.030587591351604246\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7172995780590717,\n \"acc_stderr\": 0.029312814153955924,\n \ \ \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.029312814153955924\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\ \ \"acc_stderr\": 0.032190792004199956,\n \"acc_norm\": 0.6412556053811659,\n\ \ \"acc_norm_stderr\": 0.032190792004199956\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.5419847328244275,\n \"acc_stderr\": 0.04369802690578756,\n\ \ \"acc_norm\": 0.5419847328244275,\n \"acc_norm_stderr\": 0.04369802690578756\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591205,\n \"\ acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591205\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\ \ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \ \ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.5950920245398773,\n \"acc_stderr\": 0.03856672163548913,\n\ \ \"acc_norm\": 0.5950920245398773,\n \"acc_norm_stderr\": 0.03856672163548913\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\ \ \"acc_stderr\": 0.04203277291467764,\n \"acc_norm\": 0.26785714285714285,\n\ \ \"acc_norm_stderr\": 0.04203277291467764\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.044532548363264673,\n\ \ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.044532548363264673\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7564102564102564,\n\ \ \"acc_stderr\": 0.028120966503914414,\n \"acc_norm\": 0.7564102564102564,\n\ \ \"acc_norm_stderr\": 0.028120966503914414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \ \ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7611749680715197,\n\ \ \"acc_stderr\": 0.015246803197398682,\n \"acc_norm\": 0.7611749680715197,\n\ \ \"acc_norm_stderr\": 0.015246803197398682\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6098265895953757,\n \"acc_stderr\": 0.026261677607806653,\n\ \ \"acc_norm\": 0.6098265895953757,\n \"acc_norm_stderr\": 0.026261677607806653\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3027932960893855,\n\ \ \"acc_stderr\": 0.015366860386397112,\n \"acc_norm\": 0.3027932960893855,\n\ 
\ \"acc_norm_stderr\": 0.015366860386397112\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.6013071895424836,\n \"acc_stderr\": 0.028036092273891776,\n\ \ \"acc_norm\": 0.6013071895424836,\n \"acc_norm_stderr\": 0.028036092273891776\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.639871382636656,\n\ \ \"acc_stderr\": 0.027264297599804012,\n \"acc_norm\": 0.639871382636656,\n\ \ \"acc_norm_stderr\": 0.027264297599804012\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.5864197530864198,\n \"acc_stderr\": 0.027402042040269962,\n\ \ \"acc_norm\": 0.5864197530864198,\n \"acc_norm_stderr\": 0.027402042040269962\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.41843971631205673,\n \"acc_stderr\": 0.029427994039419994,\n \ \ \"acc_norm\": 0.41843971631205673,\n \"acc_norm_stderr\": 0.029427994039419994\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4198174706649283,\n\ \ \"acc_stderr\": 0.01260496081608737,\n \"acc_norm\": 0.4198174706649283,\n\ \ \"acc_norm_stderr\": 0.01260496081608737\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5220588235294118,\n \"acc_stderr\": 0.030343264224213514,\n\ \ \"acc_norm\": 0.5220588235294118,\n \"acc_norm_stderr\": 0.030343264224213514\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.5490196078431373,\n \"acc_stderr\": 0.020130388312904528,\n \ \ \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.020130388312904528\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\ \ \"acc_stderr\": 0.04582004841505417,\n \"acc_norm\": 0.6454545454545455,\n\ \ \"acc_norm_stderr\": 0.04582004841505417\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.563265306122449,\n \"acc_stderr\": 0.03175195237583323,\n\ \ \"acc_norm\": 0.563265306122449,\n \"acc_norm_stderr\": 0.03175195237583323\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n        \"acc\": 0.6965174129353234,\n\
\        \"acc_stderr\": 0.03251006816458619,\n        \"acc_norm\": 0.6965174129353234,\n\
\        \"acc_norm_stderr\": 0.03251006816458619\n    },\n    \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n        \"acc\": 0.75,\n        \"acc_stderr\": 0.04351941398892446,\n    \
\    \"acc_norm\": 0.75,\n        \"acc_norm_stderr\": 0.04351941398892446\n    \
\ },\n    \"harness|hendrycksTest-virology|5\": {\n        \"acc\": 0.43373493975903615,\n\
\        \"acc_stderr\": 0.03858158940685517,\n        \"acc_norm\": 0.43373493975903615,\n\
\        \"acc_norm_stderr\": 0.03858158940685517\n    },\n    \"harness|hendrycksTest-world_religions|5\"\
: {\n        \"acc\": 0.783625730994152,\n        \"acc_stderr\": 0.03158149539338733,\n\
\        \"acc_norm\": 0.783625730994152,\n        \"acc_norm_stderr\": 0.03158149539338733\n\
\    },\n    \"harness|truthfulqa:mc|0\": {\n        \"mc1\": 0.27539779681762544,\n\
\        \"mc1_stderr\": 0.01563813566777552,\n        \"mc2\": 0.40796864631942464,\n\
\        \"mc2_stderr\": 0.014120392600273196\n    }\n}\n```"
repo_url: https://huggingface.co/wei123602/Llama-2-13b-FINETUNE4_TEST3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
  data_files:
  - split: 2023_10_04T05_43_00.841479
    path:
    - '**/details_harness|arc:challenge|25_2023-10-04T05-43-00.841479.parquet'
  - split: latest
    path:
    - '**/details_harness|arc:challenge|25_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hellaswag_10
  data_files:
  - split: 2023_10_04T05_43_00.841479
    path:
    - '**/details_harness|hellaswag|10_2023-10-04T05-43-00.841479.parquet'
  - split: latest
    path:
    - '**/details_harness|hellaswag|10_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_5
  data_files:
  - split: 2023_10_04T05_43_00.841479
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-43-00.841479.parquet'
    - 
'**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-43-00.841479.parquet' - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-43-00.841479.parquet' - 
'**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-43-00.841479.parquet' 
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-43-00.841479.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-43-00.841479.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-43-00.841479.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-43-00.841479.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-43-00.841479.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T05_43_00.841479 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T05-43-00.841479.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T05-43-00.841479.parquet' - config_name: results data_files: - split: 2023_10_04T05_43_00.841479 path: - results_2023-10-04T05-43-00.841479.parquet - split: latest path: - results_2023-10-04T05-43-00.841479.parquet --- # Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o_gate_up_down ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o_gate_up_down - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the 
evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o_gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o_gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o_gate_up_down", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-10-04T05:43:00.841479](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o_gate_up_down/blob/main/results_2023-10-04T05-43-00.841479.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5440364921757405, "acc_stderr": 0.03464042179912206, "acc_norm": 0.5482511933923937, "acc_norm_stderr": 0.03462187613632752, "mc1": 0.27539779681762544, "mc1_stderr": 0.01563813566777552, "mc2": 0.40796864631942464, "mc2_stderr": 0.014120392600273196 }, "harness|arc:challenge|25": { "acc": 0.5324232081911263, "acc_stderr": 0.01458063756999542, "acc_norm": 0.5776450511945392, "acc_norm_stderr": 0.014434138713379976 }, "harness|hellaswag|10": { "acc": 0.6043616809400518, "acc_stderr": 0.004879880092103958, "acc_norm": 0.8078072097191794, "acc_norm_stderr": 0.003932184843841659 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.04688261722621503, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621503 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5111111111111111, "acc_stderr": 0.04318275491977976, "acc_norm": 0.5111111111111111, "acc_norm_stderr": 0.04318275491977976 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5328947368421053, "acc_stderr": 0.040601270352363966, "acc_norm": 0.5328947368421053, "acc_norm_stderr": 0.040601270352363966 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5924528301886792, "acc_stderr": 0.030242233800854494, "acc_norm": 0.5924528301886792, "acc_norm_stderr": 0.030242233800854494 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6041666666666666, "acc_stderr": 0.04089465449325582, "acc_norm": 0.6041666666666666, "acc_norm_stderr": 0.04089465449325582 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.43, "acc_stderr": 0.049756985195624284, "acc_norm": 0.43, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.43, "acc_stderr": 0.049756985195624284, "acc_norm": 0.43, 
"acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5144508670520231, "acc_stderr": 0.03810871630454764, "acc_norm": 0.5144508670520231, "acc_norm_stderr": 0.03810871630454764 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3137254901960784, "acc_stderr": 0.04617034827006718, "acc_norm": 0.3137254901960784, "acc_norm_stderr": 0.04617034827006718 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.72, "acc_stderr": 0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.43829787234042555, "acc_stderr": 0.03243618636108101, "acc_norm": 0.43829787234042555, "acc_norm_stderr": 0.03243618636108101 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2807017543859649, "acc_stderr": 0.042270544512322004, "acc_norm": 0.2807017543859649, "acc_norm_stderr": 0.042270544512322004 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.4689655172413793, "acc_stderr": 0.04158632762097828, "acc_norm": 0.4689655172413793, "acc_norm_stderr": 0.04158632762097828 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.0242785680243077, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.0242785680243077 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3492063492063492, "acc_stderr": 0.04263906892795132, "acc_norm": 0.3492063492063492, "acc_norm_stderr": 0.04263906892795132 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.632258064516129, "acc_stderr": 0.027430866579973467, "acc_norm": 0.632258064516129, "acc_norm_stderr": 0.027430866579973467 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.46798029556650245, "acc_stderr": 0.03510766597959215, "acc_norm": 0.46798029556650245, "acc_norm_stderr": 0.03510766597959215 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.53, "acc_stderr": 0.050161355804659205, "acc_norm": 0.53, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6787878787878788, "acc_stderr": 0.03646204963253812, "acc_norm": 0.6787878787878788, "acc_norm_stderr": 0.03646204963253812 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.696969696969697, "acc_stderr": 0.03274287914026867, "acc_norm": 0.696969696969697, "acc_norm_stderr": 0.03274287914026867 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7616580310880829, "acc_stderr": 0.03074890536390989, "acc_norm": 0.7616580310880829, "acc_norm_stderr": 0.03074890536390989 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.4846153846153846, "acc_stderr": 0.025339003010106515, "acc_norm": 0.4846153846153846, "acc_norm_stderr": 0.025339003010106515 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.362962962962963, "acc_stderr": 0.029318203645206865, "acc_norm": 0.362962962962963, "acc_norm_stderr": 0.029318203645206865 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6008403361344538, "acc_stderr": 0.03181110032413925, "acc_norm": 0.6008403361344538, "acc_norm_stderr": 0.03181110032413925 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 0.0386155754625517, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.0386155754625517 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7192660550458716, "acc_stderr": 0.01926605504587162, "acc_norm": 0.7192660550458716, "acc_norm_stderr": 0.01926605504587162 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4166666666666667, "acc_stderr": 0.033622774366080445, 
"acc_norm": 0.4166666666666667, "acc_norm_stderr": 0.033622774366080445 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7450980392156863, "acc_stderr": 0.030587591351604246, "acc_norm": 0.7450980392156863, "acc_norm_stderr": 0.030587591351604246 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7172995780590717, "acc_stderr": 0.029312814153955924, "acc_norm": 0.7172995780590717, "acc_norm_stderr": 0.029312814153955924 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6412556053811659, "acc_stderr": 0.032190792004199956, "acc_norm": 0.6412556053811659, "acc_norm_stderr": 0.032190792004199956 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5419847328244275, "acc_stderr": 0.04369802690578756, "acc_norm": 0.5419847328244275, "acc_norm_stderr": 0.04369802690578756 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7355371900826446, "acc_stderr": 0.04026187527591205, "acc_norm": 0.7355371900826446, "acc_norm_stderr": 0.04026187527591205 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.75, "acc_stderr": 0.04186091791394607, "acc_norm": 0.75, "acc_norm_stderr": 0.04186091791394607 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.5950920245398773, "acc_stderr": 0.03856672163548913, "acc_norm": 0.5950920245398773, "acc_norm_stderr": 0.03856672163548913 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.26785714285714285, "acc_stderr": 0.04203277291467764, "acc_norm": 0.26785714285714285, "acc_norm_stderr": 0.04203277291467764 }, "harness|hendrycksTest-management|5": { "acc": 0.7184466019417476, "acc_stderr": 0.044532548363264673, "acc_norm": 0.7184466019417476, "acc_norm_stderr": 0.044532548363264673 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7564102564102564, "acc_stderr": 0.028120966503914414, "acc_norm": 0.7564102564102564, "acc_norm_stderr": 0.028120966503914414 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.55, "acc_stderr": 0.049999999999999996, "acc_norm": 0.55, 
"acc_norm_stderr": 0.049999999999999996 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7611749680715197, "acc_stderr": 0.015246803197398682, "acc_norm": 0.7611749680715197, "acc_norm_stderr": 0.015246803197398682 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6098265895953757, "acc_stderr": 0.026261677607806653, "acc_norm": 0.6098265895953757, "acc_norm_stderr": 0.026261677607806653 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3027932960893855, "acc_stderr": 0.015366860386397112, "acc_norm": 0.3027932960893855, "acc_norm_stderr": 0.015366860386397112 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6013071895424836, "acc_stderr": 0.028036092273891776, "acc_norm": 0.6013071895424836, "acc_norm_stderr": 0.028036092273891776 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.639871382636656, "acc_stderr": 0.027264297599804012, "acc_norm": 0.639871382636656, "acc_norm_stderr": 0.027264297599804012 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5864197530864198, "acc_stderr": 0.027402042040269962, "acc_norm": 0.5864197530864198, "acc_norm_stderr": 0.027402042040269962 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.41843971631205673, "acc_stderr": 0.029427994039419994, "acc_norm": 0.41843971631205673, "acc_norm_stderr": 0.029427994039419994 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4198174706649283, "acc_stderr": 0.01260496081608737, "acc_norm": 0.4198174706649283, "acc_norm_stderr": 0.01260496081608737 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5220588235294118, "acc_stderr": 0.030343264224213514, "acc_norm": 0.5220588235294118, "acc_norm_stderr": 0.030343264224213514 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5490196078431373, "acc_stderr": 0.020130388312904528, "acc_norm": 0.5490196078431373, "acc_norm_stderr": 0.020130388312904528 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6454545454545455, "acc_stderr": 0.04582004841505417, "acc_norm": 
0.6454545454545455, "acc_norm_stderr": 0.04582004841505417 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.563265306122449, "acc_stderr": 0.03175195237583323, "acc_norm": 0.563265306122449, "acc_norm_stderr": 0.03175195237583323 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6965174129353234, "acc_stderr": 0.03251006816458619, "acc_norm": 0.6965174129353234, "acc_norm_stderr": 0.03251006816458619 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-virology|5": { "acc": 0.43373493975903615, "acc_stderr": 0.03858158940685517, "acc_norm": 0.43373493975903615, "acc_norm_stderr": 0.03858158940685517 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.783625730994152, "acc_stderr": 0.03158149539338733, "acc_norm": 0.783625730994152, "acc_norm_stderr": 0.03158149539338733 }, "harness|truthfulqa:mc|0": { "mc1": 0.27539779681762544, "mc1_stderr": 0.01563813566777552, "mc2": 0.40796864631942464, "mc2_stderr": 0.014120392600273196 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
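As a supplement to the loading snippet above: the per-task entries in the results JSON share a flat `"harness|<task>|<n_shots>"` key layout, so they can be reshaped into tabular rows with a few lines of standard-library Python. The sketch below uses a truncated, illustrative stand-in for the `results` dict (the full values live in the linked `results_*.json` file):

```python
# Truncated stand-in for the results blob above; real values are in the
# linked results_*.json file of this repository.
results = {
    "harness|arc:challenge|25": {"acc": 0.5324, "acc_norm": 0.5776},
    "harness|hellaswag|10": {"acc": 0.6044, "acc_norm": 0.8078},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.32, "acc_norm": 0.32},
}

rows = []
for key, metrics in results.items():
    # Task keys follow the pattern "harness|<task>|<n_shots>".
    _, task, shots = key.split("|")
    rows.append({"task": task, "shots": int(shots), **metrics})

for row in rows:
    print(f"{row['task']:<40} {row['shots']:>2}-shot  "
          f"acc={row['acc']:.4f}  acc_norm={row['acc_norm']:.4f}")
```

The same reshaping applies to any of the per-task configs listed in the YAML above, since they all follow the same key convention.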
BBrother/Stable-diffusion
2023-10-10T11:17:08.000Z
[ "region:us" ]
BBrother
null
null
null
0
0
Entry not found
atom-in-the-universe/bild-99168c98-11b9-4c7c-bbfd-bbaaf9927104
2023-10-04T06:10:51.000Z
[ "region:us" ]
atom-in-the-universe
null
null
null
0
0
Entry not found
open-llm-leaderboard/details_IkariDev__Athena-v3
2023-10-04T05:59:13.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of IkariDev/Athena-v3 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [IkariDev/Athena-v3](https://huggingface.co/IkariDev/Athena-v3) on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_IkariDev__Athena-v3\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T05:57:51.929610](https://huggingface.co/datasets/open-llm-leaderboard/details_IkariDev__Athena-v3/blob/main/results_2023-10-04T05-57-51.929610.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5801799110421908,\n\ \ \"acc_stderr\": 0.03429522918064415,\n \"acc_norm\": 0.583796773208885,\n\ \ \"acc_norm_stderr\": 0.03427369686054628,\n \"mc1\": 0.35862913096695226,\n\ \ \"mc1_stderr\": 0.016789289499502022,\n \"mc2\": 0.5126008436421456,\n\ \ \"mc2_stderr\": 0.015595901833167577\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5930034129692833,\n \"acc_stderr\": 0.014356399418009124,\n\ \ \"acc_norm\": 0.6168941979522184,\n \"acc_norm_stderr\": 0.01420647266167288\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6538538139812786,\n\ \ \"acc_stderr\": 0.004747682003491466,\n \"acc_norm\": 0.8433578968333001,\n\ \ \"acc_norm_stderr\": 0.0036272018740533913\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \ \ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\ \ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n\ \ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.040335656678483205,\n\ \ \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.040335656678483205\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\ \ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \ \ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.5924528301886792,\n \"acc_stderr\": 0.030242233800854494,\n\ \ \"acc_norm\": 0.5924528301886792,\n \"acc_norm_stderr\": 0.030242233800854494\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6319444444444444,\n\ \ \"acc_stderr\": 0.04032999053960718,\n \"acc_norm\": 0.6319444444444444,\n\ \ \"acc_norm_stderr\": 0.04032999053960718\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \ \ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\ \ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \ \ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5375722543352601,\n\ \ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.5375722543352601,\n\ \ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\ \ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\ \ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.49361702127659574,\n \"acc_stderr\": 0.032683358999363366,\n\ \ \"acc_norm\": 0.49361702127659574,\n \"acc_norm_stderr\": 0.032683358999363366\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\ \ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\ \ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.041665675771015785,\n\ \ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.041665675771015785\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.3439153439153439,\n \"acc_stderr\": 0.02446442662559643,\n \"\ acc_norm\": 0.3439153439153439,\n 
\"acc_norm_stderr\": 0.02446442662559643\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\ \ \"acc_stderr\": 0.04390259265377563,\n \"acc_norm\": 0.40476190476190477,\n\ \ \"acc_norm_stderr\": 0.04390259265377563\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6612903225806451,\n\ \ \"acc_stderr\": 0.026923446059302844,\n \"acc_norm\": 0.6612903225806451,\n\ \ \"acc_norm_stderr\": 0.026923446059302844\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.034991131376767445,\n\ \ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.034991131376767445\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\ : 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n\ \ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7222222222222222,\n \"acc_stderr\": 0.03191178226713545,\n \"\ acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03191178226713545\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548064,\n\ \ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548064\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.5641025641025641,\n \"acc_stderr\": 0.025141801511177495,\n\ \ \"acc_norm\": 0.5641025641025641,\n \"acc_norm_stderr\": 0.025141801511177495\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608463,\n \ \ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608463\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.031357095996135904,\n\ \ \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.031357095996135904\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\ acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.781651376146789,\n \"acc_stderr\": 0.017712600528722734,\n \"\ acc_norm\": 0.781651376146789,\n \"acc_norm_stderr\": 0.017712600528722734\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044811,\n \"\ acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044811\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"\ acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \ \ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\ \ \"acc_stderr\": 0.030898610882477518,\n \"acc_norm\": 0.695067264573991,\n\ \ \"acc_norm_stderr\": 0.030898610882477518\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\ \ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"\ acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\ \ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\ \ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.03642914578292406,\n\ \ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.03642914578292406\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\ \ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \ \ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.04656147110012351,\n\ \ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.04656147110012351\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n\ \ \"acc_stderr\": 0.025819233256483717,\n \"acc_norm\": 0.8076923076923077,\n\ \ \"acc_norm_stderr\": 0.025819233256483717\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \ \ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7726692209450831,\n\ \ \"acc_stderr\": 0.014987270640946002,\n \"acc_norm\": 0.7726692209450831,\n\ \ \"acc_norm_stderr\": 0.014987270640946002\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6445086705202312,\n \"acc_stderr\": 0.025770292082977254,\n\ \ \"acc_norm\": 0.6445086705202312,\n \"acc_norm_stderr\": 0.025770292082977254\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5050279329608939,\n\ \ \"acc_stderr\": 0.016721656037538418,\n \"acc_norm\": 0.5050279329608939,\n\ \ 
\"acc_norm_stderr\": 0.016721656037538418\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.6437908496732027,\n \"acc_stderr\": 0.02742047766262923,\n\ \ \"acc_norm\": 0.6437908496732027,\n \"acc_norm_stderr\": 0.02742047766262923\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6655948553054662,\n\ \ \"acc_stderr\": 0.026795422327893934,\n \"acc_norm\": 0.6655948553054662,\n\ \ \"acc_norm_stderr\": 0.026795422327893934\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6358024691358025,\n \"acc_stderr\": 0.026774929899722334,\n\ \ \"acc_norm\": 0.6358024691358025,\n \"acc_norm_stderr\": 0.026774929899722334\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.43617021276595747,\n \"acc_stderr\": 0.02958345203628407,\n \ \ \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.02958345203628407\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4426336375488918,\n\ \ \"acc_stderr\": 0.012685906538206247,\n \"acc_norm\": 0.4426336375488918,\n\ \ \"acc_norm_stderr\": 0.012685906538206247\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5551470588235294,\n \"acc_stderr\": 0.030187532060329383,\n\ \ \"acc_norm\": 0.5551470588235294,\n \"acc_norm_stderr\": 0.030187532060329383\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.5866013071895425,\n \"acc_stderr\": 0.019922115682786682,\n \ \ \"acc_norm\": 0.5866013071895425,\n \"acc_norm_stderr\": 0.019922115682786682\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\ \ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\ \ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6489795918367347,\n \"acc_stderr\": 0.030555316755573637,\n\ \ \"acc_norm\": 0.6489795918367347,\n \"acc_norm_stderr\": 0.030555316755573637\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n\ \ \"acc_stderr\": 0.031157150869355558,\n \"acc_norm\": 0.736318407960199,\n\ \ \"acc_norm_stderr\": 0.031157150869355558\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \ \ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\ \ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\ \ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03126781714663179,\n\ \ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03126781714663179\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35862913096695226,\n\ \ \"mc1_stderr\": 0.016789289499502022,\n \"mc2\": 0.5126008436421456,\n\ \ \"mc2_stderr\": 0.015595901833167577\n }\n}\n```" repo_url: https://huggingface.co/IkariDev/Athena-v3 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|arc:challenge|25_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hellaswag|10_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-57-51.929610.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-57-51.929610.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-57-51.929610.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-57-51.929610.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-57-51.929610.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-57-51.929610.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-57-51.929610.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-57-51.929610.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-57-51.929610.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T05_57_51.929610 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T05-57-51.929610.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T05-57-51.929610.parquet' - config_name: results data_files: - split: 2023_10_04T05_57_51.929610 path: - results_2023-10-04T05-57-51.929610.parquet - split: latest path: - results_2023-10-04T05-57-51.929610.parquet --- # Dataset Card for Evaluation run of IkariDev/Athena-v3 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/IkariDev/Athena-v3 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[IkariDev/Athena-v3](https://huggingface.co/IkariDev/Athena-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_IkariDev__Athena-v3",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-04T05:57:51.929610](https://huggingface.co/datasets/open-llm-leaderboard/details_IkariDev__Athena-v3/blob/main/results_2023-10-04T05-57-51.929610.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5801799110421908, "acc_stderr": 0.03429522918064415, "acc_norm": 0.583796773208885, "acc_norm_stderr": 0.03427369686054628, "mc1": 0.35862913096695226, "mc1_stderr": 0.016789289499502022, "mc2": 0.5126008436421456, "mc2_stderr": 0.015595901833167577 }, "harness|arc:challenge|25": { "acc": 0.5930034129692833, "acc_stderr": 0.014356399418009124, "acc_norm": 0.6168941979522184, "acc_norm_stderr": 0.01420647266167288 }, "harness|hellaswag|10": { "acc": 0.6538538139812786, "acc_stderr": 0.004747682003491466, "acc_norm": 0.8433578968333001, "acc_norm_stderr": 0.0036272018740533913 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5037037037037037, "acc_stderr": 0.04319223625811331, "acc_norm": 0.5037037037037037, "acc_norm_stderr": 0.04319223625811331 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5657894736842105, "acc_stderr": 0.040335656678483205, "acc_norm": 0.5657894736842105, "acc_norm_stderr": 0.040335656678483205 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5924528301886792, "acc_stderr": 0.030242233800854494, "acc_norm": 0.5924528301886792, "acc_norm_stderr": 0.030242233800854494 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6319444444444444, "acc_stderr": 0.04032999053960718, "acc_norm": 0.6319444444444444, "acc_norm_stderr": 0.04032999053960718 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, 
"acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.39, "acc_stderr": 0.04902071300001974, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001974 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5375722543352601, "acc_stderr": 0.0380168510452446, "acc_norm": 0.5375722543352601, "acc_norm_stderr": 0.0380168510452446 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.28431372549019607, "acc_stderr": 0.04488482852329017, "acc_norm": 0.28431372549019607, "acc_norm_stderr": 0.04488482852329017 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.49361702127659574, "acc_stderr": 0.032683358999363366, "acc_norm": 0.49361702127659574, "acc_norm_stderr": 0.032683358999363366 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2982456140350877, "acc_stderr": 0.04303684033537315, "acc_norm": 0.2982456140350877, "acc_norm_stderr": 0.04303684033537315 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.503448275862069, "acc_stderr": 0.041665675771015785, "acc_norm": 0.503448275862069, "acc_norm_stderr": 0.041665675771015785 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3439153439153439, "acc_stderr": 0.02446442662559643, "acc_norm": 0.3439153439153439, "acc_norm_stderr": 0.02446442662559643 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.40476190476190477, "acc_stderr": 0.04390259265377563, "acc_norm": 0.40476190476190477, "acc_norm_stderr": 0.04390259265377563 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6612903225806451, "acc_stderr": 0.026923446059302844, "acc_norm": 0.6612903225806451, "acc_norm_stderr": 0.026923446059302844 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4482758620689655, "acc_stderr": 0.034991131376767445, "acc_norm": 0.4482758620689655, "acc_norm_stderr": 0.034991131376767445 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.59, "acc_stderr": 0.04943110704237102, "acc_norm": 0.59, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6909090909090909, "acc_stderr": 0.036085410115739666, "acc_norm": 0.6909090909090909, "acc_norm_stderr": 0.036085410115739666 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7222222222222222, "acc_stderr": 0.03191178226713545, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.03191178226713545 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8238341968911918, "acc_stderr": 0.027493504244548064, "acc_norm": 0.8238341968911918, "acc_norm_stderr": 0.027493504244548064 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5641025641025641, "acc_stderr": 0.025141801511177495, "acc_norm": 0.5641025641025641, "acc_norm_stderr": 0.025141801511177495 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.31851851851851853, "acc_stderr": 0.028406533090608463, "acc_norm": 0.31851851851851853, "acc_norm_stderr": 0.028406533090608463 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6302521008403361, "acc_stderr": 0.031357095996135904, "acc_norm": 0.6302521008403361, "acc_norm_stderr": 0.031357095996135904 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.37748344370860926, "acc_stderr": 0.0395802723112157, "acc_norm": 0.37748344370860926, "acc_norm_stderr": 0.0395802723112157 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.781651376146789, "acc_stderr": 0.017712600528722734, "acc_norm": 0.781651376146789, "acc_norm_stderr": 0.017712600528722734 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4398148148148148, "acc_stderr": 
0.03385177976044811, "acc_norm": 0.4398148148148148, "acc_norm_stderr": 0.03385177976044811 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7941176470588235, "acc_stderr": 0.028379449451588663, "acc_norm": 0.7941176470588235, "acc_norm_stderr": 0.028379449451588663 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7552742616033755, "acc_stderr": 0.027985699387036423, "acc_norm": 0.7552742616033755, "acc_norm_stderr": 0.027985699387036423 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.695067264573991, "acc_stderr": 0.030898610882477518, "acc_norm": 0.695067264573991, "acc_norm_stderr": 0.030898610882477518 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.648854961832061, "acc_stderr": 0.04186445163013751, "acc_norm": 0.648854961832061, "acc_norm_stderr": 0.04186445163013751 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7520661157024794, "acc_stderr": 0.03941897526516303, "acc_norm": 0.7520661157024794, "acc_norm_stderr": 0.03941897526516303 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252626, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252626 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6871165644171779, "acc_stderr": 0.03642914578292406, "acc_norm": 0.6871165644171779, "acc_norm_stderr": 0.03642914578292406 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.375, "acc_stderr": 0.04595091388086298, "acc_norm": 0.375, "acc_norm_stderr": 0.04595091388086298 }, "harness|hendrycksTest-management|5": { "acc": 0.6699029126213593, "acc_stderr": 0.04656147110012351, "acc_norm": 0.6699029126213593, "acc_norm_stderr": 0.04656147110012351 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8076923076923077, "acc_stderr": 0.025819233256483717, "acc_norm": 0.8076923076923077, "acc_norm_stderr": 0.025819233256483717 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.61, "acc_stderr": 0.04902071300001975, "acc_norm": 0.61, 
"acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7726692209450831, "acc_stderr": 0.014987270640946002, "acc_norm": 0.7726692209450831, "acc_norm_stderr": 0.014987270640946002 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6445086705202312, "acc_stderr": 0.025770292082977254, "acc_norm": 0.6445086705202312, "acc_norm_stderr": 0.025770292082977254 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.5050279329608939, "acc_stderr": 0.016721656037538418, "acc_norm": 0.5050279329608939, "acc_norm_stderr": 0.016721656037538418 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6437908496732027, "acc_stderr": 0.02742047766262923, "acc_norm": 0.6437908496732027, "acc_norm_stderr": 0.02742047766262923 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6655948553054662, "acc_stderr": 0.026795422327893934, "acc_norm": 0.6655948553054662, "acc_norm_stderr": 0.026795422327893934 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6358024691358025, "acc_stderr": 0.026774929899722334, "acc_norm": 0.6358024691358025, "acc_norm_stderr": 0.026774929899722334 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.43617021276595747, "acc_stderr": 0.02958345203628407, "acc_norm": 0.43617021276595747, "acc_norm_stderr": 0.02958345203628407 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4426336375488918, "acc_stderr": 0.012685906538206247, "acc_norm": 0.4426336375488918, "acc_norm_stderr": 0.012685906538206247 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5551470588235294, "acc_stderr": 0.030187532060329383, "acc_norm": 0.5551470588235294, "acc_norm_stderr": 0.030187532060329383 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5866013071895425, "acc_stderr": 0.019922115682786682, "acc_norm": 0.5866013071895425, "acc_norm_stderr": 0.019922115682786682 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6454545454545455, "acc_stderr": 0.045820048415054174, "acc_norm": 
0.6454545454545455, "acc_norm_stderr": 0.045820048415054174 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6489795918367347, "acc_stderr": 0.030555316755573637, "acc_norm": 0.6489795918367347, "acc_norm_stderr": 0.030555316755573637 }, "harness|hendrycksTest-sociology|5": { "acc": 0.736318407960199, "acc_stderr": 0.031157150869355558, "acc_norm": 0.736318407960199, "acc_norm_stderr": 0.031157150869355558 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774708, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774708 }, "harness|hendrycksTest-virology|5": { "acc": 0.463855421686747, "acc_stderr": 0.03882310850890593, "acc_norm": 0.463855421686747, "acc_norm_stderr": 0.03882310850890593 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7894736842105263, "acc_stderr": 0.03126781714663179, "acc_norm": 0.7894736842105263, "acc_norm_stderr": 0.03126781714663179 }, "harness|truthfulqa:mc|0": { "mc1": 0.35862913096695226, "mc1_stderr": 0.016789289499502022, "mc2": 0.5126008436421456, "mc2_stderr": 0.015595901833167577 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
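As the dataset summary notes, each run appears as a split named after its timestamp (e.g. `2023_10_04T05_57_51.929610`), with `latest` aliasing the newest one. A minimal local sketch (the helper name `newest_split` is ours, purely illustrative) of how one might pick the most recent timestamped split from a config's split names:

```python
from datetime import datetime

def newest_split(split_names):
    # Ignore the "latest" alias and keep only timestamped split names.
    stamped = [s for s in split_names if s != "latest"]
    # Split names follow the run timestamp pattern YYYY_MM_DDTHH_MM_SS.ffffff.
    return max(stamped, key=lambda s: datetime.strptime(s, "%Y_%m_%dT%H_%M_%S.%f"))

print(newest_split(["2023_09_30T12_00_00.000000",
                    "2023_10_04T05_57_51.929610",
                    "latest"]))
# → 2023_10_04T05_57_51.929610
```

In practice the `latest` split already resolves to this run, so the helper is only useful when comparing several archived runs side by side.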
open-llm-leaderboard/details_meta-math__MetaMath-70B-V1.0
2023-10-04T06:03:24.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of meta-math/MetaMath-70B-V1.0 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [meta-math/MetaMath-70B-V1.0](https://huggingface.co/meta-math/MetaMath-70B-V1.0)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_meta-math__MetaMath-70B-V1.0\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T06:01:20.870650](https://huggingface.co/datasets/open-llm-leaderboard/details_meta-math__MetaMath-70B-V1.0/blob/main/results_2023-10-04T06-01-20.870650.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6919665391533253,\n\ \ \"acc_stderr\": 0.03077850676465074,\n \"acc_norm\": 0.6958343142814397,\n\ \ \"acc_norm_stderr\": 0.030750220845504973,\n \"mc1\": 0.3390452876376989,\n\ \ \"mc1_stderr\": 0.016571797910626615,\n \"mc2\": 0.5097969029790534,\n\ \ \"mc2_stderr\": 0.014915889066271937\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6416382252559727,\n \"acc_stderr\": 0.014012883334859857,\n\ \ \"acc_norm\": 0.6800341296928327,\n \"acc_norm_stderr\": 0.013631345807016193\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6786496713802032,\n\ \ \"acc_stderr\": 0.004660405565338756,\n \"acc_norm\": 0.8684524995020912,\n\ \ \"acc_norm_stderr\": 0.003373073863582288\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\ \ \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.6148148148148148,\n\ \ \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.8223684210526315,\n \"acc_stderr\": 0.03110318238312338,\n\ \ \"acc_norm\": 0.8223684210526315,\n \"acc_norm_stderr\": 0.03110318238312338\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n\ \ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \ \ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.7283018867924528,\n \"acc_stderr\": 0.027377706624670713,\n\ \ \"acc_norm\": 0.7283018867924528,\n \"acc_norm_stderr\": 0.027377706624670713\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8541666666666666,\n\ \ \"acc_stderr\": 0.029514245964291766,\n \"acc_norm\": 0.8541666666666666,\n\ \ \"acc_norm_stderr\": 
0.029514245964291766\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \ \ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n\ \ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \ \ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\ \ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\ \ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n\ \ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.76,\n\ \ \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.6808510638297872,\n \"acc_stderr\": 0.030472973363380035,\n\ \ \"acc_norm\": 0.6808510638297872,\n \"acc_norm_stderr\": 0.030472973363380035\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\ \ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n\ \ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\ \ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.4603174603174603,\n \"acc_stderr\": 0.02567008063690919,\n \"\ acc_norm\": 
0.4603174603174603,\n \"acc_norm_stderr\": 0.02567008063690919\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\ \ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\ \ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \ \ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8032258064516129,\n\ \ \"acc_stderr\": 0.02261640942074202,\n \"acc_norm\": 0.8032258064516129,\n\ \ \"acc_norm_stderr\": 0.02261640942074202\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n\ \ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\ : 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.8363636363636363,\n \"acc_stderr\": 0.02888787239548795,\n\ \ \"acc_norm\": 0.8363636363636363,\n \"acc_norm_stderr\": 0.02888787239548795\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.8787878787878788,\n \"acc_stderr\": 0.023253157951942095,\n \"\ acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.023253157951942095\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.016731085293607555,\n\ \ \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.016731085293607555\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.7076923076923077,\n \"acc_stderr\": 0.023060438380857733,\n\ \ \"acc_norm\": 0.7076923076923077,\n \"acc_norm_stderr\": 0.023060438380857733\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948485,\n \ \ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948485\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.02755361446786381,\n \ \ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.02755361446786381\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.423841059602649,\n \"acc_stderr\": 0.04034846678603397,\n \"acc_norm\"\ : 0.423841059602649,\n \"acc_norm_stderr\": 0.04034846678603397\n },\n\ \ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8880733944954129,\n\ \ \"acc_stderr\": 0.013517352714958788,\n \"acc_norm\": 0.8880733944954129,\n\ \ \"acc_norm_stderr\": 0.013517352714958788\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\ : {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.033723432716530624,\n\ \ \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.033723432716530624\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073312,\n \"\ acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073312\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.8734177215189873,\n \"acc_stderr\": 0.021644195727955173,\n \ \ \"acc_norm\": 0.8734177215189873,\n \"acc_norm_stderr\": 0.021644195727955173\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n\ \ \"acc_stderr\": 0.02693611191280227,\n \"acc_norm\": 0.7982062780269058,\n\ \ \"acc_norm_stderr\": 0.02693611191280227\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.8625954198473282,\n \"acc_stderr\": 0.030194823996804475,\n\ \ \"acc_norm\": 0.8625954198473282,\n \"acc_norm_stderr\": 0.030194823996804475\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.8677685950413223,\n \"acc_stderr\": 0.03092278832044579,\n \"\ acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.03092278832044579\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\ \ \"acc_stderr\": 0.03602814176392645,\n \"acc_norm\": 0.8333333333333334,\n\ \ \"acc_norm_stderr\": 0.03602814176392645\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971726,\n\ \ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971726\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n\ \ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n\ \ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n\ \ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n\ \ \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n\ \ \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \ \ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8659003831417624,\n\ \ \"acc_stderr\": 0.012185528166499978,\n \"acc_norm\": 0.8659003831417624,\n\ \ \"acc_norm_stderr\": 0.012185528166499978\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7832369942196532,\n \"acc_stderr\": 0.022183477668412856,\n\ \ \"acc_norm\": 0.7832369942196532,\n \"acc_norm_stderr\": 0.022183477668412856\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41787709497206704,\n\ \ \"acc_stderr\": 0.016495400635820084,\n 
\"acc_norm\": 0.41787709497206704,\n\ \ \"acc_norm_stderr\": 0.016495400635820084\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958154,\n\ \ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958154\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.797427652733119,\n\ \ \"acc_stderr\": 0.022827317491059686,\n \"acc_norm\": 0.797427652733119,\n\ \ \"acc_norm_stderr\": 0.022827317491059686\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.8395061728395061,\n \"acc_stderr\": 0.020423955354778034,\n\ \ \"acc_norm\": 0.8395061728395061,\n \"acc_norm_stderr\": 0.020423955354778034\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.5460992907801419,\n \"acc_stderr\": 0.029700453247291474,\n \ \ \"acc_norm\": 0.5460992907801419,\n \"acc_norm_stderr\": 0.029700453247291474\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5371577574967406,\n\ \ \"acc_stderr\": 0.01273492357953206,\n \"acc_norm\": 0.5371577574967406,\n\ \ \"acc_norm_stderr\": 0.01273492357953206\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.026679252270103128,\n\ \ \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.026679252270103128\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.7401960784313726,\n \"acc_stderr\": 0.01774089950917779,\n \ \ \"acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.01774089950917779\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n\ \ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n\ \ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.8081632653061225,\n \"acc_stderr\": 0.025206963154225395,\n\ \ \"acc_norm\": 0.8081632653061225,\n \"acc_norm_stderr\": 0.025206963154225395\n\ \ 
},\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n\ \ \"acc_stderr\": 0.021628920516700637,\n \"acc_norm\": 0.8955223880597015,\n\ \ \"acc_norm_stderr\": 0.021628920516700637\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.93,\n \"acc_stderr\": 0.0256432399976243,\n \ \ \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.0256432399976243\n },\n\ \ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\ \ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n\ \ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.0266405825391332,\n\ \ \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.0266405825391332\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3390452876376989,\n\ \ \"mc1_stderr\": 0.016571797910626615,\n \"mc2\": 0.5097969029790534,\n\ \ \"mc2_stderr\": 0.014915889066271937\n }\n}\n```" repo_url: https://huggingface.co/meta-math/MetaMath-70B-V1.0 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|arc:challenge|25_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hellaswag|10_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-01-20.870650.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-01-20.870650.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-01-20.870650.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-01-20.870650.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-01-20.870650.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-01-20.870650.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-01-20.870650.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T06-01-20.870650.parquet' - config_name: results data_files: - split: 2023_10_04T06_01_20.870650 path: - results_2023-10-04T06-01-20.870650.parquet - split: latest path: - results_2023-10-04T06-01-20.870650.parquet
---

# Dataset Card for Evaluation run of meta-math/MetaMath-70B-V1.0

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/meta-math/MetaMath-70B-V1.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model
[meta-math/MetaMath-70B-V1.0](https://huggingface.co/meta-math/MetaMath-70B-V1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_meta-math__MetaMath-70B-V1.0",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-04T06:01:20.870650](https://huggingface.co/datasets/open-llm-leaderboard/details_meta-math__MetaMath-70B-V1.0/blob/main/results_2023-10-04T06-01-20.870650.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "acc": 0.6919665391533253,
        "acc_stderr": 0.03077850676465074,
        "acc_norm": 0.6958343142814397,
        "acc_norm_stderr": 0.030750220845504973,
        "mc1": 0.3390452876376989,
        "mc1_stderr": 0.016571797910626615,
        "mc2": 0.5097969029790534,
        "mc2_stderr": 0.014915889066271937
    },
    "harness|arc:challenge|25": {
        "acc": 0.6416382252559727,
        "acc_stderr": 0.014012883334859857,
        "acc_norm": 0.6800341296928327,
        "acc_norm_stderr": 0.013631345807016193
    },
    "harness|hellaswag|10": {
        "acc": 0.6786496713802032,
        "acc_stderr": 0.004660405565338756,
        "acc_norm": 0.8684524995020912,
        "acc_norm_stderr": 0.003373073863582288
    },
    "harness|hendrycksTest-abstract_algebra|5": {
        "acc": 0.29,
        "acc_stderr": 0.04560480215720684,
        "acc_norm": 0.29,
        "acc_norm_stderr": 0.04560480215720684
    },
    "harness|hendrycksTest-anatomy|5": {
        "acc": 0.6148148148148148,
        "acc_stderr": 0.042039210401562783,
        "acc_norm": 0.6148148148148148,
        "acc_norm_stderr": 0.042039210401562783
    },
    "harness|hendrycksTest-astronomy|5": {
        "acc": 0.8223684210526315,
        "acc_stderr": 0.03110318238312338,
        "acc_norm": 0.8223684210526315,
        "acc_norm_stderr": 0.03110318238312338
    },
    "harness|hendrycksTest-business_ethics|5": {
        "acc": 0.75,
        "acc_stderr": 0.04351941398892446,
        "acc_norm": 0.75,
        "acc_norm_stderr": 0.04351941398892446
    },
    "harness|hendrycksTest-clinical_knowledge|5": {
        "acc": 0.7283018867924528,
        "acc_stderr": 0.027377706624670713,
        "acc_norm": 0.7283018867924528,
        "acc_norm_stderr": 0.027377706624670713
    },
    "harness|hendrycksTest-college_biology|5": {
        "acc": 0.8541666666666666,
        "acc_stderr": 0.029514245964291766,
        "acc_norm": 0.8541666666666666,
        "acc_norm_stderr": 0.029514245964291766
    },
    "harness|hendrycksTest-college_chemistry|5": {
        "acc": 0.5,
        "acc_stderr": 0.050251890762960605,
        "acc_norm": 0.5,
        "acc_norm_stderr": 0.050251890762960605
    },
    "harness|hendrycksTest-college_computer_science|5": {
        "acc": 0.59,
        "acc_stderr": 0.04943110704237102,
        "acc_norm": 0.59,
        "acc_norm_stderr": 0.04943110704237102
    },
    "harness|hendrycksTest-college_mathematics|5": {
        "acc": 0.38,
        "acc_stderr": 0.04878317312145633,
        "acc_norm": 0.38,
        "acc_norm_stderr": 0.04878317312145633
    },
    "harness|hendrycksTest-college_medicine|5": {
        "acc": 0.6705202312138728,
        "acc_stderr": 0.03583901754736412,
        "acc_norm": 0.6705202312138728,
        "acc_norm_stderr": 0.03583901754736412
    },
    "harness|hendrycksTest-college_physics|5": {
        "acc": 0.3627450980392157,
        "acc_stderr": 0.04784060704105653,
        "acc_norm": 0.3627450980392157,
        "acc_norm_stderr": 0.04784060704105653
    },
    "harness|hendrycksTest-computer_security|5": {
        "acc": 0.76,
        "acc_stderr": 0.04292346959909281,
        "acc_norm": 0.76,
        "acc_norm_stderr": 0.04292346959909281
    },
    "harness|hendrycksTest-conceptual_physics|5": {
        "acc": 0.6808510638297872,
        "acc_stderr": 0.030472973363380035,
        "acc_norm": 0.6808510638297872,
        "acc_norm_stderr": 0.030472973363380035
    },
    "harness|hendrycksTest-econometrics|5": {
        "acc": 0.42105263157894735,
        "acc_stderr": 0.046446020912223177,
        "acc_norm": 0.42105263157894735,
        "acc_norm_stderr": 0.046446020912223177
    },
    "harness|hendrycksTest-electrical_engineering|5": {
        "acc": 0.593103448275862,
        "acc_stderr": 0.04093793981266236,
        "acc_norm": 0.593103448275862,
        "acc_norm_stderr": 0.04093793981266236
    },
    "harness|hendrycksTest-elementary_mathematics|5": {
        "acc": 0.4603174603174603,
        "acc_stderr": 0.02567008063690919,
        "acc_norm": 0.4603174603174603,
        "acc_norm_stderr": 0.02567008063690919
    },
    "harness|hendrycksTest-formal_logic|5": {
        "acc": 0.4523809523809524,
        "acc_stderr": 0.044518079590553275,
        "acc_norm": 0.4523809523809524,
        "acc_norm_stderr": 0.044518079590553275
    },
    "harness|hendrycksTest-global_facts|5": {
        "acc": 0.41,
        "acc_stderr": 0.04943110704237102,
        "acc_norm": 0.41,
        "acc_norm_stderr": 0.04943110704237102
    },
    "harness|hendrycksTest-high_school_biology|5": {
        "acc": 0.8032258064516129,
        "acc_stderr": 0.02261640942074202,
        "acc_norm": 0.8032258064516129,
        "acc_norm_stderr": 0.02261640942074202
    },
    "harness|hendrycksTest-high_school_chemistry|5": {
        "acc": 0.5221674876847291,
        "acc_stderr": 0.03514528562175008,
        "acc_norm": 0.5221674876847291,
        "acc_norm_stderr": 0.03514528562175008
    },
    "harness|hendrycksTest-high_school_computer_science|5": {
        "acc": 0.75,
        "acc_stderr": 0.04351941398892446,
        "acc_norm": 0.75,
        "acc_norm_stderr": 0.04351941398892446
    },
    "harness|hendrycksTest-high_school_european_history|5": {
        "acc": 0.8363636363636363,
        "acc_stderr": 0.02888787239548795,
        "acc_norm": 0.8363636363636363,
        "acc_norm_stderr": 0.02888787239548795
    },
    "harness|hendrycksTest-high_school_geography|5": {
        "acc": 0.8787878787878788,
        "acc_stderr": 0.023253157951942095,
        "acc_norm": 0.8787878787878788,
        "acc_norm_stderr": 0.023253157951942095
    },
    "harness|hendrycksTest-high_school_government_and_politics|5": {
        "acc": 0.9430051813471503,
        "acc_stderr": 0.016731085293607555,
        "acc_norm": 0.9430051813471503,
        "acc_norm_stderr": 0.016731085293607555
    },
    "harness|hendrycksTest-high_school_macroeconomics|5": {
        "acc": 0.7076923076923077,
        "acc_stderr": 0.023060438380857733,
        "acc_norm": 0.7076923076923077,
        "acc_norm_stderr": 0.023060438380857733
    },
    "harness|hendrycksTest-high_school_mathematics|5": {
        "acc": 0.3333333333333333,
        "acc_stderr": 0.028742040903948485,
        "acc_norm": 0.3333333333333333,
        "acc_norm_stderr": 0.028742040903948485
    },
    "harness|hendrycksTest-high_school_microeconomics|5": {
        "acc": 0.7647058823529411,
        "acc_stderr": 0.02755361446786381,
        "acc_norm": 0.7647058823529411,
        "acc_norm_stderr": 0.02755361446786381
    },
    "harness|hendrycksTest-high_school_physics|5": {
        "acc": 0.423841059602649,
        "acc_stderr": 0.04034846678603397,
        "acc_norm": 0.423841059602649,
        "acc_norm_stderr": 0.04034846678603397
    },
    "harness|hendrycksTest-high_school_psychology|5": {
        "acc": 0.8880733944954129,
        "acc_stderr": 0.013517352714958788,
        "acc_norm": 0.8880733944954129,
        "acc_norm_stderr": 0.013517352714958788
    },
    "harness|hendrycksTest-high_school_statistics|5": {
        "acc": 0.5740740740740741,
        "acc_stderr": 0.033723432716530624,
        "acc_norm": 0.5740740740740741,
        "acc_norm_stderr": 0.033723432716530624
    },
    "harness|hendrycksTest-high_school_us_history|5": {
        "acc": 0.9313725490196079,
        "acc_stderr": 0.017744453647073312,
        "acc_norm": 0.9313725490196079,
        "acc_norm_stderr": 0.017744453647073312
    },
    "harness|hendrycksTest-high_school_world_history|5": {
        "acc": 0.8734177215189873,
        "acc_stderr": 0.021644195727955173,
        "acc_norm": 0.8734177215189873,
        "acc_norm_stderr": 0.021644195727955173
    },
    "harness|hendrycksTest-human_aging|5": {
        "acc": 0.7982062780269058,
        "acc_stderr": 0.02693611191280227,
        "acc_norm": 0.7982062780269058,
        "acc_norm_stderr": 0.02693611191280227
    },
    "harness|hendrycksTest-human_sexuality|5": {
        "acc": 0.8625954198473282,
        "acc_stderr": 0.030194823996804475,
        "acc_norm": 0.8625954198473282,
        "acc_norm_stderr": 0.030194823996804475
    },
    "harness|hendrycksTest-international_law|5": {
        "acc": 0.8677685950413223,
        "acc_stderr": 0.03092278832044579,
        "acc_norm": 0.8677685950413223,
        "acc_norm_stderr": 0.03092278832044579
    },
    "harness|hendrycksTest-jurisprudence|5": {
        "acc": 0.8333333333333334,
        "acc_stderr": 0.03602814176392645,
        "acc_norm": 0.8333333333333334,
        "acc_norm_stderr": 0.03602814176392645
    },
    "harness|hendrycksTest-logical_fallacies|5": {
        "acc": 0.8159509202453987,
        "acc_stderr": 0.030446777687971726,
        "acc_norm": 0.8159509202453987,
        "acc_norm_stderr": 0.030446777687971726
    },
    "harness|hendrycksTest-machine_learning|5": {
        "acc": 0.5267857142857143,
        "acc_stderr": 0.047389751192741546,
        "acc_norm": 0.5267857142857143,
        "acc_norm_stderr": 0.047389751192741546
    },
    "harness|hendrycksTest-management|5": {
        "acc": 0.8446601941747572,
        "acc_stderr": 0.03586594738573974,
        "acc_norm": 0.8446601941747572,
        "acc_norm_stderr": 0.03586594738573974
    },
    "harness|hendrycksTest-marketing|5": {
        "acc": 0.9017094017094017,
        "acc_stderr": 0.019503444900757567,
        "acc_norm": 0.9017094017094017,
        "acc_norm_stderr": 0.019503444900757567
    },
    "harness|hendrycksTest-medical_genetics|5": {
        "acc": 0.71,
        "acc_stderr": 0.045604802157206845,
        "acc_norm": 0.71,
        "acc_norm_stderr": 0.045604802157206845
    },
    "harness|hendrycksTest-miscellaneous|5": {
        "acc": 0.8659003831417624,
        "acc_stderr": 0.012185528166499978,
        "acc_norm": 0.8659003831417624,
        "acc_norm_stderr": 0.012185528166499978
    },
    "harness|hendrycksTest-moral_disputes|5": {
        "acc": 0.7832369942196532,
        "acc_stderr": 0.022183477668412856,
        "acc_norm": 0.7832369942196532,
        "acc_norm_stderr": 0.022183477668412856
    },
    "harness|hendrycksTest-moral_scenarios|5": {
        "acc": 0.41787709497206704,
        "acc_stderr": 0.016495400635820084,
        "acc_norm": 0.41787709497206704,
        "acc_norm_stderr": 0.016495400635820084
    },
    "harness|hendrycksTest-nutrition|5": {
        "acc": 0.7418300653594772,
        "acc_stderr": 0.025058503316958154,
        "acc_norm": 0.7418300653594772,
        "acc_norm_stderr": 0.025058503316958154
    },
    "harness|hendrycksTest-philosophy|5": {
        "acc": 0.797427652733119,
        "acc_stderr": 0.022827317491059686,
        "acc_norm": 0.797427652733119,
        "acc_norm_stderr": 0.022827317491059686
    },
    "harness|hendrycksTest-prehistory|5": {
        "acc": 0.8395061728395061,
        "acc_stderr": 0.020423955354778034,
        "acc_norm": 0.8395061728395061,
        "acc_norm_stderr": 0.020423955354778034
    },
    "harness|hendrycksTest-professional_accounting|5": {
        "acc": 0.5460992907801419,
        "acc_stderr": 0.029700453247291474,
        "acc_norm": 0.5460992907801419,
        "acc_norm_stderr": 0.029700453247291474
    },
    "harness|hendrycksTest-professional_law|5": {
        "acc": 0.5371577574967406,
        "acc_stderr": 0.01273492357953206,
        "acc_norm": 0.5371577574967406,
        "acc_norm_stderr": 0.01273492357953206
    },
    "harness|hendrycksTest-professional_medicine|5": {
        "acc": 0.7389705882352942,
        "acc_stderr": 0.026679252270103128,
        "acc_norm": 0.7389705882352942,
        "acc_norm_stderr": 0.026679252270103128
    },
    "harness|hendrycksTest-professional_psychology|5": {
        "acc": 0.7401960784313726,
        "acc_stderr": 0.01774089950917779,
        "acc_norm": 0.7401960784313726,
        "acc_norm_stderr": 0.01774089950917779
    },
    "harness|hendrycksTest-public_relations|5": {
        "acc": 0.7363636363636363,
        "acc_stderr":
0.04220224692971987, "acc_norm": 0.7363636363636363, "acc_norm_stderr": 0.04220224692971987 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.8081632653061225, "acc_stderr": 0.025206963154225395, "acc_norm": 0.8081632653061225, "acc_norm_stderr": 0.025206963154225395 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8955223880597015, "acc_stderr": 0.021628920516700637, "acc_norm": 0.8955223880597015, "acc_norm_stderr": 0.021628920516700637 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.93, "acc_stderr": 0.0256432399976243, "acc_norm": 0.93, "acc_norm_stderr": 0.0256432399976243 }, "harness|hendrycksTest-virology|5": { "acc": 0.536144578313253, "acc_stderr": 0.03882310850890594, "acc_norm": 0.536144578313253, "acc_norm_stderr": 0.03882310850890594 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8596491228070176, "acc_stderr": 0.0266405825391332, "acc_norm": 0.8596491228070176, "acc_norm_stderr": 0.0266405825391332 }, "harness|truthfulqa:mc|0": { "mc1": 0.3390452876376989, "mc1_stderr": 0.016571797910626615, "mc2": 0.5097969029790534, "mc2_stderr": 0.014915889066271937 } }
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
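The flat `harness|…` entries above pair each task with an `acc`/`acc_norm` score and its standard error, and the `"all"` block at the top of the results averages across every task in the run. As a small illustration of that per-task averaging step (the values are copied from the results above; the `macro_average` helper is our own and not part of the evaluation harness):

```python
# Sketch: macro-average a few of the per-task accuracies reported above.
# The score values are taken verbatim from the results JSON; the helper
# name is illustrative only.
results = {
    "harness|hendrycksTest-computer_security|5": 0.76,
    "harness|hendrycksTest-econometrics|5": 0.42105263157894735,
    "harness|hendrycksTest-us_foreign_policy|5": 0.93,
    "harness|hendrycksTest-virology|5": 0.536144578313253,
}

def macro_average(scores):
    """Unweighted mean over tasks (each task counts equally)."""
    return sum(scores.values()) / len(scores)

print(round(macro_average(results), 4))  # → 0.6618
```

This is only a sketch of the averaging over task accuracies; the aggregated `"all"` entry in the results is computed over the full set of tasks in the run, not just the four shown here.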
atom-in-the-universe/bild-822ea1a0-c3f2-487a-8edf-8681a71ac131
2023-10-04T06:24:04.000Z
[ "region:us" ]
atom-in-the-universe
null
null
null
0
0
Entry not found
open-llm-leaderboard/details_Doctor-Shotgun__CalliopeDS-v2-L2-13B
2023-10-04T06:13:05.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of Doctor-Shotgun/CalliopeDS-v2-L2-13B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Doctor-Shotgun/CalliopeDS-v2-L2-13B](https://huggingface.co/Doctor-Shotgun/CalliopeDS-v2-L2-13B)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Doctor-Shotgun__CalliopeDS-v2-L2-13B\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T06:11:32.681767](https://huggingface.co/datasets/open-llm-leaderboard/details_Doctor-Shotgun__CalliopeDS-v2-L2-13B/blob/main/results_2023-10-04T06-11-32.681767.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5634608989953492,\n\ \ \"acc_stderr\": 0.03449546929788679,\n \"acc_norm\": 0.5672633846209729,\n\ \ \"acc_norm_stderr\": 0.03447298770477952,\n \"mc1\": 0.34394124847001223,\n\ \ \"mc1_stderr\": 0.016629087514276792,\n \"mc2\": 0.5106295897497167,\n\ \ \"mc2_stderr\": 0.015438726334145936\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5981228668941979,\n \"acc_stderr\": 0.014327268614578276,\n\ \ \"acc_norm\": 0.6279863481228669,\n \"acc_norm_stderr\": 0.014124597881844463\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6468830910177256,\n\ \ \"acc_stderr\": 0.004769618829196506,\n \"acc_norm\": 0.8413662617008564,\n\ \ \"acc_norm_stderr\": 0.003645875568601281\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \ \ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\ \ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\ \ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5328947368421053,\n \"acc_stderr\": 0.040601270352363966,\n\ \ \"acc_norm\": 0.5328947368421053,\n \"acc_norm_stderr\": 0.040601270352363966\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\ \ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \ \ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n\ \ \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n\ \ \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n\ \ \"acc_norm_stderr\": 
0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \ \ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n\ \ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5086705202312138,\n\ \ \"acc_stderr\": 0.03811890988940412,\n \"acc_norm\": 0.5086705202312138,\n\ \ \"acc_norm_stderr\": 0.03811890988940412\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.043898699568087764,\n\ \ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.043898699568087764\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\ \ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.4297872340425532,\n \"acc_stderr\": 0.03236214467715564,\n\ \ \"acc_norm\": 0.4297872340425532,\n \"acc_norm_stderr\": 0.03236214467715564\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\ \ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\ \ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.04164188720169377,\n\ \ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.04164188720169377\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.3439153439153439,\n \"acc_stderr\": 0.024464426625596433,\n \"\ acc_norm\": 
0.3439153439153439,\n \"acc_norm_stderr\": 0.024464426625596433\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\ \ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\ \ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \ \ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\ \ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6548387096774193,\n\ \ \"acc_stderr\": 0.027045746573534327,\n \"acc_norm\": 0.6548387096774193,\n\ \ \"acc_norm_stderr\": 0.027045746573534327\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.03481904844438803,\n\ \ \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.03481904844438803\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\ : 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.03697442205031596,\n\ \ \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.03697442205031596\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7070707070707071,\n \"acc_stderr\": 0.032424979581788166,\n \"\ acc_norm\": 0.7070707070707071,\n \"acc_norm_stderr\": 0.032424979581788166\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8134715025906736,\n \"acc_stderr\": 0.028112091210117474,\n\ \ \"acc_norm\": 0.8134715025906736,\n \"acc_norm_stderr\": 0.028112091210117474\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.5128205128205128,\n \"acc_stderr\": 0.02534267129380725,\n \ \ \"acc_norm\": 0.5128205128205128,\n \"acc_norm_stderr\": 0.02534267129380725\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028604,\n \ \ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028604\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.5630252100840336,\n \"acc_stderr\": 0.032219436365661956,\n\ \ \"acc_norm\": 0.5630252100840336,\n \"acc_norm_stderr\": 0.032219436365661956\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"\ acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7541284403669725,\n \"acc_stderr\": 0.018461940968708436,\n \"\ acc_norm\": 0.7541284403669725,\n \"acc_norm_stderr\": 0.018461940968708436\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.4166666666666667,\n \"acc_stderr\": 0.03362277436608044,\n \"\ acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03362277436608044\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145628,\n \"\ acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145628\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n \ \ \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\ \ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\ \ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.04243869242230524,\n\ \ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.04243869242230524\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163025,\n \"\ acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163025\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\ \ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\ \ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935574,\n\ \ \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935574\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\ \ \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n\ \ \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.045821241601615506,\n\ \ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.045821241601615506\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7905982905982906,\n\ \ \"acc_stderr\": 0.026655699653922733,\n \"acc_norm\": 0.7905982905982906,\n\ \ \"acc_norm_stderr\": 0.026655699653922733\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \ \ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7624521072796935,\n\ \ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.7624521072796935,\n\ \ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6271676300578035,\n \"acc_stderr\": 0.026033890613576277,\n\ \ \"acc_norm\": 0.6271676300578035,\n \"acc_norm_stderr\": 0.026033890613576277\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.46145251396648046,\n\ \ \"acc_stderr\": 0.016672731267552265,\n 
\"acc_norm\": 0.46145251396648046,\n\ \ \"acc_norm_stderr\": 0.016672731267552265\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.02782610930728369,\n\ \ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.02782610930728369\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6077170418006431,\n\ \ \"acc_stderr\": 0.027731258647012005,\n \"acc_norm\": 0.6077170418006431,\n\ \ \"acc_norm_stderr\": 0.027731258647012005\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6265432098765432,\n \"acc_stderr\": 0.02691500301138015,\n\ \ \"acc_norm\": 0.6265432098765432,\n \"acc_norm_stderr\": 0.02691500301138015\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.41843971631205673,\n \"acc_stderr\": 0.02942799403941999,\n \ \ \"acc_norm\": 0.41843971631205673,\n \"acc_norm_stderr\": 0.02942799403941999\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41134289439374183,\n\ \ \"acc_stderr\": 0.012567882673803684,\n \"acc_norm\": 0.41134289439374183,\n\ \ \"acc_norm_stderr\": 0.012567882673803684\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5698529411764706,\n \"acc_stderr\": 0.030074971917302875,\n\ \ \"acc_norm\": 0.5698529411764706,\n \"acc_norm_stderr\": 0.030074971917302875\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.5473856209150327,\n \"acc_stderr\": 0.020136790918492523,\n \ \ \"acc_norm\": 0.5473856209150327,\n \"acc_norm_stderr\": 0.020136790918492523\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\ \ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\ \ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6163265306122448,\n \"acc_stderr\": 0.03113088039623593,\n\ \ \"acc_norm\": 0.6163265306122448,\n \"acc_norm_stderr\": 0.03113088039623593\n\ 
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n\ \ \"acc_stderr\": 0.03076944496729601,\n \"acc_norm\": 0.746268656716418,\n\ \ \"acc_norm_stderr\": 0.03076944496729601\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \ \ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\ \ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\ \ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338734,\n\ \ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338734\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34394124847001223,\n\ \ \"mc1_stderr\": 0.016629087514276792,\n \"mc2\": 0.5106295897497167,\n\ \ \"mc2_stderr\": 0.015438726334145936\n }\n}\n```" repo_url: https://huggingface.co/Doctor-Shotgun/CalliopeDS-v2-L2-13B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|arc:challenge|25_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hellaswag|10_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-11-32.681767.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-11-32.681767.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-11-32.681767.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-11-32.681767.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-11-32.681767.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-11-32.681767.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-11-32.681767.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-11-32.681767.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-11-32.681767.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T06_11_32.681767 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T06-11-32.681767.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T06-11-32.681767.parquet' - config_name: results data_files: - split: 2023_10_04T06_11_32.681767 path: - results_2023-10-04T06-11-32.681767.parquet - split: latest path: - results_2023-10-04T06-11-32.681767.parquet --- # Dataset Card for Evaluation run of Doctor-Shotgun/CalliopeDS-v2-L2-13B ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Doctor-Shotgun/CalliopeDS-v2-L2-13B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[Doctor-Shotgun/CalliopeDS-v2-L2-13B](https://huggingface.co/Doctor-Shotgun/CalliopeDS-v2-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Doctor-Shotgun__CalliopeDS-v2-L2-13B", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-10-04T06:11:32.681767](https://huggingface.co/datasets/open-llm-leaderboard/details_Doctor-Shotgun__CalliopeDS-v2-L2-13B/blob/main/results_2023-10-04T06-11-32.681767.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5634608989953492, "acc_stderr": 0.03449546929788679, "acc_norm": 0.5672633846209729, "acc_norm_stderr": 0.03447298770477952, "mc1": 0.34394124847001223, "mc1_stderr": 0.016629087514276792, "mc2": 0.5106295897497167, "mc2_stderr": 0.015438726334145936 }, "harness|arc:challenge|25": { "acc": 0.5981228668941979, "acc_stderr": 0.014327268614578276, "acc_norm": 0.6279863481228669, "acc_norm_stderr": 0.014124597881844463 }, "harness|hellaswag|10": { "acc": 0.6468830910177256, "acc_stderr": 0.004769618829196506, "acc_norm": 0.8413662617008564, "acc_norm_stderr": 0.003645875568601281 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4740740740740741, "acc_stderr": 0.04313531696750574, "acc_norm": 0.4740740740740741, "acc_norm_stderr": 0.04313531696750574 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5328947368421053, "acc_stderr": 0.040601270352363966, "acc_norm": 0.5328947368421053, "acc_norm_stderr": 0.040601270352363966 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.59, "acc_stderr": 0.049431107042371025, "acc_norm": 0.59, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6037735849056604, "acc_stderr": 0.030102793781791197, "acc_norm": 0.6037735849056604, "acc_norm_stderr": 0.030102793781791197 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6111111111111112, "acc_stderr": 0.04076663253918567, "acc_norm": 0.6111111111111112, "acc_norm_stderr": 0.04076663253918567 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.43, "acc_stderr": 0.04975698519562428, "acc_norm": 0.43, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956911, "acc_norm": 0.51, 
"acc_norm_stderr": 0.05024183937956911 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695236 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5086705202312138, "acc_stderr": 0.03811890988940412, "acc_norm": 0.5086705202312138, "acc_norm_stderr": 0.03811890988940412 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.2647058823529412, "acc_stderr": 0.043898699568087764, "acc_norm": 0.2647058823529412, "acc_norm_stderr": 0.043898699568087764 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4297872340425532, "acc_stderr": 0.03236214467715564, "acc_norm": 0.4297872340425532, "acc_norm_stderr": 0.03236214467715564 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.30701754385964913, "acc_stderr": 0.04339138322579861, "acc_norm": 0.30701754385964913, "acc_norm_stderr": 0.04339138322579861 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.4827586206896552, "acc_stderr": 0.04164188720169377, "acc_norm": 0.4827586206896552, "acc_norm_stderr": 0.04164188720169377 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3439153439153439, "acc_stderr": 0.024464426625596433, "acc_norm": 0.3439153439153439, "acc_norm_stderr": 0.024464426625596433 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.373015873015873, "acc_stderr": 0.04325506042017086, "acc_norm": 0.373015873015873, "acc_norm_stderr": 0.04325506042017086 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.4, "acc_stderr": 0.04923659639173309, "acc_norm": 0.4, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6548387096774193, "acc_stderr": 0.027045746573534327, "acc_norm": 0.6548387096774193, "acc_norm_stderr": 0.027045746573534327 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.42857142857142855, "acc_stderr": 0.03481904844438803, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.03481904844438803 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, "acc_norm": 0.57, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6606060606060606, "acc_stderr": 0.03697442205031596, "acc_norm": 0.6606060606060606, "acc_norm_stderr": 0.03697442205031596 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7070707070707071, "acc_stderr": 0.032424979581788166, "acc_norm": 0.7070707070707071, "acc_norm_stderr": 0.032424979581788166 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8134715025906736, "acc_stderr": 0.028112091210117474, "acc_norm": 0.8134715025906736, "acc_norm_stderr": 0.028112091210117474 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5128205128205128, "acc_stderr": 0.02534267129380725, "acc_norm": 0.5128205128205128, "acc_norm_stderr": 0.02534267129380725 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32222222222222224, "acc_stderr": 0.028493465091028604, "acc_norm": 0.32222222222222224, "acc_norm_stderr": 0.028493465091028604 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5630252100840336, "acc_stderr": 0.032219436365661956, "acc_norm": 0.5630252100840336, "acc_norm_stderr": 0.032219436365661956 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.32450331125827814, "acc_stderr": 0.03822746937658753, "acc_norm": 0.32450331125827814, "acc_norm_stderr": 0.03822746937658753 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7541284403669725, "acc_stderr": 0.018461940968708436, "acc_norm": 0.7541284403669725, "acc_norm_stderr": 0.018461940968708436 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4166666666666667, "acc_stderr": 
0.03362277436608044, "acc_norm": 0.4166666666666667, "acc_norm_stderr": 0.03362277436608044 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7647058823529411, "acc_stderr": 0.029771775228145628, "acc_norm": 0.7647058823529411, "acc_norm_stderr": 0.029771775228145628 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7383966244725738, "acc_stderr": 0.028609516716994934, "acc_norm": 0.7383966244725738, "acc_norm_stderr": 0.028609516716994934 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.672645739910314, "acc_stderr": 0.03149384670994131, "acc_norm": 0.672645739910314, "acc_norm_stderr": 0.03149384670994131 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6259541984732825, "acc_stderr": 0.04243869242230524, "acc_norm": 0.6259541984732825, "acc_norm_stderr": 0.04243869242230524 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7520661157024794, "acc_stderr": 0.039418975265163025, "acc_norm": 0.7520661157024794, "acc_norm_stderr": 0.039418975265163025 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7314814814814815, "acc_stderr": 0.042844679680521934, "acc_norm": 0.7314814814814815, "acc_norm_stderr": 0.042844679680521934 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6625766871165644, "acc_stderr": 0.03714908409935574, "acc_norm": 0.6625766871165644, "acc_norm_stderr": 0.03714908409935574 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.33035714285714285, "acc_stderr": 0.04464285714285714, "acc_norm": 0.33035714285714285, "acc_norm_stderr": 0.04464285714285714 }, "harness|hendrycksTest-management|5": { "acc": 0.6893203883495146, "acc_stderr": 0.045821241601615506, "acc_norm": 0.6893203883495146, "acc_norm_stderr": 0.045821241601615506 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7905982905982906, "acc_stderr": 0.026655699653922733, "acc_norm": 0.7905982905982906, "acc_norm_stderr": 0.026655699653922733 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.56, "acc_stderr": 
0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7624521072796935, "acc_stderr": 0.015218733046150193, "acc_norm": 0.7624521072796935, "acc_norm_stderr": 0.015218733046150193 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6271676300578035, "acc_stderr": 0.026033890613576277, "acc_norm": 0.6271676300578035, "acc_norm_stderr": 0.026033890613576277 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.46145251396648046, "acc_stderr": 0.016672731267552265, "acc_norm": 0.46145251396648046, "acc_norm_stderr": 0.016672731267552265 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6176470588235294, "acc_stderr": 0.02782610930728369, "acc_norm": 0.6176470588235294, "acc_norm_stderr": 0.02782610930728369 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6077170418006431, "acc_stderr": 0.027731258647012005, "acc_norm": 0.6077170418006431, "acc_norm_stderr": 0.027731258647012005 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6265432098765432, "acc_stderr": 0.02691500301138015, "acc_norm": 0.6265432098765432, "acc_norm_stderr": 0.02691500301138015 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.41843971631205673, "acc_stderr": 0.02942799403941999, "acc_norm": 0.41843971631205673, "acc_norm_stderr": 0.02942799403941999 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.41134289439374183, "acc_stderr": 0.012567882673803684, "acc_norm": 0.41134289439374183, "acc_norm_stderr": 0.012567882673803684 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5698529411764706, "acc_stderr": 0.030074971917302875, "acc_norm": 0.5698529411764706, "acc_norm_stderr": 0.030074971917302875 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5473856209150327, "acc_stderr": 0.020136790918492523, "acc_norm": 0.5473856209150327, "acc_norm_stderr": 0.020136790918492523 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6454545454545455, 
"acc_stderr": 0.045820048415054174, "acc_norm": 0.6454545454545455, "acc_norm_stderr": 0.045820048415054174 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6163265306122448, "acc_stderr": 0.03113088039623593, "acc_norm": 0.6163265306122448, "acc_norm_stderr": 0.03113088039623593 }, "harness|hendrycksTest-sociology|5": { "acc": 0.746268656716418, "acc_stderr": 0.03076944496729601, "acc_norm": 0.746268656716418, "acc_norm_stderr": 0.03076944496729601 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.82, "acc_stderr": 0.038612291966536934, "acc_norm": 0.82, "acc_norm_stderr": 0.038612291966536934 }, "harness|hendrycksTest-virology|5": { "acc": 0.4397590361445783, "acc_stderr": 0.03864139923699121, "acc_norm": 0.4397590361445783, "acc_norm_stderr": 0.03864139923699121 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.783625730994152, "acc_stderr": 0.03158149539338734, "acc_norm": 0.783625730994152, "acc_norm_stderr": 0.03158149539338734 }, "harness|truthfulqa:mc|0": { "mc1": 0.34394124847001223, "mc1_stderr": 0.016629087514276792, "mc2": 0.5106295897497167, "mc2_stderr": 0.015438726334145936 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
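As a closing usage note: the per-task config names listed in this card's YAML are derived mechanically from the underlying harness task names (for example, the files named `harness|hendrycksTest-abstract_algebra|5_…` belong to the config `harness_hendrycksTest_abstract_algebra_5`). A minimal sketch of that mapping, using a hypothetical helper name not defined by the leaderboard itself:

```python
def harness_config_name(task: str, num_fewshot: int) -> str:
    """Map a harness task name to the dataset config name used in this card.

    Mirrors the pattern visible in the YAML above: the '|' separators and
    any ':' or '-' characters inside task names all become underscores.
    """
    return f"harness_{task}_{num_fewshot}".replace(":", "_").replace("-", "_")


# Examples taken from the config list above:
print(harness_config_name("hendrycksTest-abstract_algebra", 5))
# -> harness_hendrycksTest_abstract_algebra_5
print(harness_config_name("truthfulqa:mc", 0))
# -> harness_truthfulqa_mc_0
```

The resulting string is what you would pass as the second argument to `load_dataset`, as in the snippet in the Dataset Summary section.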
Germinal/autotrain-data-perguntas-e-resposta-de-texto
2023-10-04T06:43:34.000Z
[ "region:us" ]
Germinal
null
null
null
0
0
Entry not found
atom-in-the-universe/bild-2c99444b-40a3-45eb-b75b-547feff036ad
2023-10-04T06:36:55.000Z
[ "region:us" ]
atom-in-the-universe
null
null
null
0
0
Entry not found
open-llm-leaderboard/details_DopeorNope__LaOT
2023-10-04T06:30:12.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of DopeorNope/LaOT dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [DopeorNope/LaOT](https://huggingface.co/DopeorNope/LaOT) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DopeorNope__LaOT\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T06:28:47.978535](https://huggingface.co/datasets/open-llm-leaderboard/details_DopeorNope__LaOT/blob/main/results_2023-10-04T06-28-47.978535.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5047817925188373,\n\ \ \"acc_stderr\": 0.03489464703209428,\n \"acc_norm\": 0.5087484207003742,\n\ \ \"acc_norm_stderr\": 0.03487912136583418,\n \"mc1\": 0.31334149326805383,\n\ \ \"mc1_stderr\": 0.016238065069059605,\n \"mc2\": 0.4472436271077177,\n\ \ \"mc2_stderr\": 0.014749127895935986\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5170648464163823,\n \"acc_stderr\": 0.014602878388536593,\n\ \ \"acc_norm\": 0.5563139931740614,\n \"acc_norm_stderr\": 0.014518421825670454\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5948018323043218,\n\ \ \"acc_stderr\": 0.0048992703105579915,\n \"acc_norm\": 0.7895837482573193,\n\ \ \"acc_norm_stderr\": 0.004067712564078285\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\ \ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\ \ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.4934210526315789,\n \"acc_stderr\": 0.040685900502249704,\n\ \ \"acc_norm\": 0.4934210526315789,\n \"acc_norm_stderr\": 0.040685900502249704\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\ \ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \ \ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.5773584905660377,\n \"acc_stderr\": 0.030402331445769544,\n\ \ \"acc_norm\": 0.5773584905660377,\n \"acc_norm_stderr\": 0.030402331445769544\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5486111111111112,\n\ \ \"acc_stderr\": 0.041614023984032786,\n \"acc_norm\": 0.5486111111111112,\n\ \ \"acc_norm_stderr\": 
0.041614023984032786\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n\ \ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4682080924855491,\n\ \ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.4682080924855491,\n\ \ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.044405219061793275,\n\ \ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.044405219061793275\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n\ \ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.4723404255319149,\n \"acc_stderr\": 0.03263597118409769,\n\ \ \"acc_norm\": 0.4723404255319149,\n \"acc_norm_stderr\": 0.03263597118409769\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\ \ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\ \ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\ \ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.2804232804232804,\n \"acc_stderr\": 0.02313528797432562,\n \"\ acc_norm\": 
0.2804232804232804,\n \"acc_norm_stderr\": 0.02313528797432562\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n\ \ \"acc_stderr\": 0.03970158273235173,\n \"acc_norm\": 0.2698412698412698,\n\ \ \"acc_norm_stderr\": 0.03970158273235173\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5483870967741935,\n\ \ \"acc_stderr\": 0.02831050034856839,\n \"acc_norm\": 0.5483870967741935,\n\ \ \"acc_norm_stderr\": 0.02831050034856839\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.39901477832512317,\n \"acc_stderr\": 0.034454876862647164,\n\ \ \"acc_norm\": 0.39901477832512317,\n \"acc_norm_stderr\": 0.034454876862647164\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\"\ : 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.03567969772268049,\n\ \ \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.03567969772268049\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.6515151515151515,\n \"acc_stderr\": 0.03394853965156402,\n \"\ acc_norm\": 0.6515151515151515,\n \"acc_norm_stderr\": 0.03394853965156402\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.7150259067357513,\n \"acc_stderr\": 0.032577140777096614,\n\ \ \"acc_norm\": 0.7150259067357513,\n \"acc_norm_stderr\": 0.032577140777096614\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.4717948717948718,\n \"acc_stderr\": 0.025310639254933903,\n\ \ \"acc_norm\": 0.4717948717948718,\n \"acc_norm_stderr\": 0.025310639254933903\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.25555555555555554,\n \"acc_stderr\": 0.026593939101844086,\n \ \ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.026593939101844086\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.03242225027115006,\n\ \ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.03242225027115006\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"\ acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7137614678899082,\n \"acc_stderr\": 0.019379436628919975,\n \"\ acc_norm\": 0.7137614678899082,\n \"acc_norm_stderr\": 0.019379436628919975\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.3472222222222222,\n \"acc_stderr\": 0.032468872436376486,\n \"\ acc_norm\": 0.3472222222222222,\n \"acc_norm_stderr\": 0.032468872436376486\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.6764705882352942,\n \"acc_stderr\": 0.03283472056108561,\n \"\ acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03283472056108561\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7088607594936709,\n \"acc_stderr\": 0.02957160106575337,\n \ \ \"acc_norm\": 0.7088607594936709,\n \"acc_norm_stderr\": 0.02957160106575337\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5515695067264574,\n\ \ \"acc_stderr\": 0.03337883736255098,\n \"acc_norm\": 0.5515695067264574,\n\ \ \"acc_norm_stderr\": 0.03337883736255098\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.5572519083969466,\n \"acc_stderr\": 0.04356447202665069,\n\ \ \"acc_norm\": 0.5572519083969466,\n \"acc_norm_stderr\": 0.04356447202665069\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.6694214876033058,\n \"acc_stderr\": 0.04294340845212094,\n \"\ acc_norm\": 0.6694214876033058,\n \"acc_norm_stderr\": 0.04294340845212094\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6481481481481481,\n\ \ \"acc_stderr\": 0.04616631111801713,\n \"acc_norm\": 0.6481481481481481,\n\ \ \"acc_norm_stderr\": 0.04616631111801713\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.558282208588957,\n \"acc_stderr\": 0.03901591825836184,\n\ \ \"acc_norm\": 0.558282208588957,\n \"acc_norm_stderr\": 0.03901591825836184\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\ \ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\ \ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.045821241601615506,\n\ \ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.045821241601615506\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7649572649572649,\n\ \ \"acc_stderr\": 0.02777883590493543,\n \"acc_norm\": 0.7649572649572649,\n\ \ \"acc_norm_stderr\": 0.02777883590493543\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \ \ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7075351213282248,\n\ \ \"acc_stderr\": 0.016267000684598635,\n \"acc_norm\": 0.7075351213282248,\n\ \ \"acc_norm_stderr\": 0.016267000684598635\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.5780346820809249,\n \"acc_stderr\": 0.026589231142174263,\n\ \ \"acc_norm\": 0.5780346820809249,\n \"acc_norm_stderr\": 0.026589231142174263\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n\ \ \"acc_stderr\": 0.01435591196476786,\n \"acc_norm\": 0.2435754189944134,\n\ \ 
\"acc_norm_stderr\": 0.01435591196476786\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.5196078431372549,\n \"acc_stderr\": 0.028607893699576066,\n\ \ \"acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.028607893699576066\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5980707395498392,\n\ \ \"acc_stderr\": 0.02784647600593047,\n \"acc_norm\": 0.5980707395498392,\n\ \ \"acc_norm_stderr\": 0.02784647600593047\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.02764847787741332,\n\ \ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.02764847787741332\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.37943262411347517,\n \"acc_stderr\": 0.028947338851614105,\n \ \ \"acc_norm\": 0.37943262411347517,\n \"acc_norm_stderr\": 0.028947338851614105\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38657105606258146,\n\ \ \"acc_stderr\": 0.012437288868088727,\n \"acc_norm\": 0.38657105606258146,\n\ \ \"acc_norm_stderr\": 0.012437288868088727\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.48161764705882354,\n \"acc_stderr\": 0.030352303395351964,\n\ \ \"acc_norm\": 0.48161764705882354,\n \"acc_norm_stderr\": 0.030352303395351964\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.4934640522875817,\n \"acc_stderr\": 0.020226106567657807,\n \ \ \"acc_norm\": 0.4934640522875817,\n \"acc_norm_stderr\": 0.020226106567657807\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\ \ \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n\ \ \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.5877551020408164,\n \"acc_stderr\": 0.03151236044674268,\n\ \ \"acc_norm\": 0.5877551020408164,\n \"acc_norm_stderr\": 0.03151236044674268\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5870646766169154,\n\ \ \"acc_stderr\": 0.03481520803367348,\n \"acc_norm\": 0.5870646766169154,\n\ \ \"acc_norm_stderr\": 0.03481520803367348\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \ \ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\ \ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\ \ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03565079670708312,\n\ \ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03565079670708312\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31334149326805383,\n\ \ \"mc1_stderr\": 0.016238065069059605,\n \"mc2\": 0.4472436271077177,\n\ \ \"mc2_stderr\": 0.014749127895935986\n }\n}\n```" repo_url: https://huggingface.co/DopeorNope/LaOT leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|arc:challenge|25_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hellaswag|10_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-28-47.978535.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-28-47.978535.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-28-47.978535.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-28-47.978535.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-28-47.978535.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-28-47.978535.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-28-47.978535.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-28-47.978535.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-28-47.978535.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T06_28_47.978535 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T06-28-47.978535.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T06-28-47.978535.parquet' - config_name: results data_files: - split: 2023_10_04T06_28_47.978535 path: - results_2023-10-04T06-28-47.978535.parquet - split: latest path: - results_2023-10-04T06-28-47.978535.parquet --- # Dataset Card for Evaluation run of DopeorNope/LaOT ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/DopeorNope/LaOT - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [DopeorNope/LaOT](https://huggingface.co/DopeorNope/LaOT) on the 
[Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DopeorNope__LaOT",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-04T06:28:47.978535](https://huggingface.co/datasets/open-llm-leaderboard/details_DopeorNope__LaOT/blob/main/results_2023-10-04T06-28-47.978535.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5047817925188373, "acc_stderr": 0.03489464703209428, "acc_norm": 0.5087484207003742, "acc_norm_stderr": 0.03487912136583418, "mc1": 0.31334149326805383, "mc1_stderr": 0.016238065069059605, "mc2": 0.4472436271077177, "mc2_stderr": 0.014749127895935986 }, "harness|arc:challenge|25": { "acc": 0.5170648464163823, "acc_stderr": 0.014602878388536593, "acc_norm": 0.5563139931740614, "acc_norm_stderr": 0.014518421825670454 }, "harness|hellaswag|10": { "acc": 0.5948018323043218, "acc_stderr": 0.0048992703105579915, "acc_norm": 0.7895837482573193, "acc_norm_stderr": 0.004067712564078285 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.29, "acc_stderr": 0.04560480215720684, "acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5185185185185185, "acc_stderr": 0.043163785995113245, "acc_norm": 0.5185185185185185, "acc_norm_stderr": 0.043163785995113245 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.4934210526315789, "acc_stderr": 0.040685900502249704, "acc_norm": 0.4934210526315789, "acc_norm_stderr": 0.040685900502249704 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5773584905660377, "acc_stderr": 0.030402331445769544, "acc_norm": 0.5773584905660377, "acc_norm_stderr": 0.030402331445769544 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5486111111111112, "acc_stderr": 0.041614023984032786, "acc_norm": 0.5486111111111112, "acc_norm_stderr": 0.041614023984032786 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, 
"acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.4682080924855491, "acc_stderr": 0.03804749744364764, "acc_norm": 0.4682080924855491, "acc_norm_stderr": 0.03804749744364764 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.27450980392156865, "acc_stderr": 0.044405219061793275, "acc_norm": 0.27450980392156865, "acc_norm_stderr": 0.044405219061793275 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.6, "acc_stderr": 0.049236596391733084, "acc_norm": 0.6, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4723404255319149, "acc_stderr": 0.03263597118409769, "acc_norm": 0.4723404255319149, "acc_norm_stderr": 0.03263597118409769 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.3157894736842105, "acc_stderr": 0.043727482902780064, "acc_norm": 0.3157894736842105, "acc_norm_stderr": 0.043727482902780064 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.4689655172413793, "acc_stderr": 0.04158632762097828, "acc_norm": 0.4689655172413793, "acc_norm_stderr": 0.04158632762097828 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2804232804232804, "acc_stderr": 0.02313528797432562, "acc_norm": 0.2804232804232804, "acc_norm_stderr": 0.02313528797432562 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2698412698412698, "acc_stderr": 0.03970158273235173, "acc_norm": 0.2698412698412698, "acc_norm_stderr": 0.03970158273235173 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5483870967741935, "acc_stderr": 0.02831050034856839, "acc_norm": 0.5483870967741935, "acc_norm_stderr": 0.02831050034856839 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.39901477832512317, "acc_stderr": 0.034454876862647164, "acc_norm": 0.39901477832512317, "acc_norm_stderr": 0.034454876862647164 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.38, "acc_stderr": 0.04878317312145633, "acc_norm": 0.38, "acc_norm_stderr": 0.04878317312145633 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.703030303030303, "acc_stderr": 0.03567969772268049, "acc_norm": 0.703030303030303, "acc_norm_stderr": 0.03567969772268049 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6515151515151515, "acc_stderr": 0.03394853965156402, "acc_norm": 0.6515151515151515, "acc_norm_stderr": 0.03394853965156402 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7150259067357513, "acc_stderr": 0.032577140777096614, "acc_norm": 0.7150259067357513, "acc_norm_stderr": 0.032577140777096614 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.4717948717948718, "acc_stderr": 0.025310639254933903, "acc_norm": 0.4717948717948718, "acc_norm_stderr": 0.025310639254933903 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.25555555555555554, "acc_stderr": 0.026593939101844086, "acc_norm": 0.25555555555555554, "acc_norm_stderr": 0.026593939101844086 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.47058823529411764, "acc_stderr": 0.03242225027115006, "acc_norm": 0.47058823529411764, "acc_norm_stderr": 0.03242225027115006 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.32450331125827814, "acc_stderr": 0.03822746937658753, "acc_norm": 0.32450331125827814, "acc_norm_stderr": 0.03822746937658753 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7137614678899082, "acc_stderr": 0.019379436628919975, "acc_norm": 0.7137614678899082, "acc_norm_stderr": 0.019379436628919975 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.3472222222222222, "acc_stderr": 
0.032468872436376486, "acc_norm": 0.3472222222222222, "acc_norm_stderr": 0.032468872436376486 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6764705882352942, "acc_stderr": 0.03283472056108561, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.03283472056108561 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7088607594936709, "acc_stderr": 0.02957160106575337, "acc_norm": 0.7088607594936709, "acc_norm_stderr": 0.02957160106575337 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5515695067264574, "acc_stderr": 0.03337883736255098, "acc_norm": 0.5515695067264574, "acc_norm_stderr": 0.03337883736255098 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5572519083969466, "acc_stderr": 0.04356447202665069, "acc_norm": 0.5572519083969466, "acc_norm_stderr": 0.04356447202665069 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6694214876033058, "acc_stderr": 0.04294340845212094, "acc_norm": 0.6694214876033058, "acc_norm_stderr": 0.04294340845212094 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6481481481481481, "acc_stderr": 0.04616631111801713, "acc_norm": 0.6481481481481481, "acc_norm_stderr": 0.04616631111801713 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.558282208588957, "acc_stderr": 0.03901591825836184, "acc_norm": 0.558282208588957, "acc_norm_stderr": 0.03901591825836184 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3125, "acc_stderr": 0.043994650575715215, "acc_norm": 0.3125, "acc_norm_stderr": 0.043994650575715215 }, "harness|hendrycksTest-management|5": { "acc": 0.6893203883495146, "acc_stderr": 0.045821241601615506, "acc_norm": 0.6893203883495146, "acc_norm_stderr": 0.045821241601615506 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7649572649572649, "acc_stderr": 0.02777883590493543, "acc_norm": 0.7649572649572649, "acc_norm_stderr": 0.02777883590493543 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.59, "acc_stderr": 0.04943110704237102, "acc_norm": 
0.59, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7075351213282248, "acc_stderr": 0.016267000684598635, "acc_norm": 0.7075351213282248, "acc_norm_stderr": 0.016267000684598635 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5780346820809249, "acc_stderr": 0.026589231142174263, "acc_norm": 0.5780346820809249, "acc_norm_stderr": 0.026589231142174263 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2435754189944134, "acc_stderr": 0.01435591196476786, "acc_norm": 0.2435754189944134, "acc_norm_stderr": 0.01435591196476786 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5196078431372549, "acc_stderr": 0.028607893699576066, "acc_norm": 0.5196078431372549, "acc_norm_stderr": 0.028607893699576066 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5980707395498392, "acc_stderr": 0.02784647600593047, "acc_norm": 0.5980707395498392, "acc_norm_stderr": 0.02784647600593047 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5555555555555556, "acc_stderr": 0.02764847787741332, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.02764847787741332 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.37943262411347517, "acc_stderr": 0.028947338851614105, "acc_norm": 0.37943262411347517, "acc_norm_stderr": 0.028947338851614105 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.38657105606258146, "acc_stderr": 0.012437288868088727, "acc_norm": 0.38657105606258146, "acc_norm_stderr": 0.012437288868088727 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.48161764705882354, "acc_stderr": 0.030352303395351964, "acc_norm": 0.48161764705882354, "acc_norm_stderr": 0.030352303395351964 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.4934640522875817, "acc_stderr": 0.020226106567657807, "acc_norm": 0.4934640522875817, "acc_norm_stderr": 0.020226106567657807 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5909090909090909, "acc_stderr": 0.04709306978661895, 
"acc_norm": 0.5909090909090909, "acc_norm_stderr": 0.04709306978661895 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5877551020408164, "acc_stderr": 0.03151236044674268, "acc_norm": 0.5877551020408164, "acc_norm_stderr": 0.03151236044674268 }, "harness|hendrycksTest-sociology|5": { "acc": 0.5870646766169154, "acc_stderr": 0.03481520803367348, "acc_norm": 0.5870646766169154, "acc_norm_stderr": 0.03481520803367348 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-virology|5": { "acc": 0.42168674698795183, "acc_stderr": 0.03844453181770917, "acc_norm": 0.42168674698795183, "acc_norm_stderr": 0.03844453181770917 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.6842105263157895, "acc_stderr": 0.03565079670708312, "acc_norm": 0.6842105263157895, "acc_norm_stderr": 0.03565079670708312 }, "harness|truthfulqa:mc|0": { "mc1": 0.31334149326805383, "mc1_stderr": 0.016238065069059605, "mc2": 0.4472436271077177, "mc2_stderr": 0.014749127895935986 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
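The timestamped split names used throughout the configurations above (e.g. `2023_10_04T06_28_47.978535`) encode the run time. As a minimal sketch using only the Python standard library (the helper name is illustrative, not part of the dataset), such a split name can be parsed back into a `datetime`:

```python
from datetime import datetime

def parse_split_timestamp(split_name: str) -> datetime:
    """Parse a run-timestamped split name such as
    '2023_10_04T06_28_47.978535' back into a datetime object."""
    return datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")

run_time = parse_split_timestamp("2023_10_04T06_28_47.978535")
print(run_time.isoformat())  # 2023-10-04T06:28:47.978535
```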
open-llm-leaderboard/details_PY007__TinyLlama-1.1B-intermediate-step-480k-1T
2023-10-04T06:33:51.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of PY007/TinyLlama-1.1B-intermediate-step-480k-1T dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [PY007/TinyLlama-1.1B-intermediate-step-480k-1T](https://huggingface.co/PY007/TinyLlama-1.1B-intermediate-step-480k-1T)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PY007__TinyLlama-1.1B-intermediate-step-480k-1T\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T06:32:33.540256](https://huggingface.co/datasets/open-llm-leaderboard/details_PY007__TinyLlama-1.1B-intermediate-step-480k-1T/blob/main/results_2023-10-04T06-32-33.540256.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25274382736277073,\n\ \ \"acc_stderr\": 0.031563373079172515,\n \"acc_norm\": 0.2556943466182372,\n\ \ \"acc_norm_stderr\": 0.03157551212101923,\n \"mc1\": 0.23011015911872704,\n\ \ \"mc1_stderr\": 0.014734557959807769,\n \"mc2\": 0.3955364206916061,\n\ \ \"mc2_stderr\": 0.014156601031172413\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.2636518771331058,\n \"acc_stderr\": 0.012875929151297044,\n\ \ \"acc_norm\": 0.30887372013651876,\n \"acc_norm_stderr\": 0.013501770929344003\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4008165704043019,\n\ \ \"acc_stderr\": 0.004890623693243623,\n \"acc_norm\": 0.5296753634734117,\n\ \ \"acc_norm_stderr\": 0.004980985384152897\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \ \ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3111111111111111,\n\ \ \"acc_stderr\": 0.03999262876617722,\n \"acc_norm\": 0.3111111111111111,\n\ \ \"acc_norm_stderr\": 0.03999262876617722\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.2236842105263158,\n \"acc_stderr\": 0.03391160934343602,\n\ \ \"acc_norm\": 0.2236842105263158,\n \"acc_norm_stderr\": 0.03391160934343602\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\ \ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \ \ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.24528301886792453,\n \"acc_stderr\": 0.026480357179895685,\n\ \ \"acc_norm\": 0.24528301886792453,\n \"acc_norm_stderr\": 0.026480357179895685\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.22916666666666666,\n\ \ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.22916666666666666,\n\ \ \"acc_norm_stderr\": 
0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \ \ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\ \ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \ \ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\ \ \"acc_stderr\": 0.030952890217749895,\n \"acc_norm\": 0.20809248554913296,\n\ \ \"acc_norm_stderr\": 0.030952890217749895\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\ \ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.32,\n\ \ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.2553191489361702,\n \"acc_stderr\": 0.028504856470514192,\n\ \ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.028504856470514192\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\ \ \"acc_stderr\": 0.0404933929774814,\n \"acc_norm\": 0.24561403508771928,\n\ \ \"acc_norm_stderr\": 0.0404933929774814\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.20689655172413793,\n \"acc_stderr\": 0.03375672449560554,\n\ \ \"acc_norm\": 0.20689655172413793,\n \"acc_norm_stderr\": 0.03375672449560554\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.20634920634920634,\n \"acc_stderr\": 0.020842290930114676,\n \"\ acc_norm\": 
0.20634920634920634,\n \"acc_norm_stderr\": 0.020842290930114676\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.18253968253968253,\n\ \ \"acc_stderr\": 0.03455071019102146,\n \"acc_norm\": 0.18253968253968253,\n\ \ \"acc_norm_stderr\": 0.03455071019102146\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \ \ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.22903225806451613,\n \"acc_stderr\": 0.023904914311782658,\n \"\ acc_norm\": 0.22903225806451613,\n \"acc_norm_stderr\": 0.023904914311782658\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.24630541871921183,\n \"acc_stderr\": 0.030315099285617732,\n \"\ acc_norm\": 0.24630541871921183,\n \"acc_norm_stderr\": 0.030315099285617732\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\ : 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.2787878787878788,\n \"acc_stderr\": 0.03501438706296782,\n\ \ \"acc_norm\": 0.2787878787878788,\n \"acc_norm_stderr\": 0.03501438706296782\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.19696969696969696,\n \"acc_stderr\": 0.02833560973246335,\n \"\ acc_norm\": 0.19696969696969696,\n \"acc_norm_stderr\": 0.02833560973246335\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.23316062176165803,\n \"acc_stderr\": 0.030516111371476008,\n\ \ \"acc_norm\": 0.23316062176165803,\n \"acc_norm_stderr\": 0.030516111371476008\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.21025641025641026,\n \"acc_stderr\": 0.02066059748502693,\n\ \ \"acc_norm\": 0.21025641025641026,\n \"acc_norm_stderr\": 0.02066059748502693\n\ \ 
},\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712163,\n \ \ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712163\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.2184873949579832,\n \"acc_stderr\": 0.02684151432295894,\n \ \ \"acc_norm\": 0.2184873949579832,\n \"acc_norm_stderr\": 0.02684151432295894\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\ : 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\ \ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.21651376146788992,\n\ \ \"acc_stderr\": 0.017658710594443128,\n \"acc_norm\": 0.21651376146788992,\n\ \ \"acc_norm_stderr\": 0.017658710594443128\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\ : {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.02835321286686344,\n\ \ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.02835321286686344\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.24509803921568626,\n \"acc_stderr\": 0.03019028245350194,\n \"\ acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.03019028245350194\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \ \ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.30493273542600896,\n\ \ \"acc_stderr\": 0.030898610882477518,\n \"acc_norm\": 0.30493273542600896,\n\ \ \"acc_norm_stderr\": 0.030898610882477518\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.03915345408847834,\n\ \ \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.03915345408847834\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.256198347107438,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\ : 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\ \ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3148148148148148,\n\ \ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.3148148148148148,\n\ \ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.2883435582822086,\n \"acc_stderr\": 0.035590395316173425,\n\ \ \"acc_norm\": 0.2883435582822086,\n \"acc_norm_stderr\": 0.035590395316173425\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\ \ \"acc_stderr\": 0.043270409325787296,\n \"acc_norm\": 0.29464285714285715,\n\ \ \"acc_norm_stderr\": 0.043270409325787296\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.1650485436893204,\n \"acc_stderr\": 0.036756688322331886,\n\ \ \"acc_norm\": 0.1650485436893204,\n \"acc_norm_stderr\": 0.036756688322331886\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2863247863247863,\n\ \ \"acc_stderr\": 0.029614323690456655,\n \"acc_norm\": 0.2863247863247863,\n\ \ \"acc_norm_stderr\": 0.029614323690456655\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \ \ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.25287356321839083,\n\ \ \"acc_stderr\": 0.01554337731371968,\n \"acc_norm\": 0.25287356321839083,\n\ \ \"acc_norm_stderr\": 0.01554337731371968\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.27167630057803466,\n \"acc_stderr\": 0.023948512905468365,\n\ \ \"acc_norm\": 0.27167630057803466,\n \"acc_norm_stderr\": 0.023948512905468365\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\ \ \"acc_stderr\": 0.014422292204808835,\n 
\"acc_norm\": 0.24692737430167597,\n\ \ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.238562091503268,\n \"acc_stderr\": 0.024404394928087873,\n\ \ \"acc_norm\": 0.238562091503268,\n \"acc_norm_stderr\": 0.024404394928087873\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3022508038585209,\n\ \ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.3022508038585209,\n\ \ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.024659685185967277,\n\ \ \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.024659685185967277\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.2375886524822695,\n \"acc_stderr\": 0.025389512552729903,\n \ \ \"acc_norm\": 0.2375886524822695,\n \"acc_norm_stderr\": 0.025389512552729903\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23859191655801826,\n\ \ \"acc_stderr\": 0.010885929742002212,\n \"acc_norm\": 0.23859191655801826,\n\ \ \"acc_norm_stderr\": 0.010885929742002212\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.19852941176470587,\n \"acc_stderr\": 0.024231013370541097,\n\ \ \"acc_norm\": 0.19852941176470587,\n \"acc_norm_stderr\": 0.024231013370541097\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\ : 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\ : {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.041723430387053825,\n\ \ \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.041723430387053825\n\ \ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.20816326530612245,\n\ \ \"acc_stderr\": 0.025991117672813296,\n \"acc_norm\": 0.20816326530612245,\n\ \ \"acc_norm_stderr\": 0.025991117672813296\n },\n 
\"harness|hendrycksTest-sociology|5\"\ : {\n \"acc\": 0.24875621890547264,\n \"acc_stderr\": 0.030567675938916718,\n\ \ \"acc_norm\": 0.24875621890547264,\n \"acc_norm_stderr\": 0.030567675938916718\n\ \ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\ \ 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\ \ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\"\ : {\n \"acc\": 0.26506024096385544,\n \"acc_stderr\": 0.03436024037944967,\n\ \ \"acc_norm\": 0.26506024096385544,\n \"acc_norm_stderr\": 0.03436024037944967\n\ \ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.30994152046783624,\n\ \ \"acc_stderr\": 0.03546976959393163,\n \"acc_norm\": 0.30994152046783624,\n\ \ \"acc_norm_stderr\": 0.03546976959393163\n },\n \"harness|truthfulqa:mc|0\"\ : {\n \"mc1\": 0.23011015911872704,\n \"mc1_stderr\": 0.014734557959807769,\n\ \ \"mc2\": 0.3955364206916061,\n \"mc2_stderr\": 0.014156601031172413\n\ \ }\n}\n```" repo_url: https://huggingface.co/PY007/TinyLlama-1.1B-intermediate-step-480k-1T leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|arc:challenge|25_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hellaswag|10_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-32-33.540256.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-32-33.540256.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-32-33.540256.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-32-33.540256.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-32-33.540256.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-32-33.540256.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-32-33.540256.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-32-33.540256.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-32-33.540256.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T06_32_33.540256 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T06-32-33.540256.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T06-32-33.540256.parquet' - config_name: results data_files: - split: 2023_10_04T06_32_33.540256 path: - results_2023-10-04T06-32-33.540256.parquet - split: latest path: - results_2023-10-04T06-32-33.540256.parquet --- # Dataset Card for Evaluation run of PY007/TinyLlama-1.1B-intermediate-step-480k-1T ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/PY007/TinyLlama-1.1B-intermediate-step-480k-1T - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[PY007/TinyLlama-1.1B-intermediate-step-480k-1T](https://huggingface.co/PY007/TinyLlama-1.1B-intermediate-step-480k-1T) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_PY007__TinyLlama-1.1B-intermediate-step-480k-1T", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-10-04T06:32:33.540256](https://huggingface.co/datasets/open-llm-leaderboard/details_PY007__TinyLlama-1.1B-intermediate-step-480k-1T/blob/main/results_2023-10-04T06-32-33.540256.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.25274382736277073, "acc_stderr": 0.031563373079172515, "acc_norm": 0.2556943466182372, "acc_norm_stderr": 0.03157551212101923, "mc1": 0.23011015911872704, "mc1_stderr": 0.014734557959807769, "mc2": 0.3955364206916061, "mc2_stderr": 0.014156601031172413 }, "harness|arc:challenge|25": { "acc": 0.2636518771331058, "acc_stderr": 0.012875929151297044, "acc_norm": 0.30887372013651876, "acc_norm_stderr": 0.013501770929344003 }, "harness|hellaswag|10": { "acc": 0.4008165704043019, "acc_stderr": 0.004890623693243623, "acc_norm": 0.5296753634734117, "acc_norm_stderr": 0.004980985384152897 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.26, "acc_stderr": 0.04408440022768081, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768081 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.3111111111111111, "acc_stderr": 0.03999262876617722, "acc_norm": 0.3111111111111111, "acc_norm_stderr": 0.03999262876617722 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.2236842105263158, "acc_stderr": 0.03391160934343602, "acc_norm": 0.2236842105263158, "acc_norm_stderr": 0.03391160934343602 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.21, "acc_stderr": 0.040936018074033256, "acc_norm": 0.21, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.24528301886792453, "acc_stderr": 0.026480357179895685, "acc_norm": 0.24528301886792453, "acc_norm_stderr": 0.026480357179895685 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.22916666666666666, "acc_stderr": 0.03514697467862388, "acc_norm": 0.22916666666666666, "acc_norm_stderr": 0.03514697467862388 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.2, "acc_stderr": 0.040201512610368445, "acc_norm": 0.2, "acc_norm_stderr": 0.040201512610368445 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, 
"acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.20809248554913296, "acc_stderr": 0.030952890217749895, "acc_norm": 0.20809248554913296, "acc_norm_stderr": 0.030952890217749895 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.22549019607843138, "acc_stderr": 0.041583075330832865, "acc_norm": 0.22549019607843138, "acc_norm_stderr": 0.041583075330832865 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.32, "acc_stderr": 0.04688261722621505, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621505 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.2553191489361702, "acc_stderr": 0.028504856470514192, "acc_norm": 0.2553191489361702, "acc_norm_stderr": 0.028504856470514192 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.24561403508771928, "acc_stderr": 0.0404933929774814, "acc_norm": 0.24561403508771928, "acc_norm_stderr": 0.0404933929774814 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.20689655172413793, "acc_stderr": 0.03375672449560554, "acc_norm": 0.20689655172413793, "acc_norm_stderr": 0.03375672449560554 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.20634920634920634, "acc_stderr": 0.020842290930114676, "acc_norm": 0.20634920634920634, "acc_norm_stderr": 0.020842290930114676 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.18253968253968253, "acc_stderr": 0.03455071019102146, "acc_norm": 0.18253968253968253, "acc_norm_stderr": 0.03455071019102146 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.22903225806451613, "acc_stderr": 0.023904914311782658, "acc_norm": 0.22903225806451613, "acc_norm_stderr": 0.023904914311782658 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.24630541871921183, "acc_stderr": 0.030315099285617732, "acc_norm": 0.24630541871921183, "acc_norm_stderr": 0.030315099285617732 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.2787878787878788, "acc_stderr": 0.03501438706296782, "acc_norm": 0.2787878787878788, "acc_norm_stderr": 0.03501438706296782 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.19696969696969696, "acc_stderr": 0.02833560973246335, "acc_norm": 0.19696969696969696, "acc_norm_stderr": 0.02833560973246335 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.23316062176165803, "acc_stderr": 0.030516111371476008, "acc_norm": 0.23316062176165803, "acc_norm_stderr": 0.030516111371476008 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.21025641025641026, "acc_stderr": 0.02066059748502693, "acc_norm": 0.21025641025641026, "acc_norm_stderr": 0.02066059748502693 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.25925925925925924, "acc_stderr": 0.026719240783712163, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.026719240783712163 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.2184873949579832, "acc_stderr": 0.02684151432295894, "acc_norm": 0.2184873949579832, "acc_norm_stderr": 0.02684151432295894 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.304635761589404, "acc_stderr": 0.03757949922943343, "acc_norm": 0.304635761589404, "acc_norm_stderr": 0.03757949922943343 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.21651376146788992, "acc_stderr": 0.017658710594443128, "acc_norm": 0.21651376146788992, "acc_norm_stderr": 0.017658710594443128 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.2222222222222222, "acc_stderr": 
0.02835321286686344, "acc_norm": 0.2222222222222222, "acc_norm_stderr": 0.02835321286686344 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.24509803921568626, "acc_stderr": 0.03019028245350194, "acc_norm": 0.24509803921568626, "acc_norm_stderr": 0.03019028245350194 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.270042194092827, "acc_stderr": 0.028900721906293426, "acc_norm": 0.270042194092827, "acc_norm_stderr": 0.028900721906293426 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.30493273542600896, "acc_stderr": 0.030898610882477518, "acc_norm": 0.30493273542600896, "acc_norm_stderr": 0.030898610882477518 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.2748091603053435, "acc_stderr": 0.03915345408847834, "acc_norm": 0.2748091603053435, "acc_norm_stderr": 0.03915345408847834 }, "harness|hendrycksTest-international_law|5": { "acc": 0.256198347107438, "acc_stderr": 0.03984979653302872, "acc_norm": 0.256198347107438, "acc_norm_stderr": 0.03984979653302872 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.3148148148148148, "acc_stderr": 0.04489931073591312, "acc_norm": 0.3148148148148148, "acc_norm_stderr": 0.04489931073591312 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.2883435582822086, "acc_stderr": 0.035590395316173425, "acc_norm": 0.2883435582822086, "acc_norm_stderr": 0.035590395316173425 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.29464285714285715, "acc_stderr": 0.043270409325787296, "acc_norm": 0.29464285714285715, "acc_norm_stderr": 0.043270409325787296 }, "harness|hendrycksTest-management|5": { "acc": 0.1650485436893204, "acc_stderr": 0.036756688322331886, "acc_norm": 0.1650485436893204, "acc_norm_stderr": 0.036756688322331886 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2863247863247863, "acc_stderr": 0.029614323690456655, "acc_norm": 0.2863247863247863, "acc_norm_stderr": 0.029614323690456655 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.22, "acc_stderr": 
0.04163331998932269, "acc_norm": 0.22, "acc_norm_stderr": 0.04163331998932269 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.25287356321839083, "acc_stderr": 0.01554337731371968, "acc_norm": 0.25287356321839083, "acc_norm_stderr": 0.01554337731371968 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.27167630057803466, "acc_stderr": 0.023948512905468365, "acc_norm": 0.27167630057803466, "acc_norm_stderr": 0.023948512905468365 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24692737430167597, "acc_stderr": 0.014422292204808835, "acc_norm": 0.24692737430167597, "acc_norm_stderr": 0.014422292204808835 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.238562091503268, "acc_stderr": 0.024404394928087873, "acc_norm": 0.238562091503268, "acc_norm_stderr": 0.024404394928087873 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.3022508038585209, "acc_stderr": 0.02608270069539966, "acc_norm": 0.3022508038585209, "acc_norm_stderr": 0.02608270069539966 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.26851851851851855, "acc_stderr": 0.024659685185967277, "acc_norm": 0.26851851851851855, "acc_norm_stderr": 0.024659685185967277 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.2375886524822695, "acc_stderr": 0.025389512552729903, "acc_norm": 0.2375886524822695, "acc_norm_stderr": 0.025389512552729903 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.23859191655801826, "acc_stderr": 0.010885929742002212, "acc_norm": 0.23859191655801826, "acc_norm_stderr": 0.010885929742002212 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.19852941176470587, "acc_stderr": 0.024231013370541097, "acc_norm": 0.19852941176470587, "acc_norm_stderr": 0.024231013370541097 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.25, "acc_stderr": 0.01751781884501444, "acc_norm": 0.25, "acc_norm_stderr": 0.01751781884501444 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.2545454545454545, "acc_stderr": 0.041723430387053825, 
"acc_norm": 0.2545454545454545, "acc_norm_stderr": 0.041723430387053825 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.20816326530612245, "acc_stderr": 0.025991117672813296, "acc_norm": 0.20816326530612245, "acc_norm_stderr": 0.025991117672813296 }, "harness|hendrycksTest-sociology|5": { "acc": 0.24875621890547264, "acc_stderr": 0.030567675938916718, "acc_norm": 0.24875621890547264, "acc_norm_stderr": 0.030567675938916718 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-virology|5": { "acc": 0.26506024096385544, "acc_stderr": 0.03436024037944967, "acc_norm": 0.26506024096385544, "acc_norm_stderr": 0.03436024037944967 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.30994152046783624, "acc_stderr": 0.03546976959393163, "acc_norm": 0.30994152046783624, "acc_norm_stderr": 0.03546976959393163 }, "harness|truthfulqa:mc|0": { "mc1": 0.23011015911872704, "mc1_stderr": 0.014734557959807769, "mc2": 0.3955364206916061, "mc2_stderr": 0.014156601031172413 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
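As the configurations above show, each run's split name (e.g. `2023_10_04T06_32_33.540256`) is the run timestamp with underscores, while the corresponding parquet filenames use the hyphenated form (e.g. `results_2023-10-04T06-32-33.540256.parquet`). A minimal sketch of the conversion; the helper name is illustrative and not part of the `datasets` library or this repository:

```python
def split_to_filename_timestamp(split_name: str) -> str:
    """Convert a split name like '2023_10_04T06_32_33.540256' to the
    hyphenated timestamp used in parquet filenames,
    '2023-10-04T06-32-33.540256'."""
    # Underscores separate date and time components in split names;
    # filenames use hyphens in the same positions.
    date_part, time_part = split_name.split("T")
    return date_part.replace("_", "-") + "T" + time_part.replace("_", "-")

print(split_to_filename_timestamp("2023_10_04T06_32_33.540256"))
# 2023-10-04T06-32-33.540256
```

This makes it straightforward to map a given run's split back to its underlying `results_<timestamp>.parquet` file.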
atom-in-the-universe/bild-e2dfd001-4846-4fe5-8afb-b410e6c6c2ee
2023-10-04T06:50:23.000Z
[ "region:us" ]
atom-in-the-universe
null
null
null
0
0
Entry not found
open-llm-leaderboard/details_TurkuNLP__gpt3-finnish-large
2023-10-04T06:42:27.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of TurkuNLP/gpt3-finnish-large dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [TurkuNLP/gpt3-finnish-large](https://huggingface.co/TurkuNLP/gpt3-finnish-large)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TurkuNLP__gpt3-finnish-large\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T06:41:08.773247](https://huggingface.co/datasets/open-llm-leaderboard/details_TurkuNLP__gpt3-finnish-large/blob/main/results_2023-10-04T06-41-08.773247.json)\ \ (note that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24101706572706416,\n\ \ \"acc_stderr\": 0.030914682576886086,\n \"acc_norm\": 0.24220917519747875,\n\ \ \"acc_norm_stderr\": 0.030930132905642612,\n \"mc1\": 0.2594859241126071,\n\ \ \"mc1_stderr\": 0.015345409485557977,\n \"mc2\": 0.44349887144931116,\n\ \ \"mc2_stderr\": 0.015576744237840029\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.1825938566552901,\n \"acc_stderr\": 0.011289730684564993,\n\ \ \"acc_norm\": 0.2175767918088737,\n \"acc_norm_stderr\": 0.012057262020972504\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.29346743676558457,\n\ \ \"acc_stderr\": 0.004544201359074621,\n \"acc_norm\": 0.32881896036646086,\n\ \ \"acc_norm_stderr\": 0.004688239419302081\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \ \ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2740740740740741,\n\ \ \"acc_stderr\": 0.03853254836552004,\n \"acc_norm\": 0.2740740740740741,\n\ \ \"acc_norm_stderr\": 0.03853254836552004\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\ \ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.16,\n\ \ \"acc_stderr\": 0.036845294917747094,\n \"acc_norm\": 0.16,\n \ \ \"acc_norm_stderr\": 0.036845294917747094\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.22264150943396227,\n \"acc_stderr\": 0.0256042334708991,\n\ \ \"acc_norm\": 0.22264150943396227,\n \"acc_norm_stderr\": 0.0256042334708991\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\ \ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\ \ \"acc_norm_stderr\": 
0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \ \ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\ \ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.18,\n\ \ \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.18,\n \ \ \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720685,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720685\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n\ \ \"acc_stderr\": 0.03063114553919882,\n \"acc_norm\": 0.2023121387283237,\n\ \ \"acc_norm_stderr\": 0.03063114553919882\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.040233822736177455,\n\ \ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.040233822736177455\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n\ \ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\ \ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\ \ \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n\ \ \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\ \ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\ acc_norm\": 
0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.18253968253968253,\n\ \ \"acc_stderr\": 0.03455071019102149,\n \"acc_norm\": 0.18253968253968253,\n\ \ \"acc_norm_stderr\": 0.03455071019102149\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \ \ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.2709677419354839,\n \"acc_stderr\": 0.025284416114900156,\n \"\ acc_norm\": 0.2709677419354839,\n \"acc_norm_stderr\": 0.025284416114900156\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.28078817733990147,\n \"acc_stderr\": 0.031618563353586086,\n \"\ acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.031618563353586086\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\"\ : 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\ \ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.24242424242424243,\n \"acc_stderr\": 0.03053289223393202,\n \"\ acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03053289223393202\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.22797927461139897,\n \"acc_stderr\": 0.030276909945178253,\n\ \ \"acc_norm\": 0.22797927461139897,\n \"acc_norm_stderr\": 0.030276909945178253\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.020752423722128013,\n\ \ \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.020752423722128013\n\ \ 
},\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \ \ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.226890756302521,\n \"acc_stderr\": 0.02720537153827948,\n \ \ \"acc_norm\": 0.226890756302521,\n \"acc_norm_stderr\": 0.02720537153827948\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.2052980132450331,\n \"acc_stderr\": 0.03297986648473835,\n \"\ acc_norm\": 0.2052980132450331,\n \"acc_norm_stderr\": 0.03297986648473835\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.22752293577981653,\n \"acc_stderr\": 0.017974463578776502,\n \"\ acc_norm\": 0.22752293577981653,\n \"acc_norm_stderr\": 0.017974463578776502\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\ acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.2549019607843137,\n \"acc_stderr\": 0.03058759135160424,\n \"\ acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.03058759135160424\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.25316455696202533,\n \"acc_stderr\": 0.028304657943035293,\n \ \ \"acc_norm\": 0.25316455696202533,\n \"acc_norm_stderr\": 0.028304657943035293\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.33183856502242154,\n\ \ \"acc_stderr\": 0.03160295143776678,\n \"acc_norm\": 0.33183856502242154,\n\ \ \"acc_norm_stderr\": 0.03160295143776678\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\ \ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\ acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n\ \ \"acc_stderr\": 0.039578354719809805,\n \"acc_norm\": 0.21296296296296297,\n\ \ \"acc_norm_stderr\": 0.039578354719809805\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.26380368098159507,\n \"acc_stderr\": 0.03462419931615624,\n\ \ \"acc_norm\": 0.26380368098159507,\n \"acc_norm_stderr\": 0.03462419931615624\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\ \ \"acc_stderr\": 0.04464285714285712,\n \"acc_norm\": 0.33035714285714285,\n\ \ \"acc_norm_stderr\": 0.04464285714285712\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\ \ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2692307692307692,\n\ \ \"acc_stderr\": 0.029058588303748842,\n \"acc_norm\": 0.2692307692307692,\n\ \ \"acc_norm_stderr\": 0.029058588303748842\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.25798212005108556,\n\ \ \"acc_stderr\": 0.01564583018834895,\n \"acc_norm\": 0.25798212005108556,\n\ \ \"acc_norm_stderr\": 0.01564583018834895\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\ \ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\ \ \"acc_stderr\": 0.014333522059217889,\n 
\"acc_norm\": 0.2424581005586592,\n\ \ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.24183006535947713,\n \"acc_stderr\": 0.024518195641879334,\n\ \ \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.024518195641879334\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\ \ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\ \ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.23148148148148148,\n \"acc_stderr\": 0.023468429832451152,\n\ \ \"acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.023468429832451152\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.2553191489361702,\n \"acc_stderr\": 0.02601199293090201,\n \ \ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.02601199293090201\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2503259452411995,\n\ \ \"acc_stderr\": 0.011064151027165441,\n \"acc_norm\": 0.2503259452411995,\n\ \ \"acc_norm_stderr\": 0.011064151027165441\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.23897058823529413,\n \"acc_stderr\": 0.025905280644893006,\n\ \ \"acc_norm\": 0.23897058823529413,\n \"acc_norm_stderr\": 0.025905280644893006\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\ : 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\ : {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\ \ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\ \ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.23265306122448978,\n\ \ \"acc_stderr\": 0.02704925791589618,\n \"acc_norm\": 0.23265306122448978,\n\ \ \"acc_norm_stderr\": 0.02704925791589618\n },\n 
\"harness|hendrycksTest-sociology|5\"\ : {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\ \ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\ \ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\ \ 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\ \ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\"\ : {\n \"acc\": 0.2469879518072289,\n \"acc_stderr\": 0.03357351982064537,\n\ \ \"acc_norm\": 0.2469879518072289,\n \"acc_norm_stderr\": 0.03357351982064537\n\ \ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.21052631578947367,\n\ \ \"acc_stderr\": 0.0312678171466318,\n \"acc_norm\": 0.21052631578947367,\n\ \ \"acc_norm_stderr\": 0.0312678171466318\n },\n \"harness|truthfulqa:mc|0\"\ : {\n \"mc1\": 0.2594859241126071,\n \"mc1_stderr\": 0.015345409485557977,\n\ \ \"mc2\": 0.44349887144931116,\n \"mc2_stderr\": 0.015576744237840029\n\ \ }\n}\n```" repo_url: https://huggingface.co/TurkuNLP/gpt3-finnish-large leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|arc:challenge|25_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hellaswag|10_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-41-08.773247.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-41-08.773247.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-41-08.773247.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-41-08.773247.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-41-08.773247.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-41-08.773247.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-41-08.773247.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-41-08.773247.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-41-08.773247.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T06_41_08.773247 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T06-41-08.773247.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T06-41-08.773247.parquet' - config_name: results data_files: - split: 2023_10_04T06_41_08.773247 path: - results_2023-10-04T06-41-08.773247.parquet - split: latest path: - results_2023-10-04T06-41-08.773247.parquet --- # Dataset Card for Evaluation run of TurkuNLP/gpt3-finnish-large ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/TurkuNLP/gpt3-finnish-large - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[TurkuNLP/gpt3-finnish-large](https://huggingface.co/TurkuNLP/gpt3-finnish-large)
on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the
evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each
configuration, the split being named using the timestamp of the run. The "train" split is always
pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used
to compute and display the aggregated metrics on the
[Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TurkuNLP__gpt3-finnish-large",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-04T06:41:08.773247](https://huggingface.co/datasets/open-llm-leaderboard/details_TurkuNLP__gpt3-finnish-large/blob/main/results_2023-10-04T06-41-08.773247.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.24101706572706416, "acc_stderr": 0.030914682576886086, "acc_norm": 0.24220917519747875, "acc_norm_stderr": 0.030930132905642612, "mc1": 0.2594859241126071, "mc1_stderr": 0.015345409485557977, "mc2": 0.44349887144931116, "mc2_stderr": 0.015576744237840029 }, "harness|arc:challenge|25": { "acc": 0.1825938566552901, "acc_stderr": 0.011289730684564993, "acc_norm": 0.2175767918088737, "acc_norm_stderr": 0.012057262020972504 }, "harness|hellaswag|10": { "acc": 0.29346743676558457, "acc_stderr": 0.004544201359074621, "acc_norm": 0.32881896036646086, "acc_norm_stderr": 0.004688239419302081 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.24, "acc_stderr": 0.042923469599092816, "acc_norm": 0.24, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.2740740740740741, "acc_stderr": 0.03853254836552004, "acc_norm": 0.2740740740740741, "acc_norm_stderr": 0.03853254836552004 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.17763157894736842, "acc_stderr": 0.031103182383123398, "acc_norm": 0.17763157894736842, "acc_norm_stderr": 0.031103182383123398 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.16, "acc_stderr": 0.036845294917747094, "acc_norm": 0.16, "acc_norm_stderr": 0.036845294917747094 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.22264150943396227, "acc_stderr": 0.0256042334708991, "acc_norm": 0.22264150943396227, "acc_norm_stderr": 0.0256042334708991 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2222222222222222, "acc_stderr": 0.03476590104304134, "acc_norm": 0.2222222222222222, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.2, "acc_stderr": 0.04020151261036845, "acc_norm": 0.2, "acc_norm_stderr": 0.04020151261036845 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.18, "acc_stderr": 0.038612291966536955, "acc_norm": 0.18, 
"acc_norm_stderr": 0.038612291966536955 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.29, "acc_stderr": 0.04560480215720685, "acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720685 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.2023121387283237, "acc_stderr": 0.03063114553919882, "acc_norm": 0.2023121387283237, "acc_norm_stderr": 0.03063114553919882 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.20588235294117646, "acc_stderr": 0.040233822736177455, "acc_norm": 0.20588235294117646, "acc_norm_stderr": 0.040233822736177455 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.27, "acc_stderr": 0.0446196043338474, "acc_norm": 0.27, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.26382978723404255, "acc_stderr": 0.028809989854102973, "acc_norm": 0.26382978723404255, "acc_norm_stderr": 0.028809989854102973 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.24561403508771928, "acc_stderr": 0.04049339297748141, "acc_norm": 0.24561403508771928, "acc_norm_stderr": 0.04049339297748141 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2413793103448276, "acc_stderr": 0.03565998174135302, "acc_norm": 0.2413793103448276, "acc_norm_stderr": 0.03565998174135302 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2566137566137566, "acc_stderr": 0.022494510767503154, "acc_norm": 0.2566137566137566, "acc_norm_stderr": 0.022494510767503154 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.18253968253968253, "acc_stderr": 0.03455071019102149, "acc_norm": 0.18253968253968253, "acc_norm_stderr": 0.03455071019102149 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.18, "acc_stderr": 0.038612291966536934, "acc_norm": 0.18, "acc_norm_stderr": 0.038612291966536934 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.2709677419354839, "acc_stderr": 0.025284416114900156, "acc_norm": 0.2709677419354839, "acc_norm_stderr": 0.025284416114900156 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.28078817733990147, "acc_stderr": 0.031618563353586086, "acc_norm": 0.28078817733990147, "acc_norm_stderr": 0.031618563353586086 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.19, "acc_stderr": 0.039427724440366234, "acc_norm": 0.19, "acc_norm_stderr": 0.039427724440366234 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03225078108306289, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.24242424242424243, "acc_stderr": 0.03053289223393202, "acc_norm": 0.24242424242424243, "acc_norm_stderr": 0.03053289223393202 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.22797927461139897, "acc_stderr": 0.030276909945178253, "acc_norm": 0.22797927461139897, "acc_norm_stderr": 0.030276909945178253 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.2128205128205128, "acc_stderr": 0.020752423722128013, "acc_norm": 0.2128205128205128, "acc_norm_stderr": 0.020752423722128013 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.26296296296296295, "acc_stderr": 0.026842057873833706, "acc_norm": 0.26296296296296295, "acc_norm_stderr": 0.026842057873833706 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.226890756302521, "acc_stderr": 0.02720537153827948, "acc_norm": 0.226890756302521, "acc_norm_stderr": 0.02720537153827948 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2052980132450331, "acc_stderr": 0.03297986648473835, "acc_norm": 0.2052980132450331, "acc_norm_stderr": 0.03297986648473835 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.22752293577981653, "acc_stderr": 0.017974463578776502, "acc_norm": 0.22752293577981653, "acc_norm_stderr": 0.017974463578776502 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4675925925925926, "acc_stderr": 
0.03402801581358966, "acc_norm": 0.4675925925925926, "acc_norm_stderr": 0.03402801581358966 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.2549019607843137, "acc_stderr": 0.03058759135160424, "acc_norm": 0.2549019607843137, "acc_norm_stderr": 0.03058759135160424 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.25316455696202533, "acc_stderr": 0.028304657943035293, "acc_norm": 0.25316455696202533, "acc_norm_stderr": 0.028304657943035293 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.33183856502242154, "acc_stderr": 0.03160295143776678, "acc_norm": 0.33183856502242154, "acc_norm_stderr": 0.03160295143776678 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.2595419847328244, "acc_stderr": 0.03844876139785271, "acc_norm": 0.2595419847328244, "acc_norm_stderr": 0.03844876139785271 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2396694214876033, "acc_stderr": 0.03896878985070417, "acc_norm": 0.2396694214876033, "acc_norm_stderr": 0.03896878985070417 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.21296296296296297, "acc_stderr": 0.039578354719809805, "acc_norm": 0.21296296296296297, "acc_norm_stderr": 0.039578354719809805 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.26380368098159507, "acc_stderr": 0.03462419931615624, "acc_norm": 0.26380368098159507, "acc_norm_stderr": 0.03462419931615624 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.33035714285714285, "acc_stderr": 0.04464285714285712, "acc_norm": 0.33035714285714285, "acc_norm_stderr": 0.04464285714285712 }, "harness|hendrycksTest-management|5": { "acc": 0.17475728155339806, "acc_stderr": 0.037601780060266224, "acc_norm": 0.17475728155339806, "acc_norm_stderr": 0.037601780060266224 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2692307692307692, "acc_stderr": 0.029058588303748842, "acc_norm": 0.2692307692307692, "acc_norm_stderr": 0.029058588303748842 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.3, 
"acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.25798212005108556, "acc_stderr": 0.01564583018834895, "acc_norm": 0.25798212005108556, "acc_norm_stderr": 0.01564583018834895 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.24855491329479767, "acc_stderr": 0.023267528432100174, "acc_norm": 0.24855491329479767, "acc_norm_stderr": 0.023267528432100174 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2424581005586592, "acc_stderr": 0.014333522059217889, "acc_norm": 0.2424581005586592, "acc_norm_stderr": 0.014333522059217889 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.24183006535947713, "acc_stderr": 0.024518195641879334, "acc_norm": 0.24183006535947713, "acc_norm_stderr": 0.024518195641879334 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.1864951768488746, "acc_stderr": 0.02212243977248077, "acc_norm": 0.1864951768488746, "acc_norm_stderr": 0.02212243977248077 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.23148148148148148, "acc_stderr": 0.023468429832451152, "acc_norm": 0.23148148148148148, "acc_norm_stderr": 0.023468429832451152 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.2553191489361702, "acc_stderr": 0.02601199293090201, "acc_norm": 0.2553191489361702, "acc_norm_stderr": 0.02601199293090201 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2503259452411995, "acc_stderr": 0.011064151027165441, "acc_norm": 0.2503259452411995, "acc_norm_stderr": 0.011064151027165441 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.23897058823529413, "acc_stderr": 0.025905280644893006, "acc_norm": 0.23897058823529413, "acc_norm_stderr": 0.025905280644893006 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.25, "acc_stderr": 0.01751781884501444, "acc_norm": 0.25, "acc_norm_stderr": 0.01751781884501444 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.21818181818181817, "acc_stderr": 
0.03955932861795833, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03955932861795833 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.23265306122448978, "acc_stderr": 0.02704925791589618, "acc_norm": 0.23265306122448978, "acc_norm_stderr": 0.02704925791589618 }, "harness|hendrycksTest-sociology|5": { "acc": 0.24378109452736318, "acc_stderr": 0.03036049015401465, "acc_norm": 0.24378109452736318, "acc_norm_stderr": 0.03036049015401465 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-virology|5": { "acc": 0.2469879518072289, "acc_stderr": 0.03357351982064537, "acc_norm": 0.2469879518072289, "acc_norm_stderr": 0.03357351982064537 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.21052631578947367, "acc_stderr": 0.0312678171466318, "acc_norm": 0.21052631578947367, "acc_norm_stderr": 0.0312678171466318 }, "harness|truthfulqa:mc|0": { "mc1": 0.2594859241126071, "mc1_stderr": 0.015345409485557977, "mc2": 0.44349887144931116, "mc2_stderr": 0.015576744237840029 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
Sai0720/java_to_go_finetune
2023-10-04T06:50:48.000Z
[ "license:unknown", "region:us" ]
Sai0720
null
null
null
0
0
--- license: unknown ---
open-llm-leaderboard/details_yeen214__llama2_7b_small_tuning_v1
2023-10-04T06:49:21.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of yeen214/llama2_7b_small_tuning_v1 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [yeen214/llama2_7b_small_tuning_v1](https://huggingface.co/yeen214/llama2_7b_small_tuning_v1)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of\ \ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeen214__llama2_7b_small_tuning_v1\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T06:48:03.956083](https://huggingface.co/datasets/open-llm-leaderboard/details_yeen214__llama2_7b_small_tuning_v1/blob/main/results_2023-10-04T06-48-03.956083.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2544228232681838,\n\ \ \"acc_stderr\": 0.031585630644440664,\n \"acc_norm\": 0.25448067020518844,\n\ \ \"acc_norm_stderr\": 0.03158675735175906,\n \"mc1\": 0.23990208078335373,\n\ \ \"mc1_stderr\": 0.014948812679062135,\n \"mc2\": 0.48698906066945347,\n\ \ \"mc2_stderr\": 0.016923576514444937\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.22098976109215018,\n \"acc_stderr\": 0.012124929206818258,\n\ \ \"acc_norm\": 0.22440273037542663,\n \"acc_norm_stderr\": 0.012191404938603843\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.24995020912168892,\n\ \ \"acc_stderr\": 0.004320990543283153,\n \"acc_norm\": 0.24995020912168892,\n\ \ \"acc_norm_stderr\": 0.004320990543283153\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \ \ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.24444444444444444,\n\ \ \"acc_stderr\": 0.03712537833614867,\n \"acc_norm\": 0.24444444444444444,\n\ \ \"acc_norm_stderr\": 0.03712537833614867\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.3881578947368421,\n \"acc_stderr\": 0.03965842097512744,\n\ \ \"acc_norm\": 0.3881578947368421,\n \"acc_norm_stderr\": 0.03965842097512744\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.15,\n\ \ \"acc_stderr\": 0.03588702812826372,\n \"acc_norm\": 0.15,\n \ \ \"acc_norm_stderr\": 0.03588702812826372\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.2490566037735849,\n \"acc_stderr\": 0.026616482980501715,\n\ \ \"acc_norm\": 0.2490566037735849,\n \"acc_norm_stderr\": 0.026616482980501715\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2152777777777778,\n\ \ \"acc_stderr\": 0.03437079344106133,\n \"acc_norm\": 0.2152777777777778,\n\ \ \"acc_norm_stderr\": 
0.03437079344106133\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \ \ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n\ \ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.28901734104046245,\n\ \ \"acc_stderr\": 0.034564257450869995,\n \"acc_norm\": 0.28901734104046245,\n\ \ \"acc_norm_stderr\": 0.034564257450869995\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n\ \ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.2,\n \"acc_stderr\": 0.04020151261036844,\n \"acc_norm\": 0.2,\n\ \ \"acc_norm_stderr\": 0.04020151261036844\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.2680851063829787,\n \"acc_stderr\": 0.028957342788342347,\n\ \ \"acc_norm\": 0.2680851063829787,\n \"acc_norm_stderr\": 0.028957342788342347\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.19298245614035087,\n\ \ \"acc_stderr\": 0.037124548537213684,\n \"acc_norm\": 0.19298245614035087,\n\ \ \"acc_norm_stderr\": 0.037124548537213684\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.03724563619774632,\n\ \ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.03724563619774632\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708624,\n \"\ acc_norm\": 
0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708624\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n\ \ \"acc_stderr\": 0.03567016675276865,\n \"acc_norm\": 0.1984126984126984,\n\ \ \"acc_norm_stderr\": 0.03567016675276865\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2838709677419355,\n\ \ \"acc_stderr\": 0.025649381063029265,\n \"acc_norm\": 0.2838709677419355,\n\ \ \"acc_norm_stderr\": 0.025649381063029265\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.03144712581678241,\n\ \ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.03144712581678241\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\"\ : 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.20606060606060606,\n \"acc_stderr\": 0.031584153240477086,\n\ \ \"acc_norm\": 0.20606060606060606,\n \"acc_norm_stderr\": 0.031584153240477086\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.30808080808080807,\n \"acc_stderr\": 0.03289477330098616,\n \"\ acc_norm\": 0.30808080808080807,\n \"acc_norm_stderr\": 0.03289477330098616\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.32124352331606215,\n \"acc_stderr\": 0.033699508685490674,\n\ \ \"acc_norm\": 0.32124352331606215,\n \"acc_norm_stderr\": 0.033699508685490674\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.2717948717948718,\n \"acc_stderr\": 0.022556551010132354,\n\ \ \"acc_norm\": 0.2717948717948718,\n \"acc_norm_stderr\": 0.022556551010132354\n\ \ 
},\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712166,\n \ \ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712166\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.2815126050420168,\n \"acc_stderr\": 0.02921354941437217,\n \ \ \"acc_norm\": 0.2815126050420168,\n \"acc_norm_stderr\": 0.02921354941437217\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\ acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.26788990825688075,\n \"acc_stderr\": 0.018987462257978652,\n \"\ acc_norm\": 0.26788990825688075,\n \"acc_norm_stderr\": 0.018987462257978652\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.28703703703703703,\n \"acc_stderr\": 0.030851992993257017,\n \"\ acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.030851992993257017\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\ \ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\ : {\n \"acc\": 0.26582278481012656,\n \"acc_stderr\": 0.02875679962965834,\n\ \ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.02875679962965834\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.15695067264573992,\n\ \ \"acc_stderr\": 0.02441358717490739,\n \"acc_norm\": 0.15695067264573992,\n\ \ \"acc_norm_stderr\": 0.02441358717490739\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n\ \ \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n 
\"acc\":\ \ 0.19008264462809918,\n \"acc_stderr\": 0.035817969517092825,\n \"\ acc_norm\": 0.19008264462809918,\n \"acc_norm_stderr\": 0.035817969517092825\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n\ \ \"acc_stderr\": 0.03957835471980979,\n \"acc_norm\": 0.21296296296296297,\n\ \ \"acc_norm_stderr\": 0.03957835471980979\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.27607361963190186,\n \"acc_stderr\": 0.03512385283705051,\n\ \ \"acc_norm\": 0.27607361963190186,\n \"acc_norm_stderr\": 0.03512385283705051\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.16964285714285715,\n\ \ \"acc_stderr\": 0.0356236785009539,\n \"acc_norm\": 0.16964285714285715,\n\ \ \"acc_norm_stderr\": 0.0356236785009539\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.2912621359223301,\n \"acc_stderr\": 0.044986763205729245,\n\ \ \"acc_norm\": 0.2912621359223301,\n \"acc_norm_stderr\": 0.044986763205729245\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.25213675213675213,\n\ \ \"acc_stderr\": 0.02844796547623102,\n \"acc_norm\": 0.25213675213675213,\n\ \ \"acc_norm_stderr\": 0.02844796547623102\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \ \ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24521072796934865,\n\ \ \"acc_stderr\": 0.015384352284543944,\n \"acc_norm\": 0.24521072796934865,\n\ \ \"acc_norm_stderr\": 0.015384352284543944\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.22254335260115607,\n \"acc_stderr\": 0.02239421566194282,\n\ \ \"acc_norm\": 0.22254335260115607,\n \"acc_norm_stderr\": 0.02239421566194282\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n\ \ \"acc_stderr\": 0.014310999547961476,\n \"acc_norm\": 0.24134078212290502,\n\ \ 
\"acc_norm_stderr\": 0.014310999547961476\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.025261691219729474,\n\ \ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.025261691219729474\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.26688102893890675,\n\ \ \"acc_stderr\": 0.02512263760881665,\n \"acc_norm\": 0.26688102893890675,\n\ \ \"acc_norm_stderr\": 0.02512263760881665\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.024659685185967284,\n\ \ \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.024659685185967284\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.26595744680851063,\n \"acc_stderr\": 0.02635806569888059,\n \ \ \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.02635806569888059\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24511082138200782,\n\ \ \"acc_stderr\": 0.010986307870045509,\n \"acc_norm\": 0.24511082138200782,\n\ \ \"acc_norm_stderr\": 0.010986307870045509\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\ \ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.2238562091503268,\n \"acc_stderr\": 0.016863008585416613,\n \ \ \"acc_norm\": 0.2238562091503268,\n \"acc_norm_stderr\": 0.016863008585416613\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.23636363636363636,\n\ \ \"acc_stderr\": 0.040693063197213754,\n \"acc_norm\": 0.23636363636363636,\n\ \ \"acc_norm_stderr\": 0.040693063197213754\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.33877551020408164,\n \"acc_stderr\": 0.030299506562154185,\n\ \ \"acc_norm\": 0.33877551020408164,\n \"acc_norm_stderr\": 0.030299506562154185\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2537313432835821,\n\ \ \"acc_stderr\": 0.03076944496729602,\n \"acc_norm\": 0.2537313432835821,\n\ \ \"acc_norm_stderr\": 0.03076944496729602\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \ \ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21686746987951808,\n\ \ \"acc_stderr\": 0.03208284450356365,\n \"acc_norm\": 0.21686746987951808,\n\ \ \"acc_norm_stderr\": 0.03208284450356365\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.2573099415204678,\n \"acc_stderr\": 0.03352799844161865,\n\ \ \"acc_norm\": 0.2573099415204678,\n \"acc_norm_stderr\": 0.03352799844161865\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23990208078335373,\n\ \ \"mc1_stderr\": 0.014948812679062135,\n \"mc2\": 0.48698906066945347,\n\ \ \"mc2_stderr\": 0.016923576514444937\n }\n}\n```" repo_url: https://huggingface.co/yeen214/llama2_7b_small_tuning_v1 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|arc:challenge|25_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hellaswag|10_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-48-03.956083.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-48-03.956083.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-48-03.956083.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-48-03.956083.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-48-03.956083.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-48-03.956083.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-48-03.956083.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-48-03.956083.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-48-03.956083.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T06_48_03.956083 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T06-48-03.956083.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T06-48-03.956083.parquet' - config_name: results data_files: - split: 2023_10_04T06_48_03.956083 path: - results_2023-10-04T06-48-03.956083.parquet - split: latest path: - results_2023-10-04T06-48-03.956083.parquet --- # Dataset Card for Evaluation run of yeen214/llama2_7b_small_tuning_v1 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/yeen214/llama2_7b_small_tuning_v1 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[yeen214/llama2_7b_small_tuning_v1](https://huggingface.co/yeen214/llama2_7b_small_tuning_v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_yeen214__llama2_7b_small_tuning_v1", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-10-04T06:48:03.956083](https://huggingface.co/datasets/open-llm-leaderboard/details_yeen214__llama2_7b_small_tuning_v1/blob/main/results_2023-10-04T06-48-03.956083.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.2544228232681838, "acc_stderr": 0.031585630644440664, "acc_norm": 0.25448067020518844, "acc_norm_stderr": 0.03158675735175906, "mc1": 0.23990208078335373, "mc1_stderr": 0.014948812679062135, "mc2": 0.48698906066945347, "mc2_stderr": 0.016923576514444937 }, "harness|arc:challenge|25": { "acc": 0.22098976109215018, "acc_stderr": 0.012124929206818258, "acc_norm": 0.22440273037542663, "acc_norm_stderr": 0.012191404938603843 }, "harness|hellaswag|10": { "acc": 0.24995020912168892, "acc_stderr": 0.004320990543283153, "acc_norm": 0.24995020912168892, "acc_norm_stderr": 0.004320990543283153 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.24444444444444444, "acc_stderr": 0.03712537833614867, "acc_norm": 0.24444444444444444, "acc_norm_stderr": 0.03712537833614867 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.3881578947368421, "acc_stderr": 0.03965842097512744, "acc_norm": 0.3881578947368421, "acc_norm_stderr": 0.03965842097512744 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.15, "acc_stderr": 0.03588702812826372, "acc_norm": 0.15, "acc_norm_stderr": 0.03588702812826372 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.2490566037735849, "acc_stderr": 0.026616482980501715, "acc_norm": 0.2490566037735849, "acc_norm_stderr": 0.026616482980501715 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2152777777777778, "acc_stderr": 0.03437079344106133, "acc_norm": 0.2152777777777778, "acc_norm_stderr": 0.03437079344106133 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.33, "acc_stderr": 0.04725815626252606, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252606 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.27, "acc_stderr": 0.0446196043338474, "acc_norm": 0.27, 
"acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.28901734104046245, "acc_stderr": 0.034564257450869995, "acc_norm": 0.28901734104046245, "acc_norm_stderr": 0.034564257450869995 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.23529411764705882, "acc_stderr": 0.04220773659171453, "acc_norm": 0.23529411764705882, "acc_norm_stderr": 0.04220773659171453 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.2, "acc_stderr": 0.04020151261036844, "acc_norm": 0.2, "acc_norm_stderr": 0.04020151261036844 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.2680851063829787, "acc_stderr": 0.028957342788342347, "acc_norm": 0.2680851063829787, "acc_norm_stderr": 0.028957342788342347 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.19298245614035087, "acc_stderr": 0.037124548537213684, "acc_norm": 0.19298245614035087, "acc_norm_stderr": 0.037124548537213684 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.27586206896551724, "acc_stderr": 0.03724563619774632, "acc_norm": 0.27586206896551724, "acc_norm_stderr": 0.03724563619774632 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.26455026455026454, "acc_stderr": 0.022717467897708624, "acc_norm": 0.26455026455026454, "acc_norm_stderr": 0.022717467897708624 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.1984126984126984, "acc_stderr": 0.03567016675276865, "acc_norm": 0.1984126984126984, "acc_norm_stderr": 0.03567016675276865 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.2838709677419355, "acc_stderr": 0.025649381063029265, "acc_norm": 0.2838709677419355, "acc_norm_stderr": 0.025649381063029265 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.27586206896551724, "acc_stderr": 0.03144712581678241, "acc_norm": 0.27586206896551724, "acc_norm_stderr": 0.03144712581678241 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.24, "acc_stderr": 0.042923469599092816, "acc_norm": 0.24, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.20606060606060606, "acc_stderr": 0.031584153240477086, "acc_norm": 0.20606060606060606, "acc_norm_stderr": 0.031584153240477086 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.30808080808080807, "acc_stderr": 0.03289477330098616, "acc_norm": 0.30808080808080807, "acc_norm_stderr": 0.03289477330098616 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.32124352331606215, "acc_stderr": 0.033699508685490674, "acc_norm": 0.32124352331606215, "acc_norm_stderr": 0.033699508685490674 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.2717948717948718, "acc_stderr": 0.022556551010132354, "acc_norm": 0.2717948717948718, "acc_norm_stderr": 0.022556551010132354 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.25925925925925924, "acc_stderr": 0.026719240783712166, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.026719240783712166 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.2815126050420168, "acc_stderr": 0.02921354941437217, "acc_norm": 0.2815126050420168, "acc_norm_stderr": 0.02921354941437217 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.31788079470198677, "acc_stderr": 0.038020397601079024, "acc_norm": 0.31788079470198677, "acc_norm_stderr": 0.038020397601079024 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.26788990825688075, "acc_stderr": 0.018987462257978652, "acc_norm": 0.26788990825688075, "acc_norm_stderr": 0.018987462257978652 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.28703703703703703, 
"acc_stderr": 0.030851992993257017, "acc_norm": 0.28703703703703703, "acc_norm_stderr": 0.030851992993257017 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.25, "acc_stderr": 0.03039153369274154, "acc_norm": 0.25, "acc_norm_stderr": 0.03039153369274154 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.26582278481012656, "acc_stderr": 0.02875679962965834, "acc_norm": 0.26582278481012656, "acc_norm_stderr": 0.02875679962965834 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.15695067264573992, "acc_stderr": 0.02441358717490739, "acc_norm": 0.15695067264573992, "acc_norm_stderr": 0.02441358717490739 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.2824427480916031, "acc_stderr": 0.03948406125768361, "acc_norm": 0.2824427480916031, "acc_norm_stderr": 0.03948406125768361 }, "harness|hendrycksTest-international_law|5": { "acc": 0.19008264462809918, "acc_stderr": 0.035817969517092825, "acc_norm": 0.19008264462809918, "acc_norm_stderr": 0.035817969517092825 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.21296296296296297, "acc_stderr": 0.03957835471980979, "acc_norm": 0.21296296296296297, "acc_norm_stderr": 0.03957835471980979 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.27607361963190186, "acc_stderr": 0.03512385283705051, "acc_norm": 0.27607361963190186, "acc_norm_stderr": 0.03512385283705051 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.16964285714285715, "acc_stderr": 0.0356236785009539, "acc_norm": 0.16964285714285715, "acc_norm_stderr": 0.0356236785009539 }, "harness|hendrycksTest-management|5": { "acc": 0.2912621359223301, "acc_stderr": 0.044986763205729245, "acc_norm": 0.2912621359223301, "acc_norm_stderr": 0.044986763205729245 }, "harness|hendrycksTest-marketing|5": { "acc": 0.25213675213675213, "acc_stderr": 0.02844796547623102, "acc_norm": 0.25213675213675213, "acc_norm_stderr": 0.02844796547623102 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.26, "acc_stderr": 
0.04408440022768078, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.24521072796934865, "acc_stderr": 0.015384352284543944, "acc_norm": 0.24521072796934865, "acc_norm_stderr": 0.015384352284543944 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.22254335260115607, "acc_stderr": 0.02239421566194282, "acc_norm": 0.22254335260115607, "acc_norm_stderr": 0.02239421566194282 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24134078212290502, "acc_stderr": 0.014310999547961476, "acc_norm": 0.24134078212290502, "acc_norm_stderr": 0.014310999547961476 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.2647058823529412, "acc_stderr": 0.025261691219729474, "acc_norm": 0.2647058823529412, "acc_norm_stderr": 0.025261691219729474 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.26688102893890675, "acc_stderr": 0.02512263760881665, "acc_norm": 0.26688102893890675, "acc_norm_stderr": 0.02512263760881665 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.26851851851851855, "acc_stderr": 0.024659685185967284, "acc_norm": 0.26851851851851855, "acc_norm_stderr": 0.024659685185967284 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.26595744680851063, "acc_stderr": 0.02635806569888059, "acc_norm": 0.26595744680851063, "acc_norm_stderr": 0.02635806569888059 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.24511082138200782, "acc_stderr": 0.010986307870045509, "acc_norm": 0.24511082138200782, "acc_norm_stderr": 0.010986307870045509 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.18382352941176472, "acc_stderr": 0.023529242185193106, "acc_norm": 0.18382352941176472, "acc_norm_stderr": 0.023529242185193106 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.2238562091503268, "acc_stderr": 0.016863008585416613, "acc_norm": 0.2238562091503268, "acc_norm_stderr": 0.016863008585416613 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.23636363636363636, 
"acc_stderr": 0.040693063197213754, "acc_norm": 0.23636363636363636, "acc_norm_stderr": 0.040693063197213754 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.33877551020408164, "acc_stderr": 0.030299506562154185, "acc_norm": 0.33877551020408164, "acc_norm_stderr": 0.030299506562154185 }, "harness|hendrycksTest-sociology|5": { "acc": 0.2537313432835821, "acc_stderr": 0.03076944496729602, "acc_norm": 0.2537313432835821, "acc_norm_stderr": 0.03076944496729602 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.24, "acc_stderr": 0.042923469599092816, "acc_norm": 0.24, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-virology|5": { "acc": 0.21686746987951808, "acc_stderr": 0.03208284450356365, "acc_norm": 0.21686746987951808, "acc_norm_stderr": 0.03208284450356365 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.2573099415204678, "acc_stderr": 0.03352799844161865, "acc_norm": 0.2573099415204678, "acc_norm_stderr": 0.03352799844161865 }, "harness|truthfulqa:mc|0": { "mc1": 0.23990208078335373, "mc1_stderr": 0.014948812679062135, "mc2": 0.48698906066945347, "mc2_stderr": 0.016923576514444937 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_yeen214__test_llama2_ko_7b
2023-10-04T06:49:35.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of yeen214/test_llama2_ko_7b dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [yeen214/test_llama2_ko_7b](https://huggingface.co/yeen214/test_llama2_ko_7b)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of\ \ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeen214__test_llama2_ko_7b\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T06:48:16.505628](https://huggingface.co/datasets/open-llm-leaderboard/details_yeen214__test_llama2_ko_7b/blob/main/results_2023-10-04T06-48-16.505628.json)\ \ (note that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You can find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2556908024660644,\n\ \ \"acc_stderr\": 0.031710434396029,\n \"acc_norm\": 0.25717007860214525,\n\ \ \"acc_norm_stderr\": 0.03173157560207755,\n \"mc1\": 0.2484700122399021,\n\ \ \"mc1_stderr\": 0.015127427096520691,\n \"mc2\": 0.49025732091616386,\n\ \ \"mc2_stderr\": 0.01699410311382743\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.22525597269624573,\n \"acc_stderr\": 0.01220783999540732,\n\ \ \"acc_norm\": 0.29948805460750855,\n \"acc_norm_stderr\": 0.013385021637313558\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25632344154550885,\n\ \ \"acc_stderr\": 0.004357101984278613,\n \"acc_norm\": 0.2693686516630153,\n\ \ \"acc_norm_stderr\": 0.004427251499236945\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \ \ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403326\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n\ \ \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n\ \ \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.3355263157894737,\n \"acc_stderr\": 0.038424985593952674,\n\ \ \"acc_norm\": 0.3355263157894737,\n \"acc_norm_stderr\": 0.038424985593952674\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\ \ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \ \ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.2188679245283019,\n \"acc_stderr\": 0.025447863825108614,\n\ \ \"acc_norm\": 0.2188679245283019,\n \"acc_norm_stderr\": 0.025447863825108614\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.20833333333333334,\n\ \ \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.20833333333333334,\n\ \ \"acc_norm_stderr\": 
0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \ \ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\ \ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.28,\n\ \ \"acc_stderr\": 0.04512608598542126,\n \"acc_norm\": 0.28,\n \ \ \"acc_norm_stderr\": 0.04512608598542126\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909281,\n \ \ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909281\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2774566473988439,\n\ \ \"acc_stderr\": 0.03414014007044036,\n \"acc_norm\": 0.2774566473988439,\n\ \ \"acc_norm_stderr\": 0.03414014007044036\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\ \ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n\ \ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.2936170212765957,\n \"acc_stderr\": 0.02977164271249123,\n\ \ \"acc_norm\": 0.2936170212765957,\n \"acc_norm_stderr\": 0.02977164271249123\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\ \ \"acc_stderr\": 0.041857744240220575,\n \"acc_norm\": 0.2719298245614035,\n\ \ \"acc_norm_stderr\": 0.041857744240220575\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.30344827586206896,\n \"acc_stderr\": 0.038312260488503336,\n\ \ \"acc_norm\": 0.30344827586206896,\n \"acc_norm_stderr\": 0.038312260488503336\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.2830687830687831,\n \"acc_stderr\": 0.023201392938194978,\n \"\ acc_norm\": 
0.2830687830687831,\n \"acc_norm_stderr\": 0.023201392938194978\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\ \ \"acc_stderr\": 0.04073524322147127,\n \"acc_norm\": 0.29365079365079366,\n\ \ \"acc_norm_stderr\": 0.04073524322147127\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \ \ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.2806451612903226,\n \"acc_stderr\": 0.0255606047210229,\n \"acc_norm\"\ : 0.2806451612903226,\n \"acc_norm_stderr\": 0.0255606047210229\n },\n\ \ \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3054187192118227,\n\ \ \"acc_stderr\": 0.03240661565868408,\n \"acc_norm\": 0.3054187192118227,\n\ \ \"acc_norm_stderr\": 0.03240661565868408\n },\n \"harness|hendrycksTest-high_school_computer_science|5\"\ : {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036622,\n \ \ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036622\n \ \ },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"\ acc\": 0.2545454545454545,\n \"acc_stderr\": 0.0340150671524904,\n \ \ \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.0340150671524904\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.2777777777777778,\n \"acc_stderr\": 0.03191178226713547,\n \"\ acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.03191178226713547\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.37823834196891193,\n \"acc_stderr\": 0.03499807276193338,\n\ \ \"acc_norm\": 0.37823834196891193,\n \"acc_norm_stderr\": 0.03499807276193338\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.2794871794871795,\n \"acc_stderr\": 0.022752388839776823,\n\ \ \"acc_norm\": 0.2794871794871795,\n \"acc_norm_stderr\": 0.022752388839776823\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.23703703703703705,\n \"acc_stderr\": 0.02592887613276613,\n \ \ \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.02592887613276613\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.2605042016806723,\n \"acc_stderr\": 0.028510251512341933,\n\ \ \"acc_norm\": 0.2605042016806723,\n \"acc_norm_stderr\": 0.028510251512341933\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.23841059602649006,\n \"acc_stderr\": 0.03479185572599659,\n \"\ acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.03479185572599659\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.21284403669724772,\n \"acc_stderr\": 0.017549376389313694,\n \"\ acc_norm\": 0.21284403669724772,\n \"acc_norm_stderr\": 0.017549376389313694\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.28703703703703703,\n \"acc_stderr\": 0.030851992993257017,\n \"\ acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.030851992993257017\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604243,\n \"\ acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604243\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.21940928270042195,\n \"acc_stderr\": 0.026939106581553945,\n \ \ \"acc_norm\": 0.21940928270042195,\n \"acc_norm_stderr\": 0.026939106581553945\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.17040358744394618,\n\ \ \"acc_stderr\": 0.02523459344713618,\n \"acc_norm\": 0.17040358744394618,\n\ \ \"acc_norm_stderr\": 0.02523459344713618\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\ \ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.256198347107438,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\"\ : 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n\ \ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n\ \ \"acc_stderr\": 0.04077494709252628,\n \"acc_norm\": 0.23148148148148148,\n\ \ \"acc_norm_stderr\": 0.04077494709252628\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.03322015795776741,\n\ \ \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.03322015795776741\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25,\n\ \ \"acc_stderr\": 0.04109974682633932,\n \"acc_norm\": 0.25,\n \ \ \"acc_norm_stderr\": 0.04109974682633932\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.30097087378640774,\n \"acc_stderr\": 0.0454160944650395,\n\ \ \"acc_norm\": 0.30097087378640774,\n \"acc_norm_stderr\": 0.0454160944650395\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.20512820512820512,\n\ \ \"acc_stderr\": 0.026453508054040342,\n \"acc_norm\": 0.20512820512820512,\n\ \ \"acc_norm_stderr\": 0.026453508054040342\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \ \ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\ \ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23116219667943805,\n\ \ \"acc_stderr\": 0.01507552323810108,\n \"acc_norm\": 0.23116219667943805,\n\ \ \"acc_norm_stderr\": 0.01507552323810108\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.23121387283236994,\n \"acc_stderr\": 0.022698657167855716,\n\ \ \"acc_norm\": 0.23121387283236994,\n \"acc_norm_stderr\": 0.022698657167855716\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2782122905027933,\n\ \ \"acc_stderr\": 0.014987325439963556,\n \"acc_norm\": 0.2782122905027933,\n\ \ 
\"acc_norm_stderr\": 0.014987325439963556\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.02473998135511359,\n\ \ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.02473998135511359\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2733118971061093,\n\ \ \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.2733118971061093,\n\ \ \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.024659685185967267,\n\ \ \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.024659685185967267\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.2765957446808511,\n \"acc_stderr\": 0.02668456434046099,\n \ \ \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.02668456434046099\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26792698826597133,\n\ \ \"acc_stderr\": 0.01131134769063389,\n \"acc_norm\": 0.26792698826597133,\n\ \ \"acc_norm_stderr\": 0.01131134769063389\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.2977941176470588,\n \"acc_stderr\": 0.027778298701545443,\n\ \ \"acc_norm\": 0.2977941176470588,\n \"acc_norm_stderr\": 0.027778298701545443\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.2434640522875817,\n \"acc_stderr\": 0.017362473762146627,\n \ \ \"acc_norm\": 0.2434640522875817,\n \"acc_norm_stderr\": 0.017362473762146627\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2,\n\ \ \"acc_stderr\": 0.03831305140884603,\n \"acc_norm\": 0.2,\n \ \ \"acc_norm_stderr\": 0.03831305140884603\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.27755102040816326,\n \"acc_stderr\": 0.028666857790274648,\n\ \ \"acc_norm\": 0.27755102040816326,\n \"acc_norm_stderr\": 0.028666857790274648\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 
0.26865671641791045,\n\ \ \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.26865671641791045,\n\ \ \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \ \ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\ \ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.29518072289156627,\n\ \ \"acc_stderr\": 0.035509201856896294,\n \"acc_norm\": 0.29518072289156627,\n\ \ \"acc_norm_stderr\": 0.035509201856896294\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.03301405946987249,\n\ \ \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.03301405946987249\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2484700122399021,\n\ \ \"mc1_stderr\": 0.015127427096520691,\n \"mc2\": 0.49025732091616386,\n\ \ \"mc2_stderr\": 0.01699410311382743\n }\n}\n```" repo_url: https://huggingface.co/yeen214/test_llama2_ko_7b leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|arc:challenge|25_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hellaswag|10_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-48-16.505628.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-48-16.505628.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-48-16.505628.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-48-16.505628.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-48-16.505628.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-48-16.505628.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-48-16.505628.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-48-16.505628.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-48-16.505628.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T06_48_16.505628 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T06-48-16.505628.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T06-48-16.505628.parquet' - config_name: results data_files: - split: 2023_10_04T06_48_16.505628 path: - results_2023-10-04T06-48-16.505628.parquet - split: latest path: - results_2023-10-04T06-48-16.505628.parquet --- # Dataset Card for Evaluation run of yeen214/test_llama2_ko_7b ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/yeen214/test_llama2_ko_7b - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[yeen214/test_llama2_ko_7b](https://huggingface.co/yeen214/test_llama2_ko_7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeen214__test_llama2_ko_7b",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-04T06:48:16.505628](https://huggingface.co/datasets/open-llm-leaderboard/details_yeen214__test_llama2_ko_7b/blob/main/results_2023-10-04T06-48-16.505628.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.2556908024660644, "acc_stderr": 0.031710434396029, "acc_norm": 0.25717007860214525, "acc_norm_stderr": 0.03173157560207755, "mc1": 0.2484700122399021, "mc1_stderr": 0.015127427096520691, "mc2": 0.49025732091616386, "mc2_stderr": 0.01699410311382743 }, "harness|arc:challenge|25": { "acc": 0.22525597269624573, "acc_stderr": 0.01220783999540732, "acc_norm": 0.29948805460750855, "acc_norm_stderr": 0.013385021637313558 }, "harness|hellaswag|10": { "acc": 0.25632344154550885, "acc_stderr": 0.004357101984278613, "acc_norm": 0.2693686516630153, "acc_norm_stderr": 0.004427251499236945 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.21, "acc_stderr": 0.04093601807403326, "acc_norm": 0.21, "acc_norm_stderr": 0.04093601807403326 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.23703703703703705, "acc_stderr": 0.03673731683969506, "acc_norm": 0.23703703703703705, "acc_norm_stderr": 0.03673731683969506 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.3355263157894737, "acc_stderr": 0.038424985593952674, "acc_norm": 0.3355263157894737, "acc_norm_stderr": 0.038424985593952674 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.21, "acc_stderr": 0.040936018074033256, "acc_norm": 0.21, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.2188679245283019, "acc_stderr": 0.025447863825108614, "acc_norm": 0.2188679245283019, "acc_norm_stderr": 0.025447863825108614 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.20833333333333334, "acc_stderr": 0.033961162058453336, "acc_norm": 0.20833333333333334, "acc_norm_stderr": 0.033961162058453336 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.26, "acc_stderr": 0.0440844002276808, "acc_norm": 0.26, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.28, "acc_stderr": 0.04512608598542126, "acc_norm": 0.28, 
"acc_norm_stderr": 0.04512608598542126 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.24, "acc_stderr": 0.04292346959909281, "acc_norm": 0.24, "acc_norm_stderr": 0.04292346959909281 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.2774566473988439, "acc_stderr": 0.03414014007044036, "acc_norm": 0.2774566473988439, "acc_norm_stderr": 0.03414014007044036 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.24509803921568626, "acc_stderr": 0.04280105837364395, "acc_norm": 0.24509803921568626, "acc_norm_stderr": 0.04280105837364395 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.24, "acc_stderr": 0.04292346959909283, "acc_norm": 0.24, "acc_norm_stderr": 0.04292346959909283 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.2936170212765957, "acc_stderr": 0.02977164271249123, "acc_norm": 0.2936170212765957, "acc_norm_stderr": 0.02977164271249123 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2719298245614035, "acc_stderr": 0.041857744240220575, "acc_norm": 0.2719298245614035, "acc_norm_stderr": 0.041857744240220575 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.30344827586206896, "acc_stderr": 0.038312260488503336, "acc_norm": 0.30344827586206896, "acc_norm_stderr": 0.038312260488503336 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2830687830687831, "acc_stderr": 0.023201392938194978, "acc_norm": 0.2830687830687831, "acc_norm_stderr": 0.023201392938194978 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.29365079365079366, "acc_stderr": 0.04073524322147127, "acc_norm": 0.29365079365079366, "acc_norm_stderr": 0.04073524322147127 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.27, "acc_stderr": 0.044619604333847394, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.2806451612903226, "acc_stderr": 0.0255606047210229, "acc_norm": 0.2806451612903226, "acc_norm_stderr": 0.0255606047210229 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3054187192118227, "acc_stderr": 0.03240661565868408, "acc_norm": 0.3054187192118227, "acc_norm_stderr": 0.03240661565868408 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.19, "acc_stderr": 0.03942772444036622, "acc_norm": 0.19, "acc_norm_stderr": 0.03942772444036622 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.2545454545454545, "acc_stderr": 0.0340150671524904, "acc_norm": 0.2545454545454545, "acc_norm_stderr": 0.0340150671524904 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.2777777777777778, "acc_stderr": 0.03191178226713547, "acc_norm": 0.2777777777777778, "acc_norm_stderr": 0.03191178226713547 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.37823834196891193, "acc_stderr": 0.03499807276193338, "acc_norm": 0.37823834196891193, "acc_norm_stderr": 0.03499807276193338 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.2794871794871795, "acc_stderr": 0.022752388839776823, "acc_norm": 0.2794871794871795, "acc_norm_stderr": 0.022752388839776823 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.23703703703703705, "acc_stderr": 0.02592887613276613, "acc_norm": 0.23703703703703705, "acc_norm_stderr": 0.02592887613276613 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.2605042016806723, "acc_stderr": 0.028510251512341933, "acc_norm": 0.2605042016806723, "acc_norm_stderr": 0.028510251512341933 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.23841059602649006, "acc_stderr": 0.03479185572599659, "acc_norm": 0.23841059602649006, "acc_norm_stderr": 0.03479185572599659 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.21284403669724772, "acc_stderr": 0.017549376389313694, "acc_norm": 0.21284403669724772, "acc_norm_stderr": 0.017549376389313694 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.28703703703703703, "acc_stderr": 
0.030851992993257017, "acc_norm": 0.28703703703703703, "acc_norm_stderr": 0.030851992993257017 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.2549019607843137, "acc_stderr": 0.030587591351604243, "acc_norm": 0.2549019607843137, "acc_norm_stderr": 0.030587591351604243 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.21940928270042195, "acc_stderr": 0.026939106581553945, "acc_norm": 0.21940928270042195, "acc_norm_stderr": 0.026939106581553945 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.17040358744394618, "acc_stderr": 0.02523459344713618, "acc_norm": 0.17040358744394618, "acc_norm_stderr": 0.02523459344713618 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.22900763358778625, "acc_stderr": 0.036853466317118506, "acc_norm": 0.22900763358778625, "acc_norm_stderr": 0.036853466317118506 }, "harness|hendrycksTest-international_law|5": { "acc": 0.256198347107438, "acc_stderr": 0.03984979653302871, "acc_norm": 0.256198347107438, "acc_norm_stderr": 0.03984979653302871 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.23148148148148148, "acc_stderr": 0.04077494709252628, "acc_norm": 0.23148148148148148, "acc_norm_stderr": 0.04077494709252628 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.2331288343558282, "acc_stderr": 0.03322015795776741, "acc_norm": 0.2331288343558282, "acc_norm_stderr": 0.03322015795776741 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.25, "acc_stderr": 0.04109974682633932, "acc_norm": 0.25, "acc_norm_stderr": 0.04109974682633932 }, "harness|hendrycksTest-management|5": { "acc": 0.30097087378640774, "acc_stderr": 0.0454160944650395, "acc_norm": 0.30097087378640774, "acc_norm_stderr": 0.0454160944650395 }, "harness|hendrycksTest-marketing|5": { "acc": 0.20512820512820512, "acc_stderr": 0.026453508054040342, "acc_norm": 0.20512820512820512, "acc_norm_stderr": 0.026453508054040342 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.27, "acc_stderr": 0.0446196043338474, 
"acc_norm": 0.27, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.23116219667943805, "acc_stderr": 0.01507552323810108, "acc_norm": 0.23116219667943805, "acc_norm_stderr": 0.01507552323810108 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.23121387283236994, "acc_stderr": 0.022698657167855716, "acc_norm": 0.23121387283236994, "acc_norm_stderr": 0.022698657167855716 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2782122905027933, "acc_stderr": 0.014987325439963556, "acc_norm": 0.2782122905027933, "acc_norm_stderr": 0.014987325439963556 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.24836601307189543, "acc_stderr": 0.02473998135511359, "acc_norm": 0.24836601307189543, "acc_norm_stderr": 0.02473998135511359 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.2733118971061093, "acc_stderr": 0.02531176597542612, "acc_norm": 0.2733118971061093, "acc_norm_stderr": 0.02531176597542612 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.26851851851851855, "acc_stderr": 0.024659685185967267, "acc_norm": 0.26851851851851855, "acc_norm_stderr": 0.024659685185967267 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.2765957446808511, "acc_stderr": 0.02668456434046099, "acc_norm": 0.2765957446808511, "acc_norm_stderr": 0.02668456434046099 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.26792698826597133, "acc_stderr": 0.01131134769063389, "acc_norm": 0.26792698826597133, "acc_norm_stderr": 0.01131134769063389 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.2977941176470588, "acc_stderr": 0.027778298701545443, "acc_norm": 0.2977941176470588, "acc_norm_stderr": 0.027778298701545443 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.2434640522875817, "acc_stderr": 0.017362473762146627, "acc_norm": 0.2434640522875817, "acc_norm_stderr": 0.017362473762146627 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.2, "acc_stderr": 0.03831305140884603, "acc_norm": 
0.2, "acc_norm_stderr": 0.03831305140884603 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.27755102040816326, "acc_stderr": 0.028666857790274648, "acc_norm": 0.27755102040816326, "acc_norm_stderr": 0.028666857790274648 }, "harness|hendrycksTest-sociology|5": { "acc": 0.26865671641791045, "acc_stderr": 0.03134328358208954, "acc_norm": 0.26865671641791045, "acc_norm_stderr": 0.03134328358208954 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.2, "acc_stderr": 0.04020151261036846, "acc_norm": 0.2, "acc_norm_stderr": 0.04020151261036846 }, "harness|hendrycksTest-virology|5": { "acc": 0.29518072289156627, "acc_stderr": 0.035509201856896294, "acc_norm": 0.29518072289156627, "acc_norm_stderr": 0.035509201856896294 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.24561403508771928, "acc_stderr": 0.03301405946987249, "acc_norm": 0.24561403508771928, "acc_norm_stderr": 0.03301405946987249 }, "harness|truthfulqa:mc|0": { "mc1": 0.2484700122399021, "mc1_stderr": 0.015127427096520691, "mc2": 0.49025732091616386, "mc2_stderr": 0.01699410311382743 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
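The results JSON shown under "Latest results" follows a fixed shape: an `"all"` block with aggregate metrics, plus one block per harness task keyed as `harness|<task>|<n_shot>`. A minimal sketch of reading the headline accuracy out of such a file, using a small inline sample in place of the real download:

```python
import json

# A small inline sample mirroring the structure of the results JSON
# shown above (the real file lives in the dataset repo).
sample = """
{
  "all": {"acc": 0.2557, "acc_stderr": 0.0317},
  "harness|arc:challenge|25": {"acc": 0.2253, "acc_norm": 0.2995}
}
"""

results = json.loads(sample)

# Headline accuracy with its standard error.
headline = results["all"]
print(f"acc = {headline['acc']} +/- {headline['acc_stderr']}")

# Per-task blocks are keyed as "harness|<task>|<n_shot>".
tasks = [k for k in results if k.startswith("harness|")]
print(tasks)
```

This only parses the structure documented above; fetching the actual file requires downloading it from the dataset repository (for example via the `results` configuration loaded with `load_dataset` as shown earlier).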
atom-in-the-universe/bild-b1073cc5-6a79-4022-ad68-fef05cbe40d2
2023-10-04T07:05:14.000Z
[ "region:us" ]
atom-in-the-universe
null
null
null
0
0
Entry not found
open-llm-leaderboard/details_hoskinson-center__proofGPT-v0.1
2023-10-04T06:53:14.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of hoskinson-center/proofGPT-v0.1 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [hoskinson-center/proofGPT-v0.1](https://huggingface.co/hoskinson-center/proofGPT-v0.1)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_hoskinson-center__proofGPT-v0.1\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T06:51:58.783827](https://huggingface.co/datasets/open-llm-leaderboard/details_hoskinson-center__proofGPT-v0.1/blob/main/results_2023-10-04T06-51-58.783827.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25889261808594116,\n\ \ \"acc_stderr\": 0.031593823500334205,\n \"acc_norm\": 0.25952663019315186,\n\ \ \"acc_norm_stderr\": 0.031602156702635366,\n \"mc1\": 0.2974296205630355,\n\ \ \"mc1_stderr\": 0.01600265148736101,\n \"mc2\": 0.5163836837036343,\n\ \ \"mc2_stderr\": 0.015583918180205296\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.20819112627986347,\n \"acc_stderr\": 0.011864866118448064,\n\ \ \"acc_norm\": 0.22866894197952217,\n \"acc_norm_stderr\": 0.012272853582540783\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2696673969328819,\n\ \ \"acc_stderr\": 0.004428800140739951,\n \"acc_norm\": 0.2865962955586537,\n\ \ \"acc_norm_stderr\": 0.004512471612415572\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \ \ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34814814814814815,\n\ \ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.34814814814814815,\n\ \ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n\ \ \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.25,\n\ \ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \ \ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.2943396226415094,\n \"acc_stderr\": 0.028049186315695248,\n\ \ \"acc_norm\": 0.2943396226415094,\n \"acc_norm_stderr\": 0.028049186315695248\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n\ \ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \ \ \"acc_norm_stderr\": 0.03621034121889507\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \ \ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\ \ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \ \ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2543352601156069,\n\ \ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.2543352601156069,\n\ \ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\ \ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n\ \ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.2297872340425532,\n \"acc_stderr\": 0.027501752944412424,\n\ \ \"acc_norm\": 0.2297872340425532,\n \"acc_norm_stderr\": 0.027501752944412424\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\ \ \"acc_stderr\": 0.040493392977481425,\n \"acc_norm\": 0.24561403508771928,\n\ \ \"acc_norm_stderr\": 0.040493392977481425\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131184,\n\ \ \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131184\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.24603174603174602,\n \"acc_stderr\": 0.022182037202948368,\n \"\ acc_norm\": 0.24603174603174602,\n 
\"acc_norm_stderr\": 0.022182037202948368\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n\ \ \"acc_stderr\": 0.032006864972873916,\n \"acc_norm\": 0.15079365079365079,\n\ \ \"acc_norm_stderr\": 0.032006864972873916\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.16,\n \"acc_stderr\": 0.03684529491774708,\n \ \ \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.03684529491774708\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3032258064516129,\n\ \ \"acc_stderr\": 0.02614868593067175,\n \"acc_norm\": 0.3032258064516129,\n\ \ \"acc_norm_stderr\": 0.02614868593067175\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.03255086769970103,\n\ \ \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.03255086769970103\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\"\ : 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.28484848484848485,\n \"acc_stderr\": 0.03524390844511785,\n\ \ \"acc_norm\": 0.28484848484848485,\n \"acc_norm_stderr\": 0.03524390844511785\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.3484848484848485,\n \"acc_stderr\": 0.033948539651564025,\n \"\ acc_norm\": 0.3484848484848485,\n \"acc_norm_stderr\": 0.033948539651564025\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.3316062176165803,\n \"acc_stderr\": 0.03397636541089116,\n\ \ \"acc_norm\": 0.3316062176165803,\n \"acc_norm_stderr\": 0.03397636541089116\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.23846153846153847,\n \"acc_stderr\": 0.021606294494647727,\n\ \ \"acc_norm\": 0.23846153846153847,\n \"acc_norm_stderr\": 0.021606294494647727\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \ \ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.20168067226890757,\n \"acc_stderr\": 0.026064313406304523,\n\ \ \"acc_norm\": 0.20168067226890757,\n \"acc_norm_stderr\": 0.026064313406304523\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389024,\n \"\ acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389024\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.3412844036697248,\n \"acc_stderr\": 0.020328612816592435,\n \"\ acc_norm\": 0.3412844036697248,\n \"acc_norm_stderr\": 0.020328612816592435\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502325,\n \"\ acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502325\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.29901960784313725,\n \"acc_stderr\": 0.03213325717373616,\n \"\ acc_norm\": 0.29901960784313725,\n \"acc_norm_stderr\": 0.03213325717373616\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.26582278481012656,\n \"acc_stderr\": 0.02875679962965834,\n \ \ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.02875679962965834\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.19730941704035873,\n\ \ \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.19730941704035873,\n\ \ \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.13740458015267176,\n \"acc_stderr\": 0.030194823996804468,\n\ \ \"acc_norm\": 0.13740458015267176,\n \"acc_norm_stderr\": 0.030194823996804468\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.2892561983471074,\n \"acc_stderr\": 0.04139112727635463,\n \"\ acc_norm\": 0.2892561983471074,\n \"acc_norm_stderr\": 0.04139112727635463\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.17592592592592593,\n\ \ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.17592592592592593,\n\ \ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.26380368098159507,\n \"acc_stderr\": 0.034624199316156234,\n\ \ \"acc_norm\": 0.26380368098159507,\n \"acc_norm_stderr\": 0.034624199316156234\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\ \ \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n\ \ \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690878,\n\ \ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690878\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n\ \ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n\ \ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.21711366538952745,\n\ \ \"acc_stderr\": 0.014743125394823298,\n \"acc_norm\": 0.21711366538952745,\n\ \ \"acc_norm_stderr\": 0.014743125394823298\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.022075709251757183,\n\ \ \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.022075709251757183\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\ \ \"acc_stderr\": 0.014422292204808835,\n 
\"acc_norm\": 0.24692737430167597,\n\ \ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.02428861946604609,\n\ \ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.02428861946604609\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2733118971061093,\n\ \ \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.2733118971061093,\n\ \ \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.25,\n \"acc_stderr\": 0.02409347123262133,\n \ \ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.02409347123262133\n \ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\ : 0.25177304964539005,\n \"acc_stderr\": 0.025892151156709405,\n \"\ acc_norm\": 0.25177304964539005,\n \"acc_norm_stderr\": 0.025892151156709405\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2470664928292047,\n\ \ \"acc_stderr\": 0.011015752255279324,\n \"acc_norm\": 0.2470664928292047,\n\ \ \"acc_norm_stderr\": 0.011015752255279324\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.45588235294117646,\n \"acc_stderr\": 0.030254372573976694,\n\ \ \"acc_norm\": 0.45588235294117646,\n \"acc_norm_stderr\": 0.030254372573976694\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.2565359477124183,\n \"acc_stderr\": 0.017667841612378984,\n \ \ \"acc_norm\": 0.2565359477124183,\n \"acc_norm_stderr\": 0.017667841612378984\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n\ \ \"acc_stderr\": 0.04013964554072773,\n \"acc_norm\": 0.22727272727272727,\n\ \ \"acc_norm_stderr\": 0.04013964554072773\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.2163265306122449,\n \"acc_stderr\": 0.02635891633490401,\n\ \ \"acc_norm\": 0.2163265306122449,\n \"acc_norm_stderr\": 0.02635891633490401\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2537313432835821,\n\ \ \"acc_stderr\": 0.030769444967296018,\n \"acc_norm\": 0.2537313432835821,\n\ \ \"acc_norm_stderr\": 0.030769444967296018\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \ \ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.15060240963855423,\n\ \ \"acc_stderr\": 0.02784386378726433,\n \"acc_norm\": 0.15060240963855423,\n\ \ \"acc_norm_stderr\": 0.02784386378726433\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.26900584795321636,\n \"acc_stderr\": 0.0340105262010409,\n\ \ \"acc_norm\": 0.26900584795321636,\n \"acc_norm_stderr\": 0.0340105262010409\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2974296205630355,\n\ \ \"mc1_stderr\": 0.01600265148736101,\n \"mc2\": 0.5163836837036343,\n\ \ \"mc2_stderr\": 0.015583918180205296\n }\n}\n```" repo_url: https://huggingface.co/hoskinson-center/proofGPT-v0.1 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|arc:challenge|25_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hellaswag|10_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-51-58.783827.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-51-58.783827.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-51-58.783827.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-51-58.783827.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-51-58.783827.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-51-58.783827.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-51-58.783827.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-51-58.783827.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-51-58.783827.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T06_51_58.783827 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T06-51-58.783827.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T06-51-58.783827.parquet' - config_name: results data_files: - split: 2023_10_04T06_51_58.783827 path: - results_2023-10-04T06-51-58.783827.parquet - split: latest path: - results_2023-10-04T06-51-58.783827.parquet --- # Dataset Card for Evaluation run of hoskinson-center/proofGPT-v0.1 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/hoskinson-center/proofGPT-v0.1 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[hoskinson-center/proofGPT-v0.1](https://huggingface.co/hoskinson-center/proofGPT-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_hoskinson-center__proofGPT-v0.1", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-10-04T06:51:58.783827](https://huggingface.co/datasets/open-llm-leaderboard/details_hoskinson-center__proofGPT-v0.1/blob/main/results_2023-10-04T06-51-58.783827.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.25889261808594116, "acc_stderr": 0.031593823500334205, "acc_norm": 0.25952663019315186, "acc_norm_stderr": 0.031602156702635366, "mc1": 0.2974296205630355, "mc1_stderr": 0.01600265148736101, "mc2": 0.5163836837036343, "mc2_stderr": 0.015583918180205296 }, "harness|arc:challenge|25": { "acc": 0.20819112627986347, "acc_stderr": 0.011864866118448064, "acc_norm": 0.22866894197952217, "acc_norm_stderr": 0.012272853582540783 }, "harness|hellaswag|10": { "acc": 0.2696673969328819, "acc_stderr": 0.004428800140739951, "acc_norm": 0.2865962955586537, "acc_norm_stderr": 0.004512471612415572 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.22, "acc_stderr": 0.04163331998932268, "acc_norm": 0.22, "acc_norm_stderr": 0.04163331998932268 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.34814814814814815, "acc_stderr": 0.041153246103369526, "acc_norm": 0.34814814814814815, "acc_norm_stderr": 0.041153246103369526 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.19736842105263158, "acc_stderr": 0.03238981601699397, "acc_norm": 0.19736842105263158, "acc_norm_stderr": 0.03238981601699397 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.2943396226415094, "acc_stderr": 0.028049186315695248, "acc_norm": 0.2943396226415094, "acc_norm_stderr": 0.028049186315695248 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.25, "acc_stderr": 0.03621034121889507, "acc_norm": 0.25, "acc_norm_stderr": 0.03621034121889507 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 
0.04648231987117316 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.28, "acc_stderr": 0.04512608598542127, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.2543352601156069, "acc_stderr": 0.0332055644308557, "acc_norm": 0.2543352601156069, "acc_norm_stderr": 0.0332055644308557 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.20588235294117646, "acc_stderr": 0.04023382273617747, "acc_norm": 0.20588235294117646, "acc_norm_stderr": 0.04023382273617747 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.26, "acc_stderr": 0.04408440022768078, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.2297872340425532, "acc_stderr": 0.027501752944412424, "acc_norm": 0.2297872340425532, "acc_norm_stderr": 0.027501752944412424 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.24561403508771928, "acc_stderr": 0.040493392977481425, "acc_norm": 0.24561403508771928, "acc_norm_stderr": 0.040493392977481425 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.22758620689655173, "acc_stderr": 0.03493950380131184, "acc_norm": 0.22758620689655173, "acc_norm_stderr": 0.03493950380131184 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.24603174603174602, "acc_stderr": 0.022182037202948368, "acc_norm": 0.24603174603174602, "acc_norm_stderr": 0.022182037202948368 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.15079365079365079, "acc_stderr": 0.032006864972873916, "acc_norm": 0.15079365079365079, "acc_norm_stderr": 0.032006864972873916 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.16, "acc_stderr": 0.03684529491774708, "acc_norm": 0.16, "acc_norm_stderr": 0.03684529491774708 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.3032258064516129, "acc_stderr": 0.02614868593067175, "acc_norm": 0.3032258064516129, "acc_norm_stderr": 0.02614868593067175 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3103448275862069, "acc_stderr": 0.03255086769970103, "acc_norm": 0.3103448275862069, "acc_norm_stderr": 0.03255086769970103 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.33, "acc_stderr": 0.04725815626252605, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252605 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.28484848484848485, "acc_stderr": 0.03524390844511785, "acc_norm": 0.28484848484848485, "acc_norm_stderr": 0.03524390844511785 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.3484848484848485, "acc_stderr": 0.033948539651564025, "acc_norm": 0.3484848484848485, "acc_norm_stderr": 0.033948539651564025 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.3316062176165803, "acc_stderr": 0.03397636541089116, "acc_norm": 0.3316062176165803, "acc_norm_stderr": 0.03397636541089116 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.23846153846153847, "acc_stderr": 0.021606294494647727, "acc_norm": 0.23846153846153847, "acc_norm_stderr": 0.021606294494647727 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2740740740740741, "acc_stderr": 0.027195934804085622, "acc_norm": 0.2740740740740741, "acc_norm_stderr": 0.027195934804085622 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.20168067226890757, "acc_stderr": 0.026064313406304523, "acc_norm": 0.20168067226890757, "acc_norm_stderr": 0.026064313406304523 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2847682119205298, "acc_stderr": 0.03684881521389024, "acc_norm": 0.2847682119205298, "acc_norm_stderr": 0.03684881521389024 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.3412844036697248, "acc_stderr": 0.020328612816592435, "acc_norm": 0.3412844036697248, "acc_norm_stderr": 0.020328612816592435 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4444444444444444, "acc_stderr": 
0.03388857118502325, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.03388857118502325 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.29901960784313725, "acc_stderr": 0.03213325717373616, "acc_norm": 0.29901960784313725, "acc_norm_stderr": 0.03213325717373616 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.26582278481012656, "acc_stderr": 0.02875679962965834, "acc_norm": 0.26582278481012656, "acc_norm_stderr": 0.02875679962965834 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.19730941704035873, "acc_stderr": 0.02670985334496796, "acc_norm": 0.19730941704035873, "acc_norm_stderr": 0.02670985334496796 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.13740458015267176, "acc_stderr": 0.030194823996804468, "acc_norm": 0.13740458015267176, "acc_norm_stderr": 0.030194823996804468 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2892561983471074, "acc_stderr": 0.04139112727635463, "acc_norm": 0.2892561983471074, "acc_norm_stderr": 0.04139112727635463 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.17592592592592593, "acc_stderr": 0.036809181416738807, "acc_norm": 0.17592592592592593, "acc_norm_stderr": 0.036809181416738807 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.26380368098159507, "acc_stderr": 0.034624199316156234, "acc_norm": 0.26380368098159507, "acc_norm_stderr": 0.034624199316156234 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.32142857142857145, "acc_stderr": 0.04432804055291519, "acc_norm": 0.32142857142857145, "acc_norm_stderr": 0.04432804055291519 }, "harness|hendrycksTest-management|5": { "acc": 0.2524271844660194, "acc_stderr": 0.04301250399690878, "acc_norm": 0.2524271844660194, "acc_norm_stderr": 0.04301250399690878 }, "harness|hendrycksTest-marketing|5": { "acc": 0.19658119658119658, "acc_stderr": 0.02603538609895129, "acc_norm": 0.19658119658119658, "acc_norm_stderr": 0.02603538609895129 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.29, 
"acc_stderr": 0.04560480215720684, "acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.21711366538952745, "acc_stderr": 0.014743125394823298, "acc_norm": 0.21711366538952745, "acc_norm_stderr": 0.014743125394823298 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.2138728323699422, "acc_stderr": 0.022075709251757183, "acc_norm": 0.2138728323699422, "acc_norm_stderr": 0.022075709251757183 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24692737430167597, "acc_stderr": 0.014422292204808835, "acc_norm": 0.24692737430167597, "acc_norm_stderr": 0.014422292204808835 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.23529411764705882, "acc_stderr": 0.02428861946604609, "acc_norm": 0.23529411764705882, "acc_norm_stderr": 0.02428861946604609 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.2733118971061093, "acc_stderr": 0.02531176597542612, "acc_norm": 0.2733118971061093, "acc_norm_stderr": 0.02531176597542612 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.25, "acc_stderr": 0.02409347123262133, "acc_norm": 0.25, "acc_norm_stderr": 0.02409347123262133 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.25177304964539005, "acc_stderr": 0.025892151156709405, "acc_norm": 0.25177304964539005, "acc_norm_stderr": 0.025892151156709405 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2470664928292047, "acc_stderr": 0.011015752255279324, "acc_norm": 0.2470664928292047, "acc_norm_stderr": 0.011015752255279324 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.45588235294117646, "acc_stderr": 0.030254372573976694, "acc_norm": 0.45588235294117646, "acc_norm_stderr": 0.030254372573976694 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.2565359477124183, "acc_stderr": 0.017667841612378984, "acc_norm": 0.2565359477124183, "acc_norm_stderr": 0.017667841612378984 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.22727272727272727, "acc_stderr": 
0.04013964554072773, "acc_norm": 0.22727272727272727, "acc_norm_stderr": 0.04013964554072773 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.2163265306122449, "acc_stderr": 0.02635891633490401, "acc_norm": 0.2163265306122449, "acc_norm_stderr": 0.02635891633490401 }, "harness|hendrycksTest-sociology|5": { "acc": 0.2537313432835821, "acc_stderr": 0.030769444967296018, "acc_norm": 0.2537313432835821, "acc_norm_stderr": 0.030769444967296018 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.27, "acc_stderr": 0.044619604333847394, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-virology|5": { "acc": 0.15060240963855423, "acc_stderr": 0.02784386378726433, "acc_norm": 0.15060240963855423, "acc_norm_stderr": 0.02784386378726433 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.26900584795321636, "acc_stderr": 0.0340105262010409, "acc_norm": 0.26900584795321636, "acc_norm_stderr": 0.0340105262010409 }, "harness|truthfulqa:mc|0": { "mc1": 0.2974296205630355, "mc1_stderr": 0.01600265148736101, "mc2": 0.5163836837036343, "mc2_stderr": 0.015583918180205296 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
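The split-naming convention this card describes (one timestamped split per run, plus a `latest` alias pointing at the newest run) can be resolved without touching the Hub. A minimal sketch, assuming split names follow the zero-padded `YYYY_MM_DDTHH_MM_SS.ffffff` pattern used in the configs above (`latest_run_split` is an illustrative helper, not part of any library):

```python
def latest_run_split(split_names):
    """Pick the most recent timestamped run split among a config's splits.

    Because the timestamps are zero-padded (e.g. "2023_10_04T06_51_58.783827"),
    lexicographic string order matches chronological order, so max() suffices.
    """
    runs = [name for name in split_names if name != "latest"]
    if not runs:
        raise ValueError("no timestamped run splits found")
    return max(runs)

# The "latest" split in each config lists the same parquet files as the
# newest timestamped split this helper selects.
print(latest_run_split(["2023_10_04T06_51_58.783827", "latest"]))
# -> 2023_10_04T06_51_58.783827
```

This is why the cards can keep appending one split per evaluation run while consumers only ever read `latest`.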
open-llm-leaderboard/details_hoskinson-center__proofGPT-v0.1-6.7B
2023-10-04T06:56:26.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of hoskinson-center/proofGPT-v0.1-6.7B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [hoskinson-center/proofGPT-v0.1-6.7B](https://huggingface.co/hoskinson-center/proofGPT-v0.1-6.7B)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_hoskinson-center__proofGPT-v0.1-6.7B\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T06:55:11.412904](https://huggingface.co/datasets/open-llm-leaderboard/details_hoskinson-center__proofGPT-v0.1-6.7B/blob/main/results_2023-10-04T06-55-11.412904.json)\ \ (note that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24561842333609638,\n\ \ \"acc_stderr\": 0.03127980098226512,\n \"acc_norm\": 0.24616033369541826,\n\ \ \"acc_norm_stderr\": 0.03128730084182194,\n \"mc1\": 0.29498164014687883,\n\ \ \"mc1_stderr\": 0.015964400965589667,\n \"mc2\": 0.5087484633157238,\n\ \ \"mc2_stderr\": 0.01570847457765271\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.21331058020477817,\n \"acc_stderr\": 0.011970971742326334,\n\ \ \"acc_norm\": 0.23293515358361774,\n \"acc_norm_stderr\": 0.012352507042617417\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.27215694084843656,\n\ \ \"acc_stderr\": 0.004441606665787927,\n \"acc_norm\": 0.28450507866958774,\n\ \ \"acc_norm_stderr\": 0.004502563079349396\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \ \ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2814814814814815,\n\ \ \"acc_stderr\": 0.03885004245800255,\n \"acc_norm\": 0.2814814814814815,\n\ \ \"acc_norm_stderr\": 0.03885004245800255\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.17105263157894737,\n \"acc_stderr\": 0.030643607071677088,\n\ \ \"acc_norm\": 0.17105263157894737,\n \"acc_norm_stderr\": 0.030643607071677088\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.36,\n\ \ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \ \ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.19622641509433963,\n \"acc_stderr\": 0.024442388131100827,\n\ \ \"acc_norm\": 0.19622641509433963,\n \"acc_norm_stderr\": 0.024442388131100827\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\ \ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\ \ \"acc_norm_stderr\": 
0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \ \ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\ \ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \ \ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\ \ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2138728323699422,\n\ \ \"acc_stderr\": 0.03126511206173041,\n \"acc_norm\": 0.2138728323699422,\n\ \ \"acc_norm_stderr\": 0.03126511206173041\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n\ \ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n\ \ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.25957446808510637,\n \"acc_stderr\": 0.028659179374292316,\n\ \ \"acc_norm\": 0.25957446808510637,\n \"acc_norm_stderr\": 0.028659179374292316\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\ \ \"acc_stderr\": 0.040969851398436695,\n \"acc_norm\": 0.2543859649122807,\n\ \ \"acc_norm_stderr\": 0.040969851398436695\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.03455930201924812,\n\ \ \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.03455930201924812\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\ acc_norm\": 
0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15873015873015872,\n\ \ \"acc_stderr\": 0.03268454013011742,\n \"acc_norm\": 0.15873015873015872,\n\ \ \"acc_norm_stderr\": 0.03268454013011742\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \ \ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\ \ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24516129032258063,\n\ \ \"acc_stderr\": 0.024472243840895525,\n \"acc_norm\": 0.24516129032258063,\n\ \ \"acc_norm_stderr\": 0.024472243840895525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.030108330718011625,\n\ \ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.030108330718011625\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\ : 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.19393939393939394,\n \"acc_stderr\": 0.030874145136562097,\n\ \ \"acc_norm\": 0.19393939393939394,\n \"acc_norm_stderr\": 0.030874145136562097\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.25757575757575757,\n \"acc_stderr\": 0.031156269519646836,\n \"\ acc_norm\": 0.25757575757575757,\n \"acc_norm_stderr\": 0.031156269519646836\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.20207253886010362,\n \"acc_stderr\": 0.02897908979429673,\n\ \ \"acc_norm\": 0.20207253886010362,\n \"acc_norm_stderr\": 0.02897908979429673\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.26153846153846155,\n \"acc_stderr\": 0.02228214120420442,\n\ \ \"acc_norm\": 0.26153846153846155,\n \"acc_norm_stderr\": 0.02228214120420442\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145665,\n \ \ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145665\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.22268907563025211,\n \"acc_stderr\": 0.027025433498882378,\n\ \ \"acc_norm\": 0.22268907563025211,\n \"acc_norm_stderr\": 0.027025433498882378\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.24503311258278146,\n \"acc_stderr\": 0.03511807571804725,\n \"\ acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.03511807571804725\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.28073394495412846,\n \"acc_stderr\": 0.019266055045871616,\n \"\ acc_norm\": 0.28073394495412846,\n \"acc_norm_stderr\": 0.019266055045871616\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.27314814814814814,\n \"acc_stderr\": 0.03038805130167812,\n \"\ acc_norm\": 0.27314814814814814,\n \"acc_norm_stderr\": 0.03038805130167812\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.22549019607843138,\n \"acc_stderr\": 0.029331162294251728,\n \"\ acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.029331162294251728\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.26582278481012656,\n \"acc_stderr\": 0.02875679962965834,\n \ \ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.02875679962965834\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2914798206278027,\n\ \ \"acc_stderr\": 0.030500283176545913,\n \"acc_norm\": 0.2914798206278027,\n\ \ \"acc_norm_stderr\": 0.030500283176545913\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\ \ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.2809917355371901,\n \"acc_stderr\": 0.04103203830514512,\n \"\ acc_norm\": 0.2809917355371901,\n \"acc_norm_stderr\": 0.04103203830514512\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n\ \ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.23148148148148148,\n\ \ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\ \ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\ \ \"acc_stderr\": 0.0432704093257873,\n \"acc_norm\": 0.29464285714285715,\n\ \ \"acc_norm_stderr\": 0.0432704093257873\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.1650485436893204,\n \"acc_stderr\": 0.036756688322331886,\n\ \ \"acc_norm\": 0.1650485436893204,\n \"acc_norm_stderr\": 0.036756688322331886\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19230769230769232,\n\ \ \"acc_stderr\": 0.025819233256483727,\n \"acc_norm\": 0.19230769230769232,\n\ \ \"acc_norm_stderr\": 0.025819233256483727\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2247765006385696,\n\ \ \"acc_stderr\": 0.014927447101937162,\n \"acc_norm\": 0.2247765006385696,\n\ \ \"acc_norm_stderr\": 0.014927447101937162\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.2514450867052023,\n \"acc_stderr\": 0.023357365785874037,\n\ \ \"acc_norm\": 0.2514450867052023,\n \"acc_norm_stderr\": 0.023357365785874037\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\ \ \"acc_stderr\": 0.014265554192331144,\n 
\"acc_norm\": 0.23910614525139665,\n\ \ \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.21895424836601307,\n \"acc_stderr\": 0.02367908986180772,\n\ \ \"acc_norm\": 0.21895424836601307,\n \"acc_norm_stderr\": 0.02367908986180772\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.18006430868167203,\n\ \ \"acc_stderr\": 0.021823422857744953,\n \"acc_norm\": 0.18006430868167203,\n\ \ \"acc_norm_stderr\": 0.021823422857744953\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.23148148148148148,\n \"acc_stderr\": 0.023468429832451156,\n\ \ \"acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.023468429832451156\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.20567375886524822,\n \"acc_stderr\": 0.024112138950471883,\n \ \ \"acc_norm\": 0.20567375886524822,\n \"acc_norm_stderr\": 0.024112138950471883\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.22946544980443284,\n\ \ \"acc_stderr\": 0.010739489382279503,\n \"acc_norm\": 0.22946544980443284,\n\ \ \"acc_norm_stderr\": 0.010739489382279503\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.41911764705882354,\n \"acc_stderr\": 0.02997280717046462,\n\ \ \"acc_norm\": 0.41911764705882354,\n \"acc_norm_stderr\": 0.02997280717046462\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\ : 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\ : {\n \"acc\": 0.24545454545454545,\n \"acc_stderr\": 0.041220665028782834,\n\ \ \"acc_norm\": 0.24545454545454545,\n \"acc_norm_stderr\": 0.041220665028782834\n\ \ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.22857142857142856,\n\ \ \"acc_stderr\": 0.026882144922307744,\n \"acc_norm\": 0.22857142857142856,\n\ \ \"acc_norm_stderr\": 0.026882144922307744\n },\n 
\"harness|hendrycksTest-sociology|5\"\ : {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.030360490154014645,\n\ \ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.030360490154014645\n\ \ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\ \ 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\ \ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-virology|5\"\ : {\n \"acc\": 0.30120481927710846,\n \"acc_stderr\": 0.03571609230053481,\n\ \ \"acc_norm\": 0.30120481927710846,\n \"acc_norm_stderr\": 0.03571609230053481\n\ \ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.24561403508771928,\n\ \ \"acc_stderr\": 0.033014059469872487,\n \"acc_norm\": 0.24561403508771928,\n\ \ \"acc_norm_stderr\": 0.033014059469872487\n },\n \"harness|truthfulqa:mc|0\"\ : {\n \"mc1\": 0.29498164014687883,\n \"mc1_stderr\": 0.015964400965589667,\n\ \ \"mc2\": 0.5087484633157238,\n \"mc2_stderr\": 0.01570847457765271\n\ \ }\n}\n```" repo_url: https://huggingface.co/hoskinson-center/proofGPT-v0.1-6.7B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|arc:challenge|25_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hellaswag|10_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-55-11.412904.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-55-11.412904.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-55-11.412904.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-55-11.412904.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-55-11.412904.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-55-11.412904.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-55-11.412904.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-55-11.412904.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-55-11.412904.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T06_55_11.412904 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T06-55-11.412904.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T06-55-11.412904.parquet' - config_name: results data_files: - split: 2023_10_04T06_55_11.412904 path: - results_2023-10-04T06-55-11.412904.parquet - split: latest path: - results_2023-10-04T06-55-11.412904.parquet --- # Dataset Card for Evaluation run of hoskinson-center/proofGPT-v0.1-6.7B ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/hoskinson-center/proofGPT-v0.1-6.7B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[hoskinson-center/proofGPT-v0.1-6.7B](https://huggingface.co/hoskinson-center/proofGPT-v0.1-6.7B)
on the [Open LLM
Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the
evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific
split in each configuration, the split being named using the timestamp of the
run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the
run (and is used to compute and display the aggregated metrics on the [Open LLM
Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_hoskinson-center__proofGPT-v0.1-6.7B",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-04T06:55:11.412904](https://huggingface.co/datasets/open-llm-leaderboard/details_hoskinson-center__proofGPT-v0.1-6.7B/blob/main/results_2023-10-04T06-55-11.412904.json)
(note that there might be results for other tasks in the repo if successive evals
didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.24561842333609638, "acc_stderr": 0.03127980098226512, "acc_norm": 0.24616033369541826, "acc_norm_stderr": 0.03128730084182194, "mc1": 0.29498164014687883, "mc1_stderr": 0.015964400965589667, "mc2": 0.5087484633157238, "mc2_stderr": 0.01570847457765271 }, "harness|arc:challenge|25": { "acc": 0.21331058020477817, "acc_stderr": 0.011970971742326334, "acc_norm": 0.23293515358361774, "acc_norm_stderr": 0.012352507042617417 }, "harness|hellaswag|10": { "acc": 0.27215694084843656, "acc_stderr": 0.004441606665787927, "acc_norm": 0.28450507866958774, "acc_norm_stderr": 0.004502563079349396 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.24, "acc_stderr": 0.04292346959909283, "acc_norm": 0.24, "acc_norm_stderr": 0.04292346959909283 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.2814814814814815, "acc_stderr": 0.03885004245800255, "acc_norm": 0.2814814814814815, "acc_norm_stderr": 0.03885004245800255 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.17105263157894737, "acc_stderr": 0.030643607071677088, "acc_norm": 0.17105263157894737, "acc_norm_stderr": 0.030643607071677088 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.19622641509433963, "acc_stderr": 0.024442388131100827, "acc_norm": 0.19622641509433963, "acc_norm_stderr": 0.024442388131100827 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2222222222222222, "acc_stderr": 0.03476590104304134, "acc_norm": 0.2222222222222222, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, 
"acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.27, "acc_stderr": 0.0446196043338474, "acc_norm": 0.27, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.2138728323699422, "acc_stderr": 0.03126511206173041, "acc_norm": 0.2138728323699422, "acc_norm_stderr": 0.03126511206173041 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.21568627450980393, "acc_stderr": 0.04092563958237655, "acc_norm": 0.21568627450980393, "acc_norm_stderr": 0.04092563958237655 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.26, "acc_stderr": 0.04408440022768078, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.25957446808510637, "acc_stderr": 0.028659179374292316, "acc_norm": 0.25957446808510637, "acc_norm_stderr": 0.028659179374292316 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2543859649122807, "acc_stderr": 0.040969851398436695, "acc_norm": 0.2543859649122807, "acc_norm_stderr": 0.040969851398436695 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2206896551724138, "acc_stderr": 0.03455930201924812, "acc_norm": 0.2206896551724138, "acc_norm_stderr": 0.03455930201924812 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2566137566137566, "acc_stderr": 0.022494510767503154, "acc_norm": 0.2566137566137566, "acc_norm_stderr": 0.022494510767503154 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.15873015873015872, "acc_stderr": 0.03268454013011742, "acc_norm": 0.15873015873015872, "acc_norm_stderr": 0.03268454013011742 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.17, "acc_stderr": 0.0377525168068637, "acc_norm": 0.17, "acc_norm_stderr": 0.0377525168068637 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.24516129032258063, "acc_stderr": 0.024472243840895525, "acc_norm": 0.24516129032258063, "acc_norm_stderr": 0.024472243840895525 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.2413793103448276, "acc_stderr": 0.030108330718011625, "acc_norm": 0.2413793103448276, "acc_norm_stderr": 0.030108330718011625 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.19393939393939394, "acc_stderr": 0.030874145136562097, "acc_norm": 0.19393939393939394, "acc_norm_stderr": 0.030874145136562097 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.25757575757575757, "acc_stderr": 0.031156269519646836, "acc_norm": 0.25757575757575757, "acc_norm_stderr": 0.031156269519646836 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.20207253886010362, "acc_stderr": 0.02897908979429673, "acc_norm": 0.20207253886010362, "acc_norm_stderr": 0.02897908979429673 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.26153846153846155, "acc_stderr": 0.02228214120420442, "acc_norm": 0.26153846153846155, "acc_norm_stderr": 0.02228214120420442 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.27037037037037037, "acc_stderr": 0.027080372815145665, "acc_norm": 0.27037037037037037, "acc_norm_stderr": 0.027080372815145665 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.22268907563025211, "acc_stderr": 0.027025433498882378, "acc_norm": 0.22268907563025211, "acc_norm_stderr": 0.027025433498882378 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.24503311258278146, "acc_stderr": 0.03511807571804725, "acc_norm": 0.24503311258278146, "acc_norm_stderr": 0.03511807571804725 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.28073394495412846, "acc_stderr": 0.019266055045871616, "acc_norm": 0.28073394495412846, "acc_norm_stderr": 0.019266055045871616 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.27314814814814814, 
"acc_stderr": 0.03038805130167812, "acc_norm": 0.27314814814814814, "acc_norm_stderr": 0.03038805130167812 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.22549019607843138, "acc_stderr": 0.029331162294251728, "acc_norm": 0.22549019607843138, "acc_norm_stderr": 0.029331162294251728 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.26582278481012656, "acc_stderr": 0.02875679962965834, "acc_norm": 0.26582278481012656, "acc_norm_stderr": 0.02875679962965834 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.2914798206278027, "acc_stderr": 0.030500283176545913, "acc_norm": 0.2914798206278027, "acc_norm_stderr": 0.030500283176545913 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.22900763358778625, "acc_stderr": 0.036853466317118506, "acc_norm": 0.22900763358778625, "acc_norm_stderr": 0.036853466317118506 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2809917355371901, "acc_stderr": 0.04103203830514512, "acc_norm": 0.2809917355371901, "acc_norm_stderr": 0.04103203830514512 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.23148148148148148, "acc_stderr": 0.04077494709252627, "acc_norm": 0.23148148148148148, "acc_norm_stderr": 0.04077494709252627 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.22085889570552147, "acc_stderr": 0.032591773927421776, "acc_norm": 0.22085889570552147, "acc_norm_stderr": 0.032591773927421776 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.29464285714285715, "acc_stderr": 0.0432704093257873, "acc_norm": 0.29464285714285715, "acc_norm_stderr": 0.0432704093257873 }, "harness|hendrycksTest-management|5": { "acc": 0.1650485436893204, "acc_stderr": 0.036756688322331886, "acc_norm": 0.1650485436893204, "acc_norm_stderr": 0.036756688322331886 }, "harness|hendrycksTest-marketing|5": { "acc": 0.19230769230769232, "acc_stderr": 0.025819233256483727, "acc_norm": 0.19230769230769232, "acc_norm_stderr": 0.025819233256483727 }, "harness|hendrycksTest-medical_genetics|5": { 
"acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.2247765006385696, "acc_stderr": 0.014927447101937162, "acc_norm": 0.2247765006385696, "acc_norm_stderr": 0.014927447101937162 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.2514450867052023, "acc_stderr": 0.023357365785874037, "acc_norm": 0.2514450867052023, "acc_norm_stderr": 0.023357365785874037 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.23910614525139665, "acc_stderr": 0.014265554192331144, "acc_norm": 0.23910614525139665, "acc_norm_stderr": 0.014265554192331144 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.21895424836601307, "acc_stderr": 0.02367908986180772, "acc_norm": 0.21895424836601307, "acc_norm_stderr": 0.02367908986180772 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.18006430868167203, "acc_stderr": 0.021823422857744953, "acc_norm": 0.18006430868167203, "acc_norm_stderr": 0.021823422857744953 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.23148148148148148, "acc_stderr": 0.023468429832451156, "acc_norm": 0.23148148148148148, "acc_norm_stderr": 0.023468429832451156 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.20567375886524822, "acc_stderr": 0.024112138950471883, "acc_norm": 0.20567375886524822, "acc_norm_stderr": 0.024112138950471883 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.22946544980443284, "acc_stderr": 0.010739489382279503, "acc_norm": 0.22946544980443284, "acc_norm_stderr": 0.010739489382279503 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.41911764705882354, "acc_stderr": 0.02997280717046462, "acc_norm": 0.41911764705882354, "acc_norm_stderr": 0.02997280717046462 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.25, "acc_stderr": 0.01751781884501444, "acc_norm": 0.25, "acc_norm_stderr": 0.01751781884501444 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.24545454545454545, 
"acc_stderr": 0.041220665028782834, "acc_norm": 0.24545454545454545, "acc_norm_stderr": 0.041220665028782834 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.22857142857142856, "acc_stderr": 0.026882144922307744, "acc_norm": 0.22857142857142856, "acc_norm_stderr": 0.026882144922307744 }, "harness|hendrycksTest-sociology|5": { "acc": 0.24378109452736318, "acc_stderr": 0.030360490154014645, "acc_norm": 0.24378109452736318, "acc_norm_stderr": 0.030360490154014645 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-virology|5": { "acc": 0.30120481927710846, "acc_stderr": 0.03571609230053481, "acc_norm": 0.30120481927710846, "acc_norm_stderr": 0.03571609230053481 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.24561403508771928, "acc_stderr": 0.033014059469872487, "acc_norm": 0.24561403508771928, "acc_norm_stderr": 0.033014059469872487 }, "harness|truthfulqa:mc|0": { "mc1": 0.29498164014687883, "mc1_stderr": 0.015964400965589667, "mc2": 0.5087484633157238, "mc2_stderr": 0.01570847457765271 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
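The aggregated results above use flat keys of the form `harness|<benchmark>|<n_shot>`, with the overall averages stored under `"all"`. A minimal sketch of pulling per-task accuracies out of a dict shaped like that JSON (the sample values and the helper name `per_task_acc` are illustrative, not part of the dataset):

```python
# Minimal sketch: extract per-task accuracies from a results dict shaped like
# the "Latest results" JSON above. Keys such as "harness|hendrycksTest-<task>|5"
# hold per-task metrics; "all" holds the aggregate. The helper name and sample
# values here are hypothetical, for illustration only.
results = {
    "all": {"acc": 0.2456, "acc_stderr": 0.0313},
    "harness|hendrycksTest-virology|5": {"acc": 0.3012, "acc_stderr": 0.0357},
    "harness|hendrycksTest-management|5": {"acc": 0.1650, "acc_stderr": 0.0368},
}

def per_task_acc(results):
    """Return {task_name: acc} for every non-aggregate entry that reports acc."""
    return {
        key.split("|")[1]: metrics["acc"]
        for key, metrics in results.items()
        # skip the "all" aggregate; tasks like truthfulqa:mc report mc1/mc2, not acc
        if key != "all" and "acc" in metrics
    }

print(per_task_acc(results))
# {'hendrycksTest-virology': 0.3012, 'hendrycksTest-management': 0.165}
```

The same pattern applies to `acc_norm`, or to `mc1`/`mc2` for the TruthfulQA entry, by filtering on the corresponding metric key.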
open-llm-leaderboard/details_Sao10K__BrainDerp
2023-10-04T07:00:40.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of Sao10K/BrainDerp dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Sao10K/BrainDerp](https://huggingface.co/Sao10K/BrainDerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__BrainDerp\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T06:59:16.770544](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__BrainDerp/blob/main/results_2023-10-04T06-59-16.770544.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5885766970363406,\n\ \ \"acc_stderr\": 0.03412725703420776,\n \"acc_norm\": 0.5924040392177476,\n\ \ \"acc_norm_stderr\": 0.03410755570409047,\n \"mc1\": 0.38922888616891066,\n\ \ \"mc1_stderr\": 0.01706855268069033,\n \"mc2\": 0.5689827090259466,\n\ \ \"mc2_stderr\": 0.015643581657231707\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5810580204778157,\n \"acc_stderr\": 0.014418106953639008,\n\ \ \"acc_norm\": 0.6075085324232082,\n \"acc_norm_stderr\": 0.014269634635670709\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6215893248356901,\n\ \ \"acc_stderr\": 0.004839995745602318,\n \"acc_norm\": 0.8209520015933081,\n\ \ \"acc_norm_stderr\": 0.003826089586650052\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\ \ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\ \ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779206,\n\ \ \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779206\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\ \ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \ \ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.029647813539365242,\n\ \ \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.029647813539365242\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6388888888888888,\n\ \ \"acc_stderr\": 0.04016660030451233,\n \"acc_norm\": 0.6388888888888888,\n\ \ \"acc_norm_stderr\": 0.04016660030451233\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \ \ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n\ \ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \ \ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5606936416184971,\n\ \ \"acc_stderr\": 0.03784271932887467,\n \"acc_norm\": 0.5606936416184971,\n\ \ \"acc_norm_stderr\": 0.03784271932887467\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n\ \ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\ \ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.4978723404255319,\n \"acc_stderr\": 0.03268572658667492,\n\ \ \"acc_norm\": 0.4978723404255319,\n \"acc_norm_stderr\": 0.03268572658667492\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\ \ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\ \ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\ \ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.35978835978835977,\n \"acc_stderr\": 0.024718075944129274,\n \"\ acc_norm\": 0.35978835978835977,\n 
\"acc_norm_stderr\": 0.024718075944129274\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\ \ \"acc_stderr\": 0.04375888492727062,\n \"acc_norm\": 0.3968253968253968,\n\ \ \"acc_norm_stderr\": 0.04375888492727062\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \ \ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7,\n\ \ \"acc_stderr\": 0.026069362295335137,\n \"acc_norm\": 0.7,\n \ \ \"acc_norm_stderr\": 0.026069362295335137\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.03502544650845872,\n\ \ \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.03502544650845872\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\ : 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n\ \ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124498,\n \"\ acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124498\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723875,\n\ \ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723875\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6333333333333333,\n \"acc_stderr\": 0.02443301646605246,\n \ \ \"acc_norm\": 0.6333333333333333,\n \"acc_norm_stderr\": 0.02443301646605246\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": 
{\n \"\ acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524593,\n \ \ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524593\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.031866081214088314,\n\ \ \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.031866081214088314\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"\ acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7871559633027523,\n \"acc_stderr\": 0.017549376389313694,\n \"\ acc_norm\": 0.7871559633027523,\n \"acc_norm_stderr\": 0.017549376389313694\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.41203703703703703,\n \"acc_stderr\": 0.03356787758160835,\n \"\ acc_norm\": 0.41203703703703703,\n \"acc_norm_stderr\": 0.03356787758160835\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"\ acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808514,\n \ \ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808514\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\ \ \"acc_stderr\": 0.030636591348699796,\n \"acc_norm\": 0.7040358744394619,\n\ \ \"acc_norm_stderr\": 0.030636591348699796\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969637,\n\ \ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969637\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.6942148760330579,\n 
\"acc_stderr\": 0.042059539338841226,\n \"\ acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.042059539338841226\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\ \ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\ \ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n\ \ \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\ \ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\ \ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.044532548363264673,\n\ \ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.044532548363264673\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\ \ \"acc_stderr\": 0.025140935950335445,\n \"acc_norm\": 0.8205128205128205,\n\ \ \"acc_norm_stderr\": 0.025140935950335445\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \ \ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7803320561941252,\n\ \ \"acc_stderr\": 0.014805384478371163,\n \"acc_norm\": 0.7803320561941252,\n\ \ \"acc_norm_stderr\": 0.014805384478371163\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6502890173410405,\n \"acc_stderr\": 0.025674281456531018,\n\ \ \"acc_norm\": 0.6502890173410405,\n \"acc_norm_stderr\": 0.025674281456531018\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4346368715083799,\n\ \ \"acc_stderr\": 0.016578997435496706,\n \"acc_norm\": 0.4346368715083799,\n\ \ \"acc_norm_stderr\": 0.016578997435496706\n },\n 
\"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.026992544339297236,\n\ \ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.026992544339297236\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6495176848874598,\n\ \ \"acc_stderr\": 0.027098652621301757,\n \"acc_norm\": 0.6495176848874598,\n\ \ \"acc_norm_stderr\": 0.027098652621301757\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6697530864197531,\n \"acc_stderr\": 0.026168298456732852,\n\ \ \"acc_norm\": 0.6697530864197531,\n \"acc_norm_stderr\": 0.026168298456732852\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \ \ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44654498044328556,\n\ \ \"acc_stderr\": 0.012697046024399675,\n \"acc_norm\": 0.44654498044328556,\n\ \ \"acc_norm_stderr\": 0.012697046024399675\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5661764705882353,\n \"acc_stderr\": 0.03010563657001663,\n\ \ \"acc_norm\": 0.5661764705882353,\n \"acc_norm_stderr\": 0.03010563657001663\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.5849673202614379,\n \"acc_stderr\": 0.019933627776857418,\n \ \ \"acc_norm\": 0.5849673202614379,\n \"acc_norm_stderr\": 0.019933627776857418\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\ \ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\ \ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6693877551020408,\n \"acc_stderr\": 0.030116426296540606,\n\ \ \"acc_norm\": 0.6693877551020408,\n \"acc_norm_stderr\": 0.030116426296540606\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7910447761194029,\n\ 
\ \"acc_stderr\": 0.028748298931728655,\n \"acc_norm\": 0.7910447761194029,\n\ \ \"acc_norm_stderr\": 0.028748298931728655\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \ \ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\ \ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\ \ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03126781714663179,\n\ \ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03126781714663179\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.38922888616891066,\n\ \ \"mc1_stderr\": 0.01706855268069033,\n \"mc2\": 0.5689827090259466,\n\ \ \"mc2_stderr\": 0.015643581657231707\n }\n}\n```" repo_url: https://huggingface.co/Sao10K/BrainDerp leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|arc:challenge|25_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hellaswag|10_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-59-16.770544.parquet' - 
'**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-59-16.770544.parquet' - 
'**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-59-16.770544.parquet' - 
'**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-59-16.770544.parquet' - 
'**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-59-16.770544.parquet' - 
'**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-59-16.770544.parquet' - 
'**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-59-16.770544.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-59-16.770544.parquet' - 
config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-59-16.770544.parquet' - config_name: 
harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - 
split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-59-16.770544.parquet' - 
config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-59-16.770544.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T06_59_16.770544 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T06-59-16.770544.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T06-59-16.770544.parquet' - config_name: results data_files: - split: 2023_10_04T06_59_16.770544 path: - results_2023-10-04T06-59-16.770544.parquet - split: latest path: - results_2023-10-04T06-59-16.770544.parquet
---

# Dataset Card for Evaluation run of Sao10K/BrainDerp

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/Sao10K/BrainDerp
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model [Sao10K/BrainDerp](https://huggingface.co/Sao10K/BrainDerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__BrainDerp",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-04T06:59:16.770544](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__BrainDerp/blob/main/results_2023-10-04T06-59-16.770544.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5885766970363406, "acc_stderr": 0.03412725703420776, "acc_norm": 0.5924040392177476, "acc_norm_stderr": 0.03410755570409047, "mc1": 0.38922888616891066, "mc1_stderr": 0.01706855268069033, "mc2": 0.5689827090259466, "mc2_stderr": 0.015643581657231707 }, "harness|arc:challenge|25": { "acc": 0.5810580204778157, "acc_stderr": 0.014418106953639008, "acc_norm": 0.6075085324232082, "acc_norm_stderr": 0.014269634635670709 }, "harness|hellaswag|10": { "acc": 0.6215893248356901, "acc_stderr": 0.004839995745602318, "acc_norm": 0.8209520015933081, "acc_norm_stderr": 0.003826089586650052 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5111111111111111, "acc_stderr": 0.04318275491977976, "acc_norm": 0.5111111111111111, "acc_norm_stderr": 0.04318275491977976 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5855263157894737, "acc_stderr": 0.04008973785779206, "acc_norm": 0.5855263157894737, "acc_norm_stderr": 0.04008973785779206 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.57, "acc_stderr": 0.049756985195624284, "acc_norm": 0.57, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6339622641509434, "acc_stderr": 0.029647813539365242, "acc_norm": 0.6339622641509434, "acc_norm_stderr": 0.029647813539365242 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6388888888888888, "acc_stderr": 0.04016660030451233, "acc_norm": 0.6388888888888888, "acc_norm_stderr": 0.04016660030451233 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.41, "acc_stderr": 0.04943110704237102, "acc_norm": 0.41, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, 
"acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5606936416184971, "acc_stderr": 0.03784271932887467, "acc_norm": 0.5606936416184971, "acc_norm_stderr": 0.03784271932887467 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3627450980392157, "acc_stderr": 0.047840607041056527, "acc_norm": 0.3627450980392157, "acc_norm_stderr": 0.047840607041056527 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4978723404255319, "acc_stderr": 0.03268572658667492, "acc_norm": 0.4978723404255319, "acc_norm_stderr": 0.03268572658667492 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.30701754385964913, "acc_stderr": 0.04339138322579861, "acc_norm": 0.30701754385964913, "acc_norm_stderr": 0.04339138322579861 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5310344827586206, "acc_stderr": 0.04158632762097828, "acc_norm": 0.5310344827586206, "acc_norm_stderr": 0.04158632762097828 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.35978835978835977, "acc_stderr": 0.024718075944129274, "acc_norm": 0.35978835978835977, "acc_norm_stderr": 0.024718075944129274 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3968253968253968, "acc_stderr": 0.04375888492727062, "acc_norm": 0.3968253968253968, "acc_norm_stderr": 0.04375888492727062 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7, "acc_stderr": 0.026069362295335137, "acc_norm": 0.7, "acc_norm_stderr": 0.026069362295335137 }, "harness|hendrycksTest-high_school_chemistry|5": 
{ "acc": 0.45320197044334976, "acc_stderr": 0.03502544650845872, "acc_norm": 0.45320197044334976, "acc_norm_stderr": 0.03502544650845872 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.56, "acc_stderr": 0.049888765156985884, "acc_norm": 0.56, "acc_norm_stderr": 0.049888765156985884 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6909090909090909, "acc_stderr": 0.036085410115739666, "acc_norm": 0.6909090909090909, "acc_norm_stderr": 0.036085410115739666 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7525252525252525, "acc_stderr": 0.030746300742124498, "acc_norm": 0.7525252525252525, "acc_norm_stderr": 0.030746300742124498 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8497409326424871, "acc_stderr": 0.025787723180723875, "acc_norm": 0.8497409326424871, "acc_norm_stderr": 0.025787723180723875 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6333333333333333, "acc_stderr": 0.02443301646605246, "acc_norm": 0.6333333333333333, "acc_norm_stderr": 0.02443301646605246 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3296296296296296, "acc_stderr": 0.028661201116524593, "acc_norm": 0.3296296296296296, "acc_norm_stderr": 0.028661201116524593 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5966386554621849, "acc_stderr": 0.031866081214088314, "acc_norm": 0.5966386554621849, "acc_norm_stderr": 0.031866081214088314 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.31788079470198677, "acc_stderr": 0.03802039760107903, "acc_norm": 0.31788079470198677, "acc_norm_stderr": 0.03802039760107903 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7871559633027523, "acc_stderr": 0.017549376389313694, "acc_norm": 0.7871559633027523, "acc_norm_stderr": 0.017549376389313694 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.41203703703703703, "acc_stderr": 0.03356787758160835, "acc_norm": 0.41203703703703703, 
"acc_norm_stderr": 0.03356787758160835 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8284313725490197, "acc_stderr": 0.026460569561240644, "acc_norm": 0.8284313725490197, "acc_norm_stderr": 0.026460569561240644 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7679324894514767, "acc_stderr": 0.027479744550808514, "acc_norm": 0.7679324894514767, "acc_norm_stderr": 0.027479744550808514 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7040358744394619, "acc_stderr": 0.030636591348699796, "acc_norm": 0.7040358744394619, "acc_norm_stderr": 0.030636591348699796 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6335877862595419, "acc_stderr": 0.04225875451969637, "acc_norm": 0.6335877862595419, "acc_norm_stderr": 0.04225875451969637 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6942148760330579, "acc_stderr": 0.042059539338841226, "acc_norm": 0.6942148760330579, "acc_norm_stderr": 0.042059539338841226 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7962962962962963, "acc_stderr": 0.03893542518824847, "acc_norm": 0.7962962962962963, "acc_norm_stderr": 0.03893542518824847 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6687116564417178, "acc_stderr": 0.03697983910025588, "acc_norm": 0.6687116564417178, "acc_norm_stderr": 0.03697983910025588 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.39285714285714285, "acc_stderr": 0.04635550135609976, "acc_norm": 0.39285714285714285, "acc_norm_stderr": 0.04635550135609976 }, "harness|hendrycksTest-management|5": { "acc": 0.7184466019417476, "acc_stderr": 0.044532548363264673, "acc_norm": 0.7184466019417476, "acc_norm_stderr": 0.044532548363264673 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8205128205128205, "acc_stderr": 0.025140935950335445, "acc_norm": 0.8205128205128205, "acc_norm_stderr": 0.025140935950335445 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.6, "acc_stderr": 0.049236596391733084, "acc_norm": 0.6, 
"acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7803320561941252, "acc_stderr": 0.014805384478371163, "acc_norm": 0.7803320561941252, "acc_norm_stderr": 0.014805384478371163 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6502890173410405, "acc_stderr": 0.025674281456531018, "acc_norm": 0.6502890173410405, "acc_norm_stderr": 0.025674281456531018 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4346368715083799, "acc_stderr": 0.016578997435496706, "acc_norm": 0.4346368715083799, "acc_norm_stderr": 0.016578997435496706 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6666666666666666, "acc_stderr": 0.026992544339297236, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.026992544339297236 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6495176848874598, "acc_stderr": 0.027098652621301757, "acc_norm": 0.6495176848874598, "acc_norm_stderr": 0.027098652621301757 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6697530864197531, "acc_stderr": 0.026168298456732852, "acc_norm": 0.6697530864197531, "acc_norm_stderr": 0.026168298456732852 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.46808510638297873, "acc_stderr": 0.029766675075873866, "acc_norm": 0.46808510638297873, "acc_norm_stderr": 0.029766675075873866 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.44654498044328556, "acc_stderr": 0.012697046024399675, "acc_norm": 0.44654498044328556, "acc_norm_stderr": 0.012697046024399675 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5661764705882353, "acc_stderr": 0.03010563657001663, "acc_norm": 0.5661764705882353, "acc_norm_stderr": 0.03010563657001663 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5849673202614379, "acc_stderr": 0.019933627776857418, "acc_norm": 0.5849673202614379, "acc_norm_stderr": 0.019933627776857418 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6909090909090909, "acc_stderr": 0.044262946482000985, "acc_norm": 
0.6909090909090909, "acc_norm_stderr": 0.044262946482000985 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6693877551020408, "acc_stderr": 0.030116426296540606, "acc_norm": 0.6693877551020408, "acc_norm_stderr": 0.030116426296540606 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7910447761194029, "acc_stderr": 0.028748298931728655, "acc_norm": 0.7910447761194029, "acc_norm_stderr": 0.028748298931728655 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.82, "acc_stderr": 0.038612291966536934, "acc_norm": 0.82, "acc_norm_stderr": 0.038612291966536934 }, "harness|hendrycksTest-virology|5": { "acc": 0.5120481927710844, "acc_stderr": 0.03891364495835817, "acc_norm": 0.5120481927710844, "acc_norm_stderr": 0.03891364495835817 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7894736842105263, "acc_stderr": 0.03126781714663179, "acc_norm": 0.7894736842105263, "acc_norm_stderr": 0.03126781714663179 }, "harness|truthfulqa:mc|0": { "mc1": 0.38922888616891066, "mc1_stderr": 0.01706855268069033, "mc2": 0.5689827090259466, "mc2_stderr": 0.015643581657231707 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
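As a quick illustration (not part of the original card template), the per-task detail configs in this repo follow a predictable naming scheme: `harness_` + the task name with `:` and `-` replaced by `_` + `_` + the few-shot count (e.g. `harness_truthfulqa_mc_0`, `harness_hendrycksTest_abstract_algebra_5`). The sketch below assembles those names; the actual `load_dataset` call is commented out since it requires network access and the `datasets` library.

```python
# Assumes the Hugging Face `datasets` library (pip install datasets); the
# import and the network call are commented out so the snippet runs offline.
# from datasets import load_dataset

REPO = "open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_TEST3"

def config_name(task: str, num_fewshot: int) -> str:
    """Build a details-config name as used by this repo,
    e.g. ('truthfulqa:mc', 0) -> 'harness_truthfulqa_mc_0'."""
    return "harness_" + task.replace(":", "_").replace("-", "_") + f"_{num_fewshot}"

cfg = config_name("truthfulqa:mc", 0)
# data = load_dataset(REPO, cfg, split="latest")  # "latest" tracks the newest run
print(cfg)  # harness_truthfulqa_mc_0
```

Passing `split="latest"` instead of a timestamped split name always returns the most recent evaluation run.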
atom-in-the-universe/bild-079927c2-2fb2-4a63-a719-489694e4146f
2023-10-04T07:17:55.000Z
[ "region:us" ]
atom-in-the-universe
null
null
null
0
0
Entry not found
open-llm-leaderboard/details_Abe13__juniper-certificate-Llama-2-7b-chat-hf
2023-10-04T07:12:57.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of Abe13/juniper-certificate-Llama-2-7b-chat-hf dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Abe13/juniper-certificate-Llama-2-7b-chat-hf](https://huggingface.co/Abe13/juniper-certificate-Llama-2-7b-chat-hf)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Abe13__juniper-certificate-Llama-2-7b-chat-hf\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T07:11:33.936694](https://huggingface.co/datasets/open-llm-leaderboard/details_Abe13__juniper-certificate-Llama-2-7b-chat-hf/blob/main/results_2023-10-04T07-11-33.936694.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24041760250241592,\n\ \ \"acc_stderr\": 0.03115220518653763,\n \"acc_norm\": 0.24171516655879707,\n\ \ \"acc_norm_stderr\": 0.031169836498146215,\n \"mc1\": 0.2460220318237454,\n\ \ \"mc1_stderr\": 0.015077219200662578,\n \"mc2\": 0.48234181171735035,\n\ \ \"mc2_stderr\": 0.017021382125909527\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.23122866894197952,\n \"acc_stderr\": 0.012320858834772259,\n\ \ \"acc_norm\": 0.2909556313993174,\n \"acc_norm_stderr\": 0.013273077865907581\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25951005775741887,\n\ \ \"acc_stderr\": 0.004374699189284861,\n \"acc_norm\": 0.27633937462656843,\n\ \ \"acc_norm_stderr\": 0.004462727543055892\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \ \ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\ \ \"acc_stderr\": 0.03749850709174022,\n \"acc_norm\": 0.2518518518518518,\n\ \ \"acc_norm_stderr\": 0.03749850709174022\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.23026315789473684,\n \"acc_stderr\": 0.03426059424403165,\n\ \ \"acc_norm\": 0.23026315789473684,\n \"acc_norm_stderr\": 0.03426059424403165\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n\ \ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \ \ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.19622641509433963,\n \"acc_stderr\": 0.024442388131100837,\n\ \ \"acc_norm\": 0.19622641509433963,\n \"acc_norm_stderr\": 0.024442388131100837\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\ \ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\ \ \"acc_norm_stderr\": 
0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \ \ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\ \ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.22,\n\ \ \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n \ \ \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \ \ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\ \ \"acc_stderr\": 0.0309528902177499,\n \"acc_norm\": 0.20809248554913296,\n\ \ \"acc_norm_stderr\": 0.0309528902177499\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\ \ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n\ \ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.02767845257821238,\n\ \ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.02767845257821238\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\ \ \"acc_stderr\": 0.043391383225798594,\n \"acc_norm\": 0.30701754385964913,\n\ \ \"acc_norm_stderr\": 0.043391383225798594\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.0345593020192481,\n\ \ \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.0345593020192481\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.24867724867724866,\n \"acc_stderr\": 0.02226181769240017,\n \"\ acc_norm\": 
0.24867724867724866,\n \"acc_norm_stderr\": 0.02226181769240017\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\ \ \"acc_stderr\": 0.04073524322147125,\n \"acc_norm\": 0.29365079365079366,\n\ \ \"acc_norm_stderr\": 0.04073524322147125\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \ \ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.20967741935483872,\n\ \ \"acc_stderr\": 0.02315787934908352,\n \"acc_norm\": 0.20967741935483872,\n\ \ \"acc_norm_stderr\": 0.02315787934908352\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.030108330718011625,\n\ \ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.030108330718011625\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\"\ : 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.28484848484848485,\n \"acc_stderr\": 0.03524390844511784,\n\ \ \"acc_norm\": 0.28484848484848485,\n \"acc_norm_stderr\": 0.03524390844511784\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.18181818181818182,\n \"acc_stderr\": 0.02747960301053878,\n \"\ acc_norm\": 0.18181818181818182,\n \"acc_norm_stderr\": 0.02747960301053878\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.19170984455958548,\n \"acc_stderr\": 0.028408953626245292,\n\ \ \"acc_norm\": 0.19170984455958548,\n \"acc_norm_stderr\": 0.028408953626245292\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.22564102564102564,\n \"acc_stderr\": 0.02119363252514853,\n\ \ \"acc_norm\": 0.22564102564102564,\n \"acc_norm_stderr\": 0.02119363252514853\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073838,\n \ \ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073838\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.226890756302521,\n \"acc_stderr\": 0.027205371538279476,\n \ \ \"acc_norm\": 0.226890756302521,\n \"acc_norm_stderr\": 0.027205371538279476\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360384,\n \"\ acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360384\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.23486238532110093,\n \"acc_stderr\": 0.018175110510343585,\n \"\ acc_norm\": 0.23486238532110093,\n \"acc_norm_stderr\": 0.018175110510343585\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.2175925925925926,\n \"acc_stderr\": 0.028139689444859672,\n \"\ acc_norm\": 0.2175925925925926,\n \"acc_norm_stderr\": 0.028139689444859672\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604243,\n \"\ acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604243\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.22362869198312235,\n \"acc_stderr\": 0.027123298205229972,\n \ \ \"acc_norm\": 0.22362869198312235,\n \"acc_norm_stderr\": 0.027123298205229972\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.17937219730941703,\n\ \ \"acc_stderr\": 0.0257498195691928,\n \"acc_norm\": 0.17937219730941703,\n\ \ \"acc_norm_stderr\": 0.0257498195691928\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.17557251908396945,\n \"acc_stderr\": 0.03336820338476075,\n\ \ \"acc_norm\": 0.17557251908396945,\n \"acc_norm_stderr\": 0.03336820338476075\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\ acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n\ \ \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n\ \ \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.03351953879521269,\n\ \ \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.03351953879521269\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\ \ \"acc_stderr\": 0.042466243366976256,\n \"acc_norm\": 0.2767857142857143,\n\ \ \"acc_norm_stderr\": 0.042466243366976256\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.03916667762822584,\n\ \ \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.03916667762822584\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n\ \ \"acc_stderr\": 0.028605953702004253,\n \"acc_norm\": 0.2564102564102564,\n\ \ \"acc_norm_stderr\": 0.028605953702004253\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \ \ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.20178799489144317,\n\ \ \"acc_stderr\": 0.014351702181636857,\n \"acc_norm\": 0.20178799489144317,\n\ \ \"acc_norm_stderr\": 0.014351702181636857\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.02317629820399201,\n\ \ \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.02317629820399201\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.264804469273743,\n\ \ \"acc_stderr\": 0.014756906483260664,\n 
\"acc_norm\": 0.264804469273743,\n\ \ \"acc_norm_stderr\": 0.014756906483260664\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.21241830065359477,\n \"acc_stderr\": 0.023420375478296125,\n\ \ \"acc_norm\": 0.21241830065359477,\n \"acc_norm_stderr\": 0.023420375478296125\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.21864951768488747,\n\ \ \"acc_stderr\": 0.02347558141786111,\n \"acc_norm\": 0.21864951768488747,\n\ \ \"acc_norm_stderr\": 0.02347558141786111\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.2623456790123457,\n \"acc_stderr\": 0.024477222856135104,\n\ \ \"acc_norm\": 0.2623456790123457,\n \"acc_norm_stderr\": 0.024477222856135104\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.24822695035460993,\n \"acc_stderr\": 0.0257700156442904,\n \ \ \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.0257700156442904\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2770534550195567,\n\ \ \"acc_stderr\": 0.011430462443719674,\n \"acc_norm\": 0.2770534550195567,\n\ \ \"acc_norm_stderr\": 0.011430462443719674\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.25,\n \"acc_stderr\": 0.026303648393696036,\n \ \ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.026303648393696036\n \ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\"\ : 0.24509803921568626,\n \"acc_stderr\": 0.017401816711427653,\n \"\ acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.017401816711427653\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2818181818181818,\n\ \ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.2818181818181818,\n\ \ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.22857142857142856,\n \"acc_stderr\": 0.026882144922307744,\n\ \ \"acc_norm\": 0.22857142857142856,\n \"acc_norm_stderr\": 0.026882144922307744\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.263681592039801,\n\ \ \"acc_stderr\": 0.03115715086935558,\n \"acc_norm\": 0.263681592039801,\n\ \ \"acc_norm_stderr\": 0.03115715086935558\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \ \ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.30120481927710846,\n\ \ \"acc_stderr\": 0.03571609230053481,\n \"acc_norm\": 0.30120481927710846,\n\ \ \"acc_norm_stderr\": 0.03571609230053481\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03188578017686398,\n\ \ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03188578017686398\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2460220318237454,\n\ \ \"mc1_stderr\": 0.015077219200662578,\n \"mc2\": 0.48234181171735035,\n\ \ \"mc2_stderr\": 0.017021382125909527\n }\n}\n```" repo_url: https://huggingface.co/Abe13/juniper-certificate-Llama-2-7b-chat-hf leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|arc:challenge|25_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hellaswag|10_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-11-33.936694.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-11-33.936694.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-11-33.936694.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-11-33.936694.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-11-33.936694.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-11-33.936694.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-11-33.936694.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-11-33.936694.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-11-33.936694.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T07_11_33.936694 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T07-11-33.936694.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T07-11-33.936694.parquet' - config_name: results data_files: - split: 2023_10_04T07_11_33.936694 path: - results_2023-10-04T07-11-33.936694.parquet - split: latest path: - results_2023-10-04T07-11-33.936694.parquet --- # Dataset Card for Evaluation run of Abe13/juniper-certificate-Llama-2-7b-chat-hf ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Abe13/juniper-certificate-Llama-2-7b-chat-hf - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[Abe13/juniper-certificate-Llama-2-7b-chat-hf](https://huggingface.co/Abe13/juniper-certificate-Llama-2-7b-chat-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Abe13__juniper-certificate-Llama-2-7b-chat-hf",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-04T07:11:33.936694](https://huggingface.co/datasets/open-llm-leaderboard/details_Abe13__juniper-certificate-Llama-2-7b-chat-hf/blob/main/results_2023-10-04T07-11-33.936694.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.24041760250241592, "acc_stderr": 0.03115220518653763, "acc_norm": 0.24171516655879707, "acc_norm_stderr": 0.031169836498146215, "mc1": 0.2460220318237454, "mc1_stderr": 0.015077219200662578, "mc2": 0.48234181171735035, "mc2_stderr": 0.017021382125909527 }, "harness|arc:challenge|25": { "acc": 0.23122866894197952, "acc_stderr": 0.012320858834772259, "acc_norm": 0.2909556313993174, "acc_norm_stderr": 0.013273077865907581 }, "harness|hellaswag|10": { "acc": 0.25951005775741887, "acc_stderr": 0.004374699189284861, "acc_norm": 0.27633937462656843, "acc_norm_stderr": 0.004462727543055892 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.19, "acc_stderr": 0.03942772444036623, "acc_norm": 0.19, "acc_norm_stderr": 0.03942772444036623 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.2518518518518518, "acc_stderr": 0.03749850709174022, "acc_norm": 0.2518518518518518, "acc_norm_stderr": 0.03749850709174022 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.23026315789473684, "acc_stderr": 0.03426059424403165, "acc_norm": 0.23026315789473684, "acc_norm_stderr": 0.03426059424403165 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.26, "acc_stderr": 0.0440844002276808, "acc_norm": 0.26, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.19622641509433963, "acc_stderr": 0.024442388131100837, "acc_norm": 0.19622641509433963, "acc_norm_stderr": 0.024442388131100837 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2222222222222222, "acc_stderr": 0.03476590104304134, "acc_norm": 0.2222222222222222, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.2, "acc_stderr": 0.04020151261036846, "acc_norm": 0.2, "acc_norm_stderr": 0.04020151261036846 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.22, "acc_stderr": 0.0416333199893227, "acc_norm": 0.22, 
"acc_norm_stderr": 0.0416333199893227 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.23, "acc_stderr": 0.04229525846816507, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816507 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.20809248554913296, "acc_stderr": 0.0309528902177499, "acc_norm": 0.20809248554913296, "acc_norm_stderr": 0.0309528902177499 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.04690650298201942, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.04690650298201942 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.26, "acc_stderr": 0.0440844002276808, "acc_norm": 0.26, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.23404255319148937, "acc_stderr": 0.02767845257821238, "acc_norm": 0.23404255319148937, "acc_norm_stderr": 0.02767845257821238 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.30701754385964913, "acc_stderr": 0.043391383225798594, "acc_norm": 0.30701754385964913, "acc_norm_stderr": 0.043391383225798594 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2206896551724138, "acc_stderr": 0.0345593020192481, "acc_norm": 0.2206896551724138, "acc_norm_stderr": 0.0345593020192481 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.24867724867724866, "acc_stderr": 0.02226181769240017, "acc_norm": 0.24867724867724866, "acc_norm_stderr": 0.02226181769240017 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.29365079365079366, "acc_stderr": 0.04073524322147125, "acc_norm": 0.29365079365079366, "acc_norm_stderr": 0.04073524322147125 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.22, "acc_stderr": 0.04163331998932269, "acc_norm": 0.22, "acc_norm_stderr": 0.04163331998932269 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.20967741935483872, "acc_stderr": 0.02315787934908352, "acc_norm": 0.20967741935483872, "acc_norm_stderr": 0.02315787934908352 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.2413793103448276, "acc_stderr": 0.030108330718011625, "acc_norm": 0.2413793103448276, "acc_norm_stderr": 0.030108330718011625 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.27, "acc_stderr": 0.04461960433384741, "acc_norm": 0.27, "acc_norm_stderr": 0.04461960433384741 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.28484848484848485, "acc_stderr": 0.03524390844511784, "acc_norm": 0.28484848484848485, "acc_norm_stderr": 0.03524390844511784 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.18181818181818182, "acc_stderr": 0.02747960301053878, "acc_norm": 0.18181818181818182, "acc_norm_stderr": 0.02747960301053878 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.19170984455958548, "acc_stderr": 0.028408953626245292, "acc_norm": 0.19170984455958548, "acc_norm_stderr": 0.028408953626245292 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.22564102564102564, "acc_stderr": 0.02119363252514853, "acc_norm": 0.22564102564102564, "acc_norm_stderr": 0.02119363252514853 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.26666666666666666, "acc_stderr": 0.026962424325073838, "acc_norm": 0.26666666666666666, "acc_norm_stderr": 0.026962424325073838 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.226890756302521, "acc_stderr": 0.027205371538279476, "acc_norm": 0.226890756302521, "acc_norm_stderr": 0.027205371538279476 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.26490066225165565, "acc_stderr": 0.03603038545360384, "acc_norm": 0.26490066225165565, "acc_norm_stderr": 0.03603038545360384 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.23486238532110093, "acc_stderr": 0.018175110510343585, "acc_norm": 0.23486238532110093, "acc_norm_stderr": 0.018175110510343585 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.2175925925925926, "acc_stderr": 
0.028139689444859672, "acc_norm": 0.2175925925925926, "acc_norm_stderr": 0.028139689444859672 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.2549019607843137, "acc_stderr": 0.030587591351604243, "acc_norm": 0.2549019607843137, "acc_norm_stderr": 0.030587591351604243 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.22362869198312235, "acc_stderr": 0.027123298205229972, "acc_norm": 0.22362869198312235, "acc_norm_stderr": 0.027123298205229972 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.17937219730941703, "acc_stderr": 0.0257498195691928, "acc_norm": 0.17937219730941703, "acc_norm_stderr": 0.0257498195691928 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.17557251908396945, "acc_stderr": 0.03336820338476075, "acc_norm": 0.17557251908396945, "acc_norm_stderr": 0.03336820338476075 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2396694214876033, "acc_stderr": 0.03896878985070417, "acc_norm": 0.2396694214876033, "acc_norm_stderr": 0.03896878985070417 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.26851851851851855, "acc_stderr": 0.04284467968052192, "acc_norm": 0.26851851851851855, "acc_norm_stderr": 0.04284467968052192 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.2392638036809816, "acc_stderr": 0.03351953879521269, "acc_norm": 0.2392638036809816, "acc_norm_stderr": 0.03351953879521269 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.2767857142857143, "acc_stderr": 0.042466243366976256, "acc_norm": 0.2767857142857143, "acc_norm_stderr": 0.042466243366976256 }, "harness|hendrycksTest-management|5": { "acc": 0.1941747572815534, "acc_stderr": 0.03916667762822584, "acc_norm": 0.1941747572815534, "acc_norm_stderr": 0.03916667762822584 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2564102564102564, "acc_stderr": 0.028605953702004253, "acc_norm": 0.2564102564102564, "acc_norm_stderr": 0.028605953702004253 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.24, "acc_stderr": 
0.042923469599092816, "acc_norm": 0.24, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.20178799489144317, "acc_stderr": 0.014351702181636857, "acc_norm": 0.20178799489144317, "acc_norm_stderr": 0.014351702181636857 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.24566473988439305, "acc_stderr": 0.02317629820399201, "acc_norm": 0.24566473988439305, "acc_norm_stderr": 0.02317629820399201 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.264804469273743, "acc_stderr": 0.014756906483260664, "acc_norm": 0.264804469273743, "acc_norm_stderr": 0.014756906483260664 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.21241830065359477, "acc_stderr": 0.023420375478296125, "acc_norm": 0.21241830065359477, "acc_norm_stderr": 0.023420375478296125 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.21864951768488747, "acc_stderr": 0.02347558141786111, "acc_norm": 0.21864951768488747, "acc_norm_stderr": 0.02347558141786111 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.2623456790123457, "acc_stderr": 0.024477222856135104, "acc_norm": 0.2623456790123457, "acc_norm_stderr": 0.024477222856135104 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.24822695035460993, "acc_stderr": 0.0257700156442904, "acc_norm": 0.24822695035460993, "acc_norm_stderr": 0.0257700156442904 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2770534550195567, "acc_stderr": 0.011430462443719674, "acc_norm": 0.2770534550195567, "acc_norm_stderr": 0.011430462443719674 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.25, "acc_stderr": 0.026303648393696036, "acc_norm": 0.25, "acc_norm_stderr": 0.026303648393696036 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.24509803921568626, "acc_stderr": 0.017401816711427653, "acc_norm": 0.24509803921568626, "acc_norm_stderr": 0.017401816711427653 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.2818181818181818, "acc_stderr": 0.043091187099464585, 
"acc_norm": 0.2818181818181818, "acc_norm_stderr": 0.043091187099464585 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.22857142857142856, "acc_stderr": 0.026882144922307744, "acc_norm": 0.22857142857142856, "acc_norm_stderr": 0.026882144922307744 }, "harness|hendrycksTest-sociology|5": { "acc": 0.263681592039801, "acc_stderr": 0.03115715086935558, "acc_norm": 0.263681592039801, "acc_norm_stderr": 0.03115715086935558 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.28, "acc_stderr": 0.04512608598542129, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542129 }, "harness|hendrycksTest-virology|5": { "acc": 0.30120481927710846, "acc_stderr": 0.03571609230053481, "acc_norm": 0.30120481927710846, "acc_norm_stderr": 0.03571609230053481 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.2222222222222222, "acc_stderr": 0.03188578017686398, "acc_norm": 0.2222222222222222, "acc_norm_stderr": 0.03188578017686398 }, "harness|truthfulqa:mc|0": { "mc1": 0.2460220318237454, "mc1_stderr": 0.015077219200662578, "mc2": 0.48234181171735035, "mc2_stderr": 0.017021382125909527 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
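As a complement to the `harness_truthfulqa_mc_0` example earlier in this card, the aggregated "results" configuration and its "latest" split can be loaded with the same `load_dataset` call. The helper below is a minimal sketch, assuming the `datasets` library is installed; the import is deferred into the function so the snippet can be read and checked without it, and actually fetching the data requires network access to the Hugging Face Hub.

```python
def load_latest_aggregated_results(repo_id: str):
    """Load the aggregated "results" configuration at its "latest" split.

    Deferred import so this sketch is inspectable without the `datasets`
    library installed; the actual download needs Hub network access.
    """
    from datasets import load_dataset
    return load_dataset(repo_id, "results", split="latest")


# Details repository evaluated in this card.
REPO_ID = "open-llm-leaderboard/details_Abe13__juniper-certificate-Llama-2-7b-chat-hf"

# Example usage (requires network):
# results = load_latest_aggregated_results(REPO_ID)
```

Because every run is also kept as a timestamped split, the same call with `split="2023_10_04T07_11_33.936694"` would pin a specific run instead of the latest one.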
kentdan/3dgs
2023-10-04T07:12:45.000Z
[ "region:us" ]
kentdan
null
null
null
0
0
Entry not found
open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mixed-datasets
2023-10-04T07:13:50.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of Charlie911/vicuna-7b-v1.5-lora-mixed-datasets dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Charlie911/vicuna-7b-v1.5-lora-mixed-datasets](https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-mixed-datasets)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mixed-datasets\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T07:12:27.239591](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mixed-datasets/blob/main/results_2023-10-04T07-12-27.239591.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.502111066090814,\n\ \ \"acc_stderr\": 0.03514784369504416,\n \"acc_norm\": 0.5060237212713287,\n\ \ \"acc_norm_stderr\": 0.035135797430285,\n \"mc1\": 0.2582619339045288,\n\ \ \"mc1_stderr\": 0.015321821688476194,\n \"mc2\": 0.39565452424851383,\n\ \ \"mc2_stderr\": 0.013764585932935629\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.48378839590443684,\n \"acc_stderr\": 0.014603708567414941,\n\ \ \"acc_norm\": 0.5170648464163823,\n \"acc_norm_stderr\": 0.014602878388536593\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5668193586934873,\n\ \ \"acc_stderr\": 0.004945023657032275,\n \"acc_norm\": 0.764389563831906,\n\ \ \"acc_norm_stderr\": 0.0042351242151201065\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\ \ \"acc_stderr\": 0.04304979692464243,\n \"acc_norm\": 0.45925925925925926,\n\ \ \"acc_norm_stderr\": 0.04304979692464243\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.4868421052631579,\n \"acc_stderr\": 0.04067533136309173,\n\ \ \"acc_norm\": 0.4868421052631579,\n \"acc_norm_stderr\": 0.04067533136309173\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\ \ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \ \ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.5094339622641509,\n \"acc_stderr\": 0.030767394707808093,\n\ \ \"acc_norm\": 0.5094339622641509,\n \"acc_norm_stderr\": 0.030767394707808093\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4930555555555556,\n\ \ \"acc_stderr\": 0.04180806750294938,\n \"acc_norm\": 0.4930555555555556,\n\ \ \"acc_norm_stderr\": 0.04180806750294938\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\ : 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \ \ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4624277456647399,\n\ \ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.4624277456647399,\n\ \ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n\ \ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n\ \ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.4595744680851064,\n \"acc_stderr\": 0.03257901482099835,\n\ \ \"acc_norm\": 0.4595744680851064,\n \"acc_norm_stderr\": 0.03257901482099835\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\ \ \"acc_stderr\": 0.042663394431593935,\n \"acc_norm\": 0.2894736842105263,\n\ \ \"acc_norm_stderr\": 0.042663394431593935\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\ \ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.30423280423280424,\n \"acc_stderr\": 0.023695415009463087,\n \"\ acc_norm\": 0.30423280423280424,\n 
\"acc_norm_stderr\": 0.023695415009463087\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\ \ \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n\ \ \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.5548387096774193,\n \"acc_stderr\": 0.028272410186214906,\n \"\ acc_norm\": 0.5548387096774193,\n \"acc_norm_stderr\": 0.028272410186214906\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.3645320197044335,\n \"acc_stderr\": 0.033864057460620905,\n \"\ acc_norm\": 0.3645320197044335,\n \"acc_norm_stderr\": 0.033864057460620905\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\ : 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.6303030303030303,\n \"acc_stderr\": 0.03769430314512567,\n\ \ \"acc_norm\": 0.6303030303030303,\n \"acc_norm_stderr\": 0.03769430314512567\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.6161616161616161,\n \"acc_stderr\": 0.0346488167501634,\n \"acc_norm\"\ : 0.6161616161616161,\n \"acc_norm_stderr\": 0.0346488167501634\n },\n\ \ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \ \ \"acc\": 0.7202072538860104,\n \"acc_stderr\": 0.03239637046735704,\n\ \ \"acc_norm\": 0.7202072538860104,\n \"acc_norm_stderr\": 0.03239637046735704\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.44358974358974357,\n \"acc_stderr\": 0.025189149894764198,\n\ \ \"acc_norm\": 0.44358974358974357,\n \"acc_norm_stderr\": 0.025189149894764198\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.25555555555555554,\n \"acc_stderr\": 0.026593939101844082,\n \ \ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.026593939101844082\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.47478991596638653,\n \"acc_stderr\": 0.032437180551374095,\n\ \ \"acc_norm\": 0.47478991596638653,\n \"acc_norm_stderr\": 0.032437180551374095\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\ acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.6862385321100918,\n \"acc_stderr\": 0.019894723341469116,\n \"\ acc_norm\": 0.6862385321100918,\n \"acc_norm_stderr\": 0.019894723341469116\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.39814814814814814,\n \"acc_stderr\": 0.033384734032074016,\n \"\ acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.033384734032074016\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.6666666666666666,\n \"acc_stderr\": 0.03308611113236435,\n \"\ acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03308611113236435\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.6835443037974683,\n \"acc_stderr\": 0.030274974880218977,\n \ \ \"acc_norm\": 0.6835443037974683,\n \"acc_norm_stderr\": 0.030274974880218977\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5919282511210763,\n\ \ \"acc_stderr\": 0.03298574607842822,\n \"acc_norm\": 0.5919282511210763,\n\ \ \"acc_norm_stderr\": 0.03298574607842822\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.04328577215262972,\n\ \ \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.04328577215262972\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.5785123966942148,\n \"acc_stderr\": 0.04507732278775087,\n \"\ acc_norm\": 0.5785123966942148,\n \"acc_norm_stderr\": 0.04507732278775087\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5555555555555556,\n\ \ \"acc_stderr\": 0.04803752235190192,\n \"acc_norm\": 0.5555555555555556,\n\ \ \"acc_norm_stderr\": 0.04803752235190192\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.5030674846625767,\n \"acc_stderr\": 0.03928297078179663,\n\ \ \"acc_norm\": 0.5030674846625767,\n \"acc_norm_stderr\": 0.03928297078179663\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\ \ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\ \ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.046897659372781356,\n\ \ \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.046897659372781356\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7692307692307693,\n\ \ \"acc_stderr\": 0.0276019213814176,\n \"acc_norm\": 0.7692307692307693,\n\ \ \"acc_norm_stderr\": 0.0276019213814176\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \ \ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6883780332056194,\n\ \ \"acc_stderr\": 0.016562433867284176,\n \"acc_norm\": 0.6883780332056194,\n\ \ \"acc_norm_stderr\": 0.016562433867284176\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.5057803468208093,\n \"acc_stderr\": 0.026917296179149123,\n\ \ \"acc_norm\": 0.5057803468208093,\n \"acc_norm_stderr\": 0.026917296179149123\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26256983240223464,\n\ \ \"acc_stderr\": 0.014716824273017761,\n \"acc_norm\": 
0.26256983240223464,\n\ \ \"acc_norm_stderr\": 0.014716824273017761\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.5718954248366013,\n \"acc_stderr\": 0.028332397483664274,\n\ \ \"acc_norm\": 0.5718954248366013,\n \"acc_norm_stderr\": 0.028332397483664274\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5884244372990354,\n\ \ \"acc_stderr\": 0.02795048149440127,\n \"acc_norm\": 0.5884244372990354,\n\ \ \"acc_norm_stderr\": 0.02795048149440127\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.5308641975308642,\n \"acc_stderr\": 0.02776768960683393,\n\ \ \"acc_norm\": 0.5308641975308642,\n \"acc_norm_stderr\": 0.02776768960683393\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.37943262411347517,\n \"acc_stderr\": 0.028947338851614105,\n \ \ \"acc_norm\": 0.37943262411347517,\n \"acc_norm_stderr\": 0.028947338851614105\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3663624511082138,\n\ \ \"acc_stderr\": 0.01230565834683844,\n \"acc_norm\": 0.3663624511082138,\n\ \ \"acc_norm_stderr\": 0.01230565834683844\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5514705882352942,\n \"acc_stderr\": 0.030211479609121603,\n\ \ \"acc_norm\": 0.5514705882352942,\n \"acc_norm_stderr\": 0.030211479609121603\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.46078431372549017,\n \"acc_stderr\": 0.02016552331390791,\n \ \ \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.02016552331390791\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\ \ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\ \ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.031001209039894836,\n\ \ \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.031001209039894836\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03333333333333335,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03333333333333335\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.034462962170884265,\n\
\ \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.034462962170884265\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2582619339045288,\n\
\ \"mc1_stderr\": 0.015321821688476194,\n \"mc2\": 0.39565452424851383,\n\
\ \"mc2_stderr\": 0.013764585932935629\n }\n}\n```"
repo_url: https://huggingface.co/wei123602/Llama-2-13b-FINETUNE4_TEST3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
  data_files:
  - split: 2023_10_04T07_12_27.239591
    path:
    - '**/details_harness|arc:challenge|25_2023-10-04T07-12-27.239591.parquet'
  - split: latest
    path:
    - '**/details_harness|arc:challenge|25_2023-10-04T07-12-27.239591.parquet'
- config_name: harness_hellaswag_10
  data_files:
  - split: 2023_10_04T07_12_27.239591
    path:
    - '**/details_harness|hellaswag|10_2023-10-04T07-12-27.239591.parquet'
  - split: latest
    path:
    - '**/details_harness|hellaswag|10_2023-10-04T07-12-27.239591.parquet'
- config_name: harness_hendrycksTest_5
  data_files:
  - split: 2023_10_04T07_12_27.239591
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-12-27.239591.parquet'
    - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-12-27.239591.parquet'
    - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-12-27.239591.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-12-27.239591.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-12-27.239591.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-12-27.239591.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-12-27.239591.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-12-27.239591.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-12-27.239591.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-12-27.239591.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T07_12_27.239591 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T07-12-27.239591.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T07-12-27.239591.parquet' - config_name: results data_files: - split: 2023_10_04T07_12_27.239591 path: - results_2023-10-04T07-12-27.239591.parquet - split: latest path: - results_2023-10-04T07-12-27.239591.parquet --- # Dataset Card for Evaluation run of Charlie911/vicuna-7b-v1.5-lora-mixed-datasets ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-mixed-datasets - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[Charlie911/vicuna-7b-v1.5-lora-mixed-datasets](https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-mixed-datasets) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mixed-datasets",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-04T07:12:27.239591](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mixed-datasets/blob/main/results_2023-10-04T07-12-27.239591.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.502111066090814, "acc_stderr": 0.03514784369504416, "acc_norm": 0.5060237212713287, "acc_norm_stderr": 0.035135797430285, "mc1": 0.2582619339045288, "mc1_stderr": 0.015321821688476194, "mc2": 0.39565452424851383, "mc2_stderr": 0.013764585932935629 }, "harness|arc:challenge|25": { "acc": 0.48378839590443684, "acc_stderr": 0.014603708567414941, "acc_norm": 0.5170648464163823, "acc_norm_stderr": 0.014602878388536593 }, "harness|hellaswag|10": { "acc": 0.5668193586934873, "acc_stderr": 0.004945023657032275, "acc_norm": 0.764389563831906, "acc_norm_stderr": 0.0042351242151201065 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.45925925925925926, "acc_stderr": 0.04304979692464243, "acc_norm": 0.45925925925925926, "acc_norm_stderr": 0.04304979692464243 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.4868421052631579, "acc_stderr": 0.04067533136309173, "acc_norm": 0.4868421052631579, "acc_norm_stderr": 0.04067533136309173 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5094339622641509, "acc_stderr": 0.030767394707808093, "acc_norm": 0.5094339622641509, "acc_norm_stderr": 0.030767394707808093 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.4930555555555556, "acc_stderr": 0.04180806750294938, "acc_norm": 0.4930555555555556, "acc_norm_stderr": 0.04180806750294938 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, 
"acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.4624277456647399, "acc_stderr": 0.0380168510452446, "acc_norm": 0.4624277456647399, "acc_norm_stderr": 0.0380168510452446 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.21568627450980393, "acc_stderr": 0.04092563958237655, "acc_norm": 0.21568627450980393, "acc_norm_stderr": 0.04092563958237655 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4595744680851064, "acc_stderr": 0.03257901482099835, "acc_norm": 0.4595744680851064, "acc_norm_stderr": 0.03257901482099835 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2894736842105263, "acc_stderr": 0.042663394431593935, "acc_norm": 0.2894736842105263, "acc_norm_stderr": 0.042663394431593935 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.47586206896551725, "acc_stderr": 0.041618085035015295, "acc_norm": 0.47586206896551725, "acc_norm_stderr": 0.041618085035015295 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.30423280423280424, "acc_stderr": 0.023695415009463087, "acc_norm": 0.30423280423280424, "acc_norm_stderr": 0.023695415009463087 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3333333333333333, "acc_stderr": 0.04216370213557835, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.04216370213557835 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5548387096774193, "acc_stderr": 0.028272410186214906, "acc_norm": 0.5548387096774193, "acc_norm_stderr": 0.028272410186214906 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3645320197044335, "acc_stderr": 0.033864057460620905, "acc_norm": 0.3645320197044335, "acc_norm_stderr": 0.033864057460620905 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6303030303030303, "acc_stderr": 0.03769430314512567, "acc_norm": 0.6303030303030303, "acc_norm_stderr": 0.03769430314512567 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6161616161616161, "acc_stderr": 0.0346488167501634, "acc_norm": 0.6161616161616161, "acc_norm_stderr": 0.0346488167501634 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7202072538860104, "acc_stderr": 0.03239637046735704, "acc_norm": 0.7202072538860104, "acc_norm_stderr": 0.03239637046735704 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.44358974358974357, "acc_stderr": 0.025189149894764198, "acc_norm": 0.44358974358974357, "acc_norm_stderr": 0.025189149894764198 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.25555555555555554, "acc_stderr": 0.026593939101844082, "acc_norm": 0.25555555555555554, "acc_norm_stderr": 0.026593939101844082 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.47478991596638653, "acc_stderr": 0.032437180551374095, "acc_norm": 0.47478991596638653, "acc_norm_stderr": 0.032437180551374095 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2781456953642384, "acc_stderr": 0.03658603262763743, "acc_norm": 0.2781456953642384, "acc_norm_stderr": 0.03658603262763743 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.6862385321100918, "acc_stderr": 0.019894723341469116, "acc_norm": 0.6862385321100918, "acc_norm_stderr": 0.019894723341469116 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.39814814814814814, "acc_stderr": 
0.033384734032074016, "acc_norm": 0.39814814814814814, "acc_norm_stderr": 0.033384734032074016 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6666666666666666, "acc_stderr": 0.03308611113236435, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.03308611113236435 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.6835443037974683, "acc_stderr": 0.030274974880218977, "acc_norm": 0.6835443037974683, "acc_norm_stderr": 0.030274974880218977 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5919282511210763, "acc_stderr": 0.03298574607842822, "acc_norm": 0.5919282511210763, "acc_norm_stderr": 0.03298574607842822 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5801526717557252, "acc_stderr": 0.04328577215262972, "acc_norm": 0.5801526717557252, "acc_norm_stderr": 0.04328577215262972 }, "harness|hendrycksTest-international_law|5": { "acc": 0.5785123966942148, "acc_stderr": 0.04507732278775087, "acc_norm": 0.5785123966942148, "acc_norm_stderr": 0.04507732278775087 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.5555555555555556, "acc_stderr": 0.04803752235190192, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.04803752235190192 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.5030674846625767, "acc_stderr": 0.03928297078179663, "acc_norm": 0.5030674846625767, "acc_norm_stderr": 0.03928297078179663 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.38392857142857145, "acc_stderr": 0.04616143075028547, "acc_norm": 0.38392857142857145, "acc_norm_stderr": 0.04616143075028547 }, "harness|hendrycksTest-management|5": { "acc": 0.6601941747572816, "acc_stderr": 0.046897659372781356, "acc_norm": 0.6601941747572816, "acc_norm_stderr": 0.046897659372781356 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7692307692307693, "acc_stderr": 0.0276019213814176, "acc_norm": 0.7692307692307693, "acc_norm_stderr": 0.0276019213814176 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.55, "acc_stderr": 
0.04999999999999999, "acc_norm": 0.55, "acc_norm_stderr": 0.04999999999999999 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6883780332056194, "acc_stderr": 0.016562433867284176, "acc_norm": 0.6883780332056194, "acc_norm_stderr": 0.016562433867284176 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5057803468208093, "acc_stderr": 0.026917296179149123, "acc_norm": 0.5057803468208093, "acc_norm_stderr": 0.026917296179149123 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.26256983240223464, "acc_stderr": 0.014716824273017761, "acc_norm": 0.26256983240223464, "acc_norm_stderr": 0.014716824273017761 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5718954248366013, "acc_stderr": 0.028332397483664274, "acc_norm": 0.5718954248366013, "acc_norm_stderr": 0.028332397483664274 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5884244372990354, "acc_stderr": 0.02795048149440127, "acc_norm": 0.5884244372990354, "acc_norm_stderr": 0.02795048149440127 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5308641975308642, "acc_stderr": 0.02776768960683393, "acc_norm": 0.5308641975308642, "acc_norm_stderr": 0.02776768960683393 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.37943262411347517, "acc_stderr": 0.028947338851614105, "acc_norm": 0.37943262411347517, "acc_norm_stderr": 0.028947338851614105 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3663624511082138, "acc_stderr": 0.01230565834683844, "acc_norm": 0.3663624511082138, "acc_norm_stderr": 0.01230565834683844 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5514705882352942, "acc_stderr": 0.030211479609121603, "acc_norm": 0.5514705882352942, "acc_norm_stderr": 0.030211479609121603 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.46078431372549017, "acc_stderr": 0.02016552331390791, "acc_norm": 0.46078431372549017, "acc_norm_stderr": 0.02016552331390791 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6272727272727273, "acc_stderr": 
0.04631381319425465, "acc_norm": 0.6272727272727273, "acc_norm_stderr": 0.04631381319425465 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6244897959183674, "acc_stderr": 0.031001209039894836, "acc_norm": 0.6244897959183674, "acc_norm_stderr": 0.031001209039894836 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6666666666666666, "acc_stderr": 0.03333333333333335, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.03333333333333335 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-virology|5": { "acc": 0.4457831325301205, "acc_stderr": 0.03869543323472101, "acc_norm": 0.4457831325301205, "acc_norm_stderr": 0.03869543323472101 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7192982456140351, "acc_stderr": 0.034462962170884265, "acc_norm": 0.7192982456140351, "acc_norm_stderr": 0.034462962170884265 }, "harness|truthfulqa:mc|0": { "mc1": 0.2582619339045288, "mc1_stderr": 0.015321821688476194, "mc2": 0.39565452424851383, "mc2_stderr": 0.013764585932935629 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
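As a quick, self-contained sanity check on the numbers reported above, one can recompute an unweighted macro-average over a handful of the per-task accuracies. This is only an illustration: the accuracy values below are copied from the results JSON above, the subset of tasks is arbitrary, and the official "all" block in the JSON remains the authoritative aggregate.

```python
# Per-task accuracies copied from the results JSON above (illustrative subset).
task_acc = {
    "harness|hendrycksTest-abstract_algebra|5": 0.3,
    "harness|hendrycksTest-anatomy|5": 0.45925925925925926,
    "harness|hendrycksTest-astronomy|5": 0.4868421052631579,
}

# Unweighted macro-average over the selected tasks.
macro_avg = sum(task_acc.values()) / len(task_acc)
print(round(macro_avg, 4))  # → 0.4154
```

The same pattern extends to all 57 MMLU sub-task entries if you load the full "results" configuration with `load_dataset` as shown earlier in this card.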
open-llm-leaderboard/details_adonlee__LLaMA_2_13B_SFT_v0
2023-10-04T07:13:58.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of adonlee/LLaMA_2_13B_SFT_v0 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [adonlee/LLaMA_2_13B_SFT_v0](https://huggingface.co/adonlee/LLaMA_2_13B_SFT_v0)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of\ \ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\ \ be found as a specific split in each configuration, the split being named using\ \ the timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_adonlee__LLaMA_2_13B_SFT_v0\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T07:12:40.542820](https://huggingface.co/datasets/open-llm-leaderboard/details_adonlee__LLaMA_2_13B_SFT_v0/blob/main/results_2023-10-04T07-12-40.542820.json)\ \ (note that there might be results for other tasks in the repo if successive evals\ \ didn't cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5846495806776177,\n\ \ \"acc_stderr\": 0.0340023694085291,\n \"acc_norm\": 0.5888189410019026,\n\ \ \"acc_norm_stderr\": 0.03397885687355024,\n \"mc1\": 0.3574051407588739,\n\ \ \"mc1_stderr\": 0.0167765996767294,\n \"mc2\": 0.49917554427398053,\n\ \ \"mc2_stderr\": 0.01552903408883325\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5733788395904437,\n \"acc_stderr\": 0.014453185592920293,\n\ \ \"acc_norm\": 0.6203071672354948,\n \"acc_norm_stderr\": 0.014182119866974872\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6389165504879506,\n\ \ \"acc_stderr\": 0.004793330525656208,\n \"acc_norm\": 0.8379804819757021,\n\ \ \"acc_norm_stderr\": 0.0036771566878488382\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\ \ \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n\ \ \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.04033565667848319,\n\ \ \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.04033565667848319\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\ \ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \ \ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6188679245283019,\n \"acc_stderr\": 0.029890609686286644,\n\ \ \"acc_norm\": 0.6188679245283019,\n \"acc_norm_stderr\": 0.029890609686286644\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n\ \ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n\ \ \"acc_norm_stderr\": 0.039621355734862175\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \ \ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\ \ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n\ \ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \ \ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\ \ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n\ \ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\ \ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\ \ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.46382978723404256,\n \"acc_stderr\": 0.032600385118357715,\n\ \ \"acc_norm\": 0.46382978723404256,\n \"acc_norm_stderr\": 0.032600385118357715\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\ \ \"acc_stderr\": 0.043391383225798615,\n \"acc_norm\": 0.30701754385964913,\n\ \ \"acc_norm_stderr\": 0.043391383225798615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.041546596717075474,\n\ \ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.041546596717075474\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.3439153439153439,\n \"acc_stderr\": 0.024464426625596437,\n \"\ acc_norm\": 0.3439153439153439,\n 
\"acc_norm_stderr\": 0.024464426625596437\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\ \ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\ \ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \ \ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6967741935483871,\n\ \ \"acc_stderr\": 0.02614868593067175,\n \"acc_norm\": 0.6967741935483871,\n\ \ \"acc_norm_stderr\": 0.02614868593067175\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.035025446508458714,\n\ \ \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.035025446508458714\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\"\ : 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n\ \ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7373737373737373,\n \"acc_stderr\": 0.031353050095330855,\n \"\ acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.031353050095330855\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723875,\n\ \ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723875\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.5743589743589743,\n \"acc_stderr\": 0.02506909438729654,\n \ \ \"acc_norm\": 0.5743589743589743,\n \"acc_norm_stderr\": 0.02506909438729654\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.2814814814814815,\n \"acc_stderr\": 0.02742001935094527,\n \ \ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.02742001935094527\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6008403361344538,\n \"acc_stderr\": 0.03181110032413926,\n \ \ \"acc_norm\": 0.6008403361344538,\n \"acc_norm_stderr\": 0.03181110032413926\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\ acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7834862385321101,\n \"acc_stderr\": 0.017658710594443138,\n \"\ acc_norm\": 0.7834862385321101,\n \"acc_norm_stderr\": 0.017658710594443138\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.4212962962962963,\n \"acc_stderr\": 0.03367462138896078,\n \"\ acc_norm\": 0.4212962962962963,\n \"acc_norm_stderr\": 0.03367462138896078\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7990196078431373,\n \"acc_stderr\": 0.02812597226565437,\n \"\ acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.02812597226565437\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579647,\n \ \ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579647\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\ \ \"acc_stderr\": 0.03181149747055359,\n \"acc_norm\": 0.6591928251121076,\n\ \ \"acc_norm_stderr\": 0.03181149747055359\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n\ \ \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\ acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\ \ \"acc_stderr\": 0.04330043749650741,\n \"acc_norm\": 0.7222222222222222,\n\ \ \"acc_norm_stderr\": 0.04330043749650741\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n\ \ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\ \ \"acc_stderr\": 0.046695106638751906,\n \"acc_norm\": 0.4107142857142857,\n\ \ \"acc_norm_stderr\": 0.046695106638751906\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\ \ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\ \ \"acc_stderr\": 0.023365051491753722,\n \"acc_norm\": 0.8504273504273504,\n\ \ \"acc_norm_stderr\": 0.023365051491753722\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \ \ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7803320561941252,\n\ \ \"acc_stderr\": 0.014805384478371155,\n \"acc_norm\": 0.7803320561941252,\n\ \ \"acc_norm_stderr\": 0.014805384478371155\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.025416003773165552,\n\ \ \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.025416003773165552\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4301675977653631,\n\ \ \"acc_stderr\": 0.016558601636041035,\n 
\"acc_norm\": 0.4301675977653631,\n\ \ \"acc_norm_stderr\": 0.016558601636041035\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.02818059632825929,\n\ \ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.02818059632825929\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\ \ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\ \ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.02604176620271716,\n\ \ \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.02604176620271716\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303055,\n \ \ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303055\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n\ \ \"acc_stderr\": 0.012741974333897227,\n \"acc_norm\": 0.4667535853976532,\n\ \ \"acc_norm_stderr\": 0.012741974333897227\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5551470588235294,\n \"acc_stderr\": 0.030187532060329387,\n\ \ \"acc_norm\": 0.5551470588235294,\n \"acc_norm_stderr\": 0.030187532060329387\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6013071895424836,\n \"acc_stderr\": 0.01980828131744985,\n \ \ \"acc_norm\": 0.6013071895424836,\n \"acc_norm_stderr\": 0.01980828131744985\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\ \ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\ \ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387638,\n\ \ \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387638\n\ \ 
},\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7711442786069652,\n\ \ \"acc_stderr\": 0.029705284056772432,\n \"acc_norm\": 0.7711442786069652,\n\ \ \"acc_norm_stderr\": 0.029705284056772432\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \ \ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\ \ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n\ \ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.031885780176863984,\n\ \ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.031885780176863984\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3574051407588739,\n\ \ \"mc1_stderr\": 0.0167765996767294,\n \"mc2\": 0.49917554427398053,\n\ \ \"mc2_stderr\": 0.01552903408883325\n }\n}\n```" repo_url: https://huggingface.co/adonlee/LLaMA_2_13B_SFT_v0 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|arc:challenge|25_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hellaswag|10_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-12-40.542820.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-12-40.542820.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-12-40.542820.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-12-40.542820.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-12-40.542820.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-12-40.542820.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-12-40.542820.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-12-40.542820.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-12-40.542820.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T07_12_40.542820 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T07-12-40.542820.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T07-12-40.542820.parquet' - config_name: results data_files: - split: 2023_10_04T07_12_40.542820 path: - results_2023-10-04T07-12-40.542820.parquet - split: latest path: - results_2023-10-04T07-12-40.542820.parquet --- # Dataset Card for Evaluation run of adonlee/LLaMA_2_13B_SFT_v0 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/adonlee/LLaMA_2_13B_SFT_v0 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[adonlee/LLaMA_2_13B_SFT_v0](https://huggingface.co/adonlee/LLaMA_2_13B_SFT_v0)
on the [Open LLM
Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the
evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific
split in each configuration, the split being named using the timestamp of the run.
The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run
(and is used to compute and display the aggregated metrics on the [Open LLM
Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_adonlee__LLaMA_2_13B_SFT_v0",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run
2023-10-04T07:12:40.542820](https://huggingface.co/datasets/open-llm-leaderboard/details_adonlee__LLaMA_2_13B_SFT_v0/blob/main/results_2023-10-04T07-12-40.542820.json)
(note that there might be results for other tasks in the repo if successive evals
didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5846495806776177, "acc_stderr": 0.0340023694085291, "acc_norm": 0.5888189410019026, "acc_norm_stderr": 0.03397885687355024, "mc1": 0.3574051407588739, "mc1_stderr": 0.0167765996767294, "mc2": 0.49917554427398053, "mc2_stderr": 0.01552903408883325 }, "harness|arc:challenge|25": { "acc": 0.5733788395904437, "acc_stderr": 0.014453185592920293, "acc_norm": 0.6203071672354948, "acc_norm_stderr": 0.014182119866974872 }, "harness|hellaswag|10": { "acc": 0.6389165504879506, "acc_stderr": 0.004793330525656208, "acc_norm": 0.8379804819757021, "acc_norm_stderr": 0.0036771566878488382 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.43703703703703706, "acc_stderr": 0.04284958639753399, "acc_norm": 0.43703703703703706, "acc_norm_stderr": 0.04284958639753399 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5657894736842105, "acc_stderr": 0.04033565667848319, "acc_norm": 0.5657894736842105, "acc_norm_stderr": 0.04033565667848319 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.59, "acc_stderr": 0.049431107042371025, "acc_norm": 0.59, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6188679245283019, "acc_stderr": 0.029890609686286644, "acc_norm": 0.6188679245283019, "acc_norm_stderr": 0.029890609686286644 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6597222222222222, "acc_stderr": 0.039621355734862175, "acc_norm": 0.6597222222222222, "acc_norm_stderr": 0.039621355734862175 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.4, "acc_stderr": 0.04923659639173309, "acc_norm": 0.4, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.46, "acc_stderr": 0.05009082659620332, "acc_norm": 0.46, 
"acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5433526011560693, "acc_stderr": 0.03798106566014498, "acc_norm": 0.5433526011560693, "acc_norm_stderr": 0.03798106566014498 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.04690650298201942, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.04690650298201942 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.46382978723404256, "acc_stderr": 0.032600385118357715, "acc_norm": 0.46382978723404256, "acc_norm_stderr": 0.032600385118357715 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.30701754385964913, "acc_stderr": 0.043391383225798615, "acc_norm": 0.30701754385964913, "acc_norm_stderr": 0.043391383225798615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5379310344827586, "acc_stderr": 0.041546596717075474, "acc_norm": 0.5379310344827586, "acc_norm_stderr": 0.041546596717075474 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3439153439153439, "acc_stderr": 0.024464426625596437, "acc_norm": 0.3439153439153439, "acc_norm_stderr": 0.024464426625596437 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4126984126984127, "acc_stderr": 0.04403438954768176, "acc_norm": 0.4126984126984127, "acc_norm_stderr": 0.04403438954768176 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6967741935483871, "acc_stderr": 0.02614868593067175, "acc_norm": 0.6967741935483871, "acc_norm_stderr": 0.02614868593067175 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.45320197044334976, "acc_stderr": 0.035025446508458714, "acc_norm": 0.45320197044334976, "acc_norm_stderr": 0.035025446508458714 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.63, "acc_stderr": 0.048523658709391, "acc_norm": 0.63, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7151515151515152, "acc_stderr": 0.03524390844511781, "acc_norm": 0.7151515151515152, "acc_norm_stderr": 0.03524390844511781 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7373737373737373, "acc_stderr": 0.031353050095330855, "acc_norm": 0.7373737373737373, "acc_norm_stderr": 0.031353050095330855 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8497409326424871, "acc_stderr": 0.025787723180723875, "acc_norm": 0.8497409326424871, "acc_norm_stderr": 0.025787723180723875 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5743589743589743, "acc_stderr": 0.02506909438729654, "acc_norm": 0.5743589743589743, "acc_norm_stderr": 0.02506909438729654 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2814814814814815, "acc_stderr": 0.02742001935094527, "acc_norm": 0.2814814814814815, "acc_norm_stderr": 0.02742001935094527 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6008403361344538, "acc_stderr": 0.03181110032413926, "acc_norm": 0.6008403361344538, "acc_norm_stderr": 0.03181110032413926 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.32450331125827814, "acc_stderr": 0.038227469376587525, "acc_norm": 0.32450331125827814, "acc_norm_stderr": 0.038227469376587525 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7834862385321101, "acc_stderr": 0.017658710594443138, "acc_norm": 0.7834862385321101, "acc_norm_stderr": 0.017658710594443138 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4212962962962963, "acc_stderr": 
0.03367462138896078, "acc_norm": 0.4212962962962963, "acc_norm_stderr": 0.03367462138896078 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7990196078431373, "acc_stderr": 0.02812597226565437, "acc_norm": 0.7990196078431373, "acc_norm_stderr": 0.02812597226565437 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8185654008438819, "acc_stderr": 0.025085961144579647, "acc_norm": 0.8185654008438819, "acc_norm_stderr": 0.025085961144579647 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6591928251121076, "acc_stderr": 0.03181149747055359, "acc_norm": 0.6591928251121076, "acc_norm_stderr": 0.03181149747055359 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6793893129770993, "acc_stderr": 0.04093329229834278, "acc_norm": 0.6793893129770993, "acc_norm_stderr": 0.04093329229834278 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7603305785123967, "acc_stderr": 0.03896878985070416, "acc_norm": 0.7603305785123967, "acc_norm_stderr": 0.03896878985070416 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7222222222222222, "acc_stderr": 0.04330043749650741, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.04330043749650741 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7239263803680982, "acc_stderr": 0.035123852837050475, "acc_norm": 0.7239263803680982, "acc_norm_stderr": 0.035123852837050475 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4107142857142857, "acc_stderr": 0.046695106638751906, "acc_norm": 0.4107142857142857, "acc_norm_stderr": 0.046695106638751906 }, "harness|hendrycksTest-management|5": { "acc": 0.7378640776699029, "acc_stderr": 0.04354631077260595, "acc_norm": 0.7378640776699029, "acc_norm_stderr": 0.04354631077260595 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8504273504273504, "acc_stderr": 0.023365051491753722, "acc_norm": 0.8504273504273504, "acc_norm_stderr": 0.023365051491753722 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.61, "acc_stderr": 
0.04902071300001975, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7803320561941252, "acc_stderr": 0.014805384478371155, "acc_norm": 0.7803320561941252, "acc_norm_stderr": 0.014805384478371155 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6647398843930635, "acc_stderr": 0.025416003773165552, "acc_norm": 0.6647398843930635, "acc_norm_stderr": 0.025416003773165552 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4301675977653631, "acc_stderr": 0.016558601636041035, "acc_norm": 0.4301675977653631, "acc_norm_stderr": 0.016558601636041035 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5882352941176471, "acc_stderr": 0.02818059632825929, "acc_norm": 0.5882352941176471, "acc_norm_stderr": 0.02818059632825929 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6784565916398714, "acc_stderr": 0.026527724079528872, "acc_norm": 0.6784565916398714, "acc_norm_stderr": 0.026527724079528872 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6759259259259259, "acc_stderr": 0.02604176620271716, "acc_norm": 0.6759259259259259, "acc_norm_stderr": 0.02604176620271716 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4716312056737589, "acc_stderr": 0.029779450957303055, "acc_norm": 0.4716312056737589, "acc_norm_stderr": 0.029779450957303055 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4667535853976532, "acc_stderr": 0.012741974333897227, "acc_norm": 0.4667535853976532, "acc_norm_stderr": 0.012741974333897227 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5551470588235294, "acc_stderr": 0.030187532060329387, "acc_norm": 0.5551470588235294, "acc_norm_stderr": 0.030187532060329387 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6013071895424836, "acc_stderr": 0.01980828131744985, "acc_norm": 0.6013071895424836, "acc_norm_stderr": 0.01980828131744985 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6181818181818182, "acc_stderr": 
0.046534298079135075, "acc_norm": 0.6181818181818182, "acc_norm_stderr": 0.046534298079135075 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6448979591836734, "acc_stderr": 0.030635655150387638, "acc_norm": 0.6448979591836734, "acc_norm_stderr": 0.030635655150387638 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7711442786069652, "acc_stderr": 0.029705284056772432, "acc_norm": 0.7711442786069652, "acc_norm_stderr": 0.029705284056772432 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774708, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774708 }, "harness|hendrycksTest-virology|5": { "acc": 0.4939759036144578, "acc_stderr": 0.03892212195333045, "acc_norm": 0.4939759036144578, "acc_norm_stderr": 0.03892212195333045 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7777777777777778, "acc_stderr": 0.031885780176863984, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.031885780176863984 }, "harness|truthfulqa:mc|0": { "mc1": 0.3574051407588739, "mc1_stderr": 0.0167765996767294, "mc2": 0.49917554427398053, "mc2_stderr": 0.01552903408883325 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
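The per-task results JSON above is a flat mapping from task name to metric dict, so it can be post-processed directly. A minimal sketch (offline, using a two-task excerpt of the structure shown above; the full dict has one entry per `harness|hendrycksTest-*|5` task) that averages accuracy across MMLU subtasks:

```python
import json

# Excerpt of the "Latest results" JSON above (two tasks kept for brevity)
results = json.loads("""
{
  "harness|hendrycksTest-virology|5": {"acc": 0.4939759036144578},
  "harness|hendrycksTest-world_religions|5": {"acc": 0.7777777777777778}
}
""")

# Average accuracy over the MMLU (hendrycksTest) tasks present in the dict
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest")]
mean_acc = sum(mmlu_accs) / len(mmlu_accs)
print(round(mean_acc, 4))  # → 0.6359
```

The same filtering works on the full results object loaded from the `results` configuration of this dataset.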
open-llm-leaderboard/details_Juniplayground__Mist_LLaMA-2-7B-1024_V3
2023-10-04T07:14:26.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of Juniplayground/Mist_LLaMA-2-7B-1024_V3 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Juniplayground/Mist_LLaMA-2-7B-1024_V3](https://huggingface.co/Juniplayground/Mist_LLaMA-2-7B-1024_V3)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Juniplayground__Mist_LLaMA-2-7B-1024_V3\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T07:13:03.510768](https://huggingface.co/datasets/open-llm-leaderboard/details_Juniplayground__Mist_LLaMA-2-7B-1024_V3/blob/main/results_2023-10-04T07-13-03.510768.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.41745428426970854,\n\ \ \"acc_stderr\": 0.03494714587730849,\n \"acc_norm\": 0.4212975081485185,\n\ \ \"acc_norm_stderr\": 0.03493415138523584,\n \"mc1\": 0.26193390452876375,\n\ \ \"mc1_stderr\": 0.01539211880501503,\n \"mc2\": 0.41205035167771537,\n\ \ \"mc2_stderr\": 0.014036173763385399\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.48208191126279865,\n \"acc_stderr\": 0.01460200558549098,\n\ \ \"acc_norm\": 0.5136518771331058,\n \"acc_norm_stderr\": 0.014605943429860947\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5822545309699263,\n\ \ \"acc_stderr\": 0.004921798492608778,\n \"acc_norm\": 0.7774347739494125,\n\ \ \"acc_norm_stderr\": 0.004151185615952061\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036622,\n \ \ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036622\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n\ \ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n\ \ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.3881578947368421,\n \"acc_stderr\": 0.03965842097512744,\n\ \ \"acc_norm\": 0.3881578947368421,\n \"acc_norm_stderr\": 0.03965842097512744\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\ \ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \ \ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.41132075471698115,\n \"acc_stderr\": 0.030285009259009794,\n\ \ \"acc_norm\": 0.41132075471698115,\n \"acc_norm_stderr\": 0.030285009259009794\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4236111111111111,\n\ \ \"acc_stderr\": 0.04132125019723369,\n \"acc_norm\": 0.4236111111111111,\n\ \ \"acc_norm_stderr\": 0.04132125019723369\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\ \ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.36416184971098264,\n\ \ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.36416184971098264,\n\ \ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n\ \ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\ \ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.031778212502369216,\n\ \ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.031778212502369216\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\ \ \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n\ \ \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482757,\n\ \ \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482757\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.24074074074074073,\n \"acc_stderr\": 0.022019080012217883,\n \"\ acc_norm\": 0.24074074074074073,\n 
\"acc_norm_stderr\": 0.022019080012217883\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\ \ \"acc_stderr\": 0.041634530313028585,\n \"acc_norm\": 0.31746031746031744,\n\ \ \"acc_norm_stderr\": 0.041634530313028585\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.3935483870967742,\n \"acc_stderr\": 0.02779187875313227,\n \"\ acc_norm\": 0.3935483870967742,\n \"acc_norm_stderr\": 0.02779187875313227\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694436,\n \"\ acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694436\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\"\ : 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.49696969696969695,\n \"acc_stderr\": 0.03904272341431855,\n\ \ \"acc_norm\": 0.49696969696969695,\n \"acc_norm_stderr\": 0.03904272341431855\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.47474747474747475,\n \"acc_stderr\": 0.03557806245087314,\n \"\ acc_norm\": 0.47474747474747475,\n \"acc_norm_stderr\": 0.03557806245087314\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.5958549222797928,\n \"acc_stderr\": 0.0354150857888402,\n\ \ \"acc_norm\": 0.5958549222797928,\n \"acc_norm_stderr\": 0.0354150857888402\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.36923076923076925,\n \"acc_stderr\": 0.024468615241478916,\n\ \ \"acc_norm\": 0.36923076923076925,\n \"acc_norm_stderr\": 0.024468615241478916\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230175,\n \ \ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230175\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.3907563025210084,\n \"acc_stderr\": 0.03169380235712997,\n \ \ \"acc_norm\": 0.3907563025210084,\n \"acc_norm_stderr\": 0.03169380235712997\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\ acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.5339449541284403,\n \"acc_stderr\": 0.021387863350353985,\n \"\ acc_norm\": 0.5339449541284403,\n \"acc_norm_stderr\": 0.021387863350353985\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.32407407407407407,\n \"acc_stderr\": 0.03191923445686185,\n \"\ acc_norm\": 0.32407407407407407,\n \"acc_norm_stderr\": 0.03191923445686185\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.45588235294117646,\n \"acc_stderr\": 0.03495624522015474,\n \"\ acc_norm\": 0.45588235294117646,\n \"acc_norm_stderr\": 0.03495624522015474\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.5147679324894515,\n \"acc_stderr\": 0.032533028078777386,\n \ \ \"acc_norm\": 0.5147679324894515,\n \"acc_norm_stderr\": 0.032533028078777386\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.47533632286995514,\n\ \ \"acc_stderr\": 0.03351695167652628,\n \"acc_norm\": 0.47533632286995514,\n\ \ \"acc_norm_stderr\": 0.03351695167652628\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.5572519083969466,\n \"acc_stderr\": 0.04356447202665069,\n\ \ \"acc_norm\": 0.5572519083969466,\n \"acc_norm_stderr\": 0.04356447202665069\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.6115702479338843,\n \"acc_stderr\": 0.04449270350068383,\n \"\ acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.04449270350068383\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4537037037037037,\n\ \ \"acc_stderr\": 0.048129173245368216,\n \"acc_norm\": 0.4537037037037037,\n\ \ \"acc_norm_stderr\": 0.048129173245368216\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.3803680981595092,\n \"acc_stderr\": 0.03814269893261837,\n\ \ \"acc_norm\": 0.3803680981595092,\n \"acc_norm_stderr\": 0.03814269893261837\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\ \ \"acc_stderr\": 0.045218299028335865,\n \"acc_norm\": 0.3482142857142857,\n\ \ \"acc_norm_stderr\": 0.045218299028335865\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.47572815533980584,\n \"acc_stderr\": 0.049449010929737795,\n\ \ \"acc_norm\": 0.47572815533980584,\n \"acc_norm_stderr\": 0.049449010929737795\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5641025641025641,\n\ \ \"acc_stderr\": 0.03248577511578401,\n \"acc_norm\": 0.5641025641025641,\n\ \ \"acc_norm_stderr\": 0.03248577511578401\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \ \ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5925925925925926,\n\ \ \"acc_stderr\": 0.017570705239256565,\n \"acc_norm\": 0.5925925925925926,\n\ \ \"acc_norm_stderr\": 0.017570705239256565\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.430635838150289,\n \"acc_stderr\": 0.026658800273672376,\n\ \ \"acc_norm\": 0.430635838150289,\n \"acc_norm_stderr\": 0.026658800273672376\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2536312849162011,\n\ \ \"acc_stderr\": 0.014551553659369922,\n 
\"acc_norm\": 0.2536312849162011,\n\ \ \"acc_norm_stderr\": 0.014551553659369922\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.4542483660130719,\n \"acc_stderr\": 0.028509807802626564,\n\ \ \"acc_norm\": 0.4542483660130719,\n \"acc_norm_stderr\": 0.028509807802626564\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5337620578778135,\n\ \ \"acc_stderr\": 0.02833327710956279,\n \"acc_norm\": 0.5337620578778135,\n\ \ \"acc_norm_stderr\": 0.02833327710956279\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.4660493827160494,\n \"acc_stderr\": 0.02775653525734767,\n\ \ \"acc_norm\": 0.4660493827160494,\n \"acc_norm_stderr\": 0.02775653525734767\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.38652482269503546,\n \"acc_stderr\": 0.029049190342543465,\n \ \ \"acc_norm\": 0.38652482269503546,\n \"acc_norm_stderr\": 0.029049190342543465\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3213820078226858,\n\ \ \"acc_stderr\": 0.011927581352265076,\n \"acc_norm\": 0.3213820078226858,\n\ \ \"acc_norm_stderr\": 0.011927581352265076\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.030320243265004137,\n\ \ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.030320243265004137\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.39705882352941174,\n \"acc_stderr\": 0.01979448890002411,\n \ \ \"acc_norm\": 0.39705882352941174,\n \"acc_norm_stderr\": 0.01979448890002411\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.43636363636363634,\n\ \ \"acc_stderr\": 0.04750185058907297,\n \"acc_norm\": 0.43636363636363634,\n\ \ \"acc_norm_stderr\": 0.04750185058907297\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.47346938775510206,\n \"acc_stderr\": 0.03196412734523272,\n\ \ \"acc_norm\": 0.47346938775510206,\n \"acc_norm_stderr\": 
0.03196412734523272\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5771144278606966,\n\ \ \"acc_stderr\": 0.03493231777421282,\n \"acc_norm\": 0.5771144278606966,\n\ \ \"acc_norm_stderr\": 0.03493231777421282\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \ \ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39156626506024095,\n\ \ \"acc_stderr\": 0.037998574544796375,\n \"acc_norm\": 0.39156626506024095,\n\ \ \"acc_norm_stderr\": 0.037998574544796375\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.6140350877192983,\n \"acc_stderr\": 0.03733756969066165,\n\ \ \"acc_norm\": 0.6140350877192983,\n \"acc_norm_stderr\": 0.03733756969066165\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26193390452876375,\n\ \ \"mc1_stderr\": 0.01539211880501503,\n \"mc2\": 0.41205035167771537,\n\ \ \"mc2_stderr\": 0.014036173763385399\n }\n}\n```" repo_url: https://huggingface.co/Juniplayground/Mist_LLaMA-2-7B-1024_V3 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|arc:challenge|25_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hellaswag|10_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-13-03.510768.parquet' - 
'**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-13-03.510768.parquet' - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-13-03.510768.parquet' - 
'**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-13-03.510768.parquet' 
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-13-03.510768.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-13-03.510768.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-13-03.510768.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-13-03.510768.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-13-03.510768.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T07_13_03.510768 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T07-13-03.510768.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T07-13-03.510768.parquet' - config_name: results data_files: - split: 2023_10_04T07_13_03.510768 path: - results_2023-10-04T07-13-03.510768.parquet - split: latest path: - results_2023-10-04T07-13-03.510768.parquet --- # Dataset Card for Evaluation run of Juniplayground/Mist_LLaMA-2-7B-1024_V3 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Juniplayground/Mist_LLaMA-2-7B-1024_V3 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[Juniplayground/Mist_LLaMA-2-7B-1024_V3](https://huggingface.co/Juniplayground/Mist_LLaMA-2-7B-1024_V3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Juniplayground__Mist_LLaMA-2-7B-1024_V3",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-04T07:13:03.510768](https://huggingface.co/datasets/open-llm-leaderboard/details_Juniplayground__Mist_LLaMA-2-7B-1024_V3/blob/main/results_2023-10-04T07-13-03.510768.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.41745428426970854, "acc_stderr": 0.03494714587730849, "acc_norm": 0.4212975081485185, "acc_norm_stderr": 0.03493415138523584, "mc1": 0.26193390452876375, "mc1_stderr": 0.01539211880501503, "mc2": 0.41205035167771537, "mc2_stderr": 0.014036173763385399 }, "harness|arc:challenge|25": { "acc": 0.48208191126279865, "acc_stderr": 0.01460200558549098, "acc_norm": 0.5136518771331058, "acc_norm_stderr": 0.014605943429860947 }, "harness|hellaswag|10": { "acc": 0.5822545309699263, "acc_stderr": 0.004921798492608778, "acc_norm": 0.7774347739494125, "acc_norm_stderr": 0.004151185615952061 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.19, "acc_stderr": 0.03942772444036622, "acc_norm": 0.19, "acc_norm_stderr": 0.03942772444036622 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4444444444444444, "acc_stderr": 0.04292596718256981, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.04292596718256981 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.3881578947368421, "acc_stderr": 0.03965842097512744, "acc_norm": 0.3881578947368421, "acc_norm_stderr": 0.03965842097512744 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.49, "acc_stderr": 0.05024183937956912, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.41132075471698115, "acc_stderr": 0.030285009259009794, "acc_norm": 0.41132075471698115, "acc_norm_stderr": 0.030285009259009794 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.4236111111111111, "acc_stderr": 0.04132125019723369, "acc_norm": 0.4236111111111111, "acc_norm_stderr": 0.04132125019723369 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, 
"acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.04688261722621505, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621505 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.36416184971098264, "acc_stderr": 0.03669072477416907, "acc_norm": 0.36416184971098264, "acc_norm_stderr": 0.03669072477416907 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.23529411764705882, "acc_stderr": 0.04220773659171452, "acc_norm": 0.23529411764705882, "acc_norm_stderr": 0.04220773659171452 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.49, "acc_stderr": 0.05024183937956912, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.3829787234042553, "acc_stderr": 0.031778212502369216, "acc_norm": 0.3829787234042553, "acc_norm_stderr": 0.031778212502369216 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2631578947368421, "acc_stderr": 0.041424397194893624, "acc_norm": 0.2631578947368421, "acc_norm_stderr": 0.041424397194893624 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.4413793103448276, "acc_stderr": 0.04137931034482757, "acc_norm": 0.4413793103448276, "acc_norm_stderr": 0.04137931034482757 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.24074074074074073, "acc_stderr": 0.022019080012217883, "acc_norm": 0.24074074074074073, "acc_norm_stderr": 0.022019080012217883 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.31746031746031744, "acc_stderr": 0.041634530313028585, "acc_norm": 0.31746031746031744, "acc_norm_stderr": 0.041634530313028585 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.3935483870967742, "acc_stderr": 0.02779187875313227, "acc_norm": 0.3935483870967742, "acc_norm_stderr": 0.02779187875313227 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.2512315270935961, "acc_stderr": 0.030516530732694436, "acc_norm": 0.2512315270935961, "acc_norm_stderr": 0.030516530732694436 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.26, "acc_stderr": 0.0440844002276808, "acc_norm": 0.26, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.49696969696969695, "acc_stderr": 0.03904272341431855, "acc_norm": 0.49696969696969695, "acc_norm_stderr": 0.03904272341431855 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.47474747474747475, "acc_stderr": 0.03557806245087314, "acc_norm": 0.47474747474747475, "acc_norm_stderr": 0.03557806245087314 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.5958549222797928, "acc_stderr": 0.0354150857888402, "acc_norm": 0.5958549222797928, "acc_norm_stderr": 0.0354150857888402 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.36923076923076925, "acc_stderr": 0.024468615241478916, "acc_norm": 0.36923076923076925, "acc_norm_stderr": 0.024468615241478916 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2777777777777778, "acc_stderr": 0.027309140588230175, "acc_norm": 0.2777777777777778, "acc_norm_stderr": 0.027309140588230175 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.3907563025210084, "acc_stderr": 0.03169380235712997, "acc_norm": 0.3907563025210084, "acc_norm_stderr": 0.03169380235712997 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.1986754966887417, "acc_stderr": 0.03257847384436776, "acc_norm": 0.1986754966887417, "acc_norm_stderr": 0.03257847384436776 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.5339449541284403, "acc_stderr": 0.021387863350353985, "acc_norm": 0.5339449541284403, "acc_norm_stderr": 0.021387863350353985 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.32407407407407407, "acc_stderr": 
0.03191923445686185, "acc_norm": 0.32407407407407407, "acc_norm_stderr": 0.03191923445686185 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.45588235294117646, "acc_stderr": 0.03495624522015474, "acc_norm": 0.45588235294117646, "acc_norm_stderr": 0.03495624522015474 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.5147679324894515, "acc_stderr": 0.032533028078777386, "acc_norm": 0.5147679324894515, "acc_norm_stderr": 0.032533028078777386 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.47533632286995514, "acc_stderr": 0.03351695167652628, "acc_norm": 0.47533632286995514, "acc_norm_stderr": 0.03351695167652628 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5572519083969466, "acc_stderr": 0.04356447202665069, "acc_norm": 0.5572519083969466, "acc_norm_stderr": 0.04356447202665069 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6115702479338843, "acc_stderr": 0.04449270350068383, "acc_norm": 0.6115702479338843, "acc_norm_stderr": 0.04449270350068383 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.4537037037037037, "acc_stderr": 0.048129173245368216, "acc_norm": 0.4537037037037037, "acc_norm_stderr": 0.048129173245368216 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.3803680981595092, "acc_stderr": 0.03814269893261837, "acc_norm": 0.3803680981595092, "acc_norm_stderr": 0.03814269893261837 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3482142857142857, "acc_stderr": 0.045218299028335865, "acc_norm": 0.3482142857142857, "acc_norm_stderr": 0.045218299028335865 }, "harness|hendrycksTest-management|5": { "acc": 0.47572815533980584, "acc_stderr": 0.049449010929737795, "acc_norm": 0.47572815533980584, "acc_norm_stderr": 0.049449010929737795 }, "harness|hendrycksTest-marketing|5": { "acc": 0.5641025641025641, "acc_stderr": 0.03248577511578401, "acc_norm": 0.5641025641025641, "acc_norm_stderr": 0.03248577511578401 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.46, "acc_stderr": 
0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.5925925925925926, "acc_stderr": 0.017570705239256565, "acc_norm": 0.5925925925925926, "acc_norm_stderr": 0.017570705239256565 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.430635838150289, "acc_stderr": 0.026658800273672376, "acc_norm": 0.430635838150289, "acc_norm_stderr": 0.026658800273672376 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2536312849162011, "acc_stderr": 0.014551553659369922, "acc_norm": 0.2536312849162011, "acc_norm_stderr": 0.014551553659369922 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.4542483660130719, "acc_stderr": 0.028509807802626564, "acc_norm": 0.4542483660130719, "acc_norm_stderr": 0.028509807802626564 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5337620578778135, "acc_stderr": 0.02833327710956279, "acc_norm": 0.5337620578778135, "acc_norm_stderr": 0.02833327710956279 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.4660493827160494, "acc_stderr": 0.02775653525734767, "acc_norm": 0.4660493827160494, "acc_norm_stderr": 0.02775653525734767 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.38652482269503546, "acc_stderr": 0.029049190342543465, "acc_norm": 0.38652482269503546, "acc_norm_stderr": 0.029049190342543465 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3213820078226858, "acc_stderr": 0.011927581352265076, "acc_norm": 0.3213820078226858, "acc_norm_stderr": 0.011927581352265076 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.47058823529411764, "acc_stderr": 0.030320243265004137, "acc_norm": 0.47058823529411764, "acc_norm_stderr": 0.030320243265004137 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.39705882352941174, "acc_stderr": 0.01979448890002411, "acc_norm": 0.39705882352941174, "acc_norm_stderr": 0.01979448890002411 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.43636363636363634, 
"acc_stderr": 0.04750185058907297, "acc_norm": 0.43636363636363634, "acc_norm_stderr": 0.04750185058907297 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.47346938775510206, "acc_stderr": 0.03196412734523272, "acc_norm": 0.47346938775510206, "acc_norm_stderr": 0.03196412734523272 }, "harness|hendrycksTest-sociology|5": { "acc": 0.5771144278606966, "acc_stderr": 0.03493231777421282, "acc_norm": 0.5771144278606966, "acc_norm_stderr": 0.03493231777421282 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.62, "acc_stderr": 0.04878317312145632, "acc_norm": 0.62, "acc_norm_stderr": 0.04878317312145632 }, "harness|hendrycksTest-virology|5": { "acc": 0.39156626506024095, "acc_stderr": 0.037998574544796375, "acc_norm": 0.39156626506024095, "acc_norm_stderr": 0.037998574544796375 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.6140350877192983, "acc_stderr": 0.03733756969066165, "acc_norm": 0.6140350877192983, "acc_norm_stderr": 0.03733756969066165 }, "harness|truthfulqa:mc|0": { "mc1": 0.26193390452876375, "mc1_stderr": 0.01539211880501503, "mc2": 0.41205035167771537, "mc2_stderr": 0.014036173763385399 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_PY007__TinyLlama-1.1B-Chat-v0.3
2023-10-04T07:15:58.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of PY007/TinyLlama-1.1B-Chat-v0.3 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [PY007/TinyLlama-1.1B-Chat-v0.3](https://huggingface.co/PY007/TinyLlama-1.1B-Chat-v0.3)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PY007__TinyLlama-1.1B-Chat-v0.3\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T07:14:39.217680](https://huggingface.co/datasets/open-llm-leaderboard/details_PY007__TinyLlama-1.1B-Chat-v0.3/blob/main/results_2023-10-04T07-14-39.217680.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2594245145352148,\n\ \ \"acc_stderr\": 0.03169981723918608,\n \"acc_norm\": 0.2624127016975797,\n\ \ \"acc_norm_stderr\": 0.031705948859101704,\n \"mc1\": 0.22276621787025705,\n\ \ \"mc1_stderr\": 0.014566506961396742,\n \"mc2\": 0.3667165555145351,\n\ \ \"mc2_stderr\": 0.01463510371066448\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.31399317406143346,\n \"acc_stderr\": 0.013562691224726297,\n\ \ \"acc_norm\": 0.3506825938566553,\n \"acc_norm_stderr\": 0.013944635930726092\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4373630750846445,\n\ \ \"acc_stderr\": 0.0049504729185233165,\n \"acc_norm\": 0.5769766978689504,\n\ \ \"acc_norm_stderr\": 0.004930293787545608\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n\ \ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n\ \ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.3026315789473684,\n \"acc_stderr\": 0.03738520676119667,\n\ \ \"acc_norm\": 0.3026315789473684,\n \"acc_norm_stderr\": 0.03738520676119667\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\ \ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \ \ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.025288394502891366,\n\ \ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.025288394502891366\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\ \ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\ \ \"acc_norm_stderr\": 
0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \ \ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n\ \ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \ \ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2543352601156069,\n\ \ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.2543352601156069,\n\ \ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n\ \ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n\ \ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.20851063829787234,\n \"acc_stderr\": 0.02655698211783873,\n\ \ \"acc_norm\": 0.20851063829787234,\n \"acc_norm_stderr\": 0.02655698211783873\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\ \ \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n\ \ \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.2827586206896552,\n \"acc_stderr\": 0.03752833958003336,\n\ \ \"acc_norm\": 0.2827586206896552,\n \"acc_norm_stderr\": 0.03752833958003336\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918417,\n \"\ acc_norm\": 
0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918417\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15873015873015872,\n\ \ \"acc_stderr\": 0.032684540130117436,\n \"acc_norm\": 0.15873015873015872,\n\ \ \"acc_norm_stderr\": 0.032684540130117436\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \ \ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24838709677419354,\n\ \ \"acc_stderr\": 0.02458002892148101,\n \"acc_norm\": 0.24838709677419354,\n\ \ \"acc_norm_stderr\": 0.02458002892148101\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694436,\n\ \ \"acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694436\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\"\ : 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885415,\n\ \ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885415\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.22727272727272727,\n \"acc_stderr\": 0.02985751567338641,\n \"\ acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.02985751567338641\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.24352331606217617,\n \"acc_stderr\": 0.030975436386845436,\n\ \ \"acc_norm\": 0.24352331606217617,\n \"acc_norm_stderr\": 0.030975436386845436\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.23846153846153847,\n \"acc_stderr\": 0.021606294494647727,\n\ \ \"acc_norm\": 0.23846153846153847,\n \"acc_norm_stderr\": 0.021606294494647727\n\ \ 
},\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507383,\n \ \ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507383\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.026265024608275882,\n\ \ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.026265024608275882\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119995,\n \"\ acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119995\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.24587155963302754,\n \"acc_stderr\": 0.018461940968708426,\n \"\ acc_norm\": 0.24587155963302754,\n \"acc_norm_stderr\": 0.018461940968708426\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.24537037037037038,\n \"acc_stderr\": 0.02934666509437294,\n \"\ acc_norm\": 0.24537037037037038,\n \"acc_norm_stderr\": 0.02934666509437294\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591361,\n \"\ acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591361\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.26582278481012656,\n \"acc_stderr\": 0.028756799629658335,\n \ \ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.028756799629658335\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.21076233183856502,\n\ \ \"acc_stderr\": 0.027373095500540193,\n \"acc_norm\": 0.21076233183856502,\n\ \ \"acc_norm_stderr\": 0.027373095500540193\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.037276735755969195,\n\ \ \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.037276735755969195\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.371900826446281,\n \"acc_stderr\": 0.04412015806624504,\n \"acc_norm\"\ : 0.371900826446281,\n \"acc_norm_stderr\": 0.04412015806624504\n },\n\ \ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\ \ \"acc_stderr\": 0.042365112580946315,\n \"acc_norm\": 0.25925925925925924,\n\ \ \"acc_norm_stderr\": 0.042365112580946315\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.2883435582822086,\n \"acc_stderr\": 0.035590395316173425,\n\ \ \"acc_norm\": 0.2883435582822086,\n \"acc_norm_stderr\": 0.035590395316173425\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\ \ \"acc_stderr\": 0.044939490686135404,\n \"acc_norm\": 0.3392857142857143,\n\ \ \"acc_norm_stderr\": 0.044939490686135404\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.21359223300970873,\n \"acc_stderr\": 0.04058042015646036,\n\ \ \"acc_norm\": 0.21359223300970873,\n \"acc_norm_stderr\": 0.04058042015646036\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.27350427350427353,\n\ \ \"acc_stderr\": 0.029202540153431183,\n \"acc_norm\": 0.27350427350427353,\n\ \ \"acc_norm_stderr\": 0.029202540153431183\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.16,\n \"acc_stderr\": 0.0368452949177471,\n \ \ \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.0368452949177471\n },\n\ \ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26053639846743293,\n\ \ \"acc_stderr\": 0.015696008563807082,\n \"acc_norm\": 0.26053639846743293,\n\ \ \"acc_norm_stderr\": 0.015696008563807082\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.26878612716763006,\n \"acc_stderr\": 0.023868003262500104,\n\ \ \"acc_norm\": 0.26878612716763006,\n \"acc_norm_stderr\": 0.023868003262500104\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\ \ \"acc_stderr\": 0.014422292204808835,\n 
\"acc_norm\": 0.24692737430167597,\n\ \ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.02495418432487991,\n\ \ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.02495418432487991\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.29260450160771706,\n\ \ \"acc_stderr\": 0.025839898334877983,\n \"acc_norm\": 0.29260450160771706,\n\ \ \"acc_norm_stderr\": 0.025839898334877983\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.3055555555555556,\n \"acc_stderr\": 0.02563082497562135,\n\ \ \"acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.02563082497562135\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.2730496453900709,\n \"acc_stderr\": 0.026577860943307857,\n \ \ \"acc_norm\": 0.2730496453900709,\n \"acc_norm_stderr\": 0.026577860943307857\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26140808344198174,\n\ \ \"acc_stderr\": 0.011222528169771309,\n \"acc_norm\": 0.26140808344198174,\n\ \ \"acc_norm_stderr\": 0.011222528169771309\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.02352924218519311,\n\ \ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.02352924218519311\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.27124183006535946,\n \"acc_stderr\": 0.017986615304030305,\n \ \ \"acc_norm\": 0.27124183006535946,\n \"acc_norm_stderr\": 0.017986615304030305\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.19090909090909092,\n\ \ \"acc_stderr\": 0.03764425585984925,\n \"acc_norm\": 0.19090909090909092,\n\ \ \"acc_norm_stderr\": 0.03764425585984925\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.22040816326530613,\n \"acc_stderr\": 0.026537045312145312,\n\ \ \"acc_norm\": 0.22040816326530613,\n \"acc_norm_stderr\": 
0.026537045312145312\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\ \ \"acc_stderr\": 0.03036049015401467,\n \"acc_norm\": 0.24378109452736318,\n\ \ \"acc_norm_stderr\": 0.03036049015401467\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \ \ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.26506024096385544,\n\ \ \"acc_stderr\": 0.03436024037944967,\n \"acc_norm\": 0.26506024096385544,\n\ \ \"acc_norm_stderr\": 0.03436024037944967\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.033773102522091945,\n\ \ \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.033773102522091945\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22276621787025705,\n\ \ \"mc1_stderr\": 0.014566506961396742,\n \"mc2\": 0.3667165555145351,\n\ \ \"mc2_stderr\": 0.01463510371066448\n }\n}\n```" repo_url: https://huggingface.co/PY007/TinyLlama-1.1B-Chat-v0.3 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|arc:challenge|25_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hellaswag|10_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-14-39.217680.parquet' - 
'**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-14-39.217680.parquet' - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-14-39.217680.parquet' - 
'**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-14-39.217680.parquet' 
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-14-39.217680.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-14-39.217680.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-14-39.217680.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-14-39.217680.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-14-39.217680.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T07_14_39.217680 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T07-14-39.217680.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T07-14-39.217680.parquet' - config_name: results data_files: - split: 2023_10_04T07_14_39.217680 path: - results_2023-10-04T07-14-39.217680.parquet - split: latest path: - results_2023-10-04T07-14-39.217680.parquet --- # Dataset Card for Evaluation run of PY007/TinyLlama-1.1B-Chat-v0.3 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/PY007/TinyLlama-1.1B-Chat-v0.3 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[PY007/TinyLlama-1.1B-Chat-v0.3](https://huggingface.co/PY007/TinyLlama-1.1B-Chat-v0.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_PY007__TinyLlama-1.1B-Chat-v0.3", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-10-04T07:14:39.217680](https://huggingface.co/datasets/open-llm-leaderboard/details_PY007__TinyLlama-1.1B-Chat-v0.3/blob/main/results_2023-10-04T07-14-39.217680.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.2594245145352148, "acc_stderr": 0.03169981723918608, "acc_norm": 0.2624127016975797, "acc_norm_stderr": 0.031705948859101704, "mc1": 0.22276621787025705, "mc1_stderr": 0.014566506961396742, "mc2": 0.3667165555145351, "mc2_stderr": 0.01463510371066448 }, "harness|arc:challenge|25": { "acc": 0.31399317406143346, "acc_stderr": 0.013562691224726297, "acc_norm": 0.3506825938566553, "acc_norm_stderr": 0.013944635930726092 }, "harness|hellaswag|10": { "acc": 0.4373630750846445, "acc_stderr": 0.0049504729185233165, "acc_norm": 0.5769766978689504, "acc_norm_stderr": 0.004930293787545608 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.3333333333333333, "acc_stderr": 0.04072314811876837, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.04072314811876837 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.3026315789473684, "acc_stderr": 0.03738520676119667, "acc_norm": 0.3026315789473684, "acc_norm_stderr": 0.03738520676119667 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.21, "acc_stderr": 0.040936018074033256, "acc_norm": 0.21, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.21509433962264152, "acc_stderr": 0.025288394502891366, "acc_norm": 0.21509433962264152, "acc_norm_stderr": 0.025288394502891366 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2569444444444444, "acc_stderr": 0.03653946969442099, "acc_norm": 0.2569444444444444, "acc_norm_stderr": 0.03653946969442099 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.18, "acc_stderr": 0.03861229196653694, "acc_norm": 0.18, "acc_norm_stderr": 0.03861229196653694 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.26, "acc_stderr": 0.0440844002276808, "acc_norm": 0.26, 
"acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.24, "acc_stderr": 0.04292346959909284, "acc_norm": 0.24, "acc_norm_stderr": 0.04292346959909284 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.2543352601156069, "acc_stderr": 0.0332055644308557, "acc_norm": 0.2543352601156069, "acc_norm_stderr": 0.0332055644308557 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.23529411764705882, "acc_stderr": 0.04220773659171452, "acc_norm": 0.23529411764705882, "acc_norm_stderr": 0.04220773659171452 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.20851063829787234, "acc_stderr": 0.02655698211783873, "acc_norm": 0.20851063829787234, "acc_norm_stderr": 0.02655698211783873 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.24561403508771928, "acc_stderr": 0.04049339297748141, "acc_norm": 0.24561403508771928, "acc_norm_stderr": 0.04049339297748141 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2827586206896552, "acc_stderr": 0.03752833958003336, "acc_norm": 0.2827586206896552, "acc_norm_stderr": 0.03752833958003336 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.25925925925925924, "acc_stderr": 0.022569897074918417, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.022569897074918417 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.15873015873015872, "acc_stderr": 0.032684540130117436, "acc_norm": 0.15873015873015872, "acc_norm_stderr": 0.032684540130117436 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.28, "acc_stderr": 0.04512608598542127, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.24838709677419354, "acc_stderr": 0.02458002892148101, "acc_norm": 0.24838709677419354, "acc_norm_stderr": 0.02458002892148101 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.2512315270935961, "acc_stderr": 0.030516530732694436, "acc_norm": 0.2512315270935961, "acc_norm_stderr": 0.030516530732694436 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.26666666666666666, "acc_stderr": 0.03453131801885415, "acc_norm": 0.26666666666666666, "acc_norm_stderr": 0.03453131801885415 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.22727272727272727, "acc_stderr": 0.02985751567338641, "acc_norm": 0.22727272727272727, "acc_norm_stderr": 0.02985751567338641 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.24352331606217617, "acc_stderr": 0.030975436386845436, "acc_norm": 0.24352331606217617, "acc_norm_stderr": 0.030975436386845436 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.23846153846153847, "acc_stderr": 0.021606294494647727, "acc_norm": 0.23846153846153847, "acc_norm_stderr": 0.021606294494647727 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.26666666666666666, "acc_stderr": 0.02696242432507383, "acc_norm": 0.26666666666666666, "acc_norm_stderr": 0.02696242432507383 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.20588235294117646, "acc_stderr": 0.026265024608275882, "acc_norm": 0.20588235294117646, "acc_norm_stderr": 0.026265024608275882 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2913907284768212, "acc_stderr": 0.03710185726119995, "acc_norm": 0.2913907284768212, "acc_norm_stderr": 0.03710185726119995 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.24587155963302754, "acc_stderr": 0.018461940968708426, "acc_norm": 0.24587155963302754, "acc_norm_stderr": 0.018461940968708426 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.24537037037037038, 
"acc_stderr": 0.02934666509437294, "acc_norm": 0.24537037037037038, "acc_norm_stderr": 0.02934666509437294 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.24019607843137256, "acc_stderr": 0.02998373305591361, "acc_norm": 0.24019607843137256, "acc_norm_stderr": 0.02998373305591361 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.26582278481012656, "acc_stderr": 0.028756799629658335, "acc_norm": 0.26582278481012656, "acc_norm_stderr": 0.028756799629658335 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.21076233183856502, "acc_stderr": 0.027373095500540193, "acc_norm": 0.21076233183856502, "acc_norm_stderr": 0.027373095500540193 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.2366412213740458, "acc_stderr": 0.037276735755969195, "acc_norm": 0.2366412213740458, "acc_norm_stderr": 0.037276735755969195 }, "harness|hendrycksTest-international_law|5": { "acc": 0.371900826446281, "acc_stderr": 0.04412015806624504, "acc_norm": 0.371900826446281, "acc_norm_stderr": 0.04412015806624504 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.25925925925925924, "acc_stderr": 0.042365112580946315, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.042365112580946315 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.2883435582822086, "acc_stderr": 0.035590395316173425, "acc_norm": 0.2883435582822086, "acc_norm_stderr": 0.035590395316173425 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3392857142857143, "acc_stderr": 0.044939490686135404, "acc_norm": 0.3392857142857143, "acc_norm_stderr": 0.044939490686135404 }, "harness|hendrycksTest-management|5": { "acc": 0.21359223300970873, "acc_stderr": 0.04058042015646036, "acc_norm": 0.21359223300970873, "acc_norm_stderr": 0.04058042015646036 }, "harness|hendrycksTest-marketing|5": { "acc": 0.27350427350427353, "acc_stderr": 0.029202540153431183, "acc_norm": 0.27350427350427353, "acc_norm_stderr": 0.029202540153431183 }, "harness|hendrycksTest-medical_genetics|5": { 
"acc": 0.16, "acc_stderr": 0.0368452949177471, "acc_norm": 0.16, "acc_norm_stderr": 0.0368452949177471 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.26053639846743293, "acc_stderr": 0.015696008563807082, "acc_norm": 0.26053639846743293, "acc_norm_stderr": 0.015696008563807082 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.26878612716763006, "acc_stderr": 0.023868003262500104, "acc_norm": 0.26878612716763006, "acc_norm_stderr": 0.023868003262500104 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24692737430167597, "acc_stderr": 0.014422292204808835, "acc_norm": 0.24692737430167597, "acc_norm_stderr": 0.014422292204808835 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.2549019607843137, "acc_stderr": 0.02495418432487991, "acc_norm": 0.2549019607843137, "acc_norm_stderr": 0.02495418432487991 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.29260450160771706, "acc_stderr": 0.025839898334877983, "acc_norm": 0.29260450160771706, "acc_norm_stderr": 0.025839898334877983 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.3055555555555556, "acc_stderr": 0.02563082497562135, "acc_norm": 0.3055555555555556, "acc_norm_stderr": 0.02563082497562135 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.2730496453900709, "acc_stderr": 0.026577860943307857, "acc_norm": 0.2730496453900709, "acc_norm_stderr": 0.026577860943307857 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.26140808344198174, "acc_stderr": 0.011222528169771309, "acc_norm": 0.26140808344198174, "acc_norm_stderr": 0.011222528169771309 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.18382352941176472, "acc_stderr": 0.02352924218519311, "acc_norm": 0.18382352941176472, "acc_norm_stderr": 0.02352924218519311 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.27124183006535946, "acc_stderr": 0.017986615304030305, "acc_norm": 0.27124183006535946, "acc_norm_stderr": 0.017986615304030305 }, "harness|hendrycksTest-public_relations|5": { 
"acc": 0.19090909090909092, "acc_stderr": 0.03764425585984925, "acc_norm": 0.19090909090909092, "acc_norm_stderr": 0.03764425585984925 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.22040816326530613, "acc_stderr": 0.026537045312145312, "acc_norm": 0.22040816326530613, "acc_norm_stderr": 0.026537045312145312 }, "harness|hendrycksTest-sociology|5": { "acc": 0.24378109452736318, "acc_stderr": 0.03036049015401467, "acc_norm": 0.24378109452736318, "acc_norm_stderr": 0.03036049015401467 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-virology|5": { "acc": 0.26506024096385544, "acc_stderr": 0.03436024037944967, "acc_norm": 0.26506024096385544, "acc_norm_stderr": 0.03436024037944967 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.2631578947368421, "acc_stderr": 0.033773102522091945, "acc_norm": 0.2631578947368421, "acc_norm_stderr": 0.033773102522091945 }, "harness|truthfulqa:mc|0": { "mc1": 0.22276621787025705, "mc1_stderr": 0.014566506961396742, "mc2": 0.3667165555145351, "mc2_stderr": 0.01463510371066448 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
JojoPuppet/wikipedia_index_6M_title_first_sentence_categories
2023-10-04T07:37:38.000Z
[ "region:us" ]
JojoPuppet
null
null
null
0
0
Entry not found
atom-in-the-universe/bild-e9f4151a-440f-415c-ba8b-3160a3a4f6a7
2023-10-04T07:30:51.000Z
[ "region:us" ]
atom-in-the-universe
null
null
null
0
0
Entry not found
zperlman/mouse-seg2
2023-10-04T07:41:30.000Z
[ "region:us" ]
zperlman
null
null
null
0
0
Entry not found
BangumiBase/blends
2023-10-04T08:22:04.000Z
[ "size_categories:1K<n<10K", "license:mit", "art", "region:us" ]
BangumiBase
null
null
null
0
0
--- license: mit tags: - art size_categories: - 1K<n<10K --- # Bangumi Image Base of Blend S This is the image base of the bangumi Blend S; we detected 16 characters and 1863 images in total. The full dataset is [here](all.zip). **Please note that these image bases are not guaranteed to be 100% clean; they may still contain some noise.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability). Here is the characters' preview: | # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 | |:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------| | 0 | 436 | [Download](0/dataset.zip) | ![preview 1](0/preview_1.png) | ![preview 2](0/preview_2.png) | ![preview 3](0/preview_3.png) | ![preview 4](0/preview_4.png) | ![preview 5](0/preview_5.png) | ![preview 6](0/preview_6.png) | ![preview 7](0/preview_7.png) | ![preview 8](0/preview_8.png) | | 1 | 38 | [Download](1/dataset.zip) | ![preview 1](1/preview_1.png) | ![preview 2](1/preview_2.png) | ![preview 3](1/preview_3.png) | ![preview 4](1/preview_4.png) | ![preview 5](1/preview_5.png) | ![preview 6](1/preview_6.png) | ![preview 7](1/preview_7.png) | ![preview 8](1/preview_8.png) | | 2 | 30 | [Download](2/dataset.zip) | ![preview 1](2/preview_1.png) | ![preview 2](2/preview_2.png) | ![preview 3](2/preview_3.png) | ![preview 4](2/preview_4.png) | ![preview 5](2/preview_5.png) | ![preview 6](2/preview_6.png) | ![preview 7](2/preview_7.png) | ![preview 8](2/preview_8.png) | | 3 | 6 | [Download](3/dataset.zip) | ![preview 1](3/preview_1.png) | ![preview 2](3/preview_2.png) | ![preview 
3](3/preview_3.png) | ![preview 4](3/preview_4.png) | ![preview 5](3/preview_5.png) | ![preview 6](3/preview_6.png) | N/A | N/A | | 4 | 299 | [Download](4/dataset.zip) | ![preview 1](4/preview_1.png) | ![preview 2](4/preview_2.png) | ![preview 3](4/preview_3.png) | ![preview 4](4/preview_4.png) | ![preview 5](4/preview_5.png) | ![preview 6](4/preview_6.png) | ![preview 7](4/preview_7.png) | ![preview 8](4/preview_8.png) | | 5 | 222 | [Download](5/dataset.zip) | ![preview 1](5/preview_1.png) | ![preview 2](5/preview_2.png) | ![preview 3](5/preview_3.png) | ![preview 4](5/preview_4.png) | ![preview 5](5/preview_5.png) | ![preview 6](5/preview_6.png) | ![preview 7](5/preview_7.png) | ![preview 8](5/preview_8.png) | | 6 | 42 | [Download](6/dataset.zip) | ![preview 1](6/preview_1.png) | ![preview 2](6/preview_2.png) | ![preview 3](6/preview_3.png) | ![preview 4](6/preview_4.png) | ![preview 5](6/preview_5.png) | ![preview 6](6/preview_6.png) | ![preview 7](6/preview_7.png) | ![preview 8](6/preview_8.png) | | 7 | 20 | [Download](7/dataset.zip) | ![preview 1](7/preview_1.png) | ![preview 2](7/preview_2.png) | ![preview 3](7/preview_3.png) | ![preview 4](7/preview_4.png) | ![preview 5](7/preview_5.png) | ![preview 6](7/preview_6.png) | ![preview 7](7/preview_7.png) | ![preview 8](7/preview_8.png) | | 8 | 18 | [Download](8/dataset.zip) | ![preview 1](8/preview_1.png) | ![preview 2](8/preview_2.png) | ![preview 3](8/preview_3.png) | ![preview 4](8/preview_4.png) | ![preview 5](8/preview_5.png) | ![preview 6](8/preview_6.png) | ![preview 7](8/preview_7.png) | ![preview 8](8/preview_8.png) | | 9 | 187 | [Download](9/dataset.zip) | ![preview 1](9/preview_1.png) | ![preview 2](9/preview_2.png) | ![preview 3](9/preview_3.png) | ![preview 4](9/preview_4.png) | ![preview 5](9/preview_5.png) | ![preview 6](9/preview_6.png) | ![preview 7](9/preview_7.png) | ![preview 8](9/preview_8.png) | | 10 | 245 | [Download](10/dataset.zip) | ![preview 1](10/preview_1.png) | ![preview 
2](10/preview_2.png) | ![preview 3](10/preview_3.png) | ![preview 4](10/preview_4.png) | ![preview 5](10/preview_5.png) | ![preview 6](10/preview_6.png) | ![preview 7](10/preview_7.png) | ![preview 8](10/preview_8.png) | | 11 | 19 | [Download](11/dataset.zip) | ![preview 1](11/preview_1.png) | ![preview 2](11/preview_2.png) | ![preview 3](11/preview_3.png) | ![preview 4](11/preview_4.png) | ![preview 5](11/preview_5.png) | ![preview 6](11/preview_6.png) | ![preview 7](11/preview_7.png) | ![preview 8](11/preview_8.png) | | 12 | 12 | [Download](12/dataset.zip) | ![preview 1](12/preview_1.png) | ![preview 2](12/preview_2.png) | ![preview 3](12/preview_3.png) | ![preview 4](12/preview_4.png) | ![preview 5](12/preview_5.png) | ![preview 6](12/preview_6.png) | ![preview 7](12/preview_7.png) | ![preview 8](12/preview_8.png) | | 13 | 85 | [Download](13/dataset.zip) | ![preview 1](13/preview_1.png) | ![preview 2](13/preview_2.png) | ![preview 3](13/preview_3.png) | ![preview 4](13/preview_4.png) | ![preview 5](13/preview_5.png) | ![preview 6](13/preview_6.png) | ![preview 7](13/preview_7.png) | ![preview 8](13/preview_8.png) | | 14 | 114 | [Download](14/dataset.zip) | ![preview 1](14/preview_1.png) | ![preview 2](14/preview_2.png) | ![preview 3](14/preview_3.png) | ![preview 4](14/preview_4.png) | ![preview 5](14/preview_5.png) | ![preview 6](14/preview_6.png) | ![preview 7](14/preview_7.png) | ![preview 8](14/preview_8.png) | | noise | 90 | [Download](-1/dataset.zip) | ![preview 1](-1/preview_1.png) | ![preview 2](-1/preview_2.png) | ![preview 3](-1/preview_3.png) | ![preview 4](-1/preview_4.png) | ![preview 5](-1/preview_5.png) | ![preview 6](-1/preview_6.png) | ![preview 7](-1/preview_7.png) | ![preview 8](-1/preview_8.png) |
murukkuu/guanaco-llama2-200
2023-10-04T07:27:13.000Z
[ "region:us" ]
murukkuu
null
null
null
0
0
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 338808 num_examples: 200 download_size: 201257 dataset_size: 338808 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "guanaco-llama2-200" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
atom-in-the-universe/bild-98e6d176-54c2-4ebc-b55a-f19ed24b8dcb
2023-10-04T07:44:51.000Z
[ "region:us" ]
atom-in-the-universe
null
null
null
0
0
Entry not found
open-llm-leaderboard/details_hyunseoki__ko-en-llama2-13b
2023-10-04T07:34:35.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of hyunseoki/ko-en-llama2-13b dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [hyunseoki/ko-en-llama2-13b](https://huggingface.co/hyunseoki/ko-en-llama2-13b)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_hyunseoki__ko-en-llama2-13b\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T07:33:17.210034](https://huggingface.co/datasets/open-llm-leaderboard/details_hyunseoki__ko-en-llama2-13b/blob/main/results_2023-10-04T07-33-17.210034.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5220122335674698,\n\ \ \"acc_stderr\": 0.03470974074584197,\n \"acc_norm\": 0.5262898826435255,\n\ \ \"acc_norm_stderr\": 0.03468976944688372,\n \"mc1\": 0.26193390452876375,\n\ \ \"mc1_stderr\": 0.015392118805015023,\n \"mc2\": 0.3996110091058917,\n\ \ \"mc2_stderr\": 0.013538590385255279\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5392491467576792,\n \"acc_stderr\": 0.014566303676636588,\n\ \ \"acc_norm\": 0.5819112627986348,\n \"acc_norm_stderr\": 0.014413988396996077\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6091416052579167,\n\ \ \"acc_stderr\": 0.004869455150933827,\n \"acc_norm\": 0.8188607847042422,\n\ \ \"acc_norm_stderr\": 0.0038434637920379223\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\ \ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\ \ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\ \ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.040463368839782514,\n\ \ \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.040463368839782514\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\ \ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \ \ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.5433962264150943,\n \"acc_stderr\": 0.03065674869673943,\n\ \ \"acc_norm\": 0.5433962264150943,\n \"acc_norm_stderr\": 0.03065674869673943\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n\ \ \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.5902777777777778,\n\ \ \"acc_norm_stderr\": 0.04112490974670787\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \ \ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\"\ : 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.44508670520231214,\n\ \ \"acc_stderr\": 0.03789401760283648,\n \"acc_norm\": 0.44508670520231214,\n\ \ \"acc_norm_stderr\": 0.03789401760283648\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\ \ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n\ \ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400351,\n\ \ \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400351\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\ \ \"acc_stderr\": 0.0433913832257986,\n \"acc_norm\": 0.30701754385964913,\n\ \ \"acc_norm_stderr\": 0.0433913832257986\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.04164188720169377,\n\ \ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.04164188720169377\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.3201058201058201,\n \"acc_stderr\": 0.024026846392873506,\n \"\ acc_norm\": 0.3201058201058201,\n 
\"acc_norm_stderr\": 0.024026846392873506\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\ \ \"acc_stderr\": 0.038932596106046755,\n \"acc_norm\": 0.25396825396825395,\n\ \ \"acc_norm_stderr\": 0.038932596106046755\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421255,\n \ \ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421255\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.6193548387096774,\n \"acc_stderr\": 0.027621717832907036,\n \"\ acc_norm\": 0.6193548387096774,\n \"acc_norm_stderr\": 0.027621717832907036\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162934,\n \"\ acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162934\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\ : 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.037937131711656344,\n\ \ \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.037937131711656344\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\"\ : 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n\ \ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \ \ \"acc\": 0.7564766839378239,\n \"acc_stderr\": 0.030975436386845436,\n\ \ \"acc_norm\": 0.7564766839378239,\n \"acc_norm_stderr\": 0.030975436386845436\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.47692307692307695,\n \"acc_stderr\": 0.025323990861736118,\n\ \ \"acc_norm\": 0.47692307692307695,\n \"acc_norm_stderr\": 0.025323990861736118\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945277,\n \ \ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945277\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.542016806722689,\n \"acc_stderr\": 0.03236361111951941,\n \ \ \"acc_norm\": 0.542016806722689,\n \"acc_norm_stderr\": 0.03236361111951941\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\ acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7009174311926606,\n \"acc_stderr\": 0.019630417285415175,\n \"\ acc_norm\": 0.7009174311926606,\n \"acc_norm_stderr\": 0.019630417285415175\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.41203703703703703,\n \"acc_stderr\": 0.03356787758160835,\n \"\ acc_norm\": 0.41203703703703703,\n \"acc_norm_stderr\": 0.03356787758160835\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.6813725490196079,\n \"acc_stderr\": 0.0327028718148208,\n \"acc_norm\"\ : 0.6813725490196079,\n \"acc_norm_stderr\": 0.0327028718148208\n },\n\ \ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\ \ 0.6624472573839663,\n \"acc_stderr\": 0.030781549102026223,\n \"\ acc_norm\": 0.6624472573839663,\n \"acc_norm_stderr\": 0.030781549102026223\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5964125560538116,\n\ \ \"acc_stderr\": 0.03292802819330315,\n \"acc_norm\": 0.5964125560538116,\n\ \ \"acc_norm_stderr\": 0.03292802819330315\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n\ \ \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\"\ : 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n\ \ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6296296296296297,\n\ \ \"acc_stderr\": 0.04668408033024931,\n \"acc_norm\": 0.6296296296296297,\n\ \ \"acc_norm_stderr\": 0.04668408033024931\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.6380368098159509,\n \"acc_stderr\": 0.037757007291414416,\n\ \ \"acc_norm\": 0.6380368098159509,\n \"acc_norm_stderr\": 0.037757007291414416\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\ \ \"acc_stderr\": 0.04287858751340456,\n \"acc_norm\": 0.2857142857142857,\n\ \ \"acc_norm_stderr\": 0.04287858751340456\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\ \ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7478632478632479,\n\ \ \"acc_stderr\": 0.02844796547623102,\n \"acc_norm\": 0.7478632478632479,\n\ \ \"acc_norm_stderr\": 0.02844796547623102\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \ \ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7292464878671775,\n\ \ \"acc_stderr\": 0.015889888362560486,\n \"acc_norm\": 0.7292464878671775,\n\ \ \"acc_norm_stderr\": 0.015889888362560486\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6040462427745664,\n \"acc_stderr\": 0.02632981334194625,\n\ \ \"acc_norm\": 0.6040462427745664,\n \"acc_norm_stderr\": 0.02632981334194625\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2670391061452514,\n\ \ \"acc_stderr\": 0.014796502622562557,\n \"acc_norm\": 
0.2670391061452514,\n\ \ \"acc_norm_stderr\": 0.014796502622562557\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.02799672318063146,\n\ \ \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.02799672318063146\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6141479099678456,\n\ \ \"acc_stderr\": 0.027648149599751464,\n \"acc_norm\": 0.6141479099678456,\n\ \ \"acc_norm_stderr\": 0.027648149599751464\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6141975308641975,\n \"acc_stderr\": 0.027085401226132143,\n\ \ \"acc_norm\": 0.6141975308641975,\n \"acc_norm_stderr\": 0.027085401226132143\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4078014184397163,\n \"acc_stderr\": 0.029316011776343555,\n \ \ \"acc_norm\": 0.4078014184397163,\n \"acc_norm_stderr\": 0.029316011776343555\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38461538461538464,\n\ \ \"acc_stderr\": 0.012425548416302942,\n \"acc_norm\": 0.38461538461538464,\n\ \ \"acc_norm_stderr\": 0.012425548416302942\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.030134614954403924,\n \ \ \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.030134614954403924\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.477124183006536,\n \"acc_stderr\": 0.02020665318788479,\n \ \ \"acc_norm\": 0.477124183006536,\n \"acc_norm_stderr\": 0.02020665318788479\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5727272727272728,\n\ \ \"acc_stderr\": 0.04738198703545483,\n \"acc_norm\": 0.5727272727272728,\n\ \ \"acc_norm_stderr\": 0.04738198703545483\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.5755102040816327,\n \"acc_stderr\": 0.031642094879429414,\n\ \ \"acc_norm\": 0.5755102040816327,\n \"acc_norm_stderr\": 0.031642094879429414\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n\ \ \"acc_stderr\": 0.03152439186555402,\n \"acc_norm\": 0.7263681592039801,\n\ \ \"acc_norm_stderr\": 0.03152439186555402\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \ \ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\ \ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.4397590361445783,\n\ \ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.7485380116959064,\n \"acc_stderr\": 0.033275044238468436,\n\ \ \"acc_norm\": 0.7485380116959064,\n \"acc_norm_stderr\": 0.033275044238468436\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26193390452876375,\n\ \ \"mc1_stderr\": 0.015392118805015023,\n \"mc2\": 0.3996110091058917,\n\ \ \"mc2_stderr\": 0.013538590385255279\n }\n}\n```" repo_url: https://huggingface.co/hyunseoki/ko-en-llama2-13b leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|arc:challenge|25_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hellaswag|10_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-33-17.210034.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-33-17.210034.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-33-17.210034.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-33-17.210034.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-33-17.210034.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-33-17.210034.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-33-17.210034.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-33-17.210034.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-33-17.210034.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T07_33_17.210034 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T07-33-17.210034.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T07-33-17.210034.parquet' - config_name: results data_files: - split: 2023_10_04T07_33_17.210034 path: - results_2023-10-04T07-33-17.210034.parquet - split: latest path: - results_2023-10-04T07-33-17.210034.parquet --- # Dataset Card for Evaluation run of hyunseoki/ko-en-llama2-13b ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/hyunseoki/ko-en-llama2-13b - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[hyunseoki/ko-en-llama2-13b](https://huggingface.co/hyunseoki/ko-en-llama2-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_hyunseoki__ko-en-llama2-13b", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-10-04T07:33:17.210034](https://huggingface.co/datasets/open-llm-leaderboard/details_hyunseoki__ko-en-llama2-13b/blob/main/results_2023-10-04T07-33-17.210034.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5220122335674698, "acc_stderr": 0.03470974074584197, "acc_norm": 0.5262898826435255, "acc_norm_stderr": 0.03468976944688372, "mc1": 0.26193390452876375, "mc1_stderr": 0.015392118805015023, "mc2": 0.3996110091058917, "mc2_stderr": 0.013538590385255279 }, "harness|arc:challenge|25": { "acc": 0.5392491467576792, "acc_stderr": 0.014566303676636588, "acc_norm": 0.5819112627986348, "acc_norm_stderr": 0.014413988396996077 }, "harness|hellaswag|10": { "acc": 0.6091416052579167, "acc_stderr": 0.004869455150933827, "acc_norm": 0.8188607847042422, "acc_norm_stderr": 0.0038434637920379223 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.562962962962963, "acc_stderr": 0.04284958639753401, "acc_norm": 0.562962962962963, "acc_norm_stderr": 0.04284958639753401 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5526315789473685, "acc_stderr": 0.040463368839782514, "acc_norm": 0.5526315789473685, "acc_norm_stderr": 0.040463368839782514 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5433962264150943, "acc_stderr": 0.03065674869673943, "acc_norm": 0.5433962264150943, "acc_norm_stderr": 0.03065674869673943 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5902777777777778, "acc_stderr": 0.04112490974670787, "acc_norm": 0.5902777777777778, "acc_norm_stderr": 0.04112490974670787 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.39, "acc_stderr": 0.04902071300001974, "acc_norm": 0.39, "acc_norm_stderr": 
0.04902071300001974 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.29, "acc_stderr": 0.04560480215720684, "acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.44508670520231214, "acc_stderr": 0.03789401760283648, "acc_norm": 0.44508670520231214, "acc_norm_stderr": 0.03789401760283648 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.22549019607843138, "acc_stderr": 0.041583075330832865, "acc_norm": 0.22549019607843138, "acc_norm_stderr": 0.041583075330832865 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4127659574468085, "acc_stderr": 0.03218471141400351, "acc_norm": 0.4127659574468085, "acc_norm_stderr": 0.03218471141400351 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.30701754385964913, "acc_stderr": 0.0433913832257986, "acc_norm": 0.30701754385964913, "acc_norm_stderr": 0.0433913832257986 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.4827586206896552, "acc_stderr": 0.04164188720169377, "acc_norm": 0.4827586206896552, "acc_norm_stderr": 0.04164188720169377 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3201058201058201, "acc_stderr": 0.024026846392873506, "acc_norm": 0.3201058201058201, "acc_norm_stderr": 0.024026846392873506 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.25396825396825395, "acc_stderr": 0.038932596106046755, "acc_norm": 0.25396825396825395, "acc_norm_stderr": 0.038932596106046755 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.28, "acc_stderr": 0.045126085985421255, "acc_norm": 0.28, "acc_norm_stderr": 0.045126085985421255 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6193548387096774, "acc_stderr": 0.027621717832907036, "acc_norm": 0.6193548387096774, "acc_norm_stderr": 0.027621717832907036 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4433497536945813, "acc_stderr": 0.03495334582162934, "acc_norm": 0.4433497536945813, "acc_norm_stderr": 0.03495334582162934 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6181818181818182, "acc_stderr": 0.037937131711656344, "acc_norm": 0.6181818181818182, "acc_norm_stderr": 0.037937131711656344 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.702020202020202, "acc_stderr": 0.03258630383836556, "acc_norm": 0.702020202020202, "acc_norm_stderr": 0.03258630383836556 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7564766839378239, "acc_stderr": 0.030975436386845436, "acc_norm": 0.7564766839378239, "acc_norm_stderr": 0.030975436386845436 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.47692307692307695, "acc_stderr": 0.025323990861736118, "acc_norm": 0.47692307692307695, "acc_norm_stderr": 0.025323990861736118 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2814814814814815, "acc_stderr": 0.027420019350945277, "acc_norm": 0.2814814814814815, "acc_norm_stderr": 0.027420019350945277 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.542016806722689, "acc_stderr": 0.03236361111951941, "acc_norm": 0.542016806722689, "acc_norm_stderr": 0.03236361111951941 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 0.03861557546255169, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.03861557546255169 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7009174311926606, "acc_stderr": 0.019630417285415175, "acc_norm": 0.7009174311926606, "acc_norm_stderr": 0.019630417285415175 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.41203703703703703, "acc_stderr": 
0.03356787758160835, "acc_norm": 0.41203703703703703, "acc_norm_stderr": 0.03356787758160835 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6813725490196079, "acc_stderr": 0.0327028718148208, "acc_norm": 0.6813725490196079, "acc_norm_stderr": 0.0327028718148208 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.6624472573839663, "acc_stderr": 0.030781549102026223, "acc_norm": 0.6624472573839663, "acc_norm_stderr": 0.030781549102026223 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5964125560538116, "acc_stderr": 0.03292802819330315, "acc_norm": 0.5964125560538116, "acc_norm_stderr": 0.03292802819330315 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6564885496183206, "acc_stderr": 0.041649760719448786, "acc_norm": 0.6564885496183206, "acc_norm_stderr": 0.041649760719448786 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6446280991735537, "acc_stderr": 0.0436923632657398, "acc_norm": 0.6446280991735537, "acc_norm_stderr": 0.0436923632657398 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6296296296296297, "acc_stderr": 0.04668408033024931, "acc_norm": 0.6296296296296297, "acc_norm_stderr": 0.04668408033024931 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6380368098159509, "acc_stderr": 0.037757007291414416, "acc_norm": 0.6380368098159509, "acc_norm_stderr": 0.037757007291414416 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.2857142857142857, "acc_stderr": 0.04287858751340456, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.04287858751340456 }, "harness|hendrycksTest-management|5": { "acc": 0.7475728155339806, "acc_stderr": 0.04301250399690878, "acc_norm": 0.7475728155339806, "acc_norm_stderr": 0.04301250399690878 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7478632478632479, "acc_stderr": 0.02844796547623102, "acc_norm": 0.7478632478632479, "acc_norm_stderr": 0.02844796547623102 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.51, "acc_stderr": 
0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7292464878671775, "acc_stderr": 0.015889888362560486, "acc_norm": 0.7292464878671775, "acc_norm_stderr": 0.015889888362560486 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6040462427745664, "acc_stderr": 0.02632981334194625, "acc_norm": 0.6040462427745664, "acc_norm_stderr": 0.02632981334194625 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2670391061452514, "acc_stderr": 0.014796502622562557, "acc_norm": 0.2670391061452514, "acc_norm_stderr": 0.014796502622562557 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6045751633986928, "acc_stderr": 0.02799672318063146, "acc_norm": 0.6045751633986928, "acc_norm_stderr": 0.02799672318063146 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6141479099678456, "acc_stderr": 0.027648149599751464, "acc_norm": 0.6141479099678456, "acc_norm_stderr": 0.027648149599751464 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6141975308641975, "acc_stderr": 0.027085401226132143, "acc_norm": 0.6141975308641975, "acc_norm_stderr": 0.027085401226132143 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4078014184397163, "acc_stderr": 0.029316011776343555, "acc_norm": 0.4078014184397163, "acc_norm_stderr": 0.029316011776343555 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.38461538461538464, "acc_stderr": 0.012425548416302942, "acc_norm": 0.38461538461538464, "acc_norm_stderr": 0.012425548416302942 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.4375, "acc_stderr": 0.030134614954403924, "acc_norm": 0.4375, "acc_norm_stderr": 0.030134614954403924 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.477124183006536, "acc_stderr": 0.02020665318788479, "acc_norm": 0.477124183006536, "acc_norm_stderr": 0.02020665318788479 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5727272727272728, "acc_stderr": 0.04738198703545483, 
"acc_norm": 0.5727272727272728, "acc_norm_stderr": 0.04738198703545483 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5755102040816327, "acc_stderr": 0.031642094879429414, "acc_norm": 0.5755102040816327, "acc_norm_stderr": 0.031642094879429414 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7263681592039801, "acc_stderr": 0.03152439186555402, "acc_norm": 0.7263681592039801, "acc_norm_stderr": 0.03152439186555402 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.77, "acc_stderr": 0.042295258468165065, "acc_norm": 0.77, "acc_norm_stderr": 0.042295258468165065 }, "harness|hendrycksTest-virology|5": { "acc": 0.4397590361445783, "acc_stderr": 0.03864139923699122, "acc_norm": 0.4397590361445783, "acc_norm_stderr": 0.03864139923699122 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7485380116959064, "acc_stderr": 0.033275044238468436, "acc_norm": 0.7485380116959064, "acc_norm_stderr": 0.033275044238468436 }, "harness|truthfulqa:mc|0": { "mc1": 0.26193390452876375, "mc1_stderr": 0.015392118805015023, "mc2": 0.3996110091058917, "mc2_stderr": 0.013538590385255279 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
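The repo id used in the `load_dataset` snippet above follows the leaderboard's `details_<org>__<model>` naming pattern, with the `/` in the Hub model id replaced by `__` (compare `hyunseoki/ko-en-llama2-13b` with `details_hyunseoki__ko-en-llama2-13b`). A minimal sketch of that convention — the helper name is our own, not part of any library:

```python
def details_repo_id(model_id: str) -> str:
    """Map a Hub model id like 'hyunseoki/ko-en-llama2-13b' to the
    corresponding leaderboard details dataset id."""
    org, _, name = model_id.partition("/")
    return f"open-llm-leaderboard/details_{org}__{name}"

print(details_repo_id("hyunseoki/ko-en-llama2-13b"))
# open-llm-leaderboard/details_hyunseoki__ko-en-llama2-13b
```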
atom-in-the-universe/bild-f5a972f1-e267-4b71-a39d-135395a00ceb
2023-10-04T07:57:20.000Z
[ "region:us" ]
atom-in-the-universe
null
null
null
0
0
Entry not found
open-llm-leaderboard/details_Devio__test-9k-fn
2023-10-04T07:46:38.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of Devio/test-9k-fn dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Devio/test-9k-fn](https://huggingface.co/Devio/test-9k-fn) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Devio__test-9k-fn\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T07:45:21.870360](https://huggingface.co/datasets/open-llm-leaderboard/details_Devio__test-9k-fn/blob/main/results_2023-10-04T07-45-21.870360.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.29934767850730726,\n\ \ \"acc_stderr\": 0.033158996935405735,\n \"acc_norm\": 0.3034282752932227,\n\ \ \"acc_norm_stderr\": 0.03315857474708369,\n \"mc1\": 0.23378212974296206,\n\ \ \"mc1_stderr\": 0.014816195991931583,\n \"mc2\": 0.3914546223201993,\n\ \ \"mc2_stderr\": 0.013969580332280395\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.35665529010238906,\n \"acc_stderr\": 0.013998056902620192,\n\ \ \"acc_norm\": 0.4087030716723549,\n \"acc_norm_stderr\": 0.014365750345427008\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5057757418840868,\n\ \ \"acc_stderr\": 0.004989448490164429,\n \"acc_norm\": 0.6944831706831308,\n\ \ \"acc_norm_stderr\": 0.004596845936356623\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \ \ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.28888888888888886,\n\ \ \"acc_stderr\": 0.0391545063041425,\n \"acc_norm\": 0.28888888888888886,\n\ \ \"acc_norm_stderr\": 0.0391545063041425\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.3092105263157895,\n \"acc_stderr\": 0.037610708698674805,\n\ \ \"acc_norm\": 0.3092105263157895,\n \"acc_norm_stderr\": 0.037610708698674805\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.34,\n\ \ \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \ \ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.3018867924528302,\n \"acc_stderr\": 0.028254200344438655,\n\ \ \"acc_norm\": 0.3018867924528302,\n \"acc_norm_stderr\": 0.028254200344438655\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2986111111111111,\n\ \ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.2986111111111111,\n\ \ \"acc_norm_stderr\": 
0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\"\ : 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \ \ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2774566473988439,\n\ \ \"acc_stderr\": 0.03414014007044037,\n \"acc_norm\": 0.2774566473988439,\n\ \ \"acc_norm_stderr\": 0.03414014007044037\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179963,\n\ \ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179963\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\ \ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.32340425531914896,\n \"acc_stderr\": 0.030579442773610337,\n\ \ \"acc_norm\": 0.32340425531914896,\n \"acc_norm_stderr\": 0.030579442773610337\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.20175438596491227,\n\ \ \"acc_stderr\": 0.037752050135836386,\n \"acc_norm\": 0.20175438596491227,\n\ \ \"acc_norm_stderr\": 0.037752050135836386\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.03855289616378947,\n\ \ \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.03855289616378947\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.22486772486772486,\n \"acc_stderr\": 0.02150209607822914,\n \"\ acc_norm\": 
0.22486772486772486,\n \"acc_norm_stderr\": 0.02150209607822914\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\ \ \"acc_stderr\": 0.041905964388711366,\n \"acc_norm\": 0.3253968253968254,\n\ \ \"acc_norm_stderr\": 0.041905964388711366\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2709677419354839,\n\ \ \"acc_stderr\": 0.025284416114900156,\n \"acc_norm\": 0.2709677419354839,\n\ \ \"acc_norm_stderr\": 0.025284416114900156\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.18719211822660098,\n \"acc_stderr\": 0.027444924966882618,\n\ \ \"acc_norm\": 0.18719211822660098,\n \"acc_norm_stderr\": 0.027444924966882618\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\"\ : 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.21212121212121213,\n \"acc_stderr\": 0.03192271569548299,\n\ \ \"acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.03192271569548299\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.3181818181818182,\n \"acc_stderr\": 0.03318477333845331,\n \"\ acc_norm\": 0.3181818181818182,\n \"acc_norm_stderr\": 0.03318477333845331\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.35751295336787564,\n \"acc_stderr\": 0.03458816042181005,\n\ \ \"acc_norm\": 0.35751295336787564,\n \"acc_norm_stderr\": 0.03458816042181005\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.3717948717948718,\n \"acc_stderr\": 0.024503472557110936,\n\ \ \"acc_norm\": 0.3717948717948718,\n \"acc_norm_stderr\": 0.024503472557110936\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \ \ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.2815126050420168,\n \"acc_stderr\": 0.029213549414372163,\n\ \ \"acc_norm\": 0.2815126050420168,\n \"acc_norm_stderr\": 0.029213549414372163\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\ acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.3211009174311927,\n \"acc_stderr\": 0.020018149772733747,\n \"\ acc_norm\": 0.3211009174311927,\n \"acc_norm_stderr\": 0.020018149772733747\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.3611111111111111,\n \"acc_stderr\": 0.032757734861009996,\n \"\ acc_norm\": 0.3611111111111111,\n \"acc_norm_stderr\": 0.032757734861009996\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.2647058823529412,\n \"acc_stderr\": 0.030964517926923403,\n \"\ acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.030964517926923403\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.3333333333333333,\n \"acc_stderr\": 0.030685820596610795,\n \ \ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.030685820596610795\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34080717488789236,\n\ \ \"acc_stderr\": 0.031811497470553604,\n \"acc_norm\": 0.34080717488789236,\n\ \ \"acc_norm_stderr\": 0.031811497470553604\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.3282442748091603,\n \"acc_stderr\": 0.04118438565806298,\n\ \ \"acc_norm\": 0.3282442748091603,\n \"acc_norm_stderr\": 0.04118438565806298\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.2644628099173554,\n \"acc_stderr\": 0.04026187527591205,\n \"\ acc_norm\": 0.2644628099173554,\n \"acc_norm_stderr\": 0.04026187527591205\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\ \ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\ \ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.25153374233128833,\n \"acc_stderr\": 0.034089978868575295,\n\ \ \"acc_norm\": 0.25153374233128833,\n \"acc_norm_stderr\": 0.034089978868575295\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\ \ \"acc_stderr\": 0.04572372358737431,\n \"acc_norm\": 0.36607142857142855,\n\ \ \"acc_norm_stderr\": 0.04572372358737431\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.23300970873786409,\n \"acc_stderr\": 0.041858325989283136,\n\ \ \"acc_norm\": 0.23300970873786409,\n \"acc_norm_stderr\": 0.041858325989283136\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3974358974358974,\n\ \ \"acc_stderr\": 0.032059534537892925,\n \"acc_norm\": 0.3974358974358974,\n\ \ \"acc_norm_stderr\": 0.032059534537892925\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2554278416347382,\n\ \ \"acc_stderr\": 0.015594955384455772,\n \"acc_norm\": 0.2554278416347382,\n\ \ \"acc_norm_stderr\": 0.015594955384455772\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.2774566473988439,\n \"acc_stderr\": 0.024105712607754307,\n\ \ \"acc_norm\": 0.2774566473988439,\n \"acc_norm_stderr\": 0.024105712607754307\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\ \ \"acc_stderr\": 0.014242630070574877,\n 
\"acc_norm\": 0.23798882681564246,\n\ \ \"acc_norm_stderr\": 0.014242630070574877\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.02795604616542451,\n\ \ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.02795604616542451\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2604501607717042,\n\ \ \"acc_stderr\": 0.02492672322484555,\n \"acc_norm\": 0.2604501607717042,\n\ \ \"acc_norm_stderr\": 0.02492672322484555\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02438366553103545,\n\ \ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02438366553103545\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.2801418439716312,\n \"acc_stderr\": 0.026789172351140242,\n \ \ \"acc_norm\": 0.2801418439716312,\n \"acc_norm_stderr\": 0.026789172351140242\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2666232073011734,\n\ \ \"acc_stderr\": 0.011293836031612131,\n \"acc_norm\": 0.2666232073011734,\n\ \ \"acc_norm_stderr\": 0.011293836031612131\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.31985294117647056,\n \"acc_stderr\": 0.02833295951403124,\n\ \ \"acc_norm\": 0.31985294117647056,\n \"acc_norm_stderr\": 0.02833295951403124\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.2549019607843137,\n \"acc_stderr\": 0.017630827375148383,\n \ \ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.017630827375148383\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2818181818181818,\n\ \ \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.2818181818181818,\n\ \ \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.3306122448979592,\n \"acc_stderr\": 0.030116426296540592,\n\ \ \"acc_norm\": 0.3306122448979592,\n \"acc_norm_stderr\": 0.030116426296540592\n\ 
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.31840796019900497,\n\ \ \"acc_stderr\": 0.03294118479054096,\n \"acc_norm\": 0.31840796019900497,\n\ \ \"acc_norm_stderr\": 0.03294118479054096\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\ \ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.30120481927710846,\n\ \ \"acc_stderr\": 0.0357160923005348,\n \"acc_norm\": 0.30120481927710846,\n\ \ \"acc_norm_stderr\": 0.0357160923005348\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.32748538011695905,\n \"acc_stderr\": 0.035993357714560276,\n\ \ \"acc_norm\": 0.32748538011695905,\n \"acc_norm_stderr\": 0.035993357714560276\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n\ \ \"mc1_stderr\": 0.014816195991931583,\n \"mc2\": 0.3914546223201993,\n\ \ \"mc2_stderr\": 0.013969580332280395\n }\n}\n```" repo_url: https://huggingface.co/Devio/test-9k-fn leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|arc:challenge|25_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hellaswag|10_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-45-21.870360.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-45-21.870360.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-45-21.870360.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-45-21.870360.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-45-21.870360.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-45-21.870360.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-45-21.870360.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-45-21.870360.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-45-21.870360.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T07_45_21.870360 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T07-45-21.870360.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T07-45-21.870360.parquet' - config_name: results data_files: - split: 2023_10_04T07_45_21.870360 path: - results_2023-10-04T07-45-21.870360.parquet - split: latest path: - results_2023-10-04T07-45-21.870360.parquet --- # Dataset Card for Evaluation run of Devio/test-9k-fn ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Devio/test-9k-fn - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [Devio/test-9k-fn](https://huggingface.co/Devio/test-9k-fn) on 
the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Devio__test-9k-fn",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-04T07:45:21.870360](https://huggingface.co/datasets/open-llm-leaderboard/details_Devio__test-9k-fn/blob/main/results_2023-10-04T07-45-21.870360.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.29934767850730726, "acc_stderr": 0.033158996935405735, "acc_norm": 0.3034282752932227, "acc_norm_stderr": 0.03315857474708369, "mc1": 0.23378212974296206, "mc1_stderr": 0.014816195991931583, "mc2": 0.3914546223201993, "mc2_stderr": 0.013969580332280395 }, "harness|arc:challenge|25": { "acc": 0.35665529010238906, "acc_stderr": 0.013998056902620192, "acc_norm": 0.4087030716723549, "acc_norm_stderr": 0.014365750345427008 }, "harness|hellaswag|10": { "acc": 0.5057757418840868, "acc_stderr": 0.004989448490164429, "acc_norm": 0.6944831706831308, "acc_norm_stderr": 0.004596845936356623 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.23, "acc_stderr": 0.04229525846816506, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816506 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.28888888888888886, "acc_stderr": 0.0391545063041425, "acc_norm": 0.28888888888888886, "acc_norm_stderr": 0.0391545063041425 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.3092105263157895, "acc_stderr": 0.037610708698674805, "acc_norm": 0.3092105263157895, "acc_norm_stderr": 0.037610708698674805 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.3018867924528302, "acc_stderr": 0.028254200344438655, "acc_norm": 0.3018867924528302, "acc_norm_stderr": 0.028254200344438655 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2986111111111111, "acc_stderr": 0.03827052357950756, "acc_norm": 0.2986111111111111, "acc_norm_stderr": 0.03827052357950756 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, 
"acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.27, "acc_stderr": 0.044619604333847394, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.2774566473988439, "acc_stderr": 0.03414014007044037, "acc_norm": 0.2774566473988439, "acc_norm_stderr": 0.03414014007044037 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.19607843137254902, "acc_stderr": 0.03950581861179963, "acc_norm": 0.19607843137254902, "acc_norm_stderr": 0.03950581861179963 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.32340425531914896, "acc_stderr": 0.030579442773610337, "acc_norm": 0.32340425531914896, "acc_norm_stderr": 0.030579442773610337 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.20175438596491227, "acc_stderr": 0.037752050135836386, "acc_norm": 0.20175438596491227, "acc_norm_stderr": 0.037752050135836386 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.3103448275862069, "acc_stderr": 0.03855289616378947, "acc_norm": 0.3103448275862069, "acc_norm_stderr": 0.03855289616378947 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.22486772486772486, "acc_stderr": 0.02150209607822914, "acc_norm": 0.22486772486772486, "acc_norm_stderr": 0.02150209607822914 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3253968253968254, "acc_stderr": 0.041905964388711366, "acc_norm": 0.3253968253968254, "acc_norm_stderr": 0.041905964388711366 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.2709677419354839, "acc_stderr": 0.025284416114900156, "acc_norm": 0.2709677419354839, "acc_norm_stderr": 0.025284416114900156 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.18719211822660098, "acc_stderr": 0.027444924966882618, "acc_norm": 0.18719211822660098, "acc_norm_stderr": 0.027444924966882618 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.33, "acc_stderr": 0.04725815626252605, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252605 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.21212121212121213, "acc_stderr": 0.03192271569548299, "acc_norm": 0.21212121212121213, "acc_norm_stderr": 0.03192271569548299 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.3181818181818182, "acc_stderr": 0.03318477333845331, "acc_norm": 0.3181818181818182, "acc_norm_stderr": 0.03318477333845331 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.35751295336787564, "acc_stderr": 0.03458816042181005, "acc_norm": 0.35751295336787564, "acc_norm_stderr": 0.03458816042181005 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.3717948717948718, "acc_stderr": 0.024503472557110936, "acc_norm": 0.3717948717948718, "acc_norm_stderr": 0.024503472557110936 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.28888888888888886, "acc_stderr": 0.027634907264178544, "acc_norm": 0.28888888888888886, "acc_norm_stderr": 0.027634907264178544 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.2815126050420168, "acc_stderr": 0.029213549414372163, "acc_norm": 0.2815126050420168, "acc_norm_stderr": 0.029213549414372163 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2781456953642384, "acc_stderr": 0.03658603262763743, "acc_norm": 0.2781456953642384, "acc_norm_stderr": 0.03658603262763743 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.3211009174311927, "acc_stderr": 0.020018149772733747, "acc_norm": 0.3211009174311927, "acc_norm_stderr": 0.020018149772733747 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.3611111111111111, "acc_stderr": 
0.032757734861009996, "acc_norm": 0.3611111111111111, "acc_norm_stderr": 0.032757734861009996 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.2647058823529412, "acc_stderr": 0.030964517926923403, "acc_norm": 0.2647058823529412, "acc_norm_stderr": 0.030964517926923403 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.3333333333333333, "acc_stderr": 0.030685820596610795, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.030685820596610795 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.34080717488789236, "acc_stderr": 0.031811497470553604, "acc_norm": 0.34080717488789236, "acc_norm_stderr": 0.031811497470553604 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.3282442748091603, "acc_stderr": 0.04118438565806298, "acc_norm": 0.3282442748091603, "acc_norm_stderr": 0.04118438565806298 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2644628099173554, "acc_stderr": 0.04026187527591205, "acc_norm": 0.2644628099173554, "acc_norm_stderr": 0.04026187527591205 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.25925925925925924, "acc_stderr": 0.042365112580946336, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.042365112580946336 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.25153374233128833, "acc_stderr": 0.034089978868575295, "acc_norm": 0.25153374233128833, "acc_norm_stderr": 0.034089978868575295 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.36607142857142855, "acc_stderr": 0.04572372358737431, "acc_norm": 0.36607142857142855, "acc_norm_stderr": 0.04572372358737431 }, "harness|hendrycksTest-management|5": { "acc": 0.23300970873786409, "acc_stderr": 0.041858325989283136, "acc_norm": 0.23300970873786409, "acc_norm_stderr": 0.041858325989283136 }, "harness|hendrycksTest-marketing|5": { "acc": 0.3974358974358974, "acc_stderr": 0.032059534537892925, "acc_norm": 0.3974358974358974, "acc_norm_stderr": 0.032059534537892925 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.3, 
"acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.2554278416347382, "acc_stderr": 0.015594955384455772, "acc_norm": 0.2554278416347382, "acc_norm_stderr": 0.015594955384455772 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.2774566473988439, "acc_stderr": 0.024105712607754307, "acc_norm": 0.2774566473988439, "acc_norm_stderr": 0.024105712607754307 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.23798882681564246, "acc_stderr": 0.014242630070574877, "acc_norm": 0.23798882681564246, "acc_norm_stderr": 0.014242630070574877 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.39215686274509803, "acc_stderr": 0.02795604616542451, "acc_norm": 0.39215686274509803, "acc_norm_stderr": 0.02795604616542451 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.2604501607717042, "acc_stderr": 0.02492672322484555, "acc_norm": 0.2604501607717042, "acc_norm_stderr": 0.02492672322484555 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.25925925925925924, "acc_stderr": 0.02438366553103545, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.02438366553103545 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.2801418439716312, "acc_stderr": 0.026789172351140242, "acc_norm": 0.2801418439716312, "acc_norm_stderr": 0.026789172351140242 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2666232073011734, "acc_stderr": 0.011293836031612131, "acc_norm": 0.2666232073011734, "acc_norm_stderr": 0.011293836031612131 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.31985294117647056, "acc_stderr": 0.02833295951403124, "acc_norm": 0.31985294117647056, "acc_norm_stderr": 0.02833295951403124 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.2549019607843137, "acc_stderr": 0.017630827375148383, "acc_norm": 0.2549019607843137, "acc_norm_stderr": 0.017630827375148383 }, "harness|hendrycksTest-public_relations|5": { "acc": 
0.2818181818181818, "acc_stderr": 0.04309118709946458, "acc_norm": 0.2818181818181818, "acc_norm_stderr": 0.04309118709946458 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.3306122448979592, "acc_stderr": 0.030116426296540592, "acc_norm": 0.3306122448979592, "acc_norm_stderr": 0.030116426296540592 }, "harness|hendrycksTest-sociology|5": { "acc": 0.31840796019900497, "acc_stderr": 0.03294118479054096, "acc_norm": 0.31840796019900497, "acc_norm_stderr": 0.03294118479054096 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-virology|5": { "acc": 0.30120481927710846, "acc_stderr": 0.0357160923005348, "acc_norm": 0.30120481927710846, "acc_norm_stderr": 0.0357160923005348 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.32748538011695905, "acc_stderr": 0.035993357714560276, "acc_norm": 0.32748538011695905, "acc_norm_stderr": 0.035993357714560276 }, "harness|truthfulqa:mc|0": { "mc1": 0.23378212974296206, "mc1_stderr": 0.014816195991931583, "mc2": 0.3914546223201993, "mc2_stderr": 0.013969580332280395 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
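The per-task entries in the results JSON above all follow the `harness|<task>|<n_shots>` naming scheme, so task-level accuracies can be filtered and averaged directly from the loaded records. A minimal sketch in plain Python, using a hypothetical three-task subset of the values shown above (the real aggregate in the `all` block averages over every evaluated task, not just these three):

```python
# Hedged sketch: averaging per-task accuracies from a results dict.
# The dict below is a hypothetical three-task subset of the JSON above,
# not the full set of evaluated tasks.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.23},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.28888888888888886},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.3092105263157895},
}

# Task names follow the pattern harness|<task>|<n_shots>, so a prefix
# match selects all MMLU (hendrycksTest) subtasks.
mmlu = [k for k in results if k.startswith("harness|hendrycksTest-")]
avg_acc = sum(results[k]["acc"] for k in mmlu) / len(mmlu)
print(f"mean acc over {len(mmlu)} tasks: {avg_acc:.4f}")
```

The same filtering works on the parquet-backed configs loaded with `load_dataset`, since each config name encodes the task it holds.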
open-llm-leaderboard/details_Sao10K__BrainDerp2
2023-10-04T07:48:17.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of Sao10K/BrainDerp2 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Sao10K/BrainDerp2](https://huggingface.co/Sao10K/BrainDerp2) on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__BrainDerp2\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T07:46:51.716254](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__BrainDerp2/blob/main/results_2023-10-04T07-46-51.716254.json) (note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5893078284212648,\n\ \ \"acc_stderr\": 0.034155733057921266,\n \"acc_norm\": 0.5932301497201119,\n\ \ \"acc_norm_stderr\": 0.03413577356853939,\n \"mc1\": 0.39657282741738065,\n\ \ \"mc1_stderr\": 0.017124930942023518,\n \"mc2\": 0.5719069674430869,\n\ \ \"mc2_stderr\": 0.015635759800240744\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5793515358361775,\n \"acc_stderr\": 0.0144262112525084,\n\ \ \"acc_norm\": 0.6092150170648464,\n \"acc_norm_stderr\": 0.014258563880513777\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.617805218084047,\n\ \ \"acc_stderr\": 0.0048493069987277735,\n \"acc_norm\": 0.8193586934873531,\n\ \ \"acc_norm_stderr\": 0.003839344497191945\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\ \ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\ \ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.04017901275981749,\n\ \ \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.04017901275981749\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\ \ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \ \ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.029647813539365242,\n\ \ \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.029647813539365242\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.625,\n\ \ \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.625,\n \ \ \"acc_norm_stderr\": 0.04048439222695598\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \ \ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\ \ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \ \ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n\ \ \"acc_stderr\": 0.03758517775404948,\n \"acc_norm\": 0.5838150289017341,\n\ \ \"acc_norm_stderr\": 0.03758517775404948\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n\ \ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.68,\n \"acc_stderr\": 0.046882617226215055,\n \"acc_norm\": 0.68,\n\ \ \"acc_norm_stderr\": 0.046882617226215055\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.502127659574468,\n \"acc_stderr\": 0.03268572658667492,\n\ \ \"acc_norm\": 0.502127659574468,\n \"acc_norm_stderr\": 0.03268572658667492\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\ \ \"acc_stderr\": 0.04372748290278007,\n \"acc_norm\": 0.3157894736842105,\n\ \ \"acc_norm_stderr\": 0.04372748290278007\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\ \ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.373015873015873,\n \"acc_stderr\": 0.02490699045899257,\n \"acc_norm\"\ : 0.373015873015873,\n \"acc_norm_stderr\": 
0.02490699045899257\n },\n\ \ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\ \ \"acc_stderr\": 0.0436031486007746,\n \"acc_norm\": 0.3888888888888889,\n\ \ \"acc_norm_stderr\": 0.0436031486007746\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \ \ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7064516129032258,\n\ \ \"acc_stderr\": 0.025906087021319295,\n \"acc_norm\": 0.7064516129032258,\n\ \ \"acc_norm_stderr\": 0.025906087021319295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n\ \ \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n\ \ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03588624800091707,\n\ \ \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03588624800091707\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198906,\n \"\ acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198906\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723872,\n\ \ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723872\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6358974358974359,\n \"acc_stderr\": 0.024396672985094767,\n\ \ \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.024396672985094767\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 
0.3296296296296296,\n \"acc_stderr\": 0.02866120111652459,\n \ \ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652459\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.031866081214088314,\n\ \ \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.031866081214088314\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"\ acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7926605504587156,\n \"acc_stderr\": 0.01738141556360868,\n \"\ acc_norm\": 0.7926605504587156,\n \"acc_norm_stderr\": 0.01738141556360868\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.4074074074074074,\n \"acc_stderr\": 0.03350991604696042,\n \"\ acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.03350991604696042\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"\ acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808514,\n \ \ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808514\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\ \ \"acc_stderr\": 0.03076935200822915,\n \"acc_norm\": 0.6995515695067265,\n\ \ \"acc_norm_stderr\": 0.03076935200822915\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\ \ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7024793388429752,\n \"acc_stderr\": 
0.04173349148083499,\n \"\ acc_norm\": 0.7024793388429752,\n \"acc_norm_stderr\": 0.04173349148083499\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\ \ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\ \ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n\ \ \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\ \ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\ \ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729224,\n\ \ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729224\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8162393162393162,\n\ \ \"acc_stderr\": 0.02537213967172293,\n \"acc_norm\": 0.8162393162393162,\n\ \ \"acc_norm_stderr\": 0.02537213967172293\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \ \ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7790549169859514,\n\ \ \"acc_stderr\": 0.014836205167333564,\n \"acc_norm\": 0.7790549169859514,\n\ \ \"acc_norm_stderr\": 0.014836205167333564\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.025522474632121615,\n\ \ \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.025522474632121615\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43798882681564244,\n\ \ \"acc_stderr\": 0.016593394227564846,\n \"acc_norm\": 0.43798882681564244,\n\ \ \"acc_norm_stderr\": 0.016593394227564846\n },\n 
\"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.02705797462449438,\n\ \ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.02705797462449438\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6559485530546624,\n\ \ \"acc_stderr\": 0.026981478043648043,\n \"acc_norm\": 0.6559485530546624,\n\ \ \"acc_norm_stderr\": 0.026981478043648043\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6790123456790124,\n \"acc_stderr\": 0.025976566010862744,\n\ \ \"acc_norm\": 0.6790123456790124,\n \"acc_norm_stderr\": 0.025976566010862744\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \ \ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4452411994784876,\n\ \ \"acc_stderr\": 0.012693421303973294,\n \"acc_norm\": 0.4452411994784876,\n\ \ \"acc_norm_stderr\": 0.012693421303973294\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.03016191193076711,\n\ \ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.03016191193076711\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.5947712418300654,\n \"acc_stderr\": 0.019861155193829156,\n \ \ \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.019861155193829156\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\ \ \"acc_stderr\": 0.04461272175910508,\n \"acc_norm\": 0.6818181818181818,\n\ \ \"acc_norm_stderr\": 0.04461272175910508\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6612244897959184,\n \"acc_stderr\": 0.030299506562154188,\n\ \ \"acc_norm\": 0.6612244897959184,\n \"acc_norm_stderr\": 0.030299506562154188\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7611940298507462,\n\ \ 
\"acc_stderr\": 0.03014777593540922,\n \"acc_norm\": 0.7611940298507462,\n\ \ \"acc_norm_stderr\": 0.03014777593540922\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \ \ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\ \ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\ \ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n\ \ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.39657282741738065,\n\ \ \"mc1_stderr\": 0.017124930942023518,\n \"mc2\": 0.5719069674430869,\n\ \ \"mc2_stderr\": 0.015635759800240744\n }\n}\n```" repo_url: https://huggingface.co/Sao10K/BrainDerp2 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|arc:challenge|25_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hellaswag|10_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-46-51.716254.parquet' - 
'**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-46-51.716254.parquet' - 
'**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-46-51.716254.parquet' - 
'**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-46-51.716254.parquet' - 
'**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-46-51.716254.parquet' - 
'**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-46-51.716254.parquet' - 
'**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-46-51.716254.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-46-51.716254.parquet' - 
config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-46-51.716254.parquet' - config_name: 
harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - 
split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-46-51.716254.parquet' - 
config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-46-51.716254.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T07_46_51.716254 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T07-46-51.716254.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T07-46-51.716254.parquet' - config_name: results data_files: - split: 2023_10_04T07_46_51.716254 path: - results_2023-10-04T07-46-51.716254.parquet - split: latest path: - results_2023-10-04T07-46-51.716254.parquet --- # Dataset Card for Evaluation run of Sao10K/BrainDerp2 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Sao10K/BrainDerp2 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [Sao10K/BrainDerp2](https://huggingface.co/Sao10K/BrainDerp2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). 
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__BrainDerp2",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-04T07:46:51.716254](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__BrainDerp2/blob/main/results_2023-10-04T07-46-51.716254.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5893078284212648, "acc_stderr": 0.034155733057921266, "acc_norm": 0.5932301497201119, "acc_norm_stderr": 0.03413577356853939, "mc1": 0.39657282741738065, "mc1_stderr": 0.017124930942023518, "mc2": 0.5719069674430869, "mc2_stderr": 0.015635759800240744 }, "harness|arc:challenge|25": { "acc": 0.5793515358361775, "acc_stderr": 0.0144262112525084, "acc_norm": 0.6092150170648464, "acc_norm_stderr": 0.014258563880513777 }, "harness|hellaswag|10": { "acc": 0.617805218084047, "acc_stderr": 0.0048493069987277735, "acc_norm": 0.8193586934873531, "acc_norm_stderr": 0.003839344497191945 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5111111111111111, "acc_stderr": 0.04318275491977976, "acc_norm": 0.5111111111111111, "acc_norm_stderr": 0.04318275491977976 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5789473684210527, "acc_stderr": 0.04017901275981749, "acc_norm": 0.5789473684210527, "acc_norm_stderr": 0.04017901275981749 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.57, "acc_stderr": 0.049756985195624284, "acc_norm": 0.57, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6339622641509434, "acc_stderr": 0.029647813539365242, "acc_norm": 0.6339622641509434, "acc_norm_stderr": 0.029647813539365242 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.625, "acc_stderr": 0.04048439222695598, "acc_norm": 0.625, "acc_norm_stderr": 0.04048439222695598 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.42, "acc_stderr": 0.04960449637488584, "acc_norm": 0.42, "acc_norm_stderr": 0.04960449637488584 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5838150289017341, "acc_stderr": 0.03758517775404948, "acc_norm": 0.5838150289017341, "acc_norm_stderr": 0.03758517775404948 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.35294117647058826, "acc_stderr": 0.04755129616062946, "acc_norm": 0.35294117647058826, "acc_norm_stderr": 0.04755129616062946 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.68, "acc_stderr": 0.046882617226215055, "acc_norm": 0.68, "acc_norm_stderr": 0.046882617226215055 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.502127659574468, "acc_stderr": 0.03268572658667492, "acc_norm": 0.502127659574468, "acc_norm_stderr": 0.03268572658667492 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.3157894736842105, "acc_stderr": 0.04372748290278007, "acc_norm": 0.3157894736842105, "acc_norm_stderr": 0.04372748290278007 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5379310344827586, "acc_stderr": 0.04154659671707548, "acc_norm": 0.5379310344827586, "acc_norm_stderr": 0.04154659671707548 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.373015873015873, "acc_stderr": 0.02490699045899257, "acc_norm": 0.373015873015873, "acc_norm_stderr": 0.02490699045899257 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3888888888888889, "acc_stderr": 0.0436031486007746, "acc_norm": 0.3888888888888889, "acc_norm_stderr": 0.0436031486007746 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7064516129032258, "acc_stderr": 0.025906087021319295, "acc_norm": 0.7064516129032258, "acc_norm_stderr": 0.025906087021319295 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 
0.4729064039408867, "acc_stderr": 0.03512819077876106, "acc_norm": 0.4729064039408867, "acc_norm_stderr": 0.03512819077876106 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.55, "acc_stderr": 0.05, "acc_norm": 0.55, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.696969696969697, "acc_stderr": 0.03588624800091707, "acc_norm": 0.696969696969697, "acc_norm_stderr": 0.03588624800091707 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7626262626262627, "acc_stderr": 0.030313710538198906, "acc_norm": 0.7626262626262627, "acc_norm_stderr": 0.030313710538198906 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8497409326424871, "acc_stderr": 0.025787723180723872, "acc_norm": 0.8497409326424871, "acc_norm_stderr": 0.025787723180723872 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6358974358974359, "acc_stderr": 0.024396672985094767, "acc_norm": 0.6358974358974359, "acc_norm_stderr": 0.024396672985094767 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3296296296296296, "acc_stderr": 0.02866120111652459, "acc_norm": 0.3296296296296296, "acc_norm_stderr": 0.02866120111652459 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5966386554621849, "acc_stderr": 0.031866081214088314, "acc_norm": 0.5966386554621849, "acc_norm_stderr": 0.031866081214088314 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.31788079470198677, "acc_stderr": 0.03802039760107903, "acc_norm": 0.31788079470198677, "acc_norm_stderr": 0.03802039760107903 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7926605504587156, "acc_stderr": 0.01738141556360868, "acc_norm": 0.7926605504587156, "acc_norm_stderr": 0.01738141556360868 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4074074074074074, "acc_stderr": 0.03350991604696042, "acc_norm": 0.4074074074074074, "acc_norm_stderr": 0.03350991604696042 }, 
"harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8382352941176471, "acc_stderr": 0.025845017986926917, "acc_norm": 0.8382352941176471, "acc_norm_stderr": 0.025845017986926917 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7679324894514767, "acc_stderr": 0.027479744550808514, "acc_norm": 0.7679324894514767, "acc_norm_stderr": 0.027479744550808514 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6995515695067265, "acc_stderr": 0.03076935200822915, "acc_norm": 0.6995515695067265, "acc_norm_stderr": 0.03076935200822915 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6412213740458015, "acc_stderr": 0.04206739313864908, "acc_norm": 0.6412213740458015, "acc_norm_stderr": 0.04206739313864908 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7024793388429752, "acc_stderr": 0.04173349148083499, "acc_norm": 0.7024793388429752, "acc_norm_stderr": 0.04173349148083499 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.040191074725573483, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.040191074725573483 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6809815950920245, "acc_stderr": 0.03661997551073836, "acc_norm": 0.6809815950920245, "acc_norm_stderr": 0.03661997551073836 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.41964285714285715, "acc_stderr": 0.04684099321077106, "acc_norm": 0.41964285714285715, "acc_norm_stderr": 0.04684099321077106 }, "harness|hendrycksTest-management|5": { "acc": 0.7087378640776699, "acc_stderr": 0.044986763205729224, "acc_norm": 0.7087378640776699, "acc_norm_stderr": 0.044986763205729224 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8162393162393162, "acc_stderr": 0.02537213967172293, "acc_norm": 0.8162393162393162, "acc_norm_stderr": 0.02537213967172293 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.6, "acc_stderr": 0.049236596391733084, "acc_norm": 0.6, "acc_norm_stderr": 0.049236596391733084 }, 
"harness|hendrycksTest-miscellaneous|5": { "acc": 0.7790549169859514, "acc_stderr": 0.014836205167333564, "acc_norm": 0.7790549169859514, "acc_norm_stderr": 0.014836205167333564 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6589595375722543, "acc_stderr": 0.025522474632121615, "acc_norm": 0.6589595375722543, "acc_norm_stderr": 0.025522474632121615 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.43798882681564244, "acc_stderr": 0.016593394227564846, "acc_norm": 0.43798882681564244, "acc_norm_stderr": 0.016593394227564846 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6633986928104575, "acc_stderr": 0.02705797462449438, "acc_norm": 0.6633986928104575, "acc_norm_stderr": 0.02705797462449438 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6559485530546624, "acc_stderr": 0.026981478043648043, "acc_norm": 0.6559485530546624, "acc_norm_stderr": 0.026981478043648043 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6790123456790124, "acc_stderr": 0.025976566010862744, "acc_norm": 0.6790123456790124, "acc_norm_stderr": 0.025976566010862744 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.46808510638297873, "acc_stderr": 0.029766675075873866, "acc_norm": 0.46808510638297873, "acc_norm_stderr": 0.029766675075873866 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4452411994784876, "acc_stderr": 0.012693421303973294, "acc_norm": 0.4452411994784876, "acc_norm_stderr": 0.012693421303973294 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5588235294117647, "acc_stderr": 0.03016191193076711, "acc_norm": 0.5588235294117647, "acc_norm_stderr": 0.03016191193076711 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5947712418300654, "acc_stderr": 0.019861155193829156, "acc_norm": 0.5947712418300654, "acc_norm_stderr": 0.019861155193829156 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.04461272175910508, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 
0.04461272175910508 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6612244897959184, "acc_stderr": 0.030299506562154188, "acc_norm": 0.6612244897959184, "acc_norm_stderr": 0.030299506562154188 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7611940298507462, "acc_stderr": 0.03014777593540922, "acc_norm": 0.7611940298507462, "acc_norm_stderr": 0.03014777593540922 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.82, "acc_stderr": 0.038612291966536934, "acc_norm": 0.82, "acc_norm_stderr": 0.038612291966536934 }, "harness|hendrycksTest-virology|5": { "acc": 0.5060240963855421, "acc_stderr": 0.03892212195333045, "acc_norm": 0.5060240963855421, "acc_norm_stderr": 0.03892212195333045 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7953216374269005, "acc_stderr": 0.03094445977853321, "acc_norm": 0.7953216374269005, "acc_norm_stderr": 0.03094445977853321 }, "harness|truthfulqa:mc|0": { "mc1": 0.39657282741738065, "mc1_stderr": 0.017124930942023518, "mc2": 0.5719069674430869, "mc2_stderr": 0.015635759800240744 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_posicube__Llama2-chat-AYB-13B
2023-10-04T07:49:23.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of posicube/Llama2-chat-AYB-13B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [posicube/Llama2-chat-AYB-13B](https://huggingface.co/posicube/Llama2-chat-AYB-13B)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_posicube__Llama2-chat-AYB-13B\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T07:48:01.042889](https://huggingface.co/datasets/open-llm-leaderboard/details_posicube__Llama2-chat-AYB-13B/blob/main/results_2023-10-04T07-48-01.042889.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5946507900734089,\n\ \ \"acc_stderr\": 0.03396794158871376,\n \"acc_norm\": 0.5983689218231087,\n\ \ \"acc_norm_stderr\": 0.03394515462487408,\n \"mc1\": 0.3953488372093023,\n\ \ \"mc1_stderr\": 0.017115815632418197,\n \"mc2\": 0.5562121721284332,\n\ \ \"mc2_stderr\": 0.01570875335896787\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6100682593856656,\n \"acc_stderr\": 0.014252959848892894,\n\ \ \"acc_norm\": 0.6339590443686007,\n \"acc_norm_stderr\": 0.01407722310847014\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.652459669388568,\n\ \ \"acc_stderr\": 0.004752158936871873,\n \"acc_norm\": 0.8479386576379208,\n\ \ \"acc_norm_stderr\": 0.003583464810753465\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5333333333333333,\n\ \ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.5333333333333333,\n\ \ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n\ \ \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\ \ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \ \ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.03005258057955784,\n\ \ \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.03005258057955784\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n\ \ \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n\ \ \"acc_norm_stderr\": 0.03942082639927213\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \ \ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\ : 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \ \ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n\ \ \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.5895953757225434,\n\ \ \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\ \ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\ \ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.03267862331014063,\n\ \ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.03267862331014063\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\ \ \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n\ \ \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n\ \ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.3386243386243386,\n \"acc_stderr\": 0.02437319786798306,\n \"\ acc_norm\": 0.3386243386243386,\n 
\"acc_norm_stderr\": 0.02437319786798306\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\ \ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\ \ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\ \ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6903225806451613,\n\ \ \"acc_stderr\": 0.026302774983517414,\n \"acc_norm\": 0.6903225806451613,\n\ \ \"acc_norm_stderr\": 0.026302774983517414\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n\ \ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\ : 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885415,\n\ \ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885415\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"\ acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153303,\n\ \ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153303\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6128205128205129,\n \"acc_stderr\": 0.02469721693087894,\n \ \ \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.02469721693087894\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948496,\n \ \ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948496\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6092436974789915,\n \"acc_stderr\": 0.03169380235712996,\n \ \ \"acc_norm\": 0.6092436974789915,\n \"acc_norm_stderr\": 0.03169380235712996\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.2913907284768212,\n \"acc_stderr\": 0.037101857261199946,\n \"\ acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.037101857261199946\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8018348623853211,\n \"acc_stderr\": 0.017090573804217912,\n \"\ acc_norm\": 0.8018348623853211,\n \"acc_norm_stderr\": 0.017090573804217912\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252336,\n \"\ acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252336\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"\ acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \ \ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\ \ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\ \ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n\ \ \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\ acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\ \ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\ \ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\ \ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\ \ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\ \ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\ \ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\ \ \"acc_stderr\": 0.025140935950335445,\n \"acc_norm\": 0.8205128205128205,\n\ \ \"acc_norm_stderr\": 0.025140935950335445\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \ \ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7790549169859514,\n\ \ \"acc_stderr\": 0.01483620516733356,\n \"acc_norm\": 0.7790549169859514,\n\ \ \"acc_norm_stderr\": 0.01483620516733356\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6502890173410405,\n \"acc_stderr\": 0.025674281456531018,\n\ \ \"acc_norm\": 0.6502890173410405,\n \"acc_norm_stderr\": 0.025674281456531018\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4770949720670391,\n\ \ \"acc_stderr\": 0.01670494574032619,\n \"acc_norm\": 
0.4770949720670391,\n\ \ \"acc_norm_stderr\": 0.01670494574032619\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.6535947712418301,\n \"acc_stderr\": 0.02724561304721536,\n\ \ \"acc_norm\": 0.6535947712418301,\n \"acc_norm_stderr\": 0.02724561304721536\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6688102893890675,\n\ \ \"acc_stderr\": 0.026730620728004906,\n \"acc_norm\": 0.6688102893890675,\n\ \ \"acc_norm_stderr\": 0.026730620728004906\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6820987654320988,\n \"acc_stderr\": 0.02591006352824088,\n\ \ \"acc_norm\": 0.6820987654320988,\n \"acc_norm_stderr\": 0.02591006352824088\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \ \ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.455019556714472,\n\ \ \"acc_stderr\": 0.012718456618701766,\n \"acc_norm\": 0.455019556714472,\n\ \ \"acc_norm_stderr\": 0.012718456618701766\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.029722152099280072,\n\ \ \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.029722152099280072\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.5996732026143791,\n \"acc_stderr\": 0.01982184368827177,\n \ \ \"acc_norm\": 0.5996732026143791,\n \"acc_norm_stderr\": 0.01982184368827177\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\ \ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\ \ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6612244897959184,\n \"acc_stderr\": 0.030299506562154185,\n\ \ \"acc_norm\": 0.6612244897959184,\n \"acc_norm_stderr\": 0.030299506562154185\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7810945273631841,\n\ \ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.7810945273631841,\n\ \ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826369,\n \ \ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826369\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\ \ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n\ \ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03126781714663179,\n\ \ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03126781714663179\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3953488372093023,\n\ \ \"mc1_stderr\": 0.017115815632418197,\n \"mc2\": 0.5562121721284332,\n\ \ \"mc2_stderr\": 0.01570875335896787\n }\n}\n```" repo_url: https://huggingface.co/posicube/Llama2-chat-AYB-13B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|arc:challenge|25_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hellaswag|10_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-48-01.042889.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-48-01.042889.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-48-01.042889.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-48-01.042889.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-48-01.042889.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-48-01.042889.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-48-01.042889.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-48-01.042889.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-48-01.042889.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-48-01.042889.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T07_48_01.042889 path: - 
    '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-48-01.042889.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
  data_files:
  - split: 2023_10_04T07_48_01.042889
    path:
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-48-01.042889.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_virology_5
  data_files:
  - split: 2023_10_04T07_48_01.042889
    path:
    - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-48-01.042889.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_world_religions_5
  data_files:
  - split: 2023_10_04T07_48_01.042889
    path:
    - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-48-01.042889.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_truthfulqa_mc_0
  data_files:
  - split: 2023_10_04T07_48_01.042889
    path:
    - '**/details_harness|truthfulqa:mc|0_2023-10-04T07-48-01.042889.parquet'
  - split: latest
    path:
    - '**/details_harness|truthfulqa:mc|0_2023-10-04T07-48-01.042889.parquet'
- config_name: results
  data_files:
  - split: 2023_10_04T07_48_01.042889
    path:
    - results_2023-10-04T07-48-01.042889.parquet
  - split: latest
    path:
    - results_2023-10-04T07-48-01.042889.parquet
---

# Dataset Card for Evaluation run of posicube/Llama2-chat-AYB-13B

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/posicube/Llama2-chat-AYB-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model
[posicube/Llama2-chat-AYB-13B](https://huggingface.co/posicube/Llama2-chat-AYB-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_posicube__Llama2-chat-AYB-13B",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-04T07:48:01.042889](https://huggingface.co/datasets/open-llm-leaderboard/details_posicube__Llama2-chat-AYB-13B/blob/main/results_2023-10-04T07-48-01.042889.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "acc": 0.5946507900734089,
        "acc_stderr": 0.03396794158871376,
        "acc_norm": 0.5983689218231087,
        "acc_norm_stderr": 0.03394515462487408,
        "mc1": 0.3953488372093023,
        "mc1_stderr": 0.017115815632418197,
        "mc2": 0.5562121721284332,
        "mc2_stderr": 0.01570875335896787
    },
    "harness|arc:challenge|25": {
        "acc": 0.6100682593856656,
        "acc_stderr": 0.014252959848892894,
        "acc_norm": 0.6339590443686007,
        "acc_norm_stderr": 0.01407722310847014
    },
    "harness|hellaswag|10": {
        "acc": 0.652459669388568,
        "acc_stderr": 0.004752158936871873,
        "acc_norm": 0.8479386576379208,
        "acc_norm_stderr": 0.003583464810753465
    },
    "harness|hendrycksTest-abstract_algebra|5": {
        "acc": 0.37,
        "acc_stderr": 0.04852365870939099,
        "acc_norm": 0.37,
        "acc_norm_stderr": 0.04852365870939099
    },
    "harness|hendrycksTest-anatomy|5": {
        "acc": 0.5333333333333333,
        "acc_stderr": 0.043097329010363554,
        "acc_norm": 0.5333333333333333,
        "acc_norm_stderr": 0.043097329010363554
    },
    "harness|hendrycksTest-astronomy|5": {
        "acc": 0.6052631578947368,
        "acc_stderr": 0.039777499346220734,
        "acc_norm": 0.6052631578947368,
        "acc_norm_stderr": 0.039777499346220734
    },
    "harness|hendrycksTest-business_ethics|5": {
        "acc": 0.58,
        "acc_stderr": 0.049604496374885836,
        "acc_norm": 0.58,
        "acc_norm_stderr": 0.049604496374885836
    },
    "harness|hendrycksTest-clinical_knowledge|5": {
        "acc": 0.6075471698113207,
        "acc_stderr": 0.03005258057955784,
        "acc_norm": 0.6075471698113207,
        "acc_norm_stderr": 0.03005258057955784
    },
    "harness|hendrycksTest-college_biology|5": {
        "acc": 0.6666666666666666,
        "acc_stderr": 0.03942082639927213,
        "acc_norm": 0.6666666666666666,
        "acc_norm_stderr": 0.03942082639927213
    },
    "harness|hendrycksTest-college_chemistry|5": {
        "acc": 0.42,
        "acc_stderr": 0.049604496374885836,
        "acc_norm": 0.42,
        "acc_norm_stderr": 0.049604496374885836
    },
    "harness|hendrycksTest-college_computer_science|5": {
        "acc": 0.46,
        "acc_stderr": 0.05009082659620332,
        "acc_norm": 0.46,
        "acc_norm_stderr": 0.05009082659620332
    },
    "harness|hendrycksTest-college_mathematics|5": {
        "acc": 0.33,
        "acc_stderr": 0.047258156262526045,
        "acc_norm": 0.33,
        "acc_norm_stderr": 0.047258156262526045
    },
    "harness|hendrycksTest-college_medicine|5": {
        "acc": 0.5895953757225434,
        "acc_stderr": 0.03750757044895537,
        "acc_norm": 0.5895953757225434,
        "acc_norm_stderr": 0.03750757044895537
    },
    "harness|hendrycksTest-college_physics|5": {
        "acc": 0.3333333333333333,
        "acc_stderr": 0.04690650298201942,
        "acc_norm": 0.3333333333333333,
        "acc_norm_stderr": 0.04690650298201942
    },
    "harness|hendrycksTest-computer_security|5": {
        "acc": 0.69,
        "acc_stderr": 0.04648231987117316,
        "acc_norm": 0.69,
        "acc_norm_stderr": 0.04648231987117316
    },
    "harness|hendrycksTest-conceptual_physics|5": {
        "acc": 0.48936170212765956,
        "acc_stderr": 0.03267862331014063,
        "acc_norm": 0.48936170212765956,
        "acc_norm_stderr": 0.03267862331014063
    },
    "harness|hendrycksTest-econometrics|5": {
        "acc": 0.34210526315789475,
        "acc_stderr": 0.04462917535336936,
        "acc_norm": 0.34210526315789475,
        "acc_norm_stderr": 0.04462917535336936
    },
    "harness|hendrycksTest-electrical_engineering|5": {
        "acc": 0.5724137931034483,
        "acc_stderr": 0.04122737111370333,
        "acc_norm": 0.5724137931034483,
        "acc_norm_stderr": 0.04122737111370333
    },
    "harness|hendrycksTest-elementary_mathematics|5": {
        "acc": 0.3386243386243386,
        "acc_stderr": 0.02437319786798306,
        "acc_norm": 0.3386243386243386,
        "acc_norm_stderr": 0.02437319786798306
    },
    "harness|hendrycksTest-formal_logic|5": {
        "acc": 0.35714285714285715,
        "acc_stderr": 0.04285714285714281,
        "acc_norm": 0.35714285714285715,
        "acc_norm_stderr": 0.04285714285714281
    },
    "harness|hendrycksTest-global_facts|5": {
        "acc": 0.37,
        "acc_stderr": 0.048523658709391,
        "acc_norm": 0.37,
        "acc_norm_stderr": 0.048523658709391
    },
    "harness|hendrycksTest-high_school_biology|5": {
        "acc": 0.6903225806451613,
        "acc_stderr": 0.026302774983517414,
        "acc_norm": 0.6903225806451613,
        "acc_norm_stderr": 0.026302774983517414
    },
    "harness|hendrycksTest-high_school_chemistry|5": {
        "acc": 0.4827586206896552,
        "acc_stderr": 0.035158955511656986,
        "acc_norm": 0.4827586206896552,
        "acc_norm_stderr": 0.035158955511656986
    },
    "harness|hendrycksTest-high_school_computer_science|5": {
        "acc": 0.59,
        "acc_stderr": 0.04943110704237102,
        "acc_norm": 0.59,
        "acc_norm_stderr": 0.04943110704237102
    },
    "harness|hendrycksTest-high_school_european_history|5": {
        "acc": 0.7333333333333333,
        "acc_stderr": 0.03453131801885415,
        "acc_norm": 0.7333333333333333,
        "acc_norm_stderr": 0.03453131801885415
    },
    "harness|hendrycksTest-high_school_geography|5": {
        "acc": 0.7474747474747475,
        "acc_stderr": 0.030954055470365897,
        "acc_norm": 0.7474747474747475,
        "acc_norm_stderr": 0.030954055470365897
    },
    "harness|hendrycksTest-high_school_government_and_politics|5": {
        "acc": 0.844559585492228,
        "acc_stderr": 0.026148483469153303,
        "acc_norm": 0.844559585492228,
        "acc_norm_stderr": 0.026148483469153303
    },
    "harness|hendrycksTest-high_school_macroeconomics|5": {
        "acc": 0.6128205128205129,
        "acc_stderr": 0.02469721693087894,
        "acc_norm": 0.6128205128205129,
        "acc_norm_stderr": 0.02469721693087894
    },
    "harness|hendrycksTest-high_school_mathematics|5": {
        "acc": 0.3333333333333333,
        "acc_stderr": 0.028742040903948496,
        "acc_norm": 0.3333333333333333,
        "acc_norm_stderr": 0.028742040903948496
    },
    "harness|hendrycksTest-high_school_microeconomics|5": {
        "acc": 0.6092436974789915,
        "acc_stderr": 0.03169380235712996,
        "acc_norm": 0.6092436974789915,
        "acc_norm_stderr": 0.03169380235712996
    },
    "harness|hendrycksTest-high_school_physics|5": {
        "acc": 0.2913907284768212,
        "acc_stderr": 0.037101857261199946,
        "acc_norm": 0.2913907284768212,
        "acc_norm_stderr": 0.037101857261199946
    },
    "harness|hendrycksTest-high_school_psychology|5": {
        "acc": 0.8018348623853211,
        "acc_stderr": 0.017090573804217912,
        "acc_norm": 0.8018348623853211,
        "acc_norm_stderr": 0.017090573804217912
    },
    "harness|hendrycksTest-high_school_statistics|5": {
        "acc": 0.4305555555555556,
        "acc_stderr": 0.03376922151252336,
        "acc_norm": 0.4305555555555556,
        "acc_norm_stderr": 0.03376922151252336
    },
    "harness|hendrycksTest-high_school_us_history|5": {
        "acc": 0.8333333333333334,
        "acc_stderr": 0.026156867523931045,
        "acc_norm": 0.8333333333333334,
        "acc_norm_stderr": 0.026156867523931045
    },
    "harness|hendrycksTest-high_school_world_history|5": {
        "acc": 0.7763713080168776,
        "acc_stderr": 0.027123298205229966,
        "acc_norm": 0.7763713080168776,
        "acc_norm_stderr": 0.027123298205229966
    },
    "harness|hendrycksTest-human_aging|5": {
        "acc": 0.6816143497757847,
        "acc_stderr": 0.03126580522513713,
        "acc_norm": 0.6816143497757847,
        "acc_norm_stderr": 0.03126580522513713
    },
    "harness|hendrycksTest-human_sexuality|5": {
        "acc": 0.6793893129770993,
        "acc_stderr": 0.04093329229834278,
        "acc_norm": 0.6793893129770993,
        "acc_norm_stderr": 0.04093329229834278
    },
    "harness|hendrycksTest-international_law|5": {
        "acc": 0.7272727272727273,
        "acc_stderr": 0.04065578140908705,
        "acc_norm": 0.7272727272727273,
        "acc_norm_stderr": 0.04065578140908705
    },
    "harness|hendrycksTest-jurisprudence|5": {
        "acc": 0.7777777777777778,
        "acc_stderr": 0.0401910747255735,
        "acc_norm": 0.7777777777777778,
        "acc_norm_stderr": 0.0401910747255735
    },
    "harness|hendrycksTest-logical_fallacies|5": {
        "acc": 0.7055214723926381,
        "acc_stderr": 0.03581165790474082,
        "acc_norm": 0.7055214723926381,
        "acc_norm_stderr": 0.03581165790474082
    },
    "harness|hendrycksTest-machine_learning|5": {
        "acc": 0.38392857142857145,
        "acc_stderr": 0.04616143075028547,
        "acc_norm": 0.38392857142857145,
        "acc_norm_stderr": 0.04616143075028547
    },
    "harness|hendrycksTest-management|5": {
        "acc": 0.7281553398058253,
        "acc_stderr": 0.044052680241409216,
        "acc_norm": 0.7281553398058253,
        "acc_norm_stderr": 0.044052680241409216
    },
    "harness|hendrycksTest-marketing|5": {
        "acc": 0.8205128205128205,
        "acc_stderr": 0.025140935950335445,
        "acc_norm": 0.8205128205128205,
        "acc_norm_stderr": 0.025140935950335445
    },
    "harness|hendrycksTest-medical_genetics|5": {
        "acc": 0.62,
        "acc_stderr": 0.04878317312145632,
        "acc_norm": 0.62,
        "acc_norm_stderr": 0.04878317312145632
    },
    "harness|hendrycksTest-miscellaneous|5": {
        "acc": 0.7790549169859514,
        "acc_stderr": 0.01483620516733356,
        "acc_norm": 0.7790549169859514,
        "acc_norm_stderr": 0.01483620516733356
    },
    "harness|hendrycksTest-moral_disputes|5": {
        "acc": 0.6502890173410405,
        "acc_stderr": 0.025674281456531018,
        "acc_norm": 0.6502890173410405,
        "acc_norm_stderr": 0.025674281456531018
    },
    "harness|hendrycksTest-moral_scenarios|5": {
        "acc": 0.4770949720670391,
        "acc_stderr": 0.01670494574032619,
        "acc_norm": 0.4770949720670391,
        "acc_norm_stderr": 0.01670494574032619
    },
    "harness|hendrycksTest-nutrition|5": {
        "acc": 0.6535947712418301,
        "acc_stderr": 0.02724561304721536,
        "acc_norm": 0.6535947712418301,
        "acc_norm_stderr": 0.02724561304721536
    },
    "harness|hendrycksTest-philosophy|5": {
        "acc": 0.6688102893890675,
        "acc_stderr": 0.026730620728004906,
        "acc_norm": 0.6688102893890675,
        "acc_norm_stderr": 0.026730620728004906
    },
    "harness|hendrycksTest-prehistory|5": {
        "acc": 0.6820987654320988,
        "acc_stderr": 0.02591006352824088,
        "acc_norm": 0.6820987654320988,
        "acc_norm_stderr": 0.02591006352824088
    },
    "harness|hendrycksTest-professional_accounting|5": {
        "acc": 0.475177304964539,
        "acc_stderr": 0.02979071924382972,
        "acc_norm": 0.475177304964539,
        "acc_norm_stderr": 0.02979071924382972
    },
    "harness|hendrycksTest-professional_law|5": {
        "acc": 0.455019556714472,
        "acc_stderr": 0.012718456618701766,
        "acc_norm": 0.455019556714472,
        "acc_norm_stderr": 0.012718456618701766
    },
    "harness|hendrycksTest-professional_medicine|5": {
        "acc": 0.6029411764705882,
        "acc_stderr": 0.029722152099280072,
        "acc_norm": 0.6029411764705882,
        "acc_norm_stderr": 0.029722152099280072
    },
    "harness|hendrycksTest-professional_psychology|5": {
        "acc": 0.5996732026143791,
        "acc_stderr": 0.01982184368827177,
        "acc_norm": 0.5996732026143791,
        "acc_norm_stderr": 0.01982184368827177
    },
    "harness|hendrycksTest-public_relations|5": {
        "acc": 0.6454545454545455,
        "acc_stderr": 0.045820048415054174,
        "acc_norm": 0.6454545454545455,
        "acc_norm_stderr": 0.045820048415054174
    },
    "harness|hendrycksTest-security_studies|5": {
        "acc": 0.6612244897959184,
        "acc_stderr": 0.030299506562154185,
        "acc_norm": 0.6612244897959184,
        "acc_norm_stderr": 0.030299506562154185
    },
    "harness|hendrycksTest-sociology|5": {
        "acc": 0.7810945273631841,
        "acc_stderr": 0.029239174636647,
        "acc_norm": 0.7810945273631841,
        "acc_norm_stderr": 0.029239174636647
    },
    "harness|hendrycksTest-us_foreign_policy|5": {
        "acc": 0.85,
        "acc_stderr": 0.03588702812826369,
        "acc_norm": 0.85,
        "acc_norm_stderr": 0.03588702812826369
    },
    "harness|hendrycksTest-virology|5": {
        "acc": 0.4939759036144578,
        "acc_stderr": 0.03892212195333045,
        "acc_norm": 0.4939759036144578,
        "acc_norm_stderr": 0.03892212195333045
    },
    "harness|hendrycksTest-world_religions|5": {
        "acc": 0.7894736842105263,
        "acc_stderr": 0.03126781714663179,
        "acc_norm": 0.7894736842105263,
        "acc_norm_stderr": 0.03126781714663179
    },
    "harness|truthfulqa:mc|0": {
        "mc1": 0.3953488372093023,
        "mc1_stderr": 0.017115815632418197,
        "mc2": 0.5562121721284332,
        "mc2_stderr": 0.01570875335896787
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_Sao10K__BrainDerp3
2023-10-04T07:49:27.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of Sao10K/BrainDerp3 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Sao10K/BrainDerp3](https://huggingface.co/Sao10K/BrainDerp3) on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__BrainDerp3\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T07:48:05.088946](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__BrainDerp3/blob/main/results_2023-10-04T07-48-05.088946.json)(note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5894733182245001,\n\ \ \"acc_stderr\": 0.03414332313258469,\n \"acc_norm\": 0.5933830960354635,\n\ \ \"acc_norm_stderr\": 0.03412320397463523,\n \"mc1\": 0.3953488372093023,\n\ \ \"mc1_stderr\": 0.017115815632418197,\n \"mc2\": 0.571756969256499,\n\ \ \"mc2_stderr\": 0.01564827771634302\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5784982935153583,\n \"acc_stderr\": 0.014430197069326025,\n\ \ \"acc_norm\": 0.6092150170648464,\n \"acc_norm_stderr\": 0.014258563880513778\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.620991834295957,\n\ \ \"acc_stderr\": 0.004841486716855774,\n \"acc_norm\": 0.8209520015933081,\n\ \ \"acc_norm_stderr\": 0.0038260895866500536\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \ \ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\ \ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\ \ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.04017901275981749,\n\ \ \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.04017901275981749\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\ \ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \ \ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6415094339622641,\n \"acc_stderr\": 0.02951470358398177,\n\ \ \"acc_norm\": 0.6415094339622641,\n \"acc_norm_stderr\": 0.02951470358398177\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n\ \ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n\ \ \"acc_norm_stderr\": 0.040629907841466674\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \ \ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\ : 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \ \ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5606936416184971,\n\ \ \"acc_stderr\": 0.03784271932887467,\n \"acc_norm\": 0.5606936416184971,\n\ \ \"acc_norm_stderr\": 0.03784271932887467\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.047240073523838876,\n\ \ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.047240073523838876\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.68,\n \"acc_stderr\": 0.046882617226215055,\n \"acc_norm\": 0.68,\n\ \ \"acc_norm_stderr\": 0.046882617226215055\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.032683358999363366,\n\ \ \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.032683358999363366\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\ \ \"acc_stderr\": 0.04404556157374767,\n \"acc_norm\": 0.32456140350877194,\n\ \ \"acc_norm_stderr\": 0.04404556157374767\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\ \ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.36243386243386244,\n \"acc_stderr\": 0.02475747390275206,\n \"\ acc_norm\": 0.36243386243386244,\n 
\"acc_norm_stderr\": 0.02475747390275206\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\ \ \"acc_stderr\": 0.043902592653775614,\n \"acc_norm\": 0.40476190476190477,\n\ \ \"acc_norm_stderr\": 0.043902592653775614\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\ \ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7032258064516129,\n\ \ \"acc_stderr\": 0.025988500792411898,\n \"acc_norm\": 0.7032258064516129,\n\ \ \"acc_norm_stderr\": 0.025988500792411898\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.035107665979592154,\n\ \ \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.035107665979592154\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\ : 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.03567969772268049,\n\ \ \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.03567969772268049\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217483,\n \"\ acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217483\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306433,\n\ \ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306433\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6358974358974359,\n \"acc_stderr\": 0.024396672985094767,\n\ \ \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.024396672985094767\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \ \ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6008403361344538,\n \"acc_stderr\": 0.03181110032413926,\n \ \ \"acc_norm\": 0.6008403361344538,\n \"acc_norm_stderr\": 0.03181110032413926\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\ acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7944954128440367,\n \"acc_stderr\": 0.017324352325016022,\n \"\ acc_norm\": 0.7944954128440367,\n \"acc_norm_stderr\": 0.017324352325016022\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.42592592592592593,\n \"acc_stderr\": 0.03372343271653064,\n \"\ acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.03372343271653064\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078962,\n \"\ acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078962\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159263,\n \ \ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159263\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\ \ \"acc_stderr\": 0.030636591348699796,\n \"acc_norm\": 0.7040358744394619,\n\ \ \"acc_norm_stderr\": 0.030636591348699796\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\ \ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7024793388429752,\n \"acc_stderr\": 0.04173349148083499,\n \"\ acc_norm\": 0.7024793388429752,\n \"acc_norm_stderr\": 0.04173349148083499\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\ \ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\ \ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.03642914578292406,\n\ \ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.03642914578292406\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\ \ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\ \ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.044532548363264673,\n\ \ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.044532548363264673\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\ \ \"acc_stderr\": 0.024662496845209825,\n \"acc_norm\": 0.8290598290598291,\n\ \ \"acc_norm_stderr\": 0.024662496845209825\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \ \ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7752234993614304,\n\ \ \"acc_stderr\": 0.014927447101937153,\n \"acc_norm\": 0.7752234993614304,\n\ \ \"acc_norm_stderr\": 0.014927447101937153\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.025522474632121615,\n\ \ \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.025522474632121615\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43798882681564244,\n\ \ \"acc_stderr\": 0.016593394227564846,\n \"acc_norm\": 
0.43798882681564244,\n\ \ \"acc_norm_stderr\": 0.016593394227564846\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.6535947712418301,\n \"acc_stderr\": 0.02724561304721536,\n\ \ \"acc_norm\": 0.6535947712418301,\n \"acc_norm_stderr\": 0.02724561304721536\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6559485530546624,\n\ \ \"acc_stderr\": 0.026981478043648043,\n \"acc_norm\": 0.6559485530546624,\n\ \ \"acc_norm_stderr\": 0.026981478043648043\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6697530864197531,\n \"acc_stderr\": 0.026168298456732852,\n\ \ \"acc_norm\": 0.6697530864197531,\n \"acc_norm_stderr\": 0.026168298456732852\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236837,\n \ \ \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236837\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44589308996088656,\n\ \ \"acc_stderr\": 0.012695244711379778,\n \"acc_norm\": 0.44589308996088656,\n\ \ \"acc_norm_stderr\": 0.012695244711379778\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5661764705882353,\n \"acc_stderr\": 0.03010563657001663,\n\ \ \"acc_norm\": 0.5661764705882353,\n \"acc_norm_stderr\": 0.03010563657001663\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.5915032679738562,\n \"acc_stderr\": 0.019886221037501862,\n \ \ \"acc_norm\": 0.5915032679738562,\n \"acc_norm_stderr\": 0.019886221037501862\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\ \ \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n\ \ \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6612244897959184,\n \"acc_stderr\": 0.030299506562154185,\n\ \ \"acc_norm\": 0.6612244897959184,\n \"acc_norm_stderr\": 0.030299506562154185\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7810945273631841,\n\ \ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.7810945273631841,\n\ \ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \ \ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\ \ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\ \ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\ \ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n\ \ \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3953488372093023,\n\ \ \"mc1_stderr\": 0.017115815632418197,\n \"mc2\": 0.571756969256499,\n\ \ \"mc2_stderr\": 0.01564827771634302\n }\n}\n```" repo_url: https://huggingface.co/Sao10K/BrainDerp3 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|arc:challenge|25_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hellaswag|10_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-48-05.088946.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-48-05.088946.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-48-05.088946.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-48-05.088946.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-48-05.088946.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-48-05.088946.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-48-05.088946.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-48-05.088946.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-48-05.088946.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T07_48_05.088946 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T07-48-05.088946.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T07-48-05.088946.parquet' - config_name: results data_files: - split: 2023_10_04T07_48_05.088946 path: - results_2023-10-04T07-48-05.088946.parquet - split: latest path: - results_2023-10-04T07-48-05.088946.parquet --- # Dataset Card for Evaluation run of Sao10K/BrainDerp3 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Sao10K/BrainDerp3 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [Sao10K/BrainDerp3](https://huggingface.co/Sao10K/BrainDerp3) 
on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Sao10K__BrainDerp3", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-10-04T07:48:05.088946](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__BrainDerp3/blob/main/results_2023-10-04T07-48-05.088946.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5894733182245001, "acc_stderr": 0.03414332313258469, "acc_norm": 0.5933830960354635, "acc_norm_stderr": 0.03412320397463523, "mc1": 0.3953488372093023, "mc1_stderr": 0.017115815632418197, "mc2": 0.571756969256499, "mc2_stderr": 0.01564827771634302 }, "harness|arc:challenge|25": { "acc": 0.5784982935153583, "acc_stderr": 0.014430197069326025, "acc_norm": 0.6092150170648464, "acc_norm_stderr": 0.014258563880513778 }, "harness|hellaswag|10": { "acc": 0.620991834295957, "acc_stderr": 0.004841486716855774, "acc_norm": 0.8209520015933081, "acc_norm_stderr": 0.0038260895866500536 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.04725815626252605, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252605 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5111111111111111, "acc_stderr": 0.04318275491977976, "acc_norm": 0.5111111111111111, "acc_norm_stderr": 0.04318275491977976 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5789473684210527, "acc_stderr": 0.04017901275981749, "acc_norm": 0.5789473684210527, "acc_norm_stderr": 0.04017901275981749 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6415094339622641, "acc_stderr": 0.02951470358398177, "acc_norm": 0.6415094339622641, "acc_norm_stderr": 0.02951470358398177 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6180555555555556, "acc_stderr": 0.040629907841466674, "acc_norm": 0.6180555555555556, "acc_norm_stderr": 0.040629907841466674 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 
0.05024183937956912 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5606936416184971, "acc_stderr": 0.03784271932887467, "acc_norm": 0.5606936416184971, "acc_norm_stderr": 0.03784271932887467 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3431372549019608, "acc_stderr": 0.047240073523838876, "acc_norm": 0.3431372549019608, "acc_norm_stderr": 0.047240073523838876 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.68, "acc_stderr": 0.046882617226215055, "acc_norm": 0.68, "acc_norm_stderr": 0.046882617226215055 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5063829787234042, "acc_stderr": 0.032683358999363366, "acc_norm": 0.5063829787234042, "acc_norm_stderr": 0.032683358999363366 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.32456140350877194, "acc_stderr": 0.04404556157374767, "acc_norm": 0.32456140350877194, "acc_norm_stderr": 0.04404556157374767 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5310344827586206, "acc_stderr": 0.04158632762097828, "acc_norm": 0.5310344827586206, "acc_norm_stderr": 0.04158632762097828 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.36243386243386244, "acc_stderr": 0.02475747390275206, "acc_norm": 0.36243386243386244, "acc_norm_stderr": 0.02475747390275206 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.40476190476190477, "acc_stderr": 0.043902592653775614, "acc_norm": 0.40476190476190477, "acc_norm_stderr": 0.043902592653775614 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7032258064516129, "acc_stderr": 0.025988500792411898, "acc_norm": 0.7032258064516129, "acc_norm_stderr": 0.025988500792411898 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.46798029556650245, "acc_stderr": 0.035107665979592154, "acc_norm": 0.46798029556650245, "acc_norm_stderr": 0.035107665979592154 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, "acc_norm": 0.57, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.703030303030303, "acc_stderr": 0.03567969772268049, "acc_norm": 0.703030303030303, "acc_norm_stderr": 0.03567969772268049 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7676767676767676, "acc_stderr": 0.030088629490217483, "acc_norm": 0.7676767676767676, "acc_norm_stderr": 0.030088629490217483 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8549222797927462, "acc_stderr": 0.025416343096306433, "acc_norm": 0.8549222797927462, "acc_norm_stderr": 0.025416343096306433 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6358974358974359, "acc_stderr": 0.024396672985094767, "acc_norm": 0.6358974358974359, "acc_norm_stderr": 0.024396672985094767 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32222222222222224, "acc_stderr": 0.028493465091028597, "acc_norm": 0.32222222222222224, "acc_norm_stderr": 0.028493465091028597 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6008403361344538, "acc_stderr": 0.03181110032413926, "acc_norm": 0.6008403361344538, "acc_norm_stderr": 0.03181110032413926 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 0.03861557546255169, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.03861557546255169 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7944954128440367, "acc_stderr": 0.017324352325016022, "acc_norm": 0.7944954128440367, "acc_norm_stderr": 0.017324352325016022 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.42592592592592593, "acc_stderr": 
0.03372343271653064, "acc_norm": 0.42592592592592593, "acc_norm_stderr": 0.03372343271653064 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8235294117647058, "acc_stderr": 0.026756401538078962, "acc_norm": 0.8235294117647058, "acc_norm_stderr": 0.026756401538078962 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7637130801687764, "acc_stderr": 0.027652153144159263, "acc_norm": 0.7637130801687764, "acc_norm_stderr": 0.027652153144159263 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7040358744394619, "acc_stderr": 0.030636591348699796, "acc_norm": 0.7040358744394619, "acc_norm_stderr": 0.030636591348699796 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6412213740458015, "acc_stderr": 0.04206739313864908, "acc_norm": 0.6412213740458015, "acc_norm_stderr": 0.04206739313864908 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7024793388429752, "acc_stderr": 0.04173349148083499, "acc_norm": 0.7024793388429752, "acc_norm_stderr": 0.04173349148083499 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7870370370370371, "acc_stderr": 0.0395783547198098, "acc_norm": 0.7870370370370371, "acc_norm_stderr": 0.0395783547198098 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6871165644171779, "acc_stderr": 0.03642914578292406, "acc_norm": 0.6871165644171779, "acc_norm_stderr": 0.03642914578292406 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4107142857142857, "acc_stderr": 0.04669510663875191, "acc_norm": 0.4107142857142857, "acc_norm_stderr": 0.04669510663875191 }, "harness|hendrycksTest-management|5": { "acc": 0.7184466019417476, "acc_stderr": 0.044532548363264673, "acc_norm": 0.7184466019417476, "acc_norm_stderr": 0.044532548363264673 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8290598290598291, "acc_stderr": 0.024662496845209825, "acc_norm": 0.8290598290598291, "acc_norm_stderr": 0.024662496845209825 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.6, "acc_stderr": 
0.049236596391733084, "acc_norm": 0.6, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7752234993614304, "acc_stderr": 0.014927447101937153, "acc_norm": 0.7752234993614304, "acc_norm_stderr": 0.014927447101937153 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6589595375722543, "acc_stderr": 0.025522474632121615, "acc_norm": 0.6589595375722543, "acc_norm_stderr": 0.025522474632121615 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.43798882681564244, "acc_stderr": 0.016593394227564846, "acc_norm": 0.43798882681564244, "acc_norm_stderr": 0.016593394227564846 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6535947712418301, "acc_stderr": 0.02724561304721536, "acc_norm": 0.6535947712418301, "acc_norm_stderr": 0.02724561304721536 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6559485530546624, "acc_stderr": 0.026981478043648043, "acc_norm": 0.6559485530546624, "acc_norm_stderr": 0.026981478043648043 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6697530864197531, "acc_stderr": 0.026168298456732852, "acc_norm": 0.6697530864197531, "acc_norm_stderr": 0.026168298456732852 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4574468085106383, "acc_stderr": 0.029719281272236837, "acc_norm": 0.4574468085106383, "acc_norm_stderr": 0.029719281272236837 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.44589308996088656, "acc_stderr": 0.012695244711379778, "acc_norm": 0.44589308996088656, "acc_norm_stderr": 0.012695244711379778 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5661764705882353, "acc_stderr": 0.03010563657001663, "acc_norm": 0.5661764705882353, "acc_norm_stderr": 0.03010563657001663 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5915032679738562, "acc_stderr": 0.019886221037501862, "acc_norm": 0.5915032679738562, "acc_norm_stderr": 0.019886221037501862 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, 
"acc_stderr": 0.04494290866252091, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.04494290866252091 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6612244897959184, "acc_stderr": 0.030299506562154185, "acc_norm": 0.6612244897959184, "acc_norm_stderr": 0.030299506562154185 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7810945273631841, "acc_stderr": 0.029239174636647, "acc_norm": 0.7810945273631841, "acc_norm_stderr": 0.029239174636647 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.83, "acc_stderr": 0.0377525168068637, "acc_norm": 0.83, "acc_norm_stderr": 0.0377525168068637 }, "harness|hendrycksTest-virology|5": { "acc": 0.5060240963855421, "acc_stderr": 0.03892212195333045, "acc_norm": 0.5060240963855421, "acc_norm_stderr": 0.03892212195333045 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7719298245614035, "acc_stderr": 0.032180937956023566, "acc_norm": 0.7719298245614035, "acc_norm_stderr": 0.032180937956023566 }, "harness|truthfulqa:mc|0": { "mc1": 0.3953488372093023, "mc1_stderr": 0.017115815632418197, "mc2": 0.571756969256499, "mc2_stderr": 0.01564827771634302 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
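The per-run splits above are named after the run timestamp, with `-` and `:` replaced by `_` (e.g. run `2023-10-04T07:48:05.088946` becomes split `2023_10_04T07_48_05.088946`). As a convenience sketch (this helper is not part of the dataset itself), a split name can be mapped back to a `datetime`:

```python
from datetime import datetime

def split_to_timestamp(split_name: str) -> datetime:
    """Convert a per-run split name such as '2023_10_04T07_48_05.088946'
    back into the datetime of the evaluation run."""
    # The split name keeps the ISO 'T' separator but replaces the
    # '-' in the date and the ':' in the time with '_'; restore them.
    date_part, time_part = split_name.split("T")
    return datetime.fromisoformat(
        f"{date_part.replace('_', '-')}T{time_part.replace('_', ':')}"
    )

print(split_to_timestamp("2023_10_04T07_48_05.088946"))
# → 2023-10-04 07:48:05.088946
```

This can be useful for sorting or comparing runs when more than one evaluation has been recorded in a configuration.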
ArkaAcharya/phi-part
2023-10-04T07:49:56.000Z
[ "region:us" ]
ArkaAcharya
null
null
null
0
0
Entry not found
atom-in-the-universe/bild-89f21499-5d97-4c9b-bfca-bc96ead217ce
2023-10-04T08:10:12.000Z
[ "region:us" ]
atom-in-the-universe
null
null
null
0
0
Entry not found
open-llm-leaderboard/details_harborwater__open-llama-3b-everythingLM-2048
2023-10-04T08:06:43.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of harborwater/open-llama-3b-everythingLM-2048 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [harborwater/open-llama-3b-everythingLM-2048](https://huggingface.co/harborwater/open-llama-3b-everythingLM-2048)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_harborwater__open-llama-3b-everythingLM-2048\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T08:05:25.924210](https://huggingface.co/datasets/open-llm-leaderboard/details_harborwater__open-llama-3b-everythingLM-2048/blob/main/results_2023-10-04T08-05-25.924210.json)\ \ (note that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2781212991625057,\n\ \ \"acc_stderr\": 0.03240629002364077,\n \"acc_norm\": 0.2818325672769296,\n\ \ \"acc_norm_stderr\": 0.032402108734402385,\n \"mc1\": 0.22643818849449204,\n\ \ \"mc1_stderr\": 0.014651337324602587,\n \"mc2\": 0.3426250755220841,\n\ \ \"mc2_stderr\": 0.013487279265594353\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.3856655290102389,\n \"acc_stderr\": 0.01422425097325717,\n\ \ \"acc_norm\": 0.4274744027303754,\n \"acc_norm_stderr\": 0.014456862944650649\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5400318661621191,\n\ \ \"acc_stderr\": 0.004973762948302805,\n \"acc_norm\": 0.7171878111929895,\n\ \ \"acc_norm_stderr\": 0.004494454911844637\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \ \ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.25925925925925924,\n\ \ \"acc_stderr\": 0.037857144650666544,\n \"acc_norm\": 0.25925925925925924,\n\ \ \"acc_norm_stderr\": 0.037857144650666544\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.29605263157894735,\n \"acc_stderr\": 0.03715062154998905,\n\ \ \"acc_norm\": 0.29605263157894735,\n \"acc_norm_stderr\": 0.03715062154998905\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.32,\n\ \ \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \ \ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.2981132075471698,\n \"acc_stderr\": 0.028152837942493878,\n\ \ \"acc_norm\": 0.2981132075471698,\n \"acc_norm_stderr\": 0.028152837942493878\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\ \ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n\ \ \"acc_norm_stderr\": 
0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \ \ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n\ \ \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \ \ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23121387283236994,\n\ \ \"acc_stderr\": 0.032147373020294696,\n \"acc_norm\": 0.23121387283236994,\n\ \ \"acc_norm_stderr\": 0.032147373020294696\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\ \ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n\ \ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.3276595744680851,\n \"acc_stderr\": 0.030683020843231008,\n\ \ \"acc_norm\": 0.3276595744680851,\n \"acc_norm_stderr\": 0.030683020843231008\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\ \ \"acc_stderr\": 0.04049339297748143,\n \"acc_norm\": 0.24561403508771928,\n\ \ \"acc_norm_stderr\": 0.04049339297748143\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.2,\n \"acc_stderr\": 0.0333333333333333,\n \ \ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.0333333333333333\n },\n\ \ \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2777777777777778,\n\ \ \"acc_stderr\": 0.023068188848261107,\n \"acc_norm\": 0.2777777777777778,\n\ \ 
\"acc_norm_stderr\": 0.023068188848261107\n },\n \"harness|hendrycksTest-formal_logic|5\"\ : {\n \"acc\": 0.23015873015873015,\n \"acc_stderr\": 0.03764950879790606,\n\ \ \"acc_norm\": 0.23015873015873015,\n \"acc_norm_stderr\": 0.03764950879790606\n\ \ },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n\ \ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \ \ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\"\ : {\n \"acc\": 0.267741935483871,\n \"acc_stderr\": 0.02518900666021238,\n\ \ \"acc_norm\": 0.267741935483871,\n \"acc_norm_stderr\": 0.02518900666021238\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.2660098522167488,\n \"acc_stderr\": 0.03108982600293753,\n \"\ acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.03108982600293753\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\ : 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.0347769116216366,\n\ \ \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.0347769116216366\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.2474747474747475,\n \"acc_stderr\": 0.03074630074212451,\n \"\ acc_norm\": 0.2474747474747475,\n \"acc_norm_stderr\": 0.03074630074212451\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.24870466321243523,\n \"acc_stderr\": 0.0311958408777003,\n\ \ \"acc_norm\": 0.24870466321243523,\n \"acc_norm_stderr\": 0.0311958408777003\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.24871794871794872,\n \"acc_stderr\": 0.0219169577092138,\n \ \ \"acc_norm\": 0.24871794871794872,\n \"acc_norm_stderr\": 0.0219169577092138\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.23703703703703705,\n \"acc_stderr\": 0.02592887613276611,\n \ \ \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.02592887613276611\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.2815126050420168,\n \"acc_stderr\": 0.029213549414372177,\n\ \ \"acc_norm\": 0.2815126050420168,\n \"acc_norm_stderr\": 0.029213549414372177\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"\ acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.24587155963302754,\n \"acc_stderr\": 0.018461940968708446,\n \"\ acc_norm\": 0.24587155963302754,\n \"acc_norm_stderr\": 0.018461940968708446\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.18981481481481483,\n \"acc_stderr\": 0.026744714834691936,\n \"\ acc_norm\": 0.18981481481481483,\n \"acc_norm_stderr\": 0.026744714834691936\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.23529411764705882,\n \"acc_stderr\": 0.02977177522814565,\n \"\ acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.02977177522814565\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.26582278481012656,\n \"acc_stderr\": 0.028756799629658335,\n \ \ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.028756799629658335\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3901345291479821,\n\ \ \"acc_stderr\": 0.03273766725459156,\n \"acc_norm\": 0.3901345291479821,\n\ \ \"acc_norm_stderr\": 0.03273766725459156\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728745,\n\ \ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728745\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.3305785123966942,\n \"acc_stderr\": 0.04294340845212095,\n \"\ acc_norm\": 0.3305785123966942,\n \"acc_norm_stderr\": 0.04294340845212095\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n\ \ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.2962962962962963,\n\ \ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.25153374233128833,\n \"acc_stderr\": 0.034089978868575295,\n\ \ \"acc_norm\": 0.25153374233128833,\n \"acc_norm_stderr\": 0.034089978868575295\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\ \ \"acc_stderr\": 0.04287858751340456,\n \"acc_norm\": 0.2857142857142857,\n\ \ \"acc_norm_stderr\": 0.04287858751340456\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.2912621359223301,\n \"acc_stderr\": 0.04498676320572921,\n\ \ \"acc_norm\": 0.2912621359223301,\n \"acc_norm_stderr\": 0.04498676320572921\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\ \ \"acc_stderr\": 0.02974504857267406,\n \"acc_norm\": 0.2905982905982906,\n\ \ \"acc_norm_stderr\": 0.02974504857267406\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \ \ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.30140485312899107,\n\ \ \"acc_stderr\": 0.016409091097268787,\n \"acc_norm\": 0.30140485312899107,\n\ \ \"acc_norm_stderr\": 0.016409091097268787\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.26878612716763006,\n \"acc_stderr\": 0.023868003262500107,\n\ \ \"acc_norm\": 0.26878612716763006,\n \"acc_norm_stderr\": 0.023868003262500107\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n\ \ \"acc_stderr\": 0.014310999547961455,\n 
\"acc_norm\": 0.24134078212290502,\n\ \ \"acc_norm_stderr\": 0.014310999547961455\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.2581699346405229,\n \"acc_stderr\": 0.025058503316958154,\n\ \ \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.025058503316958154\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2797427652733119,\n\ \ \"acc_stderr\": 0.025494259350694905,\n \"acc_norm\": 0.2797427652733119,\n\ \ \"acc_norm_stderr\": 0.025494259350694905\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.27469135802469136,\n \"acc_stderr\": 0.02483605786829468,\n\ \ \"acc_norm\": 0.27469135802469136,\n \"acc_norm_stderr\": 0.02483605786829468\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.2872340425531915,\n \"acc_stderr\": 0.026992199173064356,\n \ \ \"acc_norm\": 0.2872340425531915,\n \"acc_norm_stderr\": 0.026992199173064356\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\ \ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\ \ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.19117647058823528,\n \"acc_stderr\": 0.02388688192244036,\n\ \ \"acc_norm\": 0.19117647058823528,\n \"acc_norm_stderr\": 0.02388688192244036\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.26633986928104575,\n \"acc_stderr\": 0.017883188134667192,\n \ \ \"acc_norm\": 0.26633986928104575,\n \"acc_norm_stderr\": 0.017883188134667192\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.32727272727272727,\n\ \ \"acc_stderr\": 0.04494290866252088,\n \"acc_norm\": 0.32727272727272727,\n\ \ \"acc_norm_stderr\": 0.04494290866252088\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.3142857142857143,\n \"acc_stderr\": 0.02971932942241747,\n\ \ \"acc_norm\": 0.3142857142857143,\n \"acc_norm_stderr\": 
0.02971932942241747\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2736318407960199,\n\ \ \"acc_stderr\": 0.031524391865554016,\n \"acc_norm\": 0.2736318407960199,\n\ \ \"acc_norm_stderr\": 0.031524391865554016\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3253012048192771,\n\ \ \"acc_stderr\": 0.03647168523683227,\n \"acc_norm\": 0.3253012048192771,\n\ \ \"acc_norm_stderr\": 0.03647168523683227\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.3508771929824561,\n \"acc_stderr\": 0.03660298834049162,\n\ \ \"acc_norm\": 0.3508771929824561,\n \"acc_norm_stderr\": 0.03660298834049162\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22643818849449204,\n\ \ \"mc1_stderr\": 0.014651337324602587,\n \"mc2\": 0.3426250755220841,\n\ \ \"mc2_stderr\": 0.013487279265594353\n }\n}\n```" repo_url: https://huggingface.co/harborwater/open-llama-3b-everythingLM-2048 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|arc:challenge|25_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hellaswag|10_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T08-05-25.924210.parquet' - 
'**/details_harness|hendrycksTest-anatomy|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T08-05-25.924210.parquet' - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T08-05-25.924210.parquet' - 
'**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T08-05-25.924210.parquet' 
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T08-05-25.924210.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T08-05-25.924210.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T08-05-25.924210.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T08-05-25.924210.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T08-05-25.924210.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T08_05_25.924210 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T08-05-25.924210.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T08-05-25.924210.parquet' - config_name: results data_files: - split: 2023_10_04T08_05_25.924210 path: - results_2023-10-04T08-05-25.924210.parquet - split: latest path: - results_2023-10-04T08-05-25.924210.parquet --- # Dataset Card for Evaluation run of harborwater/open-llama-3b-everythingLM-2048 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/harborwater/open-llama-3b-everythingLM-2048 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[harborwater/open-llama-3b-everythingLM-2048](https://huggingface.co/harborwater/open-llama-3b-everythingLM-2048) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_harborwater__open-llama-3b-everythingLM-2048", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-10-04T08:05:25.924210](https://huggingface.co/datasets/open-llm-leaderboard/details_harborwater__open-llama-3b-everythingLM-2048/blob/main/results_2023-10-04T08-05-25.924210.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.2781212991625057, "acc_stderr": 0.03240629002364077, "acc_norm": 0.2818325672769296, "acc_norm_stderr": 0.032402108734402385, "mc1": 0.22643818849449204, "mc1_stderr": 0.014651337324602587, "mc2": 0.3426250755220841, "mc2_stderr": 0.013487279265594353 }, "harness|arc:challenge|25": { "acc": 0.3856655290102389, "acc_stderr": 0.01422425097325717, "acc_norm": 0.4274744027303754, "acc_norm_stderr": 0.014456862944650649 }, "harness|hellaswag|10": { "acc": 0.5400318661621191, "acc_stderr": 0.004973762948302805, "acc_norm": 0.7171878111929895, "acc_norm_stderr": 0.004494454911844637 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.24, "acc_stderr": 0.042923469599092816, "acc_norm": 0.24, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.25925925925925924, "acc_stderr": 0.037857144650666544, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.037857144650666544 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.29605263157894735, "acc_stderr": 0.03715062154998905, "acc_norm": 0.29605263157894735, "acc_norm_stderr": 0.03715062154998905 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.2981132075471698, "acc_stderr": 0.028152837942493878, "acc_norm": 0.2981132075471698, "acc_norm_stderr": 0.028152837942493878 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2638888888888889, "acc_stderr": 0.03685651095897532, "acc_norm": 0.2638888888888889, "acc_norm_stderr": 0.03685651095897532 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.23, "acc_stderr": 0.04229525846816505, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816505 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.22, "acc_stderr": 0.04163331998932269, "acc_norm": 0.22, 
"acc_norm_stderr": 0.04163331998932269 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.23121387283236994, "acc_stderr": 0.032147373020294696, "acc_norm": 0.23121387283236994, "acc_norm_stderr": 0.032147373020294696 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.19607843137254902, "acc_stderr": 0.03950581861179961, "acc_norm": 0.19607843137254902, "acc_norm_stderr": 0.03950581861179961 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.28, "acc_stderr": 0.04512608598542127, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.3276595744680851, "acc_stderr": 0.030683020843231008, "acc_norm": 0.3276595744680851, "acc_norm_stderr": 0.030683020843231008 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.24561403508771928, "acc_stderr": 0.04049339297748143, "acc_norm": 0.24561403508771928, "acc_norm_stderr": 0.04049339297748143 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2, "acc_stderr": 0.0333333333333333, "acc_norm": 0.2, "acc_norm_stderr": 0.0333333333333333 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2777777777777778, "acc_stderr": 0.023068188848261107, "acc_norm": 0.2777777777777778, "acc_norm_stderr": 0.023068188848261107 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.23015873015873015, "acc_stderr": 0.03764950879790606, "acc_norm": 0.23015873015873015, "acc_norm_stderr": 0.03764950879790606 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.267741935483871, "acc_stderr": 0.02518900666021238, "acc_norm": 0.267741935483871, "acc_norm_stderr": 0.02518900666021238 }, "harness|hendrycksTest-high_school_chemistry|5": 
{ "acc": 0.2660098522167488, "acc_stderr": 0.03108982600293753, "acc_norm": 0.2660098522167488, "acc_norm_stderr": 0.03108982600293753 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.2727272727272727, "acc_stderr": 0.0347769116216366, "acc_norm": 0.2727272727272727, "acc_norm_stderr": 0.0347769116216366 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.2474747474747475, "acc_stderr": 0.03074630074212451, "acc_norm": 0.2474747474747475, "acc_norm_stderr": 0.03074630074212451 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.24870466321243523, "acc_stderr": 0.0311958408777003, "acc_norm": 0.24870466321243523, "acc_norm_stderr": 0.0311958408777003 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.24871794871794872, "acc_stderr": 0.0219169577092138, "acc_norm": 0.24871794871794872, "acc_norm_stderr": 0.0219169577092138 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.23703703703703705, "acc_stderr": 0.02592887613276611, "acc_norm": 0.23703703703703705, "acc_norm_stderr": 0.02592887613276611 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.2815126050420168, "acc_stderr": 0.029213549414372177, "acc_norm": 0.2815126050420168, "acc_norm_stderr": 0.029213549414372177 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.32450331125827814, "acc_stderr": 0.03822746937658753, "acc_norm": 0.32450331125827814, "acc_norm_stderr": 0.03822746937658753 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.24587155963302754, "acc_stderr": 0.018461940968708446, "acc_norm": 0.24587155963302754, "acc_norm_stderr": 0.018461940968708446 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.18981481481481483, "acc_stderr": 0.026744714834691936, "acc_norm": 0.18981481481481483, 
"acc_norm_stderr": 0.026744714834691936 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.23529411764705882, "acc_stderr": 0.02977177522814565, "acc_norm": 0.23529411764705882, "acc_norm_stderr": 0.02977177522814565 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.26582278481012656, "acc_stderr": 0.028756799629658335, "acc_norm": 0.26582278481012656, "acc_norm_stderr": 0.028756799629658335 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.3901345291479821, "acc_stderr": 0.03273766725459156, "acc_norm": 0.3901345291479821, "acc_norm_stderr": 0.03273766725459156 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.24427480916030533, "acc_stderr": 0.03768335959728745, "acc_norm": 0.24427480916030533, "acc_norm_stderr": 0.03768335959728745 }, "harness|hendrycksTest-international_law|5": { "acc": 0.3305785123966942, "acc_stderr": 0.04294340845212095, "acc_norm": 0.3305785123966942, "acc_norm_stderr": 0.04294340845212095 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.2962962962962963, "acc_stderr": 0.044143436668549335, "acc_norm": 0.2962962962962963, "acc_norm_stderr": 0.044143436668549335 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.25153374233128833, "acc_stderr": 0.034089978868575295, "acc_norm": 0.25153374233128833, "acc_norm_stderr": 0.034089978868575295 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.2857142857142857, "acc_stderr": 0.04287858751340456, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.04287858751340456 }, "harness|hendrycksTest-management|5": { "acc": 0.2912621359223301, "acc_stderr": 0.04498676320572921, "acc_norm": 0.2912621359223301, "acc_norm_stderr": 0.04498676320572921 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2905982905982906, "acc_stderr": 0.02974504857267406, "acc_norm": 0.2905982905982906, "acc_norm_stderr": 0.02974504857267406 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.21, "acc_stderr": 0.040936018074033256, "acc_norm": 0.21, 
"acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.30140485312899107, "acc_stderr": 0.016409091097268787, "acc_norm": 0.30140485312899107, "acc_norm_stderr": 0.016409091097268787 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.26878612716763006, "acc_stderr": 0.023868003262500107, "acc_norm": 0.26878612716763006, "acc_norm_stderr": 0.023868003262500107 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24134078212290502, "acc_stderr": 0.014310999547961455, "acc_norm": 0.24134078212290502, "acc_norm_stderr": 0.014310999547961455 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.2581699346405229, "acc_stderr": 0.025058503316958154, "acc_norm": 0.2581699346405229, "acc_norm_stderr": 0.025058503316958154 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.2797427652733119, "acc_stderr": 0.025494259350694905, "acc_norm": 0.2797427652733119, "acc_norm_stderr": 0.025494259350694905 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.27469135802469136, "acc_stderr": 0.02483605786829468, "acc_norm": 0.27469135802469136, "acc_norm_stderr": 0.02483605786829468 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.2872340425531915, "acc_stderr": 0.026992199173064356, "acc_norm": 0.2872340425531915, "acc_norm_stderr": 0.026992199173064356 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2457627118644068, "acc_stderr": 0.010996156635142692, "acc_norm": 0.2457627118644068, "acc_norm_stderr": 0.010996156635142692 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.19117647058823528, "acc_stderr": 0.02388688192244036, "acc_norm": 0.19117647058823528, "acc_norm_stderr": 0.02388688192244036 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.26633986928104575, "acc_stderr": 0.017883188134667192, "acc_norm": 0.26633986928104575, "acc_norm_stderr": 0.017883188134667192 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.32727272727272727, "acc_stderr": 0.04494290866252088, 
"acc_norm": 0.32727272727272727, "acc_norm_stderr": 0.04494290866252088 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.3142857142857143, "acc_stderr": 0.02971932942241747, "acc_norm": 0.3142857142857143, "acc_norm_stderr": 0.02971932942241747 }, "harness|hendrycksTest-sociology|5": { "acc": 0.2736318407960199, "acc_stderr": 0.031524391865554016, "acc_norm": 0.2736318407960199, "acc_norm_stderr": 0.031524391865554016 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-virology|5": { "acc": 0.3253012048192771, "acc_stderr": 0.03647168523683227, "acc_norm": 0.3253012048192771, "acc_norm_stderr": 0.03647168523683227 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.3508771929824561, "acc_stderr": 0.03660298834049162, "acc_norm": 0.3508771929824561, "acc_norm_stderr": 0.03660298834049162 }, "harness|truthfulqa:mc|0": { "mc1": 0.22643818849449204, "mc1_stderr": 0.014651337324602587, "mc2": 0.3426250755220841, "mc2_stderr": 0.013487279265594353 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
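The card above notes that each run appears as a split named after the run timestamp (e.g. `2023_10_04T08_05_25.924210`). As a purely illustrative sketch — not part of the leaderboard tooling — such a split name can be mapped back to a `datetime` by restoring the usual `-` and `:` separators:

```python
from datetime import datetime

def split_name_to_timestamp(split_name: str) -> datetime:
    """Parse a run-split name such as '2023_10_04T08_05_25.924210'.

    Split names encode the run timestamp with '_' standing in for the
    usual '-' (date) and ':' (time) separators; restore them and parse
    the result as an ISO 8601 timestamp.
    """
    date_part, time_part = split_name.split("T")
    iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.fromisoformat(iso)
```

With a helper like this, the timestamped splits of a configuration can be sorted chronologically, which is what the "latest" split alias effectively points at.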
open-llm-leaderboard/details_formulae__Dorflan
2023-10-04T08:09:17.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of formulae/Dorflan dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [formulae/Dorflan](https://huggingface.co/formulae/Dorflan) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_formulae__Dorflan\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T08:07:52.776244](https://huggingface.co/datasets/open-llm-leaderboard/details_formulae__Dorflan/blob/main/results_2023-10-04T08-07-52.776244.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5148390017657016,\n\ \ \"acc_stderr\": 0.03501266949361778,\n \"acc_norm\": 0.5182735412397382,\n\ \ \"acc_norm_stderr\": 0.03500091899359179,\n \"mc1\": 0.3402692778457772,\n\ \ \"mc1_stderr\": 0.016586304901762557,\n \"mc2\": 0.5116789867770238,\n\ \ \"mc2_stderr\": 0.01599131763608086\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.514505119453925,\n \"acc_stderr\": 0.014605241081370056,\n\ \ \"acc_norm\": 0.5443686006825939,\n \"acc_norm_stderr\": 0.014553749939306863\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5850428201553476,\n\ \ \"acc_stderr\": 0.004917076726623795,\n \"acc_norm\": 0.7578171678948417,\n\ \ \"acc_norm_stderr\": 0.004275288367153572\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \ \ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n\ \ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n\ \ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.4342105263157895,\n \"acc_stderr\": 0.04033565667848319,\n\ \ \"acc_norm\": 0.4342105263157895,\n \"acc_norm_stderr\": 0.04033565667848319\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\ \ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \ \ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6188679245283019,\n \"acc_stderr\": 0.029890609686286644,\n\ \ \"acc_norm\": 0.6188679245283019,\n \"acc_norm_stderr\": 0.029890609686286644\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5416666666666666,\n\ \ \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.5416666666666666,\n\ \ \"acc_norm_stderr\": 0.04166666666666665\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\ : 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\ : {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \ \ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \ \ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n\ \ \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \ \ \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\"\ : {\n \"acc\": 0.4913294797687861,\n \"acc_stderr\": 0.03811890988940412,\n\ \ \"acc_norm\": 0.4913294797687861,\n \"acc_norm_stderr\": 0.03811890988940412\n\ \ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.29411764705882354,\n\ \ \"acc_stderr\": 0.04533838195929775,\n \"acc_norm\": 0.29411764705882354,\n\ \ \"acc_norm_stderr\": 0.04533838195929775\n },\n \"harness|hendrycksTest-computer_security|5\"\ : {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \ \ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \ \ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\":\ \ 0.451063829787234,\n \"acc_stderr\": 0.032529096196131965,\n \"\ acc_norm\": 0.451063829787234,\n \"acc_norm_stderr\": 0.032529096196131965\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\ \ \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n\ \ \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.04104269211806232,\n\ \ \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.04104269211806232\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.32275132275132273,\n \"acc_stderr\": 0.024078943243597016,\n \"\ acc_norm\": 0.32275132275132273,\n \"acc_norm_stderr\": 0.024078943243597016\n\ \ 
},\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\ \ \"acc_stderr\": 0.042857142857142816,\n \"acc_norm\": 0.35714285714285715,\n\ \ \"acc_norm_stderr\": 0.042857142857142816\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5709677419354838,\n\ \ \"acc_stderr\": 0.028156036538233193,\n \"acc_norm\": 0.5709677419354838,\n\ \ \"acc_norm_stderr\": 0.028156036538233193\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.3793103448275862,\n \"acc_stderr\": 0.03413963805906235,\n\ \ \"acc_norm\": 0.3793103448275862,\n \"acc_norm_stderr\": 0.03413963805906235\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\ : 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\ \ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.696969696969697,\n \"acc_stderr\": 0.03274287914026866,\n \"acc_norm\"\ : 0.696969696969697,\n \"acc_norm_stderr\": 0.03274287914026866\n },\n\ \ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \ \ \"acc\": 0.7150259067357513,\n \"acc_stderr\": 0.032577140777096614,\n\ \ \"acc_norm\": 0.7150259067357513,\n \"acc_norm_stderr\": 0.032577140777096614\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.5128205128205128,\n \"acc_stderr\": 0.025342671293807257,\n\ \ \"acc_norm\": 0.5128205128205128,\n \"acc_norm_stderr\": 0.025342671293807257\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 
0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \ \ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.4957983193277311,\n \"acc_stderr\": 0.03247734334448111,\n \ \ \"acc_norm\": 0.4957983193277311,\n \"acc_norm_stderr\": 0.03247734334448111\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"\ acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.726605504587156,\n \"acc_stderr\": 0.019109299846098295,\n \"\ acc_norm\": 0.726605504587156,\n \"acc_norm_stderr\": 0.019109299846098295\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538272,\n \"\ acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538272\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.6911764705882353,\n \"acc_stderr\": 0.03242661719827218,\n \"\ acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.03242661719827218\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7172995780590717,\n \"acc_stderr\": 0.029312814153955924,\n \ \ \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.029312814153955924\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5695067264573991,\n\ \ \"acc_stderr\": 0.033231973029429394,\n \"acc_norm\": 0.5695067264573991,\n\ \ \"acc_norm_stderr\": 0.033231973029429394\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.04328577215262972,\n\ \ \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.04328577215262972\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.6776859504132231,\n \"acc_stderr\": 
0.04266416363352168,\n \"\ acc_norm\": 0.6776859504132231,\n \"acc_norm_stderr\": 0.04266416363352168\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n\ \ \"acc_stderr\": 0.04643454608906276,\n \"acc_norm\": 0.6388888888888888,\n\ \ \"acc_norm_stderr\": 0.04643454608906276\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.5828220858895705,\n \"acc_stderr\": 0.03874102859818081,\n\ \ \"acc_norm\": 0.5828220858895705,\n \"acc_norm_stderr\": 0.03874102859818081\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\ \ \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.3482142857142857,\n\ \ \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326467,\n\ \ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326467\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7307692307692307,\n\ \ \"acc_stderr\": 0.029058588303748842,\n \"acc_norm\": 0.7307692307692307,\n\ \ \"acc_norm_stderr\": 0.029058588303748842\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \ \ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\ \ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7075351213282248,\n\ \ \"acc_stderr\": 0.01626700068459864,\n \"acc_norm\": 0.7075351213282248,\n\ \ \"acc_norm_stderr\": 0.01626700068459864\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.5375722543352601,\n \"acc_stderr\": 0.026842985519615375,\n\ \ \"acc_norm\": 0.5375722543352601,\n \"acc_norm_stderr\": 0.026842985519615375\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2681564245810056,\n\ \ \"acc_stderr\": 0.014816119635317008,\n \"acc_norm\": 0.2681564245810056,\n\ \ \"acc_norm_stderr\": 0.014816119635317008\n },\n 
\"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.5522875816993464,\n \"acc_stderr\": 0.028472938478033533,\n\ \ \"acc_norm\": 0.5522875816993464,\n \"acc_norm_stderr\": 0.028472938478033533\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5691318327974276,\n\ \ \"acc_stderr\": 0.02812534098397271,\n \"acc_norm\": 0.5691318327974276,\n\ \ \"acc_norm_stderr\": 0.02812534098397271\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.027701228468542595,\n\ \ \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.027701228468542595\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.41134751773049644,\n \"acc_stderr\": 0.02935491115994098,\n \ \ \"acc_norm\": 0.41134751773049644,\n \"acc_norm_stderr\": 0.02935491115994098\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39048239895697523,\n\ \ \"acc_stderr\": 0.012460135913945073,\n \"acc_norm\": 0.39048239895697523,\n\ \ \"acc_norm_stderr\": 0.012460135913945073\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5220588235294118,\n \"acc_stderr\": 0.03034326422421352,\n\ \ \"acc_norm\": 0.5220588235294118,\n \"acc_norm_stderr\": 0.03034326422421352\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.48366013071895425,\n \"acc_stderr\": 0.020217030653186457,\n \ \ \"acc_norm\": 0.48366013071895425,\n \"acc_norm_stderr\": 0.020217030653186457\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n\ \ \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n\ \ \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6163265306122448,\n \"acc_stderr\": 0.031130880396235933,\n\ \ \"acc_norm\": 0.6163265306122448,\n \"acc_norm_stderr\": 0.031130880396235933\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6368159203980099,\n\ \ 
\"acc_stderr\": 0.03400598505599014,\n \"acc_norm\": 0.6368159203980099,\n\ \ \"acc_norm_stderr\": 0.03400598505599014\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \ \ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\ \ \"acc_stderr\": 0.03828401115079023,\n \"acc_norm\": 0.40963855421686746,\n\ \ \"acc_norm_stderr\": 0.03828401115079023\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.6783625730994152,\n \"acc_stderr\": 0.03582529442573122,\n\ \ \"acc_norm\": 0.6783625730994152,\n \"acc_norm_stderr\": 0.03582529442573122\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3402692778457772,\n\ \ \"mc1_stderr\": 0.016586304901762557,\n \"mc2\": 0.5116789867770238,\n\ \ \"mc2_stderr\": 0.01599131763608086\n }\n}\n```" repo_url: https://huggingface.co/formulae/Dorflan leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|arc:challenge|25_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hellaswag|10_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T08-07-52.776244.parquet' - 
'**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T08-07-52.776244.parquet' - 
'**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T08-07-52.776244.parquet' - 
'**/details_harness|hendrycksTest-philosophy|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T08-07-52.776244.parquet' - 
'**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T08-07-52.776244.parquet' - 
'**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T08-07-52.776244.parquet' - 
'**/details_harness|hendrycksTest-public_relations|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T08-07-52.776244.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T08-07-52.776244.parquet' - 
config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T08-07-52.776244.parquet' - config_name: 
harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - 
split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T08-07-52.776244.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T08_07_52.776244 path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T08-07-52.776244.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T08-07-52.776244.parquet' - 
config_name: harness_hendrycksTest_us_foreign_policy_5
  data_files:
  - split: 2023_10_04T08_07_52.776244
    path:
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T08-07-52.776244.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_virology_5
  data_files:
  - split: 2023_10_04T08_07_52.776244
    path:
    - '**/details_harness|hendrycksTest-virology|5_2023-10-04T08-07-52.776244.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-virology|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_world_religions_5
  data_files:
  - split: 2023_10_04T08_07_52.776244
    path:
    - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T08-07-52.776244.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_truthfulqa_mc_0
  data_files:
  - split: 2023_10_04T08_07_52.776244
    path:
    - '**/details_harness|truthfulqa:mc|0_2023-10-04T08-07-52.776244.parquet'
  - split: latest
    path:
    - '**/details_harness|truthfulqa:mc|0_2023-10-04T08-07-52.776244.parquet'
- config_name: results
  data_files:
  - split: 2023_10_04T08_07_52.776244
    path:
    - results_2023-10-04T08-07-52.776244.parquet
  - split: latest
    path:
    - results_2023-10-04T08-07-52.776244.parquet
---

# Dataset Card for Evaluation run of formulae/Dorflan

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/formulae/Dorflan
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model [formulae/Dorflan](https://huggingface.co/formulae/Dorflan) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
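Each evaluation run is stored as a split named after its timestamp (plus a "latest" alias pointing at the most recent run). As an illustrative sketch, the split name used in the configurations above can be derived from a run timestamp like so:

```python
# Illustrative sketch: split names are the run timestamp with the
# characters '-' and ':' replaced by '_' (a "latest" alias always
# points at the most recent run in addition to these).
run_timestamp = "2023-10-04T08:07:52.776244"
split_name = run_timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # 2023_10_04T08_07_52.776244
```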
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_formulae__Dorflan",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-04T08:07:52.776244](https://huggingface.co/datasets/open-llm-leaderboard/details_formulae__Dorflan/blob/main/results_2023-10-04T08-07-52.776244.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5148390017657016, "acc_stderr": 0.03501266949361778, "acc_norm": 0.5182735412397382, "acc_norm_stderr": 0.03500091899359179, "mc1": 0.3402692778457772, "mc1_stderr": 0.016586304901762557, "mc2": 0.5116789867770238, "mc2_stderr": 0.01599131763608086 }, "harness|arc:challenge|25": { "acc": 0.514505119453925, "acc_stderr": 0.014605241081370056, "acc_norm": 0.5443686006825939, "acc_norm_stderr": 0.014553749939306863 }, "harness|hellaswag|10": { "acc": 0.5850428201553476, "acc_stderr": 0.004917076726623795, "acc_norm": 0.7578171678948417, "acc_norm_stderr": 0.004275288367153572 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.26, "acc_stderr": 0.04408440022768081, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768081 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4444444444444444, "acc_stderr": 0.04292596718256981, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.04292596718256981 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.4342105263157895, "acc_stderr": 0.04033565667848319, "acc_norm": 0.4342105263157895, "acc_norm_stderr": 0.04033565667848319 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6188679245283019, "acc_stderr": 0.029890609686286644, "acc_norm": 0.6188679245283019, "acc_norm_stderr": 0.029890609686286644 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5416666666666666, "acc_stderr": 0.04166666666666665, "acc_norm": 0.5416666666666666, "acc_norm_stderr": 0.04166666666666665 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.33, "acc_stderr": 0.04725815626252604, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252604 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.4913294797687861, "acc_stderr": 0.03811890988940412, "acc_norm": 0.4913294797687861, "acc_norm_stderr": 0.03811890988940412 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.29411764705882354, "acc_stderr": 0.04533838195929775, "acc_norm": 0.29411764705882354, "acc_norm_stderr": 0.04533838195929775 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.62, "acc_stderr": 0.048783173121456316, "acc_norm": 0.62, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.451063829787234, "acc_stderr": 0.032529096196131965, "acc_norm": 0.451063829787234, "acc_norm_stderr": 0.032529096196131965 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2807017543859649, "acc_stderr": 0.042270544512322, "acc_norm": 0.2807017543859649, "acc_norm_stderr": 0.042270544512322 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.41379310344827586, "acc_stderr": 0.04104269211806232, "acc_norm": 0.41379310344827586, "acc_norm_stderr": 0.04104269211806232 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.32275132275132273, "acc_stderr": 0.024078943243597016, "acc_norm": 0.32275132275132273, "acc_norm_stderr": 0.024078943243597016 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.35714285714285715, "acc_stderr": 0.042857142857142816, "acc_norm": 0.35714285714285715, "acc_norm_stderr": 0.042857142857142816 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5709677419354838, "acc_stderr": 0.028156036538233193, "acc_norm": 0.5709677419354838, "acc_norm_stderr": 0.028156036538233193 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 
0.3793103448275862, "acc_stderr": 0.03413963805906235, "acc_norm": 0.3793103448275862, "acc_norm_stderr": 0.03413963805906235 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6848484848484848, "acc_stderr": 0.0362773057502241, "acc_norm": 0.6848484848484848, "acc_norm_stderr": 0.0362773057502241 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.696969696969697, "acc_stderr": 0.03274287914026866, "acc_norm": 0.696969696969697, "acc_norm_stderr": 0.03274287914026866 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7150259067357513, "acc_stderr": 0.032577140777096614, "acc_norm": 0.7150259067357513, "acc_norm_stderr": 0.032577140777096614 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5128205128205128, "acc_stderr": 0.025342671293807257, "acc_norm": 0.5128205128205128, "acc_norm_stderr": 0.025342671293807257 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2851851851851852, "acc_stderr": 0.027528599210340492, "acc_norm": 0.2851851851851852, "acc_norm_stderr": 0.027528599210340492 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.4957983193277311, "acc_stderr": 0.03247734334448111, "acc_norm": 0.4957983193277311, "acc_norm_stderr": 0.03247734334448111 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3443708609271523, "acc_stderr": 0.03879687024073327, "acc_norm": 0.3443708609271523, "acc_norm_stderr": 0.03879687024073327 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.726605504587156, "acc_stderr": 0.019109299846098295, "acc_norm": 0.726605504587156, "acc_norm_stderr": 0.019109299846098295 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.46296296296296297, "acc_stderr": 0.03400603625538272, "acc_norm": 0.46296296296296297, "acc_norm_stderr": 
0.03400603625538272 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6911764705882353, "acc_stderr": 0.03242661719827218, "acc_norm": 0.6911764705882353, "acc_norm_stderr": 0.03242661719827218 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7172995780590717, "acc_stderr": 0.029312814153955924, "acc_norm": 0.7172995780590717, "acc_norm_stderr": 0.029312814153955924 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5695067264573991, "acc_stderr": 0.033231973029429394, "acc_norm": 0.5695067264573991, "acc_norm_stderr": 0.033231973029429394 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5801526717557252, "acc_stderr": 0.04328577215262972, "acc_norm": 0.5801526717557252, "acc_norm_stderr": 0.04328577215262972 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6776859504132231, "acc_stderr": 0.04266416363352168, "acc_norm": 0.6776859504132231, "acc_norm_stderr": 0.04266416363352168 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6388888888888888, "acc_stderr": 0.04643454608906276, "acc_norm": 0.6388888888888888, "acc_norm_stderr": 0.04643454608906276 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.5828220858895705, "acc_stderr": 0.03874102859818081, "acc_norm": 0.5828220858895705, "acc_norm_stderr": 0.03874102859818081 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3482142857142857, "acc_stderr": 0.04521829902833585, "acc_norm": 0.3482142857142857, "acc_norm_stderr": 0.04521829902833585 }, "harness|hendrycksTest-management|5": { "acc": 0.7184466019417476, "acc_stderr": 0.04453254836326467, "acc_norm": 0.7184466019417476, "acc_norm_stderr": 0.04453254836326467 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7307692307692307, "acc_stderr": 0.029058588303748842, "acc_norm": 0.7307692307692307, "acc_norm_stderr": 0.029058588303748842 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, 
"harness|hendrycksTest-miscellaneous|5": { "acc": 0.7075351213282248, "acc_stderr": 0.01626700068459864, "acc_norm": 0.7075351213282248, "acc_norm_stderr": 0.01626700068459864 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5375722543352601, "acc_stderr": 0.026842985519615375, "acc_norm": 0.5375722543352601, "acc_norm_stderr": 0.026842985519615375 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2681564245810056, "acc_stderr": 0.014816119635317008, "acc_norm": 0.2681564245810056, "acc_norm_stderr": 0.014816119635317008 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5522875816993464, "acc_stderr": 0.028472938478033533, "acc_norm": 0.5522875816993464, "acc_norm_stderr": 0.028472938478033533 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5691318327974276, "acc_stderr": 0.02812534098397271, "acc_norm": 0.5691318327974276, "acc_norm_stderr": 0.02812534098397271 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5462962962962963, "acc_stderr": 0.027701228468542595, "acc_norm": 0.5462962962962963, "acc_norm_stderr": 0.027701228468542595 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.41134751773049644, "acc_stderr": 0.02935491115994098, "acc_norm": 0.41134751773049644, "acc_norm_stderr": 0.02935491115994098 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.39048239895697523, "acc_stderr": 0.012460135913945073, "acc_norm": 0.39048239895697523, "acc_norm_stderr": 0.012460135913945073 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5220588235294118, "acc_stderr": 0.03034326422421352, "acc_norm": 0.5220588235294118, "acc_norm_stderr": 0.03034326422421352 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.48366013071895425, "acc_stderr": 0.020217030653186457, "acc_norm": 0.48366013071895425, "acc_norm_stderr": 0.020217030653186457 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5363636363636364, "acc_stderr": 0.04776449162396197, "acc_norm": 0.5363636363636364, "acc_norm_stderr": 
0.04776449162396197 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6163265306122448, "acc_stderr": 0.031130880396235933, "acc_norm": 0.6163265306122448, "acc_norm_stderr": 0.031130880396235933 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6368159203980099, "acc_stderr": 0.03400598505599014, "acc_norm": 0.6368159203980099, "acc_norm_stderr": 0.03400598505599014 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.72, "acc_stderr": 0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-virology|5": { "acc": 0.40963855421686746, "acc_stderr": 0.03828401115079023, "acc_norm": 0.40963855421686746, "acc_norm_stderr": 0.03828401115079023 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.6783625730994152, "acc_stderr": 0.03582529442573122, "acc_norm": 0.6783625730994152, "acc_norm_stderr": 0.03582529442573122 }, "harness|truthfulqa:mc|0": { "mc1": 0.3402692778457772, "mc1_stderr": 0.016586304901762557, "mc2": 0.5116789867770238, "mc2_stderr": 0.01599131763608086 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
atom-in-the-universe/bild-e1773bc7-a6de-49c1-a88b-bda387b9d573
2023-10-04T08:25:50.000Z
[ "region:us" ]
atom-in-the-universe
null
null
null
0
0
Entry not found
HamdanXI/daily_dialogue_text_to_gloss
2023-10-04T08:16:47.000Z
[ "region:us" ]
HamdanXI
null
null
null
0
0
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: text dtype: string - name: gloss dtype: string splits: - name: train num_bytes: 7544982 num_examples: 77350 download_size: 4908386 dataset_size: 7544982 --- # Dataset Card for "daily_dialogue_text_to_gloss" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
jiwon65/sample2
2023-10-04T08:24:48.000Z
[ "region:us" ]
jiwon65
null
null
null
0
0
Entry not found
atom-in-the-universe/bild-13cd89f1-fd89-4480-8ecf-7604ec9daedb
2023-10-04T08:39:57.000Z
[ "region:us" ]
atom-in-the-universe
null
null
null
0
0
Entry not found
open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Synthia-WVG-Test
2023-10-04T08:35:10.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of LTC-AI-Labs/L2-7b-Synthia-WVG-Test dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [LTC-AI-Labs/L2-7b-Synthia-WVG-Test](https://huggingface.co/LTC-AI-Labs/L2-7b-Synthia-WVG-Test)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Synthia-WVG-Test\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T08:33:52.196370](https://huggingface.co/datasets/open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Synthia-WVG-Test/blob/main/results_2023-10-04T08-33-52.196370.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4968647192514997,\n\ \ \"acc_stderr\": 0.03529451136869261,\n \"acc_norm\": 0.5006864622272934,\n\ \ \"acc_norm_stderr\": 0.03527999206994166,\n \"mc1\": 0.2876376988984088,\n\ \ \"mc1_stderr\": 0.01584631510139481,\n \"mc2\": 0.44105765043932066,\n\ \ \"mc2_stderr\": 0.01477470234047246\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5247440273037542,\n \"acc_stderr\": 0.01459348769493774,\n\ \ \"acc_norm\": 0.5597269624573379,\n \"acc_norm_stderr\": 0.014506769524804236\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5884285998805019,\n\ \ \"acc_stderr\": 0.004911125101064641,\n \"acc_norm\": 0.7789285002987453,\n\ \ \"acc_norm_stderr\": 0.004141204644892014\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \ \ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\ \ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n\ \ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.4605263157894737,\n \"acc_stderr\": 0.04056242252249034,\n\ \ \"acc_norm\": 0.4605263157894737,\n \"acc_norm_stderr\": 0.04056242252249034\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\ \ \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \ \ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.5056603773584906,\n \"acc_stderr\": 0.030770900763851316,\n\ \ \"acc_norm\": 0.5056603773584906,\n \"acc_norm_stderr\": 0.030770900763851316\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5069444444444444,\n\ \ \"acc_stderr\": 0.04180806750294938,\n \"acc_norm\": 0.5069444444444444,\n\ \ \"acc_norm_stderr\": 0.04180806750294938\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\"\ : 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939098,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939098\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.42196531791907516,\n\ \ \"acc_stderr\": 0.037657466938651504,\n \"acc_norm\": 0.42196531791907516,\n\ \ \"acc_norm_stderr\": 0.037657466938651504\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.03708284662416544,\n\ \ \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.03708284662416544\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n\ \ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400351,\n\ \ \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400351\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n\ \ \"acc_stderr\": 0.04537815354939392,\n \"acc_norm\": 0.3684210526315789,\n\ \ \"acc_norm_stderr\": 0.04537815354939392\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\ \ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.3306878306878307,\n \"acc_stderr\": 0.024229965298425082,\n \"\ acc_norm\": 0.3306878306878307,\n 
\"acc_norm_stderr\": 0.024229965298425082\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n\ \ \"acc_stderr\": 0.04104947269903394,\n \"acc_norm\": 0.30158730158730157,\n\ \ \"acc_norm_stderr\": 0.04104947269903394\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \ \ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.5290322580645161,\n \"acc_stderr\": 0.028396016402761,\n \"acc_norm\"\ : 0.5290322580645161,\n \"acc_norm_stderr\": 0.028396016402761\n },\n\ \ \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.37438423645320196,\n\ \ \"acc_stderr\": 0.03405155380561952,\n \"acc_norm\": 0.37438423645320196,\n\ \ \"acc_norm_stderr\": 0.03405155380561952\n },\n \"harness|hendrycksTest-high_school_computer_science|5\"\ : {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \ \ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \ \ },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"\ acc\": 0.6606060606060606,\n \"acc_stderr\": 0.03697442205031595,\n \ \ \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.03697442205031595\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.5757575757575758,\n \"acc_stderr\": 0.035212249088415845,\n \"\ acc_norm\": 0.5757575757575758,\n \"acc_norm_stderr\": 0.035212249088415845\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.7253886010362695,\n \"acc_stderr\": 0.03221024508041152,\n\ \ \"acc_norm\": 0.7253886010362695,\n \"acc_norm_stderr\": 0.03221024508041152\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.4128205128205128,\n \"acc_stderr\": 0.024962683564331803,\n\ \ \"acc_norm\": 0.4128205128205128,\n \"acc_norm_stderr\": 0.024962683564331803\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.29259259259259257,\n \"acc_stderr\": 0.027738969632176088,\n \ \ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.027738969632176088\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.032145368597886394,\n\ \ \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.032145368597886394\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"\ acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.6458715596330276,\n \"acc_stderr\": 0.020504729013829118,\n \"\ acc_norm\": 0.6458715596330276,\n \"acc_norm_stderr\": 0.020504729013829118\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n \"\ acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.6568627450980392,\n \"acc_stderr\": 0.033321399446680854,\n \"\ acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.033321399446680854\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.6877637130801688,\n \"acc_stderr\": 0.03016513786784701,\n \ \ \"acc_norm\": 0.6877637130801688,\n \"acc_norm_stderr\": 0.03016513786784701\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5874439461883408,\n\ \ \"acc_stderr\": 0.03304062175449297,\n \"acc_norm\": 0.5874439461883408,\n\ \ \"acc_norm_stderr\": 0.03304062175449297\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\ \ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.628099173553719,\n \"acc_stderr\": 0.044120158066245044,\n \"\ acc_norm\": 0.628099173553719,\n \"acc_norm_stderr\": 0.044120158066245044\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5833333333333334,\n\ \ \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.5833333333333334,\n\ \ \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.5398773006134969,\n \"acc_stderr\": 0.03915857291436971,\n\ \ \"acc_norm\": 0.5398773006134969,\n \"acc_norm_stderr\": 0.03915857291436971\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\ \ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \ \ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.6407766990291263,\n \"acc_stderr\": 0.047504583990416946,\n\ \ \"acc_norm\": 0.6407766990291263,\n \"acc_norm_stderr\": 0.047504583990416946\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7008547008547008,\n\ \ \"acc_stderr\": 0.029996951858349472,\n \"acc_norm\": 0.7008547008547008,\n\ \ \"acc_norm_stderr\": 0.029996951858349472\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \ \ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6871008939974457,\n\ \ \"acc_stderr\": 0.01658093594030406,\n \"acc_norm\": 0.6871008939974457,\n\ \ \"acc_norm_stderr\": 0.01658093594030406\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.523121387283237,\n \"acc_stderr\": 0.026890297881303125,\n\ \ \"acc_norm\": 0.523121387283237,\n \"acc_norm_stderr\": 0.026890297881303125\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24581005586592178,\n\ \ \"acc_stderr\": 0.01440029642922562,\n \"acc_norm\": 0.24581005586592178,\n\ \ 
\"acc_norm_stderr\": 0.01440029642922562\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.5392156862745098,\n \"acc_stderr\": 0.028541722692618874,\n\ \ \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.028541722692618874\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.594855305466238,\n\ \ \"acc_stderr\": 0.02788238379132595,\n \"acc_norm\": 0.594855305466238,\n\ \ \"acc_norm_stderr\": 0.02788238379132595\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.5401234567901234,\n \"acc_stderr\": 0.027731022753539274,\n\ \ \"acc_norm\": 0.5401234567901234,\n \"acc_norm_stderr\": 0.027731022753539274\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.37943262411347517,\n \"acc_stderr\": 0.028947338851614105,\n \ \ \"acc_norm\": 0.37943262411347517,\n \"acc_norm_stderr\": 0.028947338851614105\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3500651890482399,\n\ \ \"acc_stderr\": 0.01218255231321517,\n \"acc_norm\": 0.3500651890482399,\n\ \ \"acc_norm_stderr\": 0.01218255231321517\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.4852941176470588,\n \"acc_stderr\": 0.030359697079046104,\n\ \ \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.030359697079046104\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.49673202614379086,\n \"acc_stderr\": 0.020227402794434867,\n \ \ \"acc_norm\": 0.49673202614379086,\n \"acc_norm_stderr\": 0.020227402794434867\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n\ \ \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n\ \ \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.5877551020408164,\n \"acc_stderr\": 0.03151236044674269,\n\ \ \"acc_norm\": 0.5877551020408164,\n \"acc_norm_stderr\": 0.03151236044674269\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.582089552238806,\n\ \ \"acc_stderr\": 0.034875586404620636,\n \"acc_norm\": 0.582089552238806,\n\ \ \"acc_norm_stderr\": 0.034875586404620636\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \ \ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39759036144578314,\n\ \ \"acc_stderr\": 0.038099730845402184,\n \"acc_norm\": 0.39759036144578314,\n\ \ \"acc_norm_stderr\": 0.038099730845402184\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.0352821125824523,\n\ \ \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.0352821125824523\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2876376988984088,\n\ \ \"mc1_stderr\": 0.01584631510139481,\n \"mc2\": 0.44105765043932066,\n\ \ \"mc2_stderr\": 0.01477470234047246\n }\n}\n```" repo_url: https://huggingface.co/LTC-AI-Labs/L2-7b-Synthia-WVG-Test leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|arc:challenge|25_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hellaswag|10_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T08-33-52.196370.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T08-33-52.196370.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T08-33-52.196370.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T08-33-52.196370.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T08-33-52.196370.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T08-33-52.196370.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T08-33-52.196370.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T08-33-52.196370.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T08-33-52.196370.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T08_33_52.196370 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T08-33-52.196370.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T08-33-52.196370.parquet' - config_name: results data_files: - split: 2023_10_04T08_33_52.196370 path: - results_2023-10-04T08-33-52.196370.parquet - split: latest path: - results_2023-10-04T08-33-52.196370.parquet
---

# Dataset Card for Evaluation run of LTC-AI-Labs/L2-7b-Synthia-WVG-Test

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/LTC-AI-Labs/L2-7b-Synthia-WVG-Test
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model
[LTC-AI-Labs/L2-7b-Synthia-WVG-Test](https://huggingface.co/LTC-AI-Labs/L2-7b-Synthia-WVG-Test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Synthia-WVG-Test",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-04T08:33:52.196370](https://huggingface.co/datasets/open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Synthia-WVG-Test/blob/main/results_2023-10-04T08-33-52.196370.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.4968647192514997, "acc_stderr": 0.03529451136869261, "acc_norm": 0.5006864622272934, "acc_norm_stderr": 0.03527999206994166, "mc1": 0.2876376988984088, "mc1_stderr": 0.01584631510139481, "mc2": 0.44105765043932066, "mc2_stderr": 0.01477470234047246 }, "harness|arc:challenge|25": { "acc": 0.5247440273037542, "acc_stderr": 0.01459348769493774, "acc_norm": 0.5597269624573379, "acc_norm_stderr": 0.014506769524804236 }, "harness|hellaswag|10": { "acc": 0.5884285998805019, "acc_stderr": 0.004911125101064641, "acc_norm": 0.7789285002987453, "acc_norm_stderr": 0.004141204644892014 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.27, "acc_stderr": 0.044619604333847415, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847415 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5037037037037037, "acc_stderr": 0.04319223625811331, "acc_norm": 0.5037037037037037, "acc_norm_stderr": 0.04319223625811331 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.4605263157894737, "acc_stderr": 0.04056242252249034, "acc_norm": 0.4605263157894737, "acc_norm_stderr": 0.04056242252249034 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.51, "acc_stderr": 0.05024183937956911, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956911 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5056603773584906, "acc_stderr": 0.030770900763851316, "acc_norm": 0.5056603773584906, "acc_norm_stderr": 0.030770900763851316 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5069444444444444, "acc_stderr": 0.04180806750294938, "acc_norm": 0.5069444444444444, "acc_norm_stderr": 0.04180806750294938 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.49, "acc_stderr": 0.05024183937956911, "acc_norm": 0.49, 
"acc_norm_stderr": 0.05024183937956911 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.37, "acc_stderr": 0.04852365870939098, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939098 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.42196531791907516, "acc_stderr": 0.037657466938651504, "acc_norm": 0.42196531791907516, "acc_norm_stderr": 0.037657466938651504 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.16666666666666666, "acc_stderr": 0.03708284662416544, "acc_norm": 0.16666666666666666, "acc_norm_stderr": 0.03708284662416544 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.6, "acc_stderr": 0.04923659639173309, "acc_norm": 0.6, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4127659574468085, "acc_stderr": 0.03218471141400351, "acc_norm": 0.4127659574468085, "acc_norm_stderr": 0.03218471141400351 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.3684210526315789, "acc_stderr": 0.04537815354939392, "acc_norm": 0.3684210526315789, "acc_norm_stderr": 0.04537815354939392 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.4689655172413793, "acc_stderr": 0.04158632762097828, "acc_norm": 0.4689655172413793, "acc_norm_stderr": 0.04158632762097828 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3306878306878307, "acc_stderr": 0.024229965298425082, "acc_norm": 0.3306878306878307, "acc_norm_stderr": 0.024229965298425082 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.30158730158730157, "acc_stderr": 0.04104947269903394, "acc_norm": 0.30158730158730157, "acc_norm_stderr": 0.04104947269903394 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5290322580645161, "acc_stderr": 0.028396016402761, "acc_norm": 0.5290322580645161, "acc_norm_stderr": 0.028396016402761 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.37438423645320196, "acc_stderr": 0.03405155380561952, "acc_norm": 0.37438423645320196, "acc_norm_stderr": 0.03405155380561952 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6606060606060606, "acc_stderr": 0.03697442205031595, "acc_norm": 0.6606060606060606, "acc_norm_stderr": 0.03697442205031595 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.5757575757575758, "acc_stderr": 0.035212249088415845, "acc_norm": 0.5757575757575758, "acc_norm_stderr": 0.035212249088415845 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7253886010362695, "acc_stderr": 0.03221024508041152, "acc_norm": 0.7253886010362695, "acc_norm_stderr": 0.03221024508041152 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.4128205128205128, "acc_stderr": 0.024962683564331803, "acc_norm": 0.4128205128205128, "acc_norm_stderr": 0.024962683564331803 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.29259259259259257, "acc_stderr": 0.027738969632176088, "acc_norm": 0.29259259259259257, "acc_norm_stderr": 0.027738969632176088 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.42857142857142855, "acc_stderr": 0.032145368597886394, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.032145368597886394 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3443708609271523, "acc_stderr": 0.03879687024073327, "acc_norm": 0.3443708609271523, "acc_norm_stderr": 0.03879687024073327 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.6458715596330276, "acc_stderr": 0.020504729013829118, "acc_norm": 0.6458715596330276, "acc_norm_stderr": 0.020504729013829118 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.38425925925925924, "acc_stderr": 
0.03317354514310742, "acc_norm": 0.38425925925925924, "acc_norm_stderr": 0.03317354514310742 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6568627450980392, "acc_stderr": 0.033321399446680854, "acc_norm": 0.6568627450980392, "acc_norm_stderr": 0.033321399446680854 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.6877637130801688, "acc_stderr": 0.03016513786784701, "acc_norm": 0.6877637130801688, "acc_norm_stderr": 0.03016513786784701 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5874439461883408, "acc_stderr": 0.03304062175449297, "acc_norm": 0.5874439461883408, "acc_norm_stderr": 0.03304062175449297 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5954198473282443, "acc_stderr": 0.043046937953806645, "acc_norm": 0.5954198473282443, "acc_norm_stderr": 0.043046937953806645 }, "harness|hendrycksTest-international_law|5": { "acc": 0.628099173553719, "acc_stderr": 0.044120158066245044, "acc_norm": 0.628099173553719, "acc_norm_stderr": 0.044120158066245044 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.5833333333333334, "acc_stderr": 0.04766075165356461, "acc_norm": 0.5833333333333334, "acc_norm_stderr": 0.04766075165356461 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.5398773006134969, "acc_stderr": 0.03915857291436971, "acc_norm": 0.5398773006134969, "acc_norm_stderr": 0.03915857291436971 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4375, "acc_stderr": 0.04708567521880525, "acc_norm": 0.4375, "acc_norm_stderr": 0.04708567521880525 }, "harness|hendrycksTest-management|5": { "acc": 0.6407766990291263, "acc_stderr": 0.047504583990416946, "acc_norm": 0.6407766990291263, "acc_norm_stderr": 0.047504583990416946 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7008547008547008, "acc_stderr": 0.029996951858349472, "acc_norm": 0.7008547008547008, "acc_norm_stderr": 0.029996951858349472 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, 
"acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6871008939974457, "acc_stderr": 0.01658093594030406, "acc_norm": 0.6871008939974457, "acc_norm_stderr": 0.01658093594030406 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.523121387283237, "acc_stderr": 0.026890297881303125, "acc_norm": 0.523121387283237, "acc_norm_stderr": 0.026890297881303125 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24581005586592178, "acc_stderr": 0.01440029642922562, "acc_norm": 0.24581005586592178, "acc_norm_stderr": 0.01440029642922562 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5392156862745098, "acc_stderr": 0.028541722692618874, "acc_norm": 0.5392156862745098, "acc_norm_stderr": 0.028541722692618874 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.594855305466238, "acc_stderr": 0.02788238379132595, "acc_norm": 0.594855305466238, "acc_norm_stderr": 0.02788238379132595 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5401234567901234, "acc_stderr": 0.027731022753539274, "acc_norm": 0.5401234567901234, "acc_norm_stderr": 0.027731022753539274 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.37943262411347517, "acc_stderr": 0.028947338851614105, "acc_norm": 0.37943262411347517, "acc_norm_stderr": 0.028947338851614105 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3500651890482399, "acc_stderr": 0.01218255231321517, "acc_norm": 0.3500651890482399, "acc_norm_stderr": 0.01218255231321517 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.4852941176470588, "acc_stderr": 0.030359697079046104, "acc_norm": 0.4852941176470588, "acc_norm_stderr": 0.030359697079046104 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.49673202614379086, "acc_stderr": 0.020227402794434867, "acc_norm": 0.49673202614379086, "acc_norm_stderr": 0.020227402794434867 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5272727272727272, "acc_stderr": 0.04782001791380061, 
"acc_norm": 0.5272727272727272, "acc_norm_stderr": 0.04782001791380061 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5877551020408164, "acc_stderr": 0.03151236044674269, "acc_norm": 0.5877551020408164, "acc_norm_stderr": 0.03151236044674269 }, "harness|hendrycksTest-sociology|5": { "acc": 0.582089552238806, "acc_stderr": 0.034875586404620636, "acc_norm": 0.582089552238806, "acc_norm_stderr": 0.034875586404620636 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.67, "acc_stderr": 0.04725815626252609, "acc_norm": 0.67, "acc_norm_stderr": 0.04725815626252609 }, "harness|hendrycksTest-virology|5": { "acc": 0.39759036144578314, "acc_stderr": 0.038099730845402184, "acc_norm": 0.39759036144578314, "acc_norm_stderr": 0.038099730845402184 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.695906432748538, "acc_stderr": 0.0352821125824523, "acc_norm": 0.695906432748538, "acc_norm_stderr": 0.0352821125824523 }, "harness|truthfulqa:mc|0": { "mc1": 0.2876376988984088, "mc1_stderr": 0.01584631510139481, "mc2": 0.44105765043932066, "mc2_stderr": 0.01477470234047246 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
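The per-task entries in the results JSON above all share the same shape (`acc`, `acc_stderr`, `acc_norm`, `acc_norm_stderr`), so aggregate scores can be recomputed from them directly. A minimal sketch, using a few values copied from the results JSON above (the `mean_acc` helper name is illustrative, not part of the harness):

```python
# Average the per-task 'acc' scores from a harness-style results dict.
def mean_acc(results):
    accs = [entry["acc"] for entry in results.values()]
    return sum(accs) / len(accs)

# A few entries copied from the results JSON above.
results = {
    "harness|hendrycksTest-college_mathematics|5": {"acc": 0.37},
    "harness|hendrycksTest-computer_security|5": {"acc": 0.6},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.695906432748538},
}
print(round(mean_acc(results), 4))  # unweighted mean over these three tasks
```

The full leaderboard aggregate averages over all 61 task configurations in the same way.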
BangumiBase/rezero
2023-10-04T13:30:40.000Z
[ "size_categories:1K<n<10K", "license:mit", "art", "region:us" ]
BangumiBase
null
null
null
0
0
--- license: mit tags: - art size_categories: - 1K<n<10K --- # Bangumi Image Base of Re:Zero This is the image base of the bangumi Re:Zero; we detected 92 characters and 9641 images in total. The full dataset is [here](all.zip). **Please note that these image bases are not guaranteed to be 100% cleaned; they may contain noisy samples.** If you intend to manually train models on this dataset, we recommend preprocessing the downloaded data to eliminate potentially noisy samples (roughly 1% of images). Here is a preview of the characters: | # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 | |:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------| | 0 | 3392 | [Download](0/dataset.zip) | ![preview 1](0/preview_1.png) | ![preview 2](0/preview_2.png) | ![preview 3](0/preview_3.png) | ![preview 4](0/preview_4.png) | ![preview 5](0/preview_5.png) | ![preview 6](0/preview_6.png) | ![preview 7](0/preview_7.png) | ![preview 8](0/preview_8.png) | | 1 | 111 | [Download](1/dataset.zip) | ![preview 1](1/preview_1.png) | ![preview 2](1/preview_2.png) | ![preview 3](1/preview_3.png) | ![preview 4](1/preview_4.png) | ![preview 5](1/preview_5.png) | ![preview 6](1/preview_6.png) | ![preview 7](1/preview_7.png) | ![preview 8](1/preview_8.png) | | 2 | 52 | [Download](2/dataset.zip) | ![preview 1](2/preview_1.png) | ![preview 2](2/preview_2.png) | ![preview 3](2/preview_3.png) | ![preview 4](2/preview_4.png) | ![preview 5](2/preview_5.png) | ![preview 6](2/preview_6.png) | ![preview 7](2/preview_7.png) | ![preview 8](2/preview_8.png) | | 3 | 54 | [Download](3/dataset.zip) | ![preview 1](3/preview_1.png) | ![preview 2](3/preview_2.png) | ![preview 
3](3/preview_3.png) | ![preview 4](3/preview_4.png) | ![preview 5](3/preview_5.png) | ![preview 6](3/preview_6.png) | ![preview 7](3/preview_7.png) | ![preview 8](3/preview_8.png) | | 4 | 22 | [Download](4/dataset.zip) | ![preview 1](4/preview_1.png) | ![preview 2](4/preview_2.png) | ![preview 3](4/preview_3.png) | ![preview 4](4/preview_4.png) | ![preview 5](4/preview_5.png) | ![preview 6](4/preview_6.png) | ![preview 7](4/preview_7.png) | ![preview 8](4/preview_8.png) | | 5 | 34 | [Download](5/dataset.zip) | ![preview 1](5/preview_1.png) | ![preview 2](5/preview_2.png) | ![preview 3](5/preview_3.png) | ![preview 4](5/preview_4.png) | ![preview 5](5/preview_5.png) | ![preview 6](5/preview_6.png) | ![preview 7](5/preview_7.png) | ![preview 8](5/preview_8.png) | | 6 | 63 | [Download](6/dataset.zip) | ![preview 1](6/preview_1.png) | ![preview 2](6/preview_2.png) | ![preview 3](6/preview_3.png) | ![preview 4](6/preview_4.png) | ![preview 5](6/preview_5.png) | ![preview 6](6/preview_6.png) | ![preview 7](6/preview_7.png) | ![preview 8](6/preview_8.png) | | 7 | 38 | [Download](7/dataset.zip) | ![preview 1](7/preview_1.png) | ![preview 2](7/preview_2.png) | ![preview 3](7/preview_3.png) | ![preview 4](7/preview_4.png) | ![preview 5](7/preview_5.png) | ![preview 6](7/preview_6.png) | ![preview 7](7/preview_7.png) | ![preview 8](7/preview_8.png) | | 8 | 59 | [Download](8/dataset.zip) | ![preview 1](8/preview_1.png) | ![preview 2](8/preview_2.png) | ![preview 3](8/preview_3.png) | ![preview 4](8/preview_4.png) | ![preview 5](8/preview_5.png) | ![preview 6](8/preview_6.png) | ![preview 7](8/preview_7.png) | ![preview 8](8/preview_8.png) | | 9 | 25 | [Download](9/dataset.zip) | ![preview 1](9/preview_1.png) | ![preview 2](9/preview_2.png) | ![preview 3](9/preview_3.png) | ![preview 4](9/preview_4.png) | ![preview 5](9/preview_5.png) | ![preview 6](9/preview_6.png) | ![preview 7](9/preview_7.png) | ![preview 8](9/preview_8.png) | | 10 | 33 | [Download](10/dataset.zip) | 
![preview 1](10/preview_1.png) | ![preview 2](10/preview_2.png) | ![preview 3](10/preview_3.png) | ![preview 4](10/preview_4.png) | ![preview 5](10/preview_5.png) | ![preview 6](10/preview_6.png) | ![preview 7](10/preview_7.png) | ![preview 8](10/preview_8.png) | | 11 | 41 | [Download](11/dataset.zip) | ![preview 1](11/preview_1.png) | ![preview 2](11/preview_2.png) | ![preview 3](11/preview_3.png) | ![preview 4](11/preview_4.png) | ![preview 5](11/preview_5.png) | ![preview 6](11/preview_6.png) | ![preview 7](11/preview_7.png) | ![preview 8](11/preview_8.png) | | 12 | 135 | [Download](12/dataset.zip) | ![preview 1](12/preview_1.png) | ![preview 2](12/preview_2.png) | ![preview 3](12/preview_3.png) | ![preview 4](12/preview_4.png) | ![preview 5](12/preview_5.png) | ![preview 6](12/preview_6.png) | ![preview 7](12/preview_7.png) | ![preview 8](12/preview_8.png) | | 13 | 8 | [Download](13/dataset.zip) | ![preview 1](13/preview_1.png) | ![preview 2](13/preview_2.png) | ![preview 3](13/preview_3.png) | ![preview 4](13/preview_4.png) | ![preview 5](13/preview_5.png) | ![preview 6](13/preview_6.png) | ![preview 7](13/preview_7.png) | ![preview 8](13/preview_8.png) | | 14 | 20 | [Download](14/dataset.zip) | ![preview 1](14/preview_1.png) | ![preview 2](14/preview_2.png) | ![preview 3](14/preview_3.png) | ![preview 4](14/preview_4.png) | ![preview 5](14/preview_5.png) | ![preview 6](14/preview_6.png) | ![preview 7](14/preview_7.png) | ![preview 8](14/preview_8.png) | | 15 | 78 | [Download](15/dataset.zip) | ![preview 1](15/preview_1.png) | ![preview 2](15/preview_2.png) | ![preview 3](15/preview_3.png) | ![preview 4](15/preview_4.png) | ![preview 5](15/preview_5.png) | ![preview 6](15/preview_6.png) | ![preview 7](15/preview_7.png) | ![preview 8](15/preview_8.png) | | 16 | 96 | [Download](16/dataset.zip) | ![preview 1](16/preview_1.png) | ![preview 2](16/preview_2.png) | ![preview 3](16/preview_3.png) | ![preview 4](16/preview_4.png) | ![preview 5](16/preview_5.png) | 
![preview 6](16/preview_6.png) | ![preview 7](16/preview_7.png) | ![preview 8](16/preview_8.png) | | 17 | 36 | [Download](17/dataset.zip) | ![preview 1](17/preview_1.png) | ![preview 2](17/preview_2.png) | ![preview 3](17/preview_3.png) | ![preview 4](17/preview_4.png) | ![preview 5](17/preview_5.png) | ![preview 6](17/preview_6.png) | ![preview 7](17/preview_7.png) | ![preview 8](17/preview_8.png) | | 18 | 12 | [Download](18/dataset.zip) | ![preview 1](18/preview_1.png) | ![preview 2](18/preview_2.png) | ![preview 3](18/preview_3.png) | ![preview 4](18/preview_4.png) | ![preview 5](18/preview_5.png) | ![preview 6](18/preview_6.png) | ![preview 7](18/preview_7.png) | ![preview 8](18/preview_8.png) | | 19 | 24 | [Download](19/dataset.zip) | ![preview 1](19/preview_1.png) | ![preview 2](19/preview_2.png) | ![preview 3](19/preview_3.png) | ![preview 4](19/preview_4.png) | ![preview 5](19/preview_5.png) | ![preview 6](19/preview_6.png) | ![preview 7](19/preview_7.png) | ![preview 8](19/preview_8.png) | | 20 | 27 | [Download](20/dataset.zip) | ![preview 1](20/preview_1.png) | ![preview 2](20/preview_2.png) | ![preview 3](20/preview_3.png) | ![preview 4](20/preview_4.png) | ![preview 5](20/preview_5.png) | ![preview 6](20/preview_6.png) | ![preview 7](20/preview_7.png) | ![preview 8](20/preview_8.png) | | 21 | 72 | [Download](21/dataset.zip) | ![preview 1](21/preview_1.png) | ![preview 2](21/preview_2.png) | ![preview 3](21/preview_3.png) | ![preview 4](21/preview_4.png) | ![preview 5](21/preview_5.png) | ![preview 6](21/preview_6.png) | ![preview 7](21/preview_7.png) | ![preview 8](21/preview_8.png) | | 22 | 18 | [Download](22/dataset.zip) | ![preview 1](22/preview_1.png) | ![preview 2](22/preview_2.png) | ![preview 3](22/preview_3.png) | ![preview 4](22/preview_4.png) | ![preview 5](22/preview_5.png) | ![preview 6](22/preview_6.png) | ![preview 7](22/preview_7.png) | ![preview 8](22/preview_8.png) | | 23 | 19 | [Download](23/dataset.zip) | ![preview 
1](23/preview_1.png) | ![preview 2](23/preview_2.png) | ![preview 3](23/preview_3.png) | ![preview 4](23/preview_4.png) | ![preview 5](23/preview_5.png) | ![preview 6](23/preview_6.png) | ![preview 7](23/preview_7.png) | ![preview 8](23/preview_8.png) | | 24 | 8 | [Download](24/dataset.zip) | ![preview 1](24/preview_1.png) | ![preview 2](24/preview_2.png) | ![preview 3](24/preview_3.png) | ![preview 4](24/preview_4.png) | ![preview 5](24/preview_5.png) | ![preview 6](24/preview_6.png) | ![preview 7](24/preview_7.png) | ![preview 8](24/preview_8.png) | | 25 | 20 | [Download](25/dataset.zip) | ![preview 1](25/preview_1.png) | ![preview 2](25/preview_2.png) | ![preview 3](25/preview_3.png) | ![preview 4](25/preview_4.png) | ![preview 5](25/preview_5.png) | ![preview 6](25/preview_6.png) | ![preview 7](25/preview_7.png) | ![preview 8](25/preview_8.png) | | 26 | 38 | [Download](26/dataset.zip) | ![preview 1](26/preview_1.png) | ![preview 2](26/preview_2.png) | ![preview 3](26/preview_3.png) | ![preview 4](26/preview_4.png) | ![preview 5](26/preview_5.png) | ![preview 6](26/preview_6.png) | ![preview 7](26/preview_7.png) | ![preview 8](26/preview_8.png) | | 27 | 44 | [Download](27/dataset.zip) | ![preview 1](27/preview_1.png) | ![preview 2](27/preview_2.png) | ![preview 3](27/preview_3.png) | ![preview 4](27/preview_4.png) | ![preview 5](27/preview_5.png) | ![preview 6](27/preview_6.png) | ![preview 7](27/preview_7.png) | ![preview 8](27/preview_8.png) | | 28 | 10 | [Download](28/dataset.zip) | ![preview 1](28/preview_1.png) | ![preview 2](28/preview_2.png) | ![preview 3](28/preview_3.png) | ![preview 4](28/preview_4.png) | ![preview 5](28/preview_5.png) | ![preview 6](28/preview_6.png) | ![preview 7](28/preview_7.png) | ![preview 8](28/preview_8.png) | | 29 | 116 | [Download](29/dataset.zip) | ![preview 1](29/preview_1.png) | ![preview 2](29/preview_2.png) | ![preview 3](29/preview_3.png) | ![preview 4](29/preview_4.png) | ![preview 5](29/preview_5.png) | ![preview 
6](29/preview_6.png) | ![preview 7](29/preview_7.png) | ![preview 8](29/preview_8.png) | | 30 | 41 | [Download](30/dataset.zip) | ![preview 1](30/preview_1.png) | ![preview 2](30/preview_2.png) | ![preview 3](30/preview_3.png) | ![preview 4](30/preview_4.png) | ![preview 5](30/preview_5.png) | ![preview 6](30/preview_6.png) | ![preview 7](30/preview_7.png) | ![preview 8](30/preview_8.png) | | 31 | 25 | [Download](31/dataset.zip) | ![preview 1](31/preview_1.png) | ![preview 2](31/preview_2.png) | ![preview 3](31/preview_3.png) | ![preview 4](31/preview_4.png) | ![preview 5](31/preview_5.png) | ![preview 6](31/preview_6.png) | ![preview 7](31/preview_7.png) | ![preview 8](31/preview_8.png) | | 32 | 17 | [Download](32/dataset.zip) | ![preview 1](32/preview_1.png) | ![preview 2](32/preview_2.png) | ![preview 3](32/preview_3.png) | ![preview 4](32/preview_4.png) | ![preview 5](32/preview_5.png) | ![preview 6](32/preview_6.png) | ![preview 7](32/preview_7.png) | ![preview 8](32/preview_8.png) | | 33 | 151 | [Download](33/dataset.zip) | ![preview 1](33/preview_1.png) | ![preview 2](33/preview_2.png) | ![preview 3](33/preview_3.png) | ![preview 4](33/preview_4.png) | ![preview 5](33/preview_5.png) | ![preview 6](33/preview_6.png) | ![preview 7](33/preview_7.png) | ![preview 8](33/preview_8.png) | | 34 | 24 | [Download](34/dataset.zip) | ![preview 1](34/preview_1.png) | ![preview 2](34/preview_2.png) | ![preview 3](34/preview_3.png) | ![preview 4](34/preview_4.png) | ![preview 5](34/preview_5.png) | ![preview 6](34/preview_6.png) | ![preview 7](34/preview_7.png) | ![preview 8](34/preview_8.png) | | 35 | 75 | [Download](35/dataset.zip) | ![preview 1](35/preview_1.png) | ![preview 2](35/preview_2.png) | ![preview 3](35/preview_3.png) | ![preview 4](35/preview_4.png) | ![preview 5](35/preview_5.png) | ![preview 6](35/preview_6.png) | ![preview 7](35/preview_7.png) | ![preview 8](35/preview_8.png) | | 36 | 26 | [Download](36/dataset.zip) | ![preview 1](36/preview_1.png) | 
![preview 2](36/preview_2.png) | ![preview 3](36/preview_3.png) | ![preview 4](36/preview_4.png) | ![preview 5](36/preview_5.png) | ![preview 6](36/preview_6.png) | ![preview 7](36/preview_7.png) | ![preview 8](36/preview_8.png) | | 37 | 8 | [Download](37/dataset.zip) | ![preview 1](37/preview_1.png) | ![preview 2](37/preview_2.png) | ![preview 3](37/preview_3.png) | ![preview 4](37/preview_4.png) | ![preview 5](37/preview_5.png) | ![preview 6](37/preview_6.png) | ![preview 7](37/preview_7.png) | ![preview 8](37/preview_8.png) | | 38 | 40 | [Download](38/dataset.zip) | ![preview 1](38/preview_1.png) | ![preview 2](38/preview_2.png) | ![preview 3](38/preview_3.png) | ![preview 4](38/preview_4.png) | ![preview 5](38/preview_5.png) | ![preview 6](38/preview_6.png) | ![preview 7](38/preview_7.png) | ![preview 8](38/preview_8.png) | | 39 | 71 | [Download](39/dataset.zip) | ![preview 1](39/preview_1.png) | ![preview 2](39/preview_2.png) | ![preview 3](39/preview_3.png) | ![preview 4](39/preview_4.png) | ![preview 5](39/preview_5.png) | ![preview 6](39/preview_6.png) | ![preview 7](39/preview_7.png) | ![preview 8](39/preview_8.png) | | 40 | 279 | [Download](40/dataset.zip) | ![preview 1](40/preview_1.png) | ![preview 2](40/preview_2.png) | ![preview 3](40/preview_3.png) | ![preview 4](40/preview_4.png) | ![preview 5](40/preview_5.png) | ![preview 6](40/preview_6.png) | ![preview 7](40/preview_7.png) | ![preview 8](40/preview_8.png) | | 41 | 715 | [Download](41/dataset.zip) | ![preview 1](41/preview_1.png) | ![preview 2](41/preview_2.png) | ![preview 3](41/preview_3.png) | ![preview 4](41/preview_4.png) | ![preview 5](41/preview_5.png) | ![preview 6](41/preview_6.png) | ![preview 7](41/preview_7.png) | ![preview 8](41/preview_8.png) | | 42 | 41 | [Download](42/dataset.zip) | ![preview 1](42/preview_1.png) | ![preview 2](42/preview_2.png) | ![preview 3](42/preview_3.png) | ![preview 4](42/preview_4.png) | ![preview 5](42/preview_5.png) | ![preview 6](42/preview_6.png) | 
![preview 7](42/preview_7.png) | ![preview 8](42/preview_8.png) | | 43 | 58 | [Download](43/dataset.zip) | ![preview 1](43/preview_1.png) | ![preview 2](43/preview_2.png) | ![preview 3](43/preview_3.png) | ![preview 4](43/preview_4.png) | ![preview 5](43/preview_5.png) | ![preview 6](43/preview_6.png) | ![preview 7](43/preview_7.png) | ![preview 8](43/preview_8.png) | | 44 | 49 | [Download](44/dataset.zip) | ![preview 1](44/preview_1.png) | ![preview 2](44/preview_2.png) | ![preview 3](44/preview_3.png) | ![preview 4](44/preview_4.png) | ![preview 5](44/preview_5.png) | ![preview 6](44/preview_6.png) | ![preview 7](44/preview_7.png) | ![preview 8](44/preview_8.png) | | 45 | 87 | [Download](45/dataset.zip) | ![preview 1](45/preview_1.png) | ![preview 2](45/preview_2.png) | ![preview 3](45/preview_3.png) | ![preview 4](45/preview_4.png) | ![preview 5](45/preview_5.png) | ![preview 6](45/preview_6.png) | ![preview 7](45/preview_7.png) | ![preview 8](45/preview_8.png) | | 46 | 20 | [Download](46/dataset.zip) | ![preview 1](46/preview_1.png) | ![preview 2](46/preview_2.png) | ![preview 3](46/preview_3.png) | ![preview 4](46/preview_4.png) | ![preview 5](46/preview_5.png) | ![preview 6](46/preview_6.png) | ![preview 7](46/preview_7.png) | ![preview 8](46/preview_8.png) | | 47 | 20 | [Download](47/dataset.zip) | ![preview 1](47/preview_1.png) | ![preview 2](47/preview_2.png) | ![preview 3](47/preview_3.png) | ![preview 4](47/preview_4.png) | ![preview 5](47/preview_5.png) | ![preview 6](47/preview_6.png) | ![preview 7](47/preview_7.png) | ![preview 8](47/preview_8.png) | | 48 | 596 | [Download](48/dataset.zip) | ![preview 1](48/preview_1.png) | ![preview 2](48/preview_2.png) | ![preview 3](48/preview_3.png) | ![preview 4](48/preview_4.png) | ![preview 5](48/preview_5.png) | ![preview 6](48/preview_6.png) | ![preview 7](48/preview_7.png) | ![preview 8](48/preview_8.png) | | 49 | 14 | [Download](49/dataset.zip) | ![preview 1](49/preview_1.png) | ![preview 
2](49/preview_2.png) | ![preview 3](49/preview_3.png) | ![preview 4](49/preview_4.png) | ![preview 5](49/preview_5.png) | ![preview 6](49/preview_6.png) | ![preview 7](49/preview_7.png) | ![preview 8](49/preview_8.png) | | 50 | 31 | [Download](50/dataset.zip) | ![preview 1](50/preview_1.png) | ![preview 2](50/preview_2.png) | ![preview 3](50/preview_3.png) | ![preview 4](50/preview_4.png) | ![preview 5](50/preview_5.png) | ![preview 6](50/preview_6.png) | ![preview 7](50/preview_7.png) | ![preview 8](50/preview_8.png) | | 51 | 24 | [Download](51/dataset.zip) | ![preview 1](51/preview_1.png) | ![preview 2](51/preview_2.png) | ![preview 3](51/preview_3.png) | ![preview 4](51/preview_4.png) | ![preview 5](51/preview_5.png) | ![preview 6](51/preview_6.png) | ![preview 7](51/preview_7.png) | ![preview 8](51/preview_8.png) | | 52 | 379 | [Download](52/dataset.zip) | ![preview 1](52/preview_1.png) | ![preview 2](52/preview_2.png) | ![preview 3](52/preview_3.png) | ![preview 4](52/preview_4.png) | ![preview 5](52/preview_5.png) | ![preview 6](52/preview_6.png) | ![preview 7](52/preview_7.png) | ![preview 8](52/preview_8.png) | | 53 | 10 | [Download](53/dataset.zip) | ![preview 1](53/preview_1.png) | ![preview 2](53/preview_2.png) | ![preview 3](53/preview_3.png) | ![preview 4](53/preview_4.png) | ![preview 5](53/preview_5.png) | ![preview 6](53/preview_6.png) | ![preview 7](53/preview_7.png) | ![preview 8](53/preview_8.png) | | 54 | 7 | [Download](54/dataset.zip) | ![preview 1](54/preview_1.png) | ![preview 2](54/preview_2.png) | ![preview 3](54/preview_3.png) | ![preview 4](54/preview_4.png) | ![preview 5](54/preview_5.png) | ![preview 6](54/preview_6.png) | ![preview 7](54/preview_7.png) | N/A | | 55 | 7 | [Download](55/dataset.zip) | ![preview 1](55/preview_1.png) | ![preview 2](55/preview_2.png) | ![preview 3](55/preview_3.png) | ![preview 4](55/preview_4.png) | ![preview 5](55/preview_5.png) | ![preview 6](55/preview_6.png) | ![preview 7](55/preview_7.png) | N/A | | 
56 | 54 | [Download](56/dataset.zip) | ![preview 1](56/preview_1.png) | ![preview 2](56/preview_2.png) | ![preview 3](56/preview_3.png) | ![preview 4](56/preview_4.png) | ![preview 5](56/preview_5.png) | ![preview 6](56/preview_6.png) | ![preview 7](56/preview_7.png) | ![preview 8](56/preview_8.png) | | 57 | 20 | [Download](57/dataset.zip) | ![preview 1](57/preview_1.png) | ![preview 2](57/preview_2.png) | ![preview 3](57/preview_3.png) | ![preview 4](57/preview_4.png) | ![preview 5](57/preview_5.png) | ![preview 6](57/preview_6.png) | ![preview 7](57/preview_7.png) | ![preview 8](57/preview_8.png) | | 58 | 215 | [Download](58/dataset.zip) | ![preview 1](58/preview_1.png) | ![preview 2](58/preview_2.png) | ![preview 3](58/preview_3.png) | ![preview 4](58/preview_4.png) | ![preview 5](58/preview_5.png) | ![preview 6](58/preview_6.png) | ![preview 7](58/preview_7.png) | ![preview 8](58/preview_8.png) | | 59 | 13 | [Download](59/dataset.zip) | ![preview 1](59/preview_1.png) | ![preview 2](59/preview_2.png) | ![preview 3](59/preview_3.png) | ![preview 4](59/preview_4.png) | ![preview 5](59/preview_5.png) | ![preview 6](59/preview_6.png) | ![preview 7](59/preview_7.png) | ![preview 8](59/preview_8.png) | | 60 | 12 | [Download](60/dataset.zip) | ![preview 1](60/preview_1.png) | ![preview 2](60/preview_2.png) | ![preview 3](60/preview_3.png) | ![preview 4](60/preview_4.png) | ![preview 5](60/preview_5.png) | ![preview 6](60/preview_6.png) | ![preview 7](60/preview_7.png) | ![preview 8](60/preview_8.png) | | 61 | 37 | [Download](61/dataset.zip) | ![preview 1](61/preview_1.png) | ![preview 2](61/preview_2.png) | ![preview 3](61/preview_3.png) | ![preview 4](61/preview_4.png) | ![preview 5](61/preview_5.png) | ![preview 6](61/preview_6.png) | ![preview 7](61/preview_7.png) | ![preview 8](61/preview_8.png) | | 62 | 47 | [Download](62/dataset.zip) | ![preview 1](62/preview_1.png) | ![preview 2](62/preview_2.png) | ![preview 3](62/preview_3.png) | ![preview 4](62/preview_4.png) 
| ![preview 5](62/preview_5.png) | ![preview 6](62/preview_6.png) | ![preview 7](62/preview_7.png) | ![preview 8](62/preview_8.png) | | 63 | 18 | [Download](63/dataset.zip) | ![preview 1](63/preview_1.png) | ![preview 2](63/preview_2.png) | ![preview 3](63/preview_3.png) | ![preview 4](63/preview_4.png) | ![preview 5](63/preview_5.png) | ![preview 6](63/preview_6.png) | ![preview 7](63/preview_7.png) | ![preview 8](63/preview_8.png) | | 64 | 327 | [Download](64/dataset.zip) | ![preview 1](64/preview_1.png) | ![preview 2](64/preview_2.png) | ![preview 3](64/preview_3.png) | ![preview 4](64/preview_4.png) | ![preview 5](64/preview_5.png) | ![preview 6](64/preview_6.png) | ![preview 7](64/preview_7.png) | ![preview 8](64/preview_8.png) | | 65 | 44 | [Download](65/dataset.zip) | ![preview 1](65/preview_1.png) | ![preview 2](65/preview_2.png) | ![preview 3](65/preview_3.png) | ![preview 4](65/preview_4.png) | ![preview 5](65/preview_5.png) | ![preview 6](65/preview_6.png) | ![preview 7](65/preview_7.png) | ![preview 8](65/preview_8.png) | | 66 | 155 | [Download](66/dataset.zip) | ![preview 1](66/preview_1.png) | ![preview 2](66/preview_2.png) | ![preview 3](66/preview_3.png) | ![preview 4](66/preview_4.png) | ![preview 5](66/preview_5.png) | ![preview 6](66/preview_6.png) | ![preview 7](66/preview_7.png) | ![preview 8](66/preview_8.png) | | 67 | 17 | [Download](67/dataset.zip) | ![preview 1](67/preview_1.png) | ![preview 2](67/preview_2.png) | ![preview 3](67/preview_3.png) | ![preview 4](67/preview_4.png) | ![preview 5](67/preview_5.png) | ![preview 6](67/preview_6.png) | ![preview 7](67/preview_7.png) | ![preview 8](67/preview_8.png) | | 68 | 13 | [Download](68/dataset.zip) | ![preview 1](68/preview_1.png) | ![preview 2](68/preview_2.png) | ![preview 3](68/preview_3.png) | ![preview 4](68/preview_4.png) | ![preview 5](68/preview_5.png) | ![preview 6](68/preview_6.png) | ![preview 7](68/preview_7.png) | ![preview 8](68/preview_8.png) | | 69 | 39 | 
[Download](69/dataset.zip) | ![preview 1](69/preview_1.png) | ![preview 2](69/preview_2.png) | ![preview 3](69/preview_3.png) | ![preview 4](69/preview_4.png) | ![preview 5](69/preview_5.png) | ![preview 6](69/preview_6.png) | ![preview 7](69/preview_7.png) | ![preview 8](69/preview_8.png) |
| 70 | 92 | [Download](70/dataset.zip) | ![preview 1](70/preview_1.png) | ![preview 2](70/preview_2.png) | ![preview 3](70/preview_3.png) | ![preview 4](70/preview_4.png) | ![preview 5](70/preview_5.png) | ![preview 6](70/preview_6.png) | ![preview 7](70/preview_7.png) | ![preview 8](70/preview_8.png) |
| 71 | 53 | [Download](71/dataset.zip) | ![preview 1](71/preview_1.png) | ![preview 2](71/preview_2.png) | ![preview 3](71/preview_3.png) | ![preview 4](71/preview_4.png) | ![preview 5](71/preview_5.png) | ![preview 6](71/preview_6.png) | ![preview 7](71/preview_7.png) | ![preview 8](71/preview_8.png) |
| 72 | 28 | [Download](72/dataset.zip) | ![preview 1](72/preview_1.png) | ![preview 2](72/preview_2.png) | ![preview 3](72/preview_3.png) | ![preview 4](72/preview_4.png) | ![preview 5](72/preview_5.png) | ![preview 6](72/preview_6.png) | ![preview 7](72/preview_7.png) | ![preview 8](72/preview_8.png) |
| 73 | 85 | [Download](73/dataset.zip) | ![preview 1](73/preview_1.png) | ![preview 2](73/preview_2.png) | ![preview 3](73/preview_3.png) | ![preview 4](73/preview_4.png) | ![preview 5](73/preview_5.png) | ![preview 6](73/preview_6.png) | ![preview 7](73/preview_7.png) | ![preview 8](73/preview_8.png) |
| 74 | 75 | [Download](74/dataset.zip) | ![preview 1](74/preview_1.png) | ![preview 2](74/preview_2.png) | ![preview 3](74/preview_3.png) | ![preview 4](74/preview_4.png) | ![preview 5](74/preview_5.png) | ![preview 6](74/preview_6.png) | ![preview 7](74/preview_7.png) | ![preview 8](74/preview_8.png) |
| 75 | 28 | [Download](75/dataset.zip) | ![preview 1](75/preview_1.png) | ![preview 2](75/preview_2.png) | ![preview 3](75/preview_3.png) | ![preview 4](75/preview_4.png) | ![preview 5](75/preview_5.png) | ![preview 6](75/preview_6.png) | ![preview 7](75/preview_7.png) | ![preview 8](75/preview_8.png) |
| 76 | 11 | [Download](76/dataset.zip) | ![preview 1](76/preview_1.png) | ![preview 2](76/preview_2.png) | ![preview 3](76/preview_3.png) | ![preview 4](76/preview_4.png) | ![preview 5](76/preview_5.png) | ![preview 6](76/preview_6.png) | ![preview 7](76/preview_7.png) | ![preview 8](76/preview_8.png) |
| 77 | 14 | [Download](77/dataset.zip) | ![preview 1](77/preview_1.png) | ![preview 2](77/preview_2.png) | ![preview 3](77/preview_3.png) | ![preview 4](77/preview_4.png) | ![preview 5](77/preview_5.png) | ![preview 6](77/preview_6.png) | ![preview 7](77/preview_7.png) | ![preview 8](77/preview_8.png) |
| 78 | 7 | [Download](78/dataset.zip) | ![preview 1](78/preview_1.png) | ![preview 2](78/preview_2.png) | ![preview 3](78/preview_3.png) | ![preview 4](78/preview_4.png) | ![preview 5](78/preview_5.png) | ![preview 6](78/preview_6.png) | ![preview 7](78/preview_7.png) | N/A |
| 79 | 9 | [Download](79/dataset.zip) | ![preview 1](79/preview_1.png) | ![preview 2](79/preview_2.png) | ![preview 3](79/preview_3.png) | ![preview 4](79/preview_4.png) | ![preview 5](79/preview_5.png) | ![preview 6](79/preview_6.png) | ![preview 7](79/preview_7.png) | ![preview 8](79/preview_8.png) |
| 80 | 17 | [Download](80/dataset.zip) | ![preview 1](80/preview_1.png) | ![preview 2](80/preview_2.png) | ![preview 3](80/preview_3.png) | ![preview 4](80/preview_4.png) | ![preview 5](80/preview_5.png) | ![preview 6](80/preview_6.png) | ![preview 7](80/preview_7.png) | ![preview 8](80/preview_8.png) |
| 81 | 11 | [Download](81/dataset.zip) | ![preview 1](81/preview_1.png) | ![preview 2](81/preview_2.png) | ![preview 3](81/preview_3.png) | ![preview 4](81/preview_4.png) | ![preview 5](81/preview_5.png) | ![preview 6](81/preview_6.png) | ![preview 7](81/preview_7.png) | ![preview 8](81/preview_8.png) |
| 82 | 14 | [Download](82/dataset.zip) | ![preview 1](82/preview_1.png) | ![preview 2](82/preview_2.png) | ![preview 3](82/preview_3.png) | ![preview 4](82/preview_4.png) | ![preview 5](82/preview_5.png) | ![preview 6](82/preview_6.png) | ![preview 7](82/preview_7.png) | ![preview 8](82/preview_8.png) |
| 83 | 6 | [Download](83/dataset.zip) | ![preview 1](83/preview_1.png) | ![preview 2](83/preview_2.png) | ![preview 3](83/preview_3.png) | ![preview 4](83/preview_4.png) | ![preview 5](83/preview_5.png) | ![preview 6](83/preview_6.png) | N/A | N/A |
| 84 | 20 | [Download](84/dataset.zip) | ![preview 1](84/preview_1.png) | ![preview 2](84/preview_2.png) | ![preview 3](84/preview_3.png) | ![preview 4](84/preview_4.png) | ![preview 5](84/preview_5.png) | ![preview 6](84/preview_6.png) | ![preview 7](84/preview_7.png) | ![preview 8](84/preview_8.png) |
| 85 | 20 | [Download](85/dataset.zip) | ![preview 1](85/preview_1.png) | ![preview 2](85/preview_2.png) | ![preview 3](85/preview_3.png) | ![preview 4](85/preview_4.png) | ![preview 5](85/preview_5.png) | ![preview 6](85/preview_6.png) | ![preview 7](85/preview_7.png) | ![preview 8](85/preview_8.png) |
| 86 | 16 | [Download](86/dataset.zip) | ![preview 1](86/preview_1.png) | ![preview 2](86/preview_2.png) | ![preview 3](86/preview_3.png) | ![preview 4](86/preview_4.png) | ![preview 5](86/preview_5.png) | ![preview 6](86/preview_6.png) | ![preview 7](86/preview_7.png) | ![preview 8](86/preview_8.png) |
| 87 | 26 | [Download](87/dataset.zip) | ![preview 1](87/preview_1.png) | ![preview 2](87/preview_2.png) | ![preview 3](87/preview_3.png) | ![preview 4](87/preview_4.png) | ![preview 5](87/preview_5.png) | ![preview 6](87/preview_6.png) | ![preview 7](87/preview_7.png) | ![preview 8](87/preview_8.png) |
| 88 | 190 | [Download](88/dataset.zip) | ![preview 1](88/preview_1.png) | ![preview 2](88/preview_2.png) | ![preview 3](88/preview_3.png) | ![preview 4](88/preview_4.png) | ![preview 5](88/preview_5.png) | ![preview 6](88/preview_6.png) | ![preview 7](88/preview_7.png) | ![preview 8](88/preview_8.png) |
| 89 | 5 | [Download](89/dataset.zip) | ![preview 1](89/preview_1.png) | ![preview 2](89/preview_2.png) | ![preview 3](89/preview_3.png) | ![preview 4](89/preview_4.png) | ![preview 5](89/preview_5.png) | N/A | N/A | N/A |
| 90 | 8 | [Download](90/dataset.zip) | ![preview 1](90/preview_1.png) | ![preview 2](90/preview_2.png) | ![preview 3](90/preview_3.png) | ![preview 4](90/preview_4.png) | ![preview 5](90/preview_5.png) | ![preview 6](90/preview_6.png) | ![preview 7](90/preview_7.png) | ![preview 8](90/preview_8.png) |
| noise | 375 | [Download](-1/dataset.zip) | ![preview 1](-1/preview_1.png) | ![preview 2](-1/preview_2.png) | ![preview 3](-1/preview_3.png) | ![preview 4](-1/preview_4.png) | ![preview 5](-1/preview_5.png) | ![preview 6](-1/preview_6.png) | ![preview 7](-1/preview_7.png) | ![preview 8](-1/preview_8.png) |
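Each row above pairs a cluster id with a `dataset.zip` archive of images. A minimal sketch of unpacking one of these archives once it has been downloaded and collecting the image paths inside it (the helper name and image-extension filter are assumptions, not part of the dataset itself):

```python
import io
import zipfile
from pathlib import Path

# Hypothetical helper: unpack the raw bytes of a downloaded dataset.zip
# into `dest` and return the extracted image files, sorted by path.
def extract_images(zip_bytes: bytes, dest: Path) -> list[Path]:
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        zf.extractall(dest)
    # Keep only common raster-image extensions (an assumption about contents).
    return sorted(
        p for p in dest.rglob("*")
        if p.suffix.lower() in {".png", ".jpg", ".jpeg"}
    )
```

The relative links in the table (e.g. `70/dataset.zip`) resolve inside the dataset repository, so the bytes could come from any HTTP client or a local checkout before being handed to a helper like this.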
atom-in-the-universe/bild-ce1cec87-c7ce-45b9-ac08-26195dff5924
2023-10-04T08:52:29.000Z
[ "region:us" ]
atom-in-the-universe
null
null
null
0
0
Entry not found
open-llm-leaderboard/details_Undi95__ReMM-Mistral-13B
2023-10-04T08:45:16.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of Undi95/ReMM-Mistral-13B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Undi95/ReMM-Mistral-13B](https://huggingface.co/Undi95/ReMM-Mistral-13B) on the\ \ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__ReMM-Mistral-13B\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-04T08:43:52.595565](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__ReMM-Mistral-13B/blob/main/results_2023-10-04T08-43-52.595565.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5564046120566463,\n\ \ \"acc_stderr\": 0.03450414630343969,\n \"acc_norm\": 0.560263985496091,\n\ \ \"acc_norm_stderr\": 0.03448191613915977,\n \"mc1\": 0.37454100367197063,\n\ \ \"mc1_stderr\": 0.016943535128405324,\n \"mc2\": 0.5331836105073876,\n\ \ \"mc2_stderr\": 0.015629704316856213\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5895904436860068,\n \"acc_stderr\": 0.014374922192642664,\n\ \ \"acc_norm\": 0.6220136518771331,\n \"acc_norm_stderr\": 0.0141696645203031\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.642899820752838,\n\ \ \"acc_stderr\": 0.004781654610857137,\n \"acc_norm\": 0.8381796454889464,\n\ \ \"acc_norm_stderr\": 0.0036753325906810734\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \ \ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\ \ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\ \ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5328947368421053,\n \"acc_stderr\": 0.040601270352363966,\n\ \ \"acc_norm\": 0.5328947368421053,\n \"acc_norm_stderr\": 0.040601270352363966\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\ \ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \ \ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.5660377358490566,\n \"acc_stderr\": 0.030503292013342592,\n\ \ \"acc_norm\": 0.5660377358490566,\n \"acc_norm_stderr\": 0.030503292013342592\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5972222222222222,\n\ \ \"acc_stderr\": 0.04101405519842426,\n \"acc_norm\": 0.5972222222222222,\n\ \ \"acc_norm_stderr\": 0.04101405519842426\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \ \ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\ \ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5202312138728323,\n\ \ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.5202312138728323,\n\ \ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n\ \ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\ \ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.49361702127659574,\n \"acc_stderr\": 0.032683358999363366,\n\ \ \"acc_norm\": 0.49361702127659574,\n \"acc_norm_stderr\": 0.032683358999363366\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\ \ \"acc_stderr\": 0.04372748290278007,\n \"acc_norm\": 0.3157894736842105,\n\ \ \"acc_norm_stderr\": 0.04372748290278007\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.04166567577101579,\n\ \ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.04166567577101579\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.3439153439153439,\n \"acc_stderr\": 0.024464426625596433,\n \"\ acc_norm\": 0.3439153439153439,\n 
\"acc_norm_stderr\": 0.024464426625596433\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n\ \ \"acc_stderr\": 0.04104947269903394,\n \"acc_norm\": 0.30158730158730157,\n\ \ \"acc_norm_stderr\": 0.04104947269903394\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \ \ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6419354838709678,\n\ \ \"acc_stderr\": 0.027273890594300645,\n \"acc_norm\": 0.6419354838709678,\n\ \ \"acc_norm_stderr\": 0.027273890594300645\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.03465304488406795,\n\ \ \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.03465304488406795\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\"\ : 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.03663974994391244,\n\ \ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.03663974994391244\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.6919191919191919,\n \"acc_stderr\": 0.032894773300986155,\n \"\ acc_norm\": 0.6919191919191919,\n \"acc_norm_stderr\": 0.032894773300986155\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.7772020725388601,\n \"acc_stderr\": 0.030031147977641538,\n\ \ \"acc_norm\": 0.7772020725388601,\n \"acc_norm_stderr\": 0.030031147977641538\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.5128205128205128,\n \"acc_stderr\": 0.025342671293807257,\n\ \ \"acc_norm\": 0.5128205128205128,\n \"acc_norm_stderr\": 0.025342671293807257\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524575,\n \ \ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524575\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.592436974789916,\n \"acc_stderr\": 0.03191863374478464,\n \ \ \"acc_norm\": 0.592436974789916,\n \"acc_norm_stderr\": 0.03191863374478464\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\ acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7229357798165138,\n \"acc_stderr\": 0.01918848259016953,\n \"\ acc_norm\": 0.7229357798165138,\n \"acc_norm_stderr\": 0.01918848259016953\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.37037037037037035,\n \"acc_stderr\": 0.03293377139415191,\n \"\ acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.03293377139415191\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591362,\n \"\ acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591362\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \ \ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\ \ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\ \ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\ \ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7107438016528925,\n \"acc_stderr\": 0.04139112727635463,\n \"\ acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.04139112727635463\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\ \ \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n\ \ \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.036803503712864616,\n\ \ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.036803503712864616\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\ \ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\ \ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280042,\n\ \ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280042\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7948717948717948,\n\ \ \"acc_stderr\": 0.02645350805404033,\n \"acc_norm\": 0.7948717948717948,\n\ \ \"acc_norm_stderr\": 0.02645350805404033\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \ \ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7611749680715197,\n\ \ \"acc_stderr\": 0.015246803197398679,\n \"acc_norm\": 0.7611749680715197,\n\ \ \"acc_norm_stderr\": 0.015246803197398679\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.02607431485165708,\n\ \ \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.02607431485165708\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39329608938547483,\n\ \ \"acc_stderr\": 0.016337268694270105,\n 
\"acc_norm\": 0.39329608938547483,\n\ \ \"acc_norm_stderr\": 0.016337268694270105\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.027826109307283697,\n\ \ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.027826109307283697\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6366559485530546,\n\ \ \"acc_stderr\": 0.02731684767419271,\n \"acc_norm\": 0.6366559485530546,\n\ \ \"acc_norm_stderr\": 0.02731684767419271\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6172839506172839,\n \"acc_stderr\": 0.02704453813840261,\n\ \ \"acc_norm\": 0.6172839506172839,\n \"acc_norm_stderr\": 0.02704453813840261\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370593,\n \ \ \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370593\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42242503259452413,\n\ \ \"acc_stderr\": 0.012615600475734921,\n \"acc_norm\": 0.42242503259452413,\n\ \ \"acc_norm_stderr\": 0.012615600475734921\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.030352303395351964,\n\ \ \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.030352303395351964\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.5669934640522876,\n \"acc_stderr\": 0.020045442473324224,\n \ \ \"acc_norm\": 0.5669934640522876,\n \"acc_norm_stderr\": 0.020045442473324224\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\ \ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\ \ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6204081632653061,\n \"acc_stderr\": 0.031067211262872464,\n\ \ \"acc_norm\": 0.6204081632653061,\n \"acc_norm_stderr\": 
0.031067211262872464\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6666666666666666,\n\ \ \"acc_stderr\": 0.03333333333333335,\n \"acc_norm\": 0.6666666666666666,\n\ \ \"acc_norm_stderr\": 0.03333333333333335\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \ \ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\ \ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\ \ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\ \ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37454100367197063,\n\ \ \"mc1_stderr\": 0.016943535128405324,\n \"mc2\": 0.5331836105073876,\n\ \ \"mc2_stderr\": 0.015629704316856213\n }\n}\n```" repo_url: https://huggingface.co/Undi95/ReMM-Mistral-13B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|arc:challenge|25_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hellaswag|10_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T08-43-52.595565.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T08-43-52.595565.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T08-43-52.595565.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T08-43-52.595565.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T08-43-52.595565.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T08-43-52.595565.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T08-43-52.595565.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T08-43-52.595565.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T08-43-52.595565.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T08_43_52.595565 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T08-43-52.595565.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T08-43-52.595565.parquet' - config_name: results data_files: - split: 2023_10_04T08_43_52.595565 path: - results_2023-10-04T08-43-52.595565.parquet - split: latest path: - results_2023-10-04T08-43-52.595565.parquet --- # Dataset Card for Evaluation run of Undi95/ReMM-Mistral-13B ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Undi95/ReMM-Mistral-13B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[Undi95/ReMM-Mistral-13B](https://huggingface.co/Undi95/ReMM-Mistral-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__ReMM-Mistral-13B",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-04T08:43:52.595565](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__ReMM-Mistral-13B/blob/main/results_2023-10-04T08-43-52.595565.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5564046120566463, "acc_stderr": 0.03450414630343969, "acc_norm": 0.560263985496091, "acc_norm_stderr": 0.03448191613915977, "mc1": 0.37454100367197063, "mc1_stderr": 0.016943535128405324, "mc2": 0.5331836105073876, "mc2_stderr": 0.015629704316856213 }, "harness|arc:challenge|25": { "acc": 0.5895904436860068, "acc_stderr": 0.014374922192642664, "acc_norm": 0.6220136518771331, "acc_norm_stderr": 0.0141696645203031 }, "harness|hellaswag|10": { "acc": 0.642899820752838, "acc_stderr": 0.004781654610857137, "acc_norm": 0.8381796454889464, "acc_norm_stderr": 0.0036753325906810734 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4962962962962963, "acc_stderr": 0.04319223625811331, "acc_norm": 0.4962962962962963, "acc_norm_stderr": 0.04319223625811331 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5328947368421053, "acc_stderr": 0.040601270352363966, "acc_norm": 0.5328947368421053, "acc_norm_stderr": 0.040601270352363966 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5660377358490566, "acc_stderr": 0.030503292013342592, "acc_norm": 0.5660377358490566, "acc_norm_stderr": 0.030503292013342592 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5972222222222222, "acc_stderr": 0.04101405519842426, "acc_norm": 0.5972222222222222, "acc_norm_stderr": 0.04101405519842426 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.33, "acc_stderr": 0.04725815626252604, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252604 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.46, "acc_stderr": 0.05009082659620332, "acc_norm": 0.46, "acc_norm_stderr": 
0.05009082659620332 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5202312138728323, "acc_stderr": 0.03809342081273957, "acc_norm": 0.5202312138728323, "acc_norm_stderr": 0.03809342081273957 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.2549019607843137, "acc_stderr": 0.043364327079931785, "acc_norm": 0.2549019607843137, "acc_norm_stderr": 0.043364327079931785 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.49361702127659574, "acc_stderr": 0.032683358999363366, "acc_norm": 0.49361702127659574, "acc_norm_stderr": 0.032683358999363366 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.3157894736842105, "acc_stderr": 0.04372748290278007, "acc_norm": 0.3157894736842105, "acc_norm_stderr": 0.04372748290278007 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.496551724137931, "acc_stderr": 0.04166567577101579, "acc_norm": 0.496551724137931, "acc_norm_stderr": 0.04166567577101579 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3439153439153439, "acc_stderr": 0.024464426625596433, "acc_norm": 0.3439153439153439, "acc_norm_stderr": 0.024464426625596433 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.30158730158730157, "acc_stderr": 0.04104947269903394, "acc_norm": 0.30158730158730157, "acc_norm_stderr": 0.04104947269903394 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6419354838709678, "acc_stderr": 0.027273890594300645, "acc_norm": 0.6419354838709678, "acc_norm_stderr": 0.027273890594300645 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.41379310344827586, "acc_stderr": 0.03465304488406795, "acc_norm": 0.41379310344827586, "acc_norm_stderr": 0.03465304488406795 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.55, "acc_stderr": 0.04999999999999999, "acc_norm": 0.55, "acc_norm_stderr": 0.04999999999999999 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6727272727272727, "acc_stderr": 0.03663974994391244, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.03663974994391244 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6919191919191919, "acc_stderr": 0.032894773300986155, "acc_norm": 0.6919191919191919, "acc_norm_stderr": 0.032894773300986155 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7772020725388601, "acc_stderr": 0.030031147977641538, "acc_norm": 0.7772020725388601, "acc_norm_stderr": 0.030031147977641538 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5128205128205128, "acc_stderr": 0.025342671293807257, "acc_norm": 0.5128205128205128, "acc_norm_stderr": 0.025342671293807257 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3296296296296296, "acc_stderr": 0.028661201116524575, "acc_norm": 0.3296296296296296, "acc_norm_stderr": 0.028661201116524575 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.592436974789916, "acc_stderr": 0.03191863374478464, "acc_norm": 0.592436974789916, "acc_norm_stderr": 0.03191863374478464 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.31125827814569534, "acc_stderr": 0.03780445850526733, "acc_norm": 0.31125827814569534, "acc_norm_stderr": 0.03780445850526733 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7229357798165138, "acc_stderr": 0.01918848259016953, "acc_norm": 0.7229357798165138, "acc_norm_stderr": 0.01918848259016953 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.37037037037037035, "acc_stderr": 
0.03293377139415191, "acc_norm": 0.37037037037037035, "acc_norm_stderr": 0.03293377139415191 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7598039215686274, "acc_stderr": 0.02998373305591362, "acc_norm": 0.7598039215686274, "acc_norm_stderr": 0.02998373305591362 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7552742616033755, "acc_stderr": 0.027985699387036423, "acc_norm": 0.7552742616033755, "acc_norm_stderr": 0.027985699387036423 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6860986547085202, "acc_stderr": 0.031146796482972465, "acc_norm": 0.6860986547085202, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.648854961832061, "acc_stderr": 0.04186445163013751, "acc_norm": 0.648854961832061, "acc_norm_stderr": 0.04186445163013751 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7107438016528925, "acc_stderr": 0.04139112727635463, "acc_norm": 0.7107438016528925, "acc_norm_stderr": 0.04139112727635463 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7222222222222222, "acc_stderr": 0.043300437496507416, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.043300437496507416 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6748466257668712, "acc_stderr": 0.036803503712864616, "acc_norm": 0.6748466257668712, "acc_norm_stderr": 0.036803503712864616 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.39285714285714285, "acc_stderr": 0.04635550135609976, "acc_norm": 0.39285714285714285, "acc_norm_stderr": 0.04635550135609976 }, "harness|hendrycksTest-management|5": { "acc": 0.6796116504854369, "acc_stderr": 0.04620284082280042, "acc_norm": 0.6796116504854369, "acc_norm_stderr": 0.04620284082280042 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7948717948717948, "acc_stderr": 0.02645350805404033, "acc_norm": 0.7948717948717948, "acc_norm_stderr": 0.02645350805404033 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.57, "acc_stderr": 
0.049756985195624284, "acc_norm": 0.57, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7611749680715197, "acc_stderr": 0.015246803197398679, "acc_norm": 0.7611749680715197, "acc_norm_stderr": 0.015246803197398679 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6242774566473989, "acc_stderr": 0.02607431485165708, "acc_norm": 0.6242774566473989, "acc_norm_stderr": 0.02607431485165708 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.39329608938547483, "acc_stderr": 0.016337268694270105, "acc_norm": 0.39329608938547483, "acc_norm_stderr": 0.016337268694270105 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6176470588235294, "acc_stderr": 0.027826109307283697, "acc_norm": 0.6176470588235294, "acc_norm_stderr": 0.027826109307283697 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6366559485530546, "acc_stderr": 0.02731684767419271, "acc_norm": 0.6366559485530546, "acc_norm_stderr": 0.02731684767419271 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6172839506172839, "acc_stderr": 0.02704453813840261, "acc_norm": 0.6172839506172839, "acc_norm_stderr": 0.02704453813840261 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4219858156028369, "acc_stderr": 0.029462189233370593, "acc_norm": 0.4219858156028369, "acc_norm_stderr": 0.029462189233370593 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.42242503259452413, "acc_stderr": 0.012615600475734921, "acc_norm": 0.42242503259452413, "acc_norm_stderr": 0.012615600475734921 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5183823529411765, "acc_stderr": 0.030352303395351964, "acc_norm": 0.5183823529411765, "acc_norm_stderr": 0.030352303395351964 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5669934640522876, "acc_stderr": 0.020045442473324224, "acc_norm": 0.5669934640522876, "acc_norm_stderr": 0.020045442473324224 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6909090909090909, 
"acc_stderr": 0.044262946482000985, "acc_norm": 0.6909090909090909, "acc_norm_stderr": 0.044262946482000985 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6204081632653061, "acc_stderr": 0.031067211262872464, "acc_norm": 0.6204081632653061, "acc_norm_stderr": 0.031067211262872464 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6666666666666666, "acc_stderr": 0.03333333333333335, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.03333333333333335 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.83, "acc_stderr": 0.03775251680686371, "acc_norm": 0.83, "acc_norm_stderr": 0.03775251680686371 }, "harness|hendrycksTest-virology|5": { "acc": 0.4457831325301205, "acc_stderr": 0.03869543323472101, "acc_norm": 0.4457831325301205, "acc_norm_stderr": 0.03869543323472101 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7602339181286549, "acc_stderr": 0.03274485211946956, "acc_norm": 0.7602339181286549, "acc_norm_stderr": 0.03274485211946956 }, "harness|truthfulqa:mc|0": { "mc1": 0.37454100367197063, "mc1_stderr": 0.016943535128405324, "mc2": 0.5331836105073876, "mc2_stderr": 0.015629704316856213 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
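A note on naming: the per-run split names (e.g. `2023_10_04T08_43_52.595565`) and the parquet file suffixes (e.g. `2023-10-04T08-43-52.595565`) used throughout the configurations above are both derived from the run timestamp. A minimal sketch of that convention — the helper functions are illustrative only, not part of the `datasets` library or the leaderboard tooling:

```python
# Run timestamp as reported in the "Latest results" section.
RUN_TIMESTAMP = "2023-10-04T08:43:52.595565"

def split_name(ts: str) -> str:
    """Split name used in the configs: dashes and colons become underscores."""
    return ts.replace("-", "_").replace(":", "_")

def file_suffix(ts: str) -> str:
    """Suffix used in parquet file names: colons become dashes."""
    return ts.replace(":", "-")

print(split_name(RUN_TIMESTAMP))   # 2023_10_04T08_43_52.595565
print(file_suffix(RUN_TIMESTAMP))  # 2023-10-04T08-43-52.595565
```

The "latest" split is simply an alias that points at the files of the most recent run.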