| id | lastModified | tags | author | description | citation | cardData | likes | downloads | card |
|---|---|---|---|---|---|---|---|---|---|
open-llm-leaderboard/details_Aspik101__llama-30b-instruct-2048-PL-lora | 2023-08-27T12:45:26.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Aspik101/llama-30b-instruct-2048-PL-lora
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Aspik101/llama-30b-instruct-2048-PL-lora](https://huggingface.co/Aspik101/llama-30b-instruct-2048-PL-lora)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aspik101__llama-30b-instruct-2048-PL-lora\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-22T12:31:39.765804](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__llama-30b-instruct-2048-PL-lora/blob/main/results_2023-08-22T12%3A31%3A39.765804.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.617198424020352,\n\
\ \"acc_stderr\": 0.03351274322059187,\n \"acc_norm\": 0.6211322922406515,\n\
\ \"acc_norm_stderr\": 0.0334887823497915,\n \"mc1\": 0.3671970624235006,\n\
\ \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5331763925042894,\n\
\ \"mc2_stderr\": 0.015241344362857515\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6023890784982935,\n \"acc_stderr\": 0.01430175222327954,\n\
\ \"acc_norm\": 0.636518771331058,\n \"acc_norm_stderr\": 0.014056207319068285\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6486755626369249,\n\
\ \"acc_stderr\": 0.0047640845971768965,\n \"acc_norm\": 0.8466440948018323,\n\
\ \"acc_norm_stderr\": 0.003595938124166228\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411021,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411021\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.028985455652334385,\n\
\ \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.028985455652334385\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5606936416184971,\n\
\ \"acc_stderr\": 0.037842719328874674,\n \"acc_norm\": 0.5606936416184971,\n\
\ \"acc_norm_stderr\": 0.037842719328874674\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099834,\n\
\ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099834\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n\
\ \"acc_stderr\": 0.046306532033665956,\n \"acc_norm\": 0.41228070175438597,\n\
\ \"acc_norm_stderr\": 0.046306532033665956\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728762,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728762\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851095,\n \"\
acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.025138091388851095\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n\
\ \"acc_stderr\": 0.02413763242933771,\n \"acc_norm\": 0.7645161290322581,\n\
\ \"acc_norm_stderr\": 0.02413763242933771\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.03499113137676744,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.03499113137676744\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8232323232323232,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.8232323232323232,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644244,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644244\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6051282051282051,\n \"acc_stderr\": 0.02478431694215639,\n \
\ \"acc_norm\": 0.6051282051282051,\n \"acc_norm_stderr\": 0.02478431694215639\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8128440366972477,\n \"acc_stderr\": 0.016722684526200148,\n \"\
acc_norm\": 0.8128440366972477,\n \"acc_norm_stderr\": 0.016722684526200148\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8227848101265823,\n \"acc_stderr\": 0.02485636418450322,\n \
\ \"acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.02485636418450322\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.030636591348699824,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.030636591348699824\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n\
\ \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935574,\n\
\ \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935574\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.0376017800602662,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.0376017800602662\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7931034482758621,\n\
\ \"acc_stderr\": 0.01448565604166918,\n \"acc_norm\": 0.7931034482758621,\n\
\ \"acc_norm_stderr\": 0.01448565604166918\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.024883140570071762,\n\
\ \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.024883140570071762\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4446927374301676,\n\
\ \"acc_stderr\": 0.01661988198817702,\n \"acc_norm\": 0.4446927374301676,\n\
\ \"acc_norm_stderr\": 0.01661988198817702\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.027121956071388856,\n\
\ \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.027121956071388856\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.662379421221865,\n\
\ \"acc_stderr\": 0.026858825879488533,\n \"acc_norm\": 0.662379421221865,\n\
\ \"acc_norm_stderr\": 0.026858825879488533\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035454,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035454\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.02975238965742705,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.02975238965742705\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44654498044328556,\n\
\ \"acc_stderr\": 0.012697046024399682,\n \"acc_norm\": 0.44654498044328556,\n\
\ \"acc_norm_stderr\": 0.012697046024399682\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5845588235294118,\n \"acc_stderr\": 0.029935342707877746,\n\
\ \"acc_norm\": 0.5845588235294118,\n \"acc_norm_stderr\": 0.029935342707877746\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6470588235294118,\n \"acc_stderr\": 0.01933314202079716,\n \
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.01933314202079716\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n\
\ \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n\
\ \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774707,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774707\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3671970624235006,\n\
\ \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5331763925042894,\n\
\ \"mc2_stderr\": 0.015241344362857515\n }\n}\n```"
repo_url: https://huggingface.co/Aspik101/llama-30b-instruct-2048-PL-lora
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|arc:challenge|25_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hellaswag|10_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T12:31:39.765804.parquet'
---
# Dataset Card for Evaluation run of Aspik101/llama-30b-instruct-2048-PL-lora
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Aspik101/llama-30b-instruct-2048-PL-lora
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Aspik101/llama-30b-instruct-2048-PL-lora](https://huggingface.co/Aspik101/llama-30b-instruct-2048-PL-lora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Aspik101__llama-30b-instruct-2048-PL-lora",
"harness_truthfulqa_mc_0",
split="train")
```
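Each timestamped split name is derived from the run timestamp that appears in the result file names; judging from the config listing above, the convention seems to be replacing the characters that are not valid in split names with underscores. A small helper sketching that mapping (an assumption inferred from the listing, not part of the official tooling):

```python
def timestamp_to_split_name(timestamp: str) -> str:
    """Convert a run timestamp such as '2023-08-22T12:31:39.765804'
    into the split-name form used in the configs above, with hyphens
    and colons replaced by underscores."""
    return timestamp.replace("-", "_").replace(":", "_")

# For example, the run above maps to its timestamped split name:
print(timestamp_to_split_name("2023-08-22T12:31:39.765804"))
# -> 2023_08_22T12_31_39.765804
```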
## Latest results
These are the [latest results from run 2023-08-22T12:31:39.765804](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__llama-30b-instruct-2048-PL-lora/blob/main/results_2023-08-22T12%3A31%3A39.765804.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.617198424020352,
"acc_stderr": 0.03351274322059187,
"acc_norm": 0.6211322922406515,
"acc_norm_stderr": 0.0334887823497915,
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.5331763925042894,
"mc2_stderr": 0.015241344362857515
},
"harness|arc:challenge|25": {
"acc": 0.6023890784982935,
"acc_stderr": 0.01430175222327954,
"acc_norm": 0.636518771331058,
"acc_norm_stderr": 0.014056207319068285
},
"harness|hellaswag|10": {
"acc": 0.6486755626369249,
"acc_stderr": 0.0047640845971768965,
"acc_norm": 0.8466440948018323,
"acc_norm_stderr": 0.003595938124166228
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411021,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411021
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6679245283018868,
"acc_stderr": 0.028985455652334385,
"acc_norm": 0.6679245283018868,
"acc_norm_stderr": 0.028985455652334385
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5606936416184971,
"acc_stderr": 0.037842719328874674,
"acc_norm": 0.5606936416184971,
"acc_norm_stderr": 0.037842719328874674
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099834,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099834
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.046306532033665956,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.046306532033665956
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728762,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728762
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.025138091388851095,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.025138091388851095
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.03499113137676744,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.03499113137676744
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8232323232323232,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.8232323232323232,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644244,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644244
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6051282051282051,
"acc_stderr": 0.02478431694215639,
"acc_norm": 0.6051282051282051,
"acc_norm_stderr": 0.02478431694215639
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8128440366972477,
"acc_stderr": 0.016722684526200148,
"acc_norm": 0.8128440366972477,
"acc_norm_stderr": 0.016722684526200148
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8227848101265823,
"acc_stderr": 0.02485636418450322,
"acc_norm": 0.8227848101265823,
"acc_norm_stderr": 0.02485636418450322
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699824,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699824
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.040103589424622034,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.040103589424622034
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935574,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935574
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.0376017800602662,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.0376017800602662
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7931034482758621,
"acc_stderr": 0.01448565604166918,
"acc_norm": 0.7931034482758621,
"acc_norm_stderr": 0.01448565604166918
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.024883140570071762,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.024883140570071762
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4446927374301676,
"acc_stderr": 0.01661988198817702,
"acc_norm": 0.4446927374301676,
"acc_norm_stderr": 0.01661988198817702
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.027121956071388856,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.027121956071388856
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.662379421221865,
"acc_stderr": 0.026858825879488533,
"acc_norm": 0.662379421221865,
"acc_norm_stderr": 0.026858825879488533
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035454,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035454
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.02975238965742705,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.02975238965742705
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44654498044328556,
"acc_stderr": 0.012697046024399682,
"acc_norm": 0.44654498044328556,
"acc_norm_stderr": 0.012697046024399682
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5845588235294118,
"acc_stderr": 0.029935342707877746,
"acc_norm": 0.5845588235294118,
"acc_norm_stderr": 0.029935342707877746
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.01933314202079716,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.01933314202079716
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774707,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774707
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.038879718495972646,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.038879718495972646
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.5331763925042894,
"mc2_stderr": 0.015241344362857515
}
}
```
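Once parsed (e.g. with `json.loads`), the per-task entries above can be inspected programmatically; for instance, ranking sub-tasks by accuracy. A minimal sketch using a small excerpt of the numbers shown above:

```python
# A small excerpt of the per-task results shown above, keyed by task name.
results = {
    "harness|hendrycksTest-us_foreign_policy|5": {"acc": 0.84},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8011695906432749},
    "harness|hendrycksTest-virology|5": {"acc": 0.4759036144578313},
}

# Rank tasks from highest to lowest accuracy.
ranked = sorted(results.items(), key=lambda kv: kv[1]["acc"], reverse=True)
for task, scores in ranked:
    print(f"{task}: acc={scores['acc']:.3f}")
```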
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_WhoTookMyAmogusNickname__NewHope_HF_not_official | 2023-09-17T06:38:11.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of WhoTookMyAmogusNickname/NewHope_HF_not_official
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [WhoTookMyAmogusNickname/NewHope_HF_not_official](https://huggingface.co/WhoTookMyAmogusNickname/NewHope_HF_not_official)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_WhoTookMyAmogusNickname__NewHope_HF_not_official\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T06:38:00.301208](https://huggingface.co/datasets/open-llm-leaderboard/details_WhoTookMyAmogusNickname__NewHope_HF_not_official/blob/main/results_2023-09-17T06-38-00.301208.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.19693791946308725,\n\
\ \"em_stderr\": 0.004072666833657848,\n \"f1\": 0.2666285654362424,\n\
\ \"f1_stderr\": 0.004068431318455121,\n \"acc\": 0.4541280286361735,\n\
\ \"acc_stderr\": 0.011115742216344062\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.19693791946308725,\n \"em_stderr\": 0.004072666833657848,\n\
\ \"f1\": 0.2666285654362424,\n \"f1_stderr\": 0.004068431318455121\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.15845337376800606,\n \
\ \"acc_stderr\": 0.010058474790238971\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.749802683504341,\n \"acc_stderr\": 0.012173009642449151\n\
\ }\n}\n```"
repo_url: https://huggingface.co/WhoTookMyAmogusNickname/NewHope_HF_not_official
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|arc:challenge|25_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T06_38_00.301208
path:
- '**/details_harness|drop|3_2023-09-17T06-38-00.301208.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T06-38-00.301208.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T06_38_00.301208
path:
- '**/details_harness|gsm8k|5_2023-09-17T06-38-00.301208.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T06-38-00.301208.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hellaswag|10_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T14:04:45.383046.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_22T14_04_45.383046
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T14:04:45.383046.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T14:04:45.383046.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T06_38_00.301208
path:
- '**/details_harness|winogrande|5_2023-09-17T06-38-00.301208.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T06-38-00.301208.parquet'
- config_name: results
data_files:
- split: 2023_09_17T06_38_00.301208
path:
- results_2023-09-17T06-38-00.301208.parquet
- split: latest
path:
- results_2023-09-17T06-38-00.301208.parquet
---
# Dataset Card for Evaluation run of WhoTookMyAmogusNickname/NewHope_HF_not_official
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/WhoTookMyAmogusNickname/NewHope_HF_not_official
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [WhoTookMyAmogusNickname/NewHope_HF_not_official](https://huggingface.co/WhoTookMyAmogusNickname/NewHope_HF_not_official) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_WhoTookMyAmogusNickname__NewHope_HF_not_official",
"harness_winogrande_5",
split="train")
```
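Each run is also exposed as its own split, named after the run timestamp with characters that are invalid in split names replaced by underscores. A minimal sketch of that mapping (the helper name is ours, and it assumes the convention visible in this card's `configs` section, where the split `2023_09_17T06_38_00.301208` corresponds to the file timestamp `2023-09-17T06-38-00.301208`):

```python
def split_name_to_file_timestamp(split_name: str) -> str:
    """Map a run's split name back to the timestamp used in its parquet filename.

    Assumes the dashed-filename convention of the winogrande/results files in
    this card; some older files in the repo keep ":" in the timestamp instead.
    """
    return split_name.replace("_", "-")

print(split_name_to_file_timestamp("2023_09_17T06_38_00.301208"))
# 2023-09-17T06-38-00.301208
```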
## Latest results
These are the [latest results from run 2023-09-17T06:38:00.301208](https://huggingface.co/datasets/open-llm-leaderboard/details_WhoTookMyAmogusNickname__NewHope_HF_not_official/blob/main/results_2023-09-17T06-38-00.301208.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the corresponding "latest" split):
```python
{
"all": {
"em": 0.19693791946308725,
"em_stderr": 0.004072666833657848,
"f1": 0.2666285654362424,
"f1_stderr": 0.004068431318455121,
"acc": 0.4541280286361735,
"acc_stderr": 0.011115742216344062
},
"harness|drop|3": {
"em": 0.19693791946308725,
"em_stderr": 0.004072666833657848,
"f1": 0.2666285654362424,
"f1_stderr": 0.004068431318455121
},
"harness|gsm8k|5": {
"acc": 0.15845337376800606,
"acc_stderr": 0.010058474790238971
},
"harness|winogrande|5": {
"acc": 0.749802683504341,
"acc_stderr": 0.012173009642449151
}
}
```
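The top-level `"all"` entry aggregates the per-task metrics; for the accuracy-based tasks above it is simply their mean. A quick check using the values from the payload (a sketch, not leaderboard code):

```python
# Reproduce the aggregate "acc" from the per-task accuracies shown above.
per_task_acc = {
    "harness|gsm8k|5": 0.15845337376800606,
    "harness|winogrande|5": 0.749802683504341,
}
mean_acc = sum(per_task_acc.values()) / len(per_task_acc)
print(mean_acc)  # ≈ 0.4541280286361735, matching the "all" acc above
```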
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_zarakiquemparte__zaraxe-l2-7b | 2023-09-23T11:25:47.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of zarakiquemparte/zaraxe-l2-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [zarakiquemparte/zaraxe-l2-7b](https://huggingface.co/zarakiquemparte/zaraxe-l2-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_zarakiquemparte__zaraxe-l2-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T11:25:34.979979](https://huggingface.co/datasets/open-llm-leaderboard/details_zarakiquemparte__zaraxe-l2-7b/blob/main/results_2023-09-23T11-25-34.979979.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.19169463087248323,\n\
\ \"em_stderr\": 0.004031181549439802,\n \"f1\": 0.27804110738255156,\n\
\ \"f1_stderr\": 0.0041099263816090316,\n \"acc\": 0.4053108206032529,\n\
\ \"acc_stderr\": 0.00984887759467774\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.19169463087248323,\n \"em_stderr\": 0.004031181549439802,\n\
\ \"f1\": 0.27804110738255156,\n \"f1_stderr\": 0.0041099263816090316\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0758150113722517,\n \
\ \"acc_stderr\": 0.0072912057231626195\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7348066298342542,\n \"acc_stderr\": 0.01240654946619286\n\
\ }\n}\n```"
repo_url: https://huggingface.co/zarakiquemparte/zaraxe-l2-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|arc:challenge|25_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T11_25_34.979979
path:
- '**/details_harness|drop|3_2023-09-23T11-25-34.979979.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T11-25-34.979979.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T11_25_34.979979
path:
- '**/details_harness|gsm8k|5_2023-09-23T11-25-34.979979.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T11-25-34.979979.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hellaswag|10_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T11_25_34.979979
path:
- '**/details_harness|winogrande|5_2023-09-23T11-25-34.979979.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T11-25-34.979979.parquet'
- config_name: results
data_files:
- split: 2023_09_23T11_25_34.979979
path:
- results_2023-09-23T11-25-34.979979.parquet
- split: latest
path:
- results_2023-09-23T11-25-34.979979.parquet
---
# Dataset Card for Evaluation run of zarakiquemparte/zaraxe-l2-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/zarakiquemparte/zaraxe-l2-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [zarakiquemparte/zaraxe-l2-7b](https://huggingface.co/zarakiquemparte/zaraxe-l2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_zarakiquemparte__zaraxe-l2-7b",
"harness_winogrande_5",
split="train")
```
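Config names follow the pattern visible in the YAML header above: `harness_` plus the task name (with `:` and `-` replaced by `_`) plus the few-shot count. As a convenience, the name for a given harness task can be derived with a small helper (a sketch based on that naming pattern; `config_name_for` is a hypothetical helper, not part of the `datasets` API):

```python
def config_name_for(task: str, shots: int) -> str:
    """Build the leaderboard config name for a harness task.

    Mirrors the pattern seen in this card's YAML header, e.g.
    'harness_winogrande_5' or 'harness_truthfulqa_mc_0'.
    """
    # Colons and hyphens in harness task names become underscores in config names.
    return "harness_" + task.replace(":", "_").replace("-", "_") + f"_{shots}"

print(config_name_for("winogrande", 5))                      # harness_winogrande_5
print(config_name_for("truthfulqa:mc", 0))                   # harness_truthfulqa_mc_0
print(config_name_for("hendrycksTest-world_religions", 5))   # harness_hendrycksTest_world_religions_5
```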
## Latest results
These are the [latest results from run 2023-09-23T11:25:34.979979](https://huggingface.co/datasets/open-llm-leaderboard/details_zarakiquemparte__zaraxe-l2-7b/blob/main/results_2023-09-23T11-25-34.979979.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.19169463087248323,
"em_stderr": 0.004031181549439802,
"f1": 0.27804110738255156,
"f1_stderr": 0.0041099263816090316,
"acc": 0.4053108206032529,
"acc_stderr": 0.00984887759467774
},
"harness|drop|3": {
"em": 0.19169463087248323,
"em_stderr": 0.004031181549439802,
"f1": 0.27804110738255156,
"f1_stderr": 0.0041099263816090316
},
"harness|gsm8k|5": {
"acc": 0.0758150113722517,
"acc_stderr": 0.0072912057231626195
},
"harness|winogrande|5": {
"acc": 0.7348066298342542,
"acc_stderr": 0.01240654946619286
}
}
```
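The nested results dict above can be flattened into a simple per-task metric table by dropping the `"all"` aggregate and the `*_stderr` entries. A minimal sketch (the `results` literal is copied from the JSON above):

```python
# Results dict as reported in this card's "Latest results" block.
results = {
    "all": {"em": 0.19169463087248323, "em_stderr": 0.004031181549439802,
            "f1": 0.27804110738255156, "f1_stderr": 0.0041099263816090316,
            "acc": 0.4053108206032529, "acc_stderr": 0.00984887759467774},
    "harness|drop|3": {"em": 0.19169463087248323, "em_stderr": 0.004031181549439802,
                       "f1": 0.27804110738255156, "f1_stderr": 0.0041099263816090316},
    "harness|gsm8k|5": {"acc": 0.0758150113722517, "acc_stderr": 0.0072912057231626195},
    "harness|winogrande|5": {"acc": 0.7348066298342542, "acc_stderr": 0.01240654946619286},
}

# Keep only per-task entries and the point estimates (drop "all" and *_stderr keys).
per_task = {
    task: {k: v for k, v in metrics.items() if not k.endswith("_stderr")}
    for task, metrics in results.items()
    if task != "all"
}
for task, metrics in sorted(per_task.items()):
    print(task, metrics)
```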
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_FelixChao__vicuna-33b-coder | 2023-09-28T18:36:36.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of FelixChao/vicuna-33b-coder
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [FelixChao/vicuna-33b-coder](https://huggingface.co/FelixChao/vicuna-33b-coder)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__vicuna-33b-coder\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-28T18:36:25.051390](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__vicuna-33b-coder/blob/main/results_2023-09-28T18-36-25.051390.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks; you can find each in the results and the \"latest\"\
\ split for each eval):\n\n```python\n{\n    \"all\": {\n        \"em\": 0.0045092281879194635,\n\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0045092281879194635,\n\
\ \"em_stderr\": 0.0006861346899094924,\n \"f1\": 0.08164848993288601,\n\
\ \"f1_stderr\": 0.0016912998086531358,\n \"acc\": 0.4488152932102182,\n\
\ \"acc_stderr\": 0.010539810443125387\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0045092281879194635,\n \"em_stderr\": 0.0006861346899094924,\n\
\ \"f1\": 0.08164848993288601,\n \"f1_stderr\": 0.0016912998086531358\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1288855193328279,\n \
\ \"acc_stderr\": 0.009229580761400265\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7687450670876085,\n \"acc_stderr\": 0.011850040124850508\n\
\ }\n}\n```"
repo_url: https://huggingface.co/FelixChao/vicuna-33b-coder
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|arc:challenge|25_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_28T18_36_25.051390
path:
- '**/details_harness|drop|3_2023-09-28T18-36-25.051390.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-28T18-36-25.051390.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_28T18_36_25.051390
path:
- '**/details_harness|gsm8k|5_2023-09-28T18-36-25.051390.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-28T18-36-25.051390.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hellaswag|10_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T17:16:47.198567.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_22T17_16_47.198567
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T17:16:47.198567.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T17:16:47.198567.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_28T18_36_25.051390
path:
- '**/details_harness|winogrande|5_2023-09-28T18-36-25.051390.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-28T18-36-25.051390.parquet'
- config_name: results
data_files:
- split: 2023_09_28T18_36_25.051390
path:
- results_2023-09-28T18-36-25.051390.parquet
- split: latest
path:
- results_2023-09-28T18-36-25.051390.parquet
---
# Dataset Card for Evaluation run of FelixChao/vicuna-33b-coder
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/FelixChao/vicuna-33b-coder
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [FelixChao/vicuna-33b-coder](https://huggingface.co/FelixChao/vicuna-33b-coder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FelixChao__vicuna-33b-coder",
"harness_winogrande_5",
split="train")
```
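The run splits are named after timestamps with `-` and `:` replaced by `_` (e.g. `2023_09_28T18_36_25.051390`), and `latest` mirrors the most recent of them. A small self-contained sketch of how such split names can be ordered (the helper name `pick_latest_split` is illustrative, not part of any library API):

```python
from datetime import datetime

def pick_latest_split(split_names):
    """Return the most recent timestamp-named split.

    Split names look like '2023_09_28T18_36_25.051390': underscores
    stand in for '-' in the date part and ':' in the time part.
    """
    def parse(name):
        # Restore the ISO form so datetime can parse it.
        date_part, time_part = name.split("T")
        iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
        return datetime.fromisoformat(iso)
    return max(split_names, key=parse)

runs = ["2023_08_22T17_16_47.198567", "2023_09_28T18_36_25.051390"]
print(pick_latest_split(runs))  # -> 2023_09_28T18_36_25.051390
```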
## Latest results
These are the [latest results from run 2023-09-28T18:36:25.051390](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__vicuna-33b-coder/blob/main/results_2023-09-28T18-36-25.051390.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0045092281879194635,
"em_stderr": 0.0006861346899094924,
"f1": 0.08164848993288601,
"f1_stderr": 0.0016912998086531358,
"acc": 0.4488152932102182,
"acc_stderr": 0.010539810443125387
},
"harness|drop|3": {
"em": 0.0045092281879194635,
"em_stderr": 0.0006861346899094924,
"f1": 0.08164848993288601,
"f1_stderr": 0.0016912998086531358
},
"harness|gsm8k|5": {
"acc": 0.1288855193328279,
"acc_stderr": 0.009229580761400265
},
"harness|winogrande|5": {
"acc": 0.7687450670876085,
"acc_stderr": 0.011850040124850508
}
}
```
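The top-level `"all"` block aggregates the per-task metrics; assuming an unweighted mean (which the figures above bear out), its `acc` is the average of the two task-level `acc` values. A minimal sketch reproducing that figure from the numbers above:

```python
import json

# Per-task accuracies copied from the results above.
results_json = """
{
  "harness|gsm8k|5":      {"acc": 0.1288855193328279},
  "harness|winogrande|5": {"acc": 0.7687450670876085}
}
"""

metrics = json.loads(results_json)
# Unweighted mean over every task that reports an 'acc' value.
accs = [task["acc"] for task in metrics.values() if "acc" in task]
mean_acc = sum(accs) / len(accs)
print(mean_acc)  # close to the 0.4488152932102182 reported in the "all" block
```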
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_project-baize__baize-healthcare-lora-7B | 2023-08-27T12:45:34.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of project-baize/baize-healthcare-lora-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [project-baize/baize-healthcare-lora-7B](https://huggingface.co/project-baize/baize-healthcare-lora-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_project-baize__baize-healthcare-lora-7B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-22T17:11:44.232250](https://huggingface.co/datasets/open-llm-leaderboard/details_project-baize__baize-healthcare-lora-7B/blob/main/results_2023-08-22T17%3A11%3A44.232250.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3767145755854847,\n\
\ \"acc_stderr\": 0.03486661252460853,\n \"acc_norm\": 0.3805946637814986,\n\
\ \"acc_norm_stderr\": 0.03485314883251162,\n \"mc1\": 0.2839657282741738,\n\
\ \"mc1_stderr\": 0.01578537085839673,\n \"mc2\": 0.39963016478252117,\n\
\ \"mc2_stderr\": 0.014529646643854385\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5051194539249146,\n \"acc_stderr\": 0.014610624890309157,\n\
\ \"acc_norm\": 0.5409556313993175,\n \"acc_norm_stderr\": 0.014562291073601234\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5800637323242382,\n\
\ \"acc_stderr\": 0.004925394995490126,\n \"acc_norm\": 0.7731527584146585,\n\
\ \"acc_norm_stderr\": 0.004179370978481001\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3925925925925926,\n\
\ \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.3925925925925926,\n\
\ \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3684210526315789,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.3684210526315789,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.44,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4075471698113208,\n \"acc_stderr\": 0.030242233800854494,\n\
\ \"acc_norm\": 0.4075471698113208,\n \"acc_norm_stderr\": 0.030242233800854494\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3402777777777778,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.3402777777777778,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3236994219653179,\n\
\ \"acc_stderr\": 0.035676037996391685,\n \"acc_norm\": 0.3236994219653179,\n\
\ \"acc_norm_stderr\": 0.035676037996391685\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.040233822736177476,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.040233822736177476\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.41,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39148936170212767,\n \"acc_stderr\": 0.031907012423268113,\n\
\ \"acc_norm\": 0.39148936170212767,\n \"acc_norm_stderr\": 0.031907012423268113\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022055,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022055\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.296551724137931,\n \"acc_stderr\": 0.03806142687309994,\n\
\ \"acc_norm\": 0.296551724137931,\n \"acc_norm_stderr\": 0.03806142687309994\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.022789673145776564,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.022789673145776564\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.34838709677419355,\n \"acc_stderr\": 0.027104826328100944,\n \"\
acc_norm\": 0.34838709677419355,\n \"acc_norm_stderr\": 0.027104826328100944\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.28078817733990147,\n \"acc_stderr\": 0.03161856335358609,\n \"\
acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.03161856335358609\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\"\
: 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.03895658065271847,\n\
\ \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.03895658065271847\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.43434343434343436,\n \"acc_stderr\": 0.03531505879359183,\n \"\
acc_norm\": 0.43434343434343436,\n \"acc_norm_stderr\": 0.03531505879359183\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5077720207253886,\n \"acc_stderr\": 0.03608003225569654,\n\
\ \"acc_norm\": 0.5077720207253886,\n \"acc_norm_stderr\": 0.03608003225569654\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.382051282051282,\n \"acc_stderr\": 0.02463554916390823,\n \
\ \"acc_norm\": 0.382051282051282,\n \"acc_norm_stderr\": 0.02463554916390823\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.02708037281514567,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.02708037281514567\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3403361344537815,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.3403361344537815,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.4954128440366973,\n \"acc_stderr\": 0.021436420955529435,\n \"\
acc_norm\": 0.4954128440366973,\n \"acc_norm_stderr\": 0.021436420955529435\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3148148148148148,\n \"acc_stderr\": 0.03167468706828979,\n \"\
acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.03167468706828979\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.46078431372549017,\n \"acc_stderr\": 0.03498501649369527,\n \"\
acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.03498501649369527\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.459915611814346,\n \"acc_stderr\": 0.03244246810187914,\n \
\ \"acc_norm\": 0.459915611814346,\n \"acc_norm_stderr\": 0.03244246810187914\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.38565022421524664,\n\
\ \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.38565022421524664,\n\
\ \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3435114503816794,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.3435114503816794,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.47107438016528924,\n \"acc_stderr\": 0.04556710331269498,\n \"\
acc_norm\": 0.47107438016528924,\n \"acc_norm_stderr\": 0.04556710331269498\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4166666666666667,\n\
\ \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.4166666666666667,\n\
\ \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.38650306748466257,\n \"acc_stderr\": 0.03825825548848608,\n\
\ \"acc_norm\": 0.38650306748466257,\n \"acc_norm_stderr\": 0.03825825548848608\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.04327040932578728,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.04327040932578728\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.39805825242718446,\n \"acc_stderr\": 0.048467482539772386,\n\
\ \"acc_norm\": 0.39805825242718446,\n \"acc_norm_stderr\": 0.048467482539772386\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.47863247863247865,\n\
\ \"acc_stderr\": 0.032726164476349545,\n \"acc_norm\": 0.47863247863247865,\n\
\ \"acc_norm_stderr\": 0.032726164476349545\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-miscellaneous|5\"\
: {\n \"acc\": 0.4367816091954023,\n \"acc_stderr\": 0.01773647083780068,\n\
\ \"acc_norm\": 0.4367816091954023,\n \"acc_norm_stderr\": 0.01773647083780068\n\
\ },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.40173410404624277,\n\
\ \"acc_stderr\": 0.02639410417764363,\n \"acc_norm\": 0.40173410404624277,\n\
\ \"acc_norm_stderr\": 0.02639410417764363\n },\n \"harness|hendrycksTest-moral_scenarios|5\"\
: {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217889,\n\
\ \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217889\n\
\ },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.3954248366013072,\n\
\ \"acc_stderr\": 0.02799672318063143,\n \"acc_norm\": 0.3954248366013072,\n\
\ \"acc_norm_stderr\": 0.02799672318063143\n },\n \"harness|hendrycksTest-philosophy|5\"\
: {\n \"acc\": 0.3858520900321543,\n \"acc_stderr\": 0.02764814959975146,\n\
\ \"acc_norm\": 0.3858520900321543,\n \"acc_norm_stderr\": 0.02764814959975146\n\
\ },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.404320987654321,\n\
\ \"acc_stderr\": 0.027306625297327684,\n \"acc_norm\": 0.404320987654321,\n\
\ \"acc_norm_stderr\": 0.027306625297327684\n },\n \"harness|hendrycksTest-professional_accounting|5\"\
: {\n \"acc\": 0.2978723404255319,\n \"acc_stderr\": 0.027281608344469414,\n\
\ \"acc_norm\": 0.2978723404255319,\n \"acc_norm_stderr\": 0.027281608344469414\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3050847457627119,\n\
\ \"acc_stderr\": 0.011759939618085455,\n \"acc_norm\": 0.3050847457627119,\n\
\ \"acc_norm_stderr\": 0.011759939618085455\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.030161911930767102,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.030161911930767102\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3284313725490196,\n \"acc_stderr\": 0.018999707383162666,\n \
\ \"acc_norm\": 0.3284313725490196,\n \"acc_norm_stderr\": 0.018999707383162666\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3469387755102041,\n \"acc_stderr\": 0.030472526026726496,\n\
\ \"acc_norm\": 0.3469387755102041,\n \"acc_norm_stderr\": 0.030472526026726496\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4975124378109453,\n\
\ \"acc_stderr\": 0.03535490150137289,\n \"acc_norm\": 0.4975124378109453,\n\
\ \"acc_norm_stderr\": 0.03535490150137289\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3253012048192771,\n\
\ \"acc_stderr\": 0.03647168523683228,\n \"acc_norm\": 0.3253012048192771,\n\
\ \"acc_norm_stderr\": 0.03647168523683228\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.038342347441649924,\n\
\ \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.038342347441649924\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2839657282741738,\n\
\ \"mc1_stderr\": 0.01578537085839673,\n \"mc2\": 0.39963016478252117,\n\
\ \"mc2_stderr\": 0.014529646643854385\n }\n}\n```"
repo_url: https://huggingface.co/project-baize/baize-healthcare-lora-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|arc:challenge|25_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hellaswag|10_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T17:11:44.232250.parquet'
---
# Dataset Card for Evaluation run of project-baize/baize-healthcare-lora-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/project-baize/baize-healthcare-lora-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [project-baize/baize-healthcare-lora-7B](https://huggingface.co/project-baize/baize-healthcare-lora-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_project-baize__baize-healthcare-lora-7B",
"harness_truthfulqa_mc_0",
split="train")
```
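Run-specific splits are named after the run timestamp, with `-` and `:` replaced by `_` (for example, run `2023-08-22T17:11:44.232250` becomes split `2023_08_22T17_11_44.232250`). A small helper sketching that convention (illustrative only, not part of the evaluation harness):

```python
def split_name_from_timestamp(iso_ts: str) -> str:
    """Convert a run timestamp such as '2023-08-22T17:11:44.232250'
    into the split-name form used by this dataset's configurations."""
    return iso_ts.replace("-", "_").replace(":", "_")

print(split_name_from_timestamp("2023-08-22T17:11:44.232250"))
# → 2023_08_22T17_11_44.232250
```

You can pass the resulting name as `split=` to `load_dataset` to pin a specific run instead of using the `latest` or `train` splits.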
## Latest results
These are the [latest results from run 2023-08-22T17:11:44.232250](https://huggingface.co/datasets/open-llm-leaderboard/details_project-baize__baize-healthcare-lora-7B/blob/main/results_2023-08-22T17%3A11%3A44.232250.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.3767145755854847,
"acc_stderr": 0.03486661252460853,
"acc_norm": 0.3805946637814986,
"acc_norm_stderr": 0.03485314883251162,
"mc1": 0.2839657282741738,
"mc1_stderr": 0.01578537085839673,
"mc2": 0.39963016478252117,
"mc2_stderr": 0.014529646643854385
},
"harness|arc:challenge|25": {
"acc": 0.5051194539249146,
"acc_stderr": 0.014610624890309157,
"acc_norm": 0.5409556313993175,
"acc_norm_stderr": 0.014562291073601234
},
"harness|hellaswag|10": {
"acc": 0.5800637323242382,
"acc_stderr": 0.004925394995490126,
"acc_norm": 0.7731527584146585,
"acc_norm_stderr": 0.004179370978481001
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3925925925925926,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.3925925925925926,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4075471698113208,
"acc_stderr": 0.030242233800854494,
"acc_norm": 0.4075471698113208,
"acc_norm_stderr": 0.030242233800854494
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3402777777777778,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.3402777777777778,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3236994219653179,
"acc_stderr": 0.035676037996391685,
"acc_norm": 0.3236994219653179,
"acc_norm_stderr": 0.035676037996391685
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.040233822736177476,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.040233822736177476
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39148936170212767,
"acc_stderr": 0.031907012423268113,
"acc_norm": 0.39148936170212767,
"acc_norm_stderr": 0.031907012423268113
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022055,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022055
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.296551724137931,
"acc_stderr": 0.03806142687309994,
"acc_norm": 0.296551724137931,
"acc_norm_stderr": 0.03806142687309994
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.022789673145776564,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.022789673145776564
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.34838709677419355,
"acc_stderr": 0.027104826328100944,
"acc_norm": 0.34838709677419355,
"acc_norm_stderr": 0.027104826328100944
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.03161856335358609,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.03161856335358609
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.03895658065271847,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.03895658065271847
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.43434343434343436,
"acc_stderr": 0.03531505879359183,
"acc_norm": 0.43434343434343436,
"acc_norm_stderr": 0.03531505879359183
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5077720207253886,
"acc_stderr": 0.03608003225569654,
"acc_norm": 0.5077720207253886,
"acc_norm_stderr": 0.03608003225569654
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.382051282051282,
"acc_stderr": 0.02463554916390823,
"acc_norm": 0.382051282051282,
"acc_norm_stderr": 0.02463554916390823
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.02708037281514567,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.02708037281514567
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3403361344537815,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.3403361344537815,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.4954128440366973,
"acc_stderr": 0.021436420955529435,
"acc_norm": 0.4954128440366973,
"acc_norm_stderr": 0.021436420955529435
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.03167468706828979,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.03167468706828979
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.03498501649369527,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.03498501649369527
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.459915611814346,
"acc_stderr": 0.03244246810187914,
"acc_norm": 0.459915611814346,
"acc_norm_stderr": 0.03244246810187914
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.38565022421524664,
"acc_stderr": 0.03266842214289201,
"acc_norm": 0.38565022421524664,
"acc_norm_stderr": 0.03266842214289201
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3435114503816794,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.3435114503816794,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.47107438016528924,
"acc_stderr": 0.04556710331269498,
"acc_norm": 0.47107438016528924,
"acc_norm_stderr": 0.04556710331269498
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.04766075165356461,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.04766075165356461
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.38650306748466257,
"acc_stderr": 0.03825825548848608,
"acc_norm": 0.38650306748466257,
"acc_norm_stderr": 0.03825825548848608
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578728,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578728
},
"harness|hendrycksTest-management|5": {
"acc": 0.39805825242718446,
"acc_stderr": 0.048467482539772386,
"acc_norm": 0.39805825242718446,
"acc_norm_stderr": 0.048467482539772386
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.47863247863247865,
"acc_stderr": 0.032726164476349545,
"acc_norm": 0.47863247863247865,
"acc_norm_stderr": 0.032726164476349545
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4367816091954023,
"acc_stderr": 0.01773647083780068,
"acc_norm": 0.4367816091954023,
"acc_norm_stderr": 0.01773647083780068
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.40173410404624277,
"acc_stderr": 0.02639410417764363,
"acc_norm": 0.40173410404624277,
"acc_norm_stderr": 0.02639410417764363
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3954248366013072,
"acc_stderr": 0.02799672318063143,
"acc_norm": 0.3954248366013072,
"acc_norm_stderr": 0.02799672318063143
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3858520900321543,
"acc_stderr": 0.02764814959975146,
"acc_norm": 0.3858520900321543,
"acc_norm_stderr": 0.02764814959975146
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.404320987654321,
"acc_stderr": 0.027306625297327684,
"acc_norm": 0.404320987654321,
"acc_norm_stderr": 0.027306625297327684
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2978723404255319,
"acc_stderr": 0.027281608344469414,
"acc_norm": 0.2978723404255319,
"acc_norm_stderr": 0.027281608344469414
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3050847457627119,
"acc_stderr": 0.011759939618085455,
"acc_norm": 0.3050847457627119,
"acc_norm_stderr": 0.011759939618085455
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.030161911930767102,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.030161911930767102
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3284313725490196,
"acc_stderr": 0.018999707383162666,
"acc_norm": 0.3284313725490196,
"acc_norm_stderr": 0.018999707383162666
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.4,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3469387755102041,
"acc_stderr": 0.030472526026726496,
"acc_norm": 0.3469387755102041,
"acc_norm_stderr": 0.030472526026726496
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.4975124378109453,
"acc_stderr": 0.03535490150137289,
"acc_norm": 0.4975124378109453,
"acc_norm_stderr": 0.03535490150137289
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3253012048192771,
"acc_stderr": 0.03647168523683228,
"acc_norm": 0.3253012048192771,
"acc_norm_stderr": 0.03647168523683228
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.038342347441649924,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.038342347441649924
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2839657282741738,
"mc1_stderr": 0.01578537085839673,
"mc2": 0.39963016478252117,
"mc2_stderr": 0.014529646643854385
}
}
```
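Each per-task block above shares the same flat layout, with every point estimate paired with a `*_stderr` entry. A minimal offline sketch (using two of the entries shown above rather than downloading the dataset) of separating the point estimates from their standard errors:

```python
import json

# Excerpt of the results JSON above (two tasks, for brevity).
results = json.loads("""
{
  "harness|hendrycksTest-world_religions|5": {
    "acc": 0.5087719298245614,
    "acc_stderr": 0.038342347441649924,
    "acc_norm": 0.5087719298245614,
    "acc_norm_stderr": 0.038342347441649924
  },
  "harness|truthfulqa:mc|0": {
    "mc1": 0.2839657282741738,
    "mc1_stderr": 0.01578537085839673,
    "mc2": 0.39963016478252117,
    "mc2_stderr": 0.014529646643854385
  }
}
""")

# Keep only the metrics that are not standard errors.
point_estimates = {
    task: {k: v for k, v in metrics.items() if not k.endswith("_stderr")}
    for task, metrics in results.items()
}

print(point_estimates["harness|truthfulqa:mc|0"])
```

The same pattern applies to the full results file linked above, since every task entry follows this metric/stderr naming convention.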
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
---
pretty_name: Evaluation run of OpenLemur/lemur-70b-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OpenLemur/lemur-70b-v1](https://huggingface.co/OpenLemur/lemur-70b-v1) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenLemur__lemur-70b-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-18T14:30:20.780139](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenLemur__lemur-70b-v1/blob/main/results_2023-09-18T14-30-20.780139.json)\
\ (note that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0026216442953020135,\n\
\ \"em_stderr\": 0.0005236685642965847,\n \"f1\": 0.057400377516778664,\n\
\ \"f1_stderr\": 0.001295669399059679,\n \"acc\": 0.558823353417031,\n\
\ \"acc_stderr\": 0.011507109853735386\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0026216442953020135,\n \"em_stderr\": 0.0005236685642965847,\n\
\ \"f1\": 0.057400377516778664,\n \"f1_stderr\": 0.001295669399059679\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.287338893100834,\n \
\ \"acc_stderr\": 0.01246467706010708\n },\n \"harness|winogrande|5\": {\n\
\ \"acc\": 0.8303078137332282,\n \"acc_stderr\": 0.010549542647363692\n\
\ }\n}\n```"
repo_url: https://huggingface.co/OpenLemur/lemur-70b-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|arc:challenge|25_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_18T14_30_20.780139
path:
- '**/details_harness|drop|3_2023-09-18T14-30-20.780139.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-18T14-30-20.780139.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_18T14_30_20.780139
path:
- '**/details_harness|gsm8k|5_2023-09-18T14-30-20.780139.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-18T14-30-20.780139.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hellaswag|10_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T09:13:21.689197.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_24T09_13_21.689197
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T09:13:21.689197.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T09:13:21.689197.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_18T14_30_20.780139
path:
- '**/details_harness|winogrande|5_2023-09-18T14-30-20.780139.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-18T14-30-20.780139.parquet'
- config_name: results
data_files:
- split: 2023_09_18T14_30_20.780139
path:
- results_2023-09-18T14-30-20.780139.parquet
- split: latest
path:
- results_2023-09-18T14-30-20.780139.parquet
---
# Dataset Card for Evaluation run of OpenLemur/lemur-70b-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/OpenLemur/lemur-70b-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [OpenLemur/lemur-70b-v1](https://huggingface.co/OpenLemur/lemur-70b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenLemur__lemur-70b-v1",
"harness_winogrande_5",
split="train")
```
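Since run splits are named after their timestamps, picking the most recent one can be done with the standard library alone. This is an illustrative sketch (the helper names below are not part of the `datasets` API); it assumes the split-name format shown above, `YYYY_MM_DDTHH_MM_SS.ffffff`:

```python
from datetime import datetime

def parse_split_timestamp(split_name: str) -> datetime:
    # Split names encode the run timestamp, e.g. "2023_09_18T14_30_20.780139".
    return datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")

def latest_split(split_names) -> str:
    # Ignore the "latest" alias and return the most recent timestamped split.
    stamped = [s for s in split_names if s != "latest"]
    return max(stamped, key=parse_split_timestamp)

print(latest_split(["2023_08_24T09_13_21.689197",
                    "2023_09_18T14_30_20.780139",
                    "latest"]))
```

In practice the "latest" split already points at the newest run, so this is only needed when comparing several runs explicitly.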
## Latest results
These are the [latest results from run 2023-09-18T14:30:20.780139](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenLemur__lemur-70b-v1/blob/main/results_2023-09-18T14-30-20.780139.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0026216442953020135,
"em_stderr": 0.0005236685642965847,
"f1": 0.057400377516778664,
"f1_stderr": 0.001295669399059679,
"acc": 0.558823353417031,
"acc_stderr": 0.011507109853735386
},
"harness|drop|3": {
"em": 0.0026216442953020135,
"em_stderr": 0.0005236685642965847,
"f1": 0.057400377516778664,
"f1_stderr": 0.001295669399059679
},
"harness|gsm8k|5": {
"acc": 0.287338893100834,
"acc_stderr": 0.01246467706010708
},
"harness|winogrande|5": {
"acc": 0.8303078137332282,
"acc_stderr": 0.010549542647363692
}
}
```
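For the accuracy-based tasks in this run, the top-level "all" accuracy is the simple mean of the per-task accuracies (a quick sanity check; the leaderboard's exact aggregation logic is not documented in this card):

```python
import math

# Per-task accuracies copied from the results block above.
task_acc = {
    "harness|gsm8k|5": 0.287338893100834,
    "harness|winogrande|5": 0.8303078137332282,
}

# Unweighted mean over tasks reproduces the reported "all" accuracy.
overall_acc = sum(task_acc.values()) / len(task_acc)
assert math.isclose(overall_acc, 0.558823353417031, rel_tol=1e-12)
print(overall_acc)
```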
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_OpenLemur__lemur-70b-chat-v1 | 2023-09-17T13:31:18.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of OpenLemur/lemur-70b-chat-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OpenLemur/lemur-70b-chat-v1](https://huggingface.co/OpenLemur/lemur-70b-chat-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenLemur__lemur-70b-chat-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T13:31:04.707005](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenLemur__lemur-70b-chat-v1/blob/main/results_2023-09-17T13-31-04.707005.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.006711409395973154,\n\
\ \"em_stderr\": 0.0008361500895152445,\n \"f1\": 0.0739702181208053,\n\
\ \"f1_stderr\": 0.001585201628872726,\n \"acc\": 0.5850941225115532,\n\
\ \"acc_stderr\": 0.01201805791264202\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.006711409395973154,\n \"em_stderr\": 0.0008361500895152445,\n\
\ \"f1\": 0.0739702181208053,\n \"f1_stderr\": 0.001585201628872726\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.35329795299469297,\n \
\ \"acc_stderr\": 0.013166337192115683\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8168902920284136,\n \"acc_stderr\": 0.010869778633168358\n\
\ }\n}\n```"
repo_url: https://huggingface.co/OpenLemur/lemur-70b-chat-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|arc:challenge|25_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T13_31_04.707005
path:
- '**/details_harness|drop|3_2023-09-17T13-31-04.707005.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T13-31-04.707005.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T13_31_04.707005
path:
- '**/details_harness|gsm8k|5_2023-09-17T13-31-04.707005.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T13-31-04.707005.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hellaswag|10_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T04:11:57.870589.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_24T04_11_57.870589
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T04:11:57.870589.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T04:11:57.870589.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T13_31_04.707005
path:
- '**/details_harness|winogrande|5_2023-09-17T13-31-04.707005.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T13-31-04.707005.parquet'
- config_name: results
data_files:
- split: 2023_09_17T13_31_04.707005
path:
- results_2023-09-17T13-31-04.707005.parquet
- split: latest
path:
- results_2023-09-17T13-31-04.707005.parquet
---
# Dataset Card for Evaluation run of OpenLemur/lemur-70b-chat-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/OpenLemur/lemur-70b-chat-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [OpenLemur/lemur-70b-chat-v1](https://huggingface.co/OpenLemur/lemur-70b-chat-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenLemur__lemur-70b-chat-v1",
"harness_winogrande_5",
split="train")
```
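The configuration names above follow a simple pattern derived from the harness task name and the number of few-shot examples (e.g. `harness_truthfulqa_mc_0` for `truthfulqa:mc` at 0-shot). A minimal sketch of that mapping — the `config_name` helper is hypothetical, for illustration only, and not part of the `datasets` API:

```python
def config_name(task: str, num_fewshot: int) -> str:
    # Harness task separators ("-" and ":") are replaced with underscores,
    # then prefixed with "harness_" and suffixed with the shot count, e.g.
    # "hendrycksTest-world_religions" + 5 -> "harness_hendrycksTest_world_religions_5".
    return f"harness_{task.replace('-', '_').replace(':', '_')}_{num_fewshot}"

print(config_name("hendrycksTest-world_religions", 5))
# harness_hendrycksTest_world_religions_5
print(config_name("truthfulqa:mc", 0))
# harness_truthfulqa_mc_0
```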
## Latest results
These are the [latest results from run 2023-09-17T13:31:04.707005](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenLemur__lemur-70b-chat-v1/blob/main/results_2023-09-17T13-31-04.707005.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.006711409395973154,
"em_stderr": 0.0008361500895152445,
"f1": 0.0739702181208053,
"f1_stderr": 0.001585201628872726,
"acc": 0.5850941225115532,
"acc_stderr": 0.01201805791264202
},
"harness|drop|3": {
"em": 0.006711409395973154,
"em_stderr": 0.0008361500895152445,
"f1": 0.0739702181208053,
"f1_stderr": 0.001585201628872726
},
"harness|gsm8k|5": {
"acc": 0.35329795299469297,
"acc_stderr": 0.013166337192115683
},
"harness|winogrande|5": {
"acc": 0.8168902920284136,
"acc_stderr": 0.010869778633168358
}
}
```
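Once loaded, the per-task entries in this results dict can be inspected programmatically. A minimal sketch using the metrics copied from the JSON above (the variable names are illustrative, not part of any API):

```python
# Per-task metrics copied verbatim from the results JSON above.
results = {
    "harness|drop|3": {"em": 0.006711409395973154, "f1": 0.0739702181208053},
    "harness|gsm8k|5": {"acc": 0.35329795299469297},
    "harness|winogrande|5": {"acc": 0.8168902920284136},
}

# Pick the task with the highest accuracy among those that report "acc".
best_task = max(
    (task for task, metrics in results.items() if "acc" in metrics),
    key=lambda task: results[task]["acc"],
)
print(best_task)
# harness|winogrande|5
```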
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_doas__test5 | 2023-09-25T05:43:45.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of doas/test5
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [doas/test5](https://huggingface.co/doas/test5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_doas__test5\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-25T05:43:32.139729](https://huggingface.co/datasets/open-llm-leaderboard/details_doas__test5/blob/main/results_2023-09-25T05-43-32.139729.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0,\n \"\
em_stderr\": 0.0,\n \"f1\": 4.5092281879194636e-05,\n \"f1_stderr\"\
: 2.699913059109046e-05,\n \"acc\": 0.2632202052091555,\n \"acc_stderr\"\
: 0.007016411937203614\n },\n \"harness|drop|3\": {\n \"em\": 0.0,\n\
\ \"em_stderr\": 0.0,\n \"f1\": 4.5092281879194636e-05,\n \"\
f1_stderr\": 2.699913059109046e-05\n },\n \"harness|gsm8k|5\": {\n \
\ \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.526440410418311,\n \"acc_stderr\": 0.014032823874407229\n\
\ }\n}\n```"
repo_url: https://huggingface.co/doas/test5
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|arc:challenge|25_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_25T05_43_32.139729
path:
- '**/details_harness|drop|3_2023-09-25T05-43-32.139729.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-25T05-43-32.139729.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_25T05_43_32.139729
path:
- '**/details_harness|gsm8k|5_2023-09-25T05-43-32.139729.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-25T05-43-32.139729.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hellaswag|10_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T10:59:05.636787.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_23T10_59_05.636787
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T10:59:05.636787.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T10:59:05.636787.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_25T05_43_32.139729
path:
- '**/details_harness|winogrande|5_2023-09-25T05-43-32.139729.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-25T05-43-32.139729.parquet'
- config_name: results
data_files:
- split: 2023_09_25T05_43_32.139729
path:
- results_2023-09-25T05-43-32.139729.parquet
- split: latest
path:
- results_2023-09-25T05-43-32.139729.parquet
---
# Dataset Card for Evaluation run of doas/test5
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/doas/test5
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [doas/test5](https://huggingface.co/doas/test5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_doas__test5",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-25T05:43:32.139729](https://huggingface.co/datasets/open-llm-leaderboard/details_doas__test5/blob/main/results_2023-09-25T05-43-32.139729.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 4.5092281879194636e-05,
"f1_stderr": 2.699913059109046e-05,
"acc": 0.2632202052091555,
"acc_stderr": 0.007016411937203614
},
"harness|drop|3": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 4.5092281879194636e-05,
"f1_stderr": 2.699913059109046e-05
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.526440410418311,
"acc_stderr": 0.014032823874407229
}
}
```
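For this run, the reported aggregate `acc` equals the unweighted mean of the per-task accuracies. A minimal sketch verifying this locally (values copied from the JSON above; the dictionary name is illustrative, and no download is needed):

```python
# Recompute the aggregate "acc" from the per-task accuracies reported above.
per_task_acc = {
    "harness|gsm8k|5": 0.0,
    "harness|winogrande|5": 0.526440410418311,
}

mean_acc = sum(per_task_acc.values()) / len(per_task_acc)
print(mean_acc)  # ~0.2632202052091555, the "all" acc reported above
```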
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_doas__test2 | 2023-09-20T10:22:41.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of doas/test2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [doas/test2](https://huggingface.co/doas/test2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_doas__test2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-20T10:22:29.225839](https://huggingface.co/datasets/open-llm-leaderboard/details_doas__test2/blob/main/results_2023-09-20T10-22-29.225839.json)\
\ (note that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0,\n \"\
em_stderr\": 0.0,\n \"f1\": 2.3070469798657714e-05,\n \"f1_stderr\"\
: 6.792327290354398e-06,\n \"acc\": 0.2505919494869771,\n \"acc_stderr\"\
: 0.0070262231452645095\n },\n \"harness|drop|3\": {\n \"em\": 0.0,\n\
\ \"em_stderr\": 0.0,\n \"f1\": 2.3070469798657714e-05,\n \"\
f1_stderr\": 6.792327290354398e-06\n },\n \"harness|gsm8k|5\": {\n \
\ \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5011838989739542,\n \"acc_stderr\": 0.014052446290529019\n\
\ }\n}\n```"
repo_url: https://huggingface.co/doas/test2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|arc:challenge|25_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_20T10_22_29.225839
path:
- '**/details_harness|drop|3_2023-09-20T10-22-29.225839.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-20T10-22-29.225839.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_20T10_22_29.225839
path:
- '**/details_harness|gsm8k|5_2023-09-20T10-22-29.225839.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-20T10-22-29.225839.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hellaswag|10_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T10:29:17.352956.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_23T10_29_17.352956
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T10:29:17.352956.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T10:29:17.352956.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_20T10_22_29.225839
path:
- '**/details_harness|winogrande|5_2023-09-20T10-22-29.225839.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-20T10-22-29.225839.parquet'
- config_name: results
data_files:
- split: 2023_09_20T10_22_29.225839
path:
- results_2023-09-20T10-22-29.225839.parquet
- split: latest
path:
- results_2023-09-20T10-22-29.225839.parquet
---
# Dataset Card for Evaluation run of doas/test2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/doas/test2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [doas/test2](https://huggingface.co/doas/test2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_doas__test2",
"harness_winogrande_5",
split="train")
```
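The per-task config names listed in this card's metadata follow a mechanical pattern: the harness task name with `|`, `:`, and `-` separators turned into underscores, plus the few-shot count. A small helper (hypothetical, for illustration only) can build them:

```python
def harness_config_name(task: str, num_fewshot: int) -> str:
    """Build the dataset config name for a given harness task.

    Mirrors the naming visible in this card's `configs` list:
    ':' and '-' separators become '_', and the few-shot count
    is appended after a final underscore.
    """
    return "harness_" + task.replace(":", "_").replace("-", "_") + f"_{num_fewshot}"


# The Winogrande 5-shot details used in the example above:
print(harness_config_name("winogrande", 5))
# harness_winogrande_5
print(harness_config_name("hendrycksTest-world_religions", 5))
# harness_hendrycksTest_world_religions_5
print(harness_config_name("truthfulqa:mc", 0))
# harness_truthfulqa_mc_0
```

This makes it easy to iterate over every MMLU subject config without hand-writing 57 names.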
## Latest results
These are the [latest results from run 2023-09-20T10:22:29.225839](https://huggingface.co/datasets/open-llm-leaderboard/details_doas__test2/blob/main/results_2023-09-20T10-22-29.225839.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 2.3070469798657714e-05,
"f1_stderr": 6.792327290354398e-06,
"acc": 0.2505919494869771,
"acc_stderr": 0.0070262231452645095
},
"harness|drop|3": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 2.3070469798657714e-05,
"f1_stderr": 6.792327290354398e-06
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5011838989739542,
"acc_stderr": 0.014052446290529019
}
}
```
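The nested layout above — an `"all"` aggregate plus one block per `harness|task|fewshot` key — can be flattened into per-task rows for tabulation. A minimal sketch, using the exact values printed above:

```python
# The "Latest results" dict reproduced from this card.
results = {
    "all": {"em": 0.0, "em_stderr": 0.0,
            "f1": 2.3070469798657714e-05, "f1_stderr": 6.792327290354398e-06,
            "acc": 0.2505919494869771, "acc_stderr": 0.0070262231452645095},
    "harness|drop|3": {"em": 0.0, "em_stderr": 0.0,
                       "f1": 2.3070469798657714e-05,
                       "f1_stderr": 6.792327290354398e-06},
    "harness|gsm8k|5": {"acc": 0.0, "acc_stderr": 0.0},
    "harness|winogrande|5": {"acc": 0.5011838989739542,
                             "acc_stderr": 0.014052446290529019},
}

# Flatten each non-aggregate entry into (task, fewshot, metric, value) rows.
rows = [
    (key.split("|")[1], int(key.split("|")[2]), metric, value)
    for key, metrics in results.items() if key != "all"
    for metric, value in metrics.items()
]
print(rows[-1])  # ('winogrande', 5, 'acc_stderr', 0.014052446290529019)
```

The same loop works on any run's `results_*.json`, since every per-task key follows the `harness|<task>|<fewshot>` convention.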
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_huashiyiqike__testmodel | 2023-09-17T09:22:33.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of huashiyiqike/testmodel
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [huashiyiqike/testmodel](https://huggingface.co/huashiyiqike/testmodel) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_huashiyiqike__testmodel\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T09:22:21.372556](https://huggingface.co/datasets/open-llm-leaderboard/details_huashiyiqike__testmodel/blob/main/results_2023-09-17T09-22-21.372556.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0008389261744966443,\n\
\ \"em_stderr\": 0.00029649629898012493,\n \"f1\": 0.026885486577181286,\n\
\ \"f1_stderr\": 0.0009984003779091447,\n \"acc\": 0.2509865824782952,\n\
\ \"acc_stderr\": 0.007026188129612818\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0008389261744966443,\n \"em_stderr\": 0.00029649629898012493,\n\
\ \"f1\": 0.026885486577181286,\n \"f1_stderr\": 0.0009984003779091447\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5019731649565904,\n\
\ \"acc_stderr\": 0.014052376259225636\n }\n}\n```"
repo_url: https://huggingface.co/huashiyiqike/testmodel
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|arc:challenge|25_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T09_22_21.372556
path:
- '**/details_harness|drop|3_2023-09-17T09-22-21.372556.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T09-22-21.372556.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T09_22_21.372556
path:
- '**/details_harness|gsm8k|5_2023-09-17T09-22-21.372556.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T09-22-21.372556.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hellaswag|10_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T12:51:27.417850.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_22T12_51_27.417850
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T12:51:27.417850.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T12:51:27.417850.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T09_22_21.372556
path:
- '**/details_harness|winogrande|5_2023-09-17T09-22-21.372556.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T09-22-21.372556.parquet'
- config_name: results
data_files:
- split: 2023_09_17T09_22_21.372556
path:
- results_2023-09-17T09-22-21.372556.parquet
- split: latest
path:
- results_2023-09-17T09-22-21.372556.parquet
---
# Dataset Card for Evaluation run of huashiyiqike/testmodel
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/huashiyiqike/testmodel
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [huashiyiqike/testmodel](https://huggingface.co/huashiyiqike/testmodel) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
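The timestamped split names are derived mechanically from the run timestamp; a minimal sketch of that mapping (the function name is illustrative, and the convention is inferred from the split names in the YAML config above):

```python
def timestamp_to_split(ts: str) -> str:
    # "2023-09-17T09:22:21.372556" -> "2023_09_17T09_22_21.372556",
    # matching the split names listed in the data_files config above.
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-09-17T09:22:21.372556"))
# 2023_09_17T09_22_21.372556
```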
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_huashiyiqike__testmodel",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T09:22:21.372556](https://huggingface.co/datasets/open-llm-leaderboard/details_huashiyiqike__testmodel/blob/main/results_2023-09-17T09-22-21.372556.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0008389261744966443,
"em_stderr": 0.00029649629898012493,
"f1": 0.026885486577181286,
"f1_stderr": 0.0009984003779091447,
"acc": 0.2509865824782952,
"acc_stderr": 0.007026188129612818
},
"harness|drop|3": {
"em": 0.0008389261744966443,
"em_stderr": 0.00029649629898012493,
"f1": 0.026885486577181286,
"f1_stderr": 0.0009984003779091447
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5019731649565904,
"acc_stderr": 0.014052376259225636
}
}
```
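As a sketch of how these aggregated metrics could be consumed once loaded as a dict (the `results` structure below mirrors the abridged JSON above; the variable names are illustrative):

```python
# Aggregated results in the same shape as the JSON above (abridged).
results = {
    "all": {"acc": 0.2509865824782952, "acc_stderr": 0.007026188129612818},
    "harness|gsm8k|5": {"acc": 0.0, "acc_stderr": 0.0},
    "harness|winogrande|5": {
        "acc": 0.5019731649565904,
        "acc_stderr": 0.014052376259225636,
    },
}

# Collect per-task accuracy, skipping the "all" aggregate
# and tasks that report other metrics (e.g. em/f1 for drop).
per_task_acc = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
}

best_task = max(per_task_acc, key=per_task_acc.get)
print(best_task)  # harness|winogrande|5
```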
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_gywy__llama2-13b-chinese-v2 | 2023-08-27T12:45:45.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of gywy/llama2-13b-chinese-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [gywy/llama2-13b-chinese-v2](https://huggingface.co/gywy/llama2-13b-chinese-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configurations, each one corresponding to one\
\ of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_gywy__llama2-13b-chinese-v2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-23T06:34:52.527942](https://huggingface.co/datasets/open-llm-leaderboard/details_gywy__llama2-13b-chinese-v2/blob/main/results_2023-08-23T06%3A34%3A52.527942.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4985803003924236,\n\
\ \"acc_stderr\": 0.03531008970596033,\n \"acc_norm\": 0.5022923346532945,\n\
\ \"acc_norm_stderr\": 0.035299015367073264,\n \"mc1\": 0.3268053855569155,\n\
\ \"mc1_stderr\": 0.016419874731135032,\n \"mc2\": 0.45427145393809903,\n\
\ \"mc2_stderr\": 0.01572626868352842\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5042662116040956,\n \"acc_stderr\": 0.014610858923956952,\n\
\ \"acc_norm\": 0.5392491467576792,\n \"acc_norm_stderr\": 0.014566303676636588\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5623381796454889,\n\
\ \"acc_stderr\": 0.004950848456984539,\n \"acc_norm\": 0.7463652658832902,\n\
\ \"acc_norm_stderr\": 0.0043420177099679534\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.46710526315789475,\n \"acc_stderr\": 0.040601270352363966,\n\
\ \"acc_norm\": 0.46710526315789475,\n \"acc_norm_stderr\": 0.040601270352363966\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5283018867924528,\n \"acc_stderr\": 0.030723535249006107,\n\
\ \"acc_norm\": 0.5283018867924528,\n \"acc_norm_stderr\": 0.030723535249006107\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5347222222222222,\n\
\ \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.5347222222222222,\n\
\ \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4624277456647399,\n\
\ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.4624277456647399,\n\
\ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3702127659574468,\n \"acc_stderr\": 0.03156564682236785,\n\
\ \"acc_norm\": 0.3702127659574468,\n \"acc_norm_stderr\": 0.03156564682236785\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.041657747757287644,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.041657747757287644\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30423280423280424,\n \"acc_stderr\": 0.023695415009463087,\n \"\
acc_norm\": 0.30423280423280424,\n \"acc_norm_stderr\": 0.023695415009463087\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.535483870967742,\n \"acc_stderr\": 0.02837228779796294,\n \"acc_norm\"\
: 0.535483870967742,\n \"acc_norm_stderr\": 0.02837228779796294\n },\n\
\ \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.37438423645320196,\n\
\ \"acc_stderr\": 0.03405155380561952,\n \"acc_norm\": 0.37438423645320196,\n\
\ \"acc_norm_stderr\": 0.03405155380561952\n },\n \"harness|hendrycksTest-high_school_computer_science|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"\
acc\": 0.6545454545454545,\n \"acc_stderr\": 0.037131580674819115,\n \
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.037131580674819115\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5808080808080808,\n \"acc_stderr\": 0.03515520728670417,\n \"\
acc_norm\": 0.5808080808080808,\n \"acc_norm_stderr\": 0.03515520728670417\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6580310880829016,\n \"acc_stderr\": 0.03423465100104282,\n\
\ \"acc_norm\": 0.6580310880829016,\n \"acc_norm_stderr\": 0.03423465100104282\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.47692307692307695,\n \"acc_stderr\": 0.025323990861736118,\n\
\ \"acc_norm\": 0.47692307692307695,\n \"acc_norm_stderr\": 0.025323990861736118\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606648,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606648\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.46638655462184875,\n \"acc_stderr\": 0.03240501447690071,\n\
\ \"acc_norm\": 0.46638655462184875,\n \"acc_norm_stderr\": 0.03240501447690071\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6,\n \"acc_stderr\": 0.021004201260420075,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.021004201260420075\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.0321495214780275,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.0321495214780275\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6225490196078431,\n \"acc_stderr\": 0.03402272044340703,\n \"\
acc_norm\": 0.6225490196078431,\n \"acc_norm_stderr\": 0.03402272044340703\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7257383966244726,\n \"acc_stderr\": 0.02904133351059804,\n \
\ \"acc_norm\": 0.7257383966244726,\n \"acc_norm_stderr\": 0.02904133351059804\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n\
\ \"acc_stderr\": 0.03259625118416828,\n \"acc_norm\": 0.6188340807174888,\n\
\ \"acc_norm_stderr\": 0.03259625118416828\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.04317171194870254,\n\
\ \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.04317171194870254\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7024793388429752,\n \"acc_stderr\": 0.04173349148083499,\n \"\
acc_norm\": 0.7024793388429752,\n \"acc_norm_stderr\": 0.04173349148083499\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04750077341199985,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04750077341199985\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5521472392638037,\n \"acc_stderr\": 0.03906947479456607,\n\
\ \"acc_norm\": 0.5521472392638037,\n \"acc_norm_stderr\": 0.03906947479456607\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613538,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613538\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6116504854368932,\n \"acc_stderr\": 0.0482572933735639,\n\
\ \"acc_norm\": 0.6116504854368932,\n \"acc_norm_stderr\": 0.0482572933735639\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6965811965811965,\n\
\ \"acc_stderr\": 0.030118210106942652,\n \"acc_norm\": 0.6965811965811965,\n\
\ \"acc_norm_stderr\": 0.030118210106942652\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6551724137931034,\n\
\ \"acc_stderr\": 0.01699712334611342,\n \"acc_norm\": 0.6551724137931034,\n\
\ \"acc_norm_stderr\": 0.01699712334611342\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5664739884393064,\n \"acc_stderr\": 0.026680134761679217,\n\
\ \"acc_norm\": 0.5664739884393064,\n \"acc_norm_stderr\": 0.026680134761679217\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3675977653631285,\n\
\ \"acc_stderr\": 0.01612554382355295,\n \"acc_norm\": 0.3675977653631285,\n\
\ \"acc_norm_stderr\": 0.01612554382355295\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5392156862745098,\n \"acc_stderr\": 0.028541722692618877,\n\
\ \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.028541722692618877\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5594855305466238,\n\
\ \"acc_stderr\": 0.028196400574197422,\n \"acc_norm\": 0.5594855305466238,\n\
\ \"acc_norm_stderr\": 0.028196400574197422\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5308641975308642,\n \"acc_stderr\": 0.02776768960683394,\n\
\ \"acc_norm\": 0.5308641975308642,\n \"acc_norm_stderr\": 0.02776768960683394\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.38652482269503546,\n \"acc_stderr\": 0.02904919034254346,\n \
\ \"acc_norm\": 0.38652482269503546,\n \"acc_norm_stderr\": 0.02904919034254346\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3683181225554107,\n\
\ \"acc_stderr\": 0.012319403369564639,\n \"acc_norm\": 0.3683181225554107,\n\
\ \"acc_norm_stderr\": 0.012319403369564639\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.39705882352941174,\n \"acc_stderr\": 0.029722152099280072,\n\
\ \"acc_norm\": 0.39705882352941174,\n \"acc_norm_stderr\": 0.029722152099280072\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.020227834851568375,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.020227834851568375\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.5818181818181818,\n \"acc_stderr\": 0.04724577405731572,\n\
\ \"acc_norm\": 0.5818181818181818,\n \"acc_norm_stderr\": 0.04724577405731572\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.031362502409358936,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.031362502409358936\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.7014925373134329,\n \"acc_stderr\": 0.03235743789355042,\n\
\ \"acc_norm\": 0.7014925373134329,\n \"acc_norm_stderr\": 0.03235743789355042\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.4397590361445783,\n \"acc_stderr\": 0.03864139923699122,\n\
\ \"acc_norm\": 0.4397590361445783,\n \"acc_norm_stderr\": 0.03864139923699122\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6257309941520468,\n\
\ \"acc_stderr\": 0.03711601185389481,\n \"acc_norm\": 0.6257309941520468,\n\
\ \"acc_norm_stderr\": 0.03711601185389481\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.3268053855569155,\n \"mc1_stderr\": 0.016419874731135032,\n\
\ \"mc2\": 0.45427145393809903,\n \"mc2_stderr\": 0.01572626868352842\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Aspik101/llama-30b-instruct-2048-PL-lora
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|arc:challenge|25_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hellaswag|10_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T06:34:52.527942.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T06:34:52.527942.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_23T06_34_52.527942
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T06:34:52.527942.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T06:34:52.527942.parquet'
---
# Dataset Card for Evaluation run of gywy/llama2-13b-chinese-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/gywy/llama2-13b-chinese-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [gywy/llama2-13b-chinese-v2](https://huggingface.co/gywy/llama2-13b-chinese-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_gywy__llama2-13b-chinese-v2",
"harness_truthfulqa_mc_0",
split="train")
```
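The `latest` split always aliases the most recent run, but each run can also be loaded directly by its timestamped split name. As a minimal illustrative sketch (not part of the `datasets` API, and needing no network access), the split names used in this card are derived from the run timestamp by replacing the `-` and `:` separators with `_`:

```python
def timestamp_to_split(ts: str) -> str:
    """Map a run timestamp to the split name used in this card.

    Note: this helper is illustrative, not part of the `datasets` library.
    """
    # Split names cannot contain "-" or ":", so both become "_".
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-08-23T06:34:52.527942"))
# 2023_08_23T06_34_52.527942
```

The resulting name can then be passed as `split=` to `load_dataset` to pin a specific run instead of `latest`.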
## Latest results
These are the [latest results from run 2023-08-23T06:34:52.527942](https://huggingface.co/datasets/open-llm-leaderboard/details_gywy__llama2-13b-chinese-v2/blob/main/results_2023-08-23T06%3A34%3A52.527942.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4985803003924236,
"acc_stderr": 0.03531008970596033,
"acc_norm": 0.5022923346532945,
"acc_norm_stderr": 0.035299015367073264,
"mc1": 0.3268053855569155,
"mc1_stderr": 0.016419874731135032,
"mc2": 0.45427145393809903,
"mc2_stderr": 0.01572626868352842
},
"harness|arc:challenge|25": {
"acc": 0.5042662116040956,
"acc_stderr": 0.014610858923956952,
"acc_norm": 0.5392491467576792,
"acc_norm_stderr": 0.014566303676636588
},
"harness|hellaswag|10": {
"acc": 0.5623381796454889,
"acc_stderr": 0.004950848456984539,
"acc_norm": 0.7463652658832902,
"acc_norm_stderr": 0.0043420177099679534
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.46710526315789475,
"acc_stderr": 0.040601270352363966,
"acc_norm": 0.46710526315789475,
"acc_norm_stderr": 0.040601270352363966
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5283018867924528,
"acc_stderr": 0.030723535249006107,
"acc_norm": 0.5283018867924528,
"acc_norm_stderr": 0.030723535249006107
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5347222222222222,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.5347222222222222,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4624277456647399,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.4624277456647399,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3702127659574468,
"acc_stderr": 0.03156564682236785,
"acc_norm": 0.3702127659574468,
"acc_norm_stderr": 0.03156564682236785
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.041657747757287644,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.041657747757287644
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30423280423280424,
"acc_stderr": 0.023695415009463087,
"acc_norm": 0.30423280423280424,
"acc_norm_stderr": 0.023695415009463087
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.535483870967742,
"acc_stderr": 0.02837228779796294,
"acc_norm": 0.535483870967742,
"acc_norm_stderr": 0.02837228779796294
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.37438423645320196,
"acc_stderr": 0.03405155380561952,
"acc_norm": 0.37438423645320196,
"acc_norm_stderr": 0.03405155380561952
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.037131580674819115,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.037131580674819115
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5808080808080808,
"acc_stderr": 0.03515520728670417,
"acc_norm": 0.5808080808080808,
"acc_norm_stderr": 0.03515520728670417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6580310880829016,
"acc_stderr": 0.03423465100104282,
"acc_norm": 0.6580310880829016,
"acc_norm_stderr": 0.03423465100104282
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.47692307692307695,
"acc_stderr": 0.025323990861736118,
"acc_norm": 0.47692307692307695,
"acc_norm_stderr": 0.025323990861736118
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606648,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606648
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.46638655462184875,
"acc_stderr": 0.03240501447690071,
"acc_norm": 0.46638655462184875,
"acc_norm_stderr": 0.03240501447690071
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6,
"acc_stderr": 0.021004201260420075,
"acc_norm": 0.6,
"acc_norm_stderr": 0.021004201260420075
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.0321495214780275,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.0321495214780275
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6225490196078431,
"acc_stderr": 0.03402272044340703,
"acc_norm": 0.6225490196078431,
"acc_norm_stderr": 0.03402272044340703
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7257383966244726,
"acc_stderr": 0.02904133351059804,
"acc_norm": 0.7257383966244726,
"acc_norm_stderr": 0.02904133351059804
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416828,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416828
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.04317171194870254,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.04317171194870254
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7024793388429752,
"acc_stderr": 0.04173349148083499,
"acc_norm": 0.7024793388429752,
"acc_norm_stderr": 0.04173349148083499
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04750077341199985,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04750077341199985
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5521472392638037,
"acc_stderr": 0.03906947479456607,
"acc_norm": 0.5521472392638037,
"acc_norm_stderr": 0.03906947479456607
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613538,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613538
},
"harness|hendrycksTest-management|5": {
"acc": 0.6116504854368932,
"acc_stderr": 0.0482572933735639,
"acc_norm": 0.6116504854368932,
"acc_norm_stderr": 0.0482572933735639
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6965811965811965,
"acc_stderr": 0.030118210106942652,
"acc_norm": 0.6965811965811965,
"acc_norm_stderr": 0.030118210106942652
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6551724137931034,
"acc_stderr": 0.01699712334611342,
"acc_norm": 0.6551724137931034,
"acc_norm_stderr": 0.01699712334611342
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.026680134761679217,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.026680134761679217
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3675977653631285,
"acc_stderr": 0.01612554382355295,
"acc_norm": 0.3675977653631285,
"acc_norm_stderr": 0.01612554382355295
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5392156862745098,
"acc_stderr": 0.028541722692618877,
"acc_norm": 0.5392156862745098,
"acc_norm_stderr": 0.028541722692618877
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5594855305466238,
"acc_stderr": 0.028196400574197422,
"acc_norm": 0.5594855305466238,
"acc_norm_stderr": 0.028196400574197422
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5308641975308642,
"acc_stderr": 0.02776768960683394,
"acc_norm": 0.5308641975308642,
"acc_norm_stderr": 0.02776768960683394
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.38652482269503546,
"acc_stderr": 0.02904919034254346,
"acc_norm": 0.38652482269503546,
"acc_norm_stderr": 0.02904919034254346
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3683181225554107,
"acc_stderr": 0.012319403369564639,
"acc_norm": 0.3683181225554107,
"acc_norm_stderr": 0.012319403369564639
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.39705882352941174,
"acc_stderr": 0.029722152099280072,
"acc_norm": 0.39705882352941174,
"acc_norm_stderr": 0.029722152099280072
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5,
"acc_stderr": 0.020227834851568375,
"acc_norm": 0.5,
"acc_norm_stderr": 0.020227834851568375
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731572,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731572
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6,
"acc_stderr": 0.031362502409358936,
"acc_norm": 0.6,
"acc_norm_stderr": 0.031362502409358936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7014925373134329,
"acc_stderr": 0.03235743789355042,
"acc_norm": 0.7014925373134329,
"acc_norm_stderr": 0.03235743789355042
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6257309941520468,
"acc_stderr": 0.03711601185389481,
"acc_norm": 0.6257309941520468,
"acc_norm_stderr": 0.03711601185389481
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3268053855569155,
"mc1_stderr": 0.016419874731135032,
"mc2": 0.45427145393809903,
"mc2_stderr": 0.01572626868352842
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_pe-nlp__llama-2-13b-platypus-vicuna-wizard | 2023-09-23T01:39:36.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of pe-nlp/llama-2-13b-platypus-vicuna-wizard
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [pe-nlp/llama-2-13b-platypus-vicuna-wizard](https://huggingface.co/pe-nlp/llama-2-13b-platypus-vicuna-wizard)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pe-nlp__llama-2-13b-platypus-vicuna-wizard\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T01:39:24.392749](https://huggingface.co/datasets/open-llm-leaderboard/details_pe-nlp__llama-2-13b-platypus-vicuna-wizard/blob/main/results_2023-09-23T01-39-24.392749.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.4077181208053691,\n\
\ \"em_stderr\": 0.005032501129819524,\n \"f1\": 0.44956795302013525,\n\
\ \"f1_stderr\": 0.004900290116380425,\n \"acc\": 0.3833965723476863,\n\
\ \"acc_stderr\": 0.007328839518475228\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.4077181208053691,\n \"em_stderr\": 0.005032501129819524,\n\
\ \"f1\": 0.44956795302013525,\n \"f1_stderr\": 0.004900290116380425\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.009097801364670205,\n \
\ \"acc_stderr\": 0.002615326510775672\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7576953433307024,\n \"acc_stderr\": 0.012042352526174785\n\
\ }\n}\n```"
repo_url: https://huggingface.co/pe-nlp/llama-2-13b-platypus-vicuna-wizard
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|arc:challenge|25_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T01_39_24.392749
path:
- '**/details_harness|drop|3_2023-09-23T01-39-24.392749.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T01-39-24.392749.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T01_39_24.392749
path:
- '**/details_harness|gsm8k|5_2023-09-23T01-39-24.392749.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T01-39-24.392749.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hellaswag|10_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T06:17:52.527407.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_23T06_17_52.527407
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T06:17:52.527407.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T06:17:52.527407.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T01_39_24.392749
path:
- '**/details_harness|winogrande|5_2023-09-23T01-39-24.392749.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T01-39-24.392749.parquet'
- config_name: results
data_files:
- split: 2023_09_23T01_39_24.392749
path:
- results_2023-09-23T01-39-24.392749.parquet
- split: latest
path:
- results_2023-09-23T01-39-24.392749.parquet
---
# Dataset Card for Evaluation run of pe-nlp/llama-2-13b-platypus-vicuna-wizard
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/pe-nlp/llama-2-13b-platypus-vicuna-wizard
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [pe-nlp/llama-2-13b-platypus-vicuna-wizard](https://huggingface.co/pe-nlp/llama-2-13b-platypus-vicuna-wizard) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_pe-nlp__llama-2-13b-platypus-vicuna-wizard",
"harness_winogrande_5",
split="latest")
```
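The split names listed in the configurations above are derived from the run timestamp. A minimal sketch of the mapping, inferred from the split names in this card (the transformation rule is an observation, not documented leaderboard behavior):

```python
# Inferred mapping from a run timestamp to the split name used in the
# configurations above: "-" and ":" become "_", the microsecond dot is kept.
run_timestamp = "2023-09-23T01:39:24.392749"
split_name = run_timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # 2023_09_23T01_39_24.392749
```

This is why the YAML above lists splits such as `2023_09_23T01_39_24.392749` alongside the `latest` alias.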
## Latest results
These are the [latest results from run 2023-09-23T01:39:24.392749](https://huggingface.co/datasets/open-llm-leaderboard/details_pe-nlp__llama-2-13b-platypus-vicuna-wizard/blob/main/results_2023-09-23T01-39-24.392749.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.4077181208053691,
"em_stderr": 0.005032501129819524,
"f1": 0.44956795302013525,
"f1_stderr": 0.004900290116380425,
"acc": 0.3833965723476863,
"acc_stderr": 0.007328839518475228
},
"harness|drop|3": {
"em": 0.4077181208053691,
"em_stderr": 0.005032501129819524,
"f1": 0.44956795302013525,
"f1_stderr": 0.004900290116380425
},
"harness|gsm8k|5": {
"acc": 0.009097801364670205,
"acc_stderr": 0.002615326510775672
},
"harness|winogrande|5": {
"acc": 0.7576953433307024,
"acc_stderr": 0.012042352526174785
}
}
```
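Once loaded, the payload above is a plain nested dictionary; a minimal sketch of pulling the per-task accuracies out of it (the dict below is a hand-copied excerpt of the JSON above, not fetched from the Hub):

```python
# Hand-copied excerpt of the results payload above (not fetched from the Hub).
results = {
    "all": {"em": 0.4077181208053691, "f1": 0.44956795302013525,
            "acc": 0.3833965723476863},
    "harness|drop|3": {"em": 0.4077181208053691, "f1": 0.44956795302013525},
    "harness|gsm8k|5": {"acc": 0.009097801364670205},
    "harness|winogrande|5": {"acc": 0.7576953433307024},
}

# Keep only per-task entries that report an accuracy, skipping the "all"
# aggregate (and tasks like drop that only report em/f1).
task_acc = {task: metrics["acc"]
            for task, metrics in results.items()
            if task != "all" and "acc" in metrics}
print(task_acc)
```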
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
---
pretty_name: Evaluation run of chargoddard/MelangeA-70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [chargoddard/MelangeA-70b](https://huggingface.co/chargoddard/MelangeA-70b) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__MelangeA-70b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-23T13:15:46.123810](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__MelangeA-70b/blob/main/results_2023-08-23T13%3A15%3A46.123810.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each one in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7049631693964158,\n\
\ \"acc_stderr\": 0.031169216813298206,\n \"acc_norm\": 0.7085775775902797,\n\
\ \"acc_norm_stderr\": 0.031140807495055736,\n \"mc1\": 0.41982864137086906,\n\
\ \"mc1_stderr\": 0.01727703030177577,\n \"mc2\": 0.6061335096012639,\n\
\ \"mc2_stderr\": 0.01484530713808182\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.681740614334471,\n \"acc_stderr\": 0.013611993916971453,\n\
\ \"acc_norm\": 0.712457337883959,\n \"acc_norm_stderr\": 0.013226719056266129\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6904999004182434,\n\
\ \"acc_stderr\": 0.004613427745209517,\n \"acc_norm\": 0.8730332603067118,\n\
\ \"acc_norm_stderr\": 0.003322552829608903\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03317672787533157,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03317672787533157\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7358490566037735,\n \"acc_stderr\": 0.027134291628741706,\n\
\ \"acc_norm\": 0.7358490566037735,\n \"acc_norm_stderr\": 0.027134291628741706\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.031164899666948617,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.031164899666948617\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736411,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736411\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.048108401480826346,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.048108401480826346\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6936170212765957,\n \"acc_stderr\": 0.030135906478517563,\n\
\ \"acc_norm\": 0.6936170212765957,\n \"acc_norm_stderr\": 0.030135906478517563\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6620689655172414,\n \"acc_stderr\": 0.039417076320648906,\n\
\ \"acc_norm\": 0.6620689655172414,\n \"acc_norm_stderr\": 0.039417076320648906\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.02568056464005688,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.02568056464005688\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5396825396825397,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.5396825396825397,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8193548387096774,\n\
\ \"acc_stderr\": 0.021886178567172534,\n \"acc_norm\": 0.8193548387096774,\n\
\ \"acc_norm_stderr\": 0.021886178567172534\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5665024630541872,\n \"acc_stderr\": 0.034867317274198714,\n\
\ \"acc_norm\": 0.5665024630541872,\n \"acc_norm_stderr\": 0.034867317274198714\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\"\
: 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.028450388805284357,\n\
\ \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.028450388805284357\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"\
acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240528,\n\
\ \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.017426974154240528\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7128205128205128,\n \"acc_stderr\": 0.022939925418530616,\n\
\ \"acc_norm\": 0.7128205128205128,\n \"acc_norm_stderr\": 0.022939925418530616\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253252,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253252\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7815126050420168,\n \"acc_stderr\": 0.02684151432295894,\n \
\ \"acc_norm\": 0.7815126050420168,\n \"acc_norm_stderr\": 0.02684151432295894\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4768211920529801,\n \"acc_stderr\": 0.04078093859163083,\n \"\
acc_norm\": 0.4768211920529801,\n \"acc_norm_stderr\": 0.04078093859163083\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9009174311926605,\n \"acc_stderr\": 0.01280978008187893,\n \"\
acc_norm\": 0.9009174311926605,\n \"acc_norm_stderr\": 0.01280978008187893\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5740740740740741,\n \"acc_stderr\": 0.033723432716530624,\n \"\
acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.033723432716530624\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9068627450980392,\n \"acc_stderr\": 0.020397853969427,\n \"acc_norm\"\
: 0.9068627450980392,\n \"acc_norm_stderr\": 0.020397853969427\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.9071729957805907,\n \"acc_stderr\": 0.018889750550956718,\n \"\
acc_norm\": 0.9071729957805907,\n \"acc_norm_stderr\": 0.018889750550956718\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7757847533632287,\n\
\ \"acc_stderr\": 0.02799153425851952,\n \"acc_norm\": 0.7757847533632287,\n\
\ \"acc_norm_stderr\": 0.02799153425851952\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8473282442748091,\n \"acc_stderr\": 0.031545216720054725,\n\
\ \"acc_norm\": 0.8473282442748091,\n \"acc_norm_stderr\": 0.031545216720054725\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8677685950413223,\n \"acc_stderr\": 0.030922788320445795,\n \"\
acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.030922788320445795\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709225,\n\
\ \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709225\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5714285714285714,\n\
\ \"acc_stderr\": 0.04697113923010213,\n \"acc_norm\": 0.5714285714285714,\n\
\ \"acc_norm_stderr\": 0.04697113923010213\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.03675668832233188,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.03675668832233188\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n\
\ \"acc_stderr\": 0.019875655027867447,\n \"acc_norm\": 0.8974358974358975,\n\
\ \"acc_norm_stderr\": 0.019875655027867447\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8659003831417624,\n\
\ \"acc_stderr\": 0.012185528166499978,\n \"acc_norm\": 0.8659003831417624,\n\
\ \"acc_norm_stderr\": 0.012185528166499978\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7890173410404624,\n \"acc_stderr\": 0.021966309947043114,\n\
\ \"acc_norm\": 0.7890173410404624,\n \"acc_norm_stderr\": 0.021966309947043114\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6134078212290502,\n\
\ \"acc_stderr\": 0.01628667487910102,\n \"acc_norm\": 0.6134078212290502,\n\
\ \"acc_norm_stderr\": 0.01628667487910102\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.023929155517351277,\n\
\ \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.023929155517351277\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.77491961414791,\n\
\ \"acc_stderr\": 0.023720088516179027,\n \"acc_norm\": 0.77491961414791,\n\
\ \"acc_norm_stderr\": 0.023720088516179027\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.021613809395224802,\n\
\ \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.021613809395224802\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5531914893617021,\n \"acc_stderr\": 0.029658235097666907,\n \
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.029658235097666907\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6010430247718384,\n\
\ \"acc_stderr\": 0.012506757655293682,\n \"acc_norm\": 0.6010430247718384,\n\
\ \"acc_norm_stderr\": 0.012506757655293682\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.026799562024887657,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.026799562024887657\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7565359477124183,\n \"acc_stderr\": 0.017362473762146606,\n \
\ \"acc_norm\": 0.7565359477124183,\n \"acc_norm_stderr\": 0.017362473762146606\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n\
\ \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.7454545454545455,\n\
\ \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7877551020408163,\n \"acc_stderr\": 0.026176967197866767,\n\
\ \"acc_norm\": 0.7877551020408163,\n \"acc_norm_stderr\": 0.026176967197866767\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n\
\ \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n\
\ \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.0266405825391332,\n\
\ \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.0266405825391332\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.41982864137086906,\n\
\ \"mc1_stderr\": 0.01727703030177577,\n \"mc2\": 0.6061335096012639,\n\
\ \"mc2_stderr\": 0.01484530713808182\n }\n}\n```"
repo_url: https://huggingface.co/chargoddard/MelangeA-70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|arc:challenge|25_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hellaswag|10_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T13:15:46.123810.parquet'
---
# Dataset Card for Evaluation run of chargoddard/MelangeA-70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/chargoddard/MelangeA-70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [chargoddard/MelangeA-70b](https://huggingface.co/chargoddard/MelangeA-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chargoddard__MelangeA-70b",
"harness_truthfulqa_mc_0",
split="train")
```
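
The timestamped split names listed in the configuration above (e.g. `2023_08_23T13_15_46.123810`) appear to be derived from the run timestamp by replacing filesystem-unfriendly characters. The helper below is a sketch of that mapping as inferred from this card's config listing, not an official API:

```python
# Sketch: derive a split name from a run timestamp, following the convention
# seen in this card's config listing (hyphens and colons become underscores,
# the microsecond dot is kept). This mapping is an inference, not an API.

def timestamp_to_split(timestamp: str) -> str:
    """Convert an ISO-like run timestamp to the split-name convention."""
    return timestamp.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-08-23T13:15:46.123810"))
# -> 2023_08_23T13_15_46.123810
```

With a split name in hand, you can pass it as the `split` argument of `load_dataset` instead of `"latest"` to pin a specific run.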
## Latest results
These are the [latest results from run 2023-08-23T13:15:46.123810](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__MelangeA-70b/blob/main/results_2023-08-23T13%3A15%3A46.123810.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7049631693964158,
"acc_stderr": 0.031169216813298206,
"acc_norm": 0.7085775775902797,
"acc_norm_stderr": 0.031140807495055736,
"mc1": 0.41982864137086906,
"mc1_stderr": 0.01727703030177577,
"mc2": 0.6061335096012639,
"mc2_stderr": 0.01484530713808182
},
"harness|arc:challenge|25": {
"acc": 0.681740614334471,
"acc_stderr": 0.013611993916971453,
"acc_norm": 0.712457337883959,
"acc_norm_stderr": 0.013226719056266129
},
"harness|hellaswag|10": {
"acc": 0.6904999004182434,
"acc_stderr": 0.004613427745209517,
"acc_norm": 0.8730332603067118,
"acc_norm_stderr": 0.003322552829608903
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7358490566037735,
"acc_stderr": 0.027134291628741706,
"acc_norm": 0.7358490566037735,
"acc_norm_stderr": 0.027134291628741706
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.031164899666948617,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.031164899666948617
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736411,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736411
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.048108401480826346,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.048108401480826346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6936170212765957,
"acc_stderr": 0.030135906478517563,
"acc_norm": 0.6936170212765957,
"acc_norm_stderr": 0.030135906478517563
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6620689655172414,
"acc_stderr": 0.039417076320648906,
"acc_norm": 0.6620689655172414,
"acc_norm_stderr": 0.039417076320648906
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.02568056464005688,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.02568056464005688
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5396825396825397,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.5396825396825397,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8193548387096774,
"acc_stderr": 0.021886178567172534,
"acc_norm": 0.8193548387096774,
"acc_norm_stderr": 0.021886178567172534
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5665024630541872,
"acc_stderr": 0.034867317274198714,
"acc_norm": 0.5665024630541872,
"acc_norm_stderr": 0.034867317274198714
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8424242424242424,
"acc_stderr": 0.028450388805284357,
"acc_norm": 0.8424242424242424,
"acc_norm_stderr": 0.028450388805284357
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.017426974154240528,
"acc_norm": 0.9378238341968912,
"acc_norm_stderr": 0.017426974154240528
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7128205128205128,
"acc_stderr": 0.022939925418530616,
"acc_norm": 0.7128205128205128,
"acc_norm_stderr": 0.022939925418530616
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253252,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253252
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7815126050420168,
"acc_stderr": 0.02684151432295894,
"acc_norm": 0.7815126050420168,
"acc_norm_stderr": 0.02684151432295894
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4768211920529801,
"acc_stderr": 0.04078093859163083,
"acc_norm": 0.4768211920529801,
"acc_norm_stderr": 0.04078093859163083
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9009174311926605,
"acc_stderr": 0.01280978008187893,
"acc_norm": 0.9009174311926605,
"acc_norm_stderr": 0.01280978008187893
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.033723432716530624,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.033723432716530624
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9068627450980392,
"acc_stderr": 0.020397853969427,
"acc_norm": 0.9068627450980392,
"acc_norm_stderr": 0.020397853969427
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9071729957805907,
"acc_stderr": 0.018889750550956718,
"acc_norm": 0.9071729957805907,
"acc_norm_stderr": 0.018889750550956718
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7757847533632287,
"acc_stderr": 0.02799153425851952,
"acc_norm": 0.7757847533632287,
"acc_norm_stderr": 0.02799153425851952
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8473282442748091,
"acc_stderr": 0.031545216720054725,
"acc_norm": 0.8473282442748091,
"acc_norm_stderr": 0.031545216720054725
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.030922788320445795,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.030922788320445795
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.803680981595092,
"acc_stderr": 0.031207970394709225,
"acc_norm": 0.803680981595092,
"acc_norm_stderr": 0.031207970394709225
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04697113923010213,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04697113923010213
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.03675668832233188,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.03675668832233188
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8974358974358975,
"acc_stderr": 0.019875655027867447,
"acc_norm": 0.8974358974358975,
"acc_norm_stderr": 0.019875655027867447
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8659003831417624,
"acc_stderr": 0.012185528166499978,
"acc_norm": 0.8659003831417624,
"acc_norm_stderr": 0.012185528166499978
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7890173410404624,
"acc_stderr": 0.021966309947043114,
"acc_norm": 0.7890173410404624,
"acc_norm_stderr": 0.021966309947043114
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6134078212290502,
"acc_stderr": 0.01628667487910102,
"acc_norm": 0.6134078212290502,
"acc_norm_stderr": 0.01628667487910102
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.023929155517351277,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.023929155517351277
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.77491961414791,
"acc_stderr": 0.023720088516179027,
"acc_norm": 0.77491961414791,
"acc_norm_stderr": 0.023720088516179027
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.021613809395224802,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.021613809395224802
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.029658235097666907,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.029658235097666907
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6010430247718384,
"acc_stderr": 0.012506757655293682,
"acc_norm": 0.6010430247718384,
"acc_norm_stderr": 0.012506757655293682
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.026799562024887657,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.026799562024887657
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7565359477124183,
"acc_stderr": 0.017362473762146606,
"acc_norm": 0.7565359477124183,
"acc_norm_stderr": 0.017362473762146606
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.041723430387053825,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.041723430387053825
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7877551020408163,
"acc_stderr": 0.026176967197866767,
"acc_norm": 0.7877551020408163,
"acc_norm_stderr": 0.026176967197866767
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166327,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.0266405825391332,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.0266405825391332
},
"harness|truthfulqa:mc|0": {
"mc1": 0.41982864137086906,
"mc1_stderr": 0.01727703030177577,
"mc2": 0.6061335096012639,
"mc2_stderr": 0.01484530713808182
}
}
```
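
A results dict like the one above keys each eval by a `harness|<task>|<n_shots>` string. The sketch below shows one way to flatten it into a per-task accuracy map, using a small subset of the values shown in this card:

```python
# Sketch: flatten a results dict like the one above into per-task accuracy.
# `results` holds a small subset of the values from this card; the key format
# is "harness|<task>|<n_shots>", as seen in the JSON block above.

results = {
    "harness|arc:challenge|25": {"acc": 0.681740614334471, "acc_norm": 0.712457337883959},
    "harness|hellaswag|10": {"acc": 0.6904999004182434, "acc_norm": 0.8730332603067118},
    "harness|hendrycksTest-virology|5": {"acc": 0.5481927710843374, "acc_norm": 0.5481927710843374},
}

def per_task_accuracy(results: dict) -> dict:
    """Map each task name (the middle field of the key) to its `acc` metric."""
    out = {}
    for key, metrics in results.items():
        _, task, _shots = key.split("|")
        out[task] = metrics["acc"]
    return out

print(per_task_accuracy(results)["hellaswag"])
# -> 0.6904999004182434
```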
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_chargoddard__ypotryll-22b-epoch2-qlora | 2023-09-26T17:07:24.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of chargoddard/ypotryll-22b-epoch2-qlora
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [chargoddard/ypotryll-22b-epoch2-qlora](https://huggingface.co/chargoddard/ypotryll-22b-epoch2-qlora)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__ypotryll-22b-epoch2-qlora\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-26T17:07:11.654928](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__ypotryll-22b-epoch2-qlora/blob/main/results_2023-09-26T17-07-11.654928.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.39198825503355705,\n\
\ \"em_stderr\": 0.004999564353850857,\n \"f1\": 0.452352139261747,\n\
\ \"f1_stderr\": 0.004826380442768646,\n \"acc\": 0.4085244316417271,\n\
\ \"acc_stderr\": 0.00908196050272276\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.39198825503355705,\n \"em_stderr\": 0.004999564353850857,\n\
\ \"f1\": 0.452352139261747,\n \"f1_stderr\": 0.004826380442768646\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.053828658074298714,\n \
\ \"acc_stderr\": 0.006216328640238116\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7632202052091555,\n \"acc_stderr\": 0.0119475923652074\n\
\ }\n}\n```"
repo_url: https://huggingface.co/chargoddard/ypotryll-22b-epoch2-qlora
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|arc:challenge|25_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|arc:challenge|25_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_26T17_07_11.654928
path:
- '**/details_harness|drop|3_2023-09-26T17-07-11.654928.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-26T17-07-11.654928.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_26T17_07_11.654928
path:
- '**/details_harness|gsm8k|5_2023-09-26T17-07-11.654928.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-26T17-07-11.654928.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hellaswag|10_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hellaswag|10_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T22:24:06.867434.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T22:33:04.843641.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T22_24_06.867434
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T22:24:06.867434.parquet'
- split: 2023_08_18T22_33_04.843641
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T22:33:04.843641.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T22:33:04.843641.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_26T17_07_11.654928
path:
- '**/details_harness|winogrande|5_2023-09-26T17-07-11.654928.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-26T17-07-11.654928.parquet'
- config_name: results
data_files:
- split: 2023_09_26T17_07_11.654928
path:
- results_2023-09-26T17-07-11.654928.parquet
- split: latest
path:
- results_2023-09-26T17-07-11.654928.parquet
---
# Dataset Card for Evaluation run of chargoddard/ypotryll-22b-epoch2-qlora
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/chargoddard/ypotryll-22b-epoch2-qlora
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [chargoddard/ypotryll-22b-epoch2-qlora](https://huggingface.co/chargoddard/ypotryll-22b-epoch2-qlora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named after the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chargoddard__ypotryll-22b-epoch2-qlora",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-09-26T17:07:11.654928](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__ypotryll-22b-epoch2-qlora/blob/main/results_2023-09-26T17-07-11.654928.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.39198825503355705,
"em_stderr": 0.004999564353850857,
"f1": 0.452352139261747,
"f1_stderr": 0.004826380442768646,
"acc": 0.4085244316417271,
"acc_stderr": 0.00908196050272276
},
"harness|drop|3": {
"em": 0.39198825503355705,
"em_stderr": 0.004999564353850857,
"f1": 0.452352139261747,
"f1_stderr": 0.004826380442768646
},
"harness|gsm8k|5": {
"acc": 0.053828658074298714,
"acc_stderr": 0.006216328640238116
},
"harness|winogrande|5": {
"acc": 0.7632202052091555,
"acc_stderr": 0.0119475923652074
}
}
```
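For quick inspection, the aggregated JSON above can be parsed directly. A minimal sketch, using an excerpt of the numbers shown rather than a live download of the "results" configuration:

```python
import json

# Excerpt of the aggregated results shown above ("latest" split of the "results" config)
results_json = """
{
  "all": {"em": 0.39198825503355705, "f1": 0.452352139261747, "acc": 0.4085244316417271},
  "harness|drop|3": {"em": 0.39198825503355705, "f1": 0.452352139261747},
  "harness|gsm8k|5": {"acc": 0.053828658074298714},
  "harness|winogrande|5": {"acc": 0.7632202052091555}
}
"""

results = json.loads(results_json)

# Pull per-task accuracy wherever the harness reports one, skipping the "all" aggregate
per_task_acc = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
}
for task, acc in sorted(per_task_acc.items()):
    print(f"{task}: {acc:.4f}")
```

The same dictionary shape is returned when loading the "results" configuration with `load_dataset`, so this filtering pattern carries over to the full data.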
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_chargoddard__MelangeB-70b | 2023-08-27T12:45:53.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of chargoddard/MelangeB-70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [chargoddard/MelangeB-70b](https://huggingface.co/chargoddard/MelangeB-70b) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__MelangeB-70b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-23T14:27:52.893839](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__MelangeB-70b/blob/main/results_2023-08-23T14%3A27%3A52.893839.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6996650574168185,\n\
\ \"acc_stderr\": 0.031147602036690588,\n \"acc_norm\": 0.7035103550754854,\n\
\ \"acc_norm_stderr\": 0.031115910811916572,\n \"mc1\": 0.423500611995104,\n\
\ \"mc1_stderr\": 0.01729742144853474,\n \"mc2\": 0.5935549068383343,\n\
\ \"mc2_stderr\": 0.015237734961215612\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6697952218430034,\n \"acc_stderr\": 0.013743085603760426,\n\
\ \"acc_norm\": 0.7167235494880546,\n \"acc_norm_stderr\": 0.013167478735134575\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6950806612228639,\n\
\ \"acc_stderr\": 0.004594323838650354,\n \"acc_norm\": 0.8750248954391555,\n\
\ \"acc_norm_stderr\": 0.0033001484456091326\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.674074074074074,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.674074074074074,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8157894736842105,\n \"acc_stderr\": 0.031546980450822305,\n\
\ \"acc_norm\": 0.8157894736842105,\n \"acc_norm_stderr\": 0.031546980450822305\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7471698113207547,\n \"acc_stderr\": 0.026749899771241214,\n\
\ \"acc_norm\": 0.7471698113207547,\n \"acc_norm_stderr\": 0.026749899771241214\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n\
\ \"acc_stderr\": 0.03216600808802267,\n \"acc_norm\": 0.8194444444444444,\n\
\ \"acc_norm_stderr\": 0.03216600808802267\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\"\
: 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n\
\ \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n\
\ \"acc_stderr\": 0.048786087144669955,\n \"acc_norm\": 0.4019607843137255,\n\
\ \"acc_norm_stderr\": 0.048786087144669955\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7021276595744681,\n\
\ \"acc_stderr\": 0.029896145682095455,\n \"acc_norm\": 0.7021276595744681,\n\
\ \"acc_norm_stderr\": 0.029896145682095455\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.04685473041907789,\n\
\ \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.04685473041907789\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.6482758620689655,\n \"acc_stderr\": 0.0397923663749741,\n \"acc_norm\"\
: 0.6482758620689655,\n \"acc_norm_stderr\": 0.0397923663749741\n },\n\
\ \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.47883597883597884,\n\
\ \"acc_stderr\": 0.025728230952130726,\n \"acc_norm\": 0.47883597883597884,\n\
\ \"acc_norm_stderr\": 0.025728230952130726\n },\n \"harness|hendrycksTest-formal_logic|5\"\
: {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n\
\ \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n\
\ },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-high_school_biology|5\"\
: {\n \"acc\": 0.8354838709677419,\n \"acc_stderr\": 0.02109084774593931,\n\
\ \"acc_norm\": 0.8354838709677419,\n \"acc_norm_stderr\": 0.02109084774593931\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n \"\
acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\"\
: 0.76,\n \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781678,\n\
\ \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781678\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9090909090909091,\n \"acc_stderr\": 0.020482086775424208,\n \"\
acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.020482086775424208\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240528,\n\
\ \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.017426974154240528\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7256410256410256,\n \"acc_stderr\": 0.02262276576749322,\n \
\ \"acc_norm\": 0.7256410256410256,\n \"acc_norm_stderr\": 0.02262276576749322\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652458,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652458\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7983193277310925,\n \"acc_stderr\": 0.026064313406304534,\n\
\ \"acc_norm\": 0.7983193277310925,\n \"acc_norm_stderr\": 0.026064313406304534\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4503311258278146,\n \"acc_stderr\": 0.04062290018683775,\n \"\
acc_norm\": 0.4503311258278146,\n \"acc_norm_stderr\": 0.04062290018683775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8954128440366973,\n \"acc_stderr\": 0.013120530245265593,\n \"\
acc_norm\": 0.8954128440366973,\n \"acc_norm_stderr\": 0.013120530245265593\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5694444444444444,\n \"acc_stderr\": 0.03376922151252335,\n \"\
acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.03376922151252335\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8921568627450981,\n \"acc_stderr\": 0.021770522281368394,\n \"\
acc_norm\": 0.8921568627450981,\n \"acc_norm_stderr\": 0.021770522281368394\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.869198312236287,\n \"acc_stderr\": 0.02194876605947076,\n \
\ \"acc_norm\": 0.869198312236287,\n \"acc_norm_stderr\": 0.02194876605947076\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7937219730941704,\n\
\ \"acc_stderr\": 0.02715715047956382,\n \"acc_norm\": 0.7937219730941704,\n\
\ \"acc_norm_stderr\": 0.02715715047956382\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8625954198473282,\n \"acc_stderr\": 0.030194823996804475,\n\
\ \"acc_norm\": 0.8625954198473282,\n \"acc_norm_stderr\": 0.030194823996804475\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8347107438016529,\n \"acc_stderr\": 0.03390780612972776,\n \"\
acc_norm\": 0.8347107438016529,\n \"acc_norm_stderr\": 0.03390780612972776\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971726,\n\
\ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971726\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8773946360153256,\n\
\ \"acc_stderr\": 0.011728672144131565,\n \"acc_norm\": 0.8773946360153256,\n\
\ \"acc_norm_stderr\": 0.011728672144131565\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7890173410404624,\n \"acc_stderr\": 0.02196630994704311,\n\
\ \"acc_norm\": 0.7890173410404624,\n \"acc_norm_stderr\": 0.02196630994704311\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6525139664804469,\n\
\ \"acc_stderr\": 0.015925564060208158,\n \"acc_norm\": 0.6525139664804469,\n\
\ \"acc_norm_stderr\": 0.015925564060208158\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7810457516339869,\n \"acc_stderr\": 0.02367908986180772,\n\
\ \"acc_norm\": 0.7810457516339869,\n \"acc_norm_stderr\": 0.02367908986180772\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.77491961414791,\n\
\ \"acc_stderr\": 0.023720088516179027,\n \"acc_norm\": 0.77491961414791,\n\
\ \"acc_norm_stderr\": 0.023720088516179027\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.808641975308642,\n \"acc_stderr\": 0.021887704613396147,\n\
\ \"acc_norm\": 0.808641975308642,\n \"acc_norm_stderr\": 0.021887704613396147\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5638297872340425,\n \"acc_stderr\": 0.029583452036284076,\n \
\ \"acc_norm\": 0.5638297872340425,\n \"acc_norm_stderr\": 0.029583452036284076\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5560625814863103,\n\
\ \"acc_stderr\": 0.012689708167787675,\n \"acc_norm\": 0.5560625814863103,\n\
\ \"acc_norm_stderr\": 0.012689708167787675\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7169117647058824,\n \"acc_stderr\": 0.02736586113151381,\n\
\ \"acc_norm\": 0.7169117647058824,\n \"acc_norm_stderr\": 0.02736586113151381\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.761437908496732,\n \"acc_stderr\": 0.017242385828779613,\n \
\ \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.017242385828779613\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8040816326530612,\n \"acc_stderr\": 0.025409301953225678,\n\
\ \"acc_norm\": 0.8040816326530612,\n \"acc_norm_stderr\": 0.025409301953225678\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n\
\ \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n\
\ \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.032659863237109066,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.032659863237109066\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061452,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061452\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.423500611995104,\n\
\ \"mc1_stderr\": 0.01729742144853474,\n \"mc2\": 0.5935549068383343,\n\
\ \"mc2_stderr\": 0.015237734961215612\n }\n}\n```"
repo_url: https://huggingface.co/chargoddard/MelangeB-70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|arc:challenge|25_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hellaswag|10_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T14:27:52.893839.parquet'
---
# Dataset Card for Evaluation run of chargoddard/MelangeB-70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/chargoddard/MelangeB-70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [chargoddard/MelangeB-70b](https://huggingface.co/chargoddard/MelangeB-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chargoddard__MelangeB-70b",
"harness_truthfulqa_mc_0",
split="train")
```
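The split naming convention above can be illustrated without touching the Hub. The sketch below uses hypothetical run timestamps (only the first one appears in this repo) to show why the "latest" alias is easy to resolve: timestamps in this `YYYY_MM_DDTHH_MM_SS` format sort lexicographically.

```python
# Minimal sketch with hypothetical values: each configuration stores one
# split per evaluation run, named after the run timestamp, plus a "latest"
# alias that points at the most recent run.
run_splits = [
    "2023_08_22T12_31_39.765804",  # hypothetical earlier run
    "2023_08_23T14_27_52.893839",  # the run recorded in this card
]

# Zero-padded timestamps in this format sort lexicographically, so the
# newest run is simply the maximum string.
latest = max(run_splits)
print(latest)  # -> 2023_08_23T14_27_52.893839
```

When a model is re-evaluated, a new timestamped split is appended and "latest" moves to it, so code pinned to `split="latest"` keeps working across successive evals.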
## Latest results
These are the [latest results from run 2023-08-23T14:27:52.893839](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__MelangeB-70b/blob/main/results_2023-08-23T14%3A27%3A52.893839.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6996650574168185,
"acc_stderr": 0.031147602036690588,
"acc_norm": 0.7035103550754854,
"acc_norm_stderr": 0.031115910811916572,
"mc1": 0.423500611995104,
"mc1_stderr": 0.01729742144853474,
"mc2": 0.5935549068383343,
"mc2_stderr": 0.015237734961215612
},
"harness|arc:challenge|25": {
"acc": 0.6697952218430034,
"acc_stderr": 0.013743085603760426,
"acc_norm": 0.7167235494880546,
"acc_norm_stderr": 0.013167478735134575
},
"harness|hellaswag|10": {
"acc": 0.6950806612228639,
"acc_stderr": 0.004594323838650354,
"acc_norm": 0.8750248954391555,
"acc_norm_stderr": 0.0033001484456091326
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.674074074074074,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.674074074074074,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8157894736842105,
"acc_stderr": 0.031546980450822305,
"acc_norm": 0.8157894736842105,
"acc_norm_stderr": 0.031546980450822305
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7471698113207547,
"acc_stderr": 0.026749899771241214,
"acc_norm": 0.7471698113207547,
"acc_norm_stderr": 0.026749899771241214
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.03216600808802267,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.03216600808802267
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7021276595744681,
"acc_stderr": 0.029896145682095455,
"acc_norm": 0.7021276595744681,
"acc_norm_stderr": 0.029896145682095455
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6482758620689655,
"acc_stderr": 0.0397923663749741,
"acc_norm": 0.6482758620689655,
"acc_norm_stderr": 0.0397923663749741
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47883597883597884,
"acc_stderr": 0.025728230952130726,
"acc_norm": 0.47883597883597884,
"acc_norm_stderr": 0.025728230952130726
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8354838709677419,
"acc_stderr": 0.02109084774593931,
"acc_norm": 0.8354838709677419,
"acc_norm_stderr": 0.02109084774593931
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781678,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781678
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.020482086775424208,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.020482086775424208
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.017426974154240528,
"acc_norm": 0.9378238341968912,
"acc_norm_stderr": 0.017426974154240528
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7256410256410256,
"acc_stderr": 0.02262276576749322,
"acc_norm": 0.7256410256410256,
"acc_norm_stderr": 0.02262276576749322
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652458,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652458
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7983193277310925,
"acc_stderr": 0.026064313406304534,
"acc_norm": 0.7983193277310925,
"acc_norm_stderr": 0.026064313406304534
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4503311258278146,
"acc_stderr": 0.04062290018683775,
"acc_norm": 0.4503311258278146,
"acc_norm_stderr": 0.04062290018683775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8954128440366973,
"acc_stderr": 0.013120530245265593,
"acc_norm": 0.8954128440366973,
"acc_norm_stderr": 0.013120530245265593
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.03376922151252335,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.03376922151252335
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8921568627450981,
"acc_stderr": 0.021770522281368394,
"acc_norm": 0.8921568627450981,
"acc_norm_stderr": 0.021770522281368394
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.869198312236287,
"acc_stderr": 0.02194876605947076,
"acc_norm": 0.869198312236287,
"acc_norm_stderr": 0.02194876605947076
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7937219730941704,
"acc_stderr": 0.02715715047956382,
"acc_norm": 0.7937219730941704,
"acc_norm_stderr": 0.02715715047956382
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8625954198473282,
"acc_stderr": 0.030194823996804475,
"acc_norm": 0.8625954198473282,
"acc_norm_stderr": 0.030194823996804475
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8347107438016529,
"acc_stderr": 0.03390780612972776,
"acc_norm": 0.8347107438016529,
"acc_norm_stderr": 0.03390780612972776
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.030446777687971726,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.030446777687971726
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8773946360153256,
"acc_stderr": 0.011728672144131565,
"acc_norm": 0.8773946360153256,
"acc_norm_stderr": 0.011728672144131565
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7890173410404624,
"acc_stderr": 0.02196630994704311,
"acc_norm": 0.7890173410404624,
"acc_norm_stderr": 0.02196630994704311
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6525139664804469,
"acc_stderr": 0.015925564060208158,
"acc_norm": 0.6525139664804469,
"acc_norm_stderr": 0.015925564060208158
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7810457516339869,
"acc_stderr": 0.02367908986180772,
"acc_norm": 0.7810457516339869,
"acc_norm_stderr": 0.02367908986180772
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.77491961414791,
"acc_stderr": 0.023720088516179027,
"acc_norm": 0.77491961414791,
"acc_norm_stderr": 0.023720088516179027
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.808641975308642,
"acc_stderr": 0.021887704613396147,
"acc_norm": 0.808641975308642,
"acc_norm_stderr": 0.021887704613396147
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5638297872340425,
"acc_stderr": 0.029583452036284076,
"acc_norm": 0.5638297872340425,
"acc_norm_stderr": 0.029583452036284076
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5560625814863103,
"acc_stderr": 0.012689708167787675,
"acc_norm": 0.5560625814863103,
"acc_norm_stderr": 0.012689708167787675
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7169117647058824,
"acc_stderr": 0.02736586113151381,
"acc_norm": 0.7169117647058824,
"acc_norm_stderr": 0.02736586113151381
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.761437908496732,
"acc_stderr": 0.017242385828779613,
"acc_norm": 0.761437908496732,
"acc_norm_stderr": 0.017242385828779613
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.04309118709946458,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.04309118709946458
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8040816326530612,
"acc_stderr": 0.025409301953225678,
"acc_norm": 0.8040816326530612,
"acc_norm_stderr": 0.025409301953225678
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700643,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.032659863237109066,
"acc_norm": 0.88,
"acc_norm_stderr": 0.032659863237109066
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061452,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061452
},
"harness|truthfulqa:mc|0": {
"mc1": 0.423500611995104,
"mc1_stderr": 0.01729742144853474,
"mc2": 0.5935549068383343,
"mc2_stderr": 0.015237734961215612
}
}
```
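The results above are a plain dict keyed by `harness|<task>|<num_fewshot>`, with an `"all"` entry holding the averages. A small sketch (using a subset of the values shown above) of how to look up one task's metric:

```python
# Hypothetical sketch: after loading the results file (e.g. with json.load),
# per-task metrics live under keys of the form "harness|<task>|<num_fewshot>".
results = {
    "all": {"acc": 0.6996650574168185, "acc_norm": 0.7035103550754854},
    "harness|hendrycksTest-world_religions|5": {
        "acc": 0.847953216374269,
        "acc_norm": 0.847953216374269,
    },
}

# Build the key from the task name and the few-shot count.
task, num_fewshot = "hendrycksTest-world_religions", 5
key = f"harness|{task}|{num_fewshot}"
print(round(results[key]["acc"], 3))  # -> 0.848
```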
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_chargoddard__MelangeC-70b | 2023-09-23T03:39:28.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of chargoddard/MelangeC-70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [chargoddard/MelangeC-70b](https://huggingface.co/chargoddard/MelangeC-70b) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__MelangeC-70b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T03:39:16.431965](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__MelangeC-70b/blob/main/results_2023-09-23T03-39-16.431965.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.489618288590604,\n\
\ \"em_stderr\": 0.005119364104825758,\n \"f1\": 0.5680631291946334,\n\
\ \"f1_stderr\": 0.004723246870166152,\n \"acc\": 0.4198895027624309,\n\
\ \"acc_stderr\": 0.005154604749093739\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.489618288590604,\n \"em_stderr\": 0.005119364104825758,\n\
\ \"f1\": 0.5680631291946334,\n \"f1_stderr\": 0.004723246870166152\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8397790055248618,\n\
\ \"acc_stderr\": 0.010309209498187479\n }\n}\n```"
repo_url: https://huggingface.co/chargoddard/MelangeC-70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|arc:challenge|25_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T03_39_16.431965
path:
- '**/details_harness|drop|3_2023-09-23T03-39-16.431965.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T03-39-16.431965.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T03_39_16.431965
path:
- '**/details_harness|gsm8k|5_2023-09-23T03-39-16.431965.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T03-39-16.431965.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hellaswag|10_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T03_39_16.431965
path:
- '**/details_harness|winogrande|5_2023-09-23T03-39-16.431965.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T03-39-16.431965.parquet'
- config_name: results
data_files:
- split: 2023_09_23T03_39_16.431965
path:
- results_2023-09-23T03-39-16.431965.parquet
- split: latest
path:
- results_2023-09-23T03-39-16.431965.parquet
---
# Dataset Card for Evaluation run of chargoddard/MelangeC-70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/chargoddard/MelangeC-70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [chargoddard/MelangeC-70b](https://huggingface.co/chargoddard/MelangeC-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chargoddard__MelangeC-70b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-23T03:39:16.431965](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__MelangeC-70b/blob/main/results_2023-09-23T03-39-16.431965.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.489618288590604,
"em_stderr": 0.005119364104825758,
"f1": 0.5680631291946334,
"f1_stderr": 0.004723246870166152,
"acc": 0.4198895027624309,
"acc_stderr": 0.005154604749093739
},
"harness|drop|3": {
"em": 0.489618288590604,
"em_stderr": 0.005119364104825758,
"f1": 0.5680631291946334,
"f1_stderr": 0.004723246870166152
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.8397790055248618,
"acc_stderr": 0.010309209498187479
}
}
```
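As a quick sketch of how this structure can be consumed (treating the snippet above as a plain Python dict, with only a subset of the metrics copied in for brevity), you can pull out the per-task accuracies while skipping the aggregate `"all"` entry:

```python
# Per-task metrics copied from the "Latest results" snippet above.
latest = {
    "all": {"em": 0.489618288590604, "f1": 0.5680631291946334, "acc": 0.4198895027624309},
    "harness|drop|3": {"em": 0.489618288590604, "f1": 0.5680631291946334},
    "harness|gsm8k|5": {"acc": 0.0},
    "harness|winogrande|5": {"acc": 0.8397790055248618},
}

# Collect accuracy per harness task; "all" is the aggregate, and tasks
# like drop report em/f1 instead of acc, so both are filtered out.
per_task_acc = {
    task: metrics["acc"]
    for task, metrics in latest.items()
    if task != "all" and "acc" in metrics
}
print(per_task_acc)
# {'harness|gsm8k|5': 0.0, 'harness|winogrande|5': 0.8397790055248618}
```

The same pattern applies to any results JSON in this repository: top-level keys are task names, values are metric dicts.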
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_beaugogh__Llama2-7b-openorca-mc-v2 | 2023-08-27T12:45:56.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of beaugogh/Llama2-7b-openorca-mc-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [beaugogh/Llama2-7b-openorca-mc-v2](https://huggingface.co/beaugogh/Llama2-7b-openorca-mc-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 60 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_beaugogh__Llama2-7b-openorca-mc-v2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-23T08:24:57.016837](https://huggingface.co/datasets/open-llm-leaderboard/details_beaugogh__Llama2-7b-openorca-mc-v2/blob/main/results_2023-08-23T08%3A24%3A57.016837.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4858809149723908,\n\
\ \"acc_stderr\": 0.035268633291676094,\n \"acc_norm\": 0.4898038900879427,\n\
\ \"acc_norm_stderr\": 0.03525120428354332,\n \"mc1\": 0.35128518971848227,\n\
\ \"mc1_stderr\": 0.0167113581635444,\n \"mc2\": 0.5149452069709564,\n\
\ \"mc2_stderr\": 0.015640395281238225\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.514505119453925,\n \"acc_stderr\": 0.014605241081370056,\n\
\ \"acc_norm\": 0.5554607508532423,\n \"acc_norm_stderr\": 0.01452122640562708\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.622087233618801,\n\
\ \"acc_stderr\": 0.00483874730578335,\n \"acc_norm\": 0.8125871340370444,\n\
\ \"acc_norm_stderr\": 0.0038944505016930402\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464243,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464243\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04063302731486671,\n\
\ \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04063302731486671\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5283018867924528,\n \"acc_stderr\": 0.0307235352490061,\n\
\ \"acc_norm\": 0.5283018867924528,\n \"acc_norm_stderr\": 0.0307235352490061\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4791666666666667,\n\
\ \"acc_stderr\": 0.041775789507399935,\n \"acc_norm\": 0.4791666666666667,\n\
\ \"acc_norm_stderr\": 0.041775789507399935\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4277456647398844,\n\
\ \"acc_stderr\": 0.03772446857518026,\n \"acc_norm\": 0.4277456647398844,\n\
\ \"acc_norm_stderr\": 0.03772446857518026\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4553191489361702,\n \"acc_stderr\": 0.03255525359340355,\n\
\ \"acc_norm\": 0.4553191489361702,\n \"acc_norm_stderr\": 0.03255525359340355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.04164188720169377,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.04164188720169377\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3201058201058201,\n \"acc_stderr\": 0.0240268463928735,\n \"acc_norm\"\
: 0.3201058201058201,\n \"acc_norm_stderr\": 0.0240268463928735\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.041905964388711366,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.041905964388711366\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5225806451612903,\n\
\ \"acc_stderr\": 0.02841498501970786,\n \"acc_norm\": 0.5225806451612903,\n\
\ \"acc_norm_stderr\": 0.02841498501970786\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3645320197044335,\n \"acc_stderr\": 0.0338640574606209,\n\
\ \"acc_norm\": 0.3645320197044335,\n \"acc_norm_stderr\": 0.0338640574606209\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.593939393939394,\n \"acc_stderr\": 0.03834816355401181,\n\
\ \"acc_norm\": 0.593939393939394,\n \"acc_norm_stderr\": 0.03834816355401181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5808080808080808,\n \"acc_stderr\": 0.03515520728670417,\n \"\
acc_norm\": 0.5808080808080808,\n \"acc_norm_stderr\": 0.03515520728670417\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7098445595854922,\n \"acc_stderr\": 0.03275264467791515,\n\
\ \"acc_norm\": 0.7098445595854922,\n \"acc_norm_stderr\": 0.03275264467791515\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4564102564102564,\n \"acc_stderr\": 0.025254485424799602,\n\
\ \"acc_norm\": 0.4564102564102564,\n \"acc_norm_stderr\": 0.025254485424799602\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815635,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815635\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.47478991596638653,\n \"acc_stderr\": 0.032437180551374095,\n\
\ \"acc_norm\": 0.47478991596638653,\n \"acc_norm_stderr\": 0.032437180551374095\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6660550458715596,\n \"acc_stderr\": 0.020220554196736407,\n \"\
acc_norm\": 0.6660550458715596,\n \"acc_norm_stderr\": 0.020220554196736407\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3055555555555556,\n \"acc_stderr\": 0.03141554629402543,\n \"\
acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.03141554629402543\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6617647058823529,\n \"acc_stderr\": 0.03320574612945431,\n \"\
acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.03320574612945431\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7046413502109705,\n \"acc_stderr\": 0.029696338713422882,\n \
\ \"acc_norm\": 0.7046413502109705,\n \"acc_norm_stderr\": 0.029696338713422882\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5560538116591929,\n\
\ \"acc_stderr\": 0.03334625674242728,\n \"acc_norm\": 0.5560538116591929,\n\
\ \"acc_norm_stderr\": 0.03334625674242728\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.04328577215262971,\n\
\ \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.04328577215262971\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6033057851239669,\n \"acc_stderr\": 0.04465869780531009,\n \"\
acc_norm\": 0.6033057851239669,\n \"acc_norm_stderr\": 0.04465869780531009\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.048262172941398944,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.048262172941398944\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.50920245398773,\n \"acc_stderr\": 0.03927705600787443,\n\
\ \"acc_norm\": 0.50920245398773,\n \"acc_norm_stderr\": 0.03927705600787443\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6213592233009708,\n \"acc_stderr\": 0.048026946982589726,\n\
\ \"acc_norm\": 0.6213592233009708,\n \"acc_norm_stderr\": 0.048026946982589726\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7264957264957265,\n\
\ \"acc_stderr\": 0.029202540153431194,\n \"acc_norm\": 0.7264957264957265,\n\
\ \"acc_norm_stderr\": 0.029202540153431194\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6628352490421456,\n\
\ \"acc_stderr\": 0.016905207420803547,\n \"acc_norm\": 0.6628352490421456,\n\
\ \"acc_norm_stderr\": 0.016905207420803547\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5115606936416185,\n \"acc_stderr\": 0.02691189868637793,\n\
\ \"acc_norm\": 0.5115606936416185,\n \"acc_norm_stderr\": 0.02691189868637793\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24022346368715083,\n\
\ \"acc_stderr\": 0.014288343803925295,\n \"acc_norm\": 0.24022346368715083,\n\
\ \"acc_norm_stderr\": 0.014288343803925295\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.45751633986928103,\n \"acc_stderr\": 0.028526383452142635,\n\
\ \"acc_norm\": 0.45751633986928103,\n \"acc_norm_stderr\": 0.028526383452142635\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5659163987138264,\n\
\ \"acc_stderr\": 0.028150232244535597,\n \"acc_norm\": 0.5659163987138264,\n\
\ \"acc_norm_stderr\": 0.028150232244535597\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5246913580246914,\n \"acc_stderr\": 0.027786800931427443,\n\
\ \"acc_norm\": 0.5246913580246914,\n \"acc_norm_stderr\": 0.027786800931427443\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3900709219858156,\n \"acc_stderr\": 0.02909767559946393,\n \
\ \"acc_norm\": 0.3900709219858156,\n \"acc_norm_stderr\": 0.02909767559946393\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3767926988265971,\n\
\ \"acc_stderr\": 0.0123764595938944,\n \"acc_norm\": 0.3767926988265971,\n\
\ \"acc_norm_stderr\": 0.0123764595938944\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.030352303395351964,\n\
\ \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.030352303395351964\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4738562091503268,\n \"acc_stderr\": 0.020200164564804588,\n \
\ \"acc_norm\": 0.4738562091503268,\n \"acc_norm_stderr\": 0.020200164564804588\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.04673752333670239,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.04673752333670239\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5061224489795918,\n \"acc_stderr\": 0.03200682020163907,\n\
\ \"acc_norm\": 0.5061224489795918,\n \"acc_norm_stderr\": 0.03200682020163907\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6517412935323383,\n\
\ \"acc_stderr\": 0.03368787466115459,\n \"acc_norm\": 0.6517412935323383,\n\
\ \"acc_norm_stderr\": 0.03368787466115459\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3795180722891566,\n\
\ \"acc_stderr\": 0.03777798822748017,\n \"acc_norm\": 0.3795180722891566,\n\
\ \"acc_norm_stderr\": 0.03777798822748017\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.036155076303109365,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.036155076303109365\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35128518971848227,\n\
\ \"mc1_stderr\": 0.0167113581635444,\n \"mc2\": 0.5149452069709564,\n\
\ \"mc2_stderr\": 0.015640395281238225\n }\n}\n```"
repo_url: https://huggingface.co/beaugogh/Llama2-7b-openorca-mc-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|arc:challenge|25_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hellaswag|10_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T08:24:57.016837.parquet'
---
# Dataset Card for Evaluation run of beaugogh/Llama2-7b-openorca-mc-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/beaugogh/Llama2-7b-openorca-mc-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [beaugogh/Llama2-7b-openorca-mc-v2](https://huggingface.co/beaugogh/Llama2-7b-openorca-mc-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 60 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
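As the `configs` section above shows, the timestamped split names are the run timestamp with the `-` and `:` separators flattened to underscores (the parquet filenames keep the original characters). A small sketch of recovering the run timestamp from a split name, assuming this naming convention holds:

```python
from datetime import datetime

# A split name as it appears in this card's `configs` section.
split_name = "2023_08_23T08_24_57.016837"

# Date and time separators are flattened to underscores in split names,
# so a single strptime format recovers the run timestamp.
run_ts = datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")
print(run_ts.isoformat())  # 2023-08-23T08:24:57.016837
```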
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_beaugogh__Llama2-7b-openorca-mc-v2",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-23T08:24:57.016837](https://huggingface.co/datasets/open-llm-leaderboard/details_beaugogh__Llama2-7b-openorca-mc-v2/blob/main/results_2023-08-23T08%3A24%3A57.016837.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4858809149723908,
"acc_stderr": 0.035268633291676094,
"acc_norm": 0.4898038900879427,
"acc_norm_stderr": 0.03525120428354332,
"mc1": 0.35128518971848227,
"mc1_stderr": 0.0167113581635444,
"mc2": 0.5149452069709564,
"mc2_stderr": 0.015640395281238225
},
"harness|arc:challenge|25": {
"acc": 0.514505119453925,
"acc_stderr": 0.014605241081370056,
"acc_norm": 0.5554607508532423,
"acc_norm_stderr": 0.01452122640562708
},
"harness|hellaswag|10": {
"acc": 0.622087233618801,
"acc_stderr": 0.00483874730578335,
"acc_norm": 0.8125871340370444,
"acc_norm_stderr": 0.0038944505016930402
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464243,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464243
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5283018867924528,
"acc_stderr": 0.0307235352490061,
"acc_norm": 0.5283018867924528,
"acc_norm_stderr": 0.0307235352490061
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4791666666666667,
"acc_stderr": 0.041775789507399935,
"acc_norm": 0.4791666666666667,
"acc_norm_stderr": 0.041775789507399935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4277456647398844,
"acc_stderr": 0.03772446857518026,
"acc_norm": 0.4277456647398844,
"acc_norm_stderr": 0.03772446857518026
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4553191489361702,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.4553191489361702,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.04164188720169377,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.04164188720169377
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3201058201058201,
"acc_stderr": 0.0240268463928735,
"acc_norm": 0.3201058201058201,
"acc_norm_stderr": 0.0240268463928735
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.041905964388711366,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.041905964388711366
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5225806451612903,
"acc_stderr": 0.02841498501970786,
"acc_norm": 0.5225806451612903,
"acc_norm_stderr": 0.02841498501970786
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3645320197044335,
"acc_stderr": 0.0338640574606209,
"acc_norm": 0.3645320197044335,
"acc_norm_stderr": 0.0338640574606209
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.593939393939394,
"acc_stderr": 0.03834816355401181,
"acc_norm": 0.593939393939394,
"acc_norm_stderr": 0.03834816355401181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5808080808080808,
"acc_stderr": 0.03515520728670417,
"acc_norm": 0.5808080808080808,
"acc_norm_stderr": 0.03515520728670417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7098445595854922,
"acc_stderr": 0.03275264467791515,
"acc_norm": 0.7098445595854922,
"acc_norm_stderr": 0.03275264467791515
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4564102564102564,
"acc_stderr": 0.025254485424799602,
"acc_norm": 0.4564102564102564,
"acc_norm_stderr": 0.025254485424799602
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815635,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815635
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.47478991596638653,
"acc_stderr": 0.032437180551374095,
"acc_norm": 0.47478991596638653,
"acc_norm_stderr": 0.032437180551374095
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6660550458715596,
"acc_stderr": 0.020220554196736407,
"acc_norm": 0.6660550458715596,
"acc_norm_stderr": 0.020220554196736407
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.03141554629402543,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.03141554629402543
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.03320574612945431,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.03320574612945431
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7046413502109705,
"acc_stderr": 0.029696338713422882,
"acc_norm": 0.7046413502109705,
"acc_norm_stderr": 0.029696338713422882
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5560538116591929,
"acc_stderr": 0.03334625674242728,
"acc_norm": 0.5560538116591929,
"acc_norm_stderr": 0.03334625674242728
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.04328577215262971,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.04328577215262971
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6033057851239669,
"acc_stderr": 0.04465869780531009,
"acc_norm": 0.6033057851239669,
"acc_norm_stderr": 0.04465869780531009
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.048262172941398944,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.048262172941398944
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.50920245398773,
"acc_stderr": 0.03927705600787443,
"acc_norm": 0.50920245398773,
"acc_norm_stderr": 0.03927705600787443
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.6213592233009708,
"acc_stderr": 0.048026946982589726,
"acc_norm": 0.6213592233009708,
"acc_norm_stderr": 0.048026946982589726
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7264957264957265,
"acc_stderr": 0.029202540153431194,
"acc_norm": 0.7264957264957265,
"acc_norm_stderr": 0.029202540153431194
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6628352490421456,
"acc_stderr": 0.016905207420803547,
"acc_norm": 0.6628352490421456,
"acc_norm_stderr": 0.016905207420803547
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5115606936416185,
"acc_stderr": 0.02691189868637793,
"acc_norm": 0.5115606936416185,
"acc_norm_stderr": 0.02691189868637793
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24022346368715083,
"acc_stderr": 0.014288343803925295,
"acc_norm": 0.24022346368715083,
"acc_norm_stderr": 0.014288343803925295
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.45751633986928103,
"acc_stderr": 0.028526383452142635,
"acc_norm": 0.45751633986928103,
"acc_norm_stderr": 0.028526383452142635
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5659163987138264,
"acc_stderr": 0.028150232244535597,
"acc_norm": 0.5659163987138264,
"acc_norm_stderr": 0.028150232244535597
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5246913580246914,
"acc_stderr": 0.027786800931427443,
"acc_norm": 0.5246913580246914,
"acc_norm_stderr": 0.027786800931427443
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3900709219858156,
"acc_stderr": 0.02909767559946393,
"acc_norm": 0.3900709219858156,
"acc_norm_stderr": 0.02909767559946393
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3767926988265971,
"acc_stderr": 0.0123764595938944,
"acc_norm": 0.3767926988265971,
"acc_norm_stderr": 0.0123764595938944
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5183823529411765,
"acc_stderr": 0.030352303395351964,
"acc_norm": 0.5183823529411765,
"acc_norm_stderr": 0.030352303395351964
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4738562091503268,
"acc_stderr": 0.020200164564804588,
"acc_norm": 0.4738562091503268,
"acc_norm_stderr": 0.020200164564804588
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670239,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670239
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5061224489795918,
"acc_stderr": 0.03200682020163907,
"acc_norm": 0.5061224489795918,
"acc_norm_stderr": 0.03200682020163907
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6517412935323383,
"acc_stderr": 0.03368787466115459,
"acc_norm": 0.6517412935323383,
"acc_norm_stderr": 0.03368787466115459
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3795180722891566,
"acc_stderr": 0.03777798822748017,
"acc_norm": 0.3795180722891566,
"acc_norm_stderr": 0.03777798822748017
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.036155076303109365,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.036155076303109365
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35128518971848227,
"mc1_stderr": 0.0167113581635444,
"mc2": 0.5149452069709564,
"mc2_stderr": 0.015640395281238225
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_totally-not-an-llm__PuddleJumper-13b | 2023-09-17T13:22:55.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of totally-not-an-llm/PuddleJumper-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [totally-not-an-llm/PuddleJumper-13b](https://huggingface.co/totally-not-an-llm/PuddleJumper-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_totally-not-an-llm__PuddleJumper-13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T13:22:43.977787](https://huggingface.co/datasets/open-llm-leaderboard/details_totally-not-an-llm__PuddleJumper-13b/blob/main/results_2023-09-17T13-22-43.977787.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.08452181208053691,\n\
\ \"em_stderr\": 0.002848708763936303,\n \"f1\": 0.20933095637583904,\n\
\ \"f1_stderr\": 0.003279820666133777,\n \"acc\": 0.38053092049715975,\n\
\ \"acc_stderr\": 0.008728490320313857\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.08452181208053691,\n \"em_stderr\": 0.002848708763936303,\n\
\ \"f1\": 0.20933095637583904,\n \"f1_stderr\": 0.003279820666133777\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03335860500379075,\n \
\ \"acc_stderr\": 0.004946282649173775\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7277032359905288,\n \"acc_stderr\": 0.012510697991453937\n\
\ }\n}\n```"
repo_url: https://huggingface.co/totally-not-an-llm/PuddleJumper-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|arc:challenge|25_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T13_22_43.977787
path:
- '**/details_harness|drop|3_2023-09-17T13-22-43.977787.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T13-22-43.977787.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T13_22_43.977787
path:
- '**/details_harness|gsm8k|5_2023-09-17T13-22-43.977787.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T13-22-43.977787.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hellaswag|10_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T13_22_43.977787
path:
- '**/details_harness|winogrande|5_2023-09-17T13-22-43.977787.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T13-22-43.977787.parquet'
- config_name: results
data_files:
- split: 2023_09_17T13_22_43.977787
path:
- results_2023-09-17T13-22-43.977787.parquet
- split: latest
path:
- results_2023-09-17T13-22-43.977787.parquet
---
# Dataset Card for Evaluation run of totally-not-an-llm/PuddleJumper-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/totally-not-an-llm/PuddleJumper-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [totally-not-an-llm/PuddleJumper-13b](https://huggingface.co/totally-not-an-llm/PuddleJumper-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_totally-not-an-llm__PuddleJumper-13b",
"harness_winogrande_5",
split="train")
```
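The timestamped split names are derived from the run timestamp by replacing `-` and `:` with `_` (compare the split `2023_09_17T13_22_43.977787` in the configs above with the run timestamp `2023-09-17T13:22:43.977787`). A small helper, purely illustrative and not part of the `datasets` API, can map one to the other:

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Convert a run timestamp like "2023-09-17T13:22:43.977787"
    into the matching split name "2023_09_17T13_22_43.977787"."""
    date, time = timestamp.split("T")
    return date.replace("-", "_") + "T" + time.replace(":", "_")


# For example, the Winogrande run listed above:
print(run_timestamp_to_split("2023-09-17T13:22:43.977787"))
# 2023_09_17T13_22_43.977787
```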
## Latest results
These are the [latest results from run 2023-09-17T13:22:43.977787](https://huggingface.co/datasets/open-llm-leaderboard/details_totally-not-an-llm__PuddleJumper-13b/blob/main/results_2023-09-17T13-22-43.977787.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.08452181208053691,
"em_stderr": 0.002848708763936303,
"f1": 0.20933095637583904,
"f1_stderr": 0.003279820666133777,
"acc": 0.38053092049715975,
"acc_stderr": 0.008728490320313857
},
"harness|drop|3": {
"em": 0.08452181208053691,
"em_stderr": 0.002848708763936303,
"f1": 0.20933095637583904,
"f1_stderr": 0.003279820666133777
},
"harness|gsm8k|5": {
"acc": 0.03335860500379075,
"acc_stderr": 0.004946282649173775
},
"harness|winogrande|5": {
"acc": 0.7277032359905288,
"acc_stderr": 0.012510697991453937
}
}
```
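As a quick illustration of working with these aggregated numbers (a sketch over the dict shown above, not a library call), the per-task entries are keyed as `harness|<task>|<n_shot>`:

```python
# The aggregated results above, abbreviated to the headline metrics.
results = {
    "all": {"em": 0.08452181208053691, "f1": 0.20933095637583904,
            "acc": 0.38053092049715975},
    "harness|drop|3": {"em": 0.08452181208053691, "f1": 0.20933095637583904},
    "harness|gsm8k|5": {"acc": 0.03335860500379075},
    "harness|winogrande|5": {"acc": 0.7277032359905288},
}

# Task keys follow the pattern "harness|<task>|<n_shot>";
# "all" holds the averages across tasks.
for key, metrics in results.items():
    if key == "all":
        continue
    _, task, n_shot = key.split("|")
    print(f"{task} ({n_shot}-shot): {metrics}")
```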
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_totally-not-an-llm__EverythingLM-13b-V2-16k | 2023-09-17T16:08:20.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of totally-not-an-llm/EverythingLM-13b-V2-16k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [totally-not-an-llm/EverythingLM-13b-V2-16k](https://huggingface.co/totally-not-an-llm/EverythingLM-13b-V2-16k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_totally-not-an-llm__EverythingLM-13b-V2-16k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T16:08:08.117578](https://huggingface.co/datasets/open-llm-leaderboard/details_totally-not-an-llm__EverythingLM-13b-V2-16k/blob/main/results_2023-09-17T16-08-08.117578.json) (note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0024119127516778523,\n\
\ \"em_stderr\": 0.0005023380498893423,\n \"f1\": 0.060858850671140774,\n\
\ \"f1_stderr\": 0.0013785298252049116,\n \"acc\": 0.39915227208673193,\n\
\ \"acc_stderr\": 0.009710896158035016\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0024119127516778523,\n \"em_stderr\": 0.0005023380498893423,\n\
\ \"f1\": 0.060858850671140774,\n \"f1_stderr\": 0.0013785298252049116\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06823351023502654,\n \
\ \"acc_stderr\": 0.006945358944067431\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7300710339384373,\n \"acc_stderr\": 0.0124764333720026\n\
\ }\n}\n```"
repo_url: https://huggingface.co/totally-not-an-llm/EverythingLM-13b-V2-16k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|arc:challenge|25_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T16_08_08.117578
path:
- '**/details_harness|drop|3_2023-09-17T16-08-08.117578.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T16-08-08.117578.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T16_08_08.117578
path:
- '**/details_harness|gsm8k|5_2023-09-17T16-08-08.117578.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T16-08-08.117578.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hellaswag|10_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T16_08_08.117578
path:
- '**/details_harness|winogrande|5_2023-09-17T16-08-08.117578.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T16-08-08.117578.parquet'
- config_name: results
data_files:
- split: 2023_09_17T16_08_08.117578
path:
- results_2023-09-17T16-08-08.117578.parquet
- split: latest
path:
- results_2023-09-17T16-08-08.117578.parquet
---
# Dataset Card for Evaluation run of totally-not-an-llm/EverythingLM-13b-V2-16k
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/totally-not-an-llm/EverythingLM-13b-V2-16k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [totally-not-an-llm/EverythingLM-13b-V2-16k](https://huggingface.co/totally-not-an-llm/EverythingLM-13b-V2-16k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_totally-not-an-llm__EverythingLM-13b-V2-16k",
"harness_winogrande_5",
split="train")
```
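The configuration names used above appear to follow a simple mapping from the harness task identifiers (for example, `harness|truthfulqa:mc|0` becomes the config `harness_truthfulqa_mc_0`). A minimal sketch of that mapping, inferred from the config list in this repo rather than from any official API:

```python
def task_to_config_name(task: str) -> str:
    """Map a harness task identifier (e.g. "harness|truthfulqa:mc|0")
    to the corresponding dataset configuration name
    (e.g. "harness_truthfulqa_mc_0"), as observed in this repo's configs."""
    return task.replace("|", "_").replace(":", "_")

print(task_to_config_name("harness|truthfulqa:mc|0"))  # harness_truthfulqa_mc_0
print(task_to_config_name("harness|winogrande|5"))     # harness_winogrande_5
```

This is only a convenience for deriving the `config_name` to pass to `load_dataset`; the authoritative list of configurations is the one in the YAML header of this card.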
## Latest results
These are the [latest results from run 2023-09-17T16:08:08.117578](https://huggingface.co/datasets/open-llm-leaderboard/details_totally-not-an-llm__EverythingLM-13b-V2-16k/blob/main/results_2023-09-17T16-08-08.117578.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0024119127516778523,
"em_stderr": 0.0005023380498893423,
"f1": 0.060858850671140774,
"f1_stderr": 0.0013785298252049116,
"acc": 0.39915227208673193,
"acc_stderr": 0.009710896158035016
},
"harness|drop|3": {
"em": 0.0024119127516778523,
"em_stderr": 0.0005023380498893423,
"f1": 0.060858850671140774,
"f1_stderr": 0.0013785298252049116
},
"harness|gsm8k|5": {
"acc": 0.06823351023502654,
"acc_stderr": 0.006945358944067431
},
"harness|winogrande|5": {
"acc": 0.7300710339384373,
"acc_stderr": 0.0124764333720026
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
desik98/TeluguRiddles | 2023-10-06T05:43:44.000Z | [
"region:us"
] | desik98 | null | null | null | 0 | 0 | Entry not found |
Clavinli/kaggle | 2023-08-27T12:37:49.000Z | [
"region:us"
] | Clavinli | null | null | null | 0 | 0 | Entry not found |
Khush12295/moviereviewssynthetic100 | 2023-08-27T12:45:43.000Z | [
"region:us"
] | Khush12295 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: product
dtype: string
- name: description
dtype: string
- name: critique
dtype: string
splits:
- name: train
num_bytes: 125758
num_examples: 100
download_size: 87388
dataset_size: 125758
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Moviereviewssynthetic100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FinchResearch/Haribon_IQ | 2023-08-27T14:43:24.000Z | [
"region:us"
] | FinchResearch | null | null | null | 0 | 0 | Entry not found |
YeungNLP/firefly-pretrain-dataset | 2023-09-02T13:45:30.000Z | [
"region:us"
] | YeungNLP | null | null | null | 6 | 0 | Entry not found |
Sanchayt/front_rear_cars | 2023-08-27T13:14:19.000Z | [
"region:us"
] | Sanchayt | null | null | null | 0 | 0 | Entry not found |
chenchi/guodegang-penggen | 2023-08-27T13:04:59.000Z | [
"region:us"
] | chenchi | null | null | null | 0 | 0 | Entry not found |
YYXMM/E-comic-cover | 2023-08-27T13:26:56.000Z | [
"license:openrail",
"region:us"
] | YYXMM | null | null | null | 0 | 0 | ---
license: openrail
---
|
Jason-Z111/test | 2023-08-27T13:41:44.000Z | [
"region:us"
] | Jason-Z111 | null | null | null | 0 | 0 | Entry not found |
sid172002/autotrain-data-amc-clinical | 2023-08-27T14:57:31.000Z | [
"region:us"
] | sid172002 | null | null | null | 0 | 0 | Entry not found |
claoire/elsayuhd | 2023-08-27T14:39:47.000Z | [
"region:us"
] | claoire | null | null | null | 0 | 0 | Entry not found |
trttryty/dffgdfddf | 2023-08-27T15:00:15.000Z | [
"region:us"
] | trttryty | null | null | null | 0 | 0 | Entry not found |
JustSaX/Ma_Distilbert_Test | 2023-08-27T15:58:59.000Z | [
"region:us"
] | JustSaX | null | null | null | 0 | 0 | Entry not found |
allistair99/SRHTest | 2023-08-27T15:50:43.000Z | [
"region:us"
] | allistair99 | Stanford Question Answering Dataset (SQuAD) is a reading comprehension dataset, consisting of questions posed by crowdworkers on a set of Wikipedia articles, where the answer to every question is a segment of text, or span, from the corresponding reading passage, or the question might be unanswerable. | @article{2016arXiv160605250R,
author = {{Rajpurkar}, Pranav and {Zhang}, Jian and {Lopyrev},
Konstantin and {Liang}, Percy},
title = "{SQuAD: 100,000+ Questions for Machine Comprehension of Text}",
journal = {arXiv e-prints},
year = 2016,
eid = {arXiv:1606.05250},
pages = {arXiv:1606.05250},
archivePrefix = {arXiv},
eprint = {1606.05250},
} | null | 0 | 0 | Entry not found |
Bot123smile/Kevindate | 2023-08-27T15:12:52.000Z | [
"license:openrail",
"region:us"
] | Bot123smile | null | null | null | 0 | 0 | ---
license: openrail
---
|
aimankem32/races | 2023-08-27T16:01:58.000Z | [
"license:apache-2.0",
"region:us"
] | aimankem32 | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
r0ll/bratishkinoff | 2023-09-21T07:31:41.000Z | [
"license:openrail",
"region:us"
] | r0ll | null | null | null | 0 | 0 | ---
license: openrail
---
|
AmelieSchreiber/cafa_5_train_val_split_1 | 2023-08-27T16:09:57.000Z | [
"license:mit",
"region:us"
] | AmelieSchreiber | null | null | null | 1 | 0 | ---
license: mit
---
|
vedantmahalle21/bengali_asr_dataset | 2023-08-27T16:23:57.000Z | [
"region:us"
] | vedantmahalle21 | null | null | null | 0 | 0 | Entry not found |
yxgao/openassistant-guanaco-llama2 | 2023-08-27T16:29:54.000Z | [
"region:us"
] | yxgao | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 15401731
num_examples: 9846
- name: test
num_bytes: 815439
num_examples: 518
download_size: 9458962
dataset_size: 16217170
---
# Dataset Card for "openassistant-guanaco-llama2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Lotykun/spanish_story_telling | 2023-08-27T16:44:31.000Z | [
"license:other",
"region:us"
] | Lotykun | null | null | null | 0 | 0 | ---
license: other
---
|
deepghs/outfit_similarity | 2023-08-27T17:36:35.000Z | [
"license:mit",
"region:us"
] | deepghs | null | null | null | 0 | 0 | ---
license: mit
---
|
an1rud/test_lections | 2023-08-27T18:00:45.000Z | [
"license:gpl-3.0",
"region:us"
] | an1rud | null | null | null | 0 | 0 | ---
license: gpl-3.0
---
|
Vierza/bibi_DS_v1 | 2023-08-27T17:13:19.000Z | [
"region:us"
] | Vierza | null | null | null | 0 | 0 | Entry not found |
dhruvindankhara/microstructure_RVE | 2023-09-24T15:35:01.000Z | [
"license:mit",
"region:us"
] | dhruvindankhara | null | null | null | 0 | 0 | ---
license: mit
---
|
eduagarcia-temp/brwac_meta | 2023-08-27T18:56:00.000Z | [
"region:us"
] | eduagarcia-temp | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: meta
struct:
- name: dedup
struct:
- name: exact_norm
struct:
- name: cluster_main_idx
dtype: int64
- name: cluster_size
dtype: int64
- name: exact_hash_idx
dtype: int64
- name: is_duplicate
dtype: bool
- name: minhash
struct:
- name: cluster_main_idx
dtype: int64
- name: cluster_size
dtype: int64
- name: is_duplicate
dtype: bool
- name: minhash_idx
dtype: int64
- name: doc_id
dtype: string
- name: title
dtype: string
- name: uri
dtype: string
splits:
- name: train
num_bytes: 18279917379
num_examples: 3530796
download_size: 11165124126
dataset_size: 18279917379
---
# Dataset Card for "brwac_meta"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ybubnou/pubg | 2023-08-27T17:32:24.000Z | [
"license:odbl",
"region:us"
] | ybubnou | null | null | null | 0 | 0 | ---
license: odbl
---
|
rishitunu/ecc_crackdetector_dataset_main | 2023-08-27T17:42:46.000Z | [
"region:us"
] | rishitunu | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 3136023.0
num_examples: 405
download_size: 3081581
dataset_size: 3136023.0
---
# Dataset Card for "ecc_crackdetector_dataset_main"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
quocanh34/test_unmerged50_results | 2023-08-27T18:16:49.000Z | [
"region:us"
] | quocanh34 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: prediction
dtype: string
- name: spoken_norm_prediction
dtype: string
- name: id
dtype: string
- name: w2v2_large_5grams_transcription
dtype: string
- name: unmerged_50
dtype: string
splits:
- name: train
num_bytes: 174531640.625
num_examples: 1299
download_size: 164318499
dataset_size: 174531640.625
---
# Dataset Card for "test_unmerged50_results"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Vierza/bibi_PWv5_V1 | 2023-08-27T17:44:53.000Z | [
"region:us"
] | Vierza | null | null | null | 0 | 0 | Entry not found |
saiful21/Leaf_classification | 2023-08-27T21:08:58.000Z | [
"region:us"
] | saiful21 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': অর্জুন
'1': আম
'2': জাম
'3': পেয়ারা
'4': বেল
'5': লেবু
splits:
- name: train
num_bytes: 1640312612.080664
num_examples: 1011
- name: test
num_bytes: 289169464.14033616
num_examples: 179
download_size: 1929439824
dataset_size: 1929482076.2210002
---
# Dataset Card for "Leaf_classification"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DrSebastianK/tmdrag | 2023-08-27T18:06:05.000Z | [
"region:us"
] | DrSebastianK | null | null | null | 0 | 0 | Entry not found |
Vierza/bibi_MajreverieV1_v1 | 2023-08-27T18:09:39.000Z | [
"region:us"
] | Vierza | null | null | null | 0 | 0 | Entry not found |
Vierza/Bibi_MajLuxV2_v1 | 2023-08-27T18:32:49.000Z | [
"region:us"
] | Vierza | null | null | null | 0 | 0 | Entry not found |
Roscall/Le-RVC | 2023-08-27T19:04:46.000Z | [
"region:us"
] | Roscall | null | null | null | 0 | 0 | |
Roscall/50s-Elvis-RVC | 2023-08-27T19:08:55.000Z | [
"region:us"
] | Roscall | null | null | null | 0 | 0 | Entry not found |
Roscall/Meiko-RVC | 2023-08-27T19:12:46.000Z | [
"region:us"
] | Roscall | null | null | null | 0 | 0 | Entry not found |
Roscall/Jessi-RVC | 2023-08-27T19:18:45.000Z | [
"region:us"
] | Roscall | null | null | null | 0 | 0 | Entry not found |
jxie/modelnet40-2048 | 2023-08-27T19:26:10.000Z | [
"region:us"
] | jxie | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: inputs
sequence:
sequence: float32
- name: label
dtype: int64
splits:
- name: train
num_bytes: 322555200
num_examples: 9840
- name: test
num_bytes: 80901040
num_examples: 2468
download_size: 296407531
dataset_size: 403456240
---
# Dataset Card for "modelnet40-2048"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nielsr/datacomp-small-with-text-embeddings | 2023-08-27T23:19:32.000Z | [
"region:us"
] | nielsr | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: uid
dtype: string
- name: url
dtype: string
- name: text
dtype: string
- name: original_width
dtype: int64
- name: original_height
dtype: int64
- name: clip_b32_similarity_score
dtype: float32
- name: clip_l14_similarity_score
dtype: float32
- name: face_bboxes
sequence:
sequence: float64
- name: sha256
dtype: string
- name: clip_l14_text_embedding
sequence: float64
splits:
- name: train
num_bytes: 82649389578
num_examples: 12800000
download_size: 23102063139
dataset_size: 82649389578
---
# Dataset Card for "datacomp-small-with-text-embeddings"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_nnxor_l1_26 | 2023-08-27T20:09:20.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence:
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 8001200000
num_examples: 100000
- name: validation
num_bytes: 800120000
num_examples: 10000
- name: test
num_bytes: 800120000
num_examples: 10000
download_size: 7974442465
dataset_size: 9601440000
---
# Dataset Card for "autotree_nnxor_l1_26"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FranciscoMacaya/train_model.jsonl | 2023-08-28T04:59:08.000Z | [
"license:openrail",
"region:us"
] | FranciscoMacaya | null | null | null | 0 | 0 | ---
license: openrail
---
|
Ricardolpa/a | 2023-08-27T20:18:23.000Z | [
"region:us"
] | Ricardolpa | null | null | null | 0 | 0 | Entry not found |
eduagarcia-temp/OSCAR-2301_meta | 2023-08-28T14:07:22.000Z | [
"region:us"
] | eduagarcia-temp | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: meta
struct:
- name: categories
sequence: string
- name: dedup
struct:
- name: exact_norm
struct:
- name: cluster_main_idx
dtype: int64
- name: cluster_size
dtype: int64
- name: exact_hash_idx
dtype: int64
- name: is_duplicate
dtype: bool
- name: minhash
struct:
- name: cluster_main_idx
dtype: int64
- name: cluster_size
dtype: int64
- name: is_duplicate
dtype: bool
- name: minhash_idx
dtype: int64
- name: harmful_pp
dtype: float64
- name: identification
struct:
- name: label
dtype: string
- name: prob
dtype: float64
- name: quality_warnings
sequence: string
- name: sentence_identifications
list:
- name: label
dtype: string
- name: prob
dtype: float64
- name: tlsh
dtype: string
- name: warc_headers
struct:
- name: content-length
dtype: int64
- name: content-type
dtype: string
- name: warc-block-digest
dtype: string
- name: warc-date
dtype: string
- name: warc-identified-content-language
dtype: string
- name: warc-record-id
dtype: string
- name: warc-refers-to
dtype: string
- name: warc-target-uri
dtype: string
- name: warc-type
dtype: string
splits:
- name: train
num_bytes: 127702717461
num_examples: 18031400
download_size: 40317121912
dataset_size: 127702717461
---
# Dataset Card for "OSCAR-2301_meta"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bulu/lego_blip_512 | 2023-08-27T21:03:50.000Z | [
"region:us"
] | bulu | null | null | null | 0 | 0 | Entry not found |
Vergastik/dataset1 | 2023-08-27T20:44:53.000Z | [
"license:mit",
"region:us"
] | Vergastik | null | null | null | 0 | 0 | ---
license: mit
---
|
jmgb0127/bloom-lotr | 2023-08-27T20:47:01.000Z | [
"region:us"
] | jmgb0127 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 2753856.0
num_examples: 336
- name: test
num_bytes: 311448.0
num_examples: 38
download_size: 1407664
dataset_size: 3065304.0
---
# Dataset Card for "bloom-lotr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vadim71/embeddings.zip | 2023-08-27T20:59:10.000Z | [
"region:us"
] | vadim71 | null | null | null | 0 | 0 | Entry not found |
bulu/anime_blip_512 | 2023-08-27T21:50:12.000Z | [
"region:us"
] | bulu | null | null | null | 0 | 0 | Entry not found |
chineidu/tutorial_01 | 2023-08-27T21:22:50.000Z | [
"region:us"
] | chineidu | null | null | null | 0 | 0 | Entry not found |
GISY/cn_product | 2023-08-27T21:36:27.000Z | [
"region:us"
] | GISY | null | null | null | 0 | 0 | Entry not found |
Roscall/KRDK | 2023-08-27T22:53:57.000Z | [
"region:us"
] | Roscall | null | null | null | 0 | 0 | Entry not found |
linatul24/linatul24 | 2023-08-27T23:43:18.000Z | [
"region:us"
] | linatul24 | null | null | null | 0 | 0 | Entry not found |
model7845/model7845 | 2023-08-27T23:43:22.000Z | [
"region:us"
] | model7845 | null | null | null | 0 | 0 | Entry not found |
RKocielnik/bias-test-gpt-sentences5 | 2023-09-03T06:18:57.000Z | [
"license:apache-2.0",
"region:us"
] | RKocielnik | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
GeorgeOnTheWeb/Obsidian | 2023-08-28T00:46:19.000Z | [
"region:us"
] | GeorgeOnTheWeb | null | null | null | 0 | 0 | |
Henrybauerv/your-dataset-name | 2023-08-28T00:25:32.000Z | [
"region:us"
] | Henrybauerv | null | null | null | 0 | 0 | Entry not found |
Tsunematsu4656/Tsunematsu4656 | 2023-08-28T00:21:05.000Z | [
"region:us"
] | Tsunematsu4656 | null | null | null | 0 | 0 | Entry not found |
Maresuke454/Maresuke454 | 2023-08-28T00:21:08.000Z | [
"region:us"
] | Maresuke454 | null | null | null | 0 | 0 | Entry not found |
Sanjay1234/my-awesome-dataset | 2023-08-28T00:23:25.000Z | [
"region:us"
] | Sanjay1234 | null | null | null | 0 | 0 | Entry not found |
Konno565/Konno565 | 2023-08-28T00:23:33.000Z | [
"region:us"
] | Konno565 | null | null | null | 0 | 0 | Entry not found |
Roscall/DougChurch | 2023-08-28T00:24:41.000Z | [
"region:us"
] | Roscall | null | null | null | 0 | 0 | Entry not found |
Genichi5656/Genichi5656 | 2023-08-28T00:24:22.000Z | [
"region:us"
] | Genichi5656 | null | null | null | 0 | 0 | Entry not found |
Blinxtw/afaf | 2023-08-28T00:46:56.000Z | [
"doi:10.57967/hf/1032",
"region:us"
] | Blinxtw | null | null | null | 0 | 0 | Entry not found |
jgbv/sidewalk-imagery | 2023-08-28T00:50:50.000Z | [
"region:us"
] | jgbv | null | null | null | 0 | 0 | Entry not found |
Alterneko/q | 2023-09-19T12:51:38.000Z | [
"region:us"
] | Alterneko | null | null | null | 0 | 0 | Entry not found |
Jakir057/bangla_money | 2023-08-28T01:08:50.000Z | [
"region:us"
] | Jakir057 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '1'
'1': '10'
'2': '100'
'3': '1000'
'4': '2'
'5': '20'
'6': '5'
'7': '50'
'8': '500'
splits:
- name: train
num_bytes: 13906365.244773366
num_examples: 1391
- name: test
num_bytes: 2506854.417226634
num_examples: 246
download_size: 16309282
dataset_size: 16413219.662
---
# Dataset Card for "bangla_money"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cnut1648/fingerprint-outputs | 2023-08-28T20:32:35.000Z | [
"region:us"
] | cnut1648 | null | null | null | 0 | 0 | Entry not found |
yzhuang/autotree_automl_Diabetes130US_gosdt_l512_d3 | 2023-08-28T02:06:07.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: int64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: int64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 5538400000
num_examples: 100000
- name: validation
num_bytes: 553840000
num_examples: 10000
download_size: 487407664
dataset_size: 6092240000
---
# Dataset Card for "autotree_automl_Diabetes130US_gosdt_l512_d3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ryanlinjui/darkchess-robot | 2023-08-28T02:58:56.000Z | [
"region:us"
] | ryanlinjui | null | null | null | 0 | 0 | Entry not found |
mayur456/guanaco-llama2-1k | 2023-08-28T03:17:24.000Z | [
"region:us"
] | mayur456 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966693
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-llama2-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_snnxor_l1_2 | 2023-08-28T03:28:37.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence:
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 1550000000
num_examples: 100000
- name: validation
num_bytes: 155000000
num_examples: 10000
- name: test
num_bytes: 155000000
num_examples: 10000
download_size: 1065205233
dataset_size: 1860000000
---
# Dataset Card for "autotree_snnxor_l1_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LeeKinXUn/math | 2023-08-28T03:40:21.000Z | [
"region:us"
] | LeeKinXUn | null | null | null | 0 | 0 | Entry not found |
HUBioDataLab/AlphafoldStructures2 | 2023-08-28T06:36:22.000Z | [
"region:us"
] | HUBioDataLab | null | null | null | 1 | 0 | Entry not found |
Stepa/Difusion_course_3_unit_dreambooth_cyberpunk | 2023-08-28T04:37:02.000Z | [
"region:us"
] | Stepa | null | null | null | 0 | 0 | Entry not found |
taway0334/sql | 2023-08-28T04:41:05.000Z | [
"region:us"
] | taway0334 | null | null | null | 0 | 0 | Entry not found |
coralexbadea/monitorul_trial_qa400 | 2023-08-28T05:34:57.000Z | [
"region:us"
] | coralexbadea | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 945166
num_examples: 3181
download_size: 441212
dataset_size: 945166
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "monitorul_trial_qa400"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FranciscoMacaya/Trainning_llama2 | 2023-08-28T05:02:09.000Z | [
"license:openrail",
"region:us"
] | FranciscoMacaya | null | null | null | 0 | 0 | ---
license: openrail
---
|
LRAI/task-normalization-chip2020 | 2023-08-28T05:55:23.000Z | [
"region:us"
] | LRAI | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: entities
sequence: string
splits:
- name: train
num_bytes: 623418
num_examples: 8000
- name: test
num_bytes: 412454
num_examples: 10000
download_size: 601155
dataset_size: 1035872
---
# Dataset Card for "task-normalization-chip2020"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
r0ll/prigojin | 2023-08-28T06:25:24.000Z | [
"license:openrail",
"region:us"
] | r0ll | null | null | null | 0 | 0 | ---
license: openrail
---
|
kavinilavan/array_n_poa_dataset_v2 | 2023-08-28T06:29:21.000Z | [
"region:us"
] | kavinilavan | null | null | null | 0 | 0 | Entry not found |